A common file format used to import and export data is the delimited file: a file where the data values are separated by a delimiter character. Most commonly we see comma-separated values (CSV) and tab-separated values, but the delimiter can be any character you want; I prefer the pipe | (not the I or the l, but the |).
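
For example, here is the same customer row delimited first by commas and then by pipes:

John Smith,123 Maple Street,Apt 4B,Springfield,IL,62704,john.smith@example.com,5000
John Smith|123 Maple Street|Apt 4B|Springfield|IL|62704|john.smith@example.com|5000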

This is an unstructured data format, meaning there is no way to validate data integrity from one line to the next or from one file to the next. It is also a flat structure, meaning a header and a line are indistinguishable, unlike XML or JSON, where the structure supports nested data objects. To keep things simple in our first implementation, we are going to import Customers, as opposed to a header/line object like Sales Quotes.

The first thing we need to do is identify the “columns” that represent our data. We say columns because, if we import this data into Excel, it will place each value in a row into a distinct column. These columns should be static and not change between file loads. Handling named headers where the column order changes can be done, but it is a little more complicated for our first file and will be addressed later.

Let’s start with this file layout.

Column           Data Value
Name             Text[100]
Address Line 1   Text[100]
Address Line 2   Text[50]
City             Text[30]
State            Text[30]
Postal Code      Text[20]
Contact Email    Text[80]
Credit Limit     Decimal

A side note: the City field length max of 30 is great unless you live in Llanfairpwllgwyngyllgogerychwyrndrobwllllantysiliogogogoch, Wales, or one of the other two cities with more than 30 characters in their names. Don’t worry, all three have official shortened names that do fit in the 30-character limit.

Back to the task at hand! If I generate some data with this format it looks like this:

Name,Address Line 1,Address Line 2,City,State,Postal Code,Contact Email,Credit Limit
John Smith,123 Maple Street,Apt 4B,Springfield,IL,62704,john.smith@example.com,5000
Maria Lopez,456 Oak Avenue,,Los Angeles,CA,90001,maria.lopez@example.com,7500
Ken Tanaka,789 Pine Road,,Seattle,WA,98101,ken.tanaka@example.com,6200
Amina Hassan,321 Birch Lane,,Atlanta,GA,30303,amina.hassan@example.com,8000
Liam O'Connell,85 Elm Street,Floor 2,Boston,MA,02108,liam.oconnell@example.com,4000
Chloe Nguyen,160 Cedar Blvd,,Dallas,TX,75201,chloe.nguyen@example.com,7000
Raj Patel,998 Spruce Drive,,Chicago,IL,60616,raj.patel@example.com,5500
Sofia Rossi,210 Redwood Street,Building B,Miami,FL,33101,sofia.rossi@example.com,6700

Thank you, Copilot, for the sample data.

Now that we have our data, we need a means to import it. In my experience, the best practice is to import into a staging table, either permanent or temporary, and then process the data into the target record. This gives us a place to perform data validation, data correction, and data retry. While it is not strictly necessary, this is my blog, so we will do it my way.

For this customer import we are going to use a temporary staging table. Using the details from the file layout above, we can create a table that looks like this:

ARDCustomerStaging.Table.al

table 50000 ARD_CustomerStaging
{
    Caption = 'Customer Staging';
    DataClassification = CustomerContent;
    TableType = Temporary;

    fields
    {
        field(1; "ARD_No."; Integer)
        {
            Caption = 'No.';
            AutoIncrement = true;
            ToolTip = 'Unique identifier for the customer.';
        }
        field(2; ARD_Name; Text[100])
        {
            Caption = 'Name';
            ToolTip = 'The name of the customer.';
        }
        field(3; ARD_AddressLine1; Text[100])
        {
            Caption = 'Address Line 1';
            ToolTip = 'The first line of the customer''s address.';
        }
        field(4; ARD_AddressLine2; Text[50])
        {
            Caption = 'Address Line 2';
            ToolTip = 'The second line of the customer''s address, if applicable.';
        }
        field(5; ARD_City; Text[30])
        {
            Caption = 'City';
            ToolTip = 'The city where the customer is located.';
        }
        field(6; ARD_State; Text[30])
        {
            Caption = 'State';
            ToolTip = 'The state or region of the customer''s address.';
        }
        field(7; ARD_PostalCode; Text[20])
        {
            Caption = 'Postal Code';
            ToolTip = 'The postal code for the customer''s address.';
        }
        field(8; ARD_ContactEmail; Text[80])
        {
            Caption = 'Contact Email';
            ToolTip = 'The email address for contacting the customer.';
        }
        field(9; ARD_CreditLimit; Decimal)
        {
            Caption = 'Credit Limit';
            ToolTip = 'The credit limit assigned to the customer.';
        }
        field(10; ARD_CustomerNo; Code[20])
        {
            Caption = 'Customer No.';
            ToolTip = 'The unique customer number assigned to the customer.';
        }
    }
    keys
    {
        key(PK; "ARD_No.")
        {
            Clustered = true;
        }
    }
}

Notice that line 5 declares this as a temporary table; it will never be saved to the database. We also don’t require a permission set for this table, since it never persists data to which we could restrict access.
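
As an aside, TableType = Temporary makes every instance of the table in-memory. A normal, persisted table can also serve as a scratch buffer if the variable itself is marked temporary; the two declarations below contrast the approaches (variable names are illustrative):

var
    // ARD_CustomerStaging has TableType = Temporary, so this is an in-memory buffer automatically
    TempStaging: Record ARD_CustomerStaging;
    // Customer is a normal table, but this variable only ever touches an in-memory copy
    TempCustomer: Record Customer temporary;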

We need a page to manage the import. This page should prompt for the file, and we can parse either in a codeunit or in the page directly. For this example we are going to do all the work in the page to keep the code together.

I created a page called ARDCustomerImport.Page.al. There are some interesting layout bits in this page, but I’m going to focus on the elements related to data parsing.
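
I’m not reproducing the whole page here, but for orientation, here is a minimal sketch of how it could be wired up. The globals TempCustomerImportRec and ImportText are the names the code below relies on; everything else (object number, layout, the RefreshPage helper) is my assumption, not the exact page from the project:

page 50100 ARD_CustomerImport
{
    Caption = 'Customer Import';
    PageType = List;
    SourceTable = ARD_CustomerStaging; // TableType = Temporary, so rows live only in memory
    UsageCategory = Tasks;
    ApplicationArea = All;

    layout
    {
        area(Content)
        {
            repeater(Staged)
            {
                field(Name; Rec.ARD_Name) { ApplicationArea = All; }
                field(City; Rec.ARD_City) { ApplicationArea = All; }
                field(CreditLimit; Rec.ARD_CreditLimit) { ApplicationArea = All; }
            }
        }
    }

    var
        TempCustomerImportRec: Record ARD_CustomerStaging; // already temporary thanks to TableType
        ImportText: Text; // raw file content read from the upload stream

    local procedure RefreshPage()
    begin
        Rec.Copy(TempCustomerImportRec, true); // share the temporary buffer with the page
        CurrPage.Update(false); // repaint without writing anything
    end;
}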

There is an action button on the page that includes an OnAction trigger.

trigger OnAction()
var
    FileFilter: Text; // Defines the file filter for the upload dialog
    InStream: InStream; // Stream to read the uploaded file
    CurrentText: Text; // Temporary variable to hold the current line of text
    TextValue: TextBuilder; // TextBuilder to accumulate the file content
begin
    // Limit the visible files in the upload dialog to CSV
    FileFilter := 'CSV Files (*.csv)|*.csv';

    // Open a dialog to upload a file and read its content into a stream
    if UploadIntoStream(FileFilter, InStream) then
        // Read the stream line by line until the end of the stream
        while not InStream.EOS() do
            // Append each line of text to the TextBuilder
            if InStream.ReadText(CurrentText) <> 0 then
                TextValue.AppendLine(CurrentText);

    // Convert the accumulated text into a single string
    ImportText := TextValue.ToText();
    // Parse the customer data from the imported text
    ParseCustomerData(ImportText);
end;

The UploadIntoStream command pops up the file upload dialog box. We provide a file filter so that it limits the visible files to CSV, and an InStream to load the file into.
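
The filter string follows the Description|pattern convention familiar from Windows file dialogs, and several pairs can be chained; for instance, a variation that also offers text files might look like this:

// Each Description|pattern pair becomes one entry in the dialog's file-type list
FileFilter := 'CSV Files (*.csv)|*.csv|Text Files (*.txt)|*.txt';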

We then read the file until we reach EOS (end of stream). As we read the file, we append each line of text to a TextBuilder.

Lastly, we turn the TextBuilder into a Text value for us to parse. Text fields that get saved to the database have a size limit of 2048 characters. In this case we have an unsized Text variable, which is limited in size to Int32.MaxValue, which is big. We are more likely limited by the size of an InStream, which is 1,000,000 bytes.

Once we have our Text, we pass that data on to our parsing procedure.

procedure ParseCustomerData(customerData: Text)
var
    Lines: List of [Text];
    Line: Text;
    LineDetails: List of [Text];
    tempDecimal: Decimal;
    LF: Char;
    ProcessedHeader: Boolean;
    Count: Integer;
begin
    Count := 1; // Line number assigned to each staging record
    ProcessedHeader := false; // Flag to skip the header line
    LF := 10; // Line feed character used to split the text into lines

    Lines := customerData.Split(LF); // Split the text into lines
    foreach Line in Lines do
        if not ProcessedHeader then
            // The first line holds the column names, not customer data
            ProcessedHeader := true
        else begin
            // TrimEnd drops the trailing CR that CRLF line endings leave behind
            LineDetails := Line.TrimEnd().Split(',');

            if LineDetails.Count() >= 8 then begin // Skip lines that do not have enough data
                TempCustomerImportRec.Init();
                TempCustomerImportRec."ARD_No." := Count; // AutoIncrement is ignored on temporary tables, so number the lines manually
                TempCustomerImportRec."ARD_Name" := CopyStr(LineDetails.Get(1), 1, 100);
                TempCustomerImportRec."ARD_AddressLine1" := CopyStr(LineDetails.Get(2), 1, 100);
                TempCustomerImportRec."ARD_AddressLine2" := CopyStr(LineDetails.Get(3), 1, 50);
                TempCustomerImportRec."ARD_City" := CopyStr(LineDetails.Get(4), 1, 30);
                TempCustomerImportRec."ARD_State" := CopyStr(LineDetails.Get(5), 1, 30);
                TempCustomerImportRec."ARD_PostalCode" := CopyStr(LineDetails.Get(6), 1, 20);
                TempCustomerImportRec."ARD_ContactEmail" := CopyStr(LineDetails.Get(7), 1, 80);
                if Evaluate(tempDecimal, LineDetails.Get(8)) then
                    TempCustomerImportRec."ARD_CreditLimit" := tempDecimal;

                // Insert the record into the staging table
                TempCustomerImportRec.Insert();
                Count += 1;
            end;
        end;

    RefreshPage();
end;

Lines 11, 12, and 13 set up some variables we will need: a counter for line numbering, a flag to track whether we have processed the header line, and the line-feed character used to split the text file into lines.

Business Central has a method for splitting text into a list, aptly named Split. The customerData variable contains all the text from our file. Using Split and the line-feed character we defined on line 13, we break it into individual lines, each representing a customer record.
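
As a quick illustration of how Split behaves (SplitDemo is a throwaway example, not part of the import page), note that AL lists are 1-based, which matters when we pull values out by position:

local procedure SplitDemo()
var
    Sample: Text;
    Parts: List of [Text];
begin
    Sample := 'alpha,beta,gamma';
    Parts := Sample.Split(',');   // Parts now holds "alpha", "beta", "gamma"
    Message(Parts.Get(2));        // shows "beta"; Get is 1-based
end;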

We can then iterate through the lines and split each one into values. First, on line 17, we check whether we have already processed the first row of data. The first row contains the names of the columns, and we don’t want to create a customer out of those values.

After the header check, we trim any trailing carriage return and split the line on the comma character into the LineDetails variable, which holds the individual values. It is at this point that we could have read the header, identified the column names, and dynamically mapped them to their positions, but for this example we are going to assume this is a bespoke CSV file whose column order will not change.
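
If we did want to honor a named header, one approach is a Dictionary that maps each column name to its position. A rough sketch, assuming the header names match the file layout above exactly (BuildColumnMap is a hypothetical helper, not part of the sample project):

local procedure BuildColumnMap(HeaderLine: Text) ColumnMap: Dictionary of [Text, Integer]
var
    Headers: List of [Text];
    i: Integer;
begin
    Headers := HeaderLine.TrimEnd().Split(',');
    for i := 1 to Headers.Count() do
        ColumnMap.Add(Headers.Get(i), i); // e.g. 'Postal Code' -> 6
end;

Each lookup then becomes LineDetails.Get(ColumnMap.Get('Name')) instead of a hard-coded position, and the file can reorder its columns freely.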

Lines 25 through 35 take values from the LineDetails list, coerce them to size using the CopyStr procedure, and place them into a Customer Staging record.
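
A small variation worth knowing, though not what the sample above does: MaxStrLen derives the length from the field itself, so the truncation can never drift out of sync with the table definition:

// Truncate to whatever length the field is actually declared with
TempCustomerImportRec."ARD_Name" := CopyStr(LineDetails.Get(1), 1, MaxStrLen(TempCustomerImportRec."ARD_Name"));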

That is it! CSV Parsed to Records!

Here is the page when we open it.

Clicking the Import button runs the Action that prompts for the file import.

I’ll feed it a file containing our sample data.

There we are! All parsed out to temporary records, ready to become Customer Records!

The last bit of code we need converts the staged values into Customer records. It looks something like this:

procedure GenerateCustomerRecords()
var
    CustomerRec: Record Customer;
begin
    TempCustomerImportRec.SetFilter("ARD_No.", '<>0'); // Only process staged rows that received a line number
    if TempCustomerImportRec.FindSet() then
        repeat
            CustomerRec.Init();
            CustomerRec."No." := ''; // Leave blank so the number series assigns the customer number
            CustomerRec.Name := TempCustomerImportRec."ARD_Name";
            CustomerRec.Address := TempCustomerImportRec."ARD_AddressLine1";
            CustomerRec."Address 2" := TempCustomerImportRec."ARD_AddressLine2";
            CustomerRec.City := TempCustomerImportRec."ARD_City";
            CustomerRec.County := TempCustomerImportRec."ARD_State";
            CustomerRec."Post Code" := TempCustomerImportRec."ARD_PostalCode";
            CustomerRec."E-Mail" := TempCustomerImportRec."ARD_ContactEmail";
            CustomerRec."Credit Limit (LCY)" := TempCustomerImportRec."ARD_CreditLimit";

            // Insert the customer record into the main customer table
            if CustomerRec.Insert(true) then begin
                TempCustomerImportRec.ARD_CustomerNo := CustomerRec."No.";
                TempCustomerImportRec.Modify();
            end else
                Error(GetLastErrorText());

        until TempCustomerImportRec.Next() = 0;

    RefreshPage();
end;
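
For completeness, here is a sketch of how a page action might call this procedure; the action name and image are my invention, not necessarily what the project uses:

action(GenerateCustomers)
{
    Caption = 'Generate Customers';
    ApplicationArea = All;
    Image = Customer;

    trigger OnAction()
    begin
        GenerateCustomerRecords(); // promote the staged rows into real Customer records
    end;
}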

Here are all our contacts:

All of our values from our CSV are in place.

There we go: technically the second of our import parsing techniques, since a while back we covered JSON. To keep things simple, I’m combining all the parsing examples into a single example up on GitHub. You can download the entire project here: AardvarkMan/BC-File-Import-Examples: File Import and Parsing Examples for Business Central

Let me know what types of CSV file imports you have done and the challenges you have had.
