
Write an advanced Excel data import with Agile Platform

Justin James explains how to create a data import for Excel in Agile Platform. He states the process is much faster than using third-party components in .NET code.

One of the most persistently obnoxious requirements in the history of development is the need to import data into an application from Microsoft Excel. We know how much users love to use Excel as a miscellaneous catch-all application (especially small businesses that seem to run their entire organizations from it), so the data import is important, but dealing with Excel is a hassle.

Running Office on the server is known to be problematic (memory leaks are a common complaint), which rules out using Visual Studio Tools for Office. Third-party components are available, but the good ones are usually quite expensive, and the free and open source ones are typically buggy, out of date, or lacking features. Luckily for me, the OutSystems Agile Platform has the components built in: I learned what to do in a few minutes, and creating the data import took hours rather than the days it usually takes. I'll show you what I did to write an advanced data import for Excel 2007/2010.

Excel data imports

The first step in the data import is to get the data from the user. For this, we use the FileUpload Widget -- just place it on the screen and give it a name; it puts a box on the screen for the user to select a local file to upload. Next, create a new Button and link it to a new Action that will perform the import.

In the Action, the real fun begins. What makes this so insanely easy is one simple built-in Action that's available in the Toolbox: ExcelToRecordList. (Incidentally, there is also a RecordListToExcel Action, which does just what you think, and is great for writing data exports.) ExcelToRecordList takes the Excel file and turns it into a RecordList, using the Record Definition you provide. Here are some notes on how it parses the file:

  • You can specify which sheet to read. If you don't, it looks for "Sheet1" and, if that is not available, it uses the first sheet in the workbook.
  • If the number of columns in the sheet matches the number of columns in the record definition, it will read them from left-to-right to fill in the record. Otherwise, it will use the data in row 1 to match columns to Attributes (I highly recommend this).
  • If two record types in the definition have Attributes with the same name, the column name in row 1 should differentiate which one it means using a colon to separate the record's type from the column name (e.g., "Customer:Name" and "Part:Name").
  • Your record definition can contain Entities or Structures (this will become important in a moment).
  • The record definition cannot contain any Entities or Structures with Attributes of the type BinaryData.
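The parsing rules above can be sketched in ordinary code. This is not how the platform is implemented -- just a plain-Python illustration (the function names are my own) of the header-matching behavior, including the "Record:Attribute" colon convention:

```python
# Illustrative sketch (not OutSystems code): map spreadsheet columns to
# record attributes using the row-1 header, the way ExcelToRecordList is
# described as doing it. (The left-to-right positional fallback, used when
# column counts match, is omitted for brevity.)

def match_columns(header, attributes):
    """header: row-1 cell values; attributes: names like "Customer:Name".
    Returns {attribute: column index}. A bare header cell ("Name") matches
    the attribute's short name; a prefixed one ("Customer:Name") matches
    the fully qualified attribute exactly."""
    mapping = {}
    for attr in attributes:
        short = attr.split(":")[-1]
        for col, cell in enumerate(header):
            if cell == attr or cell == short:
                mapping[attr] = col
                break
    return mapping

def parse_sheet(rows, attributes):
    """Turn rows (first row = headers) into a list of attribute dicts --
    roughly the shape of the RecordList that ExcelToRecordList outputs."""
    header, *data = rows
    cols = match_columns(header, attributes)
    return [{attr: row[col] for attr, col in cols.items()} for row in data]

rows = [("Customer:Name", "Part:Name"), ("Acme", "Widget")]
parse_sheet(rows, ["Customer:Name", "Part:Name"])
# -> [{"Customer:Name": "Acme", "Part:Name": "Widget"}]
```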

My initial idea was to use the destination Entity for the import in the data definition; this is perfectly fine in many situations, but I quickly hit a few walls. First of all, one of the Entities I was importing had a BinaryData Attribute, and I really did not feel like rewriting all of the places it was used to put it in its own Entity. Secondly, some of the Entities I needed to import refer to other Entities. For example, my PART entity is linked to SUPPLIER, which I was also importing. Even if they were not both coming in at the same time, I certainly did not want to force my users to find out the ID numbers for the linked Entities.

So, I reached into my bag of tricks and came up with an easy solution: I selected the Entity, right-clicked and chose Copy, then went to the very top of the data tree, right-clicked, and chose Paste As.... I selected Structure, which gave me a Structure with the same Attributes as the Entity I wanted. Next, I removed the BinaryData Attributes and the Id Attribute. Then I replaced any Attribute that referred to another Entity with a Text Attribute. For example, where my PART Entity has a Supplier Attribute pointing to SUPPLIER, I replaced it with a Text Attribute called SupplierName.

With this Structure as the record definition for the ExcelToRecordList Action, it was time to move on. I used a ForEach Action to iterate over the resulting RecordList. Within the ForEach, my first step was to look up the SUPPLIER Entity by the SupplierName Attribute in the imported data, which was easy enough. Then I used an Assign Action to copy all of the data from the imported Structure into a local variable of type Record with a definition of PART, along with some default data values and the Supplier Identifier needed for the Supplier Attribute of PART. This local variable gets passed to the CreatePART Action defined by the Entity.
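In plain code, the per-row logic amounts to a lookup followed by a create. Here is a rough Python sketch of it -- the entity names follow the article, but the dict-based stand-in "tables" and the helper functions are my own:

```python
# Sketch of the per-row import logic: resolve the SupplierName in each
# imported row to a SUPPLIER id, then create the PART with that id.

suppliers = {}   # stand-in SUPPLIER table: name -> id
parts = []       # stand-in PART table

def get_supplier_id(name):
    # Mirrors the lookup of SUPPLIER by the imported SupplierName.
    return suppliers[name]

def create_part(record):
    # Mirrors the CreatePART Action defined by the Entity; returns the new id.
    record["Id"] = len(parts) + 1
    parts.append(record)
    return record["Id"]

def import_parts(imported_rows):
    created = []
    for row in imported_rows:                        # the ForEach
        part = {                                     # the Assign
            "Name": row["Name"],
            "Supplier": get_supplier_id(row["SupplierName"]),
            "Active": True,                          # a default value
        }
        created.append(create_part(part))
    return created

suppliers["Acme"] = 1
import_parts([{"Name": "Bolt", "SupplierName": "Acme"}])
# -> [1]
```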

One Entity (CUSTOMER) was particularly tricky -- it referred to another Entity (ADDRESS) three times (PhysicalAddress, MailingAddress, and BillingAddress). What to do? Easy! I copied the ADDRESS Entity into three structures (ForImportPhysicalAddress, ForImportMailingAddress, and ForImportBillingAddress), and in each of them, I changed the Attributes to be in the form of "PhysicalAddressCity" or "MailingAddressLine1." While I could have just left the names the same and used the names "PhysicalAddress:City" or "MailingAddress:Line1" in the Excel column headers, my way is a lot easier for the users who will be working with the Excel file. In the import for this sheet, I first created each ADDRESS Entity separately, and then filled the resulting IDs into the CUSTOMER Entity before creating it.
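The three-address trick boils down to un-flattening the prefixed columns and creating the ADDRESS records before the CUSTOMER. A minimal sketch, assuming a trimmed-down ADDRESS with just two fields (the stand-in table and helpers are mine, not platform code):

```python
# Sketch: rebuild three ADDRESS records from prefix-flattened columns such
# as "PhysicalAddressCity", create each one, then fill the resulting ids
# into the CUSTOMER record before creating it.

ADDRESS_FIELDS = ("Line1", "City")  # trimmed for the example
PREFIXES = ("PhysicalAddress", "MailingAddress", "BillingAddress")

addresses = []  # stand-in ADDRESS table

def create_address(record):
    # Mirrors the CreateADDRESS Action; returns the new id.
    record["Id"] = len(addresses) + 1
    addresses.append(record)
    return record["Id"]

def build_customer(row):
    customer = {"Name": row["Name"]}
    for prefix in PREFIXES:
        # Strip the prefix to recover the ADDRESS attributes for this role.
        addr = {field: row[prefix + field] for field in ADDRESS_FIELDS}
        customer[prefix] = create_address(addr)  # store the new ADDRESS id
    return customer
```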

As an added bonus, I did the heavy lifting in an eSpace Action, passing in the BinaryData from the upload. I created an Output Variable of type RecordList, with a definition of PART, for my Action. After creating each Entity in the database, I took the resulting ID, added it to the local PART record variable, and used ListAppend to add it to the Output Variable. With this, I have the option of showing the user which records were created once the import succeeds.

The only trouble I had with this process was that I lost a lot of time because I was editing a different Excel file than the one I was actually testing with. Outside of that, the process took me about two hours (including figuring out the Structure trick) to do imports for five Entities, some of which were fairly complex. I know that using third-party components in .NET code would have taken me much, much longer.

Read my other TechRepublic posts about Agile Platform

J.Ja

About

Justin James is the Lead Architect for Conigent.
