One client required nightly imports from an external database to update contact names, addresses, and other information. They also wanted a report of all the changes made each night. This is not a trivial programming task. There is no automated log of change transactions in FileMaker…and even if there were, it might not be so easy to make a nicely formatted report of it.
Our first attempt at a solution was to add several new fields to the contact table to store the previous values before importing. For example, we added NameFirstPrevious, NameLastPrevious, AddressLine1Previous, CityPrevious, etc. Before running the import, we execute a script that copies the current values into all the Previous fields. Then, after the import, we compare the old and new values with calculations and show the differences. Here’s how the store-previous-values script looks:
Go to Record/Request/Page [First]
Loop
   Set Field [CONTACTS::CityPrevious; CONTACTS::City]
   Set Field [CONTACTS::NameFirstPrevious; CONTACTS::NameFirst]
   # Many more lines here
   Go to Record/Request/Page [Next; Exit after last]
End Loop
Then the calculation to see the differences:
If ( CONTACTS::NameFirstPrevious <> CONTACTS::NameFirst ; "Changed First Name: " & CONTACTS::NameFirst ; "" )
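The snapshot-then-diff pattern isn’t specific to FileMaker. Here is a minimal Python sketch of the same idea (the field names and sample data are illustrative, not from the client’s system):

```python
# Sketch of the snapshot-then-diff pattern (illustrative field names).
FIELDS = ["NameFirst", "NameLast", "AddressLine1", "City"]

def store_previous(contacts):
    """Copy every tracked field into its 'Previous' counterpart."""
    for record in contacts:
        for field in FIELDS:
            record[field + "Previous"] = record[field]

def change_report(contacts):
    """After the import, list every field whose value differs."""
    report = []
    for record in contacts:
        for field in FIELDS:
            if record[field] != record[field + "Previous"]:
                report.append(f"Changed {field}: {record[field]}")
    return report

contacts = [{"NameFirst": "Ann", "NameLast": "Lee",
             "AddressLine1": "1 Main St", "City": "Springfield"}]
store_previous(contacts)
contacts[0]["City"] = "Shelbyville"   # simulate the nightly import
print(change_report(contacts))        # → ['Changed City: Shelbyville']
```

The nightly report is then just the accumulated output of `change_report` after each import.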
This worked great on our test set of records, but it quickly became evident that it would not work on large data sets. At one point, with several thousand records, it took over 9 hours just to store the previous values!
We realized that the amount of work FileMaker needed to do to reindex all the fields in the Contacts table each time a previous value was stored was slowing things down significantly. So our first optimization was to eliminate some of the reindexing: we changed the script to check whether a value had actually changed before storing it, so unchanged fields in Contacts were never written to. It looked like this…
If [CONTACTS::City <> CONTACTS::CityPrevious]
   Set Field [CONTACTS::CityPrevious; CONTACTS::City]
End If
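The write-only-if-changed guard is the same in any environment where writes are expensive: skip the store when old and new values match, so the storage layer never has to reindex an unchanged field. A small Python sketch of the guard (field names are illustrative):

```python
# Write-only-if-changed: skip the store when old and new values match,
# so the (hypothetical) storage layer never reindexes unchanged fields.
def store_if_changed(record, field):
    """Return True if a write (and hence a reindex) actually happened."""
    prev_field = field + "Previous"
    if record[field] != record.get(prev_field):
        record[prev_field] = record[field]
        return True
    return False

record = {"City": "Springfield", "CityPrevious": "Springfield",
          "NameFirst": "Ann", "NameFirstPrevious": "Anne"}
writes = sum(store_if_changed(record, f) for f in ("City", "NameFirst"))
print(writes)  # only NameFirst differs, so just one write → 1
```

On a nightly import where most records are untouched, almost every store is skipped, which is where the hours of savings came from.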
This actually sped things up quite a lot…cutting several hours off the process.
Even then, we realized later that it could be optimized further if we moved the previous value fields out of the Contacts table altogether into a separate table related one-to-one by the primary key ContactID. This table would reindex very quickly because the only records in it would be changes. So the new script set the field through a relationship (Contacts_ContactsPrev) that allowed related records to be created automatically:
If [CONTACTS::City <> Contacts_ContactsPrev::CityPrevious]
   Set Field [Contacts_ContactsPrev::CityPrevious; CONTACTS::City]
End If
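The structural idea here can be sketched outside FileMaker too: keep the previous values in a separate structure keyed by the primary key, and only create a row when something actually changed, so the change table stays tiny and cheap to index. A hedged Python sketch (the IDs and field names are made up):

```python
# Previous values kept in a separate structure keyed by ContactID,
# mirroring the one-to-one Contacts_ContactsPrev relationship.
# Rows are created only when a field actually changes, so the
# "change table" stays small and cheap to reindex.
def record_change(prev_table, contact_id, field, old_value, new_value):
    if old_value == new_value:
        return                                    # nothing stored at all
    row = prev_table.setdefault(contact_id, {})   # auto-create, like the
    row[field + "Previous"] = old_value           # "allow creation" relationship

prev_table = {}
record_change(prev_table, "C001", "City", "Springfield", "Springfield")  # no-op
record_change(prev_table, "C001", "NameFirst", "Anne", "Ann")
print(prev_table)  # → {'C001': {'NameFirstPrevious': 'Anne'}}
```

Because untouched contacts never get a row at all, the table holding previous values grows only with the night’s actual changes rather than with the whole contact list.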
With these two optimizations, the script now runs in less than 30 seconds on a regular basis…quite an improvement from the original 9+ hours…and we have the report we need.
So optimizations are possible when we keep at it and think through the possible options. For me, I often need to let a problem stir in the back of my mind for a while before I have the inspiration needed to fix it. Sometimes I’ll wake up in the morning from a good night’s rest and have the “aha!” moment immediately. I think “If I change this, it will work!” …and often it does!