Last week Hal Gumbert mentioned on Twitter that he was “working on a FileMaker quote to display and edit a BOM (Bill of Materials) that can go 9 levels deep.” Probably the most efficient and user-friendly way to implement this is a tree view with collapsible/expandable items.
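Just to illustrate the idea (a plain Python sketch, not the FileMaker implementation, and every name in it is made up for this example), a collapsible tree view boils down to walking only the expanded branches of the BOM when deciding which rows to display:

```python
# Minimal sketch of a collapsible BOM tree: each assembly is a node,
# and the visible rows come from walking only the expanded branches.
from dataclasses import dataclass, field
from typing import List

@dataclass
class BomItem:
    name: str
    children: List["BomItem"] = field(default_factory=list)
    expanded: bool = False          # collapsed by default

def visible_rows(item: BomItem, depth: int = 0):
    """Yield (depth, name) pairs for every row the tree view should show."""
    yield depth, item.name
    if item.expanded:               # descend only into expanded nodes
        for child in item.children:
            yield from visible_rows(child, depth + 1)

# Usage: a tiny 3-level BOM; only expanded branches show up
bike = BomItem("Bike", [BomItem("Wheel", [BomItem("Spoke"), BomItem("Rim")]),
                        BomItem("Frame")])
bike.expanded = True
for depth, name in visible_rows(bike):
    print("    " * depth + name)    # indented rows, collapsed branches hidden
```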
Just today I needed to decode HTML-encoded text in FileMaker. After checking a few functions I found one that seemed pretty good. Written in 2009 by Fabrice Nordman and named HTMLencoded2Text, this custom function converted my imported text fine at first sight.
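For those wondering what “HTML encoded” means in practice, here is a tiny Python illustration (not the custom function itself) of the job such a function has to do: turn named and numeric entities back into the characters they represent.

```python
# Decoding HTML entities - a quick illustration of the concept, not the
# FileMaker custom function. The sample string is made up for this example.
import html

encoded = "Fish &amp; Chips &ndash; caf&eacute; &#8211; 5&nbsp;&pound;"
print(html.unescape(encoded))
# -> Fish & Chips – café – 5 £   (&nbsp; becomes a non-breaking space)
```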
This is the first example I showed in my session Marvelous Optimizations at Pause On Error [x] London 2011. I already wrote about this optimization some time ago; it’s the one that led me to unveil the Marvelous Optimization Formula. I took the example and added FM Bench Detective to it so I could measure and examine exactly what happens.
I noticed that one of the articles updated in the official FileMaker Knowledge Base on September 23, 2011 explained how to select a random set of records in a FileMaker database. I wondered how fast the currently recommended technique is and whether I could make it faster with the help of FM Bench.
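I won’t reproduce the Knowledge Base script here, but the straightforward approach it represents can be sketched in a few lines of Python (the names and numbers below are purely illustrative): keep drawing random positions until you have collected enough distinct ones.

```python
# Sketch of the straightforward approach (plain Python standing in for the
# scripted technique): draw random positions until we have enough distinct ones.
import random

def pick_random_records(record_ids, sample_size):
    chosen = set()
    while len(chosen) < sample_size:
        chosen.add(random.choice(record_ids))   # duplicates simply get re-drawn
    return list(chosen)

record_ids = list(range(1, 50_001))             # 50,000 records, as in my test
print(pick_random_records(record_ids, 10))      # 10 random, distinct record IDs
```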
Last September I wrote an article about a custom function that I optimized to evaluate hundreds of times faster. At the end of the article I challenged my readers and myself by claiming that the already optimized custom function could be optimized even further. Do you remember? Later on I really did optimize it again, and talked about it.
The second example I showed in my session Marvelous Optimizations at Pause On Error [x] London 2011 was the script for selecting a random set of records. I found this example in the FileMaker Knowledge Base and optimized it to run at least 158 times faster when selecting 10 random records out of 50,000.
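The real optimization lives in the FileMaker script, of course, but as a rough analogy the difference is like drawing all the distinct positions in a single pass instead of re-searching for every pick:

```python
# Rough analogy of the optimization idea (not the actual FileMaker script):
# draw all distinct positions in one pass; random.sample guarantees
# no duplicates by construction, so no re-drawing loop is needed.
import random

record_ids = range(1, 50_001)        # 50,000 records
picked = random.sample(record_ids, 10)
print(sorted(picked))                # 10 distinct record IDs, chosen in one call
```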
This example demonstrates that even a single-step script can be optimized; you just have to think a little bit outside the box. I showed this as a surprise in my session Marvelous Optimizations at Pause On Error [x] London 2011. I used a sample file with 25 fields and 5,000 records and imported these records 5 times in a row in just 13 seconds.