Recently, I shared how replacing the Expand function in TI processes resulted in a significant reduction in processing time. As a result, I had happier users (well, fewer complaints). Today I look at a similar time drain when working with dates in Planning Analytics.

The Problem

During the same analysis, I found a much bigger culprit: two functions that will now forever send shivers down my spine, ParseDate and FormatDate. These functions were first introduced in TM1 10.2. They were new, shiny, and represented a more flexible way of working with dates in Planning Analytics. Essentially, we could now do everything we used to be able to do, but with much greater flexibility in the date formats available out of the box.

The old DayNo function for converting a date string to a serial number could only accept the YYYY-MM-DD or YY-MM-DD formats. If you worked in another format, e.g. DD-MM-YYYY, you would need to cut up the string and jumble it around to the “correct” format before you could get a serial number from it. And if you wanted to convert a serial date to a string, the TimSt function was much more limited in its output options than the huge number of patterns available with FormatDate.

Simply put, the newer functions do everything the old functions did and MUCH more. So, it’s reasonable to assume they carry a little more processing overhead and take a bit longer to run. The problem I found is that they took a LOT longer to run.

Nine times out of ten, a TI process is probably only doing one or two date conversions and calculations. For example, you may want to record when a process starts, when it ends, and its total duration. In these cases, you will notice no humanly perceivable difference.

In my use case, I needed to calculate over a billion dates in a single process: applying lead times on top of a dispatch date to estimate when goods would be received at each stop on a logistics journey, from suppliers through ports, warehouses, distribution hubs, and stores. It’s a specific use case, but any process calculating many dates will hit the same issue. And the issue was that using FormatDate and ParseDate was incredibly slow.

The Proof & Examples (DayNo/TimSt vs ParseDate/FormatDate)

I have put together a simple example that explains the issue:

The Process:

  • Has a starting date as a string (YYYY-MM-DD format)
  • Converts it to a serial number
  • Adds a lead time
  • Converts the new serial number back to a string (YYYY-MM-DD format)

Changing the nFlag variable between 1 and 0 decides whether it uses the “old” functions (TimSt and DayNo) or the “new” functions (FormatDate and ParseDate) to do its transformations.
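As a rough TurboIntegrator sketch of such a process (the variable names, literal values, and format patterns here are my own illustrative assumptions, not the original code):

```
# Sketch only – names, values, and format strings are illustrative assumptions
nFlag = 1;                  # 1 = old functions (DayNo/TimSt), 0 = new (ParseDate/FormatDate)
sStartDate = '2024-01-01';  # YYYY-MM-DD
nLeadTime = 14;

nCount = 1;
WHILE(nCount <= 1000000);
    IF(nFlag = 1);
        # Old route: DayNo only accepts YYYY-MM-DD (or YY-MM-DD)
        nSerial = DAYNO(sStartDate) + nLeadTime;
        sEndDate = TIMST(nSerial, '\Y-\m-\d');
    ELSE;
        # New route: ParseDate/FormatDate with explicit patterns
        nSerial = PARSEDATE(sStartDate, '%Y-%m-%d') + nLeadTime;
        sEndDate = FORMATDATE(nSerial, '%Y-%m-%d');
    ENDIF;
    nCount = nCount + 1;
END;
```

Either way, the date lands as a serial number, so applying the lead time is a simple numeric addition; the cost difference is entirely in the string-to-serial and serial-to-string conversions.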

Looping 1 million times, averaged over 10 executions, it took:

  • 3.10 seconds using DayNo and TimSt
  • 29.09 seconds using ParseDate and FormatDate

If you factor that up from 1 million records to 1 billion records (as per my use case), it’s the difference between 50 minutes and 8 hours! That’s before all the other things that my process also needed to do. Thankfully, I was cutting up my data and running the processes in parallel across 8 threads – which when fully optimised would be the difference between approximately 6 minutes and 1 hour. One of those numbers is certainly preferable to the other… hint – it’s the smaller one!

I of course got “lucky” that I had control over the date formats and had already decided we were going with YYYY-MM-DD, which works nicely with the DayNo function. But even if I had been given a different format, e.g. DD-MM-YYYY, a couple of extra string-handling steps, something like the below, would cover it:
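The extra steps can be sketched like this – SubSt cuts the DD-MM-YYYY string apart so it can be reassembled into the format DayNo expects (again, the values and format strings are illustrative assumptions, not the original code):

```
# Sketch only – input value and format strings are illustrative assumptions
sRawDate = '15-01-2024';    # DD-MM-YYYY
nLeadTime = 14;

# Rearrange DD-MM-YYYY into YYYY-MM-DD so DayNo can read it
sIsoDate = SUBST(sRawDate, 7, 4) | '-' | SUBST(sRawDate, 4, 2) | '-' | SUBST(sRawDate, 1, 2);
nSerial = DAYNO(sIsoDate) + nLeadTime;

# Convert back out in the original DD-MM-YYYY format
sEndDate = TIMST(nSerial, '\d-\m-\Y');
```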

The average time taken was:

  • 3.44 seconds using DayNo, TimSt and SubSt
  • 29.04 seconds using ParseDate and FormatDate

Admittedly, DD-MM-YYYY is still hardly the most unusual date format. But when the difference is this significant, even a more awkward format could be handled by creating a lookup dimension and combining the TimSt/DayNo method with an attribute lookup – and it would still take significantly less time. The only downside is the additional lines of code to do so. This approach is particularly useful if you need to convert dates into something like a week-number format.
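One hedged sketch of that lookup approach: maintain a date dimension whose elements are YYYY-MM-DD strings, precompute the awkward outputs as attributes, and read them back with ATTRS. The 'Date' dimension and 'Week Number' attribute below are assumptions for illustration, not objects from the original model:

```
# Sketch only – assumes a 'Date' dimension with one element per date
# in YYYY-MM-DD format and a precomputed 'Week Number' string attribute
nSerial = DAYNO('2024-01-15') + 14;
sEndDate = TIMST(nSerial, '\Y-\m-\d');

# A cheap attribute read replaces an expensive FormatDate pattern conversion
sWeek = ATTRS('Date', sEndDate, 'Week Number');
```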


Long story short: if you work with dates in Planning Analytics and have TI processes doing lots of date conversions, I would strongly suggest using the older TimSt and DayNo methods. You may end up with more verbose processes, and you may have to explain to subsequent developers why you used that method instead of the visually “easier” FormatDate and ParseDate functions. But at the end of the day, your end users don’t care about how your code works. They just want it to give them their results quickly.

Declan Rodger

Declan is an IBM Champion for Data & AI and the Technical Director of Spitfire Analytics. He is excited to share his experiences with IBM Planning Analytics and can often be found writing technical articles.


