Advanced logging in IBM Planning Analytics (TM1)

Most integrated IBM Planning Analytics - TM1 applications have a global technical setup as displayed below.

Figure 1: Integrated IBM Planning Analytics - TM1 set-up

In this integrated setup, data and metadata are uploaded from external systems. These external systems can be databases or text files. IBM Planning Analytics - TM1 does the necessary calculations and, if applicable for the functional domain, the end-user adds data. When all calculations and simulations are done, the data is loaded back to an external data warehouse or an external application (MRP, consolidation, …).

We all agree that metadata (read: dimension updates) and data (read: cube updates) should be handled by separate TI-processes. The reason for this split is that dimension updates cause locks for the duration of the process. Locks give end-users a feeling of bad performance because they have to wait. In more advanced systems these upload processes are scheduled via TI-chores. The big advantage is that these applications run pretty much without any intervention by IBM Planning Analytics - TM1 administrators. This advanced automation and scheduling, however, brings some additional requirements and questions to the design of the application:

An important question is: who or what assures the quality of the metadata and data that is uploaded into IBM Planning Analytics - TM1 via these scheduled chores?

In most situations it is neither realistic nor feasible to ask the IBM Planning Analytics - TM1 administrators to do a frequent quality check of the dimensions and the data that are uploaded into the cubes. The objective of the scheduling was probably to limit the effort of these people in the first place. This is why a developer has to make some design choices on how to assure data and metadata quality in an automated IBM Planning Analytics - TM1 setup.

Maturity Level I

Only Standard IBM Planning Analytics - TM1 quality checks

Figure 2: Maturity Level I

Standard TM1 already does some quality checks for you. For example:

  • A check is made whether the layout of the uploaded data is in line with the format of the element. For example, you cannot put numerical data on a string element;
  • A check is made whether the alias of an element is unique;
  • A check is made whether the format of the data or the file is aligned with the expected format;
  • Another obvious check is whether the data can be assigned to an element in all of the dimensions of the cube;
  • …

When something goes wrong, as described above, or something “normal” happens in the application, IBM Planning Analytics - TM1 creates log entries. Some examples of these logs are:

  • Process Error Log

When (minor or major) errors occur during a process run, TM1 creates error logs in the logging directory.

  • Transaction log

Each TM1 server tracks the data transactions made by its clients. When a client changes a cube value, TM1 records the change in a transaction log file named Tm1s.log. The IBM Planning Analytics - TM1 administrator can decide for which cubes transactions are logged. Transaction logging is very useful when you want to recover data.
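
As a side note, the per-cube switch for transaction logging lives in the }CubeProperties control cube. The sketch below assumes a cube called “Sales”; replace it with a cube from your own model.

  # Sketch: toggle transaction logging for one cube from a TI-process.
  # Switch transaction logging off before a large scheduled data load ...
  CellPutS ( 'NO', '}CubeProperties', 'Sales', 'LOGGING' );

  # ... and switch it back on afterwards, so that manual changes are recoverable again.
  CellPutS ( 'YES', '}CubeProperties', 'Sales', 'LOGGING' );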

  • Message Log

The TM1 message log is an interesting log for the IBM Planning Analytics - TM1 administrator. Every TM1 server records status messages about its activity in a log file. These messages contain details on activity such as executed processes, chores, loaded cubes and dimensions, and synchronized replication.

  • Admin server log

The TM1 Admin Server log is useful for troubleshooting connection issues when using the TM1 Secure Socket Layer (SSL) with custom certificates or certificates from the Microsoft Windows certificate store. It contains messages about the communication between TM1 clients, the TM1 Admin Server, and individual TM1 servers.

  • TM1 Web Log file

The logging process for IBM Planning Analytics TM1 Web records activity and error messages for the program in the tm1web.log file. Administrators can use this log file for status and troubleshooting of IBM Planning Analytics TM1 Web. The severity levels in the log file help to organize messages.

All these logs have their own objective. They are generally available and usable within an IBM Planning Analytics TM1 application, and they can be tuned by setting parameters and settings in property files and client tools.
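
For example, the verbosity of the message log can be tuned via the tm1s-log.properties file in the server’s data directory. The snippet below is only a sketch; verify the exact logger names against the IBM documentation for your version.

  # tm1s-log.properties (sketch)
  # Default level for the main TM1 message logger
  log4j.logger.TM1=INFO
  # More detail for TurboIntegrator process activity while debugging an upload
  log4j.logger.TM1.Process=DEBUG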

If all these standard checks are passed, the data and metadata is at least technically valid and can be uploaded and used in the TM1 application. A characteristic of these standard logs is that they are rather technical and in most cases only understandable by a more technical profile. Another characteristic is that in this situation the data is technically valid, but this of course says nothing about the accuracy or correctness of the data or the metadata!

Maturity Level II

Custom IBM Planning Analytics - TM1 quality checks - txt files

Figure 3: Maturity Level II

In a more mature application, the developer builds in custom quality checks to verify the accuracy of the data. The developer discusses upfront, with the stakeholders, what the quality level of the data should be. What do we do, for example, when certain attributes are missing or when the format of the elements does not meet the required format? If you do not ask these questions upfront, everything that is technically valid is uploaded into TM1 cubes and/or inserted in dimensions. This can lead to contaminated TM1 applications that are not really user-friendly and are laborious to maintain.

A common thing to do in case of a cube update, when elements do not match the elements in a dimension, is to put the data on a technical element “unknown” in the dimension. In this case the data in the TM1 application is at least aligned with the data in the source database. The user of the application will have insight into what could not be assigned to an element in a dimension. This, however, does not give insight into the reason why data is not uploaded. So I personally prefer to do the check and not upload the data into the TM1 cube.

For custom IBM Planning Analytics - TM1 quality checks the developer can do the following in case of an error or mismatch in the upload process:

  1. Do an ItemSkip (the record is ignored and no log entry is created);
  2. Do an ItemSkip and write a log entry to a custom txt file (see the sketch after this list);
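
A minimal sketch of the second option, on the Data tab of a TI-process, could look as follows. The variable names vProduct and vAmount, the dimension “Product” and the variable sErrorFile are assumptions; sErrorFile would typically be built in the Prolog from the agreed error folder and the process name.

  # Data tab sketch: skip records with an unknown product and log them to a custom txt file.
  # sErrorFile is assumed to be defined in the Prolog, e.g.:
  #   sErrorFile = 'D:\TM1\ErrorLogs\' | GetProcessName() | '.txt';
  IF ( DIMIX ( 'Product', vProduct ) = 0 );
    # Write one line per rejected record to the custom error file ...
    ASCIIOutput ( sErrorFile, 'Unknown product', vProduct, NumberToString ( vAmount ) );
    # ... and make sure the record itself is not loaded into the cube.
    ItemSkip;
  ENDIF;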

I sometimes see applications where only the first action is covered. The analysis on data quality is done, but the necessary error logging is missing. The result is that only qualitative data and metadata is uploaded to TM1, and that is good. But the end-user of the application has no clue at all why certain data or metadata is missing. In this situation the IBM Planning Analytics TM1 administrator will regularly get questions in case of missing data or metadata. The best thing to do is to create custom error log txt files that are placed in an agreed folder on the system.

One big disadvantage of doing this is that it will probably lead to a bunch of files for the end-users. Most probably the end-user will not see the wood for the trees. When the standard logging of TM1 and/or a custom logging file is in place, an often heard complaint is that this is not user-friendly enough and the TM1 administrators still need to be involved to investigate the issue(s).

Maturity Level III

Custom IBM Planning Analytics - TM1 quality checks - TM1 application

Figure 4: Maturity Level III

The next level in maturity, giving more user-friendliness and more responsibility to the end-user when analyzing data quality, is storing the error logging in TM1 cubes. You can even create nice templates that are provided to the end-user via, for example, IBM Planning Analytics Workspace (PAW) or IBM Planning Analytics TM1 Web.
The requirements for the logging in TM1 are the following:

  • The error logging should not have a negative impact on the performance of the TI-process itself;
  • A limited number of additional dimensions and cubes should be necessary for logging the errors;
  • The logging storage capacity should be limited, so that it does not have an impact on the total memory needed by the TM1 application;
  • The end-user should be able to open the error logs with the standard TM1 clients;
  • The logging steps should be simple to incorporate in the TI-process; no additional development effort should be needed when introducing new TI-processes;
  • The logging functionality should be easy to copy over to new models and should be easy to understand for a moderately experienced TM1 developer.

For all my IBM Planning Analytics - TM1 applications, I incorporate a standard set of TM1 objects that covers the above requirements and that can be transferred to every new or existing TM1 model. An important object in this standard set is the TM1 cube for storing the results of quality checks, called “Error Log”.

This TM1 cube consists of the following dimensions:

  • }Processes
  • Record Number
  • Error Category
  • Error Log Measure
  • Date and timing dimensions (optional)

The dimension “Error Category” contains the possible errors that can arise from the quality checks executed in the upload processes. Below is an example of a subset of such a dimension:

Figure 5: Dimension Error Category

The dimension “Error Log Measure” contains a number of string elements that, depending on the error category, contain a descriptive context of the error. The dimension “Record Number” is a placeholder that is needed for storing more than one error per category and process. The technical dimension “}Processes” is used for assigning the error to the process where the error originates from. The date and timing dimensions (year, month, day, hour, minute and second) are optional. In most cases it’s more than enough to store only the result of the last process run, instead of storing the results of all historical runs.

Figure 6: Error Log Cube

For every process in which quality checks are applicable and logging is necessary, a standard set of TI code is added that fills the cube displayed above in case of mismatches. This standard set of TI code is built up with variables, so that only the variables have to be changed and a limited amount of additional code is necessary.
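
Below is a hedged sketch of such a reusable block. The element names “Unknown element” and “Description” and the variables vProduct, sProcess and nRecord are assumptions that follow the setup described above; it also assumes the “Record Number” dimension contains enough numbered placeholder elements.

  # Prolog sketch: remember the running process and reset the error counter.
  sProcess = GetProcessName();
  nRecord  = 0;

  # Data tab sketch: log a mismatch into the Error Log cube instead of a loose txt file.
  IF ( DIMIX ( 'Product', vProduct ) = 0 );
    # Every error gets its own element of the Record Number dimension.
    nRecord = nRecord + 1;
    CellPutS ( vProduct, 'Error Log',
               sProcess, NumberToString ( nRecord ), 'Unknown element', 'Description' );
    ItemSkip;
  ENDIF;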

This cube should be read-accessible to every end-user who benefits from insight into the status of the data quality uploaded to the TM1 model. If you want to make this even more user-friendly, you can create templates that you deploy to IBM Planning Analytics - TM1 Web or IBM Planning Analytics Workspace (PAW).

Figure 7: (part of) IBM Planning Analytics - TM1 web application error Logging

Maturity Level IV

Notification in case of mismatch

Figure 8: Maturity Level IV

In all of the situations described above, the end-user must open a file or an application to get information about data quality. The last mile in reaching maximum user-friendliness and maturity is putting some kind of alert in place to inform the end-user that something went wrong with the automated upload.

It’s possible to initiate executables (bat files) from TI-processes that send an e-mail notification to a defined group of end-users. You need to decide upfront which people have an interest in insight into data quality and are able to take the necessary actions to solve these issues. Sending out e-mail notifications to people that have no interest makes no sense and is quickly marked as spam.
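
A hedged sketch of such a notification step, placed in the Epilog of the upload process, is shown below. The path and the sendmail.bat wrapper are placeholders for whatever command-line mail utility is available in your environment, and nRecord is the error counter from the logging sketch above.

  # Epilog sketch: trigger an e-mail notification when at least one error was logged.
  IF ( nRecord > 0 );
    # Second parameter 0 = do not wait for the command to finish.
    ExecuteCommand ( 'D:\TM1\Scripts\sendmail.bat "Upload errors in ' | GetProcessName() | '"', 0 );
  ENDIF;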

When your application is in this state of maturity regarding data quality, end-users have the full picture they need to understand the status of the quality of the data in the model. This will make the world of both the IBM Planning Analytics TM1 administrator and the IBM Planning Analytics TM1 users a lot easier.