In this case study, we explain element61's role in designing, building and deploying a Management Dashboard at USG People. Together with senior management at the customer, element61 built a management information system (MIS) based on a management cockpit / dashboard philosophy for the benefit of the local operating companies, drawing on existing knowledge, experience and insights into how to effectively monitor and manage a temping company, in line with the corporate strategy.
The local brand portfolio of USG People comprises brands like Start People, USG Financial Forces, USG Innotiv, Unique, Secretary Plus and Express Medical. USG People is listed on the NYSE Euronext Amsterdam stock exchange and is included in the Amsterdam Exchange Index (AEX).
Uniformity in the way information is offered to local management, further enabling aggregation & transparency, through a best-practice underlying data warehousing architecture.
A re-use of best practices from pilots & previous efforts.
The module should form the basis for regional reporting, implying cross-country, cross-label, global customers, etc.
A solution supporting flexible goal & target setting. Based on the available information, the MIS should provide proactive guidance towards certain targets, in line with current economic market conditions.
Flexibility in terms of local flavors.
Fast track delivery.
Cube build & cube access performance
Given the requirement that the solution should also be accessible in the individual offices, for end users who need the information but have only a limited amount of time to retrieve it, all parameters, and their various combinations, that can influence either one were thoroughly investigated. This has led to optimized cube build & cube access performance.
This flexibility involves support for the local parties in terms of language and security. Furthermore, an operating company can define its own dimensions, attributes and KPIs, which can be completely company-specific. These local flavors can be integrated transparently.
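The transparent integration of local flavors can be sketched as a merge of corporate and company-specific definitions. The snippet below is a minimal, hypothetical illustration (all names and rules are assumptions, not the actual USG People implementation): corporate KPI definitions are shared by all operating companies, local ones may only add to them, which keeps cross-company aggregation consistent.

```python
# Hypothetical sketch: merging corporate KPI definitions with purely local,
# company-specific additions so that local flavors integrate transparently.

CORPORATE_KPIS = {
    "gross_margin": {"unit": "EUR", "owner": "corporate"},
    "hours_invoiced": {"unit": "h", "owner": "corporate"},
}

def merge_kpi_definitions(corporate, local):
    """Return the KPI set for one operating company.

    Local definitions may only *add* KPIs; corporate ones cannot be
    overridden, so aggregation across companies stays consistent.
    """
    clashes = set(corporate) & set(local)
    if clashes:
        raise ValueError(f"local KPIs may not redefine corporate ones: {clashes}")
    merged = dict(corporate)
    merged.update({name: {**spec, "owner": "local"} for name, spec in local.items()})
    return merged

# Example: one operating company adds a KPI that only makes sense locally.
unique_kpis = merge_kpi_definitions(
    CORPORATE_KPIS,
    {"candidate_pool_size": {"unit": "count"}},
)
```

The same merge idea applies to company-specific dimensions and attributes: anything locally defined is tagged as such, so corporate reporting can filter it out while local reports show it.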
Introduction of a reporting solution in an international environment
The implementation blueprint is a methodology which enables operating companies to get on board as swiftly as possible. It contains the answers to the questions "who", "what", "why", "when" & "how". Major elements here were:
- Definition of the various high-level tracks, medium-level phases and detail-level processes within the overall blueprint.
- Identification of all required local & corporate roles and their corresponding responsibilities.
- RACI methodology identifying the persons who are Responsible, Accountable, must be Consulted and must be Informed.
- Identification of the major communication lanes between corporate and local roles.
- Identification of all required materials (templates, 1-pagers, documentation, ...)
Data warehouse robustness
Dealing with different local parties, all delivering their own data, implies a multitude of classic data integrity issues.
Major parts here were:
- When dealing with different (local) parties, data comes from everywhere and can be of low quality, so error handling must be dealt with in a transparent way. The screening technique caters for a classic star schema containing all identified errors, so that data-related issues are detected as early as possible and the errors are automatically returned, transparently, to the corresponding party.
- Quality assurance, proving the "data in" to be equal to the "information out" offered via the reporting tools, combined with data lineage.
- Fully automated ETL, from interface files up to the cubes.
- Capturing logging metadata, such as timings, volumes and correctness.
- Automated communication on intermediate and final statuses of the load processes.
- Detailed functional & technical documentation; although this sounds obvious in theory, it is often neglected in practice.
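The screening technique described above can be sketched as follows. This is an illustrative assumption of how such a screen might look (the rule names, record fields and error-fact layout are hypothetical, not the actual implementation): each incoming record is checked against validation rules, and failures are written to an error "fact" carrying the dimensions needed to report them back to the delivering party.

```python
# Hypothetical sketch of a screening pass: invalid rows are captured in
# error facts (source file, row, violated rule, company) so they can be
# returned transparently to the delivering party; valid rows continue.

from datetime import datetime

RULES = {
    "hours_not_negative": lambda r: r.get("hours", 0) >= 0,
    "company_code_present": lambda r: bool(r.get("company")),
}

def screen(records, source_file):
    clean, error_facts = [], []
    for rownum, rec in enumerate(records, start=1):
        failed = [name for name, check in RULES.items() if not check(rec)]
        for rule in failed:
            error_facts.append({
                "source_file": source_file,  # identifies the delivering party
                "row": rownum,
                "rule": rule,                # which validation was violated
                "company": rec.get("company"),
                "detected_at": datetime.now().isoformat(),
            })
        if not failed:
            clean.append(rec)
    return clean, error_facts

clean, errors = screen(
    [{"company": "START", "hours": 38}, {"company": "", "hours": -1}],
    "weekly_interface_file.csv",
)
```

In the real solution the error facts would land in the star schema mentioned above, and the captured metadata (timings, volumes, correctness) would feed the automated status communication.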
Based on a number of corporate rules, the interface files coming from the operating companies' operational systems are placed on a central file server.
The Oracle data warehouse contains four main areas:
- A company specific landing area, used for applying the quality assurance & technical validation processes.
- A generic staging area in which only data of acceptable quality is gathered.
- A data warehouse area, in which the data is enriched with corporate information, such as the various segmentation groupings.
- The metadata area interacts with all other areas, ranging from file server up to the cube build areas.
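The flow through these areas can be illustrated with a minimal sketch. This is an assumption-laden simplification (the segmentation lookup and record fields are hypothetical): records land per company, only rows that pass validation reach the generic staging area, and the warehouse step enriches them with corporate information, while logging metadata is captured along the way.

```python
# Hypothetical sketch of the landing -> staging -> warehouse flow, with the
# metadata area mimicked by a dict of load statistics.

SEGMENTATION = {"START": "General Staffing", "UNIQUE": "Specialist Staffing"}

def load(landing_records):
    metadata = {"landed": len(landing_records), "staged": 0, "loaded": 0}

    # Landing area: company-specific technical validation; only data of
    # acceptable quality passes into the generic staging area.
    staged = [r for r in landing_records if r.get("company") in SEGMENTATION]
    metadata["staged"] = len(staged)

    # Data warehouse area: enrich with corporate segmentation groupings.
    warehouse = [{**r, "segment": SEGMENTATION[r["company"]]} for r in staged]
    metadata["loaded"] = len(warehouse)

    return warehouse, metadata

dwh, meta = load([{"company": "START"}, {"company": "???"}])
```

In the actual solution these steps are implemented as ETL jobs against the Oracle database, and the metadata is persisted rather than returned in memory.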
- Cognos 8 BI Data Manager is used as the ETL solution. Complex manipulations (e.g. an algorithm to determine segmentation groups) are handled by PL/SQL procedures, which are called from within Data Manager jobs.
- In Cognos 8 BI Transformer, different cubes are built, based on the company specific languages & security requirements.
- Cognos 8 Framework Manager defines a global Relational and Dimensionally Modeled Relational (DMR) Metadata model including security filters on company and hierarchical organization groups.
- Cognos 8 BI Report Studio is used as the report-building tool on top of the Framework Manager model and Transformer cubes, creating the various KPI reports.
- Cognos 8 BI Analysis Studio is available on top of the Transformer cubes for ad-hoc analysis.
- Cognos 8 BI Metrics Studio scorecards have been implemented.
Definition of the architecture
Cognos 8 Business Intelligence expertise
Oracle RDBMS expertise
Definition of the implementation (blueprint) methodology
Data quality handling
The following benefits have been realized:
A vast set of KPIs, which are strongly related to the temping business and as such accepted by the business, can now be consulted by the end users.
The reporting solution has a very intuitive reporting interface combined with acceptable, i.e. "fast", reporting performance. The solution is actively used for steering and is indispensable for its end users.
Although a strong focus exists on the "one size fits all" concept, the solution can easily cope with various local flavors, making the solution recognizable and locally fit.
Finally, a transparent workflow has been established which caters for very time-effective pre- and post-go-live processes.