The war on analytics talent & BICC project backlog: can data estate automation be the solution?

The war on talent is affecting most industries, but none more than IT. It is increasingly difficult to find quality data professionals and to hold on to them once hired. Worse still, most of these professionals spend their time and valuable expertise hand-coding ETL scripts and meticulously managing data flows.

To win this war on talent, companies must not only provide their employees with the latest digital tools, but also let them apply their talent to purposeful growth, innovation, and game-changing breakthroughs rather than tedious coding and data management.

Data is no longer a mere byproduct of business processes, or something stored only because regulators require it. The term "data estate" reframes the narrative: think of data as a raw resource.

Data is the new oil, and data science is the refinery. Enterprises large and small need to handle their data with the respect it deserves. Business leaders understand that data holds the potential for business transformation, new business models and profit. They manage their data effectively, increase its quality, entrust it to good data stewards and grow it.

Analyst research indicates that the amount of data in the world is doubling every two years; for many organizations it grows even faster. As data volumes increase, the data becomes more complex and harder to access, govern, and keep compliant.

So companies are looking to solve their current pains, but what about a few years from now? How will they keep up with these growing data volumes? They want a solution that not only helps the organization today but also prepares it for the future.

Self-service ETL and data warehouse automation

On one side, companies have their ever-growing data sources. On the other, there are the business analytics tools at their disposal: dashboards, reports, self-service BI, predictive tools, and so on.

 


Figure 1 – Self-service approach to BI

Some companies begin by connecting these front-end analytics tools directly to the data sources, either using connectors within the tool or by writing scripts to extract the data. With this approach, each analytical tool has its own data pipeline and set of transformations. While it gives quick access to the data, it becomes increasingly difficult to manage. Data silos develop, with limited control over data quality and security.

Then, when a business user needs more or new data, extended delays are common: connectivity to source systems and data infrastructure issues must be addressed, and business logic that has already been developed for other users and departmental analytical applications must be reproduced.
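To make that duplication concrete, here is a minimal, purely illustrative Python sketch. The source table and the "net revenue" rule are hypothetical stand-ins for real systems; the point is that each tool's hand-coded pipeline re-implements the same business logic.

```python
import sqlite3

# Hypothetical source: an in-memory orders table stands in for an ERP system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, discount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, 100.0, 5.0), (2, 250.0, 0.0)])

def extract_for_dashboard_tool():
    # Team A's pipeline: the 'net revenue' rule is hand-coded here...
    return conn.execute(
        "SELECT id, amount - discount AS net_revenue FROM orders").fetchall()

def extract_for_reporting_tool():
    # ...and re-implemented by Team B. If the rule ever changes (say, to
    # include taxes), every copy must be found and updated by hand.
    return conn.execute(
        "SELECT id, amount - discount AS net_revenue FROM orders").fetchall()

print(extract_for_dashboard_tool())
print(extract_for_reporting_tool())
```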

Patchwork of tools

Many organizations begin to see the shortcomings of this approach and attempt to simplify matters, for example by implementing a staging layer to streamline connections and improve security.


Figure 2 – Patchwork of tools

Then, as they encounter specific difficulties or pains, they try to implement a solution that treats each symptom: a tool for extracting data, a tool for transforming it, a tool for modeling, a tool for scheduling, and many others.

While this approach may make individual problems easier to handle, it only makes the overall implementation more difficult to manage. Each tool requires an expert to maintain it, and most of these tools do not talk to one another, so the entire flow must be orchestrated by hand. In addition, maintaining compliant documentation for such solutions often takes as many hours as the implementation itself. This creates a serious drain on resources, a backlog of IT requests for analytics, and delayed access to data.
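As a rough illustration of what that hand-rolled orchestration often looks like, the following Python sketch runs four hypothetical tool invocations in sequence. In reality each step would call out to a separate vendor product; ordering, retries and failure handling fall to whoever maintains the script, not to any of the tools.

```python
import logging

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")

# Hypothetical stand-ins for four separate products: in practice each step
# would shell out to a different vendor tool with its own config and expert.
def run_extract():   logging.info("extract tool finished")
def run_transform(): logging.info("transform tool finished")
def run_model():     logging.info("modeling tool finished")
def run_docs():      logging.info("documentation tool finished")

# The 'orchestrator' is often just a script like this one: the glue between
# tools is the integrator's problem and a single failure stalls the flow.
PIPELINE = [run_extract, run_transform, run_model, run_docs]

for step in PIPELINE:
    try:
        step()
    except Exception:
        logging.exception("pipeline halted at %s; downstream steps skipped",
                          step.__name__)
        break
```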

Data estate automation

Companies do not want to simply treat the symptoms; they want to address the underlying condition. To run an effective business, users cannot wait days or weeks for new data: they want instant access. And not all users' needs are the same: power users need raw data, business users need governed data, and casual users need pre-built data models.

IT or BICC management wants to oblige, but rightfully has concerns about governance, security, privacy and cost. Furthermore, organizations understand they should take advantage of deploying their data estate to the cloud. The advantages of cloud computing are numerous: saving time and money on maintaining systems, paying only for the technology actually used, working anywhere without disruptive access delays, scalability, flexibility, and more. However, a cloud strategy that involves critical financial and operational company data, or customer data, may sound overwhelming. Automation is the answer that eases the journey of deploying data to the cloud.

Classic data warehouse efforts typically require manual work on schemas, data, ETL processes, metadata, users and applications. Companies leveraging a public cloud service such as Microsoft Azure should look for tools that simplify and accelerate the work of delivering a data estate that adequately serves their organization. That is important because time-to-data matters: time-to-data is the interval between when someone has an idea for a new way of using data and when the data for that idea reaches the BI front end or analytics tool being used.

So, how does automation help build a modern data estate on Azure? A modern data management platform provides a cohesive data fabric for analytics data on Azure, on-premises or in a hybrid model, with no need to stitch together separate tools for ETL, data modeling, code management, security, and documentation.

In an automated solution, scripts are generated automatically and kept up to date to reflect the correct names of sources that have changed. Measures and calculations are defined once and reused throughout the entire solution and in every front end, such as Microsoft Power BI, Qlik Sense or Tableau, that accesses the data warehouse. Automated data impact analysis and lineage bring clarity about the origin of data without IT having to answer requests by combing through lines of code manually. Automatic creation of documentation keeps track of which data goes where, satisfying GDPR compliance and audit requirements. Automation frees up resources, tracks changes and keeps systems up to date, making maintenance easier, and adding new data sources becomes significantly faster and more flexible. By automating repetitive manual work, businesses free up employees' time, allowing them to focus their attention on critical business matters.
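As a minimal sketch of the metadata-driven idea behind such automation (not TimeXtender's actual implementation or metadata format), the following Python example defines a measure once and generates both the warehouse DDL and the lineage documentation from that single definition. All names here are illustrative.

```python
# Hypothetical metadata store: one entry per measure, defined exactly once.
MEASURES = {
    "net_revenue": {
        "expression": "amount - discount",
        "source_table": "orders",
        "description": "Order amount after discounts",
    },
}

def generate_view_sql(name, meta):
    # The DDL is generated, not hand-coded: rename a source column in the
    # metadata and the script regenerates every dependent object.
    return (f"CREATE VIEW v_{name} AS\n"
            f"SELECT {meta['expression']} AS {name}\n"
            f"FROM {meta['source_table']};")

def generate_lineage_doc(name, meta):
    # The same metadata doubles as lineage documentation for audits.
    return f"{name}: derived from {meta['source_table']} ({meta['description']})"

for name, meta in MEASURES.items():
    print(generate_view_sql(name, meta))
    print(generate_lineage_doc(name, meta))
```

Because every generated object traces back to a single metadata entry, changing a rule or a source name is one edit that regenerates code and documentation together, which is what keeps scripts, measures and lineage consistent without manual upkeep.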

Conclusion

While organizations have never had more interest in using data as a company asset, executives and Business Intelligence Competence Center (BICC) leaders face two main challenges in coping with the increasing demand for analytics. On the one hand there is a war on talent, especially in analytics; on the other, there is a growing backlog for ICT and the BICC to deliver on all new requests from the business. Additionally, the rise of self-service business intelligence does not necessarily replace the need for a solid data warehousing layer in the ICT architecture: disintegrated data silos do not always deliver the essential "single version of the truth".

Automation has always been a key solution for bringing greater efficiency to industries and vocations of all types. When it comes to data management, however, and especially to data integration and ETL (extraction, transformation and loading), strenuous, repetitive coding is still how most BI professionals build the data warehouse. Data warehouse automation can accelerate intricate processes and let proficient analytics professionals complete more important tasks that lead to real business value. TimeXtender's Discovery Hub is such an enabling technology: modernizing and transforming one's data estate with it can clear project backlogs and relieve analytics professionals of today's time-consuming coding processes. TimeXtender's distinctive data architecture, Discovery Hub®, is a high-performance software platform that leverages AI and automation to build a flexible, information-focused data estate on the Microsoft SQL Server data platform.

It is imperative that neither IT nor a lack of skills ever becomes a limiting factor in an organization's journey to becoming more data-driven. Instead, data teams and ICT should empower users to leverage all the valuable information already stored in operational systems and to use it to support even more intelligent decisions. So, companies looking for a platform specifically designed to build and maintain a modern data discovery architecture should consider deploying their data estate to Microsoft Azure and enabling instant access to data with TimeXtender's Discovery Hub.