The Foundation of a Robust Data & Analytics Program - Part One


Peer Practices
Written by Erica Marroquin

Cal Rosen

Vice President, Data & Analytics

Home Trust

Cal Rosen’s approach to rigorous data management truly began years ago when he started his own consulting firm. Without extensive corporate resources and structure, Rosen rolled up his sleeves to develop and test comprehensive methodologies, toolkits, and accelerators (MTA) from scratch.

“When there is no team behind you, it’s on you as an individual to put pen to paper. I was delivering numerous projects in a one-off manner, which led me to document and progressively refine a toolkit for consistently and harmoniously approaching common challenges. So, I developed detailed MTAs to deliver end-to-end data and analytics programs. That’s the origin of my data management framework and overall data strategy approach,” said Rosen.


Rosen has applied, tested, evolved and proven aspects of the MTA in more than 40 unique engagements as a management consultant and financial industry executive across North America. Although organizations sit at different stages of data program maturity, they share many common issues, and it is from this lengthy portfolio of work that he built his foundational framework.

“At the center of a successful D&A program is a core set of capabilities that I refer to as the Data Management Framework (DMF). The DMF defines and describes both the ‘what’ and ‘how’ of the program, including the operating model, business engagement, and data management and analytics target state,” said Rosen. 

Data Management Framework

Rosen’s Data Management Framework is composed of four dimensions — stakeholders, people, process and technology. 

This first part of The Foundation of a Robust Data & Analytics Program series covers three of the four Data Management Framework dimensions: stakeholders, people, and technology.

Stakeholders — Who are you building this for?

When launching any large-scale program within an organization, it is critical to identify and engage key stakeholders — individuals who will be directly involved in, impacted by, or influential to the program. They can be internal or external participants, as well as partners or agents of the business. By understanding the key stakeholders, their level of support for the program, and their unique circumstances, program leadership can effectively prioritize elements of the program scope and realize capabilities in the most impactful manner.

People — How is the program organized and how does it operate?

Within the dimension of People, there are two core considerations — the Data Governance Organizational Model and the Data Stewardship Operating Model. These two complementary models define the structure and make-up of the actors within the program, as well as how the program functions. The Data Governance Organizational Model has a lot in common with corporate data governance models. It addresses the cadence and mandate of the data program and defines the hierarchy of the council organizations. It answers common organizational questions: Who does what? When do issues get escalated? How are disputes resolved?

A robust Data Stewardship Operating Model is often overlooked. It describes how data is governed across both data production and data consumption, and how stewardship functions on a day-to-day basis balancing the relationship between project work and business as usual (BAU). This model goes well beyond defining the roles and responsibilities of the individuals involved in the program from across the organization. Driven by business demands and priorities, it defines the rules of engagement, level of sharing, integration, and standardization necessary for executing persistent enterprise data governance. The model defines how stewardship roles and processes are mobilized for project work and BAU to meet the organization’s data management needs. 

“The model starts from the enterprise data domain perspective and moves down to the granular physical instance of the related data field in an operational system. This ensures that end-to-end governance is fully accounted for across projects and BAU for both operational and informational systems,” says Rosen.


Technology — What tools do I need?

The next dimension is technology, where Rosen purposely focuses on six elements.

  1. Authoritative Systems of Record. The first element addresses the authoritative systems of record. Most organizations have established systems of record for every single field, but Rosen looks beyond the basic record tracking. Based on his experience in the finance industry, he recommends establishing certain guidelines when transferring data into and out of data repositories, data lakes or warehouses.

“A lot of organizations are not capturing and sharing their metadata along with the datasets. Metadata is an absolute prerequisite for all source systems or authoritative systems of record because you're going to want to map your authoritative systems of record to your enterprise data model,” says Rosen.
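To make this concrete, here is a minimal sketch (illustrative only, with hypothetical names and structures, not drawn from Rosen’s MTA) of metadata travelling with a dataset, mapping each physical field in an authoritative system of record to an entity and attribute in the enterprise data model:

# Illustrative sketch: dataset metadata shipped alongside the data itself,
# mapping source fields to the enterprise data model. All names are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class FieldMapping:
    source_field: str    # physical column in the system of record
    edm_entity: str      # enterprise data model entity
    edm_attribute: str   # enterprise data model attribute
    steward: str         # accountable data steward

@dataclass
class DatasetMetadata:
    dataset_name: str
    system_of_record: str    # the authoritative source
    refresh_cadence: str     # e.g. "daily batch" or "event stream"
    mappings: List[FieldMapping] = field(default_factory=list)

# Example: a hypothetical balance extract landing in a data lake
metadata = DatasetMetadata(
    dataset_name="account_balances_daily",
    system_of_record="core_banking",
    refresh_cadence="daily batch",
    mappings=[FieldMapping("ACCT_BAL_AMT", "Account", "current_balance", "Retail Lending")],
)

When a dataset like this is copied into a repository, lake or warehouse, the metadata travels with it, so the mapping back to the enterprise data model is never lost.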


  2. Artificial Intelligence (AI) & Machine Learning (ML). The second element is advanced analytics (i.e., AI and ML). All organizations need to consider new data science workflows and processes that will help their employees work more efficiently and effectively to deliver greater business value. Organizations sit at various points along the advanced analytics maturity curve, and those furthest along can develop and deploy advanced analytics at scale. Rosen includes this section as an opportunity for clients to declare what the business will focus on with respect to advanced analytics (i.e., offensive vs. defensive use cases).

The advanced analytics development environment should enable a consistent development and deployment methodology, providing a standard approach to model development, model management and model execution, with consistency in automated data preparation, data analysis, and insight generation. This, in turn, enables the migration from a one-off advanced analytics project mentality to one where broad ranges of advanced analytics use cases are delivered and deployed at scale.
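As a thought experiment, a shared contract like the sketch below (purely illustrative; the class and method names are assumptions, not part of Rosen’s framework) is one way to impose that consistency, so every use case follows the same prepare, train, evaluate and register path regardless of who builds it:

# Illustrative sketch of a shared model-development contract; all names are hypothetical.
from abc import ABC, abstractmethod
from typing import Any, Dict

class StandardModelPipeline(ABC):
    """Every advanced analytics use case implements the same four steps."""

    @abstractmethod
    def prepare_data(self) -> Any:
        """Automated, governed data preparation from approved sources."""

    @abstractmethod
    def train(self, data: Any) -> Any:
        """Model development under the shared methodology."""

    @abstractmethod
    def evaluate(self, model: Any, data: Any) -> Dict[str, float]:
        """Consistent metrics for model management."""

    @abstractmethod
    def register(self, model: Any, metrics: Dict[str, float]) -> str:
        """Hand off to the common execution environment; returns a model identifier."""

def run(pipeline: StandardModelPipeline) -> str:
    # One execution path shared by every use case is what enables deployment at scale.
    data = pipeline.prepare_data()
    model = pipeline.train(data)
    metrics = pipeline.evaluate(model, data)
    return pipeline.register(model, metrics)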

  3. Search, Visualization and Reporting. Closely tied to an organization’s decision on analytics, the third element is search, visualization and reporting. The organization should standardize how all employees interact with data and information, whether through lookup-style searches or unique dashboards. Accessing data through a common UI facilitates rapid internalization of findings. Modern end-user tools offer enormous autonomous flexibility to access, link and join data from a variety of end-user self-selected sources, illustrating results in a highly intuitive and visual manner. However, this needs to be pursued with caution, as uncontrolled data democratization can, in some cases, lead to and exacerbate end-user data anarchy.
  4. Technology Architecture. The fourth and fifth elements are closely related. The Technology Architecture (TA) provides a common understanding of data, promotes consistent use, and enables sharing of data across systems and processes. The goal of the TA is to establish the means and methods of dealing with event and delta/batch data flow across the environment. In doing so, the TA establishes a standardized blueprint or pattern for data ingestion, data transformation, data integration and data consumption (a simple, illustrative sketch of such a pattern follows this list).
  5. Data Architecture. Data Architecture (DA) is a design discipline that describes the various data structures in place across the continuum of collection, storage, transformation, distribution and consumption systems. If required, the DA also defines how you intend to construct your common view of organizational data. The key decision in establishing a common view of data is where to integrate it. In Rosen’s experience, there is no one-size-fits-all approach; different industries do it in different ways. However, in many industries, broad consumption of corporate information on a foundation of governed data is enabled by integration at the core.
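For illustration, the following sketch (hypothetical, not a specific product or Rosen’s blueprint) shows the kind of standardized pattern the TA describes: every flow, whether event-driven or delta/batch, passes through the same ingestion, transformation, integration and consumption stages.

# Illustrative pattern only; stage names and record shapes are assumptions.
from typing import Any, Callable, Dict, List

Record = Dict[str, Any]
Stage = Callable[[List[Record]], List[Record]]

def ingest(records: List[Record]) -> List[Record]:
    # Land raw event or delta/batch records without altering them.
    return list(records)

def transform(records: List[Record]) -> List[Record]:
    # Apply standard cleansing and conformance rules.
    return [{**r, "conformed": True} for r in records]

def integrate(records: List[Record]) -> List[Record]:
    # Resolve records against the common, governed view of data.
    return records

def consume(records: List[Record]) -> List[Record]:
    # Expose governed results to reporting, visualization and analytics.
    return records

PATTERN: List[Stage] = [ingest, transform, integrate, consume]

def run_flow(source: List[Record]) -> List[Record]:
    # Every data flow follows the same blueprint, regardless of its source.
    data = source
    for stage in PATTERN:
        data = stage(data)
    return data

The point is not the code itself but the discipline: once a single pattern exists, each new source or consumer plugs into it rather than inventing its own path.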


Lastly, the sixth element is the data lifecycle. Because of its potentially expansive reach, Rosen identifies it as a consideration that many organizations fail to adequately address.


  6. Data Lifecycle. A data lifecycle describes the process of how data is handled — from the time that it is captured or originated to the time it is destroyed, and at all points in between. The data lifecycle breaks down this continuum into discrete steps (i.e., Governance, Risk and Compliance (GRC); Structure; Content; Logistics; Consumers). This allows organizations to treat data management not as a point-in-time exercise but as the ongoing management of data across a continuum.
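One way to picture that continuum is as a set of discrete, checkable steps applied to a dataset from capture through destruction. The sketch below is a minimal, hypothetical illustration; the step names follow the categories listed above, and everything else is an assumption.

# Illustrative only: lifecycle steps as a checklist applied across the whole continuum.
from enum import Enum, auto
from typing import List, Set

class LifecycleStep(Enum):
    GRC = auto()        # governance, risk and compliance: retention, privacy, destruction rules
    STRUCTURE = auto()  # how the data is modelled and stored
    CONTENT = auto()    # the quality and meaning of the data itself
    LOGISTICS = auto()  # how the data moves and is distributed
    CONSUMERS = auto()  # who uses the data, and for what

def outstanding_steps(completed: Set[LifecycleStep]) -> List[LifecycleStep]:
    # Which lifecycle steps has a given dataset not yet addressed?
    return [step for step in LifecycleStep if step not in completed]

# Example: a hypothetical dataset that has only addressed GRC so far
remaining = outstanding_steps({LifecycleStep.GRC})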

In part two, Rosen will discuss the final dimension, Process, and the components needed to complete your data management framework.


About the Executive: Cal Rosen, vice president of data and analytics at Home Trust in Toronto, Canada, began his data and analytics consulting career in the early ’90s. He started by defining and building data warehouses and prototypes for the telco industry in the US and Canada with Teradata Industry Consulting, then led consulting practices for PwC and Cap Gemini Ernst & Young in addition to starting and successfully running his own consulting business, ActionInfo Consulting. Rosen has successfully delivered projects in multiple industries across North America (Communications, Retail, Financial & Insurance, Energy & Utilities, Transportation, Healthcare, Natural Resources, Gaming and Public Sector). Over the past few years, Rosen has also held executive leadership roles in the nascent Data Offices of two of Canada’s largest international banks. As a data and analytics thought leader, evangelist, and author, Rosen has appeared as a speaker and panelist at well over 30 industry conferences.
