Back in the 1990s, decision science was all the rage. Often viewed as the precursor to Big Data, decision science focused on streamlining decision-making and using all available tools and data for advanced modeling. Consolidating and combining different, independent functions became a key enabler of decision science. For example, when a company decides to market financial-services offerings, if done independently of a risk-management function, the company will primarily focus on increasing revenue from new accounts. Risk management, however, is also needed to ensure that those new accounts will not ultimately become bad assets. Combining elements of both functions would allow for a more efficient, coordinated process, with better outcomes.
Twenty-five years later, decision science has given way to data science: essentially the same concept, deploying better solutions through advanced data access and modeling, except now the data is at massive scale. Companies are deploying new technologies at a record pace, but many of those same companies are neglecting the organizational updates they would have made with decision science, because those updates are hard to do. It’s one thing to bring on new technologies, but updating organizations, moving resources around, changing reporting relationships…that’s hard! Yet the result of skipping that work doesn’t just inhibit change; it prohibits it.
To be effective, Big Data technology ultimately must rely on five core enablers:
1) Generate, prioritize and approve use cases to make sure analytics initiatives are delivering business value.
2) Leverage data lineage and metadata, tying use-case implementation to core data assets.
3) Institute governance, security and access control to ensure regulatory compliance.
4) Track successes and identify lessons learned so results can help drive business transformation.
5) Develop an operating model and map budget allocation to tackle the tough question: Who gets control?
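To make the second and third enablers concrete, here is a minimal sketch, in Python with hypothetical names and roles, of what it means to tie a use case to its core data assets and gate it behind access control. The record types, role names and `can_run` check are illustrative assumptions, not a prescribed implementation:

```python
from dataclasses import dataclass, field

# Hypothetical, minimal records: each analytics use case is linked to the
# data assets it depends on (lineage), and each asset carries an owner and
# an access policy (governance).

@dataclass
class DataAsset:
    name: str
    owner: str                                  # accountable data steward
    allowed_roles: set = field(default_factory=set)

@dataclass
class UseCase:
    title: str
    sponsor: str                                # approving executive
    assets: list = field(default_factory=list)

def can_run(use_case: UseCase, role: str) -> bool:
    """A role may run a use case only if it is cleared for every asset it touches."""
    return all(role in asset.allowed_roles for asset in use_case.assets)

# Illustrative assets and a use case spanning both marketing and risk data.
accounts = DataAsset("new_accounts", "risk_ops", {"risk_analyst", "marketing_analyst"})
credit = DataAsset("credit_history", "risk_ops", {"risk_analyst"})
scoring = UseCase("Account risk scoring", "CRO", [accounts, credit])

print(can_run(scoring, "risk_analyst"))       # True: cleared for both assets
print(can_run(scoring, "marketing_analyst"))  # False: not cleared for credit_history
```

Even a registry this simple makes the governance questions explicit: every use case has a sponsor, every asset has an owner, and access denials point directly at the asset and policy involved.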
Simply implementing policies or technologies to add these functions is not enough; the organization must be set up to embody and embrace these core principles. That doesn’t happen without making some hard decisions.
Organizations must address questions such as which executive gets to approve analytics use cases, how to ensure full use of production capability, what chargeback model to use, who owns data-science resources and many, many more. Enterprises must also ensure they have a plan for deploying new analytics models and making certain that these changes are resolving real data issues.
Without tackling and answering such fundamental questions alongside the deployment of new technology, that technology will not produce anything close to lasting, transformative change.
So how do leading organizations tackle this? How do they make sure they allocate the appropriate resources to the right places to ensure a successful transformation?
To activate these functions, consider putting two governance bodies in place: an Analytics Governance Council (AGC) and a Data Governance Council (DGC).
Analytics Governance Council (AGC)
An AGC is a group that meets monthly to discuss the transition to a new Big Data system and evaluate what is working and what isn’t. It audits the approaches taken and determines the best course of action to achieve positive results by considering both operational requirements and business benefits. Ideally, the AGC is made up of senior executives, which ensures that real changes and decisions can happen without getting hung up on budgetary restrictions or corporate policies. A well-functioning AGC can also mean less downtime and more time spent innovating.
Data Governance Council (DGC)
A DGC plays an equally important role in organizational change, but functions at an operational level. A DGC is made up of data stewards or subject-matter experts who have direct knowledge of core data sources. They can provide valuable insight into potential issues that may arise as an organization heads towards a new strategy surrounding Big Data. During periods of significant change or innovation, a DGC should meet weekly to advise on data access and any areas that may create issues for deploying use cases in production.
There are real nuances in how these bodies are constructed and what they do. If done right, however, these constructs allow for real decisions to be made with shared accountability for the results.
They enable the organizational change that must underpin the rapid advances in Big Data and data science. Without them, the highly touted and much-promised gains in productivity, revenue growth and customer experience from Big Data and advanced analytics cannot come about.
Joshua Siegel is a Director for Dell EMC Big Data and IoT Consulting Services with more than 20 years of experience assisting large enterprise clients with strategy and implementation around Big Data transformations.