Deloitte’s John Lucker on Analytics Trends 2016

by Scott Etkin | January 28, 2016

Even in the fast-moving, ever-evolving world of data analytics, some topics manage to linger in our collective consciousness, their significance to the success of big data initiatives too great for attention to shift away for long.

In preparing its third annual analytics trend report, Deloitte identified a mix of topics that will dominate the conversation in 2016: some that defined the course of the analytics world last year and continue to be impactful, and new topics that promise to make their mark over the course of this year.

As we did last year, Data Informed once again spoke with John Lucker, Deloitte's Global Advanced Analytics and Modeling Market Leader, a leader for Deloitte Analytics, and a core contributor to the report, about the trends it identifies and the analytics outlook for 2016.

Data Informed: Last year’s report identified data security as a supertrend, and security is singled out again in this year’s report. What steps do you see organizations taking to combat the growing sophistication of cybercriminals?

John Lucker of Deloitte

John Lucker: Many companies are beginning to take an offensive, rather than a defensive, stance in their approach to cybersecurity. This move from reactive to proactive is happening because the potential damage to operations and reputations is so serious, and also because data security is better funded by companies today than it has been in the past.

Becoming proactive about security flaws takes many forms, from the mundane to the more sophisticated. For example, many organizations are applying security patches the moment they are released instead of letting them pile up and addressing them in bulk. Such systems maintenance is no longer considered an annoyance; it is a business imperative. Many businesses are also taking a fast-and-furious approach to hiring experts in the area of cybersecurity. As security issues grow in complexity, expect a talent shortfall in this area, which will increase the need for external expertise and advanced technologies to keep pace with emerging threats.

Technology also is evolving in response to cybercrime in ways that enable a more proactive approach. Software developers are designing software from the ground up with cyber risk in mind. Many new systems are designed to eliminate known or predicted cyber risks, with security holes closed off at the architectural level. Analytics engines have also evolved to help analyze cyber risks in a more sophisticated way, addressing both reactive and proactive risk.

The report states that data security could, in some cases, slow the adoption of other trends that drive innovation. How can an organization identify the “sweet spot” between ensuring its data is protected and not slowing innovation in the process?

Lucker: Approaching data security and mitigating cyber risk proactively is a foundational skill set and imperative, and it should be a component of any company’s inherent risk management architecture. In the same way you wouldn’t build a house on an unstable foundation, companies must be sure they create a stable cyber security foundation before rushing ahead with higher-level innovation.

What is a stable foundation? That's a fluid concept that depends upon the industry and the evaluated state of affairs of any given company. Some companies are probably very ready: many banks, for example, have a more secure foundation, while others, such as some mid-sized retail or manufacturing operations, may not be as far along. We see a continuum of readiness, and each company must evaluate where it sits along that continuum so it can evolve as both an innovator and a risk mitigator while ensuring data security, inching closer to that sweet spot, as you said.

Another thing that’s important: In proactively mitigating risk, many companies do, or need to, purchase cyber insurance to help mitigate the cost of the risk should they suffer a loss. Be aware, however, that insurers are expecting a certain level of readiness and architectural foundation before they will write a policy. No one will insure the contents of a house without windows or doors. It’s the same in this case. Metaphorically, insurers are expecting good door locks and secure windows in the area of cybersecurity before they will insure the organization’s assets.

So where’s the sweet spot? That’s a good question, but I’m afraid there isn’t a single right answer. Each company needs to evaluate the extent to which it believes it has adequately evaluated its risk. In doing so, companies may want to bring in outside advisers to evaluate where they are. Outside experts can do penetration testing and full-scale audits of policies, procedures, infrastructures, and foundations. If you think you have done all you can do, don’t stop. Bring in someone with an open mind to evaluate your progress and tell you how you’ve actually done and where you may still need to improve.

The report anticipates the rise of what it calls the IDO, or insight-driven organization. How does this differ from the majority of organizations that are using analytics to gain insights that inform decision making, and what factors are contributing to its rise?

Lucker: Previously, many organizations weren't talking about being insight-driven, but were talking about being data-driven. Is there a difference? Yes. What we've seen is that industries have spent a lot of time building their data foundation, whether through simple architectures or big data architectures with new and evolving big data technologies. That's what went into being data-driven. But that effort can go on only so long before either you are doing it well or leadership expects to get some value out of what you have built.

So now most companies are saying, “We are not completely done, but let’s get going with what we have and lead the business forward to get good insights from our data foundation, to become more insight-driven.”

Our observation is that, with the exception of a few industries and case studies within those industries, the majority of companies have been building the foundation for the insight-driven organization all along, while they were dabbling with analytics, creating analytics teams, hiring the necessary leadership, creating analytic frameworks, and so on. Now companies have matured, and executives across the organization tend to feel a sense of urgency. Executives who lead in those areas of the company also see the value of connecting these silos of insights and innovation, harmonizing how data is leveraged across functions to create a truly insight-driven organization.

We also have seen the importance of analytics roadmaps and the analytics lifecycle in the journey of the IDO. Today, companies are using short-, medium-, and long-term components of analytics endeavors to create this kind of enterprise perspective of analytics. Building the IDO takes this larger perspective to create a full end-to-end solution. One of the differentiators of an IDO is that you are creating true end-to-end solutions and not just creating analytics bits and pieces. What makes an analytic solution end-to-end? We see it as composed of six components: analytic strategy, analytic component development, analytic business operationalization, analytic technology integration, organizational change management to integrate the analytics into the business processes, and performance management and measurement to recognize what's going well and improve it, and to detect what's not going well and fix it. In this way, IDOs have a solution that connects successes across the enterprise.

Another differentiator of the IDO is that the insight-driven culture values a single version of the truth. If you are creating an IDO and you have a particular KPI or measurement, everyone should be working from the same numbers, the same reality, and the same unified view. That’s part of what the infrastructure helps drive.

The report states that much of the infrastructure for Internet of Things applications already is in place, but that peak impact of the technology is five years off. What has to be done in that time for the IoT to achieve maximum impact?

Lucker: It's true that much of the technical infrastructure is in place for those organizations intending to leverage IoT, but certain components of the ecosystem infrastructure are still evolving. Part of that has to do with how data is normalized: creating standards for the data generated by different IoT gadgetry from different manufacturers and platforms.

For example, if you are trying to measure the health effects of the use of fitness bands, you need to recognize that, in most data sets, you will have data from multiple types of fitness bands. So, how do you standardize the format of IoT data? Standards for that kind of thing are still evolving.
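To make that standardization problem concrete, here is a minimal sketch (not anything from the Deloitte report) of normalizing step-count readings from two hypothetical fitness-band vendors into a single schema. The payload field names (stepCount, recorded_at, and so on) are invented for illustration.

```python
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class StepReading:
    """Common schema for a step-count reading, regardless of vendor."""
    device_id: str
    timestamp: datetime  # always stored in UTC
    steps: int


def normalize_vendor_a(payload: dict) -> StepReading:
    # Hypothetical Vendor A: epoch milliseconds and a "stepCount" field.
    return StepReading(
        device_id=payload["deviceId"],
        timestamp=datetime.fromtimestamp(payload["ts"] / 1000, tz=timezone.utc),
        steps=int(payload["stepCount"]),
    )


def normalize_vendor_b(payload: dict) -> StepReading:
    # Hypothetical Vendor B: ISO-8601 timestamp with offset and a "steps" field.
    return StepReading(
        device_id=payload["serial"],
        timestamp=datetime.fromisoformat(payload["recorded_at"]).astimezone(timezone.utc),
        steps=int(payload["steps"]),
    )


readings = [
    normalize_vendor_a({"deviceId": "A-100", "ts": 1453957200000, "stepCount": "5200"}),
    normalize_vendor_b({"serial": "B-7", "recorded_at": "2016-01-28T09:00:00+01:00", "steps": 4800}),
]
```

Once every vendor's payload is mapped into one schema like this, downstream health analyses can treat all bands as a single data source.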

Just like standards, IoT technology also is still evolving. Some devices, even though they are mature, fail. And failures in these devices can affect hundreds of thousands of people. We are in the middle of the evolution from IoT being separate cool gadgets to connected critical infrastructure components. And that’s not necessarily a comfortable place to be.

The IoT industry is getting used to this idea, though. A lot of IoT companies began as startups, and while they are serious about the technology, they may not have evolved, and likely haven't, to the point where they recognize just how critical their devices are becoming. The entire software development, deployment, and production process needs to adopt standards and practices that help productionize what these companies are doing to drive quality and reliability. And many of the newer companies are not as prepared as they need to be for that effort.

Another standards issue is the lack of true standards for how these devices talk to each other and how independent device data can be brought together in meaningful ways. In your house, you may have devices generating temperature data and moisture data. But how do you bring it together in a meaningful way? How do you ensure these devices talk to each other in the same way, with synchronized times? For example, comparing snapshot thermostat data from 10 a.m. and moisture sensor data from 1 p.m. doesn’t deliver an integrated view of the conditions in your house. Standards are required to align and deliver accurate and meaningful insights from IoT data.
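As a rough illustration of that time-alignment problem, and not anything prescribed in the report, the sketch below resamples two hypothetical home-sensor streams onto a common hourly grid so their readings can be compared at the same moments. The timestamps and column names are invented.

```python
import pandas as pd

# Hypothetical raw readings: each device reports on its own schedule.
thermostat = pd.DataFrame(
    {"temp_f": [68.0, 70.5, 71.2]},
    index=pd.to_datetime(["2016-01-28 10:00", "2016-01-28 11:05", "2016-01-28 13:10"]),
)
moisture = pd.DataFrame(
    {"humidity_pct": [41.0, 44.5]},
    index=pd.to_datetime(["2016-01-28 10:20", "2016-01-28 13:00"]),
)

# Put both streams on a common hourly grid, carrying each sensor's last
# observation forward, then join them into one integrated view of the house.
grid = pd.date_range("2016-01-28 10:00", "2016-01-28 14:00", freq="H")
aligned = pd.concat(
    [thermostat.reindex(grid, method="ffill"), moisture.reindex(grid, method="ffill")],
    axis=1,
)
print(aligned)
```

Shared standards for timestamps and units are what make even this simple kind of alignment trustworthy across devices from different manufacturers.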

That's really why we are talking five years out. While the data infrastructure may be more mature, the use cases and paradigm for bringing it all together from an insight perspective are not at all mature. And that's not just in consumer cases.

From an industrial perspective, to the extent that manufacturers are building IoT components into their products, they are likely thinking of it all as one ecosystem, all one device. So the large manufacturers may be harmonizing all their sensor data because they have been doing it for a long time. In a closed ecosystem of IoT, with a limited number of coordinated manufacturers doing a good job and moving along quickly, things are progressing rapidly. But across the larger IoT, we really have a Wild West of organizations trying to create standards so that they can become the glue that brings companies, devices, and IoT data together.

The ongoing shortage of data science talent is identified as a trend that will continue into 2016. What are some steps that organizations are taking to address this shortage? Is this a trend that you see reversing, or is this likely to be an ongoing challenge for the industry?

Lucker: You can't produce enough of these people quickly enough to keep pace with demand. In a world in which we are all different, with different things to offer, different capabilities, and different skill sets, organizations can't just plug-and-play data scientists.

Taking a half-dozen to a dozen classes and getting a certificate or a formal degree in a data science program doesn't necessarily make someone an experienced data scientist. It's the same with statisticians who help interpret business problems. It typically takes years for these people to develop and gain the level of experience necessary to apply what they have learned to complex problems.

As a result, we have a very limited supply of true experts, and that shortfall will likely take many years to solve. For the foreseeable future, a very small percentage of data scientists will be real experts, and the rest will have a continuum of skills. Useful? Probably. Differentiating? Maybe not.

In my opinion, we will continue to experience a disproportionate supply of novices and up-and-comers: people who say they know how to do data mining and statistical analysis but who, in reality, have a limited understanding of how to apply it to create powerful end-to-end business solutions with analytics at the core. At the same time, we'll likely suffer through a shortage of experienced data scientists and statisticians who really know how to uncover and solve the most complex problems. The supply/demand imbalance will work itself out over time, but time and a concerted effort by companies to cultivate the talent are what's required.

As an example, we have seen a similar situation in the real world with the urgent need posed by natural disasters. The day after a hurricane, thousands of carpenters set out for the affected area; some get licensed, many don't, and they start building. But invariably, for the less experienced or outright beginning craftsmen, the results are sub-par and the structures suffer for it. Experience doesn't come just from affixing a badge, license, or certification to one's resume.

Pragmatically, would any organization make a newly minted CPA their CFO or CAO? Not likely.

Let’s talk about attracting the short supply of good candidates out there. Companies should be sensitive to the courtship process. They need to be aware that these candidates often have an expectation that they will be continuously challenged with interesting and diverse assignments that are highly valued by the organization. So companies have to create harmony between the recruiter and the recruit and work closely with colleges and universities to become the preferred place to work for higher-skilled individuals looking for interesting, exciting, and diverse analytic experiences.  Organizations should be sensitive to creating rotational or short-term opportunities for their analytic talent.

The bottom line is that success in closing the talent gap, just like success from the company’s overall efforts, comes from getting things done, not from doing things. To retain gifted people, you have to build and maintain your reputation of giving these people rewarding roles.

An interesting trend the report identifies is cross-pollination between science and business. Can you cite some examples of this and describe how businesses are benefiting from it?

Lucker: There are certain methodologies and technologies used in the hard and soft sciences that are now being used in business. In one example, businesses are leveraging the scientific methods used by DNA researchers and human geneticists to gather insights buried in hundreds of thousands of organizational email messages: understanding organizational behaviors and the interconnectivity of employees and work groups, uncovering fraudulent activities, and identifying customer service improvement opportunities. Text analytics, which borrows sequencing techniques from bioinformatics, is cracking the code on these messages, enabling organizations to be more responsive and effective.
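As one simple illustration of the idea, and not a description of Deloitte's actual tooling, the sketch below treats each message as a set of overlapping character shingles, a text analogue of the k-mers used in genome sequencing, and scores how similar two messages are so that related threads can be grouped. The sample messages are invented.

```python
def shingles(text: str, k: int = 5) -> set:
    """Overlapping k-character shingles, a text analogue of k-mers in genomics."""
    cleaned = " ".join(text.lower().split())
    return {cleaned[i:i + k] for i in range(max(len(cleaned) - k + 1, 1))}


def jaccard(a: str, b: str, k: int = 5) -> float:
    """Jaccard similarity of two messages' shingle sets (1.0 means identical)."""
    sa, sb = shingles(a, k), shingles(b, k)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0


msg1 = "Please approve the Q4 vendor invoice before Friday."
msg2 = "Reminder: the Q4 vendor invoice still needs approval before Friday."
print(f"similarity: {jaccard(msg1, msg2):.2f}")
```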

In cyber analysis, biological approaches are being used to target computer viruses, Trojan horses, and phishing schemes. Since relatively few entirely new computer virus structures appear over time, biological methods traditionally used to identify variants of real-world viruses as they morph and evolve can be applied to computer viruses, which are also designed to morph and evolve. In this way, hard science and cybersecurity cross-pollinate to proactively protect organizations, which brings us back to why cybersecurity continues to be our identified analytic supertrend.
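In the same spirit, and purely as a hedged illustration rather than any vendor's product, an alignment-style comparison (here Python's standard difflib, standing in for the sequence-alignment algorithms used in biology) can flag a suspicious file as a likely variant of a known virus when their byte sequences largely match. The byte samples and the threshold are hypothetical.

```python
from difflib import SequenceMatcher


def variant_score(known: bytes, suspect: bytes) -> float:
    """Alignment-style similarity between two byte sequences (1.0 = identical)."""
    return SequenceMatcher(None, known, suspect).ratio()


# Hypothetical samples: the "variant" differs only in a version marker.
known_sample = b"MZ\x90\x00stage1-loader\x00payload-routine-v1\x00"
suspect_file = b"MZ\x90\x00stage1-loader\x00payload-routine-v2\x00"

score = variant_score(known_sample, suspect_file)
print(f"similarity: {score:.2f}")
if score > 0.9:  # threshold chosen arbitrarily for the example
    print("Likely a variant of the known sample")
```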

Scott Etkin is the editor of Data Informed. Email him at Scott.Etkin@wispubs.com. Follow him on Twitter: @Scott_WIS.

