As businesses search for ways to distinguish themselves from the pack and deliver superior value to customers, author and longtime industry executive Joe Weinman argues that IT should not just be fodder for cost reduction, but can be a critical strategic enabler for the business side of your organization. In his forthcoming book, Digital Disciplines, out in hardcover on August 24th, Weinman explains how big data, cloud, and related technologies can help create unparalleled value for customers and differentiate companies from their competition in four ways: information excellence, solution leadership, collective intimacy, and accelerated innovation.
Weinman spoke with Data Informed about the evolution in how businesses are using big data and cloud technologies.
Data Informed: Please discuss the digital disciplines framework and how digital technologies can have an impact.
Joe Weinman: It’s one thing to say IT should be strategic, and we can wish IT would be strategic, but that doesn’t make it so. The first question one needs to ask is how you would even know that something was strategic. Not just IT, but anything.
I reviewed a number of the traditional business strategy frameworks, such as Michael Porter’s Five Forces model and the Kepner-Tregoe driving forces model, and some of the ones that have emerged more recently, such as Blue Ocean Strategy, Business Model Generation, Big Think Strategy, and so forth. The one that seemed to resonate the most is a model from Michael Treacy and Fred Wiersema called the value disciplines model. Their point was that companies need to differentiate themselves in one of three ways: better processes, which they call operational excellence; better products or services, which they call product leadership; or better customer relationships, which they call customer intimacy.
Their framework is timeless, but a lot has happened in 20 years: The cloud, big data, the Internet of Things, broadband wireline and wireless networks, and social media, to name a few. These technologies do more than put the traditional value disciplines online – they radically transform their application and are disrupting virtually every industry.
My new book, Digital Disciplines, examines how digital technologies impact the value disciplines framework that Treacy and Wiersema originated. After substantial research, I realized that companies in a variety of industries were applying the same generic strategies in contexts that made sense to them. For example, the way that an entertainment company provides movie recommendations can be very similar to the way a healthcare company offers personalized medicine.
When I think of traditional operational excellence, I think of Henry Ford’s Model T assembly line – interchangeable parts and mass production. For the time it was really amazing, and these process improvements led to product cost reduction, which made cars affordable, which transformed society and, you could even argue, civilization and geopolitics. Today, however, operational excellence is being extended, supplanted, and complemented by what I call “information excellence,” which includes dynamic, real-time process optimization. Instead of the Model T production line, consider a busy port with thousands of containers and hundreds of ships and trucks arriving on a weekly or daily basis. There are complexities in loading and unloading, and different issues arising in terms of highway congestion or delayed ships. There is a lot of big data involved – not just feeds like container IDs, but also video surveillance of drawbridges and trucks coming in, bills of lading, and cost tradeoffs between labor rates, perishable goods, and trans-shipment costs. Solving that kind of problem requires multiple companies to interact – trucking and shipping companies, port authorities, shippers. It’s process optimization that is only possible through information excellence. And there’s a lot more to information excellence than that – for example, what Burberry has done in terms of seamlessly fusing physical and digital channels, or being able to monetize exhaust data, as 23andMe has done: first selling DNA test kits and collecting genetic data, then selling the data for one specific disease to Genentech for more money than it had made from the tests themselves. That’s information excellence and, obviously, data is required to optimize and offer customers value in this way.
The second major value discipline is product leadership, such as, say, a Bentley or a Rolex or, on the services side, a Four Seasons. But today, standalone products and disconnected services are becoming smart, digital, and connected to the cloud, and thereby becoming sources of a massive amount of data. What comes to mind there is an ordinary pair of sneakers that has evolved to become an athletic shoe with accelerometers and pressure sensors. That data gets uploaded to the cloud and analyzed in terms of personal bests, activity tracking, and performance trajectories. There are a lot of opportunities to add value to products not just through these added services, but also through extensibility and upgradability, data collection and trend analysis, and by what Joe Pine and James Gilmore would call moving up the hierarchy of the experience economy: evolving beyond commodities, products, and services to experiences and transformations, with a new ability to focus on customer outcomes.
The third major area is the evolution of customer intimacy to collective intimacy. Customer intimacy used to mean simple, one-to-one customer relationships, such as you had with your butcher, tailor, hair stylist, or physician. Now, customer intimacy is giving way to collective intimacy. A good example of that is the Netflix recommender. Netflix collects data from each customer on their demographics and other characteristics, on their behaviors – how they stream, watch, skip, pause, and rewind – and also on context, like watching on an iPad away from home late at night or on the family TV during prime time. Another example is the Mayo Clinic, which has a massive amount of microbiomic, genomic, epigenetic, and pharmacological efficacy data. Using the data collected from every patient, it can provide better, more targeted therapies to each individual patient based on that collective big data analysis.
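The collective-intimacy idea – pooling behavioral data across many customers to personalize for each individual – can be illustrated with a minimal user-based collaborative filter. This is only a sketch with made-up ratings, not Netflix’s or the Mayo Clinic’s actual algorithms, which blend far more signals (behavior, context, demographics, clinical data) at vastly larger scale:

```python
from math import sqrt

# Toy ratings: user -> {item: rating}. Purely illustrative data.
ratings = {
    "ann":  {"drama_a": 5, "comedy_b": 1, "drama_c": 4},
    "bob":  {"drama_a": 4, "comedy_b": 2, "drama_c": 5},
    "cara": {"drama_a": 1, "comedy_b": 5},
}

def cosine(u, v):
    """Cosine similarity over the items two users rated in common."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    dot = sum(u[i] * v[i] for i in common)
    nu = sqrt(sum(u[i] ** 2 for i in common))
    nv = sqrt(sum(v[i] ** 2 for i in common))
    return dot / (nu * nv)

def recommend(user, ratings):
    """Score items the user hasn't seen, weighted by similar users' ratings."""
    seen = ratings[user]
    scores, weights = {}, {}
    for other, theirs in ratings.items():
        if other == user:
            continue
        sim = cosine(seen, theirs)
        for item, r in theirs.items():
            if item in seen:
                continue
            scores[item] = scores.get(item, 0.0) + sim * r
            weights[item] = weights.get(item, 0.0) + sim
    return sorted(
        ((i, scores[i] / weights[i]) for i in scores if weights[i] > 0),
        key=lambda pair: -pair[1],
    )

# "cara" has never rated drama_c; the collective data fills the gap.
print(recommend("cara", ratings))
```

Real recommenders mean-center ratings and regularize sparse overlaps; the point here is only the mechanism of collective intimacy: each customer benefits from data contributed by every other customer.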
The fourth major strategy is accelerated innovation. The big idea there is that there’s been an evolution from the individual inventor to shop invention to industrial research labs to open innovation, in which companies look beyond their four walls, and now into things like contests, challenges, innovation networks, and idea markets. The idea is that companies, rather than paying employees’ salaries and hoping that they find an answer to an innovation need, instead create a structured challenge. They offer an incentive, and a variety of ad hoc solvers can attempt to provide answers, ranging from a new design for a logo to solutions to extremely challenging technical problems, with a winner then selected based on qualitative or quantitative criteria and costs reduced thanks to contest economics. There was the Netflix Prize to improve the company’s recommendation engine. Goldcorp published seismic analysis data and managed to veer away from the brink of bankruptcy to become worth $40 billion. GE has run contests both to better predict flight arrival times and to improve them.
These are the four basic strategies, or digital disciplines: product leadership evolving to solution leadership, operational excellence evolving to information excellence, customer intimacy evolving to collective intimacy, and traditional innovation evolving to accelerated innovation. Big data plays a critical role, depending on the discipline. It enables information excellence, it can be collected and managed through solution leadership, it is at the heart of collective intimacy, and it is often a key component of accelerated innovation. Then there are the related technologies that help collect that data through things and/or people, bring that data to where it can be analyzed through a variety of networks, and also process the data thanks to the cloud.
We are talking about a fundamental transformation in the way business is taking place today. What can we say about the role of cloud in this transformation?
Weinman: Cloud is central to each of the disciplines. If you think about collective intimacy, the cloud is where the data from all those individuals gets collected and processed. If you think about solution leadership, the cloud is what these formerly standalone products and services now connect to, so it becomes an aggregation point for data, a way to offer services, and a means to enable ecosystems. For example, the iPad itself is OK, but it wouldn’t be much without the App Store, iTunes, and cloud-based services to connect to such as, say, FaceTime or Facebook. In the same way, a connected athletic shoe may be a nice sneaker with great traction and attractive laces, but the key value is taking that data and bringing it up to the cloud to enable everything from social workouts and coaching services to tracking personal performance characteristics like routes, races, personal bests, and how you compare to others in your demographic.
And then for innovation, the cloud is critical because companies like InnoCentive or Edison Nation wouldn’t exist without the ability to reach all these ad hoc solvers anywhere in the world and provide mechanisms for them to access large, typically anonymized data sets. Netflix, for example, offered up anonymized data on customers’ movie preferences to solvers for training and algorithm development. GE Flight Quest published a set of data on scheduled and actual arrival times, in terms of both runway touchdown and gate arrival. External data like weather and airborne particulates was also published as part of the challenge, enabling solvers to try their hand at applying different algorithms to create the best possible solution.
None of these big data innovations can exist without cloud. For individual companies, is a big data strategy even realistic without cloud?
Weinman: It depends on how you define cloud. If it’s a restrictive definition, such as pay-per-use, on-demand, pooled resources shared across customers, then big data doesn’t need a cloud. It can be processed on a grid, a dedicated cluster, a supercomputer, etc. But I’d rather think about the cloud as a high-level concept: a global network of computing resources and endpoints including things and people. Consider CERN’s Large Hadron Collider. I suppose that all the scientists could travel to Switzerland to do the research. But realistically, from a business perspective, for businesses serving global markets, big data and cloud just seem to go hand in hand because you are trying to aggregate data from a broad variety of geographically dispersed endpoints and it’s not just single products but product ecosystems. Things like wind turbines interact with the smart grid. Nike sneakers and the NikeFuel ecosystem interact with not just Nike products but Withings scales, Wahoo Fitness, cardiac monitors, and things like that. I think that for most of the implementations of the disciplines you are looking at a mix of big data and cloud, and networks to bring the big data to the cloud and things to collect the data for the cloud to then process.
There’s a lot of trepidation for some organizations around cloud. To what do you attribute that hesitancy?
Weinman: There are rational and irrational factors at work, and it’s very complex and nuanced. The number-one concern that people have about cloud in virtually every survey is security. But the question you should ask yourself is, “If you are a mid-sized enterprise, do you really think you are going to have a more capable security staff than, say, IBM does? Can you staff a security operations center 24/7 and address zero-day threats rapidly?” The other thing is that security is not a monolithic term. You have to look at your risks from various threats, one of which is, let’s say, distributed denial-of-service (DDoS) attacks. The attack bandwidth that can be generated through DDoS attacks in general, and NTP amplification attacks in particular, is massive, especially compared to what a typical enterprise has for access bandwidth. So if you are trying to protect against DDoS, it’s better to go with a public cloud provider.
Of course, anytime you are in a connected, shared infrastructure, there’s always the risk that there will be some new, undiscovered attack vector, as well as subtle issues such as government ability to subpoena data. But I really don’t see that security or compliance are valid reasons to avoid cloud anymore. For most enterprises, security via a public cloud provider is going to be as good as or better than what they can do themselves.
Are there advantages to one cloud configuration over another, or is this more of a case-by-case situation?
Weinman: My first book, Cloudonomics, goes into that question in a lot of depth, including several dozen different cost drivers – such as floor space, power, cybersecurity, HVAC, etc. – as well as rational economic optimization and (irrational) behavioral economic factors. The basic answer, in my view, is that “it depends.” It’s a case-by-case decision as a rule, and for large companies with the wherewithal to run their own data centers it’s a very complex one, requiring an understanding of unit costs, workload demand variability, performance differentials, physical security, privacy, compliance, and migration costs, modulated by risk and uncertainty in terms of revenues and price trajectories.
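The core unit-cost tradeoff Weinman describes can be sketched numerically. Dedicated capacity must be sized and paid for at peak demand, while pay-per-use tracks average demand; a well-known rule of thumb from Cloudonomics is that pay-per-use wins whenever the cloud’s unit-price premium is lower than the workload’s peak-to-average ratio. The figures below are illustrative assumptions, not real prices:

```python
def monthly_cost_owned(peak_demand, unit_cost, hours=730):
    # Dedicated capacity must be sized for peak demand and paid for
    # around the clock, whether it is used or not.
    return peak_demand * unit_cost * hours

def monthly_cost_cloud(avg_demand, unit_cost, hours=730):
    # Pay-per-use: you pay only for what you actually consume,
    # so average demand is what matters.
    return avg_demand * unit_cost * hours

# Illustrative workload: peaks at 100 servers, averages 25 (a 4:1
# peak-to-average ratio), with cloud capacity priced at 3x the
# per-server-hour cost of owned capacity (costs in cents).
peak, avg = 100, 25
own_rate, cloud_rate = 10, 30

owned = monthly_cost_owned(peak, own_rate)   # 100 * 10 * 730 = 730,000
cloud = monthly_cost_cloud(avg, cloud_rate)  # 25 * 30 * 730 = 547,500

# The 3x price premium is smaller than the 4:1 peak-to-average ratio,
# so pay-per-use is cheaper despite the higher unit price.
print(cloud < owned)  # True
```

Flip the numbers – a steady workload with a peak-to-average ratio below the price premium – and owned capacity wins, which is one reason every industry shows the do-it-yourself/cloud mix Weinman describes next.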
If you are a start-up, you have an unpredictable growth trajectory, you are someone who doesn’t want to deal with all that complexity, etc., then the public cloud is a really good choice. But I looked at a variety of industries and every industry has a mix of do-it-yourself and “cloud”-based solutions, i.e., dedicated, owned resources and on-demand, pay-per-use services. People own cars, yet they also rent them and take taxis. People own homes, yet they also rent apartments and stay in hotels. People have equity capital, yet they also rent capital by taking out a loan. It’s true even for electricity, which some people say is the canonical example of why everything is becoming a utility and commodity and moving to the cloud. Yet, electricity is moving out of the “electric cloud” and back into the premises through distributed power generation such as solar cells and portable wind turbines. In short, every customer uses a hybrid mix tuned to their requirements, and then shifts that as requirements and characteristics change.
Evernote, for example, moved out of the public cloud and into its own data centers, and it claims a fourfold savings. On the other hand, Netflix is the poster child for being all-in on the cloud. And then you have a company like Zynga, which was using a public cloud, then felt that it had enough stability and could achieve a 3:1 performance improvement, so it came out of the cloud and went into its own data center. Then it suffered a bit of a reversal of fortune, and one of the issues is that if you own dedicated capacity and your business volume suddenly drops, you are left with all these under-utilized resources. So Zynga recently announced that it is moving back into the public cloud.
I don’t think there’s a single answer for everyone. Generally, hybrid approaches can achieve the optimal mix of interacting with legacy systems and being able to achieve the right user experience, cost optimization, performance, and also reliability. And there are many ways of doing hybrid. Sony is one example I talk about in Cloudonomics. They have a hybrid strategy in which, during normal periods, everything happens in their own data center. But during periods of peak demand, they move a lot of their non-revenue generating things like e-catalogs out to the edge via public cloud and/or content delivery networks. But they keep their financial transactions very close to the vest and those always occur on their own systems in their own data center.
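A hybrid policy like the Sony example – sensitive transactions always in-house, non-revenue traffic offloaded at peak – amounts to a simple routing rule. The sketch below is loosely modeled on that description; the request types, threshold, and tier names are illustrative assumptions, not Sony’s actual rules:

```python
def route(request_type, current_load, onprem_capacity):
    """Decide where to serve a request under a peak-offload hybrid policy.

    Hypothetical policy for illustration: financial transactions never
    leave the company's own data center; everything else stays in-house
    during normal periods and spills to a public cloud or CDN at peak.
    """
    if request_type == "financial_transaction":
        return "on_premises"          # kept close to the vest at all times
    if current_load > onprem_capacity:
        return "public_cloud_or_cdn"  # offload non-revenue traffic at peak
    return "on_premises"              # normal periods: everything in-house

# Normal load: the e-catalog is served in-house.
print(route("e_catalog", 50, 100))   # on_premises
# Peak load: the e-catalog spills to the edge, but payments do not.
print(route("e_catalog", 150, 100))  # public_cloud_or_cdn
print(route("financial_transaction", 150, 100))  # on_premises
```

The design point is that the split is per-workload, not per-company: the same organization simultaneously owns, rents, and bursts, matching the car/home/capital analogies above.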
With all these different advantages, it sounds like hybrid is a solution for just about anyone.
Weinman: Well, if I am Bob’s Pizza down the street, I am not going to have a hybrid solution with a data center behind the pizza oven. For SMBs and startups, cloud services really are the way to go. For larger enterprises, it really comes down to factors such as competency, cost, and profile. But as a rule, it’s safe to say that a hybrid of some sort or another, including what Gartner calls bimodal IT, will make sense. It’s just hard to picture a blue chip overnight saying, “We are just going to throw away our old systems and go all cloud,” even though some of them have stated that to a greater or lesser extent and are interested in moving to a cloud-first strategy. In some cases, their systems are such a mess that they literally are planning to throw them all away and start from scratch. Otherwise they have no hope of competing with new digital upstarts.
What are some factors to consider when choosing a cloud partner that meets your needs?
Weinman: You would have to look at your current development and operations skills, providers’ service portfolio breadth, geographic proximity and/or dispersion, transparency, reliability, data transport costs, the list goes on and on. Another point would be to consider not just the provider’s cloud services, but related services, such as systems integration and enterprise support, e.g., dedicated account teams. You also might want to consider not just today’s offers, but long-term strategic directions, say, offering cognition as a service. In short, there are many different reasons why you might pick one provider over another.