Don’t look now, but crude oil prices are down again, continuing a plunge that started last June. Considering the economic and geopolitical landscape driving the price of oil, it’s clear that prices simply aren’t going to bump back up substantially anytime in the foreseeable future.
What do big data and analytics have to do with the price of gasoline? For the oil and gas industry, a whole lot. To weather this storm of deflated prices and uncertainty, the industry – which accounts for hundreds of billions of dollars in transactions and employs hundreds of thousands of people globally – must undergo a fundamental shift in how it collects, shares, and analyzes data. It is a shift in mindset that many other industries already have experienced or will need to experience very soon.
I’ve dubbed this new approach the “connected well” and, at its core, it’s built on the same framework as the quantified self and the connected car paradigms – basically, a way for an industry to understand the value of bringing stakeholders and data together around a particular ecosystem. Already, the manufacturing, aerospace, and automotive industries have used such a framework to bounce back from industry upheaval.
Shared, Data-Driven Insights
One way that big data plays a role in the oil and gas industry is with regard to understanding costs – something that’s especially important given today’s tough marketplace. For every single well they own, executives in the oil and gas industry must know full lifecycle costs, from exploration to abandonment. Data is the foundation of this understanding; even at a basic level, it requires looking at everything from geophysical exploration data to real-time production data to refining operations data to trading data, and more.
Without a connected mindset both within and beyond an organization’s walls, getting this complete picture is time-consuming, if it’s possible at all. Engineers and geoscientists working in companies where data is siloed spend, on average, 14 work days out of every month searching for data, integrating it from multiple sources, and preparing it for analysis in applications. Simple subtraction tells us that leaves a mere nine work days each month for actual analysis. In today’s world, where strategic decisions need to be made in near real time, that is unacceptable.
This brings us to the first tangible benefit of the connected well paradigm: increased productivity. Having processes in place to integrate data automatically not only helps companies understand costs better, but also lowers costs in and of itself.
Internal integration is only the first step for big data and the oil and gas industry, however. Beyond efficiently integrating data that is spread across various business units, a true connected well requires accessing and integrating data spread across an ecosystem of contractors, partners, and stakeholders.
To continue with the cost example, decision makers are at an even greater advantage if they can look at lifecycle costs in the context of all wells and equipment on similar fields to find patterns in massive data sets. To do this, they must go beyond the aforementioned list of data sources and add in all available equipment information from drilling contractors, plant providers, engineering inspection and service companies, and more to develop an even better understanding of what works and what doesn’t.
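To make the idea concrete, here is a minimal sketch of that kind of cross-well comparison. The well IDs, field names, phases, and cost figures are all hypothetical, and the 25 percent threshold is an arbitrary illustration, not an industry rule; the point is simply that once lifecycle cost data from many sources sits in one integrated structure, flagging outlier wells against their peers becomes a trivial query.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical integrated lifecycle cost records:
# (well_id, field, phase, cost_in_millions_usd)
records = [
    ("W-01", "FieldA", "exploration", 12.0),
    ("W-02", "FieldA", "exploration", 11.5),
    ("W-03", "FieldA", "exploration", 19.8),
    ("W-01", "FieldA", "production", 30.2),
    ("W-02", "FieldA", "production", 28.9),
    ("W-03", "FieldA", "production", 31.0),
]

# Group costs by (field, phase) so each well is compared with its peers.
by_group = defaultdict(list)
for well, field, phase, cost in records:
    by_group[(field, phase)].append((well, cost))

# Flag wells whose phase cost exceeds the peer average by more than 25%.
outliers = []
for (field, phase), items in by_group.items():
    avg = mean(cost for _, cost in items)
    for well, cost in items:
        if cost > 1.25 * avg:
            outliers.append((well, field, phase, cost, round(avg, 2)))

for row in outliers:
    print(row)
```

With these sample figures, well W-03’s exploration spend stands out against its field’s average; in practice the same comparison would run over equipment and service data from contractors and partners as well.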
Only if this level of integration becomes a reality will oil and gas companies, which also happen to be some of the largest companies in the world, have a full understanding of what is cost effective. They will know how to avoid non-productive time, how best to schedule (and even automate) maintenance against overall productivity and, ultimately, when to buy, sell, develop, or defer particular wells. While these decisions are always important, they hold even more weight considering the price storm currently brewing in the industry.
Realizing the Vision of the Connected Well
The question, of course, is how the oil and gas industry can make the connected well a reality. The answer lies in having the right technology and, more importantly, the willingness to forge new lines of communication and new pan-organizational and intra-industry relationships.
With regard to technology, data warehousing has long been the scale-out solution for integrating large amounts of data to quantify well-defined relationships for immediate business use. However, the disruptive explosion of massive amounts of time-series data from sensors and loggers means that a refining process must be applied before newly generated data can be placed in the context of a wider knowledge pool. That’s where Hadoop comes in. The vibrant Hadoop ecosystem has all of the components to ingest and process such data at the scale and pace necessary and pass it to the operational data warehouse for contextualization and decision support.
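The “refining process” for raw sensor data described above usually amounts to windowed aggregation: condensing high-frequency time-series readings into summarized rows before they land in the warehouse. The sketch below, with a made-up sensor ID and readings and a 60-second bucket size chosen purely for illustration, shows the shape of that step; in a real deployment this logic would run at scale in the Hadoop ecosystem rather than in a single Python process.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical raw sensor readings: (sensor_id, epoch_seconds, value)
readings = [
    ("pump-7", 0, 101.0),
    ("pump-7", 20, 103.0),
    ("pump-7", 45, 99.0),
    ("pump-7", 70, 250.0),   # transient spike worth preserving as a max
    ("pump-7", 95, 102.0),
]

WINDOW = 60  # aggregate raw readings into 60-second buckets

# Bucket each reading by sensor and time window.
buckets = defaultdict(list)
for sensor, ts, value in readings:
    buckets[(sensor, ts // WINDOW)].append(value)

# Emit one summarized row per (sensor, window) – the contextualized
# record that would be loaded into the operational data warehouse.
rows = [
    {"sensor": s, "window_start": w * WINDOW,
     "avg": round(mean(vals), 2), "max": max(vals), "n": len(vals)}
    for (s, w), vals in sorted(buckets.items())
]

for r in rows:
    print(r)
```

Keeping both the average and the maximum per window is a deliberate choice here: the downsampled rows stay small enough for the warehouse while anomalies like the spike at 70 seconds remain visible for decision support.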
But this transition will be more about people and processes than about technology, as it will require much closer cooperation between operating companies, service companies, and technology companies to work. This type of industry-wide integration requires actually putting people from different organizations in the same room and bringing disparate teams together. The current lack of cooperation and communication is largely due to gaps in understanding between what technology companies have to offer and what the oil industry requires for this digital transition. It doesn’t suffice for tech companies to namecheck big data and cloud and hope that it will do the trick. There are a lot of gotchas in this industry, from the high science to the fact that the data often outlives the applications – and even the people – who work on an oilfield.
That’s why, beyond domain expertise, there also needs to be an investment in a data-driven and analytical mindset on the part of oil companies and service companies to understand how this new connected world will function. It’s also why any organization that can bridge this gap will be highly valued – from the analytically minded scientific consultancies to analytics and data science teams working as centers of excellence in the service companies and operating companies.
Despite the uncertainty surrounding oil and gas, one thing is clear: There is a storm raging in the industry, and it’s not going away for the foreseeable future. On a philosophical level, that means companies must focus on what they can control in order to survive. On a practical level, a large part of that means putting the right technology and communication processes in place to make more out of the big data that’s out there – not only beyond the four walls of individual business units, but also beyond the business itself.
Duncan Irving is the Teradata Industry Consultant for Oil & Gas in EMEA. Duncan joined Teradata from the University of Manchester, where he taught geophysical interpretation and geocomputational methods for seven years. He is a geophysicist and holds a Ph.D. in glacial geophysics and geotechnical engineering (why frozen ground moves faster during climate warming and how this affects infrastructure). Duncan lives in the hills outside Manchester with his wife and three children. He is a mountaineer, fell runner, and a card-carrying member of the Campaign for Real Ale (CAMRA).
Subscribe to Data Informed for the latest information and news on big data and analytics for the enterprise.