Maximize the Value of Your Perishable Data

August 16, 2016

Mike Flannagan, Vice President of Data and Analytics, Cisco

The Internet of Things (IoT) has fundamentally changed how businesses treat and value data.

While some data can be collected, sorted, and analyzed at leisure to develop deeper business insight, a growing share is becoming known as “perishable”: unless it is analyzed immediately, it depreciates in value until it is effectively worthless.

Meanwhile, many businesses are struggling to keep up with the exponential growth of network-connected devices and sensors and the data they produce. To capitalize on the opportunities that IoT-driven data sets provide, organizations across industries are rethinking and diversifying their IT infrastructures, supplementing the large-scale data centers of today with smaller, more local hubs located at the edge.

What is Perishable Data?

The IoT has pushed the creation and collection of data into overdrive. During the Super Bowl, for example, 71,000 mobile users generated 22 terabytes of data in just three hours. That is a staggering amount of data, but simply having it is not enough. The true power of data comes from the ability to analyze it quickly and translate it into real-time insights.

Just as people check the expiration dates on cartons of milk or bottles of aspirin, organizations now face a “use it or lose it” reality when making real-time decisions with their data. If they wait too long, the data’s usefulness shrinks until it eventually becomes outdated and irrelevant.

And this applies not only to new, IoT-generated data, but also to legacy data that businesses have been hoarding while they work out how to unlock its value. This realization has led organizations to alter their approach to data management and analysis by understanding and accounting for data’s perishability.


With this in mind, perishable data is information that must be acted on quickly – often within milliseconds – to minimize the loss of business value. Consider the following examples:

  • Data on wind direction and speed to properly align wind turbines and maximize energy production.

  • Data on traffic and parking-space availability to maximize parking revenue and to reduce wait times and pollution by helping drivers find a space quickly.

  • Data on oil well pressure, fluid flow, and other performance metrics to prevent costly failures and potential environmental damage.

Left unanalyzed for even an hour, the insight this data offers becomes outdated and worthless for real-time decision making. Therefore, knowing which data is perishable, and how to handle it, is essential to reacting quickly enough to changing business needs while avoiding excessive data storage and transmission costs.
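
To make the “use it or lose it” idea concrete, here is a minimal Python sketch of freshness triage. The signal names and time-to-live windows below are illustrative assumptions, not figures from the article; a real deployment would tune them per application.

```python
import time
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class Reading:
    signal: str       # e.g., "wind_speed", "parking_space", "well_pressure"
    value: float
    timestamp: float  # seconds since the epoch


# Hypothetical freshness windows: how long each signal stays actionable.
TTL_SECONDS = {
    "wind_speed": 5.0,      # turbine alignment must react within seconds
    "parking_space": 60.0,  # guidance is useless once the space is taken
    "well_pressure": 0.5,   # failure prevention needs sub-second action
}


def is_actionable(reading: Reading, now: Optional[float] = None) -> bool:
    """Return True while the reading is still fresh enough to act on."""
    now = time.time() if now is None else now
    return (now - reading.timestamp) <= TTL_SECONDS.get(reading.signal, 10.0)


def triage(readings: List[Reading]) -> Tuple[List[Reading], List[Reading]]:
    """Split a batch into act-now readings and stale ones to archive or drop."""
    fresh, stale = [], []
    for r in readings:
        (fresh if is_actionable(r) else stale).append(r)
    return fresh, stale
```

The point of the sketch is the default behavior: every reading carries a clock, and anything past its window is routed away from the real-time path rather than processed as if it were still current.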

Edge Analytics and Data Tiers

The IoT is enabling companies to connect the unconnected, creating new, previously unknown and untapped sources of data that can be leveraged to drive valuable business insight and, ultimately, revenue.

Capturing, sorting, and analyzing data is crucial to this process. However, whereas traditional forms of analytics required that information be sent back to the data center first, the vast quantities of data being created in today’s IoT landscape mean this is no longer feasible.

As a result, many businesses now rely on edge analytics, which allows data to be captured, sorted, and analyzed at the edge of the network.

Many organizations already store data in “tiers” based on its importance and, more specifically, how often it is accessed. Financial data from the first fiscal quarter, for example, might be accessed very often in that quarter, somewhat less often the next quarter, but very seldom (if ever) after that. Over time, a tiering strategy would move that data from expensive, fast storage (such as solid-state drives) to less expensive but slower commodity disk drives, finally archiving it on still slower but even less expensive tape.
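
A tiering policy like the one just described reduces to a simple age-based rule. The cutoffs in this sketch are assumptions chosen to mirror the quarterly example; they are not a prescribed schedule.

```python
from datetime import datetime, timedelta

# Illustrative age thresholds mirroring the quarterly example above.
TIERS = [
    (timedelta(days=90), "ssd"),    # current quarter: fast, expensive storage
    (timedelta(days=180), "disk"),  # next quarter: cheaper commodity drives
    (timedelta.max, "tape"),        # after that: slowest, cheapest archive
]


def target_tier(created: datetime, now: datetime) -> str:
    """Choose a storage tier from the age of the data."""
    age = now - created
    for cutoff, tier in TIERS:
        if age <= cutoff:
            return tier
    return "tape"


# First-quarter financial data checked in mid-August has aged past both
# hot tiers and lands on the archive:
print(target_tier(datetime(2016, 1, 15), datetime(2016, 8, 16)))  # -> tape
```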

Perishable data takes this ranking process to another level by identifying data that may lose its value almost immediately after its collection and whose real-time analysis and use is urgent. Because many IoT devices are reporting on fast-changing conditions, such as the temperature of a refrigerated shipping container or a customer’s location in a store, that data must be analyzed and acted on very quickly in order to deliver the greatest business value.


Unlike the backward-looking analysis often performed on, for example, sales or payment records, data from the IoT often will be used for forward-looking analytics, such as predicting device failures or generating a “next best offer” for a customer before she leaves a store or a Web site. This makes the latency involved in transmitting that data for analysis unacceptable.

To understand how edge analytics reduces response time and data storage and network costs, consider the example of a security camera monitoring a shelf of high-value items. An intelligent camera or router at the edge can filter out the unneeded data (signals showing the inventory is still on the shelf) and only transmit an “error” alert that requires action when the item is no longer there.
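
Here is a sketch of that filter, assuming a camera feed that reports an item count per frame. The function and field names are hypothetical stand-ins for a real edge runtime, not a specific product’s API.

```python
def edge_filter(frames, expected_count, send_alert):
    """Drop 'all is well' frames at the edge; forward only exceptions.

    `frames` yields (timestamp, item_count) pairs from a shelf-monitoring
    camera, and `send_alert` stands in for whatever uplink reaches the
    data center. Both interfaces are assumptions for illustration.
    """
    for timestamp, item_count in frames:
        if item_count < expected_count:
            # Only the exceptional event crosses the network.
            send_alert({"ts": timestamp, "missing": expected_count - item_count})
        # Frames showing full inventory never leave the edge device,
        # saving bandwidth, central storage, and response time.


# Example: three frames, only the second triggers an uplink message.
edge_filter(
    frames=[(0.0, 12), (1.0, 11), (2.0, 12)],
    expected_count=12,
    send_alert=print,
)
```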

This same approach can be applied to everything from data for home healthcare monitoring to continual, real-time analysis of production equipment. General Electric estimates that if such analysis could increase system efficiency by one percent, over 15 years it would save the airline industry $30 billion in jet fuel; the global health care system $63 billion through improved treatment, patient flows, and equipment use; and gas-fired power plants $66 billion in fuel.

These numbers show the power of edge analytics and its value to business, and they explain why, as more organizations seek such efficiencies, studies predict that the majority of data will be processed at the edge within the next three to five years.

As Vice President and General Manager of Cisco’s Data and Analytics Group, Mike Flannagan is responsible for the company’s data and analytics strategy and leads multiple software business units. Mike joined Cisco in 2000. He previously held IT leadership positions in consulting and global media companies, and founded several startups. Mike is a Cisco Certified Internetwork Expert (CCIE), patent holder, and published author. He attended the University of Texas at San Antonio and earned a Master’s degree in Business Administration from Auburn University, where he now serves as Chairman of the Advisory Board for the College of Business’s graduate programs.
