In 2017, improvements in time-series forecasting will make it a powerful analytical weapon for companies across the board. Time-series forecasting, the use of historical data to predict future measurements, has the potential to be a boon for business planning. New advances in AI are empowering companies to create accurate time-series forecasts that not only predict outcomes but also prescribe the best course of action to boost their numbers. Machine-intelligence technologies can now deliver automated recommendations from data, which may change the data-science world as we know it.
Time-series forecasting allows companies to make predictions. Unfortunately, it is fraught with challenges. First, time-series forecasts, by nature, extrapolate into the future, which carries a high level of analytical uncertainty. Equations and solutions that have worked in the past may not follow the same pattern one week or two years down the line.
Second, time-series data is often extremely complex. Variables within time-series data aren't necessarily aligned in time, making it difficult to capture real causes and effects. For example, most time-series forecasts today don't capture lagged effects between variables, such as a marketing campaign that doesn't show ROI until three weeks after launch. Time-series forecasts often deal with noise and complexity by averaging data, which diminishes the impact and accuracy of the forecasts. If an automotive company wants to predict sales and align inventory, it would be better to develop models that forecast sales at each individual dealership than to apply one global, macro-level model to dealerships in both New York City and Idaho. Unfortunately, most companies have the time and resources to implement only a small number of models.
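As a sketch of how a lagged effect can be made explicit, the hypothetical snippet below (assuming pandas and invented weekly dealership data) shifts marketing spend by three periods so that it lines up with the sales it plausibly influenced:

```python
import pandas as pd

# Hypothetical weekly data: marketing spend and sales for one dealership.
df = pd.DataFrame({
    "week": range(1, 9),
    "marketing_spend": [10, 12, 8, 15, 9, 11, 14, 10],
    "sales": [100, 105, 98, 120, 102, 108, 115, 103],
})

# A campaign may not show ROI for three weeks, so align spend with the
# sales it plausibly influenced by lagging it three periods.
df["spend_lag3"] = df["marketing_spend"].shift(3)

# Correlate lagged spend with current sales (rows without a lag drop out).
aligned = df.dropna()
lagged_corr = aligned["spend_lag3"].corr(aligned["sales"])
```

With the spend column lagged, an ordinary correlation or regression on the aligned rows can pick up a delayed relationship that a same-week comparison would miss.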
Third, time-series forecasting requires users to be selective about their data. For typical modeling problems, the more data, the better; piles of information usually mean more reference points and building blocks to train the model. But for time-series forecasts, more data isn’t always better. Old data, for instance, isn’t always useful, since relationships and trends can change rapidly over time. Time-series forecasts often necessitate working with a relatively small data set with newer, more relevant, information.
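To illustrate why older observations can mislead, here is a minimal sketch (assuming pandas and an invented series whose trend reverses halfway through):

```python
import pandas as pd

# Hypothetical series where the trend reverses halfway through: older
# observations encode a relationship that no longer holds.
sales = pd.Series([100, 95, 90, 85, 80, 85, 90, 95, 100, 105])

# Average change over the full history: the two regimes nearly cancel out.
full_drift = sales.diff().mean()

# Restricting to the most recent window recovers the current upward trend.
recent_drift = sales.diff().tail(4).mean()
```

The full history averages the two regimes away, while the trailing window reflects the relationship that currently holds; choosing that window is exactly the kind of data selection the paragraph above describes.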
Machine-intelligence technologies address these three issues and more. Machine-intelligence applications build time-series forecasts in minutes. They automatically identify the right data and features, cut through the noise to unearth complex time-dependent relationships, and account for seasonality. But beyond being accurate prediction engines, machine-intelligence applications are becoming prescriptive. For the first time, analytics software not only crunches numbers and spits out models, but also generates recommendations on what to do to maximize your desired outcome. If you want to sell more cars, for example, machine intelligence can give you location-by-location advice on how to optimize your revenue. In New York City, staff the showrooms with five employees, increase online ads, and offer 15-percent holiday discounts. In Idaho, however, the advice is to keep showrooms lean and target spending on printed flyers to families.
As a result, machine-intelligence-powered time-series forecasting is becoming one of the biggest competitive advantages in the market. Here are some of the most prominent use cases to date.
Stop Cyber Attacks
In 2016, hackers caused 4,419 data breaches, exposing more than 4.2 billion “sensitive” records. Security experts expect these numbers only to get worse as bad actors turn to automation. What if we had the power to predict cyber attacks? To do so, we would need to capture a wide array of metrics, such as CPU usage, network activity and geographical requests, to establish a baseline of nominal behavior. From there, the system could highlight abnormal activity, such as a huge spike in traffic from China. Once there is historical data measuring both the baseline and abnormal activity, users could apply time-series forecasting to predict when an attack is imminent, so the security team can preemptively intervene.
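One simple way to encode this baseline-and-deviation idea, sketched below with invented traffic numbers and only the Python standard library, is to score each new reading against the historical baseline:

```python
import statistics

# Hypothetical requests-per-minute from a single region; the last
# reading is a sudden traffic spike.
traffic = [120, 118, 125, 122, 119, 121, 124, 117, 123, 480]

# Build the nominal baseline from historical readings only.
baseline = traffic[:-1]
mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

# Flag the newest reading if it sits far outside the baseline
# (a z-score above 3 is a common rule of thumb).
z = (traffic[-1] - mean) / stdev
is_anomaly = z > 3.0
```

A production system would maintain a rolling baseline per metric and per region, but the principle is the same: model nominal behavior first, then treat large deviations from the forecast as signals worth investigating.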
Save Tens of Millions in Revenue
Every year, retail companies leave millions of dollars on the table by not optimizing store staffing, inventory and placement of new brick-and-mortar stores. One of the common pitfalls is that retailers implement forecasting models that inherently assume their stores or products perform equally across locations. Successful time-series forecasting for retailers should, of course, look at metrics on a national level, but the real value lies in the lower-level insights, such as the performance of specific stores, managers and marketing campaigns. National data is often noisy and spotty, so the ability to forecast at the store level, or at the product level within a certain store, will empower companies to optimize staffing, product stocking and more for the greatest profit. This optimization can make or break a retailer’s bottom line. It has been reported that building a new store in a “good” location rather than an “average” one can add up to $30 million in extra revenue per store per year.
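The gap between a pooled model and store-level models can be sketched in a few lines (hypothetical data, assuming pandas):

```python
import pandas as pd

# Hypothetical weekly unit sales for two stores with very different levels.
df = pd.DataFrame({
    "store": ["NYC"] * 4 + ["Idaho"] * 4,
    "units": [200, 210, 190, 220, 30, 28, 35, 27],
})

# A single global model forecasts the pooled average for every store...
global_forecast = df["units"].mean()

# ...while per-store models capture each location's actual level.
store_forecasts = df.groupby("store")["units"].mean()
```

The pooled forecast overstocks Idaho and understocks New York; fitting one simple model per store, even a naive average, removes that systematic error, which is why scaling many small models matters.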
Build a Market-Beating Investment Portfolio
The financial industry is full of ever-present uncertainty, with huge financial gains and losses at stake – an optimal area for time-series forecasting to shine. Investors using time-series forecasting on their proprietary data can identify key predictive relationships that indicate whether an investment is primed to tumble or soar. For example, machine-intelligence-generated time-series forecasts can identify the key “indicators” of future excess returns for stock X, among thousands of different noisy market signals. Applying this predictive insight to investment strategies can help beat the market.
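As a toy illustration of surfacing indicators from noisy signals (entirely synthetic data; real systems use far more sophisticated feature selection), even ranking candidate signals by correlation with returns separates a genuine leading signal from noise:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Hypothetical data: several noisy market signals, one of which
# ("signal_2") actually drives next-period excess returns.
n = 200
signals = pd.DataFrame(
    rng.normal(size=(n, 5)), columns=[f"signal_{i}" for i in range(5)]
)
returns = 0.8 * signals["signal_2"] + rng.normal(scale=0.3, size=n)

# Rank signals by absolute correlation with returns to surface indicators.
ranking = signals.corrwith(returns).abs().sort_values(ascending=False)
top_indicator = ranking.index[0]
```

In practice the candidate pool holds thousands of signals and the relationships are lagged and nonlinear, but the workflow is the same: score every candidate against future returns, then build the strategy on the few that survive.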
Time-series data presents extraordinary opportunities to accelerate predictive and prescriptive capabilities across all industries, particularly cyber security, finance, IoT and retail. Leveraging machine intelligence for time-series forecasting will allow companies to optimize detailed business operations at unprecedented granularity, precision and impact. Let the transition begin.
Michael Schmidt is founder and chief technology officer of Nutonian. He is the creator of Eureqa, Nutonian’s dynamic modeling engine, which accelerates discovery and understanding by making data science and artificial intelligence accessible to all business users. Michael holds a PhD in Computational Biology, a Master of Engineering in Computer Science, and a BS in Electrical and Computer Engineering from Cornell University.