As an exceptionally severe snow storm moved across the Northeastern United States and Canada on the evening of Friday, February 8, a familiar litany of reports soon chronicled the growing number of power outages.
According to USA Today, by 5:20 a.m. on Saturday, February 9, more than 650,000 homes and businesses across the region were without electric power. Reportedly, utility officials were warning customers to prepare for power outages lasting for days. And by Monday, government officials such as Massachusetts Gov. Deval Patrick were sharply criticizing local Boston power utilities NStar and National Grid for their rate of power restoration.
Noting that tens of thousands of customers were still living in unheated and unlit homes four days after the storm hit, Governor Patrick urged utilities to redouble their reconnection efforts. “Patience is going to start to wear thin for everybody if the utilities don’t continue to make significant progress,” Patrick told the Boston Globe. (By February 13, both National Grid and NStar had issued statements saying power had been restored to nearly all customers in Massachusetts.)
But such was the scale of the disruption that utility customers, shivering in the dark, could at least console themselves that their power supplier most likely knew they were without power. And that it almost certainly knew the cause: the snow, freezing rain, and howling winds brought by a blizzard variously described as “unprecedented” and “historic.”
Less fortunate are the customers hit by power outages in more normal times: when isolated tree falls, landslips, and equipment failures disrupt electricity supplies to hundreds of homes and businesses, rather than thousands or tens of thousands.
In such situations, says Rick Nicholson, group vice president of analyst firm IDC’s Energy Insights division, it can be vastly more difficult for a utility to know that disruption has occurred. “The average utility doesn’t have any sensing capability below their major high voltage systems,” he says. “They know if a part of their high voltage network goes out, but once you get into the medium-voltage distribution networks that serve individual neighborhoods, there’s no sensing equipment at all. They’re relying on consumers calling in and saying: ‘My lights have gone out.’”
Worse, he adds, the bare fact that a power outage has occurred provides few clues as to its cause. The result: repair crews dispatched to fix a suspected equipment failure, when the real culprit is a fallen tree. Or sent to deal with a suspected tree fall, when the real problem is a landslip calling for excavators, not a crew with chain saws.
“Getting crews on the ground to diagnose faults is costly,” Nicholson says. “It’s better to diagnose faults centrally, and then the crews can focus on repair, and not diagnosis.”
GE, as part of its Industrial Internet portfolio of products, has begun to offer an analytics capability designed to address such situations.
Using a wide variety of data sources—including government-sourced weather data, grid monitoring devices, smart meters, geo-spatial mapping and satellite imagery data from specialists such as Esri, and social media content from sources such as Twitter and Facebook—GE Digital Energy’s Grid IQ Insight analytics product aims to do two things. First, help utilities pinpoint the locations of power outages faster than conventional techniques. And second, help deliver an informed guess as to the cause of the outage.
Developed after working directly with some of the world’s largest utility companies, Grid IQ Insight takes in terabytes of data and uses analytics technologies to pinpoint power outages and generate actionable information as to their cause—and does so, claims GE, faster than traditional means of outage identification.
Layering Social Media and GPS Data on Top of Power Grid Maps
So how exactly does it work? The key, it transpires, is the ability to construct a virtual map of the power network, and then overlay on it data from as many sources as are available, or deemed relevant.
“Utilities have a lot of data, and can access a lot more, but they’re not so capable of taking an integrated view of it,” says Jon Garrity, strategy specialist at GE Digital Energy. “We aim to gather together disparate data sources, and add value by integrating them.”
Take social media data. Twitter’s free, public API enables utilities to search for phrases such as ‘power outage’, which, the theory goes, users might tweet if a sudden outage has disrupted their work or domestic life.
Nor is this wishful thinking, stresses IDC’s Nicholson: during a Commonwealth Edison outage in Chicago in 2011, which left hundreds of thousands without electricity, consumer-to-consumer tweets reportedly provided a more accurate picture of the utility’s recovery efforts than that disseminated by the company itself.
And Garrity notes that when computer users lose power, they are forced to switch to devices such as smartphones; in such cases, consumers helpfully, if perhaps unwittingly, reveal vitally useful locational data.
These days, for instance, it’s not just smartphones that contain built-in GPS chips; a growing number of ordinary cell phones do, too. The result: a stream of ‘geotagged’ tweets containing phrases such as ‘power outage’, all broadly originating from an area pre-specified as approximating the utility’s service territory.
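The phrase-and-location filtering described above can be sketched in a few lines of Python. This is an illustrative toy only: the phrase list, the bounding-box coordinates, and the tweet field names are all assumptions, not details of GE's actual implementation, and a real system would consume Twitter's streaming API with its own location filters.

```python
# Sketch: flag geotagged tweets that mention an outage phrase and fall
# inside a utility's service area. All names and values are illustrative.

OUTAGE_PHRASES = ("power outage", "power is out", "lights went out")

# Hypothetical service-area bounding box: (min_lat, min_lon, max_lat, max_lon)
SERVICE_AREA = (42.2, -71.3, 42.5, -70.9)

def is_outage_report(tweet, area=SERVICE_AREA):
    """Return True if a geotagged tweet mentions an outage phrase
    and originates inside the service-area bounding box."""
    lat, lon = tweet.get("lat"), tweet.get("lon")
    if lat is None or lon is None:
        return False  # not geotagged: no locational value to the utility
    min_lat, min_lon, max_lat, max_lon = area
    in_area = min_lat <= lat <= max_lat and min_lon <= lon <= max_lon
    text = tweet["text"].lower()
    return in_area and any(phrase in text for phrase in OUTAGE_PHRASES)

tweets = [
    {"text": "Huge power outage on my street!", "lat": 42.36, "lon": -71.06},
    {"text": "Great game tonight", "lat": 42.35, "lon": -71.05},
    {"text": "Power outage again?!", "lat": 40.71, "lon": -74.01},  # outside area
]
reports = [t for t in tweets if is_outage_report(t)]
```

Clustering several such reports in one neighborhood is what turns individual tweets into a usable outage signal.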
Facebook, too, provides a data source, although consumers must first have ‘liked’ the utility’s own Facebook profile, says Garrity. Once they have, the utility can access that consumer’s Facebook posts, paying particular attention to those containing phrases that indicate a power outage. Again, he adds, it’s the outage-induced switch to mobile devices that yields vital clues to consumers’ physical location.
Integrated Datasets Instead of Data Silos
At that point, when social media has identified part of a neighborhood as the likely victim of an outage (a suspicion perhaps confirmed by customer calls reporting it), grid system maps, satellite imagery, aerial photography, and government-sourced weather data add contextual richness to the picture.
“Does the power line bringing electricity to the neighborhood pass through a wooded area? Have there been high winds and heavy rain recently? If so, you can make an educated guess that a likely cause of the outage is a tree falling on a power line,” says Garrity.
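Garrity's educated-guess reasoning can be sketched as a simple rule-based scorer over the contextual data layers. The candidate causes, rules, and weightings below are assumptions chosen for illustration; they are not the product's actual model.

```python
# Sketch: rank likely outage causes from contextual flags drawn from
# grid maps (does the line cross woodland?), weather data (wind, rain),
# and terrain data. Rules and weights are illustrative assumptions only.

def rank_causes(context):
    """Return candidate outage causes, most likely first."""
    scores = {"tree fall": 0, "equipment failure": 1, "landslip": 0}
    if context.get("line_through_woods"):
        scores["tree fall"] += 2
    if context.get("high_winds"):
        scores["tree fall"] += 2
    if context.get("heavy_rain"):
        scores["tree fall"] += 1
        scores["landslip"] += 2
    if context.get("steep_terrain"):
        scores["landslip"] += 2
    # Highest score first: tells the utility which crew and gear to send
    return sorted(scores, key=scores.get, reverse=True)

# A wooded line after high winds and heavy rain points to a tree fall
guess = rank_causes({"line_through_woods": True,
                     "high_winds": True,
                     "heavy_rain": True})
```

The value, as Garrity says below, lies less in the scoring itself than in having the woodland, weather, and terrain layers joined up in one place to score against.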
Garrity concedes that the power of the tool resides in the data it integrates, not in the analytics applied to that data. “It’s not rocket science: it’s joining together data sources that aren’t usually joined together, and taking a view as to what might have happened,” he says. “What we provide is a way of doing that joining, and consequently making those insights available.”
Nevertheless, IDC’s Nicholson sees the GE tool as genuinely meeting an un-met need. “It’s not that utilities don’t have outage management systems: they do, but the systems are usually very siloed,” he says. “This product does a good job of integrating multiple sources of information—structured and unstructured—into a single environment. And from that perspective, it’s a genuine step forward.”
Freelance writer Malcolm Wheatley is old enough to remember analyzing punched card datasets in batch mode, using SPSS on mainframes. He lives in Devon, England, and can be reached at email@example.com.