Why does creating an accurate, immediate, and constantly updated model of reality used by billions of interconnected devices matter?
To provide some perspective, let’s look at three historical events involving transformational breakthroughs in the technology of location and time. These events fundamentally changed the course of human events.
For hundreds of years, mariners navigated, with widely varying degrees of success, using crude maps, compasses, celestial measurements, and known currents. Long-distance navigation was imprecise and very dangerous, and there are many notable examples of explorers landing far from the places they set out to find.
This was due primarily to an inability to accurately compute east/west position (longitude). Unlike latitude, measuring longitude accurately required a very precise time measurement from a known location. It was not until the 1750s, when Englishman John Harrison developed the marine chronometer, that this all changed. Without the accuracy that marine chronometers enabled, it is quite likely that the ascendancy of the Royal Navy and, by extension, the British Empire, would not have occurred.
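The arithmetic behind Harrison's breakthrough is worth spelling out: the Earth rotates 360 degrees in 24 hours, so every hour of difference between local solar time and the chronometer's home-port time corresponds to 15 degrees of longitude. A minimal sketch of that calculation (the function name and sample figures are illustrative, not historical readings):

```python
# The Earth turns 15 degrees per hour, so comparing local solar time
# against a chronometer kept on the reference meridian's time gives
# the east/west angle directly.

def longitude_from_time(local_solar_hour: float, chronometer_hour: float) -> float:
    """Degrees of longitude relative to the chronometer's reference
    meridian; negative = west, positive = east."""
    return (local_solar_hour - chronometer_hour) * 15.0

# Local noon while a Greenwich-set chronometer reads 15:00
# means you are 3 hours behind Greenwich: 45 degrees west.
print(longitude_from_time(12.0, 15.0))  # -45.0
```

A clock error of just one minute translates to a quarter of a degree of longitude, which is why the chronometer's precision, not the formula, was the hard part.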
In the 1870s, railroads connected hundreds of towns in the United States, each keeping its own local time based on sun readings. Many states had 30 to 40 different versions of time. To confuse matters further, each of the 50 railroads operating at the time ran on its own clock. Imagine leaving New Haven at 2 p.m. for a two-hour ride to New York City, with a planned arrival of 4 p.m. But the time as measured in New York is 3:55, because your train runs on Boston time, which reads 4:17 when you arrive. You are trying to catch a connection to Baltimore, and that train runs on Baltimore time, departing at 4:05.
It wasn’t until 1881 that William Frederick Allen, Secretary of the General Time Convention, proposed replacing the 50 different railway times, and the hundreds of local times, with standardized time zones. By embracing this system in 1883, the U.S. entered a new era of economic expansion.
The GPS system became fully operational in 1995. It had been built out over the prior 10 years and was already in use, but it reached a full constellation of satellites that year. GPS allowed unprecedented precision in navigation, but at first it benefited mainly the U.S. military and large aviation and maritime operators. It wasn’t until May 2, 2000, that “Selective Availability” (a deliberate degradation of location precision for civilian receivers) was permanently discontinued, opening GPS navigation to general consumers.
It took additional time to develop the affordable receiver technology that is now ubiquitous in our daily lives, much of it embedded in mobile phones thanks to Qualcomm chips. This breakthrough gave us the wave of location-based capabilities now at our fingertips: Google Maps, Foursquare check-ins, Uber, and so on. I always know precisely where I am and what time it is. What more could be needed? For starters, what do we do when everything is location aware? Unmanned aerial vehicles (UAVs), every mode of transportation, watches, pets, everything you can imagine, and much we have not yet imagined.
The problem is that in the big data era, we are, figuratively, still trying to discover longitude. Knowing where I am on a digital map or satellite image is the 21st-century equivalent of getting out the sextant and trying to make sense of my position and surroundings. It is necessary, but woefully insufficient. Context changes constantly and rapidly, and the volume and richness of additional data that can help me (or a machine) make sense of my situation is mind-bending. We are not talking about adding static points of interest, road segments, or the shortest path to a destination to my digital parchment. We are talking about constantly fusing all relevant data (micro-weather, instantaneous movement patterns, deviations from normal conditions) as events occur in the real world, enabling people, things, or large groups of things (smart dust, swarming autonomous vehicles, etc.) to work from a live digital model that closely approximates physical reality. Think of the potential this capability provides across almost any segment of business or society. We believe it will be as transformational as the discovery of longitude, the standardization of time, or the commercialization of GPS technology.
Let’s look at a few examples. When pondering big data and the Internet of Things, farms are among the last places most people think of. In reality, modern farming equipment and techniques are becoming exceedingly high tech. Most equipment on the ground (tractors and the like) is fully instrumented and location aware. A vast array of sensors precisely measures moisture and other soil conditions. UAVs are increasingly used for mapping, inspection, and spraying. Seed suppliers can provide large amounts of historical data on the effectiveness of previous methods of planting, fertilizing, and watering a particular seed variety in similar conditions. Micro-weather forecasting and monitoring is emerging.
For the value of all this data to be fully realized, it must be fused into a common frame of reference (location and time) and be constantly fresh and available to farmers. Very soon, the data will have to be available to equipment that automates these operations. The potential efficiency improvements are staggering. Consider being able to water or spray only the individual plants that require it instead of treating acres of land, and doing so from a remote location on your iPad, or not being part of the decision-making process at all. This cannot be accomplished by simply showing a device’s location and path on a map or image.
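To make the fusion idea concrete, here is a minimal sketch (my own illustration, not SpaceCurve's API or any real farm system) of deciding whether a single plant needs water by joining sensor readings to plant positions in a common frame of reference: location plus time. The coordinates, threshold, and freshness window are all assumed values:

```python
# Illustrative sketch: fuse soil-moisture readings with plant
# locations in a shared (position, time) frame to decide which
# individual plants need water right now.
import math
from dataclasses import dataclass

@dataclass
class Reading:
    lat: float
    lon: float
    moisture: float   # volumetric water content, 0.0-1.0
    age_s: float      # seconds since the reading was taken

def _dist_m(lat1, lon1, lat2, lon2):
    # Equirectangular approximation; adequate at field scale.
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return 6_371_000 * math.hypot(x, y)

def needs_water(plant, readings, threshold=0.20, max_age_s=900):
    """Water the plant only if the nearest *fresh* reading is dry."""
    fresh = [r for r in readings if r.age_s <= max_age_s]
    if not fresh:
        return False  # no current data: never act on stale readings
    nearest = min(fresh, key=lambda r: _dist_m(plant[0], plant[1], r.lat, r.lon))
    return nearest.moisture < threshold

readings = [
    Reading(46.8601, -119.1500, 0.12, age_s=60),    # dry, fresh
    Reading(46.8602, -119.1490, 0.35, age_s=120),   # wet, fresh
    Reading(46.8601, -119.1500, 0.05, age_s=7200),  # dry, but stale
]
print(needs_water((46.86012, -119.15002), readings))  # True
print(needs_water((46.86019, -119.14901), readings))  # False
```

The point of the sketch is the join itself: a dry reading from two hours ago is ignored, and each plant is matched only against measurements that are both nearby and current, which is exactly what a static map layer cannot do.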
We don’t want to think about emergencies happening, but all municipalities and government organizations must have emergency-response plans. Detailed evacuation routes are mapped out. Location and fastest-path analyses have been done for emergency responders. Databases of building locations, floor plans, potentially toxic materials, fire hydrants, and power transformers have been or are being established. The problem is that most of these plans are static: routes and locations are plotted on maps or images, and multiple layers of this information may be contained in a GIS.
We need the ability to know where people are right now, where every vehicle in every mode of transportation is this very second, and the most effective way to use them to move people out of danger. We need to know the weather conditions at the exact location of an incident and in the surrounding area. We need to know where, exactly, all of our emergency responders are right now and, given the location of the incident and the population, the most effective way to deploy them.
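The core of that live-deployment question can be sketched in a few lines. This is a hedged illustration under assumed data (hypothetical unit names and coordinates, not a real dispatch system), showing the simplest version of the query: rank responders by current distance to an incident so the closest units deploy first:

```python
# Illustrative sketch: given *current* responder positions, rank
# units nearest-first for an incident. Real dispatch would use road
# networks and travel time; straight-line distance keeps the idea clear.
import math

def dist_m(a, b):
    # Equirectangular approximation, adequate at city scale.
    x = math.radians(b[1] - a[1]) * math.cos(math.radians((a[0] + b[0]) / 2))
    y = math.radians(b[0] - a[0])
    return 6_371_000 * math.hypot(x, y)

def rank_responders(incident, responders):
    """Return responder ids ordered nearest-first."""
    return sorted(responders, key=lambda rid: dist_m(incident, responders[rid]))

incident = (40.7128, -74.0060)          # hypothetical incident location
responders = {                          # hypothetical live positions
    "engine-7":  (40.7200, -74.0000),
    "medic-3":   (40.7130, -74.0070),
    "ladder-12": (40.7000, -74.0200),
}
print(rank_responders(incident, responders))  # ['medic-3', 'engine-7', 'ladder-12']
```

The ranking is only as good as the freshness of the positions fed into it, which is the article's point: the query is trivial, but keeping the underlying model live is not.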
To fully exploit the capabilities that the profusion of sensors and data presents, we need an always-live digital model of an area that is always accessible, always fresh, and completely fused into a common operating picture. Precise location and time are the keys to providing this capability.
Dane Coyer is CEO of SpaceCurve, a spatial analytics company. SpaceCurve enables unprecedented value creation from geospatial and location data.
Subscribe to Data Informed for the latest information and news on big data and analytics for the enterprise.