The Internet of Things: More Connectivity Can Mean More Vulnerability

by Christian Beedgen, co-founder and CTO at Sumo Logic   |   May 30, 2014

As the Internet of Things grows and connected devices become more prevalent, enterprises are bracing for an unprecedented influx of data. Businesses are acquiring more and more connected devices that can open new business opportunities, but they must be aware that every additional sensor also brings the potential for increased vulnerability.

Cisco predicts that the number of devices connected to the Internet of Things will grow to between 15 billion and 25 billion by 2015, and some think even that may be a conservative estimate. Every one of these devices presents a new opportunity for an invasive cyber attack to occur undetected. And monitoring these devices effectively is impossible using traditional IT methods.

Enterprises have been mining data from user logs, software activity and network traffic for more than a decade to track security vulnerabilities, typically by establishing a set of rules that denote normal behavior among users and IT infrastructure. When a rule is “broken” – that is, when an anomaly is detected – the security team is alerted and looks back at the logs to investigate whether a malicious attack has taken place, whether data was inappropriately accessed, and so on. But this traditional approach falls down when confronted with the petabytes of data now generated in real time, every day, by expanding network infrastructure.
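To make that rule-based model concrete, here is a minimal sketch in Python of the kind of check a traditional log-monitoring setup runs: count failed logins per source and alert when a fixed rule is broken. The log format, field names and threshold are invented for illustration, not taken from any particular product.

# A minimal sketch of the traditional rule-based approach: scan authentication
# logs for a single predefined rule (too many failed logins from one source)
# and alert when it is broken. Format and threshold are illustrative.
from collections import Counter

FAILED_LOGIN_THRESHOLD = 5  # rule: more than 5 failures per source is "abnormal"

def parse_line(line):
    """Expect lines like: '2014-05-30T05:30:00 10.0.0.7 LOGIN_FAILED alice'."""
    timestamp, source_ip, event, user = line.split()
    return {"ts": timestamp, "src": source_ip, "event": event, "user": user}

def check_failed_login_rule(log_lines):
    failures = Counter()
    for line in log_lines:
        record = parse_line(line)
        if record["event"] == "LOGIN_FAILED":
            failures[record["src"]] += 1
    # Any source that breaks the rule is flagged for the security team to review.
    return [src for src, count in failures.items() if count > FAILED_LOGIN_THRESHOLD]

if __name__ == "__main__":
    with open("auth.log") as f:
        for offender in check_failed_login_rule(f):
            print("ALERT: investigate failed logins from", offender)

The weakness the article goes on to describe is visible even in this toy: the rule, its threshold and the fields it inspects are all fixed in advance, and they only catch what someone thought to encode.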

Log data, which is really machine-generated data, is expected to grow 15-fold by 2020, according to a report by the research firm IDC. In a world where security breaches are ever more sophisticated and frequent, it is a challenge for security teams to know what to look for in their data to tell that an attack is under way. So with a growing field of data and so many points of entry, how will IT departments create enough rules to catch everything? The answer is complex and requires proactive, not reactive, machine learning, combined with human interaction and domain expertise, to uncover relevant security issues and data breaches that would otherwise be masked in the flood of data from the Internet of Everything.

The following are steps that enterprises can take as part of a proactive approach to security.

Get strict about access. In a changing environment in which employees or visitors may be introducing unauthorized connected devices, solidify your network policies to identify and manage access to the network. But even with this protection in place, it is almost inevitable that an unauthorized device – perhaps one with a back door that enables a malicious entity to access key data – will become connected to the network. So solid network access policies should be just one security layer among many.
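As a rough illustration of that first layer, the sketch below compares devices observed on the network against an allow-list of authorized hardware and flags anything unknown for review. The inventory file, its column names and the example MAC addresses are hypothetical.

# A sketch of the "one layer among many" idea: compare devices observed on the
# network against an inventory of authorized MAC addresses and flag anything
# unknown for quarantine. The inventory format is an assumption.
import csv

def load_authorized_devices(path="device_inventory.csv"):
    """Inventory rows assumed to look like: mac,owner,device_type."""
    with open(path) as f:
        return {row["mac"].lower() for row in csv.DictReader(f)}

def find_unauthorized(observed_macs, authorized):
    return sorted(mac for mac in observed_macs if mac.lower() not in authorized)

if __name__ == "__main__":
    authorized = load_authorized_devices()
    # Observed devices would normally come from DHCP leases, switch tables or a scan.
    observed_macs = ["00:1a:2b:3c:4d:5e", "f0:9f:c2:11:22:33"]
    for mac in find_unauthorized(observed_macs, authorized):
        print("Unrecognized device on network:", mac, "- move to guest VLAN and review")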

Even for embedded systems that are authorized to access the network, such as those found in printers and HVAC systems, software patching can be a spotty practice. So attention must be paid both to creating a barrier between these devices and your business’s critical data and to monitoring how these clients interact with the network once they are connected.
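One simple way to picture that monitoring: an embedded device should talk to a small, predictable set of destinations, so any flow to an unexpected host deserves a look. The device names, destination map and flow-record format in the sketch below are assumptions made for illustration.

# Illustrative only: an embedded device such as a printer or HVAC controller
# should contact a small, predictable set of hosts, so flows to anything else
# are worth reviewing. The expected-destination map is a made-up example.
EXPECTED_DESTINATIONS = {
    "printer-3rd-floor": {"10.0.5.10"},            # print server only
    "hvac-controller-1": {"10.0.9.2", "10.0.9.3"}, # building-management servers
}

def unexpected_flows(flow_records):
    """flow_records: iterable of (device_name, destination_ip) tuples."""
    for device, dest in flow_records:
        allowed = EXPECTED_DESTINATIONS.get(device)
        if allowed is not None and dest not in allowed:
            yield device, dest

if __name__ == "__main__":
    flows = [("printer-3rd-floor", "10.0.5.10"),
             ("printer-3rd-floor", "203.0.113.44")]  # a printer reaching the internet
    for device, dest in unexpected_flows(flows):
        print("Review:", device, "contacted unexpected destination", dest)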

Break the rules. The IT and security teams cannot assume that any pre-defined rules will catch all significant anomalies in their log data. By opting for security solutions that incorporate automation and machine learning, enterprises can take a more agile approach, letting the data inform the need for policy changes within the network architecture. Instead of a monthly or quarterly review of rules, as in a traditional log management approach, rules can evolve automatically as usage norms change. As new sensors and devices present additional points of entry for a hacker, rules can be generated on the fly to incorporate the data coming from these new clients.
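The sketch below illustrates the idea of a rule that evolves with the data rather than staying fixed: a rolling baseline of event volume, with a threshold recomputed as new observations arrive. The window size and the three-sigma cutoff are illustrative choices, not features of any particular product.

# A minimal sketch of an adaptive rule: keep a rolling baseline of event counts
# and flag a reading that sits far outside recent behavior. The threshold is
# recomputed continuously instead of being reviewed monthly or quarterly.
from collections import deque
from statistics import mean, pstdev

class AdaptiveThreshold:
    def __init__(self, window=288, sigmas=3.0):  # e.g. 288 five-minute buckets = 1 day
        self.history = deque(maxlen=window)
        self.sigmas = sigmas

    def observe(self, value):
        """Return True if `value` is anomalous relative to the rolling baseline."""
        anomalous = False
        if len(self.history) >= 30:  # wait until a minimal baseline exists
            mu, sd = mean(self.history), pstdev(self.history)
            anomalous = value > mu + self.sigmas * max(sd, 1.0)
        self.history.append(value)  # the "rule" keeps evolving with new data
        return anomalous

if __name__ == "__main__":
    detector = AdaptiveThreshold()
    traffic = [100, 110, 95, 105] * 10 + [900]  # a sudden spike from a new sensor
    for count in traffic:
        if detector.observe(count):
            print("Anomalous event volume:", count)

The human element the article insists on still applies: an analyst decides whether a flagged spike is a new device coming online or the start of an attack.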

Put together the right team. Enterprise security teams employ a number of services and software to make up their security stack because there is no single silver bullet that will fully protect an enterprise or prevent an attack. And while many elements of security monitoring can be automated, as with the rules discussed above, skilled personnel are required to assess and address the threats that their tools identify. To best understand the skillsets needed, enterprises should take periodic stock of the vulnerabilities and security events they face and evaluate the monetary cost of such events: the time it takes to remediate them, the value of any data compromised, and other affected assets. Enterprises can then match the cost of the most resource-intensive events with the talent or skillsets best able to address them.
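A back-of-the-envelope version of that stock-taking exercise might look like the following sketch. Every figure, rate and event category in it is a placeholder; the point is only the shape of the calculation: remediation time plus data value, multiplied out and ranked.

# A rough sketch of the stock-taking exercise: estimate what each class of
# security event costs per quarter and rank them so staffing decisions can
# follow the most expensive ones. All numbers are made-up placeholders.
HOURLY_RATE = 120  # assumed loaded cost of an analyst hour

events = [
    # (event type, incidents per quarter, hours to remediate each, data value at risk)
    ("phishing credential theft", 12, 6, 4000),
    ("malware on embedded device", 3, 20, 15000),
    ("misconfigured access policy", 7, 4, 1000),
]

def quarterly_cost(count, hours_each, data_value):
    return count * (hours_each * HOURLY_RATE + data_value)

ranked = sorted(events, key=lambda e: quarterly_cost(e[1], e[2], e[3]), reverse=True)
for name, count, hours_each, data_value in ranked:
    print(f"{name}: ~${quarterly_cost(count, hours_each, data_value):,} per quarter")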

We can only expect that data will continue its exponential increase in the coming years. For businesses of all sizes, a growing array of connected devices can gather information about customer buying habits, operational efficiency, and more. To ensure that the Internet of Things remains an asset and does not become a liability, organizations must be proactive in improving their security posture to anticipate and identify threats from this new data source.

Christian Beedgen is co-founder and CTO at Sumo Logic, the next-generation machine data analytics company. Christian was at ArcSight from 2001 through January 2010, most recently serving as Chief Architect for the event-based products, both ArcSight ESM and ArcSight Logger. He led development teams that owned and delivered the core of the ArcSight product line across server infrastructure and user interaction. Christian was the first server-side engineer building the ESM backend and growing the server team. He designed and implemented the core backend event handling framework along with the object-relational mapping framework. He was integral in the design and development of core product features such as a flexible framework for real-time query processing and the initial set of stream processors. Christian also co-designed and implemented a patented approach to displaying large result sets that change dynamically. Prior to ArcSight, he was co-founder of Gigaton, an infrastructure software company enabling distributed file management solutions over IP-enabled networks.





