IBM Unveils Software Defined Storage Technology for Data Management

by Scott Etkin   |   May 12, 2014

IBM today announced a new software-defined storage product portfolio designed to reduce data storage costs, improve access to data, and significantly cut time to insight.

Software-defined storage is a set of software capabilities that automatically manage data locally and globally, allowing organizations to access and process any type of data on any type of storage device, anywhere.

Vincent Hsu, IBM Fellow and CTO of Systems Storage in IBM’s Systems and Technology Group, called it “hardware-agnostic storage software.”

“The mission of software-defined storage is to eliminate data isolation: the data and storage islands,” said Hsu. “This technology will be able to support the data integration platform.”

Traditional storage systems require users to move data to separate designated systems for transaction processing and analytics. One of the technologies in the portfolio, which IBM refers to as “elastic storage,” automatically balances resources to support both types of application workloads, including Hadoop-based analytics.

Elastic storage virtualizes storage, allowing multiple systems and applications to share common pools of storage. Through its support of OpenStack cloud management software, elastic storage also enables users to store, manage, access, and share data across private, public, and hybrid clouds.
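The practical appeal of a shared pool is that both sides of the house touch the same files. Here is a minimal sketch of that idea in Python, assuming a hypothetical shared POSIX mount (/elastic/pool0) exposed by the virtualized layer; the path and helper names are illustrative, not part of IBM’s product:

```python
# Illustration only: with a shared, virtualized storage pool, a transactional
# writer and an analytics job work on the same files in place, so no copy
# step between "primary storage" and a separate analytics platform is needed.
import csv
from pathlib import Path

# Hypothetical shared POSIX mount exposed by the virtualized storage layer.
POOL = Path("/elastic/pool0/orders")

def record_order(order_id: int, amount: float) -> None:
    """Transactional side: append an order record to the shared pool."""
    POOL.mkdir(parents=True, exist_ok=True)
    with open(POOL / "orders.csv", "a", newline="") as f:
        csv.writer(f).writerow([order_id, amount])

def total_revenue() -> float:
    """Analytics side: scan the very same files, in place, with no copy."""
    with open(POOL / "orders.csv", newline="") as f:
        return sum(float(amount) for _, amount in csv.reader(f))

record_order(1, 19.99)
record_order(2, 5.00)
print(f"revenue so far: {total_revenue():.2f}")
```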

“Digital information is growing at such a rapid rate and in such dramatic volumes that traditional storage systems used to house and manage it will eventually run out of runway,” Tom Rosamilia, Senior Vice President, IBM Systems and Technology Group, said in a statement.

“I think that one of the most important benefits is that you eliminate unnecessary data replication,” Hsu said. “How do people do analytics today? You store data in some kind of primary storage. In order to run analytics on those data, you need to ship those data to the analytics platform. [But] the data becomes so big that moving it to the analytics platform becomes very costly. And because data is so big, it takes a long time to move the data. You won’t get the real-time insight of your data.

“Existing infrastructure is running out of steam because the existing data center infrastructures are fragmented,” Hsu added. “The way forward is a universal data platform, a data integration platform, one platform to put all your data there and run different processes on it.”

Elastic storage automatically optimizes data storage by maintaining frequently used data on high-speed drives and moving little-used data to less expensive storage, driving down storage costs.

“This storage software has a lot of storage efficiency technology built in,” Hsu said. “It has the intelligence to put the right data in the right place. For example, the data you use very often, it puts on the Flash [storage]. The data you don’t use very often, like the email from six months ago, it will actually transparently put that to the tape. So you don’t even see that. To you, it seems like all the data is available to you online, but some of the data you don’t use very often, it’s actually stored in a more economical form.”
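Conceptually, the placement intelligence Hsu describes boils down to routing data by access recency. A toy sketch of that logic, assuming an illustrative three-tier model (flash, disk, tape) and made-up idle thresholds; real elastic storage policies are far richer than this:

```python
# Toy model of recency-based tiering in the spirit of Hsu's description:
# hot data stays on flash, cold data (like six-month-old email) goes to tape.
# Tier names and idle thresholds here are illustrative assumptions.
from datetime import datetime, timedelta

def choose_tier(last_access: datetime, now: datetime) -> str:
    """Pick a storage tier based on how long a file has sat idle."""
    idle = now - last_access
    if idle < timedelta(days=7):
        return "flash"  # frequently used: keep on fast, expensive media
    if idle < timedelta(days=180):
        return "disk"   # warm: cheaper capacity tier
    return "tape"       # rarely used: most economical, yet still "online"

now = datetime(2014, 5, 12)
print(choose_tier(now - timedelta(days=2), now))    # -> flash
print(choose_tier(now - timedelta(days=200), now))  # -> tape
```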

IBM said that elastic storage software will be available as a cloud service through SoftLayer in the second half of this year.

Because of its scalability, elastic storage was used in Watson. According to IBM, elastic storage can scan 10 billion files on a single system in 43 minutes.
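For scale, that figure works out to roughly 3.9 million files scanned per second:

```python
# Back-of-the-envelope check of IBM's figure: 10 billion files in 43 minutes.
files = 10_000_000_000
seconds = 43 * 60                              # 2,580 seconds
print(f"{files / seconds:,.0f} files/second")  # ~3,875,969 files/second
```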

“Elastic storage is the backbone for Watson technology,” Hsu said. “This technology, in the past, was exclusive to supercomputers. Today, those technologies have an important role to play in the commercial world as well. You are applying supercomputer technology in enterprise applications and cloud applications.”

What this means for businesses, Hsu said, is, “Now you can get your insight much faster than your competition. You can use this data platform to ingest the data, and you can run your big data operations on the same platform. Your competition will have to ingest the data into primary storage and then replicate it to the big data platform. Your advantage over them is not a matter of seconds; it’s a matter of hours or days, because you are using a one-step process and they are using a two-step process. They put the data in one storage system first and then send it to the big data platform. In your environment, with elastic storage, you just put the data in one place and you can start processing it. Because you don’t have to move a large amount of data, the delta in time to insight is hours or days.

“Let’s say you have 10 TB of data from social media,” he added. “And you have to move that data to the big data platform to do the analytics. That 10 TB will cost you half a day to move. With elastic storage, once you finish the data ingestion, you can kick off your analytics right on the spot.”
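Taken at face value, Hsu’s half-a-day figure for 10 TB implies a sustained transfer rate of roughly 230 MB/s, or about 1.85 Gbit/s; the arithmetic:

```python
# What "half a day to move 10 TB" implies about sustained network throughput.
bytes_total = 10 * 10**12           # 10 TB (decimal)
seconds = 12 * 3600                 # half a day
mb_per_s = bytes_total / seconds / 10**6
gbit_per_s = bytes_total * 8 / seconds / 10**9
print(f"{mb_per_s:.0f} MB/s, about {gbit_per_s:.2f} Gbit/s sustained")
# ~231 MB/s, ~1.85 Gbit/s -- this is the copy step elastic storage skips.
```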

Scott Etkin is the managing editor of Data Informed. Email him at Scott.Etkin@wispubs.com. Follow him on Twitter: @Scott_WIS.

