IBM Updates Big Data Analytics Offers with Skills and Trust in Mind

November 6, 2013

LAS VEGAS – It takes a mix of information technology, corporate strategy, a data-driven culture, and people with the right skills to implement big data analytics that generate business value. And IBM spent this year’s Information on Demand conference here seeking to address all of these points.

Among the highlights of IBM’s fusillade of announcements were technologies designed to boost the performance of its data platforms and the security of systems using the open source Hadoop Distributed File System. The company introduced new capabilities in its analytics software, including SPSS Analytic Catalyst, to reduce both the labor and the advanced skills demanded by the onerous process of preparing data for analysis. It also updated InfoSphere Data Explorer, a cloud-based application development framework, with features that ease data integration tasks.


IBM also said it is expanding its line of Watson cognitive computing systems, industry use case by use case, with a planned cloud-based offering and APIs for third-party application developers in the future. And the company previewed Project Neo, a data visualization tool that goes into beta in January, which returns tree maps and other data displays in response to users who type into a box that asks, “What do you want to know?”
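IBM has not described how Project Neo interprets typed questions, but the basic idea of routing a free-text query like “What do you want to know?” to an appropriate display can be sketched with a simple keyword dispatcher. Everything below is a hypothetical illustration, not Project Neo’s actual implementation; the function name and keyword rules are assumptions.

```python
# Hypothetical sketch (not Project Neo's implementation): map a typed
# natural-language question to a suggested chart type via keyword rules.

def suggest_chart(question: str) -> str:
    """Return a chart type suited to the kind of question asked."""
    q = question.lower()
    if "breakdown" in q or "share" in q or "proportion" in q:
        return "tree map"      # part-to-whole questions
    if "trend" in q or "over time" in q:
        return "line chart"    # time-series questions
    if "compare" in q or "versus" in q:
        return "bar chart"     # comparison questions
    return "table"             # fall back to a plain tabular view

print(suggest_chart("What is the breakdown of revenue by region?"))  # tree map
print(suggest_chart("How did sales trend over time?"))               # line chart
```

A production system would use a real natural-language parser over the data model rather than keywords, but the question-in, visualization-out flow is the same.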

In their remarks, executives from IBM and customers suggested that the lessons learned from big data analytics projects are proving their value, and that there is an ongoing transition among users of analytics from the era of the enterprise data warehouse to a world in which companies find new ways to analyze external and unstructured data sources along with data from established systems.

“We’ve got systems of record. Now it’s about systems of engagement,” said Robert LeBlanc, a senior vice president of the IBM Software Group, during one of the keynote talks at the Mandalay Bay Convention Center. LeBlanc was one of several speakers who emphasized that data quality—or veracity, to make a fourth V to go with the volume, variety and velocity of big data—is an essential component of creating trustworthy insights.

Data Veracity and Trust

Indeed, trust was an important theme at the conference. While the people who attend an event like this are interested in what analytics can do, that is not the case everywhere. It’s not difficult to find survey data suggesting that boardrooms are interested in big data projects: IBM’s recent executive survey found that 75 percent of top-performing organizations cited revenue generation as a key benefit of implementing big data analytics. But the same questionnaire found that 62 percent of respondents cited “political or executive constraint” as holding them back from delivering value with analytics.

While part of the challenge stems from cultural differences at the organizational level, speakers at the conference suggested another hurdle lies in the design of the IT systems devoted to delivering insights. At more than one session in which discussion turned to the Watson system, IBM experts noted that the system delivers multiple answers to user questions, ranked in order of confidence. The user can review and correct this output, improving Watson’s “understanding” of the content it analyzes. This is another way to build trust.

“What we’re seeing [with Watson] is a new class of apps, probabilistic apps versus the deterministic apps that we use today,” said Manoj Saxena, Watson Solutions general manager.
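The review-and-correct loop described above can be sketched in a few lines: candidate answers carry confidence scores, and user feedback nudges future rankings up or down. This is a minimal illustration of the probabilistic-apps idea, not IBM’s Watson API; the class names and the fixed 0.1 feedback adjustment are assumptions.

```python
# Hypothetical sketch (not the Watson API): a question-answering service that
# returns candidate answers ranked by confidence, with user corrections fed
# back to adjust subsequent rankings.

from dataclasses import dataclass, field

@dataclass
class Candidate:
    text: str
    confidence: float  # model's confidence, 0.0 to 1.0

@dataclass
class ProbabilisticQA:
    # Accumulated user feedback per answer text (positive or negative).
    feedback: dict = field(default_factory=dict)

    def rank(self, candidates):
        """Sort candidates by confidence adjusted for past user feedback."""
        def score(c):
            return c.confidence + self.feedback.get(c.text, 0.0)
        return sorted(candidates, key=score, reverse=True)

    def confirm(self, answer_text, correct=True):
        """User review step: reinforce a correct answer, penalize a wrong one."""
        delta = 0.1 if correct else -0.1
        self.feedback[answer_text] = self.feedback.get(answer_text, 0.0) + delta

qa = ProbabilisticQA()
candidates = [Candidate("aspirin", 0.60), Candidate("ibuprofen", 0.55)]
print([c.text for c in qa.rank(candidates)])  # initial ranking by confidence
qa.confirm("ibuprofen", correct=True)         # user corrects the output
qa.confirm("aspirin", correct=False)
print([c.text for c in qa.rank(candidates)])  # ranking reflects the feedback
```

Unlike a deterministic app, the same question can yield a different top answer as feedback accumulates, which is exactly why letting users inspect the ranked alternatives builds trust.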

In the Big Data Analytics Trenches

While there was an emphasis on fresh offerings and fresh faces—Jake Porway, a data scientist who founded an organization that applies data science to social problems, was the general session master of ceremonies—a number of speakers returned to perennial themes in the field.

“For all types of analytics projects, up to 80 percent of the effort is to get the data and make it accurate,” said Henry Morris, a big data researcher and analyst at IDC. Morris spoke at a panel discussion about the Watson systems, but he could have been adding his voice to any number of conversations.

To Wes Hunt, the vice president of customer analytics at Nationwide Mutual Insurance Co., the lessons of big data analytics projects encompass technology, people, strategy and culture.

Hunt said his company’s goal in pursuing a 360-degree view of its customers is to be able to say, through research: “We know you, we care about you, and we are easy to do business with.”

Along the way to implementation, Hunt shared four lessons:

1. You need to build a strong data foundation. This gives you flexibility later to solve more problems than you imagined when you started building your systems.

2. Grow your talent. “We don’t have enough,” he said.

3. Big data analytics are not a corporate strategy by themselves. “They are an enabler to strategy,” he said.

4. Trust is the secret ingredient. Hunt said this means that end users trust the insights from analytics. Analysts trust the data they are working with. And business and IT teams trust each other.

“Trust is the bridge between insight and action,” he said.

Michael Goldberg is the editor of Data Informed.

Home page photo of Information on Demand general session on November 5, 2013, via IBM.

