Building the Foundation of the Cognitive Computing Era

by Scott Crowder   |   April 26, 2016 5:30 am

Scott Crowder, CTO and VP, Technical Strategy, IBM Systems

When most people think about artificial intelligence and cognitive computing, they think of futuristic technological landscapes overrun with evil robots. But a growing number of business and IT leaders are beginning to grasp how cognitive computing is radically reshaping the present IT landscape. This next wave of information technology is giving businesses the ability to outthink their competition and giving society the ability to solve some of our most pressing problems.

In 2011, when a first-generation cognitive system outplayed two human Jeopardy champions, it did only one thing: answer natural-language questions. Today, that family of cognitive solutions has more than 30 capabilities, accessed via application programming interfaces (APIs) and delivered from the cloud. Application developers across multiple industries recognize the value of leveraging cognitive capabilities to add insight to their digital interactions with customers. Cognitive systems are now helping doctors, for example, make more informed treatment decisions by analyzing medical journals and medical images, revolutionizing healthcare in the process. Our computers are becoming, as Thomas Watson, Jr. once envisioned, the most potent tool “for extending the powers of the human beings who use them.”
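
To make that concrete, the sketch below shows what consuming one of these cloud-delivered capabilities typically looks like from an application: a single HTTPS request to a REST endpoint. The URL, credential, and feature names are hypothetical placeholders rather than any specific product’s API.

```python
# A minimal sketch of calling a cloud-hosted cognitive API over HTTPS.
# The endpoint URL, API key, and feature names are hypothetical placeholders.
import requests

API_URL = "https://api.example.com/v1/analyze-text"  # hypothetical endpoint
API_KEY = "YOUR_API_KEY"                             # hypothetical credential


def analyze_text(text):
    """Send unstructured text to the service and return its JSON analysis."""
    response = requests.post(
        API_URL,
        headers={"Authorization": "Bearer " + API_KEY},
        json={"text": text, "features": ["entities", "sentiment"]},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    print(analyze_text("Patient reports chest pain after moderate exercise."))
```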

Cognitive systems are defined by a few characteristics: They understand the world rather than automate it; they advise and provide evidence rather than output a binary answer; and they learn and improve rather than remain static. To achieve their goal of better understanding the world, cognitive systems glean insights from vast amounts of unstructured text, voice, and image data.

If that sounds like it takes a lot of IT horsepower to accomplish, it does, and it places new stress on the underlying IT infrastructure. The data infrastructure needs to be cost-efficient enough to store vast amounts of data while still delivering the performance required for real-time insights. New technologies such as software-defined storage and flash-based storage are the foundation of the cognitive data infrastructure. New cognitive workloads, such as deep learning, are very compute intensive. Processors designed specifically for handling large data problems give customers a leg up in the race by providing infrastructure well suited to modern analytics engines such as Apache Spark. Systems designed specifically to attach specialized accelerators like graphics processing units (GPUs) or field-programmable gate arrays (FPGAs) are becoming the standard for the computationally intensive training of cognitive systems.
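
As a rough illustration of the kind of workload this infrastructure serves, the sketch below uses Apache Spark (via PySpark) to compute simple term frequencies across a directory of unstructured text. The input path is a hypothetical placeholder; a production cognitive pipeline would extract far richer features and feed them into downstream models.

```python
# Minimal PySpark sketch: extract term frequencies from unstructured text.
# The input path is a hypothetical placeholder.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, explode, lower, split

spark = SparkSession.builder.appName("unstructured-text-sketch").getOrCreate()

# Read raw text files line by line into a DataFrame with a single "value" column.
lines = spark.read.text("data/notes/*.txt")  # hypothetical path

# Tokenize on whitespace, normalize case, and count terms across the corpus.
term_counts = (
    lines.select(explode(split(lower(col("value")), r"\s+")).alias("term"))
    .where(col("term") != "")
    .groupBy("term")
    .count()
    .orderBy(col("count").desc())
)

term_counts.show(20)
spark.stop()
```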

So you can see how today’s system technologies, which were designed with big data and Internet-scale data centers in mind, are being adapted to deal with the even greater demands of cognitive computing. I expect advances in processors, system design, accelerators, and storage to continue to come in waves over the next few years. One promising area is processors designed specifically for neural networks, which can extract patterns from unstructured data quickly and with significantly lower power consumption than general-purpose processors. Another is quantum computing. Scientists at technology companies and in academia are achieving advances in the field that, within a decade, could result in functioning quantum computers able to find the best solution from a universe of possibilities exponentially faster than today’s machines.

IT leaders live in a complex world in which data originates from myriad sources and must reside where it can both meet security requirements and satisfy business demands for real-time insights. IT leaders are embracing hybrid cloud because they know that, to succeed today and in the future, they’ll need a combination of on-premises and off-premises solutions. In almost all industry solutions, real business value will come only when cognitive capabilities are integrated with core enterprise data and business processes. For many situations, this means integrating cognitive capabilities into the on-premises IT infrastructure. For others, it means integrating cloud services with an IT shop’s existing enterprise services via APIs. The right decision depends on the compliance, security, and performance requirements of the solution.
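
As a simple illustration of that decision in code, the sketch below routes each record either to an on-premises analysis service or to a cloud API based on whether it contains sensitive data. Both endpoints and the contains_pii flag are hypothetical placeholders; real routing logic would reflect an organization’s actual compliance, security, and performance requirements.

```python
# Minimal sketch of hybrid-cloud routing: sensitive records stay on premises,
# the rest go to a cloud-hosted cognitive API. Both endpoints are hypothetical.
import requests

ON_PREM_URL = "https://analytics.internal.example.com/v1/analyze"  # hypothetical
CLOUD_URL = "https://api.example.com/v1/analyze"                   # hypothetical


def analyze(record):
    """Route a record to on-premises or cloud analysis based on sensitivity."""
    url = ON_PREM_URL if record.get("contains_pii", False) else CLOUD_URL
    response = requests.post(url, json={"text": record["text"]}, timeout=30)
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    print(analyze({"text": "Quarterly revenue grew 4 percent.", "contains_pii": False}))
```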

Finally, none of these technological breakthroughs will make a difference unless IT leaders, once viewed solely as technicians, embrace their new roles as trusted C-suite advisors and key strategists behind business success. As more companies and organizations struggle to meet the demands of this new cognitive era, IT leaders are being challenged to add value by opening their enterprises to new ways of thinking and new ways of doing business.

This is an incredibly exciting time to be in the computing business. Right before our eyes, the world is being redesigned by data, remade in the cloud, and rewritten in code. Whether you are a developer, an enterprise IT professional, a major corporation, a digital start-up, a government official, or a CEO, cognitive computing is transforming your world. Are you ready?

Scott Crowder is currently Chief Technical Officer and Vice President, Technical Strategy and Transformation for IBM Systems. He is responsible for driving the strategic direction across the hardware and software-defined systems portfolio, leading the agile and Design Thinking transformation, and accelerating innovation within development through special projects. Previously, Scott was Vice President, Technical Strategy within IBM Corporate Strategy. In this role, he helped define the cross-IBM technical strategy for cloud infrastructure, workload optimized systems, Big Data and Analytics, composable services, software-defined infrastructure, and cognitive solutions. Scott joined IBM in 1995 and was the lead engineer on the industry’s first logic-based embedded DRAM technology before serving in a variety of executive management roles within the semiconductor research and development organization. Scott received A.B./Sc.B. degrees in Electrical Engineering and International Relations from Brown University and an M.A. in Economics and M.Sc./Ph.D. in Electrical Engineering from Stanford University.
