Natural-Language Processing Evolves into Vital Technology

by Joshua Whitney Allen   |   September 9, 2014 5:30 am   |   1 Comment

Text Analysis International (TAI) announced last month the launch of its cloud-based Natural Language Processing (NLP) service, which allows programmers to customize text analyzers to consume any content they need. David de Hilster, TAI’s chief operating officer, explained that the analyzers use transparent, open coding to extract meaning from vast stores of text-based content.

TAI’s programming language, NLP++, can perform tasks like optical character recognition and sentiment analysis for finance, medicine, real estate – any number of business verticals, said de Hilster.

“We have a programming language for human language,” he said. “It doesn’t matter if it’s full-blown stories, HTML, old documents from the 18th Century – NLP++ is very ample.”
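
Sentiment analysis, one of the tasks de Hilster cited, can be sketched in just a few lines. The example below is written in Python rather than NLP++, and its word lists and sample sentence are invented for illustration; it is not TAI's code, only a minimal picture of the idea.

    # A minimal lexicon-based sentiment scorer. An illustrative sketch,
    # not NLP++ or any TAI product; word lists and sample text are invented.
    POSITIVE = {"gain", "growth", "strong", "beat", "improve"}
    NEGATIVE = {"loss", "decline", "weak", "miss", "risk"}

    def sentiment_score(text):
        """Return a score in [-1, 1]: positive hits minus negative hits, normalized."""
        words = [w.strip(".,!?").lower() for w in text.split()]
        pos = sum(w in POSITIVE for w in words)
        neg = sum(w in NEGATIVE for w in words)
        total = pos + neg
        return 0.0 if total == 0 else (pos - neg) / total

    print(sentiment_score("Quarterly revenue showed strong growth despite currency risk."))
    # 0.33, mildly positive

Real systems also weigh context, negation, and domain vocabulary, which is the gap a dedicated language like NLP++ is meant to close.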

The announcement is the sort of event that has led analysts to project that the natural language processing market will reach roughly $9 billion by 2018. As distinct as TAI’s product is, it follows a string of arrivals in the NLP market, created by startups and smaller firms, that together show how profit is driving innovation in this sector.

NLP could be the future’s most vital technology. In the big data age, NLP vendors have anticipated our immersion into the details of every human activity. Their advancements are meaningful beyond their balance sheets. For the coming scale of data to have any worth to us, the ability to analyze text and understand the circumstances in which we act must grow with the mass of information we continue to gather.

Yet the urge to enhance NLP is felt by technologists well beyond the commercial sector, and it is important to understand the influences that are shaping NLP as a mechanism. With its devotion to knowledge and its long love for the NLP concept, academia is at once broadening the scope of natural language processing and honing it to thousands of human languages.

EMNLP 2014, an upcoming conference in Doha, Qatar, will showcase the activity of scholars in NLP research. The conference’s list of accepted papers hangs our understanding of economies and political history on seemingly arcane sources. One paper traces the shifts in Chinese diplomatic sentiment as reported in six decades of articles in the People’s Daily, the official newspaper of the Chinese Communist Party. Our brains, too, are under scrutiny; several papers review the psycholinguistic contexts of reading.

Indeed, language remains a link to cultures and intellects, even remote and ancient ones. A scholar in Athens, Greece, has used NLP in an exegesis of the Roman emperor Julian’s speeches. According to the paper, the researcher built a corpus of more than 57,000 words “in order to define the ideological character of his message”; God, justice, virtue, deeds, and reasons stand out for their repeated presence in the canon. In its aim to preserve the meaning that can be lost in translation between modern idioms—say, from English to Punjabi—academia looks at the variances of dialect and inflection with a nuance that the business world has yet to match, but would love to convert into revenue.
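
The mechanics behind such a study are easy to sketch: tokenize the corpus and count how often the key terms recur. The Python fragment below is an invented illustration, not the researcher's actual corpus or pipeline.

    # Counting recurring terms in a small corpus. An invented illustration,
    # not the Athens researcher's actual data or code.
    from collections import Counter
    import re

    corpus = """The emperor appeals to virtue and to justice, and again to virtue,
    so that his deeds may be judged by the gods."""

    tokens = re.findall(r"[a-z]+", corpus.lower())   # crude tokenization
    counts = Counter(tokens)

    for term in ("virtue", "justice", "deeds", "gods"):
        print(term, counts[term])
    # virtue 2, justice 1, deeds 1, gods 1

Scaled up to tens of thousands of words, the same counting and comparison is what lets a researcher argue about the ideological character of a body of speeches.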

No subject is too remote for NLP – in fact, that’s the point. While the academic culture pursues knowledge, scholars expand our ability to take meaning from other languages and cultures in a way that, say, an English-speaking product manager, eyeing an emerging Asian market, can admire. Any global company would be thrilled to achieve customer sentiment analysis that perfectly translates the parochial dialects of a new sales territory. The potential for language processing is so serious that, rather than scoff at such study as the meanderings of dreamers, business is an enthusiastic colleague to NLP scholarship. EMNLP 2014 proudly features as speakers an Ivy League professor and an IBM cognitive computing expert.

People of all talents add their own curiosities to the advancement of NLP concepts. TAI’s de Hilster, long an artificial intelligence researcher, directed a documentary, Einstein Wrong, about scientists who oppose the theory of relativity. He is an accomplished visual artist. While pursuing these expressions, he has worked in business for most of his life.

Indeed, the commercial culture that is designing NLP technology is thrilled to pull whatever people and resources are available for the good of evolution – especially the cloud. Machine Linking, a startup based in the Italian Alps, markets a SaaS program that connects unstructured documents to resources in the Linked Open Data cloud. The Canadian company Yactraq offers cloud-based audio and video meaning extraction, which provides material for targeted marketing and ad delivery. According to its website, Yactraq’s executives and advisers carry credentials from a range of academic and strategic institutions.

Along with art and study, the exigencies of compliance and health have led NLP into new areas of purpose. In 2011, the University of Pittsburgh Medical Center joined with Nuance Communications to design natural language processing solutions to capture patient data. Speaking to Healthcare Informatics, UPMC’s Dr. Andrew Watson explained that, “Eighty percent of our data used to be unstructured text…we have put our entire system onto voice recognition and NLP.”

As a result, UPMC has achieved $12 million in transcription savings. But, more importantly, NLP provides accurate, reliable information when it’s needed in what are literally life and death situations.

“If you don’t have accurate or timely information, you can injure patients, they can get readmitted, and they can die,” Watson said. “In medicine, the absence of information is harmful, and you need real-time data for medical decision making.”
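
At its simplest, the task UPMC describes, turning dictated prose into structured fields a chart can use, looks something like the Python sketch below. The note, patterns, and field names are invented for illustration; real clinical NLP, including UPMC's and Nuance's, is far more sophisticated.

    # Pulling structured values out of an unstructured clinical note.
    # An invented toy example, not UPMC's or Nuance's actual system.
    import re

    note = "Patient reports chest pain. BP 142/88, heart rate 96 bpm. Started metoprolol 25 mg."

    record = {}
    bp = re.search(r"BP (\d{2,3})/(\d{2,3})", note)
    if bp:
        record["systolic"], record["diastolic"] = int(bp.group(1)), int(bp.group(2))
    hr = re.search(r"heart rate (\d{2,3})", note)
    if hr:
        record["heart_rate"] = int(hr.group(1))

    print(record)   # {'systolic': 142, 'diastolic': 88, 'heart_rate': 96}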

Joshua Whitney Allen has been writing for fifteen years. He has contributed articles on technology, human rights, politics, environmental affairs, and society to several publications throughout the United States.



One Comment

  1. Roger McWilliams
    Posted September 9, 2014 at 5:15 pm | Permalink

    NLP has been around for a while. Too many pieces, black boxes, inflexible tools. Maybe this is the language we have been waiting for. Will have to check out NLP++.
