Traditional Data Management Going the Way of the Mainframe

April 4, 2016

Ashish Gupta, CMO and SVP, Business Development, Actian


The big data technology and services market is expected to reach $48.6 billion in annual spending by 2019. That growing spend comes with business intelligence (BI) and analytics requirements that are breaking traditional data warehousing platforms. In a recent study, 46 percent of the senior executives surveyed acknowledged that their traditional technologies were not architected to handle modern workloads.

It’s a problem compounded by the unbounded growth of data, and new sources of data only make the quest for accurate insights more difficult. These traditional architectures forced more than 25 percent of respondents to discard data in order to get analytic insights, because the systems couldn’t scale to process the volume of data being collected. In addition, 31 percent of respondents said traditional architectures are not designed for new workloads, and 29 percent said they are too expensive to scale. Customers are suffering from the innovator’s dilemma in which these traditional vendors find themselves. Is this the tipping point?

More data has been created in the last few years than in the entire prior history of mankind, and the pace is not slowing. Organizations must access, process, and analyze troves of data in real time to make effective business decisions in a contextual and timely manner, and it is no longer viable for business leaders to fall back on traditional data management platforms. Many have started to turn from static legacy frameworks to more modern solutions like Hadoop and Spark, but with limited success. While these may be cost-effective tools for storing and sifting through massive amounts of data, 30 percent of respondents in the aforementioned study said these new Hadoop analytics architectures are not yet ready to provide the enterprise-grade performance needed to execute advanced real-time analytics effectively.


Amid this platform stagnation, BI tools are also suffering, and this one-two punch is leaving businesses overwhelmed on dual fronts. Just 40 percent of survey respondents said their BI tools were working well with historical data sets, and 32 percent acknowledged these tools were overmatched by increasing data volumes.

This suffering extends beyond functionality, ultimately forcing enterprises to pay increasing sums for solutions that grow less effective over time: 47 percent of respondents said the cost to maintain traditional systems continues to rise. Yet companies are apprehensive about ripping and replacing their faltering legacy solutions: only 32 percent of survey respondents indicated that they would supplant them with a modern tool. This is often because organizations have already invested so much in their existing data management platforms that they are reluctant to write them off as a loss. Instead, 62 percent of respondents have decided to augment these traditional systems with modern architectures to meet the needs of their modern analytic workloads. This often means relegating traditional systems to transaction processing workloads – much like what happened in the shift from mainframes to x86-based servers.

Opportunity in the Downfall of Traditional Management Platforms

There is still opportunity as businesses navigate the challenging waters of the modern data era. The opening to capitalize on business opportunities through customer analytics and Internet-of-Things strategies is there, but organizations first must overcome the barriers created by the breakdown of legacy solutions and find ways to better manage the growing stress of big data. Until that is achieved, they will continue to sit on a wealth of untapped data, limited by the commercial and technical constraints of their traditional management systems. So what will it take to break free?

There is no magic bullet, but the best answer lies in combining existing and new technologies. Businesses want to apply their domain expertise and continue leveraging their legacy investments while still enjoying the benefits of more modern environments like Hadoop – and solutions like SQL-in-Hadoop make this possible. Hadoop can already address the cost and scale issues of data storage. What remains is turning it into a big data analytics platform fit for the enterprise.

Fortunately, existing applications and queries based on SQL – the lingua franca of relational databases – don’t have to be rewritten to work with Hadoop, and data doesn’t have to be moved out of it either, making Hadoop an ideal complement to legacy deployments. Using SQL also lets users keep their existing BI and visualization tools, not to mention existing dashboards and reports. In all, it allows organizations that have invested extensively in legacy data management platforms to retain those systems and maximize their ROI while meeting the demands of analyzing today’s big data with more modern solutions.
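The point above can be sketched in a few lines: when a SQL-on-Hadoop engine exposes a standard database interface, the existing BI query text is reused verbatim and only the connection changes. This is a minimal illustration, not Actian's implementation; the table, query, and the use of Python's built-in sqlite3 module as a stand-in backend are all assumptions made for the sake of a runnable example.

```python
import sqlite3

# The existing BI query stays exactly as written for the legacy warehouse.
# With a SQL-on-Hadoop engine that speaks a standard DB-API/JDBC-style
# interface, only the connection line below would change, not this text.
LEGACY_BI_QUERY = """
    SELECT region, SUM(revenue) AS total_revenue
    FROM sales
    GROUP BY region
    ORDER BY total_revenue DESC
"""

def run_report(conn):
    """Run the unchanged BI query against whatever backend `conn` points to."""
    return conn.execute(LEGACY_BI_QUERY).fetchall()

# Stand-in backend (in-memory sqlite3) purely for illustration; in practice
# `conn` would come from the SQL-on-Hadoop engine's own connector.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, revenue REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("EMEA", 120.0), ("APAC", 90.0), ("EMEA", 30.0)])

print(run_report(conn))  # [('EMEA', 150.0), ('APAC', 90.0)]
```

Because the query, dashboards, and reports are untouched, the migration cost is concentrated in the connection layer rather than in rewriting the analytics themselves.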

Just like mainframes that could not keep up with compute and scale requirements, traditional data management systems are floundering under the current waves of data workloads. They are pinned down by architectural limitations and expensive commercial models. But because of the sunk costs they represent – and the lack of an enterprise-ready modern alternative – they will not be replaced soon. Perhaps it’s time for organizations to invest those dollars in a different approach, one that leverages new technology to complement the faltering data management platforms that have proven so hard to replace.

Ashish Gupta joined Actian in 2013, where he is responsible for marketing and business development. Ashish brings more than 21 years of experience at enterprise software companies, where he focused on creating go-to-market approaches that scale rapidly and building product portfolios that became category leaders. Ashish was formerly at Vidyo, which grew to be the leading software-based videoconferencing platform, was named to the Wall Street Journal’s “Next Big Thing” list for three years, and was selected as a “Tech Innovator” by the World Economic Forum.

Previously, Ashish led the Business Development and Strategy, Marketing, and Corporate Sales teams for Microsoft Office Division’s Unified Communications Group, responsible for introducing the industry-leading Microsoft Lync product. Prior to Microsoft, Ashish was VP of Product and Solutions at Alcatel/Genesys Telecommunications and VP of Marketing and Business Development at Telera Inc. (acquired by Alcatel), and management consultant for Braxton/Deloitte Consulting. He also held marketing leadership positions at HP and Covad. He holds an MBA from UCLA and a bachelor’s degree in Economics and Computer Science from Grinnell College.
