How to Choose the Right Solution in an Evolving Database Market

August 26, 2016

Julie Lockner, Global Data Platform Product Marketing, InterSystems

In 2011, 451 Research began publishing its now popular map of the evolving database landscape. Inspired by the London Underground map, these charts look very much like the schematic for a demented theme park ride.

This year’s map adds still more complexity to that landscape, charting 386 products.

What has crowded the field over the past few years is the steady arrival of technologies built to solve niche problems, and these new market entrants are gaining traction in a variety of industries with the backing of analysts and developers. Just as interesting as the sheer number of players is the explosion of categories. Back in 2011, there were six or seven categories; today, even the relationships among the vendors are three-dimensional.

Does this mean you need to have two or three data platforms in place? Is this why you have so many vendors knocking on your door? How many databases do you really need?

Survival of the Fittest

Whenever there is an explosion of suppliers in any industry, there are lessons to learn. In an industry changing as rapidly as data platforms, whatever the market builds to solve a problem is often out of date by the time it ships.

Within a two-mile radius of our company headquarters in Cambridge, Mass., 15 new database startups have emerged since 2010. And there’s no telling how many will still be here three years from now. A few will thrive, while others will vanish. This crowded chart will grow increasingly bare – an inevitable consolidation. We have seen tech companies come and go in many other markets, and the data platforms category is not immune to this thinning of the herd.

What if you have a project that will last multiple decades? You’ll need a platform that will stand the test of time. How do you decide?

William O’Mullane, Scientific Operations Manager of the Gaia mission at the European Space Agency, recently spoke about his need for a data platform to map a billion stars in the Milky Way. Not a trivial task. He had several database options to choose from to support his mission.

“My projects run for 20 to 30 years,” he said. ESA started with a relational database, but switched to a multi-model database that supported SQL and had a better, more efficient data model requiring fewer hardware resources. It also could support multi-dimensional queries. It wasn’t one of the newer databases on the market, but it was the one that was proven, reliable, and scalable while offering a lower total cost of ownership.

“In our projects, we find ways to make our technology independent from the database itself, without having to worry about performance tuning,” he said.
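One common way to achieve that kind of independence is a thin data-access layer that hides the specific database driver behind an application-level interface. The sketch below is a minimal illustration of the pattern, not ESA’s actual design; the StarCatalog interface, its methods, and the SQLite backend are hypothetical stand-ins.

```python
import sqlite3
from abc import ABC, abstractmethod


class StarCatalog(ABC):
    """Application-facing interface; callers never touch a database driver directly."""

    @abstractmethod
    def save_observation(self, star_id: str, magnitude: float) -> None: ...

    @abstractmethod
    def observations_for(self, star_id: str) -> list[float]: ...


class SqlStarCatalog(StarCatalog):
    """One possible backend, built on any DB-API (PEP 249) connection."""

    def __init__(self, connection):
        self._conn = connection

    def save_observation(self, star_id: str, magnitude: float) -> None:
        with self._conn:  # commits the transaction on success
            self._conn.execute(
                "INSERT INTO observations (star_id, magnitude) VALUES (?, ?)",
                (star_id, magnitude),
            )

    def observations_for(self, star_id: str) -> list[float]:
        rows = self._conn.execute(
            "SELECT magnitude FROM observations WHERE star_id = ?", (star_id,)
        ).fetchall()
        return [magnitude for (magnitude,) in rows]


# Swapping databases means writing another StarCatalog implementation;
# the rest of the application is untouched.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE observations (star_id TEXT, magnitude REAL)")
catalog = SqlStarCatalog(conn)
catalog.save_observation("star-001", 11.4)
print(catalog.observations_for("star-001"))
```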

So what do you say when your star architect, developer, and analyst say you need to use database X or Y? What are your decision criteria? One key question to ask is whether the latest “flavor of the month” technology will be around for more than a few years.

Many of the newer databases look like specialized solutions, built to serve narrow use cases. However, if you look at the legacy systems that established firms put in place 20 to 30 years ago, and the investment tied up in those systems, it is clear that SQL is not going away any time soon. Only one or two players will emerge victorious from the fray.

Scaling Horizontally, Vertically, or a Blend?

There was a significant shift in the market when the chip vendors ran up against a clock-speed wall. The need for speed forced them to pack more and more cores into a single socket, so we now see 128-core systems. This is a radical leap from the past, when 64-core systems were deemed gargantuan machines fit for physics research, not for business. Now, they are becoming commonplace.

Orthogonally, we have moved toward virtualization – hiding the hardware and commoditizing core count, memory size, disk speed, I/O, and so on – which opens up different options for the configurations you are designing. Two radically different directions are emerging: fine-grained, low-contention concurrency spread across many cores vs. hundreds of thousands of transactions queuing behind a single latch or resource. Clearly, there is a need for both massive scale-up and scale-out systems.

Which of these is more relevant to you? Capacity added in bite-sized chunks? Very large systems? Or a blend? It’s not just about the workload; it’s about the total cost of ownership. If you deploy 12 smaller systems instead of one large system, the care-and-feeding requirements are different, with different costs at deployment, development, and management time. Every customer’s business goals, and thus data platform requirements, are unique and are constantly being re-evaluated against something better.
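As a back-of-the-envelope illustration of that trade-off, here is a toy total-cost-of-ownership comparison between twelve small nodes and one large system. Every figure is an invented placeholder; the only point is that hardware price is one line item among several, alongside administration and licensing.

```python
def yearly_tco(nodes: int, hw_per_node: float, admin_hours_per_node: float,
               admin_hourly_rate: float, license_per_node: float) -> float:
    """Toy TCO model: hardware amortized over three years, plus admin labor and licensing."""
    hardware = nodes * hw_per_node / 3
    administration = nodes * admin_hours_per_node * admin_hourly_rate
    licensing = nodes * license_per_node
    return hardware + administration + licensing


# Hypothetical numbers -- substitute real quotes and staffing estimates.
scale_out = yearly_tco(nodes=12, hw_per_node=15_000, admin_hours_per_node=80,
                       admin_hourly_rate=100, license_per_node=5_000)
scale_up = yearly_tco(nodes=1, hw_per_node=150_000, admin_hours_per_node=200,
                      admin_hourly_rate=100, license_per_node=40_000)
print(f"12 small systems: ${scale_out:,.0f}/year")
print(f"1 large system:   ${scale_up:,.0f}/year")
```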

What about Cloud?

When we sit down with partners who want to deploy in the public cloud – server, storage, I/O model – the questions become what you have to pay to deliver a specific level of service and how rapidly the cloud providers are evolving to offer the right kinds of services. Answering these questions is a daunting task.

What will the bill look like when deploying in the cloud? Customers are concerned about security and liability issues. But almost every enterprise I have ever worked with relies on a cloud-based application for at least one mission-critical function. Resistance to that risk is shrinking as the cost-benefit tradeoff becomes more appealing.

What about DBaaS?

In conversations with clients, we discuss the database as an independent architectural component. Starting with a few small but vocal innovators and extending to all major providers, people are talking about the opportunities to have database as a service, or DBaaS.

Amazon, Google, and Oracle offer strong choices, with capabilities ranging from full disaster recovery to analytics-only offerings. It is interesting to note, though, that 451 Research found database deployments moving to the cloud separately from their applications only very slowly.

Many enterprises are looking to their current providers to solve these problems for them rather than looking at a new vendor. Do you see DBaaS as separate from your application? What kind of applications do you see deploying DBaaS? Should the database be part of the application?

As a European colleague of mine noted, “You can’t put data on an American server.” This is an issue with local jurisdiction laws, with ramifications for how data platforms are selected in many countries.

What is Next-generation?

Increasingly, databases are consolidating to answer the need for a general-purpose platform for transaction processing and some basic analytics. Now, with the need for more data-driven decisions to guide how we run our businesses, organizations, and governments, there is an even greater need for a general-purpose data platform. Ideally, this would be based on a singular multi-model database that stores all types of data structures, and a common engine that supports multiple workloads (transaction processing and analytics) without sacrificing ACID compliance and interoperability. This will future-proof applications as data sources change how they publish data.
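To make “multi-model” concrete, the sketch below stores the same order both as a relational row and as a JSON document inside a single SQLite database, then queries each representation through the same engine. It is a toy stand-in under the assumption that SQLite’s JSON functions are available, not a reference to any particular product’s API.

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")

# Relational model: fixed schema, queried with plain SQL.
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
conn.execute("INSERT INTO orders VALUES (1, 'Acme', 99.50)")

# Document model: schemaless JSON stored alongside, queried with json_extract().
conn.execute("CREATE TABLE order_docs (id INTEGER PRIMARY KEY, doc TEXT)")
doc = {"id": 1, "customer": "Acme", "lines": [{"sku": "A-1", "qty": 3}]}
conn.execute("INSERT INTO order_docs VALUES (1, ?)", (json.dumps(doc),))

# A transactional lookup and a document-style query run against one engine.
print(conn.execute("SELECT total FROM orders WHERE customer = 'Acme'").fetchone())
print(conn.execute(
    "SELECT json_extract(doc, '$.lines[0].qty') FROM order_docs WHERE id = 1"
).fetchone())
```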

The data platform market, inevitably, will see more consolidation. Application developers will need to pay extra attention to how well integrated the merged technologies really are, and how much extra work they will need to plan on to realize value. In this changing environment, customers will derive the most benefit from a single platform that supports multiple data models, multiple scale-up and scale-out configurations, multiple languages and deployment options, and multiple decades of proven success.

So how many databases do you really need? The better question is: on which data platform do you want to bet the future of your most important applications?

Julie Lockner leads global data platform and partner marketing programs for InterSystems. She has more than 20 years of experience in IT product marketing management and technology strategy, including roles at Informatica and EMC. Follow her on Twitter at @JulieLockner.
