There is little question that the public cloud business is picking up, as the CIO Magazine article "Why Private Clouds Will Suffer a Long, Slow Death" suggests. That is a dramatic title, but the fact is that public cloud platforms such as Amazon AWS, Microsoft Azure, and Google Cloud are growing at an impressive clip.
Yet even the article title, if you read it closely, says the transition from private to public cloud dependence will be long and slow. So how fast should your organization move on its journey to the public cloud?
Moving to the Cloud: Swift or Slow?
A number of organizations have already put the majority of their business applications in the public cloud. For example, well over 80% of the software applications that Informatica runs its business on are now cloud applications. But not every organization is ready or able to move that quickly. For the vast majority of larger corporations, with a great deal of gravity around on-premise applications and data, it will be a long process. I recently spoke to a bank that had more than 100 data warehouses and used thousands of applications to run its business. Organizations of that size and complexity will not be moving to the cloud overnight.
So, what should your organization’s public cloud strategy be? That depends on a number of factors, including the size and scope of your data. Depending on your organization’s needs, your cloud strategy can be to:
– Buy new applications in the cloud where there is no on-premise application of equivalent functionality available.
– Hold the current on-premise applications constant and add any new applications in the cloud.
– Institute a “cloud-first policy” where you would primarily use cloud applications unless you can justify a deviation from that policy.
– Institute a "cloud-only" policy. This is an extreme step, and I have not seen many Fortune 500 companies take it.
– A combination of any of the above.
Once you have determined your cloud strategy, you can start to scope out the size of the data management challenge you will be dealing with. Different organizations are going to have different public cloud journeys and different speeds of implementing them.
Modern Data Integration in a Hybrid World
No matter what your strategy is, you will most likely be dealing with a hybrid environment for several years as you split your applications between on-premise and cloud options. If you are planning your data integration and data management strategy for this hybrid world, here are some safe assumptions:
– For most large organizations, particularly those with a history of on-premise applications and data, the journey to the public cloud is going to be a long process of change. In the interim, you will be dealing with a highly hybrid environment from a data management point of view.
– Cloud applications will continue to proliferate and grow rapidly. These will need to have data synched and managed with other applications, both on-premise and in the cloud.
– We will continue to see strong growth in cloud analytics from cloud data warehouses, such as AWS Redshift, Azure SQL Data Warehouse, and Snowflake. These analytics platforms are going to include an increasing amount of machine learning/deep learning, predictive, and AI analytics capabilities.
– Many organizations will increase their spending with public cloud platform vendors (such as Amazon AWS, Microsoft Azure, or Google Cloud Platform). The question is, will they standardize on a single platform and risk lock-in, or will they support multiple cloud platforms? We are seeing many large enterprises use multiple platforms, at least for the present.
If data management was hard before, with the complexity and silos of the data center, it is going to get much harder as the number of cloud applications and platforms grows. This growing hybridity means that you are going to have to connect to everything and plan for the fact that "everything" might be either on-premise or in the cloud in the future.
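One way to plan for "everything might be on-premise or in the cloud" is to write integration logic against a common connector interface rather than against any one system. The sketch below is a minimal illustration of that idea, not a real product API; all class names and the record shapes are hypothetical.

```python
from abc import ABC, abstractmethod


class DataConnector(ABC):
    """Common interface so pipelines don't care where a source lives."""

    @abstractmethod
    def read(self) -> list[dict]:
        ...


class OnPremiseConnector(DataConnector):
    """Hypothetical connector for a database in your own data center."""

    def __init__(self, records: list[dict]):
        self._records = records  # stands in for a real database query

    def read(self) -> list[dict]:
        return self._records


class CloudConnector(DataConnector):
    """Hypothetical connector for a SaaS application's API."""

    def __init__(self, records: list[dict]):
        self._records = records  # stands in for a real API call

    def read(self) -> list[dict]:
        return self._records


def merge_views(sources: list[DataConnector]) -> list[dict]:
    """A pipeline written against the interface keeps working whether a
    given source is on-premise today or migrated to the cloud tomorrow."""
    combined: list[dict] = []
    for source in sources:
        combined.extend(source.read())
    return combined
```

When an application moves from the data center to the cloud, only the connector that is constructed changes; the pipeline code itself is untouched.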
The mega-trend here is that organizations are increasingly looking to compete with their data. This is the best and most sustainable source of competitive differentiation they have today. In this world, all of an organization’s data will need to be managed as a shared asset available to all, whether it was internally or externally sourced. The challenge will be to build data management systems and architecture that will enable them to manage the transition to new cloud applications while at the same time speeding the delivery of trusted data to fuel business processes, applications, and analytics across the organization.
Data Management for the Long-Term Cloud Journey
Here are some final data management concerns to think about as you embark on the journey to the public cloud:
– It will pay to have a mature data management architecture in place to help you migrate to the cloud and to keep your on-premise and cloud applications synched with the most current and trustworthy data. Your data integration and data management systems should work across both environments; failing to do so risks creating new "data silos."
– It will also pay to have some form of data governance in place. There is no point in moving around bad-quality data that nobody can trust. And the longer you take, the harder it will be to retrofit this in a complex environment.
– Your architects will need to be thinking about your data “center of gravity” as you make the journey to the cloud. The data management architecture that works when you are 90% on-premise is probably not the data management architecture that will work when you are 90% in the cloud.
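The governance point above, that there is no value in replicating data nobody can trust, can be illustrated with a simple quality gate that runs before any synchronization. This is a minimal sketch; the validation rules and field names are invented for illustration, not a real governance policy.

```python
def validate_record(record: dict) -> list[str]:
    """Return a list of data-quality problems; an empty list means the
    record passes. Rules and field names here are purely illustrative."""
    problems = []
    if not record.get("customer_id"):
        problems.append("missing customer_id")
    email = record.get("email", "")
    if "@" not in email:
        problems.append(f"invalid email: {email!r}")
    return problems


def gate_before_sync(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split records into (clean, quarantined) before moving them anywhere,
    so bad data is held back and fixed rather than copied into the cloud."""
    clean: list[dict] = []
    quarantined: list[dict] = []
    for record in records:
        if validate_record(record):
            quarantined.append(record)
        else:
            clean.append(record)
    return clean, quarantined
```

Retrofitting a gate like this into a sprawling hybrid environment later is far harder than building it in from the start, which is the point of the governance bullet above.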
There is no doubt: the public cloud journey is happening. The organizations that successfully harness data to drive better customer interactions, higher productivity, and better business outcomes will be those that take the time to think data-first and define a clear data management architecture that works today, during the transition to the cloud, and in their desired future state.
Roger Nolan is the Director of Solutions at Informatica. His focus is on data management solutions and architectures for Analytics and Application Modernization that will accelerate business value delivery. Before joining Informatica, Roger held a variety of senior roles in Product Marketing, Product Management, Strategic Alliances, and Corporate Development at Avaya, Sun Microsystems and Metricom. He has deep experience in enterprise software, communications & collaboration software, and internet telephony products. Roger has an MBA from Boston College and a BS from Northeastern University.
Subscribe to Data Informed for the latest information and news on big data and analytics for the enterprise.