With hybrid IT paving the way to the cloud, organizations have more options than ever to provision and deploy workloads in the way that best benefits the business and end users. However, IT budgets are expected to decrease this year, according to Gartner. So as IT professionals decide where to host their application workloads, whether on-premises, in the cloud, or both, it is critical to maximize available funds so the organization realizes the full benefits of a hybrid approach and leaves money for other important projects.
A key part of doing this is to look to database professionals and the databases they manage, which are arguably among the most important components of a business's success and surely among the most complex and critical when it comes to resource (and cost) consumption.
The DBA’s Impact on the Bottom Line
While it might not be obvious, there are a couple of ways database professionals can affect the finances of an IT department.
First, moving workloads from high license cost database management systems to lower-cost or open source systems can affect the bottom line. In fact, for certain applications, this strategy of moving from a traditional legacy database such as Oracle or SAP to MySQL or PostgreSQL can result in significant savings.
But the biggest opportunity for DBAs to influence cost savings has to do with proactive database management, which can help prevent downtime and maintain company efficiency, as a slow database decreases productivity for every user of any given application. What’s more, making sure databases are not only running, but performing well can have a dramatic financial impact. The performance—good or bad—of a database can significantly impact the operating cost an organization is footing on a monthly basis, whether it’s in the cloud, on-premises or both.
However, the database is still one of the least understood components of data center technology. This is despite how important applications are to modern business and, in turn, how important databases are to applications: when an application performance or availability problem arises, there is a good chance it is rooted in the underlying database's performance.
As a result, organizations routinely try to solve application performance problems with a number of costly solutions. If a database's performance begins to degrade, for example, all too often the first reaction is to scale up with additional, faster hardware, an approach borrowed from the way application performance issues are typically solved. Some organizations will simply move a poorly performing on-premises database to the cloud, but without a fundamental understanding of the root cause, that database will continue to perform poorly in the cloud as well and will consume expensive resources.
A company might attempt to add memory to the server running the database, move it to an instance with a higher CPU capacity, move to SSD storage, or try any number of other potential solutions that ultimately add up to a much larger bill at the end of each month. And because few people really understand what happens inside a database, and what the specific bottleneck impacting performance might be, these decisions are usually made without an appreciation or knowledge of how they will ultimately impact database performance.
This all results in continuously increasing infrastructure spend that most businesses accept as a side effect of running a technology-based business, without ever addressing the root of the problem.
In such cases, DBAs can step in and act as a technology liaison to business management, regardless of whether databases reside on-premises or in the cloud. With the right performance optimization tools, which provide comprehensive visibility into where the database is spending its time, DBAs can drill down to tune queries, identify and remove bottlenecks, and pinpoint the root cause of performance problems. This method helps inform a more strategic troubleshooting approach.
For example, a DBA will add CPU only when they know CPU is the bottleneck, and when they know the exact performance impact the additional investment will deliver. Similarly, they will move to SSD drives only when they know storage read/write throughput is a significant contributor to poor performance.
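The triage described above can be sketched in a few lines: aggregate sampled wait events by category, and invest only where the dominant wait actually lies. This is a minimal, hypothetical illustration (the category names and numbers are invented); real tools gather this data from engine-specific views such as SQL Server's wait statistics or PostgreSQL's activity views.

```python
from collections import Counter

def top_bottleneck(wait_samples):
    """Rank wait categories by accumulated wait time (ms) and
    return the dominant category with its share of total waits."""
    totals = Counter()
    for category, wait_ms in wait_samples:
        totals[category] += wait_ms
    category, wait_ms = totals.most_common(1)[0]
    return category, wait_ms / sum(totals.values())

# Hypothetical samples: (wait category, accumulated wait in ms)
samples = [
    ("CPU", 120), ("disk_io", 900), ("lock", 150),
    ("CPU", 180), ("disk_io", 1100), ("network", 50),
]

category, share = top_bottleneck(samples)
# Here the data points at faster storage, not more CPU,
# as the justified upgrade.
print(f"dominant wait: {category} ({share:.0%} of wait time)")
# -> dominant wait: disk_io (80% of wait time)
```

In this invented scenario, adding CPU capacity would have been money spent on the wrong resource; the same ranking logic is what lets a DBA tie each infrastructure investment to a measured bottleneck.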
The final result is what can be called performance certainty, a state in which the DBA knows how their system performs, why it performs that way, what the drivers of performance are, and that the system has been optimized to run at top speed while consuming the minimum resources needed. At the end of the day, a DBA's focus on understanding the performance of their organization's database systems equals a more efficient application that consumes fewer infrastructure resources and therefore costs less to operate, and a more strategic investment overall.
In the era of hybrid IT, which aims to allow organizations to run leaner by leveraging cloud providers’ services alongside traditional on-premises deployments, the DBA is in an ideal position to work alongside management to generate cost savings through performance management.
The following tips will help DBAs successfully manage the performance of databases in hybrid IT environments, and allow them to create a consistent correlation between performance and cost that can add to a business’s bottom line.
– Strategy is key: If your organization is moving a set of applications to the cloud, implement a strategy ahead of time. A roadmap that’s put in place prior to the transition will help cut workload costs and headaches. This includes having a fundamental understanding of the service provider’s SLAs and capabilities, as well as a thorough review of their recommended architecture and maintenance schedules. And remember, the cloud is not an end-all, be-all solution. A database that is performing poorly on-premises will perform poorly in the cloud if the root cause is not addressed.
– Unified view: Monitoring and optimization matter for on-premises deployments, and even more so in the cloud given its dynamic nature, so a consistent set of tools that spans both sides of a hybrid IT environment is ideal. DBAs should use comprehensive management and monitoring tools that provide a single dashboard of performance and the ability to drill down across database technologies and across deployment methods, including cloud. This will ensure your organization isn't wasting valuable IT funds by addressing a database performance problem with the wrong solution.
– Shift to a proactive mindset: Instead of reactively fighting fires, DBAs must transition to strategic roles focused on proactively improving the database. This will create extra time to gain knowledge about new technologies, while also cutting costs. Such a proactive, performance mindset requires looking beyond resource consumption and slow queries to performance-oriented methodologies like wait-time analysis.
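To make the wait-time-analysis distinction concrete, here is a hypothetical sketch (the query names and numbers are invented) contrasting a resource-consumption ranking with a wait-time ranking. The query burning the most CPU is not necessarily the one users spend the most time waiting on:

```python
# Hypothetical per-query stats: (query name, cpu_ms, total_wait_ms).
# A resource-consumption view ranks queries by CPU burned; wait-time
# analysis ranks them by how long sessions actually waited on them.
stats = [
    ("report_query",     400, 2500),  # I/O-bound report, slow for users
    ("inventory_update", 900,  300),  # CPU-heavy but finishes quickly
    ("session_lookup",   150, 1200),  # stuck behind lock contention
]

by_cpu  = max(stats, key=lambda row: row[1])[0]
by_wait = max(stats, key=lambda row: row[2])[0]

# The two views point at different queries: tuning by CPU alone
# would target inventory_update, while users feel report_query most.
print("top consumer by CPU: ", by_cpu)
print("top wait contributor:", by_wait)
```

The design point is simply that wait time is the metric end users experience; a proactive DBA who ranks work by wait time tunes what the business actually feels, not what a utilization graph happens to highlight.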
DBAs are and have always been on the front line of managing a crucial, but often overlooked, piece of business success: the database. By recognizing the impact DBAs can have on the bottom line and leveraging the above best practices to ease the shift toward hybrid IT, DBAs can enable their organizations to maximize available IT funds and put any savings from strategic performance tuning toward other projects in greater need of resources.
Gerardo Dada is Vice President of Product Marketing and Strategy for SolarWinds’ database, applications and cloud businesses globally, including SolarWinds Database Performance Analyzer and SolarWinds Cloud. Gerardo is a technologist who has been at the center of the Web, mobile, social, and cloud revolutions at companies like Rackspace, Microsoft, Motorola, Vignette, and Bazaarvoice. He has been involved with database technologies from dBase and BTrieve to SQL Server, NoSQL, and DBaaS in the cloud.