Until recently, much of the talk around cutting data center costs focused on closing facilities and eliminating redundancies. But that’s only part of the solution. To be most effective, you need to focus on optimizing your data centers, using virtualization to add smart capabilities to the assets you keep.
August 1, 2016 marked the release of the Data Center Optimization Initiative (DCOI) – a refreshed set of standards for federal data center management from the Office of Management and Budget (OMB). It requires agencies to consolidate their facilities, adopt more efficient infrastructure, reduce costs, improve IT security, and move to cloud and inter-agency shared services.
While it builds on earlier efforts to save money by consolidating data centers, DCOI is uniquely focused on lowering energy consumption and optimizing the assets that remain in place.
The initiative names five new target areas for optimization:
- Energy metering
- Power usage effectiveness (PUE)
- Virtualization
- Server utilization
- Facility floor space utilization
By gaining efficiencies and reducing excess, agencies are expected to save $2.7 billion (25 percent of data center costs) by 2018.
Optimizing the Ecosystem
CIOs will decide which data centers to retain and how to optimize them. Moving to the cloud is a great first step, but agencies still might need some physical, private infrastructure to meet their program requirements. CIOs must work with program and facility managers to evaluate all IT components and make strategic decisions that support the entire ecosystem.
A smart data center strategy should leverage four key capability areas: cloud, virtualization, data architecture, and sensor technology.
Government has already taken great strides to facilitate cloud adoption. With FedRAMP, it’s easier than ever for programs and departments to explore the benefits of cloud technology without compromising data security. Cloud environments are scalable and let agencies provision resources on demand. Additionally, cloud platforms are configurable and flexible, and they free up much of the physical space in data centers, leaving more room for critical infrastructure. GSA has a number of solutions in place to help agencies quickly and cost-effectively procure cloud services.
Earlier mandates focused on server virtualization, which was a great start. Now, DCOI extends virtualization across the entire data center. By virtualizing servers, storage and networks, agencies can adopt software-defined data centers that support sharing and efficiency. Multiple applications, programs and missions can run on the same system without duplicating infrastructure or operating in isolated silos. Virtualization also lets agencies make more effective use of the critical hardware they decide to keep, and it helps build a “future-proof” data center.
Smarter Data Architecture
An optimized data center relies on a smart data architecture. You need to know where your data is and what kind of storage it needs, based on its importance and frequency of use. For example, the data you use often can be housed in more expensive, high-performance flash storage, while less frequently used data can be stored on spinning media, optical or tape. With storage optimization technologies, you can automate this tiering process, creating an environment where data is in the right place at the right time.
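To make the tiering idea concrete, here is a minimal sketch of frequency-based tier placement. The tier names, access-count thresholds, and dataset names are illustrative assumptions, not a real product API; commercial tiering tools make this decision continuously and automatically.

```python
"""Sketch of automated storage tiering by access frequency.

Thresholds and tier names are illustrative assumptions only.
"""
from dataclasses import dataclass


@dataclass
class DataSet:
    name: str
    accesses_per_day: int  # observed access frequency


def choose_tier(ds: DataSet) -> str:
    """Map access frequency to a storage tier (thresholds are assumptions)."""
    if ds.accesses_per_day >= 100:
        return "flash"          # hot data: high performance, higher cost
    if ds.accesses_per_day >= 10:
        return "spinning-disk"  # warm data: cheaper capacity storage
    return "tape"               # cold data: archival, lowest cost per TB


# Hypothetical inventory of datasets to place
inventory = [
    DataSet("transaction-logs", 500),
    DataSet("quarterly-reports", 20),
    DataSet("2012-archives", 0),
]
placement = {ds.name: choose_tier(ds) for ds in inventory}
print(placement)
# {'transaction-logs': 'flash', 'quarterly-reports': 'spinning-disk', '2012-archives': 'tape'}
```

In practice the policy would be re-evaluated on a schedule, demoting data as it cools and promoting it as access picks up.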
A smart facility recognizes that different storage methods have different energy needs. For example, flash doesn’t require as much power and cooling as other methods – like spinning disk drives. With microclimates in your data center, warmer areas can house smaller sets of critical data on flash, and cooled sections can house larger amounts of data on spinning disk drives. With a strategic tiering structure, the data center will see major cost savings from reduced cooling and power usage.
Sensors and IoT
The more IT assets you can monitor with sensors, the more awareness you gain across your IT ecosystem. DCOI calls for agencies to replace manual data collection with more accurate and efficient automated monitoring tools. By collecting and cataloging data on key functions, CIOs can establish benchmarks for what, when and where resources are used.
In an IoT-enabled data center, sensors that measure temperature and air flow will identify hot spots. This lets IT managers know when to cool specific areas, as well as how much cooling is needed in each area. Analytics can highlight usage patterns, helping IT teams regulate temperature automatically, proactively and gradually so they avoid using large bursts of energy. Over time, learnings from these analytics will feed future analytics. This will lead to increasingly intelligent automated systems that continuously adapt to changing inputs. As these systems get smarter, CIOs will be able to make faster, more informed decisions for their agency.
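The hot-spot detection described above can be sketched in a few lines. The zone names, sample values, and the temperature threshold below are illustrative assumptions; a production system would pull readings from real sensor feeds and drive cooling automation rather than just reporting.

```python
"""Sketch: flag data center zones whose average temperature exceeds a
threshold. Zone names, readings, and the 27 C threshold are assumptions."""
from statistics import mean

HOT_SPOT_THRESHOLD_C = 27.0  # assumed alert threshold

# Simulated sensor readings: zone -> recent temperature samples (Celsius)
readings = {
    "row-A": [24.1, 24.3, 24.0],
    "row-B": [28.5, 29.0, 28.8],  # running hot
    "row-C": [22.9, 23.1, 23.0],
}


def find_hot_spots(samples: dict) -> dict:
    """Return zones whose average temperature exceeds the threshold."""
    return {
        zone: round(mean(temps), 1)
        for zone, temps in samples.items()
        if mean(temps) > HOT_SPOT_THRESHOLD_C
    }


print(find_hot_spots(readings))
# {'row-B': 28.8}
```

Feeding these flagged zones into a cooling controller, and logging each decision, is the kind of benchmark data DCOI asks CIOs to collect in place of manual audits.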
The Right Tools for Meeting Your Goals
DCOI still aims to reduce the number of federal data centers, but the bigger challenge is optimizing the assets that agencies will keep. Hitachi Data Systems Federal offers a range of technologies with capabilities for cloud, virtualization, IoT, and data tiering.
Additionally, if your agency is looking for ways to reduce costs in your data center environment, HDS Federal can help you reach your goals with expert guidance in data center economics. Let us show you the various cost-saving models that will enhance your environment through virtualization and cloud adoption.
Download the HDS Federal Storage, Content and Infrastructure Solutions guide and explore how our solutions support your strategy for a more efficient, data-driven, DCOI-compliant IT environment. For more information, contact your Hitachi Data Systems Federal representative or email firstname.lastname@example.org.