Add up all the Federal government bodies in the US, and you have a giant, sprawling enterprise. There’s a lot we can learn from what it has been doing with its data centers.

Back in 2009, President Barack Obama’s administration decided that the IT provision of Federal agencies needed coordination, and appointed the United States’ first Federal CIO, Vivek Kundra. Among other things, Kundra turned his attention to data centers, which at the time were just emerging as an issue of concern.

The great thing about Federal efforts to consolidate data centers is they show us a large-scale effort taking place in the public eye. Private sector organizations have had similar issues, but they rarely get the same scrutiny.

Cloud First

Federal bodies were all operating with little coordination or strategy, with the result that new data centers were being opened willy-nilly. In 2010, Kundra issued a “Cloud First” instruction that the government should use cloud computing where possible, to save costs and reverse the data center sprawl.

By June 2011, Kundra had left his government job to become a Harvard professor (and then swiftly on to an executive position at Salesforce.com). But his initiative has continued, under various names, and become an epic quest for efficiency.

Before he left, Kundra followed up Cloud First with the Federal Data Center Consolidation Initiative (FDCCI), which called for the federal government to close 40 percent of its data centers - some 800 facilities - by 2015. The IT resources in those facilities should all be moved to the cloud, or else consolidated into fewer, more efficient data centers.

Things got interesting when that plan encountered reality, and Government advisors realized how hard it is to actually count data centers. The more they looked, the more they found. Pretty soon, the FDCCI had a target of shutting down 1,200 facilities, a move which was reckoned to save the government between $5bn and $8bn per year.

That figure then grew even more. In 2014, preparing for a new streamlining initiative, the Federal Information Technology Acquisition Reform Act (FITARA), the Government Accountability Office (GAO) did a count, and found good news and bad news.

The good news was that under FDCCI, a massive 3,300 data centers had been closed. The bad news was that like a hydra, government IT had spawned new facilities. There were still another 11,700 facilities that needed closing (many of them “created” by new classifications for data centers).

The data center hydra

How do we assess the progress of an initiative, when the environment has changed so rapidly?

Delie Minaie, vice president at Booz Allen, says the goals themselves have also changed: “Ten years later, the paradigm is changing around Washington from 'Cloud First' to 'Cloud Smart,' going beyond accelerating agency adoption of cloud-based solutions to enabling the way the mission gets executed.”

Given that, she tells DCD: “the verdict is still out on what constitutes ‘success’.”

In the intervening years, the definition of data centers had broadened. Given the broad brief to shut data centers and get more efficient, agencies first had to find out what facilities they had. And they found there were a lot of racks and cabinets in closets distributed throughout agencies.

In 2010, a facility counted if it was 500 square feet or more, and met stringent availability requirements. There were only 2,094 data centers that met those criteria. In the ensuing years, the definition grew. It still excluded print servers and network wiring closets, but some said it now included “any room with a server”.
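
To see how much the definition matters, here is a minimal sketch in Python - the Facility fields and thresholds are illustrative, and the availability criteria are omitted - showing how the same inventory yields very different counts under the 2010 rule and the broadened one:

```python
from dataclasses import dataclass

@dataclass
class Facility:
    sq_ft: float            # floor area
    has_servers: bool       # hosts at least one server
    is_wiring_closet: bool  # excluded under both definitions

def counts_in_2010(f: Facility) -> bool:
    # Original FDCCI rule: 500 square feet or more
    # (the stringent availability requirements are omitted here)
    return f.has_servers and f.sq_ft >= 500

def counts_later(f: Facility) -> bool:
    # Broadened rule: "any room with a server", still excluding
    # print servers and network wiring closets
    return f.has_servers and not f.is_wiring_closet

inventory = [
    Facility(12_000, True, False),  # a purpose-built data center
    Facility(80, True, False),      # a rack in a closet
    Facility(40, False, True),      # a network wiring closet
]
print(sum(map(counts_in_2010, inventory)))  # 1
print(sum(map(counts_later, inventory)))    # 2
```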

Look up the stack

In parallel, agencies found that the real savings came by looking up the stack and finding the services which should be consolidated. No agency should be operating two email services, for instance - and before they commission any new service, agencies should see if other bodies have already developed something that meets their needs.

In 2016, the Government added another specific program: the Data Center Optimization Initiative (DCOI), which went into more detail. All the remaining data centers should improve efficiency using measures such as virtualization, power metering, and killing unused “zombie” servers.

By August 2017, under the new definitions, the government said it had 12,062 data centers. Some had been commissioned since the original FDCCI, but many were simply newly classified as data centers.

At this point, the Office of Management and Budget (OMB) suggested splitting facilities into “tiered” data centers (with a UPS, dedicated cooling and a backup generator) and “non-tiered” facilities.

OMB wanted 25 percent of tiered data centers to go by October 2020, along with 60 percent of non-tiered ones.
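
Those percentages translate into simple arithmetic. As a minimal sketch (the facility counts below are hypothetical), here is what the OMB targets implied for a single agency:

```python
import math

def omb_closure_targets(tiered: int, non_tiered: int) -> tuple[int, int]:
    # OMB's October 2020 goals: close 25 percent of tiered
    # data centers and 60 percent of non-tiered facilities
    return math.ceil(tiered * 0.25), math.ceil(non_tiered * 0.60)

# A hypothetical agency with 40 tiered and 500 non-tiered facilities
print(omb_closure_targets(40, 500))  # (10, 300)
```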

Twenty-four agencies fell under FITARA and the DCOI, and in 2019 the GAO checked on progress, finding that 6,250 data centers had been closed by August 2018, saving more than $2.37 billion over the years 2016 to 2018.

This wasn’t enough for the GAO, which found only 13 agencies were on target: "Several agencies indicated that they were seeking revised closure goals because they viewed their goals as unattainable,” said the GAO’s report.

Some of those non-tiered facilities were too essential to go, agencies pleaded.

Better utilization

Remember that DCOI added some efficiency measures? Back in 2009, Kundra’s team had been shocked to find that some server utilization rates were as low as five percent. The OMB demanded this be raised to 65 percent.

The DCOI also wanted all tiered data centers to have 100 percent energy metering and 80 percent utilization - and a power usage effectiveness (PUE) of 1.5 or lower. Any new data centers that were allowed should achieve a PUE of 1.4.
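
PUE is total facility energy divided by the energy that reaches the IT equipment, so a perfect facility would score 1.0. Here is a minimal sketch of checking a facility against those DCOI thresholds - the kilowatt-hour figures are illustrative:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    # Power usage effectiveness: 1.0 would mean every watt
    # drawn by the building reaches the IT load
    return total_facility_kwh / it_equipment_kwh

def meets_dcoi(total_kwh: float, it_kwh: float,
               utilization: float, new_build: bool = False) -> bool:
    # DCOI thresholds: PUE of 1.5 or lower for existing tiered
    # data centers, 1.4 for new builds, and 80 percent utilization
    limit = 1.4 if new_build else 1.5
    return pue(total_kwh, it_kwh) <= limit and utilization >= 0.80

print(pue(1_500_000, 1_000_000))               # 1.5 - right on the limit
print(meets_dcoi(1_500_000, 1_000_000, 0.85))  # True
```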

Progress to these targets has been slow. PUE targets were met by eight of the 24 agencies, but energy metering, server utilization and automatic monitoring were only in place at three agencies each in 2019.

The poster child for success was the National Oceanic and Atmospheric Administration (NOAA), whose cloud-first policy has enabled it to deal with surges in web demand during the storms and weather events that have plagued the US since hurricanes Irma and Harvey in 2017.

Alongside this, analysts and advisers have been urging the authorities to keep an eye on the goals and address the actual activities of the organizations, not just their assets. As Minaie puts it: “To truly realize the benefits, an interdisciplinary approach centered around IT modernization is needed for federal enterprises to provide ROI, enhanced security and resiliency while deliberately reducing the data center footprint.”

Where are we now?

In 2021, the OMB announced the FDCCI had saved $6.24 billion, but again said more needed to be done, particularly against performance metrics. In 2022, there were still 29 data centers slated to close by the end of the fiscal year, but things actually looked better.

All 24 agencies met their cost savings goals for fiscal year 2020 - amounting to $875.10 million saved in 2020, and $335.88 million saved in 2021 up to August.

The trouble is, as the most obviously wasteful facilities go, the DCOI is running out of low-hanging fruit. Future closures will be tougher and have smaller paybacks.

"Closures and savings are expected to slow in the future according to agencies' Data Center Optimization Initiative (DCOI) strategic plans,” said the OMB’s 2022 report. “For example, seven agencies reported that they plan to close 83 data centers in fiscal year 2022 through 2025, and save a total of $46.32 million."

It’s also hard to assess how well optimization is going. If agencies can convince the OMB that data centers have to be the way they are, they get an exemption. It seems that a very large proportion get such exemptions, as around one-third of agencies are reporting that the DCOI targets are “not applicable.”

What can we learn?

At this point, it’s worth looking across at the private sector, where plenty of large organizations have made bold plans to move to the cloud. The results of those strategies don’t always get thoroughly scrutinized, but it’s clear they don’t always go well.

“Government IT professionals operate in an entirely different ecosystem than their private sector counterparts,” points out Minaie, “one that includes complex, high-stakes mission sets and a budget of taxpayer dollars overseen by Congress.”

Despite this, the private sector can learn a lot from government, she adds: “While the sectors are distinctly different, the core tenets of cloud security bear equal importance for the private sector.”

On the one hand, the government has an effective, repeatable approach, she says, because “government-grade standards and processes” and regulatory requirements such as FedRAMP, NIST, HIPAA, and HITECH will “drive peak performance and strengthen security posture.”

On the other hand, she adds: “The private sector has achieved compute and energy efficiency gains that would be difficult and costly for federal agencies to replicate going at it alone.”

One private sector institution that did share its cloud-first experiences was JPMorgan - and its results were as mixed as those of the public sector. Despite a strategic drive to the cloud, the bank spent $2 billion on new data centers in 2021, out of a total tech budget of $12 billion.

The news brought the bank sharp criticism from investors, who had been conditioned to expect savings from the cloud - and a small drop in its share price.

Answering a probing question from Mike Mayo of Wells Fargo Securities on an earnings call, Jamie Dimon was forced to explain that, no matter how well the bank used the cloud, it still had to keep its data centers going - and even open new ones.

We will come back to banks, and meet Dimon again elsewhere in this supplement. But for now, let’s observe that they have a lot in common with the public sector, including high levels of bureaucracy and conservatism.

The difference is that the private sector is often better funded. Government agencies might find it harder to win an increased tech budget, or to shrug off spending $2 billion on activity that diametrically opposes their stated strategic direction.

Change the scorecard

In the public sector, agencies have to report and measure their progress, but it’s clear that metrics aren’t simple. Moving to the cloud is no panacea, and Federal watchdogs want to see better ways to measure progress - tracking actual benefits rather than proxies.

Each year, Federal agencies are graded by a scorecard issued by the House Oversight and Reform Committee, guided by the Government Operations subcommittee. In that committee, there’s a strong movement to get better measures of progress, according to MeriTalk, a site which reports on Federal IT.

The Democratic committee chair, Congressman Gerry Connolly, said: “The scorecard needs to evolve to reflect the changing nature of IT services and to guarantee we are accurately assessing the modernization and IT management practices of federal agencies.”

A Republican colleague on the committee, Representative Jody Hice, asked whether continued consolidation was worth doing: “I think it’s a fair question as to whether indeed we’ve reached a point of diminishing returns. Beyond the current scorecard, I believe it’s time to take a hard look at how FITARA can evolve from this point.”

“The goal here is to incentivize progress,” said Connolly, “not to get a gold star on our foreheads.”

An outside view

Although there have been hiccups along the way, some outside observers are impressed with the progress. “While there may be some edge cases, we have seen a trend in adoption of colocation services where they support cloud, or hybrid colocation use cases and maximize resilience, security and efficiency with an eye towards sustainability,” says Minaie.

“Almost every federal agency has progressed toward climate change goals, looking inward to seek greater efficiency and access to Edge computing which unlocks greater speed, flexibility and productivity,” she says. “Federal agencies that work with cloud and digital ecosystem partners inside a vendor-neutral colocation provider are well positioned to meet their data center modernization and sustainability needs.”

But she’s realistic that this is not a journey to a single end goal: “‘Cloud Smart’ is the North Star where it will be a business imperative to redefine how the mission gets executed promoting service management, innovation and adoption of emerging technologies.

“As we see more cloud adoption across the federal landscape, federal agencies must focus on upskilling their workforce, enhancing security postures, and sharing knowledge in best practice cloud acquisition approaches.”

She thinks the pandemic drove cloud but won’t produce a complete shift: “While the Covid-19 pandemic was a forcing function for cloud adoption, I think we will live in this hybrid on-prem/cloud world for quite some time.

“The good news is that we are making progress and federal agencies are actively getting out of the hardware management business and paving the way to a new generation of federal IT that’s far more agile and resilient than in decades past.”
