How Google Data Centers Are Protected

Google’s data centers are the backbone of its services. They power Search, Maps, Gmail and more. They are also the subject of intense scrutiny from privacy advocates and labor rights groups.

The company has 35 owned data centers in operation or development across the globe. Google carefully chooses the locations for its facilities based on energy infrastructure, developable land, tax rebates and utility rates.

Location

Google has many data centers scattered across the globe, and its network is constantly expanding. The company is deploying new cloud regions, each made up of multiple zones, to support its growing suite of cloud services. Regions are spread across countries and continents to provide low latency and redundancy.

The location of a data center is crucial to its success, and Google carefully selects each site. It considers several factors, including proximity to fiber network connectivity, availability of clean water, and the ability to use natural climate conditions. Google also seeks locations that are safe and secure, with good infrastructure and a workforce that can support the company’s commitment to privacy.

Google’s data centers are huge and require massive amounts of power to run. The company tries to limit its electricity costs by building its own power plants or buying directly from local utilities, which allows it to avoid expensive retail rates.

In addition to power, the data centers require cooling systems to keep servers at a safe operating temperature. This is why they’re often built in areas with a temperate climate. For example, the facility in Dublin, Ireland uses a system that cools the data center with outdoor air rather than mechanical air conditioning.

Other factors that Google considers when selecting a data center location include its access to local fiber network connectivity and the cost of electricity. The company prefers to locate its data centers near power sources that are environmentally friendly and reliable.

Another important consideration is the availability of a skilled workforce. The data centers need technicians and engineers who can work around the clock to maintain the computers that house the information. These workers must be able to troubleshoot and repair problems. In some cases, they may need to take a machine apart to see what is wrong with it.

As a result, Google has invested heavily in the training of local IT professionals. It has also worked with the local government to create jobs and support economic development. In fact, the city of Midlothian has received more than $4 million in grants from Google since it began construction of its data center.

Power

Google data centers use multiple redundant power systems to ensure continuous service and limit the risk of equipment failure. These systems include cooling systems to maintain a stable operating temperature for servers and other hardware. They also have diesel backup generators that can provide emergency electrical power in case of a disruption.

The company also uses advanced software to optimize energy usage in its data centers. The software is designed to predict demand from the grid and adjust operations accordingly. This reduces the need to build new capacity or rely on expensive peaker plants. It also helps Google avoid outages and save money by shifting operations to times when electricity costs are lowest.

To further reduce its carbon footprint, Google has begun incorporating renewable energy sources into its data centers’ supply chain, investing in solar and wind projects that generate clean energy for its facilities. These include the Rødby Fjord solar project in Denmark, which provides carbon-free energy to the data center in Fredericia, as well as projects supplying the data centers in Quilicura, Chile, and Hamina, Finland.

Each data center has its own network, but they share many of the same components: rows of 19-inch racks filled with servers, each plugged into a 1 Gbit/s Ethernet link to a top-of-rack (TOR) switch. TOR switches connect to the cluster switches that form the data center interconnect fabric, and the cluster switches themselves are interconnected using a variety of links, including multiple one-gigabit and ten-gigabit uplinks.
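The rack/TOR/cluster hierarchy described above can be sketched as a small model. All counts and names here are illustrative assumptions, not Google's actual fabric design:

```python
# Toy model of a data center network: servers plug into top-of-rack
# (TOR) switches, which connect up to cluster switches.
# Counts and naming are invented for illustration.

def build_fabric(num_clusters=2, racks_per_cluster=4, servers_per_rack=40):
    """Return a nested dict: cluster switch -> TOR switch -> server list."""
    fabric = {}
    for c in range(num_clusters):
        cluster = f"cluster-{c}"
        fabric[cluster] = {}
        for r in range(racks_per_cluster):
            tor = f"{cluster}/tor-{r}"  # top-of-rack switch for this rack
            fabric[cluster][tor] = [
                f"{tor}/server-{s}" for s in range(servers_per_rack)
            ]
    return fabric

fabric = build_fabric()
total_servers = sum(len(servers)
                    for tors in fabric.values()
                    for servers in tors.values())
print(total_servers)  # 2 clusters * 4 racks * 40 servers = 320
```

The nesting mirrors the physical topology: losing one TOR switch isolates only its rack, while the cluster layer provides redundant paths between racks.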

Managing the power for this network is complex. The computers consume electricity constantly, and the data centers have to balance that demand with the needs of the local utility grid. This is particularly challenging when sourcing green power, because renewable resources are intermittent: solar energy is only available during daylight hours, and wind energy only when the wind blows. To address this, Google has developed a system that automatically reduces its data centers’ electricity consumption when local power grids are under high stress. This system, called Demand Response, shifts non-urgent computing work to other times and locations without impacting the most commonly used Google services.
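The load-shifting idea can be sketched as a simple scheduler. The policy below (defer any non-urgent job out of a grid-stress window to the next calm hour) is a simplification assumed for illustration, not Google's actual Demand Response implementation:

```python
# Toy demand-response scheduler: urgent (user-facing) jobs always run
# as requested; deferrable batch work is pushed out of hours flagged
# as high grid stress. A simplified sketch, not Google's system.

def schedule(jobs, stress_hours, horizon=24):
    """jobs: list of (name, preferred_hour, urgent). Returns name -> hour."""
    placement = {}
    for name, hour, urgent in jobs:
        if urgent or hour not in stress_hours:
            placement[name] = hour  # run at the requested time
        else:
            h = hour
            while h in stress_hours:  # slide to the next low-stress hour
                h = (h + 1) % horizon
            placement[name] = h
    return placement

jobs = [("serve-search", 18, True),   # user-facing: never deferred
        ("batch-index", 18, False),   # deferrable batch work
        ("ml-train", 3, False)]       # already outside the stress window
print(schedule(jobs, stress_hours={17, 18, 19}))
# {'serve-search': 18, 'batch-index': 20, 'ml-train': 3}
```

Note that the user-facing job keeps its slot even during grid stress, matching the article's point that the most commonly used services are not impacted.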

Cooling

Google’s massive data centers use a lot of water. Last year, the company consumed 15.8 billion liters of water for cooling purposes. That’s enough water to irrigate 29 golf courses. But the company is taking steps to reduce its water consumption. For example, it is using seawater to cool some of its data centers, and it’s pushing the humidity level in server rooms down to 13 percent. This will save the company millions of gallons of water per year.

Cooling is one of the largest energy-consuming components in a data center, but it is also one of the most difficult to control. That’s why Google has turned to artificial intelligence to help reduce energy consumption and water use. For years, the company has been testing an AI system that learns how to optimize data center fans and ventilation systems in order to cut energy use and cooling costs. The system’s recommendations, implemented by data center operators, resulted in cooling-energy savings of about 40 percent.

Now, the company is putting the AI system in charge of managing cooling at several of its data centers. The system pulls in information from thousands of sensors every 30 seconds. It analyzes the data and uses deep neural networks to predict how different changes will affect energy usage. Potential actions are checked against an internal list of safety constraints before they are implemented, and local data center operators can still take over if needed.
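That loop (ingest sensor readings, predict the effect of candidate actions, and reject anything outside the safety envelope before acting) can be sketched as follows. The thresholds and the stand-in "predictor" are invented for illustration; Google's system uses deep neural networks trained on real telemetry:

```python
# Sketch of a cooling control loop with a safety filter: score
# candidate fan settings with a predictor, discard unsafe ones,
# and pick the cheapest remaining action. All numbers are assumed.

SAFE_FAN_RANGE = (0.2, 1.0)  # assumed allowed fan-speed fraction
MAX_TEMP_C = 30.0            # assumed server inlet temperature limit

def predicted_temp(current_temp, fan_speed):
    # Stand-in for the neural-network predictor: faster fans -> cooler.
    return current_temp - 8.0 * fan_speed

def choose_action(current_temp, candidates):
    """Pick the lowest (cheapest) fan speed whose prediction stays safe."""
    safe = [f for f in candidates
            if SAFE_FAN_RANGE[0] <= f <= SAFE_FAN_RANGE[1]
            and predicted_temp(current_temp, f) <= MAX_TEMP_C]
    # Fail safe: if nothing qualifies, run cooling at maximum.
    return min(safe) if safe else SAFE_FAN_RANGE[1]

print(choose_action(33.0, [0.1, 0.4, 0.7, 1.0]))  # 0.4
```

The fail-safe branch reflects the article's point that actions are checked against safety constraints and that operators can always override the system.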

The AI system predicts temperatures an hour in advance, which allows data center managers to make adjustments ahead of time. This has led to an overall PUE reduction of about 15 percent and a water use reduction of about 10 percent. Because the system can anticipate when conditions will be hotter or colder, it adjusts cooling to match predicted conditions.
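PUE (power usage effectiveness) is the ratio of total facility energy to the energy delivered to IT equipment, so a value of 1.0 would mean zero cooling and distribution overhead. A quick worked example with invented numbers (not Google's reported figures) shows what a 15 percent reduction looks like:

```python
# PUE = total facility energy / IT equipment energy.
# The kWh figures below are illustrative, not Google's actual numbers.

def pue(total_facility_kwh, it_equipment_kwh):
    return total_facility_kwh / it_equipment_kwh

before = pue(1500.0, 1000.0)  # 1.5: 0.5 kWh of overhead per IT kWh
after = pue(1275.0, 1000.0)   # same IT load, cooling overhead trimmed
print(before, after)                           # 1.5 1.275
print(round((before - after) / before * 100))  # 15 percent reduction
```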

While the AI system has a number of advantages, it’s not infallible. Data center operators must constantly monitor and tweak the system to keep it working correctly. That’s why Google is also experimenting with other technologies to improve energy efficiency, including an on-site cooling plant at its St. Ghislain, Belgium, data center that uses recycled water from an industrial canal rather than tapping into the region’s freshwater supply.

Security

Google’s data centers are protected by multiple layers of security. On-site security operations monitor closed circuit TV (CCTV) cameras and alarm systems around the clock, and on-site staff patrol the facilities regularly. Access to Google’s data centers is tightly controlled and monitored, with electronic card key access and mandatory biometric identification required for entry. Data center employees and contractors must also submit to background checks before gaining access to sensitive areas.

Inside the data centers, everything is layered in security, from the facility itself to individual servers and networking equipment. For example, Google’s servers don’t include unnecessary components such as video cards and chipsets that could introduce vulnerabilities. They are designed for the sole purpose of running Google services, and they run a stripped-down and hardened version of Linux to ensure that only essential functions are enabled. This approach helps to reduce the risk of data breaches and other security incidents.

The company also enforces a strict “least privilege” protocol, which limits the number of permissions that are given to any one person inside a facility. This is a critical component of defense in depth, as it prevents any single point of failure. For example, people’s access permissions are checked at badge readers in every room and at each door into a data center facility. And all hardware is tagged and tracked, from the time it arrives at the data center to the moment it leaves, so that any stolen or lost equipment can be quickly located.
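A least-privilege door check of the kind described above reduces to a per-room allowlist consulted at every badge reader, with every attempt logged. The room names and permission model here are assumptions for illustration, not Google's access-control system:

```python
# Toy least-privilege access check: each badge carries only the room
# permissions it needs, verified at every door, and every attempt is
# logged. Badge IDs and rooms are invented for illustration.

ACCESS = {
    "badge-1001": {"lobby", "network-room"},
    "badge-2002": {"lobby"},  # no server-floor permission
}

def door_check(badge_id, room):
    allowed = room in ACCESS.get(badge_id, set())
    # Log every attempt, allowed or denied, for the audit trail.
    print(f"{badge_id} -> {room}: {'ALLOW' if allowed else 'DENY'}")
    return allowed

door_check("badge-1001", "network-room")  # ALLOW
door_check("badge-2002", "server-floor")  # DENY
```

Because permissions are granted per room rather than per facility, compromising one badge never yields facility-wide access, which is the defense-in-depth property the article describes.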

Another layer of protection is the data itself, which is replicated across multiple systems to guard against accidental destruction or loss. Google also encrypts traffic in transit, authenticating connections with RSA and ECDSA keys and negotiating session keys through ephemeral key exchange to provide forward secrecy. This matters because even if a long-term private key were ever compromised, previously recorded traffic could not be decrypted with it.
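A toy ephemeral Diffie-Hellman exchange illustrates the forward-secrecy idea: each session derives a fresh shared secret, and the ephemeral private keys are discarded afterward, so compromising a key later reveals nothing about recorded sessions. The parameters below are tiny and purely illustrative; this is not a production-grade group and not Google's actual TLS stack:

```python
# Toy ephemeral (finite-field) Diffie-Hellman. NOT cryptographically
# safe: the modulus is far too small for real use.
import secrets

P = 0xFFFFFFFFFFFFFFC5  # 2**64 - 59, a prime (illustrative size only)
G = 5                   # generator

def ephemeral_session():
    a = secrets.randbelow(P - 2) + 1  # client ephemeral private key
    b = secrets.randbelow(P - 2) + 1  # server ephemeral private key
    shared_client = pow(pow(G, b, P), a, P)  # client computes (g^b)^a
    shared_server = pow(pow(G, a, P), b, P)  # server computes (g^a)^b
    assert shared_client == shared_server
    return shared_client  # a and b go out of scope: keys are "deleted"

# Two sessions produce independent secrets; compromising one session's
# secret (or a long-term signing key) says nothing about the other.
s1, s2 = ephemeral_session(), ephemeral_session()
print(s1 != s2)  # almost certainly True
```

In real TLS the long-term RSA/ECDSA keys only *sign* the handshake; the session keys come from an exchange like this one, which is what makes recorded traffic safe even after a key compromise.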

Google requires all personnel to conduct themselves in a manner consistent with the company’s security policy, including a requirement that they receive security training. In addition, all personnel must sign a confidentiality agreement and must acknowledge receipt of and compliance with the privacy and information security policies.