What should we know about protecting data centers from global heat waves?

Climate change is throwing a wrench into an unexpected place: the workings of data centers. As the frequency of heat waves grows worldwide, these critical pieces of technology infrastructure are melting down more and more often, threatening a foundational element of the internet.

In July, Google's and Oracle's London-based data centers were forced offline when Britain experienced record-high temperatures of over 40 degrees Celsius. The heat wave rendered their data center cooling systems useless and caused website outages for many customers. The Google outage, for instance, affected WordPress-hosted websites across Europe.

These cooling systems are designed to regulate the heat emitted by servers and other data center equipment. But when that internal heat meets the high external temperatures of a heat wave, the cooling systems become overwhelmed and cannot do their job. As a result, vital data center equipment overheats and goes down.

With heat waves now creeping into the autumn months, technology companies face prolonged disruption to their IT operations. In September, Twitter found itself in a "non-redundant state" when intense heat caused an outage at its Sacramento data center, according to a company memo by former Twitter vice president of engineering Carrie Fernandez. She called the incident "unprecedented" and said the heat wave caused "the total shutdown of physical equipment".

Despite heat waves becoming a common occurrence globally, companies such as Twitter are grossly underprepared for the havoc that intense heat can wreak on the technology industry. Peiter "Mudge" Zatko, former head of security at Twitter, revealed in a whistleblower disclosure in August that Twitter is put at risk by "insufficient data center redundancy". He warned that "a temporary but overlapping outage of a small number of data centers" could knock Twitter "offline for weeks, months, or permanently". Now that Elon Musk has acquired Twitter and laid off large swathes of staff, the company is likely even less prepared for heat waves than before.

1. Heat waves: A growing IT headache

Heat waves can cause major harm to businesses that rely heavily on IT services or that offer digital products. Steve Wright, chief operating officer at 4D Data Centers, warns that environmental conditions like intense heat can "damage IT equipment and cause power outages due to overloaded power grids".

Companies that fail to properly maintain their data centers in the face of soaring heat can experience "server failure, hard drive crashes, and data loss", according to Wright. "Any lapse in power can be devastating for a customer, with critical data files getting corrupted or lost, mainframes malfunctioning, and money being lost when systems overheat," he says.

There are, however, solutions to this growing IT headache. Wright explains that data center operators can begin by installing backup generators, which ensure the power supply continues during an outage. He adds that they can also extend the lifespan of data center servers and hard drives by tracking temperature and humidity.
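
The monitoring Wright describes can be as simple as polling environmental sensors and alerting when readings drift outside an accepted band. Below is a minimal sketch; `read_sensor()` is a hypothetical placeholder for a real sensor or DCIM API, and the humidity band is purely illustrative (the 18-27C temperature band echoes the industry recommendation cited later in this article).

```python
# Minimal environmental-monitoring sketch for a data hall.
# read_sensor() is a placeholder for a real sensor or DCIM API call;
# thresholds below are illustrative, not prescriptive.

import time

TEMP_RANGE_C = (18.0, 27.0)        # commonly cited supply-air temperature band
HUMIDITY_RANGE_PCT = (20.0, 80.0)  # illustrative relative-humidity band


def read_sensor():
    """Placeholder: return (temperature_c, relative_humidity_pct)."""
    return 24.5, 45.0


def check_environment():
    temp_c, rh_pct = read_sensor()
    alerts = []
    if not TEMP_RANGE_C[0] <= temp_c <= TEMP_RANGE_C[1]:
        alerts.append(f"temperature out of range: {temp_c:.1f} C")
    if not HUMIDITY_RANGE_PCT[0] <= rh_pct <= HUMIDITY_RANGE_PCT[1]:
        alerts.append(f"humidity out of range: {rh_pct:.1f} %")
    return alerts


if __name__ == "__main__":
    while True:
        for alert in check_environment():
            print("ALERT:", alert)  # in practice: page on-call, log to DCIM
        time.sleep(60)              # poll once a minute
```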

Wright points to Microsoft's successful tests of underwater data centers as a means of countering overheating. The tech giant called the concept "reliable and practical", with a failure rate considerably lower than that of on-land data centers.

"For data centers, it is necessary to use high-energy cooling systems to combat rising temperatures, especially as a 2021 survey of US-based data centers revealed that 45% of data center owners and operators responsible for managing infrastructure at the world's largest IT organizations said extreme weather had threatened their continuous operations," he explains.

2. Data centers' cooling systems are a weak point for the whole system

Cooling systems are designed to prevent data center equipment from overheating, but the reality is that they are simply not equipped to deal with record-breaking heat waves. Intense heat places strain on compressors, pumps, fans, and other cooling equipment, according to Daniel Bizo, research director at the data center think tank Uptime Institute Intelligence.

"Without going into technical depth, compressors, of which there are many types, are at the heart of mechanical refrigeration systems such as air conditioners and water chiller systems. They use electrical power to compress a gaseous coolant, which later in the cycle expands (in cooling coils exposed to ambient air or water) and cools down dramatically to create a cooling effect," he explains.

"A pump in this example is a water pump that circulates facility water (in a chilled water system) around the data center as a coolant (cooled by the compressors in the water chillers) to remove heat from computer room air handlers, in-row cooling units, and other heat exchange units. The harder they work, the greater the likelihood of failure."

Unfortunately, cooling equipment isn't the only vital component of a data center that is vulnerable to extreme heat. Bizo says backup generators and external power equipment can also be affected by heat waves, which "can reduce their ability to support the full capacity of the data center, if called upon, should the grid experience heat-induced issues".

Although prominent tech companies like Google, Oracle, and Twitter saw significant disruption to their services due to intense heat in 2022, there is some hope for the industry. Bizo explains that, aside from a few notable exceptions, the vast majority of data centers survived the summer's "extreme temperatures without significant problems". He attributes this to "appropriate power and cooling redundancy and good equipment maintenance hygiene".

"Additionally, most data centers typically run at only moderate utilization levels. Operators can take advantage of spare cooling capacity to combat extreme heat," he says. "In contrast, cloud providers are more inclined to push their infrastructure closer to its limits and have less margin for error during extreme weather events."

3. How to protect data centers from heat waves

As heat waves become more common and more catastrophic for global technology infrastructure, data center operators must shore up their defenses against this very real threat. Fortunately, Bizo is confident that there are plenty of solutions for mitigating the fallout of extreme heat.

For starters, data center operators can invest in evaporative and adiabatic cooling systems. Alternatively, they can supplement existing air conditioning and chiller units with sprinkler systems.

"Tolerating a few degrees higher temperature in the data hall helps lessen the stress on cooling systems; if your system uses ambient air cooling only (just cooled by outside air), an operator may want to consider upgrading it with an evaporation effect," he says.

"An example could be to mist air around the cooling coils of the air conditioner or chiller. New builds and major refurbishments can opt for cooling systems that by design use evaporation (or the adiabatic effect, another physical phenomenon that relies on water being absorbed into air) to cool ambient air down, as long as it is not too humid, for a cooling effect."
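
The humidity caveat matters because evaporation can only cool air down toward its wet-bulb temperature. The short sketch below illustrates the principle using Stull's (2011) empirical wet-bulb approximation; it is an illustration only, not a psychrometric design calculation.

```python
# Sketch: why evaporative cooling works "as long as it is not too humid".
# Evaporation can cool air only toward its wet-bulb temperature. Uses Stull's
# (2011) empirical approximation, valid for ordinary ambient conditions and
# roughly RH > 5%; illustrative only.

import math


def wet_bulb_c(temp_c: float, rh_pct: float) -> float:
    """Approximate wet-bulb temperature (deg C) from dry-bulb temp and RH (%)."""
    return (temp_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(temp_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)


# 40 C air at 20% RH: plenty of evaporative cooling potential.
print(round(wet_bulb_c(40.0, 20.0), 1))   # roughly 22-23 C
# 40 C air at 80% RH: very little headroom left for evaporative cooling.
print(round(wet_bulb_c(40.0, 80.0), 1))   # roughly 36-37 C
```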

But he says "a more strategic, long-term response" to the climate crisis would be to adopt liquid-cooled IT systems. "Liquid-cooled IT helps too, because it allows higher temperatures across the cooling 'chain'. This is because unlike air, which needs to be supplied in the range of 18-27C per industry recommendations, liquid (water or an engineered fluid) delivered directly to the servers can be 30+ C, even 40+ C in some cases, depending on the implementation," he says. "That means that, say, a chilled water system can be designed to deliver data center cooling water at 32C. This requires massively less energy than cooling to under 15C (as an example), which is typical in many implementations."
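
A rough way to see why warmer supply water saves so much energy is to compare the idealized (Carnot) coefficient of performance at the two supply temperatures. The sketch below does this; the 40C ambient, the 50%-of-Carnot efficiency factor, and the 1 MW load are assumptions for illustration, and real chillers behave far less ideally.

```python
# Rough illustration of why warmer chilled-water supply saves compressor energy.
# Uses the idealized Carnot COP for refrigeration, scaled by an assumed
# real-world efficiency factor; all numbers are illustrative only.

def chiller_power_kw(heat_load_kw: float, supply_c: float, ambient_c: float,
                     carnot_fraction: float = 0.5) -> float:
    """Estimate compressor power needed to reject heat_load_kw at a supply temp."""
    t_cold = supply_c + 273.15   # evaporator side, kelvin
    t_hot = ambient_c + 273.15   # heat-rejection side, kelvin
    carnot_cop = t_cold / (t_hot - t_cold)
    real_cop = carnot_fraction * carnot_cop
    return heat_load_kw / real_cop


load = 1000.0   # assume 1 MW of IT heat to remove
ambient = 40.0  # assumed heat-wave ambient, deg C

print(round(chiller_power_kw(load, 32.0, ambient), 1))  # warm-water design, ~52 kW
print(round(chiller_power_kw(load, 15.0, ambient), 1))  # conventional chilled water, ~174 kW
```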

With data center operators already experiencing the harmful effects of heat waves, Uptime's stance is that they should conduct regular assessments to identify climate-related vulnerabilities and implement solutions before time runs out.

Bizo adds: "As extreme weather events and other consequences of climate change become more severe and widespread, addressing climatic resilience is a modern business imperative."

4. How data centers can beat the heat

Even though the technology industry is extremely vulnerable to heat waves, there is no shortage of technological solutions to the problem.

CyrusOne, a global data center provider, has responded to extreme heat with closed-loop chilled water systems and air-cooled chillers. Kyle Myers, vice president of environmental health, safety, and sustainability at CyrusOne, describes them as an "energy-efficient means of providing cool water to our equipment".

The system comprises a loop containing under 8,000 gallons of water and only needs to be filled once, whereas other operators typically consume tens of millions of gallons of water annually to cool each of their data centers. This one-time fill of water is cooled by an integrated compressor and condenser; once cold, it lowers temperatures inside the data center.

He tells Gizmodo: "This process cools the IT gear in different temperature regions around the nation. Our air-cooled chillers come with economizers that allow us to leverage colder temperatures to reject heat from our chilled water more efficiently, while eliminating the need for makeup water sources to maintain operation."

While limiting the effects of extreme heat on vital data center equipment, this technology is also good for the environment. It doesn't require a constant water source, and because there is no need for a sewage pipe, pollutants aren't released from the data center.

Myers says: "For facilities that are dependent on water for cooling, they can burn through a tremendous amount of water during these periods to keep data centers cool. Fortunately, our modern build standard uses water-free cooling, so while our total electrical load can increase during this time, we're not depleting water resources in the drought-stricken Phoenix region."

But even without investing in elaborate systems such as chilled water loops, organizations can mitigate rising heat by making smarter decisions. For instance, Cirrus Nexus CIO Kelly Fleming recommends that organizations looking to move workloads to the cloud choose data center regions that run on renewable energy.

He also recommends: "Servers that don't need to run 24/7 can be spun up and down when the energy consumed in their data center region is at its cleanest, which can vary significantly depending on the energy sources powering it."
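
Fleming's suggestion amounts to carbon-aware scheduling: run deferrable workloads when the local grid mix is cleanest. A minimal sketch follows; `get_carbon_intensity()` stands in for a real grid-carbon data feed (e.g. a grid operator or carbon-intensity service), the region name is hypothetical, and the 200 gCO2/kWh threshold is purely illustrative.

```python
# Minimal sketch of carbon-aware scheduling for deferrable workloads.
# get_carbon_intensity() is a placeholder for a real grid-carbon data feed;
# the threshold and region name are illustrative assumptions.

import time

CARBON_THRESHOLD_G_PER_KWH = 200.0


def get_carbon_intensity(region: str) -> float:
    """Placeholder: return the current grid carbon intensity (gCO2/kWh) for the region."""
    return 180.0


def run_batch_job():
    print("spinning servers up and running the deferred workload")


def maybe_run(region: str = "example-region") -> bool:
    intensity = get_carbon_intensity(region)
    if intensity <= CARBON_THRESHOLD_G_PER_KWH:
        run_batch_job()
        return True
    print(f"grid too carbon-intensive right now ({intensity:.0f} gCO2/kWh), deferring")
    return False


if __name__ == "__main__":
    while not maybe_run():
        time.sleep(30 * 60)  # re-check every half hour
```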

The last few months have proved just how devastating heat waves can be for the global technology industry. And with rising heat showing no sign of slowing down, technology companies are clearly at a crossroads. If tech firms fail to monitor and mitigate extreme heat, worse outages seem all but certain.

This article is largely adapted from an original piece by Nicholas Fearn.
