There was a surprising sound on the Today programme: a Conservative transport secretary explaining a principle of good engineering design, namely that you need to be able to specify the constraints under which a design is expected to function.
Grant Shapps explained that the railway system he is currently in charge of was designed to operate at temperatures between -20C and 35C, and pointed out that the actual temperature of the rails might be nearly twice that upper limit when the air temperature hits 40C. Some lines had been closed that day because their steel rails could buckle in the heat.
The railway industry wasn't the only one that couldn't take the heat. On Tuesday, when the temperature reached 40.3C, datacentres operated by Google and Oracle had to be taken partly offline. Some resources, services and virtual machines became unavailable, taking unlucky websites down with them, as selected machines were powered off to avoid long-term damage. One explanation offered was that a subset of cooling infrastructure within the UK South datacentre had experienced an issue due to unseasonal temperatures, so a subset of the service infrastructure had to be powered down, a step intended to limit the impact on customers.
Anyone who has been lucky enough to visit one of these centres will not be surprised by this, though invitations are rare because the tech companies are touchy about visitors. Datacentres are huge, windowless metal sheds built in remote locations and surrounded by military-grade perimeter fencing, and inside each one are many thousands of stripped-down PCs.
They come in different sizes. The average is about 100,000 sq ft; the biggest, in China, covers more than 6m sq ft. The number of servers in any given centre varies depending on the tasks they perform, but it has been estimated that a 1m sq ft centre can hold up to 600,000 machines.
Everything about a datacentre depends on electrical power: power to run the servers (imagine hundreds of thousands of PCs running continuously and you get the idea) and, because they run hot, power to air-condition their oppressive interiors. That is why tech companies try to put them in places where the electricity supply is cheap and stable, and where cooling is easier.
Datacentres, or server farms as they were once called, are the cathedrals of our networked world: just as important in the online world as cathedrals are in the real one. Without them, our phones would be mere paperweights. If you send a text message to a friend, consult a weather app or post a photograph on social media, you are interacting with a datacentre.
Which is why the events of last week were so intriguing: the railways are not the only part of our society vulnerable to global heating. The question running through my mind was: what were the temperature design parameters for the two centres that had to go offline on Tuesday? Did their engineers assume the sheds would never have to deal with more than 35C? If so, they could be in trouble.
It is easy to understand why railways matter to a society: they are solid, visible, physical things. The technology behind our phones, by contrast, looks ethereal. Datacentres are a reminder that it isn't.
An interesting 3 Quarks Daily essay examines the challenges of teaching an introduction to philosophy course to young people raised on social media, and asks whether it is time to toss the canon in a cannon.
Monkeypox: What You Actually Need to Know, by Donald McNeil, is an informative piece.
Is the World Really Falling Apart, or Does It Just Feel That Way? is an essay by Max Fisher in the New York Times.