I was looking into the new, probably AI, data center being built in town and noticed it’s built by a private-equity-backed firm. The data center’s water request was rejected by the city, so it has to operate with a standard corporate building water supply. They said they are switching to air cooling only and reducing compute capacity to keep power usage the same. This caused Amazon, the alleged operator, to back out. So they are building a giant, reduced-capacity data center with no operator, and apparently still think that’s a good idea. My understanding of the private equity bubble is that the firms can hide “underperforming” assets because it’s all private. From what I read, possibly $3.2 trillion worth. I feel like this new data center is headed for the “underperforming” pile.


Absolutely, yes. I didn’t want to lengthen my comment further, but one odd benefit of the Dot Com bubble collapsing was all of the dark fibre optic cable left laid in the ground. It would later be lit up to provide additional bandwidth or private circuits, and some even became fibre to the home, since some municipalities ended up owning the fibre network.
In a strange twist, the company that produced a lot of this fibre optic cable and nearly went bankrupt when the bubble popped – Corning – would later become instrumental in another boom, because their glass expertise meant they knew how to produce durable smartphone screens. They are the maker of Gorilla Glass.
Rack space is literally the only valuable thing that would be left. Those GPUs are useless for non-LLM computation, given how the chips are optimized and the massive amounts of soldered RAM. They are purpose-made, and they were also manufactured very cheaply, without common longevity and endurance design features. They will degrade and start failing within five years or so; most would be inoperable within a decade. Those data centers are massive piles of e-waste, an absolute misuse of sand.
Racks/cabinets, fiber optic cables, PDUs, CAT6 (OOBM network), top-of-rack switches, aggregation switches, core switches, core routers, external multi-homed ISP/transit connectivity, megawatt three-phase power feeds from the electric utility, internal power distribution and step-down transformers, physical security and alarm systems, badge access, high-strength raised floor, plenum spaces for hot/cold aisles, massive chiller units.
Yes, that’s rack space. It is not even half the cost of a data center. I know because I’ve worked in data centers and read the financial breakdowns of those materials. It is also useless without actual servers, and the servers depreciate in value really fast.
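To put a rough number on how fast that depreciation bites, here is a minimal sketch using a double-declining-balance schedule. The $10,000 server cost and 5-year useful life are assumptions for illustration, not figures from the thread:

```python
# Hedged illustration of accelerated server depreciation.
# Assumptions (not from the thread): $10,000 cost, 5-year life,
# double-declining-balance method.

def double_declining_balance(cost, life_years):
    """Return year-end book values under double-declining-balance."""
    rate = 2 / life_years          # twice the straight-line rate
    values = [cost]
    for _ in range(life_years):
        values.append(round(values[-1] * (1 - rate), 2))
    return values

print(double_declining_balance(10_000, 5))
# [10000, 6000.0, 3600.0, 2160.0, 1296.0, 777.6]
```

Under this schedule the hardware loses 40% of its remaining book value every year, down to under 8% of its cost by year five, which is roughly the timescale on which the GPUs above are claimed to start failing anyway.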