Yeah, that language is pure corporate BS - data centers CONSUME energy at massive scales (up to 1 gigawatt in this case, which is insane); they're literally just giant heaters that occasionally produce AI outputs as a byproduct of all that wasted electricity.
1 gigawatt is not that insane, and I doubt it's all that the datacenter consumes. A rack can easily get into double or even triple digits of kW for GPU-heavy setups - call it 100 kW per rack, so 10 racks per megawatt. I'm sure such a datacenter has more than 10,000 racks, which alone gets you to 1 GW, before A/C and all the other "ancillary" uses. A datacenter like this can get close to 1 GW; this thing might be double digits of GW, but I doubt they will publish exact numbers.
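For what it's worth, the rack math above is easy to sanity-check. A quick sketch, where the rack power, rack count, and PUE are all guesses from this thread rather than published figures:

```python
# Back-of-envelope estimate using the thread's assumptions (not real numbers).
kw_per_rack = 100          # a GPU-heavy rack in the "triple digits of kW" range
racks = 10_000             # guessed rack count for a large campus
pue = 1.3                  # assumed power usage effectiveness: cooling, losses, etc.

it_load_gw = kw_per_rack * racks / 1_000_000   # kW -> GW
total_gw = it_load_gw * pue                    # IT load plus "ancillary" overhead

print(f"IT load: {it_load_gw:.1f} GW, with overhead: {total_gw:.2f} GW")
```

With those guesses you land right at 1 GW of IT load, and the ancillary overhead pushes the site total well past it.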
Every square kilometer of land (0.38 sq miles in freedom units) gets about a GW of heat from the sun at peak (depending on latitude). I doubt one datacenter will contribute that much…
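That solar figure checks out with the standard rough number of ~1 kW/m² peak surface irradiance (lower at high latitudes, and much lower averaged over the day):

```python
# Rough check of the solar comparison: peak irradiance times area.
irradiance_w_per_m2 = 1000      # ~1 kW/m^2 peak at the surface (rule of thumb)
area_m2 = 1_000_000             # one square kilometer

solar_gw = irradiance_w_per_m2 * area_m2 / 1e9   # W -> GW
print(solar_gw)
```

So a 1 GW datacenter dumps roughly what one extra km² of sunlit ground already absorbs at noon.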
I guess they could say they are generating 1 GW of computing power
More like a GW of heat… Thankfully I’m sure that will counteract whatever has caused it to be over 80 degrees on my way to work before 0700.