But given the cost of the hardware, it doesn't make sense to idle it overnight. Owners will want to keep utilization as close to 100% as possible.
Datacenters upgrade their computers when the cost of electricity to run the old hardware outweighs the price of buying more efficient devices. Ergo, the only thing separating used datacenter hardware from economic reuse is the price of electricity. That is what you would build the solar energy dumps out of.
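The break-even logic above can be sketched with some arithmetic. All the numbers here are illustrative assumptions, not real datacenter figures: the point is just that the payback period scales inversely with the electricity price, which is why cheap power changes the calculus for old hardware.

```python
# Hypothetical sketch: when does replacing old datacenter hardware pay off?
# Every number below is an illustrative assumption, not a real figure.

def breakeven_years(new_hw_cost, old_power_kw, new_power_kw, elec_price_per_kwh):
    """Years until electricity savings repay the new hardware's purchase price."""
    hours_per_year = 24 * 365
    savings_per_year = (old_power_kw - new_power_kw) * hours_per_year * elec_price_per_kwh
    return new_hw_cost / savings_per_year

# Assume an old server drawing 1.0 kW is replaced by one drawing 0.4 kW
# at equal throughput, for $10,000, at $0.15/kWh:
years = breakeven_years(new_hw_cost=10_000, old_power_kw=1.0,
                        new_power_kw=0.4, elec_price_per_kwh=0.15)
print(f"{years:.1f} years")  # -> 12.7 years

# At near-free electricity the savings vanish, so the old hardware
# stays economical to run almost indefinitely:
print(f"{breakeven_years(10_000, 1.0, 0.4, 0.01):.0f} years")
```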
Since computers are basically electrical resistors, you might as well heat houses with them: put the data centers somewhere cold, next to a nuclear power plant, water-cool them, and heat a few neighbourhoods.
Nuclear plants only see about a 15 °C temperature rise in their cooling water in winter, so the waste heat is not very useful. Swedish power companies have extensively researched using their nuke plants for district heating, ever since the plants were originally planned in the 1960s, but it has never penciled out.
Who is talking about waste heat? If you want to heat homes, you could use the plant's thermal output directly at much higher efficiency. You don't need the detour through electricity generation at all.
See, my point isn't really that you should do that, but that there's a fallacy in thinking there is a "free" energy hack for crypto, AI, and so on. Wasting electrical energy is always bad.