I mean, they can't test hardware that doesn't exist yet, so what is the alternative?
If there's a hardcoded maximum, that's one thing, but we don't know that's what is going on, do we?
There are occasional hiccups, like the whole debacle where the MBR partition table (from 1983) topped out at 2 TiB and the industry switched over to GPT, but even today if you put a >2TB disk into an ancient machine that does not support GPT, it can usually at least use the portion of the disk that it can address.
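The 2 TiB ceiling falls straight out of the on-disk format: an MBR partition entry stores its start and length as 32-bit LBA sector counts. A back-of-the-envelope check (assuming the classic 512-byte logical sector):

```python
# MBR partition entries hold start LBA and sector count in 32-bit
# fields, so with 512-byte sectors the largest addressable span is
# 2^32 sectors * 512 bytes = 2 TiB.
SECTOR_SIZE = 512      # bytes per logical sector (traditional value)
MAX_SECTORS = 2 ** 32  # 32-bit sector-count field

max_bytes = MAX_SECTORS * SECTOR_SIZE
print(max_bytes)             # 2199023255552
print(max_bytes / 2 ** 40)   # 2.0 (TiB)
```

Drives with 4K-native sectors push that same field to 16 TiB, which is one reason some vendors shipped them as a stopgap before GPT was universal.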
Just read the spec and write your software to support it. It is not like we are asking for support of terabyte SD cards on machines released when the maximum SD card size was 128MB.
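The SD spec actually anticipated growth: a card reports its own capacity in the CSD register, so a host that decodes it per the spec handles new sizes automatically, up to each CSD version's field limits. A sketch of that decoding, with field names taken from the SD Physical Layer Simplified Specification (illustrative only, not production code):

```python
def sd_capacity_bytes(csd_structure, c_size, c_size_mult=0, read_bl_len=0):
    """Compute card capacity from CSD register fields.

    Field names follow the SD Physical Layer Simplified Specification;
    this is a sketch of the two decoding rules, not a full CSD parser.
    """
    if csd_structure == 0:
        # CSD version 1.0 (standard capacity, up to 2 GB):
        # capacity = (C_SIZE + 1) * 2^(C_SIZE_MULT + 2) * 2^READ_BL_LEN
        return (c_size + 1) * (1 << (c_size_mult + 2)) * (1 << read_bl_len)
    elif csd_structure == 1:
        # CSD version 2.0 (SDHC/SDXC):
        # capacity = (C_SIZE + 1) * 512 KiB
        return (c_size + 1) * 512 * 1024
    raise ValueError("unknown CSD structure version")

# A CSD v2 card reporting C_SIZE = 65535 decodes to exactly 32 GiB:
print(sd_capacity_bytes(1, 65535))  # 34359738368
```

Hosts written against CSD v1 alone are exactly the machines that stall at the old limits; ones that also decode v2, as the spec describes, picked up SDHC/SDXC cards for free.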
I wish PC hardware manufacturers would do that. You don’t even need particularly old hardware or software to find compatibility bugs; here’s an example: https://superuser.com/q/807871/31483
I still think it's quite impressive that a partition scheme from the early '80s held up for as long as it did though lol