
> But it seems like device manufacturers end up testing the biggest card when they make the device

I mean, they can't test hardware that doesn't exist yet, so what is the alternative?

If there's a hardcoded maximum, that's one thing, but we don't know that's what is going on, do we?

The lack of hardware to test against hasn't stopped PC manufacturers from supporting media larger than anything available when a given interface shipped.

There are occasional hiccups, like the whole debacle where the MBR partition table (from 1893) was limited to 2 TiB (with 512-byte sectors) and the industry switched over to GPT. But even today, if you put a >2TB disk into an ancient machine that doesn't support GPT, it can usually at least use the portion of the disk it can address.
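For context, the MBR ceiling falls straight out of the on-disk format: partition start and length are stored as 32-bit sector counts, so capacity is capped at 2^32 sectors times the logical sector size. A quick sketch of that arithmetic:

```python
# MBR stores partition start/length as 32-bit LBA sector counts,
# so the addressable limit is 2^32 sectors * logical sector size.
SECTOR_COUNT = 2 ** 32

for sector_size in (512, 4096):
    limit = SECTOR_COUNT * sector_size
    print(f"{sector_size}-byte sectors: {limit // 2**40} TiB")
# 512-byte sectors give the familiar 2 TiB cap; 4096-byte ("4Kn")
# sectors stretch MBR to 16 TiB, which is why some large drives
# remained usable on old firmware.
```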

Just read the spec and write your software to support it. It is not like we are asking for terabyte SD card support on machines released when the maximum SD card size was 128MB.
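"Read the spec" is doable here because the SD spec makes capacity a computed value, not a lookup table. As a hedged sketch (based on my reading of the simplified SD Physical Layer spec, which may gloss over details): for CSD register version 2.0 (SDHC/SDXC), the card reports a 22-bit C_SIZE field and capacity is (C_SIZE + 1) * 512 KiB, so firmware that does the math honestly handles any card the encoding can describe.

```python
# Capacity formula for SD CSD version 2.0 (SDHC/SDXC), per the
# simplified SD Physical Layer spec: capacity = (C_SIZE + 1) * 512 KiB,
# where C_SIZE is a 22-bit field read from the card's CSD register.
def csd_v2_capacity_bytes(c_size: int) -> int:
    assert 0 <= c_size < 2 ** 22, "C_SIZE is a 22-bit field in CSD v2.0"
    return (c_size + 1) * 512 * 1024

# Largest encodable value: exactly 2 TiB, matching the ~2TB SDXC ceiling.
print(csd_v2_capacity_bytes(2 ** 22 - 1) // 2 ** 40, "TiB")
```

Firmware that instead hardcodes "biggest card we tested" is choosing an artificial limit the format itself doesn't have.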

> Just read the spec and write your software to support it

I wish PC hardware manufacturers would do that. You don’t even need too old hardware or software to find compatibility bugs, here’s an example: https://superuser.com/q/807871/31483

> MBR partition table (from 1893)

I still think it's quite an impressive feat to support that much data before computers even existed though lol

Maybe not, Babbage might disagree: https://en.wikipedia.org/wiki/Analytical_Engine

It is a showcase of how little many device manufacturers care after they have your money.
