I mean, we're all programmers here, and we've all dealt with the growing pains of changing requirements.

But has an RDBMS ever been a major source of that pain? I can't say I've ever encountered a time when it has. If you need to change the structure, just write a script.

I'm not saying it's completely painless, but nothing is.


Before I started using migration scripts (and before there were off-the-shelf libraries for this), it was a bit of a pain. But these days, nope. MySQL is slightly more annoying than Postgres because it doesn't let you wrap DDL queries in transactions, so you can get left in an inconsistent state if your migration has a bug. Postgres is seamless.
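For what it's worth, a rough sketch of what I mean by seamless, using psycopg2 (the DSN, table, and column names here are made up):

  import psycopg2

  conn = psycopg2.connect("dbname=app")  # hypothetical DSN
  try:
      # Exiting the "with conn" block commits on success and rolls back on
      # error. Postgres runs DDL inside transactions, so a buggy migration
      # leaves the schema untouched instead of half-applied.
      with conn:
          with conn.cursor() as cur:
              cur.execute("ALTER TABLE users ADD COLUMN signup_source text")
              cur.execute("CREATE INDEX users_signup_source_idx "
                          "ON users (signup_source)")
  finally:
      conn.close()

On MySQL, each of those statements would implicitly commit on its own, which is where the inconsistent-state risk comes from.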

IMO, if your data model is still so nebulous that you're changing it often enough for SQL migrations to be a serious impediment to progress, you probably don't need any real datastore yet. You can usually figure out WTF you're going to do, broadly speaking, before persisting anything to a remote database. And if you can, you very much should.

Yes, yes, there are sometimes exceptions; one must say so explicitly despite having already hedged (see: "probably"), because this is HN.


Can someone explain to me any circumstances where having no well-defined data model is better than coming up with a clear relational model? Honest question. It seems like it would just be a huge hassle trying to deal with your dissimilar data.

When I'm designing an app from scratch I often think about the SQL tables first and how I'm going to build them, and it really sharpens my idea of what my program will be.

I don't see how skipping that process would make things easier.


I've built IoT platforms where data from any device must be accepted. This is largely where my preference for NoSQL comes from. A device created tomorrow will not have a schema I can predict or control. NoSQL allows easy integration of that device, while a traditional database will, at worst, require a migration for each new device you want to support.
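Just to make that concrete, the ingest side with a document store looks roughly like this (pymongo; the database, collection, and fields are made up):

  from pymongo import MongoClient

  readings = MongoClient()["iot"]["readings"]  # hypothetical db/collection

  # Two devices with completely different payloads, no schema change needed:
  readings.insert_one({"device_id": "therm-1", "temp_c": 21.4})
  readings.insert_one({"device_id": "cam-7", "motion": True, "battery_pct": 88})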

Please correct me if I'm wrong, but that sounds like a very narrow use case, and also something that could be solved by simply stuffing JSON into an RDBMS.
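By "stuffing JSON into an RDBMS" I mean something like this rough sketch with a Postgres jsonb column (psycopg2; the table and payload are made up):

  import psycopg2
  from psycopg2.extras import Json

  conn = psycopg2.connect("dbname=app")  # hypothetical DSN
  with conn, conn.cursor() as cur:
      # One fixed table; arbitrary per-device payloads go in a jsonb column.
      cur.execute("""
          CREATE TABLE IF NOT EXISTS device_readings (
              id         bigserial PRIMARY KEY,
              device_id  text NOT NULL,
              payload    jsonb NOT NULL,
              created_at timestamptz DEFAULT now()
          )
      """)
      cur.execute(
          "INSERT INTO device_readings (device_id, payload) VALUES (%s, %s)",
          ("cam-7", Json({"motion": True, "battery_pct": 88})),
      )
  conn.close()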

However, perhaps there are tools that NoSQL provides that are handy.


What's wrong is that you asked for any use case and then critiqued one because it's not broad enough for you.

Sorry, I'm not trying to be argumentative; I just argue in order to understand better.

Then argue honestly and work to steelman others' arguments. To address your point, I'd hardly consider IoT platforms to be a "narrow" use case. Smaller than the whole of computing, surely, but it's a growing field. The reality is that more and more devices will become available that generate all sorts of hard-to-predict data. Being able to handle that data easily will be a major strength for platforms going forward. Dropping this hard-to-predict data as JSON into an RDBMS will certainly come back to haunt you in 5 years.

How will it come back to haunt you? Again, just curious.

You'll have dumped it into a strict database, giving yourself a false sense of order and organization. Later, when you need to query that amorphous data, you might be able to use OPENJSON or something similar, but a NoSQL solution will have been built specifically to handle this type of query, with utilities like key-exists checks and better handling for keys that are missing or only sometimes present.
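To make that concrete, here's the same "does this key exist" query on both sides; a rough sketch with made-up names (Postgres jsonb vs. pymongo):

  import psycopg2
  from pymongo import MongoClient

  # Postgres: ? tests whether a jsonb key exists; ->> simply returns NULL
  # for keys that aren't present.
  conn = psycopg2.connect("dbname=app")
  with conn, conn.cursor() as cur:
      cur.execute("""
          SELECT device_id, payload ->> 'battery_pct' AS battery_pct
          FROM device_readings
          WHERE payload ? 'motion'
      """)
      rows = cur.fetchall()
  conn.close()

  # Document store: $exists does the same job.
  readings = MongoClient()["iot"]["readings"]
  docs = list(readings.find({"motion": {"$exists": True}}))

(OPENJSON is the SQL Server flavor of the same idea.)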

You can't really design your tables well enough without knowing your UI; it's a back-and-forth, forth-and-back process.

And it's something you can do entirely on paper, too, before you go to code.

However, I've never greenfielded an actually large application.


How does UI inform table design?

Knowing the UI will give you a lot of insight when planning your data models/tables.


