The other day I was trying to sample a few rows from an (admittedly big) database, so I ran `select * from database limit 10`.
It took more than a minute to execute (it wasn't stuck in a queue). The data engineer claims my query is a bad one because the database is big and I did not include a `WHERE` clause.
But are database engines really that naive?
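Usually not. A `LIMIT` with no `WHERE` or `ORDER BY` lets Postgres stop a sequential scan as soon as it has returned enough visible rows, so the query itself is fine. You can see where the time actually goes by looking at the real plan; a minimal check, using `tablename` as a placeholder for your table:

```sql
-- BUFFERS shows how many pages the scan had to read
-- before it produced 10 rows.
EXPLAIN (ANALYZE, BUFFERS)
SELECT * FROM tablename LIMIT 10;
```

If the buffer counts are huge for a 10-row result, the scan is wading through a lot of pages that hold no visible rows.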
In Postgres, the usual culprit for a slow bare `LIMIT` is table bloat rather than the missing `WHERE` clause: if autovacuum has fallen behind and the front of the table is full of dead tuples, the sequential scan has to skip past all of them before it finds 10 visible rows. You can run `VACUUM tablename` to clean up the dead tuples and `ANALYZE tablename` to refresh the planner statistics manually, but the autovacuum process should do both periodically if configured correctly.
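A quick way to check whether autovacuum has been keeping up, again with `tablename` standing in for your table (`pg_stat_user_tables` is a standard statistics view):

```sql
-- Dead-tuple counts and the last maintenance runs for the table.
SELECT relname, n_live_tup, n_dead_tup,
       last_vacuum, last_autovacuum,
       last_analyze, last_autoanalyze
FROM pg_stat_user_tables
WHERE relname = 'tablename';

-- Reclaim dead-tuple space and refresh statistics in one pass.
VACUUM (ANALYZE) tablename;
```

If `n_dead_tup` dwarfs `n_live_tup` and the `last_*` columns are old, tuning autovacuum settings for that table is the longer-term fix.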