
The problem isn't so much exposing the data as a REST API as it is allowing complex queries that may contain table joins, subqueries, or recursive conditions. I only skimmed the PostgREST documentation, but it makes no mention of joining tables, which is a deal-breaker for our use case.



Idea from someone just starting to learn about databases (very green :P):

- People request access and get an API key associated with a given load threshold, or don't use an API key and default to some low threshold

- Any query whose estimated cost from SQL EXPLAIN is over the threshold returns an error

- Successful requests' load costs and execution time (and possibly CPU, if that can be determined) count toward a usage rate limit

- An SQL parser implements the subset of SQL you deem safe and acceptable and forms a last-resort firewall
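The EXPLAIN-threshold step above could be sketched roughly like this. Everything here is illustrative: the function names, the default threshold, and the inline plan are assumptions; in a real deployment the plan dict would come from running `EXPLAIN (FORMAT JSON) <query>` against Postgres rather than being hard-coded.

```python
# Hypothetical sketch: reject a query whose estimated plan cost exceeds a
# per-API-key threshold. The plan dict mimics the shape of Postgres's
# EXPLAIN (FORMAT JSON) output; names and numbers are assumed for illustration.

DEFAULT_THRESHOLD = 1000.0  # cost units for keyless requests (assumed value)

def plan_cost(explain_json):
    """Read the total estimated cost off the top-level plan node."""
    return explain_json[0]["Plan"]["Total Cost"]

def check_query(explain_json, threshold=DEFAULT_THRESHOLD):
    """Return the cost if it is within the threshold; otherwise raise."""
    cost = plan_cost(explain_json)
    if cost > threshold:
        raise ValueError(f"estimated cost {cost} exceeds threshold {threshold}")
    return cost

# Example plan, shaped like EXPLAIN (FORMAT JSON) output for a sequential scan:
sample_plan = [{"Plan": {"Node Type": "Seq Scan", "Total Cost": 431.0}}]
print(check_query(sample_plan))  # within the default threshold
```

One caveat worth noting: EXPLAIN costs are planner estimates in arbitrary units, not wall-clock guarantees, so the threshold is a heuristic gate rather than a hard resource limit.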

Obviously this is a complex solution; I'm curious what people's opinions are on whether this would overall be simpler or more difficult in the long run.
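For the usage-rate-limit part, one possible shape is a sliding-window budget where each successful request is charged its estimated cost plus a weighted execution time. This is a sketch under assumptions: the class name, the budget and window sizes, and the time-weighting factor are all invented for illustration, not taken from any existing system.

```python
import time

class CostBudget:
    """Sliding-window usage budget: each successful request's estimated plan
    cost and execution time count against a per-key allowance.
    All numeric defaults are assumed values for illustration."""

    def __init__(self, budget=10_000.0, window=3600.0):
        self.budget = budget    # total cost units allowed per window
        self.window = window    # window length in seconds
        self.events = []        # list of (timestamp, charged_amount)

    def charge(self, cost, exec_seconds, now=None):
        """Try to charge a request; return True if allowed, False if over budget."""
        now = time.monotonic() if now is None else now
        # Drop charges that have slid out of the window.
        self.events = [(t, c) for t, c in self.events if now - t < self.window]
        # Weight execution time so slow queries cost extra (arbitrary factor).
        amount = cost + 100.0 * exec_seconds
        used = sum(c for _, c in self.events)
        if used + amount > self.budget:
            return False  # reject until earlier charges expire
        self.events.append((now, amount))
        return True
```

A caller would run the query, measure its duration, then `charge()` the key; CPU time could be folded into the same weighted sum if the server exposes it.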





