I used to use a preload-on-hover trick like this but decided to remove it once we started getting a lot of traffic. I was afraid I’d overload the server.
About your first statement, though: which server software do you use that still sends data after the client has closed the connection? Doesn't it use heartbeats based on ACKs?
I use nginx to proxy_pass to Django + Gunicorn via a unix socket. I sometimes see 499 responses in my nginx logs, which I believe means the client closed the connection before nginx could deliver a response from the backend.
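For reference, the nginx side of this is governed by `proxy_ignore_client_abort` (default `off`), which is what produces those 499s. A minimal sketch (the socket path is hypothetical, not your actual config):

```nginx
location / {
    # hypothetical gunicorn unix socket path
    proxy_pass http://unix:/run/gunicorn.sock;

    # "off" (the default): when the client closes the connection early,
    # nginx also closes the connection to the proxied backend and logs 499.
    # "on": nginx lets the backend request finish and logs its real status.
    proxy_ignore_client_abort off;
}
```

Note that even with the default, closing the socket to Gunicorn doesn't by itself abort the Python code already running for that request.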
I admit I haven’t actually tested it directly, but I’ve always assumed the Django request/response cycle doesn’t get aborted mid-request.
Closing a connection to Postgres from the client doesn't even stop execution.
Unless you are focusing on the word “server” and assuming it has nothing to do with the framework/code/etc., I can assure you it can be done. I’ve done it multiple times for reasons similar to this situation, and I profiled extensively, so I know exactly what work was done after client disconnect.
Many frameworks provide hooks for “client disconnect”. If you set up your environment (a more appropriate term than “server”, admittedly) fully and properly, which most don’t, you can cancel the majority (if not all, depending on timing) of the work being done for a request.
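As a concrete example of such a hook: the ASGI spec delivers an `http.disconnect` message when the client goes away. A minimal sketch of an ASGI app that cancels its in-flight work on disconnect (the `asyncio.sleep` is a stand-in for real work; any ASGI server, e.g. uvicorn, would run it):

```python
import asyncio

async def app(scope, receive, send):
    assert scope["type"] == "http"
    work = asyncio.create_task(asyncio.sleep(5))  # stand-in for real work

    async def watch_disconnect():
        # ASGI servers send {"type": "http.disconnect"} when the client leaves
        while True:
            message = await receive()
            if message["type"] == "http.disconnect":
                work.cancel()  # abort the request's work
                return

    watcher = asyncio.create_task(watch_disconnect())
    try:
        await work
        await send({"type": "http.response.start", "status": 200,
                    "headers": [(b"content-type", b"text/plain")]})
        await send({"type": "http.response.body", "body": b"done"})
    except asyncio.CancelledError:
        pass  # client left; skip building/sending the response
    finally:
        watcher.cancel()
```

The point is that the cancellation has to be wired up explicitly; a plain WSGI app (like stock Django under Gunicorn sync workers) has no equivalent hook.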
> Closing a connection to Postgres from the client doesn't even stop execution.
There are multiple ways to do this. If your DB library exposes no method to do it, there is always `pg_cancel_backend()`, issued from a second connection.
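Postgres itself can be told to cancel a query: open a second connection and call `pg_cancel_backend()` with the PID of the backend running the query (record the PID at request start with `SELECT pg_backend_pid()` on the worker's connection). A minimal sketch, assuming psycopg2 and a reachable DSN:

```python
CANCEL_SQL = "SELECT pg_cancel_backend(%s)"

def cancel_backend(dsn, pid):
    """Ask Postgres, over a second connection, to cancel whatever query
    the backend process `pid` is running. Returns True if the cancel
    signal was delivered to that backend."""
    import psycopg2  # assumption: psycopg2 is installed
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(CANCEL_SQL, (pid,))
            return cur.fetchone()[0]
```

`pg_cancel_backend()` only interrupts the current query; `pg_terminate_backend()` kills the whole session if you need that instead.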
If you are using Java and JDBI, the underlying JDBC `Statement.cancel()` is available, which does cancel the running query (the Postgres JDBC driver supports it).
If you are using Psycopg2 in Python, you’d call cancel() on the connection object (assuming you were in an async or threaded setting).
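The threaded variant of that pattern might look like this (a sketch; the DSN, query, and timeout policy are all assumptions, and psycopg2's `cancel()` is documented as safe to call from another thread):

```python
import threading

def run_query_cancellable(dsn, sql, timeout):
    """Run `sql` in a worker thread; if it hasn't finished within
    `timeout` seconds (standing in for "the client disconnected"),
    cancel it from the calling thread."""
    import psycopg2  # assumption: psycopg2 is installed
    conn = psycopg2.connect(dsn)

    def worker():
        try:
            with conn.cursor() as cur:
                cur.execute(sql)  # blocks until done or cancelled
        except psycopg2.extensions.QueryCanceledError:
            pass  # the cancel landed; the query was aborted server-side

    t = threading.Thread(target=worker)
    t.start()
    t.join(timeout)
    if t.is_alive():
        conn.cancel()  # interrupt the in-flight query from this thread
        t.join()
    conn.close()
```
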
So yes, with a bunch of extra overhead in handler code, you could most definitely cancel DB queries in progress when a client disconnects.