
Pretty nice tool that Just Works (tm).

The concurrency level (512 connections) is a bit too aggressive for most servers, though. You'll get throttled, blocked, or your backend will crash (none of which is too bad in itself, except that it's probably not what you were after from a link checker).




Actually, I totally agree with you. I chose that number based on the default maximum number of open files on Linux, because I wasn't sure what the common limits on concurrent connections between clients and HTTP servers are. Alternatively, it should probably regulate the number of requests per second sent to the same host. If someone suggests a better option, I'll adopt it.
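
Just to illustrate (not from the tool itself): a minimal sketch of per-host regulation in Go, using golang.org/x/time/rate with one limiter per hostname. checkLink is a hypothetical stand-in for the checker's request logic, and the 10 req/s figure is an arbitrary example value.

    package main

    import (
        "context"
        "fmt"
        "net/http"
        "net/url"
        "sync"

        "golang.org/x/time/rate"
    )

    // One token-bucket limiter per hostname, so requests to the same
    // host are throttled independently of requests to other hosts.
    var (
        mu       sync.Mutex
        limiters = map[string]*rate.Limiter{}
    )

    func limiterFor(host string) *rate.Limiter {
        mu.Lock()
        defer mu.Unlock()
        if l, ok := limiters[host]; ok {
            return l
        }
        l := rate.NewLimiter(rate.Limit(10), 1) // 10 req/s per host: arbitrary example value
        limiters[host] = l
        return l
    }

    // checkLink is a hypothetical stand-in for the checker's request logic.
    func checkLink(ctx context.Context, rawURL string) error {
        u, err := url.Parse(rawURL)
        if err != nil {
            return err
        }
        // Block until this host's limiter allows another request.
        if err := limiterFor(u.Hostname()).Wait(ctx); err != nil {
            return err
        }
        resp, err := http.Get(rawURL)
        if err != nil {
            return err
        }
        defer resp.Body.Close()
        fmt.Println(resp.StatusCode, rawURL)
        return nil
    }

    func main() {
        _ = checkLink(context.Background(), "https://example.com/")
    }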


A good default might be based on current browser behavior. Keep in mind that HTTP/2 might make everything use one connection but allow 100 concurrent requests.
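
For what it's worth, a rough sketch of a browser-like default using Go's standard http.Transport; the 6-connection cap just mirrors the usual per-host browser limit, and over HTTP/2 the transport multiplexes requests onto a single connection anyway. newBrowserishClient is a made-up name for illustration.

    package main

    import (
        "net/http"
        "time"
    )

    // newBrowserishClient caps connections per host at roughly what
    // browsers do; over HTTP/2 requests are multiplexed over a single
    // connection instead, so the cap mostly matters for HTTP/1.1.
    func newBrowserishClient() *http.Client {
        return &http.Client{
            Timeout: 30 * time.Second,
            Transport: &http.Transport{
                MaxConnsPerHost:     6, // the usual per-host browser limit
                MaxIdleConnsPerHost: 6,
                ForceAttemptHTTP2:   true, // use HTTP/2 where the server supports it
            },
        }
    }

    func main() {
        client := newBrowserishClient()
        if resp, err := client.Get("https://example.com/"); err == nil {
            resp.Body.Close()
        }
    }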


I think your approach is fine for local dev servers.

Maybe just introduce something like an exponential backoff algorithm if you start getting too many 5xx errors or the requests are hanging.
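
Something like this, say (a minimal sketch in Go; fetchWithBackoff is a hypothetical helper, and the retry count and base delay are arbitrary). It retries on 5xx responses and on transport errors such as timed-out requests, doubling the delay each attempt.

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    // fetchWithBackoff retries on 5xx responses and on transport errors
    // (e.g. hung or timed-out requests), doubling the delay each attempt.
    // The 4 attempts and 500ms base delay are arbitrary example values.
    func fetchWithBackoff(client *http.Client, url string) (*http.Response, error) {
        delay := 500 * time.Millisecond
        var lastErr error
        for attempt := 0; attempt < 4; attempt++ {
            if attempt > 0 {
                time.Sleep(delay)
                delay *= 2 // exponential backoff
            }
            resp, err := client.Get(url)
            if err != nil {
                lastErr = err // network error or timeout: retry
                continue
            }
            if resp.StatusCode < 500 {
                return resp, nil // success, or a client error worth reporting as-is
            }
            resp.Body.Close()
            lastErr = fmt.Errorf("server returned %d", resp.StatusCode)
        }
        return nil, lastErr
    }

    func main() {
        client := &http.Client{Timeout: 10 * time.Second}
        if resp, err := fetchWithBackoff(client, "https://example.com/"); err == nil {
            resp.Body.Close()
        }
    }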


Both Chrome and Firefox limit the number of connections to a server to six (6), if memory serves. I'm not certain whether those limits have changed, or whether the number differs between HTTP/1.1 and HTTP/2.


The limit is per _hostname_, not per _server_ (unless things have changed in the last 10 years).

This is why you'll see assets1.foo.com, assets2.foo.com, etc., all pointing to the same IP address(es). When rendering the HTML, server-side code picks one of them based on something like a hash of the filename modulo the number of shards, to get additional pipelines in the user's browser. Not sure how, or whether, this is done in SPAs.
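
A rough sketch of that sharding trick, assuming a fixed list of asset hostnames; the FNV hash, the hostnames, and the assetURL helper are all just illustrative, not what any particular framework actually does.

    package main

    import (
        "fmt"
        "hash/fnv"
    )

    // assetHosts all point at the same server(s); the extra names only
    // exist to work around the browser's per-hostname connection limit.
    var assetHosts = []string{"assets1.foo.com", "assets2.foo.com", "assets3.foo.com"}

    // assetURL picks a hostname deterministically from a hash of the
    // filename, so a given file always lands on the same shard (and
    // therefore stays cached under a single URL).
    func assetURL(filename string) string {
        h := fnv.New32a()
        h.Write([]byte(filename))
        host := assetHosts[h.Sum32()%uint32(len(assetHosts))]
        return "https://" + host + "/" + filename
    }

    func main() {
        fmt.Println(assetURL("app.css"))
        fmt.Println(assetURL("logo.png"))
    }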



