Hacker News | codaphiliac's comments

Thinking this could be useful in a multi-tenant service where you need to fairly allocate job processing capacity across tenants to a number of background workers (like data export API requests, encoding requests, etc.)

That was my first thought as well. However, in a lot of real-world cases, what matters is not the frequency of requests but the duration of the jobs. For instance, one client might request a job that takes minutes or hours to complete, while another may only have requests that take a couple of seconds to complete. I don't think this library handles such cases.

Lots of heuristics continue to work pretty well as long as the least and greatest are within an order of magnitude of each other. It’s one of the reasons why we break stories down to 1-10 business days. Anything bigger and the statistical characteristics begin to break down.

That said, it’s quite easy for a big job to exceed 50x the cost of the smallest job.


Defining a unit of processing (like duration or quantity) and then feeding the algorithm the equivalent number of units consumed (pre- or post-processing a request) might help.
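One way to sketch that idea (all names here are hypothetical, not from the library under discussion): keep a per-tenant counter of units consumed, always serve the most under-served backlogged tenant, and correct the charge once the real cost is known.

```python
class CostFairScheduler:
    """Sketch of cost-based fair queuing: each tenant is charged for the
    units its jobs consume, and the backlogged tenant with the least
    total consumed cost runs next. Hypothetical API, for illustration."""

    def __init__(self):
        self.consumed = {}   # tenant -> total units consumed so far
        self.queues = {}     # tenant -> list of (job, estimated_units)

    def submit(self, tenant, job, estimated_units):
        self.queues.setdefault(tenant, []).append((job, estimated_units))
        self.consumed.setdefault(tenant, 0.0)

    def next_job(self):
        # Pick the backlogged tenant with the fewest consumed units.
        candidates = [t for t, q in self.queues.items() if q]
        if not candidates:
            return None
        tenant = min(candidates, key=lambda t: self.consumed[t])
        job, units = self.queues[tenant].pop(0)
        self.consumed[tenant] += units   # charge the pre-processing estimate
        return tenant, job

    def report_actual(self, tenant, estimated_units, actual_units):
        # Post-processing correction: charge the real cost, not the guess.
        self.consumed[tenant] += actual_units - estimated_units
```

With this, a tenant submitting a few huge jobs and a tenant submitting many tiny jobs are balanced by cost rather than by request count.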

To mitigate this case you could limit capacity in terms of concurrency instead of request rate. Basically it would be like a fairly-acquired semaphore.
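A minimal sketch of that approach, with a hypothetical API: cap in-flight jobs per tenant instead of limiting request rate, so a tenant's slow jobs only occupy its own slots.

```python
class TenantConcurrencyLimiter:
    """Sketch (names hypothetical): cap each tenant at max_concurrent
    in-flight jobs, regardless of how long each job runs."""

    def __init__(self, max_concurrent):
        self.max_concurrent = max_concurrent
        self.in_flight = {}   # tenant -> number of running jobs

    def try_start(self, tenant):
        n = self.in_flight.get(tenant, 0)
        if n >= self.max_concurrent:
            return False          # tenant at capacity; job must wait
        self.in_flight[tenant] = n + 1
        return True

    def finish(self, tenant):
        self.in_flight[tenant] -= 1
```

A real implementation would add locking and a wait queue, but the invariant is the same: long-running jobs consume a tenant's own concurrency budget rather than everyone's request-rate budget.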

I believe nginx+ has a feature that does max-conns by IP address. It’s a similar solution to what you describe. Of course that falls down with respect to fairness when fanout causes the cost of a request to not be proportional to the response time.
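For reference, open-source nginx can express a per-IP concurrency cap with the limit_conn module (NGINX Plus additionally supports max_conns on upstream servers); a minimal sketch:

```nginx
# Cap concurrent connections per client IP (open-source nginx).
limit_conn_zone $binary_remote_addr zone=per_ip:10m;

server {
    location /api/ {
        limit_conn per_ip 10;   # at most 10 in-flight connections per IP
    }
}
```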

Per capita means nothing... Our environment cares only about total emissions. The rest is politics.


If per capita means nothing, the logical solution for China is to split itself into 100 countries. Somehow that would make the contribution from each new country less important in your view.

Measuring per-capita emissions, and driving that measure down, is the only sane way to count emissions. The end result is the same (total emissions go down), but we can focus on a metric that makes sense no matter how large the country you’re measuring is.


I'm not following. All other things equal, how would the national division of China affect the sum of emissions?


We don't generate emissions for the fun of it; we do it to support the lifestyles of humans.

Sure, total emissions is what matters, but the formula for total emissions is:

    Total Emissions = Emissions per capita * People
Unless you're proposing a reduction in the number of people (don't), per capita means quite a lot.
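Since the formula is linear in population, partitioning a country leaves the total unchanged, which answers the point above about splitting China into 100 countries. A quick check with hypothetical numbers:

```python
# Hypothetical figures: 1.4B people at 8 tonnes CO2 per person per year,
# versus the same population split into 100 equal countries.
per_capita = 8.0
population = 1_400_000_000

whole = per_capita * population
split = sum(per_capita * (population / 100) for _ in range(100))

assert whole == split   # total emissions are unchanged by the split
```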


Money. Ego.


AWS will spin off to be a wildly successful entity.


Email security scanner following links?


Wait till you fight a prompt to update the project in minor ways...


But tools are fun!


Yeah, it's a curse for all of us who are into tech haha


Apply to companies with deep R&D programs; they all need maths: 3d rendering algorithms research, the energy sector, etc.


> 3d rendering algorithms research

My guess is those jobs either go to PhDs in graphics programming, or people with extensive (and impressive) practical experience.


Yeah, in my experience, all those "interesting but practical fields" have plenty of people with experience in exactly that field, be it academic researchers or more practically oriented people. They don't need some random mathematician who still has to learn everything.


The cost of compliance.


Rally a team around a single goal, get out of the way and let them move fast. If you don't trust your people, you have bigger problems that no methodologies will fix.

