
The key is that the probes

a) are fast: they do incur the same network cost as a regular request, but beyond that, all they do is read two counters, so they're super quick for backends to serve.

b) are cheap: they don't do nearly as much work as a "real" request, so the cost of enabling this system is not prohibitive. They simply return two numbers. The probes don't compete with "real" requests for resources.

c) give the load balancer useful information: among all the metrics they could have returned from the backend, the ones they chose led to good prediction outcomes.

One could imagine playing with the metrics used, even using ML to select the best ones and adapt them dynamically based on workload and time period.
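
Rough sketch of what I mean by using the two counters (made-up names; not necessarily the selection rule the paper actually uses):

    import random
    from collections import namedtuple

    # Hypothetical probe reply: the two counters a backend is assumed to expose.
    ProbeReply = namedtuple("ProbeReply", ["in_flight", "latency_ms"])

    def pick_backend(backends, probe, fanout=3):
        # Probe a few randomly chosen replicas; `probe` is whatever RPC
        # returns a ProbeReply for a given backend.
        sampled = random.sample(backends, min(fanout, len(backends)))
        replies = [(b, probe(b)) for b in sampled]
        # Prefer the fewest requests in flight; break ties on reported latency.
        return min(replies, key=lambda r: (r[1].in_flight, r[1].latency_ms))[0]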


Optimizing tail latency isn't about saving fleet cost. It's about reducing the chance of one tail latency event ruining a page view, when a page view incurs dozens of backend requests.

The more requests you have, the higher the chance one of them hits a tail. So the overall latency a user sees is largely dependent on a) number of requests b) tail latency of each event.
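
Back-of-the-envelope, assuming independent requests:

    # Chance a page view hits at least one slow call, given n independent
    # backend requests, each with tail probability p.
    p, n = 0.01, 50              # 1% of calls are "slow", 50 calls per page view
    print(1 - (1 - p) ** n)      # ~0.39: roughly 4 in 10 page views see a tail event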

This method improves the tail latency for ALL supported services, in a generic way. That's multiplicative impact across all services, from a user perspective.

Presumably, the number of requests is harder to reduce if they're all required for the business.


The last sentence of the abstract indicates the goal is to be able to run at higher utilization without sacrificing latency. The outcome of that is you need less hardware for a given load, which for Google undoubtedly means massive cost savings and higher profitability.

   > Prequal has dramatically decreased tail latency, error rates, and resource 
   > use, enabling YouTube and other production systems at Google to run at much 
   > higher utilization.

> Optimizing tail latency isn't about saving fleet cost.

Indirectly, it is. As the quote I replied to suggests, in order to combat tail latency services often run with surplus capacity. This is just a fundamental tradeoff between the two variables mentioned.

So, by improving the LB algo, they (and anyone, really) can reduce the surplus needed to meet any specific SLO.
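
Rough illustration with made-up numbers:

    # If a better LB lets the fleet run at 70% utilization instead of 50%
    # while still meeting the same latency SLO, the hardware needed for the
    # same offered load shrinks proportionally.
    old_util, new_util = 0.50, 0.70
    print(old_util / new_util)   # ~0.71, i.e. roughly 29% fewer machines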


You're correct - that's a fair point.

It makes total sense for European farmers to cry foul if cheaper imports get to masquerade as the real thing. This has been a big sticking point for the EU-Mercosur deal that recently concluded.


The price seems entirely reasonable. $200 is about 1-2 hours of a professional's time in the USA.

It's in everyone's interest for the company to be a sustainable business.


This doesn't increase my salary, and if you are a consultant it reduces your billable hours. No thanks.


As a client I'd prefer you round up to a full hour instead of wasting my time.


Cookie banners are just one more reason why it's so painful to use the web. They are nothing but required pop-ups! Making it painful to visit sites hurts the internet ecosystem. Might as well stay in walled-gardens, where ads are occasional, but less intrusive.


It's not the popup that is required; it's crappy websites that prefer to do the popups in the most annoying way, to pressure you into accepting what you wouldn't want to.

At least, thanks to Europe, sites and apps that do things well have a clear competitive advantage.

What can be annoying is Europe not doing much to sanction abuse. Like those assholes at TrustArc and co. that hang on "processing" for 2 minutes when you refuse cookies. They pretend it's the time needed to send a request telling all the providers not to track you. But by definition, if you don't want to be tracked, there is no request they need to send: they should simply not keep info they were never supposed to receive.


> They are nothing but required pop-ups

They aren't required at all, just don't use any tracking cookies. Cookies that are useful to the user generally don't require the banner, and many analytics tools now have the option to work without cookies (at the cost of not tracking repeat visitors).

Annoying your users with cookie popups is the price sites pay for using invasive tracking. The only miscalculation is that apparently everyone is fine paying that price.


That’s always been the argument, but in the end, everyone wants to know how many visitors they have. I know, there are trackless solutions, but you have to admit the reality: The audience of website administrators who want to use non-tracking analytics is a subset of the audience for recompilation of kernels on Arch Linux.

Also I’m quite sure your website is not ranked as high on Google if you don’t use GA.


HN seems pretty in love with plausible.io every time it's mentioned. It's pretty minimalist, but we never had an issue with it not tracking something we actually cared about.

The real issue is ad tracking. If you either show ads or buy ads that lead to your website you are basically forced to add tracking cookies.


You need zero personal information to count visitors, and even if you needed it at collection time, you wouldn't store any of it if your only purpose is to count visitors.
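
For example (roughly the approach cookieless analytics tools describe; the details here are made up):

    import hashlib, secrets
    from datetime import date

    DAILY_SALT = secrets.token_hex(16)   # regenerated daily, never written to disk

    def visitor_key(ip, user_agent):
        # Salted hash of IP + UA: lets you count distinct daily visitors without
        # storing the IP itself, and the key is meaningless once the salt rotates.
        raw = f"{DAILY_SALT}{date.today()}{ip}{user_agent}".encode()
        return hashlib.sha256(raw).hexdigest()

    todays_visitors = set()
    todays_visitors.add(visitor_key("203.0.113.7", "Mozilla/5.0 ..."))
    print(len(todays_visitors))          # distinct visitors seen today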


> everyone wants to know how many visitors they have

Actually, we already know that if you want to know how many visitors you have, GA will strongly underestimate, especially with a tech-literate audience. You may want log parsing instead.

Even if you want the JS-based solution, you can do it first party, without the need for GA.
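
E.g., a crude first-party count straight from the access log (assuming the client IP is the first field, as in the common/combined log formats):

    # Count distinct client IPs in an nginx/Apache-style access log.
    seen = set()
    with open("access.log") as f:
        for line in f:
            seen.add(line.split(" ", 1)[0])
    print(len(seen), "distinct client IPs")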

> I’m quite sure your website is not ranked as high on Google if you don’t use GA.

Sure enough to quote a source? There are lots of high-ranking pages using non-GA solutions.


This is a common misunderstanding. Cookie banners are not required. It is your[0] active choice to invade your website visitor's privacy which leads to this degradation of usability. Stop with the cookie nonsense and you don't need a cookie banner.

[0] Royal you, speaking to a significant fraction of the HN audience.


Cookie popups are not required; it is just how companies that were already on course to shittify the internet decided to implement the consent required by EU laws.


> where ads are occasional, but less intrusive

One out of three videos on YouTube Shorts is an ad, there are several per video on YouTube, and Facebook and X hide sponsored posts within the timeline. That's not even talking about how many ads there are outside those walled gardens: searching for something where SEO is heavily gamed, like cooking recipes, ends up returning websites where half the page is ads or sponsored content. I typed "buy pc" on Google and the first 4 results are paid ads labelled Sponsored; the first organic result is outside my screen. How are they not intrusive in walled gardens?

Those cookie banners are not mandatory; strictly necessary functional cookies do not require consent.


We’ve entirely lost the war and some people are still fighting the wording of the declaration.

> the first organic result is outside my screen.

And it’s only a referral blogpost.

Is it even worth fighting? What have we won by attempting this fight? The Internet is unusable today.


> And it’s only a referral blogpost. Is it even worth fighting? What have we won by attempting this fight? The Internet is unusable today.

Actually, for me it offered ldlc.com, which is a fairly good result. With an ad-blocker, it's the first result. Is it worth fighting? If the alternative is a web that is unusable today, yes. Will it prevent the enshittification of the web in the long term? Probably not.


They aren't required, though; they are a signal that the site you are visiting would rather utilize tracking cookies than provide a good UX. They could simply not use tracking cookies and then they wouldn't need to provide a pop-up!


Twitter used to have an app, Periscope. You could start a livestream any time, anywhere. And viewers could find live streams on a world map.

For a few months, it was possible to feel the incredible simultaneity and richness of human lives. Someone biking, another person cooking. Day in one place, night in another place.

It was ahead of its time. And too expensive for Twitter to keep running for very long. But it was a precursor to today's Snapchat map view and Instagram live streams.


And before that, there was Bambuser which was very similar to Periscope but launched some 8 years earlier. It never gained the popularity of Periscope, likely at least partly due to its Nordic rather than Bay Area roots - but, oh boy, was it fun!

At any time of the day you could go to the website and watch normal people around the globe doing random stuff. And chat with them! There weren't any real influencers at the time (at least not on the platform) and monetization wasn't possible, so people's motivation for live streaming was not to make money but rather the joy of sharing a moment or just experiencing new, cool technology. It got a bit less joyful when the Arab Spring started and the platform got used by many in very dire situations, but it remained incredibly interesting to follow.

The company still exists, though they stopped offering free-to-use consumer services long ago.


they should bring it back


I think Twitter is losing enough money already, doubt they could afford to do that


I’m struggling to imagine Elon Musk doing that.


Isn't that exactly what he did when live-streaming on X last year?

https://x.com/elonmusk/status/1653608284606312448

"This is 2015 Periscope code. Yeah, seems like we just need to improve it a bit."


GPS spoofing ruined any potential for local-first content.


Well the whole point of this product is to link back to websites. There’s no necessary link between the text and the links, which are chosen after the fact from an index. That’s different from traditional search engines, where links are directly retrieved from the index as part of ranking.


The answer though has so much info already that it reduces the need to visit the links.


Honestly, for any serious query, the links serve mostly so you can double-check the AI. That's a useful function, however.

I think we're going to see even fewer site visits as a consequence of AI search engines. The internet's ad-based funding model is going to dry up further, but the impact will be disproportionate. It'll be a few years til we see where the cards land.


Valve is trying to obsolete Windows, so they can prevent Microsoft from interfering with Steam. Apple could team up with them, and help obsolete Windows for a very large percentage of game-hours.

There will always be a long tail of niche Windows games (retro + indie especially). But you can capture the Fortnite (evergreen) / Dragon Age (new AAA) audience.


Anyone know how well this works with Godot?


Great progress from Anthropic! They really shouldn't change models from under the hood, however. A name should refer to a specific set of model weights, more or less.

On the other hand, as long as it's actually advancing the Pareto frontier of capability, re-using the same name means everyone gets an upgrade with no switching costs.

Though, all said, Claude still seems to be somewhat of an insider secret. "ChatGPT" has something like 20x the Google traffic of "Claude" or "Anthropic".

https://trends.google.com/trends/explore?date=now%201-d&geo=...


> Great progress from Anthropic! They really shouldn't change models from under the hood, however. A name should refer to a specific set of model weights, more or less.

In the API (https://docs.anthropic.com/en/docs/about-claude/models) they have proper naming you can rely on. I think the shorthand of "Sonnet 3.5" is just the "consumer friendly" name user-facing things will use. The new model in API parlance would be "claude-3-5-sonnet-20241022" whereas the previous one's full name is "claude-3-5-sonnet-20240620"
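
E.g., pinning the dated snapshot in the Python SDK (minimal sketch; the model string comes from the docs linked above):

    import anthropic

    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
    msg = client.messages.create(
        model="claude-3-5-sonnet-20241022",   # pinned snapshot, not a moving alias
        max_tokens=256,
        messages=[{"role": "user", "content": "Hello, Claude"}],
    )
    print(msg.content[0].text)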


That's great to know - business customers require a lot more stability, I suppose!


There was a recent article[0] trending on HN about their revenue numbers, split by B2C vs B2B.

Based on it, it seems like Anthropic is 60% of OpenAI API-revenue wise, but just 4% B2C-revenue wise. Though I expect this is partly because the Claude web UI makes 3.5 available for free, and there's not that much reason to upgrade if you're not using it frequently.

[0]: https://www.tanayj.com/p/openai-and-anthropic-revenue-breakd...


3.5 is rate-limited on the free tier, same as 4o (4o's limits are actually more generous). I think the real reason is much simpler - Claude/Anthropic has basically no awareness in the general public compared to OpenAI.

The chatGPT site had over 3B visits last month (#11 in Worldwide Traffic). Gemini and Character AI get a few hundred million but Claude doesn't even register in comparison. [0]

Last they reported, OpenAI said they had 200M weekly active users.[1] Anthropic doesn't have anything approaching that.

[0] https://www.similarweb.com/blog/insights/ai-news/chatgpt-top...

[1] https://www.reuters.com/technology/artificial-intelligence/o...


I basically have to tell most of my coworkers to stop using GPT and switch to Claude for coding - Sonnet 3.5 is the first model that I feel isn't wasting my time.


They also had a very limited roll-out at first. Until somewhat recently Canada and Europe were excluded from the list of places they allowed sign-ups from.


I suppose business customers are savvy and will do enough research to find the best cost-performance LLM. Whereas consumers are more brand and habit oriented.

I do find myself running into Claude limits with moderate use. It's been so helpful, saving me hours of debugging some errors w/ OSS products. Totally worth $20/mo.


Traveling to the US recently, I was surprised to see Claude ads around the city/in the airport. It seems like they're investing on marketing there.

In my country I've never seen anyone mention them at all.


Been traveling more recently, and I've seen those ads in major cities like NYC or San Francisco, but not Miami.

