It’s a little oversimplified, but I wouldn’t call it misleading.

There is little point in getting an app like Perplexity AI pre-installed on a phone as a non-default. Changing defaults isn’t exactly trivial, and any user motivated enough to go through that will have no problem installing the app from the App Store.

So of course the deal fell through.

And it’s accurate to say that “Google blocked a deal to put Perplexity AI on Motorola phones”, and that doing so is highly monopolistic.

Though… as an end user and occasional family tech support person, I’m thankful for anything that reduces pre-installed bloatware on phones. Thanks google.


> I’m thankful for anything that reduces pre-installed bloatware on phones

Except that I still get all the google bloatware on my phone.


On my pixel, GrapheneOS has 2x the battery life of factory android.

Of course, most commercial apps won’t run due to Google’s monopolistic bullshit. You can partially fix it by installing (sandboxed) google play services, but that halves the battery life back to what stock android gets.


Every single app I have tried on GrapheneOS works great without GMS except Too Good To Go (their app mandates location access and the manual location selection is broken).

I download them using Aurora Store.


Sure, migrations are bearable (especially ones that only add columns).

But for the example of the "updated_at" column, or "soft delete" functionality, you often only find out you need it when the operations team suddenly discovers they need that functionality on existing production rows because something weird happened.


Good point. When reading, I kind of just assumed the "use of uninitialised memory" warning would pick this up.

But because the whole line is parsed in a single sscanf call, the compiler's static analysis is forced to assume the values have now been initialised. There doesn't seem to be any generic static analysis approach that can catch this bug.

Though... you could make a specialised warning just for scanf that forced you to either pass in pre-initialised values or check the return result.
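
To illustrate, here's a minimal sketch of the failure mode (made-up variable names, not the code from the article): sscanf reports how many fields it actually assigned, so on malformed input some of the pointed-to variables are simply never written, yet the compiler assumes they were because their addresses were passed into the call.

    #include <stdio.h>

    int main(void) {
        const char *line = "42";   /* malformed input: the second field is missing */
        int width, height;         /* deliberately left uninitialised */

        /* sscanf returns the number of fields assigned; here it returns 1,
           so 'height' is never written. */
        sscanf(line, "%d %d", &width, &height);

        /* Bug: this reads uninitialised memory, but most compilers stay quiet
           because &height was handed to sscanf above. Checking the return
           value would catch it:
               if (sscanf(line, "%d %d", &width, &height) != 2) return 1;  */
        printf("%d x %d\n", width, height);
        return 0;
    }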


In many countries, the consumer protection laws are strong enough that consumers probably can return such appliances, as long as the facts about app requirements weren't made abundantly clear to them at the store.

Though, it's usually the store that's responsible for that refund, not the manufacturer. Still, stores are motivated to reduce return rates and will put pressure on manufacturers not to do stupid things.


For things you can carry back to the store, that works well. When I buy a new washing machine, though, it's a bit more complicated:

- I go to the store and make the purchase.

- A delivery crew brings the washing machine to my house.

- They unhook my old washing machine and take it away.

- They attach my new washing machine in its place.

Even with the strongest reasonable protection laws I can imagine, the most the store would be obligated to do if the new machine is unsatisfactory would be to detach the new machine and take it away. And I've probably had to pay for one or two visits from the installers at that point. Regardless of whether the extra visit from the installers carries any extra cost for me, there's enough hassle associated with this that I can easily imagine keeping a machine where I'm not happy with some app requirement because it'd be too much trouble to make the change.


Depends on what you mean by “important”. It’s not like it will be a huge loss if we never invent AGI. I suspect we can reach a technological singularity even with limited AI derived from today’s LLMs.

But AGI is important in the sense that it will have a huge impact on the path humanity takes, hopefully for the better.


> But AGI is important in the sense that it will have a huge impact on the path humanity takes

The only difference between AI and AGI is that AI is limited in how many tasks it can carry out (special intelligence), while AGI can handle a much broader range of tasks (general intelligence). If instead of one AGI that can do everything, you have many AIs that, together, can do everything, what's the practical difference?

AGI is important only in that we are of the belief that it will be easier to implement than many AIs, which appeals to the lazy human.


It's really surprising how long Reddit has lasted.

Slashdot lasted maybe 10 years (though still limps on). Digg lasted about 4 years before it started shedding users at alarming rates (and 6 years before it killed itself).

But after 20 years, Reddit is still gaining users; it's not dying yet.

Reddit has changed so much over time. The Reddit of 2006 was very different to the Reddit of 2008, and the Reddit of 2013 was very different again. By 2019, it had more or less managed to reinvent itself as an app, trading blows with Facebook, Instagram and TikTok, almost unrecognisable.

I'm sure many people will put peak Reddit around 2019, but for me, the Reddit of roughly 2011 was my favourite, and it's only been downhill from there.

I don't think Reddit can re-invent itself again, only continue to get worse. But I suspect it will still be around in 10 years.


It was quite interesting that Reddit had its own unique culture during the rage comics and narwhal era. At that time, you knew you were on Reddit and not some other site. Whereas now, it's pretty homogeneous with every other site.

I agree with you about Reddit being around in 10 years - because I don't see its users having any reason to suddenly depart, given every other large community is largely similar.


> had its own unique culture during the rage comics and narwhal era. At that time, you knew you were on Reddit and not some other site. Whereas now, it's pretty homogeneous with every other site.

I disagree completely. You got the same thing at the same time on 4chan and other places as well.

I think it was less that each place had a distinct & unique culture, and more that the overall culture of the internet had a huge fence between places of business/serious stuff on one side, and the larger messy, funny, sad, sometimes offensive chaos of (mostly geeky or techy type) people on the other side, where internet culture thrived - that side wasn't a place of business, just a place to have fun, and not everyone was there.

Because money, that wall has been broken down _from_ the business side of the Internet and business has leeched into almost every corner of our green space - all of that internet culture, funny cat videos, funny meme comics is all very profitable - especially if it's pushed and manicured for the masses using evil social media tactics.

That's the key, I think; the internet and all its culture has become INRT, a stock that mostly only goes up, feeding on itself in the same way that Disney now only makes live-action remakes.


Peak Reddit was 2010-2015, in its full uncensored glory.

Wow, 2019! Haha, yeah, it went downhill way, way earlier for me. The Digg users joining changed the site for sure, but whenever "famous" novelty accounts stopped being a big thing and the first rounds of subreddit banning happened, it started to suck. I would have assumed that for most, peak Reddit was around whenever there was the huge rally in DC. Perhaps there are lots more users now, but the quality is awful; it used to be so easy to get multiple experts on anything to answer your questions.

> it used to be so easy to get multiple experts on anything to answer your questions.

It’s still possible to get useful answers on niche topics, but you will also get flooded with questionable sub-specific dogma, and god forbid your question is the tiniest bit obvious/unnecessary (according to the “experts” of course, never mind that it clearly wasn’t obvious to the asker)


I call that the Stack Overflow effect, where once a topical community reaches a certain information density, anything that does not go neatly on the very top of the current pile is mercilessly destroyed.

It's akin to the concept of "climbing a ladder to the top and then pulling the ladder up behind you", and imho it's the alarm bell that indicates that a community has died, even if the community itself does not know it yet.


Stack Overflow itself is also being more overtly destroyed by the corporation that owns it. Did you know they de-attributed Luigi Mangione's posts in blatant violation of their license agreement to his content?

I think it's just called the passing of time. Our bodies build up senescent cells. Our personalities do it too, as we move from being new adults in the world to old people shouting at clouds. You have something new and novel and empty, you add two pieces of knowledge/whatever, and wow, this is lively, useful discussion. Over time you have 20,000 bits of knowledge/songs in the genre/etc, and those two new bits aren't that exciting; they get lost in the noise, get canceled by what came before.

In theory, yes, but there's also an insularization, where the older posters who got there first and accumulated clout inside of the group now actively prevent new users from participating until they have completely consumed the knowledge of the group, and even then they might still be excluded just for having an account that isn't old enough or that hasn't generated enough value to the group.

It would be like having to read and parse every book in the Library in order to get feedback on your short story, and half the time the reviewers will just throw your story in the trash before even reading the cover.


How do you know reddit is net gaining real users instead of bots and alts?

I'm basing my "still growing" assessment off Google Trends; nobody thinks to bot that. The trends for Slashdot and Digg [0] more or less match up with when they died, while Reddit [1] is still growing (at least up until 3 months ago, but that's most likely noise). You really have to zoom into the right date range to not have Reddit dwarf Slashdot/Digg [2].

The new users aren't exactly high quality, but they seem to exist. Or at least advertisers think they exist.... shrug.

For some other fun trends, check out Facebook, which has been clearly declining since 2012, or Instagram, which appears to have been declining since 2023... not entirely sure why.

[0] https://trends.google.com/trends/explore?date=all&q=slashdot...

[1] https://trends.google.com/trends/explore?date=all&q=slashdot...

[2] Zoomed: https://trends.google.com/trends/explore?date=2004-01-01%202...


To be fair, adding "reddit" to your search query has been one of the few ways to get away from the SEO garbage, to the point where it's become a thing. So I'm not sure how effective Google Trends is as a measurement.

Guilty.

I know Reddit has become infested with junk recently, but it just shows how bad the broader internet has become that I'd rather search on Reddit than walk in that swamp.


ChatGPT bots really saved that metric. The new scams putting the old scams on life support. Maybe the next big thing can keep the LLM multi-level marketing scheme alive. Forever growth by forever bigger scams.

The best Reddit was the one the users from Digg were migrating to.

HN is the closest facsimile to Reddit before the mass exodus from Digg

Digg is also supposedly coming back and has an early access sign up

Partly.

But I suspect it's more that CGI was the way things had always been done. They didn't even consider doing a reverse proxy. They asked the question "how do we make CGI faster" and so ended up with FastCGI.
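
To make the contrast concrete, here's a rough sketch of what that "make CGI faster" answer looks like in code, assuming the fcgi_stdio.h wrapper from the FastCGI devkit (the request counter and output text are made up for illustration). A plain CGI program runs its body once per request and exits; the FastCGI version wraps the same body in an accept loop, so the process, and any expensive initialisation, survives across requests:

    #include "fcgi_stdio.h"   /* FastCGI devkit wrapper; stands in for <stdio.h> */

    int main(void) {
        int requests = 0;

        /* Expensive one-time setup (config parsing, DB connections, etc.)
           would go here, outside the loop, and be reused for every request.
           Under plain CGI that cost is paid again on every single hit. */

        /* FCGI_Accept() blocks until the web server hands over the next
           request; a classic CGI program has no loop at all. */
        while (FCGI_Accept() >= 0) {
            printf("Content-type: text/plain\r\n\r\n");
            printf("Hello, request number %d from a long-lived process\n",
                   ++requests);
        }
        return 0;
    }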

Other developers asked the same question and ended up making mod_php (and friends), embedding the scripting language directly into the web server.


It’s fraud when they lie to investors, or allow them to assume the wrong thing.

It doesn’t matter what consumers believe; it’s more or less legal to lie to consumers about how a product works, as long as investors know how the sausage is made. (Though, in reality, it’s near impossible to lie to customers without also misleading investors, especially for publicly listed companies.)

In this case, investors were under the impression that the AI worked, completing 99% of transactions without any human intervention. In reality, it was essentially 0%.


I do think an open source, distributed, content addressable VCS was inevitable. Not git itself, but something with similar features/workflows.

Nobody was really happy with the VCS situation in 2005. Most people were still using CVS, or something commercial. SVN did exist (it had only just reached version 1.0 in 2004), but platforms like SourceForge still only offered CVS hosting. SVN was considered to be a more refined CVS, but it wasn't that much better and still shared all the same fundamental flaws stemming from its centralised nature.

On the other hand, "distributed" was a hot new buzzword in 2005. The recent success of Bittorrent (especially its hot new DHT feature) and other file sharing platforms had pushed the concept mainstream.

Even if it hadn't been for the BitKeeper incident, I do think we would have seen something pop up by 2008 at the latest. It might not have caught on as fast as git did, but you must remember that the thing that shot git to popularity was GitHub, not the Linux kernel.


Yeah, I think people who complain about git should try running a project with CVS or Subversion.

The amazing flexibility of git appears to intimidate a lot of people, and many coders don't seem to build up a good mental model of what is going on. I've run a couple of git tutorials for dev teams, and the main feedback I get is "I had no idea git was so straightforward".


> It could have and should have.

From a technical perspective, yes. You are basically talking about the Nokia N800 but with a cellphone modem and a bit of effort spent shrinking the bezels down.

But from a product design perspective, I suspect it was impossible to make that leap. We are talking about the point when cellphones were at their very smallest. The 1st gen iPhone with its 3.5" display was considered to be large for a phone. Nobody thought mainstream users would be happy pocketing a phone with a "massive" 4.13" display.

And Nokia were only happy excluding the keyboard from the N800 because it was considered to be a content consumption device. At that time, smartphones were regarded as productivity devices (for email) and the physical keyboard was essential, which would have bulked out the device (See N810).

I don't think we could have gotten to today's large smartphones without first creating a viable browsing experience on an iPhone-sized display.


Not impossible; it only required a small amount of vision and risk taking, which Nokia et al. obviously lacked.

> Nobody thought mainstream users would be happy pocketing a phone with a "massive" 4.13" display.

Yet it was exceedingly obvious there was a very profitable, sizeable niche of users who were willing to do so.

And it shouldn't have taken very much imagination to realize that "web in the pocket" was useful in 2008, and would quickly become much more useful in 2009, 2010 etc as the population of people with the web in their pocket grew and companies started to serve the market.

The big problem was that all of those phone companies were hardware companies. Putting Firefox in a phone was a challenge beyond them. Microsoft could have and should have done it, but they were dysfunctional at the time.

