Hacker News | kevinsync's comments

Ok, very cool. But I want Netflix to explain how this relates to Seinfeld, which looks fine at 10-12 feet but insane up close. Blocky MJPEG + a grain filter??

It's not like we're on Pentium II processors anymore -- I can filter just about anything with ShaderGlass [0] on a shitty computer (and some of the CRT shaders like crt-hyllian-curvature are brilliant, especially on old shows like NewsRadio that only exist on DVD) .. and I'm shocked that Netflix doesn't just have this built into their Apple TV app or whatever. I'm shocked PLEX doesn't have it! (that I know of)

I made a comment on a different post about imagining a world where local AI/LLM/whatever does some favorable processing for you, by you, on your device, of web content, to enhance your experience. I really believe media (streamers all the way down to open source devs) need to begin to incorporate whatever's out there that reduces friction and increases joy. It's all out there already! The heavy lifting has been done! Just make Family Matters look like how it looked when I was locking in on a Friday night for TGIF LOL

[0] https://github.com/mausimus/ShaderGlass


> ShaderGlass + NewsRadio

I just want to say this is a brilliant suggestion. Shit-tier DVD rips of NewsRadio really would be elevated by ShaderGlass.


My stomach is churning already knowing I'm about to type a short-sighted hot take related to LLMs, but I do wonder what a screen reader would look like that could provide a "summarized" version of any given web page (presumably via LLM). Basically, allow the user to swap between the full page rendered with the current methodology / presentation of content and links, and a version of the same page with summarized text content plus a collated, deduped section of actions found in the content.

ex.

To download W3C's editor/browser Amaya, [click here].

[Download Amaya]

[Click here] to get Amaya for Windows

All collapse into something singular and sensible like [Download Amaya installer for Windows here] as an action inside the action section.
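For what it's worth, the collapsing step can be sketched without any LLM at all. Everything in this snippet (the `dedupe_actions` helper, the regexes, the group-by-href rule) is invented for illustration; a real screen reader or LLM layer would need to be far more robust:

```python
import re

# Hypothetical sketch of the "collated, deduped actions" idea above.
def dedupe_actions(links):
    """links: list of (link_text, href) pairs scraped from a page."""
    by_target = {}
    for text, href in links:
        # Strip filler like "click here" and keep the most descriptive
        # label seen for each destination.
        label = re.sub(r"\bclick here\b", "", text, flags=re.I).strip()
        label = re.sub(r"^to\s+", "", label, flags=re.I)
        best = by_target.setdefault(href, "")
        if len(label) > len(best):
            by_target[href] = label
    return [(label or "link", href) for href, label in by_target.items()]

# The three redundant links from the example collapse into one action.
actions = dedupe_actions([
    ("click here", "/amaya/download"),
    ("Download Amaya", "/amaya/download"),
    ("Click here to get Amaya for Windows", "/amaya/download"),
])
# actions == [("get Amaya for Windows", "/amaya/download")]
```

Grouping by destination rather than by link text is what lets the three phrasings collapse into a single entry.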

I don't know. I should probably put on a sleeping mask and navigate the web via a screen reader one of these days to really experience how things are.


When we can run our own models that are good enough on local hardware (practically), it'll really take off. I believe AI accelerators in end-user electronics will revolutionize how we use computers.

> I don't know. I should probably put on a sleeping mask and navigate the web via a screen reader one of these days to really experience how things are.

The difference is that it wouldn't be like experiencing it through a screen reader; it'd be like experiencing it with a screen reader that you can't use and will never be motivated enough to learn. Some blind people are known to listen to code at "reading speed," which is pretty incredible.

You'd be like standing on skis for the first time, or using Vim


Not at all, I've been doing this with ChatGPT and Claude for a long time. I only recently (last couple weeks) started playing around with Claude Code on command line (not in an IDE). I didn't like Cursor very much. YMMV


IMO the comments so far are not seeing the forest for the trees. I can imagine incredible value for myself in a browser that hooks into a local LLM; writes everything it sees to a local timestamped database (oversimplification); parses and summarizes everything you interact with (again an oversimplification -- this would be tunable and scriptable); exposes Puppeteer-like functionality that is both scriptable via code and prompt-to-generate-code; helps you map shit out, remember stuff, and find forgotten things that are "on the tip of your [digital] tongue"; learns what you're interested in (again, locally); helps proactively filter ads, spam, phishing, and bullshit you don't want to see; and can be wound up and let go to tackle internet tasks autonomously for (and WITH) you (oversimplification), on and on and on.

Bookmarks don't cut it anymore when you've got 25 years of them saved.
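The "writes everything it sees to a local timestamped database" piece, at least, needs nothing exotic. Here's a minimal sketch using SQLite's built-in FTS5 full-text index; the schema, the `log_page` helper, and the sample pages are all made up for illustration, not code from any real browser:

```python
import sqlite3
import time

# Minimal sketch of logging visited pages into a local, timestamped,
# full-text-searchable store. Schema and data are illustrative only.
db = sqlite3.connect(":memory:")  # a real browser would use a file
db.execute("CREATE VIRTUAL TABLE pages USING fts5(url, title, body, visited_at)")

def log_page(url, title, body):
    # Timestamp each visit so later searches can also filter by time.
    db.execute("INSERT INTO pages VALUES (?, ?, ?, ?)",
               (url, title, body, time.strftime("%Y-%m-%dT%H:%M:%S")))

log_page("https://example.com/pricing", "Pricing", "2TB hard drive comparison table")
log_page("https://example.com/blog", "Blog", "notes on browser automation")

# "find that page about browser automation" becomes a plain FTS query.
rows = db.execute(
    "SELECT url FROM pages WHERE pages MATCH 'browser automation'"
).fetchall()
# rows == [("https://example.com/blog",)]
```

The LLM layer would sit on top of a store like this for summarization and natural-language queries; the retrieval itself stays local and deterministic.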

Falling down deep rabbit holes because you landed on an attention-desperate website to check one single thing and immediately got distracted can be reduced by running a bodyguard bot to filter junk out. Those sites create deafening noise that you can squash by telling the bot to just let you know when somebody replies to your comment with something of substance that you might actually want to read.

If it truly works, I can imagine the digital equivalent of a personal assistant + tour manager + doorman + bodyguard + housekeeper + mechanic + etc, that could all be turned off and on with a switch.

Given that the browser is our main portal to the chaos that is the internet in 2025, this is not a bad idea! Really depends on the execution, but yeah.. I'm very curious to see how this project (and projects like it) go.


Thank you so much for your honest feedback. I 100% agree - this is spot on! This is exactly the vision we had.

We spend 90%+ of our time in browsers, yet they're still basically dumb windows. Having an AI assistant that remembers what you visited, clips important articles (remember Evernote web clipper?), saves highlights and makes everything semantically searchable - all running locally - would be game-changing.

Everything stays in a local Postgres database - your history, highlights, sessions. You can ask "what was that pricing comparison from last month?" or "find my highlights about browser automation" and it just works. Plus built-in self-control features to block distracting sites when you need to focus.

Beyond search and memory, the browser can actually help you work. AI that intelligently groups your tabs ("these 15 are all Chromium research"), automation for grunt work ("compare 2TB hard drive prices across these sites"), or even "summarize all new posts in my Discord servers" - all handled locally. The browser should help us manage internet chaos, not add to it.

Would love to hear what specific workflows are painful for you!


For a long time I kicked around the idea of a browser extension that archives the full text of any long webpages you spend more than 30 seconds on, for full text indexing and search.

This would be that, but even better.



Oh wow, this is exactly what I want, but with a server component so it works on mobile too (where I do most of my reading) and gets data from all of my workstations (I have 4-6 at any given time).

Maybe I can hack it into this one.


This is basically what Microsoft wants to do with Recall and they got slammed for it. Which drives me nuts because it's the only feature from the recent AI hype wave that excites me because it's the only thing so far that sounds like it will actually make my life better. But then I thought about it a bit more and I realized that what I really want is not AI, I just want my computer to have a detailed local history and search functionality.

My computer should remember everything I did on it, period. It should remember every website I visited, exactly how far down I scrolled on each page, every thought I typed and subsequently deleted before posting... And it should have total recall! I should be able to rewind back to any point in time and track exactly what happened, because it's a computer. I already have a lossy memory of stuff that happened yesterday and that's inside my head. The whole point of having my computer remember stuff for me is that it's supposed to do it better than me.

And I want the search to be deterministic. I want to be able to input precise timestamps and include boolean operators. Yes, it would be helpful to have fuzzy matches, recommendations and a natural language processing layer too, but Lucene et al already did that acceptably well for local datasets 20+ years ago. It's great we have a common corpus, but I don't care about getting tokenized prose from the corpus, I care about the stuff I did on my own computer!
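That deterministic layer is indeed old, solved technology; even SQLite's built-in FTS5 gives exact boolean operators, and a timestamp bound is just an ordinary predicate on top. A sketch with an invented schema and invented rows:

```python
import sqlite3

# Sketch of deterministic search: boolean operators plus a timestamp
# bound, no LLM involved. Table name, columns, and rows are made up.
db = sqlite3.connect(":memory:")
db.execute("CREATE VIRTUAL TABLE seen USING fts5(text, ts UNINDEXED)")
db.executemany("INSERT INTO seen VALUES (?, ?)", [
    ("draft reply about recall and privacy", "2025-06-01T10:00"),
    ("pricing comparison for 2TB drives",    "2025-06-02T09:30"),
    ("notes on lucene boolean queries",      "2025-06-03T14:12"),
])

# FTS5 understands AND / OR / NOT natively; the ts filter is a plain
# SQL predicate, so results are exactly reproducible.
hits = db.execute(
    "SELECT text FROM seen WHERE seen MATCH 'boolean OR pricing' "
    "AND ts >= '2025-06-02' ORDER BY rowid"
).fetchall()
# hits == [("pricing comparison for 2TB drives",),
#          ("notes on lucene boolean queries",)]
```

Fuzzy matching and natural-language layers can be bolted on later, but the ground truth stays queryable with exact operators and exact timestamps.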

From my perspective LLMs don't bring much value on the personalized search front. The way I understand it, the nature of their encoding makes it impossible to get back the data you were actually looking for unless that data was also stored and indexed the traditional way, in which case you could have just skipped the layer of indirection and queried the source data in the first place.

I am also curious to see how all of this develops. I get a sense that the current trend of injecting LLMs everywhere is a temporary stop-gap measure used to give people the illusion of a computer that knows everything because researchers haven't yet figured out how to actually index "everything" in a performant way. But for the use case of personalized search, the computer doesn't actually need to know "everything", it only needs to know about text that was visible on-screen, plus a bit of metadata (time period, cursor position, clipboard, URL etc). If we currently still need an LLM to index that because snapshotting the actual text and throwing it into a traditional index requires too much disk space, okay, but then what's next? Because just being able to have a vague conversation about a thing I kindasorta maybe was doing yesterday is not it. Total recall is it.


> It should remember every website I visited

I don't know about other browsers, but Safari does this. It's come in handy when I'm like "what was that site I visited two years ago?" and I can open my history and query to filter the list of pages, and there it is, January 17, 2023, yoda ate my balls retrospective


There's the whole privacy issue. We know every software company exfiltrates as much data as they can get away with, and we know the US government has access to all that data. If Recall is good, then ICE conc-camping individuals based on their search history is good, because that's what Recall will do.


Well, I mean, what Microsoft wanted to do was a great idea; what they delivered was bullshit: poorly executed and just asking for a data leak the way it was implemented. They didn't even properly vault the data lol


> My computer should remember everything I did on it, period. It should remember every website I visited, exactly how far down I scrolled on each page, every thought I typed and subsequently deleted before posting...

Sheesh.

As someone who uses web browsers that delete all session data when they're closed, and routinely wipes any "recently used" lists and temporary files in all operating systems I use, the thought of the machines I'm using remembering my usage behavior to such an extent is terrifying to me.

I mean, I get why those features would be appealing. I just have zero trust in the companies that build such software, because they've violated my trust time and time again. Yet I'm expected to suddenly trust them in this case? Not a chance. Not when the data that I would be entrusting them with for the features you mention are a literal gold mine for them. Trusting a trillion-dollar corporation with a history of privacy violations and hostility towards its users is just unthinkable for me, no matter what features I might be missing out on.

It's unfortunate that computing has come to this, but I choose the hardware and software I use very carefully. In cases where I'm forced to use a system I don't trust, I try my best to limit my activity and minimize the digital footprints I leave behind. I prefer using open source software for this reason, but even that is carefully selected, since OSS is easily corruptible by unscrupulous developers, and companies that use it as a marketing tactic.

The only way I might use software with that level of intrusion is if I've inspected every line of it, I run it inside a container with very limited permissions, or if I've written it myself. Needless to say, it's more likely I'll just miss out on using those features instead, and I'm fine with that.


This is an amazing vision. I want my browser to remind me if I lose focus, and to analyze and show me what I've been doing so I can learn from myself. Self-reflection is powerful here.


my forest is this - "new LLM-based native ad-blocker"... this forest is so big my brain hurts even thinking about it. (sarcasm, sorry)


Everything you said sounds great, other than the fact the first half of it is generic surveillance dystopia. I had hoped we’d be a more unique dystopia, but it appears we’ll have to settle for run of the mill.

How it all started:

“Bookmarks and shit don’t cut it anymore”.

Tsk tsk.

I’ll just leave this here:

https://youtu.be/kGYwdVt3rhI


Yeah, keep in mind that significant abuse of the drugs mentioned is, uh, let's say, prone to inflating the ego rather than killing it and showing a different path.

Kinda fills in some unspoken gaps about the 'discipline' of psychology...


Can you explain this for those of us with minimal background in psychology? Is there a substantial amount of pseudoscience in modern psychology, or just historically?


Psychology has a pretty significant reproducibility crisis.

From what I've seen as an outsider, a lot of studies are taken as fact without any attempt to reproduce the results, and many results suffer from questionable methodology.

A big part of the problem is that doing psychology well is really, really hard. You are dealing with human subjects, which means there are a lot of ethical and regulatory constraints. A lot of experiments that might give you important insights are unethical and/or illegal. Getting people to participate in studies is difficult and expensive, which means sample sizes are often much smaller than they should be. And there are often significant biases in the population sampled (I believe most psychology studies are done on college students... often psychology students). And then there is the inherent complexity of the subject. Every person's brain is different, and finding general rules that apply to the incredible diversity of human minds is very, very difficult. And finally, I suspect that a lot of psychologists are not trained in statistics and experimental methodology to the same degree as scientists in "harder" sciences.


I can't really speak for psychology as a field other than a spectator, but I find it to be quite subjective. What I really meant was related to cocaine and nitrous oxide -- two very different substances that both result in wild egos, wild ideas, hard-to-sustain invincibility and a host of other effects, unlike psychedelics (LSD, psilocybin, DMT, etc) which are arguably closer to (and provide) positive psychological responses / breakthroughs / perspectives.

For a current-times look into nitrous, observe Kanye West. The rumor mill (plus believable evidence) suggests he is out of his MIND on large amounts of N2O frequently, and his erratic and grandiose behavior reinforces the idea. That's probably not ideal for American psychology if "the father" of it is similarly whacked lol.

For a historical look into cocaine, observe Sigmund Freud. There's a great book called Cocaine: An Unauthorized Biography [0] by Dominic Streatfeild, the second third of which covers Freud's discovery and promotion of cocaine as a cure-all.

TL;DR Freud was searching for a drug, any drug, that hadn't been claimed yet by a scientific promoter to then market as his own for fame and fortune, stumbled upon cocaine (hydrochloride, not freebase), started doing a lot of it, proselytizing it (it could cure your heroin addiction!) etc, before the whole thing kind of collapsed around him.

While arguably fun, it's a substance that is the polar opposite of "introspection" and drives a lot of behavior that, honestly, a person might seek out a therapist or psychologist to resolve LOL. So for someone who would eventually go on to more or less define psychology as a field to have promoted it early in his career, well ... I just find it curious, and I wonder what theories Freud would have put forth had he lived in a time with psychedelics available instead. That's all!

[0] https://en.wikipedia.org/wiki/Cocaine:_An_Unauthorized_Biogr...


Psychology isn't a science and never has been. It fails to meet the basic criteria. Reproducibility in particular has been a very big deal in recent years (google search term: "replication crisis").


This is a little blunt, but I knew a lot of people who went into psychology and got degrees. They were not very gifted academically (C students in high school) and went into psych because it was an easy major that let them coast and party for four years. I'm sure this isn't the case for all psychology majors, of course, but it was very different on average from what I was used to in engineering or the hard sciences, where you generally had four years of excruciating math classes, including calculus and differential equations, that weeded out most.

Psychology is very dependent on statistics and experiments. That can be complicated, and after going through those classes I simply don't trust the majority of students (or their professors) to get any of it right. It's why I roll my eyes every time the radio guy talks about the results of another pop-psych study. I knew some psych majors who were big into new-age crystal stuff and legitimately believed it all, along with a bunch of additional pseudoscience garbage. That kind of thing is a lot rarer in, say, physics, where it's really hard to get through the program without a rational brain.

Again, there are probably some brilliant folks drawn to that field who know how to do solid research, but my experience suggests that the signal-to-noise ratio may be suspect.


Psych majors with just a BS degree - totally agree with you.

PhDs, though - some more rigor is involved. Definitely not C-grade level folks (or if they were, they've rectified that problem). But still, we do have a replication crisis...


I appreciate your honesty, blunt or not that’s an interesting perspective.


I was fully "in" on webOS :( Still got a Palm Pre, Pre 2, Pre 3 and TouchPad in a box, and an LG webOS 2.0 OLED that died in the basement.

Apps were built sort of like PhoneGap, but intentional and supported rather than a middleware work-around. webOS introduced the card concept that we all use now, along with a very coherent design language, and the devices were cool (to me, albeit a bit flimsy) with full keyboards (I was super sold on that but have long-since changed my mind after switching to iOS).

I came from a long line of "alt" devices though, Sidekick 1, 2, 3, Helio Ocean, etc, so you can see where my sensibilities lie HAHAHA

I would also get freakin' roasted by literally everybody I knew every time they saw it for being a hold-out and not getting an iPhone, but iOS just wasn't there yet as far as I was concerned. Apple/Android hadn't cornered the market yet and it was just a time with a lot of options (Blackberry, Windows Phone, etc).

Anyways, when I heard HP was buying Palm (and AT&T did a deal for Pre 3 exclusivity, I think), I assumed it would be a great thing for the mass adoption of what seemed like a really exciting future for mobile. Then HP poured gasoline on it and killed it with fire.

RIP late-oughts Palm, we barely knew ye!


I still miss my Palm Pre. I've sat here since it died and watched Android and iOS slowly adopt the UI that my Pre had 15 years ago. We were swapping between apps with cards and swiping them away a decade before anyone else.

I had multiple friends end up buying the Pre and the non-slidey Pre (I can't remember the name) because they saw what I had and thought it was so cool.

Now my LG TV runs webOS, which I assume shares only the name and no code, but who knows.


> Now my LG TV runs WebOS, which I assume is the name with no shared code, but who knows.

Pretty sure it is based on a derivative of the original WebOS code! I think the LuneOS folks use some WebOS OSE code: https://en.m.wikipedia.org/wiki/LuneOS




Reminds me of Braid, very satisfying


John Wick is an accurate summary of the recent movie

https://achtaitaipai.github.io/odyc-exemples/games/john-wick...


I’ve gone through periods of life where I didn’t recall dreaming for years at a time, and others where I have frequent, vibrant dreams. I’ve had sleep paralysis many times when I was younger, a handful of extremely lucid dreams, times when I close my eyes and see nothing, and times when I can close my eyes and visualize clear, fully-actualized images.

I bring this up for two reasons: I wonder how fluid this sort of thing is, and I wonder what factors can dial up and down the intensity. Nicotine, patches in particular, absolutely supercharged my dreams to be bright, vivid, insane, bizarre hallucinations.

In general, my memory of novel events / odd connections / hilariously specific details is quite good, going back many years. I can also forget what I’m supposed to be doing right now within minutes. I can often remember when/where/how I read/saw something but not WHAT I read, so I have to retrace my steps to get to where I know the information is that I’m seeking.

It all seems to oscillate and shift and it’s fascinating.


Not to derail, but NFT mania (part of the opening salvo in the article) was the giant shitshow that it was -not- because the concept of unique digital bits in the possession of a single owner was a bad idea (or, the concept of unique verification of membership in a club was a bad idea) -- it was a diarrhea-slicked nightmare because it was implemented via blockchains and their related tokens, which inherently peg fluctuating fiat value to the underlying mechanisms of assigning and verifying said ownership or membership, and encourages a reseller's market .. not to mention the perverse, built-in economic incentives required to get nodes to participate in that network to make the whole thing go.

Had NFTs simply been deployed as some kind of protocol that could be leveraged for utility rather than speculation, I think the story would be a complete 180. No clue personally how to achieve that, but it feels like it could be done.. except that, too, would have been completely perverted and abused by centralized behemoths, leading to a different but terrible outcome. Can you imagine if all data became non-fungible? Convince all the big identity vendors (Google, Apple, etc) to issue key pairs to users that then get used by media companies to deliver audio and video keyed only to you that's embedded with maybe some kind of temporal steganographic signature that's hard to strip and can be traced back to your key? It's not just cracking AACS once and copying the bytes. It becomes this giant mess of you literally can't access anything without going through centralized authorities anymore. Then build more anti-patterns on top of that lol. Prolly better that it was mostly just monkey JPEGs and rug pulls.

Anyways, I'm so far off topic from what's actually being discussed -- just couldn't help myself from veering into left field.


Just conjecture, but this might also be what Apple does with a lot of their TV+ series that are filmed for Immersive / Spatial Video (shot stereoscopically and end up in MV-HEVC format). On a regular screen it ends up looking like a super weird bokeh towards the edges of the image.

