John Carmack pushes out unlocked OS for defunct Oculus Go headset (arstechnica.com)
817 points by JaimeThompson on Oct 22, 2021 | 304 comments



> allowing for a randomly discovered shrink wrapped headset twenty years from now to be able to update to the final software version, long after over-the-air update servers have been shut down.

This resonates so much with me. Each time I set up a new device that requires an Internet connection, I think about how we can enjoy booting 30-year-old retro computers and how the next generation will not be able to do the same because of locked-down hardware.


The same is true of some of the best work being done on the web, such as the NYTimes interactive infographics.

Because they are so JS-heavy, and reliant on CI/CD pipelines for deployment and on custom CMSes, there is no way to archive them in the way that static pages containing just text and images can be archived on the Wayback Machine. Heck, even Flash projects from 15 years ago still run fine in Ruffle or some other Flash player.


> Because they are so JS-heavy, and reliant on CI/CD pipelines for deployment and on custom CMSes, there is no way to archive them in the way that static pages containing just text and images can be archived on the Wayback Machine.

Welcome to the world of digital archiving. It's an enormously complicated space, and even for just my own personal projects and content, I've spent a lot of time thinking about how to ensure things are future proof and can be archived easily.

As a simple example, building my personal website atop Markdown ensures that, even if the formatting can't be preserved, the core content will be since it's simple ASCII (yes, that's ignoring issues of long-term digital storage and access and so forth, but at least it's not also a bunch of binary blobs or database formats or whatnot).

Equally alarming is the fact that so much of our digital lives isn't even in our control. A historian used to be able to rely on family archives, public libraries, etc., to understand our past. A hundred years from now they'll be looking back and hoping someone somewhere preserved the contents of an S3 bucket before Amazon decided to delete it on a whim...


Some people actually use those "takeout" features to collect archive data. So you do get the archive, but it's like having somebody's cuttings from a local newspaper rather than a complete set of local papers on microfiche.

One reason I take these is that I have RAM and I have grep and apparently either the people who had the data don't have RAM or they don't have grep, and so while I can ask my local Facebook archive "Er, didn't I write something about anti-freeze?" and get an answer in seconds, Facebook itself will try to suggest I might want pages about anti-freeze, a group that cares about anti-freeze, a sponsored advert for anti-freeze ... and not the thing I wrote.
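
For the curious, the whole trick is about this much code. A rough sketch in Python, where the folder name and search term are made-up placeholders and the export is assumed to be a plain tree of JSON/HTML files:

    # Grep-style search over a downloaded "takeout" export.
    # Assumes the export is an unpacked folder of JSON/HTML/text files;
    # "facebook-export" and "anti-freeze" are placeholders.
    from pathlib import Path

    ARCHIVE_DIR = Path("facebook-export")
    TERM = "anti-freeze"

    for path in ARCHIVE_DIR.rglob("*"):
        if path.suffix.lower() not in {".json", ".html", ".txt"}:
            continue
        text = path.read_text(encoding="utf-8", errors="replace")
        for lineno, line in enumerate(text.splitlines(), start=1):
            if TERM.lower() in line.lower():
                print(f"{path}:{lineno}: {line.strip()[:120]}")

It answers in seconds what the hosted search apparently can't.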


Facebook's search and suggestion engine is hilariously broken.

Say, I am commenting in a thread trying to respond to John Smith. That's the only person whose name starts with a J.

If I start typing @J..., the suggestions would be for literally anyone else but John Smith in the thread.

On their mobile website (which lags behind the app), typing @John Smith will sometimes suggest a number of John Smiths, none of them being the one in the thread I am writing in.

Same with friends. If I want to tag a friend of mine and start typing their name, I usually get suggestions for random people first (from neither my friend list nor the comment thread).

Why on Earth the list is not prioritized by (friends in thread) / (everyone else in thread) / (friends) / (everyone else) is absolutely beyond me.


Once you do manage to tag @JohnSmith, he will get a notification that he has been tagged in the thread. One notification per thread, regardless of the number of individual posts he was tagged in.

The link on the notification will take him to the top of the thread.

Depending on the thread's popularity, John could have a very difficult time finding the posts he's tagged in.


These are just symptoms, though, not coding mistakes.

Facebook literally wants you to be caught up in wading through their posts, spending your life on their website.


Ha ha, perfect - you both summed up the hellhole that is Facebook commenting so well.


> Some people actually use those "takeout" features to collect archive data. So you do get the archive, but it's like having somebody's cuttings from a local newspaper rather than a complete set of local papers on microfiche.

The problem with these (at least the Facebook ones) is that the data is lacking all context. It's kinda OK if you just want copies of your photos, but I can't make heads or tails of most of my comments from the archive, and posts are missing a lot without the comments.


I have this exact same situation with my YouTube history. I used to be able to search it in Google's history search, but that only shows a subset of results.

If I want to really search for the title of a random video I saw 5 years ago the only option is to download a raw CSV of my history and use grep :(
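
Something like this is about the extent of it; a sketch assuming the export really is a CSV with a title column (the file and column names here are guesses, not Google's actual schema):

    # Filter a downloaded watch-history CSV by a half-remembered title.
    # "history.csv" and the "title"/"time" columns are assumptions.
    import csv

    QUERY = "lecture"  # placeholder search term

    with open("history.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if QUERY.lower() in row.get("title", "").lower():
                print(row.get("title"), "-", row.get("time", ""))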


Fwiw, the Internet Archive is very much trying to avoid the random S3 bucket deletion problem, and donations to them are tax deductible.

The issues of long-term digital storage are such that (use whatever you want for your own blog, but imo) ASCII isn't going to save you any more than binary blobs are, 300 years into the future after we're all long gone and buried. We're already in a world where UTF-8 is taking over in many places. (Many places, but not all. Fun fact: you can't send Zelle to someone with an emoji in their local contact name with some banks.)

If I (today) said I had a word document and needed "an old version of Microsoft Word", I'm sure most people would know what I mean, and that I'd find someone with a Windows XP machine and a copy of Office 97. Meanwhile, there are tons of people who are just going to stare at you blankly if you tell them about EBCDIC, never mind help you find a decoder.


> If I (today) said I had a word document and needed "an old version of Microsoft Word", I'm sure most people would know what I mean, and that I'd find someone with a Windows XP machine and a copy of Office 97. Meanwhile, there are tons of people who are just going to stare at you blankly if you tell them about EBCDIC, never mind help you find a decoder.

Funny, I suspect the precise reverse is true.

EBCDIC is a well-documented encoding. Worst case, find yourself a reference book and you can figure out how to deal with it, because that knowledge is open and available.

The same is true of ASCII. If you can understand binary encodings with 8-bit groupings--a fairly fundamental concept in digital computing--you can probably find your way to an ASCII table in a library somewhere.

But good luck finding a working Windows XP machine with Office '97 fifty or one hundred years from now, let alone a spec for the format.
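
To underline the point about that knowledge being open: a stock Python install can still decode EBCDIC today with nothing extra (cp037 is the EBCDIC-US code page; the byte string is just an illustrative example):

    # Decoding EBCDIC (code page 037, EBCDIC-US) with only the standard library.
    data = bytes([0xC8, 0x85, 0x93, 0x93, 0x96])  # "Hello" encoded in EBCDIC
    print(data.decode("cp037"))                   # prints: Hello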


The part about the spec of that Office 97 format is more or less taken care of by the LibreOffice project.


And once the maintained version of LibreOffice inevitably drops Office 97 support, you are back to having to find old LibreOffice versions and trying to get them to run or port the code.


And that's ignoring the fact that code is a terrible spec. Trying to reverse engineer a file format from a software implementation is a godawful nightmare, and I say that from personal experience.

Given the choice between that and having to figure out how 8-bit ASCII works, it's pretty clear which is the easier problem to solve.


7-bit ASCII is a subset of UTF-8, so ASCII is fine in a UTF-8 world.


> If I (today) said I had a word document and needed "an old version of Microsoft Word"

Modern Word versions still load Word 97 docs. There's a decent chance Office versions from around that time still work on Windows 10.


>As a simple example, building my personal website atop Markdown ensures that, even if the formatting can't be preserved

That's why I built my personal ADHD blog[1] on TiddlyWiki[2].

It's a self-contained HTML page that has everything.

I could have even embedded the images.

You can archive it with *File -> Save As...* (single-file .mht works).

[1] https://romankogan.net/adhd

[2] https://tiddlywiki.com


I still don't get why Firefox doesn't support MHT(ML)(=EML), while Thunderbird does, considering how that's pretty much the best digital document format we have...


They used to support MHTML prior to the Quantum update, via the Mozilla Archive Format addon or the superior UnMHT addon (which captured pages more accurately than Chromium's MHTML support did in direct comparisons I made).

Not sure why they've dropped support for it entirely since then, given it was supported for the longest time and it's the most convenient single-file format for saving web pages. It's a major reason I couldn't continue using Firefox as my main browser.


Mozilla is too busy inserting ads and removing useful features from Firefox.


I'm just waiting until some other organization says enough is enough and forks Firefox. More likely than Mozilla pulling their head out of their ass IMO.


But it has already been forked, several times to boot?


One aspect of this is to look at the ways that history is being rewritten now from original materials. All of the -isms of the 1900's painted a picture of straight, white (male) Captains of Industry paving a way to the future, and in revisiting the source materials we are discovering that this image paved over a lot of people that were doing a lot of heavy lifting.

History is full of assistants, spinsters and confirmed bachelors whose stories are being re-told now from diaries and correspondence that have been family heirlooms for generations. You can't trust the contemporary reports as accurate, because they have a different agenda than we do 20, 40, 100 years in the future. We only knew of Marie Curie within her own lifetime less because her work was so profound than because she had a husband in her own field who conspired with her to subvert a system that didn't want to give her standing. A partner outside your field can't do much for you, and a more selfish collaborator wouldn't.

Who knows what polite fictions are being told about people now that will be reframed by our grandchildren, assuming that scholars can find any of it. If I had to guess it will be neurodiversity. Probably/hopefully doing away with the Tortured Genius trope.


TBH, IMO this is all a non sequitur

My point is that the nature of digital technologies is such that information is far more ephemeral and closed off than it's ever been, not just for historians but for us, the people who are creating that information. We produce a lot more information, but control and long-term preservation is infinitely harder.

Your observations regarding the challenges historians face are absolutely true. But the effects of technology are entirely orthogonal to that problem.

After all, even if we had perfect digital preservation, what you say is still true, if only because subjugated groups are less represented in the digital discourse for many reasons, including socioeconomics, direct censorship/interference from power groups, etc.


"My point is that the nature of digital technologies is such that information is far more ephemeral and closed off than it's ever been"

I don't think I agree with that. For a lot of pre-historical research, the only thing we have to go on is fossils and rock formations. Our picture of dinosaurs is extremely ephemeral and extrapolated from a very small number of things in the grand scheme of history; I don't think we can even begin to imagine the sheer number of events that happened in the total history of organic life forms that resulted in the current state of things. But knowing those things is really important for a lot of scientific fields.

Edit: Also I guess I just don't see why digital information is really significant here. It seems just as likely for a marginalized person without safety to have a physical notebook or photo album get lost or destroyed, for example.


> Welcome to the world of digital archiving. It's an enormously complicated space, and even for just my own personal projects and content, I've spent a lot of time thinking about how to ensure things are future proof and can be archived easily.

For web content, in my eyes it's a pretty cut-and-dried case: if the authors of any piece of content don't want it to be archived and aren't forthcoming in making this archival a viable pursuit, then the content simply should not be archived. Alternatively, just get a static PDF of it for future reference instead of fighting an uphill battle against webpages and even software that's user hostile.

For your own content, however, I think that you're on the right track. Use simple file formats, have tested backups and ideally rely on stable, boring software that's also slow to evolve and change.


If we only had the things from ancient history that people wanted archived, we'd have a very different view of history. True, future historians will have a lot more material about our present, but that still shouldn't limit our intent to archive as much as we can. Only the future will know what is relevant.


> True, future historians will have a lot more material about our present, but that still shouldn't limit our intent to archive as much as we can.

In my eyes, unless you have a personal interest in the material that you want archived that would make you overlook most complications, the burden of preserving information, or even of making it easy to preserve, should lie with its authors.

For example, if you as a person want to make your voice be heard through centuries, then it should be upon you to use open data formats or even something as simple as Markdown, as opposed to binary .docx files or similar formats that will have significant problems related to reading them.

Furthermore, there is no guarantee of anyone actually caring about what you (or, let's say, I) might say, for example, in an offhand remark on Twitter about seeing some cute kittens today, apart from any such data being included in a larger analysis of bulk data.

I'm certainly not against the idea of archiving or preserving data, but it seems that most larger events out there will have a huge amount of coverage either way.


I don’t know why CI or a custom CMS would be the problem here; the output is still static HTML. IMO the problem compared to years ago is that the client-side dependency trees are so much more complex, involving third-party domains and such. But dependencies are still a problem even in Flash: it can load external JSON data, for example, and it’s a lot more difficult to sniff all that out than it is with JS.

Don’t get me wrong, the web is still miserable in this regard. But IMO it’s the mobile apps that are going to be a giant hole in history: we’ll just have some screenshots to look back on.


It isn't static HTML, it is an ever-changing tapestry of JS fetching this and that, on-demand loaded content, and so on. To archive it, one needs to snag not only the page and the load-time JS, but also the JS triggered by user actions.

This means you need to load the whole page, then hit JS triggered by page load, etc., then save the output to HTML.

Because some of those JS libs won't work in future browsers, others won't work if they can't phone home, and 100 other things can go wrong.

Archive.org is thinking about 100 years from now too, and JS content is a PITA in that respect.
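
A rough sketch of that "load it, run the load-time JS, save what the DOM ended up as" step, using Playwright purely as an example (the URL is a placeholder, and a real archive would also want to capture every request, not just the final HTML):

    # Render a JS-heavy page in a headless browser, let load-time scripts
    # settle, then save the resulting DOM as plain HTML. JS triggered by
    # later user actions is still not captured; this is only step one.
    from playwright.sync_api import sync_playwright

    URL = "https://example.com/interactive-graphic"  # placeholder

    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page()
        page.goto(URL, wait_until="networkidle")  # wait for load-time requests
        html = page.content()                     # serialized post-JS DOM
        browser.close()

    with open("snapshot.html", "w", encoding="utf-8") as f:
        f.write(html)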


Surely someone is archiving APKs, just for the fun of it? I can imagine that someday we'd have a web archive that embeds a simulation of the device where you can run that APK in your browser. Of course, if it's just a shell dependent on a running backend, then it's not really an application so much as an "application shell" at this point, and you ALSO need to be able to move around the data corpus to systematically capture all reachable application state. This is a hard problem in general, but pretty easy I think when applied to specific cases.


This is why piracy is so important.


I would imagine the Internet Archive is already storing them. APKMirror is fairly comprehensive. Wouldn't take much for the Archive to mirror them.

Sometimes, I wonder what the Archive stores in their non-public repositories. Just waiting for the day when it's safe to show the world.


Haha well that just blew my mind xD

I personally don't really care about archiving anything. Why is this such a big thing? Why would we want to archive all the APKs? There are too many of them lol. History? Memories? Who cares, we're all going to die lol. And with the qty of APKs, only 0.00000001% of them will ever be dug out of the archives.


It's definitely feasible to store all APKs. After all, Google does just that. It probably wouldn't even be a significant chunk of what the Internet Archive is currently capable of storing.


> Why would we want to archive all the APKs?

It is hard to know what APKs (or anything really) will be interesting to people 30 years out. Archiving all of them gives our future selves a chance at finding what they want to see.

Here is an example pulled from a discussion I once had. Someone was looking for a really obscure feature phone game from the late 90's. I had played the original game on original hardware and commented about how bad that game was and that they weren't missing anything by not finding it. Their reply was essentially "I don't care if the game is any good, I really just want to experience what it was like," which made sense to me after I heard it.

Anyway, the game was so bad at the time it was released that I doubt anyone felt like it would be worth revisiting in the future. And yet, 25-ish years later, they would be wrong.


Was it E.T.?


I'm currently hunting down an old Firefox Android APK. Why? Because that version had a "Print" option integrated with Android's built-in printer functionality, which was removed in later Firefox updates.

Why did they do it? No idea, and there is no official response in the forums, despite users complaining about its removal for over a year.

https://support.mozilla.org/en-US/questions/1301314


Your statement is dumb. With F-Droid and some Android 4.1 ARM Chinese netbook, you can have a portable and usable machine for several years to come. Backing up these APKs for future usage is essential.


I just got out of jail after 8 years. Went through my del.icio.us bookmarks. 99% of the sites are either dead or can no longer be viewed properly.


I would definitely read a blog (or just follow-up comments) about a technology-orientated person jumping forward almost a decade in time and what they find really different.

Are you a professional developer? Has framework churn been an issue?


Yes, been a web developer since 1994. Framework churn is THE WORST. Ugh. Really, nothing in terms of outside appearance has changed. The web looks pretty much identical to when I was locked up in 2013. There are a lot more ads, and way more video ads. The Web uses up waaaay more RAM and CPU.

Development, though, has been a bitch since I got out a couple of months ago. I want to use the latest frameworks, but I'm starting from zero again. None of the old frameworks even exist. I feel like I'm 5 years old again.

The code I've written in the last few weeks has all been very old fashioned! I just needed to get the job done.

I had zero access to the Internet in all that time. The biggest thing I found was TikTok. I fucking love TikTok. From the inside we would see the occasional video on the news, and it just looked like it was for making videos of people dancing, but it's actually fucking awesome for those of us with "neurodiversity".

One last thing: out of the, like, 1000 online accounts I had... only 2 were accessible once I got out. Wikipedia and eSnipe. Can't get into anything else. Don't have ready access to the email address used on a lot of them. The others have an email address on a domain I own, but I can't change the nameservers because I can't get into the account; my friend paid the domain fees while I was locked up, so I still "own" the domain. I can't get anything back as the court owns all the identity documents that I have and won't let me have them. In fact, when I asked a couple of weeks ago they said they have no idea where they put them all.

Oh, and everything is SSL now. That wasn't a thing in 2013.

Any other questions, fire away. It's a fascinating topic. I do feel like I teleported 8 years into the future.


> I can't get anything back as the court owns all the identity documents that I have and won't let me have them. In fact, when I asked a couple of weeks ago they said they have no idea where they put them all.

This is perhaps not what you were expecting to be asked about, but I’m curious nonetheless. So when you reported to prison you had to hand in your passport, driver’s license etc? And now when you were released they claimed that they have misplaced them? How do you get new documents, family members vouching for you or something like that?


Everything except my passport was taken either from my person or from my house. My passport had to be handed over in order to get out of jail. Actually my passport was handed over to the prosecutor years before I was released. The judge wanted the prosecutor to have it because it normally is held by the court, but they had sold R. Kelly's passport when they had it. I ended up spending three extra weeks in custody because the passport needed to go to the jail and the prosecutor's office refused to walk it the 100ft from their office to the jail and made my family come and get it and walk it over themselves.


This is all sorts of messed up. Sorry you’ve had to deal with this.


Surely the prosecutor's laziness keeping you in jail for an extra three weeks is something you can sue somebody about? That seems like a huge violation of your rights.


Yeah, I thought this over and over. It's actually hard to figure out exactly who to sue and for what. I've done a ton of litigation and at the end of the day this case just seemed too difficult to pursue, even though I had mountains of documentation and even the British embassy got involved at one point. I just have to suck it up like a bitch and accept I lost another three weeks of my life.


Did you notice the reduction of information density on web pages? I think that would be the biggest immediate difference. Old Reddit vs. new Reddit as one prominent example. The dominance of responsive designs now, as compared to the old separation between main site and mobile site, as another example. I guess hamburger menus weren't a big thing in 2013? I honestly can't remember. Maybe time to hit the Internet Archive and look at pages from 2013.

It's interesting thinking of the changes. I guess many of the current trends were well underway by 2013 so the current state would be different but not too different to you. At any rate, I'm glad you're out and hope you can sort the ID mess.


I actually like New Reddit, except for the advertising.

Design is much more responsive now, I'll give it that. Lots and lots of huge photographic headers. Hamburger menus? I'm guessing that is the name for the 3-line icon? They were pretty new in 2013 on mobile sites on my iPhone 5. And the 3-dot thing for "extra" options ... I don't remember that existing back then.

There is a reduction in information density.. some of it is warranted by an increase in white space which is good. A lot of sites now have super-intrusive advertising posted all the way through the copy, which wasn't common in 2013.

The sheer amount of data I burn through just browsing the Web.. that's a huge change. Even my mobile plan with 100GB of data gets burned in no time just browsing around. Sites are so, so heavy now. I saw that post yesterday about Discord having an enormous favicon file and so I can see that people just gave up trying to trim their code. I look at some HTML source now and I lose my shit because it is literally megabytes of bullshit. People were more careful with their code in my time.

One weird thing is that my brain doesn't know it is 2021 yet. I saw a show the other day where a woman said her son was born in 2011 and I did the maths and my brain said her son was two years old. This happens to me constantly. It's like my brain stopped counting time as soon as I entered the jail.


Strong suggestion: uBlock Origin ad blocker, including in Firefox on Android.

Other options include the Brave or DuckDuckGo browsers, which have similar features built in, but possibly less bookmark portability, etc.


Thanks. I have ethical issues with ad blockers. As someone who has built businesses based on ads it pains me to hurt these sites, even though the ad situation is really fucked up in 2021.

I just found out about Brave this week. It looks promising. I didn't know DuckDuckGo had a browser. I've been using the search occasionally. People say they can't switch from Google, but honestly their search quality is over-rated. They are really sanitizing their results more than the others recently. Even Bing is bringing back more results from really useful obscure sites than Google is.


Adblock Plus might be an acceptable solution for you, IIRC it has an "acceptable ads" program designed to only filter out the heavier or more intrusive ads.


Thank you for the tip. Just installed it. Open source, and powered by donations. Awesome.


Wait till you get a Raspberry Pi and install Pi-hole.


I find it really interesting that you love TikTok.

> it's actually fucking awesome for those of us with "neurodiversity".

Can you expand upon that a bit?

How engaged were you with social media before your sentence, and has that (or will that) change now that you've had some space from it?

> we would see the occasional video on the news

How does the picture that the media paints of the internet & social media compare to your observations now that you're able to use it hands-on?


I was really surprised about TikTok. I did a lot of social media marketing for nonprofits before I was locked up, so I was very into social media.

The thing with TikTok is that there are tons of videos helping, supporting and bringing awareness to issues such as ADHD, OCD, Tourette's, differing sexualities, gender identity, etc., that are really refreshing and have really helped me to understand myself and my neuro-problems that caused me to get locked up. No other social network has that type of content.

Facebook was huge in 2013 and still relevant. Now I barely use it at all. I think Facebook has some big problems. I don't want to say it will go the way of MySpace as Facebook has been much better at adapting than MySpace was, and Facebook has some way smarter people and more money. It might even come back into trend again in the future if their "Meta" projects take flight.

The media only really shows the goofiest or cutest videos from social media. Nothing of any real worth. So my view of TikTok was very fucked up. TikTok is different for every person that uses it since your For You page is based around how you interact with the videos it gives you. I get zero people dancing on my page, and a lot of really smart content.

Also, for the first five years I didn't have any access to the news, for security reasons, so I was very cut off.


> Also, for the first five years I didn't have any access to the news, for security reasons, so I was very cut off.

What did they get you for?


The War On Redemption sucks so much I don't even want to know anymore. The past was better when we left it in the past.


What is the war on redemption? I have never heard this expression before and cursory searching does not yield anything.

I have an inkling of what you may mean, but I'd like to know more.


It's a label I use to group together the ways that society continuously ruins a transgressor's life, long after they've paid their penalty.

By moving the goalposts further and further away, we've made redemption effectively impossible & wiped out the strongest motivations for people to learn and improve from mistakes (penalized by justice systems).


Amen, brother. 90% of the people I saw in jail were there because of this. Once they had a black mark against them society would not let them reenter and they were forced onto the margins where inevitably something would happen that would cause them to reenter the justice system. Sometimes something as small as not being able to get to an AA meeting because they had no transport. Maybe 25% of the jail population were those who had made a minor transgression while on an ankle monitor awaiting trial and were rearrested for it. The lawmakers in Illinois recently passed a law allowing you to be able to buy groceries while on an ankle monitor, but just before the law became effective it was rescinded.


I'm just curious about what warrants a 5 year news blackout for security reasons.


It's just standard policy at a lot of detention facilities in the USA.


That doesn't explain what warrants it tho.


Oh, nothing warrants it at all. The jails just like to control everything, even if there is no reason for it. Their policy is basically restrict everything and then dial back once sued.


Yeah, that would be fascinating.


Ask away.


Huh? JS can be cached and preserved just like anything else. Even the (presumably JSON or CSV) data could be cached, though I don't know if the Wayback Machine follows API endpoints by default


Browsers evolve and break older pages.

JS that requires requests as you interact with it will need an implementation of the server it uses (the subset and data of its endpoints).

How do you preserve that?


Browsers don't usually break older pages. The only time this happens is when you rely on unstandardized features. At least, I have not noticed any page breaking in the past 15 years except for the ones I built using unstandardized APIs.


- https sites embedding http images

- SameSite=None cookies (with bonus breakage that makes it impossible to use across older and newer browsers simultaneously without user agent sniffing)

- the planned Chrome alert/prompt changes already mentioned below

Browsers have been getting a lot more comfortable with the idea of breaking backwards compatibility as of late.



> except for the ones I built using unstandardized APIs.

Think you proved my point.


The changing behaviour of browser autocomplete and the new disregard for autocomplete="off" really harmed multiple large CRM/ERP-style sites I worked on, as passwords would get "helpfully" autofilled into completely wrong fields, causing data loss.

I actually still don't think there's a proper sanctioned solution to this, it seems to be cat-and-moused by web developers and browser developers every year or two.


I've had some success in the past using a WARC proxy: it will basically record everything that traverses the browser and can "play it back" on demand. So while it won't automatically download everything on a site, the idea is that whatever you visit and interact with in a session can be "played back" in the future at some point.
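
For a homegrown version of that, the warcio library can capture whatever a scripted session fetches into a WARC file that replay tools understand. A sketch (a real browsing proxy as described above records the browser itself, which this does not):

    # Record HTTP(S) traffic from a scripted session into a WARC file.
    # Per warcio's docs, requests must be imported after capture_http so
    # the underlying http.client gets patched.
    from warcio.capture_http import capture_http
    import requests

    with capture_http("session.warc.gz"):
        requests.get("https://example.com/")           # each request/response
        requests.get("https://example.com/data.json")  # pair lands in the WARC

    # session.warc.gz can later be "played back" with WARC replay tooling
    # instead of hitting the live site.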


I actually built a machine to do this, and wanted to use all my "old" spindle drives as storage, but I was and am unhappy with the offerings of replicated or RAID-like storage solutions. I keep waiting for Ceph or something to have a better NFS layer, but I might just have to have scripts that balance/replicate the WARCs however many times based on usage.

For those interested, I have 4x 3TB, 2x 10TB, 6x 1-2TB, 2x 12TB. The machine boots and sees all the spindles. ZFS would be ideal but cannot handle differing drive sizes, I guess. JBOD misses the replication requirement. Just doing mdraid on all the similar drives and having different folders or JBOD "wastes" too many spindles (at least one per group, so 4-6 wasted spindles by size alone!).

So I'll probably handroll something with whatever that triggered rsync program is, or cronjobs. lsyncd, that's the one.


How is that any different than wanting to archive a CGI website from the 90s with a URL structure like http://example.com/?query=foo? Unless there's an index page with links to all possible query values, or you can work out how to manually iterate all possible query values, there's not much you can do. This doesn't seem to have anything to do with JavaScript data visualizations specifically.


That URL structure is trivial for a crawler to walk and index. I'm not sure why you'd assume that there wouldn't be an index page; such a site would have all the desired links in the DOM, and the crawler just sniffs those out and visits them in sequence. There's no need to think that the links would somehow be 'hidden' from the user and have to be randomly enumerated...

Not only that, but a site of that era probably also has a sitemap.xml file which would enumerate all available public endpoints, specifically to make it easier for crawlers to index everything.
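
To illustrate how mechanical that walk is, here's a toy crawler using only the Python standard library (the start URL is a placeholder; a real crawler would also respect robots.txt, rate limits, and so on):

    # Breadth-first walk of same-site links found in the DOM, the kind of
    # crawl that works fine for old-style server-rendered pages.
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    from urllib.request import urlopen

    START = "http://example.com/"  # placeholder start page

    class LinkParser(HTMLParser):
        """Collect href values from <a> tags."""
        def __init__(self):
            super().__init__()
            self.links = []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    seen, queue = set(), [START]
    while queue:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url).read().decode("utf-8", errors="replace")
        except OSError:
            continue
        print("archived:", url)
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)
            if urlparse(absolute).netloc == urlparse(START).netloc:
                queue.append(absolute)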


> I'm not sure why you'd assume that there wouldn't be an index page

I’m not assuming either way. I’m just pointing out that either type of web site could choose to have an index page or choose not to have an index page.


If it's view-only, request-response replaying could be a viable option. Browser software can be emulated.


Maybe a good example is the original Nintendo (NES) emulators. New gaming consoles can't play those old cartridges, but we have a virtual layer that can. The same holds true for browsers, OSs, etc. It does create a pretty long chain of dependencies, though.


Ignoring the network point I made above, it'll be a monumental effort to get there.

Best we can hope for is virtual machines, and archive files targeting a specific browser version on specific virtual machines.


If it's dynamically updating based on a database of information that's not shipped to the app in its entirety, you either have to hope you've somehow seen and preserved all the data from exploring the app, or accept that some data may be lost.

> presumably JSON or CSV

That's presuming a lot. Even if it's accurate for most/all NYTimes infographics today, it doesn't mean it's accurate tomorrow, and it isn't accurate today for a lot of other sites.


> If it's dynamically updating based on a database of information that's not shipped to the app in its entirety, you either have to hope you've somehow seen and preserved all the data from exploring the app, or accept that some data may be lost.

Well, yeah, that's true of all normal websites too. That's precisely what web crawlers are for. If there's no index page that links to all pages, or some way of iterating through all the pages, you wouldn't be able to exhaustively archive any web site.


> Well, yeah, that's true of all normal websites too.

Not exactly. While you may miss data that isn't requested specifically, you can crawl the site and get most/all that is accessible through links at least. Stuff only available through search results won't show, but if it's discoverable through browsing, you can get it.

The same can't necessarily be said for custom interfaces that are JS heavy, possibly with non-link click actions, custom sliders, a graphical representation of a map that expects a click on a region, etc. An old style page that lists all the regions (like states, or counties in a state), or even that has a dropdown in a form? Those are much easier to crawl and archive.


Sure, that's fair if they don't have a single call that fetches the whole dataset. Though I'd think an article would often be covering a specific, bounded dataset to make its point, and wouldn't need to query a table of indeterminate length


We'd hope. Sometimes weird choices are made, or even not-so-weird choices (like if some site in some other country lifts the whole thing and presents it as their own) that cause sites to choose to be a bit harder to scrape than you would assume.


My guess is that it only requests data that it can statically parse (e.g. HTML attributes and tags) and archives that. Anything more complex would require using an actual browser (either via Webdriver, a custom build, or a pile of hacks that implement something identical to one); and would have problems with adversarial content and so on.

I say this because I know that the Wayback Machine didn't archive multi-load Flash files. That would require parsing SWFs and executing their embedded Action/ABC tags, which requires writing something equivalent to Flash Player. SPAs aren't much different in terms of archivability than all-Flash websites were.


It's going to be like Java is now. You have to find the exact right date for the right runtime environment to get your JS to execute properly. And that is quite a task. Complex toolkit JS is generally not forwards compatible for more than a couple years.


That's... not remotely true. A built JS bundle consists of least-common-denominator JS that in theory should continue to run ad infinitum. "Don't break the web" is a mantra among browser devs.

Rebuilding the bundle from scratch might be more complicated. But you don't need to do that to preserve it.


Uh... name one toolkit or platform or library that became incompatible with the JS engines after a couple of years? JS inherits the Web property of extreme backwards-compatibility. Breakages do happen, but they're extraordinarily rare.

Unless you mean something different by compatibility? Sure, you won't be able to mix wildly different versions of libraries because their APIs change. But I wouldn't call that a "runtime environment".


Damn, I miss being able to drop a URL into HTTrack and then just having a whole website locally archived with everything working.


Could one argue that sites that can be backed up in this way (or via "Save Page As...") are probably worth being backed up, and sites that cannot, are not?

I'd be willing to hear counter-examples, for sure.


Back in the day I did that with Slashdot and some Spanish clone too as I didn't have internet at home.



I've always been of the opinion that allowing third-party content to be embedded in an HTML page was a mistake. URLs within a page that embed content should have been relative to the page itself, never to another host. Of course the current approach is much more powerful, but archiving, bitrot and security would all have been a lot better served if everything got served from the same host.


I have to point out that Ruffle falls short on the Flash games that used certain animation or ActionScript features. It's going to be a challenge to fully recreate Flash in a supposedly secure manner. I am also wondering if Flash was always bound to be resurrected and if the company wanted to let it die a natural death or just put it out to pasture.


Just wait for the first big JS Framework which uses the canvas for everything.


That kind of thing makes indexing a nightmare but it's not any more difficult to archive than .exe or .swf files. Sure you need a "player" but that's true of PDF and OOXML and people don't really complain about those.


In a PDF or a normal program there are clear semantics for what is text and what is something else. On a canvas, everything is just made up of pixels. You'd need OCR software to detect what is what, and it won't ever be 100% correct unless you use only text and fonts which are made to be recognized by OCR software.


I've implemented a text editor with screen reader support using the HTML canvas element. Every graphics interface is just a canvas; what is sent to the screen is just data. The nice thing with HTML is that the data is human readable.


Flutter already does it, no?


At least the artifacts can be saved and distributed. Things like iOS apps have a short shelf life; hell, it only takes a year or two before they can't even be compiled again without modification.


I don't see how JS, CI/CD, or CMSes impact archivability, but something like dependence on API availability instantly ruins archivability.


IIRC IA archives full WARCs using a headless browser so replaying all the requests will be fine. The main issue will be future browser compatibility with old HTML/CSS/images/JS. Also incompatibility between Chrome and the headless browser.


Interesting (semi-related) video here from the graphics editor of NYT: https://www.youtube.com/watch?v=860d8usGC0o


This goes against the idea that software is "art" in the sense that most works of art are created to physically endure over time.


That's just not true. Non-recorded performance art is still art.


The statement is qualified ("most works of art"). Non-recorded performance art is indeed a form of art, but, IMO, it does not comprise "most works of art". I should probably have further qualified the statement to works memorialised in a tangible medium. Programming usually implies "code" that is recorded to a tangible medium. It is usually expected that the "code" may be looked at again after it is recorded.


Readers please note my response beginning "if you want to bring formal logic into this..." makes a lot less sense now and cannot be edited at this time. The original version I replied to before editing was:

> The statement is qualified ("most works of art"). Non-recorded performance art is not most art. Hence, the statement is true.


I mean, if you want to bring formal logic into this with "(Therefore), the statement is true", then I suppose we can go forth, apply the rules of formal logic, and see what happens. Perhaps I make errors in my analysis and someone can set me straight.

It's very strange for you to be that pedantic about the supporting clause without being at all critical of the independent clause. You defended the supporting clause but completely ignored the primary/independent.

Independent clause: "This goes against the idea that software is "art"

Supporting clause: "...in the sense that most works of art are created to physically endure over time."

I was arguing against the primary clause, as it was immediately obvious to me at first read that superficially combining the two clauses yields a "Fallacy of existentialism" (more accurately, it violates the rule that anything distributed in the conclusion must be distributed in at least one premise...but the less correct short quote is easier for readers of this comment to DDG). Laid plainly:

A: "Software is not created to endure over time."

B: "Some works of art are created to physically endure over time."

C: "Therefore, Software is not art."

That's not very logical at all.

In fact, upon reflection you'll find that I was not arguing against a "straw man" version of their original 'weasel-word' assertion. Instead, before replying(!), I purposefully removed the logical fallacy in an attempt to "steel-man" their argument. I figured that if I could show that a charitably non-fallacious argument of the same character was horribly flawed, that most readers could intuit that any weaker versions of the "steel-man'd" argument would also be some kind of wrong.

On a personal note, some unorganized thoughts I'd like to share which I might be convinced to change my beliefs about:

I feel that people don't "get" that one can't just add weasel words to magically make a bulletproof argument with instant "gotcha"-trap potential against anyone who tries to rebut their bad/unthoughtful assertions. It's hard (for me) to find examples where weasel words don't do at least one of the following:

- Weaken the argument to the point of saying something which is still 100% true and specific, but so trivial/impactless that it doesn't matter in any meaningful application. "Some children receive higher test scores after switching from drinking Pepsi to drinking Coca-Cola".

- Create a tautology (a non-falsifiable statement that literally adds nothing to any discussion). "No, really! It very well might be possible that Tupac is still alive!"

- Create a logical fallacy which means that the statement is meaningless. "This goes against the idea that software is "art" in the sense that most works of art are created to physically endure over time."

Hopefully this write-up helps other people think, read, and comment more clearly because it took me 40 minutes to craft a complete response to a couple of 20-second Markov Chain comments. The asymmetry is real.


This is why I still try so hard to buy physical copies of games, and if there is a GOTY Edition, to have the DLC be on-disk, as opposed to being a bunch of download vouchers.

I wanna be able to play the freaking game in ten years without worrying about whether the game/console's online service is still available to do some mandatory authentication or version check.


Unfortunately, a lot of new releases need a 0-day patch to be playable. It's basically an anti-piracy/anti-early release measure.

So if you really want to preserve a game on disk, you pretty much have to pick up another copy stamped at a later date to include all of the patches.


For non-Nintendo Switch games, I buy from GOG whenever possible. It does require different taste in games (I grew up on Quest for Glory, Space Quest, and King’s Quest). You get a tried-and-true game, DRM-free, usually for dirt cheap, and it probably doesn’t require an RTX video card. It’s not for everyone, but it’s worked great for me.


Calling it anti piracy is a bit of a stretch. It's just that companies realized that the printing of the physical media is no longer the point of no return from a development standpoint.


GOTY editions rarely need 0-day patches.


Right, that would be the "copy stamped at a later date which includes all the patches" version I was mentioning that you'd want to pick up if you own a new release copy.


> This is why I still try so hard to buy physical copies of games

Aren't these also subject to disk rot?


It's certainly a possibility, but anecdotally, I have audio CDs from the late 80s that still play.

Meanwhile, I have games I played 3-4 years ago on the PS3 that are severely hampered because the online components no longer work--the servers were either shut down, or in one case the company completely went under.


Disk rot is slower than online service rot. The other day I tried FC3 blood dragon and it had an error because they shut the servers down. It hasn't even been 10 years.


I stopped this the day I had to try to emulate a DOS environment to run a simple game.


It is refreshing indeed to see a move like this. Too many times, companies have just obsoleted products, turned off the servers and/or yanked the app (or had it yanked for them by $GATEKEEPER) needed for them to function, and left the buyers in the lurch. Speaking for myself, it's a huge factor in my disillusionment with tech. It was also the reason I've never bought a VR headset in the first place, even before Facebook wormed its way into Oculus.

Is it too much to ask for vendors to do the right thing?


It's pretty likely they will; they'll just have to crack them first. Pretty much everyone does it with retro consoles today, since CDs are inconvenient and expensive for the core audience that wants to play video games, and DRM is always breakable. It'll probably be easier though, because it almost assuredly won't require you to take out a soldering iron, like you have to with most legacy consoles.


As systems get more complex, cracking/emulating them takes more effort.

We already see this in old stuff. DOS games run great in dosbox, because the IBM PC was pretty simple. Same with the gameboy, snes, etc.

When you get to more modern things like Windows 98 games, PS3 games, xbox360 games... Most of those are much harder to emulate/crack/archive. They have more complex copy protection schemes, interact more deeply with their OS and hardware, and emulating them is generally more effort.

The next gen of stuff will interact with closed source now-defunct servers. Re-implementing those servers is very much possible... but a massive amount of effort. Effort that won't happen for most products ever.


I think you have a few reasoning errors in your comment:

The PS3 and 360 aren't actually harder to emulate because they have more complex copy protection; they're hard to emulate because they're very novel systems (hardware-wise) and developers had to use all sorts of tricks, which is not the case for their successors.

Meanwhile, the PS4 is literally just a PC and already has a pretty good emulator (if early), because the PS4 has desirable exclusives. The author of it only started writing it a couple years ago and it's already booting commercial games and has a handful playable; way faster than old emulator development was! The Xbox One is literally just an NT PC and lack of emulation is largely because there's not really a point to, yet; it's just a PC and has very few exclusives.

The Switch is literally just a phone with a controller and had its first emulator booting commercial video games within the first two years, because it had desirable exclusives.

The Quest is literally just a phone (even moreso, because it's literally Android and even their window manager is just a layer over Unity). It isn't emulated or cracked because it has no really great exclusives and Facebook allows as much piracy as you want.

Windows 98 games are actually pretty easy to emulate; very little at all doesn't work with QEMU out of the box, and that heavy-handed approach probably isn't necessary for the consoles of the future; the PS4 has a great emulator that's basically just a compatibility layer like WINE is, because again, it's literally just a PC.


> Xbox One … lack of emulation is largely because there's not really a point to

I thought it was because no one had actually cracked the DRM yet.


Both have the same root problem: there's no incentive. There used to be reasons to crack consoles, and there still is for many of them, and eventually there might be an incentive for the Xbox One, but there's no reason to do it right now and there never really has been. Pretty much all of the good Xbox exclusives are available on PCs as well (albeit some only via UWP, which had its copy protection broken a long time ago), and almost the entirety of its library is available on either the PS4 or PC, both of which are solved problems. There's not a reason to bother with it while Microsoft's still pushing firmware updates, so there aren't as many attempts.

It's also worth noting that plenty of emulators only work with homebrew titles early-on; a lack of a crack for the copy protection wouldn't in itself prevent emulation.


There's tons of incentive. There are over 50 million Xbox One consoles out there, so there's a giant market of people who would love to not buy any games for the one-time cost of ~$100.


No, you're missing the point. People who want that can buy a PS4, which has always been cheaper, has more interesting exclusives, and as a result has had its copy protection broken since nearly launch.

There's no point in doing it with the Xbox, because the Xbox has no exclusives anyone cares about and is more expensive than a device that sold tremendously better and is cheaper.

This is the same reason people develop private servers for MMOs, and the exact same reason they don't bother doing so for consoles if there's a better edition on another platform.

The niche of "cheap piracy box" has been filled, and the only way for people to have an incentive to hack on the Xbox while firmware updates are still going on is if it suddenly starts getting big exclusives now that it's EOLing.


I'm not missing the point, I just don't agree with it.

There is an intrinsic economic incentive in breaking the console's security, because there's an untapped market of 50M devices out there already. If I have an Xbox One already and am into the idea of piracy (perhaps I bought my Xbox One near launch expecting the same kind of piracy scene the previous Xboxes had), why wouldn't I spend the cost of a modchip?

There's a market for that kind of cracking, and it's only because of the stupidly good job Microsoft did on the security that you're not seeing a homebrew or piracy scene (and thus not the seeds for an emulation scene).


The PS4 has a PS-specific graphics API.


This is probably beneficial, as it provides a clean interface for emulation writers to target.

Part of the intent on getting developers to use these APIs as much as possible is to make forward-porting / "legitimate" emulation of games easier.


Ideally you could map API calls to Vulkan but it would still be a huge amount of work.


Like Wine with Win32 and DirectX->OpenGL. 5-10% performance loss on translating is nothing, as the PS4 was low-end hardware for its day.


We need a law to require the removal of DRM after 10 years. That's more than fair for how fast hardware and games currently move.


I don't think we need a law, we just need businesses to stay greedy.

DRM licenses are kind of expensive and it's hard for publishers to justify the added cost of DRM after the initial release, when they've made 80% of the revenue for the game. So they often (though not always) get patched out eventually.


When a publisher has made the majority of the expected revenue off a property they're pretty unlikely to want to donate some more dev time out of the goodness of their heart. I think historical trends have also shown that DRM is almost always left in place - so I think the empirical data is pointing strongly at a lack of any organic motivation to de-DRM any products.

Most of the time an anti-DRM patch gets rolled out it's either a company/individual that has strong personal feelings about DRM (i.e. somebody like Stardock) or else it's a patch that the devs wrote way back when to make testing in some environment easier that they kept around and valid out of the goodness of their hearts. I think in almost all cases it boils down to someone who personally disagrees with the prevailing business opinion that DRM is good and actually has enough political power in the company to force their opinion onto the business at large.


Depends how the DRM license is written.

It could easily be a perpetual license for a given title. Then there is no incentive to remove it.


It could be perpetually licensed. Sure.

But there's a business case to be made for patching out DRM very early in the product lifetime. The longer a product is protected by DRM, the more likely the DRM is to be cracked, and that crack could be applied to new releases. It doesn't make sense to put a new game release at risk just to keep protecting some five-year-old game from piracy.

If you're selling DRM, you want to disincentivize publishers from using your protection for too long. And if the DRM is built in-house, you want it to keep functioning for a number of titles to amortize the cost of development.


>because the IBM PC was pretty simple.

You wish. NFS II for DOS was a HUGE task to run under DOSBox. Linux had it much easier, with Wine for W9x games and DOSEmu for DOS games.

OFC games for the 486 and below ran pretty well on DOSBox.


This is precisely one of the key reasons I stick with Nintendo consoles for gaming. Sure, they sell plenty of connected games and online experiences, too, but I purchase all new games as physical "cartridges." This way in 20 years my son will be able to relive Breath of the Wild and a very large library of other Switch classics should he so desire.


Honestly, it feels like there should be a law about drivers and source code being made available after the end of life/support for any product, much like there should be right of repair laws.

Anything else is essentially like renting hardware and letting the manufacturer forcefully make it into e-waste whenever they like.


I'm not sure if there's already a term for this, but the "lack of permanency" is going to be an absolutely massive problem for future genealogists.

Here's an example of how genealogy research looks in the modern day:

* Newspapers (Headlines, Obits, Engagement/Marriage announcements)

* Handwritten letters

* Census records

* Physical photographs

* [Depending on century, location, and religious denomination] Baptism records

* Family Bibles

* Military enlistment paperwork [assuming that it wasn't destroyed in a fire.]

* White & Yellow pages

Much of this physical information has been digitized, but information that was born digital is precariously susceptible to complete loss once a website goes offline, a hard drive fails, or any number of other things happen to it over the centuries: war, the elements, material shortages, etc.

There's other sources, of course. A big one is first-hand accounts of the past, but even such verbal interviews should be conducted with objects or photos to help jog the interviewees memories of the event.

The majority of how personal memories are being stored are on phones, in hard drives, and other various media that are susceptible to bit rot, proprietary encryption methods, technology or social media platforms that have already gone defunct. This is a trend which is quickly increasing too. If some serious solutions aren't proposed and adopted very soon, it seems highly likely that there's basically just going to be a giant void from the late 90's to at least the 2030's.

Heck, just think about how 10 to 20 years ago, the most common way to get a PC game was on CD. Now it would be an inconvenience because very few people even have a headphone jack, let alone a CD/DVD drive.


I made the mistake of buying a Chumby clock (with firmware signed by Sony, which likely maintained a VM somewhere to boot their clocks until they decided it wasn't worth the effort) without realizing that it needed an internet connection to even boot and show the time of day.


> I think about how we can enjoy booting 30 years old retro computers and how the next generation will not be able to do the same because of locked down hardware.

I worry about this too. It's been a great experience collecting and _using_ devices that were made a while before I was born. It's sad future generations might not be able to use the tech I'm enjoying now.


This is part of the reason I'm excited about Urbit - it's the only project I've seen that has a hope of resolving this.


Do you mean locked-down software?

Do you have examples of hardware that the next generation might not be able to boot 30 years later?

I'm curious and ignorant :D


Lots and lots of IoT requires internet access and servers to stay up and running even for initialising a device after a reset. Some require these servers to control the device once it's set up. I already own a smartwatch that cannot be enabled because the application that used to initialise it has patched out support for that model.


I was thinking about the Xbox One. Someone opening a shrink wrapped version in 30 years will have a useless brick because it needs to connect to Microsoft servers for the initial setup. I am not exactly sure what it does during this first exchange and how much of it is hardware/software but I'm pretty sure none of it is done in the final user's best interest.


Carmack is a legend in so many ways.


>I think about how we can enjoy booting 30 years old retro computers and how the next generation will not be able to do the same because of locked down hardware.

This seems pointless. My old phone is obsolete. There is literally no reason I would want to use it over my current phone. I couldn't care less if it was bricked.


Remember setting IRQ with dip switches on your 300 baud modem....

Computers are becoming (already are) 'black boxes' -- non-serviceable by the mainstream.... Thank Tim Apple


Massive props to John Carmack! We also need laws to protect consumers when they don’t have an enlightened champion on their side.

Part of the reason I buy Apple devices is because Apple is unlikely to get acquired or go out of business for the foreseeable future. I wish there were viable alternatives. Android devices are not an option for me, as they are more likely to be abandoned and/or contain Google/manufacturer/carrier malware, and I need to use banking and work apps.

In that same vein, I have a preorder for a Framework Laptop because, at least for actual computers, I have the option of not splurging on a non-upgradeable, non-customizable Apple device.

We really shouldn’t have to wait for Librarian of Congress-granted exemptions, which can be rescinded at any time and are meaningless with locked-down devices anyway.


Oddly enough, this is why I stopped buying iOS devices. With my old Android phones/the one Android tablet I got I was able to install something useful and pared-down once they became older and unsupported. They work fine as readers/browsers on my home network or as "fancy" remotes and local-network media players.

The only old iOS device I haven't sold/tossed is an iPad 2, which is unbearably slow for most things, and I can neither downgrade the OS nor install some lighter alternative. Plus it needs a new 30-pin power cable, but I haven't gotten around to buying one because all the other old devices just use one of a pile of USB cables I have stashed around.

As far as I'm concerned, at least in terms of repurposing older devices, iOS is like the old consoles where it's essentially stuck once there are no more "official" updates. If it boots Android, there's a decent chance I can install something custom on it years later.


Yeah. I had an LG G Pad 8.3 (2013) that I used as recently as last year, until the screen separated from the body and started to fail as a result. I was using it with LineageOS (I don't remember the exact version, but it was either Android 9 or 10), which also allowed tweaking the scheduler so the CPU would always run at pretty much maximum speed. The battery life was obviously bad, but it would still get through a day of reading/watching videos (which was pretty much my main use for it), and thanks to the tweaked scheduler the UI was not that laggy (it was laggy, but no worse than an actual low-end device).

Also, unlike iOS, Google still pushes updates for important parts of the system (Chrome, WebView, etc.), so at least for casual browsing it wasn't horribly insecure.


This is exactly what I did. I imported my G Pad in 2013. It is still working as a PDF reader with its screen separated.

I am planning to do a DIY fix and then donate it. There are many kids here in India who need a digital device for their online classes but have to use a small-screen phone.


Hm, I have an OG iPad that runs a few excellent apps (TouchOSC, Animoog, Samplr) well enough that it‘s still a worthwhile device with just those. TouchOSC in particular should get massive props. As of last year, it was still able to update and is solid and super responsive. The App Store still works and I can download the last compatible version of any app I own.


Except for Safari, which is tied to OS updates. And of course Apple don't allow other browsers on the store so now you're stuck on the outdated browser.

I know my old iPad has issues with Home Assistant's web interface for example.


Yeah, Safari barely works on it, which is maybe half because it's outdated and half because websites have gotten a lot heavier since then. The iPad had a tiny amount of RAM by today's standards, something like 512 MB, or even 256? iOS was aggressive about killing apps (especially Safari) when RAM ran out. Not sure if it can even swap, or could back then?


Apple allows other browsers in the App Store. I have Chrome on my iPad.

Or do you mean the underlying rendering engine?


This doesn't invalidate your argument (which I agree with) since it's unofficial and relies on exploits, but you should be able to downgrade your iPad 2 thanks to its unpatchable bootrom exploit and the fact that Apple apparently has to keep iOS 6.1.3 and 8.4.1 signed for that model specifically https://github.com/LukeZGD/iOS-OTA-Downgrader/wiki/How-to-Us...


I run iOS 8 on my iPad 2 (downgraded from iOS 9, which made it super slow). Unfortunately, Apple doesn't let you download old versions of apps unless you installed them at the time, so the device is restricted to "what I used it for then", which limits things a lot.


All valid points. But I’ve made the mistake of trading in or losing all of my old Android phones which had unlocked bootloaders.

Also, installing a phone OS is more intimidating to me than, say, putting Debian on a laptop/desktop. I’m happy to go into a BIOS, play with voltages, overclock, etc. but please don’t ask me to hold some combination of buttons to start a process that can easily brick my phone.


As someone who has loaded custom roms on more than a couple of dozen Android devices over the years, it is really hard to brick a device installing an OS. The closest I came was an Xperia Play years ago, but even that was recoverable after pulling the battery.

These days there are some decently high guardrails for not bricking a device, as long as you follow installation instructions.


As long as that Android device has an unlocked bootloader, or an exploit to bypass the bootloader. The same thing can be said about iOS devices (other than perhaps a lack of 3rd party operating systems to install in the first place.)


Are current Apple products still worth that trust?

I had to repair an old MacBook, and while Apple is still in business, they won't take repairs for out-of-support products. Official repair shops seem to have some access to old parts, but I don't know how long that will last; my laptop got its part from another junk laptop.

Then the current crop of Apple products have parts that third-party shops can't replace, especially anything touching the secure enclave.

I kinda doubt Apple being in business will help on the support side for 5- or 10-year-old products.


I agree, and for non-work computers I’m trying to move away from Apple. But for phones and tablets, I don’t see any viable alternatives (yet).


Apple is one of the only companies I dare buy content from, because they are too big to be acquired. I (begrudgingly) wrote DRM code to encrypt all the music for one of Apple's competitors; then they got acquired, all the music and video became just dead bytes on the users' hard drives, and there were no refunds.

Still, Apple is known to lock people's accounts for a variety of bogus reasons and then all your purchased content is lost.


This is tangential, but I really wish that there would be legislation, valid retroactively, to enable old unsupported devices to still be utilized. I think this goes somewhat beyond right to repair?

Take for example digital backs for medium format cameras - these things are built in low numbers, with high end FPGAs and camera sensors, with JTAG interfaces ready to go and everything - but then forgotten about a few years later. Why not enforce the creation of some document on how one would build their own OS for it? Or how the bus from the sensor ADCs works? This all existed at one point internally, but now is lost, and most of these backs will slowly die and go to waste, even though they could easily be repurposed or repaired.


> This is tangential, but I really wish that there would be legislation, valid retroactively, to enable old unsupported devices to still be utilized.

Not at all. This is precisely on point, and directly intersects with the Right to Repair as well. It's about damn time we, as a society, put a stop to black box devices over which we have no ability to inspect, repair, or repurpose after the vendor decides to end support.


Could not agree more.

At the bare minimum a vendor should not be allowed to sell a device that has digital locks if the user is not also given a copy of the keys.

You can lock the device, but if I don't get a key at time of purchase, then I don't own the damn thing.


I'm trying to think of a situation where this is objectively bad, and I'm having trouble coming up with one. The best I came up with is purchasing an elevator, which has a key for firemen. I wouldn't really want to just give everybody a copy of that key. But on the other hand, you can buy that key on Amazon for $5. It would maybe help people think about security a bit more if they bought a TSA-approved lock and it came with a TSA key with a little warning that read "Note: this key opens any TSA-approved locks, please only open your own baggage."

One possibility is around DNS. A public/private keypair is basically a lock and a key. If you can't put ANY public keys on my device without giving me the private key, HTTPS is going to be problematic. Software updates become a little scarier as well, since a man-in-the-middle attack becomes MUCH easier to pull off. But perhaps the answer there is, like DNS on a desktop computer, to simply allow the user to edit those local keys. As long as there's a "Yes, I am also cool with installing unsigned software updates" option, I don't see a problem.
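
To make that concrete, here's a rough sketch of what "let the owner edit the local keys" could look like for signed updates. It's Python using the 'cryptography' package, and everything here (the trust-store path, the key format, the function names) is made up for illustration rather than taken from any real vendor's implementation:

    # Sketch: an update checker whose root of trust is a user-editable file.
    # Uses the 'cryptography' package; paths and key format are placeholders.
    import json
    from pathlib import Path
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

    TRUST_STORE = Path("/etc/update-keys.json")   # the owner can add/remove keys here

    def load_trusted_keys():
        # e.g. {"vendor": "<hex ed25519 public key>", "mine": "..."}
        entries = json.loads(TRUST_STORE.read_text())
        return [Ed25519PublicKey.from_public_bytes(bytes.fromhex(k))
                for k in entries.values()]

    def update_is_acceptable(image: bytes, signature: bytes,
                             allow_unsigned: bool = False) -> bool:
        for key in load_trusted_keys():
            try:
                key.verify(signature, image)
                return True                  # signed by someone the owner trusts
            except InvalidSignature:
                continue
        # the "yes, I'm cool with unsigned updates" toggle from above
        return allow_unsigned

The point is just that the root of trust lives in a file the owner controls, with an explicit opt-in for unsigned images, rather than in keys baked into the device that only the vendor holds.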


> I wouldn't really want to just give everybody a copy of that key.

Why would you give the key to everybody? Just give it to the owner... That's what I want. I shouldn't need to hack my own smartphone or have to solder a board to my Xbox to run my own code on it.


The reason Apple doesn't do this is because users will get deceived into providing those keys to malicious entities which will compromise their devices and everything on them in exchange for the promise of free games, or free in-game currency, or whatever.


> Thinking about situations [...] where this is objectively bad

Thinking here about a smartphone. Note that I'm explaining the current state of things, I am *not* excusing the state of things.

Directly for end-users, generally no real scenario where it's bad as long as they can enroll their own keys in a safe fashion preventing evil-maid type attacks.

Tangentially for end-users, locked devices are easier to make worthless for thieves. FRP on Android (or whatever Apple's equivalent is) keeps the device locked to a user account even after a reset. This is one thing that would become harder to implement when the root of trust can be manipulated on the device.

Then there's supply chain integrity for OEMs. This is the reason some Android vendors only allow unlocking when attached to an online account after a delay (e.g. Xiaomi). Some unscrupulous vendors would open the box, replace the system image with a malware-ridden one, and sell those to end-users.

Finally, there's somewhat a case for DRM and similar uses. The current implementations are built on the current "security" model, where it's security for the businesses first, then security for end-users last.

Still, I agree wholeheartedly that users should be in control of the root of trust, in a way that does not reduce their abilities to use their owned devices. Add to that that standards-based boot should be used. All the time. All devices.


I wonder, as we (slowly) march towards "greener" laws and more climate-conscious ways, whether some of this will tie into that.

I feel like you could get good traction on right-to-repair if it was framed around waste reduction and a cleaner future.

Which means we might not get there for a generation still, but these things feel related to me.


They're absolutely related. In fact, a lot of discussions around the Right to Repair specifically center on the issue of e-waste. For example, you'll find that all over Framework's website. From https://frame.work/ca/en/about :

> Consumer electronics is broken. We’ve all had the experience of a busted screen, button, or connector that can’t be fixed, battery life degrading without a path for replacement, or being unable to add more storage when full. Individually, this is irritating and requires us to make unnecessary and expensive purchases of new products to get around what should be easy problems to solve. Globally though, it’s much worse. We create over fifty million tons of e-waste each year. That’s 6 kg or 13 lb per person on earth per year, made up of our former devices. We need to improve recyclability, but the biggest impact we can make is generating less waste to begin with by making our products last longer.

Certainly, for myself, the right to repair is very much about ending the cycle of disposable products so we can create a more sustainable future.


Thanks for that link, it's a much better articulated version of what is in my head. :-)


> I think this goes somewhat beyond right to repair?

Definitely goes beyond it. This should be able to get traction as an anti-electronics waste policy.

Designs and relevant documentation should be packaged up and handed over to the Library of Congress (or alternate entity) as soon as any design goes into mass production. LoC may release them when the product is no longer supported by the manufacturer or the company has gone out of business -- which could be fairly automated on the LoC side.

One side-effect of this may be that companies will be incentivized to support hardware longer, if they believe that these designs have notable design elements that they do not wish to disclose.

I'd settle for something non-retroactive, since I don't think retroactivity would be technically tenable at this point anyway.

One of the cruxes here is things like GPUs and wireless chips, which are pretty unfriendly in terms of getting documentation, but even providing a subset of functionality would be great.


I agree. Electronic waste has only gotten worse over the last decade, with everyone getting a new phone every year, a new laptop, etc. Even if everyone optimally buys/sells used, the devices still end up in landfills because the hardware is no longer supported. Not acceptable anymore.


> I think this goes somewhat beyond right to repair?

Something like right to use would be nice. If you need specs or docs or source code for the software/hardware to be usable, then it should be provided. And IMO that should include not gating essential functionality behind online services unless it is inherent to the function.


What if we required lifetime warranties for everything? By lifetime I mean a human lifetime, not the lifetime of the device. It sounds crazy when thinking how it'd work out in practice, especially with electronics, but we need drastic action like this to respond to climate change.


This would completely close markets to new entrants.

As an inventor I can't financially hope to support a device for a human lifetime and break even, much less profit.

Right to repair and allowing these things to be legally opened and hacked by the end user is the right way, not burdening every manufacturer with unrealistic support laws.

For an example, look at military equipment costs. 20 year support is often built into those to give you an idea of the cost of this. Spoiler: things will cost 10-20x what you think they will for a business to hope to profit.


Of course, for military equipment there are also other cost drivers - more demanding specs, fewer units produced to spread the development cost, etc.


This is true, but you may find that if you had to support something (and keep it relevant) across 20 years those specs and requirements may look much the same whether they are military or civilian.

For example, the amount of bending of the case of a device has to be DRASTICALLY less to allow effectively sealing contaminants out for 20 years vice 2, as well as the seals themselves being an order of magnitude better if there is no servicing involved. For most military equipment we have all of those AND regular servicing, something that consumers would absolutely revolt against nowadays.

One other thing people fail to realize on the electronics front: many of the chips in these older systems are getting very difficult to come by. About 10 years ago I was involved in repairing F/A-18 avionics, and one specific chip in that system was extraordinarily important. It was a radiation hardened 80286 CPU, and had a single production run for the entire budgeted lifecycle of the systems it was in. Unfortunately a design flaw in the power delivery systems meant that the CPUs were being destroyed at a rate roughly 4x as fast as expected and they had to figure out what to do. This specific chip was one of the many reasons (but a key one) that we retired that airframe.


I've worked with EEs who spec'd a $300 high-quality motor when a $20 one was powerful enough but didn't have the lifetime specifications, because the extra $280 was less than the cost of the service call to replace the motor.

When you start designing things to last a long time AND be very reliable, the cost increases very quickly in ways that are not always predictable.


When I was at Lockheed, they would buy $2,000 Panasonic Toughbook laptops, which were then sold to the military for $20,000 - not including the software licenses for our RFID product.


And if one of those Toughbooks broke, what did Lockheed charge to replace it?


Declare abandonment and open source all documentation and code. This should get you out of the human lifetime burden.


Yea I've been thinking about this. Open up specs, schematics, docs, and code, or actually support it "forever" (not literally, but a decently long time). It might be a bit extreme though. To provide some incentive, I'd consider the responsibility for recycling if you're unwilling to support usage. There is far too much throwaway crap.


> To provide some incentive, I'd consider the responsibility for recycling if you're unwilling to support usage.

Germany put the onus on the sellers here. Since 2016 sellers of electronic appliances with a store space of >400m^2 and online sellers with a warehouse space of >400m^2 are required by law to take small electronic appliances (up to 25cm max side length) and dispose of them properly, which usually means recycling, no matter if you actually bought the thing from them or not, and no matter if you buy a new thing or not.

These sellers are furthermore required to take larger appliances if you buy a new replacement appliance from them.

This service has to be provided free of charge (except for reasonable shipping costs).

In practice, almost every commercial seller of new appliances, even those who do not fall under the law, will voluntarily take your old appliances, at least when you buy a similar new one. Because if not, a lot of customers would just go to a competitor who does.

This spread to other areas too, where e.g. a lot of sellers will voluntarily take and dispose of your old mattress when you buy a new one from them.

The sellers do have to dispose of the appliances properly, which is usually also the least expensive option for them. Recycling companies will come and take that stuff for free from the sellers, because they make their money by stripping anything precious out of the stuff.

The area where this is problematic is non-commercial sellers and/or sellers of used stuff. But by law, municipalities have to take all electronic appliances free of charge, the drawback being that they do not have to provide collection/shipping. Getting your old washing machine to the recycling center can be a burden. In my city at least you can call them up and make an appointment for I believe 10 bucks. And they encourage you to tell your neighbors about the pick up time so they can put out their electro trash as well. My city also has about 40 collection containers all around the city for small appliances (up to desktop computer size). The one closest to me is about 7 mins by foot.

Of course, trashing perfectly fine electronic appliances may be a waste sometimes (but sometimes not, because these old things may be extremely power hungry compared to newer models), and a right to repair would be better, but at least it's a step in the right direction.


Yea, it's the same here. Probably most of the EU.

Thing is, most of the stuff I buy is online. Local retailers taking things for recycling does nothing to encourage the maker of a product I buy online to keep supporting it or open it up for end users (or their local repair services) to support it from there on.

Of course it's good that local retailers offer recycling, it's definitely a step in the right direction, but it's far from having the impact I wish we could have on longevity / support / (semi-un)planned obsolescence.


I don't want a warranty. I want my relationship with the company to end as soon as our transaction is complete. If something is worth repairing, I want to be able to pay whoever I want a reasonable price to repair it, regardless of what condition it's in or how it got that way. I don't want a company dictating how I utilize and maintain my property potentially decades after I purchase it, and I don't want to pay an absurdly high amount for something I'm going to replace in a few years just because somebody may want to utilize it for far longer.


Right to modify --- i.e. what the automotive industry has had for around a century now.

It's why you can still get parts (aftermarket, usually) for vehicles many decades old. I wish companies like Tesla weren't trying to change that, however.


I'm really glad for what Carmack did. It's rather sad though, that support was dropped for this headset after only two years. As a reminder, the Oculus Go was released more than a year after the Nintendo Switch.


It was obsolete at launch, though. 3DOF is "poison the well" territory and not good marketing for VR at all.


This is true. I’ve had both the Go and Quest2. The difference is between taping a screen to your face, and VR.


True. It also had a garbage Snapdragon 821 in it, even while smartphones had the Snapdragon 845.


VR is advancing at an incredible pace really, I don't think it's sad.


Can you give some examples? What was not possible and what's possible now?


It's honestly hard to describe to someone who's never tried modern VR, but it basically gives you full immersion. You can watch Netflix in VR, you can move around (if you have enough real-life space), you can grab and play with things surrounding you, it's just so close to real life it's mind blowing. I actually wrote about social experiences (specifically playing boardgames in VR!) within the Go[1], and you can imagine that it has evolved exponentially with the Quest and the Quest 2.

I'm not sure how easy it is to get a demo, but look around; maybe malls or some arcade places will have a headset you can try. Or you can spend the $300 to get the Quest 2 :) It's honestly not that much to get a taste of the future.

[1]: https://p1x3l.com/story/239/social-virtual-reality-and-the-o...


It's definitely immersive, but after a while I found it wasn't any more fun than normal gaming, and the constant setup was horrible. I still can't look at a headset without wishing ill upon the Oculus Home developers. Maybe I'll pick up another headset someday, when I don't have to fix the play area every single time I use it and can put the headset on without thinking whoever designed it has a grudge against people with glasses.


Also watch this to understand how immersive the tech is now xD

https://www.reddit.com/r/gaming/comments/qdcrxl/vr_but_taken...


In this particular case, it didn't have full tracking, you could only rotate your head (the 3dof mentioned in another comment). So even at launch it was behind the times. Full tracking is so much better, even required for VR to work.


Some of the latest stuff that's now possible is very small form factors, variable focus lenses, and correction for people who need glasses including people with astigmatisms.


The thought of Facebook having telemetry on real time eye tracking data for a future popular Oculus VR headset in a few years is horrifying.

I’ll never get over the sale of Oculus to Facebook.


Why? Can you rationally explain what you are afraid of them doing?


Eye tracking is biometric data. It can reliably be used to detect various neurological conditions.


This is about telemetry, not the direct eye tracking data. Also, I'm not sure how you could detect various conditions from just what you look at. That seems a little far-fetched / not reliable. Even if it could magically diagnose you with a condition, what would be the point of doing that? To sell you medication? To connect you with others who share your condition? Those things don't seem to be much to worry about.


Not medical condition, but your state of mind. The goal is to sell others the option to change your mind.

Everybody keeps getting surprised at how seemingly irrelevant data points can be used to segment/profile people with a shockingly high accuracy. It seems the only requirement is scale.


Have you considered how much VR is used for porn? Totally unrelated side note, but virtamate is um, probably risky for someone who feels like they could become obsessed with porn.


>Have you considered how much VR is used for porn?

And what does that have to do with eye tracking? I could see it recommend more content based off what you are looking at instead of having to manually like posts. Having a hands free experience sounds like a pro to me.

>virtamate is um, probably risky for someone who feels like they could become obsessed with porn.

Almost all VR porn sucks compared to the flatscreen counterpart. There's also a much bigger collection of 2D stuff that exists.


> And what does that have to do with eye tracking?

Nothing. But the Quest 2 especially also has four cameras that point out at you and your room. So it's the same theme: I don't really want Facebook seeing anything remotely related to this.

> Almost all VR porn sucks compared to the flatscreen counterpart.

Compelling argument.


>So it’s the same theme that I don’t really want Facebook seeing anything remotely related to this.

What's the issue of them anonymously collecting things like how big your room is, or what the average brightness of your room is?

>Compelling argument.

It's mostly subjective. The vast majority of it is blurry 3 DoF videos and proper 6 DoF content is limited.


Me neither.

Haven't touched their hardware since and won't; would pass on jobs if that becomes an aspect.


I'm actually pretty nervous about how few Carmacks we have.

Not at all saying they don't exist, but the job market favors engineers hopping around instead of staying at one place a while to become experts in things.


> the job market favors engineers hopping around instead of staying at one place

Does it? That's a fair description of the consulting space, but I don't think it's an accurate representation of the job market as a whole. When product companies hire software engineers, they tend to aim for the long haul.


You'd think. Most career advice I get is you need to jump ship to get an actual pay raise. Maybe stay 3 or 4 years.

Not enough to build up deep knowledge, at least compared to older engineers imo


I don't think it's necessary to stay with the same company a very long time to develop "deep knowledge". On my last team there were two engineers with about 20 years of experience. One had spent all of it at that company, and the other had worked at five different companies in the same space. The former knew more about the company's code and was an invaluable resource for "why are things the way they are?" type questions. But the latter often proposed more novel (to us) solutions, drawing on his experience from previous roles. Both were extremely effective engineers.


Being unable to get a pay raise from your current employer often boils down to one of two reasons:

1. You're not negotiating effectively -- a skill you can improve upon

2. You're not worth as much as you think you are (e.g. low margin industry, company doing poorly, part of a low impact team etc, or maybe you're just not very good at $whatever skill the company needs).

Only #2 is a good reason to jump ship and try your luck elsewhere. Over the years I've learnt how to negotiate a fair remuneration. It's not easy, and it can definitely be uncomfortable at first. But in the end, it's totally worth it.


3. Salaries for those already on the payroll are not subject to competing offers, and employees likely have outdated information on market salaries and almost always "anchor" on their current income. Employers have some competition when hiring, and are subject to some market forces on compensation (including "price transparency" on offer letters).

In the current environment, you're almost always able to get a better offer compared to any raise offered. In most organizations, a manager expends less political capital justifying a salary band for an open position vs. advocating for a raise.


I've played the "salary band" game too many times to believe this. Let's tie your salary to your title so that we have to promote you before we can pay you more. Rather than, say, having salaries reflect the actual market and having titles reflect people's actual roles in the company.

The only blame for being underpaid that falls on an employee is staying somewhere they aren't valued longer than they should. We owe it to our families to get paid what we can [sustainably] earn.


I think this is specific to certain areas of the software market. I'm in embedded systems and the engineers I've worked with generally stay much longer than 3-4 years. There's a former co-worker on my LinkedIn who's my age and she's been with the same company for over 20 years.

I quit my last job after 5 years and over my 30 year career, that's the shortest stint I've had.


A buddy of mine just joined Palo Alto networks and they gave him a 30% pay increase, as well as a $100,000 signing bonus, and then put him on a team who sold well and he got a $40,000 bonus a couple months after joining.


One of my friends stays at his current company where he has stayed for years, despite only getting low single-digit pay increases each year.


>I'm actually pretty nervous about how few Carmacks we have.

Just because they aren't celebrities it doesn't mean they don't exist.


Go is a solid piece of kit. No positional head or hand tracking, but the benefit is low weight and power efficiency. Basically, it's an all-in-one GearVR, which is a solid minimum viable standalone VR headset for lots of things, like anything that involves a large virtual TV screen, or spectating events in VR. Nice to see that it will be an open platform now, which will provide a nice baseline to target for apps that care about ensuring their content is accessible via cheap, open devices.


> Oculus CTO (and former id Software co-founder) John Carmack

I thought Carmack stepped down from being CTO a few years ago? [0]

[0] https://www.cnet.com/tech/mobile/oculus-cto-john-carmack-to-...


The sub-heading of your link says:

> Carmack says he's transitioning to the role of "consulting CTO" at Oculus.

So he can still be called CTO.


But that was also 2 years ago. I assumed by this time they'd have found a permanent CTO? Maybe not.


Why would the role of consulting CTO be non-permanent? It just implies a part-time approach.


Consulting CTO just isn't an arrangement I hear about very often. I'm surprised part-time (if that's what it is) CTO-ing works out long-term for a tech company.


Maybe it means "full time, but I reserve the right to not be CTO at a time of my choosing."


Well that just sounds like most US jobs (i.e. at-will employment).

And if the point is that you're clarifying with your employer that you're looking for an exit (even if it's not immediate) it probably means the employer (and maybe you are helping) is interviewing for your replacement with the goal of eventually finding it. Hence my curiosity that it's been 2 years and they haven't got a new CTO.


As a company with a strong focus on smooth 3D rendering I imagine it’s quite difficult to find a CTO who can top John Carmack, and if he’s willing to stick around then why rush it.


This is Carmack, so I'm just thinking it's a bit reversed. He's taking this job at-will and can cancel it at any time.


Well at-will employment already works in both directions. :)


I think he is working closer to full time on AI, and does the VR CTO thing separately.


Ken Silverman must be busy


I'm looking forward to the day he does it for Oculus Quest 1. I haven't used mine in a year because Facebook.


time to get the Oculus Quest 2 my friend!


Don't those things need a facebook account also?


Please tell me there is a way to get rid of Facebook on the Quest 2


I'm using https://github.com/basti564/Oculess which works well for removing Facebook. Caveats:

* most of the UI is broken

* you can only run untrusted sideloaded APKs

* some telemetry services have to be stopped after each reboot

* you might still need Facebook for initial setup (haven't tried it)
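
If you don't want to tap through an app after every reboot, the "stop telemetry services" step can also be scripted over adb from a PC. A rough sketch in Python driving the adb CLI; the package names below are placeholders rather than the actual list Oculess targets, so substitute whatever your tool of choice documents:

    # Sketch: re-stop telemetry packages over adb after the headset reboots.
    # Package names are placeholders -- check the tool you use for the real list.
    import subprocess

    TELEMETRY_PACKAGES = [
        "com.example.telemetry.one",   # placeholder
        "com.example.telemetry.two",   # placeholder
    ]

    def adb(*args: str) -> str:
        result = subprocess.run(["adb", *args], capture_output=True,
                                text=True, check=True)
        return result.stdout.strip()

    def stop_telemetry() -> None:
        adb("wait-for-device")                 # block until the headset shows up
        for pkg in TELEMETRY_PACKAGES:
            # 'am force-stop' should work from the adb shell without root
            adb("shell", "am", "force-stop", pkg)
            print(f"stopped {pkg}")

    if __name__ == "__main__":
        stop_telemetry()

You could run something like that manually whenever you plug the headset in, or hang it off whatever automation you already have on the PC side.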


There is this https://github.com/basti564/Oculess

Haven't tried it myself, and it has some caveats, but it does make it possible to use the Quest 2 without Facebook.


Just create one!


I'm going to give this a try later. One small fly in the ointment is that if, like me, you do not have and never want to have a Facebook account, it will not be possible to load this onto the device after 2022.

This is because sideloading via ADB requires putting the device in developer mode, which in turn requires an internet connection and an active login to your Oculus developer account. Oculus developer accounts have been deprecated and will stop working in 2023, after which time a Facebook account is required.


You sure?

“In part, the unlocking is an attempt to guarantee that Go hardware will continue to be fully functional well into the future, allowing for "a randomly discovered shrink wrapped headset twenty years from now [to] be able to update to the final software version, long after over-the-air update servers have been shut down," Carmack wrote.”


Yes, I am sure. Look at point 2 in the instructions and follow the link.

The whole process for putting an Oculus device in developer mode is incredibly fiddly and annoying. You need a verified developer account. You need an app running on a phone, which in turn requires an online login. You can't just do it from a desktop PC, and you certainly can't flip a toggle on the device itself.

Ideally, as a follow-up to this, they will release something that allows putting the device into developer mode without requiring any of that. As of now, that does not exist.


I can confirm this, I've done it with mine. Even if you do everything from the bootloader alone, as soon as you unlock the bootloader and the device factory resets, you're required to use the Oculus app to continue setup.

I assume you're supposed to unlock the bootloader and... I guess, wait for someone else to release a fully defanged system image?


John Carmack is a living legend. Seriously impressive career!


He's truly had an incredible run, but it's a shame he's let the finale be "enabled Facebook to mix eyeball tracking with advertising".


This assumption strikes me as ageist.


Look at that, he's younger than I thought. In my head he was in his 60s, where "transition from leadership to 'consulting leadership' at a major company" sounds like a step towards a quieter life.


Finale?


Does this mean you get a blob image where you can have root, or more like Carmack of old, GPLing / open sourcing the prev-gen tech? I'm asking basically: Is this free as in beer or as in freedom?


> Accessing or using the unlockable software (“Software”) is subject to the Oculus Terms of Service or, if you use your Facebook account to access Oculus Products, the Supplemental Oculus Terms of Service and Facebook Terms of Service (the “Applicable TOS”). For clarity, Oculus Products (as described in the Applicable TOS) include the Software. We provide the Software to you for your personal and noncommercial use only on your personal Oculus Products. Installing the Software voids all warranties, express or implied, applicable to Oculus Products. In no event shall Facebook, its affiliates or any of their respective directors, officers, employees or agents have responsibility or liability arising out of or relating to making the Software available to you.

https://developer.oculus.com/licenses/go-unlock-tos

Free as in beer


I hope they do the Oculus Rift CV1 next. I'm trying to sell mine because it's useless without my deleted Facebook account.


that would be dope - no chance though


I love the spirit, but is there any *successful* precedent?

Sure, everything is now open-sourced, but setting up the kind of infrastructure needed to keep the thing up and running seems non-trivial.


I don't believe it was open sourced. This is just a firmware that gives you root access and doesn't require any Facebook integration.


Anyone tried this on a newly unboxed, or at least a wiped, Go? I think I had to go through pairing and activation before enabling ADB on the Go, which the instructions say is required.


Now do the Oculus Quest


This is awesome!


Wasn’t he supposed to be developing AGI?


He seems to be working on AI / deep learning related projects if you follow him on twitter.


He remains part-time CTO for Oculus. I am talking out my ass but I believe the arrangement is something like 80/20 time split between his AGI project and Oculus.


Imagining a future where people wear VR devices on their heads (even ones as minimal as a pair of glasses) is hard to grasp.

"In their head" would be a different matter, but that is not what the current technology is focusing on (apart from Neuralink).

Edit: not advocating for any such vision of the future! Just noting that wearing stuff on your head would be clunky and impractical for true immersion.


I'm just waiting for a lightweight pair of AR glasses that I can wear while wrenching on a hobby car that give overlays, labels, disassembly instructions akin to car mechanic simulator.

If they can automagically tell me what size wrench/socket i need to grab, even better.


As hard as it is to find any service manual at all for old machinery, and then when you spend $100 to order a used manual off ebay find that the really useful pages are ripped out or covered with grease, I think a product like you describe is waiting on AGI more than it's waiting on nice AR glasses. Very few people need this product, so the intellectual labor to produce it will need to be nearly free.

I've gotten to where I can identify most bolt head sizes on sight... I very rarely have to pick up more than two wrenches and usually the first one is right. Also I find that bolt sizes are typically fairly standardized on a particular machine. I occasionally work on a mini-excavator that has 10mm, 13mm, and 19mm bolts (and 8mm allen-heads), but nothing else I've found so far.


Oh I know it's a complete pipe dream, I can't help but want it regardless.

I see you may not have had the joy of working on something with mixed SAE, Metric, and if you're real lucky, JIS all combined.


Well, I haven't worked on anything with JIS! A guy I used to work with hadn't had fractions in school, so he would use metric even on SAE parts. He couldn't figure out that e.g. 7/16" is smaller than 1/2".


Leave it to the advertising-surveillance industrial complex to explore every possible route to inject outrage into subjects' brains (whether they want it or not).


A VR world controlled by Facebook sounds dystopian enough already - now imagine the same with neural implants...


imagine if other people's "likes" and "angry face" are broadcasted into your neurons directly


As a person who wears glasses, I'm very excited for AR devices. A great input device* and incremental improvements in the battery tech and energy efficiency would make them ideal for on-the-go computing, reading, and note taking.

* https://wefunder.com/tapwithus looks promising, and facebook bought up companies doing similar things. A ring with a touch slider, gyroscope, and a button would be interesting.


I've purchased and learned how to use a Tap Strap. Unfortunately it doesn't really live up to the hype, and after a few frustrating months of practice and work it now lives in a drawer collecting dust.


Yeah, I also have one, and I'm not really excited for new products from that company, even if it's the only real device in its field at the moment. The whole configuration app and strange ecosystem they tried to push rubs me the wrong way, and I probably won't be getting the wrist device if I have to deal with that.


Do you really think that metaverse will gain momentum?


This is great as such but it really should be the law.


I wish Sony would do the same thing for the PlayStation 4.


Cool. Now can we please have that for the Quest 2?


"but damn, getting all the necessary permissions for this involved SO much more effort that you would expect." srsly?

He chose Facebook, the cancer of our society, to pursue his VR dreams.

What did he expect?


NOT doing this should be illegal


How much can one improve a dictatorship by working for the dictatorship? Just thinking...


I think I threw mine out


John Carmack working for Facebook is such a huge loss for mankind :(. It makes me sad every time I think about it.


I admire and respect John Carmack for his role in the history of computing (and his ongoing contributions), but I don't see how making 3D video games was really helping humanity more than this. ¯\_(ツ)_/¯


Carmack made multiple fundamental breakthroughs in 3D graphics. Not only that, but he consistently positioned himself to commercialize those, which is not easy. His Quake 3 engine was the bread and butter of Id Software, generating billions in revenue.

In recent times, he's done coronavirus simulations (I don't know anything else about that), and is doing AGI research.

He's also a true hacker through and through. Simply having an example like his, as a model to follow, is helping humanity. I happen to be human, and I distinctly remember his .plan files influencing my decisions in a positive way since I was 13 or so.

(The line about getting a hotel and accomplishing as much as possible in 2 days of focused effort was particularly impactful.)


He also helped to create a generation of people who modded and created forks of the engines he wrote to use in their own projects, providing the basis for tools to modify assets and whatnot. I owe him so much, and I thank him for what he enabled every day.


I think a lot of people miss this. I grew up half a generation after him, so I got into computers by tinkering with Commodore Logo, GW-BASIC and later C++/Allegro. But the next generation after that was a huge number of kids doing mods for Quake, Doom and the like. It opened the doors to computing in an accessible way for a lot of people.


> generating billions in revenue.

Source?


I was going to point out another HN'er that said this, but then I looked at the username and it turned out to be me in 2014. https://news.ycombinator.com/item?id=7509095

Well, at least my ideas are consistent.

As for the root source, I suppose I could just ask him now. But the idea came from a slashdot post. Carmack personally responded to some criticisms of Q3 code quality, and said that he was proud of it, and that it had generated a lot of revenue for the company.

Billions may have been off by a couple orders of magnitude though. I'm no longer sure.

Maybe not too far off, though. The revenue of Q3 Arena alone was $11M. Far more valuable is Id Tech 3, the engine that was licensed by many studios over many years. Unfortunately, I can't seem to find revenue numbers on that. I wonder if it's public.



He only does light consulting for them currently. He stepped back a couple years ago to focus on his own AI research.


Really? I see this as a huge win for VR, but different opinions I guess.


He does not need the paycheck. He must think this endeavor is the most worth his while, which is fine.


People keep talking like he's focused solely on Facebook.

He does AGI research. He's sent me screenshots of his experiments.


Maybe he could push it for the better, who knows.


This move already does IMHO.


Maybe he'd be doing something more useful elsewhere, who knows.

One thing is certain though; Carmack is an inspiring role model for many developers. Facebook is not a great direction to lead smart people towards.


This will be a massive unlock for a Metaverse funded by Zuck with no affiliation with Facebook. Thanks for the hardware Zuck. Now we'll build the new world without you.


Not with a 3DOF headset



