
Amazing work!

I have been thinking exactly about this. CF Workers are nice, but the vendor lock-in is a massive issue in the mid to long term. Bringing in D1 makes a lot of sense for web apps via libSQL (SQLite with read/write replicas).

Do you intend to work with the current wrangler file format? Does this currently work with Hono.js with the Cloudflare connector?


Wrangler file format: not planned. We're taking a different approach for config, but we intend to be compatible with Cloudflare adapters (SvelteKit, Astro, etc.). The assets binding already has the same API. We just need to support _routes.json and add static file routing on top of workers; the data model is ready for it.

For D1: our DB binding is Postgres-based, so the API differs slightly. Same idea, different backend.
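
For illustration only, the difference might look roughly like this. The first call is Cloudflare's documented D1 API; the second is a hypothetical sketch of a Postgres-style binding, not OpenWorkers' actual interface:

    // Cloudflare D1 (SQLite dialect, "?" placeholders)
    const user = await env.DB.prepare("SELECT * FROM users WHERE id = ?")
      .bind(1)
      .first();

    // Hypothetical Postgres-based binding ("$1" placeholders, Postgres dialect)
    const { rows } = await env.DB.query("SELECT * FROM users WHERE id = $1", [1]);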

Hono should just work; it just needs a manual build step and copy-paste for now. We will soon host the OpenWorkers dashboard and API (Hono) directly on the runner (just some plumbing to do at this point).


I think it would be worth keeping D1 compatibility; SQLite and Postgres have different SQL dialects. Cloudflare has Hyperdrive to keep connections alive to Postgres and other DBs, but what D1/libSQL/Turso brings to the table is the ability to run a read/write replica on the machine, which can dramatically reduce latency.


The artwork looks amazing. Is it AI/ComfyUI?


No.


This won’t replace my GL-AXT1800 which offers a lot more flexibility.

Unifi shipping without eSIM support is a big mistake, IMO. I don't want to have a 5G router (they're insanely expensive) or a second smartphone with 5G.


It doesn't have a modem. Why would it support eSIM?


It would be super convenient, given its size, for me to purchase an eSIM abroad with unlimited data and not have to drain my phone battery.

This is a travel router.


This is a travel router without a modem. It would be super inconvenient if you bought an eSIM for a device that does not have a modem. You might as well buy an eSIM for your toothbrush when you are traveling abroad; it would be equally "convenient."


Hi Peter, thank you for doing this AMA.

How can a European senior SWE land a job in, let's say, SF and have some kind of guarantee regarding the visa before flying into the US?

The $100K cost for the H-1B is so absurd that it crushes any hope of me ever participating in Silicon Valley or any other US tech hub. Is the tech scene still alive, or are companies just relocating to the EU and elsewhere?


> IT projects suffer from enough management hallucinations and delusions without AI adding to them.

Software is also incredibly hard; the human mind can understand physical space very well, but once we're deep into abstractions it simply struggles to keep up.

It is easier to explain to virtually anyone how to build a house from scratch than how to build a mobile app or an Excel spreadsheet.


I came to the opposite conclusion. Technology is pretty easy; people are hard, and the business culture we have fostered over the last 40 years gets in the way of success.


Easy, just imagine a 1GB array as a 2.5mm long square in RAM (assuming a DRAM cell is 10nm). Now it's physical.


Fair point, but you also get exposed if the DNS provider has an outage.

Self-hosting will also bring its own set of problems and costs.


> > Keep your domain name registrar, DNS service provider and application infrastructure provider separately.

> Fair point but you also get exposed if the dns provider has an outage

The usual workaround here is to put two IP addresses in your A record, one that points to your main server on hosting provider A, and the other to your mirror server on hosting provider B.

If your DNS provider goes down, cached DNS should still contain both IPs. And if one of your hosting providers goes down as well, clients should time out and then fall back to the other IP (I believe all major browsers implement this).

Of course this is extra hassle/cost to maintain, and if you aren't quite careful in selecting hosting providers A and B, there's a good chance they have coordinated failures anyway (i.e. both have a dependency on some 3rd party like AWS/Cloudflare).
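
For illustration, a hypothetical zone snippet (documentation IPs, example domain) might look like:

    ; two A records for the same name, one per hosting provider
    www.example.com.  3600  IN  A  203.0.113.10    ; provider A (main)
    www.example.com.  3600  IN  A  198.51.100.20   ; provider B (mirror)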


Traditional non-cloud, non-weird DNS providers have sufficiently long TTLs, not the "60 seconds and then it's broken" crap that clouds do to facilitate some of their services.

Something like TTL 86400 gets you over a lot of outages just because all the caches will still have your entries.


Only for your use case. I use Cloudflare for my dynamic-IP DNS; caching that long makes it worthless.


Yes, of course. But you usually don't put your important webserver, handling bazillions of requests per short interval, on a dynamic IP, especially if you need to avoid any downtime.


Use multiple DNS providers. Some secondaries have thousands of anycast nodes that are provided for free. One can also condition their user-base to know of multiple domains that are on different registrar accounts and of course a few .onion domains.


You can switch DNS providers if you're able to edit the domain's nameservers.

You can also separate your DNS provider from your registrar, so that you can switch DNS providers if your registrar is still online.


Enterprise self-hosting is an expensive nightmare for most companies. I think it is time to discuss multi-cloud deployments to escape outages.

I am hosted on Cloudflare, but my stack is also capable of running on a single server if needed; most libraries are not designed with this in mind.

I'm also wondering if all these recent outages are connected to cyberattacks; the timing is strange.


Steam Machines can become an existential crisis for PlayStation and Xbox.

A “console” that I can use as a PC? I am 100% in. You'll get the world's biggest game library at a discount; this is why I sold my PlayStation after spending 200 euros on it and watching it become useless.

I also suspect a lot of game devs will optimize for the Steam Machine, and we'll finally get a console-like experience on PC.

Don't let the “low specs” fool you; it has the same specs as, or better than, 70% of Steam users' machines.

Given that Valve has given money to a lot of open source maintainers, it's also great for Linux.

Just take my money


Valve isn't likely to make SteamOS the kind of platform that facilitates intrusive* anti-cheat** or which is locked down in a way to prevent cheating at the client side. This means that a number of competitive multiplayer games will never run on it. I think in this regard, consoles still have an advantage*** if you're into those kinds of games.

* I don't care what the intention is, they are _objectively_ intrusive.

** Last time I argued this, someone seemed to assume that I was claiming that writing Linux kernel drivers is harder than writing Windows kernel drivers. I am not arguing that; you need some kind of trusted party enforcing signed kernel drivers and a signed kernel in order to make KLA sufficiently hard to bypass.

*** In terms of the average Joe just wanting their game to run rather than having to think about the ethical implications of buying hardware you don't actually own or running an OS which gives control of your hardware to various corporations (but not you).


> Valve isn't likely to make SteamOS the kind of platform that facilitates intrusive* anti-cheat* or which is locked down in a way to prevent cheating at the client side. This means that a number of competitive multiplayer games will never run on it. I think in this regard, consoles still have an advantage** if you're into those kinds of games.

Depends on just how successful SteamOS gets. If it starts to gain significant market share, competitive multiplayer games might find it hard to ignore. Though how they'd decide to deal with that, I have no idea.

I think Valve sees a future for anti-cheat where most of it is behavioral analysis. Client-side anti-cheat is a big game of cat and mouse; it does make cheats harder to develop, but to the point where the customer is impacted. Post-game analysis cannot be countered "technically": a cheat would need to mimic real player behavior, which in the end is a success. If you can't tell whether a player is cheating or not, does it matter that they are? Although things like wallhacks might be harder to detect this way.


"you can't tell if a player is cheating or not, does it matter that they are"

This is basically where KLA has already gotten to. There are still plenty of cheaters; people just don't realize it.

I think it does matter in a strictly moral sense, and if people were more aware of how bad the problem is, they would likely be outraged. Alas, since they can't see it, they are not aware of it, so there is no outrage and the games companies are satisfied.


KLA?


Kernel-level Anticheat


I think the assumption that Valve would choose user protection over getting games to work is flawed; they want openness where possible because they see it as a competitive advantage. With VAC they clearly think that maximally invasive anti-cheat isn't necessary, so maybe they'll try to push providers in that direction?


Valve thinks it's not necessary, and it's still up in the air whether it really isn't.

They have bet on the behavioral analysis anti-cheat horse but it hasn't won any races yet.

Moreover, they've proven that it's certainly more difficult to get it working than regular old fuck-the-end-user anti-cheat.

Lastly, don't assume that the success of the platform will persuade these companies. They were already firmly unpersuaded when the Steam Deck got popular. And really, I think the popularity of a platform depends on the support of these companies more than the support of these companies depends on the popularity of a platform.


There is a real need for anti-cheat / certified hardware. Valve is uniquely positioned to address it because they have the trust of the gaming community. Ideally a single anti-cheat mechanism would be shared by all software vendors. Online games could request a "console mode" involving a hardware key exchange. Done right, this wouldn't have to be invasive like current anti-cheat.


It's either invasive anti-cheat on a vendor-controlled platform or a totally locked-down vendor-controlled platform; there are no other options in the client-side anti-cheat space.

Given that Valve refuses to use KLA for their own competitive multiplayer games, and has gone out of their way not to make their hardware locked down, I really don't think they will go down the path of making a locked-down platform or facilitating intrusive anti-cheat.


Is it truly either-or? Obviously the root of anti-cheat needs to be totally locked down, aka the TPM. But almost all "open" computers have a locked-down TPM. The TPM doesn't need to prevent you from running unsigned firmware, kernels, modules, or user software; it only needs to report on whether you are or have. You can reboot your computer into "trusted" mode and run your games with anti-cheat. Then, when you're done playing, you can run as much unsigned software as you want.


You ask if it's either intrusive spyware or if it's a locked down system and then describe dual-booting intrusive spyware.

A TPM is entirely under your control. It's designed in such a way that you can't do certain things with data within it, but that's not because (at least in theory) someone else can and is controlling your TPM to prevent you from doing those things. The TPM, unlike an installation of Windows, doesn't only listen to Microsoft.


What I'm describing is exactly the situation now. Many people dual boot Windows & Linux, with kernel level anti-cheat on their Windows partition. The existence of Linux on the same computer does not prevent the kernel level anti-cheat from working on Windows.

Similarly, the presence of unsigned software on a computer would not stop a Linux kernel level anti-cheat from working, and the kernel level anti-cheat shouldn't prevent the unsigned software from working. Once you run that unsigned software, your machine is tainted similarly to the way your kernel is tainted if you load the NVidia driver.


I wonder if it’s possible to implement anti-cheat as a USB stick. Your GabeCube or gaming PC would stay open by default, but you could buy an anti-cheat accessory that plugs into a free USB port. Connecting that device grants access to match making with other people who have the device.

There are several products that rely on a USB device like this for DRM solutions. It’s probably much easier to unlock static assets than validate running code, but I don’t have insight on the true complexity.


>I wonder if it’s possible to implement anti-cheat as a USB stick. Your GabeCube or gaming PC would stay open by default, but you could buy an anti-cheat accessory that plugs into a free USB port. Connecting that device grants access to match making with other people who have the device.

What does the USB stick actually do? The hard part of implementing the anti-cheat (ie. either invasive scanning or attestation with hardware root of trust) is entirely unaddressed, so your description is as helpful as "would it be possible to implement a quantum computer as a USB stick?"


I am very skeptical there's much cheating in Counter-Strike or Dota.

They use different means to detect cheaters, which means sometimes they are banned several weeks after the fact, but they do ban cheaters.


> Don’t let the “low specs” fool you, it has the same specs or better as 70% of steam users.

We are also out of the rat race of hardware requirements of the 90s. I'm on a 7-year-old system, and if you're not chasing the latest AAA game maxed out at launch, that thing can run a lot of games. It's mainly storage and RAM for modded Minecraft or Satisfactory that's a bit of a mess atm. Though RAM prices are spicy at the moment, jeez.

Similarly, my dad has my system from 10 years ago or more, and the only real snag for his strategy games is now a DX12 requirement.


> I'm on a 7 year old system and if you're not chasing to max out the latest AAA game on launch

Yep, people who didn't fall for the resolution meme can play any game maxed out with a 2060. People chasing 4K and 120+ fps will never be satisfied and will always spend $1k every other year on the new high-end GPU.

I've made two upgrades since 2015: a Ryzen 7 1700 to a Ryzen 5 5600 for $100, and I swapped my GTX 970 for an RTX 2060 for $300.


any strategy game recommendations from your dad?


It's certainly not an existential threat to PlayStation, but Xbox has weakened itself enough that yes, this could be another nail in the coffin, given that their plan was to retreat into the Windows ecosystem.

The low specs aren't a problem if it's cheap enough, but every dollar this goes above the retail price of a PS5 will seriously hurt its mass appeal.

The problem for Valve is that they can't really sell this thing at a console-like discount, because it's a general-purpose computer. If this thing is way cheaper than a regular computer of the same spec, corporations will just buy up Steam Machines by the pallet load and use them as office machines or whatever (just like what happened to Sony when they allowed the PS3 to boot into Linux and had to release an emergency update that disabled the Linux functionality even though it was an advertised feature).

I really hope this will be successful, but it'll likely be successful in a specific niche. The nice thing about this niche, though, is that they don't have to hit anywhere near the same sales numbers as a console to be a success, because the R&D costs are lower and games don't have to be specifically tailor-made for it.

E.g., the PS Vita sold more units than the Steam Deck, but the Vita was an unmitigated failure for Sony because, unlike the Steam Deck, it needed games to be specifically made for it, whereas the Steam Deck benefits from the entire PC ecosystem and so doesn't need the same level of adoption to be a (limited) success.


> If this thing is way cheaper than a regular computer of the same spec, corporations will just buy up Steam Machines by the palette load and use them as office machines or whatever

Sure, but corporations don't want/need the same spec. They don't need the GPU, and they don't need the fancy controller. If you just want a cheap PC that'll run a browser and Office, you can get them for under $200. If you want a Beelink with CPU/RAM/SSD similar to the Steam Machine, that's $350, and it includes a Windows license. The Steam Machine has an estimated BOM of $425, so even $500 would be a subsidized price after overhead. As long as Valve prices well above $250 it'll be safe from this concern, since corporations will likely want to add a Windows license to the cost.


> It's certainly not an existential threat to Playstation

To add to this, PlayStation is almost entirely sustained by exclusives at this point, and it's starting to backfire (more and more players are just waiting for the PC release, and the wait is killing some of the marketing/hype that a game would have had; e.g., FF16 likely would have done a lot better if it had released for PC at the same time rather than starting with PS exclusivity, and I suspect Death Stranding 2 will be the same).


> If this thing is way cheaper than a regular computer of the same spec, corporations will just buy up Steam Machines by the palette load and use them as office machines or whatever

On one hand, this would be a problem.

On the other hand, if the Steam Machine doesn't support windows, businesses fleeing from MS Windows en masse because the Steam Machine is cheap would be a VERY interesting turn of events, and I'd be VERY curious to see how it all unfolds.


Sorry if I was unclear, but what I was saying was that this would be unsustainable because the only way it'd be possible is if Valve was subsidizing each unit in hopes of recouping their losses on Steam game sales.

If that happened, Valve would get bankrupted by companies buying up subsidized Steam Machines with no intention of playing games on them.


Yes, that's what I understood. But it'd still be interesting.

Amazon was once a bookstore. There's nothing stopping Steam from adapting to a "Steam Business Machines" / "SteamOS Business Edition" once it has a foothold in the business market. After all, the store already distributes software; it's just not as popular as games. So if this scenario were to happen and Microsoft failed to react, I'm sure Steam would adapt very quickly to take advantage of it rather than sit and wait for bankruptcy.


I don't really get why people are calling it a console. It is a PC to me in all the ways that matter, and it's probably going to save me from spending 1500 euros on a mid-range gaming laptop that I don't really need. The only thing that I don't use my iPad for is playing games with my friends in other countries while we chat on Discord. And the last 5 games we've played together do benefit from keyboard and mouse controls, but don't have huge spec requirements. And pretty much everything else for which I'd want a bigger screen than my iPad's can be done in the browser, which I can also happily install on the Steam Machine because it's just a Linux machine with some extra bells. So yeah, it will probably completely replace my need for a PC, and I'd be plenty happy to pay a PC price for it as a result.


> I don't really get why people are calling it a console. It is a PC to me

because "console" isn't what a product is (supply) - it is a name for product niche (demand)

When someone talks about buying a console, the expectations are: 1) significantly cheaper than a "usual" computer, 2) most likely optimized for games (controller input, easy install), 3) using an already existing TV as the display.

Consoles haven't been meaningfully different from low-end PCs ever since the Xbox.


Because it gives you a console-like experience. What's so hard to get about that? In their own press release, Steam notes it's "just a PC."


Well, it's kind of a new thing, isn't it?

Just like the deck popularized the idea of "handheld PCs". Maybe the Machine will do the same to "console PC". It's a PC, but also a console.


To me, PC = Windows = Microsoft's spyware and every other game company's anti-cheat rootkits.

Someone might say I can install Linux on my PC but then I have to deal with maintaining it.

So, what I hope the Steam Machine is, is effectively a PC-based console with no Microsoft, no rootkits, and no maintenance.


> Someone might say I can install Linux on my PC but then I have to deal with maintaining it.

What maintenance do you mean? I do not know of an OS that does not require allowing updates to keep secure.

It definitely meets the other criteria you want.


I don't know what Linux you're on. On mine I have to actively maintain it. I have to run something like `sudo apt update && sudo apt upgrade` every week or so and then deal with whatever breaks. Conversely, my Switch and PS5 do this automatically. I expect the Steam Machine will also update itself automatically. They have an incentive not to break things, since it will cost them money to fix all the devices they break. Linux, on the other hand (not complaining), basically says "not our fault if it breaks". So I'd prefer the Steam Machine, where, I believe, it is their fault if it breaks and they will fix it.


Sometimes there are regressions in the kernel and other driver issues. My laptop is more than five years old and I had to boot an older kernel for a while until a regression got fixed. I guess that's less likely to happen on Windows. Not being as close to the hardware vendors means there are bound to be more edge cases even on boring distros like Ubuntu.


If you don't need to get an expensive gaming PC, you should not get an expensive gaming PC. The Steam hardware isn't magic. You can already get equivalent specs for cheap.


You can? What am I looking for?

I've tried to hit the $600 mark and in the past few years it's gotten harder and harder. The GPU invariably ruins things. And normal APUs are too asthmatic to really game on.


Good; the Steam Machine most likely won't be $600 either.

Also you don't necessarily need a dedicated GPU, unless you go with Intel.


> You can already get equivalent specs for cheap.

This is the context. A Steam Deck is bizarrely great value compared to anything you can buy, even without a screen, controller and battery.

Again if you know otherwise, please share.


The APU from AMD they have is pretty much magic. You will not find anything comparable; any Ryzen APU you can actually buy is pretty much trash apart from very lightweight gaming. You absolutely need a separate GPU, and even low-end will set you back at least $250. The only way to build something comparable for cheap would be to buy used.


The APU in the Steam Deck isn't anything too special (the 740M is comparable but is RDNA3). For the GabeCube they are using a customized 7600M, which has previously been used in many eGPUs for Chinese handhelds (at a very high price).

AMD does have some pretty powerful APUs right now, but I don't think we'll see them in many mass-market devices.

How customized it is, I guess we'll find out closer to release, but based on the dimensions I'm guessing it's customized specifically for the case, for space and cooling reasons.

A similar PC without the fit and finish with just consumer parts comes in at around $900.

Curious how much pull Valve has with AMD to get this into people's hands.


Exactly. I sold my Switch because I just happened to play most of my games on PC and Steam. Worst case scenario, it can be a desktop computer (I don't have one, only a laptop).

Whereas a PlayStation or a Switch, once I stop gaming on it, is just an expensive paperweight.


Man, I got a free PS5 from my ISP and was excited to have friends over for games. Come to find out that playing games with your friends apparently isn't a thing anymore (I guess there are fighting and racing games). What a lame-ass boring system.


This is why I've been all in on Steam for so long! The catalog is so huge, there's a massive number of fantastic couch multiplayer games. It is indeed a bit more fiddly... I've found that it's generally easier to connect my Steam Deck to the TV and play lower fidelity games than it is to fiddle with a Windows machine that needs to be prepped for friends popping in every other month.

Although, Nintendo is still doing a good job at keeping the couch-social experience alive, and building 1st party games that can be good solo experiences but really shine when played next to a friend sitting on the same couch.


I used to rally friends over for couch gaming years ago. Think I'll try to put something together for Thanksgiving. Been a bit out of the loop - any recent PC couch games you recommend for normies? I think for my group, it'll lean casual / co-operative. Nintendo's always felt too kid-like for us.

The Trine series was a big hit for us way back.


"It takes two", "Split Fiction" and "Lego Voyage" are the same style of casual two-player coop. Downside is it's only two players but otherwise may he exactly what you're looking for and very well made.

If you want something a bit different, check out "Keep Talking and Nobody Explodes".


Rivals of Aether - like Super Smash Bros but with Steam Workshop support for player-made characters... TARS (Interstellar) vs Ronald McDonald vs Obama, anyone??

Nidhogg - deserves to be in an arcade cabinet, but honestly this one is ALWAYS a hit... just 2-player though

Broforce - 80s action stars in 80s action movie multiplayer platformer

Ultimate Chicken Horse - competitively build a platformer level and then race to complete the level first, best with 4 players

TowerFall Ascension - 2-4 players, also deserves to be in an arcade cabinet

Screen Cheat - FPS made for the couch; think N64 Goldeneye or Quake, but all the players are invisible, so the only way to figure out where your opponents are is to look at their quadrants (screencheat)

Overcooked 2 - it's pretty kid-oriented on the surface, but it's a game where you must out-communicate the absolute chaos unfolding around you in order to succeed...such a good couch-multiplayer experience, but best for experienced gamers imho

Rocket League

Magicka

Regular Human Basketball - control giant basketball automatons by jumping inside them and operating the manual controls in a team v team. Minimum 4 players to really work well, supports up to 10 players shared screen

That should get you started! But oops that wasn't my casual coop list, more my "makes for memorable group experiences" list.

It Takes Two, Portal 2, Untitled Goose Game, and Halo: The Master Chief Collection (it was like $10 recently) all come to mind as positive local co-op experiences I've had.


Thank you!


There are a few games that support couch co-op, but you're right, it's not very well supported anymore.

We enjoyed Split Fiction and It Takes Two recently, but those are quite couple-y. And Blue Prince is very playable as a group effort.

Then there are games like “overcooked”, though again not many. IIRC there’s a new Katamari on the way as well.

But there really aren’t that many “get your mates together” games any more.


> But there really aren’t that many “get your mates together” games any more.

It's especially fascinating through the lens of this return-to-office phase we're living through. I'm a big fan of WFH, but hear me out: online gaming (physically by yourself) is somewhat analogous to work-from-home. It empowers you to optimize your entertainment, challenge, competition, and tribe into narrower and narrower facets of the experience.

But jibba jabba around the water cooler can be enjoyable with the right people, just like it can be on the couch with friends and fam with Mario Kart, NHL '97, or Jackbox. Or board games.

There's room in this zeitgeist for a breakout multiplayer hit that just doesn't feel good unless you're in person.


Playing games with friends has never been more popular. I guess couch co-op has been replaced with online multiplayer. The assumption being that if you want to play with friends, they'll have their own device.

But there's still plenty of couch co-op games. They're usually quite niche though and not your typical racing or shooting game.


We ended up hooking up my old N64 and playing Goldeneye


> I guess couch co-op has been replaced with online multiplayer. The assumption being that if you want to play with friends, they'll have their own device.

What's the point with a console then though?


It is on the Nintendo Switch. My kids play loads of Mario Party, Minecraft and Mario Kart.


> Steam Machines can become an existential crisis for PlayStation and Xbox.

Xbox, as a console, already is in an existential crisis.

I think people have weird expectations about what the Steam Machine will cost. From what Valve has said so far (cheaper than if you build it yourself from parts), it will still cost significantly more than a PS5, and probably also more than a PS5 Pro, while having less performance than both. You will not beat the PS5 in terms of performance per dollar. Yes, games are more expensive on PS5, but most people don't think that way; they just want to know whether they'll be able to play GTA6 on day one.


I don't think it's an existential crisis for console manufacturers, but it's certainly part of a shift in how we think about "consoles".

Microsoft has seen the writing on the wall for years now, and they've expanded their library to run across platforms. The Xbox as we knew it is effectively dead.

Sony and Nintendo are still holding on to the legacy concept, and trying to lure people into their walled garden, but even their hardware is essentially a general purpose PC that happens to be locked down in software.

So I suspect we'll see one last traditional "console" generation with the PS6 and whatever Nintendo makes next, and after that the concept of a single-purpose machine will fizzle out. Nintendo will probably be the last to give in, since they have the strongest first-party IPs to make that feasible, but eventually they'll follow suit as well.


It's not a console you can use as a PC, it's a PC you can use as a PC.

If you want a console you can use as a PC, the next Xbox is rumoured to be along those lines. It will run Windows so you can play Steam, GOG, etc., but it will also run the existing Xbox library natively.

The 70% figure needs to be taken in context: tons of people have Steam installed on old computers that they use for old games. I currently have it installed on three devices, and yes, two of them have worse specs than this. But I don't have any intention of upgrading them either; they are just old machines I have hanging around. They do the job if I'm travelling.


Is that actually confirmed? That Microsoft is going to allow Steam and other third-party stores on the new Xbox console?


> Steam Machines can become an existential crisis for PlayStation and Xbox.

Not really. It's an existential crisis for System76, Framework Computer, and all the other Linux computer companies.

> A “console” that I can use as a PC? I am in 100%. You’ll get the world biggest game library at a discount, this is why I sold my PlayStation after spending 200 euros and watching it becoming useless.

No different from getting a regular PC, except that with a regular PC you can just buy a high-performance, state-of-the-art GPU like the NVIDIA RTX 5090 and it runs all your games at 4K @ 120 FPS instead of 60.

> I also suspect a lot of game devs will optimize for steam machine and finally we’ll get a console like experience on PC.

Proton is the software doing the optimizations. However, once you want to run a highly anticipated game like Battlefield 6, and your friends are playing it on their Windows PCs and consoles on day 1, the Steam Machine is left behind waiting for compatibility updates.

> Don’t let the “low specs” fool you, it has the same specs or better as 70% of steam users.

2020 specs in 2026 aren't really good for convincing 70% of Windows PC gamers or console players either.

The real test is when the next-generation Xbox or PlayStation arrives: will the Steam Machine outsell them?

> Given Valve gave money to a lot of open source maintainers , it’s also great for Linux.

We will see if that is enough to convince Steam players to run SteamOS instead of Windows or consoles, but so far it is totally underpowered, and you might as well get a Windows PC + NVIDIA RTX 5090, which runs all your games well, including the highly anticipated ones.

No thanks and no deal.


A 5090 is likely 3x the cost of the Steam Machine. You are at the extreme high end of the gaming market here and not the target of the Steam Machine.


This and machines with 5090s are a completely different market.

The Steam Machine is marketed primarily as something sitting under your TV. I don't have "5090 under my TV" money; 99.9% of people don't. That's not the target demographic.


Funny to see "Game Machine that can be used as a PC" when, as I was growing up, it was "Personal Computer can also play games".

In both cases I suppose it was the dedicated gaming-machines (Switch now, Atari then) that were feeling the squeeze.


Yeah, learn with the computer! When I was growing up we had an Intellivision game console. My parents bought us the "keyboard component" that turned it into a computer. What a terrible computer it was. It turned out it was released because the company was being fined for advertising a computer add-on and not delivering. You wanted to write games, but no; it was worse than the Timex Sinclair 1000...

A lot of home computing used TVs back then.

https://en.wikipedia.org/wiki/Entertainment_Computer_System


Then Sony produces PS-only exclusives. No PC ports anymore.


That is only a problem if you suffer from FOMO. Otherwise there are enough PC games for a hundred lifetimes.


Not going to happen. In fact, they seem to go the opposite direction because there's more money to be made.


I also don't believe it, but Sony has been an all-time bad player.

I like the Xbox because they changed so much in the console ecosystem: Play Anywhere, backwards compatibility without extra cost.


I don't think that's true. The whole reason they're producing PC ports is to sell the most profitable part (software) to those who haven't been giving them money.

Sony makes zero dollars off of the consoles, and while they do enjoy taking their PS Store royalties rather than giving them up to Steam, they also have a huge collection of first-party studios that might be an even more important business.

And it's not like Sony is giving their big console releases PC ports on day one; if you want to own them right away you have to buy a PlayStation.


You are mostly right about the broad strategy, but a few of the claims are too absolute. Sony does make money on hardware later in the cycle, even if margins are small. They also care about PC ports for more than just pure profit, such as extending the IP footprint and keeping franchises visible between major releases. The part about delayed PC ports is completely correct: PlayStation is still the primary window and PC is the secondary revenue phase once the console market is saturated.


How could you believe that Sony would give up 30% of every Call of Duty, Madden, Fortnite sale for the measly PC sales of Slapper-Man 2 and God of Warm?

Every Steam Machine sold that plays Sony's exclusives is a genuine threat to Sony's control over the gaming market. The more I think about it, the more I believe Sony bringing its games to PC is over as of yesterday's announcement.


Sony's current tactic is to publish all their releases on PC 6-12 months later. Doing this expands the potential player base and even gets some players to double-dip and buy the game twice.


If this Steam Machine really takes off and starts impacting Sony's ability to sell Playstation consoles, you bet your ass Sony will stop porting games to PC.


> and finally we’ll get a console like experience on PC

What do you mean by that? The PC experience with adequate hardware is almost universally better than the console experience.


Games being optimised for specific hardware.

Some games just run better on the PS when compared to the PC version, regardless of whether you have the latest and best PC. You can see this fairly well on the Nintendo Switch, which is a low-spec tablet, but the games run very well and the experience is great.

PC games, generally speaking, tend to favor keyboard and mouse, not controllers.

This is why I suspect game devs will start optimising for the Steam Machine, provided it sells well.


To be honest, I don’t think you’ll need to use the app. I get my passes via email and add them to Apple Wallet.

In some airports, printing the pass is actually free; they have a dedicated vending machine just for this.

They are not saying that you cannot print the pass yourself. I suspect this move will save a lot of time and money, which, ultimately, is a good thing for everyone.

They want to charge a check-in fee if you don't do it online; I would not be surprised if a lot of people waste time doing it in person instead of doing it online. This could speed things up for customers who actually have luggage.

What am I missing?


It does say that paper boarding passes won't exist at all. No printing at the airport. And they claim this makes flying "greener".

They also call passengers who don't check in beforehand "stupid".


The online check-in opens 48 hours before the flight and you get a notification; there's no reason to delay this, make your life harder inside the airport, and potentially lose your flight.

Still confused about the no-paper thing; you could print the digital pass yourself, given it has the gate QR code.


Been using CF Workers with JavaScript and I absolutely love it.

What is the performance overhead when comparing native Rust against WASM?

I also think the time for a FOSS alternative is coming. Serverless with virtually no cold starts is here to stay, but being tied to only one vendor is problematic.


> I also think the time for a FOSS alternative is coming. Serverless with virtually no cold starts is here to stay, but being tied to only one vendor is problematic.

Supabase Edge Functions runs on the same V8 isolate primitive as Cloudflare Workers and is fully open-source (https://github.com/supabase/edge-runtime). We use the Deno runtime, which supports Node built-in APIs, npm packages, and WebAssembly (WASM) modules. (disclaimer: I'm the lead for Supabase Edge Functions)
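
A minimal Edge Function is just a standard Deno-style handler; roughly (the payload shape here is only an example):

    // TypeScript, served by the open-source edge runtime via Deno.serve
    Deno.serve(async (req: Request) => {
      const { name } = await req.json();
      return new Response(JSON.stringify({ message: `Hello ${name}!` }), {
        headers: { "Content-Type": "application/json" },
      });
    });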


It would be interesting if Supabase allowed me to use that runtime without forcing me to use Supabase, with it being a separate product on its own.

Several years ago I used MeteorJS; it uses Mongo and is somewhat comparable to Supabase. The main issue that burned me and several projects was that it was hard or even impossible to bring in different libraries. It was a full-stack solution that did not evolve well; it was great for prototyping until it became unsustainable, and it was even hard to onboard new devs due to "separation of concerns", mostly because of the big learning curve of one big framework.

Having learned from this, I only build apps where I can bring in whatever library I want. I need tools/libraries/frameworks to be as agnostic as possible.

The thing I love about Cloudflare Workers is that you are not forced to use any other CF service. I have full control of the code; I combine it with Hono.js and I can deploy it as a server or serverless.
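
As a rough sketch of that portability (assuming Hono plus the @hono/node-server adapter; file layout is just illustrative):

    // app.ts - one Hono app, usable as a Workers-style fetch handler or a Node server
    import { Hono } from "hono";

    const app = new Hono();
    app.get("/", (c) => c.text("Hello from anywhere"));

    // Cloudflare Workers / workerd: the default export is the fetch handler
    export default app;

    // Node: in a separate entry point, something like
    //   import { serve } from "@hono/node-server";
    //   serve({ fetch: app.fetch, port: 3000 });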

About the runtimes: having to choose between Node, Deno, and Bun is something that I do not want to do. I'm sticking with Node, and hopefully the runtimes will be compatible with standard JavaScript.


>It would be interesting if Supabase allowed me to use that runtime without forcing me to use Supabase, with it being a separate product on its own.

It's possible for you to self-host the Edge Runtime on its own. Check the repo; it has Dockerfiles and an example setup.

> I have full control of the code, I combine it with HonoJs and I can deploy it as a server or serverless.

Even with Supabase's hosted option, you can choose to run Edge Functions and opt out of others. You can run Hono in Edge Functions, meaning you can easily switch between CF Workers and Supabase Edge Functions (and vice versa) https://supabase.com/docs/guides/functions/routing?queryGrou...

> Having to choose between Node, Deno, and Bun is something that I do not want to do. I'm sticking with Node, and hopefully the runtimes will be compatible with standard JavaScript.

Deno supports most of Node's built-in APIs and npm packages. If your app uses modern Node, it can be deployed on Edge Functions without having to worry about the runtime (having said that, I agree there are quirks, and we are working on native Node support as well).


Cool, I'll check it out.


It surely depends on your use case. Testing my Ricochet Robots solver (https://ricochetrobots.kevincox.ca/), which is pure computation with effectively no I/O, the speed is basically indistinguishable. Some runs the WASM is faster, sometimes the native is faster. On average the native is definitely faster, but it is surprisingly within the noise.

Last time I compared (about 8 years ago) WASM was closer to double the runtime. So things have definitely improved. (I had to check a handful of times that I was compiling with the correct optimizations in both cases.)


The stats I've seen show a 10-20% loss in speed relative to natively compiled code, which is effectively noise for all but the most critical paths.

It may get even closer with Wasm 3.0, released 2 months ago, since it has things like 64-bit address support, more flexible vector instructions, typed references (which remove runtime safety checks), basic GC, etc. https://webassembly.org/news/2025-09-17-wasm-3.0/


Unfortunately, 64-bit address support does the opposite; it comes with a non-trivial performance penalty because it breaks the tricks that were used to minimize sandboxing overhead in 32-bit mode.

https://spidermonkey.dev/blog/2025/01/15/is-memory64-actuall...


1) This may be temporary.

2) The bounds checking argument is a problem, I guess?

3) This article makes no mention of type checking, which is also a new feature; it moves some checks that normally run only at runtime to needing to be checked just once at compile time, and this may include bounds-style checks.


The Cloudflare Workers runtime is open source: https://github.com/cloudflare/workerd

People can and do use this to run Workers on hosting providers other than Cloudflare.


It's also worth noting that workerd is only a part of the Cloudflare Workers stack. It doesn't have the same security properties.

https://github.com/cloudflare/workerd#warning-workerd-is-not...

(I know you know this, but frankly you should add a disclaimer when you comment about CF or Capnp. It's too convenient for you to leave out the cons.)


Job scheduling and tenant sandboxing are generally the responsibility of the hosting provider, not the JS runtime. If you are going to run workerd on, say, Lambda, then you rely on Lambda for these things, not workerd. No other server JS runtime offers hardened sandboxing either -- they all defer to the hosting provider.

(Though if we assume no zero-days in V8, then workerd as-is actually does provide strong sandboxing, at least as strong as (arguably stronger than) any other JS runtime. Unfortunately, V8 does in fact have zero-days, quite often.)

What mariopt said above was: "being tied to only 1 vendor is problematic." My point here is that when you build on Workers, you are not tied to one provider, because you can run workerd anywhere. And we do, in fact, have former customers who have migrated off Cloudflare by running workerd on other providers.

> frankly you should add a disclaimer when you comment about CF or Capnp

I usually do. Sometimes I forget. But my name and affiliation is easily discovered by clicking my profile. I note that yours is not.


I think it's pretty well understood that Cloudflare does not actually deploy a VM/container/etc per tenant, but you guys are relying on something like detecting bad behavior and isolating or punishing tenants that try to use attacks in the style of rowhammer: https://developers.cloudflare.com/workers/reference/security... -- so there is secret sauce that is not part of workerd, and one cannot get the same platform as open source.

Meanwhile, somebody like Supabase is making the claim that what you see as open source is what they run, and Deno says their proprietary stuff is KV store and such, not the core offering.

Now, do these vendors have worse security, by trusting the V8 isolates more? Probably. But clearly Cloudflare Workers are a lot more integrated than just "run workerd and that's it" -- which is the base Supabase sales pitch, with Postgrest, their "Realtime" WAL follower, etc.

(I am not affiliated with any of the players in this space; I have burned a few fingers trying to use Cloudflare Workers, especially in any advanced setup or with Rust. You have open, valid, detailed, reproducible, bug reports from me.)


I am not very familiar with Supabase edge functions, but it appears to be based on Deno. According to Deno's documentation, it does not implement hardening against runtime exploits, instead recommending that you set that up separately:

https://docs.deno.com/runtime/fundamentals/security/#executi...

The intro blog post for Supabase edge functions appears to hint that, in production, they use Deno Deploy subhosting: https://supabase.com/blog/edge-runtime-self-hosted-deno-func...

Note that Deno Deploy is a hosting service run by Deno-the-company. My understanding is that they have proprietary components of their hosting infrastructure just like we do. But disclaimer: I haven't looked super-closely, maybe I'm wrong.

But yes, it's true that we don't use containers, instead we've optimized our hosting specifically for isolates as used in workerd, which allows us to run more efficiently and thus deploy every app globally with better pricing than competitors who only deploy to one region. Yes, how we do that is proprietary, just like the scheduling systems of most/all other cloud providers are also proprietary.

But how does that make anyone "tied to one vendor"?


> But how does that make anyone "tied to one vendor"?

Because you can't, in the general case, recreate the setup on a different platform? That's like the definition of that expression.

BTW here's Deno saying Deno Deploy is process-per-deployment with seccomp. No idea if that's always true, but I'd expect them to boast about it if they were doing something different. https://deno.com/blog/anatomy-isolate-cloud

Process-per-deployment is something you can reasonably recreate on top of K8S or whatever for self-hosting. And there's always KNative. Note that in that setting scheduling and tenant sandboxing are not the responsibility of the hosting provider.

Personally, I haven't really felt that cold starts are a major problem when I control my stack, don't compile Javascript at startup, can leave 1 instance idling, and so on. Which is why I'm pretty much ok with the "containers serving HTTP" stereotype for many things, when that lets me move them between providers with minimal trouble. Especially considering the pain I've felt with pretty much every "one vendor" stack, hitting every edge case branch on my way falling down the stack of abstractions. I've very very much tried to use Durable Objects over and over and keep coming back to serving HTTP with Rust or Typescript, using Postgres or SQLite.

Pretending you don't see the whole argument for why people want the option of self-hosting the whole real thing really comes across as the cliched "It is difficult to get a man to understand something, when his salary depends upon his not understanding it!"


> BTW here's Deno saying Deno Deploy is process-per-deployment with seccomp.

And that part isn't open source, AFAICT.

> Because you can't, in the general case, recreate the setup on a different platform?

You also can't recreate Lambda on Google Cloud since Lambda's scheduler is not open source.

But you can use Google Cloud Functions instead.

None of these schedulers are open source. Not Deno Deploy, not Supabase, and yeah, not ours either. Standard practice here is to offer an open source local runtime that can be used with other schedulers, but not to open source the cloud scheduler itself.

> Pretending you don't see the whole argument for why people want the option of self-hosting the whole real thing

Yes I get that everyone would like to have full control over their whole stack and would like to have it for free, because of course, why wouldn't you? I like those things too!

But we're a business, we gotta use our competitive advantage to make money.

The argument that I felt mariopt was making, when they said "being tied to only 1 vendor is problematic", is that some proprietary technology locks you in when you use it. Like if you build a large application in a proprietary programming language, then the vendor jacks up the prices, you are stuck. All I'm saying is that's not the case here: we've open sourced the parts needed so that you can switch vendors. The other vendor might not be as fast and cheap as us, but they will be just as fast and cheap as they'd have been if you had never used us in the first place.

I will also note, if we actually open sourced the tech, I think you'd find it not as useful as you imagine. It's really designed for running a whole multi-tenant hosting service (across a globally distributed network) and would be massive overkill for just hosting your own code. workerd is actually better for that.

> Durable Objects

I want to be forthright and admit my argument doesn't currently hold up here. workerd's implementation of Durable Objects doesn't scale at all, so can't plausibly be used in production. We actually have some plans to fix this.


Workers is a V8 isolate runtime, like Deno. V8 and Deno are both open source, and Deno is used in a variety of platforms, including Supabase and ValTown.

It is a terrific technology, and it is reasonably portable, but I think you would be better off using it in something like Supabase, where the whole platform is open source and portable, if those are your goals.


In code I've worked on, cold starts on AWS Lambda for a Rust binary that handled nontrivial requests were around 30ms.

At that point it doesn’t really matter if it’s cold start or not.


workerd is already open source, so that's a good start.


