I worked on a similar solution to this and we had a price point of $5/month per user...
EDIT: 16GB of RAM and 16vCPUs. What a weird balancing of resources. Chrome is typically memory bound, not CPU bound. This also explains why it would be so wildly expensive compared to anything else out there.
EDIT2: A lot of the replies I'm getting seem to think my implication here is that no one would pay for this or it would be easier for people to build this themselves. I'm not saying that at all, I'm just critiquing the price point. There's huge market demand for browser isolation, I've worked on products in that field, I just haven't encountered any customers willing to pay $30-50/month for it.
Believe me, I was skeptical too. I remember sitting in a car driving back up from YC with Michael Siebel asking him: "Hey man, do you think I am absolutely nuts thinking people would pay for a browser that's FREE? That's an idiotic idea right?" and, of course, he encouraged me and I am still feeling pretty encouraged based on talking to users and seeing the revenue/usage/praise 18 mo later.
We have a lot of work to do and I am pretty embarrassed by what we've got still, but it felt right to go public with it.
Why might I use this instead of / in addition to Shadow (https://shadow.tech)? I'm a Shadow user, and they seem to give you beefier hardware at half the price, and it's a general purpose OS that will let you run any app (as opposed to "just" a browser).
Fwiw, we started by streaming Windows and pivoted away.
It's not clear to me that Shadow's business is sustainable. Windows licensing for virtualization across end-users, if you buy from a reseller, is $11/mo/user alone. I only know because we tried and became a reseller briefly. They also seem to use consumer GPUs that violate NVIDIA's licensing and agreements. Maybe they know something we don't.
They claim to, in reality they are sliced Quadro/Tesla cards that get a GTX 1080's worth of performance. I was wondering about the Windows licensing myself, not clear how they got around that.
In any case, even at $20 p/m it feels like a strong value. That's ~$1,000 every four years - without ever being stuck with an out-of-date machine.
I think they are close to bankruptcy though, and signing up takes ages.
Technology is amazing.
Their tech is incredible, by far the best performing IMHO.
This video is a great interview of JB + story of Shadow
It's one great piece of tech, so I'm not surprised he'd be interested in trying to turn it around
Congrats on all the work here. Browser streaming isn't easy stuff!
Pricing is a good example of something that most people are intuitively wrong about. What you think people will pay and what people actually will pay are rarely congruent, and most of the time people guess far too low. Literally every bit of advice and writing about pricing I've ever read boils down to "Charge more than what feels right; you'll be surprised at how high you can go before you lose customers."
Apple has been applying this strategy since the 1990s.
Tesla bootstrapped itself off $80k cars, and only now is expanding to the "reasonable" $30k market segment.
You may not need everyone to jump on your service just yet; you can start with those who need it most and have the money. Then you expand, economies of scale kick in, and you can introduce lower and lower price tiers, and people enjoy falling prices and getting a bargain.
Perhaps their solution has something specific to the browser which allows them to do it really fast and cost-effectively, e.g. sending just diffs of the DOM to the client.
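Purely as an illustration of that guess (Mighty hasn't published its protocol, and the node-keyed snapshot format below is invented for the sketch), a server could keep the last DOM snapshot per client and ship only the nodes that changed:

```python
# Toy sketch of server-side DOM diffing (speculative; Mighty's actual
# protocol is not public). Snapshots are modeled as dicts mapping a
# node id to its serialized form; only the patch crosses the wire.

def diff_dom(prev: dict, curr: dict) -> dict:
    """Return a patch: node id -> new serialized node, or None if removed."""
    patch = {}
    for node_id, html in curr.items():
        if prev.get(node_id) != html:
            patch[node_id] = html            # added or changed node
    for node_id in prev:
        if node_id not in curr:
            patch[node_id] = None            # removed node
    return patch

def apply_patch(prev: dict, patch: dict) -> dict:
    """Client side: apply the patch to reconstruct the current DOM."""
    curr = dict(prev)
    for node_id, html in patch.items():
        if html is None:
            curr.pop(node_id, None)
        else:
            curr[node_id] = html
    return curr
```

The client applies the patch to its cached copy, so a page where only one widget updates costs a few bytes instead of a re-encoded video frame - which is roughly the bet a scheme like this would be making.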
PS very impressed with MightyApp - joined the waitlist. Congrats :)
positioning for consumer/prosumer is interesting and invites changing the math! opera was notable here as a web accelerator, but also a warning sign for pursuing this as a VC-funded business. the internet is bigger now..
good luck to the mighty team!
but invite list, wooo, I got to get on it
Wondering if anyone did a test (speedometer or something similar) comparing Safari on average macbook vs $30/mo mightyapp.
At the end of the day your search history should be fed into a personal search engine which digests the data and figures out which pages were most useful to you (maybe by helpful browser buttons)…and uploads that into some open database. This can then be the basis for a new type of search engine.
It could be implemented trivially on something like Mighty, since everyone's browsers run in the same datacenter.
Today I want to visualize 100,000 rows across 1,000 dimensions in 10 different tabs.
Between Today and Someday there are endless things I want to do.
Same architecture as: https://v20.ohayo.computer/?filename=discovery-of-elements.o...
Or spend $600 and get an always-on home PC that you can VNC into over your high-speed connection
On the other hand, if this catches on, then I can see people Airbnb-ing their servers
On the third hand, if this catches on, users will soon realize they can spend the $30 to buy the extra RAM they're missing
That Dropbox comment was a bit off, since having an offsite backup of your most important data and having it available across all your devices is super useful. However, I see where he was coming from. I still have on-site backups. And most of the time that's way cheaper for massive backups.
$30-50 USD for browser inception? If I had my entire environment there I could see the usefulness. But the browser alone?
I see some comments where people are already paying. Who is using this?
Might as well exercise the less-used part of the brain where you try to imagine the positive aspects of something.
My favorite example
He quotes a post from Drew thanking Brandon for his remarks, and spends the rest of the essay saying the thanking is an uncalled-for "level of retribution", "effective slander case". But both exchanges between Drew and Brandon (the one in 2007 as well as the one in 2018) seem friendly to me.
My impression is that when linking to Brandon's post, people are usually saying "a company can still succeed by offering something that was previously possible, by making it easier to do" and "don't be discouraged by criticism saying it's already possible". They're not saying that Brandon was a bad person or that his feedback wasn't useful or anything.
Zed also makes a big deal about Brandon not being able to delete his post - but I remember dang mentioning that they would delete posts when asked, but everyone so far has agreed to a compromise of removing the username but keeping the post, which does seem like the best solution in a case like this (where the content of the post has historical value but the author might want to disavow it).
I think the initial HN comment was justified, albeit a bit nerdy; the marketing was just poor at the time. Not being able to delete a post is sort of a problem with all written media; the internet is not your group of friends at a bar. Being able to distance yourself from something you've previously said might be a solution: an "I stand corrected" button, or perhaps just being able to add a strikethrough to an old post.
In the case of Mighty, the experience is known. It is Chrome, just faster. Sure, someone might prefer to use Mighty, fair enough, but there's no "extra magic".
I'm sure the makers have done their research and found $30/month is the optimal price for a browser. Surely a lot of businesses will be convinced it's worth the money because $bigCorp uses it as well, and cargo cults work; I'm just pointing out what money can buy at that price point.
Then someone might figure out that they can rent servers for $30/mo and sell 10 remote desktop subscriptions on them.
The BBC loses an additional 10% of users for every extra second it takes for its site to load. And when Yahoo! reduced its page load time by just 0.4 seconds, traffic increased by 9%.
1 second delay reduces customer satisfaction by 16%
The longer a webpage takes to load, the more its bounce rate will skyrocket.
Nvidia GeForce NOW (cloud gaming streaming) is $10 a month and gives you access to top-of-the-line enterprise GPU/CPU/RAM hardware and nearly your entire Steam, Unreal, Ubi, etc. libraries. I can play Cyberpunk 2077 with fully maxed-out graphics settings with no perceptible latency.
https://play.geforcenow.com (There's a free tier, that gives you access to 1 hour a day of gameplay.)
But you are updating. You're spending $360-600 a year on this.
RAM isn't that expensive, even if you do feel like you need to upgrade again in another 2-3 years. I can buy a completely brand new, good computer every 3 years for that price. And it will be able to handle running 100 tabs.
There are a lot of potential reasons why someone might benefit from a remote browser, but I don't think computer processing power is one of them. My phone can handle running over 100 tabs in Firefox.
I don't know, is this an adblock thing? I currently have ~950 tabs open on my 6-year-old desktop computer, and my computer isn't crashing. I think it's currently using 8-9 gigs of RAM. Maybe my system is particularly optimized, or maybe without an adblocker websites are way heavier and multitasking is a big problem? I do run uMatrix and uBlock Origin, so maybe my experience isn't typical. But the point is, for $30-60 a month I could buy another 16 gigs of RAM.
16GB of RAM and 16vCPUs. What a weird balancing of resources.
They are probably doing things somewhat inefficiently in the beginning, like renting whole, generic VMs for every customer. Both the price and the resource balance should get better when they catch a little scale.
If you're making good money, investing $1-2 a day to be able to work more productively is incredible ROI.
I hope to see people normalize spending $ on software. A lot of software is way under priced, and if it was priced higher, we’d have more incentives for companies to come and make more great software.
I worked there and they had these awful Surface Pros with hardly any memory. Their solution was to use AWS's hosted Desktop for Developers. It... sort of worked OK.
This, by the way, was not just for a few people: because of Brexit there are thousands of people all working on making the new systems for customs etc work.
I suspect organisations that are undergoing digital transformation (as they are) will have this kind of setup. It was rife through the whole place: rubbish old IT stuff rubbing shoulders with modern SaaS.
I suspect it's more so companies can pretend all their old rules about keeping data on site can remain. Still, it's better than going back to the office.
Bless home working.
edit Apparently that solution uses (or used) unencrypted connections, making it unsuitable for most uses. https://old.reddit.com/r/ShadowPC/comments/a6hi2c/anyone_use...
> if it was priced higher, we'd have more incentives for companies to come and make more great software
This is also a strange logic. The definition of innovation and the benefit of competition is to drive down prices for consumers, not up. Let's not turn software into some sort of Veblen good.
Sure, but investing $2/day pays for an M1 MacBook Air in under 2 years. That's why so many of us are struggling to understand this.
It might make sense in the context of companies with weird IT department restrictions that won't let them buy new laptops but will let them spend $50/month on a service.
1. We are more and more moving to a world of highly valuable workers. Improving their efficiency in a high-salary country is easily worth it. A company should be willing to pay 0.4-1% of your salary to make you more efficient.
2. Longer lifetime of company computers. No need to upgrade to M1 yet.
3. Seems like they are building a full-on WorkOS as well. That might also just be worth it.
Once you get above 20 tabs, are you genuinely keeping track of every single one as something to return to later? Or are you just being lazy and lack the personal systems to track what's actually important or needs to be returned to later?
I've been using a 11y/o computer at home for everything--code compilation, VMs, work AND personal life--and this has never been an issue for me.
Maybe I'll give you #3, but if an employee came to me asking for this as a paid subscription, I'd shut the idea down immediately. Seems like another startup trying to fill a space that doesn't need to be occupied.
yes! Ideally I have around 500 tabs that I all need. For example, I let your comment sit here for a while, unsure if I was going to reply to it. There are more topics on HN currently under investigation. Each spawns a series of extra tabs. Cloud browsers, whole OS in the cloud, what happened to Paperspace? I open several articles that I may or may not read. When I get back to this discussion I look over the tabs it spawned and continue exploring while closing old ones... There is a window with music, one with YouTube videos I might want to watch/comment on, with the further research tabs they spawn. A dozen tax tabs, courier services, business card services. Dozens of tabs for websites I'm working on. jsfiddles, specs, demos. Tabs about wind turbines without propellers, roadside wind turbines, covid, oil and coal reserves. And aggregators ofc
Basically, I can only do work or look at depressing shit for so long but I get back to it after watching a cat video.
When closing lots of tabs I go over the topics which helps me remember what I've looked at.
It's funny how many people I talk with have a single tab (usually also a single application and a single monitor) but know instinctively that their approach is better (as if there should be only one metric). I can't begin to explain how much I'm enjoying myself.
In the old days there were webspeedreader and MyIE2, which were much more suitable for the giant session. Then there was tabmixplus, and then came Chrome, which is pretty much a turd with 10+ tabs; then web extensions killed all the good tools.
What I don't really see is why this service needs to exist to solve that particular problem (browser gets slow because too many tabs), because IMO that problem has already been solved very well by most decent browsers. They just swap out the inactive tabs and are able to restore them fast enough even on low-end systems, as long as they have an SSD. Inactive tabs that are not swapped out don't take a lot of CPU resources either. This service sells you a cloud browser with 16GB of RAM, which is pretty much the norm for laptops and desktops now, so it's not going to save you much if 'too many tabs' is causing slowness.
For a while I've used different browsers simultaneously for different things. The session turns out entirely different for some reason, as if one is a different person in a different location. I could see a cloud browser as something like that. I have no idea what would happen. Portability will probably influence the session.
I wish bookmarks were good enough. I use tabs instead, to preserve scroll position and audio/video offsets, and to have a bunch of tabs for a domain with related tabs next to them. Browsers have poor organization for large numbers of tabs, but bookmarks are even worse.
I have no real idea how the session should be organized but I'm sure there are tons of visualizations out there that would work wonderfully. Perhaps some filters with a flow chart for the entire browsing history. Full text search? I don't know.
The price doesn't really matter as I spend way too much time online. 1 euro per day is nothing.
I thought you were trolling at first, but I realize this may actually be serious. You can lose the tab with my comment. I'm a worthless internet stranger, and if you REALLY feel the need to reply, you'll remember, anyway.
How many of those HN topics actually matter? The "may or may not read" stuff I think you can comfortably file under "does not matter" and discard for your sanity's sake.
I waste a lot of time looking at animal videos, too, but I close the tab after. I don't think that counts as something productive or necessary to revisit...
If you're closing lots of tabs, I'd hope you understand those tabs should've been closed earlier--rather than something nostalgic to revisit that never really mattered in terms of what you actually need to do?
It's fun to abuse technology, but at the end of the day, you should ask yourself... why? Is this really making your life more complete? Are you being more productive?
AND we're not even talking about the huge nocode push happening rn which always end up as RAM hogs
AND AND we're not even yet talking about the huge clunky internal tools that some companies have their whole business revolving around.
Or the upgraded machine comes with other differences that the worker doesn't want :)
It doesn't need to be each of these reasons, and it doesn't need to be a combination; I'm just pointing these out as possible ways to justify the pricing.
An M1 MacBook Air can be had for $999.
That's equivalent to 20 months of a $50/month service.
Alternatively, you can finance a MacBook Air for $83/month for 12 months, and then no additional payments after that.
The whole pricing thing is super interesting though, and I'm glad you're having success
The other thing is that browsers need GPUs as well. A lot of stuff is hardware-accelerated in browsers these days. Just a bunch of vCPUs does not help that much.
As it is, the target audience seems to be people with too much money and yet with a shitty laptop unable to run a browser properly. I'm sure these people exist but it does not sound like a great market opportunity.
Also, from a security point of view, I don't think that a lot of Fortune 500 companies would ever agree to this.
Also there's the convenience angle. A lot of developers are running their development tools remotely. Github is pushing e.g. code spaces. VS code can mount things over ssh. It's becoming normal to do that. So why not do that for other things as well? Streaming games is another good example. I would not be surprised to see Google go there eventually.
1. We use GPUs, and many vCPUs actually help with multiprocessing, since most browser tabs will peg a single CPU
2. A lot of Fortune 500 companies have adopted "browser isolation" products that do something similar but aren't focused on speed.
EDIT: I did wonder about offsetting some of my CPU load by renting a VM in the cloud instead of paying the $30, though. Not sure of the cost there.
The problem it'll face being marketed to consumers is that every one of the big JS application sites has already deployed a mobile app that takes care of the "works on light hardware" part.
For the ad-laden, tracker-heavy news sites of the world, there are ad-blocker extensions and Brave.
Independent professionals that have to use a heavy site will opt to upgrade their spec, almost certainly; computer financing has made it so that you can pay $30-50 a month to get a whole new system - why would you pay to get a worse experience?
Now, the enterprise can afford to spend on this and it can even solve some major problems. But that's a "current enterprise" problem, and not where I see tomorrow's enterprise going. There will always be startups aiming to be savvier than this, cut out more fat and not get locked into this particular opex and security model. The basic premise relies on the Web keeping its dominant state and I suspect we're in the midst of a trend reversal against centralized systems.
And... if the current enterprise doesn't provision correctly, it's likely they'll just continue to do so and leave their employees to suffer with 2GB laptops, because it hasn't become mission-critical yet.
So, I really do suspect that while it might have a chance for a few years, it's in a race against time to get some market share and expand differentiating factors. In this respect it could have the success of a Dropbox, i.e. "get big, then run out of places to go".
It's not a bad thing if people get the feeling your service provides great experience, but is too expensive. You can fix this later by dropping price or giving discounts.
I see this being similar: people who spend a lot of time in Chrome, and for whom the improved speeds are highly valuable in terms of opportunity cost, will not think of $30 as 'too expensive'.
The other thing is customer service, like Superhuman, with a $30 a month price tag you can actually give good customer service.
Finally, at this price tag you only need about 275,000 customers to have $100m ARR. I don't know how long the Mighty waitlist is; I do know Superhuman's was last reported as 275,000.
Only time and the market will tell, but I'm really bullish on this company doing great things.
For people who spend all day in a browser, which is a lot of people, I could see it.
A vCPU is about 0.25 of a real core, so it's just a little high.
Browsers can use multiple CPUs, but mostly for multimedia, which you probably aren't using a cloud browser for.
16GB of memory for 16vCPUs is a very weird balancing of resources in anyone's book. Either their definition of a "vCPU" is actually a far smaller CPU share in order to pump up the numbers, or they are overselling CPU hard.
And yes, $50 a month is also a high price point for this.
EDIT: Just because the attitude of this comment really grinds my gears: Here's my patent for network-based content rendering which was submitted back in 2017: https://patents.google.com/patent/US10878187B1
Believe me, I've thought about this a little more than 5 minutes.
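To put rough numbers on the ratio complaint, compare Mighty's advertised shape against AWS's published instance families (used here only as a reference point; Mighty's actual provider and instance types aren't public):

```python
# Back-of-the-envelope check of RAM per vCPU. Reference shapes are AWS's
# published general-purpose (m5) and compute-optimized (c5) families;
# which provider Mighty actually uses is unknown.

mighty_gb_per_vcpu = 16 / 16       # 1.0 GB/vCPU, as advertised
general_purpose = 64 / 16          # m5.4xlarge: 16 vCPU, 64 GB -> 4.0 GB/vCPU
compute_optimized = 32 / 16        # c5.4xlarge: 16 vCPU, 32 GB -> 2.0 GB/vCPU

# Even the family built for CPU-heavy work carries twice the memory per
# vCPU, which is why 1 GB/vCPU looks inverted for a memory-bound workload
# like Chrome.
print(mighty_gb_per_vcpu, compute_optimized, general_purpose)
```

In other words, the advertised shape has half the memory per vCPU of even a compute-optimized instance, supporting the suspicion that either the vCPUs are thin slices or the memory is undersized.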
As a self-hoster, nothing irks me more than more software that takes control from the user and hands it to some random third party.
And I fail to see why anyone would use this: you need high-speed internet capable of streaming 4K, for one, and if you have access to that, then chances are you also have access to a sufficiently powerful computer capable of running Chrome locally.
Coming to security, this is a complete disaster. All your traffic, including passwords, goes to a third-party server, and you have to trust that server not to do anything shady.
This can't be economical either, or it will be too expensive.
And the testimonial on the website: I find it hard to believe that the CEO of a company cannot afford a powerful computer but can afford a (presumably expensive) subscription service giving them access to a video stream of a browser running on powerful hardware.
Like another user said, VNC can already do this, and much more, without the Electron wrapper.
I hope every single one of these cloud-streamed remote-app or remote-OS plays fails and fails hard. They're helping lead the Internet and the computing ecosystem in an even more dystopian direction. I've been happy to see Stadia not really take off.
So let's say this succeeds. Then Google or Facebook buys it. Now all your browser sessions, including passwords, keys, authentication codes, private messages, etc., are globally visible to be data mined.
Who's to say they're not doing this already?
What if this is hacked?
This is worse than that Amazon idea of giving Amazon delivery people keys to your house. In the physical world it's pretty easy to see people when they come in your front door. In the digital world you have no idea what these people are doing with your data. There is zero situational awareness.
I don't usually care about companies' success or failure, this is none of my business after all, but this kind of "innovation" could have extremely unpleasant side effects.
I hope they crash quick.
Tech people are the minority. The market IS moving towards cloud. It's happened, it's happening, it will keep happening. Stadia may have failed now, but it IS conceptually the future of gaming. It's like you're arguing for Blockbuster in a Netflix world. We cannot stop this from happening, no matter how many choirs we preach to. All we can do is find ways to make this happen better.
I think it's more constructive (and technically difficult) to accept that the market is heading to full cloud and we as tech people need to find better ways of making this vibe with good privacy practices.
Personally, I would not use anything like this without knowing a lot more about their security. Even then, maybe we're still a few years out from a security perspective before I would feel comfortable storing my passwords and browsing data with a 3rd party server AND pay for it (wild). But, I could see myself doing this if my privacy was ensured.
I hope these guys really focus on innovating in that aspect, and then I hope they succeed big.
No, it's not like that at all. Nobody is arguing for going back to distributing software in boxes with floppies or CD-ROMs in them.
Reason from first principles, not by analogy. Context and details matter.
Here are some major reasons for the push to cloud. None of these reasons are immutable or universal.
(1) Wimpy mobile devices with constrained power, storage, and bandwidth requirements.
(2) Cloud is the only kind of DRM that works. It's a way to lock things up and make piracy virtually impossible. As a bonus you can still build on "open source" and placate the open source zealots who don't understand the current state of things and are still living in the 90s.
(3) Application delivery and installation/uninstallation are terrible. OSes are broken.
Here are some solutions:
(1) Moore's law, huge improvements in battery capacity, 5G, WiFi 6, etc. are eating away at this problem. This issue will die of natural causes.
(2) The hopelessly naive idea that "information wants to be free" and everything has to be "free" (as in beer) needs to die, be cut into a thousand pieces, burned, encased in concrete, and sunk to the bottom of the ocean. Nothing is free. Software takes a vast amount of labor to produce, and that must be funded. If it's not funded directly and honestly it will be funded indirectly and dishonestly (surveillance capitalism, cloud lock-in, etc.). "Everything has to be free" and piracy actually help push us toward a surveillance capitalist panopticon future.
(3) This might be the toughest problem. Windows is by far the worst offender here with its nightmarish installation subsystem. Closed app stores are another huge problem but eventually I think anti-trust action is going to chip away at that.
That's not by any means a complete analysis. This is just a comment on a HN thread. It does hit the major points I think.
Imagine you buy into this service and they go bust: Suddenly all your history, passwords and cache: poof, gone.
It's exactly what local-first advocates tell us is the current enemy, not closed-source.
See Kleppmann's latest blog post about the GPL: https://martin.kleppmann.com/2021/04/14/goodbye-gpl.html
That's irrelevant, though. The "tech people" aren't preferring local solutions because they're funny this way - they prefer them because cloud-streamed remote apps objectively suck. It takes some knowledge about computers to comprehend how and why exactly, but it doesn't change the facts.
(To use an analogy - doctors are a minority too, but you listen to them when they say you should vaccinate.)
> The market IS moving towards cloud. It's happened, it's happening, it will keep happening.
The important question to ask is: why? Why has it happened, and why is it happening? The answer has little to do with providing value to customers - it's mostly about creating the ability to seek rent. Privacy issues only happen on top of that - they're not the entirety of the problem.
> I think it's more constructive (and technically difficult) to accept that the market is heading to full cloud
Or, we could fight it. Maybe it's a quixotic quest. Maybe not. The market is a dumb greedy optimizer, it flows down the profitability gradients the way water flows downhill. If you want it to flow elsewhere, you have to put obstacles in the way, or cut out a better path.
Every service needs auth. I can't believe nothing is properly integrated. I still have to click and enter a password, which fortunately the browser can create for me. I still have to receive an email and click on a link to validate my account. Web developers still have to create forms, manage the whole process, hash, salt and sauce my password and not leak it.
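For what it's worth, the "hash and salt" step the parent grumbles about is a handful of standard-library lines. Here's a minimal PBKDF2 sketch (production services would more likely reach for argon2 or bcrypt, and the iteration count is just a commonly recommended ballpark, not any service's actual setting):

```python
import hashlib
import hmac
import os

# Minimal password-storage sketch using only the standard library
# (PBKDF2-HMAC-SHA256): random salt in, derived key out. Store both;
# never store the password itself.

def hash_password(password: str, iterations: int = 600_000) -> tuple[bytes, bytes]:
    """Return (salt, derived_key) for storage."""
    salt = os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, key

def verify_password(password: str, salt: bytes, key: bytes,
                    iterations: int = 600_000) -> bool:
    """Re-derive the key from the candidate password and compare."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, key)
```

`hmac.compare_digest` does a constant-time comparison so verification time doesn't leak how many leading bytes matched. The grumble stands, though: every service re-wiring this flow (forms, email validation, resets) by hand is a lot of duplicated effort.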
Plenty of people who can’t afford a fast computer currently have access to a fast internet connection. The ability to substitute internet bandwidth for CPU and RAM will be very valuable for them.
I'm quite probably overlooking something and I'd be curious to learn what.
What makes this hardware not great is the many developers who have fast machines and are OK with their software using a lot of those resources. This makes the experience on older systems slow. It's unplanned obsolescence.
For chrome stuff and using the web I shouldn't need a killer system. No one should.
I'm pretty sure this set of people can't afford $50 for Mighty either.
On the other hand, I don't really run Chrome or Firefox on anything that operates on battery, because I don't like seeing the little battery icon deplete twice as fast, and it barely even matters how powerful the machine is (M1 helps, but there's still a noticeable difference). Maybe there are people who really, really want to run Chrome all the time, but also work mostly on portables and like them actually lasting as long on battery as they're supposed to. Maybe that's worth $50/m to them.
The idea is interesting for lightweight computers e.g. chromebooks and ultrabooks, but it would irk me a lot to have my browser and personal information running on some other machine that I don't control.
What I would be super-interested in though is a self-hosted version of Mighty, that I could install on a Linux box anywhere of my choosing. For example, the server runs on my powerful desktop at home, and my ultrabook in the bedroom can be a client.
(the notion that this is completely fucking absurd since those "render instructions" are called "HTML" and I'm just describing server-side rendering isn't lost on me, but it's not my fault things have gotten so bad that having a server-side browser forward draw commands from bloated "web apps" to a resource-light client might actually be kinda nice)
You can also use Opera Mini in MicroEmulator for server-side rendering of web pages.
The cloud could be the worst thing that can happen to the Internet.
Privacy and Ownership should not be treated as abstract ideas.
Is this not the same as saying "The internet could be the worst thing that can happen to the Internet"?
Many thin-client setups like Teradici PCoIP are used by tons of film studios in post-production. The last few animation Oscar winners were all done without computers at people's desks.
There are already services like Nimble Collective (bought by Amazon) that stream 3D apps like Maya, Blender, etc. to your browser.
WebRTC is already seeing tons of companies move to streaming content to thin-client endpoints. Epic's MetaHuman Creator, for example, runs in the cloud.
That's a weird assumption. Where I'm from, gigabit (or at the very least 100Mbit) fibre is the norm, which means fast, 4K-ready internet cuts across virtually every socioeconomic demographic.
Would be interesting to see how far you can take a raspberry pi with mighty.
How much lithium battery degradation is due to some mobile tab going rogue?
Given that their engineering expenses are a fixed cost and the majority of their spending, they'll be able to lower prices as they scale.
Sometimes a heavy web-app like Twitter will be slow on first load, but Mighty wouldn't help with network speed, right?
Slack is slow because it's slow to load actual conversation data; the iOS app is just as slow as the Electron app. This is not UI jank; it's a slow API and/or insufficient prefetching.
Jira is slow because it sucks; I've used native apps that are slow because they suck.
Other than that, I don't have many relevant experiences to point to. I'm sure for people running older machines the picture is different (the state of web engineering as a whole could certainly be improved on several dimensions), but I also doubt people stuck using slow computers can afford to spend $30-$50/month on something like this.
I'm genuinely asking: what things are slow for you? Is it just the fact that the code has to load before it can request the data (or render anything) that makes it feel slow? Or is there genuine sluggishness? What web apps are you using that I'm not?
Same with electron apps. VSCode is generally among the best. I currently have 9-10 projects open. If I accidentally trigger a font resize by missing cmd-backspace and hit + instead, I’m sitting around for 1-5 minutes waiting for everything to settle. I’ve even hit bugs where trying something app-wide then reverting hit a very slow race condition and just completely deleted my settings.json. Restart to update can take a couple minutes too, and that’s to restore visible functionality while waiting for the changelog tab to randomly show up.
Slack on iOS isn’t nearly as bad as on macOS. But that’s not Electron. I’m in 8 Slack orgs, not a large number compared to some people I know. Refreshing the window takes long enough I just go take a break.
This is on a machine with 64GB RAM, even when it’s not swapping from a couple Chrome windows.
> web pages with lots of ads and fonts
Ad blockers are a thing. I use them on every browser (it's even possible to do on iOS). It makes a big difference.
> Scrolling past one of those dumb sticky videos can cause everything to jump into place, then back out, then back again
There are plenty of annoying dark patterns (and simply poor UX) out there being used, but what I'm trying to get at here is specifically the perception of slowness for the web as a platform. UX problems can exist in any software.
> Navigating back in history can be so slow just hitting the cache that the previous (now forward entry) page shows up again and blocks rendering of the navigation
I think you may be misunderstanding here... some sites - especially news sites - use a dark pattern where they override the back button behavior to prevent you from going back (presumably to increase "engagement", or whatever). You could argue the web platform shouldn't let them do this, but that still wouldn't have to do with "slowness" (and wouldn't be solved by the OP).
> I’ve even hit bugs where trying something app-wide then reverting hit a very slow race condition and just completely deleted my settings.json
This just sounds like a logic bug; bugs exist regardless of platform
> If I accidentally trigger a font resize by missing cmd-backspace and hit + instead, I’m sitting around for 1-5 minutes waiting for everything to settle
> Restart to update can take a couple minutes too, and that’s to restore visible functionality while waiting for the changelog tab to randomly show up
This is absolutely insane to me. I just tried changing the font size in a very large VSCode project with 10 files open and it took 1 second to change the font size for the whole app. Killing the entire app (Cmd+Q) and restarting it took 4-5 seconds.
How many files do you have open? Are you using some crazy extensions that could be poorly-written or interacting badly?
> I’m in 8 Slack orgs, not a large number compared to some people I know. Refreshing the window takes long enough I just go take a break.
Again, totally crazy compared to my experience. I just refreshed the full window for the medium-sized org I'm in and it took 2 seconds for the UI to come back, and another 5 seconds to load the conversation text (the latter is pretty bad, but as I noted in my original post, not related to performance of the actual web platform)
True. They also break things, which I’m not a fan of for personal use. They also make manual testing of web work less consistent with what normal users experience, which I avoid.
> There are plenty of annoying dark patterns (and simply poor UX) out there being used, but what I'm trying to get at here is specifically the perception of slowness for the web as a platform. UX problems can exist in any software.
What I’m describing though is sites using common patterns having such poor performance that I can literally watch a sequence of state changes take place and categorize them as they happen. Forget the ad experience. Common tech oriented sites linked on HN which make it to the front page will frequently show me three to four layout shifts as their fonts load.
> I think you may be misunderstanding here... some sites - especially news sites - use a dark pattern where they override the back button behavior to prevent you from going back (presumably to increase "engagement", or whatever). You could argue the web platform shouldn't let them do this, but that still wouldn't have to do with "slowness" (and wouldn't be solved by the OP).
This wasn’t some back button hijack, I double checked. It was a slow website meeting what I assume is a race condition in the browser, where the state change on load coincided with my decision to stop waiting. And it happens a lot on iOS Safari on perfectly trustworthy sites.
> This just sounds like a logic bug; bugs exist regardless of platform
Sure, like I said, race condition. But exacerbated by how slowly reverting some mistake might take effect.
> This is absolutely insane to me. I just tried changing the font size in a very large VSCode project with 10 files open and it took 1 second to change the font size for the whole app. Killing the entire app (Cmd+Q) and restarting it took 4-5 seconds.
> How many files do you have open? Are you using some crazy extensions that could be poorly-written or interacting badly?
Like I said, I have 9-10 projects open. Assuming I have 10 files open in each (I have more, but that wouldn't matter if the app were using native controls), that's 9-10 times the same thing you tried. Each instance is its own process pool. But they're all responding to the same event asynchronously.
> Again, totally crazy compared to my experience. I just refreshed the full window for the medium-sized org I'm in and it took 2 seconds for the UI to come back, and another 5 seconds to load the conversation text (the latter is pretty bad, but as I noted in my original post, not related to performance of the actual web platform)
This is also not comparable to what I described, you refreshed one instance vs my 8. And again this would not be an issue using native controls, which would not be running 8 separate instances.
- - -
You seem pretty focused on defending the web and web technologies in the abstract. I'm not necessarily even disagreeing with that. But real-world usage of web tech is the reason things are bad enough that I do experience the performance degradation I describe.
I’m not your typical HN anti-JS zealot. I’m just very disappointed with how bad the common web based product is.
You’re mostly right that it’s not the underlying tech that’s bad but how it’s used. But not totally. It’s the only UI platform I’m aware of that developed a huge resource intensive multiprocess model to work around the fact that common usage routinely blocks shared resources and routinely crashes.
This makes no sense to me. On my machine, the desktop Slack client has noticeable input lag and it takes ages to perform any action. Try installing Ripcord and compare them; they are talking to the same API, but Ripcord doesn't make me want to throw my laptop out the window.
But Mighty is not a solution, it is just a band-aid that will perpetuate the problem and make UI developers even more lazy because they can assume their crappy Electron apps are always running on a beefy machine in the cloud.
Again, I'm not exactly using ancient computers, but that's been my anecdotal experience. I was working from a MacBook Pro that was 3-4 years old at one point, for what that's worth. Not maxed out, though I'll admit it was probably still not a slouch.
I have 16 tabs currently open in firefox on my MBP. Everything is snappy.
On my desktop (which, to be fair, is very powerful) I have maybe 40 or so tabs, the majority of which never get loaded because they are saved by the tree-style-tab extension, and I don't visit some of the subtrees often.
Literally the only webapp I use that feels slow any more (after I stopped using Gmail) is Notion, and they know they have perf problems. Like you mentioned, these things are slow (Gmail, Notion, Jira, whatever) because they... suck. Gmail is/was just as awful on my powerful desktop as it is on the laptop. I just don't get what this buys me.
That's UI jank on top of network issues. Another commenter mentioned Ripcord, which is a good baseline for how fast Slack or Discord should be.
At least for myself, when I say web is slow as a category - including wider web technologies like Electron - I'm mostly thinking about UI performance. Any time a website takes more than 50-100ms to react to an input event, it's noticeably jarring. If it's consistent, it makes the experience of using that site painful. And unfortunately, this problem is common across the board in everything done with modern web tools and principles.
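For a sense of why ~100 ms registers as jank: at 60 fps the browser has roughly 16.7 ms per frame, so a handler that blocks the main thread for 100 ms drops about six frames in a row. A back-of-the-envelope sketch (the 60 fps budget is the only assumption here):

```javascript
// At 60 fps the browser has ~16.7 ms per frame. A handler that blocks
// the main thread for `workMs` milliseconds drops roughly
// workMs / 16.7 frames before the UI can respond again.
const frameBudgetMs = 1000 / 60;

function droppedFrames(workMs) {
  return Math.floor(workMs / frameBudgetMs);
}

console.log(droppedFrames(100)); // 6 -- a visible, jarring hitch
console.log(droppedFrames(10));  // 0 -- fits inside a single frame
```

Anything that consistently fits inside one frame budget feels instant; anything that drops multiple frames per interaction reads as "the web is slow."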
Here's one theory: the web makes it super easy to add lots of little animations to apps. Discord in particular takes lots of advantage of this. Is it possible all the little animations are making things feel slightly less "instant", and being mistaken for input lag? That wouldn't impact typing, but it could explain some of the rest.
Having powerful machines would be one thing. I really implore you to try using a low-end machine for a while and see how bad the web is. I currently use a 4GB 2015 MacBook Air, and I often see Discord and other websites-masquerading-as-apps hogging upwards of 2 gigs of RAM, which is inexcusable. I can hardly believe that animations would contribute to lag (or perceived lag), especially because plenty of completely native apps have these "micro-interactions" and still feel fast and responsive.
On the other hand Ripcord, an alternate client for Slack and Discord sits at 50mb of RAM and single digit CPU usage.
Of course there are what seem to be caching / network-related issues, like switching between conversations always takes forever. But there are also clearly UI issues, like when I try to scroll up in a newly-opened conversation, it scrolls a bit, waits to load, then it sends me all the way back down again before jumping around to some random position. This happens when there are no new messages in the chat and I only try to scroll a little, not go back days.
And the crown, for me: somehow, the number of letters out of order when I type is through the roof in Teams. It happens practically on every message I send, whereas this basically never happens in Telegram (where I send the same kind of short messages) or when I write long-form emails.
I guess I'm happy as long as the keyboard response is (very) good. The only time I notice real slowdowns are when my actions get out of sync with the system. A slow terminal or text editor drives me insane and is one reason I really can't use VSCode; Sublime Text never makes me wait.
Granted, there are things that are genuinely slow for me and drive me up a wall (ie. most issue tracking) but overall, I'd say that my daily use is pretty good. It's certainly not bad enough that I would offload my browsing to some cloud based system. But like you, I'm probably not the target market here.
I've seen people say things like "Safari is much faster than Chrome" but I don't really see it. Sure, it can seem a bit quicker on some sites but most of the time I don't really notice it. I do notice things like CPU and energy usage between those two browsers, but I'm mostly plugged into power all day anyway so it doesn't make any practical difference which one I use. Perhaps when I get a new M1 machine (ie. 2021 16" MBP!!) I'll feel differently. Perhaps.
For anything else that comes to mind, today's browsers + adblock work for me. I have my 50-plus (mostly inactive) tabs open all the time.
With slack though, it absolutely is UI jank. The interface is an absolute nightmare to use on phones (I'd say deliberately so, to force you to use their app).
But even then: surely native enterprise software was/is just as bad?
It would if they pre-fetch the content based on your browsing behaviour.
The web isn't slow because it takes a second longer to load a website. The web is slow because, once loaded, the website takes 100+ ms to react to a click or a keypress. Plenty of popular websites are so far off the mark that they take half a second or more to react!
I haven't used mighty but I'm basing that on my own experience with similar technology.
I think there's a window of optimal use, bracketed by low and high download bandwidth. If you're faster than that, maybe the only speed-up you get is if your machine has a slow CPU. If you're slower than that, I suspect the video streaming they're going to use will produce lag that makes your experience worse than if you were just loading the site directly.
If Mighty wanted to push that lower bound down, instead of streaming video they might be able to stream changes to the DOM. They could compile a sort of single-file version of the page on their server, inlining all the third-party resources and styles, and then whenever the layout changed, stream those style or DOM changes down to you. As far as I can tell, that's basically the minimum amount of information you need to replicate the experience. It might even help a little on slow-CPU machines, via a sort of tree-shaking of styles and resources that are not used.
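The DOM-diff idea can be sketched in a few lines. This is purely illustrative (a toy snapshot format, nothing to do with Mighty's actual protocol): the server keeps the last snapshot it sent and, on each change, ships only a patch list of changed subtrees instead of video frames.

```javascript
// Toy DOM snapshots: each node is { tag, attrs, children }.
// diffTree walks old and new snapshots in parallel and collects
// { path, node } patches -- the minimal payload to send downstream.
function diffTree(oldNode, newNode, path = [], patches = []) {
  if (JSON.stringify(oldNode) === JSON.stringify(newNode)) return patches;
  if (!oldNode || !newNode || oldNode.tag !== newNode.tag) {
    patches.push({ path, node: newNode }); // replace (or delete) whole subtree
    return patches;
  }
  if (JSON.stringify(oldNode.attrs) !== JSON.stringify(newNode.attrs)) {
    patches.push({ path, node: { ...newNode, children: undefined }, attrsOnly: true });
  }
  const len = Math.max(oldNode.children.length, newNode.children.length);
  for (let i = 0; i < len; i++) {
    diffTree(oldNode.children[i], newNode.children[i], [...path, i], patches);
  }
  return patches;
}

const before = { tag: 'div', attrs: {}, children: [
  { tag: 'p', attrs: {}, children: [] },
] };
const after = { tag: 'div', attrs: {}, children: [
  { tag: 'p', attrs: { class: 'highlight' }, children: [] },
] };

const patches = diffTree(before, after);
// Only the changed <p> travels over the wire, not a full frame of pixels.
console.log(patches.length);                 // 1
console.log(JSON.stringify(patches[0].path)); // [0]
```

Real diffing (as in virtual-DOM libraries) handles keyed reordering and text nodes too, but the payload shape — a list of paths plus replacement subtrees — is the essential idea.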
From this point of view, with enough development, Mighty could be purchased by Google as a sort of deluxe subscription tier for Chrome, with bundled premium subscriptions to various streaming services and so on. Bundling up content, delivery, and medium isn't really novel; I think similar things have happened with cable TV, magazines, and news to some extent.
As for the good point you make that people at the slow or low-spec end might often not be able to afford this kind of service: that lowest of the low end might be a real, focused niche... say, people on airplane Wi-Fi or in remote locations on a satellite link.
But from a purely product-marketing and psychological point of view, I don't think a product needs actual technical superiority or real measurable utility to become a big hit. It really only needs something that makes people want to use it. Mighty could position itself as a sort of luxury upgrade for people with already-good specs.
From the point of view of people who already have that successful, wealthy mindset, many may consider time their most valuable asset. The accumulated frustration and annoyance of waiting for websites to load is something they are prepared to pay a service to get rid of, in exchange for an experience they feel is more in line with their station and their expectations of life in general.
Two fps at 34 KB each is ~550 kbps, by the way, not 60 kbps.
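For anyone checking the arithmetic (decimal vs. binary KB shifts the figure slightly, but either way it's nowhere near 60 kbps):

```javascript
// 2 frames/s at 34 KB (binary) per frame, converted to kilobits/s.
const fps = 2;
const bytesPerFrame = 34 * 1024;           // 34 KiB per frame
const bitsPerSecond = fps * bytesPerFrame * 8;
const kbps = bitsPerSecond / 1000;
console.log(kbps); // 557.056 -- roughly 550 kbps, an order of magnitude over 60
```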
> Not sure if serious :-/
And I said not really joking so I guess you don't want to believe me XD
If you would use your 2fps streaming browser to read, say, hacker news, every scroll operation would be hideously slow and pull in another ~60KB per second, even though the page data itself is only a few KB and never changes. Your ‘streaming solution’ only makes sense if the total amount of data to fetch for the page itself outweighs the total amount of data for all the frames you need to stream while you are using the page. Which is probably almost never, unless you always look at static single-page applications which continuously pull in data on the backend without presenting anything new at the front end. Highly unlikely.
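That break-even can be written down directly: streaming only saves data while the page's own transfer size exceeds the accumulated frame data, so the longest session where streaming still wins falls out of one division (all numbers below are illustrative):

```javascript
// Streaming uses less data only while:
//   pageBytes > fps * bytesPerFrame * sessionSeconds
// Solving for sessionSeconds gives the break-even session length.
function breakEvenSeconds(pageBytes, fps, bytesPerFrame) {
  return pageBytes / (fps * bytesPerFrame);
}

// A 2 MiB page streamed at 2 fps, 34 KiB per frame:
const secs = breakEvenSeconds(2 * 1024 * 1024, 2, 34 * 1024);
console.log(secs.toFixed(1)); // "30.1" -- after ~30 s, streaming costs more
```

Thirty seconds is far shorter than a typical reading session, which is the parent comment's point: for anything you actually linger on, streaming frames loses on bandwidth.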
> the ‘not serious’ part is how anyone could find that acceptable
I guess you don't have a beeline on what everyone finds acceptable. That's normal, you can only share your perspective not everybody's.
> every scroll operation would be hideously slow
I guess you haven't experienced it because what you describe is not how it works.
The two frames per second is not a 60 fps source being streamed down to you at two frames per second; it's capturing two frames per second from the source and sending them to you, because that's what your bandwidth will permit.
> Your ‘streaming solution’ only makes sense if... Highly unlikely.
Only if the goal is a reduction in bandwidth used viewing the page. There are many other goals where streaming the browser makes a helluvalotta sense.
I get that you had this focus on bandwidth, because I think it's the main obvious focus of this thread, but there's an expanded context in which these things operate. I'm sure you'd appreciate that if you'd experienced it.
STOP CREATING MIDDLEMEN! It's going to cost me 30 bucks just to browse the web, on top of what I already spend, so someone can collect a "handling fee". Jesus, I feel like the world is going nuts.
This doesn’t solve the Figma case tho.
It is an electron app, as unlikely as it sounds.
That sounds... just ridiculous.
edit: omg really
And the solution to this is to put the browser in the cloud? So what’s the desktop browser on your new $3,000 mbp now, like... a demo environment?
It boggles my mind that we’re not demanding the web bloat stop. Maybe figma just doesn’t really work as a web app! If I have to run my browser in a datacenter, I think it’s fair to say it doesn’t.
As a web dev I’m just embarrassed. How are we not saying “this is too much, stop making web apps that crash my computer it’s not worth it.”
My point is not that I'm condoning it; I just happen to think it's probably inevitable.
There's also an analogy between handmade, craft-made things (indie products built outside the system) and the movement of indie, bloat-free websites. I think they're both destined to be small slices of the eventual mainstream market.
Getting even more meta: societies tend to capture more energy over time, and more energy ends up crystallized into more matter, so we're going to produce more things. Ignoring some inflection points in technology and efficiency, we'll use more energy to produce more things, so things are probably going to bloat out.
This "DRM" plays at least some role in making the optimizers in V8 work a lot harder to get anything reasonable out of the spaghetti.
Why Google needs DRM for a web email app is beyond me.
The reason we use such tactics is to raise the barrier to reverse engineering, because our teams value their work. Some people claim that security through obscurity is bad. I challenge this view: I claim that every security defense, such as RSA, is a form of obscurity.
It's only a matter of time until RSA breaks, the same way obfuscation does.
Gmail is not your build-it-over-a-weekend kind of app. It's highly sophisticated and delivers huge value.
There are a lot of people who hate obfuscation. Some are communists and others are attackers.
My wife (she works in the fraud detection department) found an interesting attacker who masqueraded as a security researcher and student of X University, but in fact he was a criminal scum. He had reverse engineered the anti-fraud scripts of many websites and published them on GitHub for everyone to see. His main goal was to attract malicious buyers and sell them scripts that bypass this protection. It was one heck of a marketing scheme.
Brian Krebs also had similar story on his blog.
First, encryption is not "obscurity" in the same way you think DRM is.
Second, several other email providers don't think they need to rely on some performance-killing DRM to "protect" their web app (oh no, what of all the value!).
Outlook ships some of its files minified but doesn't use any obfuscation; apps like ProtonMail and Tutanota are even open source.
(I'm actually starting to migrate off of Gmail to Protonmail myself.)
See: https://github.com/ProtonMail/proton-mail/ (the new site, on beta.protonmail.com)
Oh, and there's no need to call people "communists", "attackers", or "criminal scum". Be civil.
> Quantum computers will break RSA
Breaking RSA will take X amount of time; so does breaking any protection like DRM.
The goal of any security method is increasing attack time.
TLS got attacked, SSL got attacked. History repeats itself. Period.
> Oh, and there's no need to call people "communists", "attackers", or "criminal scum". Be civil.
Why? I have a right to use these terms. What should I use instead?
Would you call Osama Bin Laden "His Highness Bin Laden"?
The words exist for a reason. I use them in the appropriate context.
People don't understand the Russian soul. I'm very direct and speak my mind!
>> Second, several other email providers don't think they need to rely on some performance-killing DRM to "protect" their web app (oh no, what of all the value!).
>> Outlook has a part of their files minified, but doesn't use any obfuscation; apps like ProtonMail and Tutanota are even open source.
So? What's your point?
You have Linux, which is open source, and you have Windows (a lot of parts, including its licensing, are obfuscated).
The performance hit is minimal. ProtonMail and Tutanota are way slower than Gmail and lack the cutting-edge features we offer.
Gmail vs Outlook is like Ferrari vs Toyota.
Gmail has great UX; even my grandmother can use it.
While there may be a case for DRM in some places, gmail is almost certainly not it.
Attackers don't care about laws. All they care about is their end goal.
You have fraudsters who game AdWords, reCAPTCHA, etc.
Gmail is a strategic tool.
>> the chances of gmail's client side bits doing anything that novel that's also competetively important are slim to none
You are underestimating the value of the Gmail product. I'm not allowed to share what kind of value the client side has, but it certainly does.
>> While there may be a case for DRM in some places, gmail is almost certainly not it.
Again, you are underestimating the value Gmail provides to consumers.
This country is a democracy. Companies can obfuscate or de-obfuscate code as they wish, whether there is value in it or not.
Privacy people can use privacy-oriented tools, or go build their own, seriously.
DRM is a billion-dollar industry!
Obviously people and corporations can choose to obfuscate; their prerogative. Doesn't mean it's effective nor wise in every instance, though, does it? Gmail is entirely free to waste effort and make its app slower and less (easily) maintainable, no question there.
You might call them spammers but they are often fraudsters.
I'll bite once again. From personal experience I knew Gmail was slower than ProtonMail, but I tested it anyway: I loaded both Gmail and ProtonMail with the browser's profiler running.
Gmail spent 6x the time ProtonMail did in the garbage collector, and 2x the time ProtonMail spent in the JIT compiler.
DRM is a contributor to that.
You always have the option of loading "Basic HTML", and you can get a ProtonMail- or Toyota-like experience there ;)
I don't know what your agenda really is. Attacking DRM is bad.
You have issues like spammers abusing the Gmail interface to send emails from Google IPs, and there DRM rocks.
For example, until this month Notion was extremely slow and everyone complained. They fixed it recently and no one's complaining, but the important thing here is that no one left the product for being slow. Maybe there is a way to reduce bloat on the web and ship desktop apps while keeping pace with the modern app-dev experience, but there surely isn't one right now. Maybe WebAssembly will help? Let's see.
Would you please elaborate on this? I tried to find some info but no luck. Does gmail.com actually load some virtualization technology in the browser?
Google writes gmail but they also write chrome and V8, so they are in a unique position of writing both the application and the platform it runs upon. Presumably this would allow them to make something more performant than most, not less.
I, however, got downvoted for explaining the purpose of all this. Guess attackers hate me. Whatever. Leaving HN for good!