I recommend that everyone who reads this do their part to take back control. One of your computers should run Linux. Diversify your clouds and mobile platforms. At least one of your browsers should be open source in the true sense.
Take control of your own data; at the very least, keep some sort of physical backup of the resources you create (code, music, videos, documents, etc.)
It’s scary IMO.
And there is so much vendor lock-in that I’d much rather just start from scratch and do everything myself. They say the point of the cloud is easy deployment, but honestly that only works for hobby projects. For anything serious you will be tweaking everything quite a bit, so there isn’t much benefit to these ready-made solutions: you will eventually hit a roadblock they can’t easily overcome, and you will either spend a lot of time making your cloud solution work or abandon it for something else.
Definitely a huge risk of losing control; the web and everything around it is becoming more closed.
Agree with you 100% that we need to support open movements
It’s like the App Store debate from a decade ago. It was obvious that opting in to curation was a bad idea, but first movers were rewarded with huge profits because Apple was effectively giving them a ton of free advertising via discovery.
This will be the same. The true long-term cost of losing control is massive and is not being adequately assessed as the huge risk it is.
Working from a thin client also means constant monitoring of your work: every character you type and every mouse movement will be recorded and fed to an AI.
The erosion of control is mostly a voluntary one; cloud offerings make life simpler and more convenient, and that's very hard to beat, especially the ones aimed at consumers. I am confident that people are losing less personal data thanks to the big tech companies, even at the expense of losing control. So it's a net positive for individual consumers. Perhaps it's not if you look at the societal level, but what would be the main arguments at that level?
Hahaha, there is a certain irony about this getting downvoted in a discussion about erosion of control.
Erosion of freedom is almost always voluntary at first.
Look at how the perpetual-license Office apps compare with the Office 365 apps. It’s a slow boil, but eventually the only viable option will be the online version.
This is an extremely dangerous trend, and it is why I stopped using VSCode: I think it is a potential Trojan horse.
These standards have revolutionised IDE language support in the last few years, meaning I can now run my code on a remote machine and connect to it using VS Code or any other LSP-compatible IDE.
I write a language server for a niche scripting language, and Microsoft's specs mean that once I wrote support for VS Code, Sublime Text, Emacs, Vim, Atom, etc. all just had support too.
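To give a feel for how little editor-specific code is involved, here is a minimal sketch of a server built on Microsoft's vscode-languageserver package; the completion item is a placeholder, and any LSP-capable editor can attach to it:

```typescript
// server.ts — minimal language server sketch (vscode-languageserver v7+ import path)
import {
  createConnection,
  ProposedFeatures,
  TextDocuments,
  TextDocumentSyncKind,
  InitializeResult,
} from 'vscode-languageserver/node';
import { TextDocument } from 'vscode-languageserver-textdocument';

const connection = createConnection(ProposedFeatures.all);
const documents = new TextDocuments(TextDocument);

// Advertise capabilities once; every LSP client speaks this same handshake
connection.onInitialize((): InitializeResult => ({
  capabilities: {
    textDocumentSync: TextDocumentSyncKind.Incremental,
    completionProvider: { triggerCharacters: ['.'] },
  },
}));

// Placeholder completion: a real server would inspect the document here
connection.onCompletion(() => [{ label: 'exampleKeyword' }]);

documents.listen(connection);
connection.listen();
```

Write the analysis once, and every editor with an LSP client gets it for free.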
You can, right now, run VS Code on your Linux server, and another copy on your Windows desktop, and just use SSH to connect them seamlessly: https://code.visualstudio.com/docs/remote/ssh
Sure, this currently uses a Microsoft server to set up the connection, but I'm sure the moment they lock that down someone would fork the extension and provide an alternative.
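The client-side setup is only a couple of commands; the host name here is a placeholder for whatever is in your ~/.ssh/config:

```sh
# Install the Remote - SSH extension on the desktop copy of VS Code
code --install-extension ms-vscode-remote.remote-ssh
# Open a project that lives on the Linux server, over plain SSH
code --remote ssh-remote+myserver /home/me/project
```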
Microsoft mainly does things that help lock people into their systems. Why do you need to connect to their server at all? It's all about control.
I can release an extension with whatever license I want.
The problem is that I sense a potential bait & switch tactic.
I don't want to live in a world where the norm is that everything is on the so-called "cloud".
I want full ownership of my files and my tools.
We’ll pay for everything. We’ll be renting our syntax highlighting if they think people will tolerate it. I think we’ll pay per CPU cycle, but the cost will be based on feature tiers, i.e. CPU cycles × features = mega profit.
Again, go look at the LSP.
There are already alternative IDEs that are compatible with VS Code extensions. You can use the exact same community written syntax highlighting extension in a different IDE without touching any Microsoft code at all...
It would be foolish, if not impossible, for a startup's infrastructure to hold the same level of certification as Azure, GCP, or AWS.
That last part is what concerns me about these products. Eventually we may have to justify using a local dev environment, just as some developers currently have to justify using a laptop, as it's more 'risky' for the employer.
Everyone crows about due diligence: removing administrative access, putting in security controls everywhere, not using your own hardware, locking down this, that, and everything, centralising control and environment access. It adds a lot of friction, but it realistically ignores the fact that some of the core libraries and packages our products depend on are probably put together by someone toking on a bong at 2AM in an insecure rural outback shack, on a laptop riddled with porn and warez that was bought from eBay.
Please stop making up things in your mind just so you can get mad at them.
The base image used in these environments supports lots of common languages, but I'm more excited about each repository defining its own development environment via Docker container images and .devcontainer files for specifying extensions and settings. This will finally allow people to contribute to open source without having to invest time in setting up the necessary tooling for the project. These devcontainers work locally too, via the Remote - Containers extension.
All in all, I expect this to significantly lower the barrier to participating in open source development.
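For illustration, a minimal .devcontainer/devcontainer.json might look like this; the name, extension ID, and command are just example values:

```jsonc
{
  // devcontainer.json allows comments; every value here is a placeholder
  "name": "my-project",
  "dockerFile": "Dockerfile",
  "extensions": ["dbaeumer.vscode-eslint"],
  "settings": { "editor.formatOnSave": true },
  "postCreateCommand": "npm install"
}
```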
Is that something that could run without GitHub / Codespaces?
To get started, all you need is Docker, VS Code, and the Remote - Containers extension.
I have been using it to play around with multiple languages locally, and also adding devcontainers to new projects so I can easily hop into isolated project development environments.
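The local flow is roughly this; the repo URL is a placeholder:

```sh
# One-time: install the extension that understands .devcontainer folders
code --install-extension ms-vscode-remote.remote-containers
# Per project: clone and open, then run "Remote-Containers: Reopen in Container"
# from the command palette to build and enter the environment
git clone https://github.com/some/project && code project
```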
The general premise is great and I've wanted something like that for years. One thing I don't like is the Dockerfile that lives alongside the .devcontainer file. Using that to install a portion of the dev environment is a mistake IMO because things like 'apt-get install something' are non-deterministic.
I think it would be amazing if that Dockerfile were hooked into CI and used to build an official dev container for the project every day or every week, and if that were the container used for development. I already try to do something similar with an official project build container (i.e., a container that can build the project and is used for all CI builds), and the dev container would perfectly double up on that responsibility.
The big win with having an immutable container for dev/building is that it could be kept around if it's used to do a production build. For example: when building v1.0.1, pull the current dev/build container, tag or label it, push it into a special registry, and use that container for the production build. That way anyone could check out the 1.0.1 dev container and have the exact, old environment that was used to build the production artifact(s) for that version.
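Sketched as shell commands, with made-up registry and image names:

```sh
# Freeze the current dev/build image at release time (v1.0.1)
docker pull registry.example.com/myproj/devcontainer:latest
docker tag  registry.example.com/myproj/devcontainer:latest \
            registry.example.com/myproj/devcontainer:v1.0.1
docker push registry.example.com/myproj/devcontainer:v1.0.1
# Run the production build inside the frozen image
docker run --rm -v "$PWD":/src -w /src \
    registry.example.com/myproj/devcontainer:v1.0.1 make release
```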
I honestly think using this type of setup will become an expectation at some point. It's just too awesome not to use, which is part of what scares me about it. For now it's not tied to Azure and I can run it locally, which is amazing. If it ever becomes exclusive to some kind of "run on Azure" play by MS, it'll be terrible, because it's so compelling you won't be able to ignore it.
Note that there are alternatives to this containerized approach in the Nix world, with Nix packages, which I think may be a better approach for deterministic dev environments; but so far containers are all the rage.
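For comparison, a minimal sketch of a pinned Nix dev environment; the channel and packages are arbitrary examples:

```nix
# shell.nix — run `nix-shell` in the project root to enter the environment.
# Pinning a branch tarball is approximate; pin an exact commit hash for
# full determinism.
{ pkgs ? import (fetchTarball
    "https://github.com/NixOS/nixpkgs/archive/nixos-20.03.tar.gz") {} }:
pkgs.mkShell {
  buildInputs = [ pkgs.python3 pkgs.nodejs ];
}
```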
Yes of course, I could do a proper setup with my own IDE and then GitHub Desktop or something similar, but honestly I have so many things going on in my life that I have no interest in setting everything up perfectly; there's a mental cost associated with that.
Also, when I am switching between my Mac, my VMware Linux, and my work Windows PC, I don’t have to worry about setup: just log in and get typing!
I would guess I’m not the only one!
EDIT: Looks like it won’t be free after the beta. That reduces the value proposition a fair bit. Even though I wrote above that I prefer avoiding the GitHub Desktop setup for every new repo, I’m not going to pay for that service given it doesn’t take that long and is relatively straightforward, unless it was a very nominal charge. That said, GitHub is a great service, so I would support them just for the sake of supporting them; but that’s separate from the value proposition of this.
But you just opened my eyes.
Actually that’s software these days summed up really.
As a commenter pointed out, we're back to mainframes. We got phones which got smarter and pretty damn powerful only for us to go back to web apps.
So the pricing is not over the top.
After watching https://www.youtube.com/watch?v=-enIM4x-KPA I want to add it to my toolchain containers and have everything in one place.
I wonder how it will be priced by GH or using alternatives like: https://github.com/cdr/code-server
But since most C developers are now using VS Code, if the performance is fine this could change the way embedded software development tools are distributed.
[Edit] the pricing is now public and looks fair:
Instance Type (Linux)                      Per-hour rate
Basic    (2 cores, 4 GB RAM, 32 GB SSD)    $0.085
Standard (4 cores, 8 GB RAM, 32 GB SSD)    $0.169
Premium  (8 cores, 16 GB RAM, 32 GB SSD)   $0.339
I'm using CoCalc for the first time this year to teach my classes, and it's not clear what advantages this would have, since CoCalc gives you complete control over a full Linux environment and has collaboration set up out of the box.
This essentially achieves that goal, multiplied across any device with a web browser.