NPM Is Joining GitHub (github.blog)
1829 points by mholt on March 16, 2020 | hide | past | favorite | 557 comments

Microsoftie here — throwaway for obvious reasons.

Microsoft doesn’t do everything right but the GitHub acquisition has honestly gone better than I ever expected. Rather than forcing GitHub to adopt Microsoft centric policies, Microsoft has adopted more GitHub stuff, especially from a product POV. GitHub still runs as a separate company (different logins and health care and hiring systems) with its own policies and point of view.

The reality is npm was in a bad place and in a land of not good options, this strikes me as the best possibility. I’d rather have GitHub control this and be able to give the resources to npm than a company like Oracle or Amazon or even Google or Facebook to own it. In a perfect world, some independent entity could fund npm out of gratitude but at the same time, consider how poorly npm as a company was run for YEARS and the general lack of direction.

So yeah, I’m cautiously optimistic this won’t be fucked up by GitHub — but I understand the concern.

As for those worried about Microsoft embracing, extending, and extinguishing: lol. Even if that were the goal (and I truly don’t think that’s the ethos at all any more), Microsoft is laughably incompetent at achieving that sort of strategy. Google and Amazon have the EEE under lock right now (Facebook too — let’s be glad Zuck didn’t buy this after we saw what happened to yarn), but Microsoft can’t even put together a coherent dev strategy outside of .NET on Azure.

> Microsoft doesn’t do everything right but the GitHub acquisition has honestly gone better than I ever expected. Rather than forcing GitHub to adopt Microsoft centric policies, Microsoft has adopted more GitHub stuff, especially from a product POV. GitHub still runs as a separate company (different logins and health care and hiring systems) with its own policies and point of view.

That's what we said about the Skype acquisition too.

"It's different this time, it will run independently, for once Microsoft won't interfere and destroy the acquired company".

3 years later (I was there), 50% of Skype's original management and developers had left. All the major new Skype projects turned out to be integrations with the endless already-existing Microsoft products: integration with Lync, integration with Microsoft ID, integration with Microsoft UI, etc...

5 years later, Skype is dead... but everyone left already.

Good job Microsoft.

Skype is crap, it was doomed to die around that time frame anyway. If anything, Microsoft Business kept Skype going longer than it would have otherwise.

Seriously, go use products like Google Hangouts, Slack or Microsoft Teams (Microsoft's surprisingly better clone of Slack) and then tell me that Skype was ruined by integrating with Microsoft's other product base. The closest argument you might be able to make is that Skype could have developed new features or updated its UI and stayed competitive, but that wouldn't give them the competitive edge that its competitors nearly all had in terms of being backed by major tech companies. Additionally, the major revenue source was Business tier use, which Microsoft dramatically improved by integrating it with their other cloud-based business suite applications.

Microsoft didn't kill Skype, if anything they extended its life expectancy and value in markets that actually paid to use it. It's dead/dying now because better tools have been developed and marketed to replace it. That's just standard product life cycle.

> Microsoft Teams (Microsoft's surprisingly better clone of Slack)

Personal experience, but I have not found a single person who likes MS Teams. Usually people around me say that Slack is ok-ish, while MS Teams is crap.

Agree, Teams is an absolute shitshow of a communicator. Everything's wrong about it, from unintuitive UX and an incredibly slow UI up to terrible API design, where even making a simple bot is laughably hard (try working with 'chats' instead of 'teams' or 'channels' and you'll see). Recent days have also shown how bad their infrastructure is; they're not capable of handling bigger traffic at all. I used Slack for equally long before that and it was amazingly good in comparison (still not perfect though).

I'll tell you one thing it does better: audio/video conferencing. We went full WFH and the usual suspects couldn't support us, but we discovered that teams is a thing we pay for and operates like a champ, haha.

While Teams isn't as good as Slack IMO, it is better than the experience of using Skype (Lync) For Business by a country mile.

I've had the misfortune of being forced to use Skype and I would kill to have it replaced with Teams at my workplace.

Skype for Business is not Skype. It's just a rebranding of Lync that happened after the Skype acquisition.

There is a half-baked link between Skype and Lync to make it seem like they're the same product. This link is all shitty, mainly because Lync and Skype work completely differently, and even simple things like call signaling have to deal with years of legacy Microsoft SIP extensions that were punched into Lync.

I remember the first time we received access to the Lync code repository. That stuff was a multi-GB (!!!) repository that contained files, promo videos, PDFs, code, etc. That tells you a lot about the development practices behind Lync.

I still run a Lync Server 2010 at home to chat with my wife using an MSN-looking messenger. We used to chat a lot using MSN Messenger and this is a way to hold onto it. I don't mind Lync at all!

I actually enjoy it, more than slack.

You haven't suffered worse tools. Have you ever heard of Lotus Notes / Lotus Sametime?

It's not great, but in the corporate space, Teams does an OK job, and MS is a reputable company for B2B.

"Better than Lotus Notes" is not a high bar.

> Seriously, go use products like Google Hangouts

Corporate standard at my place of work. I hate Hangouts with a passion. It's the worst messenger, feature- and UI-wise, that I've seen for years.

> Microsoft Teams (Microsoft's surprisingly better clone of Slack)

No it isn't. Used Teams at another company which adopted it as the standard. Couple of years ago. Feature-wise it was like a proof of concept, very early access.

> Additionally, the major revenue source was Business tier use, which Microsoft dramatically improved by integrating it with their other cloud-based business suite applications.

Again, I worked at another company where Skype for Business was the standard and nobody ever used it unless they needed formal IT help or something.

All the teams I'm acquainted with at my current place refuse to communicate on Hangouts and, in violation of corporate policies, use something else like Slack or Mattermost.

Teams from a few years ago is not a good comparison; it has evolved a lot since even just a year ago. Meanwhile Slack is getting more bloated and messy, though it's currently okay to use.

I'm not at Microsoft anymore but I was there when Skype got acquired and I had a bunch of friends who worked on it.

You have to understand that the technology underlying Skype at the time was very brittle and poorly designed.

Google almost bought Skype before Microsoft did and backed out after they got a look at the code.

When Microsoft acquired it Skype was routing its traffic over port 80 for example.

> You have to understand that the technology underlying Skype at the time was very brittle and poorly designed.

> When Microsoft acquired it Skype was routing its traffic over port 80 for example.

*I have to understand*?

Who is saying this? Do you have any credentials / knowledge of Skype technology? My understanding is that you're only reporting indirect discussion from "your friends who worked on it"?

Having been there, I would say quite the opposite.

Skype had pretty intricate means to connect NAT-to-NAT clients. I would even go as far as saying that Skype's P2P connectivity tricks were top notch, considering the incredible variety of different network setups that clients could face. Even in the weirdest conditions, you could trust your Skype client to somehow find a way to get the call through. This was through an immense collection of in-house, trial-and-error, STUN-like hole-punching techniques.

Now I can understand that for people outside of the peer-to-peer connectivity world, these techniques could seem completely foreign and brittle, but they're not. It's the world of internet clients we live in. It's not specific to Skype; it's the route all peer-to-peer clients have to take. If you don't want that, don't go peer to peer.
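For readers unfamiliar with hole punching, the core trick is just a simultaneous outbound send: each peer's outgoing datagram creates the NAT mapping state that lets the other peer's datagram in. A minimal sketch of that choreography, with two local UDP sockets standing in for the peers (no rendezvous server or real NAT involved; purely illustrative):

```python
import socket

# UDP hole punching boils down to a simultaneous outbound send: each
# peer fires a datagram at the other's publicly observed address, and
# that outbound packet installs the NAT translation entry that lets
# the peer's inbound packet through. Here the "peers" are two local
# sockets, so there is no real NAT; this only shows the choreography.

a = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
b = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
a.bind(("127.0.0.1", 0))  # port 0: let the OS pick a free port
b.bind(("127.0.0.1", 0))
a.settimeout(2)
b.settimeout(2)
addr_a, addr_b = a.getsockname(), b.getsockname()

# Step 1: both peers send at (roughly) the same time. On a real NAT
# this is the "punch": the outbound send creates the mapping.
a.sendto(b"punch", addr_b)
b.sendto(b"punch", addr_a)

# Step 2: each peer receives the other's packet through the hole.
msg_at_b, src_at_b = b.recvfrom(1024)
msg_at_a, _ = a.recvfrom(1024)
print(msg_at_a, msg_at_b)
```

Real-world clients wrap this in retries, candidate address discovery, and relay fallbacks for NATs whose mappings can't be punched, which is where most of the in-house complexity lives.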

> Google almost bought Skype before Microsoft did and backed out after they got a look at the code.

Where did you get that information? I had never heard of that interpretation of the story before.

My view on this is that Google considered buying Skype, but backed out because they wanted a cloud-based service instead of a p2p one. Microsoft was in the same state of mind, but decided to go along and migrate Skype to be a cloud service, which they did.

Now if you really want to discuss technical details and the state of Microsoft/Azure at that time, I would be pleased to do so.

Microsoft started the migration of Skype to the cloud at a time when Azure was just a big beta test. Nothing was working properly; the tools were sub-par or nonexistent. Nothing was reliable. You would deploy Azure services through remote desktop automated by PowerShell scripts. Managing databases was done through in-browser Silverlight clients - yes, Silverlight was already EOL at the time, but that was the only way to perform DB queries with a UI.

When complaining about the deplorable half-baked status of the tooling and cloud services that we were required to use to migrate Skype, the only response was "Yeahhh, Eat your own dog food".

Thanks but no thanks.

All the great Skype engineers left in the two years after the start of the migration to Azure - mostly to join Twilio.

Here is an article from Wired with quotes from the Google Product Manager who did the due diligence when they considered buying Skype:


Key points:

He did not think Skype's p2p communication was a good fit for Google, going so far as to say it ate up bandwidth and was like an old technology.

The PM's remarks could be summarized as saying the basic p2p communication architecture was overly complex and used only to avoid a cloud/server-based architecture.

Basically, the PM thought Skype's architecture and code base couldn't effectively scale or meet real-world business requirements that people would pay for.

It could be true (and frankly I certainly can believe it!) but an article from someone who didn't buy a company could also be a post hoc justification for why they didn't get the deal.

Like VCs who try to invest in a company but lose the deal (or never get to see it at all): "Oh FooBarApp? Yeah, we passed"

Isn't using port 80 a trick to avoid being blocked by firewalls?

People would say that but most firewalls block _incoming_ port 80 connections so that logic never held much weight.

That isn't really relevant to the problem they were working around. The problem is that many firewalls block outbound ports other than port 80, 443, and some other very common ports.

Put another way, if both sides block 80 incoming, their only hope is fancy NAT-punching techniques.

But those NAT-punching tricks are useless if they are using a port that is completely blocked on the outbound side.

Many networks like in hotels and airplanes only allow out on port 80 or 443.

Port 80 isn't necessarily a problem? Sounds like a great way to get around restrictive network devices like firewalls. Encryption is a thing right?

> That's what we said about the Skype acquisition too.

“On 10 May 2011, Microsoft Corporation acquired Skype Communications, S. à r.l for US$8.5 billion”

I think Microsoft has changed significantly in the last decade.

I’m quietly hopeful the npm acquisition will go well for us (although I still hold some serious grudges about Microsoft’s past behaviour).

Let's face it. Microsoft is a business and at any point in the future it might change course if it has economic pressures to do so. It can only keep being "good" as long as it has a stream of money coming in that allows this to happen. So the important thing is how they make npm economically viable. They need to have a good business model. I can only imagine GitHub was economically viable when they bought it, hence they let it run independently since it provides a revenue stream.

"So the important thing is how they make npm economically viable."

That's not the important thing, that's the problem. Npm could easily be an open source client, contain less code, and be better. And have a mirrors system like every other repo so it doesn't need money for hosting.

Npm wanted to control nodejs and make money out of it. And they have.

Microsoft purchased that control and plan to make money out of it.

This is bad news for OS dev.

> Npm could easily be an open source client

The NPM client is and has always been open source: https://github.com/npm/cli.

> Npm wanted to control nodejs and make money out of it. And they have.

In what way, other than offering private package hosting for enterprises?

By selling / "doing an exit" to GitHub?

> Npm wanted to control nodejs and make money out of it.

What does this even mean?

It means Npm always wanted to be the "one true source" for nodejs code. No matter who wrote the code.

No interest in mirrors from day one.

Npm sold out to Microsoft and got paid. All that free community effort to stop Npm's "crashiness" got sold to Microsoft for dollars.

Nodejs went to the Linux foundation.

Npm went to Microsoft.


That's not what I asked.

You said they controlled nodejs, and that they somehow earned money by controlling nodejs. I think that neither of those has ever been the case.

i’ve been at three companies that acquired small shops. that 5y horizon to the kill-decision lines up with my experience. most recently i worked with a company (mid size) that had been acquired two years prior by a very large international organization. “it’s great, they let us do what we do best and are excited to adopt some of our practices”. Two weeks later the senior leadership was shuffled and the mothership was changing things to better match their process, etc.

My thoughts as well. I was at Skype when it was still running semi-autonomously; by the time I left we were part of the Office organization.

> after we saw what happened to yarn

What happened with yarn? As a very casual user of it, it seems to work well and have pushed npm to innovate a bit when it was stagnating. But I haven’t followed it lately. Were there technical issues or political drama?

Yarn's new version does a bunch of weird, complex-to-reason-about stuff that's closer to a bundler than it is to a module manager: https://github.com/yarnpkg/yarn/issues/6953

> "weird, complex"

I mean, zip files are hardly complex. PnP is a breath of fresh air and lets you commit packages "safely" to your SCM without LFS. NPM has been experimenting in v7 with similar ideas.

I'm personally quite fond of "zero-install" setup to a repo.

And I think no Facebook employees had a significant role in that v2.

You can use the node-modules linker and then it’s exactly like yarn 1 except faster.
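For what it's worth, the setting being referred to is Yarn 2's `nodeLinker` option, set in `.yarnrc.yml` (this sketches the documented option; exact defaults and behavior vary by Yarn version):

```yaml
# .yarnrc.yml
# Opt out of Plug'n'Play and install a classic node_modules tree,
# so module resolution behaves like Yarn 1 / npm.
nodeLinker: node-modules
```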

What's wrong with Microsoft's dev strategy? .NET continues to be the most powerful and productive platform that I've used.

.NET Core was a great move and it's all coming together nicely now, and even creating innovations like Blazor.

I think the poster was saying that .NET has a dev strategy, but other projects don't.

> Microsoft can’t even put together a coherent dev strategy outside of .NET on Azure


You don't need Azure to use any part of .NET - and VSCode, Github, etc., stretch beyond both.

That’s not what was said either.

They said that .NET and Azure, two separate things, are very well managed. OP did not imply any correlation between the two of them.

My guess is that he is referring to Microsoft Azure's attempts at JavaScript support. You can see some of the support, but it is still currently lacking and difficult to use. I hope that they can manage though; Azure has tremendous features, just challenging to utilize properly.

The flip flopping between Metro / Windows 8 / UWP permanently turned me from doing anything non-win32-API on Windows again.

That debacle was an entirely different chain of management that is no longer at the company ;)

> after we saw what happened to yarn

I missed something, what happened to Yarn?

The maintainer released Yarn 2. Yarn 2 is foundationally different from Yarn 1, and can and does break a lot of products/projects if used. Some folks are not happy about it, although Yarn 1 will probably continue to be maintained by the community for a while.

This seems to be pretty fair about the whole thing: https://shift.infinite.red/yarn-1-vs-yarn-2-vs-npm-a69ccf022...

Note that yarn is also no longer under the control of Facebook and the primary maintainer who has been developing yarn 2 no longer works there.


What happened to yarn is they released a v2 with some backwards incompatibilities... and for this reason we should be glad facebook didn't buy npm? (Also facebook doesn't actually own yarn). Well I'm confused.

I'm not OP, but the inference I draw from the comment is that with Yarn declining in quality, the probability is higher that NPM will once again become the de-facto standard; thus, it is ideal that any company responsible for its stewardship not serve as further consolidation of power in the industry (which currently leans Alphabet/Facebook).

IMO it's crazy we're here, having grown up in the IE days, but Microsoft acquiring influence over the JS ecosystem in this day and age is at least arguably, as OP stated, a distribution of power.

some backward incompatibilities? You apparently can’t run node directly. That’s kind of a problem for all but the most obvious of use cases.

Yes you can, it just requires extra configuration and defeats the purpose of using v2 at all.

The review you cite compares yarn 1 to npm and declares yarn 1 the winner. They did not evaluate yarn 2 because they say it does not yet support react native without a plugin.

One of the big additions in Yarn 2 is Plug N' Play, which allows for more flexible module resolution.

Support is middling, but once you get it working, you have near-instant, near-zero-size installs.

Second this. It still seems to be working fine... (actually, better than npm last I checked)

Given that Microsoft for all intents and purposes killed Atom (along with their really promising xray project [0]) almost immediately after the acquisition [1] even after explicitly claiming they wouldn't [2], please excuse me for not seeing the GitHub acquisition in the same positive light.

[0] https://github.com/atom-archive/xray

[1] They never officially announced it, but they almost certainly de-staffed it to the point where it's barely on life support: https://imgur.com/a/jQBHsUk

[2] https://www.reddit.com/r/AMA/comments/8pc8mf/im_nat_friedman...


What a surprise, the VSCode fanboys are coming in droves to downvote and say nothing more than how Atom was going to die anyways.

Sure, maybe it was, but that's not the point. The point is Microsoft _actively pulled development resources away from Atom after explicitly claiming they wouldn't_.

I get that a lot of people like VSCode better than Atom, but _please_ put things into perspective for a moment and consider if you'd make the same comment if the same thing happened to _your pet project that happened to be #2 in popularity but then got axed after being acquired by the company who owned the #1 after claiming they wouldn't do exactly that_.

Whatever your opinion might be on Atom vs VSCode, can we not at least agree that this kind of behavior is something we should hold acquiring companies accountable for? It might not make any difference to their bottom line at the end of the day, but the least we should do is hold them to the fire in the court of public opinion.

Counter thesis: what "killed" Atom is the balance of user enthusiasm quickly shifting to VSCode. Well before the acquisition was even announced, VSCode started grabbing developer mindshare real, real fast. I remember being an Atom user who resisted that tide for a while, but it became pretty clear that VSCode was taking off like a rocket and Atom, well, wasn't.

If commit activity graphs are really a meaningful measure, look at VSCode's:


The number of commits per, uh, date unit (the graph is not super clear on that axis, honestly) across the entire length of VSCode's activity graph rarely drops as low as the highest number of commits per date unit for Atom.

I'd have preferred to see both survive and do well, but that really hasn't been the way the text editor space seems to have worked. Editors that are conceptually awfully similar to one another tend to have one dominant player: TextMate (at least for Macs), then Sublime Text, then Atom, then very quickly Code. Given that Code and Atom are probably the closest of any two in that list, this just isn't that surprising.

It would be delusional of me to claim that VSCode wasn't already winning in terms of mindshare by a large margin when the acquisition happened. That's not what I'm claiming here.

Atom still had a healthy number of active contributors (presumably most of them were from GitHub) making improvements to the product on a daily basis to make it a perfectly viable tool for the people who chose to use it (and despite the much smaller developer base they continued to innovate with projects like xray and tree-sitter)... That is until the Microsoft acquisition happened.

Before anyone jumps in with the causation vs correlation argument, I think any reasonable person looking at the evidence would agree that the timing is convenient enough to make it highly unlikely to have been a coincidence, especially considering that most of those contributions were from employees at GitHub who were _getting paid to work on Atom_, so the only reasonable explanation for the contributions to stop abruptly within a month is that they _stopped getting paid to work on Atom_.

To add insult to injury they even had the audacity to claim they wouldn't do exactly what they did. That is the crux of my issue with how they handled this acquisition.

I agree, it's probable that there's been a decision to de-prioritize Atom in favor of VSCode. And sure, I get being upset about that, and about GitHub not confirming that this is what's happening if it really is what's happening.

But the words of the linked Reddit comment from Nat Friedman were "we will continue to develop and support both Atom and VS Code going forward"; that's a true statement today. Atom is currently being developed and supported. That's a case of adhering to the letter of the statement rather than the spirit, I know. But that circles around to the problem of VSCode's rapid ascent in mindshare -- if your company ends up owning two very similar editors and they both have roughly equal downloads and community interest, you might try to support both equally. But if one of them has orders of magnitude more downloads and community interest than the other, you're going to focus your efforts on the popular one.

How many of those GitHub people were officially working on Atom, and how many were just random GitHubbers who supported the company focus and followed social pressure?

It's quite possible that this was just a random side-effect from a shift in focus, instead of a planned sabotage.

From Microsoft's perspective, what's the advantage to them or their users to pay for two somewhat similar free offerings to be worked on in parallel? Given that one was gaining in popularity by leaps and bounds while the other was rapidly losing market share and relevance, what course would you recommend? "Fund both indefinitely, to keep the handful of atom users that want new features happy"? Be reasonable.

I don't think anybody's expecting them to fund both indefinitely, but given that their soon-to-be new CEO went on the record to say that they would actually keep funding Atom development, I feel it's fairly reasonable to expect that they wouldn't pull funding from Atom almost completely as soon as the acquisition went through. That's a really shitty move no matter how you look at it.

He said "we will continue to develop and support both Atom and VS Code going forward," which at least of right now is still happening -- Atom 1.45 was released last week, along with 1.46 beta 0. The conjecture that they've effectively defunded it is reasonable, but it's still conjecture, and even if they have it doesn't actually break Friedman's (possibly quite deliberately worded) statement.

I agree it was stupid of them to placate people by making promises they don't intend to keep.

Maybe Microsoft helped killing Atom, but I don't think Atom is completely innocent by itself.

I was a user of Atom and later switched to VSCode. It took me a few months of weighing all the options before I made the switch. That's how much I love Microsoft -- very little.

Don't know if they ever fixed it, but there was a design flaw in Atom: a bug inside some Atom plugin (`linter-ui-default` is one of them, if my memory is correct) could reset the entire Atom to its default settings, and it happened randomly (see links 1, 2 and 3).

This problem pissed me off so much and so many times. The last time it happened, I accidentally deleted my backup configuration `config.cson` while trying to recover the settings. Yes, it is technically my fault, but no, really, it is not. So, after seeing all my life flash before my eyes, I decided to stop living ... with Atom. I had enough.

VSCode is generally a better editor compared to Atom. I mean, VSCode has its own issues, sure. But for me, so far those issues are mild and usually get fixed quickly.

1: https://github.com/atom/atom/issues/14922 2: https://github.com/atom/atom/issues/14909 3: https://discuss.atom.io/t/atom-keeps-losing-settings/61617 (This one was in 2018 while the other two were in 2017)

Author of `linter` / `linter-ui-default` here. Nowhere in its code does the package rewrite the entire Atom configuration. It does, however, observe its own configuration. If observing in itself resets the Atom config, then it's out of the package's hands and a bug in Atom core.

It's possible that one of the linter providers had that issue, and since linter providers are only called by the linter package, they wouldn't exhibit the issue on their own. It's not uncommon to have issues that seem like the linter's fault, since all of the providers do nothing and seem harmless unless invoked by the linter package.

I would've helped to debug this, had somebody pinged me on any of the issues. Oh well :)

Some information (https://github.com/atom/atom/issues/17060) indicates that the problem is related to `atom.config.set`. `linter-ui-default` is one plugin that triggers it.

So maybe it's not a bug of `linter-ui-default` after all. Sorry dear innocent man, my judgement is not always on point :)

What killed Atom was that not even by rewriting stuff in C++ were they able to match what Microsoft is able to do with pure JavaScript on Electron.

Lack of delivery killed Atom, not Microsoft.

> Microsoft for all intents and purposes killed Atom

I don't think so. When VS Code came out my reaction was "Wow! It's like a 1.0 version of Atom!" I.e. an electron (or similar) based editor that works, whereas atom always seemed like a beta release. I tried to use atom a bit but it came with little out of the box and the plugin ecosystem was a complete mess. I filed an issue asking if obsolete/dead plugins could be somehow removed from the plugin repository, but nothing came of it.

Atom wasn't killed, it died on its own.

Please don't say MS "killed" Atom. Even before the MS acquisition, Atom was clearly losing the competition.

I switched from Atom to VSCode well before the acquisition, and I personally know many others that did too.

What need does Atom fill that VSCode doesn't?

Making use of idle cores.

Harsh, but true...

> Google and Amazon have the EEE under lock right now

what is EEE?

Embrace, Extend, and Extinguish:

> "Embrace, extend, and extinguish" (EEE), also known as "embrace, extend, and exterminate", is a phrase that the U.S. Department of Justice found was used internally by Microsoft to describe its strategy for entering product categories involving widely used standards, extending those standards with proprietary capabilities, and then using those differences in order to strongly disadvantage its competitors.


You know what they say ... there is no I in EEE.

GitHub is still a good choice, but working with their more recent APIs, like GitHub Actions, I feel Microsoft's touch. They're slowly changing for the worse.

Ooh details please. I've looked at the docs but haven't used actions yet. What makes it worse than circle?

The initial capabilities look similar and circle really needs a competitor with how flaky their service has recently been.

Nah, Github Actions is actually amazing, and has been rock solid for me.

For someone who's not a pro in using CIs, it's been much easier to use than CircleCI. I never have to spend more than 20 or 30 minutes fiddling with Actions to make it actually work.

This for me is the biggest contribution to open source: normal people like me (who don't want to get too familiar with proprietary or overengineered CI tech) will be able to have CIs for pretty much ALL their projects.

I can’t speak for Actions, but you seem extremely biased for some reason. It would take you less time to learn CircleCI than it took to write this post. Much less than 20-30 mins.


I might sound biased because I had lots of difficulties with CircleCI that I'm not having with Actions anymore. That's why I think it's a better product for my own use case.

The Github acquisition was pretty recent, no? It seems like Actions is something that Github would have planned and been working on for some time prior to that.

Really, I feel it's competition with GitLab that is pushing GitHub to do things like Actions that they should have done a long time ago.

No. Microsoft did not adapt to GitHub. The first thing Microsoft did after acquiring GitHub was force login to view commits. I felt like some huge corporation took over the park I visit daily and charged me. It felt terrible.

> Microsoft is laughably incompetent at achieving that sort of strategy.

If they did choose to EEE, this makes it more likely, not less.

Why, what happened to yarn?

I’m not sure how you made out Microsoft to be anything but a benevolent ruler here; “throwaway for obvious reasons” seems like a weird thing to say.

They’re likely not supposed to be commenting on things like this as an active employee of the company in question.

Freedom of speech isn’t freedom from consequences, as is often said.

True, but insisting on consequences for speech is often a sign of ignorance. That comes with consequences as well.

If there are consequences for your speech, then it's not free.

You're conflating definitions of the word free.

Free in "freedom of speech" is the ability to say anything you want.

Your choice of free means cost.

So you're free to speak, but the speech itself isn't free.

Your definition makes zero sense. Unless you are physically unable to speak, you are always "free" to speak, saying whatever you want.

Freedom of speech is by definition exactly this: free of consequences.

So by your definition if you call someone an idiot and they punch you in the face for saying that, you don’t have freedom of speech?

Right. It’s freedom from consequences from a specific entity, the government, not freedom from consequences in general.

Re 1: Yes.

Re 2: No. It's just the usual implementation.

An absolutist definition like that is completely and utterly meaningless unless the only person you talk to is yourself. And even then there are consequences for what you say to yourself, even though they might not be externally visible immediately. And if you want to get really ridiculous, the consequences of saying anything involve movements of air and possibly particles of spit. Maybe even spreading an infection.

So by your definition freedom of speech cannot exist, or you need to refine your meaning of "consequences". The refined meaning that is generally accepted is: consequences from the government.

Any other meaning is senseless, because people are free agents and may respond to your speech in any way they see fit.

As I see it, freedom of speech starts and ends at this: the government can't compel or suppress your speech.

Your employer or society in general absolutely can via the consequences of what you say. If you are making your employer look bad then they can and should be able to get rid of you.

If society decides they don't like you because of what you say and chooses not to associate with you or use your business, then that also feels fair enough.

This seems like a good outcome overall. NPM being such an important pillar in the software supply chain while having an unviable business model and largely being funded by VC money was never a good position to be in. There are problems with more of the software ecosystem consolidating with a single entity but it still feels like an improvement.

> NPM being such an important pillar in the software supply chain while having an unviable business model and largely being funded by VC money was never a good position to be in.

Why does NPM need to be funded as a commercial entity at all? What other open source library has a private company running its package manager? This one still boggles my mind.

For programming languages, there are several examples of commercially run package managers:

    - the Java/Kotlin/Scala ecosystem is based around maven central, which is run by Sonatype, Inc.
    - Go modules are hosted by Google. Previously, most libraries were hosted on Github
    - Rust's crate index is on Github
    - The Docker/Moby registry is run by Docker, Inc. (though that might be a stretch for "package manager" :))

> Rust's crate index is on Github

Note that the crates.io index is just a single git repo that holds JSON metadata about each crate: https://github.com/rust-lang/crates.io-index . The actual code found on crates.io is hosted on S3. The index is an important part of the system, but there's nothing tying it to Github specifically.
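For the curious, the index's layout is simple enough to sketch; here's a rough Python rendering of the path scheme as documented in the index repo's README (crate names below are just examples):

```python
# Sketch of the crates.io-index path scheme, per the layout
# documented in the index repo's README.
def index_path(crate: str) -> str:
    """Return the metadata file path for a crate inside the index repo."""
    n = len(crate)
    if n in (1, 2):
        # 1- and 2-character names live in the "1/" and "2/" directories
        return f"{n}/{crate}"
    if n == 3:
        # 3-character names are bucketed by their first character
        return f"3/{crate[0]}/{crate}"
    # 4+ characters: first two chars / next two chars / full name
    return f"{crate[:2]}/{crate[2:4]}/{crate}"

print(index_path("serde"))  # se/rd/serde
print(index_path("log"))    # 3/l/log
```

Each file at that path holds one JSON object per published version of the crate.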

Who's funding the S3 bill? Took a peek around crates.io but didn't see anything about it.

Ah, neat! Works out well for them; I assume it's relatively cheap to just serve an S3 bucket on their CDN (though I guess bandwidth costs may rise rather dramatically if Rust ever reaches Node levels of popularity), while not taking on any other operational expenses of actually running a registry.

CDN bandwidth, by the nature of CloudFront's reach, will mostly be offloaded onto local peering fabrics. It basically costs nothing in per-Mbit billing; see e.g. the LINX fee schedule for a 100G handoff: https://www.linx.net/wp-content/uploads/2017/10/Fees_Schedul.... That's not to say there aren't other major costs in running a resilient edge network, which go some way to justify $0.0Y/GB pricing (where Y varies from 1 to 9 depending on location) for non-sponsored projects. tl;dr: it won't ever be a problem.

Please don't use code blocks for regular text and quotes. Really hard to read on mobile and narrow viewports.

Technically the Go module _proxy_ is hosted by Google. Even if the proxy went away, you'd still be able to get access to all of the packages as they're still hosted elsewhere. It just wouldn't be as fast.

Maven Central has mirrors and alternatives and you can trivially host your own repository, all you'd need is a plain web server serving a bunch of static files.

Some libraries aren't hosted on Maven Central actually, so it's not uncommon to see instructions for adding extra resolvers to your build config.
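For instance, adding an extra resolver beyond Central is just a few lines in pom.xml (the repository id and URL here are hypothetical):

```xml
<!-- pom.xml: resolve artifacts from a self-hosted or third-party repo -->
<repositories>
  <repository>
    <id>internal-mirror</id>
    <url>https://repo.example.com/maven2</url>
  </repository>
</repositories>
```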

The Java ecosystem isn't as dependent on Maven Central as the JavaScript ecosystem is on npmjs.com

Almost every library out there is on Maven Central. Even Oracle JDBC drivers are now (finally) on Maven Central.

If MC goes away as it exists today, the Java Ecosystem will take a huge hit as almost every open source project would stop building in CICD environments from the get-go.

If it vanished instantly then yes, but a huge number of packages on Central are mirrors from jcenter. There are not only theoretical competitors to Maven Central but an actual widely used one (jcenter/bintray), which is easier to use anyway. There's also jitpack too. So people could migrate pretty quickly to alternatives.

You're not technically wrong, but I bet that 90% of the projects, or at least the examples, have some sort of intentional or unintentional reliance on Maven Central that would break the build if it weren't there. Even a lot of companies that set up internal repositories don't realize that they are still going to hit Maven Central initially, before everything is bootstrapped, depending on how things are configured. It would reflect really poorly on the Maven ecosystem (or any similar ecosystem, I have to assume, but I know less about others) if the canonical repository would just... "poof".

It's actually really common in the JVM ecosystem to run private mirrors of Central.

I wasn’t aware that I was a commercial entity because I use github!

I think the point is that you are using a commercial entity to host your code. There is a bill for the code you have hosted, and you aren't the one paying for it.

If that was the point why did they write “commercially run”? That is explicitly about management, not hosting or single points of failure.

It's never a problem until it is all at once and you realize they hold all the keys.

crates.io is not a commercial enterprise.

Confused why you think a service serving millions or billions of requests a day wouldn't require money to run. Do you think grants magically appear out of thin air to pay for the servers, storage, bandwidth, and maintenance?

We've been doing fine here in reality, where most Linux distros, Perl's CPAN, Python's PyPI, RubyGems, etc. are all run on volunteers and donations. There is nothing special about NPM that requires it to be owned by a for-profit corporation. If Wikipedia and Archive.org can get by, I'm sure NPM can too.

Big leap from "servers cost money" to "a package manager requires a commercial entity". How are other language ecosystems and package managers operating, many without private companies attached, when they too are serving millions of requests a day?

Well the big ones are hosted by commercial enterprises, and the smaller ones don't have the scale of JavaScript so they don't require that level of support.

Which package manager has the scale of JS and isn't hosted by a commercial entity?

apt, pacman, pip, and others. You can have large scale systems that aren't dependent on a commercial entity.

I mean, the scale of pacman is minuscule compared to NPM. PyPI is hosted by AWS and Google.

Not sure how the APT model from the Linux community could really work for anyone else, but it's definitely worth a shot.

Almost any other package repository is funded by donations from companies using them or a grant from an infrastructure provider.

PyPy runs on donated infrastructure that costs over 800K/month in hosting costs[1].

Not many non-commercial entities can afford that.

[1] https://twitter.com/dstufft/status/1236331765846990848

PyPI, not PyPy. For those confused over how PyPy could cost that much.

Compiling Python code isn't easy!

Yes you are correct of course - my mistyping!

Looks like they’re paying for bandwidth, aka “cloud pricing.” They could probably reduce those numbers significantly, but if it’s funded by a grant why bother?

Good reminder to use pip cache.

Maven Central is not hosted by Apache, for example.

I'd really love to teleport some forum nerd from just six years ago to the present.

- Microsoft is a leading sponsor of open source?

- In fact, .NET is open source now and runs on Linux?

- Microsoft bought Github, then NPM... and the community is celebrating both of those things??

- The guy from The Apprentice is what?

> Microsoft bought Github, then NPM... and the community is celebrating both of those things??

I don't think you could honestly say the community celebrated both of those things.

I do not consider the largest distributor of proprietary, closed-source spyware (Windows) owning the fastest growing open source package manager to be a good outcome, personally.

It depends on what the alternative is. When NPM starts running out of money to run the service, what would happen? More VC, but only to a point, and the firms would increasingly be influencing NPM to make money by any means (probably not good for anyone but the firms). Alternatively, a cash-strapped NPM fails to invest in the security and availability of the service, leading to widespread outages or, worse, a large-scale supply chain attack facilitated via the registry.

> It depends on what the alternative is.

Ruby Gems, PHP composer, PIP, etc. would all like a word with you....



Isn't it more that NPM specifically has had a bumpy ride, and people are seeing this acquisition as a sign of better times to come, rather than a value judgment of other open source initiatives?

Yeah, and the PSF is worried that the possible cancellation of PyCon could send the whole foundation broke.

The fact that there's a bunch of critical infra run on a precarious volunteer shoestring is not a good thing.

Maybe look in your node_modules folder sometime, eh? Or the Linux kernel. The whole world runs their critical infrastructure on volunteer work.

You haven't bothered looking at the contributors to Linux since 1997, I take it.

Hmm, not everyone is a volunteer...

That's a path, but with NPM already being a company with significant VC investment, would a transition to such a model work out with the existing stakeholders? Also, NPM is quite a bit bigger than both the Ruby and Python library spaces.

> Also NPM is quite a bit bigger than both the Ruby and Python library spaces.

I don’t know, so I have to ask, what’s the metric here?

Perhaps because Node code bases when deployed cause the package manager (npm) to consume more bandwidth in general than comparable Ruby or Python code bases due to higher dependence on third-party packages?

As of Oct. 2018, the top 12 npm packages [1] were already seeing more than 0.5 billion downloads a month. Granted, popularity has since waned for a few of those top packages due to deprecation or new language features.


NPM’s announcement about the acquisition [2] provides up-to-date numbers:

“Today, npm serves over 1.3 million packages to roughly 12 million developers, who download these things 75 billion times a month ...”

1: https://news.ycombinator.com/item?id=18343604

2: http://blog.npmjs.org/post/612764866888007680/next-phase-mon...

You're probably right with all the one-off and fairly small dependencies in the npm ecosystem. Seeing those numbers would be pretty neat to compare side-by-side.

This is sad to read. Why does every project have to be profitable? If NPM is useful, users (companies and people) can invest time or cash to support its operations and continued development. This foundation model has been successful across open source and prevents one company from changing the direction of a project to fit their own needs at the expense of everyone else. I think this was critical to the continued growth of open source software over the last two decades. If this trend of selling out to massive corporations continues, it will be a major step backwards.

To be clear, that's not what I am arguing here; I agree that package registries should, ideally, be owned and supported by the community. However, NPM already had fairly significant VC investment, and as such any transition to a community-supported model would be challenging.

The acquisition can be a good outcome for the current situation without it being the ideal state of things.

Profitable or not, they should not run out of money.

Would you prefer npm infrastructure to be maintained and developed by the lowest-paid programmers they could hire?

Imho, npm is two things: a CLI tool, and a central repo that tries to be the one and only JS repo in order to make money from that position. Their infra costs them money only because that was their angle for selling out.

Linux repos take the alternative approach, trying to get as many mirrors as possible so everything can remain free.

Microsoft did not buy npm to help it become free. Believe me. They bought it because they reckon they can make a buck out of it.

That buck comes from somewhere.

Nodejs going to the Linux foundation was good news.

Microsoft buying Npm is bad news.

Why would NPM run out of money? NPM is the primary vendor for worry-free distribution and management of private JavaScript packages for $7/month/user. In a time where bandwidth is basically free (outside AWS/Azure/GCP) that should surely pay for server costs and a handful of developers.

It probably isn't going to 20x VC money, but it sounds like it would be profitable to run as a business.

Agreed. It was profitable before (no indication they sold at a loss), and it will probably be profitable after for MS.

"2fa" sounds bad. That is clearly marketing bs for linking your npm account to a MS account with more personal info attached.

Ease of publishing will be the first thing to go.

Then the fun will disappear with MS as owner, like when Oracle bought Java.

Happy to be a rustacean.

Regarding 2FA/login: except that Microsoft has not done this with GitHub. And npm is joining GitHub, not Microsoft. You will see the separate npm login vanishing in favor of GitHub login, for sure. As for the second factor, see what GitHub offers; for me that is currently an OTP generator.

GitHub already has a massive social graph. They know who you are, in terms of who you work for and who your CTO is. npm does not know that about me. It has no sales channel.

GitHub is owned by the same entity that owns LinkedIn. Microsoft has really no problems cross linking persons. They can also easily link the source code between the systems etc (if they want).

And they will, to the extent possible by law.

And they will ask you, under the guise of 2FA, to confirm their suspicions.

And $7 a month gets plenty of new upgrade options offered.

All talk of mirrors gets brushed under the carpet.

Microsoft pwn all Node.js code except core, and except those savvy enough to spot this coming and distribute via Debian repos.

I hope they don't use that Microsoft Authenticator app. It has never worked for me, never once got me logged in, and has locked me out of Teams on a couple of occasions.

To my understanding, you can use any app supporting the common standard (TOTP) there.

The alternative is that the entire source control system (GitHub) and the entire artifact registry (npm) are run by a giant multinational military defense contractor with close and longstanding ties to the US military. Did we forget so soon that the Snowden slides are PowerPoint?

I'm not sure that's an improvement in any way whatsoever.

The military is a MS customer, just like nearly everyone else on the planet. Calling them a “defense contractor” is a very long stretch.

No, it's not.

Boeing is a defense contractor. Northrop is a defense contractor.

MS? Not even a chance. What percentage of their business do you think is actually government, beyond buying licenses just like nearly every other corporation on the planet?

No US company is better in that regard. I prefer one with an army of lawyers. Damn, should have gone to Oracle. That is a joke. A joke!!!

> Did we forget so soon that the Snowden slides are PowerPoint?

what does that mean? That Microsoft works for the NSA or something?

Microsoft does indeed work for and with the NSA. As an ASP they participate in the warrantless PRISM bulk collection program (they were the first!), and they also provide software and services (support) as a direct vendor.

Change my mind: package management infrastructure run by government as a public service should be the ideal end outcome for everyone.

Why I think this: private or volunteer models are unsustainable in the long run owing to funding uncertainties or conflicts of interest between stakeholders. Utilities that support the bulk of our technical infrastructure should be secured by public interests. Governments can keep things free.

Common objections:

- "Governments are inefficient". Depends on the area. The government tends to be inefficient in handling areas with direct consumer benefit, but less so in dealing with consortiums or private entities. Since private entities are the primary mainstream users of packages, I don't think government will be too slow on this front.

- "Governments will be malicious". This I don't doubt, but the solution for that is building better trust mechanisms rather than keeping a practical solution at bay, and for software at least such trust mechanisms are tenable, e.g. see the CNCF's Falco project.

I like Linux's decentralized approach.

That would work for Npm. Npm has enough problems that its worth running your own mirror for business continuity.

Microsoft are likely to make breaking changes immediately.

> That would work for Npm. Npm has enough problems that its worth running your own mirror for business continuity.

And risk-averse businesses do already.
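As a sketch, pointing the npm CLI at such a mirror is a one-line config change (the registry URL below is hypothetical; tools like Verdaccio or Sonatype Nexus can act as the caching proxy):

```ini
# .npmrc — route all installs through a self-hosted proxy registry
registry=https://npm-mirror.internal.example.com/
```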

You aren't the only one. Most users are too young to understand how predatory Microsoft has always been. Can't wait for the "npm won't publish my package because it circumvents something in Windows" or whatever. Give it time.

Tbf Microsoft have won back a lot of good faith with developers due to projects like VS Code and TypeScript, even for those of us who remember their past.

And we're yet to hear of any negative impact of their Github acquisition (afaik - correct me if wrong).

> Tbf Microsoft have won back a lot of good faith with developers due to projects like VS Code and TypeScript, even for those of us who remember their past.

Those are great until they're not. It's why it's called "bait and switch".

> And we're yet to hear of any negative impact of their Github acquisition (afaik - correct me if wrong).

ANY?! Heh, do a quick search just on HN and you'll find it pretty quickly.

I did search, didn't find it quickly. Could you share some sources of the negative impact? (I'm legitimately curious as my use of GitHub hasn't led me to notice any change.)

I found two pretty damn quickly. They're 3 of the top 5 most upvoted comments on the announcement thread...



> I'm legitimately curious as my use of GitHub hasn't led me to notice any change

Nor I. That's not the point.

For the record, I think Microsoft has done wonderfully for the dev community in the last 10 years. I don't see any reason that they are going to "f it up", but big businesses get desperate when environments change and profits get impacted (look no further than what Oracle is doing). Microsoft is not immune to that.

The complaints are "I think it really impacts the neutrality of github" and "I hope they don't discontinue Atom or apply their UX styling to the site/desktop app", respectively. Those don't strike me as particularly specific and/or nightmarish.

Concern of negative impact =/= negative impact. Great job twisting the truth. I'm not gonna be surprised if you say you're a journalist.

Neither of those shows negative impact; they show people worried about potential future negative impact.

I know what 'bait and switch' means, just like I also know what FUD means.

Next you'll call them Micro$oft. Come on now.

Are you aware of what Oracle is doing to its customers right now?

Come on, Oracle was always predatory. You always knew exactly what would happen to anything they acquire.

Oh this takes me back to when Internet Exploder was the most popular browser

Nothing will come from those idiots spreading fud calling it micro$oft and using npm...

If it does micro$oft will just buy the world out from under them.

Before someone else comes along and writes a monologue, the biggest downside might be how it handled (didn't break) its contract with ICE[0]. If the acquisition didn't happen, old GitHub might've dropped the contract immediately upon enough employees speaking about it.

0: https://news.ycombinator.com/item?id=21412600

That's a subjective political opinion of a far-left vocal minority. Not everyone has an issue with ICE (a federal law-enforcement agency that stops criminals and saves lives) nor finds a problem with a company legally providing services to the government.

The fact that ICE has built concentration camps (as verified by scholars who study concentration camps) is a fact, not a subjective or political opinion. The fact that there are children in these camps, many being raped, some being tortured, is also a documented fact, not a subjective opinion.

These are well-documented facts that have been widely reported on in the mass media. You can find links to specific articles on my blog in the recent article about Microsoft and GitHub, if you wish to learn specifics.

The kids are in there right now, as I write to you.

Whether or not Microsoft's provisioning of services to the government is "legal" or not is not particularly relevant to the thread, but it is interesting that you bring it up, presumably as a defense of their behavior.

Again: This is not a partisan thing. At all. Your attempt to reduce it to such is inaccurate (and, tbqf, off-topic for the thread about Microsoft-the-corporation, as well as off-topic for HN).


Concentration camps are for political prisoners. These are not the same. And there are US citizens and children in jails and juvenile detention too, which is what happens when crimes are committed.

I'm a naturalized immigrant. I've been in far worse places than America. I suggest you visit CBP and ICE. Talk to the agents. Visit the border. See the shelters. View the damage done by criminals who prey on these people and find out how much the agents do to help them while risking their lives fighting cartels and traffickers.

Like I said, America is far softer with borders than other nations. Crossing illegally brings enforcement and penalties. I'm not sure why this is so controversial, or why protest against govt organizations is done by proxy of software companies.

>That's a subjective political opinion of a far-left vocal minority

What you are talking about is right-wing politics, even if it is extremely far to the left of your political views, and those of the majority of people, in the USA. Both parties in the US are on the right. This isn't Reddit, where the state of US politics is treated as the default norm even when it differs from most of planet Earth. Though HN is quickly getting there.

Open/loose borders is far left policy everywhere. Most countries are far stricter with borders than the US.

ICE doesn't protect the border, that's United States Customs and Border Protection (CBP).

ICE are the goons operating everywhere (i.e. not tied to the border) rounding up "suspected illegal immigrants".

Because it isn't exactly hard to find illegal immigrants, and their charter allows them to control people without objective cause, ICE gets to arbitrarily decide whom to harass.

It's the real-life version of the perennial fear of civil libertarians that too many criminal laws will just lead to any one of us being arrested whenever it happens to be convenient for whoever is currently in power.

It isn't going to end illegal immigration, nor curtail it to any significant degree. It just serves to keep a large segment of the people we see every day in a constant state of fear, unable to (for example) seek protection from crime or exploitation for fear of being deported.

ICE is the enforcement arm for immigration, which handles people who cross the border. Obviously if there's a strong border then there's no problem in the first place.

As for the rest of your comment, this is the far-left extremist position that I cited in my first post. There's no good faith discussion to be had here.

> ICE is the enforcement arm for immigration, which handles people who cross the border

No, it's the enforcement arm for immigration and customs, the former of which concerns people who are living in the US despite not being US citizens. Enforcement of people (and goods) crossing the border is the Border Patrol and its parent organization, Customs and Border Protection.

Sure, but the parent posts are discussing (illegal) immigration which is the jurisdiction of ICE.

There's not much controversy over the people who cross illegally but stay temporarily and close to the border. Those are just traffickers and cartels.

I guess every person who was put in a cage near the border of the US was a huge danger.


To keep this as straightforward as possible: Illegal immigration is a crime and is enforced just like any other. Committing crime can mean incarceration, which many US citizens face everyday. It has nothing to do with being a "danger", but when you don't have permission to even be in the country then there's no bail or other release possible.

Also detainment centers are not cages but fenced areas with free movement inside where people receive food, shelter, healthcare, entertainment, schooling and legal services paid for by US taxpayers while their cases are processed. Detainees are free to deport themselves at any time. This is more accommodating than pretty much every other developed nation.

Bad people exist and bad things happen in every large organization in every government of every country. It is not representative nor useful in discussing the behavior of the whole.

Yes, they do. But the important part is what the organization does about that. Does it close its eyes, does it encourage it, or does it publicly remove bad people from the org to send a message?

Almost every word of this is a lie, misleading or poor framing.

It's 100% factual except for the last sentence which can be argued.

That's a fair point - thanks for the reminder :|

Lots of big tech firms are government contractors, and as we've seen most of them are unwilling to drop government contracts (ICE, DARPA, etc). So this problem would arise with almost any large benefactor. I would've liked to see GitHub drop ICE though, personally.

Others see that as an upside. ICE today, who knows what tomorrow. The sort of activists who wanted that have all kinds of random targets. No company wants to deal with suppliers suddenly blacklisting them because the hard left decided they're evil.

Agreed. What would you say would have been a better outcome? Google? Facebook? Microsoft has changed quite a lot, and in a good way.

VS Code is also spyware; I am not sure that this argument furthers your intended point.

The fact that it is open source and popular is not sufficient on its own. It had to be forked (vscodium) to show basic respect for the user’s privacy and system resources.

> VS Code is also spyware

This is such an extreme & pretentious viewpoint. Microsoft knowing that I have VS Code installed & getting a report when it crashes is not in mine, or really any normal developer's threat landscape.

Don’t forget you’re on HN.

VSCodium say they’re not a fork.

> This is not a fork. This is a repository of scripts to automatically build Microsoft's `vscode` repository into freely-licensed binaries with a community-driven default configuration.

It's true insofar as VS Code is widely loved by web developers.

So it "furthers my intended point".

No fork is required. If you build from the source in its main repo, there is no tracking included by default.

It’s builds released by Microsoft that have all of their specific stuff added in.

I am old enough (42) to remember those days, but honestly I don't feel that threatened by them. I remember their EEE days, and for a long time I haven't seen much of the same behavior.

Same here (40). I was with Ballmer singing "developers, developers, developers"... I think his legacy is not that bad. The company was not ready to grasp the idea of open source in those days, but the principle holds.

It may seem that blaming MS forever for their past actions is a good idea. It's not. Those actions and decisions came from certain people, and they're long gone. I look only to the present. MS's decisions and actions these past few years have been pretty solid, imo. We must always assume the best of everything, not the worst, no matter what. It may appear naive, but it's the only sane way.

They've created a funnel from a jobs site they control, through an operating system and hardware they control, with development tools they control, and source (etc.) on a web platform they control, onto a cloud they control.

Assume the best all you like. Microsoft are spying on you and can lock you out of their ecosystem for any reason. When that ecosystem includes critical public infrastructure, there is a problem.

The only "sane" response to this is to smile at the MS employees who tell you how much they love Open Source and to use absolutely any other platform.

MS’s decisions outside of everything to do with windows 10 anyhow.

And pulling a package from a custom url is what, one line of code in this package documentation? And the moment it happens, this package will be on top of HN?

I understand the concern about MS business practices, but I don't think it applies to environment where transactions (as in, importing someone's package or submitting a pull request to it) don't involve any contracts or money.
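Right; npm already supports installing straight from a git URL or a tarball, so a dependency entry like the following works today (the org and tag here are hypothetical examples):

```json
{
  "dependencies": {
    "some-lib": "git+https://github.com/example-org/some-lib.git#v1.0.0"
  }
}
```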

Then you must not like React or Angular, since the owners of those projects are the largest spyware and aggregators of personal data in the history of humanity.

Software and services are not the same thing.

For some examples: RMS being a douchebag has nothing to do with the usefulness of gdb, nor can that circumstance affect the utility in any imaginable scenario.

Microsoft setting censorship policies (aka ToS) on a website they own and control directly affects the utility of npm/yarn/clients. Their website, their rules.

Well, this comparison seems to be close enough. What about VSCode and Github itself?

The time for GitHub is over. I have moved all of my repositories away from there that do not depend on GitHub-only integrated services, and am migrating my DNS and domains/hosting off of those integrated services this week. You should too. If you work there, you should quit.


VS Code has had to be forked to remove the unethical spyware portions placed there by Microsoft:


Good for you. However, the general sentiment doesn't seem to behave the same way. I haven't noticed a mass Github exodus at all, aside from some people on the internet being vocal about it for the first month after the Github acquisition. Same with VSCode.

I realize this is just pure anecdata and not a legitimately researched observation, but I don't know a single dev in real life who either switched away from Github or VSCode due to those concerns, despite having a wide variety of dev friends from all kinds of backgrounds, including big tech devs, non-tech company devs, fully remote devs, self-taught devs, small startup devs, outside of the US devs, freelancer devs, etc.

I know a couple of projects that switched to gitlab. I use gitlab for my personal projects. I've abstained from moving Red Moon away from GitHub because it's still where people are, and I have some doubts about GitLab's VC-funded model (will they be able to stay as open forever?). I also want to consider other options, like SourceHut. At the same time, it is in the back of my mind and I am ready to move away at the first sign of extend/extinguish.

Just for reference, vscodium is not a fork to remove Microsoft's code - it is just a build tool for the open source repo as explained in the README.

"When we [Microsoft] build Visual Studio Code, we do exactly this. We clone the vscode repository, we lay down a customized product.json that has Microsoft specific functionality (telemetry, gallery, logo, etc.), and then produce a build that we release under our license."

"When you clone and build from the vscode repo, none of these endpoints are configured in the default product.json. Therefore, you generate a "clean" build, without the Microsoft customizations, which is by default licensed under the MIT license"
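In other words, the "Microsoft-ness" of the release lives in the product.json laid down at build time. A heavily simplified, illustrative sketch (these field names are approximations of publicly documented keys, not the actual file):

```json
{
  "nameLong": "Visual Studio Code",
  "extensionsGallery": {
    "serviceUrl": "https://marketplace.visualstudio.com/_apis/public/gallery"
  },
  "enableTelemetry": true
}
```

A build that omits these entries, as described above, produces the "clean" MIT-licensed variant.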

> Just for reference, vscodium is not a fork to remove Microsoft's code - it is just a build tool for the open source repo as explained in the README.

When a certain build configuration enables major spyware features, and that is the build configuration for the released version by the first party, and another build configuration (that is not released by the first party) disables those major spyware features, the distinction between a fork/patch and a "different build configuration" becomes semantically meaningless.

It's a fork, regardless of how they care to present it. The result of the build configuration is embedded in the release. Consider it a "binary fork" if you don't like considering json "source code".

When the first line of the repo you yourself linked says ‘It’s not a fork’, I believe we’ll take that over your semantics about it.

What's the specific outcome you're concerned about?

Enriching Microsoft and improving their position in the market as a byproduct of publishing and using open source code that has nothing to do with them.

How is open source software harmed by Microsoft being successful promoting open source software? This feels like cutting off your nose to spite your face.

The danger is not to open source in particular; it is to the world in general, which includes open source developers, as well as all other people.

Microsoft and their allies make the world a lot worse for a lot of people. They’re the number one distributor of spyware in the world!

It was already consolidated. The vast majority of public npm packages are already hosted on Github. The dependency on them has been there since the beginning.

I would expect that moving git repos is easier than replacing NPM?

It is, but who is doing that? The users of NPM are all choosing to stay on Github.

Yes, indeed - and the dependency is literally right there on the technical level. For years, you've been able to specify a version of a package as a github repo's branch HEAD.

    npm install username/repo#branchName

Bonus points:

    npm install username/repo#semver:^1.2.3
The big problem is that lots of Node.js modules don't push their tags, so plenty of repos have open issues begging maintainers to push their Git tags so that we don't have to use the npm registry.
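For reference, the same git-based specifiers can be saved into package.json as dependency versions (package and repo names here are placeholders):

```json
{
  "dependencies": {
    "some-package": "username/repo#branchName",
    "other-package": "github:username/repo#semver:^1.2.3"
  }
}
```

npm treats the part after `#` as a commit-ish, or, with the `semver:` prefix, as a range matched against the repo's pushed tags.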

JavaScript is an interpreted language -- as long as you're only downloading source code from the registry there's really no reason to use a registry instead of the plain old Git repository.

A common issue I've had with using Git repos directly as Node.js modules is that many projects are transpiled/built before publishing to NPM. Depending on specifics of that build process, it may not work out of the box (or at all) from a node_modules folder.
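One partial workaround (assuming npm 5+, which runs the `prepare` lifecycle script when installing a dependency from a git URL) is for the package to build itself on install; a hypothetical sketch:

```json
{
  "name": "some-transpiled-package",
  "scripts": {
    "build": "babel src --out-dir lib",
    "prepare": "npm run build"
  },
  "main": "lib/index.js"
}
```

This only works if the repo's devDependencies are enough to run the build on the consumer's machine, which is exactly the part that often breaks.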

With NPM acquired by GitHub, I can imagine them "filling in some steps" by leveraging the fairly new Actions feature, so that repos can provide built artifacts, the same ones as published on NPM. The deeper integration will be an interesting development to watch.

Repos have been able to provide artifacts since forever ago; they just don’t sit in the tree. While you can commit from an action, I’m not sure that’s a great idea.

You're right, artifacts in GitHub repos have been around a long time. I suppose what I was missing was a way to point to a specific built artifact (like a tar.gz from a release) as a dependency, from package.json. As far as I know, it's not possible yet with npm. I can imagine that will be covered somehow with deeper integration of GitHub and the NPM package repository.

Yes, this has always been possible. Just specify the tarball url instead of a version or range.
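Concretely, a plain tarball URL can stand in for a version range in package.json (URL is illustrative, not a real release):

```json
{
  "dependencies": {
    "some-package": "https://github.com/username/repo/archive/v1.2.3.tar.gz"
  }
}
```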

Thank you, learned something new - from none other than the founder of npm! :) Congratulations on the acquisition, bright future ahead for the whole ecosystem.

There is a build/transpilation step.

How is a convenience feature a dependency? The same command exists as "gitlab:username/repo" variant. The GH variant just happens to be the unprefixed one as it has by far the biggest userbase.

Perhaps "dependency" was the wrong term, but my point is what you said: they built it in as a convenience feature precisely because it's such a common use case. A better way to say it might be that they're inseparably linked tools, tightly coupled even on the technical level.

Exactly, especially given the instability over at NPM. Hopefully MS / Github can be a stabilizing influence both financially and culturally.

It does have a bit of a 'value add' feel to it.

But, you know, we've had decades of companies whose 'business model' is just their exit strategy...

I'm not sure I like the continued consolidation of all things tech around just a few large companies.

That's generally not a good place to be.

Given the significant returns to scale in the tech industry, this is a pretty natural development, and it happens in most tech sectors over time, as monopolistic competition generally outperforms the 'bazaar' economy.

'Small business' is only the equilibrium in sectors that can't increase aggregate output through growth or capital investment, like, say, the restaurant industry.

Thanks to Microsoft/GitHub for this acquisition. NPM is essential to the JavaScript ecosystem, and it is hard to have a business model for just a registry. In the Ruby ecosystem, the awesome Ruby Together https://rubytogether.org/ was started to run the registry. In this case one of the world's most valuable companies will run it, which means it doesn't need a not-for-profit.

Regarding "trace a change from a GitHub pull request to the npm package version that fixed it" will there be an API to add a source in case the change was made outside of GitHub? Although I recognize that the vast majority of changes to npm packages happen on GitHub.

Just to clarify, RubyCentral http://rubycentral.org is running the RubyGems registry.

It's confusing. RubyCentral pays for hosting and "ops" (not sure how much ops staff time, if any?), but I think not development? And Ruby Together hypothetically pays for development (some but not necessarily all of what's needed), which can include new features but also required maintenance (we all know software requires care and feeding; it's never "done")?

But I could have this wrong?

It has been confusing for a variety of reasons.

And I think there are mixed reviews with how well it's going overall, especially the RubyTogether part.

Thanks for that clarification, I was not aware of that. Thanks RubyCentral!

That must make you nervous over at GitLab, no? GitLab's integrated workflow is one of its main selling points (I love it), and GitHub now seems to be well underway to cross that moat.

It is exciting to see that having everything in a single application is being validated by GitHub. Last year it was very clear they are switching from a marketplace model to a single application by including Verify (CI), Package, and Secure.

We think Git(Lab|Hub) will become the two most popular solutions and we look forward to this competition https://about.gitlab.com/handbook/leadership/biggest-risks/#...

I think the companies that should be nervous are ones that have only one stage or ones that have multiple stages but as a suite of applications instead of a single application https://about.gitlab.com/handbook/product/single-application... There are a lot of these https://about.gitlab.com/devops-tools/

> It is exciting to see that having everything in a single application is being validated by GitHub

I wish Gitlab would get over this passive-aggressive negging of GitHub.

I would squirm seeing something like that between any two competing companies. But it takes a strange kind of overcompensation for an inferiority complex to use it in the specific case of one company that started out as an explicit clone of another, lording over the original any small feature it may have later followed them on.

This isn't the first time. I've seen it dozens of times, and I don't even specifically care about these two companies.

I don't think GP's comment was negging or passive-aggressive at all. The original GP said "That must make you nervous over at GitLab, no?" so it only seems rational to explain that they see this as validation and not as a risk.

Somehow, you took this explanation of why they aren't worried about this and turned it into a passive-aggressive stance.

It is important to understand that the "one single workflow" was very much what VSTS (Microsoft's GitHub competitor before they bought GitHub) was providing. It is very evident that Microsoft's enterprise background is shaping how GitHub is evolving.

GitHub is very much focused on the end-to-end life cycle now that they have "GitHub One".

Reminds me of what happened with Cloud9 and VS Code. First, Cloud9 was awesome for allowing devs to code remotely. Then once VS Code became the best editor out there, they added remote host support (among other things) and now Cloud9 caters to a different audience entirely.
