I agree that only public funding and massive donations would allow the development of a true alternative to these products.
The building blocks are already there: desktop Linux, openLDAP, FreeRADIUS, LibreOffice, etc.
What is needed is a lot of refinement, improvement, and UI/UX polish to produce a package that everyone from small shops to Fortune 500 companies could deploy.
A child that has a Chromebook will never understand why you need a virus checker or a firewall, why the data is really gone if an SSD breaks, or why you have to install a special program to make a physical print. The average training time for the migration from MS Office to Google Docs for employees at Fortune 500 companies is one hour.
Chrome OS is such an undervalued solution for the end user in a traditional business.
In tech we all use G Suite. On stackshare.io a whopping 17,200 companies list G Suite in their tech stack; Microsoft Office, 173.
You can't get hired if you answer MS Office as your main document driver in an interview. This discussion is not even a thing that gets discussed at a sane technology company. G Suite is awesome and it just works.
Kids use Google Docs as a chat app. A traditional document program is a typewriter and a google doc is a laptop with internet connection.
The G Suite APIs offer so many plug-and-play automation options out of the box that 95% of your automation can be arranged with SaaS.
Both Zapier and IFTTT supply a polished UI where anybody can connect G Suite with 600 other SaaS API providers. Nobody outside technology has ever started a conversation about these two.
I think that if the government needs OSS software for actual use by the government, then that's fine. Don't make the government become a competitor to private industry. Look how that turned out with Uber vs. Taxis.
OSS is just like anything in the free market. If people and companies want it that much, they will spend time and energy on it.
What incentive? In the scenario you describe, I don't see an obvious mechanism for the government to receive additional tax revenue as a result of funding OSS, and that's the usual mechanism for such an incentive.
I mean, I guess you could argue that tax revenues will tend to increase as the economy grows, but that seems like a case where the interests of the government are aligned with the interests of the nation(s) as a whole, which is generally considered a good thing.
Speaking of that, it's absurd how many private companies benefit from Linux! I bet only a tiny percentage of those companies contribute back into the open source projects they benefit from.
CERN has never been a Microsoft organisation, except in the heads of the management that was pushing for Microsoft solutions everywhere, without success.
- Scientific computation at CERN is done under Linux with the ROOT framework.
- Most (all?) scientists use OS X or Linux.
- All computing clusters run Scientific Linux or CentOS.
- Most internal software (Indico, EDH, LanDB, and others) runs under Linux and is web based.
- All DBs are OSS or Oracle.
- All storage systems are home made (EOS, Castor) or Ceph based and run under Linux.
- Data distribution is home made, based on a framework named XRootD, under Linux.
- Software distribution is also Linux based and runs as a FUSE module (CernVM-FS).
- Most system services are UNIX C++, with Java for the control part.
- CERN uses OpenStack for virtualisation, after management pushed for Hyper-V and failed miserably.
- Management pushed for SharePoint for years before the entire website switched to PHP and Drupal.
There is no real "Microsoft" at CERN except AD and phones. It is, however, a case study in bad management decisions in IT.
- When asking about version control I was told that if I really wanted to use it I could have Visual SourceSafe 6.0 or something from way back in the nineties. It was basically CVS 0.1 alpha with a GUI, and I ended up learning Git instead.
- I made sure all my changes worked on Firefox, because that's what I knew the physicists were using. My boss wanted me to support IE only, and after a heated discussion the quote which always haunted me was "We don't care about the physicists!"
- A colleague lamented about having spent about a year developing some uber-flexible interface in SharePoint which would've been an order of magnitude simpler in other content management systems.
The rest of CERN is completely different, as indicated by cernguy.
I left a year into a two-year contract, and I'm happy to say I've not worked on a Microsoft platform ever since!
Yes, that is also a thing at CERN.
Many other teams re-develop things internally (Linux based) because they could not trust or rely on IT management to do the right things.
That had the perverse effect of internal duplication across teams and experiments.
It's not clear to me, but it seems to have been part of a vision to "servicify" everything, where everything should be a service that is "maintained" and "paid for" but never "developed", to reduce costs.
Something that backfired beautifully.
Ok, given the size and budget of CERN, it's probably cheaper for them to develop things in house. But maintaining a team of people not at the heart of your activities is always a bit challenging from an organizational point of view. For example, since it's not core to your activities, in case of a budget reduction this team will likely be reduced, re-purposed or removed, leaving a lot of their services unmaintained, with no bug fixes, and difficult for IT to operate.
It's also less battle tested than off the shelf software, so you will hit all the common traps before having a stable solution.
And open-sourcing the code is not a magic bullet. Even if the code is open-sourced, the likelihood of the project building a community of users and external contributors is quite small.
That being said, sometimes the offering for off-the-shelf software is just so bad that it's preferable to implement it in house. Or you have an idea that, even if not core to your business/activities, could provide a competitive advantage or a huge gain in efficiency. But to pull that off, you need to be well ahead of the curve, have smart people in house, and have significant resources to dedicate to it (CERN actually checks those boxes). But it's also likely the world will catch up, and you will then be stuck with an in-house solution lacking features and reliability.
It's not just their size, it's that they're doing something that no one has done before. When you're building a me too product or even a version 2 of something, maybe buying makes sense. But when you're doing something for the first time, and you're not even sure what will work (e.g., CERN), by definition it's impossible to buy. If you could buy it, someone's already built it.
Generally, the more uncertainty around the business model and potential solutions, the stronger the case for building. Conversely, the larger and more bureaucratic the institution, the slower its iterations and experiments tend to be. Because of these two seemingly contradictory points, when a large company wants to do something unknown and relatively uncertain, it buys an entire company, provides it resources, but tries not to interfere too much (see, e.g., GM buying Cruise).
"The Microsoft Alternatives project (MAlt) started a year ago to mitigate anticipated software license fee increases. MAlt’s objective is to put us back in control using open software."
"A prime example is that CERN has enjoyed special conditions for the use of Microsoft products for the last 20 years, by virtue of its status as an “academic institution”. However, recently, the company has decided to revoke CERN’s academic status, a measure that took effect at the end of the previous contract in March 2019, replaced by a new contract based on user numbers, increasing the license costs by more than a factor of ten. Although CERN has negotiated a ramp-up profile over ten years to give the necessary time to adapt, such costs are not sustainable."
Nice to see some of the old guard is still alive and well within the Microsoft walls.
 e.g. https://www.ox.ac.uk/research/innovation-and-partnership
Microsoft offers an academic discount because it thinks that getting students to use their products makes them more money as the students continue to use them later in life. Microsoft then SAYS they are offering that discount because students/degrees/training/teaching/whatever. What it SAYS doesn't matter. It's not the reason.
Similarly, Microsoft wants CERN to pay because it thinks it can make more money in licensing fees from CERN than it can from getting people at CERN familiar with Microsoft products. (Microsoft may have gotten this wrong, if CERN pushes more non-Microsoft products into the spotlight; whether they are wrong or right isn't at issue, their reasoning is the issue.) Once deciding they can make more money if CERN pays, now Microsoft has to come up with some reasonable-sounding reason that keeps its existing academic and industry customers happy.
So they came up with that reason. But don't think for a second the REAL reason has anything to do with anything but Microsoft's bottom line. Don't let them deceive you. This decision was made by executives at Microsoft by talking about profit.
Well, yeah, I'm repeating what I remember them saying. I'm definitely not making my own argument, nor am I particularly willing to speculate about any ulterior motives.
But maybe they thought they could charge 10x the fees AND still get all the benefits.
So if the distinction is "industry" or "academia" I don't know how someone would call labs "industry". I understand not calling them "academia" but that's definitely the classification if you restrict yourself to a binary choice.
This is a reason that this is a great move for the spread of OSS:
>"The comprehensive range of training schemes and fellowships attracts many talented young scientists and engineers to the Laboratory. Many go on to find careers in industry, where their experience of working in a high-tech, multi-national environment is highly valued." (ibid)
I'd warrant that if people start their career at CERN using OSS they'll help to advocate for those tools as they move out into other industries.
With MS's market share it's probably not significant for them, but it could be significant for whatever OSS tools are favoured at CERN.
For the record, I also did my PhD at CERN, but they just provided facilities (offices, computers, accelerators, detectors etc) and not academic guidance/teaching/assessment: that was the job of my University.
I previously worked for an enterprise analytics software provider whose success hinged on low-cost license agreements with schools and universities. Once the company became more successful commercially, it took the same tack, substantially raising these fees in search of profitability. Today, the firm is under serious pressure from open source analytical libraries. The point being, what looks optimal in the short run may not be optimal in the long run: the future consequences could have been anticipated, but short-term revenue growth was too tempting.
"Tack", from sailing: changing direction lets you use the wind differently. Two boats on the same tack are using the same wind strategy.
My workplace currently teaches students several proprietary systems that are now old and clunky (and of course expensive). We want to switch to open source competitors, but the how and when are rather complicated to coordinate. It will happen eventually though.
Hope they can repeat that with a few more FOSS projects!
I recently picked it back up to investigate and was pleasantly surprised. It felt like the team dearly missed the sanity of 2012 version, continued from there and modernized it. Thank you CERN.
They literally get paid the moment value is created from using their software. It's beautiful.
“That is why AISLER allows its users to easily donate to the KiCad project during the ordering process.”
The donation is an item on the bill, they automatically set it and the user can change the amount.
They call me up one day to note that one of their devices I supported had an accident and they were concerned they didn't know why. So they sent me photos.
This device had rows of modular cards installed in it. In the center of the device, with two cards pulled out, you could see that something had burned and even melted some of the surrounding cards. But it didn't look like any given card had failed, so much as there had been some sort of really hot fire ... that had been in the air between the cards or something. Now keep in mind this was just what a handful of photos looked like, so who really knows. Makes no sense that there was something floating in the air between the cards hot enough to do that ... but that is what it looked like.
Anyway, it was a good 100k+ in hardware burned up, possibly MUCH more as the full chassis held a lot more than that. So I promised them a new chassis and such, and told them to pack it up nicely and we would have a courier come get it and send it to our QA team. The CERN guys promised not to expose the equipment to any more micro black holes ;)
The process to send stuff to the QA team in strange situations like this was a painful series of steps. The QA team was BRUTAL about process (even if they never followed it themselves...). They also were a real pain to even email, but that was part of the process. Amusingly enough when I sent them the photos and explained it was CERN even the QA ultra dry guys cracked some good X-files references ;)
Still wonder what the hell happened to that equipment.
Many teams had implemented very complicated pipelines doing all sorts of things. Including using GitLab to design, validate and eventually produce hardware that was used in ATLAS.
Was thinking about this yesterday, and was curious how much functional coverage or issues there are with what people actually use it for.
I started using Jupyter and Pandas for my own work, just so I could get something done. I really liked it.
But the organization was moving even more in the MS direction, I'm sure any thought of Pandas died after I left.
Try something like that if you can.
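For anyone curious, the kind of thing I mean is just a few lines. This is a minimal sketch with made-up readings and hypothetical column names, not code from that job:

```python
import pandas as pd

# Hypothetical instrument log: channel readings over a few runs.
df = pd.DataFrame({
    "run":     [1, 1, 2, 2, 3, 3],
    "channel": ["A", "B", "A", "B", "A", "B"],
    "voltage": [1.02, 0.98, 1.05, 0.97, 1.01, 0.99],
})

# One groupby replaces what used to be a spreadsheet full of formulas.
means = df.groupby("channel")["voltage"].mean()
print(means)
```

In a Jupyter notebook you can iterate on this interactively, which is most of the appeal.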
When I hear stories like that, I always wonder what the thought process was of the sales people that managed that account. Probably something like this:
- Hey, do you remember our old customer CERN, a world-famous scientific non-profit organization that pushes the boundaries of human knowledge about the universe?
- Yeah, what about them?
- Let's charge them 10 times more for our software licenses
- Can't see anything wrong about it, go ahead
and what I said last time:
I'm honestly pretty happy about this. I'm hoping that by aiming to replace the commercial products they use with open-source alternatives, those alternatives come away with better polish and a better user experience. I also see CERN as an institution that is willing to hire the devs needed to maintain and support a project.
Adam and I spoke to a computer scientist and two physicists from CERN for our podcast, and you might like to listen if you like physics or software.
However, one issue is that small institutions cannot afford to self-host everything, or make the necessary adjustments themselves. I wonder if CERN and other big institutions could perform some heavy lifting (and maybe provide some shared services, hosting servers, etc.) without necessarily centralizing everything like it is done nowadays.
- what happens for stuff like user management with Active Directory etc.?
- what happens for things like FOIA requirements, considering it is e2e encrypted?
As for user management, synapse has identity servers and password providers to form a complete authentication solution. mxisd is a service that uses both to offer LDAP authentication, although it seems dead with no real replacement.
In terms of audit compliance for E2EE, there are three main options:
a) Turn it off (as per the parent post); there isn't a button for this in synapse, but it would be possible to add one, albeit technically an abuse vector given it is effectively a downgrade attack.
b) Add an audit user to rooms which need to be 'on the record'. This is our preferred solution, as it makes it crystal clear to users as to which conversations are on the record (and whose record!) and which aren't. One could run such a user via a client like https://github.com/matrix-org/pantalaimon, and have the server autoinvite them into rooms which need to be recorded.
c) Add an audit (aka ghost) device to users who need to be 'on the record'. For instance, you could use pantalaimon to log in as a given user and record their messages. The audit device will appear in the E2E devices for the user, and once cross-signing lands, could be signed by the user (or their sysadmin) to be trusted. However, we're not keen on ghost devices in general - we've built all of Matrix's E2EE trust model to protect users against unexpected devices being present in their rooms, so we'd recommend audit users instead.
In terms of LDAP integration - there are more and more enterprise integration options appearing; for instance, ma1sd is a maintained fork of mxisd, and we're working on better LDAP bridging for Matrix in general. We want Matrix to work in an enterprise environment, so if people see stuff missing, please yell.
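For anyone wondering what the LDAP side looks like in practice, here is a sketch of a synapse homeserver.yaml snippet using the matrix-synapse-ldap3 password provider. The values are placeholders, and the exact schema may have changed, so check the module's docs:

```yaml
password_providers:
  - module: "ldap_auth_provider.LdapAuthProvider"
    config:
      enabled: true
      uri: "ldap://ldap.example.com:389"
      start_tls: true
      base: "ou=users,dc=example,dc=com"
      # Map LDAP attributes onto Matrix user fields.
      attributes:
        uid: "uid"
        mail: "mail"
        name: "givenName"
```

With this in place, users log in with their LDAP credentials and synapse creates the corresponding Matrix accounts on first login.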
for directory service, LDAP and web SSO are both supported
why? labview is not a dynamic programming language.
> I'm a proponent on using python for labautomation (we run workshops at some major optics conferences, see http://python4photonics.org for an unfortunately somewhat out of date summary about some of what we do).
python is orders of magnitude slower than labview and doesn't have any ability like labview to do true multithreading and multicore execution.
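(to be fair, the GIL only blocks thread-level parallelism; process pools do use multiple cores, at the cost of serialization overhead and none of labview's dataflow elegance. a rough sketch with a made-up CPU-bound task, not a benchmark:)

```python
from concurrent.futures import ProcessPoolExecutor

def simulate(seed):
    # stand-in for a CPU-bound per-measurement computation
    total = 0
    for i in range(20_000):
        total += (seed * i) % 7
    return total

if __name__ == "__main__":
    # each task runs in its own process, so all cores get used
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(simulate, range(8)))
    print(len(results))  # 8 independent results
```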
> In my experience labview is a mess, it's only momentum (in particular drivers provided by vendors, but that is changing) that keeps it going. I find labview code developed by amateurs is an unmaintainable mess. Grad students pass along code from on generation to the next without anyone daring to fix some of the glaring bugs. Really use anything but labview, even if it's the buggy matlab instrument toolbox.
in my experience, code developed by amateurs in any language is an unmaintainable mess. that isn't a valid critique of a language. the same thing happens with amateurs who code in python. i feel python, as a language is a mess, and also as an ecosystem. packages quickly become obsolete or only supported in certain versions.
drivers for labview are really not the thing keeping it going, at all. most third-party (i.e., non-national instruments) companies provide drivers in the form of c/c++ DLLs, .NET DLLs, and labview APIs. since labview can call any of these, this is indeed a plus. however, it is the ability in labview to program windows, real-time linux, and FPGA applications, and the availability of national instruments hardware that easily integrates into this software platform, that makes it a killer. you can do some programming of NI hardware with .net languages, python, or c/c++, but you cannot do the full gamut of windows to real-time embedded linux to fpga. and if you know what you are doing, then the dataflow nature of labview makes it extremely easy to build and maintain large applications.
Do you have any more workshops/hackathons scheduled?
however, it does seem they have used it a lot, at least in the past. they have a site license (which is uncommon except for universities) and what appears to be a large internal component written in labview.
if someone is heavily using ni hardware and software, then there are really not many alternatives at all.
I couldn't find anything concrete on the CERN website, but in their virtual DC tour I noticed HP Procurves as top-of-rack switches. I was really hoping to see whitebox switches.
> Needless to say, isolated initiatives will waste effort and resources.
If the centralized procurement approach is what put everyone in the current mess, what indication is there that a centralized approach to open source won't produce the same issues?
I personally wonder why commit to open source in a centralized manner vs commit to interoperable standards (but not the specific tools used to speak to the standard).
Our company still relies on it, but any new piece of software has to be platform independent. Licensing income probably skyrocketed on MS end, but I don't think they made any friends with W10.
Maybe MS is correct that W10 will be the last Windows. But maybe not because of rolling releases.
Anyway, good news and the correct strategy in my opinion.
Why does it matter if they learn them on a Mac or a Windows machine? The OS for the most part these days are irrelevant.
And when there is a need for an operating system, which I don't see in basic and mid-level teaching in school, schools should use libre software, because it relates to public infrastructure.
I support the educational board use, but you're describing a different kind of class that will enable students to contextualize the history of computing, describe the fundamental paradigms of computing at an abstract level, and maybe prototype some cool demos on a breadboard, but they'd balk like cavemen at modern computers. The idea of actively shielding students from the realities of modern hardware and OSes is ludicrous to me. Should driver's ed classes limit themselves to chalkboard theory and "libre" vehicles hand-built by the students themselves?
Requiring every child in 7th grade to buy an iPad for math courses, so they can write on it instead of on paper, will not teach them about computers. They use them only as tools of convenience (actually, they aren't more convenient; paper is extremely convenient for math), and the school gets to say 'hey look, we are modern and digital!'. That is snake oil. The kids won't learn a thing about the excessively expensive status-symbol devices they are forced to buy, especially not with Apple products of all things. This alienates them from what computers are: universal machines.
I don't have in mind children that 'balk like cavemen at modern computers'. I think of children that can get excited about advanced hardware and systems, and see all the possibilities. They'd ask about the battery run time, the computing capacity, the I/O abilities, the instruction set, etc. And they'd want to code the machine to do things they deem useful. That curiosity is not just left unnourished, it is killed by how things are done now.
Driving school has different goals. Cars are products that are dangerous when their operator is not trained. So the driving school is there to train people to operate these machines safely. General public school has the goal to prepare children with general knowledge, a basis to build on, so they can be informed citizens. Democracies crucially depend on that.
How? Moving a mouse around is the same in KDE, Gnome, Windows and MacOS. A keyboard is the same. The terminals aren't even that different (many Linux and DOS / Powershell commands have equivalents). Almost all modern programming languages are cross platform.
You aren't explaining how it creates this dependence, because I don't see it.
> The teaching can be done abstractly on a board and paper, or with interactive games.
You can't learn how to program or really know what the computer is doing unless you are using and trying to create something on there.
> The practice can be done with microprocessors on educational board with plenty of easily accessible I/O.
This sounds more like electronics class than anything else.
Humans are strongly bound to habits. I have seen that personally, and in reports about institutions doing the change. Some people were strongly opposed to change away from Windows because the alternative doesn't look and feel the same, so they don't like it. And that even if the alternative offers the same capabilities.
Building a habit around Windows is part of Microsoft strategy to bind people to its platform. That is why academics and schools get discounts. Public infrastructure like schools must not fall for this fixation.
Children learn how a combustion engine works, and how it can be controlled. Why don't they learn how a CPU works, and how to program it?
> You can't learn how to program or really know what the computer is doing unless you are using and trying to create something on there.
This is incorrect. The principles of a computer can be explained and understood with pen and paper. Even the execution of a simplistic computer can be worked out with pen and paper, just as algorithms can be thought through and reasoned about that way.
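To make that concrete, here's a toy accumulator machine (my own sketch, not from any curriculum) small enough that every step of its execution can be traced on paper:

```python
# A toy accumulator machine: a program is a list of (op, arg) pairs.
def run(program):
    acc, pc = 0, 0          # accumulator and program counter
    while pc < len(program):
        op, arg = program[pc]
        if op == "load":    # set the accumulator
            acc = arg
        elif op == "add":   # add to the accumulator
            acc += arg
        elif op == "jnz":   # jump to instruction `arg` if acc != 0
            if acc != 0:
                pc = arg
                continue
        pc += 1
    return acc

# Count down from 3 to 0: trace acc and pc by hand to follow the loop.
countdown = [("load", 3), ("add", -1), ("jnz", 1)]
print(run(countdown))  # 0
```

The same trace works on a chalkboard: two columns, acc and pc, updated line by line.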
Almost every operating system on the market comes with a WIMP interface that requires some sort of pointer device. So yes they are kinda necessary now.
> don't want schools to prepare the next generation of Web users that can create an account with Google, but barely anything more; that way the democratic oversight of technological changes and powerful companies fails, because the citizens aren't informed and competent.
You are building a strawman here. We were talking about concepts. Concepts can be taught on any modern operating system. What logging into Google has to do with any of this isn't clear.
> Humans are strongly bound to habits. I have seen that personally, and in reports about institutions doing the change. Some people were strongly opposed to change away from Windows because the alternative doesn't look and feel the same, so they don't like it. And that even if the alternative offers the same capabilities.
Sorry, this is another misrepresentation, conflating habits with people not liking someone changing something that works for them.
People generally don't like you changing things because change normally brings a new set of problems they don't want to deal with when what they already have works perfectly well. That is what they are really telling you when they say "It doesn't feel the same".
This has nothing to do with teaching computing concepts in schools. The reality is that only a small percentage of those children taught these core concepts will choose to pursue them in higher education.
> Building a habit around Windows is part of Microsoft strategy to bind people to its platform. That is why academics and schools get discounts. Public infrastructure like schools must not fall for this fixation.
Another way of saying this is "Apple and Microsoft make products that work well and people will prefer them if they aren't forced to use something else first".
Again this isn't anything to do with teaching computing concepts which can be done on any modern computer (or even ones from the past).
People like to pretend that being required to use a Windows machine is akin to water-boarding or some other awful imposition.
Computers for children are used for convenience or to solve problems. Neither of which requires Windows, or any other vendor-specific platform.
It's neither justified nor necessary to force children to use Windows or any other specific operating system in school.
If they don't require windows and it is OS agnostic it shouldn't matter whether they use Windows or something else.
> It's neither justified nor necessary to force children to use Windows or any other specific operating system in school.
You are talking about Children using Windows as if it was forcing them to clean chimneys. This is ridiculous.
But enterprise is the bigger deal anyways.
Kind of funny in this context, since the specific impetus here is Microsoft narrowing the applicability of academic licensing.
It's not: it's things like Active Directory, Skype4B, Office, Exchange etc. Desktop OS's aren't covered by MAlt.
Universities have thousands of students that do not generate any revenue. The standard per-person or per-computer licensing model totally breaks down in that case, producing a huge bill based only on the sheer number of users. Education discounts adjust for that: large user bases with no money to pay.
That being said, CERN is a government entity. They should argue to be given government discount.
There are thousands of students, postdocs and academics with CERN computing accounts (way outnumbering employees). It's not like they're generating revenue either.
> They should argue to be given government discount.
They did, IIRC, but their status as an IGO didn't cut it with Microsoft.
Honestly, it looks like the terms and licensing are being dictated by some dude(s) from Microsoft in the US, based on US norms. The way academy/university/research is organized and operated in Europe is fairly different. Bet Microsoft doesn't mind the added money either.
If CERN were playing by US rules, they would be suing Microsoft to be recognized as a government or academic entity.
I agree with the move away from .ch, since it's an international organisation, but .int would have been the correct choice IMO
I suspect they will discover over time that the fully loaded costs of switching are higher than the costs of the licenses, unless this is just a way of trying to leverage MS into giving them academic licenses again.
Given that figure, do you really think that one-time project implementation and switching costs amortized over the next X years + ongoing support would really come to more than 10x? How do you make such a strong claim that easily, any prior examples?
Not saying you're obviously wrong or that CERN is obviously right, but this seems to be a much harder call (and intuitively feels right in fact), given the number.
Edit: here  is a good article (unfortunately in German) about the reasons.
Given this, some years ago the mayor changed, Accenture did a 'cost' study, and MS moved its German HQ to the city.
Can't speak for large organizations, but some of the better-functioning places I have worked at have been running without many Microsoft services.
Also, many of us sysadmins would say that in many cases we would save more from not having to deal with the hassle that is Windows than from the licenses alone.
I mean: having to run and patch Apache httpd or other standard technologies on Windows is such a major hassle compared to Ubuntu or any other mainstream distro.
You can of course just use IIS, but then you have to deal with the hassle that is IIS.
This is a bit tongue in cheek, but the idea that Windows is somehow easier for everything is a bit unnuanced I think.
Officially, IT doesn't support Linux. Unofficially, they'll be happy to make sure everything works.
We threw out MS Office many moons ago anyway, and management switched first and with genuine enthusiasm so it was no problem to get the rest of the company on board ;-)
And this is not the first place: already in 2009 - 2012 I worked for two places like that; well, one of them went further and almost completely banned Windows because the IT manager couldn't stand it. The other place just had a policy to let people choose, and a dev team that would encourage people to pick Macs.
From 2015 I've been free to choose my own OS again.
I guess you are right.
Still, it gives me hope that things are changing for the better :-)
I suspect the switching costs will be higher in the short term for sure, but this also seems to be a strategic move.
In a quest for money they didn't need and likely weren't going to get much of anyway, by removing perks MS had offered academic institutions, they've risked the creation of a much greater threat that can undermine their entire business. CERN is an organization which could truly help create a set of alternative tools to MS products.
Talk about killing the golden goose.