Hacker News
Announcing .NET Core 1.0 (microsoft.com)
717 points by runesoerensen on June 27, 2016 | hide | past | favorite | 314 comments

FYI, from the announcement. Probably the most important part at the end, italicized by me. Also note (elsewhere) that licenses are MIT and Apache2:

.NET Core Tools Telemetry

The .NET Core tools include a telemetry feature so that we can collect usage information about the .NET Core Tools. It’s important that we understand how the tools are being used so that we can improve them. Part of the reason the tools are in Preview is that we don’t have enough information on the way that they will be used. The telemetry is only in the tools and does not affect your app.


The telemetry feature is on by default. The data collected is anonymous in nature and will be published in an aggregated form for use by both Microsoft and community engineers under a Creative Commons license.

You can opt-out of the telemetry feature by setting an environment variable DOTNET_CLI_TELEMETRY_OPTOUT (e.g. export on OS X/Linux, set on Windows) to true (e.g. “true”, 1). Doing this will stop the collection process from running.

Data Points

The feature collects the following pieces of data:

The command being used (e.g. “build”, “restore”)

The ExitCode of the command

For test projects, the test runner being used

The timestamp of invocation

The framework used

Whether runtime IDs are present in the “runtimes” node

The CLI version being used

The feature will not collect any personal data, such as usernames or emails. It will not scan your code and not extract any project-level data that can be considered sensitive, such as name, repo or author (if you set those in your project.json). We want to know how the tools are used, not what you are using the tools to build. If you find sensitive data being collected, that’s a bug. Please file an issue and it will be fixed.
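Since the opt-out is just an environment variable, here is a minimal sketch of setting it (bash shown; the variable name and accepted values are from the announcement above):

```shell
# Opt out of .NET Core CLI telemetry.
# OS X/Linux (bash/zsh) -- add to ~/.bashrc or ~/.profile to make it stick:
export DOTNET_CLI_TELEMETRY_OPTOUT=1

# Windows (cmd.exe) equivalent:
#   set DOTNET_CLI_TELEMETRY_OPTOUT=1

# Sanity check that the variable is visible to child processes like `dotnet`:
echo "$DOTNET_CLI_TELEMETRY_OPTOUT"   # prints 1
```

Per the announcement, either "true" or 1 works as the value.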

I choose to ignore this post so five weeks from now I can write an outraged "Microsoft is TRACKING YOU" article on Medium that will garner me praise and upvotes.

Much quicker and easier than simply forking the code and commenting out the telemetry feature :P

And also generates more upvotes.

And 100x quicker than setting the clearly documented environment variable that disables the feature.

> You can opt-out of the telemetry feature by setting an environment variable DOTNET_CLI_TELEMETRY_OPTOUT (e.g. export on OS X/Linux, set on Windows) to true (e.g. “true”, 1). Doing this will stop the collection process from running.

It is still opt-out, and thus considered harmful (at least by me).

If it were opt-in, they'd collect significantly less data, and the data they did collect would likely be heavily skewed. Microsoft could be misled by the poor quality of data collected, and an opt-in system could actually be worse than not collecting any data in the first place.

The way I see it, at the end of the day, the decision for Microsoft is really between not collecting data and an opt-out system. If Microsoft chooses not to collect data, then all developers have to live with tools that improve slowly and have issues (possibly security related that could be maliciously abused) that are not fixed as quickly as they could be.

If Microsoft chooses an opt-out system, they can collect the data they need to make sure their tools are working optimally and as intended. Some developers may not be comfortable sharing how they use Microsoft's tools even with no personally identifiable information collected. These people can opt-out while minimally compromising the quality of the data collected. Additionally, the tools are open source, so any developer that's skeptical of how and what data is being collected by the tools can verify Microsoft's claims.

Those are the two options I see. To me, the cost/benefit of the second option greatly outweighs the cost/benefit of the first for all involved. By not collecting data, security issues that could actually compromise your privacy could go unfixed for longer. By collecting data through an opt-out and open source system, Microsoft can fix issues ASAP and developers can verify that data is collected in a way that preserves their own privacy.

It seems like a lot of people are knee-jerking to the idea of collecting data through an opt-out system and not actually weighing the cost/benefit of the realistic options. Can you explain how not collecting data has a lower practical cost/benefit ratio than an opt-out and open source system?

Bullshit. Other companies manage to build quality products without opt-out tracking of their users just fine.

That doesn't refute my point. No data collection still leaves a greater probability of issues being left unresolved for a longer period of time. Also, the code is open source. You can see exactly what data is being collected.

To address your point, it's not possible to be aware of the benefits you're missing out on without data collection.

For me the issue comes down to whether or not Microsoft does anything useful with this data (probably not, if 20 years of NVIDIA blue-screen driver failure logs, Windows 8, and OneDrive are any example of how 'big data' impacts Microsoft product quality), versus how many comments I have to read where joeblow52 is personally offended that Microsoft dares to learn what his compile time, plus 999,999 other compile times, divided by a million, equals.

How, exactly, are you thinking that Microsoft is going to fix nVidia's buggy drivers? They can collect all the data they want, but at the end of the day, it's nVidia's driver.

I worked there. Ways we solved these sorts of problems include: hardening the other side of the API/HAL when appropriate/possible, simplifying the driver model so that mere mortals could write drivers, writing our own drivers and overwriting known buggy ones for companies that couldn't get their shit together (usually network vendors), adding workarounds to the OS not to use certain features of certain cards, flying external engineers to lavish parties and our driver development labs and compatibility labs and providing one on one engineering development assistance from senior kernel developers, providing free testing of drivers for known problems before release, rolling fixed drivers into Windows updates, providing marketing funds as reward for fixing problems, and not using NVIDIA in the Xbox 360 after using them in the original Xbox as punishment because they were personally responsible for over 80% of blue screens in Windows for the preceding five years.

Sadly the motivation was often to ignore the data or watch it get spun by some jackass with the exact wrong agenda. It's just software, there's always a way to fix things if you really want to.

Nice work, thanks for the insight!

I just installed these tools, and on first invocation they tell you that telemetry is enabled and how to disable it. I think that buys a little bit of good will. I am also opposed to telemetry by default, but I understand it, and appreciate the opt-out message being presented clearly and up front.

I'm one of the self-appointed resident whiners but I think I'll cut them some slack because they've used the magic word "preview".

This case is different from silently adding telemetry on a minor upgrade to a tool in production

"The telemetry is only in the tools and does not affect your app."

Microsoft is looking to do more data-driven design; this is the reason for all the telemetry in Windows as well. Raymond Chen pointed out an example where a button was removed from File Explorer (prime real estate) because the telemetry showed that hardly anyone ever pressed it.

It's unfortunate they are so tone-deaf about the PR implications in Windows.

We have removed all the weapon deployment buttons from our nuclear subs, silos, and bombers, because telemetry shows that they are typically only pressed during unit testing.

Since we only have two recorded events of someone deploying a nuclear weapon in a real-world scenario, it should be safe to relocate those buttons off the main console, and just use the auxiliary functions button, behind the kick panel, next to the row of DIP switches used for selecting the button press function. Since "reorder coffee pods" was the last auxiliary function to be added, "launch nukes" will be next, so make sure you double-check those switches before pushing, coffee drinkers!

You're joking but I'll give a serious reply to why your analogy is off. This is why they have war games. The telemetry from those games would be included in the analysis.

My point was that user interface patterns do not always indicate the importance of the control, because there may be multiple modes of operation (such as "Standby", "War", "War Game", and "Readiness Drill") that might not be obvious--or even apparent--from the raw button-press numbers.

You want the "War Game" telemetry to count for the "War" design, but you can't really tell which is which without having additional data that you shouldn't be allowed to see. If you detect "casual user" and "programmer" modes in your OS, the user shouts, "I don't want my computer knowing (and sharing) that much info about who I am!"

Even though most people aren't likely to ever need to use a fire extinguisher, you really want that interface to be simple and accessible. If you decide that because no one ever used this particular fire extinguisher, it doesn't need to be there, that's a decision that could make the difference between life and death. Moving a button on a form doesn't seem quite that serious, but it could still cause people to lose time or money, which adds up across all the people using the software.

But they aren't saying they will blindly follow the telemetry data, just that it's an important factor.

You need to determine which feature to add:

1. One that 3 developers asked for

2. One that 6 developers asked for.

With telemetry that can become:

1. One that 3 developers asked for and can improve things for 60% of your userbase.

2. One that 6 developers asked for, but only 12 of them are using the function that it will improve.

Without telemetry data, you are playing with an arm tied behind your back.

I know this is just meant in jest, but because the weapon deployment buttons are something you never want pressed by accident, ideally you really do want to obscure them behind layers and layers of user interaction.

If anything we had the problem of them not being obscured enough for a while (if the story of the launch codes being 00000000 across the board is true).

I thought it was just four zeros (0000)

The "Compatibility Files" button was removed: https://blogs.msdn.microsoft.com/oldnewthing/20150902-00/?p=...

Like when they removed Macros from Visual Studio because according to telemetry only 1% of users used them.

I always suspected that there was probably significant overlap between users of macros and users who opted out of telemetry. Lesson learned: leaving "yes" checked from now on.

Big Brother has trained you well. ;-)

Yep. Don't vote and nobody will care about your (unexpressed) needs when making decisions.

Well, it would be nice if there is a way to vote or express one's needs via some alternative to having one's usage actively mined for data.

I think you are truly in a minority if you'd rather take a survey than have the data collected automatically. There has to be a sane middle ground.

Data collected by a neutral third party, without conflicting interests pushing for creative monetization opportunities, that's responsible for sufficient anonymisation/aggregation before releasing the data?

There is mining for data and there is mining for data.

Monitoring which features I use? Opt in.

Monitoring what I do? Uninstall if possible.

You can always refrain from using their technology. There is plenty of green grass in the neighboring fields.

I only remember accidentally turning them on and then getting mad!

Are you referring to the 1% on wall street? ;)

The Story of the Ribbon video about Microsoft and data driven design.


The Ribbon is, in my opinion, the user interface equivalent of a broken pull tab on a can of corn. I know it is there, I know that I'm supposed to use it, but I can't because it is broken.

I am relieved to see there is somebody out there who shares my opinion on this ... innovation.

Ribbon, or "How to Figure Out Where a Menu Item Is Based Solely on UX Data"*

*Customers may not be granted access to UX data

> a button was removed from File Explorer

Would that be the "Go up to the parent directory" button that was removed in 7 by any chance?

Easily my favorite button in file explorer.

They still have that button, and you can go up by clicking on the parent folder name as well.

[Ctrl][Up] works in Vista onwards to move up a folder. It's worth knowing.

<Backspace> does this, since XP I think - ^↑ doesn't seem to work for me in Win 10.

Backspace is "back" not "up a folder level" as far as I am aware. This is consistent with web browsers - and indeed the conceptual meaning of "backspace" (go back 1)

No. And that was added back in Windows 10.

8, actually

So random question: I like when MS people come out and engage the community like we are responsible adults and not like moronic children who need hand-holding and cannot decide for ourselves.

Since we all know the famous "open source is cancer" reign of Ballmer oft-joked of here, I assume the related issue at hand, the culture shift, is good and less of a concern now, as opposed to a few years ago. But I am super happy when people like you drop by.

How do people like us on HN, Reddit, and Ars get you and those like you to say: see, daveguy's manager[s], we will reconsider our avoidance of the MS tech stack because of how daveguy did X?

Is this silly? I am not sure. But I wonder if I am the only one who wants people like yourself to thrive so MS keeps going in this direction.

The best way to make sure they (we, since I work at Microsoft) get the feedback is by finding out where the product is listening.

A lot of products have UserVoice set up, a lot of dev product teams (dotnet, asp.net, etc.) are looking for GitHub issues.

For Windows 10 I'd recommend using the Windows 10 Feedback Hub - if you hate something about Windows 10, post feedback in the Feedback Hub before you uninstall it.

Visual Studio has this page that explains where best to submit feedback / bugs / feature requests: https://msdn.microsoft.com/en-us/library/b8akas30.aspx

There are lots of us who see your comments on HN / Reddit / Slashdot / etc., but having been part of the Microsoft ecosystem (both inside and out of Microsoft) for many years, I can tell you that there's a huge difference between "I've seen some unhappy comments about X" and "Our feedback numbers show 68% of users want us to improve feature X". Also, specifics are important - a detailed issue report with repro steps and recommendations for what you'd like to see works a whole lot better than "@drunkvs LOL VS suxxxx ikr!!!"

Also, just because I'm an ocd fact checker, Ballmer technically didn't say "open source is a cancer". He said that Linux is a cancer, referring to the strong copyleft license requirements (https://web.archive.org/web/20011211130654/http://www.suntim...). [yes, I very well know that you can run software on Linux without that software being GPL. I'm writing this from Red Hat DevNation :)]

Regardless, I can tell you that our current CEO and everyone I work with is excited about writing great software that runs everywhere.

Well I tried that once or twice in the Connect era. But if you say it will make a difference this time, next time I have issues I will contribute.

And yeah, that cancer line was meant to be snarky and cute, not factual. It does not really matter more than me indicating the obvious: that this is a big culture shift if you look at the timeline. I think MS has a ways to go too, and the long game is where we will see how it fares, but I want to make sure we encourage the good. Or it will languish, as "no one cared after a while."

Yeah, please try again. The problem with a lot of Connect programs is that they were tied to a specific release (MonkeyManager Pro 2008) so bugs that didn't make the cut for the product release just got lost in space (not picked up for MonkeyManager Pro 2010). Part of the reason was that there were always internal bug tracking systems, which was what the dev team was really working from day to day, and these other systems were just inputs to that.

If you look at any of the projects that are on GitHub now, the public issues list is the issues list. You can see all the dev comments, scheduling via labels, pull requests, code reviews, etc.



For products that aren't on GitHub, public issue lists like UserVoice have still been a big improvement (in my opinion) because they usually keep the bugs / issues / feedback around until it's fixed, and the votes accumulate so the important issues keep bubbling up.

I had a couple of reports on Connect related to crashes in Edge on Xbox disappear as part of a migration - the best I can work out they've become private/internal. I'm assuming those kinds of things are going to UserVoice and not GitHub but is there some way I can find out what happened in the end?

I am looking into ways to move everything we do off of MS Tech completely because I don't trust their direction with Windows 10.

I posted this because I thought it was an interesting and relevant part of the announcement. If MS was this straightforward and transparent with Win10 (allowing easy disabling of telemetry and the option to control upgrades at all levels) then I wouldn't be looking at alternatives. Off by default would be better, but they would get approximately zero feedback with that. Best would probably be to ask on install, with a checkbox that's easily unchecked.

Whoever decided on their .NET tools telemetry and messaging needs to be in charge of that aspect across all products.

You and I are on the same page. What concerns me is that the best work on securing core internals of the OS against long-standing problems, such as strengthening LSASS and SAM against pass-the-hash with virtualized, isolated process shielding, is tied to hardware and an OS I will be forced onto already. All the talk of Corona scares me, and not even CIS and the big players have guidance, since MS has the new attitude of "just trust us already." Their announcing that, even with the Enterprise SKU, you cannot disable the Windows Store worries me. In my personal life, Windows is something I hardly ever use. But work, and Windows 10 being shoved down into the desktop market, is scary. The Credential Guard stuff and improvements are wonderful, but their increasingly "we're the cloud and you're the peon" attitude also terrifies me.


I think that kind of engagement is present on their github repos.

Well, until it is not interesting for them as a PR strategy. I do not want to lose these people when the pendulum swings back the other way.

And you and I check GitHub. That does not mean any engineering managers care. I am curious what else we can do beyond starring GH repos. I do that; I just do not like my lazy armchair form of support, and I worry I will lose a culture shift at MS I have quickly fallen in love with after years of reviling the company as a whole, short-sighted or not.

Purely subjective and observational... but I had a few MS guys go out of their way to communicate a bug in probably the most popular node.js driver for MS-SQL (tedious), and it was interesting... There wasn't as much updated on the GitHub issue, but I was included in some of the communication, and they were pretty open about it.

The person that had poked in was, iirc, an Azure developer at MS, not from the MS-SQL team. There are plenty of developers at MS that do follow various GH projects (as happens everywhere else) and will get the right people involved when they see things.

I would presume that pushing these kinds of projects (.NET Core, etc.), and even the documentation, out to GitHub means things likely won't close off any time soon, and not without losing more mindshare than Oracle has caused people to lose by moving away from Java.

So it is bad when Microsoft asks to do it, but OK when Eclipse, NetBeans or IntelliJ do it?

They are being quite open about it and how to turn it off.

I don't think the previous comment is suggesting it is bad. I think they're trying to stave off any "OMG! MS is looking at your porn!" posts by highlighting the privacy policy of the telemetry, which sounds reasonable to most people.

Thank you. I wasn't suggesting it is necessarily bad. It looks like the openness and ease of disabling is being handled much better here than with Win10.

However, the option to disable on install (checkbox prompt for preference) would be much better. This is an improvement over the Win10 approach.

Sorry if I took it the wrong way and jumped the gun too soon.

Microsoft is by no means an example of sainthood, but many in the HN community tend to only criticize it, while closing the eyes to similar actions of companies that are closer at heart.

Yes. It's bad when Microsoft does it this way because it's opt-out. IntelliJ/Android Studio prompts you and asks you your preference first. I think Eclipse is opt-in but could be mistaken.

That's the recurring debate: opt-in = better for privacy but skewed stats vs. opt-out = more representative stats but bad for privacy(?).

Homebrew is opt-out post-install by setting a flag as well (no option or warning pre-install). But they do remind you afterwards that they are tracking and give you a link to learn how to disable it. As long as no data is sent until first use (giving you a buffer to opt-out), I don't have an issue with this.
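For comparison, a sketch of the Homebrew opt-out mentioned above (assuming a Homebrew version that honors the HOMEBREW_NO_ANALYTICS variable and the brew analytics subcommand):

```shell
# Disable Homebrew's analytics via environment variable...
export HOMEBREW_NO_ANALYTICS=1

# ...or, with Homebrew installed, via its subcommand:
#   brew analytics off
#   brew analytics        # reports whether analytics are enabled

# Confirm the variable is set for subsequent brew invocations:
echo "$HOMEBREW_NO_ANALYTICS"   # prints 1
```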

Microsoft is a bit more influential than those guys, no?

The telemetry was in the original preview release of the tools with RC2. It can be turned off and I believe it won't be in the final stable CLI tools release, but that may change.

It's the same with VS and VS Code (and Atom too). There has been a move from opt-in (for older versions of VS) to opt-out. Although I do still see connections to MS domains with all the customer experience stuff turned off.

I simply don't want to be tracked in any form regardless of whether it collects my personal data. Yet it seems more and more impossible these days. It's like "If there's nothing about collecting your personal data, then you should compromise".

Sorry to tell you this 4684499, but this website and most every other website on the internet knows how many page views they have and the IP addresses that are used to get to their website, etc. ...all because they have telemetry on the server. It's how they tell if something is working or not working. Not all telemetry is bad.

At least it should be an opt-in, not opt-out.

A router has to track you to know where to send the info you've requested.

Well, it does track, but it doesn't necessarily collect the data and use it to profile users. I just don't like being experimented on like a little white mouse in a lab, regardless of whether it's for good or bad.

A couple months ago I installed .NET core on a Ubuntu virtual machine running on the Windows 10 hypervisor, and was able to get a MVC5 app running using Visual Studio Code. As someone who really loves Visual Studio (it made me expect a lot more from my tools) and C# (it made me expect a lot more from my languages), it was an exciting moment. I actually took a selfie with my monitor.

It was still a little rough: the "getting started" instructions ONLY worked on Ubuntu 14.x and not Ubuntu 16.x, and my PR to the documentation pointing this out was nixed. (I notice they've since added a disclaimer: https://docs.asp.net/en/1.0.0-rc1/getting-started/installing...). I really hope to someday be able to build projects with React and a .NET core WebApi and be confident that my teammates will be able to get the project running on their macbooks without kms.

MVC 6 is the version that works best with Core, although they don't really refer to it directly much any more. Both MVC 6 and Web API 2 are now part of Core. They're just packages like everything else.

Good point, I meant MVC6. I confused the versions of ASP.NET 5 (which is now known as ASP.NET Core) and MVC6.

MVC 6's name is now also ASP.NET Core MVC 1.x.x. It's really a version number reset. The confusing thing is that it's not plugged through everywhere and people interchange the two regularly.

#wellactually We don't talk about ASP.NET MVC as a separate thing anymore. It's just ASP.NET Core. So you might build an MVC app on ASP.NET Core, or build APIs on ASP.NET Core, etc., but it's all just ASP.NET Core.

Personally, I like the new "one true name". However, most .NET developers know what ASP.NET MVC is for, but are less clear about what ASP.NET Core is about. Hopefully this will change over time but, for now at least, MVC is still a useful term from a marketing perspective, or at least that's what my publishers tell me. :)

The naming is really confusing, but then naming things is hard. As Phil Karlton famously said:

  There are only two hard things in Computer Science: cache invalidation and naming things.

I don't know where it came from, but I've always preferred this variant:

There are only two hard things in Computer Science: cache invalidation, naming things, and off-by-1 errors.

I actually used that version in my recent book. You can see it in this section [0] of the first chapter (which is free to read, with no sign-up) that tries to explain the confusing naming.

[0] https://www.packtpub.com/packtlib/book/Application%20Develop...

That tweet dates from 2010 and the same quote is on this page [0] dated 2009. Of course, it's possible Martin updated it after the initial publication. I'm not saying you copied it either, it could have been independently created.

[0] http://martinfowler.com/bliki/TwoHardThings.html

I wish MS would release a stripped down version of VS for OSX/Linux, but I know it probably won't happen.

Visual Studio Code is pretty neat, and its plugin system is very powerful: more and more languages are adding amazing IntelliSense and interactive debugging support for it. It's like Atom, if Atom was faster and focused on exposing nice APIs for autocompletion!

I've played with VS Code a bit, it's a pretty cool tool for small scale scripting, but it's not really a replacement for a full fledged IDE.

Hopefully Jetbrains will release their cross platform C# IDE sometime soon. I would prefer Visual Studio, but I don't think that's going to happen (at least not anytime soon).

From my experience, Rider is pretty stable and usable. Have you tried it out at all? https://www.jetbrains.com/rider/

Oh, didn't realize you could request early access. I'll check it out, thanks.

Not being a fully fledged IDE can be good and bad. I'm using VS Code for most things I use VS for. It's super fast to launch, with great language extensions, a debugger, an integrated terminal, and tasks. It's like Sublime and Visual Studio had a baby. I have licenses for JetBrains but I seldom ever use it.

I may also be a bit biased since I contribute to VScode extensions. But I see that as a positive. I use a free editor that I can hack, look at its source and collaborate in the open.

F# doesn't work OOTB with SDK preview2. It works OK with preview1 of the SDK (win/ubuntu/osx/docker), but preview2 has a bug; the fix is in progress (ref https://github.com/dotnet/netcorecli-fsc/issues/12) and a NuGet package with the fix will be published soon.

You reported the issue 11 days ago. I'm surprised they just announced it with such a basic use case being broken. It doesn't bode well for F# as a first class citizen in their ecosystem.

F# is not a first class citizen. They pay it lip service because it's a far more advanced language and makes MS look like they're on the cutting edge. Plus the team that made it is responsible for dragging the CLR into the modern era (or into the 60s) by bringing generics. And showing off important features on .NET such as quoted code, async workflows, F# interactive. But a simple look at tooling and language announcements shows that the F# team is very underfunded.

> The results of the 2016 Stack Overflow Developer Survey are in! F# came out as the single most highly paid tech worldwide and is amongst the third top paying techs in the US


Yes and it's clear from what the F# team has said that the hope they have for the future of the language is that the community will invest.

That's fine, but let's not think that this will produce tooling anywhere near as refined as C#'s stuff. Take F# Interactive vs. the C# one: F# has like a decade's lead, yet the C# interactive editor is smooth, polished, and even has VS project integration, something the F# team had thought of doing many years ago.

Non F#-team members[1] have said that internal politics are the issue here. To the point where some books were ... edited ... to paint C# in a better light, relatively. MS's marketing reflects this. My guess is they're too proud to admit their flagship language from their high-profile hire was shown up by what was a research project. And that the CLR's arguably biggest tech advantage over the JVM (generics) was also only done through the intense efforts of MSR; that MS Corp was against it.

It's sad, because MS is in a position to really elevate the world's programming consciousness/ability by really promoting F#, yet it's still a novelty for, as MS has said "scientific and engineering" applications. Yet, apart from tooling/legacy, F# handles every case C# does in a better way. At worst, it's C# with lighter syntax.

Oh well. At least it's there, works, and has some level of support. Only reason I consider using .NET these days.

1: The F# folks are amazingly polite and I've never heard them even hint at a complaint about MS.

The finance industry, where F# shines, is willing to invest, I suppose. OCaml, e.g., is backed by Jane Street, and F# is even simpler in that regard because the hardest part, an efficient runtime, is for F# just .NET, already done well by MS. But what could some corp willing to invest in F# tooling actually do, while MSVS is closed source and is not the most transparent IDE in the world, to say the least, wrt plugins?

> too proud to admit their flagship language from their high-profile hire

That's astute, since we know that high-profile hire's past victories (Turbo Pascal, Object Pascal, Delphi) were never about the language, but about an incredibly polished IDE, compiler, libraries and runtime.

> yet it's still a novelty for, as MS has said "scientific and engineering" applications

MS marketing wisdom is overrated, to say the very least. Look at the Tablet PC. Windows XP Tablet PC Edition was released in 2001, and Ink APIs have been throughout the Windows SDK ever since. And they never realized that the stylus thing is something more than just 'uhm, you can draw a kitten, maybe?'. At least they never articulated anything more than that in any promotion campaign. And now Apple has released the iPad Pro and will eat the TabletPC-Convertible-Surface market.

The bug was found too late, near the code freeze for RTM; there was not enough time for a fix.

The fix is in progress (https://github.com/Microsoft/visualfsharp/pull/1290), probably landing tomorrow; one or two days of delay is not a big problem. Also, it's an SDK issue (preview2), not a .NET Core (RTM) issue.

The F# support is in beta; C# was ahead, obviously, but that's OK I think, since it's the language used for corefx, etc. VB is not working at all atm.

The .NET Core SDK preview1 worked OK OOTB; see for example https://github.com/SuaveIO/Suave-CoreCLR-sample, working xplat (win/ubuntu/osx/docker). And preview1 continues to work.

What I really like is how the SDK is evolving: using modularized components, it's possible to fix/improve/evolve the F# compiler/library without waiting for a new version of the SDK. That's really good.

In the next SDK preview or the RTM, `dotnet new` is also going to be updatable, so no problem about templates either.

Seconded. I saw the announcement and immediately went to grab the docker image and give it a shot with F# via "dotnet new -l F#" and go. It restored and built, but didn't run. So v1.0 is broken, I guess?

I don't think so. It means that the tooling is broken, which is not at 1.0 yet (still in `preview2` I believe).

Exactly: preview2 of the SDK is, at the moment, broken for F#. Not .NET Core; that part is OK.

New console/lib templates are going to be published (with the fixed F# compiler/library package); that's all.

Still a little odd for a release announcement. It feels disingenuous to only afterwards make the fine-grained distinction that yes the libraries are technically v1, but the tooling is still in preview.

Kinda like announcing your new line of cars is ready for purchase today! Except that down in the fine print you might find out that the engine is ready to go, but the steering wheel, headlights, dashboard, and pedals are all still in development and so it's justifiable that they're broken.

It was like that in the previous release too: .NET Core was RC2 and the .NET Core SDK was preview1.

I use it, and I think it's OK. The SDK is just a build system; the real value is .NET Core itself, meaning the CoreCLR (the virtual machine) and the foundation libraries (corefx). I can change the build system one month from now to build a project using the old packages.

It's not a car (engine vs. wheel). It's more like food vs. marketplace: the food is the real deal; the marketplace may be incomplete (no parking).

To build off of enricosada's post, this is indeed being tracked and actively worked on right now. More information here: https://github.com/dotnet/cli/issues/3705#issuecomment-22889...

Known issues will also be updated, as will documentation on how to use F# on .NET Core.

I have just run into this problem when migrating my project. I see the build is nearly done; when it is, what's the best way of patching?

Odd, it worked for me without any problems.

After so many years, Microsoft is finally presenting their tools in a more modern way. I've never been a huge fan of .NET, but I can't deny it's a great tool; hopefully people will try it out on more platforms.

I still can't shake the feeling that their naming is just confusing. I still am not sure what exactly is .NET and what's not.

.Net is 14 years old now - if you don't know what it is by now you're probably not ever going to know. It's a cross platform runtime and a group of programming languages that run on it, just like Java.

The general idea of .NET as a platform is easy to understand. The naming of .NET Core, .NET Framework, ASP.NET, etc. and the difference between them is not.

.NET Framework = the whole shebang, the full runtime including Windows-only bits.

.NET Core = portable subset of .NET Framework. Therefore not entirely compatible with .NET Framework.

ASP.NET Core = portable rewrite of ASP.NET. AFAIK not fully compatible with previous versions.

IIRC, ASP.NET Core does not contain the web framework part, just the server stuff.

My point is that it's just hard to understand what exactly one means when they write ".Net". Do they mean the entire framework, a specific library, the CLR, one language, etc.? For whatever reason, the name has a very contextual meaning, and people use the terms interchangeably (".Net Core" is clearly specific though).

Not that they're alone. Adobe did worse with "Flex" since it could refer to the compiler, a framework, and an editor at one point until they decided to make it a bit more standardized.

Java JRE/SE/JDK... It's really not much more confusing than what others use, and as to what you need to install for something.

.Net Core apps should be portable applications (portable as in the runtime is compiled in)... Yeah, it is a little confusing, and hopefully removing some of the separate terminology will help. A lot of what has changed is that you will likely be developing .Net Core (or Xamarin) apps that target a given platform to run in... Most of the rest should be cross-platform modules that install via NuGet (the platform/language package manager) and are bundled with the application output.

> Java JRE/SE/JDK... It's really not much more confusing than what others use

That's not saying much; "Java" means so many different things it's enough to make your head spin. At least these days it isn't a stock ticker symbol any more.

Java is confusing too so being just like Java doesn't mean it's easy to understand. (Do I need the Java runtime environment, the SDK or the browser extensions to run this code?)

> Java is confusing too

> Do I need the Java runtime environment, the SDK or the browser extensions to run this code?

It's not really confusing.

The Runtime is required to run code. The Software Development Kit (SDK) is for development, and the Browser Extension is for running Java in the browser.

That's really only half of it; Java is also a programming language, a bytecode spec, and a binary executable that sometimes points to a JRE and sometimes refers to a JDK, depending on how you installed it. Arguably it's also now being used to refer to the API of the standard library too.

What about Java SE?

The naming is very confusing but it's better than the old name of ASP.NET 5. Source: I had to write about this for a book and it was a challenge to explain it clearly. :)

Yes, that's a thing, but at this point imagine just how much more confusion it could create if they changed the name.

I went full circle, from being a critic of how they cloned Java back when my employer had the privilege of beta testing .NET as a Microsoft partner, to someone who enjoys delivering solutions on the .NET stack.

For me the sweet spot will be when .NET Native becomes more mature and I can get Delphi/Modula-3 back.

What do you mean by "getting Delphi/Modula-3 back" here? Are there implementations for .NET? Or is it that C# is kind of similar? Or something else?

C# was originally designed by Anders Hejlsberg, the C# 1.0 features that weren't taken from Java (J++) were based on Delphi, like Properties.

You also have RAD development on .NET.

Although .NET always had AOT compilation via NGEN, it always depends on the runtime being available at OS level.

With .NET Native it is like things were on the PC before the JVM took off, with strongly typed systems languages like Delphi and Modula-3 that compile directly to native code.

I know one needs a bit of imagination to make the comparison, but it is how I like to think about it.

FWIW, yes, there is "Delphi for .NET" (but it's actually much more than that - it can also target JVM and Cocoa):


I can't speak for the author, but my first thought was speed. Delphi was almost as easy to develop in as Visual Basic, but the result was fast: a native EXE, with no big framework dependencies.

Delphi binaries don't have any dependencies, right?

You can have it both ways, dynamic or static compilation.

If you're using COM or third-party DLLs, which is quite common on Windows, then you need those as well.

Have you tried writing a complex web app/API with C#? It's an absolute joy to use. Python still has it beat for math and ML libraries, but for getting up a quick web app with robust (and easy) data queries (LINQ) it's pretty unbeatable.
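For anyone who hasn't seen LINQ: here is a minimal, self-contained sketch of the query style being praised. The array and values are made up for illustration; with Entity Framework the same operators are translated to SQL instead of running in memory.

```csharp
using System;
using System.Linq;

class LinqDemo
{
    static void Main()
    {
        // Hypothetical in-memory data; with Entity Framework the same
        // query shape would run as SQL on the database server.
        int[] orderTotals = { 120, 45, 300, 80 };

        var largeOrders = orderTotals
            .Where(total => total > 100)        // filter
            .OrderByDescending(total => total)  // sort
            .ToArray();                         // execute

        Console.WriteLine(string.Join(",", largeOrders)); // prints "300,120"
    }
}
```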

Yes, LINQ is really good. With this news I am willing to go back and try it in some more projects.

It's fascinating how Microsoft's marketing moved from ".Net - One platform with multiple languages" to "This is .Net - here are some samples in C#" (without mentioning C# by name anywhere on the page).

I think that .Net would be much better if they had had this kind of focus on the dev side from the start.

*Hides from the f# mafia.

Can you make a mafia with only three people?

Funny, I saw far more than 3 F# presenters at NDC Oslo a few weeks ago..

If I'm an experienced Python developer who develops a lot of websites and APIs using Django or Flask and SQLAlchemy, is there any reason to try this stuff? C# is a great language, but what about the libraries? What replaces Flask? What replaces SQLAlchemy? How do I deploy? Looking for some practical reasons to invest time on this if Windows development is not in my roadmap.

Well, besides libraries, performance.

.NET supports proper multithreading (no GIL) and it is also much faster than CPython.

If IronPython gets ported to .NET Core (maybe it already has, I don't know), then you'd get those benefits for free.

For me, the GIL is a feature. Proper pthreading is for systems software as I see it. If you're rewriting Apache or IIS then you may need to bring in Rust, C(++).

I never found platforms like .Net and Java to be low level enough to write systems software, where you'd definitely want multithreading, and not high level enough to be as convenient as Python.

I write in Python and then use PyPy for more CPU intensive services. If I wanted multithreading and instances won't cut it, it would be a pretty hardcore usecase (for a lone wolf like me), I'd most likely also need no GC so I'd reach for something like Rust rather than C#.

I hate to say "no" to learning anything, but to be the devil's advocate this is how I've always seen it.

In the public repos, there is no substantial new work taking place on IronPython.

If there was enough interest you could get the Microsoft Python team to support IronPython. There's been talk of a renewed effort.

[1] https://blogs.msdn.microsoft.com/pythonengineering/

The performance of IronPython was horrible, and since Python is an ecosystem of libraries built in C and wrapped in a Python API, alternative implementations end up unusable: most Python libraries that matter are incompatible.

Flask - Nancy
SQLAlchemy - Entity Framework, NHibernate
Deployment - AppVeyor (cloud), TeamCity/Octopus Deploy (self-hosted)
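As a rough Flask-equivalent bundled with the platform itself, a minimal self-hosted ASP.NET Core 1.0 app looks like this (a sketch against the 1.0 APIs; the response text is illustrative, and the `Microsoft.AspNetCore.*` packages must be referenced in project.json):

```csharp
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Http;

public class Program
{
    public static void Main(string[] args)
    {
        // Self-hosted on the Kestrel web server, roughly comparable
        // to `app.run()` in Flask.
        var host = new WebHostBuilder()
            .UseKestrel()
            .Configure(app =>
                app.Run(context =>
                    context.Response.WriteAsync("Hello from ASP.NET Core")))
            .Build();

        host.Run();
    }
}
```

`dotnet run` then serves the app on localhost, the same way `python app.py` would for a Flask hello-world.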

Having said that, there is probably very little reason unless you're just doing it out of pure interest.

No. If you really want a high performance Windows friendly server software, try Java. But honestly I would not use .net for anything past excel plugins.

Documentation as a PDF (constantly updated):


It's really just a copy of their informational web site content with articles by different authors, but still very conveniently combined into a single PDF useful for printing etc.

Thank you!

I have been looking for something like this!

I might be missing something here, but does this mean that .NET Framework and .NET Core have diverged, and you need to take extra steps to keep code compatible with both?

The major differences between .NET Core and the .NET Framework: [...] APIs — .NET Core contains many of the same, but fewer, APIs as the .NET Framework, and with a different factoring (assembly names are different; type shape differs in key cases). These differences currently typically require changes to port source to .NET Core.

While I understand the motivation, this at first sight looks like something that will be with us for a long time, and could make life more difficult especially for library authors, who need to potentially target both 'platforms'.

[Disclaimer: haven't used .NET technologies for a very long time and might be horribly wrong here]

There are new platform targets that cover the common functionality. Support in NuGet is a bit nascent though.


I've mixed feelings about this release, because it seems a bit early to me. Some fundamentals will change soon (project.json replacement) and I am still not able to get debugging to work in VS Code on OSX. The whole situation in the .NET Core space was very confusing for the last months and this trend seems to continue. This is sad, because I'd love to code in C# on any platform and that things would just work as advertised on the website.

Edit: Debugging works now for me, after the .NET Core Debugger was automatically installed in the background!

It's still extremely buggy. Very difficult to get a .NET MVC app up and running (they're not using the term WebAPI anymore).

I also think they've made a terrible decision, the first-class support for DI is a bad direction which has made many simple concepts like config settings into absolute farces that take 5 lines of code.

It's code "purity" over usability. It's putting the core dev team's principles over their customers actual need, completely violating KISS, DRY and YAGNI.

DI really is our generation's factories. I'm already seeing projects written by people who don't understand it making utter nightmare spaghetti code, worse than any code I've ever seen.

It's nasty scaffolding code which is a symptom of limitations of the language, definitely not code anyone should ever actually be wasting time writing.

Instead of recognising it's a flash in the pan, the core team have embraced it and are trying to force it down everyone's throats, and a lot of programmers simply don't get it and are making an utter mess instead.

>and a lot of programmers simply don't get it and are making an utter mess instead.

Then maybe those programmers should go back to JS ES5?

I'm not sure that the tooling will necessarily need to be updated, as the rest seems to be NuGet packages, which should be easy enough to update as you build an application.

I'm more of an outsider now, as I've been doing far more node/js dev lately than .Net ...

Is there a primer somewhere that explains simply what the difference and dependence is between .NET (Core, Foundation), ASP.NET (Core, Foundation), Xamarin, Visual Studio (in all its different flavors, including VS.NET etc.) and Visual Studio Code?

I tried Wikipedia, and Microsoft's own homepages for each, and am even more confused.

I'm a hierarchical thinker, and a hierarchical tree explaining the above would be very very helpful.

Disclaimer: I haven't downloaded any of these yet. I usually don't until I've understood something.

Bonus question: If I want to develop websites and electron apps using: HTML, CSS, Javascript and PHP, what's the minimum set of technologies (or whatever it is that Microsoft is calling them these days) I will need?

Dotnet Foundation = NGO that manages various legal and org things for .NET. Think FSF for GNU.

.NET Core = cross platform software development platform. Includes a VM, compiler, tons of libraries.

ASP.NET Core = HTTP server + server side .NET libraries.

There is no ASP.NET Foundation, it's all part of the Dotnet Foundation.

Xamarin = .NET libraries for mobile development. Wraps Android/iOS native libraries.

Visual Studio = native Windows IDE for .NET and other languages.

Visual Studio Code = portable, Electron/HTML 5 IDE for Javascript/Typescript and other languages.

You probably want Visual Studio Code for HTML/PHP. But you should check out .NET Core, it can replace PHP.

By the way, if it works, it works, but I find it better to research and do at the same time. Downloading only after you've understood the org chart seems a bit too radical for me :)

Thanks. Very helpful. Last 2 Qs: is Framework different from Core? And how does Windows Presentation Foundation fit in all this?

.NET Framework is the old .NET Framework. It's basically the entire shebang, tied to Windows: VM, class libraries, etc.

.NET Core is as the name says, just the "core" of the .NET Framework, the part that's cross platform. However, I'm not sure that at this point the code's common, I believe at least a part of it has been reimplemented. I think that in the future .NET Framework will be based on .NET Core. Therefore .NET Framework will be .NET Core + Windows specific bits.

WPF is the fancy name for a .NET UI toolkit for Windows, basically. Think of it as a Windows-only GTK.

Thanks. That clarified everything. Sad that one can't find this simplified info on Wikipedia or on the Microsoft website.

Does Microsoft have some diagram or something that shows how all the .NET things fit together (on Linux)?

The C# development I've done was... fine, but that was seven years ago and it's like Java in the sense that it's an amazingly complex thing to grasp and keep up with. Understanding how everything is layered is pretty complicated at this point.

I could be wrong, but I think this is what you are looking for. http://www.hanselman.com/blog/AnUpdateOnASPNETCore10RC2.aspx

Thanks, that sort of helps. It's still a little confusing though. Branding and naming isn't something Microsoft is awesome at.

What's your current stack? Maybe we can compare and contrast the two stacks.

I don't see mention of FreeBSD anywhere in the .NET Core 1.0 announcements -- has official support for FreeBSD dropped or been postponed?

* FreeBSD is listed on CoreCLR GitHub repo (https://github.com/dotnet/coreclr). See current build status here: http://dotnet-ci.cloudapp.net/job/dotnet_coreclr/job/master/...

* Also, see doc here (last updated in feb though): https://github.com/dotnet/coreclr/blob/master/Documentation/...

* Lastly, FreeBSD arrived on Azure last week, so it would be quite surprising if support was dropped.

I understand, but none of those links are the .NET Core 1.0 release announcements.

And the download page for .NET Core 1.0 doesn't mention FreeBSD at https://www.microsoft.com/net/download#core

Interesting name choice considering they already had a .Net 1.0. I mean, I get it, I understand why they chose to make this more of a 1.0 release, but for those already in the .Net ecosystem it seems a little confusing to me.

It could have been more confusing by bumping versions. They first tried ASP.NET 5 and ASP.NET MVC 6 and EF 7, and there was no simple migration path from the older versions. Now they are all Core 1.0.

Excuse me if this was already obvious, but the word "core" designates a new product line, starting from 1.0. Obviously, it's informed heavily from the old .net framework, but it's not backwards compatible with it.

This is just a personal view, but my first instinct when I read about product "CoolApp" and "CoolApp Core" would be to assume core is a subset of the first. Not a new and re-engineered product line.

I could see why others might see .Net Core as a parallel product as opposed to a new one.

Sure, it's obvious. Like I said, I "get it"; it's just that "core" has always been a part of .Net, just less directly referred to, so my first thought was that this was .Net 1.0's core, until I started reading through the link.

Overall I think it's a good move just a little confusing (unless I'm the only one) for those who have done .Net work before.

I'd LOVE to see some Linux web/micro benchmarks... particularly against Java, Go, Node, Python.

They link to some benchmarks in the article: https://github.com/aspnet/benchmarks (they're a little hard to read on mobile, but I think the tl;dr is "plenty fast"). Or to quote the article: "Our lab runs show that ASP.NET Core is faster than some of our industry peers. We see throughput that is 8x better than Node.js and almost 3x better than Go, on the same hardware. We’re also not done! These improvements are from the changes that we were able to get into the 1.0 product."

I find the binary-tree benchmark especially disturbing; it looks like the .NET Core GC/escape analysis has some catching up to do.

The multi core difference?

Check the cpu secs and cpu load; check the source code.

Oh I should have checked that first. Thanks.

Scott Hunter of MS did an interview not so long ago re .NET, Kestrel (the new libuv-based HTTP server they built), and their march to build up as a top contender on TechEmpower.


They have a team who does it all for fun and has an impressive testing environment build-out. Fascinating to listen to given today's news. I listened a few weeks ago myself.

I have friends at work who liked C# and moved on to frontend. The fact that they are now so much less interested in .NET is amazing to me, as I finally want them to teach me to use it ... on Linux!? I never thought I would say that. Haha.

Even more details on .NET and Kestrel in a more recent episode just a few days ago: http://dotnetrocks.com/?show=1312

My bad. I think I meant that one! Their website without JS enabled is terrible these days, so I searched around with Google and that was from a FB post I scanned.

Yes, their new site is terrible, I much preferred the simple site they had before that loaded quickly. The new animations, modals and UI buttons are a big downgrade and I'm surprised they went with it considering some of their prior podcasts.

I should not be so judgemental. I noticed they interview what appear to be close friends of theirs in the Angular community - for example, John Papa[0], one of the popular Angular devs/evangelists, is a friend of theirs they interviewed.

More elaborate front-end was inevitable. That being said, I love their content and only really interact with their work through an Android podcatcher/audio client.

[0] https://github.com/johnpapa/angular-styleguide/blob/master/a...

The webstack is pretty fast, they're working on getting listed officially in the techempower rankings [1]. Here's an older article from Feb explaining more details. [2]

1. https://github.com/aspnet/benchmarks

2. http://web.ageofascent.com/asp-net-core-exeeds-1-15-million-...

Me too. Especially compared to raw servlets, not the normal 50 layers of inheritance, enterprise crap.

From the release notes:

"We used industry benchmarks for web platforms on Linux as part of the release, including the TechEmpower Benchmarks. We’ve been sharing our findings as demonstrated in our own labs, starting several months ago. We’re hoping to see official numbers from TechEmpower soon after our release.

Our lab runs show that ASP.NET Core is faster than some of our industry peers. We see throughput that is 8x better than Node.js and almost 3x better than Go, on the same hardware."

Good news! Now time for all the library and framework authors to add support for .NET Core. I know a lot of people were waiting for RTM before starting this, considering the massive changes between RC1 and RC2.

I've started a support matrix project at https://anclafs.com. Feel free to file an issue or send a PR on GitHub.

You're missing akka.net there

Thanks! I have a big list of things to add but I'll check this is on there and add it if they are planning (or have already added) support for Core. It started as things I had talked about in my recent book.

ASP.NET Core changed a lot between RC1 and RC2 (because it moved to the .NET SDK instead of DNX). Normal libraries or console apps, not so much.

It depends on if you want to run ASP.NET Core on .NET Framework 4.6 or on .NET Core 1.0. It's not just the SDK that is new, Core console apps are pretty recent too, as previously it was mainly for web apps.

I'm receiving

> “dotnet-dev-osx-x64.1.0.0-preview2-003121.pkg” can’t be opened because it is from an unidentified developer."

I know full well how to get around this, but it makes me suspicious of what I've downloaded. Would Microsoft have truly missed this?

It is annoying. The preview MSIs are signed but the preview PKGs are not. The last release's were, so maybe it just takes some time?

BTW you can ctrl-click/right-click the installer package and select open to get the option to run it without fully disabling gatekeeper.


Hey folks, sorry for the inconvenience around PKG signing. We're working on it right now and should have things updated soon.

Lee [.NET PM]

A giant leap forward by Microsoft!! The same feeling hit right there when I heard Microsoft had released SQL Server running on Linux...

They should have named it xNET :) Pronounce it as "cross-platform NET" - a .NET framework which works on all platforms.

Any ideas how to do GUI development with .NET Core? Is there a version of Xamarin? Are there plans to port Windows.Forms, WPF? There is this massive thread about it http://forums.dotnetfoundation.org/t/cross-platform-wpf/421/...

and an issue here https://github.com/dotnet/corefx/issues/5766

You can have a Web UI in a desktop app. Just run the app with an embedded Kestrel web server and build all the UI in HTML. And it will look pretty good on all platforms.

CatLight is one of the apps that do that - https://catlight.io

That's Electron, .NET Core (?) and Squirrel, all working together. Looks pretty cool, just pity the people on <20 megabit connections - these apps are getting pretty big.

Universal Windows Platform ".NET Native" stack is aligned with (built upon/merging towards) .NET Core. It's XAML-based like WPF and Silverlight.

ETA: other than that it's not cross-platform. The other option is Xamarin.Forms, and it would be useful to see more Xamarin.Forms targets.

Yeah, Xamarin.Forms targets UWP.

And it looks like there's a Forms Mac branch being somewhat worked on: https://github.com/xamarin/Xamarin.Forms/tree/mac

A bit unrelated, but I'm hoping to hear something soon about the future licensing model for the upcoming SQL Server on Linux. Do we get a free version without DB size / CPU / RAM limitations (but perhaps with some other restrictions) that smaller companies and startups can use in production?

I'm planning to run my future ASP.NET MVC projects on Linux and very much would like to know if a better SQL Server Express / Community Edition is coming or do I need to move to PostgreSQL (which I have already started looking into).

Microsoft already announced the SQL Server on Linux licensing:


"A customer who buys a SQL Server license (per-server or per-core) will be able to use it on Windows Server or Linux. This is just part of how we are enabling greater customer choice and meeting customers where they are."

Same editions, same prices.

Thanks for the information! Apparently I missed that in the news stream.

Sad to hear about that though. I was hoping for a break. Will just have to continue reading that PostgreSQL book I've already started.

Now that someone has downvoted me, I know it is the right choice and I must learn Postgres. Thanks for your support, folks.

PostgreSQL is the right choice, no matter what.

I'm starting to learn postgres as well, which book if you don't mind me asking?

I've started with the book "PostgreSQL Up & Running". It's rather lacking in detail and has a lot of typos, but I'm nevertheless enjoying it. It's written in a simple language and it provides a good landscape overview, which is exactly what you need in the beginning. Get the big picture first and then you can look for the necessary details as you go.


Why limit yourself to Express/Community when you can get a full-featured RDBMS with Postgres?

For a developer favoring .NET, the degree of .NET integration with SQL Server might be a compelling advantage for applications where the limitations of the relevant SQL Server tier weren't a problem.

Because I really like the power of SSDT (SQL Server Data Tools) and how nicely it integrates with Visual Studio and would like to continue using that instrument if possible.

But I can't seriously build a complex product around SQL Server knowing that 10GB is the sky for my database.

You can go ahead with PostgreSQL if you use Entity Framework and the Code First approach. I've used it many times and I was able to quickly jump from PostgreSQL to MS SQL and back. It just works. With EF I don't see any reason why I should tie myself to any particular SQL database.
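To illustrate the provider-swapping point: with Entity Framework Core, only the provider call changes when jumping between databases. This is a sketch; the `Order` model, `AppDbContext` name, and connection strings are hypothetical.

```csharp
using Microsoft.EntityFrameworkCore;

// Hypothetical Code First model.
public class Order
{
    public int Id { get; set; }
    public decimal Total { get; set; }
}

public class AppDbContext : DbContext
{
    public DbSet<Order> Orders { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder options)
    {
        // Swapping databases means changing a single line;
        // queries, LINQ and the model stay the same.
        options.UseNpgsql("Host=localhost;Database=app");
        // options.UseSqlServer("Server=.;Database=app;Trusted_Connection=True");
    }
}
```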

Personally, I prefer writing SQL by hand. That way I know exactly what will happen in a query. A database really is an important component of a system and shouldn't be treated like a dumb data store where you just throw in any stuff you like through an ORM and hope it sticks there somehow.

Lots of people do exactly that (treat it like a dumb data store) and are wildly successful. Do what you want, but don't say other people shouldn't do it a different way which has been proven to work.

False dichotomy, you can use both together and get the benefits of both an ORM and raw sql, each when they're the better choice.

Certainly. Yet there's been a tendency lately to declare SQL wrong and deprecated and to incite everyone to forget about it and go with an ORM as the only way. I do not accept that. The stronger the ORM zealots push, the more repulsive the idea of an ORM becomes to me. But of course it's just another instrument that has its place among the others.

Only among the inexperienced. Skilled people use both as needed, even at the same time. You can keep the complex SQL in a view and map the view with the ORM. ORMs are far too valuable to ever hand-write all that SQL, and SQL is far too flexible to ever ORM 100% of every use case; the ORM should make up most of the program, with hand-written SQL sprinkled in where useful when the ORM is clunky.

It's easy enough to do that, you don't HAVE to use an ORM framework. Though mapping sprocs to an Entity model is easy enough... I've had to do that several times where performance in EF was particularly bad.

I would be surprised if there wasn't a free (as in beer) Express version for Linux, with a DB size limit like 10GB. Otherwise Entity Framework Core supports PostgreSQL, but there isn't any lazy loading yet.

I don't believe it works outside of using EF. Last I checked no database drivers existed for .NET Core for MySQL and PostgreSQL. For that reason I've stuck to Django.

npgsql is the open-source default driver for PostgreSQL and works with .NET core (netstandard 1.3):


I believe you can use PostgreSQL with Dapper, as there is an ADO provider for PG, but I haven't tried this.
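For reference, a Dapper query over the Npgsql ADO.NET provider would look roughly like this (an untested sketch; the connection string and `users` table are hypothetical):

```csharp
using System;
using System.Linq;
using Dapper;   // lightweight object mapper over any ADO.NET connection
using Npgsql;   // ADO.NET provider for PostgreSQL

class DapperDemo
{
    static void Main()
    {
        using (var connection = new NpgsqlConnection(
            "Host=localhost;Database=app;Username=dev;Password=dev"))
        {
            // Hand-written, parameterized SQL; Dapper maps each row
            // onto the requested type (a plain string here).
            var names = connection
                .Query<string>("SELECT name FROM users WHERE active = @Active",
                               new { Active = true })
                .ToList();

            Console.WriteLine(string.Join(", ", names));
        }
    }
}
```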


What I meant was that's just not enough. You can toy with it in a hobby project but in production that will be exhausted pretty soon.

I think that probably depends pretty heavily on what your app is. One of my clients has about 200,000 user accounts (and 40K actives/week) in a database that's well under 4GB, and it's been live for three years now. It's in Postgres, but there's nothing that'd stop them from using a 10GB-capped SQL Server Express until a point where having to pay for SQL Server is a good-problem-to-have.

What's the current status of the CoreRT runtime [1], the one designed for AOT compilation? Can it be used for non-trivial programs yet?

[1]: https://github.com/dotnet/corert

Live stream of the release at Red Hat DevNation: https://channel9.msdn.com/Events/Linux/DevNation-2016

Starts at 0930 local.

I was really hoping for ARM support at least as far as RasPi goes for 1.0.

But still very good news. I personally love .Net; it's nice to be able to use it cross-platform without having to work specifically around Mono.

.NET Core does work for Windows IoT, on the Pi 2 and 3. So ARM compilation is already there on Windows.

From what I see Samsung has been contributing on that front. They said in the announcement that Samsung has joined the Dotnet Foundation.

Yeah, the only problem with Samsung is that they're most likely going to make builds for their Exynos-based SoCs, which aren't that widespread. The ARM ecosystem is quite fragmented, and depending on how you build your software it's not as portable as one would assume.

It's also not clear which instruction sets they are going to aim for. Exynos started with ARMv7 plus all the stuff that Samsung added to it, but the newer chips are ARMv8. The Raspberry Pi still uses ARMv6, 7 and 8 depending on the exact model, with the newer RasPi Compute and Zero models still running the old BCM2835 ARMv6 CPU.

So I'm hoping for more or less clean stock ARMv7 and ARMv8 builds of .NET Core coming out sooner rather than later. I also hope they'll release the x86 version on more platforms than Windows, since running it on something like an Intel Edison, which comes with a 32-bit Atom CPU, would be quite cool.

.NET is quite powerful, and for my taste it's considerably better than Node.js (I don't like JavaScript; this isn't some technical observation, just personal preference). But I do like C# and F# quite a bit, and having the ability to run the same code across multiple platforms makes me excited for IoT/embedded devices again, especially considering that I don't have to work through some of the headaches that come with Mono (some pun intended).

Well, I imagine that if someone offers support for those platforms they won't say no. But they'd probably also need help with things such as the CI chain.

What is the licensing like if you're a vendor and you want to ship software based on .NET Core on your own hardware appliances? Can you redistribute the runtime, or do you have to pay?

MIT and Apache, apparently.

The source is MIT and Apache, but that doesn't mean the binaries you get from MS are redistributable.

For anyone asking this question and not getting a sufficient answer, the safe route would be "build from source, then distribute that".

I don't understand, why does the package name for Linux say preview2?

    apt-get install dotnet-dev-1.0.0-preview2-003121
Isn't this the final release?

The preview refers to the tooling (CLI, VS etc.) [https://github.com/aspnet/home/wiki/roadmap#schedule]

Even if this isn't the final release, I'm confused why they would include version numbers in the package name at all. One of the reasons of having a package manager in the first place is to avoid things like this.

Do we officially have better-than-Java on all platforms now?

Yeah but so what. If the "best" language always won, why isn't the entire world running on Haskell and Rust or whatever? Java's eco-system is ginormous and it's deeply entrenched as the "corporate stack" in non-MS shops, with more than 15 years of history.

I'm afraid this is just too late to make a big dent. C# is nicer than Java, but not that much nicer to warrant switching over or rewriting your stuff in a very immature eco-system.

Maybe they're hoping to capture some of that elusive start-up market- and mind-share? That one is already pretty hostile towards MS...

Either way, this is a good thing, I just wish they had done this in 2005 or so.

I'm not entirely sure about that.. Node.js has made huge inroads in the 6-7 years it's been around, and Go is pretty popular in some circles as well. Also, look at how RoR grew. It's entirely possible we'll see a shift towards .Net away from Java as an option for lots of projects.

Especially as the tooling for micro-services and docker (or similar) gain traction moving forward. C# is pretty well supported, and at least here in Phoenix is about as common as Java is... can't speak for other metro areas.

I've been a pretty big node.js fan, and haven't used C# as much the last few years, but I wouldn't count it out. Two years ago, C# wouldn't have been on my radar for a new project, now it's entirely possible, given the opportunity for better cross platform deployments.

On top of that, Scala is a much nicer language than C#. I moved from being a C# developer to Scala and couldn't be happier. The best thing about C# is Visual Studio (IDEs are weaker in Scala land), but a bigger ecosystem and a more powerful language make up for it, IMO.

Could you sell me on Scala over C#?

Pattern matching, destructuring, and ADTs are amazing. Inference is much better, and not just for local variables.

For-comprehensions are the equivalent of LINQ query syntax, which is nice and very powerful when working with asynchronous calls.
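For anyone who hasn't seen the C# side of that comparison, here's a minimal sketch of LINQ query syntax; Scala's for-comprehensions desugar to flatMap/map in much the same way this desugars to SelectMany/Select:

```csharp
using System;
using System.Linq;

class LinqDemo
{
    static void Main()
    {
        var xs = new[] { 1, 2, 3 };
        var ys = new[] { 10, 20 };

        // Query syntax; the compiler rewrites this into
        // xs.SelectMany(x => ys, (x, y) => x + y)
        var sums = from x in xs
                   from y in ys
                   select x + y;

        Console.WriteLine(string.Join(",", sums)); // 11,21,12,22,13,23
    }
}
```

Because both are just sugar over the underlying monadic operators, either one works over any type that supplies those operators, which is what makes them useful for composing asynchronous calls.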

Higher-kinded types combined with implicit parameters are the two aces in the hole that make it an absolute, definitive win when heads up against C#. When I had to switch back to C# in my last job (some projects were Scala, some C#), this was what I missed the most. Combining the two gives you the "type class pattern", which is pretty much the first time I've actually seen composition and code sharing just work without the cruft of an "OO"-like framework.


Microsoft's previous stewardship would never have allowed what they're doing now, sadly. They must have lost years of progress with their previous strategy of locking things down. Still, better late than never; I love where they're going with .NET now.

Language and tooling wise? Yes, or very close to it.

There's a lot to be said about the huge amount of value in the main Maven repository vs. what's in NuGet.

Java's decade long head start in that area doesn't seem so huge when you look at what the node.js community has put together in only a few years.

If you were considering .net on a new project, however, I would probably recommend comparing something like Scala. Scala has the benefits of being a "modern" language like C# with the full support of all or nearly all of the existing JVM projects.

That being said, there are a growing number of reasons to choose .net on non-windows platforms.

I'd argue that it will be a while before NuGet catches up.

When I go on Maven Central I expect every library to be cross platform except for those marked as platform dependent.

For NuGet... all bets are off for now. It's the exact opposite: I assume that things are not portable, or at least have not been tested on, say, Linux.

> Java's decade long head start in that area doesn't seem so huge when you look at what the node.js community has put together in only a few years.

It's easier to follow than to blaze a path. We have learned from Java and can, hopefully, avoid those pains.

Looks like it.

A few years ago, I wouldn't have thought I would write this, but it looks like Microsoft is definitely changing, for the better!

Two questions:

- Does .NET Core offer something similar to Go goroutines or Erlang "lightweight" processes?

- And something similar to gofmt?

1. I can't say for sure. Maybe this? http://getakka.net/

2. There's https://msdn.microsoft.com/en-us/library/ff926074.aspx + your favorite IDE which can automatically reformat the code. Visual Studio can do this. If you want to enforce things, there's: https://stylecop.codeplex.com/

> Does .NET Core offer something similar to Go goroutines or Erlang "lightweight" processes?

Not quite the same thing, but F# has the MailboxProcessor<'T> type which is conceptually similar to Erlang processes and can be used in the same way.

I don't know of any green threads in .NET -- at least, not built in. However, .NET does have a built-in thread pooling / task execution service. You generally interface via `System.Threading.Tasks.Task`[0], which has static `Run` functions, or use the `Factory` property to get the underlying `TaskFactory` for more fun. Since the executors are pooled, the creation costs will be amortized over your program's lifetime.

Then there's `async`/`await`[1], which is built on top of tasks.
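A minimal sketch of both APIs together (nothing .NET-Core-specific here; `Task.Run` queues work on the thread pool, and `await` suspends the method without blocking a thread):

```csharp
using System;
using System.Threading.Tasks;

class TaskDemo
{
    // The compiler turns an async method into a state machine;
    // the thread is released back to the pool at each await point.
    static async Task<int> ComputeAsync()
    {
        int a = await Task.Run(() => 40); // runs on a pool thread
        int b = await Task.Run(() => 2);
        return a + b;
    }

    static void Main()
    {
        // Blocking on .Result is fine in a console Main;
        // in library code you'd await instead.
        Console.WriteLine(ComputeAsync().Result); // 42
    }
}
```

Since the pool threads are shared and reused, this amortizes thread-creation cost, but it is still cooperative multitasking over OS threads rather than true green threads.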

[0] https://msdn.microsoft.com/en-us/library/system.threading.ta...

[1] https://msdn.microsoft.com/en-us/library/mt674882.aspx

I remember that a long time ago, .NET threading was implemented in such a way that it should, in theory, support Win32 fibers. It's why all the low-level .NET APIs don't use the OS thread IDs, but instead have their own IDs, and require explicit mapping.

I believe that it was abandoned as a supported scenario a while ago, though. But you can still see artifacts of that in the docs, e.g.:


"The value of the ManagedThreadId property does not vary over time, even if unmanaged code that hosts the common language runtime implements the thread as a fiber."

Not out of the box, but this library does exactly that (with lots of functional niceness too). Disclaimer: I wrote it, so I'm obviously biased.


Just had a cursory look at the README. Is each lightweight process mapped to an OS thread? What is the minimal memory used by each lightweight process?

It wraps the F# MailboxProcessor mentioned in a comment above. A thread is only allocated while a message is being processed, and then freed up afterwards. So there's no permanent per-process thread overhead, and hundreds of thousands can be created.

The overhead in terms of memory is the internal state of each Process + the user's Process state. To see the internal overhead, check out the member variables for the Actor class [1].

It's pretty lightweight, although could probably shave a few bytes here and there.

[1] https://github.com/louthy/language-ext/blob/master/LanguageE...
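For a flavor of the API, here is a hypothetical sketch based on the project's README; `spawn` and `tell` are the names language-ext's `Process` module uses, but the exact signatures and namespaces may have changed, so treat this as illustrative rather than copy-pasteable:

```csharp
using System;
using LanguageExt;
using static LanguageExt.Process;

class EchoDemo
{
    static void Main()
    {
        // Spawn a lightweight process whose inbox handler prints messages.
        // No dedicated OS thread is held while the inbox is empty.
        ProcessId echo = spawn<string>("echo", msg => Console.WriteLine(msg));

        tell(echo, "hello, world"); // fire-and-forget message send
        Console.ReadLine();         // keep the host alive while messages drain
    }
}
```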
