Terms like .NET Core, .NET Platform, .NET Framework, and .NET SDK are mixed up on nearly every page; there are multiple versions of "Getting started" and "Quick start" guides, and massive navigation menus and options on every page, so you never know whether you're looking at the latest and greatest or some abandoned dark corner of a web property, as often happens with corporate sites.
Yo! If you want .NET to be a massive hit on non-Windows platforms, move it off Microsoft.com to a small site, and have a single version of "Getting started" (and only one guide). Don't ever mention things that exist but don't work on Linux.
Also, have a simple downloadable .tar.gz which expands into /bin + /lib + /examples. I loved C# back in my Windows days and I moved to Linux to escape Microsoft complexities and over-reliance on complex IDEs and tools, scattered like shrapnel all over my c:/
I will not run apt-get against your repo without knowing ahead of time what I'm getting and where it will all go, so let me play with the tarball first.
I gave a talk at the London .NET user group earlier this year on why it's all so hard to understand. Maybe it will help clarify some things, however it was before the 2.0 announcement: https://unop.uk/on-asp-net-core-and-moving-targets/
I'd love to use .NET, but the overhead of getting started isn't worth it when I can use any number of other truly free languages/platforms that are much easier to understand, and that are truly cross-platform because they have been for years.
CLR and .NET seem very awesome but so far have turned me off in a big way. Please fix <3
I've been putting food on the table as a .NET specialist for fifteen years. My impression is that .NET 4 is a band-aid of Windows-dependent implementations, while .NET Core is solid tech, using industry best practices for streams, collections, GC, and so on. How do you think they made it portable?
I started to get pissed off at MS around the time MVC 3 came out. That was not the direction I would have taken. Oh, the bloat. ASP.NET Core MVC is a dream: you start out with basically nothing and invent your own conventions.
I'm happy to not touch .NET 4 again. Love Core.
Though, maybe legacy knowledge isn't helping...
If you don't even want to spend that you can install Visual Studio and .NET Core stuff will just work out of the box, as is customary for VS.
Versioning and documentation are a mess, but neither you nor the grandparent seems to have actually made the minimal time investment necessary to even encounter those problems.
This is Microsoft's fault, not these guys, and it has pissed me off so many times over the years.
I always use the analogy that Microsoft builds these gigantic, beautiful mansions, but then to get to them you have to find the secret path that's covered in weeds.
A slightly different topic, but very much along the same theme:
What a mess! I want SSMS, do I even have it installed anymore? How do I clean this up without spending 6 hours (because I know something's going to go wrong during the uninstall)?
Perhaps you could make a useful comment next time, instead of one that plainly isn't?
However, FWIW, I put together a C# on Linux workshop for DevConf.cz earlier in the year that might be useful for some folks: https://github.com/martinwoodward/csharpworkshop - it also includes links to the docs for building from source etc. if you want to go really into the details.
Note that that still seems to require an existing binary (the bootstrapping problem). I wonder if the Debian CLR people have seen this.
The command line tool I was trying to write was to read my Safari reading list on my Mac, and construct an HTML page that contains the same information that I could put on my website, where I could then access it from my Surface Pro. Apple provides a way to export the reading list in XML, so that was my input.
I don't remember what it was now, but my first approach used some XML stuff that I got out of examples from some recent C# book, and it worked fine in Visual Studio Community Edition on my Windows gaming machine. In Core, though, on my Mac (and on my Surface Pro and Windows gaming machine) it failed to build. It was not finding something I was trying to include via "using".
I was not able to figure out if that thing is simply not part of Core, or if some build setting somewhere has to be changed to make it available.
(I eventually changed my approach and dealt with the XML through LINQ instead of at a lower level, so that I no longer needed whatever it was whose "using" was giving me trouble, and successfully got access to my Reading List from my Surface Pro).
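For what it's worth, the LINQ-to-XML route works on .NET Core out of the box. Here's a minimal sketch of that approach; the input below is a hypothetical, simplified export format of my own invention, not the actual Safari plist structure:

```csharp
using System;
using System.Xml.Linq;

class ReadingList
{
    static void Main()
    {
        // Hypothetical simplified export -- the real Safari reading list
        // is a nested Apple plist and needs more unpicking than this.
        string xml = @"<items>
  <item><title>Example</title><url>https://example.com/</url></item>
</items>";

        XDocument doc = XDocument.Parse(xml);
        foreach (XElement item in doc.Descendants("item"))
        {
            // The explicit string cast returns null if the element is missing.
            string title = (string)item.Element("title");
            string url = (string)item.Element("url");
            Console.WriteLine($"<a href=\"{url}\">{title}</a>");
        }
    }
}
```

XDocument lives in System.Xml.Linq, which ships with .NET Core, so there's no extra package or build setting involved for this path.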
Especially since some components embedded different versions of other components.
I don't know if the situation has improved since I rotated off, but a bog-ordinary semver scheme would've saved a world of pain.
Failing that, a single page with components (SDK? Runtime?) and available version numbers that gets updated.
Because we mostly wound up working out what was what from forensic readings of scattered blog posts, GitHub releases, and, I think, comments on GitHub issues.
In any case, I am sure the team would be glad to give you feedback on their experiences since then -- my email is in my profile if you want me to pass anything along.
Worked like a charm: had the site up and running in 20 minutes, which is kind of "production" ready, with supervisor and a reverse proxy. Or just use Docker? You're also overthinking the terms a bit; you don't need to know them by heart.
In contrast to those two, I've found Android's pretty good in the 3 years I've been referencing them.
But I went "ubuntu" route and bailed, looking for a simple tarball.
If you don't like it, you can install the same Debian tarball on Ubuntu.
Interpretations are fun when there's a baseline :-)
This is a great, succinct, non-ideological explanation for why open-source projects where anyone can contribute tend to be better. For a given component/function, there might be only a single person in the entire world who needs that optimized badly enough to actually do it themselves, but once they do, everyone benefits. A closed-source team has to prioritize their development efforts, which means niche improvements will probably never make it in. Multiply this by a thousand different niches, and the product is going to be slower.
This is such a good point.
We (TechEmpower) had this in mind when we created our framework benchmarks project a few years back. Performance improvements in platforms and frameworks have the potential of very broad impact. With our project, we wanted to provide some inspiration for doing that kind of performance tuning. We had found ourselves in many conversations about how many real-world CRUD web applications take multiple seconds to render a page with a form. We realized that if, just as an example, the JSON serializer or template engine were substantially faster, many real-world applications that use those components would see notable improvements to their user experience.
This is a great community activity. Clearly, the community is more than capable of performance enhancements, based on the improvements they have made in the product.
If people start improving the C# benchmarks, please file an issue on dotnet/core to get feedback and some cred. We may do another blog post on that if there is some gravity around the activity.
From a very quick look, something like http://benchmarksgame.alioth.debian.org/u64q/program.php?tes... looks like a direct port from a C program, rather than idiomatic C#.
> Please don't implement your own custom "arena" or "memory pool" or "free list" - they will not be accepted.
> We ask that contributed programs not only give the correct result, but also use the same algorithm to calculate that result.
So there might not be too much room to improve. There could be some room for things like "custom ... memory pool", since .NET Core has ArrayPool<T> built in. But I can't tell whether the spirit of that rule is "don't implement pooling" or "you can only allocate memory in the standard ways provided by the runtime."
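For reference, the built-in pool is trivial to use. A minimal sketch of ArrayPool<T>.Shared (the buffer size and the "work" are made up for illustration):

```csharp
using System;
using System.Buffers;

class PoolDemo
{
    static void Main()
    {
        // Rent returns an array of *at least* the requested length;
        // it may be larger, so track the logical length separately.
        byte[] buffer = ArrayPool<byte>.Shared.Rent(1024);
        try
        {
            // ... fill and use the first 1024 bytes of buffer ...
            Console.WriteLine($"Rented {buffer.Length} bytes");
        }
        finally
        {
            // Returning the array lets the next Rent reuse it
            // instead of allocating a fresh one.
            ArrayPool<byte>.Shared.Return(buffer);
        }
    }
}
```

Whether the benchmarks-game maintainers would count that as "standard allocation" or as a disallowed custom pool is exactly the ambiguity above.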
However in CoreCLR there are multiple efforts to reduce memory consumption, mostly driven by the ports to other architectures (e.g. ARM, ARM32), see https://github.com/dotnet/coreclr/search?q=memory+consumptio...
“We expect that many of these improvements will be brought to the .NET Framework over the next few releases, too”
We still need to ensure that all the changes are behaviorally compatible. Other than that, we intend to improve the .NET Framework with these same performance investments as well.
From what I can tell, from following the CoreCLR repo (https://github.com/dotnet/coreclr), it's changed over time.
In the beginning the sources of the two were more closely tied, and changes were (automatically?) copied over. Now it seems more ad hoc, done as and when they decide it's needed. See the mentions of 'TFS Mirror' in this thread for a bit more info: https://github.com/dotnet/coreclr/issues/972#issuecomment-25...
Just look at how much .NET Framework (Desktop) code they removed from CoreCLR earlier this year!
Perhaps I'm missing something but wouldn't removing Windows specific code from a now cross platform runtime be the obvious thing to do?
We still do mirror some code (mainly the JIT) into TFS to make it easier to share code with the Desktop in an automated fashion. However, for the rest of the code (e.g. the VM and BCL), if there are improvements we want to bring back, an engineer will just port them manually.
You might have confused it with ASP.NET Core, the web development framework, which is a full rewrite.
Although, if I understood this correctly, these performance improvements only take effect if you compile with .NET Core 2.0 and run on the .NET Core 2.0 runtime?
I did not realize .NET Core had diverged this much from .NET Framework.
Yeah me too.
On the flip side I consider it a healthy sign for the project as a whole and especially as an open-source undertaking.
As such I've stuck with .NET 4.5 for now. On the positive side Mono seems to have got a lot better and I have a bunch of stuff running on Linux with surprisingly few problems "out of the box".
for (int i = 0; i < 10_000_000; i++)
s_result = s.Min;
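For anyone who wants to paste that loop into a console app, a minimal Stopwatch harness around it might look like this; the set's contents and size here are my own guesses, not the post's actual setup:

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;

class MinBench
{
    static int s_result; // static sink so the JIT can't drop the loop body

    static void Main()
    {
        var s = new SortedSet<int>(Enumerable.Range(0, 1_000));

        var sw = Stopwatch.StartNew();
        for (int i = 0; i < 10_000_000; i++)
            s_result = s.Min;
        sw.Stop();

        Console.WriteLine($"Min x 10M: {sw.Elapsed} (result {s_result})");
    }
}
```

It's crude compared to a real harness (no warmup, no statistics), which is exactly the caveat the post makes next.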
> Further, normally such testing is best done with a tool like BenchmarkDotNet; I’ve not done so for this post simply to make it easy for you to copy-and-paste the samples out into a console app and try them.
We love BenchmarkDotNet and use it (and other perf tools) quite a lot internally.
Mobile: Objective-C, Swift, Android Java, Xamarin (.NET), PhoneGap.
Web: ASP.NET MVC, PHP, Java.
Using ASP.NET MVC requires using third-party UI libraries.
It depends on the size of the team and its experience in .NET; we usually assign .NET developers with at least 4 years of experience in .NET and front-end, currently MVVM JS libraries.
If you're starting out in .NET you have a learning curve, but it's shrinking as the technologies improve.
Microsoft keeps getting better. Open sourcing so many things.
C# and Visual Studio are a breath of fresh air compared to the bloated, aging, and vexing obfuscation called Java (I work in both).
Since 1991 Microsoft Research keeps making contributions.
Now that's a gross oversight.
> In other cases, operations have been made faster by changing the algorithmic complexity of an operation. It’s often best when writing software to first write a simple implementation, one that’s easily maintained and easily proven correct. However, such implementations often don’t exhibit the best possible performance, and it’s not until a specific scenario comes along that drives a need to improve performance does that happen. For example, SortedSet<T>‘s ctor was originally written in a relatively simple way that didn’t scale well due to (I assume accidentally) employing an O(N^2) algorithm for handling duplicates.
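As a hypothetical repro of that pathological case (my own construction, not code from the post), feed the constructor heavily duplicated input:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class DupCtor
{
    static void Main()
    {
        // 100k copies of the same value: the old ctor reportedly degraded
        // to O(N^2) on inputs like this while de-duplicating.
        List<int> items = Enumerable.Repeat(42, 100_000).ToList();

        var set = new SortedSet<int>(items);

        // Duplicates collapse to a single element either way;
        // only the construction cost differed between versions.
        Console.WriteLine(set.Count); // prints 1
    }
}
```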
You'll never get anything done if you want to get it perfect the first time round. Or as they say, first make it work, then make it right, then make it fast.
I'd like to play a bit with it but it should be as easy as pushing to a repo to deploy and run.
What would be the easiest way to run on macOS?
Microsoft touting performance improvements the day after Apple amps up performance on pretty much every aspect of the Apple developer infrastructure (Xcode, Swift, APIs, processor and GPU utilization, etc.).
Must be a coincidence...
I have no way of knowing for sure, but I doubt a blog post that long, published on an official MS blog (i.e. probably requiring sign-off), could be written that quickly!