
.NET Core 2.1 Roadmap - benaadams
https://blogs.msdn.microsoft.com/dotnet/2018/02/02/net-core-2-1-roadmap/
======
yokohummer7
> Build-time performance is much improved in .NET Core 2.1, particularly for
> incremental build. These improvements apply to both dotnet build on the
> commandline and to builds in Visual Studio.

It is good to see some improvements. Currently, `dotnet build` takes at least
3 seconds to compile a simple hello-world project on my system, _even when the
source code isn't changed at all_. At first I thought I had done something
wrong, but no, others were suffering from this too.[1][2]

This occurs because every `dotnet build` run tries to resolve the dependencies
and inspect the file structure to see whether anything has changed, and for
some reason those operations are dog slow. Visual Studio doesn't have this
problem: it knows the file structure and the dependencies beforehand, so it
can easily determine whether a recompilation is needed. I wanted to use VS
too, but it was too slow for my tiny laptop, so I'm stuck with VS Code.

I hope the improvements are big enough that I can reevaluate .NET Core. The
build time was a huge obstacle to my iteration cycle, so I had to use
TypeScript in my previous project. I sincerely want to code my backends in
C#.

[1]
[https://github.com/dotnet/cli/issues/5918](https://github.com/dotnet/cli/issues/5918)

[2]
[https://github.com/dotnet/cli/issues/7610](https://github.com/dotnet/cli/issues/7610)
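
As a rough way to see where that time goes, the implicit package restore that
`dotnet build` performs on every invocation can be split out from the build
itself (a sketch using the stock 2.x CLI flags):

```shell
# Restore once, and again only when dependencies change
dotnet restore

# Build without the implicit restore; what remains is the
# up-to-date check plus the actual compilation
dotnet build --no-restore
```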

------
polskibus
Still no production-ready ODBC in .NET Core? How is one supposed to work with
database engines in a standard way? Java is miles ahead by just sticking to
JDBC.

Microsoft first moved off ODBC towards OLE DB only to ditch it some years
later.

Working with data and being able to connect to a plethora of data sources is
essential to a platform's success. .NET Core won't be treated as a serious
cross-platform toolkit if it doesn't offer a reliable, interoperable
data-access interface.

~~~
da_chicken
.NET Core didn't have DataSet/Table/View until v2.0. I'm still boggled by the
idea that they think those are legacy
([https://github.com/dotnet/corefx/issues/1039#issuecomment-77...](https://github.com/dotnet/corefx/issues/1039#issuecomment-77476363))
and that everybody will just use Entity Framework for everything.

~~~
nwah1
Are you concerned about the overhead of EF? Porting old code? If you want db-
specific features, you could always call a stored proc.

EF code first is very pleasant to develop with, and when starting a new
project in .NET Core, I'd always prefer that.

~~~
da_chicken
I'm mainly involved in data integration. I don't have access to the data model
that the applications I integrate use. I don't have access to the application
source code; I have access to the data store. I don't need to extract an
object, manipulate it, and then put it back where I got it. I typically need
to take a set of _all_ objects, transform them (typically with VIEWs),
serialize them, and import them into another system that almost certainly
doesn't represent the underlying objects in the same manner. No matter what,
though, I usually don't get access to whatever ORM the application was
created with. Sometimes it's a matter of
creating CSV data files for scheduled import, but sometimes it's me doing the
actual direct manipulation of the data systems.

Some of them do have APIs, but since I already need to understand the _data
store_ to do reporting with all the third party reporting software we have,
learning the API is an additional task on top of already needing to learn the
data store. Come to think of it, I don't even know how you would even write
third party reporting software without something basic like a DataTable.
Furthermore, most systems are complex enough that whatever the developers
envisioned they should be used for, however the developers thought we would
configure our systems, and whatever they envisioned their customers would
require of them are almost certainly incorrect, inadequate, or both.

Finally, some of my processes do use stored procedures, but most of them use
table-valued parameters in order to control how the data are submitted. As far
as I'm aware, EF doesn't have _any_ support for TVPs.
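
For reference, TVPs are reachable without EF by dropping down to
System.Data.SqlClient directly. A minimal sketch, where the table type
`dbo.IdList`, the procedure `dbo.ImportIds`, and `connectionString` are
hypothetical names for illustration:

```csharp
using System.Data;
using System.Data.SqlClient;

// Assumes a server-side table type and procedure like:
//   CREATE TYPE dbo.IdList AS TABLE (Id int);
//   CREATE PROCEDURE dbo.ImportIds @ids dbo.IdList READONLY AS ...
var table = new DataTable();
table.Columns.Add("Id", typeof(int));
table.Rows.Add(1);
table.Rows.Add(2);

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.ImportIds", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    var p = cmd.Parameters.AddWithValue("@ids", table);
    p.SqlDbType = SqlDbType.Structured; // marks the parameter as a TVP
    p.TypeName = "dbo.IdList";          // must match the server-side type
    conn.Open();
    cmd.ExecuteNonQuery();
}
```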

Almost nothing I write is written in Visual Studio. Not that I can't, but it's
not particularly useful to do so. I don't have NuGet or Chocolatey installed
on any of the systems I work with. I do use SSIS when I can, but that's
usually out of BIDS or SSDT, not VS. Others use PowerShell scripts with
System.Data and whatever .NET provider I need for that DB.

~~~
nwah1
Looks like you have lots of options. And you could use the SqlClient library
directly.

[https://blogs.msdn.microsoft.com/dotnet/2016/11/09/net-core-data-access/](https://blogs.msdn.microsoft.com/dotnet/2016/11/09/net-core-data-access/)

Also, regular .NET Framework will be supported for a long time.
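
A minimal sketch of going through SqlClient directly (the connection string
and query are placeholders):

```csharp
using System;
using System.Data.SqlClient; // the System.Data.SqlClient package on .NET Core

using (var conn = new SqlConnection("Server=...;Database=...;Integrated Security=true"))
using (var cmd = new SqlCommand("SELECT Id, Name FROM dbo.Widgets", conn))
{
    conn.Open();
    using (var reader = cmd.ExecuteReader())
    {
        // Read rows forward-only, column by column
        while (reader.Read())
            Console.WriteLine($"{reader.GetInt32(0)}: {reader.GetString(1)}");
    }
}
```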

------
nwah1
Awesome. Lazy Loading for EF is probably the most requested feature there.

I'm wondering how Kestrel is doing, and whether it will finally be supported
as a freestanding web server rather than only behind a reverse proxy. Looks
like HTTP/2 support is delayed until at least 2.2.

~~~
dmarlow
Is lazy loading really that widely used? I used it once and it led to all
sorts of issues unless I managed everything myself. I much prefer to have it
off and rely on explicitly getting what I need.
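
For what it's worth, the explicit alternative in EF Core is eager loading with
`Include`; a sketch, assuming a hypothetical `Order`/`Items` model:

```csharp
using System.Linq;
using Microsoft.EntityFrameworkCore;

// The related Items are fetched in the same query, instead of one
// lazy query per navigation-property access later on.
var orders = context.Orders
    .Include(o => o.Items)
    .ToList();
```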

~~~
nwah1
I notice that for complicated apps you often want to turn it off, but
development is probably slightly faster on simple apps with lazy loading, and
it was the previous default (in EF6).

I would probably want eager loading to be the default, but lazy loading would
be useful for porting old code at a minimum.
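
As announced for 2.1, lazy loading in EF Core is opt-in via proxies, so the
eager default is preserved. A sketch of enabling it (the `ShopContext` name
and connection details are placeholders):

```csharp
using Microsoft.EntityFrameworkCore;

public class ShopContext : DbContext
{
    protected override void OnConfiguring(DbContextOptionsBuilder options)
        => options
            .UseLazyLoadingProxies() // requires the Microsoft.EntityFrameworkCore.Proxies
                                     // package; navigation properties must be virtual
            .UseSqlServer("...");    // connection string elided
}
```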

