
CUDA on Windows Subsystem for Linux 2 - T-A
https://devblogs.nvidia.com/announcing-cuda-on-windows-subsystem-for-linux-2/
======
anaisbetts
Having set this up, this is going to be absolutely huge for ML. A lot of
nonsense is getting cut down by this:

1. You don't have to install special CUDA-specific drivers that are behind
the normal gaming drivers anymore. You'll soon be able to just use the regular
Nvidia drivers (and even now, all you need to do is install a beta driver
version). That's _huge_ for someone just starting out; they don't have to have
"the dedicated CUDA machine / OS".

2. At least in Arch, the Linux side of this is literally as simple as `yay
-Sy python-tensorflow-cuda`, _that's it_. Can't get any easier.

3. VS Code's WSL integration means you can use a real editor, with real
linting / auto-complete, but without the technical issues of Desktop Linux
(aka the Wayland Nightmare and DPI issues everywhere).
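If you want to sanity-check the plumbing before pulling in a whole framework, here's a rough sketch. The /dev/dxg device node comes from the dxgkrnl driver that the WSL2 GPU work is built on; treat the exact checks as assumptions, not gospel:

```python
import os
import shutil

def wsl_gpu_signals():
    """Collect quick signals that GPU paravirtualization is wired up in WSL2.

    /dev/dxg is the device node the dxgkrnl driver exposes inside the guest;
    nvidia-smi being on PATH suggests the user-mode driver bits are mounted.
    """
    return {
        "dxg_device": os.path.exists("/dev/dxg"),
        "nvidia_smi": shutil.which("nvidia-smi") is not None,
    }

if __name__ == "__main__":
    for name, present in wsl_gpu_signals().items():
        print(f"{name}: {'found' if present else 'missing'}")
```

If either comes back missing, it's worth checking the Windows-side driver before blaming the Linux install.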

~~~
pantalaimon
> without the technical issues of Desktop Linux (aka the Wayland Nightmare)

Nobody is forced to use Wayland. It’s still quite experimental, I think only
Gnome enables it by default.

~~~
cbhl
Xming on Windows isn't exactly a walk in the park either -- I'd still rather
use the native, proprietary build of VSCode on a proprietary OS with the SSH
extension if I'm on a high-DPI screen.

~~~
deno
VSCode works just fine with hidpi on Linux. It’s just an Electron app after
all, and Chromium has had hidpi support since forever.

~~~
anaisbetts
Many many many apps on Linux still have massive issues with DPI, especially
with mixed-DPI environments (which are no longer an edge case, they're the
Common Case with a laptop attached to a monitor).

Even accessing machines remotely via Xrdp has huge issues, because once you
create the session with a certain DPI, logging into the session from a
machine with a different DPI means you're stuck reading either extremely tiny
or extremely huge text.

~~~
diffeomorphism
They are totally an edge case. If you care about hidpi why do you still have a
bad external screen? If you don't care about hidpi simply set a lower
resolution on the one hidpi screen.

~~~
pjmlp
Because that has been a standard office configuration for the last 15 years:
laptops with docking stations and external monitors.

More than enough time to catch up.

~~~
diffeomorphism
Yes, standard configuration where the laptop and the external monitor have
about the same ppi.

------
withinrafael
Throwing out a random data point: Until now, NVIDIA has been very anti-
virtualization (on the consumer side), going as far as to engineer its drivers
to detect in-use virtualization (Hyper-V, Xen, QEMU, etc.) and fail on purpose
[1]. I'm curious to see how they now handle this scenario (given WSLv2 runs in
a virtual machine). Perhaps they just commented those checks out in their
'specialized' drivers, an interesting development for enterprising individuals
looking to enable consumer GPU pass-through for general purpose virtual
machines and containers.

[1] [https://github.com/riverar/Remove-HypervisorChecks](https://github.com/riverar/Remove-HypervisorChecks)

~~~
deno
It’s not GPU pass-through, it’s paravirtualization.

~~~
0xFFC
Can you explain about paravirtualization?

~~~
AaronFriel
Yes, instead of using a virtual GPU driver that simulates the behavior of a
real device, a paravirtualized driver is a shim that connects a device in the
virtualized operating system to a real device on the host.

In summary:

* Full virtualization is a complete, in-software implementation of a device. Early virtualization technology was typically of this nature.

* Paravirtualization typically requires cooperation between the host and the guest, with a special communication layer (in WSL2's case, provided by Hyper-V) between a guest device driver and a host driver.

There are at least two more methods of passing a host device through to a VM.

* "GPU passthrough", "PCIe passthrough", or "VFIO passthrough", depending on the source; Microsoft buckets all of these together and calls it direct device assignment, or DDA. In this mode, the guest OS is given exclusive access to a device or a device hierarchy (defined by the layout of the motherboard itself). This uses the MMU and IOMMU of the host to allow a VM to run a native driver, e.g. Nvidia's CUDA driver, and it will see a real physical device. (Nvidia's driver has historically blocked this by detecting that other parts of the guest OS are virtualized: from the driver's perspective the _device_ is a real, authentic Nvidia device, but the rest of the OS devices are virtualized, and there are ways to detect that.)

* SR-IOV ([https://en.wikipedia.org/wiki/Single-root_input/output_virtu...](https://en.wikipedia.org/wiki/Single-root_input/output_virtualization)) is a PCI Express-native method of splitting a device into virtual functions which can be mapped into a guest. I think the first real use for this was network adapters, which allowed VMs to get 10-40 Gbps network adapters working at native speeds by passing through virtual functions so that hardware offloading worked. Nvidia supports this on some of their server platforms, with GPUs offering up to 7 or 8 "virtual functions", which allows a single GPU to be partitioned and assigned to separate VMs. Once split up in this fashion, though, I think it can be tricky to present the full device as a unified GPU.
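To make the "ways to detect that" part concrete: the simplest such signal is the CPUID hypervisor-present bit (leaf 1, ECX bit 31), which Linux surfaces as the `hypervisor` flag in /proc/cpuinfo. A rough sketch (function name is mine):

```python
def hypervisor_flag_present(cpuinfo_path="/proc/cpuinfo"):
    """Return True if the kernel reports the CPUID 'hypervisor' bit.

    Hypervisors set CPUID leaf 1 ECX bit 31, which Linux exposes as the
    'hypervisor' entry in the flags line of /proc/cpuinfo -- one of the
    cheap checks a driver can use to tell it is running inside a VM.
    """
    try:
        with open(cpuinfo_path) as f:
            for line in f:
                if line.startswith("flags"):
                    return "hypervisor" in line.split()
    except OSError:
        pass
    return False
```

A driver determined to refuse virtualized environments has subtler checks than this, but the flag is the first thing most detection code looks at.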

~~~
0xFFC
Thank you so much for the thorough explanation.

------
jacquesm
Pity. CUDA was/is proof positive that Linux on the desktop is perfectly
feasible and that you can use it to both do a UI _and_ do meaningful
computation on the same machine without getting tied down into all kinds of
licensing schemes. Opening this further up to Windows gives fewer people a
really good reason to try out Linux as their daily driver. I personally don't
get why any developer would prefer Windows over Linux with its near infinite
software repositories related to all things developers would like.

~~~
patrec
> I personally don't get why any developer would prefer Windows over Linux

Because windows now literally has all the stuff Linux has plus a lot more.

The only reason for a well paid developer to prefer linux over windows for a
desktop OS is that you don't need any of the extra stuff (a decent desktop
experience, hardware support and a commercial end user software eco system) or
that you have some fundamental objections concerning Microsoft's business
practices. These are good reasons, but experience shows that most people don't
care enough about things like privacy to give up convenience or shiny things.

Apple may be in for a tough awakening. They profited enormously from providing
the desktop OS that a very large fraction of the technical elite has been
using for a long time. MacOS is still smoother than windows in some
regards, but the writing is on the wall: linux completely dominates the server
but is as hopeless as ever as a general purpose desktop OS, but windows no
longer sucks for software development. It used to be that macs had the unique
selling point of offering a nice desktop experience on good hardware coupled
with something close enough to linux that you could use it for a very wide
range of software development tasks. But new hardware is developer hostile and
macOS drifts further and further away from linux (ever more lock-down that
directly breaks developer workflows and tooling, and an ever more diverging
userland with a mix of ancient GNU and irrelevant BSD tooling). By contrast,
windows now includes an ultra-high-fidelity linux layer with excellent tooling
and integration to "core" windows and Microsoft also owns the world's most
popular editor and code hosting solution.

I'm amazed how well Microsoft has managed to find its way back to its
embrace, extend, extinguish roots. It is both a technical and a strategic
marvel.

~~~
dman
I hope they stop at extend this time.

~~~
smichel17
I think "extend" is a good safety buffer / canary in the coal mine, so I'd be
happier if they stuck to "embrace".

~~~
yarrel
Spoiler: they don't.

------
mattip
I have the opposite setup: Linux is my primary os and I have windows on a
separate ssd that I can either dual boot into or use in qemu from Linux. It
gets used once a month or so. I don’t game, so maybe I am the wrong
demographic, but I don’t miss anything from windows. All my document work
(even right-to-left languages) works well on Linux, default Ubuntu gnome. I
use two monitors, with different resolution. What am I missing? I spend my
time in vim/cli, slack, github in Firefox, and email on thunderbird.

~~~
jentist_retol
Last time I tried it, I installed Ubuntu 18.04LTS, updated, installed the
recommended graphics driver, and then the machine would not boot after that.

I recently tried 20.04LTS on a laptop, with an external GPU over TB3 - it
works! However, I really struggled to reconcile the display density of my
external 4k monitor with the laptop's built-in 1080p. I had other show-stopper
bugs besides bad scaling, however. :(

What you're missing is the ability to open a game for a quick session while
keeping your work open in the background.

What you're missing is that desktop linux is famously unpolished, and worse,
brittle. There's a list of known(!) issues with desktop linux that some guy
updates - currently it has over 100 entries.

What you're missing is your system continuing to work after major updates
(Windows 10's relatively recent and rare "big oops" bugs aside lol).
Personally, I use w10 pro with deferred feature updates to mitigate this,
which is kinda messed up but there you go. I simply can't afford an update to
knock my desktop out, which has happened to me a few times with desktop linux.

What you're missing is that "works for me" works for you, but not for me.

In case you're wondering why I keep trying desktop linux, it's because what
I'm missing is an open, free, libre, privacy-respecting desktop. These are
values I take seriously enough that I pay yearly estimated license fees to the
open source components I use in the form of donations.

~~~
read_if_gay_
> What you're missing is the ability to open a game for a quick session while
> keeping your work open in the background.

Absolutely possible with Wine or QEMU.

> What you're missing is that desktop linux is famously unpolished, and worse,
> brittle

Correct, but arguably also true for Windows. There’s no Linux distro that has
had two settings panels for years. IMO the only polished OS is macOS.

~~~
fomine3
IMO the Linux desktop needs more than one settings panel: I always end up
editing things with gconf-editor, or editing files under /etc/ and ~/.local/.

------
blastonico
How long to start seeing advertisement like "Linux, best viewed on Windows".

~~~
benbristow
Does kind of feel that way. Having a full Windows Desktop with good graphics
drivers and software like Microsoft Office, Photoshop, commercial games etc.
and Linux for the programming/server-software side of things.

I have literally no reason to dual-boot with Ubuntu anymore. WSL2 and Visual
Studio Code fulfils all my Linux coding requirements.

Maybe (probably?) a bad thing but it's damn convenient.

~~~
anonymousDan
Yeah I just started using Windows 10/WSL yesterday and I have to say I'm
impressed. Are there any downsides/painpoints I should be aware of?

~~~
FridgeSeal
I/O performance is pretty horrific.

~~~
bransonf
WSL 2 significantly increased I/O performance if you were wondering.

~~~
ubercow13
Across the VM/host barrier?

~~~
bransonf
To some extent, yes. However, you will get the best performance (and
improvement over WSL1) when working with files contained within the Linux file
system.
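As a rough way to see the difference yourself, you can time small-file writes in a directory on the ext4 side versus one under /mnt/c (the 9P-backed mount of the Windows drive). Sketch only; the function name and defaults are made up:

```python
import os
import tempfile
import time

def time_small_writes(directory, count=200, size=4096):
    """Time writing `count` small files into `directory`; returns seconds.

    Small-file metadata operations are where the 9P bridge to the Windows
    filesystem (/mnt/c) is slowest compared to the ext4 image that backs
    the Linux filesystem in WSL2.
    """
    payload = b"x" * size
    start = time.perf_counter()
    for i in range(count):
        with open(os.path.join(directory, f"f{i}.tmp"), "wb") as f:
            f.write(payload)
    elapsed = time.perf_counter() - start
    for i in range(count):  # clean up after timing
        os.remove(os.path.join(directory, f"f{i}.tmp"))
    return elapsed

if __name__ == "__main__":
    # Compare e.g. a directory under $HOME against one under /mnt/c.
    with tempfile.TemporaryDirectory() as d:
        print(f"{time_small_writes(d):.3f}s for 200 x 4 KiB files")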

------
yahyaheee
This is particularly impactful since macOS won’t support CUDA drivers anymore.

~~~
mikkelam
Yep, as a long-time hackintosher I'm definitely swapping now. Apple doesn't
want GPU compute users, so it's not really worth fighting anymore.

~~~
yahyaheee
It's such a ridiculous position they are taking

------
FiReaNG3L
This is BIG for machine learning adoption! 2 years ago it was a nightmare to
setup everything on a Windows box, and wasn't working in any VMs (WSL or
VirtualBox). Very excited about this.

~~~
black_puppydog
This feels like it's gonna be _terrible_ for people already doing ML.

Now, the following has more than a bit of elitism, I'm aware of that, so
please don't comment to point that bit out...

I'm afraid that searching for answers to specific ML questions will start
feeling like trying to google some Windows problems (every few years when I
make the mistake of trying to help someone out) where it's all "download this,
then click here..."

I hope I'm wrong and this will only be used by people willing to go the extra
mile...

~~~
cma
Hope we don't see things like: "Google announces today that to work around
Windows' file path length limitations, Tensorflow's directory structure is
being rewritten to use three characters or less per directory."

~~~
washadjeffmad
I recently helped a friend with their museum's collection management software.
The server and client were Windows based, and even with the proper
dependencies and provisioning, the installs kept failing.

Some of the paths were too long and the installer could only be executed after
being unpacked to root.

It's been a while since I more than casually gutted and used Windows as an app
container, but I hadn't realized how much has remained the same after more
than two decades.

~~~
int_19h
Windows itself has been able to handle long paths for a long time now. The problem is
the apps - they need to use newer APIs that don't e.g. deal with structs with
wchar_t[MAX_PATH] fields in them. For apps written in higher-level languages,
it's usually the standard library that needs to be updated - e.g. Node.js and
Python already have such support. But if it's C++, then it depends on how much
the app developer cares.
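To illustrate, here's roughly what "the standard library handles it" looks like from Python: building a path past the classic 260-character limit and writing through it. On Windows this still assumes long-path support is enabled; on Linux it just works.

```python
import os
import tempfile

MAX_PATH = 260  # the classic limit baked into wchar_t[MAX_PATH] buffers

def make_deep_path(root, component="longdir"):
    """Create nested directories until the joined path exceeds MAX_PATH."""
    path = root
    while len(path) <= MAX_PATH:
        path = os.path.join(path, component)
    os.makedirs(path, exist_ok=True)
    return path

if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as root:
        deep = make_deep_path(root)
        target = os.path.join(deep, "hello.txt")
        with open(target, "w") as f:
            f.write("long paths ok")
        print(len(target), "characters and no complaints")
```

An app still calling the old fixed-buffer APIs would choke on exactly the same path.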

------
AndrewKemendo
This is pretty wild. The WSL2 upgrade really made the terminal way faster than
the original WSL, so I'm curious how this runs. Looking forward to trying it!

~~~
cchance
I'd imagine pretty close to what it is natively.

------
vaidhy
My biggest concern for not trying it out right now is that the insider preview
for windows is asking for all kinds of data permissions :(

~~~
zantana
Indeed. I'm not sure why WSL has consistently required some bleeding-edge
version of Windows to run. I remember trying to get it on a managed desktop at
work a couple of years ago, and apparently the LTS version of Windows used in
the enterprise is even farther behind.

------
syntaxing
Ironically, I'm super excited about this because I run a Mac. I didn't feel
like changing anything in the bootloader using ReIT so I use an old laptop for
Linux stuff. Mac + Bootcamp Windows + WSL2 + eGPU sounds like my dream
machine.

------
cbhl
Previous discussion from when Microsoft sent patches to Linux upstream to
enable this:

[https://news.ycombinator.com/item?id=23241040](https://news.ycombinator.com/item?id=23241040)

And related discussion on lkml with more context:

[https://lkml.org/lkml/2020/5/19/1139](https://lkml.org/lkml/2020/5/19/1139)

~~~
mook
As usual, LWN has a great article on it that is much more approachable than
raw LKML:

[https://lwn.net/Articles/821817/](https://lwn.net/Articles/821817/)

------
codezero
I'm getting the feeling that this is part of a longer term partnership where
Microsoft acquires Nvidia, but for a weird reason: Azure cloud.

------
RcouF1uZ4gsC
This is finally the year of the Linux desktop. However, it is running on
Windows as WSL.

With support for CUDA now, it can be argued that the best, most
versatile developer experience is on Windows. You now have access to all the
Windows specific tools (such as Visual Studio) as well as all the Linux tools
in a very seamless environment.

~~~
shmerl
Next step, Nvidia will support only WSL and not normal Linux? Good riddance,
we need less blobs.

~~~
RcouF1uZ4gsC
Actually, this might be a great price discriminator for NVidia. They have
tried to differentiate their consumer and professional GPUs so they can charge
different prices. I could see NVidia only providing WSL/Windows drivers for
their consumer GPUs. If you want native Linux, then you would have to buy a
professional GPU at higher cost.

By great, I mean great for NVidia, not for us.

~~~
shmerl
I'd say it's going to be great for Linux, since usage of Nvidia will drop and
progress will accelerate, because no one will have to deal with blob
idiosyncrasies caused by Nvidia refusing to upstream their driver.

With Intel joining AMD on high end GPU scene with open drivers, it's Nvidia
who will be the loser with their dinosaur blob approach.

~~~
xiphias2
As games are running both on AMD and NVIDIA GPUs, but CUDA is NVIDIA only (and
supported by lots of languages and libraries), as a programmer I don't see how
AMD could disrupt NVIDIA's developer friendliness/lock in (unless AMD provides
a great CUDA implementation).

~~~
shmerl
There is a need to move away from CUDA to begin with. It's an ugly solution
since it's tied to Nvidia.

------
ilyas121
Can someone explain the WSL acronym to me? Isn't WSL really a Linux subsystem
for Windows? Tiny pet peeve.

~~~
stephen_g
Yeah, it's very confusing. I've heard it explained but I still think it
doesn't really make sense. The core of the argument that I've heard is that
Windows has multiple subsystems so they say it's a "Windows Subsystem", but
the 'for' in "For Linux" still makes no sense.

The arguments for not calling it something like "Linux Subsystem [for/on/in]
Windows" don't seem to match up with the fact that there used to be a similar
thing which was called "Microsoft POSIX Subsystem" also.

~~~
mattkrause
Windows Subsystem for (running) Linux (programs).

------
IronWolve
Too bad you need WSL2; I hate running Hyper-V and the apps that don't play
nice with it.

~~~
jstarks
Which apps are broken for you? We've been working with VMware and Google to
ensure their apps work well with Hyper-V, but I know there are still some gaps.

~~~
psykotic
The new release of VMware Workstation now works with Hyper-V as its hypervisor
but a bunch of people (and I'm one of them) have experienced massive,
unacceptable performance regressions for VM guests. I had to disable Hyper-V
and hence stop using WSL2 until it's resolved. If the performance issues are
intrinsic this will probably be a permanent showstopper for people who rely on
VMware or VirtualBox. Hopefully this gets sorted out one way or another.

------
alkonaut
What happens in Linux when the video driver crashes (restarts) in Windows?

------
galkk
Seems like the one thing that I will miss from desktop Linux is i3

------
spullara
This black screened my computer. I have been spending the last few hours
trying to get it to boot up again.

FIXED: safe mode, reinstall latest nvidia driver.

------
tsp
Wow, this is huge for data scientists! Now Apple is even more behind with ML
after NVIDIA decided to drop macOS driver support.

------
g_airborne
This looks very cool if it can deliver on its promises! As a Mac user and ML
developer I'm starting to look more and more jealously at Windows - it is
starting to make sense this way.

But I'm also a bit afraid. Has anything related to CUDA ever been easy to
install and set up? Has anyone who tried this got some pointers? For example,
I don't know how many times I've googled the CUDA, cuDNN, TF compatibility
matrix, but it must be close to 100. Is this helping fix that as well?

------
nbardy
I tried to set this up yesterday and spent 3 hours and still couldn't get it
working.

------
husamia
Will Docker containers work with the GPU?

------
awadheshv
WSL2 really is a game changer for people working on ML.

------
shmerl
CUDA proliferation. Not interesting.

------
jpm48
This is quite light on detail. I have just installed it; I was thinking that
nvcc would be installed into WSL2 as part of the process. It wasn't, so I did
it via apt, which worked fine. Built a sample (with some hacking of the SM
versions, which should be higher as I have an RTX), and when I run the basic
matrix mult demo I get code=38 (cudaErrorNoDevice). Anyone else had any luck?

~~~
jpm48
Just done some more reading around, and it seems most people are using Docker
in WSL to get things working! This seems like overkill to me. I usually just
write CUDA code on Linux and use it. Am I missing something?
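For anyone else hitting this, here's the checklist I'm working through, as a sketch. The helper and the hint list are mine, and the 38 -> cudaErrorNoDevice mapping is just what the sample printed, not the full enum:

```python
# Tiny decoder for the error the CUDA samples print, plus the usual
# suspects to check when a WSL2 guest can't see the GPU.
CUDA_ERRORS = {
    0: "cudaSuccess",
    38: "cudaErrorNoDevice",  # the code=38 the matrixMul sample reports
}

def explain_cuda_error(code):
    """Map a sample's numeric error code to a name and WSL2-specific hints."""
    name = CUDA_ERRORS.get(code, f"unknown error {code}")
    hints = []
    if code == 38:
        hints = [
            "check that /dev/dxg exists inside the WSL2 distro",
            "confirm the Windows-side preview driver is installed",
            "make sure the distro is WSL2, not WSL1: wsl -l -v",
        ]
    return name, hints
```

In my case I suspect it's the second item, since I installed the toolkit via apt rather than following the preview instructions.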

------
29athrowaway
When you mention Microsoft + EEE, you get downvoted.

Well, they have so far: 1) embraced Linux. 2) extended it with proprietary
functionality (Windows living outside WSL), and now what is next?

How is this not the same?

Of course, the MS employee will downvote me automatically, the VS code users
will downvote me too, and I will lose 5 karma for saying the truth.

~~~
Kipters
Next? They're trying to upstream it

~~~
bonzini
Why would it matter if it's upstream? Upstreaming something like this is not
really useful to anyone but Microsoft; it is still something that is
inextricably tied to Windows and WSL.

The point is not "extinguishing" Linux per se; it's achieving enough lock-in
that the only Linux Microsoft customers can use is WSL.

~~~
cwyers
WSL is not a Linux. It is a Windows subsystem for running Linux userlands.
(Almost as if the WSL name isn't totally nonsense!)

The "default" is Ubuntu. But Debian is supported, OpenSUSE is supported, Kali
is supported. Unsupported but available, you can get Alpine, CentOS, Fedora,
Arch, lots of distros.

~~~
Kipters
That was WSL1, WSL2 is a micro-vm running the Linux kernel and whatever
userland you want. Ubuntu is just the most advertised one, but all are equally
unsupported by MSFT, support is provided by the distro "vendor"

~~~
cwyers
By "supported" I mean "available in the Windows Store." I believe that those
are submitted by the distro vendors themselves.

And from a lock-in perspective, the userland is all that matters, yeah? If an
app runs on Ubuntu the same way whether it's on WSL 2, in Docker, in a VM, or
on bare metal, then it's not a Microsoft Linux, it's just Ubuntu. Or whatever
Linux you want.

~~~
bonzini
If it has to run on Hyper-V, then as far as Microsoft cares it's a Microsoft-
extended Linux that they can make money from.

