
Wanting to dual-boot sometimes isn't that rare, surely. And local storage for documents remains popular too. It seems to reflect the reality that serious Linux development effort has been very server-focussed for a long time.


On the contrary, I have the feeling dual boot is a thing of the late '90s and early 2000s, when processors didn't come equipped with virtualization extensions and virtualization was slow as hell.

Having to reboot is so inconvenient that sooner or later in a dual boot setup you leave your non-preferred environment collecting dust and cobwebs.


Sure, but the Linux audience is certainly not one that pays for such things. Like the parent poster explained, this is a hard and niche technical problem. Someone has to foot the bill for it, and apparently there is a "need" for it but no market to sell it.


> "Sure but the Linux audience is certainly not one that pays for such things."

Eh? People just pay for their stuff. The problem is there isn't really a market for OSes (as a consumer), period. Unless you build a PC yourself, which is not the norm, you don't usually even buy an OS.

This has nothing to do with whether the Linux audience pays for stuff or not... they pay fine. It is true that wanting to do this sort of thing is relatively niche even for Linux folk, but that has more to do with laptop vs. desktop form factors than with the market. If it were a common need, I think it would be a common thing found on the market. In fact, the existing ext4 stuff for Windows used to be paid, but it was aimed primarily at Windows users. Linux users have had no issue accessing NTFS drives for ages, and generally didn't enjoy giving Windows access to their drives anyway, what with the market for viruses and whatnot...


The Linux audience is software engineers and the companies who employ them.


Back when I was a desktop Linux user, a very sizeable portion of the other people I knew that used it weren’t “software engineers”. They were ‘power users’. I don’t think that my experience was all that atypical.


So why hasn't such an open source application materialized after all these years?


Well you'd need to fund the cost of a kernel driver signing certificate and get it through WHQL as well. That's just for a start. So this wouldn't be something you could build yourself.


Surely you’d just need to turn driver signing off
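
For local tinkering, sure. On a development machine you can boot with test signing enabled, roughly like this from an elevated prompt (untested sketch; Secure Boot has to be disabled for this to take effect):

    REM Allow test-signed/unsigned kernel drivers to load (reboot required).
    REM Development machines only; Secure Boot must be off.
    bcdedit /set testsigning on

That gets you a driver you can run yourself, but not one you could distribute to ordinary Windows users.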


The need is more personal than corporate.


There are corporate solutions (like the Paragon drivers), they just suck. Paragon tried merging this code into the Linux kernel, but the code was so bad and unmaintainable that it was rejected. The FOSS solution is using FUSE, but that requires a UNIX-like OS for easy porting. Unless Windows ditches the NT kernel altogether, I don't think you can expect native filesystem extensions beyond NFS support.
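
To illustrate why FUSE is the path of least resistance on UNIX-likes: a toy read-only filesystem is a few dozen lines of userspace code. A rough sketch using the fusepy package (assumes fusepy is installed; the filesystem and names are purely illustrative):

    # Minimal read-only FUSE filesystem using fusepy (pip install fusepy).
    # Exposes a single file, /hello, containing a fixed string.
    import errno, stat, sys, time
    from fuse import FUSE, FuseOSError, Operations

    class HelloFS(Operations):
        def getattr(self, path, fh=None):
            now = time.time()
            if path == '/':
                return dict(st_mode=stat.S_IFDIR | 0o755, st_nlink=2,
                            st_ctime=now, st_mtime=now, st_atime=now)
            if path == '/hello':
                return dict(st_mode=stat.S_IFREG | 0o444, st_nlink=1, st_size=13,
                            st_ctime=now, st_mtime=now, st_atime=now)
            raise FuseOSError(errno.ENOENT)

        def readdir(self, path, fh):
            return ['.', '..', 'hello']

        def read(self, path, size, offset, fh):
            return b'hello, world\n'[offset:offset + size]

    if __name__ == '__main__':
        # Usage: python hellofs.py /mnt/hello
        FUSE(HelloFS(), sys.argv[1], foreground=True, ro=True)

The kernel-side fuse module does the heavy lifting here, and that is exactly the piece a non-UNIX kernel doesn't provide.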

GNU philosophy is not about moneymaking or footing the bill. Sometimes a problem gets worse when you throw money at it.


If there were a Linux OS focused on desktop/laptop users that offered a highly polished user experience - comparable to the best of macOS and Windows - I would be very happy to pay for it.

I am probably in the minority of Linux users though.


I do not find macOS or Windows to be polished. Linux isn't either. They all just have different areas of roughness.


I mean…using desktop Linux in the first place is already an incredibly harsh qualifier. macOS and even Windows (+ WSL) are no doubt serving the needs of many people that need a *nix environment for their work, and eating into the percentage of people that would otherwise push for Linux to be supported by their employer, or use it at home. That and the ubiquity of containerised workloads.

It’s really not outside the realm of reason that the subset of those people who want to use Windows for something other than playing video games is small enough for there to not be the base level of interest needed for this stuff to get off the ground.


These days probably quite rare, except for gaming. Virtualization is far more convenient.


Used to dual boot, but with the combo of Steam/Proton and VFIO passthrough + Looking Glass, it’s been really solid to just run both at the same time. File sharing between the two is pretty easy in that case, because it’s just the usual rsync/scp tools or an SMB share.
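
E.g., with the guest reachable over the virtual network, either of these does the job (hostnames, share names, and paths illustrative):

    # Pull a directory from the guest over SSH; -a preserves metadata, -v is verbose.
    rsync -av windows-guest:/path/to/share/ ~/share/

    # Or mount the guest's SMB share directly on the Linux host.
    sudo mount -t cifs //windows-guest/Share /mnt/share -o username=me,uid=$(id -u)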

Although, I’ve found I very rarely need the guest for gaming anymore, more so for opening the odd Photoshop file and similar tasks.


This is where I am at now.

In 2010, I used to dual boot. In 2014 or so, I switched to VFIO GPU/SSD passthrough and never booted into Windows again. By 2019 or so, I had pretty much stopped using passthrough too, partly because I stopped playing the few games Proton didn't run (mostly because of anti-cheat, e.g. Easy Anti-Cheat).

I don't even have my Windows SSD installed in my tower any more; that's how little I expect to need it these days.


Virtualisation is really laggy if you have no GPU acceleration. There are ways to get it working, but they either need two GPUs or are still slower than native.


Yeah, if you're going to exclude the common use cases, obviously it's going to be rare. (:

People still use local storage.

(I'd also like native ext4 support in Windows.)


Yeah, but when I boot Windows for gaming, I don't care about sharing data with Linux.

And given the availability of cloud storage, I can just upload whatever little I need to share.


Not all dual boot systems are "main OS" plus "wintendo." Quite a lot of the ones I encounter on personal machines are, certainly, but people doing actual dev stuff on both is still a use case.


Valve is working hard to change that.


When I dual booted a Hackintosh and Windows (2015), I did so on two different drives so I wouldn’t have Windows scribbling over my boot sector.


Linux has good NTFS drivers. If you want to dual boot and share files, you just make three partitions: Windows (NTFS), Linux (whatever the cool kids are using these days), and Shared Data (NTFS).
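
On the Linux side that shared partition can be a single fstab line, something like this (UUID, mountpoint, and uid/gid are illustrative):

    # /etc/fstab entry for the shared NTFS data partition.
    # ntfs-3g is the userspace FUSE driver; uid/gid make files owned by user 1000.
    UUID=0123456789ABCDEF  /mnt/shared  ntfs-3g  defaults,uid=1000,gid=1000,windows_names  0  0

The windows_names option keeps you from creating filenames Windows can't read back.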


And before Linux had good NTFS drivers, you just put your shared data on FAT32.


exFAT is another option these days.


In my experience, exFAT support in Linux is worse than NTFS support (and NTFS is more likely to be installed out-of-the-box). If you also need to support macOS, then exFAT might be an option, but exFAT lacks journalling and doesn’t scale too well due to cluster size/count limits.
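
To put rough numbers on the scaling point (from memory of the exFAT spec, so treat as approximate):

    max cluster count ≈ 2^32 (32-bit field)
    4 KiB clusters:  2^32 x 4 KiB = 16 TiB max volume size
    32 MiB clusters: much larger volumes possible, but every small
                     file still consumes a full 32 MiB cluster

So big volumes force big clusters, and big clusters waste space on small files.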


For me, dual booting ended the day VMware and VirtualBox became good enough; that was around 2010.


Why would I dual boot when I could use a VM?


For performance, to use the entire RAM and CPU of the bare-metal machine.


The overhead of VMs, properly provisioned, is minimal these days. Even going with card passthrough just works.
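
For the curious, the QEMU side of passthrough is just a couple of flags once the card is bound to vfio-pci (PCI address, core count, and memory size illustrative):

    # Boot a KVM guest with the host CPU model, 8 cores, 16 GiB RAM,
    # and a passed-through GPU. 01:00.0 is the GPU's PCI address; it
    # must already be bound to the vfio-pci driver, with IOMMU enabled.
    qemu-system-x86_64 -enable-kvm -cpu host -smp 8 -m 16G \
        -device vfio-pci,host=01:00.0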


You need a whole damn extra card for passthrough. With the price of today’s high end cards, that option seems… less than attractive.



Let alone trying to fit two 3-slot GPUs in your machine, or having enough PCIe lanes to take advantage of them.


That doesn’t work if you have one GPU (unless you shut down the Linux desktop, which defeats the point).


For what?



