I hope Microsoft stops with the forced auto-updates, because the alternatives are catching up fast if you are not a gamer.
Linux on desktops is fine (except from a physical security standpoint), but primitive in terms of UX. And it's still dependent on Mozilla or Google for its browsing experience.
And if you want a portable computer (which, by the way, is demonstrably more secure in the face of physical tampering), you basically resign yourself to terrible battery life, poor display support, and dicey sleep support, and the fixes for these often compromise performance.
I really wish Linux users would stop softballing their desktop vendors and kernel maintainers so much. It's just not competitive!
The point about portable computers is also false. You just need to pick a machine with hardware manufactured by friendly vendors who help write drivers. Why would you want to support anyone else?
Battery life? My x230 with tpm installed gets 6 hours of battery life on the stock battery.
Display support? I'm using a 2560x1440 monitor with the mini displayport on my x230 right now and have had no issues with it whatsoever. Plays Quake 3 great too, no tearing. Debian Jessie.
Performance issues? My system idles at 200 MB of RAM. Good luck getting macOS or Windows to do that. And to preempt the bias card: I use a Mac for work and love it to death. I think most of your points were applicable in 2005. The terrain has changed, and the mainstream Linux distributions are now very stable and usable daily-driver systems.
Linux is more than competitive, it's just not targeted at inexperienced users. Which is fine. Not everyone has the need or want to configure their computer to suit them. Some people just need a computer that works. That's why macOS and Windows exist.
Aaaaand this is why I opted away from Linux and back into the Windows world after university. I'm trying to get shit done, I want to install and go and almost never have to touch anything but my code and solving my problems - NOT solving the problems with the tools that I'm ostensibly supposed to be using to solve my problems.
For all the flak it gets, since 7, Windows Just Works for me.
Okay, so... let's just think about what you said for a second. Windows 10 and Mac OS X deliver highly polished configurations with a lot of extensibility in their window managers, via first and third parties, without extensive modifications. I have some unique XMonad configurations too! But if I have to be a direct contributor to my experience, I take a lot of the credit away from the people who shipped the software, because they basically sent me an SDK for building a good UI environment.
> The point about portable computers is also false. You just need to pick a machine with hardware manufactured by friendly vendors who help write drivers. Why would you want to support anyone else?
Yeah. "Lenovo is friendly" is not a very compelling argument. Their support and sales are miserable. Their machines are unimpressive. Their supported versions of Linux are years out of date.
> Battery life? My x230 with tpm installed gets 6 hours of battery life on the stock battery.
My surface book gets 6 hours of battery life even if I'm compiling android binaries regularly. Without changing anything.
Are you actually going to pretend that sleep support isn't a major issue for many portable hardware setups? Or is this restricted to "only specific token gesture devices from specific vendors?"
> Performance issues? My system idles at 200mb. Good luck getting macOS or Windows to do that.
What does this have to do with performance?
> The terrain has changed and the mainstream Linux distributions are now very stable and usable daily driver systems.
So... tell me: does Canonical's kernel update purge old kernel images yet, or are regular users still SOL after about 8 months of use unless they invoke a shell script they barely understand, provided by Stack Overflow? Asking for a friend for whom I had to do this.
> Linux is more than competitive,
If Linux is only for experienced users, but both OS X and Windows can deliver to the full spectrum, then that's a superset, doge.
This kind of double standard is why Linux ends up getting overhyped.
> Yeah. "Lenovo is friendly" is not a very compelling argument. Their support and sales are miserable. Their machines are unimpressive. Their supported versions of Linux are years out of date.
I don't care about sales, and they publish excellent hardware maintenance manuals which means that I don't need to rely on their support. You say that their machines are unimpressive, but I don't share that sentiment. They're not flashy, sure. But they're rock-solid and take a lot of abuse. You can drop them or spill liquids on the keyboard, and your system will be fine. Just wait a minute for your drink to come back out at the bottom. And despite that, they're still pretty light. I think they're quite impressive.
>> Battery life? My x230 with tpm installed gets 6 hours of battery life on the stock battery.
> My surface book gets 6 hours of battery life even if I'm compiling android binaries regularly. Without changing anything.
You're comparing a five-year-old computer to a brand-new one. Apples and oranges.
> > HandleLidSwitch=suspend
> Are you actually going to pretend that sleep support isn't a major issue for many portable hardware setups? Or is this restricted to "only specific token gesture devices from specific vendors?"
Never had a sleep issue with my ThinkPad, ever. And that's with no vendor support for Linux.
> So... tell me: does Canonical's kernel update purge old kernel images yet, or are regular users still SOL after about 8 months of use unless they invoke a shell script they barely understand, provided by Stack Overflow? Asking for a friend for whom I had to do this.
That's an Ubuntu issue, not a general one, and they should really offer an option to do this automatically. There's no excuse not to. But that's hardly a big issue, is it?
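For what it's worth, newer Ubuntu releases let `sudo apt autoremove --purge` clean up old kernels. As a rough sketch of the selection logic involved (the package names and version numbers below are made-up examples):

```shell
# Sketch of the "which old kernels can go" decision: keep the running
# kernel and the newest installed one, list everything else as safe to
# purge. Version strings are hypothetical stand-ins for linux-image-*.
select_old_kernels() {
    installed=$1   # space-separated installed kernel versions
    running=$2     # version of the currently running kernel
    newest=$(printf '%s\n' $installed | sort -V | tail -n 1)
    for v in $installed; do
        [ "$v" = "$running" ] && continue
        [ "$v" = "$newest" ] && continue
        echo "linux-image-$v"
    done
}

# Example: four installed kernels, currently booted into 4.4.0-45.
select_old_kernels "4.4.0-21 4.4.0-31 4.4.0-45 4.4.0-57" "4.4.0-45"
```

Real-world scripts usually derive the list from `dpkg --list 'linux-image-*'` and `uname -r`, then hand the result to `apt-get purge`.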
But tell me, does Windows automatically remove the remains of failed updates now? Because during the Christmas holidays, I used the Disk Cleanup tool to remove several gigabytes of them from a family member's Windows 10 computer.
I'm not terribly concerned about app or browser telemetry. If you're using Chrome, you're spitting back a ton. Firefox? Still sending some! Your mobile device? Spitting back a ton. Every other machine? Also doing so. Canonical? Also doing some, although to their credit it's less.
You have basically equated all telemetry with intrusive spyware, when in fact it's usually banal data designed to make it easy to identify problems after a bad software push. While maybe we could have a discussion about where to draw the line of "too much" for Windows 10, you've set such a profound double standard that you won't even allow a dialogue about it.
2560x1440 external monitor? Doesn't sound like a HiDPI display unless it's only 15". This is another bare-minimum thing I wouldn't brag about.
"good sleep support" has very little to do with activating sleep by shutting the lid. Grandposter is referring to Linux's terrible reputation for successfully going to sleep and waking up without crashing. You fail to convince again.
Quake 3 is an 18-year-old game. How is its performance in 2017 anything to write home about?
RAM is plentiful, so the fact that your idle system fits in 200 MB is unimpressive too.
You, sir, sound like you have 2005 standards yourself. (Or perhaps 1999 standards? That's when Quake 3 came out.)
sad to think that 'MacBook Pro' is the bare minimum now.
> RAM is plentiful, so the fact that your idle system fits in 200 MB is unimpressive too.
Not true. It is impressive that a modern machine can be run and coordinated in such a small footprint.
RAM isn't eaten up at zero cost. In other words: memory usage indicates more than just what's available; it's a performance metric.
Yes! Actually! Older Macbooks have better battery life! Apple's quest for form factor has actually led them to cut into battery life for their top models. This is why people jokingly tell me to "upgrade to a 2014 mac."
> Not true. It is impressive that a modern machine can be run and coordinated in such a small footprint.
It's also impressive that Red Fraggle can balance two pickles on her nose. It's not really relevant to the claim of "performance" though. It's equally irrelevant to security. Unless you define performance as "fitting into a minimal RAM footprint." Something many memory allocators elect not to do because of the compaction cost hurting running time performance.
I think that's where this sentiment often comes from: people who cut their teeth on XF86Config tweaking and compiling NVidia kernel modules from source and shopping for just the right PCMCIA wireless adapter are now amazed when a fresh Ubuntu install has working 3D-accelerated graphics, 802.11n (with GUI for configuration), Bluetooth, etc. (And yes, we have Freedesktop.org/systemd/NetworkManager and friends to thank for a lot of this.)
Maybe the bar was just really low, and you've got a strong argument if you say it shouldn't be anymore, but we are making progress...
I was one of those people for longer than I can remember. Last year I decided that it was time for me to make this argument. I don't regret the time I spent learning, tinkering, and sometimes wrestling Linux into working. However, while Linux has made a ton of progress, the gap between what "just works" and what users expect has only grown. My time and needs are just too valuable to me now to be messing around with Linux.
The main problem the article presents is that Windows acts like malware, bringing new elements onto your desktop. I remember getting a pop-up notification from Facebook, and this was after I thought I had turned off all of that stuff. You don't face that in Linux. You did with some earlier versions of Ubuntu, but not anymore.
No doubt, Windows brings computing to people who aren't technical and makes it easy, but anything beyond the most simple configuration will take a lot of time as well.
In the end, I find I have to spend far more time tinkering with my HTPC box running Ubuntu than I ever really had to with Windows. I know there are other distros, but Ubuntu is pretty much the king of desktop Linux here. I have considered switching to Debian proper, or an Ubuntu derivative, but haven't done so.
Oh, and don't get me started on the pain of getting an MCE remote working halfway properly under Kodi... that was a real painful experience. I'm just glad that HDMI audio now survives suspend/resume, where it used to get lost before some recent changes (daily Intel driver PPA).
In the mid-90s I remember endlessly fiddling with CONFIG.SYS and AUTOEXEC.BAT to squeeze out that extra 30kb of conventional memory to run some game. Buying peripherals was such a crap shoot that 'Plug and Play' was something that needed to be advertised, and much of the time you were still stuck manually juggling IRQs, moving cards to different slots, etc.
Nowadays, one can buy any Windows or Mac computer/peripheral and generally expect it to work, with some bare minimum of reading reviews.
So while Linux has made tremendous strides in terms of driver support (you mean my wifi actually works now?), things are still very far from the zero-conf experience that users have become accustomed to.
Time is valuable, and 2-10 hours spent experimenting with various driver packages, editing text config files, recompiling the kernel, etc. is literally money out of the user's pocket, not to mention the learning curve if you don't already know how to do all these things. That doesn't include the time for extra research to see what hardware is well supported.
Here's an example of install instructions on a pretty well-supported laptop that does Linux. I'd guesstimate an hour of time if you've already done this before, and anything up to 10 hours if you run into unexpected difficulties or are totally new to this.
I'm a big fan of Arch myself, but if you look at the distro's philosophy, manually installing it is actually the point. There are enough other distros if you want to avoid that.
Looking for directions for that laptop on Ubuntu returns a bunch of frustrated users and rather conflicting testimony:
Disclaimer: not an Arch user myself
This depends a lot on how well supported your hardware is. I run Linux Mint (MATE) on a Thinkpad T430 and have had 0 issues with displays, sleep, battery life, you name it.
Not having issues means that you arrive at the hardware (which you may have never seen, e.g. at a customer's site), plug the wires, and it works immediately.
More importantly, not having issues means that you can rely on being able to just plug the wires and have it work, and that you don't have a risk of being unable to make it work immediately even if you forget the right invocations and are offline and can't look them up.
How does it work immediately? How does it know if I want to clone or extend the display? If I extend, do I want the same resolution on both screens, or different? You'll have to set that somehow, and whether it's a GUI or a CLI tool doesn't matter.
Forgetting the invocations isn't really an issue anymore either; my shell (zsh) has autocompletion of xrandr outputs, modes, and resolutions.
Requiring a CLI to connect a monitor/projector is a UX fail.
None of this is new, and the whole point of this discussion is that Linux desktops are much better than they were ten years ago. It is that old state that many folks have in mind when criticising Linux distros' usability. It's just not a very interesting discussion to have.
It even worked on the first try from the X220's DisplayPort through a DP-to-HDMI-cable onto a TV.
I've never had to use xrandr or any command line tool to select displays – on Xubuntu, there's this little graphical dialogue: http://netupd8.com/w8img2/xfce-mini-displays.png that pops up (or you can force it to show by hitting that key on the Thinkpad keyboard with the picture of an external display).
OTOH, I did just a month ago for the first time actually use xrandr, but this time it was because I wanted to write a script that set my windows and monitors up "just right" for how I like it when I'm at the office. I love how easy Linux makes it to do that stuff when I find I do want something automated.
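In case anyone wants the flavour of such a script, here's a minimal sketch. The output names (eDP-1, DP-1) and modes are assumptions; check yours with `xrandr --query`. It's written as a dry run that prints the command instead of executing it:

```shell
# Hypothetical "docked at the office" layout: laptop panel on the left,
# external 2560x1440 display on the right as primary. The function only
# prints the xrandr invocation so it can be reviewed before applying.
office_layout() {
    echo xrandr --output eDP-1 --mode 1920x1080 --pos 0x0 \
        --output DP-1 --mode 2560x1440 --pos 1920x0 --primary
}

office_layout            # review first...
# office_layout | sh     # ...then apply
```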
Is that your way of saying that you cannot run proprietary stuff like IE or Safari and that .. is a bad thing? I don't know a single person using Edge/IE anywhere around me, so I have trouble parsing and understanding this statement.
There is no networked computing experience that does not allow vendors to extract metadata.
And you're right, Linux does sometimes require fiddling during installation to install drivers to make hardware work... because, as we all know, Windows requires no drivers at installation time! All hardware "just works" without drivers on Windows, right? And you never have to be on your toes lest your driver installer sideload some shovelware you didn't want. What a "modern" UX experience! Yes, please, my mouse driver needs to have its own service visible in the dock that also phones home separately from all the other items I install. How very modern!
Every desktop env has something that sucks about it, and windows has plenty (remember the clusterfuck that was 'removing the Start button'?). Similarly, if you don't like traditional desktops then there's plenty of alternatives in ^nix-land, like tiling window managers.
Apart from the above points that you mentioned, it's definitely not getting you the out-of-box working functionality that Windows gives. But I think I am willing to suffer that much to have a system with more control and an OS which doesn't install random apps without my permission like Windows 10 does.
But: the time I use up in Linux configuring various things like that, I easily waste on Windows downloading and installing all the tools and graphics drivers I need by hand.
The fact here is that Windows' tooling, if not entirely absent, is horrible. And although there are always alternatives that are easy to come by (Sysinternals tools, third-party tools like Voidtools' Everything, PuTTY, DisplayFusion, Notepad++, a decent browser, etc.), the fact that their functionality is still not integrated (and that there isn't even a simple way to bulk-install them) still boggles my mind.
So, railing on about "out of the box" performance from a windows perspective seems a bit off to me. Windows as a blank slate is horrible.
Oh come on, windows distro doesn't even have coreutils in it.
That said, keeping said system running is another matter: updates causing regressions seem to happen to me far more on my Ubuntu system than on my Windows or macOS ones.
Battery life on Linux is perfectly fine.
Same people who love tiling window managers... you're the 1%!
DaVinci Resolve seems to have a Linux version, but that's not enough for someone who does more than video.
I don't know how fleshed out the DTP side of things is, but as an amateur photographer and graphic designer I've been able to get away without having Adobe anything (except Lightroom) installed for the past few years, at home and at work.
You will have to learn a new UI though, but I find this is easier to do than it sounds because most of the open-source software doesn't try so hard to map things to their equivalent real world process. (I found that the hardest thing about Photoshop was its terminology and workflow, which was built to be familiar to film photographers and print media people)
Inkscape is good, but I don't use Illustrator that much. For DTP, Photoshop + InDesign are hard to beat and, again, an industry standard.
Maybe I'll run Windows in a VM but I don't really see myself going without these tools which I rely on daily.
The problem I have with most of the replacements is that none of them seem to put much/any effort into cataloguing and taxonomy. There's plenty of open source alternatives for the RAW exposure component of Lightroom but I don't think that's the reason people stick with it. Organising photos is hard and Lightroom is fantastic at it.
You mean in a file system at all or you mean that it wants things stored in a particular folder structure? If the latter, Lightroom doesn't enforce any particular structure. If the former, I'm not aware of another solution that works with images that aren't on disk. Darktable, RawTherapee, Capture One Pro etc. all work with stuff on a filesystem.
Darktable for example saves its settings in a file alongside the RAW image. Done. I can move those around without being afraid of anything. LR however enforces Adobe's own, opaque solution that I'm not aware of any possibilities to efficiently manage without going through LR itself.
Don't get me wrong, tagging etc. is nice. I'd just rather do it on my own in a system that I can manage myself (e.g. BeFS if need be).
If you want to make books professionally, there is no replacement for InDesign. Scribus doesn't have multi-line paragraph composition, so technically, if you want text set justified to the block, you will always get a better result in InDesign. 3D and photography are interesting to programmers, but books... those get made with TeX. And with TeX it's hard to do columns and more complicated compositions. One workaround could be Scribus's LaTeX render frames, but it's somewhat shitty.
It's too niche a thing, I guess.
The same goes for After Effects: node-based compositing is fine on Linux, but for layer-based animation (motion graphics), AE is the only thing on the planet.
It's getting frustrating, especially with the recent decline of macOS and Mac hardware. Again, a Hackintosh to the rescue, but try finding a Hackintosh laptop.
Huh? Linux is still one of top platforms for any kind of highend professional video work. Software like Lightworks, CG renderers and designers, etc. Is that not "creative" work?
It's mostly Adobe with their monopoly. The thing is, I suspect they could target Linux at least experimentally, since their software is already multiplatform and runs on Unix. They don't seem to be using many OS-specific parts.
Capture One may or may not work with Wine; all the tests in the database are old.
Just out of curiosity, though, as a photographer, what does Linux do better than macOS? Apart from the hardware (and my 2012 MBP works great for development and raw processing), I can't see what is compelling about Linux. I ran Linux on my Thinkpad T42 for years (Debian, Gentoo, Ubuntu), and macOS is so much nicer.
Photoshop is the industry standard, yes, but 'creative work' extends beyond digital design.
Having worked with and at digital agencies, adobe is the industry standard. I am open to learning about an agency which does not use Adobe, Sketch etc.
Why the hell anyone would want that on a server is beyond me.
> Devices currently running Windows 10 Pro, version 1607 can get Windows 10 Enterprise Current Branch (CB) or Current Branch for Business (CBB). This benefit does not include Long Term Service Branch (LTSB)
I know about this already, but the point is that the end-user experience, for me, has been getting worse than Windows 7, for example (not talking about the UX here).
The fact that I have to use PowerShell or the Services console to control this stuff is just not nice.
I think an end user should not be forced to go into regedit to stop Microsoft from installing crapware like Candy Crush.
A few comments:
- It is better to split such a mega-script into a set of named scripts, so admins can mix-and-match their own configuration set.
- The configuration-set scripts should be re-entrant; that is, one can run them a few times in a row and achieve the same stable result. This is an important principle because those scripts evolve over time until they are stable, so re-entrancy enables this cycle of re-configuration.
- Some configuration items are system-based while others are user-account-based. This means that the latter should be invoked automatically once a new user account is created.
- VM is your friend. Wash, rinse, repeat.
- It is not always wise to replace automation (PowerShell) invocations with direct registry modifications. Tradeoffs should be obvious.
- MDT setups should avoid direct system configuration wherever possible, and rely on configuration scripts instead.
- One of the features still not possible to script is setting the policy startup/shutdown/login/logout scripts. One can provide this manually in a base workstation image.
- Esp. on Windows systems prior to Windows 10: make sure PowerShell is stable - version and module-wise.
I would also change the default policy in Windows Firewall to drop all outgoing traffic, and then enable access on application basis, and for basic things such as DNS and DHCP.
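A sketch of what that default-deny setup could look like with `netsh advfirewall` (the rule names and the Firefox path are made-up examples; written as a dry run that prints the commands for review, since applying them needs an elevated prompt on the actual machine):

```shell
# Dry-run sketch of a default-deny outbound Windows Firewall policy:
# block everything, then allow specific applications plus DNS and DHCP.
# The function only prints the netsh commands so they can be audited.
emit_firewall_rules() {
    printf '%s\n' 'netsh advfirewall set allprofiles firewallpolicy blockinbound,blockoutbound'
    # Per-application allow rule (path is a hypothetical example)
    printf '%s\n' 'netsh advfirewall firewall add rule name="Firefox out" dir=out action=allow program="C:\Program Files\Mozilla Firefox\firefox.exe"'
    # Basic plumbing: DNS and DHCP
    printf '%s\n' 'netsh advfirewall firewall add rule name="DNS out" dir=out action=allow protocol=UDP remoteport=53'
    printf '%s\n' 'netsh advfirewall firewall add rule name="DHCP out" dir=out action=allow protocol=UDP remoteport=67'
}

emit_firewall_rules
```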
Windows 10 will still spam the DNS server for telemetry hostnames, and there seems to be nothing that you can do about that.
And really, if you can, you should switch to a better OS that doesn't require you to work against it.
If you have any issues with it let me know - we work on it pretty much constantly.
That is a feature, not a bug. I wish I could find something that would stop Windows' TrustedInstaller from using 25% of my CPU all the time on Win 7.
They have gotten better (these links are in chronological order), but best practice for individual users and administrators is to disable the Windows app store.
Just because I understand/know how to verify them doesn't mean I can write them down myself in the same time, so no, I don't "need" it, but it is useful to have and to compare to what I'm usually doing.
Besides you clearly need to read it anyway - it does stuff that not everyone will want, like disabling the lock screen.
Edit: In fact, I'd be curious if there has ever been an actual instance of `curl ... | sudo sh` being malicious. I mean, it's an obvious attack vector, but it's also conspicuous. I've never heard of anyone actually using it.
I run Windows because I have to for work. I don't trust it.
Everything in that script is straightforward and, if not, easily googled to determine what it's doing. It's not black magic.
But I have to question your attitude, especially here on a forum called "Hacker News"; why do you advocate against exercising ownership over one's rightful property? Microsoft already got their money and they should leave users the hell alone. There is a proud history of hacking one's PC and poking big brother in the eye, don't forget that!
I walked up to my laptop and in the upper-left corner of the screen Cortana wants me to try ordering Star Wars tickets. Fuck you Cortana, don't tell me what to do!!!!
I do agree that people executing this script should not only rely on "it's open source, so smart people will look at it and find issues", but actually research and fully understand what is going on.
I have a bit of a pet peeve about the word "untrusted", because it leaves open who is doing the trusting.
They will be undone, they can cause deeper issues in the OS, and they can in some cases cause vulnerabilities.
I see this a lot with chrome. People will load it up with extensions, then blame the browser when something doesn't work.
I'm not saying don't ever use it, but remember this if you start having weird issues, or one day the changes are undone.
It doesn't mean MS is out to get you; they just don't support those who change undocumented internals, and they don't need to announce when they change some undocumented internal registry entry.
"If you do not understand every single command in this thing, you should avoid it, and if you understand every single command in this, you don't need it."
It's so universally applicable!
Tool users, remember?
So... you're saying that if you do know all this stuff, then you should manually type it all in a shell window (or hunt down in a gui) on every win10 box you want to administer?
Get-AppxPackage ** | Remove-AppxPackage
EDIT: A cursory glance at the script doesn't show anything dangerous. It may do things you don't want, though, but PowerShell is pretty readable, so you can tell what's going on even if you don't know the commands specifically.
You cannot check the source code of Windows.
Which of those could have something to hide?
What I really don't like however is Microsoft pushing garbage like candy crush to my machine without my consent.
And if you use a web app, e.g., Google Apps, they get all this data plus more (and completely not anonymized).
That and not being able to see what it's sending out. I recently took a close look at the Privacy control panel on my iPhone. Not only does it give the option of turning off telemetry and ad data, but it also shows you exactly what is being sent back to the mothership.
This is what Windows 10 needs.
A big one is that some people are on metered connections and that telemetry can cost money.
Did you actually try this or is this based on one of those articles about the open beta of Windows 10 that had a lot more telemetry you couldn't disable?
As you can see in gpedit, Cortana and Web Search is disabled. Why does explorer.exe need to access akamai or search.msn?
Instead: Click Search, Hamburger Menu, Settings, Uncheck "Search Online and Include Web Results."
If you have OneDrive installed Explorer keeps a constant connection to Microsoft's servers to populate your OneDrive folders into the navigation pane.
Also, if you had actually researched before publishing, you'd have known that Microsoft removed the disable option from the hamburger menu in the Anniversary Update.
This is not a conspiracy theory but Microsoft blatantly crossing the line. I personally don't care, since I'm not interested in joining the arms race against telemetry, but not being able to easily disable an annoying persistent connection certainly leaves a bad taste in your mouth.
It says right in the GPO policies which operating systems they apply to. For example, in your screenshot you set "Do not allow web search", which, to quote the policy itself, only applies to:
> Microsoft Windows XP, or Windows Server 2003 with Windows Search version 3.01 or later
Doesn't do anything on Windows 10. So it isn't "blatantly nonsense." You are just playing with pro functionality you don't understand.
> Also, if you had actually researched before publishing, you'd known that Microsoft removed the disable option from the Hamburger menu during the Anniversary Update.
As you can see I am on 10.0.14393 which is the Anniversary Retail Release and have the option.
I will say that I definitely do not want to manage options via clicky-click. Local policy editing (and domain group policies in the enterprise) have been a replicable, scalable way to manage settings in a Windows environment since at least XP/Server 2003 (the earliest Windows client + server environments I've admin'd IIRC).
Configuration outside of LP/GP doesn't scale particularly well beyond personal use, I'd say, and as such I get what the other poster was driving at.
You said above:
> That's not true. Install and run wireshark then dial all you want, there will always be traffic every other minute or so to one of Redmond's servers from just idling.
I literally did exactly what you said. Shutdown all third party applications on a Windows 10 Pro machine, loaded Wireshark 2.2.3, filtered out all LAN traffic/broadcast traffic/etc and watched. Didn't see any traffic at all going to Microsoft nor anyone else, it is still running now and not a peep.
Now I have no doubt that if I waited long enough I would, since I have Windows Update enabled, use Microsoft for time synchronisation, and a handful of other things. But it definitely isn't "every other minute." I cannot reproduce that.
Is your system volume license or retail?
Maybe Microsoft tracks this and forces my in theory Retail copy to still act like some sort of guinea pig :/
I think the other side effect from all those regedit hacks was issues with Windows updates. In the end, my machine got stuck at installing October patches, rebooting then failing at 99% applying the patch, rebooting again, rolling back the patch, rebooting, trying to install the patch again. The non-stop CPU usage and reboots made me quit Windows for awhile.
Currently it is possible and easy to remove telemetry. And it only takes a few clicks to disable auto-update for Windows 10 by setting your Wi-Fi to metered, plus a Google search to set your Ethernet to metered.
I have to tweak the registry via shitty scripts like this one. That's the opposite of how an OS should work.
If you'd rather do it manually, these steps do the same thing: http://i.imgur.com/HnFpmE6.png
Start > [group policy] > [enter] > Computer Configuration > Administrative Templates > All Settings > Allow Telemetry
Additionally, has anyone checked to see if these same keys are changed by using lgpe?
> has anyone checked to see if these same keys are changed by using lgpe?
It does not touch the Wow6432Node entry, but given that, as far as I know, that exists for 32-bit program compatibility, I'm not sure it is necessary. Or it might be mirrored in by some kind of maintenance process.
Before moving definitively to Linux, I'm considering installing a proxy on my router, blocking all ports except one, and just redirecting Firefox and the few apps that need connectivity through this port.
I'm not a network guy, though; it might be complicated.
I know what you mean, but as a Linux user, I feel like I'm spending a fair amount of time fighting the OS, too..
I'm not going to argue which is "worse" but I sure prefer to switch a few reg keys to stay ahead of Microsoft, to scouring forums trying to figure out what well hidden lever I need to pull to just get the hardware working.
You'll be fighting the OS either way - I'm picking this (easy) fight.
Considering the state of Ubuntu, probably the "easiest" to use desktop out there at the moment, my freedom is second to my ability to get a working desktop.
Reminds me of this video (Free as in free time; the freedom less mentioned by free software evangelists.) by Louis Rossmann: https://www.youtube.com/watch?v=KOjCJXHJhPg
Windows routinely would take hours to install, and that was with a "pre-installed" image from the vendor.
Lately things have gotten a lot better, but I have never chosen Linux because of freedom, but rather always because of simplicity and usability.
Then again, usability is in the eye of the beholder, it seems.
It's just not there. But yes, if you want to run Python, GCC or Ruby from the commandline, Linux is very capable of giving you a faster experience.
Well, yeah, but I never claimed that Linux would be suitable for everyone's uses; just mine.
Image editing: I've never needed more than the Gimp. Why would I pay for something like Photoshop, full of features that I'll never use? It'd be like buying a power tool to assemble Ikea furniture.
Video project: I did a little in Windows XP with Windows Movie Maker, pasting together porn videos about 15 years ago, and conversions with stuff like Handbrake since then. I've never been inclined to do anything more advanced. It's not a need that I've had...but I've got Windows available if I did.
Try to master audio: Audacity covered all that I used to do, but honestly, I haven't used it in 5 years or so. Oh, plus some of my own waveform-generating and mixing software (simple stuff like generating audio for video game emulators).
LibreOffice: It, and previously OpenOffice, have covered every single need that I've had in the last 15 years. Granted, my needs have been simple, but I've never claimed differently.
It's as if I claimed that Windows is all I ever needed, and you came in asking how I intended to hack on the Linux kernel from Visual Studio...
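The waveform-generating bit mentioned above really is simple stuff. A minimal sketch in pure stdlib Python (the square wave is in the spirit of old console pulse channels; function names are my own invention):

```python
import struct
import wave

def square_wave(freq_hz, duration_s, sample_rate=44100, volume=0.3):
    """Generate 16-bit PCM samples for a square wave (NES-pulse style)."""
    samples = []
    for n in range(int(duration_s * sample_rate)):
        phase = (n * freq_hz / sample_rate) % 1.0
        value = volume if phase < 0.5 else -volume
        samples.append(int(value * 32767))
    return samples

def write_wav(path, samples, sample_rate=44100):
    """Dump samples to a mono 16-bit WAV file."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)   # mono
        w.setsampwidth(2)   # 16-bit samples
        w.setframerate(sample_rate)
        w.writeframes(struct.pack("<%dh" % len(samples), *samples))

# A one-second 440 Hz beep.
write_wav("beep.wav", square_wave(440, 1.0))
```

Audacity (or anything else) will open the result; mixing is just summing sample lists and clamping.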
"Blender is the free and open source 3D creation suite. It supports the entirety of the 3D pipeline—modeling, rigging, animation, simulation, rendering, compositing and motion tracking, even video editing and game creation."
Now, it's not the most intuitive piece of software and you'll be spending a lot of time on YouTube following tutorials, but it's freely available and open source.
Out of those, "mathematical equations" is the only one I've ever tried, regardless of which software I was using, and LibreOffice Math always fit my needs. The truth is, outside of programming, I don't use enough of the features of office software for it to make a difference which suite I use.
I'm not sure what you're trying to prove. That there are features in some Windows software that aren't in Linux software? Granted, but I don't see how that's relevant to my original comment.
Anyway, LibreOffice has been able to do all these things for a long time with Graphite fonts, and, since version 5.3, it can also do them with OpenType fonts.
I'm not the kind of guy who would found a Free Software Foundation because of some Xerox printer drivers.
I'd guess that any *nix user who reverse engineers their own drivers in 2017 does so because they want to and/or have a job that pays them to do so (most likely handsomely).
For the rest of us, at least the mainstream Linux distros are almost as easy as Windows, if not easier in some cases (until you come to MS Office, AutoCAD, etc., which is a whole different story, mostly unrelated to drivers, IMO).
I.e., unless you need to use mainstream business software.
Right now I only miss MS Office, and given recent moves from MS it wouldn't surprise me if a preview is available within a year.
Nope, that's just fiction, the awesomely rewarded jobs are boring banking systems written in plain old Windows Forms (or WPF/Angular if you're lucky).
Most jobs would send you with HR to have a serious talk if they found out you lost a week reverse engineering a display link docking station because your multi-monitor setup wouldn't run after installing Fedora in your job laptop.
But that's none of my business. Kids are free to believe in Santa.
> For the rest of us at least mainstream Linux distros are almost as easy as Windows if not easier in some cases.
You usually install Linux because you enjoy working with the terminal, not because it's easy to use, but I certainly understand where you're coming from. Ubuntu has done an amazing job of lowering the barrier to start using Linux.
That would be reasonable, if Linux development isn't your job.
And if you do it for the fun of it I say of course you should do it on your own time.
> But that's none of my business. Kids are free to believe in Santa.
Unnecessary attempt at an insult IMO.
>> For the rest of us at least mainstream Linux distros are almost as easy as Windows if not easier in some cases.
> You usually install Linux because you enjoy working with the terminal, not because it's easy to use, but I certainly understand where you're coming from. Ubuntu has done an amazing job of lowering the barrier to start using Linux.
Plain wrong, IMO: I can interact with a terminal all I want without installing Linux just by running PuTTY. In fact, I prefer a GUI for most things except sysadmin stuff.
More serious answer: I do use other OSs. But I keep using Windows because I want access to the fruits of millions of man-hours spent in Redmond and elsewhere developing for the Windows platform. It's a non-trivial body of work.
EDIT: ps. I know Linux has great, more or less equivalent, free or very cheap software available but I'm too lazy to learn that other software.
EDIT2: I think perhaps a lot of windows users might fall into my category.
If it is open source, that doesn't make it automatically trustworthy. If it isn't open source, you can trust it as much as you trust the author(s). Open source and trust are separate things.
Oh and by the way, the logfiles from most package servers provide an accurate description of what you're doing with your machine and a weak concept of identity. So you'll need to avoid those.
Browsers keep updating and the vast majority of websites collect telemetry as well. So no more internet.
But yeah, the OS app level telemetry seems like a pretty big deal and we should stress out about it.
Also, if you use a browser that respects your privacy, and treats you like an adult, then you can turn that telemetry off.
Apart from their short flirt with Amazon (which is now long gone), what else do they do apart from package updates?
There are also major distributions like Fedora and Debian which never did this.
Please stop spreading your FUD.
Because getting half the battery life if I want CUDA support is pretty shit, in my opinion.
Those "package servers" are distributed mirror networks with no central entity being able to read all the logs.
You can go through all the hoops, run every script out there, and Microsoft can push a new update tomorrow that puts you back at square one. So why even bother, when you can't ever be really sure you achieved what you wanted, or for how long.
Granted, they aren't for beginners, but that's the sliding scale in action.
Looking at this script, it seems like there are many more vectors on Windows than on macOS and Ubuntu.
Eventually I gave up, reformatted the drive it was installed on, and re-installed. Everything's been pretty much fine since then.
The NoLockScreen hack does not work in Anniversary Update. It's even described in the gist and the Anniversary workaround is commented out.
Anyone who has one of the new netbooks that have 32 GB drives, because it's hard to get 16 GB free to install the update.
Microsoft probably just bought that info from my Android location history.
- Click Make the computer easier to see.
- Uncheck the option Turn on Magnifier and click on Save
I find this interesting as a security choice rather than completely disabling P2P updates, as I'd guess that a substantial number of users aren't in control of all machines on their network, and if the other computers on the network got their updates from peers outside the network, then you'll still end up getting the updates from those peers. Is completely disabling P2P updates not an option?
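Disabling the P2P part entirely should be possible via policy, as far as I know. A sketch of the registry equivalent (the value meanings are the documented Delivery Optimization modes, but verify against your build):

```reg
Windows Registry Editor Version 5.00

; DODownloadMode: 0 = HTTP only (no peer-to-peer at all),
; 1 = peers on the local network only, 3 = internet peers too.
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\DeliveryOptimization]
"DODownloadMode"=dword:00000000
```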
For desktops local network is fine.
For laptops it should be a no-go.
If you want to run it, you should read through it first; it says above every function what it is going to disable. I would go with a simple rule: if you don't understand it, don't disable it. It also contains code for re-enabling all features in case you start to miss something.
Edit: especially the UI part seems to disable a lot of things by default that you might not want disabled!