Nvidia adds telemetry to latest drivers (majorgeeks.com)
409 points by schwarze_pest on Nov 6, 2016 | 231 comments

First, let me say that I think what they did was wrong and it should only be opt-in and clearly stated.

That said, having managed fleets of machines that were nominally running the "same" software, getting updates from all of them is a really powerful debugging tool. Once you get above about 1,000 machines logs comparison of all the machines immediately surfaces software issues (happens on all machines), connectivity issues (machines in a certain area), bad machines (unique problem signature), and environmental issues (time of day correlation with other things like power/temp/humidity/etc).
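The triage logic described above can be sketched in a few lines. This is a hypothetical illustration, not anyone's actual pipeline, and the event schema (machine id, site, error signature) is invented for the example:

```python
from collections import defaultdict

def classify_issues(events, fleet_size):
    """Rough fleet-telemetry triage. `events` is an iterable of
    (machine_id, site, error_signature) tuples (hypothetical schema)."""
    machines_per_sig = defaultdict(set)
    sites_per_sig = defaultdict(set)
    for machine_id, site, sig in events:
        machines_per_sig[sig].add(machine_id)
        sites_per_sig[sig].add(site)

    triage = {}
    for sig, machines in machines_per_sig.items():
        if len(machines) > 0.9 * fleet_size:
            triage[sig] = "software"           # happens on (nearly) all machines
        elif len(machines) == 1:
            triage[sig] = "bad machine"        # unique problem signature
        elif len(sites_per_sig[sig]) == 1:
            triage[sig] = "connectivity/site"  # confined to one area
        else:
            triage[sig] = "investigate"        # e.g. time-of-day/environmental
    return triage
```

The thresholds are arbitrary; the point is that simple grouping over a large enough fleet makes the four failure classes above fall out almost for free.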

And that gives you a bit more courage to release things early because you'll see problems faster and can fix them.

So with a typical roll-out of 10% of the population, followed by an additional 15%, followed by the rest, you can catch a lot of errors, and 75% of your population sees a really good experience (and in web services, where serving 66% of the population is the minimum requirement for delivering rated service, you can often get close to 100% uptime).
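A staged rollout like that is usually driven by a stable hash of some machine identifier, so each machine lands in the same wave on every check. A minimal sketch, where the 10%/15%/75% split follows the comment above and everything else is assumed:

```python
import hashlib

def rollout_wave(machine_id: str) -> int:
    """Map a machine id to a stable bucket in [0, 100) and then to a wave."""
    bucket = int(hashlib.sha256(machine_id.encode()).hexdigest(), 16) % 100
    if bucket < 10:
        return 1   # first 10% gets the release immediately
    elif bucket < 25:
        return 2   # next 15%
    return 3       # remaining 75% waits for the all-clear
```

Because the assignment is a pure function of the id, you can widen a wave later (raise the threshold) without reshuffling who has already received the release.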

Does that justify their action? No. But since you really don't need everybody to participate to get the benefit, I could see a path where you opt in to early access to drivers, which requires the telemetry, and people who are OK waiting for the driver to be vetted by the 'canary' population get a driver without telemetry.

This isn't about just technical issue monitoring. Their privacy policies and such include sharing your personally identifying information (email, full name, etc) with 3rd parties & advertisers, and associating your random web browsing with your video card's account.

Remember that it's not just games: your web browser, media players, and anything else that might use hardware acceleration run through these drivers, which then report back to nVidia. Plus, all telemetry data is unencrypted in transmission.

On the plus side, it looks like this is only transmitted back to the mothership if you're running GeForce Experience, which I don't think is even available on Linux.

In my experience it is more nuanced than that: engineering will add a feature or capability, and someone will come along and find an unintended use for it. At the search engine where I worked, we came up with ways to inspect and adjust host rank and page rank so that we could knock bad sites out of the search results when they were doing a good job of gaming the system. But the biz dev folks also noted the same pieces could be combined to favor someone who advertised with us over someone who didn't.

The engineering feature is designed with pure intentions, keeping crap out of the results stream, but the mechanism is amoral. If your executive team wants to exploit it for more money so that the company does better financially, your options as an engineer are limited.

At a data design level, they could (and should, by default) at the very least keep all grossly personally identifying information out of the data collected. That's just simply responsible engineering in the modern era.
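As a sketch of what "keep PII out by default" can look like at the data layer: drop directly identifying fields and pseudonymize the machine identifier with a salted one-way hash before anything leaves the box. The field names here are hypothetical:

```python
import hashlib

# Hypothetical set of directly identifying fields to strip.
PII_FIELDS = {"email", "full_name", "account_id", "ip_address"}

def scrub(payload: dict, salt: bytes) -> dict:
    """Return a copy of a telemetry payload with PII removed and the
    machine id replaced by a stable, non-reversible pseudonym."""
    clean = {k: v for k, v in payload.items() if k not in PII_FIELDS}
    if "machine_id" in clean:
        digest = hashlib.sha256(salt + clean["machine_id"].encode()).hexdigest()
        clean["machine_id"] = digest[:16]  # stable across reports, useless to 3rd parties
    return clean
```

The salted hash keeps the debugging value (you can still correlate reports from the same machine) without shipping anything that identifies the user.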

They could have implemented telemetry without requiring you to link your graphics card to your email or Facebook account, as GeForce Experience never required that in the past and still worked fine.

This isn't execs using innocuous data that the good engineers simply happened to have. This is new and specific personal data collection and association of information that they have no business collecting, except for the sole purpose of invading their users' privacy for side profits.

I'm not disagreeing, I'm just saying it is hard to build tools and features into your project that your own business development team can't abuse.

Isn't the easy option to allow customers to participate in the program? Their software is already so buggy and bloated that a simple checkbox that ANONYMOUSLY sends telemetry would be trivial to implement.

This is something that, I hate to say, will probably take a law to fix. Companies in general have shown a resistance to understanding why this is a problem (even when well intentioned).

This. As long as consumers have no way of knowing whether or not their information is being transmitted, what information it is, and how it will be used, corporations will continue to do it at will. We need strong user data protection laws here in the United States that make it extremely clear what rights you are signing away any time you allow a service to collect personal data about you.

It's bigger than that. We are at a point where I would pay a monthly premium to keep my privacy, and would rather there be a bunch of free-tier alternatives that are all explicitly non-private. Because let's be honest with ourselves: nobody but us really gives a shit about their privacy. And it is a billion-dollar industry. Let the people who don't care keep the free stuff, with the cost stated up front, rather than all this running around corporations do now. Or I can even see a future where the government gives you free internet in return for data-mining 100% of it.

I would actually probably use that.

Actually, even if you're paying, the service operator may be mining your data. This is the case for example with cell phone operators: you pay a monthly fee, and yet they use your geolocation to send you custom advertising, sell market studies, etc.

Paying for email sounds weird to most people these days, but now that I do, I'd never have it any other way again.

I'm days away from moving to fastmail. May I ask what service you're currently using? Are you happy with it?

Not the parent, but I too have been paying for email for years and am very happy with FastMail. You can see a lot of work is going into it lately, now that they're independent again. Can't recommend them enough.


FastMail, indeed, and I love it. Something insanely gratifying about being able to file support tickets and rapidly get answers for an email service.


I use;




and use Postbox as my mail client. All great so far and only costs a few £'s a month. Probably less than lunch for a day or two.

Fastmail for features and ease of use. Protonmail if you want security.

Same here. Three separate, paid services. I'd never go back to something like GMail again.

  And that gives you a bit more courage to release things
  early because you'll see problems faster and can fix them.
I gotta say, if the major benefit of telemetry is that vendors can test less before they release, that sounds like a bad thing not a good thing for users?

Bad for a few users, but great for most.

Testing less means they can release sooner.

Here's a rough example. A new game comes out and a driver glitch causes crashes. Typically a fix would take two weeks to reach the public as a whole. With telemetry, they can release to 10% in two days, fix things up, and release to everyone by the end of the first week. So 90% of the population gets the fix sooner (one week), and some percentage of the 10% may have had some annoyances. Sucks for them, but if they roll out in a way that minimizes this, it'd be beneficial overall.

Here's a different example: A new driver improves game performance for 99.9% of customers, and disables graphics output for 0.1% of customers. The bug could have been caught with a few days of automated testing in a test lab with lots of different cards and system configurations, but that's slow and costs a lot of money to maintain - chucking it over the wall to consumers is quicker and cheaper.

Seems to me getting the poorly tested drivers earlier isn't much of a deal for users. After all, just because I'm in the 99.9% this week, doesn't mean I will be next week.

But in my experience there are a few more 9's on your "works" number for a specific issue.

Hardware-related problems are so fucking hard to debug in a lab. Many of the issues I've dealt with in the past only show up with pretty esoteric hardware mixtures, or are so specific that I'd be surprised if more than one person on the planet has that exact config.

A test lab might be able to get maybe a hundred different configurations. But in the wild, something like 60% of our users are on unique hardware configurations (as in nobody else has that exact set of hardware among our users).
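The "unique configuration" figure above is cheap to measure if you model each user's hardware as a set of component identifiers. This representation is made up for illustration:

```python
from collections import Counter

def unique_config_fraction(user_configs):
    """Fraction of users whose exact hardware configuration (given as an
    iterable of component ids) is shared with no other user."""
    counts = Counter(frozenset(c) for c in user_configs)
    unique = sum(1 for c in user_configs if counts[frozenset(c)] == 1)
    return unique / len(user_configs)
```

With a number like 60% unique, no finite test lab can cover the field; only reports from the field itself can.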

Telemetry is pretty much the only real way you are going to be able to support a wide range of hardware when you start interacting with it on the lower levels, because buying and assembling all the different configurations is all but impossible, even if you prioritised it.

My AMD drivers & Steam Client already report crashes

On a related note; I can't even use NVidia's "Geforce Experience" any more without logging in. Thanks for that, NVidia. Just what I wanted; a driver tool that forces me to log in.

So annoying, yes. I have since uninstalled GeForce Experience and replaced update notifications (the only feature I was using) with this ugly batch script. Careful, it's ugly! Botched stackoverflow-oriented batch brogramming! But it works for me! Feel free to reuse and improve :)

    setLocal EnableDelayedExpansion

    rem nvup.bat, a quick & dirty driver downloader since GeForce Experience requires a login.
    rem In a folder with write permissions, drop the script and its two dependencies:
    rem  - jq: https://stedolan.github.io/jq/
    rem  - curl: https://curl.haxx.se/
    rem For automation, just create a Scheduled Task that runs when you want it (I like on Resume).
    rem Reuse / modify / redistribute at will.

    rem http://stackoverflow.com/questions/19131029/how-to-get-date-in-bat-file
    for /f "tokens=2 delims==" %%a in ('wmic OS Get localdatetime /value') do set "dt=%%a"
    set "YY=%dt:~2,2%" & set "YYYY=%dt:~0,4%" & set "MM=%dt:~4,2%" & set "DD=%dt:~6,2%"
    set "HH=%dt:~8,2%" & set "Min=%dt:~10,2%" & set "Sec=%dt:~12,2%"
    set "datestamp=%YYYY%%MM%%DD%" & set "timestamp=%HH%%Min%%Sec%"
    set "fullstamp=%YYYY%-%MM%-%DD%_%HH%-%Min%-%Sec%"
    echo lastCheck: "%fullstamp%" > lastCheck.log

    rem to get the update feed for your device/OS combo, go to http://www.geforce.com/drivers
    rem , pick your device/os, pop the "Network" tab of your devtools, start the driver search,
    rem and copy the url
    curl --silent -o rawNv.json "http://www.geforce.com/proxy?..."

    rem -r makes jq output the raw string, without surrounding JSON quotes
    for /f "delims=" %%i in ('jq -r ".IDS[0].downloadInfo.DownloadURL" rawNv.json') do set lastUrl=%%i

    set /p installedUrl=< installedUrl.txt

    if "%installedUrl%"=="%lastUrl%" (
      echo same version, quitting
      exit /B
    ) else (
      echo new version, updating
      echo %lastUrl% > installedUrl.txt
      pushd C:\Users\YOURUSERNAME\Downloads
      %~dp0\curl -O %lastUrl%
      rem http://stackoverflow.com/questions/774175/show-a-popup-message-box-from-a-windows-batch-file
      mshta "javascript:alert('New driver');close()"
      popd
    )

Here's a less ugly PowerShell version of your script that doesn't have any external dependencies and compares the version of the currently installed driver with the latest (non-beta) driver on NVIDIA's website.

  $URI = 'http://www.geforce.com/proxy?proxy_url=http%3A%2F%2Fgfwsl.geforce.com%2Fservices_toolkit%2Fservices%2Fcom%2Fnvidia%2Fservices%2FAjaxDriverService.php%3Ffunc%3DDriverManualLookup%26psid%3D101%26pfid%3D815%26osID%3D57%26languageCode%3D1078%26beta%3D0%26isWHQL%3D1%26dltype%3D-1%26sort1%3D0%26numberOfResults%3D10'
  $Download = (Invoke-WebRequest $URI | ConvertFrom-Json | Select -ExpandProperty IDS)[0].downloadInfo.DownloadURL
  # Installed driver version (e.g. 375.70)
  [Version]$Driver = (Get-WmiObject Win32_PnPSignedDriver | Select DeviceName, DriverVersion, Manufacturer | Where { $_.Manufacturer -eq "NVIDIA" -and $_.DeviceName -like "*GeForce GTX*" }).DriverVersion
  [Version]$CurrentDriver = ("{0}{1}" -f $Driver.Build,$Driver.Revision).Substring(1).Insert(3,'.')
  # Latest driver on NVIDIA's website
  [Version]$LatestDriver = ([System.Uri]$Download).Segments[-2].Trim('/')
  If ($CurrentDriver -lt $LatestDriver) {
      Write-Output "New driver available"
      Start-BitsTransfer -Source $Download -Destination "$env:USERPROFILE\Downloads"
      (New-Object -ComObject WScript.Shell).Popup("New NVIDIA driver downloaded",0,"Update")
  } Else {
      Write-Output "Same version. Nothing to download"
  }
Improvements welcome. :)

Thanks. I hesitated reaching for PowerShell because, although it looks solid, I'm not familiar with it. Even at work, I was automating part of a Windows build last week, and used batch :-/

So I have a few questions for you or knowledgeable PS passersby:

1. Any recommendations to get started with PowerShell? Good documentation, tooling, linters, useful packages?

2. Are there remaining cases where it's impossible or unreasonable to use PowerShell rather than Batch or VBScript?

3. PS runs on Win≥XP, right? How does the language evolve / is it versioned? If so, which version should I target?

1. Start with the ISE; it comes with Windows. If you can't find it: "C:\Windows\System32\WindowsPowerShell\v1.0\powershell_ise.exe". Otherwise VS Code with a plugin is nice, but you miss some IntelliSense magic the ISE can do.

1a. With PowerShell running as an administrator, run `Update-Help`. That downloads up-to-date help for all installed commands. Use `Get-Help <command>`, and don't forget wildcards: `get-help get-*` will show every single Get command you can run. `Get-Verb` will show what you should expect things to be called. `Get-Module -ListAvailable` is worth looking at too. Tab completion is your friend.

1b. This one and the advanced one are great, although quite slow: https://mva.microsoft.com/en-us/training-courses/getting-sta...

2. No? Maybe stuff that's already fine or where working out the signing policy is a hassle.

3. PowerShell is bundled into the "Windows Management Framework" you can install. WMF 5 has PS 5 and can be installed down to Windows 7. Windows 7 shipped with PS 2.0, which was the first version where you could write cmdlets in PowerShell itself; before that you had to use compiled code and Visual Studio. PS 3.0 is the minimum that's really pleasant to use, but 2.0 support isn't crazy.

1. Try this: https://web.archive.org/web/20120103141402/http://powershell...

2. Not really. Worst case, you can batch & co.

3. Probably v2.

Off-Topic: While looking through your script I noticed the `jq` dependency. I went to install it, and in the docs it mentioned this thing called Chocolatey [0], which is like Homebrew for Windows, and seems pretty amazing :)

Thank you for sharing your work with us!

[0] https://chocolatey.org/

Be aware that even if you manually installed the driver in the first place and have never had GFE installed, Windows Update will helpfully install GFE for you if you let it update the driver.

Sadly, that doesn't allow use of shadowplay.

There are some older versions of GeForce Experience floating around that still work with Shadowplay without the login.

If you're on Windows 10, the built in games recorder is pretty fantastic, I prefer it over Shadowplay. Of course, a lot of people trying to avoid telemetry are probably still avoiding Windows 10.

Use OBS instead.

Shadowplay has that "record the last X minutes" feature that OBS doesn't, I think.

OBS does, OBS studio doesn't.

It's the replay buffer: you start it, it keeps the last n seconds of footage in a buffer, and you set a keybind to save that buffer to a file.
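That "keep the last n seconds" behavior is just a bounded ring buffer. A toy sketch that ignores real encoding and timing concerns:

```python
from collections import deque

class ReplayBuffer:
    """Keep only the most recent seconds*fps frames; saving snapshots them."""
    def __init__(self, seconds: int, fps: int):
        self.frames = deque(maxlen=seconds * fps)

    def push(self, frame):
        self.frames.append(frame)  # oldest frame is dropped automatically

    def save(self):
        return list(self.frames)   # snapshot of the last N seconds
```

The appeal over plain recording is that the buffer runs continuously at fixed memory cost, and nothing touches disk until you hit the save keybind.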


Holy moly, I thought that this post was a joke for sure - and then I checked out GeForce Experience.

Uninstalled. Never to be reinstalled.

AMD will be my next card. Thanks nVidia for letting me know what you think of me :)

My employer took a shipment of fifteen Dell computers a few months ago, with ATI Radeon HD 8570 graphics cards (3 years old, with DP 1.2 and 4K support, so not obsolete by any means, and certified for Ubuntu 14.04 [1]).

Then Ubuntu 16.04 comes out, and guess what? No fglrx driver support any more [2] and users report the open source drivers use about a core and a half worth of CPU, slowing down their entire machines. Apparently our only options are to buy new cards or stay on 14.04 indefinitely.

Meanwhile, I had an nVidia card and I was able to upgrade to 16.04 with no problems at all.

This week you might have resolved not to buy from nVidia - but in the same week I've resolved not to buy from ATI.

[1] https://certification.ubuntu.com/hardware/201302-12679/ [2] http://www.omgubuntu.co.uk/2016/03/ubuntu-drops-amd-catalyst...

It's weird that you're seeing such high CPU usage with the open source drivers. AMD has been directly supporting them for a while now. For most applications, radeonsi (the open source driver for GCN architecture cards) performs within spitting distance of the proprietary driver, and for some the performance is better. I suppose it's possible they are more CPU-intensive generally, but the kind of slowdown you're seeing seems pretty excessive. What applications are you seeing this problem with?

Ubuntu 16.04 does ship with a pretty old version of Mesa (a full major version behind, soon to be two major versions as Mesa 13 is in the RC stage), which probably isn't helping matters. Unfortunately, while there are some PPAs that make it fairly easy to install a bleeding-edge build, there's not really a convenient way to install the latest stable release.

For some context, the reason fglrx/Catalyst has not been updated is due to an in progress driver transition for AMD cards on Linux. Generally speaking, Linux video drivers are split into a kernel part and a user space part. For quite some time, AMD has maintained two completely separate driver stacks on Linux. For the user space side, this didn't represent a huge amount of duplication of effort as most of the code is shared with the OpenGL portion of their Windows driver, but for the kernel side it was a bunch of wasted effort. A while back, they started on a new open source kernel driver, called amdgpu, that could provide the necessary facilities for both their open source and proprietary driver efforts. The new proprietary driver (AMDGPU-Pro) targets this kernel module. Unfortunately, amdgpu does not have production-ready support for GCN 1.0 GPUs like the 8570. This will get fixed eventually, but that doesn't really help you now.

14.04 just switched to the 16.04 graphics stack a couple months back -- do your drivers still work after that update?

IIRC the users who switched back to 14.04 also had to downgrade Xorg [1]

[1] https://askubuntu.com/a/815592

AMD has their AMDGPU-PRO [1] drivers that are available on 16.04

[1] https://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-G...

No mention of the Radeon HD 8570 on the list of supported cards. Unless you know better?

The open source drivers aren't configured correctly; Southern Islands cards are close to feature-complete with the radeon driver. Turn on DRI3 and glamor.

ATI is AMD now.

I'll be buying AMD as well, the RX 400 series already piqued my interest but I can't justify buying another budget GPU from nVidia, even though the 1050 looks pretty attractive.

I wouldn't recommend that. AMD's software quality is horrible, and I'm saying that as an iMac user stuck with them on Windows. Whatever you think of nVidia, it's a breath of fresh air to see such polished auxiliary tools -- and yes, I think GeForce Experience is good... why? Living with AMD crap for years.

And if you think nVidia treats customers/users badly, just watch AMD's treatment of Apple hardware. They added artificial blocks preventing the "normal" driver from installing there (even though it is fully functional) and you have to use a special Bootcamp driver, which is incredibly outdated and buggy as hell in modern games. Their support is about as useful as you expect in a big corporation, and then some. It's so bad that enthusiasts are re-packaging AMD's drivers to have something: https://www.mxdriver.com

The new Wattman software is supposedly a huge leap forward.

What model of AMD can replace GTX 1080?

Not sure if this was rhetorical, but if it wasn't: none of them. The 1070 and 1080 stand in a class of their own. AMD won't have a competitive card in that class until Q1 2017, which will be about the same time the 1080 Ti drops.

I honestly don't care about this; these threads always attract those who value privacy above all else and believe that everyone should play on their terms. Most people do not care, and as an engineer I see extreme value in this data. This isn't to say there aren't dark patterns being used here (the acceptance language is hidden in the EULA), but there is a simple workaround: don't use GeForce Experience. Manually download the drivers from Nvidia's website. In addition to there being a complete workaround, I also use G-Sync for my games, which is superior technology to FreeSync, so there is basically no chance I'm leaving the Nvidia ecosystem unless they did something truly harmful to me.

I asked just in case I had missed something and AMD had secretly developed a competitive card.

After the huge annoyance of the required login in GeForce Experience, this story has 100% convinced me to uninstall it and check for updates manually.

The upcoming Vega cards should be in the same performance segment.

I blocked their telemetry and now I can't even update my drivers. Not through Windows Update, nor through their installer; it hangs trying to talk back to them.

No harm, no foul, probably time to give AMD cards a chance. Variety is the spice of life and the spice must flow.

As much as I welcome the capitalistic response, I'm worried it won't be enough.

As much as I'd love true competition in the GPU/CPU space, it doesn't exist. AMD's cards simply cannot compete with Nvidia for GPGPU type scenarios, and even in its basic capacity, often have known heat/perf issues. Now that may be worth it for now to make a statement against the telemetry, but what if (less if and more when IMO) AMD then adds driver telemetry? And then intel?

These domains (Chip manufacture/GPU driver writing) are so advanced at this point that I don't see how competition could reasonably disrupt an incumbent over anything less than a samsung-grade failure (and even then probably not), and I'm concerned about the long-term wherein the producers realize this and through a combination of boiling the frog slowly and leaving consumers no other choices put themselves in the position to have a "pragmatic monopoly" of free reign over our machines. (I've always wondered what would happen from an antitrust sense, if it's "we're the only producer not because we WANT to but because we're the only ones who CAN")

We've certainly seen it happening with OSes, as well as some attempts from PC oems, I've always unfortunately thought it was just a matter of time until the more irreplaceable components got into the game too and I'd love some creative thoughts to actually stop the trend and not stand in its way, because I'm not sure we'll win that latter battle.

EDIT: as a child post points out, I completely forgot to mention drivers as well; as a strong argument to my "we don't have many options" thesis. AMD's linux support has been historically lacking next to NVIDIA which makes it a non starter in many cases.

Odd, I always bought AMD and only ever had issues if I overclocked. The one time I have an Nvidia card, all I have is driver issues, both on Windows and Linux. The only thing Nvidia has done that impresses me is the Nvidia Shield Tablet, and they made it less impressive with their K1 rendition of it. There's also the Nintendo Switch, but it's yet to be determined whether Nintendo is doing something amazing with it or losing out on another big opportunity. Nvidia does not impress me. I've had AMD for years, and it has been a solid investment every time. Of course this is my experience; other people's may vary. My main complaint against AMD is that Linux drivers either exist or they don't, and they either work or they don't. With Nvidia it's the same: some people and distros pin a specific driver version because Nvidia breaks a previous one with their 'fixes'.

" AMD's cards simply cannot compete with Nvidia for GPGPU type scenarios"

Really? Which brand of GPU pretty much ran Bitcoin during its inception, again?

You're absolutely right, HOWEVER, I would call that the exception rather than the rule; IIRC it was largely due to AMD supporting some operations in a more native/optimized form than Nvidia did at the time, rather than any true general-purpose superiority. And given how quickly mining moved to ASICs, I wouldn't read too much into it (special-purpose, specially designed hardware is superior there, no doubt, but that's not really "general").

Edit: I later read the article linked in a sister post on the breakdown of the perf differences, and it seems to slightly affirm what I'm saying (re: Bitcoin aligning with a functionality AMD performed very strongly in), although I'll admit to having only skimmed it.

I will say though, even despite this, it doesn't address the known heat/power issues even the "heyday" amd cards seemed to suffer from and that were still present when I was comparing them while looking at a 1070 recently.

EDIT: since I can't edit my original post any more, I'd note here that I didn't intend to turn this into a NVIDIA/AMD debate, even if we ignore that for a moment and consider it a duopoly situation instead of monoply, I'd ask readers to consider my core point in that light.

Per dollar, AMD performs much better than nVidia for a significant number of cases[1], and their cards have much more memory. However, this is comparing equally optimized implementations; unfortunately nVidia simply has much more optimized content available.


Try writing some GPU software on AMD cards, then compare that to the NVidia/CUDA experience. I don't know about the Bitcoin thing (maybe in the end, for very long-lived code, the lower cost outweighs the developer pain and time required), but for pretty much all the HPC code I see, getting things running sooner with more polished dev tools beats price/performance ratios every time. If you're spending a few thousand on Titans anyway, a few more cycles per watt doesn't matter all that much; except for Bitcoin mining, where your main cost is energy (by design).

So I think there's a very clear reason for that specific case.

Bitcoin mining uses lots of integer manipulation. AMD cards are faster for integer operations. Hence bitcoin miners used to use AMD cards.
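For illustration of why that matters: Bitcoin's proof of work is a double SHA-256 over a block header, with the digest read as a big integer and compared against a target, so throughput is dominated by integer and bitwise operations. A toy version (real headers have a fixed 80-byte layout; this just hashes an arbitrary byte string):

```python
import hashlib

def mine(header: bytes, target: int, max_nonce: int = 1_000_000):
    """Search for a nonce whose double-SHA-256 digest, as an integer,
    falls below `target`. Returns (nonce, digest) or None."""
    for nonce in range(max_nonce):
        data = header + nonce.to_bytes(4, "little")
        digest = hashlib.sha256(hashlib.sha256(data).digest()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest
    return None
```

A lower target means more leading zero bits are required and proportionally more hashes per solution, which is why raw integer throughput (and, later, ASICs) decided who mined profitably.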

I'm... confused why you'd say this. It's simply true: NVIDIA's consistently ahead on these tasks on performance/energy cost metric.

You can do most any computation on any GPU with the right SDK support (and of course, the CPU hasn't gone anywhere). It's just fantastically less efficient.

It really isn't true: AMD cards utterly crushed nVidia ones during the heyday of GPU mining: http://www.extremetech.com/computing/153467-amd-destroys-nvi...

GPU mining is only one of a multitude of GPGPU applications right now and it really isn't even growing.

AMD is already doing sneaky things with their driver auto-update on Windows: they put the checkbox below the "Upgrade" button, pre-checked and with large spacing, so you likely don't see it and click "Upgrade" before you realize it. They also present the checkbox at least twice, re-checking it if you unchecked it previously. So I fear that after this move by Nvidia, they'll feel empowered to push more things on the user.

I'm worried about the semiconductor industry in general as well; there is a lot of consolidation currently, all over the place, and I'm not sure where it's going to stop. At some point competition becomes less fierce; you don't even need a monopoly, since an oligopoly is likely and will lead to tacit agreement on market separation and price fixing, as in the telecom industry.

What was the checkbox for?

Enable auto update.

Actually, AMD's latest design, the RX 4x0 series, measures up really well to NVidia's mid and low end cards, especially for Vulkan/DX12 apps. They're not a contender in the enthusiast market, but will be releasing high-end cards using a refined design in the first half of 2017.

As for drivers, AMD has pledged to open source their Vulkan and OpenCL implementations. While that release has been pending "legal review" forever now, alternative open source drivers are making great progress thanks to Vulkan's simpler driver model. While NVidia's generally had the "better" driver, both from adhering less strictly to the spec and having the manpower to routinely fix application bugs in their driver, that's all changed with Vulkan/DX12 being significantly closer to the hardware.

I'd say things are looking up for market balance.

On my last box I found the AMD drivers were injecting a DLL into almost every process - including the brand new program I was in the process of developing.

Can someone ELI5 what injecting a DLL does, how they did it, and what they might have been using it for?

I know with my old Nvidia drivers, you can add extra icons for window management to every window (move to next monitor, etc). Would this be using DLL injection?

I've seen Nvidia drivers do that too. I think it's mostly done to let the user override graphics settings that are otherwise hardcoded into the application's API calls.

You can uncheck Plays.tv/Raptr/AMD Gaming Evolved during the installation, or uninstall it after the fact. It's separate from the drivers.

I noticed the same thing. It's ridiculous. Geforce Experience doesn't do anything I care about except auto-update drivers. Why do I need to log in for that?

you forgot game capture/streaming and automatic game setting optimization

anything he cares about. Windows has built in game capture, and I know how to adjust settings on my games myself.

The "experience" app is just superfluous bullshit.

Windows has automatic driver updates too, so... why not just skip it altogether?

The drivers Windows uses are a fair bit older. Occasionally you really do need the latest drivers for a game

I use GameDVR (agario can take the framerate hit). GameDVR has a bug where Windows won't draw menus until I unplug and replug the monitor after using it (sometimes), so I did try Nvidia's thing. That recorded the start page for agario and my mouse. It was super interesting to watch, but not what I wanted. Maybe it works better for "real" games.

Anyway, everything is shit ;)

I use OBS with the NVENC encoder, the one installed with the NVIDIA driver; if you are recording anything, 3D or 2D, it is your best bet.

It's especially nice compared to CPU-encoded H.264, and has the option of light compression with nominal CPU usage if you don't like large lossless files. It outputs normal MP4/H.264 files that I can then work with on Linux (as opposed to, say, Fraps), and it uses the fact that your GPU is already rendering the scene to output it in H.264.

Thanks for the recommendation. I used simplescreenrecorder on Linux before, it worked pretty well. I'll try OBS on Windows and see how that goes.

You can go to their website and install the normal driver. It installs the nvidia control panel which also has an update checker. If I had known I could just be using this I would have never installed that monstrosity.

Be aware there is a "Have feedback?" link in (I believe) the bottom right corner of the app, where you can have a better chance of them seeing your frustration than the comments here.

I don't know about others, but I approach that stuff with the attitude of "I can't expect change if I just swear at my monitor", but stop short of actually expecting change

I just install the drivers manually now. I always hated Geforce Experience anyway. Crazy amount of bloat for some drivers.

As a workaround, you can download v2 (and prevent the updates). I think the games data(base) used is the same, so you can obtain the same acceleration, without the hassles of v3.

I did that a while back. My current version of experience has decided that, upon opening it, I will either upgrade or do nothing.

For now I have shadowplay set up correctly, but any changes aren't gonna be happening it seems.

I just uninstalled GeForce Experience and it immediately caused the already open Battlefield 1 game instance to crash.

Way to go Nvidia.. uninstalling your non-driver app crashes games :)

> Way to go Nvidia.. uninstalling your non-driver app crashes games

That's a weird edge case to support. After all, the app in question is responsible for recording in game video thus implying some kind of dependency.

I have the suspicion that their overlay drawing is fairly low level as it seems to manage to draw over programs which are injection unfriendly and exclusive fullscreen.

Evidently it does not handle getting the rug pulled from under itself well. :)

I just uninstalled GFE3 and went back to 2. Still works.

I have an nVidia card, but it's rather old and I have not updated the drivers on it past a version that just seemed to cause occasional crashes. It works fine for what I need, and is somewhat disruptive to change. As a believer in the "leave it alone if it works" principle, and especially after this news, I'm probably not going to try any newer drivers now.

Absolutely reasonable if your windows box is used for mundane web browsing / programming and the occasional video, but that doesn't fly for a gaming machine; some games will exhibit bad performance or will plain crash with old drivers that don't incorporate the regular fixes/workarounds added by nvidia (and amd does it too).

There's a YouTube video (which I can't find now) where an Nvidia driver engineer explains how many games are terribly broken, failing to respect OpenGL/DirectX basics. Drivers hand-patch around that, just like Microsoft hand-patches Windows for specific games "because compatibility" [1].

[1] http://www.joelonsoftware.com/articles/APIWar.html

If you buy a game in October (such as Battlefield 1), it won't even start if it finds your driver isn't the one shipped in October together with the game. Basically, AAA games these days are partly implemented in the drivers, and games have custom required tweaks for both Nvidia and AMD drivers.

As a longtime NVidia user I agree this is annoying, but it has been this way for a few years now.

Maybe I haven’t looked hard enough, but I’m surprised this and the forced login for GeForce Experience didn’t make a bigger wave amongst gamers, who’ve historically been very vocal about questionable decisions that provide far more value to the business than to the gamers.

> historically been very vocal about questionable decisions that provide far more value to the business than to the gamers.

Maybe true historically - but in recent years publishers have figured out a formula to subvert criticism (review embargos, sponsored reviews, reddit astroturfing). There's no action after the complaining, it's all bark, they still buy the game, little bit of heat on reddit/twitter but then you still set sales records. See The Division, No Man's Sky.

The Division has had a major overhaul though - partially in response to the criticism.

Well, I simply uninstalled the whole thing. Will download drivers manually now. Same for razer synapse. Anything that requires me to have an account for no obvious reason has no place on my system.

Because they made me sign up, I created an account named something like "fuckyou". Now they email me addressing me as "fuck you. Have you seen.." :/

As an option I sometimes register an account with fake email like the one you can get from 10minutemail

Likewise. I originally downloaded GeForce experience when I was installing my drivers. When I noticed login was required I uninstalled and downloaded the drivers manually.

Probably many experienced users do that. Same with Windows since XP. Yet the companies don't learn, and they use such biased telemetry data for statistics and base decisions on it. I wonder if the Office and Windows UIs got worse and worse because of reading too much into telemetry data? The Windows Vista devs were pretty open about it in their official blog back then. And remember how it tanked, and that was a small thing compared to the big failure of the Win8/10 UI. Such companies need a CEO or CTO who tests their products themselves, like Steve Jobs and Bill Gates did; it worked fine as long as they were the leaders.

A sample size of 1-10 C-level executives isn't necessarily better than a sample size of hundreds of thousands of users who aren't intimately interested in their privacy. I think executives don't have the time to think about the multitude of possible workflows in their apps. On the other hand, developers are so intimately familiar with their apps that they have a hard time "not knowing" what to do.

You raise a good point about telemetry/analytics being biased away from people who disable those things, which can lead to companies unknowingly alienating that market. Another commenter on HN recently mentioned that most of their sales come from people with Google Analytics blocked.

I think most of the people who care about this sort of thing don't play games much anymore or are just used to it. Most games do something similar now.

I still use Win7, and usually deactivate analytics. I heard the Win10 store fails big time - have you heard the story that the new "Call of Duty" game cannot be played in multiplayer together with non-Win10-Store users? Mind you, the majority uses Steam. It's a sad trend. In general analytics are useful, but they should be opt-in. And usually crash reporting would be enough, right? But they are all abusing it by sending too much private data, and that's not okay at all.

"I heard Win10 store fails big time - have you heard the story of the new "Call of Duty" game cannot be played in multiplayer together with non-Win10Store users - mind you the majority uses Steam."

Man, that sounds like a breach of Magnuson-Moss, and anti-competitive at that.

Same. Hopefully by the time win 7 is not supported gaming on Linux will have total parity with windows.

I usually just download and install the latest driver with regular intervals. I don't choose to install/use GeForce Experience, and now that it requires login I certainly won't.

Why would I want to use this Experience thing anyway? It's crapware, right?

As for telemetry - as long as software only sends reasonable things (feature usage etc) and uses reasonable bandwidth, I'm completely fine. I honestly don't even mind if programs do it without asking and I think all apps should have feature statistics telemetry to be able to cut (or make more discoverable) features no one uses.

> Why would I want to use this Exprience thing anyway? It's crapware, right?

I use it for NVIDIA Share. It's my understanding that GPU-based alternatives can't compete because NVIDIA won't give them access to NVFBC and NVIFR. The creator of RivaTuner goes into more detail here:



Is the share function that much better than the built in win10 one? I quite like that one but the video quality is so-so.

Might just try Experience 2.x, which is still around. I don't mind telemetry; I do mind a mandatory Facebook login...

The problem is you'll never fully know what info they collect and send, and who might have access to it.

It is equivalent to installing spyware.

If a program isn't evil, then it does only things that shouldn't require permission. Example: include the OS version and RAM amount in the http request to the upgrade server that is done on every startup.
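As a sketch of the kind of minimal, non-identifying check described above - the endpoint and field names are hypothetical, and it only collects coarse system facts (RAM amount would need a third-party library like psutil, so it's omitted):

```python
import json
import platform

def build_update_check_payload():
    # Only coarse, non-identifying system facts; nothing user-specific.
    return {
        "os": platform.system(),           # e.g. "Windows" or "Linux"
        "os_version": platform.release(),
        "arch": platform.machine(),
    }

# This payload would accompany the normal update request; it contains
# nothing that matters no matter who sees it or what they do with it.
payload = json.dumps(build_update_check_payload())
```

That is the whole category of data the comment argues shouldn't need permission: anything beyond it should be asked for.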

Any info ever collected by no-permissions-asked telemetry must be such that it doesn't matter who has the information or what they do with it. If it isn't information of that kind then of course a program should ask permission. But that in my view isn't "telemetry" then. If it collects anything even remotely user-identifying or personal then it's in my definition not telemetry and should never be done without permission (if at all).

If a program really is malicious, then it doesn't matter if it asks for permission, because it wouldn't respect the user's wishes anyway.

My argument wasn't pro/against telemetry; it was that asking permission doesn't change anything. Permission isn't what tells malicious programs apart from others. A benign program doesn't do things that need permission to begin with.

One problem is that several seemingly innocuous pieces of data can be combined to create a unique ID. See the EFF's Panopticlick for an example.
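To illustrate the point: a handful of individually innocuous attributes, hashed together, can act as a near-unique ID (the attribute values here are made up):

```python
import hashlib

def fingerprint(attrs):
    # Canonicalize the attributes, then hash them into a short ID.
    canonical = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

user_a = {"gpu": "GTX 1080", "driver": "375.57", "screen": "2560x1440"}
user_b = {"gpu": "GTX 1080", "driver": "375.57", "screen": "1920x1080"}

# A single differing attribute is enough to separate the two users,
# even though no single field identifies anyone on its own.
assert fingerprint(user_a) != fingerprint(user_b)
```

With enough such fields, the combined fingerprint is effectively unique per machine.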

Yeah, if someone e.g. gets my complete hw fingerprint, I'm identifiable. But it's no worse than I'm already uniquely id'd by visiting pretty much any website today. I don't like it, but I also don't see a point in being outraged when my music player does it but not when my favorite website does it.

You seem very willing to just dish your data over to everyone - can I install some software on your computer, too?

Well if you make a piece of useful software I'll install it (I'd rather install it myself).

I'm not very generous with my data at all - but I'd rather trust my firewall to protect my data, than a dialog that asks me if I'm OK with sharing it. That was my point.

Is anyone going to packet capture it and find out?

Gamers, vocal? They are sheep; they accepted mouse drivers calling home to the mothership before (the only complaint was a bug causing mouse jerkiness when the connection went offline).

It's probably off-topic but I'm still curious... Which vendor's drivers that was?

Razer (http://arstechnica.com/gadgets/2012/11/why-the-hell-does-thi...)

I know Logitech's software also makes network calls but I never bothered looking what it is, I just block everything. At least it doesn't ask for a login, it can save locally or in the on-board memory.

Good. Telemetry should have been in these video drivers for crash reporting for a decade. It would have helped a ton with various video game crashes and the low quality of video drivers.

No, crash reporting is easily done with logs, preferably in some format you could redact any sensitive information from. This seems more like the "I will secretly phone home and not tell you about it or what I'm sending" kind of thing.
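A minimal sketch of that alternative - a structured local crash log whose sensitive fields can be redacted before the user chooses to submit it. The field names are illustrative, not any vendor's actual format:

```python
SENSITIVE_FIELDS = {"username", "hostname", "gpu_memory_dump"}

def redact(report):
    # Keep diagnostic fields, blank out anything user-identifying.
    return {k: ("<redacted>" if k in SENSITIVE_FIELDS else v)
            for k, v in report.items()}

crash = {
    "driver_version": "375.57",
    "exception": "DEVICE_REMOVED",
    "username": "alice",
    "gpu_memory_dump": "<raw buffer contents>",
}

# This is what the user would review (and approve) before anything is sent.
safe = redact(crash)
```

The technically useful fields survive; the personal ones never leave the machine.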

These are GPU drivers. It's not at all unreasonable that there may be something sensitive shown on the screen when a crash occurs. It might not even be shown on the screen but still present in GPU memory:


I would not be surprised if the telemetry included some parts of GPU state which could contain sensitive data.

> No, crash reporting is easily done with logs, preferably in some format you could redact any sensitive information from. This seems more like the "I will secretly phone home and not tell you about it or what I'm sending" kind of thing.

It's not terribly secret now, was it? It was... you know... immediately discovered, and I'm pretty sure my driver changelog AND firewall asked about it.

> These are GPU drivers. It's not at all unreasonable that there may be something sensitive shown on the screen when a crash occurs.

Pardon me, could you please explain exactly what in the telemetry or common crash logs might reflect "sensitive" information? You seem to me like you're arguing from some sort of grand final consequence, "Well I assume there is sensitive data here!" And while perhaps that's not an unreasonable default policy to take, you might want to state it as such rather than implying (as you have) that it's been observed already.

In general, telemetry doesn't include bulk memory dumps. The technology for collection strongly discourages this, as the endpoints collecting standard telemetry need to run at the scale of your customer base. I'd be much more concerned about sharing log dumps if you've filled your framebuffers with confidential information.

I think this is a case where we need to assume guilty until proven innocent.

You see, there's no way for users to know what data is being collected and sent today, or what they might change and decide to collect tomorrow.

What if government wants access to this data? What if some hacker gets access to the data or their methods of collecting it (MitM)?

As such, they need to prove we can trust them before we accept this at face value. They have not done so.

> As such, they need to prove we can trust them before we accept this at face value. They have not done so.

This is ultimately a trust relationship with your vendor. There is nothing they can do but be trustworthy.

Don't say "open sourcing." Open-sourcing code doesn't assert much of anything about the binaries you actually run; binary behavior with no trace in the source is 30-year-old technology (cf. Thompson's "Reflections on Trusting Trust").

I'm playing a game, I alt-tab out, game crashes, and literally anything can end up in the framebuffer related to a crash.

So apps shouldn't send any dumps/buffers as part of crash reports. They are still useful.

Also: crash reports should be accepted individually and not part of "telemetry" (anonymous usage and hw/os stats).

Yep, atm NVIDIA and AMD can only get display driver crash data directly from Microsoft. From what I've been told, not only do they have to pay for it, but while the data has statistical significance, it has very low technical value on its own.

So as far as things go now, what happens is: a new game is released, players with card X and configurations Y, N, P, and Z complain about driver crashes over reddit/forums, NVIDIA/AMD picks up on it and then starts trying to figure out what the hell is going on. Usually some initial mitigating actions are released within a few days, and within a week to a month a full driver update is released.

While this isn't the end of the world, it's annoying to have issues that prevent you from enjoying a game you paid $60 for, on a system you likely paid at least $1,000 for, if not 3-4 times that.

Fair enough, allow people to click "send to nvidia" upon an actual crash, and allow a permanent opt-out. Isn't this the way companies have been handling crash reports since... forever?

I agree that there should be an opt-out option (other than not installing GFE, though considering that GFE has always phoned home, I don't know if that is that important). Yes, in an ideal world people would opt in; the problem is that almost no one does.

Anyone who ever worked on a crash report system knows that opt-in rates are below 10% even for corporate clients. Heck if you are lucky you get single digit % figures on "send this report" even if the checkbox is ticked by default, the vast majority of people just hit cancel.

The stats are actually pretty darn interesting, especially for image quality vs fps. I had a chance to speak to a few reps from NVIDIA once, and they told me that as much as PC players bitch and moan about 60fps vsync, the vast majority of them would push settings at the expense of smooth(er) framerates, even if those settings have little to no effect on image quality.

Maybe opt-in rates are low because people don't want the data sent? Assuming they are wrong because it makes your job harder is a pretty self-serving deduction.

People aren't bothered about security or privacy, they just cannot be bothered.

Giving even the slightest incentive to send data brings those numbers up extensively even if what you get is meaningless.

Basically humans need a reason to tick a box.

This is why this is under GFE which gives you value added services.

>People aren't bothered about security or privacy, they just cannot be bothered.

Says who? You? Facebook? Microsoft? Google? Don't you see the inherent conflict in benefiting from that position and unilaterally declaring it to be so?

How many people do you think would be comfortable with, and explicitly approve, the kinds of "opt out" data collection that go on, if you showed them the true extent of how that data can be and is used, along with the impacts of it?

Frankly, fuck the attitude that you, or any other developer, knows more than me and decides that I "just cannot be bothered", especially when it's to their (often considerable) benefit.

> Says who? You? Facebook? Microsoft? Google?

I think you just answered your own question. "Says" the plurality or even majority of the human population who use services from the companies you just listed, despite constantly being under scrutiny for privacy concerns.

When using "people" in aggregate, this is a completely correct statement. It's why, for example, HN throws a fit[1] the moment a developer goes so far as to add anonymous Google Analytics to a package manager [2], even when that data couldn't possibly be used to harm them or track them in any way.

If you make $thing opt in, most people will not do $thing, regardless of what $thing is. Defaults matter.

[1]: https://news.ycombinator.com/item?id=11566720

[2]: https://github.com/Homebrew/brew/blob/master/docs/Analytics....

Google analytics. Anonymous. Couldn't possibly be used to track them.

You've either missed the last decade of privacy-related discussions or you're playing for team spyware yourself.

Companies like Google have billions, lawyers, lobbyists, sociologists, and every bloody specialist working for them to suck all the information out of everybody, and there's always some clueless person jumping to their defense with pointlessly pedantic arguments. Because we don't want to be unfair to Google or Nvidia.

It's not about missing the privacy discussion; it's about the public at large not giving a flying duck about it, regardless of what you think.

Pick a random person on the street and ask them. You need to understand that, by and large, simply by knowing about this site you are already part of a tiny subculture of the general population, and most likely living in a walled garden as far as your social connections go.

People don't opt in, but they don't care about their privacy either: just look at the number of people who will sign up for a mailing list/club benefits at a store they'll visit maybe once in their life, for $5-10 worth of discounts they'll never use - for that, they'll willingly give up a whole lot more personal information than GA or NVIDIA GFE collects.

This doesn't excuse the practices; it's just the reality we live in.

People can't be expected to understand all the subtle aspects of privacy, medicine, drugs, automobiles and many other things.

That's why there are laws which by default protect those people from the maliciousness, greed or incompetence of companies. And why the US needs strong privacy laws.

The fact that the masses don't understand something is irrelevant.

And what of the people that do "understand all the subtle aspects of privacy", and disagree with your conclusions?

With so many battles being lost, an apathetic or ignorant public, and few resources, those people just waste everyone's time and help abusers indirectly.

People don't appreciate the consequences of far away activities in general, until you make it explicit like the NSA segment on Last Week Tonight.

Even then they don't give a flying duckling. Have you seen that segment? Before the interview they asked people on the street, and they didn't know who the fuck Snowden was; people who watch LWT are already a tiny privileged portion of society.

FYI Same goes for Real Time, The Daily Show, and the Colbert Report.... You are literally preaching to the choir....

I'm talking about the end of the segment, where everyone is upset about the NSA having access to their private photos.

Or, I don't agree with this absurd definition of privacy that the tech sector has gotten itself enamored with, but okay, we'll go with the accusations of being a shill.

It doesn't matter if you are a shill or are doing it because you believe you're right. At the end of the day, you and other people who agree with such abuses for some subtle reason are all part of the problem. Hence playing for the opposite team.

You don't seem to understand that being reasonable, impartial and giving companies the benefit of doubt doesn't work when you are a mere flea opposing gigantic conglomerates.

Privacy is very simple. The person's data should be controlled by that person: they should know what data is stored on them, should be able to correct it if wrong or ask for it to be deleted. Any kind of data transmission should be opt-in.

Anything else is bullshit and goes against the interests of the customer. If not at the beginning, when they inevitably decide to monetize that data.
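Those principles can be sketched as an interface - purely illustrative, not any real product's API: the subject can inspect, correct, and delete their record, and transmission defaults to off until they explicitly opt in:

```python
class SubjectData:
    """Data about one person, controlled by that person."""

    def __init__(self):
        self._data = {}
        self.transmit_opt_in = False  # opt-in: off unless explicitly enabled

    def view(self):
        return dict(self._data)       # they can see what's stored on them

    def correct(self, key, value):
        self._data[key] = value       # they can fix what's wrong

    def delete(self):
        self._data.clear()            # they can ask for it to be deleted

record = SubjectData()
record.correct("email", "user@example.com")
record.delete()
```

Anything a vendor ships that can't support those three operations, plus the default-off flag, fails the definition above.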

So would it hurt them in some way if they would ask first?

Yes and no, while opt-in is important for privacy and other concerns the problem is that virtually no one opts in.

Doesn't that seem to indicate that people don't want it then? That sounds like an even stronger reason to have it opt-in.

The real issue here, though, is that it doesn't seem like they have an opt-out option.

There is: don't install GeForce Experience. GFE has always sent some data to NVIDIA; now they are also collecting driver telemetry.

GFE requires a separate installation and for you to sign up and log in.

This isn't a service required for you to get GPU drivers.

The only thing you lose is NVIDIA's screen capture software and game optimization (and some deals sometimes).

You can still use your card to the fullest without it.

Source: I have the latest drivers and don't have GFE, the telemetry software is part of the update core which is currently installed with GFE.

>This isn't a service required for you to get GPU drivers.

That is not true; beta drivers require the login. They are often needed for new games to run without crashing or bad performance.

Beta drivers can also be downloaded from the geforce.com website.

That is not really a valid "opt-out", since they heavily advertise GFE, and most of the people who have it (because they have to) have no idea what's happening. Even after this discovery, a ridiculously small number of people know that there is something shady going on.

This looks bad, it smells bad, and it probably is just bad for the customer - the person who has already paid for the damn product, and not for whatever happens in the background that I first have to look up somewhere on the internet. This behaviour towards customers is disgusting, and I really hope it'll backfire in a spectacular way one day.

They are sending way more than crash data, just like most of the programs whose makers claim "telemetry" is only there to fix bugs...


On crash reporting, I just submitted https://news.ycombinator.com/item?id=12883823 about a privacy-related feature invented in 2011.

Send an email to info@nvidia.com to let them know that you'd like them to change their policies regarding opt-in vs opt-out default settings.


Here is my email:

Dear Nvidia,

  I have been a life-long supporter since I was in college (14 years). I have recommended your products to many friends and purchased more than 15 of your graphics cards for my own computers.  I build servers and run a cloud storage business. My friends and family look to me for advice on their own purchases. I am your target market - a technology leader that makes recommendations to others.

  I have been extremely satisfied with your product for a long time and would like to be able to recommend your "issue-free" products to my friends, family, and associates. I'm a big fan of Nvidia.

  Unfortunately, you have recently enabled telemetry reports (https://news.ycombinator.com/item?id=12884762), and I will be less willing to recommend your product, opting for an AMD solution, or on-board solutions.

  To resolve this issue, please:
1: Use opt-in defaults instead of opt-out defaults for privacy-sensitive reports

2: Make a blog post publishing your public policy on prioritizing user privacy over other priorities.


On a more general note, privacy issues will be an increasingly important consideration for technology leaders before making recommendations. Microsoft made a mistake integrating privacy-invasive telemetry into Windows 10. Please don't make the same mistake. Nvidia needs leaders that will prioritize user privacy over other market concerns.

Thank you,


How about buying AMD gfx cards instead?

As a long time Nvidia user, I grew tired of Nvidia not releasing open drivers. At the same time, amdgpu + radeonsi + radv are constantly improving, so my next GPU is going to be AMD Vega.

In the words of Linus, "Nvidia, Fuck You"

Why, exactly?

Firstly, for all the previous things they have done: their lack of driver support for free operating systems (which is why Linus said this), their lack of Free Software drivers, and their tight lockdown on proprietary technologies like PhysX, which hurts portability and hurts non-Nvidia users. Finally, default collection of telemetry data is a highly unethical practice, and the inclusion of the extra code to do it can only hurt stability and performance and increase driver size. Go ahead and downvote me to hell, I'm not bothered; go ahead and buy Nvidia chips, I will avoid Nvidia like the plague till the day I die.

> Firstly for all the previous things they have done, for example their lack of driver support for free operating systems which is why Linus said this

It is not Nvidia's job to prop up the desktop Linux ecosystem. It's wise not to invest time in that ecosystem, as far as I can tell: it's got an ongoing history of disappointing its customers and vendors over and over. And they provide good enough computational support for every business or individual out there; Nvidia GPUs are still preferred for clustering.

> I will avoid Nvida like the plague till the day I die.

The way you talk makes it sound like it's a religious or political debate. You're mad that they don't support your demagogue or your shared values. Isn't Hacker News a place where we're supposed to keep politics at bay and focus on the technology?

NVIDIA could release documentation on their hardware that would make it easier for Nouveau developers to do their work, but they don't. Nobody is asking for free labor on a low-marketshare OS.

NVIDIA's own Linux drivers are excellent, considerably better than AMD's (a reversal from the days of ATI), but they are closed source.

NVIDIA releases Linux drivers hand in hand with their Windows-based ones; you get at least one update every month.

So while it's true that you can't have a completely open source driver for NVIDIA you do have excellent Linux drivers.

On the compute side NVIDIA's support for Linux is also unmatched, in fact for many things CUDA actually works better on Linux than on Windows.

Unfortunately, Nvidia's proprietary drivers are excellent for features where the puck has been, and not-so-excellent for features where the puck is going. See also Wayland support or switchable GPUs.

I've been hearing about Wayland for a while; wake me up when it's actually here. Also, there is no longer a switchable-GPU problem specific to nVIDIA: the Optimus chip is no longer used (and hasn't been for a while) since Intel's Iris came out. It powered a single generation of mobile GPUs and was a hack even on Windows.

Wayland is actually here; it has been shipping with Fedora for some time, and in Fedora 25 (released in less than two weeks) it is going to be the default.

There is still a problem with switchable GPUs: namely those where one GPU is using Mesa and the other is not (i.e. Nvidia). Those computers didn't disappear from the face of the Earth; people are still using them and expect them to work, even if there are newer laptops available on the market. It is being worked on (https://blogs.gnome.org/uraeus/2016/11/01/discrete-graphics-...).

Because of Mesa; nvidia tried to make Optimus work on Linux back when it was supported, and couldn't.

Intel doesn't support GPU switching in its Mesa drivers; it requires the proprietary one.

Get the proprietary nvidia and intel drivers, get nvidia-prime, and you are golden.

The same issue exists with AMD GPUs on Intel CPUs, especially the PowerXpress ones.

Intel doesn't have a proprietary driver for GPUs for Linux.

They do have binary firmware for Broxton and newer; however, that's a thing that runs on the GPU. It does not have anything to do with how Mesa or another OpenGL stack shares buffers and video output devices.

Sorry, that's what I meant; in Ubuntu it's listed under the display driver, they call it microcode.

> Also there is no problem with switchable GPU's which is specific to nVIDIA anymore the Optimus chip is no longer used

Do you have a source for this? In any case, just because some hardware has been recently phased out doesn't mean the manufacturer shouldn't be expected to support it. Not everyone has enough money to buy a new computer every year; I'm a student with an Optimus laptop and I know I won't be able to afford a replacement in the next year or two.

If NVIDIA can't be asked to fully support their hardware on Linux, the least they can do is release some documentation so the Nouveau devs don't need to grope in the dark.

>If NVIDIA can't be asked to fully support their hardware on Linux, the least they can do is release some documentation so the Nouveau devs don't need to grope in the dark.

I don't disagree in principle, but they've killed it, releasing information to make it work might be a problem for them, and since this thing is no longer supported adding compatibility for older devices isn't a high priority for them.

It worked poorly on many Windows machines, it effectively still doesn't work on Windows 10 [0], and it's been effectively dead since 2014.


It's also worth noting that NVIDIA couldn't get the darn thing to work on Linux, and that AMD also had/has similar issues with PowerXpress on non-Iris Intel CPUs; it does work somewhat well on AMD's own CPUs with integrated graphics and its more modern "APUs".

>Do you have a source for this?

Source for what? With Iris, the Intel CPU handles the switching between GPUs. Not that it works that well, but it doesn't work well on most OSes other than Windows, mostly due to how the display driver infrastructure is set up and how the composition manager works (even on OSX you're still likely to want to use gpu-switch rather than rely on the OS).

If you want to blame someone, blame the OS, tbh. If OSX and Linux had proper user-mode display drivers like WDDM and actually had a composition manager capable of using multiple GPUs, then there might be better solutions for this.

But overall it works. It does require you to use the proprietary drivers from both NVIDIA and Intel, but it works; at least on Ubuntu I never had issues following a guide similar to this: http://ubuntuhandbook.org/index.php/2016/04/switch-intel-nvi... (reading the comments on that guide, it seems to be missing an important step, which is apt-get install nvidia-prime). So this just works...

To switch between the two on demand, you can simply use these commands:

  prime-select intel
  prime-select nvidia

On other distros (or if you are not using nvidia-prime) you might need to use /proc/acpi/bbswitch and modprobe/rmmod to do the switching; there are ready-to-use bash scripts available online, as long as you have the proper drivers installed.

P.S. Sorta OT but relevant: http://i.imgur.com/Ql1dsZC.jpg

EDIT: Since it's been too long: bbswitch = Bumblebee (which NVIDIA actually helps with), the open-source implementation of PRIME; both also support the older Optimus laptops, but ymmv with BB.

Also it looks like PRIME supports synchronization now if you have a distro with DRM support (no not the bad kind ;) https://devtalk.nvidia.com/default/topic/957814/prime-and-pr... so if you are using their currently suggested setup you don't need to switch manually anymore.

Prime and Bumblebee are totally different from Nvidia Optimus. They have to render to Nvidia's GFX framebuffer and then copy it to Intel's GFX display framebuffer.

This is much more expensive than hardware implementation of Nvidia Optimus.

No they aren't; PRIME and BB now support synchronisation, which is how it also works on Windows.

It's more expensive on Linux because DRM on Linux is broken. There is no longer an "Optimus" since NVIDIA no longer uses a dedicated chip; you share a framebuffer with the IGP.

PRIME also supports the older Optimus chip laptops, but again this doesn't work that well, but then again these don't work well with Windows 10 either due to the changes to WDDM.

Unlikely to happen without major business motivation/incentives. Right now, outside that small community, nobody really cares whether it's open source, as long as it works well, and it does.

You are fooling yourself if you think technology can be separated from politics.

Personally I think that the HN response of, "We flag politics" is really just a blind for not discussing sensitive topics like racism, sexism, and human rights in a larger sense.

But that won't stop me from trying to fairly apply the policy to the free software people who are utterly convinced that serving their agenda is a net good rather than a lifestyle choice.

Why isn't free software a net good?

Some background. https://www.youtube.com/watch?v=IVpOyKCNZYw

Shouldn't be a problem anymore under Linux as most distros today install Nouveau drivers by default. https://nouveau.freedesktop.org/wiki/

Nouveau has no CUDA support, unfortunately.

CUDA is proprietary to Nvidia.

CUDA only exists because Nvidia is attempting to pretend OpenCL, Vulkan, and DX12 don't exist [1]. These require hardware scheduling on the GPU to switch shaders, rather than dedicating X amount of chip hardware to Y shader for Z ms.

It should be noted that for GPGPU compute, Nvidia is not the correct choice. The AMD RX 480 has 5.8 TFLOPS @ $200 (~$34/TFLOP) vs the Nvidia GTX 1080's 8.9 TFLOPS @ $600 (~$67/TFLOP). In reality you should be doing GPU programming in OpenCL so you are GPU agnostic. You can switch vendors or platforms seamlessly (in most cases, if you avoid proprietary extensions) and even target AMD64, ARM, and POWER8/9 hardware.

That being said, I own a boatload of Nvidia stock because their marketing is excellent. Really, marketing is all 80% of people pay attention to, and CUDA has some great marketing around it. In reality CUDA is slower than OpenCL (even on Nvidia's platforms) and no easier to work in.

[1] https://postimg.org/image/vsnidk8p5/

It's worth pointing out that ROCm is basically AMD's answer to CUDA. Similar programming model and everything.

Let's hope it gets picked up by machine learning frameworks etc., because this market badly needs the competition, as your comparison of per-dollar raw performance numbers shows.

I agree with your point about avoiding vendor lockins, something I experienced for myself with MATLAB. I also happened to buy a RX 480 recently, so I'm happy to hear it's good for GPGPU.

But I'm curious how the FLOPS on these cards were measured. For example, one concern I have is that presumably these two cards have slightly different levels of parallelism, so it may be more or less difficult to extract the full performance from a particular card due to parallelism overhead. Then there's driver overhead, ease of programming, etc.

FLOPS is always calculated via the simple formula

      FLOPS = F * Hz * 2

Where F is the # of FPU front ends (SIMD and scalar). This is wrong because scalar math is often slower than SIMD, and compute kernels rarely run on the scalar pipeline.

Where Hz is, well... the clock rate, i.e. cycles per second. This is wrong because stalls happen: memory transfers, cache misses, etc. It is also wrong because the clock rate is throttled, so you are not always at the maximum boost clock.

Then multiply by 2 for FMA (fused multiply-add). This is wrong because not every operation is a one-cycle FMA; division can take many cycles (>100). Also, scalar pipelines don't have FMA.

Ultimately all vendors use the same crappy calculation, so we are comparing apples to apples. Just rotten apples to rotten apples. It gives you an ideal-circumstance ceiling you can optimize towards but never actually attain.
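To make the arithmetic concrete, here is the formula applied to the two cards discussed above (shader counts and boost clocks are the vendors' published specs, not something measured here):

```python
# Theoretical peak FLOPS = (# of FP32 lanes) * clock * 2, where the factor
# of 2 assumes every cycle retires a fused multiply-add (two ops).
def peak_tflops(fp32_lanes, boost_clock_ghz):
    # lanes * GHz * 2 gives GFLOPS; divide by 1000 for TFLOPS
    return fp32_lanes * boost_clock_ghz * 2 / 1000.0

rx480 = peak_tflops(2304, 1.266)    # 2304 stream processors @ 1266 MHz boost
gtx1080 = peak_tflops(2560, 1.733)  # 2560 CUDA cores @ 1733 MHz boost
print(f"RX 480:   {rx480:.1f} TFLOPS")   # ~5.8
print(f"GTX 1080: {gtx1080:.1f} TFLOPS") # ~8.9
```

Which reproduces the 5.8 and 8.9 TFLOPS figures quoted upthread, and shows exactly which ideal assumptions (full occupancy, sustained boost clock, all-FMA workload) are baked in.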

There is a difference in job scheduling between AMD and Nvidia, so if you want to optimize your OpenCL applications you can either do it for only one of them or do it twice.

Same applies to integer math, long double math, and so on.

Do the power consumption numbers cancel out the up-front price advantage?

The RX 480 consumes less power than the GTX 1080, so the power numbers amplify the initial price advantage.

Is that total power, or power per GFLOP?

Are you seriously asking me to do division for you? Do you own a calculator, cellphone or computer? Or are you actually that helpless?

AMD RX 480: 5.8 TFLOPS @ $200 (~$34/TFLOP) @ 231 W peak (~25 GFLOPS/Watt)

Nvidia GTX 1080: 8.9 TFLOPS @ $600 (~$67/TFLOP) @ 318 W peak (~28 GFLOPS/Watt)

See Furmark benchmark for wattage values [1]

Typical electricity cost in the US is $0.12/kWh [2]. So the delta-cost of the GTX 1080 vs the RX 480 would be mitigated by the GFLOPS/Watt efficiency savings only after 4 years, 4 months, which means that on a typical 2-, 3-, or even 4-year hardware replacement cycle the extra cost will NEVER be recouped.

[1] http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1... for load wattage numbers

[2] http://www.npr.org/sections/money/2011/10/27/141766341/the-p...
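Working from the quoted figures, the per-dollar and per-watt numbers come out as follows (a quick sketch; prices and Furmark wattages are taken from the comment above, not independently verified):

```python
# Derived metrics from the figures quoted above: MSRP in USD,
# theoretical peak TFLOPS, and Furmark load wattage.
cards = {
    "RX 480":   {"price": 200, "tflops": 5.8, "watts": 231},
    "GTX 1080": {"price": 600, "tflops": 8.9, "watts": 318},
}

for name, c in cards.items():
    usd_per_tflop = c["price"] / c["tflops"]
    gflops_per_watt = c["tflops"] * 1000 / c["watts"]
    print(f"{name}: ${usd_per_tflop:.0f}/TFLOP, {gflops_per_watt:.0f} GFLOPS/W")
```

The GTX 1080 wins slightly on GFLOPS per watt, but the RX 480 wins by nearly 2x on GFLOPS per dollar, which is why the $400 price delta takes years of electricity savings to pay back.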

Thanks for the info.

It can't, Nouveau is not developed by Nvidia.

Nouveau has no reclocking, so it's practically useless as is. If you want a working open driver - use AMD.

Nvidia do not want to support linux nor themselves or let anyone else do it.

They've had the best 3D drivers on Linux for years. There were issues with Optimus and Ion but IIRC a lot of it was legal and issues with Linux's driver infra back then meant it required far more effort than it was worth at the time.

Their drivers might work, but saying to users "here, load this giant blob with a history of privesc bugs into your kernel" isn't what I'd call "supporting Linux".

Meanwhile, AMD are active contributors to Mesa and DRM, which benefits everyone.

I wish they'd open things up too, but it's their prerogative, and Linux represents a minuscule share of their market. The fact that they support it in the way they do today is surprising.

The article tries to get you to download "autoruns.zip" from their site. That's suspicious. But it seems to be OK. The official Microsoft version is at [1] and the ZIP files compare equal.

[1] https://technet.microsoft.com/en-us/sysinternals/bb963902.as...

If you know the sysinternals tool you want, you can grab it directly from live.sysinternals.com, which is an official microsoft site. e.g. http://live.sysinternals.com/autoruns.exe

If I find any software defaulting to opt me in, without asking, it gets turned off and stays off. If it asks, I almost always permit it. When I have customers ask me about the same (when prompted by their software) they tend to agree with my thinking.

There is an exception though - since Windows 10's enforced telemetry, I have turned off all of the telemetry for all of their software across the board. Until they start to conduct themselves respectably again, they can do without my drop in their ocean.

Go a step further and abandon their platform. Don't financially support their behavior. Convince your clients to do the same.

In NVIDIA's defense... this is all optional. You can still install the drivers with control panel without any telemetry or "GeForce Experience".

I recently switched to AMD because I bought a FreeSync monitor and I'd been meaning to upgrade my card anyway. Looks like I did it just in the nick of time.

Please AMD make competitive products to keep Intel and nvidia honest!

The 480 is really competitive right now, especially when you can get one for ~$150 with discounts on sale in the last couple weeks.

Hopefully Zen can compete with Intel, at least at the $200 price point.

I just switched to a geforce 8400gs using the nouveau drivers on GNU/Linux. I'm not big into graphically intense applications and I don't have to worry about my graphics card waking up and "phoning home." :)

Ever since AMD put in the effort into open source Linux drivers, I've been only buying Radeon GPUs. This is a reminder of why I don't want to rely on proprietary drivers.

Imagine that five years from now, RISC-V or POWER8 have become established as free core-CPU platforms. Is there anything equivalent coming through in the GPU space?

Hopefully POWER9 by then.

It would be good to know if this is the geforce experience of the actual driver, and what exactly its doing. The article seems to be light on details.

Even if we outlaw phone-home for information-gathering, automatic updates have to upload information about your architecture in order to download binaries.

I don't think anybody is suggesting at this point we ditch automatic updates -- the consensus seems to be they fix more problems than they cause. So this is going to remain a problem.

I'm not bothered about the privacy so much as the bloat (or rather - I trust that those more vigilant than me will warn me if the privacy issues are more than theoretical - lazy I know).

The bloat is endemic to hardware companies. Is there some law of nature that says if you primarily build peripherals then you write terrible software?

What we actually need is good _independent_ firewall vendors.

It is not enough to focus on the telemetry of giant corporations like NVIDIA or Microsoft while forgetting about all the P2P software being installed by game vendors and the "telemetry" of smaller software vendors.

On big computers/PCs the default mode makes the user give up too much control _forever_ once the software has been installed. Most software only needs to be doing anything when you're actually using it.

What we need is not opt-in checkboxes from vendors; what we need is for operating-system-level software to be better, so that our explicit permission is required to "allow" some kind of activity like transmitting over the network or detecting my location.

Any idea if the Linux drivers do this as well?

Geforce experience is not available for linux.

I used to just use the Windows "Update driver" dialog to manually find nvlddmkm.sys and install it that way to avoid the bloat. I haven't used Windows in a while but it may still be possible.

Can it be avoided by using the "advanced" option in the setup wizard and deselecting everything but the graphics driver (and the physics engine and the sound driver)?

Great article: highlights a problem, shows a solution. Autoruns is a very useful program.

What's "GeForce Experience", "Wireless Controller" and "ShadowPlay services"?

Do recent drivers always include the latter? How do I check for them? Are they kernel modules?

In my case, all the nvidia drivers I see loaded are:

    $ lsmod | grep nv
    nvidia_drm             20480  1
    drm                   294912  3 nvidia_drm
    nvidia_uvm            704512  0
    nvidia_modeset        770048  3 nvidia_drm
    nvidia              11866112  42 nvidia_modeset,nvidia_uvm

GeForce Experience is a separate application that automatically updates the drivers for you. ShadowPlay is their hardware-accelerated video capture software. They are optional.

I thought it was pretty obvious it is about the Windows drivers.

This behaviour is not acceptable.

I'm surprised by the backlash against telemetry on HN. How else are you supposed to improve the reliability of software used on tens of millions of devices with a near-infinite number of hardware and software permutations?

The telemetry data isn't yours. It belongs to the user and exfiltrating it without user's permissions or even knowledge puts your software at the spyware level.

Do you really want your software to be considered spyware?

When such telemetry exposes the names of programs you are running and is sent unencrypted, it's a big deal. Coupled with very specific hardware information, this could easily be used to track TOR users for example.

Doubt it's tracking all applications. Probably only GPU heavy ones like games, CAD and creative tools like Photoshop and Premiere.

Except to make that kind of statement you need to back it up with evidence. From one of the other articles it looks like they now use TLS so you'll need your own MITM cert and Wireshark.

By at least allowing the user to opt out.

> By at least allowing the user to opt out.

Rather: by requiring the user to opt in.

Don't get me wrong--I'd much prefer opt-in. I don't think that's realistic though given their leverage and what they are trying (in theory) to do with this data.

However to not provide an opt-out is a slap in the face to those who know enough to care. And it is an increasingly common trend that is very concerning.

And yet, people do.

