Edit: I hated investing in anti-stuff.
That and Revo Uninstaller. You run Revo Uninstaller and then CCleaner, maybe with a reboot in between, and you've really removed an app. (The paid version of Revo is even cooler: it lets you log the files generated during an install, and then on uninstall it makes sure to remove all the files and directories you previously logged.)
In addition, this also gives you granular privacy control.
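A minimal sketch of that "log the install" idea (hypothetical, not Revo's actual implementation): snapshot the filesystem before and after an install, and the difference is exactly what a logged uninstall would later remove.

```python
import os, tempfile

def snapshot(root):
    """Record every file path under root -- the 'log the install' step."""
    return {os.path.join(d, f) for d, _, files in os.walk(root) for f in files}

# Demo with a temp directory standing in for the filesystem:
root = tempfile.mkdtemp()
before = snapshot(root)                               # state before the "install"
open(os.path.join(root, "newapp.dll"), "w").close()   # the "install" drops a file
after = snapshot(root)
# The difference is what a logged uninstall would remove:
print({os.path.basename(p) for p in after - before})  # {'newapp.dll'}
```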
Also, quick question: are Puppet and Ansible "factory produced" sysadmin?
No idea how much it catches vs. leaves behind, and since I turned it on it hasn't thrown up any obvious notifications lauding itself.
This is on a secondary machine. My primary machine isn't Windows.
But Windows 10 (a few older versions as well, but somewhat hidden) has something similar called "Disk Cleanup" (Win key > enter "Disk Cleanup" > "Clean up system files" > Check all marks). "Disk Cleanup" removes stuff that CCleaner doesn't, so I use them together.
You can even remove system restore and shadow copies (Win key > enter "Disk Cleanup" > "Clean up system files" > "More Options" tab > "System Restore and Shadow Copies" > "Clean up" button).
And vice versa if you use CCleaner combined with CCEnhancer
I never said that.
> Defender will not do this.
I know what it does.
Well, things have been like that for a long time already. E.g., Thunderbyte used an ISA card in the early '90s to insert itself before, and outside of, the operating system.
Old anti-virus programs also used Terminate and Stay Resident (TSR) techniques.
Surprise, surprise, now there is more attack surface available since AV is kernel level. And so now we see attacks getting root or kernel level access via AV vulnerabilities.
Yes it was. After using it a couple years, when I decided to get it off my system, it was far from easy. It simply didn't go. Because of the trouble it was giving me, I thought I had no choice but to renew my subscription. It was at this point I started looking for an OS alternative.
There are really only two things keeping me on Windows.
1. I run some Windows-only applications that don't work with Wine.
2. A lot of Linux software doesn't work well on hi-DPI monitors.
As for running windows apps....how much time do you want to waste? The only windows stuff I really run is games, and the way I do that is by running windows in a VM with the graphics card direct assigned. It works really well, and is super neat, but it was a huge pain in the ass to set up on consumer grade hardware.
As someone who is in the same boat but hasn't discovered Linux yet, how hard would you say it would be to install a version of Linux that supports 3 monitors (using an onboard ATI card and a PCIe card)? Last time I tried installing it, this is the part where I gave up.
Also, as someone who has always used Windows, I find the graphics in Ubuntu lacking some finesse; e.g., the scrollbars and window panes look like they were created using Java applets (granted, it's a matter of taste and personal choice). What is the most professional-looking desktop to try right now?
Sorry, I know this is off-topic, but I too really want to switch from Windows and don't want to waste money on MacBooks, so any advice would be appreciated.
People generally don't fault MacOS for not supporting whatever hardware they bodged together, yet they expect Linux to work with anything because hackers and magic. I guess that's a good reputation to have but it's not very useful if you want something that just works.
In the case of Linux, what just works is generally the drivers that are built in to the system, with vendors that take an active part in Linux development. And since Intel started to make both network and graphics cards, that's usually your best bet. ATI is generally usable too, but performance and power consumption generally lag their Windows equivalent drivers.
With that said, it's quite easy to test. Download the latest Fedora (or Ubuntu) live DVD and boot it. If that works with your display, a native install is going to work too.
Just don't install Linux as an almost-Windows or cheap-MacOS. If you want Windows software then that's what you should run it on. Run Linux because you want a UNIX desktop and the kind of software that goes with it (gcc, bash, rsync, native TCP/IP utilities and those things).
With regards to your question about triple monitors. I mean, I managed to get triple monitors working on FreeBSD without much fuss so I am not sure of how much help I can be when I say "it should just work". I am obviously biased.
But it's likely that it will just work; the open-source drivers for AMD are significantly better than the open-source drivers for nVidia. (Although I only run nVidia cards myself, and I have two setups with triple monitors.)
The tradeoff is that the CPU half of AMD's processors is kinda bad - they are all derived from either Jaguar laptop processors or Bulldozer, they desperately need to be refreshed with Ryzen. So Intel has the better CPU but a mediocre GPU (unless you splash out for Crystalwell), and AMD has mediocre CPUs but a good GPU...
AMD's non-APU processors (FX, Ryzen, TR, Epyc) do not have iGPUs however, same goes for Intel's lines of server chips (X99/X299/etc). This is something that is only in "client" processors.
As a data point, the development desktop I'm using runs 3 monitors on Linux (Fedora 25). They're hooked into a single AMD R9 390 graphics card (overkill, but it was hanging around spare when putting together the pc).
The monitors are 1x Dell 30" in the middle, with 2x ASUS 24" monitors (one each side). No HiDPI stuff, and everything works fairly well. Using the "Xfce" spin of Fedora.
Didn't need to do any complicated setup. The monitors were all detected fine without mucking around, and I just needed to drag them into position in the Xfce4 gui tool so it knew which one was on the left, middle, right.
My advice echoes others - you can try before you buy (install) with a few flavors of Linux; see what happens. If you have trouble, try to find a fellow human who can help - X is a little surprising to configure, if you haven't done it before.
But frankly I find the Gnome-Shell desktop to be beautiful. Maybe try Ubuntu-Gnome? 
That being said, the majority of UI animations in Linux are like that, so meh.
Stay away from Fedora and Arch unless you don't mind getting into some setup files. Personally, I find it fun getting things set up, so I enjoy Arch and Fedora, and sometimes they are just as quick and simple as openSUSE and Ubuntu.
Just my honest opinion after 15 years of continuous use in university and work.
XKCD still relevant today: https://xkcd.com/456/
I think for a lot of people who just want to browse the Web and watch movies, Linux is more than up to the task. As for an office suite, recently even LibreOffice has been working fine for my simple word-processing and spreadsheet needs.
The usability of Linux has improved dramatically over the past 10 years.
Ironically, Canonical is making money mainly with cloud services. Meaning servers - what Linux is good at.
and I wish this to be true for any *nix system.
The ability to venture down the rabbit hole as far as you want, without black boxes.
I think a lot of us sit here today because of that privilege.
Trusting old memories of what's good or not can be dangerous. I remember getting caught out once by CoreTemp which suddenly packed a load of dung in its default installer. :(
Sourceforge - You will never find a more wretched hive of scum and villainy.
Sourceforge is respectable again! (hopefully?)
When pigs fly. I cringe every time I encounter a project hosted on SF and have to play their stupid find the download link game.
So take that as you will. That tells me that SourceForge is probably bound to degrade some time soon.
Apart from that, I just use Defender.
I worked in the anti-malware industry for 5 years. Pretty much across the board, this kind of software does more harm than good. AV on corporate email servers or your email provider makes sense. Rooting your own computer and leaving it vulnerable to shitty AV/AM vendor's software doesn't.
Literally saw malware in the wild that exploited AV software.
On our side (developers) we need to be careful with this idea that "we will know" when something is wrong and be more careful when deploying software. It would also be nice if some form of tool could be used to test a binary to make sure it only contains what it should contain (sort of a whitelist of symbol names compared to the source files, idk...) I'm sure something along these lines probably exists for some different purpose.
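A toy sketch of that whitelist idea (the function, symbol names, and sample output here are made up for illustration): parse `nm`-style output from the built binary and flag any exported symbol that isn't on the expected list.

```python
# EXPECTED is a hypothetical whitelist of symbols the build should export.
EXPECTED = {"main", "parse_args", "cleanup"}

def unexpected_symbols(nm_output, whitelist):
    """Return symbols present in the binary but absent from the whitelist.
    nm_output is text in the shape of `nm -D --defined-only <binary>`."""
    actual = set()
    for line in nm_output.splitlines():
        parts = line.split()
        if len(parts) == 3:  # address, type, name
            actual.add(parts[2])
    return actual - whitelist

sample = ("0000000000001139 T main\n"
          "0000000000001180 T parse_args\n"
          "00000000000011c0 T phone_home")
print(unexpected_symbols(sample, EXPECTED))  # {'phone_home'}
```

Of course, real malware can hide behind innocuous names, so this would only catch the clumsy cases.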
Another important point is that distributions have already solved effectively all of these problems. We have automated building and signing systems that mean that installation and upgrades are done in ways that are not vulnerable even to fairly sophisticated attacks. You can build packages locally if you want to verify them, and modifying a package after it has been built invalidates the signature that all modern package managers require before installing a package. As part of the openSUSE project we even have a free-to-use (and free as in freedom) build project called the Open Build Service which allows you to build packages (with automated dependency update rebuilds) for many different distributions (Arch Linux, Debian, Ubuntu, Fedora, RHEL and obviously openSUSE and SLES).
I get that distributions aren't "sexy" but it's getting quite frustrating seeing all these communities make the same mistakes that distributions made (and learned from) more than 20 years ago.
[Disclaimer: I work for SUSE and am an openSUSE community member.]
I agree that doing everything I mentioned manually is hard, that's again why I said that distributions have solved this problem and made it easy.
Also that protects you against malware injecting binaries in an executable when compiling it, but not from malware injecting code into the source code of the executable.
Publisher. A user could (if they were really paranoid) rebuild the packages locally, with two or three commands.
> Also that protects you against malware injecting binaries in an executable when compiling it, but not from malware injecting code into the source code of the executable.
You can download the source code that OBS used (both as a src RPM generated by the builder and the OBS repo that the builder was given read-only access to), and OBS supports cryptographic signatures of the originating source (with gpg-offline keys to avoid WoT attacks). If your developers are using sane source control practices (use GPG keys for every commit, but especially tags) then you are protected against that too.
Of course, reproducible builds is something that would solve this problem even better (protecting against attacks on OBS that cause it to add source that are not in the repo). As a side point, our threat model doesn't fully trust the nodes compiling the software so such attacks are fairly limited in scope (but I'm not a developer of OBS so I'm really not the right person to be asked these questions).
Many things could go wrong with this (mainly attacks on the expected results db, it should be replicated) but the idea should work.
- (N-1) machines build a new Debian package.
- The (N-1) results are fetched by the Nth party.
- The Nth party checks whether the build results are identical.
- The Nth party signs the result with GPG on a machine that is not connected.
- The signed package is distributed.
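The Nth party's identity check in the steps above can be sketched as follows (hypothetical helper, assuming the build artifacts are compared by SHA-256 digest):

```python
import hashlib

def identical_builds(artifacts):
    """The Nth party's check: all independently built packages must hash identically."""
    digests = {hashlib.sha256(a).hexdigest() for a in artifacts}
    return len(digests) == 1

builds = [b"deb-contents"] * 3                    # three (N-1) build results
print(identical_builds(builds))                   # True: safe to sign
print(identical_builds(builds + [b"tampered"]))   # False: refuse to sign
```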
An attacker would need to compromise either: (1) N-1 build machines; (2) the offline machine used for signing packages; or (3) the upstream source.
I consider (3) to be the most serious threat. But in this scenario, only the distribution packagers need to inspect the source changes. In the common scenario where you download from a vendor (e.g. the Transmission or CCleaner website), every user has to inspect a binary blob.
Another possibility would be to have strong sandboxing for applications. An application could still participate in DDoS attacks, etc. But it would at least not encrypt/destroy your data.
tl;dr: reproducible builds are an extremely important development and Debian (and other distributions participating in this initiative) should be commended for their work!
Are there any distros out there now that are source-based like Gentoo but with more friendly defaults?
And reproducible builds are a work in progress.
Here are a few other cases.
What makes this scary is that, as far as I know, pretty much no software has that kind of security, and there are several pieces of widely used software that always update automatically (sometimes for good reason, sometimes not so much).
I think that there needs to be a more complete solution than just "secure the developers machines". You need to have peer-review, where the developers sign commits to approve them.
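One way to sketch that "developers sign commits" gate (a made-up helper; it consumes the output of `git log --format=%G?`, where `G` marks a commit with a valid GPG signature):

```python
def all_signed(gitlog_flags):
    """gitlog_flags: output of `git log --format=%G?`, one flag per commit.
    'G' means a good (valid) GPG signature; anything else fails the gate."""
    flags = gitlog_flags.split()
    return bool(flags) and all(f == "G" for f in flags)

print(all_signed("G\nG\nG"))  # True: every commit carries a valid signature
print(all_signed("G\nN\nG"))  # False: one commit is unsigned
```

A CI hook could run this over the merge range and reject pushes containing unsigned commits.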
Isn't this pretty much mandatory with SOX compliance anyway?
I suggest you use something like "Little Snitch" for mac which warns you when software makes inside/outside connections.
It might not be the best, but it's definitely something that works to mitigate some hacks.
At least a binary check after compilation+signing (by the developer) should improve security a little bit.
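A minimal sketch of that post-compilation check (hypothetical names; in practice you'd publish a detached GPG signature rather than a bare hash): the developer publishes a digest of the build they signed, and users verify their download against it.

```python
import hashlib

def publish_digest(binary):
    """Developer side: publish the SHA-256 of the build they signed off on."""
    return hashlib.sha256(binary).hexdigest()

def verify_download(binary, published):
    """User side: the downloaded binary must match the published digest."""
    return hashlib.sha256(binary).hexdigest() == published

official = b"ccleaner-setup-bytes"   # stand-in for the real installer bytes
digest = publish_digest(official)
print(verify_download(official, digest))         # True
print(verify_download(b"trojanized", digest))    # False
```

Note this wouldn't have caught the CCleaner case by itself, since the build was compromised before signing; it only helps against tampering after release.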
You can think of it as a firewall for your filesystem and devices.
Could you please elaborate? I only recall one instance of compromised Transmission installer.
With Google’s recent actions, this makes Chrome literally into malware that your system’s AV should automatically detect and remove. If a significant amount of users gets the software without intending to, the install was malicious, and should be removed.
Why is that is my question.
Even if it is explicit, the idea is to have it enabled by default so people accidentally install it and then, hopefully, start using it.
Which browser dominates should be based on its quality not how much money you spend on it :/
The thing with Firefox is that if it's on your computer, it's most likely because you installed it yourself.
Fortunately I don't think I've updated the program in 2-3 years, so it probably doesn't have any malware in it, but still, rather scary to think that what used to be a daily program for me is now infected.
Which reminds me, I probably need to call my dad and anyone else I installed that for...
IIRC Windows Store apps don't run in the same way as regular Windows applications, they run in a sandbox.
It's 2017, how is this still a thing?
That sort of maintenance seems like it's the result of poor design in an OS that has the hood welded shut.
I actually used CCleaner on Win 10 recently: an MS update had associated loads of files with TWINUI, which wasn't installed, making things like viewing images impossible. CCleaner found on the order of thousands of stale entries, removed them, and made a backup. It also let me simply check and disable startup programs; I don't think Win 10 has a way to do that in the user UI?
Windows 10 can already do this by default.
> remove registry entries for software that's no longer installed.
Uninstalling old-style Windows software is a hard problem that generally can't be done in a foolproof way without potentially breaking stuff, due to bad legacy design decisions like giving programs free rein to install stuff wherever they want, without a proper application model or dependency tracking. Therefore, registry cleaners are also prone to break things.
> That sort of maintenance seems like it's the result of poor design in an OS that has the hood welded shut.
That's what UWP solves.
> It also let me simply check and disable startup programs
This is already in the Windows 10 UI.
Plenty of examples of native apps doing something that third-party apps do much better.
No, sorry, you don't get to take up 30 GB of valuable SSD space for some unspecified stuff that I might need later because removing it is "not supported."
If you type "startup" or "autoruns" or similar, does Win10 suggest Task Manager? I would never have thought to look there ... will try next time!
Did you read the article? It can happen to any company. It just so happens they targeted a very popular downloaded application. Who knows what other software installers have been compromised.
By default CCleaner installs both the 32-bit and 64-bit versions, however on 64-bit systems it only runs the 64-bit executable and points every shortcut it makes to the 64-bit executable.
On one of my affected systems that appears to have had 5.33 installed, I saw none of the registry keys that were supposedly created, and that system never ran the 32-bit executable.
Would it be safe to assume it's not affected and simply uninstalling CCleaner 5.33 is enough?
Piriform seems to suggest that only some useless system information was ever released by the compromised version. The general worry is that it wasn't just that information, but also other more important things like account logins and such.
So the default for CCleaner, which is supposed to get rid of old system bloat and cruft, is to be bloaty and crufty, and install versions of itself the system does not need or can not use?
This blog post from Piriform has more details: http://www.piriform.com/news/release-announcements/2017/9/18...
Basically they believe it was only the 32-bit installer that was compromised.
The CCleaner installer is always 32 bit for compatibility - it installs both 32 bit and 64 bit program binaries. On 64 bit systems, the default shortcuts are to the 64 bit binary.
So was the 32 bit installer compromised, or only the 32 bit binary? The original advisory makes references to the installer which is quite confusing. Tried to figure it out myself but I assume the loader has VM detection techniques as I wasn't able to infect a VM.
Well, it would be many things, but it wouldn't be "safe". Not a tinfoiler, just a pedant :) I could go with "reasonable".
'The 64-bit version of CCleaner v5.33.6162 was not affected, but we encourage all users to update to the latest version, 5.34.'
Avast/Piriform claim that the payload is not executed at all on 64 bit systems, and there is no secondary payload.
I guess it depends how much you want to trust that information.
Obviously, this gave me quite a scare, so I downloaded and ran both MalwareBytes and Immunet - both came up negative. I checked my registry for the keys mentioned in the article, and found none of them. Can I assume I'm "safe" (well, one never is, but relatively speaking), or should I revert my system to an August image?
Even with a ton of posts titled "I have removed X with CCleaner and now I.."
Anecdotally, I've seen CCleaner delete way too many false positives in the Registry, breaking applications (and people have never heeded its warning to properly back up the Registry), and, worse, entirely corrupt Registry hives, breaking Windows.
The Hive database format of the Windows Registry was built to be read-mostly/write-rarely and doesn't survive well to active surgery, especially not "I run CCleaner once a week with all the options checked". Like I said, I've seen it corrupt entire Hives from too regular operation.
I'm also of the opinion that some of the "Windows slowdown" these users complain about is a snowball effect of too much Registry surgery leaving deteriorated hives, badly optimized for reading, behind; but that's mostly a hypothesis I have not scientifically proven.
I kind of forgive people still running Avast out of habit from the bad old XP days (not everyone got on the Microsoft Security Essentials train as fast as they could, and that was as much a marketing/awareness problem), though as knowledge that Windows Defender exists spreads, there are increasingly fewer excuses to still run Avast.
The CCleaner infection was for win32 machines and from what I understand upgrading to the next version (v5.34) fixes the problem.
I mean, I guess that's still true since the build was compromised by an outside party, but it's still just an interesting moment of synchronicity.
Office itself is in Preview on the Windows Store, and when that comes out of Preview, other developers are especially going to be on notice to get applications into APPX packages, if not the Store, because for most applications if Office can do it, so can you.
Also, does this affect the macOS version of CCleaner or just the Windows version?
Release Post: http://www.piriform.com/news/release-announcements/2017/9/18...
>This compromise only affected customers with the 32-bit version of the v5.33.6162 of CCleaner and the v1.07.3191 of CCleaner Cloud. No other Piriform or CCleaner products were affected. We encourage all users of the 32-bit version of CCleaner v5.33.6162 to download v5.34 here: download. We apologize and are taking extra measures to ensure this does not happen again.
macOS seems fine, it looks like it was their 32bit Windows/Cloud offerings:
>Before delving into the technical details, let me say that the threat has now been resolved in the sense that the rogue server is down, other potential servers are out of the control of the attacker, and we’re moving all existing CCleaner v5.33.6162 users to the latest version. Users of CCleaner Cloud version 1.07.3191 have received an automatic update. In other words, to the best of our knowledge, we were able to disarm the threat before it was able to do any harm.
So if you have a Windows copy, look for a patch I guess. Seems like it's not just fixed, but the rogue server taken down.
Gotta check other computers though, and smartphones.
Two seconds to type four words into the search bar.
That software basically goes through your computer and deletes a ton of things it considers useless. Have you ever seen the default settings? For instance, it used to delete the history, cache and settings from all major browsers.