I've always hated antivirus packages, both because AV vendors profit from something that shouldn't be required in the first place, and because that software tends to hook into lots of places in the OS, so if a backdoor is found you're immediately in big trouble.
You must hate a lot of things then.
At one company, every developer was forced to run antivirus without file exceptions, which made compilation a huge pain.
And, I guess, non-glued USB ports on every computer.
And users that put every USB stick they are handed by strangers in front of the office into their computer. Or that they find on the printer. Especially when it's labeled "Pictures".
CEO: Do we have antivirus installed?
CEO: Are we secure?
What does your risk analysis look like?
Do we rely on internal or perimeter defense?
I never understood why some people are stupid enough to glue the ports shut instead of using OS policies.
Gluing is much easier: it can't fail due to wrong configurations or botched rollouts, and it works even if people have too many permissions, on Linux, and in a myriad of other situations.
You could buy several houses, or be set for life, with what it costs to cover an eventual security breach.
The companies I've consulted with had USB left unsecured on the laptops of the marketing departments I've seen, and yes, some of them lost several hundred million dollars in revenue because of breaches.
 http://www.adobe.com/content/dam/Adobe/en/devnet/acrobat/pdf... section 3.4.1
However, I think it's quite a stretch to put any blame on Adobe for this one.
In essence, Avast implemented their own std::vector in C to manage the magic numbers, and they implemented it quite poorly.
As mentioned in the article, the find_magicnums function supports roughly 300 (!) different magic numbers. Adobe's PDF is not required at all to exploit this bug.
Over the last year, I discovered dozens of bugs in different widespread anti-virus products. Not all of them can be explained as easily as this one, and only a few are as critical as this one, but still...
Concerning your question: in general, I consider it quite difficult (if not impossible) to give a concrete answer. The software, as it is now, is just in a horrible state. Depending on your threat model and how experienced you are as a user, it might very well decrease your system's security.
Where things get hairy is with antivirus suites in particular. It's one thing to passively scan files, but AV suites have a bad habit of hooking into the operating system and making other changes, usually in the name of real-time monitoring. This is where I feel their supposed benefits fall flat. It would be one thing if these hooks were also passive, but Windows and most other operating systems make it difficult to access the kernel's data on purpose, and I don't trust an antivirus suite to do so in a safe manner. It's one thing for an exploit to compromise a passive scanner running with userland permissions, and another thing entirely to exploit an AV suite with direct kernel access. The latter can cause the security features to backfire pretty hard.
First of all, this bug was fixed nine months ago (see Timeline).
This is in no way a "fiasco"; this is the normal process of improving the security of a product. The bug was reported and fixed in a timely manner, a bounty was paid, no exploit was found in the wild, no harm done. You can count for yourself the number of similar bugs fixed in Chrome or Firefox this year.
>is it safe to say that running an antivirus is actually increasing your risk instead of decreasing it?
Yes, running ANY software on your computer increases the attack surface, thus increasing risk. At the same time, running AV decreases many other risks.
My opinion is that, all in all, AV still does more good than harm today, provided it's actively developed by a credible shop and updated in a timely manner.
Write parsers for complicated formats in C/C++. Run them on suspicious files from the outside world. What could possibly go wrong? Play stupid games, win stupid prizes.
I discovered this via a coverage based fuzzing engine with a dictionary (containing those magic numbers).
Said fuzzing engine is similar to libFuzzer, but I have designed it with a focus on fuzzing closed source Windows binaries (PE).
So to be very clear: I reversed the functions and types without any symbols. All function names, type names, and variable names from the article are chosen by me. In the actual code, those names are most likely very different.
For such a simple function as this, all you need is the control-flow-graph form of the x86 disassembly linked in footnote 4 of the article.
I would love to hear some feedback, in the hope that the following posts will be more enjoyable than this first one.
Then again, if you consider the number who fail at the latter, and how many would want to work on AV software anyway, it's no surprise things like this will happen.
This is x86 code running as NT AUTHORITY\SYSTEM. Hence, successful exploitation for arbitrary remote code execution (as NT AUTHORITY\SYSTEM) only requires circumventing the stack canary.
As mentioned in footnote 6 of the article, they seem to use Control Flow Guard (CFG) on the latest Windows platforms. However, just like the stack canary, this is only a mitigation. It does not make exploitation impossible, it just makes it a bit harder.
 In the article, I present a pseudocode version of the relevant function. If you are interested in the actual X86 instructions, you might want to look at footnote 4 of the article.
Remote execution would be harder to achieve.
But parsing anything that you end up evaluating brings in a whole class of bugs that Rust can't reason about.
A common (terrible) example on Windows is to serialize a data structure and then pass it to another program when spawning a new process. If that serialized data includes particular sequences, it can cause the spawned process to do other things.
Rust can help you ensure memory safety, and type safety. But it doesn't prevent stupidity.