"The lever, the transistor, the vacuum tube and the computer all have something in common. They’re amplifiers, they allow a relatively small change or capability in a domain to have a much larger effect."
The article's premise unfortunately relies on conflating two distinct meanings of "amplify". A vacuum tube or transistor increases a pre-existing voltage or current, but the thing being increased stays the same -- it's a change of scale. A computer amplifies a multidimensional capability, like the ability to model reality using mathematics -- it's a change of scope.
A computer isn't just a faster or stronger version of biological processing, although it is that too. Consistency, precision and scope enter the equation as well. "Amplification" really doesn't describe what the computer does, any more than it describes what we do.
Not mentioned in the article is the fact that an underscored name can be algorithmically converted to camel case, but not the reverse (with any reliability). Camel case therefore discards information -- the explicit word boundaries -- and is less flexible.
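A minimal sketch of the asymmetry (function names are illustrative): underscores mark every word boundary, so the snake-to-camel direction is deterministic, while the reverse has to guess whether a run of capitals is an acronym or several one-letter words.

```python
import re

def snake_to_camel(name: str) -> str:
    # Deterministic: every underscore marks a word boundary.
    head, *rest = name.split("_")
    return head + "".join(word.capitalize() for word in rest)

def camel_to_snake(name: str) -> str:
    # Heuristic only: inserts "_" before each capital, but cannot tell
    # whether a run of capitals is an acronym or separate words.
    return re.sub(r"(?<!^)(?=[A-Z])", "_", name).lower()

print(snake_to_camel("parse_http_header"))   # parseHttpHeader
print(camel_to_snake("parseHttpHeader"))     # parse_http_header -- round-trips
print(camel_to_snake("parseHTTPHeader"))     # parse_h_t_t_p_header -- boundary info is gone
```

A smarter regex can special-case acronym runs, but names like "parseABCDef" remain genuinely ambiguous, which is the point.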
Also, the expression "scientific showdown" about studies that rely on psychological self-reporting is a bit of a stretch.
"A simple protocol can give us an idea of whether data is being sent to Dropbox:
1. Create a large-ish file (1MB) outside of the Dropbox folder
2. Monitor the network usage of the Dropbox application to see if it sends enough data that it could be that file
3. Repeat with many different files, etc.
Doing exactly that, Dropbox only sent a few hundred KB after “accessing” the target file. Seems unlikely that Dropbox is uploading files outside your Dropbox folder."
This test approach has a problem. A more realistic test would use a well-compressed file -- one that by definition cannot be made smaller -- and see how much traffic the system generates for that file. For an optimally compressed file, if the system is reading and sending the entire file, the transfer size will more or less equal the file size.
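A sketch of that improved probe (the file path is illustrative): random bytes are effectively incompressible, so any observed traffic near 1 MB can't be explained away as compression.

```python
import os
import zlib

# Random bytes resist compression, so they make a good probe payload.
payload = os.urandom(1_000_000)
compressed = zlib.compress(payload, 9)

# On random data zlib gains nothing; stream headers usually make the
# "compressed" form slightly larger than the original.
ratio = len(compressed) / len(payload)
print(f"compressed/original = {ratio:.4f}")   # ~1.0

# Write the probe outside the Dropbox folder, then watch traffic as in
# the quoted protocol: if the client uploads it, transfer size ~= 1 MB.
with open("probe.bin", "wb") as f:
    f.write(payload)
```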
You’re still missing the possibility of Dropbox uploading a hash rather than the file itself. Even if they never upload the full file, plenty of people would love to know whether any of your files is on some list. (Classified information, piracy, etc.)
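To put numbers on that: a content hash is tiny compared with the file it identifies, so matching files against a server-side blocklist would cost only a few dozen bytes of traffic per file -- far below the few hundred KB observed. (SHA-256 here is an assumption for illustration; nothing above specifies Dropbox's actual hashing scheme.)

```python
import hashlib
import os

# Stand-in for a 1 MB target file.
data = os.urandom(1_000_000)

# A SHA-256 digest is 32 bytes regardless of input size, so uploading
# it instead of the file would barely register in a traffic monitor.
digest = hashlib.sha256(data).digest()

print(f"{len(data)} file bytes -> {len(digest)} hash bytes on the wire")
```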
If you read the discussion near the end about why a few hundred KBs, you'll see that the first test file I used was a JPEG embedded in a Word doc (the latter because the original claim used Word files), which should be very difficult to compress.
To be fair, I added this section after several people pointed this out, so you may not have seen it.
> What still remains to be solved in terms of Graphics Programming?
That's not a particularly useful question, because the direction of graphics programming is steered more by improvements in hardware than software. For example, I can remember writing and publishing a game for a programmable calculator, at a time when a "display" was a sheet of graph paper.
My point is that I've personally witnessed huge changes in what people regard as cutting-edge computer graphics, and most of the change is driven by hardware improvements -- GPUs, faster and cheaper computing power, display technology.
So to answer your question, I would advise that you pay attention to the hardware responsible for processing and displaying the images.