Obviously this is the entertainment system and not something more critical, but it's telling. There is a huge cadence mismatch between software cycles and capital-good replacement cycles. Airplanes, factories, HVAC systems, even home appliances last for decades. The software on these systems still needs to be upgraded; I can't even imagine the number of security patches that have gone into the Linux kernel in the last 11 years.
Market realities may, of course, keep it from working out this way, but it's a pretty sensible assumption.
EDIT: hullo's comment below would seem to contradict this; 32" TVs from Samsung seem like a fair data point to look at (albeit just one data point), since it's a very mainstream manufacturer and a non-niche size.
"Picking a manufacturer (Samsung) and size (32") at random, I see the smart TV for $499 and non-smart options for 219, 269, 299. http://www.samsung.com/us/video/tvs/all-products
Just for example."
They are more expensive than a Best Buy model. Not terribly so, however; otherwise the airport couldn't buy 200 of them.
They are incredible displays.
Smart TVs, even without the "smarts", are generally higher-end TVs.
Though it isn't indicated on the main screen, the $299 option is actually a smart TV if you click through. You have to get down to the $269 option to find a non-smart TV.
The difference between the $269 and $299 models seems to be about the cost of putting a processor in a TV and making it smart...
(Note I don't claim that smart TVs are cheaper, but for the most part the cost is being absorbed into higher-end devices, so you get the smarts "for free" on better sets.)
The cheapest 50" TV they have is $799 has interactivity.
An assumption is reasonable if it follows logically from facts we know about the world to some conclusion. For example, cars with more features usually cost more than cars with fewer. A washing machine with a detergent dispenser and 11 different wash modes costs more than one with 3 modes and no dispenser. A thermostat that just sets the temperature and does nothing else will almost always be cheaper than one with wifi connectivity and a companion iOS app.
Based on that knowledge, it's reasonable to assume that a TV with more features will cost more than one with fewer. This assumption is wrong, of course, but it wasn't unreasonable.
So that means an assumption is unreasonable if there's no reason you'd make it in the first place.
Sure there is. 99% of everything else I've ever experienced in my life has had a positive correlation between features and price. I'm actually struggling right now to think of another product where, in general, more features are cheaper than fewer. Other than television sets, I cannot think of one right now. (Maybe if I spend some time on it, I can think of another.) Therefore, knowing what I know about the prices of things, it is totally reasonable to assume that TVs follow the pattern.
If you truly don't understand this distinction, then this whole conversation (indeed, this entire posting/thread) is hopelessly beyond your comprehension.
Thanks for confirming my suspicion; the fact that "it's a feature, not a bug" is pretty unethical has nothing to do with whether it can be successful or not. It's a useful logical tool to learn to distinguish "is" from "ought": whether or not you think smart TVs are better than dumb TVs has no bearing on whether they're marketed as such and, most importantly, believed to be such by the majority of consumers.
Hell it's not even one of the worst heuristics in play: (higher) price and popularity of a product as a heuristic for quality may be even worse than "more features" as a heuristic, and these two are extremely prevalent. And yet just because I don't like them doesn't mean I pretend that these tendencies simply don't exist.
And hell yeah you can prove it's not in there, either with a quick wireless scan or even a simple physical tear down of the housing which can then be put back together.
Show me a dumb TV that has been wired up to spy on people that has been in the wild before.
This is a 100% trash point. I have yet to see an unknown device on my network from a dumb screen, let alone any additional microphones or cameras on said dumb screen. I have also not seen or heard any reports of one becoming a spying device. You mean to tell me the dumb panel I bought, which is from a major manufacturer with known teardowns and a ton of buyers, managed to sneak this hardware in (even something like a cellular radio) and nobody noticed?
I'll repeat this for you so that perhaps it will sink in this time: show me a dumb TV that has been wired up to spy on people that has been in the wild before.
Please come equipped with citations, references, and examples before commenting.
You cannot show up to a thread and demand that a negative be proven. That's not how this works; the burden of proof is on you.
> Show me a unicorn.
If you're not going to be constructive here, just stop commenting.
Not if it isn't connected to a network. But it's hardly as if devices with manufacturer-paid cellular connectivity built in and preconfigured don't exist, so there's no reason that it has to be your network.
To the extent that manufacturers are either monetizing networked services or deriving useful data from them, making them independent of end-user networking choices has a pretty clear benefit, and I wouldn't be surprised to see it become a common thing in Smart TVs.
And if some manufacturer decides to clutter things, I just won't buy it. Problem solved.
This is getting well into reductio ad absurdum territory (a common problem on HN). Smart TVs are perfectly fine for the person who doesn't want a smart TV. Just don't use any of the smart features and don't connect it to the network. Problem solved.
I just don't feel that anything relevant to the IoT is missing from my life. At all.
That's the real problem with the Internet of Things: most of the things we own are not all that useful when we're not in close proximity to them. Thus, not only are users and manufacturers unlikely to update them in the future; users are just as unlikely to connect the thing in the first place.
Home automation through things like light switches, etc. has a use case, but those products have been available and Internet-connected for over a decade and we still haven't seen wide adoption. I recently priced it out -- it would cost me over $5000 to swap out the outlets and switches in my house for Insteon devices. And that's just the hardware; not the electrician required to connect it all or the time I would spend configuring everything. Home builders aren't going to spend that kind of money building this into anything but the most high-end homes -- the IoT hardware alone blows through the fixtures and appliances budget that most home builders allocate.
People want systems that "just work". IoT does not "just work", and none of the current or announced implementations address the big problems around configuration (namely, every house is different so every implementation is custom). And in some cases like a stove or a refrigerator, any amount of configuration is going to be too much.
It worked OK for a while, but the setup was not robust, and eventually some controllers would not work with some devices at some times. The annoyance factor in going from a 0% error rate to a 1% error rate is HUGE.
Five years have passed, and I have been slowly replacing all these X-10 devices with hard-wired switches or with Insteon. Of course, the original 1959 wiring paths (12-gauge Cu FTW) still work fine.
Now, when I see connected/automated homes in design mags, all that tech seems more like a long-term maintenance headache than a desirable feature. If I had an unlimited budget, I would not build those features in, I would just install conduit and run old-school copper wires through it.
My lessons: (1) The design life for a home is decades, the refresh rate for home automation devices is years; (2) Upgrading/tinkering is fun the first time -- but only the first time; (3) Your spouse hates it more than you do; (4) The existing device does one thing without fail, replacing it with a device that does more things but sometimes fails is not a net gain.
But prices need to come down, light output needs to go up (wireless bulbs seem to top out around 60W equivalent), and switches need to not be a $60 optional accessory (looking at you, Hue Tap). Controlling lighting with your phone is neat, only being able to control lighting with your phone sucks.
Long term, I'm sure I'll end up with more IoT devices. But it'll be because they got shoved down our throats and I didn't want to pay more to avoid new "features," not because I wanted a wireless microwave.
But I'm personally not convinced by IoT. Every implementation I've seen adds complexity without really improving functionality. I have a few Insteon switches in my house, but only in a few places where I need them (e.g. on the lights in front of my house so they can be turned on with a timer when I'm out of town).
I actually have a "connected refrigerator" made by Samsung. I tried for 5 minutes to get it set up before I gave up. I really couldn't think of what I would need to use a connected refrigerator for. IMO this is the usage model for the vast majority of IoT devices: if I, a major geek about this stuff, am not willing to spend more than 5 minutes setting it up, who actually cares enough to bother with any of it?
For example, good sleep is really important to me. So I used the Philips Hue bulbs and a NUC to build a smart home lighting system that behaves much more like daylight. When I describe it to people, quite a number really want it. Not bad enough to get my code off GitHub and install it, but they'd pay something extra for it. And I now see pre-packaged commercial products getting proposed, so I'm sure they'll have options.
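For the curious, here's a minimal sketch of the idea in Python (not the exact code on my GitHub); the bridge address, API username, and light ID below are placeholder assumptions:

    import math, time, datetime
    import requests  # third-party: pip install requests

    BRIDGE = "192.168.1.2"       # assumed Hue bridge address
    USER = "your-api-username"   # assumed whitelisted bridge user

    def daylight_state(now):
        # Rough daylight curve: warm/dim at the edges of the day,
        # cool/bright around noon.
        h = now.hour + now.minute / 60.0
        day = max(0.0, math.sin(math.pi * (h - 6) / 12))  # 0 before 6am / after 6pm
        return {
            "on": True,
            "bri": int(30 + 224 * day),  # Hue brightness range is 0-254
            "ct": int(500 - 347 * day),  # mireds: 500 (warm) down to 153 (cool)
        }

    while True:
        state = daylight_state(datetime.datetime.now())
        # One bulb shown; loop over light IDs to cover a whole room.
        requests.put(f"http://{BRIDGE}/api/{USER}/lights/1/state", json=state)
        time.sleep(300)  # re-sync every five minutes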
The interesting thing happens when that cycle drives the costs a fair bit lower. Look at phones, for example. Smartphones were a weird, exotic thing. Then they were a high-end consumer thing. Now they're the default. A couple years back I went into a store looking for a cheap phone and asked for one without a web browser. The clerk looked puzzled and said, "Well, they all come with web access." I'm sure I could have ordered a dumbphone somehow, but it would have been harder and cost more.
So my question is: how much extra will you pay for your house not to be connected? Because that's the real test for me about where the IoT thing will end up.
The situation with your daylight system is a perfect example. People think "Oh, that's cool" but won't actually get off their ass and spend a few hours setting it up and configuring it to their liking. Unless it comes out of the box, it's a non-starter.
IoT devices as they are today require a systems integrator to come in and tie everything together and configure them. So your lighting system is integrated with a presence system that is also integrated with your thermostat. Each of these has to be configured on a case-by-case basis, because no two homes are alike. But if your products require an integrator, suddenly the integrator is your customer, and you begin sacrificing end-user focus for things that make the integrator's job easier.
And then you realize that the largest consumer of IoT platforms isn't consumers themselves, it's installers like ADT. Their focus is on selling simple products that require minimal support, not feature-rich ones. Why? Because users simply aren't interested in paying any amount of money for advanced features. Until that changes, IoT is going to be a niche hobbyist market. Even if it's built in to every device you buy, if it's not worth anything to you, you're not going to configure it in the first place.
But if you look at devices with cycles in between, I already see it happening. 10 years ago when I bought a stereo receiver, I bought something entirely dumb. Last year I replaced it and ended up with something that was internet connected. Not because I really cared, but because the equivalent model came with that. And in retrospect I'm glad; their phone/tablet app is a way better remote control than punching a bunch of mysterious, no-feedback buttons. The same thing happened with my DVD player; when the old one died I just bought whatever Consumer Reports recommended and it too is internet connected.
I think that you're right that the development of pluggable Android is what will push this forward. And Google clearly agrees; their project Brillo is surely one effort among many.
I've always thought that the reason for keeping Android as open source was to enable this use case. Eventually, Android-capable SOCs will come down to a very low price point, and for these use cases they don't need to push much more than a few kb per second. But it retains the advantage of having a development ecosystem that is well understood and widely available (you can't throw a rock in China or India without hitting an Android developer).
Neat. Did you try to coordinate room lighting color changes with computer screen colors controlled by f.lux?
It's minimally sufficient for my needs, but I'd be glad to work with others to expand it.
And we're only worried about vendors going out of business because it's the early days and it's largely startups pushing the trend. With a Samsung or Apple, it's more that they'll quickly (by home equipment standards) stop supporting whatever doesn't stick to the wall.
There is a case to be made for self-contained objects that don't derive most of their value from an ecosystem, but work normally with no network. Work up from a toaster, not down from a computer.
Traditional hardware makers are going to have to factor the support of this software component into their prices now.
I will go out of my way to buy a dumb TV next time.
How? I don't see any dumb TVs for sale.
A dumb TV still has a TV tuner, so they aren't equivalent.
But unless you plan on plugging it directly into an antenna for OTA broadcasts, they are basically equivalent.
- TVs don't need anywhere near the same level of display quality. They are viewed from ten feet away and do not render small text, so they don't need as clear a picture. They also don't really have to go over 30fps, and latency is less of a concern. Basically, they have looser constraints in many ways, making them cheaper
- TVs have a plethora of inputs of many formats
- TVs have remote controls
But I doubt you could buy a 60" glass front monitor for the price they sell TVs :)
The impetus there was that LG changed its net-connected TV platform in 2011, and instantly dropped all support for older devices. One would think that a final update could remove that "coming soon" box from their proprietary added-feature screen, but they haven't even bothered to do that.
So I can watch Netflix and YouTube on that device, but not Amazon Instant Video, or Crackle, or Crunchyroll, or Vimeo, or any of the dozens of selections available to better-supported platforms. Having learned my lesson, and aware of the increasingly stalkerish behavior of "smart" televisions, my next TV purchase was very specifically a dumb screen. If I want an internet-connected service now, I use the Wii, or Xbox, or the extended desktop from the nearest computer.
I will likely refuse to buy any network-enhanced appliance in the future, unless I am able to root/jailbreak it and install software without the manufacturer's stamp of approval. I probably wouldn't do much beyond installing ChillBox, or FridgeBSD, or CryogenMod, or whatever, but it feels like the possibility might keep them a little more honest. Because you know that refrigerator hackers would be capturing and picking apart every packet that thing sends out, quickly discovering that every time someone closes the door, it sends a tattle out to fridge-use.org about how long you stood there with the fridge door open, along with before-and-after photos of your food.
Though it would also be embarrassing if they marketed value models of a product line by disabling features in software/firmware, and some NetBSD-loving punks could come along and write a simple script that turns the doohickey that retails at $200 into the one that sells for $800.
So it's already too late for me. "Smart" appliances are just another low-capability computer that I will have to support as the in-home IT guy. And I will have to presume that they come pre-loaded with all manner of crapware and spyware. I would forever need to be checking on chipsets and revision numbers and compatibility lists. No thanks. It's hard enough managing the congestion on the home WiFi already.
A company doesn't have to be out of business to not do security updates; they can not do security updates starting day one. There was an article a while ago about tons of home router vendors with insecure software from a third party, where the third party had resolved security issues years ago but the vendors had never bothered to update, leaving hundreds of thousands of devices vulnerable over the last few years.
I'm surprised IoT conversations are still happening with Linux as a contender for the OS, let alone Windows.
How well would you say that Tesla is coping with this as a company? For that matter, what about Apple?
> I can't even imagine the number of security patches that have gone into the Linux kernel in the last 11 years.
Let's take a step back and think about this statement. Isn't this insane? We know enough to be able to build something much better than this. The reason that we don't, is that we've just kept on pragmatically building on what we had before. We're like a corporation that keeps pouring money into its "stovepipe" system because we keep on making short-term decisions. (Somehow "stovepipe" has come to mean "vertically isolated," but I seem to remember that it also used to refer to the tendency of iron stovepipes to corrode and need constant patching.)
Apple just kind of assumes that you have the latest shiny, because why wouldn't you? This induces a phenomenon I call the Apple Turnover: when a software update aimed at new Apple things comes out and makes your old Apple thing not run so good anymore. Sluggish iPhones are the hallmark example today, but I was bitten badly by this in the mid-2000s when Panther would no longer compile C++ files. You see, one of Apple's OS updates for Panther came with Tiger's libstdc++, which used the new Itanium ABI. This was so Xcode for Tiger could compile programs to run on Panther, but without heroic efforts to set up compiler flags in every package you built to link against the old static libstdc++, compiling on Panther would link against the new libstdc++ by default and fail horribly, rendering C++ code uncompilable. (Deleting or renaming the new libstdc++ was not an option; it was a heavily depended on system component and I think even the header files were changed for the new library.) And a lot of stuff depended on C++, including C-API stuff like SDL. And Apple did fuck all to fix it.
So if you buy a shiny Apple toy, your choices are to commit to upgrading early in the new product cycle or risk an Apple Turnover rendering your purchase, if not useless, then with degraded functionality even relative to the same device when you bought it.
And the pisser is that during the '80s and '90s, Apple gear was legendary for running well, and being supported, for many years (if not more than a decade) after its purchase date.
I'd guess the typical hardware replacement cycle for a computer is 3-5 years. My 2009 MacBook Pro is on Mountain Lion. It runs just fine, and the OS continues to receive security updates. Both 10.9 and 10.10 also officially support my machine, I just haven't bothered to upgrade. Rumor is that support for older machines is one of the areas of focus for 10.11. We'll see in a week or two.
The typical hardware replacement cycle for a smart phone is probably 2-3 years because of contract upgrades. My iPhone 5 is running 8.2 and runs just fine. 8.3 supports it as well, but I need to clear some photos off to make space to run the installer.
Looking at the entire Apple installed bases for computers and phones, Apple users seem to do a very good job of keeping up with supported OS versions.
I don't know as much about the iPad. At work I have an iPad 2 that is running iOS 8 and seems to work fine.
If you scroll down to "Device Breakdown (sorted by Usage)", you see a bunch of devices with 90+% stuck on the last supported OS version: 97% of iPad 1G on iOS 5, 97% of iPhone 3GS and iPod Touch 4G on iOS 6, 91% of iPhone 4 on iOS 7. Even including those devices, they're showing 75% of devices on iOS 8 and 20% on iOS 7.
I'm in the "left behind" category - with both my iPad1 and Mac Mini single core being "stick" at iOS5 and OS X 10.6 respectively. Im somewhat disappointed at the lack of patches for the known security holes in iOS5 - especially since those numbers still show over 3% of the iPads in use are not upgradable past iOS5...
The appliances in my house were bought because of all the bells and whistles. I didn't buy them, but am forced to work on them when a sensor fails. They have gotten so complicated, parts so expensive, and service manuals so hard to get, that I just throw them away when they fail. I don't like it.
I think, if consumers start demanding it, we will go back to buying based on longevity, and not on the newest feature.
Every time my dryer's alarm goes off, I am reminded it's a durable good. It's almost as if it's reminding me to save up.
The upgrade cycles are simply too long for TVs.
But more importantly, Apple's whole model is to treat things like this as just "dumb complements." Your mobile device, from the carrier's perspective, is increasingly becoming a dumb Internet pipe (first with the App Store, later with the likes of iMessage, how LTE works, etc.).
The TV for Apple is simply a dumb display to stick an Apple TV into. A sub-$100 device you can replace every other year if need be. A $3000 TV is replaced a whole lot less often.
Why would Apple want to be in the business of (eventually) supporting 5+ year old TVs for such a low-margin business? Or what makes you think users would pay for the Apple brand and/or upgrade more often to make it worthwhile?
So as far as IoT goes, I have trouble seeing a future where someone says "I need to buy new lightbulbs because mine don't get firmware updates anymore" or "I need to buy a new fridge because it can't talk to my new phone".
Not only does it make sense to put the smarts and connectivity in devices that are cheap, that people already have anyway, and that can be easily upgraded, but the user interface on a phone/tablet/etc. tends to be far better than a typical remote.
In general, I tend to prefer the Chromecast model of just casting video from a general purpose device, but the Kindle stick and Apple TV are OK as well. By contrast, I rarely used the Smart TV features on my Panasonics because they were just so painful to use.
One Windows-running company I worked for a long time ago simply didn't apply the patches. They said it broke things...
According to recent reports, the entertainment system is not fully isolated from the plane's navigation systems. However, Boeing has denied this.
The entire story is bogus.
The systems are isolated just fine.
One of the collateral PDFs for the device shows it connected to "IFE (In-Flight Entertainment)" (http://www.teledyne-controls.com/pdf/NED_Brochure.pdf).
Just the existence of such a device gives me reasonable cause to believe that there's more interconnection than "isolated just fine". While I have a lot of respect for the engineering processes that go into aviation systems, I've also had enough practical experience in IT security to know that blind faith in things being done right is foolish.
 I've seen the claim made more than once in this thread without a source. I understand that the article was probably very wrong. I'm just interested to know if a reputable publication has confirmed this or if it's just conjecture.
Edit: Just got confirmation; this software was the root cause. No hacks whatsoever!
It would be ironic if the bug bounty program directly or indirectly led to this.
Bugs that are not eligible for submission:
* Bugs on internal sites for United employees or agents (not customer-facing)
* Bugs on onboard Wi-Fi, entertainment systems or avionics
Ironic: A state of affairs or an event that seems deliberately contrary to what one expects and is often wryly amusing as a result.
That said, the plane communication protocols aren't terribly secure, so it's certainly feasible someone is playing around with them. Maybe they'll decide it's in our interest for us to know at some point.
Imagine how much money is being lost right now as a result of this disruption. Somewhere hackers are popping champagne.
As long as securities react to hacks, there will be a massive incentive to 1) hack, and 2) overstate the hack's significance. Furthermore, as bots become more sophisticated, confusing them becomes easier. If you know bots will short CompanyX when "CompanyX hacked" hits the headlines, then you have an unfair advantage just by being the first to know of the hack.
The response from United was unapologetic and absolutely disgraceful: https://hub.united.com/en-us/News/Company-Operations/Pages/s....
UPDATED: Jun 3, 2015 at 1:45PM
While United did not operate the flight, Ms. Ahmad was our customer and we apologize to her for what occurred on the flight.
After investigating this matter, United has ensured that the flight attendant, a Shuttle America employee, will no longer serve United customers.
United does not tolerate behavior that is discriminatory – or that appears to be discriminatory – against our customers or employees.
All of United’s customer-facing employees undergo annual and recurrent customer service training, which includes lessons in cultural awareness. Customer-facing employees for Shuttle America also undergo cultural sensitivity training, and United will continue to work with all of our partners to deliver service that reflects United’s commitment to cultural awareness.
1) The beer isn't free, the passenger paid for the entire can or used a 1K drink chit.
2) UA flight attendants are famous for making up rules and many try to avoid handing out entire cans of soda, and this one wasn't even a United flight attendant.
3) What on earth would that have to do with today's event?
Similarly, whether or not this was a United flight attendant is also of absolutely zero relevance. They may have technically been an employee of Shuttle America, but were part of the cabin crew and a representative of United on that flight, working under the United brand and wearing United uniforms. Therefore, when United releases a statement making no apology for abhorrent behavior exhibited by their representative, it reflects directly on them.
It may have nothing to do with this event, just as Chris Roberts tweeting that he hacked into the in-flight entertainment system may have nothing to do with this event. It's merely interesting that Wired explicitly ignored the actions of United as having any possible relationship to this event.
The whole fiasco is BS. Airplane networks are as safe as it gets.
In this instance, the inflight entertainment network and the avionics network were physically connected, and the security researcher was able to gain access to the avionics network by connecting to the inflight entertainment network.
"If the two networks aren't physically connected, there's no way to gain access to one network from another" is no longer true.
Like skeuomorphism. Nobody uses floppy disks anymore, yet the floppy logo is universal for "save".
The case above is an example cited as a life-critical system:
> Computers used in aviation, such as FADECs and avionics
1) The system can't be compromised through input data.
2) There's no other output mechanism by which an attacker could retrieve data (or watch the plane crash, which technically counts as output).
I'm not sure I buy the idea for contexts where a genuine security airgap is required.
This sounds almost as if you configured a network adapter to only send and not receive, but you used "optics", so I'm guessing you mean some sort of fibre-based device that physically prevents light from traveling in the other direction, and thus the security is enforced by physical means?
All you need to do is run a single fiber jumper, from the TX on one side to the RX on the other, instead of the two you're "supposed to" use. SNMP traps run on UDP and work just fine; old-style syslog runs over one-way UDP too. Note that your production/secure network can DDoS your IT/insecure network if enough people on the secure side try to use a DNS or NFS server on the insecure side: it'll just spam packets forever and never get a response. So it's not like firewalls are completely pointless.
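To make the software side concrete, here's a minimal fire-and-forget sketch using Python's standard library; syslog over UDP never reads anything back, so it works fine across a TX-only jumper (the collector address is an assumption):

    import logging
    from logging.handlers import SysLogHandler

    # SysLogHandler uses a UDP datagram socket by default, so nothing is
    # ever received; this runs happily across a physically one-way link.
    log = logging.getLogger("oneway")
    log.setLevel(logging.INFO)
    log.addHandler(SysLogHandler(address=("192.0.2.10", 514)))  # assumed collector
    log.info("secure-side event: nightly backup completed")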
In the really old days we'd do something similar with RS-232 cables: physically yank out pin 2 or pin 3 (conveniently, they swapped TX/RX between 9-pin and 25-pin RS-232). PPP negotiates connections; SLIP doesn't care and works great.
(Edited to add: another thing we did back when 10-meg Ethernet was "new" and just replacing thinnet and thicknet was pulling the appropriate pins for TX. Before auto-negotiation existed, before 100-meg Ethernet even existed, you could get away with that... I suppose if you had two smart-ish switches that could be forced to 10 meg with no negotiation, you could do this today...)
Also there were production machines that output error, alert, and log messages to theoretically directly attached parallel port printers and there also existed converters that could go from parallel port to serial port (presumably for serial port printers, which really did exist in the 80s), which makes a pretty good unidirectional connection from a secured device to a semi-secured logging server. Bidirectional parallel ports didn't really exist until maybe 1990 or so and never did standardize, not really.
I made a lot of money implementing this kind of stuff in the 90s. It was fun.
Correct. To simplify, imagine a diode on one side and a photodetector cell on the other. On a microcontroller, this would be an opto-isolator (http://en.wikipedia.org/wiki/Opto-isolator).
Of course, the higher layers of the protocol stack you're using need to support this sort of physical layer. It's typically used on very primitive, low-bitrate connections (mostly sensors, although I've seen it used in highly sensitive installations with SCADA equipment).
Bruce Schneier has a great piece on air gaps. I've included a link to it below.
You get a couple of these Ethernet to 100FX converters:
Then you hook them up with just one fibre strand (instead of the usual 2).
You obviously can't use TCP over that since that requires a 2-way connection, but UDP works just fine. You'll probably want to wrap your data in some error correcting code, too.
I've found that using hashes and sending duplicates of the data seems to weed out any transmission issues in a one way system.
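Sketching that duplicate-plus-hash scheme in Python (the 32-byte SHA-256 framing here is my own assumption, purely for illustration):

    import hashlib

    def frame(payload: bytes) -> bytes:
        # Prefix each copy with a SHA-256 digest so the receiver can tell
        # which copies survived the one-way trip intact.
        return hashlib.sha256(payload).digest() + payload

    def recover(copies):
        # Keep whichever received copy verifies; if every copy is
        # corrupted, the transmission is unrecoverable.
        for c in copies:
            digest, payload = c[:32], c[32:]
            if hashlib.sha256(payload).digest() == digest:
                return payload
        return None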
Consider an N-bit stream of data, with a probability mu of any given bit being flipped. Then the probability of your stream containing an error is gamma := 1-(1-mu)^N. Since you're sending the identical stream twice, there's a gamma^2 chance that both copies are corrupted and your overall transmission is unrecoverable. The hash will tell you that the transmission failed, but not how to correct it. Furthermore, there's a probability that your hash has a bit flipped somewhere, too...
An error-correcting code makes a guarantee that if up to m bits are flipped, the original message can be recovered exactly. A Reed-Solomon code can correct up to m symbol errors while adding just 2m check symbols to the message; even with a pretty conservative upper bound on the number of expected errors, the resulting N+2m should be way less than the 2N+k you pay for duplication plus a k-bit hash.
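A quick back-of-the-envelope check of that math, with N and mu assumed purely for illustration:

    N = 10_000   # stream length in bits (assumed)
    mu = 1e-5    # per-bit flip probability (assumed)

    gamma = 1 - (1 - mu) ** N  # P(a single copy contains at least one error)
    print(f"P(one copy corrupted)    = {gamma:.4f}")       # ~0.095
    print(f"P(both copies corrupted) = {gamma ** 2:.6f}")  # duplicate scheme fails

    # Reed-Solomon correcting up to m symbol errors costs only 2m check
    # symbols, versus a whole extra copy (N bits) for duplication.
    m = 16
    print(f"RS overhead: {2 * m} symbols vs duplication overhead: {N} bits")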
And if you want to 'leave no space', why fill up a disk with 'random files' rather than just a single file that uses up all the space? If you're going to the lengths of encryption + airgap + cloudantivirus (!) + etc, you're a power user at this point, so why not just consume all space rather than just collect together files?
And hell, if you're being that paranoid, then use the paranoid OS for your desktop, OpenBSD. Why go to all the effort in the article and not take the extra step to get familiar with an OS that has an earned reputation for security, and that most exploit-writers don't target? I mean, OpenBSD does OOo and pdfs as indicated in the article - what's this airgapped PC going to be doing that requires Windows in particular?
It's ironic that this author, whom people turn to for commentary against 'security theatre', writes articles doing the same.
Thanks for all the useful responses. I always learn something new by reading the comments :)
Or does raw fiber use different protocols?
Imagine that the designers want to send information about the plane's expected arrival time to the in-flight entertainment system. You could send a packet once per minute, without any knowledge of whether or not anyone is even listening. If an update is missed, it doesn't matter since the information rapidly becomes stale.
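As a sketch of that broadcast-and-forget pattern (the address, port, and payload are illustrative assumptions):

    import json, socket, time

    # Push the current ETA once a minute and never listen for a reply,
    # so the link can be physically one-way. A missed update doesn't
    # matter; the next one supersedes it.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        update = json.dumps({"eta_utc": "18:42", "gate": "B7"}).encode()
        sock.sendto(update, ("10.1.2.3", 5005))  # no ACK expected or possible
        time.sleep(60)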
It obviously wouldn't work for carrying something like TCP/IP but rather much lower level signaling.
I did not have time to discuss it with them, but how a TCP-based protocol could be "one way" just does not make sense to me...
I've never heard the term "air gap" refer to something that is actually only segmented in software, crypto or no.
Who knows, maybe this is just a 16-year-old who got accosted going through security and wanted to blow off some steam.