As someone who has been playing S.T.A.L.K.E.R.[1] again lately, this reminds me of the way PDAs are used in that game to track people close by. It's the plot device that allows for a very short-term, radar-like map. I find it interesting that this game from 2007, which takes place in Ukraine and was made by a Ukrainian developer, is so similar. I wonder whether this idea was already out there and got incorporated into the game, or whether its appearance in the game helped the idea along when it was later implemented for real.
1: An FPS/RPG set in a world where Chernobyl was a much larger event that caused weird anomalies and mutants, and people explore the area. The game is based on the 1972 novel Roadside Picnic.
The idea of tracking people by phone was definitely out there in 2007. Heck, I'm surprised people were surprised about "stingray"-type devices in cop cars, which have been depicted in movies since the late '90s; in 'Training Day' they actually used a branded StingRay....
To what extent have others outside CrowdStrike confirmed their findings? The findings seem to be based mainly on commonalities in what is understood to be a proprietary malware platform/toolset attributed to the Russians. Something like a Metasploit-Stuxnet combo? I find these two statements, from this report vs. the DNC press release, to be odd:
"X-Agent is a cross platform remote access toolkit, variants have been identified for various Windows operating systems, Apple’s iOS, and likely the MacOS. Also known as Sofacy, X-Agent has been tracked ..." [1]
"FANCY BEAR (also known as Sofacy or APT 28) ... This adversary has a wide range of implants at their disposal, which have been developed over the course of many years and include Sofacy, X-Agent ... and even malware for Linux, OSX, IOS, Android and Windows Phones." [2]
So, is Sofacy the name of the group or of their malware platform? Are Sofacy and X-Agent the same tool, or separate tools? Is X-Agent a Metasploit-type framework or platform-specific malware for each operating system? Has the security community in general seen this evidence and agreed on these findings, or is this primarily CrowdStrike's own little expedition?
I'm not in the military, but I think bringing a commercial smartphone on a military operation against a very electronically advanced adversary is probably the stupidest opsec fail a soldier can commit.
True enough, but on the other hand, how can you be a hero if you can't tweet about it? This particular conflict seems to be characterized by a rather unusual combination of (somewhat) irregular troops and heavy artillery, so creative use of consumer electronics for both morale and maths should not be too surprising.
Besides, troops making ends meet with old Soviet weapons stockpiles are not alone in using consumer electronics to augment their military hardware:
http://disruptivethinkers.blogspot.de/2012/05/lessons-in-how...
I guess these guys had more than enough process in place to avoid the mistakes made by the Ukrainian artillerists, but then the baseline technology from which the improvisation starts is supposed to be much less outdated in the USMC than in the Ukrainian forces, considerable parts of which are hardly more than retroactively legalized militias, including an international cast of "lifestyle warriors".
... and that's what you get for allowing binary blobs, which are prime places to hide backdoors or find 0-days, into every possible level of your stack: it makes the system less secure, and it makes it a lot harder to verify the security properties of the system, since you can't formally verify a CPU whose specifications you don't have and whose microcode you can't inspect.
Let's look at the attack surface for a bit. The average Android phone has:
1. a CPU for which no methods exist to verify the properties of its physical internals, which contains a signed microcode blob (only Intel & Co. knows what's in there)
2. a blob in TrustZone (I haven't been able to find out if the secure world in TrustZone has DMA to the insecure world, but I'm sure there's more attack surface here)
3. signed closed-source firmware blobs in multiple places in the phone, including in the baseband which handles all radio communications
4. a monolithic kernel with 7 million lines of driver code running in supervisor mode (to be fair, a lot of that can potentially be excluded at compile time)
5. closed-source userspace OS services (Play Services) required by most apps
6. apps with a native GUI have to run on the closed-source Java Runtime
7. apps are disseminated via the Play Store which has:
7a. no provisions for declaring a license (e.g. "filter by license" in the search options, or a dedicated spot on the app page if the license property is set; it's not rocket science)
7b. no provisions for providing the source code and validating the code you run matches the source (few people use F-Droid and that only solves the first problem)
7c. no provisions against Google serving backdoored apps
8. all apps can communicate with other apps without user intervention (if those apps choose to allow this) via Actions, and stock apps tend to allow a bunch of things through this channel
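To make point 7b concrete: the missing provision is essentially an automated check that the binary you install matches a build published from source. A minimal sketch in shell, with a hypothetical APK name and a placeholder "published" hash (the value shown is just the SHA-256 of an empty file):

```shell
# Sketch of the provision point 7b asks for: checking a downloaded
# APK against a checksum published alongside the source code.
# Filename and hash are hypothetical placeholders.
published_sha256="e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"

touch app.apk   # stand-in for a downloaded APK (empty file here)
actual_sha256=$(sha256sum app.apk | cut -d' ' -f1)

if [ "$actual_sha256" = "$published_sha256" ]; then
    echo "checksum OK"
else
    echo "checksum MISMATCH" >&2
fi
```

F-Droid's reproducible-builds process does roughly this at scale: rebuild the app from its published source and compare the result against the developer-signed APK.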
It's rotten all the way down, unverifiable by users or security experts and just not secure against any kind of targeted attack.
The Ukrainians shouldn't have been using smartphones at all, and they're just unlucky to be the first (that we know of) to be targeted by a state-level adversary.
This could happen to any of us if we become a nuisance to a state (or when we anger one of the employees of the companies involved in producing this shitshow).
I wonder why CrowdStrike is putting out this information publicly. Could it be because Fancy Bear made a fool out of them with the DNC debacle and now they're trying to regain some respectability?