Malware on your Android device picked up from third party app stores (FDroid? Amazon?) that steals email accounts and auth tokens. Looks like it only works on the older Android 4 Jellybean software (and some Android 5 Lollipop) and below, so mostly concentrated in Asia where there are lower-end phones.
You can see if your account has been affected here:
Thanks for making this comment. This post is a wonderful example of the rampant marketing that has given the security industry a bad name.
- The title is technically accurate, which is the best kind of accurate for clickbait. This is not a novel vulnerability representative of an application security flaw within Google - the malware campaign specifically targets older devices using previously known vulnerabilities.[1] There is no new exploit research here.
- There's a logo and cute name for something which is, again, not a novel vulnerability.[2]
- Scaremongering tactics are used throughout to hype up the finding.[3][4] Deliberately ominous language like "...for now" is perhaps tolerable when it's coming from a media outlet, but it's certainly unacceptable from a firm conducting original security research.
All told, this is closer to "threat intelligence" than real security research. A much better source for this news is the blog post by Google's Director of Android Security, Adrian Ludwig (first footnote, linked elsewhere in this thread as well). In particular, notice the succinctness and the serious yet detached professionalism of the post.
In any case, there are legitimate arguments to be made in favor of extending software or device support lifetimes for vulnerability patches, but the onus is on device manufacturers to coordinate this. In the meantime, it would be great if fewer firms practiced this sort of manic self-promotion, but unfortunately there's little incentive not to.
That's not to say Google has no responsibility in this. Google's OS has a terrible security-update policy. Being able to buy a new computing device from a store that will receive no security updates is terrible, and is fairly common in Android devices.
Now, there are valid technical reasons that Google can't be as good as Microsoft at pushing out updates to every device running their OS, but still, it's hard to say that Google has made fixing this problem a priority. Even their own 1st-party devices have a pathetic 2-year upgrade window from launch, which, I'll remind you, still means somebody can buy last year's device on a store shelf and stop getting security updates before the device is even out of warranty.
Google in most cases is not the device's manufacturer, and in a mobile device OS and application SW are tightly coupled, so you cannot really have OS updates separate from firmware updates, bypassing the actual manufacturer's own updates.
Google (and Apple and Microsoft) can totally do it for devices that they manufacture and maintain on their own, and in fact it's one of the selling points of their new phones.
In addition, at least in the US, devices connecting to a cellular network have to go through a certification process, enforced by the carriers, and every firmware update requires additional regression testing (time and money) for the device to be admitted on the network again.
(That's my guess of why Lenovo dropped the policy of fast updates for Moto phones when they bought Motorola)
Google have an approval system before they allow a manufacturer to bundle Google services. Update mechanisms could easily be built into that approval process. I suspect that they haven't turned the screws too hard on manufacturers for fear of Samsung or LG making an Amazon-style fork.
Google have already drawn their own roadmap with the Android One project - a number of low-end manufacturers have devices that get updates directly from Google. They could easily create a Nexus-type brand that manufacturers can opt into, guaranteeing timely updates and long-term support. This could be a big draw for mid-tier manufacturers.
Exactly. Apple only has one set of hardware to support. Microsoft support the PC platform and you can usually do a fresh install on any machine and it will boot (driver support is a little different).
Android is garbage in this regard. Google binds everyone's feet with the OHA so they are required to use the Google Play Store and services (and they can also never manufacture Amazon devices), yet they don't standardize the system to ensure that AOSP can be installed anywhere. Part of this is the difficulty of ARM not really being an architecture, but even Microsoft was able to deal with this by requiring UEFI and some standardization on Windows devices (although they're more like Apple where there's limited hardware to support).
Google makes a ton of money from their licensing. It's to their advantage that people buy new phones all the time. If the hardware weren't all over the place, we'd see more uptake for things like Plasma.
What are the alternatives right now for software devs that are willing to do their own roll-your-own work? Ubuntu Touch doesn't seem to have been updated for most of their ports in forever. Plasma supports two devices, neither of which have sdcard slots.
> Part of this is the difficulty of ARM not really being an architecture, but even Microsoft was able to deal with this by requiring UEFI and some standardization on Windows devices (although they're more like Apple where there's limited hardware to support).
Well, the reason why PCs are a standardized platform is because the industry was built around cloning the AT. If anything wanted to be successful, it had to do everything the AT did the way the AT did it. Once the AT started getting long in the tooth, the industry got together to agree on further standards like the ISA bus (an extended version of the AT's bus), various ATA storage standards (again, derivative of the AT's storage protocol), the ATX form factor, the PC System Design Guide (PC 97/98/99/2001), etc.
The PC industry has a culture of working together and collaborating for the sake of compatibility. For part of this, Intel was involved with the standardization process (something ARM refuses to do), but even they didn't have the full authority to force anyone to adopt their standards. In fact, there were plenty of manufacturers of x86 machines who decided to skip out on PC compatibility entirely in order to do their own thing. Just go ahead and try to install Windows 3.1 on a WonderSwan, for example. It's x86, but not a PC or any kind of AT clone. The industry simply declined to see it as a PC and moved on.
It's a shame that nobody in the phone industry ever attempted a hardware standardization effort. Google and Qualcomm could've worked together to come up with some real standards, but they dropped the ball.
I have to wonder if Google would've been able to force something through if they made Android run entirely on native code instead of shoving everything into a Java-based VM. If the industry couldn't take the shortcut of "let's just port Dalvik to our hardware and call it a day" and instead had to ensure compatibility for a wide array of native software, they might actually have developed some form of collaborative discipline.
What Google needs to do now is collaborate with Qualcomm and come up with their own standardized hardware platform. Create a phone equivalent to PCI, ATX, PC 98, etc. And then refuse to license Android to any device that isn't built on this platform. They should complete the process of moving AOSP into GApps, replace the Linux kernel with a closed-source BSD derivative, and then announce the closure of AOSP. They should do with Android exactly what Microsoft does with desktop Windows.
>What Google needs to do now is collaborate with Qualcomm and come up with their own standardized hardware platform. Create a phone equivalent to PCI, ATX, PC 98, etc. And then refuse to license Android to any device that isn't built on this platform. They should complete the process of moving AOSP into GApps, replace the Linux kernel with a closed-source BSD derivative, and then announce the closure of AOSP. They should do with Android exactly what Microsoft does with desktop Windows.
Making Android closed source won't do anything. Unless you're talking about Fire devices, Android manufacturers in the US aren't getting their code through AOSP; they get it through a side license with Google.
The problem is not that Google can't force security, it just doesn't want to.
That is specifically a design flaw in the AOSP. Right now manufacturers have to integrate their custom device drivers into every new OS build, leading to long delays and fragmentation. The device drivers should be separate, and the OS should expose a stable API and integration points. That way OS upgrades could be pushed out without breaking everything, just like with desktop OSs.
That's not Google's policy, it's Linux. What you suggest would mean abandoning Linux as a kernel. I'd be all for it - the industry needs some more open source kernel competition.
Linux is a bad choice for a half-open environment. Either it's totally open (like most Linux distros) or everything is done by the manufacturer (like routers). But not this garbage with closed source drivers that will prevent you from recompiling eventually somewhere in the future.
The basics are that every phone out there uses a forked Linux kernel patched to hell to get it working. Since none of the drivers are upstreamed, it's unmaintainable.
The Linux kernel does not have a stable driver interface, so shipping updates to phones is a LOT of work.
This doesn't explain why they can't upgrade user-space applications and libraries. It's very rare for a user-space application upgrade to require a kernel update on any major operating system.
> and in a mobile device OS and application SW are tightly coupled
I call bullshit. There's no reason Google can't update everything AOSP-y in /system - libc, libart, libwebkit etc.
That's not the point. Even if it were so, it's still the responsibility of the manufacturer to integrate it into its own firmware and push the update with the carrier's approval.
You are comparing a laptop to a smartphone, which makes no sense. The smartphone has to connect to a cellular network to be useful, and it's the carrier that establishes the rules for the update process.
I agree that it should work as you say for devices with no cellular connectivity, such as WiFi only tablets, where no other parties other than the OS and device manufacturers are involved.
Right now, Google has no credible open competitor to Android, and not for lack of trying. If Android wants to be the Windows to iPhone's Mac, it will have to get serious about security, or be swept away by the competitors which will inevitably emerge.
I also want to say that voting machines run unsupported Android builds. If Google is derelict in that duty... well, that's a much bigger deal than some compromised Google accounts.
I never heard that about voting machines. Do you have a source for that? I'm not sure why that's more surprising than hearing that they run Windows XP...
The Android device in use is the EA Tablet. The certification tests are listed in "EA TABLET FOR ANDROID WITH JELLYBEAN 4.2.1 ELECTRONIC Test Report," dating from 2013.
To be fair, it's probably the best of the horrible lot in security, but that ain't saying much.
For example, the iVotronic systems contain a readily accessible compact flash card right on the top, which stores the election returns. Demonstration machines are set up in each county, so I went to see one in person. Unsurprisingly, the demo machine's card wasn't even covered with a tamper-evident seal.
The devices, including the compact flash cards and the PEBs, are reused from year to year because the legally required certification for the device is very narrow. As the demo machine compact flash cards and PEBs are re-used in each election, at any time prior to the election, infecting the demo machine can be used as a vector to attack the entire county voting total.
Since the demo machine is not sealed, its compact flash can be accessed. If the compact flash card is compromised, the system can be quickly owned. From there, the malware can spread rather trivially to the PEB unit used as a secure token by the election workers, and from there to the county's Unity system at Election Central, allowing the entire county's vote to be altered. So instead of the 4,500 machine compromises PA is claiming would be necessary to influence a state election, it would probably only take 6-7 people any time in the past ten years planting their malware in a few key counties.
All one would need to do to untraceably change the vote totals would be to walk into the county election commission, swap the compact flash out for your malware, and leave. If you do this at any point prior to the election, the malware can spread from the demo machine to a live voting machine, and finally, when the compact flash cards are entered into the Unity system for final tally, the malware can compromise the whole lot. Then the malware would self-delete, leaving no reliable paper audit record.
Interestingly, from a legal perspective, the Secretary of the Commonwealth's certification for these machines is contingent upon the locking mechanism preventing access to the compact flash card. The machine that I saw, the most common model in use in the state, physically could not be secured that way. The plastic cover mechanism to which the lock is affixed simply doesn't cover the flash card slot well enough.
Under the PA election code, if a specific requirement of the Secretary's certification is not met, the law would invalidate the votes cast through all the iVotronics as a matter of law. As the machines were not configured as approved, they aren't approved for casting ballots, which would throw the PA recount into chaos. It's probably the only judicial avenue left to sue for a state-wide recount that might actually have a chance of being considered.
Nobody tell Jill Stein. In all likelihood, the PA legislature would just send the current electors anyway, as is their prerogative.
Now I really want to post this to /r/politics or one of the jill stein subs, with a title like "PENN VOTING MACHINES COULD HAVE EASILY BEEN HACKED, THE VOTES ARE INVALID".
It would get upvoted, perhaps to the front page, and then news outlets would likely pick up the story.
The iVotronics hacking part is very public. The legal aspect may not be as well-known.
The iVotronics vulnerabilities were documented in a lawsuit joined by the Commonwealth's own Deputy Commissioner of Elections. See Banfield v. Cortes [0]
The Election Code specifies that the Secretary of the Commonwealth shall certify electronic voting systems, and issue directives and instructions upon which such approval is conditioned, with which counties are required to comply.
§ 3031.5. Examination and approval of electronic voting systems by the Secretary of the Commonwealth
(a) The Secretary of the Commonwealth may issue directives or instructions for implementation of electronic voting procedures and for the operation of electronic voting systems....
The county board shall comply with the requirements for the use of the electronic voting system as set forth in the report by the Secretary of the Commonwealth...
(c) No electronic voting system not so approved shall be used at any election... [1]
The Secretary alone determines the method of certification.
While the Legislature mandated that an electronic voting system must comply with specific federal testing and performance standards and the requirements set forth in the Election Code, it does not prescribe a particular testing procedure to govern the manner in which the Secretary is to perform the examination, but ultimately left this discretion to the expertise of the Secretary, who is tasked with implementing the Election Code. [0]
However, counties must still comply with the implementation "directives and instructions" issued by the Secretary.
Section 1105-A of the Election Code, 25 P.S. § 3031.5 requires that the Secretary of the Commonwealth examine all electronic voting system used in any election in Pennsylvania and that the Secretary make and file a report stating whether, in her opinion, the electronic voting system can safely be used by voters and meets all of the applicable requirements of the Election Code...
The Secretary of the Commonwealth certifies the iVotronic Voting System in accordance with the conditions detailed in the reports... and the following conditions. [2]
The implementation directives and instructions in the iVotronics certification include the specific provision that counties "must install the locking mechanism over the serial port and compact flash memory in a manner to prevent access to the compact flash card."
3. Pennsylvania counties using the iVotronic Voting System must install the locking mechanism over the serial port and compact flash memory in a manner to prevent access to the compact flash card. [ibid.]
As the construction of the locking mechanism itself renders the compact flash accessible regardless of the physical lock used, as determined by multiple audits in academia as well as other states, the iVotronics system in question was not certified in accordance with the requirements of the statute.
The Secretary put a caveat on the certification of the iVotronics with which the counties did not comply. This is physically analogous to requiring that a tenant "must install a lock on this door which prevents access to the inside of this room," but the door cannot latch no matter which lock is used. If instead of repairing the latching mechanism, the tenant merely replaces the lock, he would not be in compliance with the directive.
Rather than work with the manufacturers to create a locking mechanism that complied with the Secretary's directive (changing the latch), the counties merely changed the locks used. These locks do not prevent access to the compact flash card, and thus the Secretary's implementation requirements were not met by the counties which used them. The counties' failure to meet these directives was not due to lack of ability, as the requirements of the iVotronics maintenance contracts include modifications necessary to comply with state law, or lack of knowledge, as the flaws were disclosed during Banfield, which is cited by the certification report itself.
The counties simply failed to ensure that the locking mechanisms were updated subsequent to the Secretary's report. Each county board is required to submit their vote totals to the Commonwealth in accordance with the Election Code. As the Election Code requires counties to comply with the Election Code, a county's failure to meet the Secretary's certification requirements disqualifies its reported vote totals.
It's pretty telling about the seriousness of the recount effort that nobody has even bothered to sue a county that used these machines. The Commonwealth's Election Code is not a mere recommendation to the county. Its provisions regarding DREs are specifically intended to punish counties that do not comply with the Secretary's requirements for certification, which many did not.
But please don't cross post me. You won't accomplish anything, except maybe landing me in DHS lockup for ten days for no reason.
> Being able to buy a new computing device from a store that will receive no security updates is terrible, and is fairly common in Android devices.
This seems like the kind of problem the free market could solve. Just get one phone vendor to guarantee security updates for a few years, and some customers will start buying those phones. After a while, other vendors will either start promising it or lose sales.
> This seems like the kind of problem the free market could solve.
It's the kind of problem that would be solved by a perfect market, where all actors are rational, have access to complete information, and correctly prioritize their long-term and short-term needs.
Alas, the world we live in is seven billion highly distracted primates who interact by wiggling their smallest appendages on grids of buttons and pushing streams of air over a weird blob of muscle located inside an organ also used for food consumption.
The underlying assumption is that a multitude of users would switch to devices produced by such a manufacturer. This, I think, overestimates how much most users currently care about security.
As it turns out, there are more secure devices in the marketplace than the affected phones, but they cost more. All other things equal, a contractual obligation for security policies would increase the cost (and thus price) of devices, and users would likely stick with cheaper options.
>This, I think, overestimates how much most users currently care about security.
The media has failed to inform the lay public about this issue. Users could be made to care about security with the right messaging. Your average user may not understand OS updates but the issue can be phrased simply in terms of product defects which the manufacturer refuses to fix and that put their personal info at risk.
BlackBerry's Android phones do this. They patch phones the same day Google's Nexus devices are patched, or even earlier for beta-program users. Still, no one is buying them.
No it doesn't, because most everyday users don't give a toss about security. It has to be something that is pushed as a best-practice by those who know better, not something that is demanded by an everyday user who doesn't. The invisible hand won't do shit here.
That economic fiction requires an ideal rational actor and a different time horizon.
1) Nobel Prize-winning researchers (Kahneman & Tversky) showed that economic actors are not rational.
2) Taking a long-term view tends to require sufficient funding to allow one to worry about the long term. People with less money worry intensely about the short term, and for them this is totally rational.
Checkpoint has been notorious for this kind of exaggerated marketing, especially within the past few years. My theory is that their security appliance line has been suffering due to superior competitors (source: personal experience; could be wrong without global sales numbers), so I think they're trying to get their name back in people's minds.
This research is definitely good and beneficial, but yes, it's threat intelligence research, not vulnerability research or any sort of revelation. Definitely not deserving of a logo.
Years and years ago I went out searching for UMD devices vulnerable to CSRF.
They universally were, but the only vendor that responded to my email was CheckPoint, who admitted it and said they were working on a fix. (They had the fix released soon, too.) Everyone else was 100% silent treatment.
What % of the devices you tested were security appliances? It's good Checkpoint did that, but you would expect most security companies to care a lot more about patching their devices relative to all of the other various network device manufacturers out there.
I'd argue the opposite. It's this dismissive attitude about a vulnerability affecting 1M+ accounts that is the problem in some sectors of the security industry.
You're drawing arbitrary lines around what Google is responsible for and what the user is responsible for, and ultimately blaming the user for having an "older device". But guess what? This problem doesn't affect iOS products anywhere nearly as much, even though there are hundreds of millions of older devices in use. That's because Apple took an approach that allowed them to ensure devices stay up to date. Google didn't. And that's as important to security as is UX design and all the other often-dismissed factors that go into achieving successful security outcomes.
>the malware campaign specifically targets older devices using previously known vulnerabilities.[1] There is no new exploit research here.
That statement implies a certainty which has not yet been established. A Google employee says that, as far as they have been able to investigate, several variants use known vulnerabilities.
In any case, there is indeed a giant security flaw when an application executing in a supposed sandbox can get root access. It points to severe design flaws in the application model as well as the underlying OS. The fact that elevation of privilege is almost expected should be unacceptable. Given the terrible state of security updates in Android, I would say it is worth drawing as much publicity as possible to these events.
The fact that most consumers aren't aware that most Android devices are susceptible to these kinds of vulnerabilities argues for more noise about these issues - not calming press releases talking about how the issues are moot with the latest build.
>Malware on your Android device picked up from third party app stores
They say that, but then Google's G+ post[1] says "These apps are most often downloaded outside of Google Play"
You could read "most often" as "some of these were downloaded from Google Play".
Either way, they are exploiting known vulnerabilities. The big issue to me is that phone manufacturers / carriers, by choice, stop patching phones whenever they please.
- Removing apps from Play: We’ve removed apps associated with the Ghost Push family from Google Play. We also removed apps that benefited from installs delivered by Ghost Push to reduce the incentive for this type of abuse in the future. Downloading apps from Google Play, rather than from unknown sources [https://goo.gl/9rqdiH], is a good practice and will help reduce the threat of installing one of these malicious apps in the future.
Adrian also stated they were removing affected apps from the Play Store. Many of the more recent vulnerabilities even show up in the Play Store first.
Adrian's constant defense hinges on saying "stick with the Play Store, where we protect you", but the Play Store really isn't much better, it's just that saying it is scares people from looking at competitors' markets.
It is not likely that this malware is picked up through F-Droid as that 'store' only contains software which was built from source by the store maintainers. Any non-free code is removed from the build before the package is hosted on the download server.
As such an Android device can be used (in a useful way) without having Google Play services (or, for that matter, any other Google apps) installed. I've been doing just that for more than 5 years now without having the feeling I'm missing out on something.
AOSP or a tailor-made Cyanogenmod (with all the Cyanogen-account related stuff removed) plus F-Droid gives you a perfectly usable device.
Do you actually know which stores they mean? I'd hate for F-Droid to be vilified. F-Droid isn't just a store, it's an Android Repository Browser[1]. It would be a shame if the F-Droid repository was exploited beyond the concessions[2] that they allow.
I do not, just wanted to throw a couple that I know of out there. Hopefully neither of those third party stores because I like and use them both. I hope it was clear from the question marks in my post that those were just examples, certainly don't want to smear either one.
If you're going to name app stores, I would think places like Baidu would be more likely, given their size and popularity with users of lower-tier Android devices.
>Looks like it only works on the older Android 4 Jellybean software (and some Android 5 Lollipop) and below, so mostly concentrated in Asia where there are lower-end phones.
Yep, I'm stuck on 4.4.2 because Verizon doesn't provide OTA updates any more, and the 4.4 update ensures that the phone bricks if you go through the process of installing CyanogenMod.
If it is just auth tokens instead of email passwords, shouldn't Google be able to invalidate all these auth tokens in their backend immediately? Force those users to re-login and get new auth tokens?
The malware is still installed and would just capture the new auth tokens. And forcing the user to login would also give the malware an opportunity to capture the actual password.
Google should log other signatures such as device ID, IP, network, and the region the request is coming from, and use those data as an additional layer of security in the backend to help identify the folks/org behind the hack.
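To make the idea concrete, here's a minimal sketch of that kind of backend risk scoring: compare a token's current use against the device/network profile recorded when it was issued, and force re-authentication when too many signals diverge. All names and the threshold here are hypothetical illustrations, not anything Google actually does.

```python
from dataclasses import dataclass

@dataclass
class TokenProfile:
    """Signals recorded when the token was issued (illustrative)."""
    device_id: str
    ip_prefix: str   # e.g. the leading octets of the issuing IP
    region: str

def risk_score(profile: TokenProfile, device_id: str,
               ip_prefix: str, region: str) -> int:
    """Return 0-3: one point per signal that no longer matches."""
    score = 0
    if device_id != profile.device_id:
        score += 1
    if ip_prefix != profile.ip_prefix:
        score += 1
    if region != profile.region:
        score += 1
    return score

def should_reauthenticate(profile: TokenProfile, device_id: str,
                          ip_prefix: str, region: str,
                          threshold: int = 2) -> bool:
    # A single changed signal (e.g. roaming) is tolerated; two or
    # more triggers a forced re-login.
    return risk_score(profile, device_id, ip_prefix, region) >= threshold
```

The threshold keeps false positives down for legitimate travel or network changes while still catching a token replayed from a different device in a different region.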
Just to be clear, they didn't obtain any passwords, but auth tokens. This would potentially allow them to log into accounts, but only as long as the tokens are valid.
Also, they don't reveal which "third party app stores" served infected apps, but they do provide a list of infected apps, and searching for these yields some real shady download sites: http://imgur.com/a/0luW3
Couldn't Google just revoke all of those access tokens? It'd be a minor inconvenience for some, but it would hardly be a big deal, right? You'd just have to grant access again.
You're forgetting the less diligent users. I guarantee some Google users don't remember their password and just count on never being prompted; I know some of them.
> While Google implemented multiple mechanisms, like two-factor-authentication, to prevent hackers from compromising Google accounts, a stolen authorization token bypasses this mechanism and allows hackers the desired access as the user is perceived as already logged in.
The trouble is that auth tokens are generally not tied to a specific device or IP. There aren't really any mechanisms for this in standard OAuth 2.0 flows (if indeed this is what they're using).
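Standard OAuth 2.0 bearer tokens are indeed valid for whoever presents them. One way a provider could bind a token to a device — purely illustrative, not part of the OAuth 2.0 spec — is to include a device identifier in a server-side HMAC, so a token replayed from a different device fails validation:

```python
import hashlib
import hmac
import secrets

# Server-side secret; never leaves the backend.
SERVER_KEY = secrets.token_bytes(32)

def issue_token(user_id: str, device_id: str) -> str:
    """Mint a token cryptographically bound to the issuing device."""
    nonce = secrets.token_hex(8)
    mac = hmac.new(SERVER_KEY,
                   f"{user_id}:{device_id}:{nonce}".encode(),
                   hashlib.sha256).hexdigest()
    # The device_id is NOT embedded in the token; the client must
    # present it (or it's derived server-side from the connection).
    return f"{user_id}:{nonce}:{mac}"

def validate_token(token: str, device_id: str) -> bool:
    """Recompute the MAC with the presenting device's id."""
    user_id, nonce, mac = token.split(":")
    expected = hmac.new(SERVER_KEY,
                        f"{user_id}:{device_id}:{nonce}".encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(mac, expected)
```

Of course, malware running on the device itself can also read the device identifier, so this only defeats off-device replay; it's a mitigation, not a fix for a rooted phone.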
I used to work at an ad-tech company focused on mobile CPI offers that, for several months, paid the salaries of everyone involved by injecting malware into cracked apps on several third-party app stores (they were making enough profit from it to dedicate a team to it alone).
They even managed to automate the whole process of "selling" cracked apps on third-party stores. It is amazing how easy it is to trick broke 13-year-old kids into installing stuff on their phones.
There absolutely is a trade-off here between freedom to operate and likely security (with the exception of highly skilled technical people with a lot of time on their hands, who can likely have both).
Personally I'd say that most non-technical computer users are better off using a more locked down/secure OS (e.g. iOS) as they are generally ill-equipped to manage an open computing platform with the current level of threats that there are out there.
You can win: sane defaults. Allowing the potentially unsafe "expert mode" is opt-in. I wouldn't buy a car if the dealer held the only key to the hood, but I still don't expect everyone to be a grease monkey, nor do I think it remarkably safe.
What's "insane" about explicitly having to opt in to 3rd-party application installs on Android? That switch existed for years and was widely praised. Media opinion only shifted after Android became the most widely distributed phone OS. It's just easy clicks.
I'm in agreement with you, and that's what I was saying, or tried to say. Opting into 3rd-party installs is fine with me; having the option disabled is a "sane default".
Yeah, Android's not fine: the business model, the weak full-disk encryption, the inability to set a strong FDE password separate from your PIN, centralization around Google Play Services, moving AOSP into GApps iteratively, forced obsolescence via carriers/hardware producers...
Android could let me add a specific source so I don't have all-or-nothing security.
For example, I only wish to install apps from HumbleBundle and F-Droid (in addition to the default play store). Any other APK that lands on my system is still untrustworthy.
Of course, this would probably require an additional signature on the APK, since I think all 3rd-party sources are just providing raw APKs.
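Conceptually, a per-source trust policy could pin the SHA-256 fingerprint of each store's signing certificate and reject any APK whose signer isn't on the allow-list. The sketch below only illustrates the pinning idea; real APK verification is far more involved (v2/v3 signing-block parsing, certificate chains), and all the names and fingerprints here are made up:

```python
import hashlib

# Hypothetical allow-list: SHA-256 fingerprints of the signing
# certificates for sources the user explicitly trusts.
TRUSTED_SIGNERS = {
    "f-droid": "aa" * 32,        # placeholder fingerprint
    "humblebundle": "bb" * 32,   # placeholder fingerprint
}

def signer_fingerprint(cert_der: bytes) -> str:
    """Fingerprint of a DER-encoded signing certificate."""
    return hashlib.sha256(cert_der).hexdigest()

def install_allowed(cert_der: bytes) -> bool:
    """Allow the install only if the APK's signer is pinned."""
    return signer_fingerprint(cert_der) in TRUSTED_SIGNERS.values()
```

The point is that "unknown sources" stops being a single global switch: each extra source the user enables is tied to one signer, and an APK arriving from anywhere else stays untrusted.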
Apple makes the choice with iOS to be closed. People hate on that because they distrust a centralized authority with good reason, however it is far safer for those who don't mind giving up control.
Google makes the choice with Android to be open. People hate on that because they want the system to be safe, however it is in the control of the end user for those who don't mind taking responsibility for the safety of their own device.
Both absolutely win at what they are trying to be.
MS on the other hand tries to pretend to be both while actually not delivering the benefits of either. That is certainly a way not to win.
You definitely can't win, but those two complaints are not mutually exclusive. Instead of locking down apps to solve the first problem, in theory Microsoft could have redesigned Windows to make third party executables less of a risk. Obviously that's harder, but it's not hypocritical to make both those statements.
UWP apps can't be run outside of the MS store. So that's the lock-in he was talking about. It would be nice if Microsoft enabled "mini-VMs" for legacy x86 apps at least.
That way it could kill two birds with one stone: make x86 apps a little slower and more resource-intensive, giving both users and developers a reason to switch to UWP, while at the same time making legacy x86 apps vastly more secure.
The problem with that is that you can't tell programmatically if a potentially risky action is performed by a program acting as the user's agent or by a program acting on behalf of some malicious fuckwit.
You can just forbid it, sure, but then you're reducing the usefulness of your platform.
Except Microsoft provides updates for Windows longer than any other popular desktop or mobile operating system. So if crap gets installed because of an OS vulnerability, either you didn't update or the flaw will be patched as soon as possible.
What is the downside of someone having your email address, especially with no other context?
If they have other data on your email address, they don't need your email address to do anything with it. If they don't have other data, then there's no issue.
You're not concerned about people having public access to your Twitter handle, so why would you be concerned about people having public access to your email address?
A list of emails + IP addresses would be valuable to web surveillance companies that already have trails of data for the IP addresses and emails separately.
Well, they at least know my email address and my interest in Android phones. Am I the only one thinking it would be easy to sell the whole list to a Chinese manufacturer?
Indeed. Literally all you are providing to the site is an email address. I could check my coworkers' email addresses. They have a list, I'm checking to see if something is on the list.
Any site that requires you to provide an email address and password (or any other "verifiable info") to check is probably a scam.
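For the curious, a checker like this can be made even more privacy-friendly with hash-prefix ("k-anonymity") lookups — the pattern HaveIBeenPwned uses for its password API — where the client never sends the full identifier at all. A rough sketch, with an invented breach list and helper names:

```python
# k-anonymity lookup sketch: the client reveals only the first 5 hex chars of
# a hash; the server returns the suffixes of every matching entry; the final
# comparison happens locally. The breach list below is invented.
import hashlib

BREACHED = {"victim@example.com", "other@example.org"}  # server-side list

def _h(email: str) -> str:
    """Normalize and hash an email address."""
    return hashlib.sha1(email.lower().encode()).hexdigest()

def server_suffixes(prefix: str) -> set:
    """Server side: return hash suffixes for all entries matching the prefix."""
    return {_h(e)[5:] for e in BREACHED if _h(e).startswith(prefix)}

def is_breached(email: str) -> bool:
    """Client side: query by prefix only, compare the suffix locally."""
    digest = _h(email)
    return digest[5:] in server_suffixes(digest[:5])
```

With this design the server never learns which exact address you checked, only a bucket of roughly similar hashes — strictly better than submitting the address itself.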
K9Mail with Fastmail, AddressToGPS + OSMAnd, I back up my own photos with a home server and some rsync (Syncopoli). There is PhotoBackup which is kind of neat though.
We use Google Calendar for stuff at work; Google still has an old, pseudo-deprecated, never-talked-about CalDAV endpoint I hit with DavDroid that gives me all of that. These problems were solved with protocols long before walled gardens. I use an offline calendar app (yes, you need an app for that on Android) to create a calendar for myself locally, though I have a Radicale server for things I need synced. Could have used Fastmail, I guess.
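For anyone wanting to poke at a CalDAV server by hand, the request body is just a small XML document defined in RFC 4791. A sketch that builds a minimal calendar-query REPORT — the collection URL in the final comment is a placeholder, not any real endpoint:

```python
# Build a minimal CalDAV calendar-query REPORT body (RFC 4791). Any CalDAV
# server (Radicale, Google's legacy endpoint, Fastmail) should answer it.
import xml.etree.ElementTree as ET

DAV = "DAV:"
CAL = "urn:ietf:params:xml:ns:caldav"
ET.register_namespace("d", DAV)
ET.register_namespace("c", CAL)

# <c:calendar-query><d:prop><c:calendar-data/></d:prop>
#   <c:filter><c:comp-filter name="VCALENDAR">
#     <c:comp-filter name="VEVENT"/></c:comp-filter></c:filter>
query = ET.Element(f"{{{CAL}}}calendar-query")
prop = ET.SubElement(query, f"{{{DAV}}}prop")
ET.SubElement(prop, f"{{{CAL}}}calendar-data")
filt = ET.SubElement(query, f"{{{CAL}}}filter")
vcal = ET.SubElement(filt, f"{{{CAL}}}comp-filter", name="VCALENDAR")
ET.SubElement(vcal, f"{{{CAL}}}comp-filter", name="VEVENT")

body = ET.tostring(query, encoding="unicode")
# Send as the body of an HTTP REPORT request with "Depth: 1" to the calendar
# collection URL, e.g. https://caldav.example.com/user/calendar/ (placeholder).
```

There really isn't much "protocol" here — which is the point: plain HTTP plus a little XML, no garden required.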
Funny, I use exactly the same apps on my Cyanogenmod phone - K9 + Fastmail, OSMAnd. There aren't that many good alternatives to Google, apparently.
The only thing I'm still missing is public transit navigation.
I do have the Play Store installed though - through OpenGApps pico - I only have the Play Store, none of the other Google stuff - with a dummy gmail account.
My city has a transit app they maintain, but I'm afraid that'll stop if Google gets too ubiquitous in this area. It's really too bad that information that should be released as clean open-data sets is so difficult to acquire and parse easily. There really should be some regulation around what, how, and how often public information is published and maintained.
F-Droid for what I use daily. Firefox Aurora direct from Mozilla, which auto-updates. HumbleBundle for games because I like funding DRM-free + outside-Google-Play distribution methods.
I use raccoon to download from GPlay if I need to as a peer stated and know when to update with apktrack from fdroid, but really that's only for Slack since we use it at work. Slack uses Google Push Notifications like most apps, so it doesn't notify me about updates, but I check it periodically enough. I could go the microG route, but I prefer not having to interface with google servers at all if possible.
Edit: Maybe I can either convince my work to use something like Riot.im when it gets mature enough, or use a slack bridge for it and rid myself of this problem.
You can use Raccoon http://www.onyxbits.de/raccoon : "Download apps directly from Google Play. Raccoon is the only APK Downloader that also supports paid and large apps."
I've got a few crutches I'm not ready to give up, so I'm on regular Android, but I've started down the path of using F-Droid-only apps so I can transition more smoothly when I'm ready. The only app I've got left is Maps; OsmAnd is a little too tedious for me, but I'll convince myself it's worth it eventually. I'm also concerned that I might not be able to get Project Fi working quite right.
I am using "Here WeGo" https://here.com/ (originally developed by Nokia), quite a good alternative to Google Maps. It does not need any Google services installed. Downloaded it directly from the Google Play store with Raccoon.
Hmm, that Raccoon thing seems ok but using something like that would be a bit of a sideways move for me. I'd rather move toward 100% libre software and not rely on proprietary software I'll need to hack around to maintain privacy.
F-Droid and APKMirror. You can use any app that doesn't rely on Google services, including open-source alternatives for apps like YouTube and Google Maps. You can access many things from your web browser as well.
I have a Google account created with a non-Gmail email address. I only use it for YouTube subscriptions (and I'm in the process of moving that to a basically RSS reader) and Google Play.
I have absolutely nothing work-related on my phone since I know that I can't update the system when a patch is submitted thanks to the shitty Android ecosystem. (And I'm still stuck at v5.0 and can't figure out a safe way to root my phone.)
Oh, and every single point of Google's tracking that I can turn off is turned off (like searches, YouTube history, location history etc).
As an android phone owner for just over 30 hours.. at this point: yes. I grow less inclined to change that. OTOH, I never trusted my previous I-device with any significant data either, and any paid apps on there were gifted from a separate account on a desktop machine.
Damn, should have thought of that. I assume it's not possible to transfer the few purchased apps I already have? I'd probably be willing to ditch them.
Is that what I have? A security-first mindset? Honestly it never even crossed my mind back in 2010 when I bought a Nexus S that the thing to do was put the keys to the kingdom on something that might fall out of a pocket somewhere... silly me; I just made a separate account and I don't use that account for signing up other accounts or whatnot; when that account gets compromised they'll find precious little of value.
I take that as validation; the stuff that matters to me isn't on Google's systems and never has been.
I got down-modded pretty badly for my first comment; apparently that was terribly offensive. I don't really care much but it is very telling; the same folks that wail and moan endlessly about having their privacy compromised by Google et al. apparently don't hesitate to make it trivially simple to do so.
It's not. In fact, Google has the right to publish your private emails if it's for promotional purposes. The worldwide, irrevocable license they grant themselves to everything you store on their service has no requirement that they not publicize private information. Check their Terms of Service.
Not really phishing, just malware hosted on different app stores. And the Google post[1] seems to indicate perhaps some of them were on the official Play store. I read "These apps are most often downloaded outside of Google Play" as "maybe some were downloaded from Google Play".
> While Google implemented multiple mechanisms, like two-factor-authentication, to prevent hackers from compromising Google accounts, a stolen authorization token bypasses this mechanism and allows hackers the desired access as the user is perceived as already logged in.
What's the right fix here? Should auth tokens be ip-address-tied? How much will that break? Or would that not even fix it?
While it might be possible to strengthen the mechanism used to keep sessions, the best fix isn't necessarily on that front. They have root access to the device you're logged in with, so they can do whatever they want with your current sessions. Auth tokens actually limit the damage by being expirable and being insufficient to perform a new login.
I'm not sure what the solution is, but my worry with tying to an IP address is the mobile setting where I may transition from work wifi to bus wifi to home wifi, with a mobile carrier in between all of those steps.
Maybe something like a device ID, although I assume that that can be easily stolen and spoofed.
Would this mean that the token is only good for the duration of the connection though? Most apps on mobile hold tokens that ~never expire (iirc I've never had to re-auth the Gmail app.)
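One mitigation along these lines is binding the token to a secret that never leaves the device, so a copied token is useless by itself ("proof of possession" — roughly what Token Binding and DPoP later standardized for the web). A toy sketch using an HMAC over a server-issued nonce; all names here are invented for illustration:

```python
# Device-bound token sketch: the server stores a per-device key at login, and
# every request must carry an HMAC over a fresh nonce. A stolen bearer token
# alone no longer works; the attacker also needs the device key.
import hashlib
import hmac
import secrets

SESSIONS = {}  # token -> device key (server side)

def login():
    """Issue a token and register the device key presented at login."""
    token, device_key = secrets.token_hex(16), secrets.token_bytes(32)
    SESSIONS[token] = device_key
    return token, device_key

def sign(device_key: bytes, nonce: bytes) -> str:
    """Client side: prove possession of the device key for this nonce."""
    return hmac.new(device_key, nonce, hashlib.sha256).hexdigest()

def verify(token: str, nonce: bytes, proof: str) -> bool:
    """Server side: constant-time check of the proof against the stored key."""
    key = SESSIONS.get(token)
    return key is not None and hmac.compare_digest(sign(key, nonce), proof)
```

The caveat from upthread still applies: malware with root can use the device key directly, so this only protects tokens exfiltrated off the device, not the compromised device itself.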
The difference between iOS and android could not be more clear in this regard. It's interesting to see the difference in security between the two. It's night and day. Google has some serious problems to address. But it seems like they don't care. Their track record is deplorable regarding android security.
Is this really the best google can do?
Most Android isn't stock and there are a ton of old versions out there, so it's a little apples to oranges. I think if you had a phone made by Google and kept it up to date, it would probably be pretty secure.
I didn't hear any customer beg for a customized version of Android. Rather "stock Android" seems to be a selling point nowadays.
It's just a bunch of marketing weenies looking for "an unique opportunity to put focus on the brand". Seldom do I see things (like multitasking on some Samsung devices before it came to stock Android) that would really help the end user.
You can see if your account has been affected here:
https://gooligan.checkpoint.com/