In the Android security model, this is not a capability that should have ever been provided to apps to begin with. It allows apps to have almost free rein over the filesystem of other apps (read, write, modify), among other powerful capabilities. No flagship Android phones on the market allow this.
It would be like if the Facebook app on your iOS device could read all of the files of your banking app, or notes app. It's just not something that is enabled in any Android system claiming to have some level of security.
> but this feature is clearly gated behind user permission
You're right that the owner of the device should have the ultimate say. But the sad reality is that most owners aren't necessarily good caretakers of those devices. They don't understand what that permission entails, and they don't actually want to take responsibility for the outcome of the decision. But they will want to hold the manufacturer accountable for the damage.
I can't count how many times I heard people say "this decision should be mine to make" only to follow it up after some time with "somebody should have warned me not to do it". It's human nature and the solution for this can't/won't be technical.
Windows XP was a good example of letting the person decide what's good for their device and it was also the OS with the slowest adoption of updates. People collectively decided that the discomfort of rebooting once in a while was worse than letting malware completely wreck their device and data.
> I can't count how many times I heard people say "this decision should be mine to make" only to follow it up after some time with "somebody should have warned me not to do it".
The correct response, if (as in this case) they were warned, is to say “someone did warn you, pay more attention next time”, then walk away[1].
Just like if a beginner ignores the black piste markers and the “for good skiers only” sign at the top of a slope then complains that they fell over.
It is problematic to create in users an expectation that if they blindly mash at their globally-networked, bank-account-connected devices without paying a modicum of attention to anything that appears on the screen when they do so, that everything will be fine, and if it’s not it’s someone else’s fault.
> The correct response, if (as in this case) they were warned, is to say “someone did warn you, pay more attention next time”, then walk away[1].
In reality, this does far more harm than good. In almost all cases this goes wrong because of the 'little learning is a dangerous thing' problem. People tend to be in two camps:
- Don't care, don't want to fiddle with the thing, the manufacturer has to do everything
- Knowing just enough to break things, but not enough to fix it (and thus it is the fault of the manufacturer)
Other types like the 'I am the owner, I make the rules' crowd are insignificantly small.
This means that in the real world (so not in an echo chamber) you only get one scaled and realistic scenario: the user creates problems (for themselves, others), but cannot fix them, and everyone/everything not-user then has to care for them to deal with it.
In an ideal theoretical world we might say that the end-user has to be responsible, and they have to make infinite mistakes and learn everything so they can become good caretakers of their networked systems. But that is not reality, and is not realistic.
Harm reduction isn't always the most important goal, especially when it's other people's harm and reducing it also involves restricting what they can do.
You don't have to allow all users everything, but you should allow those who want, to do as they please.
You can always hide the option behind some kind of mechanism. A mechanism a general user wouldn't bother with, because they don't need it if the rest works as intended. Those who still do, should suffer the consequences, but this is not the manufacturers' problem. They have all kinds of safeguards to prevent liability because of those "special choices".
People would go to great lengths to follow tutorials on the internet to disable things they were told were bad for them. The less qualified, the more likely that they fell for the "updates are bad, they ruin your computer" narrative. As long as there's an option that can be abused, people will be tricked into allowing it.
This is less relevant for the current discussion about the FireTV and this feature. It's for the more general discussion of being able to do whatever you want on a device you own.
But why should everyone else suffer because of that small fraction?
The real answer: users are captive. For the vendors, they're cattle. And like with any good big farm, it does not matter how much it sucks for the cattle - but it does matter that the cattle are safe, because a few bad cases can become known and risk your farm getting shut down.
'Krasnol argued for keeping powerful/dangerous features, but making them opt-in (and a bit of a hassle to enable). You countered that there will be "some fraction" of users incapable of not hurting themselves with those features, who "will still stumble upon it anyways and will still refuse to take any responsibility for enabling it". My counter to that is that we shouldn't remove such power features just because "some fraction" may find and misuse them.
That's the should/should not part. The rest is my take on why companies remove those features anyway - they have no incentive to provide anything above bare minimum, especially not when they could be on the hook for "some fraction"'s mishaps.
I didn't raise the 'should/should not part' at all, you are the one who raised the point. I'm focused on actual facts and possibilities in this comment chain.
And as we've seen in the PC space, this will absolutely destroy security as the general population will simply hit "Ok" or "Allow" on any (security) prompt so they can get to their desired goal.
I hate to say it but at some point it is their device and they can install malware if they want to. I think it is good to put up some warnings and make particularly dangerous permissions particularly hard to give. But at some point it is my device and you need to get out of my way.
Yes, but one of the major externalities in this context is security industry smothering every computing platform, turning it from a bicycle for the mind into a TV for the mind.
> the general population will simply hit "Ok" or "Allow" on any (security) prompt so they can get to their desired goal.
So let them? I keep hearing this argument but I have yet to hear a good explanation of why it's a problem or why I should care.
If a thief walks up to someone's door and asks to be let in, and the person opens the door and lets them in, is that a security flaw on the door's part? Should we make doors harder or even impossible to open by their owners to prevent them from letting a criminal in?
Cool, ok… you either learn from your mistakes or you don’t.
Developers aren’t responsible for the general population doing dumb shit, as long as they don’t trick them into it, and it doesn’t happen as a result of bugs in their software.
Imagine if the makers of stoves or kitchen knives believed that they should design out every possible way someone could burn or cut themselves…
“Do you want to let [application] access [the calendar|your photos|files created by other apps]?” seems totally reasonable; stopping users from running programs that do this altogether, not at all.
The biggest problem with it all is: that which OS developers do to “protect users” becomes what application developers use to constrain users and prevent them accessing their data, in order to extract more money or control how people use their own devices.
My butterknife doesn’t have the ability to upgrade itself to a chainsaw over the internet. Software has this somewhat unique and autonomous ability; comparing it to static household objects when it comes to manufacturers’ legal obligations (or ethical oughts) doesn’t necessarily make sense.
Normally, I’d agree with you. But over the years I have come to think that bad practices on internet-connected devices end up being everyone’s problem.
Even that barely helps - try opening the web dev console in your browser on a popular social media site and there are huge warnings telling people not to paste in commands they've been told will "hack Facebook and see nudes from your hot neighbor".
I tend to disagree, at least for many purposes. In the world of mega-apps (WeChat, Facebook, etc), do you really want these apps to be able to ask for or even require permissions like this?
In an older, kinder, gentler era of computing, if I granted a permission to an app [0], it was probably doing something with that permission that I wanted. Nowadays, not so much — apps are generally actively hostile to the user, and even apps that are friendly are frequently purchased by more or less malicious companies that turn them into malware.
[0] Yeah right. There were no permissions. And apps were mostly well behaved because they had no way to call home to the mothership.
> I tend to disagree, at least for many purposes. In the world of mega-apps (WeChat, Facebook, etc), do you really want these apps to be able to ask for or even require permissions like this?
Not really… I want to ban or break up/massively curtail the apps and/or their business models and aggressively police them so that that isn't a thing…
This isn't actually about filesystem permissions. ADB is a debugging tool that seems to have been used by certain applications. It shouldn't be required for any normal app functionality including storage access.
That said this is still a user hostile change. I will never purchase a device that blocks ADB since it's required for several useful things, often related to fixing or working around issues created by the vendors themselves.
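To make that concrete, these are the kinds of things ADB is routinely used for from a PC over USB (standard `adb` commands; the package name is a placeholder and the commands obviously require an attached device):

```shell
adb devices                                        # list connected devices
adb install myapp.apk                              # sideload an app the vendor's store doesn't carry
adb logcat                                         # stream system and app logs for debugging
adb shell pm uninstall --user 0 com.example.bloat  # remove preinstalled bloatware for the current user
```

That last command is the classic "fix the vendor's own mess" use case: removing preloads the settings UI won't let you touch.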
In WhatsApp you can enable saving photos and videos to your phone gallery, and then any backup app can back them up. Google Photos and numerous other apps are able to back up WhatsApp photos.
>It allows for apps to have almost free reign over the filesystem of other apps (read, write, modify), among other powerful capabilities
What went so wrong with personal computing that allowing an application to have access to the files on the device with the user's permission is now considered an unthinkable crime? Is the average smartphone user so clueless about the capabilities of their device beyond scrolling through tiktok that the use cases for this are beyond them?
It's because of people like you that I can't even load up an FTP app to backup the files on my android phone to my PC anymore.
Here's the scenario in question. Your average person with a smartphone, who is not so technically inclined, downloads a game of some sort. The game upon first launch pops up a dialog which says "to provide you with the best experience, we need to clean up temporary files on your device, when prompted by the dialog (screenshot of system dialog), please press 'OK'." The user is then presented with the relatively scary system dialog which says "Allow this app to use system debugging features?", which they have seen 100 times and never understood, and decides that this time they will press 'OK'. The game then proceeds to send all of their photos to a malicious actor for whatever purposes.
The average person simply isn't cognizant of the dire security and privacy consequences of many of the things that they do when interacting with a computer.
Note that I am NOT advocating for the removal of ADB. As an Android developer, I once used adb on a daily basis. I also love the idea of using adb to ftp my filesystem to my local machine for the sake of backups and whatever other useful purposes. In the case of the FireTV, I believe that if the device is put into developer mode, ADB can still be accessed over a USB cable. I think this is great, and necessary for development and other use cases.
The point here is about making a system less likely to cause incomprehensible harm to the average person. Android and iOS were an opportunity to rethink the security model of computing (for computers that most people carry with them and use every few minutes), and I think that's great.
I am not the average person. It's none of my business what the average person does. Just because the average person can't be trusted with something, doesn't mean I should have to suffer because of it.
Permission dialogs should be as informative as possible, sure, warn people that they're giving the app full access to their files if a permission is granted. If people still accept, then they accepted, it's their device and the device should respect their choice. It's not anyone else's responsibility to make that decision for them.
A sufficiently advanced user can still install something like LADB or shizuku or even a custom rom. This is extremely unnecessary to be a bundled OS permission.
> A sufficiently advanced user can still install something like LADB or shizuku or even a custom rom
Not without unlocking the bootloader (which can only be done on a few phones nowadays) and having to deal with getting locked out of a bunch of apps and functionalities as a result of Google's "security".
Let’s never blame the corporations hoovering all the data to carelessly resell it to marketers and who-knows-who. Instead let’s blame people on a forum.
Yes, I will blame people on a forum for falling for such a dumb trick. The corporations "hoovering all the data" (such as Google themselves) are the same ones providing you with a convenient "solution" to the problem they caused: restricting your own access over the devices you supposedly own.
Yeah, it’s amusing that the author simultaneously claims that (a) it’s a “core OS capability”; (b) it’s a developer debugging capability gated behind so many hoops that it can’t be abused (which is of course nonsense because even though each connection needs to be approved, the app allegedly only clearing other apps’ cache could be doing any number of evil things after the approval). These are incompatible claims.
You can fundamentally disagree with the Android security and isolation model, but if you were okay with it before this update, then the arguments presented are just an incoherent mess.
I appreciate people taking the time to write articles like this, but please consider the people who are going to read it. ADB is an acronym. Not everyone knows what it means. If you're going to write an entire article about a feature being removed, please at least describe the feature in a way that people can understand without having to google it so that they can have an idea as to whether or not this is relevant to them.
> ADB is an acronym. Not everyone knows what it means
I've used ADB a half-hundred times, and I had no idea what it meant until I read replies to your comment. Defining the acronym is likely not helpful to anyone who has used it before (and would thus be affected by the change).
If I recall, the article seemed to indicate that certain apps would be broken. I would be worried that apps I relied on would not function anymore. This has nothing to do with me using ADB, it has to do with apps using it (at least from how I read the article).
You use adb, you have a mental model for it. It does something because you make it do that thing and see it happen. You must be able to put yourself into the shoes of a person that isn't you and doesn't have that experience to recognize what I'm saying. Note: many developers not being able to or choosing not to do this is the source of most of the bad UX that is inflicted upon us today.
To each their own, but your response baffled me. If you've used ADB a half hundred times, weren't you ever curious what it stood for? I can't even use acronyms without knowing what they mean - it's always the first thing I look up, primarily because it helps me really remember and understand what the topic is about. Sure, the acronym soon takes on an "atomic unit of meaning" in my head, but especially for something like "Android Debug Bridge", which pretty perfectly describes the tool, knowing the original meaning still often enhances my understanding.
I think it depends on people. I've used adb and it was vaguely related to debugger acronyms so I assumed it would be debugger something. It does not matter to me what the exact name is.
I use a SOAR and never knew what this stands for (the name, not the product :)). Same for the plethora of acronyms we use in France; they expand to weird words.
Do not even get me started about the acronyms in my company, I gave up completely and use them randomly. The fact that it usually does not bother anyone shows their importance :)
I generally wasn't consciously aware that it even was an acronym. For me adb was just a command like cat. I don't randomly ask myself what cat stands for, or all the other obtuse commands I use on the command line. Life is too short.
"adb" is the name of the executable. Expanding the acronym isn't that relevant. It would be like explaining that KDE stands for Kool Desktop Environment.
Of course it would be. Even in your example, I would know that KDE is a desktop environment, which, from KDE alone, I would not. ADB is a debug bridge (apparently), which means it's a debugging tool, which means it likely wouldn't be terribly relevant for most typical Android users (and if that's wrong, then the article makes an even bigger mistake).
Typical Android users? Maybe not. But any non-developer user who has tried to sideload an app has used it. It’s hard to overstate how central adb is for doing anything on Android that goes beyond vanilla usage.
(But I agree, 5-7 words of explanation would be helpful)
The larger issue with the article for me is that it waits until the second paragraph (after a screenshot and so below the fold) to even name what the feature is.
> While it’s not a capability used by many Fire TV apps, without it, Fire TV apps can no longer perform certain advanced tasks, such as freeing up internal storage space by clearing the cache of all installed apps.
Maybe the article was edited? Because that seems to give the context required.
Anyone that has a mental model that includes what ATMs do. The point is that most people don't use ADB and therefore don't have that mental model. Many people have used an ATM, so if an article talked about someone stealing cards/pins on an ATM, most people would know what it meant. There are certain abbreviations that are so commonplace it may be safe to assume people know what they mean, even if they do not know what they stand for.
ADB is not one of those. This is the point, and this is the thing that anyone who is arguing against that point fails to recognize. It's myopia, or self-centered thinking. It leads to bad things in our industry (poor UX being one of them) and I'd encourage anyone thinking this way to consider what "empathy" means and how it is applicable to our field.
My point is defining the letters in an acronym doesn’t matter. It doesn’t matter what ADB stands for. What matters is what ADB does and literally the next sentence after “ADB” explains exactly what the implications are of it being removed.
“There is a thing called ADB and it being removed means apps can’t clean caches any more” would in no way be helped by defining what ADB stands for.
I totally agree with you, but I'm deeply amused that you didn't define the term yourself!
This is a pet peeve of mine, for real, especially with PR releases that sometimes hit the top of HN. Sometimes you'll see a post about "Updates from Widgetizer" and then it just talks about the features, assuming everyone already knows what Widgetizer is. It's kind of crazy. The writers are missing out on a lot of conversion by not doing a "by the way, ADB is the Android Debug Bridge, a tool that lets you..." and then people go "Wow, I didn't know that existed," and go investigate.
I didn’t know what it was until someone posted a reply to my comment. I don’t have an Android device and I can’t be bothered to fill in the obvious gaps the author left when the thing isn’t of immediate consequence to me.
It wasn’t intentional, but one could argue by my not defining the term I helped it really sink in for people how annoying it is, heh.
Interesting, they’re not disabling adb. It sounds like they’re just disabling the ability for apps on the device to connect to adb on localhost and run (e.g.) shell commands that way.
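That matches how tools like LADB work: they bundle an adb client inside the app and point it at the adb daemon listening on a loopback port. A hedged sketch of the flow (5555 is the conventional adb-over-TCP port; the exact port can vary per device and setup):

```shell
# What an on-device app could previously do once adbd was listening on loopback:
adb connect 127.0.0.1:5555                        # pair the bundled client with the local adb daemon
adb -s 127.0.0.1:5555 shell pm list packages -3   # then run shell-level commands, e.g. enumerate third-party apps
```

Once that connection is approved, the app is effectively running commands with shell-level privileges, which is exactly the capability being cut off.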
Tangentially: Is there a FireTV implementation that is anywhere near e.g. Apple TV in basic performance? Every FireTV device I have used has seemed practically unusable, but it could be there is just a low bar for what hardware they let go to market.
Nah, most Android TV sticks and boxes (bar Nvidia) ship with underpowered chipsets that lack the raw performance to render smoothly at 4K, using HW acceleration to at least get acceptable video decode performance.
Apple, being in control of the software and hardware, has better optimisation, as well as imho, a more fluid animation system on their OS.
The stick and dongle streaming device variants in particular are astonishingly weak. The current Chromecast 4K dongle for example is handily outgunned by a first gen Apple TV 4K, which this year will be a 7 year old device.
Wow, thanks for pointing that out - didn't realize I'd had these ATV 4Ks since 2017. They feel a bit sluggish.. probably just that everything else got faster.
On this I will say they are far more snappy than they used to be, battery life is great if you ask me.
But in real terms I moved my home streaming app of choice to Infuse. Between that app and the Apple TV 4K I’ve had since just before they moved the remote to USB-C, it’s been great.
I will say that it is tucked inside a cupboard and hooked up to Ethernet but aside from that it’s leaps and bounds ahead of the other solutions I looked at /tried over the last few years.
> They feel a bit sluggish.. probably just that everything else got faster.
Either the 4K videos of today are "heavier" than the ones from 2017 somehow, or Apple's system updates have slowed down your ATVs to convince you to buy new ones.
The latest Apple TV SoC (released in late 2021) is the same one used in the iPhone 13 and 14 (non-Pro). 2x 3+ GHz P-cores and 4x E-cores and 4GB of memory. I haven’t noticed any issues.
Yeah but if I spend that much money (not sure about the US but here in Europe it's really expensive, 170 or 190 euro) I really want something recent :) for longevity also. It will have to last a good few years after all. A 3-year-old CPU for the same price is just not a great deal (I don't use iOS so I don't really follow which SoC generation is which).
Fire and Android TV are terrible. If you want lower end but actually useful streaming device Roku is so much better. I either buy a Roku smart TV or get the $30 Express for all my TVs.
No they're not. In fact I'd even say they're vastly superior to the closed Apple TV ecosystem. The ability to use SmartTube with Sponsorblock blows away anything on the Apple TV.
Not trying to be snarky—or maybe just a little—but I own an AppleTV, it’s fantastic albeit very closed. I can’t parse your statement in part because I haven’t googled those things yet (I will) but also because many years of experience have led me to disregard statements like this.
Then you are doing yourself a disservice. Smarttube alone makes me stay on my Nvidia Shield over my AppleTV. It would have taken you a minute to google and you would have saved yourself some embarrassment. Doubling down on ignorance is a wild move.
I did look it up. It’s a 1-person project with many caveats about installation and functionality. It looks impressive for what it is, but not an alternative to AppleTV, certainly not if I expect anyone else in my house to use it. I’d put Kodi as a lot closer.
SmartTube is a YouTube app replacement, not an AppleTV replacement. It has a far superior interface and functionality than the default YouTube app. This is relevant to this conversation because you cannot install an app like this in an AppleTV.
I just installed Yattee on my iPhone and my, what a horrible app. I then pasted a valid YouTube link and got the error message that it "Failed loading video". If Yattee is the best you have on iOS then I truly feel sorry for people that have to endure such a subpar app and an even more subpar experience. SmartTube with Sponsorblock is the only way to watch YouTube.
Google TV impressed me by having a "Basic TV" option for sets that come with it, that disables most smart features, which is useful in a pinch when you can't find a dumb TV (such as a Sceptre) or monitor usable as a TV. It's kinda weird to praise Google for implementing something that halfway respects privacy, but here we are.
Are you saying Google TV has a OTA channel option? This is exactly what I'm looking for, but I assumed since there wasn't an antenna input on the Google TV then there wasn't a way to watch OTA using it...am I misunderstanding what you meant?
*edit*
Aha, I didn't realize Google TV was also included on some TVs and wasn't the just the Google TV dongle with a remote.
I've been happy with my FireTV cubes... they feel like the right amount of horsepower thrown at a TV...
Obviously, performance scales with price... the $20 HDMI sticks are the bulk of the devices out there, but honestly, it's surprising what they can do for $20. (The "Lite" is currently 2 for $35)
I have a 2nd gen Fire TV 4K Max gathering dust you can have. It's slow as fuck and barely faster than the first one. All Fire TV devices I've seen are slow, laggy, and shitty.
Same, but it’s been tanking the past year with the huge amount of bloat Amazon keeps shoveling onto it.
As a side note, my Insignia TV (best buy store brand) with fire tv built in is basically unusable.
Echoing a previous comment I made too, about “smart tvs” and the “streaming sticks”:
Hey, have you ever thought of why even the $149 Black Friday loss-leader no-name-brand TVs all have Amazon Fire, Roku, or are now "Smart" in some way?
Certainly isn't because they need to incentivise you to connect it to the internet so it acts as a Nielsen-esq measurement device of all media you view on the screen via digital fingerprints that exist in all commercial media and advertisements. [1][2]
[1] https://www.ispot.tv/ [2] https://www.samba.tv/
Maybe it is not what you're looking for, but Fire TV, included within the Echo Show 15, is pretty snappy.
All my TVs have Apple TVs connected, but my Echo Show 15 in the kitchen does a fine job of playing YouTube TV, so I can watch shows and sports while I cook.
It's a small stick that I can toss into my bag while traveling, plug into a hotel or airbnb tv, and watch my shows on Plex or Emby.
No other stick-like device out there "just works". I've tried so many of them over the years, but they all have issues ranging from the irritating to the unusable. I had high hopes for the Nokia stick, but it couldn't even connect to WiFi.
Ironically, name one better for convenient (so, not Apple TV) 4K HDR Sponsorblock capable YouTube near that price point or really any price point unless you’re going the HTPC route. Form factor is also a consideration. The FireTV would not be the only media device in my setup but it serves a purpose.
What is an Android TV device? Is that distinct from the FireTV?
I plug my FireTV Cube into my projector. It has some kind of smart TV functionality and a second remote to go with it, but it's on a dongle and I don't plug it in because I don't use it.
Allowing apps to use adb breaks the security model. Apps should only be able to run code as their own user. Any capabilities these apps need should be properly provided by an API for them to use instead of using such a hack.
Probably people are using adb to disable ads or something?
Amazon has indicated that they are creating their own TV OS in Rust running ReactNative Apps and going away from Android although it might take a few years.
Fun fact, I recently bought an Apple TV that I connect to my FireTV. Amazon video is nearly unwatchable on the FireTV unless I switch the input to the Apple TV and watch it there.
Side note: I have recently moved away from Firetv stick, because it was unreliable.
The product idea was great: a <$50 device that gave you access to all the media you could want. Bonus: there were apps and games.
However, in practice it's a deeply unreliable product that's frustrating to use. It appears that not even Amazon targets the stick for optimisation, as even the home screen is exceptionally slow.
Moreover it loses network connection frequently, but doesn't tell you. So you need to reboot the stick, but that takes an age because often there is some sort of memory leak causing the whole device to crawl.
Updates appear to be optimised to happen during use of your apps.
The virtual remote is deeply unreliable (and has been for years).
I didn't realise how bad it was until I tried a friend's Apple TV. Much as it annoys me to admit, the Apple TV, whilst twice the price (although it's now more expensive), is 8 times the product.
The Apple TV is an Apple product, with all the pros and cons that carries, but the Fire stick honestly feels like a low quality knockoff of a knockoff.
I bought a FireTV stick for the lesser-used TV, having been accustomed to the Apple TV in the main room. It was cheap, and seemed to do all the stuff.
The software was horrible:
* Highlight a search field, think you can use Alexa to dictate text? No, it'll just do a regular Alexa query. AppleTV gets this right: if you're on a text field, the mic button will just activate dictation to enter text
* Screensaver comes on, exit screensaver, also exits currently playing show back to the screen where you have to hit resume
* Slow, janky UI built with zero love
I felt bad when I eventually gave it away to my mum, who had a very early gen AppleTV that could not run Disney+. I felt so bad about inflicting the FireTV experience on her that I soon bought her a new AppleTV to replace it
Reading this I'm glad having chosen Chromecast w Google TV last week for my new "dumb tv".
I didn't have the opportunity to try the Fire stick, but slowness and an unpolished UI were exactly what I was fearing. Google TV otoh has a polished interface, supports all streaming services, and the remote can control your TV as well.
Can recommend.
Despite being an Apple user who bought into the ecosystem, I don't like any of their offerings in the media streaming space. Too expensive, too much lock-in, catalogues too small.
Just get a Roku and move on. Clean UI, the best remotes. I have Fire TVs and Chromecast with Google TV and Roku is just better. No nonsense. Apple TV is overpriced and does nothing I need.
It's been a long, long time since Amazon was basically good for consumers, since it did well enough.
Amazon Video for Prime getting worse was a significant first ding (loss of HDR, ads). Now the appliances are getting worse... This is like the mega trend, of massively profitable huge companies decimating their workforces, of the world being squeezed tighter.
> all under the, seemingly false, guise of enhanced security.
Maybe it's just me, but this became excruciatingly hard to read after that line. At this point I don't know what they're talking about, but they're implying that something done for security reasons is wrong, without an understanding of the problem, and it's already sounding like a rant.
That was before even mentioning the feature is adb, which yes, that is a security concern because you can bypass Android's security model. Why that's been open for so long in the first place is an even bigger story IMO.
Regardless these app and cache killer apps are at best barely useful and often scams. This functionality is handled by the OS on its own. And it should still be possible to kill apps and clear caches manually from the device settings if needed.
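As a rough illustration, the plumbing these "cleaner" apps wrap is just Android's package manager, and the same operations are available manually over adb (the package name below is a placeholder, and `pm trim-caches` availability varies by Android version):

```shell
# Sketch of the package-manager calls behind "cache cleaner" apps.
# Guarded so the commands only run where adb is actually available.
if command -v adb >/dev/null 2>&1; then
  # Per-app clearing (note: pm clear wipes the app's data, not just its cache)
  adb shell pm clear com.example.app
  # Ask the system itself to trim caches until ~512 MB is free
  adb shell pm trim-caches 512M
else
  echo "adb not found; commands shown for reference only"
fi
```

The second command is the key point: cache trimming is something the OS already does on its own when storage runs low, which is why third-party "cache killer" apps add so little.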
“While it’s not a capability used by many Fire TV apps, without it, Fire TV apps can no longer perform certain advanced tasks, such as freeing up internal storage space by clearing the cache of all installed apps”
That seems like a failure of the OS. Why should apps have to force that?
So what else does it do? That’s the only example in the entire article. It’s not clear to me at all why this is necessary other than “I already had it so don’t take it away”.
it's not clear that they do, or that this functionality is necessary or helpful in any way. "cache-clearing" apps have always been popular on android, and they range in effectiveness from being useless to being actual scams. but people like to push buttons to feel as though they're optimizing performance somehow.
Perhaps because, to this day, Android refuses to provide the basic, fundamental, user-facing feature of making a process go down and stay down. All we get instead are lame excuses about the complexity of battery management or other such nonsense, even though the reluctantly added battery optimization functions are all just nerfed versions of a regular task manager.
It's called uninstalling. The whole point of having an app installed is so that its activities / services can run or other apps can use the providers and services it has. For apps on the system partition settings will let you disable the app which will have a similar effect.
No, the whole point of having an app installed is to have an icon you can tap to bring up said app when you need it. Same way as with desktop: installing != running.
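For reference, Android does expose both halves of what's being argued about here, just not in the normal UI; over adb (with a placeholder package name, and with the caveat that exact behavior varies by device) it looks like:

```shell
# Stopping vs. disabling an app: two different guarantees.
if command -v adb >/dev/null 2>&1; then
  # Kill the app's processes now; the system may still restart it later
  adb shell am force-stop com.example.app
  # Disable it for the current user so it stays down until re-enabled,
  # without uninstalling it or losing its data
  adb shell pm disable-user --user 0 com.example.app
  # Undo the disable later
  adb shell pm enable com.example.app
else
  echo "adb not found; commands shown for reference only"
fi
```

`disable-user` is the closest thing to "installed but guaranteed not running", which is the middle ground between the two comments above: the icon disappears, but nothing is uninstalled.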
When devices ship with 8GB of storage, average users may or may not follow system prompts to free up storage on an ad hoc basis. (This is analogous to 16GB iPhones and constantly having to free up space). More savvy users can figure out that FireTV is Android compatible and use adb as a power tool. Or just buy a different streaming device that doesn't run at 75% storage full after installing YouTube and Netflix.
Don't even get me started on Samsung's Nexus S, which is partitioned into 1 GB of "internal" and 15 GB of "external" space, with a bunch of apps being extremely annoying about insisting on being installed in "internal" space only...
It does. But cache clearing only goes to a certain point after that it will stop you from updating apps and installing new apps, until you free up space by uninstalling an app.
The user never has to manage cache space; that has always been the case. Also, a normal user cannot do that without system-level privileges. Third-party apps were able to call adb directly on Fire TV, which is not the case on other Android devices with GMS. In fact, if you build Android with third-party access to adb, Google will never let you pass CTS and you cannot have Google Play on it. It was a huge misstep, security-wise, for Amazon to sell devices with adb access for third-party apps.
Because users want to do it. It's absolutely none of your (or Amazon's) business why. How did we ever get into this bizarre situation where supposed "tech people" are questioning whether basic functionality should even exist? This billion dollar company is not your friend.
Users want to do what? That was my question. What useful thing does this allow besides the single example in the article?
And if you don’t want to be beholden to a billion dollar company, maybe don’t buy suspiciously underpriced hardware from them that is obviously subsidized by their ad/service empire?
And again, I contend this is not basic functionality. The example in the article isn’t basic functionality, it’s a bandaid on a broken OS.
Storage and memory management is basic functionality. OSes universally suck at this. Partly because they can't predict the future and are mostly reactive in their behavior. Partly because they can't reason. The user can.
> While it’s an unpopular opinion, I see nothing wrong with Amazon protecting its Fire TV revenue by stopping the use of alternative home screens. It’s crucial to the business model Amazon has chosen to use for the Fire TV and if customers don’t like it, they don’t have to buy one.
Why are some people like this? I would assume fanboyism is the answer, but this isn't Apple.
If a business chooses a flawed business model, they should fail. Consumers should always go for what benefits them the most, not make sacrifices to try and protect a business. If the business is unsustainable, it should die so that actual sustainable business can take its place.
If Amazon based their business model for FireTV around homescreen ads, yet customers found a way to easily get around them, then Amazon chose a stupid business model and should face the consequences. Crippling the product with a software update is an action that is harmful to customers, and should be illegal.
Not surprising. Amazon is terrible at delivering consumer apps. The Apple TV prime video app freezes often, then after they added ads, the app now actively prevents you from muting them! You can mute during an ad, but then the app will immediately turn the volume back up. It’s straight out of black mirror.
Not to mention that on all the devices where I used the official Prime Video app (yes, even Fire TV Sticks), there were massive audio latency issues, to the point where I stopped watching. On Fire TV, Netflix also had audio lag, which it never did on any other platform.
I am talking about something like a 1-second delay here, not 50 ms (which would already be bad).
Amazon doesn't care about the viewer/user experience. It is always a disaster.
Surprised nobody's developed a custom ROM for these things, since I've seen so many other random cheap Android devices get custom ROMs on the XDA site.