
Apps? No root? Your device serves others, warns Berners-Lee - lispython
http://www.zdnet.com/apps-no-root-your-device-serves-others-berners-lee-7000010661/
======
brudgers
Cory Doctorow explains more fully:

 _"When we turn a computer into an appliance. We're not making a computer that
runs only the "appliance" app; we're making a computer that can run every
program, but which uses some combination of rootkits, spyware, and code-
signing to prevent the user from knowing which processes are running, from
installing her own software, and from terminating processes that she doesn't
want. In other words, an appliance is not a stripped-down computer -- it is a
fully functional computer with spyware on it out of the box."_ [The Coming War
on General Computation]

Video: <http://www.youtube.com/watch?v=HUEvRyemKSg>

Transcript:
<https://github.com/jwise/28c3-doctorow/blob/master/transcript.md>

~~~
anigbrowl
More like Cory Doctorow grinds his axe. Most people don't _want_ to know how
it works or hack on it; they just want it to operate reliably for the
purposes for which they purchased it. It's not a war on general-purpose
computation; it's an unprecedented variety of embedded computers.

~~~
vy8vWJlco
_"it's an unprecedented variety of embedded computers."_

... at the expense (decreased availability and public tolerance) of the
general ones. Possessing or promoting general-purpose computing might well
wind up on the FBI's list of suspicious behaviors, alongside such threatening
habits as paying cash for gum; "normal" people are still basically terrified
of magic/the unknown. Hackers are the new witches. A plethora of embedded
devices, where one general-purpose computer could have done the same, would
be tragic.

Of course, all of this is just my humble opinion.

~~~
anigbrowl
Oh, I remember people worrying about the same thing 20 years ago, and general
purpose computing is cheaper and more accessible than ever. Embedded devices
exist because they serve a purpose well. I like a good Swiss army knife, but I
also have a bag of specialized tools.

As for hackers being 'the new witches,' people aren't exactly scared of
computers. If anything, they take the complexity of the digital age with too
much complacency.

~~~
vy8vWJlco
I think the complacency you describe is only one side of the coin. People are
happy as long as knowledge doesn't empower other people more than it does
them, but as soon as you see and say that the emperor has no clothes
(especially if they're the emperor), the lawyers and the Luddites descend like
the four horsemen of the infopocalypse.

Admittedly, I don't think general purpose computing will in any sense
"disappear," but I do think that we are already in a technological depression
("where's my flying car already?") and the witches metaphor is particularly
apt.

Until "normal" people catch up on the details of general purpose computers
("what's a filesystem?") and feel empowered by them (confident enough to
secure their own systems, and write software to automate their lives), we
will continue to stagnate, and embedded devices will thrive. The
proliferation of embedded devices is a barometer of the public's aversion to
generality, pure and simple... I would even expect a new model for general
purpose computing to result from the current Cambrian explosion eventually,
and be a net win.

But I think it's important to realize we spend less time inventing today than
5 or 10 or 20 years ago, and more time dealing with the political and social
response to the full implications of the current breadth of computing, if
only because it is still entering the mainstream. Anyone interested in
pushing the limits today has to focus as much, if not more, on addressing the
distaste for change as on the underlying technology (at least if they want to
put their name on it). Designing and coding Bitcoin is the easy part...

------
anon1385
The web is the least free of all platforms from the user's point of view*, so
I find it bizarre that Berners-Lee would use the 'freedom for users' angle to
try to convince people to switch to web technologies. I fully support the
idea that we should have root on devices we own; it's just a shame he had to
dilute that point by pushing his own platform.

* nearly always closed source; in the rare cases where you have the source, you can't modify the code that actually runs when you use the service; usually no control or even ownership of your data; it runs on a machine you don't control, which can make your data subject to mining by foreign governments or advertisers

~~~
nathan_long
>> the web is the least free of all platforms from the user's point of view

You're confusing "the web" with "networked software".

When you use the web, you have no control of the server but full control of
your client (if your browser is open source). When you use a networked app,
you have no control of the server AND no control of the client.

"Do I want control of the software I run on my machine?" is a separate
question from "do I want that software communicating with other computers that
I don't control?"

~~~
anon1385
There is no confusion here: I want control of my data.

Currently the number of web apps (by which I mean something with a html/css/js
interface that runs in a web browser) that I can run locally with my own data
that I store locally (or not, my choice) is tiny. The number of native apps
that give me full control of my data is huge, because native platforms make
that easy to do. Local file access APIs are relatively new in the browser.

You are correct that there are plenty of tasks that are inherently about
communicating with another computer, but that is not what we are talking about
here. Writing a document needn't involve communicating with another computer.
Editing an image needn't involve communicating with another computer.
Maintaining my todo list, calculating how much time to bill a client; these
needn't involve communication with another computer that I don't control.

~~~
brudgers
_"I want control of my data."_

One might even argue the idea of individual control is somewhat at odds with
the logical structure of FOSS, particularly in its GNUist form - i.e. for a
Lisper (and Stallman was/is one), there's no semantic distinction between code
and data. Your position requires distinguishing bits from bits.

Solutions which distinguish bits from bits are political, not computational.

~~~
telent
> there's no semantic distinction between code and data

I struggle to understand how this can be true, and I say that as a Lisper
(spiritually if not currently), because 'this data does something meaningful
when interpreted by a computer' is _exactly_ a semantic distinction from 'this
other data doesn't'.

~~~
brudgers
What is the semantic difference between data which does nothing meaningful and
lisp code that does nothing meaningful?

Or to be Wittgensteinian about it, point to the part of a datum which is the
meaning.

------
alanctgardner2
I don't know if it really matters to the average user to have root on their
system; for technical people it certainly implies "true ownership" of the
whole stack on the device. What I find bizarre and a bit unsettling is the
appification of general purpose computing devices. When it was mobile phones,
locking the user out (almost) made sense - the device was being financed by
the carrier via subsidy, so they should have a bit of control. Then tablets
came along and provided an identical interface in terms of restricting user
access, but the user pays the entire cost up front[1]. Now Windows 8 (more so
RT, but even the x64 version) and Mountain Lion are moving to the walled-
garden, segregated, impotent app approach. The best way I can explain why
this is terrible is Xcode:

I don't really develop for Mac, but there was a Linux tool which also shipped
a 32-bit Darwin binary. I thought "I'd like to use that on my 64-bit Mac,
surely I can just compile it with Xcode". I downloaded Xcode from the App
Store (after giving my mother's maiden name, my blood type, etc.), and it
doesn't appear on the PATH, because it can only write to /Applications/Xcode.
Jesus[2].

My point is, somehow subsidized phones turned into locked-down tablets, turned
into a compiler that can't install itself to /usr/local/bin. That seems daft
to me.

1\. ASUS let me root my tablet for free, no strings. So it's opt in, but not
terribly evil.

2\. If anyone has a good Mac Homebrew tutorial: I was wondering if I could
compile with a different toolchain than Xcode? I'm really not up on this
stuff; I need to learn.

~~~
anon1385
Open the Xcode preferences and install the command line tools from the
'Components' part of the 'Downloads' section, although it does say this:

 _Before installing, note that from within Terminal you can use the XCRUN tool
to launch compilers and other tools embedded within the Xcode application. Use
the XCODE-SELECT tool to define which version of Xcode is active. Type "man
xcrun" from within Terminal to find out more._

 _Downloading this package will install copies of the core command line tools
and system headers into system folders, including the LLVM compiler, linker,
and build tools._

~~~
jonhendry
Yes, if you use the Terminal and cd to
/Applications/Xcode.app/Contents/Developer, you'll see

    
    
      drwxr-xr-x   3 root  wheel  102 Jan 25 14:52 Documentation
      drwxr-xr-x   7 root  wheel  238 Jan 29 13:33 Library
      drwxr-xr-x   7 root  wheel  238 Jan 29 13:11 Makefiles
      drwxr-xr-x   5 root  wheel  170 Jan 25 14:53 Platforms
      drwxr-xr-x   3 root  wheel  102 Jan 25 14:54 Toolchains
      drwxr-xr-x  22 root  wheel  748 Jan 29 13:12 Tools
      drwxr-xr-x   7 root  wheel  238 Jan 25 14:54 usr
    

Inside ./usr/bin is gcc, git, etc.

Basically the developer tools used to live by default in /Developer at the
root of the filesystem, now the Developer directory is inside Xcode.
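If you just want those embedded tools on your PATH without installing
anything extra, prepending that usr/bin directory works. A rough sketch --
the Xcode path is the one listed above, but here a scratch directory with a
fake gcc stands in for it so the mechanics can be tried anywhere:

```shell
#!/bin/sh
set -eu

# On a real Mac this would be:
#   export PATH="/Applications/Xcode.app/Contents/Developer/usr/bin:$PATH"
# Simulate that tree with a throwaway directory.
toolchain="$(mktemp -d)/usr/bin"
mkdir -p "$toolchain"

# A stand-in "gcc" so we can watch PATH resolution pick it up.
printf '#!/bin/sh\necho fake-gcc\n' > "$toolchain/gcc"
chmod +x "$toolchain/gcc"

export PATH="$toolchain:$PATH"
command -v gcc    # resolves inside the simulated Xcode tree
gcc               # prints: fake-gcc
```

The same trick works for any self-contained toolchain directory, which is
really all the Xcode bundle is.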

~~~
mhenr18
And if you install the command line tools they live in /usr/bin.

------
millstone
When I clicked on that link, my "device" contacted Facebook, LinkedIn,
Twitter, Omniture, and CBS Interactive. These sites attempt to compile
information about me, track me, and make that information available to others
- I'm assuming. Of course, I don't know because they don't tell me.

By using the web, my device is serving others (rather, serving me _to_ others)
on a massive scale, every day.

~~~
jiggy2011
At least in such a case you have some control over that tracking, by
installing Ghostery or whatever.

Without root you cannot count on being able to do that.

------
api
It's a terrible trend, but I don't see it as some kind of conspiracy as some
people seem to. It's a legitimate market trend being driven by several
factors:

(1) Operating systems have an outmoded security model. Most focus on multi-
user security, which still has its uses, but they fail to focus on application
isolation. Applications should be installable by anyone and isolated
completely from other apps unless specifically granted permission by a
user/administrator. The entire malware problem can be laid at the feet of
this failure.

(2) The (related) poor state of installability. Mac has this problem the least
with drag-and-drop .app installation, though sometimes even that can be
confusing (and .dmg packages are weird beasts... why?). Linux has .rpm or
.deb, which applies a massive and complex band-aid to the otherwise awful
state of installability on that platform. Windows is absolutely horrible...
it's like Linux, except you have "installers" that each have to do their own
package management instead of a formal package system.

The open source world -- and even commercial vendors that want to keep the PC
alive -- have to either address this problem or accept the dominion of the
locked-down vendor-controlled consumer compute device. This will require
abandoning the old fashioned Unix design philosophy (and the similar way
Windows works) and thinking seriously about the problems of installability and
isolation. It would be worth taking cues from iOS and Android here, though
there's also a lot of room for new ideas.

Oh, and I forgot to mention. If we don't address these problems, all app
vendors will pay a ~30% per-sale tax to Apple, Google, and Microsoft in
exchange for the valuable service of a platform that provides installability
and application isolation. And you know what? The market will pay it, because
for most users those things are _that valuable_.

Point, click, install, with no fear of damaging my system. If I don't like it
I click and uninstall. _Anything else is completely broken._

It's interesting to note that the prevalence of virtualization is also a sign
of the failure of operating systems. OSes in a box (whether via complex
container overlays like OpenVZ/Virtuozzo or hypervisors) are an ugly hack to
fix the fact that the OS security model is broken even for multi-user
operation. The fact that everything requires root to install is the deepest
issue, along with the lack of permission structures for things like network
interfaces. It should be possible to run an OS and sell _accounts_ to the
general public, not VMs, and people should be able to run whatever they want
from their local account, and this should be safe. This isn't viable because
OSes are broken, and so we have the huge overhead of virtualization as an
ugly band-aid to fix it.
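One piece of this is already solvable today: per-user installs under a
home-directory prefix need no root, and uninstalling is just deleting the
prefix. A minimal sketch -- the "app" is a made-up one-line script, and a
scratch directory stands in for $HOME/.local:

```shell
#!/bin/sh
set -eu

# A per-user prefix: no root needed, and "uninstall" is just rm -rf.
prefix="$(mktemp -d)/.local"    # stands in for $HOME/.local
mkdir -p "$prefix/bin"

# "Install" a toy app into the user-owned prefix.
printf '#!/bin/sh\necho hello from userland\n' > "$prefix/bin/toyapp"
chmod +x "$prefix/bin/toyapp"

export PATH="$prefix/bin:$PATH"
out="$(toyapp)"    # runs without ever touching /usr
echo "$out"

rm -rf "$prefix"   # clean uninstall, no system damage
```

What the app stores don't give you in this model is isolation: the script
still runs with the user's full privileges, which is exactly the permission
gap the parent comment is pointing at.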

~~~
silentOpen
What about hypervisor as microkernel and VM as app? Is this a viable new model
for system security?

~~~
api
It's feasible, but I personally think the hypervisor overhead stinks. Better
to fix the OS permission model.

~~~
vy8vWJlco
Why not use filesystem permissions/ACLs on named pipes to access native
interfaces (as a poor-man's microkernel)?
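The ingredients for that are already there: a named pipe is a filesystem
object, so ordinary mode bits (or ACLs) decide who may talk to whatever
service listens on it. A rough sketch of the mechanics, with a throwaway
directory and a toy one-shot service:

```shell
#!/bin/sh
set -eu

dir="$(mktemp -d)"
pipe="$dir/svc.fifo"

# Create the "service interface" as a named pipe...
mkfifo "$pipe"
# ...and restrict it: only the owner and group may read/write.
chmod 660 "$pipe"

# A toy "service": read one request from the pipe in the background.
cat "$pipe" > "$dir/reply" &
echo "ping" > "$pipe"    # a permitted client writes a request
wait                     # the service has now consumed the request

reply="$(cat "$dir/reply")"
echo "$reply"
rm -rf "$dir"
```

A user outside the owning group would get EACCES on open, which is the
poor-man's capability check -- though it says nothing about what the service
does once a permitted client is connected.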

------
Devlin_Donnelly
I have always liked the web as a platform because I can build something once
and have it work across many operating systems/browsers/etc... and because the
web is built on open standards.

One criticism people have of this approach to application development is that
web apps aren't as efficient as native apps.

This pro/con discussion of web vs native apps reminds me of the C programming
language, why it was created and what for.

The C programming language was designed to be portable, meaning that it can
be compiled on virtually any platform, any operating system. Yet C was also
designed to constitute the minimum abstraction away from a given platform's
native assembly language.

Thus a program written in C can be compiled on almost any machine, and run as
efficiently or nearly as efficiently on that machine as a program written in
that machine's native assembly language.

So perhaps we could use something like the C programming language for the
web: a technology which allows us as developers to write our applications
once in a portable, open format, without sacrificing performance.

Any thoughts?

~~~
tree_of_item
Sounds like <http://en.wikipedia.org/wiki/Google_Native_Client> is what you're
looking for.

------
snowwrestler
I think the best thing the tech community can do in this area is continue to
investigate and provide specific criticism of individual apps that are doing
nasty things. Great examples include the uproars over Google's data
collection, or how Path uploaded the entire address book without asking. These
are the sorts of specific nasty things the popular press will cover, and the
general public wants to know about.

Root access would be very helpful for the tech community in doing this, but on
its own, it's probably too esoteric to make much of an impression on the press
or general public.

------
demallien
Not a terribly coherent discourse by Tim Berners-Lee. On the one hand, he wants you
to run apps downloaded off the internet in your browser. On the other hand, he
wants the user to have root. There's not even a whelk's chance in a supernova
that I will ever run a browser as root. I assume that Tim's OK with that as a
security stance, but it then follows that browser apps can't have root --
and for a non-tech-savvy user, if your apps don't have root, then _you_
don't have root.

Focusing a bit more on the whole root access thing, I think it would be well
to remember that we tried the open root thing already, and it turned out to be
a security disaster for normal users. I think it's a good thing that the
mainstream computing platforms (Windows, MacOSX, iOS) are trying a new
approach. I also think that it's a very good thing that there are more open
solutions, such as Android (which, just to be clear, is also a mainstream
choice) and Linux, that allow us to see what can be done with more open
access. Then the closed platforms can see how to go about providing that
functionality in a more secure manner for less tech-savvy users.

~~~
jiggy2011
I don't think he's suggesting that you literally run your browser as the root
user.

The argument seems to be that if you don't have the ability to access
whatever is the equivalent of the root account on your device _at all_, then
whoever does has more control of your device than you do.

This means that you _have_ to trust the device or OS manufacturer. For example
if you buy an iPhone there is no way to "untrust" Apple without throwing the
device out or doing a jailbreak.

With a more open system such as Debian you can decide to untrust the OS vendor
by simply replacing the entries in /etc/apt/sources.list with something else.
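Concretely, on Debian that's a one-file change to /etc/apt/sources.list. A
sketch against a copy of the file, so nothing system-wide is touched (the
replacement mirror URL is made up):

```shell
#!/bin/sh
set -eu

# Work on a copy of the sources list, not the real /etc/apt/sources.list.
work="$(mktemp -d)"
cat > "$work/sources.list" <<'EOF'
deb http://deb.debian.org/debian stable main
deb http://deb.debian.org/debian stable-updates main
EOF

# "Untrust" the default vendor mirror by swapping in another repository.
sed -i 's|http://deb.debian.org/debian|http://mirror.example.org/debian|g' \
    "$work/sources.list"

grep -c 'mirror.example.org' "$work/sources.list"   # prints: 2
```

After the real edit you would run `apt-get update` so the package index is
fetched from the new source; the point is that the trust anchor is a plain
text file the owner can rewrite.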

This kind of model is often used in corporate environments where computers
are locked down by the IT dept; the IT dept can therefore make choices on
behalf of the users as to what the security settings should be, what software
is installed, etc.

------
bradleyland
I don't live here; I just rent.

Sometimes it's OK to just rent. If I want to own, I'll move elsewhere.

~~~
WayneDB
The problem becomes obvious when you discover that there's no place to move in
order to own. When you have no choice, are you really renting or are you a
serf?

Even if you think that you own your phone, you still have to get it onto a
network for it to become useful and you'll be renting that network access and
you'll have no choice between contracts because they're all the same.

~~~
pekk
They aren't all the same. 3 years of contract at $70+/mo != prepaid $30/mo !=
prepaid 10c/min.

~~~
WayneDB
They are all the same in many respects though. Do any of them offer true
unlimited data? Tethering? Usage of unlocked phones? Decent coverage? Data
privacy?

~~~
JadeNB
Well, none of them offers a pony, either. I think that only one of these (data
privacy) is a right; that (2)–(5) are conveniences; and that true unlimited
data is impossible (assuming you mean "… at a fixed price").

~~~
WayneDB
However, if I want to _own_ a pony I certainly have the option of purchasing
one.

2-5 are perfectly reasonable offerings, and the corporations that control the
data pipes have all accepted our tax breaks, which contribute to their
ability to maintain their virtual monopolies. My point is that the rent
analogy does not work at all. It's too simple.

------
lazyjones
Using a task manager on your phone is like analyzing the water coming out from
your tap before you drink it, only a little less important for your general
well-being.

Most people are simply ignorant about the details because they trust an
authority to keep things in order (e.g. the FTC). Perhaps trust is more
misplaced with Apple and Google than with your water supplier; or perhaps
those working in the food or water industry are as wary about water quality
as we are about whatever spyware is running on our phones - and we will find
it a bit ridiculous.

------
aneth4
My iPhone does a pretty darn good job of serving me. In fact, that's why I own
it.

If it happens to serve others collaterally, well, I'm not the jealous type.

------
gdubs
Minor quibble: having to port to multiple devices may be 'boring', but writing
native apps can be truly wonderful if you love the platform, framework, etc.
My personal experience is largely with audiovisual apps -- and for that, I'm a
big fan of iOS. Core Audio in particular may be a tough framework to approach,
but what you can achieve with it is very exciting.

------
dendory
I fully agree about things that require native apps. When I click on a link
to a YouTube video or a tweet and, instead of showing me the result in the
browser, it pulls me out and into the YouTube or Twitter app, I can't
bookmark it or share it. I hate that.

------
tobylane
This is more of the idealistic view, which I haven't seen pan out. Android
may be more open, in the times when the latest version is open source. But
when the carriers modify and lock down the phone, it's a hassle. It's still a
hassle on a perfect phone, which is out of reach for the vast majority. So on
those 98%+ stock phones (including carrier mods) you have apps that aren't as
jailed as on iOS, and (still haven't read anything on this either way) pretty
much the only online advertiser supplying you this for free.

Real-world circumstances render this ideal view, correct as it may be, mostly
or entirely untrue.

------
pjmlp
I'd rather make use of my native applications that know how to take advantage
of the underlying hardware and OS.

The browser should have stayed a document-only thing.

~~~
papsosouid
Could you give me an example of anything other than a game that is actually
"taking advantage of the underlying hardware and OS" in some way that the
browser can not? Looking at all the apps people use, games are literally the
only things I've seen that aren't just poorly re-implemented websites.

~~~
UnFleshedOne
Interestingly, I consider websites to be poorly reimplemented native
applications. They make sense when there is a centralized database at the
back, but even then UI and UX usually suffer. For one thing, they often lose
input methods available to the underlying system. For example, how many web
apps use context menus? How many _can_ use them without breaking the
browser's own UX?

~~~
papsosouid
>but even then UI and UX usually suffer

Because of bad designers. Most native apps suffer for the same reason.

>For one thing, they often lose input methods available to the underlying
system.

Such as?

>For example, how many web apps use context menus? How many _can_ use them
without breaking browser's own UX?

I have no idea how many do, but they don't break anything unless the designer
messes up.

~~~
UnFleshedOne
I've seen some sites override context menus to provide their own copy/paste
commands (for unknown reasons, probably to change the look). In doing so,
they remove all the options the browser puts in there.

For example, right now in Opera I have 17 entries in the context menu when
clicking on an empty spot on the page, 9 entries when clicking on selected
text, 15 entries when clicking inside a text edit field, and yet another set
when clicking on a link. Some of the options control behaviour of the
browser, some deal with the current page, some deal with a specific page
element. If HN wanted to use context menus for some app-specific reason, all
of those options would be wiped out (at least for affected elements) because
the app would be in direct conflict with the browser.

Touch events are often another casualty -- sometimes they are commands to the
OS (swipe to change active desktop), sometimes to the browser (pinch to zoom),
sometimes to the app (drag to drag around a map, or maybe pinch to zoom?).

Additional layers always introduce problems and cause things at the end of
the chain to have worse UX, or to adapt by using the lowest common
denominator. A crazy example: gmail running in a browser running in Windows
running in a Windows VM running on a Mac which is remotely connected via VNC
from an Android tablet. >:D Android -> VNC app -> OSX -> VMWare -> Windows ->
IE -> GMail app. Ok, this doesn't illustrate much besides the point that the
browser _is_ an element of the input chain (and one that has a
disproportionate effect on UX too -- the other links are at least trying to
be transparent), I just like the idea. :)

~~~
papsosouid
>I've seen some sites overriding context menus to provide their own copy/paste
commands (for unknown reasons, probably to change the look). In doing so, they
remove all the options browser puts in there.

Of course, and you've probably seen much worse than that. Incompetent web
designers are plentiful, but the capabilities of a web site are not limited to
"broken sites made by incompetent designers".

>If HN wanted to use context menus for some app-specific reason, all of those
options would be wiped out (at least for affected elements) because the app
would be in direct conflict with the browser.

If HN wanted to do something incredibly stupid, they could do something
incredibly stupid. Right. I'm not sure what the issue is there. If they wanted
to not do something incredibly stupid, that is also an option. Why judge only
on the worst possible scenario and assume every website is designed to be as
broken and shitty as possible?

>Touch events are often another casualty -- sometimes they are commands to the
OS (swipe to change active desktop), sometimes to the browser (pinch to zoom),
sometimes to the app (drag to drag around a map, or maybe pinch to zoom?).

This applies equally to apps vs web, and isn't even specific to phones.

>Additional layers always introduce problems and cause things at the end of
the chain to have worse UX or adapt by using lowest common denominator

That statement is false, and not relevant. The browser is no more an
"additional layer" than an app is. The same number of layers in both cases.

~~~
UnFleshedOne
> If HN wanted to do something incredibly stupid, they could do something
> incredibly stupid. Right. I'm not sure what the issue is there. If they
> wanted to not do something incredibly stupid, that is also an option. Why
> judge only on the worst possible scenario and assume every website is
> designed to be as broken and shitty as possible?

But the context menu is a useful concept (sometimes). It _would_ be nice if
it could be used when it is called for. The problem is that it is impossible
(I think) to use a context menu, or _any other right-click action_ you could
think of, without breaking browser functionality. It would be, like you said,
incredibly stupid.

Thus web apps effectively lose one whole mouse button, reducing available
input bandwidth.

Another example of lost input is the lack of system-global shortcuts. A web
app, to my knowledge, can't have them. I listen to music while I work and I
tend to switch tracks or pause them once in a while. So when I listen to
youtube playlists, I have to go back to the browser, open the playing tab and
do it there. (And you never know if the video currently playing is SFW. :))

> That statement is false, and not relevant. The browser is no more an
> "additional layer" than an app is. The same number of layers in both cases.

The browser is a layer, and the native app is a layer, except that to get to
a web app you need to go through the browser first, while a native app sits
at the same level as the browser (so a portion of input capabilities is not
used up by the browser itself).

~~~
papsosouid
>But context menu is a useful concept (sometimes).

Of course. Which is why you would use it only where it is useful, not just
hijacking a particular input event globally.

>Thus web apps effectively lose one whole mouse button, thus reducing
available input bandwidth.

No, you can use right click all you want, and it need not interfere with the
browser's use of right click. Also, very few people use a mouse on their
phone, so this does seem like a bit of a stretch.

>Another example of lost input is lack of system-global shortcuts. A web app
in my knowledge can't have them.

Right. There are a few other things like this that web apps can't do, and
apps that people would want this sort of interface for should not be web
apps. But as I said originally, virtually all the apps people use are not of
that nature (and those that are, are almost always apps that came with the
phone); they are just websites turned into clunky, awkward apps for no
reason.

>except to get to web-app, you need to go through the browser first

The browser is the app.

------
padmanabhan01
If 'open' wants to compete, it should do so by being better than 'closed'.
The argument in the article tries to outlaw being 'closed'. No one stops
anyone from making any 'open' system (OS or hardware or whatever), and why
should anyone stop someone trying to make a 'closed' system? If someone wants
to make a closed system and someone else wants to buy it, sure, by all
means..

~~~
keenerd
_If "safe" wants to compete, it should do so by being more fun than
"dangerous". No one stops anyone from making any "safe" system (automobile or
recreational drug or whatever) and why should anyone stop someone from trying
to make a "dangerous" system._

One of the best uses of law is to place limits, regulations and standards on
what businesses are allowed to get away with.

------
DenisM
I don't have root on my TV and it serves me just fine for what I need from it.
There might be good arguments to keep platforms open, but this ain't one of
them.

~~~
mrgoldenbrown
OTOH, I would love to root my DVD player so I could fast forward when I want
to, not at the mercy of the FBI.

~~~
erichocean
Hacking the player would be useless. DVDs are _programmed_ and any (and all)
limitations that are there are because a programmer put them there.

Same with Blu-rays.

~~~
CamperBob2
This isn't correct. The DVD can only specify that fast-forward and other
functions are to be disabled. It's up to the player firmware to enforce or
ignore the DVD's user permission flags.

------
youngerdryas
From the comment section:

>Trust will disrupt Openness

There are a couple of factors that will disrupt the Web architecture.

The Web has trained us to build dumb clients and centralize anything of value
on the server, at a huge cost and never enough trust. We can safely predict
today that light-weight protocols, mediated by the mobile OS (and its
Platform), will directly challenge the Web architecture, precisely because we
can leverage the platform trust model. That evolution is extremely profound.

For instance, apps running on your device can securely and privately share
information without requiring a complex temporal integration involving a 3rd
party service (such as Google AdSense). The information is produced and
consumed on the device or the device of a related end-user. What happens on
your device can now stay on your device.

Just to be clear, and to show how disruptive that architecture is: the
primary key of your private data becomes your phone, not your identity.
Merchants no longer need to identify you. They couldn't care less about YOU;
they just care to know some information about you. The problem with the Web
architecture was that the only way to do that was to associate PII with a
primary key on a server, and hence merchants needed to identify you to track
your every move (and they shamelessly did).

The second factor is just as profound: the very open nature of the Web is
driving scale over scope. The Web has successfully nurtured the largest
Catalog, the largest Search engine, the largest Auction site, the largest
Social Network, but I see this as a negative side effect of the Web
architecture because it limits the scope of what people can do. In other
words, the scope of what Amazon, Google or Facebook offer is limited by the
scale (and hence the revenue) they can achieve.

I actually argue that a trust-based neutral Platform will support a more
vibrant and diverse ecosystem than a truly open model, because in essence a
Web business couples the level of trust it can achieve with the functionality
it can deliver. The Platform decouples the trust from the functionality, and
it enables much smaller actors to deliver a lot more scenarios while relying
on the trust established by the Platform.

I would be surprised if the Web can resist being disrupted by the Platform.
Actually, I think it already is.

~~~
valley_guy_12
Unfortunately, you can't really trust a device, since there is always a chance
that it could be rooted. See the long history of rooted game consoles.

The best you can do is sort-of trust the device. But that's not much better
than not trusting the device in terms of the kind of architectures and
products you can build.

~~~
illuminate
"you can't really trust a device, since there is always a chance that it could
be rooted. See the long history of rooted game consoles"

How many get rooted without the user's intervention?

~~~
randomdata
I do not have an answer to your question, but in my mind the targeted attack
is more concerning than a random drive-by that hits everyone. That means just
one instance is too many.

As another example: in the earlier days of iOS, you could root it just by
visiting a website. Who knows how many fell prey and have malicious code
running on their devices today? With a targeted attack, you don't even get
the benefit of security researchers scouring the code like you would with
something more widespread, so there really is no way to even keep track of
what might be out there.

