
Facebook and the media: united, they attack the web - cpeterso
https://www.baldurbjarnason.com/notes/media-websites-vs-facebook/
======
Animats
From the article: _" The web definitely has a speed problem due to over-design
and the junkyard of tools people feel they have to include on every single web
page. However, I don’t agree that the web has an inherent slowness. The
articles for the new Facebook feature will be sent over exactly the same
connection as web pages. However, the web versions of the articles have an
extra layer of cruft attached to them, and that’s what makes the web slow to
load. The speed problem is not inherent to the web; it’s a consequence of what
passes for modern web development. Remove the cruft and we can compete
again."_

I've been bitching about this ever since it became clear that CSS made web
pages bulkier. (Yeah, it was supposed to make them shorter. Didn't happen.) We
had much faster loading pages when people created HTML 3.2 with Dreamweaver
and tables, and _most of them looked about the same_. Now we need HTTP2 so
that all the "assets" can be loaded more effectively. There are even content
management systems which generate bloated custom CSS for every page, with huge
numbers of classed tags customized for each page. This eliminates any useful
caching.

Some major sites have figured this out. The Wall Street Journal's home page
has become shorter and simpler at the HTML level. Two years ago I looked at
it, and some content management system was inserting the entire sign-up-for-a-
subscription machinery on _every page_. They've fixed that. The Washington
Post is inserting the same block of Javascript for each story. Somebody should
tell those guys about functions.

I have a web service which strips all active content (Javascript, Flash,
ActiveX, Silverlight, etc.) from a page, parses it, and redisplays it. Most
major news sites seem to work OK in that mode. Instagram is fine. (WhatsApp
looks awful.) Much of the junk that slows page loading is providing little, if
any, benefit to the user.

~~~
scholia
I saved a Washington Post page -- a short interview -- and it came to 3054KB.
I saved a NY Post page and it came to 2843KB. For all I know, there could have
been piles of background junk that weren't saved with the page.

It does seem a bit remarkable that these sites need to ship 3 megabytes of
stuff to deliver 20K of text....

I think I need your web service ;-)

~~~
Animats
If you want to try it, try

    http://www.sitetruth.com/fcgi/viewer.fcgi?url=news.ycombinator.com

replacing the URL parameter with the desired web site. It's a tool we use to
see how our crawler views a page. It reads the first 1MB from a URL, converts
everything to UTF-8 based on the strict rules for charsets (no guessing),
parses it with an HTML5 parser to create a parse tree, throws out the
Javascript and Flash, makes all the links absolute, and emits a pretty-printed
version.
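The pipeline described above can be sketched in a few dozen lines of stdlib Python. This is my own illustrative approximation, not Animats' actual code: the names `strip_cruft` and `CruftStripper` are invented, and the real service's strict charset rules, Flash stripping, and pretty-printing are reduced to the basics here.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

# Tags whose entire contents count as "active content" and get dropped.
ACTIVE = {"script", "object", "embed", "applet"}

class CruftStripper(HTMLParser):
    """Re-emit a page with active content removed and links absolutized."""

    def __init__(self, base_url):
        super().__init__(convert_charrefs=False)
        self.base_url = base_url
        self.skip_depth = 0  # > 0 while inside <script>/<object>/... content
        self.out = []

    def handle_starttag(self, tag, attrs):
        if tag in ACTIVE:
            self.skip_depth += 1
            return
        if self.skip_depth:
            return
        if tag == "a":
            # Make links absolute so the redisplayed page still navigates.
            attrs = [(k, urljoin(self.base_url, v) if k == "href" and v else v)
                     for k, v in attrs]
        rendered = "".join(f' {k}="{v}"' for k, v in attrs if v is not None)
        self.out.append(f"<{tag}{rendered}>")

    def handle_endtag(self, tag):
        if tag in ACTIVE:
            self.skip_depth = max(0, self.skip_depth - 1)
            return
        if not self.skip_depth:
            self.out.append(f"</{tag}>")

    def handle_data(self, data):
        if not self.skip_depth:
            self.out.append(data)

def strip_cruft(raw_bytes, base_url, charset="utf-8"):
    # Read at most 1 MB and decode strictly -- no charset guessing.
    text = raw_bytes[:1_000_000].decode(charset, errors="strict")
    parser = CruftStripper(base_url)
    parser.feed(text)
    return "".join(parser.out)
```

Pages built by `document.write()` fail this treatment for exactly the reason Animats gives below: the content only exists after the stripped-out Javascript runs.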

Most sites have readable text after this, although some features may not work.
The ones that don't do something like creating the page with document.write(),
or one of those "parallax" formats. It's a reasonable test to see if the web
designers went completely overboard. If you're getting a blank page, search
engines probably aren't indexing that page very well.

~~~
scholia
Many thanks, will have a play...

------
thenomad
This article really highlights something that I'm constantly amazed by - just
how SLOW major media organisations are prepared to allow their websites to be.

When I'm designing a landing page for anything significant (product, movie,
whatever), I'm looking for sub-second load times - ideally below 500ms. Why?
Because there are tons of studies showing that page load speed makes a huge
difference to user experience, retention, CTR, the whole nine yards.

How come bigger media companies, whose expense accounts for team lunches are
probably larger than my entire Web budget, fail to learn this?

I'm baffled.

~~~
matthewmacleod
You are not kidding. The user experience of most media sites is appalling,
mostly down to performance.

Here's a great example I saw recently when I hit the Daily Mash without
Ghostery enabled:
[https://www.dropbox.com/s/seu0run7pn4tbc6/Screenshot%202015-...](https://www.dropbox.com/s/seu0run7pn4tbc6/Screenshot%202015-06-19%2011.26.39.png?dl=0)
\- in excess of fifty(!!) third-party scripts and trackers, rendering the
browser unusable for multiple seconds. It's a frustrating, miserable
experience.

~~~
rockdoe
Mozilla reported an _average_ pageload performance improvement of 40% just
from blocking the _tracking_ scripts.

------
bostik
The article was better than I expected from the title.

It's not about Facebook, and it's not exactly about online media either. It's
about failed expectations with (particularly mobile) online media, and how
conflicting needs far too often neglect the user experience.

Speed matters. Performance is crucial. Load times are atrocious. And above
all: screen real estate on mobile is _precious_. A badly transferred advert
that happens to sort of work on desktop, where the user has a >20Mbps
connection, murders the UX on mobile and, as a result, makes the entire site
unusable. So users are moving to anything that makes the content load faster,
and where the ads are not as intrusive.

The irony of it all? FB is better at serving media than the producers and
_publishers_ themselves.

UX >> UI.

------
finnyspade
I agree with the overall sentiment that web developers can and should do
better when it comes to performance. But there were a few inaccurate or
misleading remarks in the article that I'd like to draw attention to:

 _1) Guess how those Instant Articles are formatted? HTML. Guess how those
articles get to the app? HTTP._

This is misleading because it seems to equate pulling in content with loading
an article. By loading these articles into the app (as opposed to launching a
browser), Facebook gains the ability to do the following:

\- Provide the basic styles from app-install time. Most of your CSS doesn't
need to be downloaded.

\- Skip app boot time. Apps take a while to cold start, so staying in the app
means I don't have to start Safari before I start the web request.

\- Prefetch. I don't know if they do or will do this, but Facebook could
reasonably prefetch articles and their resources before you click.

 _2) Stop buying into the ‘native is better’ myth. (It’s just different.)_

Another phrase for "different" is "better at different things". It turns out
that the app model provides some amazing guarantees that the web can't match
(and vice-versa):

\- Preloaded styles and logic. Content changes, but presentation stays the
same, so only content needs downloading. (No, caching is not a full solution,
because exercising new parts of the code would still require a download,
though some app-manifest magic may let you get pretty close.)

\- Cheap library inclusion. If an app is an extra MB or so, it's no big deal
most of the time. You can use that to load in libraries that vastly speed up
developer work, and do "progressive enhancement" by supporting new APIs on
old OS versions. Libraries are way more costly on the web, as the author
points out.

\- Consistent (with the OS) UX and much cheaper animations

\- Lower memory footprint

These things do matter! The web is amazing, yes. No, it's not dying, but
there is value in native apps, and they're not equivalent. Smart engineers
will make the choice based on their needs, which is what both Facebook and
these news sources are doing.

~~~
paganel
For what it's worth, the Facebook app takes forever to load (like 3 minutes
or more) on my 3-year-old iPhone 4, while browsing for stuff in Safari still
works reasonably well, bar the odd website which throws a huge advert at you.

People would probably advise me to change my phone, to get a newer one, but
why should we change our phones every 2 years? Why create more waste? Why
spend money uselessly on stuff that should work for at least 5 years?

~~~
josephb
Even on an iPhone 6 the Facebook app can feel sluggish and clunky. The mobile
site in Safari is a good alternative and feels quicker.

------
Paul_S
I feel like I'm living under a rock, sheltered from all this, except when,
every so often, I click on a link on HN and get taken to some website from
hell. Complaining about that world is like complaining about how pointless
broadcast TV is. You can't fight it, and there isn't anything there worth
fighting for; you're not missing out on anything.

------
tempodox
The gist is for me: _The speed problem is not inherent to the web; it’s a
consequence of what passes for modern web development. Remove the cruft and we
can compete again._

My guess is, the cruft will not be removed, but will just be sped up with
WebAssembly etc. Building a web site that's functional and fast stays an art
form that's not accessible to everyone. Native apps are just a quick-&-dirty
cop-out from that problem.

~~~
dspillett
WebAssembly won't do anything for the latency of the myriad of support files
(Javascript, CSS, images), which is the key problem for small apps and
non-application pages. And HTTP2's better handling of multiple files (or, for
HTTP(S), grouping everything into one minified block) won't help if they are
coming from different third parties.

That is an issue that needs to be addressed by people changing how they
currently build some pages/apps. It is particularly bothersome on mobile,
stuck on a slow, high-latency connection.

------
omouse
Mozilla needs to have a separate web browser project that focuses solely on
cutting the cruft out of the top 100 websites plus the top 50 news sites.
Filter out the ads, filter out the JS that launches advertising modals, cache
the content; in short, do anything to make sure they load in under 2 seconds,
with 1 second or even 500ms as the goal.

When you control the client, you can stop so many unnecessary requests, and
you can patch sites (GreaseMonkey and Stylish were/are the best extensions)
as they arrive.

This is exactly why the web is built as it is; you don't _have_ to take
whatever crap is shoved at your browser, _you_ are in control.

~~~
pc2g4d
Except this probably violates everybody's Terms of Service....

------
jrochkind1
> Your problem is that if you put your developers ahead of your customers,
> you’ll end up with just the developers.

This implies it's just a matter of priorities. In fact, it's a matter of cost.
The OP admits that what he advocates "isn't easy", technically. That means
it's more expensive. More developer hours, more highly-skilled developers, or
both.

It is a matter of priorities in the sense that an organization needs to
understand what a good product looks like and be willing to spend the money
to create one, trusting that the investment will ultimately be repaid (as the
OP argues, though that's hardly guaranteed). But it's not about avoiding
"putting developers over customers"; in fact, they need to spend more on
developers than they do now to get a good product. They need to 'elevate'
their developers further, always remembering, of course, that the ultimate
goal is serving users.

~~~
woah
I hate to be that guy, but how about just serving some HTML and some CSS?

~~~
jrochkind1
I dunno, what do you think are the reasons media sites aren't doing that?

Ads? Tracking? What they think is a good design/UX that users will like, find
easy and enjoyable to read, and associate with professionalism and
credibility? (are they right or wrong?) Good UX for multi-media presentations?
Social media sharing icons? Good UX for mobile? Comments features? Other
stuff?

~~~
rjaco31
Basically a little bit of everything; it's just feature bloat.

------
wahsd
And Facebook users are its foot soldiers. As long as people converge on
Facebook, and while Zuckerberg moves ever closer to his goal of replacing the
internet with Facebook, the assault will not cease until society has been
subjugated and the internet domesticated.

------
GBKS
Here's a quick illustration I put together recently to show the cruft of
mobile news sites: [https://dribbble.com/shots/2118552-DBC-News-
Concept/attachme...](https://dribbble.com/shots/2118552-DBC-News-
Concept/attachments/384411)

I hope this whole discussion is a bit of a wake-up call for news sites to get
their priorities and tech straightened out.

