
How My Brain Kept Me from Co-Founding YouTube - nreece
http://prog21.dadgum.com/39.html
======
petercooper
In the late 90s I had a similarly bizarre "limitation" in that I didn't
"trust" drives that were over 10GB in size (because.. "how can it work? It's
just too much to get into a drive!"), so I just had multiple smaller drives
instead of one 20~40GB one.

Thankfully, like the writer, I eventually realized the stupidity of my ways,
but I think many of us are susceptible to this bug at least once ;-)

~~~
yan
What's funny is that I now have a similar fear of 1TB drives.. "If a lone
drive fails, that is a _lot_ of data to go at once"

~~~
buugs
Both fears seem pretty reasonable to me, probably because I don't buy a main
hard drive with more than 200GB, usually no more than 120.

~~~
10ren
I started coding on a ZX81 with 16K RAM.

In 2 years, that will be 30 years ago. If RAM doubles every 1.5 years, then in
2 years a new machine should have 16GB RAM (x1,000,000). At the moment they
have about 4GB, so it's not far off.

~~~
ovi256
You can already buy Mac Pros with 32GB of RAM. For about $8k, unfortunately.

~~~
10ren
You could also buy 32KB RAM for a ZX81 - but it felt a little out of reach. It
was expensive, unusual, non-standard.

The thing is, 16GB RAM will soon be ordinary. Every PC will have it. What
things become possible that are unthinkable today? Maybe multiple virtualized
OSes could become standard.

------
ashr
The point that the article makes is not a technical one.

It is a philosophical one instead. The ideas of tomorrow must look beyond what
is possible today into anticipating what may be possible tomorrow.

The execution of such an idea would then hold the promise of innovation to
break today's barriers.

------
10ren
It's Moore's Law: you're unconsciously calibrated to the past, and it doesn't
make sense to your hindbrain that you'll be out by orders of magnitude: x1000
in 15 years. I mean, that increase is completely unreasonable. Isn't it?

On the flip side, it also opens opportunities that people in the industry
simply didn't entertain, because they were literally unthinkable and therefore
unseeable just a few years ago (and I literally mean literally). I spoke to
the inventor of a new method (a few decades ago) of hardware multiplication
that became the standard for a while; he said it was made possible by an
increase in silicon. It was theoretically possible before then, but no one
looked.

------
smoody
Was he given the opportunity to co-found YouTube? That isn't clear from the
article.

~~~
JacobAldridge
I don't think so. I believe he had similar thoughts (streaming video online
would be great) but immediately dismissed them as requiring too much
bandwidth.

Had he had the foresight to avoid those limiting beliefs, he could have
invented something like YouTube.

~~~
dood
Thing is, a lot of people could have invented something like YouTube, and a
lot of people tried. But the YouTube people had the skills to grow and scale
their site in a phenomenal way, which is why they're the ones that are
remembered. YouTube exemplifies the notion that ideas are worthless, execution
is everything.

~~~
ericb
Right, but the article isn't really about creating youtube, it is about
realizing when you've mentally added an artificial constraint and not letting
it stop you.

~~~
smakz
Yeah, but in reality the ideas in the article are really more boring than the
headline suggests.

He wasn't in a position to co-found YouTube, not to mention there were already
plenty of video sites around before and during YouTube's rise to prominence.

It's a classic case of "well, I had that idea, I could have done it" - doing it
is 99.9999% of the exercise; having the idea is 0.0001%.

------
tophat02
The author's argument is a total non-sequitur. The whole "mental block" about
sites like YouTube and Flickr was that DEVELOPERS couldn't imagine having to
pay for such outrageous bandwidth usage (or that they'd even be able to find a
place to host the site, for that matter).

He then goes on to equate this to developers not wanting to have their users
download 64MB runtimes or 2GB IDEs. These two things are entirely different.

A developer knows that, while Joe User may have no problem waiting for a 500MB
video of some chick topless to download, the same user isn't going to want to
wait to download a 50MB installer for a program that makes Bingo cards.

~~~
patio11
_the same user isn't going to want to wait to download a 50MB installer for a
program that makes Bingo cards._

As something of the resident expert on this topic, let me tell you: users are
certainly willing to wait for a 12MB installer for a program that makes bingo
cards. It actually converts (very marginally) better than the 500KB installer
which needs Java as a pre-req.

~~~
tophat02
Hehe, I actually had you in mind when I wrote that :)

------
stcredzero
I created a deployment of a production Smalltalk application recently. (The
app has been around well over a decade.) 24MB as built; with the image
compression utility, 8MB.

The default VisualWorks image is 12MB. And I also worked on one production app
that was 110MB deployed. Most of that was data. We pre-cached all of the
queries required to populate all of the drop-downs, so that after it got done
loading, users found it to be "fast."

I thought of something like YouTube. "Flash video? No self-respecting techie
would use _Flash_!"

------
psranga
Hmm, maybe I'm too retro for this guy, but 16 bytes for an element in a linked
list does sound excessive. :)

~~~
blasdel
It's only two words! You need two 64-bit pointers on a 64-bit machine.

A fuckton of people have come up with the idea that "hey, since a 4GB address
space is really plenty, why don't we try using a half-width address space and
have a whole cons fit in one word?" I'm sure the same thing has come up plenty
of times before.

It turns out that avoiding direct use of the OS's VM and the hardware's MMU is
not so hot.

------
mynameishere
_Flickr blew my mind when it appeared back in 2004._

A mind easily blown. Stand back anytime this guy is near a website that is
anything more than an aggregate of img tags.

~~~
blasdel
Did you ever notice how flickr prevents you from seeing multiple high-res
images at once? You can't even see links to other photos when viewing a higher
res! The highest resolution you can see on a navigable page is 500px on a
side.

Flickr is fucking atrocious for actually viewing someone's photos. It's about
as shitty as they could possibly get away with.

On the other hand, it has absolutely terrific support for communities, with
all sorts of little touches to encourage their formation. It also manages to
simultaneously appeal to US Grandmas, Arab twentysomethings, and the requisite
"self facilitating media nodes" that fuel _WEB 2.0_ hype.

