The university organized a dinner every year to introduce scholarship students to their patrons. It was at the Ritz-Carlton and I remember feeling very, very underdressed. Anyhow, it turned out that our 90-something patron was simultaneously sponsoring about two dozen scholarship students, so rather than doing much talking I sipped a Coke and just listened to the dinner-table conversation.
Nemesis, in his oh-so-charming way, began bragging about a civil engineering project that he had been on ("As a sophomore -- really not something many people do, you realize") remodeling an overpass near the school. He was going into lots of irrelevant detail -- specs, etc. Our patron made the requisite politely interested noises and, at one point, suggested that a particular implementation detail might be improved upon. I recall it being something like the amount of reinforced concrete required.
Nemesis: "I don't know how things were when you were still working, old-timer, but I'm absolutely positive that blah blah blah."
90-something guy: "Oh, I guess it is possible that they've improved the formula since..."
Nemesis: "Since when anyhow?"
90-something guy: "Since I invented reinforced concrete."
The gentleman passed away a few years ago and, sure enough, that was quite prominent in his obituary.
We took CS201 (discrete math) together. One test was exactly one problem long and scheduled for a 90 minute class period. The core insight was that it decomposed into "Take the product of all f(N) together for N = 1 through N = 25", where f(N) was something looking vaguely algebra 2-y. Hand calculation looked like it would take, oh, north of an hour.
On inspection of f(N), I noticed that it contained the term "(BLAH BLAH BLAH BLAH BLAH) * (.5N - 11) / (BLAH BLAH BLAH)". This obviously makes f(22) zero, thus making simplifying the expression, or calculating f(1..21) and f(23..25) and then multiplying, rather moot.
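The shortcut can be sketched in a few lines. The f below is a made-up stand-in (the real exam expression is long gone), but any f containing a (.5N - 11) factor behaves the same way:

```python
from functools import reduce

# Hypothetical stand-in for the exam's f(N); the surrounding terms are
# invented, but the crucial (.5N - 11) factor is the one from the story.
def f(n):
    return (n ** 2 + 3) * (0.5 * n - 11) / (n + 7)

# The test asked for the product of f(N) for N = 1 through 25.
# Since f(22) = 0, the entire product collapses to zero.
product = reduce(lambda acc, n: acc * f(n), range(1, 26), 1)

assert f(22) == 0
assert product == 0
```

No simplification or hand-calculation required: one zero factor zeroes the whole product.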
Now, our professor was not a very tricksy guy; "Hah, betcha didn't see f(22) coming, now did you?" was not exactly his style. So I went back to check my assumptions. But they were, unquestionably, right.
So I went up approximately 4 minutes into the exam and said "Umm, sir, I'm done." He looked surprised. I said "If you're surprised, then I think there's a bug in the problem." He said I had probably made an arithmetic mistake and I would hate to lose all credit for the test.
Back to seat. Checked setup of expression. Nope, sorry, it was absolutely mathematically unavoidable that (.5N - 11) was in there.
So I went back to the prof, now 5:15 into the 90-minute test, and said "Sir, I really think I'm right here. I'll take the zero if I botched this -- can you take a look for me right now?" So he looks it over for 20 seconds, smiles, says "That is what I get for having grad students write my exams" (this quote is literally accurate), and takes my exam.
I had a friend in the class with whom I was going to go to a study group after the test was over. Not having anything else to do, I waited for him.
Twenty minutes later, Nemesis exits the class -- "first" to finish the test, naturally. Without prompting, he says "Don't feel bad, not everyone can hack discrete math. No reason to waste time on the exam if you're not going to get the right answer." "Ahh, out of idle curiosity, what did you get?" "It's complicated but..." biggest smile of my college career
Edit: Seems I've told the story before on HN. My last recounting, two years ago, has the wording very slightly different. http://news.ycombinator.com/item?id=829901
Patio11's comment that this was roughly 10 years ago almost fits with when this rather amazing gentleman passed away.
Look, we all know HNers are smart enough to Google up the particulars. That would put me in the awkward position of having published a private conversation by a deceased gentleman who generously paid for several dozen poor students to go to college, including me. I'd appreciate if you just wrote this one off to "I trust Patrick not to have invented this story out of whole cloth" and let it lie.
It's not like you told us he secretly punched babies to steal their toys. It was a charming anecdote. It makes him look like a great person with a sense of humor. If anything, the story is entirely flattering. This is not the stuff secrets are made of.
I've regretted being brash or self-promoting fairly regularly, but I have yet to regret being humble.
Though admittedly this may be related to the low sample size of the latter.
I always struggle to figure out when I'm crossing the line from being humble to self-deprecating (the latter not always being a good thing and something I've regretted on occasion).
One single job interview where they ended up hiring the guy with nearly identical qualifications. The guy leading the interview later told me on the phone "it was almost 50/50, but we had to pick only one of you", and the reason was that the other person appeared slightly more confident in his abilities using a particular framework. If I'd been a bit more self-promoting, that could've been me. Lesson learned, of course.
Of course there's a difference between confidence and being brash.
Humility != unconfidence
The annual WUSTL Engineering Scholarship Dinners were always very pleasant in my experience. If anything, the students were overly polite and felt a little intimidated in their sponsors' presence. We certainly felt intimidated by the formal atmosphere at the dinner.
Especially (but not just) because social grace strikes me as an arbitrary and moving point on an arbitrary and moving scale.
when people use iPads they end up just using technology to consume things instead of making things. With a computer you can make things. You can code, you can make things and create things that have never before existed and do things that have never been done before.
I think closed devices are really a step backwards. I learnt programming (and a whole lot more, like mathematics and logic), when I picked up a computer as a kid and discovered Python and started hacking around. I could look around for tutorials, download code or try out code from books and nobody stopped me.
In a somewhat similar vein, I am also sad that desktops are slowly going away. I learnt so much more about computers (and how it is not magic) by building my own.
I think something needs to be said about systems that allow you to explore and hack around for the fun of it.
I'm getting sick of this "iPad is for consumption" thing. How are you going to "produce" something if you haven't learned anything yet? For me, the iPad is a learning device. In the past year, I've read maybe 15 (technical) books on my iPad. I had wanted to read them for the past 3 years, I had downloaded their ebooks and stashed them neatly in a 'To-Read' folder, but I never got around to it on a Mac or PC because of their form factor.
http://itunes.apple.com/us/app/readdledocs-for-ipad-pdf-view... - Believe me, this app is fantastic. You don't know how great it is until you use it.
Something you miss is that the central directive for the iPad was a consumer experience. It's intended to be roughly equivalent to a TV. It's great that you choose to use it for Discovery Channel and the like, but you're probably not using it to write code. And that's fine; it's not meant for that, and that's the point.
Incidentally the original Macintosh did not grant shell access to the user either, and for that reason was a favored web server platform of certain DOD programs. Those worried that Apple is closing off the possibility of programming should consider a longer view: with OS X, Apple introduced shell access to their OS, which is a certified Unix. Even with Mountain Lion, they are still introducing additional CLI functionality.
A better word might be 'passive learning'.
Consumption without purpose is pointless. Your case may be an exception, but unfortunately I think it's more of a rarity than it is a representation of the common use of the device.
All one needs is a seed.
My seed was planted on a closed system: an Atari 2600. Specifically, one with a Spectravideo Compumate keyboard. http://www.atarimuseum.com/videogames/consoles/2600/compumat...
When I did as much as I could do on it, an Apple ][ clone was the next step.
When the iPad came out, I lamented the closed platform, forgetting that my Apple ][ was not where I started.
Something similar to my Spectravideo for iPad, is Codea http://twolivesleft.com/Codea/.
Instantly accessible, and instantly running your code. A perfect seed for a kid who might express an interest in programming.
Any kid who discovers a passion for programming can move on to whatever platform their desire takes them to. The ubiquity of the iPad and other tablets might even promote the introduction to programming for anyone who is curious enough.
Also, I wish people would stop calling the iPad a consumption device. It can be used this way. Many people do. But then, many people use their PC for little more than web browsing and to watch Netflix. It's not the platform, it's what interests them. My laptop is gathering dust because of my iPad.
The naturally curious will naturally do more within existing constraints, and then break free of them.
The naturally curious will achieve more where there are fewer constraints. Making the iPad the default device for every kid out there is just making things worse for them in the end. I remember the startup screen of the C64: you could just type "load" and "run" if you wanted to play a game, but basically you could also write some stuff in BASIC and run it yourself. You could tinker all you want without having to install anything.
The iPad IS a consumption device. By design. It's certainly not made to produce anything, and even where production is possible, it's never the best tool for the purpose. It's a device built on massive trade-offs.
But what about today? One never needs to type "load" and "run". Instead of discovery by the kid, a parent who thinks programming might be interesting for their kid is a prerequisite. And then they need to be able to do the research and install perhaps Python. And then what?
Playing around with Lua inside of Codea sent me right back to the immediate gratification of my Apple ][.
In the end, does the platform matter? When I was a kid, I took apart the TV so I could see what was inside. Curious kids today will be similarly compelled to jailbreak their iPad.
I'd be curious to know how many programmers there are here who are programmers because of a push in the right direction by an adult, or simply because they were curious kids.
Then, the absence of any kind of tools to recreate the very same applications you are using on your iPad reinforces that aspect. You actually have to SPEND money to develop for iPad, and you need a proper computer to do that. Net, it's not a device that is made for development (even if you wanted to), hence it's a consumption device only, by positioning.
Absolutely true. A lot of people like to mock the position that the iPad is a 'device for consumption' whenever a new story comes out about someone creating a song or painting on his iPad.
Well, there's a reason why it's worthy of a news story. You don't see articles written whenever someone creates a song on his Mac or PC.
I love my iPad. It's a great consumption device. But it's terrible for tinkering and creativity (and this limitation is by design).
Let me know when I can design flyers and brochures from it. Or write a program on it to do something useful that can be shared with others.
That's a bit like saying, "you don't see articles written whenever someone creates a song on his piano or violin", back in the 50s or 60s. There was a time when computers weren't used to produce music, pictures, or movies.
Rather than lament the current state of new mobile platforms, why not celebrate these amazing new platforms? They're increasingly becoming creating rather than consuming platforms and, even if we will never easily create apps directly on them (which I don't believe is true), they're wonderful new distribution platforms.
Just because they're not like the Apple ][s or C64s of yore doesn't mean it's the end of the hacking world. A new generation is growing up programming apps (yes, on their desktop computers right now) for this new world and can easily share their creations with millions. From where I'm sitting, this looks like a pretty interesting step forward.
(And for designing flyers and brochures on an iPad, you're welcome to use Pages or any number of other drawing or painting apps found on the App Store.)
Yes, and we're at that state right now with tablets. Sure, we'll get there eventually but it won't happen by fooling ourselves into contentment by saying they are already great for producing creative work.
> why not celebrate these amazing new platforms
I did say that I love my iPad. I've reached the point where I don't mind paying more for e-books that I can read on it than the dead-tree version that I have to lug around with me. I also use it extensively for mail, photos and browsing.
It may eventually become a tinkerer's device. It's not there yet though.
> And for designing flyers and brochures on an iPad, you're welcome to use Pages or any number of other drawing or painting apps found on the App Store
They let me use my own fonts (possibly in other languages)? Or let me cut a photo and place it on a page?
The latter may not exist because no one has done it yet (i.e., not a technical limitation). I'm pretty sure the former is a limitation of the platform rules that have been set.
Anyway, I don't really disagree with much of what you say. We will eventually reach a point where creating cool stuff on the iPad is the norm. But we're not there today.
> Or let me cut a photo and place it on a page?
1) Let me work with custom fonts
2) Let me import images and cut them and place them
3) Export the whole thing as pdf
(Bonus points for handling svg and clip art)
I have not found one that does. I didn't try Pages - I assume it's a word processing app like Word and not a page layout/design/typesetting app like InDesign.
Have you tried it? Does it let me do these things?
I've often toyed with the idea of writing it myself. But I stop at point (1) because loading custom fonts that don't ship as part of the app bundle doesn't seem to be supported by iOS. (And also because I tend to procrastinate)
Perhaps you should have read my requirements before mentioning Pages?
http://www.theipadguide.com/faq/can-i-add-fonts-iwork-keynot...
https://discussions.apple.com/thread/2387345?start=0&tst...
For Android, yeah.
Thing is, I'm teaching "creative computer stuff" (which includes programming) to kids at a volunteer centre. One of the kids came to me and wanted to know if I could help them build an app for their iPod Touch.
Turns out the cost of writing iDevice apps is quite prohibitive to most teenagers.
First, the SDK+simulator is only available for Macs running Mac OS X 10.5.4+, not just any "desktop computer" will do. That is a problem because the "Young Researcher's Centre" runs on donated PCs, most of them old WinXP office machines that are perfectly serviceable for most of our purposes. We never get Macs though (well one time a really ancient one, I doubt one could develop apps on it).
If you do have a Mac, apparently you can get the SDK/emulator for free, I think (Apple's site said "free!" a lot but kept on directing me to some page where I had to pay. I gave up when I found out it couldn't be done on a PC, anyway).
But then comes the next problem. Your app just runs on the simulator. You need to join the "iOS Developer Program" in order to actually run it on a real iDevice, unless you jailbreak it (which I'm not going to teach the kids, mainly because I don't want to be responsible if they accidentally brick it).
The iOS Developer Program costs $99 per year.
Trust me, no kid is going through all that trouble just to have "something that looks like an app running in a bloody simulator"; there has to be some tangible end-result.
From there (Wikipedia) "applications can be distributed in three ways: through the App Store, through enterprise deployment to a company's employees only, and on an "Ad-hoc" basis to up to 100 iPhones."
IMO it is ridiculous to teach children a valuable skill (programming) merely in order to immediately sell their services/work on some App Store market. They're children, they need to learn and play--of course if they want to try and sell their apps it can be a valuable experience but it should not be the only reason or the only way. The first point is that you write code for the sheer thrill of having that kind of control over your own devices.
I'm not sure how the "enterprise" method works, but I doubt we'd be eligible.
So finally you're left with paying $99 yearly in order to liberate 100 iPhones so you can write your own code for them. Whoop-tee-doo. And I'm probably forgetting about a whole bunch of ways Apple can get in your way as you try to actually go this route.
Suffice to say, after having figured this out, most kids realized their next phone was going to run Android.
The iPad is a fantastic creativity/tinkering machine for making music. It's not consumption, but performance.
And I've just spent a week in Yorkshire including a visit to Salts Mill. Hockney has an exhibition of some of his iPad paintings (projections) and some of the paintings he did using a graphics tablet on Photoshop. I actually liked the tablet/photoshop paintings more (composed of thin lines building shapes and shadings) than the iPad ones.
But I think that a 'direct manipulation' interface will allow creation in variety of media. I'm waiting for a notebook sized tablet with stylus input for an electronic moleskine.
Plus, I don't have to have it sitting there like I'm checking my email on stage. I hate that about laptop musicians, as if it is actually interesting to watch someone's glowing face when I'm out at night and want to have fun, away from the office. With the iPad form factor, at least, it lays flat and stays out of the way. And there is no QWERTY keyboard, so it's not easy for someone to make the association "that guy must be writing an email onstage instead of making live music", and believe me that happens a lot with laptop musicians around my parts.
It's not a great device for making music. It's an adequate device mostly used by hipsters (and I use that term offensively) to look cool, sort of like how "writers" hang out in coffee shops using Macbook Pros (or various other expensive laptops, like Vaios) to "write" the next American Novel.
Jordan Rudess isn't just anybody, he's one of the top keyboard players alive today. He's always experimented with alternative instrument interfaces and I think the fact that he's doing this on the iPad lends a lot to its credibility as a creative platform.
I also recently took it to Poland for a month and used it to compile a hundred pages of typed and handwritten notes and video footage--all the while using it to navigate my way around the unfamiliar terrain. I keep being surprised by my little sidekick's range of uses and flexibility...
They are UK products, couldn't find a reasonably sized VAIO on Amazon.com. The VAIO has lower specs, though.
like, you know, Gorillaz.
> But it's terrible for tinkering and creativity (and this limitation is by design)
> Let me know when I can design flyers and brochures from it
The hacking apps leave a lot to be desired, but also seem to be cropping up gradually. I'm sure soon enough we'll see some pretty powerful "make something new" kinds of tools on these devices (so sad that Scratch/BYOB can't happen on the iPad .. yet <<glaring at Apple>>). OTOH, the sun may burn out before the "take things apart" kind of tinkering happens with the full blessing of the device makers.
iBooks Author is a Mac app that lets you create for (in some cases exclusively for) the iPad. It is not an iPad app that lets you create with an iPad.
Also, nobody is stopping anybody from building desktop PCs. As long as people are willing to buy the components I'm sure somebody will be selling them.
People don't usually suggest a lamp as an introductory computing device for children.
edit: Or rather, I might jump on the internet and complain that the company making this particular lamp markets it as such.
Because it's a largely artificial restriction imposed by Apple to further their business interests at the expense of users. As noted below, the iPad form factor would be great for a programming environment like Squeak, but Apple says no. Now that we're starting to get Android tablets that don't suck, I'm hoping that we'll see some tools like that.
The tools to create are more democratized, which is extremely powerful. Seeing high-quality apps in the app store, and knowing that I have all the tools to do the same, would've been the ultimate challenge for me.
I would've spent my high-school years studying existing apps, open-source, etc, and trying to emulate successful ones. I know this b/c I DID spend most of my childhood trying to figure this stuff out, but available resources were so primitive (mid- to late-80's). Most people didn't even have a computer, and there was little open source or good-quality software to look at for inspiration.
Desktops are not going to "go away"; things will simply evolve. We have the technology to create many types of computers now, not just a box with a keyboard & monitor.
Yes, yes you do. The ridiculous lawsuit that Apple is currently waging against Samsung is a prominent example. Frivolous waste of time. "Let's patent a rectangle. Wait, a black rectangle with rounded edges!" Good thing no one told any TV manufacturers about the no-rectangle patent! Silly.
Don't get me wrong, though. I work on Macs every day; love 'em. But I think the platform wars should be over by now; being able to use any device or OS or application should be the focus, shouldn't it? (For my part, I'm enjoying the flexibility to create, code, whatever on open and expansive Android devices in addition to my Mac these days. And I'm grateful that OS X is now based on a BSD variant.)
But that is no reason to be complacent about the issue (or to think that this legality is written in anything but sand).
(I do think it is arbitrary; it would cost Apple very little to offer some mechanism to turn off the safety of only running apps from the App Store. The small impact of jailbreaks on the ecosystem suggests it wouldn't introduce many new problems.)
I don't see a shift. I see a dramatic growth in both.
I'm sure a number of people wouldn't be motivated to produce if there was not a large number of "consumers". Right?
That quote is from Genesis 11:6:
"And the Lord said, Behold, the people is one, and they have all one language; and this they begin to do: and now nothing will be restrained from them, which they have imagined to do" (http://www.biblegateway.com/passage/?search=Genesis+11%3A6...).
Edit to add: Yeah, I'm cranky. But this is something I've seen here over and over. Now for the downvotes to DEAD status.
Edited further to add: He said nothing is withheld, not that you could either do anything or automagically succeed. You can have a revelation -- but still eff it all up, because it's still ultimately up to you. Trust me, I've experienced that myself! Especially the effing it all up part. So don't do that.
That said, it would be really cool if he just went around every week and picked someone new to randomly inspire :).
cache:http://joelrunyon.com/two3/an-unexpected-ass-kicking -> http://webcache.googleusercontent.com/search?hl=en&outpu...
Sorry about this. Didn't think the site was going to get hammered like it did.
My current home page, http://canonical.org/~kragen/, benchmarks at 4000 requests per second with ab on localhost. That's roughly a thousand times the load you were under. This is not because our server is a thousand times faster than yours. (It's a dual-core 2.4GHz Celeron, running Apache.) This is because your server is doing a thousand times as much work as it needs to in order to serve that page.
If you haven't already, install W3 Total Cache or WP-Supercache.
Thanks for writing this up.
"SEAC was demonstrated in April 1950, and in May 1950 it went into full production, making it the first fully functional stored-program electronic computer in the US."
"On May 6, 1949 the EDSAC in Cambridge ran its first program, and due to this event, it is considered "the first complete and fully operational regular electronic digital stored-program computer"
Who, then, made the first real stored-program computer?
Personally, I think it's not an inherent limitation. I believe that tablets are fundamentally more usable for the vast majority of folks and I believe that they can and will be used for creating. More so, I think that putting computers into more people's hands will be a benefit to the human race, even merely in terms of creativity, that is almost impossible to estimate today.
I suppose you could circumvent it by coding on some web site or server. But nevertheless, I'd say that counts as "inherent in the tablet design". Of course there are other creative things to do besides coding. You can do some things with the iPad.
Or paraphrased slightly better by Walter Sobchak, "If you will it, Dude, it is no dream."
from Mein Kampf, by Adolf Hitler
You are never given a wish without also being given the power to make it true.
You may have to work for it, however.
"Attempt great things and great forces will come to your aid."
No idea who said it; I saw it on the side of a refrigerator at the dump while disposing of a deceased dishwasher.
The main reason I titled it as such was to show that while I'm working on doing hard things, Russell challenged me to really step my game up (aka kicking my ass). I meant it in the most complimentary way possible. You start to reframe the way you look at challenges when you meet someone who's done something like build the world's first internally programmable computer.
And your post was great, it was genuinely inspiring. Thank you very much for posting it, I would have loved to have been in your shoes on that day.
It's more of a cultural criticism, really: we often allow ourselves to see so many of these kinds of interactions as some sort of constant social competition, when there are so many other interesting ways of approaching things.
The first meaning: if you’ve conceived something in your mind, decide to do it, and are willing to put in the work – nothing can stop you.
The point of the quote is not some wishy-washy "you can if you believe you can". Rather, it might be phrased better as "If you have the vision to see how something might be achieved and the drive to actually achieve it, nothing can stop you from achieving it".
It's not about "just think and you can do it." Vision is key. If you can't conceptualize how something could possibly be created, then you're not going to be able to do it; being able to first conceive the idea takes you to the next step of actually attempting to create it.
I've observed it can be the transient signal of some deeper disappointment. Those things eat at people, and in a success focused society it can be hard if not impossible for people who have decided to measure themselves against other people's success to come to grips with that. If emotional damage had a unit of measure it would no doubt be the snark.
Many times I have felt the need to write something snarky in response to an article on HN, but what it really comes back to is my own pride or sense of inadequacy.
I need to learn to be positive and less dismissive.
Or do you often just accept the concepts of others as true and therefore beyond refutation?
(Really though, the flaw in your comment is that it's an impossible statement. It's like asking "If God is all-powerful, can he make a rock that even he cannot lift?")
For the first: liquids with zero viscosity exist, such as helium superfluids. - http://www.sciencemag.org/content/286/5438/213.summary
Or even 4-dimensional quantum time crystals (which, surprisingly, don't appear to be completely mental) - http://arxiv.org/abs/1202.2539
Course, you can't take any power off them and you have to put them somewhere very, very cold, but they will keep going, apparently.
As for the second, we know there are issues with the current physics, and I think that might just be what stuff like CERN is for. Or to prove it really, really well.