My favorite story in a related genre: I was a scholarship student at university, funded by a wealthy couple. Also at university I had someone who, over three classes taken together, had graduated from "rubs me the wrong way" to "nemesis." It turns out that he was also there on the same scholarship.
The university organized a dinner every year to introduce scholarship students to their patrons. It was at the Ritz-Carlton and I remember feeling very, very underdressed. Anyhow, it turned out that our 90-something patron was simultaneously sponsoring about two-dozen scholarship students, so rather than doing much talking I sipped a coke and just listened to the dinner table conversation.
Nemesis, in his oh-so-charming way, began bragging about a civil engineering project that he had been on ("As a sophomore -- really not something many people do, you realize") remodeling an overpass near the school. He was going into lots of irrelevant detail -- specs, etc. Our patron made the requisite politely interested noises and, at one point, suggested that a particular implementation detail might be improved upon. I recall it being something like the amount of reinforced concrete required.
Nemesis: "I don't know how things were when you were still working, old-timer, but I'm absolutely positive that blah blah blah.
90-something guy: "Oh, I guess it is possible that they've improved the formula since..."
Nemesis: "Since when anyhow?
90-something guy: "Since I invented reinforced concrete."
The gentleman passed away a few years ago and, sure enough, that was quite prominent in his obituary.
I seem to have totally hijacked this, so what's the harm in another Nemesis story:
We took CS201 (discrete math) together. One test was exactly one problem long and scheduled for a 90-minute class period. The core insight was that it decomposed into "take the product of all f(N) together for N = 1 through N = 25", where f(N) was something looking vaguely algebra 2-y. Hand calculation looked like it would take, oh, north of an hour.
On inspection of f(N), I noticed that it contained the term "(BLAH BLAH BLAH BLAH BLAH) * (.5N - 11) / (BLAH BLAH BLAH)". This obviously makes f(22) zero, and therefore the entire product zero, rendering moot any simplifying of the expression or calculating f(1..21) and f(23..25) and multiplying.
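(For the curious: the real f(N) is long gone from my memory, so here's a made-up Python stand-in; only the (.5N - 11) factor is from the actual problem.)

    # Toy stand-in for the exam's f(N); everything except the
    # (0.5*n - 11) factor is invented for illustration.
    def f(n):
        return (n**2 + 3) * (0.5 * n - 11) / (n + 7)

    # 0.5*22 - 11 == 0, so f(22) == 0 and the whole product collapses.
    product = 1.0
    for n in range(1, 26):  # N = 1 through 25
        product *= f(n)
    print(product == 0)  # True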
Now, our professor was not a very tricksy guy -- "Hah, betcha didn't see f(22) coming, now did you" was not exactly his style -- so I went back to check my assumptions. But they were, unquestionably, right.
So I went up approximately 4 minutes into the exam and said "Umm, sir, I'm done." He looked surprised. I said "If you're surprised, then I think there's a bug in the problem." He said I had probably made an arithmetic mistake and I would hate to lose all credit for the test.
Back to seat. Checked setup of expression. Nope, sorry, it was absolutely mathematically unavoidable that (.5N - 11) was in there.
So I went back to the prof, now 5:15 into the 90-minute test, and said "Sir, I really think I'm right here. I'll take the zero if I botched this -- can you take a look for me right now?" So he looks it over for 20 seconds, smiles, says "That is what I get for having grad students write my exams" (this quote is literally accurate), and takes my exam.
I had a friend in the class with whom I was going to go to a study group after the test was over. Not having anything else to do, I waited for him.
Twenty minutes later, Nemesis exits the class -- "first" to finish the test, naturally. Without prompting, he says "Don't feel bad, not everyone can hack discrete math. No reason to waste time on the exam if you're not going to get the right answer." "Ahh, out of idle curiosity, what did you get?" "It's complicated, but..." Biggest smile of my college career.
It's probably not word-for-word literally accurate, given that it is my ~10 year old recollection of a conversation and as an Irish storyteller I generally don't let facts get in the way of a Narrative. That said, this guy was routinely obnoxious enough to drive me to hatred just with classroom comments.
My guess is that the invention is not an invention as such, but more along the lines of "he made it practical in a huge way and showed everyone else following him how to do it". There is often a very long lag between the actual "first to invent" and "figured out how to make it practical without it being stupidly expensive", and oftentimes the latter is regrettably (IMHO) not recognized nearly as much as the original inventor. A quick bit of Googling led me to Tung-Yen Lin [1], and prestressed concrete, a relatively (as these things go in civil engineering) modern technique.
Patio11's comment that this was roughly 10 years ago almost fits with when this rather amazing gentleman passed away.
No, not him, and not interested in playing 20 questions.
Look, we all know HNers are smart enough to Google up the particulars. That would put me in the awkward position of having published a private conversation by a deceased gentleman who generously paid for several dozen poor students to go to college, including me. I'd appreciate if you just wrote this one off to "I trust Patrick not to have invented this story out of whole cloth" and let it lie.
That would put me in the awkward position of having published a private conversation by a deceased gentleman who generously paid for several dozen poor students to go to college, including me.
It's not like you told us he secretly punched babies to steal their toys. It was a charming anecdote. It makes him look like a great person with a sense of humor. If anything, the story is entirely flattering. This is not the stuff secrets are made of.
Sorry, wasn't trying to trace the actual name. I knew Monier's name from some documentary I watched and was confused about whether it was you who met him or whether you were sharing someone else's century-old anecdote.
I think the comment was meant to point out that Monier is supposed to be one of the principal inventors of reinforced concrete and that he passed away in 1906 ... hence trying to figure out how the gentleman in the story fit in.
What formula? It is concrete reinforced with rebar. My guess is the OP just misremembered the specific thing the patron invented. It's a great story either way.
Might just be misremembering the specific retort he made - he is famous for use of reinforced concrete. (I checked that after the last time I told the story, to make sure I hadn't invented that part.)
When you're not humble enough, bad things happen. When you're too humble, good things that would have happened don't happen. That makes it harder to regret being too humble, because you usually don't know about the things that could have happened.
I always struggle to figure out when I'm crossing the line from being humble to self-deprecating (the latter not always being a good thing and something I've regretted on occasion).
One single job interview where they ended up hiring the guy with nearly identical qualifications. The guy leading the interview later told me on the phone "it was almost 50/50, but we had to pick only one of you", and the reason was that the other person appeared slightly more confident in his abilities using a particular framework. If I'd been a bit more self-promoting, that could've been me. Lesson learned, of course.
Of course there's a difference between confidence and being brash.
Wow, I can't believe that a student had the gall to talk like that to his sponsor. The admissions process usually weeds out those types of students, but a few bad apples can always get through.
The annual WUSTL Engineering Scholarship Dinners were always very pleasant in my experience. If anything, the students were overly polite and felt a little intimidated, both in their sponsors' presence and by the formal atmosphere of the dinner.
You see "not intimidated by authority"; I see "asinine". Part of the point, actually, was that "nemesis" didn't know that the fellow was an authority on the topic. Maybe if he had, he would have been intimidated. ;)
Did I imply otherwise? That said, I think socially ungraceful people should have access to higher education, too, and we'd be a much poorer world if they hadn't had it in the past.
Especially (but not just) because social grace strikes me as an arbitrary and moving point on an arbitrary and moving scale.
when people use iPads they end up just using technology to consume things instead of making things. With a computer you can make things. You can code, you can make things and create things that have never before existed and do things that have never been done before.
I think closed devices are really a step backwards. I learnt programming (and a whole lot more, like mathematics and logic), when I picked up a computer as a kid and discovered Python and started hacking around. I could look around for tutorials, download code or try out code from books and nobody stopped me.
In a somewhat similar vein, I am also sad that desktops are slowly going away. I learnt so much more about computers (and how they are not magic) by building my own.
I think something needs to be said about systems that allow you to explore and hack around for the fun of it.
Maybe some people use their iPads for Facebook/Twitter/YouTube. I don't give a damn. They're the same boys/girls that were watching celebrity news and other useless TV shows 10 years ago, when you and I were watching "scientific/educational" shows on TV and hacking away with computers.
I'm getting sick of this "iPad is for consumption" thing. How are you going to "produce" something if you haven't learned anything yet? For me, the iPad is a learning device. In the past year, I've read maybe 15 (technical) books on my iPad. I had wanted to read them for the past 3 years, had downloaded their ebooks and stashed them neatly in a 'To-Read' folder, but never got to do so on a Mac or PC because of their form factor.
In the past 2 weeks alone, I read 'JavaScript: The Good Parts' and the first 5 chapters of 'Pro Git' (ePub, in iBooks), 70% of 'Head First C' (PDF, in the fantastic ReaddleDocs [1] app) and 20 other technical/"intellectual" articles in Instapaper. Without the iPad, I wouldn't have read them.
> I'm getting sick of this "iPad is for consumption" thing.
Something you miss is that the central directive for the iPad was a consumer experience. It's intended to be roughly equivalent to a TV. It's great that you choose to use it for Discovery Channel and the like, but you're probably not using it to write code. And that's fine; it's not meant for that -- and that's the point.
The consumer experience extends beyond just watching TV, to things like managing and viewing photos, drawing, music, and interacting with others online. Here on HN it is easy to reduce "production" to coding. And it's true--the iPad is not a good platform for coding because the user does not have shell access, and Apple does not allow IDE apps in the App Store. However, the simplicity of the interface, the quality of the display, and the extreme mobility of the form factor, make it great for the more "consumer" type of production I mention above.
Incidentally the original Macintosh did not grant shell access to the user either, and for that reason was a favored web server platform of certain DOD programs. Those worried that Apple is closing off the possibility of programming should consider a longer view: with OS X, Apple introduced shell access to their OS, which is a certified Unix. Even with Mountain Lion, they are still introducing additional CLI functionality.
It's an Americanism, too. The last time I was playing tourist in Mexico, they talked about how amazing TVs were because they were basically used as free, national education.
The problem is that if more and more people use these "consumption" devices, the choices become increasingly closed and restricted for producers.
True, but you're (hopefully) taking that knowledge and putting it towards actually DOING something.
Consumption without purpose is pointless. Your case may be an exception, but unfortunately I think it's more of a rarity than it is a representation of the common use of the device.
Sometimes I catch myself thinking this way and need to remind myself of my own experience.
All one needs is a seed.
My seed was planted on a closed system: an Atari 2600. Specifically, one with a Spectravideo Compumate keyboard. http://www.atarimuseum.com/videogames/consoles/2600/compumat...
When I did as much as I could do on it, an Apple ][ clone was the next step.
When the iPad came out, I lamented the closed platform, forgetting that my Apple ][ was not where I started.
Something similar to my Spectravideo for the iPad is Codea: http://twolivesleft.com/Codea/.
Instantly accessible, and instantly running your code. A perfect seed for a kid who might express an interest in programming.
Any kid who discovers a passion for programming has options to move on to whatever platform their desire takes them. The ubiquity of the iPad and other tablets might even introduce programming to anyone who is curious enough.
Also, I wish people would stop calling the iPad a consumption device. It can be used that way. Many people do. But then, many people use their PCs for little more than web browsing and watching Netflix. It's not the platform, it's what interests them. My laptop is gathering dust because of my iPad.
The naturally curious will naturally do more within existing constraints, and then break free of them.
>> "The naturally curious will naturally do more within existing constraints, and then break free of them."
The naturally curious will achieve more where there are fewer constraints. Making the iPad the default device for every kid out there is just making things worse for them in the end. I remember the startup screen of the C64: you could just type "load" and "run" if you wanted to play a game, but you could also write some stuff in BASIC and run it yourself. You could tinker all you want without having to install anything.
The iPad IS a consumption device. By design. It's certainly not made to produce anything, and even where production is possible, it's never the best tool for the purpose. It's a device of massive trade-offs.
Yes, we could tinker away with all the computers from the early 80s. We were lucky, this lower level of access was unavoidable.
But what about today? One never needs to type "load" and "run". Instead of discovery by the kid, a parent who thinks programming might be interesting for their kid is a prerequisite. And then they need to be able to do the research and install, say, Python. And then what?
Playing around with Lua inside of Codea sent me right back to the immediate gratification of my Apple ][.
In the end, does the platform matter? When I was a kid, I took apart the TV so I could see what was inside. Curious kids today will be similarly compelled to jailbreak their iPad.
I'd be curious to know how many programmers there are here who are programmers because of a push in the right direction by an adult, or simply because they were curious kids.
As I said, it's a matter of design. The iPad is not made for entering information (except tiny bits); it is clearly a "click/touch"-centric device. No keyboard by default says it all. You won't write a full novel on an iPad. You won't code something very long on it. And you won't even have the tools at hand to develop the apps that you use every day. It is completely asymmetric, and the barrier for curious kids to do something is huge, whereas the same kids can just open a terminal on a Linux device and start experimenting away.
How about that: I wrote my first programs with pencil, on paper. I had no computer at home and very very limited access to computers at the university. Then I had to retype everything from my notes.
It's not JUST a matter of the keyboard, though that's important. The visual cue you get from the iPad is a touchscreen. A computer, by design, makes the keyboard available at all times; it's integral to the nature of the computer. On the iPad, the keyboard becomes an option, not something necessary to do anything. The fact that the keyboard is relegated to the background says a lot about the positioning of the device. I'm not saying this is right or wrong, but what I'm saying is that it's clearly a consumer-centric device because of the design.
Then, the absence of any kind of tools to recreate the very same applications you are using on your iPad reinforces that aspect. You actually have to SPEND money to develop for the iPad, and you need a proper computer to do it. Net, it's not a device that is made for development (even if you wanted to use it that way); hence it's a consumption-only device, by positioning.
when people use iPads they end up just using technology to consume things instead of making things. With a computer you can make things.
Absolutely true. A lot of people like to mock the position that the iPad is a 'device for consumption' whenever a new story comes out about someone creating a song or painting on his iPad.
Well, there's a reason why it's worthy of a news story. You don't see articles written whenever someone creates a song on his Mac or PC.
I love my iPad. It's a great consumption device. But it's terrible for tinkering and creativity (and this limitation is by design).
Let me know when I can design flyers and brochures from it. Or write a program on it to do something useful that can be shared with others.
Well, there's a reason why it's worthy of a news story. You don't see articles written whenever someone creates a song on his Mac or PC.
That's a bit like saying, "you don't see articles written whenever someone creates a song on his piano or violin", back in the 50s or 60s. There was a time when computers weren't used to produce music, pictures, or movies.
Rather than lament the current state of new mobile platforms, why not celebrate these amazing new platforms? They're increasingly becoming creation rather than consumption platforms and, even if we never get to easily create apps directly on them (which I don't believe will be the case), they're wonderful new distribution platforms.
Just because they're not like the Apple ][s or C64s of yore doesn't mean it's the end of the hacking world. A new generation is growing up programming apps (yes, on their desktop computers right now) for this new world and can easily share their creations with millions. From where I'm sitting, this looks like a pretty interesting step forward.
(And for designing flyers and brochures on an iPad, you're welcome to use Pages or any number of other drawing or painting apps found on the App Store.)
I do all of my wireframing and app-planning work exclusively on the iPad. With the iPad, I can actually create content while sardined in the subway. These "consumption-only" folks are just parroting Microsoft-sponsored drivel.
Your wireframing and app-planning work don't fall in the category of creating things that have never before existed and have never been done before -- that was the quoted line from Russell Kirsch. The MS-line-parroting bit is just defensive fanboy bait. But whatever floats your boat. In any case, it is what it is: a consumption device.
Had you gone with being able to edit HTML, JavaScript, and CSS on it, or even code and compile Java in the AIDE app... but no, it was an attack that you had to counter with an anecdote. It's like textbook dumbed-down consumer BS galore over there. But the guy in the article, he gets to be the parrot...
I didn't establish that creation is circumscribed to writing software. Had I mentioned writing music or editing video, that would be outside the scope of what the poster established as the field of interest for the usage of the device in the anecdote. It still would not have made any difference to the point: it would have contradicted the quote presented in the article.
But thanks for the one-liner and the downvote; I see how what I did not say was probably confusing to you.
There was a time when computers weren't used to produce music, pictures, or movies.
Yes, and we're at that state right now with tablets. Sure, we'll get there eventually but it won't happen by fooling ourselves into contentment by saying they are already great for producing creative work.
why not celebrate these amazing new platforms
I did say that I love my iPad. I've reached the point where I don't mind paying more for e-books that I can read on it than the dead-tree version that I have to lug around with me. I also use it extensively for mail, photos and browsing.
It may eventually become a tinkerer's device. It's not there yet though.
And for designing flyers and brochures on an iPad, you're welcome to use Pages or any number of other drawing or painting apps found on the App Store
They let me use my own fonts (possibly in other languages)? Or let me cut a photo and place it on a page?
The latter may not exist because no one has done it yet (ie, not a technical limitation). I'm pretty sure the former is a limitation of the platform rules that have been set.
Anyway, I don't really disagree with much of what you say. We will eventually reach a point where creating cool stuff on the iPad is the norm. But we're not there today.
That wasn't rhetoric. I actually want an app that can do these three things:
1) Let me work with custom fonts
2) Let me import images and cut them and place them
3) Export the whole thing as pdf
(Bonus points for handling svg and clip art)
I have not found one that does. I didn't try Pages - I assume it's a word processing app like Word and not a page layout/design/typesetting app like InDesign.
Have you tried it? Does it let me do these things?
I've often toyed with the idea of writing it myself. But I stop at point (1) because loading custom fonts that don't ship as part of the app bundle doesn't seem to be supported by iOS. (And also because I tend to procrastinate)
I don't know about SVGs, but Pages can do all of this stuff. It's not just a word processor, but a very good page layout tool. You might want to check it out before dismissing the iPad so quickly.
> A new generation is growing up programming apps (yes, on their desktop computers right now) for this new world and can easily share their creations with millions.
For Android, yeah.
Thing is, I'm teaching "creative computer stuff" (which includes programming) to kids at a volunteer centre. One of the kids came to me and wanted to know if I could help them build an app for their iPod Touch.
Turns out the cost of writing iDevice apps is quite prohibitive for most teenagers.
First, the SDK+simulator is only available for Macs running Mac OS X 10.5.4+; not just any "desktop computer" will do. That is a problem because the "Young Researcher's Centre" runs on donated PCs, most of them old WinXP office machines that are perfectly serviceable for most of our purposes. We never get Macs though (well, one time a really ancient one; I doubt one could develop apps on it).
If you do have a Mac, apparently you can get the SDK/emulator for free, I think (Apple's site said "free!" a lot but kept on directing me to some page where I had to pay. I gave up when I found out it couldn't be done on a PC, anyway).
But then comes the next problem. Your app just runs on the simulator. You need to join the "iOS Developer Program" in order to actually run it on a real iDevice, unless you jailbreak it (which I'm not going to teach the kids, mainly because I don't want to be responsible if they accidentally brick it).
The iOS Developer Program costs $99 per year.
Trust me, no kid is going through all that trouble just to have "something that looks like an app running in a bloody simulator", there has to be some tangible end-result.
From there (Wikipedia) "applications can be distributed in three ways: through the App Store, through enterprise deployment to a company's employees only, and on an "Ad-hoc" basis to up to 100 iPhones."
IMO it is ridiculous to teach children a valuable skill (programming) merely in order to immediately sell their services/work on some App Store market. They're children, they need to learn and play--of course if they want to try and sell their apps it can be a valuable experience but it should not be the only reason or the only way. The first point is that you write code for the sheer thrill of having that kind of control over your own devices.
I'm not sure how the "enterprise" method works, but I doubt we'd be eligible.
So finally you're left with paying $99 yearly in order to liberate 100 iPhones so you can write your own code for them. Whoop-tee-doo. And I'm probably forgetting about a whole bunch of ways Apple can get in your way as you try to actually go this route.
Suffice to say, after having figured this out, most kids realized their next phone was going to run Android.
I've seen an electronic music group perform with some iPads used as controllers and others making sound directly, including modifying the sound of a cello. One chap was using a thinkpad with a games controller to be different I suppose...
And I've just spent a week in Yorkshire including a visit to Salts Mill. Hockney has an exhibition of some of his iPad paintings (projections) and some of the paintings he did using a graphics tablet on Photoshop. I actually liked the tablet/photoshop paintings more (composed of thin lines building shapes and shadings) than the iPad ones.
But I think that a 'direct manipulation' interface will allow creation in a variety of media. I'm waiting for a notebook-sized tablet with stylus input for an electronic Moleskine.
I disagree. It's "fantastic" as a generic (midi) control panel (similar to the Lemur, for instance Touch OSC or Max/MSP). Of course, you could usually achieve the same thing with an old second-hand Behringer BCF-2000 (or BCR-2000). Bonus: You get motorized faders and real knobs that you can touch.
Well, I have 2 iPads and an iPhone which I use as a portable studio these days, jamming with my band .. and I use the iPads just like VST plugin hosts on my PC .. since I play keyboards in the band, it's awesome! Plug in, fire up a few apps, and off we go .. a very, very versatile creativity machine. Beats the DAW by far!
It's simple: it just plain works. Pretty much, always. I plug in a MIDI keyboard and controller, fire up a synth app, and off we go. Stable, sounds great (easily comparable with any PC plugin), and just plain works. I have two iPads and an iPhone in my studio rig now, plus a couple of Akai controllers (pads/keys/sliders/knobs), and as a portable action studio this configuration just rocks.
Plus, I don't have to have it sitting there like I'm checking my email on stage. Hate that about laptop musicians -- as if it is actually interesting to watch someone's face glow behind a screen when I'm out at night and want to have fun, away from the office .. with the iPad form factor, at least, it lays flat and stays out of the way. And there is no QWERTY keyboard, so it's not easy for someone to make the association "that guy must be writing an email onstage instead of making live music", and believe me, that happens a lot with laptop musicians around my parts.
Because...magic. And fairy dust. And nobody who uses a laptop could possibly be cool.
It's not a great device for making music. It's an adequate device mostly used by hipsters (and I use that term offensively) to look cool, sort of like how "writers" hang out in coffee shops using Macbook Pros (or various other expensive laptops, like Vaios) to "write" the next American Novel.
As a musician, I disagree. The iPad makes for a great, adaptable electronic instrument. The major feature of tablets is that they become the app you're running. This lets the designers of the various synths for the iPad experiment with interfaces in ways that were never before possible. Check out some of the stuff that Jordan Rudess's Wizdom Music is putting out: http://www.wizdommusic.com/
Jordan Rudess isn't just anybody, he's one of the top keyboard players alive today. He's always experimented with alternative instrument interfaces and I think the fact that he's doing this on the iPad lends a lot to its credibility as a creative platform.
I bought the iPad on a total whim and have been repaid many times over by the enjoyment it continues to provide. I'm an old-school musician from the days when sounds were 'etched' on analog tape and I can tell you that as far as I'm concerned, it is a GREAT device for making music and one that is only going to get better as software evolves to exploit the full potential of its touch panel.
I also recently took it to Poland for a month and used it to compile a hundred pages of typed and handwritten notes and video footage--all the while using it to navigate my way around the unfamiliar terrain. I keep being surprised by my little sidekick's range of uses and flexibility...
I agree...and Garage Band is the equivalent of a very low-level Pro Tools, if you will, and best of all, it's free. I do wish there was an easier way to sync tempos on the program but it's good enough for a basic user.
The artist David Hockney loves iPads so much that he actually has had his suits modified to include a suitable pocket - here is a video of him using an iPad to draw in a cafe:
This was one of the strongest criticisms of the original iPad by Alan Kay to Steve Jobs iirc. After that criticism, we got Garage Band for iPad (which imo is so insanely awesome I bet it will take at least 2 years to appear on other devices at that level of quality). We also have iBooks author, Brushes, and such "creative" apps. Overall, I see more "create" type of apps coming up.
The hacking apps leave a lot to be desired, but also seem to be cropping up gradually. I'm sure soon enough we'll see some pretty powerful "make something new" kinds of tools on these devices (so sad that Scratch/BYOB can't happen on the iPad .. yet <<glaring at Apple>>). OTOH, the sun may burn out before the "take things apart" kind of tinkering happens with the full blessing of the device makers.
I think the touch input is simply much less practical for certain repetitive tasks than a mouse and keyboard, due to the ergonomics. For text editing, lifting your arms from the keyboard area of the screen, or from a Bluetooth keyboard, to scroll the page, select a tool, insert the cursor, select some content, etc. is much less efficient than reaching sideways for a mouse and scrolling its wheel or clicking its buttons, with only subtle movements needed to position the cursor while you rest your hand on the mouse. It is both tiring and tedious to perform certain tasks without a mouse, and until eyeball tracking is workable, you're still going to want a mouse and keyboard device accessible.
I don't really understand why people complain about this. iPads aren't tools for creating things. Do you get upset that your lamp doesn't allow you to create things? If you want a tablet that lets you hack it however you want then hop on Octopart, order the components you need, and start hacking away. But that's not the product Apple is selling, and it's not the product most people want to buy.
Also, nobody is stopping anybody from building desktop PCs. As long as people are willing to buy the components I'm sure somebody will be selling them.
Add a long piece of paper and suddenly you have a limited manual Turing machine. Add some LEGO Mindstorms, and suddenly you have a limited automatic Turing machine.
I don't really understand why people complain about this.
Because it's a largely artificial restriction imposed by Apple to further their business interests at the expense of users. As noted below, the iPad form factor would be great for a programming environment like Squeak, but Apple says no. Now that we're starting to get Android tablets that don't suck, I'm hoping that we'll see some tools like that.
If modern smartphones had existed when I was a kid, I'm 100% sure I would've been building (probably crappy) games and apps for them when I was 12.
The tools to create are more democratized, which is extremely powerful. Seeing high-quality apps in the app store, and knowing that I have all the tools to do the same, would've been the ultimate challenge for me.
I would've spent my high-school years studying existing apps, open source, etc., and trying to emulate successful ones. I know this b/c I DID spend most of my childhood trying to figure this stuff out, but available resources were so primitive (mid-to-late '80s). Most people didn't even have a computer, and there was little open source or good-quality software to look at for inspiration.
There will always be systems that can be explored & hacked as long as there is a community interested in doing so.
Desktops are not going to "go away"; things will simply evolve. We have the technology to create many types of computers now, not just a box with a keyboard & monitor.
Is that really true? There are Python "apps" for the iPad, but that's not what I mean. I like tinkering; always have. My parents tell stories about how, when I was a child, I'd take things apart instead of playing with toys. IMO that type of curiosity is what makes a "hacker". Regarding Apple products, part of the fun is jailbreaking the device and figuring out how it works, trying to break the security, etc. This will teach you A LOT. Yes, you can tinker with Android devices, but with something closed like the iPad I get a drive to figure out how it works and break it. Apple might not like it much when their devices are jailbroken (or the Cydia store), but you don't see them suing people like geohot as Sony does.
"...but you don't see them suing people like geohot as sony does."
Yes, yes you do. The ridiculous lawsuit that Apple is currently waging against Samsung is a prominent example. Frivolous waste of time. "Let's patent a rectangle. Wait, a black rectangle with rounded edges!" Good thing no one told any TV manufacturers about the no-rectangle patent! Silly.
Don't get me wrong, though. I work on Macs every day; love 'em. But I think the platform wars should be over by now; being able to use any device or OS or application should be the focus, shouldn't it? (For my part, I'm enjoying the flexibility to create, code, whatever on open and expansive Android devices in addition to my Mac these days. And I'm grateful that OS X is now based on a BSD variant.)
In the US, at least, the legality of jailbreaking currently rests on a temporary DMCA exception that needs to be renewed every three years. Prospects for the renewal look good at the moment:
I picked up a deep love for using computers via a terminal with a 300 baud modem. Not that I don't agree with the sentiment, just saying there are a lot of paths.
You know, I've learned (and still am) a lot about computers and programming from books too. The paper books. Those horrible horrible things that allowed consumption only.
Also there are millions of people using desktops for consumption only.
So no, no discussion about this. False dichotomies are not something I am going to discuss.
Right, because the limitations of paper are largely an arbitrary software implementation detail.
(I do think it is arbitrary; It would cost Apple very little to offer some mechanism to turn off the safety of only running apps from the app store; the small impact of jailbreaks on the ecosystem suggests it wouldn't introduce many new problems)
Well, it's a relative thing. At the time it was built, it was probably one of the easiest platforms to create things on, other platforms at the time involving actual physical equipment and work.
Well, within the production end, there's a shift from "tinkering and having fun" to "this is something done by professionals, now shoo, kid". The necessary licensing fee to develop for iOS only reinforces this notion.
Absolutely. That was the bit that made me continue reading because I liked where the story was going, then it suddenly took a turn for the unexpectedly more awesome :)
Nothing is withheld from us what we have conceived to do.
That quote is from Genesis 11:6:
"And the Lord said, Behold, the people is one, and they have all one language; and this they begin to do: and now nothing will be restrained from them, which they have imagined to do" (http://www.biblegateway.com/passage/?search=Genesis+11%3A6...).
I hate it when simple declarative statements are dissected by "lawyer-think." You entirely miss his point doing that. Did he stop and qualify anything? Stop and think. He's in his eighties, FFS. Do you think he's spouting gibberish or something he deeply thought about and saw actually happen? "But this..." and "But that..." -- you're creating your own damn roadblocks. And then wonder why you can't move forward.
Edit to add: Yeah, I'm cranky. But this is something I've seen here over and over. Now for the downvotes to DEAD status.
Edited further to add: He said nothing is withheld, not that you could either do anything or automagically succeed. You can have a revelation -- but still eff it all up, because it's still ultimately up to you. Trust me, I've experienced that myself! Especially the effing it all up part. So don't do that.
Wouldn't it be great if Russell Kirsch went from coffee shop to coffee shop having similar conversations? Kinda like the hacker version of that scene in the movie Soapdish where Sally Field and Whoopi Goldberg go to the mall and Whoopi pretends to recognize Sally. In the movie, the result was a flock of women crowding around her. In Russell's version, each encounter would end with an inspirational blog post or tweet.
He's actually pretty low-key. I emailed him about the post and he simply said thanks. Really nice and genuine man. Really glad to have had 20-30 minutes to talk with him.
That said, it would be really cool if he just went around every week and picked someone new to randomly inspire :).
For those who don't know (every time I mention this, I get like 10 upvotes. Apparently a lot of people even on HN don't know about it!), if you want to get a (Google) cached version of a page, just type 'cache:[URL]' in Google search bar and press return.
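For example, searching for "cache:example.com" (substituting whatever URL you're after) brings up Google's stored copy of that page.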
Thanks sir! And never underestimate the value of little cantrips like that to somebody who isn't aware of them. I was astounded that most people didn't know CTRL+F, and I always ask people if they know about it. (http://www.theatlantic.com/technology/archive/2011/08/crazy-...)
Does Google document these all anywhere? I could've sworn they used to have a page listing them, but I can't find it anymore. There are a few third-party listings, though, e.g. this one from SEOmoz: http://www.seomoz.org/article/the-professionals-guide-to-adv...
Depends. 50k hits in 3 hours is about 5 hits a second. I could serve more than 10 hits a second on the first web server I ever set up, in 1994, on an SGI Indigo 2 with maybe a 250MHz CPU and 256 megabytes of RAM, using NCSA httpd, which spawned a new process for every hit. Under IRIX. While I was using it as a workstation. Also, doing a reverse DNS lookup for every hit for the access log.
My current home page, http://canonical.org/~kragen/, benchmarks at 4000 requests per second with ab on localhost. That's roughly a thousand times the load you were under. This is not because our server is a thousand times faster than yours. (It's a dual-core 2.4GHz Celeron, running Apache.) This is because your server is doing a thousand times as much work as it needs to in order to serve that page.
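(Back-of-the-envelope, as a Python sketch -- the only inputs from the posts above are the 50k-hits-in-3-hours figure and the 4000 requests/second benchmark; the rest is arithmetic:)

    # 50,000 hits spread over 3 hours:
    hits_per_sec = 50000.0 / (3 * 60 * 60)
    print(hits_per_sec)            # ~4.63 requests/second

    # Static page benchmarked with ab on localhost:
    ab_rate = 4000.0               # requests/second
    print(ab_rate / hits_per_sec)  # ~864, i.e. roughly a thousand times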
I'm asserting that your server isn't a thousand times slower because a thousand times slower would be a high-end eight-bit microcontroller. An Arduino, say. Feel free to correct me if your server is in fact running on an Arduino or equivalent.
I've had my tech guy speed things up. Got off shared hosting, on a VPS and should have things sped up all around. Still need to cut all the javascript in the current theme, but that's another project for another day.
With an out-of-the-box configuration, WordPress is notorious for dying if you so much as breathe on it. It requires generous lashings of caching to make it workable.
If you haven't already, install W3 Total Cache or WP-Supercache. Roughly speaking, they render each page once and serve the stored copy as a static file afterwards, so most hits never touch PHP or MySQL at all.
It's on the top page of HN right now, and following the link gets me "Error establishing a database connection." So all I know about the article as I'm writing this is the title; it's pretty accurate :)
"SEAC was demonstrated in April 1950, and in May 1950 it went into full production, making it the first fully functional stored-program electronic computer in the US."
"On May 6, 1949 the EDSAC in Cambridge ran its first program, and due to this event, it is considered "the first complete and fully operational regular electronic digital stored-program computer"
fit together?
Who made then the first real stored-program computer?
Let's shift the debate a tad. Is the fact that the iPad is predominantly a consumption vs. creation device inherent in the tablet design or just a quirk of the iPad's implementation?
Personally, I think it's not an inherent limitation. I believe that tablets are fundamentally more usable for the vast majority of folks and I believe that they can and will be used for creating. More so, I think that putting computers into more people's hands will be a benefit to the human race, even merely in terms of creativity, that is almost impossible to estimate today.
What the iPad will probably end up doing is modifying the way in which creativity is expressed. I would imagine that we will end up having another level of abstraction, and creation will be done with tools like Pipes [1] or IFTTT [2].
Well Apple deliberately forbids runtime environments on the iPad. Or rather, you could have one, and a programming editor, but you would not be allowed to share your code with others or run code created by others.
I suppose you could circumvent it by coding on some web site or server. But nevertheless, I'd say that counts as "inherent in the tablet design". Of course there are other creative things to do besides coding. You can do some things with the iPad.
Excuse my ignorance, but at which part of this otherwise nice story did the guy's ass get kicked? It was an unexpected encounter with an important figure in computing who has been quite unknown to people, and it must've been fun, but at no point did I get the impression that he got his ass kicked...
It was a metaphorical ass-kicking. You start to reevaluate how difficult your challenges are when you start talking to someone who invented the first internally programmable computer.
Actually, the common slang you wanted was "a kick in the ass" or less vulgarly "a kick in the seat of the pants", as in this book's title, http://www.amazon.com/dp/0060155280 . An "ass kicking" is slang for a beating.
I had a semi-similar experience. I sat next to the guy who invented the shower radio on a flight from SFO to EWR. He told me about his start as an entrepreneur: getting himself $500k in debt and, at the last moment, selling the design to Salton.
“Nothing is withheld from us what we have conceived to do.” is a beautiful idea, but I think Theodore Herzl said it simpler: "If you will it, it is no dream."
Or paraphrased slightly better by Walter Sobchak, "If you will it, Dude, it is no dream."
I fail to see how this was an "ass kicking". Mr. Runyon met someone who's a pioneer in his field and had an enjoyable time listening to him rant about the iWhatever devices. Did Runyon also "get his shit pushed in" by Mr. Kirsch?
The main reason I titled it as such was to show that while I'm working on doing hard things, Russell challenged me to really step my game up (aka kicking my ass). I meant it in the most complimentary way possible. You start to reframe the way you look at challenges when you meet someone who's done something like build the world's first internally programmable computer.
"A kick in the ass" is the phrase for what you describe, because it propels the recipient forward. An asskicking, on the other hand, implies that the recipient will be out of commission for a while.
I use exactly the same kind of language myself in exactly the same way, so I'm not having a go in the slightest. It was just thinking about the way we use fighting as a metaphor for this kind of meeting that led me to write my comment.
And your post was great, it was genuinely inspiring. Thank you very much for posting it, I would have loved to have been in your shoes on that day.
I'm not sure that it is even a poor headline as such, as it does its job descriptively with a commonly understood label; as in, I understood when I saw it that someone was going to find themselves completely outclassed.
It's more of a cultural criticism really, that we often allow ourselves to see so many of these kinds of interactions as some sort of constant social competition, when there are so many other interesting ways of approaching stuff.
If you conceive to build a perpetual motion machine then you will have taken the first step. The second (more difficult) step will be to conceive how a perpetual motion machine might actually work, or to come up with a plausible way of re-writing the laws of thermodynamics.
The point of the quote is not some wishy-washy "you can if you believe you can". Rather, it might be phrased better as "If you have the vision to see how something might be achieved and the drive to actually achieve it, nothing can stop you from achieving it".
It's not about "just think and you can do it." Vision is key. If you can't conceptualize how something could possible be created, then you're not going to be able to do it, but being able to first conceive the idea takes you to the next step of actually attempting to create it.
Lots of people conceive of ways in which a perpetual motion machine might work - so many people have tried to patent them that the US Patent Office has a specific ban on perpetual motion patents. They don't work, of course, but it tends to require a deeper level of knowledge to figure out exactly why (for instance) any of the magnetic perpetual motion machines won't work than it does to conceive of them in the first place.
I think this kind of snarky, ridiculous pedantry is exactly one of the things that Russell Kirsch doesn't want people doing, even if you only meant it in jest.
When I read it I think this person might be in pain. This particular message, like "you can do anything you set your mind to" which was recently pilloried on HN [1] combines that thought with a belief in God which also gets the "you must be stupid if you believe ..." treatment.
I've observed it can be the transient signal of some deeper disappointment. Those things eat at people, and in a success focused society it can be hard if not impossible for people who have decided to measure themselves against other people's success to come to grips with that. If emotional damage had a unit of measure it would no doubt be the snark.
Many times I have felt the need to write something snarky in response to an article on HN, but what it really comes back to is my own pride or sense of inadequacy.
I need to learn to be positive and less dismissive.
Then clearly, you have not conceived to do either of those things.
Or do you often just accept the concepts of others as true and therefore beyond refutation?
(Really though, the flaw in your comment is that it's an impossible statement. It's like asking "If God is all-powerful, can he make a rock that even he cannot lift?"
Total nonsense.)
Or even 4 dimensional quantum time crystals (which surprisingly, doesn't appear to be completely mental) - http://arxiv.org/abs/1202.2539
Course, you can't take any power off them and you have to put them somewhere very very cold, but they will keep going, apparently.
As for the second, we know there are issues with the current physics and I think that might just be what stuff like CERN is for. Or to prove it really really well.