As I understand it, VPRI wrapped up because they ran out of money. I wonder what caused the move away from CDG?
Regardless, I hope he succeeds before we run out of four-letter acronyms funded by three-letter acronyms.
(Edit: CPRG -> CDG, my bad)
Although it is from 2012, it was not publicly available until now. A great way to catch up on how FONC ended. Many of the individuals now cited as part of HARC worked on it.
"This brings forth fanciful and motivational questions about domain areas, such as (...) "How many t-shirts will be required to define TCP/IP" (about 3), or all the graphical operations needed for personal computing?" (about 10)."
[ed: Actually thought the pdf deserved a submission of its own; let's see if other people think so too:]
On the other hand, I am a little concerned to see Infosys joining the fray here. Infosys, to my understanding, is basically the face of H-1B abuse. So I'm happy to see these people funded, but it's a little harder for me to cheer when the funding comes from such a questionable source...
They probably filed 60K H-1Bs if they ended up getting 30K in the lottery.
I have no issue with Infosys's offshoring, only their unfair gobbling up of H-1B visas.
As a naive college student, I believed I could contribute to these causes too. After I joined TCS (since I did not study well and couldn't get other jobs), I realized that they had built up an amazing PR team to attract employees and customers. On the other hand, I bet most Infosys programmers wouldn't have heard of Alan Kay. So Infosys needs to make up for the tech "deficit" by supporting research.
As far as YCR is concerned, I guess they appreciate Infosys' funding and won't let Infosys' unethical policies affect their research.
I understand it as a PR move for them, but it really devalues the concept to me.
> HARC’s mission is to ensure human wisdom exceeds human power, by inventing and freely sharing ideas and technology that allow all humans to see further and understand more deeply.
Isn't this what the internet is for? What's new?
> Our shared vision of technology combines an expansive long-term view with a strong moral sense. We look to the distant past as well as the far future. We reject the artificial boundaries created between the humanities, arts, and sciences. We don’t always agree on what is good or evil, right or wrong, but we use these words seriously and are driven by them.
This is so vague I have no idea how you can attach any meaning to it.
> We are focusing on areas where we believe the structures created today will have the most impact on the future, and that can most benefit from having dedicated resources outside the for-profit world. At the moment, these areas include programming languages, interfaces, education, and virtual reality.
So you're gathering a group of smart people together to do non-profit research in a few select fields with the goal of improving humanity?
http://worrydream.com/MediaForThinkingTheUnthinkable/ is probably a good starting point for what is yet to be done regarding seeing further and understanding more deeply.
> This is so vague I have no idea how you can attach any meaning to it.
Are you concerned that this isn't externally measurable? I don't think it's meant to be.
> So you're gathering a group of smart people together to do non-profit research in a few select fields with the goal of improving humanity?
Well, as a layperson it seems like you figured out exactly what all of this meant...
Take a step back and remember the internet was an invention too. How can we make a better internet, or radically reinvent communication in another form? What is a browser anyway? Are screens the ideal medium? Also, it's a bit silly to ask "what's new?" of a research center's vision statement.
Think about VR. Let's say they research how to make it super addicting to play in your own little fantasy. This ends up leading to "VR abuse" and kids stop talking to one another, etc... It is very important for them to understand how potential inventions will affect humanity if their goal is to improve humanity.
Yes, you summarized his aims. Those select fields are seen as having the most leverage for human improvement.
Since you admit to having no idea what any of it means, I would suggest you try to look beyond the words and not be so eager to extract tangible simple facts. This is a vision statement and it's abstract because forward-looking mental vision is abstract in nature.
Is there anything more to this than peacocking to show how visionary the YC folks are?
hmm, possibly. here i was thinking it was for porn, cat videos, science paper paywalls and ransomware distribution.
The programmer and the program have never been further from each other than they are today. The development environment has never been further from the production environment than it is today. And end-user interfaces have never been as abstracted away from computing as they are today.
If any group can breathe some life into this stagnant area then my bet would be on this group.
They released a video last year that showcased some iterations of their prototypes and I found them to be very innovative and promising: https://www.youtube.com/watch?v=VZQoAKJPbh8
This industry has become a branch of the finance industry. So instead of programming being driven by math (e.g. category theory, abelian groups), it's driven by the market (e.g. time-to-market, what works now).
HARC is a push back the other way, giving space to people who are thinking about it in the right way, and I applaud that. But we're still missing the 'inclusive' piece of it; until we can talk to each other and listen to each other, we'll remain frustrated.
Yet at the same time there hasn't ever been so much working software produced, or so many working programmers, so something is working. The barrier to entry to writing code has been lowered. That's a good thing. We have more people trying out their ideas now because it's easy to get something working. You can't reasonably ignore that with a hand-wavy "it could be so much better" lament.
That isn't to suggest for a second that things couldn't be even better. I just think it's important to understand that getting lots of people involved in development, even if what they produce is awful, has benefits for society. That might be more important than good code.
One wonders what could be if programming were easier and considered as basic a skill as writing an essay.
They really are. I've been writing code for 30 years (professionally for 20) and while the logic hasn't really changed a great deal, all of the things around coding absolutely have. The work necessary to get something to display on a screen, the speed of both compiling code and running an application, the sheer quantity and quality of example code to learn from, the tools available to do things like source control and testing, the libraries that do most of the work so you can concentrate on the bit you're actually interested in, the quality of debugging tools and error messages that a language will give you ... all of these things add up to an environment where writing code is just so much simpler than it ever has been before. All that means that beginners can concentrate on learning rather than getting frustrated by incomprehensible messages that they need to look up in a manual and decipher because a lot of technical writing was pretty awful back then.
It's impossible to overemphasise how important it is when you're learning to be able to concentrate on what you're trying to figure out without needing to worry about other stuff that might distract you. That is what's made it easier to get into development.
Looking at it through this lens, I don't think the barriers are lower. Most kids' first interaction with computers now is through smartphones and tablets. And the separation between using these things and actually programming them couldn't be greater.
"My name is Marian Goldeen. I'm an eight grade student at Jordan Junior High School in Palo Alto, California, and I would like to tell you about how I got started working with computers at Xerox, and the class I taught.
It all started in December, 1973, when I was in the seventh grade. (...)"
Now, after reading that, are you proud of how far we've come in 40 years? I think we're doing rather poorly, and I believe the VPRI folks (and now this new project) are part of how we might do better.
I just wish these people would/could be more generous with throwing code out there for people to run and play with. I'm not sure if it's a culture thing, a funding thing, or a bit of both -- I understand that presenting a coherent whole can be more powerful than small ideas, but whenever I read the VPRI articles, I always feel there's too little code to play with. I want more! :-)
By the way, there's lots of interesting stuff in that archive[a], like:
"Is Breaking Into A Timesharing System A Crime?"
And in fairness, we have seen some improvements:
That's the whole point of the STEPS project: Less code to play with ;)
But that maybe makes me only about 10% faster. As for learners, I'm sure it is useful, but they aren't that far off from when I started (when docs were mostly first-party vs. crowd-sourced), and that is depressing to me. Every new step we take is very incremental.
10 PRINT "HELLO, WORLD"
In more recent environments you have a text editor and IDE or shell, which likely introduce more steps (even the concept of saving your code in a file!). I've also seen people worry about how much harder it can be to make a program display graphics compared to the BASIC era (which generally had built-in language primitives to go to graphics mode and start drawing stuff), whereas contemporary languages require at least importing a library, and maybe installing one.
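For contrast, here's a minimal sketch of roughly the smallest modern path to drawing on screen, using Python's standard-library turtle module (just one plausible counterpart to BASIC's built-in graphics, picked for illustration):

    # Still "batteries included", but you need an import, a file, and a run
    # step before anything appears -- more ceremony than BASIC's one-liner.
    import turtle

    turtle.forward(100)  # draw a 100-pixel line
    turtle.left(90)      # turn the pen
    turtle.forward(100)
    turtle.done()        # keep the window open until closed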
But I would say that the amount of code needed to put something on the screen, the speed of compiling and running (non-CPU-bound) code, and the quality of debugging tools are all approximately catching up now to the state of the art of 1980: a Smalltalk environment created by Alan Kay, Dan Ingalls and friends at Xerox PARC (I've heard LISP workstations were similarly advanced). Judging by their work at VPRI, they still had a few things to invent, and industry still has things to learn.
So while I agree barriers have decreased a lot, it is important to remember that the gap between the state of the art and what is common is also commonly huge in this area.
Arguably, _introducing_ people to programming through C++ and R (complex languages with difficult tooling) adds about 30 years worth of barriers right back, and is still common practice in some places.
The core problem there is two-fold. One, humans are bad at breaking things down explicitly into steps, and we make errors of description and errors of omission, which we call bugs. The program arrives at a goal; it is just the wrong one. Two, goals grow more ambitious with time. At one point making a basic spreadsheet was a major endeavour; now it is a practical exercise in a programming course. Because the goal is more complex / further away, the number of steps required grows. The current focus of tooling is on ways to make it easier to describe the steps: by giving you starting points closer to the goal (frameworks and blueprints), by making the steps bigger (languages, libraries and abstractions), or by agreeing on patterns of steps, etc... I think this line of thinking, while important, is inherently limited in its ability to solve the problem.
Fundamentally, the trick to making programming easy is to turn it into something goal-oriented instead of step-oriented. You describe the desired results and the computer works out all steps regardless of the state space, and adapts as the state space changes (new data, changing circumstances).
Basically this is what unit tests do, minus the part of programming the code behind the test. But imagine one person writes only tests, and one writes only code. Replace the second person by an algorithm, and tada, goal-oriented programming. That part looks solvable to me. But ... now the problem becomes one of describing the goal, and we're merely shifting piles around in the problem space. Someone who cracks the problem of easily and accurately describing a goal would solve the harder half of the problem.
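As a toy illustration of "replace the second person by an algorithm" (the test cases and candidate set here are invented for the sketch), this is goal-oriented programming at its crudest in Python: the tests describe the goal, and a brute-force search plays the role of the programmer:

    # The "goal": test cases any acceptable program must satisfy.
    import operator

    tests = [((2, 3), 5), ((0, 7), 7), ((4, 4), 8)]

    # The "programmer": try candidate functions until one passes every test.
    candidates = [operator.mul, operator.sub, max, min, operator.add]
    for f in candidates:
        if all(f(*args) == expected for args, expected in tests):
            print("found a program:", f.__name__)  # operator.add wins
            break

Real program synthesis searches vastly larger spaces, but the division of labour is the same: the goal description does the heavy lifting.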
Anyway, just my two cents.
It's clear that these paradigms can be easier for solving certain kinds of problems. For example, they're very good for solving puzzles like Sudoku; instead of writing a full-fledged Sudoku solver, you can, in some of these environments, basically just describe what would constitute a solution to your problem, and then the programming environment can find one for you.
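To make that concrete, here is a hedged sketch using the Z3 solver's Python bindings (pip package z3-solver; the 4x4 board and the single clue are just for brevity). We only state what a valid solution looks like; finding the steps is entirely the solver's problem:

    from z3 import Int, Solver, Distinct, sat

    # Declare a 4x4 grid of integer variables.
    cells = [[Int("c_%d_%d" % (r, c)) for c in range(4)] for r in range(4)]
    s = Solver()
    for row in cells:
        s.add([1 <= x for x in row] + [x <= 4 for x in row])  # value range
        s.add(Distinct(row))                                  # rows differ
    for c in range(4):
        s.add(Distinct([cells[r][c] for r in range(4)]))      # columns differ
    for br in (0, 2):
        for bc in (0, 2):                                     # 2x2 boxes differ
            s.add(Distinct([cells[br + dr][bc + dc]
                            for dr in (0, 1) for dc in (0, 1)]))
    s.add(cells[0][0] == 1)  # a given clue

    if s.check() == sat:
        m = s.model()
        for row in cells:
            print([m[x].as_long() for x in row])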
But you're right; for my project we scrapped it all. What's left is essentially a thin VM layer over hardware. It was the only way.
The next step is writing software that can write software. Maybe automated automation can't write a self-driving algorithm, but automation should be able to write the code to send email to a list of addresses.
We shouldn't automate dumb things. We should automate the automation of dumb things.
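For scale, here's the kind of rote code being talked about, sketched with Python's standard library (the host, addresses, and credentials below are placeholders, not real services); the argument is that a machine, not a person, should be producing this:

    import smtplib
    from email.message import EmailMessage

    recipients = ["a@example.com", "b@example.com"]  # hypothetical list

    with smtplib.SMTP("smtp.example.com", 587) as server:
        server.starttls()
        server.login("user", "password")  # placeholder credentials
        for addr in recipients:
            msg = EmailMessage()
            msg["From"] = "me@example.com"
            msg["To"] = addr
            msg["Subject"] = "Hello"
            msg.set_content("An automated hello.")
            server.send_message(msg)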
It does sound kind of vague. I hope it's not so open ended that it just means kind of noodling around with interesting stuff. But hey, Alan Kay's noodling would still be good, I bet. Interested to see what comes out of it!
GUIs... I think that was not him, or certainly not just him.
That AK popularized this stuff - sure.
Although there were graphical displays in the 1950s (some of them being simply direct displays of the contents of Williams tubes), GUIs as such only go back to Sketchpad (1961-65, roughly) which sort of had windows, and there were windows and a mouse in NLS by 1968, but without a GUI. Simula (1967) wasn't OO; Alan coined that phrase to describe Smalltalk (1971 at PARC, established 1970, which you'll note is less than a decade after 1967, though begun in 1969), which drew inspiration from Sketchpad, Simula, the B5000, NLS, and some weird magnetic tape format developed by the Air Force where the tape started with a program that interpreted the rest of the tape, among other places.
Alan and Ed Cheadle actually did a lot of the development of the ideas that later became OO in the 1966-70 timeframe on a personal computer they were working on called FLEX.
You can read Alan's account of this history in "The Early History of Smalltalk", one copy of which is at http://gagne.homedns.org/~tgagne/contrib/EarlyHistoryST.html.
Alan and Dan and Adele and other people on the Smalltalk team deserve a lot of credit both for developing the ideas of object-orientation and for popularizing them; and essentially the modern desktop GUI (mouse, overlapping windows, icons, WYSIWYG, multiple fonts, proportional fonts, buttons, menus, drag-and-drop) was developed at PARC in the 1970s, an effort that included the Smalltalk team but also a lot of other people working in other languages.
Of course, much as I detest Steve Jobs, the really powerful popularizing influences on this stuff were Macintosh and NeXT. (Later, Java.)
Are they being spun out into HARC, or are they just collaborating from inside SAP's blue-sky research group?
It would be helpful to clarify what their institutional standings are, if we're to understand the full import of this announcement.
Teach about cognitive bias. Teach it at a fundamental level -- like, starting in 1st grade, with annual refreshers thereafter. Give cognitive bias an equal place at the table alongside reading, writing, and arithmetic.
Many cognitive biases are deeply rooted in biological and social structures -- but with awareness and training, they can be overcome. Without awareness and training, they can be profoundly destructive, and certainly limit the scope of human advancement.
So before we start pimping out our metacortexes or whatever, let's see if we can't overcome some of the less salutary whisperings of our old and honestly pretty useless reptilian brains. We can be better than that.
it could be challenging to avoid all or even most of those. i wonder if i just fell prey to one or more.
"The best answers are the most matter of fact. It’s a mistake to use marketing-speak to make your idea sound more exciting. We’re immune to marketing-speak; to us it’s just noise. 1. So don’t begin your answer with something like
We are going to transform the relationship between individuals and information.
That sounds impressive, but it conveys nothing. It could be a description of any technology company. Are you going to build a search engine? Database software? A router? I have no idea.
One test of whether you’re explaining your idea effectively is to ask how close the reader is to reproducing it. After reading that sentence I’m no closer than I was before, so its content is effectively zero. Another mistake is to begin with a sweeping introductory paragraph about how important the problem is:
Information is the lifeblood of the modern organization. The ability to channel information quickly and efficiently to those who need it is critical to a company’s success. A company that achieves an edge in the efficient use of information will, all other things being equal, have a significant edge over competitors.
Again, zero content; after reading this, the reader is no closer to reproducing your project than before. A good answer would be something like:
A database with a wiki-like interface, combined with a graphical UI for controlling who can see and edit what.
I’m not convinced yet that this will be the next Google, but at least I’m starting to engage with it. I’m thinking what such a thing would be like.
One reason founders resist giving matter-of-fact descriptions is that they seem to constrain your potential. “But it’s so much more than a database with a wiki UI!” The problem is, the less constraining your description, the less you’re saying. So it’s better to err on the side of matter-of-factness." https://www.ycombinator.com/howtoapply/
You wrote this up within 5 minutes of the announcement. Perhaps you're judging it a bit too early? YCR is about "work that requires a very long time horizon, seeks to answer very open-ended questions" - it's research and shouldn't be judged as a startup/technology company, which seems to be what you're doing.
If you want more concrete details about what these particular people have done in the past, look at the code and papers they've published as CDG and VPRI. (And PARC, in Alan's case.)
My only take-away about HARC is they hired 20 "researchers" to maybe make some mockups of ui designs they deem "better" ... "for humanity"?
In that broadest context, it really seems to lose its meaning. I think a closer word for that description is "ideas." Why try to cram the word "technology" into it, when it's such a stretch?
So why not keep busy while they wait for the next wave to form? It's a great time to grow other branches.
What are you referencing here? I must have missed this story.
They're 20 people already. There is no way they will agree on a coherent direction, unless it's a weak/safe one. I guess we'll see.
I am waiting for the days when some big names wake up and say "RMS has been right all along, we need to copyleft everything asap!"
I'm getting tired of hearing about $NEXTGREATTECH and then finding out it's a SaaS or proprietary etc.
Here's to hoping this group does something productive and free, as in beer and speech.
How will this relate to VPRI?
And are these researchers joining full-time and collaborating together directly, or are they just "part of the project"?
Text and speech are inherently limited by their linear and one-dimensional nature. Graphics are much more powerful, and leverage high-bandwidth senses. Knowledge will be consumed by navigating knowledge hypergraphs and causational trees (how-why axis).
In English, you write everything from scratch. You start with a blank page, then a word, then a sentence. With this new language, you must start with something that exists, some kind of node/edge of the graph. Communication is mostly done by manipulating the knowledge graph. This means that you can't say something that has already been said before. You don't need to provide or explain context. You can instantly see the impact of your thoughts. You can't say incoherent or fallacious things. In most cases, you don't have to say a single thing, as you can find it has been said before. Most decent programming environments provide libraries/modules, auto-completion, and compilation/execution. This new platform brings all these things to communication.
We're talking about a universal language here. A homoiconic language, where the distinction between code, data and UI disappears.
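A toy sketch of what "communication as graph manipulation" might look like (everything here -- the class, the triple format, the dedup rule -- is invented to illustrate the idea, not a real system):

    # Statements are triples; "saying" something is an edit to a shared
    # graph, and restating an existing fact is a no-op, as described above.
    class KnowledgeGraph:
        def __init__(self):
            self.triples = set()  # (subject, relation, object)

        def say(self, subject, relation, obj):
            triple = (subject, relation, obj)
            if triple in self.triples:
                return "already said"  # can't repeat what exists
            self.triples.add(triple)
            return "added"

    g = KnowledgeGraph()
    print(g.say("Smalltalk", "influenced_by", "Sketchpad"))  # added
    print(g.say("Smalltalk", "influenced_by", "Sketchpad"))  # already said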
This is nothing new. People have been talking about this for decades. The most important challenge here is not technical, but social. It is becoming clear that the application paradigm is not sustainable and cannot scale. Personal assistants (Siri, Viv, Cortana, Google Now) and Messaging/Bots (Magic, WeChat, Slack, Microsoft Bots Framework, Facebook Bots Platform) are clear symptoms. Although I've been trying to tell people about this future for more than a decade now, I don't have the resources or track record to be heard. I am hopeful that HARC will change that.
In any case, this looks like a necessary and timely line of inquiry, and I am looking forward to the fruits of these endeavours. Good luck!
In good company!
Free software?
Ahhh nooo, my mistake, YCombinator is funding the project...
I know it was a typo, but HN does a pretty good job at being an echo system with technology, at times :-)
But since they are YC funded go ahead and downvote me dead for heresy.
Just the result I expected from you asskissers.
Please read the site guidelines and follow them when posting here:
The downvotes are the inevitable result of shouting that you'll get downvoted for your opinion.