HARC (ycombinator.com)
415 points by sama on May 11, 2016 | 101 comments

So, if I understand this correctly, it's the latest in a series of attempts by Alan Kay to find the right home for his vision of a new Xerox PARC. That began with VPRI (NSF funded), then CDG (SAP funded), and now HARC (YCR funded).

As I understand it, VPRI wrapped up because they ran out of money. I wonder what caused the move away from CDG?

Regardless, I hope he succeeds before we run out of four-letter acronyms funded by three-letter acronyms.

(Edit: CPRG -> CDG, my bad)

I don't think they ran out of money; from the final report it sounds like they just finished the research project. VPRI is just moving to YC now.

My understanding is that VP who was championing CDG at SAP left and they didn't have the support they wanted or needed to do research.

Last week VPRI[1] published the final "STEPS Toward the Reinvention of Programming" paper[2].

Although it's from 2012, it had been publicly unavailable until now. A great way to catch up on how FONC ended. Many of the individuals cited as part of HARC worked on it.

[1] http://vpri.org/html/writings.php [2] http://www.vpri.org/pdf/tr2012001_steps.pdf

Thank you for pointing this out. I hadn't read the last one, and was actually wondering if they'd just shut down, without concluding in some form or other. Love this stuff:

"This brings forth fanciful and motivational questions about domain areas, such as (...) "How many t-shirts will be required to define TCP/IP" (about 3), or all the graphical operations needed for personal computing?" (about 10)."

[ed: Actually, I thought the pdf deserved a submission of its own; let's see if other people think so too:



On the one hand, I'm happy to see visionaries like Vi Hart and Bret Victor (and presumably the others, whose work I don't know but can only imagine to be quite good) supported to do the work that they do.

On the other hand, I am a little concerned to see InfoSys joining the fray here. InfoSys, to my understanding, are basically the face of H-1B abuse. So I'm happy to see these people funded, but it's a little harder for me to cheer when the funding comes from such a questionable source...

Would've expected YC to know this. Infosys hogs half the highly skilled worker H-1B quota every year at an average salary of $80K, in effect denying many startups and founders the ability to hire. http://www.myvisajobs.com/Visa-Sponsor/Infosys/1088782.htm

They probably file 60K H1bs if they ended up getting 30K in the lottery.

Aren't they also pushing to increase the quota? And isn't increasing the quota a good thing?

For one company to take up 50% of the really scarce H-1B quota (65,000) is just not fair. To do it just to get slightly lower-cost workers is inexcusable.

I have no issue with Infosys's offshoring, only their unfair gobbling up of the H-1B visas.

Everyone from the Army, to the Govt, to Apple, to your medical insurer, to your bank has been using those workers. Maybe it would make you feel better if they all worked for 100 different Indian firms, but this is just more efficient. The bodyshops face stiff competition from each other to consolidate. The fact that Infosys has survived 20 years says something about the company.

/s You need to be the worst offender to survive

Infosys is funding the project for PR reasons. They're pretty active in the CSR space too. In early 2010, there was a show in India on CNBC TV18 that showcased CSR activities by these companies.

As a naive college student, I believed I could contribute to these causes too. After I joined TCS (since I did not study well and couldn't get other jobs), I realized that they had built up an amazing PR team to attract employees and customers. On the other hand, I bet most Infosys programmers haven't heard of Alan Kay. So Infosys needs to make up for the tech "deficit" by supporting research.

As far as YCR is concerned, I guess they appreciate Infosys' funding and won't let Infosys' unethical policies affect their research.

Infosys wants to be in on this project because it's simply a good idea. Vishal is also very good friends with Alan.

This was my first thought on seeing Infosys too... In my experience, I've found very little reason to look up to them.

I understand it as a PR move for them, but it really devalues the concept to me.

As a layperson, I have almost no idea what any of this means.

> HARC’s mission is to ensure human wisdom exceeds human power, by inventing and freely sharing ideas and technology that allow all humans to see further and understand more deeply.

Isn't this what the internet is for? What's new?

> Our shared vision of technology combines an expansive long-term view with a strong moral sense. We look to the distant past as well as the far future. We reject the artificial boundaries created between the humanities, arts, and sciences. We don’t always agree on what is good or evil, right or wrong, but we use these words seriously and are driven by them.

This is so vague I have no idea how you can attach any meaning to it.

> We are focusing on areas where we believe the structures created today will have the most impact on the future, and that can most benefit from having dedicated resources outside the for-profit world. At the moment, these areas include programming languages, interfaces, education, and virtual reality.

So you're gathering a group of smart people together to do non-profit research in a few select fields with the goal of improving humanity?

> What's new?

http://worrydream.com/MediaForThinkingTheUnthinkable/ is probably a good starting point for what is yet to be done regarding seeing further and understanding more deeply.

> This is so vague I have no idea how you can attach any meaning to it.

Are you concerned that this isn't externally measurable? I don't think it's meant to be.

> So you're gathering a group of smart people together to do non-profit research in a few select fields with the goal of improving humanity?

Well, as a layperson it seems like you figured out exactly what all of this meant...

> Isn't this what the internet is for? What's new?

Take a step back and remember the internet was an invention too. How can we make a better internet or radicalize communication in another form? What is a browser anyway? Are screens the ideal medium? Also, it's a bit silly to ask "what's new?" out of a research center's vision statement.

> This is so vague I have no idea how you can attach any meaning to it.

Think about VR. Let's say they research how to make it super addicting to play in your own little fantasy. This ends up leading to "VR abuse" and kids stop talking to one another, etc... It is very important for them to understand how potential inventions will affect humanity if their goal is to improve humanity.

> So you're gathering a group of smart people together to do non-profit research in a few select fields with the goal of improving humanity?

Yes, you've summarized their aims. Those select fields are seen as having the most impact on human improvement.

Since you admit to having no idea what any of it means, I would suggest you try to look beyond the words and not be so eager to extract tangible simple facts. This is a vision statement and it's abstract because forward-looking mental vision is abstract in nature.

Spend some time reading the output of VPRI (writings at vpri.org) for some indication of their directions. Also view some of the videos on Yoshiki Ohshima's YouTube channel for some of the philosophy (he's one of the PIs [Principal Investigators, i.e. head researchers] at this new organization).


I too am a little puzzled.

Is there anything more to this than peacocking to show how visionary the YC folks are?

Sam Altman is becoming best friends with Elon Musk to save humanity from whatever x, y, z. :)

> Isn't this what the internet is for? What's new?

hmm, possibly. here i was thinking it was for porn, cat videos, science paper paywalls and ransomware distribution.

The point is to make fewer laypeople.

It's because the ways to put computing talents into good use are not limited to, you know, "startups".

I'm excited to see what comes out of this. The area I'm hoping they'll look at is programming. There haven't been any big ideas in programming in a really long time. Languages are rehashes of the same features with slightly different configurations and incremental improvements in performance or tooling. Programming interfaces are stagnant too; Smalltalk's interactive development environment is still the sci-fi version of what most people think programming could be.

The programmer and the program have never been further away from each other than they are today. The development environment has never been further from the production environment than it is today. And end-user interfaces have never been as abstracted away from computing as they are today.

If any group can breathe some life into this stagnant area then my bet would be on this group.

Eve [1] is one project I know of that has been trying to tackle this very difficult problem. The project is led by Chris Granger, whom you might know from his work on the Light Table IDE [2].

They released a video last year that showcased some iterations of their prototypes and I found them to be very innovative and promising: https://www.youtube.com/watch?v=VZQoAKJPbh8

[1] http://www.witheve.com/

[2] http://lighttable.com/

Kayia (http://kayia.org, my project) is also approaching programming from a very different angle. I would like to say that, since a large percentage of us know that programming is broken, we are supportive, inclusive and embrace new ideas. We don't.

This industry has become a branch of the finance industry. So instead of programming being driven by math (e.g. category theory, abelian groups), it's driven by the market (e.g. time-to-market, what works now).

HARC is a push back the other way, giving space to people who are thinking about it in the right way, and I applaud that. But we're still missing the 'inclusive' piece of it; until we can talk to each other and listen to each other, we'll remain frustrated.

The programmer and the program have never been further away from each other than they are today.

Yet at the same time there hasn't ever been so much working software produced, or so many working programmers, so something is working. The barrier to entry to writing code has been lowered. That's a good thing. We have more people trying out their ideas now because it's easy to get something working. You can't reasonably ignore that with a hand-wavy "it could be so much better" lament.

That isn't to suggest for a second that things couldn't be even better. I just think it's important to understand that getting lots of people involved in development, even if what they produce is awful, has benefits for society. That might be more important than good code.

We have easier access to computers (not true when I was growing up) and slightly better tools, but otherwise the barriers aren't much lower. We have more people attracted to the industry (CS is now the most popular major at many schools), but programming is still hard even if more people are doing it.

One wonders what could be if programming were easier and considered as basic a skill as writing an essay.

the barriers aren't much lower

They really are. I've been writing code for 30 years (professionally for 20) and while the logic hasn't really changed a great deal, all of the things around coding absolutely have. The work necessary to get something to display on a screen, the speed of both compiling code and running an application, the sheer quantity and quality of example code to learn from, the tools available to do things like source control and testing, the libraries that do most of the work so you can concentrate on the bit you're actually interested in, the quality of debugging tools and error messages that a language will give you ... all of these things add up to an environment where writing code is just so much simpler than it ever has been before. All that means that beginners can concentrate on learning rather than getting frustrated by incomprehensible messages that they need to look up in a manual and decipher because a lot of technical writing was pretty awful back then.

It's impossible to overemphasise how important it is when you're learning to be able to concentrate on what you're trying to figure out without needing to worry about other stuff that might distract you. That is what's made it easier to get in to development.

I think it comes down to a question of measurement. Of course you're going to find a lot more programmers now, because computers are ubiquitous and CS is a hot major. But one way to think about it is the ratio of users who can program their computers to pure end-users; I think it's orders of magnitude lower now. You could argue that because we have all these fancy apps we don't need to program our computers, but I believe people don't get a fraction of the computing power they can or need from their devices. This is evident in the recent trend of bots and "workflow" applications (e.g. Microsoft Flow, IFTTT). People want a lot more from their machines. And apps are a single-purpose, crude way of doing computing. The way you get the most out of your computer is to program it.

Looking at it through this lens, I don't think the barriers are lower. Most kids' first interaction with computers now is through smartphones and tablets. And the separation between using these things and actually programming them couldn't be greater.

When thinking about whether we've improved or not, I like to point out stuff like this, from 1976:

"My name is Marian Goldeen. I'm an eight grade student at Jordan Junior High School in Palo Alto, California, and I would like to tell you about how I got started working with computers at Xerox, and the class I taught.

It all started in December, 1973, when I was in the seventh grade. (...)"

(My emphasis) http://www.atariarchives.org/bcc1/showpage.php?page=61

Now, after reading that, are you proud at how far we've come in 40 years? I think we're doing rather poorly, and I believe the VPRI folks (and now this new project) is part of how we might do better.

I just wish these people would/could be more generous with throwing code out there for people to run and play with. I'm not sure if it's a culture thing, a funding thing, or a bit of both -- I understand that a presenting a coherent whole can be more powerful than small ideas, but whenever I read the VPRI articles, I always feel there's too little code to play with. I want more! :-)

By the way, there's lots of interesting stuff in that archive[a], like:

"Is Breaking Into A Timesharing System A Crime?" http://www.atariarchives.org/bcc1/showpage.php?page=4

And in fairness, we have seen some improvements:


[a] http://www.atarimagazines.com/

> ... but whenever I read the VPRI articles, I always feel there's too little code to play with.

That's the whole point of the STEPS project: Less code to play with ;)

Your last point, that kids are being introduced to apps through devices that don't have easily accessible dev tools, is something I hadn't considered. That could certainly push the bar back up, especially if they don't have a PC at home or at school as well. That's something to think about.

I agree that the tooling is better. But if I compare the mid to late 90s to now, it feels like we haven't moved very far. Sure, computers are faster, compilers are a bit more polished, we have access to a CodeView-style debugger more often than not, code completion is a bit more reliable, we can test more easily and use stackoverflow to provide context for new problems.

But that maybe makes me only about 10% faster. As for learners, I'm sure it is useful, but they aren't that far off from when I started (when docs were mostly first-party vs. crowd sourced), and that is depressing to me. Every new step we take is very incremental.

I think I pretty much agree about all of those changes except for "the work necessary to get something to display on a screen". This was much less in BASIC on machines like the Atari 400/800, Commodore 64, Apple ][, and IBM PC that could boot into a BASIC interpreter. Then you could turn on the machine, type something like

    PRINT "HELLO"

and see something on the screen.

In more recent environments you have a text editor and IDE or shell, which likely introduce more steps (even the concept of saving your code in a file!). I've also seen people worry about how much harder it can be to make a program display graphics compared to the BASIC era (which generally had built-in language primitives to go to graphics mode and start drawing stuff, which may at least require importing a library in contemporary languages and maybe installing a library).

Of course the coverage by libraries and the accessibility of varied example code have improved tremendously, as has version control (e.g. distributed).

But I would say that the amount of code needed to put something on the screen, the speed of compiling and running (non-CPU-bound) code, and the quality of debugging tools are all only now catching up to the state of the art of 1980: a Smalltalk environment created by Alan Kay, Dan Ingalls, and friends at Xerox PARC (I've heard Lisp workstations were similarly advanced). Judging by their work at VPRI, they still have a few things to invent, and industry still has a few things to learn.

So while I agree barriers have decreased a lot, it is important to remember that the gap between the state of the art and what is common is also frequently huge in this area.

Arguably, _introducing_ people to programming through C++ and R (complex languages with difficult tooling) adds about 30 years worth of barriers right back, and is still common practice in some places.

I think the fundamental issue that makes programming hard, and harder with time, is that we don't describe goals, we describe steps towards a goal. The computer only understands how to move through the state space, it does not understand where to go. We end up reverse engineering the goal / end result into the series of steps / state transitions needed to get there, and we call this activity programming.

The core problem there is two-fold. First, humans are bad at breaking things down explicitly into steps, and we make errors of description and errors of omission, which we call bugs. The program arrives at a goal; it is just the wrong one. Second, goals grow more ambitious with time. At one point making a basic spreadsheet was a major endeavour; now it is a practical exercise in a programming course. Because the goal is more complex / further away, the number of steps required grows. The current focus of tooling is on ways to make it easier to describe the steps: by giving you starting points closer to the goal (frameworks and blueprints), by making the steps bigger (languages, libraries, and abstractions), by agreeing on patterns of steps, and so on. I think this line of thinking, while important, is inherently limited in its ability to solve the problem.

Fundamentally, the trick to making programming easy is to turn it into something goal-oriented instead of step-oriented. You describe the desired results and the computer works out all steps regardless of the state space, and adapts as the state space changes (new data, changing circumstances).

Basically this is what unit tests do, minus the part of programming the code behind the test. But imagine one person writes only tests, and one writes only code. Replace the second person by an algorithm, and tada, goal-oriented programming. That part looks solvable to me. But ... now the problem becomes one of describing the goal, and we're merely shifting piles around in the problem space. Someone who cracks the problem of easily and accurately describing a goal would solve the harder half of the problem.
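To make the "replace the coder with an algorithm" half concrete, here is a toy sketch in Python (the expression grammar and helper names are invented for illustration): the "program" is just a set of input/output tests, and a brute-force search plays the role of the programmer.

```python
from itertools import product

# Tiny expression grammar over a single input variable x.
UNARY = ["x", "1", "2", "x * x", "x + x"]
BINARY = ["({a}) + ({b})", "({a}) * ({b})"]

def candidates():
    """Enumerate small candidate expressions, simplest first."""
    for e in UNARY:
        yield e
    for tmpl, (a, b) in product(BINARY, product(UNARY, repeat=2)):
        yield tmpl.format(a=a, b=b)

def synthesize(tests):
    """Return the first expression satisfying every (input, output) pair."""
    for expr in candidates():
        if all(eval(expr, {"x": i}) == o for i, o in tests):
            return expr
    return None

# The "goal description": we want f(2) == 5 and f(3) == 10.
print(synthesize([(2, 5), (3, 10)]))
```

Within its tiny grammar this finds something like `(1) + (x * x)`. All the "steps" were derived by search; what remains hard is exactly what the comment predicts, describing the goal precisely enough.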

Anyway, just my two cents.

There's already logic programming and more generally declarative programming, which sort of have the features you describe (the programmer describes what is wanted, but not necessarily how to find or do it).



It's clear that these paradigms can be easier for solving certain kinds of problems. For example, they're very good for solving puzzles like Sudoku; instead of writing a full-fledged Sudoku solver, you can, in some of these environments, basically just describe what would constitute a solution to your problem, and then the programming environment can find one for you.
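As a minimal sketch of that declarative style in Python (using brute-force search in place of a real constraint solver): for the cryptarithm TWO + TWO = FOUR, you state only what a solution looks like, and a generic search supplies the how.

```python
from itertools import permutations

def solve():
    """Find distinct digits for T,W,O,F,U,R such that TWO + TWO == FOUR."""
    # permutations() already guarantees the six digits are distinct.
    for t, w, o, f, u, r in permutations(range(10), 6):
        if t == 0 or f == 0:               # no leading zeros
            continue
        two = 100 * t + 10 * w + o
        four = 1000 * f + 100 * o + 10 * u + r
        if two + two == four:              # the entire "specification"
            return {"T": t, "W": w, "O": o, "F": f, "U": u, "R": r}
    return None

print(solve())
```

A logic or constraint language (Prolog, a SAT solver, etc.) makes the same move without the hand-written loop: the program is just the constraints.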

Nicely put! Many of us feel the same way: how we are approaching programming through PL today in much of the academic community is contributing to those gaps, not eliminating them. There is also a chance to evolve the topic of programming to one of human advancement (among other things), the old approaches aren't moving things forward so much anyways. See what Bret Victor did with programming topics in just a few short years, I see a lot of potential there!

There are a bunch of projects claiming to be next-gen though. The thing is, if you change the paradigm, you sometimes have to redo the whole toolchain (like VCS and so on).

See http://www.lamdu.org/

A bunch of projects, true, though much fewer than I would like to see.

But you're right; for my project we scrapped it all. What's left is essentially a thin VM layer over hardware. It was the only way.

Programmers are known as "people who do not like to repeat themselves" but they spend their days typing the same brackets and 5 words. (If, else, for/foreach, select/case, string...)

The next step is writing software that can write software. Maybe automation can't write a self-driving algorithm, but it should be able to write the code to send email to a list of addresses.

We shouldn't automate dumb things. We should automate the automation of dumb things.

Maybe having to repeat those brackets and 5 words is the sole reason why programmers don't like to repeat themselves? </showerThought>

If you always write the brackets when you write the "if", why not write a function that writes both for you?
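As one hedged illustration of that idea in Python (the field spec and function names here are hypothetical): instead of hand-writing the same `if`/bracket pattern for every field, generate it from a declarative description.

```python
# Hypothetical field spec: which keys a record must have, and their types.
RULES = {"name": str, "age": int}

def make_validator(rules):
    """Write the repetitive ifs and brackets for us, then compile them."""
    lines = ["def validate(record):"]
    for field, typ in rules.items():
        lines.append(f"    if not isinstance(record.get({field!r}), {typ.__name__}):")
        lines.append("        return False")
    lines.append("    return True")
    namespace = {}
    exec("\n".join(lines), namespace)   # turn the generated source into a real function
    return namespace["validate"]

validate = make_validator(RULES)
print(validate({"name": "Ada", "age": 36}))    # True
print(validate({"name": "Ada", "age": "36"}))  # False
```

Macros, code generators, and snippet expanders are the mature versions of this trick: the boilerplate gets written exactly once, inside the generator.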

Alan Kay? Ok, I'm impressed!

It does sound kind of vague. I hope it's not so open ended that it just means kind of noodling around with interesting stuff. But hey, Alan Kay's noodling would still be good, I bet. Interested to see what comes out of it!

Alan Kay's whole MO is noodling around with interesting stuff, and history shows it's pretty good when he does.

He's also produced a number of working systems though, right?

Yes. Object Oriented Programming, for a start. So there's that. GUI's, too.

OO is pretty conceptual without something to back it up. In this case, Smalltalk.

GUI's... I think that was not him, or certainly not just him.


Well, Simula predates PARC by about a decade, so I'm not so sure AK created OOP. As for windows and such, sniff around YouTube; that notion goes back to the late '50s and early '60s.

That AK popularized this stuff - sure.

Your history is pretty confused, although it has some elements of truth to it. Let me set you straight.

Although there were graphical displays in the 1950s (some of them being simply direct displays of the contents of Williams tubes), GUIs as such only go back to Sketchpad (1961-65, roughly) which sort of had windows, and there were windows and a mouse in NLS by 1968, but without a GUI. Simula (1967) wasn't OO; Alan coined that phrase to describe Smalltalk (1971 at PARC, established 1970, which you'll note is less than a decade after 1967, though begun in 1969), which drew inspiration from Sketchpad, Simula, the B5000, NLS, and some weird magnetic tape format developed by the Air Force where the tape started with a program that interpreted the rest of the tape, among other places.

Alan and Ed Cheadle actually did a lot of the development of the ideas that later became OO in the 1966-70 timeframe on a personal computer they were working on called FLEX.

You can read Alan's account of this history in "The Early History of Smalltalk", one copy of which is at http://gagne.homedns.org/~tgagne/contrib/EarlyHistoryST.html.

Alan and Dan and Adele and other people on the Smalltalk team deserve a lot of credit both for developing the ideas of object-orientation but also for popularizing them, and essentially the modern desktop GUI (mouse, overlapping windows, icons, WYSIWYG, multiple fonts, proportional fonts, buttons, menus, drag-and-drop) was developed at PARC in the 1970s, an effort that included the Smalltalk team but also included a lot of other people working in other languages.

Of course, much as I detest Steve Jobs, the really powerful popularizing influences on this stuff were Macintosh and NeXT. (Later, Java.)

This gets to something that's dogged AK for ages. He created the term "object-oriented," but the way everyone in the field thinks of it is not what he meant. He blames himself for that, using terminology that got people focused on objects. It's not about the objects. It's about messaging and interfaces, and the independence of objects. Objects are just the endpoints for messages. His notions of objects have a lot more in common with how the internet (TCP/IP) works than the way Simula, C++, or Java works. The real abstraction, indeed the real programming, is about interfaces and messages, and their relationships. A short summary of it is to think of objects as servers on the internet, and interfaces as analogous to the protocols used for communication between clients and servers. If you could imagine programming using the structure of the internet, with the same scaling and loosely-bound qualities that the internet possesses, that would be the ultimate in OOP. Simula doesn't get into this at all. It's a class-based language, and what the design really communicates is that it formalizes the notion of abstract data types. C++, Java, and C# do the same thing, in this respect.
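A rough sketch of that messaging view in Python (names like `send` and `receive` are invented here, not Kay's or Smalltalk's actual API): the only way to interact with an object is to send it a message, and the receiver alone decides how, or whether, to respond, much like a server behind a protocol.

```python
class Counter:
    """An 'object' in the messaging sense: private state plus a message handler."""
    def __init__(self):
        self._n = 0

    def receive(self, message, *args):
        if message == "increment":
            self._n += 1
        elif message == "value":
            return self._n
        else:
            return "does-not-understand"   # the receiver, not the caller, decides

def send(obj, message, *args):
    """All interaction goes through messages; callers never touch state directly."""
    return obj.receive(message, *args)

c = Counter()
send(c, "increment")
send(c, "increment")
print(send(c, "value"))    # 2
print(send(c, "reboot"))   # does-not-understand
```

Swap the local `receive` call for a network round trip and nothing in the caller changes, which is the internet-like, loosely-bound quality the comment describes.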

Most of these luminaries worked at SAP until recently.

Are they being spun out into HARC, or are they just collaborating from inside SAP's blue-sky research group?

It would be helpful to clarify what their institutional standings are, if we're to understand the full import of this announcement.

Spun out. They're full time employees of YCR starting a few days ago.

Including Alan, or is he still heading VPRI?

May I make a human advancement suggestion which has very little to do with technology?

Teach about cognitive bias. Teach it at a fundamental level, starting in 1st grade, with annual refreshers thereafter. Give cognitive bias an equal place at the table alongside reading, writing, and arithmetic.

Many cognitive biases are deeply rooted in biological and social structures -- but with awareness and training, they can be overcome. Without awareness and training, they can be profoundly destructive, and certainly limit the scope of human advancement.

So before we start pimping out our metacortexes or whatever, let's see if we can't overcome some of the less salutary whisperings of our old and honestly pretty useless reptilian brains. We can be better than that.

A method for overcoming cognitive bias is a technology, in the same way that spoken language is a technology. See formal logic. But if we were somehow, as a species, constrained to doing one thing at a time on the tech tree - then yes, we would do well to max that tech out asap.


it could be challenging to avoid all or even most of those. i wonder if i just fell prey to one or more.

This isn't a judgment of the project itself, but the announcement's wording reminds me of the marketing newspeak from "How to Apply to Y Combinator":

"The best answers are the most matter of fact. It’s a mistake to use marketing-speak to make your idea sound more exciting. We’re immune to marketing-speak; to us it’s just noise. 1. So don’t begin your answer with something like

We are going to transform the relationship between individuals and information.

That sounds impressive, but it conveys nothing. It could be a description of any technology company. Are you going to build a search engine? Database software? A router? I have no idea.

One test of whether you’re explaining your idea effectively is to ask how close the reader is to reproducing it. After reading that sentence I’m no closer than I was before, so its content is effectively zero. Another mistake is to begin with a sweeping introductory paragraph about how important the problem is:

Information is the lifeblood of the modern organization. The ability to channel information quickly and efficiently to those who need it is critical to a company’s success. A company that achieves an edge in the efficient use of information will, all other things being equal, have a significant edge over competitors.

Again, zero content; after reading this, the reader is no closer to reproducing your project than before. A good answer would be something like:

A database with a wiki-like interface, combined with a graphical UI for controlling who can see and edit what.

I’m not convinced yet that this will be the next Google, but at least I’m starting to engage with it. I’m thinking what such a thing would be like.

One reason founders resist giving matter-of-fact descriptions is that they seem to constrain your potential. “But it’s so much more than a database with a wiki UI!” The problem is, the less constraining your description, the less you’re saying. So it’s better to err on the side of matter-of-factness." https://www.ycombinator.com/howtoapply/

I think research is really different from startups. That said, I expect that over the next month or two the teams will figure out their general directions, and will share it when they do.

This is an announcement of the collaboration between YCR and HARC, not a specific project/technology. As stated in the post, "We will share more detail about each PI’s current projects once we settle into our new roles and establish a web presence".

You wrote this up within 5 minutes of the announcement. Perhaps you're judging it a bit too early? YCR is about "work that requires a very long time horizon, seeks to answer very open-ended questions"[0] - it's research and shouldn't be judged as a startup/technology company, which seems to be what you're doing.


Because it's research, they don't know what they're doing until they do it; they just know the direction they're going in. They're betting on people, not projects.

If you want more concrete details about what these particular people have done in the past, look at the code and papers they've published as CDG and VPRI. (And PARC, in Alan's case.)

Yep, I entirely agree.

My only take-away about HARC is that they hired 20 "researchers" to maybe make some mockups of UI designs they deem "better" ... "for humanity"?

>HARC researches technology in its broadest context, which includes: technology for communication (from the invention of spoken language to modern data graphics), intellectual tools (such as the scientific method and computer simulation), media (from cave painting to video games), and social systems (including democracy and public education).

In that broadest context, it really seems to lose its meaning. I think a closer word to that description is "ideas." Why try to cram the word "technology" into it, when it's such a stretch?

YC seems to be trying to reach into a number of different areas lately. Maybe they find that they don't have enough innovation walking through their door anymore. Or that they need to go the extra mile to push things forward themselves.

I'll take a wild, uneducated guess and say that they are trying to find the next frontier in "idea incubation" while their current product (YC) is on top of its game, so as not to be "disrupted" by whatever is going to replace YC's idea factory years from now. The iPhone to their current iPod. In that context, advanced research makes sense.

I'm bored out of my mind reading HN lately and I'd hope they are, too. (That ad today for the all-girl we-got-board-games thing was just... painful.)

So why not keep busy while they wait for the next wave to form? It's a great time to grow other branches.

>That ad today for the all-girl we-got-board-games thing was just... painful.

What are you referring to here? I must have missed this story.

That's actually far more relevant than a lot of YC startups. Custom packaging is a big trend in Asia and far more evolved than in the west.

I didn't understand anything from this post.

The description of what they hope to achieve is vague, in a good way. I'm sure a lot of people from the rationalist community in the Bay Area, who already admire YC, will love this.

Very exciting! How can I join? :D

^ Ditto

As much as I feel that I'm better qualified for the job than half of HARC's team, I don't think joining would help much.

They're 20 people already. There is no way they will agree on a coherent direction, unless it's a weak/safe one. I guess we'll see.

Very impressed they hired Bret Victor. His brilliance outpaces anyone else in his field, and he communicates well.

I hope they realize that copyleft and protection of users is a prereq for this bit, "to ensure human wisdom exceeds human power, by inventing and freely sharing ideas".

I am waiting for the days when some big names wake up and say "RMS has been right all along, we need to copyleft everything asap!"

I'm getting tired of hearing about $NEXTGREATTECH and then finding out it's a SaaS or proprietary etc.

Here's to hoping this group does something productive and free as in beer and speech.

Very interesting, and very exciting!

How will this relate to VPRI?

And are these researchers joining full-time and collaborating together directly, or are they just "part of the project"?

I can say without the shadow of a doubt that HARC is working on a new language. Not your average spoken or programming language, but a general-purpose and interactive language. A language that can only be communicated through a computer.

Text and speech are inherently limited by their linear and one-dimensional nature. Graphics are much more powerful, and leverage high-bandwidth senses. Knowledge will be consumed by navigating knowledge hypergraphs and causational trees (how-why axis).

In English, you write everything from scratch. You start with a blank page, then a word, then a sentence. With this new language, you must start with something that exists, some kind of node/edge of the graph. Communication is mostly done by manipulating the knowledge graph. This means that you can't say something that has already been said before. You don't need to provide or explain context. You can instantly see the impact of your thoughts. You can't say incoherent or fallacious things. In most cases, you don't have to say a single thing, as you can find it has been said before. Most decent programming environments provide libraries/modules, auto-completion, and compilation/execution. This new platform brings all these things to communication.

We're talking about a universal language here. A homoiconic language, where the distinction between code, data, and UI disappears.

This is nothing new. People have been talking about this for decades. The most important challenge here is not technical, but social. It is becoming clear that the application paradigm is not sustainable and cannot scale. Personal assistants (Siri, Viv, Cortana, Google Now) and messaging/bots (Magic, WeChat, Slack, Microsoft Bots Framework, Facebook Bots Platform) are clear symptoms. Although I've been trying to tell people about this future for more than a decade now, I don't have the resources or track record to be heard. I am hopeful that HARC will change that.
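Purely to illustrate the idea in this comment (every name and structure below is hypothetical, not any actual HARC project), a toy sketch of "communication by manipulating a knowledge graph" might look like:

```python
# Toy sketch: statements are edges between existing nodes, so you
# cannot assert something without anchoring it to prior context,
# and restating something already said is detected for free.

class KnowledgeGraph:
    def __init__(self):
        self.nodes = set()
        self.edges = {}  # (subject, relation) -> object

    def add_node(self, name):
        self.nodes.add(name)

    def assert_edge(self, subject, relation, obj):
        # You may only connect things that already exist, so context
        # never needs restating.
        if subject not in self.nodes or obj not in self.nodes:
            raise ValueError("can only build on existing knowledge")
        key = (subject, relation)
        if key in self.edges and self.edges[key] == obj:
            return False  # already said before
        self.edges[key] = obj
        return True

g = KnowledgeGraph()
g.add_node("TCP/IP")
g.add_node("protocol")
print(g.assert_edge("TCP/IP", "is-a", "protocol"))  # True: new statement
print(g.assert_edge("TCP/IP", "is-a", "protocol"))  # False: said already
```

Of course, the hard part the comment points at is social, not this data structure.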

I'm excited to see that Alex Warth is involved with this. In my late undergrad days, I was really excited by his OMeta language. It turned out that I'm actually really bad at writing PEGs (or even ANTLR grammars), so I wasn't the best candidate for advancing OMeta into common use, but the idea is really compelling.
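For readers unfamiliar with PEGs: a minimal sketch of the recursive-descent, ordered-choice style that OMeta generalizes (this is plain Python, not OMeta's actual syntax, and the grammar here is just an illustration):

```python
# A minimal PEG-style recursive-descent parser/evaluator for
# arithmetic. Each function corresponds to one grammar rule;
# ordered choice and repetition are expressed directly in code.

def parse_expr(s, i=0):
    """expr <- term (('+' / '-') term)*"""
    val, i = parse_term(s, i)
    while i < len(s) and s[i] in "+-":
        op = s[i]
        rhs, i = parse_term(s, i + 1)
        val = val + rhs if op == "+" else val - rhs
    return val, i

def parse_term(s, i):
    """term <- factor (('*' / '/') factor)*"""
    val, i = parse_factor(s, i)
    while i < len(s) and s[i] in "*/":
        op = s[i]
        rhs, i = parse_factor(s, i + 1)
        val = val * rhs if op == "*" else val // rhs
    return val, i

def parse_factor(s, i):
    """factor <- '(' expr ')' / number"""
    if s[i] == "(":
        val, i = parse_expr(s, i + 1)
        return val, i + 1  # skip the closing ')'
    j = i
    while j < len(s) and s[j].isdigit():
        j += 1
    return int(s[i:j]), j

print(parse_expr("2*(3+4)")[0])  # 14
```

OMeta's contribution was to let rules like these match arbitrary objects (token streams, trees), not just characters, which made it attractive for building whole language pipelines.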

I try to make a private study of the global survey of Wisdom Literature in an attempt to distil the essence of what it basically means to live a good life and be a moral person. Everything from Confucius to Aurelius, from Goethe to Gogol. And on and on. Wouldn't it be prudent, whilst you have such an assemblage of noble thinkers, to compile some sort of universal knowledge base of choice epigrams, for the purpose of explicitly delineating what it means to be "beneficial" and "just" for all future readers to come?

In any case, this looks like a necessary and timely line of inquiry and I am looking forward to the fruits of these endeavours. Good luck!

I'm excited to see where this goes! I'm curious, though: Alan and gang were already working on many of these ideas under the aegis of SAP's Communications Design Group. What was the motivation behind starting a new project?

I hope they just give these guys a bag of money and leave them alone to do their thing. They can then connect their work to other guys with a commercial focus where appropriate, without interrupting the flow.

Can someone explain to me what their goal is?

YC is building a research facility similar to Xerox PARC. No specific goals, but I'd assume researchers would be looking at projects 10-20 years out. There are a lot of similarities with this and PARC, they have Alan Kay (former PARC), and even the name HARC is clearly inspired by PARC.

To make people smart enough to know what's good for themselves.

I especially like the term because it means 'battle' in Hungarian.

Not to be confused with another HARC: http://www.harcresearch.org/about/75

First SLAC (Homebrew Computer Club fame more than the linear accelerator part) and PARC (so many innovations... where to start... right, ask Alan!)... and now HARC.

In good company!

There is a HARC in Amy Tintera's novel Reboot with similar goals. I wonder if they named it HARC because of the novel, or if it's a coincidence.

>> Our shared vision of technology combines an expansive long-term view with a strong moral sense.
>> HARC’s mission is to ensure human wisdom exceeds human power, by inventing and freely sharing ideas and technology

Free software ?

Ahhh nooo, my mistake, YCombinator is funding the project...

This is great of course, but then again - it feels like humans are the only species on this planet. We really have to start to think as an echo system, especially with technology.

> We really have to start to think as an echo system, especially with technology.

I know it was a typo, but HN does a pretty good job at being an echo system with technology, at times :-)

This somehow reminds me of the Encyclopedia Galactica from the Foundation series.

I hope they have an airtight written partnership agreement on this thang.

Very exciting. I wished there would also be room for a group at YCR/HARC working on collaborative theorem proving (CTP) as proposed here: http://arxiv.org/abs/1404.6186 :-)

Equating invention of future computing technologies with human advancement is too pompous for my taste. I won't hold my breath.

off topic, but FYI HARC in hungarian means fight :)

I don't know about this. The morality stuff sounds like a cult and the rest is so damn vague this whole thing feels half architecture astronaut and half bullshit artist.

But since they are YC funded go ahead and downvote me dead for heresy.


Just the result I expected from you asskissers.

The HN guidelines ask you not to go on about downvotes and not to bait other users into downvoting you. They also ask you not to call names ('you asskissers') in comments.

Please read and follow them when posting here:



Bringing people together under the umbrella of a very loosely defined goal is what Alan Kay does, and he has a track record of those groups creating big things.

The downvotes are the inevitable result of shouting that you'll get downvoted for your opinion.
