
Parallella: A Supercomputer For Everyone is Dying - ahalan
http://kickstarter.com/projects/adapteva/parallella-a-supercomputer-for-everyone
======
aristidb
Dying? At $612K of $750K raised with 31 hours to go? Is there something I
don't understand there?

EDIT: Realizing that mods might change the title at any time, it is right now
"Parallella: A Supercomputer For Everyone is Dying"

~~~
Eduardo3rd
They only have 31 hours left to raise the rest of the money. Considering the
fact that they took several weeks to reach this point, it seems unlikely that
they will hit the goal, and therefore they will not secure any of the pledged
funds.

~~~
vidarh
It slowed down for a long time, but it's sped up again in the last few days;
in the last 20 minutes alone they've gotten another $6k. It's going to be
close either way, it seems.

------
incision
I seem to recall repeatedly reading that Kickstarter projects collect the
majority of their funding in the first 72 and final 24 hours.

My anecdotal experience backing and following half a dozen projects agrees
with this.

I'd be quite surprised if this project doesn't make it.

------
polyfractal
Could someone explain how this is different from GPU computing and regular
multi-core CPU computing?

I realize there is a difference...but I'm not quite sure I grasp it yet. GPU
computing is a lot of parallel math computations with limited shared memory.
I'm assuming the Epiphany CPU is more capable than the simple GPU math units?

How's it different from multi-core CPUs? Just the sheer quantity of cores they
have packed in there?

~~~
wmf
IMO this combines some of the worst features of Cell (e.g. local memory and
DMA) and GPUs, and while the power efficiency is good the absolute performance
is very low. For a parallel noob who's using OpenMP/OpenCL I don't think it's
any better than a desktop PC because programming it is going to feel the same
and performance is going to be equal or lower. And if you don't use the
libraries then you're in low-level ninjas-only land — the extremely simple and
flexible hardware is good in theory because you can use it many different
ways, but it also doesn't help you or give any hints about how to properly
exploit it.

~~~
vidarh
It's not meant to compete with a desktop PC, or with a mass produced GPU.

It's meant to be a development platform for solutions based on their
architecture and for people to get familiar with the development model, with
an existing 64-core version of their chip and future versions intended to put
1000+ cores on a board as the eventual target.

That it's also a reasonably capable platform to run Linux on (on the ARM chip)
so you can do development directly on the board is an added bonus.

~~~
wmf
If the architecture is not good then people don't want to get familiar with
it. I'm skeptical that even an "eventual" version with hundreds of cores would
be worth using.

~~~
vidarh
Well, clearly at least 3700 people want to get familiar with it based on the
number of backers so far, which is pretty good for a niche platform like this.

We'll find out soon enough.

------
kiba
For some reason, I thought it relates to the concept of copying the brain of
the dead to the computer so that they can live on as disembodied souls in the
computer.

It just turns out to be a kickstarter project for a powerful computer.

~~~
vidarh
It's not so much that it's a powerful computer, but a computer architecture
that can scale up to be a very powerful system. The version they're trying to
fund is a cost reduced version including their 16 core chip. They also have a
64 core chip, and plan to scale it much higher.

It's differentiated from GPUs in that each core is a simple but fully
independent CPU core, with direct access to main system memory AND to the
memory of the other cores.

This current project is most interesting as a means for people to start
playing with the architecture rather than for the raw performance.

~~~
mtgx
How is it different from Tilera and Intel's Xeon Phi?

~~~
lonetech
I've not had the time to read up on Xeon Phi, but compared to the Tilera, the
Epiphany is a considerably simpler processor. There's no MMU in the cores;
instead of caches there is direct DMA control, and the on-chip network extends
past the edges of the chip (that's all the I/O; there are no peripherals on
the chip). It all adds up to something you can scale by mounting more of them
on a board, assuming your task adapts well to a data-flow style (since the
external bandwidth scales more slowly than the number of cores).

It's not at a level where you can run a general-purpose operating system with
virtual memory and memory protection (though extending it for that would be
fairly easy - perhaps Epiphany V?), nor does it (currently) run multiple
threads per core, but this simplicity affords it a much lower power cost.

A GPU may be more similar, as those tend to have prefetch operations and no
memory protection, but GPUs are designed to run huge bunches of threads doing
the exact same type of work. They look like vector processors handling between
16 and 128 identical operations per control core (each a multiprocessor).
Mainly, the Epiphany is easier to program, but optimization is a different
story (similar to the place-and-route process FPGAs need). It's a move toward
a data and control flow granularity not currently available at a price
individuals can afford. And to make it more useful, those individuals need to
try things.

------
jws
At some point in the next 24 hours, the backers who signed up for one unit
will need to ask themselves if they would rather have two units or zero.

I can't speak for the other 1800 people in my bin, but I just decided on two.

~~~
Evbn
Isn't that (multiples as pledge rewards) a direct violation of the letter and
spirit of Kickstarter's new rules?

------
vidarh
I'm happy to see it get more attention, but to say it is "dying" is a bit
hyperbolic. Sure, the Kickstarter campaign seems like it's unlikely to meet
its target.

But from the sounds of it, I don't think the company behind it will just give
up if that happens. For my part, if they put up another campaign elsewhere,
preferably with a longer lead time, and/or take pre-orders, I'll commit again,
and I'm sure a lot of the other people who signed up will too.

I think it was unfortunate that they didn't release all the material they've
released in the last few days right at the beginning of the campaign, though -
they'd likely have done better. They've also clearly had a hard time
explaining to people what it's _for_, which is a pity. I don't think the 16
core version by itself is all that interesting from a performance point of
view, but I'm interested in the architecture in the hope that they manage to
pull off the 64 core version and larger.

EDIT: It's added $20k in the hour since I wrote this - happily it looks like
it's got a good chance to succeed.

------
rrreese
Several commenters, and the OP, seem to think that this Kickstarter will
fail. Having backed quite a few Kickstarter campaigns and watched a lot more,
I think that's unlikely.

Backing is concentrated very heavily in the first three days and the last
three. Projects that have reached 80% of their funding goal by the last three
days are extremely likely to succeed.

It seems that many people delay backing till the last minute. Possibly this is
just human nature, though the Kickstarter process also means that as the
project progresses more information is released in a steady stream, and often
new funding levels are created.

Additionally backers who really want the project to succeed raise their
pledges to help the project succeed.

~~~
mikepurvis
And no big project like this would ever fail by < $50,000. Wouldn't the
founders or their investors or whatever just max their credit cards to see it
through?

Does Kickstarter explicitly prohibit such things?

------
mendocino
> we see a critical need for a truly open, high-performance computing platform

> FAQ: Will you open source the Epiphany chips?

> Not initially, but it may be considered in the future.

Well, that makes it a lot less interesting than I hoped it would be.

~~~
vidarh
You were hoping for what exactly? VHDL/Verilog for the chips? Netlists?

For most people I'd assume the main thing is that the architecture is well
documented and open, as well as the board, and they _have_ released all of the
architecture documentation and a lot of other material.

As much as it'd be great to have a market in other sources for the chips,
unless/until the architecture has some traction that is pretty irrelevant.

------
GregBuchholz
Has anyone played with the 144-core GreenArray's IC?

<http://www.greenarraychips.com/>

~~~
jws
GreenArrays are intriguing, but a completely different animal from this
computer. The GreenArrays compute nodes are microscopic by comparison. Think
256 bytes of storage, shared between instructions and data. If you can map your
problem on to them they seem very efficient.

------
almost
Only $99 for the first reward that actually comes with a board to play
with. Sounds good to me! I've added $119 (international shipping!) to the
total; hope they make it...

------
nilsimsa
If the current trend rate continues, they should be able to reach their goal.
If they could somehow get on the Reddit front page it would easily happen. I
think there are many who might be interested if they only knew.

Here is the trend graph: [http://canhekick.it/projects/adapteva/parallella-a-
supercomp...](http://canhekick.it/projects/adapteva/parallella-a-
supercomputer-for-everyone)

------
pbharrin
I think the market is telling these guys: we don't care about computing power.
People are getting by with iPads and Chromebooks powered by ARM cores with 1/8
the computing power of an Intel processor. Don't get me wrong: if you want to
play around with parallel computing you should love this, and support it. Just
don't be surprised when it doesn't reach Pebble funding levels.

~~~
amalag
They want to enable small-scale computers to do more powerful computations. I
don't think it is aimed directly at iPads and Chromebooks; it could be useful
for something like a quadcopter, where there is an algorithm that gives better
stability but needs more computational power.

------
converging
It's alive. Alive! Adapteva reached their target; right now they are at
$769,996 pledged against a $750,000 goal, which was cleared on October 27th
between 2 and 3 am. Not sure which (US) timezone that refers to.

Source: [http://canhekick.it/projects/adapteva/parallella-a-
supercomp...](http://canhekick.it/projects/adapteva/parallella-a-
supercomputer-for-everyone) (13,700 projects graphed). Great project, Daniel.

Did not delve into past performance of kickstarter projects, but comments from
across the net seem to confirm rrreese's comment: "Backing is concentrated
very heavily in the first three days and the last three. Projects that have
reached 80% of their funding goal by the last three days are extremely likely
to succeed."

Canhekick.it lists several to-dos, of which aggregates and predictions would
be especially useful.

Any thoughts on how the funding dynamics of future Kickstarter/other
crowdfunding projects would be affected if this data were widely available?

------
Eduardo3rd
I'm impressed that they made it as far as they have. $612k puts them in the
top tier of all Kickstarter projects, but unfortunately they look to have set
their goal too high. Maybe they can pull a Clang and raise a ton of money in
the last 24 hours, but I'd be surprised to see that happen. Here's hoping I'm
wrong.

~~~
damian2000
I think they'll make it easily, sort of like what happened to Dalton
Caldwell's App.net ... they got a ton of support in the last 12-24 hours.

~~~
pi18n
Maybe this is cynical, but there seems to be little reason for them not to
borrow enough from friends and family to collect from Kickstarter, and pay
them back immediately after. Unless the rewards are ridiculously expensive?

------
dkhenry
While I am happy to see another post for this on the front page, I would have
preferred a positive one. People jump on bandwagons; I would rather we started
a positive bandwagon than one looking for shovels and a decent grave for an
awesome project.

~~~
kemiller
For what it's worth, this is the first I've heard of it, and this convinced me
to contribute.

------
joshuafcole
I did my part. I'm very much the archetypal broke college student at the
moment, but I won't always be. I have big plans in the Artificial Intelligence
and Machine Learning sectors, and I can't imagine a better, cheaper solution
to get started on working with multi-agent systems.

I desperately want to see this sort of pricing for cluster computing available
in the future, when I have the scratch and knowledge necessary to make these
ideas into products.

I think that future is worth skipping the occasional movie or meal to pay
into, and I'm looking forward to my somewhat unexpected end of year gift.

~~~
Evbn
Here is one cheaper solution: simulate a multi-agent system on your laptop.
You don't need parallel hardware to have multiple agents.

------
laacz
I wonder if this was an elaborate reverse engineering of the HN crowd to
reach the Parallella financing target :)

------
bhickey
Hasn't Amazon already done this with EC2?

(Edit: Written in response to the title.)

~~~
wmf
Yeah, the cloud is much more accessible "for everyone" than this chip. If you
want to learn parallel programming, rent a GPU instance.

~~~
vidarh
A GPU has a totally different architecture and programming model. It is
interesting for very different types of uses.

------
dannnnnnny
Argh! I get paid in a couple of days! I was hoping the cash would go into my
account before this runs out. Looks like that might not happen now.... Why
couldn't you have given us another few days? Oh well. I bet it'll get funded.

------
Sami_Lehtinen
Well, they made it and more: Funding: $788,138 of $750,000

------
eigenrick
It's a shame they're not making something useless like a video game. Then
they'd have millions.

~~~
lonetech
Video games are exactly where heterogeneous computers like this have
flourished. But they don't have a game to market it with yet - historically it
hasn't even mattered much whether that initial game used the platform anywhere
near well. How did the Ouya sell, for instance? (It really resembles a Nexus 7
with a broken screen.)

