

Algorithmatic.com: a repository and dev. env. for algorithms - ANaimi
http://www.algorithmatic.com

======
tl
"To view this page you need Microsoft Silverlight 3 plug-in."

I don't think that was a good idea.

~~~
jacquesm
If you design stuff for programmers, it should be as platform independent as
you can possibly make it.

I have a hard time with sites that use Flash (Scribd) or, like the one
mentioned above, Silverlight, for stuff that should be done with server-side
code or DHTML.

I realize that in the case of scribd that might be a bit much to ask, but it
looks as though they gave up on page '1'.

~~~
plaes
Weird thing is that I already have Moonlight (Mono-powered Silverlight)
installed, but it still isn't working :S

------
plinkplonk
"Algorithmatic, by itself, is a small and simple dynamically typed Object
Oriented programming language that serves as a common denominator among
popular programming languages. This means that Algorithmatic doesn’t rely on
any exclusive/new feature – this guarantees the ability to port descent
implementations to any other language."

I don't get it. How is a bunch of common algorithms implemented in a custom
site-specific language particularly helpful?

Pseudo code for algorithms is widespread, as are working algorithms in almost
every language you can think of. How is porting from this "algorithmatic"
language any better than porting from (say) a Java implementation?

from the FAQ, "Implementations of all type of algorithms do exist wildly on
the internet. However, these implementations differ in quality as much as they
differ in syntax (or programming language.) "

So how does adding a _new_ syntax (and behind the syntax, an untested
interpreter/compiler, runtime etc) and inviting random people to submit
algorithms in this _new_ language solve this problem?

I must be missing something.

~~~
derefr
From the perspective of implementing an _individual_ algorithm, it's easier to
just google an implementation. But pretend you're writing the standard library
internals for a _new_ language['s reference implementation]—where do you
start? How do you compare algorithmic approaches, each written a slightly
different way, to know which one you want to use? For that matter, how do you
know which algorithms you "need" to put into your language (given that if you
don't include it, your users will probably just write it themselves instead of
complaining to you)?

I think the final goal, actually, would be to have an easy grammar spec that
allows automated transformation of this particular code into a given language.
That way, all languages, no matter the platform, can have the same core set of
algorithm code. This would become the "language-neutral" encoding of the
algorithms, open to inspection, and the other libraries produced from it would
just be considered object code, not to be modified themselves. Whenever a flaw
was found in any language's version of the library, the algorithm or the
translating parser could be updated, and the flaw thus fixed in _every_
version of the library. Whenever you came to a new language, you could
rightfully expect the same core to be available. Whenever you started writing
a new language, you'd get the core "for free" (id est, at the price of writing
a translator-parser spec.)
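The idea above can be sketched concretely. This is a minimal, hypothetical illustration (none of these names come from the site): represent an algorithm as a small language-neutral tree, and write one emitter per target language, so the same tree yields the same "core" in every surface syntax.

```python
# Hypothetical sketch: one neutral tree, several per-language emitters.
# Node shapes: ("num", value), ("add", left, right), ("call", name, [args]).

def emit(node, lang):
    """Emit a tiny neutral expression tree as source in a target language."""
    kind = node[0]
    if kind == "num":
        return str(node[1])
    if kind == "add":
        left, right = emit(node[1], lang), emit(node[2], lang)
        return f"({left} + {right})"
    if kind == "call":
        args = [emit(a, lang) for a in node[2]]
        # Only the surface syntax differs between targets; the tree is shared.
        if lang == "python":
            return f"{node[1]}({', '.join(args)})"
        if lang == "lisp":
            return f"({node[1]} {' '.join(args)})"
    raise ValueError(f"unknown node {kind!r} for {lang!r}")

tree = ("call", "max", [("num", 1), ("add", ("num", 2), ("num", 3))])
print(emit(tree, "python"))  # max(1, (2 + 3))
print(emit(tree, "lisp"))    # (max 1 (2 + 3))
```

The hard part of the real proposal is, of course, everything this sketch leaves out: statements, bindings, types, and library calls that don't line up one-to-one across languages.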

~~~
plinkplonk
"But pretend you're writing the standard library internals for a new
language['s reference implementation]—where do you start?"

(Books by) Knuth? Cormen?

Seriously though, if someone implementing "standard libraries" for a new
programming language has to look up implementations in some site-specific
untested language, it doesn't bode well for the new language's future.

People who write such high impact libraries should be well versed in
algorithmics, and be able to (1) analyse and port pseudocode, (2) read and
comprehend published papers (with mathematical proofs, etc.), (3) read,
comprehend, and port existing implementations, and so on.

"I think the final goal, actually, would be to have an easy grammar spec that
allows automated transformation of this particular code into a given language.
That way, all languages, no matter the platform, can have the same core set of
algorithm code."

Why not write an automated translator for (say) Java or C then, rather than
throwing yet another language into the mix?

Besides, this sounds too idealistic to me. But I guess seeing is believing.
I'll believe it when I see a new language's implementor take this path for a
standard library.

I can't think of anyone good enough to write a robust, performant language
interpreter/compiler and runtime balking at implementing algorithms and
needing automated translation from a website.

~~~
derefr
> I can't think of anyone good enough to write a robust, performant language
> interpreter/compiler and runtime balking at implementing algorithms and
> needing automated translation from a website.

You're assuming we're talking about big, robust, powerful languages here. This
wouldn't be for those, because those can survive without any help. We're
talking about a _long tail_ for language design: if you have an idea (say,
Erlang's actor-model-as-core) and want to try it out, writing a standard
library is a big barrier to putting out something satisfactory enough for
"normal" programming that others will want to tinker on your language and help
it grow. Imagine if Linus had to invent C when he was starting off with his
tiny little prototype of Linux. We have rapid _app_ development; why can't we
have rapid _language_ development?

~~~
plinkplonk
"You're assuming we're talking about big, robust, powerful languages here."

No I wasn't. I said "anyone good enough to write a robust, performant language
_interpreter/compiler_".

I said nothing about the _language_ size. Small languages can be very
powerful. _You_ added the "big" adjective.

"If you have an idea (say, Erlang's actor-model-as-core) and want to try it
out, writing a standard library is a big barrier to putting out something
satisfactory enough for "normal" programming that others will want to tinker
on your language and help it grow. "

And do you really think that a collection of random algorithms in an untested
language on a website would solve this problem for them?

The way to test out that idea would be to use an existing language (like
Scheme or C) to build an interpreter or compiler that instantiates the
specific programming model you have in mind, write an _application_ or two in
your new language, and write the minimum amount of libraries you need, not
build a "standard" library by _automatic_ translation from another language.

In the meanwhile you use some kind of FFI to access the underlying language/OS
libraries (e.g. Arc using PLT Scheme's networking libs).

Besides, the kinds of algorithms a "standard library" has are not the ones
built on the site (Fibonacci and Caesar's cipher in a standard library?).

A "standard library" has collections, OS interfaces, GUIs, regexes, networking
libraries and so on. I don't see those kinds of algorithms being portable
across languages, not to mention paradigms. Or were you thinking that the new
language would be a "small and simple" sequential object oriented language,
which is what the site's language seems to be?

I quote from the site

"Algorithmatic, by itself, is a small and simple dynamically typed Object
Oriented programming language that serves as a common denominator among
popular programming languages".

Says who?

How do you define a "common denominator" among "popular programming languages"
(like C, C++, Python, Java, C#, JavaScript, PHP, and Objective-C)? I'd like to
see this "common denominator" defined better. And assuming you did design such
a "common denominator", how would you go about automatically rewriting those
algorithms to use an "erlang actor model"?

As I understand it, automatically converting a sequential, non referentially
transparent program to a concurrent one is _still_ a research topic. So your
"erlang actor model" language won't necessarily get any benefit from
algorithms written in this site's language anyway.

If I understand you right you are saying that someone designing languages
(like PG is doing with arc) will now go and write some kind of automatic
translator from this site-specific language to their own to get a "standard
library"?

I've _never_ heard of _any_ language designer anywhere ever doing that. It may
be that such things have happened and I haven't heard of them. Any real world
_examples_ (vs suppositions of what _could_ happen in the future) you have of
such an effort would contribute greatly to the discussion.

I _think_ (please correct me if I am wrong) that you are _hypothesizing_ that
such a thing _might_ work vs having any successful real world examples of such
an approach. I don't think there is a real problem language designers have
that this site can solve.

If you have _any_ examples of a language designer creating a _standard
library_ for his language by writing a translator for algorithms written in
another language and running that translator, please give citations.

And if someone were (crazy enough) to do something like this, I strongly
suspect they would prefer to get some well tested algorithms written in the
same language/paradigm "family" to "port" than some random collection of
algorithms in a vaguely defined language on a random website.

IOW even if a language designer were to create _another_ dynamic object
oriented language, he'd do better to attempt porting well tested and robust
smalltalk libraries than from (a future version of) this site.

As a thought experiment, please imagine automatically converting smalltalk
code to, say, Haskell or Erlang. If that is too hard today, why even bother
with the silly algorithms on this site?

And I am not sure an automatic translation would work even in that case.
Unless the new language were an exact clone of smalltalk, this "automatic"
translation would have so many places the designer would jump in and hand code
stuff anyway, that it just wouldn't work out in practice.

I would really appreciate any counter examples of a language designer actually
building a standard library like you claim can be done.

~~~
derefr
No, no such thing exists. No one has ever tried to do such a thing before. I'm
actually quite confused as to your saying "before any such claims are made"—I
have made no claim that such a thing _does_ exist, only that this site _could_
be used to _achieve_ the existence of such a thing. Specifically, I don't
support the claims the site makes that it already achieves these things.

As far as I know, though, this non-existence is because no one has ever
_thought of_ doing such a thing before, not that it's been tried and didn't
work out. No one knows if it can or cannot be done, because it has never been
attempted. That's sort of why I had to explain the finer points of the idea,
rather than just calling it "an algorithm warehouse" or whatever it would be
dubbed if there had already been a prototypical example to classify.

Also, I meant "big" as in the phrase "big in Japan"—well-known or, at the very
least, actively in development. A big commit log, in other words.

To be frank, I haven't actually looked at the site we're both talking about
here (and Silverlight isn't quite executable on my machine, anyway.) From
first glance, it looked to be _trying_ to be a sort of encyclopedia of good,
tested implementations of well-known algorithms for hashing, scheduling,
compression, garbage collection, error detection, sorting, pathfinding, and so
on—the sorts of things you think about when you think "algorithm"—but failing,
because it wasn't reaching the right content-creating audience.

These implementations could certainly be written in C, but C isn't a very
machine-transformable language—Scheme or Forth would probably be a better
choice. The point, though, wouldn't be to transform the _syntax_ (because
that's the easy part) but to transform the _paradigm_ —to munge typecasts into
real types, or switch loops between iterative and recursive form, or, at its
simplest, transform between OOP and functional styles. "Hand-writing" the
examples would negate the point, which is to let a computer understand how to
write in your language, given pseudocode (or a given Well-Known Language.)
Basically, it would apply the goals of RDF to code instead of data.
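As a hand-made illustration of the kind of paradigm shift described (not an automated translator, just the before/after such a translator would have to produce), here is the same fold written in loop form and in recursive form:

```python
# Illustrative only: a mechanical correspondence between a loop and a
# recursive function. The loop's accumulator variable becomes an accumulator
# argument; the loop's exhaustion test becomes the base case.

def total_iterative(xs):
    acc = 0
    for x in xs:
        acc += x
    return acc

def total_recursive(xs, acc=0):
    if not xs:                              # loop finished -> base case
        return acc
    return total_recursive(xs[1:], acc + xs[0])  # one loop step -> one call

assert total_iterative([1, 2, 3, 4]) == total_recursive([1, 2, 3, 4]) == 10
```

A real paradigm-level translator would have to discover this correspondence from the source, not have it laid out by hand, which is where the difficulty lies.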

If it _was_ this thing, then my argument would hold, and you would also be
agreeing with me. I'm beginning to suspect our disagreement lies in the fact
that the site itself doesn't live up to the potential that I've laid out
here, although doing so would be no more effort. Perhaps I should make my own
site?

(And just so you know, yes, I'm talking about a completely hypothetical future
thingy—this is what I think language design will be like in the future, thus
my comment of "why can't we have _rapid language development_?" It would be
completely different than what we're doing now, which is reading reference
algorithm books (or not-too-well-known-and-quite-recent papers, where this
would be much more of a help) and turning the pseudocode into a "pure idea" in
your head, and then re-encoding the "pure idea" as code in your language.
Instead, you'd teach the computer to do what you're doing, and then just give
it the site's URL as training data—in other words, making your compiler an AI.
I think this site is a good first step toward that—getting all the world's
algorithms in one place, so we can build the training dataset. That's all it
is, though, and all I claim it to be.)

~~~
plinkplonk
"I'm actually quite confused as to your saying "before any such claims are
made"—I have made no claim that such a thing does exist, only that this site
could be used to achieve the existence of such a thing"

I _am_ challenging the claim that "this site could be used to achieve the
existence of such a thing". The claims on the site itself are (imo) not worth
challenging.

I _am_ saying that languages differ across so many dimensions (sequential vs.
concurrent, with different concurrency styles, say threads vs. message
passing; paradigms, objects vs. functional vs. imperative; many kinds of type
systems; memory management strategies; and so on) that the "automatic
translation to get a standard library" is a doomed effort with today's
technology, in spite of the advances made to date in operational and
denotational semantics.

At _best_, a lot of research needs to be done to make such a thing possible.

Is it possible? I guess, eventually. Is it probable in say the next five
years? Imnsho, no.

And with that I have said everything I wanted to and this thread is getting
too large, which is always a sign that it is time to stop.

Good luck with any effort you make to write such an automatically translatable
collection of algorithms. If you can pull it off, I'll be the first to admit
my error.

Over and out.

EDIT: I see you added "I'm talking about a completely hypothetical future
thingy"

and

"in other words, making your compiler an AI".

OK then, I withdraw my arguments. Sufficiently advanced technology is, after
all, indistinguishable from magic, as Arthur C Clarke said.

------
roundsquare
The implementations here don't look that great.

Just take a look at the Naive Prime Generator. I know they call it naive, but
it's _very_ naive. To the point where, when it's checking whether i is prime,
it tries to see if numbers _greater than i_ evenly divide it!

There are other optimizations they could toss in there (e.g. since they keep
an array of prime numbers, they need only divide by those), but you could try
to argue that these make the code harder to understand... so for those I'm
willing to give a pass.
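For reference, a sketch of the optimization described above, trial-dividing each candidate only by the primes already found and stopping at the square root. This is not the site's actual code, just the standard version of the idea:

```python
def primes_up_to(n):
    """Generate all primes <= n by trial division against earlier primes."""
    primes = []
    for candidate in range(2, n + 1):
        is_prime = True
        for p in primes:
            if p * p > candidate:
                break              # no prime divisor <= sqrt(candidate) exists
            if candidate % p == 0:
                is_prime = False   # found a factor; candidate is composite
                break
        if is_prime:
            primes.append(candidate)
    return primes

print(primes_up_to(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```

Each candidate is divided only by stored primes, and never by anything larger than its square root, which fixes both problems mentioned.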

------
j_baker
I left the site as soon as I typed in quicksort and nothing came up.

------
pssdbt
Usually I just close the window when asked to upgrade Silverlight, but this
seems like a good enough reason to upgrade (as much as I dislike it). Really
though, why is it needed here? Not trying to hate, just curious...

------
thomaspaine
I see a lot of naysayers here, but I think this is a good idea. I was actually
considering doing this myself a while ago. One of the best things about Matlab
is the community of plugins and algorithms, and having a somewhat central
repository that makes them easy to find.

That being said, requiring Silverlight is a deal breaker for me. I'd also like
to see something more like github, where it's easy to fork and make
improvements to other people's code.

------
ismarc
When I first opened the page, and saw the front page, I went "Man, this is
gonna be great. An in-browser IDE (probably using some site specific syntax)
to try out the algorithms. Given a common run-time and an assortment of
different types of datasets, you can try out different methods of
optimizations right there in the browser without having to do the tedious
parts." Then I actually clicked on some links. Then I saw what it was really
doing. Then I was sad because I made assumptions.

Seriously though, I'd completely redesign the site. Needing Silverlight to
view the algorithms is just insanity. Add in a large number of varied
datasets, and make the goal of it an easy way to experiment with and share
algorithms/optimizations (i.e., if you have 20 different variants of quicksort,
a search for "quick sort" returns all 20, with graphs indicating the
performance of each on each dataset).

As it stands, Wikipedia is a better reference for algorithms, using well-
defined and concise pseudo-code to demonstrate the algorithm.

------
asjo
'Type algorithm name here...' "Simplex" → 'your search term didn't match
anything'

They forgot the "beta"-badge?

~~~
leif
right, because everyone needs a quick simplex algorithm under their pillow

~~~
asjo
I think it would be fitting to have in a repository of algorithms; you don't?

Maybe I didn't understand the purpose of the website from its frontpage.
(Which is a failure, either mine or the frontpage's.)

------
gwern
Anyone want to compare this to <http://rosettacode.org/> ?

------
gritzko
Heh, the search returned nothing for "quick sort" and "hoare".

------
FrankBlack
Doh! I thought it was the web page for a former U.S. Vice-President's new
band. :(

------
dzlobin
This is a great project, looking forward to it bustling with code.

