"The real time websocket relay system used to make collaboration with JAM possible was developed using the Go programming language."
And the web app is hosted on App Engine. AFAIK, though, you couldn't host the Go websocket component on App Engine (even though GAE supports Go as a runtime), since GAE doesn't support persistent connections directly.
In any case, very very cool!
(also available for Python and Java)
Any two sounds played more than 30ms apart start to become distinct to the human ear.
Now, if only it could work with real instruments plugged into the computer.
Just a few weeks ago an extended family member mentioned them to me. He'd been using their software to play with his old bandmates, who are now scattered around the country.
Edit: Here's a link to their "learn more" page, which is hard to find: http://ejamming.com/learn-more/
But, perhaps I'm being nostalgic :)
But does it have ASIO low-latency audio driver support? Let's not even get into the latencies websockets are going to create. Even if you set something like this up on a LAN, the latencies would still be too high for any truly professional musician.
Don't expect to see anything like this used by real musicians anytime soon, even minus the websockets collaborative part.
Still very cool though. But I'm still waiting for the day when something like Rock Band or Guitar Hero actually has professional low-latency audio that could seriously be used live on stage by a band, with sub-10ms audio latencies.
If Rock Band and Guitar Hero can't even do that, I'm not holding my breath for Chrome anytime soon, especially in a networked situation.
I have only a fuzzy understanding of how this works, but I was thinking you could run a bunch of .wav samples through Web Audio to get their wave shapes ... save them as arrays, and then synthesize sounds from a single js file? (Rather than lugging around the .wav files themselves).
If this is possible, it would be awesome.
Go backend on App Engine. Good to see Google pushing it.
Some tech notes from a pro-audio software developer:
- All the samples are OGG ( http://chrome-jam-static.commondatastorage.googleapis.com/xm... )
- They're using some "DMAF" engine for the audio, though I can't figure out who made it: http://www.jamwithchrome.com/js/compiled/dmaf_all.js
It's pretty serious stuff from the looks of it - It's got a node graph with different DSP modules (envelope follower, delay, ADSR, distortion, LFO, chorus, compressor, phaser, equalizer, biquad filters), it plays MIDI files, and more.
I developed a similar application for Chrome (which coincidentally had the codename "Jam"), but it uses Native Client for audio instead of the Web Audio API. It's called SongStarter, and you can jam with it here:
(NaCl apps must be installed from the Chrome Web Store for the time being, otherwise they won't work. Google's choice...)
It deals with latency by rounding event timing to whole measures, so in a free-form jam, everyone always stays in sync. It's an ingenious solution and it really works. Being in a jam where real musicians with different instruments just join in and play together is incredibly cool. (That's the Achilles' heel of these web experiments, unfortunately: cool tech demo aside, they're not really aimed at musicians...)
How long have you been working on it for?
There are going to be loads of "big brand" companies looking for crazy audio doodads for their websites after this Jam thing. Probably lots of contract work to be found there... :)
It's been in the making since December, if I remember correctly. I have to give credit to the rest of my team too, of course.
It seems you can add this as an extension to Chrome, too:
It seems to say that I can use the QWERTY keys to play the guitar, and while the buttons are moving on the screen, the chords aren't. Is that a bug or am I doing something wrong? The sounds seemed to work only when I used the auto mode.
The real bug, though, is that the "key" setting at the top of the UI that's supposed to update the available set of chords is completely wonky, and rarely does what it's supposed to (what, you wanted B? Let's go with A instead. You chose major? Minor it is!)
Seems kind of strange. I mean if FB goes down or moves the location of that js file or something, the whole jamwithchrome site won't work?
I'm not really sure about the differing capabilities, or what standardization of either looks like right now. But I think Mozilla's api is more low level, so it might be possible to create a compatibility shim on top of it?
Being that this is HN many of you might have seen this already, but there's an exciting Google IO talk from Chris Wilson about Web Audio:
And see the corresponding slide deck for code:
Oh well, maybe I just need to practice more!
My best friend and I built this in a few months and we're really proud of it. It's still very early but we're adding new features all the time.
It doesn't have the graphic design and branding of jamwithchrome, but you can start creating music really quickly.
Also, if you sign up for free, you can create and loop clips and jam with yourself if you like.
Give it a try and let us know what you think.
Not sure where to go from here, since I certainly have a connection.
I didn't know about the insert mode thing for a long time, but when I figured it out it was a "duh" moment :)