Ableton Note – A playable iOS app for forming musical ideas (ableton.com)
173 points by alphadelphi on Oct 19, 2022 | 107 comments



I'm loving this.

No one has pointed out how cool it is that you get random instruments at the start.

It's a looot of fun to be able to 'doodle' songs based on a cool thing you made inspired by a drum sound, or a hot synth line. The equivalent would be having a magic art supplies drawer that gives you a few coloured crayons, or watercolours, or charcoal etc at random, paired with different types of paper and scales. And if you don't like your random set, you can make another in 3 seconds.

The 'quick start' aspect with the touch of randomness is just so cool. I don't know how many times I've opened Ableton, then gotten bored before I could pair an instrument with a drum track and a bass. This leapfrogs that pain point beautifully.

I also really like how intuitive it is for an Ableton user. To be fair the learning curve for Ableton was pretty steep - but so, so worth it.

One area that I think needs improvement: there are many finicky little problems about setting and correcting the length of your loops. For example, I love that it auto-guesses your tempo and loop length, but it can be very wrong sometimes and I'd like to be able to change it without going into Ableton Proper. I hope they can solve that.

I'm also struggling to figure out how to lengthen notes once they're put down - maybe it's not possible? This results in some rough edges, but nothing that can't be easily fixed in desktop Ableton.

Overall, this is really fun. I made about 15 little song seeds yesterday, and it was enjoyable.


That's definitely a good point. Restricting options and randomization can be great tools for creativity. I also like Splice's "CoSo" app for the same reason. It isn't a great app for making full songs, but it's phenomenal for getting my creativity flowing.


Just tried it. As someone who has never used Live, I didn't understand anything (I know Logic though, and I'm generally good at figuring out how apps work, being a dev myself), even after the tour.

I believe the onboarding should be interactive and tell the user how things work, instead of just showing a sample video, which doesn't teach much.

That said, from what I see of the results, it can be a great tool... if one can figure out how things work.


Judging by the landing page, this app is not meant to be a standalone tool in your music production but a companion tool to Ableton on the computer. You sketch your idea in the mobile app and eventually transfer it off to the computer to finish the track in Ableton.

With that in mind, most of the screens seem to be self-explanatory, at least for someone who has dabbled around in Ableton.


This is wild, because I think of Ableton desktop as a song prototyping environment…


Well, for many other producers, it's where songs both start and end. They might send them elsewhere for mastering, but not for production work.

Anything without analogue instruments needs little that has to be done elsewhere (I'd argue anything with analogue instruments doesn't either, but many prefer to track vocals, drums, guitars, etc. in Pro Tools or similar).

Mostly, though it depends on the genre and the kind of artist. Artists working mostly with analogue instruments, or that have huge sales and production teams, will tend to use other apps, mostly Pro Tools, and use Live as a starting point/sketchpad.

Producers and composers of pop and electronic music, that are not on the paycheck level of, say, top-charting artists, even if they still have tons of fans and a good career, do tend to do everything "in the box". It's perfectly capable. Heck, even FL Studio has created tons of "in the box" global hits...


Not so wild, people use different tools for different needs :)

I do my sketching wherever, actual production on a bunch of hardware, and only use Ableton to record the final main output of the hardware and do my mastering there, nothing else.

Producers be producing in tons of different ways :)


...but hard to use on the bus or in line at the bank.


Live is such a terrible program. Oh, it's great if you want to do 4/4 dance music centered around samples but for all other purposes it is incredibly painful.

As a classic example, I play a MIDI wind instrument which sends notes, breath control, and pitch bend. If you use "scale time" on the MIDI track, it moves the notes but leaves the breath control and pitch bend where they were, silently breaking the track. The same is true of modulation and volume controls.

I contacted them well over 10 years ago with a polite and careful bug report. Their original response was, "That is what it is supposed to do". I said "But all other sequencers stretch all the MIDI, not just the notes! And it breaks my tracks. Isn't there some way I can just stretch all the midi?"

"No, why would you want to do this?"

"Because I want to stretch existing MIDI tracks to match other tracks."

"You should record with a metronome so that the tracks are already in sync."

"But I already have these tracks. And also, when I am composing, I prefer to just play without a metronome, and then select the parts I like."

"You're just being difficult."

And they refused to respond after that. Years went by, and every year I saw some other sucker complaining about the same thing.

Well over a decade later, they finally implemented it - _except_ it's only for the three MPE controllers, so it doesn't deal with breath control, classic pitch bend the way it was sent by every single controller before around 2021, modulation, or volume.

I have dozens of similar bugs.

Now I use Reaper. What a difference!
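The behavior being asked for is conceptually simple: when stretching a clip, scale the timestamp of every MIDI event, not just the note on/offs, so that continuous controller data (breath control is CC 2, modulation CC 1, volume CC 7) and pitch bend stay aligned with the notes they belong to. A minimal sketch of that idea, using a hypothetical event model (not Live's or any real MIDI library's API):

```python
from dataclasses import dataclass

@dataclass
class MidiEvent:
    time: float   # position in beats
    kind: str     # "note_on", "note_off", "cc", "pitch_bend", ...
    data: tuple   # e.g. (pitch, velocity) or (cc_number, value)

def stretch_clip(events, factor):
    """Scale the time of ALL events by `factor`, regardless of type,
    so controllers keep their position relative to the notes."""
    return [MidiEvent(e.time * factor, e.kind, e.data) for e in events]

clip = [
    MidiEvent(0.0, "note_on", (60, 100)),
    MidiEvent(0.5, "cc", (2, 64)),        # breath control mid-note
    MidiEvent(1.0, "note_off", (60, 0)),
]

# Doubling the length keeps the CC event exactly mid-note.
stretched = stretch_clip(clip, 2.0)
```

Stretching only the note events, as described above, would be equivalent to filtering on `e.kind` before scaling, which is precisely what leaves the controller data behind.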


>Live is such a terrible program. Oh, it's great if you want to do 4/4 dance music centered around samples but for all other purposes it is incredibly painful.

That's just not true. It's used all the time for experimental, decidedly not "4/4 dance music", for soundtrack work, and for many other things, including tracking regular rock and other such bands.

>As a classic example, I play a MIDI wind instrument which sends notes, breath control, and pitch bend. If you use "scale time" on the MIDI track, it moves the notes but leaves the breath control and pitch bend where they were, silently breaking the track. The same is true of modulation and volume controls.

Well, all DAWs have similar quirks. Some don't have microtuning (Live does) so you can't play Indian or Arabic maqam scales. Others don't have MPE (Live does), or can't sync live drums to the DAW in real time so synths/arps etc match the tempo (Live does), and so on...


MIDI Stretch stretches all linked envelopes, not only the three MPE controllers, if you select TIME, not notes; Live differentiates between those types of selection.

The status bar at the bottom will say whether you have a "Time Selection" (Start, End, Length) or a "Note Selection" (Time, Pitch, Velocity, Probability).

Arrangement has a similar distinction between selecting time or Clips.


Live is first and foremost a tool for making electronic music in the same way Cubase is for making live band stuff. You can use one for the other, but you'll fight against the design goals that underlie them. I would love Cubase's expression maps for handling keyswitches in Live for those rare times I use orchestral stuff, but I know I'll need to spend money on the top Max device for it if I want to get serious about sampled instruments. And I don't, because 99% of the time I'm making ridiculous sounds in Massive/Massive X. Live is all about the painless automation and clip launching. Automation is central to electronic music. Clips are fun to goof around in, but also let musicians cut up their songs for dynamic playing during actual live sets.


> Oh, it's great if you want to do 4/4 dance music centered around samples but for all other purposes it is incredibly painful.

A good friend of mine uses Live heavily for creating music -- definitely not 4/4 dance music: https://album.link/i/1627338395 . He's mainly using acoustic recordings as samples though, so it sounds like your specific use case works better in other DAWs. I've never played with Reaper, sounds interesting!


Interesting. I use Reaper and love it, but sometimes I feel "ashamed" of not using Live like everyone else. Of course when listening to a finished track nobody cares what it was made with, but for collaboration the choice of Reaper seems a little limiting. Yet it's so powerful.

That said, there must be reasons why Live is sooo popular...?


Essentially by default, Live can take any material and automatically ensure that it matches the tempo (and key) that you're working in. Clip launching is a very powerful way to play around with musical ideas, even if it is not necessarily the best way to produce finished pieces of music (though it can be). Reaper doesn't have this workflow (at least, not built in). It is this "I can goof around with almost any material and it almost always sounds good, or at least interesting" element that I believe made Live so popular.
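The arithmetic behind that automatic tempo matching is straightforward once a clip's tempo has been detected (detection is the genuinely hard part and is assumed here): stretch the audio's duration by the ratio of the source tempo to the project tempo. A rough sketch of just that ratio, not of any actual warping engine:

```python
def stretch_ratio(source_bpm: float, project_bpm: float) -> float:
    """Factor by which a clip's playback duration must change so its
    beats land on the project's grid. >1 means play longer/slower."""
    return source_bpm / project_bpm

# A 128 BPM loop dropped into a 120 BPM project must play ~6.7% longer:
ratio = stretch_ratio(128.0, 120.0)   # ≈ 1.0667
```

The remaining work (doing that stretch without changing pitch) is what time-stretching algorithms handle, and is where the various "warp modes" differ.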


Ah ok. The way you're describing it reminds me a lot of Acid Pro, which went from Sonic Foundry to Sony to Magix where it seems to have disappeared. It's still available apparently, but I have yet to meet anyone who uses it (or has even heard of it!)

There was a time when it was all the rage. How things change.


> That said, there must be reasons why Live is sooo popular...?

I've been using Live since 2015. My take on what it's got going for it is that it has a relatively streamlined interface, and is fairly opinionated on a particular way of setting things up, which makes learning it quicker. Instruments and effects for each track across the bottom. Make a bunch of tracks whose output by default feed into groups, which you can treat as busses. You can override the routing, but the default is easy to understand and straightforward. The stock devices cover 95% of what you need to do, generally "sound good" out of the box, and Max For Live provides a way to make custom devices that seamlessly integrate into the DAW.

Also the clip launching stuff and integration with the Push is neat. I suppose more DAWs have that capability now, but in 2015, Live made it easy.

If I were to pick a new DAW I'd probably go with Bitwig. Live is starting to feel like legacy software to me. Max For Live is too low-level to do polyphonic instruments easily, and the visual node-graph approach is harder for me than writing code would be. It sometimes hangs for 30 seconds at a time while I'm trying to flesh out some ideas, and I find myself sitting there annoyed, trying not to forget what I was trying to do while Live gets its shit together.


I think part of Live's popularity stems from its image as "the DAW for serious producers". It might have something to do with it being pushed at universities and tech colleges which teach audio. So it's somewhat ingrained as the DAW that's used for electronic production, and Pro Tools as what's used for audio recording. I think it's most intuitive to use as a DJ mixing tool; I've never more easily created mixes with blends than I have in Live. This is where it started, I guess (as Henke/Monolake's live set software, a MaxMSP patch), so it makes sense it's most intuitive there (in my opinion).


>That said, there must be reasons why Live is sooo popular...?

the clip launching workflow makes it a great fit for live sets, improvised electronic music (or any other kind of music based on blocks), and so on

very streamlined for easy focus - many DAWs suffer from windows upon windows, in Live it's all (including built-in fx and instruments) in a single, non-MDI window, and all tools, fx, instruments, windows use the same widgets and concepts (not the case in e.g. Logic or Cubase)

automatic (and good) sync-to-tempo for audio, including several manual manipulation options (pitch/tempo/combo/etc)

opinionated fx and built-in instruments tuned to electronic and experimental music (including a full-on built-in version of Max, one of the most popular visual programming environments for building your own fx, instruments, audio/midi processing tools, and so on)


I'd recommend watching the Learn Note video: https://www.youtube.com/watch?v=BjjMEZburmQ

It was really cool to see all the things you can do with it.


>I know Logic though, and I'm generally good at figuring out how apps work, being a dev myself

Me too (I knew Logic and Cubase, and am a dev who's good at figuring out how apps work). Ableton Live (desktop edition) still took a while, plus checking some online introductions (which I'd never needed for a new DAW in 20 years). It just has a different model (Logic's new clip launching abilities are somewhat similar, but Live also does several other things in its own unique way).

That said, in the end, it's a DAW with audio and MIDI. A lot of things will be common too. But the workflow part can be tricky to grasp just by using.


If it makes you feel any better, Ableton Live's interface is also not particularly intuitive, though at least it has the advantage of tooltips. It does seem like something you have to use for a while to figure out the workflow, though some of it is a bit easier for Live users.

I think it is to Live what GarageBand for iOS is to Logic — a stripped down app that lets you build ideas on the go that can be exported to the main application to complete/further flesh out the idea.


Pretty much all DAWs are that way though. They're very feature rich pieces of software. It's quite difficult to design something that can be usable by a layman and professionals at the same time.

It's kind of like your grandmother (assuming she's not a developer) complaining that VSCode or a JetBrains IDE is confusing. Of course it is. She doesn't know what half of it does, not because it's poorly laid out or designed, but because she doesn't have the domain knowledge. She's not the target audience.


There's an even bigger (or at least, more on-point) problem with the audience though.

In the pre-DAW days, "audio engineering" was a specific task and skill set that was quite distinct from "playing music". Many DAWs were created to assist with audio engineering, assuming usage by someone who understands that domain.

Fast forward to recent times, and there's some widespread belief that a DAW should be a tool for musicians, even for musicians who don't know anything (or at least, not much) about audio engineering. "I just want to record my ideas".

So then someone cooks up some relatively simple DAW-like application for such people. They start using it, and within a few weeks or months, they find themselves unavoidably learning something about audio engineering. They want more from the application, and within a relatively short period of time, they need and/or want the full DAW.

The same thing is happening, to a lesser degree, in the podcasting/radio production world.


Every audio engineer is also a musician. Every DAW has been developed by audio and software engineers who are also musicians[1]. Every professional musician understands the basic concepts of audio production. DAW UIs mimic the interfaces of real world devices, of which all musicians interact with on a regular basis.

Suggesting that musicians weren't meant to (and shouldn't even be allowed to) use DAWs is beyond nonsensical. This holier-than-thou argument you're trying to make is baseless.

[1] https://twitter.com/JustinFrankel/status/1582430125941198848


> Every audio engineer is also a musician.

Absolutely untrue. Since you used the word "every", I only need one example to disprove this. Geoff Emerick, the engineer for the Beatles' "late" albums, was not a musician. I could name dozens more, spread across decades. Susan Rogers, Prince's audio engineer: not a musician. Chris Lord-Alge ... not a musician. This is just so wrong.

> Every professional musician understands the basic concepts of audio production.

I know hundreds of professional musicians. Most of them know almost nothing about audio engineering other than a few buzzwords.

DAWs used to mimic mixing consoles, but increasingly do not (because their functionality has expanded into new realms not touched by mixing consoles). Plugins used to mimic hardware units, but increasingly do not (because (a) skeuomorphism comes and goes as a fashion statement (b) they do things never implemented in hardware).

Very, very few classical musicians interact with an EQ or reverb unit on a regular basis. Very few drummers ever use stomp boxes or EQ. Very few singers have any knowledge about mic or preamp technology.

Then there's this little chestnut:

> Every DAW has been developed by audio and software engineers who are also musicians

You don't appear to be aware of the fact that I am a DAW developer, and over the last 22+ years of being in the field have gotten to know (a little) the other people that you refer to. You're just wrong about this. Sure, most of the companies have audio engineers and musicians on staff, but most of the actual coders are not musicians.

Justin is probably one of the exceptions to the rule, although even he concedes that (a) he isn't a very good musician (b) he doesn't know that much about audio engineering. You can hear him say this on the 2.5 chat we had at http://adc.equalarea.com/2022/02/07/adc1/

I have no idea what I said that made you believe I was suggesting that musicians should not use DAWs. My point was that it is very difficult to design tools that work well for both musicians and audio engineers (unless they happen to be the same person), and that when you design one that works well for musicians, there's a tendency for it to experience pressure to become more "engineer-y".


This is really all that needs to be quoted to show what nonsense you're trying to pull:

> there's some widespread belief that a DAW should be a tool for musicians [...] I have no idea what I said that made you believe I was suggesting that musicians should not use DAWs.

And the rest of your comment is more holier-than-thou nonsense, mostly baseless and not accurate to any reality that I've ever heard of, much less experienced.

> DAWs used to mimic mixing consoles, but increasingly do not

Except all of the buttons and faders and everything else still look the same. You're completely making things up, and even your made up things don't prove your point. No other DAW developer or audio engineer in the world would back up your claim that DAWs aren't meant to be used by musicians.

> I know hundreds of professional musicians. Most of them know almost nothing about audio engineering other than a few buzzwords.

I've met thousands of musicians in my life, and 90%+ of them understand the basics of audio production. The musicians you know can't be very professional if they haven't ever encountered a situation where they learned anything about audio.

> You don't appear to be aware of the fact that I am a DAW developer

Because apparently my work in the field is irrelevant and I couldn't possibly know anything, right? Every company developing DAWs is primarily engineered by musicians. Just because other non-musical engineers get involved, doesn't make my statement any less factual. There are other aspects to software development (even in DAWs) that don't have anything to do with audio. As "someone in the field," you should know that.


Hi. Musician/producer/audio engineer here.

I've re-read Paul's initial post three times now and I'm still not seeing how you're interpreting the message of his post as "DAWs aren't meant to be used by musicians".

All he's said is that introductory products in the domain like Garageband (or perhaps Ableton's new app) initially start out as stepping stones for musicians who don't have an interest in the engineering aspect of things and that as the musician gets drawn into the discipline of audio engineering those apps become insufficient for their needs and they end up being drawn into full-featured DAWs.

He was simply highlighting an interesting problem that many musicians encounter as they dip their toes into the water of recording their music for the first time.

That aside, he very clearly refuted your three assertions:

1. That ALL engineers are musicians: His examples are correct (Though I believe CLA may be a drummer, if not a great one.) There are still plenty of other examples to draw on.

2. ALL musicians have a grasp of music production: His example of classical musicians is spot on.

3. EVERY DAW is developed by programmers who are musicians: I can't speak on this, but since he's a DAW developer, he'd sure as hell know a thing or two about that.

I don't know where you're getting this "holier-than-thou" attitude from. He's just having a conversation. There would be far less cause to refute your points if you didn't speak in so many absolutes ("All", "every").

Take a deep breath, man.


> 1. That ALL engineers are musicians: His examples are correct

No, they're actually not correct. Two of the examples he provided as "non-musician" engineers had tons of experience producing electronic music, and for the other he asserted that the engineer wasn't a musician with no actual knowledge of the person.

> ALL musicians have a grasp of music production: His example of classical musicians is spot on.

As I already explained, classical musicians are the literal only exception in the world of music, as greater than 99% of the work in their lifetime has nothing to do with recording. You don't get to cherry-pick the exception and use it as a universal statement.

> EVERY DAW is developed by programmers who are musicians: I can't speak on this, but since he's a DAW developer, he'd sure as hell know a thing or two about that.

Again, you're reinforcing his holier-than-thou nonsense while also pretending you don't see it. He is trying to assert himself as having superior knowledge of the subject because he has worked on a DAW. It doesn't matter to either of you that I've worked as a coder on two separate DAWs in my life. So, no, he doesn't magically know more than other people, especially when he's making claims that are completely false and inaccurate to reality.

At the end of the day, he believes that DAWs were never designed to be used by musicians and that it's a mistake to design them for musicians. It can't be interpreted any other way, because it's literally what he said.


You seem to have a problem understanding what I mean, so I will try clarify:

> there's some widespread belief that a DAW should be a tool for musicians

This means "usable by musicians who find themselves intimidated or confused by the interface on a modern full-service DAW such as ProTools or Logic". This means tools like those appearing for mobile devices that are just basically easy to use recording devices. You can see this plea daily on KVR, Gearspace and other similar places.

This does NOT mean: "musicians should not use DAWs".

> The musicians you know can't be very professional if they haven't ever encountered a situation where they learned anything about audio.

Classical performers: no need to learn about audio. Folk performers: no need to learn about audio. Live acoustic music performers generally: no need to learn about audio.

The set of musicians I was referring to includes two Grammy winners, who, just like many amateur and other professional musicians, have chosen to focus their attention on their music rather than the process of recording (or PA'ing) it.

> Except all of the buttons and faders and everything else still look the same.

Where was the last mixing console you saw that did stretch-to-fit-tempo? That ran arbitrary plugins? That allowed arbitrary anywhere-to-anywhere routing? That needed, somehow, to fit in information about editing state alongside the mixer interface? That could do clip launching? That had any editing component at all?

I work with one of the older mixing console companies, and we're constantly bumping against the boundaries of stuff that DAWs do that consoles do not, things that consoles do better than DAWs ever have, things that consoles do that DAWs do not, and so forth. I would say this sort of thing comes up in almost every weekly meeting. The same was true back in 2008/2009 when I worked with (for?) another old mixing console company and they were trying to understand how to reconcile their established products with the reality of DAW-based production.

> Every company developing DAWs is primarily engineered by musicians.

I could name several major DAWs whose lead and sub-lead developers are not musicians (or at least, do not consider themselves to be musicians). I could name a few others where the lead and sub-lead developers were not musicians or audio engineers when they began working on their software, but have become so over time.

I don't presume to know your background, and I certainly did not say that you "couldn't possibly know anything". I just said that several things in your post are factually incorrect.


> there's some widespread belief that a DAW should be a tool for musicians [...] This means "usable by musicians who find themselves intimidated or confused by the interface [...]

We all understood what you said and meant, and no amount of deflection changes it.

> Classical performers: no need to learn about audio. Folk performers: no need to learn about audio. Live acoustic music performers generally: no need to learn about audio.

Really? You're going to cherry pick classical performers who largely never directly record anything? The only exception in all of music?

Your other two examples, essentially both being folk musicians, are completely wrong and you have zero basis for your assertion, which is apparently that folk musicians don't have any interest in sounding good. As a manager of folk musicians, I can tell you with 100% confidence that you are speaking from a place of complete ignorance.

> Where was the last mixing console you saw that did stretch-to-fit-tempo?

Just because features specific to a digital domain exist, doesn't mean that the DAW interfaces aren't based on real world interfaces. I can't even imagine what you're trying to prove, but it has nothing to do with my statement.

All-in-all, you're doubling down on things that are far from reality, and trying to move the goal posts with every comment. Just admit that you said something completely incorrect and let's all move on with our lives.


>We all understood what you said and meant, and no amount of deflection changes it.

I beg to differ. I understood what the parent said, but I agree with him and disagree with your take.

You also come off as rude. And, in my experience with musicians, wrong in making those general statements. Many pro musicians don't know about DAWs, and are too intimidated to even use one.

If your experience is mostly with modern pop/electronic/hip-hop etc. musicians, of course they'll know about DAWs. So will someone playing keyboards in bands. And of course if you are a dev and have dev friends who play guitars and keys and such, they'll also know DAWs.

But there are many, many genres outside that, where conventional instruments rule, with many pro musicians, or musicians who are not techy. Some even take pride in not dealing with computers and DAWs, whereas others would like to, but find them intimidating.

And I'm not talking about 50-year-olds here. I recently attended a music seminar with 20 or so other musicians, mostly 30 and below, with many in their early 20s, and people playing instruments like cello, trumpet, etc. could barely use basic external effects units, didn't know what things like send/return are, and were totally lost using a DAW.

>Your other two examples, essentially both being folk musicians, are completely wrong and you have zero basis for your assertion, which is apparently that folk musicians don't have any interest in sounding good.

This is not only wrong, but a bad faith strawman...


If you find what I've said more rude than the person making several false claims about the history of DAWs and music, and declaring that folk and acoustic musicians don't care about how they sound -- among other complete nonsense -- that says more about your ethical priorities than it says anything about me.

> If your experience is mostly with modern pop/electronic/hip-hop etc mucisians, of course they'll know about DAWs. [...]

These three paragraphs you wrote are baseless, and you're making massively generalized statements (that are overwhelmingly untrue) while claiming that I'm wrong to be making generalized statements (even though they're overwhelmingly true). Your hypocrisy is staggering.

> This is not only wrong, but a bad faith strawman...

You apparently misread this completely, because your response doesn't make sense. The parent literally said that folk and acoustic performers don't have any need to learn about audio, which is nonsensical and untrue of greater than 90% of the folk musicians I've met and worked with (which is many hundreds). If you are trying to back up their claim that an entire swath of a million musicians don't care about how they sound, then your arguments are being made in even worse faith than the parent's.

Given that the point of my original comment was that DAW development has always had music production in mind -- and not claiming that absolutely 100% of everyone knows 100% of everything about music and engineering -- you're strictly making bad faith arguments by trying to nitpick semantics.


> We all understood what you said and meant, and no amount of deflection changes it.

On the contrary, I think you've fully misinterpreted what's been said.


> Really? You're going to cherry pick classical performers who largely never directly record anything? The only exception in all of music?

I’m sorry, but this sounds like you just dismissed classical performers as insignificant musicians because they don’t fit your categorization that “all musicians also do audio engineering”.

The world is wide; blanket claims like this are bound to have exceptions, on principle.


I said nothing of the sort. In fact, the parent commenter is the one claiming that musicians are insignificant. You've misread everything and are putting words in my mouth. Please do better in the future.


"I think it is to Live what GarageBand for iOS is to Logic"

That's what I was thinking, and I think in both cases this capture/export pipeline for musical ideas is a brilliant use of mobile technology.


There are a ton of music apps on iOS, including GarageBand, which comes with it.

https://www.reasonstudios.com/mobile-apps

https://ampifymusic.com/groovebox/

https://www.bandlab.com/

I'm too lazy to look up more but I feel I've seen more than 20.

Anything special about one over another?


Koala is a fun one for making sample based beats, it is inspired by the Roland SP404 which is a legendary sampler used by e.g. J Dilla.

Drambo is my personal favourite, an incredibly powerful semi-modular groovebox somewhat inspired by Elektron’s sequencer, but with infinite possibilities for sound design (especially when combined with AUv3 plugins) and a great UI. Not as beginner friendly though.


I'm also aware of https://www.flipsampler.com by the YouTuber and electronic music producer Andrew Huang. I've had good feedback on this app from my musically talented friends. I can't provide any judgement in this area myself: while I am an ardent music fan and a keen observer of the music production mastery of others, I have not been blessed with the necessary skills, or the propensity for developing them, unfortunately.


I've used several iOS apps, and flipsampler is one of the best ones I've tried; quite intuitive.

I didn't expect it, as many times such apps from vloggers and celebrity musicians are just BS endorsement deals for mediocre cookie-cutter stuff.

>while I am an ardent music fan and a keen observer of music production mastery of others, I have not been blessed with the necessary skills or propensity for developing them myself, unfortunately.

If you want to give it a try, Ableton themselves offer a nice tutorial:

https://learningmusic.ableton.com/?pk_vid=7fc48d891c93e86616...


Andrew Huang is really interesting. He does a good line in bubble gum pop type stuff as well as radically experimental synth stuff.


For me, the most inspiring music app on iOS is Endlesss (http://endlesss.fm) because it allows me to share ideas with my fellow music-makers seamlessly .. hours and hours of jams have gone on and on because of the easy to use interface and very well integrated sharing of clips between users ..


There are many things special about one over another, as most have different features and strong/weak points.

One thing particularly special about this one is its easy integration with Ableton Live, the DAW of choice for many/most electronic music producers: automatic sync, access to a subset of the sound library, the same (but stripped-down) workflow, and compatible project files (from Note to desktop Live).


My personal favorite is Auxy. Has a subscription, but there's a ton of free instruments included so the subscription isn't mandatory. Very easy interface but can create complicated compositions. Can export to Ableton, MIDI, WAV and others. It's easily my most used app outside of Safari.

https://apps.apple.com/us/app/auxy-studio/id1034348186

I think the basic interface of Auxy would pass the grandparent test. Some of the knobs might be a little confusing. But selecting an instrument and plotting out the loops is dead easy.


Lots of really good suggestions here. To add to this, I recently discovered https://www.songen.app. I think you have to pay for the export functionality, but just playing around with the parameters and genres gives a nice starting point whenever I feel stuck.


Ah, BandLab is very nice. Thx for the suggestion!


For a more lo-fi experience, there's the excellent https://nanoloop.com/

Works on:

- iOS

- Android

- Game Boy

- Game Boy Advance


Love it, but the latency with a Bluetooth headset is a sore point. Probably an inevitability of iOS...


Latency is inherent in Bluetooth; for music production you're pretty much tied to wired headphones (or expensive live setups). AFAIK, AIAIAI makes wireless headphones for music production that use their own dongle rather than BT.


I can’t believe it doesn’t support manual MIDI entry (as far as I can tell); seems like a huge miss. This is a well-understood limitation for apps in this space (see Auxy), and people increasingly have only wireless headphones.


well, those people will have to put up with Bluetooth latency, or use wired headphones on the Lightning port. That's not up to the app (or even iOS itself). It's not meant for the general public that just wants to use Bluetooth headphones. Musicians working with DAWs and music apps, mobile or not, will have wired headphones too.


I am such a musician, and have been looking for years for an app I can use to capture a spontaneous idea on the go and then pick it up later at my home DAW. The ability to manually enter musical notation is essential here, because it allows me to use an app even when I don’t have my wired headphones (and when I’m out and about living my life, I don’t carry wired headphones). The closest I’ve found is Auxy, but it can’t do a truly native import to Ableton.

This app misses so badly because, by design, it makes such manual entry impossible. It actually hides the musical grid, and tries to focus on snippets of musical performance that are automatically (and invisibly) tempo-analyzed and locked. This means that with wireless headphones the app is actually completely useless, instead of merely impaired.

Unless I’m missing something?


you're not.

they'll probably add this over time - as well as audio recording, for adding vocals etc.

but for now it's aimed at quick sketches and getting down some ideas as "finger-playing" jams, not at being a lite MIDI sequencer.


It's not about iOS, it's about Bluetooth.

In iOS you can still use regular headphones with the adapters.


To be fair, it is kinda about iOS. There are lower-latency codecs than AAC available, but you cannot install them since Apple doesn't allow it.


AAC is an audio encoding codec. It's not about Bluetooth latency - just audio quality.


FL Studio Mobile has been around for 11 years at this point. Specifically as a launchpad for ideas which are then finished in FL Studio.

Not the best place for this... but I just can't help but sigh at the near shunning of FL as a major DAW player by the industry at large. The number of third-party VST houses that list support for all major DAWs and consistently leave out FL is just laughable at this point. How can one of the most forward-thinking DAWs, with some of the most industry-leading features, and year on year reported the most popular DAW by market share, continue to be ignored in this way?

I understand a common excuse is "FL Studio is highly pirated". So what? Adobe and scores of other software is...


I love Ableton. They simply offer the best workflow. I've used it for more than 10 years, and I can reproduce almost every sound I hear in a song and quickly compose anything that comes to mind. I also bought Logic but it just didn't click for me.


All I want is a remote transport with the ability to place my start markers. The ability to create tracks, arm them and assign inputs would be secondary but awesome.

I need to be able to sit at my drums and control basic recording / playback features in ableton.


You can already use your phone as a virtual transport with several different apps, but I do agree: one would expect this kind of feature from a companion app.


it's not a mobile Live controller app - that's where you'd expect to see a virtual transport.

it's a mobile Live sketchpad app.


I had hoped to use it to draft ideas on the go, but I don't play an instrument, so I lack the timing practice to use it the way it's made to be used (live). I really need the note editor and sequencer from the big desktop program to be productive in it. However, it's fun to poke around in, and I'm sure they'll add missing features over time.


A nice work around is using the note repeat feature, and then moving those notes around.


This looks so promising, but there's no volume control and it seems to bypass the system volume. I didn't even know that was possible.


I just purchased it and system volume control works fine.


> there's no volume control

Took me a minute to find it, but it's there.

On the set screen, there's a levels icon at the bottom that lets you control channel volume.


The app also works great on iPad, btw. I think Ableton missed a chance to advertise that, since there’s now iOS and iPadOS.


I think the selling point of this app is that this is the only way you can start an idea on the go and then easily bring that into Ableton Live. Don't know if it works the other way around too but I'd be surprised. But it's interesting to see Ableton dipping their toe in mobile water.


I don't think it'll work the other way around, much like the GarageBand for iOS to Logic Pro X conversion is only one way. It'd really only work in cases where you're only working with audio and not software instruments.


Wouldn't be impossible as long as the same soft synths are available on both devices, which in this case, they seem to be.

Case in point: Maschine+ and the desktop software take a similar approach, where you can create songs on Maschine+ and later transfer them to the desktop application. If you just tune some things, without adding soft synths/effects/VSTs not available on the standalone device, you can also transfer them back to the Maschine+ to continue editing there.


Maschine is different because it is for all intents and purposes the same application.

This does not have the same capabilities. Live just added support for AUv3 but generally third party Audio Units are not cross compatible and Note doesn’t even support those (yet?).

It does seem like Note could serve as a framework for porting Live to iPadOS with its touch oriented interface, but I do not think you could send it back and forth between the two without issue. Ableton doesn’t even let you load files saved for beta versions into regular Live, even if you’ve done nothing to the file.


I don't know what the motivation is for making it iOS only. It would make more sense to choose the web platform, like another of their products, Learning Synths (https://learningsynths.ableton.com/).


iOS has a significantly better audio stack than Android that works consistently between iPhones/iPads and Apple actually supports music production as a use case. Android has had issues with its audio stack forever. One significant issue that's plagued it has been its audio latency, which is pretty important when it comes to building music apps. It's become less of an issue now but it's a massive crapshoot.

Maybe they could target the web, but it's pretty clear that iOS is a superior platform to the web for music production apps as well.
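To put rough numbers on why latency matters here: per-buffer output latency is simply the audio callback buffer size divided by the sample rate. A quick illustrative sketch (the buffer sizes below are hypothetical round numbers, not measured values for any particular device):

```javascript
// Rough per-buffer latency in milliseconds: frames in the callback
// buffer divided by the sample rate. Real end-to-end latency adds
// driver and hardware overhead on top of this.
function bufferLatencyMs(frames, sampleRate) {
  return (frames / sampleRate) * 1000;
}

console.log(bufferLatencyMs(256, 48000).toFixed(1));  // ~5.3 ms: tight enough for live playing
console.log(bufferLatencyMs(2048, 48000).toFixed(1)); // ~42.7 ms: clearly audible lag
```

This is why a platform that can reliably deliver small buffers (as iOS does) feels playable, while one that forces large buffers on many devices does not.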


Was going to say the same: a couple of years back I led a team shipping an audio app on iOS and Android (pretty simple in terms of audio demands), and Android was a nightmare. Different devices vary so much in audio performance due to drivers, bugs and performance optimisations (e.g. Samsungs would ramp the CPU down to minimum frequency if you weren’t actively touching the screen) that testing is a huge undertaking if you want to cover a wide range of devices. Latency was a big problem, and the Android audio stack itself had plenty of issues. The situation did seem to be improving last time I looked, though.

Then on top of that you’d have the complexity of shipping a cross-platform app, either building large parts of it twice or using something like Flutter or React Native.

It must be annoying for Android users but the reality is that shipping an audio app for Android is (or was a couple of years ago at least) a huge challenge and probably much harder to make a compelling business case for.

WebAudio has similar issues in terms of wildly varying performance across devices (it used to be pretty poor on iOS for example, though I think this has improved) and relative to native, you’re pretty limited in terms of processing power. Again this is improving all the time, but I’d guess Note has a cut down version of Ableton’s synth engines and effects running rather than remaking them for the app, and running something of this complexity with WebAudio is likely to be very challenging if not impossible, especially on mobile.


Those are some great insights, thank you for sharing.

Android notably does have some good apps for making music on it. There's a version of Cubasis on Android, Koala Sampler is on Android, Google had Teenage Engineering build a music app for Pixels, and I'm sure there's many other apps that have found a way to make it work, though I know for a fact that Koala Sampler has had issues with input latency due to the audio stack on certain Android Devices (where it becomes unplayable).

Apple's done a lot of work to make it a viable workflow on iOS/iPadOS. inter-app audio, AUv3, actual support for tablets, and a good audio stack are just some of the things they've done to make it a real platform for musicians. Google would have to put in a lot of work to make Android at all attractive to musicians and frankly I don't even know if it's worth their time. Still, they can do it and they've done a fairly good job of improving the state of Android for visual arts on tablets, so it's not impossible. Just a lot harder.


Yeah, I don’t see google ever having the incentive. Apple can onboard people with the far reach of iOS, and pull them into the Mac later on once they are hooked on audio products for iOS. With Android…you are a bit stuck as I don’t see ChromeOS becoming a center for audio production.


I have no preference between Apple and Google; I use a Mac and an Android phone. But the Chrome team has put a lot of effort into Web Audio, especially the latest AudioWorklet API and WASM support for GC-free real-time audio. Yet on Safari, audio is way worse. Who knows the reason?


Yeah that is a good point that it is possible to ship a pretty good audio app, but it requires a lot of work (especially testing) or you restrict it to eg. just Pixels. I can imagine Ableton surveying their user base, finding them quite Apple-centric and mainly iOS users (just a guess!) and deciding it wasn’t worth the potentially huge amount of extra work to support Android, at least for v1.

Google did introduce Oboe a couple of years ago which was a new, much improved audio API, so they have put in work there. But yeah, there’s nothing like AUv3 as far as I know and the whole ecosystem must be so far behind what exists on iOS that it would be hard to justify many devs putting in the work. But it is possible to ship cool stuff with work for sure!


That comment about Samsung ramping down CPU activity makes me want to puke. This is why we can’t have nice things.


Ha! Yeah it was one of the more frustrating issues. At the time there was no way to opt out of it either, but I believe they subsequently issued a firmware update addressing it somehow.


It's also been reported that Apple uses software to make old iPhones slower.


It only happens when the battery is aging and the iPhone would otherwise shut down because it cannot handle the voltage peaks. Since those reports, Apple now has a battery health section in Settings which actively warns you when you need to replace the battery. For example, my father just replaced the battery in my old 2015 6S Plus and it runs at full speed again.


It's a shame they didn't tell anyone about it, maybe they could have avoided that class-action lawsuit if they were a bit more transparent.


I should point out that the Samsung CPU speed thing was designed to save battery power, the assumption presumably being that you only need full power when playing a game, and that when playing a game you’re touching the screen. Well intentioned, but not much good for our app, which integrated with a Bluetooth MIDI keyboard so users didn’t touch the screen at all!


Real-time, GC-free Web Audio is possible with WASM: https://glicol.org/

And you're right, iOS is still the worst for Web Audio. But if I have to make a choice, I choose Web Audio for the cross-platform support.


Yeah, AudioWorklet and WASM have opened up so many possibilities, but the performance still can’t quite compare to native as far as I know. I guess the thing here is that Ableton presumably weren’t willing to compromise on audio quality if they wanted to use their desktop sound engines, so they probably wouldn’t have been able to make WebAudio in its current incarnation (on iOS at least) work well enough for high track counts.
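For readers unfamiliar with the pattern being discussed: an AudioWorklet runs your render callback on the audio thread, and the "GC-free" part means preallocating all state so that callback never allocates. A minimal sketch (the processor name and sine kernel are made up for illustration; the DSP kernel is a plain function so it can also run outside the browser):

```javascript
// Fill `out` with a sine wave; returns the updated phase so the caller
// keeps the only piece of mutable state and the hot loop never allocates.
function renderSine(out, phase, freq, sampleRate) {
  const inc = (2 * Math.PI * freq) / sampleRate;
  for (let i = 0; i < out.length; i++) {
    out[i] = Math.sin(phase);
    phase += inc;
  }
  return phase % (2 * Math.PI);
}

// Only register the processor when running inside an AudioWorkletGlobalScope
// (the guard lets this same file load in Node for testing).
if (typeof AudioWorkletProcessor !== 'undefined') {
  class SineProcessor extends AudioWorkletProcessor {
    constructor() {
      super();
      this.phase = 0; // preallocated state: no per-render allocations, no GC pauses
    }
    process(inputs, outputs) {
      // `sampleRate` is a global in the worklet scope.
      this.phase = renderSine(outputs[0][0], this.phase, 440, sampleRate);
      return true; // keep the node alive
    }
  }
  registerProcessor('sine-processor', SineProcessor);
}
```

In a real app the `renderSine` body is where a WASM module would be called, which is how projects like Glicol keep the render path free of JS garbage collection.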


Super low latency makes native iOS apps popular for music.


Kinda mitigated when your best option for headphones is Bluetooth, though.


This is actually one instance where a native iOS app makes a lot more sense.


You don't need to fight for "good enough" RT performance on iOS.


you may like this presentation: https://www.youtube.com/watch?v=8vOf_fDtur4


What's the easiest to "play" with Ableton, but with software only?



If only FL Studio Mobile exported as an Ableton desktop project file..


Seems like a cheap ripoff of Endlesss, which is far more mature and allows very easy live jamming without fuss:

http://endlesss.fm/


> Seems like a cheap ripoff of Endlesss, which is far more mature and allows very easy live jamming without fuss:

The live jamming thing seems cool, though it's been done before. The NFT and "web 3" nonsense turns me off of it instantly, though.

Ableton Note is more like a notepad for music making that you can easily export to Ableton Live to build out the track, much like how GarageBand for iOS serves as a notepad for Logic Pro X users.


Endlesss allows you to do the same - build out sketches and transfer them to your DAW of choice for later production.

Didn't notice the NFT and Web3 stuff, but I've been an Endlesss user since the beginning and only skip straight to the "Jam" page in the app, ignoring everything else ..


For me, it feels a bit disingenuous to call this a cheap knockoff of a specific 2yo app, when clip-launch music apps have been on phones for over a decade. The Endlesss UI, while quite refined, isn't exceptionally divergent from prior art... It's like calling Nova a cheap knockoff of VS Code.

And given that Live has played a large role in popularizing the clip-launch style of music production for more than 2 decades, this kind of app feels like a very natural extension of what they want to do.

This did get me kinda going down a rabbit hole though to remember the ui of old ios music-making apps, though... This timeline was kind of a fun memory lane: https://www.jakobhaq.com/log/2021/10/23/sources-evolution-of...


Well, we could go back to Sonic Foundry Acid, pre-Ableton, and even further as necessary .. what I refer to though, is the iOS User Experience, which is shockingly similar for an app built a year or two after Endlesss had already established primacy as a clip-based collaborative music-making app.

Great link though, thanks for sharing that! Very definitely a deep rabbit hole for music technology boffins to inspect ..


The point above is well-made. Endlesss can’t claim IP of clip launching interfaces and to say Ableton has copied Endlesss by making a clip launch interface seems almost satirical given Live has been around forever and almost certainly inspired Endlesss UI!

Also I don’t remember seeing a clip launch interface in SF Acid. It just did neat warping and looping, which was awesome … and thanks for reminding me about all my (very dodgy!) uploads to acidplanet.com :)


Looks interesting. Also I noticed they have a "connect wallet" button for NEAR. How are they using that?


I don't know, I've never used any of the wallet stuff in Endlesss .. for me its just a great way to work on song ideas with my friends while we're all out and about in the world and not sharing a studio session together.


Looks really cool and packed with features


hmmmhmm, nice... is there something similar for Android devices?


https://glicol.org/

Not similar but for music making as well.


Wicked! Thx!



