Hacker News
3.5mm audio vs. USB Type-C: the good, bad and the future (androidauthority.com)
50 points by jswny on July 13, 2016 | 70 comments



The notion of removing the headphone jack from flagship phones is infuriating and ludicrous to me. When I first read about it, I couldn't help but think 'haha, nice joke!'

There are so many downsides to removing the jack, specifically around connections to other devices and systems - hifi systems, car audio, external microphones, and yes, even selfie sticks.

I would consider myself an audio enthusiast (Westone W60 headphones for daily use), and I don't relish the thought of needing either a) a bulky and inconvenient adaptor with an external DAC, b) a new headphone cable with a DAC built in, probably at considerable expense, or c) a switch to a new handset manufacturer that hasn't lost its mind.

The people who want external DACs can already make use of the USB or lightning connector while the rest of us quite happily make do with the convenience of the 3.5mm jack.

Apologies for the somewhat ranty nature of this post; it just frustrates me to see functionality taken away for no good reason. Phones are already too thin, and I constantly have issues gripping my iPhone 6 - the thicker iPhone 5 felt much more comfortable in my hand and was far easier to grip, with its square bevelled edges. We're heading in the wrong direction on thickness at the moment - I'd rather have extra battery life from a larger battery than a phone that's 0.6mm thinner.


> The people who want external DACs can already make use of the USB or lightning connector while the rest of us quite happily make do with the convenience of the 3.5mm jack.

That's what gets me the most. Removing the jack simply takes away a choice and doesn't introduce anything new in its place. So frustrating.


Consider d): phones necessarily require DACs for their speakers, so they'll always be able to dump analog signals onto the USB-C pins. All that's needed is an agreement about what to expect from the connector when analogue audio is playing, so equipment expecting something else doesn't get fried; best case, all those pins could even be used for balanced 5.1 audio (there are 6 positive-negative pairs free for alternate use, according to Wikipedia). So the dreaded adapter, while still unappealing, can be as simple as connecting pins to the correct 3.5mm pins (although if USB requires some handshake to get to that point, it would only take a small finite state machine to achieve; I couldn't say what the impact of that would be on adapter size).

Edit: it occurs to me that I have no idea how USB signal impedance compares to line level, so this could all be hogwash.
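The pin-routing idea above can be sketched concretely. The USB Type-C spec's optional Audio Adapter Accessory Mode does carry analogue audio over the D+/D- and SBU pins; the exact channel assignments below, and the CTIA-style TRRS wiring, are my assumptions for illustration only - check the USB-IF spec before wiring anything.

```python
# Hypothetical mapping of Type-C pins (Audio Adapter Accessory Mode) to the
# contacts of a 3.5mm TRRS plug. Assignments are illustrative assumptions.
TYPE_C_AUDIO_PINS = {
    "A6/B6 (D+)": "Right audio channel",
    "A7/B7 (D-)": "Left audio channel",
    "A8 (SBU1)":  "Microphone",
    "B8 (SBU2)":  "Analog ground (AGND)",
}

# CTIA-style TRRS wiring assumed (tip = left, sleeve = mic).
TRRS_CONTACTS = {
    "Tip":    "Left audio channel",
    "Ring 1": "Right audio channel",
    "Ring 2": "Analog ground (AGND)",
    "Sleeve": "Microphone",
}

def adapter_routing():
    """Map each Type-C pin to the TRRS contact carrying the same signal."""
    signal_to_contact = {signal: contact for contact, signal in TRRS_CONTACTS.items()}
    return {pin: signal_to_contact[signal]
            for pin, signal in TYPE_C_AUDIO_PINS.items()}

if __name__ == "__main__":
    for pin, contact in adapter_routing().items():
        print(f"{pin:12s} -> {contact}")
```

Under these assumptions a passive adapter is literally just four wires plus the negotiation needed to enter the mode, which matches the "small finite state machine" point above.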


You can choose whatever impedance you want, since you're going to have to isolate the analog signal from the digital USB in your solution anyway. Or the USB-C to 3.5mm adapter can have a matching circuit in it made up of cheap passive components. Not really a big issue.

More importantly, is this an actual solution being pursued? It sounds good, but I can see this getting out of hand, with each manufacturer having a different standard. Or there could be an "analog audio over USB-C" standard that codifies which pins get which channel.


While this is part of the standard, it is not required for USB Type-C implementations.


The article raises the spectre of DAC-less phones to cut costs, but I don't see that ever happening unless the microphone and speaker are removed too. So not implementing it, at least for the foreseeable future where everything still has 3.5mm sockets, seems foolhardy. Maybe we'll have worse DACs? In my experience, the difference between an acceptable DAC and a great DAC is not one I can notice with my ears.


Whether it's implemented or not will depend not on whether the phone has a DAC, but on whether the USB chipset supports it. If the USB chipset doesn't support it, the phone could have the best DAC in the world and still not provide analog audio over USB-C.


A lot really hinges on what a USB-C to jack adaptor will look like. If it's not bulky and inconvenient (or expensive), none of your listed concerns apply.

In the meantime, the 3.5mm jack does take up a non-trivial volume in the handset.


A USB-C to 3.5mm adapter will almost certainly be inconvenient. It has to plug into the USB-C port, which means it will protrude from the phone, and the 3.5mm plug will stick out past that. Integrating the jack into the phone was much better, since you didn't have the additional protrusion.


The biggest drawback for me: one more thing to keep track of with my phone.


If the adapter is in the form of a low profile short and stubby cable, the protrusion issue wouldn't be so bad.


The iPod touch is thinner than an iPhone and still has a 3.5mm jack, so thickness isn't a reason to remove it. The thickness of the iPhone 6S/Plus actually increased compared to the 6/Plus, to accommodate the (to me, gimmicky) 3D Touch display assembly; the size of the battery shrank, too. So if Apple really cared about thinness and battery life, they wouldn't have included the 3D Touch stuff, or will remove it in the next iPhone. I think controlling all aspects of the device (perhaps a DRM move?) is one reason, but I have yet to hear a compelling reason to remove the jack.

I suspect that if they were to remove it, they'll do it without trying to justify it at all; they'll present it as just the way it is. See the App Store search-results ad announcement, which is borderline tone-deaf, for one recent example of doing something that wholly benefits Apple and no one else.


A non-trivial volume? Seriously?

What a load of nonsense. What else of equivalent utility would be put there, a cure for cancer?

This is just a DRM play in the guise of making the phones sexier.


It's non-trivial if you include the added depth that affects the whole phone.


Oh golly gee a whole four millimeters! Whatever shall we do?

The "how thin can we make our phones" game is as stupid as the 3D TV game.


DRM play? What content is this protecting?


I'm glad that there is sound reasoning around this topic. Removing the 3.5mm jack is just a plain stupid idea that is going to anger users unnecessarily. It's a really bad design decision and a UX fail.


Remember when phones had IrDA ports and slide out keyboards?


I can't remember the last time I used a headphone jack. I have Bluetooth headphones, speakers, and radios, which I find much easier than a cable. Removing it, even if no alternative cable connection is offered, wouldn't bother me at all.


Next you're going to be telling us to get off your lawn.


I wonder how strong the Type-C connector is, especially to off-axis abuse. The 3.5mm is a pretty solid piece of metal, and the part that usually breaks is the solder connection from the internal jack to the circuit board. And this is easy to repair. I suspect a damaged Type-C jack will not only be unrepairable, it will leave you with a phone that can't be recharged (can't even be used as a phone anymore).


My experience with 3.5mm jacks has been the opposite: every failure has either left the tip, ring, and/or sleeve stuck in the jack, or broken solder joints on a multi-layered and typically unrepairable board. Or, in the case of laptops, a failed microswitch for optical output has rendered the jack likewise unusable.

Granted, Lightning and USB-C have their own failure modes, but the physical construction and durability of 3.5mm plugs and jacks are a good reason to abandon TRS connectors.


Yeah, USB/Lightning connectors in general were never really designed for tumbling around in my pocket/backpack while connected, the way my headphone jack has learnt to expect.


Weren’t they? Honest question!

They are both definitely recent enough that a use case like that must have been considered during their design.

I don't know whether it was considered, but you seem quite sure it wasn't, and I'd like to know where that certainty comes from.


> To be fair, the USB Type-C standard is capable of transmitting analogue audio through the interface’s Sideband Unit (SBU) pins.

To be equally fair, this is not a required part of the USB Type-C spec, so it's unlikely to be widely or properly adopted.

The whole thing strikes me as a net negative for the average consumer; those who want quality audio will simply continue to use high quality external DACs (assuming this isn't prevented by some convoluted DRM implementation).


Definitely user hostile. I have 4-5 Apple earbuds kicking around [in my office, in my car, some at home, etc.]. That's one of the "lock-ins" with my iPads and iPhones.

On the bright side, maybe phone manufacturers who don't treat users this way will be more competitive. Competition is good.


There have been a few articles on HN about this topic but this one has a little bit more detail on the sad state of available USB-C peripherals.


The iPhone doesn't play some hi-res formats, and I suspect new headphones will be even less compatible. Not that I fret about sound quality, but it is an inconvenience that you now need to downsample your music.

Regarding the adapter problem: all decent headphones come with a removable cable. You'll probably be able to buy an OEM replacement with a tiny DAC hidden in the cable.


It turns out that 44.1 kHz / 16 bit is sonically transparent, even in the most exacting environments. This has been verified over and over again in listening tests. So the good news is you can downsample to reasonable rates without losing any perceptible quality (although you might lose the placebo effect).
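The claim above can be sanity-checked with back-of-the-envelope arithmetic: 44.1 kHz sampling puts the Nyquist limit above the roughly 20 kHz ceiling of human hearing, and 16 bits give about 96 dB of dynamic range. (The 20 kHz figure is the usual rule of thumb, not a hard constant.)

```python
import math

sample_rate_hz = 44_100
bit_depth = 16
hearing_limit_hz = 20_000  # approximate upper bound of human hearing

# Highest frequency representable at this sample rate (Nyquist theorem)
nyquist_hz = sample_rate_hz / 2

# Dynamic range of uniform 16-bit quantization, in decibels
dynamic_range_db = 20 * math.log10(2 ** bit_depth)

print(f"Nyquist frequency: {nyquist_hz:.0f} Hz (hearing tops out ~{hearing_limit_hz} Hz)")
print(f"Dynamic range:     {dynamic_range_db:.1f} dB")
# -> Nyquist frequency: 22050 Hz (hearing tops out ~20000 Hz)
# -> Dynamic range:     96.3 dB
```

So the format's headroom sits above what ears can resolve in both dimensions, which is consistent with the listening-test results mentioned.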


However, for consumers, Apple has set the bar for audio quality at compressed AAC, which is arguably discernibly below CD quality. I'd be very happy if Apple went to CD quality, which very few people would need to downsample.

People with the (perceived?) need for higher resolution audio than CD seem like the sort of people who would probably want to hand-select their own DAC anyway, which should obviate downsampling their higher sample rate tracks.


You say it's arguably discernibly below CD quality. Can you make that argument? Last time I checked, most audio compression formats could achieve transparency at reasonable bitrates with all or nearly all source material. For example, AAC encoded with iTunes generally achieves near transparency at 128 kbit/s, and complete transparency by 192 kbit/s. Music you buy from iTunes is 256 kbit/s, which, honestly, is overkill.
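For scale, the raw CD PCM bitrate works out like this against the AAC rates mentioned above:

```python
# CD audio: 44.1 kHz, 16 bits per sample, 2 channels
sample_rate_hz = 44_100
bit_depth = 16
channels = 2

cd_bitrate_kbps = sample_rate_hz * bit_depth * channels / 1000  # 1411.2 kbit/s

for aac_kbps in (128, 192, 256):
    ratio = cd_bitrate_kbps / aac_kbps
    print(f"AAC {aac_kbps} kbit/s is ~{ratio:.1f}x smaller than CD PCM")
```

So even the 256 kbit/s iTunes rate discards over 80% of the raw bits, and the listening tests cited here suggest the discarded portion is not audible.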

Hydrogen Audio has done some of the more accessible tests, but there are other tests in more controlled conditions which you might prefer.


I've never seen a robust study, but I'd love to! I've seen a lot of anecdata and quasi-scientific experimentation, but most of those seem to be fatally flawed or to have a pretty obvious bias at the outset. Hence my couched language :-)


Here: http://wiki.hydrogenaud.io/index.php?title=Hydrogenaudio_Lis...

Read the methodology and the statistical analyses, they are reasonably thorough and rigorous (amateur, but definitely not "quasi-scientific").

Here's another: http://soundexpert.org/encoders-192-kbps

And another: http://soundexpert.org/encoders-256-kbps

You can see that the differences at 192 kbit/s AAC are generally beyond the capabilities of human perception, and at 256 kbit/s, AAC has a very healthy margin.

So the argument that AAC is perceptually inferior to CD quality has little basis according to the evidence.


I'm surprised the article didn't mention DRM, which is almost certainly one of the main reasons behind this push.


I'm not surprised, since PR material was probably the source of the information.


That doesn't make sense. If you're recording off the 3.5mm jack, then you can record the output of the external DAC just as well. Moving the DAC out of the phone doesn't make piracy harder.


DRM isn't about protection; it's about inconvenience.

Think about a streaming service that would play songs, but where some songs could only be decoded using Beats or Skullcandy headphones...


Or serves higher quality audio for specific manufacturers.


Yeah, I suppose that's possible, but it doesn't make sense for the artists, the publishers, or the streaming service, who make money by having people listen to their songs.


They already do this: they limit which streaming services have their content, and do timed/exclusive releases for specific services. Apple bought Beats for the app and the user base, and promotions like this already exist (some headphones come with $20 of iTunes credit). Now it's even easier: get Beats and get access to additional exclusive content, or, as someone else suggested, better-quality content. Seems like a good way of pushing people to pay $300 for $30 worth of hardware, or more precisely, of helping people justify their purchase, because their headphones can now actually sound better than even entry-level studio/recording headphones.

DRM can rarely be used as protection against mass copying, and pretty much never when you cannot control the analog output of the protected content; its only real use is to restrict how lawful consumers can access and use the content. External DACs have existed for Android devices for quite a while (BT/USB DACs), and even for iOS (you could connect a DAC to an iPhone/iPad since they switched to the Lightning cable). The streaming industry has almost hit a dead end with monetization; having a digital interface that can be controlled gives them a whole new avenue of value-added services, and while it's unlikely they'll jump on it immediately, I would bet my money on the likes of Spotify, Amazon Music, and Apple Music capitalizing on this within the next 2 years.


Does it make sense for an artist to make their music available to only one streaming service? Some would say no, and yet they still do it.


...No, they make money by people paying to listen to (or play) their songs.


The DAC will be hard wired into the headphones, there won't be a jack between the headphones and the DAC that you can plug into.

At the start, you'd need to modify headphones to be able to record the signal, but I doubt this will work in the long term. At some point it's highly likely the DACs will come with DRM so that only approved headphones work with a device.


From an EE perspective, headphone elements are merely small, very low-performance speakers.

The only "DRM" I can think of with USB-C is playing impedance games. Right now you could probably get away with soldering a 1K resistor and an RCA plug in place of the headphone element, and you'd have a usable signal; but if they played weird impedance games, you might need a simple opamp in there to adjust the levels back to normal.
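The "impedance games" above boil down to voltage-divider arithmetic: the source's output impedance and the load form a divider, so a mismatched load just scales the level. A minimal sketch (the impedance values below are illustrative, not measured from any real phone):

```python
import math

def divider_gain_db(z_out_ohms: float, load_ohms: float) -> float:
    """Voltage gain (in dB) of the source-impedance / load voltage divider."""
    gain = load_ohms / (load_ohms + z_out_ohms)
    return 20 * math.log10(gain)

# A 1K load (the resistor-in-place-of-driver trick) on a typical
# low-impedance headphone output barely attenuates the signal:
print(f"{divider_gain_db(z_out_ohms=2, load_ohms=1000):.2f} dB")  # -> -0.02 dB

# ...whereas driving 16-ohm earbuds from a hypothetical 1K source
# impedance would lose most of the level (roughly -36 dB):
print(f"{divider_gain_db(z_out_ohms=1000, load_ohms=16):.1f} dB")
```

This is why the commenter's 1K-resistor tap yields a usable signal, and why a deliberately weird source impedance would need active buffering (the opamp) to restore normal levels.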

Something that's annoying today: to get line-level audio out of the Bluetooth connection to my car stereo, I need to bypass a nanny warning about blowing my ears out, but only on my phone. Apparently "line level" on my BT adapter, which makes my BT volume about the same as, say, the radio or CD, would blow someone's ears out if connected to a BT earpiece or headphones. You can safely assume the people who screwed up something as incredibly simple as setting levels will successfully find a way to screw up USB-C headphone levels. So one brand will nanny-state and prevent anything louder than a church whisper, others will output different levels depending on what device they connect to; it'll be the predictable epic fail.

Speaking of my nice Bluetooth headphones with their couple hours of battery life: since buying them many years ago, I haven't plugged headphones into my phone or tablet. They're nice enough headphones. I'm just surprised the marketing push is to keep shoving audio through legacy wired cables instead of going full-on Bluetooth. I suppose there's a lot of money to be made in worn-out headphone cables. When I'm not listening to music but to speech instead (audio books, podcasts, etc.), I'll swap my Bluetooth headphones for my Bluetooth earpiece thingy.


> "The only "DRM" I can think of with USB-C is playing impedance games."

I didn't mean to imply DRM existed in USB-C today, what I was implying is that the move away from the 3.5mm jack to USB-C audio is at least partly driven by DRM (in the long run). It's like what happened with HDMI. HDMI is more or less DVI-D video + digital audio + DRM (HDCP). What was the point of adding in the DRM to HDMI? What risk was posed by unencrypted DVI video streams? In my opinion, similar arguments for adding DRM to HDMI will be given for adding DRM in future revisions of USB-C audio. Perhaps the phone manufacturers will also use the argument that dodgy USB-C cables can damage devices (due to the high power spec for USB-C). In any case, I'm fairly certain that those that seek to benefit from DRM will be looking to add it in.


Unless the DAC is very tightly integrated into the actual speaker (literally fused, so it's hard to connect to the coil), there always will be an analog loophole.

DRM is not about preventing this, though. It's about making ordinary consumers surrender to whatever media dictates, not preventing absolutely everyone from ripping the content.


Nothing prevents someone from making a DAC with a 3.5mm jack instead of a set of headphones. In fact, they already exist...


In a DRM scenario, there would be an audio equivalent to HDCP in place, so you'd need a piece of technology which it's illegal to clone and only available if you sign a restrictive licensing agreement.

Mind you, you'd still be able to cannibalise an existing set for this.


Given how "successful" HDCP has been, I'd give it a month or two of survivability. And yeah, as you mentioned, anyone with basic electronics skills (or, for that matter, anyone who can follow a YouTube video) could simply mod their hardware to plug in whatever they want.


This very much depends on the design of the headphones. Look at Bluetooth headphones: there are models with a transceiver built into the headphones, and separate transceivers you can plug whatever headphones you want into.

The same can be the case with type-C.


Why do you assume that's why phone manufacturers would want to switch to Type-C for audio?

Phones are getting slimmer, and the 3.5mm jack is the single thickest component in the phone. Removing it makes sense from this perspective.

Now, sure, you theoretically could use USB-C for DRM, but until someone actually implements this, I'll apply Occam's razor.

Consider that iTunes and the like have been selling DRM-free music for years now. And that, even though Blu-ray players require HDCP DRM in order to play back video, they don't require it for audio.


The 3.5mm jack is typically the only remaining unmediated "analog gap" on a phone.

Since there's no way of doing authentication signaling on it in a way that can be enforced without mutual cooperation, there's no way to limit access and/or monetize the audio experience any further than it is right now.

USB-C removes those obstacles. This only lives in the land of potentiality right now, but the path is pretty clear. Once you have accessories that can mutually authenticate, you can limit access in exchange for money. This is technically a "slippery slope" argument, but I think capitalism gives us ample evidence to show that it's neither an illogical nor improbable step.


> "the 3.5mm jack is the single thickest component in the phone. Removing it makes sense from this perspective."

Here's the problem with that perspective...

1. The 3.5mm jack isn't the thickest part of modern smartphones. To give one example, the camera module tends to be thicker.

2. As far as I've seen pretty much nobody is looking for phones to get any slimmer, especially as slimmer phones = smaller batteries.

If you can find me any comments online from other people stating that the current top of the line smartphones are too thick, I'll be very surprised.


The first iPhone wasn't too thick either, but it doesn't stop phone manufacturers from aiming for ever-slimmer phones.


That does not address my points. Find examples of people that are unsatisfied with the thickness of current phones and you'll have a much stronger argument.


Why do we need thinner phones? They aren't bricks right now.


I don't get it either. Thinner means lower structural integrity (remember "bendgate"), smaller batteries, and in this case lower functionality.

Phones are thin enough. Plus, people buy these thin phones and then slap thick $5 rubber covers on them; what is even the point anymore? If people really want a thin phone, they can ditch the accessories - phones are thin enough already.


Some people prefer thinner and lighter devices. Do we need them? No, but product decisions are not made based on needs alone.


I realize this is anecdotal, but I never hear[0] anyone say the iPhone needs to be thinner.

I hear it needs to lose the bezels.

I hear it needs a larger battery.

I never hear it needs to be thinner.

[0] Nor do I read about many people wanting it thinner.


The analog hole will always exist; it's just a matter of the tools being ubiquitous enough for it to be worth the effort.


Years ago I had a Windows Mobile phone that needed a bulky, USB dongle for audio out.

I'm excited to relive those days.


Pretty detailed and well written article.

This is good for the audio-conscious consumer, since it allows greater specialization via a custom DAC. It's likely a mixed bag for the average consumer, who will end up with lower-quality (but possibly cheaper) aftermarket DACs/headphones.


Not really, though. You could already plug a USB DAC into your phone - heck, I have one plugged in right now. It doesn't offer audiophiles anything, and it gives the average "what's the difference between a $20 DAC and a $100 DAC" consumer nothing either.


If it moves to digital audio, I can almost guarantee there will be calls for DRM'd audio output. This will result in a drastic loss of public choice in headphones. The copyright cartel has long hated any analog hole in a DRM scheme.


If only it didn't have to be analog in order for our ears to hear it - then they'd be all set.


Seems to me at the very least that there's a good case to be argued for potentially two USB-C sockets on the phone. I certainly wouldn't relish daisy chaining headphones to power supplies or similar.


What about FM radio? I know the iPhone doesn't support FM, but other phones do, and they use the 3.5mm jack as the antenna. Sure, they could include the antenna in the phone, but I'm not sure that's as practical as using the 3.5mm jack (please correct me if I'm wrong).
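The headphone-cable-as-antenna trick works because a quarter-wave antenna for the FM broadcast band is roughly the length of a headphone cord. A quick sketch of the arithmetic:

```python
SPEED_OF_LIGHT_M_S = 299_792_458

def quarter_wave_m(freq_mhz: float) -> float:
    """Quarter-wavelength in metres for a given frequency in MHz."""
    wavelength_m = SPEED_OF_LIGHT_M_S / (freq_mhz * 1e6)
    return wavelength_m / 4

# Lower edge, middle, and upper edge of the FM broadcast band
for f in (88.0, 100.0, 108.0):
    print(f"{f:5.1f} MHz -> quarter wave ~{quarter_wave_m(f):.2f} m")
# Headphone cords run ~1.2 m, close to the ~0.7-0.85 m quarter wave,
# which is why they make a workable (if untuned) antenna.
```

An internal antenna has to fit the same electrical length into a few centimetres of phone, which is why the cable-as-antenna approach is attractive.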


Why do we expect USB Type-C to be used as an audio jack? No one likes wires; this looks more like Apple moving us to wireless Bluetooth or some new format they've developed.


Looks like fixing an issue that does not exist.


Not going to happen.



