Often I'm sitting at work with my headphones plugged into my computer, listening to music. Then I'll see that a friend has sent a little video or something to my phone. I unplug the headphones from my MacBook, plug them into my phone, watch the video, then plug my headphones back into my MacBook. Simple.
Can you imagine what an annoying hassle it would be to have to pair and unpair the same Bluetooth headphones between two devices like that? I'd be paranoid that they're connected to the wrong device and suddenly I'm blasting music to the people in my office. And that's not even counting the fact that I'd now have another thing to worry about charging.
I like my Apple earbuds. They sound nice, let me control the volume and playback, and don't need to be charged.
Also, all my nice headphones are 3.5mm. I don't want to have to get new ones, and I don't want to have to carry around a stupid Lightning adapter for them.
Many companies have tried making slightly thicker phones with bigger batteries (which many people claim is what they want), and no one buys them. Companies make thinner phones, and lots of them get sold.
These companies (mostly) aren't idiots; they do a huge amount of measurement of what people say they want to buy versus what they actually buy.
I always claimed I wanted a QWERTY keyboard. Now BlackBerry finally has a phone with one, the Priv, and it's not selling.
Well, while I'd kill for a keyboard on Android, I'd also need a non-phablet size and open firmware. So I continue to use my keyboardless first-gen Moto X Developer Edition.
Nobody buys phones for one feature. You can't add a thick battery but sell a device with a 300x200 screen, which I think is the case you're talking about.
I found the Bold keyboard to be fantastic. I could type 60 wpm without looking at the phone.
I found the Droid keyboard to be terrible and unusable.
Not all keyboards are the same, but I do know the best-possible keyboard experience can be far better than the Droid (or any iPhone I've used).
Barebones BB Curve 9310 user here for ~4 years and counting; it replaced a Treo 755p when seemingly everyone had iPhones back in uni. Been meaning to purchase a spare for when this one inevitably hits the pooper. Battery obsolescence is my biggest sustainment worry. The QWERTY keyboard requirement has made me a perpetual mobile luddite.
Definitely feels like the last man standing now that our senior-most engineer's wife forced him off his flip phone this past Christmas. Personally don't know anyone who doesn't own a smartphone these days...lonely world.
But few want to pay more for leg room. It's just not much of a benefit compared to the five things I mentioned (and several others like comfortable lounges at hubs -- but not apparently decent food). The airlines want to sell leg room because it's very easy to adjust.
Not that it's on the way to many other countries...
Basically, the problem seems to be that the USA does not have "sterile transit"; if your plane stops there on the way to somewhere else, you have to disembark and go through customs, and you need an expensive visa.
There are no other flights directly between Asia and Latin America (unless you count the EK GRU-DXB flight), though NH may be flying to MEX soon.
On my most recent trip to the US I selected cities that I could get between via train to avoid the entire mess apart from arrival into the US (and made an effort to depart out of Canada).
In a wide variety of industries, a large portion of revenue comes from "upsells": pushing impulse-buy extras like insurance. Which is to say that the impulse-buy extras are shitty ripoffs, and consumers over time have grown to expect promises to be worthless and to only pay for things they can clearly verify.
And that relates to selling cellphones with longer battery life: all the companies make unrealistic claims, so how could someone feel safe giving up something they can see (like a feature or a smaller size) for something they can't see?
I think the airline market hardly ever offers real choices, and many things that look like choices end up actually being the same flight sold by different carriers from the same alliance. Alliances are a huge pet peeve of mine anyway. There are certain (US-based) airlines I strongly dislike and would like to avoid. However, it's almost impossible. Even if I book another carrier, the flight oftentimes ends up being operated by them again. Sometimes it gets changed after I book. This is a dysfunctional market. I want Delta and United out of business, and I've heard the same from others. Yet we are all stuck flying them because we can't avoid it.
Want +3in legroom? For each flight, go to a flight forum and punch in the route to figure out what plane they're flying. Google the airline and that plane to figure out how much legroom is available.
Give up and just fly jetblue and/or southwest, and if they don't go, decline to travel.
4. Comfortable flying
(Step 3 is becoming a millionaire who flies business class.)
I'm glad Apple didn't go in the same direction and released the iPhone SE, but even they were surprised by the demand for it:
That said, I wonder how much the "big phone" issue comes from the paradox of wanting slim phones (they need to fit in those jeans, after all) and longer battery life (Wh per volume is largely a fixed value based on battery chemistry). Stretching the phone in X and Y then allows a bigger (Wh) battery to be installed.
Never mind that they can also then market it with a higher screen inch number...
On top of that you have things like store commissions, carrier subsidies, and who knows what else, going on between the OEM and the end user that affects the choice of model.
One relative of mine went from a small featurephone to a large smartphone, because another relative had gotten the same one, the idea being that said other relative would be "tech support".
Big box stores are usually brightly lit (which is usually compounded by having all the light-emitting TVs next to each other, whatever their brightness setting is); human perception of color is strongly influenced by background, and in that environment a set not configured at its brightest looks dull and washed out rather than seeming like an accurate reproduction of color.
This is, of course, very different from what looks good in typical use conditions.
Or, if lowered component power consumption would let you shrink the battery (and then make the phone smaller), just keep the battery the same size instead.
I would not bet anything on phone companies' marketers' ability to run multi-dimensional hypothesis tests on the market. And I actively doubt they have any intention to, given the opportunity cost they'd pay at the beginning of such a test, and how long the test results would probably stay valid.
The idea that the Market is an all knowing beast always operating at maximum efficiency is hilarious.
I've yet to see a person who'd say, "You know what? It would be cool if the new phones had heart-rate sensors." It's always, "Oh, they put a heart-rate sensor in that phone, cool." Driven by what manufacturers do, not by what consumers want.
Also, the fact that people end up buying something different than what they were initially requesting doesn't help put more weight on their suggestions. But if you don't hear people asking for features, you're not listening all that closely.
Before that I owned every Droid QWERTY slider, including the elusive Droid 3. I love the form factor mainly because once extended the phone becomes a very natural reading/typing device, not unlike some of the early PDAs. The keyboard was getting better with each iteration, and the Droid 4 was probably the best phone I have ever used for remote management, with its row of dedicated number keys. We all lost something when QWERTY sliders faded out of the mainstream.
The closest thing I've seen is the S7 Active. It looks good, but I'm on T-Mobile, so it wouldn't be a great idea to get one.
I haven't seen a market that's more driven by what is being put for sale than mobile phones. People don't choose shit, they pick from the few somewhat cost-effective options that are available any given year.
I doubt the technology for higher-density, more energy-efficient batteries is far off, but I have yet to see any report of them being mass manufactured, only articles about new advancements in labs. And if there is no incentive from buyers (the device manufacturers) for better parts (in this case the battery), then the alternative is to just have something that is good enough and that won't be dwarfed by the competition.
Also, when buying a smartphone, they rarely think "I should go for the uglier one, with a bigger battery", but after they notice how often they run out of battery, they'll buy an external one.
We can only guess based on what we hear. Cell phone manufacturers have a lot more data than we do and they keep going for thinness, so I have to imagine there's a demand for it.
I wouldn't be surprised to learn that most consumers prefer battery life to thinness in surveys and polls but actually wind up buying the thinner phone when presented with actual hardware. Thinness is easy to see. If you're comparing phones in a Best Buy, the thinner one might feel more modern and impressive. Battery life is a hard feature to demo in that environment.
In other words, maybe the average person only wants battery life in the abstract, but opts for thinness when making a purchase.
I've never been asked which political candidate I prefer, but I see polls on the news every night.
I worry about this being "data-driven design", where X sells so let's try X+1. The problem is that phones do not sell on their own. Most customers buy them with some plan attached, meaning they often grab what gets the biggest carrier subsidy (or whatever gets the seller the biggest commission).
That or they get whatever their neighbor/relative/coworker/friend has.
There are a whole pile of perverse incentives between the OEM and the users that may or may not show up in the data hitting the meeting room table.
And I seem to recall the stated surprise for BB was how much of the iPhone was battery.
Sales seem to prove otherwise
> Can you imagine what an annoying hassle it would be to have to pair and unpair the same bluetooth headphones between two devices like that?
Bose seems to have come up with a solution to avoid un-pairing and re-pairing: https://www.bose.com/en_us/products/headphones/over_ear_head...
> I don't want to have to get new ones, and I don't want to have to carry around a stupid lightning adaptor for them
Well, you can always not upgrade, or purchase a phone that has a 3.5mm jack. If enough people feel the same, manufacturers will continue to produce compatible products.
I suspect that (outside of HN), you're in the minority, however.
> Sales seem to prove otherwise
I'm not sure that sales have proven that people prefer a thinner phone. I know that when I buy a phone, I never look at the thickness specs, just at the height/width. My phone could be another few mm thicker and I wouldn't notice or care -- it would still fit in my pocket either way.
I may end up buying the thinner phone since it has more of the features I'm looking for, but not because it's thinner.
"Manage your headphones with an app" still sounds like an annoying hassle to me. Even if it works seamlessly, it's still a far cry from hot-plugging at will.
What do you use? I also buy $20 totally off-brand Bluetooth headphones (I wear them while exercising, and sweat ruins them every few months), and mine don't have this feature, which I would love.
Once data has to be transmitted wirelessly, there is a much better chance that you will do it in ways that can't be pirated, but can be controlled, monitored, and profited from.
Once all the necessary pieces are in place to decisively force it on the consumer, why wouldn't they try again?
If there's a USB port, most people will just plug in and back up to their computer. But if it has to go over wireless, they'd rather just use Google's / Apple's services, which means those companies get your data.
And with movies, there's the ridiculous HDCP crap... and I strongly suspect the MAFIAA would give manufacturers with a fully protected AV path cheaper licensing.
> it must not transmit high definition protected video to non-HDCP receivers; and DVD-Audio material can be played only at CD-audio quality by non-HDCP digital audio outputs (analog audio outputs have no quality limits)
Analog hole is still analog hole, and the MAFIAA wants to close it.
1) Multi-channel streams are easy to grab digitally, but expensive to grab in analog, for a simple reason: six channels have to be captured at exactly the same time, without offset. Most consumer-grade sound cards only carry one line-in, not three of them.
2) Power supplies in computing tend to be NOISY. I mean, REALLY noisy. So basically the PSU adds noise at the DAC output, and then again at the ADC stage. It most definitely is a quality loss.
3) Naturally, headphone-outs, and in a lot of cases line-outs, tend to be engineered so that their output impedance can drive headphones, too. This can massively distort the signal.
4) Clipping. Hard enough to avoid when using a professional mixing table, next to impossible to avoid when ripping audio via the analog hole.
It's also somewhat poorly-defined, because "CD quality" can mean lousy mid-range performance when rounding 24-bit to 16-bit, or high frequency hiss when noise-shaping 24-bit to 16-bit.
So "CD quality audio" is quite a degradation when you listen in a 100% quiet room with expensive speakers and your full attention.
I, for example, can hear those ultrasonic weasel deterrents, which operate between 20 and 30 kHz, again. Back when I used to work on outdoor construction sites with massive jackhammers, excavators, etc., I couldn't, because the daily stress would fuck up my ears.
It's more likely that the sound produced by those devices, even if ultrasonic, produces intermodulation products in the audible range.
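The intermodulation claim is easy to demonstrate numerically. A quick sketch (the two tone frequencies and the quadratic nonlinearity are invented example values, not measurements of any real device): any even-order nonlinearity in the transducer mixes two ultrasonic tones into a difference tone that lands squarely in the audible band.

```python
import numpy as np

fs = 192_000                # sample rate high enough to represent ultrasonics
t = np.arange(fs) / fs      # one second of signal

# Two tones above the nominal 20 kHz limit of human hearing
x = np.sin(2 * np.pi * 25_000 * t) + np.sin(2 * np.pi * 30_000 * t)

# A slightly nonlinear "speaker": the x**2 term mixes the two tones
y = x + 0.1 * x**2

# The product term creates a difference tone at 30 - 25 = 5 kHz
freqs = np.fft.rfftfreq(len(t), 1 / fs)
mag = np.abs(np.fft.rfft(y)) / len(t)
difference_tone = mag[np.argmin(np.abs(freqs - 5_000))]
```

The purely linear signal has no energy at 5 kHz; the squared term alone puts it there, which is why "inaudible" ultrasonic sources can still produce something you hear.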
> The commonly stated range of human hearing is 20 Hz to 20 kHz. Under ideal laboratory conditions, humans can hear sound as low as 12 Hz and as high as 28 kHz, though the threshold increases sharply at 15 kHz in adults, corresponding to the last auditory channel of the cochlea.
Not having an analogue hole would be a right royal pain in the arse.
[edited to fix my english]
I guess a lot of newer cars have bluetooth pairing, but mine doesn't.
Further, a lot of the anti-piracy impulse is not simply a desire to get more money from everyone but a desire for control. Essentially, the music industry is reconciled to most people paying little for their music (or, better yet, most people being forced to hear advertising mixed with their music), but the industry wants the ability to sell to people at a variety of levels: perhaps charging more for just-released music, or certain artists, or music connected to movie soundtracks, or "audiophile level" audio, or whatever.
For a commodity that's infinitely reproducible, the ideal position is being able to separately charge each consumer exactly what they're willing to pay. Maximum utility for the owner, which would imply minimum utility for the buyer.
> what an annoying hassle it would be to have to pair and unpair the same bluetooth headphones between two devices
That said, such headsets are pretty rare and command a surprising premium for what looks like a purely software feature.
For example, I absolutely love thinner phones. Each time I've switched to a thinner one, I haven't been able to go back to a thicker one. I'd guess my experience aligns with the majority of consumers, but I don't know.
I totally respect Apple's (or any smartphone manufacturer's) decision; they certainly know much more than me about how to cater to millions and billions of consumers.
Why don't you just watch the video on your laptop? Even with a simple 3.5mm plug, moving the headphones over sounds like a lot of work for something that you can just do on the laptop.
Even if your friends send you an SMS with the video, there are plenty of ways to get an SMS on your laptop.
Perhaps the videos aren't reachable via his work network, or the OP doesn't want his "work" to know what's being viewed at the office? I'm sure there are other reasons to use a cell network (which is the assumption).
For this I just connect my phone to my computer so that all audio comes out of my laptop.
This regularly happens to people who sit near me and use bluetooth headphones. It is pretty awkward when they are listening to pick-up artist tutorials on Youtube.
And should I exhaust their (measured, actual) 22-hour battery life, they can be used in wired mode, as well.
Apple earbuds aren't 'nice' headphones.
On the UI side, the OS just needs to present local Bluetooth devices as output options, just like it switches between USB/HDMI/internal/headphones.
It's too bad, especially since the SRS-X3 was not a cheap purchase. And NFC speaker pairing is all sorts of cool. But music that stutters constantly? The crappiest $2.50 earbuds you can find make for better listening than that.
Fewer problems with CyanogenMod 12 and 13's stack, though.
I am not going to follow manufacturers on this until they have a solution that is convenient. And that's just to remain at par with 3.5mm in terms of convenience. Now what do I get for having to replace my headset? Am I really going to notice the difference in quality when I am travelling? Really? It looks to me like yet another pathetic attempt to lock people into their hardware, with only Apple-approved earphones being compatible with the iPhone, etc...
Wireless. Allows play/pause button, forward and rewind button. Even transmits song titles to your car stereo. Exists on all PCs, Macs, iOS phones, Android phones, late-model cars, and even some home stereos.
A few people will shout loudly about this jack being gone. Everyone else will get a cheap Bluetooth, wonder why they futzed with a cord this long, remember to charge it every now and then, and otherwise not think about it much.
And that, folks, is why the 3.5mm jack is still the gold standard.
Bluetooth has some compelling use cases. This isn't one of them.
Eg. my car has no problem being the 'headphone' for several different devices.
Once that ... thing is nearby, it will try to pair with any and all devices in the vicinity, making everything pretty much unusable.
I was really happy with my Bluetooth headset until I moved into an apartment in a bigger city. Suddenly my devices no longer pair with the headset, because other devices, not owned by me, are quicker with their autodiscovery and forced 'add' behaviour.
- More expensive; again, the Bluetooth stack is the size of an IPv4 stack
- Lossy audio, or again more expensive headphones full of patented crap
If I go the hi-fi route, I am sure there are tons of options available anyway. Lots of headphone amps accept digital input already, and phone headphone-amplifier circuitry generally sucks, the Nexus and iPhone being not too bad in this department.
Another thing: makers should make the digital port extra sturdy. Micro-USB and Lightning are pretty good at that already, IMO.
It reminds me of a much more aggressive attempt Apple made way back, with the iPod's headphone jack being slightly deeper (if I remember correctly), effectively making many headphones not function correctly.
Maybe back then that was more branding than control: seeing those white earbuds told everyone you owned an iPod and not one of the other brands.
One more reason to get rid of the 3.5mm jack: To annoy people like Square who avoid paying the Apple tax to connect to an iOS device
I'm sure that a set of Apple Bluetooth headphones connected to an Apple device could shortcut some of this, but it would still likely be a PITA on anything without an Apple logo. But maybe that's the point?
I can plug my Asus devices directly into a VGA or HDMI projector; for Apple I need an external adapter. Then the same for network. Then the same for SD cards: I need a special shorter one. Then for 3.5mm phone jacks: for Apple I need one with COMM on the MIC pin and vice versa. Etc.
The market was always gullible to all that, and they know it. So they will continue with the trend until the money flow stops.
Because DisplayPort is a much better and more versatile standard than HDMI. I have no idea why projectors are switching to the shitty foosball TV standard instead of it.
The barrel jack is a hack and has a tonne of downsides. We hacked stereo into a mono plug. We then hacked a mic into the stereo plug. Hotplug detection is a patented minefield mess. It's a big plug that tends to collect cruft and makes it difficult to waterproof devices.
Eventually we'll have USB-C DACs that are cheap and small enough to keep connected to the ends of your headphones. For people that don't care about audio quality, a $5 Chinese-branded adapter from Amazon will be indistinguishable in a few years.
Please expand upon the tonne of downsides, because in practice, none of the issues you have mentioned have ever impacted my use of the jack.
Keeping a small cable sized DAC attached to my headphones (once they finally exist) is not my concern. The concern is getting manufacturers to agree to a standard. So instead of one DAC, I'll need more than that, or some set of adapters for whatever that plug ends up being.
Their only saving grace is how robust they are, but that's mostly thrown away by 3.5mm ones, where either the plug bends with remarkable ease, or the plug doesn't bend and the socket gets knocked out of shape with the slightest provocation.
There are better analogue standards around. There always have been. It's mostly a lost cause.
Wrong. This is evidence of a cheaply designed output circuit. There are many jacks that include muting switches: they are spring loaded, and when contact is lost, a muting circuit is enabled.
And if you are disconnecting the source side first, that is your problem, not the jack's. Any analog connection disconnected source-side first could result in this behavior. It can be mitigated by some circuit on the amplifier side that detects a loss of connection. The technology exists; some cheaper electronics just fail to implement it.
Noise on disconnection could happen with a digital circuit too if the designer of the amplifier/speaker fails to account for that case.
Sadly, people don't care..
Also, my jack is connected to my speakers from behind. I don't want to bend my arm in strange positions every time I want to disconnect the jack.
The rest I imagine will simply be replacing the barrel connector with a USB connector and using USB-C's analog audio output over the proper conductors.
Not sure why this is that large of an issue. The only problems I can see are the transition of the physical connector type, which is truly painful, and that vendors could choose to disable analog audio support on devices in favor of more lock-in.
To convert the digital signal coming from the device to an analog signal that my headphones can understand?
The former is terrible because the mic signal is on the outermost connection, so if your device has a metal enclosure the plug shorts the mic to ground. (Unless you have a special plug with no metal shoulder.)
People also use active noise-cancelling headphones that require their own battery.
I'm a bit nervous about the transition away from the classic 3.5 jack, especially since it looks like it'll probably be a standards war between the Apple and Android worlds. There are, however, some nice potential advantages.
Ugh. That. A tip: always ensure that the extension cable is attached to the auxiliary device (eg headset) first.
One of those flaky impedance tricks is to try to detect the microphone through its impedance when the TRRS jack senses a plug.
If you plug the extension cable in first, the device detects the plug in the jack, can't detect the microphone (or whatever other magical device it's looking for - like volume controls), and disables anything it doesn't detect until the plug is removed.
( At least that's how my Macbook works. )
I think it makes infinitely more sense to give you something you plug in. Attackers can't hijack the signal in interesting ways, and nearly anyone who would use a Square device would be familiar with how to plug things into a 3.5mm jack.
People are used to pairing Bluetooth devices these days - cars, wireless headphones, keyboards, mice... it's incredibly easy.
As for Wifi setup, Edimax does something really clever in their Wifi smart plugs: upon first powerup, the device creates an AP. The phone app connects to this AP, transmits the Wifi connection details, and then disconnects.
However, I think that's giving them too much credit.
It's not that Apple "knows what the people actually want"; it's that Apple knows how far it can "push" customers to extract the most money possible. It is a business, after all. Also, they want to lock people into proprietary protocols. The headphone jack isn't proprietary. Apple probably hates the fact that you can plug a pair of Sony headphones into an iPhone.
They know that they have a large enough market share to be able to "drive" things the way they want. As for the other manufacturers ditching the jack as well, they are just following Apple.
If Moto was the only company doing this, no one would care. The people who want the jack would simply buy another phone. However, in the case of the iPhone, the barriers of simply switching to another phone are much larger. It's unlikely that someone who wishes they had the headphone jack is going to switch to Android just because of that. They will probably just end up biting the bullet and buying whatever adapter is required by Apple.
I just don't understand where manufacturers are going with this. I recently got a small Bluetooth speaker and have problems pairing from time to time, but not with a 3.5mm jack: I just connect the cable and it works. Don't even get me started on battery drain.
This is really annoying. It all reminds me of when I was searching for my first cellphone and most phones didn't have the jack. In the end my first cellphone (a Samsung Omnia) didn't have a 3.5mm jack, and it was really annoying having to buy an adapter in order to listen to music.
I don't know if that's a universal problem for everyone, but my negative experiences with it have been enough to turn me off wanting to buy anything Bluetooth and favor other solutions instead. I use a Logitech mouse with a little USB antenna instead of Bluetooth, which is consistently reliable and never has an issue, and normal headphones with a 3.5mm plug or non-Bluetooth wireless headphones which have no connectivity issues.
On the other hand, my Bluetooth Mac keyboard seems rock-solid (other than dying when the battery goes dead, of course). I don't recall it ever losing its pairing.
On the third hand, a Bluetooth iPad keyboard (not Apple-made) we had at a place I used to work was total crap. You could figure on having to go through the pairing process at least once a day.
Other non-Apple Bluetooth keyboards also have this issue, where I'll be typing and it randomly loses connection mid-word, then sometimes comes back after a few seconds and other times just decides it's not going to turn back on without manually turning it off and on and fiddling with Bluetooth settings on the computer before it finally returns, or not.
> USB-powered headphones will (in theory) run a lot like Bluetooth headphones which have their own DAC/AMP. Your phone passes the raw data through to the headphones and it does the required converting. This can be a great thing. Instead of relying on a poorly-calibrated DAC in the particular phone that you are using, you can instead move that component to a piece of hardware you can control. So if 24-bit uncompressed audio is your thing you can have it with any audio source. While this increases the cost of the headphones it will also produce better quality audio if you are willing to put some money into it, which is a win in my book.
In this scenario laid out by the author, audio from your music is passed on to a BT device untouched. This is most certainly NOT the case.
Regardless of the format of the source audio, uncompressed (WAV,AUF), lossless compressed (ALAC, FLAC, SHN), or lossy (MP3, AAC, etc.), the data is transcoded and repackaged into the Bluetooth stream. What this means is that lossless audio becomes lossy, and lossy audio gets even more lossy.
The author's assertion that if "24-bit uncompressed audio is your thing you can have it..." is pure B.S. You will be at the mercy of whatever link is between your source device (phone) and your listening device (headset/speaker), and whatever hacking of the audio signal it does.
> So while Apple will be successfully pushing its customers towards its Lightning port powered headphones on an established (Apple) standard with readily available products, Android OEM’s that choose follow Lenovo fight a largely uphill battle and an empty ecosystem.
This is nothing short of ridiculous. iPhone users have a bit more spending power, but Android still represents a huge majority of smartphone users. There is just no chance whatsoever that headphone manufacturers don't target Android. We might see a very brief lag while Android sort of "switches over," but in the meantime everyone else can just use 3.5mm like they always have.
Dude, what are you doing? I've been using one cable for several years for exactly that purpose, and it was a nothing-special cable I got, IIRC, at a drug store. I go through phones more often than I go through 3.5mm cables.
In the current world you can often plug your phone into your car by a 3.5mm jack and play music. In the brave new world, you can plug a bluetooth dongle into your car's cigarette lighter and 3.5mm jack, pair your phone and then listen to music. If needed, you have the "option" of hiding this awful mess under the dashboard and hard wiring it.
It sounds worse than cassette adapters to me, but I am not a phone manufacturer, so I can't do much more than bitch on the internet :)
I find this to be a very puzzling part of hacker culture. Any suggestion that doesn't fit the personal desires of the commenter must be attacked, not merely ignored or politely rejected.
Ignoring an idea in the context of a discussion about the idea seems pretty weird. Disagreement is part of discussion, and disagreement is not, in and of itself, impolite.
Losing a universal port on one of the most common phones doesn't really seem like an "option." It hasn't happened yet, so it's still a "possibility", but a possibility in the sense that the future is unknowable, not in the sense that it's something you have a choice in (beyond abandoning iPhones). It's something Apple wants to do, but it isn't like a person could go buy an iPhone with a "backwards compatible audio" option.
If you're a heavy user, buy a cable intended for professional use. As an example, look at these:
Mogami specializes in entertainment-grade cable, which is often certified to be waterproof, crushproof, oil resistant, UV resistant, etc. The connectors used are from Neutrik, who are also highly reputable. In the event that something goes wrong the parts are usually hand-repairable, and often you can find cable/connector assemblies carrying lifetime warranties. Other reputable names to look for are Belden and Canare on the cable side and Switchcraft on the connector side.
I've had more iPod and phone charger cables (read: digital cables, akin to what a new headphone jack would use) die than I've had straight analog audio cables die.
If you are going through six cables per year, you are doing something wrong. Even the cheapest cable from eBay should last longer than that provided you aren't also using it as a jump rope.
The DACs in Apple products have a very good reputation going all the way back to the early iPods, and supposedly they kept getting better. When I travel I bring an Objective2 headphone amp and AKG Q701 or Beyerdynamic DT 880 (600 ohm) headphones, and use my iPhone 6+ as a music player and DAC. I can't tell the difference between that and the Objective2 + ODAC I have at home.
In terms of pieces of the audio chain, I think the DAC is one of the least important, at least in my anecdotal experience, provided you have a decent one like any Apple device. Onboard audio from a PC/laptop motherboard is generally awful, and even dedicated sound cards are full of noise and whine. I am very glad to have HDMI audio these days. A USB sound card trivially fixes the issue, though.
So yeah, it's nice for high-end DACs to become more common, but is that useful enough to outweigh the convenience of a solid built-in DAC? I don't think so. That said, I suspect the market will respond with adapters and cases that resurrect the 3.5mm jack without a huge fuss and still pass through other functionality like syncing and charging.
For myself and most of Hacker News with above average incomes this is going to be a minor speed bump in terms of cost and a larger inconvenience in terms of more kit to lug around and swap between devices.
I do not think this is the case at all. I recall the early iPods being blasted for their sound quality. One of the huge "wins" for the Microsoft Zune over the iPod was the audio quality. Everyone I know who owned a Zune was an audiophile.
I have no idea what the sound quality on current iPods/iPhones/iWhatevers is like. Maybe the DAC on the old iPod was held back by the codecs that Apple chose to use at the time. I know Apple created ALAC, and improved the quality of the lossy codecs at some point. Regardless, the audio quality of the oldest iPods was poorly regarded in general.
I was slightly late to the MP3 player game myself, but after listening to several different MP3 players I ended up with a Dell DJ because it just sounded better than the competition. That's just my personal, non-audiophile opinion, of course.
I'm not knocking the iPod. It was a better device than the competition in a lot of ways. It just didn't have audiophile quality sound.
It's difficult to find cites from that far back, but the types over at head-fi appear to broadly agree http://www.head-fi.org/t/580987/has-ipod-changed-their-sound...
The best portable music players I've listened to (in terms of headphone driving capabilities) were the portable CD players from back in the day.
My experience from way back in the early iPod days was that it simply did not have the power to drive my headphones hard enough to provide good-sounding bass, whereas my old CD player (which could play MP3s from a CD) sounded much better.
I always thought it was a result of some law which limited the power of headphone ports to prevent hearing loss in kids, but this could be all BS.
Side note: ALAC, although lossless, never unseated FLAC, for one primary reason: FLAC was open and ALAC was closed.
EDIT: Please disregard last comment, it has been corrected by a reply below this
Before the cause of the one-day bricking was known, I'd already tossed it so hard against a wall it would never work again.
And people here talk about the 3.5mm jack being a hack or a kludge. BT was not originally intended to carry high quality digital audio - it was a friggin wireless standard for telephone headsets. Telephone audio quality isn't exactly the world standard for Hi-Fi Audio.
I guess the new aptX Lossless codec allows lossless audio - but I'm not sure how many devices support it.
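To put rough numbers on the bandwidth gap (my own back-of-the-envelope arithmetic, not from the thread): uncompressed CD-quality stereo needs several times the bitrate of the common Bluetooth A2DP codecs. A small Python sketch, using nominal codec bitrates that are only approximate:

```python
# Rough bitrate comparison: CD-quality PCM vs. common Bluetooth A2DP codecs.
# The codec figures below are nominal maximums and only approximate.

def pcm_bitrate_kbps(sample_rate_hz, bits_per_sample, channels):
    """Bitrate of uncompressed PCM audio, in kilobits per second."""
    return sample_rate_hz * bits_per_sample * channels / 1000

cd = pcm_bitrate_kbps(44_100, 16, 2)     # Red Book CD audio
print(f"CD-quality PCM: {cd:.0f} kbps")  # ~1411 kbps

# Approximate nominal maximum bitrates of common Bluetooth audio codecs:
codecs = {"SBC (high quality)": 345, "aptX": 352, "aptX HD": 576}
for name, kbps in codecs.items():
    print(f"{name}: {kbps} kbps ({kbps / cd:.0%} of CD bitrate)")
```

So even the better Bluetooth codecs are lossy and heavily compressed; whether that's audible on earbuds is a separate argument.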
All this reminds me of a friend who attended a seminar with Sennheiser for their professional wireless microphone technology. The instructor said something to the effect of, "Always remember: never replace a connection with wireless technology where a $10 cable will do, because all $10 cables will outperform even $10,000 wireless systems."
Here's a good example; just check out that description:
I don't remember this being true at all. I was in the camp that bought iRiver's H3x0 series + other companies' offerings because Apple's products had comparatively worse audio quality.
Confession: I still use my Blackberry Bold 9000 but I have to admit attempting to listen to music on the thing flattens the battery (always has done).
- Intel is working on adding analog audio over some unused pins on USB 3.1 (sfaik)
- Even if USB gets analog audio we will probably see the DAC+amp in computers and phones go away so they have more room for a battery
- With USB 3.1 headphones there would be a DAC+amp in/on the headphones
- With a DAC+amp on the headphones, we could choose between the host computer's DAC+amp and the headphones' DAC+amp. A choice between quality and battery life?
- DRM sucks dicks - but there will always be a way to get between the DAC and the speaker for recording. Let's just limit consumer choice, amirite?
- Phones and laptops are all going USB, but there aren't nearly enough ports - I thought we killed fucking docks in the 90s. They're BACK?!?!
- USB accessories and keyboard cases became so expensive when tablets became popular... docks for more ports and audio will similarly be $100 and above.
- We might see the DAC+amp become another accessory to interchange between the digital audio source and the speakers. Another $100 device for enthusiasts.
- Lightning seems way better than USB Type-C (physically better), but no way am I buying it just to be in the Apple crowd.
- Cash grab or DRM blitz? or both?
This was some bad reporting by one of the tech blogs last week - Intel was talking this up at IDF in 2014, and the author of this article didn't fact-check it (and admitted his mistake in the Reddit discussion of this article). Audio adapter mode has been part of the USB Type-C connector spec since its inception. Passing analog audio out over a couple of pins is part of the spec, and not in any way experimental or undefined.
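For reference, here's roughly how a Type-C host recognizes an analog audio accessory, per my reading of the spec: the adapter pulls both CC pins to ground through Ra (under roughly 1 kΩ), while a normal USB device presents Rd on one CC pin. The threshold value in this sketch is illustrative, not the exact spec number:

```python
# Sketch of USB Type-C accessory detection via the CC pins.
# An analog audio adapter terminates BOTH CC1 and CC2 to ground with Ra;
# a normal downstream device presents Rd on one CC pin. The resistance
# threshold here is an illustrative approximation of the spec values.

RA_MAX_OHMS = 1_200  # treat anything at or below this as Ra (illustrative)

def cc_termination(resistance_ohms):
    """Classify a single CC pin's pull-down as Ra, Rd, or open."""
    if resistance_ohms is None:
        return "open"
    return "Ra" if resistance_ohms <= RA_MAX_OHMS else "Rd"

def detect_accessory(cc1_ohms, cc2_ohms):
    terms = {cc_termination(cc1_ohms), cc_termination(cc2_ohms)}
    if terms == {"Ra"}:
        # Analog audio is then carried over the D+/D- and SBU pins.
        return "audio adapter accessory"
    if "Rd" in terms:
        return "USB device attached"
    return "nothing attached"

print(detect_accessory(800, 900))     # prints "audio adapter accessory"
print(detect_accessory(5_100, None))  # prints "USB device attached"
```

The nice design consequence is that a purely passive, cheap adapter can trigger this mode; no DAC or negotiation chip is required on the adapter itself.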