Now it seems they're leaving people who depended on that behind. No company offers an ecosystem that doesn't require "fiddling" to get things to work correctly. Maybe this is the way it has to be, but I really wonder what Apple's strategy is going forward, because it's clear that they've slowed down or stopped development on everything other than their phones/pads and the occasional laptop. What are all their engineers doing? What is the use of having hundreds of billions in the bank if you're not investing it in growing or creating product lines?
Looks down at iPhone 7 which cannot be plugged into new MacBook Pro...
This is another announcement, after the earlier one that they don't develop monitors anymore. They won't develop WiFi routers anymore. They are definitively reducing the Mac line (you may love the new MBP or not, but they are clearly settling on a single line of laptop with 3.5 models, and at best a status quo for the rest).
So what does Apple still do? They do iPhones, iPads and Watches, with a limited cloud offering. Everything else is shrinking: the professional line of software and hardware, the Mac line in general, the hardware ecosystem; even their plain first-party software is somewhat static (I mean, yeah, there is TV coming, and the rest is same old, same old).
What is such a gigantic company doing with its boatload of money that it can't even ship one pair of headphones on time? Even their recently announced flagship laptop has a four-week waiting period, after they worked on it for two years? (Yeah, this is an exaggeration; of course there are technical difficulties. But again, that's Apple: an enormous, ultra-rich company focused on an extremely tiny product line.)
When an Apple loving friend came around I witnessed first hand the value of branding. She was entranced by this cool black Apple monitor and wanted to know where I had got it from. I'm pretty sure she would have paid at least $100 over the Dell price for such a cool Apple product. She was incredibly disappointed that it was a trick when she found out. The monitor no longer had any value in her eyes even though it was exactly the same as before.
Apple can charge outrageous prices for their products, and people love them despite/because of that. Can't understand why they'd stop making monitors.
Before that, I don't think anyone outside the USA even knew Apple existed unless they were interested in media production in some sense.
And that in turn led to a feedback loop, because certain celebrities would show up with an iPod (mostly noticed because of those white wires), since they themselves move in media circles.
Similarly Apple could get the labels and studios to agree to distribute via iTunes because of that inside track in the industry.
And that in turn was what bootstrapped the iPhone beyond being a fancy-screened featurephone (though whoever talked Jobs down from going "nuclear" on the gray-market jailbreakers was perhaps the true genius there).
But that all of this now produces the effect that slapping a fruit logo on an object raises people's interest and appreciation is still worrying. What comes to mind is the comedy caricature of an "art critic" who can deliver a spiel of big words that means nothing at all upon closer scrutiny.
Another interpretation might be that 'people pay outrageous prices' because Apple's products are of higher quality and last longer, which makes the long-term price lower than that of competitors' similar products. For example, I am still using an iPhone 5 (September 2012) which runs the latest iOS version smoothly...
You know that because (I assume) you have the technical knowledge.
Average people don't know which products of which brand may or may not be well built. But they do know that Apple products are generally of high quality.
Apple being of whatever quality has no bearing on the quality of anything else. Not only do other companies match Apple's quality, many of them exceed it.
"But these are considerably less than what you already had paid for a laptop!" -> "Yeah, but they're not Apple, and when people see Apple, they know you have a good laptop". That branding was literally worth almost $1k to my colleague...
Apple users seem to be the only ones who even care about the "resale" value of their computers, almost as if they were investments rather than tools.
I upgrade my PC hardware by throwing away/recycling the old one and buying a new one.
I upgrade my Apple hardware by selling the old one on Craigslist and then buying a new one.
In both cases, I spend about $1000 net each time. In the PC case, that's +$0, -$1000. In the Apple case, that's +$1000, -$2000. Either way, it's -$1000 net.
If you prefer Apple products but think they're "too expensive" because they cost ~$1000 more, this is why people tend to disagree. They paid that ~$1000 once, as an "entrance fee" for their first laptop, but they don't have to pay it again.
TL/DR - only Apple devices HAVE a resale value.
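The arithmetic in that comment can be sketched as a quick back-of-the-envelope calculation. (The $1000/$2000 figures are the comment's hypothetical round numbers, not real market prices.)

```python
# Back-of-the-envelope resale arithmetic from the comment above.
# All dollar figures are the comment's hypothetical round numbers.

def net_upgrade_cost(purchase_price, resale_price):
    """Net cash spent on one upgrade cycle: money recovered minus money spent."""
    return resale_price - purchase_price

# PC case: buy a $1000 machine, recycle the old one (no resale income).
pc_net = net_upgrade_cost(purchase_price=1000, resale_price=0)

# Apple case: buy a $2000 machine, sell the old one for $1000.
apple_net = net_upgrade_cost(purchase_price=2000, resale_price=1000)

print(pc_net, apple_net)  # both work out to -1000 per cycle
```

Under these assumed numbers the two paths cost the same per cycle; the difference is the one-time "entrance fee" of the first, unsubsidized purchase.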
Anyone want to buy a barely used XPS 15 (top spec)?
It wasn't about 'mental sifting', or 'flexy laptops' or any of the other excuses people will try to insert. It was about paying for the brand recognition. He'd gone 'wow' at my X1 carbon, and gone 'wow' at another colleagues XPS. But he balked on price because of brand recognition, not because "a single product line is easier to deal with than a choice, so I'm willing to pay almost 50% more".
I certainly agree that Apple's greatest asset is their brand reputation.
In other words: if you buy a $3000 gaming PC, you're a weird nerd who cares too much about computers. If you buy a $3000 MBP, you're a professional and a connoisseur. Your friend wanted a $3000 computer (probably because, to be frank, they are a "weird nerd" who can put $3000 of hardware to good use), but they didn't want to be labelled as such.
Apple products generally have a high resale value like BMWs.
My guess is that there's just not enough profit in it. Monitors last a really long time, and while people like your friend will ooh and ahh over them, that doesn't necessarily translate into sales, and since they don't need to be replaced every year, they won't have the built-in profit that their phones have. It's probably the same with their WiFi access points.
I had another Linksys "premium" home router ($200-300) whose WiFi also died, but it lives on as an OpenWRT home router with WiFi disabled. The Apple WiFi routers also lived on for a while with WiFi disabled, acting as Time Machine backups. Then the disks died and I couldn't be arsed to pry open the case with a hair dryer to get at the disk inside and replace it. So my home setup went from Apple all-in-one combo gear with routing, WiFi and a backup disk, to specialized boxes for WiFi (cheapest off the shelf), routing (OpenWRT), and backup disks (standard NAS hardware, for Time Machine).
I hope my 15" MacBook lasts for a really long time, because when it's dead I don't know if Apple will have any MacBook I want. But those are tomorrow's sorrows; no use worrying about that until the time comes. (I guess I can always fall back to Linux and some PC laptop.)
Just because you paid a bunch of money for it and flashed it with a better firmware doesn't make it good hardware, and custom firmwares often have poorer support (and, on more than a few occasions, worse performance) than the default firmware optimized by the manufacturer for their specific hardware.
The fact that you have to keep buying new hardware tells me that you have lousy hardware and should step up to proper hardware rather than expensive consumer-grade stuff.
The expensive Cisco access points at work are so reliable that I haven't even seen some of them in 10 years (they're above the dropped ceiling).
What the heck are you doing with your WiFi gear to have these problems? I've used all kinds of cheap-o routers over the years, and never had problems with them overheating. Are you living in the desert without A/C or something? My current router is just a cheap TP-Link dual-band I got used on eBay; it runs DD-WRT and has been working fine for 1.5 years hiding behind my sofa. Before that, I had a cheap Cisco/Linksys e1000 I got used and ran DD-WRT on for probably 3 years. I gave it away to someone, and last I heard it still works just fine. Before that I had some D-Link I think, which I replaced because a firmware update prevented my network print jobs from going through (obviously not a hardware problem). I've also had (about 4 years ago, for work) a Cisco/Linksys dual-band which worked great, and a Cisco enterprise AP which was very reliable, though the IOS UI was the worst thing I've ever seen in my life. Honestly, your post is the first time I've heard anyone complain about WiFi routers having overheating problems.
Maybe I'm luckier because I usually run DD-WRT and set the transmit strength a bit lower, but I've only done that with the last couple of routers.
No, it's endemic to Apple products of the past five-plus years, starting with Thunderbolt, so people needed to carry converters to connect DVI and VGA monitors and projectors, wired networks, etc., which are still common in most workplaces. Apple could've alleviated that by providing a universal converter with the MBP, but they decided against it.
They're repeating that step with the new Macbooks using only USB-C, although this time it's a bit more standardized.
They screwed up with the iPhone, again making their own proprietary connector instead of using USB-C (which admittedly wasn't around at the time), and by removing the 3.5mm jack, making it incompatible with all existing headphones and earplugs.
Apple could've fixed it with the iPhone 7, but they didn't, sticking to their own connector, in which they've probably invested a lot of money. They could've fixed it with the new MBP by adding a universal converter / docking station, or even by supplying a USB-C to Lightning cable, but they didn't.
It's not a glitch in the Apple matrix, because they're not making any efforts in fixing it.
That sounds great if you want Apple to have all your data. But why do you?
Never mind that wires will always be better and more private than wireless.
For some values of "better". For me, not having to plug in and carry around wires is always better.
> and more private
Wireless can be secure and unencrypted wired transfers can be sniffed via the electromagnetic radiation they emit. This isn't as cut and dry as some like to pretend.
I went to a security conference once where a presenter was telling us about a previous conf, where he was demonstrating a 'free wifi' box that snooped on all connections and pulled passwords out of them. He was demoing it on stage when the WiFi went out in the other auditorium behind (also part of the sec conference). His magic little box then went berserk as huge numbers of security professionals at a security hacking conference connected to it to try and re-establish their WiFi.
Compare this kind of story to the kind of setup you need to have to sniff EM radiation from a cat-5 cable.
The point of his story was that security professionals love to lecture others on correct tech use, but in reality the security recommendations come so thick and fast that even the professionals don't keep up with them. How is a mere mortal to cope?
The "better" argument was my bigger disagreement. Wired isn't "better" in any absolute terms. I'll happily use wireless rather than wired for the convenience of walking across my house with my laptop. Or, you know, actually using my cell phone.
Ha! I don't believe it for a second. The other day my iPhone went from 100% to dead in less than an hour - in airplane mode using only the music app and strava.
Given that you were using Strava, I'd guess you killed your battery running GPS, but that doesn't mix with airplane mode.
And if I'm in that situation (often: in a library, a school parking lot waiting for my kid's basketball practice to end, etc.) I'm stingy about battery power. I turn off the Mac's Wifi (which makes a big difference), and wire tether my phone.
Yesterday at an airport to charge my phone! But I haven't synched via a cable in ages.
I would not be surprised if the next iPhone came with a USB-C cable.
I don't buy into enough of the ecosystem to know, but do Apple devices sharing an Apple ID sync over LAN/WLAN instead of hitting iCloud?
Sure, having to purchase an extra cable (after buying your $2000 laptop) might be fairly user-hostile, but the claim made was incorrect - there's not much arguing that.
This is all kind of mitigated by the fact that users just don't plug iOS devices into computers anymore to transfer stuff - Spotify (and iCloud) has eliminated most need for that. If people are plugging them in, it's to charge them (which is pretty important), but you can just use the power adapter IF you have one around.
But Microsoft and Google aren't as user friendly somehow?
It fails so frequently that it's just unusable.
Meh. They should just copy their photos off their idevice using USB PTP mode. That's USB config 1 when an idevice plugs into a PC. No macOS required. No iTunes required. Done. Solved. As for the crap, isn't that an argument to NOT migrate via iTunes?
Most iPhone users have a 5 GB iCloud account, where not even a WhatsApp backup fits, and they still need a computer to migrate all their data.
iCloud storage is super cheap, especially compared to the price of a smartphone every 2 years. If you're not using this, you're cheating yourself out of one of the biggest advantages that iOS has over other computing platforms. iCloud backup is the one thing in iCloud that really just works.
Plus there's dropbox. Solved.
But it still isn't strictly necessary.
That, and the fact that you're still unable to put music on an iPhone unless you a) buy it from the iTunes Store or b) use iTunes.
Wat? Who still does this?
But it's very important not to allow your iPhone to talk to your Mac because that means iTunes will run and make everything worse.
- they don't expect you to plug into a computer at all except in emergency
- can be plugged into the provided charger
- can be plugged into a computer you're statistically more likely to have than a new MBP with only USB-C ports
With regard to the Apple ecosystem, I think it's valid to complain that the cable included with the new iphone does not connect to the new macbook, when they were announced at the same time. Even if it's only supposed to be used in an emergency, does that mean in an emergency you should be punished for staying within the Apple ecosystem?
Next time there will be people who had a laptop but not the adapter, because you are never supposed to need it.
That's of course just an edge case, but the fact that it just happened shows that it can be important. And Apple used to be known to always deliver a great user experience, not just in 95% of the cases.
It turns out that when I plugged the phone into the laptop USB, it showed up as a USB ethernet device, which NetworkManager then happily auto-configured.
So clearly 2016 is the year of the Linux laptop.
This may seem stupid, but do you use it when both devices haven't been connected to the internet recently? Seems like it works better if the laptop was recently able to access your iCloud account.
That can be solved with an external battery, but using a short USB-Lightning cable has been easy and great. Now we'll just need to make sure we buy yet a different cable if we use a new-new laptop, which isn't that upsetting.
And Firewire to USB, and VGA to DVI to Displayport to HDMI, and...
I'm just surprised that Apple did the thing everyone wanted: switch to a standard. And now, bam, nope, this is horrible. It's time for USB-C; it sucks now but it'll pass. Be more worried about USB-C incompatibility than about USB-C itself.
I tether for internet access and backup the device.
So when you buy a new MacBook and the new external keyboard and mouse by Apple, you additionally need an adapter to connect them to each other.
Obviously you're going to need at least one of these in your carrying case for any number of reasons.
Equally obvious is in a couple years you won't need it, and you won't miss it.
I mean this is the classic Apple playbook, and every time it ends the same way; Apple led the pack, too early for some, but ultimately they led.
 - https://www.amazon.com/gp/aw/d/B01G1SKCPC
 - https://www.amazon.com/gp/aw/d/B01AUKU1OO/
FWIW, having played around with Google's routers, they work pretty well, even from iOS. Chromecasts "just work" in the old Apple sense of the term: you plug it in, you find the thing you want to watch online that you want on your TV, press the button in Chrome and hey presto, you got your cat video on your OLED.
If you're OK with having Google store data on you, I'd say the Google hardware division has a lot of what you talk about here.
Before the flood of angry HN replies comes in, I get that that's not everyone, but knowing what I know about Google's privacy practices, I am perfectly happy with that.
I also really love the idea. TVs should mostly just be big monitors and decoupled from the source of content. Especially in conjunction with both tablets and web browsers on a laptop/Chromebook, this is precisely what Chromecast enables. I don't think I've used any of my "smart TV" features since I got a Chromecast. It was always painful to enter things like passwords anyway.
I wish. Chromecast could not mirror or stream audio. Eventually I got fed up and bought an Apple TV.
That said, Chromecast has probably my favorite interface for interacting with a set-top box. I mourned moving on to the Apple TV.
Is this an accurate characterization of Apple's strategy over the last 12-24 months? A lot of stuff has happened:
- Expansion of their cloud services. Apple Music, iCloud, HomeKit, Health are all vertically-integrated services that (in theory) just work across your Apple devices. Siri improves little-by-little every day; Maps is low-key excellent in major US cities.
- Release of the Apple Watch and (soon) Airpods. Note that the Airpods (and new Beats) include the W1 chip, which qualifies as the "no-fiddle" type of innovation you mentioned before.
- Their vision of personal computing in the iPad Pro. iOS is stale on these devices (and I suspect we'll see an overhaul this year), but Tim Cook sells these as "computers" for a reason.
- Continued growth along their main product lines. The iPhone is still the best smartphone for most people, the iPad is the best tablet available for anyone, the MBP was redesigned. These are boring points that produce crazy $$$ for Apple.
- Many rumored projects: AR, self-driving cars, a redesign of the iPhone, the ongoing play for streaming TV, a near-certain redesign of their desktops.
It's difficult to judge Apple at this point because computing has changed. Capital-O Opinions mattered more when the alternative was a muddled computing ecosystem. Since then, other companies have copied the Apple playbook, and we see well-designed, vertically-integrated products everywhere. Apple is the tide that raised the fleet of "tech product" ships, for better or worse.
I buy airport extremes for the exact same reason.
Between this, the refusal to make another 17-inch laptop, making those laptops unupgradeable by soldering in the storage, and the assortment of other decisions like this one, I'm getting much closer to full abandon-ship mode.
I was already planning a Linux laptop but this might accelerate things.
I handle all the phone support and leave my mom a script for when the repair person comes and in both I'm making sure that the replacement unit is a simple modem and not a combination modem/router that they like to give out. And yet each and every time, if I'm not personally present when the technician visits, I find that the technician set up the combo router and the nice, expensive Airport is sitting idly by, useless until I can figure out how to make their PoS combo router operate as a bridge, which isn't easy considering how poor the web UIs are on those boxes. After repeating this process many times, I gave up on buying stand-alone routers...it's just too frustrating.
So if my mom, who has a techie son who knows the difference between a modem and a router, can't reliably get her broadband provider to give her a simple modem without router functionality, what are the chances that other people with her level of acumen who don't have a son like me can get them to do it? I never like to generalize my own experience to explain larger trends, but I think in this case it might be apt. I imagine Apple has trouble with Airports from the broadband companies pushing their own inferior solution to people who don't understand enough to push back. Apple probably has a lot of angry feedback from people who don't realize that the Airport they paid a bunch of money for is sitting unused next to a crappy, inferior box that has been setup and connected to their internet, which is actually the box causing the problems.
In modern times you rarely have to fiddle with wireless routers or displays to make them work correctly with different devices. There's not much left there for Apple to add any unique value to and if there's no benefit to consumers they won't pay a premium for the products.
Their computing platform has effectively been left behind: from five-year-old chips in laptops which haven't received a substantial update in almost as long, to their flagship prosumer "desktop" PC which is as unremarkable as it is unexpandable, nothing seems competitive anymore, let alone the bleeding edge upon which a lot of their products used to be positioned. The fact that the UX from their phones and tablets is being lifted and shoehorned into OS X (whoops, "macOS") at the expense of usability and stability is more than a little off-putting, too.
It seems like all they care about these days are their mobile phones and tablets, and the iTunes universe that goes with them. I couldn't care less about any of that.
1. Top quality industrial design and components (trackpads, screens, etc.).
2. Reasonably competitive specs (never as cheap as building an equivalent PC from parts, but close enough that you didn't feel totally ripped off)
3. An OS that provided Unix compatibility with a nice GUI that Just Worked.
Now, in all three areas, they've either fallen back, or their competitors have caught up.
1. The design and build quality of the best ultrabooks matches that of the Mac, and individual components like screens are often better. Meanwhile, Apple refuses to introduce real touchscreens on their laptops, presumably due to internal pressure from the iPad division, so they settle on the Touch Bar as a compromise. Or they want better sound but can't make it work internally, so they just add fake speaker grilles instead.
2. The specs have stagnated, while the prices have increased dramatically.
3. The OS has gradually been locked down, and its Unix roots eroded. Meanwhile, Desktop Linux has improved, and Microsoft has added Linux compatibility to Windows.
Early use of UEFI, and PCIe and NVMe SSDs?
The first use of the Core 2 processors in anything was the MBP, which had been PowerPC before.
Plus all their interconnectivity protocols.
Seems reasonably bleeding edge to me. Completely fair to hold them to it.
Their strategy is simple and obvious, and straight out of a modern American business school textbook: reduce or eliminate product lines that aren't making big profits, and continue milking the product lines that are making big profits. This means bigger profits company-wide.
>What is the use of having hundreds of billions in the bank if you're not investing it in growing or creating product lines?
Why bother doing that when you can just continue milking your cash cow product lines when you have hordes of cultish buyers happy to give you all their money for your overpriced products? Obviously, they've found that this works great with some products (phones, tablets), and not too well with other products (WiFi APs), so it's perfectly sensible for them to drop the latter.
Will this bite them in the ass later? Looking at Apple customers, I seriously doubt it.
On a relative basis, Apple has infinite resources. It has the cash, the brand, and the ability to attract the right people to run the business. Each product line, like the routers and the Mac Pro, can be given focus because they have the resources to do it. Most companies re-focus on core products because they are spread too thin - Apple is not.
To a lay person, this doesn't look like better focus. It looks like:
* Incompetence in management, because of all the outdated stuff still being sold at high and old prices
* Inability to attract and/or retain good talent, because Apple is not expanding teams to sustain its product lines. It's instead closing down and collapsing them into the iPhone/iPad/MBP lines.
* A complete lack of vision, because nothing is being done for years on several product lines. If that's not shameful, I don't know what is.
It's very sad to see this unfold, but I guess all good things come to an end, and this is happening sooner than expected.
EDIT: And look at Tim's wikipedia page, there's a quote: "You kind of want to manage it like you're in the dairy business. If it gets past its freshness date, you have a problem"
The sad thing is that creative vision is one of those things that all of us start out with, but then it gets beat out of us through the course of childhood and mandatory education.
What made Steve Jobs different was that he was a product guy and an incredible visionary to boot. He had the rare ability to care about details of individual products while also having a solid grasp of the big picture in terms of how those products should come together to form an ecosystem. I think Tim Cook tries to delegate those things to people who just aren't as good as Steve Jobs (naturally), which explains the questionable decisions Apple has made recently regarding their products.
I don't know if you have ever hung around supply-chain people, but judging by the ones I know, I would not want them designing stuff.
You make it sound as if he single-handedly sold over a billion phones. What else helped sell over a billion iPhones? Saccharine, simplified, cheery, brain-dead UI; an army of Apple loyalists, explainers, and apologists (still mourning the loss of their Beloved Leader); several metric fuck-tons of marketing and advertising spend, a panoply of factors that result in a once-per-year or once-per-every-other-year purchase cycle, and a phalanx of impeccably-decorated retail locations, among other things. Sure, Mr. Cook plays his role, and Apple's SCM is best-in-the-world, but I think there are more Tim Cooks out there than Steve Jobs.
Not product freshness.
Doing things in parallel isn't how the org is set up: instead of vertical feature groups like most other companies (a Mac division, a phone division, accessories, etc.), the company is split into design / hardware engineering / software. That means you get more integration between the parts and fewer of the cross-org battles that Microsoft was famous for back in the day. The downside is that when the next iPhone was all hands on deck, they were pulling people off of Mac OS to work on it.
Actually it's the complete opposite. They have a strong vision of the future and are shaping the whole company to be ready for it. That means dropping all the things that will become irrelevant or commodity in ~5 years. The personal computer is one of those things.
In Apple's vision, a future iteration of the iPhone will replace _everything_: a small, portable gateway to the cloud.
Yes, this means dropping all the prosumer products. Apple doesn't need the top 5% of consumers anymore: the Final Cut Pro/Aperture fanatics, the DTP market, education, software devs. This was important in the early 2000s to establish a leading brand through influencers. Today all of it gives marginal return on investment, so why bother when you can concentrate on volume?
People go Apple because they want the integrated experience, but if that's not on offer anymore, if you have to actually research which is "the best" of any component in your setup, why not re-evaluate everything? Eventually it will get to the point where it's as much hassle to use Apple as it would be to use Windows or Linux and at that point, it will boil down to a decision on price, and Apple will lose heavily. Those HP Mini Workstations look pretty sweet, compared to a Mac Mini...
It seems kind of silly for Apple to sell a handful of separate products -- each replaceable by outside companies' offerings -- when they could instead sell one integrated Apple TV/Airport/Time Capsule/MacMini/whatever. Slap a touchscreen on it, call it (Apple icon) Home, and sell it for $299 (or $499 if you want a decent SSD).
"The Time capsule doubles as a backup storage hard drive for Mac computers."
It seems that they won't even offer anything like that, if the "people familiar with the matter" are right. From the article:
"Apple began shutting down the wireless router team over the past year, dispersing engineers to other product development groups"
"Apple hasn’t refreshed its routers since 2013 following years of frequent updates to match new standards from the wireless industry. The decision to disband the team indicates the company isn’t currently pushing forward with new versions of its routers."
Yeah maybe add $x00 to my examples above.
> "Apple began shutting down the wireless router team over the past year, dispersing engineers to other product development groups"
It specifically says they put some of them on the Apple TV team and Apple is generally pretty secretive about entirely new offerings. "Apple is getting out of the router business" is clearly an extrapolation from "Airport is not going to get future updates". That may be correct but it's not a 'matter' that the sources are likely to be 'familiar' with unless they are senior executives.
I think Cook and other leadership are really focused on Jobs' vision of the post-PC period. The problem here is that the PC's never went away. We just stopped updating them as often. I'm afraid Jobs died when it was looking like PCs could ultimately be replaced by mobile devices in most circumstances. I imagine Cook is under pressure by the board to "fulfill Steve's last bullet points" which was an email that was famously leaked recently. It seemed very 2007-2010-ish level of thinking, when the mobile revolution was in full force, but come 2011-2016, suddenly tablet sales stagnated and PC sales didn't bottom out and have recently gone up.
Sadly, with VR and AR possibly being the new big thing, a beefy video-card is all but mandatory and not the kind of thing that can be shoved into a 3-5 watt SoC. That's another big thing Apple is going to ignore because it goes against its mobile-only vision.
Apple is in a strange position because no one there seems to have the political clout to say "Steve was wrong, the post-PC world isn't happening, and we need to beat Lenovo, MS, and Dell on laptops, desktops, and accessories. PCs aren't going away, guys!" Jobs as a dead prophet is unquestionable. This is the problem of having a cult of personality instead of proper, flexible leadership. Cook has become a caretaker of Jobs' vision, and Jobs' vision is probably wrong. The moves they are now making are probably wrong.
Just from a "Apple Laptop" vs. "Windows Laptop" perspective, the MS Surface laptops are a bit pricy if you're only looking at the lower end of Apple's offerings.
You have backup plans for your hardware, and your software. If either one starts to reek, you can replace it without losing the other.
If you're not making 40+% profit margins you're bringing the company's overall margins down and have to go.
Apple had an integrated vision for computing. Wireless routers were never going to be more than a rounding error on their balance sheet, but they did it anyway because they wanted it done right.
With Apple giving up on monitors, it's been a sad year: the bean counters are clearly in control.
I was laid off from a semiconductor company years ago because the product we were supporting was aging, and their projections for the successor product did not show a 40+% profit margin, so they decided to simply exit the market altogether and dump their customers, who were really pissed.
The AirPort line was born during a time when your ISP only provided you with a modem, if that. Wireless connectivity was a brand new world that required new hardware.
We're not in that world anymore: ISPs set up (and support!) wireless access with their networking at no extra charge. To most consumers, the AirPort went from being the simplest gateway to wireless coverage to being an accessory for the Apple-centric household. That's why you saw so many features added to the AirPort over time (hard drive, print server, airplay server). Bluetooth speakers, airplay receivers, AirPrint, etc. have over time obsoleted these uses.
There's another story for the Mac Pro - although many professionals who look to the Mac as an option among many systems probably won't like it.
Not that their AirPort Extreme is the best on the market, but it's still substantially more reliable than whatever ships with your cable modem.
Acquiring something like Eero would help improve that story. Integrating it into a bunch of Apple devices would strengthen the ecosystem.
I feel like Apple has given up a bit on WiFi and moved to out-of-band WiFi and Bluetooth. I wish they had stuck with WiFi. I can't really use AirDrop because my laptop is too old and doesn't support out-of-band WiFi. I was very confused why two Apple TVs showed up when trying to use AirPlay, until after googling I found out one was Bluetooth and the other was WiFi.
We're moving on from product and platform wars to ecosystem wars, and the core feature of Apple's ecosystem is all Apple software and hardware working together (preferably to the exclusion of non-Apple products, but not always). Having control of everything gives you the best chance of making everything just work.
Dropping products (X-Serve, monitors, routers etc) and neglecting products (Mac Pro, Mac mini etc) makes the whole ecosystem weaker.
For comparison, the main rival ecosystems are Google and Microsoft. Google's is based on doing everything online and creating a complete Google stack. Microsoft's is based on supporting everything cross-platform (Windows, iOS, Android, MacOS and increasingly Linux) both online and offline.
Actually, I stand corrected. It appears they updated the page to use Pixelmator benchmarks instead of Aperture. This update must have just happened because a friend of mine and I were joking about it a week or two ago.
Top end Macbook Pros are capable enough for their current audience for running the likes of Logic or FCP. So what's the point in maintaining top end hardware if they don't have the software?
I'm sure Apple will continue to do well with what it is these days, a fashion house. I'm not sure what my next laptop will be but I fear I've dug myself into a hole with OSX/MacOS. I spent most of last night and half of yesterday toying with Ubuntu and sadly it didn't do it for me. Sure, I was running it on my Macbook Pro and so the keyboard was all out of whack, which I'm sure is solvable, but I found it unstable.
So who knows. I'll keep this early 2013 Macbook Pro until it gives up the ghost (I think I'll have another couple of versions of MacOS in it before it's resigned to the scrap pile) but after that, no idea. Things don't look especially bright for Apple users outside those who are only interested in iOS devices.
But the Mac Buyer's Guide says it hasn't been updated since Dec 2013.
This is what happened to Microsoft, and it took many years and a major internal upset to get them back on a positive track. And now look at Microsoft since they've started diversifying and innovating again: they are providing an OS (and hardware) which is genuinely interesting to professionals in a variety of fields. They are going to steal Apple's thunder here soon, unless Apple really makes an effort.
Something that is well designed and innovative. I for one would pay for one.
The way I see it, Apple excels on the whole overarching synergy thing, while Linux excels on individual component quality. Considering that each component is an independent project, the Linux ecosystem will never be as homogenous as Windows or Mac. That said, it's a trade many of us happily make for the power of choice.
• iPhone 7 looks like iPhone 6 sans the headphone jack. So no design changes for 3 years. To me, 7 does not feel like a significant update over 6.
• Macbook Pro got a touch bar. Otherwise, minor design changes since last revision. Does not feel like a significant update.
• iMac not updated for 12 months. No significant design changes for years, but the screen resolution is now Retina.
• Macbook Air not updated since March 2015 (still low resolution). No significant design changes since introduction.
• Mac Mini not updated since 2014. No design changes since 2011.
• Mac Pro not updated since 2013.
• iOS 10 and macOS are minor revisions.
• Thunderbolt Display and now AirPort Extreme/Express are dead
• The iPad Pro 9.7" looks like an iPad Air (1 or 2). The iPad Pro 12.9" looks like any existing iPad, but bigger. The iPad minis all look alike.
• They didn't release new iPads this fall. Isn't that a first?
It feels like the hardware line-ups are getting more confusing: two different iPad sizes called Pro, as well as the "Air 2" and the minis. It made sense that the Pro was the largest one, but they confused us by releasing a smaller Pro that looks like an Air 2 but has a better display than the large Pro. How many iPads do we need?
There is the main iPhone line (... 6 6S and 7) that comes in two sizes, and then the evil cousin called iPhone SE which looks like a 5.
The laptop line is getting more messy too. The Macbook is like a slower Macbook Air but with higher resolution and 12". They killed the 11" Air, but we now have 3 laptops at 13" (Air + two types of Pro). Is the 13" Pro without touch bar option really necessary?
All these series ("", Air, Pro, SE, Mini) which pop in and out of existence feels like they are trying different names for marketing reasons (especially for the iPads).
I appreciate the yearly impressive but predictable CPU/GPU and software improvements, but it is really starting to feel like they are either struggling a bit, or working on something that takes a lot of resources from non-essentials and focus.
Unless they've got something wonderful up their sleeves, it seems like they really did become a 'lifestyle brand' instead of making the technically best stuff they can.
The release of the stupidly expensive gold watch was a watershed moment. Literally no functionality improvement over the base model and several hundred percent more expensive. A big long-term error in positioning and philosophy in my opinion.
In any case, the Watch 2 looks almost identical to the Watch 1, so from a design point of view, even that is stagnant.
"What should we read into the fact that R&D has more than doubled over the past three years while sales growth was sort of a fifth of that? Are R&D investments just less efficient than they were in the company's history, or should we think about that as incremental spend for products that haven't yet come to market?
Timothy Donald Cook - Apple, Inc.
There's clearly some amount of R&D that are on products that today are in the development phase that have not reached the market, and so that's a part of it. And we feel really great about the things that we've got. We've also put a lot of emphasis on our Services business as well and on making the ecosystem even better. And so we're very much, we're confidently investing in the future, and that's the reason you see the R&D spend increasing."
My super informal analysis of where Apple's priorities are:
However, maybe some of the hardware guys who used to work on the router are now busy with the watch. It would make sense that they would sacrifice the routers in favor of the watch (although I would prefer the routers).
Anyway, it just feels like there are way too many different models now. Are they trying to please the bean-counters and not the customer? What advantage is there in having so many different types of computing devices that really are very similar?
The worst part is that the old Macbook Pro was my go-to model previously, but the new one just feels like a more expensive step in the wrong direction.
I don't really want a touch bar (I use an external keyboard 75% of the time), and I do actually like to have the earphone jack, SD card, Magsafe, and traditional USB ports.
... So despite the messy list of models, there isn't really one that I want anymore.
Oh, well. Maybe it will make sense in a year or two.
Wifi though has always been a very big PITA for consumers, and Apple's hardware/software integration has always been a better bet for ease of configuration, and honestly, reliability.
The optimist in me hopes that maybe they'll have something better for us, or are making an acquisition to replace their current product lineup completely.
The pessimist in me thinks that maybe they're leaving this market to avoid needing to develop hardware that meets its publicly stated standards for protecting consumer privacy. Potentially they have been approached/mandated to enable some kind of backdoor in it, and they chose to stop producing it, rather than comply. /tinfoil_hat
People won't listen though because in 2016 it seems most people think wires are ritually unclean.
... they don't particularly want to lift floorboards, and drill through floors and ceilings if the wireless solution is 'good enough'.
One of the big things that people don't realize is that the bandwidth of wireless is split between every device on that channel. So if you are streaming to your TV/roku/xbox/etc over wifi, you are killing the bandwidth for your laptop and phone as well. Plus, if you live in a city your wireless bandwidth is pretty much guaranteed to suck, because every ISP includes wifi with their modems, raising the noise floor. Comcast modems are a particularly notable example, as they have a "public" 2.4 GHz network in addition to the "private" 2.4 and 5 GHz networks. There are no fewer than 30 wifi networks visible inside my house when I check on my phone, and I live in a row house with metal lath under the plaster, which greatly attenuates wifi signals.
I think the biggest barrier to wiring things is that patch cables are incredibly expensive if you buy them at Best Buy, etc. Most non-technical people I know only own the patch cable that came with their modem. Once I get them wired up, they are astounded by how much faster their internet is. They don't have buffering issues, the internet doesn't drop randomly, and it just seems "faster" (which I attribute to lower latency). I toss extra patch cables in my Monoprice orders now and sell them to my friends at-cost. To be honest, it's partially for my own sanity as well, because watching Netflix at 480p on a 50/15 connection drives me nuts.
Yes. Same here. Ethernet is an oddly underrated technology. And Monoprice is one of these things that nerds know about and no one else does. There may be other companies like that—someone above mentioned an enterprise router company that sells better stuff than most consumer companies. Yet I've already forgotten its name.
Ubiquiti also makes a great series of wireless APs that are worth checking out.
My network right now is a Edgerouter lite, 2 Unifi AP AC Lites, and a used Dell 2816 managed switch.
Don't do that. Instead get a ethernet powerline adapter:
This is like an engineer not knowing the mains voltage is 230V, or a baker not knowing about standard grades of flour.
Apparently I was grossly wrong, and I still don't understand why I was wrong and why the changes made such a large difference. Something to do with network protocols and response to packet loss or something, I guess.
I'm always posting latency numbers every programmer should know: https://gist.github.com/jboner/2841832
I had no idea that WiFi was that much better than Ethernet!
( this is why units matter, kids ;)
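On the units point, the usual trap is megabits per second (Mb/s, how network links are rated) versus megabytes per second (MB/s, how file transfers are usually displayed). A quick sketch of the conversion (the rates below are illustrative, not from any specific product):

```python
def mbps_to_MBps(megabits_per_second):
    """Convert megabits/s to megabytes/s (8 bits per byte)."""
    return megabits_per_second / 8

# Gigabit Ethernet: 1000 Mb/s is only 125 MB/s on the wire.
print(mbps_to_MBps(1000))  # 125.0

# So a WiFi transfer showing "30 MB/s" in a file manager is 240 Mb/s,
# which can look "faster" than "100 Mbps" Ethernet if you mix up units.
print(30 * 8)  # 240
```

Comparing a MB/s figure against a Mb/s figure makes the slower link look eight times better than it is, which is a plausible source of the "WiFi beats Ethernet" surprise above.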
This is also shitty news because I had very high confidence that an Apple router would not ship with any kind of malware or other crap I don't need. Now I have to find something better, comparable in price, that works well with a Mac household. Sad face.
That being said, the Apple APs are the fastest consumer-grade APs out there. If you are going to hit the theoretical maximum, an Apple AP is your best bet. I don't think streaming is where people notice issues with their internet though. In my experience people tend to notice issues in video games, skype, facetime, and other applications where there isn't a buffer to hide issues with dropped packets first.
However, I get ~950 mbps over the wire (on a 1 gbps connection), so I still plug in when I'm sitting at my desk and need to transmit large files.
I max out my cable modem's connection. I pay for 50 Mb/s, and 802.11n on a 5 GHz channel is more than sufficient.
At work however, yes, that gigabit wired ethernet connection is really nice. :)
I'm not thrilled about the autoupdate feature of Google WiFi. I wouldn't be surprised if it's leveraged to benefit Google's analytics, or for government snooping. But the greater risk at this point is from criminal hackers. And I prefer running my WiFi routers in bridging mode, anyhow, making it less likely Google will bother tracking my family's browsing from the LAN.
Maintaining my own homegrown router isn't appealing, though. For one thing, all the hardware available for that is sub-par. Solid support for MIMO and beamforming is mostly non-existent except for proprietary stacks; even simultaneous multi-channel is iffy using open source solutions. AirPort always shipped with better than average support for the latest tech, and I'm not keen on going backward in that regard, especially for what should be a performance upgrade.
https://support.google.com/wifi/answer/6246642 has a tl;dr version: "Importantly, the Google Wifi app and your Wifi points do not track the websites you visit or collect the content of any traffic on your network. However, your Wifi points does collect data such as Wi-Fi channel, signal strength, and device types that are relevant to optimize your Wi-Fi performance." and a full explanation of everything it does (or does not) collect, and why.
Apart from Apple's and Google's idiotic antics, what routers require anything more than a browser to configure?
As annoying as it is for some, the configuration interface for Apple AirPort used a simple SNMP interface. Now, lots of SNMP software has had bugs and remote exploits, but all things being equal it's a much simpler interface to export and requires shipping less code, not least because it doesn't have to support a GUI, directly or indirectly.
That said, I always configure my WiFi router (AirPort or w'ever) in bridging mode and put them behind my gateway. And I try to disable any web GUI entirely if at all possible. People who use the built-in WiFi capabilities of their Cable or DSL modems are begging to be hacked.
Not really, CGI can be written in anything, shell scripting, C, whatever. But security-wise, it doesn't much matter the language because the programmers working on the web interfaces of such embedded devices rarely do a good job.
Considering the amount of crappy C most routers have, that's probably the least of your concerns.
Good luck configuring that with a Mac...
However most apartments let you 'hang pictures'; anything that leaves so small of a hole that painting over it effectively patches it.
In my apartment, the biggest issue I face is interference from other wireless routers. Because of this, the powerline performance is about 40% faster on my desktop (right above the router one floor) and much more consistent. It also does help with wireless congestion, and really helps with using my desktop as a PLEX server. Before, higher bitrates would cause serious issues, and now it just works.
I'm not sure if the tech has matured recently or I just had a good use-case for it, but it's miles better than WiFi. I'm using TP-Link's product.
For testing, I plugged them into AC outlets along the same wall in the same room, with ~14 feet separating them. After getting them linked up (very simple), I connected laptops to each unit, manually assigned IP addresses to the laptops, and used iPerf to run throughput tests.
I'll spare you my rant about them but peak speeds were extremely disappointing, just shy of 50 Mbps. I honestly didn't expect anywhere near the advertised 1200 Mbps but I was expecting speeds relatively close to what I can get over my existing wireless. 50 Mbps may be quite acceptable to some, especially if one's Internet connection isn't that fast, but it was far from acceptable for me. They would also randomly "freeze" or "pause" for 20-30 seconds at a time but I didn't really investigate that issue.
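For anyone who wants to reproduce this kind of test without iPerf, the measurement itself is simple: blast bytes over a TCP connection and time it. A minimal sketch (loopback here just to demonstrate; swap in the real addresses of two machines on either side of the powerline link):

```python
import socket
import threading
import time

PAYLOAD = b"\x00" * 65536        # send in 64 KiB chunks
TOTAL_BYTES = 16 * 1024 * 1024   # 16 MiB total

def server(listener, result):
    # Accept one connection and count everything received.
    conn, _ = listener.accept()
    received = 0
    while True:
        chunk = conn.recv(65536)
        if not chunk:
            break
        received += len(chunk)
    conn.close()
    result["bytes"] = received

listener = socket.socket()
listener.bind(("127.0.0.1", 0))  # ephemeral port; use the LAN IP for a real test
listener.listen(1)
port = listener.getsockname()[1]

result = {}
t = threading.Thread(target=server, args=(listener, result))
t.start()

client = socket.socket()
client.connect(("127.0.0.1", port))
start = time.monotonic()
sent = 0
while sent < TOTAL_BYTES:
    client.sendall(PAYLOAD)
    sent += len(PAYLOAD)
client.close()
t.join()
elapsed = time.monotonic() - start
listener.close()

mbps = result["bytes"] * 8 / elapsed / 1e6
print(f"{mbps:.0f} Mb/s in {elapsed:.2f}s")
```

Over loopback this just measures your machine; across the powerline adapters it would show the ~50 Mb/s ceiling described above.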
Also the idea of willingly putting noise in the electrical grid seems sketchy to me. I used to live in a house where I could not get good noise-free audio because the line was not a pure sine wave and most consumer PSUs were crappy to let noise through.
With wood floors you can usually avoid much impact and just run cables under the floorboards.
Can I push 1 Gbps from my phone to my other phone? No... but I don't need to. My TVs and other devices work just fine too (none of them need more than 10Mbps to be usable, even with stable HD streaming).
The few things that do need bandwidth are on a wired connection giving them a hefty 1 Gbps that isn't interfered with by anything.
Maybe you have just had terrible routers?
Also, my phone company refuses to upgrade so I am on high-latency DSL. When a packet gets dropped in my local network, it takes a long time to get a replacement from internet based services.
You don't need high bandwidth to use the internet reliably, but you do need low packet loss, and the effects of packet loss and high latency multiply with each other -- local packet loss is much more painful if your internet is slow than if it is fast. Thus having a reliable local network helps experienced performance, even if you have a 2Mbps internet connection.
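The loss-times-latency interaction has a well-known back-of-envelope form, the Mathis et al. approximation for steady-state TCP throughput: rate ≈ (MSS/RTT) · (C/√p). The RTT figures below are made up but plausible; the point is the ratio:

```python
import math

def tcp_throughput_mbps(mss_bytes, rtt_s, loss_rate):
    """Mathis et al. approximation of steady-state TCP throughput:
    rate ≈ (MSS / RTT) * (C / sqrt(p)), with C ≈ 1.22 for standard TCP."""
    return (mss_bytes * 8 / rtt_s) * (1.22 / math.sqrt(loss_rate)) / 1e6

# Same 1% local packet loss, two different last miles:
fast_cable = tcp_throughput_mbps(1460, 0.020, 0.01)  # 20 ms RTT
slow_dsl   = tcp_throughput_mbps(1460, 0.120, 0.01)  # 120 ms RTT

print(f"cable: {fast_cable:.1f} Mb/s, DSL: {slow_dsl:.1f} Mb/s")
# The identical loss rate costs the high-latency DSL link 6x the
# throughput, which is the multiplication described above.
```

In other words, fixing local packet loss buys a lot more on a slow, high-latency connection than on a fast one.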
There may be a unit conversion problem here, but an apartment as small as 500 sq ft would be exceedingly rare - that's ~46.5 m².
Many cities don't even allow apartments that small...
Studios are usually around 400 sq ft.
I serve my 1200 sq ft condo with a single node, and it gets good coverage throughout the space, including the front/back patios.
My parent's 2400 sq ft house took 2 nodes in the attic + one in the back of the garage to cover the back yard.
They can hit their ISP's 150mbit/second cap from anywhere in the house or yard.
They use tablets almost exclusively and though dad still uses a laptop from time to time, he has no desire to plug it into ethernet, so a wired network wouldn't really be useful for them.
If I crunched the numbers correctly, roughly 2.5% of the US population lives in NYC alone. If you add in Boston, DC, San Francisco and other expensive cities I'd bet you can break 10% of the population in the core expensive cities alone.
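The arithmetic checks out with round mid-2010s figures (these population numbers are my assumptions, not official statistics):

```python
# Back-of-envelope check of the "~2.5% of the US lives in NYC" claim.
nyc_population = 8.5e6   # roughly NYC's 2016 population
us_population = 323e6    # roughly the 2016 US population

nyc_share = nyc_population / us_population
print(f"{nyc_share:.1%}")  # 2.6%
```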
80% of Americans live in an urban area.
I suspect that your biggest problem is really just your terrible ISP. I've had that problem before too, but my current one is great (and local).
What's wrong with expecting WIFI to work well enough to enjoy low latency high definition/4k content?
You can totally do that in rural areas. I'm talking at least 1/4 mile separation between houses.
If you're not in that kind of area and you still expect it, it's because you don't understand the physics. I don't mean that as an insult, just a statement of fact. I went to school for electrical engineering and that statement you made above is laughable for anyone who knows the trade. To me, a similar statement might be "Why can't I have a car that makes 1000 HP, gets 100 MPG and costs $10k?" Yes someday that might be a reality but with the tech we have now, it's definitely not.
Bandwidth over the air is fixed and it's split between everyone who is within "earshot" and similar to how it eventually gets impossible to hear anyone at a party once the room is packed and everyone is talking, so it goes with wireless. There's just not enough spectrum to go around. That's why people use wires; every wire has (roughly) a whole spectrum all to itself. Pull 100 wires? Get 100x bandwidth. Put up 100 wireless transmitters? Get 1/100 the bandwidth on each one. The math there heavily, heavily favors wires. Ethernet, coax, fiber, whatever. If you make a new signal propagation domain you can use as much of it as it'll allow you. If you use the big signal propagation domain that everyone has access to, prepare to share.
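To put toy numbers on that wires-vs-air comparison (the 1 Gb/s channel capacity is illustrative, not from any spec):

```python
CHANNEL_CAPACITY_MBPS = 1000  # pretend each medium carries 1 Gb/s

def wired_per_link(n_links):
    # Each cable is its own independent signal domain.
    return CHANNEL_CAPACITY_MBPS

def wireless_per_transmitter(n_transmitters):
    # Everyone within "earshot" splits one shared channel.
    return CHANNEL_CAPACITY_MBPS / n_transmitters

for n in (1, 10, 100):
    print(n, wired_per_link(n), wireless_per_transmitter(n))
# 100 wires: still 1000 Mb/s each. 100 radios in range of
# each other: 10 Mb/s each.
```

This ignores real-world details like multiple non-overlapping channels and spatial reuse, but the scaling direction is the point.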
No, you're just using subjective feelings rather than objective measurements. 4K doesn't specify a particular bit rate. If there is going to be a productive discussion, it needs to be acknowledged that there is a huge difference between streaming 1080p and Bluray 1080p.
Also, I have gotten wifi working well in non-rural areas using Ubiquiti UniFi AC Lite APs. In suburban New York, a couple of them blanket my home. In my grandmother's Shanghai apartment, in a building that is a mix of concrete and plaster, just one is sufficient (although for a much smaller volume), despite dozens of nearby APs. Admittedly, I moved to 5GHz there because I was literally the only one using it, but 2.4GHz had worked fine when I tried it. Peak throughput is higher on 5GHz due to wider channel widths, though.
OK, now take your dog to a place full of dogs and other dog whistles. See how well it works.
Once the spectrum gets noisy, interference happens. Just because you found a quiet spot in the spectrum doesn't mean that spectrum sharing isn't a physical reality anymore. It just means you found some to keep all to yourself! Until someone else starts using 5GHz too.
In fairness, I had made tweaks to iwlwifi's kernel module options in order to have usable wifi on my laptop. Until a year ago, I had a hellish experience in congested areas; it went away when I turned off Bluetooth coexistence support in the iwlwifi driver. That feature would almost always cause severe packet loss on my T520 in crowded environments, even when there did not appear to be any Bluetooth traffic.
Anyway, it is possible to get wifi working well for certain workloads in crowded environments. Making it work well in general might require better drivers and better equipment than one might have at first though. For example, I understand that ath9k was a disaster when it was first made, but it is fairly decent now. I ran an AP off an ath9k USB wifi dongle in China at one point and it worked well too. I doubt that would have worked as well with drivers from 5 years ago.
Also, from what I understand, a used Ruckus Zoneflex 7982 off eBay ought to be able to handle just about any environment fairly well. Their proprietary beam forming hardware is special because it attenuates signals coming from other directions than that of the client. The only exception from what I have read is when other wifi equipment is right next to it (e.g. practically touching). Some review of it said that it failed to work until they moved other equipment away after consulting Ruckus. I cannot find a link to it though.
I placed an order with an eBay merchant that I expect to receive soon. $90 per used ZoneFlex 7982 (which had a $1099 MSRP when new) is a bargain considering that these still provide some of the best wifi in the world. There are reports of these getting good throughput through multiple concrete walls:
A newer version of the midrange model in their product line (which has inferior radio specifications to the top-end model) was able to give a cell phone decent throughput from 225 yards away when the (omnidirectional) AP was indoors:
As far as I know, the only better access points are the newer Ruckus models that replaced it. I am really excited to be getting one later this week. :)
If you meant to imply that I am somehow paid to say such things, I assure you that I am not. I am a fairly well known OSS developer and such a thing would be damaging to my reputation. I just happen to be genuinely excited about this topic for the first time in years.
I just returned from visiting my grandmother in China, who is in the hospital. I stayed at the concrete and plaster apartment complex where my grandmother's apartment is. The horrible penetration of wifi signals there made me think plenty about wifi, and that eventually led me to discover that Ruckus's old equipment works far better and is selling used on eBay at price points seen in consumer-grade gear. There really is not much benefit to Ruckus from the free advertisement of me talking about used Ruckus equipment on eBay. They are not making any money off the used hardware market, while the pricing for new Ruckus equipment is so high that it might as well not exist as far as the majority of people are concerned.
You managed to achieve good network performance by using your specialized skills.
Most people don't have the skills and aren't interested in learning them. This is the market for Apple's APs. The interesting description of what you did to get stuff working is close to meaningless to most people, and hence by contrast makes the trusted-brand easy-peasy gear look even more valuable.
I've wrestled with wireless gear since before 802.11 existed, including writing my own radio firmware, and I have an Airport at home which in practice means I don't need to do any of that awful stuff any more. In my lab we throw away about half the APs we buy due to mystery incompatibilities. Radio is still hard compared to most other things.
It would do strange things to cellular and broadcast radio, but it wouldn't destroy them, I think.
The "real" solution is for people in dense apartment buildings to just turn their damn Tx power down, but that's a prisoner's dilemma if I've ever seen one.
So with the craptastic firmware of most consumer wifi APs, the AP lands on channel 1, 6, or 11, stays there, and screams at full power. And the consumer, even if they know enough about radio to realize that's suboptimal, can't do anything about it without changing the firmware or buying a more expensive device.
So, people on the same channel silence each other (CTS/RTS), but on other channels it is simply seen as noise. If there are lots of people on 1/6/11, you're going to spend a lot of time silenced.
It's one of those things that you just have to try out in your own particular circumstances. Will you get better performance if frequently silenced, or better performance dealing with a high level of 'noise'?
In the case where the building is served by one major ISP, and they provide combination gateway+APs, I could see the ISP deciding to "solve" the problem by turning the whole building into one AP mesh (with only one AP actually active per unit sphere) where each client gets a VLAN with tagged QoSing over the last-mile "backplane" back to their own gateway.
Though, if you're going that far, you may as well just call yourself a WiMAX provider and tell people their gateways are just extension picocells for the network, rather than end-user equipment. Which is basically what you get already from the ISPs that offer "public wi-fi for customers at our hotspots" (i.e. broadcast by all APs on our business plan), except that said wi-fi would be the only product, and would be served by residential gateways as well.
Lots of people are complaining about their WiFi and many are discovering that a $200 consumer router doesn't help. What does work is to accept the physics, get a wired router and a $80 ubiquiti access point and run a wire so the access point is in the center of your house.
Let's put it this way: the inverse square law means that if you put the router at the edge of your house instead of the center, your minimum signal strength drops by a factor of 4; this is on top of any attenuation you get from going through multiple walls.
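The factor of 4 falls straight out of the free-space inverse-square relation: moving the router from the center to the edge doubles the worst-case distance, and received power goes as 1/d². A quick sketch (the 20 m house width is an arbitrary illustration):

```python
def relative_power(distance):
    # Free-space path loss: received power falls as 1/d^2.
    return 1.0 / distance**2

w = 20.0  # house width in meters (illustrative)

# Router in the center: farthest room is w/2 away.
center_worst = relative_power(w / 2)
# Router at one edge: farthest room is the full width w away.
edge_worst = relative_power(w)

print(center_worst / edge_worst)  # 4.0
```

Doubling the distance quarters the power, regardless of the actual house size, which is why siting the AP centrally matters more than which AP you buy.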
Accept the laws of physics or you'll find that no amount of spending on your access point will make your system completely reliable for a wide range of devices.
This is why all attempts to sell differentiated WiFi routers have failed -- the one way you can differentiate yourself in terms of reliability, never mind performance, is to site your access point correctly.
Everybody thinks the mesh network fairy will save them, but there is a name for mesh networks, and it is "radio interference".
This is complete nonsense. 802.11N / 802.11AC are perfectly fine for your streaming needs. No need to try and rip up the walls or run ugly cables all over the place. Wireless in 2016 just works for streaming like this.
Unless you're trying to drive some 8K video but even then AC should still be good. I know over 802.11N, on an older generation of FireTV, separate by 1 story and multiple walls our TV streams fast and does native playback on the FireTV for most of our videos at its original quality settings.
So, Apple TV? It's all download, so wifi is fine. XBox One? About half and half, so that gets a wire. Printer? All download and barely any bandwidth needed anyway, wifi. Desktop PC? Wired, of course.
I dunno where you're seeing this hostility to wired networks, the only hostility I've seen to them are from people who have legitimate reasons to not want to do it (or can't) i.e. renters, people who are not particularly handy, or people who don't have those high demand needs. For your average Joe running a phone, maybe a stream box and browsing Facebook on a laptop, a $40 wifi router will handle that just fine.
It is a given that wired is a better option, but I cannot imagine any modern SOHO environment without a wireless router/access point.
Mesh networks are just going from the frying pan to the fire.
I am not saying that people "don't need wireless" but that if having a reliable WiFi network matters to you, the best way to speed up your WiFi network is to move anything that you can possible move to wired to free up bandwidth for WiFi.
That includes turning off the rogue access points that are created by printers, game consoles, phones, etc.
I keep what devices I can on wired, but it's really not feasible for everyone or for all devices. I sometimes have to switch my phone over to LTE because the wireless sucks too much. There's obviously no ethernet port on that.
or maybe i'll just nail a bunch of cables up around the corners of the ceiling, that'll be gorgeous
it'll be great, it'll be real convenient to do all that again when i move
Who says it has to be ugly? Get creative and make some art out of it .
> it'll be great, it'll be real convenient to do all that again when i move
This strikes me as a bogus complaint; there's nothing convenient about moving, IMO. The inconvenience of it rises with the amount of stuff you have. If you already have devices that it makes sense to wire them up, the inconvenience of setting up and tearing down cable runs when moving is secondary to having to move that stuff in and out of an apartment to begin with.
$200 is reasonable, compared to all the time I've spent farting with other brands of WiFi routers, including whatever piece of crap my ISP sent me. Now what's the no-brainer WiFi router choice? ASUS something, Ubiquity (sp?)? Read some reviews, order what seems the best, marvel at the weird new mutant feature set... Oops, that's another afternoon wasted.
Oh darn, my offbrand WiFi router had another security exploit, 6 months ago, and I just found out. And so forth.
Hopefully the silver lining is Apple is anticipating metro WiFi (WiMax?), though I wish they were helping drive that transition. Like I had hoped Google was... Oh well.
The speed and coverage is also fantastic from my experience.
Ubiquiti. Or in general: stay away from consumer-grade network equipment.
Airport is amazing for consistent throughput and reliability.
Ars Technica's article basically says good riddance, but if you go back to their original review, it observed that the AirPort Extreme crushed all competing routers on both ease of use and performance.
Consumer-grade routers usually are faster. What they are not is more reliable, or easy to use/manage.
I may look into the new google APs, so I can get the latest wireless speeds across the house.
Meh. I worked at AppleCare for about 1.5 years, and when I got calls regarding issues with MacBooks, I'd say 99% of the time the internet connectivity issue was with the AirPort Extreme and it had to be hard reset (I walked people through that process so often back in the day that I could essentially recite it in my sleep).
That's only an anecdote but consumer grade networking, including Apple's, just always seems to suck. Nowadays I try to spend a little more and get some sort of small business type of router.
So, in my mind, their customers are not missing out on much. Though I'll admit the app for setting it up was nicer than browsing to some IP address like many routers at the time.
That's the real reason Apple is getting out of consumer WiFi routers: for most consumers it is an optional aftermarket accessory that makes things more complicated, not less.
Even my crappy DSL modem is a WiFi router. To attach my preferred router I had to login and set it to bridge mode. Why would Apple want to sell a product that requires people to do that?
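Bridge mode matters because without it you end up double-NATed: the ISP box and your own router each translate addresses, which breaks port forwarding, UPnP, and some gaming/VoIP setups. A quick tell is the WAN address your own router reports: if it falls in private or carrier-grade-NAT space, another NAT layer sits upstream. A small Python sketch (the function name is my own invention):

```python
import ipaddress

# Carrier-grade NAT range (RFC 6598), used by ISPs that NAT their customers.
CGN = ipaddress.ip_network("100.64.0.0/10")

def looks_double_natted(wan_ip: str) -> bool:
    """True if the router's WAN address is private or CGN space,
    i.e. another NAT device (such as an ISP modem not in bridge
    mode) almost certainly sits upstream of this router."""
    addr = ipaddress.ip_address(wan_ip)
    return addr.is_private or addr in CGN

print(looks_double_natted("192.168.0.2"))  # ISP box not bridged -> True
print(looks_double_natted("100.72.1.9"))   # carrier-grade NAT   -> True
print(looks_double_natted("8.8.8.8"))      # real public address -> False
```

If the check comes back True and you need inbound connectivity, bridge mode (or a DMZ setting) on the ISP box is the usual fix.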
The best part of working on an internet issue was that the second it looked like it could be a problem with the cable modem, we had to get off the call, even if it's likely that turning it off and on would have fixed it. We were told to have the customer call their ISP and to refuse to offer even ideas, let alone walk them through something.
Which, I get, you don't want to be diagnosing third-party hardware. But I'd say about half the time, by the point someone made it to me they had already called both AppleCare AND their ISP at least 1-2 times, with each side continuing to blame the other.
Granted, just an anecdote with a sample size of 1, but after that experience there's no way I would ever consider buying another Apple router. Especially since my previous router (a Linksys) was still chugging along after like 7 years.
Also: don't most people get their wireless service from their cable router now?
I don't think Apple has a comparative advantage in providing wireless routers. That's not to say that there's no advantage to be found in producing excellent wireless routers: there are companies trying to do this today, but Apple isn't one of them.
I think this is the key part: thinking about my friends and family, over the last few years everyone has switched to using the box which came from their ISP except for a few IT people and gamers.
This is generally a good thing since the ISPs are significantly more likely to install security updates, and WiFi speeds exceeded the average home internet connection somewhere around a decade ago so the benefits to buying your own are fairly limited for most people.
Depends where you live. There are ISPs that will get you 250 Mbit downstream via FTTH, but whose WiFi supports only 802.11n on 2.4 GHz. When your neighbors get a similar setup, you will not get anything over WiFi that approaches your Internet speed.
People think they need repeaters, cable installation, all kind of expensive and worrying projects... nope, just use a real wireless router instead of the Comcast box.
Yes they do, but they shouldn't.
The problem is that you place your all-in-one cable-modem-router-AP at the point where the cable enters your house. This is unlikely to be the ideal position to place an access point.
For example, my fiber connection enters my home at one of the corners of the house. You want your AP in a central location.
Are you thinking of other issues besides signal strength?
Apple is nudging consumer behaviour with each product transition - I wouldn't be entirely surprised if we start seeing more devices with mobile data built in (say, with an antenna behind the logo or behind the Touch Bar).
I'm a network engineer and use pretty much "enterprise" gear across the board at home. A couple of years ago, though, I yanked out my Cisco ASA firewall (connecting my home network to my ISP) and replaced it. I'm now using an (expensive) "RouterMaxx 1106" [1] as my home router. It was designed for and shipped with Mikrotik RouterOS, but I replaced that with pfSense initially and then switched later to OpenBSD. It Just Works(TM).
The hardware, for the most part, doesn't matter too much. The software you choose is much more important, in my opinion. Choose something that can be tuned and locked down as needed and, critically, gets updated regularly. I'm a "CLI guy" and quite comfortable with OpenBSD and a terminal so this setup works for me. For those desiring a point-and-click web-based method of administration, pfSense, OpenWRT, and similar may better serve their needs.
n.b.: FWIW, my router doesn't provide Wi-Fi, only (6 x 1000 Mbit) wired Ethernet. I have been using an Aerohive AP330 access point but am currently replacing it with a Ubiquiti Unifi AP AC PRO.
[1]: http://www.balticnetworks.com/docs/routermaxx%206%20port.pdf (PDF)
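For the OpenBSD route, the core of a NAT gateway configuration is short. A minimal /etc/pf.conf sketch, assuming em0 faces the ISP and em1 the LAN (interface names are assumptions and will differ on your hardware):

```
# /etc/pf.conf -- minimal home NAT gateway sketch
ext_if = "em0"        # WAN, towards the ISP (assumed name)
int_if = "em1"        # LAN (assumed name)

set skip on lo        # don't filter loopback

# Translate LAN source addresses to the WAN address on the way out
match out on $ext_if inet from $int_if:network nat-to ($ext_if)

block in log          # default deny inbound
pass out quick        # allow all outbound traffic
pass in on $int_if    # allow traffic from the LAN
```

You would also enable forwarding (`sysctl net.inet.ip.forwarding=1`) and run dhcpd on the LAN side; this sketch covers only the firewall/NAT piece.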
Option 1: look and see what all the DDWRT/tomato/etc... guys are suggesting. Chances are the hardware is decent, and you know it will have support for the good 3rd party firmwares if you ever want to try it.
Option 2: Abandon all consumer-grade gear. Just get Ubiquiti stuff.
Option 3: Google just released their routers (like last week), which support automatic mesh networking when you use multiple. Way too early to make a call, but it's something to keep an eye on.
Asus was really nice for a while, one of the more open manufacturers; my AC3200 unfortunately came out after the lockdown and is unsupported... would love to get back to tomato-usb or similar.
I have owned three consumer routers in the last three years, all of which required a reboot every few days in order to stay responsive. My problem with consumer routers isn't speed, ease of setup, or the need for mesh networking, but reliability.
I currently have a rock-solid Mikrotik hAP ac lite, which has not been rebooted once since I installed it in February. However, it only has an internal antenna, and devices in another room are constantly dropping out, so I'm considering something else, if I can retain the reliability.
If you don't trust your ISP-provided firewall/router then you probably have bigger problems. Both big ISPs in my area offer to turn the router into a bridge-mode device I can plug into a box running pfSense if I desire, but frankly, that sounds like a PITA. Their routers work well enough. I think going forward we will just buy access points, not routers. The router problem has been solved for a long time.
>Abandon all consumer-grade gear. Just get Ubiquiti stuff.
I'd argue Ubiquiti is really consumer stuff, just small-business friendly. Their performance, price, stability, etc. aren't that much better than a Netgear in access point mode. They just have a Java-based controller, which is handy for managing multiple APs. Inside, they're kinda crappy.
>Google just released their routers (like last week), which support automatic mesh networking when you use multiple.
Google Wifi is actually a latecomer. You can get more residential-friendly, Ubiquiti-style solutions via Orbi or Eero, which is what Google is copying here. I believe Google's 3-pack beats both on price currently.
edit: modems and routers are completely different cases/devices and performance/stability are different than money saving. Regardless, my ISP doesn't charge me for my modem, so there's no incentive for me to buy my own.
As for Ubiquiti being consumer grade, it is like an Intel Xeon E3: the same chip that is used in consumer stuff, with enhancements for reliability. For instance, Ubiquiti equipment supports grounding via STP Ethernet. It generally supports some variant of PoE and is meant to be mounted on walls and ceilings. There are even outdoor models. It also does not require a reboot on every change. In addition, there is a team of engineers working on making the firmware better, and their customers are able to report problems to them. Then there are firmware features like VLANs, policy-based routing, BGP, ping watchdogs, and other things not found in the stock firmware of consumer-grade equipment.
Comcast does this as well - if you are leasing the Comcast modem then you are putting out a pair of APs. One of them is a public Comcast hotspot. I think you can request that it be turned off however.
Right now I have a Cisco DPC3008, which I can't really complain about. I have a Buffalo Airstation Extreme 1900, which has been good so far apart from its minute-long reboot times. I thought I was getting a DD-WRT router though, didn't realize the standard version did not have a supported DD-WRT build.
I have to admit though - when a contractor ripped out the sidewalks and took out my cable, I was glad that my neighbors were running the hotspot. Took almost 3 weeks to get back online.
In Germany, Kabel Deutschland can and does regularly update your cable modem remotely. There's even a list of "censored websites" pushed down to your own router. They provide a Fritzbox, but it's slightly more locked down (feature-wise) than normal, so you can't even use it to its full potential.
I bought an Airport and stuck it behind it to at least isolate our home network from this backdoored mess. Good to know that Ubiquiti provides a reasonable replacement.
Bought my own modem + router and saw my download speed go from ~40 Mbps to 120+ Mbps. Upload speed saw a similar-scale increase.
I mean really, this is the kind of thing you can/should be able to log into from a web page, why does this all have to be locked behind an app?
I did enjoy using Apple's wireless routers, though.
But I do concur - I will miss the ease of the Airport tech (not so much the most recent prices though).
There is no need for Apple to be in this market, there is no problem to be solved for 99.9% of people.
Usually, you don't even want to remove the PoS router you get from your ISP from its box. In my experience no ISP ever gives away decent equipment; it's just the cheapest thing they could order in quantity.
Another issue with an ISP-issued router is that you have no idea what they did and can do with it. I know of several ISPs that have full remote access to the routers they hand out. Basically, you risk exposing your entire home network.
There have been some guides on bypassing the router and going straight to the ONT, but it is extremely non-trivial.
I hear it's a challenge to avoid the standard Verizon router if you also have TV or phone service through Fios.
I also confirmed this a while back on some forums but perhaps they have changed this. It is good to hear you had success. Do you have a residential or biz plan?
Here is the link (I was mistaken and thought it was on a forum): http://theassociatespress.com/bypass-fios-router-entirely/
Like I said before it is pretty non trivial and you have to plead your case to them to let it happen.
My own household used its own router with the Verizon router configured as a MoCA bridge almost exclusively from 2006 until 2015 when we cancelled FiOS TV in favor of an OTA TV antenna. There were a few brief periods of time where I tried having Verizon's router be the gateway router, but it turned out to be awful.
Expect to hear the CSR make vague references to advanced features being absent when saying that you want to use your own router over their advice to rent theirs. The only "advanced feature" that would be absent is Verizon's remote management backdoor.
Of course there is a need. Apple has always been one of the first to adopt new WiFi standards, and the Airports made those transfer speeds non-theoretical. My three-year-old Airport still beats my 2016 cable modem royally on 802.11ac. And this is not one of those cheapo cable modems (it's basically the latest and greatest Fritz!Box).
A large chunk of Apple users that I know have an Airport or Time Capsule for these and other reasons.
Experience tells me this is not actually the case. ISP-provided routers are a major PITA if you want them secure and stable.
I also had a router from another UK ISP that would crash and hang if you sent it a malformed IP packet (i.e. one with its reply-to address set to the router itself); it'd go mental replying to itself and then processing the messages it had sent to itself. Not quality.
The EE broadband router I had would hand off its internal web admin session to external visitors if they connected whilst a web admin session was in progress. Yes, let's hand our internal web admin session page to an external visitor....
BT sent another hub that looked identical and worked correctly; underneath it was a Siemens instead of a Thompson I think.
So all in all, I have found that the ISP boxes were never really finished, and some (like Sky's box) needed rebooting periodically as I recall, so they were buggy too.
to be fair, I've got $1000 Cisco hardware that can't hairpin traffic back in the same interface like that without explicit configuration:

    same-security-traffic permit intra-interface
    static (inside,inside) ...
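Those two commands are the core of the classic pre-8.3 ASA hairpin-NAT recipe, which lets inside hosts reach an inside server via its public address. A fuller sketch, with made-up example addresses (203.0.113.10 public, 192.168.1.10 internal):

```
! allow traffic to enter and leave the same interface
same-security-traffic permit intra-interface
! translate the public address back to the internal host
! for clients on the inside (addresses are examples)
static (inside,inside) 203.0.113.10 192.168.1.10 netmask 255.255.255.255
! PAT inside-to-inside source addresses so return traffic
! flows back through the ASA rather than directly
nat (inside) 1 192.168.1.0 255.255.255.0
global (inside) 1 interface
```

On ASA 8.3 and later the `static`/`nat`/`global` syntax was replaced by object NAT, so this exact form only applies to older images.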
Time Warner charges $10/month for theirs, and they're awful.
Hopefully, in the near future we can have this done with libre hardware. Cheap POWER stuff would be great.