iPad Pro (apple.com)
480 points by tambourine_man 3 months ago | 783 comments



To me, the shift to USB-C signifies a fundamental change in how Apple is approaching the iPad and computing going forward. Those stupid "what's a computer" ads make a little more sense. With USB-C, developers will support whatever I/O standards they need to in iOS, and the ecosystem of monitors, keyboards, and peripherals we all bought for our laptops/desktops will now begin to work with iOS too. This feature alone grows iOS into a much more powerful OS.

By this logic, it's pretty clear Apple is killing macOS's x86 dependency (and macOS itself) in two ways. 1. Shifting compute-intensive workloads to A-series chips. This started with the small stuff around "security" and Apple Pay, but today the video encode/decode functionality in the T2 is a huge leap. This will extend into graphics next year (thank god, cause fuck Intel's integrated graphics) with APIs like Metal, and going forward to almost everything outside the CPU: I/O, display, and more. 2. The iPad: offering a compelling device that has all the features and capability of laptops without two decades of baggage.

We've all been so worried about the convergence of iOS and macOS, but after today I don't see a macOS future. Sure, it will move to Apple's ARM chips, and it will last another decade, but the shovel is out of the shed; it's called the T2.

(Jailbreakers, we need you)


For me, the biggest pain of lacking access to the underlying OS has turned out to be not any of my thousands of tiny gripes with software behavior I can't change (e.g. setting DNS on the cell connection), but the lack of transparency and ability to debug basic processes. What is the computer doing now? Why is it hanging? Why did that app crash? Why is my network failing? Who is that app talking to? My main fear is not, in the end, that I won't control or own my device (though I do feel this), but that it will turn computing from a transparent network of understandable processes (which I find beautiful, and which drew me deep into programming) into a Kafkaesque nightmare of inexplicably bad software and automated computer support, not just from Apple, but from all the apps in the market they created.

I am absolutely sure that they will port Xcode to the iPad itself and allow visibility into my own apps. This is not the same as a workstation.

EDIT: in retrospect, this is only tangentially related to the parent post; my apologies. Also: wording, punctuation.


The reason Apple cares so much more about iOS than macOS is vastly increased control. With iOS, they control the hardware from the chips up, the OS, which apps you can run, and, via Safari-only access to the internet, even what you can and can't do using external services.

It makes computing & communications an Apple theme park, where you can't bring anything forbidden into the park, have to buy everything you need from Apple concessionaires, and can't do anything that they haven't planned for you to do. The people in the "park" pay to enter and then become a new type of domesticated herd to be milked.

You can't allow people to bring heavy equipment and power tools into a theme park, but you also can't build the park's controlled features without them, so how do you give power to authorized builders while keeping it away from users? In a physical theme park, they have controlled users during the day and empowered builders at night.

But in Appleworld, I'm thinking they'll keep iOS locked down by limiting the building tools to macOS.

A Unix-style workstation that gives maximum power to its users is so antithetical to everything Apple stands for that macOS would never be created today. But since it already exists, they can take advantage of it to make iOS even more locked down than it would have been. (Hence refusing to let any other company use macOS, and not porting Xcode anywhere else.) You want to see what your computer is doing (file system, CPU processes, memory, network, etc.)? Do it with a Mac. iOS is not YOUR computer. It's Apple's. You only paid for admission to iOS, and you aren't allowed behind the locked doors. If you want control, you're in the wrong place.

I'm afraid that if they ever figure out how to sandbox an iOS partition of some sort to allow builder tools to run on iOS itself in a totally controlled way, then the Mac is toast, but the Mac gives them time. I think that for the foreseeable future, they won't risk any accidental empowerment of iOS users and will limit the power tools to the Mac.


The reason Apple cares so much more about iOS than macOS is easily discerned from its earnings reports.


There are lots of tools you can use to develop iOS apps other than Xcode: PhoneGap/Cordova/OutSystems, Visual Studio, Xojo, Mendix, LiveCode, et al.

At the end of the day, Apple, like any other corporation, does what it does because it makes money for them. The alternative is to use a "free" operating system like Android and subject yourself to constant surveillance. It's like that scene in The Big Short: "tell me how you're f*ing me." At least with Apple, you know straight up what you're getting yourself into. Your relationship with Apple ends when you stop using their devices. Not so with Google/Facebook, where your data lives on in perpetuity, used for purposes beyond your control.

Now I'm not saying it's not possible to have your cake and eat it too. I'm saying, where is this mythical product, where a user is free to do whatever he wants with his device? What's the market size? How come no one has built it yet?


It's called a "hand-built Linux desktop."


Typing this on a netbook, the only surviving Linux desktop I still use.

For some reason, most Linux conferences end up being about file systems, containers, device drivers and whatnot, and seldom about desktop development.


Because conferences require money to run, and the kinds of people who back conferences tend to only run Linux on their servers, not their workstations. Enterprise Linux in general is happy enough being in the data center, and could have made harder pushes for the desktop for years now, but haven't.


Basically, what Apple and Microsoft have figured out is that it suffices to offer a POSIX CLI to UNIX "desktop" users.


(Apple user disclaimer) Apple isn’t really offering just a CLI; it exposes the UNIX on which it is built. My understanding of Windows is that it just offers a runtime and abstraction layer.


Windows offers a complete Linux syscall personality (WSL).

Objective-C/Swift frameworks have nothing to do with UNIX; Apple could easily port them to another kernel architecture.

Likewise, the OS X driver model has nothing to do with UNIX, being based on a C++ subset; it was originally written in Objective-C back in the NeXTSTEP days.

The only UNIX GUI certified as such is Motif, which Apple certainly isn't offering on their products.

Nor is the audio stack in any way related to UNIX.


I never claimed macOS was UNIX, but Darwin is certified UNIX[1]. Opening Terminal in macOS gives you a real bash without any of the rest of macOS.

1. https://www.opengroup.org/openbrand/register/brand3555.htm


And access to Bell Labs teletype-style applications like a command shell, because everything else built on top isn't UNIX-related, just like NeXTSTEP.


Yes, and the market size for that is…?

We don't all grow our own crops, make our own clothes, brew our own beer, refine our own crude oil etc, even though we may have the knowledge to do so. Some things are better left to specialists, due to efficiencies/legal reasons. Surely we can agree on this?


> How come no one has built it yet?

People are greedy and have no values.


1. It's not Safari-only. It's WebKit-only, and there are legitimate security reasons for this.

2. The idea that UNIX is antithetical to what Apple stands for is just ridiculous. It is still at the core of iOS and macOS and is one of the areas where Apple continues to innovate.

3. Apple licensed Mac OS in the past. It nearly killed the company, since third parties like Power Computing went straight after their core base and did nothing to grow the ecosystem.

4. You can see what your iOS device is doing. Plenty of apps allow you to see what processes are running, file system behaviour, etc. Apple just doesn't build it in.

5. Anyone who thinks the Mac is dead needs to go have a lie down with some chamomile tea. Apple even today doubled down on the Mac with the new Air + Mini. And they will continue to grow and invest in the platform since content creation will always be largely done on a Mac.


Apple was always focused on their own platform; even Steve, while at NeXT, saw UNIX just as a way to lead developers to their hardware, but the real juice was in the NeXTSTEP Objective-C world, not POSIX.

What is happening now is that those who only discovered the Mac world after OS X, and don't care about Objective-C/Swift development, are feeling that their pretty-UI Linux replacement is no more.

Apple naturally cares about its Objective-C/Swift developers first.


There are no legitimate security reasons for WebKit-only. That makes zero sense. Chrome has hands-down beaten every other browser on security for years. Go read the CVEs. There is absolutely zero reason to believe that iOS WebKit has a better track record than WebKit at large. Therefore it's nearly certain your iOS device is less secure browsing the net with WebKit/Safari than if Apple allowed real Chrome to run there.


You misunderstand. Apple can fix WebKit bugs, but they can't force Google to fix Chrome bugs.

Also, for Apple to allow V8, they would have to permit 3rd-party unsigned executable code (JITed, in this case). Apple doesn't want to allow that.

This is not about CVEs; it's a meta-discussion about not outsourcing the security of the platform.
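
To make the "unsigned executable code" point concrete, here's a minimal sketch of what any JIT fundamentally does (plain POSIX C, x86-64 machine code only; the claim that iOS blocks the mprotect step for apps without Apple's dynamic-codesigning entitlement is the thread's assumption about policy, not something this code demonstrates):

    #include <stdio.h>
    #include <string.h>
    #include <sys/mman.h>

    int main(void) {
        /* x86-64 machine code for: mov eax, 42; ret */
        unsigned char code[] = {0xB8, 0x2A, 0x00, 0x00, 0x00, 0xC3};

        /* A JIT first allocates a writable page... */
        void *buf = mmap(NULL, 4096, PROT_READ | PROT_WRITE,
                         MAP_PRIVATE | MAP_ANON, -1, 0);
        if (buf == MAP_FAILED) { perror("mmap"); return 1; }
        memcpy(buf, code, sizeof code);

        /* ...then flips it to executable. This is the step a
           code-signing policy has to forbid: the page now holds
           runnable code that no one ever signed. */
        if (mprotect(buf, 4096, PROT_READ | PROT_EXEC) != 0) {
            perror("mprotect");
            return 1;
        }

        int (*fn)(void) = (int (*)(void))buf;
        printf("jitted function returned %d\n", fn());
        return 0;
    }

On an x86-64 Mac or Linux box this prints 42; the argument below is about whether refusing third-party processes that mprotect call is a security boundary or a control mechanism.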


Which are both control issues, not security issues.

There's no actual security reason for iOS to be locked to Apple's WebKit exclusively, and there's no actual security reason for iOS to not allow JIT'd code. Those are both control issues, not security ones. JIT in particular is purely a control issue: the process itself is already sandboxed, which is the one and only actual security boundary here. Preventing a JIT doesn't prevent arbitrary code execution, after all, especially if there's an interpreter in play, which Apple sort of allows.


Google has a track record of fixing Chrome bugs far, far faster than Apple fixes Safari bugs, so the point still stands. You'd be safer and more secure if allowed to run real Chrome on iOS than Safari, by every measurable metric. In other words, it's effectively provable that the restriction is not about security.

> Apple doesn't want to allow that.

Yes, that's all it's about: Apple's control. The other excuses people make up are demonstrably false.


> Also for Apple to allow v8, they would have to permit 3rd party unsigned executable code (JITed in this case). Apple doesn't want to allow that.

And how does that pertain to security? Android doesn't seem to have a problem with JITed code in sandboxed store apps. Neither does Win10.


Actually, UWP does not allow for JIT code, other than Chakra, as you should know.

Hence MDIL in WP 8.x and .NET Native on WP 10 onwards.


The WinRT sandbox has always allowed for JIT'd code, all the way back to the original Windows 8 release; all .NET Store apps back then were running on a JIT. .NET Native is a later addition that is there solely to improve performance, and it is still opt-in.

Now, Win8.x did not allow for third-party JIT compilers in the sandbox; it was only CLR or Chakra. But UWP does; look for the "codeGeneration" capability here:

https://docs.microsoft.com/en-us/windows/uwp/packaging/app-c...
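
For reference, a minimal sketch of what that declaration looks like in a UWP app's package manifest (abridged; check the linked page for the exact schema and surrounding package boilerplate, which is omitted here):

    <Capabilities>
      <!-- opts the app into APIs that generate executable code at runtime (JIT) -->
      <Capability Name="codeGeneration" />
    </Capabilities>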


WP 8.x did not JIT code on device, hence the whole MDIL and cloud compiler on the store.

WP 8.x only did dynamic linking at installation time and when OS updates were done, by replacing symbolic labels with the actual target destinations. Everything else was already compiled at the store and downloaded as binary into the devices. This was the whole point of MDIL.

There is a BUILD session and a further Channel 9 deep dive interview showing how MDIL deployment works on WP 8.x.

So Chakra was the only JIT in town.


Since the comment no longer allows editing, adding the references here:

"BUILD 2012, Deep Dive into the Kernel of .NET on Windows Phone 8"

https://channel9.msdn.com/Events/Build/2012/3-005

"Mani Ramaswamy and Peter Sollich: Inside Compiler in the Cloud and MDIL"

https://channel9.msdn.com/Shows/Going+Deep/Mani-Ramaswamy-an...


Oh, I meant Windows, not Windows Phone (for 8.x, that was still a big difference).

Either way, code generation is there today.


> You misunderstand. Apple can fix WebKit bugs, but they can't force Google to fix Chrome bugs.

If Apple could make their browser the de facto browser of tomorrow, the way Chrome is today, then Google would surely work harder to fix the pain points Apple identifies.


>4. You can see what your iOS device is doing. Plenty of apps allow you to see what processes are running, file system behaviour etc. Apple just doesn't build it in.

Do you have any examples? I've never seen anything like this in the app store.

Some apps give you limited access to their filesystem but (as a Jailbreaker) I'm pretty sure the larger system isn't viewable under normal circumstances.


Search for Lirum. Tons of apps like this.


I do find it odd to say content creation will be done on a desktop OS rather than a mobile OS, when it seems that more and more people are using their mobile OS as their primary means of creation and sharing. Maybe in a less powerful way, but mobile OS apps and hardware have really democratized content creation, and I don't see that trend reversing.


> "mobile OS apps and hardware have really democratized content creation"?

Democratized? Is that because you perceive mobile devices as more affordable? Have you checked the price of a new iPad Pro?


The majority of "new content" is being created with mobile OS apps today. Family photos, social media, etc. The fact that it's mostly all low quality junk is orthogonal to that reality.


Mobile systems have democratized content creation not because of price (though that too; not everyone buys the most expensive device) but because they are demonstrably easier to use for content creation than anything that came before them.

Source: personal experience with multiple over-65 people for whom an iPad or iPhone is the first internet-connected computer they've ever owned.


"Content creation" isn't usually referring to "family photos and social media". Besides, family photos are made with digital cameras, not mobile apps.

In the context of this thread, the target user of iPad Pro is more than someone taking pictures of their kids and pets on weekends.

As for my opinion on this, I believe in using the right tool for the job, AND using multiple tools to get the job done. Sometimes an iPad Pro might be fine for certain stages of a project, for sketching out backgrounds or initial ideas for a design, and then importing that into another system such as animation pipelines or other applications on desktop or laptop or workstations. Depends what the job is.

I've never met a professional designer who works only on their tablet. Professionals love their workstations, multiple screens, and all the comforts and power of a proper setup. If you're just marking up a PDF, I wouldn't count that as professional work!


> "Content creation" isn't usually referring to "family photos and social media".

I wish I agreed with you, but I can't help but see that as elitist condescension toward unsophisticated content.

> the target user of iPad Pro is more than someone taking pictures of their kids and pets on weekends.

I wish I agreed with you, but the iPad Pro's marketing as a tool for "serious professionals" is just an aspirational message; in reality, a large number of iPad Pro buyers will use them to take pictures of their kids and pets on weekends.

The same is true of DSLRs. They're supposed to be professional work tools, but the overwhelming sales numbers are to amateurs taking photos for trivial reasons. (That said, the DSLR market has now matured to the point where they do target the amateur audience.)


A decade ago, I never would have thought that Apple would be the ones mainstreaming Trusted Computing. (Well, the iPhone was out... maybe 12 years ago.) These were machines that supported the hacker ethos of tinkering and experimentation, even if the company involved was dedicated to abstracting more and more of the computer's guts away from the user.


You could crack the case and do stuff to an Apple ][. From day one the Mac has been hostile to this, right down to weird recessed screws.


>What is the computer doing now? Why is it hanging? Why did that app crash?

For what it's worth... On a computer, I'd look at log files. On an iPad, I'd plug it into my computer... then look at log files.


You can also go to Settings > Privacy > Analytics > Analytics Data and see some pretty specific logs on why things crashed or why they were slow. You can see if an app used too much CPU time, OOM kills (a.k.a. JetsamEvents, in iOS lingo), and explanations for app crashes (I can see that iOS killed an app for trying to write to the photos library without permission, for instance).


...and then realize the closed app you are using has none.


You have to keep in mind that Apple is going to dogfood any potential "iPad as workstation" product with its own developers. If their own developers jump out in rage then they'll fix it before release.


> If their own developers jump out in rage then they'll fix it before release.

That strikes me as very optimistic: the business needs come first.


Still, it's funny to see Google beat Apple to the punch here and make ChromeOS a valid dev machine for most people first. USB-C doesn't let me run Docker, VS Code or IntelliJ. I'd love to bring an iPad Pro for the weekend when I'm on call, but it's still nowhere near.


"for most people" -> "for most javascript devs" ?

totally ridiculous dev machine for scientific or financial computing, no?


Anything you can run on Linux, you can run on a Chromebook, without even having to do anything hacky these days, as I understand it.

So, nothing to do with JavaScript specifically.


I think it was a reference to the capability of the hardware.


I do a lot of scientific computing and data analysis development work, and most of the time during my dev cycle I'm working on tiny test data sets, developing and testing new approaches to certain problems. Most of that work could absolutely be done on something with the capability of the new iPad Pro. In fact, I do a fair bit of development on a mid-range Surface Pro 4, which has pretty similar specs to the new iPad Pro.


There's plenty of non-JS dev work that isn't scientific or financial. Since ChromeOS now runs Linux apps, any dev work is feasible; arguments about machine specs are about the hardware, not the OS.


Last time I checked only Google devices do.


Nope; plenty of other manufacturers' Chromebooks now run Linux. Google's are certainly among the higher-spec'ed though.


I would take an iPad Pro with the overpriced keyboard and an Apple mouse. I think if they can jailbreak iOS 12 to support a Bluetooth mouse... we might be onto something.


You can always use something like Amazon WorkSpaces or similar RDP services.

That's what I do when I need to do development on the go.


None of what I use a Mac for is available or practical on iOS: Atom, Git, bash, ssh, llvm, Python, R, PostgreSQL, CouchDB, Docker, nginx, Node.js, Homebrew, curl, wget, ffmpeg...

I’m typing this on an iPad now. I mainly use it for web surfing, Netflix, email and casual gaming.


All of that can be done in a remote VM. I use Coda + Shelly on an iPad pro and it's a pretty decent development experience. The real barrier is for (non-web) GUI development.


What's the point of having an A12X chip while running everything remotely?


The UI is absolutely smooth unlike any laptop!

Coda has a local server/browser pair you can use for simpler things. But usually the kind of web work I'm doing barely exercises a single core, it all runs on a $2.50/month VM.


Remote VM is an interesting idea, what about mouse support?


You can use a Swiftpoint mouse with the Jump Desktop app (that is my setup); it’s highly portable. Don’t get the Apple iPad keyboard though, no Escape key...


Out of that entire list, the only thing that called for a mouse was Atom.

maybe we're returning to the age of terminal-only development ;)

but with vector font rendering!


Actually there were terminals with vector fonts in the '60s https://www.cca.org/vector/


When you're starting from a change in plugs and end up predicting shifts in CPU architecture ten years down the road, there's a pretty high chance it's just fitting noise into your pre-existing narrative.

FWIW: the death of the Mac has been predicted constantly since at least the original iPhone came out. If it takes another decade to happen, at least we can lay these accusations to rest that publicly traded companies are forced into short-termism.


I hope iOS 13 will have mouse support for the iPad Pro.

Only being able to touch the screen while using the iPad Pro in laptop mode feels really limiting.


It's not only the fact that a finger is so much less precise than a cursor, but also that UIs made for fat fingers take up a lot of space. Buttons are bigger; when dragging, you need to see what's going on under your finger; etc.

I also think that the lack of a desktop UI and a mouse is a huge limiting factor of the iPad Pro.


Also, sometimes you are using a UI not designed for touch (e.g., remote desktop).


At least AWS has solved that problem. The iPad WorkSpaces client supports a Bluetooth mouse. I wish it were supported system-wide.

https://docs.aws.amazon.com/workspaces/latest/userguide/amaz...


You can do mouse-like pointer selection via the on-screen keyboard: force-press on iOS devices that support it, or put two fingers down together on the keyboard, and your keyboard becomes a trackpad.

I'm not much of a mouse user, so this is adequate for me.


I don't know why this factual comment would be voted down. It is functionality built into the OS to produce mouse cursor motion.

Not like my life depends on a karma point; this is simply puzzling.


Because it’s a horrifically limited “mouselike cursor”: it’s entirely bound to inside a text entry field.

It achieves nothing of what people actually want mouse cursors for.


Okay, then say that. Don't downvote, or if you have to downvote, justify it then and there.

BTW, people use the mouse to select and change the cursor position all the time, so it's very relevant to the topic since it's a step toward desktop-like mouse support.


I always forget about it, but that is a great feature, especially since it's nearly impossible for me to modify the start of a URL in Safari since my finger bumps up against the edge of the case.


All that horsepower, no accurate steering wheel.

Is it a pro device if the choice to add a mouse isn't there? Many pros work with pixel-accurate selection instead of pencil or finger.


"Pixel-accurate" might be an obsolete term today when you have 9 or more pixels where previously there was 1. (I know, that isn't what you meant ;-)

See how AutoCAD has used a command line for ages to implement precise control over coordinates. You don't need a mouse for that. At the same time, the imprecise things that you use a mouse for (e.g. dropping a symbol from a library into a workspace, or connecting controls with outlets in an interface builder) are much easier to do using direct touch.


Obsolete or not, it was clear what I meant. :)

A mouse is quicker for a lot of selection and editing functions that touch doesn't come near. Touch has its own benefits.


Is pixel-level selection an issue when the UI immediately supports an arbitrary zoom level? (A pixel grows to the size of your palm.)


The iPad UI is designed around touch, so generally not in those cases.

Add a few select and very valuable uses (remote shell, remote access, desktop/graphic layout) and a mouse is much faster.


It's a "Pro" device because Apple used a flimsy keyboard that doesn't allow the possibility of actual lap-top use.


You can buy other keyboards... it's the specs that make it Pro.


Can you get a USB-C keyboard? BT keyboards are generally painful to use.


I don't see why a USB-C to USB-A adapter wouldn't work for a keyboard.

But yes, it does look like there's a product you can actually buy:

https://www.amazon.co.uk/Macally-UCKEYE-Keyboard-12-inch-Com...


Yup, Logitech makes a good keyboard.


I wonder when they're just going to start calling it "iOS".


Apple: we won't give you touchscreen laptops because touchscreens are garbage.

Also Apple: here's a laptop with only a touchscreen.

Fanboys are going to have a field day talking this one up. (Edit: it's already started, with workarounds for text input only. Let's see how far it will go :)


Touchscreens are garbage on a desktop-oriented OS like macOS. The giant-ass buttons and controls in Windows 10 are one of the things that make it so infuriating for me. I don't want big buttons sucking up screen real-estate everywhere when I have a mouse available for precision input.


No, they said they're unviable on laptops because of having to lift your hands up. Now they're pushing the iPad Pro with a keyboard that props it up to stand vertically, without a touchpad, requiring the user to lift their hands up more often than, say, a Surface.


UX design in the past 10 years (since the iPhone launched) has incidentally been precisely in the direction of giant buttons everywhere.


I don't get what macOS vs iOS has to do with processor architectures or the T2.

Why don't you see a macOS future? It can use the T2 just as well as iOS, no?


> developers will support whatever I/O standards they need to in iOS, and the ecosystem of monitors, keyboards, and peripherals...

What are you talking about? Apple won't allow developers to do any such thing. They only recently allowed a narrow range of NFC uses even though the hardware itself has been there forever. You think they're going to let people develop their own kernel drivers? Maybe with a $10,000 "hardware developer" account.


Jailbreaking will never be mainstream when people worry about the warranty on their $1k-2k+ devices. If you want people to own their own devices, you will have to write it into law, presumably in a more consumer-friendly jurisdiction than the US, which cares about business first, last, and only.


I'm aware the jailbreak community often felt Apple was targeting them specifically, and perhaps they were, but every jailbreak was also exposing a security flaw. Personally, I've never been worried about jailbreaking voiding my warranty -- I just never got sufficient value from it to make it worth the drawbacks. (Both value and drawbacks are subjective measures, obviously.)

While I understand the "you must own your own device" line, that's more of a rallying cry than a useful description. (I have a similar pedantic complaint with "if you're not the customer, you're the product," for what it's worth.) It's interesting to think about a regulatory framework that mandates companies build software-driven devices with the ability for end users to do their own software loads, but figuring out how to do that in a way that doesn't let companies wiggle out of it and doesn't essentially mandate insecure back doors in all "smart" consumer electronic devices seems to me to be a non-trivial problem.


Back in the day (iOS 3-4 ish?) you used to be able to install Cydia packages that would patch whatever exploit the Jailbreak originally used.

This isn't the case anymore, unfortunately, because the Jailbreak community is smaller.


I remember some of that. I didn't really get into the jailbreak scene too deeply, though. Usually, what happened is I heard that some Thing I wanted to do that you couldn't on iOS was available if it was jailbroken ("tethering," for instance, back before that was added to iOS). I'd jailbreak the phone (which could be really easy or involve a lot of hoops to jump through, depending), and install Cydia, and install the App That Did The Thing. By that point I was usually already frustrated because the whole user experience of the software from the jailbreak side of things tended to be noticeably worse. (I'm sure there were exceptions, but in general, that was consistently my experience.) Then I would discover that the App That Did The Thing was a little buggy and not very well designed, and sometimes didn't even consistently Do The Thing, and I would look around the Cydia store and discover that there was very little other stuff that I actually wanted, and then something would crash and take the whole phone down in a way that official iOS apps very, very rarely do, and I would say "enough of this" and put the phone back in jail.

tl;dr: I tried to be adventurous, but was too impatient. :)


Letting the user control the device inevitably means letting the user hurt themselves; that is unavoidable and impossible to "fix".

The least intrusive method is to allow users to add additional software sources and opt out of sandboxing, either wholly or only for select vetted apps.

This isn't hard; it would just hurt Apple's ability to get a 30% cut, as third-party stores would spring up.

User-owned personal computing devices have been a thing for decades; can we not pretend this is uncharted territory?


The comment of yours that I replied to explicitly mentioned jailbreaking, which is not the same thing as "additional software sources." I'd like to see Apple allow users to install signed apps from places other than the App Store, and I suspect that protecting revenue is the main reason they don't. But I'm not particularly interested in jailbreaking, even though I'd like to see functionality that iOS doesn't currently allow.


Well, technically you can do that. Download code, build in Xcode, and install to your device. Probably not what you meant, but it does enable some sideloading.


Further, there’s no theorem that jailbreaking will always be possible. I get the impression it’s becoming harder over time.


Jailbreaking has never violated the warranty. Apple just doesn’t support it so they won’t service your device until you update it to the latest software, which is completely reasonable.


It's completely reasonable that upgrading your phone and maintaining administrator access shouldn't be a challenge.


I.e., it voids half of your warranty, because anything but the most obviously mechanical problem can be blamed on messing with the software.


But the first thing they'd do even if you didn't jailbreak would be to restore your device and pull a backup from iCloud. If you want to keep your jailbreak, then they can't help you, because who knows if the latest version of iOS included a patch for your problem, a tweak you installed borked the phone, or if Apple borked it?


And as far as I’ve seen when it comes to jailbreaks, anything but the most obviously mechanical problem is caused by the jailbreak.


I would read your articles. Do you blog somewhere? Link, please.


I disagree, in that I do see a future for macOS. Steve Jobs said it best: sometimes you drive a convertible, sometimes a pickup truck.

macOS may be the pickup truck, but it isn't going to go away without a fight. Not unless you can make software engineering work on an iPad the same way it does on macOS today.


A shift from Lightning to USB-C is Apple losing, because they didn't donate their port to the spec. They could have "invented" USB-C. I understand why they didn't, but I still prefer Lightning to USB-C.

They are doing this before the EU makes them do it. What Apple did do was force the market to adapt to a plug that can go in both directions. And they thinned out the plug quite nicely; it never needed to be so thick.


Lightning is awful. It looks nice, but that’s because all the fiddly bits that are prone to failure are hidden in the port. Ever had your iPhone’s Lightning port fail? It’s not particularly rare, and Apple can’t fix it without replacing the whole phone.

USB-C puts the parts that wear out quickly in the cable where they belong.


> Ever had your iPhone’s Lightning port fail? It’s not particularly rare

Really? In another life I worked at an apple store genius bar, and can't recall ever seeing a failed lightning port (physical damage aside). You know what's extremely common though? Schmutz. The port picks up pocket lint, and then the connector packs it all down to the bottom of the port. Eventually your phone stops charging. Often the gunk is packed so tightly it's not obvious even looking with a flashlight that there's anything in there, or that the connector isn't sitting properly. Just dig it out with a pin and you're fine.

Obviously I don't know what happened with your phone or etc. All I can say is in my experience (which is sadly extensive), if you had your phone replaced for a failed lightning port the real problem was almost always schmutz + incompetent tech support.

Edit: Didn't notice jen729w's comment when I posted. So... seconded.


> Often the gunk is packed so tightly it's not obvious even looking with a flashlight that there's anything in there, or that the connector isn't sitting properly. Just dig it out with a pin and you're fine.

That's a great tip: it never occurred to me, and I've just been living with an unreliable connection for months.


My concern with USB-C is lint. My iPhone’s Lightning port regularly collects pocket lint, which I fish out with a toothpick. That’s easy to do.

(I notice the lint in there because the Lightning cable doesn’t quite snap in all the way; each time I’ve pushed it in, I’m compressing the lint up at the back of the port, and eventually it gets too much and prevents a solid connection. Only happens about once a year, but it happens.)

Looking at a USB-C port, there’s still space for lint but much less space to get in and remove it. Can anyone with a USB-C phone share their experience?


> Looking at a USB-C port, there’s still space for lint but much less space to get in and remove it. Can anyone with a USB-C phone share their experience?

You anticipated my experience exactly. My iPhone 6 got lint impacted as much as my Pixel, but it was a lot easier to fix the iPhone. Lightning does seem to require a more solid connection. USB-C will work longer with a "partial connection", which really just means I'm going to wait longer to clean it out.

Edit: I've never gotten any lint inside any of my USB-C cables though. Looking at the cable in front of me, it would be a bitch to clean out.


I have a USB-C port on my Android and haven't had any lint woes.


I didn't know that, so I'm glad you mentioned it. I have had at least 20 failed cables and 0 failed ports (of maybe five or six devices) in my experience with Lightning.

Actually, I'm not glad you mentioned it since that makes it even more perplexing. I was glad because if a port ever failed I would not have easily diagnosed it because I thought they were pretty good. But it must be my fault the cables are failing....


My phones seem to eat cables, at the rate of about one a year, per device, but the ports last the lifetime of the device - which is a huge improvement over microusb.


I’ve had two ports fail.

Another issue with Lightning: the contacts arc. The middle pin on any non-brand-new cable is almost always a bit burnt; it’s pretty easy to see. I don’t know whether USB-C has mitigations for this.


Buy cables with Kevlar in them. I no longer eat cables every year.


Huh? USB-C has that flimsy little tab on the port whereas Lightning is... just a hole.

I seriously don't understand the logic here. I've never had a port fail on me.


They may not have invented it, but 18 of the engineers credited on the specification work for Apple, and they were the first laptop manufacturer to announce support. They’ve pushed heavily and quickly into USB-C, so I really don’t see any signs of coercion.


Why do you prefer Lightning to USB-C? Just curious.


I prefer it because it feels way better. The snap, and the way the plug feels secure in the socket.... I had a nexus 6p for a year and the plug was so damn unreliable and wiggly. Bought multiple different types but they were all bad. That said, I haven't used any Apple devices with USBC and I would not be surprised if they felt much better.


I have an iPhone and a MacBook with USB-C ports, and my experience has been the exact opposite. The lightning port on my iPhone is really weak, all cords can way too easily lose a connection so my music stops or it stops charging. The USB-C on my laptop snaps in really nicely, I'm never worried about the cable not being fully connected.


Sounds like you’ve got some lint in your iPhone’s charging port. I recently used a cocktail stick to remove a big lump of it from my wife’s 6s where she had the same issues.

Cable snaps in nicely now.


Yeah that's what I was originally hoping, but I've tried cleaning it out a few times with no luck.


Two of the USB-C ports on my 2016 MBP have been loose for a while.


I understand what you mean about the feel, but Lightning ports have a fundamental physical flaw that is part of their design. The pins are on the iOS device side, so if they wear down or break, that device needs to be repaired. With USB-C, the pins are on the cable side, so any damage to them is just a simple cable replacement.


Is this an actual problem? In all my experience I've seen cables fall apart (mainly USB Micro or HDMI) but beyond lint, never had any issues relating to the actual ports.

The Lightning contacts are flush with the port, so very unlikely to get damaged. On the other hand, USB-C has that fragile-looking wafer on the device side.


I agree. I will often knock the USB-C cable out of my MacBook when it's on my lap, causing it to stop charging, whereas I can hang my phone from a Lightning cable.


For USB-C, this depends hugely.

I've had phones where the plug felt very weak in the socket, and devices with a very satisfying snap and a very strong connection.

This is to say that USB-C is not inherently bad; it depends a lot on the hardware (and I'm sure Apple will get it right).


They didn't on the MBP: some of my dongles and cables, even with high-end equipment, lose connections easily.


I've had a 6p for ~2.5 years now and only occasionally have troubles, and then only with certain plugs.


The snap. And things can't get IN the cable port.


Are you sure Apple didn't invent USB-C? There are sources suggesting they did, like https://9to5mac.com/2015/03/14/apple-invent-usb-type-c/


The simpler explanation is that Apple simply wants to move to USB-C because they see benefit in being on the "standard" mobile device port, now that it does not have marked drawbacks vs. their proprietary solution and can be used for their entire line of mobile devices, including laptops.


It's kind of amazing how much the lack of a real Chrome (not just a wrapper around WKWebView) prevents me from caring about this.

I suspect the drawing experience on here is better than with a ChromeOS device, but I care more about my web browsing/creation experience than I do about sketching.


What is it about Chrome's renderer that specifically appeals to you?

I agree it's somewhat ridiculous that you can't get (real) third party web browsers, but Apple's is so damn good that I don't consider it a real problem.


Apparently you're not old enough to remember those apple "USB" keyboard plugs with the little v-shaped indentations so you couldn't plug in non-apple approved devices.


Do you mean the extension cable that came with the original Apple USB keyboard?

It was only for the extension cable; extension cables are prohibited by the USB spec, so it was technically a "captive" extension. (The USB specs define "captive" as having a non-standard USB plug, even if it's not physically captive.)

Blame the USB spec, not Apple. They're actually the only company I've ever seen make an extension cable that meets the letter of the spec.


>Apparently you're not old enough to remember those apple "USB" keyboard plugs with the little v-shaped indentations so you couldn't plug in non-apple approved devices.

Wasn't that because of the extender cable? As I recall, extenders couldn't be USB-compliant, so Apple made a notch in the extender so that it wasn't technically a separate USB accessory from the keyboard.


The keyboards themselves didn't have the V; it was the extension cord, and that was because it was out of USB spec. If you pushed hard enough, any old USB plug went in just fine.


Seems like it'll be really difficult for them to ever make the iPad suitable for software development while also preventing users from installing software except through the app store. But if they ever figure that out I'll probably replace my laptop with an iPad.


I wish they'd make the iPad more development-friendly. I don't know exactly what that looks like… but if they did that, I'd use it as a daily driver in a heartbeat.

So so so done with Apple laptops, though. It seems like the butterfly keyboards are here to stay, and they're just plain awful.


"The new Liquid Retina display goes from edge to edge"

I've never owned an iPad, perhaps I'm missing something here. How does the above statement reconcile with the pictures I see on the page where there's clearly a black bezel?

Do they just mean there's not an area at the bottom with a physical button?


Am I the only one who actually likes to have an edge on my devices so they don't break if you so much as look at them? At least if there is a metal edge, it can take some of the force and you have a chance of not breaking the screen. I don't understand why metal is bad and glass is somehow good. Also, I need to grab my iPad, move it around, spin it, grab it with the other hand, etc. I see the edges/bezels as very useful in these cases.


You're not the only one. I also like a bit of a bezel or "chin" on my phones, because it makes holding the device with one hand for e.g. taking photos much more secure.


My 75-year-old mom has real trouble trying to use her iPad mini, especially with taking pictures. Inevitably, the left hand trying to hold the iPad has a thumb on the touch screen. Then when she tries to do anything else with the right hand, it is registered as a pinch.

She also had a career in graphic design and now teaches oil/acrylic painting. I know from her that the art world likes frames. A bezel is not a bad thing.


Not to take away from the larger concern, but in your mother's case, perhaps a case with a significant border would help.

I suppose if there is not a ready-made one that suits, making/hacking one, the old-fashioned way or through 3D printing, would be an option.

One reason I put a case on every one of my cell phones is to give me something more substantial to hold on to. The thin bodies and edges start to become a detriment; all the more so in that I have no interest in squeezing mine into the pocket of skinny jeans.


Thank you for your suggestion. My mother has declined adding a case because she carries the ipad mini in her purse when she is out. She doesn't want the additional bulk. She knows she is "not holding it properly" and moves her thumb if it's not responding to her.


I've seen people using finger loops that are stuck onto the back, or straps. I managed to find a decent looking instance of the latter in a quick search (so, just the first I found):

https://www.amazon.com/TFY-Security-Holder-Finger-Tablet/dp/...

Cheers


I think iPhones have for a long time been designed with the assumption that every user will add a case.


That's funny... I specifically don't use a case with my iPhone because I feel they have been designed as they are to work without a case.


IIRC, many years ago Apple noticed the majority of customers liked cases (due to things like personalization as well as safety), and started designing iPhones with cases in mind. They're still meant to be pretty when naked, but I believe the majority of people are using cases. This is why things like the camera bump have made an appearance but are really not an issue, since cases flatten those lines out.


I wish I could use it without a case, but the thing is so damn slippery without one. It slides out of my pocket, or slides around when I'm sitting on the couch and it's on a slight incline.


I feel like if Apple truly didn’t believe you needed a case, they wouldn’t make and sell them.


So if everybody is going to use a case anyway, why isn't the functionality of the case just built into the phone? This applies not just to Apple, by the way. I've never understood this.


Personalization. People love to express themselves with different cases, and having the device even several mm thicker would make most customers look at it in disgust, because then, with a personalized case on top, it would be huge. And trust me, outside of the tech community, no one wants thick phones.


I'd much rather an external 3rd party case. When my $20 case gets worn or damaged I can spend $20 on a new one.

If that was built into the phone it would be much more expensive to replace, and I'd end up just putting another, slightly bigger $20 case around it to prevent that from happening.


If Apple believed the iPhone required a separate case, they would build the iPhone like a Raspberry Pi so users could build their own Lego case :). Or build the iPhone with plastic, because it's cheap and light, instead of metal and glass.


iOS is generally supposed to ignore touches on the edge of the screen, but it doesn't always work. I guess this is one case where it isn't really functioning as expected.


Honestly, I think that can be fixed quite easily with a phone case. I actually prefer a case on my XS because it's so damned slippery that if you lay it on anything with a slight slope it falls off, and a case makes it easier to hold.


Eww, you're touching your devices. You're only allowed to swipe.


Right?! I do not understand the crusade to remove all bezels.


It's a key part of the bigger plan to remove thickness, bezels, ports and functionality.


And battery life.


"Edge to edge glass" doesn't mean there's literally no physical edge banding around the screen. The iPhone X models with their "all-screen design" still have a stainless steel edge band. The iPad pro clearly still has a metal edge.

The device you seem to be imagining where the glass runs all the way to the edge with no edge banding doesn't exist, and would be an ugly and unpleasant device if it did, as the edges would reveal the electronic innards (and heaps of glue).


Don’t some high end Android or Samsung models have edge to edge phone displays?


Sure. They still have an edge band (or a back that comes around to act as an edge band). There are no phones with screens that simply stop in a raw cut edge.


Whether you like the edge or not, that's not the question your parent asked.


Very much agreed. Plus, it looks better. Compare e.g. my phone [1], with its symmetrical shape, with the "chin" abominations like half of the new phones being released.

[1]: https://www.gsmarena.com/lenovo_zuk_z2-pictures-8125.php


I agree. I can't say removing the bezel is something that I wanted or asked for.


Why not just put a case on it?


Billions of dollars in research and industrial design, thousands of man-hours sourcing the perfect materials, hundreds of thousands of hours of engineering that give a device a most perfect feel.

Only to be placed in a $10 case with bunny ears so it won't crack if you accidentally drop it.


Exactly!

Slim, sleek, shiny phones which break when dropped. Easily dropped because of the shiny, slippery surface.

What to do? Put some cheap something around it...

And I'm wondering why people ask me why I don't have a case around my phone.

I didn't get a thin phone just to fill the space back up with plastic protection, that's why...


Frankly, I think it makes perfect sense that a phone comes without protection built in, so that I can swap out the protection myself and, after a couple years, I'm not stuck with a banged-up device.

This doesn't seem weird or contradictory to me at all. Nothing seems cheaper to me than my girlfriend's Android phone with rubber bumpers built in. Meanwhile, my iPhones with cases look brand new when I'm ready to sell them.


I agree with you. I was just saying that if you want a rugged phone, then you can accomplish that with a case.


I'd prefer that the ruggedizing were built into the phone itself; they could probably space components out a bit better for heat dissipation and maybe add a second port.

It is sort of hilarious that miniaturization has gotten so far that we purposefully de-miniaturize miniaturized goods.


$10 case? Brother if I drop $1,899 on a tablet it's going in a $130 Otterbox Defender and I'll consider it money well spent.


I hate the bezelless trend. And quite frankly I hate the trend of making devices thin. Now I just need to add depth so I can hold it and use it comfortably without worries about dropping the damn thing.


Can't you solve both problems by just using a case?


Without a bezel there is nothing for the case to wrap around.


Yeah, I have to spend extra money on a case, and I do, instead of getting proper bezels and extra battery life. Instead I get a device that sucks at palm detection. But it's super thin.


Think of the extra battery that could be added in...


Indeed. People do just love hearing words...

Presenter: “This display goes from its start...”

(Audience gasps)

Presenter: “...all the way until it ends! We call this ‘entire’.”

Audience: <mad applause>

Audience Member 1: (whispering) “I can’t wait until I can get a device with an entire display. I’ll have to put in some OT at the job to save up, though.”

Audience Member 2: “What do you do?”

AM1: “I work at a company that specializes in digging half-holes.”


This is my favorite response so far. I don't really have much interest in the bezel or not debate. I just didn't get "edge to edge screen" as a good description of what was being sold. By that definition, all displays are edge to edge unless you deliberately obscure some portion of the screen.


"I have wit. I have charm. I have brains. I have legs that go all the way down to the floor, my friend."


Well, I guess you could say the display is edge-to-edge because it goes from the edge of one bezel to the edge of the opposite bezel... :)


Yeah pretty much. The bezels are smaller than on current ipads, but yeah "edge to edge" is definitely a stretch.


You mean a "lie." Calling a notched iPhone "all-screen" is a bit of a stretch. This is just straight upfalse.


Forgive the humor in a HN comment, but because of you I shall add "upfalse" to my working lexicon.


It's a plusgood word.


There should be room for "infinity screen" down the line too


"edge to edge" is definitely a stretch.

Ha!


I guess Android phones were "edge to edge" when they first got onscreen buttons :)


But they were not "magical".


This is because the glass is edge to edge, even if the pixels aren't. (so 'technically' correct, but certainly misleading)


Pretty much all iPads ever have had this feature, but I find this advancement super innovative!


I own many Apple products, and I like Apple as a company a lot, but I would have to agree with your observation. The reality does not agree with their marketing. I understand marketing can be employed as a tool to favorably present reality, but still, the edge-to-edge statement is quite a stretch.


Bezels are smaller, and consistent all the way around, with no ‘chin’ or ‘forehead’ bars.


But they aren't even small bezels. There are still rather large bezels all around the "edge to edge" screen.

This is just a blatant lie by Apple's marketing.


Their marketing begins and ends with large, high quality photographs of the product which clearly shows the bezel size. Nobody is going to be misled.

Reserve your criticism for companies like LG who actually lied about their monitors:

https://www.flatpanelshd.com/news.php?subaction=showfull&id=...


There's no limit on criticism. Just because LG was worse in a particular instance doesn't mean Apple gets a free pass.


My point is nobody is being deceived and Apple clearly isn't trying to be deceptive. "Edge to edge" as a term has been used to describe many things, and never the mythical 0.00mm bezel that only a pedant would assume.

But sure, there's no limit to criticism.

Everything is horrible.

Buy nothing.


How is calling a 9mm bezel "edge to edge" not deceptive? It's not even a small bezel by monitor standards, much less mobile ones. This isn't at all about a mythical 0.00mm bezel (which isn't actually mythical; those devices do exist); it's about not being in the same ballpark as what is commonly referred to as "edge to edge" or "all screen", which is in the ~4mm-or-less range.

How can you possibly look at what Apple is claiming and then look at the pictures and say the words aren't deceptive?


It's not deceptive because they show you the picture first. They have the most beautiful high resolution product photography that leaves any potential buyer in no doubt about the bezel proportions. There is absolutely no opportunity for deception to occur, unless the buyer is a complete fucking idiot who thinks the real thing is going to be better than the photographs because they also read a few words of marketing hyperbole.


Yes, in this case it seems to be a euphemism for “we junked the Back button”.


I think this essentially means the bezels are the same width the entire way round.


It isn't liquid either. False advertising!


It's like saying 'lightning fast': in reality it could be faster or slower than lightning. It's a metaphor that does well in marketing and doesn't leave you open to legal challenges as much as a more technical claim (e.g. 'zero bezels') would, but it gets the point across that the bezels are thinner than usual.


> it's a metaphor

No, "from edge to edge" and "all-screen design" are not metaphors but as concrete as it gets, and obviously wrong. This iPad may be great but these marketing texts are just straight lies.


You don't seem to understand what a metaphor means.

If I say it's fast as lightning, I'm not saying it's exactly that fast. It's a metaphor for fast.

The fact the speed of lightning is measurable and 'as concrete as it gets' doesn't change the fact that people use it as a metaphor.

You may feel the use of 'edge to edge' is an inappropriate metaphor; that's fine, not every metaphor is fitting, and hell, I'd even agree with you if you said that.

If you ask me, you can start saying edge-to-edge when the edge is at least 50-80% thinner than what it is now. But it's a metaphor nonetheless, and I'd feel fine applying it to extremely thin bezels which are not actually edge to edge. If you'd have said such a thin bezel was 'razor-thin' or 'paper-thin' when it really wasn't, I'd say it's still a perfectly fine metaphor.


Agreed that 'fast as lightning' rarely refers to the literal speed of electricity in atmosphere; it's commonly understood to mean 'very fast'.

However, there is no such common understanding of the phrase 'edge to edge' meaning almost edge to edge. Edge to edge in English means it goes from the edge of one side to the edge of the other, literally.

It's not a metaphor. Only Jobs had a powerful enough reality warp to make that fly…


You're being silly; it is a metaphor, and it's been used since before Apple ran with it. The Dell XPS was often said to have an edge-to-edge display. [0] It didn't, literally; the bezels are similar to this iPad's.

Because in the context of displays, there isn't really any big consumer product out there that actually has no bezel at all. Rather, you have all these nearly-edge-to-edge displays, and we talk about them as being edge to edge, having razor-thin bezels, etc. There's no consumer out there who thinks he's getting a display without bezels. Everyone gets the metaphor.

> It's not a metaphor.

If you don't think it's a metaphor you'd have to believe that Apple truly thinks this device has no bezels, that consumers typically think it has no bezels, that journalists using these descriptions think they have no bezels, because they all take 'edge to edge' as being literal, rather than metaphorical. And that's simply not true. It is a metaphor, whether you (or I) think it's an appropriate one or not.

[0] https://www.theverge.com/2015/10/8/9476199/dell-xps-15-2015-...


Dell doesn't call it edge-to-edge or all-screen in their marketing.

Anyway, looking at the dimensions of the iPad Pro 12.9" model, it's 11.04 inches wide with a 12.9" 4:3 screen. The screen itself is going to be 10.32" wide as a result, putting the bezel at .36", or around 9mm, per side. By comparison, the XPS 13's bezel is 5.2mm.
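
For what it's worth, here's a quick sanity check of that arithmetic (a minimal sketch in plain C; the 11.04" body width and 12.9" 4:3 diagonal are the figures from the comment above, not independently verified):

    #include <stdio.h>

    int main(void) {
        /* a 4:3 panel has width : height : diagonal = 4 : 3 : 5 */
        double diagonal   = 12.9;                 /* screen diagonal, inches */
        double body_width = 11.04;                /* device's long side, inches */
        double screen_w   = diagonal * 4.0 / 5.0;          /* 10.32 inches */
        double bezel      = (body_width - screen_w) / 2.0; /* one side */
        printf("%.2f in = %.1f mm per side\n", bezel, bezel * 25.4);
        return 0;
    }

It prints "0.36 in = 9.1 mm per side", which is where the ~9mm figure comes from.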


Dell does call it edge-to-edge in their marketing - https://www.dell.com/en-my/work/shop/cty/pdp/spd/xps-13-9365... - "Real wonder. The revolutionary InfinityEdge display is now available for the first time on a 2-in-1, providing a virtually borderless edge-to-edge view, all in the smallest 13-inch 2-in-1 on the planet*." Note that they talk about the borders as being outside the edge-to-edge part.

Lenovo also use it on this desktop with bezels the size of the moon[1]: https://www.lenovo.com/us/en/desktops/lenovo/b-series/b50-30... saying it has a 'vivid 23.8" edge-to-edge display'

Microsoft use it to describe the Surface Laptop here - https://www.microsoft.com/en-us/p/surface-laptop-1st-gen/90f... - "Enjoy more space for your ideas with an edge-to-edge display and ultra-thin bezel." - specifically calling out the edge-to-edge display as being a separate thing from the bezels as well.

HP describe this desktop iMac ripoff with huge bezels as "edge-to-edge" here: http://www8.hp.com/h20195/V2/GetPDF.aspx/c06002849 - "An entertainment sensation; Sit back and enjoy a captivating entertainment experience. Elevate every stream, video chat, and photo with an edge-to-edge up to QHD display"

Acer uses it to describe a laptop in 2012: https://www.acer.com/ac/en/US/press/2012/50215 - "Featuring a 10-point touch edge-to-edge display and a larger trackpad, the Aspire V5-471P and V5-571P are designed to enhance multi-gesture content" - and it's a 2012 laptop, it has bezels.

The claim that only Apple uses it in their marketing, and that it's a lie, is nonsense.

[1] I mean literally the size of the moon, because nobody is allowed to use words differently without your approval, of course.


LG were perhaps the worst offender—they didn't just use jargon, they faked the measurements and doctored the press photos.

https://www.flatpanelshd.com/news.php?subaction=showfull&id=...


They call it an InfinityEdge display...


Ugh, I hate the way Dell lies! That's obviously not a metaphor.

Also, whenever journalists and commentators write about Dell screens as being edge to edge, years before Apple, they're not lies, they're metaphors.

It only becomes a literal statement that is obviously a lie, when Apple uses it.

/s

To everyone downvoting me above: you can disagree with the idea that 'edge to edge' is a proper metaphor for an iPad display with a 9mm bezel. I agree with you in full. You'd want something like 2-3mm bezels for that, at most.

But to say it's not a metaphor is silly. It would mean that you think Apple, journalists, and consumers, all or some of them, consider the edge-to-edge marketing statement to be a literal one, and an obvious lie, rather than a statement which must be taken metaphorically.

If you went to the CEO of Apple right now and asked him in an interview whether he means literally edge to edge or metaphorically, what do you think the answer would be (apart from dodging the question)?

After that you could tell him it's a crappy metaphor to use. But telling him he's lying because Apple means it literally is just silly.


No matter what, it's not a metaphor. You might want to look up what that word actually means.


I Googled "define:metaphor" and it says "a figure of speech in which a word or phrase is applied to an object or action to which it is not literally applicable."

and... Grammarly says "A metaphor is a figure of speech that describes an object or action in a way that isn’t literally true, but helps explain an idea or make a comparison."

Soooo this supports the idea that "edge-to-edge" can be descriptive but not literally true, and be described as a metaphor, right?


The words 'edge to edge' do literally apply to the object in question, though. You can literally have an edge-to-edge display - see the Galaxy S9+ for example.

You can't just say "well, even though the words could literally apply to the object, and they do literally apply to it in some cases, in this case it's 'just a metaphor'".

It's not. It'd be embellishment if you want to be generous. But not a metaphor.


No.

The word "edge" in this context, literally means the edge of the device or screen. That's everyone's understanding. There is nothing metaphorical about it.

The word "lightning" for example, does NOT literally mean "fast". It literally means the electrical energy we see in the sky. When used to describe computer speed, it's obviously a metaphor, and well-understood as a metaphor.

You can't just claim something is a "metaphor" to excuse false marketing.


Yes you can. "a word or phrase applied to an object to which it is not literally applicable." 'Edge' applied to an object which is not literally the edge. Metaphor.

You can't just claim the definition is not the definition and expect people to side with you over multiple dictionaries.

> That's everyone's understanding. There is nothing metaphorical about it.

Literally everyone? Or figuratively everyone? You think the Apple marketing people cannot see the bezel and honestly think it is not there?


"Edge" refers to the edge of the device. There is no other edge metaphorical or otherwise, imho.

More than the definition, what matters is how metaphors are actually used.

Is "edge" a metaphor when the actual edge is right there, 9mm from the screen perimeter? There is no metaphor only 'edge-to-edge' window dressing. In the same way "lightning" is not a metaphor in a discussion about thunderstorms.


As others have advised, you should look up how metaphors are used.

See my other comment in this thread, there is at least six years of prior usage by multiple big tech companies describing their product's screens as "edge-to-edge" without literally meaning edge-of-device to edge-of-device, with links.

And see my other other comment where I wonder why "edge-to-edge" gets your goat, when Apple describing the iPad Pro as "all screen" doesn't. Is that intended to be taken literally as well? There's no CPU, no memory, no battery, no glass, no other components, all screen?

Because the device is right there, it clearly has parts which are not screen, why aren't you frothing at the mouth about how it's a lie intended to "attract suckers" instead of a non-literal highlighting that the screen is large?

Like "all butter cookies" - that's a lie for suckers to think they are made of butter and no other ingredients, right? Because there's no way it could be read except literally, is there?

Non-literal descriptions are everywhere.


> there is at least six years of prior usage by multiple big tech companies

Irrelevant. This discussion was about the one and only use in town right now of "edge-to-edge" in a major campaign to sell new tablets.

"All screen" is more ambiguous. For starters "all screen" is not a pre-existing term. It could be used to mean there's no other elements or buttons on the front, except the screen. "All screen".

"Edge-to-edge" on the other hand, has explicit meaning built in. The primary component of which is measurement. "Edge-to-edge" refers to not one, but two hard edges, and describes that which spans in full from one edge to the other. There's no vague interpretation possible unless you force a square peg through round hole of English. It's either an edge-to-edge screen, or it's not.

I had to measure a washing machine recently to find out if it would fit. I measured edge-to-edge, and by that I mean the actual left edge to the actual right edge. But you knew that already without me explaining... because I said "edge to edge".


So every single phone on the market can be described as “edge to edge”?


My old 15" CRT monitor had awesome edge-to-edge technology. It was apparently way ahead of its time.


What is calling it "edge-to-edge" describing? Would it be equally valid to say that it has a 20" screen as a metaphor for how large the screen is?


¯\_(ツ)_/¯

Are you equally offended that this "floor to ceiling room divider" stops a couple of inches below the ceiling? https://www.amazon.com/Royhom-Privacy-Divider-Decoration-Apa...

Do you expect this "top to bottom house cleaning service" to include the chimney and roof and aerials because the "top" has a literal definition? https://top-to-bottom.cleaning/

Are you angry that "surround sound" only has a discrete number of speakers, usually 5 or 7, instead of a continuous surrounding panel?

Are you baffled by Thomson Video Networks' claim that they have an "all-encompassing video infrastructure" when you can see things in the world not encompassed by it? ( https://www.broadcastingcable.com/post-type-the-wire/thomson... )

Do you think "unmissable TV shows" are literally unmissable? ( https://www.makeuseof.com/tag/unmissable-tv-shows-watch-hulu... )

Why aren't you complaining about Apple calling the iPad Pro "magic"?

Why are you choosing "edge-to-edge" as the hill to die on, when Apple call the iPad Pro "all new" (it isn't), "all screen" (it's not), "all powerful" (it isn't), "a magical piece of glass" (nope) which "does everything you need" (even breathing?), "any way you hold it" (even covering the screen?), "true to life color" (even though you can't represent purples on an RGB screen?), "make everything look gorgeous" (even ugly people?), "the perfect machine for augmented reality" (even more than dedicated glasses? There can never be a better machine for AR?), "immersive games" when you can't immerse yourself literally in them?


I think the floor to ceiling divider is meant to literally reach the ceiling. It comes in different sizes. I would be upset if I bought one that didn't quite reach the ceiling because it was a few inches shorter than advertised. I would definitely be upset if I hired someone to make a floor-to-ceiling partition and it didn't reach the ceiling, unless floor-to-ceiling had a clear well-understood different meaning in that domain.

The surround sound does literally surround you. Surrounding a person doesn't require a continuous circle.

They said it's an "all-encompassing video infrastructure for broadcast and multi-screen services." That has a pretty clear meaning that it covers everything you need regarding infrastructure in those domains. It is not a metaphor.

"Unmissable" has a second meaning, "too good or important to be missed." [1] It is a subjective statement that the episode is good.

Calling the iPad Pro "magic" is the only thing here that is a metaphor.

I'm not picking this hill to die on. I asked one question. You say it's a metaphor, but neither of us have any idea what it's a metaphor for. I think what you're actually trying to argue is that it's completely meaningless marketing fluff that can be applied to any phone or tablet. Would you object to calling a feature phone edge-to-edge?

I also disagree with "all screen", which should have the same meaning as edge-to-edge. All your other Apple marketing term examples have clear meanings (except "all powerful", did they really say that?)

A metaphor is when you say one thing is something else to draw an analogy between them. It doesn't work when the something else is the same type of thing. It would not make sense to say "my car is a Lexus" as a metaphor to mean my Hyundai is nice. It wouldn't make sense to say "my car is turbocharged" to mean it's fast when it doesn't have a turbocharger. If edge-to-edge is a metaphor, I can only think it's a metaphor for a tablet with a screen that doesn't have a bezel. It seems like a straight-up lie.

My biggest issue with the edge-to-edge marketing claim here is that I have no idea what they mean by it. It was bad enough when they made that claim for the iPhone X, but the iPad pro has a huge bezel. They might as well claim it fits in the palm of your hand.

[1] https://www.merriam-webster.com/dictionary/unmissable


> You say it's a metaphor, but neither of us have any idea what it's a metaphor for. [..] I think what you're actually trying to argue is that it's completely meaningless marketing fluff that can be applied to any phone or tablet.

I probably am, yes. I mostly think it's silly to argue that it "can only be used literally" in the face of it being used non-literally. It's a metaphor for its literal definition: this screen is "edge-to-edge" like a screen with no borders. It metaphorically has no borders because it's so big and the borders are so small; even though it literally does have borders, you won't notice them, and you'll accept the screen is edge to edge because it is very like one that is. Yes, it does seem like a literally false statement to sell people on an idea.

> Would you object to calling a feature phone edge-to-edge?

I don't know. On the one hand I don't think it is, any more than I think the iPad Pro is. On the other hand, they can call it whatever they want; they will just say "the screen goes from the edge of the screen to the edge of the screen" and everyone will say "duh". Do I think it's harmful? Not so much, because the screen is visible. "All day battery life" was more misleading because you can't see the contrary at a glance. "No fee" when there is a fee, way more harmful.

"Unmissable" has a second meaning, "too good or important to be missed." [1]

Well "Edge" has meanings "2.a. the line where an object or area begins or ends : BORDER" and "2.b. the narrow part adjacent to a border" and "2.c. a point near the beginning or the end" - https://www.merriam-webster.com/dictionary/edge

So "edge to edge" could meaninglessly but accurately be saying "from a point near the edge to a part adjacent to the other edge". :-|

(except "all powerful", did they really say that?)

Yep; top of the page here https://www.apple.com/ipad/ - All New. All Screen. All Powerful.

Meaning, presumably, "every bit of it is new, it's mostly screen, every component is high end".

> It would not make sense to say "my car is a Lexus" as a metaphor to mean my Hyundai is nice.

"How is your new Hyundai?" "It's so luxurious and feature filled, it's totally .. i dunno, it's ... very Lexus! Yeah, my car is a Lexus by another badge!". You wouldn't understand that?

> It wouldn't make sense to say "my car is turbocharged" to mean it's fast when it doesn't have a turbocharger.

"How is your new Tesla?" "FAST it's so fast it's turbocharged supercharged bullet train rocket engine awesome" "I don't understand, it cannot be turbocharged because there is no internal combustion engine exhaust gas to spin the turbine, why are you lying?"


> you'll accept the screen is edge to edge because it very like one that is

I think that's the core of the dispute here. The screen on this iPad is nothing like a screen that's edge-to-edge. It's surprising and confusing to see it described like that with such a clear thick bezel in all the images. I and a lot of other people in the comments here don't accept that this screen is edge-to-edge.


"a figure of speech in which a word or phrase is applied to an object or action to which it is not literally applicable."


But you cannot just take any words and say it is a metaphor. I cannot say "this is a walnut desk" when it isn't and just claim it's a figure of speech standing in for a very solid and beautiful desk. A metaphor must be widely understood as such before you can use it.


Just to be clear, "InfinityEdge" is a brand name, like "Lightning connector"; it is not a description of the display itself.


Exactly, it's clearly a made-up marketing term with no established meaning, just like Retina Display.

"All-screen" and "edge-to-edge", however, are not.


"It's as fast as lightning" is a simile.


Yeah, no ugly "chin".

Problem is, the iPhone X family got rid of the chin... but has the dreaded notch, and not even the mostly acceptable teardrop everyone else standardized on.

At least the iPad Pro adopted chinless and also notchless design. First time in years Apple has made a product that isn't weirdly ugly.


>Problem is, the iPhone X family got rid of the chin... but have the dreaded notch, and not even the mostly acceptable teardrop everyone else standardized on.

You mean "copied with a tiny alteration".

>At least the iPad Pro adopted chinless and also notchless design. First time in years Apple has made a product that isn't weirdly ugly.

You probably didn't get the memo, but the "oh my god, it has a notch, oh noes, so ugly" thing some pundits tried to pull died at birth; not only did competitors copy the notch, but the X itself became the best-selling phone.

Even ignoring that, everybody pretty much praised the Apple Watch 4 as beautiful as well...


> the "oh my god, it has a notch, oh noes, so ugly" thing some pundits tried to pull died in its birth

Did you know that Google had to ship an update for the Pixel 3 XL that "disables" the notch (by making the background for the part of the screen around it black, making it less noticeable), because of how many people complained about it?


there were phones that effectively had a "notch" long before the iphone x. for example, the lg v10 had a second screen directly on top of the main screen with the two front-facing cameras slid over to the left. the second screen was used to access recently opened apps, text messages, settings, flashlight, music control, etc.

in general, apple just tells people they invent and do everything the best, and everyone listens. it's funny, because my lg v35 is much thinner and lighter than my girlfriend's new iphone xs max (which has a marginally larger screen), doesn't have a protruding camera lens, and it actually has a headphone jack with a high-end dac (which basically no other phone has). so apple says they need to get rid of the headphone jack to save room, and yet here we are.

i specifically avoided the lg v40 because they added the notch. as with the new ipad, i simply don't understand this need to extend the screen all the way to the end of the phone in the long way. this is especially true on the ipad, which i nearly always hold horizontally with my thumb on the perfectly sized bezel. getting rid of the bezel literally adds no functionality for me and actually removes some.


"mostly acceptable teardrop everyone else standardized on."?


Yeah, that's pretty delusional. I don't think anyone has created a phone with a notch that would accurately be characterized as a teardrop shape. The closest was probably Essential (which inexplicably had a notch and a chin). The Pixel 3 notch is a V shape. The OnePlus notch pretty much looks like an iPhone notch. There's no "standard" and certainly no "standard teardrop shape".


I don't blame you. The OnePlus 6T released yesterday with the teardrop notch.

It is a trend among Chinese phones (Huawei, Oppo, Vivo, OnePlus... basically BBK Electronics).

The OP 6T is the first phone popular in the West with this style, and I find it to be amazing.

Still like the Mi Mix 3 the most, with no notch. But the OP 6T is the first one that's not egregious.


Looks more like a "condensation drip about to fall" than a tear drop...

Not very catchy though, perhaps "innovative CDATF notch - uses 66% less screen space" would sound hype enough for the Marketing department?


What if I have really weirdly shaped tears?


Far from everyone, but prominent examples are the OnePlus 6T and the Huawei Mate 20:

https://i.ytimg.com/vi/r_LNpJOd0ao/maxresdefault.jpg


That's not a "teardrop". Are people actually calling it that?

Also, two phones does not make a standard. Even the Mate 20 Pro has an iPhone-style notch.


Both have the slanted sides that I think "pass for" teardrop shaped, although it's clearly more slanted on the OnePlus.

And indeed I was referring to the Mate 20 and Mate 20X rather than the Pro.


From the keynote, Cook claimed the new A12X Bionic[1] is faster than 92% of portable PCs[2] (in plain English: laptops and Surfaces) sold in the past year, including some i7 models.

- 8-core CPU (4 low-power and 4 high-power cores)

- 7-core GPU

- HEVC encoding/decoding

- Neural Engine (they haven't made any comparisons to the A12 sans X neural engine, so at this point I think it's the same)

- Audio DSP, Storage controller, Image Signal Processor, Apple performance controller, Depth Engine, &c.

[1] http://live.arstechnica.com/apples-october-30-2018-more-in-t...

[2] http://live.arstechnica.com/apples-october-30-2018-more-in-t...


The hardware specs of the new iPad Pro read like the dream mobile machine. If iOS didn't have so many software limitations preventing it from replacing a laptop for many tasks, it could take over the laptop space in no time.

- Files is way too limited. It should be possible to exchange the system-supported data types between all apps via files. You can't even add an audio track to Music or a video file to the TV app as it is right now.

- The split screen feature is a good one, but so far I have rarely found it useful. Not all apps support it, and many apps differ in their behavior. The 50:50 split is nice, but I found the other split ratios less useful. For some tasks you need overlapping windows.

- On the same note, one of my favorite features is picture-in-picture for playing video. Unfortunately, again, apps differ in behavior, and I don't understand the size limit; I would like to be able to resize the video to any size.

- And of course, coding and running plain Unix utilities. While I am happy about the security the app model with sandboxing brings, for a "Pro" machine this is too restrictive. Apple should allow something like Termux on the iPad; even if it were limited to a sandboxed file system, it would make the iPad so much more useful as a computer. Apple might even release a lightweight Linux VM as an app.

If Apple could remove these purely software restrictions of the iPad, they could attract a lot of "Pro" users, I think. Disclaimer: I am an iPad Pro owner, fully in the Apple universe :). iCloud sync already makes it a much more useful device, but I keep hitting my head against the limitations. Running "Blink" to connect to my Linux server makes it almost a laptop, but it is very limited.


Totally agreed on this.

I could do real work with a browser + terminal client, but have a few issues:

- None of the terminal clients really give you 100% of a normal keyboard (been a while, I forget what doesn't work right).

- Splitting the screen doesn't work well enough. I want 2 windows (browser and terminal) that are each 75% of the screen and the ability to switch between them, preferably from the keyboard. A 50/50 split doesn't work well, and switching the 70/30 split between windows makes a mess when it redraws my terminal on resize.


Yes, two 75% windows had come to my mind as well. At least the new swiping gestures let you switch between two full-screen apps more quickly. For the terminal client, have you tried "Blink"? I found the keyboard works reasonably well, and you can even remap Caps Lock to Esc. As it uses mosh to connect, it also deals extremely well with connection cuts (especially when the iPad hibernates the app in the background).


Sounds like you just want an A-Series MacBook (with decent keyboard?)


Overall the iPad has very attractive hardware as a laptop competitor. It has a touch screen and pen support, is fanless, and can be used without the keyboard attached. The more "Pro" it gets, the more the thought of replacing a laptop with an iPad comes up, but it ends at the software limitations.


Faster at -what-? People make tons of speed claims, but what is it faster at? I'll be interested to see real-world benchmarks. I highly doubt the new iPad Pro can outperform the newly released Surface Pro with the base i7. I mean... he did say -in the past year- and the new Surface has been available for... a few weeks. Also mind you that the new Surface line doesn't even use Intel's latest chips; the latest are 9th gen, but the new Surface is only packing the 8th. Honestly, when you think about it, their claims weren't overly impressive given they were comparing their hardware to 7th gen Intel chips.

That's not to say what they've done isn't impressive and that they don't plan to kick Intel to the curb...but it is entirely possible this will be another PowerPC thing, where PowerPC outperforms for a while, Apple switches and then Intel gets their shit together and others are no longer able to compete. If Intel were to merely get a good integrated GPU that could compete with Nvidia's, there is no chance Apple could compete.


That is the question, now isn’t it? I too look forward to the benchmarks.

Something to keep in mind too: iPad Pros have a much more constrained TDP than your average laptop, so it isn't merely a matter of peak performance but of performance per watt and sustained performance.

There is also this question: given a higher TDP, what could Apple’s silicon team really do? Although I’m not convinced it makes financial sense for them to replace Intel with custom silicon, I’m happy to be proven wrong.


This is entirely anecdotal, but my iPad Pro from last year handles photo edits (Lightroom, Affinity) much more easily than my i5-powered desktop with a 1060 GPU. Everything from opening the file to doing edits and exporting in various formats is just a lot snappier on the iPad. I don't know about the Surface Pro.


I'm not denying that the Pro is a beast of a machine, but I also think Lightroom on iPad is a slimmed-down version, no? One advantage the iPad really has is that the code companies like Adobe write doesn't have to be backward compatible with various GPUs and hardware. I'm sure there is a ton of bloat in Photoshop etc. that is just there because of older machines.

I am happy that the iPad Pro is pushing the limits, it is about time that the Surface Pro line had true competition, it will push both to be better.


It's a little slimmed down in the number of options on offer; it's essentially the "CC" version that's also available on desktop. But any limitations in features vs. the full old-school desktop version are a result of design choices, not lack of power.


This was either a sharp prod for intel, or some serious signaling.


MacBooks _will_ transition to Apple's own chips at some point. My guess is that the A14 or A15 generation will be fast and powerful enough to make the leap.

This will be big especially if they can get major price or power usage advantages.


I'm worried they won't transition MacBooks to these CPUs, but they'll just let the entire Mac lineup gradually fade to black with intermittent refresh cycles and dead end features like touch bar.

I can well imagine Apple expecting people to plug in a keyboard and monitor to their iPad Pro, while it gets thinner and thinner.


While "intermittent refresh cycles" are certainly a thing for the Mac line recently, I think it's important to distinguish "new features that seem to be flops" from both "no new features" and "not caring about the Mac line." The Touch Bar may or may not be a dead end (I don't hate it the way many other people do, but if it went away I wouldn't shed a tear, either), but it represents a significant amount of engineering work put into a Mac-only feature. So does the love-it-or-hate-it† "Butterfly" keyboard. And the new ARM-based T2 chip, which handles tasks from encryption to signal processing to being the SSD controller, is also obviously significant engineering -- and again, Mac-only.

I don't doubt that Macs will eventually fade into the sunset, but I don't think it's coming nearly as quickly as either pundits or pessimists seem to believe. "Marzipan" may indeed be a first step, but I think it's going to be a Carbon vs. Cocoa situation, taking many years to fully transition. For iOS to supplant macOS, it has to essentially do everything macOS does. It doesn't have to do everything the same way, but if it's going to be a full replacement for a general purpose computing platform, it has to be, well, a general purpose computing platform.

†Technically, "grudgingly-tolerate-it" should be in there, but it doesn't flow well.


Just a note on the T2: it's a variant of the A10, more specifically, it's basically an A10 running a custom operating system (BridgeOS) and with the high performance cores disabled. There might be other modifications, but it isn't the kind of thing I would hold up as a triumphant piece of Mac-specific engineering. Not that it is a bad piece of silicon, just repurposed silicon isn't the sort of thing I would put stock in when it comes to the importance of the Mac to Apple.

The Touch Bar was a good example though; even if it didn't receive the warmest reception on the market, it is still a good piece of Mac-specific engineering.


I wish they would take the touch bar idea, and apply it to the trackpad instead of the function keys…


Re-purposed engineering can still be triumphant.


> I can well imagine Apple expecting people to plug in a keyboard and monitor to their iPad Pro, while it gets thinner and thinner.

And at some point, Apple will introduce an "iPad mini" with no display


Arguably they are already powerful enough to make the leap in at least some cases. That said, Apple does not push Macs in the same quantities as iOS devices, quantities that would justify wholly new silicon for the line. I have been in an argument with a friend for the past year or so about precisely when (or if) Apple will rip Intel out and use their own silicon. I think they will, but I'm going to try to present his argument to the best of my abilities, with my own observations and guesses as context.

Presently the T series consists of gimped variations of the A series. My understanding is that the T2 is essentially an A10 with the high performance cores disabled, and the low performance cores repurposed to manage the myriad functions that were previously handled by dedicated controllers. They're also necessary for certain features like Touch ID, and for Apple to use its own HEVC encoder/decoder.

Presumably the T3 will be keyed off of a later part like the A12 or a later generation, which will make it the first T series chip to also include the neural engine. Assuming they don't disable that part, it will potentially make the NE available to Mac application software.

So what's the real problem with ripping out Intel right now? Given a higher TDP, Apple could hypothetically design a much more performant chip that would replace both the CPU and GPU in their current Mac lineup, but they simply don't push enough Macs to justify the expenditure on custom silicon, right now at least. Probably doesn't help that they have been making existing customers less happy, but it also doesn't help that Intel has simply been re-releasing Skylake in different variations for a good few years now.

If you have a Macintosh able to run Mojave, an SSD, and at least 16GB of RAM, you are probably set. The important thing about Mojave isn't so much any one feature; it is the continued support, including bug fixes and security updates. Given all of that, most people simply won't need a new system any time soon. There's essentially no reason a laptop from 2012 can't take you all the way to 2022, all other things being equal. So we're starting to see sales stagnate across the entire PC market, including the Macintosh; while Apple is selling phones about as fast as they can stamp them out, the same is not true of MacBooks or iMacs.

Silicon is much more of a volume business. Apple can repurpose older versions of their A series chips because those are an R&D and capital investment that has already paid off; it isn't a new design, and they just need someone to continue fabbing them. New generations of the A series will generally pay off, because they will be selling hundreds of millions of the same iPhone, usually for more than one calendar year. This past year, for the first time since the A7 was released, they skipped releasing an -X variant of their A series chips, which are generally released to support new iPads (there was no A11X, for those who may have forgotten). It is possible that iPads have not been selling in the quantities necessary for even a variant design, and rather than an annual refresh cycle we could reasonably expect them to move to a biennial one, hence skipping the A11X and jumping straight to the A12X.

Intel and Qualcomm have many customers, but for Apple's own silicon, Apple's only customer is itself, by design. That imposes a limitation: it just might not make financial sense to fabricate a series of chips exclusive to the Mac lineup, at a TDP appropriate to a Mac.
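To put rough numbers on that volume argument (these are entirely hypothetical figures for illustration, not Apple's actual costs), amortizing a fixed design cost over unit volume looks something like this:

    # Back-of-envelope NRE amortization with made-up numbers
    nre = 500e6       # hypothetical one-time design + mask cost, USD
    marginal = 50.0   # hypothetical per-unit fab cost, USD

    for product, units in [("iPhone-class volume", 200e6),
                           ("Mac-class volume", 20e6)]:
        print(f"{product}: ${nre / units + marginal:.2f} per chip")

    # iPhone-class volume: $52.50 per chip
    # Mac-class volume: $75.00 per chip

At phone volumes the design cost all but disappears into the unit price; at Mac volumes it's a meaningful premium on every chip, which is the whole argument in miniature.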

By all means, tear this apart if you disagree.


Very well put. It seems to me that desktop performance has stagnated for the most part while mobile performance continues to make leaps and bounds, so people think they need to upgrade their phones and tablets much more often just like the early days of the PC. A 2008 MacBook is going to be a lot more useful than a 2008 iPhone in 2018.

Apple certainly markets their iPads as a competitor to the MacBook, so maybe they’re not looking to use custom silicon in the Macs but simply to phase out desktop computers when iOS and third party applications are ready to replace PCs for the most part.


> Given all of that, most people simply won't need a new system any time soon. There's essentially no reason a laptop from 2012 can't take you all the way to 2022 all other things being equal, so we're starting to see sales stagnate across the entire PC market including the Macintosh, so while Apple is selling phones about as fast as they can stamp them out, the same is not true of Macbooks or iMacs.

Isn't this becoming equally true for phones? Perhaps not for hardware as far back as 2012, but I'm expecting my 6S to last me a very, very long time. It's fast, has a great screen (and headphone jack), and I'll get the battery replaced as many times as I need to.

But this doesn't appear to have slowed iPhone XRS+ Maxx sales one bit.


I read a report recently that the average replacement cycle for iPhones is closer to 3 years now than it was this time last year. I would say this is true-ish of phones, but the reasons people upgrade are varied. I’m using a 6s+ right now and the reason I’ll be upgrading the next time I go to the Apple Store isn’t because this phone is in any way bad or worn out. What I’ll really be purchasing is a new and improved photography pipeline and bigger disk. Everything else is just a nice extra, but if I didn’t care about the camera I would probably continue to use this phone until it died of natural causes or starved to death from a lack of software updates.


totally agree. i still have a 2011 macbook pro upgraded with an ssd and a 2013 macbook pro. both almost indistinguishable performance wise from a brand new macbook I use at work. there's nothing it can do that these can't really, except touch ID, but then my apple watch unlocks my laptop now anyway.

i would add that the faster upgrade cycle, and therefore money worth spending on developing smartphones is also because people are used to carrier subsidy and spreading out the cost. so it's kind of a virtuous circle of investment that the pc market doesn't have.

makes you realise what a perfect storm iphone was - a massive and growing market of people paying through the nose, annually, for a crap product, just waiting for someone to take that money and invest it in something good.


I don't disagree, but I do want to point out that they did release an X variant of the A series with the new iPads.


Ah, I just re-read my post and I was unclear in that portion because I edited it too much.

I meant to say there was no A11X. I'll edit that in shortly.


Microsoft is preparing the move to ARM too. I'd be really interested to see how the performance turns out. Intel's mobile processors (i7 and m7) felt really slow before the eighth generation and still used a lot of power (at least in the Lenovo and HP business notebooks).


Is Microsoft preparing an x86 emulation layer for their ARM-based OS? Because as the past has proven, nobody wants to buy a Windows PC that can't run Windows software. Unfortunately, it's almost all been compiled to run on x86. Or are they still betting on moving everything to UWP?


Microsoft is shipping an x86 emulation layer on the ARM-based Windows laptops right now (although with some limitations: as far as I know there's no 64-bit x86 support yet, and of course it's slower than native on already not-so-fast laptops).

Benchmarks from the first device generation: https://www.techspot.com/review/1599-windows-on-arm-performa...
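For the curious, the core trick in a layer like this is dynamic binary translation: decode a block of foreign instructions once, cache the translation, and re-run the cached version on later hits, which is why hot code runs tolerably while cold code is slow. A toy Python sketch of just the caching idea (a fake two-instruction ISA, nothing like real x86 decoding):

    # Toy dynamic binary translation: translate once, cache, re-execute
    translation_cache = {}

    def translate(block):
        # Compile a tuple of fake instructions into a Python closure
        ops = []
        for instr, arg in block:
            if instr == "add":
                ops.append(lambda acc, n=arg: acc + n)
            elif instr == "mul":
                ops.append(lambda acc, n=arg: acc * n)
        def run(acc):
            for op in ops:
                acc = op(acc)
            return acc
        return run

    def execute(block, acc=0):
        # Hot blocks pay the translation cost only once
        if block not in translation_cache:
            translation_cache[block] = translate(block)
        return translation_cache[block](acc)

    hot_block = (("add", 2), ("mul", 10))
    print(execute(hot_block))   # 20; later calls skip translation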


Yes, it was launched earlier this year: https://docs.microsoft.com/en-us/windows/uwp/porting/apps-on...


I think by the time MBPs are using A-series chips, iOS will have morphed into the new macOS, and the two will launch together as a new MBP.


Or pure marketing. Look for the asterisk that explains the workload.


Yet 90% of the apps that people will use on it are optimized for a phone with lower specs. If I use a laptop with a GPU, I know I am going to run games or productivity apps (Photoshop, modeling software) that use the available computing power. With an iPad I am not so sure.


Check out the Photoshop on iPad Pro demo from the keynote, then. Apparently AutoCAD is also being ported to iOS. The difference in available computing power has diminished to the point that the real advantages Macs have nowadays are access to vaster stores of RAM, dev tools, and Darwin.


What's the difference? iPad apps use the full potential of the machine, like in any computer.

Yes, most iPads in existence are much slower than this iPad Pro, and yes, most people will be buying the inferior sub $400 one.

But also, most people have PCs with an Intel GPU or a really low-end discrete GPU... that doesn't stop game makers from making games that can't run on most machines, or Adobe from making Photoshop.

Let's see what happens with the iPad.


This is changing: Photoshop (coming), AutoCAD (coming), Civ 6, Procreate, the next MS Office mobile version, and Bias Amp 2 are all examples of apps targeting iPads. I think this will continue and even accelerate.


The more hardware support there is for HEVC, the faster the scene will move to x265, and the smaller files get everywhere. I'm excited!
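If you want to play with it today, and assuming you have an ffmpeg build with Apple's VideoToolbox support (typical on a Mac), hardware HEVC encoding is already a one-liner; a minimal Python wrapper might look like this (bitrate and flags illustrative):

    import subprocess

    # Hardware HEVC encode via VideoToolbox
    subprocess.run([
        "ffmpeg", "-i", "input.mp4",
        "-c:v", "hevc_videotoolbox",   # hardware encoder
        "-b:v", "6M",                  # target bitrate
        "-tag:v", "hvc1",              # so QuickTime/iOS recognize the file
        "-c:a", "copy",
        "output.mp4",
    ], check=True)

Software x265 ("-c:v libx265") compresses better but is far slower, which is exactly why ubiquitous hardware support matters for adoption.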


I wish they had a different keyboard cover though. I had a lot of problems with the iPad Pro 12.9 on my lap, because it isn't very stable.
