By this logic, it's pretty clear Apple is killing macOS's x86 dependency (and macOS itself) in two ways. First, by shifting compute-intensive workloads to A-series chips. This started with the small stuff around security and Apple Pay, but today the video encode/decode functionality in the T2 is a huge leap. This will extend into graphics next year (thank god, because Intel's integrated graphics are awful) with APIs like Metal, and going forward to almost anything I/O, display, and every function outside of the CPU. The second strategy is the iPad: offering a compelling device that has all the features and capability of a laptop without two decades of baggage.
We've all been so worried about the convergence of iOS and macOS, but after today I don't see a macOS future. Sure, it will move to Apple's ARM chips, and it will last another decade, but the shovel is out of the shed, and it's called the T2.
(Jailbreakers, we need you)
I am absolutely sure that they will port Xcode to the iPad itself and allow visibility into my own apps. This is not the same as a workstation.
EDIT: in retrospect, this is only tangentially related to parent post—my apologies. Also: wording, punctuation.
It makes computing & communications an Apple theme park, where you can't bring anything forbidden into the park, have to buy everything you need from Apple concessionaires, and can't do anything that they haven't planned for you to do. The people in the "park" pay to enter and then become a new type of domesticated herd to be milked.
You can't allow people to bring heavy equipment and power tools into a theme park, but you also can't build the park's controlled features without them, so how do you give power to authorized builders while keeping it away from users? In a physical theme park, they have controlled users during the day and empowered builders at night.
But in Appleworld, I'm thinking they'll keep iOS locked down by limiting the building tools to MacOS.
A UNIX-style workstation that gives maximum power to its users is so antithetical to everything Apple stands for that MacOS would never be created today. But since it already exists, they can take advantage of it to make iOS even more locked down than it would have been otherwise. (And refuse to let any other company use MacOS, and not port Xcode anywhere else.) You want to see what your computer is doing (file system, CPU processes, memory, network, etc.)? Do it with a Mac. iOS is not YOUR computer. It's Apple's. You only paid for admission to iOS, and you aren't allowed behind the locked doors. If you want control, you're in the wrong place.
I'm afraid that if they ever figure out how to sandbox an iOS partition of some sort to allow builder tools to run on iOS itself in a totally controlled way, then the Mac is toast, but the Mac gives them time. I think that for the foreseeable future, they won't risk any accidental empowerment of iOS users and will limit the power tools to the Mac.
End of the day, Apple like any other corporation does what it does because it makes money for them. The alternative is to use a "free" operating system like Android and you subject yourself to constant surveillance. It's like that scene in The Big Short; "tell me how you're f*ing me." At least with Apple, you know straight up what you're getting yourself into. Your relationship with Apple ends when you stop using their devices. Not so with Google/Facebook, where your data lives on in perpetuity, used for purposes beyond your control.
Now I'm not saying it's not possible to have your cake and eat it too. I'm saying, where is this mythical product, where a user is free to do whatever he wants with his device? What's the market size? How come no one has built it yet?
For some reason most Linux conferences end up being about file systems, containers, device drivers and whatnot, and seldom about desktop development.
Objective-C/Swift frameworks have nothing to do with UNIX; Apple could easily port them to another kernel architecture.
Likewise, the OS X driver model has nothing to do with UNIX, being modeled on a C++ subset; it was originally written in Objective-C back in the NeXTSTEP days.
The only UNIX GUI certified as such is Motif, which Apple certainly isn't offering on their products.
Nor is the audio stack in any way related to UNIX.
We don't all grow our own crops, make our own clothes, brew our own beer, refine our own crude oil etc, even though we may have the knowledge to do so. Some things are better left to specialists, due to efficiencies/legal reasons. Surely we can agree on this?
People are greedy and have no values.
2. The idea that UNIX is antithetical to what Apple stands for is just ridiculous. It is still at the core of iOS and MacOS and is one of the areas where Apple continues to innovate.
3. Apple licensed Mac OS in the past. It nearly killed the company, since third parties like Power Computing went straight after Apple's core base and did nothing to grow the ecosystem.
4. You can see what your iOS device is doing. Plenty of apps allow you to see what processes are running, file system behaviour etc. Apple just doesn't build it in.
5. Anyone who thinks the Mac is dead needs to go have a lie down with some chamomile tea. Apple even today doubled down on the Mac with the new Air + Mini. And they will continue to grow and invest in the platform since content creation will always be largely done on a Mac.
What's happening now is that those who only discovered the Mac world after OS X, and don't care about Objective-C/Swift development, are feeling that their pretty-UI Linux replacement is no more.
Apple naturally cares about its Objective-C/Swift developers first and foremost.
Also for Apple to allow v8, they would have to permit 3rd party unsigned executable code (JITed in this case). Apple doesn't want to allow that.
This is not about CVEs, it's a meta discussion of not outsourcing the security of the platform.
There's no actual security reason for iOS to be locked to Apple's webkit exclusively and there's no actual security reason for iOS to not allow JIT'd code. Those are both control issues, not security ones. JIT in particular is purely a control issue - the process itself is already sandboxed which is the one and only actual security boundary here. Preventing a JIT doesn't prevent arbitrary code execution, after all, especially if there's an interpreter in play which Apple sort of allows.
> Apple doesn't want to allow that.
Yes, that's all it's about. Apple's control. Other excuses people make up for their reasons are demonstrably false.
And how does that pertain to security? Android doesn't seem to have a problem with JITed code in sandboxed store apps. Neither does Win10.
Hence MDIL in WP 8.x and .NET Native on WP 10 onwards.
Now, Win8.x did not allow for third-party JIT compilers in the sandbox; it was only CLR or Chakra. But UWP does - look for the "codeGeneration" capability here:
WP 8.x only did dynamic linking at installation time and when OS updates were done, by replacing symbolic labels with the actual target destinations. Everything else was already compiled at the store and downloaded as binary into the devices. This was the whole point of MDIL.
There is a BUILD session and a further Channel 9 deep dive interview showing how MDIL deployment works on WP 8.x.
So Chakra was the only JIT in town.
"BUILD 2012, Deep Dive into the Kernel of .NET on Windows Phone 8"
"Mani Ramaswamy and Peter Sollich: Inside Compiler in the Cloud and MDIL"
Either way, code generation is there today.
If Apple could make their browser the de facto browser of tomorrow, the way Chrome is today, then Google would surely work harder to fix the pain points Apple identifies.
Do you have any examples? I've never seen anything like this in the app store.
Some apps give you limited access to their filesystem but (as a Jailbreaker) I'm pretty sure the larger system isn't viewable under normal circumstances.
Democratized? Is that because you perceive mobile devices as more affordable? Have you checked the price of a new iPad Pro?
Source: personal experience with multiple over-65 people for whom an iPad or iPhone is the first internet-connected computer they've ever owned.
In the context of this thread, the target user of iPad Pro is more than someone taking pictures of their kids and pets on weekends.
As for my opinion on this, I believe in using the right tool for the job, AND using multiple tools to get the job done. Sometimes an iPad Pro might be fine for certain stages of a project, for sketching out backgrounds or initial ideas for a design, and then importing that into another system such as animation pipelines or other applications on desktop or laptop or workstations. Depends what the job is.
I've never met a professional designer who works only on their tablet. Professionals love their workstations, multiple screens, and all the comforts and power of a proper setup. If you're just marking up a PDF, I wouldn't count that as professional work!
I wish I agreed with you, but I can't help but see that as elitist condescension against unsophisticated content.
> the target user of iPad Pro is more than someone taking pictures of their kids and pets on weekends.
I wish I agreed with you, but the iPad Pro is marketed as a tool for "serious professionals" when in reality that's just an aspirational message, and a large number of iPad Pro buyers will use them to take pictures of their kids and pets on weekends.
The same is true of DSLRs. They're supposed to be professional work tools, but the overwhelming majority of sales are to amateurs taking photos for trivial reasons. (That said, the DSLR market has now matured to the point where they do target the amateur audience.)
For what it's worth... On a computer, I'd look at log files.
On an iPad, I'd plug it into my computer... then look at log files.
That strikes me as very optimistic: the business needs come first.
A totally ridiculous dev machine for scientific or financial computing, no?
That's what I do when I need to do development on the go.
I’m typing this on an iPad now. I mainly use it for web surfing, Netflix, email and casual gaming.
Coda has a local server/browser pair you can use for simpler things. But usually the kind of web work I'm doing barely exercises a single core, it all runs on a $2.50/month VM.
maybe we're returning to the age of terminal-only development ;)
but with vector font rendering!
FWIW: the death of the Mac has been predicted constantly since at least the original iPhone came out. If it takes another decade to happen, at least we can lay to rest the accusation that publicly traded companies are forced into short-termism.
Only being able to touch the screen while using the iPad Pro in laptop mode feels really limiting.
I also think that the lack of a desktop UI and a mouse is a huge limiting factor of the iPad Pro.
I'm not much of a mouse user, so this is adequate for me.
Not like my life depends on a karma point; this is simply puzzling.
It achieves nothing of what people actually want mouse cursors for.
BTW, people use the mouse to select and change the cursor position all the time, so it's very relevant to the topic since it's a step toward desktop-like mouse support.
Is it a pro device if the choice to add a mouse isn't there? Many pros work in pixel accurate selection instead of pencil or finger.
See how AutoCAD used a command line for ages to implement precise control over coordinates; you don't need a mouse for that. At the same time, the imprecise things you use a mouse for (e.g. dropping a symbol from a library into a workspace, or connecting controls with outlets in an interface builder) are much easier to do with direct touch.
A mouse is quicker for a lot of selection and editing functions that touch doesn't come near. Touch has its own benefits.
Add a few select and very valuable uses (remote shell, remote access, desktop/graphic layout) and a mouse is much faster.
But yes, it does look like it's a product you can actually buy
Also Apple: here's a laptop with only a touchscreen.
Fanboys are going to have a field day selling this one. (Edit: it's already started, with workarounds for text input. Let's see how far it will go :)
Why don't you see a macOS future? It can use the T2 just as well as iOS, no?
What are you talking about? Apple won't allow developers to do any such thing. They only recently allowed a narrow range of NFC uses, even though the hardware itself has been there forever. You think they're going to let you develop your own kernel drivers? Maybe with a $10,000 "hardware developer" account.
While I understand the "you must own your own device" line, that's more of a rallying cry than a useful description. (I have a similar pedantic complaint with "if you're not the customer, you're the product," for what it's worth.) It's interesting to think about a regulatory framework that mandates companies build software-driven devices with the ability for end users to do their own software loads, but figuring out how to do that in a way that doesn't let companies wiggle out of it and doesn't essentially mandate insecure back doors in all "smart" consumer electronic devices seems to me to be a non-trivial problem.
This isn't the case anymore, unfortunately, because the Jailbreak community is smaller.
tl;dr: I tried to be adventurous, but was too impatient. :)
The least intrusive method is to allow users to add additional software sources and either opt out of sandboxing entirely, or only for select vetted apps.
This isn't hard; it would just hurt Apple's ability to take a 30% cut, as third-party stores would spring up.
User-owned personal computing devices have been a thing for decades; can we not pretend this is uncharted territory?
macOS may be the pickup truck, but it isn't going to go away without a fight. Not unless you can make software engineering work on an iPad the same way it does on macOS today.
They are doing this before the EU makes them do it. What Apple did do was force the market to adapt to a plug that can go in both directions. And they thinned out the plug quite nicely; it never needed to be so thick.
USB-C puts the parts that wear out quickly in the cable where they belong.
Really? In another life I worked at an apple store genius bar, and can't recall ever seeing a failed lightning port (physical damage aside). You know what's extremely common though? Schmutz. The port picks up pocket lint, and then the connector packs it all down to the bottom of the port. Eventually your phone stops charging. Often the gunk is packed so tightly it's not obvious even looking with a flashlight that there's anything in there, or that the connector isn't sitting properly. Just dig it out with a pin and you're fine.
Obviously I don't know what happened with your phone. All I can say is that in my experience (which is sadly extensive), if you had your phone replaced for a failed lightning port, the real problem was almost always schmutz plus incompetent tech support.
Edit: Didn't notice jen729w's comment when I posted. So... seconded.
That's a great tip; it never occurred to me, and I've just been living with an unreliable connection for months.
(I notice the lint in there because the Lightning cable doesn’t quite snap in all the way; each time I’ve pushed it in, I’m compressing the lint up at the back of the port, and eventually it gets too much and prevents a solid connection. Only happens about once a year, but it happens.)
Looking at a USB-C port, there’s still space for lint but much less space to get in and remove it. Can anyone with a USB-C phone share their experience?
You anticipated my experience exactly. My iPhone 6 got lint impacted as much as my Pixel, but it was a lot easier to fix the iPhone. Lightning does seem to require a more solid connection. USB-C will work longer with a "partial connection", which really just means I'm going to wait longer to clean it out.
Edit: I've never gotten any lint inside any of my USB-C cables though. Looking at the cable in front of me, it would be a bitch to clean out.
Actually, I'm not glad you mentioned it since that makes it even more perplexing. I was glad because if a port ever failed I would not have easily diagnosed it because I thought they were pretty good. But it must be my fault the cables are failing....
Another issue with lightning: the contacts arc. The middle pin on any non-brand-new cable is almost always a bit burnt — it’s pretty easy to see. I don’t know whether USB-C has mitigations for this.
I seriously don't understand the logic here. I've never had a port fail on me.
Cable snaps in nicely now.
The lightning contacts are flush with the port, so very unlikely to get damaged. On the other hand the USB-C has that fragile looking wafer on the device-side.
I've had phones where the plug felt very weak in the socket, and devices with a very satisfying snap and a very strong connection.
This is to say that USB-C is not inherently bad; it depends a lot on the hardware (and I'm sure Apple will get it right).
I suspect the drawing experience on here is better than with a ChromeOS device, but I care more about my web browsing/creation experience than I do about sketching.
I agree it's somewhat ridiculous that you can't get (real) third party web browsers, but Apple's is so damn good that I don't consider it a real problem.
It was only for the extension cable -- which are prohibited by the USB spec, so it was technically a "captive" extension. (The USB specs define captive as having a non-standard-USB plug, even if it's not physically captive.)
Blame the USB spec, not Apple. They're actually the only company I've ever seen make an extension cable that meets the letter of the spec.
Wasn't that because of the extender cable? As I recall extenders couldn't be USB compliant, so apple made a notch in the extender so that it wasn't technically a separate USB accessory from the keyboard?
So so so done with Apple laptops, though. It seems like the butterfly keyboards are here to stay, and they're just plain awful.
I've never owned an iPad, perhaps I'm missing something here. How does the above statement reconcile with the pictures I see on the page where there's clearly a black bezel?
Do they just mean there's not an area at the bottom with a physical button?
She also had a career in graphic design and now teaches oil/acrylic painting. I know from her, the art world likes frames. A bezel is not a bad thing.
I suppose if there is not a ready-made one that suits, making/hacking one, the old-fashioned way or through 3D printing, would be an option.
One reason I put a case on every one of my cell phones is to give me something more substantial to hold on to. The thin bodies and edges start to become a detriment, all the more so in that I have no interest in squeezing mine into the pocket of skinny jeans.
If that was built into the phone it would be much more expensive to replace, and I'd end up just putting another, slightly bigger $20 case around it to prevent that from happening.
The device you seem to be imagining where the glass runs all the way to the edge with no edge banding doesn't exist, and would be an ugly and unpleasant device if it did, as the edges would reveal the electronic innards (and heaps of glue).
Only to be placed in a $10 case with bunny ears so it won't crack if you accidentally drop it.
Slim, sleek, shiny phones which break when dropped. Easily dropped because of the shiny, slippery surface.
What to do? Put some cheap something around it.....
And I'm wondering why people ask me why I don't have a case around my phone.
I didn't get a thin phone just to make more room for the plastic protection, that's why...
This doesn't seem weird or contradictory to me at all. Nothing seems cheaper to me than my girlfriend's Android phone with rubber bumpers built in. Meanwhile my iPhones with cases look brand new when I'm ready to sell them.
It is sort of hilarious that miniaturization has gotten so far that we purposefully de-miniaturize miniaturized goods.
Presenter: “This display goes from its start...”
Presenter: “...all the way until it ends! We call this ‘entire’.”
Audience: <mad applause>
Audience Member 1: (whispering) “I can’t wait until I can get a device with an entire display. I’ll have to put in some OT at the job to save up, though.”
Audience Member 2: “What do you do?”
AM1: “I work at a company that specializes in digging half-holes.”
This is just a blatant lie by Apple's marketing.
Reserve your criticism for companies like LG who actually lied about their monitors:
But sure, there's no limit to criticism.
Everything is horrible.
How can you possibly look at what Apple is claiming and then look at the pictures and say the words aren't deceptive?
No, "from edge to edge" and "all-screen design" are not metaphors but as concrete as it gets, and obviously wrong. This iPad may be great but these marketing texts are just straight lies.
If I say it's fast as lightning, I'm not saying it's exactly that fast. It's a metaphor for fast.
The fact the speed of lightning is measurable and 'as concrete as it gets' doesn't change the fact that people use it as a metaphor.
You may feel the use of 'edge to edge' is an inappropriate metaphor that's fine, not every metaphor is fitting and, hell I'd even agree with you if you said that.
If you ask me, you can start saying edge-to-edge when the edge is at least 50-80% thinner than what it is now. But it's a metaphor nonetheless, and I'd feel fine applying it to extremely thin bezels which are not actually edge to edge. If you'd have said such a thin bezel was 'razor-thin' or 'paper-thin' when it really wasn't, I'd say it's still a perfectly fine metaphor.
However, there is no such common understanding of the phrase 'edge to edge' meaning almost edge to edge. Edge to edge in English means it goes from the edge of one side to the edge of the other, literally.
It's not a metaphor. Only Jobs had a powerful enough reality warp to make that fly…
Because in the context of displays there isn't really any big consumer product out there that actually has no bezel at all. Rather you have all these nearly edge to edge displays. And we talk about them as being edge to edge, razor thin bezels etc. There's no consumer out there who thinks he's getting a display without bezels. Everyone gets the metaphor.
> It's not a metaphor.
If you don't think it's a metaphor you'd have to believe that Apple truly thinks this device has no bezels, that consumers typically think it has no bezels, that journalists using these descriptions think they have no bezels, because they all take 'edge to edge' as being literal, rather than metaphorical. And that's simply not true. It is a metaphor, whether you (or I) think it's an appropriate one or not.
Anyway, looking at the dimensions of the iPad Pro 12.9" model, it's 11.04 inches wide with a 12.9" 4:3 screen. The screen itself is going to be 10.32" wide as a result, putting the bezel at 0.36", or around 9mm, per side. By comparison, the XPS 13's bezel is 5.2mm.
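The bezel arithmetic above checks out; here's a quick sketch of it (assuming the 11.04" body width quoted above and a true 4:3 panel):

```python
import math

diag = 12.9           # advertised screen diagonal, inches
aspect_w, aspect_h = 4, 3
device_width = 11.04  # iPad Pro 12.9" body width, inches

# For a 4:3 panel the diagonal splits 3-4-5, so width = diag * 4/5.
screen_width = diag * aspect_w / math.hypot(aspect_w, aspect_h)

# The two side bezels split whatever width the screen doesn't cover.
bezel_in = (device_width - screen_width) / 2
bezel_mm = bezel_in * 25.4

print(f"screen width: {screen_width:.2f} in")   # 10.32 in
print(f"bezel per side: {bezel_in:.2f} in, {bezel_mm:.0f} mm")  # 0.36 in, 9 mm
```

(This ignores any rounding in Apple's published dimensions, so treat the 9mm figure as approximate.)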
Lenovo also use it on this desktop with bezels the size of the moon: https://www.lenovo.com/us/en/desktops/lenovo/b-series/b50-30... saying it has a 'vivid 23.8" edge-to-edge display'
Microsoft use it to describe the Surface Laptop here - https://www.microsoft.com/en-us/p/surface-laptop-1st-gen/90f... - "Enjoy more space for your ideas with an edge-to-edge display and ultra-thin bezel." - specifically calling out the edge-to-edge display as being a separate thing from the bezels as well.
HP describe this desktop iMac ripoff with huge bezels as "edge-to-edge" here: http://www8.hp.com/h20195/V2/GetPDF.aspx/c06002849 - "An entertainment sensation; Sit back and enjoy a captivating entertainment experience. Elevate every stream, video chat, and photo with an edge-to-edge up to QHD display"
Acer uses it to describe a laptop in 2012: https://www.acer.com/ac/en/US/press/2012/50215 - "Featuring a 10-point touch edge-to-edge display and a larger trackpad, the Aspire V5-471P and V5-571P are designed to enhance multi-gesture content" - and it's a 2012 laptop, it has bezels.
The claim that only Apple use it and it's a lie and nobody else uses it in their marketing is nonsense.
 I mean literally the size of the moon, because nobody is allowed to use words differently without your approval, of course.
Also, whenever journalists and commentators write about Dell screens as being edge to edge, years before Apple, they're not lies, they're metaphors.
It only becomes a literal statement that is obviously a lie, when Apple uses it.
To everyone downvoting me above: you can disagree with the idea that 'edge to edge' is a proper metaphor to use for an iPad display with a 9mm bezel. I agree with you in full. You'd want something like 2-3mm bezels for that, at most.
But to say it's not a metaphor is silly. It would mean that you think Apple, journalists and consumers, all or some of them, consider the edge-to-edge marketing statement to be a literal one, that must be taken literally, and is obviously a lie, rather than a statement which must be taken metaphorically.
If you go to the CEO of Apple right now and ask him in an interview, do you mean literally edge to edge, or metaphorically, what do you think the answer would be? (apart from dodging the question)
After that you could tell him it's a crappy metaphor to use. But telling him he's lying because Apple means it literally is just silly.
and... Grammarly says "A metaphor is a figure of speech that describes an object or action in a way that isn’t literally true, but helps explain an idea or make a comparison."
Soooo this supports the idea that "edge-to-edge" can be descriptive but not literally true, and be described as a metaphor, right?
You can't just say "well, even though the words could literally apply to the object, and they do literally apply to the object in some cases, in this case it's 'just a metaphor'."
It's not. It'd be embellishment if you want to be generous. But not a metaphor.
The word "edge" in this context, literally means the edge of the device or screen. That's everyone's understanding. There is nothing metaphorical about it.
The word "lightning" for example, does NOT literally mean "fast". It literally means the electrical energy we see in the sky. When used to describe computer speed, it's obviously a metaphor, and well-understood as a metaphor.
You can't just claim something is a "metaphor" to excuse false marketing.
You can't just claim the definition is not the definition and expect people to side with you over multiple dictionaries.
> That's everyone's understanding. There is nothing metaphorical about it.
Literally everyone? Or figuratively everyone? You think the Apple marketing people cannot see the bezel and honestly think it's not there?
More than the definition, how metaphors are actually used is what matters.
Is "edge" a metaphor when the actual edge is right there, 9mm from the screen perimeter? There is no metaphor only 'edge-to-edge' window dressing. In the same way "lightning" is not a metaphor in a discussion about thunderstorms.
See my other comment in this thread, there is at least six years of prior usage by multiple big tech companies describing their product's screens as "edge-to-edge" without literally meaning edge-of-device to edge-of-device, with links.
And see my other other comment where I wonder why "edge-to-edge" gets your goat, when Apple describing the iPad Pro as "all screen" doesn't. Is that intended to be taken literally as well? There's no CPU, no memory, no battery, no glass, no other components, all screen?
Because the device is right there, it clearly has parts which are not screen, why aren't you frothing at the mouth about how it's a lie intended to "attract suckers" instead of a non-literal highlighting that the screen is large?
Like "all butter cookies" - that's a lie for suckers to think they are made of butter and no other ingredients, right? Because there's no way it could be read except literally, is there?
Non-literal descriptions are everywhere.
Irrelevant. This discussion was about the one and only use in town right now of "edge-to-edge" in a major campaign to sell new tablets.
"All screen" is more ambiguous. For starters "all screen" is not a pre-existing term. It could be used to mean there's no other elements or buttons on the front, except the screen. "All screen".
"Edge-to-edge" on the other hand, has explicit meaning built in. The primary component of which is measurement. "Edge-to-edge" refers to not one, but two hard edges, and describes that which spans in full from one edge to the other. There's no vague interpretation possible unless you force a square peg through round hole of English. It's either an edge-to-edge screen, or it's not.
I had to measure a washing machine recently to find out if it would fit. I measured edge-to-edge, and by that I mean the actual left edge to the actual right edge. But you knew that already without me explaining.... because I said "edge to edge".
Are you equally offended that this "floor to ceiling room divider" stops a couple of inches below the ceiling? https://www.amazon.com/Royhom-Privacy-Divider-Decoration-Apa...
Do you expect this "top to bottom house cleaning service" to include the chimney and roof and aerials because the "top" has a literal definition? https://top-to-bottom.cleaning/
Are you angry that "surround sound" only has a discrete number of speakers, usually 5 or 7, instead of a continuous surrounding panel?
Are you baffled by Thomson Video Networks' claim that they have an "all-encompassing video infrastructure" when you can see things in the world not encompassed by it? ( https://www.broadcastingcable.com/post-type-the-wire/thomson... )
Do you think "unmissable TV shows" are literally unmissable? ( https://www.makeuseof.com/tag/unmissable-tv-shows-watch-hulu... )
Why aren't you complaining about Apple calling the iPad Pro "magic"?
Why are you choosing "edge-to-edge" as the hill to die on, when Apple call the iPad Pro "all new" (it isn't), "all screen" (it's not), "all powerful" (it isn't), "a magical piece of glass" (nope) which "does everything you need" (even breathing?), "any way you hold it" (even covering the screen?), "true to life color" (even though you can't represent purples on an RGB screen?), "make everything look gorgeous" (even ugly people?), "the perfect machine for augmented reality" (even more than dedicated glasses? There can never be a better machine for AR?), "immersive games" when you can't immerse yourself literally in them?
The surround sound does literally surround you. Surrounding a person doesn't require a continuous circle.
They said it's an "all-encompassing video infrastructure for broadcast and multi-screen services." That has a pretty clear meaning that it covers everything you need regarding infrastructure in those domains. It is not a metaphor.
"Unmissable" has a second meaning, "too good or important to be missed."  It is a subjective statement that the episode is good.
Calling the iPad pro "magic" is the only thing here that is a metaphor.
I'm not picking this hill to die on. I asked one question. You say it's a metaphor, but neither of us have any idea what it's a metaphor for. I think what you're actually trying to argue is that it's completely meaningless marketing fluff that can be applied to any phone or tablet. Would you object to calling a feature phone edge-to-edge?
I also disagree with "all screen", which should have the same meaning as edge-to-edge. All your other Apple marketing term examples have clear meanings (except "all powerful", did they really say that?)
A metaphor is when you say one thing is something else to draw an analogy between them. It doesn't work when the something else is the same type of thing. It would not make sense to say "my car is a Lexus" as a metaphor to mean my Hyundai is nice. It wouldn't make sense to say "my car is turbocharged" to mean it's fast when it doesn't have a turbocharger. If edge-to-edge is a metaphor, I can only think it's a metaphor for a tablet with a screen that doesn't have a bezel. It seems like a straight-up lie.
My biggest issue with the edge-to-edge marketing claim here is that I have no idea what they mean by it. It was bad enough when they made that claim for the iPhone X, but the iPad Pro has a huge bezel. They might as well claim it fits in the palm of your hand.
I probably am, yes. I mostly think it's silly to argue that it "can only be used literally" in the face of it being used non-literally. It's a metaphor for its literal definition: this screen is "edge-to-edge" like a screen with no borders. It metaphorically has no borders because the screen is so big and the borders are so small; even though it literally does have borders, you won't notice them, and you'll accept the screen as edge-to-edge because it's very like one that is. Yes, it does seem like a literally false statement to sell people on an idea.
Would you object to calling a feature phone edge-to-edge?
I don't know. On the one hand I don't think it is, any more than I think the iPad Pro is. On the other hand, they can call it whatever they want; they will just say "the screen goes from the edge of the screen to the edge of the screen" and everyone will say "duh". Do I think it's harmful? Not so much, because the screen is visible. "All day battery life" was more misleading, because you can't see the contrary at a glance. "No fee" when there is a fee, way more harmful.
"Unmissable" has a second meaning, "too good or important to be missed." 
Well "Edge" has meanings "2.a. the line where an object or area begins or ends : BORDER" and "2.b. the narrow part adjacent to a border" and "2.c. a point near the beginning or the end" - https://www.merriam-webster.com/dictionary/edge
So "edge to edge" could meaninglessly but accurately be saying "from a point near the edge to a part adjacent to the other edge". :-|
(except "all powerful", did they really say that?)
Yep; top of the page here https://www.apple.com/ipad/ - All New. All Screen. All Powerful.
Meaning, presumably, "every bit of it is new, it's mostly screen, every component is high end".
It would not make sense to say "my car is a Lexus" as a metaphor to mean my Hyundai is nice.
"How is your new Hyundai?" "It's so luxurious and feature-filled, it's totally .. i dunno, it's ... very Lexus! Yeah, my car is a Lexus by another badge!". You wouldn't understand that?
It wouldn't make sense to say "my car is turbocharged" to mean it's fast when it doesn't have a turbocharger.
"How is your new Tesla?" "FAST it's so fast it's turbocharged supercharged bullet train rocket engine awesome" "I don't understand, it cannot be turbocharged because there is no internal combustion engine exhaust gas to spin the turbine, why are you lying?"
I think that's the core of the dispute here. The screen on this iPad is nothing like a screen that's edge-to-edge. It's surprising and confusing to see it described like that with such a clear thick bezel in all the images. I and a lot of other people in the comments here don't accept that this screen is edge-to-edge.
all-screen and edge-to-edge, however, are not.
Problem is, the iPhone X family got rid of the chin... but has the dreaded notch, and not even the mostly acceptable teardrop everyone else standardized on.
At least the iPad Pro adopted a chinless and notchless design. First time in years Apple has made a product that isn't weirdly ugly.
You mean "copied with a tiny alteration".
>At least the iPad Pro adopted a chinless and notchless design. First time in years Apple has made a product that isn't weirdly ugly.
You probably didn't get the memo, but the "oh my god, it has a notch, oh noes, so ugly" thing some pundits tried to pull died at birth; not only did competitors copy the notch, but the X itself became the best selling phone.
Even ignoring that, everybody pretty much praised the Apple Watch 4 as beautiful as well...
Did you know that Google had to ship an update for the Pixel 3 XL that "disables" the notch in software (by making the background for the part of the screen around it black, making it less noticeable), because of how many people complained about it?
In general, Apple just tells people they invent and do everything the best, and everyone listens. It's funny, because my LG V35 is much thinner and lighter than my girlfriend's new iPhone XS Max (which has a marginally larger screen), doesn't have a protruding camera lens, and it actually has a headphone jack with a high-end DAC (which basically no other phone has). So Apple says they need to get rid of the headphone jack to save room, and yet here we are.
I specifically avoided the LG V40 because they added the notch. As with the new iPad, I simply don't understand this need to extend the screen all the way to the ends of the phone along the long edge. This is especially true on the iPad, which I nearly always hold horizontally with my thumb on the perfectly sized bezel. Getting rid of the bezel literally adds no functionality for me and actually removes some.
It's a trend among Chinese phones (Huawei, Oppo, Vivo, OnePlus... basically BBK Electronics).
The OnePlus 6T is the first phone popular in the West with this style, and I find it to be amazing.
Still like the Mi Mix 3 the most, with no notch. But the OnePlus 6T is the first one that's not egregious.
Not very catchy though, perhaps "innovative CDATF notch - uses 66% less screen space" would sound hype enough for the Marketing department?
Also, two phones does not make a standard. Even the Mate 20 Pro has an iPhone-style notch.
And indeed I was referring to the Mate 20 and Mate 20X rather than the Pro.
- 8 core CPU (4 low and 4 high power cores)
- 7 core GPU
- HEVC encoding/decoding
- Neural Engine (they haven't made any comparisons to the A12 sans X neural engine, so at this point I think it's the same)
- Audio DSP, Storage controller, Image Signal Processor, Apple performance controller, Depth Engine, &c.
- Files is way too limited. It should be possible to exchange the system-supported data types between all apps via files. You can't even add an audio track to Music or a video file to the TV app as it is right now.
- The split screen feature is a good one, but so far I have rarely found it useful. Not all apps support it, and apps differ in their behavior. The 50:50 split is nice, but the other split ratios I found less useful. For some tasks you need overlapping windows.
- In the same vein, one of my favorite features is picture-in-picture for playing video. Unfortunately, again, apps differ in behavior, and I don't understand the size limit; I would like to be able to resize the video to any size.
- And of course, coding and running plain Unix utilities. While I am happy about the security the app model with sandboxing brings, for a "Pro" machine this is too restrictive. Apple should allow something like Termux on the iPad; even if it were limited to a sandboxed file system, it would make the iPad so much more useful as a computer. Apple might even release a lightweight Linux VM as an app.
If Apple could remove these pure software-restrictions of the iPad, they could attract a lot of "Pro" users, I think. Disclaimer: I am an iPad Pro owner, fully in the Apple universe :). iCloud sync already makes it a much more useful device, but I keep hitting my head against the limitations. Running "Blink" to connect to my Linux server makes it almost a laptop, but is very limited.
I could do real work with a browser + terminal client, but have a few issues:
- None of the terminal clients really give you 100% of a normal keyboard (been a while, I forget what doesn't work right).
- Splitting the screen doesn't work well enough. I want 2 windows (browser and terminal) that are each 75% of the screen and the ability to switch between them, preferably from the keyboard. A 50/50 split doesn't work well, and switching the 70/30 split between windows makes a mess when it redraws my terminal on resize.
That's not to say what they've done isn't impressive, or that they don't plan to kick Intel to the curb... but it is entirely possible this will be another PowerPC situation, where PowerPC outperforms for a while, Apple switches, and then Intel gets its act together and the others can no longer compete. If Intel were to merely ship a good integrated GPU that could compete with Nvidia's, Apple would have a hard time competing.
Something to keep in mind too, iPad Pros have a much more constrained TDP than your average laptop, so it isn’t merely just a matter of performance but performance per watt and sustained performance.
There is also this question: given a higher TDP, what could Apple’s silicon team really do? Although I’m not convinced it makes financial sense for them to replace Intel with custom silicon, I’m happy to be proven wrong.
I am happy that the iPad Pro is pushing the limits, it is about time that the Surface Pro line had true competition, it will push both to be better.
This will be big especially if they can get major price or power usage advantages.
I can well imagine Apple expecting people to plug in a keyboard and monitor to their iPad Pro, while it gets thinner and thinner.
I don't doubt that Macs will eventually fade into the sunset, but I don't think it's coming nearly as quickly as either pundits or pessimists seem to believe. "Marzipan" may indeed be a first step, but I think it's going to be a Carbon vs. Cocoa situation, taking many years to fully transition. For iOS to supplant macOS, it has to essentially do everything macOS does. It doesn't have to do everything the same way, but if it's going to be a full replacement for a general purpose computing platform, it has to be, well, a general purpose computing platform.
†Technically, "grudgingly-tolerate-it" should be in there, but it doesn't flow well.
The TouchBar was a good example though, even if it didn't receive the warmest reception on the market, it is still a good piece of Mac-specific engineering.
And at some point, Apple will introduce an "iPad mini" with no display
Presently the T series is generally gimped variations of the A series. My understanding is that the T2 is essentially an A10 with the high performance cores disabled, and the low performance cores repurposed to manage the myriad of functions that were previously handled by dedicated controllers. They're also necessary for certain features like TouchID, and for Apple to use its own HEVC encoder/decoder.
Presumably the T3 will be keyed off of a later part like the A12 or a later generation, which will make it the first T series chip to also include the neural engine. Assuming they don't disable that part, it will potentially make the NE available to Mac application software.
So what's the real problem with ripping out Intel right now? Given a higher TDP, Apple could hypothetically design a much more performant chip that would replace both the CPU and GPU in their current Mac lineup, but they simply don't push enough Macs to justify the expenditure on custom silicon, right now at least. Probably doesn't help that they have been making existing customers less happy, but it also doesn't help that Intel has simply been re-releasing Skylake in different variations for a good few years now.
If you have a Macintosh able to run Mojave, an SSD, and at least 16GB of RAM, you are probably set. The important thing about Mojave isn't so much any one feature; it is the continued support, including bug fixes and security updates. Given all of that, most people simply won't need a new system any time soon. There's essentially no reason a laptop from 2012 can't take you all the way to 2022, all other things being equal, so we're starting to see sales stagnate across the entire PC market, including the Macintosh. While Apple is selling phones about as fast as they can stamp them out, the same is not true of MacBooks or iMacs.
Silicon is much more of a volume business. Apple can repurpose older versions of their A series chips because those are an R&D and capital investment that has already paid off; it isn't a new design, and they just need someone to continue fabbing them. New generations of the A series chips will generally pay off, because they will be selling hundreds of millions of the same iPhone, usually for more than one calendar year. This past year, for the first time since the A7 was released, they skipped releasing an -X variant of their A series chips, which are generally released to support new iPads (there was no A11X, for those who may have forgotten). It is possible that iPads have not been selling in the quantities necessary to justify even a variant design, and rather than an annual refresh cycle we could reasonably expect them to move to a biennial one, which is why they skipped the A11X and jumped straight to the A12X.
Intel and Qualcomm have customers, but for Apple's own silicon, Apple's only customer is itself, by design. That also imposes a limitation on themselves that it just might not make financial sense to fabricate a series of chips exclusive to their Mac lineup, appropriate to the TDP of a Mac.
By all means, tear this apart if you disagree.
Apple certainly markets their iPads as a competitor to the MacBook, so maybe they’re not looking to use custom silicon in the Macs but simply to phase out desktop computers when iOS and third party applications are ready to replace PCs for the most part.
Isn't this becoming equally true for phones? Perhaps not for hardware as far back as 2012, but I'm expecting my 6S to last me a very, very long time. It's fast, has a great screen (and headphone jack), and I'll get the battery replaced as many times as I need to.
But this doesn't appear to have slowed iPhone XRS+ Maxx sales one bit.
I would add that the faster upgrade cycle, and therefore the money worth spending on developing smartphones, is also because people are used to carrier subsidies and spreading out the cost. So it's kind of a virtuous circle of investment that the PC market doesn't have.
Makes you realise what a perfect storm the iPhone was: a massive and growing market of people paying through the nose, annually, for a crap product, just waiting for someone to take that money and invest it in something good.
I meant to say there was no A11X. I'll edit that in shortly.
Benchmarks from the first device generation: https://www.techspot.com/review/1599-windows-on-arm-performa...
Yes, most iPads in existence are much slower than this iPad Pro, and yes, most people will be buying the inferior sub $400 one.
But also, most people have PCs with an Intel GPU, or a really low-end discrete GPU... that doesn't stop game makers from making games that can't run on most machines, or Adobe from making Photoshop.
Let's see what happens with the iPad.