Apple introduces macOS Mojave (apple.com)
441 points by ihuman 10 months ago | 636 comments



The complaints here that the new macOS doesn't have anything exciting or interesting are strange. I thought the best thing about the new macOS (and iOS) is that it's supposed to be Apple focusing on stability rather than new features. Given all the bad things that have happened with macOS in the past year, I thought this would be appreciated.

https://www.macrumors.com/2018/01/30/apple-focus-on-software...


High Sierra didn't have much that was notable or interesting either, and it proved to be the least stable release in a while, by a damn sight.

Lately it seems like Apple has downshifted in competence all-around (at least on the Mac). I'll believe evidence to the contrary when I see it — not sooner.

I know that makes me seem like an HN Apple hater. And it pains me to say that. I've been a Mac fan since the mid-90s. I've tried to give them the benefit of the doubt for a few years, but at this point they've blown through that into deep red territory. It makes me sad :(


I would consider a brand-new file system notable. Maybe it doesn't matter much for average consumers, but we, of all people, should appreciate how big an endeavour a new file system is.


I certainly appreciated it when I went to install the beta this afternoon. Took two seconds to add a volume, as opposed to many minutes (and the risk of data loss) partitioning in the old days.
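For anyone curious, adding an APFS volume really is that quick; the container identifier below (disk1) is just an example, so check `diskutil list` first:

```shell
# Find the APFS container identifier (e.g. disk1)
diskutil list

# Add a volume to it; APFS volumes share the container's free
# space, so there's no repartitioning and no resizing step
diskutil apfs addVolume disk1 APFS "Mojave Beta"
```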


In my case it resulted in total data loss, forcing me to do a fresh install; then, within days, the file system filled to capacity with Time Machine snapshots that couldn't be deleted due to a cyclical dependency. I had to figure out how to reinstall on HFS+ and don't plan on ever trying again.


It's a "brand new" file system with implementation details that are a disaster. Lots of apps, including Steam, just don't work on it if you use case sensitivity, which is just silly. I had to create a virtual disk and mount it in my install folder so Steam could work. It's 2018; my $3k MacBook Pro shouldn't have this problem.
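As a sketch of that workaround (the image size and the Steam library path are assumptions, not exact): create a case-insensitive disk image and mount it where Steam keeps its files. Plain APFS is case-insensitive by default, which is the point here:

```shell
# Growable sparse bundle; plain APFS is case-insensitive by default
hdiutil create -size 100g -type SPARSEBUNDLE -fs APFS \
    -volname SteamLibrary ~/SteamLibrary.sparsebundle

# Mount it over the folder Steam uses (path may differ per setup)
hdiutil attach ~/SteamLibrary.sparsebundle \
    -mountpoint ~/Library/Application\ Support/Steam
```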

Give me my damn physical Escape key back! The Touch Bar is the silliest thing ever.


While I have many issues with the new file system as well, it is difficult to justify blaming it for Steam not running. Steam for Mac has never run on case-sensitive file systems since its inception, and Valve has continuously refused to fix this despite numerous user complaints. This is hardly Apple's fault.


I was saying this almost 10 years ago, when various applications were blowing up on case-sensitive HFSX, and I never got very good explanations why. It wasn't just a few developers, either; they were all telling me that case sensitivity on macOS was a problem. They'd get things working in one build, and a new dev build would break their app, but only on a case-sensitive file system, not a case-insensitive one. So the fact that it's 2018 and this is still going on with APFS? shrug It doesn't surprise me one bit, even though I don't understand why it's still a problem. Two different file systems, same problem. It sounds like Apple just doesn't care to properly support or test case sensitivity; does their own test suite not catch problems that show up in the real world?


As much as I value the new file system (interestingly architected by the chap who did the BeFS, if only Apple had bought Be Inc!), it appears to still be lacking all the wonderful features of NTFS.

I'm still on Sierra at the moment, as I'm delaying upgrading until all the horrible bugs of the new release have been found. In everyone's opinion, is now a good time to leapfrog High Sierra? Time will tell.


Anecdote is not data, but I had huge issues when I first updated, so I rolled back. About 6 weeks ago I bit the bullet and updated again and it's been fine.


Thank you. Anecdotes help, because I know only one other Mac user, and he's not overly technical (my day job is Windows C++, so there's little Apple love around here).


It was notable when it nuked my install drive on update and I had to restore from backup ;) That was a first for me with an Apple OS.


I'll second that it's the least stable. It broke a whole load of stuff I needed for work. I eventually fixed it up (or found kludgy workarounds...) but I definitely resent encountering that immediately after updating.


So. The most important feature of this new release "designed for pros" that gets top placement on the announcement page is ... dark mode. The second is stacks. The third is the ports of iOS apps (including the long overdue Home).

I find it very hard to be excited about a yet another "major" release that doesn't even qualify as a minor version bump.


I’ve been wishing for dark mode for all 6 years I’ve had my MacBook Pro, so I’m pretty happy with that alone. I also have next to no qualms with OS X in its current state so I’m pretty easy to please. I will say that the hardware quality on the other hand has taken a nosedive lately.


It looks like dark mode is not a global option like on Linux; it needs to be adopted by individual apps. That means when I open old apps at 2 AM, my eyes are really going to hurt...
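If it helps, you can at least check from a terminal whether the global preference is set (behavior as of the Mojave beta; worth verifying on your build):

```shell
# Prints "Dark" when dark mode is enabled; the key is absent
# (so the command errors out) in light mode
defaults read -g AppleInterfaceStyle
```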


It will depend on which APIs the app is built with. I don't know which desktop environment you are referring to, but I assume it's one of them, not all Linux-based operating systems. In that Linux DE, if I run xeyes, will it come up in dark mode?


Does macOS have Night Shift like iOS does? If not, f.lux really does the trick.


Yes, it does indeed. It operates in much the same manner: set by the clock or enabled manually.


Perhaps try sleeping at 2AM.


Not everyone is lucky enough to work 9-5.


Maybe try telling that to every user?


OK. Everyone, get some sleep.


It's so kind of you.


Speaking of Stacks, I keep remembering this different feature: https://www.dropbox.com/s/loojgwkskhqn4lu/About%20Stacks.pdf...


It's not designed for pros. Linux is. Apple is about easy usage, and they benefit hugely from having Linux underneath, which is why real pros even consider them. It looks nice, and that's what people like.


It's not "Linux underneath".

It's a Mach microkernel and a BSD userland taken from FreeBSD, which coincided with them hiring the founder of FreeBSD into a role to do release management. He's left since.

The fact that you don't know this suggests you might not be aware of macOS fundamentals: the history of OS X, or the Mac OS that preceded it; the design decisions that went into all of those; the user groups they targeted at key points (including with the adoption of the FreeBSD userland); or their overall design intent.

I therefore struggle to agree with your premise that it's "not designed for pros", or that you are qualified to make that assertion.


Their definition of 'Pro' stretches much wider than developers. Traditionally, their Pro Apps were Aperture (raw editing), Logic (audio workstation) and Final Cut (video editing). Of course, the Adobe suite fits that moniker as well. For most people doing video/audio/photo editing, macOS and Windows are the primary choices (though you can obviously do this on Linux as well).

Note: macOS is not using Linux underneath. It uses the XNU kernel, which is based on Mach and BSD.


For the longest time, MacOS was the OS for professional print, professional graphic design, professional sound design and editing, and professional video editing.

When MacOS moved to a BSD-based system, it won over a large portion of software development. Go to almost any IT conference, you'll see people with MacBooks running MacOS (some run *nixes on MacBooks because of hardware).

In the past 4 to 5 years though Apple has let MacOS more or less stagnate.


I take Apple at their word. If they don't claim to have focussed on stability in this release, then I won't assume that they did.


I don't think they'd ever admit to focussing on stability because that would imply the OS isn't already stable.


But that's exactly how they marketed Snow Leopard (macOS 10.6). They even said it would have "no new features", instead focusing on making it faster and more reliable.


Fair enough, Snow Leopard marketing was before my time noticing Apple's existence. I've just noticed that in general they never admit that they've done anything wrong or made any mistakes, and it seems hard to market a release for stability without somehow admitting to existing mistakes.


They joyously announced zero new features for Snow Leopard over Leopard and it has been known as the most excellent release of OSX, ever (before the iOS-ification of the system and deep iCloud integration).


SJ had plenty of moments where he admitted mistakes, together with the new solution. IMO he was much more engaged with user feedback than Cook/Ive/Federighi.


Bingo.


I was hoping they were going to remove some features that didn't work out - Launchpad for example, which seems to have been a spasm in macOS a couple years ago to make macOS more suitable for a touch screen that never came.


My understanding is that a lot of users use Launchpad (possibly all those who don't have the Applications folder dragged to their Dock and don't know they can just type an app's name into Spotlight to run it).


It's somewhat useful on a laptop, where it's easily invoked by a trackpad gesture; I've seen a few users who consider it to be where all their apps 'are'.


I for one am extremely happy if they work on stability rather than gimmicks (or worse, breaking my workflow with pointless changes for the sake of change).

The problem is, it's difficult to be excited in advance about stability that is promised now but can only be verified months later.

But I don't care really. Give me an OS which works and doesn't get in my way. Something without ridiculous amounts of telemetry, please.


The story could also be different: very low effort on macOS -> not many new features -> not much work on stability issues either.


Did they announce all the fixes? If not, then I'd assume that's why people aren't focusing on them.

I too would appreciate stability and fixes. I know they are listed in the notes for each release. Still, if that really was the focus, it needs to be communicated.

Note: I didn't watch the announcements, so if they did emphasize that, great!


I think no matter what the focus is, they’re never gonna spend time during highly publicized events highlighting bugs in previous versions that they’ve fixed!


I'm 100% with you. I heard nothing about focus on stability, or bug fixes. Really, just silence about it.


Seems like some people still have this ongoing "war" concept in their minds between the major desktop OSes. They demand breathtaking features that will make their OS stand out next to the competition, and all they're getting are gimmicks, changes for the sake of change. I don't find that surprising either: for more than a decade, OSes have been feature-complete, and you can do your everyday chores whether it's Windows, OSX, or any *nix with the DE of your choice; the difference is in the style each one offers. So it's hard to deliver something really meaningful that would change the workflow, especially when the majority of people are happier with mobile environments.


I'd disagree. I can think of multiple features that only OSes, not apps, can improve upon.

One is, creating a common platform for apps to share data, live data, with one another. By live data, I mean editing a spreadsheet in Excel, and seeing it change a chart in Photoshop, instantaneously. Not after saving the file. Not after hitting a refresh button somewhere.

I know, this has been possible for decades, but there are no apps taking advantage of it, mainly because these things need a common platform with well-defined standards.

This is something that OSes are best positioned to implement, but none do.


Maybe because it doesn't actually mean anything. It's like those random numbers they pull out... 40% faster app starts, etc. In practice, you won't even notice it in real usage. They take some part of the startup procedure and defer it to the background so the app seems to launch faster. It's smoke and mirrors.


I am guessing you have some specific examples of that happening in mind? Otherwise, you’re just spreading FUD, not to mention insulting the developers working there.

PS: I am pretty sure caching, pre-compiling, etc. are perfectly reasonable optimization techniques that many of us rely on.


WWDC 2018, from underwhelming to boring. I will lose my mind if I hear *oji one more time. Will they ever stop with this nonsense? Favicons in Safari? It took them six years for the browser to remember the zoom setting per site; now we get the luxury of favicons.


Wait for the Platforms State Of The Union video to become available. That's usually where the good stuff is. The keynote is often more consumer oriented to grab some media headlines.


Very little good this year.


It's amazing that so many bright minds are wasting time on *oji.


Always interesting to see the divergence between HN and what the average user cares about. For most people emojis (and animojis) have opened a whole new way to communicate with each other. I can't think of another linguistic feature in history which saw such widespread use within a decade and I feel like emojis don't get enough credit for that.


Agreed -- as a Deaf person who uses visual languages to communicate, it's allowed me to actually begin to express myself using a set of pseudo-language visual elements. Some of them roughly map over to ASL expressions; I am looking forward to when they add ASL-specific emoji, or ASL language support, or even hands in the new animoji.


That's... actually pretty extraordinary. In my cynicism I had imagined that the only possible motivation for positioning new emoji as flagship features of iOS iterations was a blatant attempt to lure pre-teens into a long (and lucrative) journey into Apple's ecosystem. I think perhaps I should take a few steps back and absorb the idea that:

1. people communicate in many different ways and

2. these things aren't valueless just because I don't picture myself using them. And in retrospect, I only struggle to picture myself authoring one; receiving them is completely fine.



I hadn't considered this at all. Thank you for expanding my knowledge.


I'm really curious about this; thanks for adding it and illuminating so many of us who fail to consider deaf people when thinking about these things.

But I'm not sure I totally understand what Animoji does for you. Your deafness is irrelevant in the context of messaging apps, isn't it? Hearing people, when using Messages, are limited to writing/reading, just like you.

Is it that, being deaf, you are more used to putting extra emphasis on non-verbal communication, and thus the transition to Messages from "real life" conversation feels more limiting than it does for us? How is it any better than sending a short video, or maybe recording a GIF of your actual face? Doesn't the loss in fidelity make it frustratingly hard to express the nuances we get from facial expressions?

I tend to think all of those grandiose statements about "opening new ways of communicating" or "creating new connections between people" are total bullshit. Most animoji users communicate equally well with or without them; it adds nothing except fun. And that's fine! Fun is good.

But it's really hard to put oneself in anyone else's shoes, and I'd love to know if anyone has found more value in animojis.


There's no way to say this without being insensitive, but this reminds me of a grade school joke without a punchline - "how do you write in sign-language?"

Can you help me better understand what ASL language support would look like or what that would mean to you?


I'm not deaf, but as per the GP's comment, I can immediately see animojis with hands performing sign gestures.

If you consider how ingrained and emotionally significant the expressions in your native language are, animojis with hands, which will take that emotional significance and put a cute spin on it, are going to be massive.


ASL (or any language's sign language) isn't a simple translation of gestures for words or letters; the different sign languages have their own grammars and a different "vocabulary". Think of every sign language user as being bilingual, with a sign language and a written language.


ASL is not English. It is an entirely separate visual language with different syntax, lexicon, etc. ASL speakers can communicate to each other over mobile devices using text in the same way that, say, 17th century intellectuals communicated using Latin, or late-dynasty Chinese officials communicated using Classical Chinese.

I do not know how to sign, but I do imagine that a member of the signing community would not feel truly at home in their digital life in that they cannot type the language that they "speak" and very probably think in, but must rather resort to a second auxiliary language whenever they interact with text.

Like, imagine if Apple (or Google or anyone else--this is an industry-wide issue) made it technically impossible for you to communicate in anything except French. In this hypothetical world it is not a show-stopping issue, because you are fully proficient in French, having used it in some way nearly every day of your life, and so are all the people you would want to communicate with. But it's not your mother tongue, and so you wouldn't really feel at home or fully included in the digital world, now would you? Texting your family and close friends in French when in fact all your other interactions with them are in spoken English would just be weird.

(Incidentally, I do wonder if Swiss German or Scots speakers feel similarly, and if they don't, to what extent that serves as a counterpoint.)


>(Incidentally, I do wonder if Swiss German or Scots speakers feel similarly, and if they don't, to what extent that serves as a counterpoint.)

I don't know about Swiss German speakers, but Scots speakers tend to be fairly comfortable in code-switching between standard written English and a transliterated form of Scots.

Scottish Twitter is as culturally distinctive as African American Twitter:

https://www.buzzfeed.com/hilarywardle/get-right-inty-the-mic...

Wikipedia has a Scots language version:

https://sco.wikipedia.org/wiki/Main_Page


I've never heard anyone call it anything other than black Twitter - what are you doing lol


Swiss-German sign language is... interesting. It has Cantonal dialects (I wish I were joking!). I'd really prefer everyone switched to German sign language for simplicity's sake, in the same way most Swiss-German speakers write High German instead of dialect.

I could imagine signing emoji being pretty successful. My son is deaf, but too young to be using chat software, so it's difficult to say without asking around at his school.


I meant the Swiss German spoken language, which is not High German.


I would imagine that ASL speakers have usage patterns that they would like to express via text messaging without simply "translating" them into formal written English. That's no different from why English speakers use emoji (a smiley instead of "I am happy") or informal onomatopoeia ("uggh" instead of "I am annoyed").


I love (sarcastically) how Hacker News has no support for emoji. What year is it?


It actually has code to handle emoji: by deleting them. You can find older posts that do contain emoji, from before they were blocked.


Yeah, I noticed that after submitting my comment, but I figured I’d leave it there for the irony.


If you're not entirely without hearing, you may be interested to know that the iOS 12 beta appears to have new functionality that lets AirPods act as hearing aids.


I suffer from social anxiety. There were a lot of times in the past when I would receive a text and fail to reply because I got stuck trying to figure out how to express myself "correctly". Emojis and the normalization of emoji-heavy texts help me a lot.


How is an emoji a whole new way to communicate?

EDIT: I meant this as a genuine question, not as a "how could you possibly think this" response.


I don't mean this to sound in any way cruel or judgemental, but a very large proportion of the population have very limited literacy skills. Emoji are useful for all users who are writing short, personal messages that might be ambiguous in tone. They are extremely useful for people who would otherwise struggle to express or understand tone and emotion using the written word.

In the last National Assessment of Adult Literacy, 43% of Americans were assessed as having "basic or below basic" literacy. They can extract basic factual information from short, straightforward texts, but little more than that.

Here are a couple of example questions from that test.

Only 33% of Americans could describe what is expressed in the following poem:

"The pedigree of honey / Does not concern the Bee - / A clover, any time, to him / Is Aristocracy"

Either a literal or thematic description of the poem constitutes an acceptable answer.

Read the text at the link below. After reading this text, only 16% of Americans could describe the purpose of the Se Habla Español expo.

https://nces.ed.gov/NAAL/sample_imgtxtequiv.asp?Imageid=164

Acceptable answers include any statement such as the following: "to enable people to better serve and sell to the Hispanic community", "to improve marketing strategies to the Hispanic community" and "to enable people to establish contacts to serve the Hispanic community".

Did you get the right answer? 84% of Americans didn't. Bear that in mind when you're writing documentation or dialog boxes.

https://www.nngroup.com/articles/writing-for-lower-literacy-...


> In the last National Assessment of Adult Literacy, 43% of Americans were assessed as having "basic or below basic" literacy. They can extract basic factual information from short, straightforward texts, but little more than that.

That's intentionally misleading and it's thrown around frequently without clarification of what the basic and below basic levels exactly mean, how they compare to the rest of the world, and who is in the figures (a lot of non-English speaking immigrants), usually to try to prove points.

The US basic literacy level is a high bar compared to what 95% of the planet actually tests at. Over half of China is below basic by the US standard. Over half of Eastern Europe is below the US basic line, including Russia.

In the US, ~44% of the below-basic population are non-native English speakers who didn't speak English at all before starting school; 39% are Hispanic adults. I.e., this group overwhelmingly consists of currently or originally low-skill, poor immigrants (people who wouldn't even be allowed into most other developed nations, such as Canada).

Demonstrating that effect in action, 43% of hispanic adults test poorly in literacy, compared to about 10% of white adults. Gee, I wonder if immigration into a new culture + language barrier has something to do with these numbers.

Despite a vast immigration flow of low skill, poor, low English literate persons since 1980, the US literacy rate didn't drop meaningfully. That means literacy rates for the base population increased.

Despite all of that, the US is the 7th most literate nation on earth, in front of: Canada, Germany, the Netherlands, France, New Zealand, Belgium, Israel, South Korea, Italy, Ireland, Russia.

https://www.washingtonpost.com/news/answer-sheet/wp/2016/03/...


> The US basic literacy level is a high bar compared to what 95% of the planet actually tests at

Americans are well educated relative to the global population. That isn't what we're discussing. OP is explaining why large swaths of the population might prefer communicating with pictures over words. It isn't that they can't understand words. Just that parsing and constructing language to express complex thoughts isn't a common experience for many, for whatever reason. Emojis fill that gap.


My comment was not intended as a critique of the American education system. Immigrants buy phones and computers. They run businesses and use SaaS products. Non-native English speakers are an important demographic that we need to keep in mind when we are designing products and writing documentation.

In a globalised world, a great many people are frequently communicating in a language that they have not fully mastered. South Africa has eleven official languages. India has 22. Globally, non-native English speakers outnumber native speakers by two-to-one. Hindi/Urdu has a roughly equal number of first and second language users.


I guess I expected HN not to understand emotion-based communication. After all, "disagreement" is assumed in pretty much every comment.

But as for deciding it exists because people aren't smart enough to use words, U+1F914.


What's the proper set of answers about the Dickinson passage? Does any interpretation count as correct?

(I know this thread isn't about the methodology of literacy assessment, but now I'm really curious to know how they do it. Does publicly available question-level response data exist somewhere out there? from previous years?)


Yeah, I'm interested as well.


A footnote to your comment is that only 80% of US inhabitants have English as a first language - https://en.wikipedia.org/wiki/Languages_of_the_United_States. And amongst those there's literacy and education levels to consider.


Thank you for the reasoned and considered reply.


I would be interested to find out if things were that bad in the 50s or 70s. It does feel like the intro of the movie Idiocracy is happening. Even among sophisticated people: when you watch interviews or speeches of public figures from the 30s or 50s, their spoken English (and this applies to other Western languages too, French for sure) was so much superior to even your typical written newspaper article today. Trump's speeches, made up of no more than 100 distinct words, are merely a dent in a downward curve.


"Whole new way" is a bit hyperbolic, but an emoji definitely brings layers of meaning that are normally only feasible in in-person conversation into the written realm. It would be pretty hard to express the sentiment the ¯\_(ツ)_/¯ emoji can, and even if you put the effort into writing it, you'd lose the immediacy.

It's also interesting how they work as reactions. If you hit "like" on something, for instance, you don't have to explain why you approve or add your own commentary, you just indicate your approval. And if you write a comment on a thread, there's a certain expectation that it contain an original thought or that it demands a response. So reactions manage to avoid a lot of inane filler.

For a good illustration, watch a thread on Facebook where they say "type AMEN if you agree!" and you get a thousand people tediously writing it out. If there were just a little prayer emoji and a counter, you get closer to their actual intent, since that's literally what "amen" means.


> It would be pretty hard to express the sentiment the ¯\_(ツ)_/¯ emoji can, and even if you put the effort into writing it, you'd lose the immediacy.

"meh"


Funnily enough, I have no clue what "meh" means.

1) The emoji has several possible interpretations depending on context, such as "I don't know", "I don't care", indifference, or a shrug. There's a saying that "a picture says a thousand words"; it applies here (nobody said the words couldn't be a few words in hundreds of different languages ;)).

2) The emoji generally does not require translation into different languages. It isn't universal, but it's more accessible, and some emojis certainly are universal (such as :), a facial expression my 3-month-old understands).


> emoji generally does not require translation to different languages

Really good point. I don't find watching a keynote speeches about emoji at all exciting, either. But I work on a mixed-language engineering team with a lot of (extremely smart, highly literate) people, and we use emoji all the time.

Whether it's cold-sweat-face or thinking-face really helps me understand the nuance of my colleague's Japanese comments (which, btw, as a native English speaker with only fair Japanese ability, gives me a free opportunity to experience 'basic or lower literacy').

I know that adding emoji characters helps them in the same way, so I use them frequently.


I read that as "who knows?"


Somehow comparing emoji to the worst parts of Facebook's brand of gamified socialization doesn't make it sound any more appealing.


> It would be pretty hard to express the sentiment the ¯\_(ツ)_/¯

That's what acronyms were used for. Back in the days of AIM/ICQ, I never felt I had issues expressing myself with just text. It was text supplemented with a healthy dose of emoticons and acronyms, which leads me to believe that emoji are redundant.


Emoticons are severely limited in their range of expressivity. It’s hard to do much more than :) :( and :/. Acronyms are limited in both range of expressivity and audience size. Acronyms require prior agreement on what they mean, so there’s a barrier to their basic use and another barrier separating people that don’t know what they stand for. This severely limits your audience size and the number of acronyms you can use.

What’s really interesting about emojis is that they can transcend even hard language barriers. An emoji used by an exclusive Japanese-speaker can be understood by an exclusive English-speaker. The range in expressivity for emojis obviously isn’t as great as a full language, but it’s surprisingly large and can grow without cognitive costs to users (unlike acronyms). Unlike acronyms, the audience size is effectively universal. So I 100% disagree with your claim that emojis are redundant.


There are a lot more emoticons than a few simple faces. Even your example ("¯\_(ツ)_/¯") is an emoticon, albeit one from the Asian side of the pond. I've had long-distance relationships over text chat, so I really disagree that they are limited in their expressiveness.

Part of what is off-putting about emoji is that they make text look like early first-language reading materials, where pictures of objects are embedded next to the word you are meant to learn. It's kind of jarring to read, at least for those of us who grew up with that experience.

I also do not share your adoration for things that are super instantly accessible. There is much value in learning, and in struggling along the path to learning, that people think they don't want or need. Linguistic training (like learning to read written words) is a necessary skill for assimilating oneself to a new culture or in-group and language is one of the best methods for practicing that.

Also, are emoji really so universally understood? Are the peach or eggplant really universally understood to stand in for genitalia? Like almost everything, the simplest emoji are accessible, but I do not think everyone in the world is on the same page about the finer points of how to use some symbols.


> I also do not share your adoration for things that are super instantly accessible. There is much value in learning, and in struggling along the path to learning, that people think they don't want or need. Linguistic training (like learning to read written words) is a necessary skill for assimilating oneself to a new culture or in-group and language is one of the best methods for practicing that.

At its core, language is a tool. Its job is to allow people to communicate ideas with other people. You're arguing that there's value in not making a tool easier to use because it promotes learning, but I disagree with that point. Sure, there's value in overcoming challenges to benefit learning, but we shouldn't create artificial challenges (like not using emojis) just for that benefit to learning by doing things in a harder way. It's like saying we should use hammers to put in screws because using a screwdriver makes things too easy. I agree that there's much value in learning, but I think learning can be done in a much more efficient and productive way than by refraining from using emojis.

Also, you have to think of the costs associated with the inefficiencies of human language. One example: look at all the scientific work being done in English. Anyone that doesn't speak English fluently is automatically at a massive disadvantage in the scientific field. These non-English speaking scientists have to spend years simply learning English to contribute their work. That's an opportunity cost. All those years could have been spent on their actual scientific work, and those individuals and society as a whole have to live with that loss. I'm obviously not arguing that emojis fix this problem, but I'm saying that simplifying our language tool in some way could.

> Also, are emoji really so universally understood? Are the peach or eggplant really universally understood to stand in for genitalia? Like almost everything, the simplest forms of emoji are accessible, but I do not think everyone in the world is on the same page about the finer points of how to use some symbols.

That's an interesting point. There are differences in how emojis are interpreted, but this is no different from written or spoken language. Since the beginning of human communication, people have developed slang words and altered the rules of language. Some of these changes have spread and persisted while others died out or remained in use within specific groups of people. While the peach and eggplant emoji may not have the same interpretations across age groups, different cultures can likely still infer their meaning. For example, the see/hear/say-no-evil monkey emojis likely transcend many cultural and language barriers. Knife + scream + shower head emojis likely convey the shower scene from Psycho to anyone who's seen the movie regardless of culture or language.


> Acronyms require prior agreement on what they mean

Emojis do as well. As someone who has autism, I don't recognize or understand facial expressions all the time. I remember being 18 years old and finally understanding what faceroll/roll-eyes meant (thanks to what was then called an emoticon).


I'll elaborate a bit on your post.

AIM/ICQ already had picture-smiley support, converting :) to a smiley. The first smileys on the internet appeared at the start of the 80s (apparently they were used in written form even before that; see [1]). They were used on IRC and e-mail.

Another fun fact from that time (the 80s) is that domain names and TLDs used to be written in CAPITAL LETTERS. And the first spam came from DEC (Digital Equipment Corporation).

Acronyms are very old and widely used, and useful, but that doesn't make them better just because they're older or were more common in the past. Remember that in past centuries reading wasn't for everyone; the same goes for Latin. Acronyms are language-specific, which emoticons/emoji are not. The acronym LOL, one of the first chat-specific acronyms, stems from IRC and is believed to come from Dutch (the Dutch word lol means "fun" or [non-sexual] "pleasure"; the Netherlands was one of the very first countries, if not the first, to connect to the US internet, and likewise had an early IRC presence). If you're a native English speaker you may not give a rat's ass about acronyms being English-centric, but people in the rest of the world often don't even know what the acronyms stand for, or they have local acronyms that you or I wouldn't grasp.

Emoticons and emojis do not suffer from that problem. Case in point, the red 100 emoji is widely popular in the USA (so I heard). People don't use it here in NL. But we understand what an American would mean with it.

I believe picture language (as I call it) plus on-the-fly translation devices (what Google Glass could've been ages ago, but it didn't work out due to public outcry) is going to solve communication in the 21st century. The effect of the Tower of Babel shall be mitigated. Why are my glasses still dumb? All these brands being sold here are ultimately owned by the same big fat multinational. There's a huge opportunity here.

[1] https://en.wikipedia.org/wiki/Smiley


A major part of human communication happens through facial expressions. Emojis/Animojis enable this for digital communication. It's new in the sense that we now have a global set of symbols for these expressions.


A friend asked if he could crash at my place one night and said “don’t worry if not, I got options” followed by the nail-painting emoji.

How do you express that in ASCII emoticons?


It's hard to say how to express that, since I have absolutely no idea what it means :) If I got that message I would be pretty confused. Is the other option literal nail-painting or is it just sarcasm?


The nail-painting one can be used to be a bit ‘sassy’, so it’s like saying “don’t worry, I know other ways to find myself a bed for the night ;-)” but in a more nuanced way that I at least find funnier.

It’s probably not a universal meaning but my point is that emojis can be used to express more than just basic emoticon ‘UNIVERSAL EXPRESSION OF HUMAN HAPPINESS’ style things.


I wouldn't even know how to express it in words.

This may or may not be an example of it (for all I know it's a commonly used one with a well-established metaphoric meaning), but I think emoji usage is strangely idiosyncratic.


"hahah, no worries, if you don't come maybe I'll just paint my nails or something"


You missed the joke.


I did too. Perhaps if emoji are such an ambiguous method of communication, they aren't so good after all?

That isn't an isolated instance of emoji confusion for me, either. Apart from the basic facial expression emoji, for me they're a great way of making a sentence more confusing.


Did I? Maybe emoticons aren't as universal as we're assuming then ;)


My significant other suffers from depression. We are long distance and much of our daily communication is through text-messaging.

We both rely on this one emoji to communicate more effectively: <smiling face with squinting eyes> [0]

We use others, but that one emoji makes all the difference in the world. Often we send only that single sign to each other.

[0] https://emojipedia.org/smiling-face-with-smiling-eyes/

EDIT: Replace emoji with text description and add citation.


Very curious - what do you think makes the image version superior to the text-based smiley?

And as a follow-on, do you think the animated and personalized versions of these emojis make it that much more effective?

For me, the text version communicates the same meaning and context, so it’s fascinating to see examples where the representational medium has a significant impact.


This smiling-face-with-smiling-eyes is one of the few emoji that cause an automatic emotional response inside me. Another is https://emojipedia.org/loudly-crying-face/ (Apple rendition of both, particularly).

Text, and ASCII emoticons, just don't do that.

(and I'm guessing that after I see enough "we can save you money on your car insurance :smiling_face_with_smiling_eyes: manipulative overuse by marketing "humans", emojis won't do it anymore either).

(Although, do you genuinely think a photo of a cat being cute is emotionally the same and just as effective as a description of a cat being cute?)


There’s a whole range of “smiling face” emoji with slightly different smiles and slightly different eye expressions, some blushing, some not. In the context of “I have all these options to choose from”, choosing that one particular option conveys more information than the text-based alternative does.


it's definitely different for everybody. text based ones are limited in range of emotions they can convey.

I never understood people that spent money on digital goods until I found kakaotalk emojis.

for me there is no comparison. I adore their animated emojis and I'm willing to pay for them as do many others.

you can check them out.

https://youtu.be/sKCPUfPaJBI


Sometimes I'd like to think that it prevents misunderstandings. But that's probably very subjective.

I also wouldn't assume that a huge team spent the whole year working on just emojis simply because they feature it heavily in the keynote. There were likely plenty of other teams doing important ground work and internal improvements that are not as easy to showcase in a keynote for a very diverse audience and press.


I suspect some people have either never heard of emoji - could be old or young, not sure - or simply didn't think they would use it before these kinds of features came along. Outside of Apple nerd (and broader HN) social circles, the awareness of this is probably much smaller than we realize.

After all, even with social networks, we still haven't convinced everyone that the "at" username construct is a useful way to get people's attention online.


It amuses me to think we have moved from the printed word back to hieroglyphs (emoji). Not in a derogatory way (communication is communication), but in awe of older cultures. I like the thought experiment of imagining a fallen advanced civilization in Egypt. I imagine them stuck... watching their batteries drop to 0%, and turning to stone and emoji as the only way to persist communication.

What if printed words were really just a transitional state of language until their purest form, emoji!


The Ancient Egyptians also used base 2 multiplication.

https://en.wikipedia.org/wiki/Ancient_Egyptian_multiplicatio...
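For the curious, the linked technique (also known as peasant multiplication) halves one operand while doubling the other, which amounts to multiplying via the binary expansion. A minimal sketch in Python (function name is mine):

```python
def egyptian_multiply(a: int, b: int) -> int:
    """Multiply two non-negative integers the Ancient Egyptian way:
    repeatedly double `a` and halve `b`, summing the doublings that
    correspond to odd values of `b`."""
    total = 0
    while b > 0:
        if b % 2 == 1:   # this doubling contributes to the product
            total += a
        a *= 2           # double
        b //= 2          # halve, discarding the remainder
    return total

print(egyptian_multiply(13, 11))  # → 143
```

This is the same idea as modern binary multiplication in hardware, just written out by hand on papyrus.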


To a certain extent, one could consider Chinese characters to be more like emoji than words constructed via a phonetic alphabet, in which case there never really was a transition period, just some backwater outliers.


:(


But that's an emoticon, not an emoji.


It’s a wireframe emoji before it hits the render pipeline.


Reactmoji


I think emojis are better at conveying emotion than straight text, since emotion is a very visual in-the-moment thing.


Because it uses pictures instead of words.


That idea was dismissed several thousand years ago, and for good reason. Text is way more powerful and precise than pictures. And I'm not saying having both is a bad thing, but this focus on Emojis is just idiotic.


Text accompanied by images has been a staple of modern visual communication for the past several hundred years. Emojis just bring additional ways to express oneself in addition to plain text.

They augment; they do not replace.


When I chat with my Thai girlfriend, many miles away, we usually use the LINE app. We don't directly use emoji's, but we do use stickers [0] as part of our chat. And I feel stickers provide much the same use as emojis, the can more easily be used to convey emotions in a chat that would otherwise be harder to interpret correctly.

---

[0]: https://store.line.me/stickershop/showcase/top/en


Emotional human conversation is imprecise. Tactical use of that is a skill.


> I can't think of another linguistic feature in history

Only slightly related, but this latest migration back to a "sign-like" language (the emojis) reminds me of Giambattista Vico's "Scienza Nuova" (https://en.wikipedia.org/wiki/The_New_Science), where at some point he says that the language spoken by the first humans ("the giants") was a "mute" one, based on "signs", which was correlated with a poetic sense of mind, so to speak.

> Beginning with the first form of authority intuited by the giganti or early humans and transposed in their first "mute" or "sign" language, Vico concludes that “first, or vulgar, wisdom was poetic in nature.” This observation is not an aesthetic one, but rather points to the capacity inherent in all men to imagine meaning via comparison and to reach a communal "conscience" or "prejudice" about their surroundings.

There's of course nothing scientific about Vico's discourse, but his themes somehow stick and resonate more (at least to people like me) compared to the latest linguistic findings.


I have yet to see animoji used by anyone outside of Apple events.


> For most people emojis (and animojis) have opened a whole new way to communicate with each other

Emojis I agree, they are standardized and their meaning is clear by convention. Animojis? That's just salespeak for snapchat-like filters. They're funny but the novelty wears off the same day.


maybe that's still not worth a wwdc talk but more like 'obviously we added support for this emoji thing'


> For most people emojis (and animojis) have opened a whole new way to communicate with each other

Wow that is an extremely generous characterization. At best they're just prettier versions of :) and :( and I don't see how they allow people to convey ideas they couldn't do just as well via text.


They don't really convey ideas, they convey emotions and reactions. It pretty much solves the "tone is hard to convey over text problem."


Except your version of an emoji doesn't look anything at all like my version of what is supposed to be the same emoji.

I don't think that solves the problem at all. In fact, I don't think that solves any problems at all.

If anything, it causes way more problems than it solves.


What you say is a recognized problem that the big companies that have their own emoji fonts work to progressively eliminate.


How did they do that beyond `:)` or `xD` or `-.-` does?


Well, for starters, you don't have to turn your head 90º to resolve them as an image, as you do with some (but not all) of your provided examples. They're also much higher-resolution, so it's easier to pull meaning from an unfamiliar one.


> Well, for starters, you don't have to turn your head 90º to resolve some, but not all of them as an image

Well, no one actually does, so I'm glad we got out ahead of that problem.

> They're also much higher-resolution so it's easier to pull meaning from an unfamiliar one.

Except you don't really need that many. There are a few common emotions that people use... and then there are winky T-Rex emojis that are completely unnecessary.



I feel like you and others are being purposefully obtuse, to some degree.

Can you seriously not distinguish between the tiny selection and low res quality of text faces, and the wide variety of highly specific and detailed set of reactions now available to us? There's only so much you can do with text before you have to be extremely creative (a level of effort excessive for quick casual conversations) or rely on the other party being familiar with your specific vocabulary of text-faces.


>Can you seriously not distinguish between the tiny selection and low res quality of text faces, and the wide variety of highly specific and detailed set of reactions now available to us? There's only so much you can do with text before you have to be extremely creative (a level of effort excessive for quick casual conversations) or rely on the other party being familiar with your specific vocabulary of text-face

Then give us a single example! So many replies _and not a single example of where words or ascii fail to impart what only an emoji can_. You can say "they're obviously better" until you're blue in the face, but it's all hot air until you prove it.


> and not a single example of where words or ascii fail to impart what only an emoji can

Obviously words can (almost certainly) impart what an emoji can - but one small image versus maybe 100 words? That's before you start combining them and the expanded meaning you can get from that.

You might as well say "give me an example of where Proper English fails to import what only slang can" - you're missing the point.


Not the OP, but here’s one. I recently got divorced and back into dating. These days, that means a lot of texting in some form, and I’ve found I have a distinctive style that people who know me well enjoy but that tends to produce a lot of misunderstandings with people that don’t know me that well.

I have the choice of either adjusting my writing style to new people, which I’d rather not, or use either text or picture emoji to convey the tone that makes my writing clearer to people who can’t infer it. I find that image-based emoji are much more specific in the mood they convey, and provide more range — and there is a definite difference in how clearly I come across.


What’s the text version of a birthday cake?

More challenging: what’s the text version of a singing, eye-rolling T-Rex?

It’s fine to not care about emoji, but you can’t logically dismiss them as prettier smilies.


> What’s the text version of a birthday cake?

"Happy birthday!"

> More challenging: what’s the text version of a singing, eye-rolling T-Rex?

Ok, you got me there, because I have no idea what concept is even meant to be communicated by such an absurd thing.


Let me put it another way: what’s the point of these newfangled moving pictures when we already have books?


> Let me put it another way: what’s the point of these newfangled moving pictures when we already have books?

Yeah, figured that would be trotted out at some point. That's a fine sounding argument, but do you really feel emoji's are on the same level as the advent of video? I don't believe you do. At some point you have to take a look at the specific thing you're talking about and get down out of the clouds.

I have yet to hear a reasonable argument as to why emoji's are better. All I see here is "they're different and can be funny." Ok.


Why are you so intent on invalidating other people when they say that emoji help them communicate?


Am I not allowed to disagree with a statement that implies emojis are some groundbreaking form of communication? I never said the concept was not useful; I said that images provide nothing text cannot aside from aesthetics. Why are you people so defensive about this?


Because you have observed that emojis do not help you communicate, and then concluded that emojis cannot possibly help anyone communicate. There are lots of people in this thread who have mentioned concrete examples of emojis "providing something text cannot," and yet you refuse to accept it.

"This helps me communicate" is not a falsifiable claim. You're telling lots of people that they have somehow made a mistake in interpreting their own life experiences. You are not even considering the possibility that something is there, and you just can't see it.


They are an improvement to an existing form of communication. Aesthetics are also a form of communication. Emojis can be used as part of a sentence and there is no way to communicate exactly the same thing without using them.

There has never been a way to put images in a sentence as easy and expressive as emoji (all there used to be was fonts like Wingdings), and it’s standardized. That is quite revolutionary.


That's a better argument, but it still doesn't really explain what qualities make emoji a better medium of expression, or why we need 2000 of them.


The fact that they’re wildly popular should provide some indication that the qualities exist.

Communication is rife with ambiguities, emotion, shortcuts, and mistakes. And between people who share friendship or more personal relationships, those “flaws” are often features, not bugs.

The concept of emoji, I feel, embraces those flaws.

(And personally speaking on the subject of emoji vs common text shorthand, if I never see “lol” again it’ll be too soon.)


Many popular things are not quality things.

> (And personally speaking on the subject of emoji vs common text shorthand, if I never see “lol” again it’ll be too soon.)

Funny, I feel the same way about emoji. I dunno, maybe I'm too autistic to get it, but when people use emoji it makes me feel like I'm talking to a child who hasn't learned express themselves like an adult yet.


> The fact that they’re wildly popular should provide some indication that the qualities exist.

Really?

Pet rocks were wildly popular. Unhealthy foods are wildly popular. Cocaine is wildly popular (well maybe that's a stretch).

I think there's a correlation problem here. However, I think you're missing the point again; _what can I convey via an emoji that I cannot convey in ascii_? I have yet to see a single example, and that's what started the entire debate.


Pet rocks were popular for 5 minutes. Unhealthy foods are perpetually popular because they have the quality of tasting wonderful.

And neither of you tried to address the core argument I made in the parent, that emoji reflect the inherent messiness of personal communications and for that matter personal relationships.

The same reason it’s important (but inefficient) to tell someone you love them in nonverbal ways is the reason emoji are popular. We all appreciate communications that extend beyond the written word. Emoji is just another option among many for achieving that.


>And neither of you tried to address the core argument I made in the parent, that emoji reflect the inherent messiness of personal communications and for that matter personal relationships

And exactly zero people, including yourself, have been able to provide a single example where text fails to convey what an icon can. And, you know, that was the entire subject of this discussion, if you haven't noticed.


> I have no idea what concept is even meant to be communicated by such an absurd thing

Depends on context. If we were discussing someone, it might signal criticism or a desire to party. The fact that it cannot compress losslessly into words is the whole point.


> it might signal criticism or a desire to party

I'd like to see examples of both of these. Specifically how the T-Rex plays a role because, if you take the T-Rex out, we're back to something I can easily convey in ascii.


> if you take the T-Rex out, we're back to something I can easily convey in ascii

May I ask if you read fiction in more than one language? There are constructions even in those close to English which I find impossible to accurately translate in a way that preserves the delight of the interaction between their phrasing and underlying meaning.

For T-Rex, two examples:

"I drank too much at the Christmas party.

Not as much as Bob. He puked in the restaurant sink before appetizers were served.

[Dancing eye-rolling T-Rex]"

--or--

"Let's go.

Where?

BarBar.

BarBar?

Happy hour pricing until midnight.

[Dancing eyes-rolled-back T-Rex]"

In the former, the emoji communicates derision. In the latter, playfulness. Depending on the style of animation and context, the emoji could further communicate cuteness versus tactile incompetence, letting go versus a loss of control, subject versus object.

The process of decoding an emoji is analogous to a simplified form of interpreting art. Why is that there? Am I supposed to interpret it using the positive or negative connotation? In some cases, less ambiguity is desired. But in others, the ambiguity itself carries information of a sort impossible to parse into words.


>"I drank too much at the Christmas party.

>Not as much as Bob. He puked in the restaurant sink before appetizers were served.

>[Dancing eye-rolling T-Rex]"

Ummmm... Points for trying I guess? You lost me at "person who drank too much == T-Rex"


If it is contextual and subjective, then we could just use any word or phrase in the same manner to the same effect. Written language itself is just contextual line patterns.


How many languages do you speak well?

Very likely fewer than the number of different native-language speakers who understand the vast majority of your emoticons/emojis.

I suppose you don't care because you just speak your native language with other native speakers. But a universal language on top of that has huge benefits in international circles. And my 3-month-old understands the :) smile. One of the very first abilities a newborn learns is recognizing faces. That's at an age when they cannot even see a meter away!


So let me get this straight: emoji is both highly contextual and simultaneously universal?

I find that difficult to believe. On the other hand, given enough time and global interaction in that medium, it could develop a stable enough meaning across a large enough conceptual space to have a situation no worse than exists between any "standard" language and its various dialects. That'd be interesting, but I'm not holding my breath.


> So let me get this straight: emoji is both highly contextual and simultaneously universal?

Well, not always universal. Some are generally well understood. They're easy to learn (if you're interested in learning many languages, I understood it's best to start with an Asian language such as Japanese/Korean/Chinese), and on top of that they even make learning languages easier (see Memrise and Duolingo, which use SVG art to teach languages; they use the same SVG art across different languages!). Even at school, when children learn their first words (which in Dutch are: boom/roos/vis/vuur; in English: tree/rose/fish/fire), this is done via pictures!

Emoticons and emoji are contextual, yes.

If I say:

That's fun ;) :)

That has a different meaning than:

That is fun :)

or

That is fun ;)

Different context, yet a wink or smile is universal.

And if I'd write:

Dat is leuk :)

You wouldn't understand it because you don't speak Dutch. But you would understand the smiley. Without using any translator. The emoticon & emoji always describes the text around it, like an adjective (though it could also describe other smileys). As such, it is descriptive.

True, sometimes the meaning of an emoticon (and especially an emoji) must be explained. Once it is explained, it can be used in combination with any language. For example, the kappa emoji [1], which originates from Twitch, can be used on an English stream, but also on a Spanish or Japanese one. It's generally understood within the gamer community, but if you started using it within your local hockey club they'd first need to learn its meaning.

[1] https://www.twitchemotes.com/emotes/25


> If I say:

> That's fun ;) :)

> That has a different meaning than:

> That is fun :)

> or

> That is fun ;)

> Different context, yet a wink or smile is universal.

Really? Because I can't see any real difference between any of those examples. Does the wink mean you're being sarcastic? Or that you're coming on to me? What purpose does the smiley serve? you already said it was fun, one could presume that would leave you in a positive emotional state.

> the kappa emoji [1] which originates from Twitch can be used on an English stream

I hate those stupid things so much, probably because I have no context for understanding their meaning. And since it's already an English stream, you could just use words! And if you're not speaking the same language as the rest of the stream, you can't express anything meaningful enough to be worth saying anyway.


> Really? Because I can't see any real difference between any of those examples. Does the wink mean you're being sarcastic? Or that you're coming on to me? What purpose does the smiley serve? you already said it was fun, one could presume that would leave you in a positive emotional state.

That'd depend on the rest of the text. It could mean I am making a joke ("not serious" / "just kidding"). It could mean I'm sarcastic. It could mean that I'm trying to hit on you. I think that sums it up (though I'm open for different explanations).

Thing is, back in the days, even in native languages between native speakers (but more so with one or more non-native) sarcasm and jokes weren't always easy to detect. The wink smiley specifically filled that niche! If you don't know about the story behind it, you might find it interesting to look it up.

As for the difference between these, "That is fun :)" denotes no sarcasm, but warmth. Possibly still humor, but its a genuine statement. "That is fun ;)" was covered earlier above and "That is fun ;) :)" is a mixed bag which could go either way (possibly clever to "talk your way out of the meaning" e.g. when trying to flirt but its not well received, or to create some -albeit simple- mysticism around your flirt). That's without knowing the context. The context still matters and is, ultimately, decisive for the meaning.

I have autism, btw, so although I find this fascinating it is rather difficult for me to understand. It took me serious effort to learn the meaning of the different emoticons/emoji (as far as one can know them, since there's so many in unicode these days).

> I hate those stupid things so much, probably because I have no context for understanding their meaning and, since its already an english stream, you could just use words! And if you're not speaking the same language as the rest of the stream, you can't express anything meaningful enough to be worth saying anyway.

(I don't like it either, but that's because it is overused in these circles, and it reminds me of my age, i.e. that I'm not young anymore.)

The ability to understand a language isn't binary. (See e.g. the example of the wink where language is not being understood!)

Another example from my own experience: I understand some Spanish, some French, some Portuguese, and some German, but I do not want to learn any more French or Portuguese, and my German is better than my Spanish but I'm very curious to learn more Spanish. My English is pretty good, as is my Dutch, but I'm only interested in learning more English and Spanish; Dutch not so much. YMMV obviously.


> we could just use any word or phrase in the same manner to the same effect. Written language itself is just contextual line patterns

No, we can't. There is an inherent visual component to emojis. A picture worth a thousand words, et cetera.

It's not an abstract idea mapping to an arbitrary icon; without prior explanation, many emojis make sense (within a certain cultural context). Kind of like how we can't replace the essence of giving a friend a gift or a lover a flower with words or an arbitrary icon. Apple understands this in a way few technology companies do.


> It's not an abstract idea mapping to an arbitrary icon; without prior explanation, many emojis make sense (within a certain cultural context).

What a coincidence, the exact same thing is true about written words.

>Kind of like how we can't replace the essence of giving a friend a gift or a lover a flower with words or an arbitrary icon. Apple understands this in a way few technology companies do.

I feel like you're one of those people who would have been way into flaming guitar gifs and midi on your geocities page in the 90s. I mean, seriously? You're literally saying that sending a gif conveys so much more meaning that it is similar to giving a gift or a flower, compared to sending a text.

Maybe you're right: https://tinyurl.com/y7xeu7dc


> What’s the text version of a birthday cake?

Is that a serious question?

> More challenging: what’s the text version of a singing, eye-rolling T-Rex?

Who cares because that's dumb? Can you tell me what deep emotional state is being conveyed by a T-rex rolling its eyes? I think you lost track of the premise we're debating.


> At best they're just prettier versions of :) and :(

Do you really think this?

I don't really 'get' emojis but I think you're woefully underestimating their impact on communication and language. The emoji library on a normal iPhone is enormous.


>Do you really think this?

...yes?

>you're woefully underestimating their impact on communication and language. The emoji library on a normal iPhone is enormous

What does one have to do with the other? Yes, there are a lot of dumb icons to chose from. How does that directly lead to "[having a] large impact on communication and language"? If that's true, do you think it's a _positive_ impact?


Have you ever encountered difficulty conveying or understanding conversational tone over the internet? No? You are lying or lack self-awareness. Voice and body language are important for disambiguating sentences with more than one possible meaning or implication. Emojis approximate the role of voice tone and body language in digital text-based communication.


The discussion is not "are emojis in any form useful?", it's "do icons provide a new and before unrealized form of communication". Literally every person here missed the statement in the first comment.


If you haven't seen how emojis are used in the wild to enhance communication, it's probably not for you.


You didn't read. I said that you can convey any of these equally well in ascii. Yes, it's handy to be able to plug an :) at the end of a sentence which may otherwise sound rude/overly direct. That doesn't mean I need 1000 icons, and _that's what we're talking about_.


I read the whole thing.

The emojis serve a purpose. Text doesn't serve that purpose. I don't know how to describe the niche they fill with text. They're not a stand-in for emoticons.

You don't get it. It's okay. Not everything is for you.


>I don't know how to describe the niche they fill with text. They're not a stand-in for emoticons.

So you can't explain it, but it's I who "doesn't get it". Ok then. I'll excuse you for a bit as it's going to take some time to untwist your brain from that logical contortion.


> I don't know how to describe the niche they fill with text.

Consider the possibility that this is because there isn't one.


HN doesn't support emoji, so here's an example: https://cybre.space/@Riley/100149464360818801

The meaning is immediately clear to anyone familiar with the reference. It's basically a pictographic language that leans heavily on a shared culture that's largely internet-based.


Ah, I get it: its value is making the people who use it feel special, because only "the right kind of people" will get their jokes. Just a new generation of children using slang. Why was that so difficult to describe with words?


>> "Ah, I get it, it's value is making the people use it feel special because only "the right kind of people" will get their jokes. Just a new generation of children using slang."

You sure do have some text there.

>> "Why was that so difficult to use words to describe?"

The medium is the message.


Oh, so inside jokes. Yeah that's totally new and groundbreaking.


Do you similarly disdain things like Cockney Rhyming Slang and Polari because they're effectively "inside jokes" that express things you could equally well express with "plain words"?


I assume you don't know any people born in the last 15 years or so. Emoji communication (using just emojis) is quite common, often several in a row.


I feel like I'm in the Twilight Zone (and no, I don't hang out with a lot of 12-year-olds, but I do in fact know that kids like to pepper nearly everything they write with dumb icons).

The debate is not "are emojis widely used". Yes, of course they are. The question is "opened a whole new way to communicate with each other", which is what I responded to.

I say, no, they haven't. I can convey the same emotions with ascii. I can convey the same emotions with written text. If you want to prove me wrong then fine, but don't re-frame the discussion.


The most common emoji I see among kids is (Face With Tears of Joy) or (OK Hand) (apparently ycombinator doesn't support UTF-8). I don't know of any ascii that can do those. (FYI, I'm 28 and I often chat with my younger cousins who are in their mid-teens at the moment.)


Yeah I agree. Why would anyone prefer something that looked nicer than an old thing? Has anything like this phenomenon ever been observed in the past?


Ok well here's what I actually said. It's right up there if you want to take another look.

> At best they're just prettier versions of :) and :( and I don't see how they allow people to convey ideas they couldn't do just as well via text.

Never did I say "These are dumb get off my lawn!" I said they don't meaningfully impact or improve communication, which is in direct response to the person I replied to who said that they have "opened a whole new way to communicate."

That's a serious claim, I'd like to see a single coherent argument to show it's actually the case. But, no, all I get are mischaracterizations of what I said.

The debate is not "are pretty things nice", it's "have emoji's fundamentally improved communication".


> The debate is not "are pretty things nice"

If the debate is not about how something looks then why was your first instinct to dismiss them as “just prettier versions of :) and :(“?


>If the debate is not about how something looks then why was your first instinct to dismiss them as “just prettier versions of :) and :(“?

...because I was responding to someone who said emojis were a new and previously unrealized form of communication! Seriously guys...

You know, you're right; I find myself reaching for a "smashes head against brick wall" emoji right about now.

Oh, damn; I just described that emotion with words. I guess I'm still right.


“I had fun once. It was awful.” — one of Kate Beaton’s characters.


I think the new Facetime features are gamechangers in terms of driving usage. All this stuff was already available in Snapchat etc. but Snap's problem is that the impressive AR stuff they were doing 3 years ago is now built into the OS.

Allowing people to call each other as avatars complete with facial expressions or with flattering filters applied gets rid of one of the last remaining key barriers to mass video calling adoption: people tend to look like hell in low light on front-facing cameras. Teenagers will upgrade to Face ID just to get these features, grandkids will love calling their grandparents as cartoon tigers and grandparents will love responding as cartoon dinosaurs.

Also nice to see the fruits of the Workflow acquisition, this will allow people to do all sorts of customisations including using slang and profanity to trigger commands. "Hey Siri, order my favourite fucking pizza".


Neal Stephenson may have not gotten the "Metaverse" prediction quite right, but he seems to have nailed "mediaglyphs".


It's amazing that so many bright minds are wasting time on *oji.

At least it’s merely wasteful and not actually toxic. They could be working on more adtech.


It's fun to develop it - computer vision, facial features detection, 3D mesh deformation - I programmed all those parts for another unrelated app and it was total fun. Not that it was useful or something, but I enjoyed it a lot.


I tolerate emoji (though don't use them myself) but I always get upset at the acceptance rate of new emoji compared to the historical acceptance rate of actual characters that are used in actual written languages by the unicode stewards.


It’s amazing that even more bright minds are wasting time on ads.


There is an eye-opening, amazing discussion about emojis going on here. But, like, really, aren't we just all accepting of the nuances of a cartoon-based language? I'm not saying this is a bad thing, but that is the reality. We're sending Forrest Gump-like 'shit happens' cartoons to one another.


My business partner and I both have Surface-related products. His #1 complaint about Windows 10 is the emoji support compared to macOS.


Emoji are amazing, stop it.


I'd far rather they do that than try to get people to click on ads.


Still beats them wasting their time trying to manipulate minds with either ads or freemium games.


Your glob matches moji, which just means "letters". So many bright minds wasted at colleges of Letters and Science, eh?


I deeply suspect that the lack of features is because the releases this year internally centered around responding to the negative buzz associated with iOS 11 problems. I remember when emojis were showcased at WWDC 2016, a fairly low-feature release. Whenever that happens the true feature is unannounced: it's simply stability.


"Whenever that happens the true feature is unannounced: it's simply stability."

Can't remember which version it was now, but I remember when Apple heavily marketed a Mac OS X version dedicated almost exclusively to bug fixes and performance enhancements.

It was very well received. I'm sure the same would happen if they announced they were dedicating an iOS version to bug fixes and performance enhancements.

EDIT: I think the version may have been Snow Leopard. :)


10.6 Snow Leopard.


Worth noting that only iOS got the "double down" on performance. Probably because the userbase is larger and this idea had crossed into the mainstream news as a thing people were complaining about.

They could get into this during the developer SOTU, so there's hope...


Exactly. But now, in contrast, they're silent on the issue of stability. I'll take them at their word: if they say they've only done X, Y, and Z, then I'll accept that this is all they've done.


That would be Snow Leopard (10.6).


Endangered Snow Kitten, er, I mean Snow Leopard.


Snow Leopard


Personal experience, emojis largely enrich my chats with friends and family. That’s one of the reasons I prefer WeChat over WhatsApp. In fact, WeChat even created platform to invite artists to create more emojis for the app.


I guess animated icons trump concerns about privacy and monitoring by a surveillance state [1]

1: https://www.theverge.com/2018/4/30/17302720/wechat-deleted-m...


How does this article support the idea that WeChat raises concerns about monitoring by a surveillance state?

The Chinese authorities in the article didn't "monitor" anything. They retrieved the deleted message from the phone itself. For all we know, it could be as innocuous as running a SQLite DELETE to delete the message and the scheduled vacuum hasn't run yet. I don't see any convincing evidence of either actual monitoring or Tencent conspiring to allow such monitoring.


I wonder if they'll ever fix `localStorage` in private Safari windows. It breaks countless sites and Safari is the only browser to block it.


Any site that is broken by Private Browsing causing storage quota exceptions in localStorage is going to be broken by quota exceptions under "normal" operation.

It's in the spec, maybe developers should make their websites a bit more resilient to the very real errors that can happen? Did they just forget about error handling?
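
To make that concrete, here is a minimal sketch of the defensive pattern (the helper name is illustrative, not from any spec or library):

```javascript
// Minimal defensive write to localStorage: setItem() may throw a quota
// error (Safari Private Browsing, full disks, quota limits), so treat
// every write as fallible and degrade gracefully instead of crashing.
function trySetItem(storage, key, value) {
  try {
    storage.setItem(key, value);
    return true;
  } catch (e) {
    // Browsers name the quota error inconsistently, so match loosely.
    if (e && typeof e.name === "string" && e.name.toLowerCase().includes("quota")) {
      return false; // out of space: caller falls back to in-memory state
    }
    return false; // any other storage failure: also degrade, don't crash
  }
}
```

A site written this way behaves the same whether the quota is hit in Private Browsing or in normal operation.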


spec? localStorage errors are not standardized; every browser gives the quota-exceeded error a different name and number, so you have to use something like

  e.name.toLowerCase().includes("quota")


But localStorage.setItem() throwing an error definitely is in the spec, so if someone isn't handling Safari's localStorage behaviour in Private Browsing Mode (that is, throwing exceptions) then they aren't handling the parts of the spec.


The web isn't built on specs, it's built on conventions. Safari breaks the convention.


the convention is that localStorage throws exceptions. it does in every browser. if a site breaks because it doesn't do error handling, then that's the site's problem.


You and many of the commenters miss an important point: Any apparent behaviour difference between private and non-private mode should be considered a bug, because sites can and do abuse it to detect private mode.


I disagree. You might be able to use it to _guess_ but all browsers can throw a quota exceeded exception under normal behaviour. I see this every now and then on mobile devices where the phone is so full the browser refuses to store more localStorage items.


This is exactly what I want to prevent though - I want to prevent dodgy sites from using the quota exceeded exceptions from localStorage to (100% accurately in my experience) guess that I'm browsing in private mode.
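
The trick being objected to is roughly this (a sketch; the probe key is made up):

```javascript
// Sketch of the private-mode detection trick: in Safari's Private
// Browsing, localStorage has a zero quota, so ANY write throws. A site
// can probe with a throwaway key and interpret the exception.
function looksLikePrivateMode(storage) {
  try {
    storage.setItem("__probe__", "1");
    storage.removeItem("__probe__");
    return false; // write succeeded: storage is usable
  } catch (e) {
    return true; // write threw: private mode, or genuinely out of quota
  }
}
```

As noted above, the same exception can fire when storage is genuinely full, which is why this is only a guess rather than a reliable signal.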


I too wish they provided an ephemeral localStorage in private browsing.

But in any case, your application should be interacting with localStorage through an in-memory facade, otherwise it's still going to break with lots of other edge-cases. All operations should be treated as volatile, and probably silently suppressed on failures.

Using a facade also makes testing easier, and if you avoid global side-effects it becomes easier to parallelize test-cases.
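
A minimal sketch of such a facade (illustrative, not from any particular library):

```javascript
// In-memory facade over localStorage: reads and writes go to a plain
// object, and writes are mirrored to the backing store on a best-effort
// basis. If the backing store is unavailable or throws (e.g. Private
// Browsing quota errors), the app keeps working with session-only data.
function createStorageFacade(backing) {
  const cache = {};
  // Seed the cache from the backing store if it is readable.
  try {
    for (let i = 0; i < backing.length; i++) {
      const key = backing.key(i);
      cache[key] = backing.getItem(key);
    }
  } catch (e) { /* backing store unreadable: start empty */ }

  return {
    getItem(key) {
      return key in cache ? cache[key] : null;
    },
    setItem(key, value) {
      cache[key] = String(value);
      try { backing.setItem(key, String(value)); } catch (e) { /* volatile write */ }
    },
    removeItem(key) {
      delete cache[key];
      try { backing.removeItem(key); } catch (e) { /* ignore */ }
    },
  };
}
```

In tests, `backing` can be any object implementing the Storage interface, which is what avoids the global side effects mentioned above.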


I love Safari, it's my main browser - but there are so many small annoying things that take ages to fix it's just sad.


Chrome with certain privacy-conscious settings disallows access to localStorage as well. It broke something we built too.


There's also a security policy that affects IE11, accessing localStorage ends up giving an "Access is Denied" error...


Bitmoji and Animoji were already popular, so it makes sense for Apple to combine the two (and sherlock Bitmoji).


> and sherlock Bitmoji

Considering Animoji and Memoji are only in Apple's Messaging app/iMessage, I think Bitmoji will be fine as long as Snapchat is fine (which, well, isn't certain).


It's so disappointing, with all the resources and talent at their disposal, that the pace and direction of macOS, as a vehicle to truly advance a desktop system, is so lackluster, "un-courageous", and, as you put it, boring.


I'm just curious, what "courageous" changes would you like to see in macOS?

Last time I saw a courageous desktop OS change, it was Windows merging their Mobile and desktop OS, and that was a hard fail.

It's a workstation. I prefer reliability and consistency. I think Apple knows they need to keep the general public buying macbooks with stupid superficial features, while maintaining consistent, reliable, functionality for the power users out there before they get annoyed and migrate to Linux machines. Hopefully they address the hardware reliability issues (ie. keyboards) in their next hardware release.


>> I'm just curious, what "courageous" changes would you like to see in macOS?

Modern OpenGL? App store with useful apps?


Vulkan really along with OpenGL latest. The whole Metal thing was wrong headed - both iOS and macOS should have supported Vulkan.

But that's nothing that affects me personally. What bugs me about macOS is how sloppy, buggy and limited it has become. Finder sucks big time. SMB doesn't work all too reliably. Wanna domain join - tough luck. (Even Linux distros are advanced in that area - Gnome 3 on Fedora allowed me to setup Enterprise Login and just entering ID / Password for my AD account during initial setup and it all worked flawlessly!). The OS updates are atrociously slow - at least they are infrequent but just goes to show how much attention they're really paying.

That's just from memory - I haven't used it for last year.


Finder sucks? At what, specifically? And what about macOS has become limited as an OS?


My favorite part of this all is that the last generation's retina MacBooks have: 1) larger batteries, 2) better trackpads, 3) MagSafe vs. the crappy USB-C power connector, 4) horrific keyboards with no travel, bad feel, and no space between keys, and 5) on Macs with that stupid Touch Bar, no physical Esc key.

not only is it disappointing and boring but things are starting to go backwards.


You are surprised at that?

When a site requests location permission, I only get the option to deny the request and have Safari remember that decision for a "day", not "forever", which is what I would like Safari to do, esp. for sites like Google.

Now I have looked at forums and didn't find anything that helped. I contacted Apple support (via both call and chat) and on both occasions I was told that I need to reinstall the OS, which was frustrating, but heck, I did just that with the last major update: backed up my data and did a fresh install. I still face the issue. When I called Apple Support again, I was asked to do the same thing, since Safari alone cannot be reinstalled.


chill dude there's 4 more days left.


Great! Go and found a company that makes 10 bn USD a quarter in profit.


Well that's a pretty inane reaction. Next time I see a bad movie should I just keep my mouth shut and make my own movie?


You certainly shouldn't get indignant about it. It's one thing to express a dissenting opinion, but throwing a mini temper tantrum on the Internet is a bit much. Not saying that's what you're up to, but I'm not as sure about OP.


A more apt comparison would read: you watch a mediocre preview of a new movie from a company that has made the most successful movies ever and continues to do so. They're doing something right. Instead of complaining, maybe, just maybe, we should consider whether we're the ones misjudging reality.


>from a company that has made the most successful movies ever and continuous to do so. They're doing something right. Instead of complaining, maybe, just maybe, we should consider whether we're the ones misjudging reality.

That's a rather silly comparison. I guess the same logic doesn't apply to Electronic Arts/Oracle/Comcast/<insert hated company on HN>. People seem to be giving them money, and yet people also simultaneously hate them. I think we can and should criticize things we don't like, but only if we're honest with our reasons.


this seems like a fallacious appeal of one variety or another.


Wow, they are DEPRECATING OpenGL from both macOS and iOS: https://developer.apple.com/macos/whats-new/

This is in addition to last year's announcement that "macOS High Sierra is the last version of macOS to run 32bit apps without compromise"

I wonder if we will soon see a new lineage of Macbooks fitted with Apple-specific arm64 chips.

The most scary thought is if UIKit-on-macOS starts requiring Developer ID entitlements and need to be installed via the app store, with fairplay DRM encryption of binaries and everything.

Edit: Also, r.i.p. my old macbook air 2011 :-/


And OpenCL too. This is terrible. I was thinking about adding GPU support to a numerical simulator I am working on, and I was planning to have nice cross-platform support with OpenCL. Whelp, that's no longer the case. My code is in C++, and I refuse to use proprietary, vendor-locked, Objective-C-only Metal. If people want GPU support, they'll just have to use Linux, which doesn't artificially constrain you with corporate frameworks.


The deprecation of OpenCL is especially crappy given how heavily Apple were pushing OpenCL back in 2014 or so with the Mac Pros.


Apple lost interest in OpenCL after learning Khronos wasn't steering it into the direction they wanted it to go.

For example, Metal compute shaders are C++14 and Khronos only adopted C++ in OpenCL after the beating they took from CUDA, which supported it since the beginning.


>My code is in C++, and I refuse to use proprietary, vendor-locked, Objective-C-only Metal. If people want GPU support, they'll just have to use Linux, which doesn't artificially constrain you with corporate frameworks.

Well, some people refuse to use non-platform-native lowest-common-denominator libs, so there's that too...


One possibility for cross-platform GPU work is WebGPU. The people working on it seem to be planning a C- or C++-level standalone library; behind the scenes it will use DirectX/Metal/Vulkan.

Apple/Google/Microsoft/Mozilla and others are all participating

https://lists.w3.org/Archives/Public/public-gpu/


Does this imply Apple is planning to continue support for OpenGL informally/externally, the way that X Window support is through xQuartz, perhaps via an external Mesa-based library?

And maybe the same informal/external support model for OpenCL?

Performance will be diminished but not extinguished.


FWIW, they've only deprecated it, meaning no new updates. They haven't removed it from the platform. Furthermore, desktop OpenGL seems like it's dead anyway, given that Vulkan has replaced it.


Aren't most people running those kinds of workloads doing so on Linux already?

Doesn't seem cost-effective at scale to run on beefy Apple machines.


People just getting into a field like to run code on their personal machines. This can be quite relevant when your code gets a 50X speedup from running on GPU.

This is sort of like saying "people only do web serving workloads on Linux, we don't need web servers to run on Apple machines" to me.


Not necessarily. Many media editing apps use OpenCL to speed up processing. I know Capture One uses OpenCL, and I think Adobe's Lightroom and Photoshop use it also. At this point even Pixelmator and less well known alternatives use OpenCL, too.

Sadly, most companies won't have any choice but to port their app to Apple's proprietary APIs. It's really a net loss for consumers because most of these devs have better things to spend their time on than Apple breaking compatibility on a whim.


"At this point even Pixelmator and less well known alternatives use OpenCL, too."

Pixelmator, at least, is based on Core Image, which Apple has probably already moved from OpenCL to Metal.


I mean, if you're already running decent code or already have extremely good tooling where small examples can be easily sent to other machines. Additionally, debugging code running on other machines is also a huge pain in the ass vs. being able to step through it directly, locally.

Almost all of my (and my lab's) time is spent tinkering with small numerical examples before sending it off to one of the lab machines to run overnight, and using the GPU on the MBP through OpenCL is a huge advantage.


People used to buy expensive eGPUs for those workloads to keep them on macOS.


Look into either an existing engine to abstract it away or Vulkan.


This is just quibbling, but 1) you can write Metal code with Swift as well as Objective-C (and that's not even vendor-locked; I'm doing Swift in Linux right now), and 2) you can write C++ in Objective-C.

I know this isn't what you're really complaining about, though.


> you can write C++ in Objective-C

You mean Objective-C++.


The most common pattern I've seen with cross-platform stuff is a small Objective-C wrapper over MacOS APIs, which then gets called like C functions from pure C++ code.


Somehow this is directly related to the failure of desktop Linux.

The only successful variants of it, actually do constrain devs with either Web or Java corporate frameworks, with a very tiny subset of native code allowed to take part into the whole game.


Yes, because GPU computing is mac/windows primarily, and in these cases, primarily for games?


I would be a lot more okay with this if Apple supported Vulkan, the more portable comparable API, rather than just the macOS/iOS-only Metal.

I also wonder what means for WebGL and its future. Right now, WebGL works in browsers on macOS, Linux, Windows, iOS, Android, which is incredible. There is no equivalent.

Sure, Apple has started working on WebGPU, but that’s not yet ready nor is it guaranteed to gain Linux, Windows, Android support.


Agreed it would be nice if there were official Apple support, but FYI there's a project called MoltenVK that implements the Vulkan API on top of Metal: https://arstechnica.com/gadgets/2018/02/vulkan-is-coming-to-...


Apparently Valve shipped a DOTA 2 update a few days ago that is using it:

https://twitter.com/Plagman2/status/1002324195135520768

https://www.phoronix.com/scan.php?page=news_item&px=Dota-2-I...

So that's pretty promising!


My guess is browsers will just implement WebGL on top of Metal and this will be transparent to web developers.


There's precedent: Chrome and Firefox both implement WebGL on Windows with ANGLE, which translates OpenGL to Direct3D.

https://en.wikipedia.org/wiki/ANGLE_(software)


And it seems they are working on a more generic library : https://github.com/google/nxt-standalone


Ah, and NXT has a Metal backend! Very cool.

It's important to note that NXT isn't necessarily a replacement for ANGLE. It's an experimental replacement for WebGL as a whole, with a different API. There still needs to be a way to run WebGL programs on Mac if this deprecation leads to removal a few versions down the line.


And not strictly a replacement either, WebGL would not go away. The successor to WebGL is still so far away that there will probably be some versions after WebGL 2.0.



WebGL is a browser level API that does not require native driver support like OpenGL. They're related only in semantics.


Mentioning this in a couple different places, but consider Molten (https://moltengl.com/) for this use case; it's a third party but very high quality reimplementation of Vulkan and OpenGL ES atop Metal. In other words, an easy way to adopt metal without actually porting your app.


> DEPRECATING OpenGL

Wow, does this mean Maya, Houdini, and basically every 3D package out there will no longer run on macOS? If so that seriously sucks for 3D pros.


I'm guessing it's not the end of the world for Autodesk to add a metal backend to maya. Some smaller teams might very well choose to let go of mac though.


If they weren't already doing that just for the performance improvement itself.


Apple will easily ease the transition. That would be way too stupid to kill partners that way.


No, because contrary to HN beliefs about 3D APIs, in the real world most business already added Metal backends to their rendering engines.


Exactly. Not sure why everyone thinks their favorite Mac apps are using OpenGL anyway. They probably moved to Metal a long time ago — it is much better.


I'm sure everyone's favorite /exclusive/ Mac apps are probably using Metal.

I'm guessing you haven't used apps like Maya, Nuke, or Houdini. They were all written in the mid-90s on IRIX machines and later ported to Linux, Windows, and OS X. Surprisingly, 3D performance isn't always a big goal of theirs. My guess is the core features don't sell new versions, so even though they have annual releases those things don't get much attention. They'll have drawing issues and transparency-sorting problems for years. Same with audio bugs.

Their Mac support was spotty and irregular until the past 5-10 years.



OpenGL has just been deprecated. It hasn’t been removed.


It will be in a year.


Won't there just be OpenGL drivers as a separate download? Is this any different than when they removed Java?


Does anyone know how this will affect Blender in the short term?


When Metal was introduced for the Mac they had the Modo devs at WWDC and they ported their code to Metal in like a week or two. Not really the apocalypse.


Wat.

Metal and OpenGL are two completely different APIs, with different shading languages and probably a whole host of other differences.

I've ported my fair share of things from fixed-function to programmable shader pipelines and you'd be effing naive if you think you can do that in a couple weeks on anything more than a toy demo.


I worked with a AAA gamedev recently who had written a Vulkan renderer for their game to demo quality level in 2 weeks. It all depends on existing level of abstraction for the rendering API and shading language (and to some extent assets), and how much performance and efficiency you’re happy leaving on the table.


> demo quality level in 2 weeks

Getting pixels on the screen and shipping something to end users are two very, very different things, 90/10 rule and all that.

Vulkan also has the benefit of multiple platforms supporting it, so you're not doing all that work for a minority platform (which is what OSX is in the graphics space).


If you already had an architecture with replaceable renderers, especially with DirectX 12 one already written, adding Vulkan one will be a matter of just a few weeks. If you hadn't, it will be much tougher.


Well Modo isn’t a toy.


The question is, how many devs did they have working those two weeks (and what resources did Apple provide to help them)?

I develop an OpenGL-based video engine for a live media playback application, which is very nearly as simple an application of OpenGL as you can expect to find in the real world, and there's no way I could expect to port it to Metal in a week or two singlehandedly. Like others have mentioned, it's a completely different paradigm, not just a matter of changing around some function calls.

That said, I welcome this change with open arms (and secure in the knowledge that legacy code will continue to work for the foreseeable future). OpenGL is a fragmented, brittle, spaghetti-inducing pile of global state. Rewriting in Metal isn't anywhere near as small a project as Apple claims, but I'm perversely looking forward to it -- I'll be very happy to have OpenGL in my rearview mirror.


Nice marketing campaign here a couple years ago. https://developer.apple.com/opencl/

Make up your fucking minds. Developers aren't hamsters.

I now declare MacOS to be deprecated, if that's how it's going to be.


Oh this is hilarious.

You could put money on how long this page gets forgotten and left up.


Aw, crap. MacOS is deprecated!? How could you!?


Yup, you will see a deprecation notice when booting MacOS now.


Good riddance. OpenGL has been an awful API for many years now. The drivers are way too complicated, and applications don’t have enough control to deliver smooth performance. All OpenGL does now is let you kind of mediocrely put things on the screen.


It would be good riddance IF there were another universal API (Vulkan) and they adopted it as a substitute.

The fact that they instead want to force game developers to use Metal is... ridiculous, especially considering the extremely low macOS marketshare, particularly outside the US.


Sometimes at work I daydream about this alternate reality where we let go of the idea of a universal API on top of GPUs and just let vendors publish some ISA and hardware-specific libs/drivers/docs. I'm sure people would figure out nice (and less nice) abstraction libraries on their own just as well.

It's so frustrating to read what the GPU is actually capable of (for example in the intel PRMs) and to know that there is absolutely no way to get the driver's compiler to do the right thing in a reliable way.

I mean imagine if icc was the only x86 compiler.


It's ok man, we can just keep piling on more turtles.


I do get the impression the new more-explicit APIs help somewhat with that, no?


All game engines that matter already support Metal, plus writing to platform-specific APIs is something professional game developers have been used to doing since the Atari 2600.


What do you think about them forcing you to use Swift or Objective-C for this? Forcing you to use languages that use ref counting and object-oriented pointer tables that are traversed at runtime? How much of the gains does objc_msgSend eat up? I thought you were against such things.

How ugly would the JAI code need to be to interface with this?

I wish they had a simple Metal C API, but their new API comes with a bunch of Objective-C baggage.


Have you actually used the API? When it comes to submitting geometry and textures (some of the biggest bottlenecks), it's exactly the same as in C. You pass a pointer to your buffer of bytes and they get copied to the GPU for you.


When you're calling any library, that simple C call invariably goes somewhere else inside the library itself. For OpenGL this is because there's always hooks for introspection, or because the GPU driver wants to implement something themselves, etc. For many other libraries it's just because people don't feel like they've written "production quality code" unless it goes through a bunch of hoops and ends up in some method with five underscores in the name.

In Metal the Obj-C abstraction is part of the design and used to eliminate any other abstraction people would want to introduce. The objects you get back from the API are implemented straight in the GPU vendor code, and the debug tools can swap the methods out for extra validation, recording, etc.


Any overhead coming from objc_msgSend is minuscule compared to the gains from things like better serialization of GPU-bound tasks and not having to synchronize repeatedly with the CPU.

If you're worried about refcounting, use ARC (which you have to with Swift anyway). First, the compiler is very smart about optimizing away retain/release/autorelease calls whenever it can. Second, when those calls do have to be made, they're implemented using vtables, and never hit objc_msgSend() in the first place.


I think it is about time industry relevant OS vendors start to move away from C, kudos to Apple.


Pretty ballsy. It didn't work well for Microsoft when they said "use DirectX or die," and I doubt it will work well for Apple. This is especially true for OpenCL (also deprecated): nobody running big Linux server farms with GPUs is going to use "Metal on Linux" for their code.


How exactly did it not work well? Windows is THE gaming platform for PC.


It kept a lot of commercial software off the Windows platform and left it on workstations like the ones SGI and DEC made. The movie houses rendering movies were using OpenGL on their render farms, and its lack of availability on Windows kept Windows off those desktops.

The key being that if you've got a technology that works on both server farms for production and workstations for development, you support that so that your OS is a viable candidate for the developer workstation. I don't see a Metal port coming to Linux in any reasonable way any time soon, and I don't see researchers giving up OpenCL or even OpenGL any time soon, so it just means that Apple is going to forego that business.

With the recent GitHub purchase it gives the oddly dissonant experience of having Microsoft be the 'developer friendly' OS company and Apple be the 'developer hostile' one. Where, and this is important, support for cross-platform tools determines hostility or support. I would not argue that Apple is not the best development environment for the Apple platform, or Windows for the Windows platform.


OpenGL is a real-time graphics API, not an offline rendering system used by render farms. I have never heard of movies or special effects rendered in OpenGL. The first major renderer was RenderMan, the only game in town for years, and it has nothing to do with OpenGL.


You are correct. However, many in-studio tools are written in OpenGL. These tools are used to model objects and layout lighting, scenes, etc. by artists. They are written in OpenGL, usually on Linux. (Source: I work with several people who formerly built these tools for well-known studios like DreamWorks and Sony.)


Of course, but that’s not what we are discussing. The context for my comment was regarding offline rendering and render farms.


That's like saying roads are THE travel surface for cars.


... yes?


Not sure what you're saying is changing on the server -- people are going to go from not using OpenCL to not using Metal.

Everything is CUDA. Everything depends on the shitty unstable software designed by a hardware company (Nvidia). This sucks and I hope someone can disrupt it, but Apple has no influence in the field of GPU computing.


When did Apple ever have influence in the field of GPU computing?

And I work in data science and nobody is using their own laptops when you have AWS.


I didn't say they did. My claim is that Apple deprecating OpenCL is a straightforward and uninteresting thing; it's a company that has no influence on GPU computing getting out of the business of a technology that also has no influence on GPU computing.

I work in data science too, and who cares about laptops. Desktop computers with GPUs, SSDs, and a lot of RAM are what you need. You can thoroughly bling out the hardware and the entire computer will still cost less than your monthly AWS bill to access a GPU. (This is all getting pretty irrelevant to Apple, though, who doesn't make such computers.)


Whatever criticisms can justifiably be levelled against nvidia, having a bad software stack isn't one of them.


Most of the field of machine learning is irreproducible right now because you can't not use CUDA, but you can't promise that it will work the same on anyone else's computer, or that it will work six months from now.


Actually, it worked well for MS - back in the day, most of the games in the industry were done in DirectX. And to be honest, DirectX/3D was the only option if you wanted to have Vulkan-like low-level access to GPU.


Apple isn't a large gaming platform and they aren't used as server farms.

So I can't imagine this is going to hurt at all.


The writing was on the wall since they stopped updating it (OpenGL) 5 years ago.


True, but it'd be a shame to lose out on the library of legacy software and games.

Edit: and XQuartz (X11) with OpenGL comes in handy once in a while too...


I mean... it looks like there's at least one commercial option for OpenGL on Metal (MoltenGL).

Also, it seems to me that the better option in the long run for legacy game support is to just run a VM for the highest degree of compatibility.


Yep, I bet the MoltenGL guys are throwing a major party right now. I wonder what kind of effort it would be to hook up Mesa to a Metal backend...?


Years.

EDIT: As a follower of https://mesamatrix.net/ I don't think it would be unreasonable to say about three years to get something that kind of works, five for something semi-reasonable, and eight for something at modern OpenGL level.


I guess John Carmack is sad today (he advocated early for OpenGL on Mac).


I am not sure. In the past Carmack stated that DirectX at some point became a much better API compared to OpenGL. He also once stated (admittedly when talking about id Software, after he left the company) that he's not really a sentimental person.


John Carmack probably knows the Apple of today is not the Apple of yesteryear.


If the Apple of today would just include a couple more buttons on their mice, John Carmack would probably be happier.


Say what you will of Microsoft, they still understood how important backwards compatibility was, and they didn't do the same to OpenGL back when they were pushing D3D hard.

I really hate to see such a focus on a platform lock-in API when viable alternatives (Vulkan) are available.


So that's how they weed out the still good 2011 Macs...

I can't say I'll be protesting by not buying a new Mac because I'm already devoid of any desire to do so.


I'm curious about this – does it really matter? How important is it that the Operating System has OpenGL? Can't individual apps just static link to (and ship) their own versions of OpenGL?


OpenGL is just an API; the underlying features are provided by the graphics driver. You can't ship with your own OpenGL, as that would mean shipping with your own amd/nvidia/intel driver (and the associated kernel module, etc).

A reasonable alternative would be to implement OpenGL on top of Metal for compatibility but this is a lot of work.


What's going to happen to games and apps that use OpenGL, starting today? Will they stop working in the new macOS version?

Edit: oops, just read it: "Apps built using OpenGL and OpenCL will continue to run in macOS 10.14, but these legacy technologies are deprecated in macOS 10.14. Games and graphics-intensive apps that use OpenGL should now adopt Metal. Similarly, apps that use OpenCL for computational tasks should now adopt Metal and Metal Performance Shaders." [0]

[0] https://developer.apple.com/macos/whats-new/
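As a side note (an assumption based on the 10.14 SDK betas, worth verifying against Apple's release notes): the new SDK marks every OpenGL declaration as deprecated, so existing projects will start emitting build warnings. The headers expose a silencing macro for code you can't port yet. This is a build-configuration fragment, only meaningful against the macOS SDK:

```c
/* macOS 10.14 SDK: OpenGL declarations are tagged as deprecated.
 * Defining this before including the headers suppresses the warnings
 * so existing code keeps building cleanly while a Metal port is planned. */
#define GL_SILENCE_DEPRECATION 1
#include <OpenGL/gl.h>
/* (On iOS the corresponding macro is GLES_SILENCE_DEPRECATION.) */
```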


There's at least one implementation that I'm aware of in MoltenGL (https://moltengl.com/moltengl/), though that's only ES.


There is also MoltenVK https://github.com/KhronosGroup/MoltenVK for Vulkan on metal...


"Also, r.i.p. my old macbook air 2011 :-/"

macOS performance was already getting so poor on 2010/2011 MacBook Airs, so I think this is the right move. I recently downgraded my old 2GB 2010 MacBook Air back to 10.11 El Capitan and it runs much, much better than it did on High Sierra.


Careful with that. A number of security fixes make it to El Cap, but I don't think all of them do.


I guess we will be saying our last goodbye to “write once, run on all major platforms” graphics code. Does this mean that the only choices now are 1. Fork your graphics code into two separate implementations or 2. Go up the stack and use a cross-platform “game” engine like Unity? I suppose 3. Stop supporting macOS is another (sad) option.


If you're programming in Rust, you can just use `gfx-rs`'s HAL[1]. This is designed to be a Vulkan-like API that works on top of Vulkan, Metal, D3D12, or OpenGL.

If you aren't in Rust, just use Vulkan. There are high-performance wrappers for it in various stages of development, such as MoltenVK[2] for macOS and iOS, and VulkanOnD3D12[3] for UWP. Non-UWP Windows, Linux, and Android should support Vulkan natively through their GPU drivers.

[1]: https://github.com/gfx-rs/gfx

[2]: https://github.com/KhronosGroup/MoltenVK

[3]: https://github.com/Chabloom/VulkanOnD3D12


> “write once, run on all major platforms” graphics code.

This was never true for game consoles, regardless of urban myths regarding OpenGL support.


Fine. All major desktop, laptop, and mobile platforms not including game consoles. Still, more portable than any other existing alternative.


What does that mean for games like League of Legends running on OpenGL? (https://engineering.riotgames.com/news/trip-down-lol-graphic...)


Riot probably has enough resources to pay its developers to target Metal on macOS by the time OpenGL is removed (some number of years from now).

It’s the smaller scale developers and projects that might not have such ability.


Certainly has enough resources. Their 2015 revenue estimate was $1.6 billion. 2,500 employees in 2017.

(Not intending to nitpick--just pointing out that Riot is a LOT bigger than many folks realize)


The smaller scale ones are generally using an existing engine like Unity or Unreal, which handles Metal support.


Generally, yes. Certainly not all. The likes of SFML and SDL2 and libgdx have zero support for Metal, although SDL2 did recently add support for Vulkan.


How far back did Unity/Unreal transparently handle Metal support? I suspect some developers won't have the resources to update games running on older versions of those engines.

Games made by developers who've gone out of business are probably just going to stop working in a few versions of macOS.


Perhaps they could use some middleware like MoltenGL [0]. That way they might still be able to write against an OpenGL API (which allows for code re-use), while supporting Metal under the hood. It does seem this particular tech might be more suited for mobile platforms, unless OpenGL ES is also used these days on the PC / Mac platforms.

---

[0]: https://moltengl.com/moltengl/


I guess an eventual port to Metal, like WoW.


> Wow, they are DEPRECATING OpenGL from both macOS and iOS

How does this affect Qt iOS applications?


It doesn't, because they are migrating to a universal 3D backend, away from a pure OpenGL one.


Do you have more info on this? When is it expected to be done?


macOS apps as well...


OpenGL is a pain in the ass to use, Metal does what OpenGL does so much better, and with the Vulkan Metal wrapper you can still write cross-platform apps/libs. So nothing of value was lost.


What's the consensus on Metal? How does it compare to OpenGL?


You can’t compare them directly, they’re very different categories of graphics APIs. Metal belongs to the category of “modern low-level graphics API” (which also includes Vulkan and Direct3D 12), while OpenGL is an older higher-level graphics API (which also includes Direct3D 11 and previous).

The low-level graphics APIs like Metal and Vulkan allow for much better performance, but they are much harder to use and require more work from the developer (hence, they’re usually used only by game engine makers). Higher-level graphics APIs like OpenGL are less efficient and have lower peak performance, but are easier to use for individual projects and have the benefit of having existing functional code (no need to rewrite a working project).

Also, OpenGL (and its mobile and web variants OpenGL ES and WebGL) is very portable (macOS, Linux, Windows, iOS, Android, both native and in browsers), while Metal is macOS/iOS-only.


To be fair - OpenGL was fairly horrible to use directly. It sat in an awkward middle area between "low level enough to be efficient" and "high level enough to be productive".

Maybe I'm biased - every time I looked at OpenGL code I shuddered and ran away to a higher level framework (I'm excluding shader code from this - that's concise enough for me not to mind getting that close to the metal).


So much this! I've been writing OpenGL code on a daily basis for the past 10 years, and I hate it. It works like an API designed in the 1970s. It uses none of the modern features of languages, easily allowing you to pass incorrect data to a function, and giving you almost no clue what, if anything, went wrong. Just look on StackOverflow some time at how many questions are basically, "My previously working OpenGL program is now rendering nothing but a black screen. What went wrong?" And then compare the huge number of different causes. There's no reason they couldn't have improved it along the way, keeping backwards compatibility, and just adding more informative error codes and better use of language features. But they didn't. My feeling is "Don't let the door hit you on the way out."


I don't think that is quite accurate, comparatively speaking. But you will have to ask a professional game developer for better judgement.

With Vulkan, you need to know exactly what you are doing. There is little to no handholding. You are trying to squeeze out the last 10% of performance in exchange for a lot more development time. You need to write a hell of a lot more code to do things you previously thought were simple.

OpenGL is higher level and should be compared to Direct3D 10, not 11. As a matter of fact, I'd go so far as to compare it to Direct3D 9. And unless you are an OpenGL zealot, any sane game developer would tell you DirectX 9 already exceeded OpenGL in most ways.

Metal offers most of what Vulkan can do while being even easier than OpenGL.

Honestly, I don't understand all the backlash. OpenGL is deprecated, which simply means Apple won't be updating it anymore. Stop asking every god damn year. They have been shipping other deprecated libraries and APIs for years! OpenGL is developed in such a way that no one really cares, and it's designed by committees for maximum backward compatibility. And if you want your app on iOS, you will have to use Metal anyway.


Thank you for the explanation. I don't do any graphics programming, besides a few toy projects with OpenGL, but my understanding was that one of its benefits was portability (for a varying definition of "portability").

That's why I wasn't sure what Metal is offering instead.


Except they aren't portable at all for game consoles.


Most of the game devs I follow on Twitter expressed a positive opinion on Metal, particularly over Khronos' Vulkan, and of course everyone (that actually has to develop with it) hates OpenGL.


To my understanding, the consensus is that it's, y'know, fine, but nothing particular to recommend it over Vulkan. The main problem people seem to have with it is that it feels unnecessary, like Apple being incompatible for the sake of being incompatible.


Didn't Apple try to work with Khronos to make Vulkan, but they were taking so long that Apple just gave up, made Metal, and shipped it before Vulkan was finished?


Much better.

A modern 3D API that acknowledges the time of pure-C APIs is long gone, with C++14 as the shading/compute language, providing math, 3D models, materials, fonts, and texture support.

Whereas in OpenGL you are stuck fishing for libraries, with multiple execution paths depending on extensions and bugs, compiling and linking shaders at runtime without library support, C-style APIs, and a plethora of deprecated APIs.


Metal is a lower-level API, somewhat similar to Vulkan or Direct3D 12. OpenGL supports driver extensions that can be used to reduce overhead in a similar manner to Metal, but this is largely a moot point when it comes to OpenGL on macOS, as many such extensions are simply not available.


They also sneakily deprecated the sub-pixel font rendering, so the "Use LCD font smoothing when available" option will be gone from the preferences.


More info on this for the interested: https://twitter.com/siracusa/status/1004143205078597633


Mentioning this in a couple different places, but consider Molten (https://moltengl.com/) for this use case; it's a third party but very high quality reimplementation of Vulkan and OpenGL ES atop Metal. In other words, an easy way to adopt Metal without actually porting your app.


Will this affect Safari (and other browsers) supporting WebGL on macOS/iOS?


Doesn't Core Image use the OpenGL Shading Language, so is that going too?


That was gone last year, check WWDC 2017 Metal 2 related sessions.

When it was introduced, the Core frameworks were updated to use Metal instead, with OpenGL left for backwards compatibility mode.


the notch...no 3.5mm aux...no OpenGL... the hits just keep on coming. Is this what passes for "invention" today? What's next, deprecating USB?


You can add CD ROM, the diskette drive, SCSI, ADB, FireWire and the LaserWriter to that. How dare they!


Notch decreases function and is aesthetically displeasing.

3.5 mm = 1) there is no way a wireless medium will ever have better throughput than a wire driven by a superb DAC - ever, 2) dongles, 3) an extra battery to worry about now with BT.

OpenGL, I will grant you that one, since both Metal and/or Vulkan are a vast improvement on OpenGL.


> Notch decreases function

I know that some people like to say this, but it's pretty clearly wrong, and it makes you look unobservant when you repeat it. It takes about five seconds and two phones to demonstrate this. The standard top section on every phone, Android and iOS alike, is a dedicated status section with little system icons with wide gaps of unused blank space between them. Every phone, every time. A notched phone just puts the camera directly between those icons instead of putting unused pixels there. It doesn't take a genius to see that un-notched phones have both larger top bezels and more pixels wasted in the status area. The notch doesn't cut into the screen. The screen extends up around the camera and puts the status icons on the horns.

> and is aesthetically displeasing

All of the tens of people I know who have notched phones say they love it and don't notice the notch. So, maybe to you, but it seems like the market is speaking.


It might benefit you to look at a 3rd phone. Because my top notification bar is always full. So no, they are not clearly wrong. Or better yet, maybe it just comes down to individual preference!? I personally never thought I'd see someone trying to justify the loss of screen space as a "win". But here we are. And also, the market is still out on it. Until the majority of phones are doing a notch, the market hasn't spoken. Apple hasn't been the leader in smartphone sales in many years now. They aren't even #2 anymore.


Notched phones have gained screen space, not lost it. Without the notch, the new screen area around the notch would not be screen area at all.

