Hacker News

Meta wants to capture your biometric data so that your facial expressions and body language can be mapped onto a 3D avatar in virtual reality. Is this really the future of the metaverse? Or is it just what Mark Zuckerberg thinks the metaverse should look like?

I am a little skeptical of this vision. I imagine most of the "metaverse" will be experienced through flat screens, and that text, voice, and video will remain dominant. VR will grow, but do we really need to map our physical expressions onto an avatar to play games or collaborate? Why go through all this complexity when we could just have controls that let our 3D avatars emote?

If you really want to convey facial expressions and a personable experience, just enable video.



> If you really want to convey facial expressions and a personable experience, just enable video.

I hate to take Meta's side, but enabling video is inviting discrimination and hate speech. Enabling audio alone has proven to be enough to bring out the worst in people online.

Voice mixers and avatars will result in a much more even experience for everyone involved. Emoting through an avatar will make that experience richer.

You can (literally) be a dog online if you want.


> Enabling audio alone has proven to be enough to bring out the worst in people online.

I am not sure the issue stems from enabling audio. We see the same behavior over text on Facebook, Reddit, and pretty much every gaming platform.


Enabling audio makes it easy to find a target based on their inherent characteristics. The bar is low to find, for example, a woman to harass over voice comms in a videogame (an exceedingly common problem).

FB, Reddit, et al. require the target to actually exhibit a belief or opinion to become a target. It ironically raises the bar for being an asshole to a specific target when you can't hear them.


I do not think the solution to hate speech and discrimination is to hide the traits that make us different.

If the future of the metaverse is on the blockchain, each digital identity will have a reputation to uphold. I think transparency in people's actions, a reputation system, and a simple moderation system is sufficient to manage toxic behavior.
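As a toy illustration of the reputation idea above (the design and all names here are hypothetical, not anything Meta or any blockchain project actually implements), a transparent per-identity reputation ledger could be as simple as an auditable event log plus a summary score:

```python
# Toy sketch (hypothetical design): each identity accumulates a public
# log of moderation events, and a simple score summarizes them.
# A real system would also need Sybil resistance, score decay, appeals, etc.
from collections import defaultdict

class ReputationLedger:
    def __init__(self):
        # identity -> list of (event kind, signed weight)
        self.events = defaultdict(list)

    def record(self, identity: str, kind: str, weight: int) -> None:
        """Append a transparent, auditable event (e.g. 'helpful', 'abuse_report')."""
        self.events[identity].append((kind, weight))

    def score(self, identity: str) -> int:
        """Sum of event weights; the 'reputation to uphold'."""
        return sum(w for _, w in self.events[identity])

ledger = ReputationLedger()
ledger.record("alice", "helpful", +2)
ledger.record("alice", "abuse_report", -5)
print(ledger.score("alice"))  # -3
```

The point of the sketch is only that "transparency plus reputation" reduces to a public, append-only event log; everything hard about moderation lives in how the weights are assigned and who gets to assign them.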


Just like China has. They're so progressive...


The user daenz made a good point too:

> Its primary feature is to strongly tie someone's identity to their physical body.

I am just being skeptical here. Again, why can't I just have a control panel with a wide range of avatar emotes, the same way we currently have a wide range of emojis?

My physical facial expressions are limited. What if I wanted my jaw to literally drop on the floor? Or my heart to beat out of my chest?


Insert a "why not both" meme here. Having both available is very powerful, especially for non-power users who would rarely use the advanced list of emotes.


It’ll become the PenisSwastikaVerse about five minutes after it opens.


Trolls will troll. FPS players will say stupid shit. That's the world, regardless of how many layers you wrap around it.


> I hate to take Meta's side, but enabling video is inviting discrimination and hate speech. Enabling audio alone has proven to be enough to bring out the worst in people online.

Virtual avatars already work in video conferencing apps, and any voice modulation software will work there as well. They're even easier to use, because tracking facial expressions and mapping them to a virtual avatar can happen with just a webcam; you don't need a complicated extra set of hardware.

I really just don't believe that privacy/anonymity/anti-discrimination is the reason Facebook is making any decision with VR. I base that mostly on the entire history of the company.

This is the company that has pushed harder than any other mainstream social network to force people online to use their real names. They've repeatedly had micro-targeting bugs in their ad system that allowed targeting people based on race, and there have been multiple instances of people having parts of their identity leaked to friends and family through advertising.

The facial recognition system they're now planning to get rid of would tag your real face and correlate it with your real identity online, outside of your control, via other people's accounts. I know people whose literal first experience with Facebook was getting doxed/outed the moment they signed up, by automated systems that sent suggested friend requests from their alt accounts to other contacts based on their address book.

This is not a system or a company that in any way cares about digital autonomy, anonymity, or user-controlled identity. It never has, and I don't believe Mark when he suddenly claims the company has done a 180 on user autonomy.

I'm not anti-VR; I don't think it has the workplace value that Facebook is claiming, but it does have some benefits. And there is potential for self-expression in VR that is genuinely exciting (see platforms like VRChat; notably, Facebook is doing nothing in that space). But VR is definitely not a technology designed to decrease the risk of abuse. When you strap a VR headset on, you're putting yourself in a vulnerable situation, and Facebook has no controls I can see that try to mitigate the fundamental problem: they don't offer much control over avatars, and they don't have voice masks.

If they're claiming that the lack of video passthrough for faces is a privacy feature, that's an excuse; I don't believe it for a single second. None of the rest of their setup reflects a company that honestly cares about any of that. You bring up audio as being a problem on its own: Facebook isn't helping there either. It's moving away from text communication and not enabling audio filters by default in VR. It's making that problem straightforwardly worse.


I'm not the least bit skeptical of this vision, but that's because I see direct uses aligned with Facebook's (Meta's) business model.

If you are selling advertising (influencing) access to a population through heightened metrics and more ability to target amenable customers, there are many useful things you can learn and correlate through things like delay before click-through, where the mouse pointer hovers, and so on. This is done on a massive scale and it's extremely revealing, allowing for deep and fascinating connections into the human subconscious.
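For illustration, here is a minimal sketch of how the signals mentioned above (dwell time from hover, delay before click-through) could be derived from a timestamped interaction log. The event schema is my own invention for the example, not any real ad platform's API:

```python
# Hypothetical event schema: derive per-target engagement signals
# (hover dwell time, impression-to-click delay) from a timestamped log.
from dataclasses import dataclass

@dataclass
class Event:
    t: float      # seconds since page load
    kind: str     # "impression", "hover_start", "hover_end", "click"
    target: str   # element the event refers to

def engagement_signals(events: list[Event], target: str) -> dict:
    """Aggregate dwell time and click-through delay for one target."""
    dwell = 0.0
    hover_start = impression = click = None
    for e in (e for e in events if e.target == target):
        if e.kind == "hover_start":
            hover_start = e.t
        elif e.kind == "hover_end" and hover_start is not None:
            dwell += e.t - hover_start
            hover_start = None
        elif e.kind == "impression":
            impression = e.t
        elif e.kind == "click":
            click = e.t
    delay = (click - impression) if impression is not None and click is not None else None
    return {"dwell_s": dwell, "click_delay_s": delay}

log = [
    Event(0.0, "impression", "ad_42"),
    Event(1.0, "hover_start", "ad_42"),
    Event(3.5, "hover_end", "ad_42"),
    Event(4.0, "click", "ad_42"),
]
print(engagement_signals(log, "ad_42"))  # {'dwell_s': 2.5, 'click_delay_s': 4.0}
```

Even this trivial aggregation shows why the data is attractive: a few timestamps per element, collected across billions of impressions, already yield behavioral features without the user ever typing a word.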

This is NOTHING compared to what you'd get tracking micro-expressions on the human face, which are largely involuntary, on a whole different scale.

This is like mapping the human genome, but for human behavior. It's absolutely revolutionary. Even with the most primitive recognition the big data generated will be transformative.

The inevitable result, WHOEVER does this (if it isn't Facebook, it'll be somebody else. If it isn't the soulless commercial sphere it'll instead be the government. Or 'all of the above') will be this outcome: you can then pay to manipulate entire populations to your whims. All you have to do is have an outcome, a plausible means of influencing them, and this ability to micro-target everyone who is susceptible to your message and skip over anyone who might whistle-blow on you or raise a coordinated objection.

This… changes the conditions under which democracy is practiced, and under which anti-democracy can be consistently maintained. It's a technology that is easily wielded by just plain old humans that are no smarter than the 'masses' to be manipulated.

The really interesting question is what happens when AI or larger-than-human frameworks get involved and have this technology. If there is a Singularity, there will be no objections. If AI rules us, it'll be as the domesticated animals we essentially are, and there will be no call for tyranny and widespread violence: if anything, this suggests that violence and horror become artifacts of fallible human implementation of these things. The more sophisticated use of it will be bloodless, with any desired outcome 'willingly' adopted by the populations targeted.


> If you are selling advertising (influencing) access to a population through heightened metrics and more ability to target amenable customers, there are many useful things you can learn and correlate through things like delay before click-through, where the mouse pointer hovers, and so on. This is done on a massive scale and it's extremely revealing, allowing for deep and fascinating connections into the human subconscious.

Counterpoint: no it isn't and no it doesn't. All of these models presume that they can infer what has the user's primary focus, when in fact users are distracted by all sorts of things that are not connected to the computer in front of their face. This is all just the daily Sturm und Drang to sell to marketing departments, which is Facebook's real business model (lying to advertisers [1]).

> This… changes the conditions under which democracy is practiced, and under which anti-democracy can be consistently maintained. It's a technology that is easily wielded by just plain old humans that are no smarter than the 'masses' to be manipulated.

I'm all for the presumption of the worst when it comes to tech corporations, but some consideration of likely outcomes based on historical trends is prudent. The fact is the only corporations who have a vested interest in what you're saying here are military contractors, and they are content to charge the DoD, FAA, CIA, etc $10k per line of custom code to run on their Access 2000 databases and IBM AIX systems.

At the end of the day, there's more money in lying to the Ford/GM marketing departments than there is trying to organize a fake coup for Elon Musk in Bolivia. After all, if they were willing to pay for the lithium, they'd just pay for it. The whole point is trying to get it on the cheap. Car marketing departments, on the other hand? They'll pay, and pay a lot.

1. https://www.google.com/search?&q=facebook+lies+to+advertiser...


You’re both right! There will be incompetence, fraud, AND evil.


> do we really need to map our physical expressions onto an avatar to play games or collaborate together

It's potentially great.

I use Horizon Workplace to chat socially with friends I might otherwise FaceTime with. When it's more than one person, having the spatial audio and the hand gestures is great. I don't mind not seeing my friends' real faces. Adding facial gestures that are real-time and natural would be huge imo, and a lot more immersive.

Would I use that with my grandma? No. Online friends? Yes.



