Hacker News
Xiaomi explains more about how its under-screen camera works (theverge.com)
194 points by notlukesky on June 9, 2019 | hide | past | favorite | 121 comments


Mounting the camera behind the screen could enable eye contact in video chats.

The lack of eye contact is the biggest difference between video chat and face-to-face conversation, and it creates low trust.

And while Xiaomi isn't the first to achieve this[1], Xiaomi is a consumer company, and their price point is very affordable.

So I hope they target this problem. They could be very successful, hopefully with good implications (environmental, social).

[1]https://www.prnewswire.com/news-releases/dve-launches-camera...


> Eye-contact in video chats. That's the biggest difference between video chat and face-to-face conversations

This issue is pretty far down the list IMHO. Above it are fixing the obligatory 5 minutes of "Can you hear me? I can't hear you" before the actual conversation starts.

Face-to-face conversations also lack "Please mute your mic when you're not talking because the background noise is annoying". "Did you forget you're muted?". "Can you repeat that, I can't understand you". "The video/sound is cutting out". "Where's this echo coming from?". "Please say something on your end". "What's that noise?". "Please use a headset next time". "Sorry, the call dropped out". "Let me call you again, maybe this will fix the issue". etc. etc.

Video chat is one of the great unsolved problems of computing. Yes, camera location is there as well - especially on Dell laptops that for some reason have it looking up from below into your chin. But that is really one of the lesser issues as far as my experience goes.


Sounds like you’re limiting your view to work-related calls. If I could have real eye contact on personal calls, it would make a huge difference. My current technique of looking just below the camera already makes a huge difference, and it’s only a hack to make it seem like eye contact for the other person.


I do half of that crap on personal calls too. Interestingly enough, I don't get those issues on phone calls! Or most audio only calls over the internet.

The underlying issue is the technical failures of video calling, not the social environment it's used in.


On a laptop I would reduce the video window to 10% of the screen and put it just under the camera, and it made a huge difference


I've had decent luck tilting the laptop screen down a bit so that it reduces the distance between the camera and the window


You would have to look directly into the camera, no? Why just below?


I would guess because if you always looked directly into the camera, it would be harder to watch the video/see the other person at the same time.


It is natural to look at the person and make eye contact when talking to them. Hence we don't look at the camera, though I'm sure some people can and do this.


> "Can you hear me? I can't hear you"

Would you mind explaining why there isn't anything (or only very little) done in that area? I guess we could implement some assistive checks for the input and output audio streams? What is blocking us from implementing a "handshake" that verifies everything is fine? Say the clients send a recorded "Hello World" sample over the wire and check the microphone, volume, noise, etc. The way we resolve the "I can't hear you" problem is a repeatable checklist that we more or less confidently follow, so why can't we implement it as an algorithm that takes care of those problems automatically?
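That checklist could indeed be automated. As a toy sketch (pure NumPy, simulating the audio path rather than touching real hardware; the names and thresholds are my own assumptions): play a known chirp through the speaker, capture the mic, and look for the chirp by cross-correlation.

```python
import numpy as np

def make_probe(sr=16000, dur=0.5):
    """A linear chirp: easy to find by cross-correlation, unlikely in noise."""
    t = np.linspace(0, dur, int(sr * dur), endpoint=False)
    return np.sin(2 * np.pi * (300 + 1500 * t) * t)

def probe_detected(captured, probe, threshold=20.0):
    """True if the probe shows up in the captured audio: the correlation
    peak must stand well above the correlation noise floor."""
    corr = np.correlate(captured, probe, mode="valid")
    peak = np.max(np.abs(corr))
    floor = np.median(np.abs(corr)) + 1e-12
    return peak / floor > threshold

# Simulate the round trip (speaker -> room -> microphone) instead of
# using real audio hardware.
sr = 16000
probe = make_probe(sr)
rng = np.random.default_rng(0)
noise = 0.05 * rng.standard_normal(2 * sr)        # 2 s of background noise
working = noise.copy()
working[4000:4000 + probe.size] += 0.3 * probe    # delayed, attenuated probe
print(probe_detected(working, probe))             # audio path OK
print(probe_detected(noise, probe))               # muted or broken path
```

A real implementation would also have to handle echo cancellation and headphone cases, which is where it gets hard.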


because your microphone can't hear what's happening in your headphones, and your computer doesn't know if you're using headphones.


A system that has access to both the microphone and the headphones could coordinate both based on past experiences (e.g., learning). I don't know how much data you would need to make it reliable though. If you know how the data is going to be transmitted, and assume little network delays, recording yourself before a video session might be of great help. Probably the same problem that TV broadcasters have. It's hard work to get quality real time content.


> Above it are fixing the obligatory 5 minutes of "Can you hear me? I can't hear you" before the actual conversation starts.

Whenever I’m in this mode (which basically means whenever I’m on a conference call) I wonder what percentage of the GDP this represents, in aggregate.


Supposedly, way way less than the portion of the GDP that is made possible by remote work.


Those problems are solvable, the solutions are here.

And there are even more problems with consumer telepresence:

http://hellarddesign.blogspot.com/2013/07/why-living-room-vi...

But I feel that those too are solvable, if you care enough (like when you want to keep in touch with your senior parents).


Senior parents might not be as forthcoming as a project lead about the fact that the call quality sucks and that they can't hear half the things you say. Just a thought.

I spent the past 5+ years in jobs where I had to have multiple video calls per week. I used every tech imaginable, from those that cost five figures to the gratis ones. If these problems are solved, why is the experience universally terrible on all of them?

When I call my parents, it's over the plain old phone line, because I actually want to talk to them. Not seeing their faces on the screen is a small price to pay for not having each and every call turn into an impromptu debugging session.


I have the opposite problem. Call quality over cellular networks is always terrible. Even when you can hear each other, the quality is awful. I generally use voice over internet with something like signal.

I also find that Apple's FaceTime is the most consistently good and functional with little "can you hear me now".




What about a microphone array?

Have you tried putting one at both your and your parents' locations?

And what kind of internet are you two using? And are the devices on wifi, or wired?


Sounds like you've only used shit systems on shit networks.


Filmmaker Errol Morris adapted a teleprompter to achieve eye contact with interview subjects. The screen lies horizontal, below and in front of the camera; an inclined two-way mirror reflects the screen image in front of the lens.

http://www.whiterabbitdesigncompany.com/uploads/1/4/1/3/1413...

His non-patented "Interrotron" inspired the patented "Eyedirect". Other variations may be possible for small batch manufacturing.


Yes, eye contact in video chats is very important. The way this was done in the early days of video conferencing [circa the late 1990s and early 2000s] (and probably still today, in more elaborate ways) was with a one-way mirror at an angle: the camera sits behind the mirror, and below the mirror a display screen projects upward. This puts the camera at the same eye level as the person on the display, making for a much more comfortable engagement. To see why, have a chat with somebody while looking at one of their eyebrows; it makes them uncomfortable and increases how often they blink. Eye-level contact in video conferencing/chats makes the experience about the subject rather than limited by the technology.

EDIT - the same trick for eye-level camera positioning is used for some autocue setups in Television studios.


> it creates low trust

Try looking into the camera when you speak rather than the screen.


Then they'll trust you, but you won't trust them


+1 I do this when doing 1-to-1 or 1-to-many calls. I look at the lens at the top of my laptop or the one in the meeting room when I am talking.

It means I can't see what others are doing and feels kinda weird just staring at a lens when talking, but in a world of people who don't make eye contact like this on video calls, I think it has a good amount of impact/gravitas.


Wouldn't that require the camera to be located nearer to the middle of the screen, where the on-screen eyes of the other party would be? I wouldn't mind a slightly perceptible camera dot at the top of the screen, but if it was in the middle it might appear pretty conspicuous.


It will need to move. On anything bigger than a phone, the video chat may be in an off-center window. And I guess multi-window group chats are a problem too.

A great solution for a very limited problem.


well, that will require making a hole in the battery

What we in the industry call the new de facto standard is the "1B 1C layout": a layout with one board and one battery cell, edge to edge. How far you can move the camera is determined by how much battery size you are ready to sacrifice.


There's also the eye-contact-with-oneself aspect: think of people who use their phones for doing their makeup as well as taking selfies.

A lot more of that goes on than video conferencing. Culturally, the eye contact thing is not actually universal; there's an understanding that the lack of eye contact is due to the video conferencing setup. So maybe the personal-vanity selfie use case is what wins it here.


Does the screen have to be off where the camera is located?

I haven't dug deep but the few videos I saw showed a black bar covering the camera while it was in use (and bright colors when not).


I guess, depending on how fast the camera is, they could turn off the screen for a fraction of a second between frames to take a picture. Not sure if that's feasible atm though.


That could actually work. They already make low persistence OLED displays for use in VR, where the screen stays black for most of the frame and a bright bar of image rolls down the screen.

The iPhone can already operate in this mode.

I assume there is a display brightness penalty, but the image quality from the camera should be fine as long as it gets more than half of each frame to collect light.
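A rough back-of-envelope for the expose-between-frames idea (all numbers below are illustrative assumptions, not measured specs of any phone):

```python
import math

# How much light does an under-screen camera lose if it can only
# expose while the display is dark?
display_hz = 60
frame_ms = 1000 / display_hz              # ~16.7 ms per display frame
lit_fraction = 0.2                        # assume a low-persistence OLED lit 20% of each frame
dark_ms = frame_ms * (1 - lit_fraction)   # usable exposure window per frame

# Loss vs. an unobstructed camera exposing the whole frame,
# expressed in photographic stops (factors of two).
stops_lost = math.log2(frame_ms / dark_ms)
print(f"{dark_ms:.1f} ms exposure window, {stops_lost:.2f} stops lost")
```

Under these assumptions the camera loses only about a third of a stop, which supports the point that image quality could stay reasonable as long as the dark window dominates the frame.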


I think it's quite feasible with OLED. But even with the screen on, if you use a specialized CCD with huge dynamic range, I think it's doable.


You might find this interesting https://ieeexplore.ieee.org/document/1521517


A software fix seems more realistic. The latest smartphones pack enough AI circuitry to infer how an image would look if you were staring into the camera.


> Lack of trust due to lack of eye contact

> Let's use fake eyes instead

Not sure that works out there..


How do I get to where I and someone else can trust each other and spend time in person? I am never going to want to perform in front of a camera. I refuse.


I don't think it would accurately recreate eye contact. Maybe some kind of uncanny valley like version of it, which seems worse than what we have.


It actually does, if you get a chance to use it (I have). Nothing uncanny about it whatsoever.

Keeping the person's eyes roughly around the camera area is achieved by moving/zooming the video on your screen to the appropriate spot. People don't usually move their heads that much, and when they do you can manually or automatically readjust as required.


The amount of tech turned from idea into actual manufacturing in order to reach the holy grail of "full screen cover" is impressive: ultrasonic fingerprint readers, under-screen cameras and sensors, ...

I am not sure I buy into the need for it on my cellphone. Unlike a lot of people on HN, I actually enjoy a large phone and was one of the first Note customers, but I recently switched from an S9+ to an S10+ and actually preferred the 9's screen to the Infinity Display with its hole (or competitors with their notch). So I guess I need to wait until they finally figure out how not to need such tricks.

But despite not feeling like I need / like it, I can't help but be impressed at the tech and the speed at which it is made. The first Galaxy S phone was barely ten years ago.


So the days of putting electrical tape over webcams are over? The black hats of the world rejoice.

In all seriousness. I am very careful about webcams. I would never own a screen with such a hidden cam. And the emergence of this tech is a little unsettling as it defeats an effective and very practical security measure.


Same here, this would push me to use competing devices that don't have hidden cameras.

Another recent Chinese phone (may have also been Xiaomi?) had a "pop out" camera that you can put away, kind of like the pop out flash on digital cameras. That's much more preferable from my perspective. No tape needed!


Yeah the K20, not too expensive either.


ha, just imagine staying at a hotel where there's a giant flat screen with a hidden cam pointed at your bed...


a hotel can already hide a camera in many secret locations. this doesn't change that.


What if the hotel wasn't the malicious party? Maybe it's a "conference Webcam" built into a smart TV (aren't most nowadays?).


Yes, exactly- that was my point; it is fairly unlikely the hotel cares too much. But how secure are these devices? What is stopping someone from hacking into a bunch of them, recording hotel guests and then mining the data for compromising footage to sell / use to blackmail / <insert malicious purpose here>.


I never understood the reason behind covering a camera on a laptop. I mean, what exactly are you afraid of? Someone filming you sitting in front of a screen? A microphone would be a more dangerous threat in this case, no?


People do Other Things™ in front of their laptops...

Imagine that you never did anything in a certain room of your house except sit quietly. Are you saying that you’re okay with anyone standing outside that room staring into it at anytime?


I'm self conscious about my double chin. I don't want the cute guy working at the NSA to see that.


I would like to kill my microphone as well. The fact that I have no easy way of doing that is not an argument for not covering the web cam.


Privacy. Simple as that.


I have a security badge for work. I don't want it photographed.



This is going to be killer for establishing eye contact while videoconferencing. They just need to move it further down for that.


It seems like the eye contact problem should be very easy to solve with a little CV and image manipulation. Particularly when you're on a phone or laptop, the adjustment needed is really tiny.
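For a sense of how tiny: the gaze error is just the angle subtended by the camera-to-window offset at a typical viewing distance. The geometries below are assumptions for illustration, not measurements.

```python
import math

def gaze_offset_deg(camera_offset_cm, viewing_distance_cm):
    """Angle between looking at the on-screen eyes and looking at the camera."""
    return math.degrees(math.atan2(camera_offset_cm, viewing_distance_cm))

# Assumed geometries:
phone = gaze_offset_deg(1.5, 30)    # camera ~1.5 cm from the on-screen eyes, phone at ~30 cm
laptop = gaze_offset_deg(6.0, 55)   # camera ~6 cm above the chat window, ~55 cm away
print(f"phone: {phone:.1f} deg, laptop: {laptop:.1f} deg")
```

So on a phone the correction only has to move the apparent gaze by about three degrees, which is why software-only gaze correction seems plausible.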


The end game could be to have millions of low quality (possibly single-pixel) cameras in your screen, with software combining the images into a single one from a location that can be (somewhat) controlled in software.

IIRC, that’s what was proposed in Starfire, in 1992. (https://www.asktog.com/starfire/), where they could also use those tiny cameras as a scanner (place a paper on top of the screen, and read out all the pinhole cameras)
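A minimal sketch of the naive version of that software combination, assuming the per-pixel cameras are already registered to the same view (real hardware would need per-camera shifts and calibration): averaging many independently noisy captures recovers the scene, with noise falling roughly as the square root of the camera count.

```python
import numpy as np

rng = np.random.default_rng(42)
scene = rng.uniform(0, 1, size=(32, 32))      # stand-in for the true image

# Each "pinhole camera" sees the same scene plus heavy independent noise.
n_cams = 100
captures = scene + rng.normal(0, 0.5, size=(n_cams, 32, 32))

single_err = np.abs(captures[0] - scene).mean()
combined = captures.mean(axis=0)              # naive software combination
combined_err = np.abs(combined - scene).mean()
print(single_err, combined_err)               # error drops roughly by sqrt(n_cams)
```

The interesting (and hard) part in practice would be that each camera sees the scene from a slightly different position, which is what lets software steer the virtual viewpoint.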


Put a full screen sensor under the screen and scan pinholes while also playing video.


Apple filed for patents on this circa 2001.


Which means we're only two years from them expiring.


Not sure when they were issued.


All my smartphone/tablet/laptop cameras have labelmaker tape over them normally. I can't do that with an under-screen camera.


I was trying to come up with a reasonable solution to this. I wouldn't trust any software solution in the OS that "disables" the camera, because even if hacking the device is not possible, some sort of settings phishing might be.

It seems that the only way to deal with this would be a hardware switch that disables the camera, e.g. like the iPhone's mute toggle.


But the OS can still get hacked to ignore the state of your toggle. E.g. the iPhone’s mute toggle doesn’t stop the alarm clock from using the phone speakers.


Maybe I didn't make this clear, but by "hardware" I meant a literally physical switch, as in e.g. cutting power delivery through the circuit.


Yes! Thank you. I had the same thought, and then, "Well, no one else is paranoid enough to have this complaint."


It's been a not-unusual practice among privacy&security people for a long time.

Starting around 20 years ago, hackers spying on people through their webcams was publicized (such as with the cDc's Back Orifice, IIRC). For a while, webcams sometimes had physical doors/shutters over them. More recently, there was a boost of awareness, when Zuckerberg was seen covering his own camera.

But some companies presumably wanted you to be accustomed to their proprietary-ish videoconferencing, or to be accustomed to large wireless data plans, with network effects, and so they perhaps wanted to encourage front-camera use. Also, a door/shutter doesn't look sleek.

(Next time I have a physical office, I'll probably hang a little basket outside it, to give away trimmings of black and white labelmaker tapes, for this camera-covering purpose.)


This is very cool, but I wish marketing material told people that skin oil on the lens causes soft photos. Most people don't know to wipe the lens before taking a selfie, and the resulting photos look terrible.


Many phones (maybe Chinese phones particularly?) have software filters to deliberately give selfies that blurry and distorted look, a.k.a. Beauty Mode, so maybe it doesn't matter.


Beauty mode is very specific in what it does, namely it only blurs skin, and usually just the face. Oil makes everything blurry.


Whatever happened to lens covers


Don't know about Xiaomi phones, but Pixel phones already warn you if the camera lens is dirty when you take a picture.


Samsungs as well


I found that rather annoying.

How much time do you think they spent writing an algorithm to detect dirty lenses, and how many cycles are spent running it? All for an annoying popup.


> How much time do you think they spent to write an algo to detect dirty lenses, and how much megacycles is spent to run it?

Compared to all the other things running when using the camera? Nothing. Modern phone cameras are like 50% CPU processing, for both preview and capture.

By comparison, it's rather easy to figure out that something is in focus far below the minimum focus distance.

> So much for an annoying popup.

The vast majority of people would rather be warned about it and correct it than find out later that the photo they wanted is ruined.
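One cheap heuristic for "this lens is smeared" (an assumption on my part, not necessarily what Pixel or Samsung actually ship) is the variance of the image Laplacian, a standard focus/blur metric that collapses when everything is uniformly blurred:

```python
import numpy as np

def sharpness(img):
    """Variance of a discrete Laplacian: high for detailed images,
    near zero when the whole frame is smeared."""
    lap = (-4 * img
           + np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1))
    return lap.var()

rng = np.random.default_rng(0)
sharp = rng.uniform(0, 1, (64, 64))   # stand-in for a detailed photo

# Simulate the smearing of a greasy lens with a crude separable 5x5 box blur.
k = 5
blurred = sharp.copy()
for axis in (0, 1):
    blurred = sum(np.roll(blurred, s, axis) for s in range(-(k // 2), k // 2 + 1)) / k

print(sharpness(sharp), sharpness(blurred))
```

A real detector would have to distinguish "dirty lens" from "plain scene" and "out of focus", which is where the actual engineering effort goes.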


The Pixel 2 has an extra 8-core processor purely for processing camera images.

https://www.blog.google/products/pixel/pixel-visual-core-ima...


I remember a long time ago when I first heard about OLED displays I was promised that they could be effectively transparent, and this is the first time I've heard of this property actually being used.


I really hope full screen coverage (and later foldable) phones will lead to smaller high end phones again. Even the Galaxy A class phones have become massive. And anecdotally, starting last year a lot of my friends that upgraded their phones have started to complain about the size. So I hope the mainstream has found its limit and the manufacturers will react.


I wonder if FaceID will work underneath the OLED, since I assume the OLED screen may distort the dot projector.

Assuming this works, wouldn't all smartphones end up looking exactly alike? I am pretty sure Xiaomi, Huawei, Vivo, and Oppo will push this to a sub-$400 price point within two years of it appearing. Assuming it becomes available in 2021, by 2023/2024 every smartphone on the market will look very much the same.

I am going to assume Qualcomm / ARM CPUs will improve to the point where CPU performance no longer matters, and everyone will be buying the same sensor from Sony.

It will literally be Apple's values (privacy), security (FaceID), and software (iOS) that separate the iPhone from Android. The hardware could be mostly the same, but the iPhone would cost double the price.


> I am going to assume Qualcomm / ARM CPU will improves up to the point CPU performance no longer matters.

Unlikely. CPUs are hitting the limits of Moore's Law.


My guess is that whenever the camera is active, nothing can be displayed over it. In the demo videos I've seen, the display goes black around where the camera is.


Can Xiaomi be trusted? They make some stunning hardware.


Sure, about as much as hardware from any other manufacturer. Maybe Fairphone might be more trustworthy than the rest because they care about the supply chain and are not that popular (hence they're less likely a target), but other than that, I don't think I'd trust any brand over another based on their country. They all get their parts from all over the planet and the software is also from all over the planet.

I'd rather look at their profit model, which in Xiaomi's case is not just selling phones: they bake advertisements into the OS and are open about that being part of their profit model. Which would be fair enough if everyone understood how technology works and where their data goes, but very few people really do. For now, you can opt out and/or root or flash your phone, but this kind of thing usually becomes mandatory once the majority of people have gotten used to it and there won't be a big outcry. Making it mandatory up front affects everyone, but shipping it as opt-out gives the media little to talk about; then, once most people have left it on anyway, making it mandatory is a much smaller change.


I've installed LineageOS on my Xiaomi and it's great. Much faster, more stable and fewer bugs.


They can be trusted until they gain enough market share. /s


I hope all these future in-display cameras are within the status bar area as depicted and nobody gets the bright idea to put it dead center of the screen. I'm rather obsessive about display perfection and that little dot of shadow on a content area would be more than enough to drive me crazy.


I think that defeats the purpose? The main benefit that I see is eye contact in video chat, where you'd want the camera pretty close to the center. If you don't get that, why bother at all?


Same reason for notches and glass backs and removing headphone jacks and curved screens: fashion. Smartphone sales are all about being trendy no matter how stupid the trend is.


I've been super impressed by Xiaomi's products.

The other day I found a deal on a Mi A2 for only $150 and it feels like a $400-500 device. Plus it runs Android One, so pretty much stock Android.

I'm a bit concerned about getting Chinese spyware, to be honest, but AFAIK Xiaomi has been legit so far.


Do note that Xiaomi turns the OS itself into an adware platform. For now it's opt-out, but we all know how these things evolve.

Having a Huawei myself, I am not opposed to Chinese phones or anything. I'm just hesitant about any ad-supported tech, or tech that you can't own. I wonder if they will ever stop supporting rooting/flashing your phone, to stop people from undermining their profit model. Which would be fair enough if you understand how technology works and where your data goes and still choose to buy it, but (percentage-wise) almost nobody understands it sufficiently to make that choice.


On Android One too?


I think so, but I don't know. I don't own one ;)


I do and I've never seen that option


What did you think o' da fm-radio update, and when are we going to get an open-hardware collar 4 da 3.4mm headphone-jack?


Does anyone know if they had to create a new type of OLED for the required level of transparency or if all OLEDs were already transparent? The source tweet had a graphic but it wasn't too clear.


Apple and Qualcomm have been exploring in-screen ultrasonic sensors for TouchID.

https://www.tomsguide.com/us/apple-acoustic-pulse-fingerprin...


Wouldn’t there be artifacts (maybe diffraction) from a view through the OLED pattern?


I wonder if the marketers who invented "full screen coverage" as a selling point are measuring the ROI of their invention.


It's not a marketer thing, it's a people thing. TV went through the same thing, for the same reason. Practical or not, better or not, it looks and feels cooler to most, and that's enough for people to want it.


I'm sure they are and the results are pretty impressive.


I wonder if this was also stolen from Samsung. Samsung was actively working on this in parallel to their folding displays. If both of these key technologies have been stolen from their R&D Department that would be a cause for major concern.


I would be very surprised if every major cell phone manufacturer in the world wasn’t working on this right now.


This is an accusation without any proof.

Probably every major mobile phone manufacturer is working on this to solve the notch problem.


It's not an accusation, it's speculation. The theft of their foldable display technology is proven, so it's relevant to speculate where this is coming from.

https://edition.cnn.com/2018/11/30/tech/samsung-china-tech-t...


You are right, but this source could have been in the parent comment. Without it, it seems like just a silly accusation.


My guess is that it is actually using a Samsung OLED panel. So it is not really stolen.


Love how the ad copy is bigger than the infographic


[flagged]


Cite sources and references for accusations. This isn't Facebook.


I think he’s being sarcastic.


I keep seeing this repeated constantly, almost exclusively by Americans.

I assume these are the last years of US hegemony.


Judging by the attitudes displayed by Americans online (in general, though it could be a vocal minority), they owe a great deal to the smart Americans who worked their butts off in the last century. If it weren't for the groundwork laid by those brilliant scientists and politicians, the current attitude of "we rock, everyone else sucks and is inferior" wouldn't have netted them much at all.


We're all living off the corpse of our ancestors.


No notch required might have been a great value prop when the iPhone X was introduced. But today? I have the feeling the notch has become a status symbol that other phone manufacturers copy.

EDIT: I don't really get the downvotes. Look at the "top 2019 Android phones": I counted 7 out of 15 having a notch. Compare that with the ridicule and outrage when the X was introduced.

https://www.wired.com/gallery/best-android-phones/


Not really. Top-of-the-line Androids don't even really try to copy Apple anymore. Some copy it because they can't just ship the old "huge bezel at the top" design anymore, but e.g. Samsung leaped past it with mini-bezels and then Infinity-O, and Xiaomi is going for under-screen tech.

And it is guaranteed that Apple will move to under-screen too once it's mature. The notch was a stopgap; no matter how used to it you are, it's still a screen waster. Though given Apple's "we made that choice so it was right, we won't go back" attitude, I wonder if they will ever adopt an under-screen fingerprint sensor now that it works well. As great as face unlock is, there are many times when I want to unlock with my phone on my desk, and a fingerprint sensor is superior there; I was glad to get it back on my S10+ after two generations without it (the scanner having been on the back).


Apple won't move to under screen.

There is way too much going on in their notch compared to their Android counterparts:

https://www.iphonefaq.org/archives/976228


An underscreen camera was an Apple patent circa 2005. I always thought that this was going to be the killer feature of the iPhone, but that never came to pass.


I'm fascinated by your worldview. What happens to your status when a new iPhone comes out without a notch? And do older smartphones that didn't have a notch suddenly become status symbols?


I wonder what his reaction would be on seeing my friend ask to borrow my Huawei to take pictures, instead of using his iPhone XR, because of the camera quality.

Both are great phones, but there's more to the world than Apple, and some people don't seem to realize it.


The OnePlus 7 Pro is selling like crazy, so the notch is probably something people want to get rid of if they have a choice, which isn't a choice iPhone users really have.

The iPhone hasn't been a trendsetter in the smartphone business for some time now.


> Look at the "top 2019 android phones", I counted 7 out of 15 having a notch

Probably because it provides a useful function: more screen space. Don't forget the Essential Android handset was first with a notch, because of the extra real estate.


A while ago the notch was a status symbol, now not having it is.



