Samsung Unveils 15.6-Inch Ultra-HD OLED Display for Laptops (anandtech.com)
184 points by jseliger 87 days ago | 172 comments



If this is still AMOLED, which I assume it is, then I wouldn't buy one unless I intended to buy a new laptop every two years. LG's top-emission WOLED is still 2 years out, and I'm not convinced it has solved burn-in on TVs, where most of the content is moving pictures, let alone on PCs, where 80%+ of what we display every day is static.

It just seems the OLED burn-in problem will never really be solved, only delayed.

Maybe WOLED will one day become good enough. Joint research from Google and LG shows WOLED is capable of much higher PPI than what we have today, but the brightness just isn't good enough for phone and laptop usage. So WOLED still needs to increase PPI, increase brightness, and reduce power, all while trying to delay burn-in. I don't see that happening within 3 years.

MicroLED is still at least 5+ years away from appearing in phones or laptops. LCD, or QLED LCD, is the best we've got at the moment in terms of longevity.


I'm using an OLED ThinkPad X1 Yoga as my primary laptop, and I was afraid of burn-in, though almost two years in I have not had any problems.

Admittedly, I was very conscious of the burn-in issue from day zero and have the screen set up to hide toolbars etc. off the screen. I always go into full-screen mode with the terminal.

The colors are gorgeous and I feel such a raw sense of visceral delight when using the OLED screen.

At least at the two-year mark, I see no problems. Separately, I've had an OLED TV for 4 years and I have seen no burn-in problems there either.

I have wanted to switch to a newer laptop, but I dread going back to a non-OLED screen.

This is a great development, and if the choice is between being a bit more conscious about tweaking software settings to go fullscreen and hide toolbars vs. amazing colors, that's a limitation I'm willing to deal with.


Having to micromanage what you do so you don't burn in the screen seems like an endless source of anxiety for someone like me. No thank you.


Maybe it'd help to modify a window manager to very slowly change the position of things on screen. This would be especially effective with a tiling WM in dark mode, so there aren't any large areas of lit pixels.

There'd be some details to work out, like making sure you don't trigger word-wrap changes. One approach would be to leave a little border on at least one side of the screen, so you can move everything together horizontally without changing window widths. In the vertical direction, maybe you can get away with changing relative window sizes slightly.

(On the other hand, if you're mainly using text in dark mode, maybe your fancy OLED screen isn't benefiting you much.)
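
A minimal sketch of what that drift could look like, assuming the WM exposes some hook to offset its whole layout (the apply_offset callback below is hypothetical):

    import time

    BORDER = 8    # pixels reserved at the screen edges
    PERIOD = 60   # seconds between one-pixel moves

    def drift(apply_offset):
        # apply_offset(dx, dy) is a hypothetical WM hook that nudges the
        # whole layout. Bouncing inside the reserved border keeps window
        # widths constant, so word wrap never changes.
        x, y, dx, dy = 0, 0, 1, 1
        while True:
            x, y = x + dx, y + dy
            if x in (0, 2 * BORDER):
                dx = -dx
            if y in (0, 2 * BORDER):
                dy = -dy
            apply_offset(x - BORDER, y - BORDER)
            time.sleep(PERIOD)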


If burn-in is a serious threat to the usable lifetime of the display, I wonder whether the laptop could be configured to display a continuously varying image in an attempt to save the screen.


Screen savers won't help, because burn-in is a cumulative effect - it's not just reversible image persistence like on LCDs. In other words, it's better to just turn the screen off when not in use. But you can slash the burn-in effect dramatically, at least for non-graphically-intensive use (no picture or video display), by switching to a theme based on a totally black background, with dark green as the main color and yellowish or reddish highlights. Bonus points for the 1980s "Hollywood hacker" effect, of course!


You can, however, invert the image. Keep track of how bright each color of each pixel has been. It doesn't have to be exact, just an average.

Then you can adapt to the fatigue by brightening the pixels with the most wear. You can also display an inverse image to even out the wear.

I have vague memories of an application that did this. Guess it was in the plasma days.

Though this is something that could be done in hardware. Judging from this clip https://www.youtube.com/watch?v=nOcLasaRCzY it must already be done on TVs, since letterboxing didn't create any artifacts. But it doesn't seem to be done on a per-pixel basis. Not sure why that would be that much more difficult.
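
Roughly the bookkeeping that would take, as a sketch (the array shape and the 10% boost ceiling are made up, and a real implementation would live in the display controller, not in Python):

    import numpy as np

    # Approximate per-subpixel wear, accumulated from the frames we emit.
    wear = np.zeros((2160, 3840, 3), dtype=np.float64)

    def account(frame, seconds):
        # frame: float RGB in [0, 1]; wear grows with emitted light x time.
        np.add(wear, frame * seconds, out=wear)

    def compensate(frame):
        # Brighten the most-worn subpixels relative to the least-worn ones.
        # The clipping is the catch: you trade peak brightness for
        # uniformity, and the compensation itself adds yet more wear.
        spread = max(wear.max() - wear.min(), 1e-9)
        relative = (wear - wear.min()) / spread
        return np.clip(frame * (1.0 + 0.1 * relative), 0.0, 1.0)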


> Keep track of how bright each color of each pixel is.

The monitor does not have that capability - if it did, it would be able to compensate for burn-in exactly. You would need to do weird things like point a camera at the monitor and have it diagnose and re-calibrate itself, but this would not be anywhere near exact so you'd still have quite a bit of noise. And everything you do to try and "solve" burn-in, other than turning the screen off, just adds even more burn-in!


You can sorta meet the burn-in halfway with a smaller number of thermocouples (temperature is the primary component of burn-in AFAIK) and a wear model related to cumulative energy input. I've been looking to prototype this, but I've not done the work to acquire a panel I could prototype it on. It'd take quite a bit of fast memory, but not an outlandish amount.

I hope I get a chance to take a poke at it before too many patents are filed.
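
A sketch of one plausible shape for that wear model: cumulative drive energy accelerated by an Arrhenius-style temperature term (the activation energy below is a made-up placeholder; a real value would be fitted to panel measurements):

    import math

    K_B = 8.617e-5   # Boltzmann constant, eV/K
    E_A = 0.3        # activation energy in eV; placeholder, not measured

    def wear_increment(drive_level, seconds, temp_kelvin):
        # drive_level in [0, 1]; hotter regions (interpolated from the
        # sparse thermocouples) degrade faster.
        return drive_level * seconds * math.exp(-E_A / (K_B * temp_kelvin))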


I'm not talking about external sensors. You already know the brightness of each pixel from the color information and the panel brightness.


Great idea! Why hasn't someone already invented this screen-saver type of software?


(That was the joke)


How bad is it?

I ask because I'm interested in a tv and the prices are starting to enter the realm of sanity (IMHO).

I have a 2010 plasma, and people said it would suffer from burn-in. The only issue I ever had was when I left a game or menu up for several hours and a shadow image was left. The smudges went away pretty quickly once something else was displayed, and all is well 9 years on.

So ... does anyone know how bad the OLED problem is? Is it a bit overblown like the concerns about SSD writing shortening their useful lives? Or is this really something to be concerned about, and the industry is trying to brush it under the carpet?


I watched this burn in test a few weeks ago. It looks like a great test.

https://www.youtube.com/watch?v=nOcLasaRCzY


These guys are doing god's work. Their website was an impeccable resource when I was buying my first new TV in a decade about a year ago. Absolutely unparalleled quality of reviews for different TV models in the market. Makes the job of cutting through all the marketing B.S. and finding the TV model you really need a whole lot easier.


The 2016 LG TV models started showing some form of burn-in during 2018. But as far as I can tell, burn-in is still only seen by a very small minority of users. (Otherwise you would have read about it in the news.) So if you are not picky, OLED is the way to go. The 2017 and 2018 models, as well as the 2019 ones, have all improved in longevity, and so will the top-emission WOLED coming in 2020/2021, but it will take some more time of testing to be sure of it.

Given that the price of WOLED panels is falling below the price of LCD panels, there will be a price point for an OLED TV where it is well worth buying one even if it might fail in 4-5 years' time. And that price point will be different for everyone: some consider $1200 for a 55" good to go, some wait for $1000, some will only buy one at $800.

But if you intend your TV to last for 10 years, then I suggest you wait a little longer.


Personal opinion, from what I've been reading around and from owning an OLED TV. I am in no way an expert.

I've owned an OLED for a year and a half now without issues. From what I read online, burn-in issues might occur after displaying static images for a very, very long time. E.g., if you watch CNN for a long time every day, its static elements might burn in. That's my understanding of it. Supposedly LG also has a fix in the new OLED models that makes burn-in even less likely.

Other than that, which hasn't happened to me at all, the image is considerably better than on any LCD/LED model out there.

Again this is just my personal opinion/experience.


Cool, thanks. Was thinking of the newest LG models, probably a C (rather than B) line one. The new tax year is still a couple of months away though!


For regular TV usage, it's probably fine, but I would not recommend buying one to use as a PC monitor. I did that about a year ago, and I really regret it.

Burn in patterns are everywhere, and especially visible in red channels, so the overall image has a greenish hue. Things like taskbars, tiling window borders, browser titlebars, game HUDs (HP bars especially since those are often bright red), have all left their permanent mark on the display, and have become super distracting.

I'll probably get another OLED eventually to replace this one (I'd try to sell the current one but kind of doubt anyone would pay decent money for it given the state it's in), but will probably use it exclusively for movie watching and maybe occasional gaming sessions. I've been completely spoiled by the movie watching experience of the infinite contrast ratio in a dark room. Nothing else is even comparable. But I'll have to babysit the next one much more carefully to make sure it doesn't suffer the same fate as the one I have.


Huh... thanks for the advice. Sounds like you might have a warranty claim there?

(YMMV depending on which country you live in I guess)


I've had an LG OLED for 2 years now and have had no problems with burn-in so far. Before that I owned a Panasonic plasma screen, and while it's still pretty nice for the bedroom, it does have Netflix pretty badly burned into it in certain color spaces.


I had a Panasonic plasma before, and have had an LG C7 OLED for a year.

The plasma had some burn-in of channel logos that was visible on a grey background only.

Zero burn-in after 1 year with the OLED.


I thought OLED burn-in was something I'd just have to deal with down the road when I bought a Samsung Galaxy Tab Pro S (Windows tablet), and that the machine was on a two-year countdown before becoming annoying.

So far it's been about 30 months with no issues. I've taken no active steps to fight burn-in, but I also haven't disabled the default screensaver (a wobbling word-art thing that displays the tablet's model name after about 5 minutes of disuse). I've been checking for burn-in periodically, as I'm suffering from a bit of upgrade-itis, and that would be a fantastic excuse.


No worries, the batteries don't last long anyway.


You're right. For how nice it is, OLED is really a stopgap tech.

The reason Samsung never invested in OLED TVs is cost and poor longevity.

MicroLED is the same concept as OLED but without the degradation. OLED was simply easier to engineer given current tech, versus manufacturing tiny LEDs.

OLED will be supplanted by MicroLED. However, if Samsung can make these displays cheap enough, they will be nice in the interim. I don't expect to see MicroLED in mass-market devices for quite some time.


600 cd/m2 might be just enough for outdoor work. Current MBPs claim 500, and I can only work on one outdoors with a bright IDE theme.

My iPhone pushes up to 725 and it's usable with sunglasses at the beach.

Lenovo's discontinued OLEDs had a pissy 330 cd/m2...


What you want is a laptop with a matte screen. Not a "glossy treated to look matte/anti-reflective" screen, which is what you find on most expensive laptops like Apple's, but actually matte.

It's kind of hard to find because it doesn't sell well, since the colors are very much toned down on it, but you can have the sun right behind your screen and it won't matter. I still have one from ~2012 (from the French store LDLC; they are rebranded custom-made Clevos) that I gave my GF, and for work purposes, always being at top visibility, that screen destroys anything else I've used since.

I'm genuinely weirded out that, as far as I know, no major work brand like XPS or MBP offers a true matte screen as an option anymore...

PS: if you haven't seen one before: there's truly no reflection or competition with the sun at all, and the colors truly are washed out.

EDIT: I would at least like to know why I'm being downvoted. A true matte screen is vastly better when facing sunlight, which is what I'm responding to.


AFAIK Lenovo still provides matte as an option for their T- and X- series.


I love my custom ordered 2010 Macbook Pro with matte screen. It's a shame they took the option away with the Retina models. All I really want is for someone to make that same machine, but with up-to-date components.


Agreed, I have a MacBook Pro and the screen is ridiculously reflective - I literally have no idea why Apple thought that was a good idea.


What about when it's in shadow, or on an overcast day?


It looks just fine and plenty bright, although it is clearly far worse looking than your regular glossy screen (since those are a glossy screen's perfect conditions). So in that scenario I would prefer a glossy screen because the colors are better, BUT I would have no problem with a fully matte one.

And in a fully dark room, I would again say the matte is superior to glossy, because its brightness feels much less abrasive to the eyes (but my eyes tend to be sensitive to light - I'm the kind of person who needs to change his screen brightness depending on conditions - so for people who don't care about that, I guess glossy would be better due to the colors).


The matte screens I've worked with are plenty bright for such situations.


> I can only work on it with bright IDE theme.

If you can set your pointer color to inverted, it's a tremendous help (even if you only occasionally rely on it). Also, of course, make it as big as possible.


I recommend using polarised sunglasses. The macbook screen is polarised the correct way for most sunglasses, so it doesn't dim your screen half as much as it does your surroundings.


Something like ePaper would be fantastic for outdoor work.

Sadly, currently, the refresh rate is very slow and it has no viable color options.


Reading this right now on the beach on my iPhone XS and wearing sunglasses


Currently -5°C and cloudy, with half a meter of snow, in Oslo. Your comment makes me jealous.


Considering that my old phones got burn-in after a year or two, that Windows has far more static elements than Android, and that I don't replace my laptop every year, OLED would be a horrible choice for me.


Have had an S3 for 3 years, an S5 for another 3, and now an S8 for one, and haven't had any burn-in issues.

Though I mostly find max brightness to be uncomfortably bright, so I usually keep it down in the 10-30% range (auto).

My gf likes to illuminate half the block with her phone though, and we both got the S8 at the same time, so will be interesting to compare.


> that windows has far more static elements than Android

Android has a status bar at the top that's displayed over most applications. And applications on Android are far more likely to be maximized, which makes the common design elements and widgets more likely to be in the same place.

To the best of my knowledge and research, though, the current generation of OLED screens and controllers manage to avoid image retention.


OLED screens suffer from burn-in due to pixels wearing out unevenly on the screen. The effect is cumulative and can't be fixed by using screensavers. I wouldn't call it "image retention".

I have an OLED TV for watching shows and movies and try to be careful to not display static elements for too long. I would never use an OLED as a PC screen for fear of burn-in.


My Moto Z2 Play showed burn-in of the top status bar after a year. My Moto Z did the same. OLED burn-in is not a solved problem.


IIRC, many rely on pixel shifting every time the screen sleeps. Not sure if the same strategy would work on laptops.


It shouldn't be that noticeable if scaling is turned on. Could be a pretty good strategy for laptops.


My three-year-old LG OLED TV has burn-in.


My three-year-old OLED TV does not, and I use it as a PC display 8h/day. Are you sure you don't disconnect it sometimes?


Really? I've used mine for a little over a year as a PC display and the burn-in has come to a point where I'm thinking of just throwing it out because nobody in their right mind would pay money to get it off my hands.

Do you happen to have your brightness set really low or something? I left mine around the default, thinking it'd be safe, but that totally didn't work out.


Differences in climate could play a part in this as well. It would be difficult to call, but it's possible that cooler locations fare better than the tropics. I have no data to support this, but I would be very interested if there is data on the lifespan of electronics like this in different climates.


My 6-year-old OLED phone, which I used daily until last year (I would say 'used heavily', but compared to some people it might have been moderately), did not have any such issues.


Switch to a black color scheme; it saves power and avoids burn-in.


"Adapt yourself to the problems of your products" is not motivating me to buy said product.


It can, if there are other advantages, e.g. OLED's infinite contrast, better colors, etc.

I've had an OLED phone since the original Galaxy Note and an OLED tv for 15 months now and I'm never going back to LCD.


Does Windows not have burn-in prevention?


I'm sure you've heard of screen savers.


I don't think that does an adequate job here. In most workflows, static elements such as the taskbar will still be displayed in the same place most of the time the display is on.

As I understand it, plasma displays (which now seem defunct?) had issues with burn-in if presented with the same image for a long period of time during a single instance, while from personal experience AMOLED burn-in can occur even if the same image is displayed repeatedly over a long period of time. Screen savers would appear to mitigate the former, but not completely avoid the latter.


And I'm sure you've heard of the taskbar.


The whole point of screensavers is that they actively display something different than what the screen normally has. Burn-in is not a problem that originated with OLED, and last time anybody had that problem, screensavers were the best mitigation they came up with.


OLED burn-in happens because the organic compounds used to produce the RGB light degrade at different rates for the three colors. And the effect is cumulative: for OLED, having an image on the screen for 30 minutes is the same as having it on the screen for 5 minutes, 6 times.

If the Windows taskbar is present 95% of the time that the computer is being used (with the other 5% being fullscreen applications), and a screensaver is active 40% of the time the screen is on, then the taskbar is still visible for 57% of the time the screen is on.

Maybe the tech has changed, but as of my first OLED phone 7 years ago, blue (the same color as the taskbar's Windows icon) was the fastest color to burn in.

Newer phones shift the navbar and status bar items a few pixels every minute. But the start menu icon is too big for that, so after a year or two, every time the user watched a movie in fullscreen, they would get a blue blob in the lower left corner.
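
Spelling that arithmetic out:

    screensaver_share = 0.40      # fraction of screen-on time
    taskbar_share_in_use = 0.95   # fraction of active use showing the taskbar

    # Burn-in tracks cumulative lit time, so a screensaver only dilutes the
    # static content; it never undoes it.
    print((1 - screensaver_share) * taskbar_share_in_use)   # ~0.57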


>screen savers

You should just turn off the screen when you are away


One of the only OLED laptops I'm aware of, the Lenovo X1 Yoga Gen 2, has consistent, widespread problems with flicker. It seems that every display needs replacement every year or so due to flicker at either high or low brightness. I'm on my second panel, and it's already failing at brightness above 50%. I'm hoping Samsung figured this out by now.


I have an Alienware 13 r3 (QHD OLED Touch display) and so far I've had absolutely no issues. The screen is just amazing. I've had the computer for over 1 year and the screen is as good as the first day. This is a gaming laptop, so I've tried lots of games and I haven't seen any type of flickering.


I'm really annoyed with my 13 R3's OLED screen - the red pixels are much faster to go from zero to dim than the blue ones, which themselves are faster than the green pixels. This leads to a reddish/magenta smear on the leading edge when something gray moves on a black background.

Same stuff that guy over there is complaining about: http://forum.notebookreview.com/threads/alienware-13r3-oled-...


Must be a nightmare when installing a new Linux distro in text mode and having bird seeds as characters.


Currently that is an issue, yes, but Linux 4.21/5.0 is adding a console font designed to be used on HiDPI screens at the TTY.

https://www.phoronix.com/scan.php?page=news_item&px=Linux-4....


Yes, it feels almost as if you need a microscope.


Pixel density is still pretty low, so that wouldn't be a problem; I would be concerned with an 8k screen.


Does such a resolution even make sense on a 15.6" screen? Most likely you won't see any pixel difference from a lower resolution (unless you look at the screen from very close range).


Actually, it does. It essentially looks like print, including things like readable 4pt fonts. Now, do you want to burn your battery on the GPU needed to drive that thing at full speed? That is an interesting question. Does the lack of a backlight compensate? If I were a betting person, I would guess this is the display of the new MacBook Pro.


I own a Dell XPS 13 with a 4k display running Linux. Fonts are clearly much sharper (they maybe even feel darker) and more pleasant to read than at 1080p or even 2k. Although I see your point: for most practical purposes it doesn't make a difference, I think, but I really love 4k when I'm programming. These days I'm not programming much, so I switched to 1080p as a little experiment (and I'd gotten fed up with sloppy GNOME performance), and I would say I don't notice much for general tasks.


There are two generations of the Dell XPS 13 display: at first they had a 3200x1800 panel, then they switched to 3840x2160 last year.

In my opinion 3200x1800 at 13.3" is already more than enough (I own one), but I guess marketing wanted more.


Yes, I also have the old 2015 model (3200x1800). Definitely, good enough in my opinion too.


Are they in the range where the UI size can be doubled without becoming too large?


Yes. 2x worked perfectly for me at 4k, and I think it's the default for HiDPI settings. When I switched to 1080p I had to manually increase font sizes (not overall scaling) because the default size wasn't sharp/readable enough. It's pretty cool that 4k now works so well out of the box on Linux. It was a nightmare in 2015.


You'd definitely notice the difference. Text is much sharper on my 4k screen, regardless of its size.


My bet is that Apple’s first ARM machine will be a MacBook with a display like this in a jet black case, 12-core “X-series” chip, Face ID.

Defaults to dark mode to help skirt any burn-in issues.

Announce it at WWDC and even at $2500+ devs will go nuts for it.


I want a 120+Hz display, and I will gladly give them whatever they ask for it.


The only downside is that they won't let you uninstall Facebook from it.

All jokes aside the specs for this thing look great. 15.6" and 4k. I want to see this in production laptops ASAP.


And two mail clients, a good one and the Samsung one

And two calendars, a good one and the Samsung one

And two browsers, a good one and the Samsung one

And two text editors, a good one and the Samsung one

And a camera application, a good one and the Samsung one

Why do they bother?


As far as I've read, because they wish they didn't have to depend on gapps. To the extent you wish Android were a little freer of Google, you should support that idea.


So they make worse-quality apps yet still include GApps. That's a brilliant plan.

How about they stick to quality hardware and let the pros deal with the software.


Because there is a huge fucking pile of money (Samsung revenues) and every manager works hard to convince their manager that their product is worth a piece of it. See Dilbert


Do they include the Samsung ones on Chinese market phones, where there's no GApps?


No. Xiaomi and Huawei have their own apps as well


So, yes.


Two roads diverged in a wood, and I—

I took the one less traveled by,

And that has made all the difference.


And I ate seeds,

and got stung by bees

for three long days

until Park Rangers finally found me.


I have this on my Dell XPS 15. It's really nice, except for the fact that I need to have everything magnified 200% for anything to be readable.


Many years ago, typical laser printers offered 300DPI. Then new engines came out that could print at 600DPI. Twice as many pixels in each direction, so four times as many pixels per printed character.

No one complained that everything had to be magnified 200% to print on the new engines - even though that was going on behind the scenes. Instead, everyone enjoyed the new sharper and crisper printouts. And it was wonderful.

The same is true for displays. When you have a display that can run at 200% scaling, that gives you four times the pixels per character compared to your old 100% display.

Everyone's taste is different, but for myself, give me all the pixels I can get. I don't like pixels and I don't want to see them. I want to see the text and graphics I'm working on. If the display has more pixels to represent those, to the point where I can't even see the pixels any more, that's what I want.


I don't appreciate the natural resolution of my XPS 15's 4k screen because my typical use case is to use an external screen. For that use case, the laptop's own screen is totally irrelevant.

Without an external screen, 4k resolution makes everything too small. But setting scaling on the laptop's screen interacts badly with the external screen, which I use almost all the time and do want at 4k. So I end up with the laptop being that much less usable when I'm using it by itself.


That's actually a software problem and not a hardware one. But it's also one of the main reasons I bought a 1080p laptop instead of 4k one.


The main reason is to get better pixel density. That's what Apple's MacBooks do by default, and it's really gorgeous. I couldn't go back to full resolution with everything being super small.


You do not have "this", an OLED panel, in your XPS 15.


>>> All jokes aside the specs for this thing look great. 15.6" and 4k. I want to see this in production laptops ASAP.

>> I have this on my Dell XPS 15.

> You do not have "this", an OLED panel, in your XPS 15.

That's... uncharitable. I also have a 4k screen on my XPS 15. Why would I care whether it's OLED? 4k on a 15 inch screen has been in production laptops for years.


> Why would I care whether it's OLED?

For the reasons laid out in the article. Which is about a specific OLED display which will be used in laptops like the XPS 15 later this year. Those laptops are in development but not released yet.


200% scale is really easy to do. It's when you need something like 150% that it gets hard.


125%, 150%, 175% - they're all available out of the box on Windows 10, without even selecting a custom DPI.


Now try actually using any of them, and you'll see what the GP meant. For example, try opening the (built-in) Device Manager.


It's "good enough", honestly. It will never work perfectly with older apps, which is why Windows has started to default to just bitmap-upscaling them wholesale instead in many cases. And there are still nooks and crannies in Windows itself that are not high-DPI aware (another one is the font installer). But for day to day use, it's not something that you notice.


On the Mac I used SwitchResX to add a new scaled resolution like this:

2560x1440 (native) * 1.5 = 3840x2160 HiDPI / 2 = 1920x1080 (effective resolution)

Obviously it is not as good as a real panel, but it looked good enough to me.

I did the same thing on Ubuntu using xrandr. It was pretty good when it worked (sometimes it didn't, even using the exact same command).
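
For anyone who wants to try it, the invocation was roughly this (the output name varies per machine; eDP-1 is just an example):

    xrandr --output eDP-1 --mode 2560x1440 --scale 1.5x1.5

That renders a 3840x2160 framebuffer downsampled onto the 2560x1440 panel; combined with 2x HiDPI UI scaling, you get the effective 1920x1080 described above.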


Nice. I just bought a System76 Linux laptop with a 4K display (and a 1070 GPU), and having a 4K display is really nice. I almost didn't upgrade to the 4K display, and I am glad I did.

I think large TVs are great for supporting group-watch events, but for personal viewing something really hi-res and nearby is better, IMO.


Ugh, 16:9.

When will laptops stop coming with 16:9 screens? Do people really buy laptops to watch movies, or for productivity?


Most consumer laptops are probably used quite often for watching movies, but this shouldn't be an issue. My laptop has a 3:2 screen and I don't even notice the black bars when watching a video. An OLED display would be even better for this, as the bars are truly black.


I mean, how many movies can you watch in any given day? I would think 80% of the time is spent browsing the web.

My screen is 3:2 too, and it's many times better. Apple's laptops (sorry, I forget the exact ratio) are acceptable too; it's just that 16:9 doesn't make any sense to me, and I have no idea why more companies aren't providing alternatives. I guess it's a non-problem and I'm the only one who sees it this way.


> I mean, how many movies can you watch in any given day? I would think 80% of the time is spent browsing the web.

I know a few people who mostly use their laptops to watch movies / netflix. The rest of the day, they're not using the laptop at all. (there is some web browsing of course, but not much).


Apple uses 16:10 I believe.


I thought OLED panels couldn't be used for laptops because of burn-in. The article says they don't know if this panel is any better, which is the only question that matters.


Guess we'll see in a year or so. My OLED phone had burn-in after a year, and I can see the Android status bar when I watch full-screen videos.


Is your phone from Samsung? Is it new?


It's a Moto Z2 Play. The previous one was a Moto Z. Both showed burn-in after a year of use.


From what I know, the higher end phones don't suffer from burn-in. Your phone is quite cheap and they do have to skimp on something. It's generally a good idea to get a midranger with an LCD instead of OLED, for that reason. You lose some image quality but you avoid these issues.


The Moto Z cost north of €700. I would not call that cheap.


Well, considering how much an iPhone XS costs, the definition of "cheap" for a smartphone has migrated North the past few years :)))


Why do all these OLED displays have to be 4k nowadays? I was shopping for TVs recently and found exactly one 2k (1920px) OLED screen for about 700 euros when looking across three countries, which was slightly above budget but I was considering it. If there had been more than just this one choice, just to have some OSes and features to choose between, I might have gone for it. But 4k OLED costs thousands at no added benefit, so now we have a crappy (girlfriend-selected) 4k backlit LCD, where you can actually notice the backlight turning on selectively on the parts of the screen that have non-black pixels.

It's not as if I can see individual pixels from across the room. Same with a laptop (I don't see my pixels on a 16" 2k screen), a computer display (I don't see pixels on a 23" 2k screen), and a phone (I don't see pixels on a 5.5" 720px screen). I'm pretty sure that if you can see individual pixels at those sizes, you're sitting too close.

On my phone I really just want my 720p OLED display back: there, too, I have a crappy 2k LCD now, because OLED is too expensive at the resolutions they make them in nowadays. And I was hoping to see OLED laptops soon, but it looks like we first have to get every single application rewritten to support more-pixels-than-you-can-actually-see zooming modes, then bring down the price of these panels, and then I can buy one.


I'm not sure about the TV across the room, but I certainly see the pixels on a 2k 23" monitor. I'm using two right now, and every second I'm itching to get back to my MacBook's Retina display. Everything else about these monitors is fine... except pixel density.

In Windows this is better (for me) because I personally like the subpixel rendering there - it's aligned to pixel boundaries more often and looks clearer to my eyes.


I've got a 27" 5K iMac. At work, we've these 21.5" 1080p displays. I just can't use them, the text just looks so ridiculously blurry. Once you've seen HiDPI rendering, you can't go back.

I'm doubting GP has ever worked with HiDPI content for an extended period of time.


A few things:

1. TV sizes are getting bigger and bigger. You mightn't notice the pixels at the size of your current TV, but your current TV might be considered small by the standard of increasing screen sizes.

2. You mightn't be able to see the pixels on your 16" 2k screen or on a 4K TV, but your eyes aren't the only ones on the planet. To my eyes, 2k is old news; my 13" has 1600 pixels vertically. My left eye can barely see anything thanks to keratoconus, but with both eyes open I can still discern the difference between 720p, 1080p, and full 4K content on my TV — mainly in games and Blu Ray movies.

3. You don't need to rewrite every single application to support HiDPI displays — Windows already has scaling baked into the OS (and has for years), and the majority of apps support the feature with no issue; macOS has had Retina display support since late 2012 and only apps ported from Linux exhibit any weirdness.

4. All of the Galaxy phones have had OLED displays standard for years, long after 720p was considered a high resolution for a phone (back in 2011). Even some of the non-flagship models have HiDPI OLEDs and aren't particularly expensive.


The problem is that 15.6" is physically too small to leverage that much resolution real estate in comfort for the human eye.

I do high-definition photography (think gigapixel panoramas), and editing anything like that on a 3840x2160 laptop (a high-end laptop) is suffering.

I always look forward to getting back to my 32" monitor.


The main point here is the improved rendering of text, where going from 140 PPI to 280 PPI is quite noticeable - many developers will appreciate it.

It should be fine for the kind of work you're doing, though. Are you using tools that don't support DPI scaling?


I dunno, in my experience going from 1080p to 4K is a huge benefit for photos as well. While I wouldn't necessarily notice the extra details when looking at a photo from a distance, when I see the 4K next to the 1080p displaying the same photo it is noticeably more "realistic" somehow. Considering that even phone cameras take 12MP+ pictures, that extra detail has to count for something. For me pretty much everything looks noticeably better on a 4K display. Granted I'm slightly under 30 but I do wear glasses, so I dunno how much eyesight affects things.


Same thing here. Also, thumbnails and avatars become a lot more useful, because with higher DPI you can make out a lot more.


I used scaling and played around with its different options.

For developers working with proportional code - unless you have super-sharp eyes and scale your Sublime editor down to sub-1mm characters - I don't see it making much difference.

I physically feel comfortable with text characters of a certain minimum size. Beyond 4k, it really doesn't matter how many pixels or colors are used to render the font or image on a 15" screen. Making it too tiny or super-true-color-ish doesn't make much difference on a laptop screen that is typically only used for "temporary" on-the-road work.


It's about being able to use more pixels to render things like curves and serifs in fonts mainly - things that subpixel hinting attempted to address on lower DPI displays can now be done much better and without the blur tradeoff by using more pixels. At the very same size, fonts look noticeably better because more pixels can go into those tiny rounded edges.


I disagree - I have a Dell 15.6" with a 4k screen, and there is a noticeable difference compared to my work 15" Retina Mac and my Pixelbook.

Of course, you're not supposed to run at 1x UI scaling.


My 4k 15" in 1080p is much sharper looking than a native 1080p screen, but I prefer to run 4k with 2x scaling.


I wish laptop companies would focus on battery life more. I would love a laptop that lasted 4 days on a full charge.


I'm not sure how much laptop companies are in control of the battery life of laptops. I mean, they could always add more batteries, but those add weight, and there doesn't seem to be a market for that. x86 CPUs, screens, and WiFi connections all consume a certain amount of energy. I think most companies are doing the best with what they have to work with.


Then switch to ARM already. I've been using a 1st-gen 9.7" iPad Pro for most things, including Google Docs/web/email, and it's plenty fast. And it still has 46% battery after 3 days. For development I have an 8-core Xeon box with 64 GB of RAM to fall back on or remote into.


Lenovo and Samsung both launched ARM laptops last year, supposedly with 20h+ of battery life.


Where's the problem with charging your device overnight? As long as it gets you through the day I think it's fine.


It gets old. If I could, I would wire up my desktop to charge all my devices wirelessly, but that would probably fry my nutsack or worse.


Given Samsung's stellar reputation for sourcing batteries that don't combust, I'll pass on that.


Strange question: say I want to buy this display for a hobbyist project, what's the best way to do this?


You may be able to find some at panelook.com


You can't.


That does look slim. I wonder if we will start seeing screen notches on laptops as well.


Has anyone tried a screen from an iMac (https://support.apple.com/kb/SP760?locale=en_US) or any Apple computer?

Even my 5-year-old iMac's screen is better than this new Samsung screen.

I have a Windows PC as well (for gaming) and I'm using the Samsung CHG70 (https://www.samsung.com/us/computing/monitors/gaming/32--chg...).

It cannot even be compared with the screen from the iMac. It's blurry, with wrong colours and low quality.

Even now, I don't know how no other company can replicate Apple's screens.


Have you looked at this new Samsung screen in person, or what makes you say that?

How about the HDR screen from the X16G?


I haven't, but I've tested a lot of Samsung screens and I own one of their latest good ones, the CHG70 (32").

It sucks so much you can't even compare it with the Retina iMac (5k) screen.


The display looks amazing, however the computer itself looks horribly ugly.


Maybe make LCD laptop screens decent first?

It's hard to buy an affordable laptop with a decent screen; basically all of them are crap with garbage gamut that literally hurts your eyes.

It's a shame laptop makers are willing to add those useless entry-level discrete GPUs yet are too stingy to spend 20 to 30 bucks more on the display.


But the addition of a 'dedicated GPU' is what drives sales, unfortunately. I don't think nVidia's x20/x30 mobile GPUs improve anything. Anybody who actually needs power would opt for an x50 or higher, and anyone needing less would be served by integrated graphics.


Give me micro LED or give me death


The pixel density is too low for 2019. It would be good for an 11-13 incher, but at 15" it will be too pixelated.


On my desktop I have a 27" 4k monitor (3840×2160), and I'm a bit intrigued as to how this resolution could not be enough for a 15.6" screen. What am I missing here?


Number queens. The same people who have such advanced vision that they can see the difference between 100 and 120 fps. After a certain point, and even more so if you're at a safe distance from the screen, more pixels make no difference to the naked eye. But you'll never convince number queens.


Why so bitter? There is no need to call people names because they have better eyesight than yourself.


I'm thinking the same thing. I have 28" 4k monitors, and my 2560x1440 14" laptop doesn't bother me. I can't imagine complaining about 4k on a 15.6 inch.


> what do I miss here?

/s


do we still have this here ? :)


After all these years, the perfect resolution (for me) is still 1920x1080 - whether it's 13", 15" or 17" ...


4k with 2x scaling is much sharper and easier on the eyes.


I'm finding the exact opposite, but this could be because I'm okay with seeing actual pixels, and might even be missing them. I started on an Apple ][ clone with a 40x20 console resolution (80x20 with PR#3), and later a Hercules monochrome card for my PC.


Serious question: what is the point of such high pixel density? I assume you cannot tell any difference between 2k and 4k at the distance a laptop is normally used. Is it purely marketing hype? Or am I missing something?


Full HD is ~140 PPI: that's not low, but not so high that more wouldn't be a noticeable difference. Not really needed for general use IMHO, but also not just hype.


> I assume you can not tell any difference between 2k and 4k from the distance a laptop is supposed to be used

My 2013 13" macbook pro retina has a resolution of 2560x1600. I absolutely can tell the difference compared to 'Full' HD, just in crispness of image, beautiful font rendering etc.

So a 4K screen at 15.6" doesn't seem outrageous to me.


Also, OLED displays have fewer subpixels, as far as I know. I think LCDs usually have red/green/blue for every pixel; OLEDs don't. So on an OLED display, you need a higher resolution to make it look as sharp as an LCD.


Font rendering is the big one. It has a massive impact on the readability of complex characters at small font sizes.


Your question is entirely valid -- there is a point of diminishing returns. The GS8, for instance, has a resolution of 2960x1440. By default it runs the screen at 2220x1080, and the overwhelming majority of customers will never know the difference. I have a Lenovo 720 with a UHD screen (like the one linked), and the vast majority of the time I run it at 1920x1080 to halve, and halve again, the load on the GPU from driving so many pixels (not to mention the occasional irritating software incompatibility with the resolution, leading to interfaces for ants). It is generally an imperceptible difference, and one of the very few times I go back to the native resolution is for an extended reading session of dense subject matter, where there's a minuscule advantage in type clarity.

There is a placebo effect as well, so someone with their shiny new ultra high resolution screen may feel that there's a compelling difference, but I have doubts it would hold up under legitimate blind testing. We see the same thing with color perceptions now -- that OLED and a good LCD (e.g. iPhone XR) will be calibrated to close to identical color performance, and in blind testing will likely be very difficult to separate on that basis, but it's very common to hear OLED users laud color performance.

There is no question that OLED absolutely owns when it comes to contrast, especially in dark environments where the difference between LCD grey and OLED black is without question.


My understanding is that this is PenTile, meaning each 'pixel' is either R+G or G+B, not the R+G+B pixel you'd expect. In practice this means 4k brightness detail but only 2k colour detail.

That aside, 1080p isn't past the threshold of imperceptibility. 4k totally is, but doubling a common resolution means you get all the benefits without non-integer scaling issues or changes in screen real estate.
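
Back-of-the-envelope subpixel math for that claim (a sketch, treating per-channel resolution as the measure of detail):

    # PenTile RG/BG: every pixel has a green subpixel, but red and blue
    # each appear in only half the pixels.
    h, v = 3840, 2160
    green_res = (h, v)        # full 4k luminance/green detail
    red_res   = (h // 2, v)   # 1920x2160 -> the "2k colour detail"
    blue_res  = (h // 2, v)

    print(green_res, red_res, blue_res)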


The pixel density always needs to be considered together with the typical viewing distance.

A UHD 15.6" display with a viewing distance of 34 inches (86cm) results in 170 pixels per degree which is quite high.
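
That figure falls out of a short calculation; a quick sketch to check the number:

    import math

    def pixels_per_degree(h_px, v_px, diagonal_in, distance_in):
        ppi = math.hypot(h_px, v_px) / diagonal_in
        # width of one degree of visual angle at the screen, in inches
        one_degree_in = 2 * distance_in * math.tan(math.radians(0.5))
        return ppi * one_degree_in

    print(pixels_per_degree(3840, 2160, 15.6, 34))   # ~168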

It's a very subjective thing. Some people value it higher than others.

For example going from 1920x1080 to 2560x1440 on your 5.8" phone screen provides only limited value, yet many people still go for it.


I prefer laptops to be thin, light, and something I can carry around in my hands. 15.6 inches is too big for a portable laptop and starts to approach desktop-replacement territory. Rather, get a 14 inch laptop and plug it into a 4k monitor when you need the larger display area.

And since this is OLED, does that mean the Windows Start button and Apple's system menu would get burned into the screen? :-/


I hope some manufacturer (hello HP!) will use this slim-bezel 15.6" screen in a 14" body format, without those hideous number pads I have to buy and never use. The laptop in the article looks perfect; just add three physical buttons around the touchpad, as on HP's ZBooks.


I prefer laptops to be thin, light, and something I can carry around in my hands too - and 15.6 inches is the perfect size for a portable laptop.

It offers the best tradeoff of screen space to portability, and current thin-bezel 15.6" laptops fit in standard-size laptop backpacks.


Obviously it's not for you, but anything less is no good for my eyes.


Look at the picture of the laptop. Without top and side bezels, it has the same footprint as a normal 14" laptop.


Gorgeous display unfortunately married to sub-par industrial design.

The colors are all off. Black display hinges break the silver line of the chassis. Two black buttons on the left do the same. Big black touchbar type thing coupled with white keyboard keys. Small trackpad that's not color matched to the chassis. Also, 'SAMSUNG DISPLAY' branding trashes the beautiful edge-to-edge display.

The only ID conceit I understand is the rubber pads at the edges, since presumably there's no room to add them to the edge-to-edge display.

Aside from the display, it looks like a bad MacBook knockoff. Samsung can do better, they have on some of their phones.


> Black display hinges break the silver line of the chassis. Two black buttons on the left do the same. Big black touchbar type thing coupled with white keyboard keys. Small trackpad that's not color matched to the chassis. Also, 'SAMSUNG DISPLAY' branding trashes the beautiful edge-to-edge display... Aside from the display, it looks like a bad MacBook knockoff. Samsung can do better, they have on some of their phones.

This is about the display. The laptop it's attached to is probably just a proof of concept demo platform for the display.


Good point. I just looked at Samsung's lineup [1], and it seems they're mostly plasticky machines aimed at the low and mid end of the market. I thought they had some high-end parts like Dell does. I guess they don't have the platform to adequately show off the screen quality, so they just cobbled something together.

[1] https://www.samsung.com/us/computing/windows-laptops/


The platform is Dell, Lenovo, HP, etc...

Samsung is a big supplier of electronic parts.


Samsung is both a creator of consumer electronics (TVs, dishwashers, smartphones, notebooks, etc) and a supplier to other consumer electronics companies. They're massive. Hence my bringing up their notebooks, as I was initially confused that they'd be introducing this in their own lineup.

When you have unique access to technology, one strategic opportunity is to use it to move up the value chain from supplier to retail brand and create a differentiated product, winning market share. Samsung right now pretty much has a lock on non-TV OLED production, but is choosing not to use those panels in its own computer brand, instead selling them to other computer makers. It's an option they have but have chosen not to pursue... so that's interesting.


Samsung supplies Apple too.



