It just seems the OLED burn-in problem will never really be solved, only delayed.
Maybe WOLED will one day become good enough; the joint research from Google and LG shows WOLED is capable of much higher PPI than what we have today, but the brightness just isn't good enough for phone and laptop usage. So WOLED still needs to increase PPI, increase brightness, and reduce power, all while trying to delay burn-in. I don't see that happening in 3 years' time.
MicroLED is still at least another 5+ years away from appearing in phones or laptops. LCD, or QLED LCD, is the best we have got at the moment in terms of longevity.
Admittedly, I was very conscious of the burn-in issue from day zero and have the screen set up to hide toolbars etc. off the screen. I always go into full-screen mode with the terminal.
The colors are gorgeous and I feel such a raw sense of visceral delight when using the OLED screen.
At least at the two-year mark, I see no problems. Separately, I've had an OLED TV for 4 years and I have seen no burn-in problems there either.
I have wanted to switch to a newer laptop but I dread going back to a non OLED screen.
This is a great development, and if the choice is between me being a bit more conscious of tweaking software settings to go fullscreen and hide toolbars vs. amazing colors, that's a limitation I'm willing to deal with.
There'd be some details to work out, like making sure you don't trigger word-wrap changes. One approach would be to leave a little border on at least one edge of the screen, so you can move everything together horizontally without changing window widths. In the vertical direction, maybe you can get away with changing relative window sizes slightly.
(On the other hand, if you're mainly using text in dark mode, maybe your fancy OLED screen isn't benefiting you much.)
Then you can adapt to the fatigue by brightening the pixels with the most wear. You can also display an inverse image to even out the wear.
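For what it's worth, here's a minimal sketch of what that compensation might look like, assuming you somehow had a per-pixel wear map (real panels would do this in the display controller, and the 1%-per-wear-unit efficiency loss below is made up for illustration):

```python
import numpy as np

def update_wear(wear, frame):
    """Accumulate wear in proportion to how hard each subpixel is driven."""
    return wear + frame.astype(np.float64) / 255.0

def compensate(frame, wear):
    """Brighten the most-worn subpixels so the picture looks uniform again.

    A worn subpixel emits less light at the same drive level, so drive it
    harder relative to the least-worn one. The catch: this evens out the
    image but makes the worn pixels age even faster.
    """
    loss = 0.01 * (wear - wear.min())            # assumed: 1% loss per wear unit
    gain = 1.0 / np.clip(1.0 - loss, 0.1, 1.0)   # cap the boost at 10x
    return np.clip(frame * gain, 0, 255).astype(np.uint8)
```

Which also illustrates the trap mentioned elsewhere in the thread: compensating for wear means driving the worn pixels harder, which wears them out faster still.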
I have vague memories of an application that did this. Guess it was in the plasma days.
Though this is something that could be done in hardware; judging from this clip https://www.youtube.com/watch?v=nOcLasaRCzY it must already be done on TVs, since letterboxing didn't create any artifacts. But it doesn't seem to be done on a per-pixel basis. Not sure why that would be so much more difficult.
The monitor does not have that capability - if it did, it would be able to compensate for burn-in exactly. You would need to do weird things like point a camera at the monitor and have it diagnose and re-calibrate itself, but this would not be anywhere near exact so you'd still have quite a bit of noise. And everything you do to try and "solve" burn-in, other than turning the screen off, just adds even more burn-in!
I hope I get a chance to take a poke at it before too many patents are filed.
I ask because I'm interested in a TV and the prices are starting to enter the realm of sanity (IMHO).
I have a 2010 Plasma, and people said that would suffer from burn-in. The only issue I ever had with it is when I left a game or menu up for several hours, and a shadow image would be left. The smudges went away pretty fast when something else got displayed, and all is well 9 years on.
So ... does anyone know how bad the OLED problem is? Is it a bit overblown like the concerns about SSD writing shortening their useful lives? Or is this really something to be concerned about, and the industry is trying to brush it under the carpet?
Given that the price of WOLED panels is falling below the price of LCD panels, there will be a price point for an OLED TV where it is well worth buying one even if it might fail in 4-5 years' time. And that price point will be different for everyone: some consider a $1200 55" good to go, some wait for $1000, some will only buy one at $800.
But if you intend your TV to last for 10 years, then I suggest you wait a little longer.
I've owned an OLED for a year and a half now without issues. From what I read online, burn-in might occur after displaying static images for a very, very long time. E.g. if you watch CNN for a long time every day, its static images might burn in. That's my understanding of it.
Supposedly LG also has a fix on the new OLED models that makes burn-in even less likely.
Other than that, which hasn't happened to me at all, the image is considerably better than any LCD/LED model out there.
Again this is just my personal opinion/experience.
Burn-in patterns are everywhere, and especially visible in the red channel, so the overall image has a greenish hue. Things like taskbars, tiling window borders, browser title bars, and game HUDs (HP bars especially, since those are often bright red) have all left their permanent mark on the display and have become super distracting.
I'll probably get another OLED eventually to replace this one (I'd try to sell the current one but kind of doubt anyone would pay decent money for it given the state it's in), but will probably use it exclusively for movie watching and maybe occasional gaming sessions. I've been completely spoiled by the movie watching experience of the infinite contrast ratio in a dark room. Nothing else is even comparable. But I'll have to babysit the next one much more carefully to make sure it doesn't suffer the same fate as the one I have.
(YMMV depending on which country you live in I guess)
The plasma had some burn-in of channel logos that was visible on a grey background only.
Zero burn-in after 1 year with the OLED.
So far it's been about 30 months with no issues. I've taken no active steps to fight burn-in, but also haven't completely disabled the default screensaver (a wobbling word-art thing that displays the name of the tablet model after about 5 minutes of disuse). I've been checking for burn-in periodically, as I'm suffering from a bit of upgrade-itis, and that would be a fantastic excuse.
The reason Samsung never invested in OLED TVs is cost and poor longevity.
MicroLED is the same concept as OLED but without the degradation. OLED was simply easier to engineer given current tech, vs manufacturing tiny LEDs.
OLED will be supplanted by microLED. However, if Samsung can make these displays cheap enough, they will be nice in the interim. I don't expect to see microLED in mass-market devices for quite some time.
My iPhone pushes up to 725 nits and it's usable with sunglasses at the beach.
Lenovo's discontinued OLEDs had a pissy 330 cd/m²...
It's kind of hard to find because it doesn't sell well, since the colors are very much toned down on it, but you can have the sun right behind your screen and it won't matter. I still have one from ~2012 (from the French store LDLC, which sells rebranded custom-made Clevos) that I gave my GF, and for work purposes, where top visibility is always needed, the screen destroys anything else I've used since.
I'm genuinely weirded out that, as far as I know, no major work brand like XPS or MBP offers a true matte screen as an option anymore...
PS: if you haven't seen one before: there's truly no reflection at all, no competing with the sun, and the colors truly are washed out.
EDIT: I would at least like to know why I'm being downvoted? A true matte screen is vastly better when facing sunlight, which is what I was responding to.
And then in a fully dark room, I would again say matte is superior to glossy, because its brightness feels much less abrasive to the eyes (but my eyes tend to be sensitive to light; I'm the kind of person who needs to change his screen brightness depending on conditions, so for people who don't care about that, I guess glossy would be better due to the colors).
If you can set your pointer color to inverted, it's a tremendous help (even if you only occasionally rely on it). Also, of course, as big as possible.
Sadly, currently, the refresh rate is very slow and it has no viable color options.
Though I mostly find max brightness to be uncomfortably bright, so I usually keep it down in the 10-30% range (auto).
My gf likes to illuminate half the block with her phone though, and we both got the S8 at the same time, so it will be interesting to compare.
Android has a status bar at the top that's displayed over most applications. And applications on Android are far more likely to be maximized, which makes the common design elements and widgets more likely to be in the same place.
To the best of my knowledge and research, though, the current generation of OLED screens and controllers manage to avoid image retention.
I have an OLED TV for watching shows and movies and try to be careful to not display static elements for too long. I would never use an OLED as a PC screen for fear of burn-in.
Do you happen to have your brightness set to really low or something? I left mine around the default thinking it'd be safe but that totally didn't work out.
I've had an OLED phone since the original Galaxy Note and an OLED TV for 15 months now, and I'm never going back to LCD.
As I understand it, plasma displays (which now seem defunct?) had issues with burn-in if presented with the same image for a long period of time during a single instance, while from personal experience AMOLED burn-in can occur even if the same image is displayed repeatedly over a long period of time. Screen savers would appear to mitigate the former, but not completely avoid the latter.
If the Windows taskbar is present 95% of the time that the computer is being used (with the other 5% being fullscreen applications), and a screensaver is active 40% of the time the screen is on, then the taskbar is still visible for 0.95 × (1 − 0.40) = 57% of the time the screen is on.
Maybe the tech has changed, but as of my first OLED phone 7 years ago, blue (the same color as the taskbar's Windows icon) was the fastest color to burn in.
Newer phones shift the navbar and status bar items a few pixels every minute. But the start menu icon is too big for that, so after a year or two, every time the user watched a movie in fullscreen, they would get a blue blob in the lower left corner.
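Roughly, the shifting trick looks like this (the 3-pixel radius and one-minute interval are guesses; real implementations live in the OS compositor or the panel controller):

```python
import random

SHIFT_RADIUS = 3        # max nudge in pixels per axis (a guess)
SHIFT_INTERVAL_S = 60   # re-randomize about once a minute (a guess)

def next_offset():
    """Pick a small random (dx, dy) nudge for static UI elements.

    Drawing the status bar at (dx, dy) instead of (0, 0) spreads each
    icon's wear over a (2r+1)^2 patch of pixels instead of one fixed
    spot. It only works for elements with a few pixels of slack around
    them, which is why a big start button defeats it.
    """
    return (random.randint(-SHIFT_RADIUS, SHIFT_RADIUS),
            random.randint(-SHIFT_RADIUS, SHIFT_RADIUS))
```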
You should just turn off the screen when you are away.
Same stuff that guy over there is complaining about: http://forum.notebookreview.com/threads/alienware-13r3-oled-...
In my opinion 3200x1800 at 13.3" is more than enough already (I own one) but I guess marketing wanted more.
Defaults to dark mode to help skirt any burn-in issues.
Announce it at WWDC and even at $2500+ devs will go nuts for it.
All jokes aside the specs for this thing look great. 15.6" and 4k. I want to see this in production laptops ASAP.
And two calendars, a good one and the Samsung one
And two browsers, a good one and the Samsung one
And two text editors, a good one and the Samsung one
And a camera application, a good one and the Samsung one
Why do they bother?
How about they stick to quality hardware and let the pros deal with software.
I took the one less traveled by,
And that has made all the difference.
and got stung by bees
for three long days
until Park Rangers finally found me.
No one complained that everything had to be magnified 200% to print on the new engines - even though that was going on behind the scenes. Instead, everyone enjoyed the new sharper and crisper printouts. And it was wonderful.
The same is true for displays. When you have a display that can run at 200% scaling, that gives you four times the pixels per character compared to your old 100% display.
Everyone's taste is different, but for myself, give me all the pixels I can get. I don't like pixels and I don't want to see them. I want to see the text and graphics I'm working on. If the display has more pixels to represent those, to the point where I can't even see the pixels any more, that's what I want.
Without an external screen, 4k resolution makes everything too small. But setting scaling on the laptop's screen interacts badly with the external screen, which I use almost all the time and do want at 4k. So I end up with the laptop being that much less usable when I'm using it by itself.
>> I have this on my Dell XPS 15.
> You do not have "this", an OLED panel, in your XPS 15.
That's... uncharitable. I also have a 4k screen on my XPS 15. Why would I care whether it's OLED? 4k on a 15 inch screen has been in production laptops for years.
For the reasons laid out in the article. Which is about a specific OLED display which will be used in laptops like the XPS 15 later this year. Those laptops are in development but not released yet.
2560x1440 (native) × 1.5 = 3840x2160 (HiDPI render), ÷ 2 = 1920x1080 (effective resolution)
Obviously it is not as good as a real panel, but it looked good enough to me.
I did the same thing on Ubuntu using xrandr; it was pretty good when it worked (sometimes it didn't, even using the exact same command).
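For anyone who wants to try it, the idea is something like the sketch below. The `--scale` flag is the real xrandr mechanism (it renders the desktop at the larger virtual resolution and scales it onto the panel); `eDP-1` is a placeholder for whatever `xrandr` lists as your panel's output name:

```python
import subprocess

# Render the desktop at 1.5x the panel's native resolution and let X
# scale it back down, approximating a HiDPI "looks like 1080p" mode.
# "eDP-1" is a placeholder; use whatever `xrandr` lists for your panel.
native_w, native_h = 2560, 1440
factor = 1.5
print(f"virtual desktop: {int(native_w * factor)}x{int(native_h * factor)}")  # 3840x2160

subprocess.run(["xrandr", "--output", "eDP-1",
                "--scale", f"{factor}x{factor}"], check=True)
```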
I think large TVs are great for supporting group-watch events, but for personal viewing something really hi-res and nearby is better, IMO.
When will laptops stop coming with 16:9 screens? Do people really buy laptops to watch movies, or for productivity?
My screen is 3:2, too, and it's many times better. Apple's laptops (sorry, I forget the exact ratio) are acceptable, too; it's just that 16:9 doesn't make any sense to me and I have no idea why more companies aren't providing alternatives. I guess it's a non-problem and I'm the only one who sees it this way.
I know a few people who mostly use their laptops to watch movies / netflix. The rest of the day, they're not using the laptop at all. (there is some web browsing of course, but not much).
It's not as if I can see individual pixels from across the room. Same with a laptop (I don't see my pixels on a 16" 2k screen), a computer display (I don't see pixels on a 23" 2k screen), and a phone (I don't see pixels on a 5.5" 720p screen). I'm pretty sure that if you can see individual pixels at those sizes, you're sitting too close.
On my phone I really just want my 720p OLED display back: there, too, I have a crappy 2k LCD now, because OLED is too expensive at the resolutions they make them in nowadays. And I was hoping to see OLED laptops soon, but it looks like we first have to get every single application rewritten to support more-pixels-than-you-can-actually-see zooming modes, then bring down the price of these panels, and then I can buy one.
In Windows this is better (for me) because I personally like the subpixel rendering there - it's aligned to pixel boundaries more often and looks clearer to my eyes.
I'm doubting GP has ever worked with HiDPI content for an extended period of time.
1. TV sizes are getting bigger and bigger. You mightn't notice the pixels at the size of your current TV, but your current TV might be considered small by the standard of increasing screen sizes.
2. You mightn't be able to see the pixels on your 16" 2k screen or on a 4K TV, but your eyes aren't the only ones on the planet. To my eyes, 2k is old news; my 13" has 1600 pixels vertically. My left eye can barely see anything thanks to keratoconus, but with both eyes open I can still discern the difference between 720p, 1080p, and full 4K content on my TV — mainly in games and Blu Ray movies.
3. You don't need to rewrite every single application to support HiDPI displays — Windows already has scaling baked into the OS (and has for years), and the majority of apps support the feature with no issue; macOS has had Retina display support since late 2012 and only apps ported from Linux exhibit any weirdness.
4. All of the Galaxy phones have had OLED displays standard for years, long after 720p was considered a high resolution for a phone (back in 2011). Even some of the non-flagship models have HiDPI OLEDs and aren't particularly expensive.
I do high-definition photography (think gigapixel panoramas), and editing anything like that on a 3840x2160 laptop (a high-end laptop) is painful.
I'm always looking forward to getting back to my 32" monitor.
It should be fine for the kind of work you're doing though, are you using tools that don't support DPI scaling?
For developers working with proportional code - unless you have super-sharp eyes and scale your Sublime Editor down to sub-1mm characters - I don't see it making much difference.
I physically feel comfortable with a text character of a certain minimal size.
Beyond 4k, it really doesn't matter how many pixels or colors are used to render a font or image on a 15" screen. Making it too tiny or super-true-color-ish doesn't make much difference on a laptop screen that's typically only used for "temporary" on-the-road work.
Of course you're not supposed to run at 1x UI scaling.
Even my 5-year-old iMac is better than this new Samsung screen.
I have a Win/PC as well (for gaming) and I'm using the Samsung CHG70 (https://www.samsung.com/us/computing/monitors/gaming/32--chg...).
It cannot even be compared with the iMac's screen. It's blurry, with wrong colours and low quality.
Even now, I don't understand why no other company can replicate Apple's screens.
How about the HDR screen from the X16G?
It sucks so much you can't even compare it with the Retina iMac (5k) screen.
It's hard to buy an affordable laptop with a decent screen; basically they're all crap with garbage gamut that literally hurts your eyes.
It's a shame laptop makers are willing to add those useless entry level discrete GPUs yet are too stingy to spend 20 to 30 bucks more on the display.
My 2013 13" macbook pro retina has a resolution of 2560x1600. I absolutely can tell the difference compared to 'Full' HD, just in crispness of image, beautiful font rendering etc.
So a 4K screen at 15.6" doesn't seem outrageous to me.
There is a placebo effect as well, so someone with their shiny new ultra-high-resolution screen may feel that there's a compelling difference, but I doubt it would hold up under legitimate blind testing. We see the same thing with color perception now: an OLED and a good LCD (e.g. iPhone XR) will be calibrated to near-identical color performance, and in blind testing will likely be very difficult to tell apart on that basis, yet it's very common to hear OLED users laud color performance.
There is no question that OLED absolutely owns when it comes to contrast, especially in dark environments where the difference between LCD grey and OLED black is without question.
That aside, 1080p isn’t past the threshold of imperceptibility. 4k totally is, but doubling a common res means you get all the benefits without non-integer scaling issues or changes in screen real-estate.
A UHD 15.6" display with a viewing distance of 34 inches (86cm) results in 170 pixels per degree which is quite high.
It's a very subjective thing. Some people value it higher than others.
For example going from 1920x1080 to 2560x1440 on your 5.8" phone screen provides only limited value, yet many people still go for it.
And since this is OLED, does that mean that the windows start button and Apple's system menu would get burned into the screen :-/
It offers the best tradeoff between screen space and portability, and current thin-bezel 15.6" laptops fit in standard-size laptop backpacks.
The colors are all off. Black display hinges break the silver line of the chassis. Two black buttons on the left do the same. Big black touchbar type thing coupled with white keyboard keys. Small trackpad that's not color matched to the chassis. Also, 'SAMSUNG DISPLAY' branding trashes the beautiful edge-to-edge display.
The only ID conceit I understand is the rubber pads at the edges, since presumably there's no room to add them to the edge-to-edge display.
Aside from the display, it looks like a bad MacBook knockoff. Samsung can do better, they have on some of their phones.
This is about the display. The laptop it's attached to is probably just a proof of concept demo platform for the display.
Samsung is a big supplier of electronic parts.
When you have unique access to technology, one strategic opportunity is to use it to move up the value chain from being a supplier to being a retail brand, creating a differentiated product and winning market share. Samsung right now pretty much has a lock on non-TV OLED production, but is choosing not to go down the path of using those panels in their own computer brand, and instead is selling to other computer makers. It's an option they have but have chosen not to pursue... so that's interesting.