Eye Strain from LED Backlighting in MacBook Pro (2008) (discussions.apple.com)
69 points by walterbell on Jan 6, 2018 | 90 comments



I think this probably concerns many more people than are aware of it. I could live with the flickering, but I live better by doing what the OP did: set screen brightness to max on all of my laptops and external LCD screens. I then use redshift (I'm on Linux)[1] to dim the screen to the desired brightness, i.e. absorb the excess light in the liquid crystal layer. Sadly this does lead to some loss in brilliance and color quality, but it's not too bad, and I like that I never have to see the flickering.

I know someone who puts a foil-based filter in front of the screen; that may work somewhat better, but I haven't tried it myself.

I find it frustrating that almost all LED lighting installed in the UK is PWM-based (if dimmed), or otherwise flickering (from the AC mains), too. And it's just a question of consumer expectation: if you buy an LED light bulb in a standard shop in Switzerland, it doesn't flicker; if you buy one in the UK, it does. Both bulbs use the new LED filaments, and you can't tell the difference by looking at them; the difference is in how the power supply in the base of the bulb works.

I opened up one of the Swiss bulbs and it had a proper switched-mode power supply (with a switching frequency probably in the 100 kHz range). I saw a video on YouTube by someone who opened up one of the UK ones, and its power supply was based on current limiting via capacitors (and then a rectifier), which leads to flicker at twice the AC frequency (hence 100 Hz). There's going to be a cost difference of £1 or so; Swiss consumers are apparently ready to pay for that, UK consumers apparently not (or the UK shops simply pocket the savings).

[1] https://en.wikipedia.org/wiki/Redshift_(software) (actually still using the command-line variant of an older version, e.g. "redshift -b 1 -g 0.9:0.9:0.9 -l $POS -t 5500:4800")
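
Here's a rough sketch of the arithmetic behind such a capacitive dropper (Python, with illustrative component values; I didn't measure the actual bulb):

    import math

    # Capacitive-dropper supply, as described above. Component values
    # are assumptions for illustration, not measurements of any bulb.
    V_RMS = 230.0       # UK mains voltage
    F_MAINS = 50.0      # UK mains frequency, Hz
    C_DROPPER = 470e-9  # assumed series capacitor, farads

    # The series capacitor's reactance is what limits the current.
    x_c = 1.0 / (2 * math.pi * F_MAINS * C_DROPPER)
    i_led = V_RMS / x_c  # approximate RMS current through the LED string

    print(f"reactance: {x_c/1000:.1f} kOhm -> ~{i_led*1000:.0f} mA")

    # After the rectifier, the current still follows the mains sine
    # wave, so the LEDs pulse at twice the mains frequency:
    print(f"flicker frequency: {2 * F_MAINS:.0f} Hz")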


A lot of cheap mains-powered LEDs aren't using PWM, but running on mains power directly.

You can test whether this is the case using a smartphone camera and an app with manual shutter speed control. If you see scanlines at shutter speeds just above and below 1/50 s, but not at 1/50 s itself, your light source is flickering at 50 Hz, the AC frequency of UK mains power.
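
The arithmetic behind the test, as a quick sketch (Python; assumes a full flicker cycle of 50 Hz, per the above):

    # When the exposure time is an exact multiple of the flicker
    # period, every scanline integrates the same amount of light and
    # the banding disappears; otherwise different scanlines catch
    # different phases of the flicker.
    FLICKER_HZ = 50.0
    period = 1.0 / FLICKER_HZ

    for denom in (30, 50, 60, 100, 250):
        exposure = 1.0 / denom
        cycles = exposure / period
        visible = abs(cycles - round(cycles)) > 1e-9
        print(f"1/{denom} s -> {cycles:.2f} flicker cycles, "
              f"banding {'visible' if visible else 'hidden'}")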


> A lot of cheap mains-powered LEDs aren't using PWM, but running on mains power directly.

Yes, I just added information about that to my comment, sorry for the race condition.


I find newer cars' brake lights at night very annoying with their PWM "trails".


Iris claims to perform brightness adjustments in software (GPU?), allowing the monitor’s PWM brightness to stay at 100%, for Win/Mac/Linux: https://iristech.co/iris/

I'm wondering if the Iris approach affects color accuracy, otherwise why wouldn't MS/Apple use the same technique?


The Iris approach lowers the device’s battery life, and the screen’s contrast ratio.


Could this be a difference in the dimmer? I'm in the US (120 V vs 240 V) and I had to research carefully which dimmers were compatible with the dimmable LEDs in our flat. Sometimes the dimmer would hum, the LEDs would squeal, or they would jump from 0 to 100% within a 20% stretch of the dimmer's range.

Most LEDs are made in China and I doubt the Swiss are getting specially manufactured ones. But old, incompatible dimmers are definitely a thing.


It's not the LEDs themselves that are a problem there, but the driver circuitry in the "bulb"[0]. There are a fair number of these being made in the US and Europe, not just China. Significant numbers of LEDs are also made outside China - Cree manufactures LEDs in both the US and China; Nichia manufactures LEDs in Japan.

[0] Bulb is not really an accurate term here. Maybe bulb-replacement?


No external dimmers were involved. As I wrote, I opened up a bulb myself (sawed off the tip of the base) and saw electronic parts consistent with a switched-mode power supply (dozens of parts, including rectifier, capacitor, tiny transformer, rectifier, capacitor, and a controller IC). I know I saw something different from what was in the video of the UK version! No reason China can't produce both.


Not sure about the backlighting or a post from 2008, but on the topic of eye strain: the diminutive text sizes and the obsession with low contrast throughout the redesigned, bleached UI certainly cause me eye strain. I've heard similar complaints from others, and about the new system fonts and anti-aliasing too.


I wonder if turning on accessibility features would help.


Title should have (2008). (Also is there some current relevance that I'm just out of the loop on?)


Thanks! Updated.


Nope, as far as I can see there is no reason for this to get posted. Seems almost like OP is trolling for karma because it has a controversial title and links to Apple.com.


The thread is 164 pages long and active through 2017.


So you think we should read 10 years of babbling by cranks who don’t like LED backlights on a support forum?


A statement from Apple would be best, for an issue that has affected customers for nine years.

PWM backlighting was introduced on the iPhone X and is affecting some customers.

edit: s/backlighting/OLED dimming/


Why does the iPhone X need backlighting when it has an OLED screen?


Maybe someone who knows more about AMOLED panels can comment on the role of PWM? It seems to be something other than brightness control; the article at [1] says:

"At any level of brightness, modulation with a frequency of approximately 60 or 240 Hz is present ... The amplitude of the modulation is not very large at or near maximum brightness, so there is no visible flicker. However, with a strong decrease in brightness, the modulation takes on a large relative amplitude; its presence can be seen in a test for the stroboscopic effect, or simply with rapid eye movement. Depending on individual sensitivity, such flicker can cause increased fatigue."

[1] https://translate.google.com/translate?act=url&depth=2&hl=en...


PWM is used to control the brightness level of each pixel.


You are being very dismissive and impolite. I'm sure they're not complaining about it just because they have nothing better to do.


Dude I don’t doubt they care about this issue. My point is I’m not gonna read 168(!) pages of posts about it. Post a single article summarizing the issue or something.


Here's a summary from a vendor trying to compensate for the issue in software: https://iristech.co/pwm-flicker/


To be honest, I'm skeptical that this is a universal feature of the MacBook line. I don't own one myself (I never would!) but I use one for work, and I can see these effects as well when I know how to look for them (I used to have a DLP and it was easy to reproduce the rainbow effect).

The flickering of these LEDs is not really any different from the CCFL flicker before.


Interesting, but don't CCFLs also "flicker" just at a much higher rate?


Yes; I can see the flickering on my CCFL-based ThinkPad, but it's thankfully not annoying.



YEOW. OUCH.

I... I am very glad I'm using a CCFL laptop. This is like going back to evil 60 Hz CRT flicker!!

I've just accepted I might need to rip out the LED driver board in a future laptop and replace it with my own design. Hopefully not, but if that has to happen then okay.

Thanks for the reference to the comparison, that was also very interesting to go through (and bookmark).


Just tested my MacBook Pro (2016 model) with a 120 fps camera: apparently there is no flickering in the display, but the keyboard backlight does appear to use PWM.


Same for 2017.

I think a lot of it is down to the usage scenario, but the keyboard is the most obvious 'flickerer' I've seen... anywhere.

A screen like that [or a mandatory keyboard backlight] would be an instant dealbreaker for me.


> My research into LED technology turned up the fact that it is a bit of a technological challenge to dim an LED.

This is complete nonsense. While PWM can indeed be used, many of the nicer LED drivers are actually variable-current designs. Adjusting the voltage directly does not work as expected for LEDs, but a constant-current driver produces a constant brightness, and varying the current varies the brightness. Only the cheapest displays and lighting installations use PWM for brightness control.


The 2008 MBP did use PWM backlighting. PWM is slightly more efficient than a constant-current regulator and gives a more consistent backlight color temperature. IIRC Apple has stuck with PWM backlighting for these reasons, but has moved to an extremely high frequency (~120 kHz), which should be imperceptible even to the most sensitive users. Many other top-tier laptop manufacturers also use PWM backlighting, but with a switching frequency of around 1 kHz.


... and many car/truck taillights. I think PWM taillights should be illegal. The artifacts are distracting.


This, so much this.

I have nystagmus, which means my eyes move uncontrollably, primarily on the horizontal plane. While my brain compensates for this and I don't see a moving image, anything that uses PWM flickers very, very badly.

PWM tail lights are THE ABSOLUTE WORST for this condition. Not only do I see it on the source but I see it on anything the light projects onto, it's horrible.


Wait. Contemporary nystagmus == saccades?

I haven't tracked the cause down yet - I know it's a nerve issue of some kind - but sometimes my jaw, arms, or eyes (yup) jump. My jaw and arms jump when I'm nervous (or have a buildup of too much nervous energy/excitement), while my eyes jump when I'm trying to focus on something too hard: they drift, and I try to course-correct too hard. (I think half my brain is aware of the drifting, but the other half, the part that would make the fine motor movements to keep my eyes locked, is happily whistling away blithely in a corner somewhere.)

When my eyes jump and I overcorrect when I fail to re-point them where they need to be, I usually have a second or two where I'm not technically blind but I need to let my eyes refocus and remember their purpose in life (if you will) :P

I always thought this was nystagmus (read the term online one day and went "!!! that's what it's called!")... but now I wonder, is what I'm dealing with something else?

(An aside: I'm really, really bad at tracking moving objects in space. If someone else is holding onto something and I'm expected to look at it, I have to hold it instead. No other alternative.)


If not banned outright, they should at least be required to use a 1 MHz minimum frequency.


10 or 20 kHz would probably be sufficient, based on conversations with the author of some of the most widely used open-source flashlight driver firmware, who is quite sensitive to PWM.


Which flashlights use this firmware?


These are all enthusiast-oriented, not something you'd find in a retail store. At least one can only be ordered through a group buy on budgetlightforum.com right now, though a retail offering will come in the future.

* BLF A6 (AKA Astrolux S1)

* Astrolux S41/Manker E14

* BLF X5/X6 (AKA Astrolux S2/S3/SS)

* Emisar D4, D1, D1S

* BLF Q8

* BLF GT

* Convoy S2+/C8 (new firmware version)

These don't all use the same firmware, but they all use an open-source firmware with high-frequency PWM, except that I think the BLF GT does not use PWM. A number of expensive custom lights use these too; those often focus on metalworking and exotic materials.

Firmware sources available here: http://bazaar.launchpad.net/~toykeeper/flashlight-firmware/t...

I own an Astrolux S41 (Nichia 219B), Astrolux SS (3A) and a semi-custom light I assembled using the A6 and Biscotti firmwares by ToyKeeper. I plan to pick up the Emisar D4 (Nichia 219C) and D1S (5D).


Thanks! I’ve been looking at [1] low-flicker, daylight simulation LED lights that can output 10K LUX. Great to hear that OSS firmware exists for some PWM lighting.

[1] https://www.aputure.com/products/al528-1


> that can output 10K LUX

What does this mean? I would assume the surface brightness of the emitters is a good bit higher than that. Aside from surface brightness, lux does not provide any information about a light source without a distance. The SI unit of luminous flux is the lumen, and luminous intensity, the candela.

I have a Viltrox panel with similar characteristics, but smaller (adjustable color temperature, CRI 95+) and I think it uses 5mm Yujis. Brightness adjustment is non-PWM. It's also meant to take a camera battery, but I converted it to use two loose 18650s. Model VL162T, $30ish on Amazon.


The Aputure 528S is rated at 12700 lux at 50 cm. Will check out the Viltrox, thanks.


I see on the specs page it's 4380 lux at 1 meter, or 4380 candela. No information is offered on total output, but it's possible to estimate from the information given.

Lux is lumens per square meter. Candela is lumens per steradian, the SI unit of solid angle. At one meter, these are equal, unless the beam hasn't converged at 1 meter. Aputure provides the beam angle in degrees, for which it was surprisingly hard to find the conversion. Sure, the units aren't technically comparable, but we know they meant the apex plane angle, at least we do now that I looked up the correct term.

So a right circular cone (which the light produced by a rectangular panel isn't, but we'll pretend) with an apex plane angle of 25 degrees has a solid angle of 2π(1 − cos 12.5°) ≈ 0.1489 sr. Multiplying intensity by solid angle: 4380 cd × 0.1489 sr ≈ 652 lm within the nominal beam.

And that can't be the whole story, because the power consumption is 30 W, and 652 lm from 30 W would be only ~22 lm/W. I'd expect an LED panel like this to be on the order of 100 lm/W, for about 3000 lm from 30 W, so most of the output must fall outside the nominal 25° beam. (The theoretical limit of luminous efficacy is 683 lm/W.)
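
For anyone checking the arithmetic, a minimal sketch (Python; treating the panel's output as a uniform cone is itself a rough assumption):

    import math

    # Spec figures from the discussion above.
    apex_deg = 25.0        # nominal beam angle (apex plane angle)
    intensity_cd = 4380.0  # candela = lumens per steradian

    # Solid angle of a right circular cone with apex plane angle theta:
    # omega = 2*pi*(1 - cos(theta/2))
    omega = 2 * math.pi * (1 - math.cos(math.radians(apex_deg / 2)))
    lumens_in_beam = intensity_cd * omega

    print(f"solid angle: {omega:.4f} sr")                    # ~0.1489 sr
    print(f"flux in nominal beam: {lumens_in_beam:.0f} lm")  # ~652 lm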


Worse than typical PWM is running LEDs from AC mains power directly. I've seen this used in large, high-output panels. 20,000 lumens with 50/60 Hz flicker is rather unpleasant to be around.

And yes, two other ways to dim an LED are a switch-mode power supply with output current regulation, or a variable linear regulator. I'm familiar with both approaches being used in flashlight drivers, though PWM is also quite common. Good PWM designs tend to be over 10 kHz, which is nearly always invisible to the naked eye.


I've long wondered how this is actually done electronically. The only approach I've ever been able to come up with is a digipot connected to a linear power supply, but that would generate a bit of heat at low brightness settings, and I suspect even a 256-step digipot would exhibit visible fade "jumps".

As someone who knows zero about electronics fundamentals... how is this done?


Linear regulators are fairly common in flashlight drivers. They do generate waste heat at low output levels since the difference between battery voltage and LED forward voltage is just burned off, but the heat itself isn't a significant problem. Efficiency is more of an issue, but low-output runtime in LED flashlights is so long that most people will be able to charge their battery long before it runs out.

Switch-mode power supplies are the high-end version. These can step the input voltage up or down to produce the required output voltage and can regulate current to one or many desired levels.

More information on how both of these work is easy to find with a google search, but I should note that some things written about linear regulators assume a target voltage. When used to drive LEDs, regulators with a target current are required.
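
As a rough illustration of the linear-regulator trade-off, a sketch in Python with assumed flashlight-ish numbers (one Li-ion cell driving a single white LED; real parts vary):

    # The regulator drops the difference between battery voltage and
    # LED forward voltage, at the full LED current, all as heat.
    V_BATT = 4.2   # freshly charged Li-ion cell, volts (assumed)
    V_F = 3.0      # LED forward voltage at this current (assumed)
    I_LED = 0.35   # regulated LED current, amps (assumed)

    p_led = V_F * I_LED               # power delivered to the LED
    p_waste = (V_BATT - V_F) * I_LED  # burned off in the regulator

    print(f"LED: {p_led:.2f} W, wasted: {p_waste:.2f} W, "
          f"efficiency: {p_led / (p_led + p_waste):.0%}")
    # -> LED: 1.05 W, wasted: 0.42 W, efficiency: 71%. As the cell
    # sags toward V_F the waste shrinks; a switch-mode supply avoids
    # most of this loss at the cost of complexity.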


Interesting, thanks very much for the insight.

> Switch-mode power supplies are the high-end version. These can step the input voltage up or down to produce the required output voltage and can regulate current to one or many desired levels.

The way you wrote this makes it sound like SMPS don't have the smooth progression linear regulators do, which is in line with what I understand about SMPS designs. I presume the controller is connected to (or incorporates?) a digipot or similar, or is it actually possible to define an arbitrary reference voltage (derived from a digipot for example)?

> More information on how both of these work is easy to find with a google search, but I should note that some things written about linear regulators assume a target voltage. When used to drive LEDs, regulators with a target current are required.

Sure thing. And yeah, I find the constant-{voltage,current} thing a very fascinating/interesting quirk of LED regulation. At some point I'll find out why LEDs need constant current.


There are commercially-produced flashlights that use SMPS and continuous dimming. As they're all managed by microcontrollers, I'm sure it's actually hundreds or thousands of discrete steps.

> At some point I'll find out why LEDs need constant current.

There are flashlights that work by connecting a Li-ion battery to an LED through a FET that can use PWM to dim it. This will deliver all the power the battery is capable of providing at a given point on the LED's forward voltage curve. If that sounds like a good way to burn out components, it... can be. Everything needs to be fairly robust.

These flashlights do not have constant brightness: even in lower modes, output tracks battery voltage. Strictly speaking, however, this demonstrates that it's possible to operate LEDs without constant current. Forward voltage drops a bit as temperature increases, though, so LEDs are "greedy" and will essentially take all the current available as long as there's sufficient voltage. At some point, this will let the magic smoke out.

FET drivers in flashlights only work because the voltage range of the battery and LED overlap, battery voltage sags when it gets a heavy load on it, and some LEDs can handle a lot more current than the datasheet says for a few minutes at a time. None of these things are valid for mains-powered lighting, except that with sufficient heatsinking, many LEDs can be overdriven at a modest cost to longevity.


The brightness of an LED depends on the current through it. The current through an LED increases exponentially with increasing voltage. It's very difficult to directly control the voltage precisely enough to keep the LED lit at a given brightness, but it's a lot easier if you can just control the current.

Switchmode supplies can output any voltage; they're not limited to discrete steps. Their only disadvantages are increased complexity and potentially increased output noise, depending on the design.

Most good switchmode control chips use a bandgap voltage reference internally, which provides a temperature-independent fixed voltage. An internal error amplifier and some feedback from the output are then used to control the output voltage. Think of it like a thermostat: if the output voltage is too low/high, the supply senses that and starts boosting/dropping the output. Once the output goes over/under the threshold, the boosting/dropping stops and the output starts to fall/rise. The supply keeps switching it around the output threshold.

(Linear regulators do exactly the same thing, they just only drop the voltage instead of being able to boost, drop, or both. They also drop the voltage in a less efficient but less noisy manner.)
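
To make the exponential behavior concrete, a minimal sketch (Python; the parameters are illustrative assumptions anchored at 100 mA at 3.00 V, not data for any real LED):

    import math

    # Ideal-diode model: current grows exponentially with voltage.
    I_REF = 0.100         # amps at the reference point (assumed)
    V_REF = 3.00          # volts at the reference point (assumed)
    N_VT = 2.0 * 0.02585  # ideality factor * thermal voltage (~300 K)

    def led_current(v):
        return I_REF * math.exp((v - V_REF) / N_VT)

    for v in (2.90, 2.95, 3.00, 3.05, 3.10):
        print(f"{v:.2f} V -> {led_current(v) * 1000:7.1f} mA")
    # Each 50 mV step changes the current by ~2.6x, and the whole
    # curve drifts with temperature; hence constant-current drivers.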


> The brightness of an LED depends on the current through it. The current through an LED increases exponentially with increasing voltage. It's very difficult to directly control the voltage precisely enough to keep the LED lit at a given brightness, but it's a lot easier if you can just control the current.

I see. So set the current, let it draw the voltage it wants. Heh.

> Switchmode supplies can output any voltage, they're not limited to discrete steps.

Right - the issue I was getting at was the problem of getting an arbitrary representation of a desired voltage out of a microprocessor and into the real world. Digipots are the only way I'm aware of to do that (apart from PWM).

Thanks for the very interesting and highly informative wall of text :)

Open question, in case this piques your curiosity: I'm very interested what your comments might be on https://news.ycombinator.com/item?id=15005811, and any suggestions/ideas you might happen to have. I've long thought that what I describe in this post is noise-related, SMPS noise in particular, but I'm absolutely stumped on how to pinpoint/narrow things down any further.


FYI, switchmode supplies don't need microprocessors. Many are entirely analog; most of the rest use a small amount of digital logic to control the switching. E.g., check out the LT3574 datasheet[1], page 6: the only digital logic is the master latch, and it's just doing the "thermostat-style" switching I mentioned above. Or for an even more basic system: the ringing choke converter[2]. Lots of cheap power supplies use these because they are very simple and require no digital circuitry.

Edit: also, to get a given voltage out of a microprocessor, you just use a digital-to-analog converter (DAC). Some have them built in, others are external. I like the AD9850 for hobby work, as it has high resolution and is quite fast. If you need a large voltage range there are DACs available for that, or you can amplify the signal from a lower-voltage one.

WRT your issues with sensitivity to SMPS, it probably is audible noise. See if Screen Tunes[3] causes symptoms; it causes audible (but quiet) coil whine of varying frequency on most monitors I've tried. Your issues are unlikely to be directly caused by RF electromagnetic fields, but the side effects of the systems that generate such signals are often detectable by people.

[1] http://cds.linear.com/docs/en/datasheet/3574f.pdf [2] http://mmcircuit.com/understand-rcc-smps/ [3] https://thume.ca/screentunes/


Any voltage-mode linear regulator can be connected to form a constant-current driver. It's best to use adjustable regulators when doing this, to improve stability. And of course a dedicated current source is better still, as these tend to have a wider operating range.


I wish there were a standard requiring ALL LEDs, no matter whether they are used in lighting, backlights, keyboards, or whatever, small or big, to be flicker-free, or at least extremely high frequency.

More and more shops, shopping malls, monitors, billboards, etc., especially in China (or made in China, where the culture is more price-sensitive), use cheaper LEDs so they can say they are environmentally friendly. (There are also government incentives for doing so.) And insane environmental groups force everyone to switch to LED.

Many of these LEDs have flickering issues. Some people don't have a problem, some are a little annoyed but fine with it, and a minority like me need to get away or we feel sick.*

I tried LED lighting once; it started having flickering issues two years in. It was very minor, but enough to irritate me. I have since switched back to good old incandescent bulbs.

*I have also discovered that those who are sensitive to these flickering issues are also likely to be latency-sensitive, as discussed in threads on computer I/O latency and web latency.


I've always found LED Christmas lights to be the worst offender here. Most of the time this issue can be solved by buying quality LED bulbs and using the correct dimmer, if you have one.

Walmart had 100-bulb LED Christmas light strings on sale for $6. That's about the price of non-LED strings; however, these are 120 V-rated LEDs that run right off the mains, with only a fuse to protect them. Needless to say, they flicker at the 60 Hz mains frequency. Since these were going on my tree this year, I decided to do something about it. A simple full-bridge rectifier fixed most of the issue; the flicker is now around 120 Hz, which is much easier on the eyes.

I can still see the flicker; however, it's not nearly as bad. I believe the thing that would fully rectify the issue (see what I did there) would be to throw a capacitor into the circuit to further smooth out the DC supply (a rough sizing sketch follows below). Really it comes down to the nature of a diode: current flows in only one direction. In an AC circuit it flows in both directions; convert it to DC and smooth the supply out, and you're using just as much energy without flicker. Of course this would cost the manufacturer more to implement, so there's that.

I really just want DC throughout the house.
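
For anyone tempted to try the capacitor fix, a rough sizing sketch (Python; the string current and tolerable ripple are assumptions, not measurements):

    # Full-wave-rectified 120 V / 60 Hz LED string with a smoothing
    # capacitor across the DC side. C = I * dt / dV.
    I_STRING = 0.02      # assumed string current, amps (20 mA)
    F_RIPPLE = 120.0     # full-wave rectified 60 Hz mains
    V_RIPPLE_MAX = 10.0  # ripple we're willing to tolerate, volts

    dt = 1.0 / F_RIPPLE  # capacitor carries the string between peaks
    c_needed = I_STRING * dt / V_RIPPLE_MAX

    print(f"smoothing capacitor: ~{c_needed * 1e6:.0f} uF")
    # -> ~17 uF: small and cheap, but it must be rated for the ~170 V
    # mains peak, which is presumably why $6 strings skip it.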


This is something I noticed as well - I upgraded from a PowerBook G3 to a PowerBook G4 in 2009, and then to a MacBook Pro in 2010. I bought the fastest non-unibody machine available, a 2.6GHz A1260. It lasted me very well, five years of daily use before I upgraded. However, I began to notice the backlight issue. I'm a writer in my spare time, so staring at a screen is an occupational hazard. With a CCFL screen, I can stare at it all day at a comfortable level. With an LED screen, full brightness is astonishingly bright, so you have no choice but to use it dimmed. I found I could only go an hour or two at a time in front of my expensive MBP, and for a long time I couldn't figure out what it was. I considered going back to the G4 many times.

It was only after I started using an LED-backlit external screen that I spotted the connection. Apparently I'm a little more sensitive to PWM than average, although perhaps not as much as the OP of the Apple thread. At worst, after about 45 minutes, my expensive LED-lit monitor would give me migraines when dimmed, which the laptop didn't.

Either LED technology has improved or I'm no longer sensitive to it, because my current Clevo laptop is LED (as is just about every screen on the market now) and I'm okay using it for long periods. I had to learn to tolerate the external screen on maximum brightness in a well-lit room for a long time.

It's annoying and certainly not ideal, but I can't blame the manufacturers. CCFLs are power-hungry and toxic, and were always going to lose against LEDs. The technology is easily superior but lacks finesse. It only affects a minority of people, which is probably why it quickly supplanted CCFL - either no study picked it up, or it was such a small number that it could be ignored. I wouldn't want to keep using CCFLs but I wish they'd figured out a better solution before bringing LEDs to market.


To manually check whether the PWM has a low frequency: hold your finger in front of the screen (or the LED lamp you want to test) and wave it quickly back and forth in a fan (V) motion. If you see a stroboscopic effect (multiple sharp images of your finger), the frequency is too low: the screen is not flicker-free, or the screen brightness is too low. This is only a quick test, not a bullet-proof method.
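
A rough sketch of why the test works, assuming a fingertip sweeping at about 1 m/s (an illustrative guess for a quick hand wave):

    # Each PWM "on" pulse freezes a ghost image of the finger; the
    # images are spaced by (finger speed) / (PWM frequency).
    FINGER_SPEED = 1.0  # meters per second (assumed)

    for pwm_hz in (100, 240, 1000, 10000):
        spacing_mm = FINGER_SPEED / pwm_hz * 1000
        print(f"{pwm_hz:6d} Hz -> ghost images every {spacing_mm:5.1f} mm")
    # At 100-240 Hz the ghosts are millimeters apart and easy to see;
    # in the kHz range they blur together and the test stops working.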


It is high time for an OLED display on the MacBook Pro. And not just on that horrible Touch Bar gimmick.


OLED has horrible burn-in. For phones they're doing incredible mitigations, but the fact is that a phone screen isn't on with static elements for 8+ hours solid each day the way a laptop's is.


I'm not so sure that's true. The mitigations they're doing seem to work very well for all sorts of form factors. I've been using a Dell UP3017Q OLED monitor as my main display for most of the last year and haven't had any issues whatsoever; it's simply the best display I've ever used. Even my LG OLED TV, which I've had for a couple of years, hasn't exhibited any image retention at all, despite a lot of news watching with scrolling headlines and logos. I wish OLED monitors would become more generally available, since they're far, far superior to LCDs, but apart from a limited run of the Dells and some short-lived laptop models they're pretty much nonexistent.


There's the ThinkPad X1 Yoga with an OLED display. From what I've heard, the thing is absolutely stunning and a pure joy to code on with a black / dark editor theme. It just emits way less light than the equivalent LCD setup.


I can attest to the OLED burn-in. My three-year-old Samsung Note 4 has a very noticeable bar across the top, from the dark status line above what is usually a light background on everything else. I guess the rest of the screen is what is actually burnt in, and the status bar area is closer to its original state.

I don't notice it most of the time, but it really stands out on anything full screen. I think I would be rather annoyed if a three-year-old laptop did this.


> the screen isn’t on with static elements for 8 hours

I have two old cellphones mounted on the wall in my house that show time, weather, control the thermostat, etc. Most of the display is static. After a couple of years the burn-in is dramatic. I was surprised as I thought only phosphors burned.


OLED has worse PWM.


It depends: I've got a Samsung S II with OLED and it doesn't show any flickering, but also a later Samsung, also with OLED as far as I can tell, that does; I sat down with someone else trying to figure out what's going on. My interpretation is that early OLED displays always ran at 100% duty cycle and instead lowered the current through each pixel to lower the brightness. (When dimming the S II very low, once your eyes adapt, you can start seeing artifacts, i.e. pixels lit up less precisely because they are in fact using the lowest few values of the available current range.) For later phones it appears Samsung added PWM on top of this, so as to be able to dim the display independently of the pixel currents, perhaps to get rid of those artifacts at low brightness settings, or possibly because it's a bit more power efficient.

What can apparently be done (as my friend found out searching the internet), if you've got root, is to install a kernel driver to set the PWM duty cycle to 100% and use the old per-pixel dimming approach instead. I haven't bothered (I don't use my phones enough), but it's potentially worthwhile if you use your device for a lot of (book) reading.


Question: this is a problem because the LED array (whether an LED backlight or an OLED display) flickers in synchrony, right? Would it still be a problem if each LED in the array were hooked up to its own delay line with a randomized-at-the-factory value?


That would probably help some. Really, they should just flicker faster; there's no reason not to do PWM at a kHz or ten, and nobody is going to see that flickering.
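
A quick simulation of the randomized-delay idea (Python; the PWM resolution, duty cycle, and LED counts are arbitrary choices):

    import random

    # N LEDs run the same PWM duty cycle, each with a random phase
    # offset fixed "at the factory". We measure the peak-to-peak
    # ripple of their summed output over one PWM period.
    random.seed(1)
    STEPS = 1000  # time steps per PWM period
    DUTY = 0.3    # 30% duty cycle

    def ripple(n_leds):
        phases = [random.randrange(STEPS) for _ in range(n_leds)]
        totals = [sum(1 for p in phases if (t - p) % STEPS < DUTY * STEPS)
                  for t in range(STEPS)]
        return (max(totals) - min(totals)) / n_leds

    for n in (1, 4, 16, 64, 256):
        print(f"{n:4d} LEDs -> ripple {ripple(n):.0%} of full scale")
    # A single LED swings 0-100%; the combined ripple shrinks roughly
    # like 1/sqrt(N), though each LED still flickers individually.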


Actually, there is a slight problem, in that many people will be able to hear the inverter.

It's tough to make a completely silent inverter that works near the peak of human hearing sensitivity. It can be done, but it's likely to be more expensive.

Going all the way to 20 kHz+ is probably the best strategy. LEDs can be driven well into the MHz range, but if you go too high, RF emissions compliance starts to become a problem.


That’s a good point. Call it 50 kHz to deal with people who have abnormal hearing response.


https://www.notebookcheck.net/ always tests for PWM and otherwise has the best laptop reviews on the planet. Example of PWM test: https://www.notebookcheck.net/Lenovo-ThinkPad-X1-Carbon-2017...


I use a MacBook Pro all day at work, then at home in the evenings. I can't say I've ever had an issue with eye strain.

Over the years I've used many different models too.


The PWM backlight driver frequency should be much higher. This kind of effect is only perceptible at sub-kilohertz LED switching, and the LEDs themselves are happy to be cycled into the megahertz range, so the only reason for this on the MacBook is poor electronic design. A low frequency does provide reduced cost, slightly higher efficiency, and reduced auditory/RF noise emissions, but the visual effect has to be the top priority.


Yep. I wonder if you give up some small amount of power efficiency at higher frequency, but it shouldn’t be much.


It's evident that this sensitivity is pretty rare. But with essentially everything LED-backlit these days, living with that sensitivity sounds hellish!


I can stare at my CCFL ThinkPad all day, but if I stare away from it at around 75-80° (basically not quite 90°, so say around 2PM on a clock) I can see it flickering away. But if I stare straight-on it's fine; it feels completely steady, and certainly more rock-solid than any CRT.

My light emits a (very) tiny amount of light when it's initially turned off. If I stare straight at it I can't see this tiny bit of light at all; I can only see it in my periphery. I suspect my inability to see flickering works by the same mechanism - and I'm very glad for that!

Incidentally, when I was using CRTs I had to have them at 85Hz to hide the flickering as much as possible. 60Hz would drive me nuts - but then again, I used 60Hz for several years while using MS-DOS, so I think 85Hz was just a strong preference.

This may be interesting to some: https://news.ycombinator.com/item?id=15004608#15005811 (the whole thread, along with the highlighted comment)


Most people can see these effects if they know how to look for them. Most "sensitives" are folks who accidentally figured out how to see them.

It's really not at all clear that they possess some kind of higher clock rate for sampling optical images.


I've noticed this with some LED light fixtures in people's houses before and deliberately don't mention it because once you see it you can't unsee it. Particularly evident on things like running water in a bathroom.

I can deal with it for a few minutes, but they have to live there!


Personally, I think this is like when kids start sticking their tongue through the slit of a Halloween mask until they cut it. It's just something you can choose to stop doing but people can't help themselves.


I agree (I replied to the other comment in this subthread).


Not sure if it's the OLED, but after switching from an iPhone 7 to an iPhone X I get terrible headaches when trying to watch Netflix on my phone before I go to sleep.


It's a similar issue. The Samsung display in the iPhone X flickers at 60/240 Hz due to PWM brightness control of the OLED. This is more noticeable to the human eye at low brightness. Options are to use the device at a higher brightness level or switch to an iPhone 8.

https://forums.macrumors.com/threads/eye-strain-while-using-...

https://news.ycombinator.com/item?id=16088173


That explains it, thanks. It's a deal-breaker for me, so if I could I would go back to the 8 Plus.


Apple will likely prefer a refund/exchange to paying for customer medical bills.


Isn't the solution to simply turn up the frequency of the flickering?


That would be one solution; however, flicker-free dimming for LEDs is not difficult anyway. Linear regulator chips are cheap but inefficient; switch-mode power supplies cost a little more. Both are mature technologies that any electrical engineer should be familiar with.


Now that you make me think a little deeper about it, it does seem simple. The exponential V/I curve of the LED makes the current extremely sensitive to small changes in voltage, so all you need is a feedback loop that controls the current.


LED light output isn’t linear in power input though. PWM allows you to drive the LED in its most efficient regime no matter how much brightness you want.


Efficiency tends to decrease with current; it is rarely more efficient to run a higher current with PWM than to simply run a lower current.

It is, however, often easier in terms of circuit design.


That’s interesting, I thought there was an optimal range. Do you have a link with more info?


Well, I can offer an example. This is a test of an LED using a bench power supply. In this test, efficiency drops every time current is increased, without fail.

http://budgetlightforum.com/node/51693


Apple displays are always poor quality.




