[...] They offered to give me a full refund of my purchase. I suppose this would be fine, if I could go to another Apple retail store and purchase a portable machine of this size that had an 8bit screen. [...]
Here's where I stopped reading. I, too, would like a pony.
I'm not sure about this - if false advertising (millions of colors) caused him to sell his old laptop with a better screen, then he's still worse off, as a result of Apple's actions.
Then again, I think the whole MBP line has had 6-bit displays for its whole history, so I'm not sure why he's complaining now.
Damn near everybody was a poor peasant or slave in the 1000s; people were getting upset over nothing.
Just because the status quo is such it doesn't mean it is good and cannot be improved. I was searching for a quality LCD screen lately and I couldn't find a reasonable 8-bit screen at any shop, just 6-bit. I read a lot of reviews from seemingly professional sites and they all say the same thing: 8-bit MVA, PVA or IPS panel types are much much better than 6-bit TN at anything except price and maybe a bit of extra latency (not important for anyone except hardcore FPS gamers). The sad state of affairs is that all the reasonably priced 8-bit screens I wanted were out of stock for an unknown period of time at all shops (like the HP LP2475w) or out of production for good. It seems I'll have to pick the best of the worst and be happy just because everybody has the same.
When your product is theoretically a pro product, marketed as such, and pretty expensive to begin with, you expect the best quality out of all the components. I own a PowerBook and this thing is amazing. More than 5 years old and everything works perfectly except some scratches on the metal surface due to drops from heights. The screen in it is the best I ever saw, even compared to new expensive Dells, HPs and others I've seen at friends' and colleagues'. It will be a sad day when I have to replace it with a MacBook Pro and the screen will be worse.
That's not entirely fair. The specific information he was looking for is not made available to the public by Apple. He was only able to find the information after getting the model number for the display and checking with the vendor.
There was no way of knowing exactly which display was used without access to a MacBook Pro because that's not information made public by Apple, either. It's an awfully roundabout way of finding out how many colors can be displayed.
In hindsight it's easy to say he could have done the research ahead of time, but think about this: 1) It's called a MacBook Pro and is marketed at more demanding users, i.e. professionals, 2) Apple advertises it as being able to display millions of colors despite the fact that it really only displays 262,144. Under those circumstances, it's probably easy to take a look at the specs on the Apple site and feel confident that it's the right product.
There's still the matter of Apple refusing to disclose which of its products do, in fact, display millions of colors as advertised. At least they offered him a refund in the end, but in my opinion the whole thing sucks. I don't have a problem paying more and getting my money's worth, but they're clearly skimping on parts for products directly targeting the most demanding users.
>>It's called a MacBook Pro and is marketed at more demanding users, i.e. professionals
Which "professionals"? 8bit color is important to pretty much one professional group: graphic artists/designers (maybe two, if you include people doing video work). For the vast majority of people - "professional" or otherwise - 6bit vs. 8bit color doesn't matter. It looks good, it's fast, it runs what they want. Those are the things that matter.
As a developer, I consider myself a "professional", and my life is fine with 6bit color. My roommate is a screenwriting "professional" and they're fine with 6bit color. Presumably an 8bit color display would cost more, and hence would reduce the number of people who could afford the machine, at the expense of pleasing one small market segment.
Bottom line is that if a feature (any feature) was important enough for me that it would make or break my decision to buy the underlying system, I'd explicitly confirm its existence (in a way more rigorous than "millions and millions", which sounds like marketing-speak), and I'd get reviews from my peers who would have the same interests. In this case, the author did neither.
At a minimum, it's faulty marketing on Apple's part. They shouldn't state that it supports millions of colors when it actually doesn't. Just like the CRT monitors that are 17" but only 15" viewable... sometimes that stuff doesn't change until someone files a lawsuit.
It "supports" millions of colors; it just displays some of them with multiple pixel tricks. Which is probably fine with me -- I haven't seen a comparison where I could tell the difference without blowing the pixels up.
I agree. Most people don't know how LCDs work, don't know what dithering is, and wouldn't know what 6bit or 8bit meant if their life depended on it. To these - the majority - of people, "millions of colors" is "millions of colors", and any losses by dithering be damned.
They would ask "if I looked at my monitor, could it show me 'millions of colors'?" to which the one-word answer is "yes". If you had some time, you could expand it to be "yes, but the majority of those are done by this process called dithering, which is when..." and the person you're talking to has lost interest and/or gotten confused. To them, color is color, and they don't care about dithering since it's not visible to them.
This is who marketing is designed for. When most people here see an ad for a system, they immediately skip all the marketing bits and go to the stats, and from that get a good mental idea about the performance of the system, where the bottlenecks will be, etc. When "average" people see an ad for a system they look at the ad copy that says "will launch Word in half the time", decide that that's what they need, and buy the thing.
Ad copy caters to, and explains in context for, the lowest common denominator. If you're not that, then you should be looking elsewhere (at the fine print, the stats, the asterisk at the bottom of the page, etc.) to draw your own conclusions.
That's the least compelling definition of "supports millions of colours" I've ever heard. So did my EGA display adapter in the 80s, then. I'd be furious if I bought a 6-bit monitor in this day and age and it wasn't spelled out in giant letters all over the thing.
You must be furious pretty often, then - because the vast majority of LCDs sold today, with the exception of a few expensive high-end displays, use 6-bit color.
(This is 6 bits per channel, not 6 bits per pixel, by the way.)
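If the per-channel vs. per-pixel distinction trips you up, here is the back-of-the-envelope arithmetic as a quick Python sketch (my own illustration, nothing Apple-specific):

    # Bit depth on panel spec sheets is per channel (R, G, B), not per pixel.
    def panel_colors(bits_per_channel):
        # Distinct colors a panel can show natively, without dithering.
        levels = 2 ** bits_per_channel      # levels per channel
        return levels ** 3                  # one level each for R, G and B

    print(panel_colors(6))   # 262,144 -- the "262K colors" figure in this thread
    print(panel_colors(8))   # 16,777,216 -- the advertised "millions of colors"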
This probably means that you have a defective display. I have a 17" MacBook Pro that has this kind of defect, where I can see bad gradients, etc. However, three other Macs in our possession don't have this issue.
This is absolutely not true. The difference between the screen on the 13" and the 15" MacBook Pro is enormous. The text is far sharper on the 15", and that matters hugely to developers. It matters to anyone who reads text.
When you use dithering/pixel tricks to fake color depth, everything turns slightly fuzzy and washed out. Even black is affected. It really begins to affect the way things look and feel on the screen.
Oh come on. I've been using laptops for 15 years, and all of the screens have been fine for programming work. (even the monochrome ones) The 6-bit display on my MacBook is clearer and crisper than the machine I had before that, and that machine was better than the one before it. Yes, it's not a patch on my twin Dell 2408WFP's (which are 8-bit), but so what?! It's not "fuzzy and washed out" by comparison - that's a ridiculously hyperbolic statement! No, I don't do colour-accurate work on my laptop, and nor would I expect to.
I do agree with the sentiment that it is disingenuous of Apple (and almost everyone else in the industry) not to list the bit depth of their displays, though.
The differences in screen quality between the 13" and 15" MacBook Pro are not because of 6-bit vs 8-bit panels. All MacBook Pros ever made have only used 6-bit panels. The 15" just uses a higher quality 6-bit panel.
Apple advertises it as being able to display millions of colors despite the fact that it really only displays 262,144.
A 6-bit display is capable of displaying more than 6 bits of color information, using a technique named temporal dithering. For an example, open this image on a 6-bit display: < http://imgur.com/BeVicl.png >. If the display could only display 262k colors, the squares would look identical.
Note that the undithered 6-bit image is not what you will actually see on an Apple screen. The difference between true 8-bit and dithered 6-bit is difficult for most people to notice.
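If you want to make a test image like that yourself rather than trust an imgur link, here is a rough sketch (assumes Pillow is installed; the exact colors are my own choice, not taken from the linked image):

    # Two side-by-side squares whose fills differ only below 6-bit resolution.
    # On an 8-bit panel (or a dithering 6-bit one) you should see a faint seam;
    # on a naive 262K-color display both collapse to the same level and the
    # squares look identical.
    from PIL import Image, ImageDraw

    img = Image.new("RGB", (400, 200))
    draw = ImageDraw.Draw(img)
    draw.rectangle([0, 0, 199, 199], fill=(128, 128, 128))
    draw.rectangle([200, 0, 399, 199], fill=(128, 131, 128))  # +3 on green only
    img.save("6bit_test.png")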
He seemed to have very specific requirements; doing a bit of research isn't out of the question. I knew about the color situation (and don't care!), and it has been well publicized.
A class-action lawsuit about this issue was filed last year; it's not exactly breaking news.
This is not clear at all. Every manufacturer uses 6-bit displays, even in most if not all high-end models. This has been true for years. They choose to do so because such displays have desirable properties other than color, such as price and lower response times.
Alternate comment: How about commenting on the actual substance and not whining about the headline?
Sure, he's asking for a pony in some ways, but they're making pro machines with displays that don't do what they say or anything "pro" for that matter.
It's not just the headline. Who makes a purchase decision based on a product being called "pro"? It must be very confusing for him when he visits a supermarket and encounters words all over the place like "new", "improved", "better", "more", "bigger"...
A lot of people do exactly that. Marketers use those terms and graphical callouts because they work, not out of some altruistic desire to keep copywriters and graphic designers gainfully employed. I resent the time it takes to process all these claims when I go shopping, but people like you and me are a minority among the buying public.
I'd say that the Pro nomenclature is meant to raise expectations anyway, and it's certainly true that Macs are used almost exclusively by his kind of professionals. Not an unreasonable expectation then.
I believe that's what his alternate headline was doing. Saves you from also having to read the post unless you liked the constant whining and childish ranting from the blogger.
That's the problem. A lot of HN readers look at the top comment to decide whether the article is worth a read. jonknee's comment does not provide much value in this regard. It's somewhat disparaging and reminiscent of comments you'd find from fanboys on Engadget or a similar site.
The author makes the point that it's hard to believe an extremely well-regarded company like Apple would fail to disclose the true color depth of their panels, or that the millions of colors are generated by dithering rather than being a native capability of the monitor. I do agree, though, that his article was rather long for the point he made. There was some entertainment value in the length, but it was not value a typical HN reader would seek.
I would have taken my refund and used it to pay a notebook repair tech (perhaps an Apple one, off-the-job) to swap out my 6-bit display for an 8-bit one.
He mentions dithering a couple of times, saying it is "blending nearby colors". I have heard notebook displays also use temporal dithering, where they quickly change the color of a single pixel and your eye merges it into an intermediate value. (Strictly speaking, blending nearby colors could refer to nearby in time, but I think he was referring to spatially nearby.)
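A toy model of the two kinds of dithering being described, in case the distinction isn't obvious (my own illustration; real panels' frame-rate-control logic is more sophisticated than this):

    # Faking the 8-bit level 130 on a panel whose nearest native levels are
    # 128 and 132 (6-bit levels land on multiples of 4 in the 0-255 range).
    low, high = 128, 132

    # Spatial dithering: neighbouring pixels alternate between the two levels
    # and the eye averages them over space.
    row = [low if x % 2 == 0 else high for x in range(8)]
    print(row, sum(row) / len(row))            # averages to 130.0

    # Temporal dithering: one pixel alternates between the two levels on
    # successive frames and the eye averages them over time.
    frames = [low if t % 2 == 0 else high for t in range(8)]
    print(frames, sum(frames) / len(frames))   # also 130.0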
I am anxious to see a class action lawsuit against hardware manufacturers claiming "millions of colors" on 6-bit panels. One doesn't need to be a designer/photographer to tell two screens apart: a true 8-bit panel with 16.7M output and dithered 6-bit junk. And don't get me started on lack of standardization for contrast ratios and viewing angles.
Once consumers see real "262K colors" stickers on their laptops, who knows, maybe we'll get our FlexView screens back on ThinkPads.
If you've been paying attention you would have noticed that the tech goes backwards in the world of LCDs: as time goes on, high-quality LCDs are being displaced by 6-bit junk panels not only in laptops (there isn't a single 8-bit LCD laptop available in the US right now) but in desktops as well: 20" iMacs are 6-bit too.
Viewing angles and contrast ratios are shrinking as well, hence the disgusting proliferation of contrast-enhancing glass/film panel covers: it's like forcing every customer to wear permanent sunglasses just because the panel is junk.
Right on the money. Most people have no idea how difficult (and expensive) it is to acquire a decent non-TN LCD. And even when you find one, nobody ever gives the bit depth.
The reason the quality has been (and is) receding is because manufacturers have noticed that most people can't tell the difference between e.g. TN and IPS displays, so they produce the former, which are much cheaper. And better options are being phased out all the time; the two LCDs I have were phased out within a couple years. An S-IPS panel now costs at least $100 more than a same-size TN panel, and usually at much lower advertised contrast ratios, which further puts off consumers. Pretty much the only people who buy the good stuff now know their stuff, and we pay a premium because of it.
"...there isn't a single one 8-bit LCD laptop available in US right now..."
Yes, there is. The HP EliteBook 8730w can be ordered with a DreamColor 8-bit IPS (or PVA, not sure) LCD panel. And Lenovo used to ship the T-series line with IPS panels. I think the W700 can also be ordered with an IPS WUXGA panel.
In any more or less civilized, urban environment you should be able to choose from a selection of used CRT screens. Of course, you should know what you are buying and check them rigorously, as not all of them will be in good condition and CRTs develop various defects as they age. But generally, you can get yourself a decent CRT (or other piece of old equipment, for that matter) if you really want to.
I bought 2 CRTs half a year ago, for me and my brother. They were 21" Sony E530s, manufactured 2001-2002. Not the top models (the F520/G520 are better, and there is also the 24" widescreen Sony FW900), but I was not willing to hunt for the better ones; those were quite fine for us. I'm thinking of buying 2 more 21" CRTs for the country house.
Probably the recession has something to do with it. Right now the smart thing to do is compete on price rather than features. 2-3 years from now, the pendulum may swing back the other way.
All manufacturers use 6-bit displays on most if not all portable models. In fact, the trend has been more use of 6-bit panels, because they can be made larger for cheaper with a lower response time, things which more consumers notice. Last I checked, LG (one of the biggest suppliers) didn't even make 8-bit panels anymore.
The eye can see lots more than 262,144 colors. What's the number, 10 mil? Quick googling gave varying results, depending on what's counted as a distinct color (e.g. differences in luminance).
Yes but a 6 bit display accomplishes millions of colors through dithering, and your answer says nothing about whether that kind of dithering could fool the eye.
Since all colors on an LCD screen are dithered between red, green, and blue anyway, and the human eye has no trouble handling that, the question is whether a higher level of dithering would be similarly undetectable.
I'm no color expert but even I could tell the terrible banding issues with gradients on a MacBook's screen. It's extremely apparent as soon as you throw a smooth gradient on screen.
Is there an output device other than a "pro" monitor with the same range? If the consumers of your work don't have 8 bit displays, how will they know the difference?
It sounds like he was happy with it until he took it apart. If you only discover it's not good enough after checking the part number, I think maybe it is good enough.
Anyway, is there a picture somewhere I can download that will show me how awful my shitty 6 bit screen is? I've been skeptical of "I can see/hear more pixels/sine waves than you" before, but I've also seen compelling proof for some of it.
PS: Why is it 6 and 8 bit, not 18 and 24? It makes me think you're a retard for thinking 8 bits is enough for 16 million colors, but that seems to be how everyone labels it.
Note that an image that uses 6-bit color with an alpha channel will have 24 bits per pixel, which shouldn't be confused with the 32 bits per pixel of 8-bit color.
Apple has been using comparatively low-quality panels in their laptops for a while. I ran into an issue like this when researching my 17" Macbook Pro purchase in late 2006. I still ended up making the purchase, but it was indeed a lower-quality screen with very apparent dithering under certain color profiles.
I've noticed a phenomenal variance in LCD quality recently, and 6-bit is the least of the problem.
My Asus G1 has an incredibly good screen--very high-contrast, a near-perfect viewing angle: I can read text as far as 80 degrees to the side or top, where 90 means unable to see any of the screen. It's high-DPI (1680x1050 on a 15.4") as well. Odds are it's a 6-bit TN, and all the research I've done suggests that. But it's amazing despite that.
My company gave me a Thinkpad for personal use. It has a similar size screen. The contrast is abominable; the screen is bright gray when it's black and no matter how much I adjust contrast/brightness I can't make it usable. The viewing angle is atrocious. And yet from what I can tell... it's also a 6-bit TN.
Bits are, IMO, one of the less important problems in LCD panels these days; the overall quality as a whole is suffering greatly in new screens.
Regardless of what you think of the dude writing it, saying "millions of colors" is false advertising if it's done by dithering. Why stop at millions, why not advertise billions or trillions of colors? (Though I guess it's limited by the number of pixels on the screen.)
I had no idea about this, and that's the most dishonest piece of marketing from a reputable company I've heard in a long time.
MacBook screens have been crap for as long as I can remember, since the days when they were selling PowerBooks and iBooks, long before the first iPod.
I have no idea why the "demanding" Apple audience lets them get away with that, but the panels on these things have always had lower contrast and (sometimes ridiculously) lower viewing angles than comparable notebooks, e.g. the Sony Vaios.
I've done plenty of side-by-side comparisons and both problems are fairly obvious even to the untrained eye. For example, the RGB color #fefefe is generally indistinguishable from white on MacBook displays. Some of our designers refer to them as WashyBooks and spend time making sure their palettes are "WashyBook safe".
I suspect that would be a gamma difference with OS X, not an artifact of the screen quality. If you look at Windows computers, the lighter greys are generally a bit darker than the same colour on a Mac.
It's a PITA dealing with Apple on this, and the screens should indeed do what they advertise.
Still, since you already knew about 6-bit vs 8-bit issues, why did you not look in the store before buying? The dithering is visible to the careful eye just by looking for it on my 13" MacBook (white, plastic, relatively cheap).
Interesting. The Apple MacBook Pro website makes no distinction between the models as to the color support. They all say "millions of colors", and as someone who doesn't deal with colors for a living I would expect them to have identical support.
This is actually kind of an important point. I don't know what you're being downvoted for. The upshot is that you really only notice the 6bit/8bit difference on large, smooth gradients.
This article seems to indicate the exact opposite; that the new Pro displays (including the 13") have markedly improved colour accuracy.
"Colour accuracy in the three MacBook Pro displays is as right as we've seen in a laptop display, equal to or better than some midrange desktop displays and not that far off the level of colour correctness found in a premium desktop display."
Color accuracy isn't the same thing as color resolution. That's like the difference between precision and accuracy: you can have something accurate with poor precision, and vice versa.
Situations when you need all 8 bits of color resolution:
-- performing color matching against a sample - fabric, print, etc
-- soft-proofing for print output (spot colors)
-- when performing any color/contrast adjustments on photographs or art that have fine color/value gradients. A common example is a grey sweep backdrop: zoom in past a certain point and you can see visible banding.
That said, anyone doing prepress on a laptop is stupid. Get thee to a color booth and a 10-bit LUT Eizo (or even better to your cache of discontinued Sony Artisans you keep in a closet somewhere and replace as they age).
I think the main problem is the visual artifacts introduced by the time-domain dithering done to compensate for the 6-bit DACs, rather than the lack of color resolution itself. This affects everyone, not just artists foolish enough to use laptops.
The fact that, to a first approximation, one can no longer buy a laptop with an LCD of the same quality as those made 10 years ago - at any price - is disturbing in itself, and suggests that technology is marching backwards. What's next, the abolition of 44.1KHz audio?
Compare a 1980s keyboard to a modern one. It is entirely conceivable that the UI technology we will be using in a decade or two will be of such abominable quality that one might be ashamed to design it into a children's toy computer today.
Actually, it's funny that you pose that question. I think MP3s are a great example of "44.1KHz" audio that doesn't have the fidelity of what you're probably thinking of, the CD-Audio standard.
People are clearly prioritizing things other than the sheer quality of the material, be it visual or audio. I think it's impressive that the LCD folks have been able to engineer LCDs at the prices they have. And, much like MP3s shrunk music file sizes by throwing away parts of the music that people aren't supposed to miss, the LCD guys are throwing away quality that, apparently, people don't care about either.
> the LCD guys are throwing away quality that, apparently, people don't care about either.
The problem is that a substantial minority does care about quality - and we are left out in the cold. Have you ever priced a "specialist" monitor? That is where all non-lowest-common-denominator user interface technology seems to be headed - squarely outside the budget of the individual. And as we can see from the article, it has become entirely impossible to purchase a new laptop with both a decent screen and a decent operating system.
It is a trend in everything that surrounds us - cars, fridges, clothes, shoes, buildings - everything is made cheaper. Compare a computer case from 1990 with one from now - modern economy-class cases are ultra light. The reason is simple - the market doesn't care about 8 or 10 or 12 bit displays, so they are not offered. The cost to the company of introducing them far outweighs the benefit of selling them to the .01% of individuals who care. Or who think that they care.
Calibrating the display will give you accuracy, ie the colors match those in real life. It doesn't help with precision, which is how narrow the distribution is. (That said, we're not strictly talking about that here, since there's no randomness in the colors, just an inability to reproduce small enough changes.)
But I agree with the other comment, on a laptop I care more about things looking nice, ie no dither or banding, rather than about accurate color reproduction.
I went a-Googling for that before I saw your message, and here's the best thing I found. (A surprising number of 404s and forbiddens on most test pattern images.) This article [1] has a test image: http://static.photo.net/attachments/bboard/00G/00GkGu-302804... which has a grey gradient across the bottom. Zoom in (Firefox zoom is fine), and then look very closely at the pixels. If you see a dither pattern in most of the cells, you've got 6-bit. If it's smooth and unpatterned, you've got 8-bit.
I have 6-bit, but it's also higher res. Dithering is less of a problem in that scenario. But having seen it clearly, now I understand why you shouldn't do video-intensive work on this sort of display.
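In case that link dies too, generating your own test gradient is trivial (sketch assumes Pillow; any slow grey ramp will do):

    # 256 grey levels stretched across 1024 pixels.  On an undithered 6-bit
    # panel you'd see distinct bands every four levels; a dithering panel
    # hides the bands behind the fine pixel pattern described above.
    from PIL import Image

    width, height = 1024, 128
    img = Image.new("RGB", (width, height))
    for x in range(width):
        g = int(x * 255 / (width - 1))
        for y in range(height):
            img.putpixel((x, y), (g, g, g))
    img.save("grey_ramp.png")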
If I have two monitors side-by-side, one 8-bit and one 6-bit, I'm pretty sure I could see the difference (just like, as an audio engineer, I can clearly hear the difference between high end monitors and consumer speakers). The question is...how do I know, if I don't know? You know?
I have multiple monitors, but the only one that I knew was 8-bit (a high end Viewsonic intended for designers, and from a time when specs were easy to find and very clear and generally accurate, at least on high end devices) has long since left the stable as it was only 19". I'm certain the differences are subtle, as they are in good consumer speakers vs. good studio monitors. But, it'd be nice to know if what I'm working with is altering my perception of things. Particularly as I've just done several thousand dollars worth of design work for print on my new laptop which may or may not have an accurate display. It's just nice to know what I'm dealing with is all.
Moreover, different batches have different monitors.
Manufacturers keep lowering prices (remember how much the PowerBooks cost?) so they must use cheaper technology, and most people are mostly ignorant about specs anyway.
Glossy screens are another example (they're cheaper to make and they look better at first sight).
Exactly. I've heard many "graphics professionals" who buy a new MacBook Pro to replace an old one and blame the 6-bit panel for poor colour reproduction. It's not the fact that the panel is 6-bit. They came from a 6-bit panel, although most were under the impression it was 8-bit. The fact is just that Apple is using cheaper displays. The problem is that it's a crappy 6-bit panel, not that it's specifically a 6-bit panel. After all, they didn't complain about the old 6-bit panel they were using.
It's funny to hear some of these guys talk about how they buy the MacBook Pro over the MacBook because of the 6-bit vs 8-bit panel when all of those laptops have only ever had 6-bit panels in them. The Pro just happens to use a better quality panel, that's all. But it's still 6-bit.
Does anybody know why Apple won't just come out and declare whether the 13" MacBook has 6-bit or 8-bit color?
As a side note - this is why I love HN - I remember the class action lawsuit over colors and Apple a few years ago (or at least discussion around it) - but I didn't know what it was over. Now I have insight into the differences in LCD color qualities and get the insight that the 13" MacBook pro isn't _really_ totally Pro.
If it's true that all panels are now 6-bit, that wouldn't matter. Also, specifying a 6-bit display wouldn't prevent them from silently shipping a better one in the future -- Apple has been known to ship faster processors or optical drives than specified.
I'm not sure what this guy is looking for. Regardless of whether the display is 6-bit, or just 8-bit that looks crap, you're still going to have a problem. Unless you're harboring a fantasy that Apple will give you a MacBook with a custom screen by way of an apology.
They've offered you a refund. Take the money and use it to buy the 15" MacBook with a better screen. And this time try checking out the product to make sure it meets your needs before splashing down thousands of dollars.
I dunno, "pro" for a graphic artist probably includes "actually displaying millions of colors instead of dithering so I can actually tell what color it is."
Nice. I love the particularly relevant blurb at the bottom:
"Whether you’re a designer, photographer, or any kind of creative pro who deals with visual material, you face the daunting task of getting your colors just right. [...] Learn how you can make the most of your Mac to maximize the effectiveness of your colors."
Can you back up this assertion? Since one person in this topic already said that the 15 inch model does have 8bit and you just say that it doesn't, how am I supposed to believe you?
Well, there are very few notebooks with other than TN displays. Thinkpads used to have IPS, but since the introduction of latest generation (T400, T500), they have TN too. For a very short list, see a poster above (http://news.ycombinator.com/item?id=706579), he mentions one HP, one Lenovo model with optional IPS (W in series designation means workstation) and one Sony model.
If it's so obvious and important to his professional work why doesn't he drag his ass into the Apple store where he bought his MacBook and check out the screens of all available models? This would have been advised before buying the 13" machine, but hey at least he can get a refund. Apple already "misled" him, so why does he care what they say? Walk into the store and check for yourself. Or can he not tell the difference?
Might sound like a smart ass remark but if he can't tell by looking at the displays in person whether they're 8-bit or 6-bit, then I can see why 99% of laptops on the market use 6-bit panels. I fully understand that when you compare them side-by-side you can see the difference. But I remember looking into which LCD panels were actually 8-bit and I could count the number of laptops (2 years ago) which were using these panels on one hand.
There are 8-bit portable displays, you know, and Apple should be a company that uses them, especially since they are generally known for great-quality products. Instead, they went with the cheaper option, a lower-bit display.
Translation: I am unhappy that Apple made a business decision to use a cheaper component that won't affect most users of the machine, but negatively affects my particular use of it.
More broadly, in this day and age I'd also be skeptical of the author's notion that the word "Pro" means it's explicitly designed for "professionals" as opposed to normal people. Look at most advertising - the word has come to mean nothing more than high-end consumer grade. For the most part, if something is made explicitly for professionals in a certain field it will rely on industry-specific promotions/reviews and knowledge to make itself known, as opposed to outright advertising. This is similar to the frequent use of the word "exclusive" in reference to availability - if it was really exclusive, it wouldn't be advertised.
And even if I'm wrong about those two points, when did "professional" come to mean "digital art professional"?
WOW. I'm finding this hard to believe. So the MacBook Pro 13" doesn't have an 8-bit display; that's a little lame, we agree. But here's the problem. You were in an Apple store and used the laptop and didn't notice. Then you took the laptop home and somehow 'found out' it doesn't support the millions of colors you apparently need (I'm no designer so I have no appreciation for this)... So... go back to the store and do the same 'check' you did for this laptop on a 15" model -- if it passes, problem solved; if not... well, maybe you should just do your designing on your 30" Cinema Display, which you now have working after receiving a $70 discount on the adapter (granted, a $70 discount minus the pain of not having it for a week or whatever). /rant
Does anyone know the price difference between 6-Bit and 8-Bit display panels? I can't really find any info on it. I've always thought, by comparison, that Macbook displays looked fantastic compared to the HP PC laptops I'm used to using at work. Just as good as the Samsung displays I use on my desktop. I do believe Apple still has their 30 day no questions asked, no restocking fee, return policy though so there's that.
The main difference is with the technology used in the panel -- cheaper panels (TN panels) generally only use 6 bits for each colour element of the display (Red, Green and Blue), meaning that it is capable of producing a much more limited range of colours (the thousands indicated by the article).
For playback purposes (DVDs, games, etc), most consumers won't notice the difference. For content creation, though, 6-bit panels are a disaster. I've seen cheap panels from Dell that simply could not display colours in the light blue or light yellow range correctly, and light greys not at all.
Wikipedia has an excellent article about the different types of LCD panels available, but the takeaway is basically "you get what you pay for."
Maybe I didn't read the article correctly, but if this guy is outputting to another screen, isn't it the video card that's not outputting the colors/resolution correctly, and that has nothing to do with whether the laptop screen is 6bit or 8bit?
Wait a second, this guy is an icon/UI designer... which means he designs for computer screens, so to be realistic, most of his customers do not have "millions of colors" either.
He has a 30 inch Cinema Display that he can now use for designing and checking colors with the replacement cable, but he still would rather rant and rave and call support for hours on end rather than just replace the laptop.
Professionals in any creative industry need to be able to see or hear exactly what they are actually creating. This is why audio monitors for recording studios cost a lot, and don't "sound as good" to the untrained ear. Accuracy is what you are paying for. If it is inaccurate in any one direction, the potential for problems during reproduction on client machines (which could be inaccurate in the other direction; doubling the magnitude of the problem) is much higher.
Cheap monitors are bad in many different ways. One cannot predict exactly how it will diverge from what it ought to look like. So, if your source has been designed on a device that is accurate and balanced, you have a dramatically higher chance of it looking or sounding good on the majority of devices.
I think it's a valid criticism of Apple. It is information that is valuable to consumers who work in the design field, and it's information that I would expect to find on the "Tech Specs" section of their site. The fact that they don't provide that information, for quite pricey machines intended for "professional" users, makes me very much less likely to buy their products. But, then again, I'm generally surprised that anyone puts up with the abuse that Apple inflicts on their customers.
> This is why audio monitors ... don't "sound as good" to the untrained ear.
I've never understood this, myself. If it was mixed on monitors, and made to sound good on monitors, then shouldn't it sound best when replayed on monitors?
I think the idea is that taste can be developed. A musician probably will like the sound best from a monitor, but someone who doesn't pay as much attention to sound has a less nuanced appreciation and taste. Cheaper speakers are often biased to exaggerate the qualities that are appealing to the majority (whose tastes are less refined), hence the scare quotes acknowledging the subjective interpretation of goodness.
I didn't actually intend to make actual speaker "goodness" a factor in the discussion, but I see now that it seems to have become one; the scare quotes were to indicate that a non-engineer (even a musician who isn't also an engineer) would possibly prefer consumer speakers because they often smooth the sound and boost some frequencies often perceived as being "good".
As you note, consumer speakers (at most price points under the obscenely expensive mark) tend to hype certain frequencies: bass and highs in particular. Folks hear it and think, "Wow, so bright and clear! And dig that bass! You can really feel it." Studio monitors do none of those things, on purpose. The job of studio monitors is not to lure a consumer at Best Buy into thinking this set of speakers is better because it is louder and brighter in a room full of noise. Their job is to accurately reflect what is on tape.
So, engineers very rarely talk about whether monitors sound "good". The word engineers use as praise is "accurate". And, when they do use the word "good" in this context, they usually mean "accurate", where someone else probably means "hyped".
I've got a pair of Quested F11's (the sexy purple version) at home, and I think they sound really "good" :)
They're the best investment I've ever made, but I'm just a music-enthusiast. I was roughly aware of the issues you brought up, and it was a very good and thorough explanation.
I guess I just wanted to point out that there are even "normal people" out there who appreciate accuracy/quality in speakers.
It's important to know that anyone can benefit from using studio monitors for listening to music. They reproduce music in such a lovely way - accurate, detailed and pleasant. Listen to your favourite music, but in a new way. Discover things you didn't know were there.
Even so - despite the accuracy, shall we say - there's plenty of bass to go around, and your studio monitors can put out insane levels of volume and still reproduce everything beautifully.
Actually I agree that folks who want a really great set of speakers probably should go for a reputable brand of studio monitors and appropriate amp (these days some studio monitors are powered and need no amp). You'll get better accuracy and less hype (marketing and/or frequency response, or both) than from high end audio speaker manufacturers. I'm suspicious of most high end "audiophile" products, because there's so much superstition and snake oil surrounding the business. Magic cables that cost hundreds of dollars, electronics suspended to prevent vibrations from altering the sound (this is sane for speakers and mics, not sane for solid state electronics), light switch and cable jack covers that claim to improve the sound of the room, etc. All with obscene price tags. A good set of studio monitors and a good amp isn't cheap, but most of the money you spend goes towards important stuff; like high quality components and a good design.
Folks are always amazed by the audio in my living room...it doubles as my office and home studio, and I have an obscenely overpowered Hafler amplifier driving a pair of Tannoy System 600 speakers. Those little speakers can shake the house, and they do it while being really precise and clear. I also have bass traps in all the corners and have made more than a passing attempt to make the room balanced and non-reactive, which is something most music consumers have never experienced (a good sounding room is at least as important as the speakers you put in it). It's still not exactly a studio experience, but it's pretty awesome for Rock Band Taco Night. I also just bought a projector and an 80" diagonal screen for our booth at OSCON next week...Rock Band Taco Night will never be the same.
> Actually I agree that folks who want a really great set of speakers probably should go for a reputable brand of studio monitors and appropriate amp (these days some studio monitors are powered and need no amp).
Right. This is what I keep telling anyone who's interested. Of course, most people are blissfully oblivious to quality (wrt. lots of things)
My speakers are powered though. I'd have no idea what kind of amp to get.
> Magic cables that cost hundreds of dollars
A timeless classic :) Some time last year, I was investigating headphones, and came across a pair of Grados at some shop.
I mentioned that the earpieces felt abrasive, and the "helpful" shop owner suggested I buy a pair of thicker, fluffier ones for around 50 euros. Yes, 50 euros for two pieces of foam or whatever. "THIS.. IS.. FINLAND!!"
He claimed, with a straight face, that the headphones would sound better with the thicker earpieces too, "because they're shaped to better direct sound into your ears" .. :P
> a good sounding room is at least as important as the speakers you put in it
This is particularly unfortunate for me, since I live in a crummy apartment with concrete walls. There's not much I can do about acoustics, short of a considerable (and very costly) renovation.
I bet your Rock Band Taco Nights are awesome :)
By the way, what kind of headphones would you recommend? I'm leaning towards Beyerdynamic DT880(PRO?)'s.
I'm not SwellJoe, but I'll jump in to recommend Sony's MDR7506 headphones. These aren't consumer junk. They're comfortable, sound good, and are widely used by professionals for soundboard, editing, and broadcasting work. Next time you go to a show, take a look at the soundboard - chances are the folks working it will be wearing 7506 headphones.
I think they're about 1/2 to 1/3 the price of those Beyerdynamics, too.
The point is that you can't know what it sounds like unless they are accurate. Every consumer speaker sounds different in different ways. So, if I bought a pair of mid-range car speakers to mix on, assuming most of my target audience would hear it on their car stereo, and I made it sound good on those speakers...even if everybody only listened on car speakers, odds are good it would still be too hyped or too bassy or too tinny on about half of all speakers. The degree of variance is the problem.
If I mix on a set of accurate studio monitors (and assuming a reasonable level of skill in the area), the sound will be pretty well balanced across any number of speakers. Some will be hyped (smiley curve EQ), some will be bassy, and some will be bright. But a balanced mix will be pretty well-balanced on any set of speakers of reasonable quality. If I mix on speakers that lead me to boost or lower a specific set of frequencies, the speakers that do the opposite will be twice as far off as if I had mixed to a balanced set of speakers.
Everybody here is probably pretty mathematical, so think of a single sine wave at a specific frequency. Say I'm aiming for an amplitude of 1 (units irrelevant). My mixing speaker boosts it by .5, because it's a speaker aimed at guys who like their cars to rattle. So, now I've mixed this sine wave at an amplitude of .66. If someone who likes bright sounding music (and with speakers that drop the bass by .5) listens, my mix now has the bass at .33 (ignoring all the dB math, and the crazy physics of ears/distance/frequencies etc. for the sake of simplicity).
If, on the other hand, I mixed on a speaker that accurately gave me a 1 when I wanted a 1, it'll play at 1.5 on the bass-heavy speakers, and at .5 on the treble-heavy speakers. Smaller differences are better.
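Spelling that arithmetic out with the same made-up numbers (ignoring the dB math and real psychoacoustics, as above):

    target = 1.0            # what the engineer wants the listener to hear
    bassy  = 1.5            # playback speaker that boosts bass by 50%
    thin   = 0.5            # playback speaker that cuts bass by 50%

    # Mixed on the bass-boosting speaker: the engineer pulls the level down
    # to 1.0 / 1.5 ~= 0.66 so it sounds right on that speaker.
    print((target / bassy) * thin)         # ~0.33 on the thin speaker -- way off

    # Mixed on accurate monitors: the recording carries 1.0.
    print(target * bassy, target * thin)   # 1.5 and 0.5 -- smaller worst-case error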
Note also that professional studios also tend to include a few sets of speakers. You mix on the "mains" which are usually high quality near-field monitors, and you test on the small and big speakers and possibly a specific set of speakers that matches your target audience. Good engineers also usually live and breathe their projects, and so they'll also listen in the car on the way home and to work in the morning. The main work of mixing, however, always happens on a good, accurate set of monitors. If it doesn't, odds are very good it'll sound like crap on a good portion of speakers.
And to answer this question specifically:
then shouldn't it sound best when replayed on monitors?
Sure, but how many people have $1000+ speaker/amp combos in their car, iPod, home, etc.? The problem is merely one of making the best mix for the most people. If you know exactly what speakers your customer is using, and those are the only speakers that will be used for playback of your recording, then you would obviously tune your mix for that exact set of speakers. But, that's pretty much never the case. We do the best we can with a complex set of variables.
I don't buy the accuracy argument for a second. By working only on the most expensive, rarified equipment, the only thing you're ensuring is that your own experience will be drastically different from 99.9% of end users.
Imagine if the guy responsible for developing new recipes for McDonalds insists that his "creative work" only be done with organic Niman ranch beef.
Yes, your experience will be different from that of most end users. But it will be minimally biased - you won't be baking in the settings that sound good on one set of cheap speakers but which would sound like total ass on other ones where cost has been saved in other ways.
For example, many hifi and computer speakers come with some kind of bass boost. If I edit something using speakers like that there will be little or no bass on speakers which don't have that 'feature'. Or if I go in the other direction and finish on speakers that sound rather thin, the result will be an unmerciful honking noise on anything with a bass boost. Using a pair of (moderately) expensive speakers means I can split the difference, and know that while it will never sound as good as I'd like on consumer speakers, it will never sound too awful either.
If you still don't think it matters, watch some short films or videos on YouTube. Then take even a modest Hollywood movie and notice how vastly, enormously better it sounds. Obsessing over sound and color seems trivial to those who don't have to work on content, but the very fact that most people take these things for granted is what fuels that obsession: because people are so sensitive they notice very quickly if the quality is off, even if they can't articulate why.
Also, cheap components have a ridiculous degree of variance. As a designer you need your work to look/sound the same on all of the tools you use. That shade of blue should not look bluer on one monitor in your studio than on another.
It's generally a poor assumption that averaging a huge number of inaccurate devices is going to end up with the same sound as one high-quality set. I would support finding out what the "average" home equipment sounds like, but suggesting "perfection" is the best approximation is laughable. For example, the average single-cone speaker is going to be set up to handle mid-range sounds OK, but it's going to miss highs and lows.
I think the real advantage is people get used to how their system adjusts music. So by following the convention it's going to sound "normal" to them.
Without dithering, 6 bit LCDs produce 191 colors. 8 bit LCDs produce 767 colors. If you want more than 767 colors you will be using dithering.
I think where people get confused here is that the naive dithering, where a group of 3 pixels (one of each color) are logically grouped together, gives you the 262k and 16M numbers, but there is no reason that is the only desirable solution.
Consider sub-pixel rendering of typography. This abandons the traditional groups of 3 pixels and uses alternatives to gain better horizontal spatial resolution. Consider a white, lowercase 'L'. Depending where it appears on my screen it could either be a single white pixel under the naive "groups of three" dithering, or a yellow pixel next to a blue pixel. The catch is that unless I show you the borders of the screen you can't tell which it is.
Fortunately, I have a brand new 15" MacBook Pro under my fingers. Tonight when it gets dark I'll take a picture of a gradient, enlarge it beyond the capabilities of my eyes, and we will have an answer about what Apple does.
You are confused. "6 bit displays" have 6 bits of color data per channel, and hence can display 2^18 (262,144) colors without dithering. I'm not sure where you're getting 191 and 767 from.
I am in no way confused. The three color components of an LCD pixel are not in the same spatial location. Look closely and you will find three rectangular areas, one for each color. The only colors you can make without combining elements are the 2^6 or 2^8 variants of red, green, and blue. I subtract two (ok, I am confused, I only subtracted one before) because the blacks are all the same.
1. If you're seeing the RGB elements separately, you need to sit a little farther back from the screen. :) I can't pick them out individually at all (I can just barely see the pixel grid with the screen an inch from my face), so it's not particularly useful to think of color this way - because nobody sees it like that.
2. Even ignoring this, your numbers are still wrong. 2^6 = 64, and 2^8 = 256.
2^6 = 64, but I have red, green, and blue pixels, so times three, but subtract two because all the blacks are the same.
I suppose I'll need to write a small article on this with some pictures. In the process of researching this I was impressed with how useless macrophotography of a screen was. Small changes that are obvious to the human at a distance are invisible when looking at the individual r,g,b elements up close.
I'm away from my camera that can output RAW data, maybe with it and some numerical image analysis I can tell if there is multipixel dithering going on, but my human eye isn't going to tell, even from magnified pictures.
There's a much easier way to demonstrate multipixel dithering: create a 50% gray bitmap (the scrollbar on an xterm works fine for this) and drag it around on screen. If you see weird color effects, there's multipixel dithering happening.
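If you don't have an xterm handy, a throwaway grey patch is a couple of lines of Python (assumes Pillow; #808080 is as close to 50% as 8-bit gets, and a value that falls between 6-bit levels, say 130, may show the effect more readily):

    # Solid mid-grey block.  View it full-screen and drag the window around;
    # colored fringes or shimmering while it moves suggest the panel is
    # dithering rather than showing the value natively.
    from PIL import Image
    Image.new("RGB", (512, 512), (128, 128, 128)).save("gray50.png")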
So far, on large areas I can see a difference between #404040 and both #3f3f3f and #414141 on the new 15" MacBook Pro.
That rules out a naive 6-bit display. I think. The application of color profiles could cause some regions to have a one-to-one mapping even if there aren't enough output codes available to cover the inputs, but I think with the 4:1 difference I can rule that out.
It will make a table with cells whose text is just slightly off the background. You can stare at them and explore the limits of your visual system and your computer's display. (No one will be able to see all of the text on any statically adjusted display.)
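For anyone who wants to reproduce that kind of test page, a minimal sketch (my own throwaway script, not whatever tool is being referred to above) that writes an HTML table whose text is only a few 8-bit levels away from each cell's background:

    # Each cell: grey background, text 1-4 levels brighter.  The smallest
    # deltas will be invisible on displays (or eyes) that can't resolve
    # single-step differences.
    rows = []
    for bg in range(112, 145, 8):
        cells = []
        for delta in range(1, 5):
            fg = bg + delta
            cells.append('<td style="background:rgb(%d,%d,%d);color:rgb(%d,%d,%d);'
                         'padding:2em">+%d</td>' % (bg, bg, bg, fg, fg, fg, delta))
        rows.append("<tr>" + "".join(cells) + "</tr>")
    open("contrast_table.html", "w").write("<table>" + "".join(rows) + "</table>")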
Additionally, on this MacBook Pro I looked at a greyscale gradient which I have used in the past and decided it was poorly distributed, so I ran through the color calibration (expert mode) in the Displays preferences pane. I ended up with a much better-distributed gradient, but I introduced hue distortion so I didn't keep it. But there seems to be room for an improved profile.