Weird dithering “bug” on Lenovo X200

I don’t even begin to know how to search for a solution to this problem, let alone fix it. Maybe just by forcing myself to describe it, I’ll see through to a solution. If not, then I hope there’s someone out there who might be able to point me in the right direction.

white to black gradient sample
For most people this image should be a creamy smooth gradient from pure white to pure black

On my Lenovo X200 running an up to date install of Ubuntu’s Intrepid Ibex, I too see a smooth gradient—except on the far right, where it should be super dark gray just before the black, my screen breaks out in a case of adolescent 8-bit dithering.

Lenovo X200 dithering bug on Ubuntu Intrepid Ibex
Here’s a macro shot of my screen showing what I see
Lenovo X200 dithering bug on Ubuntu Intrepid Ibex
Here’s a 100% crop of the shot above with contrast applied

I can tell you one thing, it makes editing photos with dark areas really really annoying.

Here’s my /etc/X11/xorg.conf file (minus the commented-out header bits at the top). It comes whole hog from ThinkWiki.

Section "Monitor"
    Identifier    "Configured Monitor"
EndSection

Section "Monitor"
    Identifier    "HDMI-1"
    Option        "Ignore" "True"
EndSection

Section "Monitor"
    Identifier    "HDMI-2"
    Option        "Ignore" "True"
EndSection

Section "Screen"
    Identifier    "Default Screen"
    Monitor       "Configured Monitor"
    Device        "Configured Video Device"
    DefaultDepth  24
    SubSection "Display"
        Modes "1280x800" "1024x768"
# The following line was auto-added when I connected an external VGA projector; you might leave it out
# to let the system detect dimensions appropriate for whatever display you happen to use.
        Virtual    2432 864
    EndSubSection
EndSection

Section "Device"
    Identifier    "Configured Video Device"
    Driver        "intel"
    Option        "monitor-HDMI-1" "HDMI-1"
    Option        "monitor-HDMI-2" "HDMI-2"
EndSection

Apparently this is my video card driver:

$ lspci | grep VGA
00:02.0 VGA compatible controller: Intel Corporation Mobile 4 Series Chipset Integrated Graphics Controller (rev 07)

Is this the right driver? I have no idea. The X200 comes with an Intel Graphics Media Accelerator 4500MHD.

In an online forum I read:

if you see dithering in certain colors then it’s because notebooks use 6-bit displays and simply cannot reproduce all colors correctly. this phenomena is different than grain but will make a display look grainy under certain conditions.

Is that correct?

The comments in this official Lenovo blog post seem to confirm the above:

No one will use 6 bit TN panel for serious photo editing while there are more proper display technologies available for many years.

Over on Wikipedia I learn that TN stands for twisted nematic:

Also, panels that represent colors using 6 bits per color, instead of 8, are not able to display the 16.7 million color shades (24-bit truecolor) that are available from modern graphics cards. Instead, these panels display interpolated 24-bit color using a dithering method that combines adjacent pixels to simulate the desired shade. They can also use Frame Rate Control (FRC), which cycles pixels on and off to simulate a given shade. These color simulation methods are noticeable to most people and bothersome to some[citation needed]. FRC tends to be most noticeable in darker tones, while dithering appears to make the individual pixels of the LCD visible.

I’ve got your citation right here! But seriously, I have no idea if this 6-bit to 8-bit FRC translation really is my problem. All I know is that the X200’s LCD panel is a 12.1″ WXGA. But is the dithering an innate hardware limitation, or a fixable software issue?
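To make the Wikipedia description concrete, here’s a rough sketch of what a 6-bit panel has to do with an 8-bit channel value. The function name and the bracketing scheme are my own illustration, not the panel’s actual algorithm, but the arithmetic is why dark grays are the worst case: the panel can only show multiples of four, and must fake everything in between by mixing neighboring levels spatially (dithering) or over time (FRC).

```python
# Sketch of why a 6-bit panel dithers: each 8-bit channel value must be
# approximated by the two nearest 6-bit panel levels (multiples of 4 when
# expressed on the 0-255 scale). Illustrative, not the panel's real algorithm.

def nearest_6bit_levels(v8):
    """Return the two 6-bit panel levels (as 8-bit values) bracketing v8."""
    lo = (v8 >> 2) << 2          # drop the low 2 bits: 0, 4, 8, ... 252
    hi = min(lo + 4, 252)
    return lo, hi

# A dark gray like 10 falls between panel levels 8 and 12; the panel fakes
# it by showing some pixels (or frames) at 8 and some at 12.
lo, hi = nearest_6bit_levels(10)
print(lo, hi)   # 8 12

# Fraction of pixels/frames that must show the brighter level:
print((10 - lo) / (hi - lo))   # 0.5
```

In bright areas a one-in-four-level error is invisible; near black it’s a large relative jump, which would match the dithering showing up only at the dark end of the gradient.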

Update: Here’s a great article that explains the difference between 6-bits per color and 8-bits per color: LCD Color 8-Bit vs. 6-Bit. Apparently it all comes down to a trade-off between speed and color. 6 bits per color means faster response times, which is better for video, but worse for accurate color reproduction.

Wow, I feel like I’m getting a whole education on this. Especially when I read things like this: “There are no real 8-bit screens for notebooks at this point, it’s 6-bit like any other notebook panel.” When did the collective laptop industry go to shit?

Apparently a year ago Apple quietly settled a lawsuit over dithered laptop displays, this very issue. Apple? 6-bit per color displays. WTF? It’s like I’m in the twilight zone.

Note: I’ve removed “running Ubuntu” from the title, because it’s becoming clear to me that this is an innate AND AGGRAVATING hardware limitation, not a product of my running Ubuntu.


This is partially an intrinsic hardware issue (there’s only 6 bits per color possible in the actual physical pixels) and partially a software problem (you’re trying to display 24-bits of color on an 18-bit display). This limitation is actually quite common on lower-end (cheaper) LCD monitors.

Unfortunately, there’s also a mismatch between X and the hardware; X only has 1, 4, 8, 15, 16 and 24-bit color modes; no 18-bit.

You could try setting your DefaultDepth to 15 or 16 (5 bits per color; 16 usually allocates 5 to R and B and 6 to G). You’d be telling the software end to use slightly fewer colors than the hardware is capable of… You’d be trading for a whole different set of color problems (basically nothing translates cleanly between bit depths except black and white, and there are a few colors where the correct choice is ambiguous and some software decides the opposite of other software), but they might be less annoying/noticeable.
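A quick sketch of why nothing but black and white survives the depth change cleanly, assuming the common bit-replication method of expanding a 5-bit value back to 8 bits (the function is mine, for illustration):

```python
# Illustrative round trip: 8-bit channel value -> 5 bits (as in 16-bit
# "565" mode's R and B channels) -> back to 8 bits via bit replication.
# Only a handful of values survive exactly; everything else shifts.

def to5_and_back(v8):
    v5 = v8 >> 3                     # keep the top 5 bits (0..31)
    return (v5 << 3) | (v5 >> 2)     # common re-expansion to 8 bits

print(to5_and_back(0))     # 0   -> black survives
print(to5_and_back(255))   # 255 -> white survives
print(to5_and_back(100))   # 99  -> mid-tones shift slightly
```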

You could try Option "Dac6Bit" "True" in the Device section… If I understand it correctly, this tells the software that your hardware’s digital-to-analog conversion is limited to 6 bits, and might give better results.
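For reference, that option would sit alongside the Driver line in the existing Device section, something like this (untested on my part):

```
Section "Device"
    Identifier    "Configured Video Device"
    Driver        "intel"
    Option        "Dac6Bit" "True"
EndSection
```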

The statement about “serious photo editing” is true. If you’re doing “serious” photo editing, you want a really high quality LCD of the type they don’t put in laptops, or you want a CRT.

Dude! I will try all of the above when I get home. Thanks!

Eric, sadness. I tried all of your suggestions. Here are my results:

Setting DefaultDepth to 16 broke “Normal Visual effects” and added more dithering to my test image.

Setting DefaultDepth to 15 prevented Gnome from displaying the desktop—all I got was a cursor on a black background, and I had to fix xorg.conf in recovery mode.

Adding Dac6Bit "True" to the device section (with DefaultDepth set to 24) just seemed to break X, it didn’t like that at all.

Oh, and by the way, turning visual effects off with a DefaultDepth set to 24 didn’t have any effect on the dithering.

Got any other tricks up your sleeve?

I’ve run into the same problem with my X200. I’ve worked around it by scaling luma from the 0–255 range to 7–255, thus avoiding sending dark luma levels to the display altogether. Wrote a small HLSL shader for a few media players under Windows. I’ll do something for Linux when I get around to installing it on the laptop, but god knows when that might happen.
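The remap described in that comment is just a linear rescale; here’s a sketch of the idea in Python (the function name and rounding choice are mine, not the commenter’s shader):

```python
# Sketch of the workaround above: rescale luma so the darkest levels
# (where the panel's dithering is ugliest) are never sent to the display.
# The 7-255 target range comes from the comment.

def remap_luma(y, lo=7, hi=255):
    """Linearly rescale an 8-bit luma value from 0-255 into lo-hi."""
    return round(lo + (hi - lo) * y / 255)

print(remap_luma(0))     # 7   -> pure black becomes very dark gray
print(remap_luma(255))   # 255 -> white unchanged
print(remap_luma(128))   # 131
```

The cost is a small loss of contrast at the black end: you never get true black, but you also never hit the range where the panel falls back on heavy dithering.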


I have a very similar problem on my X200. When I use the laptop screen, there is no dithering in grays. However, when I use my external 22-inch LCD (Dell 2209WA), the dithering is so annoying that it affects reading web pages with gray backgrounds. It is striking if I open Adobe Reader without any document open (Adobe Reader has a gray background area). In my case the Dell 2209WA is much more capable than the X200’s LCD, yet that’s where the problem occurs. I think this is a driver issue rather than a hardware problem, but I have no idea how to fix it; it may be a limitation of the graphics card, or a compatibility problem between the driver and the card.

