Monitor Brightness and Image Colour Temperature
Hi there
The following article assumes Win7 and a monitor that is colour-calibrated, but the main findings still apply to a monitor in ex-factory default mode.
I have spent some time in the past using a grey card to set the white balance in post-processing. But more often than not I rejected the card WB setting because it was not representative of the overall "look" of the image. Also, sunlit and in-shadow card positions give very different WB readings.
So, like many of us, I rely on personal judgement, moving the temperature and tint sliders until the image "looks right". The brightness of the monitor is kept at a constant intensity that is appropriate for subsequent printouts. If the print comes out too dark, I darken the monitor and lighten the image, and vice versa.
All boringly familiar to experienced Cap1 users, I'm sure.
But recently I used AMD software with my monitor to create a series of brightness settings that I saved as presets on my Taskbar. Then, with one mouse click I can change the brightness while also adjusting the WB settings. The brightest setting is the monitor default level (which I set manually, typically at 50%). The darkest setting is the level I use to proof for printouts (typically 20%).
In between these extremes I have 5 step-changes that I can apply at any stage when processing a raw file. So now, for example, I can set the image/monitor to "bright" mode and adjust the WB sliders to suit my taste, producing a high key image. Alternatively I can set the image/monitor in "dark" printout mode and adjust the WB settings - my normal workflow.
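For anyone who would rather script the presets than build them in the AMD panel, something along the following lines should do the same job. This is only a sketch: it assumes the third-party screen_brightness_control Python package (not the AMD software I actually use), and the 20% and 50% endpoints with five steps between them are simply the values described above.

```python
# Sketch only: brightness presets via the third-party
# screen_brightness_control package, not the AMD utility described above.
import sys
import screen_brightness_control as sbc

DARK = 20     # proofing level (%)
BRIGHT = 50   # monitor default level (%)
STEPS = 5     # intermediate step-changes between the two extremes

# Seven presets in total: dark, five in-between levels, bright.
PRESETS = [round(DARK + i * (BRIGHT - DARK) / (STEPS + 1)) for i in range(STEPS + 2)]

def apply_preset(index: int) -> None:
    """Set the primary monitor to preset `index` (0 = darkest, 6 = brightest)."""
    sbc.set_brightness(PRESETS[index], display=0)

if __name__ == "__main__":
    # e.g. `python presets.py 0` for proofing, `python presets.py 6` for bright
    apply_preset(int(sys.argv[1]) if len(sys.argv) > 1 else len(PRESETS) - 1)
    print("Presets:", PRESETS, "- now at", sbc.get_brightness())
```

Each preset could then be pinned to the Taskbar as a shortcut that passes its index, which gives the same one-click switching.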
Now the interesting bit. When I use the bright monitor setting, the WB colour temperature I select (for the high key look) is significantly higher than the temperature I select to make the same image look right on the dark setting. Also, if I take the high key image and switch the monitor to the dark, printout setting, my eyes see deeper, more saturated colours with little or no shift in colour hue. Altogether a more pleasing result.
Then, staying in "dark" monitor mode I can refine the software settings to create a high quality digital negative, checking the final outcome with a brighter monitor setting as necessary.
I have not seen a record of this monitor-driven approach elsewhere. It can be applied to any raw converter workflow and in my experience it enables me to create a starting point that is robust and very malleable to further adjustments.
Any comments, I wonder?
Peter.
So you are not actually using a colour-calibrated workflow?
Hi Steve
Yes, I am colour-calibrated. Despite this, a manually selected WB (that makes the image look right) using a bright screen has a higher setting than a WB for the same image using a dark screen. Perhaps all down to visual perception, but is it? The interesting thing is that I much prefer the darker-looking image with the bright screen WB setting. And it prints out well.
Thank you for your interest.
Peter.
Hi Peter,
I personally calibrate my screens either with an X-Rite i1Pro and its software, or with the built-in self-calibrators on newer Eizo screens using ColorNavigator.
I found that 100 cd/m² brightness, gamma 2.2 and D65 (6500 K) work best for me. You will find that people from different backgrounds use slightly different settings, but it's a good starting point. I've never heard of people switching the monitor's colour temperature and brightness around to suit a print; the usual approach is to profile the monitor and calibrate the printer so the results are always consistent.
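Purely to illustrate what those calibration targets mean in practice, here is a small sketch of my own (not anything from ColorNavigator or the i1Pro software): a display calibrated to 100 cd/m² white and gamma 2.2 maps an 8-bit grey level to luminance roughly as below; D65 only fixes the white point chromaticity.

```python
# Illustrative only: what "100 cd/m2, gamma 2.2, D65" means numerically.
# Not code from ColorNavigator or the X-Rite software.

WHITE_LUMINANCE = 100.0      # cd/m^2, calibrated white point luminance
GAMMA = 2.2                  # display tone-response exponent
D65_XY = (0.3127, 0.3290)    # CIE 1931 chromaticity of the D65 white point

def grey_to_luminance(value_8bit: int) -> float:
    """Luminance (cd/m^2) the calibrated display emits for an 8-bit grey level."""
    return WHITE_LUMINANCE * (value_8bit / 255.0) ** GAMMA

if __name__ == "__main__":
    for v in (0, 64, 128, 192, 255):
        print(f"grey {v:3d} -> {grey_to_luminance(v):6.2f} cd/m^2")
```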