"IMPORTANT!" How reliable are optimal osd settings?

Today I learned something very important. When unpacking my 3rd IPS panel, hoping for less IPS glow (which was not the case -_-), I still made a very valuable discovery.

When I applied the "optimal" settings from my review, the picture looked a little strange to me. So I took my meter and measured again. Result: the white point was at almost 7180 Kelvin! It is clear that the temperature varies across the picture due to the backlight illumination, but I would not have thought that individual monitors differ this much. Even TFT Central lists different optimal settings, although the same measuring device and the same software were used for the measurement.
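
For anyone wondering where these Kelvin values come from: the meter reports CIE xy chromaticity, and the software converts that to a correlated color temperature (CCT). A minimal Python sketch using McCamy's approximation - the xy inputs here are illustrative, not my actual readings:

[code]
# Minimal sketch: how a white point in Kelvin is derived from a meter reading.
# The meter reports CIE 1931 xy chromaticity; McCamy's approximation turns
# that into a correlated color temperature (CCT).

def cct_mccamy(x, y):
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

print(round(cct_mccamy(0.3127, 0.3290)))  # D65 white -> ~6505 K
print(round(cct_mccamy(0.3064, 0.3166)))  # cooler white -> ~6994 K
[/code]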

The monitor was on for about 2 hours before I took these measurements. All ICC profiles and calibrations were disabled. And practically everything else differs between the units as well, sometimes more, sometimes less.

Preset color temperatures (K):
Monitor 1: Normal 7800 / Warm 6600 / Cool 9600 / User 6000
Monitor 2: Normal 7651 / Warm 6613 / Cool 9441 / User 6482

Brightness, with a huge difference:
Monitor 1: 100% 369 cd/m2 / 70% 286 cd/m2 / 50% 228 cd/m2 / 20% 130 cd/m2
Monitor 2: 100% 407 cd/m2 / 70% 318 cd/m2 / 50% 252 cd/m2 / 20% 146 cd/m2

Contrast:
Monitor 1: 100% 1189:1 / 70% 1144:1 / 50% 1200:1 / 20% 1180:1
Monitor 2: 100% 1103:1 / 70% 1093:1 / 50% 1060:1 / 20% 1115:1

Preset (Racing Mode):
Monitor 1: 257 cd/m2 / 0.22 cd/m2 (black point) / 1166:1
Monitor 2: 289 cd/m2 / 0.26 cd/m2 (black point) / 1112:1
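
As a sanity check: a contrast ratio is just white luminance divided by black luminance, so you can reproduce these figures yourself. The small differences to my reported values come from the meter carrying more decimals than I show here. A quick sketch:

[code]
# Contrast ratio is simply white luminance divided by black luminance (cd/m2).
def contrast(white, black):
    return white / black

# Racing Mode readings from above:
print("Monitor 1: %.0f:1" % contrast(257, 0.22))  # ~1168:1 (I reported 1166:1)
print("Monitor 2: %.0f:1" % contrast(289, 0.26))  # ~1112:1
[/code]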

Optimal settings (120 cd/m2, 6500 K):
Monitor 1: RGB 94 / 92 / 100 / Brightness 22 / Contrast 50
Result: Whitepoint 6504 K / 120 cd/m2 / Blackpoint 0.12 cd/m2 / Contrast 1004:1
Monitor 2: RGB 100 / 96 / 100 / Brightness 14 / Contrast 50
Result: Whitepoint 6502 K / 119 cd/m2 / Blackpoint 0.12 cd/m2 / Contrast 991.7:1

Why isn't the difference in contrast as big anymore with the optimal settings? Quite simple: Monitor 1 has better native contrast but a larger white point deviation. Calibrating it down by -6 red and -8 green costs white luminance while the black level stays the same, so its contrast ratio drops toward Monitor 2's.
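
A small sketch of that effect. Only the 0.12 cd/m2 black point and the 120 cd/m2 white are my measurements; the pre-calibration white level at this backlight setting is a hypothetical value for illustration:

[code]
# At a fixed backlight level the black level barely moves, so cutting the R/G
# gains (which only lowers the white luminance) costs contrast directly.
black = 0.12              # measured black at this backlight setting
white_native = 138.0      # HYPOTHETICAL white before the RGB cut
white_calibrated = 120.0  # measured white after -6 red / -8 green

print("native:     %.0f:1" % (white_native / black))      # ~1150:1
print("calibrated: %.0f:1" % (white_calibrated / black))  # 1000:1 (I measured 1004:1)
[/code]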

What do I take away from this?
Don't use the same settings as other people. Monitors vary from unit to unit. Gaming monitors are not built for perfect, stable homogeneity, and honestly they don't need to be. In games you won't see a big difference, whereas graphic designers working in print, for example, need very good homogeneity without deviations in color temperature or illumination (brightness across the screen). Otherwise a WQHD IPS gaming monitor would cost 1500€/$ instead of 800€/$.

Can you see the difference?
You can see it a little in the blacks, while it's hard to see in the contrast. It's also important to say that a single test cannot tell you how "good" the brightness peak of an Asus PG279Q is: the 2nd monitor had a peak of 148 cd/m2 while the first reached only 129 cd/m2. That's a difference of 19 cd/m2, which is quite a lot at this low brightness level.

What do you think about that? Does it still make sense to measure a monitor in its presets? Does it make sense to measure the color temperatures of the presets? A rough guideline value for the contrast is certainly still helpful.

My goal
In the future, my goal is to buy 2-3 units per review, to see what the average customer actually gets. The problem with test samples from manufacturers is that most of them check their units before sending them to a reviewer.
