Graphics cards and sharpness -- is it my imagination?

I spend (retired) winters down South. I run the latest Linux Mint in both places. Lots of fast, expensive memory (32 GB) in both. Same 32-inch Dell monitors in both places. The down-South desktop box has a hot-rod Nvidia “gaming” graphics card (I bought that rig almost new but second-hand… long story). The up-North desktop box has a cheapo graphics card I keep meaning to upgrade but have not so far.

The same images look sharper to me down South where the only difference is the graphics card (same Dell monitors both places). Is this possible? Or is it my imagination?

It’s quite possible you’re perceiving a sharper image from the “hot rod Nvidia” card vs. the “cheapo” card. What you could do is put both setups side by side and have someone else connect the monitors with the cables hidden, so that you don’t know which machine is feeding which monitor. Then see if you can still tell a difference.
I guess having both monitors calibrated would be a good idea, but maybe use a different machine for that beforehand. :confused:

:=)) The two setups are 2,500 miles apart. I carried a 6 TB hard disk with me, in a foam pouch, so I could bring my entire image collection along.

Hm… Nutty idea, perhaps, but what about:

- taking a screen dump up North, and
- a screen dump (of the same image) down South,

and then comparing the screen dumps side by side?
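If the screen dumps are saved losslessly (e.g. as PNG), the comparison can even be done programmatically before any eyeballing. A minimal sketch, assuming two hypothetical files `north.png` and `south.png`, using only the Python standard library to check whether the two dumps are byte-identical:

```python
import hashlib


def file_digest(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()


# Example usage (hypothetical paths -- adjust to wherever your dumps live):
#   same = file_digest("north.png") == file_digest("south.png")
#
# If the digests match, both machines rendered identical pixels, and any
# perceived difference must be happening downstream (monitor settings,
# cable, room light). If they differ, the cards really are producing
# different output.
```

One caveat: identical digests only prove the framebuffers matched if both screenshots were taken at the same resolution and saved with the same encoder; differing digests don’t, by themselves, tell you which image looks sharper.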

Have fun!
Claes in Lund, Sweden


@pittendrigh I don’t know what your traveling restrictions might be, but traveling light is surely understandable. A monitor and computer tower don’t seem like much, but I wouldn’t want to lug them around either. If side by side isn’t ever going to happen, don’t lose sleep over it.

I think both video cards and monitors often have configuration settings for sharpness and other options that might affect perceived sharpness. Since the monitors are the same, I would start there. Then check the card settings. Finally, is the light in the room similar? Just throwing out a few things…

Todd’s response makes good sense. I was wondering whether a cheap ($120 US) graphics card might end up with some pixel chaos due to floating-point imprecision.

It could be an issue, but going through both the display panel settings and the Nvidia settings just to see what is set to what might help. It could be that the Nvidia card isn’t set to enhance the image and is simply good, while the cheaper one can enhance the image but isn’t activated to do so. Or you may just have a mismatch in the display settings for contrast or sharpness, or you are using one of the monitor’s modes (cinema or whatever) on one but not on the other. Unless you have them both calibrated, all you can do is check for settings mismatches between the setups; if everything seems equal, then maybe the card really doesn’t provide as good an image.

Why not take out one graphics card and take only that with you to the other place? Then you can try both cards in the same machine and see if there is a difference.

Maybe each monitor has different sharpness? Which one has been used the most?

And there are settings for sharpness and other things in the monitor itself as well, so for issues like this you would have to confirm what settings are in play on each monitor…
