How to adjust pixel values linearly in a linear 16-bit grayscale TIFF in GIMP?

Hi, I have several linear 16-bit grayscale TIFF images. With GIMP 2.10.12, how can I adjust input levels so that pixel values are changed linearly? Suppose that in the original TIFF, the pixel value of pixel A is exactly twice that of pixel B. After the level adjustment, the value of pixel A should still be exactly twice that of pixel B. Only the measured values matter, not the perception.
Also, my original TIFFs have no color (profile) information in the metadata. Do I have to embed a Gray ICC profile with a linear TRC (“gamma = 1.0”) before opening them in GIMP?

@DerAndere Welcome to the forum! Please check out the GIMP manual. In general, you need to figure out whether the tool is linear or not and whether you can toggle between the two. For colour profile, go to Image → Color Management → Assign Color Profile… Also take a look at Mode and Precision.

(For Precision, I am a little confused about it myself, as it starts at perceptual by default. I am unsure whether any conversion takes place if I change it to linear. I am sure that someone could help clarify.)

Thanks for the quick response. In fact, I would like to clarify the manual entry for the Levels tool once I am sure how to do it correctly. The Color → Levels tool does indeed have an option for performing the operation in linear light. My confusion stems from the fact that under “Advanced color options” the Levels tool has the undocumented setting “assume pixels are in built-in sRGB” (default) versus “convert to built-in sRGB”. Neither of those options is what I need. I assume that I have to do the following. Can someone verify?:

  1. Open the image in GIMP.
  2. Select “Image->Precision”. Select 32 bit floating point. Set channel encoding to “linear light” (this selection will be ignored by the levels operation)
  3. Image->Color Management-> Assign Profile. Assign the correct Gray ICC profile with linear TRC.
  4. Select “Color->Levels…”. Select “Adjust levels in linear light”. Show “Advanced color options” and select “convert to built-in sRGB”. Change value of “High input” in the levels tool. Select “OK”.
  5. Image->Color Management-> Convert to Profile. Select Gray ICC profile with linear TRC.
  6. Image->Precision. Select “16 bit unsigned integer”. Keep channel encoding in “linear light”.
  7. File-> Export as… . Select “TIFF”. Select “save color profile”.

Levels is different in 2.10.10, which is what I have installed; it doesn’t have those advanced colour options.

Does curves have these settings as well? (I have no inkling of what high input is.) My guess is that levels has not yet fully transitioned to linear, which is the reason for them.

After applying levels, read the window title to see whether the bit depth, linearity and colour space have changed. As well, check your values using sample points. Let me know if you don’t know how to set them up.

With “convert to built-in sRGB before applying the filter” enabled, the final image is still linear Gray. It looks as if the Levels/Curves tool internally converts to the sRGB color space, applies the level adjustment and then converts back to the color space of the profile initially assigned to the image. I guess this is because the Levels/Curves tool still only works in the hard-coded sRGB working space (either with linear TRC or with sRGB TRC). I will have to use GIMP-CCE / Krita, or wait for GIMP 3.0, to get full support for non-sRGB working spaces like Gray without back-and-forth conversion of color space.

If the sRGB profile has a gamma 1.0 TRC, the image data should still be linear. If not, the data has been tone-curved to that, and the display profile is making it look the same before putting it to the screen.

If it were me, I’d be getting that sRGB profile out of the way… It’s of no help color-wise, and any “non-identity” TRC is messing with your linear data…

Hello @ggbutcher,

With the Levels/Curves tool of GIMP 2.10.12 I have to choose one of the two options: “assume built-in sRGB” (default, bad) or “convert to built-in sRGB before applying the filter” (gives good results but seems to be a workaround). I never actively converted to an sRGB profile with sRGB TRC via “Precision” or “Color management → Convert to profile”: in “Precision”, I select “linear light”, and in “Image → Color management” I select “Gray built-in with linear TRC”. From your comment I deduce that this is the correct choice. Thx.


The only sRGB profile I’ve ever seen with a gamma 1.0 TRC was from Elle Stone’s collection. “Standard” sRGB profiles have the sRGB TRC, which will definitely un-linearize your image data.


Good morning,

It would help if you described what you are aiming at in the end (besides keeping the pixel value ratios as they are). First of all, I would disable colour management, since the input image does not have a profile.

From a mathematical point of view, as far as I can see, you must not change the shadow point (i.e. the lowest level); doing so would destroy the pixel value relationships. You may adjust the highlight level, which preserves the pixel ratios. And of course, you need a linear tone curve.
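This can be illustrated with a toy calculation (values made up; `apply_levels` is just the levels mapping with gamma 1.0):

```python
# Why the shadow point must stay at zero: with a non-zero "low" input,
# the mapping becomes affine, out = (in - low) / (high - low), and an
# affine shift changes ratios. Floats on a 0..1 scale for simplicity.

def apply_levels(value, low, high):
    return (value - low) / (high - low)

a, b = 0.8, 0.4              # ratio is exactly 2.0

# Highlight-only adjustment: pure scaling, ratio preserved
ratio_high = apply_levels(a, 0.0, 0.9) / apply_levels(b, 0.0, 0.9)

# Shadow adjustment: affine shift, ratio changes
ratio_low = apply_levels(a, 0.1, 1.0) / apply_levels(b, 0.1, 1.0)

print(ratio_high, ratio_low)   # 2.0 versus roughly 2.33
```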


PS: If your goal can be reached by pixel arithmetic, I would use ImageJ.

@Jossie: My final goal: I want to make levels adjustments that make features in the image more visible while keeping pixel ratios, so that others can still analyze pixel values for signal quantification in the final image. I guess what I want to achieve amounts to multiplying all pixel values by a constant. Indeed, so far I have always used ImageJ on raw images for analysis. Thanks for your thoughts.

Until a couple of weeks ago, I’d say exposure compensation would be the appropriate tool, as it specifically does a multiplication on the data. However, various software packages seem to want to protect highlights from wanton clipping, so some “exposure” tools put what I’ll call a “forehead” curve (as opposed to the “toe” of filmic and such… :smiley: ) on the top of the linear exposure curve. RT and dt may keep it clean; I haven’t heard specifically in the recent discourse.

G’MIC might be your friend here:

gmic goesinta.tif mul 2 -o comesouta.tif

If your goal is analysis, then GIMP isn’t the right tool. Moreover, it is undergoing heavy development and has some bugs (a TIFF colour-profile bug was fixed in 2.10.12).

You need technically robust packages. ImageJ and G’MIC were mentioned. There are also ImageMagick and Octave, among others. I am most familiar with G’MIC, since playing with it is my pastime. If you need any assistance or have any questions concerning it, let me know.

I would do it with ImageMagick:

magick in.tiff -evaluate Divide %[fx:maxima] out.tiff

This divides every pixel value (on a scale of 0.0 to 1.0) by the maximum value.
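For illustration, here is the same normalisation written out in plain Python (made-up sample values):

```python
# Equivalent of the ImageMagick normalisation in plain Python: divide
# every sample by the image maximum so the brightest pixel becomes 1.0.
# A pure division leaves all pixel ratios unchanged.
pixels = [100, 2500, 5000, 10000]   # hypothetical 16-bit samples
peak = max(pixels)
normalized = [p / peak for p in pixels]
print(normalized)   # [0.01, 0.25, 0.5, 1.0]
```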

I expect the same can be done in G’MIC, and other tools.

So why do you want to use GIMP now? Multiplying pixels by a constant is so easy in ImageJ. If you want to change the appearance of the image, the only operation left, to my understanding, is adjusting the contrast by normalizing the image with the ratio 65535/(max in image) for 16-bit data, as suggested by Alan above. All other operations will, as far as I can see, not give the result you desire (i.e. keep pixel ratios constant).



Thanks everyone, especially for pointing out those powerful command line tools. With ImageJ I was missing a way to specify whether input data is linear or sRGB. I was curious about GIMP because I would like to document the pitfalls of using it for scientific purposes. Once the upcoming GIMP 3.x has support for non-sRGB working spaces, I think a recommended workflow for such cases should be documented, too. Note that my images have no color (profile) information in the metadata despite being linear grayscale. For ImageMagick, I guess this means that first I have to do

magick myimage.tif -set colorspace LinearGray myimageLinearGraySet.tif

Best, DerAndere


Why do you need a colour space, if the input data don’t have a profile assigned? I would be worried that assigning a colour space would ruin your requirement of constant pixel ratios.

I just did a little test. An artificial image created by imageJ without any profile was opened in Photoshop and converted (not only assigned!) into sRGB colour space. Then I calculated the ratio of both images in imageJ. Here is the result together with the histogram:



According to the ImageMagick documentation,

Most image processing algorithms assume a linear colorspace, therefore it might be prudent to convert to linear color or remove the gamma function before certain image processing algorithms are applied.

The -set colorspace command only declares the color space (sets the metadata) and does not change pixel values. This is different from
convert myimage.tif -colorspace LinearGray myimageLinearGrayConverted.tif
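To make the distinction concrete: a conversion (as opposed to a `-set`) runs each sample through the sRGB transfer function, so the stored numbers change. A sketch of that function for a single gray sample, following IEC 61966-2-1:

```python
# The value-changing half of the pair: converting gray samples from
# linear to sRGB encoding applies the sRGB transfer function. A -set,
# by contrast, leaves the numbers untouched and only changes metadata.

def linear_to_srgb(x):
    """sRGB encoding for a linear sample in [0, 1] (IEC 61966-2-1)."""
    if x <= 0.0031308:
        return 12.92 * x
    return 1.055 * x ** (1 / 2.4) - 0.055

print(round(linear_to_srgb(0.5), 4))   # ~0.7354: the pixel value changes
```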

Yes, I understand the difference between assign and convert. And yes, the gamma correction is a further issue. So if you read out pixel values in GIMP with sRGB assigned, I expect that the values displayed will not be the raw values in the data but the values with the profile applied (otherwise the profile would not make sense; NB: I am not a GIMP user, so this is just my expectation). Would this be useful to you?

For quantitative work, ImageJ is the correct tool, as pointed out above.



Thanks Hermann-Josef. I think your point is valid. For pixel arithmetic in ImageMagick, the `-set colorspace` may not be required (although for other operations it is needed). ImageJ will then stay my trusted tool no. 1 for this kind of work.

You can if you want, but you don’t need to. If you multiply all pixel values by a constant, then ratios between any values will be unchanged (provided no values have been clipped). This is true whatever the colorspace is.
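A quick illustration of the clipping caveat, with toy 16-bit values:

```python
# Multiplication preserves ratios in any colorspace, but clipping does not.
MAX16 = 65535

def scale(value, k):
    return min(round(value * k), MAX16)   # clip at the 16-bit ceiling

a, b = 60000, 30000                        # ratio exactly 2.0
print(scale(a, 1.0) / scale(b, 1.0))       # no clipping: ratio stays 2.0
print(scale(a, 1.2) / scale(b, 1.2))       # A clips at 65535: ratio < 2
```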