Matrix multiplication

Could you give an explicit example of the problem you are talking about, perhaps with screenshots? You mentioned RawTherapee a couple of different times - is there something RawTherapee is doing with sRGB images with and without embedded sRGB profiles, that somehow seems unexpected?

Spot on!

Yes, that is very helpful, and is exactly in agreement with several examples I have.

I am using the RT code as a reference for what I am trying to do. Ideally I would like to incorporate my work into RT at some point!

Specifically, I am trying to create a matrix that will take color values that I have computed from density readings of a film scan, which should be in a particular colorspace. Let's call that colorspace RA4 space (i.e. the colorspace that is created by using the CMY dyes in an RA4-processed color print).

I then want to transform that data to sRGB correctly. At least that is what I want to do initially…

That's why I want to understand how to do it properly! Once I have got the method correct I may do things slightly differently. But my goal initially is to do it correctly.

I can make up some examples using generated solid colors, and compare the results that I get from RT.

I'm totally confused as to how your inquiries regarding viewing conditions are related to "Perhaps the other reason . . . " as quoted above.

There are many web browsers that fail this test of "sRGB image looks the same with and without an embedded sRGB ICC profile" - are you talking about something you see in a web browser?

Perhaps a step back is in order.

What I am trying to do should be straightforward, in that I want to use the exact same methods that are used in digital photography, and to some extent even more so those used in RT (which I hope are the same, etc…). The only difference is that my "camera" to XYZ values are unique.

This involves a lot of trial and error, trying different things, and each time I try something I like to understand the process, the values used etc.

I understand this happens because the software simply ignores the values. But in the case of software that processes ICC profiles correctly, what should happen?

Perhaps I can ask the question in a different way. Let's take two TIFF files: File A has the correct sRGB profile attached, with D50, and File B has the exact same data inside but no profile.

When the same files are displayed in RT, should they appear the same or different? Or, what color values do the internals of RT think they have?

Hmm, I thought I already answered this question above :slight_smile: . The files should look the same assuming the software assigns sRGB to images without embedded ICC profiles, which I'm sure RT does, leastways I can't see any change in how a sample image looks before and after assigning an sRGB profile to an image with no embedded ICC profile. Do you see a visual difference?

I'm not sure what you mean by "ignores" but the problem is that those web browsers are flawed :slight_smile: . They should all by default assign sRGB to images without embedded ICC profiles, and then convert to the monitor profile.

No, hence my point that when no profile is applied RT assumes it is sRGB at D50, and I assume so do many other applications. If it assumed the file without the profile was D65, it should apply an adjustment, and the result should look different from the one with the D50 profile?

Putting aside applications that don't do things correctly, I am looking to understand what the correct behaviour should be. Does that make my question clear?

If a V2 or V4 ICC profile color-managed editing application assumed an image without an embedded ICC profile was somehow "D65" and then tried to apply an adjustment to somehow compensate for the difference between D65 and D50, it would be a very confused ICC profile color-managed editing application.

If I were using software that behaved in this fashion, I'd file a bug report.

Very sorry @LaurenceLumi, the confusion was all mine. @agriggio explains it perfectly. Of course it's not A * B * C, because the input pixel is a vector, not a 3x3 matrix, and therefore needs to be on the right-hand side of each multiply! Incidentally, gmic has the mix_channels command for that. I think I'll keep quiet now :slight_smile:
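
To make the order concrete, here's a minimal numpy sketch (the matrices are just placeholder values, not real profile data):

```python
import numpy as np

# Placeholder 3x3 matrices standing in for the real conversion steps
# (e.g. camera-to-XYZ, chromatic adaptation, XYZ-to-working-RGB).
A = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
B = np.diag([0.9, 1.0, 1.1])
C = np.array([[0.5, 0.3, 0.2], [0.2, 0.7, 0.1], [0.1, 0.1, 0.8]])

pixel = np.array([0.2, 0.5, 0.8])   # one RGB pixel as a column vector

# The pixel sits on the right-hand side of every multiply, so the combined
# matrix is C @ B @ A, applied once per pixel.
combined = C @ B @ A
assert np.allclose(combined @ pixel, C @ (B @ (A @ pixel)))
```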


I can't help but think that I'm somehow missing the question that you are actually asking. An sRGB ICC profile already has the "D65" source white point incorporated into the profile by means of the chromatic adaptation that was used to make the sRGB ICC profile from the sRGB color space specs. It doesn't have to be added again, in the context of using ICC profile color management.

I tried an experiment once, modifying LCMS to use D65 as the illuminant, instead of D50. I installed the modified LCMS in /usr (this is on Linux) so all ICC profile applications used this modified LCMS. And I made a set of ICC profiles that used D65 as the illuminant. With this modified LCMS, when making ICC profiles from color space specs, D65 color spaces didn't need to be chromatically adapted to D50. But D50 color spaces such as ProPhotoRGB did need to be chromatically adapted to D65. And of course "E", D60, etc. color spaces still needed to be chromatically adapted, but to D65 instead of D50.

When I tried editing using this modified LCMS and the modified profiles with my editing software such as GIMP, all the colors looked exactly the same. Exactly the same. The only way I could get "different colors" was to:

  • Use my "D50 illuminant" ICC profiles with my modified-to-use-D65 version of LCMS
  • Or else use my "D65" version of LCMS with my normal D50-adapted ICC profiles: Edit: What I meant to say, should have said, was "Use the D65-illuminant profiles with normal non-modified D50-illuminant LCMS".

Sometimes you might run across an incorrectly-made sRGB ICC profile where the chromatic adaptation from D65 to D50 wasn't done. Using such a profile makes images look blue, such as the image below on the right (the colors on the left are correct):

Back in the days of V2 workflows, you could "get different colors" - either blue or yellow or even other colors depending on your monitor's actual white point - by using Absolute colorimetric rendering intent to the monitor.

You could also get different colors when converting from one ICC RGB working space to another ICC RGB working space with a different white point, but you had to specifically ask for Absolute colorimetric intent - all the editing software I've ever seen defaults to Perceptual or Relative, so nobody was likely to do this accidentally.

For example, you might convert from sRGB to BetaRGB (which has a D50 white point) or vice versa, using Absolute colorimetric intent, resulting in images such as are shown below. Notice the image on the right is "too yellow" and the image on the left is "too blue":

But the ICC decided this sort of color change when using Absolute colorimetric was confusing to users.

So for V4 workflows, when the source and destination color spaces are both "monitor class" profiles (all the standard RGB working spaces we use in the digital darkroom are "monitor class" profiles), when you ask for Absolute colorimetric rendering intent, what you get is Relative. Which makes it decidedly more difficult to write tutorials that encourage users to experiment and thus learn for themselves first-hand the difference between relative and absolute colorimetric intents :frowning:

The images above come from my article on "Will the real sRGB profile please stand up", which was written when I actually had access to V2 editing software: https://ninedegreesbelow.com/photography/srgb-profile-comparison.html


Specifically, if I am creating data to store in a file that does not have an ICC profile attached, what parameters should I use? And what parameters should I use if I use an ICC profile with the correct primaries, D50 white point, etc.? Initially I thought I should use the sRGB primaries and D65 for the former and D50 for the latter.

But that does not seem to fit what the software I am using as an example does. It seems, rightly or wrongly, that if you want the software to work as expected, you need to create the data in the former case (the file without the ICC profile) with a D50 white point, as the software will NOT make any chromatic adaptation.

Hmm, well, the only answer anyone will ever be able to give you is that for V2 and V4 ICC profile applications, use the D50 adapted matrix for sRGB. Whether the profile is actually embedded in the image or not is irrelevant. I've tried to give reasons why several times in this thread, but as I said, I'm not hitting the area that answers your questions, and at this point I somewhat doubt my ability to do so :slight_smile: .

If you don't want to use a D50-adapted matrix, wait until someone adds iccMAX support to an ICC profile color managed editing application, in which scenario I don't have much of a clue what will happen or be possible. But don't try to mix whatever you do using iccMAX applications with what you do using V2/V4 applications.

Or else don't use ICC profile color management at all, and instead use OCIO color management, which requires using OCIO LUTs to get from whatever color space the image is in, to whatever color space you want the image to be in. But I'm not the person to advise you on the specifics of OCIO, if that's the direction you want to go. If you do a search on the forum, there are already some threads on the topic.

Here is a thought: Go ahead and try whatever it is that you think should be done, as you generate the matrices for whatever application you have in mind. And if it works, great! Experimenting with doing whatever you think should work is a great way to learn what does work. In general trying stuff and seeing what happens, and then figuring out why really is a nice way to learn stuff.

Bear-of-little-brain here, methinks that 1) if you're going to put primaries and whitepoint information in an image file, it should represent the color gamut to which the data was last converted, and 2) if you're not going to put that information, you need to ensure the color gamut of the data can be used as-is by whatever media the data is intended for.

You can combine #1 and #2 for the largely unmanaged wild of the web by converting and storing sRGB/D50 (D50 mainly because the ICC tells people that's their notion of reference whitepoint) and pray someone's not going to regard your image on a cheap projector, ask me how I know…

I think the primary consideration is to ensure the metadata properly represents the image characteristics, and in its absence you need to have particular media in mind.

I think what you're missing from Elle's responses is that there are multiple 'white points' that are used in different ways, at different stages within the calculations used to generate the matrix used to convert between colorspaces. Specifically, the keyword you should look at more closely is 'adapted'.

Disclaimer: I'm not an expert on the standards, I've just struggled with the math and figured this out after reading way too much documentation that was way too vague. I might still be misunderstanding a lot of this, so I would honestly like some feedback from experts like Elle.

So, consider for a moment that we treat 'white' as [1, 1, 1] no matter what RGB colorspace we're in. This doesn't specify a whitepoint per se - no, we specify a white point in terms of the XYZ colorspace. For example, while sRGB's white point has xy coordinates [0.3127, 0.3290], that still is just saying that the exact 'color' for [1, 1, 1] (or 'white') can be measured externally as having those xy coordinates.

ICC profiles use what's called a 'Profile Connection Space' (PCS). What this is will vary, but most of the time it's either XYZ or L*a*b* - and for ICC profiles (I guess versions 2 and 4), the white point that they use for the PCS isn't E, but instead D50 - which is roughly equal to XYZ values [0.964, 1.000, 0.825]. This means that, to stay consistent, we have to transform whatever 'white' is to XYZ values such that 'pure white' is [0.964, 1.000, 0.825], rather than [1, 1, 1] (or, if we were using D65, roughly [0.950, 1.000, 1.089]).
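
In case it helps, here's a tiny sketch of how those xy coordinates relate to the XYZ values quoted above (the D50 chromaticities used here are the usual rounded values; treat the output as approximate):

```python
import numpy as np

def xy_to_XYZ(x, y, Y=1.0):
    """xy chromaticity coordinates to an XYZ vector, with luminance Y."""
    return np.array([x / y * Y, Y, (1.0 - x - y) / y * Y])

print(xy_to_XYZ(0.3127, 0.3290))  # D65 -> roughly [0.950, 1.000, 1.089]
print(xy_to_XYZ(0.3457, 0.3585))  # D50 -> roughly [0.964, 1.000, 0.825]
```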

However, because of how human eyes work, you can't just rescale XYZ values directly to convert between white points. Instead, you have to convert XYZ values into LMS (the native colorspace for the human eye), rescale those values, then convert back into XYZ.

There is some debate about what the best matrix to use is for converting between XYZ and LMS, and it often depends on your use case, needs, and specific setup. However, the most common when dealing with ICC profiles is the original 'Bradford' color transformation matrix. I specify 'original' because apparently there are two versions, and ICC profiles explicitly use the original one.
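
To make that 'XYZ to LMS, rescale, back to XYZ' step concrete, here's a small sketch of building a Bradford adaptation matrix from D65 to D50 (the Bradford coefficients are the commonly published ones - verify them against the spec before relying on this):

```python
import numpy as np

# Original Bradford XYZ -> LMS ("cone response") matrix, as commonly published.
BRADFORD = np.array([
    [ 0.8951,  0.2664, -0.1614],
    [-0.7502,  1.7135,  0.0367],
    [ 0.0389, -0.0685,  1.0296],
])

def bradford_adapt(white_src, white_dst):
    """Matrix that adapts XYZ values from the source white to the destination white."""
    lms_src = BRADFORD @ white_src
    lms_dst = BRADFORD @ white_dst
    return np.linalg.inv(BRADFORD) @ np.diag(lms_dst / lms_src) @ BRADFORD

D65 = np.array([0.9505, 1.0, 1.0891])
D50 = np.array([0.9642, 1.0, 0.8251])

M = bradford_adapt(D65, D50)
print(M @ D65)   # lands (approximately) on D50
```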

So, here's an overview of how this looks:
Linear sRGB → XYZ → LMS → D50/D65 → LMS → XYZ (PCS)

And going to another RGB space (for this example, to be displayed on a monitor with a D75 white point):
XYZ (PCS) → LMS → D75/D50 → LMS → XYZ → RGB

It's important to note that in both RGB colorspaces (both sRGB and the monitor's colorspace), the RGB value for 'white' remains [1, 1, 1]. If the picture is a photo of a white piece of paper with a drawing on it, any part that shows the paper will have the same RGB value in both RGB colorspaces (assuming that it's perfectly encoded as white and not slightly off-color, nor darkened to a light gray).

That's why one of Elle's comments carefully noted that the ICC specs assume that your eyes are 100% adapted to the white point of your display - because they're designed to make sure that the display's white point is always used for the actual value of white.

Now, for the math:

  1. orig = Original RGB value.
  2. final = Final resulting RGB value.
  3. toXyz = RGB to XYZ matrix for the initial (or 'source') RGB colorspace. Uses whatever that colorspace's actual white point is, such as D65.
  4. toRgb = XYZ to RGB matrix for the final (or 'destination') RGB colorspace. Uses whatever that colorspace's actual white point is, such as D75.
  5. whiteSource = Source RGB colorspace's white point (as an XYZ vector).
  6. whiteDest = Destination RGB colorspace's white point (as an XYZ vector).
  7. toLms = XYZ to LMS matrix, such as the Bradford or Hunt matrices.
  8. diag() = Function to turn a 3-element vector into a diagonal matrix.

final = toRgb * (toLms^-1) * diag((toLms*whiteDest)/(toLms*D50)) * toLms *
(toLms^-1) * diag((toLms*D50)/(toLms*whiteSource)) * toLms * toXyz * orig

I noticed that the built-in editor had decided to line-break right at the point where colors would be in the PCS (at the time I hadn't put spaces around the asterisks), so I decided to put an actual line break in there. I put the spaces around most of the asterisks to help show where each colorspace conversion takes place. Decided not to with the ones inside 'diag()', to better group those together as a single 'conversion'.
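
If anyone wants to play with it, here's a minimal numpy sketch of that formula - toXyz/toRgb and the white points are whatever your actual source and destination colorspaces are; the values below are just the well-known rounded sRGB (D65) numbers used as a sanity check:

```python
import numpy as np

BRADFORD = np.array([
    [ 0.8951,  0.2664, -0.1614],
    [-0.7502,  1.7135,  0.0367],
    [ 0.0389, -0.0685,  1.0296],
])

def convert(orig, toXyz, toRgb, whiteSource, whiteDest, D50):
    toLms = BRADFORD
    toLmsInv = np.linalg.inv(toLms)
    # source white -> PCS (D50) adaptation
    src_to_pcs = toLmsInv @ np.diag((toLms @ D50) / (toLms @ whiteSource)) @ toLms
    # PCS (D50) -> destination white adaptation
    pcs_to_dest = toLmsInv @ np.diag((toLms @ whiteDest) / (toLms @ D50)) @ toLms
    return toRgb @ pcs_to_dest @ src_to_pcs @ toXyz @ orig

# Sanity check: same colorspace in and out should give back (roughly) the input.
sRGB_to_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])   # linear sRGB, D65 white
D65 = np.array([0.9505, 1.0, 1.0891])
D50 = np.array([0.9642, 1.0, 0.8251])

pixel = np.array([0.25, 0.5, 0.75])
print(convert(pixel, sRGB_to_XYZ, np.linalg.inv(sRGB_to_XYZ), D65, D65, D50))
```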

Hope this helps! I actually found this thread while googling for how to do matrix multiplication in gmic, and saw what looked like a very recent thread from someone going through some of the same issues I did.


Now, the reason I had gotten so confused while learning all this was because I wanted to figure this all out so that I could specifically use absolute colorimetric conversions between colorspaces; I didn't want to use white point adaptation. Specifically, I wanted to make one image look identical on several different monitors, and make that look identical to the original object in real life. I had all displays in the same room as the object, too.

But I had in my head the idea that 'white balance' was meant to help adjust colors to be more or less white, going bluish or orangeish based on color temperature. So I kept trying to use white point adaptation to do the opposite of what it was intended to do, and since none of the documentation I could find was geared toward that, it was kinda frustrating!

Had to take a step back and figure out what it did first, in the context in which it was being used - and after I figured that out it was much easier to 'undo' the whitepoint adaptation.

Except then I learned that my phone's camera was doing it all wrong and was assuming D50 was the actual white point for the display. Figuring out why certain shades of green lacked that slight tint of blue while everything else looked spot on was 'fun', alright.

… Actually it kinda was. And the whole project was just for fun anyway; can't seem to get a job, so may as well mess around with colorspaces instead!


Not really; if something is the same, then it is the same, and if something is different, then it is different!

This is not meant as criticism of Elle, who has been very helpful!

At the end of the day my question is very simple and can be summarised as: if I have some data that is in the sRGB colorspace, what are the parameters used to describe that data? And, in addition, are those parameters any different from those I would find in an sRGB ICC profile?

It seems, at least as a de facto standard, that the answer to the latter part of that question is that there is no difference. Now perhaps that was not intended, maybe it's even a mistake… but otherwise users would likely complain if they attached an sRGB ICC profile to sRGB data and the result looked different!

Elle, I think you should have pointed me to this page… :slight_smile:

What I did not get before is that the values of the primaries in an ICC profile have (or SHOULD have) been chromatically adapted from their absolute values. This was not intuitive, but I get it now.

So, in summary, sRGB data that uses the unadapted sRGB primaries plus the D65 white point should equal (or be close enough to) the same data defined using the correctly adapted primaries and the D50 white point?
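
As a quick numerical check of that (using the usual published chromaticities and Bradford coefficients; the printed matrix should come out very close to the rXYZ/gXYZ/bXYZ colorant tags of a correctly made sRGB profile):

```python
import numpy as np

def xy_to_XYZ(x, y):
    return np.array([x / y, 1.0, (1.0 - x - y) / y])

def rgb_to_xyz_matrix(primaries_xy, white_XYZ):
    """Linear RGB -> XYZ matrix from xy primaries, scaled so (1,1,1) maps to the white point."""
    P = np.column_stack([xy_to_XYZ(x, y) for x, y in primaries_xy])
    return P * np.linalg.solve(P, white_XYZ)

BRADFORD = np.array([
    [ 0.8951,  0.2664, -0.1614],
    [-0.7502,  1.7135,  0.0367],
    [ 0.0389, -0.0685,  1.0296],
])

def bradford_adapt(white_src, white_dst):
    s, d = BRADFORD @ white_src, BRADFORD @ white_dst
    return np.linalg.inv(BRADFORD) @ np.diag(d / s) @ BRADFORD

D65 = xy_to_XYZ(0.3127, 0.3290)
D50 = xy_to_XYZ(0.3457, 0.3585)
srgb_primaries = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]

m_d65 = rgb_to_xyz_matrix(srgb_primaries, D65)   # unadapted primaries, D65 white
m_d50 = bradford_adapt(D65, D50) @ m_d65         # adapted to the D50 PCS
print(np.round(m_d50, 4))
# Expect roughly:
# [[0.4361 0.3851 0.1431]
#  [0.2225 0.7169 0.0606]
#  [0.0139 0.0971 0.7139]]
```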


Hi @LaurenceLumi - hmm, well, I actually did mention that article, back in Comment 4 :slight_smile: , where I gave a link to an article that has a downloadable spreadsheet for calculating the sRGB ICC profile from the ICC specs and the sRGB color space specs. But I'm really glad you found that article helpful - it was a ton of work to write, but I learned a lot while writing it!

@Tynach - I really like all the experimenting you've been doing with various displays. That sort of stuff is the 100% best way to actually learn how ICC profile color management really works. Otherwise the constant temptation is to make a lot of unconscious assumptions, that seem inevitably to end up being not correct just when you really want something to work "as expected".

It always makes me a bit nervous when people refer to me as an expert :slight_smile: because everything I know about ICC profile color management was learned the hard way, one experiment at a time just like you are doing, followed by trying to figure out "why" results are as they are, whether by doing further testing or doing a lot of background reading or asking questions on forums and mailing lists, or whatever it takes. So whatever expertise I have is relative to the stuff I've tried to figure out. I don't have any formal university degree in color management or anything like that.

Anyway, I do have some thoughts on your descriptions and comments for your wonderful monitor experiments, but I need to clear off some other "to dos" on my list before sitting down to type something up.


I should probably mention that this is all done with GLSL code, on Shadertoy and in the Android app 'Shader Editor'. I've looked at ICC profiles and compared different ones to each other, but I've yet to write any code that actually uses any such profile.

It's one of those things I feel I should do, but a large part of what I'm doing right now is on my phone in Shader Editor (where I have to 'calibrate' the display by having a massive array containing the equivalent to a VCGT tag, multiply all color values by 255, and use that as the array index for nabbing the final value for that channel).
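
For anyone curious what that looks like outside a shader, the per-channel lookup amounts to something like this (a sketch in Python rather than GLSL, with an identity ramp standing in for the real calibration data):

```python
import numpy as np

# 256-entry per-channel LUT playing the role of a vcgt tag; here it's just an
# identity ramp, where real calibration data would go.
lut = np.linspace(0.0, 1.0, 256)

def apply_calibration(rgb):
    """Scale each 0..1 channel to 0..255 and use it as an index into the LUT."""
    idx = np.clip(np.rint(np.asarray(rgb) * 255.0), 0, 255).astype(int)
    return lut[idx]

print(apply_calibration([0.25, 0.5, 0.75]))
```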

Same, it's happened a few times with me… And here I am unemployed because nobody wants to hire someone without actual job experience. Often I'll post something that really makes sense to me and seems completely true, but I'll still get the feeling that it might only seem true to me because I "don't live in the real world." And that's often the sort of thing (though in more detail and with examples) told to me when I give opinions on topics like tab indentation, so I have a feeling they might be right.

What I more or less meant by 'expert' in my own post, however, was that you're someone who has done that testing before - and thus you have built up a fairly decently sized repository of knowledge, at least when compared to others. I hesitate to say 'professional' because I honestly don't know what your actual job consists of, but given the number of articles you've written (and both the depth and breadth of the topics in them), I think it's safe to say you're an expert - at least relatively.

I should have joined this community much sooner, but I didn't really know about it. Besides that, it was only very recently that I broke down and finally just bought myself a colorimeter, as before that I was making a lot more assumptions about the quality of my own equipment (factory-calibrated monitors are, apparently, often very badly calibrated).

Mostly so far I've just been posting test code on Shadertoy, and occasionally asking for things like 'where can I find copies of various official standards?' on Reddit… Where I didn't get any really useful leads; the subreddit I saw over there was I think /r/colorists, and 90% of the content is people saying things like, "In this program, always set these settings to those values when doing that."

So Wikipedia has still been my number one source for things like what chromaticity coordinates belong to the primaries of which standards, and I've not really had anywhere to go for asking for feedback on the actual algorithms.

As for responding later, that's no problem! I figure that's what forum-like websites are for - group conversations that could take anywhere from minutes to weeks between responses. Wouldn't want to rush you :smiling_face:

At any rate, uh… I sorta split up writing this comment, part of it in the morning and part of it in the evening. I'm not really sure where I was going with some of it or if I intended to modify/add to/remove from earlier parts, so I'm sorry if it's a little bit of a rambling mess. I'll just post it as-is for now, as I'm not sure what else to do with it.

Hi @Tynach - my apologies for taking so long to circle back around to your very thought-provoking post - :slight_smile: I bet you thought I forgot about this post, but nope, not at all!

Edit: with my usual stellar inability to speak clearly, when I tried to rewrite my initial sentence to make it more clear, I left out the critical part of the sentence above, which is that I bet you thought I forgot about this thread, not true!

So to try again, your post was very thought-provoking, and I didn't forget about it; in fact I have been mulling over the points you've made. So I just edited the original sentence above to put in the missing phrase. Sigh.

I never stopped to think about what RGB values the monitor profile might have for the color white and near-white colors - thanks for mentioning that!

I used ArgyllCMS xicclu to check several monitor profiles that I made at different times, using different algorithms, and sure enough "white" defined as Lab color (100, 0, 0) was close to or exactly (1, 1, 1) in the monitor space, using relative colorimetric intent:

xicclu -ir -pL -fif file-name-for-monitor-profile

But "how close are the channel values for white and near-white" does depend on the type of monitor profile. For my LUT monitor profiles, R, G, and B are only approximately the same for grayscale values, being very close for white and near white, and progressively farther apart as the grays get darker. On the other hand, for my profile made using "-aS", R=G=B up and down the gray axis.

I'm guessing that "how different are the channel values for white and gray" also depends on what sort of prior calibration was done using the vcgt tag, before making the monitor profile.

Your goal of making one image look identical on several different monitors, and also making the image look identical to the original object in real life, of course means that at some point you took a photograph of the real-life object (I'm really good at figuring out the obvious :slight_smile: ).

Recently I took a photograph of a painting and used RawTherapee's CIECAM02 module to make the colors on the screen match the colors in the painting.

Of course your situation - multiple monitors in the same room right along with the photographed object - might have the advantage that the entire room is evenly lit with the same color and brightness of light. In which case "what color of light" to calibrate all the monitors to might depend on the color of the ambient light. But then you'd need to consider whatever compromises might be required when calibrating any given monitor to a white point that's too far away from its native white point.

I had been thinking about your quest to make the colors look the same on all your monitors, and thinking that the CIECAM02 modules might be a way to accomplish your goal (even without first calibrating and perhaps also profiling the monitors). Making images look the same on different display devices was @jdc 's motivation for RawTherapee's CIECAM02 module existing in the first place.

@ggbutcher - the RawTherapee CIECAM02 module is something that might also work for displaying images on your projection screen, though it might mean making a solid white image (and perhaps also an 11-step grayscale wedge from L=100 down to L=0) using editing software, projecting that image onto your screen, taking a photograph of the projected image, and seeing what color of white the projected white actually is. There are sophisticated devices for measuring such things, but probably a photograph would get you "in the ballpark".
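
As a side note, the wedge values themselves are easy to generate - a small sketch using the usual CIE L* and sRGB transfer formulas (neutral grays, so R=G=B):

```python
def lab_L_to_Y(L):
    """CIE L* (0..100) to relative luminance Y (0..1) for a neutral gray."""
    f = (L + 16.0) / 116.0
    return f ** 3 if f > 6.0 / 29.0 else 3.0 * (6.0 / 29.0) ** 2 * (f - 4.0 / 29.0)

def srgb_encode(y):
    """Linear value to sRGB transfer-encoded value."""
    return 12.92 * y if y <= 0.0031308 else 1.055 * y ** (1.0 / 2.4) - 0.055

for L in range(100, -1, -10):   # 11 steps, L = 100 down to 0
    v = srgb_encode(lab_L_to_Y(L))
    print(f"L* = {L:3d}  ->  8-bit gray {round(v * 255):3d}")
```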

@gwgill - now that @Tynach has a colorimeter and can calibrate and profile his various monitors, would this colprof switch allow him to accomplish his goal of making images look the same on all the monitors using ICC profile color management? Or (as I sort of suspect) am I missing something critical in how images are displayed on different monitors?

http://argyllcms.com/doc/colprof.html#ua

For input profiles, this flag forces the effective intent to be Absolute Colorimetric even when used with Relative Colorimetric intent selection in a CMM, by setting a D50 white point tag. This also has the effect of preserving the conversion of colors whiter than the white patch of the test chart without clipping them (similar to the -u flag), but does not hue correct white. This flag can be useful when an input profile is needed for using a scanner as a "poor mans" colorimeter.

@Elle
Thanks for the compliment, I'll look at what Ciecam can or can not bring… with my (very) bad English :slight_smile:
