RapidRAW adds AgX Color Management, inspired by Blender and darktable

Hey everyone,

Just wanted to share a quick update for those interested in new RAW processing tools. The latest release of RapidRAW, my cross-platform RAW editor, now includes the AgX tone mapper.

Following in the footsteps of open source projects like Blender and darktable, AgX takes a more advanced approach to tone mapping. It’s designed to produce more natural, filmic results by handling extreme brightness and color saturation more gracefully, which helps avoid the harsh clipping and the overly saturated “digital” or “plastic” look that can appear when pushing highlights.

Here’s a quick comparison showing how it handles a +3 EV push compared to the previous tone mapper:

Base Image

Base Tonemapper (+3 EV)

AgX (+3 EV)

To better support a scene-referred workflow with AgX, the exposure controls have also been completely reworked. The old single slider is now split into two distinct tools:

  • Exposure: A linear exposure compensation for setting the technical white point.
  • Brightness: A perceptual adjustment that primarily targets midtones, great for fine-tuning the look after the base exposure is set (a rough sketch of the difference is below).
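To make the split concrete, here is a minimal sketch of the two controls on scene-referred linear RGB. The function names and the power-curve formula for Brightness are illustrative assumptions, not the exact implementation:

    import numpy as np

    def exposure(rgb, ev):
        """Linear exposure compensation: scale scene-referred RGB by 2^EV."""
        return rgb * (2.0 ** ev)

    def brightness(rgb, amount):
        """Illustrative perceptual brightness: a power curve that pins black (0.0)
        and the nominal white point (1.0) and moves the midtones the most.
        amount > 0 brightens, amount < 0 darkens."""
        g = 2.0 ** (-amount)
        return np.power(np.maximum(rgb, 0.0), g)

    # Set the technical white point first, then fine-tune the look:
    pixel = np.array([0.02, 0.18, 0.90])   # toy scene-referred values
    pixel = exposure(pixel, ev=1.0)        # +1 EV doubles every channel
    pixel = brightness(pixel, amount=0.3)  # lifts midtones, leaves 0.0 and 1.0 alone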

The release also includes a few other features, like a selective Copy & Paste system for adjustments and more control over the processing engine.

You can get the latest release and see the full changelog over on GitHub:

Have a nice day!

16 Likes

Why not ACES?

Based on the Blender examples, it seems even better.

It seems that ACES is not widely adopted for still-image editing, and where it is available, the word “cumbersome” comes up often. But I wonder if ACES is something that will appear in still-image editing more universally in the future. Thanks for the link to the video about ACES in Blender.

The same could be said about AgX, no?

Wasn’t that originally developed for Blender?

I am not implying that ACES will not soon make its way into still photography editing. I am just saying it has not yet been widely adopted. I would certainly like to see it as an option in DT one day as I am sure there will be certain images that would benefit from this approach. :+1:

Actually, Krita has an ACEScg profile (which you can pick in the new-file dialog). GIMP does not yet.

If Krita is used to produce content for incorporation in an ACES workflow (CGI, animation, etc), that’s a prudent inclusion.

I don’t see an ACES use case for “stand-alone” still photography. ACES has specific destination colorspaces corresponding to movie-theater equipment and the like, while still photography has to deal with the plethora of monitors and printers out in “the wild”. However, photographers producing still images for inclusion in cinema workflows will need the corresponding ACES profiles available for export, where projects use ACES.

Hey @darix

Basically (from my understanding) ACES is a full color management system that handles everything from how colors are captured to how they’re shown on different screens. It’s pretty complex and aimed mostly at film, 3D and video workflows.

AgX is just a tone mapper: it focuses on keeping the image looking natural when bright areas get pushed, without harsh clipping or colors going crazy.

I added AgX because many users asked for it, and it fits well with how RapidRAW works right now. ACES is definitely interesting, but it’s a bigger project to bring in.

Thanks

2 Likes

If you have the ability to use ICC-style profiles to export renditions, that would be the trivial “implementation” of ACES. All one would need to submit stills to an ACES workflow would be an ACES2065-1 or ACEScg profile with which to export their images.

Elle Stone’s Well Behaved ICC Profiles includes both of the above:

https://github.com/ellelstone/elles_icc_profiles

1 Like

I’m not able to go in depth at the moment, but I’m really happy with how RapidRaw is looking!

AgX is working beautifully and makes it really easy to shape images to my taste.

Thanks! :+1:

3 Likes

How are you finding the speed? I see some long pauses when I make a change: the screen blurs, and then the preview is fine. Zooming and panning are lightning fast, but I feel like I see processing delays. My machine is not crazy fast, but I have a 3060 Ti, 64 GB of DDR5 RAM and a 12th-gen i5 processor, which seems adequate for DT and its complex math. Just wondering if you see that. I also tried one or two images and the noise reduction sliders didn’t seem to do anything: I cranked luma noise correction to 100, which I thought would likely blur and crush the image, but I could see little difference between 0 and 100. Just a couple of things that I saw.

@CyberTimon , the British Rail logo in your samples should be red!

6 Likes

Maybe the matrix is wrong?
@CyberTimon, where do these values come from? What’s your base space? Are these row- or column-major matrices?

OK, today I gave RapidRAW a second try. The first time, I was not able to get anywhere near what I wanted. I don’t know how long ago that first try was, but it was certainly not very long ago. So my congratulations on an impressive development speed. On my second try, the results looked much more the way I like them.

I’m probably not the best person to judge this kind of software, because I guess I’m not the target group. I love to have full control, and I love options and many different ways to achieve my goal. While RapidRAW now offers more control than before, it will probably never offer me the amount of options and control I would really like to have.

In the end it is good that it is like that. I don’t think we need another darktable, but rather a beginner-friendly yet capable RAW processor.
RapidRAW fills this niche really well.

A further aspect is that I have an x86 tablet (running Debian) on which I can’t use darktable. I have not tested it yet, but the UI of RapidRAW seems to be operable on it.

The UI concept differs in some parts from the raw processors I know so far. Some things I like, some not. But it would be too early to judge anything in this regard. All in all I find it very good and intuitive.

So I come to some points where I see improvement potential:

  • The colour choosers (for example in colour grading) are nice, but for me they lack the possibility to pick exactly the colour I want. Depending on the mouse I use, I struggle to get the colour I want. Maybe there could be an option to enter an HTML colour code or something like that to fine-tune the selected colours…

  • With some elements, I struggle to understand what they are really for, partly because of their names. For example, what does Centré mean and do in the details section? A tooltip, or a help function that shows a description when you click an element, would be helpful.

  • On some sliders, it would be useful to know in which direction I have to move them. For example, the temperature and tint sliders. At the moment, I simply have to move the sliders to see in which direction and to what extent I have to move them. An underlying palette would help.

  • I edited an old Play RAW photo and chose the Auto adjustment:


    I get pinkish highlights where they are clipped, a well-known thing from darktable and RawTherapee. But with those, I know how to get rid of the pinkish colour. Is there a way to avoid this in RapidRAW?

  • I also tested a high-ISO image. While I was overall very pleased with RapidRAW, this is a real weak spot: I couldn’t get a satisfying result with denoising.

I hope this feedback helps you to some extent. Keep up your great work!

I’m aware of that, and this is mostly how AgX works. The image was intentionally pushed several stops over to demonstrate AgX’s highlight handling. Traditional transforms often desaturate colors as they approach clipping, leading to a washed-out look. AgX, however, prioritizes a more natural, energy-preserving path to white. See Blender’s guide for more detailed sample images showing how colors change with exposure: Color Management - Blender Developer Documentation
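To illustrate the difference with toy numbers (this is not the AgX curve itself; the blend rule below is just a simple assumption to show the idea): per-channel clipping skews the hue of a pushed red towards yellow, while a path-to-white fallback desaturates it towards neutral instead.

    import numpy as np

    # A saturated red in scene-referred linear sRGB, pushed +3 EV
    pushed = np.array([1.0, 0.15, 0.05]) * 2.0 ** 3   # -> [8.0, 1.2, 0.4]

    # Naive per-channel clip: red saturates first, the other channels keep
    # rising, so the displayed colour drifts towards yellow.
    clipped = np.clip(pushed, 0.0, 1.0)                # -> [1.0, 1.0, 0.4]

    # Toy "path to white": once the brightest channel exceeds 1.0, blend the
    # whole pixel towards neutral instead of clipping channels independently.
    peak = pushed.max()
    t = np.clip((peak - 1.0) / peak, 0.0, 1.0)         # how far over range we are
    toward_white = (1.0 - t) * (pushed / peak) + t     # -> roughly [1.0, 0.89, 0.88]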

The values come from Darktable & Blender and the workflow is like this:

  1. Input: Scene-Referred Linear color with sRGB primaries.

  2. AGX_INPUT_MATRIX → Internal AGX working space.

  3. agx_tonemap function applies the curve in this special space.

  4. AGX_OUTPUT_MATRIX → Back to Scene-Referred Linear with sRGB primaries.

  5. Final gamma correction (pow(…, 2.4)) converts the result to a Display-Referred sRGB image ready for the screen.

Lastly, in WGSL, matrices are column-major.
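Here is a rough NumPy sketch of those five steps. The identity matrices stand in for the generated AGX_INPUT_MATRIX/AGX_OUTPUT_MATRIX constants, the smoothstep stands in for the real AgX sigmoid, the log range is an assumption, and the final power is written in the encoding direction (1/2.4):

    import numpy as np

    # Placeholder matrices -- in the shader these are the generated constants,
    # stored column-major in WGSL (each vec3 in the constructor is a column).
    AGX_INPUT_MATRIX = np.eye(3)    # scene-linear sRGB -> AgX working space
    AGX_OUTPUT_MATRIX = np.eye(3)   # AgX working space -> linear sRGB

    def agx_tonemap(rgb):
        """Stand-in for the AgX curve: log2 encoding followed by an S-curve."""
        log = np.log2(np.maximum(rgb, 1e-10))
        x = np.clip((log + 10.0) / 16.5, 0.0, 1.0)   # assumed ~[-10, +6.5] stop range
        return x * x * (3.0 - 2.0 * x)               # smoothstep as a toy sigmoid

    def process(rgb_linear_srgb):
        working = AGX_INPUT_MATRIX @ rgb_linear_srgb       # step 2
        curved = agx_tonemap(working)                      # step 3
        back = AGX_OUTPUT_MATRIX @ curved                  # step 4
        return np.power(np.clip(back, 0.0, 1.0), 1 / 2.4)  # step 5: encode for display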

Thanks for the feedback - I’ve always wanted a simple way to draw attention to the center of an image without fiddling with masks. So I built the “Centré Adjustment” slider. It’s like an inverted vignette, subtly boosting clarity, brightness, contrast, and vibrancy in the middle of the frame. It’s a quick way to add a little pop where it counts.
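In case it helps to picture it, this is roughly the idea of the inverted vignette: a radial weight that is 1.0 at the centre and fades towards the corners, used to blend in a locally boosted copy of the image. The falloff and the specific boosts below are illustrative assumptions, not the actual Centré code:

    import numpy as np

    def centre_weight(height, width, falloff=2.0):
        """Radial mask: 1.0 at the image centre, fading to 0.0 at the corners."""
        y, x = np.mgrid[0:height, 0:width]
        cy, cx = (height - 1) / 2.0, (width - 1) / 2.0
        r = np.hypot((y - cy) / cy, (x - cx) / cx) / np.sqrt(2.0)
        return np.clip(1.0 - r, 0.0, 1.0) ** falloff

    def centre_adjustment(img, amount=0.3):
        """Blend a slightly brightened, higher-contrast copy back into the
        image, weighted by the radial mask (img is display-referred, 0..1)."""
        w = centre_weight(*img.shape[:2])[..., None] * amount
        boosted = 0.5 + (img - 0.5) * 1.15             # a touch more contrast...
        boosted = np.clip(boosted * 1.05, 0.0, 1.0)    # ...and brightness
        return img * (1.0 - w) + boosted * w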

You can adjust the Highlights Compression Point in the app settings under the “Processing Engine” section. Lowering this value will make the recovery less aggressive and should eliminate the pink highlights.


Thank you all for the feedback! It’s very helpful.

4 Likes

I don’t think reds should turn yellow; they should turn white instead. At least this is how I understood https://www.youtube.com/watch?v=ZFGxdb2pH8g

How were the matrices derived? (I know the steps of AgX.)
The values don’t seem to match Blender; also, darktable’s resulting matrices are for Rec 2020 as base, not sRGB / Rec 709 primaries (which is what you use, based on your description).

No, the output of the tone curve is already prepared for display (it’s ‘gamma-encoded’, display-referred). In darktable, we add a power to linearise it.
Also, terminology: after the curve, it’s no longer scene-referred data, whether it’s linearised or not.
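A minimal sketch of that distinction, with the 2.4 power assumed here purely for illustration:

    import numpy as np

    # The AgX curve output is already display-encoded ("gamma-encoded").
    curve_output = np.array([0.25, 0.50, 0.75])   # toy values straight off the curve

    # If a later pipeline stage needs linear light (as darktable's does), the
    # encoding has to be undone with a power first; 2.4 is an assumption here.
    linearised = np.power(curve_output, 2.4)

    # Either way, after the curve the data is display-referred: it describes
    # what the screen should show, not the light that was in the scene.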

The Blender input matrices (from Eary_Chow - see Blender AgX in darktable (proof of concept) - #1014 by Eary_Chow):

The matrix generated using the new settings in Rec.2020:

[[ 0.85662715628877917  0.09512124540253490  0.04825159830868580]
[ 0.13731897228355167  0.76124198700908063  0.10143904070736748]
[ 0.11189820804517953  0.07679941456251757  0.81130237739230304]]

Vs. the original matrix used in the Python script and the config, which was generated from Rec.709-related parameters but got used on Rec.2020:

[[ 0.8566271533159830  0.0951212405381588  0.0482516061458583]
[ 0.1373189729298470  0.7612419906025910  0.1014390364675620]
[ 0.1118982129999500  0.0767994186031903  0.8113023683968590]]

In Eary’s OCIO config (formatted a bit):

        # Rec.2020 generated parameters rotate = [2.13976149, -1.22827335, -3.05174246], inset = [0.32965205, 0.28051336, 0.12475368]
        - !<MatrixTransform> {matrix: [
                              0.856627153315983, 0.0951212405381588, 0.0482516061458583, 0,
                              0.137318972929847, 0.761241990602591, 0.101439036467562, 0,
                              0.11189821299995, 0.0767994186031903, 0.811302368396859, 0,
                              0, 0, 0, 1]}

Those are row-major. The numbers are generally in the same range, but there are no negative entries, whereas your input matrix has one. Is that the combined sRGB → Rec 2020 → AgX rendering-space matrix?

          const AGX_INPUT_MATRIX = mat3x3<f32>(
              vec3<f32>(0.84565281, 0.18854923, -0.03420204),
              vec3<f32>(0.09162919, 0.8034334, 0.10493741),
              vec3<f32>(0.062718, 0.00801737, 0.92926463)
          );
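One way to check that numerically (a sketch, assuming the standard linear BT.709 → BT.2020 matrix; agx_inset_rec2020 is just a local name for the row-major matrix quoted from Eary’s config above):

    import numpy as np

    # Standard linear BT.709/sRGB -> BT.2020 conversion (D65), rounded values.
    srgb_to_rec2020 = np.array([
        [0.6274, 0.3293, 0.0433],
        [0.0691, 0.9195, 0.0114],
        [0.0164, 0.0880, 0.8956],
    ])

    # AgX inset matrix for Rec.2020 primaries, row-major, acting on column vectors.
    agx_inset_rec2020 = np.array([
        [0.856627153315983, 0.0951212405381588, 0.0482516061458583],
        [0.137318972929847, 0.761241990602591,  0.101439036467562],
        [0.11189821299995,  0.0767994186031903, 0.811302368396859],
    ])

    # If the shader took scene-linear sRGB straight into the AgX working space,
    # its input matrix should be close to this product: first sRGB -> Rec.2020,
    # then the inset.
    combined = agx_inset_rec2020 @ srgb_to_rec2020
    print(combined)

    # WGSL mat3x3 constructors take columns, so the shader constant would
    # correspond to the transpose of the row-major matrix printed above.
    print(combined.T)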

darktable and RapidRaw, with AgX processing on the same HDR TIFF input:


test-tif.zip (129.9 KB)
The input is Rec 2020, I don’t know if RapidRaw can read that from the image.

I force darktable to interpret the image as sRGB:

Lots of test images (including this one):

Hi everyone,

Thank you all for the valuable feedback. I especially want to thank @kofa, who provided technical information and great sample images.

To give you a little background on my “AgX” journey: after I first released RapidRAW, I started getting many requests for AgX. Back then, my knowledge of tonemapping and color spaces was quite limited, so I read up on the topic and used AI to help summarize and learn the concepts.

Because my understanding was still developing, implementing it correctly wasn’t easy. During the process, the language models I was using hallucinated the color matrices, and I didn’t verify them well enough - that’s on me :D.

However, today I spent a good amount of time re-implementing the logic to correctly calculate these matrices, similar to how Darktable does it, and the results are now much more accurate. With the correct matrices, the overexposed red from the British Rail logo now renders as a rich but realistic red, which is exactly how you would expect AgX to handle it:

AgX:

Basic:

The commit is now in the main branch and will be included in the next release in the coming days.

Thank you again for all the technical insights.

Timon

10 Likes