It’s the 4K screen. Performance scales roughly linearly with the number of pixels being rendered. Darktable on my Mac Mini M1 is about as fast as on my gaming PC with an Nvidia 3060.
I sometimes enable the color assessment mode when I need faster performance, since it zooms the image out and renders fewer pixels.
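A quick back-of-the-envelope sketch of why the screen matters so much, assuming (as above) that rendering cost is roughly proportional to pixel count; the numbers are just standard display resolutions, not measurements from my machine:

```python
# Pixel counts for common display resolutions.
# Assumption: darkroom rendering cost grows roughly linearly with pixel count.
resolutions = {
    "1080p (1920x1080)": 1920 * 1080,
    "1440p (2560x1440)": 2560 * 1440,
    "4K    (3840x2160)": 3840 * 2160,
}

baseline = resolutions["1080p (1920x1080)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.1f} MP, ~{pixels / baseline:.1f}x the pixels of 1080p")
```

A 4K viewport is about four times the pixels of a 1080p one, which is also why zooming out (or the color assessment view) gives such a noticeable speedup.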
Today, I finished the editing session for my vacation. All told, some 500 photos from three cameras: a Pixel 6a phone, a Ricoh GR III, and a Fuji X-T5. My goal was to have a fairly consistent look across the three cameras. Additionally, the X-T5 got a firmware update during the vacation, adding the “Reala Ace” film simulation, which I like very much and wanted to emulate in darktable.
Overall, this worked very well. For the first time, I set my in-camera white balance to daylight on this vacation, as I prefer my evenings orange and my mornings blue rather than equalizing them all to neutral grey. This helped during editing, since I only needed to color-match the cameras’ white balance once. Except for the phone, of course, which does not have adjustable white balance.
Exporting took a LONG time, probably on account of my export sharpening style. Not that I mind; the computer can do it in my absence. I was somewhat disappointed to find the exported images were unusually noisy. I guess I didn’t check noise carefully enough during editing. Perhaps I’ll need to adjust my auto-presets to enable noise reduction at a lower ISO. The phone pictures especially were a bit rough here and there. I didn’t re-export though, as it won’t be visible anywhere but my big 27-inch 4K desktop screen.
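Just to make the auto-preset idea concrete, here is a tiny hypothetical sketch of the kind of per-camera ISO rule I have in mind; the threshold values are made up for illustration and are not my actual darktable presets:

```python
# Hypothetical ISO thresholds per camera above which a denoise preset
# would be auto-applied. Values are illustrative only.
DENOISE_ISO_THRESHOLD = {
    "Pixel 6a": 400,      # small phone sensor: denoise early
    "Ricoh GR III": 1600,
    "Fuji X-T5": 3200,
}

def should_denoise(camera: str, iso: int) -> bool:
    """Return True if the denoise preset should kick in for this shot."""
    return iso >= DENOISE_ISO_THRESHOLD.get(camera, 800)

print(should_denoise("Pixel 6a", 640))   # True: phone shots get denoised sooner
print(should_denoise("Fuji X-T5", 640))  # False: the larger sensor copes fine
```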
So I got to test a more powerful Mac today, a Mac Studio with an M2 Max. On average, it is roughly twice as fast as my M1 Mac Mini, which is a tad disappointing, as it looks about three times faster on paper.
Additionally, I have identified another performance problem with my setup: I’m running my 4K screens at a scaled resolution setting, which apparently makes macOS render everything at closer to 5K internally before downscaling to the panel.
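To put a number on that, here is the same pixel arithmetic for a roughly 5K internal render versus native 4K output (5120x2880 is the typical internal size for a scaled 4K mode, assumed here rather than measured, and again assuming cost scales with pixel count):

```python
# Compare a ~5K internal render against native 4K output.
# Assumption: rendering cost scales roughly linearly with pixel count.
native_4k = 3840 * 2160   # what the panel actually shows
scaled_5k = 5120 * 2880   # approximate internal render size of a scaled mode

print(f"native 4K:  {native_4k / 1e6:.1f} MP")
print(f"scaled ~5K: {scaled_5k / 1e6:.1f} MP")
print(f"overhead:   ~{(scaled_5k / native_4k - 1) * 100:.0f}% more pixels to render")
```

That works out to roughly 78% more pixels than the panel’s native resolution, which would explain a good chunk of the sluggishness.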