Batch Process Hardware

You may think about hard drives the other way around: it’s not that the process will be quicker with a fast HD, but instead that it will be slower with a slow HD.

You would only need a fast HD for your shared files, but certainly a fast drive for your OS won’t hurt either.

A bit off topic (because it is not related to batch processing), but still:
When using RawTherapee on a dual-boot Linux and Windows system, you should have a look at this page
http://rawpedia.rawtherapee.com/File_Paths#Cache
for Custom config and cache folders

Manjaro Linux is a good option.


I bought myself a new system not that long ago. These are the specs (here at pixls): Insights or advise about: AMD Ryzen 5/7 CPU's and NVIDIA GTX/RTX GPU's - #6 by Jade_NL The rationale for those components is in that thread.

2 reasons I bring this up:

  • A general one: I’m really happy with how this one performs running darktable, GIMP and, as of late, RawTherapee. Often multiple versions at once.
  • A specific one: I started out with 16GB and thought, as mentioned in that topic, that it would be enough. Turns out that it isn’t, not really. I just upgraded to 32GB and that works much better.

Just so you know: I’m on Debian (10/Buster) and run Nvidia’s proprietary drivers (450.66) on kernel 5.8.10. Without any problems whatsoever. I do use some back-ported stuff though!


though i have code that will run ~20MP images through a typical darktable-like processing pipeline in under 20ms. on a system with an okay SSD (not up to today’s standards), loading these files from disk, decoding them in rawspeed, and uploading to the GPU clearly becomes the bottleneck for batch export (which is why i interleave these things: the GPU can asynchronously upload and process while rawspeed runs as a third parallel process). so thinking ahead a couple of years, i think disk i/o may well become our bottleneck. it very easily already is for thumbnail creation.
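A minimal sketch of that interleaving idea, in Python rather than the C/Vulkan it would really be written in; `read_raw`, `decode_raw` and `gpu_process` are hypothetical stand-ins for disk I/O, rawspeed and the GPU pipeline:

```python
# Toy three-stage pipeline: disk read, decode, GPU work.
# read_raw(), decode_raw() and gpu_process() are hypothetical stand-ins.
import queue
import threading

def read_raw(path):
    with open(path, "rb") as f:          # stage 1: disk I/O
        return f.read()

def decode_raw(buf):
    return buf                           # stage 2: stand-in for the rawspeed decode

def gpu_process(img):
    pass                                 # stage 3: stand-in for async GPU upload + processing

def batch_export(paths, depth=4):
    raw_q = queue.Queue(maxsize=depth)   # bounded, so RAM use stays flat
    img_q = queue.Queue(maxsize=depth)

    def reader():
        for p in paths:
            raw_q.put(read_raw(p))
        raw_q.put(None)                  # sentinel: no more files

    def decoder():
        while (buf := raw_q.get()) is not None:
            img_q.put(decode_raw(buf))
        img_q.put(None)

    threading.Thread(target=reader, daemon=True).start()
    threading.Thread(target=decoder, daemon=True).start()

    while (img := img_q.get()) is not None:
        gpu_process(img)                 # main thread keeps the GPU busy
```

The bounded queues are the point: each stage only runs a few items ahead, so reading, decoding and GPU work overlap without memory use growing with the batch size.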

Ha! As I was writing that I had that idea in the back of my mind (“this is NOT TRUE for vkdt!”). I guess once your developments are ready for mass consumption it will be GPU = SSD >> CPU > RAM

Keep in mind that there are raw formats (Canon CR2 for example) for which only one or two cores can be used to decode them (no GPU). For these formats the CPU frequency matters as well, not only the number of CPU cores and the GPU.
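As a rough illustration of the trade-off (invented numbers, not measurements): if the decode step is stuck on one core while the rest of the pipeline scales with core count, a higher-clocked 6-core can land very close to, or ahead of, a lower-clocked 8-core:

```python
# Toy model with invented numbers: the decode step runs on a single core and
# scales with clock speed only; the rest of the pipeline scales with cores too.
def per_image_seconds(decode_s, pipeline_s, cores, ghz, ref_ghz=3.5):
    clock = ghz / ref_ghz                      # relative clock speed
    return decode_s / clock + pipeline_s / (cores * clock)

print(per_image_seconds(decode_s=2.0, pipeline_s=24.0, cores=6, ghz=4.4))  # ~4.8 s
print(per_image_seconds(decode_s=2.0, pipeline_s=24.0, cores=8, ghz=3.6))  # ~4.9 s
```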


@Soupy Do you plan to build (or at least assemble) it yourself?

Thanks! So about 18% faster for those two modules.
It’s a small sample size, but if we were to use the average of those two modules as representative of all modules, and we used 20 modules per image, then 8 cores would take ~26 seconds per image, and 6 cores ~32. If we had 1000 images, that is ~7.2 hours compared to ~8.9. So for individual images the difference between 6 and 8 cores is not that significant, but for batching large amounts it is.
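Spelling out that arithmetic (same figures as above, just multiplied out):

```python
# Multiplying out the estimates above.
images = 1000
per_image = {"8 cores": 26, "6 cores": 32}       # seconds per image

for label, seconds in per_image.items():
    print(label, round(images * seconds / 3600, 1), "hours")
# 8 cores 7.2 hours
# 6 cores 8.9 hours
```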

Is the moral of the story to put the cache folder on the shared drive available to both OSes? Or to keep it separate? Most of my RT work will be done on the Linux side, but it may be convenient to install it on both.

Is the benefit of 32 over 16 only noticeable when you have multiple programs opened at once, or is it noticeable within just a single program?

No, that is unfortunately beyond me at this stage.

Both, depending on the situation and/or program(s) used.

Here are some of my experiences:

I noticed that you seem to use the same programs that I do (or at least there is an overlap): GIMP, RawTherapee and darktable.

GIMP is rather memory hungry on its own. The average camera nowadays produces 18-24 MP shots, and if you are using just a few layers to get the result you want/need (final touches after the RAW edit) you are fine, but a larger workflow with many layers and/or layer groups will benefit greatly from more memory. I had the pleasure of working with a D810 the other day, and those extra GB are very welcome when working on 36 MP shots!
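As a back-of-the-envelope figure (assuming a 32-bit float RGBA buffer per layer, which is on the generous side; actual usage depends on GIMP’s precision setting and tile cache):

```python
# Rough per-layer memory for a 36 MP image, assuming a 32-bit float RGBA buffer
# per layer (an assumption, not a measurement of GIMP's actual tile storage).
megapixels = 36
bytes_per_pixel = 4 * 4                      # 4 channels x 4 bytes
per_layer_mib = megapixels * 1e6 * bytes_per_pixel / 2**20
print(f"{per_layer_mib:.0f} MiB per layer")  # ~549 MiB, so ~10 layers is already >5 GiB
```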

If using darktable or RawTherapee on their own, 16GB is enough. As mentioned, I often use multiple versions of one program (dev vs stable) or have multiple editors open at the same time. In that case 16GB is probably not enough.

A while back there was a topic here that dealt with recreating the style of a Russian photographer. I started my own little project and one of the steps was averaging 297 24.4MP shots. I wasn’t able to do that with HDRMerge or Hugin due to memory limits. ImageMagick’s convert was able to do it, albeit slowly.
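For a sense of scale: holding 297 × 24.4 MP frames in RAM as 32-bit float RGB would be on the order of 85 GB, which is roughly why the all-in-memory tools gave up. A streaming average only ever needs one frame plus an accumulator. A minimal sketch of that approach (imageio/numpy here purely for illustration, not the ImageMagick command that was actually used):

```python
# Minimal streaming average: memory use stays at roughly two frames,
# no matter how many shots are in the stack. Purely illustrative;
# the actual averaging in the thread was done with ImageMagick.
import glob
import imageio
import numpy as np

paths = sorted(glob.glob("stack/*.tif"))     # hypothetical folder of exported frames
acc = None
for p in paths:
    frame = imageio.imread(p).astype(np.float64)
    acc = frame if acc is None else acc + frame
mean = (acc / len(paths)).astype(np.uint16)
imageio.imwrite("average.tif", mean)
```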

So, in the end it all comes down to what it is that you (want to) do with your newly built machine. Only you can actually answer that. I do hope that I gave you some footholds.

PS: I can confirm from experience what Ingo has mentioned: RawTherapee definitely leans more towards CPU usage compared to darktable (OpenCL and all that). GIMP is a bit of a mixed bag: it does have (experimental) OpenCL, but I turned it off: too many crashes when doing B&W stuff.

EDIT: This is the topic I was talking about: How to emulate Titarenko's long exposure stacking with digital?


Exactly.


It’s helped a great deal, thanks very much. I initially had 16GB pencilled in as it seemed enough for either RT or DT on their own, which is how I use them. But I hadn’t taken into consideration higher MP cameras, or multi-stacking projects - which I haven’t done a lot of, but will do more of in the future. So now I think 32GB is the way to go.

Based on the way I use each program, and pricing considerations at the store, I think I will prioritise GPU and darktable for very large batches, but continue to use both for single images or small batches. My new system will end up quite similar to yours. I just need to determine if the price increase from the GTX 1660 Super to the RTX 2060 is worth it; the GTX is about 73% of the cost of the RTX. And, like you, taking a Ryzen 5 over a 7 might make it worth it.

Hm. Permit me to have a different opinion.
Nowadays, it is not difficult to assemble one’s own computer. I have just built my third real thing. And if I can do it, so can you…

Have fun!
Claes in Lund, Sweden


+1. I’m about to assemble my umpteenth one. Just takes a screwdriver, and a bit of hand-holding. Here’s a decent hand: How to Build a PC (2024): Hardware Suggestions, Instructions, and More | WIRED

Yeah, just make sure you don’t wear wool socks :smile: Jokes aside, I agree - it’s not difficult at all, just make sure to watch a brief tutorial video if you’re doing it for the first time. The whole process takes about 30 to 60 minutes. I hope I didn’t sound like I was bragging; I just wanted to encourage you to do it yourself if you’re hesitating, @Soupy.


I believe you. I am more worried about selecting incompatible parts, because I don’t know about all the hardware. To solve that I could perhaps get a quote from the store to see what they come up with, then just get the parts myself. But it’s a time/cost thing: the time it takes to learn how to do it all myself vs the cost of getting the store to do it. In my life right now, time is a little more precious than cost, so I don’t think I will learn just yet. Maybe my next machine, in another ten years’ time (hoping this one will last that long, haha).

PCPartPicker can help you there. It gives selection criteria for each component, and when you add a part to your list it will tell you about compatibility. Start with a CPU, and it’ll offer compatible alternatives for the other parts…

https://pcpartpicker.com/
