New GPU for Darktable/ video encoding

Hi All.

My current system specs are as follows:

> [brian@Giger ~]$ inxi -F

System: Host: Giger Kernel: 5.6.16-1-MANJARO x86_64 bits: 64 Desktop: KDE Plasma 5.18.5 Distro: Manjaro Linux
Machine: Type: Desktop Mobo: ASUSTeK model: A88XM-A v: Rev X.0x serial: UEFI: American Megatrends v: 1801
date: 08/25/2014
CPU: Topology: Quad Core model: AMD A8-6600K APU with Radeon HD Graphics bits: 64 type: MCP L2 cache: 2048 KiB
Speed: 2177 MHz min/max: 1900/3900 MHz Core speeds (MHz): 1: 2177 2: 2155 3: 1895 4: 1894
Graphics: Device-1: NVIDIA GP108 [GeForce GT 1030] driver: nvidia v: 440.82
Display: x11 server: X.Org 1.20.8 driver: nvidia resolution: 2560x1440~60Hz
OpenGL: renderer: GeForce GT 1030/PCIe/SSE2 v: 4.6.0 NVIDIA 440.82
Audio: Device-1: Advanced Micro Devices [AMD] FCH Azalia driver: snd_hda_intel
Device-2: NVIDIA GP108 High Definition Audio driver: snd_hda_intel
Device-3: VIA VX1 type: USB driver: hid-generic,snd-usb-audio,usbhid
Device-4: C-Media type: USB driver: hid-generic,snd-usb-audio,usbhid
Sound Server: ALSA v: k5.6.16-1-MANJARO
Network: Device-1: Realtek RTL8111/8168/8411 PCI Express Gigabit Ethernet driver: r8169
IF: enp4s0 state: up speed: 1000 Mbps duplex: full mac: f0:79:59:6e:12:57
Drives: Local Storage: total: 1.93 TiB used: 1.16 TiB (59.9%)
ID-1: /dev/sda vendor: Kingston model: SV300S37A120G size: 111.79 GiB
ID-2: /dev/sdb vendor: Seagate model: ST1000DM003-1ER162 size: 931.51 GiB
ID-3: /dev/sdc vendor: Seagate model: ST1000DM003-1CH162 size: 931.51 GiB
Partition: ID-1: / size: 109.30 GiB used: 10.23 GiB (9.4%) fs: ext4 dev: /dev/sda2
ID-2: /home size: 916.77 GiB used: 709.18 GiB (77.4%) fs: ext4 dev: /dev/sdb1
Sensors: System Temperatures: cpu: 11.2 C mobo: N/A gpu: nvidia temp: 38 C
Fan Speeds (RPM): N/A gpu: nvidia fan: 0%
Info: Processes: 221 Uptime: 16h 19m Memory: 15.58 GiB used: 3.96 GiB (25.4%) Shell: bash inxi: 3.0.37

I’ve been having slow exports from darktable, and running darktable from the command line with darktable -d opencl gives the following error:

728.899903 [pixelpipe_process] [export] using device 0
730.234856 [guided filter] unknown error: -4
730.623625 [guided filter] fall back to cpu implementation due to insufficient gpu memory
[export_job] exported to `/home/brian/Desktop/IMG_5070_01.jpg’

So I suspect I’m running out of memory on the GPU (a 2 GB Nvidia GT 1030). My display is a 2560x1440 screen, so the desktop is likely using up GPU memory as well, especially since I’m running Plasma on Manjaro with the OpenGL compositor.
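A quick way to check that suspicion (a standard tool shipped with the proprietary NVIDIA driver, not something mentioned above) is to query the card’s memory usage while an export is running:

```shell
# Prints how much of the GT 1030's 2 GB is in use (run it while exporting;
# the 1440p desktop and Plasma's OpenGL compositor claim their share too).
if command -v nvidia-smi >/dev/null 2>&1; then
    nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader
else
    echo "nvidia-smi not found (no NVIDIA driver on this machine)"
fi
```

Wrap it in watch -n 1 to see usage change as darktable works through the pipeline.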

Also, the GT 1030 does not support NVENC for video encoding.

So my question is: would I see much benefit from upgrading the GPU? I plan to give the system a complete upgrade eventually (which will mean a new motherboard and RAM, since modern CPUs use different sockets :roll_eyes:)

If so, what would be a decent GPU for darktable / video encoding etc.? I don’t game on my system; I have a PS4 for that :wink:

Also, would I get more bang for buck going for AMD rather than Nvidia?

E.g. a Radeon RX 580 Pulse OC Light 8192MB GDDR5 PCI-Express graphics card (£180), or a GeForce GTX 1660 Twin Fan 6144MB GDDR5 PCI-Express graphics card (£200)?

Suggestions welcome!

1 Like

Hi @Brian_Innes,

Can of worms! :slight_smile:
Here, I run a Ryzen 3900X plus a GTX 1660, and
Manjaro/KDE. Works just fine.

I believe I have performance timings from a GT 1050
(i.e. before and after my switch to the GTX 1660), showing
how darktable behaves with/without OpenCL. Do you
want me to dig them up for you?

Have fun!
Claes in Lund, Sweden


I found this but couldn’t find another page that explains it better. So, open .config/darktable/darktablerc and try increasing opencl_memory_headroom, as that user did. Hope this helps.
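For reference (not from the linked post: the key name comes from darktable’s config file, and 600 is just an illustrative value; the default headroom is a few hundred MB and varies by darktable version), the line to change looks like this:

```
# ~/.config/darktable/darktablerc -- edit only while darktable is closed,
# or darktable will overwrite the file on exit.
# Value is in MB: VRAM kept back for the driver/desktop rather than OpenCL.
opencl_memory_headroom=600
```

A larger headroom makes darktable’s OpenCL kernels request less VRAM, which can avoid the -4 (out of resources) error at the cost of falling back to the CPU sooner.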

1 Like

I’m in the hunt for a new system also. Here are a few benchmarks I’ve found testing darktable on various GPUs:
https://www.phoronix.com/scan.php?page=article&item=rocm-20-linux50&num=3
https://math.dartmouth.edu/~sarunas/darktable_bench.html
https://www.phoronix.com/scan.php?page=news_item&px=Darktable-27-Results

Nvidia GTX cards seem to perform better than various Radeon GPUs in most instances. No idea if they support NVENC. I’m leaning towards a GTX 1660 or 1060. Also interested to hear the responses here.

1 Like

I went from a GTX 1060 3GB to a GTX 1060 3GB + RTX 2080 8GB (Ubuntu 18.04) and had a hard time with the driver: I had to remove the 2080, install the latest driver with just the GTX 1060, and then reinstall the 2080.
Noise reduction in darktable, which used to be the thing that took the longest for me, is no issue anymore.

1 Like

Thanks for the replies.

I’m leaning more towards an Nvidia GPU, as I’ve never had an issue with those on Linux.

Question is, would a GPU with 8GB memory be overkill? Or would I get away with a 6GB memory one? Lots to research, I think! Certainly, whatever GPU I do buy, I want it to still be effective even if I do upgrade my CPU / Motherboard in the future.

Didn’t check for darktable. When I use Metashape, my 1060 is always 100% full; the 2080 is around 60% full.

1 Like

Thanks @Peter, since you can’t upgrade the memory on a GPU, probably best to go for one with as much memory as possible in my budget :slight_smile:

Checked now with darktable: when increasing pretty much everything in the NR module (you would never do that), my KFA2 GeForce RTX 2080 Super EX 8GB uses 84% of its memory when moving around the picture at 100% view.

1 Like

I’ve always been a bang-for-buck purchaser - I’ll switch brand more readily than many :stuck_out_tongue_winking_eye:

When I built my current system, I had no choice but to go the AMD way for reasons of affordability. So I bought an AM4 socketed motherboard capable of supporting Ryzen but started off with an A12 APU (integrated graphics running OpenCL specifically for darktable). Obviously no separate GPU memory, but with 16GB (now 32GB) RAM, no shortage experienced.

I have since upgraded to a Ryzen 7 and added a GTX 1660 and performance is frequently gratifyingly quick, even for processor intensive denoising and the like. Export can still take a while, especially at full resolution (my DSLRs are both 24 MP).

Part of my eagerness to go the AMD route was also the number of bugs/vulnerabilities that Intel silicon was reputed for, especially back in 2017 when I built my current system.

Previously, though, I had a Core i7-powered laptop with Nvidia graphics.

1 Like

Curiously, I did some darktable benchmarking using the Phoronix Test Suite: same hardware, first on a Manjaro KDE install, then the same benchmark repeated on my Mint 19.3 Cinnamon install.

Significantly better performance on Mint 19.3 Cinnamon. Yes, the GPU drivers were installed on Manjaro!

I still think it’s worthwhile upgrading my GPU though, as when running watch -n 1 nvidia-smi I can see the GPU memory usage almost reach 100% at times while exporting!

1 Like

Anyway, I’ve just pressed the buy button on a Radeon RX 580 GTS 8 GB GDDR5 3xDP/HDMI/DVI-D Graphics Card for £160.

I could have spent weeks browsing specs etc, but this seems the best combination of performance and cost :slight_smile: Hopefully should be here tomorrow :slight_smile:

I’ve been looking at that card as well, please let us know how it goes!

1 Like

Hopefully it should be here tomorrow. Once I get it installed and the GPU drivers set up, I’ll do another darktable benchmark :slight_smile:

GPU installed today. A bit more fiddly to install the drivers for the AMD GPU, compared to Nvidia.

However, a fair bit of improvement in my OpenCL scores!

2 Likes

Which drivers are you using?

1 Like

@paperdigits, I’m using the amdgpu-pro drivers from the AMD website :slight_smile:

These are really great specs, but can you tell me something? What is the minimum required GPU for video encoding? I’m asking for those who really cannot afford a high-range GPU.

The minimum will vary depending on what you’re using to encode your video. And if price is an issue, you don’t need a GPU at all: you can do the whole thing on the CPU.
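For example, a CPU-only ("software") H.264 encode with ffmpeg’s libx264 encoder needs no GPU at all. This is just an illustrative command, not anything posted above; input.mov and output.mp4 are placeholder filenames:

```shell
# Software H.264 encode on the CPU. Lower -crf = higher quality and a
# bigger file (23 is a reasonable default); -preset trades encoding speed
# for compression efficiency (slower presets = smaller files).
ffmpeg -i input.mov \
       -c:v libx264 -crf 23 -preset medium \
       -c:a aac -b:a 160k \
       output.mp4
```

It will be slower than NVENC/VAAPI hardware encoding, but libx264 generally gives better quality per bitrate, so for occasional exports a GPU encoder is a convenience, not a requirement.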