Darktable AI Edge Detection

That’d be a UX disaster, and we already get hammered on UI/UX all the time (true or not).

Could you please explain why and where you think it would be a UX disaster? Do you mean the toggle in the settings (Blender, for example, hides features behind settings) or the module itself? In my imagination it has a file dialog button, a text field naming the file, a recompute button, a toggle for whether to consider the mask in thumbnails, and maybe also an ROI vs. full-image selector and perhaps even a colour format dropdown.
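
To make that proposal concrete, here is a minimal sketch of the handful of settings such a module would carry, written as a plain Python dataclass. This is purely illustrative: none of these names exist in darktable, and a real module would of course be implemented in C inside darktable's pipeline.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class MaskScope(Enum):
    ROI = "roi"      # recompute only for the current region of interest
    FULL = "full"    # recompute for the full image


class MaskColorFormat(Enum):
    GRAY_8 = "gray8"    # 8-bit single-channel mask
    GRAY_16 = "gray16"  # 16-bit single-channel mask
    RGB_8 = "rgb8"      # packed RGB, e.g. a colour-coded segment map


@dataclass
class ExternalMaskModuleSettings:
    """Hypothetical settings block for the module described above."""
    mask_file: Optional[str] = None   # chosen via a file dialog, shown in a text field
    use_in_thumbnails: bool = False   # toggle: consider the mask when rendering thumbs
    scope: MaskScope = MaskScope.FULL
    color_format: MaskColorFormat = MaskColorFormat.GRAY_8

    def needs_recompute(self, file_changed: bool, user_pressed_recompute: bool) -> bool:
        # Recompute only on explicit request or when the referenced file changed.
        return user_pressed_recompute or file_changed
```

Seen this way, the whole module reduces to one file reference plus a few toggles, which is a fairly small UI surface.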

Btw, I don’t see any major UX disaster in darktable; I even like the oft-discussed collection filters very much and had been hoping for such a feature for years. But I also like the UX of Blender, so maybe it’s my fault :wink:

^ Implementing it this way would be a UX disaster

I am a native speaker and I think your use of the word hostile is completely legitimate. It’s arguable, of course, depending on the judgement of the reader, but having read through this thread, the word hostile is within the bounds of reasonable interpretation of at least some of the commentary. Thanks

Not all image processing software is targeted at the same population and the same needs. The same is of course true of most products. We would not expect, for instance, car manufacturers to slavishly emulate each and every feature that others offer. We each select and make compromises in what we acquire according to our needs and availability.
Rather than asking the dt developers, with their meager resources, to add features … why not ask Adobe to upgrade LR to match the superior development of dt?
Maybe you should buy a copy of LR for those specific fast AI processing needs and simply use dt for quality processing.

Hi David. I think you make a good point on the limited resources, and that may be argument enough. However, saying that choosing photo software is a matter of horses for courses doesn’t seem entirely realistic. Most people don’t often use multiple full-fat editing packages, particularly, I suppose, pros who need to get the job done. I wonder (out loud) whether there isn’t a tendency, not so much among the developers as among the users, to feel that this is their niche and that newcomers should do the hard work of figuring things out rather than expect things to be too simple, straightforward or popular. I’m not directing this at you, but at some of what I’ve seen in this and other threads. It’s probably an unfair characterisation. Thanks

Hello.

I just posted a play raw with an image to see how users change the background using just luminosity masks. As I’m not a heavy user of darktable and don’t know the tools that well, I want to see how users separate the model from the background using luminosity masks. This is an easy one.

This is another image where I tried to select the model in darktable using luminosity masks, but in the end I switched to GIMP to finish the edit.

Basic edit in darktable

High-end edit in GIMP

But I didn’t change the background color, and definitely not with AI :wink:
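
For readers who have not used the technique being discussed: a luminosity mask is just a greyscale selection derived from the image’s own brightness. The sketch below shows the underlying idea with numpy, assuming an RGB image already loaded as floats in [0, 1]; it is not how darktable’s parametric masks are implemented, only the concept.

```python
import numpy as np


def luminosity_mask(rgb: np.ndarray, low: float = 0.2, high: float = 0.8,
                    feather: float = 0.05) -> np.ndarray:
    """Return a soft mask selecting pixels whose luminance lies in [low, high].

    rgb: H x W x 3 float array in [0, 1].
    feather: width of the smooth falloff around each threshold.
    """
    # Rec. 709 luma as a simple luminance estimate.
    luma = 0.2126 * rgb[..., 0] + 0.7152 * rgb[..., 1] + 0.0722 * rgb[..., 2]

    # Smooth ramps instead of hard thresholds, so the selection edge is feathered.
    rise = np.clip((luma - (low - feather)) / (2 * feather), 0.0, 1.0)
    fall = np.clip(((high + feather) - luma) / (2 * feather), 0.0, 1.0)
    return rise * fall  # 1.0 inside the brightness band, falling off smoothly outside
```

This also illustrates why the separation above is hard: the mask can only follow brightness, so wherever the model and the background share similar luminance the selection has to be cleaned up by hand, e.g. in GIMP.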

That’s my impression as well.
But I didn’t want to exclude the possibility up front :slight_smile:

I do think it would be a more interesting solution to be able to edit the masks externally.
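
As a rough illustration of that idea (a hypothetical workflow, not an existing darktable feature), externally editing a mask could be as simple as exporting it as a greyscale image, painting on it in GIMP, and reading it back:

```python
import numpy as np
from PIL import Image


def export_mask(mask: np.ndarray, path: str) -> None:
    """Write a float mask in [0, 1] as an 8-bit greyscale PNG for external editing."""
    data = (np.clip(mask, 0.0, 1.0) * 255).astype(np.uint8)
    Image.fromarray(data, mode="L").save(path)


def import_mask(path: str) -> np.ndarray:
    """Read an externally edited greyscale image back as a float mask in [0, 1]."""
    return np.asarray(Image.open(path).convert("L"), dtype=np.float32) / 255.0
```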