G'MIC for OpenFX and Adobe plugins

Unfortunately I can’t change the title of this topic to better reflect that the primary focus for me is the G’MIC OpenFX plugin (with Adobe plugins following shortly after that).

So I made good progress in the last few days: I can now parse the latest 2.9.3 filters and make them available in hosts like Natron. “Unfortunately”, even when skipping the about, animated, interactive, sequences, etc. filters that do not make sense in a render-only context, we end up with more than 1000 filters, which makes them quite difficult to present in a context menu like in Natron or Nuke, see here:

I added the option to define an allow/deny list in a text file, so a user can define which filters should appear as OFX plugins and which should not, but the number is still overwhelming, and we only have one division level (the category) in both OFX and G’MIC. So I am not sure yet how this will evolve in the future.
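For those wondering what “parsing the filters” involves: the filter entries in the G’MIC standard library are plain “#@gui” comment lines, so extracting them is mostly simple text scanning. Here is a minimal sketch in C++; it is deliberately simplified, assumes each entry follows the “#@gui Name : command, preview_command” pattern, and ignores the category information, parameter definitions and translations that a real parser also has to deal with (the file name is just an example):

```cpp
// Minimal sketch: extract filter names and commands from the #@gui lines
// of a G'MIC filter definition file. Categories, parameters, notes and
// translations are ignored here.
#include <cstddef>
#include <fstream>
#include <iostream>
#include <string>
#include <vector>

struct FilterEntry {
  std::string name;     // e.g. "Rodilius"
  std::string commands; // e.g. "fx_rodilius, fx_rodilius_preview(1)"
};

std::vector<FilterEntry> parse_gui_filters(const std::string& path) {
  std::vector<FilterEntry> filters;
  std::ifstream in(path);
  std::string line;
  const std::string tag = "#@gui ";
  while (std::getline(in, line)) {
    if (line.compare(0, tag.size(), tag) != 0) continue;  // not a GUI line
    const std::string body = line.substr(tag.size());
    if (body.empty() || body[0] == ':') continue;  // parameter line, skip
    const std::size_t sep = body.find(" : ");
    if (sep == std::string::npos) continue;
    filters.push_back({ body.substr(0, sep), body.substr(sep + 3) });
  }
  return filters;
}

int main() {
  for (const auto& f : parse_gui_filters("update293.gmic"))
    std::cout << f.name << " -> " << f.commands << "\n";
}
```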

I am still testing functionality and whenever I find a plugin that is not working, I will run the command on the G’MIC CLI to see what the expected output should be.

In Natron, I often run into the dreaded “stack overflow” errors though. I patched the Natron binary to increase the stack size to 16MB, just like the G’MIC CLI has, but it seems it still happens from time to time.

If there is interest in testing the OFX plugin for G’MIC on Windows, let me know, then I will upload my current binaries here.

1 Like

Better now?

1 Like

Thanks, I am happy now! :slight_smile:

When you say ‘Adobe plug-in’, do you mean that Adobe has its own plug-in API for running compatible plug-ins across different applications? (PS, I presume?)
I looked at how to create a G’MIC plug-in for PS years ago, and even tried to hire people to do that, but with no success.

Personally, I think the quality of the filters is uneven, and they should not all appear in the same place. I’ve already tried to organize filters by category, but it’s not enough. The ‘Testing/’ category has been set up for incomplete / unfinished filters, but that only gives a ‘binarization’ of filter quality (good enough / still in development). And people do not really care about this; they just leave their (sometimes great) filters in Testing/ because they are too shy to move them into the main tree :slight_smile:

It would actually be good to get more clues about which filters are the most used and most appreciated. A rating system for each filter would be great, but that means collecting some kind of information from our users, and we don’t want that.

Also, some filters are better suited (faster) for working with videos than others, so the quality of a filter is not really measurable without considering the context it is used in.

My suggestion would then be that you make your own “selection” of what you consider useful for videos. And if you succeed in making a plug-in for PS, it will probably be another selection (closer to what we have in the G’MIC-Qt plugin).

Selecting those filters is an open question though.

Adobe offers SDKs/APIs for many of their products:
https://console.adobe.io/downloads/ae

The one I have been developing with for more than 10 years now is the After Effects SDK, which allows you to create video effect plugins for After Effects (which can also be read by Premiere Pro). The OFX standard started as an (incompatible) spinoff of the AE SDK, by the way :slight_smile:

I also developed some PS plugins (a totally different API) some years ago, but I wasn’t too fond of that API and decided to focus on video effects.

So both my OFX and After Effects/Premiere Pro plugins for G’MIC are just wrappers/shells, compiled with their respective plugin SDKs, that present G’MIC filters to the host in their required parameter and processing format. Internally, after conversion of the image buffers and parameter values, these plugins call the G’MIC library with the same commands as you would on the command line. The processed output is then converted back to the format required by the hosting application.
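To make the “wrapper/shell” idea more concrete, here is a minimal sketch of such a round trip using the libgmic C++ interface (gmic.h). It is an illustration only: it assumes an 8-bit interleaved RGBA buffer, assumes the command keeps the image size, and skips the error handling, output-layer selection and bit-depth handling that a real plugin needs.

```cpp
// Minimal sketch: feed an interleaved 8-bit RGBA host buffer to the G'MIC
// library, run a filter command, and copy the (first) result back.
#include <algorithm>
#include <cstddef>
#include "gmic.h"

void run_gmic_on_rgba(unsigned char* rgba, int width, int height,
                      const char* command) {
  const std::size_t plane = (std::size_t)width * height;

  // G'MIC/CImg images are channel-planar; hosts usually hand over interleaved RGBA.
  gmic_list<float> images;
  gmic_list<char> image_names;
  images.assign(1);
  images[0].assign(width, height, 1, 4);
  float* dst = images[0]._data;
  for (int c = 0; c < 4; ++c)
    for (std::size_t i = 0; i < plane; ++i)
      dst[c * plane + i] = rgba[4 * i + c];

  // Run the same command string you would type on the G'MIC command line.
  gmic(command, images, image_names);

  // Copy the first output image back into the host's interleaved buffer.
  const float* src = images[0]._data;
  for (int c = 0; c < 4; ++c)
    for (std::size_t i = 0; i < plane; ++i)
      rgba[4 * i + c] =
          (unsigned char)std::min(255.f, std::max(0.f, src[c * plane + i]));
}
```

Calling this with something trivial like “blur 3” is a good first sanity check before moving on to the full fx_* filters.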

A PS plugin for G’MIC is definitely possible; you just need someone with enough expertise to put it together. With the library abstraction, the dependencies are minimal, so it shouldn’t be too hard. I can’t do it myself, however, I have too much stuff to do already :wink:

2 Likes

Thanks for the clarification.
The Windows world is still a land I have yet to discover :slight_smile: (not that I’m particularly eager to)

1 Like

Some progress I made today:
Many G’MIC filters are a bit difficult, if not nearly impossible, to use in a video rendering environment, where the host might complain if a frame is not rendered within a certain timeframe. This is especially relevant when the user interactively changes a parameter in the UI, as a tiny bit of mouse movement can trigger lots of render calls in the background.
In OFX plugins there is therefore an option to discard the current render, and a plugin can query whether it should stop. In G’MIC, there is the option to set an “abort” flag that should make the running command return quickly. Previously, whenever I created a G’MIC instance from an OFX plugin instance, I set up a second thread that checks the discard/abort option from the OFX host while G’MIC does its processing, and when the OFX flag is set, it sets the G’MIC abort flag.
But this never worked for me in Natron and Nuke (I haven’t tested in Resolve and Vegas yet), as the OFX abort function was never called. I have now switched it around, so that the main plugin thread queries the OFX abort flag while the second thread runs the G’MIC command, and suddenly it works! I assume both of these OFX hosts check which thread actually sends the query for the abort flag.
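In code, the pattern that works for me now looks roughly like the sketch below. It is only an illustration: it assumes the OFX C++ support library’s ImageEffect::abort() and libgmic’s p_is_abort pointer, and the exact constructor arguments are from memory, so don’t treat it as the actual plugin source.

```cpp
// Simplified pattern: run G'MIC in a worker thread while the main render
// thread polls the host's abort flag and forwards it to G'MIC.
#include <atomic>
#include <chrono>
#include <thread>
#include "gmic.h"
#include "ofxsImageEffect.h"

bool render_with_abort(OFX::ImageEffect& effect, const char* command,
                       gmic_list<float>& images, gmic_list<char>& names) {
  bool is_abort = false;          // polled by the G'MIC interpreter
  std::atomic<bool> done(false);

  std::thread worker([&]() {
    try {
      // Last two arguments are libgmic's progress and abort pointers.
      gmic(command, images, names, 0, true, 0, &is_abort);
    } catch (...) { /* aborted or failed */ }
    done = true;
  });

  // The main thread is the one the host expects the abort query from.
  while (!done) {
    if (effect.abort()) is_abort = true;  // tell G'MIC to bail out
    std::this_thread::sleep_for(std::chrono::milliseconds(20));
  }
  worker.join();
  return !is_abort;
}
```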

Anyway, the good news is that the plugin is now much more responsive when changing parameters :slight_smile: I wonder how well that will translate to the Adobe AE SDK; I think there is a similar structure there as well.

4 Likes

Another progress report…

So one of the main issues when using G’MIC as a plugin or shared library is that it is very stack-demanding. For the G’MIC CLI application, the stack size is set to 16 MB at compile time (compared to the 1 MB default on Windows and the 0.5 MB default on macOS), but this cannot be done for libraries, since the calling process defines the stack size.
On Windows, I used to remedy this in the past by patching the executable headers of the host applications like Natron, Nuke, After Effects, etc., increasing the default stack size to 16 MB, but this was only a hack, and it applied to all threads created by the application.

My new approach is to create a thread inside the plugin with an increased stack size of 16 MB, and then launch the request to the G’MIC library from that thread. That is easier said than done, however, since the C++ standard thread library deliberately does not include an option to set the stack size of a thread. So I switched to pthreads on Windows as well, where I can set the stack size on creation.
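The core of it is only a few lines; here is a sketch, assuming a pthreads implementation is available on Windows (e.g. the winpthreads that ships with MinGW-w64):

```cpp
// Run the G'MIC render call on a dedicated thread with a 16 MB stack,
// since std::thread offers no way to request a particular stack size.
#include <pthread.h>

struct RenderJob {
  const char* command;  // plus whatever else the G'MIC call needs
};

static void* render_thread(void* arg) {
  RenderJob* job = static_cast<RenderJob*>(arg);
  // ... call into libgmic here with job->command ...
  (void)job;
  return nullptr;
}

bool run_with_big_stack(RenderJob& job) {
  pthread_attr_t attr;
  pthread_attr_init(&attr);
  pthread_attr_setstacksize(&attr, 16 * 1024 * 1024);  // 16 MB, like the CLI

  pthread_t tid;
  const int rc = pthread_create(&tid, &attr, render_thread, &job);
  pthread_attr_destroy(&attr);
  if (rc != 0) return false;

  pthread_join(tid, nullptr);  // wait for the render to finish
  return true;
}
```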

Also, dozens and dozens of bugs fixed. I now test OFX with Natron, Nuke, DaVinci Resolve and Sony Vegas 18, and they all behave oh so differently, it’s crazy. In Vegas, for example, any parameter containing a French accented character leads to a hard crash (it took me a long time to figure that out). In DaVinci Resolve, if two parameters within a plugin have the same name, it crashes. Also, if too many plugins are present, most hosts run into problems or refuse to accept the plugins at all; I can load around 300-400 at the moment.

The usability of the filters in a video host also varies a lot. Some work great, others not so much, and some simply aren’t feasible. Maybe it is a good idea to provide a list of filters that are verified to be working and usable as a starting point, which users can then simply extend if they want to.

For now, I am happy that I have it running on Windows in a somewhat usable early state, although it is still a rather finicky solution at the moment. I think my next step will be to get the After Effects plugin to roughly the same state.

2 Likes

I have thought about trying to write a PS plugin that works with G’MIC-Qt; the plugin design can grow complicated fairly quickly depending on the features you want to support.
The PS filter API also has some limitations, e.g. a filter cannot create or resize layers and images.

I wrote a small PS filter that allows PS users to run 32-bit plugins from a 64-bit version of PS; it transfers the image to and from a 32-bit filter host application, which runs the 32-bit filters and returns the modified image to PS.
The 32-bit host application is based on my Paint.NET plugin that runs 32-bit and 64-bit PS filters.

The PS filter API is an old and complex beast, with many different applications providing varying levels of support for using the plugins that are created for it.
For example, PS has supported filters reading from any image layer since version 7.0, but few, if any, of the 3rd-party applications that can use PS filters support that part of the plugin API.

Still, if the filter was written to target an old enough version of the PS SDK, it could work in PSP, XnView, etc.

2 Likes

As written before, the sheer number of filters pushes the host and the operating system to their limits in many cases when they are all handled from within one plugin. In the case of OFX, we have to provide an entry-point function for each filter separately; in After Effects, each filter even has to be compiled into a separate plugin/DLL. So this is not really trivial, although for both approaches I have found workarounds/helpers.
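For background: an OFX binary exposes its effects via two exported functions, OfxGetNumberOfPlugins() and OfxGetPlugin(), and every OfxPlugin descriptor that is returned needs its own mainEntry function pointer, which is why one entry point per filter is needed. One way to get there (a sketch, not necessarily how my plugin does it; genericMain, genericSetHost and registerFilter are placeholder names) is a pool of template trampolines that all dispatch into a generic handler:

```cpp
// Sketch of the OFX bootstrap: one binary exposing many plugin descriptors,
// each with its own entry point generated from a template trampoline.
#include <vector>
#include "ofxCore.h"
#include "ofxImageEffect.h"

// Placeholder dispatcher: looks up the G'MIC filter by index and handles
// the OFX actions (describe, create instance, render, ...).
static OfxStatus genericMain(int filterIndex, const char* action, const void* handle,
                             OfxPropertySetHandle inArgs, OfxPropertySetHandle outArgs) {
  (void)filterIndex; (void)action; (void)handle; (void)inArgs; (void)outArgs;
  return kOfxStatReplyDefault;
}

static void genericSetHost(OfxHost* host) { (void)host; /* remember the host */ }

template<int N>
static OfxStatus mainEntryN(const char* action, const void* handle,
                            OfxPropertySetHandle inArgs, OfxPropertySetHandle outArgs) {
  return genericMain(N, action, handle, inArgs, outArgs);
}

static std::vector<OfxPlugin> gPlugins;  // filled at load time, one per filter

// Registers filter number N under its own identifier. In practice a fixed
// maximum number of these trampolines has to be instantiated at compile time.
template<int N>
static void registerFilter(const char* identifier) {
  OfxPlugin p;
  p.pluginApi = kOfxImageEffectPluginApi;
  p.apiVersion = 1;
  p.pluginIdentifier = identifier;
  p.pluginVersionMajor = 1;
  p.pluginVersionMinor = 0;
  p.setHost = genericSetHost;
  p.mainEntry = mainEntryN<N>;
  gPlugins.push_back(p);
}

OfxExport int OfxGetNumberOfPlugins(void) { return (int)gPlugins.size(); }
OfxExport OfxPlugin* OfxGetPlugin(int nth) { return &gPlugins[nth]; }
```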

I have now also added a simple way to define which G’MIC filters should be exposed as OFX plugins. The cool thing is that in this text file (which is read at first load of the plugin), you can not only define which filters to load or skip, but also freely rename both the filter name and the category it should appear under, without affecting any of the G’MIC commands at all. If this file does not exist, it is created with all filters and their categories, so one can then pick the ones that are working. This makes working with all the filters a bit easier for those who want to.

After filtering out the various test filters, I still have around 1000 G’MIC plugins available, and from my tests around 80% of them seem to produce at least a somewhat usable result in video hosts, with more or less acceptable speeds/waiting times, at least for smaller images.

2 Likes

@David_Tschumperle Do we have sample_videos in gmic? I’m interested in trying my filters on videos.

To manage videos, G’MIC has to be compiled and linked with OpenCV enabled, which is not done by default on Windows. That’s because OpenCV is a monster library that comes with a lot of large DLLs, and I didn’t want to distribute a 100 MB archive just to enable the few OpenCV-based commands in G’MIC (basically webcam input with command camera, and video input/output).
On Linux, the compiled G’MIC packages enable OpenCV because it’s always a shared dependency, so at least this can benefit other installed programs (when it’s not already installed).

I used to compile OpenCV in its early days and it was a nightmare, esp. on Windows. It’s much easier nowadays, but I understand your reasons. @Reptorian you could always enable it. :wink: As for samples, I think it could be as simple as inputting a URL or using your own files. Maybe G’MIC could use a short video demo that sample could access…

Hmm, there’s another idea I have in mind. Maybe @Tobias_Fleischer can make a database of the filters that don’t work on videos, and then volunteers can help adapt those filters for use on videos?

Adapting some of those may not be possible, unfortunately, but I think we can save at least a few if we try.

I’m not sure I have the time and means to do so; this is really a side project of mine and I hardly have time for it.
In general, most video effect plugins need one input and one output layer, no more, no less. Most hosts and plugin types can be configured to have additional (optional) input layers, but they are restricted to one output layer. In my OFX and AE G’MIC plugins, I include these additional input layer options, and for the output layer, I let the user choose from the layers that the G’MIC command spits out (or provide a layer-merge option).
Furthermore, all input and output layers are considered to be 4-channel RGBA.

So in theory all G’MIC commands that provide a reasonable transform of an input pixel buffer into an output pixel buffer will also work on video on a frame-by-frame basis.

The main issue is rendering time: it makes quite a difference whether a user applies a filter to an image in GIMP and waits 3 seconds for a rendered result, or applies the same filter to a 10-second video sequence in Natron and has to wait 30 minutes for it to render!

1 Like

And we finally have G’MIC running in Adobe After Effects again :slight_smile:
Below you can see the Rodilius filter applied to a video (here is a simple render of the processed clip with the amplitude parameter automated from 0 to 10): red woman vs rodilius on Vimeo

A different render with a 1:4 downsampling (resulting in 1/10th render time) is here: red woman vs rodilius 2 on Vimeo

3 Likes

Testing temporal consistency with the “Black Crayon Graffiti” filter, not too bad:

2 Likes

I just re-discovered a special mode that I added to the plugins 5 years ago. :slight_smile:
Besides having each filter available as a separate plugin, there is also a “generic G’MIC plugin”, which is simply a text field where commands can be entered and sent to the G’MIC interpreter. That way any command (or sequence of commands) can be run from one single plugin, which is very convenient. I just added 8 normalized float parameters to the plugin and made them available to the interpreter as separate variables, so that you can now even automate the command for each rendered frame.
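Under the hood, that just means prepending the slider values as G’MIC variables to whatever the user typed before handing the string to the interpreter. A small sketch of the idea (the variable names p1..p8 are only an example, not necessarily the ones the plugin actually exposes):

```cpp
// Sketch: expose 8 normalized slider values to the G'MIC interpreter by
// prepending them as variables (p1..p8) to the user-entered command string.
#include <cstdio>
#include <string>

std::string build_command(const std::string& user_command,
                          const float sliders[8]) {
  std::string cmd;
  char buf[64];
  for (int i = 0; i < 8; ++i) {
    std::snprintf(buf, sizeof(buf), "p%d=%g ", i + 1, sliders[i]);
    cmd += buf;                 // e.g. "p1=0.35 p2=0 ... "
  }
  cmd += user_command;          // which may reference $p1..$p8, e.g. "fx_xyz {$p1*10},..."
  return cmd;
}
```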
In the screenshot below, the first parameter of the fx_crayongraffiti2 command (i.e. the amplitude) gets modulated by the first slider of the plugin, which can be controlled through Natron’s scripting or automation API.

Same thing in After Effects:

4 Likes

Would be lovely to have this as an MLT plugin so that Kdenlive, Shotcut, Synfig and others could benefit from it. Not to mention standalone cases like this where millions of videos are generated.

How complex would it be? Would you be interested in developing it? Maybe the community could set up a fundraiser if so?

How long did it take to render this?

1 Like