The purpose of this thread is to keep people up to date on my progress and process, similar to @David_Tschumperle’s change log and diary threads. Hope you like it. Feedback is welcome; just keep it short (or start a new thread).
If you have been following my posts in the Play Raw and G’MIC categories, among others, you will know that I have been accumulating a growing number of commands. It is time for me to clean up the mess and start making pull requests for them.
The first command going up is afre_y50. Its objective is to generate a luminance Y (D50) channel from a linear RGB image using either Rec.2020 or Rec.709 coefficients.
afre_y50 also introduces quirky error messages (>_<).

(>_<) afre_y50: Parameter must be '0' (Rec.709) or '1' (Rec.2020).
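For anyone curious about the mapping itself: it is just a weighted sum of the linear channels. A minimal sketch in Python for clarity (the coefficients are the standard Rec.709/Rec.2020 luma weights; the actual afre_y50 code is G’MIC, and its details are its own):

```python
# Project a linear-RGB triple onto luminance Y using the standard
# luma coefficients. This is the idea only, not the G'MIC command.

REC709  = (0.2126, 0.7152, 0.0722)   # ITU-R BT.709 weights
REC2020 = (0.2627, 0.6780, 0.0593)   # ITU-R BT.2020 weights

def luminance(rgb, rec2020=False):
    """Return linear luminance Y for one linear-RGB pixel."""
    w = REC2020 if rec2020 else REC709
    return sum(c * k for c, k in zip(rgb, w))

print(round(luminance((1.0, 1.0, 1.0)), 6))        # white -> 1.0
print(round(luminance((0.0, 1.0, 0.0), True), 4))  # pure green, Rec.2020 -> 0.678
```

Both sets of weights sum to 1, so a neutral pixel keeps its value.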
I hope to share my guided filter next because it has many applications. The hard part is making it presentable and useful to the general public. To do:
1. Among ~10 versions, choose which ones to share.
2. Proofread to make sure they work as expected.
3. Add CLI tags for
4. Make sure there is an appropriate number of parameters.
5. Set reasonable value ranges and checks.
6. Do the final checks and let it sit in my fork.
7. Make a pull request.
PS Lesson: Since I made so many versions, it has been hard to determine which commands I issued bug fixes for. I will have to comb through them and also decide on the command and image naming scheme. Guess this is the reason people use version control like Git.
Can’t wait to see your coding samples here.
Mr. Afre, is that new guided filter basically an in-house implementation, as in it doesn’t use the built-in guided command at all? Step by step?
It is an implementation with personal touches, and it will become one of my core commands.
– @garagecoder helped me with G’MIC specific questions.
– The original papers by He et al. helped me understand the idea and components.
– Other papers helped me understand its limitations.
– Step-by-step, I replicated algorithms I thought were interesting yet simple.
– After gaining insight, I went on to experiment and write my own code.
In that sense, I am an unemployed researcher doing it for the sake of FLOSS and fun. This upload won’t be the end. (Still making it ready for public release, which is another task altogether.) In time, I might share my other variations or fold them into what I have decided to integrate.
No, that was my fault: a bug in the filter update command. Basically, the update hasn’t worked for a few days. It should be fixed now.
Why should it do that?
The PR included that commit.
Just a personal request: whenever you’re done with the gf filter, can you leave commentary on how it works next to each line? I believe the gmic code would finally provide a way to implement the guided filter in Krita.
I have already been labelling the image stack to keep track of things. If you have the paper handy, you should be able to follow along. Still prepping for release. Kind of annoying, but my knowing how to use it doesn’t mean the public will.
PS Sticking with existing code has proved to be difficult. Full rewrite in progress, retracing my steps. Tried store and restore but won’t be using them, as they would slow down the command.
If you are using G’MIC 2.7.2, with store and restore hard-coded as native commands, I doubt they will slow down the command. Native versions of these commands are actually quite fast.
For instance, with:
foo :
  v - sp lena,4096 # <- 4096x4096 image
  store lena
  repeat 10
    tic
    restore lena
    v + toc v -
    rm
  done
  q
$ gmic foo
[gmic]-0./ Start G'MIC interpreter.
[gmic]-1./foo/*repeat/ Elapsed time: 0.147 s.
[gmic]-1./foo/*repeat/ Elapsed time: 0.145 s.
[gmic]-1./foo/*repeat/ Elapsed time: 0.144 s.
[gmic]-1./foo/*repeat/ Elapsed time: 0.145 s.
[gmic]-1./foo/*repeat/ Elapsed time: 0.144 s.
[gmic]-1./foo/*repeat/ Elapsed time: 0.144 s.
[gmic]-1./foo/*repeat/ Elapsed time: 0.178 s.
[gmic]-1./foo/*repeat/ Elapsed time: 0.166 s.
[gmic]-1./foo/*repeat/ Elapsed time: 0.144 s.
[gmic]-1./foo/*repeat/ Elapsed time: 0.145 s
Quite, but not quite. Since they can make images disappear and reappear, they are useful for building complex commands. However, they are slower than holding on to the original in the image stack, especially when the command is iterative like rolling_guidance. That is, if you place toc outside of the repeat loop, all of the elapsed times would add up quickly.
Speaking of which, remember my comment about toc? Say I wanted to time the command at various junctions. I would want to use one tic at the beginning and multiple tocs following, but that is not the current behaviour. Take this extreme but short example:

gmic repeat 51 tic done repeat 50 sp tiger toc done toc q

I would have to prepopulate the tics ahead of time.
Made some progress on the guided filter. Rewrote the self-guided and custom guided variants from scratch. Kept them simple this time. If you are keen on the stats: so far the code is 946 characters long, including formatting, and comprises 3 sub-commands. In the future I might break them into even smaller commands, as they have other applications.
Before writing the one command to rule them all, I will go through steps 3-5 for each sub-command to make them user-friendly on their own.
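Since people have asked how it works: the core algebra from He et al.’s paper is short enough to sketch. This is a toy 1-D self-guided version in Python, not my G’MIC code, but the steps are the same ones the sub-commands perform:

```python
# Toy 1-D self-guided filter following He et al.'s equations:
#   a_k = var_k / (var_k + eps)
#   b_k = (1 - a_k) * mean_k
#   q_i = mean(a)_i * I_i + mean(b)_i

def box_mean(v, r):
    """Mean over a window of radius r, clamped at the borders."""
    n = len(v)
    return [sum(v[max(0, i - r):min(n, i + r + 1)])
            / (min(n, i + r + 1) - max(0, i - r)) for i in range(n)]

def guided_self(I, r, eps):
    mI  = box_mean(I, r)
    mII = box_mean([x * x for x in I], r)
    var = [m2 - m * m for m2, m in zip(mII, mI)]
    a   = [v / (v + eps) for v in var]
    b   = [(1 - ai) * mi for ai, mi in zip(a, mI)]
    ma  = box_mean(a, r)
    mb  = box_mean(b, r)
    return [mai * x + mbi for mai, x, mbi in zip(ma, I, mb)]

signal = [0.0] * 8 + [1.0] * 8        # a step edge
smoothed = guided_self(signal, r=2, eps=0.1)
# Flat regions get smoothed while the step mostly survives: that is
# the edge-preserving property everything else builds on.
```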
Took a detour from the guided filter again. This time I have been working on making afre_hnorm ready for release. I may extend those commands later. In particular, a year ago, I discussed how the edge is uneven, being too bright or thick in some parts. There is also the amazing phase congruency (and its many applications), but I might never get to it. Currently, I have simpler ideas that I may explore soon.
In other news, afre_y50 is being extra fussy right now about there being exactly 3 channels, which gets in the way of general use; i.e., if RGB, then do the command, else norm, or something like that. I will make the appropriate changes soon.
Update: Barely made any progress. Been busy and exhausted. Incoming discussion.
Guided filter Besides getting it ready, I have realized that it is slow for normal use, a limitation of G’MIC scripting, among other factors. Radii of 1-2 are very fast, but larger radii are much slower. Without optimizing or compromising the code (or writing and compiling C++ code), I might decide to extend the core filter with an iterative (edge-preserving1) version.
1 Long-term work in progress.
Edge detection gradient_norm is surprisingly elegant compared to other methods, but as I have noted in other posts, strong edges aren’t only brighter but also thicker. A map of lines with even brightness and thickness would be ideal for pixel weighting. There are other approaches to edge preservation, but they are difficult to understand and implement properly.
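The thickness problem is easy to see even in one dimension. A sketch in Python of what gradient_norm essentially computes (the real command works on full 2-D/3-D images):

```python
# Magnitude of the central-difference gradient. Note how a single
# 1-pixel step produces a response on BOTH sides of the edge, which
# is one reason strong edges come out thick.

def gradient_norm_1d(v):
    """Central-difference gradient magnitude of a 1-D signal."""
    n = len(v)
    g = []
    for i in range(n):
        left  = v[max(0, i - 1)]
        right = v[min(n - 1, i + 1)]
        g.append(abs(right - left) / 2.0)
    return g

step = [0, 0, 0, 1, 1, 1]
print(gradient_norm_1d(step))   # [0.0, 0.0, 0.5, 0.5, 0.0, 0.0]
```

One edge, two responding pixels: a thinning pass has to pick one of them.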
Local contrast This would depend on the guided filter, as do many filters on my to-do list. Research on this isn’t done; I have much to learn. What I do have is a working filter that is conservative to the point where it behaves like a sharpener.
Sharpen Speaking of which, instead of including my LoG (Laplacian of Gaussian) filter, I have decided to make a new one using FFT. I have one more improvement to make before I am comfortable releasing it, and that requires additional research.
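For reference (this is the textbook LoG idea, not the FFT filter I am working on), the sharpening step looks like this in Python; the overshoot on either side of the edge is the familiar halo:

```python
# LoG sharpening: blur with a Gaussian, take the Laplacian of the
# result, and subtract it from the input.

def gauss3(v):
    """Tiny 3-tap Gaussian blur [1, 2, 1] / 4 with clamped borders."""
    n = len(v)
    return [(v[max(0, i - 1)] + 2 * v[i] + v[min(n - 1, i + 1)]) / 4.0
            for i in range(n)]

def laplacian(v):
    """Discrete 1-D Laplacian with clamped borders."""
    n = len(v)
    return [v[max(0, i - 1)] - 2 * v[i] + v[min(n - 1, i + 1)]
            for i in range(n)]

def log_sharpen(v, amount=1.0):
    return [x - amount * l for x, l in zip(v, laplacian(gauss3(v)))]

edge = [0, 0, 0, 1, 1, 1]
print([round(x, 3) for x in log_sharpen(edge)])
# [0.0, -0.25, -0.25, 1.25, 1.25, 1.0]  <- under/overshoot at the edge
```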
Dehaze Sorry about hyping it up. The haze removal I want to do requires lots of elements to work individually and in tandem, a robust guided filter being one of them. Currently, it is last on my list of priorities.
afre_vigrect has been updated and afre_vigcirc added. Edit: Both are in a PR.
These are my vignette filters, one rectangular and the other circular. They add a vignette that either brightens or darkens the periphery, with control over its size, strength and position. afre_vigrect has an additional blur feature to give it more flexibility, which I don’t think afre_vigcirc requires; let me know how that goes.
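The basic mechanism, sketched in Python (a hypothetical stripped-down falloff in the spirit of afre_vigcirc; the real command’s parameters and curve are its own): multiply each pixel by a gain that ramps from 1 at the centre to 1 ± strength at the periphery.

```python
# Circular vignette gain map: strength < 0 darkens the periphery,
# strength > 0 brightens it. cx, cy position the centre (0..1);
# size scales how far the falloff reaches.
import math

def circular_vignette(w, h, strength=-0.5, cx=0.5, cy=0.5, size=1.0):
    """Return an h x w gain map to multiply into an image."""
    # Normalise distance so the farthest corner sits at 1.0.
    dmax = math.hypot(max(cx, 1 - cx) * w, max(cy, 1 - cy) * h) * size
    gains = []
    for y in range(h):
        row = []
        for x in range(w):
            d = math.hypot(x - cx * w, y - cy * h) / dmax
            row.append(1.0 + strength * min(1.0, d) ** 2)
        gains.append(row)
    return gains

g = circular_vignette(5, 5)
assert g[2][2] > g[0][0]   # centre brighter than corner when darkening
```

The quadratic falloff is just one plausible choice; any monotone curve of the normalised distance would do.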
Hmm, adding shaped vignettes seems doable. I’ll attempt that.
@David_Tschumperle By the way, I’m gonna make a pull request. Requesting manual pull request insertion.
Reason - Pull requests get lost, as in the case of my changes in a pull request on @Joan_Rake1’s gmic file. You can see that in a pull request I have made, which includes a link to the lost pull request. So, since there are going to be 2 pull requests…
Both vignetting filters are in a PR available now. Clarification: both filters have GUI and CLI versions.
Will do that in the future, but currently I am more interested in putting up practical filters. The fun versions will come after I am done with the core commands and filters (which are many).
What shapes would interest you?
Personally, I would be interested in real vignetting examples of lenses and camera + lens + hood (mis)matches.
PS I finally used a bit of git and learnt to do

git push -f origin last_known_good_commit:branch_name

No more clumsily deleting the repository and re-forking it after a mistake.
Potential afre_vig* features
1. afre_vigcirc could go into a GUI multi-filter.
2. Could add other transformations such as rotation and shearing.
3. Could generalize the rectangle and circle to a polygon and ellipse, respectively.
4. Could give users the option to submit their own shape for manipulation.
Looking at the pull request, what’s with the ^_^ face? I’m confused about that.
Been working on improving gradient_norm for my edge-detection purposes. afre_edge (CLI, GUI) will do a fairly good job of thinning the edges and evenly brightening them.
afre_box (CLI) has been added to my core set of commands. afre_edge depends on it, and so will the guided filter.
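Why a box filter is worth having as a shared building block (a sketch of the generic technique in Python, not afre_box’s actual code): with a prefix sum, a box mean costs the same no matter the radius, which is exactly what an iterative guided filter wants. A 2-D box mean then follows by running this along rows and then columns, since the box is separable.

```python
# 1-D box mean of radius r via prefix sums: O(n) total work,
# independent of the radius, with windows clamped at the borders.

def running_box_mean(v, r):
    n = len(v)
    prefix = [0.0]
    for x in v:
        prefix.append(prefix[-1] + x)        # prefix[i] = sum of v[:i]
    out = []
    for i in range(n):
        lo, hi = max(0, i - r), min(n, i + r + 1)
        out.append((prefix[hi] - prefix[lo]) / (hi - lo))
    return out

print(running_box_mean([1, 2, 3, 4, 5], 1))   # [1.5, 2.0, 3.0, 4.0, 4.5]
```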
Introduced a smiley face in the output. As with the error face, I want to insert some charm into an otherwise boring CLI. If Ed could do it I could do it too. (Not that I wanted to make the reference, but it fits. Speaking of which, a live-action version of Cowboy Bebop is in development. I wonder if it would be any good. Redoing or renewing classics is controversial at best.)
These changes are in a PR right now and will be available soon. Edit: Just increased the input range. Bonus: Try turning up details and see what happens. With the right combination, you get a web-like appearance.
I think I might just use it for another pdn filter replica. Thank you.