G'MIC Tutorial Fragments

Thread: Trial examples that may or may not find their way into tutorials.
Possible tutorial: using -name to keep track of items in a longish pipeline.

Ghost Cat

gmic                                                \
   -sample cat,720                                  \
   +luminance.                                      \
   -name. graycat                                   \
   +diffusiontensors. 0,1,0.1,8                     \
   -eigen.                                          \
   -fill.. 'c==0?1:0'                               \
   +fill.. 'c==0?0:1'                               \
   [-2]                                             \
   -eigen2tensor[-4,-3]                             \
   -name... smooth0                                 \
   -eigen2tensor[-2,-1]                             \
   -name. smooth1                                   \
   '(-1,-1,-1;-1,8,-1;-1,-1,-1)'                    \
   -name. semiairy                                  \
   +convolve[graycat] [semiairy]                    \
   -name. convolvecat                               \
   -remove[semiairy]                                \
   -sqr[convolvecat]                                \
   -threshold[convolvecat] '{0.025*iM}'             \
   [convolvecat]                                    \
   -name. convolvecat2                              \
   -repeat 3                                        \
      -smooth[convolvecat] [smooth0],60             \
      -smooth[convolvecat2] [smooth1],20            \
   -done                                            \
   -remove[graycat,smooth0,smooth1]                 \
   100%,100%,1,1,'(i#-1+i#-2)/2'                    \
   -append[-3--1] c                                 \
   -name. ghostcat                                  \
   -fill[ghostcat] 'rot(vector3(1/sqrt(3)),120)*I'  \
   -blur_bloom. 30,3,10,+,0,0,c                     \
   -normalize[ghostcat] 0,255                       \
   -output[ghostcat] ghostcat.png

Looong but has the attraction of an interesting result.

I think it would be useful to help people understand how to apply theory, convert it into a gmic script, and optimize it. Really, most of my own script development is about theory and figuring out how to code with that theory in mind. It can be hard, though, since you need to know how the math works out.

Quite often, that is the heart of the matter. Thank you for kicking that can up front.

Sometimes, people are not so much interested in -foo as they are in “How do you do that criss-cross thing, whatever it is?”

Whatever that “criss-cross” thing is, it arises from the Cookbook article Eigenvalues and Eigenvectors. The gist of that page is: unroll the label “tensor field”, wherever you may read it, to “box of ellipses, one for each pixel”.

Why care about ellipses?

Theory: In the immediate neighborhood of each pixel, an ellipse’s semi-major axis documents the direction of fastest intensity change, that direction encoded by the grayscale channels Cosine and Sine. Channel EigenOne (semi-major axis length) encodes that intensity change “in the direction of the gradient (λ1)”. Channel EigenTwo (semi-minor axis length) encodes the magnitude of a vector always rotated 90° from EigenOne, a magnitude “in the direction of the contour (λ2).” The eccentricity, e, follows from:

    e = √(1 − (λ2 / λ1)²)

so a very eccentric ellipse (e nearly one) straightaway indicates that, in the immediate neighborhood of the pixel, there is a very steep gradient with respect to the contour: i.e., the pixel is on an “edge”, while a very circular ellipse (e nearly zero) indicates a pixel sitting on a “plain”. Note that e is just an off-to-one-side calculation for visualization purposes. When we flip the roles of λ1 and λ2 – as we are about to do – we will be dividing by zero only insofar as this relation is concerned – but we aren’t using this relation. Making tensors does not care about eccentricity. So sleep tight through the next dangerous curve.
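
For the curious, here is one way to look at e directly. This is only a sketch - the sample and the -diffusiontensors parameters are simply lifted from the script above - and it writes e into a separate one-channel image (named eccentricity here, an arbitrary choice) so the eigenvalue image is left untouched:

gmic -sample cat,400                               \
   -luminance.                                     \
   -diffusiontensors. 0,1,0.1,8                    \
   -eigen.                                         \
   100%,100% -name. eccentricity                   \
   -fill[eccentricity] 'ev1=i(#0,x,y,0,0);ev2=i(#0,x,y,0,1);ev1>0?sqrt(1-(ev2/ev1)^2):0'

Bright pixels in [eccentricity] sit on edges; dark ones sit on plains.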

Recipe: Append EigenOne and EigenTwo into one two-channel image; append Cosine and Sine into a second. -eigen2tensor computes from these two intermediaries the per-pixel tensors encoding this data. Hand off to -smooth or whatever else eats tensor fields for breakfast. In any case, we’re doing Arto Huotari’s “Dream Smoothing” at right angles.

To wit: the script’s game. Ask -diffusiontensors for a smoothing tensor field. -eigen unrolls whatever -eigen2tensor rolls somewhere in the guts of -diffusiontensors. Throw out whatever eigenvalues -diffusiontensors told us about: we just want eccentricity-1.0 ellipses. That is, for purposes of Art, we claim every pixel in the image sits on an edge. Then we flip semi-major and semi-minor axes for the makings of a second tensor field that, when fed to -smooth via -eigen2tensor, effectively operates in directions rotated 90° from the first tensor field - a follow-on consequence of shrinking the semi-major axis to zero while stretching the semi-minor axis to one: the ellipse effectively rotates in place. From the same source imagery (The Cat): drive one channel using one tensor field ([smooth0]); drive the second channel using the second tensor field ([smooth1]). Color and filter those channels however you want.
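
Distilled to a sketch (sample size, smoothing amplitudes and image names are arbitrary; the color play is left out), the whole criss-cross game is: one eigenvector field, two hand-built eigenvalue images, two tensor fields, two right-angle smooths of the same source:

gmic -sample cat,400                               \
   +luminance. -name. gray                         \
   +diffusiontensors. 0,1,0.1,8                    \
   -eigen.                                         \
   -fill.. 'c==0?1:0'                              \
   +fill.. 'c==0?0:1'                              \
   [-2]                                            \
   -eigen2tensor[-4,-3] -name... field0            \
   -eigen2tensor[-2,-1] -name. field1              \
   +smooth[gray] [field0],60                       \
   +smooth[gray] [field1],60                       \
   -keep[-2,-1]

The two surviving images are the same gray cat smeared along two mutually perpendicular tensor fields; everything else in the Ghost Cat pipeline is framing, coloring and output.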

For my lot, I stacked a blue channel under smooth0 (red) and smooth1 (green) and impressed it with the averaged intensities of both, just to get a very rough magenta/cyan palette, then performed a +120° color space rotation around the white axis to align the color climate more to what I had in mind. -blur_bloom to desaturate-by-smear the color channels was an odd choice, perhaps. I was intrigued by its non-linear “blooming blur.” In any case, it gave me a couple of off-center grays, one kissed by purple, the other by sea-green, that fit my notion of ‘ghost-like’. Take that where you will.

So that’s the criss-cross thing. Easy stew to make once the rabbit is in the pot. First: get the rabbit. Bugs Bunny rarely cooperates with us Elmer Fudd tutorial writers. Back in the day, I was a long time poking around tensor fields before I could do anything interesting with them. The heart-and-soul of theory-first, dress-it-up-in-G’MIC-commands second, is a long tutorial row to hoe. I’ve only written a few tutorials of that ilk. Some more to come some day.

Before closing, there is another bit of theory, quite apart from the criss-cross thing, that (I think) could amuse. The Cat, you will readily note, was taken with a drum-head-narrow depth of field - maybe f/1.4? f/1.2? - her face in pin-point focus and everything else, fore and aft, going soft. That’s a use case for an Airy disk, or a rough approximation thereof: a big positive spike surrounded by a shallower negative ring, so scaled that the ring sum just cancels the spike - net energy: zero. Convolve that Airy disk over edgy (in-focus) regions to get humongous +/- coefficients. Soft focus: not so much. Square and threshold-cut just to get the in-focus face framed by nothing at all. A ghost cat peering in from the void. Keep an Airy disk in your kit for just these tight depth-of-field images.
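
Pulled out of the pipeline and run on its own (the sample size is arbitrary; the other parameters are as in the script), the in-focus mask step looks like this. The 3×3 kernel is the rough Airy stand-in: a center spike of 8 ringed by eight -1s, summing to zero, so regions of constant intensity convolve to nothing while sharp edges ring loudly:

gmic -sample cat,400                                \
   +luminance. -name. graycat                       \
   '(-1,-1,-1;-1,8,-1;-1,-1,-1)' -name. semiairy    \
   +convolve[graycat] [semiairy]                    \
   -sqr. -threshold. '{0.025*iM}'                   \
   -keep.

What survives the 2.5%-of-maximum threshold cut is the in-focus face; the soft fore-and-aft background goes to zero.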

That’s it for now. Have fun. Thank you for the comments. Back to the tutorials.

Possible Beginner’s Cookbook entry: G’MIC for Python coders.
@myselfhimself has written a Python module that feeds G’MIC command lines to the G’MIC interpreter:

import gmic
gmic.run("sp car,300 b. 10 n. 0,255 o. myblurredcar.png")

puts the file myblurredcar.png in the current working directory. A Pythonista may also keep the image in the Python context by furnishing a list as a second parameter.

myblurredcarlist = list()
gmic.run("sp car,300 b. 10 n. 0,255 o. myblurredcar.png", myblurredcarlist)
print(len(myblurredcarlist))   # prints: 1

The purpose of the proposed Cookbook article is not to teach Python folk how to use the module. @myselfhimself has done that. See his Quickstart Guide.

The purpose of the proposed Cookbook article would be to assist Python folk in composing the command line between the double quotes, because in short order they will want to do much, much more than blur Ford coupes. And they can. By lots. But they have to go through a paradigm shift, and it is a complex one (sorry, David, but it’s true…).

Back history: I started with Linux (Fedora, Red Hat then) in 2010 after moving off of SGI Irix. I discovered gmic in short order, as I did ImageMagick. Both were part of that distro. Couldn’t do much with gmic for - oh, about a year and a half. Lotsa black pictures in Gimp. The genesis of the G’MIC documentation arose from my wont of writing out things I don’t understand. I was able to dump ImageMagick for gmic in early 2013 after ImageMagick quirkiness infuriated me. But it took two and a half years to do that. That’s a long time to get over a hump.

The takeaway from that experience, though, was getting a “mental model” of G’MIC: its paradigm. I wish I’d had it up front. I would have been up to a useful speed with G’MIC in three or four months.

The model which eventually grew up in my head gave rise to these articles:

  1. Basics
  2. Images
  3. Selections and Command Decorations
  4. Images have edges - Now What?
  5. Conjuring Images out of the Aether and Other Generators

The Python Cookbook article would use these articles as a basis and attempt to harness Python objects that bear a strong similarity to G’MIC objects as a means of building up the G’MIC paradigm in a “Python-like” way. In that way, I hope to build the ‘grammar’ of what to put between those double quotes when the Python programmer invokes gmic.run("<some stuff...>").

There is enormous potential in working with the two paradigms in parallel. My aim, quite frankly, is to promote dual citizens. The Python image-processing community is perhaps two orders of magnitude larger than G’MIC’s, but I think having G’MIC in their toolkits could move them to useful images quickly - they just need to be able to read the code. They certainly have useful viewpoints, insights - and code snippets! - to serve as a basis for rich, interesting G’MIC filters. If some Python programmers gain G’MIC fluency, the transfer between the two worlds would be enormously useful and healthy for both.

This probably would be one of the more interesting, but intricate, cookbook articles to write. Especially in choosing Python objects that serve as good illustrations of G’MIC constructs. Feedback welcome - and needed. Thank you in advance.

Hello,
I am really grateful for your mentioning the gmic-py documentation, because I have had only a little feedback about it.

I must say that I am quite unfamiliar with most of the gmic shorthand you use (o. and so on). Your 1.–5. links are pages I have hardly ever read, and I am thankful that you mention them as a starting point, because I had little idea which had the highest priority… After 1.5 years of full-time-to-occasional G’MIC, what I have looked at most is the G’MIC commands reference.

Where would you see a Python G’MIC cookbook posted? On the gmic-py documentation website, and/or the gmic.eu website?
I see that you write proficiently and like doing it. I would propose that you take the lead in writing the documentation, in any GitHub repository that you like, and I can reread it and make pull requests.
As far as the gmic-py.readthedocs.org website is concerned, it is written in Sphinx-compatible reStructuredText with an in-house G’MIC Sphinx macro. The documentation is located in this repository’s docs/ directory and is generated by the bash build_tools.bash 6_make_full_doc and bash build_tools.bash 6b_make_doc_without_c_recompilation commands (recompilation is needed only if the gmic-py C++ wrapping code is amended and the API documentation must be regenerated). PRs for that would be welcome… Any commit on the master branch, or on any other branch of that repository, gets rendered to HTML, PDF, etc. by a one-click-configured GitHub job and published to the gmic-py.readthedocs.org website.

Work in any other repository in a non-Pythonic Markdown language is also OK (maybe the gmic markdown language in some gmic.eu or gmic-community repository…).

I would be happy to learn more G’MIC at your side by being somehow pushed into helping with the G’MIC 101-isation for Python people.

At your disposal!

@grosgood I could create a cookbook.rst file in the gmic-py docs directory to start with, in a custom branch or PR if you would fancy that.

Good to hear. Makes a lot of sense to host this cookbook article as an addition to gmic-py.readthedocs.org. That’s where the Python eyeballs are going to be, and if they’re installing gmic, that’s the place they want to read up on what goes into the string fed to gmic.run(). I’ve no problem with reStructuredText. When you write technical documentation, you get exposed to lots of different markups. I started with VAX/VMS troff, myself (showing my age…). Makes sense to push to your repo, then, later on, make a polished push to gmic.eu. For the G’MIC people this is a nice-to-have; for the Python people at gmic-py.readthedocs.org, it is something to help get them off the ground. We can work out the details later. The hard part is the writing and making sure it is sane. You made the bridge modules; your help in writing - even just vetting my notions - would be enormously helpful. Got an MS Teams meeting coming up - gotta tip.

Thank you! Agreed, I will pave the way soon with a new blank file in the gmic-py repository’s docs/ directory and point you to it.

@grosgood here is a related Github issue for writing a G’MIC cookbook for Python developers. I will post commits soon mentioning the issue id (#91), or in a proper independent PR or branch.

The cookbook work-in-progress gmic-py documentation variant is up and rebuilds on each push to the cookbook branch. Technical details on contributing have been added in the related Github issue.

By the way, image names are not well tested within gmic-py; they correspond to the third argument (a Python list) of gmic.run() or gmic.Gmic().run(). A dormant implementation is coded, though, so writing documentation about it may force the fixing of gmic-py. The idea of using a Python dictionary (aka a hash) to tie names (as keys) to images (as values) was discarded because of the packing/unpacking complexity it would bring to the Python C API layer. The goal was to release a simple binding working fast enough, without differing too much from the libgmic C++ API.

Concerning the documentation pages: would it be interesting to convert the documentation pages made by @myselfhimself (for the Python doc) and @PDN_GMIC (for the 8bf installation) into .gmd files (G’MIC-Markdown), so that we can put them directly on the G’MIC website, in the reference documentation? Let me know what you think about the idea.

At first thought I would be OK with that; it is just that the documentation build system should be clear enough to use for local building. At second thought, not so much…
One showstopper with the Python-doc-to-G’MIC-markdown conversion idea is that those tutorials do not just embed simple G’MIC expressions to be interpreted, but pure Python lines of code (e.g. some of them go back and forth with NumPy, PIL, scikit-image module functions, etc.). That level of interpretation makes it possible to paste complex Python scripts and make sure that not just my environment works, but also the readthedocs container, which may also mean the end user’s environment. One tutorial I have in mind is to write a Flask-RESTPlus endpoint in front of a G’MIC-command-based 3D low-relief generator producing .obj files from uploaded images, and have the result rendered in three.js… For this I would need a lot of Python environment freedom (accomplished by writing custom Sphinx macros), or I would have to give up this whole server-side documentation reproducibility in a “simpler” .gmd environment.
Another advantage of the Python doc is that its graphical theme and layout are familiar to Pythonistas. It also has a working built-in search engine for visitors (the Sphinx renderer most probably builds an index of stemmed terms and hands it to a front-end JavaScript library that allows searching without any backend search engine or database running), plus branch switching.
What could be done is to write a converter from the Sphinx-built .html pages to gmic.eu .html or .gmd pages.
So to conclude… as far as the Python doc is concerned, the conversion to .gmd would not be obvious at all.

I believe that @PDN_GMIC’s 8bf installation page could more easily be converted to .gmd :)

A .gmd page could, however, show a static overview of the Python binding’s possibilities (i.e. a diagram) and link to the documentation website. Would you like me to create one such page?

I agree.

@David_Tschumperle OK, I see that the gmic.eu site’s .gmd files are located here.
What is the command to see an offline preview of any .gmd file as HTML?
Found gmd2html.
Would the calling method be gmd2html myfile.gmd? The reference page does not mention a .gmd file argument, and I have not found an example usage yet.
@grosgood I guess you know how it works as well

(just to make a general gmic.eu gmic-py-related help page to point towards the gmic-py readthedocs website)

Everything is explained here: G'MIC - GREYC's Magic for Image Computing: A Full-Featured Open-Source Framework for Image Processing - G'MIC Markdown
(see last section to convert .gmd to .html).

OK thank you!

gmic it input.gmd gmd2html ot output.html

Necessary Topic: Indexing - Using a math expression pixel iterator to drive changes to other (arbitrary) images. Works even when an image is being newly created, as in ...x,y,z,c,<fill math expression> where <fill math expression> references other images on the list via #<index> notation. However, as @Reptorian observes in looking for a gmic python scripter to help me out here, the image-in-the-process-of-being-created is not indexed yet, so relative indexing from the end of the list is still with respect to the last, fully extant, image on the list.
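
A minimal sketch of the wrinkle (sizes arbitrary): with two images on the list, a third image in the making can read both through i#, but #-1 still points at image [1], the last fully extant image, never at the nascent image itself:

gmic 8,8,1,1,'x' 8,8,1,1,'y' 100%,100%,1,1,'(i#-1+i#-2)/2'

The third image fills with (x + y)/2: i#-1 reads the 'y' ramp at image [1], and i#-2 reads the 'x' ramp at image [0].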

Tutorial material on the math expression parser is still very scant, and I don’t think the topic of indexing other images in math expressions has been dealt with outside Mathematical Expressions. See Specific functions, in particular i(), j(), i[], j[], I[], J[], and the next-to-last paragraph in the -fill tutorial, where the notation is covered, but not the delicate question of which peg relative indexing counts from when the math expression is operating on an image in the process of being created.

We also shouldn’t leave out the technique that prompted this entry: creating a new image just for the purpose of being an “iteration controller” for math expressions primarily designed to alter other images on the list. See rep_pfrac in include/reptorian.gmic around line 922.
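
Not rep_pfrac itself, but a toy of the same shape, offered purely as a sketch (sizes and names are arbitrary): the [controller] image exists only to host a math expression. Its own pixel values are throwaway; its real job is writing back into [target] through i(#0,...) used as an l-value, after which it is removed:

gmic 100,100,1,1,'u(255)' -name. target                      \
   100%,100% -name. controller                               \
   -fill[controller] 'i(#0,x,y)=(i(#0,x,y)>128?255:0)'       \
   -remove[controller]

What remains on the list is [target], hard-thresholded in place by an image that never outlived its role as an iteration controller.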