[gmic]./ Start G'MIC interpreter (v.3.3.2).
[gmic]./run/__run/ *** Error *** Item substitution '{`expr(11+x,256)`}': Function 'expr()': First argument has invalid type 'scalar' (should be 'vector'), in expression 'expr(11+x,256)'.
I got similar results with PowerShell after more testing. I also found out that the encoding used is UTF-8 without a BOM. Maybe it has to do with this, but I know almost nothing about encodings. After looking around, I don't see a solution to this here. However, G'MIC works as expected with the GUI interface with regard to text-related things.
On Linux, the terminal/CLI moved to UTF-8 quite a while ago (thanks in part to that useful € character). Control characters are very often ignored (as far as I can tell, only characters 8-13 have a visible and recognizable effect on the output).
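For reference, the codes in that 8-13 range are the classic terminal-affecting control characters. A quick Python check (my own illustration, nothing G'MIC-specific) lists them:

```python
# Control characters 8-13 and their usual terminal effects.
names = {
    8: "backspace (BS)",
    9: "horizontal tab (HT)",
    10: "line feed (LF)",
    11: "vertical tab (VT)",
    12: "form feed (FF)",
    13: "carriage return (CR)",
}
for code, name in names.items():
    # repr() shows the escape form, e.g. chr(10) -> '\n'
    print(f"{code}: {name} -> {chr(code)!r}")
```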
'char' is used to convert a string character into its numerical representation. The backtick tells G'MIC to treat numbers as if they were characters instead of numbers. It gives me this.
No remedy, unfortunately, unless someone makes a really dedicated terminal for it.
In the GUI plugin, wrapping the code in code[local] and changing echo to status should show that it works as expected.
Edit: After learning some more, I have to conclude that G'MIC needs to drop UTF-16 for encoding/decoding and switch to UTF-8 instead. There's no terminal that supports UTF-16.
Could also be UCS-2, the difference being that UCS-2 can only encode 65,536 characters (every character on 16 bits), while UTF-16 uses a surrogate mechanism, similar in spirit to UTF-8's multi-byte sequences, to encode more code points.
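To make the difference concrete, here is a small Python sketch (my own illustration, not part of G'MIC): a BMP character like é fits in a single 16-bit code unit, while a character outside the BMP such as 😀 (U+1F600) needs a surrogate pair in UTF-16, which plain UCS-2 cannot represent:

```python
# 'é' (U+00E9) is inside the Basic Multilingual Plane: one 16-bit unit.
bmp = "é".encode("utf-16-be")
print(len(bmp))          # 2 bytes = 1 code unit

# U+1F600 is outside the BMP: UTF-16 needs a surrogate pair (2 units).
astral = "\U0001F600".encode("utf-16-be")
print(len(astral))       # 4 bytes = 2 code units
print(astral.hex())      # d83d de00: high + low surrogate
```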
Some info about how the character encoding is managed in G’MIC.
G'MIC is actually encoding-agnostic, meaning that if you write a command in a file with a certain encoding, the result you see depends on the encoding used for output.
For instance, defining a command file like:
foo :
echo "ééé"
and invoking it from the terminal with gmic foo should output ééé, as long as the encodings used for the file and the terminal match.
G'MIC just loads a command file as a binary file, so it does not really care how the strings are encoded in the command file. All strings are actually stored as char* in the C++ code (sequences of 8-bit bytes, meaning a single "exotic" character can be encoded with several bytes, as in UTF-8 or UTF-16).
For instance, on my Linux, my .gmic is encoded in UTF-8, and the command:
foo :
e {'é'}
e {`'é'`}
prints:
~$ gmic foo
[gmic]./ Start G'MIC interpreter (v.3.3.3).
195,169
é
[gmic]./ End G'MIC interpreter.
So, here, the é character is indeed encoded with 2 bytes: 195,169, a.k.a. 0xC3,0xA9 in hexadecimal, which is consistent with what is indicated in the reference page.
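The same byte values can be cross-checked outside G'MIC; in Python, for instance (just a sanity check, assuming the source text is UTF-8):

```python
# UTF-8 encodes U+00E9 ('é') as the two bytes 0xC3 0xA9.
data = "é".encode("utf-8")
print(list(data))     # [195, 169]
print(data.hex(" "))  # c3 a9
```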
This works well after setting CMD to UTF-8 mode with chcp 65001 and executing foo from a file. The second line still fails to print correctly when entered directly in the terminal.
Also, I get similar printing of characters to @prawnsushi after switching to UTF-8: the 0 is missing, along with tons of other characters.
There seems to be something very, very wrong with my own custom command, and I don’t understand it.
Here's my guess at what happened. When @David_Tschumperle printed the e {`'é'`} command, there were two numbers. My algorithm sorted those additional numbers, which is why my code did not work out and prints the wrong characters at the end. And that's something I don't know how to address.
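If that guess is right, the failure mode is easy to reproduce: sorting the raw bytes of a UTF-8 string tears apart the two-byte pair that encodes é and leaves an invalid sequence. A hedged Python sketch of the idea (not the actual G'MIC code):

```python
# Sorting the raw bytes of a UTF-8 string separates the 195,169 pair
# that encodes 'é', leaving an invalid byte sequence.
raw = "éa".encode("utf-8")   # bytes [195, 169, 97]
broken = bytes(sorted(raw))  # bytes [97, 169, 195]: the pair is split
print(list(raw))
print(list(broken))
# Strict decoding now fails; replacement decoding loses the é entirely.
print(broken.decode("utf-8", errors="replace"))
```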
My command does work in the GUI plugin, however, which further complicates my debugging.