ncurses and data types

In ncurses the function waddstr adds characters to a window. The declaration in curses.h is:

int waddstr(WINDOW *win, const char *str);

Let’s say mystring is a pointer to a character array. The chars can be declared signed or unsigned.

Version 1:

unsigned char mytext[LENGTH];
unsigned char *mystring = mytext;
res = waddstr(mywin, mystring);

Version 2:

char mytext[LENGTH];
char *mystring = mytext;
res = waddstr(mywin, mystring);

The problem is: Version 1 works perfectly, even when mytext contains characters from the upper half of the ISO-8859-1 charset, but (obviously) it generates gcc compiler warnings.

Version 2 gives no warnings; the compiler is happy. BUT characters above the ASCII range are displayed inverse and blinking (which is not what I want).

Does anyone have an idea how to do it properly (no warnings, correct display)?

I am not an expert here, but could it be that ncurses is still strongly tied to the era when terminals were ANSI (or ASCII) terminals, which among other things means that only the ASCII range of characters is supported? I have an HP character terminal here (a glass TTY) that does ANSI and VT…, and I bet it can only show ASCII. It does of course support ncurses (or should that be worded the other way around?). Thus some of the hardware supported by ncurses is not even able to show the characters you want to send. The fact that you use other hardware (an emulated glass TTY on a PC monitor) does not alter this.

If that is true, then the fact that you found a way to send unsupported characters that are shown as you wanted them is nice for you, and I think you should swallow the warnings. They are only warnings: you have been warned and you accept the risks :slight_smile:

Ncurses is quite new (well, I mean the latest release is) and supports the whole character set. What I guess happens here (and this is a wild guess only) is that the character is somehow converted to int, and when I use signed char this gives negative numbers for the upper half of the charset.

The bottom line is: I am doing something terribly wrong, but I can’t find what it is. Other apps seem to get it right (take dialog as an example), and I looked at the code but still had no luck seeing what the trick is. Thanks for the help.

As I said, I am not an expert, but while ncurses may be new (the n stands for ‘new’) and there may be a newer version, it implements a much older standard. And again, I do not know whether your intended use fits into that standard.

waddstr is equivalent to calling waddch for each character of the string. waddch takes an unsigned integer parameter, and its upper (most significant) part is used for determining display attributes (such as blinking, underlining, etc.).
[Just read from the docs]

Ok, but waddstr just gets a char, and that’s much less info than an unsigned int. When I look into curses.h (line 737 ff.) I see that waddstr is a macro using waddnstr, and waddch is implemented in the library. I am more confused than before.

I too am confused about this. Anyway, a negative char value gets sign-extended when it is widened, so all the bits above the low eight become 1; those bits fall into the attribute part of the value, and that switches display attributes on.
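The sign extension described above can be sketched in plain C. This is only an illustration: chtype is stood in here by unsigned long, an assumption that matches common curses.h definitions, and the names are made up for the demo.

```c
#include <stdio.h>

/* Widen a signed char the way implicit promotion to int does. */
unsigned long widen_signed(signed char c)
{
    return (unsigned long)(long)c;   /* sign extension for negative values */
}

/* Small demo: the high bits that end up set sit exactly where a
   chtype keeps its attribute flags, which is why the terminal
   shows reverse video and blinking. */
void demo(void)
{
    unsigned long ch = widen_signed((signed char)0xFC);  /* 'ü' in ISO-8859-1 */
    printf("attribute bits: %#lx\n", ch & ~0xFFUL);      /* non-zero! */
    printf("character bits: %#lx\n", ch & 0xFFUL);       /* 0xfc */
}
```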

Hmm maybe you should be using waddwstr instead?
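For completeness, a rough sketch of what the waddwstr route would look like, assuming the wide-character ncursesw library is available and a suitable locale is set. to_wide is a hypothetical helper name; only the narrow-to-wide conversion step is shown as runnable code, since the display call itself needs a terminal.

```c
#include <locale.h>
#include <stdlib.h>
#include <wchar.h>

/* Hypothetical helper: convert a narrow string to wide characters.
   The actual display call would then be waddwstr(mywin, wbuf),
   which requires linking against ncursesw. */
int to_wide(const char *narrow, wchar_t *wbuf, size_t n)
{
    setlocale(LC_ALL, "");               /* honour the user's locale */
    size_t len = mbstowcs(wbuf, narrow, n);
    if (len == (size_t)-1)
        return -1;                       /* invalid multibyte sequence */
    return (int)len;
}
```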

I don’t have a standard reference at hand, but I’m pretty sure that even though C has a type called “char”, there is no real character type in C. char is in most implementations (practically every one I have encountered) just an 8-bit integer. Some functions use int instead to represent a character. When you decide to use a signed char, you have to consider that the most significant bit, which is set in the upper half of the 256-character range, is the sign bit.

So for example the German umlaut ‘ü’ has the hex value 0xFC, which is “11111100” in bits. In an unsigned char this has the correct integer value of 252, but as a signed char it has the value -4. When you now convert to int (which is implicitly signed), the bit representation changes, because the integer value has to stay the same as before. The resulting value will not be 252 (“00000000 11111100”) but -4 (“11111111 11111100”), and that leads to a wrong character value.

That’s the… ummh… elegant simplicity of C. :wink:
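The two widenings can be checked with a couple of lines of plain C. A sketch only: the conversion of 0xFC to signed char is strictly implementation-defined, but yields -4 on the two's-complement machines in question, and the helper names are invented for the demo.

```c
/* Widen one byte to int, once via unsigned char, once via signed char. */
int as_int_u(unsigned char c) { return (int)c; }
int as_int_s(signed char c)   { return (int)c; }

/* Example: 0xFC ('ü' in ISO-8859-1)
 *   as_int_u((unsigned char)0xFC)  ->  252
 *   as_int_s((signed char)0xFC)    ->   -4   (sign-extended)
 */
```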

what exactly is the warning given when using unsigned char? You could also use an int array to store your text; then you don’t have to convert…

what exactly is the warning given when using unsigned char?

This is the warning:

warning: pointer targets in passing argument 2 of 'waddnstr' differ in signedness

Note that waddstr is a macro using waddnstr. Your explanation of what happens seems valid. Most probably waddnstr converts the characters to the type ‘chtype’, which ncurses uses to hold a character and its attributes. This (still) works for unsigned char but generates a compiler warning.

The correct solution is probably not to use waddstr anymore but to code a loop calling waddch, which accepts a chtype argument.
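A sketch of that loop, where the key step is casting through unsigned char before widening so that no sign extension occurs. char_to_chtype is a hypothetical helper name, and chtype is stood in by a local typedef so the sketch compiles without curses.h.

```c
/* Stand-in for ncurses' chtype, assumed to be an unsigned integral
   type as in typical curses.h definitions. */
typedef unsigned long chtype_t;

/* Hypothetical helper: widen one char without sign extension by
   casting to unsigned char first. */
chtype_t char_to_chtype(char c)
{
    return (chtype_t)(unsigned char)c;
}

/* In the real program the loop would then be:
 *
 *     for (const char *p = mystring; *p; ++p)
 *         waddch(mywin, (chtype)(unsigned char)*p);
 */
```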

What about looking at the source code of yast-ncurses or other ncurses programs (tutorial examples)? I’m pretty sure there has to be a simple and non-dirty solution. For me, a char is just the type I use as the native integer type when working with 8-bit microprocessors. :slight_smile:

What about looking at the source code of yast-ncurses or other ncurses-programs (tutorial examples)?

That is exactly what I did lol! And it led me to the conclusion to avoid waddstr unless you are dealing with pure ASCII. I am fixing some 10-year-old code here which was written when compilers didn’t even throw warnings for this kind of stuff (unless you turned them on explicitly).

Anyway, it now compiles smoothly and cleanly, and it works. My thanks go to everyone who helped with this quite off-topic issue.