2024-02-16 08:52 AM - edited 2024-02-16 09:34 AM
int intvalue_ = 23;
std::string stringvalue_ = "Hello";
Unicode::snprintf(textAreaBuffer, TEXTAREA_SIZE, "%d %s", intvalue_, stringvalue_);
textArea.setWildcard(textAreaBuffer);
Unicode::UnicodeChar a[] = {0x00B0,0};
uint16_t degree = 28;
Unicode::snprintf(textArea1Buffer, TEXTAREA1_SIZE, "%d %s", degree, a);
textArea1.invalidate();
2024-02-16 09:58 AM
Hi JN
In your second try, set the wildcard range to, for example, 0x20-0xB0; right now it is missing the space (0x20) and the degree sign (0xB0). Then it should work.
In the first trial, Unicode::snprintf can only interpret 16-bit strings correctly, so you must first copy your 8-bit char string into a UnicodeChar array with Unicode::strncpy. Like this:
int intvalue = 23;
Unicode::UnicodeChar stringvalue[10];
Unicode::strncpy(stringvalue, "Hello",10);
Unicode::snprintf(textAreaBuffer, TEXTAREA_SIZE, "%d %s", intvalue, stringvalue);
textArea.invalidate();
Note that you can also do it like this when you only need to put free text into the wildcard:
Unicode::strncpy(textAreaBuffer, "Hello", TEXTAREA_SIZE);
textArea.invalidate();
Br JTP