Converting char[] pulled from keyboard to a uint16_t

DElli.1
Associate III

I know this seems like a simple question, but for whatever reason this is what is giving me the most trouble today.

Here is what I am working with:

Unicode::UnicodeChar keyboardBuffer[7][3];

Unicode::UnicodeChar* buff = keyboard.getBuffer();

Unicode::strncpy(keyboardBuffer[keyboardSelection], buff, 3);

I am copying the keyboard buffer into keyboardBuffer[keyboardSelection].

But I need to convert this character array into an int so I can use it to properly set the time on my device and check whether the value is within range.
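A minimal sketch of that conversion and range check, assuming the entered value is an hour that must fall within 0-23, and using the Unicode::atoi call that the edits below arrive at:

// Sketch only: keyboardBuffer[keyboardSelection] is assumed to be null-terminated.
int value = Unicode::atoi(keyboardBuffer[keyboardSelection]);
if (value >= 0 && value <= 23)
{
    // value is in range and safe to use when setting the time
}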

I have searched Google and tried several different methods, but each one returns an error. I assume I am doing something wrong, which is why I am here asking for suggestions.

I am also using STM32CubeIDE and do not believe it is using C++11, as I am unable to use atoi.

I have also tried stringstream with this:

std::stringstream str(keyboardBuffer[keyboardSelection]);

to no avail.

Help would be appreciated, as I am sure I am just missing something simple.

Thank you in advance.

EDIT:

It turns out I was simply using atoi wrong.

I have now done this:

Unicode::atoi(buff);

  Unicode::snprintf(hourTextBuffer, 10, "%02d", buff);

  hourText.invalidate();

but the number displayed is not correct.

EDIT2:

Problem Solved!

It ended up being very easy: snprintf needs the int returned by Unicode::atoi, not the buffer pointer itself.

  Unicode::snprintf(hourTextBuffer, 10, "%02d", Unicode::atoi(buff));

  hourText.invalidate();
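For completeness, a sketch of the whole flow with the range check from the original question folded back in (the 0-23 bound is an assumption for an hour field):

  Unicode::strncpy(keyboardBuffer[keyboardSelection], keyboard.getBuffer(), 3);
  int hour = Unicode::atoi(keyboardBuffer[keyboardSelection]); // UnicodeChar* -> int
  if (hour >= 0 && hour <= 23)                                 // assumed valid hour range
  {
      Unicode::snprintf(hourTextBuffer, 10, "%02d", hour);
      hourText.invalidate();
  }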

2 REPLIES
Piranha
Chief II

You are missing a very simple thing: the description of the particular errors!

I didn't see the need, since I assumed that if it was possible to convert a UnicodeChar* to an integer, it would be explained to me.

When I use keyboardBuffer[keyboardSelection] in any atoi or stringstream call, I am greeted with

conversion from x to x requested

and the compilation fails.