Convert from char array to 16-bit signed int and 32-bit unsigned int

Andy Mills

I'm working on some embedded C for a PCB I've developed, but my C is a little rusty!

I'm looking to do some conversions from a char array to various integer types.

First Example:

[input]        " 1234" (note the space before the 1)
[convert to]   (int16_t) 1234

Second Example:

[input]        "-1234"
[convert to]   (int16_t) -1234

Third Example:

[input]        "2017061234"
[convert to]   (uint32_t) 2017061234

I've tried playing around with atoi(), but I don't seem to be getting the result I expected. Any suggestions?

[EDIT]

This is the code for the conversions:

char *c_sensor_id = "0061176056";
char *c_reading1 = " 3630";
char *c_reading2 = "-24.30";

uint32_t sensor_id = atoi(c_sensor_id); // comes out as 536880136
uint16_t reading1 = atoi(c_reading1); // comes out as 9224
uint16_t reading2 = atoi(c_reading2); // comes out as 9224
Lundin

A couple of things:

  • Never use the atoi family of functions: they have no error handling and may even crash when the input format is bad. Use the strtol family of functions instead (an error-checking sketch follows after the example below).
  • Both function families are somewhat resource-heavy for a resource-constrained microcontroller, so you might have to roll your own version of strtol (a minimal sketch of that is at the end of this answer).

Example:

#include <stdint.h>
#include <stdlib.h>
#include <stdio.h>
#include <inttypes.h>

int main(void)
{
  const char* c_sensor_id = "0061176056";
  const char* c_reading1  = " 3630";
  const char* c_reading2  = "-1234";

  c_reading1++; // fix the weird string format
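                // (optional: strtoul skips leading whitespace by itself)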

  uint32_t sensor_id = (uint32_t)strtoul(c_sensor_id, NULL, 10);
  uint16_t reading1  = (uint16_t)strtoul(c_reading1,  NULL, 10);
  int16_t  reading2  = (int16_t) strtol (c_reading2,  NULL, 10);

  printf("%"PRIu32 "\n", sensor_id);
  printf("%"PRIu16 "\n", reading1);
  printf("%"PRId16 "\n", reading2);

}

Output:

61176056
3630
-1234
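
The example above passes NULL as the endptr argument, so it still accepts bad input silently. A minimal sketch of what error checking with strtol could look like, using the standard endptr/errno mechanism; the helper name parse_i16 and the test input are just illustrative:

#include <errno.h>
#include <inttypes.h>
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

// Illustrative helper: parse a decimal int16_t and report failure
// instead of silently returning garbage.
static bool parse_i16(const char* str, int16_t* out)
{
  char* end;
  errno = 0;
  long val = strtol(str, &end, 10);

  if (end == str)                         return false; // no digits found
  if (errno == ERANGE)                    return false; // overflowed long
  if (val < INT16_MIN || val > INT16_MAX) return false; // doesn't fit in int16_t
  if (*end != '\0')                       return false; // trailing junk, e.g. "-24.30"

  *out = (int16_t)val;
  return true;
}

int main(void)
{
  int16_t value;
  if (parse_i16(" 3630", &value))
    printf("parsed: %"PRId16 "\n", value);
  else
    printf("bad input\n");
}

Note that with this check, an input like "-24.30" is rejected because of the trailing ".30", rather than being silently truncated to -24.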

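If strtol itself is too heavy for the target, a hand-rolled decimal parser can be very small. A minimal sketch, assuming plain ASCII input with an optional sign and deliberately omitting overflow checking to stay tiny (the name tiny_atoi32 is illustrative):

#include <stdint.h>

// Illustrative hand-rolled replacement for strtol on a small target:
// optional leading spaces, optional sign, decimal digits only.
static int32_t tiny_atoi32(const char* s)
{
  while (*s == ' ')                 // skip leading spaces, as in " 3630"
    s++;

  int32_t sign = 1;
  if (*s == '-')      { sign = -1; s++; }
  else if (*s == '+') { s++; }

  int32_t result = 0;
  while (*s >= '0' && *s <= '9')    // stop at the first non-digit
    result = result * 10 + (*s++ - '0');

  return sign * result;
}

With the inputs from the question, tiny_atoi32(" 3630") yields 3630 and tiny_atoi32("-1234") yields -1234; the result can then be cast to int16_t or uint32_t as in the example above.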