```go
input := []uint{1, 2, 3, 4, 5, 6}
o := C.fixU32_encode((*C.uint)(unsafe.Pointer(&input[0])), C.size_t(len(input)))
return C.GoString(o)
```
```c
char* fixU32_encode(unsigned int* ptr, size_t length);
```
```rust
pub extern "C" fn fixU32_encode(ptr: *const u32, length: libc::size_t) -> *const libc::c_char {
    assert!(!ptr.is_null());
    let slice = unsafe { std::slice::from_raw_parts(ptr, length as usize) };
    println!("{:?}", slice); // prints [1, 0, 2, 0, 3, 0]
    println!("{:?}", length);
    let mut arr = [0u32; 6];
    for (&x, p) in slice.iter().zip(arr.iter_mut()) {
        *p = x;
    }
    CString::new(hex::encode(arr.encode())).unwrap().into_raw()
}
```
The slice above is passed in, but the array that Rust receives looks like this: `[1, 0, 2, 0, 3, 0]`.
CodePudding user response:
In Go, a `uint` is 64-bit on 64-bit platforms (see https://golangbyexample.com/go-size-range-int-uint/). As a result, you are storing 64-bit integers in `input`.

The C and Rust code then treat `input` as 32-bit unsigned integers (in little-endian format). So the first input value, 0x1, stored as 64 bits:
00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000001
is read as the two 32-bit values 0x1 and 0x0. Because of little-endian byte order, the least significant word comes first.
To fix this, be explicit in Go and use the 32-bit `uint32` type, or ensure your C code matches Go's machine-dependent integer types.
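For example, the call site from the question could declare the slice as `[]uint32` so each element is exactly 4 bytes, matching the `unsigned int*` in the C declaration (assuming `unsigned int` is 32-bit on the target, as it is on common platforms). This fragment is not standalone-runnable, since it depends on the question's cgo setup and Rust library:

```go
// Elements are now exactly 32 bits wide, matching the C side's unsigned int.
input := []uint32{1, 2, 3, 4, 5, 6}
o := C.fixU32_encode((*C.uint)(unsafe.Pointer(&input[0])), C.size_t(len(input)))
return C.GoString(o)
```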