I am trying to use the windows crate to set and get the pixel format descriptor of a window. When I pass None for the ppfd parameter, the return value is reasonable (it appears to be the maximum pixel format index). However, when I pass a pointer to a PFD to fill in, so that I can check what format was actually set by the SetPixelFormat call, I always get Win32 error 87 (ERROR_INVALID_PARAMETER), and the PFD is not modified by the function.
I am new to Rust, so I am unfamiliar with how to pass pointers to C functions, and with the conventions of the windows crate in particular; I may be making a mistake there.
One of the other parameters might be incorrect as well, but they are much simpler and look alright in the debugger: fake_hdc is non-null, nBytes is 40, and the PFD_PIXEL_TYPE argument has 11 as its value, which is what I expect.
When DescribePixelFormat succeeds, I expect the PFD fields to be filled with non-zero values and the return value to be non-zero.
Here is the code with the issue. I'm just trying to get something basic up and running.
let instance = unsafe { GetModuleHandleW(None).unwrap() };
// SECTION: fake window for initializing opengl
{
    let fake_window_class_name: &HSTRING = w!("Fake Window Class");
    let fake_window_name: &HSTRING = w!("Fake");
    let fake_window_class = WNDCLASSW {
        style: CS_OWNDC | CS_HREDRAW | CS_VREDRAW,
        lpfnWndProc: Some(fake_window_callback),
        hInstance: instance,
        lpszClassName: fake_window_class_name.into(),
        ..Default::default()
    };
    let register_result = unsafe { RegisterClassW(&fake_window_class) };
    if register_result == 0 {
        panic!("Failed to register initial window class");
    }
    let fake_window = unsafe {
        CreateWindowExW(
            WS_EX_APPWINDOW | WS_EX_OVERLAPPEDWINDOW,
            fake_window_class.lpszClassName,
            fake_window_name,
            WS_OVERLAPPEDWINDOW | WS_CLIPCHILDREN | WS_CLIPSIBLINGS,
            CW_USEDEFAULT,
            CW_USEDEFAULT,
            CW_USEDEFAULT,
            CW_USEDEFAULT,
            HWND(0),
            HMENU(0),
            instance,
            Some(ptr::null_mut()),
        )
    };
    let fake_hdc = unsafe { GetDC(fake_window) };
    assert!(!fake_hdc.is_invalid());
    let pfd = PIXELFORMATDESCRIPTOR {
        nSize: size_of::<PIXELFORMATDESCRIPTOR>() as u16,
        nVersion: 1,
        dwFlags: PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER,
        iPixelType: PFD_TYPE_RGBA,
        cColorBits: 32,
        cDepthBits: 24,
        cStencilBits: 8,
        iLayerType: PFD_MAIN_PLANE,
        ..Default::default()
    };
    let pf_index = unsafe { ChoosePixelFormat(fake_hdc, &pfd) };
    assert!(pf_index != 0);
    let set_pixel_format_result = unsafe { SetPixelFormat(fake_hdc, pf_index, &pfd) };
    assert!(set_pixel_format_result.as_bool());
    let mut resulting_pfd = PIXELFORMATDESCRIPTOR {
        nSize: size_of::<PIXELFORMATDESCRIPTOR>() as u16,
        nVersion: 1,
        ..Default::default()
    };
    // This call always fails with Win32 error 87 (ERROR_INVALID_PARAMETER).
    let get_pixel_format_result = unsafe {
        DescribePixelFormat(
            fake_hdc,
            PFD_PIXEL_TYPE { 0: pf_index as i8 },
            size_of::<PIXELFORMATDESCRIPTOR>() as _,
            Some(&mut resulting_pfd),
        )
    };
    if get_pixel_format_result == 0 {
        let last_error = unsafe { GetLastError() };
        println!("get pixel format failed: {:?}", last_error);
    } else {
        println!("Pixel format: {:?}", resulting_pfd);
    }
    let release_result = unsafe { ReleaseDC(fake_window, fake_hdc) };
    assert_eq!(release_result, 1);
    let destroy_result = unsafe { DestroyWindow(fake_window) };
    assert!(destroy_result.as_bool());
    let unregister_result = unsafe {
        UnregisterClassW(fake_window_class.lpszClassName, instance)
    };
    assert!(unregister_result.as_bool());
}
CodePudding user response:
Thanks to a comment, I saw that the DescribePixelFormat binding in the windows crate doesn't perform any conversion on the iPixelFormat parameter. The docs (https://learn.microsoft.com/en-us/windows/win32/api/wingdi/nf-wingdi-describepixelformat) say this parameter should be an int, but the binding instead passes PFD_PIXEL_TYPE (which I believe is a single-field tuple struct wrapping an i8) through unchanged. I am not sure why, but this seems to cause the issue.

#[doc = "*Required features: `\"Win32_Graphics_OpenGL\"`, `\"Win32_Graphics_Gdi\"`*"]
#[cfg(feature = "Win32_Graphics_Gdi")]
#[inline]
pub unsafe fn DescribePixelFormat<'a, P0>(
    hdc: P0,
    ipixelformat: PFD_PIXEL_TYPE,
    nbytes: u32,
    ppfd: ::core::option::Option<*mut PIXELFORMATDESCRIPTOR>,
) -> i32
where
    P0: ::std::convert::Into<super::Gdi::HDC>,
{
    #[cfg_attr(windows, link(name = "windows"))]
    extern "system" {
        fn DescribePixelFormat(hdc: super::Gdi::HDC, ipixelformat: PFD_PIXEL_TYPE, nbytes: u32, ppfd: *mut PIXELFORMATDESCRIPTOR) -> i32;
    }
    DescribePixelFormat(hdc.into(), ipixelformat, nbytes, ::core::mem::transmute(ppfd.unwrap_or(::std::ptr::null_mut())))
}
And here is the binding I wrote, which works without the incorrect-parameter error:
fn describe_pixel_format(
    hdc: HDC,
    ipixelformat: i32,
    nbytes: u32,
    ppfd: *mut PIXELFORMATDESCRIPTOR,
) -> i32 {
    #[cfg_attr(windows, link(name = "windows"))]
    extern "system" {
        fn DescribePixelFormat(
            hdc: HDC,
            ipixelformat: i32,
            nbytes: u32,
            ppfd: *mut PIXELFORMATDESCRIPTOR,
        ) -> i32;
    }
    // The pointer types already match the extern declaration,
    // so no transmute is needed here.
    unsafe { DescribePixelFormat(hdc, ipixelformat, nbytes, ppfd) }
}