png_decoder doesn't work when LV_COLOR_DEPTH is set to 16

I am using png_decoder.c. It works fine when LV_COLOR_DEPTH = 32, but when LV_COLOR_DEPTH is set to 16 the displayed colors are wrong.
This is my PNG file: bluetooth_00000
When LV_COLOR_DEPTH is 32, the result is correct: [screenshot "32bit"]
When LV_COLOR_DEPTH is 16, the result is wrong: [screenshot "16bit"]

My test code is:

```c
void my_demo_create_img(void)
{
    lv_obj_t * img_obj = lv_img_create(lv_scr_act(), NULL);
    lv_img_set_src(img_obj, "./bluetooth_00000.png"); /* straight quotes, not curly ones */
    lv_obj_set_pos(img_obj, 55, 55);
}
```
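For reference, a common cause of wrong colors at LV_COLOR_DEPTH 16 is that lodepng always decodes to 8 bits per channel (RGBA8888), so each pixel has to be repacked into RGB565 before LVGL can draw it. Below is a minimal sketch of that conversion; the function names are mine for illustration, not the actual ones in png_decoder.c:

```c
#include <stdint.h>

/* Pack one 8-bit-per-channel RGB pixel into RGB565, the pixel format
 * LVGL uses when LV_COLOR_DEPTH is 16: 5 bits red, 6 green, 5 blue. */
uint16_t rgb888_to_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

/* Repack a lodepng-style RGBA8888 buffer: read 4 bytes per pixel,
 * write one 16-bit value per pixel. Hypothetical helper. */
void rgba8888_to_rgb565(const uint8_t *src, uint16_t *dst, uint32_t px_cnt)
{
    for (uint32_t i = 0; i < px_cnt; i++) {
        dst[i] = rgb888_to_rgb565(src[4 * i], src[4 * i + 1], src[4 * i + 2]);
        /* The alpha byte (src[4*i+3]) is dropped in this sketch; with
         * alpha enabled at 16-bit depth, LVGL stores it separately. */
    }
}
```

If the colors come out with red and blue swapped (or byte-swapped on SPI displays), the LV_COLOR_16_SWAP setting in lv_conf.h is also worth checking.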

I have fixed this issue, and also fixed a memory leak in png_decoder.c. Here is the file: png_decoder.c (7.5 KB). Hope it helps anyone else using png_decoder.
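For anyone who can't open the attachment, the leak pattern to watch for is this: lodepng allocates the decoded buffer with malloc() and can leave it allocated even when decoding fails, so the decoder must free it on the error path as well as on close. A self-contained sketch of the pattern, with a stand-in for the real decode call (all identifiers here are illustrative, not the actual png_decoder.c code):

```c
#include <stdlib.h>
#include <stdint.h>

/* Illustrative stand-in for a lodepng-style decode call: like the real
 * thing, it may allocate *out even when it returns an error code. */
unsigned fake_decode(uint8_t **out, int fail)
{
    *out = malloc(16);        /* buffer is allocated regardless */
    return fail ? 83u : 0u;   /* nonzero = decode error */
}

/* The pattern the decoder needs: free the buffer on the error path too,
 * otherwise every failed decode leaks one buffer. */
int decode_checked(int fail)
{
    uint8_t *img_data = NULL;
    unsigned err = fake_decode(&img_data, fail);
    if (err) {
        free(img_data);       /* without this line, the leak */
        img_data = NULL;
        return -1;
    }
    /* ...use img_data for drawing... */
    free(img_data);           /* and free again when the image is closed */
    return 0;
}
```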


Was this memory leak what caused your issue? Or did you swap the 16-bit colors?

Please send a pull request to https://github.com/littlevgl/lv_lib_lodepng.