strange issue when encoding an image sequence using 10 bit build

spacediver
Posts: 8
Joined: Mon Nov 02, 2015 5:02 am

strange issue when encoding an image sequence using 10 bit build

Post by spacediver » Mon Nov 02, 2015 5:55 am

I've been doing some tests with the madVR renderer to verify 10 bit per channel color. My current test pattern is a single 16 bit png file that has 8 vertical bars, in grayscale. The first bar has 10 bit code 0 (out of 1023), the second has code 1, and so on, until the 8th bar which has code 7.

The encoded values have been scaled up by a factor of 64 so that they sit correctly within the 16 bit container (I coded the pixels such that the range was 1 to 1024, then multiplied by 64, and then subtracted 1, so everything is proportioned correctly between 0 and 65,535).
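To make the mapping concrete, 10 bit code 0 is stored as 1 × 64 − 1 = 63 in the 16 bit file, code 7 as 8 × 64 − 1 = 511, and code 1023 would be 1024 × 64 − 1 = 65,535.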

When I view this file using madVR, according to these instructions:

http://forum.doom9.org/showthread.php?t=172128

it works perfectly. With the lights off, after my eyes adjust, I can clearly make out the 8 distinct bars on my CRT.

My next step was to figure out how to convert this image into a 10 bit video file. I used the 10 bit ffmpeg build from here:

https://ffmpeg.zeranoe.com/blog/?p=435 (windows 32 bit static build)

I then ran the following command:

ffmpeg -i image.png movie.mp4

This almost works perfectly. The problem is that when I view the mp4 file with madVR, I only see 7 bars instead of 8. The third bar is the culprit: it has the same shade as the fourth bar, rather than a shade intermediate between the 2nd and 4th bars. I've experimented by lowering the 16 bit value of this bar by a few steps (in case a rounding error was introduced when converting from 16 bit to 10 bit), but no matter what, I can't get a distinct shade to show. It adopts the color of either the 2nd bar or the 4th bar - never the in-between shade it should have (it displays correctly when madVR renders the original png).

I can't figure this out. The whole area of video encoding is very new to me - it took me all day to figure out how to even use ffmpeg - but as far as I can tell, I'm doing everything correctly. The pixel format being used is yuv444p10be (by the way, I haven't been able to figure out the difference between "be" and "le" - can anyone tell me what it is?). I can do some more tests with different values (e.g. codes 8 to 15) and see if the problem is spread across the whole range. I wonder if this is an RGB to YUV conversion issue.

Here's a link to the 16 bit png test pattern, in case anyone wants to view it or see if they can reproduce this issue (you'll need a 10 bit display and a video card capable of 10 bit output for this to work; if you have an Nvidia GPU and a CRT you'll be good).

http://s000.tinyupload.com/index.php?fi ... 2481393460

Zeranoe
Site Admin
Posts: 704
Joined: Sat May 07, 2011 7:12 pm

Re: strange issue when encoding an image sequence using 10 bit build

Post by Zeranoe » Wed Nov 04, 2015 1:04 am

This looks like a bug, possibly in swscale.

I would advise opening a bug report with the information you provided: http://ffmpeg.org/bugreports.html

It's unlikely that this is a Windows-specific issue.

spacediver
Posts: 8
Joined: Mon Nov 02, 2015 5:02 am

Re: strange issue when encoding an image sequence using 10 bit build

Post by spacediver » Wed Nov 04, 2015 1:46 am

Thanks, I'll investigate the issue some more and provide feedback in the bug report section.

spacediver
Posts: 8
Joined: Mon Nov 02, 2015 5:02 am

Re: strange issue when encoding an image sequence using 10 bit build

Post by spacediver » Wed Nov 04, 2015 11:45 pm

So it's apparently a 16-235 vs 0-255 range issue in the RGB to YUV transformation.
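If I understand it correctly, squeezing full range into limited range maps the 0-1023 codes into roughly 64-940 (for 10 bit luma), so each input step becomes about 876/1023 ≈ 0.86 of an output step - which would explain why two adjacent bars can occasionally round to the same value and merge.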

I followed the advice given on the ticket, but can't seem to fix the problem:

https://trac.ffmpeg.org/ticket/4986

When I use -dst_range 1, the whole field is uniform, rather than showing 8 distinct bars.

Is there a way to force the full swing to be preserved in the transformation from RGB to YUV?
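For example, would something like

ffmpeg -i image.png -vf scale=in_range=full:out_range=full -pix_fmt yuv444p10le movie.mp4

keep the full range, or am I misunderstanding what the scale filter's range options are supposed to do? (I'm just guessing at those options from the scale filter documentation - I haven't actually tried that yet.)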
