Posts by morphinapg

    I noticed while testing 8bit output of a video where I tonemap HDR to SDR (performing some color grading) that there was some banding in the output. I'm guessing this is due to Premiere reading the source as 8bit and applying the filter in an 8bit calculation, resulting in rounding errors. I know this wouldn't happen if I output in 10bit (tested, looks great), but I don't want 10bit output for a project like this. The maximum depth option in other encoders usually solves an issue like this by making sure the source and any effects are calculated with 32bit precision.

    Adobe claims that it affects scaling quality, so if your edits have been scaled in premiere (this may include time remapping), it may improve the quality. If you haven't done anything like that, it probably does nothing.

    It's likely something that's applied before Voukoder gets the video data from Premiere.

    If the values are set correctly it should look something like this in mediainfo:


    Params used:

    -x265-params :colorprim=bt2020:transfer=smpte2084:colormatrix=bt2020nc:master-display="G(13250,34500)B(7500,3000)R(34000,16000)WP(15635,16450)L(12000000,200)":max-cll=0,0
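The mastering-display string above is easier to sanity-check once you know its units: chromaticity coordinates are given in increments of 0.00002 and luminance in increments of 0.0001 cd/m², which is how x265 expects SMPTE ST 2086 metadata. A small decoder sketch (the function name is just for illustration):

```python
import re

def decode_master_display(md: str) -> dict:
    """Decode an x265 master-display string into human-readable values.
    Chromaticity coords are in units of 0.00002; luminance is in units
    of 0.0001 cd/m^2 (nits), per SMPTE ST 2086 as x265 expects them."""
    gx, gy, bx, by, rx, ry, wx, wy, lmax, lmin = (
        int(n) for n in re.findall(r"-?\d+", md)
    )
    c = 0.00002  # chromaticity unit
    l = 0.0001   # luminance unit (cd/m^2)
    return {
        "green": (gx * c, gy * c),
        "blue": (bx * c, by * c),
        "red": (rx * c, ry * c),
        "white": (wx * c, wy * c),
        "max_luminance": lmax * l,  # ~1200 nits for the string above
        "min_luminance": lmin * l,  # ~0.02 nits
    }

md = 'G(13250,34500)B(7500,3000)R(34000,16000)WP(15635,16450)L(12000000,200)'
info = decode_master_display(md)
```

So this particular string describes a P3-primaries mastering display (green at roughly x=0.265, y=0.690) with a D65 white point, peaking around 1200 nits.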


    You set those with x265. These particular flags are set before the encoder gets the data. The idea being (if they had worked) they could pass that data to any encoder that supported it. In particular, if it had worked, it would have been useful for NVENC.

    MaxFALL, MaxCLL, and mastering display characteristics are not some universal flag that ffmpeg can pass to any encoder. They need to be specified by the encoders themselves.

    Like I said, you can do HDR without it, it's just not fully HDR10 compliant and may not tonemap as well since the display doesn't have any idea of how bright the content gets.

    Well, I put a 30fps file on a 60fps project / sequence / timeline. That's not under Voukoder's control; it will be handled by Premiere / the NLE. The NLE will send a 60fps frame stream to Voukoder, and those frames can be anything. Applying a decimate filter at this stage doesn't make any sense, right? Setting cycle to 3 would just output a 40fps file... but why?

    Well yeah, that example wouldn't make sense to do other than to test what happens with the filter. For practical purposes I'm talking about situations where your source recording has a higher frame rate than the underlying content, such as a 24fps film inside a 60fps TV recording, or my example where the PS4 outputs 60Hz, so my capture device records a 60fps file regardless of whether the game itself is rendering at a lower rate.

    Basically, what you could test is: import a 24fps video, set the timeline to 30fps, make some cuts (especially trimming out the start), and then export. Then re-import that 30fps file and try to get it back to 24fps. Using normal methods, it may or may not work: either you'll get something smooth, or there will be a lot of judder. This may also change scene to scene depending on the cuts you made.

    Instead of using normal frame rate conversion methods, use decimate with a cycle of 5, and you will be able to fully reconstruct the original 24fps frame rate of the source. Decimate is about reconstructing an original frame rate that is embedded in a higher fps source, and doing so with precise results (no judder).
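The arithmetic behind this is simple: decimate drops one frame per cycle, so the output rate is input × (cycle − 1) / cycle. A quick sketch (helper name is mine, not from the filter):

```python
from fractions import Fraction

def decimated_fps(input_fps, cycle: int) -> Fraction:
    """ffmpeg's decimate drops one frame per cycle of `cycle` frames,
    so the output rate is input * (cycle - 1) / cycle."""
    return Fraction(input_fps) * (cycle - 1) / cycle

print(decimated_fps(30, 5))  # 24 -- recovers 24fps embedded in a 30fps file
print(decimated_fps(60, 3))  # 40
print(decimated_fps(25, 3))  # 50/3, i.e. ~16.67fps
```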

    Ok, crop and pad added.

    Not sure about decimate as it adjusts the output framerate.

    On a clip with 25 fps, "decimate=cycle=3" would generate a 16.66666 fps file. Wouldn't it make sense to put such a file on the timeline and then let the NLE convert it to the desired framerate?

    If there is a consistent frame pacing in the source, then yes, letting the timeline do it (or the connector) is absolutely fine.

    However, if for example you have a 24fps source that somehow got encoded as 30fps, then what you have is a 1, 1, 1, 2 pattern, meaning every fourth frame is duplicated. It would be very difficult to ensure the frame rate down-conversion removes exactly the duplicated frame rather than one of the unique frames, and once editing gets involved it's even harder to guarantee. If the wrong frame gets dropped, you get severe frame judder as a result.

    It's too bad the filter doesn't seem to support removing more than one frame per cycle, as removing two frames from a cycle of 5 would be particularly useful for extracting a 24fps video from a 60p source at the correct pacing as well. However, this could probably be accomplished by setting the frame rate in the sequence/connector to 30fps and then using decimate with a cycle of 5.
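The two-stage workaround checks out arithmetically:

```python
from fractions import Fraction

# Direct 60 -> 24 would need decimate to drop 2 of every 5 frames,
# which it can't do in one pass. Two stages work:
stage1 = Fraction(60, 2)          # sequence/connector at 30fps: NLE drops every other frame
stage2 = stage1 * Fraction(4, 5)  # decimate cycle=5 removes one frame in five
print(stage2)                     # 24 -- the original film rate
```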

    My issue is specifically with video game footage. 30fps inside a 60fps container. Ideally, just dropping every other frame SHOULD be good enough, but with video games sometimes a frame is rendered early or late, so if you use decimate you're more likely to be able to detect which of the two frames in the cycle is a duplicate, to better improve the downconversion.
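The advantage of decimate here is that it drops the *most similar* frame in each cycle rather than a fixed position. A toy sketch of that idea (pure Python, frames as flat pixel lists; the real filter uses block SADs plus dupe/scene thresholds, so this is only illustrative):

```python
def mean_abs_diff(a, b):
    """Average absolute pixel difference between two frames."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def drop_likely_duplicates(frames, cycle=2):
    """Toy version of decimate's idea: in each run of `cycle` frames,
    drop the one most similar to its predecessor (the likely duplicate).
    With cycle=2 this handles 30fps content in a 60fps recording even
    when a frame arrives a slot early or late."""
    out = []
    prev = None
    for start in range(0, len(frames) - len(frames) % cycle, cycle):
        group = frames[start:start + cycle]
        diffs = []
        for f in group:
            # the very first frame has no predecessor, so never drop it
            diffs.append(float("inf") if prev is None else mean_abs_diff(f, prev))
            prev = f
        dup = diffs.index(min(diffs))  # smallest change = likely duplicate
        out.extend(f for i, f in enumerate(group) if i != dup)
    return out
```

For example, a doubled stream like A A B B C C comes back out as A B C, whichever slot of each pair the duplicate happens to occupy.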

    So the way I get the pixel data from Premiere is okay?

    morphinapg Could you write down the settings (encoder, options, muxer, filter, etc) and a short description text so I could add this as preset? Or maybe even more than one.

    Premiere's editing workflow is rec709. Even when you edit HDR natively, it saves that in a floating point rec709 pixel format, which Premiere's native encoder converts back to PQ/BT2020 when you select HDR encoding (floating point ensures no color or highlight loss on conversion). When using any other encoder, you're getting that rec709 output, clipped to the first 100 nits. Hence why I needed to create the effect preset to "pre-format" the pixel data as PQ/BT2020. The problem is, Premiere thinks it's seeing rec709, so yes, you absolutely must use rec709 output for my effects preset to work. If you select anything else, Premiere will think it needs to convert the colorspace, modifying those pre-formatted values I so carefully calculated.

    As for a Voukoder preset, the HDR process requires the Premiere effects preset I uploaded earlier (or for your input video to be HDR without PQ/2020 flags), so it wouldn't be as simple as someone picking a preset in Voukoder. Their video needs to be sending the right color data to begin with or it will look completely wrong. That's why I wrote the guide I did. If they are sending the right color data to Voukoder, then there is one preset you could create, I suppose:

    For just generic HDR support, simply using setparams with:


    Will allow HEVC, VP9, H264, and ProRes (possibly more) to output in some form of HDR (compatibility would depend on devices, of course). HEVC is obviously preferred, so I would default to that with 10bit 4:2:0. You could create presets for both NVENC and x265; for x265 I'd also enable "HDR Options" and "UHD Blu-ray conform" for the most ideal encoding efficiency. I personally use CRF 13 with x265 for HDR content. I haven't experimented enough with NVENC to know what the equivalent would be there.

    For a more "perfectly compliant" HDR10 encode, the user would also need to include the actual metadata that describes the brightness of the content and mastering display, as I describe in my guide. As far as I know, that's only doable on x265 for now. But it becomes a much more complicated process if you don't already have numbers prepared for that. The guide explains that process involving a tool I created to measure the light output of individual frames. So I don't think it would be particularly useful to create a voukoder preset with predefined metadata. That should be something the user is inputting themselves.
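For reference, the metadata itself is simple once you have per-frame measurements: MaxCLL is the brightest single pixel anywhere in the program, and MaxFALL is the highest frame-average light level. A sketch, assuming a measurement tool (like the one described, though this input format is hypothetical) emits per-frame (max, average) nit pairs:

```python
def max_cll_fall(frame_stats):
    """frame_stats: iterable of (max_nits, avg_nits) tuples, one per frame.
    MaxCLL = brightest pixel in the whole program;
    MaxFALL = highest frame-average light level."""
    max_cll = max(m for m, _ in frame_stats)
    max_fall = max(a for _, a in frame_stats)
    return round(max_cll), round(max_fall)

# Hypothetical measurements for a three-frame clip:
stats = [(980.0, 120.5), (1450.2, 300.1), (700.0, 90.0)]
cll, fall = max_cll_fall(stats)
# These would then go to x265 as, e.g., max-cll=1450,300
```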

    Good idea! If you have a 60fps project with clips at exactly half that frame rate, you'll get a better result by exporting the project at 60fps and then dropping every second frame before encoding to 30fps. That's exactly what I do when I want to preserve a TV recording for the future: in Europe, cinema material on TV is converted to 50fps (PAL speedup plus frame doubling), and it usually looks better converted back to its native 25fps (a 1:1 frame transfer).

    (Sorry! The stupid Teutonic spell-checker unfortunately doesn't speak English!)

    Yeah, if it's exact and consistent, simply dropping the frame rate works perfectly (you can do this in premiere, either in export or sequence settings), but when there's some slight variance in the frame pacing, as there often is with games, decimate would typically give a better result.

    Yes, it is the same; HLG does not seem to be different before and after the encoding.

    What is very strange is that even when I'm using the Premiere encoder to output in HDR PQ, the colors are pretty much the same...

    I need to test all the encoded files on screens supporting HLG and PQ to see if there is any difference.

    The problem is, Premiere is reading your HLG file as SDR. That's okay if you turn around and export as HLG, but the native export will be HLG displayed as SDR, encoded in an HDR container. Just like you could encode any SDR video in HDR mode through premiere's HEVC codec if you want.

    HLG is designed in a way that if you read it as SDR, it still looks fairly good, but the colors being the same is an oddity. If the source has BT2020 colors, then Premiere would likely decode that into rec709 color space, meaning you'd need to perform a conversion using the colorspace filter to fix this (input rec709, output bt2020), and then select bt2020 and HLG options in x265 for correct output. Or skip the colorspace filter and use HLG paired with rec709 colorspace and primaries, although that's not recommended.

    However, if Premiere is just ignoring the color space, then you may be seeing the native rec2020 color space, which SHOULD look slightly weird in the preview monitor. Like slightly desaturated, maybe a little yellow tinted. If that's the case then you would be free to encode to x265 with HLG and bt2020 options without using the colorspace filter.
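The rec709-to-bt2020 step the colorspace filter performs is, at its core, a primaries conversion on linear light. A minimal sketch using the standard ITU-R BT.2087 matrix (the real filter also handles the transfer function and YUV matrix, which this omits):

```python
def rec709_to_bt2020_linear(r, g, b):
    """Primaries conversion on linear-light RGB, per ITU-R BT.2087.
    Illustrative only: real conversion also linearizes/re-applies gamma."""
    return (
        0.6274 * r + 0.3293 * g + 0.0433 * b,
        0.0691 * r + 0.9195 * g + 0.0114 * b,
        0.0164 * r + 0.0880 * g + 0.8956 * b,
    )

# White maps to white (each row sums to ~1.0)...
print(rec709_to_bt2020_linear(1.0, 1.0, 1.0))
# ...while pure 709 red lands well inside the 2020 gamut, which is why
# 2020 content shown through a 709 pipeline looks desaturated:
print(rec709_to_bt2020_linear(1.0, 0.0, 0.0))
```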

    Do you have a sample source file I could test? I could maybe give you some advice on how to encode it properly.

    I thought of another thing. If it's still overly dark, even with the new preset, it's possible my custom LUT isn't being loaded in the second lumetri filter, so check on that. I believe this SHOULD be embedded into the preset, but if it's not, then I can upload that separately.

    Basically, what the preset does is first compress the 0-10,000 nit HDR range into the 0-100 nit SDR range, and then the second filter performs a transformation from rec709 gamma/color into SMPTE 2084 gamma and rec2020 color. So without the second step, you basically get an image that's 1% the brightness of the original. Open up the second Lumetri effect and see if the LUT is active. Try checking the active checkbox above the LUT selection dropdown. If checking that box does nothing, then the LUT is not embedded correctly and needs to be uploaded separately.
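That second step is essentially the SMPTE ST 2084 (PQ) inverse EOTF, which maps absolute luminance onto the PQ signal range. This isn't the preset's actual LUT, just the standard curve it approximates:

```python
# SMPTE ST 2084 (PQ) constants, as defined in the standard.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Inverse EOTF: absolute luminance (0..10,000 nits) -> PQ signal (0..1)."""
    y = min(max(nits / 10000.0, 0.0), 1.0)  # normalize to 0..1
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

# 100 nits (the top of the compressed SDR range) sits around half signal,
# which is why an un-transformed rec709 image reads as only ~1% brightness:
print(round(pq_encode(100), 3))    # ~0.508
print(round(pq_encode(10000), 3))  # 1.0
```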