DaVinci Resolve now allows for third-party render plugins

  • Status update:

    When getting yuv420 out of DVR and feeding it directly into NVENC (without a conversion from uyvy422), I get a performance increase of 62%. Really nice.

    Code
    Exported 1919 frames in 4 seconds. (avg. 393.96 fps) [YUV420p->YUV420p]
    Exported 1919 frames in 7 seconds. (avg. 242.30 fps) [UYVY422->YUV420p]
    Exported 1919 frames in 12 seconds. (avg. 152.90 fps) [YUV444p->YUV420p]

    With per-frame times this low (2538 µs/frame), small changes can have a huge effect.
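
    For context, this is roughly the per-frame conversion that is skipped when DVR hands over YUV420p directly - a minimal libswscale sketch (Voukoder is libav-based, but this is not its actual code):

    Code
    extern "C" {
    #include <libswscale/swscale.h>
    #include <libavutil/imgutils.h>
    #include <libavutil/mem.h>
    }

    // Convert one packed UYVY422 frame to planar YUV420P - the per-frame work
    // that disappears when DVR already delivers YUV420p.
    static void convertUyvyToYuv420(const uint8_t* uyvy, int width, int height)
    {
        SwsContext* sws = sws_getContext(width, height, AV_PIX_FMT_UYVY422,
                                         width, height, AV_PIX_FMT_YUV420P,
                                         SWS_BILINEAR, nullptr, nullptr, nullptr);

        const uint8_t* srcData[4] = { uyvy, nullptr, nullptr, nullptr };
        int srcStride[4] = { width * 2, 0, 0, 0 };   // UYVY = 2 bytes per pixel

        uint8_t* dstData[4];
        int dstStride[4];
        av_image_alloc(dstData, dstStride, width, height, AV_PIX_FMT_YUV420P, 32);

        sws_scale(sws, srcData, srcStride, 0, height, dstData, dstStride);

        av_freep(&dstData[0]);
        sws_freeContext(sws);
    }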

    TO DO

    • Add high bit depth mode pixel formats
    • Get alpha channel working
    • Get timecodes working for MOV
    • Native DVR NVENC fails after Voukoder ran
      "NVENC m_EncodeAPI.nvEncOpenEncodeSessionEx(&encodeSessionExParams, &m_pEncodeSession) return: 15"
  • It's just strange that DVR uses the 0-255 range as default (auto). AFAIK 16-235 is the standard.

    Actually, 0-255 is very important. Formats like DCPs, ProRes 444, image sequences and IMF master files are full range. Years ago, Resolve was more focused on rendering only master files.

    Grading is done in full range. With YUV files, Resolve takes the limited range and expands it to full. RAW files are RGB and therefore full range. That keeps everything consistent when grading. Also, grading monitors almost always run in full range.
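
    (Concretely, that expansion is the standard 8-bit limited-to-full mapping; a minimal sketch for luma - chroma uses 224 steps (16-240) instead of 219:)

    Code
    #include <algorithm>
    #include <cmath>
    #include <cstdint>

    // Expand 8-bit limited-range luma (16-235) to full range (0-255).
    static inline uint8_t expandLuma(uint8_t y)
    {
        int full = (int)std::lround((y - 16) * 255.0 / 219.0);
        return (uint8_t)std::clamp(full, 0, 255);
    }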

    As you noticed, Resolve only goes to 16-235 for non-RGB deliverables like ProRes HQ and H.264.

    Looking forward to trying out the new update.

  • Do you know how I can get high bit depth pixel data?

    I don't know for sure. But the following code does not generate errors in s_RegisterCodecs ("Pixel Format was not defined for plugin encoder fourCC") or when I try to render ("Color mode is not supported"). I chose 16 bit because for fourCC codecs it is 4/8/16/24/32 (https://www.fourcc.org/rgb.php). But I am not experienced with programs like this.

    P.S.: pIOPropBitsPerSample is for audio, you know, right?


  • Actually, 0-255 is very important. Formats like DCPs, ProRes 444, image sequences and IMF master files are full range. Years ago, Resolve was more focused on rendering only master files.

    Grading is done in full range. With YUV files, Resolve takes the limited range and expands it to full. RAW files are RGB and therefore full range. That keeps everything consistent when grading. Also, grading monitors almost always run in full range.

    As you noticed, Resolve only goes to 16-235 for non-RGB deliverables like ProRes HQ and H.264.

    Looking forward to trying out the new update.

    Resolve's internal processing etc. is beside the point here. This is explained here:

    https://forum.blackmagicdesign.com/viewtopic.php?…l+range#p711929


    We need full range data as well, but be careful with it.

    ProRes private frame headers have no tag to specify the range (and neither does the MOV container). By design, ProRes should always be limited range. You can send full range YUV data to it (ProRes always stores data as YUV, even if you send RGB to it) and it will preserve it, but the reading app will have no clue what the real range of the file is. Only by manual intervention (like we have in Resolve) can you force an app to read it as full range. In some apps (like Premiere) it's all hard coded and there is no way to override it. If I'm correct, Premiere expects full range for 444 ProRes files, while Resolve by default exports limited range, so you always need to set it manually in order to get a good mapping between these two apps.

  • In time we should have a few pixel formats supported:

    - yuv420 10 bit, for H.264/H.265 etc.

    - yuv422 10 bit, for ProRes HQ, DNxHR etc.

    - yuva444 12 bit, for things like ProRes 444, DNxHR 444 - does ffmpeg now properly encode 444 at 12 bit, or did they just fix the decoder?

    - rgba 16 bit, for anything RGB based

    I assume bit depth scaling is not the issue, so those higher values will cover smaller bit depths.
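
    (In libav terms these would roughly be the following - assuming the little-endian variants; the exact choice depends on the codec:)

    Code
    AV_PIX_FMT_YUV420P10LE   // yuv420 10 bit   (H.264/H.265)
    AV_PIX_FMT_YUV422P10LE   // yuv422 10 bit   (ProRes HQ, DNxHR etc.)
    AV_PIX_FMT_YUVA444P12LE  // yuva444 12 bit  (ProRes 444, DNxHR 444)
    AV_PIX_FMT_RGBA64LE      // rgba 16 bit     (RGB-based formats)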

    The key point is to have YUV data when codecs require YUV data. Let Resolve do the proper RGB->YUV conversion based on project settings etc. You don't want to do RGB<->YUV yourself, as this would require all the color space math behind it. Resolve does it well, so it's better to use the final data.
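
    To illustrate what "the math behind it" involves even in the simplest case, here is a minimal BT.709 8-bit limited-range sketch - no primaries/transfer handling, no dithering, exactly the kind of thing better left to Resolve:

    Code
    #include <cmath>
    #include <cstdint>

    // Minimal BT.709 RGB -> YCbCr, 8-bit limited range (16-235 / 16-240).
    // A real conversion also needs correct primaries, transfer and dithering.
    static void rgbToYCbCr709(double r, double g, double b,        // 0.0 - 1.0
                              uint8_t& y, uint8_t& cb, uint8_t& cr)
    {
        double Y  = 0.2126 * r + 0.7152 * g + 0.0722 * b;
        double Cb = (b - Y) / 1.8556;
        double Cr = (r - Y) / 1.5748;

        y  = (uint8_t)std::lround(16.0  + 219.0 * Y);
        cb = (uint8_t)std::lround(128.0 + 224.0 * Cb);
        cr = (uint8_t)std::lround(128.0 + 224.0 * Cr);
    }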


    For ProRes you want to add some extra bits to the ffmpeg command (I assume you're adding color space etc. already), so it looks more like an Apple encode:

    -metadata:s "encoder=Apple ProRes 422" -c:v prores_ks -vendor apl0 -bitexact -movflags write_colr

    These should be correct encoder names:

    Apple ProRes 422 Proxy
    Apple ProRes 422 LT
    Apple ProRes 422
    Apple ProRes 422 HQ
    Apple ProRes 4444
    Apple ProRes 4444 XQ

    for interlaced encoding (this will properly set ProRes private frame header as well) also add:

    -vf "setfield=1, fieldorder=tff/bff" -flags "ildct+ilme"

    prores_ks is much slower but keeps the bitrate under better control (still not perfectly like the Apple encoder, and it affects quality a bit).

    The prores encoder is much faster, but it doesn't restrict the bitrate in the same way as the Apple encoder. It will most likely be around 10-20% higher than the Apple reference (but the quality is also closer to Apple).
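
    For a libav-based plugin, those CLI bits map roughly onto the C API like this (a sketch only, not Voukoder's actual code; option names as exposed by prores_ks):

    Code
    extern "C" {
    #include <libavcodec/avcodec.h>
    #include <libavutil/opt.h>
    }

    // Open a prores_ks encoder configured the way the suggested CLI flags would do it.
    // Error handling omitted for brevity.
    static AVCodecContext* openProResHQ(int width, int height, AVRational fps,
                                        bool interlaced)
    {
        const AVCodec* codec = avcodec_find_encoder_by_name("prores_ks");
        AVCodecContext* ctx  = avcodec_alloc_context3(codec);

        ctx->width     = width;
        ctx->height    = height;
        ctx->time_base = av_inv_q(fps);
        ctx->pix_fmt   = AV_PIX_FMT_YUV422P10LE;           // ProRes HQ is 10-bit 4:2:2
        ctx->color_primaries = AVCOL_PRI_BT709;            // color space metadata
        ctx->color_trc       = AVCOL_TRC_BT709;
        ctx->colorspace      = AVCOL_SPC_BT709;

        av_opt_set(ctx->priv_data, "profile", "hq", 0);    // proxy/lt/standard/hq/4444/4444xq
        av_opt_set(ctx->priv_data, "vendor",  "apl0", 0);  // "-vendor apl0"

        if (interlaced)                                    // "-flags ildct+ilme"
            ctx->flags |= AV_CODEC_FLAG_INTERLACED_DCT | AV_CODEC_FLAG_INTERLACED_ME;

        avcodec_open2(ctx, codec, nullptr);
        return ctx;
    }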

    When it comes to the PAL/NTSC aspect ratio, just pass -aspect 16:9 or 4:3; ffmpeg sets it properly based on the frame size by default.

    When you get timecode working then:

    -timecode 10:00:00:00 -metadata:s reel_name=ABCD123

    Not sure if you can get the reel name as a variable.

    Later we can add setting the audio track language and name, which is useful.


  • All those settings should be there as well, no? So it just needs mapping.

    Not exactly. We do not have some kind of "file" for encoding with ffmpeg-like options. We need to "ask" Resolve (with C++ code) to give us the stream in the correct format. And this part of the API is not documented at all. So we can't "let Resolve do the proper RGB->YUV conversion based on project settings etc." until we learn how to properly command Resolve via the API. The second part - encoding to different codecs - is not a problem then.


  • No way to get some documentation from BM?

    I don't believe the API has no documentation, but I can understand that BM won't share it with "small developers". A poor approach, though.

    Yes, you definitely need correct YUV data to encode. You need the levels to be correct as well. Fixing it by setting full range in the codec flags is not a very good workaround (sometimes it's not even possible), as some apps don't read range flags.
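
    In libav terms, the range flag workaround mentioned above is just the following - it only tags the stream, it does not rescale any pixel values, and some apps ignore it entirely (sketch only):

    Code
    extern "C" {
    #include <libavcodec/avcodec.h>
    }

    // Tag the encode as full range; AVCOL_RANGE_MPEG would mean limited (16-235).
    static void tagFullRange(AVCodecContext* ctx, AVFrame* frame)
    {
        ctx->color_range   = AVCOL_RANGE_JPEG;   // full range (0-255)
        frame->color_range = AVCOL_RANGE_JPEG;
    }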

  • I don't have an "IOPlugins" folder, only a "Plugins" folder, so I extracted the files there and installed the software beforehand, but I don't see it on the delivery page. What went wrong?

  • I'm back now. Let's take a look at the open topics:

    MainConcept's plugin also has NVENC support. But I have no idea how they fixed the error. Closing the NVENC session in the destructor?

    That's handled by libav, but as far as I know everything is properly closed. Also, nvsmi.exe does not report an open NVENC session after Voukoder is done.

    Actually, 0-255 is very important. Formats like DCPs, ProRes 444, image sequences and IMF master files are full range. Years ago, Resolve was more focused on rendering only master files.

    Yes, both variants are supported. I just don't want DVR to use full range as the default; it should use limited range instead. Of course the user can switch this to full range if needed.

    But the following code does not generate errors in s_RegisterCodecs ("Pixel Format was not defined for plugin encoder fourCC") or when I try to render ("Color mode is not supported").

    The pixel format selection in the plugin settings in DVR just defines how the frame is sent to Voukoder. You have to set the target format of your video in Voukoder itself. Voukoder can do any necessary conversion on the fly, but it's more performant to use the pixel format closest to the output format, so Voukoder does not have to do any conversion at all. (See my previous post comparing the speeds.)

    It also makes no sense to set DVR to yuv420 and set Voukoder to export ProRes 4444. It will work, but the effective chroma resolution is still 4:2:0. The same goes for bit depth.

    Still, whatever I set in codecInfo, when calling ...

    Code
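    // Query the reported bit depth from the property collection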
    uint32_t bitDepth;
    p_pProps->GetUINT32(pIOPropBitDepth, bitDepth);

    ... I always get bitDepth = 0. And the framebuffer looks like it is 8 bit. Hmm...

    Edit: Created a post in the BM forum: https://forum.blackmagicdesign.com/viewtopic.php?f=12&t=138829