Posts by morphinapg

    Okay, it turns out I found a workaround for now, using Premiere's optical flow and frame blending, but it was kind of complicated. However, it performs very well.

    I'm converting 60fps gameplay to 24fps with motion blur. For those who might be interested, this is what I did:

    1. Have 60fps gameplay
    2. Have 240fps sample file with same resolution
    3. Have 48fps sample file with same resolution
    4. Drag 240fps sample to create a new 240fps sequence.
    5. Delete the sample and drag the gameplay footage to the 240fps sequence.
    6. Right click the footage and set time interpolation to Optical Flow. This will give you a smooth 240fps version of the source gameplay
    7. Next, drag the 48fps sample to create a new 48fps sequence
    8. Delete the sample and now drag the 240fps sequence onto the 48fps sequence.
    9. Right click this and select Frame Blending for time interpolation. This will blend five 240fps frames into each 48fps frame.
    10. Export as 24fps.

    The reason I used 48fps instead of 24fps (getting 10 blended frames) is because typically for motion blur, you want it to be a "180 degree shutter" which is 1/2 of the time. Having a 48fps sequence simulates a full 360 degree shutter, so exporting that to 24fps cuts out half of the frames, simulating the more typical 180 degree shutter. However, if you want the full 360 degree shutter, as it is smoother, you can simply make the second sequence 24fps.

    The reason for the sample files is unfortunately it's not possible to set sequences to weird frame rates like 240/48 manually. You can also achieve this by re-interpreting the source footage as 240/48 before creating those sequences and then setting it back to its original frame rate afterwards, but of course that's an annoying way to do that.
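    For the curious, roughly the same idea can be sketched with ffmpeg filters. This is a rough equivalent, not the Premiere method above: `minterpolate` stands in for Optical Flow, `tmix=frames=5` averages five consecutive 240fps frames (a 1/48s exposure, i.e. a 180-degree shutter at 24fps), and `fps=24` keeps every 10th blended frame. The testsrc clip and file names are made up stand-ins for real 60fps gameplay; expect ffmpeg's optical flow to look different from Premiere's.

```shell
# Synthetic 60fps clip standing in for real gameplay footage:
ffmpeg -y -f lavfi -i testsrc=duration=1:size=256x144:rate=60 \
  -pix_fmt yuv420p gameplay60.mp4

# 60fps -> optical flow to 240fps -> blend 5 frames -> keep every 10th -> 24fps
# (use tmix=frames=10 instead for the smoother 360-degree shutter)
ffmpeg -y -i gameplay60.mp4 \
  -vf "minterpolate=fps=240:mi_mode=mci,tmix=frames=5,fps=24" \
  -c:v libx264 -crf 18 output24.mp4
```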

    I don't know for sure how useful this would be for OP's situation, but it may help some people.

    I see you removed the fps filter but doesn't that then make this tmix procedure impossible?

    I had the idea to do something similar in a future project, although generating the additional frames with optical flow rather than capturing them natively. However, as it currently is, tmix will mix the frames based on the frame rate selected in the connector. So I'd have to output at the high frame rate and then drop the extra frames later myself. Obviously, this is not the greatest way to do it as it results in encoding way more frames than necessary for the end product.

    Did you use the preset I uploaded? Third picture looks like what you would get if you followed the guide but forgot to apply the preset. The encoder assumes the color space is ready for HDR, meaning it should look more like image #2 in the preview monitor in premiere after you apply my preset.

    What you will see on the preview monitor is the entire 10,000 nit range compressed into the rec709 preview monitor, with color space and transfer function converted to correspond to the BT2020 and ST2084 spaces, which will make the image look dark, washed out, and lower contrast on the preview monitor before you export with voukoder.

    Here's the preset again:…st2084-bt2020-pp2020-zip/

    Using the HDR editing mode in Premiere 2021 doesn't work with this process btw. Keep the project as rec709.

    so I should just set up a level and CRF min&max?

    And shall I set the VBV? I don't quite understand the function of VBV

    Really don't bother with those other settings. Set a CRF level, and leave the other settings at default. 13 or 14 seems good for HDR to me.

    I've sometimes seen VBV and other settings used in combination with Blu-ray specs, but there's another option that automatically applies anything necessary for UHD Blu-ray encodes, so just use that instead.

    CRF is far better than bitrate, if you care more about quality than file size. There are not 2 kinds of CRF. The strategy option is about what kind of encoding mode you want, and CRF is default. You can change it to ABR if you want to choose a bitrate instead.

    Constant Quantizer, another option, is pretty much an outdated mode for x264/x265, and while it produces similar results to CRF, it wastes bits to do so. I believe CQP is what NVENC uses as its constant quality mode, but the number values are different and the resulting bitrates are typically higher for the same quality.
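    To make the three strategies concrete, here's how they map onto ffmpeg's libx265 flags. This is a sketch with a synthetic test clip and made-up bitrate/QP numbers, not a settings recommendation:

```shell
# Synthetic 1-second clip standing in for real footage:
ffmpeg -y -f lavfi -i testsrc=duration=1:size=320x180:rate=30 \
  -pix_fmt yuv420p input.mp4

# CRF: constant quality, the default strategy:
ffmpeg -y -i input.mp4 -c:v libx265 -crf 14 crf.mkv

# ABR: pick an average bitrate instead of a quality level:
ffmpeg -y -i input.mp4 -c:v libx265 -b:v 2M abr.mkv

# CQP: constant quantizer, largely outdated for x264/x265:
ffmpeg -y -i input.mp4 -c:v libx265 -x265-params qp=18 cqp.mkv
```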

    Was there anything changed about the MKV implementation in voukoder between July and November that could have caused this? Or is it more likely an issue with something that changed on Youtube's end?

    My HDR uploads are in MKV and worked fine in both September and December, so it's probably not specifically MKV+x265, but it seems to be limited to SDR, or at least, 1080p SDR x265 when combined with MKV.

    I have a feeling this is more likely youtube's fault than Voukoder, but I thought I'd post it here just in case voukoder or x265 is doing something weird causing this. Basically, for the last three 1080p SDR x265 10bit videos I've uploaded to my youtube channel, when I first upload them, I would find random spots in the video where it freezes for a few seconds, and then resumes playback. The first time, reuploading was able to fix the issue. The second time, I contacted support, and they were able to re-process the video, and it worked fine. Today I just uploaded another and had the same issue. The videos play fine on the computer.

    YUV 4:2:0 10bit

    CRF 18

    Everything else default. I used 10bit because I'm doing HDR-to-SDR color grading and I want Premiere to process everything at full bit depth to avoid banding, and since I'm using 10bit, I'm using x265 for that. I believe my most recent 4K HDR video processed correctly, but it's possible it had the issue too and I forgot. I know for sure it was happening on my 1080p SDR videos though. The first time I noticed this was in November, which was my first upload in this format since July. That one worked correctly, as did the 4K HDR uploads I uploaded in September.

    EDIT: I remembered that the time I reuploaded to fix it, I had remuxed to MP4 first. I just reuploaded my MKV and had the same issue, so I'm going to remux and reupload the MP4 version. Perhaps an issue with the MKV container? Will update after uploading MP4.

    EDIT 2: MP4 appears to be working fine
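    For anyone hitting the same playback freezes: remuxing is quick because no re-encode happens, so the quality is untouched. A sketch with made-up file names (the first command just fabricates a small MKV to stand in for the real upload):

```shell
# Small MKV standing in for the real upload:
ffmpeg -y -f lavfi -i testsrc=duration=1:size=320x180:rate=30 \
  -c:v libx264 -pix_fmt yuv420p upload.mkv

# Remux only: copy the existing streams into an MP4 container, no re-encode
ffmpeg -y -i upload.mkv -c copy upload.mp4
```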

    Hey Katana, nice to see you here 😁

    Okay, a lot to work with here haha

    First of all, I'm not sure how Vegas works with HDR content, as I haven't used the program in years. Based on your output, I'm guessing it is likely outputting in the correct format, the raw PQ/2020 output, which indeed looks "washed out" on an SDR monitor. If you had the ability to output to a monitor that could display that information properly in HDR mode, it would most likely look fine.

    HDR uses a different "gamma" and color gamut than SDR, so if a program interprets it as if it were SDR data, it looks wrong. That's fine in this case, because that's exactly what we need for Voukoder to work properly with the footage. The problem with Premiere is that Premiere would "correct" the display to match the SDR gamma and colors, which meant all that additional HDR information was lost on output to Voukoder. Thankfully, based on your results, I would guess that's not happening with Vegas.

    As for Limited/Full: since you're capturing internally on the PS5, I don't think this will matter, because the video files will be encoded with whichever range the PS5 was set to, and Vegas will almost certainly interpret them correctly. When you output in Voukoder (yes, the output should be set to Limited), the editor/Voukoder correctly converts the input color range to the output range. If the input is Full, Voukoder will convert that full range to limited. If the input is limited, most likely the editor will still be interpreting it as full range floating point RGB, and Voukoder will then correctly map that to limited range. While you can use Full range, it's not how HDR was designed to be used, so I wouldn't recommend it (there's also no reason to use it).

    As for what to set your PS5 to, it depends on your TV and how your TV is calibrated. Some TVs work fine with automatic settings, whether on the TV, the console, or both, but some need this adjusted manually (even if they claim automatic works), and in those cases I recommend Limited range, because other devices will be using limited range and the TV settings will then match up correctly. That being said, forcing Limited range on the PS5 will likely result in fewer rounding errors and less banding in the final output, as values will line up perfectly with the 10bit limited output, so it might be a good idea to use that.

    EDIT: noticed you also have some kind of external capture as well. Assuming this capture device isn't HDMI 2.1, then most likely the PS5 is outputting in YUV422 for HDR, and the output of the PS5 will always be Limited in this mode, even if you have it set to full, and this is good because most external capture devices will expect a "legal range" (limited) signal from YUV outputs, not full range.

    As for why your video looks darker on youtube, this is most likely youtube's automatic tonemapping. If you aren't using my HDR metadata tool to analyze frames from the full video for the MaxCLL and MaxFALL values (I don't see that metadata in your MediaInfo outputs, so I'd guess you're not), then youtube has to make assumptions about the HDR content of the video and try to map it into SDR without severe clipping or crushing of highlight information.

    I personally use a process to create my own LUT, which totally controls the color grading and tonemapping for the full video, but that's a pretty complicated process as well; if you're not familiar with color grading and LUT creation, it's better to determine the correct MaxCLL and MaxFALL values and let youtube handle the rest. It may still end up looking dimmer than you'd expect, because HDR in general tends to be brighter than SDR in many cases and youtube will need to darken that to avoid looking overexposed, but as long as it's presentable I wouldn't worry too much about that.
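    If you do measure MaxCLL/MaxFALL yourself, x265 can embed them at encode time. This is a sketch only: the 1000/400 nit values are made up (measure your own), the testsrc clip stands in for real graded footage, and a 10-bit-capable libx265 build is assumed:

```shell
# Synthetic clip standing in for graded HDR footage:
ffmpeg -y -f lavfi -i testsrc=duration=1:size=320x180:rate=30 \
  -pix_fmt yuv420p graded.mp4

# max-cll takes "MaxCLL,MaxFALL" in nits; 1000,400 are placeholder values.
# colorprim/transfer/colormatrix signal the BT2020 + ST2084 (PQ) spaces.
ffmpeg -y -i graded.mp4 -c:v libx265 -pix_fmt yuv420p10le -crf 14 \
  -x265-params "colorprim=bt2020:transfer=smpte2084:colormatrix=bt2020nc:max-cll=1000,400" \
  hdr10.mkv
```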

    The files are recorded USING NVENC in the first place.

    Right, but they're recorded from raw RGB. You lose quality on the initial recording, as NVENC is highly lossy, and you lose quality again on the re-encode, and then you'll lose quality a third time on the final post-edit encode, and then again when youtube processes it. Ideally, you want to minimize the number of times you re-encode to highly lossy codecs like that wherever possible. You can do it, and it will work, but just know you do lose quality.

    While there's technically a theoretical loss of quality on ProRes, it's a production quality codec, meaning it's used for things like post production because you can actually encode it multiple times without any visible loss of quality, at least to the human eye. That's why it ends up making such large files of course, but still.

    Yeah, I don't have 2 monitors yet. And that's a big upgrade from a 970.

    Anyways, thanks for keeping this thread alive, it's really helpful. Maybe when I actually start making content I'll make a video form of this guide.

    I meant the Inferno would act as the second monitor. Yeah I can't wait. I was tempted with the last cards that came out, but I wanted to give RTX some time to mature before jumping in. Last gen didn't quite feel like it was there for me just yet.

    Video guide would be a great idea!

    Yeah they're 4:2:0. I don't intend to record 144hz, but the port on the Ninja Inferno would have to be capable of taking that signal.

    That's not necessarily true. If you use two outputs on your PC, you can set the GPU to mirrored mode. I believe you can set refresh rates of each monitor to be different. Basically, the first monitor acts as a main monitor, and then the secondary output would simply repeat that signal at a different refresh rate. While you could probably do it at 144hz for Monitor 1 and 60Hz for the inferno, 120/60 would probably produce a more pleasant looking recording.

    I could be wrong about the way this works though. Haven't tested it yet.

    I honestly think waiting for my hardware upgrade and then either using Premiere's HEVC encoder or the FFmpeg method will be the best solution, at least until HDMI 2.1 comes to capture cards. I'll have to test render times both ways though.

    Yeah I'd wait to see how that goes. I'm also looking forward to upgrading my GPU soon. I'm still on a GTX970 and I plan to go RTX3080 most likely (unless AMD really impresses me). I do most of my gaming on PS4 (and soon PS5, got my preorder yay) but plenty of things I do in Premiere can certainly be improved with a better GPU, and there are some games I do intend to play that can make use of it, like the new Flight Simulator, and other games I have that I'd like to be able to play at higher resolutions.

    Does the Ninja Inferno require a capture PC to use? Even if it does, it's gonna need HDMI 2.1 support, because I'll be running games at 4k 144hz 10 bit 4:4:4.

    No, it doesn't require a PC; it's a standalone capture device. It does 4K, but only at 60Hz. I think 4:2:2 in HDR, but otherwise it can do 4:4:4 I believe (although there may be throughput issues with the SATA drive attached, not sure). Note that your recordings from NVIDIA are probably 4:2:0, as are YouTube's, and I don't think youtube goes over 60fps even if NVIDIA records higher than that. I'm not sure if you can set mirrored monitor outputs with HDR, but if you can, then you could record at 60Hz on the Inferno and play at 144Hz on your main monitor (although 120Hz may be better to evenly match the refreshes). Audio might be tricky to set up, though. I haven't done PC recordings with it yet, so I don't know the best way to set up audio so you can both hear and record it.

    95-100% utilization on the CPU for an hour. I am working from a hard drive, in the future when I have more SSD space, I may end up temporarily moving relevant videos to an SSD for the editing and exporting process.

    Not sure about how the Ninja Inferno works out for me. But regardless I'm getting a 4950x/5950x (depending on what AMD calls it) and a 3090 soon, so I won't have to resort to only using Voukoder NVENC for convenience.

    If there's one thing I've noticed in my time preparing to make 100% HDR gaming content for my channel, it's that 99% of the available resources are complex and poorly built. Premiere suffers from a lack of GPU acceleration while editing and especially while exporting HDR, plus issues with Rec709 and BT2020. FFmpeg uses the command prompt, which can be confusing for people who don't speak command line. The other editing software has pros and cons as well, but is a little worse for editing. Point is, everything falls a little short, and it's annoying.

    Yeah, the tools just aren't all there yet. If you only need to make simple cuts, you could always do something like AVIDemux, but if you're doing more complex editing, you're going to need to use one of the methods we previously discussed. So the options are

    • Import the HDR video natively, export with Premiere's HEVC codec
    • Import the HDR video natively, use my preset to prepare the video for Voukoder and deal with slightly inaccurate color
    • Use the command line approach instead of the preset to make the video Voukoder ready

    All time consuming options, unfortunately. It'll probably be a while before things are more accelerated and user friendly. Most people just don't bother with trying to capture and edit HDR yet, so it'll be a while before these tools catch up I think. Thankfully, there ARE options that work, they're just not optimal yet. Even just a couple years ago you probably wouldn't have even been able to do this much.

    It appears to have worked. Uploading to Youtube to compare side by side as I'm posting this.

    Although this sort of defeats the purpose of why I personally use Voukoder (NVENC HDR). It takes my 3600 two hours to render my videos with the normal Premiere HEVC encoder, versus 20 minutes for NVENC HEVC with Voukoder. This method took 1 hour in FFmpeg and another 20 minutes in Premiere, so while it IS faster, I still have to choke slam my poor CPU to do it.

    Thanks for the help on this, thanks for the incredible post, but it looks like i'll be waiting for a CPU upgrade!

    Is ProRes encoding really using your CPU heavily? I'd think the HDD speed would mostly be the bottleneck there (unless, I guess, you're using a super fast SSD), as it's a fairly low-compression codec. That's why it works so well for editing, since the CPU doesn't need to do much to decode the frames, and also why the files end up being so huge. Still, I agree it's not ideal time-wise.

    As another option, if you can afford it, the Ninja Inferno is what I use to capture HDR footage natively to ProRes via HDMI. In order to get a format Voukoder can work with properly, I just click the "rec709" button in the recorder to let it format the input signal as rec709 without modifying the input colors at all. It'll look all washed out on the monitor while recording, like it does in premiere when you import that format, but it'll work great with Voukoder!

    I think Premiere simply needs to add better HDR support for third party encoders in order to do this more natively, like the built in HEVC encoder. Although the way it works now does work well if you have an external monitor that you can force into HDR mode, interpreting the signal in a similar way to how Voukoder does while displaying it correctly. Premiere's native HDR monitoring options right now are pretty limited.

    I don't speak command line, how do I specify the file I want to convert? Tried it myself, but uh, had no idea what to change.

    The easiest way is to make sure your video file is in the same folder as ffmpeg. Then just replace "input.mp4" with your video file name. If the file name has spaces, use quotes around it. Or just rename your video file as input.mp4 and you won't need to change anything on the command line.
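    For example, with the file name quoted (made-up names; the second command is the same ProRes conversion from the guide, with the input and an explicit output name quoted):

```shell
# Stand-in for a real recording, deliberately with a space in the name:
ffmpeg -y -f lavfi -i testsrc=duration=1:size=320x180:rate=30 \
  -pix_fmt yuv420p "My Clip.mp4"

# Quotes are required whenever the file name contains spaces:
ffmpeg -y -i "My Clip.mp4" -c:v prores_ks -profile:v 2 \
  -vf "setparams=color_primaries=bt709:color_trc=bt709:colorspace=bt709" \
  -c:a copy "My Clip.mov"
```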

    Yeah, as I explained earlier, for Voukoder to work properly, Premiere basically can't know that your video is HDR. If Premiere knows it's HDR, it will convert it into SDR before sending it to Voukoder, which changes the transfer function and color primaries and also clips highlights. And if you send an SDR video to Voukoder with HDR settings, it causes extreme expansion of the brightness and color, so it will look quite wrong. If you use the preset I uploaded, you'll get pretty close, but there's a rec2020-to-rec709-to-rec2020 conversion happening in that case, and you're going to lose some color accuracy.

    If you want to retain perfect color reproduction in HDR through Voukoder, then as said on the previous page, if Premiere is recognizing the recorded video as HDR (which it will be if youtube did), then you need to first trick it into thinking it's SDR, by removing HDR metadata. The only successful way we were able to figure out how to do that is by re-encoding through ffmpeg. One of the faster ways to do this which retains good quality is by using the ProRes codec. Here is a command line for converting your video to a format that you can use in premiere with voukoder to encode in HDR:

    ffmpeg -i input.mp4 -c:v prores_ks -profile:v 2 -vf "setparams=color_primaries=bt709:color_trc=bt709:colorspace=bt709" -c:a copy output.mov

    The output file here will be very large, so make sure to have a lot of space. For a 4K60fps recording I have, recorded with the same codec on my Ninja Inferno, I get files that are about 435 GB per hour.

    Not sure if you were aware of this, but there is a beta program for Premiere now that includes the ability to select your own input color space, between rec709 and rec2100. I've downloaded it and begun to experiment, but am having no luck. It seems to only work with ProRes and Sony clips embedded with rec2100 HLG and nothing else; the PQ workflow is stated as not being functional yet. You can even alter embedded color space information, but only if it's ProRes or Sony, in the Modify > Interpret Footage section. If only it worked for HEVC clips, then I would be in business. But anyway, the beta Premiere is very intriguing, as it has a lot more options.…flow/td-p/11122932?page=1

    Sounds interesting, especially if you're working with HLG footage. Perhaps in the future they'll add a native PQ option, but I'm not sure whether this actually affects how the RGB data is sent to third party exporters though. That may still be automatically sent with the rec709 conversion. So I think you would still need to use Adobe's native HDR export options, but at least you'd be able to monitor it more correctly with an HDR monitor, and therefore perform color grading and stuff more precisely.

    For now, my guide seems to be the best way to prepare HDR footage for voukoder, but hopefully that will change in the future if Adobe allows for more customization in what type of output formats are sent to exporters.