Regarding a dedicated graphics card for encoding, low-end Nvidia Turing cards (except the GTX 1630 and GTX 1650) are the best option at the moment: any Nvidia card from the 1650 Super and up.
Turing cards (16/20-series) produce notably better H.264/H.265 picture quality at the same bitrate than Pascal cards (10-series). To my eye the difference is roughly equivalent to lowering the Constant Quantizer setting by 1, but with the same filesize.
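If you want to eyeball that difference yourself, here's a rough sketch of how I'd run fixed-quantizer NVENC encodes with ffmpeg (wrapped in Python, but it's just a command line). This assumes an ffmpeg build with NVENC support and a Turing-or-newer card; the file names and QP values are placeholders, not a definitive recipe.

```python
# Minimal sketch: constant-quantizer NVENC encodes so you can compare
# picture quality per bitrate yourself. Assumes ffmpeg was built with
# NVENC support; "clip.mkv" etc. are placeholder file names.
import subprocess

def nvenc_constqp_encode(src: str, dst: str, qp: int = 23) -> None:
    """Encode `src` to H.265 with NVENC at a fixed quantizer, copying audio."""
    cmd = [
        "ffmpeg", "-y",
        "-i", src,
        "-c:v", "hevc_nvenc",   # NVENC H.265 encoder
        "-preset", "slow",       # favor quality over speed
        "-rc", "constqp",        # constant-quantizer rate control
        "-qp", str(qp),          # lower QP = higher quality / bigger file
        "-c:a", "copy",          # leave the audio stream untouched
        dst,
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    # Encode the same clip one QP step apart to see roughly the kind of
    # quality delta described above.
    nvenc_constqp_encode("clip.mkv", "clip_qp23.mkv", 23)
    nvenc_constqp_encode("clip.mkv", "clip_qp24.mkv", 24)
```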
Ampere cards (30-series) use the same NVENC module as Turing. The NVDEC decoder has a slight improvement, but not the encoder.
Lovelace cards (40-series) have better NVENC modules, but are outside your $200 range at the moment.
Worth noting too that a higher-end card won't gain you much encoding performance. Turing and Ampere cards all have the same NVENC encode module, which is a separate block on the chip and has nothing to do with how big the GPU is or how many CUDA cores it has. (If you're doing effects or filters that also use CUDA, that's a different story.) The only encoder-relevant difference between a 1650 Super and a 3090 Ti is memory bus width/speed; they both have the same NVENC module.
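Since the encoder block is the same regardless of GPU tier, the only practical check is whether your setup exposes NVENC at all. A quick sketch along these lines works, assuming ffmpeg is on your PATH and lists encoders in its usual format (the parsing here is best-effort, not official API):

```python
# Rough sketch: ask ffmpeg which NVENC encoders it exposes. The GPU tier
# doesn't matter; what matters is that the NVENC encoders show up at all.
import subprocess

def list_nvenc_encoders() -> list[str]:
    """Return the NVENC encoder names ffmpeg reports (e.g. h264_nvenc)."""
    out = subprocess.run(
        ["ffmpeg", "-hide_banner", "-encoders"],
        capture_output=True, text=True, check=True,
    ).stdout
    # Each encoder line looks like " V....D h264_nvenc  NVIDIA NVENC ...";
    # the second field is the encoder name.
    return [line.split()[1] for line in out.splitlines() if "nvenc" in line]

if __name__ == "__main__":
    print(list_nvenc_encoders())  # e.g. ['h264_nvenc', 'hevc_nvenc']
```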
The RTX 2060 Super has a 256-bit memory bus with 14 Gbps GDDR6. Might be close to the sweet spot for you.