infamousfunk@alien.top 1 points 9 months ago

Like others have said, it depends on the source media. In general, grainy sources need more bitrate to hit a given quality than a clean, digitally shot source. You can pick an arbitrary bitrate and encode all your sources with it, but you may not like the results, or your encodes will be needlessly bloated.
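To put rough numbers on why the bitrate choice matters, here is a quick size estimate for an average-bitrate encode. The bitrates and runtime are made-up illustrations, not recommendations:

```python
# Estimate output size for an average-bitrate video encode.
# Numbers below are illustrative examples, not tuning advice.

def encode_size_gib(bitrate_kbps: float, duration_s: float) -> float:
    """File size in GiB for a given average video bitrate and runtime."""
    bits = bitrate_kbps * 1000 * duration_s
    return bits / 8 / 1024**3

movie_s = 2 * 3600  # a two-hour film

# A clean digital source might look fine at a moderate bitrate, while a
# grainy film scan could need roughly double to avoid smearing the grain.
for label, kbps in [("clean source", 6000), ("grainy source", 12000)]:
    print(f"{label}: {encode_size_gib(kbps, movie_s):.1f} GiB")
```

This is also why many people skip guessing a bitrate entirely and use a quality target instead (CRF mode in x264/x265), which lets the encoder spend bits only where the source actually needs them.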

Personally, having used both x264 and x265, I would stick with x264 for 1080p content. Yes, there are some space savings with x265, but the encode time really just isn't worth it, in my opinion. This assumes you're using software encoding and not NVENC or QuickSync. Hardware encoding is much faster but yields larger files and lower quality than software encoding, so again, not really worth it in my opinion.