efca325 already increased the default bitrate for various encoders.
However, some were missed and this commit fills the gap:
* Mac-VT
* OpenH264
* Texture AMF
* VAAPI
The default of 2500 kbps was chosen 10 years ago and times have changed.
Logs and forum posts show that many users who record with OBS don't
change their bitrate and end up with a 2.5 Mbps recording that looks
terrible.
Now that service bitrate enforcement exists, this will automatically be
capped at the maximum bitrate for streaming services, so the only time
this should cause a problem is when the user's upload speed is the
limiting factor, which is hopefully rarer these days given increasing
internet speeds.
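For reference, raising such a default is a one-line change in each
encoder's defaults callback; a minimal sketch, assuming the conventional
"bitrate" setting name (the 6000 value is illustrative, not the exact
value chosen per encoder):

    static void encoder_defaults(obs_data_t *settings)
    {
        /* Previously 2500 kbps; untouched recording presets no longer
         * end up as a 2.5 Mbps file. */
        obs_data_set_default_int(settings, "bitrate", 6000);
    }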
Log the codec level being used by AMF-based encoders after
ffmpeg_opts have been parsed. Users could have manually changed the
codec level, so query the level via AMF and then determine the string
to log.
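As a rough sketch of the idea for the AVC case, assuming AMF's
GetProperty call and the AMF_VIDEO_ENCODER_PROFILE_LEVEL property,
whose enum values encode the level in tenths (HEVC and AV1 use their
own properties and encodings):

    amf_int64 level = 0;
    /* Query the level actually in effect after ffmpeg_opts were
     * applied, since the user may have overridden it there. */
    if (amf_encoder->GetProperty(AMF_VIDEO_ENCODER_PROFILE_LEVEL,
                                 &level) == AMF_OK)
        blog(LOG_INFO, "\tlevel:       %d.%d", (int)(level / 10),
             (int)(level % 10));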
The default "level" setting was being used for each codec (AVC, HEVC,
AV1) supported by AMF. For example, all HEVC encoders were using
level 6.2, which caused some playback devices to reject the
bitstream because they reported a maximum supported decode level
lower than 6.2.
Add functionality to determine the best match for the codec level
instead of relying on the defaults.
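The general approach is to pick the smallest level whose limits cover
the encode. A simplified sketch for HEVC, using an abbreviated subset
of the spec's level limits (the real selection also has to consider
tier, bitrate caps, and the other codecs):

    #include <stdint.h>

    struct hevc_level_limit {
        int level;            /* level in tenths, e.g. 51 = 5.1 */
        uint64_t max_luma_ps; /* max luma picture size (samples) */
        uint64_t max_luma_sr; /* max luma sample rate (samples/s) */
    };

    static const struct hevc_level_limit hevc_levels[] = {
        {40, 2228224, 66846720},   {41, 2228224, 133693440},
        {50, 8912896, 267386880},  {51, 8912896, 534773760},
        {62, 35651584, 4278190080},
    };

    static int best_hevc_level(uint32_t cx, uint32_t cy,
                               uint32_t fps_num, uint32_t fps_den)
    {
        uint64_t luma_ps = (uint64_t)cx * cy;
        uint64_t luma_sr = luma_ps * fps_num / fps_den;

        for (const struct hevc_level_limit &l : hevc_levels) {
            if (luma_ps <= l.max_luma_ps && luma_sr <= l.max_luma_sr)
                return l.level; /* smallest level that fits */
        }
        return 62; /* nothing smaller fits; keep the maximum */
    }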
Use the recently added `obs_encoder_video_tex_active()` API
for AMD AMF-based encoders, similar to the recent commit for
obs-nvenc. This allows the OBS canvas to use non-NV12 pixel
formats (such as I444) while the multitrack video encoders will
use NV12 or P010 textures converted using the GPU.
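A minimal sketch of the check, assuming the same usage pattern as in
obs-nvenc (the surrounding fallback logic is illustrative):

    /* Only take the texture-sharing path when the video output feeding
     * this encoder actually produces GPU-convertible NV12/P010 frames;
     * otherwise fall back to the non-texture encoder. */
    if (!obs_encoder_video_tex_active(encoder, VIDEO_FORMAT_NV12) &&
        !obs_encoder_video_tex_active(encoder, VIDEO_FORMAT_P010))
        return false;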
The frame rate used to initialize an AMF encoder should be aligned
with the derived frame rate in video_output_info instead of the global
obs_video_info structure. With this change, IDRs can be aligned when
multiple renditions are being encoded.
Using video_output_info members for the format, colorspace, and range
parameters in addition to the frame rate provides a single source for
this information, and obs_video_info is no longer needed.
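Conceptually, initialization now reads everything from the encoder's
own video output, roughly like this (the enc field names are
illustrative):

    video_t *video = obs_encoder_video(enc->encoder);
    const struct video_output_info *voi = video_output_get_info(video);

    /* Single source of truth for this rendition: frame rate, format,
     * colorspace, and range all come from video_output_info rather
     * than the global obs_video_info. */
    enc->fps_num = voi->fps_num;
    enc->fps_den = voi->fps_den;
    enc->format = voi->format;
    enc->colorspace = voi->colorspace;
    enc->range = voi->range;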
The frame rate should be defined before AMF encoder initialization
because this information is used to set vui_time_scale in the SPS.
If the frame rate isn't defined before initialization, the AMF
encoder writes its default frame rate (30 fps) into the VUI header.
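For the AVC encoder the ordering looks roughly like this; the property
and helper names follow the AMF SDK and are an assumption here, and
HEVC/AV1 have their own equivalents:

    /* Set the frame rate before Init() so AMF derives vui_time_scale
     * in the SPS from the real rate instead of its 30 fps default. */
    amf_encoder->SetProperty(AMF_VIDEO_ENCODER_FRAMERATE,
                             AMFConstructRate(fps_num, fps_den));

    res = amf_encoder->Init(AMF_SURFACE_NV12, cx, cy);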
If a user sets both AdaptiveMiniGOP=true and EnablePreAnalysis=true
in the AMF/FFmpeg options field, AMF will adaptively insert
B-pictures instead of using the fixed B-frame pattern.
With a fixed B-frame pattern, increasing the number of B-frames can
cause a quality drop for certain content, such as high-motion scenes.
AdaptiveMiniGOP is recommended when using B-frames to improve quality
in such cases. AdaptiveMiniGOP depends on PreAnalysis, so trying to
enable it without PreAnalysis turned on has no negative effect
(AdaptiveMiniGOP simply won't be enabled).
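In practice that just means putting both key=value pairs in the
options field, for example:

    AdaptiveMiniGOP=true EnablePreAnalysis=true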
The P216 and P416 color formats were not explicitly handled when
selected with AMF, so the switch statements ended up in the default
case. If the user had also selected a Rec. 2100 color space, this
resulted in the strange error message:
"OBS does not support 8-bit output of Rec. 2100."
This message is confusing and does not correctly reflect the chosen
settings. Let's explicitly handle the P216/P416 cases and provide a
more accurate error message.
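A sketch of the explicit handling; the locale key and surrounding
structure are illustrative:

    switch (voi->format) {
    case VIDEO_FORMAT_P216:
    case VIDEO_FORMAT_P416:
        /* Explicit cases instead of falling through to "default", so
         * the error reflects the format that was actually selected. */
        obs_encoder_set_last_error(enc->encoder,
                                   obs_module_text("AMF.16bitUnsupported"));
        return false;
    default:
        break;
    }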
On systems with multiple graphics adapters, one card can be
configured as power saving and another as performance. Sometimes OBS
and the encoder test subprocesses will not be configured the same
way, so it's necessary to provide the adapter order to the encoder
test subprocesses.
This change ensures a consistent adapter order by passing the LUIDs
to the test subprocesses. The adapter indexes are then updated
accordingly.
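Roughly, the parent process serializes each adapter's LUID for the
test subprocess; the packing and command-line shape here are
illustrative:

    DXGI_ADAPTER_DESC desc;
    adapter->GetDesc(&desc);

    /* Pack the LUID into one value and pass it on the subprocess's
     * command line; the subprocess enumerates its own adapters,
     * matches LUIDs, and remaps adapter indexes to match. */
    const uint64_t luid =
        ((uint64_t)(uint32_t)desc.AdapterLuid.HighPart << 32) |
        desc.AdapterLuid.LowPart;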
(Jim note: The missing description for this commit is that, basically,
the CAP_MAX_THROUGHPUT value returns a different result based on what
the other settings are currently set to. It didn't operate the way it
was originally programmed.)