
This page includes the following examples:

- Decode Only Into Multiple-Resolution Outputs
- Encode Only Into Multiple-Resolution Outputs
- Transcode With Multiple-Resolution Outputs
- Lower-Latency Transcode With Multiple-Resolution Outputs

This example uses the multiscale_xma, split, fps, and xvbm_convert filters to create 5 output resolutions from the input stream. The FFmpeg -filter_complex flag allows combining multiple filters together using a graph-like syntax.

The multiscale_xma filter configures the Xilinx hardware-accelerated scaler to produce 4 output resolutions (1280x720p60, 848x480p30, 640x360p30, and 288x160p30). The 4 outputs of the multiscale_xma filter are identified as a, b, c, and d respectively. For each output, the width, height, and frame rate are defined with out_width, out_height, and out_rate. The split and fps software filters are then used to split the a stream into aa and ab, and to drop the frame rate of ab to 30 fps, producing the fifth output: the abb 1280x720p30 stream. Each scaled stream is passed through the xvbm_convert filter so that the frames are copied out of device memory before being written to raw files on the host.

ffmpeg -f rawvideo -s 1920x1080 -r 60 -pix_fmt yuv420p -i … \
    -filter_complex "multiscale_xma=outputs=4: … \
    … xvbm_convert … xvbm_convert … xvbm_convert … xvbm_convert …" \
    -map "…" -pix_fmt yuv420p -f rawvideo /tmp/xil_dec_scale_720p60.yuv \
    -map "…" -pix_fmt yuv420p -f rawvideo /tmp/xil_dec_scale_720p30.yuv \
    -map "…" -pix_fmt yuv420p -f rawvideo /tmp/xil_dec_scale_480p30.yuv \
    -map "…" -pix_fmt yuv420p -f rawvideo /tmp/xil_dec_scale_360p30.yuv \
    -map "…" -pix_fmt yuv420p -f rawvideo /tmp/xil_dec_scale_288p30.yuv

The options used in these examples:

ffmpeg
    The ffmpeg application, which is provided by Xilinx and moved to the top of the PATH when you sourced the setup.sh script.

-f rawvideo
    Signifies that the video is in a raw format, without a container or other metadata/information about the clip.

-s 1920x1080
    Since there is no container or metadata in a RAW clip, the user must define the input clip's resolution/size. This example states that the input is 1080p.

-r 60
    Again, without metadata, the encoder requires the frame rate of the incoming stream.

-pix_fmt yuv420p
    The color space of the encoder is yuv420p by default; this example defines the input clip as being in this same color space.

-b:v 8M
    The target bitrate of the encoded stream. 8M signifies a target bitrate of 8 Megabits per second. You can also use 8000K or 8000000.

-c:v
    Declares the codec for video (as opposed to audio, -c:a). The codec named here is the hardware-accelerated decoder or encoder in the Xilinx device.

-y
    Enable overwrite without prompting the user if they're sure.
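The graph-like syntax accepted by -filter_complex can be illustrated by assembling the scaler graph string in code. This is a sketch only: the indexed parameter names (out_1_width, …) and the pad-label placement are assumptions for illustration, and the out_rate parameters are omitted; consult the Xilinx Video SDK documentation for the authoritative filter syntax.

```python
# Sketch: assembling a -filter_complex graph string like the one in this
# example. Indexed parameter names (out_1_width, ...) and pad labels are
# illustrative assumptions; out_rate settings are omitted.

outputs = [          # (width, height, output pad label)
    (1280, 720, "a"),
    (848, 480, "b"),
    (640, 360, "c"),
    (288, 160, "d"),
]

def build_graph(outputs):
    parts = [f"multiscale_xma=outputs={len(outputs)}"]
    for i, (w, h, _) in enumerate(outputs, start=1):
        parts += [f"out_{i}_width={w}", f"out_{i}_height={h}"]
    pads = "".join(f"[{label}]" for _, _, label in outputs)
    scaler = ":".join(parts) + pads
    # split duplicates the 720p60 stream; fps drops one copy to 30 fps
    return "; ".join([scaler, "[a]split[aa][ab]", "[ab]fps=30[abb]"])

graph = build_graph(outputs)
print(graph)
```

Chaining filters with ";" and naming intermediate streams with bracketed labels is what lets one hardware scaler pass feed five distinct outputs.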
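The equivalence noted for -b:v (8M, 8000K, and 8000000 all meaning 8 Megabits per second) can be checked with a small helper. parse_bitrate is a hypothetical name, and the decimal multipliers follow the 8M/8000K example above.

```python
# Hypothetical helper mirroring the bitrate notation accepted by -b:v.
# Assumes decimal SI multipliers, consistent with 8M == 8000K == 8000000.

def parse_bitrate(value):
    multipliers = {"K": 1_000, "M": 1_000_000, "G": 1_000_000_000}
    value = value.strip()
    suffix = value[-1].upper()
    if suffix in multipliers:
        return int(float(value[:-1]) * multipliers[suffix])
    return int(value)

print(parse_bitrate("8M"), parse_bitrate("8000K"), parse_bitrate("8000000"))
```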
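Because the outputs are raw yuv420p with no container, their sizes follow directly from the resolution and frame rate: yuv420p stores a full-resolution luma plane plus two quarter-resolution chroma planes, i.e. 1.5 bytes per pixel. A quick sketch of the per-frame and per-second sizes of the five raw outputs in this example:

```python
# yuv420p frame size: Y plane at full resolution plus U and V planes at
# half resolution in each dimension, so 1.5 bytes per pixel total.

def yuv420p_frame_bytes(width, height):
    luma = width * height                       # Y plane, 1 byte/pixel
    chroma = 2 * (width // 2) * (height // 2)   # U and V planes
    return luma + chroma

# The five outputs produced in this example (resolution, fps):
streams = {
    "/tmp/xil_dec_scale_720p60.yuv": (1280, 720, 60),
    "/tmp/xil_dec_scale_720p30.yuv": (1280, 720, 30),
    "/tmp/xil_dec_scale_480p30.yuv": (848, 480, 30),
    "/tmp/xil_dec_scale_360p30.yuv": (640, 360, 30),
    "/tmp/xil_dec_scale_288p30.yuv": (288, 160, 30),
}

for path, (w, h, fps) in streams.items():
    per_frame = yuv420p_frame_bytes(w, h)
    print(f"{path}: {per_frame} bytes/frame, "
          f"{per_frame * fps / 1e6:.1f} MB/s")
```

This same arithmetic is why the raw input must be described with -s, -r, and -pix_fmt: without a container, ffmpeg can only segment the byte stream into frames if it knows the frame size.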
