ffmpeg [2022-11-05 15:19:16] mi [5.1 AC3 to Stereo]
ffmpeg [2025-03-20 19:39:18] (current) mi [Add pseudo-timecode]
ffmpeg -i "$in"

==webm to mp4

Example:

<code bash>
ffmpeg -i "$in" -tune film -crf 20 -preset slow -b:a 192k -movflags +faststart "${in%%.webm}.mp4"
</code>
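The same options can be applied to every .webm file in the current directory with a small loop (a sketch; the loop variable and output naming are just one way to do it):

<code bash>
for f in *.webm; do
  [ -e "$f" ] || continue   # skip the unexpanded pattern when no .webm exists
  ffmpeg -nostdin -i "$f" -tune film -crf 20 -preset slow -b:a 192k \
    -movflags +faststart "${f%%.webm}.mp4"
done
</code>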
==h264
-maxrate 6M -bufsize 6M
</code>

Or set an average bitrate, without using ''-crf'' (see [[https://trac.ffmpeg.org/wiki/Limiting%20the%20output%20bitrate|Limiting the output bitrate]]):

<code bash>
-b:v 4M -maxrate 6M -bufsize 10M
</code>

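Put together, a complete command using those flags could look like this (the file names, the ''-abr'' suffix and ''-c:a copy'' are illustrative choices, not from the ffmpeg wiki):

<code bash>
f="input.mov"            # hypothetical input file
out="${f%.*}-abr.mp4"
if [ -e "$f" ]; then     # only run when the input exists
  ffmpeg -i "$f" -c:v libx264 -b:v 4M -maxrate 6M -bufsize 10M -c:a copy "$out"
fi
</code>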
==prores
* Prores proxy: -profile:v 0
-vf "scale=iw/2:-2" # same (half width and height), but height divisible by 2
etc.
</code>

==Deinterlace

See https://ffmpeg.org/ffmpeg-filters.html#yadif :

> yadif=mode:parity:deint
> defaults: 0 (send_frame = one frame for each frame) : -1 (auto = automatic detection of field order) : 0 (all = deinterlace all frames)

<code bash>
-vf "yadif=0:-1:0"
</code>
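For example, deinterlacing with those defaults while re-encoding to H.264 (file names and ''-crf 18'' are illustrative choices):

<code bash>
f="interlaced.mov"              # hypothetical input file
out="${f%.*}-progressive.mp4"
if [ -e "$f" ]; then            # only run when the input exists
  ffmpeg -i "$f" -vf "yadif=0:-1:0" -c:v libx264 -crf 18 -c:a copy "$out"
fi
</code>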
ffmpeg -i input.flv -vf "select='eq(pict_type,PICT_TYPE_I)'" -vsync vfr thumb%04d.png

Find all .mov files under some directory, and extract a single Jpeg from each:

<code bash>
pos=2 # get frame at $pos seconds from start of file
search_in=/mnt/x/y
ext="mov"

find "$search_in" -type f -name "*.$ext" -not -name ".*" \
| while IFS= read -r f; do
    n=$(basename "$f")
    [ -f "$n.jpg" ] && n="$n-$RANDOM"   # don't overwrite an existing thumbnail
    ffmpeg -nostdin -hide_banner -loglevel error -ss $pos -i "$f" \
      -r 1 -vframes 1 -f image2 "$n.jpg" \
      && echo "OK $n.jpg"
  done
</code>
See also ffmpeg's [[https://trac.ffmpeg.org/wiki/Create%20a%20thumbnail%20image%20every%20X%20seconds%20of%20the%20video|Create a thumbnail image every X seconds of the video]].
==Images to video

With numbered frames, the numbering must start at 0: "frame0000.png", "frame0001.png", "frame0002.png", ... (by default ffmpeg auto-detects a first number in the range 0 to 4; for any other start, use ''-start_number'').
===Animated webp
Frames to animated .webp :
ffmpeg -i my-animation.gif -c vp9 -b:v 0 -crf 41 my-animation.webm
</code>

===Jpegs to mp4

1 .jpg image = 1 frame :

<code bash>
in="image%03d.jpg" # image000.jpg to image999.jpg
out="image-to-video.mp4"
ffmpeg -framerate 25 -i "$in" -c:v libx264 -crf 18 -pix_fmt yuv420p "$out"
</code>

To start on a specific numbered frame, use ''-start_number''. Example starting at "image150.jpg":

<code bash>
in="image%03d.jpg" # image150.jpg to image999.jpg
out="image-to-video.mp4"
ffmpeg -start_number 150 -framerate 25 -i "$in" -c:v libx264 -crf 18 -pix_fmt yuv420p "$out"
</code>

Slideshow with 1 .jpg = 1 second :

<code bash>
fps=25
out="slideshow-glob.mp4"
ffmpeg -framerate 1 -pattern_type glob -i '*.jpeg' -c:v libx264 -crf 20 -g $fps -pix_fmt yuv420p -r $fps "$out"
</code>

If the size of the frames is uneven (e.g. 512x459), it needs to be set to an even size, or encoding fails with this error:

> ''height not divisible by 2 (512x459)''
> ''Error initializing output stream 0:0 -- Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height''

In this case, specify the size. For example: ''-s 512x460''

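Alternatively, instead of hard-coding ''-s'' per file, a scale filter can round both dimensions up to the nearest even number (a common idiom using the scale filter's expression evaluator):

<code bash>
-vf "scale=ceil(iw/2)*2:ceil(ih/2)*2"   # e.g. 512x459 -> 512x460
</code>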
==Audio
for t in {0..11}; do ffmpeg -nostats -i "$in" -map 0:a:$t -filter_complex ebur128 -f null - > $(printf "ebur-$in-a%02d.txt" $t) 2>&1 ; done

==Delay track
To add a delay to the start of an input, use ''-itsoffset offset'' before the input file.

See [[https://superuser.com/a/983153/53547|this answer]] to [[https://superuser.com/questions/982342/|In ffmpeg, how to delay only the audio of a .mp4 video without converting the audio?]]

For example, to merge an audio and a video file, but have the audio start 440 ms later (11 frames at 25 fps):

<code bash>
ffmpeg -i "$video_file" -itsoffset 0.440 -i "$audio_file" -map 0:v -map 1:a -c copy "ff-resync-video_audio.mp4"
</code>
===Syntax for time duration
''[-][HH:]MM:SS[.m...]'' or ''[-]S+[.m...][s|ms|us]''
See https://ffmpeg.org/ffmpeg-utils.html#time-duration-syntax
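As an illustration of that syntax, the following are all ways of writing an offset of 90 seconds (shown with ''-ss''; the same forms work wherever ffmpeg accepts a time duration):

<code bash>
-ss 90          # plain seconds
-ss 01:30       # MM:SS
-ss 0:01:30.0   # HH:MM:SS.m
-ss 90000ms     # value with a unit suffix (s, ms, us)
</code>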
== Timecode
=== Change existing timecode

<code bash>
f=OriginalFile.mxf
newtc="00:00:00:00"
out=NewTC_File.mxf
ffmpeg -i "$f" -map 0:v -map 0:a -codec copy -movflags use_metadata_tags -timecode "$newtc" "$out"
</code>

=== Add pseudo-timecode
Example with Fuji .MOV files, to add the "Create Date" time as a timecode:
done
</code>

If the exiftool command gives the error "End of processing at large atom (LargeFileSupport not enabled)", add the ''-api largefilesupport=1'' option, so the exiftool line becomes:

<code bash>t=$(exiftool -api largefilesupport=1 -CreateDate "$f" | awk '{print $NF}');</code>

For .wav files from a Zoom recorder, instead of ''-CreateDate'', use ''-DateTimeOriginal'' :
<code bash>t=$(exiftool -DateTimeOriginal "$f" | awk '{print $NF}'); tc="$t:00";</code>
==Diff
$ ffmpeg -i "$filename" -map 0:a -codec copy -hide_banner -loglevel warning -f md5 -

$ ffmpeg -i "$filename" -map 0 -c copy -f streamhash -hash md5 -
...
0,v,MD5=50224fec84bc6dfde90d742bcf1d2e01
1,a,MD5=1d2a32ed72798d66e0110bd02df2be65

$ ffmpeg -i "$f" -map 0 -c copy -f streamhash -hash md5 "$f.stream.md5"
See also:
# or
-hide_banner -loglevel info -stats</code>

===Edit metadata===

See [[https://superuser.com/questions/834244/how-do-i-name-an-audio-track-with-ffmpeg/835069|How do I name an audio track with ffmpeg]]:

<code bash>
ffmpeg -i input.mp4 -map 0 -c copy -metadata:s:a:0 title="One" -metadata:s:a:1 title="Two" -metadata:s:a:0 language=eng -metadata:s:a:1 language=spa output.mp4
</code>
==Examples