| -vf "scale=iw/2:-2" # same (half width and height), but height divisible by 2 | -vf "scale=iw/2:-2" # same (half width and height), but height divisible by 2 | ||
| etc. | etc. | ||
</code>

==Deinterlace

See https://ffmpeg.org/ffmpeg-filters.html#yadif :

> yadif=mode:parity:deint
> defaults: 0 (send_frame = one output frame for each input frame) : -1 (auto = automatic detection of field order) : 0 (all = deinterlace all frames)

<code bash>
-vf "yadif=0:-1:0"
</code>
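
For example, deinterlacing while re-encoding the video and copying the audio (a minimal sketch; the filenames and the x264 settings are assumptions, not from this page):

<code bash>
in="interlaced.mov"
out="progressive.mp4"
# yadif=0:-1:0 : one output frame per input frame, automatic field order, deinterlace all frames
ffmpeg -i "$in" -vf "yadif=0:-1:0" -c:v libx264 -crf 18 -c:a copy "$out"
</code>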
==Images to video

With numbered frames, the numbering must start at 0 ("frame0000.png", "frame0001.png", "frame0002.png", ...) unless ''-start_number'' is used (see the Jpegs to mp4 example below).
===Animated webp
===Jpegs to mp4

1 .jpg image = 1 frame:

<code bash>
# set $in to the numbered-image input pattern, e.g. in="image%04d.jpg" for image0000.jpg, image0001.jpg, ...
out="image-to-video.mp4"
ffmpeg -framerate 25 -i "$in" -c:v libx264 -crf 18 -pix_fmt yuv420p "$out"
</code>
| + | |||
| + | To start on a specific numbered frame, use ''-start_number''. Example staring at "image150.jpg": | ||
| + | |||
| + | <code bash> | ||
| + | in="image%03d.jpg" # image000.jpg to image999.jpg | ||
| + | out="image-to-video.mp4" | ||
| + | ffmpeg -start_number 150 framerate 25 -i "$in" -c:v libx264 -crf 18 -pix_fmt yuv420p "$out" | ||
| + | </code> | ||
| + | |||
| + | Slideshow with 1 .jpg = 1 second : | ||
| + | |||
| + | <code bash> | ||
| + | fps=25 | ||
| + | out="slideshow-glob.mp4" | ||
| + | ffmpeg -framerate 1 -pattern_type glob -i '*.jpeg' -c:v libx264 -crf 20 -g $fps -pix_fmt yuv420p -r $fps "$out" | ||
| </code> | </code> | ||
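
To hold each image longer, the input ''-framerate'' can be a fraction. A sketch (same glob pattern assumed) showing each image for 5 seconds:

<code bash>
fps=25
out="slideshow-5s.mp4"
# input framerate 1/5 = a new image every 5 seconds; the output still plays at 25 fps
ffmpeg -framerate 1/5 -pattern_type glob -i '*.jpeg' -c:v libx264 -crf 20 -g $fps -pix_fmt yuv420p -r $fps "$out"
</code>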
for t in {0..11}; do ffmpeg -nostats -i "$in" -map 0:a:$t -filter_complex ebur128 -f null - > $(printf "ebur-$in-a%02d.txt" $t) 2>&1 ; done


==Delay track
To add a delay to the start of an input, use ''-itsoffset offset'' before the input file.

See [[https://superuser.com/a/983153/53547|this answer]] to [[https://superuser.com/questions/982342/|In ffmpeg, how to delay only the audio of a .mp4 video without converting the audio?]]

For example, to merge an audio file and a video file, with the audio starting 440 ms later (11 frames at 25 fps):

  ffmpeg -i "$video_file" -itsoffset 0.440 -i "$audio_file" -map 0:v -map 1:a -c copy "ff-resync-video_audio.mp4"
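
A minimal sketch of the case in the linked question itself (delaying only the audio of one .mp4; the filenames are assumptions), reading the same file twice:

<code bash>
in="input.mp4"
# video from the first read, 440 ms-delayed audio from the second read, no re-encoding
ffmpeg -i "$in" -itsoffset 0.440 -i "$in" -map 0:v -map 1:a -c copy "ff-delayed-audio.mp4"
</code>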

===Syntax for time duration

''[-][HH:]MM:SS[.m...]'' or ''[-]S+[.m...][s|ms|us]''

See https://ffmpeg.org/ffmpeg-utils.html#time-duration-syntax
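
A few equivalent ways of writing values in this syntax (illustration only):

<code bash>
-ss 90             # 90 seconds
-ss 1:30           # MM:SS, also 90 seconds
-ss 00:01:30.5     # HH:MM:SS.m
-itsoffset 440ms   # same as -itsoffset 0.440
</code>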

== Timecode
=== Change existing timecode

  f=OriginalFile.mxf
  newtc="00:00:00:00"
  out=NewTC_File.mxf
  ffmpeg -i "$f" -map 0:v -map 0:a -codec copy -movflags use_metadata_tags -timecode "$newtc" "$out"
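
To check the result (the timecode may appear as a stream tag or a format tag, depending on the container):

<code bash>
ffprobe -hide_banner -loglevel error -show_entries stream_tags=timecode:format_tags=timecode -of default=nw=1 "$out"
</code>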
| + | |||
| + | === Add pseudo-timecode | ||
| Example with Fuji .MOV files, to add the "Create Date" time as a timecode: | Example with Fuji .MOV files, to add the "Create Date" time as a timecode: | ||
done
</code>

If the exiftool command gives the error "End of processing at large atom (LargeFileSupport not enabled)", add the ''-api largefilesupport=1'' option, so the exiftool line becomes:

<code bash>t=$(exiftool -api largefilesupport=1 -CreateDate "$f" | awk '{print $NF}');</code>

For .wav files from a Zoom recorder, use ''-DateTimeOriginal'' instead of ''-CreateDate'':
<code bash>t=$(exiftool -DateTimeOriginal "$f" | awk '{print $NF}'); tc="$t:00";</code>
==Diff
$ ffmpeg -i "$filename" -map 0:a -codec copy -hide_banner -loglevel warning -f md5 -
$ ffmpeg -i "$filename" -map 0 -c copy -f streamhash -hash md5 -
...
0,v,MD5=50224fec84bc6dfde90d742bcf1d2e01
1,a,MD5=1d2a32ed72798d66e0110bd02df2be65

$ ffmpeg -i "$f" -map 0 -c copy -f streamhash -hash md5 "$f.stream.md5"
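
Per-packet hashes can also be written to files and compared directly; a sketch with assumed filenames:

<code bash>
ffmpeg -i a.mp4 -map 0 -c copy -f framemd5 a.framemd5
ffmpeg -i b.mp4 -map 0 -c copy -f framemd5 b.framemd5
diff a.framemd5 b.framemd5   # shows the first packets where the two files differ
</code>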
See also:
# or
-hide_banner -loglevel info -stats</code>
| + | |||
| + | ===Edit metadata=== | ||
| + | |||
| + | See [[https://superuser.com/questions/834244/how-do-i-name-an-audio-track-with-ffmpeg/835069|How do I name an audio track with ffmpeg]]: | ||
| + | |||
| + | ffmpeg -i input.mp4 -map 0 -c copy -metadata:s:a:0 title="One" -metadata:s:a:1 title="Two" -metadata:s:a:0 language=eng -metadata:s:a:1 language=spa output.mp4 | ||
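
To check the resulting tags (the output name matches the example above):

<code bash>
ffprobe -hide_banner -loglevel error -show_entries stream=index,codec_type:stream_tags=title,language -of compact output.mp4
</code>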
==Examples