==Images to video

With numbered frames, the numbering must start at 0: "frame0000.png", "frame0001.png", "frame0002.png", ... (to start at a different number, use ''-start_number'', shown further below).
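For example, a minimal sketch (the 25 fps rate, CRF value and output name are arbitrary, mirroring the jpg example below):

<code bash>
in="frame%04d.png"          # frame0000.png, frame0001.png, frame0002.png, ...
out="frames-to-video.mp4"
ffmpeg -framerate 25 -i "$in" -c:v libx264 -crf 18 -pix_fmt yuv420p "$out"
</code>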
===Animated webp

===Jpegs to mp4
+ | |||
+ | 1 .jpg image = 1 frame : | ||
<code bash>
in="image%03d.jpg" # image000.jpg to image999.jpg
out="image-to-video.mp4" | out="image-to-video.mp4" | ||
ffmpeg -framerate 25 -i "$in" -c:v libx264 -crf 18 -pix_fmt yuv420p "$out" | ffmpeg -framerate 25 -i "$in" -c:v libx264 -crf 18 -pix_fmt yuv420p "$out" | ||
</code>

To start on a specific numbered frame, use ''-start_number''. Example starting at "image150.jpg":

<code bash>
+ | in="image%03d.jpg" # image000.jpg to image999.jpg | ||
+ | out="image-to-video.mp4" | ||
+ | ffmpeg -start_number 150 framerate 25 -i "$in" -c:v libx264 -crf 18 -pix_fmt yuv420p "$out" | ||
+ | </code> | ||
+ | |||
Slideshow with 1 .jpg = 1 second:

<code bash>
fps=25
out="slideshow-glob.mp4"
ffmpeg -framerate 1 -pattern_type glob -i '*.jpg' -c:v libx264 -crf 20 -g $fps -pix_fmt yuv420p -r $fps "$out"
</code>
for t in {0..11}; do ffmpeg -nostats -i "$in" -map 0:a:$t -filter_complex ebur128 -f null - > $(printf "ebur-$in-a%02d.txt" $t) 2>&1 ; done
+ | |||
+ | |||
+ | ==Delay track | ||
+ | To add a delay to the start of an input, use ''-itsoffset offset'' before the input file. | ||
+ | |||
+ | See [[https://superuser.com/a/983153/53547|this answer]] for [[https://superuser.com/questions/982342/|In ffmpeg, how to delay only the audio of a .mp4 video without converting the audio?]] | ||
+ | |||
+ | For example, to merge an audio and a video file, but having the audio start 440 ms. later (11 frames at 25 fps): | ||
+ | |||
+ | ffmpeg -i "$video_file" -itsoffset 0.440 -i "$audio_file" -map 0:v -map 1:a -c copy "ff-resync-video_audio.mp4" | ||

===Syntax for time duration
''[-][HH:]MM:SS[.m...]'' or ''[-]S+[.m...][s|ms|us]''
See https://ffmpeg.org/ffmpeg-utils.html#time-duration-syntax
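For example, the 440 ms offset above can be written in any of these equivalent forms (the output name here is arbitrary; the same syntax applies to ''-ss'', ''-t'', etc.):

<code bash>
ffmpeg -i "$video_file" -itsoffset 0.440       -i "$audio_file" -map 0:v -map 1:a -c copy "out.mp4"
ffmpeg -i "$video_file" -itsoffset 440ms       -i "$audio_file" -map 0:v -map 1:a -c copy "out.mp4"
ffmpeg -i "$video_file" -itsoffset 00:00:00.44 -i "$audio_file" -map 0:v -map 1:a -c copy "out.mp4"
</code>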

== Timecode
=== Change existing timecode

<code bash>
f=OriginalFile.mxf
newtc="00:00:00:00"
out=NewTC_File.mxf
ffmpeg -i "$f" -map 0:v -map 0:a -codec copy -movflags use_metadata_tags -timecode "$newtc" "$out"
</code>
+ | |||
+ | === Add pseudo-timecode | ||
Example with Fuji .MOV files, to add the "Create Date" time as a timecode: | Example with Fuji .MOV files, to add the "Create Date" time as a timecode: | ||
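A minimal sketch of one way to do it (the glob pattern, ''-map 0 -c copy'' and the output naming are assumptions, adapt to the actual files):

<code bash>
for f in *.MOV; do
    # exiftool prints e.g. "Create Date : 2024:01:28 13:08:14"; keep the time-of-day part
    t=$(exiftool -CreateDate "$f" | awk '{print $NF}');
    tc="$t:00";                     # append a frames field: HH:MM:SS -> HH:MM:SS:FF
    ffmpeg -i "$f" -map 0 -c copy -timecode "$tc" "tc-$f";
done
</code>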
+ | |||
+ | If the exiftool command gives the error "End of processing at large atom (LargeFileSupport not enabled)", add the ''-api largefilesupport=1'' option so the exiftool line becomes: | ||
+ | |||
+ | <code bash>t=$(exiftool -api largefilesupport=1 -CreateDate "$f" | awk '{print $NF}');</code> | ||
+ | |||
+ | For .wav files from a Zoom, instead of ''-CreateDate'', use ''-DateTimeOriginal'' : | ||
+ | <code bash>t=$(exiftool -DateTimeOriginal "$f" | awk '{print $NF}'); tc="$t:00";</code> | ||
==Diff | ==Diff |