
source image: https://commons.wikimedia.org/wiki/File:Soyuz_TMA-21_spacecraft_is_lifted_into_position_on_the_launch_pad.jpg
source image: http://getdancewear.com/capezio-adult-practice-tutu-10391.html
#!/bin/bash
# Generate stylised animation from video macroblock motion vectors,
# and present in a side-by-side comparison with original video.
# version: 2018.03.28.21.08.16
# source: https://oioiiooixiii.blogspot.com

cropSize="640:ih:480:0" # Adjust area and dimensions of interest

ffplay \
   -flags2 +export_mvs \
   -i "$1" \
   -vf \
      "
      split [original][vectors];
      [vectors] codecview=mv=pf+bf+bb,
                crop=$cropSize [vectors];
      [original] crop=$cropSize,
                 split=3 [original][original1][original2];
      [vectors][original2] blend=all_mode=difference128,
                           eq=contrast=7:brightness=-0.3,
                           split [vectors][vectors1];
      [vectors1] colorkey=0xFFFFFF:0.9:0.2 [vectors1];
      [original1][vectors1] overlay,
                            smartblur,
                            dilation,dilation,dilation,dilation,dilation,
                            eq=contrast=1.4:brightness=-0.09 [pixels];
      [vectors][original][pixels] hstack=inputs=3
      "
#!/bin/bash
# Extract section of video using time-codes taken from MPV screen-shots
# Requires specific MPV screen-shot naming scheme: screenshot-template="%f__%P"
# N.B. Skeleton script demonstrating basic operation

filename="$(ls -1 *.jpg | head -1)"
startTime="$(cut -d. -f-2 <<< "${filename#*__}")"
filename="${filename%__*}"
endTime="$(cut -d_ -f3 <<< "$(ls -1 *.jpg | tail -1)" | cut -d. -f-2)"

ffmpeg \
   -i "$filename" \
   -ss "$startTime" \
   -to "$endTime" \
   "EDIT__$(unknown)__${startTime}-${endTime}.${filename#*.}"

Another approach to this (and perhaps a more sensible one) is to script it all through MPV itself. However, that ties the technique down to MPV, whereas this 'screen-shot' idea can be used with any media player that offers timestamps in the filename. It's also a little more tangible: you can create a series of screen-shots and later decide which ones are timed best.
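The script leans on bash parameter expansion and cut to pull the timestamps out of the screenshot names. That parsing can be sanity-checked in isolation; the sample name below is an assumption for illustration only, following the "%f__%P" template:

```shell
#!/bin/bash
# Sketch of the timestamp parsing used above, run on a hypothetical
# screenshot name ("clip.mkv__00:01:23.456.jpg" is made up, but follows
# MPV's screenshot-template="%f__%P" scheme).
shot="clip.mkv__00:01:23.456.jpg"
startTime="$(cut -d. -f-2 <<< "${shot#*__}")" # keep "00:01:23.456", drop ".jpg"
filename="${shot%__*}"                        # recover "clip.mkv"
echo "$filename $startTime"
```

Running it prints the recovered source filename and timestamp, confirming the expansions behave before pointing the real script at a directory of screenshots.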
'mpv-webm: Simple WebM maker for mpv, with no external dependencies.' https://github.com/ekisu/mpv-webm
Creates a video clip at selected start/end points, with crop coordinates if required, plus encoder options and the ability to preview before output.
— oioiiooixiii (@oioiiooixiii) September 11, 2018
Music: Magma - 'Slag Tanz'. Dance: MacMillan's "Rite of Spring" - English National Ballet (2012). Principal: Erina Takahashi. pic.twitter.com/p3K2cxETNd
— oioiiooixiii {gifs} (@oioiiooixiii_) March 20, 2017
Listening to Magma often gives me ideas for dance. I dislike sticking things together (spoils both) but gives some idea of what's in my head https://t.co/CvhBEFj0n6— oioiiooixiii (@oioiiooixiii) March 20, 2017
source video: https://www.youtube.com/watch?v=GEOi4ZzUud4
Costume design by Kinder Aggugini, with additional development by Katya Ryazanskaya https://t.co/a9bV5iD4RL @oioiiooixiii_ pic.twitter.com/ak0cYeA8ul
— oioiiooixiii (@oioiiooixiii) March 20, 2017
ffmpeg \
   -i background.png \
   -i video.mkv \
   -filter_complex \
      "
      color=#00ff00:size=1280x720 [matte];
      [1:0] format=rgb24, split [mask][video];
      [0:0][mask] blend=all_mode=difference,
                  curves=m='0/0 .1/0 .2/1 1/1',
                  format=gray,
                  smartblur=1,
                  eq=brightness=30:contrast=3,
                  eq=brightness=50:contrast=2,
                  eq=brightness=-10:contrast=50,
                  smartblur=3,
                  format=rgb24 [mask];
      [matte][video][mask] maskedmerge, format=rgb24
      " \
   -shortest \
   -pix_fmt yuv422p \
   result.mkv
The image-stacking process is just to create a cleaner background image to work with. The idea is to remove momentary anomalies by averaging frames together, though the benefits may be negligible. Image-stacking can be done in many ways. I created a quick demo for you using FFmpeg:
— oioiiooixiii (@oioiiooixiii) November 4, 2019
# Image stacking with FFmpeg using the 'tmix' filter.
# More info on 'tmix' filter: https://ffmpeg.org/ffmpeg-filters.html#tmix
ffmpeg -i background-frame%d.png -vf tmix=frames=3 stacked.png

# Image stacking is also possible with ImageMagick
convert *.png -evaluate-sequence mean stacked.png
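Both commands amount to a per-pixel average, which is why a momentary anomaly present in only one frame gets pulled back toward the stable background level. A toy arithmetic sketch of one pixel across three frames (the values are made up for illustration):

```shell
#!/bin/bash
# Toy per-pixel sketch of mean stacking: average one pixel's value
# across three frames. A one-frame anomaly (250) is softened toward
# the stable background level. Integer maths; values are hypothetical.
mean3() { echo $(( ($1 + $2 + $3) / 3 )); }
mean3 60 62 61    # stable pixel: the average stays near its true value
mean3 60 250 61   # momentary anomaly: averaged well back down
```

With more frames in the stack, the same anomaly would be diluted further, which is the whole point of the technique.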
# Generate video motion vectors, in various colours, and merge together
# NB: Includes fixed 'curve' filters for issue outlined in blog post

ffplay \
   -flags2 +export_mvs \
   -i video.mkv \
   -vf \
      "
      split=3 [original][original1][vectors];
      [vectors] codecview=mv=pf+bf+bb [vectors];
      [vectors][original] blend=all_mode=difference128,
                          eq=contrast=7:brightness=-0.3,
                          split=3 [yellow][pink][black];
      [yellow] curves=r='0/0 0.1/0.5 1/1':
                      g='0/0 0.1/0.5 1/1':
                      b='0/0 0.4/0.5 1/1' [yellow];
      [pink] curves=r='0/0 0.1/0.5 1/1':
                    g='0/0 0.1/0.3 1/1':
                    b='0/0 0.1/0.3 1/1' [pink];
      [original1][yellow] blend=all_expr=if(gt(X\,Y*(W/H))\,A\,B) [yellorig];
      [pink][black] blend=all_expr=if(gt(X\,Y*(W/H))\,A\,B) [pinkblack];
      [pinkblack][yellorig] blend=all_expr=if(gt(X\,W-Y*(W/H))\,A\,B)
      "

# Process:
# 1: Three copies of input video are made
# 2: Motion vectors are applied to one stream
# 3: The result of #2 is 'difference128' blended with an original video
#    stream. The brightness and contrast are adjusted to improve clarity,
#    and three copies of this vectors result are made
# 4: Curves are applied to one vectors stream to create a yellow colour
# 5: Curves are applied to another vectors stream to create a pink colour
# 6: Original video stream and yellow vectors are combined diagonally
# 7: Pink vectors stream and original vectors stream are combined diagonally
# 8: The results of #6 and #7 are combined diagonally (opposite direction)
# Isolate motion-vectors using 'difference128' blend filter
# - add brightness, contrast, and scaling, to taste

ffplay \
   -flags2 +export_mvs \
   -i "video.mp4" \
   -vf \
      "
      split [original][vectors];
      [vectors] codecview=mv=pf+bf+bb [vectors];
      [vectors][original] blend=all_mode=difference128,
                          eq=contrast=7:brightness=-0.3,
                          scale=720:-2
      "

Works best with higher-resolution videos; a 4K source was used in this case.
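For context on why this isolates the vectors: the 'difference128' blend keeps the signed difference between the two streams re-centred at mid-grey, so identical pixels come out as 128 and only where codecview has painted vector arrows does the result deviate from flat grey. A toy per-pixel sketch, assuming 8-bit values clamped to 0..255 (the pixel values themselves are hypothetical):

```shell
#!/bin/bash
# Toy per-pixel sketch of the 'difference128' blend: 128 plus the signed
# difference of the two inputs, clamped to the 8-bit range. Identical
# inputs give mid-grey (128); painted vectors stand out against it.
diff128() {
   local v=$(( 128 + $1 - $2 ))
   (( v < 0 )) && v=0
   (( v > 255 )) && v=255
   echo "$v"
}
diff128 200 200   # unchanged pixel: flat mid-grey
diff128 250 90    # bright vector over a dark pixel: clipped bright
diff128 10 200    # dark over bright: clipped dark
```

The subsequent eq=contrast=7 then stretches those small deviations away from 128, which is what makes the vectors pop in the final image.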