Showing posts with label programming.

FFmpeg: RGB affected luma cycle

Utilising the 'extractplanes', 'blend', and 'xfade' filters found in FFmpeg 4.2.2. In the examples shown here, the following script and arguments were run (in order):
$ rgbLumaBlendCycle_FFmpeg.sh 'image1.jpg' 'screen'
$ rgbLumaBlendCycle_FFmpeg.sh 'image2.jpg' 'difference'
$ rgbLumaBlendCycle_FFmpeg.sh 'image3.jpg' 'pinlight' 'REV'
#!/usr/bin/env bash
# FFmpeg ver. 4.2.2+

# RGB affected luma cycle: Each colour plane is extracted and blended with the
# original image to adjust overall image brightness. The result of each blend
# is faded into the next, before blending back to the original image.

# Parameters:
# $1 : Filename
# $2 : Blend type (e.g. average, screen, difference, pinlight, etc.)
# $3 : Reverse blend order (any string to enable)

# version: 2020.07.15_12.28.31
# source: https://oioiiooixiii.blogspot.com

function main()
{
   local mode="$2"
   local name="$1"
   local layerArr=('[a][colour1]' '[b][colour2]' '[c][colour3]'
                   '[colour1][a]' '[colour2][b]' '[colour3][c]')
   local layerIndex="${3:+3}" && layerIndex="${layerIndex:-0}"
   # Array contains values for both blend orders; index is offset if $3 is set

   ffmpeg \
      -i "$name" \
      -filter_complex "
         format=rgba,loop=loop=24:size=1:start=0,
            split=8 [rL][gL][bL][colour1][colour2][colour3][o][o1];
         [rL]extractplanes=r,format=rgba[a];
         [gL]extractplanes=g,format=rgba[b];
         [bL]extractplanes=b,format=rgba[c];
         ${layerArr[layerIndex++]}blend=all_mode=${mode}[a];
         ${layerArr[layerIndex++]}blend=all_mode=${mode}[b];
         ${layerArr[layerIndex]}blend=all_mode=${mode}[c];
         [o][a]xfade=transition=fade:duration=0.50:offset=0,format=rgba[a];
         [a][b]xfade=transition=fade:duration=0.50:offset=0.51,format=rgba[b];
         [b][c]xfade=transition=fade:duration=0.50:offset=1.02,format=rgba[c];
         [c][o1]xfade=transition=fade:duration=0.50:offset=1.53,format=rgba
          " \
      "${name}-${mode}.mkv"
}

main "$@"
download: rgbLumaBlendCycle_FFmpeg.sh


Image Credits:

Henry Huey - "Alice in Wonderland - MAD Productions 5Sep2018 hhj_6811"
Attribution-NonCommercial 2.0 Generic (CC BY-NC 2.0)
https://www.flickr.com/photos/henry_huey/43768692515/

Henry Huey - "Alice in Wonderland - MAD Productions 5Sep2018 hhj_6869"
Attribution-NonCommercial 2.0 Generic (CC BY-NC 2.0)
https://www.flickr.com/photos/henry_huey/29740096427/

Henry Huey - "Alice in Wonderland - MAD Productions 5Sep2018 hhj_6848"
Attribution-NonCommercial 2.0 Generic (CC BY-NC 2.0)
https://www.flickr.com/photos/henry_huey/29740096117/

FFmpeg: Improved 'Rainbow-Trail' effect





* Includes sound 🔊

I have updated the script for this FFmpeg 'rainbow' effect I created in 2017¹ as there were numerous flaws, errors, and inadequacies in that earlier version. One major issue was the inability to colorkey with any colour other than black; this has been resolved.

This time, the effect is based on the 'extractplanes' filter and the alpha levels created after using a 'colorkey' filter. This produces a much more refined result: better colour shaping, and most of the original foreground subject is maintained. The 'extractplanes' filter can even be removed from the filtergraph to create an alternative, more subtle effect.

Degrading jpeg images with repeated rotation - via Bash (FFmpeg and ImageMagick)



A continuation of the decade-old topic of degrading jpeg images by repeated rotation and saving. This post briefly demonstrates the process using FFmpeg and ImageMagick in a Bash script. Previously, a Python script achieving similar results was published, and it has recently been updated. All posts on this subject can be accessed via the 'jpeg rotation' tag.
posts: https://oioiiooixiii.blogspot.com/search/label/jpeg%20rotation
The two basic commands are shown below. Both versions rotate an image 90 degrees clockwise, and each overwrites the original image. They should be run inside a loop to create progressively more degraded images (a minimal loop sketch follows the two commands).

ImageMagick: The quicker of the two, it uses the standard 'libjpeg' library for saving images.
mogrify -rotate "90" -quality "74" "image.jpg"

FFmpeg: Saving is done with the 'mjpeg' encoder, creating significantly different results.
ffmpeg -i "image.jpg" -vf "transpose=1" -q:v 12 "image.jpg" -y

There are many options and ways to extend each of the basic commands. For FFmpeg, one such way is to use the 'noise' filter to help create entropy in the image while running. It also has the effect of discouraging the gradual magenta-shift caused by the mjpeg encoder.
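For example (mirroring the filter usage in the script further down), the noise filter can be slotted straight into the basic FFmpeg command; the strength value '20' is an arbitrary choice here:
ffmpeg -i "image.jpg" -vf "transpose=1,noise=alls=20:allf=u" -q:v 12 "image.jpg" -y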

A functional (but basic) Bash script is presented later in this blog post. It allows for the choice between ImageMagick or FFmpeg versions, as well as allowing some other parameters to be set. Directly below is another montage of images created using the script. Run-time parameters for each result are given at the end of this post.



Running the script without any arguments (except for the image file name) will invoke ImageMagick's 'mogrify' command, rotating the image 500 times and saving at a jpeg quality of '74'. Note that when the FFmpeg version runs, the jpeg quality value is crudely inverted to suit the 'q:v' scale of the 'mjpeg' encoder.

The parameters for the script: [filename: string] [rotations: 1-n] [quality: 1-100] [frames: (any string)] [version: (any string for FFmpeg)] [noise: 1-100]
#!/bin/bash
# Simple Bash script to degrade a jpeg image by repeated rotations and saves,
# using either FFmpeg or ImageMagick. N.B. Starting image must be a jpeg.

# Example: rotateDegrade.sh "image.jpg" "1200" "67" "no" "FFmpeg" "21"
# Run on image.jpg, 1200 rotations, quality=67, no frames, use FFmpeg, noise=21

# source: oioiiooixiii.blogspot.com
# version: 2019.08.22_13.57.37

# All relevent code resides in this function
function rotateDegrade()
{
   local rotations="${2:-500}" # number of rotations
   local quality="${3:-74}" # Jpeg save quality (note inverse value for FFmpeg)
   local saveInterim="${4:-no}" # To save every full rotation as a new frame
   local version="${5:-IM}" # Choice of function (any other string for FFmpeg)
   local ffNoise="${6:-0}" # FFmpeg noise filter

   # Name of new file created to work on
   local workingFile="${1}_r${rotations}-q${quality}-${version}-n${ffNoise}.jpg"
   cp "$1" "$workingFile" # make a copy of the input file to work on
   # N.B. consider moving above file to volatile memory e.g. /dev/shm

   # ImageMagick and FFmpeg sub-functions
   function rotateImageMagick() {
      mogrify -rotate "90" -quality "$quality" "$workingFile"; }
   function rotateFFmpeg() {
      ffmpeg -i "$workingFile" -vf "format=rgb24,transpose=1,
         noise=alls=${ffNoise}:allf=u,format=rgb24" -q:v "$((100-quality))"\
         "$workingFile" -y -loglevel panic &>/dev/null; }

   # Main loop for repeated rotations and saves
   for (( i=0;i<"$rotations";i++ ))
   {
      # Save each full rotation as a new frame (if enabled)
      [[ "$saveInterim" != "no" ]] && [[ "$(( 10#$i%4 ))" -lt 1  ]] \
      && cp "$workingFile" "$(printf %07d $((i/4)))_$workingFile"

      # Rotate by 90 degrees and save, using whichever function chosen
      [[ "$version" == "IM" ]] \
      && rotateImageMagick \
      || rotateFFmpeg

      # Display progress
      displayRotation "$i" "$rotations"
   }
}

# Simple textual feedback of progress shown in terminal
function displayRotation() { clear;
   case "$(( 10#$1%4 ))" in
   3) printf "Total: $2 / Processing: $1 👄  ";;
   2) printf "Total: $2 / Processing: $1 👇  ";;
   1) printf "Total: $2 / Processing: $1 👆  ";;
   0) printf "Total: $2 / Processing: $1 👅  ";;
   esac
}

# Driver function
function main { rotateDegrade "$@"; echo; }; main "$@"
download: rotateDegrade.sh

python version: https://oioiiooixiii.blogspot.com/2014/08/jpeg-destruction-via-repeated-rotate.html
original image: https://www.flickr.com/photos/flowizm/19148678846/ (CC BY-NC-SA 2.0)

parameters for top image, left to right:
original | rotations=300,quality=52,version=IM | rotations=200,quality=91,version=FFmpeg,noise=7

parameters for bottom image, left to right:
rotations=208,quality=91,version=FFmpeg,noise=7 | rotations=300,quality=52,version=FFmpeg,noise=0 | rotations=500,quality=74,version=IM | rotations=1000,quality=94,version=FFmpeg,noise=7 | rotations=300,quality=94,version=FFmpeg,noise=16

FFmpeg: CRT Screen Effect


A simple attempt at creating a [stylised] 'CRT screen' effect with FFmpeg. Loaded with the common CRT effect tropes and clichés: interlaced lines, noise, chromatic aberration, bloom, etc.

The filterchains were constructed to be modular, allowing them to be included or removed as desired. The ideas contained in these filterchains may be of more general use than the whole effect itself.
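
As a sketch of that modularity (using the variables defined in the script below; this particular combination is not from the original post), a reduced chain could keep just the shrink, crop, RGB aberration, and nearest-neighbour blow-up stages:
# Hypothetical reduced chain, reusing the filterchain variables from the script
ffmpeg -i "input.mkv" -vf "${shrink144},${crop43},${rgbFX},${scale2PALpix}" "reduced_crt.mkv"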

#!/bin/bash

# A collection of FFmpeg filterchains which can be used to create a stylised
# 'CRT screen' effect on given input.
#
# The filter-chains have been split apart to increase modularity at the cost of
# sacrificing simplicity and increasing redundant code. Filter-chains can be
# added or removed in various orders, but special attention must be paid to
# selecting the correct termination syntax for each stage.
#
# Includes basic demonstration FFmpeg command which takes "$1" input file.
#
# Version: 2019.04.06_02.49.13
# Source https://oioiiooixiii.blogspot.com

### FILTERCHAINS #############################################################

# Reduce input to 25% PAL resolution
shrink144="scale=-2:144"

# Crop to 4:3 aspect ratio at 25% PAL resolution
crop43="crop=180:144"

# Create RGB chromatic aberration
rgbFX="split=3[red][green][blue];
      [red] lutrgb=g=0:b=0,
            scale=188x144,
            crop=180:144 [red];
      [green] lutrgb=r=0:b=0,
              scale=184x144,
              crop=180:144 [green];
      [blue] lutrgb=r=0:g=0,
             scale=180x144,
             crop=180:144 [blue];
      [red][blue] blend=all_mode='addition' [rb];
      [rb][green] blend=all_mode='addition',
                  format=gbrp"

# Create YUV chromatic aberration
yuvFX="split=3[y][u][v];
      [y] lutyuv=u=0:v=0,
          scale=192x144,
          crop=180:144 [y];
      [u] lutyuv=v=0:y=0,
          scale=188x144,
          crop=180:144 [u];
      [v] lutyuv=u=0:y=0,
          scale=180x144,
          crop=180:144 [v];
      [y][v] blend=all_mode='lighten' [yv];
      [yv][u] blend=all_mode='lighten'"

# Create edge contour effect
edgeFX="edgedetect=mode=colormix:high=0"

# Add noise to each frame of input
noiseFX="noise=c0s=7:allf=t"

# Add interlaced fields effect to input
interlaceFX="split[a][b];
             [a] curves=darker [a];
             [a][b] blend=all_expr='if(eq(0,mod(Y,2)),A,B)':shortest=1"

# Re-scale input to full PAL resolution
scale2PAL="scale=720:576"

# Re-scale input to full PAL resolution with linear pixel
scale2PALpix="scale=720:576:flags=neighbor"

# Add magnetic damage effect to input [crt screen]
screenGauss="[base];
             nullsrc=size=720x576,
                drawtext=
                   fontfile=/usr/share/fonts/truetype/freefont/FreeSerif.ttf:
                   text='@':
                   x=600:
                   y=30:
                   fontsize=170:
                   fontcolor=red@1.0,
             boxblur=80 [gauss];
             [gauss][base] blend=all_mode=screen:shortest=1"

# Add reflections to input [crt screen]
reflections="[base];
             nullsrc=size=720x576,
             format=gbrp,
             drawtext=
               fontfile=/usr/share/fonts/truetype/freefont/FreeSerif.ttf:
               text='€':
               x=50:
               y=50:
               fontsize=150:
               fontcolor=white,
             drawtext=
               fontfile=/usr/share/fonts/truetype/freefont/FreeSerif.ttf:
               text='J':
               x=600:
               y=460:
               fontsize=120:
               fontcolor=white,
             boxblur=25 [lights];
             [lights][base] blend=all_mode=screen:shortest=1"

# Add more detailed highlight to input [crt screen]
highlight="[base];
             nullsrc=size=720x576,
             format=gbrp,
             drawtext=
               fontfile=/usr/share/fonts/truetype/freefont/FreeSerif.ttf:
               text='¡':
               x=80:
               y=60:
               fontsize=90:
               fontcolor=white,
             boxblur=7 [lights];
             [lights][base] blend=all_mode=screen:shortest=1"

# Curve input to mimic curve of crt screen
curveImage="vignette,
            format=gbrp,
            lenscorrection=k1=0.2:k2=0.2"

# Add bloom effect to input [crt screen]
bloomEffect="split [a][b];
             [b] boxblur=26,
                    format=gbrp [b];
             [b][a] blend=all_mode=screen:shortest=1"

### FFMPEG COMMAND ###########################################################

ffmpeg \
   -i "$1" \
   -vf "
         ${shrink144},
         ${crop43},
         ${rgbFX},
         ${yuvFX},
         ${noiseFX},
         ${interlaceFX},
         ${scale2PAL}
         ${screenGauss}
         ${reflections}
         ${highlight},
         ${curveImage},
         ${bloomEffect}
      " \
   "${1}__crtTV.mkv"

exit 0
download script: ffmpeg_CRT-effect.sh

A bank of 'screens' displaying different inputs.



Some alternate choices of filterchains.



source video: https://www.youtube.com/watch?v=8SPUHGRXQUY

FFmpeg: FAPA (Frame-Averaged Pixel Array)


Preamble: When I create a blog-post about a film, I will often include a cryptic-looking pixelated image somewhere in the body of the post. When possible, I will create one of these images for every film I watch. I create them as a type of 'fingerprint', showing the overall tonality and temporal dynamics of the film's visuals.

The image contains all frames in a given film. Each pixel represents the average colour of its particular frame. This colour is calculated by doing no more than scaling the frame down to '1x1' with an FFmpeg 'scale' filter. The frames [pixels] are then tiled into a single image of suitable dimensions.
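
For a bare-bones illustration of the idea (essentially the single-line alternative noted inside the script further down), the whole operation fits in one command, assuming the tile dimensions have already been worked out; '640x300' and the filenames are placeholders:
ffmpeg -i "input.mkv" -frames:v 1 -vf "scale=1:1,tile=640x300" "fapa.png"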

The example video is taken from 'Summer in February (2013)' and shows a scene involving tropospheric lightning near the end of the film. The section of the 'pixel array' image relating to this scene has been highlighted and magnified. The contrast in lighting between frames means each frame can be clearly discerned as the video plays, even without the aid of the arrow.



The Bash script outputs basic information before and while processing. The process will take a considerable length of time to finish. The version here uses two instances of FFmpeg to process the video, so that progress feedback is displayed during execution. A simple single-instance alternative is included in the 'Notes' section of the script, as well as ideas for showing progress while using this version. The script has not been updated since its initial creation and can probably be improved upon.

#!/bin/bash
################################################################################
# Create a 'Frame-Averaged Pixel Array' of a given video. Works by reducing
# each frame to a single pixel, and appending all frames into single image.
# - Takes: $1=Filename [$2=width]
# - Requires: ffmpeg + ffprobe
#   ver. 1.1 - 10th November, 2015
# source: https://oioiiooixiii.blogspot.com
###############################################################################

width="${2:-640}" # If no width given, set as 640
duration="$(ffprobe "$1" 2>&1 \
            | grep Duration \
            | awk  '{ print $2 }')"
seconds="$(echo $duration \
           | awk -F: '{ print ($1 * 3600) + ($2 * 60) + $3 }' \
           | cut -d '.' -f 1)"
fps="$(ffprobe "$1" 2>&1 \
       | sed -n 's/.*, \(.*\) fps,.*/\1/p' \
       | awk '{printf("%d\n",$1 + 0.5)}')"
frames="$(( seconds*fps ))"
height="$(( frames/width ))"
filters="tile=${width}x${height}"

clear
printf "$(pwd)/$1
___Duration: ${duration::-1}
____Seconds: $seconds
________FPS: $fps
_____Frames: $frames
_____Height: $height
____Filters: $filters\n"

# First instance of FFmpeg traverses the frames, the second concatenates them.
ffmpeg \
   -y \
   -i "$1" \
   -vf "scale=1:1" \
   -c:v png \
   -f image2pipe pipe:1 \
   -loglevel quiet \
   -stats \
| ffmpeg \
    -y \
    -i pipe:0 \
    -vf "$filters" \
    -loglevel quiet \
    "${1%.*}_$width".png

################################ NOTES #######################################

# Single line solution, but doesn't show progress
# ffmpeg -i "$1" -frames 1 -vf "$filters" "${1%.*}".png -y
# filters="scale=1:1,tile=${width}x${height}" # Used with single line version
# View ingest progress using: pv "$1" | piped to ffmpeg
download: video2pixarray.sh

[Note: I have struggled with giving a name to this process since I created the script, and have left it as the first thing I thought of. Perhaps others who have created something similar have better names for it.]

film review: https://oioiiooixiii.blogspot.com/2017/11/summer-in-february-2013.html

ANSI to HTML: Incorporating 'ansi2html.sh', 'tiv', 'bat', 'GNU source-highlight', 'bash-drawille'

'tiv' [Terminal Image Viewer] with 'ansi2html.sh'
- Reproduce raster image in HTML unicode characters.


script -q /dev/null -c "tiv image.jpg" | ansi2html.sh > image.html
# output 'pre' tags refined with 'line-height' and 'font-size' styling 
N.B. Due to tag limits set for blog post content, this is only a screen capture representing the result. The actual html representation is demonstrated here: https://oioiiooixiii.blogspot.com/p/ansi2html.html


'bat', with 'ansi2html.sh'
- 'cat'-like application with additional syntax formatting and colouring.
   1 #!/bin/bash
   2 # An example Bash script
   3 # Version: 2018.08.19.19.33.05
   4
   5 function main() # An example function
   6 {
   7    local sentence="Hello, World!"
   8    for (( i=0;i<"${1:-1}";i++ ))
   9    {
  10       echo "$sentence"
  11    }
  12 }
  13 main "$@"
  14 exit

bat -n --color always --theme "1337" example.sh \
| ansi2html.sh --bg=dark > example.html

'bash-drawille' with 'ansi2html.sh'
- Convert raster image into Braille type HTML unicode characters.
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣯⣿⣿⣿⣿⢿⣿⣻⣿⣿⣿⣷⣿⣯⣿⣽⣿⡿⣿⣏⣿⣾⡿⣻⣽⣻⣯⣯⡿⣾⣳⡿⣯⣾⢿⢾⣟⡽⣯⣯⣹⣏⣯⣽⢯⡽⡯⣏⣗⡿⣺⢽⣝⢯⢽⡽⣳⢽⣞⣚⣞⠯⣗⢯⣖⡯⢯⡶⣏⣹⠶⢯⣚⣝⡼⣹⠶⣓⡧⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⢿⡿⢿⣝⣟⣟⠯⠻⠝⢑⠹⠋⠹⠺⠿⠿⡿⣿⣝⣿⣟⣾⣻⣯⣽⣿⣽⣻⣽⣻⣿⣾⢿⣟⣿⣾⣻⣷⢿⣷⡿⣟⡽⣾⣷⣯⣹⣽⣹⣏⣟⣺⡷⣗⣻⣗⡷⣺⢽⢷⢽⢽⣝⢶⢽⣺⣹⢭⡯⡽⡽⣓⣞⣗⣞⠶⣏⣹⠾⣞⢭⡼⢧⣓⡽⣞⣱⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣟⣿⡿⢯⢹⣩⣪⠪⢩⠍⠨⠰⠢⠐⠂⠀⠀⠀⠀⠐⠀⠁⠈⠊⠉⠟⡍⠞⠭⣻⡽⣿⣟⣷⣿⣽⣯⣟⣽⣟⣾⣻⣾⣳⣻⣽⢿⣾⣽⢷⣻⡷⣾⣷⣗⣻⣺⣗⣟⣺⣟⢾⢽⣺⡾⣺⣝⣞⣫⡯⡗⡷⡯⢽⣳⣺⢭⢷⣖⡽⢧⣏⣹⢳⡗⣏⢳⡞⣞⢭⡼⣚⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣟⣏⡞⣭⠕⢙⡨⠊⡁⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠐⠆⠆⡈⠹⢻⢷⢾⡿⢾⢿⣷⢿⣾⣻⣽⣻⣿⢾⣯⣗⣾⣯⣫⣻⡾⣷⣏⣽⣏⣯⣏⣯⢾⡷⡿⣺⢽⣏⣗⢷⣝⣗⢽⣞⣗⣳⣲⡽⣞⠯⣓⣞⢷⣚⢷⣚⢯⣚⣫⢳⣱⢷⢞⢭⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⡿⠓⣉⡝⣵⣕⠵⢊⢆⣌⢖⢈⢀⡀⢀⠀⠂⠄⠀⡀⠀⠂⠤⠄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠅⡀⠀⢪⡍⠏⠩⠝⠼⡻⡾⣽⣻⣽⣻⣟⣯⢷⡿⣯⣟⣝⣽⢷⣽⡷⣽⢷⡷⡯⣽⣺⣹⡷⣳⣳⣝⣟⡾⣺⡯⡶⣝⡾⣏⣏⢽⣺⡽⢭⣞⡞⢭⢷⡭⣳⣓⡽⡶⣏⢧⣫⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⡿⠟⡡⢒⢡⡯⣿⠏⡫⢖⠳⠕⡐⡰⡐⣂⠰⢐⠠⡢⢄⢌⡄⡐⠲⠅⡉⡢⡡⡠⡠⢆⠀⢀⠀⠀⠀⠡⠀⠀⠀⠤⢀⠀⠀⠁⠨⠋⢽⣳⣟⡽⣟⣾⢷⣟⢷⣿⢷⣟⣾⢯⣟⣞⣗⣻⡽⣽⣝⣯⢾⢽⣗⡯⡾⣺⡯⡾⣫⡷⣳⣣⣝⡾⡯⣗⣹⣺⠯⣳⢧⣳⡭⣳⠮⣗⣹⠶⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣻⠫⠔⢊⡚⡔⢕⠋⠡⣐⠆⢃⢔⢲⢊⡡⡩⣔⢦⢕⣕⡼⡭⢖⣅⡪⢕⡵⣏⡺⣪⠶⡵⢕⡕⠸⢣⢖⠦⠈⠈⠠⠀⠀⠀⠀⠢⢠⠀⠀⠈⠫⢾⡿⣷⡿⣯⣳⣿⡾⣯⣫⡷⣟⣾⡷⣟⣺⡷⡷⣗⣯⢽⣹⡷⡾⢯⡯⢽⣝⢷⢯⡽⡝⣫⢯⣜⡾⢽⢳⡧⣏⢯⣚⣹⡼⣹⠶⣫⢳⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⡿⡛⠙⠑⡈⡂⠴⠪⢈⡉⣁⡨⢌⡑⡬⢜⣃⡥⡶⢮⣲⣷⣷⣿⢿⢾⡿⣺⣹⣹⢵⢮⣖⡯⣹⣹⢳⣪⡪⣂⠜⢯⢗⠀⠀⠀⠀⠀⠀⠀⠀⠐⠀⠀⠀⠀⠐⢯⢾⡿⣞⣯⡾⣯⣗⣽⣽⡷⣾⡷⣯⢽⢾⡷⡯⣽⢾⣏⣫⣟⢷⢽⢽⣝⣗⢷⣏⡯⣺⣹⡽⡭⣏⣏⣞⠯⣞⡞⣓⡽⣚⡽⣓⡽⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⠟⠢⠐⠈⠑⠑⠐⢐⢐⠂⠒⡌⢑⢌⢞⢕⢕⢪⣾⣯⣿⣿⣿⣿⣿⣿⣿⣳⣿⣾⣺⣻⣞⡽⣽⣽⢺⣺⠽⢼⡣⣆⠱⢎⠜⢂⠤⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢺⣷⢯⣻⢷⣟⣟⣹⢾⣟⣾⡷⣯⢽⣏⣯⢽⡷⡯⣽⢷⢾⢯⡯⣺⣫⡯⡾⣝⣗⣗⠯⡯⢽⣹⢼⢽⣺⡼⢽⡽⢭⡭⣳⢳⡞⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣷⠱⣕⣱⠧⡔⢐⠁⠈⡀⢠⢙⠪⣎⡱⣑⣲⢯⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣾⣿⣻⣟⣝⣗⣛⣗⡮⢞⡳⢞⠵⢮⡪⡔⠙⢆⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠸⢽⣻⡾⣿⡷⣟⣞⢷⣯⢾⡷⢾⡯⣽⣺⣻⣹⡯⣗⣏⡯⡷⣽⢾⣝⣗⢷⣳⢽⣺⣫⢽⡭⡯⣗⡧⣗⣞⡝⣹⢼⢧⣳⠷⣫⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⠯⢔⢱⠯⡉⠥⠱⠀⠈⡐⣊⡼⢽⡪⢎⢼⣽⣷⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⢾⣻⣫⡷⢷⡽⢭⣍⡗⡵⡩⡳⡪⢕⢃⠨⠈⡂⠐⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢽⡾⣽⢾⣿⡾⣟⢿⣞⣽⢯⡷⣽⣯⢽⣽⣹⣗⡯⢷⡾⢯⢽⡽⣝⢷⢽⣗⣳⡾⣺⣳⡯⣺⣹⠽⣳⣹⣺⢷⣓⢯⢳⡞⣝⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣑⠮⠈⠘⠂⠒⠐⠠⡈⣎⡭⣳⡞⡨⣏⣽⣿⣿⣿⣿⣿⣿⡿⣿⣿⣷⣿⣿⣯⣫⣏⢷⣝⡼⢮⡱⣎⢎⡪⡎⡪⢎⢁⠅⡀⠂⢈⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠰⢽⢽⣽⢯⣏⣻⣻⡷⣻⢾⡷⣟⢾⢾⣺⡷⣗⣻⡷⡷⡷⣻⢽⡽⣫⡷⣝⣗⣺⣫⢷⢯⣺⣓⣝⡾⢷⣓⢫⣳⣫⢳⡞⣏⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣯⡿⣗⣡⠀⠀⠀⡀⠆⠰⣕⡽⠮⣏⡪⣲⣻⣿⡿⣿⣿⣿⣿⣿⣿⣿⢿⣿⣯⢿⡷⡿⣺⡼⡞⣪⣃⢝⣊⡪⠎⡪⠕⡂⢈⢐⢃⡈⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢐⠪⡼⣟⣾⢷⣟⣝⡷⣟⡷⣟⢾⢷⣻⣹⡷⣻⢾⣗⣽⣫⣞⣞⢽⣝⢷⢽⣝⣗⣗⢷⡾⣝⢭⡯⣗⣳⡽⢭⡯⣏⡗⣏⣓⡾⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣻⣿⣿⡇⠀⠀⣐⢖⣃⣞⡷⣟⢿⢪⢮⡽⣿⣾⣟⣾⣿⣟⣟⣟⣫⣻⣻⣽⢽⣹⡷⣞⠽⣎⢮⡰⣍⢮⡪⡚⣑⢃⢘⢈⠄⠆⢒⡈⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠂⢈⢊⢭⣟⡷⣯⣗⣯⣳⣟⡽⣯⢿⡾⡽⣽⡷⡷⣻⣹⣽⣹⣹⡷⡯⡽⣫⡯⣺⣫⣺⣓⣏⡯⣹⣹⣺⢼⣞⠯⣜⡽⣓⡾⡾⢭⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⠇⠀⠀⢑⠨⢾⢾⡿⢽⢝⢼⣽⣻⣯⣿⣷⣿⣿⣟⣿⡿⣽⣳⢿⡷⢽⡽⣹⡼⣣⢝⠮⡼⣣⢕⠎⡰⡘⣂⠍⡂⠡⠔⠰⠰⠘⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠐⢎⢼⢽⢷⣟⡯⣻⣽⢾⣗⣽⣟⣺⢷⡷⡷⣯⢽⣺⣺⣹⣹⣗⣟⢾⣺⣫⢽⣝⢽⡭⢷⣫⣺⢼⣗⠯⢯⢯⢳⣳⡭⢷⣓⣞⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⡆⠀⠠⠌⠐⢞⣻⣟⡱⣏⣽⣟⣿⣟⣿⢿⣷⣟⣿⣻⣷⡿⣞⣽⣽⣽⣯⢷⣚⢮⣑⢎⡮⢎⢦⡵⡡⡱⠢⡈⢁⠄⢈⢁⠂⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠁⡪⡯⣿⡾⣟⣞⣽⢾⡯⣽⣯⢽⢾⡷⣯⢽⡽⡽⣻⣝⣫⡯⣗⢷⡽⣝⢷⢽⡯⡾⣗⢷⣏⡯⢽⣺⡽⡝⣗⣏⡗⣏⣞⢭⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⡆⡐⢀⢈⡸⢽⣝⣜⡽⣿⣷⣿⣺⣿⡿⣿⡿⣿⣿⢿⣟⣹⢽⣷⡿⣝⡳⣩⠳⢱⡕⢱⡣⢗⡕⠕⡊⢡⢁⠐⡐⡀⠂⠁⡐⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠌⢈⢽⣝⣽⡽⣷⢿⡷⢿⣞⣯⣹⡯⣟⣺⣗⣳⣳⣽⣫⣞⣗⣗⢷⣽⣣⢷⢽⢽⣞⣚⣗⣗⡯⡗⡷⡯⢽⣺⡼⣹⣺⡼⢯⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⡢⣇⢆⠢⡡⢞⣚⡾⣽⣻⣿⣽⣿⣟⣻⣞⣗⡷⣺⢳⢝⠰⢺⢳⡗⢕⢍⢊⡢⣱⢪⠕⠎⠔⡰⠨⠆⠅⣈⠂⡈⠀⠐⠄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠠⠅⠀⡄⣪⣏⣽⣯⣺⢷⣯⢿⣞⡽⣯⢾⡷⣏⣟⣺⢽⣽⣝⣹⡷⣺⣳⡾⢾⣝⣞⡯⡾⢽⡭⡽⣗⢽⣹⡽⡭⢷⣓⢯⣣⢳⡞⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⡿⣗⢞⡪⡪⡪⢞⢭⣻⣽⣿⡷⡽⣯⢿⡯⣻⠾⠣⠣⡐⠂⢸⢖⡈⠀⠈⠡⢉⢢⠅⠊⠲⠮⡁⠀⠀⠐⠀⠐⠄⠂⠁⢀⠂⢀⠂⠄⠠⠀⠀⠀⠀⡀⠀⠀⠀⠀⡂⠰⢬⢾⢷⣯⡯⣻⢿⣞⢾⡿⢾⡷⣗⣟⡾⡷⣻⣝⣺⣹⣹⣗⢽⣝⡷⡾⣽⡝⣫⡯⣳⢽⣞⢯⡽⣳⣹⣹⣗⣞⢳⣺⡽⢧⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣷⣽⡾⣯⢎⢮⠯⡯⢾⣻⣏⣯⣾⡷⡵⢞⠴⠀⠀⠀⠄⠀⠜⠂⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠐⠀⠁⠆⠄⢁⠀⠄⠀⠀⠀⠀⡀⠄⢁⠀⠀⠆⠆⢌⡯⣿⢯⣟⣝⣻⡿⢾⡿⣻⢾⣽⣹⡷⣻⢾⡯⣻⣝⢯⡯⡯⡽⣫⡯⡾⣺⢯⣞⡯⡽⣳⣣⣜⡾⢷⣓⣺⡼⢽⣲⢫⣳⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣷⣷⡷⢵⡷⣏⡙⡹⠍⠗⠈⠀⠀⠀⠀⠀⠀⠀⠀⡀⣠⣦⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢂⠂⠅⡀⠐⠈⠀⠈⠐⠀⠀⠀⠁⠀⢀⢄⢥⣲⡿⣽⢿⢾⣟⣾⣷⢷⣯⡷⣯⡷⣯⢽⡯⢾⡷⣝⣯⣝⣗⢽⡽⢷⢾⣫⡯⢽⣝⣗⣺⡽⡝⢫⢷⡾⡝⢷⣓⠯⣗⡗⣏⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣽⣿⡾⢮⠔⠅⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠨⣿⣿⣗⡆⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠠⢐⠀⠀⠌⠠⠄⠂⠀⠀⠁⠁⠂⢐⠀⠀⠀⠄⠀⠐⢆⡯⡯⣽⡽⣽⢽⡷⣾⢷⣽⢯⣳⣟⢾⡷⣻⢾⣝⣯⣗⣯⢯⢯⣯⢾⢽⣝⡯⡾⡯⡽⡽⣗⢯⣞⣳⡯⣹⣺⣗⣞⠯⣏⢧⣏⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⡯⢷⣇⡐⡠⡐⢄⠄⠀⠀⠀⠀⠀⠀⢀⣲⣿⣿⣿⡟⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠂⠐⢀⢂⠨⡂⠰⡀⠁⠀⡀⠀⠀⠀⡁⠀⠀⠀⠀⠀⠄⡲⣻⣺⣳⣷⢽⣯⢾⣺⣹⡯⣽⣟⣺⡷⣟⣽⢾⣏⣟⣏⣗⣟⣺⢾⣏⡯⡾⢽⣝⢽⣝⣺⣣⣳⡽⣺⢽⣳⣓⣺⡼⢯⣲⣫⢳⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⢺⣯⣿⣿⣭⢕⠣⠄⠂⠀⠀⢀⢠⣸⣷⣿⣿⣏⡇⢁⠈⢒⠀⠀⠀⠀⠀⠀⠀⢀⠀⠠⢐⠢⠌⡁⠀⠂⠀⠈⠀⠁⠀⡀⠀⢀⠈⠀⠀⠀⣕⢾⢾⡯⡾⣺⣫⣺⣣⣳⢷⣞⣗⡯⡽⣻⣹⣺⣞⢾⡯⣻⣝⣹⡷⡯⡽⣫⣟⢷⢽⡯⡶⣫⡯⣏⣗⡯⣖⣞⡝⣳⣚⢷⣚⢯⣚⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣷⣟⣻⣿⣿⣷⣆⣂⡄⡀⣰⣳⣾⡿⣯⣿⣿⣓⢇⠄⡈⠡⠆⢃⢂⠆⡂⠄⠅⡈⠡⠰⠐⠂⠠⠁⠀⠀⠀⠀⠀⠈⠀⠀⠄⠄⠐⠀⠀⠠⢺⣳⡷⣫⡯⣫⢯⣳⢷⣺⣓⣗⡯⣹⣹⣳⣫⣞⣞⡯⣽⡷⡷⡷⣽⡯⣽⢷⢽⣝⣗⢽⣹⢷⢯⢾⢼⡧⡷⡯⢯⢫⣳⢭⣳⡽⢭⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⡭⣎⢫⡽⣿⢿⣿⢿⣧⣽⣷⡯⢿⢿⢾⣿⢽⡳⢮⡃⡀⠈⠀⠀⠖⢎⢔⢆⠆⡂⠁⠄⠠⠄⠀⠈⠀⠄⠀⠀⠀⠀⠐⠅⠀⠁⠐⠀⠀⢠⠯⠾⣺⡯⡾⡾⣺⣹⣹⡾⣝⠽⣳⡗⡯⣞⣺⣚⣗⣝⣗⢽⣺⣞⣗⣗⢷⡾⣺⡯⡾⡯⡾⣫⢷⣏⣏⣺⣹⢭⢯⡯⣏⡧⣏⣹⢺⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣖⡵⣱⢯⡯⣗⣷⣿⡷⢯⠍⠐⣹⢿⣿⢷⣞⠵⡃⠀⠀⠀⠀⠀⠀⢊⡩⢆⡃⠜⡂⠆⠠⠁⠀⠂⠀⠆⠀⠀⠠⠀⠀⠀⠀⢀⢐⠰⢵⢽⣝⣗⢷⢽⢯⣞⣜⣗⠽⣗⣹⣺⣲⡽⢯⣲⢳⣺⢳⢷⣝⣗⡷⣽⢽⣝⡯⡾⣫⡯⣫⢯⢾⢺⡧⡷⣏⣏⣞⠯⡼⡽⣖⡯⢧⣏⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣟⣎⢳⡞⣗⣗⡯⡾⡎⠑⠁⠀⠠⢽⡹⣋⠽⠎⠕⠈⠀⠀⠀⠀⠀⠀⠀⠐⠁⠈⠁⠠⠈⡈⠀⠐⠠⢀⠀⠀⠀⠈⠀⠀⠁⡂⢱⢦⢵⢳⢷⣗⣺⣫⡯⣺⣹⢽⡭⡽⡾⣳⣓⣳⣚⢷⣓⠷⣏⡧⣏⡭⡯⡾⣺⡯⡽⣫⡯⡾⣺⢽⣝⡽⡾⢯⢽⣳⣚⢯⢯⡾⡭⣖⡽⣧⣯⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣟⡽⢎⡵⣏⢕⢎⠈⠀⠀⠀⠠⣌⠈⡈⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠠⠈⠂⠀⠀⠀⠀⠀⠀⢨⣪⠽⣺⣣⣳⡽⣞⢯⢯⢯⣺⡼⣺⢽⡭⢯⣞⢭⡭⢯⡽⢭⢯⢳⣹⡲⢯⣚⣏⣏⡯⡾⣺⣫⡾⣺⣺⣫⢫⡯⣗⣳⢭⡯⣗⣺⢽⢺⣹⠽⡼⡽⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⢮⢪⢗⠢⡃⠐⠁⠀⠀⢀⢼⢾⣆⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠌⠀⠀⠀⠀⠀⢜⢷⣫⢭⡯⡽⣗⡯⣖⣞⣺⢯⣣⡯⢧⣫⢳⡞⣏⣹⢼⢧⣫⢗⡾⣓⡯⢳⡞⣲⡽⢽⣞⣗⢷⣺⣓⣝⣞⣳⡽⡗⡯⣏⣗⡧⣗⡯⢧⡽⢯⢳⡞⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣝⠰⡂⢈⠀⠀⠀⠨⢵⡷⡷⣽⣑⠄⠀⠀⠀⠀⠄⡀⠰⠂⢀⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢸⡉⠓⣝⡾⡯⣏⣞⠯⢷⣣⡽⡭⣞⡝⢯⣓⢯⢳⣞⠽⡼⢧⢳⡞⣞⢭⢯⣚⣝⢼⣺⢯⢯⢽⣫⢯⣗⣳⢯⢽⢾⢼⣹⣹⣚⡾⣳⣞⢳⣺⡯⢧⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⡔⡡⠥⠀⠀⠀⡁⠂⠸⡹⣹⢻⣞⢆⠰⠄⠀⠂⠈⠐⠀⠀⠈⠀⠀⠀⠀⠀⢀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⣽⡧⠀⠘⢝⣺⢼⣹⣺⢭⢷⣹⢺⣞⠽⣞⡭⣏⡳⢗⣝⡳⡞⣏⡼⣹⠶⣍⣳⠶⡯⢧⣳⡽⣝⢾⢺⣺⢯⢭⢯⣏⣏⢯⣖⡾⡭⣖⡯⢽⣲⢫⣳⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣏⡒⢁⢂⣖⠂⠀⠀⠠⣀⣀⡈⣀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠠⣿⣿⡇⠀⠀⠸⣳⢧⣗⢯⣓⢯⢳⣺⠽⡼⢧⣏⡵⢧⣫⣓⡧⣫⢳⢧⣹⡼⢞⠽⡼⣹⠾⣜⡾⡯⣏⡗⣗⡯⢽⣺⢼⣺⢽⣜⡽⣹⡼⣞⠽⡼⢽⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⡆⠐⢱⢎⢂⠀⢀⢴⡭⣹⢳⡹⡫⡛⠕⠀⠀⠐⠀⠐⠀⠀⠄⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠐⣽⣿⣿⣷⠀⠀⠀⠀⠧⣫⢳⣳⢯⣚⣓⡽⡭⣳⠶⣏⣹⢖⡭⣳⡼⢧⣓⡵⡳⡞⣏⢳⣹⠾⣗⣺⣚⣗⡯⢽⣗⣺⢯⢳⢯⣲⡭⢷⣳⣚⣓⡽⢳⡞⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣯⣿⣿⣿⣄⠌⡑⠅⢸⡾⢷⢯⢞⡌⠌⡠⡈⠌⡀⠀⠀⠀⠀⠀⠀⡁⠠⠐⠀⠂⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢴⣿⣿⣿⣷⠀⠀⠀⠀⠀⠀⠯⢧⣞⢭⡭⣳⠽⣜⡞⣕⣏⢳⡞⣕⡵⡞⡭⡞⡳⢧⡹⣜⢞⣹⣓⣞⡽⡭⣗⣺⣖⡽⡧⣷⠯⣳⣧⣯⣞⢭⣳⣿⣏⣽⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⡿⡿⠗⠆⠐⠡⢪⢯⣽⣹⣷⣹⣯⣚⢎⡦⢆⢣⣂⢀⠂⠠⠰⠂⡀⠂⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣿⣿⣿⣿⡓⠀⠀⠀⠀⠀⠀⠀⠀⠐⠫⡝⡲⢯⡹⡼⢞⡼⢮⣓⢧⡹⣎⣣⠯⡵⢞⡹⣎⢧⠯⣞⣞⠯⣗⣯⣶⣳⣷⣽⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⠻⠛⠋⠁⠁⠀⠀⠀⠀⠀⠀⠀⢪⢹⣫⣟⢽⡾⢫⢯⡺⡕⠯⡳⠮⡰⢂⠂⠈⢀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣴⣿⣿⣿⣿⡿⠋⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠑⠙⠳⠮⣍⡞⡭⡞⣱⢫⣣⠽⢼⡹⢎⡵⡝⣖⢯⢽⣽⣾⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⡿⠻⠟⠉⠉⠀⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣸⣿⣍⠥⡪⢓⡫⡸⢝⢊⡪⠎⠣⠚⠂⠑⠁⠀⠀⠈⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣠⣿⣿⣿⣿⣿⡋⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠓⠱⠋⢜⢧⡳⢞⡳⣩⠯⡼⣿⣽⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⡿⠛⠛⠁⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣿⣿⡇⠀⠀⠀⡈⡐⠑⠄⢁⠈⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣤⣿⣿⣿⣿⣿⡿⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠀⠑⢝⣶⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⠿⠃⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣿⣿⣿⣇⠀⠀⠀⠀⠂⡂⢁⠨⠈⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣴⣾⣿⣿⣿⣿⣿⡿⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣰⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⠏⠁⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣬⣿⣿⣿⣿⠀⠀⠀⠀⠀⠀⠀⠨⢂⠂⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣿⣿⣿⣿⣿⣿⣿⣿⠂⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣠⣾⣿⣿⣿⣿⣿⣿⣿⣿⣻⢿⣹⠶⢏⡽⢿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⠀
⣿⣿⣿⣿⣿⡿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣷⣿⣿⣿⡿⣿⣿⣿⠟⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣠⣿⣿⣿⣿⣿⡀⠀⠀⠀⠀⠀⠀⠈⠂⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣤⣿⣿⣿⣿⣿⣿⣿⣿⣗⠂⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣲⣿⣿⣿⣿⣿⣿⣿⢿⢺⢭⣓⢧⢯⡹⡵⡞⣏⡳⣧⣿⣿⣿⣿⣿⣿⣿⣿⣿⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣻⣿⣿⣿⣿⣿⣿⣿⣷⣿⣿⣿⣿⣟⣿⣿⣿⣿⣿⠋⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣾⣿⣿⣿⣿⣿⣿⢀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣾⣿⣿⣿⣿⣿⣿⣿⣿⡫⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣽⣿⣿⣿⣿⣿⣿⣿⣿⣿⡽⢞⣍⣳⢳⢭⢝⡼⡞⣹⠶⣏⣿⣿⣿⣿⣿⣿⣿⣿⠀
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣻⣿⣿⣿⣿⣿⣾⣿⣿⣿⣿⣿⡿⠋⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠉⢿⣿⣿⣿⣿⣿⣹⣖⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⣤⣿⣿⣿⣿⣿⣿⣿⣿⣿⡿⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣺⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣞⢭⠶⣏⢧⠯⢯⣚⡵⢧⣫⣿⣿⣽⣿⣿⣿⣿⣿⣿⠀
⣿⣿⣿⣷⣿⣿⣿⣿⣿⣿⣷⣿⣿⣿⣿⣿⣿⣿⣿⣯⣿⣿⣾⣿⣿⠃⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠛⣿⣿⣿⣿⣿⣿⣀⠕⡆⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣠⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⠉⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣸⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣻⢳⣿⣽⣽⣿⣿⣿⣿⡟⣍⣳⣯⣿⣿⣿⣿⣿⠀
⣿⣽⣿⣿⣯⣿⣿⣷⣿⣻⣿⣿⣿⣿⣿⣽⣯⣿⣿⣿⣿⣯⣿⣿⠃⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠠⢼⣾⠎⣿⣿⣿⣿⣽⡪⣒⠀⠀⠀⠀⠀⠀⠀⠀⣰⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⠓⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣷⣿⣿⣿⣿⣿⢯⠽⣟⡼⡝⣶⣷⣿⣿⣿⣿⣿⠀
⣿⣿⣾⣿⣿⣿⣷⣿⣿⣿⢿⣿⢿⣿⣿⢿⣻⣿⣿⣯⣿⣽⣿⠟⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡸⣯⣽⣿⣻⣿⣹⣿⣿⣿⡕⡆⡀⠀⠠⠀⠂⢠⣼⣻⠽⣜⡝⡪⠜⡸⠟⢿⣿⣿⣿⣟⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣼⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⠀
⢿⣿⣿⣻⣻⣿⣿⣿⣟⣿⣟⣿⣿⡿⣿⣿⣿⢿⣿⣿⢿⣿⠟⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣿⣿⣿⣿⣿⣿⣷⣿⣿⣿⣿⣿⡔⠔⠄⢀⣵⡿⣺⣓⡯⡣⣃⢕⢕⢎⢔⢕⡻⢙⢛⠄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣾⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⠀
⣿⣿⣿⣟⣟⣿⣟⣿⣿⣿⣻⣿⡿⣿⣿⣿⣾⣿⣷⣿⣿⡽⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢘⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣧⡂⠸⣿⣷⣷⡯⡳⢕⢪⢮⣵⣺⣾⡾⣮⣧⠄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⠀
convert-image.sh | ansi2html.sh > braille.html
# output 'pre' tags refined with 'line-height' and 'font-size' styling 


GNU 'source-highlight'
- Print source files [e.g. Bash scripts] with colour syntax highlighting.
01: #!/bin/bash
02: # An example Bash script
03: # Version: 2018.08.19.19.33.05
04:
05: function main() # An example function
06: {
07:    local sentence="Hello, World!"
08:    for (( i=0;i<"${1:-1}";i++ ))
09:    {
10:       echo "$sentence"
11:    }
12: }
13: main "$@"
14: exit
source-highlight -n -f html -i example.sh -o example.html

ansi2html: https://github.com/ralphbean/ansi2html
tiv: https://github.com/stefanhaustein/TerminalImageViewer
bat: https://github.com/sharkdp/bat
GNU source-highlight: https://www.gnu.org/software/src-highlite/
bash-drawille: https://github.com/mydzor/bash-drawille
additional info: https://oioiiooixiii.blogspot.com/2018/03/bash-seven-applications-designed-to.html
image source: (CC BY 2.0) https://www.flickr.com/photos/usdagov/20190061980
image source: http://www.thejournal.ie/readme/vincent-brown-eighth-amendment-yes-vote-4026621-May2018/

Bash: Create a cacophony of voices with 'Mimic' text-to-speech



A somewhat grotesque and macabre presentation of 'Mimic's text-to-speech functionality. The concept arose after recently trying out 'Mimic's features and capabilities. As a short review: I believe 'Mimic's main strengths lie in its speed and ease of execution on the command line, similar to that of 'espeak'. Other, more featured, text-to-speech solutions can be slow and cumbersome to initiate.

N.B. In the video, there are an unfortunate number of artifacts in the audio. These only manifest in PulseAudio stream recording and are not present while monitoring the audio in real-time.

#!/bin/bash
# (GNU bash, version 4.4.19(1)-release)
#
# Create a cacophony of 'Mimic' text-to-speech voices, using words found in the
# system's 'words' file. Note: The concept can be applied to any text-to-speech
# software and, as such, a rudimentary function using 'espeak' is included.
# (Mimic info: https://mycroft.ai/documentation/mimic/)
# - Optional parameters: '$1' number of words - '$2' Number of voices
#
# Version: 2018.08.11.15.54.15
# Source: https://oioiiooixiii.blogspot.com

function randWords() # RETURNS PUNCTUATED LIST OF WORDS FROM SYSTEM DICTIONARY
{
   shuf </usr/share/dict/words \
   | head -"$1" \
   | awk 1 ORS='. '
}

function randNum() # RETURNS RANDOM NUMBER BETWEEN '$1' and '$2'
{
   shuf -i "$1"-"$2" -n 1
}

function randVoice() # RETURNS NAME OF RANDOM MIMIC VOICE
{
   local voices=("ap" "slt" "kal" "awb" "kal16" "rms")
   printf "${voices[$(randNum 0 ${#voices[@]})]}"
}

function mimicSpeak() # INVOKES 'MIMIC' APPLICATION
{
   mimic \
      -voice "$(randVoice)" \
      --setf int_f0_target_mean="$(randNum 20 180)" \
      --setf duration_stretch="$(randNum 1 12)" \
      <<<"$1"
}

function espeakAlt() # INVOKES 'ESPEAK' ALTERNATIVE
{
   espeak \
      -p "$(randNum 1 200)" \
      -s "$(randNum 1 100)" \
      "$1"
}

function main()
{
   clear

   for (( i=0;i<"${2:-5}";i++  ))
   {
      local sentence="$(randWords "${1:-5}")"
      echo "$sentence"
      mimicSpeak "$sentence" &
      #espeakAlt "$sentence" &
   }
}

main "$@"
exit

#### NOTES: One line concept

# shuf </usr/share/dict/british-english \
# | head -5 \
# | sed ':a;N;$!ba;s/\n/. /g' \
# | tee >( \
#       mimic \
#          -voice slt \
#          --setf int_f0_target_mean=50 \
#          --setf duration_stretch=16 \
#   )
download: mimic-cacophony.sh

FFmpeg: Colour animation from macroblock motion-vectors



The animation is created by styling the macroblock motion vectors displayed by FFmpeg, rather than by manipulating the actual video content. The blocks of colour are created by stacking 'dilation' filters on the motion-vector layer. Before being dilated, the colouring of the arrows is extracted from the original video by a 'colorkey' overlay. Based on earlier filtergraph experiments.¹

#!/bin/bash 
# Generate stylised animation from video macroblock motion vectors, 
# and present in a side-by-side comparison with original video. 
# version: 2018.03.28.21.08.16 
# source: https://oioiiooixiii.blogspot.com 

cropSize="640:ih:480:0" # Adjust area and dimensions of interest

ffplay \
   -flags2 +export_mvs \
   -i "$1" \
   -vf \
      "
         split [original][vectors];
         [vectors] codecview=mv=pf+bf+bb,
                   crop=$cropSize [vectors];
         [original] crop=$cropSize,
                    split=3 [original][original1][original2];
         [vectors][original2] blend=all_mode=difference128,
                              eq=contrast=7:brightness=-0.3,
                              split [vectors][vectors1];
         [vectors1] colorkey=0xFFFFFF:0.9:0.2 [vectors1];
         [original1][vectors1] overlay,
                               smartblur,
                               dilation,dilation,dilation,dilation,dilation,
                               eq=contrast=1.4:brightness=-0.09 [pixels];
         [vectors][original][pixels] hstack=inputs=3
      "



¹ see also: https://oioiiooixiii.blogspot.com/2016/09/ffmpeg-create-video-composite-of.html
source video: りりあ (LILIA) https://www.youtube.com/watch?v=U1DFzSlNkV8 (used without permission) m(_ _)m

Bash: Seven[+] applications designed to display images in a terminal



UPDATE: 22nd November, 2019
Added version that uses 'mpv'. Details at bottom of blog-post

Personal notes on seven programs that provide some ability to display images in a POSIX terminal interface [bash environment specifically]. This is in no way an exhaustive list, nor does it claim to highlight the best applications available. These are merely notes made recently while investigating a solution for such a task.

source image: '"Michael Gehlert" by Harald Peter' http://piqs.de/fotos/192320.html

ImageMagick: Reversible Image Masking for lossy image formats



A demonstration of an image masking procedure using ImageMagick, presented as a Bash script. The concept is to obfuscate an image such that it becomes meaningless to the observer (human or machine) but can be easily recovered using the correct steps. Since it is based on visual alterations rather than altering the file itself, it does not suffer from informational corruption if the image is resaved and/or resized.

In its current form, it should be considered only weakly cryptographic. It could be reversed via brute-force study of patterns and edges (as was seen with VideoCrypt, the analogue video encryption used with satellite television¹). Image masking is an old idea that may still have some value today. Further notes are found within the bash script below.
#!/bin/bash
# Demo implementation of reversible image obfuscation for lossy file formats 
# (jpeg), based on ImageMagick[6] command chains.
#
# USAGE: imageMask.sh ['hide'/'recover'] ['image']
#        (Images cropped to multiples of 64 in this implementation.)
#
# * See 'NOTES' at bottom of script for further information and ideas.
#
# N.B. Regarding cryptography: reversible by brute force, edge-analysis, etc.
# Designed for privacy from casual scanning (human/machine). Inspired by
# previously developed image masking systems: (GMask, JMask, VideoCrypt, etc.)
#
# Source: https://oioiiooixiii.blogspot.com
# Version: 2018.02.19.05.02.27

function obfuscate() # Takes: 'filename', 'width', and 'height'
{
   local width="$2" height="$3"
   
   # Crop into 64x64 blocks, rotate 90 degrees, and negate 1/4.
   # Tile blocks in reversed image orientation (Height x Width).
   # Crop into 16x16 blocks, rotate 90 degrees, and negate 1/4.
   # Tile blocks in reversed image orientation (Height x Width).
   # One extra 'rotate' at end to return to original orientation.
   
   convert "$1" -crop 64x64 -rotate 90  \
      \( +repage -region 32x32+0+0 -negate \) miff:- \
   | montage miff:- -mode concatenate \
      -tile "$((height/64))"x"$((width/64))" miff:- \
   | convert miff:- -crop 16x16 -rotate 90 \
      \( +repage -region 8x8+0+0 -negate \)  miff:- \
   | montage miff:- -mode concatenate \
      -tile "$((height/16))"x"$((width/16))" miff:- \
   | convert miff:- -rotate 90 ${1%.*}_HIDDEN.jpg
   
}

function deobfuscate() # Takes: 'filename', 'width', and 'height'
{
   local width="$3" height="$2"
   # width,height values swapped, 270 rotate to match 'obfuscate' re-orientation
   
   convert "$1" -rotate 270 -crop 64x64 -rotate 270 \
      \( +repage -region 32x32+0+32 -negate \) miff:- \
   | montage miff:- -mode concatenate \
      -tile "$((height/64))"x"$((width/64))"  miff:- \
   | convert miff:- -crop 16x16 -rotate 270 \
      \( +repage -region 8x8+8+8 -negate \) miff:- \
   | montage miff:- -mode concatenate \
      -tile "$((height/16))"x"$((width/16))" ${1%.*}_RECOVERED.jpg
}

function main()
{
   local width="$(identify -format "%w" "$2")"
   local height="$(identify -format "%h" "$2")"    

   # Crude method of making the image dimensions multiples of 64
   if [[ "$((width%64))" -gt 0 || "$((height%64))" -gt 0 ]]
   then
      local width="$(((width/64)*64))"
      local height="$(((height/64)*64))"
      convert "$2" -crop "$width"x"$height"+0+0 +repage "${2%.*}_CROPPED.png"
      local filename="${2%.*}_CROPPED.png"
   fi

   [[ "$1" == "hide" ]] && obsfucate "${filename:-$2}" "$width" "$height"
   [[ "$1" == "recover" ]] && deobfuscate "${filename:-$2}" "$width" "$height"
}

main "$@"
exit

### NOTES ###################################################################

# The command chain 'algorithm' demonstrated here, is just one particular way of
# rearranging an image, using rotation, negation, and altering aspect ratios. 
# More complex chaining as well as extra measures will result in more obscurity.
# Saving files at interim stage and reordering blocks allows for greater 
# manipulation and security (e.g. unique block ordering based on pass phrases). 
#
# Advantages and uses: survives rescaling and re-compression, with minimal 
# additional losses due to principles of DCT quantisation. It allows for images 
# to be stored on-line using public/private 'cloud' services that destroy 
# cryptographic information by rescaling/compressing the image. Reversible via 
# alternate means (e.g. Python PIL etc.) if software becomes unavailable. 
# Cons: Relatively slow, cumbersome, non-dynamic way to browse images.
#
# A side-effect of the procedure is the removal of EXIF information from the 
# image, thus there is no need to include the '-strip' argument, if so desired.
download: imageMask.sh




To show the differences created when the image is resaved [with heavy compression] while in a state of obfuscation, the following was done: the image was masked and saved as a jpeg with quality set to '25'. The original image was also saved as a jpeg with quality set to '25'. Both of these images were 'differenced' with the original, and the results were then 'differenced' with each other. This final image was normalised for clarity. N.B. "Difference" does not imply quality loss but variance in compression artifacting.

Below left: Differences between original and image that underwent obfuscation then deobfuscation.
Below right: Differences [normalised] between heavily compressed images, as mentioned above.
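
A rough ImageMagick sketch of the comparison described above (filenames are hypothetical; this is not the exact command chain used for the images shown):
# Difference each re-saved jpeg against the original, then difference the results
convert "original.png" "masked_then_recovered_q25.jpg" -compose difference -composite "diff_a.png"
convert "original.png" "original_q25.jpg" -compose difference -composite "diff_b.png"
convert "diff_a.png" "diff_b.png" -compose difference -composite -normalize "result.png"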



more info: http://gmask.awardspace.info/
see also: http://oioiiooixiii.blogspot.com/2014/03/a-novel-approach-to-encrypting-images.html
related: http://oioiiooixiii.blogspot.com/2014/11/illegal-pixels.html
related: https://oioiiooixiii.blogspot.com/2015/05/creating-two-images-from-one-set-of.html
related: https://oioiiooixiii.blogspot.com/2015/06/gimpmask-encrypting-all-or-part-of-image.html
image source: c⃠  https://images.nasa.gov/details-201409250001HQ.html
image source: c⃠  https://commons.wikimedia.org/wiki/File:Moscow,_Krasnaya_Square,_Sunset.jpg

¹ http://www.cl.cam.ac.uk/~mgk25/tv-crypt/image-processing/antisky.html
¹ http://www.techmind.org/vdc/
¹ https://guru.multimedia.cx/decrypting-videocrypt/

Rough notes on moving Tumblr blog to Blogger (including rehosting images and maintaining source links)

The following are some very rough notes on moving a blog from Tumblr to Blogger, including rehosting images and preserving source links. They are presented in a fashion for personal use, but may provide help to others. They make use of Bash and GNU tools. The generic procedures as documented in links below work perfectly fine, and should provide satisfactory results for most people.
how-to #1: http://www.analyticsforfun.com/2014/04/how-to-move-your-blog-from-tumblr-to.html
how-to #2: https://yourbusiness.azcentral.com/import-tumblr-blogger-10881.html
The procedures in those links leave the images hosted on Tumblr, and also strip the 'source' URLs from each post. The bash snippets included here successfully fix these issues, with only a few potential flaws that can easily be cleaned up manually. It would have been more proper to create the XML files from scratch, or at least to manipulate the resultant XML objects directly; this was contemplated during the process but abandoned in favour of simple Bash (sed et al.). Some alternative XML manipulation instructions are listed below.
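
Purely as an illustration of the kind of rewrite involved (a hypothetical sketch, not the actual snippet from the full post; the host name and export filename are placeholders):
# Hypothetical: point Tumblr-hosted image URLs in the exported XML at a new host
sed -i 's|https://[0-9a-z]*\.media\.tumblr\.com|https://example.com/rehosted|g' "blog-export.xml"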

ImageMagick: Bidirectional repeated liquid-rescale (content aware scaling)



The use of heavy seam carving (liquid rescale / content-aware scaling) on images and video has been well documented on the internet over the past 10 years¹. The concept of repeated "bidirectional" seam carving has been demonstrated here numerous times in the past².

The concept of bidirectional carving is to resize the image only slightly, and then return it to its original dimensions. If this is repeated many times (hundreds or thousands of iterations), the image will continue to corrupt and evolve in novel ways. A simple bash script used for automating the process is presented below.
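A single iteration of the idea, sketched with ImageMagick's '-liquid-rescale' (the geometry values are placeholders; the script below automates the repetition):
# One bidirectional pass: carve down by 100 pixels, then carve back to the original size
mogrify -liquid-rescale "860x308!" "image.jpg"   # e.g. original was 960x408
mogrify -liquid-rescale "960x408!" "image.jpg"
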
#!/bin/bash

# Repeated bidirectional 'seam carving' on image. (Requires ImageMagick).
#  - Arguments: filename, iterations, size difference, quality, milestones.
#  - See 'NOTES' at bottom of script for further details
# ver: 2017.11.15.13.07.17
# source: https://oioiiooixiii.blogspot.com

function main()
{   
   # Make duplicate file for working on
   [ "$4" == "png" ] \
      && ext="png" \
      && quality="" \
      || quality="-format jpg -quality $4"
   filename="${1}_lqr-i$2-s$3-q$4.${ext:-jpg}"
   convert "$1" $quality "$filename"
   
   # Set up scaling variables
   originalRes="$(identify $1 | cut -d' ' -f3)"
   pix="$3"
   altRes="$(( $(cut -dx -f1 <<<$originalRes)+pix ))x\
           $(( $(cut -dx -f2 <<<$originalRes)+pix ))"

   #main loop
   for ((i=0;i<"$2";i++)) 
   {
      clear
      printf "FILE: $filename\nFRAMES: $((frame))\nITERATION: $((i+1))\n"
      printf "* Scaling to alt. resolution '${altRes//[[:space:]]/}'\n"
      mogrify -liquid-rescale "$altRes!" $quality "$filename"
      printf "* Scaling to original resolution '$originalRes'\n"
      mogrify -liquid-rescale "$originalRes!" $quality "$filename" 
      
      # Create a new image at milestone interval, if set
      [ ! -z "$5" ] && ! (( $i % $5 )) \
         && cp "$filename" "$((frame++))_$filename"
   }
}

main "$@"
exit

### NOTES ######################################################################
# $1=filename - Name/location of image.
# $2=iterations - The total number of desired resizes.
# $3=size difference - Amount of pixels to scale by (positive or negative).
# $4=quality - Set desired jpeg quality or 'png' (compression causes entropy).
# $5=milestones - Create new file at specified interval, capturing current state
# Possible script improvement: File i/o location in ram drive /dev/shm/ etc.
download: imageMagick_bidirectional-seam-carve.sh
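
As a hypothetical invocation, consistent with the video example described below (per the notes, a negative size difference shrinks the image, and a milestone of '1' saves every frame):
bash imageMagick_bidirectional-seam-carve.sh "image.jpg" 10000 -100 90 1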



The process can produce numerous types of effect, depending on the attributes given. Shapes can become angular or rudimentary. Sections of the image can begin to develop seams that tear and germinate. Eventually, most images cascade down into a mess of chaos that never resolves.

The example above demonstrates some of the effects different arguments used in the script can have on an image, though it is in no way exhaustive, nor shows the extremities of the effect. For a more extreme example, see the video below. The starting image size was 960x408, and the arguments given at run-time were: 10,000 iterations, size reduction of 100 pixels, jpeg quality of 90, and every frame saved.


¹ info: http://knowyourmeme.com/memes/content-aware-scaling
² related: https://oioiiooixiii.blogspot.com/search/label/Seam%20Carving
source video: https://en.wikipedia.org/wiki/The_Doctor_and_the_Devils