Batch Image Processing and Uploading to S3

I run another blog where my images are stored in AWS S3. I upload each image in two sizes and don't keep the original in the bucket. Since I use Linux on my personal computers, I wrote a shell script to make this easier.

#!/bin/sh

# bail out if no album folder was given
[ -d "./$1" ] || { echo "usage: $0 album_name" >&2; exit 1; }

# create full size images
mkdir -p ./"$1"/1000
mogrify -auto-orient -strip -resize 1000x1000 -quality 60 -path ./"$1"/1000 ./"$1"/*.jpg

# create thumbnails
mkdir -p ./"$1"/400
mogrify -auto-orient -strip -resize 400x400 -quality 60 -path ./"$1"/400 ./"$1"/*.jpg

# delete originals
trash-put ./"$1"/*.jpg

# upload to S3
aws s3 cp ./"$1" s3://pics-fatguy-org/"$1" --recursive

This assumes you have ImageMagick, the AWS CLI, and trash-cli installed. You can skip trash-cli and change the trash-put line to rm.
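If you want the script to fail fast when one of those tools is missing, a check along these lines could go near the top. This is just a sketch; the command names are the standard binaries the three packages install.

```shell
#!/bin/sh
# Sketch of a dependency check: prints the name of each command that is
# not on PATH. mogrify comes from ImageMagick, aws from the AWS CLI,
# and trash-put from trash-cli.
check_deps() {
  for cmd in "$@"; do
    command -v "$cmd" >/dev/null 2>&1 || echo "$cmd"
  done
}
# usage: missing=$(check_deps mogrify aws trash-put)
```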

I keep the shell script, named fatguy, in the root folder of my local copy of the images, which contains one subfolder per album. I drop a copy of the full-size images into an album folder and run the script against it:

./fatguy album_name

This creates two subfolders, “400” and “1000”, containing the resized images, deletes (to trash) the originals, and then uploads what is left to my S3 bucket.
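One detail worth knowing about the mogrify calls: a plain 1000x1000 geometry will also enlarge images smaller than that box. ImageMagick's > geometry modifier makes -resize shrink-only, and it needs quoting so the shell doesn't read it as a redirect. A hypothetical variant of the full-size step:

```shell
#!/bin/sh
# Sketch: the '>' modifier tells ImageMagick to only ever shrink, so
# small originals are left at their native size. It must be quoted to
# stop the shell from treating it as an output redirect.
geometry='1000x1000>'

shrink_album() {
  mogrify -auto-orient -strip -resize "$geometry" -quality 60 \
    -path ./"$1"/1000 ./"$1"/*.jpg
}
```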

I may change it later to output the full URL of each image for easy pasting into WordPress, but this works well enough for now.
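That future change could be sketched roughly like this. The virtual-hosted URL format below is an assumption and depends on how the bucket and its permissions are actually set up:

```shell
#!/bin/sh
# Hypothetical sketch: print a public URL for each resized image so it
# can be pasted into WordPress. The host format is an assumption; adjust
# it to match how the bucket is really served.
print_urls() {
  album="$1"
  for f in ./"$album"/1000/*.jpg ./"$album"/400/*.jpg; do
    [ -e "$f" ] || continue   # glob matched nothing
    echo "https://pics-fatguy-org.s3.amazonaws.com/${f#./}"
  done
}
```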
