Minifying Images on the Command Line

May 4, 2023

When I put images on this blog, I try to keep the file sizes down so browsers don’t have to load multi-megabyte images for no reason. Since I use Jekyll and a very simple setup of static files with minimal processing tools, I need to do this manually. For years, when I wanted to optimize an image file quickly, I’d upload it to TinyPNG and redownload the shrunken version. Super simple.

But I wanted a way to do this rapidly on the command line, and to be able to write scripts to batch-process images when I need to (which I commonly do for images in my Library section).

We’re in luck, because TinyPNG has a free API for this. If you’re just doing a few images here and there, the limits are generous enough to be plenty. Start by going to their Developers page, creating an account, and signing in. Then head to the API keys page and generate yourself a new one.
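
If you want to sanity-check your new key, the API is plain HTTP. Here’s a quick curl sketch based on TinyPNG’s documented shrink endpoint (the file name is just an example); the Location header in the response points at the compressed result:

curl https://api.tinify.com/shrink \
  --user api:YOUR_API_KEY \
  --data-binary @example.png \
  --dump-header -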

Once you’ve got that set up, you’ll need the tinypng-cli tool from npm (it’s open source).

You’ll need Node.js installed for this, which you can download directly or install from a package manager like Homebrew.
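
On macOS, for example, Homebrew makes that a one-liner:

brew install node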

Once that’s installed, get yourself the CLI:

npm install -g tinypng-cli

With everything set up, shrinking an image is now a piece of cake:

tinypng example.png -k YOUR_API_KEY

I then took this a step further and made myself a one-word shortcut using a function I’ve added to my shell profile (see here for more on updating your shell profile; I use zsh):

function tiny {
  tinypng $1 -r -k YOUR_API_KEY
}

Now a simple command like this shrinks my image:

tiny example.png
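
One tweak worth noting: the function above only forwards its first argument. A variant with "$@" (my tweak, not from the original post) passes everything through, so multiple files or extra flags should work too, assuming the CLI accepts them:

function tiny {
  # "$@" forwards all arguments through to tinypng
  tinypng "$@" -k YOUR_API_KEY
}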

I’ve also built a simple script template in Ruby to batch-convert the JPEGs in a single directory. It’ll pick up any images inside its directory and put minified copies in a resized/ subdirectory:

#!/usr/bin/env ruby

require "fileutils"

FileUtils.mkdir_p("resized") # minified copies go here

Dir.glob(["*.jpg", "*.jpeg"]) do |f|
  puts "Minifying #{f}..."
  FileUtils.cp(f, "resized/#{f}")
  # Shell functions like `tiny` aren't visible to backticks, so call tinypng directly
  `tinypng resized/#{f} -k YOUR_API_KEY`
  puts "Minify complete."
end
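
To run it, save the script alongside your images, make it executable, and call it (minify.rb is just a name I’m assuming here):

chmod +x minify.rb
./minify.rb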

Terra

September 7, 2013

Inspired by a couple of other projects, I released a micro project of mine called Terra to provide a fast way to run several geospatial tools on your computer.

Because I work with a variety of GIS datasets, I end up writing lots of scripts and small automation utilities to manipulate, convert, and merge data in tons of different formats. Working with geo data at this scale challenges the non-software developer to get comfortable with basic scripting and programming. I’ve learned a ton in the last couple of years about Unix environments and the community of open source geo tools for working with data in ways that can be pipelined or automated. Fundamental knowledge of bash, Python, or Ruby quickly becomes critical to saving yourself countless hours of repetitive, slow data processing.

The renaissance tool of choice for all sorts of data munging is GDAL, and its resident command line suites, GDAL and OGR. The GDAL and OGR programs (for raster and vector data, respectively) are super powerful out of the box, once you understand the somewhat obtuse and involved syntax for sending data between datasources, and the myriad translation parameters (there’s a sketch of the syntax below). But where these get extra powerful as multitools for all types of data is when you can read from, and sometimes write to, proprietary data formats like Esri geodatabases, ECW files, MrSID raster images, GeoPDFs, SpatiaLite, and others. Many of these formats, though, require you to build the toolchain from source on your own, including the associated client libraries, and this process can be a giant pain, particularly for anyone who doesn’t want to learn the nuances of make and binary building[1].

The primary driver for building Terra was to have a simple, clean, consistent environment with a working base set of geotools. It gives you a prebuilt configuration that you can have up and running in minutes.
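
A taste of that syntax (the file names here are made up for illustration); note that ogr2ogr takes the destination before the source:

# Vector: shapefile to GeoJSON
ogr2ogr -f GeoJSON parcels.json parcels.shp

# Raster: MrSID imagery to GeoTIFF (needs a GDAL build with the MrSID driver)
gdal_translate -of GTiff imagery.sid imagery.tif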

Terra uses Vagrant for provisioning virtual machines, and Chef, an automation tool, for batching up the setup and maintaining its configuration. Vagrant is really a wrapper around VirtualBox VMs, and uses base Linux images to give you a clean starting point for each deployment. It’s amazing for running dev environments. It supports both Chef and Puppet, two configuration management tools for automating installation of software. I used Chef since I like writing Ruby, and created recipes to bootstrap the installs.
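
If you haven’t used Vagrant, the basic lifecycle is just a few commands (the box name and URL below are the stock Ubuntu 12.04 image from that era, purely for illustration):

vagrant box add precise64 http://files.vagrantup.com/precise64.box
vagrant init precise64
vagrant up       # boot and provision the VM
vagrant ssh      # shell into it
vagrant destroy  # tear it down when you're done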

This all started because I got sick of setting up custom GDAL builds on desktop systems. Next on the list for this mini project is to provision installs of some other open geo apps, like TileMill and CartoDB, to run locally. Try it out on your computer: all you need is VirtualBox and Vagrant installed, and setup is a few simple commands. Check it out on GitHub, follow the README to get yourself set up, and post an issue if you’d like to see other functionality included.
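
The setup amounts to something like this (the clone URL is my assumption; use the one from the repo’s README):

git clone https://github.com/colemanm/terra.git
cd terra
vagrant up   # downloads the base box and runs the Chef recipes
vagrant ssh  # log in with the geo tools ready to go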

  [1] I don’t mean to bag on Makefiles; they’re fantastic.