Edward Vielmetti has been writing the Vacuum weblog since 1999 from Ann Arbor, Michigan. The topics vary widely, with over 2000 entries in the whole collection.
In the interest of simplifying the presentation, some parts of this collection are currently offline, and the front page primarily reflects current work rather than the full diversity of interests.
The systems that are a current area of focus - and the places I draw inspiration from for each - are as follows. One might loosely couple these together under the intersection of “DevOps” and the “Internet of Things”, but that would be an oversimplification at many levels.
Software defined radio. Building radio receivers and decoders in software. I’m using an RTL-SDR tuner stick and the dump1090 software to receive ADS-B broadcasts and decode them into airplane locations. From listening to distant radio stations to decoding digital modes, SDR tools are rapidly transforming the nature of radio experimentation.
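dump1090 makes the decoded data easy to consume: it serves one-line CSV records in the SBS-1 “BaseStation” format on TCP port 30003. As a sketch of what working with that feed looks like, here is a small parser for the airborne position messages (type `MSG,3`), which carry the ICAO address, altitude, and coordinates; the field layout follows the SBS-1 convention dump1090 emits.

```python
def parse_sbs_position(line):
    """Parse an SBS-1 'MSG,3' airborne position record from dump1090's
    port-30003 feed. Returns a dict with the aircraft's ICAO address,
    altitude, and coordinates, or None for other message types or
    records without usable position fields."""
    fields = line.strip().split(",")
    # SBS-1 layout: field 5 = ICAO hex ident, 12 = altitude,
    # 15 = latitude, 16 = longitude (1-based)
    if len(fields) < 17 or fields[0] != "MSG" or fields[1] != "3":
        return None
    try:
        return {
            "icao": fields[4],
            "altitude_ft": int(fields[11]),
            "lat": float(fields[14]),
            "lon": float(fields[15]),
        }
    except ValueError:
        return None
```

Feed it lines read from a socket connected to `localhost:30003` and you get a stream of plane positions suitable for plotting or logging.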
Raspberry Pi. This little ARM-based single-board computer is a launching point for small-scale experimental computing interfaces to the outside world. I’m running the Hypriot distribution on my systems, which provides the infrastructure to run Docker. Pis are cheap and cheerful, and the ARM chips they are built on have applications all the way from disposable embedded computers to power-efficient data-center-scale compute farms.
Embedded controllers with wireless data. Products like Particle’s Electron and the Hologram Dash systems point the way to a future where small battery-operated devices have some kind of always-on global connectivity. Battery constraints and power consumption are the biggest considerations, and there’s a wide variety of possible radio frequencies and associated chipsets to be evaluated.
Docker. This tool encapsulates system dependencies so that even complex software can be launched and run without going through an extensive installation process. It runs on small systems like the Raspberry Pi, on development machines under OS X or Linux, and on servers such as CoreOS instances on EC2. Container technology and cluster technology are still changing rapidly, with every new release offering new functions (and the risk of a smattering of interesting new bugs).
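The encapsulation boils down to a short Dockerfile. As a hypothetical sketch, this one packages a Python service together with its dependencies so it runs identically anywhere the matching base image exists (on a Pi, the base image must be an ARM build):

```dockerfile
# Hypothetical Dockerfile: bundle a small Python service and its
# dependencies into one shippable image.
FROM python:2.7-slim

# Install dependencies first so this layer is cached between builds
COPY requirements.txt /app/
RUN pip install -r /app/requirements.txt

# Add the application itself
COPY app.py /app/

CMD ["python", "/app/app.py"]
```

One `docker build` and `docker run` later, the service is up, with no per-machine installation steps.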
CoreOS. This minimalist operating system is designed primarily to run Docker containers and to handle system updates automatically. As long as your services can handle being rebooted periodically for externally scheduled system updates, it does a great job of providing a stable platform for container-first system designs.
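On CoreOS, services are described as systemd units that wrap `docker run`, so a container comes back automatically after one of those scheduled update reboots. A hypothetical minimal unit (service and container names are placeholders) looks like this:

```ini
# Hypothetical /etc/systemd/system/example.service on CoreOS:
# run a container as a supervised service that survives reboots.
[Unit]
Description=Example containerized service
After=docker.service
Requires=docker.service

[Service]
# Remove any stale container left over from before the reboot
ExecStartPre=-/usr/bin/docker rm -f example
ExecStart=/usr/bin/docker run --name example nginx
Restart=always

[Install]
WantedBy=multi-user.target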
Amazon Web Services. The sprawling set of Amazon capabilities means that you should never need a data center of your own, as long as you can figure out how to build systems economically out of their mostly (but not completely) reliable infrastructure. When AWS goes down, though, watch out - it’s likely to break in some novel way that your application, no matter how well engineered, will not expect to see.
AWS IoT. This Amazon Web Services stack provides a message broker based on MQTT, plus services for triggering AWS Lambda functions when messages arrive and for capturing messages into AWS DynamoDB. It’s still an open question how expensive this system is at scale.
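To make the pipeline concrete: an IoT rule hands the JSON payload of a device’s MQTT message to a Lambda function, which can reshape it into DynamoDB’s attribute-value format for storage. A sketch of that reshaping step, with hypothetical field names (`device_id`, `ts`), and with the actual `put_item` call left out so the logic stands alone:

```python
import json
import time

def iot_to_dynamodb_item(event):
    """Shape a JSON payload delivered by an AWS IoT rule into a
    DynamoDB item in attribute-value format ('S' = string, 'N' =
    number), ready to pass to put_item. Field names are hypothetical."""
    return {
        "device_id": {"S": event["device_id"]},               # partition key
        "ts": {"N": str(event.get("ts", int(time.time())))},  # sort key
        "payload": {"S": json.dumps(event)},                  # raw message
    }
```

The surrounding Lambda function would pass this dict to a DynamoDB `put_item` call via boto3.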
AWS Lambda. Instead of running Amazon Web Services one computer at a time, you run it one function call at a time. With support for Python and Node.js, Lambda is an example of “serverless” computing, where the unit of compute resource you marshal to solve a problem is just as small as you can make it.
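The unit of deployment really is just a function. In Python, a Lambda handler is any function taking an event dict and a context object and returning something JSON-serializable; this toy example (the greeting logic is invented for illustration) is a complete, deployable Lambda:

```python
def handler(event, context):
    """Minimal AWS Lambda handler in Python. Lambda invokes this with
    the JSON-decoded trigger payload as `event`; the return value is
    serialized back to JSON for the caller."""
    name = event.get("name", "world")
    return {"statusCode": 200, "body": "Hello, %s" % name}
```

There is no server to provision: you upload the function, wire it to a trigger (an API call, an S3 upload, an IoT message), and pay per invocation.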
MQTT. This simple publish-subscribe message broker standard lets very small systems communicate with the cloud in a way that doesn’t require the remote system to have a globally routable address.
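Part of why MQTT suits tiny devices is how little goes over the wire. As a sketch of the protocol itself, here is an encoder for the MQTT 3.1.1 CONNECT packet - the first thing a client sends to the broker - built from the spec’s fixed header, variable header, and payload:

```python
import struct

def encode_remaining_length(n):
    """MQTT's variable-length integer: 7 bits of value per byte, with
    the high bit set on all but the last byte."""
    out = bytearray()
    while True:
        byte = n % 128
        n //= 128
        if n > 0:
            byte |= 0x80
        out.append(byte)
        if n == 0:
            return bytes(out)

def mqtt_connect_packet(client_id, keepalive=60, clean_session=True):
    """Build an MQTT 3.1.1 CONNECT packet for the given client id."""
    # Variable header: length-prefixed protocol name, protocol level 4
    proto = b"\x00\x04MQTT\x04"
    flags = 0x02 if clean_session else 0x00
    var_header = proto + bytes([flags]) + struct.pack(">H", keepalive)
    # Payload: length-prefixed client identifier
    payload = struct.pack(">H", len(client_id)) + client_id.encode("utf-8")
    body = var_header + payload
    # Fixed header: packet type 1 (CONNECT) in the high nibble
    return bytes([0x10]) + encode_remaining_length(len(body)) + body
```

A connect for an eight-character client id comes to about two dozen bytes - a good fit for battery-powered devices on constrained radio links.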