I spent a few weeks in the not-so-frozen Canadian northlands over the winter holidays. While there, I had the chance to visit an old childhood favorite: the Ontario Science Centre, six floors of science-based awesomeness. One of their current exhibits, the Amazing Aging Machine, uses a computer vision software package called APRIL to predict how your face will change over the next 50 years.
In this post, I explore my results from that exhibit alongside a customized aging I performed using the APRIL API.
It’s not the most flattering photo, but here I am at 26:
Take One: Amazing Aging Machine
First, my face balloons out massively:
Next, my cheek bones set downwards:
Finally, my face leans up and wrinkles a tiny bit:
Take Two: APRIL API
For this run, I had access to the raw aging metadata, so I could see exactly how old APRIL thought I was at each point in the aging sequence.
From 26 to 28, there’s not much change:
Then, by age 35, my face elongates slightly:
I while away the next couple of decades in relative facial stasis. The most pronounced change is in my skin, which pales gradually with age:
Finally, age catches up with me, and I wrinkle into a haunted septuagenarian:
A few changes, each very minor, contribute to my forlorn expression over these last three photos.
- The eyes get slightly rounder, as though they’re welling up.
- Wrinkling above the eyes gives the impression of a furrowed brow.
- The face elongates yet again, creating a drawn expression.
- As part of the elongation of the face, the mouth corners sag downwards into the merest hint of a frown.
Note the lack of deep forehead and upper-nose creases that normally accompany a furrowed brow. The mere suggestion of one around the eyes is enough to trigger our expression recognition! It’s amazing how sensitive we are to minute variations in facial muscle position.
These images provide two divergent visions for my distant future:
For comparison, here’s my father in his late 50s, looking quite a bit happier:
Why Were Those So Different?
The answer lies in the exhibit’s own description:

…the machine uses state-of-the-art aging software developed in partnership with Aprilage Development Inc. of Toronto to add decades to the faces of 8-12 year olds.
The Amazing Aging Machine is calibrated for ages 8-12, likely to match the Ontario Science Centre’s target demographic. (Sadly, I couldn’t find detailed visitor demographic data!) In my case, this creates an awkward puffy look: it’s applying changes in facial structure through adolescence, when much of our bone growth occurs.
By contrast, the APRIL API asks for your current age, allowing it to more correctly calibrate its models. As a result, the second set of faces exhibits relatively little change in shape.
What Do I Get Out Of This?
Although my face is unlikely to match either of these faces at 72, this experiment provides some insight into how our faces change with age. After all, the APRIL face aging models are based on real face data. They represent a sort of statistical average of the aging process.
Also, I get the vaguely warm feeling that comes with having contributed to our collective intelligence. I provided APRIL with a real age-labelled face, which will likely be used to help train future models.
Appendix: How To Use The APRIL API
For the more technically-minded, I’ve provided a quick walkthrough of the API aging pipeline. For all the gritty details, consult the API docs.
Before starting, I highly recommend installing a tool like jsonpp; it makes it much easier to read API results.
The first step is manual: you need to register at ageme.com, then click the confirmation link in your email.
The next step is uploading an image, but let’s check first that the API works by retrieving our user info:
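In curl, that check might look like the following. The `/user` endpoint path is my guess, not the documented route; consult the API docs for the real one. (Piping the output through jsonpp makes the response easier to read.)

```shell
# Base URL and endpoint path are illustrative, not the documented routes.
API=https://ageme.com/api
curl -s "$API/user"
```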
Oops! We haven’t authenticated ourselves. The Authorization header uses a brain-dead and highly insecure scheme: HTTP Basic authentication.
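Basic auth simply base64-encodes your username:password pair. Here is the header value built from obviously bogus credentials:

```shell
# Bogus credentials for illustration -- substitute your own ageme.com login.
AUTH=$(printf '%s' 'jdoe@example.com:hunter2' | base64)
echo "Authorization: Basic $AUTH"
```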
(Obviously this isn’t my real username/password. Substitute yours above and use the resulting base64-encoded string in the Authorization headers below. I’ll use this bogus value to illustrate the flow.)
With the correct header, we can try fetching the user info again:
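A sketch of the authenticated request; the endpoint path is again my guess, and the header value is the base64 of the bogus `jdoe@example.com:hunter2` login:

```shell
# Illustrative endpoint; AUTH is base64 of a made-up username:password.
API=https://ageme.com/api
AUTH=amRvZUBleGFtcGxlLmNvbTpodW50ZXIy
curl -s -H "Authorization: Basic $AUTH" "$API/user"
```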
Great! Now we can POST an image to the uploading endpoint with curl:
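A minimal sketch, assuming a multipart upload with an `image` form field; the field name, path, and filename are my guesses, not the documented API:

```shell
# The 'image' field name and /images path are assumptions; check the docs.
API=https://ageme.com/api
AUTH=amRvZUBleGFtcGxlLmNvbTpodW50ZXIy   # base64 of a bogus username:password
curl -s -H "Authorization: Basic $AUTH" \
     -F 'image=@me_at_26.jpg' \
     "$API/images"
# The JSON response includes the imageId needed for the aging document.
```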
Another manual step: before proceeding, you’ll need to purchase a token on the ageme.com site. At time of writing, this cost $3.99; I looked for active promotion codes, but couldn’t find any.
With your aging token purchased, you can now create an aging document. This lets APRIL know your age and ethnicity, which helps it to select the appropriate models for your particular aging sequence. It also identifies the starting image of that sequence via the imageId returned during image upload.
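A sketch of that request; the JSON field names and values are my guesses at the shape, not the documented schema:

```shell
API=https://ageme.com/api
AUTH=amRvZUBleGFtcGxlLmNvbTpodW50ZXIy   # base64 of a bogus username:password
# imageId comes from the upload response; the other fields are illustrative.
curl -s -H "Authorization: Basic $AUTH" \
     -H 'Content-Type: application/json' \
     -d '{"imageId": 12345, "startAge": 26, "endAge": 72, "ethnicity": "caucasian"}' \
     "$API/agingdocs"
```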
We’re ready to run the aging process. There’s a single method for performing all three steps, but I’ll break it down into the component steps below. Note the while loops, which wait for each step to complete. Once all steps are completed, we retrieve the aging results:
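Sketched out below; the step identifiers and route names are my guesses (the real method names are in the API docs), but the polling shape is the point:

```shell
API=https://ageme.com/api
AUTH=amRvZUBleGFtcGxlLmNvbTpodW50ZXIy   # base64 of a bogus username:password
DOC=67890                               # aging document id from the previous step
for step in 1 2 3; do
  # Kick off this step of the aging pipeline.
  curl -s -X POST -H "Authorization: Basic $AUTH" "$API/agingdocs/$DOC/steps/$step"
  # Poll until the step no longer reports itself as running.
  while curl -s -H "Authorization: Basic $AUTH" \
             "$API/agingdocs/$DOC/steps/$step" | grep -q '"running"'; do
    sleep 5
  done
done
# With all three steps done, fetch the list of aged-image URLs.
curl -s -H "Authorization: Basic $AUTH" "$API/agingdocs/$DOC/results" > results.json
```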
Finally, I wrote a bit of Python glue to fetch the URLs and name them by age:
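The glue looked roughly like this; the shape of results.json (a list of objects with "age" and "url" fields) is my assumption about the results format:

```python
import json
import sys
import urllib.request

def age_filename(age):
    """Name an image after the age it depicts, e.g. 'age_35.jpg'."""
    return "age_%02d.jpg" % age

def fetch_images(results_path):
    # results.json is assumed to hold a list like [{"age": 26, "url": "..."}, ...].
    with open(results_path) as f:
        results = json.load(f)
    for entry in results:
        urllib.request.urlretrieve(entry["url"], age_filename(entry["age"]))

if __name__ == "__main__" and len(sys.argv) > 1:
    fetch_images(sys.argv[1])
```

Saved as, say, fetch_images.py, it runs as `python fetch_images.py results.json` and drops age-named JPEGs into the working directory.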
With this, we can fetch the images.
And that’s it! Most of the process uses curl, with minimal leaning on Python for the final bit of glue.