Recently I visited Alex Waterhouse-Hayward; he gave me an excellent lunch and then I observed scanography at work, which was cool. Then I remembered that I’ve long wanted to write about his unique approach to technology. Let’s start with the good part, the picture:


Hosta ‘Halcyon’, 1 August 2023.
By Alex Waterhouse-Hayward.

Alex is an interesting guy and an accomplished professional photographer; at age 80, he still gets regular gigs. Check his site and especially the blog, featuring his extremely colorful autobiography, illustrated and adorned with pictures. I’ve always had a special affection for Nicolás Guillén and the Switchblade, one of his earliest pieces. If you want to see his best boudoir work, which is dreamy and romantic, you’ll have to get to know him; it’s probably not bloggable.

Anyhow, his current photographic passion — I so hope that I still have passions, photographic or otherwise, at his age — is using a desktop scanner to capture each of the blossoms in his small but excellent garden. As you can see above, the depth-of-field is large (why is that, I wonder?) and the fall-off of light is dramatic.

Here’s the set-up.

The scanner and the subject

Scanner as camera.

See the rod coming from the upper right, just the right height to suspend the flowers? Alex credits it with opening the door to what he calls “scanography”; it’s part of a weird old desktop lamp. I told him that on basic lexicographic and etymological principles it’s still “photography”, whatever he wants to call it, but I’m not sure I convinced him.

Do you think getting that composition was quick and easy? Nope. It took Alex a half-hour of snipping back stems and re-arranging foliage to get it looking the way he wanted. The man knows what he’s doing. I speak from experience; back in 2007, I tried using Alex’s technique to capture eroded seashells, and found it taxing.

Immutable infrastructure · Here’s a picture of Alex at work; unfortunately my Pixel decided to focus on his hair, so I’ll have to explain what’s on the screen and why it might surprise you.

Alex working on his scanograph

He has an ordinary PC, a couple of years old I think. The scanner is an Epson Perfection V700, a little older.

The screen is a 22-year-old CRT monitor. On it, were it not blurred, you’d observe a 19-year-old version of Photoshop, which successfully controls the Epson scanner and imports its monster TIFFs.

Once Alex finds a technology he likes, he flatly refuses to update it. He’s retained a Windows wizard who somehow keeps all this ancient stuff running on a current OS version; I gotta say that that old Photoshop is damn fast on a modern PC. Watching him work, I was surprised by how capable that software is; the vast majority of what I do today with Lightroom Classic CC was available back then.

Alex’s strategy has also led to him refusing to update his 17-year-old Blogger setup, an attitude which sometimes fails to work well with Google’s habit of breaking the world.

Speaking as a technology fashion victim who is always running the latest version of everything, I admire this approach. I think that if more people shared it, the world would be a better place. I’ve argued before that we need to disempower the Product Managers who insist on “improving” their offerings for reasons related to their career aspirations, not their customers’ experience. I loathe their assumption that the cost of the world retraining itself is zero. Also, they frequently make things worse.

Anyhow, if you’re in Vancouver and you want a portrait or a picture of a flower or a boudoir shot, Alex would be a good choice, and you’d also meet a really interesting person.



From: Tim (but not THE Tim) (Aug 03 2023, at 22:35)

Updates should probably be available in layers:

OS versions should be able to run on multiple different years of hardware architecture. The OS should provide the API to access the hardware.

OS (and other) APIs need to remain in place for literally decades. New features can be added but old APIs should remain. From the beginning the OS or feature should provide a query function to determine if the feature is available.

UIs should be built _only_ using APIs provided by the OS and therefore could be swappable. Users wouldn't need to relearn a UI unless they wanted to. UIs could be sold like applications.

UIs should also provide APIs for applications to use, which are constant and queryable like the OS APIs.

Applications should _only_ use the provided APIs, none of this "I'm writing my own I/O routine because I don't like the OS doing it"

Yes, it's doable - the IBM mainframe software ecosystem has provided a lot of this for years.

The idea is to allow people to keep using what they like for as long as it will possibly work.

I also considered adding that I think applications need to warn you about features that need significant CPU, and allow you to turn off / "grey out" those features if you know your hardware won't handle them well. I was surprised once when an image-resizing choice caused my older desktop tower to start driving the fan hard because of its new "AI-driven" method.
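The commenter’s core mechanism — keep old APIs in place forever, and give applications a query function so they can check whether a newer feature exists before calling it — can be sketched in a few lines. This is a minimal illustration only; the feature names and the `has_feature` function are hypothetical, not any real OS API:

```python
# Hypothetical registry modeling an OS layer that never removes old APIs
# and exposes newer ones behind a feature query.
FEATURES = {
    "scan.basic": True,       # stable API, kept in place for decades
    "scan.ai_resize": False,  # newer, CPU-hungry feature; may be absent
}

def has_feature(name: str) -> bool:
    """The query function: applications ask before calling,
    instead of assuming the latest API is present."""
    return FEATURES.get(name, False)

def resize_image() -> str:
    if has_feature("scan.ai_resize"):
        return "resized with AI method"
    # Fall back to the decades-stable code path.
    return "resized with classic method"

print(resize_image())
```

The point of the pattern is that an application written against the old path keeps working unchanged for as long as the hardware does — which is exactly what Alex’s setup depends on.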



August 01, 2023