Moving Things

Hayao Miyazaki and work

I see it’s the 80th birthday of the noted animator Hayao Miyazaki.

I once read his book of essays Starting Point: 1979–1996 and was seized by a desire to do something well, with vigour and clarity, instead of tiredly poking around. At the time this made me a little miserable, and I wondered why.

* * *

The book contains essays, interviews, and documents from Miyazaki during a period in his 30s and 40s in which he directed six highly-regarded anime feature films: The Castle of Cagliostro, Nausicaä of the Valley of the Wind, Laputa: Castle in the Sky, My Neighbour Totoro, Kiki’s Delivery Service, and Porco Rosso. This is an excellent run, and history tells us that the two films he made immediately afterwards would be even more celebrated.

The planning documents describe each film before it has been made, and the interviews discuss it afterwards, when in every case it had been a success; the essays were written between films.

This structure, no doubt accidentally, gives a sense of a narrative like this: Miyazaki thinks of an idea and writes it up in clear and forceful terms; he starts drawing without a script or storyboard and develops the plot as he goes; he works solidly for three years; the film is finished; it’s a wonderful piece of work that is just as he had envisioned; the process repeats.

Reinforcing this impression is some of the content of the interviews. For example, when talking about My Neighbour Totoro — a glorious film whose origin seems as close to pure artistic creation as anything that has appeared in a cinema — Miyazaki remarks that he felt “tremendous happiness” while working on the film; that he knew throughout how it was going to work; that he could avoid any directorial tricks and keep the plot as simple as possible; that he could almost have made the entire film be just about the excitement and fear of a typhoon passing near a house at night; and that if he were to make the film again, given the same technical constraints, the only thing he would change would be to allow more time with the main characters, two small children, while they live their lives, ignoring the plot.

* * *

This is very lovely to read, but also a little daunting. There is something there, real or imagined, that I found I longed for in my work.

What is that longing? It isn’t about talent. I’m delighted by the talent of Miyazaki, but I don’t yearn for it, probably because I can’t imagine having it. It also isn’t about whether my work is worthwhile — I have doubts, but those are long-standing, unchanging doubts. I think it is about focus and application.

I believe that we all have the ability to create work that satisfies our own critical judgement as a coherent artistic effort, but that we don’t do it, because we can’t find the clarity of mind and the conviction to complete an idea with the quality that we first imagine for it.

As a mere programmer, I’m aware that much of the software I write will never be used, or not in the way I imagined it. I believe this is the unspoken experience of all programmers. Software that is made with care but not used is obviously unsatisfying. But the other side of it is software that is not made well enough to merit users at all. If it hasn’t been made well enough, then the more popular it is, the more people might be damaged by mistakes in it, and so the worse it is. Experience seems to consist partly in learning to suppress the fear of this, and to find sufficient trade-offs to ensure that anything useful ever gets published.

* * *

The perceived narrative I referred to above is not the whole story. There is plenty in the book about team work, and it hints at the vast amount of manual labour going on, with stories about overworked colour artists, slipping schedules, and the continually unmet expectation that everything will be easier next time. And at the end of the book is a retrospective timeline, from which we can see that the narrative doesn’t flow linearly either — the team must often have been working on more than one Miyazaki-directed film at once, and some films were based on ideas that had been sketched decades before.

Focusing on team effort does change the picture. You need a lot of people to make a film, and I suppose before you animate the film, you need to animate the people. If the impulse is enough to carry everyone along, then a team can maintain a direction even if their director isn’t certain where they will end up. Perhaps the sense of purpose that seems so desirable is something that one person can’t readily sustain on their own.

* * *

My reverence for the film artifact might not be shared by its makers either. Throughout this book, as well as in more recent interviews I’ve seen, Miyazaki is actively grumpy about the value of anime: there’s too much animation being made already, this is all a waste of time, we contribute nothing to the world, I only want the industry to continue because I know too many people who are animators and I don’t want them to lose their jobs. It’s only when he is sunk deep into a project that he appears to be happy about it.


On macOS, arm64, and universal binaries

A handful of notes I made while building and packaging the new Intel/ARM universal binary of Rubber Band Audio for Mac. I might add to this if other things come up. See also my earlier notes about notarization.

Context

I’m using an ARM Mac – M1 or Apple Silicon – with macOS 11 “Big Sur”; the application is written in C++ using Qt, and everything is kicked off from the command line (I don’t use Xcode).

To refer to machine architectures here I will use “x86_64” for 64-bit Intel and “arm64” for 64-bit ARM, since these are the terms the Apple tools use. Elsewhere they may also be referred to as “amd64” for Intel, or “aarch64” for ARM.

Universal binaries

A universal binary is one that contains builds for more than one processor architecture in separate “slices”. They were used in the earlier architecture transitions as well. Some tools (such as the C compiler) can emit universal binaries directly when more than one architecture is requested, but this often isn’t good enough: perhaps it doesn’t fit in with the build system, or the architectures need different compiler flags or libraries. Then the answer is to run the build twice with separate output files and glue the resulting binaries together using the lipo tool, which exists for this purpose.
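
For instance, with two hypothetical single-architecture builds of the same program already in hand, the glueing step is just:

    # combine the two single-architecture builds into one universal binary
    # (myprog-x86_64 and myprog-arm64 are illustrative file names)
    lipo -create myprog-x86_64 myprog-arm64 -output myprog

    # confirm what ended up inside
    file myprog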

How does the compiler decide which architecture(s) to emit?

The C compiler is a universal binary containing both arm64 and x86_64 “slices”, and it seems to be capable of emitting either arm64 or x86_64 code regardless of which slice of its own binary you invoke.

Perhaps the clearest way to tell it which architecture to emit is to use the -arch flag. With this, cc -arch x86_64 targets x86_64, cc -arch arm64 targets arm64, and cc -arch x86_64 -arch arm64 creates a fat binary containing both architectures.
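
A minimal illustration of those three cases (the file names are just for the example):

    # Intel-only, ARM-only, and universal builds of the same source file
    cc -arch x86_64 -o hello-x86_64 hello.c
    cc -arch arm64 -o hello-arm64 hello.c
    cc -arch x86_64 -arch arm64 -o hello-universal hello.c

    # this should report both x86_64 and arm64
    lipo -info hello-universal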

If you don’t supply an -arch option, then it targets the same architecture as the process that invoked cc. The architecture of the invoking process is not necessarily the native machine architecture, so you can’t assume that a compiler on an ARM Mac will default to arm64 output.

I imagine the mechanism for this is simply that the x86_64 slice of the compiler emits x86_64 unless told otherwise, the arm64 slice emits arm64 likewise, and when you exec the compiler you get whichever slice matches the architecture of the process you exec it from.

There’s also a command called arch that selects a specific slice from a universal binary. So you can run arch -x86_64 make to run the x86_64 binary of make, so that any compiler it forks will default to x86_64. Or you can do things like arch -arm64 cc -arch x86_64 to run the arm64 binary of the compiler but produce an x86_64-only binary.

If you invoke a compiler directly from the shell without any of the above going on, then you get the machine native architecture. I assume this is just because a login shell is itself native.
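
One way to see this in action, assuming Rosetta 2 is installed so that x86_64 processes can run at all (file names again illustrative):

    # from a native arm64 shell, with no -arch option, the output is arm64
    cc -o t test.c
    file t     # ... Mach-O 64-bit executable arm64

    # run the same compile from inside an x86_64 shell and the default flips
    arch -x86_64 /bin/zsh -c 'cc -o t test.c'
    file t     # ... Mach-O 64-bit executable x86_64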

For my builds I found it helpful to provide a cross-compile file to tell Meson explicitly which options to use for the architecture I wanted to target. That avoids the defaults being just an accident of whichever architecture Meson (or its Python interpreter, or Ninja) happened to be running in, without having to litter the build file with explicit architecture selections. I then scripted the build twice from a separate deployment script, using a different cross file for each, rather than try to have a single Meson file build both at once.
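
As a sketch of the kind of cross file I mean, something along these lines tells Meson to produce x86_64 output (the file name and flag values are illustrative, and with older Meson versions the *_args settings belong in a [properties] section rather than [built-in options]):

    # macos-x86_64-cross.txt (hypothetical name)

    [binaries]
    c = 'clang'
    cpp = 'clang++'
    strip = 'strip'

    [built-in options]
    c_args = ['-arch', 'x86_64', '-mmacosx-version-min=10.13']
    c_link_args = ['-arch', 'x86_64', '-mmacosx-version-min=10.13']
    cpp_args = ['-arch', 'x86_64', '-mmacosx-version-min=10.13']
    cpp_link_args = ['-arch', 'x86_64', '-mmacosx-version-min=10.13']

    [host_machine]
    system = 'darwin'
    cpu_family = 'x86_64'
    cpu = 'x86_64'
    endian = 'little'

The deployment script then runs something like meson setup build-x86_64 --cross-file macos-x86_64-cross.txt, with a second cross file and build directory for the arm64 pass.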

How do I target a particular version of macOS?

Use a flag like -mmacosx-version-min=10.13 at both compile and link time.

For arm64 binaries, the oldest version you can target is macOS 11. But you can still build a universal binary that combines an arm64 slice built for macOS 11 with an Intel slice built for an older version, and the result should run on those earlier versions of macOS (on Intel hardware) as well.
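
In other words, something like this (again with illustrative file names) should give a universal binary whose Intel slice reaches back to 10.13:

    # Intel slice: can target older macOS versions
    cc -arch x86_64 -mmacosx-version-min=10.13 -o myprog-x86_64 main.c

    # ARM slice: macOS 11 is the floor
    cc -arch arm64 -mmacosx-version-min=11.0 -o myprog-arm64 main.c

    # the combined binary runs on 10.13+ on Intel and 11+ on ARM
    lipo -create myprog-x86_64 myprog-arm64 -output myprog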

How does a version of macOS decide whether my binary is compatible with it?

I had this question because I had built a universal binary (as above) in which the Intel slice was, I thought, built for macOS 10.13 or newer, but when I brought it to a machine with macOS 10.15 it showed as incompatible in the Finder and could not be opened there.

The answer is that it looks at the relevant architecture slice of the universal binary, and inspects it to find a Mach-O version number. In “older” versions of the macOS SDK this version is written using the LC_VERSION_MIN_MACOSX load command; in “newer” versions (I’m not quite sure when the cutoff is) it is tagged as the minos value of the LC_BUILD_VERSION load command instead. The linker quite logically decides which load command to write based on the value of the version number itself, so if you build with -mmacosx-version-min=10.13 you get a binary with LC_VERSION_MIN_MACOSX specified.

You can display a binary’s version information with the vtool tool, and it also appears in the list of information printed by otool -l. In theory you can also change this tag using vtool, but (a) that’s a bad idea – fix it in the build instead – and (b) vtool segfaulted when I tried it anyway.
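
For example, to check which tag a particular slice carries (myprog being an illustrative name):

    # list load commands for the x86_64 slice and look for the version tag:
    # LC_VERSION_MIN_MACOSX (older style) or LC_BUILD_VERSION with a minos value
    otool -arch x86_64 -l myprog | grep -A4 -E 'LC_VERSION_MIN_MACOSX|LC_BUILD_VERSION'

    # vtool summarises the same information
    vtool -show myprog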

And after all that, in my case the cause turned out to be that I’d failed to supply the -mmacosx-version-min flag at link time.

Why is my program being killed on startup?

It appears that if you build a program for one architecture and then rebuild it for the other architecture to the same executable file, without deleting the executable in between, sometimes the result doesn’t run: it just gets “killed (9)” on startup. I failed to discover why, and I failed to reproduce it just now in a test build. I guess if that happens, delete the executable between builds.

* * *

Bonus grumble about Mac trackpad and mouse options

This is not useful content. Please do not attempt to read it.

I haven’t used a Mac in such earnest for a while now, so of course I’ve been rediscovering things about macOS that I don’t get on with. One that I find particularly maddening is the way it handles scroll direction for the trackpad and an external mouse.

I switch between the two a lot, and I like to use the “natural scrolling” direction (touchscreen-like, so your fingers are “pushing” the content) with the trackpad, but the opposite with the mouse, which has a scroll wheel or wheel-like scrolling zone whose behaviour I became accustomed to before touchscreen devices started sprouting everywhere.

Fortunately, macOS provides separate trackpad and mouse sections in the system preferences, which contain separate switches for the scroll direction of the trackpad and mouse respectively.

Unfortunately, when you change one of them, the other one changes as well. They aren’t separate options at all – they’re just two different switches in different windows that happen to control the same single internal option! So every time I go from trackpad to mouse or back again, I have to also go to system preferences and switch the scroll direction by hand. That is so stupid.

(Linux and Windows both have separate options that actually work as separate options. Of course they do. Why would they not?)