Since Mitosynth has received a lot of comments regarding its design, I thought it might be nice to look at how that evolved over the course of development.
iOS apps are an interesting design challenge. On the one hand, you have excellent high-res screens, a relatively powerful GPU, and graphics & animation APIs that are for the most part a pleasure to work with (and probably better than any other platform I’ve developed for). Touch-screens invite direct manipulation, and there’s lots of scope for doing interesting things with multitouch.
On the other hand, you have to work within the limits that brings: there’s the physical size of a finger on-screen, which restricts the amount of “stuff” you can show at once while remaining usable. There’s the fact that you can’t indicate interactivity through “hover states” like you can on a desktop device (e.g. buttons lighting up as you roll a mouse pointer over them), and the fact that your app might be used in awkward circumstances: outdoors in bright sunlight, or on a juddering train. And you have to be careful not to end up “hiding” features behind obscure gestures.
Synth apps often take a very skeuomorphic approach to their design, trying to replicate the actual front panels of real-life hardware synthesisers. I can understand why you’d do that, but it usually feels like the wrong approach to me. Hardware synth controls are designed for real-world, physical, hands-on use: they often don’t feel right in a touch-screen environment where you can’t feel the controls under your fingertips, and can’t brush a rat’s nest of patch-cables out of the way – on-screen, they all tend to merge into a big blob of colour – and even real-world controls are held back by the constraints of physics and manufacturing.
Not only that, but as we’ve seen over the past year, iOS 7 has swerved in the direction of clean, slim lines, the stripping back of textures, and doubling down on animation. You’d be forgiven for thinking Mitosynth’s design was provoked by that trend – but, here’s the earliest mockup I can find, dating back to a month or two before anyone outside of Apple got a look at iOS 7:
It’s funny, I’d already been experimenting in the design areas we now call “the iOS 7 look”: slender, cleaner, less textural, and with a strong emphasis on using animation to help “explain” what’s going on when you work with an interface, and to add character so that the clean lines don’t feel too “sterile”.
Looking back, plenty has changed – but a surprising amount is there from the outset. This image started with me just opening up an iPad-sized screen in Sketch, throwing down a screenshot of Grain Science’s keyboard as a placeholder, and experimenting with ideas. The main thing I knew was that I needed the UI to have a lot more flexibility, so that it could make good use of the audio engine I had in mind.
The basic concept of the dials, with large text displays for their values that flip out into graphs when automated, is there, and so are many of the colour and font choices. The design of almost everything has been refined during the course of development, but the dials themselves are almost entirely untouched.
(You can see that I started with an awful “radar sweep”-style display, briefly toyed with some hexagonal UI, then started iterating on what would become the final design, in a series of variations in the top-right.)
There was a lot of sci-fi influence in this, particularly with the slightly-glowing buttons and switches, and the target-reticule box/cross elements marking out the control grid. Most of these elements would soon get canned, to reduce clutter.
I’d been experimenting with glass panels, too. In a previous project (Portray, and also another app that never saw the light of day) I’d done a bunch of R&D on efficient techniques for applying those sorts of filters, so I knew it could be done. I laughed when iOS 7 was introduced in a slick video narrated by Jony Ive, about 75 minutes into WWDC ’13, with glass panels sliding up over the Home Screen…
Here’s an image showing an early version of a popup glass panel (at this point, I still saw it as a popup attached to the dial itself, like in Grain Science).
iOS 7 had been announced by this point, as evidenced by the much more iOS 7-like multi-choice selector (though these ended up being set aside in favour of the “rolling button” selectors because they fit neatly into the layout alongside the other controls). The sci-fi buttons have been canned, but the reticule grid is still clinging on for dear life…
So, the problem with the Grain Science-style popups-attached-to-dials idea was that I knew I wanted to be able to automate the automation – the “LFOception” idea at the heart of Mitosynth’s modulation system – but it didn’t fit well with popups.
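To make the “LFOception” idea concrete: it means modulating a parameter with an LFO whose own settings are in turn driven by another LFO. Here’s a minimal sketch in Python – purely illustrative, with made-up names and rates, not Mitosynth’s actual engine:

```python
import math

def lfo(rate_hz, t, depth=1.0, offset=0.0):
    # Basic sine LFO: oscillates between offset - depth and offset + depth.
    return offset + depth * math.sin(2 * math.pi * rate_hz * t)

def nested_lfo(t):
    # "LFOception": a slow LFO modulates the *depth* of a faster LFO,
    # which in turn would modulate some synth parameter.
    depth = lfo(rate_hz=0.25, t=t, depth=0.5, offset=0.5)  # cycles slowly between 0 and 1
    return lfo(rate_hz=5.0, t=t, depth=depth)              # fast LFO with a wandering depth

# Sample the combined signal over one second at 100 Hz.
samples = [nested_lfo(n / 100.0) for n in range(100)]
```

Each level of nesting is just another modulator feeding a parameter of the one below it – which is exactly why a popup-per-control UI runs out of road so quickly.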
Popping up a popup over a popup of a popup is prohibited by Apple’s human-interface guidelines (also prohibited by: common sense). Trying to cram a navigation interface (with “back” buttons and a title bar and so on) into the popup also felt inelegant, confusing and cramped.
This held up the design for some time as I ran through different ideas (mostly on paper or in my head) of concepts like: instead of popping up the controls, having the entire UI shift out of the way to make room for them (similar to the way Mitosynth’s settings screen expands to make room when you tap on Scales or MIDI Channel), or a “zooming”-style interface (like iOS 7’s Calendar on the iPhone). Nothing really worked, though: too many problems, either from a user-interaction or a technical standpoint. It ended up being confusing, or dizzying, or quickly getting ridiculous when you started to “dive in” more than one level.
I set it aside for a bit (leaving it percolating in the subconscious) to concentrate on something else that needed to be sorted out early on in the design: the iPhone version.
Grain Science had been designed for the iPad first, and although I always planned to bring it out for iPhone as well (and the size of the control panels was chosen accordingly), in practice, it wasn’t as smooth an experience on the iPhone as I wanted it to be. So for Mitosynth, I decided instead to start by designing for the iPhone, and then allowing the iPad to “relax into the extra space”, like letting its belt loose a notch after dinner. Here’s the earliest iPhone image I can find:
The big W was just a placeholder – at this point, I knew it would be heavily based on wavetables, but I didn’t know what it would be called. The main navigation principle was set, though, with the four tabs for help/info, settings, library and editing.
Here’s the earliest experiment for the editing screen. I had the idea that the routing would be shown like a map of the underground network, with the FX being stops along the train line. I still think there’s something to that idea – maybe in a future app? This version sure looked ugly, though! But the dials already have their final colours here, with the idea of using pale blue for information displays and labels, and amber for things that can be edited.
Starting to look a little familiar, and with the navigation tabs & editing merged, and proper icons instead of randomly-chosen placeholders:
Breaking away from the tube map idea, I started with these blue chevrons indicating the signal flow. An immediate improvement:
Tweaked size, introducing the smaller “+” button for inserting effects.
Then came the idea of shrinking the signal flow down, on the limited-size iPhone screen, to fit into the sidebar (taking over from the settings/info/etc tabs) when editing a particular section:
Notice this primitive design of the wavechamber editor actually has a very, very early (and ugly) concept for the gridcøre feature that was added in 1.1. The gridcøre mode was the original idea for Mitosynth’s wavechamber, and the closest match for what’s going on in the engine itself. But it’s also very complex and potentially overwhelming for new users, especially with all the other stuff I was hoping to add (such as additive synthesis and prefilter). I quickly realised it would put a lot of people off if this was the only way to set up patches.
My solution was to add the simpler wavechamber modes, starting with the basic sampler (effectively a 1x1 grid, but with a much simpler interface), blender (1xN grid, straightforward list-based interface), and allowing you to set up prefilter to fill out the other axis of the grid.
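The relationship between those modes can be sketched as a 2D grid of waveforms, blended along each axis – again, a hypothetical illustration of the idea, not Mitosynth’s actual engine code:

```python
def blend(a, b, t):
    # Linear crossfade between two equal-length waveforms (lists of samples).
    return [(1 - t) * x + t * y for x, y in zip(a, b)]

def grid_lookup(grid, x, y):
    # Bilinear blend across a 2D grid of waveforms.
    # grid[row][col] is a waveform; x and y are positions from 0 to 1.
    rows, cols = len(grid), len(grid[0])
    gx, gy = x * (cols - 1), y * (rows - 1)
    c0, r0 = int(gx), int(gy)
    c1, r1 = min(c0 + 1, cols - 1), min(r0 + 1, rows - 1)
    fx, fy = gx - c0, gy - r0
    top = blend(grid[r0][c0], grid[r0][c1], fx)
    bottom = blend(grid[r1][c0], grid[r1][c1], fx)
    return blend(top, bottom, fy)

# A "sampler" is effectively a 1x1 grid, a "blender" a 1xN grid,
# and the full grid blends along both axes at once.
saw = [0.0, 0.5, 1.0, -1.0]       # toy four-sample waveforms
square = [1.0, 1.0, -1.0, -1.0]
grid = [[saw, square]]            # a 1x2 "blender"
halfway = grid_lookup(grid, 0.5, 0.0)
```

With only one row, the y-axis is a no-op and you get a plain crossfade – which is why the simpler modes could ship first while the full grid kept evolving.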
This was so successful, I decided to hold gridcøre back: there was already so much awesome functionality in Mitosynth using the simpler modes, and I knew 1.1 would be out pretty soon after. This gave it extra time to evolve into the gridcøre you’re familiar with.
At this point, I went on a side-trip to sort out what the things-being-edited would be. I’d ditched the underground map idea, and was thinking about bio-science: lots of synths and audio tools have names taken from high-energy physics, but this is the century of biotech. I started to experiment with cells under a microscope (uh… as a concept. I don’t own a microscope!), and the glass popups became something more like glass microscope slides:
Lots of cells, Petri dishes, abstract sciency symbols… and one obvious winner:
How does it look in situ?
Pretty good! Starting to get close to what you’re familiar with now. Those chevrons had to go in the end, though – they looked messy, and were annoying to manage when doing things like inserting/removing effects or shrinking the cells down into the sidebar; it was difficult not to make them look misaligned or cluttered in one configuration or another. Ultimately, I decided they just weren’t necessary.
This leads up to one of the first images of Mitosynth ever released to the public:
Unlike the concept art, this was a real screenshot from running code, with the cells floating and warping in whatever liquid they’re suspended in.
A couple of other details… the ugly graphs in earlier mockups got replaced with new ones inspired by the DNA double-helix, which looked nicer and kept up the bioscience theme:
And the piano keyboard started out trying to conform to the smooth, flat glass look, but ultimately it just looked terrible and a more realistic keyboard (though simpler than Grain Science’s) returned:
And finally, to bring it all together: