Why some users fear the iPad. And other lessons in app design.


Sometimes insight comes from weird places.

My co-founder Josh Koppel likes to tell the story of how, back in the day, ScrollMotion had a mega-client, a Big Pharma beast, that used our iPad software to make sales pitches to doctors. (Pharma loves the iPad because there's no risk of a salesperson leaving behind any demo materials, which could open the door to lawsuits if they contained any, uh, fatal errors.) The team had spent more than a year helping the company train its sales force when they noticed something bizarre: Some salespeople were reluctant to use the iPad. But for the life of them, Josh and his team couldn't figure out why; it wasn't due to age or gender. Then at one big sales meeting the trainers finally saw what it was: Certain people were embarrassed about their nails! It was a freak, million-dollar revelation. So management did the most sensible thing it could think of--sending everyone off to get manicures or, if they preferred, a stylus. Problem solved.

The moral of that experience: Think outside your app. If you are a developer, you should be killing yourself to understand how your software intersects with lived human experience. How do the physical environments and use-case contexts in which your app is deployed affect the way it's used (or, just as important, prevent or discourage its use)? And what design and experience decisions naturally flow from those insights?

When I look at the best apps, it's clear that they share this quality: They've been built to integrate as deeply as possible into the lives of their users. Here are a few that do so exceptionally well.


Expensify

If you're like me, your receipts are constantly piling up and you hate them deeply. Well, by anticipating your pain--by carefully reading the environment in which it's used--this app eliminates a major headache. Expensify handles the whole expense-report process with a single touch of a button, or damn close: You can pull out your phone when the check comes to the table, take a picture, and put your phone away; at the end of the month the app sends you a beautiful custom expense report. No more taping and photocopying and mailing.


Evernote

Note-taking is a crowded market, and Evernote has built its dominant position through inspired moments of design and engineering microgenius--like the way it scrapes your calendar for the title and subject of the meeting as you walk in, then makes them the title and subject of the notes you're about to take. It does the same with location, and it integrates with Photos if you need to pull an image in. It's a great example of using hardware and software--and the tools iOS gives you--to build empathy into the app.

Apple Memories

A couple of years ago Apple introduced a feature in Photos called Memories, which takes thematically connected images and video--from a vacation, say, or a certain location--and, in a feat of pretty spectacular engineering, turns them into auto-generated mini-videos. Why? Apple knows you have a lot of photos, and it wants not only to help you make them useful but also to keep you engaged with its hardware. So it helps you generate beautiful things to share.


One of the biggest contextual factors out there is accessibility. Almost one in five people has some sort of disability. That's a lot of people who aren't using your app if you don't factor in their needs as you build it. Apple has great documentation on accessibility and every developer should make it her business to study it closely. I will devote a whole column to this down the road.