Categories
Dev

Functionality discoverability

When I’m planning out the user experience of an app, I make sure that the most basic features are immediately obvious. Often there will be more advanced features that I want to include without detracting from the core experience. These are the kinds of things you might hide in a contextual menu or keyboard shortcut on a desktop/web app.

Advanced functionality in a touch interface might be behind a long-press, swipe, double-tap, zoom-flip-rotate-double-finger-triple-tap… whatever it is, users need to realise that the functionality actually exists in the first place before they think to look for it.

Subtle indications about how people should expect an interaction to work are difficult to get right. I think Apple’s best example of this is the camera on the lock screen. Tap it and the lock screen bounces as an indication that you can slide it up to reveal the camera app. Unfortunately, even with this example I’ve seen people who don’t take that bounce hint to mean ‘you can slide me up’.

There are plenty of other elements that people have learned though, like scroll bars and horizontal page indicators.

When subtle indications aren’t obvious enough, in-app help sounds like a cop-out, but I think it’s actually the best solution for exposing advanced functionality to users. I hope that people will seek it out once they understand the basics of my app. Of course, I could be wrong and totally overestimating people’s willingness to seek help before leaving a 1-star review.

To give a concrete example, with Autochords I weighed up the relative importance of the main functions this (admittedly simple) app provides.

  1. View chord progressions.
  2. Select a progression style and musical key.
  3. View alternative progressions.
  4. See how to play a chord.
  5. Hear how a chord sounds.
  6. Play back an entire chord progression.

And that’s not even everything. I tucked more advanced, less critical options, like dark mode, away in the in-app settings. It’s tricky to know what is most important to most of your users – the ranking I went with is definitely not the same for everyone.

Here’s the interaction required for each.

  1. View progressions.
    • Simply launch the app.
  2. Select a progression style and musical key.
    • Tap the buttons in the bottom-left and bottom-right corners.
    • OR just tap the shuffle button in the top-left.
  3. View alternative progressions.
    • Swipe on the progression in the middle of the screen.
    • I’d consider this somewhat advanced because it requires people to realise that the three-dot page indicator means they can swipe. Sure, they’re probably familiar with it from their home screen, but you never know.
  4. See how to play a chord.
    • Tap on a chord.
  5. Hear how a chord sounds.
    • Tap on the chord diagram after tapping on a chord.
    • OR long press on a chord.
  6. Play back an entire chord progression.
    • Long press on a chord and move your finger across to other chords.
    • This is definitely the most unintuitive of the features listed here.

The issue that sparked this post is #6. It’s not intuitive enough. I’m thinking that some in-app help is going to be the best solution. I’m also really excited about app preview videos in iOS 8. With a few captions and hints about what functionality is possible, I hope that people will be able to discover these somewhat hidden features in apps.


Toggling Full-Screen Split View Controllers in iOS 8

I had some trouble understanding exactly how to get a split view controller to have a toggling master view controller. The problem was that I assumed I needed to manage my own UIBarButtonItem that calls a method on the split view controller.

There are a few pieces I had to put together to get the functionality I wanted.

First, I had to set my split view controller’s preferredDisplayMode to UISplitViewControllerDisplayModeAllVisible. This displays the master on the left, detail on the right.
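In code, that first step is a one-liner. This sketch assumes the split view controller is the window’s root view controller (as in the default Xcode master-detail template) and that you configure it at launch:

```objc
// In application:didFinishLaunchingWithOptions: (or wherever you
// configure the split view controller), ask for both panes to be
// shown side by side rather than the default adaptive behaviour.
UISplitViewController *splitViewController =
    (UISplitViewController *)self.window.rootViewController;
splitViewController.preferredDisplayMode =
    UISplitViewControllerDisplayModeAllVisible;
```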

Next I needed a button to toggle the master view controller.

After re-watching WWDC session 204 (text version here), I realised that the UISplitViewController actually provides you with the button through its displayModeButtonItem.

So basically, if you put the button in a toolbar or navigation bar, its state, target, and action are all managed for you.

I set the toolbar button when the detail view controller gets set. In my storyboard-based app it looks like this:

- (void)prepareForSegue:(UIStoryboardSegue *)segue sender:(id)sender {
    if ([[segue identifier] isEqualToString:@"showDetail"]) {
        DetailViewController *controller = (DetailViewController *)[[segue destinationViewController] topViewController];

        // Set a few properties for the detail view here...
        // ...

        // And set the button item.
        controller.navigationItem.leftBarButtonItem = self.splitViewController.displayModeButtonItem;
    }
}

The last piece of the puzzle was getting the toggle action set up properly. Since the appearance of the button that the split view controller provides depends on what will happen when you press it, you need to specify what will happen. You do this in the UISplitViewControllerDelegate’s targetDisplayModeForActionInSplitViewController:. For me, that looks like this:

- (UISplitViewControllerDisplayMode)targetDisplayModeForActionInSplitViewController:(UISplitViewController *)svc {
    if (svc.displayMode == UISplitViewControllerDisplayModePrimaryOverlay ||
        svc.displayMode == UISplitViewControllerDisplayModePrimaryHidden) {
        // The master is currently hidden, so pressing the button should show it.
        return UISplitViewControllerDisplayModeAllVisible;
    }
    // Both panes are visible, so pressing the button should hide the master.
    return UISplitViewControllerDisplayModePrimaryHidden;
}
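One wiring detail worth noting: that delegate method only gets called if something is actually set as the split view controller’s delegate. The post doesn’t show where I do that, but as a sketch, assuming the app delegate conforms to UISplitViewControllerDelegate, it could look like this:

```objc
// At launch, make this object the split view controller's delegate
// so targetDisplayModeForActionInSplitViewController: is consulted
// whenever the display mode button is pressed.
UISplitViewController *splitViewController =
    (UISplitViewController *)self.window.rootViewController;
splitViewController.delegate = self;
```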

With very little code and no third-party dependencies, I’ve got a universal split view controller that behaves as you’d expect on all iOS devices, with an option to hide the master controller.