Ken Burns on the App Store

Ken Burns and Big Spaceship got together to create an immersive iPad app that sheds new light on American history. By highlighting curated moments from Ken Burns' vast library of work, it surfaces recurring patterns, visually interweaving connections between events, people and places throughout time.

The app was released on February 10, 2014 and was immediately featured as App of the Week in the App Store, becoming an Editor’s Choice within only a few days. During launch week it was covered by Wired, The Verge, Ars Technica, Mashable and many other press outlets.

Responsibilities

  • Tech Lead
  • iOS Lead
  • UX & UI Prototyping

Technologies

  • Objective-C
  • Sprite Kit
  • PHP

As the tech lead, I not only managed resources, the timeline and the core application design and wrote the majority of the code, but was also heavily involved in concepting, UX design and animation studies. To explore the potential of the app’s experimental interface and user interactions, I worked closely with the design and UX team to iterate over a variety of prototypes.


Easing and Layout

One of the highlights of the Ken Burns app is that all of the clips on the timeline can seamlessly transition between many layouts. iOS provides great ways to animate and lay out interface components, but building a seamless system required custom math and easing.

For example, instead of just linearly moving all clips off the timeline and shrinking the rest, I wanted to give their movement a much more organic feel, allowing users to drag a little and get immediate feedback through motion.

Most of that math revolved around normalizing input values coming from user interaction and running them through some form of easing function (e.g. simple exponential easing via powf()). Here is a stripped-down example of how the position and scale of dragged clips were calculated:

/**
 *  Handles a vertical drag, separating a group of selected clips based
 *  on the distance of the drag.
 *
 *  @param progress The normalized drag distance (-1.0 ... +1.0)
 */
- (void)draggedWithProgress:(CGFloat)progress {
    // ease the normalized drag exponentially; the even exponent means only the
    // drag distance matters, not its direction
    CGFloat progressExpo = powf(progress, 4.0f);

    // interpolate from the clip's resting position towards the timeline center
    CGPoint targetPosition = CGPointMake(
        startPosition.x + (center.x - startPosition.x) * progressExpo,
        startPosition.y + (center.y - startPosition.y) * progressExpo
    );

    // ... apply targetPosition (and a matching scale) via SKAction
}

This basic principle applied to a lot of animations, like smoothly spreading out and reorganizing clips as they’re dragged into a playlist or organically distributing clips around a selected clip in the zoomed-in timeline view:

/**
 *  Selects a single clip node and pushes away any surrounding clips.
 *  Zooms in the timeline if it is not zoomed in already.
 *
 *  @param clip The clip to select.
 */
- (void)selectClip:(KBClip *)clip {

    // ...

    // spread all other clips around the selected node
    for (KBClipNode *otherNode in _clipNodes) {
        if (otherNode == _selectedClipNode) {
            continue;
        }

        // random angle offset (don't bother with real random here, just fluff)
        CGFloat angleDeviation = maxAngleDeviation * M_PI * (0.5f - (float)rand() / RAND_MAX);

        // take perspective into account for radius, extending it for clips further in the bg
        CGFloat targetRadius = selectionRadius + (1.0f - otherNode.parallaxDepth) * selectionRadius;

        CGPoint position = otherNode.sourcePosition;
        CGFloat dx = position.x - selectedPosition.x;
        CGFloat dy = position.y - selectedPosition.y;
        CGFloat d = sqrtf(dx * dx + dy * dy);
        CGFloat angle = atan2f(dy, dx) + angleDeviation;

        // push nodes further away the closer they are
        CGFloat repulsion = powf(targetRadius / d, 1.15f);
        CGFloat targetDistance = fmaxf(targetRadius, d * repulsion);
        CGFloat targetDistanceX = distanceStretchFactor.x * targetDistance * cosf(angle);
        CGFloat targetDistanceY = distanceStretchFactor.y * targetDistance * sinf(angle);

        CGPoint targetPosition = CGPointMake(
            targetDistanceX + selectedTargetOffset.x,
            targetDistanceY + selectedTargetOffset.y
        );
        CGFloat targetScale = fmaxf(0.5f, fminf(otherTargetScale, otherTargetScale * repulsion));
        CGFloat targetAlpha = fmaxf(0.0f, fminf(otherTargetAlpha, otherTargetAlpha * repulsion));

        // ... apply values via SKAction
    }
}
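
The elided "// ... apply values via SKAction" step boils down to grouping a few standard Sprite Kit actions for each node in the loop. Here is a minimal sketch of how that might look; the duration and timing mode are assumptions:

SKAction *move  = [SKAction moveTo:targetPosition duration:0.35];
SKAction *scale = [SKAction scaleTo:targetScale duration:0.35];
SKAction *fade  = [SKAction fadeAlphaTo:targetAlpha duration:0.35];

// ease out so the spread feels organic rather than mechanical
move.timingMode  = SKActionTimingEaseOut;
scale.timingMode = SKActionTimingEaseOut;
fade.timingMode  = SKActionTimingEaseOut;

// run position, scale and alpha changes in parallel on the pushed-away node
[otherNode runAction:[SKAction group:@[move, scale, fade]]];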

Bridging UIKit and Sprite Kit

To control the environment and trim the development timeline, we built the app exclusively for iPads running iOS 7. This also meant that we were writing code in unexplored territory, which was as exciting as it was daunting.

Pushing for an immersive interface was a great opportunity for us to explore Sprite Kit to power our visuals. While it helped us shed a lot of the baggage that comes with UIKit animations, it also meant that dealing with complex gestures and memory management would be all the more challenging.

The main hurdle became uniting UIKit’s UX focus with Sprite Kit’s rendering muscle to support an engaging experience.

Gestures

While SKNode instances give access to the basic touch events from UIResponder (-touchesBegan:withEvent:, -touchesMoved:withEvent:, -touchesEnded:withEvent: and -touchesCancelled:withEvent:), there’s no direct support for gestures. UIGestureRecognizer provides recognition of common gestures and, most importantly, simplifies the orchestration of many gestures at once.

Instead of re-inventing the wheel, I tapped right into UIGestureRecognizer from within Sprite Kit. By adding instances of the class to the main SKView, I routed their delegation to individual SKNode instances.
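
For illustration, here is a minimal sketch of that wiring, assuming the recognizer is set up in the main SKScene subclass and that KBClipNode exposes a hypothetical -handleTap method for the per-node routing:

// a minimal sketch: the recognizer lives on the SKView, the scene translates
// its location and forwards the event to the clip node underneath it
- (void)didMoveToView:(SKView *)view {
    [super didMoveToView:view];

    UITapGestureRecognizer *tapRecognizer =
        [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
    tapRecognizer.delegate = self;
    [view addGestureRecognizer:tapRecognizer];
}

- (void)handleTap:(UITapGestureRecognizer *)recognizer {
    // convert the touch from view coordinates into scene coordinates
    CGPoint sceneLocation = [self convertPointFromView:[recognizer locationInView:recognizer.view]];

    // route the gesture to the clip node underneath it
    SKNode *node = [self nodeAtPoint:sceneLocation];
    if ([node isKindOfClass:[KBClipNode class]]) {
        [(KBClipNode *)node handleTap]; // hypothetical per-node handler
    }
}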

Scrolling

I could apply the same principle to scrolling content in Sprite Kit. UIScrollView does a lot out of the box, including panning, zooming, momentum and bouncing. Again, I leveraged UIKit under the hood by mirroring the core scroll view hierarchy and binding its output to Sprite Kit’s visuals.
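
As an illustration of that mirroring, here is a minimal sketch, assuming a horizontal timeline node (_timelineNode) and an invisible UIScrollView layered over the SKView; the content width is an assumption:

- (void)didMoveToView:(SKView *)view {
    [super didMoveToView:view];

    // an invisible scroll view supplies panning, momentum and bounce physics,
    // but never draws any content itself
    _scrollView = [[UIScrollView alloc] initWithFrame:view.bounds];
    _scrollView.contentSize = CGSizeMake(timelineContentWidth, view.bounds.size.height); // assumed width
    _scrollView.backgroundColor = [UIColor clearColor];
    _scrollView.showsHorizontalScrollIndicator = NO;
    _scrollView.delegate = self;
    [view addSubview:_scrollView];
}

- (void)scrollViewDidScroll:(UIScrollView *)scrollView {
    // mirror UIKit's scroll physics into Sprite Kit by offsetting the timeline node
    _timelineNode.position = CGPointMake(-scrollView.contentOffset.x,
                                         _timelineNode.position.y);
}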

Translating Coordinates

Since UIKit and Sprite Kit work in two different coordinate systems, we needed to convert between the two. Luckily, Apple’s built-in UITouch Sprite Kit additions did the job wonderfully:

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
       shouldReceiveTouch:(UITouch *)touch {
  // containsPoint: expects a point in the parent's coordinate space,
  // so the touch is converted relative to the parent node
  CGPoint touchLocation = [touch locationInNode:self.parent];
  return (![self isHidden] && [self containsPoint:touchLocation]);
}

Data Synchronization

All of the app’s data lives in a lightweight API powered by Amazon’s S3 and EC2 services. It allows editors to add new playlists, films and clips or update existing information. Leveraging iOS 7 background fetches, we were able to synchronize this data with the app easily.
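
For illustration, a minimal sketch of the iOS 7 background fetch hooks in the app delegate; KBSyncManager and its -synchronizeWithCompletion: method are assumptions standing in for the actual sync code:

- (BOOL)application:(UIApplication *)application
    didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    // let iOS decide how often to wake the app up for a background fetch
    [application setMinimumBackgroundFetchInterval:UIApplicationBackgroundFetchIntervalMinimum];
    return YES;
}

- (void)application:(UIApplication *)application
    performFetchWithCompletionHandler:(void (^)(UIBackgroundFetchResult))completionHandler {
    // pull the latest playlists, films and clips from the API
    [[KBSyncManager sharedManager] synchronizeWithCompletion:^(BOOL hasNewData) {
        completionHandler(hasNewData ? UIBackgroundFetchResultNewData
                                     : UIBackgroundFetchResultNoData);
    }];
}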

Data from the API is formatted as JSON and converted to Core Data entities after synchronization. Relationships are managed internally by Core Data and boosted through MagicalRecord.

Using Core Data gave us a robust data model as a foundation and made it easy to attach metadata to individual clips, films and playlists through entity relationships.
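
For illustration, here is a minimal sketch of how a JSON payload might be folded into Core Data with MagicalRecord; the KBClip/KBFilm attribute names and the identifier keys are assumptions:

[MagicalRecord saveWithBlock:^(NSManagedObjectContext *localContext) {
    for (NSDictionary *clipJSON in clipsJSON) {
        // find the existing clip or create a new one
        KBClip *clip = [KBClip MR_findFirstByAttribute:@"identifier"
                                             withValue:clipJSON[@"identifier"]
                                             inContext:localContext];
        if (clip == nil) {
            clip = [KBClip MR_createInContext:localContext];
        }

        clip.identifier = clipJSON[@"identifier"];
        clip.title = clipJSON[@"title"];

        // Core Data keeps the inverse relationship (film.clips) in sync
        clip.film = [KBFilm MR_findFirstByAttribute:@"identifier"
                                          withValue:clipJSON[@"filmIdentifier"]
                                          inContext:localContext];
    }
}];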

Team

  • Dave Chau (Design)
  • Nathan Adkisson (Strategy)
  • Will Simon (UX)
  • Nooka Jones (UX)
  • Grace Steite (Production)
  • Stephen Koch (Tech)

Featured Press