Add Interactive Parallax to Any UIView on tvOS
At the introduction of the new Apple TV, one of the first features demoed was the parallax animation and Siri remote trackpad gesture combo used when navigating the UI. The 1:1 tracking of your thumb to what’s onscreen is great UX.
Apple provides a means for developers to use this effect in their applications, but for now it only works with a special type of UIImage. This is great for movie posters and other static content, but not as useful in other scenarios. When designing Yayy, we felt it was important for GIFs to play directly in the channel browser. We couldn't use the UIImage approach for our movie players, and we wanted to overlay text on them as well, so early versions didn't include any parallax effect.
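For context, the approach Apple provides hangs off UIImageView: you supply one of those special layered images and opt in with a single property. A minimal sketch, assuming a layered image asset named "poster" (the asset name is hypothetical):

import UIKit

// Opting in gives the image the system focus treatment
// whenever one of the image view's ancestors is focused.
let imageView = UIImageView(image: UIImage(named: "poster"))
imageView.adjustsImageWhenAncestorFocused = true
imageView.clipsToBounds = false // let the focused image float outside its bounds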
While it looked great, it didn't feel great. Something was lost when that tactile feedback went away.
We knew we'd need to come up with an approach that combined arbitrary UIView hierarchies and a tactile parallax effect. After some fiddling, we arrived at an easy-to-implement way to achieve a nice-feeling parallax effect using UIPanGestureRecognizer and some UIView transforms. The end effect is something like this…
Here's how to do it.
Yayy's channel browser is a UICollectionView, and the cells parallax while they are the focused view. Start by defining a protocol that gives UICollectionViewCell a concept of focus.
protocol FocusableCell {
    func setFocused(focused: Bool,
        withAnimationCoordinator coordinator: UIFocusAnimationCoordinator)
}
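The cell code throughout the rest of this post lives in a UICollectionViewCell subclass that adopts this protocol. A minimal skeleton, with a hypothetical class name and label property:

// Hypothetical cell; only the parallax-related pieces are sketched here.
class ChannelCell: UICollectionViewCell, FocusableCell {
    // A label overlaid on the cell's content, given its own parallax multipliers later.
    let label = UILabel()

    // setFocused(_:withAnimationCoordinator:), panGesture, focusedTransform,
    // and viewPanned(_:) from the snippets below all belong to this class.
}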
Then, in the UICollectionViewDelegate, message the appropriate cells on focus/unfocus.
override func collectionView(collectionView: UICollectionView,
    didUpdateFocusInContext context: UICollectionViewFocusUpdateContext,
    withAnimationCoordinator coordinator: UIFocusAnimationCoordinator)
{
    if let
        nextIndexPath = context.nextFocusedIndexPath,
        focusCell = collectionView.cellForItemAtIndexPath(nextIndexPath) as? FocusableCell
    {
        focusCell.setFocused(true, withAnimationCoordinator: coordinator)
    }

    if let
        previousIndexPath = context.previouslyFocusedIndexPath,
        focusCell = collectionView.cellForItemAtIndexPath(previousIndexPath) as? FocusableCell
    {
        focusCell.setFocused(false, withAnimationCoordinator: coordinator)
    }
}
Each UICollectionViewCell will create and manage its own UIPanGestureRecognizer, adding or removing it as focus changes.
func setFocused(focused: Bool,
    withAnimationCoordinator coordinator: UIFocusAnimationCoordinator)
{
    coordinator.addCoordinatedAnimations({ () -> Void in
        self.transform = focused ? self.focusedTransform : CGAffineTransformIdentity
        // any other focus-based UI state updating
    }, completion: nil)

    // The pan gesture should only be active while the cell is focused.
    if focused {
        contentView.addGestureRecognizer(panGesture)
    }
    else {
        contentView.removeGestureRecognizer(panGesture)
    }
}
private lazy var panGesture: UIPanGestureRecognizer = {
    let pan = UIPanGestureRecognizer(target: self,
        action: Selector("viewPanned:"))
    pan.cancelsTouchesInView = false
    return pan
}()
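Leaving cancelsTouchesInView off means recognizing the pan never sends touchesCancelled to the cell's subviews, so the recognizer just observes the movement; that should keep the focus engine's own swipe handling and any press handling on the cell unaffected.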
// The cells zoom when focused.
var focusedTransform: CGAffineTransform {
    return CGAffineTransformMakeScale(1.15, 1.15)
}
The UICollectionViewCell will now receive updates as the trackpad is swiped. It's a matter of updating the view in response, using the initial location of the UIPanGestureRecognizer as a reference point. In my testing, the maximum translation a pan provides, moving from edge to edge of the trackpad, maps to roughly the size of your view in any direction. Multiplying the raw value by a coefficient allows the effect to be tuned, and different views can be given different multipliers to give the illusion of depth (bigger multipliers move more and thus appear closer).
var initialPanPosition: CGPoint?

func viewPanned(pan: UIPanGestureRecognizer) {
    switch pan.state {
    case .Began:
        initialPanPosition = pan.locationInView(contentView)
    case .Changed:
        if let initialPanPosition = initialPanPosition {
            let currentPosition = pan.locationInView(contentView)
            let diff = CGPoint(
                x: currentPosition.x - initialPanPosition.x,
                y: currentPosition.y - initialPanPosition.y
            )

            // Roughly 16 points of travel across a full edge-to-edge swipe.
            let parallaxCoefficientX = 1 / bounds.width * 16
            let parallaxCoefficientY = 1 / bounds.height * 16

            // Transform is the default focused transform, translated.
            transform = CGAffineTransformConcat(
                focusedTransform,
                CGAffineTransformMakeTranslation(
                    diff.x * parallaxCoefficientX,
                    diff.y * parallaxCoefficientY
                )
            )

            // The label gets its own multipliers to create the illusion of depth.
            let labelParallaxCoefficientX = 1 / bounds.width * 16
            let labelParallaxCoefficientY = 1 / bounds.height * 24
            label.transform = CGAffineTransformMakeTranslation(
                diff.x * labelParallaxCoefficientX,
                diff.y * labelParallaxCoefficientY
            )

            // Apply to other views as needed.
        }
    default:
        // .Ended, .Cancelled, .Failed, etc.: return the view to its default state.
        UIView.animateWithDuration(0.3,
            delay: 0,
            usingSpringWithDamping: 0.8,
            initialSpringVelocity: 0,
            options: .BeginFromCurrentState,
            animations: { () -> Void in
                self.transform = self.focusedTransform
                self.label.transform = CGAffineTransformIdentity
            },
            completion: nil)
    }
}
The tvOS focus engine handles transitioning between cells, so tweak the coefficients until they feel right while the cell is focused (for instance, Yayy uses a slightly different coefficient for horizontal and vertical movement). Obviously this doesn’t do the tilt and shine as seen in Apple’s implementation. While that wasn’t something we decided to emulate for Yayy, it’s totally doable. MPParallaxView provides an example of how to achieve that visual effect.
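For the curious, here is a rough sketch of a tilt layered on the same pan data, meant to sit inside the .Changed case above and reuse its diff; the perspective distance and angle cap are illustrative guesses, not values from Apple or MPParallaxView:

// Rough tilt sketch: rotate the content slightly around both axes based on the pan.
var tilt = CATransform3DIdentity
tilt.m34 = -1.0 / 500.0                           // add perspective so the rotation reads as 3D
let maxAngle = CGFloat(5 * M_PI / 180)            // cap the tilt at roughly five degrees
let angleX = max(-maxAngle, min(maxAngle, diff.y / bounds.height * maxAngle))
let angleY = max(-maxAngle, min(maxAngle, -diff.x / bounds.width * maxAngle))
tilt = CATransform3DRotate(tilt, angleX, 1, 0, 0) // pitch with vertical movement
tilt = CATransform3DRotate(tilt, angleY, 0, 1, 0) // yaw with horizontal movement
contentView.layer.transform = tilt

The tilt goes on contentView's layer so it composes with the affine transform already applied to the cell; if you try this, also reset contentView.layer.transform to CATransform3DIdentity in the default case alongside the other resets.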
As tvOS matures into a full-fledged app platform, with applications presenting content that increasingly isn't a catalog of movie posters, it's vitally important that all tvOS app interfaces (not just images) provide great tactile feedback.