How Flipboard plays animated GIFs on iOS
by Raphael Schaad

Flipboard has always sought to “cook the raw web” and transform it into something with the design elegance of a magazine. We attend to many details – from the typography of articles to the layout of photographs – to remain as faithful to the nature of the content as possible.

When it came to animated GIFs, we knew we wanted them to play automatically in our app. Auto-play is one of the chief appeals of animated GIFs. However, many applications on iOS render animated GIFs as stills – an unfortunate result of the complexity of playing multiple GIFs simultaneously, and in real time, on a mobile device.

One would think that such an ancient image format would be supported out of the box for developers on modern iOS devices. But not even all of Apple’s own apps play them, and when GIFs are viewed in the mobile browser, the system often slows to a crawl. It’s challenging to keep the memory footprint and CPU usage low while remaining faithful to the playback timings.

Our requirements to support animated GIFs were:

  • Play multiple GIFs simultaneously with a playback speed comparable to desktop browsers
  • Honor variable frame delays
  • Behave gracefully under memory pressure
  • Eliminate delays or blocking during the first playback loop
  • Interpret the frame delays of fast GIFs the same way modern browsers do¹

Because there was no built-in method or open source library that met all these requirements, we built an engine and have honed it since we shipped it last year. We think it’s currently our best option and would like to share it with the community.

If you’d like to enhance your iOS app with animated GIF support, skip ahead and follow the simple instructions to include our open source component.
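In practice, dropping it in takes just a few lines. This is a minimal sketch assuming the API of the published open source component; the URL is a placeholder:

```objc
#import "FLAnimatedImage.h"
#import "FLAnimatedImageView.h"

// Load GIF data from somewhere (placeholder URL) and wrap it.
NSData *gifData = [NSData dataWithContentsOfURL:[NSURL URLWithString:@"https://example.com/rock.gif"]];
FLAnimatedImage *animatedImage = [FLAnimatedImage animatedImageWithGIFData:gifData];

// FLAnimatedImageView is a UIImageView subclass; it starts playing automatically.
FLAnimatedImageView *imageView = [[FLAnimatedImageView alloc] init];
imageView.animatedImage = animatedImage;
imageView.frame = CGRectMake(0.0, 0.0, 100.0, 100.0);
[self.view addSubview:imageView];
```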

What iOS offers for animated images

The typical way of displaying an image on iOS is to create a UIImage from image data and display it in a UIImageView on the screen. However, none of the more than a dozen initializers to create a UIImage can create an animated image from a single multi-frame image.² Instead of an animation on the screen, the image view shows the first still frame. The programmer has to load every frame as an individual image with Apple’s ImageIO framework and use the animationImages, animationDuration, and animationRepeatCount APIs of UIImageView.
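The manual route looks roughly like this. It’s a sketch using Apple’s ImageIO framework; `gifData` is assumed to hold the raw GIF file contents and `imageView` to be an existing UIImageView:

```objc
#import <UIKit/UIKit.h>
#import <ImageIO/ImageIO.h>

// Extract every frame and its delay with ImageIO.
CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)gifData, NULL);
size_t frameCount = CGImageSourceGetCount(source);
NSMutableArray *frames = [NSMutableArray arrayWithCapacity:frameCount];
NSTimeInterval totalDuration = 0.0;
for (size_t i = 0; i < frameCount; i++) {
    CGImageRef cgImage = CGImageSourceCreateImageAtIndex(source, i, NULL);
    [frames addObject:[UIImage imageWithCGImage:cgImage]];
    CGImageRelease(cgImage);

    // Per-frame delay lives in the GIF properties dictionary.
    NSDictionary *properties = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, i, NULL);
    NSDictionary *gifProperties = properties[(__bridge NSString *)kCGImagePropertyGIFDictionary];
    totalDuration += [gifProperties[(__bridge NSString *)kCGImagePropertyGIFDelayTime] doubleValue];
}
CFRelease(source);

imageView.animationImages = frames;
imageView.animationDuration = totalDuration; // one constant rate; per-frame variability is lost
[imageView startAnimating];
```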

A shortcoming of this approach is the inability to honor variable frame delays of GIFs.

Five frames with variable delays.

Let’s assume the frame and metadata loading code lives in a UIImage+animatedImage category and consider the following code:

// Use the first frame’s delay for every frame…
imageView.animationDuration = [image.delayTimes[0] doubleValue] * image.frameCount;

// …or use the sum of all delays as the total duration.
imageView.animationDuration = image.totalDelayTimes;

It becomes obvious that neither using the first frame’s delay for all frames nor summing up all delays and using that as the total duration leads to good results.

An alternative option is using UIWebView, in which case GIFs are decoded and displayed by WebKit, just as on a desktop browser.

However, web views aren’t tuned to play GIFs well on mobile devices and often slow down the playback speed. There’s also no control over the playback or memory usage.

Rolling your own playback

One approach to support variable frame delays with UIImageView is to find the greatest common divisor of all frame delays and slot longer frames multiple times in a row into the animatedImages array.

The slot duration determined by the GCD is 1 second.
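A sketch of this “GCD-slotting”, assuming the frames and their delays (in centiseconds, the GIF format’s native resolution; nonzero) have already been extracted into the parallel arrays `frames` and `delaysInCentiseconds`:

```objc
#import <UIKit/UIKit.h>

// Greatest common divisor of two delay values.
static NSUInteger GCD(NSUInteger a, NSUInteger b)
{
    while (b != 0) {
        NSUInteger t = b;
        b = a % b;
        a = t;
    }
    return a;
}

// Find the slot duration: the GCD of all frame delays.
NSUInteger slotSize = [delaysInCentiseconds[0] unsignedIntegerValue];
for (NSNumber *delay in delaysInCentiseconds) {
    slotSize = GCD(slotSize, [delay unsignedIntegerValue]);
}

// Slot each frame into the array as many times as its delay requires.
NSMutableArray *slottedFrames = [NSMutableArray array];
NSUInteger totalSlots = 0;
for (NSUInteger i = 0; i < frames.count; i++) {
    NSUInteger slots = [delaysInCentiseconds[i] unsignedIntegerValue] / slotSize;
    totalSlots += slots;
    for (NSUInteger s = 0; s < slots; s++) {
        [slottedFrames addObject:frames[i]];
    }
}

imageView.animationImages = slottedFrames;
imageView.animationDuration = totalSlots * slotSize / 100.0; // centiseconds → seconds
```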

The first time a frame is displayed on the screen, the compressed image data is decoded into its uncompressed bitmap form. This is a very CPU-intensive operation and therefore the first loop will be slow.

Even trickier are the memory implications: once an image is decoded, its bitmap is attached to the image object and persists for the lifetime of that object. This turns our image view into a cache for all these huge³ bitmaps – among the worst things you can do for your overall memory footprint.⁴

When Apple added the animation properties to UIImage and UIImageView, they likely designed them for small UI animations, such as the spinny loading indicator, and not for large animated photographs. Maybe some apps could get away with this approach, but in our case it would have required us to add a play button to each GIF and allowed playing only one at a time. This took the fun out of GIFs and was not an acceptable user experience for us.

Produce and consume frames as needed

Whenever memory is the constraint, instead of storing the solution to a problem one has to recalculate it. In our case, we needed a way to load and decode frames just in time before they were displayed, and to purge the ones no longer on screen. This is known as the producer-consumer problem: one thread produces data and another thread consumes it. We needed a producer streaming frames to the view, which would consume them on demand. The producer would be throttled by the available memory, and the consumer by the frame timings.
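The producer side can be sketched with a background dispatch queue. This is illustrative only; `decodeFrameAtIndex:` and `frameCache` are hypothetical helpers, not the actual implementation:

```objc
#import <UIKit/UIKit.h>

// Decode the next frame off the main thread and hand it to the consumer.
dispatch_queue_t decodingQueue = dispatch_queue_create("com.example.gif.decoding", DISPATCH_QUEUE_SERIAL);
dispatch_async(decodingQueue, ^{
    UIImage *frame = [self decodeFrameAtIndex:nextIndex]; // CPU-intensive decode
    dispatch_async(dispatch_get_main_queue(), ^{
        self.frameCache[@(nextIndex)] = frame; // now available for the consumer
    });
});
```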

Overview of the component with the important aspects of it highlighted in UML notation.


An FLAnimatedImage gets initialized with GIF data; its job is then to deliver frames as quickly as possible when asked via - (UIImage *)imageLazilyCachedAtIndex:(NSUInteger)index, while using only a fraction of the memory.

It tries to intelligently choose the frame cache size depending on the image and the memory situation: if it’s a small GIF, we try to keep all frames in memory and go easy on the CPU. If it’s a large GIF, we try to lower the memory usage by buffering just enough frames ahead to play back in real time.
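A sketch of such a heuristic; the 10MB threshold and the 5-frame buffer are illustrative assumptions, not Flipboard’s actual values:

```objc
#import <UIKit/UIKit.h>

// Illustrative cache-sizing heuristic: cache everything for small GIFs,
// buffer just a few frames ahead for large ones.
static const NSUInteger kMaxBytesForFullFrameCache = 10 * 1024 * 1024; // assumed threshold

NSUInteger bytesPerFrame = 4 * (NSUInteger)(image.size.width * image.size.height); // RGBA
NSUInteger totalBytes = bytesPerFrame * frameCount;
NSUInteger frameCacheSize;
if (totalBytes <= kMaxBytesForFullFrameCache) {
    frameCacheSize = frameCount;         // small GIF: keep all frames, spare the CPU
} else {
    frameCacheSize = MIN(frameCount, 5); // large GIF: buffer a few frames ahead (assumed constant)
}
```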

When the system issues a memory warning, all instances of animated images drop all the frames they’ve already drawn into the off-screen buffer and fall back to on-demand decoding. After some time, they begin to build up a buffer again. If multiple memory warnings occur, they fall back to keeping just one frame and decoding on demand. When designing this behavior, it was important to avoid the worst possible user experience – an app crash. We’d rather cut back on the playback speed.
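In sketch form, each instance listens for the public UIKit memory warning notification and flushes its buffer; `frameCache` and `preferredFrameCacheSize` here are hypothetical internals:

```objc
#import <UIKit/UIKit.h>

// React to system memory pressure by dropping pre-drawn frames.
[[NSNotificationCenter defaultCenter] addObserverForName:UIApplicationDidReceiveMemoryWarningNotification
                                                  object:nil
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *note) {
    [self.frameCache removeAllObjects]; // fall back to on-demand decoding
    self.preferredFrameCacheSize = 1;   // stay minimal after repeated warnings
}];
```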


An FLAnimatedImageView can take an FLAnimatedImage and play it in real time by using a CADisplayLink internally.

Instead of dealing with complex lock acquisition, we let the producer return nil if the data to consume isn’t ready yet, and continue displaying the previous frame. This simplifies the multi-threaded code and achieves the desired effect: slower, but still correct, playback of the animation.
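A sketch of the consumer’s display-link callback under these rules; frameCount and imageLazilyCachedAtIndex: come from FLAnimatedImage as described above, while the other names (accumulator, currentFrameIndex, currentFrame, delayTimes) are hypothetical ivars:

```objc
#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h>

// Called by the CADisplayLink on every screen refresh.
- (void)displayDidRefresh:(CADisplayLink *)displayLink
{
    self.accumulator += displayLink.duration;
    NSTimeInterval delay = [self.animatedImage.delayTimes[self.currentFrameIndex] doubleValue];
    if (self.accumulator < delay) {
        return; // not yet time to advance; keep showing the current frame
    }
    NSUInteger nextIndex = (self.currentFrameIndex + 1) % self.animatedImage.frameCount;
    UIImage *nextFrame = [self.animatedImage imageLazilyCachedAtIndex:nextIndex];
    if (nextFrame) {
        self.currentFrame = nextFrame;
        self.currentFrameIndex = nextIndex;
        self.accumulator -= delay;
        [self.layer setNeedsDisplay];
    }
    // If nextFrame is nil, the producer isn't ready yet; we simply keep
    // displaying the previous frame and try again on the next tick.
}
```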

Designing a well-encapsulated drop-in component

FLAnimatedImage subclasses NSObject directly because it would benefit little by inheriting from UIImage. FLAnimatedImageView on the other hand is a fully compatible UIImageView subclass and can be dropped in and take its place; setting a UIImage or an FLAnimatedImage on it works as expected. So anywhere we display an image, we use FLAnimatedImageView and it does the right thing automatically.

When deciding on the architecture, it was important to create a self-contained, reusable component. Multiple FLAnimatedImage-FLAnimatedImageView pairs can be used without a central cache; they’re all aware of the system state and independently do their best to be good citizens.

The module is around a thousand lines of code and tries to adhere to the Unix philosophy of “doing one thing and doing it well”. At this point, it’s well tested for production use. There’s still room to build it out further and we welcome contributions from the community.


Flipboard readers have created lively magazines like “GIF Me A Break”, “Goals goals goals!”, or “Cat GIFs 😹”. GIF isn’t just a file format, it’s a culture and arguably the Internet’s native art form. By sharing this engineering challenge and the source, we hope to contribute to its longevity.

  1. There’s a long history of how browsers should throttle very fast GIFs.

  2. iOS 5 added initializers to UIImage to create an “animated image” (backed by the private class _UIAnimatedImage), but it still expects an array of individual frame images.

  3. A 1MB GIF for example turns into 55MB of uncompressed data! (800x600 pixels * 4 bytes per pixel for RGBA * 30 frames)

  4. The WWDC 2011 session “iOS Performance In Depth” illustrates that heap objects are just the tip of the iceberg. This can be measured with the VM Tracker Instrument. “Dirty memory” is the number to watch, as it is memory not mapped to a file and can’t be purged. Our pre-drawn images would be in that category. Note that “Memory Tag 70” is also from images (ImageIO).

Special thanks to Ryan, Evan, Troy, Eugene, Josh, Charles and Chris for ideas and improvements.
