I fixed a bug in a client's iOS app that I want to share, particularly because I couldn't find a write-up or solution during my research into the root cause of the problem.
The bug occurred during video playback. Each time the user tapped play, AVPlayer would briefly show the view underneath it before rendering the video. This momentary transparency produced what amounted to a "camera flash" effect on the screen before each video played.
I started debugging by first checking the app's configuration of AVPlayer, but I didn't spot an obvious issue. So I turned to Google, looking for a baseline of normal behavior for AVPlayer, and found example code Apple released in 2013.
Apple's sample project 1 plays a video selected from Camera Roll, and if you run it, you'll notice that playback occurs without a quick flash of transparency.
At this point, I knew something was wrong with the client's app, and not AVPlayer's API. But what was the culprit? I double-checked the configured properties for AVPlayer and AVPlayerItem, changing the client's app for parity between the two projects. But the bug persisted.
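For a baseline, default AVPlayer usage comes down to very little code. The sketch below is my own minimal Swift equivalent, not Apple's 2013 sample (which is Objective-C); the class and method names here are illustrative:

```swift
import AVFoundation
import UIKit

// A minimal player setup, assuming playback of a local video file URL.
final class PlayerViewController: UIViewController {
    private let player = AVPlayer()
    private let playerLayer = AVPlayerLayer()

    override func viewDidLoad() {
        super.viewDidLoad()
        playerLayer.player = player
        playerLayer.frame = view.bounds
        view.layer.addSublayer(playerLayer)
    }

    func play(url: URL) {
        // With a well-formed video file, the first rendered frame is
        // opaque video content; there is no flash of the view underneath.
        player.replaceCurrentItem(with: AVPlayerItem(url: url))
        player.play()
    }
}
```

With configuration this sparse, a playback glitch is far more likely to come from the media file or the writing pipeline than from the player itself.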
Taking a step back, I thought about the differences that remained between the two apps. Apple's example project played videos that were in Camera Roll, recorded using the Camera app. And the client's app played videos that were made in the client's app.
...wait a second. Was this the issue? Could the video file be the root of the problem and not AVPlayer?
To test this theory, I downloaded one of the videos made in the client's app, dropped it into Apple's example project, and made a couple of quick changes so that Apple's project would use the video I added instead of one selected from Camera Roll. I ran the project and voilà, the same initial flash of transparency reproduced in the example project.
At this point, it was obvious to me that the videos recorded in the client's app were at fault. Curious, I opened one of the client's app-created videos in Finder preview, and dragged the seek bar to the first frame. It was then that I noticed the first frame of the video was transparent, revealing the background of Finder preview.2
I looked at the client's code for creating videos, and it seemed like a standard use of AVAssetWriter, the class that writes media data to a file: a call to startSession(atSourceTime:) to begin a file writing session, followed by capture callbacks in which sample buffers were appended to the writer's AVAssetWriterInput.
With nothing obvious at fault in the code, I started reading the documentation for AVAssetWriter, and saw the following in the explanation for startSession(atSourceTime:):
If the earliest buffer for an input is later than startTime, an empty edit will be inserted to preserve synchronization between tracks of the output asset.
This "empty edit" sounded exactly like what was being inserted at the beginning of the client's app-created videos. I looked at how the client's app was appending buffers to AVAssetWriterInput and noticed that the code never ensured the first buffer appended to the file had the same timestamp as the startTime of the video.
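To make the failure mode concrete, here is a small self-contained sketch (plain Swift, no AVFoundation, timestamps modeled in seconds). If the session starts at 0 but the first appended buffer is stamped at 0.5, the writer preserves that half-second gap as an empty edit at the head of the track, which players render as transparency or black:

```swift
// Hypothetical model of the asset writer's timeline, in seconds.
struct WriterSession {
    let startTime: Double

    // Duration of the empty edit the writer would insert before the first
    // buffer, per the startSession(atSourceTime:) documentation: the gap
    // between the session start and the earliest buffer's timestamp.
    func emptyEditDuration(firstBufferTime: Double) -> Double {
        max(0, firstBufferTime - startTime)
    }
}

// Buggy path: session starts at 0, first buffer arrives stamped 0.5 s later.
let gap = WriterSession(startTime: 0).emptyEditDuration(firstBufferTime: 0.5)

// Fixed path: session is started at the first buffer's timestamp, so no
// empty edit is inserted.
let fixed = WriterSession(startTime: 0.5).emptyEditDuration(firstBufferTime: 0.5)
```

Here `gap` is 0.5 seconds of empty edit, while `fixed` is zero.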
As a fix, I added simple logic to ensure that the first buffer appended to a new video file always had the same timestamp as the sourceTime passed to AVAssetWriter.startSession(atSourceTime:). After this change, the client's app started producing videos like this one.3 Notice how its initial frame is the opaque first frame of the video.
The biggest lesson reinforced here is to break down complicated bugs by isolating their components into separate projects. Look for examples released by Apple or another respected developer, or create your own. I find this sanity check, knowing how a component or class behaves by default, is always helpful in those moments when nothing seems to work.
1. I was not excited when Apple announced Swift. I loved Objective-C. I was used to Objective-C. This test project was the first meaningful Objective-C code I worked with after several consecutive months of Swift development. Somewhere in that time, a shift in my thinking must have occurred, because using Objective-C suddenly felt bulky and kludgy in comparison. ↩︎
2. Because the initial frame of the video is transparent, the background of whatever player you're using will be shown. In QuickTime, this initial frame will appear black. ↩︎