Blending Video with Another Image
The GPU stores generated fragment colors in a memory buffer called the frame buffer. The demonstration program uses OpenGL to configure the GPU so that fragments generated from video replace any existing fragments in the frame buffer. After the video fragments have been stored, the GPU is reconfigured so that any generated fragment colors are blended with the existing colors in the frame buffer.
Specifically, the following OpenGL function calls specify blending that allows replacement of translucent fragments but doesn't allow modification of opaque fragments:
glEnable(GL_BLEND);
glBlendFunc(GL_ONE_MINUS_DST_ALPHA, GL_DST_ALPHA);

With these factors, each blended color is newColor * (1 - storedAlpha) + storedColor * storedAlpha, so a fully opaque fragment already in the frame buffer (alpha of 1.0) completely suppresses any new fragment drawn over it, while a fully transparent one (alpha of 0.0) is completely replaced.
With blending enabled, the current texture used by the GPU is changed to the image of an elephant. The two triangles that together cover the display are drawn again, and this time the fragment colors are obtained from texels of the elephant texture. However, wherever the frame buffer already holds a fully opaque fragment, the newly generated fragment color is discarded; wherever the stored fragment is fully transparent, the new color replaces it. Where the stored fragment is partially transparent, the stored color is blended with the new fragment color, producing a combined value.
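Put together, the two passes look roughly like the following sketch. The texture names here are hypothetical placeholders rather than the identifiers used in GSViewController.m, and shader-program binding and vertex-attribute setup are omitted:

// First pass: with blending disabled, the shader-processed video fragments
// replace whatever the frame buffer already holds.
glDisable(GL_BLEND);
glBindTexture(GL_TEXTURE_2D, videoFrameTexture); // hypothetical GLuint texture name
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);           // two triangles covering the display

// Second pass: destination-alpha blending lets the elephant texture show
// through only where the stored video fragments have low alpha.
glEnable(GL_BLEND);
glBlendFunc(GL_ONE_MINUS_DST_ALPHA, GL_DST_ALPHA);
glBindTexture(GL_TEXTURE_2D, elephantTexture);   // hypothetical GLuint texture name
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);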
Figure 3 and Figure 4 show more examples of the green-screen effect.
Figure 3 There's an elephant outside the window.
Figure 4 The neighbor's lawn is very green.
Exploring and Running the Code
The demonstration app has been tested with an iPad 2 and an iPod touch with Retina display. It should run well on any iOS device with a camera.
The code is small and designed to be reusable. Apple's GLKit simplifies the OpenGL portions. All OpenGL drawing commands within the demonstration are located in the GSViewController.m file. The reusable UtilityEffect class and its subclass, GSGreenScreenEffect, load and compile the custom fragment shader program for use by the GPU; a minimal sketch of such a shader appears below. The reusable GSVideoProcessor class contains code inspired by Apple's RosyWriter example to capture video frame images and push them into a queue for processing. In total, the demonstration contains only about 1,200 lines, including blank lines and comments, and only about 400 of those lines are specific to the green-screen effect being demonstrated.
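For illustration, a green-screen fragment shader along these lines might be written as the following minimal sketch. This is a hypothetical OpenGL ES 2.0 shader embedded as a C string, not the shader that ships with the demonstration; the varying, uniform, and scale-factor names are invented for this example:

// Hypothetical green-screen fragment shader, embedded as a C string so it
// can be compiled at run time the way UtilityEffect-style classes do.
static const char *kGreenScreenFragmentShader =
    "varying lowp vec2 vTexCoord;\n"      // texture coordinate from the vertex shader
    "uniform sampler2D uVideoSampler;\n"  // current video frame
    "void main()\n"
    "{\n"
    "    lowp vec4 color = texture2D(uVideoSampler, vTexCoord);\n"
    "    // The more the green channel dominates red and blue, the lower\n"
    "    // the output alpha; 4.0 is an arbitrary sharpness factor.\n"
    "    lowp float greenness = color.g - max(color.r, color.b);\n"
    "    lowp float alpha = 1.0 - clamp(greenness * 4.0, 0.0, 1.0);\n"
    "    gl_FragColor = vec4(color.rgb, alpha);\n"
    "}\n";

Written this way, strongly green pixels end up stored in the frame buffer with low alpha, which is exactly what the destination-alpha blending described earlier relies on.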
You're welcome to repurpose the demonstration code for your own projects under the terms of the permissive MIT license. Comments, questions, and suggestions are encouraged in the discussion forum for this article. For more details on working with OpenGL ES on iOS, check out my book Learning OpenGL ES for iOS: A Hands-on Guide to Modern 3D Graphics Programming.