- From PostScript to Quartz
- The First Steps to a New Model
- Introducing LayerKit
- CoreAnimation on the Desktop
- Animation
CoreAnimation on the Desktop
The LayerKit framework was ported to the desktop and renamed CoreAnimation with OS X 10.5. On the desktop, CoreAnimation can use much more RAM: many windows can be on the screen at once, and each is typically larger than the iPhone's screen. On a modern computer this isn't a great problem, because GPUs typically include 128MB or more of video RAM and can spill over into system memory.
As well as having more RAM, the desktop GPU is also typically programmable. The iPhone 3GS has a GPU that supports OpenGL ES 2.0, but the older models only supported 1.1, which does not include support for shaders. On the desktop, every GPU that has shipped in a Mac for several years has supported pixel (also called fragment) shaders.
Pixel shaders are simple programs that run on the GPU and can transform textures. The CoreImage framework, introduced with OS X 10.4 and based on Apple's Aperture application, makes heavy use of pixel shaders to implement effects. The CoreVideo framework uses the same shader programs but applies them to every frame in a video. With CoreAnimation, every view can draw into one or more textures, so the same filter programs can be run on the result of drawing a view before it is composited. They can also be used to define new compositing operations, allowing you to use one layer as a color transform on another.
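On the desktop, these filters are exposed through CALayer's filters, backgroundFilters, and compositingFilter properties, which accept CoreImage filters. The sketch below is only illustrative: it assumes an existing layer-backed NSView is passed in, and the filter names and parameter values are example choices, not anything prescribed by the frameworks themselves.

```objc
#import <Cocoa/Cocoa.h>
#import <QuartzCore/QuartzCore.h>

// Illustrative sketch: apply CoreImage filters to a layer-backed view's layer.
static void ApplyFiltersToView(NSView *view)
{
    CALayer *layer = [view layer];

    // Run a pixel-shader filter over the layer's rendered content before it
    // is composited. (The filters property is honored on the desktop only.)
    CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
    [blur setDefaults];
    [blur setValue:[NSNumber numberWithFloat:2.0f] forKey:@"inputRadius"];
    [layer setFilters:[NSArray arrayWithObject:blur]];

    // A compositing filter defines how this layer is combined with the
    // content behind it, in effect using one layer as a color transform
    // on another.
    [layer setCompositingFilter:[CIFilter filterWithName:@"CIMultiplyBlendMode"]];
}
```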