When the new A11-powered iPhone models (8, 8 Plus and X) were announced in the September Event keynote, the new GPU Family 4 webpage and a series of new Metal videos labeled Fall 2017 were published. The new processor, called A11 Bionic, contains the first Apple-designed GPU, which comes with three cores. Internal tests state it is 30% faster than the previous GPU inside the A10. It also features a new Neural Engine hardware addition, dedicated to machine learning.
Below is a table I made from the Metal Feature Sets document. It includes only the features that Metal 2 introduces for A11 devices.

Note: I only included features that are new for A11 and not available for A10 or earlier. Some of these features are also available for macOS devices.
Let’s look briefly into some of these features:
- Imageblocks – are not a new concept on iOS devices; however, Metal 2 on A11 lets us treat imageblocks (structured image data in tile memory) as data structures, with granular and full control. They are integrated with both fragment and compute shaders.
- Tile Shading – is a rendering technique that allows fragment and compute shaders to access persistent tile memory between the two rendering phases. Tile memory is GPU on-chip memory that improves performance by storing intermediate results locally instead of round-tripping through device memory. Tile memory written in one phase is available to any subsequent fragment phase.
- Raster Order Groups – provide ordered memory access from fragment shaders and facilitate features such as order-independent transparency, dual-layer G-buffers, and voxelization.
- Imageblock Sample Coverage Control – Metal 2 on A11 tracks the number of unique samples within each pixel, updating this information as new primitives are rendered. When the covered samples share the same color, the pixel blends in one fewer iteration than it would on A10 or earlier GPUs.
- Threadgroup Sharing – allows threadgroups and the threads within a threadgroup to communicate with each other using atomic operations or a memory fence rather than expensive barriers.
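To make the tile shading idea more concrete, here is a minimal sketch of a tile compute kernel that reads a pixel's data straight out of the imageblock in tile memory. The struct layout, kernel name, and the darkening operation are my own illustrative choices, not from Apple's samples; the `imageblock` template and `[[color(n)]]` binding follow the Metal Shading Language additions for A11.

```metal
#include <metal_stdlib>
using namespace metal;

// Hypothetical per-pixel data stored in the imageblock; [[color(0)]] binds
// the field to the render pass's first color attachment (implicit layout).
struct TilePixel {
    half4 color [[color(0)]];
};

// A tile kernel runs mid-render-pass over the tile currently in on-chip
// memory; no trip to device memory is needed to read or modify the pixel.
kernel void darken_tile(imageblock<TilePixel, imageblock_layout_implicit> block,
                        ushort2 tid [[thread_position_in_threadgroup]])
{
    threadgroup_imageblock TilePixel *pixel = block.data(tid);
    pixel->color.rgb *= half(0.5);  // illustrative operation on tile data
}
```

On the API side, such a kernel would be dispatched inside the render pass (for example with `dispatchThreadsPerTile`), so the intermediate results never leave tile memory between the fragment and compute phases.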
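Raster order groups are easiest to see in a fragment shader. The sketch below is a hypothetical order-independent-transparency-style accumulator: the buffer name, blend math, and `framebufferWidth` parameter are assumptions of mine, but the `[[raster_order_group(0)]]` attribute is the actual mechanism, serializing overlapping fragments' access to that memory in draw order.

```metal
#include <metal_stdlib>
using namespace metal;

struct FragmentIn {
    float4 position [[position]];
    float4 color;
};

// Without the raster_order_group attribute, two overlapping fragments at the
// same pixel could interleave their read-modify-write and blend incorrectly.
fragment float4 accumulate_fragment(FragmentIn in [[stage_in]],
                                    device float4 *accumulator
                                        [[buffer(0), raster_order_group(0)]],
                                    constant uint &framebufferWidth [[buffer(1)]])
{
    uint index = uint(in.position.y) * framebufferWidth + uint(in.position.x);
    // Access to `accumulator` is ordered per pixel in primitive order.
    float4 dst = accumulator[index];
    accumulator[index] = in.color + dst * (1.0f - in.color.a);
    return accumulator[index];
}
```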
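The atomics side of threadgroup communication can be sketched with a small reduction kernel. Everything here (kernel name, buffer layout, the nonzero-counting task) is an illustrative assumption; the point is the pattern: threads cooperate through a `threadgroup` atomic, and only one thread per group publishes the result to device memory.

```metal
#include <metal_stdlib>
using namespace metal;

// Counts nonzero input elements: each threadgroup accumulates into a shared
// atomic in threadgroup memory, then thread 0 folds the partial count into
// a device-memory total.
kernel void count_nonzero(device const float *input [[buffer(0)]],
                          device atomic_uint *total [[buffer(1)]],
                          threadgroup atomic_uint *localCount [[threadgroup(0)]],
                          uint gid [[thread_position_in_grid]],
                          uint tid [[thread_index_in_threadgroup]])
{
    if (tid == 0) {
        atomic_store_explicit(localCount, 0u, memory_order_relaxed);
    }
    threadgroup_barrier(mem_flags::mem_threadgroup);

    if (input[gid] != 0.0f) {
        atomic_fetch_add_explicit(localCount, 1u, memory_order_relaxed);
    }
    threadgroup_barrier(mem_flags::mem_threadgroup);

    if (tid == 0) {
        uint partial = atomic_load_explicit(localCount, memory_order_relaxed);
        atomic_fetch_add_explicit(total, partial, memory_order_relaxed);
    }
}
```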
Even though it is not necessarily Metal-related, the Face Tracking with ARKit video and the Creating Face-Based AR Experiences webpage were also published at the same event. Face Tracking, however, is only possible on the iPhone X, because it is currently the only model with a TrueDepth front camera. The most immediate application of face tracking, which we all saw during the September Event keynote, was the amazing Animoji! The new Neural Engine hardware is responsible for FaceID and Animoji, among other machine learning tasks. Since I purchased an iPhone X recently, it might give me some ideas for a new post in the Using ARKit with Metal series.
Until next time!