Filtering camera preview with Metal


In this post, we will use the knowledge acquired in the Metal filtering lesson to filter the camera preview on an iPhone.

I have created a starter project. It displays a camera preview using the front camera on an iPhone or iPad. The implementation is straightforward: a basic, well-known image capture setup with a preview layer. It works, but we want the Comic Effect filter applied to the camera preview.
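For reference, the starter's preview is the conventional AVCaptureVideoPreviewLayer approach; a minimal sketch (the helper name is hypothetical, and your starter project may wire this up slightly differently):

```swift
import AVFoundation
import UIKit

// Hypothetical helper showing the standard preview-layer setup the starter uses
func attachPreviewLayer(session: AVCaptureSession, to view: UIView) {
    let previewLayer = AVCaptureVideoPreviewLayer(session: session)
    previewLayer.frame = view.bounds
    previewLayer.videoGravity = .resizeAspectFill
    view.layer.addSublayer(previewLayer)
}
```

AVCaptureVideoPreviewLayer renders the raw feed directly, with no hook to modify frames; that is why this post replaces it with a Metal-backed view fed from the sample buffer delegate.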

From the previous post, I'll reuse the whole MetalView with two small changes. We'll need to dispatch rendering to the main thread, and create a new initializer, because the Metal view will now be created from code, not from the Storyboard. Add these changes to MetalView:

var image: CIImage? {
    didSet {
        DispatchQueue.main.async {
            self.setNeedsDisplay()
        }
    }
}

init() {
    super.init(frame: .zero, device: MTLCreateSystemDefaultDevice())
    self.isOpaque = false
    self.framebufferOnly = false
    self.enableSetNeedsDisplay = true
    self.context = CIContext(mtlDevice: device!)
    self.queue = device!.makeCommandQueue()
}

required init(coder: NSCoder) {
    fatalError("init(coder:) has not been implemented")
}
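For context, the rendering part carried over from the previous post looks roughly like this; a sketch reconstructed from the properties set up in the initializer above (context, queue), so details may differ from your version:

```swift
import CoreImage
import MetalKit

// Sketch of MetalView's draw loop: render the current CIImage into the drawable
override func draw(_ rect: CGRect) {
    guard let image = image,
          let drawable = currentDrawable,
          let buffer = queue.makeCommandBuffer() else { return }
    // Scale the image to fill the drawable (simple fill; adapt to your layout)
    let scaleX = drawableSize.width / image.extent.width
    let scaleY = drawableSize.height / image.extent.height
    let scaled = image.transformed(by: CGAffineTransform(scaleX: scaleX, y: scaleY))
    context.render(scaled,
                   to: drawable.texture,
                   commandBuffer: buffer,
                   bounds: CGRect(origin: .zero, size: drawableSize),
                   colorSpace: CGColorSpaceCreateDeviceRGB())
    buffer.present(drawable)
    buffer.commit()
}
```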

Create a new file, FilteringCameraController. I'll write a new camera controller that uses the MetalView and adds the comic effect to the preview output. To do that, the controller has to capture the sample buffer in a delegate callback and display it manually on the preview view. This is a little more involved, but with the code from the Metal tutorial, the rendering part is already done. We just have to prepare a filtered CIImage for the Metal view to display.

I'll post the whole class and explain the code point by point.

//  FilteringCameraController.swift

import Foundation
import AVFoundation
import CoreGraphics
import CoreImage
import UIKit

class FilteringCameraController: NSObject {
    private var previewView = MetalView()
    @objc dynamic private let captureSession = AVCaptureSession()
    private let captureSessionQueue = DispatchQueue(label: "FilteringCameraController_capture_session_queue",
                                                    attributes: [])
    private var videoInput: AVCaptureDeviceInput?
    private var videoOutput: AVCaptureVideoDataOutput?
    private var setupComplete = false
    private var captureVideoOrientation = AVCaptureVideoOrientation.portrait
    // 1
    private lazy var filter: CIFilter = {
        let filterInternal = CIFilter(name: "CIComicEffect")!
        return filterInternal
    }()
    var flashMode = AVCaptureDevice.FlashMode.off
    // 2
    func prepareCamera(with previewView: UIView) {
        if setupComplete || AVCaptureDevice.authorizationStatus(for: .video) == .denied {
            return
        }
        self.previewView.frame = previewView.frame
        previewView.addSubview(self.previewView)
        setupInput(for: .front)
        setupComplete = true
    }
    // 3
    private func setupInput(for cameraPosition: AVCaptureDevice.Position) {
        captureSessionQueue.async {
            self.prepareInput(for: cameraPosition)
            if self.captureSession.canSetSessionPreset(.photo) {
                self.captureSession.sessionPreset = .photo
            }
            self.setupOutputs()
        }
    }
    // 4
    private func prepareInput(for cameraPosition: AVCaptureDevice.Position) {
        guard let videoDevice = captureDevice(with: AVMediaType.video.rawValue, position: cameraPosition) else {
            return
        }
        let videoDeviceInput: AVCaptureDeviceInput
        do {
            videoDeviceInput = try AVCaptureDeviceInput(device: videoDevice)
        } catch {
            return
        }
        if self.captureSession.canAddInput(videoDeviceInput) {
            self.captureSession.addInput(videoDeviceInput)
            self.videoInput = videoDeviceInput
        }
    }
    // 5
    private func setupOutputs() {
        let videoDataOutput = AVCaptureVideoDataOutput()
        videoDataOutput.setSampleBufferDelegate(self, queue: self.captureSessionQueue)
        if self.captureSession.canAddOutput(videoDataOutput) {
            self.captureSession.addOutput(videoDataOutput)
            self.videoOutput = videoDataOutput
        }
    }
    // 6
    func startCamera() {
        if !setupComplete { return }
        if captureSession.isRunning { return }
        captureSessionQueue.async { [unowned self] in
            self.captureSession.startRunning()
        }
    }
    // 7
    private func captureDevice(with mediaType: String, position: AVCaptureDevice.Position?) -> AVCaptureDevice? {
        let session = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera],
                                                       mediaType: AVMediaType(rawValue: mediaType),
                                                       position: .unspecified)
        let cameras = session.devices
        var captureDevice = cameras.first
        if let position = position {
            for device in cameras where device.position == position {
                captureDevice = device
            }
        }
        if position == .back {
            try? captureDevice?.lockForConfiguration()
            captureDevice?.focusMode = .continuousAutoFocus
            captureDevice?.unlockForConfiguration()
        }
        return captureDevice
    }
    enum CameraControllerError: Swift.Error {
        case captureSessionAlreadyRunning
        case captureSessionIsMissing
        case inputsAreInvalid
        case invalidOperation
        case noCamerasAvailable
        case unknown
    }
}

// 8
extension FilteringCameraController: AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate {
    public func captureOutput(_ captureOutput: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
            return
        }
        let sourceImage = CIImage(cvPixelBuffer: imageBuffer)
        filter.setValue(sourceImage, forKey: kCIInputImageKey)
        let filteredImage = filter.outputImage!
        // 9
        let rotation = -CGFloat.pi
        let scaleX: CGFloat = -1
        let rotated = filteredImage
            .transformed(by: CGAffineTransform(rotationAngle: rotation / 2))
            .transformed(by: CGAffineTransform(scaleX: scaleX, y: 1))
        let transformed = rotated.transformed(by: .init(translationX: -rotated.extent.origin.x,
                                                        y: -rotated.extent.origin.y))
        // 10
        previewView.image = transformed
    }
}
  1. CIFilter
    The lazy created comic filter will be used with a camera buffer to create an output image.
  2. Entry method
    Before using the camera, we have to call prepareCamera(with previewView: UIView). It will handle creating inputs, outputs, and delegate setup.
  3. Our code will use a separate queue for handling all camera-related actions. We need to be in sync with the camera to perform actions related to the setup.
  4. Our capture session needs input. In our example, I'll use the front camera as input.
  5. Unlike the provided implementation, our new camera controller uses a video data output. The output calls the delegate every time a sample buffer is ready.
  6. After successful preparation, the camera can be started and will begin to produce an output buffer.
  7. Helper method for creating a capture device
    It uses a discovery session to retrieve the available capture devices (front and back cameras). In our code, only the front camera will be used. You can experiment and use the back camera if you like.
  8. Delegate
    In this example, we only care about the video output, implementing captureOutput(_ captureOutput: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection). Core Image provides a convenience initializer for creating a CIImage from a pixel buffer. The result is then used as the input for the comic filter.
  9. Scaling and rotating
    By default, the camera output is rotated 90º and is not mirrored. To make it look natural on screen, the code rotates it back, then mirrors it by scaling the x axis by -1. We also translate the image so its extent starts at (0, 0), which ensures proper rendering. Add a breakpoint and use Xcode to see how the image changes at every step; Xcode will render a CIImage preview for you.
  10. Rendering
    This line of code will render our camera preview. I told you, it will work out of the box!

The last thing to do is to change the View controller to accommodate the new camera controller.

Change camera controller:

let cameraController = FilteringCameraController()

Remove the code from viewDidLoad; it is not needed anymore.

override func viewDidLoad() {
    super.viewDidLoad()
}

Our setup takes place in viewWillAppear:

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    cameraController.prepareCamera(with: self.view)
    cameraController.startCamera()
}

Build and run the application. How awesome is the result? You can experiment with filters and produce your own unique effect! Let your creativity go crazy :)
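To experiment, it is enough to swap the filter name in the lazy property, or chain several built-in Core Image filters; for example (the helper name is mine, the filter names are standard Core Image built-ins):

```swift
import CoreImage

// Chain two built-in filters: pixellate first, then a noir color effect
func stylize(_ input: CIImage) -> CIImage? {
    guard let pixellate = CIFilter(name: "CIPixellate"),
          let noir = CIFilter(name: "CIPhotoEffectNoir") else { return nil }
    pixellate.setValue(input, forKey: kCIInputImageKey)
    pixellate.setValue(12.0, forKey: kCIInputScaleKey)
    guard let pixellated = pixellate.outputImage else { return nil }
    noir.setValue(pixellated, forKey: kCIInputImageKey)
    return noir.outputImage
}
```

You would call something like this in captureOutput in place of the single comic filter.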

If you want to get the final result, it is on GitHub.

Artur Gruchała

I started learning iOS development when Swift was introduced. Since then I've tried Xamarin, Flutter, and React Native. Nothing is better than native code:)