This camera app applies filters to the live camera feed in real time. Users can select different filters, save the captured photos, and share them easily.
#UIKit #Combine #AVFoundation #CoreImage #MVVM
| Camera | Filter | Save&Share |
|---|---|---|
| ![]() | ![]() | ![]() |
- `AVCaptureDevice` (the camera) sends data to `AVCaptureDeviceInput`.
- The input sends video data to `AVCaptureVideoDataOutput`.
- The output conforms to `AVCaptureVideoDataOutputSampleBufferDelegate` to use the `captureOutput` method, which handles each `CMSampleBuffer` delivered by the output.
- `CameraManager` uses `ImageFilterManager` to apply `CIFilter`s to the `CMSampleBuffer`.
- `CameraManager` renders a `CGImage` from the filtered `CIImage` returned by `ImageFilterManager`.
- `CameraManager` sets the `contents` of the `VideoView`'s layer to render the `CGImage`.
- `CameraManager` uses `capturedImage` when taking a photo (`capturedImage` is updated on every frame). A minimal session-setup sketch follows this list.
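To make the data flow above concrete, here is a minimal sketch of how such a session could be wired up. It is not the project's actual `CameraManager`; the property names, queue label, and camera selection are assumptions.

```swift
import AVFoundation

// Sketch of the capture pipeline described above; details are assumptions.
final class CameraManager: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let captureSession = AVCaptureSession()
    private let videoOutput = AVCaptureVideoDataOutput()

    func configureSession() throws {
        captureSession.beginConfiguration()

        // AVCaptureDevice (camera) -> AVCaptureDeviceInput
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: .back) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if captureSession.canAddInput(input) {
            captureSession.addInput(input)
        }

        // AVCaptureDeviceInput -> AVCaptureVideoDataOutput, which delivers CMSampleBuffers
        // to the AVCaptureVideoDataOutputSampleBufferDelegate (captureOutput).
        videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.video.queue"))
        if captureSession.canAddOutput(videoOutput) {
            captureSession.addOutput(videoOutput)
        }

        captureSession.commitConfiguration()
        // startRunning() is blocking; call it off the main thread in production code.
        captureSession.startRunning()
    }
}
```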
- This result builder makes it easier to chain `CIFilter`s.
```swift
@resultBuilder
struct FilterBuilder {
    static func buildBlock(_ components: CIFilter...) -> CIImage? {
        guard let first = components.first,
              var image = first.outputImage else { return nil }
        for component in components.dropFirst() {
            component.setValue(image, forKey: kCIInputImageKey)
            if let output = component.outputImage {
                image = output
            }
        }
        return image
    }
}
```

```swift
// Before
func applyFilters(to image: CIImage) -> CIImage? {
    // 1️⃣ Apply sepiaTone filter
    let sepiaFilter = CIFilter(name: "CISepiaTone")
    sepiaFilter?.setValue(image, forKey: kCIInputImageKey)
    sepiaFilter?.setValue(0.8, forKey: kCIInputIntensityKey)
    guard let sepiaOutput = sepiaFilter?.outputImage else { return nil }

    // 2️⃣ Add vignette effect
    let vignetteFilter = CIFilter(name: "CIVignette")
    vignetteFilter?.setValue(sepiaOutput, forKey: kCIInputImageKey)
    vignetteFilter?.setValue(1.5, forKey: kCIInputIntensityKey)
    vignetteFilter?.setValue(2.0, forKey: kCIInputRadiusKey)
    return vignetteFilter?.outputImage
}
```

```swift
// After
func applyFilters(to image: CIImage) -> CIImage? {
    @FilterBuilder
    var filteredImage: CIImage? {
        sepiaFilter(ciImage: image)
        vignetteFilter(intensity: 1.2, radius: 3.0)
    }
    return filteredImage
}
```
```swift
// Filter method example
private func sepiaFilter(ciImage: CIImage = CIImage()) -> CIFilter {
    let filter = Filter.sepiaTone.ciFilter
    filter.setValue(ciImage, forKey: kCIInputImageKey)
    return filter
}
```
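The helper above refers to a `Filter` type, and the "After" example calls a `vignetteFilter(intensity:radius:)` helper; neither is shown in this README. A possible shape for them, as an assumption rather than the project's actual code:

```swift
import CoreImage

// Hypothetical enum wrapping Core Image filter names; the real `Filter` type may differ.
enum Filter: String {
    case sepiaTone = "CISepiaTone"
    case vignette = "CIVignette"

    var ciFilter: CIFilter {
        // Force-unwrapping is acceptable here because the raw values are known-good filter names.
        CIFilter(name: rawValue)!
    }
}

// Hypothetical counterpart to sepiaFilter(ciImage:), matching the call in the "After" example.
private func vignetteFilter(intensity: Double, radius: Double) -> CIFilter {
    let filter = Filter.vignette.ciFilter
    filter.setValue(intensity, forKey: kCIInputIntensityKey)
    filter.setValue(radius, forKey: kCIInputRadiusKey)
    return filter
}
```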
- Although all permissions were granted, saving the photo failed.
- The reason was that the `UIImage` I tried to save was not backed by PNG or JPEG data, which the photo library requires.
- So I improved the photo capture function by rendering the `UIImage` from PNG data.
```swift
func takePhoto(scale: CGFloat = 1.0, orientation: UIImage.Orientation = .right) -> UIImage? {
    guard let ciImage = capturedImage else { return nil }
    captureSession.stopRunning()
    guard let pngData = context.pngRepresentation(
        of: ciImage,
        format: CIFormat.RGBA8,
        colorSpace: CGColorSpaceCreateDeviceRGB()
    ) else { return nil }
    return UIImage(data: pngData)
}
```
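For context, the `UIImage` returned by `takePhoto` can then be written to the photo library. A minimal sketch using `UIImageWriteToSavedPhotosAlbum`; the `save(_:)` helper and the `cameraManager` instance in the usage comment are assumptions, not the project's actual code:

```swift
import UIKit

/// Writes a captured photo to the user's photo library.
/// Requires the NSPhotoLibraryAddUsageDescription key in Info.plist.
func save(_ photo: UIImage) {
    UIImageWriteToSavedPhotosAlbum(photo, nil, nil, nil)
}

// Usage (assuming a `cameraManager` exposing the takePhoto method above):
// if let photo = cameraManager.takePhoto() { save(photo) }
```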
- At first, I used `AVCaptureVideoPreviewLayer` to render the video data on the screen.
- After setting up the `captureOutput` delegate method, I realized that the filters were not applied to the camera preview.
- The reason was that `AVCaptureVideoPreviewLayer` just renders the input data directly, regardless of what I do in the `captureOutput` method.
- So I made `VideoView`, a `UIView` subclass that renders the data on its own layer, and replaced `AVCaptureVideoPreviewLayer` with `VideoView`.
```swift
final class VideoView: UIView {
    // ...init...
    func renderCGImage(_ cgImage: CGImage?) {
        DispatchQueue.main.async { [weak self] in
            self?.layer.contents = cgImage
        }
    }
}
```

```swift
// CameraManager.swift
func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
    else { return }
    // ...
    videoView?.renderCGImage(cgImage)
}
```
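The `// ...` above elides the step that turns the pixel buffer into the filtered `cgImage`. Based on the architecture notes earlier (`ImageFilterManager` applies the `CIFilter` chain, then a `CGImage` is rendered), it plausibly looks like the sketch below; the helper name and the `ImageFilterManager.applyFilters(to:)` call are assumptions, not the project's actual code.

```swift
import AVFoundation
import CoreImage

// Sketch of the elided conversion: CVPixelBuffer -> filtered CIImage -> CGImage.
// Assumes ImageFilterManager exposes the applyFilters(to:) function shown earlier.
func makeFilteredCGImage(from pixelBuffer: CVPixelBuffer,
                         filterManager: ImageFilterManager,
                         context: CIContext) -> CGImage? {
    // Wrap the camera frame in a CIImage.
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)

    // Apply the CIFilter chain built with FilterBuilder.
    guard let filtered = filterManager.applyFilters(to: ciImage) else { return nil }

    // In the project, the filtered CIImage is also kept as capturedImage so takePhoto() can use it.
    // Render the filtered CIImage into a CGImage for VideoView's layer.
    return context.createCGImage(filtered, from: filtered.extent)
}
```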



