AVFoundation Tutorial
Introduction to AVFoundation
AVFoundation is a powerful framework provided by Apple that allows developers to work with audiovisual media on iOS, macOS, watchOS, and tvOS platforms. With AVFoundation, you can manage and play back audio and video, capture media from the camera, and perform media editing tasks.
Setting Up Your Project
To get started with AVFoundation, you need to integrate the framework into your Xcode project. Follow these steps:
- Create a new Xcode project or open an existing one.
- Go to the Project Navigator and select your project.
- Under TARGETS, select your app target.
- Navigate to the General tab and scroll to the Frameworks, Libraries, and Embedded Content section.
- Click the + button and add AVFoundation.framework.
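If your app records audio or captures from the camera (as in the later sections of this tutorial), iOS also requires usage-description entries in your Info.plist; without them, the app is terminated the first time it accesses the microphone or camera. A minimal fragment (the description strings are placeholders — write your own user-facing explanations):

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to capture video.</string>
<key>NSMicrophoneUsageDescription</key>
<string>This app uses the microphone to record audio.</string>
```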
Playing a Video
Let's start by playing a video using AVFoundation. Here is a simple example:
```swift
import AVFoundation
import UIKit

class ViewController: UIViewController {
    var player: AVPlayer?
    var playerLayer: AVPlayerLayer?

    override func viewDidLoad() {
        super.viewDidLoad()

        guard let url = URL(string: "https://www.example.com/video.mp4") else { return }

        // Create a player for the remote file and a layer to render it.
        player = AVPlayer(url: url)
        playerLayer = AVPlayerLayer(player: player)
        playerLayer?.frame = view.bounds
        view.layer.addSublayer(playerLayer!)

        player?.play()
    }
}
```
In this example, we create an AVPlayer instance with a URL pointing to the video file. We then create an AVPlayerLayer to display the video and add it to the view's layer hierarchy. Finally, we call the play method to start playback.
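Once playback starts, you often want to know when the item finishes. One common approach is to observe AVFoundation's `AVPlayerItemDidPlayToEndTime` notification. A minimal sketch, assuming it is added inside `viewDidLoad` of the `ViewController` above:

```swift
// Observe the end of playback for the current item.
NotificationCenter.default.addObserver(
    forName: .AVPlayerItemDidPlayToEndTime,
    object: player?.currentItem,
    queue: .main
) { [weak self] _ in
    // Seek back to the start so the video can be replayed.
    self?.player?.seek(to: .zero)
}
```

Remember to remove the observer (or rely on the block-based API's returned token) when the view controller goes away.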
Recording Audio
AVFoundation also allows you to record audio. Here's how you can do it:
```swift
import AVFoundation
import UIKit

class ViewController: UIViewController {
    var audioRecorder: AVAudioRecorder?

    override func viewDidLoad() {
        super.viewDidLoad()

        let audioSession = AVAudioSession.sharedInstance()
        do {
            // Configure the shared audio session for recording.
            try audioSession.setCategory(.playAndRecord, mode: .default)
            try audioSession.setActive(true)

            let settings = [
                AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
                AVSampleRateKey: 12000,
                AVNumberOfChannelsKey: 1,
                AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
            ]

            let url = getDocumentsDirectory().appendingPathComponent("recording.m4a")
            audioRecorder = try AVAudioRecorder(url: url, settings: settings)
            audioRecorder?.record()
        } catch {
            print("Failed to record audio: \(error.localizedDescription)")
        }
    }

    func getDocumentsDirectory() -> URL {
        let paths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)
        return paths[0]
    }
}
```
In this example, we configure an AVAudioSession for recording and create an AVAudioRecorder instance with the desired settings. We then start recording by calling the record method.
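To finish the recording and hear the result, stop the recorder and play the file back with AVAudioPlayer. A sketch that assumes the `ViewController` above, plus an `audioPlayer` property you would add yourself (the player must be held in a strong reference, or playback stops as soon as the local variable is deallocated):

```swift
var audioPlayer: AVAudioPlayer?   // assumed property on ViewController

func stopAndPlayBack() {
    audioRecorder?.stop()

    let url = getDocumentsDirectory().appendingPathComponent("recording.m4a")
    do {
        // Keep a strong reference so playback is not cut off.
        audioPlayer = try AVAudioPlayer(contentsOf: url)
        audioPlayer?.play()
    } catch {
        print("Playback failed: \(error.localizedDescription)")
    }
}
```

Note that on-device recording also requires the user's microphone permission; you can prompt for it up front with `AVAudioSession.sharedInstance().requestRecordPermission { granted in ... }`.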
Capturing Video from the Camera
Capturing video from the camera is another common task. Here's how you can achieve it using AVFoundation:
```swift
import AVFoundation
import UIKit

class ViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    var captureSession: AVCaptureSession?
    var videoPreviewLayer: AVCaptureVideoPreviewLayer?

    override func viewDidLoad() {
        super.viewDidLoad()

        let session = AVCaptureSession()
        session.sessionPreset = .medium

        guard let backCamera = AVCaptureDevice.default(for: .video) else {
            print("Unable to access back camera!")
            return
        }

        do {
            let input = try AVCaptureDeviceInput(device: backCamera)
            if session.canAddInput(input) {
                session.addInput(input)
            }
        } catch {
            print("Failed to create camera input: \(error.localizedDescription)")
            return
        }

        // Deliver frames to our delegate on a dedicated queue.
        let videoOutput = AVCaptureVideoDataOutput()
        videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "videoQueue"))
        if session.canAddOutput(videoOutput) {
            session.addOutput(videoOutput)
        }

        let previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.frame = view.layer.bounds
        previewLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(previewLayer)

        captureSession = session
        videoPreviewLayer = previewLayer

        // startRunning() blocks until the session starts, so call it off the main thread.
        DispatchQueue.global(qos: .userInitiated).async {
            session.startRunning()
        }
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Process the video frame here
    }
}
```
In this example, we create an AVCaptureSession and configure it with a video input from the back camera, checking with canAddInput and canAddOutput before attaching each one. We also add a video data output to process the video frames. Finally, we add a preview layer to display the camera feed and start the session on a background queue, since startRunning blocks the calling thread until the session is up.
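Before configuring the session at all, you should check camera authorization and request it if needed. A minimal sketch using AVFoundation's authorization API; `setUpCaptureSession()` stands in for the configuration code above:

```swift
import AVFoundation

switch AVCaptureDevice.authorizationStatus(for: .video) {
case .authorized:
    setUpCaptureSession()   // assumed helper wrapping the setup code above
case .notDetermined:
    // The completion handler runs on an arbitrary queue, so hop back to main.
    AVCaptureDevice.requestAccess(for: .video) { granted in
        if granted {
            DispatchQueue.main.async { setUpCaptureSession() }
        }
    }
default:
    print("Camera access denied or restricted")
}
```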
Basic Media Editing
AVFoundation provides powerful tools for media editing. Here's an example of trimming a video:
```swift
import AVFoundation

func trimVideo(sourceURL: URL,
               destinationURL: URL,
               startTime: CMTime,
               endTime: CMTime,
               completion: @escaping (Bool) -> Void) {
    let asset = AVAsset(url: sourceURL)

    // The export-session initializer is failable, so unwrap it up front.
    guard let exportSession = AVAssetExportSession(asset: asset,
                                                   presetName: AVAssetExportPresetHighestQuality) else {
        completion(false)
        return
    }

    exportSession.outputURL = destinationURL
    exportSession.outputFileType = .mov
    exportSession.timeRange = CMTimeRange(start: startTime, end: endTime)

    exportSession.exportAsynchronously {
        switch exportSession.status {
        case .completed:
            completion(true)
        case .failed, .cancelled:
            completion(false)
        default:
            completion(false)
        }
    }
}
```
In this example, we use an AVAssetExportSession to trim a video. We specify the source and destination URLs, along with the start and end times for the trim operation. The completion handler indicates whether the operation was successful.
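A call site might look like the following; the file paths are hypothetical, and CMTime values are built from seconds with a timescale of 600 (a common choice for video):

```swift
import AVFoundation

let source = URL(fileURLWithPath: "/path/to/input.mov")
let output = FileManager.default.temporaryDirectory.appendingPathComponent("trimmed.mov")

// Keep the segment from 2s to 10s.
let start = CMTime(seconds: 2, preferredTimescale: 600)
let end = CMTime(seconds: 10, preferredTimescale: 600)

trimVideo(sourceURL: source, destinationURL: output, startTime: start, endTime: end) { success in
    print(success ? "Trimmed to \(output.path)" : "Trim failed")
}
```

Note that the export fails if a file already exists at the destination URL, so choose a fresh path or remove the old file before exporting.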
Conclusion
In this tutorial, we've covered the basics of AVFoundation, including how to play videos, record audio, capture video from the camera, and perform basic media editing. AVFoundation is a vast framework with many more capabilities, so be sure to explore the official documentation and experiment with different features to fully leverage its power.