Why Your ARKit App Crashes After 5 Minutes (And How to Fix It)
Sebastian Kotarski
The Pattern I See Every Month
At least once a month, a company reaches out to me with the same problem: their ARKit app works perfectly during development, passes all internal tests, but crashes reliably after 5-10 minutes of real-world use.
The crash logs are vague. Memory warnings appear sporadically. And the development team has spent weeks trying to reproduce it without success — often in the simulator, where ARKit's camera pipeline doesn't even run.
If this sounds familiar, you're not alone. This is the single most common ARKit issue I encounter in production apps, and the root cause is almost always the same.
The Root Cause: ARFrame Retain Cycles
ARKit delivers a continuous stream of ARFrame objects through its delegate methods. Each frame contains a captured image, depth data, tracking information, and anchor updates. A single frame can consume 20-50MB of memory.
The problem occurs when developers inadvertently hold strong references to these frames:
```swift
// This is the bug pattern I see most often
class ARViewController: UIViewController, ARSessionDelegate {
    var lastFrame: ARFrame?          // Strong reference!
    var frameHistory: [ARFrame] = [] // Even worse!

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        lastFrame = frame
        frameHistory.append(frame) // Memory grows unbounded
        processFrame(frame)
    }
}
```
Each retained frame keeps its entire pixel buffer alive. After 5 minutes at 60fps, you're looking at 18,000 frames. Even if you only keep the last 100, that's potentially 5GB of pixel buffer data that the system can't reclaim.
The Fix
The solution involves three changes:
1. Never Store ARFrames Directly
Extract only the data you need from each frame:
```swift
struct FrameData {
    let timestamp: TimeInterval
    let cameraTransform: simd_float4x4
    let lightEstimate: CGFloat?
    // Only store what you actually need
}

class ARViewController: UIViewController, ARSessionDelegate {
    var lastFrameData: FrameData?

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        lastFrameData = FrameData(
            timestamp: frame.timestamp,
            cameraTransform: frame.camera.transform,
            lightEstimate: frame.lightEstimate?.ambientIntensity
        )
    }
}
```
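If you genuinely need a history, keep lightweight `FrameData` values in a bounded buffer, never full `ARFrame`s. A minimal sketch — the `maxHistory` value and the `record` helper are illustrative choices, not ARKit API:

```swift
// Bounded history of extracted frame data. At ~100 bytes per FrameData,
// 120 entries is trivial; 120 retained ARFrames would be gigabytes.
var frameDataHistory: [FrameData] = []
let maxHistory = 120 // ~2 seconds of history at 60fps

func record(_ data: FrameData) {
    frameDataHistory.append(data)
    // Drop the oldest entries so memory stays constant over time
    if frameDataHistory.count > maxHistory {
        frameDataHistory.removeFirst(frameDataHistory.count - maxHistory)
    }
}
```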
2. Use Autoreleasepool for Processing
When you need to process frame data (e.g., for ML inference), wrap it in an autorelease pool:
```swift
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    autoreleasepool {
        let pixelBuffer = frame.capturedImage
        // Process pixel buffer here
        // It will be released when the pool drains
    }
}
```
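If some image data must outlive the delegate callback, copy out a small representation instead of retaining the frame. A sketch using Core Image — the shared context and the `snapshot` helper are my own naming, not ARKit API:

```swift
import CoreImage
import CoreVideo

// Reuse one CIContext; creating a new one per frame is expensive.
let ciContext = CIContext()

func snapshot(of pixelBuffer: CVPixelBuffer) -> CGImage? {
    let image = CIImage(cvPixelBuffer: pixelBuffer)
    // createCGImage copies the pixels into a new CGImage, so nothing
    // here keeps the ARFrame or its pixel buffer alive afterward.
    return ciContext.createCGImage(image, from: image.extent)
}
```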
3. Monitor Memory Proactively
Add memory monitoring to catch issues early:
```swift
func checkMemoryUsage() {
    // getMemoryUsage() is your own helper returning the app's
    // memory footprint in bytes
    let memoryUsage = getMemoryUsage()
    if memoryUsage > 500_000_000 { // 500MB threshold
        // Reduce AR quality, pause processing, or alert
        arSession.pause()
        cleanupResources()
    }
}
```
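For the footprint itself, a commonly used sketch queries the Mach `task_vm_info` API; `phys_footprint` roughly matches what Xcode's memory gauge reports. Assume this is an approximation, not an exact accounting:

```swift
import Darwin

// Returns the current physical memory footprint in bytes, or 0 on failure.
func getMemoryUsage() -> UInt64 {
    var info = task_vm_info_data_t()
    var count = mach_msg_type_number_t(
        MemoryLayout<task_vm_info_data_t>.size / MemoryLayout<integer_t>.size
    )
    let kerr = withUnsafeMutablePointer(to: &info) { infoPtr in
        infoPtr.withMemoryRebound(to: integer_t.self, capacity: Int(count)) {
            task_info(mach_task_self_, task_flavor_t(TASK_VM_INFO), $0, &count)
        }
    }
    return kerr == KERN_SUCCESS ? info.phys_footprint : 0
}
```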
How to Diagnose This in Your App
If you suspect this issue:
- Open Instruments with the Allocations template
- Run your app on a physical device (not the simulator)
- Filter allocations by "IOSurface" or "CVPixelBuffer"
- Watch the memory graph — if it trends upward without plateauing, you have a leak
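Alongside Instruments, it helps to log the moment the system starts complaining. A minimal in-app check in your AR view controller — the log message is just a placeholder:

```swift
// UIKit fires this shortly before jetsam terminates a process whose
// footprint keeps growing, so it's a useful last-chance signal.
override func didReceiveMemoryWarning() {
    super.didReceiveMemoryWarning()
    print("Memory warning received; release any retained frames now")
}
```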
The fix typically takes a few hours to implement and test. The hard part is finding it in the first place.
When to Call for Help
If you've tried the above and your app still crashes, the issue might be deeper:
- RealityKit entity leaks — entities not properly removed from scenes
- Metal texture accumulation — custom shaders holding texture references
- Combine subscription leaks — publishers not cancelled on dealloc
These require more specialized debugging with Instruments and LLDB. If you're stuck, book a technical audit and I'll find the root cause within 3-5 days.
Need Help With Your iOS Project?
I help startups and enterprises solve critical iOS, ARKit, and visionOS issues, from performance problems to App Store rejections.