App Intents 2.0 In iOS 26: Visual Intelligence, Interactive Snippets, And Deferred Properties
App Intents shipped in iOS 16 as Apple’s structured-action API for Shortcuts, Siri, and Spotlight; iOS 17 expanded it for App Intents-driven widgets; iOS 18 made it the contract for Apple Intelligence’s action surface. iOS 26 extends App Intents into Visual Intelligence (image-search results from third-party apps), interactive snippets (small pop-up UI windows that compose into system contexts), entity view annotations (how an entity renders inline), and @DeferredProperty (async-computed entity properties)1. The extensions don’t change the core App Intents model; they add new participation surfaces an app can adopt.
This post walks through the iOS 26 additions against Apple’s documentation. The frame is “what new surfaces an existing App Intents adoption gains by adding the iOS 26 conformances,” because most apps with App Intents already have the foundational types in place, and the iOS 26 work is about extending those types into new contexts.
TL;DR
- IntentValueQuery is the new protocol for Visual Intelligence integration. The query accepts a SemanticContentDescriptor (the user’s visual context) and returns an array of AppEntity instances the app considers relevant2.
- @DeferredProperty declares an entity property whose value is computed asynchronously. The property loads when the system actually needs it, avoiding upfront cost for entities with many cheap-to-display fields and a few expensive-to-compute ones.
- Interactive App Intents Snippets let an app present a small pop-up window (with buttons, text, controls) inside system contexts like Spotlight, Visual Intelligence, or the Siri response surface. The snippet is a SwiftUI view bound to the App Intent’s result.
- Entity view annotations let an app declare how an AppEntity should render in different system contexts (a compact row in Spotlight, a hero card in Visual Intelligence, a Live Activity-style widget).
- The cluster’s App Intents post covered the foundational model; this post extends it into iOS 26’s new participation surfaces. The cluster’s App Intents vs MCP Tools post covers the routing question between Apple Intelligence’s intent surface and general-agent MCP tools.
Visual Intelligence: IntentValueQuery And SemanticContentDescriptor
iOS 26’s marquee App Intents addition is Visual Intelligence integration2. The flow:
- The user takes a photo, captures a screenshot, or holds the camera up to an object.
- The system’s Visual Intelligence layer extracts visual + semantic context (what’s in the image, what the user might want about it).
- The system queries every app that registered an IntentValueQuery with that semantic context.
- Each app returns relevant AppEntity instances; the system aggregates and presents them in the Visual Intelligence UI.
- The user taps an entity to enter the originating app at the right context.
The developer surface is a struct conforming to IntentValueQuery with a values(for:) method that takes a SemanticContentDescriptor:
import AppIntents

struct ProductLookupQuery: IntentValueQuery {
    // The app's own catalog search service (hypothetical helper, not Apple API).
    let catalog = ProductCatalog()

    func values(for descriptor: SemanticContentDescriptor) async throws -> [Product] {
        // descriptor.labels: detected category and content labels
        // descriptor.pixelBuffer: the visual content as a CVReadOnlyPixelBuffer
        // (use VideoToolbox/CoreImage to convert if needed)
        let candidates = try await catalog.search(labels: descriptor.labels)
        return candidates.map(Product.init)
    }
}
The SemanticContentDescriptor carries two fields the system populates: labels (an array of detected category and content tags such as “wine bottle”, “pinot noir”, “label text”, etc.) and pixelBuffer (the underlying image data as a CVReadOnlyPixelBuffer for apps that want to run their own vision models on the content). The app’s job is to map those signals to its own data and return the matching entities.
The right adoption pattern: a shopping app implements the query against its product catalog (visual context → matching products), a wine app against its bottle database (label image → wine entry), a recipe app against its recipe library (ingredient photo → matching recipes).
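For reference, a minimal sketch of what the Product entity returned above might look like. The ProductCatalog type, its methods, and the field names are assumptions rather than Apple API; the convenience initializer implied by the earlier map(Product.init) call is omitted.

struct Product: AppEntity {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Product")
    static var defaultQuery = ProductQuery()

    var id: String
    var name: String
    var price: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)", subtitle: "\(price)")
    }
}

struct ProductQuery: EntityQuery {
    func entities(for identifiers: [Product.ID]) async throws -> [Product] {
        // Resolve identifiers against the app's own store (hypothetical).
        try await ProductCatalog().products(withIDs: identifiers)
    }
}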
@DeferredProperty: Async Entity Values
Existing AppEntity types declare their properties statically. Every property must be computed before the entity is returned. For entities with mixed-cost properties (fast title/subtitle/image, slow detailed-description-from-server), the all-or-nothing computation is wasteful.
@DeferredProperty (iOS 26+) declares a property as async-computed3:
import AppIntents

struct Recipe: AppEntity {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Recipe")
    static var defaultQuery = RecipeQuery()

    var id: String

    @Property(title: "Title")
    var title: String

    @Property(title: "Cuisine")
    var cuisine: String

    @DeferredProperty(title: "Detailed Instructions")
    var instructions: String {
        get async throws {
            try await loadInstructionsFromBackend(id: id)
        }
    }

    // displayRepresentation omitted here; the entity view annotations section below shows it.
}
The deferred property’s getter is async; it runs only when the system actually needs the value (e.g., when the entity is selected for detail display). For entities returned from a query and only used for selection UI, the deferred property is never computed.
The pattern is right for any entity with fast-to-build fields (id, title, summary) plus expensive-to-build fields (full body, computed metrics, server-fetched data). Without @DeferredProperty, the developer either computes everything (wasteful) or computes only the cheap fields and adds a separate “load detail” intent (more complex).
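To complete the picture, a minimal sketch of the query that builds these entities. RecipeStore, its methods, and the Recipe.init(from:) convenience initializer are hypothetical stand-ins for the app’s own storage; the point is that the query fills only the cheap fields and never touches the deferred getter:

struct RecipeQuery: EntityQuery {
    func entities(for identifiers: [Recipe.ID]) async throws -> [Recipe] {
        // Map lightweight local records to entities. The deferred `instructions`
        // getter runs later, and only if the system actually asks for the value.
        let records = try await RecipeStore.shared.records(withIDs: identifiers)
        return records.map(Recipe.init(from:))
    }

    func suggestedEntities() async throws -> [Recipe] {
        let records = try await RecipeStore.shared.recentRecords()
        return records.map(Recipe.init(from:))
    }
}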
Interactive Snippets and SnippetIntent
iOS 26 introduces the dedicated SnippetIntent protocol for snippet-shaped interactions, alongside the existing ShowsSnippetView-conforming AppIntent pattern from earlier releases4. SnippetIntent adds a static reload() method the system can call to refresh the snippet’s content without a full intent re-invocation:
struct WeatherSnippet: SnippetIntent {
    static var title: LocalizedStringResource = "Weather Snippet"

    @Parameter(title: "City")
    var city: City

    func perform() async throws -> some IntentResult & ShowsSnippetView {
        let forecast = try await weatherService.forecast(for: city)
        return .result(view: ForecastSnippet(forecast: forecast))
    }

    static func reload() async throws {
        // Triggered by the system when it wants fresh snippet data
        // (e.g., after the data source signals an update)
    }
}
For non-snippet-specific intents that want to attach a snippet view to their response, the existing AppIntent + ShowsSnippetView pattern still works:
struct WeatherForecastIntent: AppIntent {
    static var title: LocalizedStringResource = "Weather Forecast"

    @Parameter(title: "City")
    var city: City

    func perform() async throws -> some IntentResult & ProvidesDialog & ShowsSnippetView {
        let forecast = try await weatherService.forecast(for: city)
        return .result(
            dialog: "Here’s the forecast for \(city.name).",
            view: ForecastSnippet(forecast: forecast)
        )
    }
}
import SwiftUI

struct ForecastSnippet: View {
    let forecast: Forecast

    var body: some View {
        VStack(alignment: .leading) {
            Text(forecast.headline).font(.headline)
            HStack {
                ForEach(forecast.days) { day in
                    DayCell(day: day)
                }
            }
            Button("Open in App") {
                // App-launching action wired via App Intents
            }
        }
        .padding()
    }
}
The result type’s ShowsSnippetView conformance tells the system to render the SwiftUI view alongside the dialog. The snippet is interactive: buttons inside it can trigger other App Intents, the user can scroll, the view participates in the system’s interaction layer.
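To make a control inside the snippet run another intent, SwiftUI’s intent-taking Button initializer (available since iOS 17) is the usual wiring. RefreshForecastIntent, OpenCityIntent, and their initializers are hypothetical intents the app would define:

import SwiftUI
import AppIntents

struct ForecastActions: View {
    let city: City

    var body: some View {
        HStack {
            // Each button runs an App Intent when tapped, without leaving
            // the system surface the snippet is rendered in.
            Button("Refresh", intent: RefreshForecastIntent(city: city))
            Button("Open in App", intent: OpenCityIntent(city: city))
        }
    }
}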
The cases that earn an interactive snippet: weather forecasts, calendar events, transit times, sports scores, package tracking. Anywhere the user wants more than a one-line dialog response from the system surface.
Entity View Annotations
Entity view annotations let an app declare how an AppEntity should render in different system contexts5. The mechanism extends DisplayRepresentation with multiple variants:
struct Recipe: AppEntity {
    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(
            title: "\(title)",
            subtitle: "\(cuisine) - \(time) min",
            image: .init(named: imageName)
        )
    }
}
The classic model returns a single DisplayRepresentation. iOS 26 lets entities provide context-specific variants for the system to pick from based on where it’s rendering (compact list, hero card, Spotlight result, Visual Intelligence panel). The framework picks the right variant per context; the app declares each one.
The pattern supports apps that need entity rendering to differ between, say, a tightly-packed Spotlight list and a single-card Visual Intelligence response. The variants compose without the app having to detect context.
Composition With Existing App Intents
The iOS 26 additions compose with existing App Intents primitives:
- An app’s AppShortcutsProvider (covered in Accessibility as platform) registers shortcuts for voice, action button, and Spotlight.
- Each AppShortcut references an AppIntent, which has @Parameter properties resolved from the user’s request.
- The AppIntent.perform() method returns a result type that can include a snippet view (ShowsSnippetView) or a dialog (ProvidesDialog).
- AppEntity types referenced by the intent’s parameters now support @DeferredProperty and entity view annotations.
- New IntentValueQuery types let the same entities surface in Visual Intelligence.
The shape: the existing model is preserved; the iOS 26 work is adding new types (queries) and new annotations (deferred, view variants) to the same entity surface.
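For context, the registration point that ties these pieces together is the same AppShortcutsProvider existing adoptions already ship. A minimal sketch reusing the WeatherForecastIntent from above; the phrase and symbol name are illustrative:

struct WeatherAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: WeatherForecastIntent(),
            phrases: ["Get the forecast in \(.applicationName)"],
            shortTitle: "Forecast",
            systemImageName: "cloud.sun"
        )
    }
}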
Common Failures
Three patterns from App Intents 2.0 adoption failures:
IntentValueQuery returning unscoped results. A query that returns every product matching the descriptor’s search terms, regardless of relevance, dilutes the Visual Intelligence experience. Fix: scope the query (top-N relevance, recency-weighted, user-personalized) so each returned entity earns its place in the system’s UI.
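A hedged sketch of the fix applied to the earlier ProductLookupQuery; the relevanceScore field and the cap of five are stand-ins for whatever ranking the app already has:

import AppIntents

struct ScopedProductLookupQuery: IntentValueQuery {
    let catalog = ProductCatalog()  // hypothetical search service

    func values(for descriptor: SemanticContentDescriptor) async throws -> [Product] {
        let candidates = try await catalog.search(labels: descriptor.labels)
        // Rank by the app's own relevance signal and cap the count so each
        // entity shown in Visual Intelligence earns its place.
        return candidates
            .sorted { $0.relevanceScore > $1.relevanceScore }
            .prefix(5)
            .map(Product.init)
    }
}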
@DeferredProperty for fast values. The deferred mechanism is for genuinely expensive computation. Marking a fast in-memory property as deferred adds async overhead without benefit. Fix: reserve @DeferredProperty for properties that actually pay off when deferred (server fetches, large computations, ML model invocations).
Interactive snippets that are full apps. A snippet is a compact UI surface; treating it as a mini-app produces snippets that feel cramped or are slow to render. Fix: keep snippets focused on the immediate response and one or two related actions; use the “Open in App” button to hand off complex flows to the full app.
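One way to wire that handoff is a dedicated intent that foregrounds the app; openAppWhenRun is the real App Intents flag, while AppRouter and the navigation call are hypothetical:

struct OpenForecastDetailIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Forecast Detail"
    // Tells the system to bring the app to the foreground when this intent runs,
    // handing the complex flow off to the full UI.
    static var openAppWhenRun: Bool = true

    @Parameter(title: "City")
    var city: City

    func perform() async throws -> some IntentResult {
        // Navigate to the right screen via the app's own routing (hypothetical).
        await AppRouter.shared.showForecastDetail(for: city)
        return .result()
    }
}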
What This Pattern Means For iOS 26+ Apps
Three takeaways.
- Add IntentValueQuery for any app with visual content. Shopping, recipes, products, locations, identifiable objects. The Visual Intelligence integration is the new discovery surface; apps that don’t participate are invisible there.
- Use @DeferredProperty for expensive entity fields. Detailed descriptions, computed metrics, server-fetched data. The default-async pattern matches modern Swift code and keeps the cheap-to-build entity returns fast.
- Adopt interactive snippets for high-value query types. Weather, calendar, transit, sports, package tracking. The snippet UI keeps users in the system response surface for the cases where a one-line dialog isn’t enough but a full-app launch is overkill.
The full Apple Ecosystem cluster: typed App Intents; MCP servers; the routing question; Foundation Models; the runtime vs tooling LLM distinction; three surfaces; the single source of truth pattern; Two MCP Servers; hooks for Apple development; Live Activities; the watchOS runtime; SwiftUI internals; RealityKit’s spatial mental model; SwiftData schema discipline; Liquid Glass patterns; multi-platform shipping; the platform matrix; Vision framework; Symbol Effects; Core ML inference; Writing Tools API; Swift Testing; Privacy Manifest; Accessibility as platform; SF Pro typography; visionOS spatial patterns; Speech framework; SwiftData migrations; tvOS focus engine; @Observable internals; SwiftUI Layout protocol; custom SF Symbols; AVFoundation HDR; watchOS workout lifecycle; what I refuse to write about. The hub is at the Apple Ecosystem Series. For broader iOS-with-AI-agents context, see the iOS Agent Development guide.
FAQ
Do I need Apple Intelligence enabled to test IntentValueQuery?
For full Visual Intelligence testing, yes. Visual Intelligence requires Apple Intelligence-capable hardware (iPhone 15 Pro or newer, M1 Mac or newer) with iOS 26+ and Apple Intelligence enabled. For development, you can test the query in unit tests by constructing a SemanticContentDescriptor directly and calling the query’s values(for:) method without the full system stack.
Can @DeferredProperty throw?
Yes. The async getter signature is get async throws, and exceptions propagate to the system. The system handles the failure by displaying the entity without the deferred property’s value (or showing an error state in some contexts). Apps should fail gracefully and return meaningful error states rather than crashing.
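Concretely, the deferred getter from the Recipe example above could absorb backend failures itself rather than letting them propagate; the fallback copy is illustrative:

@DeferredProperty(title: "Detailed Instructions")
var instructions: String {
    get async throws {
        do {
            return try await loadInstructionsFromBackend(id: id)
        } catch {
            // Fail gracefully: a readable fallback beats surfacing a raw error.
            return "Instructions are temporarily unavailable."
        }
    }
}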
Does interactive snippet content support animations?
Yes, with the standard SwiftUI animation primitives. The snippet’s view runs in the system’s response surface, which supports the same animation infrastructure as in-app SwiftUI. Reach for the cluster’s Symbol Effects post for the animation vocabulary that matches platform conventions.
How do entity view annotations interact with widgets?
Widgets are a separate WidgetKit surface; entity view annotations apply within App Intents contexts (Spotlight, Visual Intelligence, Siri responses). For an app that exposes the same data as both an AppEntity and a widget, the two surfaces require separate UI declarations. Apps usually share the underlying data model and write thin presentation views per surface.
What’s the relationship between this and MCP tools?
App Intents are Apple Intelligence’s intent surface; MCP tools are the agent-server protocol for general LLMs. iOS 26 adds Visual Intelligence to App Intents (Apple’s surface). For non-Apple agents (Claude, GPT-class), MCP tools running locally or remotely cover the same conceptual territory. The cluster’s App Intents vs MCP Tools post covers the routing question.
Can IntentValueQuery be combined with EntityQuery?
Yes. They serve different surfaces: EntityQuery is for when the user types or speaks an entity name (Spotlight, Siri); IntentValueQuery is for when the user is in Visual Intelligence with image context. An app’s AppEntity can have both queries registered; the system picks the right one for the context.
References
1. Apple Developer Documentation: App Intents. The framework reference covering AppIntent, AppEntity, queries, parameters, and the iOS 26 additions.
2. Apple Developer: Explore new advances in App Intents (WWDC 2025 session 275). The introduction of IntentValueQuery, SemanticContentDescriptor, and Visual Intelligence integration.
3. Apple Developer Documentation: @DeferredProperty. The async-computed entity property macro introduced for App Entities in iOS 26. Coverage in WWDC 2025 session 275 (Explore new advances in App Intents).
4. Apple Developer Documentation: Interactive App Intents Snippets via the ShowsSnippetView result type combined with SwiftUI views that render in system contexts (Spotlight, Visual Intelligence, Siri response area).
5. Apple Developer Documentation: DisplayRepresentation and entity view annotations. The mechanism for declaring how an AppEntity renders in different system contexts.