Genmoji And NSAdaptiveImageGlyph: How Apps Display User-Generated Inline Emoji
Genmoji is the user-facing name for Apple Intelligence’s custom-emoji feature: the user types a description, the system generates an inline emoji-like image, and the result appears in the text alongside Unicode emoji. The developer-facing surface is NSAdaptiveImageGlyph, an iOS 18+ class that represents adaptive inline images in attributed text [1]. Genmoji is one source of NSAdaptiveImageGlyph instances; apps that handle attributed text need to support the class to display Genmoji (and any future adaptive-image-glyph content Apple introduces).
This post walks through the API alongside Apple’s documentation. The frame is “what an existing text-handling app must do to display Genmoji correctly,” because most apps that take user-entered text in a UITextView need to opt into adaptive image glyphs before the user’s Genmoji render at all, and the persistence side has serialization implications that are easy to miss.
TL;DR
- NSAdaptiveImageGlyph (iOS 18+) is a data type that wraps an adaptive image plus identifying metadata. Genmoji input from the system keyboard arrives as NSAdaptiveImageGlyph instances embedded in attributed text [2].
- supportsAdaptiveImageGlyph is declared on the UITextInput protocol; UITextView conforms, so the property is settable as textView.supportsAdaptiveImageGlyph = true. The default is false; without the opt-in, Genmoji typed by the user fails to render.
- Adaptive image glyphs require TextKit 2. Apps still on TextKit 1 do not render NSAdaptiveImageGlyph correctly. New apps targeting iOS 18+ should default to TextKit 2.
- NSAttributedString carries adaptive image glyphs through the NSAttributedString.Key.adaptiveImageGlyph attribute. The initializer NSAttributedString(adaptiveImageGlyph:attributes:) constructs an attributed string containing a single glyph [3].
- Persistence and round-tripping require care. Plain-text storage strips Genmoji entirely; rich-text formats (RTFD, Markdown with extensions, HTML with embedded image data) preserve them.
What NSAdaptiveImageGlyph Carries
An NSAdaptiveImageGlyph is a data wrapper with four identifying properties [2]:
- imageContent: Data. The image data itself, in the format declared by contentType.
- contentIdentifier: String. A unique identifier for the glyph instance, used for deduplication and for the system’s internal caching.
- contentDescription: String. Alt text describing the glyph. Apps that surface accessibility labels or that send glyphs to non-glyph-supporting recipients use this.
- contentType: UTType. A class-level type property exposing the image format Apple uses for adaptive glyphs (a HEIC variant). Apps that serialize check this to drive format-aware handling.
The data is typically tens of kilobytes for a standard Genmoji. Multiple sizes are encoded in the same image file using HEIC’s adaptive-image features; the system picks the right size based on rendering context.
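As a minimal sketch of what the wrapper exposes, the helper below logs a glyph’s identifying properties (describe is a name invented here for illustration; the glyph itself would typically arrive via the system keyboard or a paste):

```swift
import UIKit
import UniformTypeIdentifiers

// Hypothetical helper: log an adaptive image glyph's identifying metadata.
func describe(_ glyph: NSAdaptiveImageGlyph) {
    print("identifier:  \(glyph.contentIdentifier)")          // unique per instance
    print("alt text:    \(glyph.contentDescription)")         // accessibility / fallback
    print("image bytes: \(glyph.imageContent.count)")         // typically tens of KB
    print("format:      \(NSAdaptiveImageGlyph.contentType.identifier)") // class-level UTType
}
```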
Enabling Adaptive Image Glyphs In UITextView
The opt-in is a single property [1]:
import UIKit
let textView = UITextView()
textView.supportsAdaptiveImageGlyph = true
// Also requires TextKit 2 (default on UITextView for iOS 16+
// when constructed via Interface Builder or modern initializer)
Without supportsAdaptiveImageGlyph = true, Genmoji typed by the user appears as a placeholder character (the system can’t render the glyph). Setting the property enables both rendering and the system keyboard’s “Genmoji” tab so the user can create custom Genmoji within the text view.
SwiftUI’s native TextField and TextEditor do not currently expose a supportsAdaptiveImageGlyph modifier. SwiftUI apps that need adaptive image glyph rendering wrap UITextView in a UIViewRepresentable and set supportsAdaptiveImageGlyph = true on the underlying view. Community wrappers like GlyphMeThat provide this bridge ready-made.
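Under that assumption, a minimal UIViewRepresentable bridge might look like the following sketch (GenmojiTextView and its binding are names invented here, not an Apple API):

```swift
import SwiftUI
import UIKit

// Sketch of a SwiftUI wrapper that enables adaptive image glyphs
// on an underlying UITextView.
struct GenmojiTextView: UIViewRepresentable {
    @Binding var text: NSAttributedString

    func makeUIView(context: Context) -> UITextView {
        let textView = UITextView()
        textView.supportsAdaptiveImageGlyph = true  // the Genmoji opt-in
        textView.delegate = context.coordinator
        return textView
    }

    func updateUIView(_ uiView: UITextView, context: Context) {
        uiView.attributedText = text
    }

    func makeCoordinator() -> Coordinator { Coordinator(self) }

    final class Coordinator: NSObject, UITextViewDelegate {
        var parent: GenmojiTextView
        init(_ parent: GenmojiTextView) { self.parent = parent }
        func textViewDidChange(_ textView: UITextView) {
            // Propagate edits (including typed Genmoji) back to SwiftUI.
            parent.text = textView.attributedText
        }
    }
}
```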
TextKit 2 Is Load-Bearing
NSAdaptiveImageGlyph requires TextKit 2’s layout architecture [4]. TextKit 1 (the legacy text engine that shipped with the original NSTextStorage/NSLayoutManager/NSTextContainer model) does not render adaptive image glyphs correctly; the glyph appears as a generic placeholder or fails to lay out at all.
Apps fall into one of three states:
New apps on iOS 18+. Default to TextKit 2. UITextView initialized through Interface Builder or init(frame:textContainer:) uses TextKit 2 by default on iOS 16+. New code gets it for free.
Legacy apps still using TextKit 1. A migration is required. The TextKit 2 migration is non-trivial for apps that subclass NSLayoutManager, override layout-related delegate methods, or use the older NSTextStorage directly. Apple’s TextKit migration guide covers the path; for apps with simple UITextView usage, the migration is mostly automatic.
Hybrid apps. Some apps embed WKWebView for HTML editing alongside UITextView for plain editing. WKWebView handles adaptive image glyphs through its own rendering path (not TextKit), so a hybrid app may have one surface that supports Genmoji and one that doesn’t. Document the behavior; users notice when one editor supports custom emoji and the other strips them.
Integration With NSAttributedString
Adaptive image glyphs flow through attributed strings via the NSAttributedString.Key.adaptiveImageGlyph attribute [3]:
import UIKit
// Construct an attributed string containing a single adaptive image glyph
let glyph: NSAdaptiveImageGlyph = ...
let attrString = NSAttributedString(
    adaptiveImageGlyph: glyph,
    attributes: [
        .font: UIFont.systemFont(ofSize: 17)
    ]
)
// Concatenate with surrounding text
let composed = NSMutableAttributedString(string: "Look at this ")
composed.append(attrString)
composed.append(NSAttributedString(string: " I just made!"))
The pattern composes: a glyph inside text inside more text. The system handles layout (including the glyph’s adaptive sizing for the surrounding text’s font) automatically.
For reading, iterating an NSAttributedString’s .adaptiveImageGlyph attribute returns NSAdaptiveImageGlyph instances at the positions where they appear:
attributedString.enumerateAttribute(
    .adaptiveImageGlyph,
    in: NSRange(location: 0, length: attributedString.length)
) { value, range, _ in
    if let glyph = value as? NSAdaptiveImageGlyph {
        // process glyph + range
    }
}
Apps that filter, transform, or persist text use this enumeration to find glyphs and decide what to do with them.
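One common transform is downgrading to plain text while substituting each glyph’s contentDescription as fallback text. A sketch (plainTextFallback is a hypothetical helper, not an Apple API):

```swift
import UIKit

// Sketch: replace each adaptive image glyph with its alt text,
// producing a plain-text fallback for receivers that can't render glyphs.
func plainTextFallback(for source: NSAttributedString) -> String {
    let result = NSMutableAttributedString(attributedString: source)
    // Enumerate in reverse so earlier ranges stay valid as we mutate.
    result.enumerateAttribute(
        .adaptiveImageGlyph,
        in: NSRange(location: 0, length: result.length),
        options: [.reverse]
    ) { value, range, _ in
        guard let glyph = value as? NSAdaptiveImageGlyph else { return }
        result.replaceCharacters(in: range, with: glyph.contentDescription)
    }
    return result.string
}
```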
Persistence And Serialization
Plain-text storage (a String, a UTF-8 file) does not preserve adaptive image glyphs. The placeholder that represents the glyph in attributed text is serialized as U+FFFC (the object replacement character) or dropped entirely, and the actual glyph data is lost.
For round-trippable persistence, apps need a rich-text format [5]:
RTFD. Apple’s rich text + attachments format. Round-trips adaptive image glyphs. Used by Notes, Mail (when sending rich content), and TextEdit. The format is verbose (a directory bundle with attachments) but lossless.
HTML with embedded images. Web-friendly. Glyphs serialize as <img> tags with base64-encoded data URIs. Larger payloads but works across most rich-text-capable receivers.
Markdown with extensions. Standard Markdown doesn’t have an adaptive-image-glyph syntax, but extended dialects (CommonMark with attachment support, Apple’s own extended Markdown) can carry them. Document the dialect requirement for any markdown-based persistence.
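As a sketch of the RTFD round trip using NSAttributedString’s document-serialization API (the helper names are invented here; error handling is left to the caller):

```swift
import UIKit

// Sketch: round-trip an attributed string, including adaptive image
// glyphs, through flattened RTFD data.
func saveAsRTFD(_ text: NSAttributedString) throws -> Data {
    try text.data(
        from: NSRange(location: 0, length: text.length),
        documentAttributes: [.documentType: NSAttributedString.DocumentType.rtfd]
    )
}

func loadFromRTFD(_ data: Data) throws -> NSAttributedString {
    try NSAttributedString(
        data: data,
        options: [.documentType: NSAttributedString.DocumentType.rtfd],
        documentAttributes: nil
    )
}
```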
Apps that send text across the network (chat, email, social media) need to decide: preserve glyphs end-to-end (only works if both sender and receiver are iOS 18+ and the transport supports rich text), strip glyphs and substitute fallback text (contentDescription), or render the glyph as a system image and embed the image. The right choice depends on the audience and the platform.
Common Failures
Three patterns from Genmoji integration logs:
Forgetting supportsAdaptiveImageGlyph = true. The most common bug. The text view renders Unicode emoji fine, but Genmoji appears as placeholder characters. Fix: set the property to true on every UITextView (or NSTextView) that accepts user-entered text. For SwiftUI, set it on the UITextView inside a UIViewRepresentable wrapper, since no native modifier exists.
Plain-text persistence stripping glyphs. Saving the text view’s content as text (plain String) discards Genmoji. The user types a custom emoji, sees it in the text view, saves the document, reopens; the emoji is gone. Fix: persist as attributedText with a rich-text format that supports adaptive image glyphs (RTFD, HTML, custom format with attachment side-channel).
Network transmission silently dropping glyphs. A messaging app that serializes outgoing messages as plain text strips Genmoji on send. The recipient sees a placeholder character or empty space. Fix: either send rich content (and ensure the recipient supports it) or substitute the contentDescription for plain-text receivers and include the image data as a separate attachment.
What This Pattern Means For iOS 18+ Apps
Three takeaways.
- Set supportsAdaptiveImageGlyph = true on every text input. Apps that accept user-entered text should opt in to adaptive image glyphs. The single property is the difference between Genmoji rendering and Genmoji breaking.
- Migrate to TextKit 2 if you’re still on TextKit 1. TextKit 1 is in maintenance mode. New iOS-26-era features (adaptive image glyphs, Writing Tools’ inline rewrite, Liquid Glass text rendering) all assume TextKit 2. The migration cost is real, but the alternative is shipping on a deprecated text engine.
- Pick your persistence format with adaptive image glyphs in mind. RTFD for native iOS storage; HTML with embedded images for web-compatible storage; a custom binary format with an attachment side-channel for high-performance apps. Plain text is the wrong default for apps where users will type Genmoji.
The full Apple Ecosystem cluster: typed App Intents; MCP servers; the routing question; Foundation Models; the runtime vs tooling LLM distinction; three surfaces; the single source of truth pattern; Two MCP Servers; hooks for Apple development; Live Activities; the watchOS runtime; SwiftUI internals; RealityKit’s spatial mental model; SwiftData schema discipline; Liquid Glass patterns; multi-platform shipping; the platform matrix; Vision framework; Symbol Effects; Core ML inference; Writing Tools API; Swift Testing; Privacy Manifest; Accessibility as platform; SF Pro typography; visionOS spatial patterns; Speech framework; SwiftData migrations; tvOS focus engine; @Observable internals; SwiftUI Layout protocol; custom SF Symbols; AVFoundation HDR; watchOS workout lifecycle; App Intents 2.0 in iOS 26; Image Playground API; what I refuse to write about. The hub is at the Apple Ecosystem Series. For broader iOS-with-AI-agents context, see the iOS Agent Development guide.
FAQ
Does my app get Genmoji “for free” if I just use UITextView?
Not quite. The default for UITextView.supportsAdaptiveImageGlyph is false. Apps must opt in by setting the property to true. Once enabled, the system keyboard’s Genmoji tab appears for the user, and pasted Genmoji renders correctly. Without the opt-in, Genmoji typed elsewhere and pasted into the text view appears as placeholder characters.
Do I need Apple Intelligence enabled to test Genmoji?
For full Genmoji creation, yes. The user-facing Genmoji creation flow requires Apple Intelligence-capable hardware (iPhone 15 Pro and later, M-series Macs) with iOS 18+ and Apple Intelligence enabled. For development testing of NSAdaptiveImageGlyph rendering, you can construct test glyph instances programmatically with sample image data and verify the text view’s rendering on any iOS 18+ device or simulator.
What happens to a Genmoji when I send it to someone using iOS 17?
Without rich-text transport that preserves the glyph, the recipient sees the contentDescription (alt text) or a placeholder character. Modern messaging frameworks (Apple’s Messages app, recent versions of mail clients) handle the fallback automatically; custom protocols need explicit handling.
Can I create NSAdaptiveImageGlyph instances programmatically?
Yes. The public initializer is init(imageContent: Data), taking pre-encoded HEIC adaptive-image data. The contentDescription, contentIdentifier, and contentType are read from the encoded data rather than passed as separate arguments; apps creating custom adaptive image glyphs prepare the HEIC payload with the metadata embedded, then construct the glyph from that data. WWDC 2024 session 10220 (“Bring expression to your app with Genmoji”) covers the full creation flow.
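A sketch of that flow for development testing, assuming a bundled sample-genmoji.heic asset (the file name is hypothetical; any pre-encoded adaptive HEIC payload would do):

```swift
import UIKit

// Sketch: construct a test glyph from pre-encoded adaptive HEIC data.
if let url = Bundle.main.url(forResource: "sample-genmoji", withExtension: "heic"),
   let data = try? Data(contentsOf: url) {
    let glyph = NSAdaptiveImageGlyph(imageContent: data)
    // Identifier, description, and type are decoded from the payload,
    // not passed as separate arguments.
    print(glyph.contentIdentifier, glyph.contentDescription)
}
```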
How does this interact with Writing Tools?
Writing Tools (covered in Writing Tools API) preserves adaptive image glyphs in its rewrite outputs. A user who selects text containing Genmoji and asks Writing Tools to rewrite gets a rewrite that preserves the Genmoji at semantically appropriate positions. Apps that participate in Writing Tools through UIWritingToolsCoordinator need to round-trip the NSAdaptiveImageGlyph instances correctly through their custom text storage.
What’s the difference between NSAdaptiveImageGlyph and NSTextAttachment?
NSTextAttachment is the older, broader attachment system for inline non-text content (images, files, custom drawings) in attributed text. NSAdaptiveImageGlyph is the iOS 18 specialization for emoji-like inline images that adapt to surrounding font characteristics. Both are attached through attributed string attributes but use different keys (.attachment vs .adaptiveImageGlyph) and different rendering paths (TextKit 1 and TextKit 2 vs TextKit 2 only). New code targeting Genmoji-style content uses NSAdaptiveImageGlyph.
References
1. Apple Developer Documentation: supportsAdaptiveImageGlyph. The opt-in property declared on the UITextInput protocol that UITextView conforms to; the same property is therefore accessible as textView.supportsAdaptiveImageGlyph.
2. Apple Developer Documentation: NSAdaptiveImageGlyph. The data type wrapping the image content, identifier, description, and content type.
3. Apple Developer Documentation: NSAttributedString.Key.adaptiveImageGlyph and NSAttributedString(adaptiveImageGlyph:attributes:). The attributed-string integration surface for adaptive image glyphs.
4. Apple Developer Documentation: TextKit 2 migration guide. The migration path from the legacy TextKit 1 layout engine to TextKit 2, required for adaptive image glyph rendering.
5. Apple Developer Documentation: NSAttributedString.DocumentType. The supported rich-text formats (RTFD, HTML, etc.) for round-tripping adaptive image glyphs through persistence.