Defer offscreen images on mobile
Mobile Image Deferral: the standard
Mobile performance is often held back by two constraints: network latency (RTT) and main-thread CPU availability. Deferring offscreen images addresses both by preventing bandwidth contention on the critical rendering path and distributing image decoding costs over the session duration.
This document details the modern, browser-native implementation of image deferral, contrasts it with obsolete JavaScript-based methodologies, and addresses the specific mechanical constraints of mobile viewports.

1. The Modern Standard: Native Deferral
The industry standard for deferring images is the native HTML loading attribute. This method offloads the prioritization logic to the browser's internal network stack, removing the need for user-space JavaScript libraries.
Implementation
For all images below the initial viewport ("the fold"), apply the loading="lazy" attribute.
<!-- Standard Deferred Image -->
<img src="product-detail.jpg"
     alt="Side view of the chassis"
     width="800"
     height="600"
     loading="lazy"
     decoding="async">

The Browser Heuristic
Native lazy loading is superior to JavaScript solutions because the browser adjusts the loading threshold based on the Effective Connection Type (ECT).
- On 4G/WiFi: The Blink engine (Chrome/Edge) employs a conservative threshold (e.g., 1250px). It assumes low latency and fetches the image only once the user has scrolled relatively close to it.
- On 3G/Slow-2G: The threshold expands significantly (e.g., 2500px). The browser initiates the request much earlier relative to the scroll position to compensate for high round-trip times, ensuring the image is ready before the user scrolls it into view.
Critical Exception: The LCP Candidate
A common performance regression occurs when developers apply loading="lazy" to the Largest Contentful Paint (LCP) element (typically the hero image). The browser then defers the request until layout confirms the image is near the viewport, pushing the LCP time back.
Correct LCP Strategy: The LCP image must be eager-loaded and prioritized. Forcing high priority with fetchpriority="high" bypasses the browser's internal heuristics and puts the image at the front of the download queue.
<!-- Hero Image: Eager and Prioritized -->
<img src="hero.jpg"
alt="Summer Collection"
width="1200"
height="800"
loading="eager"
fetchpriority="high">
2. Legacy Methodologies: Why They Failed
Before native support, developers relied on JavaScript to approximate this behavior. These methods are now considered technical debt.
The "Scroll Handler" Era (2010–2016)
Early implementations attached event listeners to the scroll event.
// OBSOLETE: Do not use
const images = document.querySelectorAll('img[data-src]');

window.addEventListener('scroll', () => {
  // Runs on every scroll event and forces synchronous layout,
  // the exact pattern criticized below.
  images.forEach(img => {
    if (img.dataset.src && img.getBoundingClientRect().top < window.innerHeight) {
      img.src = img.dataset.src;
      delete img.dataset.src; // avoid reassigning on every subsequent event
    }
  });
});
Main Thread Blocking: The scroll event fires dozens of times per second. Executing logic and calculating layout (getBoundingClientRect) during active scrolling caused frame drops (jank).
Layout Thrashing: Querying geometric properties forces the browser to synchronously recalculate the layout style, a computationally expensive operation on mobile CPUs.
The IntersectionObserver Era (2016–2019)
The IntersectionObserver API improved performance by asynchronously observing changes in element visibility.
The Implementation:
// DEPRECATED: Use native loading where possible
const observer = new IntersectionObserver((entries) => {
  entries.forEach(entry => {
    if (entry.isIntersecting) {
      const img = entry.target;
      img.src = img.dataset.src; // swap in the real source
      observer.unobserve(img);   // each image only needs to load once
    }
  });
}, { rootMargin: '200px' });     // fixed buffer, regardless of network speed

// Every deferred image must be registered explicitly
document.querySelectorAll('img[data-src]').forEach(img => observer.observe(img));
Script Dependency: It requires JavaScript execution. If the main thread is busy hydrating a framework (React/Vue), the images remain unloaded even if they are in the viewport.
Lack of Network Awareness: Unlike native loading, IntersectionObserver uses fixed margins (e.g., rootMargin: '200px'). It does not automatically expand its buffer on slow networks, leading to "white flashes" for users on poor connections.
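For completeness, here is a minimal sketch of what a script would have to do by hand to approximate that network awareness. It assumes the Network Information API (navigator.connection) is available, which is not the case in every browser, and the pixel values simply mirror the Blink thresholds cited above.
// Sketch only: manually widening the preload buffer on slow connections,
// which native loading="lazy" does automatically.
const ect = (navigator.connection && navigator.connection.effectiveType) || '4g';
const buffer = ect === '4g' ? '1250px' : '2500px'; // mirrors the thresholds above

const networkAwareObserver = new IntersectionObserver((entries, obs) => {
  entries.forEach(entry => {
    if (entry.isIntersecting) {
      const img = entry.target;
      img.src = img.dataset.src; // swap in the real source
      obs.unobserve(img);        // load each image only once
    }
  });
}, { rootMargin: `${buffer} 0px` });

document.querySelectorAll('img[data-src]').forEach(img => networkAwareObserver.observe(img));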
3. The Mobile Context: Viewport and Touch Mechanics
Mobile viewports introduce specific rendering challenges that native implementation handles more robustly than script-based solutions.
The Dynamic Viewport
On mobile browsers, the viewport height (vh) is fluid. As the user initiates a touch scroll, the URL bar and navigation controls often retract, changing the visible area size.
Legacy Scripts: Often failed to recalculate the "fold" correctly when the chrome retracted, leading to delayed loading.
Native Handling: The browser's internal layout engine tracks the visual viewport automatically, ensuring triggers fire regardless of chrome expansion or retraction.
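As a rough illustration of what a script-based loader must handle manually, the sketch below uses the visualViewport API (widely but not universally supported; the fallback and the foldThreshold variable are illustrative assumptions) to keep the fold measurement in sync as the browser chrome retracts.
// Sketch only: a script-based loader has to track the visual viewport itself.
function visibleHeight() {
  return window.visualViewport ? window.visualViewport.height : window.innerHeight;
}

let foldThreshold = visibleHeight();

// Recompute the "fold" whenever the URL bar or toolbar expands/retracts.
if (window.visualViewport) {
  window.visualViewport.addEventListener('resize', () => {
    foldThreshold = visibleHeight();
  });
}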
Decoding and CPU Throttling
A significant cost of images on mobile is decoding. Converting a JPEG into a bitmap requires CPU cycles. On a throttled mobile processor, decoding a sequence of large images can block the main thread for 50ms–100ms each, causing input latency.
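Alongside the declarative decoding="async" attribute shown earlier, the imperative counterpart is HTMLImageElement.decode(), which lets a script pay the decode cost before attaching the image. A minimal sketch follows; the file name and the #gallery selector are placeholders.
// Sketch only: decode off the interaction path, then attach the bitmap-ready image.
const img = new Image();
img.src = 'gallery-item.jpg'; // placeholder file name

img.decode()
  .then(() => document.querySelector('#gallery').appendChild(img))
  .catch(() => {
    // decode() can reject (e.g. on a malformed image); append anyway as a fallback
    document.querySelector('#gallery').appendChild(img);
  });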
The Fix: content-visibility
To further offload the CPU, we combine lazy loading with CSS Render Containment. This property tells the browser to skip layout and painting for offscreen content entirely.
.image-card {
  /* Skip rendering work until on-screen */
  content-visibility: auto;
  /* placeholder size to prevent scrollbar jumping */
  contain-intrinsic-size: auto 300px;
}

This ensures that even if an image is downloaded, the browser does not pay the layout/paint cost until the user actually scrolls to it.
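Putting the pieces together, a minimal markup sketch might look like the following, assuming the .image-card class above wraps each deferred image (the product image from Section 1 is reused here as a placeholder):
<!-- Container gets render containment; image gets native deferral -->
<div class="image-card">
  <img src="product-detail.jpg"
       alt="Side view of the chassis"
       width="800"
       height="600"
       loading="lazy"
       decoding="async">
</div>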