Based on Vercel Engineering’s production experience
Welcome to your comprehensive guide for writing performant, maintainable React applications! This document is designed specifically for developers who are new to React or want to understand the “why” behind best practices, not just the “what.”
Table of Contents
- Understanding Performance Impact
- Eliminating Waterfalls — CRITICAL
- Bundle Size Optimization — CRITICAL
- Server-Side Performance — HIGH
- Client-Side Data Fetching — MEDIUM-HIGH
- Re-render Optimization — MEDIUM
- Rendering Performance — MEDIUM
- JavaScript Performance — LOW-MEDIUM
- Advanced Patterns — LOW
Understanding Performance Impact
Before diving into specific practices, let’s understand why performance matters and how we measure it:
Why Performance Matters
- User Experience: Faster apps feel more responsive and professional
- Conversion Rates: industry studies suggest every 100ms of latency improvement can lift conversion by roughly 1%
- SEO: Google uses page speed as a ranking factor
- Mobile Users: Slower connections and devices make optimization crucial
Key Performance Metrics
- Time to Interactive (TTI): When can users actually interact with your app?
- Largest Contentful Paint (LCP): When does the main content appear?
- First Input Delay (FID): How quickly does the app respond to the first user interaction? (FID has since been replaced by Interaction to Next Paint, INP, as a Core Web Vital.)
Impact Levels Explained
- CRITICAL: Can make or break your app’s performance (2-10× improvements)
- HIGH: Significant impact on user experience
- MEDIUM: Noticeable improvements, especially in complex apps
- LOW: Nice-to-have optimizations
Eliminating Waterfalls
Impact: CRITICAL
What Are Waterfalls?
Imagine you’re building a house. You need the foundation before walls, walls before roof, roof before interior. This sequential dependency is necessary in construction, but in software, it’s often wasteful.
A “waterfall” in web development occurs when operations wait for each other unnecessarily, like this:
Request 1: Fetch user data (100ms)
Request 2: Wait for Request 1, then fetch posts (100ms)
Request 3: Wait for Request 2, then fetch comments (100ms)
Total: 300ms
Instead of:
Request 1: Fetch user data (100ms)
Request 2: Fetch posts (100ms) - starts immediately
Request 3: Fetch comments (100ms) - starts immediately
Total: 100ms (all run in parallel)
Why This Matters
Each network round-trip adds latency (typically 50-200ms). In real-world applications, this can mean the difference between a 1-second load time and a 5-second load time.
1.1 Use Promise.all() for Independent Operations
The Problem: When you have multiple independent async operations, running them sequentially wastes time.
❌ Sequential execution (3 round trips):
// This waits for each operation to complete before starting the next
const user = await fetchUser() // 100ms
const posts = await fetchPosts() // +100ms = 200ms total
const comments = await fetchComments() // +100ms = 300ms total
✅ Parallel execution (1 round trip):
// All operations start simultaneously
const [user, posts, comments] = await Promise.all([
fetchUser(), // 100ms
fetchPosts(), // 100ms (runs in parallel)
fetchComments() // 100ms (runs in parallel)
])
// Total: 100ms (not 300ms!)
When to Use This: Whenever you need multiple pieces of data that don’t depend on each other.
Real-World Example: A dashboard that needs user profile, recent notifications, and analytics data.
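The timings above can be reproduced outside React. This sketch swaps real network calls for simulated latencies (fetchUser, fetchPosts, and fetchComments here are hypothetical stand-ins, not real APIs):

```typescript
// Simulate a network call that resolves with `value` after `ms` milliseconds
function delay<T>(ms: number, value: T): Promise<T> {
  return new Promise(resolve => setTimeout(() => resolve(value), ms))
}

const fetchUser = () => delay(100, { name: 'Ada' })
const fetchPosts = () => delay(100, [{ title: 'Hello' }])
const fetchComments = () => delay(100, [{ text: 'Hi' }])

async function loadSequential() {
  const start = Date.now()
  const user = await fetchUser()         // ~100ms
  const posts = await fetchPosts()       // ~200ms cumulative
  const comments = await fetchComments() // ~300ms cumulative
  return { user, posts, comments, elapsed: Date.now() - start }
}

async function loadParallel() {
  const start = Date.now()
  const [user, posts, comments] = await Promise.all([
    fetchUser(),
    fetchPosts(),
    fetchComments(),
  ]) // all three overlap: ~100ms total
  return { user, posts, comments, elapsed: Date.now() - start }
}
```

Running both shows the parallel version finishing in roughly a third of the time, since total latency is the maximum of the three delays rather than their sum.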
1.2 Defer Await Until Needed
The Problem: Sometimes you fetch data that might not even be used due to conditional logic.
❌ Always fetches even when skipped:
async function handleRequest(userId: string, skipProcessing: boolean) {
// This always runs, even if we're going to skip processing
const userData = await fetchUserData(userId) // Wastes 100ms if skipped
if (skipProcessing) {
return { skipped: true } // userData was never used!
}
return processUserData(userData)
}
✅ Fetches only when needed:
async function handleRequest(userId: string, skipProcessing: boolean) {
// Check the condition first
if (skipProcessing) {
return { skipped: true } // Returns immediately, no network call
}
// Only fetch if we actually need it
const userData = await fetchUserData(userId)
return processUserData(userData)
}
Why This Matters: This pattern is especially useful in:
- API routes with optional parameters
- Components with conditional rendering
- Form processing with optional steps
1.3 Dependency-Based Parallelization
The Problem: Sometimes operations have partial dependencies. You might need user data for both config and profile, but config doesn’t need to wait for profile.
❌ Config waits unnecessarily:
const [user, config] = await Promise.all([
fetchUser(), // 100ms
fetchConfig() // 100ms (runs in parallel)
])
// Profile waits for BOTH user and config to complete
const profile = await fetchProfile(user.id) // +100ms = 200ms total
✅ Config and profile run in parallel:
import { all } from 'better-all'
const { user, config, profile } = await all({
async user() { return fetchUser() },
async config() { return fetchConfig() },
async profile() {
// Profile only waits for user, not config
return fetchProfile((await this.$.user).id)
}
})
// Total: 200ms — profile starts as soon as user resolves, without waiting
// for config. (In the ❌ version, a slow config would also delay profile.)
How better-all Works: It analyzes dependencies and starts each task at the earliest possible moment, creating an optimal execution schedule.
When to Use This: Complex data fetching with interdependencies, like user onboarding flows.
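If you'd rather not add a dependency, the same schedule falls out of plain promises: start the independent work immediately, chain profile off the user promise only, and await everything together. This sketch uses the same hypothetical fetchUser/fetchConfig/fetchProfile helpers with simulated latencies:

```typescript
// Simulated 100ms "network" calls standing in for real fetchers
function delay<T>(ms: number, value: T): Promise<T> {
  return new Promise(resolve => setTimeout(() => resolve(value), ms))
}
const fetchUser = () => delay(100, { id: 'u1' })
const fetchConfig = () => delay(100, { theme: 'dark' })
const fetchProfile = (id: string) => delay(100, { id, bio: 'hello' })

async function loadAll() {
  const userPromise = fetchUser()     // starts immediately
  const configPromise = fetchConfig() // starts immediately, in parallel
  // profile chains off user only — it does NOT wait for config
  const profilePromise = userPromise.then(user => fetchProfile(user.id))
  const [user, config, profile] = await Promise.all([
    userPromise,
    configPromise,
    profilePromise,
  ])
  return { user, config, profile }
  // Total: ~200ms (user → profile), with config overlapping both
}
```

The trade-off versus better-all is that you wire the dependency graph by hand, which stays readable for two or three tasks but gets noisy beyond that.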
1.4 Prevent Waterfall Chains in API Routes
The Problem: In server code, it’s easy to create sequential chains that block responses.
❌ Sequential waterfalls:
export async function GET(request: Request) {
const session = await auth() // 100ms
const config = await fetchConfig() // +100ms = 200ms (waits for auth)
const data = await fetchData(session.user.id) // +100ms = 300ms (waits for both)
return Response.json({ data, config })
}
✅ Parallel execution:
export async function GET(request: Request) {
// Start all independent operations immediately
const sessionPromise = auth()
const configPromise = fetchConfig() // Starts immediately, doesn't wait for auth
// Only wait for session when we need it
const session = await sessionPromise
// Config and data can run in parallel
const [config, data] = await Promise.all([
configPromise,
fetchData(session.user.id)
])
return Response.json({ data, config })
}
Why This Pattern Works:
- Start operations as early as possible
- Only wait when you actually need the result
- Keep independent operations truly independent
1.5 Strategic Suspense Boundaries
The Problem: When using Server Components, a single slow data fetch can block your entire page.
❌ Entire page blocked by data fetch:
async function Page() {
const data = await fetchData() // This blocks everything
return (
<div>
<div>Sidebar</div> {/* Waits for data */}
<div>Header</div> {/* Waits for data */}
<div><DataDisplay data={data} /></div>
<div>Footer</div> {/* Waits for data */}
</div>
)
}
✅ Layout renders immediately:
function Page() {
return (
<div>
<div>Sidebar</div> {/* Renders immediately */}
<div>Header</div> {/* Renders immediately */}
<div>
<Suspense fallback={<Skeleton />}>
<DataDisplay /> {/* Only this waits */}
</Suspense>
</div>
<div>Footer</div> {/* Renders immediately */}
</div>
)
}
async function DataDisplay() {
const data = await fetchData()
return <div>{data.content}</div>
}
Why This Matters:
- Users see the page structure immediately
- Only the data-heavy part shows a loading state
- Perceived performance is much better
When to Use Suspense:
- Heavy data fetching in specific components
- Optional content that can load later
- Progressive loading patterns
Bundle Size Optimization
Impact: CRITICAL
Why Bundle Size Matters
Your JavaScript bundle is like luggage for a trip. The more you pack, the longer it takes to:
- Download (especially on slow connections)
- Parse and execute (especially on slower devices)
- Start interacting with your app
Every kilobyte matters, especially for mobile users.
2.1 Avoid Barrel File Imports
What Are Barrel Files?: A barrel file (usually index.js) re-exports everything from a library:
// node_modules/lucide-react/index.js
export { Check } from './icons/check'
export { X } from './icons/x'
export { Menu } from './icons/menu'
// ... potentially thousands more exports
The Problem: When you import from a barrel file, the bundler might include everything, even if you only use one icon.
❌ Loads 1,583 modules (~2.8s dev, 200-800ms runtime):
import { Check, X, Menu } from 'lucide-react'
import { Button, TextField } from '@mui/material'
✅ Loads only what you use (~2KB):
import Check from 'lucide-react/dist/esm/icons/check'
import X from 'lucide-react/dist/esm/icons/x'
import Menu from 'lucide-react/dist/esm/icons/menu'
import Button from '@mui/material/Button'
import TextField from '@mui/material/TextField'
Why This Happens:
- Barrel files can have thousands of re-exports
- Bundlers struggle to tree-shake from large re-export lists
- Direct imports are unambiguous
Next.js 13.5+ Alternative: If you’re using Next.js 13.5+, you can configure automatic optimization:
// next.config.js
module.exports = {
experimental: {
optimizePackageImports: ['lucide-react', '@mui/material']
}
}
Libraries Affected: lucide-react, @mui/material, @headlessui/react, @radix-ui/react-*, lodash, ramda, date-fns, rxjs, react-use.
2.2 Dynamic Imports for Heavy Components
The Problem: Some components are large but not needed immediately (like a code editor that only appears when clicked).
❌ Monaco bundles with main chunk (~300KB):
import { MonacoEditor } from './monaco-editor'
function CodePanel({ code }: { code: string }) {
return <MonacoEditor value={code} />
}
✅ Monaco loads on demand:
import dynamic from 'next/dynamic'
const MonacoEditor = dynamic(
() => import('./monaco-editor').then(m => m.MonacoEditor),
{ ssr: false } // Don't try to render on server
)
function CodePanel({ code }: { code: string }) {
return <MonacoEditor value={code} />
}
How This Works:
- The main bundle doesn’t include MonacoEditor
- When the component renders, it fetches a separate chunk
- Shows a loading state while downloading
When to Use Dynamic Imports:
- Code editors (Monaco, CodeMirror)
- Chart libraries (Chart.js, D3)
- Heavy form components
- Modal/Dialog content
2.3 Defer Non-Critical Third-Party Libraries
The Problem: Analytics, error tracking, and other third-party scripts don’t need to block your app from loading.
❌ Blocks initial bundle:
import { Analytics } from '@vercel/analytics/react'
export default function RootLayout({ children }) {
return (
<html>
<body>
{children}
<Analytics /> {/* This must load before page is interactive */}
</body>
</html>
)
}
✅ Loads after hydration:
import dynamic from 'next/dynamic'
const Analytics = dynamic(
() => import('@vercel/analytics/react').then(m => m.Analytics),
{ ssr: false }
)
export default function RootLayout({ children }) {
return (
<html>
<body>
{children}
<Analytics /> {/* Loads after page is interactive */}
</body>
</html>
)
}
Why This Matters:
- Users can interact with your app immediately
- Analytics loads in the background
- Better Time to Interactive (TTI)
Common Libraries to Defer:
- Analytics (Google Analytics, Vercel Analytics)
- Error tracking (Sentry)
- Customer support widgets (Intercom, Zendesk)
- A/B testing tools
2.4 Conditional Module Loading
The Problem: Sometimes you have large features that only some users use (like an animation player).
function AnimationPlayer({ enabled, setEnabled }) {
const [frames, setFrames] = useState<Frame[] | null>(null)
useEffect(() => {
// Only load if feature is enabled AND not already loaded
if (enabled && !frames && typeof window !== 'undefined') {
import('./animation-frames.js')
.then(mod => setFrames(mod.frames))
.catch(() => setEnabled(false)) // Handle load errors
}
}, [enabled, frames, setEnabled])
if (!frames) return <Skeleton />
return <Canvas frames={frames} />
}
Key Points:
- typeof window !== 'undefined' prevents SSR issues
- Only loads when the feature is actually used
- Handles loading errors gracefully
2.5 Preload Based on User Intent
The Problem: You want to load heavy components just before the user needs them, reducing perceived latency.
function EditorButton({ onClick }) {
const preload = () => {
if (typeof window !== 'undefined') {
// Start downloading the editor bundle
void import('./monaco-editor')
}
}
return (
<button
onMouseEnter={preload} // Preload on hover
onFocus={preload} // Preload on focus (keyboard navigation)
onClick={onClick}
>
Open Editor
</button>
)
}
Why This Works:
- Users typically hover or focus before clicking
- By the time they click, the bundle might already be loaded
- Reduces perceived loading time
When to Preload:
- On hover/focus for buttons that open heavy components
- When scrolling near a heavy component
- Based on user behavior patterns
Server-Side Performance
Impact: HIGH
Why Server Performance Matters
Server-side rendering (SSR) and Server Components can make your app feel instant, but only if the server is fast. Slow server responses defeat the purpose of SSR.
3.1 Parallel Data Fetching with Component Composition
The Problem: React Server Components execute sequentially within a component tree, creating potential bottlenecks.
❌ Sidebar waits for Page’s fetch:
export default async function Page() {
const header = await fetchHeader() // 100ms
return (
<div>
<div>{header}</div>
<Sidebar /> {/* Sidebar's fetch starts AFTER header completes */}
</div>
)
}
async function Sidebar() {
const items = await fetchSidebarItems() // +100ms = 200ms total
return <nav>{items.map(renderItem)}</nav>
}
✅ Both fetch simultaneously:
async function Header() {
const data = await fetchHeader() // 100ms
return <div>{data}</div>
}
async function Sidebar() {
const items = await fetchSidebarItems() // 100ms (runs in parallel)
return <nav>{items.map(renderItem)}</nav>
}
export default function Page() {
return (
<div>
<Header /> {/* Starts fetching immediately */}
<Sidebar /> {/* Starts fetching immediately */}
</div>
)
}
Why This Works:
- React can start rendering both components simultaneously
- Each component’s data fetch starts immediately
- Total time is determined by the slowest fetch, not the sum
3.2 Minimize Serialization at RSC Boundaries
The Problem: When passing data from Server Components to Client Components, React serializes all object properties, even unused ones.
❌ Serializes all 50 fields:
async function Page() {
const user = await fetchUser() // Fetches 50 fields
return <Profile user={user} /> // Serializes all 50 fields
}
'use client'
function Profile({ user }: { user: User }) {
return <div>{user.name}</div> // Only uses 1 field!
}
✅ Serializes only 1 field:
async function Page() {
const user = await fetchUser()
return <Profile name={user.name} /> // Only serializes the name field
}
'use client'
function Profile({ name }: { name: string }) {
return <div>{name}</div>
}
Why This Matters:
- Less data to transfer over the network
- Faster serialization/deserialization
- Reduced memory usage on client
When to Be Careful:
- Large objects with many fields
- Nested objects with deep structures
- Arrays of complex objects
3.3 Per-Request Deduplication with React.cache()
The Problem: Within a single request, you might need the same data in multiple components.
import { cache } from 'react'
export const getCurrentUser = cache(async () => {
const session = await auth()
if (!session?.user?.id) return null
return await db.user.findUnique({
where: { id: session.user.id }
})
})
How This Works:
- First call executes the query
- Subsequent calls return cached result
- Cache is per-request (automatically cleared)
⚠️ Avoid inline objects:
React.cache() uses shallow equality. Inline objects create new references.
❌ Always cache miss:
const getUser = cache(async (params: { uid: number }) => {
return await db.user.findUnique({ where: { id: params.uid } })
})
getUser({ uid: 1 }) // First call - executes query
getUser({ uid: 1 }) // Cache miss! New object reference
✅ Cache hit:
const getUser = cache(async (uid: number) => {
return await db.user.findUnique({ where: { id: uid } })
})
getUser(1) // First call - executes query
getUser(1) // Cache hit! Same primitive value
Next.js Note: In Next.js, fetch is automatically memoized. Use React.cache() for:
- Database queries
- Authentication checks
- File system operations
- Other non-fetch async work
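The shallow-equality pitfall is easy to reproduce with a hand-rolled memoizer. This is a toy stand-in for React.cache(), not its implementation — just enough to show why object arguments always miss:

```typescript
// Toy single-argument memoizer keyed by a Map, which compares keys by
// reference for objects and by value for primitives — the same reason
// React.cache() misses on fresh inline objects.
function memoize<A, R>(fn: (arg: A) => R): (arg: A) => R {
  const store = new Map<A, R>()
  return (arg: A) => {
    if (store.has(arg)) return store.get(arg)!
    const result = fn(arg)
    store.set(arg, result)
    return result
  }
}

let executions = 0
const getUserByObj = memoize((params: { uid: number }) => {
  executions++ // counts how many times the "query" actually runs
  return { id: params.uid }
})
const getUserById = memoize((uid: number) => {
  executions++
  return { id: uid }
})

getUserByObj({ uid: 1 })
getUserByObj({ uid: 1 }) // MISS: two distinct object references
const afterObjects = executions // 2 executions

getUserById(1)
getUserById(1) // HIT: same primitive value
const afterPrimitives = executions - afterObjects // 1 execution
```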
3.4 Cross-Request LRU Caching
The Problem: React.cache() only works within one request. For data shared across sequential requests, you need persistent caching.
import { LRUCache } from 'lru-cache'
const cache = new LRUCache<string, any>({
max: 1000, // Maximum items to cache
ttl: 5 * 60 * 1000 // Time to live: 5 minutes
})
export async function getUser(id: string) {
const cached = cache.get(id)
if (cached) return cached
const user = await db.user.findUnique({ where: { id } })
cache.set(id, user)
return user
}
// Request 1: DB query, result cached
// Request 2: Cache hit, no DB query
Why LRU Cache:
- Limits memory usage (max items)
- Automatically expires old entries (TTL)
- Fast lookups (O(1) complexity)
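The LRU + TTL behavior can be sketched with a plain Map, which preserves insertion order and so gives O(1) recency tracking. This is a teaching sketch, not a substitute for the lru-cache package above:

```typescript
// Minimal LRU + TTL cache built on Map's insertion-order guarantee.
class TinyLRU<V> {
  private store = new Map<string, { value: V; expires: number }>()
  constructor(private max: number, private ttl: number) {}

  get(key: string): V | undefined {
    const entry = this.store.get(key)
    if (!entry) return undefined
    if (Date.now() > entry.expires) {
      this.store.delete(key) // expired: drop and report a miss
      return undefined
    }
    // Re-insert to mark as most recently used
    this.store.delete(key)
    this.store.set(key, entry)
    return entry.value
  }

  set(key: string, value: V): void {
    if (this.store.has(key)) this.store.delete(key)
    else if (this.store.size >= this.max) {
      // Evict the least recently used entry (the Map's first key)
      const oldest = this.store.keys().next().value!
      this.store.delete(oldest)
    }
    this.store.set(key, { value, expires: Date.now() + this.ttl })
  }
}
```

Reading an entry re-inserts it at the back of the Map, so the front is always the least recently used key and eviction is a single delete.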
When to Use This:
- User data that doesn’t change often
- Configuration data
- Reference data (categories, tags, etc.)
With Vercel’s Fluid Compute: Multiple concurrent requests can share the same cache. In traditional serverless, consider Redis for cross-process caching.
3.5 Use after() for Non-Blocking Operations
The Problem: Some operations (like logging) don’t need to block the response.
❌ Blocks response:
import { logUserAction } from '@/app/utils'
export async function POST(request: Request) {
await updateDatabase(request)
// This blocks the response from being sent
const userAgent = request.headers.get('user-agent') || 'unknown'
await logUserAction({ userAgent })
return Response.json({ status: 'success' })
}
✅ Non-blocking:
import { after } from 'next/server'
import { headers, cookies } from 'next/headers'
export async function POST(request: Request) {
await updateDatabase(request)
// Schedule work to run AFTER response is sent
after(async () => {
const userAgent = (await headers()).get('user-agent') || 'unknown'
const sessionCookie = (await cookies()).get('session-id')?.value || 'anonymous'
logUserAction({ sessionCookie, userAgent })
})
// Response sent immediately
return Response.json({ status: 'success' })
}
Common Uses for after():
- Analytics tracking
- Audit logging
- Sending notifications
- Cache invalidation
- Background cleanup
Why This Matters:
- Faster response times for users
- Better server throughput
- Non-critical work doesn’t block critical work
Client-Side Data Fetching
Impact: MEDIUM-HIGH
Why Client-Side Fetching Matters
Even with SSR, most apps need to fetch additional data on the client side. How you handle this can make or break the user experience.
4.1 Use SWR for Automatic Deduplication
The Problem: Multiple components might need the same data, causing duplicate network requests.
❌ No deduplication:
function UserList() {
const [users, setUsers] = useState([])
useEffect(() => {
fetch('/api/users')
.then(r => r.json())
.then(setUsers)
}, [])
}
// If 3 components use this pattern, you get 3 network requests!
✅ Multiple instances share one request:
import useSWR from 'swr'
function UserList() {
const { data: users } = useSWR('/api/users', fetcher)
}
// 100 components using this = 1 network request
SWR Benefits:
- Automatic request deduplication
- Built-in caching
- Automatic revalidation
- Error handling
- Loading states
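The deduplication at SWR's core boils down to sharing one in-flight promise per key. A framework-free sketch of the idea (dedupe and the counting fetcher below are illustrative, not SWR's API):

```typescript
// Share a single in-flight promise per key: concurrent callers for the
// same key reuse it instead of issuing a duplicate request.
const inflight = new Map<string, Promise<unknown>>()

function dedupe<T>(key: string, fetcher: () => Promise<T>): Promise<T> {
  const existing = inflight.get(key)
  if (existing) return existing as Promise<T>
  const promise = fetcher().finally(() => inflight.delete(key))
  inflight.set(key, promise)
  return promise
}

// Fake fetcher that counts how many real "requests" happen
let requests = 0
const fetchUsers = () =>
  new Promise<string[]>(resolve => {
    requests++
    setTimeout(() => resolve(['ada', 'grace']), 50)
  })

// Three "components" asking for the same key at the same time:
const results = await Promise.all([
  dedupe('/api/users', fetchUsers),
  dedupe('/api/users', fetchUsers),
  dedupe('/api/users', fetchUsers),
])
```

All three callers resolve with the same data, but only one request is made; once the promise settles, the key is cleared so a later call can revalidate.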
For immutable data:
import useSWRImmutable from 'swr/immutable'
function StaticContent() {
const { data } = useSWRImmutable('/api/config', fetcher)
// Won't revalidate automatically
}
For mutations:
import useSWRMutation from 'swr/mutation'
function UpdateButton() {
const { trigger } = useSWRMutation('/api/user', updateUser)
return <button onClick={() => trigger()}>Update</button>
}
4.2 Deduplicate Global Event Listeners
The Problem: Multiple instances of a component might each add their own event listeners, wasting memory.
❌ N instances = N listeners:
function useKeyboardShortcut(key: string, callback: () => void) {
useEffect(() => {
const handler = (e: KeyboardEvent) => {
if (e.metaKey && e.key === key) {
callback()
}
}
window.addEventListener('keydown', handler)
return () => window.removeEventListener('keydown', handler)
}, [key, callback])
}
// 10 components = 10 event listeners!
✅ N instances = 1 listener:
import useSWRSubscription from 'swr/subscription'
// Global registry of callbacks
const keyCallbacks = new Map<string, Set<() => void>>()
function useKeyboardShortcut(key: string, callback: () => void) {
useEffect(() => {
// Register this callback
if (!keyCallbacks.has(key)) {
keyCallbacks.set(key, new Set())
}
keyCallbacks.get(key)!.add(callback)
// Cleanup on unmount
return () => {
const set = keyCallbacks.get(key)
if (set) {
set.delete(callback)
if (set.size === 0) {
keyCallbacks.delete(key)
}
}
}
}, [key, callback])
// Single global listener
useSWRSubscription('global-keydown', () => {
const handler = (e: KeyboardEvent) => {
if (e.metaKey && keyCallbacks.has(e.key)) {
keyCallbacks.get(e.key)!.forEach(cb => cb())
}
}
window.addEventListener('keydown', handler)
return () => window.removeEventListener('keydown', handler)
})
}
Why This Pattern Works:
- One event listener regardless of component count
- Centralized callback management
- Automatic cleanup when components unmount
4.3 Use Passive Event Listeners for Scrolling Performance
The Problem: Event listeners can block scrolling, making your app feel sluggish.
❌ Scroll delay:
useEffect(() => {
const handleWheel = (e: WheelEvent) => console.log(e.deltaY)
document.addEventListener('wheel', handleWheel)
return () => document.removeEventListener('wheel', handleWheel)
}, [])
✅ No scroll delay:
useEffect(() => {
const handleWheel = (e: WheelEvent) => console.log(e.deltaY)
document.addEventListener('wheel', handleWheel, { passive: true })
return () => document.removeEventListener('wheel', handleWheel)
}, [])
What { passive: true } Does:
- Tells the browser you won’t call preventDefault()
- Browser can scroll immediately without waiting for your handler
- Improves scrolling performance significantly
Use passive when:
- Tracking/analytics
- Logging
- Any handler that doesn’t need to prevent default behavior
Don’t use passive when:
- Custom swipe gestures
- Custom zoom controls
- Anything that calls preventDefault()
4.4 Version and Minimize localStorage Data
The Problem: localStorage can cause issues if not managed properly.
❌ No version, stores everything:
localStorage.setItem('userConfig', JSON.stringify(fullUserObject))
const data = localStorage.getItem('userConfig')
✅ Versioned, minimal:
const VERSION = 'v2'
function saveConfig(config: { theme: string; language: string }) {
try {
localStorage.setItem(`userConfig:${VERSION}`, JSON.stringify(config))
} catch {
// Handle errors (incognito mode, quota exceeded, disabled)
}
}
function loadConfig() {
try {
const data = localStorage.getItem(`userConfig:${VERSION}`)
return data ? JSON.parse(data) : null
} catch {
return null
}
}
// Migration from v1 to v2
function migrate() {
try {
const v1 = localStorage.getItem('userConfig:v1')
if (v1) {
const old = JSON.parse(v1)
saveConfig({
theme: old.darkMode ? 'dark' : 'light',
language: old.lang
})
localStorage.removeItem('userConfig:v1')
}
} catch {
// Migration failed, but that's ok
}
}
Why This Matters:
- Schema Evolution: Version prefixes allow safe data migrations
- Reduced Storage: Only store what you need
- Error Handling: localStorage can fail (incognito, quota exceeded)
- Security: Avoid storing sensitive data (tokens, PII)
Re-render Optimization
Impact: MEDIUM
Why Re-renders Matter
Every re-render costs CPU time and memory. Unnecessary re-renders can make your app feel sluggish, especially on slower devices.
5.1 Extract to Memoized Components
The Problem: Expensive computations run even when they won’t be used.
❌ Computes avatar even when loading:
function Profile({ user, loading }) {
const avatar = useMemo(() => {
const id = computeAvatarId(user) // Expensive computation
return <Avatar id={id} />
}, [user])
if (loading) return <Skeleton />
return <div>{avatar}</div>
}
✅ Skips computation when loading:
const UserAvatar = memo(function UserAvatar({ user }: { user: User }) {
const id = useMemo(() => computeAvatarId(user), [user])
return <Avatar id={id} />
})
function Profile({ user, loading }) {
if (loading) return <Skeleton />
return <div><UserAvatar user={user} /></div>
}
Why This Works:
- Component only renders when not loading
- Expensive computation only runs when needed
- memo() prevents unnecessary re-renders
Note: If using React Compiler, manual memoization is not necessary.
5.2 Use Functional setState Updates
The Problem: When updating state based on current value, you can create stale closures or unnecessary dependencies.
❌ Requires state as dependency:
function TodoList() {
const [items, setItems] = useState(initialItems)
// This callback is recreated every time items change
const addItems = useCallback((newItems: Item[]) => {
setItems([...items, ...newItems])
}, [items]) // ❌ Causes callback recreation
// This has a stale closure risk
const removeItem = useCallback((id: string) => {
setItems(items.filter(item => item.id !== id))
}, []) // ❌ Missing dependency - uses old items!
return <ItemsEditor items={items} onAdd={addItems} onRemove={removeItem} />
}
✅ Stable callbacks, no stale closures:
function TodoList() {
const [items, setItems] = useState(initialItems)
// Stable, never recreated
const addItems = useCallback((newItems: Item[]) => {
setItems(curr => [...curr, ...newItems])
}, []) // ✅ No dependencies needed
// Always uses latest state
const removeItem = useCallback((id: string) => {
setItems(curr => curr.filter(item => item.id !== id))
}, []) // ✅ Safe, no stale closures
return <ItemsEditor items={items} onAdd={addItems} onRemove={removeItem} />
}
Benefits:
- Stable callback references (no unnecessary recreations)
- No stale closures (always uses latest state)
- Fewer dependencies (simplifies dependency arrays)
- Prevents common bugs
5.3 Use Lazy State Initialization
The Problem: Expensive initial state computations run on every render.
❌ Runs on every render:
function FilteredList({ items }) {
const [searchIndex] = useState(buildSearchIndex(items)) // Runs every render!
const [query, setQuery] = useState('')
return <SearchResults index={searchIndex} query={query} />
}
✅ Runs only once:
function FilteredList({ items }) {
const [searchIndex] = useState(() => buildSearchIndex(items)) // Only once
const [query, setQuery] = useState('')
return <SearchResults index={searchIndex} query={query} />
}
When to Use Lazy Initialization:
- localStorage/sessionStorage reads
- Building indexes/maps
- DOM reads
- Heavy transformations
- Complex calculations
5.4 Defer State Reads to Usage Point
The Problem: Subscribing to dynamic state when you only read it inside callbacks.
❌ Subscribes to all changes:
function ShareButton({ chatId }) {
const searchParams = useSearchParams() // Re-renders on any param change
const handleShare = () => {
const ref = searchParams.get('ref')
shareChat(chatId, { ref })
}
return <button onClick={handleShare}>Share</button>
}
✅ Reads on demand, no subscription:
function ShareButton({ chatId }) {
const handleShare = () => {
// Only reads when clicked
const params = new URLSearchParams(window.location.search)
const ref = params.get('ref')
shareChat(chatId, { ref })
}
return <button onClick={handleShare}>Share</button>
}
Why This Matters:
- Component doesn’t re-render when URL params change
- Only reads the value when actually needed
- Better performance for dynamic values
5.5 Narrow Effect Dependencies
The Problem: Using objects as dependencies when you only need specific properties.
❌ Re-runs on any user field change:
useEffect(() => {
console.log(user.id)
}, [user]) // Re-runs if ANY user field changes
✅ Re-runs only when id changes:
useEffect(() => {
console.log(user.id)
}, [user.id]) // Only re-runs if user.id changes
For derived state:
// ❌ Runs on width=767, 766, 765...
useEffect(() => {
if (width < 768) {
enableMobileMode()
}
}, [width])
// ✅ Runs only on boolean transition
const isMobile = width < 768
useEffect(() => {
if (isMobile) {
enableMobileMode()
}
}, [isMobile])
Why This Matters:
- Fewer effect re-runs
- Better performance
- More predictable behavior
5.6 Subscribe to Derived State
The Problem: Subscribing to continuous values when you only care about state changes.
❌ Re-renders on every pixel change:
function Sidebar() {
const width = useWindowWidth() // Updates continuously during resize
const isMobile = width < 768
return <nav className={isMobile ? 'mobile' : 'desktop'} />
}
✅ Re-renders only when boolean changes:
function Sidebar() {
const isMobile = useMediaQuery('(max-width: 767px)')
return <nav className={isMobile ? 'mobile' : 'desktop'} />
}
Why This Works:
- Media query only triggers when crossing threshold
- Fewer re-renders during window resize
- More efficient for responsive layouts
5.7 Use Transitions for Non-Urgent Updates
The Problem: Frequent state updates can block the UI and make it feel sluggish.
❌ Blocks UI on every scroll:
function ScrollTracker() {
const [scrollY, setScrollY] = useState(0)
useEffect(() => {
const handler = () => setScrollY(window.scrollY)
window.addEventListener('scroll', handler, { passive: true })
return () => window.removeEventListener('scroll', handler)
}, [])
}
✅ Non-blocking updates:
import { startTransition } from 'react'
function ScrollTracker() {
const [scrollY, setScrollY] = useState(0)
useEffect(() => {
const handler = () => {
startTransition(() => setScrollY(window.scrollY))
}
window.addEventListener('scroll', handler, { passive: true })
return () => window.removeEventListener('scroll', handler)
}, [])
}
What startTransition Does:
- Marks state updates as non-urgent
- React can interrupt rendering for more urgent updates
- Prevents UI blocking during frequent updates
When to Use Transitions:
- Scroll position tracking
- Mouse position tracking
- Search input debouncing
- Animation state updates
Rendering Performance
Impact: MEDIUM
Why Rendering Performance Matters
Rendering is the process of turning your React components into pixels on screen. Efficient rendering means smoother animations, faster interactions, and better battery life.
6.1 Prevent Hydration Mismatch Without Flickering
The Problem: Client-side storage (localStorage, cookies) can cause hydration mismatches or visual flickering.
❌ Breaks SSR:
function ThemeWrapper({ children }) {
// localStorage is undefined on server
const theme = localStorage.getItem('theme') || 'light'
return <div className={theme}>{children}</div>
}
❌ Visual flickering:
function ThemeWrapper({ children }) {
const [theme, setTheme] = useState('light')
useEffect(() => {
const stored = localStorage.getItem('theme')
if (stored) setTheme(stored) // Runs after hydration - causes flash!
}, [])
return <div className={theme}>{children}</div>
}
✅ No flicker, no mismatch:
function ThemeWrapper({ children }) {
return (
<>
<div id="theme-wrapper">{children}</div>
<script
dangerouslySetInnerHTML={{
__html: `
(function() {
try {
var theme = localStorage.getItem('theme') || 'light';
var el = document.getElementById('theme-wrapper');
if (el) el.className = theme;
} catch (e) {}
})();
`,
}}
/>
</>
)
}
How This Works:
- Inline script runs synchronously before React hydrates
- Sets the correct class before first paint
- No hydration mismatch, no visual flicker
6.2 Animate SVG Wrapper Instead of SVG Element
The Problem: Many browsers don’t have hardware acceleration for CSS animations on SVG elements.
❌ Animating SVG directly:
function LoadingSpinner() {
return (
<svg
className="animate-spin"
width="24"
height="24"
viewBox="0 0 24 24"
>
<circle cx="12" cy="12" r="10" stroke="currentColor" />
</svg>
)
}
✅ Animating wrapper div:
function LoadingSpinner() {
return (
<div className="animate-spin">
<svg width="24" height="24" viewBox="0 0 24 24">
<circle cx="12" cy="12" r="10" stroke="currentColor" />
</svg>
</div>
)
}
Why This Matters:
- Div elements get hardware acceleration
- SVG elements often don’t
- Smoother animations, better performance
6.3 CSS content-visibility for Long Lists
The Problem: Long lists with thousands of items can slow down rendering.
CSS:
.message-item {
content-visibility: auto;
contain-intrinsic-size: 0 80px;
}
Usage:
function MessageList({ messages }) {
return (
<div className="overflow-y-auto h-screen">
{messages.map(msg => (
<div key={msg.id} className="message-item">
<Avatar user={msg.author} />
<div>{msg.content}</div>
</div>
))}
</div>
)
}
How content-visibility: auto Works:
- Browser skips layout/paint for off-screen elements
- Uses contain-intrinsic-size as the placeholder size
- Only renders visible items
- For 1000 messages, ~990 items are skipped (10× faster)
6.4 Hoist Static JSX Elements
The Problem: Static JSX elements get recreated on every render.
❌ Recreates element every render:
function LoadingSkeleton() {
return <div className="animate-pulse h-20 bg-gray-200" />
}
function Container({ loading }) {
return <div>{loading && <LoadingSkeleton />}</div>
}
✅ Reuses same element:
const loadingSkeleton = (
<div className="animate-pulse h-20 bg-gray-200" />
)
function Container({ loading }) {
return <div>{loading && loadingSkeleton}</div>
}
Why This Matters:
- Element is created once, not on every render
- Less garbage collection
- Slightly better performance
6.5 Use Explicit Conditional Rendering
The Problem: Using && for conditionals can render falsy values unexpectedly.
❌ Renders “0” when count is 0:
function Badge({ count }) {
return <div>{count && <span className="badge">{count}</span>}</div>
}
// count = 0 renders: <div>0</div> (unexpected!)
// count = 5 renders: <div><span class="badge">5</span></div>
✅ Renders nothing when count is 0:
function Badge({ count }) {
return (
<div>
{count > 0 ? <span className="badge">{count}</span> : null}
</div>
)
}
// count = 0 renders: <div></div> (correct!)
// count = 5 renders: <div><span class="badge">5</span></div>
Why This Happens:
- && returns the left-hand value when it is falsy
- 0 is falsy but still renders as text
- The ternary operator explicitly returns null for the falsy case
JavaScript Performance
Impact: LOW-MEDIUM
Why JavaScript Performance Matters
While React handles most rendering efficiently, your JavaScript code still matters. Algorithmic complexity and data structure choices can impact performance, especially with large datasets.
7.1 Build Index Maps for Repeated Lookups
The Problem: Using .find() repeatedly on arrays is inefficient.
❌ O(n) per lookup:
function processOrders(orders, users) {
return orders.map(order => ({
...order,
user: users.find(u => u.id === order.userId) // O(n) for each order
}))
}
// For 1000 orders × 1000 users = 1,000,000 operations!
✅ O(1) per lookup:
function processOrders(orders, users) {
const userById = new Map(users.map(u => [u.id, u]))
return orders.map(order => ({
...order,
user: userById.get(order.userId) // O(1) lookup
}))
}
// Build map: 1000 operations + 1000 lookups = 2000 operations!
Why This Matters:
- Map lookups are O(1) vs array find O(n)
- Build map once, reuse many times
- Dramatic performance improvement for large datasets
7.2 Combine Multiple Array Iterations
The Problem: Multiple array methods iterate the array multiple times.
❌ 3 iterations:
const admins = users.filter(u => u.isAdmin) // Iteration 1
const testers = users.filter(u => u.isTester) // Iteration 2
const inactive = users.filter(u => !u.isActive) // Iteration 3
✅ 1 iteration:
const admins = []
const testers = []
const inactive = []
for (const user of users) {
if (user.isAdmin) admins.push(user)
if (user.isTester) testers.push(user)
if (!user.isActive) inactive.push(user)
}
Why This Matters:
- One pass through the array instead of three
- Better performance for large arrays
- Scales better as you derive more subsets from the same array
7.3 Batch DOM CSS Changes
The Problem: Multiple style changes can cause multiple browser reflows.
❌ Multiple reflows:
function updateElementStyles(element) {
element.style.width = '100px' // Reflow 1
element.style.height = '200px' // Reflow 2
element.style.backgroundColor = 'blue' // Reflow 3
element.style.border = '1px solid black' // Reflow 4
}
✅ Single reflow with class:
.highlighted-box {
width: 100px;
height: 200px;
background-color: blue;
border: 1px solid black;
}
function updateElementStyles(element) {
element.classList.add('highlighted-box') // Single reflow
}
Why This Matters:
- Browser reflows are expensive
- Batching changes reduces reflow count
- Classes are more maintainable anyway
Advanced Patterns
Impact: LOW
8.1 Store Event Handlers in Refs
The Problem: Event handlers recreated on every render get a new identity, which defeats memoized children and retriggers effects that depend on them.
Solution: Store the handler in a ref and expose a stable wrapper that reads ref.current, so the wrapper's identity never changes.
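A minimal, framework-free sketch of the idea (the names makeStableHandler and update are illustrative, not a real API; in a React component the ref would come from useRef and the update would happen during render or in an effect):

```javascript
// Store the handler in a mutable ref; hand out a stable wrapper
// that delegates to whatever the ref currently holds.
function makeStableHandler(initialHandler) {
  const ref = { current: initialHandler }
  // `stable` keeps the same identity forever, so anything memoized
  // on it never invalidates; it always calls the latest handler.
  const stable = (...args) => ref.current(...args)
  return {
    stable,
    update(nextHandler) {
      ref.current = nextHandler
    },
  }
}

// Identity stays stable while behavior updates.
const h = makeStableHandler(() => 'first')
const before = h.stable
h.update(() => 'second')
console.log(h.stable === before) // true — same reference
console.log(h.stable())          // 'second' — latest behavior
```

The key design choice is the extra indirection: consumers hold the wrapper, not the handler itself, so updating the handler never changes what consumers see as "the" function.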
8.2 useLatest for Stable Callback Refs
The Problem: You need the latest value in a callback without causing re-renders.
import { useRef, useState, useCallback } from 'react'
function useLatest(value) {
const ref = useRef(value)
ref.current = value
return ref
}
function Component() {
const [value, setValue] = useState(0)
const valueRef = useLatest(value)
const callback = useCallback(() => {
console.log(valueRef.current) // Always gets latest value
}, []) // Stable dependency array
}
Why This Works:
- Ref updates don’t cause re-renders
- Callback always has access to latest value
- Stable reference for useEffect/useCallback
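The stale-closure problem this solves can be seen without React at all: a value copied at creation time is frozen, while a mutable ref read at call time is not. In this plain-JS sketch the ref object stands in for useRef:

```javascript
// Contrast a callback over a one-time snapshot with one that
// reads through a mutable ref at call time.
function demo() {
  let state = 0

  const snapshot = state         // copied once — like a stale closure
  const ref = { current: state } // mutable cell — like useRef(...)

  const staleCallback = () => snapshot
  const freshCallback = () => ref.current

  // "State update": the snapshot is untouched, the ref is rewritten.
  state = 42
  ref.current = state

  return { stale: staleCallback(), fresh: freshCallback() }
}

console.log(demo()) // { stale: 0, fresh: 42 }
```

useLatest is doing exactly what the ref does here: keeping one mutable cell up to date so a never-recreated callback can still see the current value.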
Summary
This guide compiles production-proven best practices from Vercel Engineering’s experience. By following these patterns, you can:
Critical Improvements (2-10× faster)
- Eliminate waterfalls by running async operations in parallel
- Reduce bundle size by importing only what you need
Significant Improvements (30-50% faster)
- Optimize server-side rendering with parallel data fetching
- Minimize unnecessary re-renders with proper memoization
Nice-to-Have Improvements (10-20% faster)
- Apply micro-optimizations for JavaScript and rendering
- Use advanced patterns for complex scenarios
Key Takeaways for Beginners
- Start with the critical stuff: Waterfalls and bundle size have the biggest impact
- Measure, don’t guess: Use browser dev tools to identify actual bottlenecks
- Think in terms of user experience: Faster apps feel more professional
- Progressive enhancement: Start with working code, then optimize
- Context matters: Not all optimizations are necessary for all apps
Learning Path
- First: Master async/await and Promise.all() for eliminating waterfalls
- Second: Learn about bundle splitting and dynamic imports
- Third: Understand React’s rendering lifecycle and memoization
- Fourth: Dive into server-side optimization techniques
- Finally: Explore advanced patterns and micro-optimizations
References
- React Documentation - Official React docs and tutorials
- Next.js Documentation - Framework-specific best practices
- How Vercel Optimized Package Imports - Deep dive on barrel imports
- SWR Data Fetching - Client-side data fetching library
- better-all for Parallelization - Advanced async patterns
- LRU Cache - Server-side caching
This guide is based on the react-best-practices skill from the vercel-labs/agent-skills repository. It’s designed to help developers understand not just what to do, but why these practices matter in real-world applications.