Shenoy Labs
Technology · 2 min read · March 14, 2026

Next.js performance patterns that actually move the needle

A deep-dive into the RSC + streaming model, partial pre-rendering, and edge execution trade-offs.

Tags: nextjs · performance · rsc · streaming

Performance is a product feature. Users do not wait — they leave.

Here are the patterns I have applied to production Next.js apps that made a measurable difference.

1. Treat the component tree as a waterfall

The most common mistake is treating React Server Components as a free pass to fetch anywhere. They are not.

// Bad — sequential fetches
async function Page() {
  const user = await getUser()
  const posts = await getPosts(user.id) // waits for user
  return <PostList posts={posts} />
}
// Good — restructure so the fetches are independent, then run them in parallel
async function Page() {
  const [user, posts] = await Promise.all([getUser(), getAllPosts()])
  return <PostList posts={posts} user={user} />
}

Every await in series adds latency. Parallelise aggressively.
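The latency win is easy to demonstrate outside React. This is a minimal timing sketch (not from the original post) where `delay` stands in for two independent 100 ms data fetches:

```typescript
// Simulate a data fetch that resolves after `ms` milliseconds.
const delay = (ms: number, value: string) =>
  new Promise<string>((resolve) => setTimeout(() => resolve(value), ms))

// Sequential: each await blocks the next → total ≈ sum of latencies.
async function sequential(): Promise<number> {
  const start = Date.now()
  await delay(100, 'user')
  await delay(100, 'posts')
  return Date.now() - start // roughly 200 ms
}

// Parallel: both requests are in flight at once → total ≈ max latency.
async function parallel(): Promise<number> {
  const start = Date.now()
  await Promise.all([delay(100, 'user'), delay(100, 'posts')])
  return Date.now() - start // roughly 100 ms
}
```

The same reasoning applies inside a Server Component: the awaits are the waterfall, not the components.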

2. Stream expensive components

Wrap slow UI in <Suspense> with a skeleton. The shell renders immediately; the slow content streams in.

import { Suspense } from 'react'
import { ArticleSkeleton } from '@/components/ui/skeleton'

export default function ArticlePage() {
  return (
    <>
      <ArticleHeader />
      <Suspense fallback={<ArticleSkeleton />}>
        <ArticleBody />
      </Suspense>
    </>
  )
}

This pattern converts a 1.8 s LCP into a 0.3 s perceived load time.

3. Partial Pre-rendering

PPR is the most significant rendering change in Next.js 15+, and it is still experimental. The static shell is served from the edge; dynamic holes are filled in via streaming.

// next.config.ts
import type { NextConfig } from 'next'

const config: NextConfig = {
  experimental: { ppr: true },
}

export default config

Combine this with <Suspense> boundaries and your static shell becomes a CDN response.
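Recent canaries also support incremental adoption: set `experimental.ppr` to `'incremental'` in the config and opt routes in one at a time with the `experimental_ppr` segment config. A sketch (the page body is illustrative):

```typescript
// app/blog/[slug]/page.tsx — opt a single route into PPR
// Requires experimental.ppr = 'incremental' in next.config.ts (Next.js canary).
export const experimental_ppr = true

export default function Page() {
  return <h1>Partially prerendered route</h1>
}
```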

4. Edge vs Node runtime trade-offs

| | Edge | Node |
|---|---|---|
| Cold start | ~0 ms | ~100 ms |
| Memory | 128 MB | 3 GB |
| Node APIs | ✗ | ✓ |
| Geo-awareness | ✓ | ✗ |

Use Edge for lightweight middleware and auth checks. Use Node for anything touching the filesystem or heavy compute.
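As a sketch of the Edge side, a route handler opts into the runtime with the `runtime` segment config. The handler below is illustrative, not from the original post; the `x-vercel-ip-country` header is a Vercel-specific assumption:

```typescript
// app/api/geo/route.ts — minimal geo-aware handler on the Edge runtime
export const runtime = 'edge'

export async function GET(request: Request): Promise<Response> {
  // On Vercel, the platform injects the caller's country as a request header.
  const country = request.headers.get('x-vercel-ip-country') ?? 'unknown'
  return Response.json({ country })
}
```

Because route handlers take a standard `Request` and return a standard `Response`, they are trivially unit-testable without booting the framework.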

5. Image discipline

Reserve priority for above-the-fold images — typically just the hero. Every <Image> below the fold is lazy-loaded automatically.

Explicitly set sizes for responsive images:

<Image
  src="/hero.jpg"
  sizes="(max-width: 768px) 100vw, 50vw"
  fill
  alt="Hero"
  priority
/>

This alone can reduce LCP by 40% on mobile.
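When several images share the same breakpoint logic, it can help to build the `sizes` string once. This is a hypothetical helper (`buildSizes` is not part of Next.js), shown only to make the attribute's structure explicit:

```typescript
// Hypothetical helper: build a `sizes` attribute from breakpoint → width pairs.
type SizeRule = { maxWidth: number; width: string }

function buildSizes(rules: SizeRule[], fallback: string): string {
  // Each rule becomes a "(max-width: Npx) W" media condition;
  // the fallback applies when no condition matches.
  const parts = rules.map((r) => `(max-width: ${r.maxWidth}px) ${r.width}`)
  return [...parts, fallback].join(', ')
}

// buildSizes([{ maxWidth: 768, width: '100vw' }], '50vw')
// → "(max-width: 768px) 100vw, 50vw"
```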


Profile first. The patterns above are high-leverage, but the highest-leverage thing is always: measure, then fix.

Written by Lakshman Shenoy

Published March 14, 2026
