I Haven't Written fetch() in a Year

Technologies
Next.js · TypeScript · React.js · React 19 · React Server Components · Server Actions · tRPC · Performance Optimization · Front-End Development

The Context

Every React codebase I've worked on in the last five years has had a fetch-shaped hole in it. You crack open the editor, and about sixty percent of the lines exist to move data from a server into a component. A quarter of the bugs live at the seam. Most of the "performance problems" everyone complains about are actually cache-coherence problems wearing a performance problem's coat.

The fix is supposed to be a library. React Query. SWR. Apollo. RTK Query. Pick one, wire it up, admire how much cleaner your components look. Except the library didn't remove the fetching layer — it just moved it into a cache with opinions. Now you have stale-while-revalidate edge cases, refetch-on-focus bugs, and a 40 KB dependency whose documentation is longer than most companies' employee handbooks. You still write useQuery. You still handle isLoading. You still render a skeleton.

RSVPed doesn't do any of that. There is no React Query. There is no SWR. There is no fetch() in application code — grep the whole repo, it doesn't exist. There is one getAPI() call on the server, one useTransition pattern for user-initiated reads, and a compile error if you try to add anything else. A year in, I've written exactly zero client-side fetches, and the app got noticeably faster, noticeably smaller, and noticeably less buggy.

This is how.

The Problem

The honest version of what client-side data fetching costs you:

  • A library. React Query is ~40 KB gzipped. SWR is smaller but covers less ground. Apollo is bigger. That's a cold, fixed cost to show your first byte of data.
  • A second library to handle request cancellation. Or you roll your own with AbortController. Or you don't, and your app has races nobody notices until production.
  • A loading state per read. isLoading, isFetching, isPending, isValidating — four booleans that mean slightly different things across libraries. Components become a nest of early-return skeletons.
  • Cache invalidation. The second-hardest problem in computer science, now yours personally. You mutate something on the server and then spend the afternoon figuring out which queryKeys to invalidate.
  • Waterfalls. Page loads. Fetches user. Then fetches user's events. Then fetches each event's RSVP count. Three round trips, rendered with three shimmer passes, over a 100 ms latency link. Your users see 900 ms. Your performance panel blames you for it.
  • Drift between server types and client types. Your API returns Event[]. Your client calls it any[]. You add a field. The client doesn't know. A bug ships. You blame TypeScript. TypeScript did not cause this.

Every React app I've inherited has had some combination of these costs. Most have all of them at once. The costs aren't imposed by React — they're imposed by the client-side fetching pattern that happened to dominate while we were figuring React out.

React Server Components don't fix this by themselves. You can have RSCs and still use React Query for "just the interactive stuff", and now you have a codebase with both patterns and all the bugs of both. The actual fix is a rule: the client does not fetch. Period. No exceptions for dashboards. No exceptions for infinite scroll. No exceptions for "just this one hover card." The moment you make an exception, the cost comes back.

RSVPed enforces the rule. Everything else in this post is what makes it possible to enforce.

The Solution

The only seam: `getAPI()`

All server-side code reaches data through one function:

// server/api/index.ts
export async function getAPI() {
  const ctx = await createTRPCContext()
  return createCaller(ctx)
}

// server/api/trpc.ts
export const createTRPCContext = async () => {
  let session: Session | null = null
  try {
    headers()         // fails outside request scope
    session = await auth()
  } catch {
    session = null    // module-level imports, etc.
  }
  return createInnerTRPCContext({ session })
}

getAPI() returns a tRPC caller bound to the current request's session. Use it anywhere server-side — RSC pages, server actions, generateMetadata, loading functions:

// A page, top-to-bottom
export default async function EventsPage({ searchParams }) {
  const api = await getAPI()
  const events = await api.event.list.core(searchParams)
  return <ProgressiveEventsList data={events} params={searchParams} />
}

Three things happen in that five-line block that are load-bearing:

  1. Auth middleware runs automatically. protectedProcedure checks session; publicProcedure doesn't. Pages never hand-write auth checks.
  2. Types flow end-to-end. RouterOutput['event']['list']['core'] is exactly the prop type of ProgressiveEventsList. Add a field to the router, the component prop updates, TypeScript complains at the consumer — no casting, no any, no DTO layer.
  3. Zero raw Prisma imports anywhere outside `server/api/routers/`. Pages, layouts, actions, middleware — none of them import Prisma. The router is the data boundary. Schema changes have exactly one place to propagate from.

That's the whole server-side read path. No client hook. No useQuery. No JSON over the wire at render time.
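
The end-to-end inference in point 2 can be modeled in plain TypeScript, without any tRPC machinery. This is a self-contained sketch, not the RSVPed code: `listCore` and `summarize` are illustrative stand-ins for the tRPC procedure and the consuming component, but the `Awaited<ReturnType<…>>` mechanism is exactly how the prop type stays glued to the procedure's return type:

```typescript
// Stand-in for the data returned by a tRPC procedure.
type Event = { slug: string; title: string; rsvpCount: number }

// Stand-in for api.event.list.core — the single source of truth for the shape.
async function listCore(): Promise<Event[]> {
  return [{ slug: "launch-party", title: "Launch Party", rsvpCount: 12 }]
}

// Derive the component's prop type from the procedure's return type,
// the way RouterOutput derives it from the router. Add a field to Event
// and EventsListProps picks it up with no casting and no DTO layer.
type EventsListProps = { data: Awaited<ReturnType<typeof listCore>> }

// Stand-in for the consuming component: it can only touch fields the
// procedure actually returns, and TypeScript flags any drift.
function summarize(props: EventsListProps): string[] {
  return props.data.map((e) => `${e.title} (${e.rsvpCount})`)
}
```

If a field is removed from `Event`, the error surfaces at `summarize`, the consumer, which is where you want it.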

Progressive streaming, not loading states

The interesting read isn't "how do you fetch on the server" — everyone knows that now. The interesting read is "how do you avoid the 900 ms wall of nothing while enhanced data loads." The answer is a pattern we use on every data-heavy page:

// ProgressiveEventPage.tsx
const EnhancedEventPage = async ({ slug }: { slug: string }) => {
  const api = await getAPI()
  const [event, similar] = await Promise.all([
    api.event.get.enhanced({ slug }),
    api.user.recommendations.similar({ eventId: slug, limit: 3 }),
  ])
  return (
    <EventPage {...event}>
      <SimilarEvents data={similar} />
    </EventPage>
  )
}

export const ProgressiveEventPage = ({ coreEvent }: ProgressiveEventPageProps) => (
  <Suspense fallback={<EventPage {...coreEvent} />}>
    <EnhancedEventPage slug={coreEvent.slug} />
  </Suspense>
)

The page fetches a .core query — title, date, slug, counts. Lightweight, near-free on Postgres. Renders immediately. Inside a <Suspense>, an async RSC calls .enhanced, which returns the full event with RSVPs, collaborators, metadata, and similar recommendations — all parallelized via Promise.all.

The browser gets real HTML in the first chunk. The enhanced HTML streams in as a second chunk. No shimmer, no waterfall, no 900 ms wall. And crucially: zero client JavaScript fetches. All of this happens inside the server's response.

The same pattern scales to lists:

<Suspense
  fallback={data.map((event) => <EventCard key={event.slug} {...event} />)}
>
  <EnhancedEventsList params={params} />
</Suspense>

Cards render with core data. Enhanced cards (with RSVP avatars, collaborator photos, engagement counts) stream in to replace them. The component itself branches on data shape via a type guard:

type EventCardData = CoreEventData | EnhancedEventData
const isEnhancedEventData = (d: EventCardData): d is EnhancedEventData =>
  'metadata' in d && d.metadata !== undefined

export const EventCard = (props: EventCardProps) => {
  const enhanced = isEnhancedEventData(props)
  const rsvps = enhanced ? props.rsvps : undefined
  const canManage = enhanced ? props.metadata.user.access.manager : undefined
  // … render
}

One component, two render modes, no parallel loading file.

The pattern for user-initiated reads: `useTransition` + server actions

Some reads legitimately happen on the client — hovering on a user card, typing in a search box. The orthodox answer is a client fetch. RSVPed's answer is a server action dispatched inside startTransition:

const [isPending, startTransition] = useTransition()
const [data, setData] = useState<HoverCardData | null>(null)

const handleMouseEnter = () => {
  if (data || isPending) return
  startTransition(async () => {
    const result = await getUserHoverCardAction(user.id)
    setData(result)
  })
}

useTransition gives you isPending for free. The "fetch" is a server action call — zero network code in the component, the action talks to getAPI() on the server, data comes back typed. No library, no cache, no invalidation story — the data is specifically transient (hover preview) and doesn't need one.

For debounced input (autocomplete, live search), the pattern is an async IIFE inside useEffect with a cancellation flag:

useEffect(() => {
  let cancelled = false
  const fetchSuggestions = async () => {
    const results = await getAutocompleteAction(debouncedQuery)
    if (!cancelled) setSuggestions(results)
  }
  fetchSuggestions()
  return () => { cancelled = true }
}, [debouncedQuery])

This is the one justified useEffect pattern in the codebase. Not useQuery, not an SWR hook, not React Query: an await inside a cleanup-protected effect. The function it calls is, again, a server action.
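
The guarantee the `cancelled` flag provides can also be expressed as a generation counter, which is easier to reason about when several in-flight requests overlap. This is a variation on the pattern above, not code from the RSVPed repo; the names are illustrative:

```typescript
// A generation counter: only the newest request may commit state.
// Each request takes an id at start; before committing, it checks
// whether a newer request has begun in the meantime.
function makeStaleGuard() {
  let generation = 0
  return {
    // Call at the start of each request; returns that request's id.
    begin: () => ++generation,
    // Call before committing a result; false means the response is stale.
    isCurrent: (id: number) => id === generation,
  }
}
```

In the effect above, `const id = guard.begin()` replaces `let cancelled = false`, and `if (guard.isCurrent(id)) setSuggestions(results)` replaces the flag check; no cleanup function is needed because stale generations simply never commit.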

Every client-side "fetch" goes through one of those two patterns. There is no third. There is no fetch() call. A lint rule could enforce it; code review does.
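
For a team that wants the guardrail in tooling rather than review, here is one hedged sketch of what that lint rule could look like, using ESLint's built-in no-restricted-globals rule in flat-config form. The file globs are assumptions about a typical Next.js layout, not RSVPed's actual config:

```javascript
// eslint.config.js — a sketch of the lint rule the post alludes to.
// Bans the global `fetch` in client-facing code; server code is exempt.
export default [
  {
    files: ["app/**/*.tsx", "components/**/*.tsx"],
    rules: {
      "no-restricted-globals": [
        "error",
        { name: "fetch", message: "The client does not fetch. Use a server action." },
      ],
    },
  },
  {
    // Server-side code reaches data through getAPI(), so the rule is off here.
    files: ["server/**/*.ts"],
    rules: { "no-restricted-globals": "off" },
  },
]
```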

The things I don't need anymore

Once you take the rule seriously, whole categories of code stop being written:

  • No client-side data-fetching library. Zero bytes of React Query, SWR, Apollo, or RTK Query in the bundle. At ~40 KB gzipped for the heaviest of them, that's a meaningful win.
  • No manual loading state. useTransition's isPending covers user-initiated; Suspense covers page-level; there is no third case.
  • No cache invalidation. Mutations are server actions; server actions call revalidatePath() or redirect(). The server invalidates. The client doesn't have a cache to invalidate.
  • No `useMemo` / `useCallback` / `React.memo`. React Compiler handles it. Drop the dependency arrays; drop the review comments about missing memoization; drop the whole category of bug where the dependency array lies.
  • No waterfall debugging. All reads on a given request are server-side, and we use Promise.all inside the async RSC. If something is slow, it's slow in a profiler, not in a flame graph of serialized fetches.

Declarative auth completes the loop

Auth checks aren't sprinkled across pages; middleware reads two arrays that declare the entire posture:

// lib/auth/config.ts
export const RouteDefs = {
  Protected: [Routes.Main.Events, Routes.Main.Communities, /* … */],
  Public: [Routes.Auth.SignIn, Routes.Static.Landing, /* … */],
}

// middleware.ts
const isProtected = RouteDefs.Protected.some((r) => matchPathSegments(pathname, r))
if (isProtected && !isLoggedIn) {
  return NextResponse.redirect(new URL(`${Routes.Auth.SignIn}?next=${encodedNext}`, nextUrl))
}

One file shows the protection surface. Matching is structural — route segments, not regex. Adding a new protected route is a one-line change.
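
The post names matchPathSegments but doesn't show it. Here is a minimal sketch of what segment-structural matching could look like, under two assumed conventions: literal segments match exactly, and a `[param]`-style segment matches any value. The implementation is illustrative, not RSVPed's:

```typescript
// Structural route matching: compare path segments, not a regex.
// "/events/launch-party" matches the protected route "/events" because
// every pattern segment lines up; "[id]"-style segments match anything.
function matchPathSegments(pathname: string, route: string): boolean {
  const path = pathname.split("/").filter(Boolean)
  const pattern = route.split("/").filter(Boolean)
  if (path.length < pattern.length) return false
  return pattern.every((seg, i) => seg.startsWith("[") || seg === path[i])
}
```

Because matching is per-segment, a route like `/events` automatically protects everything nested under it, and there is no regex to get subtly wrong.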

What I considered and rejected

  • Keeping React Query for "just the dashboard." No. The discipline is the value. An exception becomes a pattern. A pattern becomes half the codebase.
  • Fetching on the server with raw `fetch()` instead of tRPC. No. Lose the end-to-end types. Gain nothing.
  • Custom hydration strategies (progressive hydration, selective hydration). Not needed. Streaming RSC + minimal client JS already produces what hydration tricks try to produce.
  • Adding a tiny cache for the stir chat history. Ended up using localStorage directly — simpler, no dependency, fine for the scale.

The Impact

  • ~40-60% smaller initial JavaScript compared to a React Query + Axios SPA variant of the same app (measured against a spiked-out alternative build; real savings depend on how much tree-shaking your data library enables)
  • First paint contains real content on every page — not a shimmer, not an empty shell
  • Zero waterfalls on any request — all reads inside an RSC run through Promise.all
  • End-to-end types from Prisma → tRPC router → RouterOutput → component prop. Zero manual boundary typing.
  • Zero cache invalidation bugs in the entire project history — there is no cache to invalidate
  • Zero "is `isLoading` still true?" bug reports: isPending from useTransition always reflects reality because the transition is a single awaited promise

What It Cost

The honest column:

  • A team unlearning exercise. Every engineer I've onboarded has reflexively reached for useQuery at least once. Pair-programming the first form submit helps.
  • A discipline problem in PR review. It takes two or three reviews to calibrate the "this should be a server action" instinct. Once it clicks, it clicks.
  • A novel pattern for infinite scroll. React Query's useInfiniteQuery is genuinely ergonomic; the RSC-native equivalent (cursor-based server actions that append to state) is slightly more code. Worth it — still no library.
  • An edge case in `generateMetadata`. If your metadata depends on enhanced data, you block SSR on the enhanced query. We ended up using core data for metadata on most pages and accepting slightly generic social previews.
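
The cursor-based append pattern behind the infinite-scroll bullet can be sketched in plain TypeScript. The React and server-action plumbing is omitted, and the type names are illustrative:

```typescript
// One page of results from a cursor-paginated server action.
// A null nextCursor signals the final page.
type Page<T> = { items: T[]; nextCursor: string | null }

// Accumulated client state: everything loaded so far plus the cursor
// to hand to the next server-action call.
type Feed<T> = { items: T[]; cursor: string | null; done: boolean }

// Pure state transition: append a page and advance the cursor.
function appendPage<T>(feed: Feed<T>, page: Page<T>): Feed<T> {
  return {
    items: [...feed.items, ...page.items],
    cursor: page.nextCursor,
    done: page.nextCursor === null,
  }
}
```

On the client, a "load more" handler would call the server action with `feed.cursor` inside startTransition and commit with `setFeed((prev) => appendPage(prev, page))` — slightly more code than useInfiniteQuery, but zero library.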

None of it outweighed the wins. None of it was the kind of cost that grew over time.

Closing

A year ago I thought client-side data fetching was the hard part of React. What actually happened is that client-side data fetching was the React problem — the thing we kept creating for ourselves and then building libraries to manage. Remove it and half the complexity of a React app evaporates. No cache. No waterfalls. No shimmer. No isFetching. No stale data. No drift between client type and server type. The remaining complexity is actual domain complexity, which is the complexity you wanted to be solving in the first place.

The rule is simple and load-bearing: the client does not fetch. Everything else follows.