Streaming Initial Render (v0.6.1)

Phase 1 — shipped in v0.6.1. Transport-layer chunked transfer. Lazy-child streaming and true server-side render overlap (Phase 2) are tracked for v0.6.2.

djust can return a LiveView page as an HTTP/1.1 chunked-transfer response instead of a single buffered response. Phase 1's payoff is transport-level: the browser receives the shell chunk (<!DOCTYPE> + <head> + <body> open) as soon as the view has finished rendering, without waiting for the whole response to be assembled in a gzip or reverse-proxy buffer. Intermediate proxies that honor chunked encoding can relay each chunk as it arrives.

What Phase 1 does NOT do (yet): true server-side overlap — rendering the main content while the browser is parsing the shell. Today get() fully assembles the rendered HTML string before handing it to the streaming iterator, so the server-render time and the first chunk's wire time are sequential. Phase 2 (v0.6.2, lazy children) moves each {% live_render lazy=True %} child's render into a separate post-shell chunk, delivering the real concurrency win. Phase 1 lays the response-type plumbing that Phase 2 builds on.

This is the djust analog of Next.js renderToPipeableStream: opting in flips the HTTP response type from HttpResponse to StreamingHttpResponse with no other API changes. The full Next.js experience (shell-first paint during component render) arrives with Phase 2.


Quick start

from djust import LiveView

class DashboardView(LiveView):
    template_name = "dashboard.html"
    streaming_render = True   # ← opt in

    def mount(self, request, **kwargs):
        # Slow work here delays Chunk 2, but Chunk 1 has already
        # arrived at the browser — CSS is loading, fonts are warming.
        self.rows = fetch_expensive_rows()

That's it. No JS changes, no new template tags, no new URL routing — the existing path("/dashboard/", DashboardView.as_view()) just works.


How it works

When streaming_render = True, LiveView.get() splits the rendered HTML into three chunks at well-defined boundaries and yields each chunk to the wire as soon as it's ready:

| Chunk | Contents | Browser behavior |
|---|---|---|
| 1. Shell-open | Everything before <div dj-root>: <!DOCTYPE html>, <head>, <link rel="stylesheet">, <body> open, top chrome | Starts parsing <head>, fires CSS + JS downloads, paints page background |
| 2. Main content | The <div dj-root>...</div> block — the entire LiveView body | Inserts the view's DOM; djust client script runs on DOMContentLoaded |
| 3. Shell-close | </body></html> + trailing markup | Finishes document parse |

Browsers begin DOM construction the moment Chunk 1 arrives, so linked stylesheets and <script defer> tags are already in-flight while your Python code is still computing the view state.
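The three-way split can be sketched as follows. This is a simplified illustration, not djust's actual implementation: it assumes the rendered HTML contains a single <div dj-root> block, and it imitates the <script>-masking described in the caveats with a plain regex so a literal </body> inside a JavaScript string cannot create a false boundary.

```python
import re

def split_chunks(html: str):
    """Split rendered HTML into shell-open / main / shell-close chunks.

    Simplified sketch, not djust's real splitter.
    """
    # Blank out <script> bodies (same length, so indices still line up)
    # so a literal "</body>" in a JS string can't be a split boundary.
    masked = re.sub(
        r"<script\b.*?</script>",
        lambda m: " " * len(m.group()),
        html,
        flags=re.S | re.I,
    )
    open_at = masked.index("<div dj-root")
    close_at = masked.rindex("</body>")
    return [
        html[:open_at],          # Chunk 1: doctype, <head>, <body> open
        html[open_at:close_at],  # Chunk 2: the LiveView body
        html[close_at:],         # Chunk 3: </body></html> + trailing markup
    ]
```

Because the mask preserves lengths, the indices found on the masked string slice the original HTML exactly, and the three chunks always concatenate back to the full document.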

The response omits the Content-Length header (without a declared length, HTTP/1.1 servers fall back to Transfer-Encoding: chunked) and sets X-Djust-Streaming: 1 as an observability marker so you can verify the feature is active from your browser's Network panel.


When to use it

Good fit:

  • Pages where mount() or get_context_data() make slow external calls (database aggregations, REST APIs, S3 lookups, LLM calls).
  • Dashboards with large query fan-out — each row-count query adds to time-to-first-byte under the non-streaming path.
  • Public landing pages where <link rel="stylesheet"> in <head> determines Largest Contentful Paint — flushing the head early is a measurable LCP win.

Not worth it:

  • Small, fast pages where the server renders in < 50 ms. The fixed overhead of chunked transfer (extra bytes per chunk, proxy buffering risk) can exceed the benefit on sub-frame renders.
  • Pages served behind a reverse proxy that buffers responses by default (see caveats).

Caveats

  • No Content-Length. Some reverse proxies (notably default nginx with proxy_buffering on) buffer chunked responses into a single write, defeating the streaming benefit. Set proxy_buffering off; in the relevant nginx location block, or switch the proxy to HTTP/2 (which handles streaming natively).
  • Middleware that inspects the response body must be streaming-aware. Middleware reading response.content on a StreamingHttpResponse raises AttributeError: ... content. If you have custom middleware, guard body reads with isinstance(response, StreamingHttpResponse). All of djust's built-in middleware is streaming-safe as of v0.6.1.
  • CSP nonces generated by django-csp work fine — nonces are produced during template render (before any chunk is sent) and the Content-Security-Policy response header is set once on the StreamingHttpResponse, not per-chunk.
  • HTML without a <div dj-root> (edge case — raw body fragments) falls back to a single-chunk response equivalent to HttpResponse(html). Streaming is a no-op in that case.
  • Literal </body> tokens inside <style> blocks or HTML comments. The chunk-splitter masks <script>...</script> content so a literal </body> inside a JavaScript string does not create a false split boundary, but it does not currently mask <style> or <!-- ... --> blocks. If your template inlines </body> as literal string content inside <style> or an HTML comment, the split may fire at the wrong position. In practice this is extremely rare — for almost all apps it is not a concern. If your template does this legitimately, verify the streamed chunks via your browser's Network panel.
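The nginx mitigation from the first caveat, as a location-block fragment. The upstream name is hypothetical; adapt the location to your routing.

```nginx
location /dashboard/ {
    proxy_pass http://app_upstream;   # hypothetical upstream name
    proxy_http_version 1.1;           # chunked transfer needs HTTP/1.1+
    proxy_buffering off;              # relay each chunk as it arrives
}
```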
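The middleware caveat boils down to one pattern: check the response's streaming flag before reading its body. The sketch below uses stand-in classes that mimic the relevant shape of Django's HttpResponse and StreamingHttpResponse so it runs standalone; in a real project, use the django.http classes and the same .streaming check.

```python
class BufferedResponse:            # stands in for django.http.HttpResponse
    streaming = False

    def __init__(self, content: bytes):
        self.content = content

class StreamedResponse:            # stands in for StreamingHttpResponse
    streaming = True               # note: no .content attribute at all

    def __init__(self, chunks):
        self.streaming_content = iter(chunks)

def inject_banner(response):
    """Example body-rewriting middleware step, guarded for streaming."""
    if response.streaming:
        return response            # never read .content on a stream
    response.content = response.content.replace(
        b"</body>", b"<!--banner--></body>"
    )
    return response
```

The guard leaves streamed responses untouched; a variant that must rewrite streamed bodies would instead wrap response.streaming_content in a new generator.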

Comparison

| Feature | HttpResponse (default) | streaming_render = True | Next.js renderToPipeableStream |
|---|---|---|---|
| Response type | HttpResponse | StreamingHttpResponse | ReadableStream |
| Transfer encoding | Content-Length: N | Transfer-Encoding: chunked | Transfer-Encoding: chunked |
| Time-to-first-byte | After render complete | After shell-open ready (~ms) | After shell-open ready (~ms) |
| Chunks | 1 | 3 (shell / main / close) | N (per Suspense boundary) |
| Out-of-order render | No | No (Phase 1) | Yes (React Suspense) |
| Opt-in per view | n/a | streaming_render = True | <Suspense> wrapping |
| Client-side code needed | None | None | React runtime |

Phase 1 matches the first-paint win of renderToPipeableStream without the Suspense machinery. Phase 2 (planned for v0.6.2) adds out-of-order rendering via lazy-child placeholders that stream in after the main chunk.


Future work — Phase 2 (v0.6.2)

Lazy-child streaming extends this to {% live_render %} children marked lazy=True:

  1. Parent yields a <div dj-view dj-lazy> placeholder inside Chunk 2.
  2. After Chunk 3, the parent continues streaming each lazy child as a <template data-target="dj-lazy-N">...</template> + inline <script>djust.streamFill('dj-lazy-N')</script> sequence.
  3. A new client module adopts the template content into the target container — equivalent to React Suspense's streaming resolution.

This lets you ship an instant shell with placeholder UI and stream in heavy children (charts, tables, LLM output) as they become ready — closer parity with renderToPipeableStream and React Server Components. Tracked on the ROADMAP as v0.6.2 scope.
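Under those assumptions, the Phase 2 chunk sequence could look like the following sketch. Everything here is hypothetical pre-release shape — the generator, the dj-lazy-N naming, and djust.streamFill are taken from the numbered steps above, and the shipped v0.6.2 API may differ.

```python
def phase2_chunks(shell_open, main, shell_close, lazy_children):
    """Yield the three Phase 1 chunks, then one <template> + <script>
    pair per lazy child, in the order each child finishes rendering.

    lazy_children: iterable of (slot_index, rendered_child_html) pairs.
    """
    yield shell_open
    yield main         # contains the <div dj-view dj-lazy> placeholders
    yield shell_close
    for n, child_html in lazy_children:
        yield (
            f'<template data-target="dj-lazy-{n}">{child_html}</template>'
            f'<script>djust.streamFill("dj-lazy-{n}")</script>'
        )
```

Each post-shell pair is self-delimiting: the inline script fires as soon as its chunk is parsed and adopts the preceding template's content into the matching placeholder, so children can arrive in any completion order.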