The Streams API
The Streams API gives JavaScript the ability to process data chunk by chunk as it arrives, rather than waiting for an entire response to load. Before this API, you had to buffer a whole file or response (as a string, array buffer, or blob) before doing anything with it. With Streams, you can start processing the moment the first byte arrives, which matters when you’re handling large files, streaming audio/video, or building data pipelines.
Why Streams Matter
Imagine fetching a 500MB log file. Without streams, you wait for all 500MB before doing anything. With the Streams API, you can process the first chunk as soon as it arrives. For video playback, this has been standard forever — the browser streams video while playing it. The Streams API brings that same capability to JavaScript.
The benefits:
- Immediate processing — start working on data as soon as the first chunk arrives
- Lower memory usage — no need to buffer entire responses in memory
- Backpressure — automatically slows down producers when consumers can’t keep up
- Composable — pipe streams together into transformation chains
ReadableStream: Consuming Data Chunk by Chunk
The Response.body property of a fetch response is a ReadableStream. That’s the easiest way to get a stream:
const response = await fetch('https://example.com/large-file.txt');
const stream = response.body; // ReadableStream
Reading with a Reader
Attach a reader to consume chunks:
const response = await fetch('https://example.com/data.txt');
const reader = response.body.getReader();
while (true) {
const { done, value } = await reader.read();
if (done) break;
// value is a Uint8Array chunk
console.log('Received chunk:', value);
}
read() returns a promise that resolves to { done, value }:
- { done: false, value: chunk } — a chunk is available
- { done: true, value: undefined } — the stream is closed
- rejected — an error occurred in the stream
Async Iteration
ReadableStream is an async iterable, so you can use for await...of:
const response = await fetch('https://example.com/data.txt');
for await (const chunk of response.body) {
// chunk is a Uint8Array
console.log('Chunk:', chunk);
}
Cancelling a Stream
If you need to stop reading partway through:
const aborter = new AbortController();
setTimeout(() => aborter.abort(), 5000); // cancel after 5s
try {
const response = await fetch('https://example.com/large-file.txt', {
signal: aborter.signal // the signal must be passed to fetch for abort() to work
});
for await (const chunk of response.body) {
// process
}
} catch (err) {
if (err.name === 'AbortError') {
console.log('Cancelled');
}
}
Alternatively, if you are reading with a reader, await reader.cancel() stops the stream directly, without an AbortController.
WritableStream: Writing Data
WritableStream is the destination side of streams. You write chunks to it, and it sends them to the underlying sink:
const writable = new WritableStream({
write(chunk) {
console.log('Writing:', chunk);
},
close() {
console.log('Done writing');
}
});
const writer = writable.getWriter();
await writer.write('Hello');
await writer.write('World');
await writer.close();
WritableStream has built-in backpressure. If the underlying sink is slower than data arriving, write() waits until the sink is ready.
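Backpressure is easy to observe with a deliberately slow sink. A minimal sketch (the one-chunk queue and 5ms delay are arbitrary values chosen for illustration):

```javascript
// A slow sink with a one-chunk queue: writer.ready resolves only
// when the sink is able to accept another chunk.
const written = [];
const slowSink = new WritableStream(
  {
    async write(chunk) {
      await new Promise((resolve) => setTimeout(resolve, 5)); // simulate slow I/O
      written.push(chunk);
    }
  },
  new CountQueuingStrategy({ highWaterMark: 1 })
);

async function writeAll(values) {
  const writer = slowSink.getWriter();
  for (const value of values) {
    await writer.ready;  // waits here while the queue is full (backpressure)
    writer.write(value); // intentionally not awaited; writer.ready paces the loop
  }
  await writer.close(); // resolves after all queued writes finish
}

await writeAll(['a', 'b', 'c']);
console.log(written.join('')); // → 'abc'
```

Awaiting writer.ready is the idiomatic way to respect backpressure when writing manually; awaiting each write() call individually also works, but it serializes chunk delivery.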
TransformStream: Chaining Data Through a Pipeline
TransformStream sits between a readable and writable stream and transforms data as it passes through:
const decoder = new TextDecoder();
const encoder = new TextEncoder();
const upperCaseTransform = new TransformStream({
transform(chunk, controller) {
// stream: true handles multi-byte characters split across chunks
const text = decoder.decode(chunk, { stream: true });
controller.enqueue(encoder.encode(text.toUpperCase()));
}
});
Use pipeThrough() to insert it into a chain:
const response = await fetch('https://example.com/data.txt');
await response.body
.pipeThrough(upperCaseTransform)
.pipeTo(writableDestination);
pipeThrough() returns the readable side of the transform, so calls chain. pipeTo() connects the final readable to a writable and returns a promise that resolves when the pipe completes.
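Because the fetch-based example needs a network resource, here is the same shape as a self-contained sketch, with a hand-built source, transform, and sink (all names are invented for illustration):

```javascript
// Source: emits two string chunks, then closes.
const source = new ReadableStream({
  start(controller) {
    controller.enqueue('hello ');
    controller.enqueue('streams');
    controller.close();
  }
});

// Transform: upper-cases each string chunk as it passes through.
const upperCase = new TransformStream({
  transform(chunk, controller) {
    controller.enqueue(chunk.toUpperCase());
  }
});

// Sink: collects whatever arrives.
const received = [];
const sink = new WritableStream({
  write(chunk) {
    received.push(chunk);
  }
});

await source.pipeThrough(upperCase).pipeTo(sink);
console.log(received.join('')); // → 'HELLO STREAMS'
```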
Building a Custom ReadableStream
Create your own readable stream from scratch:
function generateNumbers(count) {
let current;
return new ReadableStream({
start(controller) {
current = 0; // one-time setup
},
pull(controller) {
if (current >= count) {
controller.close();
return;
}
controller.enqueue(current++);
},
cancel(reason) {
console.log('Cancelled:', reason);
}
});
}
const stream = generateNumbers(10);
for await (const num of stream) {
console.log(num); // 0, 1, 2, ... 9
}
start() runs once when created. pull() is called repeatedly as the stream is consumed. cancel() fires if the consumer gives up mid-stream.
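pull() suits on-demand sources. For a push source such as a timer, start() kicks off the producer and cancel() tears it down; a sketch (tickStream is a name invented here):

```javascript
// A push source: a timer enqueues an incrementing tick every 10ms
// until the consumer cancels.
function tickStream(intervalMs) {
  let timer;
  return new ReadableStream({
    start(controller) {
      let n = 0;
      timer = setInterval(() => controller.enqueue(n++), intervalMs);
    },
    cancel() {
      clearInterval(timer); // stop producing once the consumer gives up
    }
  });
}

const reader = tickStream(10).getReader();
const first = await reader.read();
await reader.cancel('got enough'); // invokes cancel(), clearing the timer
console.log(first.value); // → 0
```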
Reading from a Fetch Response as a Stream
The most common use — streaming the body of a fetch response:
async function streamResponse(url) {
const response = await fetch(url);
const reader = response.body.getReader();
const chunks = [];
while (true) {
const { done, value } = await reader.read();
if (done) break;
chunks.push(value);
}
return chunks;
}
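The chunks collected above are Uint8Arrays, so producing a single buffer or string takes one more step. A helper sketch (collectBytes is a name invented here):

```javascript
// Concatenate every Uint8Array chunk from a byte stream into one buffer.
async function collectBytes(stream) {
  const chunks = [];
  let total = 0;
  for await (const chunk of stream) {
    chunks.push(chunk);
    total += chunk.length;
  }
  const out = new Uint8Array(total);
  let offset = 0;
  for (const chunk of chunks) {
    out.set(chunk, offset);
    offset += chunk.length;
  }
  return out;
}

// Usage with a fetch body:
// const bytes = await collectBytes(response.body);
// const text = new TextDecoder().decode(bytes);
```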
pipeTo for Automatic Backpressure
pipeTo() handles backpressure automatically:
const source = await fetch('https://example.com/large.txt').then(r => r.body);
const dest = new WritableStream({ write(chunk) { /* save chunk */ } });
await source.pipeTo(dest);
No extra code needed — if dest can’t keep up, pipeTo() slows the source.
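pipeTo() also accepts an options object; passing an AbortSignal lets you tear the whole pipe down from outside. A self-contained sketch with a never-ending source (the names and delays are illustrative):

```javascript
const aborter = new AbortController();

// A source that never closes on its own.
const endless = new ReadableStream({
  pull(controller) {
    controller.enqueue('data');
  }
});

// A slow sink, so the pipe is still running when we abort.
const sink = new WritableStream({
  async write() {
    await new Promise((resolve) => setTimeout(resolve, 5));
  }
});

const piping = endless.pipeTo(sink, { signal: aborter.signal });
setTimeout(() => aborter.abort(), 20);

let aborted = false;
try {
  await piping;
} catch (err) {
  aborted = err.name === 'AbortError';
}
console.log(aborted); // → true
```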
Teeing a Stream
Split a ReadableStream into two independent copies:
const [copy1, copy2] = stream.tee();
const reader1 = copy1.getReader();
const reader2 = copy2.getReader();
Both copies can be consumed independently. Once teed, the original stream is locked.
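A typical use is consuming the same body twice, for example caching one copy while parsing the other. A self-contained sketch showing that both branches receive the same chunks:

```javascript
const source = new ReadableStream({
  start(controller) {
    controller.enqueue('a');
    controller.enqueue('b');
    controller.close();
  }
});

const [copy1, copy2] = source.tee();

// Drain a stream into an array.
async function drain(stream) {
  const out = [];
  for await (const chunk of stream) out.push(chunk);
  return out;
}

const [first, second] = await Promise.all([drain(copy1), drain(copy2)]);
console.log(first.join(''), second.join('')); // → 'ab ab'
```

One caveat: if one branch reads much faster than the other, the stream buffers the difference in memory, so avoid leaving a teed branch unread.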
Common Patterns
Stream lines from a body
async function* streamLines(stream) {
const reader = stream.getReader();
const decoder = new TextDecoder();
let remainder = '';
while (true) {
const { done, value } = await reader.read();
if (done) break;
const text = remainder + decoder.decode(value, { stream: true });
const lines = text.split('\n');
remainder = lines.pop();
for (const line of lines) yield line;
}
remainder += decoder.decode(); // flush any buffered partial character
if (remainder) yield remainder;
}
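The same splitting logic can also be packaged as a TransformStream so it composes with pipeThrough(); a sketch (lineSplitter is a name invented here):

```javascript
// A TransformStream that turns decoded text chunks into whole lines.
function lineSplitter() {
  let remainder = '';
  return new TransformStream({
    transform(chunk, controller) {
      const lines = (remainder + chunk).split('\n');
      remainder = lines.pop(); // keep the trailing partial line
      for (const line of lines) controller.enqueue(line);
    },
    flush(controller) {
      if (remainder) controller.enqueue(remainder); // emit the final line
    }
  });
}

// Usage: byte source -> text -> lines, with a line split across chunks.
const encoder = new TextEncoder();
const source = new ReadableStream({
  start(controller) {
    controller.enqueue(encoder.encode('first li'));
    controller.enqueue(encoder.encode('ne\nsecond line\nthird'));
    controller.close();
  }
});

const lines = [];
for await (const line of source
  .pipeThrough(new TextDecoderStream())
  .pipeThrough(lineSplitter())) {
  lines.push(line);
}
console.log(lines); // → ['first line', 'second line', 'third']
```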
Process CSV row by row
async function processCSV(url) {
const response = await fetch(url);
const stream = response.body
.pipeThrough(new TextDecoderStream())
.pipeThrough(new CSVParserTransform());
for await (const row of stream) {
// process one row at a time
}
}
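CSVParserTransform above is not a built-in; it stands for a transform you supply. A naive sketch that only handles unquoted fields (real CSV needs quoting and escaping rules):

```javascript
// Naive CSV parser: splits on newlines and commas; assumes no quoted fields.
class CSVParserTransform extends TransformStream {
  constructor() {
    let remainder = '';
    super({
      transform(chunk, controller) {
        const lines = (remainder + chunk).split('\n');
        remainder = lines.pop(); // keep the trailing partial line
        for (const line of lines) controller.enqueue(line.split(','));
      },
      flush(controller) {
        if (remainder) controller.enqueue(remainder.split(','));
      }
    });
  }
}

// Usage with a string stream:
const csv = new ReadableStream({
  start(controller) {
    controller.enqueue('name,score\nada,10\n');
    controller.close();
  }
});

const rows = [];
for await (const row of csv.pipeThrough(new CSVParserTransform())) {
  rows.push(row);
}
console.log(rows); // → [['name', 'score'], ['ada', '10']]
```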
Browser Support
The Streams API is widely supported: ReadableStream arrived first (Chrome 43+, Firefox 65+), with WritableStream and TransformStream following later, and all three are now available in every major browser. Async iteration of a ReadableStream with for await...of landed more recently than the reader-based API, so check compatibility if you target older browsers. The ReadableStream.from() static method, which converts any iterable directly to a stream, is still experimental.
See Also
- javascript-async-iterators — the async iteration protocol that underlies for await...of with streams
- javascript-web-animations-api — another browser-native API with similar patterns
- javascript-performance-api — measuring stream processing performance