Generator functions are an advanced JavaScript feature with real-world uses in async iteration and lazy evaluation. Learn them for senior-level interviews.
Picture a vending machine versus a buffet. A buffet puts everything out at once — you take your plate, load it up, and walk away. A vending machine gives you one item at a time. You put in your money, press a button, and get one item. You come back, press again, get another. It remembers exactly where it left off each time. Regular functions are the buffet — they compute everything and return it all at once. Generators are the vending machine — they produce one value at a time, pause, wait for you to ask for the next one, and resume exactly where they left off. The computation is spread across multiple calls, and the function's state is frozen between each one. The key insight: generators give you a function that can be paused mid-execution and resumed later. They don't just return a value — they yield it, suspend themselves, and hand control back to the caller. The caller drives the pace. This makes generators uniquely powerful for lazy sequences, infinite data streams, and managing complex async flows.
A generator function is declared with function* (the asterisk is the signal). Calling it does not run the body — it returns a generator object. The body only runs when you call .next() on the generator object. Execution runs until the next yield, then pauses.
function* counter() {
  console.log('Step 1')
  yield 1
  console.log('Step 2')
  yield 2
  console.log('Step 3')
  yield 3
  console.log('Done')
}
const gen = counter() // body does NOT run yet
gen.next() // logs 'Step 1', returns { value: 1, done: false }
gen.next() // logs 'Step 2', returns { value: 2, done: false }
gen.next() // logs 'Step 3', returns { value: 3, done: false }
gen.next() // logs 'Done', returns { value: undefined, done: true }
gen.next() // returns { value: undefined, done: true } — stays exhausted
Each call to .next() returns an object with two properties:
value — the value passed to yield (or undefined after the function returns)
done — false while there are more yields, true when the function has returned
Generator objects implement the JavaScript iterator protocol, which means they work with for...of, spread, destructuring, and anything else that consumes iterables:
function* range(start, end, step = 1) {
  for (let i = start; i <= end; i += step) {
    yield i
  }
}
// for...of
for (const n of range(1, 5)) {
  console.log(n) // 1, 2, 3, 4, 5
}
// Spread into an array
const nums = [...range(1, 5)] // [1, 2, 3, 4, 5]
const evens = [...range(2, 10, 2)] // [2, 4, 6, 8, 10]
// Destructuring
const [a, b, c] = range(10, 50, 10) // a=10, b=20, c=30
Because generators only compute one value at a time, they can represent infinite sequences without running out of memory. The sequence only advances when you ask for the next value.
// Infinite Fibonacci sequence
function* fibonacci() {
  let a = 0, b = 1
  while (true) { // infinite loop, safe because execution pauses at each yield
    yield a; // semicolon matters: the next line starts with [
    [a, b] = [b, a + b]
  }
}
const fib = fibonacci()
fib.next().value // 0
fib.next().value // 1
fib.next().value // 1
fib.next().value // 2
fib.next().value // 3
fib.next().value // 5
fib.next().value // 8
// Take first 10 Fibonacci numbers
function take(n, gen) {
  const result = []
  for (const val of gen) {
    result.push(val)
    if (result.length === n) break // stop early; break closes the generator via return()
  }
  return result
}
take(10, fibonacci()) // [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
// Infinite ID generator
function* idGenerator() {
  let id = 1
  while (true) yield id++
}
const nextId = idGenerator()
nextId.next().value // 1
nextId.next().value // 2
nextId.next().value // 3
The communication goes both ways. When you call next(value), the value becomes the result of the yield expression inside the generator. This transforms the generator from a read-only producer into a two-way communication channel.
function* conversation() {
  const name = yield 'What is your name?'
  const age = yield `Hello ${name}! How old are you?`
  yield `Wow, ${name} is ${age} years old.`
}
const chat = conversation()
chat.next() // { value: 'What is your name?', done: false }
// First next() starts the generator, value is ignored
chat.next('Alice') // { value: 'Hello Alice! How old are you?', done: false }
// 'Alice' becomes the result of the first yield
chat.next(30) // { value: 'Wow, Alice is 30 years old.', done: false }
// 30 becomes the result of the second yield
yield* delegates to another iterable (including another generator), yielding all its values before continuing:
function* inner() {
  yield 'a'
  yield 'b'
}
function* outer() {
  yield 1
  yield* inner() // delegates — yields 'a' then 'b' inline
  yield 2
}
[...outer()] // [1, 'a', 'b', 2]
// Flattening nested arrays with generators
function* flatten(arr) {
  for (const item of arr) {
    if (Array.isArray(item)) {
      yield* flatten(item) // recurse
    } else {
      yield item
    }
  }
}
[...flatten([1, [2, [3, 4]], 5])] // [1, 2, 3, 4, 5]
Generators also expose two control methods: return() terminates the generator early, and throw() injects an error at the paused yield, which the generator can catch:
function* gen() {
  try {
    yield 1
    yield 2
    yield 3
  } catch (e) {
    yield `caught: ${e}`
  }
}
const g = gen()
g.next() // { value: 1, done: false }
g.return('stop') // { value: 'stop', done: true } — forces early termination
const h = gen()
h.next() // { value: 1, done: false }
h.throw('oops') // { value: 'caught: oops', done: false } — injects an error
Async generators combine async/await with generators. They're perfect for processing paginated API responses or real-time data streams using for await...of:
async function* fetchPages(url) {
  let nextUrl = url
  while (nextUrl) {
    const response = await fetch(nextUrl)
    const data = await response.json()
    yield data.items // yield this page's items
    nextUrl = data.nextPage // will be undefined/null when no more pages
  }
}
// Process every item from a paginated API without loading everything at once
for await (const page of fetchPages('/api/products')) {
  for (const product of page) {
    await processProduct(product)
  }
}
// Reading a large file line by line
const fs = require('fs')
async function* readLines(filename) {
  const fileStream = fs.createReadStream(filename, { encoding: 'utf8' })
  let leftover = '' // carries a partial line across chunk boundaries
  for await (const chunk of fileStream) {
    const lines = (leftover + chunk).split('\n')
    leftover = lines.pop() // the last piece may be an incomplete line
    for (const line of lines) yield line
  }
  if (leftover) yield leftover
}
for await (const line of readLines('huge-file.csv')) {
  await insertIntoDatabase(parseCsv(line))
}
Before async/await existed (ES2017), libraries like co used generators to write code that looked synchronous but ran asynchronously. Understanding this history explains why generators exist in the spec:
// With co library (2014 era) — before async/await
const co = require('co')
co(function* () {
  const user = yield fetchUser(1) // yield a Promise
  const posts = yield fetchPosts(user.id) // yield another Promise
  console.log(posts)
})
// co ran the generator, took each yielded Promise,
// waited for it to resolve, then sent the value back in with next()
// This is exactly what async/await does. Babel and TypeScript compiled
// async/await down to generators before engines supported it natively
Modern async/await is syntactic sugar over generators + Promises. The mental model is the same: pause here, wait for a value, resume with the result.
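To see the equivalence, here is a minimal sketch (not a production implementation) of a runner that drives a generator of Promises, feeding each resolved value back in with next(). This is the core of what async/await automates:

```javascript
// Minimal sketch: drive a generator that yields Promises, resuming it
// with each resolved value. Error handling is deliberately simplified.
function run(genFn) {
  const gen = genFn()
  return new Promise((resolve, reject) => {
    function step(input) {
      let result
      try {
        result = gen.next(input)
      } catch (err) {
        return reject(err)
      }
      if (result.done) return resolve(result.value)
      // Wrap the yielded value so plain values work too, then resume
      Promise.resolve(result.value).then(step, reject)
    }
    step()
  })
}

// Reads like async/await, but runs on a plain generator
run(function* () {
  const a = yield Promise.resolve(1)
  const b = yield Promise.resolve(a + 1)
  return a + b // the returned Promise resolves to 3
})
```

Swap `yield` for `await` and `function*` for `async function` and you have modern syntax; the pause-and-resume machinery is identical.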
Many devs think calling a generator function runs its body immediately — but actually calling a generator function returns a generator object and runs nothing. The body only starts executing on the first .next() call. This is the fundamental difference from regular functions.
Many devs think generators are only useful for async code — but actually generators are synchronous by default and their primary use cases are lazy evaluation, infinite sequences, and custom iterables. Async generators (async function*) are a separate feature layered on top. Understanding sync generators first makes async generators obvious.
Many devs think the value passed to the first next() call matters — but actually the first next() call starts the generator and its argument is always discarded. The generator runs until the first yield, and there is no yield expression yet to receive a value. Only the second and subsequent next(value) calls inject values into the running generator.
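A quick demonstration (a hypothetical echo generator) makes the rule concrete:

```javascript
function* echo() {
  const received = yield 'ready'
  yield received
}

const e = echo()
e.next('ignored').value // 'ready' (the argument to the first next() is discarded)
e.next('hello').value // 'hello' (injected as the result of the first yield)
```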
Many devs think a generator is exhausted after the function body completes — but actually calling next() on an exhausted generator simply returns { value: undefined, done: true } forever. It never throws. An exhausted generator is safe to call repeatedly.
Many devs think for...of will exhaust a generator even if you break early — but actually breaking out of a for...of loop calls return() on the generator, which closes it cleanly. The generator is terminated at that point, not at the last yield. This is why using generators inside for...of loops with early break conditions is safe.
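This also means a finally block inside the generator runs when the consumer breaks, making it the natural place for cleanup. A small sketch:

```javascript
let cleaned = false

function* withCleanup() {
  try {
    yield 1
    yield 2
    yield 3
  } finally {
    cleaned = true // runs when the consumer breaks and return() is called
  }
}

for (const n of withCleanup()) {
  if (n === 2) break // break calls return(), which triggers the finally block
}
// cleaned is now true
```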
Many devs think generators are too complex for everyday code and only belong in libraries — but actually paginated API fetching with async generators, infinite scroll implementations, unique ID generation, and custom iteration over tree structures are all practical day-to-day use cases that generators solve more cleanly than alternatives.
Redux-Saga is an entire side-effect management library built entirely on generators — every saga is a generator function that yields effect descriptors (call(), put(), take()), and the saga middleware drives the generator by calling next() with resolved values. Understanding generators makes every Redux-Saga pattern immediately readable.
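The driving loop can be sketched in a few lines. This is a toy, not the real redux-saga API; call and runSaga here are simplified synchronous stand-ins:

```javascript
// Toy saga runner: the saga yields plain effect descriptors, and the
// runner interprets each one, resuming the generator with the result
const call = (fn, ...args) => ({ type: 'CALL', fn, args })

function runSaga(saga) {
  const gen = saga()
  let input
  while (true) {
    const { value, done } = gen.next(input)
    if (done) return
    if (value.type === 'CALL') input = value.fn(...value.args) // run the effect
  }
}

let result
runSaga(function* () {
  const doubled = yield call(x => x * 2, 21) // describe the effect, don't run it
  result = doubled // the runner resumes us with 42
})
```

The saga never performs side effects itself; it only describes them, which is what makes sagas easy to test step by step.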
TypeScript's compiler and many AST parsers use generator-based tree traversal — yielding nodes as they walk the tree so callers can process one node at a time without loading the entire traversal result into memory. This is the lazy evaluation pattern that generators enable.
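The pattern itself is easy to sketch; the node shape here ({ value, children }) is hypothetical, not any particular compiler's AST:

```javascript
// Lazily yield every node's value in depth-first order
function* walk(node) {
  yield node.value
  for (const child of node.children ?? []) {
    yield* walk(child) // delegate to the subtree
  }
}

const tree = {
  value: 'root',
  children: [
    { value: 'a', children: [{ value: 'a1' }] },
    { value: 'b' },
  ],
}

const order = [...walk(tree)] // ['root', 'a', 'a1', 'b']
```

Callers can stop the walk at any node with break, and no list of visited nodes is ever materialized unless they spread it.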
Node.js streams can be consumed with async generators using for await...of, which is now the recommended way to process large files and HTTP response bodies — replacing the complex event-based readable.on('data') pattern with a simple loop that processes one chunk at a time.
Next.js and other SSR frameworks use generators internally for streaming HTML responses — server components can yield HTML chunks that are flushed to the client progressively, enabling the browser to start rendering before the entire page is generated. This is only possible because generators can yield partial results.
Jest's fake timer implementation uses generator-like stepping to control time in tests — advancing the clock by specific amounts and letting the event loop process each step. The mental model (advance by one unit, observe effects, advance again) mirrors exactly how generator stepping works.
Infinite scroll pagination in React applications is cleanly modeled with async generators — an async generator fetches the next page on each iteration, and the component calls next() when the user scrolls near the bottom, making the data fetching logic completely separate from the rendering logic.
What are generators and when would you use them?