JavaScript runs on a single thread. One thing at a time. No parallel execution. Yet it handles network requests, timers, and user events simultaneously without blocking the page.
That's not magic — it's the event loop. And understanding it precisely is what separates developers who guess at async behavior from those who predict it.
Start Here: JavaScript Is Single-Threaded
A single-threaded language has one call stack. One executing function at a time. If a function takes 10 seconds to run, nothing else happens for 10 seconds.
So how does a setTimeout "wait" for 1000ms without freezing everything? The answer: it doesn't wait. It registers a callback with the browser's timer API and immediately returns. The JavaScript thread keeps running. The browser runs the timer on a separate internal thread. When the timer fires, it puts the callback in a queue. JavaScript picks it up when the thread is free.
The event loop is the mechanism that manages this pickup.
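A minimal sketch of that hand-off (using a 50ms timer to keep the example quick):

```javascript
console.log('scheduling the timer')

// Registers the callback with the host's timer API and returns immediately.
setTimeout(() => console.log('timer fired'), 50)

console.log('still running') // logs right away; the thread never paused for 50ms
// Output: scheduling the timer, still running, then ~50ms later: timer fired
```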
The Three Components
The Call Stack is a LIFO (last in, first out) data structure that tracks which function is currently executing. When you call a function, it's pushed onto the stack. When it returns, it's popped off. JavaScript runs code only when the call stack is non-empty. When the stack empties, the event loop gets to work.
The Microtask Queue holds callbacks that should run immediately after the current task, before anything else. High priority. Fully drained before the next macrotask ever runs.
What produces microtasks:
- `Promise` `.then()`, `.catch()`, `.finally()` callbacks
- `queueMicrotask(fn)`
- `await` continuations in async functions
- `MutationObserver` callbacks
The Macrotask Queue (also called the task queue) holds callbacks scheduled for later. One macrotask runs per event loop cycle. The browser can render between macrotasks.
What produces macrotasks:
- `setTimeout` and `setInterval` callbacks
- DOM event handlers (`click`, `keypress`, etc.)
- I/O callbacks
- `MessageChannel` messages
The Event Loop Algorithm
The event loop runs continuously. On each cycle:
1. Execute all synchronous code (drain the call stack)
2. Drain the ENTIRE microtask queue (run ALL microtasks, including newly added ones)
3. [Optional: browser renders if a frame is due]
4. Pick ONE macrotask from the queue and execute it
5. Go back to step 2
The critical rule: all microtasks run before any macrotask. Every single one. If a microtask adds another microtask, that new one also runs before any macrotask. The macrotask queue only gets checked once the microtask queue is completely empty.
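Step 2's "including newly added ones" detail can be checked directly with `queueMicrotask`: a microtask queued during the drain still runs before the waiting macrotask.

```javascript
setTimeout(() => console.log('macrotask'), 0)

queueMicrotask(() => {
  console.log('microtask 1')
  // Queued while the drain is in progress — still beats the macrotask.
  queueMicrotask(() => console.log('microtask 2'))
})
// Output: microtask 1, microtask 2, macrotask
```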
The Classic Output Question — Traced Step by Step
```javascript
console.log('start')

setTimeout(() => console.log('timeout 1'), 0)
setTimeout(() => console.log('timeout 2'), 0)

Promise.resolve()
  .then(() => console.log('promise 1'))
  .then(() => console.log('promise 2'))

console.log('end')
```
Let's trace it:
Synchronous phase:
- `console.log('start')` → logs `start`
- `setTimeout(cb1, 0)` → registers cb1 in the macrotask queue
- `setTimeout(cb2, 0)` → registers cb2 in the macrotask queue
- `Promise.resolve().then(...)` → queues the promise 1 callback as a microtask
- `console.log('end')` → logs `end`
- Call stack empties
Microtask drain:
- Run promise1 callback → logs promise 1, registers promise2 as microtask
- Microtask queue not empty — run promise2 → logs promise 2
- Microtask queue empty
Macrotask (one at a time):
- Run cb1 → logs timeout 1
- Drain microtasks (empty)
- Run cb2 → logs timeout 2
Final output: start, end, promise 1, promise 2, timeout 1, timeout 2
Why Microtasks Were Introduced
Before Promises, JavaScript only had macrotasks. Timer callbacks, event callbacks — all macrotasks. The browser could render between them, which was fine.
Promises needed different semantics. A resolved Promise's callback should feel "immediate" — more like synchronous code than a timer. But it still can't be truly synchronous (that would be unpredictable). Microtasks are the middle ground: async, but before any rendering or other queued work.
This is why Promise.resolve().then(fn) feels faster than setTimeout(fn, 0) — it is. Microtasks clear before the browser even considers a render frame.
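The ordering is observable even when the timer is scheduled first:

```javascript
setTimeout(() => console.log('setTimeout callback'), 0) // scheduled first

Promise.resolve().then(() => console.log('promise callback')) // scheduled second, runs first
// Output: promise callback, setTimeout callback
```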
The Nested Promise Puzzle
```javascript
Promise.resolve()
  .then(() => {
    console.log('A')
    return Promise.resolve('B') // returns a thenable
  })
  .then(v => console.log(v))

Promise.resolve()
  .then(() => console.log('C'))
  .then(() => console.log('D'))
```
Most developers guess: A, B, C, D. The actual output: A, C, D, B.
Why? When a .then() handler returns a Promise, the spec requires two additional microtask "ticks" to unwrap it (one for the internal then on the returned Promise, one for the outer chain to continue). During those two extra ticks, the C and D chain has already progressed.
This is a deep edge case, but it reveals something important: returning a Promise from inside .then() is not the same as returning a plain value. The unwrapping adds microtasks.
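One way to see the difference: the same code, but returning a plain string instead of a Promise. The two unwrapping ticks disappear and B jumps ahead of D.

```javascript
Promise.resolve()
  .then(() => {
    console.log('A')
    return 'B' // plain value: resolved immediately, no extra ticks
  })
  .then(v => console.log(v))

Promise.resolve()
  .then(() => console.log('C'))
  .then(() => console.log('D'))
// Output: A, C, B, D
```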
async/await and the Event Loop
await is a microtask checkpoint. When an async function hits an await, it suspends — the rest of the function body is scheduled as a microtask continuation. Control returns to the caller immediately.
```javascript
async function main() {
  console.log('main: before await')
  await Promise.resolve()
  console.log('main: after await')
}

console.log('before main')
main()
console.log('after main')
```
Output: before main, main: before await, after main, main: after await
The await registers the continuation as a microtask. after main runs synchronously because main() has returned (it suspended). Then the microtask fires: main: after await.
Why Long Synchronous Code Is Catastrophic
If synchronous code runs for too long, the call stack stays occupied. No microtasks run. No macrotasks run. No event handlers fire. No renders happen. The page freezes.
```javascript
// This blocks the event loop for ~1 second
const start = Date.now()
while (Date.now() - start < 1000) {}
// Nothing else can run during this loop
```
This is called blocking the event loop. In a browser, it freezes the UI. In Node.js, it prevents handling incoming requests. The fix for CPU-intensive work is either:
- Breaking it into chunks with `setTimeout(continueWork, 0)` to yield between chunks
- Moving it to a Web Worker (browser) or Worker Thread (Node.js) — a separate thread
The Rendering Relationship
The browser targets 60 frames per second — a render opportunity every ~16.6ms. The browser renders between macrotasks, not between microtasks.
This means if your microtask queue never drains (microtasks spawning microtasks endlessly), the browser can never render. This is the microtask analog of blocking the event loop — rarer, but real.
It also means DOM changes inside microtasks batch until the next render opportunity, while DOM changes in separate macrotasks might each trigger their own render cycle. This is why React's unstable_batchedUpdates and React 18's automatic batching improve performance — they batch multiple state changes into one render cycle.
A Practical Decision Rule
When you're choosing how to schedule async work:
Use Promises / async-await when you need the result to be available before any UI rendering or I/O, and the operation is purely JavaScript work (processing, transformation, validation).
Use setTimeout(fn, 0) when you want to defer work until after the browser has a chance to render — for example, deferring a heavy calculation until after a loading spinner is painted.
Use requestAnimationFrame when scheduling visual updates — it fires at the right moment in the rendering pipeline, synchronized to the display refresh rate.
Use Web Workers when the work is CPU-intensive enough to block the main thread for more than a few milliseconds.
The Five Facts That Answer Every Event Loop Question
1. JavaScript is single-threaded. One thing at a time.
2. Synchronous code runs first, always, before any queue is checked.
3. The entire microtask queue drains after each task — before any macrotask runs, before any render.
4. One macrotask runs per event loop cycle. Then microtasks drain again. Then optionally a render.
5. Promise callbacks are microtasks. setTimeout callbacks are macrotasks. Microtasks always beat macrotasks.
With these five facts, every execution order question — no matter how nested or complicated — becomes a matter of careful tracing.