The event loop is one of the most heavily tested JavaScript concepts in senior interviews. Master the call stack, the microtask queue, and the macrotask queue.
Picture a chef working alone in a kitchen. On their left is a stove with one burner — they can only cook one thing at a time on it. On the right are two ticket rails. The top rail holds urgent tickets — these get picked up next whenever the stove is free. The bottom rail holds regular tickets — these get picked up only when the urgent rail is empty. Waiters keep handing the chef new tickets while they cook. The chef never looks at new tickets mid-dish. They finish what's on the stove, then look at the urgent rail. Only if that's empty do they look at the regular rail. Then they start the next dish.

JavaScript is that chef. The stove is the call stack — one thing at a time, no interruptions. The urgent rail is the microtask queue (Promises, queueMicrotask). The regular rail is the macrotask queue (setTimeout, setInterval, I/O). The event loop is the chef's routine: finish what's on the stove → drain the urgent rail completely → take one item from the regular rail → repeat.

The key insight: JavaScript is single-threaded but non-blocking because it never waits — it registers callbacks and moves on. The event loop checks the queues and drives those callbacks when the thread is free. Understanding exactly which queue, and in what order, is what separates guessing from knowing.
The JavaScript runtime has four pieces that work together: the call stack, which executes one frame at a time; the browser or Node.js APIs (timers, network, I/O), which run outside the JS thread; the task queues (microtask and macrotask), which hold callbacks that are ready to run; and the event loop, which moves callbacks from the queues onto the stack.
This is the precise sequence the event loop runs, over and over: execute the current macrotask to completion, drain the entire microtask queue, render if a frame is due (browsers only), then take the next macrotask from the queue.
The critical rule: all microtasks run before the next macrotask, no matter how many microtasks are queued.
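That rule can be sketched as a toy simulation. This is an illustrative model only — `macrotasks`, `microtasks`, and `runOneTurn` are invented names for this sketch, not real runtime APIs:

```javascript
// Toy model of one event-loop turn: run ONE macrotask,
// then drain ALL microtasks before the next turn begins.
const macrotasks = []
const microtasks = []
const log = []

function runOneTurn() {
  const task = macrotasks.shift()
  if (task) task()
  while (microtasks.length) microtasks.shift()() // drain completely
}

macrotasks.push(() => {
  log.push('macro A')
  microtasks.push(() => log.push('micro from A'))
  microtasks.push(() => log.push('another micro from A'))
})
macrotasks.push(() => log.push('macro B'))

runOneTurn()
runOneTurn()
console.log(log)
// ['macro A', 'micro from A', 'another micro from A', 'macro B']
```

No matter how many microtasks 'macro A' queues, all of them run before 'macro B' gets a turn — that is the starvation risk discussed later.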
function third() { console.log('third') }
function second() { third(); console.log('second') }
function first() { second(); console.log('first') }
first()
// Call stack frames (top = currently executing):
// 1. [first] — first() called
// 2. [second, first] — second() called inside first
// 3. [third, second, first] — third() logs 'third', then returns
// 4. [second, first] — second() logs 'second', then returns
// 5. [first] — first() logs 'first', then returns
// 6. [] — stack empty, program done
//
// Output: 'third', 'second', 'first'
setTimeout(fn, 0) does not run immediately. It schedules fn as a macrotask. The current synchronous code, then all microtasks, run first:
console.log('1 — sync start')
setTimeout(() => console.log('2 — setTimeout'), 0)
console.log('3 — sync end')
// Output:
// 1 — sync start
// 3 — sync end
// 2 — setTimeout
//
// setTimeout fires AFTER the current call stack clears
console.log('1 — start')
setTimeout(() => console.log('2 — setTimeout'), 0)
Promise.resolve()
.then(() => console.log('3 — promise 1'))
.then(() => console.log('4 — promise 2'))
console.log('5 — end')
// Output:
// 1 — start
// 5 — end
// 3 — promise 1 ← microtask before setTimeout
// 4 — promise 2 ← microtask before setTimeout
// 2 — setTimeout ← macrotask last
//
// Why: synchronous runs first → call stack clears →
// microtask queue drained (both promises) → macrotask runs
console.log('1')
setTimeout(() => console.log('2'), 0)
Promise.resolve().then(() => {
console.log('3')
setTimeout(() => console.log('4'), 0)
})
Promise.resolve().then(() => console.log('5'))
console.log('6')
// Let's trace exactly:
// Sync: 1, 6
// After sync, microtask queue has: [log 3 + inner setTimeout, log 5]
// Microtask 1: logs 3, schedules setTimeout(log 4) → macrotask queue: [log 2, log 4]
// Microtask 2: logs 5
// Microtask queue empty → run macrotask: log 2
// Run macrotask: log 4
//
// Final output: 1, 6, 3, 5, 2, 4
console.log('sync 1')
queueMicrotask(() => console.log('microtask'))
console.log('sync 2')
// Output: sync 1, sync 2, microtask
// queueMicrotask is identical in priority to Promise.then
Because microtasks drain completely before any macrotask runs, an infinite microtask loop will freeze the page:
// This freezes the browser — the macrotask queue never gets a turn
function runForever() {
Promise.resolve().then(runForever)
}
runForever()
// Compare: this doesn't freeze — each setTimeout is a macrotask,
// so the browser gets to render and process events between each call
function runYieldingly() {
setTimeout(runYieldingly, 0)
}
runYieldingly()
Node.js has two additional queue types not in browsers:
// process.nextTick() — runs before any other microtask
// Technically its own queue, checked before the Promise microtask queue
console.log('1 — sync')
setTimeout(() => console.log('2 — setTimeout'), 0)
Promise.resolve().then(() => console.log('3 — promise'))
process.nextTick(() => console.log('4 — nextTick'))
console.log('5 — sync end')
// Node.js output:
// 1 — sync
// 5 — sync end
// 4 — nextTick ← nextTick queue runs before Promise microtasks
// 3 — promise
// 2 — setTimeout
// setImmediate() — macrotask scheduled to run in the check phase
// Runs in the check phase, right after the poll (I/O) phase of the same iteration — timers don't run again until the next iteration
// In practice: setImmediate fires before setTimeout(fn, 0) in I/O callbacks
Every await is a yield point. After awaiting, the rest of the async function resumes as a microtask:
async function fetchAndLog() {
console.log('A — before await')
const result = await Promise.resolve('data')
console.log('B — after await:', result) // this is a microtask callback
}
console.log('1 — before call')
fetchAndLog()
console.log('2 — after call')
// Output:
// 1 — before call
// A — before await ← runs synchronously up to the await
// 2 — after call ← resumes synchronous code after fetchAndLog() yields
// B — after await: data ← microtask resumes the async function
// Browser only — runs as part of the rendering step, just before the next paint
// Fires after the current task and its microtasks have finished, once per frame
requestAnimationFrame(() => {
// Guaranteed to run before the next frame is painted
// Use for smooth animations — not setTimeout
element.style.transform = `translateX(${x}px)`
})
// From highest to lowest priority:
// 1. Current synchronous code (call stack)
// 2. process.nextTick callbacks (Node.js only)
// 3. Promise microtasks (.then, .catch, .finally, queueMicrotask)
// 4. requestAnimationFrame callbacks (browser — tied to frame timing, runs just before the next paint)
// 5. setTimeout / setInterval / I/O callbacks (macrotasks)
// 6. setImmediate (Node.js, in I/O context)

Many devs think JavaScript is multi-threaded because it can handle async operations — but actually the JavaScript engine itself is strictly single-threaded. One call stack, one thing at a time. Async operations (timers, network requests) are handled by browser or Node.js APIs running outside the JS thread. The callbacks those APIs schedule come back to the single thread via the queue system.
Many devs think setTimeout(fn, 0) runs immediately after the current line — but actually setTimeout(fn, 0) schedules fn as a macrotask that runs after the current call stack is empty AND after all pending microtasks (Promises) have been drained. It's the lowest priority callback type.
Many devs think Promise.then() callbacks and setTimeout callbacks have the same priority — but actually Promise.then() callbacks are microtasks and always run before any macrotask, including setTimeout(fn, 0). If you have both queued, all pending microtasks run first, then one macrotask runs, then any new microtasks created by that macrotask run, and so on.
Many devs think queueing many microtasks is harmless — but actually an infinite microtask loop (a .then that always queues another .then) starves the macrotask queue completely, preventing timers, I/O callbacks, and browser rendering from ever executing. The browser tab freezes. This is microtask starvation and is a real production bug category.
Many devs think await pauses the entire JavaScript thread while waiting — but actually await pauses only the async function, not the thread. After hitting an await, the async function suspends and JavaScript returns to the event loop, which continues executing other callbacks. The async function resumes as a microtask when the awaited Promise resolves.
Many devs think the event loop processes all queued tasks in one go — but actually the event loop processes exactly one macrotask per cycle, then drains all microtasks, then renders (in browsers), then picks the next macrotask. This cycle is why long-running synchronous code blocks rendering and user input — the render step never gets a turn until the call stack is clear.
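The one-macrotask-per-cycle rhythm is easy to observe with two timers (Node.js ≥ 11 and browsers drain microtasks between timer callbacks):

```javascript
// A microtask created inside the first macrotask runs BEFORE the
// second macrotask, because microtasks drain between every macrotask.
const order = []

setTimeout(() => {
  order.push('macrotask A')
  Promise.resolve().then(() => order.push('microtask from A'))
}, 0)
setTimeout(() => order.push('macrotask B'), 0)

setTimeout(() => console.log(order), 10)
// ['macrotask A', 'microtask from A', 'macrotask B']
```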
React's state batching in React 18 is built on the event loop — multiple setState calls inside a single event handler are batched into one re-render because React defers the render to a microtask or a scheduled macrotask, collecting all synchronous state changes first before triggering one render pass.
Node.js's non-blocking I/O model — the reason it handles thousands of concurrent connections on a single thread — is entirely the event loop in action. File reads, database queries, and network calls all register callbacks that re-enter the thread via the macrotask queue when ready, while the thread stays free for other work.
Long tasks (JavaScript functions that run for more than 50ms) are detected by Lighthouse and Chrome DevTools as blocking the main thread. Each long task prevents the browser from processing user input, running animations, and rendering, because the call stack doesn't empty until the task finishes. The fix is breaking the work into smaller chunks with setTimeout or scheduler.postTask().
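The chunking fix can be sketched like this — the array size and `CHUNK` value are illustrative numbers, not a recommendation:

```javascript
// Split a long computation into small macrotasks so rendering
// and input handling can run between chunks.
const items = Array.from({ length: 10000 }, (_, i) => i)
const CHUNK = 1000
let index = 0
let sum = 0

function processChunk() {
  const end = Math.min(index + CHUNK, items.length)
  for (; index < end; index++) sum += items[index]
  if (index < items.length) {
    setTimeout(processChunk, 0) // yield the thread between chunks
  } else {
    console.log('sum:', sum) // sum: 49995000
  }
}
processChunk()
```

Each chunk is one short macrotask, so the call stack empties (and the render step gets a turn) ten times instead of never.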
Debounce and throttle implementations are direct event loop tools — debounce uses setTimeout to push work into a future macrotask, allowing many rapid events to cancel and reschedule each other. The final callback only fires once the event stream pauses long enough for the timer to complete without being cancelled.
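A minimal debounce sketch, assuming a trailing-edge variant (the callback fires once, after the burst goes quiet):

```javascript
// Each call cancels the previously scheduled macrotask and
// schedules a new one `delay` ms in the future.
function debounce(fn, delay) {
  let timer = null
  return (...args) => {
    clearTimeout(timer)
    timer = setTimeout(() => fn(...args), delay)
  }
}

let calls = 0
const onInput = debounce(() => { calls++ }, 20)
onInput(); onInput(); onInput() // rapid burst — only the last survives

setTimeout(() => console.log('calls:', calls), 50) // calls: 1
```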
Promises in the fetch API chain are microtasks all the way down — every .then() in a fetch().then().then().then() chain schedules the next step as a microtask. This means a deeply chained Promise chain processes all its steps before any setTimeout or I/O callback runs, which is why Promise chains respond faster than equivalent timer-based patterns.
Web Workers run JavaScript on a separate thread and communicate back to the main thread via message events, which enter the macrotask queue. The main thread's event loop picks them up normally. This is the only way to run JavaScript truly in parallel — the worker has its own event loop, its own call stack, and its own queues.
Explain the event loop, call stack, and microtask queue.
Microtask starvation — UI never updates
Reading answers is not the same as knowing them. Practice saying them out loud with AI feedback — that's what builds real interview confidence.