Ryan Dahl created Node.js in 2009 to bring JavaScript to the server. Built on Chrome’s V8 engine, Node.js uses an event-driven, non-blocking I/O model that handles thousands of concurrent connections with a single thread. Fifteen years later, Node.js powers the backends of Netflix, LinkedIn, PayPal, Uber, and millions of smaller applications. This guide covers everything you need to go from installation to production deployment.
Installation and Setup
The recommended way to install Node.js is through a version manager. This lets you switch between Node versions per project — critical when maintaining multiple applications:
# Install nvm (Node Version Manager)
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.7/install.sh | bash
# Install the latest LTS version
nvm install --lts
# Install a specific version
nvm install 22
# Switch between versions
nvm use 20
nvm use 22
# Check your installation
node --version # v22.x.x
npm --version # 10.x.x
Node.js ships with npm (Node Package Manager), but many projects use alternative package managers:
# npm — ships with Node
npm install express
# pnpm — faster, disk-efficient (uses hard links)
npm install -g pnpm
pnpm install express
# yarn — deterministic installs
npm install -g yarn
yarn add express
Your First HTTP Server
Node.js includes an http module for creating web servers without any external dependencies:
import { createServer } from 'node:http';
const server = createServer((req, res) => {
// Route handling
if (req.method === 'GET' && req.url === '/') {
res.writeHead(200, { 'Content-Type': 'text/html' });
res.end('<h1>Hello from Node.js</h1>');
} else if (req.method === 'GET' && req.url === '/api/status') {
res.writeHead(200, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ status: 'ok', uptime: process.uptime() }));
} else {
res.writeHead(404);
res.end('Not Found');
}
});
server.listen(3000, () => {
console.log('Server running at http://localhost:3000');
});
This server handles two routes and returns JSON or HTML based on the request URL. The callback-based API is the foundation of Node.js — the createServer callback fires for every incoming request, and the event loop handles concurrency without threads.
The Event Loop Explained
Node.js runs on a single thread, but it is not limited to one task at a time. The event loop delegates I/O operations (file reads, network requests, database queries) to the operating system or a thread pool, then processes their results when they complete:
import { readFile } from 'node:fs';
console.log('1: Start');
// This is non-blocking — Node hands it to the OS and moves on
readFile('/etc/hostname', 'utf8', (err, data) => {
console.log('3: File contents:', data?.trim());
});
console.log('2: This runs before the file is read');
// Output order:
// 1: Start
// 2: This runs before the file is read
// 3: File contents: my-hostname
This non-blocking model is why Node.js excels at I/O-intensive applications: API servers, real-time chat, streaming services, and proxy servers. It is not ideal for CPU-intensive work (image processing, video encoding, heavy computation) because a single long-running calculation blocks the event loop for all connected clients.
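When CPU-bound work is unavoidable, it can be moved off the main thread with the built-in node:worker_threads module so the event loop stays responsive. A minimal sketch — the sum-of-squares task is a hypothetical stand-in for real computation, and the worker body is inlined with eval: true only to keep the example self-contained (in a real app it would live in its own file):

```javascript
import { Worker } from 'node:worker_threads';

// Worker source as a string; eval-mode workers use CommonJS, hence require()
const workerCode = `
  const { parentPort, workerData } = require('node:worker_threads');
  let sum = 0;
  for (let i = 0; i < workerData.n; i++) sum += i * i; // CPU-heavy stand-in
  parentPort.postMessage(sum);
`;

// Run the computation in a separate thread and resolve with its result
function sumOfSquares(n) {
  return new Promise((resolve, reject) => {
    const worker = new Worker(workerCode, { eval: true, workerData: { n } });
    worker.once('message', resolve);
    worker.once('error', reject);
  });
}

const result = await sumOfSquares(1000);
console.log('Worker result:', result);
```

While the worker grinds through its loop, the main thread remains free to serve other requests.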
File System Operations
The node:fs module provides both callback-based and promise-based APIs for file system operations:
import { readFile, writeFile, mkdir, readdir, stat } from 'node:fs/promises';
import { existsSync } from 'node:fs';
import { join } from 'node:path';
// Read a file
const content = await readFile('./config.json', 'utf8');
const config = JSON.parse(content);
// Write a file
const data = JSON.stringify({ users: [], version: 1 }, null, 2);
await writeFile('./data/output.json', data, 'utf8');
// Create directories recursively
await mkdir('./logs/2025/03', { recursive: true });
// List directory contents
const files = await readdir('./src');
for (const file of files) {
const filePath = join('./src', file);
const info = await stat(filePath);
console.log(`${file}: ${info.isDirectory() ? 'dir' : info.size + ' bytes'}`);
}
// Check if file exists (sync is fine for startup checks)
if (existsSync('./config.local.json')) {
const local = await readFile('./config.local.json', 'utf8');
Object.assign(config, JSON.parse(local));
}
Streaming Large Files
For large files, streams avoid loading the entire file into memory:
import { createReadStream, createWriteStream } from 'node:fs';
import { pipeline } from 'node:stream/promises';
import { createGzip } from 'node:zlib';
// Compress a large file using streams
await pipeline(
createReadStream('./logs/access.log'),
createGzip(),
createWriteStream('./logs/access.log.gz')
);
// Read a file line by line
import { createInterface } from 'node:readline';
const rl = createInterface({
input: createReadStream('./data/large-dataset.csv'),
});
let lineCount = 0;
for await (const line of rl) {
lineCount++;
// Process each line without loading the entire file
}
console.log(`Processed ${lineCount} lines`);
Building a REST API with Express
Express is the most widely used Node.js framework. It adds routing, middleware, and request/response utilities on top of the built-in HTTP module:
npm init -y
npm install express
import express from 'express';
const app = express();
app.use(express.json()); // Parse JSON request bodies
// In-memory data store (use a database in production)
let todos = [
{ id: 1, title: 'Learn Node.js', done: false },
{ id: 2, title: 'Build an API', done: false },
];
let nextId = 3;
// GET all todos
app.get('/api/todos', (req, res) => {
res.json(todos);
});
// GET single todo
app.get('/api/todos/:id', (req, res) => {
const todo = todos.find(t => t.id === parseInt(req.params.id));
if (!todo) return res.status(404).json({ error: 'Todo not found' });
res.json(todo);
});
// POST create todo
app.post('/api/todos', (req, res) => {
const { title } = req.body;
if (!title?.trim()) {
return res.status(400).json({ error: 'Title is required' });
}
const todo = { id: nextId++, title: title.trim(), done: false };
todos.push(todo);
res.status(201).json(todo);
});
// PATCH update todo
app.patch('/api/todos/:id', (req, res) => {
const todo = todos.find(t => t.id === parseInt(req.params.id));
if (!todo) return res.status(404).json({ error: 'Todo not found' });
if (req.body.title !== undefined) todo.title = req.body.title;
if (req.body.done !== undefined) todo.done = req.body.done;
res.json(todo);
});
// DELETE todo
app.delete('/api/todos/:id', (req, res) => {
const index = todos.findIndex(t => t.id === parseInt(req.params.id));
if (index === -1) return res.status(404).json({ error: 'Todo not found' });
todos.splice(index, 1);
res.status(204).end();
});
app.listen(3000, () => {
console.log('API server running at http://localhost:3000');
});
Middleware Pattern
Express middleware functions process requests in sequence. Each function can modify the request/response, end the response, or pass control to the next middleware:
// Logging middleware
function requestLogger(req, res, next) {
const start = Date.now();
res.on('finish', () => {
const duration = Date.now() - start;
console.log(`${req.method} ${req.url} ${res.statusCode} ${duration}ms`);
});
next();
}
// Authentication middleware
function authenticate(req, res, next) {
const token = req.headers.authorization?.replace('Bearer ', '');
if (!token) {
return res.status(401).json({ error: 'Authentication required' });
}
try {
req.user = verifyToken(token); // Your JWT verification
next();
} catch {
res.status(401).json({ error: 'Invalid token' });
}
}
// Error handling middleware (4 parameters)
function errorHandler(err, req, res, next) {
console.error(err.stack);
res.status(500).json({ error: 'Internal server error' });
}
// Apply middleware
app.use(requestLogger);
app.use('/api/admin', authenticate); // Only for /api/admin routes
app.use(errorHandler); // Must be last
Async Patterns in Node.js
Node.js has evolved through three generations of async patterns. Understanding all three is important because you will encounter each in existing codebases:
Callbacks (Original Pattern)
import { readFile } from 'node:fs';
readFile('./data.json', 'utf8', (err, data) => {
if (err) {
console.error('Failed to read file:', err.message);
return;
}
const parsed = JSON.parse(data);
console.log(parsed);
});
Promises
import { readFile } from 'node:fs/promises';
readFile('./data.json', 'utf8')
.then(data => JSON.parse(data))
.then(parsed => console.log(parsed))
.catch(err => console.error('Failed:', err.message));
async/await (Modern Standard)
import { readFile } from 'node:fs/promises';
try {
const data = await readFile('./data.json', 'utf8');
const parsed = JSON.parse(data);
console.log(parsed);
} catch (err) {
console.error('Failed:', err.message);
}
New code should use async/await. It produces the most readable code, handles errors with standard try/catch, and composes naturally with Promise.all for parallel operations.
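For example, independent operations can run in parallel with Promise.all instead of being awaited one after another. The delayed tasks below are stand-ins for real I/O such as database queries or API calls:

```javascript
// Stand-in for an async I/O call: resolves with `value` after `ms` milliseconds
const delay = (ms, value) =>
  new Promise(resolve => setTimeout(() => resolve(value), ms));

async function loadDashboard() {
  // All three tasks start immediately; total time is roughly the slowest
  // task, not the sum of all three
  const [user, posts, stats] = await Promise.all([
    delay(50, { name: 'Ada' }),
    delay(80, ['First post']),
    delay(30, { visits: 42 }),
  ]);
  return { user, posts, stats };
}

const dashboard = await loadDashboard();
console.log(dashboard.user.name); // prints "Ada"
```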
Environment Variables and Configuration
// Node 20.6+ has built-in .env file support
// node --env-file=.env app.js
// For older versions, use dotenv
// npm install dotenv
// import 'dotenv/config';
const config = {
port: parseInt(process.env.PORT || '3000'),
dbUrl: process.env.DATABASE_URL,
nodeEnv: process.env.NODE_ENV || 'development',
jwtSecret: process.env.JWT_SECRET,
};
// Validate required variables at startup
const required = ['DATABASE_URL', 'JWT_SECRET'];
const missing = required.filter(key => !process.env[key]);
if (missing.length > 0) {
console.error(`Missing environment variables: ${missing.join(', ')}`);
process.exit(1);
}
Testing Node.js Applications
Node.js ships with a built-in test runner (node:test, stable since version 20), eliminating the need for an external testing framework in many cases:
// test/todo.test.js
import { describe, it, beforeEach } from 'node:test';
import assert from 'node:assert/strict';
describe('Todo API', () => {
let todos;
beforeEach(() => {
todos = [
{ id: 1, title: 'Test task', done: false },
];
});
it('should add a new todo', () => {
const newTodo = { id: 2, title: 'New task', done: false };
todos.push(newTodo);
assert.equal(todos.length, 2);
assert.equal(todos[1].title, 'New task');
});
it('should mark a todo as done', () => {
todos[0].done = true;
assert.equal(todos[0].done, true);
});
it('should remove a todo', () => {
todos.splice(0, 1);
assert.equal(todos.length, 0);
});
});
# Run tests with the built-in runner
node --test
# With watch mode for development
node --test --watch
For larger projects, frameworks like Vitest or Jest provide richer assertion libraries, mocking utilities, and code coverage reporting. Setting up tests as part of your CI/CD pipeline catches regressions before they reach production.
Production Deployment
Running Node.js in production requires attention to process management, logging, and error handling:
Process Management
# PM2 — production process manager
npm install -g pm2
# Start with auto-restart, log management, and cluster mode
pm2 start app.js --name "api" -i max # cluster mode: one process per CPU core
pm2 save # save process list
pm2 startup # generate startup script for OS reboot
# Monitor running processes
pm2 monit
pm2 logs api
Graceful Shutdown
// Handle shutdown signals properly
const server = app.listen(config.port);
async function shutdown(signal) {
console.log(`${signal} received. Shutting down gracefully...`);
// Stop accepting new connections and wait for in-flight requests to finish
await new Promise(resolve => server.close(resolve));
console.log('HTTP server closed');
// Close database connections, flush logs, etc.
await db.close();
process.exit(0);
}
process.on('SIGTERM', () => shutdown('SIGTERM'));
process.on('SIGINT', () => shutdown('SIGINT'));
// Handle unhandled promise rejections
process.on('unhandledRejection', (reason, promise) => {
console.error('Unhandled Rejection:', reason);
// Log to error tracking service
});
Docker Deployment
# Dockerfile for Node.js production
FROM node:22-slim AS builder
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build
FROM node:22-slim
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
COPY --from=builder /app/dist ./dist
USER node
EXPOSE 3000
CMD ["node", "dist/server.js"]
The multi-stage build keeps the production image small by excluding build tools and source code. Running as the node user instead of root is a basic security practice.
Security Essentials
Node.js applications face the same security threats as any web server. Key practices:
import helmet from 'helmet';
import rateLimit from 'express-rate-limit';
// Security headers
app.use(helmet());
// Rate limiting
app.use('/api/', rateLimit({
windowMs: 15 * 60 * 1000, // 15 minutes
max: 100, // limit each IP to 100 requests per window
}));
// Input validation (never trust user input)
app.post('/api/users', (req, res) => {
const { email, name } = req.body;
// Validate types
if (typeof email !== 'string' || typeof name !== 'string') {
return res.status(400).json({ error: 'Invalid input types' });
}
// Validate format
if (!email.match(/^[^\s@]+@[^\s@]+\.[^\s@]+$/)) {
return res.status(400).json({ error: 'Invalid email format' });
}
// Sanitize: trim, limit length
const sanitized = {
email: email.trim().toLowerCase().slice(0, 254),
name: name.trim().slice(0, 100),
};
// Proceed with sanitized data
});
Also keep dependencies updated (npm audit), use parameterized database queries to prevent SQL injection, and never store secrets in source code.
Node.js and the Modern Ecosystem
Node.js is the runtime foundation for most modern web frameworks. Next.js, Nuxt, SvelteKit, Astro, and Remix all run on Node.js in production. When you compare frontend frameworks, you are also comparing their Node.js-based build and server tooling.
The right editor setup makes a significant difference for Node.js development. VS Code with the ESLint and Prettier extensions provides IntelliSense for Node APIs, inline error detection, and consistent formatting, and its built-in JavaScript debugger attaches directly to Node.js for breakpoints and step-through debugging.
For performance optimization, Node.js offers built-in profiling tools. The --inspect flag exposes a debugging endpoint that Chrome DevTools attaches to for CPU profiling, memory snapshots, and step-through debugging. The --prof flag writes V8 profiling data, which node --prof-process turns into a readable report for identifying hot functions.
Alternatives: Deno and Bun
Ryan Dahl, the creator of Node.js, also created Deno — a runtime that addresses what he considers Node’s design mistakes: implicit trust of packages, lack of TypeScript support, and the node_modules system. Deno runs TypeScript natively, uses URL-based imports, and has a permission system that restricts file and network access by default.
Bun is a newer runtime written in Zig that focuses on speed. It includes a bundler, test runner, and package manager, and claims significantly faster startup times than Node.js.
Despite these alternatives, Node.js remains the production standard due to its massive ecosystem, battle-tested stability, and universal hosting support. Most Node.js code runs unchanged on Deno and Bun, so skills transfer directly.
Frequently Asked Questions
Is Node.js good for beginners?
Yes. If you already know JavaScript from frontend development, Node.js lets you use that same language on the server. The learning curve is understanding server concepts (HTTP, databases, authentication) rather than learning a new programming language. Start with the built-in HTTP module, then move to Express once you understand what Express abstracts away.
Can Node.js handle CPU-intensive tasks?
Not well on the main thread. Heavy computation blocks the event loop, freezing all concurrent connections. For CPU-intensive work, use Worker Threads (built into Node.js) to offload computation to separate threads, or delegate to services written in languages better suited for computation (Rust, Go, C++). Child processes via child_process.fork() are another option.
Should I use CommonJS (require) or ES Modules (import)?
Use ES Modules for new projects. Set "type": "module" in your package.json. ES Modules are the JavaScript standard, support top-level await, and enable static analysis for tree-shaking. CommonJS (require/module.exports) remains necessary when using older packages that have not been updated, but the ecosystem is steadily migrating.
How do I choose between Express, Fastify, and Hono?
Express is the safe default — enormous ecosystem, endless tutorials and middleware, battle-tested in production for over a decade. Fastify is faster and provides built-in JSON schema validation, but has a smaller ecosystem. Hono is the newest contender, designed to run on edge runtimes (Cloudflare Workers, Deno Deploy) as well as Node.js. For most applications, the performance difference between these frameworks is irrelevant compared to database query time and network latency.