Authentication and Authorization for Web Developers: OAuth 2.0, JWT, Sessions, and Passkeys

Authentication and authorization form the backbone of every secure web application. Whether you are building a simple blog or a complex enterprise platform, understanding how to verify user identity and control access to resources is non-negotiable. This guide explores the four most important authentication mechanisms modern web developers need to master: OAuth 2.0, JSON Web Tokens (JWT), session-based authentication, and the emerging passkeys standard.

By the end of this article, you will understand how each approach works under the hood, when to choose one over another, and how to implement them correctly without introducing security vulnerabilities. If you are not yet familiar with common web security threats, start with our OWASP Top 10 web security guide for essential context.

Authentication vs. Authorization: The Fundamental Distinction

Before diving into specific technologies, it is critical to understand the difference between authentication and authorization. These terms are frequently confused, yet they describe two entirely separate concerns.

Authentication answers the question: “Who are you?” It is the process of verifying a user’s identity. When a user enters a username and password, the system authenticates them by confirming those credentials match a stored record.

Authorization answers the question: “What are you allowed to do?” Once the system knows who a user is, authorization determines which resources and actions that user can access. A regular user might read articles, while an administrator can also delete them.

Think of authentication as showing your ID at the door, and authorization as the access badge that determines which rooms you can enter once inside the building. Every authentication mechanism discussed in this article handles both concerns in different ways.

Session-Based Authentication: The Traditional Approach

Session-based authentication is the oldest and most straightforward method. It has powered web applications since the earliest days of dynamic websites, and it remains an excellent choice for server-rendered applications.

How Session-Based Auth Works

The flow is simple: a user submits their credentials, the server verifies them, creates a session record in server-side storage (typically a database or in-memory store like Redis), and sends back a session ID as an HTTP cookie. Every subsequent request includes this cookie, allowing the server to look up the session and identify the user.

This approach is inherently stateful. The server must maintain session data for every active user. While this introduces scalability considerations — particularly in microservices architectures where multiple servers need access to session data — it provides significant security advantages.

Security Benefits of Sessions

Session-based authentication gives the server full control over user sessions. Need to log out a user immediately after detecting suspicious activity? Simply delete their session record. Need to limit concurrent logins? Count active sessions. This level of control is difficult to achieve with stateless approaches like JWT.

When implementing sessions, always use these security best practices for cookies:

  • HttpOnly flag — prevents JavaScript from accessing the cookie, mitigating XSS attacks
  • Secure flag — ensures the cookie is only sent over HTTPS connections
  • SameSite attribute — set to “Strict” or “Lax” to prevent CSRF attacks
  • Short expiration — sessions should expire after a reasonable period of inactivity
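Put together, a session cookie honoring these practices might be issued like this (the 30-minute Max-Age is an illustrative choice):

```javascript
// Build a Set-Cookie header value with the recommended attributes;
// each attribute maps to one of the best practices above
function buildSessionCookie(sessionId) {
  return [
    `sessionId=${sessionId}`,
    'HttpOnly',        // not readable from JavaScript (mitigates XSS)
    'Secure',          // sent over HTTPS only
    'SameSite=Lax',    // blocks most cross-site request forgery
    'Max-Age=1800',    // expire after 30 minutes
    'Path=/'
  ].join('; ');
}

// In Express: res.setHeader('Set-Cookie', buildSessionCookie(id));
```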

Session-based authentication works best for traditional server-rendered web applications, especially those running on a single server or using a shared session store. If you are building progressive web apps or single-page applications that communicate with separate API backends, JWT or OAuth 2.0 may be more appropriate.

JSON Web Tokens (JWT): Stateless Authentication

JWT (pronounced “jot”) has become one of the most popular authentication mechanisms for modern web applications, especially those with decoupled frontends and API backends. Understanding REST API fundamentals is essential context before implementing JWT-based auth.

JWT Structure

A JWT consists of three Base64URL-encoded parts separated by dots: the header, the payload, and the signature.

The header specifies the token type and the signing algorithm (e.g., HS256 or RS256). The payload contains claims — pieces of information about the user and the token itself, such as the user ID, roles, and expiration time. The signature is created by hashing the header and payload with a secret key, ensuring the token has not been tampered with.

A critical point many developers miss: JWTs are signed, not encrypted. Anyone can decode and read the payload. Never store sensitive information like passwords, credit card numbers, or personal data in a JWT. The signature only guarantees integrity — it proves the token was issued by your server and has not been modified.
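You can demonstrate this to yourself with nothing but a Base64URL decoder — no secret required. The token below is hand-built for illustration, but the payload of a real token from jwt.sign() is encoded the same way:

```javascript
// Hand-built token: header and payload are just Base64URL-encoded JSON
const header  = Buffer.from(JSON.stringify({ alg: 'HS256', typ: 'JWT' }))
  .toString('base64url');
const payload = Buffer.from(JSON.stringify({ sub: '42', role: 'admin', exp: 1700000000 }))
  .toString('base64url');
const token = `${header}.${payload}.fake-signature`;

// Anyone holding the token can read every claim without the secret
const claims = JSON.parse(
  Buffer.from(token.split('.')[1], 'base64url').toString()
);
console.log(claims.sub, claims.role); // readable with no key at all
```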

JWT Verification Middleware in Node.js

Here is an example JWT verification middleware for a Node.js Express application. It validates incoming tokens, checks expiration, and attaches the decoded user data to the request object:

const jwt = require('jsonwebtoken');

const JWT_SECRET = process.env.JWT_SECRET;
if (!JWT_SECRET) {
  // Fail fast: verifying with a missing secret would reject every request
  throw new Error('JWT_SECRET environment variable is not set');
}
const TOKEN_ISSUER = 'your-app-name';

function authenticateToken(req, res, next) {
  // Extract token from Authorization header
  const authHeader = req.headers['authorization'];
  const token = authHeader && authHeader.startsWith('Bearer ')
    ? authHeader.slice(7)
    : null;

  if (!token) {
    return res.status(401).json({
      error: 'Authentication required',
      message: 'No token provided in Authorization header'
    });
  }

  try {
    // Verify token signature, expiration, and issuer
    const decoded = jwt.verify(token, JWT_SECRET, {
      issuer: TOKEN_ISSUER,
      algorithms: ['HS256'],
      clockTolerance: 30 // 30-second grace period for clock skew
    });

    // Attach user data to request for downstream handlers
    req.user = {
      id: decoded.sub,
      email: decoded.email,
      roles: decoded.roles || []
    };

    next();
  } catch (err) {
    if (err.name === 'TokenExpiredError') {
      return res.status(401).json({
        error: 'Token expired',
        message: 'Please refresh your token or log in again'
      });
    }

    if (err.name === 'JsonWebTokenError') {
      return res.status(403).json({
        error: 'Invalid token',
        message: 'Token verification failed'
      });
    }

    return res.status(500).json({ error: 'Authentication error' });
  }
}

// Role-based authorization middleware
function requireRole(...allowedRoles) {
  return (req, res, next) => {
    if (!req.user) {
      return res.status(401).json({ error: 'Authentication required' });
    }

    const hasRole = req.user.roles.some(
      role => allowedRoles.includes(role)
    );

    if (!hasRole) {
      return res.status(403).json({
        error: 'Insufficient permissions',
        message: `Required role: ${allowedRoles.join(' or ')}`
      });
    }

    next();
  };
}

// Usage example (assumes an Express `app` and a getUsersList() helper exist)
app.get('/api/admin/users',
  authenticateToken,
  requireRole('admin', 'superadmin'),
  (req, res) => {
    // Only authenticated admins reach this handler
    res.json({ users: getUsersList() });
  }
);

module.exports = { authenticateToken, requireRole };

JWT Best Practices

JWT tokens come with important trade-offs that developers must understand:

  • Keep tokens short-lived — Access tokens should expire within 15 to 30 minutes. Use refresh tokens (stored securely) to obtain new access tokens without re-authentication.
  • Never store JWTs in localStorage — localStorage is accessible to any JavaScript on the page, making it vulnerable to XSS. Use HttpOnly cookies or keep tokens in memory.
  • Use asymmetric signing for distributed systems — RS256 (RSA) allows services to verify tokens without knowing the signing key, which is ideal for microservices.
  • Implement token revocation — Maintain a blacklist of revoked tokens in a fast store like Redis, or use short-lived tokens paired with refresh token rotation.
  • Always validate the algorithm — Specify the expected algorithm explicitly to prevent algorithm confusion attacks where an attacker switches from RS256 to HS256.

OAuth 2.0: Delegated Authorization

OAuth 2.0 is not an authentication protocol — it is an authorization framework. It allows users to grant third-party applications limited access to their resources without sharing credentials. When you click “Sign in with Google” or “Connect your GitHub account,” OAuth 2.0 is working behind the scenes.

OAuth 2.0 Roles and Flows

OAuth 2.0 defines four roles: the Resource Owner (the user), the Client (your application), the Authorization Server (e.g., Google’s OAuth server), and the Resource Server (the API hosting protected resources).

The most common flow for web applications is the Authorization Code flow with PKCE (Proof Key for Code Exchange). PKCE adds an extra layer of security and is now recommended for all client types, including server-side applications. The older implicit flow is deprecated due to security vulnerabilities.

OAuth 2.0 Authorization Code Flow with PKCE

Below is a sketch of the OAuth 2.0 Authorization Code flow with PKCE for a web application, showing both the redirect to the authorization server and the server-side token exchange:

const crypto = require('crypto');
const express = require('express');
const axios = require('axios');

const app = express();

// NOTE: a session middleware such as express-session must be registered
// on `app` so that req.session is available in the handlers below.

// OAuth configuration — replace with your provider's endpoints
const OAUTH_CONFIG = {
  clientId: process.env.OAUTH_CLIENT_ID,
  clientSecret: process.env.OAUTH_CLIENT_SECRET,
  authorizationEndpoint: 'https://accounts.google.com/o/oauth2/v2/auth',
  tokenEndpoint: 'https://oauth2.googleapis.com/token',
  redirectUri: 'https://yourapp.com/auth/callback',
  scopes: ['openid', 'email', 'profile']
};

// Generate PKCE code verifier and challenge
function generatePKCE() {
  const codeVerifier = crypto.randomBytes(32)
    .toString('base64url');
  const codeChallenge = crypto.createHash('sha256')
    .update(codeVerifier)
    .digest('base64url');
  return { codeVerifier, codeChallenge };
}

// Step 1: Redirect user to authorization server
app.get('/auth/login', (req, res) => {
  const { codeVerifier, codeChallenge } = generatePKCE();
  const state = crypto.randomBytes(16).toString('hex');

  // Store verifier and state in session for later validation
  req.session.oauthState = state;
  req.session.codeVerifier = codeVerifier;

  const authUrl = new URL(OAUTH_CONFIG.authorizationEndpoint);
  authUrl.searchParams.set('client_id', OAUTH_CONFIG.clientId);
  authUrl.searchParams.set('redirect_uri', OAUTH_CONFIG.redirectUri);
  authUrl.searchParams.set('response_type', 'code');
  authUrl.searchParams.set('scope', OAUTH_CONFIG.scopes.join(' '));
  authUrl.searchParams.set('state', state);
  authUrl.searchParams.set('code_challenge', codeChallenge);
  authUrl.searchParams.set('code_challenge_method', 'S256');

  res.redirect(authUrl.toString());
});

// Step 2: Handle the callback and exchange code for tokens
app.get('/auth/callback', async (req, res) => {
  const { code, state, error } = req.query;

  // Check for errors from the authorization server
  if (error) {
    return res.status(400).json({
      error: 'Authorization failed',
      detail: req.query.error_description || error
    });
  }

  // Validate state parameter to prevent CSRF
  if (state !== req.session.oauthState) {
    return res.status(403).json({
      error: 'Invalid state parameter — possible CSRF attack'
    });
  }

  try {
    // Exchange authorization code for access token
    const tokenResponse = await axios.post(
      OAUTH_CONFIG.tokenEndpoint,
      new URLSearchParams({
        grant_type: 'authorization_code',
        code: code,
        redirect_uri: OAUTH_CONFIG.redirectUri,
        client_id: OAUTH_CONFIG.clientId,
        client_secret: OAUTH_CONFIG.clientSecret,
        code_verifier: req.session.codeVerifier
      }),
      { headers: { 'Content-Type': 'application/x-www-form-urlencoded' } }
    );

    const { access_token, id_token, refresh_token } = tokenResponse.data;

    // Fetch user profile from the resource server
    const userInfo = await axios.get(
      'https://www.googleapis.com/oauth2/v3/userinfo',
      { headers: { Authorization: `Bearer ${access_token}` } }
    );

    // Create or update user in your database
    const user = await findOrCreateUser({
      email: userInfo.data.email,
      name: userInfo.data.name,
      providerId: userInfo.data.sub
    });

    // Establish an application session
    req.session.userId = user.id;

    // Clean up OAuth artifacts from session
    delete req.session.oauthState;
    delete req.session.codeVerifier;

    res.redirect('/dashboard');
  } catch (err) {
    console.error('Token exchange failed:', err.response?.data || err.message);
    res.status(500).json({ error: 'Authentication failed' });
  }
});

OpenID Connect: Authentication on Top of OAuth

Since OAuth 2.0 only handles authorization, OpenID Connect (OIDC) was built as an identity layer on top of it. OIDC adds a standardized ID token (a JWT) that contains user identity information. When you use “Sign in with Google,” you are actually using OIDC, not raw OAuth 2.0.

OIDC introduces several important concepts: the ID token (containing user claims), the UserInfo endpoint (for fetching additional profile data), and discovery documents (a standardized way to find provider endpoints). If your application needs to identify users rather than just access their resources, always use OIDC instead of plain OAuth 2.0.
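Discovery documents live at a well-known path under the issuer URL. A client can construct that URL as shown below; the fetch call is sketched in a comment since the returned fields vary by provider:

```javascript
// Build the OIDC discovery URL for a given issuer
function discoveryUrl(issuer) {
  return new URL('/.well-known/openid-configuration', issuer).toString();
}

// In an async context, a client would then do something like:
//   const config = await (await fetch(discoveryUrl(issuer))).json();
//   // config.authorization_endpoint, config.token_endpoint,
//   // config.jwks_uri, config.userinfo_endpoint, ...
```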

Passkeys: The Future of Authentication

Passkeys represent a fundamental shift in web authentication. Built on the WebAuthn standard and FIDO2 protocols, passkeys eliminate passwords entirely. Instead of typing a password, users authenticate using biometrics (fingerprint or face recognition), a device PIN, or a hardware security key.

How Passkeys Work

Passkeys use public-key cryptography. During registration, the user’s device generates a unique key pair. The private key stays on the device (protected by the device’s secure enclave), while the public key is sent to the server. During authentication, the server sends a challenge, the device signs it with the private key, and the server verifies the signature with the stored public key.

This architecture provides several breakthrough advantages:

  • Phishing-resistant — Passkeys are bound to the specific domain they were created for. A fake login page on a lookalike domain cannot trigger the passkey.
  • No shared secrets — The server never stores a password or any secret that could be stolen in a data breach.
  • Cross-device sync — Modern platforms sync passkeys across devices via iCloud Keychain, Google Password Manager, or other platform credential managers.
  • Simpler user experience — No passwords to remember, type, or reset.

Implementing Passkey Registration

The WebAuthn API is built into modern browsers. Registration (also called attestation) involves generating a credential and storing the public key server-side. The front-end calls navigator.credentials.create() with options provided by your server, and the browser handles the biometric prompt and key generation.

When implementing passkeys, you should use a well-tested library like SimpleWebAuthn (for Node.js) or py_webauthn (for Python) rather than implementing the cryptographic verification yourself. The specification is nuanced, and subtle implementation errors can introduce security vulnerabilities.

Current Adoption and Considerations

Major platforms including Apple, Google, and Microsoft now support passkeys across their ecosystems. As of 2025, passkey support is available in all major browsers, and adoption is accelerating. However, most applications should implement passkeys as an additional authentication option alongside traditional methods, since not all users are ready to transition fully.

Choosing the Right Approach

Selecting an authentication strategy depends on your application architecture, security requirements, and user experience goals. Here is a practical decision framework:

Use session-based authentication when:

  • You are building a server-rendered application (e.g., traditional MVC with templates)
  • Your application runs on a single server or a small cluster with shared session storage
  • You need immediate session revocation capabilities
  • You want the simplest possible implementation

Use JWT when:

  • You are building an API consumed by a single-page application or mobile app
  • Your architecture includes multiple microservices that need to verify user identity independently
  • You need stateless authentication across distributed services
  • You are building GraphQL or REST APIs with separate frontend and backend deployments

Use OAuth 2.0 / OIDC when:

  • You want to offer social login (Google, GitHub, Microsoft)
  • Your application needs to access third-party APIs on behalf of users
  • You are building a platform where third-party apps integrate with your API
  • You want to delegate user management to an identity provider

Add passkeys when:

  • You want the strongest protection against phishing and credential theft
  • Your users have modern devices with biometric capabilities
  • You are willing to maintain a fallback authentication method
  • You want to reduce password reset support burden

In practice, most production applications combine multiple approaches. A common pattern is using OAuth 2.0 for social login, issuing JWTs for API access, maintaining server-side sessions for web clients, and offering passkeys as an optional upgrade.

Common Security Pitfalls

Regardless of which authentication mechanism you choose, several security mistakes appear repeatedly in real-world applications. Avoiding these will keep your users safe:

  • Storing passwords in plain text or with weak hashing — Always use bcrypt, scrypt, or Argon2id with appropriate cost factors. Never use MD5 or SHA-256 for passwords.
  • Missing rate limiting on login endpoints — Without rate limiting, attackers can brute-force credentials. Implement exponential backoff and account lockout policies.
  • Overly permissive CORS configuration — Setting Access-Control-Allow-Origin: * with credentials enabled is a major vulnerability. Whitelist specific origins.
  • Not validating redirect URIs — Open redirect vulnerabilities in OAuth flows can lead to token theft. Always validate redirect URIs against a strict whitelist.
  • Exposing detailed error messages — “Invalid password” tells an attacker the username exists. Use generic messages like “Invalid credentials” instead.
  • Ignoring token storage security — Storing tokens in localStorage exposes them to XSS. Use HttpOnly cookies or in-memory storage with refresh token rotation.

Real-Time Authentication Considerations

Modern web applications increasingly rely on real-time features such as live notifications, chat, and collaborative editing. When using WebSockets or Server-Sent Events, authentication requires special handling. Unlike HTTP requests, WebSocket connections are long-lived, so you must authenticate during the initial handshake and implement mechanisms to terminate connections when sessions expire or tokens are revoked.

A common pattern is to authenticate the WebSocket handshake using a short-lived JWT or session cookie, and then periodically verify that the user’s session remains valid throughout the connection lifetime.
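A minimal sketch of that pattern follows. The extractToken helper and the one-minute recheck interval are illustrative, and the actual verify step would reuse your JWT middleware or session store:

```javascript
// Pull a token from the WebSocket handshake URL, e.g. wss://host/ws?token=...
function extractToken(requestUrl) {
  const url = new URL(requestUrl, 'http://placeholder');
  return url.searchParams.get('token');
}

// Sketch of the connection lifecycle (WebSocket server API elided):
//
//   server.on('upgrade', (req, socket) => {
//     const token = extractToken(req.url);
//     if (!token || !verifyToken(token)) return socket.destroy();
//     // ...complete the handshake, then re-check periodically:
//     const timer = setInterval(() => {
//       if (!sessionStillValid(token)) ws.close(4001, 'session expired');
//     }, 60000);
//   });
```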

FAQ

What is the difference between OAuth 2.0 and OpenID Connect?

OAuth 2.0 is an authorization framework that allows applications to access resources on behalf of a user, but it does not verify user identity. OpenID Connect (OIDC) is an identity layer built on top of OAuth 2.0 that adds authentication. OIDC introduces ID tokens (JWTs containing user identity claims), a UserInfo endpoint, and standardized scopes like “openid” and “profile.” If your application needs to know who a user is (authentication), use OIDC. If it only needs to access their resources (authorization), OAuth 2.0 alone is sufficient.

Are JWTs secure enough for production applications?

JWTs are secure when implemented correctly, but they require careful handling. Use short expiration times (15-30 minutes for access tokens), store tokens in HttpOnly cookies rather than localStorage, always validate the signing algorithm explicitly, use strong secret keys (at least 256 bits for HS256), and implement refresh token rotation. The main security concern with JWTs is that they cannot be easily revoked before expiration, so pair them with a token blacklist in Redis or use very short-lived tokens with refresh token rotation.

Should I implement my own authentication system or use a third-party service?

For most applications, using a proven authentication service like Auth0, Firebase Authentication, or AWS Cognito is the safer and more efficient choice. These services handle complex edge cases such as brute-force protection, multi-factor authentication, token rotation, and compliance requirements. Building custom authentication makes sense only when you have specific requirements that third-party services cannot meet, when you have dedicated security expertise on your team, or when data sovereignty regulations prevent using external identity providers.

Can passkeys fully replace passwords today?

Passkeys are technically capable of replacing passwords, and major platforms (Apple, Google, Microsoft) are actively promoting passwordless authentication. However, full replacement is not practical for most applications yet. Some users have older devices without biometric support, corporate environments may have policies restricting passkey use, and cross-platform syncing still has edge cases. The recommended approach is to offer passkeys as a primary login option while maintaining password-based authentication as a fallback, allowing users to migrate at their own pace.

How do I handle authentication in a microservices architecture?

In a microservices architecture, the most common approach is to use an API gateway or dedicated authentication service that issues JWTs after validating user credentials. Each microservice then independently verifies the JWT signature without needing to contact the auth service. Use asymmetric signing (RS256) so that services only need the public key to verify tokens, not the private signing key. For service-to-service communication, use separate machine-to-machine tokens with limited scopes. Centralize user session management in a shared store like Redis when you need session revocation capabilities across all services.