REST is request-response. You ask, server answers, connection closes. Works fine for most things.
But chat messages can’t wait for you to ask. Stock prices need to push the moment they change. Notifications should appear, not be polled for.
Two options: WebSockets and Server-Sent Events. Different tools, different trade-offs.
## The Quick Answer
WebSockets: Two-way communication. Client and server can both send anytime. Use for chat, games, collaborative editing.
Server-Sent Events (SSE): One-way. Server pushes to client. Use for notifications, live feeds, dashboards.
If you only need server-to-client, use SSE. It’s simpler.
## WebSockets
Full duplex. Once connected, either side can send messages whenever.
```javascript
// Client
const socket = new WebSocket('wss://example.com/socket');

socket.onopen = () => {
  socket.send(JSON.stringify({ type: 'subscribe', channel: 'chat' }));
};

socket.onmessage = (event) => {
  const message = JSON.parse(event.data);
  displayMessage(message);
};

// Send a message
function sendChat(text) {
  socket.send(JSON.stringify({ type: 'chat', text }));
}
```
The connection stays open. Messages flow both directions.
Good for:
- Chat applications
- Multiplayer games
- Collaborative tools (multiple users editing)
- Anything where clients send frequent messages
Annoying parts:
- You handle reconnection yourself
- Some proxies/firewalls don’t like long-lived connections
- More complex server infrastructure
### Connection Lifecycle
A WebSocket connection starts as an HTTP request, then upgrades. The handshake looks like a normal GET request with an Upgrade: websocket header. Once the server agrees, the protocol switches and HTTP is out of the picture.
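The exchange looks roughly like this (key and accept values taken from the RFC 6455 example handshake):

```http
GET /socket HTTP/1.1
Host: example.com
Upgrade: websocket
Connection: Upgrade
Sec-WebSocket-Key: dGhlIHNhbXBsZSBub25jZQ==
Sec-WebSocket-Version: 13

HTTP/1.1 101 Switching Protocols
Upgrade: websocket
Connection: Upgrade
Sec-WebSocket-Accept: s3pPLMBiTxaQ9kYGzzhZRbK+xOo=
```

After the `101 Switching Protocols` response, the TCP connection carries WebSocket frames instead of HTTP.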
This matters because some infrastructure doesn’t handle the upgrade gracefully. Older load balancers, certain CDNs, corporate proxies — they can all interfere. If you’re deploying behind a reverse proxy like Nginx, you need to explicitly configure it to pass WebSocket connections through.
```nginx
location /socket {
    proxy_pass http://backend;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
}
```
Skip this config and your WebSocket connections will silently fail. You’ll spend an hour debugging client code before realizing the problem is infrastructure.
### Reconnection Strategies
Unlike SSE, WebSockets don’t reconnect automatically. You need to build that yourself.
The naive approach is to reconnect immediately on close. Don’t do that. If the server is down, you’ll hammer it with connection attempts and make things worse.
Use exponential backoff instead:
```javascript
function createSocket(onMessage) {
  let retryDelay = 1000;
  const maxDelay = 30000;

  function connect() {
    const socket = new WebSocket('wss://example.com/socket');

    socket.onopen = () => {
      retryDelay = 1000; // Reset on successful connection
    };

    // Re-attach handlers on every new socket, not just the first one
    socket.onmessage = onMessage;

    socket.onclose = (event) => {
      if (!event.wasClean) {
        setTimeout(connect, retryDelay);
        retryDelay = Math.min(retryDelay * 2, maxDelay);
      }
    };

    return socket;
  }

  return connect();
}
```
Add some jitter to the delay (a random offset) if you have many clients. Otherwise they all reconnect at the exact same moment and overwhelm the server. Thundering herd problems are real and they’re not fun to debug in production.
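One simple way to add jitter is "full jitter": pick a random delay anywhere up to the exponential cap, so clients spread out naturally. A minimal sketch (the constants are illustrative, not prescriptive):

```javascript
// Backoff with full jitter: random delay in [0, cap), where cap grows
// exponentially with the attempt count and is clamped to maxDelay.
function nextDelay(attempt, base = 1000, maxDelay = 30000) {
  const cap = Math.min(base * 2 ** attempt, maxDelay);
  return Math.random() * cap;
}
```

Pass the reconnect attempt number instead of doubling a stored delay; the randomness means two clients that dropped at the same instant almost never retry at the same instant.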
## Server-Sent Events
One direction: server to client. Built on regular HTTP.
```javascript
// Client
const events = new EventSource('/api/notifications');

events.onmessage = (event) => {
  const notification = JSON.parse(event.data);
  showNotification(notification);
};

// That's it. The browser handles reconnection automatically.
```
```javascript
// Server (Express)
app.get('/api/notifications', (req, res) => {
  res.setHeader('Content-Type', 'text/event-stream');
  res.setHeader('Cache-Control', 'no-cache');
  res.setHeader('Connection', 'keep-alive');
  res.flushHeaders(); // Send headers now so the client sees the stream open

  // Send data whenever you want
  const send = (data) => {
    res.write(`data: ${JSON.stringify(data)}\n\n`);
  };

  send({ type: 'connected' });

  // Clean up when the client disconnects
  req.on('close', () => {
    // Remove from active connections
  });
});
```
Good for:
- Live notifications
- Activity feeds
- Dashboards and metrics
- Stock tickers and exchange rate feeds
- Any “subscribe and receive updates” pattern
Why it’s nice:
- Automatic reconnection built in
- Works through most proxies
- Just HTTP — nothing special to deploy
- Simpler server code
### SSE Event Types and IDs
SSE supports more than just raw data. You can send named events, event IDs, and retry intervals. Most tutorials skip this, but it’s what makes SSE genuinely robust.
```javascript
// Server: named events with IDs
function sendEvent(res, type, data, id) {
  if (id) res.write(`id: ${id}\n`);
  res.write(`event: ${type}\n`);
  res.write(`data: ${JSON.stringify(data)}\n\n`);
}

sendEvent(res, 'price-update', { symbol: 'BTC', price: 43200 }, '1001');
sendEvent(res, 'notification', { text: 'New follower' }, '1002');
```

```javascript
// Client: listen for specific event types
const events = new EventSource('/api/feed');

events.addEventListener('price-update', (e) => {
  updateTicker(JSON.parse(e.data));
});

events.addEventListener('notification', (e) => {
  showToast(JSON.parse(e.data));
});
```
Event IDs are especially powerful. When the connection drops and the browser reconnects, it sends a Last-Event-ID header. Your server can use this to replay missed events. The client gets caught up without any custom logic on the frontend.
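A replay mechanism can be as simple as a bounded in-memory buffer of recent events (the buffer and helper names here are an assumption for illustration; a real app might keep this in Redis or an event log):

```javascript
// Keep the last N events so reconnecting clients can catch up.
const recent = [];        // [{ id, type, data }, ...] newest last
const MAX_BUFFER = 1000;

function record(event) {
  recent.push(event);
  if (recent.length > MAX_BUFFER) recent.shift();
}

// Everything after the client's last seen ID; unknown ID replays the whole buffer.
function eventsSince(lastId) {
  const idx = recent.findIndex(e => e.id === lastId);
  return idx === -1 ? recent.slice() : recent.slice(idx + 1);
}

// In the SSE route handler:
// const missed = eventsSince(req.headers['last-event-id']);
// missed.forEach(e => sendEvent(res, e.type, e.data, e.id));
```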
The retry field tells the browser how long to wait before reconnecting (in milliseconds). Set it once at the start of the stream:
```javascript
res.write('retry: 5000\n\n'); // Reconnect after 5 seconds
```
This level of built-in resilience is why SSE is underrated. You get reconnection, event replay, and client retry timing control without installing a single library.
## Side by Side
| Aspect | WebSockets | SSE |
|---|---|---|
| Direction | Both ways | Server to client |
| Protocol | Custom (ws://) | Regular HTTP |
| Reconnection | Manual | Automatic |
| Binary data | Yes | Text only |
| Complexity | Higher | Lower |
| Proxy support | Sometimes problematic | Usually fine |
| Max connections | Browser-limited (varies) | 6 per domain (HTTP/1.1) |
| Event replay | Build it yourself | Built-in with Last-Event-ID |
That last row on SSE connections per domain is worth noting. HTTP/1.1 browsers typically limit you to six concurrent connections per domain. If a user has multiple tabs open, each with an SSE connection, you can hit that limit fast. HTTP/2 multiplexes streams over a single connection, which largely eliminates this problem. Make sure your server supports HTTP/2.
## When Polling Is Actually Fine
Real-time tech has overhead. Sometimes polling is the right call:
- Updates are infrequent (minutes apart)
- You can’t maintain persistent connections
- Simplicity matters more than immediacy
Polling a weather API every 30 seconds? Totally reasonable. Don’t overcomplicate it.
### Short Polling vs Long Polling
Standard polling hits an endpoint on a timer. Simple, predictable, wasteful if nothing changes.
Long polling is the middle ground. The client sends a request, the server holds it open until there’s new data (or a timeout), then responds. The client immediately sends another request.
```javascript
// Long polling client
let lastEventId = 0;

async function longPoll() {
  try {
    const response = await fetch('/api/updates?since=' + lastEventId);
    const data = await response.json();
    handleUpdate(data);
    lastEventId = data.id;
  } catch (err) {
    await new Promise(r => setTimeout(r, 3000)); // Back off on error
  }
  longPoll(); // Immediately poll again
}
```
Long polling gives you near-real-time behavior without persistent connections. It’s what Slack used for years before switching to WebSockets. If it was good enough for Slack at scale, it might be good enough for your side project.
## Combining SSE with REST
You don’t have to pick one protocol for everything. Common pattern:
- SSE for receiving updates (server → client)
- REST for sending actions (client → server)
```javascript
// Receive notifications via SSE
const events = new EventSource('/api/feed');
events.onmessage = handleUpdate;

// Send actions via REST
async function postComment(text) {
  await fetch('/api/comments', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ text })
  });
  // Server broadcasts the new comment to all SSE clients
}
```
Simpler than WebSockets, covers most use cases.
## Real-World Use Cases
Theory is nice. Here’s how these protocols play out in actual applications.
Live dashboards and monitoring. SSE is the natural fit. Your server aggregates metrics — API response times, error rates, active users — and pushes updates every few seconds. The dashboard just listens. No bidirectional communication needed. If you’re pulling data from external sources like IP lookups or DNS checks, your server handles the polling and your clients get a smooth, live-updating display.
Chat and messaging. WebSockets, obviously. Both sides need to send messages freely. Typing indicators, read receipts, presence status — all of these require fast bidirectional communication. The overhead of WebSocket infrastructure is justified here.
Collaborative editing. Think Google Docs. Multiple users editing the same document means conflict resolution, cursor positions, and operational transforms all flowing in real time. WebSockets are non-negotiable for this. The latency requirements are tight — even 200ms of delay feels laggy when you’re typing.
Notifications and activity feeds. SSE. A user sits on the page and new items appear: “John commented on your post,” “Your build passed,” “New order received.” Server pushes, client renders. That’s it.
Live sports scores, auction bidding, stock tickers. These are high-frequency update scenarios where many users consume the same stream. SSE works well here because you can fan out a single data source to thousands of read-only clients efficiently.
## Security Considerations
Persistent connections introduce security concerns that don’t exist with stateless REST.
Authentication on connect. For WebSockets, you can’t send custom headers during the handshake from browser JavaScript. Common workaround: authenticate via a short-lived token passed as a query parameter, or authenticate over REST first and use a session cookie that the WebSocket handshake picks up.
```javascript
// Generate a short-lived token via REST
const { token } = await fetch('/api/ws-token', {
  headers: { 'Authorization': 'Bearer ' + jwt }
}).then(r => r.json());

// Connect the WebSocket with the token
const socket = new WebSocket(`wss://example.com/socket?token=${token}`);
```
For SSE, authentication is simpler — since it’s regular HTTP, cookies and headers work normally. The EventSource API sends cookies automatically. If you need a custom header, you’ll need to use fetch with a readable stream instead of EventSource, which is a bit more work but gives you full control.
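The "more work" is mostly parsing the `text/event-stream` framing yourself. A minimal parser sketch (the fetch wiring in the comments is an assumption; `EventSource` does all of this for you when you don't need custom headers):

```javascript
// Parse one or more complete SSE frames (separated by blank lines) into
// objects like { event: 'price-update', data: '{"p":1}', id: '7' }.
function parseSSE(chunk) {
  return chunk.split('\n\n').filter(Boolean).map((frame) => {
    const out = {};
    for (const line of frame.split('\n')) {
      const i = line.indexOf(':');
      if (i === -1) continue; // ignore malformed/comment lines in this sketch
      const field = line.slice(0, i);
      const value = line.slice(i + 1).trimStart();
      // Per the spec, repeated data: lines accumulate with newlines
      out[field] = field === 'data' && out.data ? out.data + '\n' + value : value;
    }
    return out;
  });
}

// Usage sketch (endpoint and header are assumptions):
// const res = await fetch('/api/feed', { headers: { Authorization: 'Bearer ' + jwt } });
// for await (const chunk of res.body) { parseSSE(decoder.decode(chunk)).forEach(handle); }
```

A production parser also has to buffer partial frames that span chunk boundaries; this sketch assumes each chunk ends on a frame boundary.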
Origin validation. Always check the Origin header on WebSocket connections. Without it, any website can connect to your WebSocket server if they know the URL. This is the WebSocket equivalent of CSRF.
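An allowlist check is enough; the wiring comment shows one way to hook it up with the popular `ws` package's `verifyClient` option (the origins and setup are assumptions for illustration):

```javascript
// Only accept WebSocket upgrades from origins we actually serve.
const ALLOWED_ORIGINS = new Set([
  'https://example.com',
  'https://app.example.com',
]);

function isAllowedOrigin(origin) {
  return ALLOWED_ORIGINS.has(origin);
}

// With the `ws` package:
// new WebSocketServer({ verifyClient: ({ origin }) => isAllowedOrigin(origin) });
```

Note that non-browser clients can send any Origin they like, so this protects against malicious websites, not malicious clients.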
Message validation. Just because a connection is authenticated doesn’t mean every message is safe. Validate and sanitize every incoming WebSocket message the same way you’d validate REST request bodies. Treat the WebSocket as an untrusted input channel, because it is.
Connection limits. Set maximum connections per user. A single misbehaving client opening hundreds of connections can exhaust server resources. Cap it, monitor it, and drop connections that aren’t sending heartbeats.
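A per-user cap can be a small counter consulted at accept time and decremented on close (the limit and in-memory Map are illustrative; a multi-server deployment would need shared state):

```javascript
const MAX_CONNECTIONS_PER_USER = 5;
const connectionCounts = new Map();

// Call when a connection authenticates; returns false if the user is at the cap.
function tryAccept(userId) {
  const n = connectionCounts.get(userId) || 0;
  if (n >= MAX_CONNECTIONS_PER_USER) return false;
  connectionCounts.set(userId, n + 1);
  return true;
}

// Call from the socket's close handler.
function release(userId) {
  const n = connectionCounts.get(userId) || 0;
  if (n <= 1) connectionCounts.delete(userId);
  else connectionCounts.set(userId, n - 1);
}
```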
## Getting Real-Time Data
You need something to stream. Options:
- Your own data — Database changes, user actions, internal events
- External APIs — Pull from data sources, push to your clients
For the second one, you poll the API and broadcast to connected clients:
```javascript
// Poll external API, push to clients
setInterval(async () => {
  const weather = await fetch('https://api.apiverve.com/v1/weatherforecast?city=NYC', {
    headers: { 'x-api-key': process.env.API_KEY }
  }).then(r => r.json());

  broadcastToClients({ type: 'weather', data: weather.data });
}, 60000); // Every minute
```
You absorb the polling; users get a real-time feel.
## Scaling
One server, simple. Multiple servers, slightly harder.
Problem: User connects to Server A, but the event happens on Server B.
Solution: Pub/sub. Redis is common:
```javascript
// When an event happens (any server)
redis.publish('notifications', JSON.stringify({ userId, data }));

// All servers subscribe
redis.subscribe('notifications');
redis.on('message', (channel, message) => {
  const { userId, data } = JSON.parse(message);
  sendToUser(userId, data); // If they're connected to this server
});
```
Event goes to Redis, all servers hear it, the one with the connection delivers it.
### Scaling WebSocket Servers Specifically
WebSocket connections are stateful. Each connection lives on a specific server process. This creates challenges that stateless HTTP doesn’t have.
Sticky sessions. If you’re using a load balancer, you need sticky sessions (also called session affinity) so that a client’s connection always routes to the same server. Without this, a reconnecting client might land on a different server that has no knowledge of their subscriptions.
Connection limits per process. A single Node.js process can handle tens of thousands of concurrent WebSocket connections, but memory adds up. Each connection holds state — user info, subscriptions, message buffers. Monitor memory usage per process and scale horizontally before you hit limits.
Heartbeats. Connections go stale. Clients lose network, close laptops, walk into tunnels. Your server should send periodic ping frames and close connections that don’t pong back within a reasonable timeout. This prevents resource leaks from zombie connections.
```javascript
// Server-side heartbeat
setInterval(() => {
  wss.clients.forEach((ws) => {
    if (!ws.isAlive) return ws.terminate();
    ws.isAlive = false;
    ws.ping();
  });
}, 30000);

wss.on('connection', (ws) => {
  ws.isAlive = true;
  ws.on('pong', () => { ws.isAlive = true; });
});
```
Graceful shutdown. When deploying new code, you need to drain connections without dropping messages. Send a “reconnect” message to connected clients, wait for them to disconnect, then shut down. Rolling deployments handle this naturally if your reconnection logic is solid.
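A drain sequence can be sketched in a few lines. Here `clients` is any iterable of socket-like objects with `send` and `close` (the message shape and grace period are assumptions):

```javascript
// Tell every connected client to reconnect elsewhere.
function broadcastReconnect(clients) {
  for (const ws of clients) {
    ws.send(JSON.stringify({ type: 'reconnect' }));
  }
}

// Warn clients, wait out a grace period, then close any stragglers.
async function drain(clients, graceMs = 10000) {
  broadcastReconnect(clients);
  await new Promise(resolve => setTimeout(resolve, graceMs));
  for (const ws of clients) ws.close();
}
```

Run `drain` on SIGTERM before the process exits; clients with solid reconnection logic will land on the new deployment's servers.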
## Making the Choice
Start with SSE if:
- Users primarily receive updates
- You want simpler infrastructure
- Automatic reconnection matters
Use WebSockets if:
- High-frequency bidirectional messages
- Binary data needed
- Building games or collaborative editing
Stick with polling if:
- Updates are infrequent
- You’re optimizing for simplicity
- Real-time isn’t actually required
## Keep Reading
- REST vs GraphQL: Which Should You Use?
- Business Day Math Is Harder Than You Think
- 7 SSL Certificate Problems That Kill Trust
Most “real-time” features are actually SSE use cases masquerading as WebSocket projects. Start simple, upgrade if you actually need bidirectional.
For data to stream, check the APIVerve marketplace — weather, currency, stocks. Pull from APIs, push to your users.