I recently discovered that PostgreSQL supports alerting clients when “something” happens, a bit like a message queue, using two commands: NOTIFY and LISTEN.
Here’s the description of the NOTIFY command from the PostgreSQL documentation.
The NOTIFY command sends a notification event together with an optional “payload” string to each client application that has previously executed LISTEN channel for the specified channel name in the current database. Notifications are visible to all users.
So, you can use the NOTIFY command to send messages to any connections to the database that are listening on a given channel.
Here’s the description for the LISTEN command from the PostgreSQL documentation.
LISTEN registers the current session as a listener on the notification channel named channel. If the current session is already registered as a listener for this notification channel, nothing is done.
Whenever the command NOTIFY channel is invoked, either by this session or another one connected to the same database, all the sessions currently listening on that notification channel are notified, and each will in turn notify its connected client application.
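For example, you can try this directly in two psql sessions (a quick sketch; the channel name foo is arbitrary):

-- session 1
LISTEN foo;

-- session 2
NOTIFY foo, 'hello';

-- session 1 then prints something like:
-- Asynchronous notification "foo" with payload "hello" received from server process with PID 1234.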
Let's see how we can use that in code with a small Node.js example.
Small Example
Let's initialize our project.
mkdir psql-notify
cd psql-notify
npm init -y
npm i postgres
Then let’s run a Postgres instance using Docker.
docker run --name pg -d -e POSTGRES_PASSWORD=password -p 5432:5432 postgres:15-alpine
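The connection strings below point at a database named misc, which the Docker image doesn't create by default, so let's create it once the container is up (assuming the container name pg from the command above):

docker exec pg psql -U postgres -c "CREATE DATABASE misc;"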
First, we will write our listener in a file named listen.js.
const postgres = require("postgres");

const sql = postgres("postgres://postgres:password@localhost:5432/misc");

async function main() {
  // Subscribe to the "foo" channel; the callback runs for every notification received on it.
  await sql.listen("foo", (payload) => {
    console.log({ payload });
  });
}

main();
Then we can write our notifier in a file named notify.js.
const postgres = require("postgres");
const sql = postgres("postgres://postgres:password@localhost:5432/misc");

async function main() {
  // Send a notification on the "foo" channel with the payload 'bar'.
  await sql`NOTIFY foo, 'bar'`;
  // Close the connection so the script can exit.
  await sql.end();
}

main();
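The postgres package also provides a helper that does the same thing, which we'll use later in the server example:

await sql.notify("foo", "bar");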
Run node listen.js in a terminal, and node notify.js in another terminal. You’ll see { payload: 'bar' } printed in the first terminal.
This opens up interesting possibilities, like pushing real-time database updates to application clients (similar to Firebase’s Firestore), or building a Pub/Sub system.
For the former, we could make our lives easier by adding the NOTIFY command to a trigger on INSERT/UPDATE/DELETE.
Let’s see how we can do that.
Social Network
As an example, we will build a small social network where posters see new posts in real time.
Here’s the CREATE TABLE statement for the post table.
CREATE TABLE post (id SERIAL PRIMARY KEY, content TEXT);
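To run it against the misc database, you can use psql inside the container, for example:

docker exec pg psql -U postgres -d misc -c "CREATE TABLE post (id SERIAL PRIMARY KEY, content TEXT);"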
Now we can set up our web server. Let's install Express, Socket.IO, and the postgres package.
mkdir realtime-posts && cd realtime-posts && npm init -y && npm i express socket.io postgres
Create a main.js file with the following content.
const express = require('express');
const { createServer } = require('node:http');
const { join } = require('node:path');
const { Server } = require('socket.io');
const postgres = require("postgres");

const sql = postgres("postgres://postgres:password@localhost:5432/misc");

const app = express();
app.use(express.json());

const server = createServer(app);
const io = new Server(server);

app.get('/', (_, res) => {
  res.sendFile(join(__dirname, 'index.html'));
});

app.post("/post", async (req, res) => {
  console.log("post", req.body);
  // The postgres package handles query parameters, so this is safe from SQL injection.
  await sql`INSERT INTO post (content) VALUES (${req.body.post});`;
  // Notify every listener on the "post" channel, with the post content as the payload.
  await sql.notify("post", req.body.post);
  return res.sendStatus(200);
});

io.on('connection', (socket) => {
  console.log('a user connected');
});

async function listener() {
  // Forward every "post" notification from Postgres to all connected Socket.IO clients.
  await sql.listen("post", payload => {
    io.emit("post_created", payload);
  });
}
listener();

server.listen(3000, () => {
  console.log('server running at http://localhost:3000');
});
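Once the server is running with node main.js, you can sanity-check the /post endpoint directly, for example with curl:

curl -X POST http://localhost:3000/post -H "Content-Type: application/json" -d '{"post": "hello"}'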
Then create an index.html file with the following content.
<!DOCTYPE html>
<html lang="en">
  <head>
    <title></title>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1">
  </head>
  <body>
    <h1>Hello, world!</h1>
    <form>
      <input type="text" id="content" placeholder="Post content">
      <button type="submit">Submit</button>
      <ul id="post_list"></ul>
    </form>
    <script src="/socket.io/socket.io.js"></script>
    <script>
      const socket = io();

      // Append every post broadcast by the server to the list.
      socket.on("post_created", msg => {
        const list = document.querySelector("#post_list");
        const li = document.createElement("li");
        li.append(msg);
        list.append(li);
      });

      // Send the post content to the server instead of doing a full-page form submission.
      const form = document.querySelector("form");
      form.onsubmit = async (e) => {
        e.preventDefault();
        const content = document.querySelector("#content").value;
        await fetch("http://localhost:3000/post", {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify({ post: content }),
        });
        return false;
      };
    </script>
  </body>
</html>
Run node main.js and visit http://localhost:3000.
Write a post and click Submit, and you will see posts appear in real time.
While you can do all of this without Postgres’ NOTIFY/LISTEN, that would only work on a single server: if you run more than one replica for scalability, it wouldn’t work, and you would need something like Redis’ Pub/Sub to notify clients connected to the other replicas. This is why Postgres’ NOTIFY/LISTEN is so nice. It’s the easiest, and probably the most efficient, way to get real-time database updates to clients.
Note: We could make this even easier by adding a trigger on INSERT that uses NOTIFY to notify listeners; that way we don’t have to remember to call NOTIFY after every INSERT.
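Here’s a rough sketch of what that trigger could look like (the function and trigger names are made up), using pg_notify, the function form of NOTIFY:

CREATE OR REPLACE FUNCTION notify_new_post() RETURNS trigger AS $$
BEGIN
  PERFORM pg_notify('post', NEW.content);
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER post_insert_notify
AFTER INSERT ON post
FOR EACH ROW EXECUTE FUNCTION notify_new_post();

With this trigger in place, the /post handler in main.js could drop its explicit sql.notify call, since every INSERT into post would notify the post channel automatically.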