Realtime Streaming

1. Request streaming from devices.

Devices are designed to be accessed via the official server: by default, a device registers itself with the server once it is connected to the internet, either through a wireless LAN (Wi-Fi) or a 4G/5G cellular network. Clients that want to request realtime video streaming should connect to the server and join the same room as the target device, following the steps below:

a) Connect to the server with an arbitrary client implementation (listed here). b) Emit a "find" event to join the same room as the target device.

const socket = io(""); // server URL omitted here
const device = "tj01";
socket.emit("find", device);
c) Write event handlers for the "create" and "join" events, emitting the "auth" and "heartbeatping" events in both.
socket.on("create", () => {
    socket.emit("auth");          // auth payload omitted
    socket.emit("heartbeatping");
});
socket.on("join", () => {
    socket.emit("auth");          // auth payload omitted
    socket.emit("heartbeatping");
});
d) Write an event handler for the "heartbeatpong" event to keep the streaming alive.
let heartbeat_timer;
socket.on("heartbeatpong", () => {
    if (heartbeat_timer) {
        clearTimeout(heartbeat_timer);
    }
    heartbeat_timer = setTimeout(() => {
        socket.emit("heartbeatping"); // ping again to keep the stream alive
    }, 10000);
});
e) Write an event handler for the "bridge" event, which carries the streaming path as its argument.
socket.on("bridge", async (pathid) => {
    // Give the server a moment to publish, then use `pathid`
    // to build the streaming URI (see the next section).
    setTimeout(() => {
        // start playback with the URI built from `pathid`
    }, 3000);
});

2. Read the realtime streaming.

If everything went well in the previous section, we have obtained the path of the realtime stream, which can be read directly or embedded into HTML tags. We provide several protocols for reading the stream, including RTSP, HLS, RTMP and WebRTC, all of which support SSL/TLS encryption. All protocols generally share the same host and path but use different schemes and ports. As mentioned in the code block of part e) of the previous section, the streaming URIs are in the form of
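The exact URI layout depends on your deployment. As a minimal sketch, assuming a Mediamtx backend with its default ports (8554 for RTSP, 1935 for RTMP, 8888 for HLS) and an index.m3u8 HLS playlist suffix, all of which are assumptions rather than guarantees of this document, the per-protocol URIs could be derived from one host and path like this:

```javascript
// Sketch only: derive per-protocol URIs from a shared host and path.
// Ports and the HLS playlist name are Mediamtx defaults and may differ
// in your deployment.
function streamUris(host, pathid) {
    return {
        rtsp: `rtsp://${host}:8554/${pathid}`,
        rtmp: `rtmp://${host}:1935/${pathid}`,
        hls: `https://${host}:8888/${pathid}/index.m3u8`,
    };
}
```

For example, `streamUris("example.com", "tj01").hls` yields an HLS URI for the device path requested earlier.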

It is considered reliable to embed the HLS streaming URI (the latter above) in a <video /> tag or via a third-party library such as hls.js, while an RTSP URI usually needs extra tricks to play in modern browsers (Firefox/Chrome/Edge/Safari). If you are not limited to the web platform, say you are developing a mobile app, you can always call a native or external media player (or its API) to consume either URI as you like.
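The HLS embedding path above can be sketched as follows. This is not the library's prescribed setup, just one common pattern; the video element lookup and the global Hls object (provided by a loaded hls.js script) are assumptions about your page.

```javascript
// Sketch: attach an HLS URI to a <video> element, preferring native
// playback and falling back to hls.js where the browser lacks it.
function attachHls(video, src, Hls) {
    if (video.canPlayType("application/vnd.apple.mpegurl")) {
        // Safari and iOS browsers play HLS natively.
        video.src = src;
        return "native";
    }
    if (Hls && Hls.isSupported()) {
        // Other browsers play HLS through Media Source Extensions.
        const hls = new Hls();
        hls.loadSource(src);
        hls.attachMedia(video);
        return "hlsjs";
    }
    return "unsupported";
}
```

In a page you would call it as `attachHls(document.querySelector("video"), uri, window.Hls)` once hls.js has loaded.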

3. Graph

flowchart LR
    A([USB Camera])
    B([localhost Mediamtx])
    C([H264 MP4 in SDcard])
    D([Remote Mediamtx])
    E([OnScreen Playback])
    A -->|FFmpeg tee muxer| C;
    A -->|FFmpeg tee muxer| B;
    B -->|SocketIO requested?| D;
    B -->|gd5_9| E;
flowchart LR
    B([Socket.IO server]);
    C([GDPF -- Web frontend]);
    A --> |keep connection| B;
    C --> |connect on request| B;
    C --> |request streaming| A;