Unfortunately, there is always a slight delay when you live stream. This is due to the steps involved in getting your ingested video out to your viewers:
Once your stream is received by the server assigned to process it, it first needs to be transcoded from RTMP to another format. We use the most popular format, HLS (HTTP Live Streaming). In addition, multiple bitrates are output, so for your one incoming stream we make multiple renditions.
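As a rough illustration of that step (not our production pipeline), an open-source tool such as FFmpeg can turn one incoming RTMP stream into several HLS renditions. The ingest URL, bitrates, resolutions and output paths below are assumptions made up for the example.

```python
# Sketch: transcode one incoming RTMP stream into two HLS renditions with FFmpeg.
# Assumes FFmpeg is installed; the ingest URL, bitrates and paths are illustrative only.
import os
import subprocess

INGEST_URL = "rtmp://ingest.example.com/live/stream-key"  # hypothetical ingest endpoint

renditions = [
    {"name": "720p", "size": "1280x720", "v_bitrate": "3000k", "a_bitrate": "128k"},
    {"name": "480p", "size": "854x480",  "v_bitrate": "1400k", "a_bitrate": "96k"},
]

for r in renditions:
    os.makedirs(r["name"], exist_ok=True)
    # One FFmpeg process per rendition: H.264 video + AAC audio,
    # segmented into 4-second HLS chunks with a 6-entry playlist.
    subprocess.Popen([
        "ffmpeg", "-i", INGEST_URL,
        "-c:v", "libx264", "-b:v", r["v_bitrate"], "-s", r["size"],
        "-c:a", "aac", "-b:a", r["a_bitrate"],
        "-f", "hls", "-hls_time", "4", "-hls_list_size", "6",
        f"{r['name']}/index.m3u8",
    ])
```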
HLS uses a master playlist (.m3u8) file that references sub-manifests, each of which references segment files (usually .ts or .mp4). For live streaming, Apple currently recommends 6-second segments. However, that is not a strict requirement, so we try to use shorter 2 or 4 second segments where possible to reduce latency. According to Apple's latest specification, an HLS manifest must contain at least six of those segments.
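To see why shorter segments matter, consider the back-of-the-envelope calculation below. The numbers are illustrative only; real-world latency also includes encoding, upload to the origin and CDN delivery time.

```python
# Rough estimate of the delay contributed by HLS segmenting alone.
# With at least six segments required in the manifest, the segment duration
# sets a floor on how much video the playlist spans, and players typically
# start playback several segments behind the live edge.
MIN_SEGMENTS_IN_MANIFEST = 6  # per Apple's guidance cited above

for segment_seconds in (6, 4, 2):
    manifest_window = segment_seconds * MIN_SEGMENTS_IN_MANIFEST
    print(f"{segment_seconds}s segments -> manifest spans {manifest_window}s of video")

# 6s segments -> manifest spans 36s of video
# 4s segments -> manifest spans 24s of video
# 2s segments -> manifest spans 12s of video
```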
Once a segment is available, that file is copied from the transcoding server to the origin server, from where it can be replicated globally by a content delivery network (CDN).
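The snippet below is a simplified sketch of that hand-off, assuming the origin is S3-compatible object storage reached via the boto3 library; the bucket and key names are made up for the example.

```python
# Sketch: push a newly finished segment, then the refreshed playlist, to an
# S3-compatible origin so the CDN can pull them. Names here are illustrative.
import boto3

s3 = boto3.client("s3")
BUCKET = "live-origin-example"  # hypothetical origin bucket

def publish(local_path: str, key: str, content_type: str, cache_seconds: int) -> None:
    s3.upload_file(
        local_path, BUCKET, key,
        ExtraArgs={
            "ContentType": content_type,
            # Segments are immutable and can be cached for a long time;
            # playlists change every few seconds, so keep their cache short.
            "CacheControl": f"max-age={cache_seconds}",
        },
    )

# Upload the segment first so the playlist never references a file
# that has not yet reached the origin.
publish("720p/segment-00042.ts", "stream-key/720p/segment-00042.ts", "video/mp2t", 86400)
publish("720p/index.m3u8", "stream-key/720p/index.m3u8", "application/vnd.apple.mpegurl", 2)
```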
It is possible to reduce the live streaming delay by using the RTMP protocol for the output too. RTMP is a proprietary protocol developed by Macromedia (now owned by Adobe) and remains a standard method for input. However, RTMP has several problems when used for output.
Hence we use HLS for delivering our live streams. We are also investigating newer protocols, such as SRT.
If you have any more questions about our live streaming, our video CMS or business video hosting, please email [email protected].
Updated: September 25, 2023