If you've ever done any live streaming, you're probably familiar with streaming protocols, particularly RTMP, the most common protocol for live streaming. However, a newer streaming protocol is creating a buzz in the streaming world: SRT. So, what exactly is SRT?
SRT stands for Secure Reliable Transport, a streaming protocol developed by Haivision. To illustrate what a streaming protocol does, consider an example. When you open YouTube Live to watch a stream, your PC sends a "request to connect" to the server. Upon acknowledging the request, the server returns the video data in segments, which your PC decodes and plays back as it arrives. A streaming protocol, in essence, is the common language that the two devices must share for seamless video streaming. Each protocol has its pros and cons, and RTMP, RTSP, HLS, and SRT are among the most prominent protocols used in video streaming.
To understand the pros and cons of SRT and its features, we should first compare it with RTMP. RTMP (Real-Time Messaging Protocol) is a mature, well-established streaming protocol with a reputation for reliability thanks to its TCP-based packet retransmission and adjustable buffers. RTMP remains the most commonly used streaming protocol, but it has not been updated since 2012, so it may well be replaced by SRT.
Most importantly, SRT handles problematic network conditions better than RTMP. Streaming RTMP over unreliable, low-bandwidth networks can cause issues such as buffering and pixelation in your live stream. SRT requires less bandwidth and recovers from data errors faster. As a result, your viewers experience a better stream, with less buffering and pixelation.
Compared to RTMP, SRT streaming also provides lower latency. According to a white paper published by Haivision (https://www.haivision.com/resources/white-paper/srt-versus-rtmp/), in the same test environment SRT's latency was 2.5 to 3.2 times lower than RTMP's, a substantial improvement. In the diagram below, the blue bars represent SRT's latency and the orange bars RTMP's (tests were carried out across four different geographical routes, such as Germany to Australia and Germany to the US).
Besides its low latency, SRT can keep transmitting over a poorly performing network. The SRT protocol has built-in functions that minimize the adverse effects of fluctuating bandwidth, packet loss, and so on, maintaining the integrity and quality of the video stream even on unpredictable networks.
In addition to ultra-low latency and resilience to changing network conditions, SRT brings other advantages. Because you can send video over unpredictable public networks, expensive dedicated links are no longer needed, which keeps your service cost competitive. In other words, you can have interactive, duplex communication anywhere with Internet access. As a video streaming protocol, SRT can packetize MPEG-2, H.264, and HEVC video data, and its built-in AES encryption ensures data privacy.
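To make the packetizing idea concrete, here is a minimal sketch in Python. It assumes the common case where SRT carries an MPEG-TS multiplex: MPEG-TS packets are 188 bytes each (starting with the 0x47 sync byte), and SRT's default live-mode payload of 1316 bytes holds exactly seven of them. This is a simplified illustration, not the real SRT library.

```python
# Simplified sketch: grouping MPEG-TS packets into SRT-sized payloads.
# MPEG-TS packets are 188 bytes each; SRT's default live-mode payload
# of 1316 bytes holds exactly seven of them (7 * 188 = 1316).
TS_PACKET_SIZE = 188
TS_PER_SRT_PAYLOAD = 7
SRT_PAYLOAD_SIZE = TS_PACKET_SIZE * TS_PER_SRT_PAYLOAD  # 1316 bytes


def packetize(ts_stream: bytes) -> list[bytes]:
    """Split a raw MPEG-TS byte stream into SRT-sized payloads."""
    return [
        ts_stream[offset:offset + SRT_PAYLOAD_SIZE]
        for offset in range(0, len(ts_stream), SRT_PAYLOAD_SIZE)
    ]


# 21 dummy TS packets, each starting with the 0x47 sync byte.
stream = (b"\x47" + b"\x00" * 187) * 21
payloads = packetize(stream)
print(len(payloads))     # three payloads of seven TS packets each
print(len(payloads[0]))  # 1316 bytes
```

In a real deployment an encoder produces the TS stream and a library such as libsrt handles payload framing, encryption, and delivery; the point here is simply that SRT moves fixed-size chunks of an existing container format rather than defining its own video format.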
SRT is designed for many different types of video transmission. Imagine a densely packed conference hall where everyone contends for the same Internet connection. Sending video back to the production studio over such a busy network, you can expect degraded quality and frequent packet loss. In this situation SRT is very effective at averting these issues and delivering high-quality video to the destination.
Or consider organizations with multiple sites, such as school districts and churches. When streaming video between locations, any noticeable latency makes for an unpleasant viewing experience and can cost both time and money. With SRT, you can create reliable, high-quality video streams between different locations.
If you are hungry for knowledge and would like to know more about these advantages, the next few paragraphs provide detailed explanations. If you already know the details, or are simply not interested, feel free to skip ahead.
One key difference between RTMP and SRT is the absence of timestamps in RTMP's stream packet headers. RTMP carries only the timestamps of the stream itself, derived from its frame rate; the individual packets contain no timing information, so the RTMP receiver must hand each received packet to the decoder within a fixed time interval. To smooth out differences in how long individual packets take to travel, large buffers are required.
SRT, on the other hand, includes a timestamp in each individual packet. This lets the receiver recreate the timing characteristics of the original signal and dramatically reduces the need for buffering: the bitstream leaving the receiver looks exactly like the stream that entered the SRT sender. Another significant difference is how packet retransmission is implemented. SRT identifies an individual lost packet by its sequence number; if the delta between sequence numbers is more than one, a retransmission of the missing packet is triggered. Only that particular packet is sent again, keeping latency and overhead low.
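The selective-retransmission idea can be sketched in a few lines of Python. This is a simplified model, not the actual SRT implementation (which also handles sequence-number wraparound, timers, and loss-report packets): the receiver tracks the next expected sequence number, and whenever an arriving packet jumps ahead, every skipped number is recorded for a retransmission request covering only those packets.

```python
# Simplified sketch of SRT-style loss detection by sequence number.
# Real SRT adds sequence wraparound, retransmission timers, and
# periodic loss reports; this model shows only the core idea.
class SrtLikeReceiver:
    def __init__(self) -> None:
        self.expected_seq = 0
        self.nak_list: list[int] = []  # packets to request again
        self.received: dict[int, bytes] = {}

    def on_packet(self, seq: int, payload: bytes) -> None:
        if seq > self.expected_seq:
            # Gap detected: every skipped number is a lost packet,
            # and only those packets are asked for again.
            self.nak_list.extend(range(self.expected_seq, seq))
        self.received[seq] = payload
        self.expected_seq = max(self.expected_seq, seq + 1)


receiver = SrtLikeReceiver()
for seq in [0, 1, 2, 4, 5, 8]:  # packets 3, 6, and 7 were lost
    receiver.on_packet(seq, b"data")
print(receiver.nak_list)        # [3, 6, 7]
```

Because only the specific missing packets are requested, the sender avoids the blanket retransmissions and growing buffers that a TCP-based protocol like RTMP relies on, which is how SRT keeps both latency and overhead low.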
For more technical details, visit Haivision's official website and download their technical overview (https://www.haivision.com/blog/all/excited-srt-video-streaming-protocol-technical-overview/).
After seeing so many advantages of SRT, let's look at its limitations. Apart from Wowza, most major real-time streaming platforms do not yet support SRT, so you probably cannot take advantage of its features from the client end just yet. However, as more companies and individual users adopt SRT, it is expected to become the future video streaming standard.
As mentioned before, SRT's greatest strength is its low latency, but other factors in the streaming workflow, such as network bandwidth, device codecs, and monitors, can also introduce latency and degrade the viewing experience. SRT by itself does not guarantee low latency; the network environment and your streaming devices must be taken into account as well.