SRT vs RTMP: Which one to choose for your livestream?

This blog will compare RTMP and SRT to help you choose the best option when you want to stream live video with minimal latency.

Jean-Baptiste - Video Lead
May 17, 2024

If you're seeking to stream live video with minimal end-to-end latency, the options for transport protocols are relatively few. This is particularly true if you're using the public internet as a transport medium, with its inherent challenges of packet loss and jitter. In this blog, we will compare two widely used protocols, RTMP and SRT, to help you decide.

Streaming with RTMP

RTMP, a protocol established roughly 20 years ago, initially served as a closed system for streaming audio and video between Macromedia's Flash player and its media server. It was eventually published as an open specification, extending its reach beyond the Flash ecosystem and becoming a dominant ingest protocol in the streaming landscape.

However, RTMP suffers from a notable issue known as head-of-line blocking, which stems from its reliance on the Transmission Control Protocol (TCP). When audio and video packets arrive out of sequence or are lost due to the inherent unreliability of networks, TCP reorders the packets and requests re-transmissions to compensate for the loss. It insists on perfect, in-order delivery, which can impede the real-time transmission of audio and video streams.
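This blocking behavior can be sketched with a toy simulation (illustrative only; real TCP is far more involved). The receiver may buffer out-of-order packets, but it can only hand them to the application in sequence, so a single late retransmission stalls every packet queued behind it:

```python
# Toy simulation of TCP head-of-line blocking (illustrative, not real TCP).

def deliver_in_order(arrivals):
    """arrivals: list of (arrival_time_ms, seq) pairs.
    Returns {seq: time_ms at which the packet reaches the application}."""
    buffered = {}
    delivered = {}
    next_seq = 0
    for time_ms, seq in sorted(arrivals):
        buffered[seq] = time_ms
        # Flush everything now contiguous with what was already delivered.
        while next_seq in buffered:
            delivered[next_seq] = time_ms  # blocked packets are released together
            next_seq += 1
    return delivered

# Packet 1 is lost and only retransmitted at t=400 ms. Packets 2-4 arrived
# long before, yet none of them reach the player until packet 1 shows up.
print(deliver_in_order([(0, 0), (400, 1), (40, 2), (60, 3), (80, 4)]))
# → {0: 0, 1: 400, 2: 400, 3: 400, 4: 400}
```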

This lack of timing guarantees over TCP translates into unstable latency during RTMP streaming, as the protocol automatically adapts to fluctuating network conditions without any user control.

What does it mean for an end user watching a livestream?

  • A sudden increase in latency may prevent the server from streaming data in real time, leading to buffering.
  • If the latency then decreases, allowing the server to catch up, the end user may still lag behind the livestream due to the earlier buffering. While the player may speed up playback to reach the live edge, recurring latency spikes will cause repeated buffering interruptions.

This undesirable outcome underlines the importance of finding alternatives to RTMP for delivering uninterrupted live streams in fluctuating bandwidth conditions.

Streaming with SRT

Now let us see how SRT is different.

SRT takes a different approach. It relies on the User Datagram Protocol (UDP), which neither enforces packet re-ordering nor manages packet loss, and is therefore not subject to TCP's head-of-line blocking. Re-ordering and loss recovery are still needed for a good playback experience, however, so SRT handles them itself, using an alternative method.

It does this by letting the streamer choose the maximum latency they can tolerate when streaming. The SRT server then holds back data for that amount of time, re-ordering the packets and requesting re-transmission of any missing ones. If a packet has not arrived by the end of this delay, it is considered lost, and it is up to the server to deal with that.
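A minimal sketch of this latency window (illustrative; not how libsrt is actually implemented): every packet is released a fixed `latency_ms` after it was sent, so packets arriving within the window are re-ordered, and anything later is written off to keep delivery latency constant.

```python
# Toy sketch of SRT's receive-side latency window (illustrative only).

def srt_receive(arrivals, send_times, latency_ms):
    """arrivals: {seq: arrival_time_ms} (missing seq = never received).
    send_times: {seq: send_time_ms}. Returns (played, lost) seq lists."""
    played, lost = [], []
    for seq in sorted(send_times):
        deadline = send_times[seq] + latency_ms
        arrived_at = arrivals.get(seq)
        if arrived_at is not None and arrived_at <= deadline:
            played.append(seq)  # arrived (possibly retransmitted) in time
        else:
            lost.append(seq)    # window expired: give up and move on
    return played, lost

send_times = {0: 0, 1: 20, 2: 40, 3: 60}
arrivals = {0: 30, 1: 200, 2: 70, 3: 90}  # packet 1's retransmit came too late
print(srt_receive(arrivals, send_times, latency_ms=120))
# → ([0, 2, 3], [1])
```

Note how packet 1 is simply dropped rather than stalling packets 2 and 3 behind it, which is exactly the opposite trade-off to TCP's.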

By doing so, SRT makes it possible to stream data to the server with stable latency, which limits buffering to a large extent.

Now of course, this comes at a price. If unexpected congestion occurs on the network, viewers may notice playback issues like visual artifacts, audio glitches, or black screens. Playback will nevertheless be smoother, which is why the latency target you choose plays a very important role in how your video is delivered.

  • If you set the latency too low, the viewer experience may be deeply impacted, or the stream may not work at all.
  • If you set the latency too high, the viewer experience will be optimal, but the end-to-end latency will be high, which may be a problem for the streamer.

Therefore, finding a good compromise between latency and stream quality is essential. The table below summarizes the trade-offs.

|                            | End-to-end latency                          | Visual/audio quality        | Playback reliability              |
|----------------------------|---------------------------------------------|-----------------------------|-----------------------------------|
| RTMP                       | Close to optimal but subject to instability | Optimal                     | Fair but subject to buffering     |
| SRT with very low latency  | Very low                                    | Very poor (too many losses) | Good (no buffering due to ingest) |
| SRT with good latency      | Low but not optimal                         | Almost good                 | Good (no buffering due to ingest) |
| SRT with very high latency | Very high                                   | Optimal                     | Good (no buffering due to ingest) |
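This trade-off can be made concrete with a toy model (all numbers are assumed for illustration, not measurements): with a fixed round-trip time, a dropped packet is recovered only if one retransmission fits inside the latency window, so enlarging the window converts losses into slightly later, but playable, packets.

```python
# Toy model of the SRT latency/quality trade-off (assumed numbers only).
import random

def loss_rate(latency_ms, rtt_ms=80, drop_prob=0.05, packets=10_000, seed=1):
    """Fraction of packets still missing when their latency window expires."""
    rng = random.Random(seed)
    lost = 0
    for _ in range(packets):
        dropped = rng.random() < drop_prob   # first transmission dropped?
        if dropped and rtt_ms > latency_ms:  # retransmission misses the window
            lost += 1
    return lost / packets

for latency in (40, 120, 400):
    print(f"latency {latency:>3} ms -> loss rate {loss_rate(latency):.3f}")
```

With an 80 ms round trip, a 40 ms window loses roughly the raw drop rate, while any window larger than one round trip recovers everything, at the cost of that much extra end-to-end delay.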

Closing thoughts

So, as you can see, there is no straightforward answer to whether you should choose SRT or RTMP for your livestreams. If you stream in stable conditions over a wired network, with a low risk of congestion or jitter, you can start with RTMP and will probably achieve better latency than you would with SRT.

If the network you use is known to be unstable, or your users frequently stream over wireless networks, then SRT may be a better fit.

With api.video, you can choose either RTMP or SRT for your livestream. Find more details about how to live stream with api.video in our documentation.

If you haven’t live streamed with api.video yet and would like to start your journey with us, sign up for a free sandbox account now!
