New Red5 Pro Release With Rebuffering!

January 20th, 2016 by Chris Allen

We wanted to let you know that we have just released some big updates to the Red5 Pro Server and Streaming SDKs!

I’ve included the release notes, the Server and SDK downloads, and the link to our GitHub repos with the updated examples and sample apps.

We hope this will help overcome some of the issues you were seeing, and we are here if you have any questions!

Best,

Chris

Release Notes:

Server 0.3.0 (Server)

Re-buffering streams – allows for graceful degradation of streams for client connections whose bandwidth is too low to support the publisher’s stream quality (a conceptual sketch of this logic appears after these release notes)

  – Provides messaging to clients about the state of the stream

  – Frame dropping for congested streams

General stabilization of the server and performance gains, including a new threading model

HLS stream improvements

Metadata – stream events now reported for HLS streams over WebSockets

iOS Streaming SDK 0.8.44.1 (SDKs)

Re-buffering

  – drops extremely late packets

  – resets the stream if it becomes too latent

  – notification API from the server on connection state

  – falls back to audio only when there isn’t enough bandwidth for the video

  – playback speed is adjusted automatically to keep the stream in real time

Ability to capture a UIImage from stream upon request

Bug fix for A9-based chipset encoding (iPhone 6S, iPad Pro, etc.)

Memory leaks fixed

Scaling mode – ability to scale video stream view in multiple modes:

  – Scale to fit, Crop to fit, or Stretch to fit

PixelBuffer support – ability to support custom video sources other than the device’s camera (bringing this feature in sync with Android)

Android Streaming SDK 0.8.44.1 (SDKs)

Re-buffering

  – drops extremely late packets

  – resets the stream if it becomes too latent

  – notification API from the server on connection state

  – falls back to audio only when there isn’t enough bandwidth for the video

Ability to capture a Bitmap from stream upon request

Memory leaks fixed

Scaling mode – ability to scale video stream view in multiple modes:

  – Scale to fit, Crop to fit, or Stretch to fit

Bug fix to set the microphone bitrate to 32 kbps by default, plus the ability to set a custom microphone bitrate.
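
In case it helps to picture what the re-buffering behavior described above amounts to on the subscriber side, here is a minimal conceptual sketch in TypeScript. This is not the actual SDK API; the class name, thresholds, and return values are illustrative assumptions based on the notes above.

```typescript
// Conceptual sketch only -- not the Red5 Pro SDK API. The class name,
// thresholds, and return values are illustrative assumptions.

interface MediaPacket {
  timestampMs: number; // presentation time of the packet
  data: Uint8Array;
}

type PacketAction = 'play' | 'drop' | 'reset';

class RebufferingSubscriber {
  playbackRate = 1.0;

  constructor(
    private readonly maxLatenessMs = 500,     // packets later than this get dropped
    private readonly resetThresholdMs = 3000  // reset the stream if we fall this far behind
  ) {}

  /** Decide what to do with an incoming packet given the current playhead position. */
  onPacket(packet: MediaPacket, playheadMs: number): PacketAction {
    const latenessMs = playheadMs - packet.timestampMs;

    if (latenessMs > this.resetThresholdMs) {
      // Too far behind real time: tear down and re-subscribe at the live edge.
      this.playbackRate = 1.0;
      return 'reset';
    }
    if (latenessMs > this.maxLatenessMs) {
      // Extremely late packet: skip it rather than stall playback.
      return 'drop';
    }
    // Slightly behind: nudge playback speed up so the stream drifts back to real time.
    this.playbackRate = latenessMs > 100 ? 1.05 : 1.0;
    return 'play';
  }
}
```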


What YOU Need to Know About HLS: Pros and Cons

January 15th, 2016 by Chris Allen


HLS, or HTTP Live Streaming, is a video streaming format first introduced by Apple in May 2009. It breaks streams into small file-based segments that are made available for download over HTTP, and it is now a widely supported format for viewing streams in almost real time. I say almost only because the protocol, by its very nature, introduces a lot of latency.

We’ve been getting a lot of questions and inbound interest from our customers about HLS. What are the advantages of using it? What are the disadvantages? How does HLS compare to WebRTC? Will Apple approve apps that don’t use HLS for streaming? This post is an attempt to address some of these questions.

Advantages

There are many reasons you would want to use HLS for your live streams, and that’s why we recently added support for it to Red5 Pro.

Ubiquity

First of all, HLS is widely supported. Although originally conceived by Apple for QuickTime, iOS, and the Safari browser, it’s now implemented in virtually every major browser. Of course, most of them also support a comparable standard called MPEG-DASH, but since Apple’s Safari and iOS devices don’t support DASH, we think HLS is currently the better choice.

Adaptive Bitrates

Another huge advantage of HLS is that it allows clients to choose from a variety of quality levels depending on their available bandwidth.
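
To illustrate the idea in the abstract (this is not any particular player’s code), an HLS client reads the variant streams advertised in the master playlist and picks the best one its measured bandwidth can sustain, roughly like the sketch below. The variant list and safety margin are made-up values.

```typescript
// Illustrative only: pick the highest-bitrate HLS variant that fits within
// the measured bandwidth, with a safety margin. The variant values are made up.

interface Variant {
  bandwidthBps: number; // the BANDWIDTH attribute from the master playlist
  uri: string;
}

const variants: Variant[] = [
  { bandwidthBps: 400_000, uri: 'low/index.m3u8' },
  { bandwidthBps: 1_200_000, uri: 'mid/index.m3u8' },
  { bandwidthBps: 3_000_000, uri: 'high/index.m3u8' },
];

function chooseVariant(measuredBps: number, margin = 0.8): Variant {
  const affordable = variants
    .filter(v => v.bandwidthBps <= measuredBps * margin)
    .sort((a, b) => b.bandwidthBps - a.bandwidthBps);
  return affordable[0] ?? variants[0]; // fall back to the lowest quality
}

// e.g. a client measuring ~2 Mbps would land on the 1.2 Mbps variant:
console.log(chooseVariant(2_000_000).uri); // "mid/index.m3u8"
```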

 

Disadvantages

So if it does all that, why wouldn’t I want to use HLS for my live streaming app?

Terrible Latency

It turns out that while HTTP Live Streaming was designed to deal efficiently with multiple quality streams, it wasn’t built for delivering video quickly. Practically speaking, HLS introduces at least 20 seconds of latency (often more!) in your streams.

Here’s why: HLS requires three segments in the queue before it will allow playback, and the segments are divided at the keyframes in the video. The only way to create a super-low-latency stream (let’s say under one second) with HLS is to encode the outgoing video with a keyframe every 250 ms. The HLS playlist’s moving window would then be four items long with a segment duration of a quarter of a second. This would of course produce high-bandwidth video, increase the number of HTTP calls (at least four per second), and put additional load on the server.
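
To make the arithmetic concrete, here is a quick back-of-the-envelope sketch. The segment durations are assumptions for illustration; the three-segment queue and the quarter-second scenario simply restate the numbers above.

```typescript
// Back-of-the-envelope numbers for the scenario described above.
// A player typically buffers this many segments before it starts playback.
const SEGMENTS_BUFFERED = 3;

function hlsLatencyFloorSeconds(segmentDurationSec: number): number {
  return segmentDurationSec * SEGMENTS_BUFFERED;
}

// With 6-10 second segments (common at the time), playback starts 18-30 s behind live:
console.log(hlsLatencyFloorSeconds(6), hlsLatencyFloorSeconds(10)); // 18 30

// Forcing a keyframe every 250 ms gives 0.25 s segments and a sub-second floor...
console.log(hlsLatencyFloorSeconds(0.25)); // 0.75

// ...but the client now has to fetch at least 1 / 0.25 = 4 segments per second.
console.log(1 / 0.25); // 4
```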

The whole point of keyframes in video codecs like H.264 is to reduce the number of times you need to send a full frame of image data. With the above scenario, you might as well be sending the frames as a series of JPEG images in sequence. There’s a lot more to this, like the fact that media has to be packaged in 188-byte MPEG-TS packets, which creates added overhead when segments get that small, but hopefully now you’ve got the gist of it: HLS is a poor choice when it comes to low-latency video streaming.

No Publishing

HLS is a subscriber-only protocol. Unlike WebRTC, which has a spec for publishing from a browser, HTTP Live Streaming only supports playing streams. If you want to publish a live video stream from a device, you simply have to look to other technology. Luckily, Red5 Pro provides mobile SDKs that allow you to create publishing apps that utilize RTP; you can then relay those streams over HLS for folks to view right in their browsers. You can check out our HLS player example using video.js on GitHub. We are also developing full WebRTC support for Red5 Pro that will include a JavaScript SDK. This implementation will feature out-of-the-box tools like a WebRTC publisher and a player that supports WebRTC, HLS, and RTMP (as a fallback), so stay tuned for that update as well.
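
For basic playback, a video.js setup along these lines is typical; the element id and stream URL below are placeholders, and on browsers without native HLS support you would also load an HLS plugin such as videojs-contrib-hls.

```typescript
import videojs from 'video.js';
// On browsers without native HLS support you also need an HLS plugin
// (e.g. videojs-contrib-hls) loaded alongside video.js.

// Assumes the page contains:
//   <video id="red5pro-player" class="video-js" controls></video>
// The stream URL below is a placeholder for your Red5 Pro server's HLS endpoint.
const player = videojs('red5pro-player', {
  autoplay: true,
  sources: [{
    src: 'https://your-server.example.com/live/mystream.m3u8',
    type: 'application/x-mpegURL',
  }],
});

player.on('error', () => {
  console.warn('Stream failed to load:', player.error());
});
```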

Apple iOS HLS App Rules

Another question that comes up when our customers get ready to submit their iOS apps to the App Store is: will Apple reject my app if it’s not using HLS? As many of you know, our SDK uses RTP streaming for iOS, and Apple is said to have strict requirements that all apps use HLS for streaming. That’s not quite true, however. Apple states the following in its App Store Submission Guidelines:

  • “9.3 Audio streaming content over a cellular network may not use more than 5MB over 5 minutes.”

  • “9.4 Video streaming content over a cellular network longer than 10 minutes must use HTTP Live Streaming and include a baseline 192 kbps or lower HTTP Live stream.”

What we’ve also found is that if the app is a communication app–meaning that you have some form of two-way communication like Periscope has with live chats–then they tend to group the app in a different category. Apple also considers video calling apps like Skype to be in a different category, and the live streaming restrictions of having to use HLS don’t apply. The other good news is that as the popularity of apps like Periscope and Meerkat continues, Apple is getting used to the idea of live real-time streaming apps, and is gradually becoming more and more flexible with the restrictions.

So with that in mind, because of HLS’s high latency, Apple will approve apps that use other protocols if there’s a need for real-time communication. We simply recommend making a note of why you can’t use HLS when you submit your app.

Summary

As you can see, the ubiquitous support for HLS across browsers, mobile phones, and operating systems makes it a great choice for distributing your streams to the largest number of viewers. However, since it’s a rather slow protocol, if you are building any kind of app that relies on near real-time communication, you should look at other options. Finally, while Apple’s rules do seem quite rigid when it comes to their iOS streaming requirements, they are actually flexible when the need for something else is justified. What are your thoughts? Are you currently using HLS in your apps? Have you submitted a non-HLS-based streaming app to Apple’s App Store? How did that go? Let us know in the comments.

Why We Built Red5 Pro: An End-To-End Solution

January 7th, 2016 by Chris Allen


Reflecting on 2015 and what we’ve built over the last couple of years, I started thinking about Red5 Pro and the reasons why we built it in the first place. So I figured I should write a post.

Why did we spend so much time and energy building this thing? Not only did we see a trend of people wanting to build new mobile experiences based on live streaming, like Periscope and Meerkat, but we also weren’t happy with the developer tools available to build those kinds of experiences. We saw two approaches to creating live streaming tools for developers, and we didn’t like what we saw in either one. First, media servers were doing a poor job of supporting mobile, and… well, anything but Adobe Flash. Second, platform-as-a-service companies were providing good but very limited tools.

Media Servers

One thing is increasingly clear: traditional media servers by themselves aren’t sufficient for building a live streaming app today. Why? Because all of them focus on the server side and rely on others to handle the client. That’s not a full-stack solution, and when you don’t control both the client and server endpoints, things can get messy rather quickly.

So, what’s the origin of this issue? The servers were originally designed with Flash as the client, which made sense because at the time Flash was the only viable client for streaming. Now, of course, the world has shifted to other platforms. iOS, Android, and even modern desktop browsers don’t properly support Flash; the direction is toward native apps or WebRTC.

Before we came out with Red5 Pro, if you wanted to build your own mobile streaming app you would need to find a usable SDK for iOS and for Android. You would then need to install something like Wowza on the server and stream to it with your third-party SDK. What we found is that many of the open-source SDKs weren’t well supported, and the paid ones just didn’t work that well. The flaws were obvious: they lacked flexibility and extensibility, and they all relied on RTMP, an older Flash-based protocol. We decided this just wasn’t acceptable.

Hosted PaaS

The other trend we saw happening in live streaming solutions was the advent of hosted PaaS (Platform as a Service) products. Companies like LiveStream and TokBox are among the best in this category. What we found is that these solutions don’t provide enough control for the developer. Companies like TokBox have done a good job of providing easy-to-use SDKs, and they make setup super simple since you don’t need to install a server–but this comes at a price.

Lack of control is a big one.

You have to rely on what the platform gives you. Either you are forced to include advertising in your streams, or you can’t easily modify portions of the SDK to grab a video source other than the device’s camera. Maybe it’s latency you’re having trouble with, as with Kickflip, or it could be any number of other things you want to do with your app that the provider’s SDK won’t allow.

The PaaS solutions are also harder to scale and will cost more if you do achieve massive scale. The primary issue here is that you are locked into them handling the server for you. There is simply no way to host it where you want it. You can’t easily modify the server to do things like live transcoding, moving recorded files to S3 buckets, or integrating third-party software like FFmpeg. To make matters worse, some companies can’t use cloud solutions at all and need to deploy on networks that they control. Do you think your bank is going to allow video calls about proprietary financial information to flow through a third party they don’t control? How about medical software? I don’t think so–good luck getting around HIPAA.

Another thing to consider is that the major cloud hosting platforms offer credits to startups, and it would make sense to be able to easily move between platforms to take advantage of these deals (think of it like how consumers switch cable providers to get better deals). We designed Red5 Pro to be hosting agnostic so that where you host your solution is up to you.

Our Solution

We think we’ve taken the right approach by building the whole stack and providing super-flexible SDKs for mobile (JavaScript WebRTC SDK coming soon!). Plus, we put the power of deciding where the server is hosted in your hands as the developer. We are always looking for ways to improve, so if there’s something you’re looking for that we don’t currently do, please let us know.