I have been using the sample OSMF player from the latest OSMF SDK. No matter what bufferTime I set, the movie starts at the highest quality, but after 1-2 minutes it drops to the lowest quality for the rest of the movie. It is not a problem with my bandwidth, because other players play this stream at maximum quality.
The problem, I think, is that the switching algorithm is far too pessimistic: it drops the quality right away, and it is hard to get it back up again.
If someone knows why this is happening, or has a way to deal with it, please advise.
Thanks in advance.
PS: below is a sample of my f4m; the streaming is done via RTMP.
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns="http://ns.adobe.com/f4m/1.0">
  <media url="mp4:2783-07.mp4" bitrate="944" width="1280" height="720" />
  <media url="mp4:2783-12.mp4" bitrate="1454" width="1280" height="720" />
  <media url="mp4:2783-18.mp4" bitrate="2069" width="1280" height="720" />
  <media url="mp4:2783-25.mp4" bitrate="2784" width="1280" height="720" />
  <media url="mp4:2783-40.mp4" bitrate="4224" width="1280" height="720" />
</manifest>
It's been a while since I've dealt with RTMP streaming, but it sounds like the buffer rule might be dropping you to the lowest index. You'll need to study the player logs to find out which rule is switching down and why.
Make sure your individual encodes do not vary widely in bitrate: the switching rules do not perform well with wide swings in VBR encodes. Also, OSMF throttles the connection between client and server at 1.4 times the bitrate you declare. So if your 4 Mbps stream spikes to 8 Mbps for a scene with a lot of movement, the player will choke trying to pull that down through a 5.6 Mbps pipe, your buffer will fall below 2 seconds, and the player will switch down to index 0.
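To see why the spike drains the buffer so fast, here is a back-of-the-envelope sketch (the 1.4x multiplier is the throttle factor described above; all the numbers are illustrative, not measured from your stream):

```actionscript
// Assumes a 1.4x bandwidth throttle; numbers are hypothetical.
var declaredBitrate:Number = 4000;              // kbps declared in the f4m
var throttleCap:Number = declaredBitrate * 1.4; // 5600 kbps pipe
var vbrSpike:Number = 8000;                     // kbps during a high-motion scene

// Playback consumes 1 second of media per wall-clock second, but the
// throttled download only delivers throttleCap / vbrSpike seconds of
// media per second while the spike lasts.
var fillRate:Number = throttleCap / vbrSpike;   // 0.7 s of media per second
var drainRate:Number = 1 - fillRate;            // buffer shrinks 0.3 s per second

// With a 4-second buffer, you cross the 2-second threshold in roughly
// (4 - 2) / 0.3, i.e. under 7 seconds of sustained spike.
trace((4 - 2) / drainRate);
```

So even a short VBR burst well above the declared bitrate is enough to trip a buffer-based down-switch.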
Trace out the buffer length and study the logs to find out what the switching rules are doing.
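A minimal way to trace this, assuming `player` is your `org.osmf.media.MediaPlayer` instance (names other than the MediaPlayer properties are placeholders):

```actionscript
import flash.events.TimerEvent;
import flash.utils.Timer;

// Poll once a second and trace the buffer level and the active
// dynamic-stream index, so you can correlate buffer dips with
// down-switches in the log output.
var monitor:Timer = new Timer(1000);
monitor.addEventListener(TimerEvent.TIMER, function(e:TimerEvent):void {
    trace("buffer:", player.bufferLength.toFixed(1), "/", player.bufferTime,
          "index:", player.currentDynamicStreamIndex);
});
monitor.start();
```

If you build the sample player yourself, compiling with OSMF's logging config enabled should also show which switching rule fired each time the index changes.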