Low Latency playback - recovering after stalls #6350
Comments
While stalling might be inevitable, support for preload hints will help reduce stalls. Ask @iamboorrito to rebase and open a PR for https://github.com/video-dev/hls.js/tree/feature/preload-hint (resolves #3988).
You can find Issues and PRs for Low-Latency on this project board https://github.com/orgs/video-dev/projects/3. You might consider commenting on #4681 as these issues are somewhat related.
I would consider (and test) disabling/dropping
Correct.
There is no mechanism to reset or reduce the target latency once stalls have increased it.
You are on the right track. Being able to recover or reset target latency would be the first step.
In general HLS.js tries to avoid any direct control of playback - it never calls play() or pause() and only sets currentTime when necessary to start or recover playback. The point is that control of playback (and latency) can be performed using the media element provided to HLS.js. If you think HLS.js should handle this, please submit a contribution that improves or adds more configuration options to the current behavior.
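For illustration, one way an application could do this on top of the public getters (a minimal sketch; the thresholds and the 1.05x rate are arbitrary choices, not hls.js behavior):

```ts
import Hls from 'hls.js';

// Sketch of app-level latency control via the media element, as suggested above.
// Assumptions: `video` is the HTMLMediaElement passed to hls.attachMedia(); the
// 2 s / 0.5 s thresholds and the 1.05x rate are illustrative values only.
function nudgeTowardLiveEdge(hls: Hls, video: HTMLVideoElement): void {
  const target = hls.targetLatency; // public getter; null until a live playlist is loaded
  if (target === null) {
    return;
  }
  const drift = hls.latency - target; // how far behind the desired latency we are
  if (drift > 2 && hls.liveSyncPosition !== null) {
    // Far behind: seek straight to the live sync position.
    video.currentTime = hls.liveSyncPosition;
  } else if (drift > 0.5) {
    // Slightly behind: play a little faster until we are back at the target.
    video.playbackRate = 1.05;
  } else {
    video.playbackRate = 1;
  }
}

// e.g. poll once per second while the live stream is playing:
// setInterval(() => nudgeTowardLiveEdge(hls, video), 1000);
```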
Thank you so much @robwalch for the thorough and thoughtful reply, and I apologize for such a long delay in responding. I have tried the following:
As a next step, I will try to see if some suggestions for #4681 could be implemented (by a newbie) and improve LL stall behaviour.
What do you want to do with Hls.js?
I am trying to support an ultra low-latency HLS stream using 6-second segments and 0.5-second parts with liveSyncDuration of 1.4 seconds. Hls.js version is 1.5.7.
My config looks as follows:
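A sketch of the relevant options (only liveSyncDuration of 1.4 and maxLiveSyncPlaybackRate of 1.2 are values stated in this issue; lowLatencyMode is an assumption for playing the 0.5-second parts):

```ts
import Hls from 'hls.js';

// Illustrative config only; liveSyncDuration and maxLiveSyncPlaybackRate are the
// values mentioned in this issue, lowLatencyMode is assumed for LL-HLS parts.
const hls = new Hls({
  lowLatencyMode: true,
  liveSyncDuration: 1.4,
  maxLiveSyncPlaybackRate: 1.2,
});
```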
Periodically the player encounters a stall, and after each stall the LatencyController target latency is increased by 1 second (this.stallCount * liveSyncOnStallIncrease); see targetLatency below:
hls.js/src/controller/latency-controller.ts, lines 42 to 71 at 0a49731
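In simplified form, the behaviour described there amounts to the following (a sketch, not the actual getter source):

```ts
// Simplified sketch of the drift described above (not the actual hls.js source):
// every stall adds liveSyncOnStallIncrease to the target, and nothing ever subtracts it.
function effectiveTargetLatency(
  liveSyncDuration: number,     // 1.4 s in my config
  stallCount: number,           // incremented by LatencyController on each stall
  liveSyncOnStallIncrease = 1.0 // 1 s per stall in 1.5.7
): number {
  return liveSyncDuration + stallCount * liveSyncOnStallIncrease;
}

// e.g. effectiveTargetLatency(1.4, 3) === 4.4
```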
So if the stream plays for an hour and encounters 3 stalls, our target latency drifts from 1.4 to 4.4 seconds.
I might be missing something, but I don't seem to see a mechanism to recover from this condition.
I do not see an API to programmatically change targetLatency after the player has been created, so the only way to return to a lower latency is to reset the player.
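A sketch of what resetting the player could look like (the source URL, config, and function name are placeholders):

```ts
import Hls, { HlsConfig } from 'hls.js';

// Sketch of the player-reset workaround; `src` and `config` stand for the original
// source URL and config. A fresh instance starts latency tracking from scratch.
function resetPlayer(oldHls: Hls, video: HTMLVideoElement, src: string, config: Partial<HlsConfig>): Hls {
  oldHls.destroy();            // tear down the drifted instance
  const hls = new Hls(config); // new instance, stall count starts at zero
  hls.loadSource(src);
  hls.attachMedia(video);
  return hls;
}
```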
I am looking for a more graceful way to return to a lower targetLatency after a period of successful playback.
What have you tried so far?
I have tried to programmatically set LatencyController.stallCount to a lower value after X seconds of playback without a stall.
hls.js/src/controller/latency-controller.ts, line 19 at 0a49731
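Roughly like this (a sketch only: latencyController is an internal, non-public field on the Hls instance, so the cast and the field name are assumptions, and the 30-second interval stands in for "X seconds"):

```ts
// Sketch only: `hls` is the Hls instance from the config above. latencyController is
// not a public API, so this cast and field access may break between versions.
const RESET_INTERVAL_MS = 30_000; // illustrative stand-in for "X seconds without a stall"

setInterval(() => {
  const latencyController = (hls as any).latencyController;
  if (latencyController && latencyController.stallCount > 0) {
    latencyController.stallCount -= 1; // decay the stall penalty over time
  }
}, RESET_INTERVAL_MS);
```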
This did reduce player.targetLatency, but did not affect the actual playback latency, even though maxLiveSyncPlaybackRate is set to 1.2 and I was hoping the player would try to catch up to targetLatency.
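One way to observe whether catch-up ever kicks in (a sketch; hls is the instance from the config above, video is the attached media element, and the 1-second logging interval is arbitrary):

```ts
// If the actual latency stays above the target while playbackRate never rises
// above 1, the player is not attempting to catch up.
setInterval(() => {
  console.log(
    `target=${hls.targetLatency}s actual=${hls.latency.toFixed(2)}s rate=${video.playbackRate}`
  );
}, 1000);
```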
I have also tried to change liveSyncOnStallIncrease from 1.0 to 0.1 seconds:
hls.js/src/controller/latency-controller.ts, line 63 at 0a49731
This did cause targetLatency to drift more slowly; however, I still observed that the actual playback latency remained higher than the target latency and the player did not attempt to catch up.
Note: whenever the drift happened, I was able to start another playback of the same stream in a different window using the same version of hls.js, and the second player would start and play the stream at my initial target latency of 1.4. For example, my first player would have targetLatency = 1.4 but an actual latency of 3.4 seconds, while my second player would have both target and actual latency of 1.4 seconds.
This indicates that the content was available on the server.
I would be very grateful for guidance on how to improve the playback latency in this scenario.