
Buffer delay setting behaviour in Jamulus #2109


Can someone please explain why Jamulus sometimes enforces/enables its three Buffer delay choices, while at other times it lets the driver decide?

Jamulus calls ASIOGetBufferSize to find out whether the driver supports buffer sizes of 64, 128, or 256 frames. Any of those sizes the driver reports as supported are enabled in the UI; the rest are disabled, and if none are supported, only the size the driver actually provides is shown.

https://github.com/jamulussoftware/jamulus/blob/r3_8_1/windows/sound.cpp#L312
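To illustrate the idea, here is a minimal, self-contained C++ sketch of that decision. The `AsioBufferInfo` struct and the helper functions are hypothetical stand-ins for the values the real ASIO SDK call `ASIOGetBufferSize(&min, &max, &preferred, &granularity)` returns; this is not the actual Jamulus code (see the link above for that), just an assumed model of the check.

```cpp
#include <vector>

// Hypothetical stand-in for the values an ASIO driver returns from
// ASIOGetBufferSize(&minSize, &maxSize, &preferredSize, &granularity).
struct AsioBufferInfo
{
    long minSize;
    long maxSize;
    long preferredSize;
    long granularity; // ASIO convention: -1 means "powers of two between min and max"
};

// Check whether the driver can provide a given buffer size, following the
// ASIO SDK convention for the granularity field.
bool isSizeSupported ( const AsioBufferInfo& info, long size )
{
    if ( size < info.minSize || size > info.maxSize )
        return false;

    if ( info.granularity == -1 )
    {
        // supported sizes are powers of two within [min, max]
        return ( size & ( size - 1 ) ) == 0;
    }

    if ( info.granularity <= 0 )
    {
        // no granularity reported: only the preferred size is available
        return size == info.preferredSize;
    }

    // sizes step from minSize in increments of granularity
    return ( size - info.minSize ) % info.granularity == 0;
}

// Decide which of the three Jamulus UI choices (64/128/256) to enable.
std::vector<long> supportedUiSizes ( const AsioBufferInfo& info )
{
    std::vector<long> result;
    for ( long size : { 64L, 128L, 256L } )
    {
        if ( isSizeSupported ( info, size ) )
            result.push_back ( size );
    }
    return result;
}
```

With a flexible driver (min 64, max 512, granularity -1) all three choices come back enabled; with a fixed-size driver (min = max = preferred = 48) the list is empty, which matches the behaviour where the three radio buttons are greyed out and only the driver's own size is shown.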

By the way, I get better latency with the interface at 64 than with the TD-27 at 48 (which does not make sense to me), roughly a 5 ms improvement in overall delay.

If the audio buffer doesn't fit neatly into the network buffer size (which is usua…

Answer selected by leoinlios