alsa_in/out: Convert between sample rates when necessary
alsa_in and alsa_out set the playback latency of their ports to the target delay. The problem is that the target delay is expressed at the ALSA sample rate, so it needs to be converted to JACK's sample rate.
Example: Imagine JACK is running at 48 kHz, and alsa_out is invoked like:
    alsa_out -r 96000 -t 512
Currently, alsa_out will report 512 frames of playback latency. After the fix, it converts the value to 48 kHz and correctly reports 256.
The fix also converts the result of jack_frames_since_cycle_start() to the ALSA sample rate.
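A minimal sketch of the conversion, assuming the delay is tracked as a plain frame count; the helper name and its call sites are illustrative, not the actual alsa_out code:

    #include <jack/jack.h>

    /* Convert a frame count expressed at the ALSA sample rate into the
     * equivalent number of frames at the JACK sample rate.  The reverse
     * direction (needed for the jack_frames_since_cycle_start() result)
     * just swaps the two rate arguments. */
    static jack_nframes_t
    alsa_to_jack_frames (jack_nframes_t alsa_frames,
                         jack_nframes_t alsa_rate,
                         jack_nframes_t jack_rate)
    {
        return (jack_nframes_t) ((double) alsa_frames * jack_rate / alsa_rate);
    }

    /* Example from above: alsa_to_jack_frames (512, 96000, 48000) == 256 */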
* detect version option before all other option parsing
jackd now checks its arguments for "-V" and "--version" before all other
option parsing happens (see the sketch after this list).
* remove some dead code from option parsing
Version options are now detected before optparse runs, so the removed code
paths became obsolete.
* remove rest of version option from optparse
Detection of the version option is now handled outside of optparse, so the
leftover strings and variables are removed.
* switch to string comparison for detecting the version option
Demanding an exact match for the option strings reflects the original
behavior more closely than a search for substrings.
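A rough sketch of such an exact-match check, run before any other argument handling; the function name and the printed text are illustrative, not the actual jackd code:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Scan argv for an exact "-V" or "--version" match before normal
     * option parsing starts; substrings such as "--versionfoo" no
     * longer match. */
    static void
    maybe_print_version (int argc, char **argv)
    {
        int i;
        for (i = 1; i < argc; i++) {
            if (strcmp (argv[i], "-V") == 0 ||
                strcmp (argv[i], "--version") == 0) {
                printf ("jackd version ...\n");
                exit (0);
            }
        }
    }

    int main (int argc, char **argv)
    {
        maybe_print_version (argc, argv);
        /* ... normal option parsing would continue here ... */
        return 0;
    }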
The ucontext functionality is not available on all CPUs with all C
libraries. Instead of just making assumptions based on the CPU
architecture, this commit adds the necessary checks to wscript to verify
the availability of the ucontext functionality before using it in
dbus/sigsegv.c.
This avoids the long list of architecture exclusions and makes it more
robust when building jack2 for new CPU architectures.
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
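As an illustration, a configure check of this kind typically tries to build a small probe program like the one below; the exact test fragment used in wscript may differ:

    /* Configure-time probe: this only compiles and links if the C library
     * ships a usable <ucontext.h>, which is what dbus/sigsegv.c relies on. */
    #include <ucontext.h>

    int main (void)
    {
        ucontext_t uc;
        return getcontext (&uc);
    }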
With glibc 2.16, we get the following build error when building jack2:
[193/247] cxx: tests/iodelay.cpp -> build/tests/iodelay.cpp.4.o
../tests/iodelay.cpp:171:43: error: 'UINT32_MAX' was not declared in this scope
../tests/iodelay.cpp:171:55: error: 'UINT32_MAX' was not declared in this scope
../tests/iodelay.cpp:172:44: error: 'UINT32_MAX' was not declared in this scope
../tests/iodelay.cpp:172:56: error: 'UINT32_MAX' was not declared in this scope
In glibc 2.17 and older versions, the header <stdint.h> defines these macros
for C++ only if explicitly requested by defining __STDC_LIMIT_MACROS.
We can't use <cstdint>, since it requires the C++11 standard.
This build issue was found by the Buildroot autobuilder:
http://autobuild.buildroot.net/results/369/369ce208ffea43dad75ba0a13469159b341e3bf5/
Signed-off-by: Rahul Bedarkar <rahul.bedarkar@imgtec.com>
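The fix therefore requests the limit macros explicitly before <stdint.h> is first included, roughly as sketched below; the exact placement in tests/iodelay.cpp may differ:

    /* When compiled as pre-C++11 C++ against glibc 2.17 or older,
     * <stdint.h> only provides UINT32_MAX if this macro is defined
     * before the header is included. */
    #define __STDC_LIMIT_MACROS
    #include <stdint.h>

    uint32_t example_limit = UINT32_MAX;   /* illustrative use of the macro */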
jack_latency_range_t is

    struct _jack_latency_range {
        jack_nframes_t min;
        jack_nframes_t max;
    };

and jack_nframes_t is

    typedef uint32_t jack_nframes_t;

so it's unsigned. Initialising it with -1 is invalid (at least in C++14). We cannot use {0, 0}, because latency_cb has
    jack_latency_range_t range;
    range.min = range.max = 0;
    if ((range.min != capture_latency.min) ||
        (range.max != capture_latency.max)) {
        capture_latency = range;
    }
so we must not start with {0, 0}; otherwise the condition would never be true.
Using UINT32_MAX should be equivalent to the previous -1.
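A sketch of the resulting initialisation, reusing the capture_latency name from the snippet above; the surrounding code is omitted:

    #include <stdint.h>
    #include <jack/jack.h>

    /* Start from an impossible latency so the first real range reported to
     * latency_cb always differs and gets stored; UINT32_MAX takes the role
     * of the former -1. */
    jack_latency_range_t capture_latency = { UINT32_MAX, UINT32_MAX };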
If configured with --clients=512 (which translates to CLIENT_NUM), we exceed
the maximum stack size. CLIENT_NUM == 500 still works, but let's allocate
the matrix on the heap to be safe.
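A minimal sketch of the change, with an int matrix standing in for whatever element type the real code uses:

    #include <stdlib.h>

    #define CLIENT_NUM 512

    int main (void)
    {
        /* Before, "int matrix[CLIENT_NUM][CLIENT_NUM];" lived on the stack
         * and exceeded the maximum stack size once CLIENT_NUM reached 512.
         * Allocate the same matrix on the heap instead. */
        int (*matrix)[CLIENT_NUM] = calloc (CLIENT_NUM, sizeof (*matrix));
        if (matrix == NULL)
            return 1;

        /* ... matrix[i][j] is used exactly as before ... */

        free (matrix);
        return 0;
    }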
Kudos to Markus Seeber for the initial bug triage.
Fixes #212