It looks as if the default timer resolution for applications running under
Wine is different from Windows, so just test that the minimum timer
resolution is below a certain amount rather than comparing the resolution
before and after calling timeBeginPeriod.
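A minimal sketch of the relaxed check, assuming the winmm timer API
(timeGetDevCaps/TIMECAPS); the 10 ms threshold and the function name are
illustrative, not the actual test code:

    #include <windows.h>
    #include <mmsystem.h>
    #include <cassert>

    void
    check_min_timer_resolution ()
    {
        TIMECAPS caps;
        const UINT acceptable_min_ms = 10; // assumed threshold for this sketch

        if (timeGetDevCaps (&caps, sizeof(caps)) == MMSYSERR_NOERROR) {
            // Wine's default resolution differs from native Windows, so only
            // check the minimum the system claims to support rather than
            // comparing values before/after timeBeginPeriod().
            assert (caps.wPeriodMin <= acceptable_min_ms);
        }
    }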
Iterating over a const Midi-Sequence calls Evoral::Sequence::set_event(),
which in turn used Evoral::Event::operator=(), which always created a new
event-ID when copying the event.
Issues fixed:
- Saving *unmodified* MIDI produced new event-IDs on every save; the
  files changed with every save -- greetings to Deva.
- All [GUI] operations that use IDs to refer to notes (e.g. undo) were
  affected, resulting in an invalid undo-history.
Also clarify the assignment-operator naming: prefer an explicit assign()
over operator=().
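A simplified illustration of the distinction (hypothetical names, not
Evoral's real API): operator=() hands out a fresh event-ID, while an
explicit assign() copies the source event's ID, so writing back unchanged
events no longer re-stamps them:

    #include <cstdint>

    static int64_t g_next_event_id = 0;

    struct Event {
        int64_t id;
        double  time;

        Event (double t) : id (++g_next_event_id), time (t) {}

        // Old behaviour: assignment copied the event but allocated a new ID,
        // which is why iterating a const sequence changed every event-ID.
        Event& operator= (const Event& other) {
            id   = ++g_next_event_id;
            time = other.time;
            return *this;
        }

        // Explicit assign() copies the ID from the source event instead of
        // allocating a new one, so saving unmodified MIDI no longer changes
        // the file.
        void assign (const Event& other) {
            id   = other.id;
            time = other.time;
        }
    };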
AFAICT this could happen if a region's end time (on the timeline) was
earlier than the end time of the actual recording. This could cause a
situation where the last block of detected silence had an end time greater
than the end time of the region being processed. Strip Silence would still
create its new regions, but the last one it created would usually come out
with a negative duration.
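A rough guard against that case might look like the following (names are
hypothetical, not Ardour's actual Strip Silence code): clamp each detected
silence range to the region bounds so the trailing range can never extend
past the region end and produce a negative length:

    #include <algorithm>
    #include <cstdint>
    #include <utility>
    #include <vector>

    typedef std::pair<int64_t, int64_t> Range; // [start, end) in samples

    std::vector<Range>
    clamp_silence_to_region (const std::vector<Range>& silence,
                             int64_t region_start, int64_t region_end)
    {
        std::vector<Range> out;
        for (std::vector<Range>::const_iterator i = silence.begin (); i != silence.end (); ++i) {
            int64_t s = std::max (i->first, region_start);
            int64_t e = std::min (i->second, region_end);
            if (e > s) { // drop ranges that collapse to zero or negative length
                out.push_back (Range (s, e));
            }
        }
        return out;
    }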
GStatBuf is not usable on 32-bit Windows without the redefinition in
pbd/gstdio_compat.h, so add a test to check for the correct behavior of
g_stat and g_utime on all platforms now that the issue is fixed.
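A rough sketch of what such a test might do (assumed structure; the real
check lives in the libpbd test suite, and pbd/gstdio_compat.h is assumed to
pull in glib/gstdio.h):

    #include <cassert>
    #include <utime.h>  // <sys/utime.h> on Windows
    #include "pbd/gstdio_compat.h"

    void
    test_gstat_and_gutime (const char* path)
    {
        GStatBuf before;
        assert (g_stat (path, &before) == 0);

        // Push the timestamps back one hour, then verify g_stat sees the change.
        struct utimbuf times;
        times.actime  = before.st_atime - 3600;
        times.modtime = before.st_mtime - 3600;
        assert (g_utime (path, &times) == 0);

        GStatBuf after;
        assert (g_stat (path, &after) == 0);
        assert (after.st_mtime == times.modtime);
    }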
Check timer for invalid frequency
Precalculate timer tick rate to save a few instructions
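Both ideas sketched together, assuming a QPC-based microsecond clock
(illustrative code, not the actual utility):

    #include <windows.h>
    #include <cstdint>

    static double s_ticks_per_us = 0; // precalculated once at startup

    bool
    init_timer ()
    {
        LARGE_INTEGER freq;

        // Reject an invalid/zero frequency before it is ever divided by.
        if (!QueryPerformanceFrequency (&freq) || freq.QuadPart <= 0) {
            return false;
        }

        // Precalculate the tick rate so each query needs only one divide
        // instead of reloading and rescaling the raw frequency every time.
        s_ticks_per_us = static_cast<double>(freq.QuadPart) / 1000000.0;
        return true;
    }

    int64_t
    get_microseconds ()
    {
        LARGE_INTEGER val;
        QueryPerformanceCounter (&val);
        return static_cast<int64_t>(val.QuadPart / s_ticks_per_us);
    }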
Don't use static variables inside functions, to avoid the implicit check for
initialization on every call
Use static functions inside anonymous namespace for internal linkage
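A small illustration of both conventions (hypothetical code): the value is
initialized once at namespace scope instead of via a function-local static
that would be checked on every call, and the helpers sit in an anonymous
namespace so they get internal linkage:

    #include <cstdint>
    #include <ctime>

    namespace {

    // Namespace-scope variable, set once up front; no hidden per-call
    // initialization check as with a function-local static.
    double ticks_per_second = 0;

    void
    init_timer_utils ()
    {
        ticks_per_second = static_cast<double>(CLOCKS_PER_SEC);
    }

    // Internal linkage: not visible outside this translation unit.
    int64_t
    raw_ticks ()
    {
        return static_cast<int64_t>(std::clock ());
    }

    double
    elapsed_seconds ()
    {
        return raw_ticks () / ticks_per_second;
    }

    } // anonymous namespace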
I'm not sure if this test is going to be effective, as I don't have hardware
to test on at the moment. As noted in the documentation, Windows XP should be
the only OS where QPC uses a timer source that is non-monotonic (multi-core
systems with a non-synchronized TSC).
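A rough sketch of what such a monotonicity check might look like (assumed
iteration count and names, not the actual test):

    #include <windows.h>
    #include <cassert>

    void
    test_qpc_is_monotonic ()
    {
        LARGE_INTEGER prev;
        QueryPerformanceCounter (&prev);

        // Sample the counter many times and require it never runs backwards.
        // On an affected XP multi-core system the thread may migrate to a
        // core with a different TSC value, which is what this tries to catch.
        for (int i = 0; i < 1000000; ++i) {
            LARGE_INTEGER now;
            QueryPerformanceCounter (&now);
            assert (now.QuadPart >= prev.QuadPart);
            prev = now;
        }
    }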