This resolves a PBD vs ARDOUR namespace error for some compilers:
```
error: reference to 'microseconds_t' is ambiguous
libs/pbd/pbd/microseconds.h:29:19: error: candidates are: typedef uint64_t PBD::microseconds_t
libs/ardour/ardour/types.h:81:29: error: typedef PBD::microseconds_t ARDOUR::microseconds_t
```
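For illustration, a minimal sketch of how this kind of ambiguity can arise when both namespaces are opened with using-directives; the typedefs below are simplified stand-ins for the real headers, and explicit qualification is one way some compilers require it to be resolved:
```
/* Simplified stand-ins for the real headers; not the actual PBD/Ardour code. */
#include <cstdint>

namespace PBD {
	typedef uint64_t microseconds_t;            /* cf. libs/pbd/pbd/microseconds.h */
}

namespace ARDOUR {
	typedef PBD::microseconds_t microseconds_t; /* cf. libs/ardour/ardour/types.h */
}

using namespace PBD;
using namespace ARDOUR;

int main ()
{
	/* microseconds_t t = 0; */     /* ambiguous for some compilers: PBD:: or ARDOUR:: ? */
	PBD::microseconds_t t = 0;      /* explicit qualification resolves it */
	return (int) t;
}
```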
In some cases, old and/or conflicting port names were saved
with the session (e.g. "Faderport" for FP1,8). Loading such old
sessions then merges this state into the config, which could lead
to port-registration failure when the surface was enabled.
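Purely to illustrate the idea (this is not the actual fix, and the key name and helper below are made up): stale port names coming from old session state can simply be skipped when that state is merged into the config, so the config's current port name wins:
```
/* Hypothetical sketch only; the "port-name" key and the merge helper are illustrative. */
#include <map>
#include <string>

typedef std::map<std::string, std::string> SurfaceState;

void
merge_session_state_into_config (SurfaceState const& from_session, SurfaceState& config)
{
	for (SurfaceState::const_iterator i = from_session.begin (); i != from_session.end (); ++i) {
		if (i->first == "port-name") {
			continue; /* ignore port names saved by old sessions; keep the config's current value */
		}
		config[i->first] = i->second;
	}
}
```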
* replace signal-emission with direct calls to CoreSelection (see the sketch below)
using BaseUI's session pointer
* remove unused leftmost strip API
* use CoreSelection for first-selected strip
* Accessing CoreSelection does not modify the session
(allow access from const callbacks)
* replace static calls in P2 surface
This removes indirection and dependency on the GUI for
managing strip selection.
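A rough sketch of the resulting call path, using stand-in types (the real classes are ARDOUR::Session, ARDOUR::CoreSelection and ARDOUR::Stripable, and the exact signatures differ): the surface asks CoreSelection directly via the session pointer it already holds, with no GUI signal in between:
```
/* Stand-in types only; exact Ardour signatures differ. */
#include <memory>

struct Stripable { /* ... */ };

struct CoreSelection {
	std::shared_ptr<Stripable> first_selected_stripable () const { return _first; }
	std::shared_ptr<Stripable> _first;
};

struct Session {
	/* read-only access: does not modify the session */
	CoreSelection const& selection () const { return _selection; }
	CoreSelection _selection;
};

struct Surface {            /* think: a BaseUI-derived control surface */
	Session* _session;      /* BaseUI already provides this pointer */

	void button_pressed () const   /* safe even from a const callback */
	{
		std::shared_ptr<Stripable> s = _session->selection ().first_selected_stripable ();
		if (s) {
			/* drive the surface display for the first selected strip */
		}
	}
};
```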
Previously this was inherited via PBD.
On MacOS/X, this adds
"-undefined dynamic_lookup -flat_namespace"
and various "-framework .." options to linkflags
Without these flags, .dylibs usually fail to link because
of missing `-lintl` (Undefined symbols: "_libintl_dgettext").
On other systems this is a NO-OP:
CFLAGS_OSX, CXXFLAGS_OSX and LINKFLAGS_OSX
are only set on the darwin platform.
At least on my machine, the fonts on the Push display were ridiculously large,
making everything overlap and become unusable. I suspect this is because
pango_cairo_font_map_get_default() inherits DPI from the system, so the
monitor scaling factor got applied to the Push display as well.
This commit instead creates a new plain font map, and sets the resolution to
96, which looks like what the UI was designed for. Some more tweaking of the
Pango context might improve things further on the Push, but just setting the
resolution makes things look reasonable to me anyway.
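A sketch of that approach using the plain Pango/Cairo C API (the surrounding Push 2 code is omitted and the function name is illustrative): create a private font map instead of the shared default, pin it to 96 dpi, and build the drawing context from it:
```
#include <pango/pangocairo.h>

PangoContext*
create_push2_pango_context ()
{
	/* a fresh font map, not pango_cairo_font_map_get_default(),
	   so the desktop DPI / monitor scaling never leaks in */
	PangoFontMap* map = pango_cairo_font_map_new ();
	pango_cairo_font_map_set_resolution (PANGO_CAIRO_FONT_MAP (map), 96.0);

	PangoContext* ctx = pango_font_map_create_context (map);
	g_object_unref (map); /* the context keeps its own reference to the map */
	return ctx;
}
```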
Generated by tools/f2s. Some hand-editing will be required in a few places to fix up comments related to timecode
and video in order to keep them legible.
The Editor continues to notify them, but via a direct call to ControlProtocolManager, not a signal.
The CP Manager calls the ControlProtocol static method to set up static data structures holding
selection info for all surfaces and then notifies each surface/protocol that selection has changed.
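The shape of that flow, with illustrative names (these are not the exact Ardour declarations):
```
/* Illustrative only: class layout and names are simplified. */
#include <memory>
#include <vector>

struct Stripable { };
typedef std::vector<std::weak_ptr<Stripable> > StripableList;

struct ControlProtocol {
	/* static selection data shared by all surfaces */
	static StripableList _selected;
	static void set_stripable_selection (StripableList const& sl) { _selected = sl; }

	/* each surface/protocol reacts to the notification here */
	virtual void stripable_selection_changed () = 0;
	virtual ~ControlProtocol () {}
};

StripableList ControlProtocol::_selected;

struct ControlProtocolManager {
	std::vector<ControlProtocol*> protocols;

	/* called directly by the Editor -- no signal involved */
	void stripable_selection_changed (StripableList const& sl)
	{
		ControlProtocol::set_stripable_selection (sl); /* update shared static data first */
		for (std::vector<ControlProtocol*>::iterator i = protocols.begin (); i != protocols.end (); ++i) {
			(*i)->stripable_selection_changed ();  /* then tell every surface */
		}
	}
};
```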
Changing solo-state needs to be done in rt-context to atomically
propagate solo/mute.
set_control() queues an rt-event; later, Session::rt_set_control() calls
Session::update_route_solo_state() to propagate solo/mute.
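In sketch form (stand-in types; Ardour's real RT event queue and signatures differ), the ordering looks like this: the calling thread only queues the change, and the RT thread both applies it and updates solo/mute state:
```
/* Stand-in sketch; not the actual Session code. */
#include <functional>
#include <queue>

struct Session {
	std::queue<std::function<void ()> > rt_events; /* stand-in for the RT event queue */

	/* called from a UI or surface thread: only queue the change */
	void set_control (double value)
	{
		rt_events.push ([this, value] () { rt_set_control (value); });
	}

	/* runs in RT context: apply the change, then propagate solo/mute atomically */
	void rt_set_control (double value)
	{
		apply_control_value (value);
		update_route_solo_state ();
	}

	void apply_control_value (double) {}
	void update_route_solo_state () {}
};
```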
Otherwise it is still connected to the ControlProtocol::StripableSelectionChanged signal, even though the
event loop specified in the connection has been destroyed.
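A hedged sketch of the teardown pattern, assuming the surface keeps the connection in a PBD::ScopedConnection and drops it explicitly (the class, member and method names here are placeholders):
```
/* Placeholder names; the explicit disconnect is the point. */
#include "pbd/signals.h"

class MySurface
{
public:
	void stop_event_loop ()
	{
		/* drop the connection when the event loop goes away, otherwise
		   StripableSelectionChanged keeps targeting the destroyed loop */
		_selection_connection.disconnect ();
	}

private:
	PBD::ScopedConnection _selection_connection;
};
```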