
Infrastructure and Engineering - Operating and Real-Time Systems, Middleware and Networking, Database and Data Management


Wayne Robbins
Defense R&D Canada (DRDC)
Future Forces Synthetic Environments, DRDC Ottawa, Ottawa, Ontario, Canada

Definition: Multimedia systems rely on a wide variety of infrastructural technologies to enable their communication, processing, and interface/display needs.

In addition, particular classes of media (e.g., continuous media such as audio and video) often require special computational support to ensure their correct rendering. As such, different categories of infrastructure can be defined according to how systems enable and influence synchronized media playback. These include: operating and real-time systems, middleware and networking, database and data management, as well as system and software engineering.

Operating and Real-Time Systems

The operating system (OS) is a key element in multimedia synchronization due to its fundamental infrastructural role within end-user and network equipment. Specifically, a number of OS issues can have a significant impact in situations where continuous media are utilized. Such time-sensitive data, typically large in volume and isochronous in nature, often require significant computing power to provide a responsive system with predictable behavior. Specific issues which need to be considered include:

  • Process Management
  • Time Management
  • Memory Management
  • Inter-Process Communication (IPC)
  • Resource and Device Management
  • User Interface and Display Management

Process management addresses process (program) execution and processor allocation. For continuous time-based media, predictable processing is required to ensure media are ready for display in accordance with their isochronous nature. Relevant process management issues therefore include scheduling algorithms and priority mechanisms: inappropriate scheduling and misuse of priorities can introduce asynchrony into a multimedia application through undue delays in processing (e.g., other applications “grabbing” the processor).
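The effect of scheduling on an isochronous task can be observed directly. The following minimal Python sketch (illustrative only; the function name and parameters are this example's own) runs a periodic "frame" task and records how late the OS actually releases it relative to its ideal release times — the lateness it measures is exactly the scheduling-induced asynchrony described above:

```python
import time

def measure_release_jitter(period_s=0.01, frames=20):
    """Run a periodic 'frame' task and record how late each release is.

    In a real player, lateness would come from other processes grabbing
    the CPU; here we simply observe ordinary OS scheduling noise.
    """
    start = time.monotonic()
    lateness = []
    for i in range(1, frames + 1):
        target = start + i * period_s        # ideal isochronous release time
        time.sleep(max(0.0, target - time.monotonic()))
        lateness.append(time.monotonic() - target)  # how late we woke up
    return lateness

jitter = measure_release_jitter()
print(max(jitter))   # worst-case release lateness over the run
```

On a general-purpose OS the measured lateness varies with system load, which is precisely why priority mechanisms and real-time schedulers matter for media playback.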

Time management addresses whether the OS can ensure adequate temporal accuracy for application-level synchronization efforts and whether the OS itself can operate in a timely and synchronized manner. Included in this area are issues of clock management as well as the availability of synchronization primitives (for process and resource/device management).
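One practical aspect of clock management is the choice of time source. A hedged sketch (the `MediaClock` class is illustrative, not a standard API): a media clock should be driven by a monotonic source, because the wall clock can jump backwards under NTP adjustment and would make presentation timestamps non-monotonic:

```python
import time

class MediaClock:
    """A presentation clock based on a monotonic source.

    time.time() (wall clock) can be stepped by the administrator or NTP;
    time.monotonic() cannot, so elapsed media time never runs backwards.
    """
    def __init__(self):
        self._epoch = time.monotonic()

    def now(self):
        """Elapsed media time in seconds since the clock was started."""
        return time.monotonic() - self._epoch

clock = MediaClock()
t1 = clock.now()
t2 = clock.now()
assert t2 >= t1   # monotonic by construction
```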

Memory management addresses how the OS controls memory allocation and applications’ access to memory spaces. Memory protection, virtual memory (VM), shared vs. non-shared models, dynamic vs. static allocation and automatic garbage collection (in conjunction with certain language/run-time environments) fall into this category. These issues influence media synchronization by affecting data transfer between applications and devices as well as possibly inducing asynchrony into process execution due to automatic system overhead (e.g., use of VM swap files and automatic “cleanup” techniques sometimes associated with dynamic memory/resource allocation).
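A common mitigation for allocation and garbage-collection overhead is to preallocate buffers outside the time-critical loop and pause automatic collection inside it. A minimal sketch under those assumptions (function and variable names are illustrative):

```python
import gc

def render_burst(frames, buf):
    """Fill a preallocated buffer for each frame with collection paused.

    Pausing the collector and reusing one bytearray keeps allocator and
    garbage-collection overhead out of the time-critical loop.
    """
    gc.disable()
    try:
        for f in frames:
            buf[:len(f)] = f      # reuse the same memory; no new allocation
    finally:
        gc.enable()               # restore normal collection afterwards
    return buf

buf = bytearray(1024)             # allocated once, outside the loop
render_burst([b"frame1", b"frame2"], buf)
```

The same principle (static allocation in the critical path, deferred cleanup) applies in lower-level languages, where it takes the form of memory pools and lock-resident pages.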

Inter-Process Communication (IPC) plays a significant role in multimedia synchronization due to the potential to incur delays when transferring large amounts of data between processes. Delays can result not only from the data duplication itself but also from the incidental (and often hard to determine) costs of process management (context switching) and memory management overhead. The result is unexpected, nondeterministic delays within the operating system’s own execution which then ultimately affect the multimedia application.
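The standard way to avoid the duplication cost is shared memory: both processes map the same pages, so no copy crosses the kernel. A minimal sketch using Python's standard `multiprocessing.shared_memory` module (the writer/reader function names are this example's own; both sides are shown in one process for brevity):

```python
from multiprocessing import shared_memory

def write_frame(frame):
    """Writer side: place a frame in a shared-memory segment so another
    process can map the same pages instead of receiving a copy."""
    shm = shared_memory.SharedMemory(create=True, size=len(frame))
    shm.buf[:len(frame)] = frame
    return shm                        # shm.name identifies the segment

def read_frame(name, size):
    """Reader side: attach to the segment by name; no data is duplicated."""
    shm = shared_memory.SharedMemory(name=name)
    data = bytes(shm.buf[:size])
    shm.close()
    return data

shm = write_frame(b"video-data")
assert read_frame(shm.name, 10) == b"video-data"
shm.close()
shm.unlink()                          # release the segment when done
```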

Resource management refers to how the operating system provides and controls access to any resource (e.g., a piece of hardware or a software construct). Device management specifically refers to the appropriate means to control and facilitate data flow to/from devices; multimedia-oriented examples include capture and rendering devices, hardware codecs, storage and communication equipment and so forth. Because multimedia applications require the timely delivery of data to end-users, devices must enable fine-grained data and control flow in order to ensure asynchrony is not introduced at the final stages just prior to rendering to the user. Similarly, when media are generated in real-time, any asynchrony resulting from the capture process (i.e., via input devices) can create timing errors which may affect data flow and be difficult to account for in subsequent processing.
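A classic device-management technique for keeping capture and rendering decoupled is double buffering: the device fills one buffer while the renderer drains the other, so neither side stalls the data path. A minimal sketch (the class and its interface are illustrative, not a real driver API):

```python
import threading

class DoubleBuffer:
    """Two-buffer scheme: the device writes the back buffer while the
    renderer reads the front one; a swap publishes each new frame."""
    def __init__(self, size):
        self.buffers = [bytearray(size), bytearray(size)]
        self.front = 0                 # index the renderer reads from
        self.lock = threading.Lock()

    def write(self, data):             # called by the capture/device side
        back = 1 - self.front
        self.buffers[back][:len(data)] = data
        with self.lock:
            self.front = back          # swap: publish the new frame

    def read(self):                    # called by the render side
        with self.lock:
            return bytes(self.buffers[self.front])

db = DoubleBuffer(4)
db.write(b"abcd")
assert db.read() == b"abcd"
```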

User interface and display management issues are also important to multimedia synchronization in much the same way as device management. That is, the display management and user interface subsystems need to provide low-overhead rendering and user interaction with multimedia components. Slow rendering times and unresponsive user interfaces are not viable for time-based media or interactive systems. It is also important to consider the effects of manipulating the visual data (vis-à-vis the aforementioned OS issues) and how synchronization methodologies are compatible with user interaction (e.g., mouse pointer synchronization).

The previous discussion illustrates that multimedia synchronization can be intimately affected by a system’s low-level infrastructure. As such, many of these requirements can best be addressed in the context of a real-time system – one whose correct operation depends both on its logical results and on the temporal properties of its behavior. Such systems are typically characterized as deterministic, with the ability to provide timely responses and flexible scheduling while also providing for security, fault tolerance and robustness. Classic examples include factory robot control and avionic subsystems.

Consequently, the key to real-time systems is highly accurate, temporal predictability; therefore, real-time systems are not necessarily fast but “temporally pedantic,” since early event occurrence can be just as damaging as incurring delays. Characterized in Table 1, two classes of real-time systems are defined based on the severity of temporal errors, namely “hard” and “soft” real-time systems. Hard real-time systems are those in which any violation of a timing constraint is considered a system failure. Timely execution is guaranteed through resource allocation based on the worst-case situation, usually resulting in under-utilized resources during normal operation, possibly requiring complete system shutdown when any anomalies occur. Conversely, soft real-time systems are those in which a violation of a temporal constraint does not constitute a system failure.

Accordingly, multimedia systems are generally classified as soft real-time systems because their temporal performance requirements are usually not so restrictive; for example, asynchrony in a presentation may degrade its quality and annoy its viewers, but no physical damage results. The human-centric character of multimedia systems also facilitates a range of “acceptable” playback quality which varies with the media, the context of their use and ultimately, the individual users. Consequently, human perceptual limitations can be used to relax certain timing constraints, enabling a choice between which playback characteristics are most important and facilitating potential trade-offs between functionality and resource usage. Such an approach maps well to soft real-time systems, in which performance is not guaranteed by worst-case resource allocation. For example, if the “bandwidth” of a video channel is constrained to only 15 fps at a specific resolution, the user could decide to accept the provided quality or adjust selected parameters to better suit his/her needs.
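A soft real-time policy can be made concrete as a frame scheduler that degrades quality rather than failing: frames that miss their deadline by more than a tolerance are dropped instead of being shown out of sync. A minimal sketch (the function, its parameters, and the 20 ms slack value are illustrative assumptions):

```python
def schedule_frames(frame_times, deadline_slack=0.02):
    """Decide which frames to render and which to drop.

    frame_times: (ideal_time, actual_ready_time) pairs in seconds.
    A frame ready more than `deadline_slack` seconds late is dropped
    rather than shown out of sync -- a soft real-time policy: quality
    degrades gracefully, but nothing is treated as a system failure.
    """
    shown, dropped = [], []
    for ideal, ready in frame_times:
        (shown if ready - ideal <= deadline_slack else dropped).append(ideal)
    return shown, dropped

# Three frames at ~30 fps; the second one is 47 ms late and gets dropped.
shown, dropped = schedule_frames([(0.0, 0.005), (0.033, 0.080), (0.066, 0.070)])
```

A hard real-time system, by contrast, would have reserved worst-case resources up front so that no frame could ever be late.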

Middleware and Networking

Middleware and networking are also important to multimedia synchronization in that they affect the delivery of media data between end (client) systems.

At a most basic level, the communications infrastructure must ensure data availability to enable synchronized rendering and timely user interaction. Typically, this issue is addressed by providing for a reasonable and ideally predictable quality of service (QoS). Therefore, network QoS can be seen as an enabler for “temporal composition” by which media can be assembled together and playback organized according to individual and group timing constraints. The provision of appropriate network and application-level protocols also supports synchronized data transfer (in cooperation with any provided QoS). A large body of work on protocols for multimedia synchronization exists, ranging from lower-level adaptive, feedback-based techniques to those provisioned at the application level, such as RTP (the Real-time Transport Protocol) and RTCP (the RTP Control Protocol). Additional network-oriented considerations include issues of data buffer management and protocol stack implementation which can impact synchronization vis-à-vis the OS issues described above (e.g., data copying overhead).
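The RTP timestamp is the field receivers use for intra-stream synchronization (RTCP sender reports then map it to wall-clock time for inter-stream alignment, e.g., lip-sync). A minimal sketch parsing the fixed 12-byte RTP header defined in RFC 3550 (the parser function name is this example's own):

```python
import struct

def parse_rtp_header(packet):
    """Parse the fixed 12-byte RTP header (RFC 3550).

    Layout: V(2) P(1) X(1) CC(4) | M(1) PT(7) | sequence (16) |
            timestamp (32) | SSRC (32), all network byte order.
    """
    first, second, seq, ts, ssrc = struct.unpack("!BBHII", packet[:12])
    return {
        "version": first >> 6,
        "marker": bool(second & 0x80),
        "payload_type": second & 0x7F,
        "sequence": seq,
        "timestamp": ts,
        "ssrc": ssrc,
    }

# version 2, payload type 96, sequence 7, timestamp 160, SSRC 0x1234
pkt = struct.pack("!BBHII", 0x80, 96, 7, 160, 0x1234) + b"payload"
hdr = parse_rtp_header(pkt)
# hdr["version"] == 2, hdr["timestamp"] == 160
```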

Beyond the basic communications level, middleware addresses the need to bridge network and client functionality through the provision of centrally-based services and abstractions. As such, middleware is a “glue” layer of software between the network and applications, intended to ease application programming, application integration and system management tasks while also promoting standardization and interoperability of services by reducing the need for multiple, independently developed implementations. In terms of multimedia synchronization, middleware offers a logically centralized, service-oriented approach to synchronization (orchestration) logic. It also provides support for useful abstractions and constructs for communicating multimedia data, ranging from publish and subscribe models, to streams, flows, sources and sinks.
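The publish/subscribe abstraction mentioned above can be sketched in a few lines: sources publish samples to a named stream, and any number of sinks subscribe without the source knowing about them. A minimal, illustrative sketch (class and stream names are this example's assumptions, not a particular middleware product's API):

```python
class MediaBus:
    """Minimal publish/subscribe middleware sketch: sources publish to a
    named stream, sinks subscribe and receive every sample in order."""
    def __init__(self):
        self.subscribers = {}          # stream name -> list of callbacks

    def subscribe(self, stream, callback):
        self.subscribers.setdefault(stream, []).append(callback)

    def publish(self, stream, sample):
        for cb in self.subscribers.get(stream, []):
            cb(sample)                 # deliver to every registered sink

bus = MediaBus()
received = []
bus.subscribe("camera/1", received.append)   # hypothetical stream name
bus.publish("camera/1", b"frame-0")
# received == [b"frame-0"]
```

Decoupling sources from sinks this way is what lets a middleware layer insert orchestration logic (buffering, timing adjustment) between them without changing either endpoint.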

Database and Data Management

Database and data management are relevant to multimedia synchronization in how their design and implementation provide flexible and responsive data access. For aspects of spatial and content synchronization, issues of multimedia querying and multimedia data semantics (e.g., image analysis vs. keyword meta-descriptors) are of interest. For purposes of temporal synchronization, a broad array of other issues includes disk scheduling and storage models for particular classes of data (e.g., continuous media). This last aspect also includes how the fundamental database structure impacts the means by which the actual multimedia data is accessed; that is, do the media reside within the database itself (where access is constrained to the database management system and query engine), or are the data stored independently on separate systems, with the database containing only references to the external data? Such design considerations must be accounted for due to two primary issues: (1) timing considerations in terms of media data retrieval strictly through the database and its overhead (e.g., the potential effects of multiple, concurrent database queries on the timeliness of continuous media streams); and (2) timing considerations in terms of database efficiency resulting from large data objects (such as video) and/or objects of variable and indeterminate size (e.g., how to represent live instances of media, such as a camera capture).
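The two storage designs contrasted above can be sketched side by side with SQLite: media held inside the database as a BLOB (all access goes through the query engine) versus the database holding only a reference to data kept on a separate store. Table, column, and URI names here are illustrative assumptions:

```python
import sqlite3

con = sqlite3.connect(":memory:")

# Design 1: media inside the database -- retrieval is mediated (and
# potentially delayed) by the DBMS and its concurrent query load.
con.execute("CREATE TABLE clips_blob (id INTEGER PRIMARY KEY, data BLOB)")
con.execute("INSERT INTO clips_blob (data) VALUES (?)", (b"\x00" * 64,))

# Design 2: the database stores only a reference; the media stream
# itself is served by a separate, media-optimized store.
con.execute("CREATE TABLE clips_ref (id INTEGER PRIMARY KEY, uri TEXT)")
con.execute("INSERT INTO clips_ref (uri) VALUES (?)",
            ("media-server://clips/42",))   # hypothetical external URI

blob = con.execute("SELECT data FROM clips_blob WHERE id = 1").fetchone()[0]
uri = con.execute("SELECT uri FROM clips_ref WHERE id = 1").fetchone()[0]
```

Design 2 keeps large, isochronous transfers off the query path, at the cost of the database no longer guaranteeing the referenced data's existence or consistency.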

System and Software Engineering

System and software engineering issues are important to multimedia synchronization in how they can affect the real-time implementation of multimedia systems. To provide an actual real-time environment, systems must be appropriately engineered not only to facilitate the necessary structural and behavioral aspects of a system, but also to ensure inappropriate behavior is not inadvertently introduced and that any such anomalies can be corrected as required.

First, a system should be based on the appropriate hardware and software infrastructure, such as a QoS-enabled communication backbone and a real-time operating system. Systems based on inappropriate infrastructure risk reduced quality in the user experience due to unsuitable substrate behavior. Second, system structure (both design and implementation) must provide a flexible and extensible architecture capable of real-time performance. This requirement includes using flexible architectural techniques and technologies, including middleware and component-oriented architectures, along with the appropriate programming interfaces, useful abstractions and developmental paradigms (such as object orientation). Third, the system, application and various software components should have the ability to monitor their own behavior (i.e., the actual performance of their various components). This is a necessary step in creating a system which can adapt (i.e., “tune”) itself to address structural or behavioral deficiencies. An example is a video system which provides the ability to dynamically change playback frame rate based on monitoring the degree of asynchrony that develops during playback. Doing so illustrates the benefit of building systems that address behaviors as first-class considerations and facilitate adaptive behavior management.
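The frame-rate adaptation example can be sketched as a small feedback controller: measured asynchrony drives the target rate down when playback falls behind and back up when it recovers. The class name and the threshold/step values are illustrative assumptions, not values from the source:

```python
class RateController:
    """Shed frame rate when measured asynchrony grows; restore it when
    playback catches up -- the self-monitoring loop described above."""
    def __init__(self, nominal_fps=30):
        self.nominal_fps = nominal_fps
        self.fps = nominal_fps

    def update(self, asynchrony_ms):
        """Feed in the latest measured asynchrony; returns the new rate."""
        if asynchrony_ms > 40:          # persistently late: shed load
            self.fps = max(10, self.fps - 5)
        elif asynchrony_ms < 10:        # back in sync: recover quality
            self.fps = min(self.nominal_fps, self.fps + 5)
        return self.fps

rc = RateController()
rc.update(50)    # falling behind: rate drops to 25 fps
rc.update(50)    # still behind:   rate drops to 20 fps
rc.update(5)     # recovered:      rate climbs back to 25 fps
```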

As a result of better engineering, the potential exists for more flexible and higher quality systems, based on the increased use of common multimedia infrastructures. The end result would be better interoperability and compatibility across the user community, ultimately aiding in the acceptance and continued growth of multimedia technology across broader audiences.

