ACM Multimedia Systems 2016 Conference – May 10-13, 2016, Klagenfurt am Wörthersee, Austria

Special Session on Media Synchronization (MediaSync)

Important Dates:

  • Submission deadline: February 19, 2016
  • Acceptance notification: March 23, 2016
  • Camera-ready deadline: April 8, 2016

Papers should be prepared in the ACM style, be between 6 and 12 pages long, and be written in English. The submission site for the Special Session on Media Synchronization is available here:


Scope & Goals:

Media synchronization has been a key research area since the early development of (distributed) multimedia systems. Over the years, solutions to achieve intra- and inter-media synchronization in a variety of (mostly audiovisual) applications and scenarios have been proposed. However, it is by no means a solved research problem, as the latest advances in multimedia systems bring new challenges. The coexistence and integration of novel data types (e.g., multi-sensorial media or mulsemedia), advanced encoding techniques, and multiple delivery technologies, together with the rise of heterogeneous and ubiquitous connected devices, are resulting in a complex media ecosystem for which evolved, or even radically new, synchronization solutions need to be devised.

This Special Session addresses exactly that: the latest advances and remaining challenges in media synchronization to accommodate emerging forms of immersive, personalized and ultra-realistic media experiences in our multi-sensory, multi-protocol and multi-device world. Its purpose is to provide a forum for researchers to share and discuss recent contributions in this field and to pave the way for the future, by focusing on different aspects of multimedia systems, such as content types, (multi-)processing techniques, networking issues, adaptive delivery and presentation, and human perception (QoE). This Special Session is the continuation of the MediaSync Workshop series (2012, 2013 and 2015) and of Special Sessions at other venues (QoMEX 2014).

Topics of Interest:

  • Novel architectures, protocols, algorithms and techniques
  • Mulsemedia (multi-sensory media)
  • Theoretical frameworks & reference models
  • Evaluation methodologies and metrics
  • Standardization efforts
  • Proprietary solutions (e.g., watermarking, fingerprinting…)
  • Technological frameworks & tools & testbeds
  • Emerging media consumption patterns
  • Content-aware & context-aware solutions

Use Cases & Scenarios of Interest:

  • (Multi-party) Conferencing
  • Shared media experiences (e.g., Social TV)
  • Hybrid broadband broadcast services
  • Multi-Screen applications
  • Multi-view or free viewpoint TV
  • Tiled streaming
  • Networked games
  • Virtual Environments
  • Telepresence, 3D Tele-Immersion (3DTI)
  • Multi-sensory experiences (Olfactory, Haptics)
  • Distributed arts or music performances
  • Synchronous e-learning
  • Immersive audio environments
  • Multi-level or multi-quality media
  • Seamless session migration and convergence across devices
  • Collaborative/Cooperative multi-processing and multi-rendering of media


Organizers:
  • Pablo Cesar (Centrum Wiskunde & Informatica, CWI, Netherlands)
  • Fernando Boronat (Universitat Politècnica de València, UPV, Spain)
  • Mario Montagud (CWI & UPV)
  • Alexander Raake (Ilmenau University of Technology, Germany)
  • Zixia Huang (Google, USA)

Further information about MediaSync can be found here.