Multimedia data is now created at a macro, public scale as well as at an individual, personal scale. While distributed multimedia streams (e.g., images, microblogs, and sensor readings) have recently been combined to understand spatio-temporal phenomena such as epidemic spread, seasonal patterns, and political situations, personal data (via mobile sensors and quantified-self technologies) are now being used to identify user behavior, intent, affect, social connections, health, gaze, and interest level in real time.
An effective combination of the two types of data can revolutionize applications ranging from healthcare and mobility to product recommendation and content delivery. Building systems at this intersection can lead to better-orchestrated media systems that may also improve users' social, emotional, and physical well-being. As a simple example, allergy patients can receive much better recommendations by combining environmental risk levels (via satellite imagery, tweets, and air-quality sensors) with personal parameters such as exertion and outdoor exposure time.
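To make the allergy example concrete, the sketch below fuses a regional (macro) risk level with two personal parameters into a single score. The function, weights, and normalization here are illustrative assumptions, not a proposed method from this call.

```python
# Toy sketch: fuse a macro environmental signal with personal
# parameters into one allergy-risk score. All names, weights, and
# normalization constants are hypothetical placeholders.

def personal_allergy_risk(env_risk, exertion, outdoor_hours):
    """Combine a regional pollen-risk level in [0, 1] (e.g. derived
    from satellite imagery, tweets, and air-quality sensors) with
    personal exertion in [0, 1] and outdoor exposure time in hours."""
    exposure = min(outdoor_hours / 8.0, 1.0)   # normalize against a full day outdoors
    personal_factor = 0.5 * exertion + 0.5 * exposure
    return env_risk * personal_factor          # 0 = no risk, 1 = maximal risk

# Example: high regional pollen, moderate activity, 2 hours outdoors
score = personal_allergy_risk(env_risk=0.9, exertion=0.6, outdoor_hours=2.0)
```

The point of the sketch is only the structure: a macro stream sets the ambient risk, while personal sensing scales it to the individual user.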
This workshop invites submissions that explore novel techniques to combine multiple media streams at different scales (macro and micro) to understand and react to each user's needs. Topics of interest include, but are not limited to:
o Frameworks for information integration across macro and personal data
o Multi-scale data analysis
o Cross-modal information fusion
o Combining ego-networks with global network features
o Spatio-temporal data indexing
o Situation modeling and situation recognition
o Personalization based on sensor media
o Architectures for personal data management
o Privacy issues in combining personal and public data
o User persuasion through multimodal data presentation
o Social approaches for information integration
o Economic models of information utility
o Applications in areas including mobility, advertising, healthcare, and media recommendation