Do you like musical instruments? Are you interested in using
interactive technologies to assist and augment musicians? Join our workshop at the Augmented Humans (AHs) International Conference 2024! The workshop is hybrid, so we support both in-person and virtual participation!
Important Dates
27 February 2024: Submission Opens
20 March 2024: Submission Deadline (extended from 15 March)
Submission site: EasyChair
24 March 2024: Review Notification (updated from 19 March)
06 April 2024: Workshop (Hybrid)
Playing a musical instrument goes hand in hand with many benefits, such as a positive impact on mental health and dexterity. Electronic elements were first integrated into traditional musical instruments in the early 1930s, creating instruments, such as electric guitars, that offer new forms of musical expression. Electric instruments have since evolved by gaining networking and computational capabilities. These capabilities can be leveraged to further broaden artists' expressiveness, enhance learning scenarios, allow remote collaboration between musicians, and even create entirely new musical instruments.
In this workshop, we will discuss and interact with intelligent music interfaces of any form. A novel music interface could be an adaptation of a traditional musical instrument, an interface for learning, or even supporting software. The workshop will be held in hybrid format in conjunction with the Augmented Humans International Conference on April 6th in Melbourne, Australia.
Integrating interactive technologies into musical instruments has become an emerging field. Initial work in the domain of intelligent music interfaces focused on improving students' playing performance through learning-by-demonstration or by reflecting the performance directly back to the student for real-time improvement. Further, musical instruments have been augmented with technologies that extend the musical sound space; for example, gestures can be combined with musical instruments to change the pitch of a sound. We expect future musical instruments to integrate interactive features, effectively facilitating the learning of musical instruments, promoting self-expression, and changing stage performances. We seek high-quality contributions concerning different perspectives on intelligent music interfaces and instruments, including, but not limited to:
| Time | Program Item |
|---|---|
| 9:00am - 9:15am | Workshop Introduction |
| 9:15am - 9:30am | Moderated Speed Dating |
| 9:30am - 9:40am | Introduction of Interactive Session |
| 9:40am - 10:40am | Interactive Music Session |
| 10:40am - 11:00am | Short Break |
| 11:00am - 11:30am | Keynote |
| 12:00pm - 1:00pm | Lunch Break |
| 1:00pm - 1:30pm | Art Pieces |
| 1:30pm - 1:45pm | Short Break |
| 1:45pm - 3:00pm | Pitch Presentations |
| 3:00pm - 3:15pm | Coffee Break |
| 3:15pm - 4:15pm | Moderated Discussion and Closing |
Submissions should follow the ACM two-column
format using the sigconf option at the top of the LaTeX document: \documentclass[sigconf]{acmart}. You can download the
templates from ACM or
use the corresponding Overleaf
template.
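For orientation, a minimal document skeleton using the sigconf option could look like the following; the title, author, affiliation, and email values are placeholders to be replaced with your own:

```latex
\documentclass[sigconf]{acmart}

\title{Your Submission Title}

\author{Jane Doe}
\affiliation{%
  \institution{Example University}
  \city{Melbourne}
  \country{Australia}}
\email{jane.doe@example.org}

\begin{document}
\maketitle

\section{Introduction}
% Body of your research statement, position paper, or demo description.

\end{document}
```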
Papers can be submitted via EasyChair.
The authors of interactive demonstrations and art pieces are invited to present a prototype in the interactive workshop session.
Participants will be selected based on the merit of their contribution to the workshop. At least one author of each accepted submission must attend the workshop. All participants must register for the workshop. (Workshop registration is included in the full conference registration of Augmented Humans '24).
After the workshop, we encourage researchers to rework their research statements and position papers based on the discussions and feedback from the workshop. We support and encourage authors to make their research available on arXiv afterwards. Recorded pitches and the keynote will be uploaded to YouTube after seeking the presenters' permission. Based on the group work and moderated discussion, the organizers plan to distill critical aspects and the workshop's outcomes into an open-access position paper. The anticipated results address research questions concerning the prototyping, study design, and evaluation of intelligent music interfaces. The workshop attendees' feedback accompanies these research questions to inspire researchers interested in tackling them. Depending on the attendees' interest, we plan to organize regular meetups and establish a long-term format, with a potential future invitation for the authors to contribute to a journal.
is a Ph.D. candidate at the University of Maryland (UMD) College Park. His research focuses on AI-assisted music education. He develops technology, tools, and applications to provide real-time feedback during practice for music players. He is interested in empowering music teachers by creating super-tools that augment their capabilities in understanding their students' strengths and weaknesses.
is a PhD candidate working on augmented reality, music learning, and adaptive visualizations in the HICUP research group at the University of Primorska, Slovenia. He researches augmented reality techniques for teaching piano improvisation. His main goal is to design interfaces that enable people to be more creative in their craft. He is also part of the Center for Complexity and Emerging Technologies (COMET) research group at De La Salle University, Philippines.
is a professor at the Humboldt University of Berlin. His research focuses on physiological interaction, including designing, prototyping, and evaluating physiological user interfaces. In addition, he is an expert in integrating physiological sensing into musical instruments to implicitly and explicitly augment musicians. Thomas is deeply interested in new ways to create music, augment existing instruments, and create tools and feedback mechanisms that support music students.
has over twenty years of experience as a musician and music teacher. He is part of several band projects and co-owns the music school Schallkultur in Kaiserslautern, Germany. In addition, he collaborates with several research institutions by contributing his expertise as a musician to develop and evaluate new smart music interfaces, such as Let's Frets.
is a PhD student at the Ludwig Maximilian University of Munich, where he focuses on mixed reality as a new medium and investigates the importance of haptic feedback in virtual reality. He is therefore also interested in how such novel interactions can enhance experiences with novel music interfaces. Matthias has experience in evaluating supportive tools for practicing musical instruments.
is an assistant professor at KTH Royal Institute of Technology in Stockholm, Sweden. His research focuses on assistive technology in urban environments, in particular on designing, constructing, and evaluating multimodal and mixed reality interfaces for vulnerable road users. Additionally, he has over 20 years of experience playing trombone in amateur and semi-professional orchestras in Ukraine and Germany, and bass guitar in jazz/funk/rock bands.
is a professor at the Ruhr University Bochum. Her research focuses on the self-determination and self-expression of individuals in digital spaces, explicitly considering ubiquitous technology and novel (security and privacy) interfaces based on tangible interaction. She further leverages novel interfaces and interaction techniques to improve musical instruments dedicated to beginners and students. In her free time, she plays the piano and sings.