Robotic Musicianship (eBook)
XVII, 256 pages
Springer International Publishing (publisher)
978-3-030-38930-7 (ISBN)
This book discusses the principles, methodologies, and challenges of robotic musicianship through an in-depth review of the work conducted at the Georgia Tech Center for Music Technology (GTCMT), where the concept was first developed. Robotic musicianship is a relatively new research field that focuses on the design and development of intelligent music-making machines. The motivation behind the field is to develop robots that not only generate music, but also collaborate with humans by listening and responding in an expressive and creative manner. This combination of human and machine creativity has the potential to surprise and inspire us to play, listen, compose, and think about music in new ways.
The book provides an in-depth view of the robotic platforms designed at the GTCMT Robotic Musicianship Group, including the improvisational robotic percussionists Haile and Shimon, the personal robotic companion Shimi, and a number of wearable robots, such as the Robotic Drumming Prosthesis, the Third Drumming Arm, and the Skywalker Piano Hand. The book discusses numerous research studies based on these platforms in the context of five main principles: Listen Like a Human, Play Like a Machine, Be Social, Watch and Learn, and Wear It.
Foreword 7
Preface 9
Contents 12
1 Introduction 17
1.1 Abstract 17
1.2 Why Robotic Musicianship 17
1.3 Sound Production and Design—Survey 19
1.3.1 Traditional Instruments 20
1.3.2 Augmented and Novel Instruments 24
1.4 Musical Intelligence 25
1.4.1 Sensing and Perception 26
1.4.2 Music Generation 30
1.5 Embodiment 32
1.6 Integrating Robotic Musicianship into New Interfaces 34
1.6.1 Musical Companion Robots 34
1.6.2 Wearable Robotic Musicians 35
1.7 Discussion 36
References 37
2 Platforms—Georgia Tech's Robotic Musicians 41
2.1 Abstract 41
2.2 Haile—A Robotic Percussionist 42
2.2.1 Motivation 42
2.2.2 Design 42
2.3 Shimon—A Robotic Marimba Player 47
2.3.1 Striker Design 47
2.3.2 Mallet Motor Control 50
2.3.3 Slider Motor Control 53
2.3.4 Shimon's Socially Expressive Head 56
2.4 Shimi—A Music Driven Robotic Dancing Companion 59
2.4.1 Robotic Musical Companionship 60
2.4.2 Design 61
2.4.3 Software Architecture 62
2.4.4 Core Capabilities 63
2.5 The Robotic Drumming Prosthetic 66
2.5.1 Motivation 67
2.5.2 Related Work 68
2.5.3 Platform 69
2.5.4 Generative Physical Model for Stroke Generation 70
2.5.5 Conclusions 75
References 75
3 "Listen Like A Human"—Human-Informed Music Perception Models 78
3.1 Abstract 78
3.2 Rhythmic Analysis of Live Drumming 79
3.2.1 Onset Detection 79
3.2.2 Beat Detection 79
3.2.3 Rhythmic Stability and Similarity 80
3.2.4 User Study 83
3.3 Tonal Music Analysis Using Symbolic Rules 84
3.3.1 Implementation 85
3.3.2 Evaluation 88
3.4 Music Analysis Using Deep Neural Networks 91
3.4.1 Deep Musical Autoencoder 91
3.4.2 Music Reconstruction Through Selection 94
3.5 Real-Time Audio Analysis of Prerecorded Music 95
3.5.1 Introduction 95
3.5.2 Previous Work 97
3.5.3 System Design 97
3.5.4 Live Audio Analysis 98
3.5.5 Gesture Design 101
3.5.6 Network Design 105
3.5.7 User Study 107
3.5.8 Summary 108
References 109
4 "Play Like A Machine"—Generative Musical Models for Robots 110
4.1 Abstract 110
4.2 Genetic Algorithms 111
4.2.1 Related Work 111
4.2.2 Method 111
4.3 Markov Processes ("Playing with the Masters") 115
4.3.1 Related Work 115
4.3.2 Implementation 115
4.3.3 Summary 119
4.4 Path Planning Driven Music Generation 120
4.4.1 Search and Path Planning 120
4.4.2 Musical Path Planning 121
4.4.3 Planning 123
4.4.4 Evaluation 128
4.4.5 Discussion 130
4.5 Rule Based Jazz Improvisation 130
4.5.1 Parametrized Representations of Higher-Level Musical Semantics 131
4.5.2 Joint Optimization 138
4.5.3 Musical Results 140
4.5.4 Discussion 142
4.6 Neural Network Based Improvisation 143
4.6.1 Introduction 144
4.6.2 Semantic Relevance 146
4.6.3 Concatenation Cost 147
4.6.4 Ranking Units 148
4.6.5 Evaluating the Model 149
4.6.6 Discussion 149
4.6.7 Subjective Evaluation 150
4.6.8 Results 150
4.6.9 An Embodied Unit Selection Process 152
4.7 Conclusion 154
References 155
5 "Be Social"—Embodied Human-Robot Musical Interactions 158
5.1 Abstract 158
5.2 Embodied Interaction with Haile 158
5.2.1 Interaction Modes 159
5.2.2 Leader-Follower Interaction 161
5.2.3 Evaluation 162
5.2.4 Data Analysis 165
5.2.5 Results 168
5.2.6 Conclusion 169
5.3 Synchronization with Shimon's Music-Making Gestures 169
5.3.1 Hypotheses 170
5.3.2 Experimental Design 170
5.3.3 Manipulation I: Precision 170
5.3.4 Manipulation II: Embodiment 171
5.3.5 Results 172
5.3.6 Discussion 176
5.3.7 Audience Appreciation 177
5.4 Emotion Conveyance Through Gestures with Shimi 180
5.4.1 Related Work 180
5.4.2 A System for Generating Emotional Behaviors 183
5.4.3 Shimi Interactive Applications 190
5.5 Conclusion 199
References 199
6 "Watch and Learn"—Computer Vision for Musical Gesture Analysis 203
6.1 Abstract 203
6.2 Robotic Musical Anticipation Based on Visual Cues 203
6.2.1 Introduction 203
6.2.2 Related Work 204
6.2.3 Motivation and Approach 205
6.2.4 Method 205
6.2.5 Algorithm 208
6.2.6 Results and Discussion 210
6.2.7 Conclusions 211
6.3 Query by Movement 212
6.3.1 Motivation and Related Work 213
6.3.2 Approach 214
6.3.3 User Study 215
6.3.4 Implementation 219
6.3.5 Evaluation 220
6.3.6 Results 220
6.3.7 Discussion 221
6.3.8 Future Work and Conclusions 221
6.4 A Robotic Movie Composer 222
6.4.1 Visual Analysis 223
6.4.2 Music Generation 223
6.4.3 Informal Feedback 224
6.4.4 Discussion 224
References 225
7 "Wear it"—Wearable Robotic Musicians 227
7.1 Abstract 227
7.2 The Robotic Drumming Prosthetic Arm 227
7.2.1 Background 228
7.2.2 Related Work 230
7.2.3 Motivation 231
7.2.4 Design 233
7.2.5 Evaluation 236
7.2.6 Results 238
7.2.7 The Second Stick 240
7.2.8 Conclusion 243
7.3 The Third Drumming Arm 244
7.3.1 Introduction 244
7.3.2 Related Work 244
7.3.3 Motivation 245
7.3.4 Design 246
7.3.5 Dynamics Modeling 246
7.3.6 Input-Shaper 249
7.3.7 User Survey 251
7.3.8 Conclusion 253
7.4 The Skywalker Piano Hand 253
7.4.1 Related Work 254
7.4.2 Goals 255
7.4.3 Ultrasound Configuration Experiments 255
7.4.4 Machine Learning Design 261
7.5 Conclusion 264
References 265
Index 269
| Publication date (per publisher) | February 7, 2020 |
|---|---|
| Series | Automation, Collaboration, & E-Services |
| Additional info | XVII, 256 p. 161 illus., 98 illus. in color. |
| Language | English |
| Subject areas | Arts / Music / Theatre ► Music |
| | Computer Science ► Theory / Studies ► Artificial Intelligence / Robotics |
| | Mathematics / Computer Science ► Mathematics |
| | Engineering ► Civil Engineering |
| | Engineering ► Mechanical Engineering |
| Keywords | Computer Music • embodied cognition • human robot interaction • Machine Musicianship • Mechatronics • social robotics |
| ISBN-10 | 3-030-38930-8 / 3030389308 |
| ISBN-13 | 978-3-030-38930-7 / 9783030389307 |
DRM: Digital watermark
This eBook contains a digital watermark and is therefore personalized to you. If the eBook is improperly passed on to third parties, it can be traced back to its source.
File format: PDF (Portable Document Format)
With its fixed page layout, PDF is particularly well suited to technical books with columns, tables, and figures. A PDF can be displayed on almost all devices, but is only of limited use on small screens (smartphone, eReader).
System requirements:
PC/Mac: You can read this eBook on a PC or Mac. You need a PDF viewer, e.g. Adobe Reader or Adobe Digital Editions.
eReader: This eBook can be read on (almost) all eBook readers. However, it is not compatible with the Amazon Kindle.
Smartphone/Tablet: Whether Apple or Android, you can read this eBook. You need a PDF viewer, e.g. the free Adobe Digital Editions app.
Buying eBooks from abroad
For tax law reasons we can only sell eBooks within Germany and Switzerland. Regrettably, we cannot fulfill eBook orders from other countries.