2DRUNK2CODE (2020)

2DRUNK2CODE is my new live coding project. It builds on a previous algorithmic live coding project that makes exclusive use of micro-controllers and, unlike most live coding projects, does without a computer.
In the case of 2DRUNK2CODE, the performance is carried out entirely by a robot that determines algorithms and types them on a regular keyboard to produce sounds, while the code is displayed as is customary in the field of live coding. One could say that the robot replaces the performer/live coder rather than simulates them. It thus preserves the physicality and the sense of presence of a performance and, like any human being, is susceptible to mistakes, in this case mostly mistyping.
At first, the project questioned the notion of live computer music. It is easy, and natural, for the audience to cast doubt on the authenticity and meaningfulness of such a performance. This is less of a concern in live coding, where the code is written in real time and, most importantly, is usually displayed for everyone to see. In live computer music, however, the value of attending a live performance can be very low, lacking almost everything that draws an audience to a concert. What is left has little or nothing to do with the performance itself: socializing with friends, for example, or enjoying a better sound system.
So can a robot replace a human performer in the context of a live performance without diminishing its value? Can it actually increase that value in the particular case of live computer music?
In addition, since the COVID-19 pandemic, the very idea of a live performance where people physically meet has been challenged. As streamed live performances have almost become the norm, is there any value in having a human performer in front of the webcam? For live computer music, would streaming a static luminous logo be enough?
Finally, if you are too drunk to code, why not use a robot to do the coding for you?