I feel the futuristic relationship between technology and humans in Dan Tepfer, a jazz pianist (1)


On June 5th, 2019, the piano trio led by jazz pianist Dan Tepfer performed at the Cotton Club in Marunouchi, Tokyo, a famous live music venue in the Blue Note group where many top-level foreign musicians, especially jazz players, appear when they visit Japan. Since I was interested in Dan's music, his playing style, and his thoughts about music, I got in touch with him for the first time before he came to Japan, and we talked for a couple of hours just after he finished rehearsing in a studio. This article is a memo of what I thought during that conversation, and afterwards.

In this article, I want to write about the future of technology and humans. In particular, now that artificial intelligence (AI) is encroaching on the domain of creation, which was long considered impossible for machines, what will happen to the relationship between technology and humans in the field of art, where creativity is believed to matter most?
(In the following, terms such as "technology", "computer", and "AI" are used interchangeably depending on the context, and the usage is not strict. Advanced computer programs will include AI-like elements anyway; "AI" is a recent buzzword, and some people mean deep learning by it while others use it almost synonymously with advanced computing. Moreover, when we talk about the future, it is not so important to stick to word boundaries based on the current state of technology.)


The story here also touches on the keyword "human augmentation" or "augmented human", which has recently been attracting much attention in the field of technology.


First of all, I want you to watch one of his videos. It is a piece from his album Natural Machines. A good place to look is around the one-minute mark. The point is that keys he is not playing are moving.



Dan Tepfer's Natural Machines Ep. 1

Do you get it? He uses a piano with recording and automatic-play functions, and when he plays, a computer program also plays calculated notes based on the tones he played and his touch (the timing, speed, and strength of each key press). This program is written by Dan Tepfer himself, based on his own algorithms. He graduated from the Department of Astrophysics at the University of Edinburgh, and has mastered programming languages such as C, JavaScript, and Processing. He even experiments with automatic playing using hardware devices such as Arduino.
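To make the idea concrete, here is a toy sketch, entirely my own illustration and not Tepfer's actual code, of one simple algorithm of this kind: every note the pianist plays is reflected around a fixed pivot pitch and echoed back by the player piano, preserving the velocity (touch strength) of the original key press.

```python
# Toy sketch of a "mirror" responder: each incoming note is reflected
# around a pivot pitch and sent back to the player piano.
# Pitches are MIDI note numbers (middle C = 60); velocity is key-press strength.

PIVOT = 60  # axis of reflection (middle C); an arbitrary choice for this sketch

def mirror_note(note: int, velocity: int) -> tuple[int, int]:
    """Return the note the automatic piano should play in response."""
    reflected = 2 * PIVOT - note              # e.g. E above middle C -> A-flat below
    reflected = max(21, min(108, reflected))  # clamp to the 88-key piano range
    return reflected, velocity                # reuse the human's touch strength

# The pianist plays C-E-G (a C major triad) with varying touch:
performance = [(60, 80), (64, 90), (67, 70)]
response = [mirror_note(n, v) for n, v in performance]
print(response)  # [(60, 80), (56, 90), (53, 70)]
```

The interesting part is not the arithmetic but the loop it creates: the human hears the reflected notes immediately and lets them shape what he plays next.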

From a musical point of view, it is also important that he is a jazz pianist playing a piece he composed, and that his playing of course contains ad-lib, or so-called improvisation. Let me explain this a little for readers who are not familiar with jazz. When I talk about piano playing, some people may imagine classical music, where the performance is usually faithful to the score written by the composer, but jazz is different. In jazz, although the progression of the melody and the chords is mostly decided in advance, the core of the music resides in the ad-lib performance: each performer plays freely, while keeping some music-theoretical constraints in mind. To add one more surprising thing, Dan Tepfer says that in most of his "Natural Machines" pieces he does not even decide the chord structure beforehand. That is, in this video, he improvises, the computer automatically responds to his playing based on the algorithm, and he continues to play, possibly amplifying his ideas under the influence of the automatic playing.

You will notice that this is completely different from simply mixing two independent performances, his and the computer's, after the fact. He plays the notes that come to mind on the fly, develops his ideas while listening to the computer's performance, and the playing continues from there. His ideas and the computer's feedback become continuously multi-layered, as if reflected between facing mirrors.

I feel I can see the futuristic relationship between technology and humans here. I sometimes ask friends, "If we compare a composer and a performer, which is more likely to be replaced by AI or a robot?" When I watch Dan Tepfer's performance, I get a different perspective. I can see a future in which more and more music emerges from human ideas and performances being mixed with, and expanded by, computers. At the same time, it may sound as if a human alone were playing, and listeners may not even notice the computer's contribution. It is seamless. This is completely different from the electronic music that has existed so far; the difference lies in the underlying philosophy toward computers.

By the way, I am not a music expert. I am a university researcher in computer science, specializing in the "computer-human interface". In other words, my research theme is new methods of interaction between computers and humans, and recently I have been thinking about how to make that interaction smoother. For example, I am interested in ways to make interaction with smart speakers more "comfortable".

If I were just a consumer of such products, I might try to predict what will happen in the future, but as a researcher I think about what we should do and what kinds of interaction skills we should embed into computers and robots. My viewpoint is a bit different from the technology for making computers more intelligent. For example, I ask, "Should we design a conversational robot to interact with humans as an equal partner (I am talking about the design of its conversation style and the mindset an AI should have), or should it be designed like a concierge trying to help its owner?" Such questions directly affect the conversation design. As a concrete example, when a human says "Hey, Alexa" to a smart speaker, should it answer "What?" like a friend, or "May I help you?" like a concierge? Or, when I ask "What's the weather like today?", should it say "Well, according to the weather forecast, it might rain" or "The probability of rainfall today between 15:00 and 18:00 is 60%"? Such choices depend on the designer's philosophy about questions like "What should future AI be like?" and "What is a robot?" rather than on the technology itself.

If you regard the smart speaker as just an intelligent machine, you may think it should accurately state facts such as the probability of precipitation, but if you imagine an 80-year-old man living alone talking to a pet robot at home, you may support the idea that the robot should say, "The weather forecast says it looks like it's going to rain. A bit worrying, isn't it?"
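The point that the persona is a design parameter, separate from the underlying facts, can be sketched in a few lines of code. This is a hypothetical illustration of my own; the forecast data and persona names are made up, not any real smart-speaker API.

```python
# Toy sketch: the same weather fact phrased under two design philosophies.
# The forecast dict and persona names are hypothetical, not a real API.

FORECAST = {"start_hour": 15, "end_hour": 18, "rain_probability": 60}

def respond(persona: str) -> str:
    """Render the identical forecast in the voice chosen by the designer."""
    if persona == "machine":  # accurate, fact-first
        f = FORECAST
        return (f"The probability of rainfall today between "
                f"{f['start_hour']}:00 and {f['end_hour']}:00 is {f['rain_probability']}%.")
    if persona == "companion":  # warm, pet-robot-like
        return ("The weather forecast says it looks like it's going to rain. "
                "A bit worrying, isn't it?")
    raise ValueError(f"unknown persona: {persona}")

print(respond("machine"))
print(respond("companion"))
```

The fact is the same in both branches; only the designer's answer to "What should this robot be?" changes which branch ships.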

I wrote above that "What should the computer/robot/AI be?" is a kind of design philosophy. This is also related to the recent argument that "robots may take away human jobs". If AI is considered not an independent human-like existence but something that augments human abilities, it does not easily connect to the story of jobs being taken away. Although computers can calculate correctly in a short time, in this view they simply expand human skills and create more jobs.

Dan Tepfer's performance stimulates various ideas when thinking about these things. When I met him, we talked about topics such as "If you compare a composer and a performer, who is more likely to be replaced by AI?", "If you had a function that automatically turned your novice playing into a sophisticated performance, would that be good or bad?", and even non-musical questions such as "Which skill is more important for a doctor: finding a cancer in an image, or telling the patient about it?"

Well, I am getting a little off track, so from the next article I will write about the conversations I had with Dan Tepfer. He was in fact very knowledgeable not only about music but also about technology. He had a state-of-the-art understanding of the field and said he had even discussed it with the leader of Google's Magenta project, an art-creation project using deep learning. (to be continued)

-> to the next article

P.S. This is a video of him talking about the music and the algorithm:


Dan Tepfer's talk: Fascinating Algorithm
