Interview with Mike Cook by Marcus Repa

20 minutes

Subject:

Society

Format:

Text

Participants:

Mike Cook and Marcus Repa

Mike Cook is an AI researcher and game designer, currently a Senior Lecturer in the Department of Informatics at King's College London and a member of the Knives and Paintbrushes collective. His research combines artificial intelligence, creativity, and game design: "I'm interested in how we can use technology to be creative and inventive, and how we can get computers to do strange new things."

Marcus Repa, PhD Candidate in Sociology (USP). http://lattes.cnpq.br/2089428033025219

Researcher at Understanding Artificial Intelligence (IEA–USP).

Marcus Repa

In The Social Responsibility of Game AI, society is presented as a possible force for understanding the uses of intelligent systems. Considering the video game industry, what has this commercial and business model shown, in a technological sense, about what forms of regulation are possible?

Michael Cook

I think regulation is perhaps an area that games don’t necessarily help us with (or help us understand). However, because they’re such a casual and playful way for us to encounter AI, they can also help us understand the ways AI can go wrong, which might lead us to think about regulation. Technology generally tries to stop us seeing errors these days – when I was young, technology around me broke in obvious ways and it did not try to explain itself very well at all. 

Now our phones and computers will explicitly lie to us rather than tell us something might be broken. I think that’s significant. Games, on the other hand, are often messy and full of problems, and that includes their AI systems. I like hearing people play games and reacting to AI making mistakes, and then coming up with an explanation for why that might be. You can really hear people forming their own theory about how a particular algorithm might work.

Marcus Repa

Also in the article, games are presented as a means of knowledge that would provide "learning about common AI techniques and systems, and showing people what other things AI can do". In this respect, are the technical and language elements being standardized, particularly by the publishers and developers of triple-A games?

Michael Cook

I think there is an element of standardisation among AAA developers, absolutely, and we see this even in the indie space now too. A lot of the trends slightly push away from encountering AI in the wild too – I’m thinking about the increase in social multiplayer experiences, where the AI is kind of replaced by other people, and roguelikes which are often less about doing weird stuff with AI. 

But there’s still plenty of weird stuff being done with AI and related techniques, it’s just maybe not front and centre in the games industry. Games like Shadows of Doubt and Caves of Qud are doing wonderful and interesting things with technology. I suppose one question is how you make these games more visible and accessible to people who don’t know anything about games. We did see recently that the Witcher 4 tech demo made some fun, weird claims about the AI of its civilian NPCs, but to be honest I’m expecting it to amount to nothing; it felt like the standard exaggerated tech demo stuff.

Marcus Repa

Considering independent game production, does this style and genre still have creative and innovative possibilities, given that most game systems and engines are standardized? 

Michael Cook

Independent games have never been more vibrant, I think. Technology is standardising and I do see homogenisation in places — one of them being game jams. I think the proliferation of big tools and tutorial series is good because more people are making games, but we’re not exposing them to the weirdness of what games can do or be, and I think as a result people are maybe working in a slightly narrower space.

A lot of people also want to make games as a career, and that encourages a lot of people to follow what they see trending around them. I did some reviewing of open source Unity games recently for a research study and I was surprised at how many of them were clones of mobile infinite runners, but it makes sense because those games are popular and people see them a lot. 

But I think there’s also never been more people making games, making weird games, and making their own tools and engines too. I think new technology makes it easier for people to mess around. And in particular I think fantasy consoles like PICO-8 are helping create new communities of weird creators. I think the challenge is always encouraging people to make messier, more experimental things at the cost of trying to be a ‘success’ in the normal sense. I see that in my students too. 

Lots of them feel they need to study certain topics or be the best at certain tools even if they don’t like them, because they think it’s how you become a successful computer scientist or software engineer. Encouraging them to step outside that for a bit and play around is really important to me.

Marcus Repa

Some professional groups in the video game industry have recently decided to reorganize themselves to protect their jobs from being replaced by artificial intelligence. What are the challenges for the gaming industry? How can we adapt “Social Responsibility” to AI research and development in this scenario?

Michael Cook

I think the games industry needs a lot more worker organisation and protection. AI is a threat to jobs in a lot of different ways – it’s easy to imagine that one day someone will tap you on the shoulder and say “You’re fired, we’re replacing you with AI”, but in reality it doesn’t look like that. Instead what happens is that fewer people get hired, or people aren’t replaced when they retire or move away. A very big problem is the impact it will have on junior employees. A lot of companies (not just in games) tell me they are hiring fewer interns and juniors now because they believe a lot of the simple tasks can be done with AI. So I think organisation is required.

In terms of what can be done, I don’t know. AI feels like an issue on a par with climate change – not in severity, of course, but in complexity. It touches so many people, all around the world, in different ways, and everyone’s motivations are really complicated. But organising yourselves as workers can only ever be a good thing – people working in the games industry are commonly exploited and not cared for properly, despite the fact that they power one of the biggest creative industries in the world. They deserve better.

Marcus Repa

The Angelina project appears to integrate your academic research with your work in game development. Combining both areas, it seems to reflect your perspective that games may serve as a potential medium for exploring and understanding concepts of Artificial Intelligence. Could you elaborate on the initiative, its development process, and how Angelina has been received, both within the academic community and the professional game development industry?

Michael Cook

It was a strange time for me. I started it as a PhD project and only expected it to last a year or two maybe, but it ended up becoming my whole PhD, and the research has sort of become my whole career. Games are an incredibly good place to study creative artificial intelligence – pick any creative problem or skill and you’ll find it in the games industry. On top of that, we can study how creative people collaborate and solve problems using a range of skills too. Making a game is a fascinating creative problem and there are so many interesting things to think about.

With Angelina I really started with the most basic things I could think of, and grew from there. So I started out just with very abstract arcade games mostly made of rules, but the next project looked at art and data, and then the next project looked at how to communicate meaning – and you know, none of them were that successful really. I think it’s hard when we look back on projects like this because their outputs look so basic and silly. But when you’re doing something for the first time you’re fumbling in the dark, and asking even basic questions is so hard. I’m really proud of the work I did then, even though it was messy and flawed. 

The response from the game developers I spoke to was always lovely, but you have to remember this was 2010-2016. It was really important to me to talk to developers and develop a relationship with them, to listen to what they thought and understand it, to think about how AI research could be a positive addition to game design communities. I think we did a good job, honestly. The dialogue was always healthy, people were excited and engaged. We learned a lot about creativity and social perceptions of AI. It makes me sad how all of that has been crushed by large AI companies coming in with absolutely zero respect for the public or the industry. The public perception has now been shifted so fast and so hard that my work is now usually misunderstood by everyone, whether they love AI or hate it.  

Nowadays I still work on many of the same questions I was interested in back in the early 2010s. I don’t get as much time to work on them these days because I’m teaching, supervising, and so on. But I hope to keep working on them for the rest of my life, tinkering away at little machines and using them to ask questions about how games work and why. And it’s always been connected to my own creative practice of making games, and I find that has only gotten tighter as time has gone on. I make a lot more small, experimental, silly things now and it makes me so happy, but it also connects to the research and helps me think about that in new ways too.

Marcus Repa

In your text, AI-Generated Imagery: A New Era for the ‘Ready-made’, you bring an intriguing philosophical reflection on meaning, and how visual art language seems to have become a “ready-made.” You reference Duchamp and his style as a possible origin for today’s generative image systems. How do you see the creative and artistic possibilities that arise from this? Do you think we’re entering a moment where generating images through code might become as ordinary as using objects?

Michael Cook

This paper was mostly written by my student, Amy Smith; I only gave some light guidance on it, so I asked her this question. This is what she wrote:

What’s powerful (and challenging) about this moment in art and technology is that anyone can now generate compelling images through code or language. But that ubiquity also means artistic value isn’t necessarily found in the image alone — in this context it can also depend on how the image is used, interpreted, and curated/situated. So yes, I think generating images may become as commonplace as arranging objects, but the creative act in this context could potentially shift towards what users do with those images: how they shape narratives, how prompts are crafted and so on. It means that potentially the AI artist’s role may become more about navigation: through systems and aesthetics, as well as cultural codes (which isn’t new). In that sense, the creative and artistic possibilities are huge — but they rely more on clarity of intent, sensitivity to context, and the ability to shape meaning from abundance.

I don’t know if I think generating images will become commonplace in this way; I think it remains to be seen how people will react to such images becoming widespread in society. I feel like it might become commonplace for throwaway usage, but I don’t know if it will be possible to elevate them as Duchamp did (or as other artists do). However, I am not an artist!

Marcus Repa

What are your thoughts on the challenges facing the continued study of Artificial Intelligence and the potential of interdisciplinary research across academic fields such as philosophy, computer science, the social sciences, and related areas?

Michael Cook

Artificial intelligence has always been a weird area of study. There’s this saying, “AI is whatever hasn’t been done yet” – the idea that AI is by definition things that we aren’t really sure how to get a computer to do yet. Once a computer can do something, it’s not AI anymore — it’s just normal. Think about all the things we take for granted that computers do for us today, and how they would be perceived 50 years ago. Because of that, AI is often about the transition of technology from theory to practice, from the lab to society. 

AI is the way a new idea enters the world, and in the past it did so very slowly: universities trying out messy new things that slowly became usable, which companies slowly adopted, and which slowly found their way into our homes. What happened post-2016 was that this area got taken over by capital investment, and accelerated through massive amounts of money. It moved fast because it found a way to speed up outputs by spending more money (scaling up compute), which also meant it could move faster than universities, than governments, than regulators, than courts – and it could also move faster than the public could adapt. Suddenly we have a tech sector that isn’t interested in the old process of technology slowly becoming part of our lives. The tech industry doesn’t want a debate; it doesn’t want to listen to what people want or don’t want. I believe this is the source of a lot of the problems we have today.

I think collaboration between fields is important, and I think “AI” – or whatever we want to call what’s going on now – is everyone’s concern. And so it helps for all of us to know a little bit about it so we can think more critically, understand what people are saying to us, and also talk to each other. I think a major issue, related to this, is educating the public. It’s one of the reasons I enjoy doing science outreach so much, and one reason why I think it’s good that games give us an opportunity to play with AI and think about what it means for a computer to be ‘intelligent’ or to solve a problem.

I still firmly believe in public-funded science as an innately good thing, and I believe in artificial intelligence as a field full of good ideas and exciting ways of looking at the world. I also believe that we all get to decide the world we live in together, no matter how much power or money is consolidated in the hands of a few people. I don’t know exactly what’s coming next, but I know that cross-disciplinary collaboration and dialogue will be a good way to find the best way forward, for everyone. And I hope we find lots of exciting new ideas and creative tools there as well. 
