About Mike Pappas & Modulate.ai
Mike Pappas is the CEO and Co-Founder of Modulate.ai, a start-up using cutting-edge machine learning to make voice chat safe.
In case you don’t play online games, there’s an entire universe of people out there who play games like Call of Duty, League of Legends, etc., and they use in-game voice chat to communicate with each other.
The problem is, like other digital avenues, these spaces can be made toxic by bullies, trolls, and hateful people. For years, this toxicity has been seen as an unsolvable problem.
Mike and his co-founder have created a product called ToxMod that is a significantly more effective and cheaper way to moderate voice chats, creating a better experience for the vast majority of players who aren’t bullies or jerks.
They’ve raised over $30 million in funding and recently made TIME’s list of the 100 most influential companies. It’s a brilliant solution to a problem you might not even have thought about, so here’s Mike Pappas of Modulate.ai.
If you enjoy the show, please rate it 5 stars on Apple Podcasts, subscribe, and leave a nice review!
EPISODE HIGHLIGHTS:
Creating A Fun Gaming Experience
2:07 – “When you’re walking down the street, you’re headed to the grocery store and I’m driving by and I roll down my window and I shout a bunch of slurs at you, how does that make you feel? That’s not something that you’re going to say ‘Oh, I’m glad that I had that experience.’ We’re not talking about what I have the right to say, whether the police should chase me down for shouting that from my car. We’re talking about what makes for a good experience on an online platform. And these platforms, especially game studios, they’re in the business of creating a fun experience, of creating an experience people want to go and participate in. And so if the situation is that there’s people driving by and shouting all of these slurs, they haven’t succeeded at creating that fun experience, whatever name you want to put on it. That’s just, that’s a problem for them and that’s something that they want to do something about.”
3:24 – “There have been studies across games, across different kinds of online platforms. What we consistently find is it’s about 2%, maybe 3%, of players or users on these online platforms who are kind of there to mess with people, who are there because they enjoy going out and just being as radical, extreme, offensive as they possibly can…that 2–3% accounts for, depending on the platform, anywhere from 20 to 50 or 60% of the bad experiences people have. Now that’s not 100%. A lot of the remainder comes from anything from someone having a bad day, to not knowing the cultural norms in the new group you walked into, so you say something you didn’t realize was offensive, to people who are just ignorant but not necessarily mal-intended. Which is unfortunately just a thing; the world’s evolving quickly and not everyone has that full context. So education’s important.”
4:29 – “Time and time again, what we see is there really is this small but very core contingent of users online who are very much intending to go out and create this negative experience. But the vast majority of people, they don’t feel that way. They’ve succumbed to the expectation that there’s nothing we can do about hate and harassment. And so we’re just going to accept it, I guess. I guess I’ll just mute that person. I don’t have another choice, but 98% of people do raise their hands and say, ‘I wish it wasn’t this way. I wish something could be done. I just don’t know what’s possible.’”
12:57 – “A big part of what we’re really trying to work with the studios on is, look, we’re not just here to flag these dirty words to you or something. We’re really here to work with you on a strategy for your community that includes them in the process that actually has everyone come together and say, what do we want this game ecosystem to look like in the first place? And then let’s all collaboratively build towards that. But that’s a really important nuance that I think a lot of folks miss. Yes, we sell to the game studios, but a huge part of the work that we’re doing is in making sure that we’re working closely with the players as well to understand what are they concerned about, what do they want out of this platform? And if they want a space where that small percentage of folks can just go all out and really, you know, the trash talk to end all trash talks, sometimes that’s okay as long as we can curate that space for them. And you’re not going to have some unsuspecting ten-year-old wander into that space by accident without knowing what they’re getting into.”
ToxMod Uses Machine-Learning Cues To Spot Bad Gaming Behaviour
21:34 – “The way ToxMod is different is we say: instead of trying to transcribe everything, or instead of just waiting for player reports, why don’t we use some of this incredibly cool machine learning technology we’ve built to analyze the emotion, the behavioral dynamics of the conversation, the ups and downs of the energy of it, and use that to recognize which of these conversations have the telltale signs of going badly in some way. Whether it’s people starting to shout at each other, someone new joined the conversation and everyone got real quiet, or it’s a one-on-one conversation between an older male-sounding voice and a young pre-pubescent female-sounding voice. And then you look and see, ‘Hey, that older male has three other one-on-ones also with pre-pubescent girls. Maybe we should take a closer look at that.’ And so all of these machine-learning cues let us focus much more effectively. And instead of only catching 8% of the bad activity with 30% accuracy, we’re catching anywhere between, let’s call it, 30–60% of the bad activity at 95% accuracy. And that’s before we further train and improve the system to get better and better at finding that bad behavior.”
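For readers who want to see the shape of that idea, here is a minimal sketch of “cheap cues gate the expensive step.” This is not Modulate’s code; every class, feature, weight, and threshold below is invented for illustration, assuming only that a small model can emit low-cost per-clip risk signals:

```python
# Hypothetical triage sketch: lightweight prosody/behavior cues decide which
# voice clips are worth the expensive step (transcription or human review).
# All names, features, weights, and thresholds here are made up.
from dataclasses import dataclass

@dataclass
class VoiceClip:
    speaker_id: str
    shouting_score: float       # 0..1, rising volume/pitch aggression
    silence_after_join: float   # 0..1, conversation went quiet after a join
    age_mismatch_score: float   # 0..1, adult voice in a 1:1 with a child voice

def risk_score(clip: VoiceClip) -> float:
    """Combine cheap cues into one triage score; a weighted max lets any
    single strong cue escalate the clip on its own."""
    return max(
        0.5 * clip.shouting_score,
        0.3 * clip.silence_after_join,
        0.9 * clip.age_mismatch_score,
    )

def triage(clips: list[VoiceClip], threshold: float = 0.4) -> list[VoiceClip]:
    """Return only the clips worth escalating to transcription/review."""
    return [c for c in clips if risk_score(c) >= threshold]

clips = [
    VoiceClip("p1", shouting_score=0.9, silence_after_join=0.1, age_mismatch_score=0.0),
    VoiceClip("p2", shouting_score=0.1, silence_after_join=0.0, age_mismatch_score=0.0),
    VoiceClip("p3", shouting_score=0.2, silence_after_join=0.1, age_mismatch_score=0.7),
]
print([c.speaker_id for c in triage(clips)])  # ['p1', 'p3']; 'p2' never escalates
```

The design point is that the expensive analysis only ever runs on the small set of clips the cheap cues flag, which is what makes the accuracy-versus-cost trade Mike describes possible.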
28:47 – “The simple truth is that that kind of emotion and prosody and behavior analysis can all be done on much smaller models than transcription requires. So, A, that opens up the possibility of doing it right there on your device. We don’t even have to send the data off your device until we know that there’s something worth investigating. But even if we have to do it in the cloud for whatever reason, it’s much, much cheaper. And then we’re only doing transcription on a tiny fraction of the overall population instead of transcribing everything. And so that ends up with us at a price point that’s in the territory of 100x cheaper than what you would get if you were just trying to do transcription alone.”
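As a toy back-of-the-envelope check on where a figure “in the territory of 100x” can come from (every number below is an assumption for illustration, not Modulate’s actual costs or escalation rates):

```python
# Back-of-the-envelope only: unit costs and rates are assumed, not real.
transcribe_cost = 1.000   # $/audio-hour to transcribe everything (assumed)
prosody_cost = 0.005      # $/audio-hour for the small triage model (assumed)
escalation_rate = 0.005   # fraction of hours flagged for transcription (assumed)

naive = transcribe_cost
tiered = prosody_cost + escalation_rate * transcribe_cost
print(f"tiered ~ ${tiered:.3f}/hr vs ${naive:.2f}/hr -> {naive/tiered:.0f}x cheaper")
# tiered ~ $0.010/hr vs $1.00/hr -> 100x cheaper
```

Under these assumptions the blended cost is the cheap model on everything plus full transcription on only the flagged sliver, which is how the ratio lands near two orders of magnitude.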
50:23 – “We’ve made a dent, but it’s also not nearly enough. And that’s why we’re continuing to go out and push further and work with a larger and wider variety of studios. But we’re out there with a number of studios today. We’ve been out with some studios, for instance Rec Room, the social VR platform, for over a year now, where we’ve been continually working with them and helping them identify problematic players of a variety of types. And we’re hearing really positive things. Seeing it in the stats is one thing, but honestly, hearing from players, ‘Hey, you know, I came back to this title three months later after having this really bad experience and it’s changed. It feels different now.’ That’s really gratifying to see.”
Creating Safe Spaces Vs Allowing Questions To Flow
53:18 – “How much do you want to prioritize having a safer, more inclusive space here versus being able to discuss things? I think an example is people who are trying to understand gender. There are people out there, a lot of people, who are asking well-intentioned, innocent questions that, if someone who identifies as non-binary or is otherwise in the LGBTQ community sees those questions, it feels like someone is questioning their identity. That can be really painful for that person. And that’s legitimate. That’s important. We need to care about that. But there also should be a space for people to go and innocently ask those questions, because if there’s no space like that, they all end up on the dark web getting their answers from the people we don’t want answering those questions. And so that’s the kind of line I think of: should this be actively a safe space that is designed to preserve and protect the folks that need it? Or is this space more curated to allow questions to flow? But it’s rarely going to be true that anyone wants a space where hostility is welcome or encouraged.”