
Why Community Moderation Matters: A Conversation with Izzy Neis and Carlos Figueiredo

March 23, 2020
April 2, 2024

From exposés about moderator working conditions at Facebook to new social media legislation being introduced across the globe, content moderation has become a hot -- and often controversial -- topic over the last few years. Often discussed but little understood outside of the community management and Trust and Safety professions, community moderation can feel like a mystery to newcomers. Luckily, there are industry veterans who can help sort through the chatter and provide insight into what really matters.

We brought together old friends and moderation vets Izzy Neis, Senior Director of Engagement & Strategy at ModSquad, and Carlos Figueiredo, Director of Community Trust & Safety at Two Hat, to share their unique experiences moderating content for kids’ platforms, games, social networks, and more in a far-ranging conversation.

Tell us about your interest and experience in moderation and community.

Carlos: Online safety, content moderation, and trust & safety work has been an 11+ year journey for me, from the early days working for Club Penguin/Disney to now working at Two Hat with amazing global partners like ModSquad, and more recently co-founding the Fair Play Alliance. I care deeply about enabling healthy and productive user interactions, and seeing online platforms fulfill their potential.

Izzy: For nearly 20 years, I have been developing safe, engaging digital experiences for children, families, and entertainment brands. From building teams to developing policies and practices across multiple platforms and verticals, it’s been a wild ride. I’ve had the pleasure of working with some of the biggest brands, while also working closely with key entities like the FTC and the FCC (in regards to COPPA), safe harbors, software ratings boards, and fantastic partners, like the team at Two Hat. For the last 7 years, I’ve helped global brands engage, support, and protect their customers and audiences through digital engagement and strategy at ModSquad.

For companies looking to bring on a community manager for their product, what should they consider?

Izzy: Community management tends to be the role that’s hired about two months before launch, which is too late. The minute you start talking about user experience, that community manager needs to build a map and work closely with the user experience group. The community manager should take into consideration: What does the back-end look like? What does the communication flow look like? What is the CRM solution? What is the admin solution? What are the filter tools? What are the smart behavior management tools? The community manager should also be working with customer support, as they have a huge influence on how people understand and interact with a product when they're upset.

I love the fact that people understand this now, although it's unfortunate that events in the wider world are what's forcing a perspective on moderation. The kids industry has been focused on community and moderation for ages, because we have to deal with laws like COPPA, but now everyone is. We all have to start thinking about moderation and community earlier. The earlier you get on it, the less expensive it is down the road. Once you launch, you’ll realize all of the tools you're missing and the improvements that need to be made.

Some companies try developing an in-house moderation tool. What should they be aware of?

Izzy: Every time somebody tells me, “We're just going to build a filter and implement it and we'll manage it ourselves,” it makes me crazy! There are companies like Two Hat that are already doing that hard work. I appreciate developers who are interested in the challenge, but they may not have the funding and organizational support around a tool like this. The first time it breaks, it's likely the last thing on the roadmap that gets fixed.

Carlos: That's a great point. Do you see a lot of that tendency of people to say, “We're just going to build our own filter”? It’s the most common objection we face when talking to companies.

Izzy: I hear it and my first response is pretty impassioned. I've been at two companies where I worked with the development team and we planned out beautiful tools, with everything we needed, the interface and the data we wanted to pull. However, the development process is often broken into three phases and you are told, “You’ll get your third phase by launch.” By launch, the community moderation tool ends up being a tenth of what was initially asked for.

Why does such an important feature tend to fall by the wayside?

Carlos: You have to think about the specialty and the focus of any company. Gaming companies, for example, are focused on making awesome games and experiences. Moderation tools fall lower on the roadmap because there are so many components, such as maintaining a word filter with warning messages, suspensions, muting, reporting flows, and escalations. It requires a multidisciplinary approach to get it right. There are so many things that people outside of moderation don’t think about, like ways of manipulating the filter or players using Unicode characters. For example, if a game has a username spinner that randomly assigns words at account creation, they forget that if you have the word “twelve” and you have the word “boy,” a little boy can identify himself as a 12-year-old boy in a game. They may have considered pop culture, internet lingo, hate speech, bullying, and sexual content, but forgot about the implications of personal information and usernames.

Izzy: You also need to be aware of spelling words wrong, using upside-down characters, borrowing accents from other languages, or combining words to create inappropriate phrases. I’ve also noticed those outside the world of moderation and community are not really aware of the challenges of behavior in a community. Maybe if they’re really avid gamers themselves, they’ve experienced some of those things. But it's different experiencing it yourself as a player and actually understanding the scope of the problem and all the nuances behind behavior. It’s a massive challenge; it's not just a matter of blocking a couple of words.
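The evasion tricks Carlos and Izzy describe (misspellings, Unicode lookalikes, split-up words) are why a filter is more than a list of banned strings. As a minimal, hypothetical sketch in Python (the deny list and substitution map here are made up, not any vendor's actual rules), a filter typically normalizes text before matching:

```python
import unicodedata

# Hypothetical deny list; production filters are far larger and context-aware.
BLOCKED_TERMS = {"you suck", "noob trash"}

# Minimal map of common character substitutions (leetspeak, lookalike symbols).
SUBSTITUTIONS = str.maketrans({
    "0": "o", "1": "i", "3": "e", "4": "a", "5": "s", "7": "t",
    "@": "a", "$": "s", "!": "i",
})

def normalize(text: str) -> str:
    """Reduce common filter-evasion tricks to a canonical, comparable form."""
    # Fold accented and decorative Unicode characters to their ASCII lookalikes.
    folded = unicodedata.normalize("NFKD", text)
    folded = folded.encode("ascii", "ignore").decode("ascii")
    # Lowercase, undo simple character substitutions, and strip the separators
    # that players use to split banned words apart.
    folded = folded.lower().translate(SUBSTITUTIONS)
    return "".join(ch for ch in folded if ch.isalnum())

def is_blocked(text: str) -> bool:
    """Check whether the normalized text contains any blocked term."""
    canonical = normalize(text)
    return any(term.replace(" ", "") in canonical for term in BLOCKED_TERMS)

print(is_blocked("Y0u   $ück"))  # True: survives spacing, leetspeak, and accents
```

Even this toy version hints at the maintenance burden: every new evasion pattern means another rule, which is exactly the ongoing work in-house teams tend to underestimate.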

What features should companies look for in a moderation toolset?

Izzy: First and foremost, the ability to update and learn in a short period of time.

Carlos: Absolutely. There's no time to waste with a 24-hour turnaround on training a new machine-learning model when in four hours your community can blow up.

Izzy: While it's very easy to identify bad words, it is somewhat less easy to identify when people are misusing appropriate words. You need really strong tools that allow you to build interfaces on the back end to track individuals who stir up trouble. Those are the people you want to watch. While they're not doing anything to get kicked off a site, they can trigger problematic behavior down the line.

Carlos: Beyond needing a moderation tool, you also need really good people who are trained to do the work and can look for the right things, which is where awesome companies like ModSquad come in, with a well-trained workforce of moderators who know what to look for.
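Izzy's point about tracking users who stir up trouble without crossing a bannable line lends itself to a simple sketch. Assuming Python and made-up thresholds (nothing here reflects a specific product), a back end might keep a per-user incident log that flags repeat borderline behavior for human review:

```python
from collections import defaultdict
from dataclasses import dataclass, field

# Hypothetical thresholds; a real tool would tune these per community.
WATCH_THRESHOLD = 3      # incidents before a user lands on the watchlist
ESCALATE_THRESHOLD = 6   # incidents before a human moderator is pinged

@dataclass
class UserRecord:
    incidents: list = field(default_factory=list)  # moderator notes, timestamps, etc.

class Watchlist:
    """Track users who skirt the rules without committing a bannable offense."""

    def __init__(self):
        self.users = defaultdict(UserRecord)

    def log_incident(self, user_id: str, note: str) -> str:
        record = self.users[user_id]
        record.incidents.append(note)
        if len(record.incidents) >= ESCALATE_THRESHOLD:
            return "escalate"  # route to a human moderator with the full history
        if len(record.incidents) >= WATCH_THRESHOLD:
            return "watch"     # keep the user on the radar; no action yet
        return "ok"

watchlist = Watchlist()
for note in ["borderline insult", "filter probing", "targeting another user"]:
    status = watchlist.log_incident("josie_42", note)
print(status)  # "watch" after three logged incidents
```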

What are the key traits to look for when bringing in a moderator? How do you identify a powerhouse moderator?

Izzy: Number one: An inherent sense of curiosity. Machines, for as much as I love them, are not curious. Humans are great at thinking, “I'm reading between the lines, and I feel like I want to see where this interaction goes.”

Number two: A passion for the rules, but an understanding that it’s the platform’s rules and not their own personal view of the rules. Moderators need a light touch of empathy sometimes, but they have to stick to the rules and escalate issues when necessary. It's not their job to be the mom. It's their job to ensure that the expectations set are accurate.

Number three: The ability to document and track trends. I like moderators who are dedicated to the cause and therefore, if they see a situation, they'll escalate it. They'll write notes, they'll give a viewpoint. It's almost like a reporter mentality: “I saw this, I'm going to put it in this person's profile in case it happens again.” That way there is some level of history involved with decision-making in the future.

Carlos: When I was at Disney, the Director of Community Support I worked with told me about his playground philosophy when it comes to moderation. Basically, you are not going to be the friend of the kids playing the game, but you're also not going to be the super stern principal. You're there to make sure that the rules are observed from that objective, practical point of view.

Ideally, you’re pairing the right tool with a moderator who’s keyed into the community, as we discussed earlier. The strength of each component complements the other.

Izzy: I think the blend of moderators and toolsets really clicks when it’s used to identify potentially problematic users and how their content needs to be tracked within the system. We can take notes and drive a really thoughtful, robust moderation process beyond simply, “Josie said this and Bobby did that.” Instead, it's “Josie and Bobby have been talking about this for a month. They haven’t done anything to get kicked out yet. But they've actually brought our attention to these six other things that have happened down the pipeline.” You can't get that unless you have powerful tools that allow you to identify those users and do the research.

Carlos: There is value in having a holistic picture of user behavior and knowing whether users are trusted or untrusted in the system, or seeing all their data points, such as how many times they were reported in the last 24 hours or in the last week.

Izzy: Exactly, I want to know what people do and what their activities are. I also like tools like Two Hat’s where I can see what my staff is doing, because there are two types of communities at work. For example, in gaming, there's the community in the game and the community behind the game. I want to know what the QA guys down in Brazil are doing.

Carlos [a native Brazilian]: They're probably up to no good.
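Carlos mentions data points like report counts over the last 24 hours or the last week. As a rough illustration only (Python, in-memory storage, invented names, nothing tied to an actual product), a tool could keep a rolling report history per user to surface that context at a glance:

```python
import time
from collections import defaultdict, deque
from typing import Optional

DAY = 24 * 60 * 60
WEEK = 7 * DAY

class ReportTracker:
    """Keep a rolling history of abuse reports per user for at-a-glance context."""

    def __init__(self):
        self.reports = defaultdict(deque)  # user_id -> timestamps of reports received

    def add_report(self, user_id: str, timestamp: Optional[float] = None) -> None:
        self.reports[user_id].append(timestamp if timestamp is not None else time.time())

    def counts(self, user_id: str, now: Optional[float] = None) -> dict:
        now = now if now is not None else time.time()
        history = self.reports[user_id]
        # Drop reports older than a week so the history stays bounded.
        while history and history[0] < now - WEEK:
            history.popleft()
        return {
            "last_24h": sum(1 for t in history if t >= now - DAY),
            "last_week": len(history),
        }

tracker = ReportTracker()
tracker.add_report("bobby_7")
print(tracker.counts("bobby_7"))  # {'last_24h': 1, 'last_week': 1}
```

The same kind of rollup, applied to moderator actions rather than user reports, is what makes the team's own workload visible, which is where the next question picks up.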

So beyond using a moderation tool to compile robust user-behavior data, should clients consider using it to shine a light on the moderators themselves?

Carlos: Yes, it’s a great way to show the value of the work the team is doing. It goes back to the idea of community moving from a cost center to a revenue-generating team. It’s important to prove that moderation teams have manageable workloads when they have efficient tools, and that they generate results. Then share the information with the executive team to show the value of the program. We can say, “Look at all the harm we’re preventing, and all the awesome work the team is doing to engage the community and prevent churn.”

Izzy: Moderation is the most essential, yet underrated, aspect of social-digital experiences. You couldn’t have an event in Times Square without police officers, Comic-Con couldn’t exist without event staff, and you wouldn’t send your kid to camp without camp counselors. Why should digital experiences be any different? You have the same potential issues, but with much more anonymity and less social tolerance.

Even if companies do manage to bring moderation teams on board, they have to have the right tools to manage, operate, and resolve the constant opportunities for issues. In our world, it’s not “if” it happens, it’s “when.” And for us, the moderators? Working with effective, efficient, scalable tools allows us to be not only an economically feasible, smart safeguard for our clients, but also incredibly effective for the health and growth of their communities.

