Episode: 4421
Title: HPR4421: Content Moderation
Source: https://hub.hackerpublicradio.org/ccdn.php?filename=/eps/hpr4421/hpr4421.mp3
Transcribed: 2025-10-26 00:33:32

---
This is Hacker Public Radio Episode 4,421 from Monday 14 July 2025. Today's show is entitled Content Moderation. It is hosted by Lee and is about 43 minutes long. It carries an explicit flag. The summary is: Lee talks to Ellsbeth about the role of content moderation on the internet.
Hi, I'm Lee. Today I'm joined by Ellsbeth for the second time. I'm going to talk with Ellsbeth today about the subject of content moderation in communities and on the internet. So, hi Ellsbeth.

Hi Lee. I'm glad to be here.
Would you like to say what experiences you've had, in terms of both being on the receiving end of moderation, or noticing it happening in communities and forums you've been part of, but then also having a role in those communities so that you're actually moderating things yourself?
I have been on the spectrum of content moderation since, I guess, it originally started with, like, the AOL chatrooms and AIM and all that stuff, some forums and bulletin boards and whatnot. So content moderation has kind of developed over time. At first it was just people that were a part of the community helping moderate it, but as social media has exponentially increased, and with MMOs and stuff like that, content moderation has shifted and adjusted and expanded, and got to the point where there needs to be a bit of oversight as to who's moderating, what they're moderating and how they're moderating. And then trying to work that out as a company, and just as internet people that are, you know, citizens of the internet, I guess, of the world that way, the digital world.
There has been this shift to where there has to be, you know, a sense of what actually is moderated. Where I started was just moderating in general, being one of those people that's like, hey, you know, someone needs to get rid of the trolls from the forum and delete their posts and whatnot. That shifted from there to eventually being a part of helping moderate, like when my kids were gaming when they were younger, I would help as part of the parental kind of keeping an eye on things.
I cannot go into the specifics of it, but I have actually worked professionally as a content moderator as things have developed and social media has grown. Big social media groups require you to sign an NDA to be a moderator. So I can't talk a whole lot about that, but I have been on that end and seen the worst of the worst as it's happening and coming in, and that helped me realize just how important content moderation is.
Okay. So what you were saying before is the context for moderation, and this is normally online forums. You also mentioned MMOs, and those are massively multiplayer online games, like role-playing games.
Yeah.

And the things that are happening in those games are contributed to by a lot of people who may be to some extent anonymous, and there are very few limits inherently on what they might contribute. It might be pleasant, it might be off topic, it might be very obscene, and there's a kind of duty on the people running these platforms to keep the people there safe and comfortable. It's in their interests to achieve that. They may have volunteers, or they may actually pay people, or their own employees, to step in and take a role of making sure that what's happening on their service is above board and is safe, and is what everyone who's coming with good will to use that facility expects to happen.
Right.

And so you're saying that you've had experiences of being part of that as a volunteer. You've also had some professional involvement in that type of role.

Right. It's a really, really big job that doesn't get enough credit.
What would you say are the main benefits to a community of some kind of having people whose specific role it is to keep that community running smoothly? And is it a matter of just saying what can't happen, or is it also about them putting forward and leading things in a more proactive way?
I think that any good social media content moderation position, and even when I say social media, I'm not just talking about the big companies that people use for social media, I'm talking also about MMOs and whatnot, because it is a form of social media. A good company will make certain that the people that are doing the moderation have a guideline. That keeps the worst of the worst from happening while holding a fair balance when, say, an adult has a conversation that maybe a child shouldn't have. If they are paying attention to the nuances, you know, what comes out of an adult's mouth isn't necessarily something that would be an issue to have, as long as it's moderated under PG-13 or, as the case might be, you know, R or X or whatnot, if you're rating it like a movie system. So there has to be clear regulation and a clear guideline, so that everybody that is working as a content moderator has a guideline and knows clearly what the company wants.
When the company doesn't know that, internal bias becomes the guideline. The company I worked for, and I think I can say this without any concern, was very good about knowing that some people are going to say things that other people aren't going to like, but that aren't necessarily about them. They were very good about supporting expression, individual expression, while keeping the dirt, you know, the ugly stuff, out of the picture. It becomes necessary to train people how to not have internal bias when they're watching interactions that they wouldn't necessarily support or accept in their real life, but that aren't necessarily a problem. So you're not becoming the thought police.
However, it also becomes just as important for somebody to say, well, I don't think it's a big deal that this video showed up. And, you know, I'm making something up here. I'm literally making something up; I'm not quoting anything that I've actually seen. But like, if a parent is smacking the crap out of their child and somebody flags that, is it actually abuse in the context that it's in? You know, social media companies that are worldwide need to factor in cultural context. If you are somebody that is content moderating in the US, you will often find yourself flagging negatively content that in a third world country shouldn't be flagged. Those things have to be taught.
And as a world community, content moderation becomes necessary to help people understand that you may be wanting something to be removed because you don't like it. You don't agree with it. You've reported it because where you're from, it's just completely unethical. And yet it may come from a country where that sort of punishment isn't viewed as abuse, so they don't think of it that way. I mean, that's not to say people don't get abused. It happens worldwide, it doesn't matter what the culture is. But if the general consensus is, we're going to address the people that are doing things to an extreme, well, sometimes a smack on the hand keeps a child from burning their fingers on the stovetop. If you're trying to teach a child not to touch the burner on a stove, there's a lot of countries where that sort of corporal punishment is just the cultural norm. But I know for a fact that in the US, that sort of thing would be condemned. So there's a lot of nuance to content moderation that people don't even think about.
But then you have the extremes. Think of all the horrors and stuff in this world that are going on, and people have cameras where they can live stream. There's a lot of wars going on, and people have cameras, and people have opinions, and people want people to see what's going on. So they live stream it, and that's not inherently necessarily a bad thing. But what restrictions keep a platform that allows children 13 and over, because that's typically the age, not all, from having children that are 13 and older watching horrific war crimes live? That can cause children that aren't even in that environment to have PTSD. I'm not a psychologist. I'm not a doctor. I'm not a therapist. I want to make that abundantly clear.
But we all know from all of the historical events that happened, from watching the Challenger explode to watching 9/11 and the Oklahoma City bombing (and this is just in the US, things that I have experienced and witnessed, and I'm talking about watching it on TV and media, not necessarily actually being there live), it has impact, and you have to know what to flag and what not to flag.
I could probably babble on about that for a bit, but the reality is content moderation is very, very, very nuanced. There's a lot of having to educate people not to hate other people, or have judgments or preconceived biases that they're automatically just going to act on. Like in the US, we struggle with racism and whatnot, and gender discrimination and judgment on gender identity and sexual identity and all those things. And I'm sure those things are going on around the world, but I am most familiar with the US right now, aside from my experiences overseas, which were so long ago that a lot of this cultural divide wasn't as evident, I think, is the word that I'm looking for. I look at my experiences with just being a moderator and I'm thinking, wow, this is a big deal. And our world didn't realize, like, the internet happened so fast, the world became connected so fast, that the internal biases did not have time to take a moment to sort themselves out.

Sort themselves out. Yeah, it all seems sudden; it's just there.
To what extent do you think online communities can be self-correcting, as real communities in real life can be, without the heavy-handed intervention of official moderation?

I think a basic group, like a Facebook group, you know, I think those types of things can self-moderate as long as they're not too large. If the community gets too large, then it's harder to self-moderate.
I think with groups of maybe a hundred or more, you're really pushing the boundaries of what can be self-moderated. And I'm just thinking about, like, a Second Life group. You get more than 100 people in a group, it can be chaos if they're talking regularly. You know, you have to have somebody that can keep an eye on what people are talking about. I do a mentorship; I'm one of the mentors for the new people coming to Second Life. And I've experienced moments where people have said things that are like, I need to get attention on this right now. And fortunately, that's rare, because Second Life is pretty heavily moderated in its own weird way. It's very self-moderated, to be honest. But at the same time, there are a lot of things that happen everywhere. It doesn't matter whether it's Second Life or Facebook or TikTok or, you know, Instagram or Snapchat. It doesn't matter what the social aspect is to it. There is going to be a need for moderation.
Do you think that if moderation is done with a light touch, but with strength when it's needed, then that is enough to keep honest people honest, so to speak? So as long as it's there, it's not needed; if it wasn't there, it would be needed.
Yeah, I think you're dead on with that. A light touch is always beneficial because, I mean, this is just about human decency and respect. People want to be their own unique individual. They want to have the authenticity to be able to say and think what they want. And there is a time and place for that. Unfortunately, I think the internet society, the digital world, is still learning what that is. And because the internet community grew so fast, you have a whole generational divide on what's appropriate to say and what's not. At this point, we have three generations of perspectives on what is acceptable and what's not to say and do in an online world. And it can go anywhere from just not knowing basic email literacy, and how to formulate an email professionally or to a friend or to a family member or whatever. Granted, I'd have to think about that, because I guess the entire world isn't as connected as we are in bigger cities, regardless of where we are. But for the most part, the world is connected in a way where there should be some common sense about what people are saying. But there's a generation of people that are forced to get connected, that don't want to get connected, that are still speaking as if they have never left the small town that they grew up in. So when you have an extremely narrow world view, it's not uncommon, in fact, it's really difficult, to learn how to communicate with somebody, just because this is how you've communicated your whole life, and this is you individually. You might come out and say something that is degrading and minimizing of another human being, and it's not something you ever thought of. Like, you know, I grew up being called a retard. You know, that's not socially acceptable now. And I had an interaction with somebody in the last few months where they didn't think twice about calling somebody a retard, because that's what they grew up with. They had never been called to task on how that might make somebody feel and how degrading it was. And when I had that conversation with them, they were just like, oh, wow, I didn't even think about that. That was just banter. They didn't mean anything by it. They weren't trying to degrade someone. They were just talking. And I'm kind of like, well, maybe you shouldn't be just talking to people about things that are degrading. And he's like, I didn't think before I spoke. And I'm like, you're right, you didn't. And that in itself was a very difficult conversation to have. But at the same time, it was necessary to have that conversation, because they needed to know that they were not going to make the people they were wanting to hang out with comfortable. And once it was addressed, it was like, oh, I'm not going to say that again. I'm going to be respectful. I'm going to be kind. And I'll listen if you tell me how something hurts. And that's essentially what content moderation is. People do it day to day. They just don't realize they're doing it.

Have you learned anything particularly technical about the methods and the means of moderating that you think would be of interest to people, that maybe is not immediately obvious?

That's a good question. Let me think about that for a second.
Because there's obviously tools, like, you know, there's software that allows you to keep track of what's going on and to react and to document. So it's basically like another level of customer service. But it's more like the customer isn't always right in this situation, if that makes sense. It'd be kind of like if somebody asked for a very well done burger, and they made a big to-do about it, the whole restaurant heard it, and then they threw a fit when they got a very well done burger, when really what they wanted was more like a medium well or medium. And from their perspective, medium was well done. But they're not right. They asked for what they asked for. The wait staff may have actually even described what well done was going to look like, and possibly taste like, and whatnot. But in this case, the customer is throwing a fit and demanding a refund, which is so common in this country, which is ridiculous.
So the customer in this case, we're talking about people who've come to a platform in order to talk to other people. Maybe people they know, maybe people they don't know. And they have certain expectations around what the other people will say, or what they can expect to receive from the other people. Or even their idea of who the other people are: whether those other people are just volunteers, or just other people themselves looking for information or with things to say, or whether these people who they expect to interact with are actually representing the company or the business that is hosting the platform or is the topic of the conversation.
Right. There's a saying that I grew up with: suck it up, buttercup. And I'm thinking of that in relation to the context of this, as far as the customer isn't always right in social media contexts. You know, somebody can get mad that a content moderator is removing, you know, insensitive posts, or even inflammatory or just outright illegal content. And they'll get upset that it's being removed, and claim that it's against freedom of speech and individuality and whatnot. And then at the same time, you have to factor in that just because one person is comfortable sharing all that information does not necessarily make it something that the rest of the internet world needs to experience.
What's the importance of having a code of conduct for use of a forum?

Human decency, people being treated with respect. Just because you may be okay with some sort of bantering and joking that's a little off-putting to some people doesn't mean other people will be okay with that. You know, I raised my kids with the phrase, know your audience. You know, I understood that kids, when they're young, want to be daring and say the cuss words that they hear and stuff like that. So when I was raising my kids, I told them, I said, you need to know your audience. If you want to experiment with using those words, do it around us, so you're not offending anyone. You don't know if that's going to offend somebody at somebody else's sort of activities. And when you're out in public, you've got to watch yourself and be polite.
What I was wondering was, specifically, how important is it that there should be guidelines written down? Do you find that people even read these guidelines, or do they only read them when it comes to things being said against them, or them being moderated, at which point they need to be pointed to these guidelines?

You know, that's a very good point. Let me ask you this much. Even with as well as I know you, how many times have you read every single TOS that you ever had to read for something that you wanted to participate in?

I don't think anyone ever has.

That's exactly my point. If you did not read the TOS, you don't know what the rules and guidelines are.
And there has to be some accountability for people that choose not to read the terms of service. Having those written down allows you to be able to go back and say: you signed this when you decided to participate in this platform. You chose not to read it. So you chose not to be knowledgeable and aware of the rules and guidelines that we're presenting for the safety of all users, for the comfort and the peace of all users. It becomes absolutely vital for them to be written down in that way, but they can also be used as a weapon against those who accidentally break a term of service, where it's not intentional, or they're just flat out not trying to be malicious. So most places that I know of give a warning, unless it's like a specific rated infraction. If it's an infraction that's worse, if it's illegal, to be honest, if it's something that would cause someone harm or damage someone's psyche, then you have to be able to do something about it. There are more minor cases where people can be left with a warning instead. And having it written down assures that there is something to fall back on when you do have to do more than just give a warning. But also, giving a warning allows them to refer back to what they didn't read in the first place. It's like: you signed these terms of service, you agreed to abide by this, you did not, and this is how it was broken. It's just a gentle, you know, hey, I'm sorry you chose to do this, but here's the consequence of you choosing to do it.

If we take a step back and look at the nature of these communities, and obviously you've got a range of different experiences of different communities, with moderation being used in all of them to a greater or lesser extent, do you think that some communities are more suited to having the community decide what the rules are, and some communities are more suited to having rules imposed by the person or the company in authority?

I think it depends on the community. That's a very good question, though. If the community is fairly self-regulating already, it can be quite beneficial.
I think you kind of directly answered the question. I think what you're getting at is, if the community is self-regulating, then it's fairly good that the people within it are contributing a lot into saying what the nature of the rules is that everyone is agreeing to live with in this forum. I was kind of getting at, like, if it's say a computer game, the people who run the computer game are financially liable if there's any harm caused, unless they're able to delegate their responsibility.

You're absolutely right, but that's where the terms of service comes into play. Every game you will ever play, you will be signing a terms of service. Every app that you ever log into or download, you will be signing a terms of service. Because, especially with the nature of technology, the companies become liable if there's harm from their product. And when that happens, then they become more eager, unfortunately, to restrict what you can and cannot say and do on the games. And I think that's what most people don't realize, if you want to keep communities like Final Fantasy and World of Warcraft and Call of Duty and Roblox. And that's a big one there, Roblox, because so much of that is kid-centered; there has to be a huge amount of content moderation. They're getting smacked for not doing enough of it. That one is not a self-regulating community. When you have children involved, it's not self-regulating. It's not possible for a child to consent to agree to a TOS, even if the parents download it for them.
How do we deal with a move to a more federated internet, in which big services are not actually run by a company; they're run by everyone who is part of that service? There is no single authority in charge. How can moderation fit into that kind of socio-technical environment?

You have to have people willing not to get paid for that service and still provide equality. You have to have people that are strong, empathetic leaders in the community, who see and value each individual for their worth, and then help guide them to make choices that are better for the community without making them feel less than, or that their opinion isn't valid. Because right now, the biggest thing we have with that sort of thing is people wanting to boost their ego.

Is that necessarily a bad thing?
That is not a bad thing at all, and that's kind of why I paused there, because there's nothing wrong with the boosting of the ego. It's when you allow your ego to make you more important than anyone else that it becomes the issue. And that's probably the biggest struggle that I see with any form of community that needs to be moderated: what is the ego level of the people that are choosing to lead, or getting paid to lead, either way? There's a community in Second Life, and I'm going to use that example; y'all are going to hear that Second Life is kind of my hyperfixation. It's my hobby. It's my love. But there's a community in Second Life where I love the way that the community is run. There is someone who owns the regions the community is on, but they do not make the people that are a part of the community feel like they're less than. They involve them in the discussions of growth in the community. They involve people with the, you know, how do we handle these situations? You know, if you have somebody that's disrupting the community, how do we want to handle that? How do we want that person to feel? What's going on in that person's life that is making them be disruptive? And instead of just rejecting them, find out what's going on. And I think there needs to be more of that in communities before we can successfully take the commercialization out of social media.

In your experience, how is content moderation treated differently in different parts of the world? Is there a radically different approach in different countries?

In my experience with the different social media, as a user, because I am a user of TikTok, I am a user of Snapchat, I am a user of Facebook.
And I need to differentiate this from my work as a content moderator. But as a user, I have observed that there are certain social media that are more regulated in a healthier way, where the users receive less impact from what could be traumatic. I have seen some really raw and ugly stuff on Facebook as a user. And I've also seen varying degrees in how fast something comes down that shouldn't stay up, where kids might be able to see it, or people that are not able to consent to viewing stuff that would potentially impact them negatively. Snapchat is different in the sense that, other than stories, nothing really stays very long. I think that's a good way. I don't even know where Snapchat is based out of; I'd have to look that up. So that might be a different global perspective, because Facebook is based out of the US. Then Instagram is connected to Facebook. I've seen some stuff on there, but it was actually more regulated before Facebook took it over, I think. At least from my observation; I don't use it as much anymore, because it's just become boring to me. And then, of all of the apps that I have used as a user, I think I have seen the least content of things that would be disturbing probably on TikTok. Not to say that it's not there, but I think when people try to start making things go viral on TikTok, there's a little less of the negative stuff going viral, or it doesn't stay up as long as on maybe other social media. And again, I'm just scratching the surface, because there's all sorts of social media. I've definitely seen firsthand how Second Life handles moderation stuff, and they try and get on it pretty fast. I think they could improve. As much as I love Second Life, I think they could improve, because I have seen some stuff where I cannot believe that's still sitting in the chat that long. Especially in Second Life, there's a lot of individual autonomy that is tolerated and accepted, but there are some things that just shouldn't be in certain areas, and I've seen there be a little bit of delay in taking accountability for getting those things away from what users can see. My kids were raised on Roblox and League of Legends and all that early-on MMO stuff for kids, and that wasn't regulated anywhere near where it should be. That's why you have lawsuits against Roblox and stuff like that.
Do you see content moderation as mainly about censorship, or does it have more of a productive role, more of a creative role to it? Not just removing negatives, but a kind of positive side to it, a sunny side?

I think the sunny side is there's a lot of good that social media does in connecting the world and introducing people to different world views. When you can literally watch someone cooking in a kitchen completely on the other side of the world, and learn a new recipe, and listen to different accents and become more familiar, then you become more tolerant of the things that are unfamiliar. I truly believe that social media done right, and just the internet and the digital world, can be done in a way that brings great tolerance and connection and peace in the world. So moderation does impact that in a way, because it's kind of like a parent protecting their child from getting hurt. If a child is running out into the street and a car's coming, the parent's going to run up and snatch that child up and get them out of the street so they don't get hurt, right? That's what content moderation is.

So you're mainly saying it's safety, but it's a safety that facilitates the good stuff?

Yeah. You're not going to stop the child from riding their bike or running around and playing, and if you do, then you repress that child's ability to think and be creative, or even develop their logic skills and whatnot. If you're always protecting a child, they're never going to grow. But if you protect them when it's necessary, then they learn the necessary boundaries to thrive. And I view content moderation that way.
What skills, life knowledge and mindset might help someone be a successful content moderator?

Don't think your opinion or your worldview or your culture is right and the only way to live. You have to be willing to suspend your individual opinions and allow someone else to have a different worldview and a different opinion, and the right to express those things in a safe and healthy manner. And you have to be able to be tolerant and view the world beyond the box that you grew up in and have experienced, and realize that this world is huge, and not everybody gets to be a world traveler, and not everybody gets to experience different cultures. Some people only experience it from their little box. And if you can suspend your little box and see beyond that, and see when somebody's not intending to mean harm, or they're trying to offer a different perspective or whatnot, then you're able to thrive as a content moderator, because you're helping people express themselves, and you're helping other people see that expression so that they can see beyond their box too. I think that's probably the biggest challenge people have in content moderation: not being able to see the world from beyond their own perspectives.
|
||
|
|
So if you were going to advise someone who was very much interested in doing this kind of work, even voluntarily or as a career, what direction might you point them in? What sort of things should they be doing to get good at it, and maybe what's the way to get a foothold into this type of work?
First off, you need to know that it has an expiration date. It does not matter how tough your skin is. It doesn't matter how open-minded you are. You will see things as a content moderator that you will likely need mental health therapy for. It's not going to happen all the time. In fact, most of the time it's not going to be that; it's going to be cute videos of cats and dogs and people expressing themselves and whatnot, or just people bantering back and forth. But you will see things that will traumatize you. So go into it knowing that, because if you do not know that, you can wind up in a bad place. Also, if the company that you work for happens to have something like free therapy, there's a reason they have it: use it. Don't be ashamed to get some mental health care, because it's a job where you will need to know how to regulate your emotions. And I'm putting so much emphasis on that because I really mean it. Don't go into it if you don't have a therapist or aren't willing to get one. Beyond that, you need to have a really great appreciation for bringing communities together. If you're the person that is the life of the party, but you want a job that is behind the scenes doing good, this is a great job for you. But know your limits, and be willing to realize that your limits are probably going to be pushed
in a position like this.
Are there any practical skills someone could work on, if they've decided they have the right mindset and have what's needed? Are there practicalities like qualifications that you can work towards, or internships, that sort of thing?
It depends on whether you're wanting to do it as a full-time career or not. If you are, you've got to know where to look, and even then it's hard to find. Usually you know someone that knows someone that works there, at whatever company it is that is doing it, and you get that connection that way. Check LinkedIn and Indeed, but be careful of scams on those two, because in the last five years or so they've gone from a really great place to get a job to a bunch of scam artists duplicating jobs and trying to get people to go to their websites and whatnot. So just be careful on that. But there are legitimate positions posted there. Don't just look for a position called content moderator; they're almost never called content moderator, so be creative with your searches. If you want to be a content moderator for a company that you already know of, go look and see if they have any positions. Look at call centers; see if call centers have any contracts. You're not going to know who you're working with, not at first. Sometimes they will call it a content moderator position, but they usually disguise it under something like customer service that is not phone-facing. So you're not taking inbound or outbound calls; you're completely virtual, without any voice. I stumbled on the position I had by accident. Also be careful that they don't underpay you
because there are a lot of places that will try to pay you less than what you're worth.
So is there anything to suggest to you that AI already plays, or will play, a much larger role in the sort of moderation jobs that exist at the moment, and that these will change going forward?
Yeah. Two years ago I saw way more content moderation jobs than I do now. If you pay any attention to the evolution of AI, the empathy in AI is increasing. I think AI in many ways is probably better than humans for content moderation, as long as it's trained ethically, and I really want to put the emphasis on trained ethically. AI could be just as bad as humans, the worst humans, if it's trained by the worst humans. So I don't think it's going to remain a legitimate position long term, other than maybe AI training. I don't necessarily agree with having no human oversight, though, because AI is not human and does not have the feelings and emotions that a human does, even if it can replicate communication that's empathetic. So I think it's going towards AI, but I don't necessarily think that, without human oversight that is done ethically, that is the best
option.
All right, Ellsbeth, thanks a lot for talking to the HPR audience about this. Is there anything else that maybe hasn't been mentioned so far that you can think of?
Yeah. If you have ever had a post removed on social media, be grateful. While human error is a factor, there is a dispute system: if you think something has been removed inaccurately, or without good reason or good cause, use the dispute button. But definitely be respectful to the folks that are moderating stuff, because they're going through a lot. They see the worst things that you will never see.
All right, with that, goodbye to everyone. Goodbye to Ellsbeth, and I hope you've enjoyed listening to this.
You have been listening to Hacker Public Radio at HackerPublicRadio.org.
Today's show was contributed by a HPR listener like yourself. If you ever thought of recording a podcast, then click on our contribute link to find out how easy it really is. Hosting for HPR has been kindly provided by an honesthost.com, the Internet Archive and rsync.net.
Unless otherwise stated, today's show is released under a Creative Commons Attribution 4.0 International license.