For Facebook, moderating user speech is a difficult task, especially when users report violations at least one million times per day.
Monika Bickert, Facebook's head of policy management, explained Facebook's policy for removing content at SXSW on Saturday, according to CNN Money. During the panel conversation, Bickert said the influx of complaints comes from all over the world, and that drawing the line between allowing free speech and blocking hate speech can be tricky.
"You can criticize institutions, religions, and you can engage in robust political conversation," Bickert said at the panel, according to CNN Money. "But what you can't do is cross the line into attacking a person or a group of people based on a particular characteristic."
Given the sheer number of complaints the company receives, enforcing that policy can also be challenging. Bickert said automation might eventually play a larger role in identifying abusive content, but for now the work requires a human touch.
"When it comes to hate speech, it's so contextual...We think it's really important for people to be making that decision."