Social Networking: A Place for Hate?

http://imagine2050.newcomm.org/2009/05/19/social-networking-a-place-for-...

by Nora Flanagan

In 2000, HBO produced a documentary chronicling the capitalization of the internet’s exponential growth by hate groups in America. Hate.com, narrated by Southern Poverty Law Center founder Morris Dees, exposed hate groups’ online efforts to the wide audience provided by the piece’s frequent broadcast on HBO. Millions of youth, parents, educators and activists gained a better understanding of the power of the web as a recruiting tool for organized bigots.

Since Hate.com’s 2000 release, internet use has continued to evolve, and hate groups have not been far behind. The explosion of the social networking capacity of the web, often referred to as ‘Web 2.0,’ has been accompanied by organized attempts to expand and recruit for almost every documented hate group in America. In other words, white supremacists are on MySpace, and more than likely, they’d like to be your friend.

MySpace, to its credit, has a strong, unambiguously worded policy on the presence of hate groups and hate speech on the site. Under its ‘Terms & Conditions,’ the site states:

Prohibited Content includes, but is not limited to, Content that, in the sole discretion of MySpace: (8.1) is patently offensive and promotes racism, bigotry, hatred or physical harm of any kind against any group or individual.

It is also worth noting that this clause is listed as the site’s first type of prohibited content – followed by other banned subject matter, including nudity, solicitation of minors, copyrighted material and libel, to name a few. Clearly, MySpace places a priority on keeping its site free of racist hatred, which is distinctly within its rights as a private enterprise. The enforcement of MySpace’s laudable policy, however, is another matter.

After establishing direct contact with MySpace as the Turn It Down Campaign launched our page, we learned that the site does, in fact, make concerted efforts to delete pages that violate this aspect of its Terms & Conditions. That enforcement, however, depends on users reporting violations. In our time on the site, it has been spotty: from time to time a rash of pages disappears; for other stretches, pages seem to go untouched, no matter how blatantly offensive and in violation they are.

For those interested in trying their hand at reporting violations on MySpace, you’ll see a ‘Report Abuse’ button at the bottom of every profile page, and a ‘Report Image’ button under any photo on any page. Recently, we’ve seen MySpace pages for white power bands, labels and individuals disappearing by the dozens. Perhaps MySpace has found an efficient way to address the issue, and we hope this continues.

Elsewhere online, guidelines vary between far less enforced and far more ambiguous. On YouTube, the Community Guidelines section offers users the following assurance:

We encourage free speech and defend everyone’s right to express unpopular points of view. But we don’t permit hate speech (speech which attacks or demeans a group based on race or ethnic origin, religion, disability, gender, age, veteran status, and sexual orientation/gender identity).

However, a search of the term ‘white power 88’ yielded no fewer than 1,260 videos. (The number 88 is a common code among white supremacists, referring to the phrase ‘Heil Hitler,’ as H is the 8th letter of the alphabet.) Other searches of codified and not-so-codified hate speech yielded similar results, and queries of specific hate groups produced hundreds of results each. The ‘Flag’ button under each video does let users report a video; however, users are immediately warned that ‘abusing this feature is also a violation of Community Guidelines,’ so it’s possible that users are reluctant to flag videos for fear that their own accounts may be jeopardized. Or it could be that YouTube doesn’t heed users’ complaints.

Needless to say, enforcement of YouTube’s Community Guidelines is questionable, at best. And Turn It Down isn’t the only entity to notice this. Stormfront, arguably the largest white power online forum, sees racist, anti-Semitic, homophobic and xenophobic YouTube videos reposted there by the hundreds. A single thread, titled ‘YouTube,’ has 1,170 posts – most of which contain reposted YouTube videos of white power bands, hate group leaders’ speeches, and various white nationalist call-to-action videos. Other threads on Stormfront encourage members to post videos to YouTube as a way to spread white nationalist ideals.

On the other hand, Facebook’s ‘Statement of Rights and Responsibilities’ is both clear and, it appears, stringently enforced:

[Facebook users] will not post content that is hateful, threatening, pornographic, or that contains nudity or graphic or gratuitous violence.

A search of Facebook groups for the term ‘white power’ drummed up dozens of hits for everything from fans of the White Power Ranger character to white cat enthusiasts, but only one group promoting racism. (We promptly hit the ‘Report Group’ button, and we’ll see how fast the group is taken down.)

So where does this leave the average user of any of these or other popular social networking sites? Your options are many, and your efforts are entirely up to you. Each site offers an avenue for users to report the presence of blatantly bigoted material, and the only way to know a site’s level of actual commitment to its policies is to test them. So report what you see. Be an active member of the communities in which you keep in touch with friends, check out new bands, or simply kill time every now and then. Communities are only as strong as their membership. You can also go a step further by emailing the site administration for any of these sites and either commending their policy enforcement or requesting that they do more to uphold their own rules, whichever the case may be.

Online networking continues to move forward, and those who would exploit its capacity are never far behind. Hate groups are using social networking sites to promote their agendas and recruit new members. We need to use our power as users of these same sites to make it known that policies against hate speech are a good start, but consistent enforcement is key.