November 2013: What I've seen this past year in ArbCom cases, at the Administrators' noticeboard, and in various policy discussions is leading me to conclude that consensus may have changed on the general subject of inflammatory speech. In modern democracies, there are more or less standard corporate and educational practices that aim to reduce friction in the workplace and the classroom over perceived insults involving ethnicity, nationality, sexual orientation, religion, gender, and so on. Wikipedians generally put a high value on free speech and a low value on "training" of any kind, so we've approached these issues in a more black-and-white way: cross this line and we block you, but if it's not a blockable offense, then anything goes. The inevitable result has been a lot of demeaning comments made without repercussions, comments that have targeted various groups as morally, criminally, mentally, or emotionally inferior. I'm not taking a position on what people should or shouldn't say and what the consequences should be, but I hope it's noncontroversial to point out that this is a real problem on Wikipedia. Slurs targeted at entire groups can do great harm, both to individuals and to Wikipedia, in a way that makes it difficult for the targets to fight back, that runs any productive discussion off the rails, and that risks driving away valued contributors.
However, I also believe that any relevant RfC is going to fail unless we take the rhetoric down a few notches. People who engage in what gets labeled as hate speech aren't always being jerks or bullies ... sometimes, they're unaware of the import and the impact of what they're saying. And when people are accused of hate speech for repeating things they've heard others say, it's easy for them to hear the accusation as a form of hate speech directed at their own communities, which just adds fuel to the fire. Any solution that Wikipedians come up with is going to have to be long on humility, short on restrictions on speech, and free of righteous indignation.