Mark Dombeck, Ph.D. was Director of Mental Help Net from 1999 to 2011. Dr. Dombeck received his Ph.D. in Clinical Psychology in 1995.
Griefers have been in my news lately. Wired magazine recently published an article on the subject, and several bloggers I read have commented on it. A griefer is essentially a person (or team of people) who plays online games not to experience normal game play (e.g., exploring game worlds and fulfilling game quests), but rather to purposefully disrupt the normal game play of others. One popular way griefers do this is to ambush and murder other, less powerful players’ characters and steal their in-game artifacts and objects of power and wealth, causing them grief. The term was news to me when I first encountered it in the Wired article, but that is because I’m not a gamer. Once I understood the basic definition of griefing, however, it became clear to me that griefers are akin to the common community troll: the sort of disruptive personality that has plagued online communities since the web got started, and before.
Online communities such as the one Mental Help Net used to host (and, I’m happy to announce, will shortly host again) exist for a purpose. In Mental Help Net’s case, that purpose is mutual support for people who are struggling with mental health and life event issues. People who come into the community to further its purpose (e.g., to give and receive support) are called Members. People who come into the community and work against that purpose are called Trolls.
I’ve always thought that the name came from the troll character in the classic folk tale "The Three Billy Goats Gruff," who stood between the goats (the regular community members) and their goal of the grassy pasture on the other side of the troll’s bridge. However, the term may alternatively come from the fishing practice of trolling: placing bait into the water (the community) and waiting to see which fish (which members) will bite. In any case, the term troll applies best to people who enter the community with the premeditated intention of committing mayhem. It is sometimes also applied to people who are legitimately seeking support but who don’t have the best social skills, or who decompensate and get into angry yelling matches with other community members. These latter people aren’t really trolls, but they do end up needing to be helped to reconstitute themselves, for the safety of the community as a whole.
As the Wired article sagely points out, game griefers are present to play a game, but it is not the same game everyone else in the community is playing. The same can be said of the community troll, who may be playing "I’m right, you’re wrong" while everyone else is playing "let’s all get along." This disparity in motivations makes griefers and trolls fascinating to many non-trolls. The sheer audacity of their behavior grabs attention, for one thing. Then there are secondary motivations: some people identify with the trolls, while others identify with the law enforcement apparatus that punishes them. I’m sure there are other motives too, but one thing is constant: anyone who pays attention to the troll loses.
As a once and future community moderator, I too am fascinated by griefer motivations, but for more practical reasons: I’d like to understand how best to limit the damage they cause. Doing so comes down in part to understanding the conditions under which their tendency to act in antisocial ways is encouraged or discouraged. This topic has been discussed before, but I thought I’d add my two cents.
In my experience, manipulating perpetrator anonymity is an important factor in controlling griefers’ and trolls’ antisocial behavior. The more easily community members can be identified and held accountable for their actions, the fewer instances of bad behavior you tend to see.
Allied with the idea of altering perpetrator anonymity is the idea of altering the expectation of punishment. Accountability enables easier punishment. There are several ways punishment can take place, however. It can be very informal, as when community members heap scorn on those who violate the social contract or simply ignore them (using filters within the community to literally make their presence invisible). This sort of informal punishment is what makes accountability effective all by itself. Accountability also enables more formal varieties of punishment, such as entry bans. In my experience, bans are the most useful way to discourage the really hardcore antisocial behavior that happens in communities. Punishment can never hope to eradicate all griefer/troll behavior, however, because the really hardcore griefers thrive on punishment, seeing attempts by management to eject them as high praise for their work.
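The informal "ignore" punishment described above amounts to a per-member message filter. Here is a minimal sketch of the idea; the class and method names are hypothetical illustrations, not the API of any particular forum or game software.

```python
class IgnoreFilter:
    """Per-member ignore list: hides posts written by ignored members."""

    def __init__(self):
        self.ignored = set()

    def ignore(self, member_id):
        # Add a member to this reader's personal ignore list.
        self.ignored.add(member_id)

    def unignore(self, member_id):
        self.ignored.discard(member_id)

    def visible_posts(self, posts):
        # Each post is an (author_id, text) pair; drop ignored authors,
        # making the troll effectively invisible to this reader.
        return [p for p in posts if p[0] not in self.ignored]


# Example: one member ignores a troll; the troll's posts vanish from view.
f = IgnoreFilter()
f.ignore("troll42")
posts = [("alice", "hi"), ("troll42", "flame bait"), ("bob", "hello")]
print(f.visible_posts(posts))  # → [('alice', 'hi'), ('bob', 'hello')]
```

The key design point is that the filter is applied per reader at display time: the troll is not banned, just deprived of an audience.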
Here are a few other elements of the community or game that can be manipulated and which might have an impact on reducing griefing/trolling behavior.
Setting up initiation barriers would probably affect griefing behavior. The easier it is to get into a community, the more likely that community is to become a target for griefers. In part this has to do with helping people identify with and value the community rather than take it for granted: when you have to do a lot of work to get into a community, you are more likely to care for it and not want to harm it. The problem is that the same barriers that might keep out griefers also keep out legitimate members. It is difficult to set a barrier high enough to keep out one group without also keeping out the other.
I’d expect that the more opportunity there is to act out griefer behaviors alongside a group of other griefers, the more often the behavior would happen. People tend to take less responsibility for their individual actions when acting as part of a group or mob, a social psychological principle that goes by several names, including deindividuation and diffusion of responsibility. The solution here would be to limit people’s ability to socialize, but as that utterly defeats the purpose of the community, it isn’t really much of a solution.
I would expect that manipulating the frame of the community or game can increase or decrease the chance that griefer behavior will occur. The frame of a game or community has to do with its identity: how members think of what they are doing when engaged in it. If an interaction is thought of as a game, and therefore not something real or important, it is easier to self-justify committing mayhem. If an interaction is thought of as something more serious, such as part of a support group, the urge to commit mayhem is perhaps weaker (for some, at least). The Wired article touches on this issue somewhat indirectly, noting that Second Life members don’t think of what they do in Second Life as part of a game but rather view it as a more serious community. This "non-game" frame makes such participants more likely to view griefing within Second Life in non-game ways, such as considering it actual theft or terrorism.
I would expect that the inherent victim anonymity provided by an online environment (no matter how 3D it becomes) also contributes to displays of griefer antisocial behavior. I’m guessing it is probably easier to take advantage of the cartoon-like avatar of someone you don’t know than it is to take advantage of someone you know and empathize with.
Live communities and games use techniques and concepts like these (at least those that can be manipulated) all the time in their ongoing attempts to set up workable boundaries that will prevent the majority of griefing and trolling behavior. Such boundaries always need to be permeable, but not too permeable: they need to let legitimate community members go about their business while screening out and ejecting those who behave badly. Finding the right balance of steps to accomplish this goal is an ongoing and unending process.
To give an example, the MentalEarth.com support community (which evolved out of the old Mental Help Net support community) relies in part upon the principles of accountability, membership initiation, and opaqueness in defending its turf. No one can view the community without an active membership, and active memberships cannot be obtained without linking to a single permanent email account (i.e., not a freebie Hotmail or Gmail account). Most would-be mayhem committers are probably not all that interested in waiting for their accounts to be approved before they can post.
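A registration gate like the one just described can be sketched in a few lines. To be clear, this is an illustrative assumption on my part, not MentalEarth.com’s actual implementation: the list of freebie email domains and the approval queue are stand-ins for whatever the real community uses.

```python
# Hypothetical sketch of a two-step registration gate:
# 1) reject throwaway email domains, 2) queue the rest for moderator approval.
FREE_EMAIL_DOMAINS = {"hotmail.com", "gmail.com", "yahoo.com"}  # assumed list

approval_queue = []  # applications awaiting a human moderator's decision

def apply_for_membership(email):
    # Extract the domain part of the address.
    domain = email.rsplit("@", 1)[-1].lower()
    if domain in FREE_EMAIL_DOMAINS:
        return "rejected: please link a permanent email account"
    approval_queue.append(email)
    return "pending: awaiting moderator approval"

print(apply_for_membership("troll@hotmail.com"))  # rejected outright
print(apply_for_membership("jane@example.org"))   # queued for approval
```

The waiting period is doing much of the work here: a drive-by mayhem committer is unlikely to sit in an approval queue, while a legitimate member seeking support usually will.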
Maintaining such accountability is hard work. It doesn’t scale well to very large communities such as online games, and it doesn’t fit game environments, because it is more exclusionary than such environments want to be. An online game, being a for-profit business, is motivated to take on as many paying members as it can, and this goal is incompatible with screening out members who wish to remain more anonymous. The more legitimate members complain and leave as a result of bad behavior, however, the more such large game communities will have to take action or risk losing their base.
Anyway, many readers will have spent time within online games or communities and will have their own opinions about how antisocial behavior is best managed. If you’ve got an opinion about what works and what doesn’t work, I’d like it very much if you’d share it below.