As CMC systems and virtual communities develop over time and attract more and more users, they inevitably experience the problems observed in real-world communities, including overcrowding and antisocial behavior. It is an unfortunate fact of life that human beings have deviant tendencies, and wherever CMC systems are found, so too will be antisocial and deviant behavior (Bruckman et al., 1994). However, CMC systems have the capability to manage such behavior and maintain social order by developing social sanctions and using system constraints.

8.1 Aspects of Deviant Behavior in CMC

The deviant behavior observed in virtual communities is a by-product of more than just human nature. Particular aspects of CMC systems actually encourage users to behave irresponsibly, rudely, or obnoxiously. Deviant behavior is an unfortunate side effect of two major aspects observed in all CMC systems: anonymity and disinhibition.

Because CMC is experienced in virtual space, there are no social context cues or social structures to guide users toward appropriate behavior. There is a decided lack of 'realness' to interactions in virtual communities. Users feel removed from the activity and have a sense of immunity. This leads users to feel disinhibited and potentially more socially rambunctious. It may also contribute to what Pavel Curtis calls a 'shipboard syndrome': the feeling that since users of a virtual community will likely never meet in real life, "there is less social risk involved and inhibitions can safely be lowered" (Curtis, 1992).

One significant factor that may lead to deviant behavior is the natural anonymity that CMC systems provide. Because CMC and virtual communities are experienced strictly through computers, users can appear to be whomever or whatever they please. The possibility is extremely low that a user of a virtual community can positively know the real-life identity of another user. Users are known only by what they explicitly make known. CMC provides users with the ability to explore many different personas and behaviors, all behind a veil of anonymity. Such anonymity gives users a sense of safety and freedom, allowing them to express ideas they might otherwise have suppressed in a real-life encounter with stricter social sanctions. "Protected by the anonymity of the computer medium, and with few social context cues to indicate 'proper' ways to behave, users are able to express and experiment with aspects of their personality that social inhibition would generally encourage them to suppress" (Reid, 1991).

Deviant behavior is not new to virtual communities, nor are the solutions for dealing with it. Managing deviant behavior has been the subject of many discussions, and though the discussions have been fruitful, there is no one right approach to manage deviant behavior. Technological solutions such as 'kill' files, gags, and filters may be employed. Social solutions such as community sanctions, peer pressure, and user chats with system administrators may also be used. In this chapter we explore some examples of deviant behavior and how various CMC systems, including bianca, have attempted to manage it.

8.2 Impersonation

One of the consequences of a completely anonymous system is the problem of impostors. In a virtual community, an impostor is a deviant user who adopts another user's handle in an effort to impersonate that user. Impostors in a virtual community are not nearly as potentially devastating as impostors in electronic commerce transactions. However, because of the anonymity of the medium, a user's handle is their sole means of recognition among the other participants in a virtual community, making impersonation an important and sensitive issue that all virtual communities must deal with.

We can distinguish two classes of impersonation: a synchronous classification in which an impostor is one who attempts to impersonate another user who is currently on the system, and an asynchronous classification in which an impostor is one who attempts to impersonate another user who is not currently on the system.

IRC and MUDs have resolved the synchronous impostor problem through the use of system constraints. Both IRC and MUD systems maintain a list of handles currently active on the system and require that all users have unique handles. Before users can access either IRC or a MUD, they must first log in and verify their handle. The system thus ensures that all users have unique handles and cannot impersonate each other while actively on the system.

Asynchronous impersonation, or impersonating a user who is not currently on the system, is a more difficult and system-intensive problem to solve, usually requiring a large database of every user of the system which must be searched every time a user changes handles. MUDs, however, are able to use the same system to prevent asynchronous impersonation as they do synchronous impersonation, due to MUDs' relatively small size. A MUD rarely has more than 40 simultaneous users at any one time, and rarely more than 3,000 total users. IRC, on the other hand, must support tens of thousands of users, and maintaining a global database of all active handles is a daunting task. However daunting the task may be, there was an attempt in IRC to maintain just such a database. Called 'Nickserv', this global database of IRC nicknames was active from July 1990 to August 1994, when it was shut down due to, among other things, significant Net overload (Bechar, 1995).

Though asynchronous impersonation still occasionally occurs, IRC has controlled the problem through a system of social structures and power hierarchies. IRC supports means of enforcing acceptable behavior through the /kick and /kill commands, available only to Channel Operators (Reid, 1991). Channel Operators are privileged IRC users with the power to remove obnoxious users, or users who attempt to impersonate others, from the IRC channel.

Besides this hierarchy of power, IRC has also developed a social structure in which it is considered socially unacceptable to asynchronously impersonate other users. It may seem difficult to spot an asynchronous impostor, especially given the detached, anonymous nature of the medium, but with surprising frequency asynchronous impostors are "caught" even while in the act of impersonating another user in real time. This is attributed to a strong community of regular users who are attuned to other users' characteristic behavior, patterns, and historical and personal information. Users who do impersonate others are made to feel a sense of guilt for their actions and are ostracized and banished from the community (Reid, 1991). The following announcement was made by an IRC user on the newsgroup alt.irc:

I admit to having used the nickname "allison" on several occasions, the name of an acquaintance and "virtual" friend at another university. Under this nick, I talked on channels #hottub and #gblf, as well as with few individuals privately. This was a deceptive, immature thing to do, and I am both embarrassed and ashamed of myself.

Though, like IRC and MUDs, bianca also suffers from the problem of impostors, bianca has the added difficulties of having no authentication or login scheme and being a stateless, multi-connection system. When users access IRC or a MUD, they make a single connection to the system and remain connected until they explicitly disconnect. On bianca, and all WWW-based virtual communities, a user opens a new connection to the system for every request they make. Not only is this stateless, multi-connection approach more CPU-intensive in and of itself, it also makes handle verification a potentially CPU-intensive job. IRC and MUDs have the luxury of being single-connection systems and therefore only have to verify a user's handle on initial access and when the user attempts to change handles.

Another problem unique to bianca is the lack of an authentication or login process to gain access to the system. A user can begin conversing with other users in a room immediately upon downloading the page. Most other WWW chat systems subject users to a login procedure in which the user must enter the handle they wish to use before they are allowed to chat. This login procedure serves as an authentication process in which the user's desired handle is verified for uniqueness against the other handles on the system. On bianca, however, there is no login process to initially verify a user's handle, so bianca's system has to verify a user's handle with every access to the system.

Keeping in mind the HTTP system constraints and bianca's own design decisions, impersonation is controlled in bianca by maintaining a hash table of all handles used within a two-minute time frame in each room, and notifying users if they attempt to use a handle already in use by another user. Because bianca does not use a login process or have any permanent association between users and their handles, a temporary association is needed while the users are actively chatting. To accomplish this in the stateless medium of HTTP, a technology called HTTP cookies is used to keep track of users between accesses to the system. Every user that accesses bianca is issued an HTTP cookie that serves to uniquely identify them with every HTTP request they make. A hash table is then used to map each user's cookie to their handle. In essence, a temporary association is made between a user and their handle. With every access to the system, the hash table is used to identify the rightful owner of a handle based on the HTTP cookie. If a user attempts to use a handle already in use by another user, the system notifies them that the handle is already in use. However, if the rightful owner of a handle does not access the system before the temporary association with their handle expires, the handle is removed from the hash, giving another user the opportunity to use that handle in a possible impersonation attempt.
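The handle-tracking scheme described above can be sketched in a few lines. The actual bianca implementation is not shown in this chapter; the function and variable names below are illustrative only, and the two-minute window is the one figure the text does give.

```python
import time

HANDLE_TTL = 120  # seconds: the two-minute association window described above

# handle -> (cookie, last_seen_timestamp); a hypothetical in-memory hash table
handles = {}

def claim_handle(cookie, handle, now=None):
    """Return True if `cookie` may post under `handle`, refreshing the
    temporary association; False if another user currently owns the handle."""
    now = time.time() if now is None else now
    # Expire stale associations so abandoned handles become available again
    for h in [h for h, (_, seen) in handles.items() if now - seen > HANDLE_TTL]:
        del handles[h]
    owner = handles.get(handle)
    if owner is None or owner[0] == cookie:
        handles[handle] = (cookie, now)   # create or refresh the claim
        return True
    return False                          # handle in use by someone else
```

Note that, exactly as the text warns, once the rightful owner goes quiet for longer than the window, the expiry loop frees the handle and another cookie can claim it.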

8.3 Harassment

Another common problem virtual communities must deal with is that of user harassment. As described earlier, the nature of CMC systems enables harassment more easily than in typical real-life encounters. Users are often rude and shout at other users. Sexual and racial slurs are commonly uttered, as are libelous statements. In addition, users may repeatedly post the same thing over and over in an attempt to spam or bring down the system.

Each CMC system has its own approach of managing harassment. There is no standard. Some systems use filters, ignoring mechanisms, or 'kill' files. Others use hierarchical power structures and social support systems. Still others rely on user feedback and administrative action.

IRC and MUDs have multiple approaches to managing harassing users. Both, however, attempt to use some sort of user-developed social sanctions for appropriate behavior. Users are made aware of these social sanctions by remarks from other users. If a user is observed harassing another user, the other users will notify the perpetrator of the social sanctions for that particular area. If the user continues the harassment, further social pressure will be applied. If the user persists, system constraints may then be used.

As discussed earlier in relation to impostors, IRC has two approaches to managing harassing users. The first is a system of social structures developed by the IRC users themselves. A community of regular users decides amongst themselves what will and what will not be acceptable behavior in their channel. These users police themselves, making it known to everyone what the acceptable behavior is. If a user displays rude or obnoxious behavior, or blatantly disrespects the community's accepted behavior, social pressure will be used in an attempt to correct the errant user. The community pulls together and supports itself in this way.

As with impostors, the second means by which IRC manages harassment is through a combination of power structures and system constraints. If a user refuses to follow the community's accepted behavior, IRC Channel Operators can be called in to remove the harassing user through the /kick and /kill commands.

*** Notice -- Received KILL message for 14982784 from MaryD (Obscene Dumps!!!)

*** Notice -- Received KILL message for mic from mgp (massive abusive channel dumping involving lots of ctrl-gs and gaybashing, amongst other almost as obnoxious stuff)

*** Notice -- Received KILL message for JP from Cyberman ((repeatedly ignoring warnings to stop nickname abuse))

MUDs also use both social and technological approaches to managing deviant behavior. Like IRC, MUDs try to maintain a standard set of socially acceptable behaviors developed by the users of the MUD. If a user is out of line, other users will remind them of the accepted community standards. However, when users continue to disregard the community standards, the MUD 'wizards', the MUD equivalent of IRC's Channel Operators, are called in to do damage control.

"On most MUDs, the wizards' first approach to solving serious behavior problems is, as in the best real-life situations, to attempt a calm dialog with the offender. When this fails, as it usually does in the worst cases of irresponsibility, the customary response is to punish the offender with 'toading'. This involves (a) either severely restricting the kinds of actions the player can take or else preventing them from connecting at all, (b) changing the name and description of the player to present an unpleasant appearance (often literally that of a warty toad), and (c) moving the player to some very public place within the virtual reality. This public humiliation is often sufficient to discourage repeat visits by the player, even in a different guise." (Curtis, 1992).

Another less humiliating but more permanent technological approach is to simply remove the offending user from the MUD's database. This approach is referred to as 'recycling' and is the more common method of managing deviant users in the LambdaMOO MUD (Curtis, 1992).

Like all CMC systems, bianca is no exception to the problems of deviant behavior and harassing users. However, because the Web is much more popular than either IRC or MUDs, more people traffic through bianca, and with more people comes more harassment. Adding to bianca's problems is the fact that so many new users are not versed in the ways of netiquette.

The bianca site relies on three approaches to managing deviant behavior: social sanctions, system constraints, and administrative action. The first is through social sanctions specific to each room in bianca. The users in most rooms of bianca have formed sub-communities within the larger virtual community of bianca and have set their own rules and ideas of appropriate behavior. If a user disregards those social foundations, the other users of the room will remind them of what the appropriate behavior is. Users try to maintain this code of conduct as best they can. Some rooms have even gone as far as formalizing and posting their conduct rules. In bianca's basement, for example, users created a Web page containing the rules of conduct in the basement and have asked that a link to these rules be posted at the top of the chat page.

If social pressure fails and a user continues to display deviant behavior, the next approach is through system constraints. Initially bianca relied completely on social approaches to managing deviant behavior. However, as bianca became more popular and started attracting more users with outright negative intentions, it became apparent that some sort of system constraint would have to be designed into the software. In fact, users started complaining, asking for some means of 'ignoring' other users, and even threatening to leave.

"As cool as your chat system is, there are so many assholes on-line these days, that chatting here is no longer fun or pleasant. Can't you do something about it? Is there any way you can give us the ability to ignore eachother? That might help alot. I know you are busy, but if these jerks keen ruining everything, I'm not coming back."

The constraint system used at bianca gives users the ability to ignore selected users. By using HTTP cookies to uniquely identify each user on the system, the software serves up unique pages to each user. The system can selectively display any particular user's conversation based on the current user's configuration. If a user finds that another user is being rude, obnoxious, or harassing, the user need only set their configuration to 'ignore' the obnoxious user in question, and the system will parse out all comments from that user. Because each page is tailored specifically for each user, other users can still see the obnoxious user's comments if they so wish. The process of ignoring other users is relatively easy: a user clicks the 'Configure Chat' link, finds the handle of the user they would like to ignore, checks the box next to that handle, and submits the new configuration. Figure 7 shows a typical Configure Chat page. From then on, the selected handle will be ignored and the user will not see conversation from that handle. Because bianca identifies users by HTTP cookie instead of by handle or some other temporary means, the ignored user will continue to be ignored even if they change their handle.
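The per-viewer page generation described above can be approximated as follows. This is a minimal sketch, not bianca's code; the names are hypothetical. The key design point is that the ignore list keys on the cookie, not the handle, which is why a renamed troll stays ignored.

```python
# viewer's cookie -> set of cookies that viewer has chosen to ignore
ignore_lists = {}

def ignore(viewer_cookie, target_cookie):
    """Record the viewer's choice made on the 'Configure Chat' page."""
    ignore_lists.setdefault(viewer_cookie, set()).add(target_cookie)

def render_chat(viewer_cookie, posts):
    """Build the page served to one viewer: each post is a
    (author_cookie, handle, text) tuple, and posts whose author's cookie
    is on the viewer's ignore list are parsed out."""
    ignored = ignore_lists.get(viewer_cookie, set())
    return [(handle, text) for cookie, handle, text in posts
            if cookie not in ignored]
```

Because filtering happens at page-generation time for each viewer, every other user still sees the ignored user's comments, exactly as the text describes.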

If a user continues the harassment day in and day out, the only other approach is to deny access to the offending user. The process of denying access involves at least three steps. First, the bianca administrator must be notified about the offending user. Then the administrator must make a judgment call as to whether the user is indeed being obnoxious and/or harassing to other users and whether the user deserves to have their access privileges removed. If so, the administrator must then determine the IP address the user connects from and reconfigure the software so as to disable access from that IP address.
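The final, technical step of this process amounts to an address check run on every incoming request. The chapter does not show how bianca's software was actually reconfigured (a server of that era might equally use its access-control configuration files); the following is only a minimal in-process sketch with hypothetical names.

```python
# IP addresses the administrator has decided to deny, after the
# notification and judgment-call steps described above
banned_ips = set()

def deny_ip(address):
    """Record an offender's address once the administrator has ruled."""
    banned_ips.add(address)

def may_serve(address):
    """Gatekeeper consulted on every HTTP request before any page is built."""
    return address not in banned_ips
```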

8.4 Limitation of Freedoms

One important lesson to be learned from virtual communities is that it is necessary to limit user freedoms in order to maintain order. The failure of the utopian ideal of complete freedom in cyberspace was experienced early in the developmental cycle of virtual communities (Reid, 1994). The CommuniTree, one of the early bulletin boards of the late 1970s, may have been one of the first virtual communities to experience the consequences of unchecked deviant behavior and antisocial impulses (Rheingold, 1993). "Initially a forum for intellectual and spiritual discussion amongst adults, in an environment where privacy was guaranteed and censorship censured, CommuniTree collapsed under the onslaught of messages, often obscene, posted by the first generation of adolescent school children with personal computers and modems" (Reid, 1994). As Rosanne Stone recalled in her presentation at the First Conference on Cyberspace in 1990 in Austin, Texas, "there was no easy way to monitor (the messages) as they arrived, and no easy way to remove them once they were in the system... Within a few months, the Tree had expired, choked to death with what one participant called 'the consequences of freedom of expression'" (Stone, 1990). The lessons and implications of CommuniTree showed that "in practice, surveillance and control proved necessary adjuncts to maintaining order in the virtual community" (Stone, 1990).

bianca has also suffered some of the same problems as CommuniTree. In attempts to bring down the system, users have spammed it with hundreds of chat posts. Users have abused the expressiveness that HTML offers. Users continually attempt to impersonate each other and post libelous information. In unchecked, unguided, and unhosted forums, users have blatantly disregarded the conventions of the forum. The list of offenses goes on and on.

Though bianca experiences deviant behavior, bianca's designers take the approach that users are 'innocent until proven guilty'. Initially, bianca users are allowed as much freedom as possible; when they are found abusing that freedom, as they inevitably do, their freedoms are gradually removed until the environment is safe for everyone to express their views without being obnoxious in their presentation. Much of the effort to maintain bianca is spent adding further surveillance methods, user constraints, and limitations to the system software.

The chat areas of bianca are where the majority of limitations are seen. Probably the single most important constraint is the limiting of full HTML capabilities. For all the flexibility and expressiveness HTML enables, it can also be quite destructive if used in devious manners. HTML such as forms, JavaScript, or Java applets can be so dangerous that any occurrence found in a user's chat post is immediately stripped of the offending tags by the chat software before being entered in the room's chat and displayed on users' screens. HTML such as tables, though not inherently destructive, can be used to cause great havoc. A table with a border size of 100,000 completely fills the screen, making reading other posts all but impossible, and even causes some Web browsers to crash. For this reason, tables were disabled and stripped from users' chat posts. Inline images are another HTML tag that, though not inherently destructive, was also being abused. Images can take a long time to download, especially over 14.4 Kbaud modems. Users found that they could post extremely large inline images to the chat room in an effort to lengthen the time required to download a page of chat. Not only were the images in extremely bad taste, they added no value to the chat at hand and, because of the increased download times, served only to agonize the other users. The chat software was thus modified to filter out all inline images and replace them with HTML links to the images' sources.
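A filter of the kind described above can be sketched with a pair of pattern substitutions. This is a simplified reconstruction, not bianca's code (which would likely have handled more tags and edge cases); the tag list comes directly from the behaviors the text names.

```python
import re

# Tags the text says were stripped outright: forms, scripts, applets, tables.
# Only the tags are removed; any text they enclosed remains as plain text.
STRIP = re.compile(r'</?(form|input|script|applet|table|tr|td|th)\b[^>]*>',
                   re.IGNORECASE)

# Inline images are not deleted but rewritten as plain links to their source,
# so the viewer may still follow them without forcing the download on everyone
IMG = re.compile(r'<img\b[^>]*\bsrc\s*=\s*["\']?([^"\'\s>]+)[^>]*>',
                 re.IGNORECASE)

def sanitize_post(html):
    """Filter one chat post before it is entered in the room's chat."""
    html = IMG.sub(lambda m: '<a href="%s">[image]</a>' % m.group(1), html)
    return STRIP.sub('', html)
```

Harmless markup such as bold or italics passes through untouched, preserving the expressiveness the text says bianca tried to keep.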

Another freedom that users abused, and which eventually had to be limited, is that of post length. Initially, posts in bianca's chat rooms could be of any length. If a user had a 100-line poem they wanted to post, the chat software would have accepted and posted it. This was acceptable when bianca was a smaller, more closely knit community. However, as bianca grew more popular, racist and homophobic hate mongers eventually found their way to bianca, and they too could post 100-line rants if they pleased. As the bianca site prides itself on allowing all users freedom of speech, the content of these hateful messages was not the problem. However, these individuals posted their rants over and over again, totally disrupting the feel of the room and almost bringing the system to its knees.

"There is a person in the JukeBox rightnow, monopolizing the entire thing! He resubmits his super long oversized comments repeatedly, giving us no chance of posting anything! He's been there most of the morning. Any suggestions would be of help. Thanks."

Unfortunate as it was, users' freedoms had to be limited once again. The chat software was changed to cut off all posts longer than 20 lines. Though this limitation stopped the repeated harassment from the hate mongers and left 99% of the chat unaffected, it did prevent some users from posting their somewhat lengthy, though beautiful, poetry and stories.

The discussion forums experienced some of the same problems as bianca's chat rooms. In an effort to draw attention to their posts, users would make the subject lines of their posts appear in large or blinking letters. Initially this practice added to the character of the forums; eventually, however, it got so out of hand that reading the page proved extremely difficult, and again, the users themselves asked to have their HTML capabilities limited. In an effort to be fair to everyone, it was decided to remove all HTML from the subject lines of all user posts. Though this limited users' ability to make their posts stand out from the rest, at least it enabled all posts to be seen.

Another problem bianca's discussion forums faced was that of multiple posts by the same user within a short time span. In an effort to disrupt the feel of a forum and obscure other users' posts from view, users would post meaningless drivel over and over again. The discussion forums are designed so that only the most recent 100 message subjects are shown on the screen. Initially the discussion software placed no limit on the number of times any one user could post a message; thus a user could make 100 posts in a row, spamming the system and pushing all other users' posts out of view. To curb this practice, the discussion software was redesigned with a rudimentary limitation on users' ability to spam the system.
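A rudimentary limitation of this sort is commonly implemented as a sliding-window rate limit. The chapter does not state what bianca's actual thresholds were, so the limit and window below are hypothetical, as are the names.

```python
import time
from collections import deque

POST_LIMIT = 5    # hypothetical: maximum posts allowed per user...
WINDOW = 60       # ...within any 60-second window (real values are not given)

recent_posts = {}  # user's cookie -> deque of that user's recent post times

def may_post(cookie, now=None):
    """Allow a post only if the user has not exceeded POST_LIMIT
    posts within the last WINDOW seconds."""
    now = time.time() if now is None else now
    q = recent_posts.setdefault(cookie, deque())
    while q and now - q[0] > WINDOW:   # drop timestamps outside the window
        q.popleft()
    if len(q) >= POST_LIMIT:
        return False                   # spamming: refuse the post
    q.append(now)
    return True
```

Even such a coarse limit prevents the "96 of the last 100 posts" scenario quoted below, since no single user can dominate the visible message list faster than the window allows.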

"Bianca, there is trouble brewing in the Kitchen. That person Atherium is not stable and is crapping all over in the SUB. Must have been pissed off by somebody's remarks. Out of 100 posts today he sent 96. Claims the kitchen is his. The boy wants attention but when he takes over as he thinks he has it ruins it for others and they will stop coming here. Not good for business."

"Some idiot has purposely posted about 30 or 40 if the same "Question" just to piss everyone off. This guy has been causing problems off and on for the past few weeks."

Like the problems experienced at CommuniTree, the problems experienced at bianca further support the idea that, at least in virtual communities, users cannot handle the responsibilities of complete freedom. By carefully and intelligently limiting users' freedoms, bianca was able to continue to allow users freedom of speech, prevent users from thwarting the system, and maintain order in the virtual community.
