Nico Deyo, a 33-year-old e-commerce specialist from Milwaukee, used to enjoy mixing it up with players from around the world in the popular online fantasy game "World of Warcraft." Then a stalker began harassing her on the game's forums, impersonating her in the game and, later, sending her barrages of Twitter messages, some threatening her with graphic rape and murder.
While the stalker didn't drive her from the game, the experience helped sour her on multiplayer gaming. "There's a lot of things about the community that are very hostile," she says of Warcraft. Deyo largely gave up the game almost two years ago and now mostly spends her time playing other games by herself.
Deyo is far from alone. In the male-dominated world of multiplayer online games like "Grand Theft Auto," "Halo," and "Call of Duty," many women say they've had to take drastic steps to escape harassment, stalking and violent threats from male players. Some quit particular games. Others change their screen names or make sure they play only with friends.
Online harassment of women, often involving threats of horrific violence, has become a big issue - and video games are a frequent flashpoint. Two years ago, the online "Gamergate" movement, ostensibly a protest over the ethics of game journalists, also fueled Twitter attacks on female critics replete with gutter-level abuse and assault threats. Some women who were targeted left their homes or canceled speaking engagements, fearing for their safety.
On Saturday, the South by Southwest (SXSW) Interactive festival held a daylong summit on online harassment. Brianna Wu, a video game developer who was at the center of Gamergate, hosted a panel called "Is a safer, saner and civil Internet possible?" Wu talked about her experience receiving hateful messages and death threats online -- she has received over 200 -- and faulted social networks for not doing enough to protect users.
"They need outside people to come in and view their processes to make sure that things like death threats and harassment has no role in the public conversation," Wu told the BBC. "We need social media companies to step up their policies -- because they are failing us."
Tech companies including Google and Facebook took part in the panel. Monika Bickert, head of product policy at Facebook, told the BBC that eliminating online harassment isn't easy and can't be entrusted to a computer algorithm.
"You might, for instance, have someone using a racial slur to attack a person, and that would violate our policies. [But] you might also have somebody using that slur to say 'this morning, on the subway, someone called me this word, it was upsetting,'" Bickert told the BBC. "We need a real person looking at it to make a decision."
According to the BBC, Wu went on to praise Twitter for its efforts to control harassment and chastised Reddit for "failing women."
Many women say online gaming companies have also been slow to act. Major console makers Microsoft and Sony and game developers like Blizzard Entertainment have "terms of service" that explicitly ban stalking and other harassing behavior. The companies have the right to ban reported bad actors from their public forums. But players say that rarely happens -- and when it does, as in Deyo's case, their harassers often follow them onto Twitter and other social channels.
Becky Heineman, the 52-year-old founder of the Olde Skuul game studio in Seattle, was an aficionado of shoot-em-ups like "Halo" and "Call of Duty." But constant catcalls from other players and questions about her bra size or "whether I do it on top or bottom, or other derogatory things," she says, wore her down.
Reporting her harassers never seemed to make a difference, she says. She limited her play to friends for a while, but now mostly focuses on simple single-player games like "Cookie Clicker" on her phone and computer.
Contrary to popular stereotypes, women are avid video gamers; one recent survey showed that about half of all women play video games, about the same as men. But men are far more likely to identify themselves as "gamers," and experts say that "hard-core" shooting and action games remain mostly male.
It's only recently that "women players have been recognized as valid gamers that are interesting for companies," said Yasmin Kafai, a University of Pennsylvania professor who focuses on gender and gaming.
Microsoft says recent changes to its Xbox Live service make it more likely that players with bad reputations will end up playing each other. It adds that its enforcement team monitors complaints at all times and that all reports are investigated. Sony, Blizzard and the Entertainment Software Association, a trade group, did not respond to requests for comment.
Those moves don't impress some women in the industry.
"While they have very good statements about harassment and, you know, responsibility to the community and all that kind of stuff, the enforcement side of it is pretty lax," says Kate Edwards, executive director of the International Game Developers Association. "Players basically have to adopt their own strategies to deal with it."
Games and online game networks, for instance, let players "mute" messages from opponents and turn off voice chat, where trash talk can easily shade over into harassment. Xbox Live also labels players who get lots of complaints with a red marker so that other players can avoid them.
But constantly muting or reporting other players interrupts what's supposed to be a fun pastime. And it doesn't change harassing behavior.
"If I just block somebody, is that stopping them from doing the abuse?" says Kishonna Gray, an Eastern Kentucky University professor who wrote a book about racist and sexist interactions within Xbox. "They can go to the next person and do the same thing."
That's especially true when harassment shades into the real world. Mercer Smith-Looper, a 27-year-old Boston woman, found it annoying when male players patronized her and told her how to play. Then she started receiving unwanted gifts -- a necklace, a sword -- in the mail. One gamer unexpectedly showed up at her workplace after calling her repeatedly.
Fed up, she changed her gamer name and now sticks to playing privately with friends or alone. "I'm kind of in hiding," she says.
What would effective anti-harassment measures look like? Experts like Edwards and Gray point to Riot Games, the maker of "League of Legends," for its efforts to change player culture. Riot built a system based on artificial intelligence and player feedback to determine appropriate behavior during gameplay, and uses it to punish or reward players who draw complaints, according to the company's online support documents.
When players show "signs of toxicity," Riot can block them from competitive play, limit their chats or ban them entirely. When it punishes players, the company shows them what behavior other players objected to. Jeffrey Lin, Riot's lead game designer for social systems, has said that because of these efforts, only 2 percent of its games worldwide experienced racist, homophobic or sexist language or excessive harassment.
Riot Games declined to comment when contacted by The Associated Press.
IGDA's Edwards acknowledges that dealing with harassment is a difficult challenge. "You're dealing with minors versus adults," she says. "You're dealing with free speech issues. It's a struggle for companies to figure out exactly how to approach it."
And while Riot-style moderation might limit harassment, it's unlikely to solve the problem on its own. "This is a social and cultural problem, not a technological one," says Dmitri Williams, CEO of game analytics firm Ninja Metrics.