It's a cliché to assume that all online gamers -- and particularly those who play popular titles like first-person shooters -- are various degrees of unpleasant.
However, that's not to say that unpleasant players don't exist, so it's increasingly important that those who make games -- or the online services surrounding them -- provide the means for everyone to feel safe online.

Approaching the situation from a more positive angle, having these tools in place also helps match players with the sort of people they will actively enjoy playing with, rather than those whom they merely tolerate or immediately mute.
With this in mind, Microsoft has announced its plans for the Xbox One's "reputation" system in a blog post by Micheal Dunn, program manager for Xbox Live.
Dunn describes the new console's reputation model as "community-powered." What that means is that -- in theory, at least -- you should be able to weed out the sort of people you don't want to play with, and at the same time help to create real consequences for those who spoil the experience for others.
At present, Xbox Live implements a survey system that allows you to provide feedback on a player according to their behavior in a game. This helps build a profile of individual users according to what sort of player people have rated them as, and also contributes to each user's "star" rating on their Xbox Live profile. Unfortunately, the current system is relatively easy to abuse: it's not out of the question to find someone with a five-star rating screaming bloody murder in the middle of a game, because all they had to do to earn that rating was get a few of their friends to rate them as a good player.
Instead, the Xbox One's implementation of Live will incorporate a more direct feedback model. Rather than relying on surveys, Live will instead take into account immediate actions such as whether people are blocked or muted.
"The new model will take all of the feedback from a player's online flow and put it in the system with a crazy algorithm we created and validated with an MSR [Microsoft Research] PhD to make sure things are fair for everyone," says Dunn. The system will then determine each individual player's reputation.
The way this will work is simpler than the "five star" rating and "zone" system previously used -- instead, you'll be assigned one of three categories according to your behavior, and this category will be immediately visible on your Live profile. Green players are good players, yellow players "need improvement" and red players are flagged as "avoid me."
"Your reputation score is ultimately up to you," says Dunn. "The more hours you play online without being a jerk, the better your reputation will be; similar to the more hours you drive without an accident, the better your driving record and insurance rates will be."
Dunn assures users that the algorithm used is "sophisticated" and won't penalize you for a stray bad report, or those left out of spite.
"The algorithm weighs the data collected so if a dozen people are suddenly reporting a single user, the system will look at a variety of factors before docking their reputation," Dunn explains. "We'll verify if those people actually played in an online game with the person reported -- if not, all of those players' feedback won't matter as much as a single person who spent 15 minutes playing with the reported person. The system also looks at the reputation of the person reported and the alleged offender, frequency of reports from a single user and a number of other factors."
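The weighting Dunn describes can be sketched in code. To be clear, this is a purely illustrative sketch: every name, weight, and threshold below is an assumption for demonstration, not Microsoft's actual algorithm, which has not been published.

```python
# Illustrative sketch only -- the weights, thresholds, and field names
# here are assumptions; Microsoft has not published its actual algorithm.

from dataclasses import dataclass

@dataclass
class Report:
    reporter_reputation: float    # assumed scale: 0.0 (bad) .. 1.0 (good)
    minutes_played_together: int  # 0 if the reporter never shared a session
    reports_from_this_user: int   # how many reports this user has filed

def report_weight(report: Report) -> float:
    """Weight one negative report by the factors Dunn describes."""
    # Reports from people who never played with the target count for nothing;
    # 15 minutes of shared play (Dunn's example) earns full weight.
    played = min(report.minutes_played_together / 15.0, 1.0)
    # A reporter with a poor reputation of their own carries less weight.
    credibility = report.reporter_reputation
    # Habitual reporters are discounted (guards against spite reporting).
    frequency_discount = 1.0 / report.reports_from_this_user
    return played * credibility * frequency_discount

def dock_reputation(current: float, reports: list[Report]) -> float:
    """Apply a batch of reports, clamping reputation to [0, 1]."""
    penalty = 0.05 * sum(report_weight(r) for r in reports)  # assumed scale
    return max(0.0, current - penalty)

# A dozen drive-by reports from users who never played with the target...
drive_by = [Report(0.8, 0, 1) for _ in range(12)]
# ...versus a single report from someone who spent 15 minutes in a session.
genuine = [Report(0.8, 15, 1)]

print(dock_reputation(0.9, drive_by))  # unchanged: all weights are zero
print(dock_reputation(0.9, genuine))   # reputation actually docked
```

Under these assumed weights, the dozen reports from strangers change nothing, while the single report from a genuine session partner does -- matching Dunn's claim that one 15-minute session outweighs a pile of drive-by reports.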
What Dunn doesn't mention is the fact that a lot of people immediately and habitually mute all the other players in an online session in an effort to concentrate on the game rather than potentially distracting chatter. If the algorithm is as sophisticated as Dunn suggests, presumably this will be taken into account, but it remains to be seen how it behaves in the wild -- and that's something we won't see until Xbox One officially launches.