
Tech

The Activision Blizzard lawsuit shows harassment in the games industry is still rampant — 5 must-reads to understand why – The Next Web

Sexual harassment in gamer culture burst back into the spotlight on July 21, 2021, with news of California’s lawsuit against Activision Blizzard, publisher of top-selling video games Call of Duty, World of Warcraft, and Candy Crush, and a walkout by company employees. The lawsuit alleges a “pervasive ‘frat boy’ culture” at the company and discrimination against women in pay and promotion.

The turmoil is an echo of the infamous Gamergate episode of 2014 that featured an organized online campaign of harassment against female gamers, game developers and gaming journalists. The allegations are also of a piece with a decades-long history of gender discrimination in the technology field.

We’ve been covering sexual harassment and gender discrimination in gaming – and technology generally – and picked five articles from our archive to help you understand the news.

1. Gaming culture is toxic – but community norms can change it

Things have not been getting steadily better. The shift to online activities caused by the pandemic was accompanied by an increase in online harassment and a decrease in the number of women and girls playing video games.

More than a third of female gamers have experienced harassment, and female players have developed coping strategies like hiding their gender, playing only with friends and shutting down harassers by outplaying them, according to University of Oregon professor Amanda Cote. These strategies take time and energy, and they avoid rather than challenge the harassment. Challenging harassment is also fraught, because it typically sparks a backlash and puts the burden on the victim.

Shutting down harassment comes down to creating and supporting community norms that reject, rather than allow or encourage, it. Beyond banning harassers, gaming companies can adopt practices that discourage the behavior before it happens, including reducing opportunities for conflict outside of gameplay, adding in-game recognition of good behavior, and responding quickly to complaints.

“If esports continue to expand without game companies addressing the toxic environments in their games, abusive and exclusionary behaviors are likely to become entrenched,” she writes. “To avoid this, players, coaches, teams, leagues, game companies and live-streaming services should invest in better community management efforts.”


Read more: Here’s what it’ll take to clean up esports’ toxic culture


2. It’s not just players – fans are part of the problem

Go to any sports stadium and you’ll see that the atmosphere that energizes players and fans alike comes from the fans. For esports, the venues are streaming services, where fan reaction comes not as cheers and chants but as online chat.

University of South Florida professor Giovanni Luca Ciampaglia and colleagues analyzed chats on Twitch, one of the largest streaming services carrying live esports. They found a sharp distinction in the language fans use when commenting on players, known as streamers, depending on the streamer’s gender.

“When watching a man stream, viewers typically talk about the game and try to engage with the streamer; game jargon (words like ‘points,’ ‘winner’ and ‘star’) and user nicknames are among the most important terms,” he writes. “But when watching a woman stream, the tone changes: Game jargon drops, and objectification language increases (words like ‘cute,’ ‘fat’ and ‘boobs’). The difference is particularly striking when the streamer is popular, and less so when looking at comments on less-popular streamers’ activity.”
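
The contrast Ciampaglia describes can be illustrated with a simple word-frequency comparison between two chat logs. The sketch below is not the study’s actual pipeline; the file names, the small stop-word list and the frequency-ratio score are illustrative assumptions.

# A minimal sketch of contrasting the vocabulary of two chat corpora.
# NOT the study's method; file names, stop-word list and the
# frequency-ratio score are illustrative assumptions.
from collections import Counter
import re

def word_counts(path):
    """Count lowercase word tokens in a plain-text chat log."""
    with open(path, encoding="utf-8") as f:
        words = re.findall(r"[a-z']+", f.read().lower())
    stop = {"the", "a", "and", "to", "is", "it", "i", "you", "of", "in"}
    return Counter(w for w in words if w not in stop)

def distinctive_terms(counts_a, counts_b, top=20):
    """Rank terms that are disproportionately frequent in corpus A vs. B."""
    total_a = sum(counts_a.values()) or 1
    total_b = sum(counts_b.values()) or 1
    score = {
        w: (counts_a[w] / total_a) / ((counts_b[w] + 1) / total_b)
        for w in counts_a
    }
    return sorted(score, key=score.get, reverse=True)[:top]

if __name__ == "__main__":
    # Hypothetical file names; any two chat transcripts will do.
    men = word_counts("chat_men_streamers.txt")
    women = word_counts("chat_women_streamers.txt")
    print("Over-represented in chats on women's streams:")
    print(distinctive_terms(women, men))

Run on transcripts of comparable size, this surfaces the terms that dominate one set of chats but not the other, the same kind of signal the researchers examined at far larger scale and with more rigorous methods.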

As with the games themselves, combating harassment and discrimination on streaming services comes down to community standards, he writes. The streaming services “need to examine their cultural norms to drive out toxic standards that effectively silence entire groups.”


Read more: Can online gaming ditch its sexist ways?


3. Collegiate esports leagues don’t reflect the population of video game players

Esports is becoming a big business, with over $1 billion in revenues, and collegiate leagues are an important component of the field. Yet just over 8% of college esports players and 4% of coaches are female. The low rates of participation are not a reflection of a lack of interest: 57% of women ages 18-29 play video games that are in the esports category.

Boise State esports coach Doc Haskell watches scholarship graduate student Artie ‘N3rdybird’ Rainn compete in a match. (AP Photo/Otto Kitsinger)

Female players face overt hostility and harassment, which discourages participation, according to SUNY Cortland professor Lindsey Darvin. College teams often engage in tokenism by bringing on a single female player, and the vast majority of scholarships go to male players.

Professional esports organizations are beginning to address the gender gap. Colleges and universities need to follow suit.

“Colleges and universities that receive U.S. federal aid have an obligation to improve opportunity and access to participation based on Title IX policy, which prohibits sex discrimination in any education program or activity receiving federal financial assistance,” she writes.


Read more: At colleges nationwide, esports teams dominated by men


4. Lessons from the tech field: Diversity and equity require women with power

The roots of esports’ toxic culture lie in decades of gender discrimination in the technology field as a whole. That discrimination has proved stubborn.

“In 1995, pioneering computer scientist Anita Borg challenged the tech community to a moonshot: equal representation of women in tech by 2020,” writes Rensselaer Polytechnic Institute professor Francine Berman. “Twenty-five years later, we’re still far from that goal. In 2018, fewer than 30% of the employees in tech’s biggest companies and 20% of faculty in university computer science departments were women.”

Reversing discrimination is a matter of changing cultures within organizations. “Diverse leadership is a critical part of creating diverse cultures,” she writes. “Women are more likely to thrive in environments where they have not only stature, but responsibility, resources, influence, opportunity and power.”

“Culture change is a marathon, not a sprint, requiring constant vigilance, many small decisions, and often changes in who holds power,” she writes. “My experience as supercomputer center head, and with the Research Data Alliance, the Sloan Foundation and other groups has shown me that organizations can create positive and more diverse environments.”


Read more: The tech field failed a 25-year challenge to achieve gender equality by 2020 – culture change is key to getting on track


5. The myth of meritocracy is an impediment to equality

The myth of meritocracy accounts for much of the longevity of gender discrimination in the tech field. That myth holds that success is the result of skill and effort alone, and that women’s underrepresentation is therefore a reflection of their abilities.

In the U.S., women own 39% of all privately owned businesses but receive only around 4% of venture capital funding, according to Brown University professor Banu Ozkazanc-Pan.

“Yet the meritocracy myth, which my research shows has a stronghold in the world of entrepreneurship, means that women are constantly told that all they have to do to get more of that $22 billion or so in venture capital funding is make better pitches or be more assertive,” she writes.

What the tech field calls meritocracy is in fact gender-biased and results in mostly white men gaining access to resources and funding. “By continuing to believe in meritocracy and maintaining practices associated with it, gender equality will remain a distant goal,” she writes.

Adopting gender-aware approaches, including setting concrete goals for gender balance, is key to correcting the imbalances caused by the meritocracy myth.


Read more: Women in tech suffer because of American myth of meritocracy


This article by Eric Smalley, Science + Technology Editor at The Conversation, is republished from The Conversation under a Creative Commons license. Read the original article.


Tech

Ottawa orders TikTok’s Canadian arm to be dissolved

The federal government is ordering the dissolution of TikTok’s Canadian business after a national security review of the Chinese company behind the social media platform, but stopped short of ordering people to stay off the app.

Industry Minister François-Philippe Champagne announced the government’s “wind up” demand Wednesday, saying it is meant to address “risks” related to ByteDance Ltd.’s establishment of TikTok Technology Canada Inc.

“The decision was based on the information and evidence collected over the course of the review and on the advice of Canada’s security and intelligence community and other government partners,” he said in a statement.

The announcement added that the government is not blocking Canadians’ access to the TikTok application or their ability to create content.

However, it urged people to “adopt good cybersecurity practices and assess the possible risks of using social media platforms and applications, including how their information is likely to be protected, managed, used and shared by foreign actors, as well as to be aware of which country’s laws apply.”

Champagne’s office did not immediately respond to a request for comment seeking details about what evidence led to the government’s dissolution demand, how long ByteDance has to comply and why the app is not being banned.

A TikTok spokesperson said in a statement that the shutdown of its Canadian offices will mean the loss of hundreds of well-paying local jobs.

“We will challenge this order in court,” the spokesperson said.

“The TikTok platform will remain available for creators to find an audience, explore new interests and for businesses to thrive.”

The federal Liberals ordered a national security review of TikTok in September 2023, but it was not public knowledge until The Canadian Press reported in March that it was investigating the company.

At the time, the government said the review was based on the expansion of a business that constituted the establishment of a new Canadian entity. It declined to provide any further details about what expansion it was reviewing.

A government database showed a notification of new business from TikTok in June 2023. It said Network Sense Ventures Ltd. in Toronto and Vancouver would engage in “marketing, advertising, and content/creator development activities in relation to the use of the TikTok app in Canada.”

Even before the review, ByteDance and TikTok were lightning rods for privacy and safety concerns because Chinese national security laws compel organizations in the country to assist with intelligence gathering.

Such concerns led the U.S. House of Representatives to pass a bill in March designed to ban TikTok unless its China-based owner sells its stake in the business.

Champagne’s office has maintained Canada’s review was not related to the U.S. bill, which has yet to pass.

Canada’s review was carried out through the Investment Canada Act, which allows the government to investigate any foreign investment with the potential to harm national security.

While cabinet can make investors sell parts of the business or shares, Champagne has said the act doesn’t allow him to disclose details of the review.

Wednesday’s dissolution order was made in accordance with the act.

The federal government banned TikTok from its mobile devices in February 2023 following the launch of an investigation into the company by federal and provincial privacy commissioners.

— With files from Anja Karadeglija in Ottawa

This report by The Canadian Press was first published Nov. 6, 2024.

The Canadian Press. All rights reserved.


Health

Here is how to prepare your online accounts for when you die

LONDON (AP) — Most people have accumulated a pile of data — selfies, emails, videos and more — on their social media and digital accounts over their lifetimes. What happens to it when we die?

It’s wise to draft a will spelling out who inherits your physical assets after you’re gone, but don’t forget to take care of your digital estate too. Friends and family might treasure files and posts you’ve left behind, but they could get lost in digital purgatory after you pass away unless you take some simple steps.

Here’s how you can prepare your digital life for your survivors:

Apple

The iPhone maker lets you nominate a “legacy contact” who can access your Apple account’s data after you die. The company says it’s a secure way to give trusted people access to photos, files and messages. To set it up you’ll need an Apple device with a fairly recent operating system — iPhones and iPads need iOS or iPadOS 15.2 and MacBooks need macOS Monterey 12.1.

For iPhones, go to settings, tap Sign-in & Security and then Legacy Contact. You can name one or more people, and they don’t need an Apple ID or device.

You’ll have to share an access key with your contact. It can be a digital version sent electronically, or you can print a copy or save it as a screenshot or PDF.

Take note that there are some types of files you won’t be able to pass on — including digital rights-protected music, movies and passwords stored in Apple’s password manager. Legacy contacts can only access a deceased user’s account for three years before Apple deletes the account.

Google

Google takes a different approach with its Inactive Account Manager, which allows you to share your data with someone if it notices that you’ve stopped using your account.

When setting it up, you need to decide how long Google should wait — from three to 18 months — before considering your account inactive. Once that time is up, Google can notify up to 10 people.

You can write a message informing them you’ve stopped using the account, and, optionally, include a link to download your data. You can choose what types of data they can access — including emails, photos, calendar entries and YouTube videos.

There’s also an option to have your account deleted automatically three months after it is deemed inactive, so your contacts will have to download any data before that deadline.

Facebook and Instagram

Some social media platforms can preserve accounts for people who have died so that friends and family can honor their memories.

When users of Facebook or Instagram die, parent company Meta says it can memorialize the account if it gets a “valid request” from a friend or family member. Requests can be submitted through an online form.

The social media company strongly recommends Facebook users add a legacy contact to look after their memorial accounts. Legacy contacts can do things like respond to new friend requests and update pinned posts, but they can’t read private messages or remove or alter previous posts. You can only choose one person, who also has to have a Facebook account.

You can also ask Facebook or Instagram to delete a deceased user’s account if you’re a close family member or an executor. You’ll need to send in documents like a death certificate.

TikTok

The video-sharing platform says that if a user has died, people can submit a request to memorialize the account through the settings menu. Go to the Report a Problem section, then Account and profile, then Manage account, where you can report a deceased user.

Once an account has been memorialized, it will be labeled “Remembering.” No one will be able to log into the account, which prevents anyone from editing the profile or using the account to post new content or send messages.

X

It’s not possible to nominate a legacy contact on Elon Musk’s social media site. But family members or an authorized person can submit a request to deactivate a deceased user’s account.

Passwords

Besides the major online services, you’ll probably have dozens if not hundreds of other digital accounts that your survivors might need to access. You could just write all your login credentials down in a notebook and put it somewhere safe. But making a physical copy presents its own vulnerabilities. What if you lose track of it? What if someone finds it?

Instead, consider a password manager that has an emergency access feature. Password managers are digital vaults that you can use to store all your credentials. Some, like Keeper, Bitwarden and NordPass, allow users to nominate one or more trusted contacts who can access their keys in case of an emergency such as a death.

But there are a few catches: Those contacts also need to use the same password manager and you might have to pay for the service.

___

Is there a tech challenge you need help figuring out? Write to us at onetechtip@ap.org with your questions.


Tech

Google’s partnership with AI startup Anthropic faces a UK competition investigation

LONDON (AP) — Britain’s competition watchdog said Thursday it’s opening a formal investigation into Google’s partnership with artificial intelligence startup Anthropic.

The Competition and Markets Authority said it has “sufficient information” to launch an initial probe after it sought input earlier this year on whether the deal would stifle competition.

The CMA has until Dec. 19 to decide whether to approve the deal or escalate its investigation.

“Google is committed to building the most open and innovative AI ecosystem in the world,” the company said. “Anthropic is free to use multiple cloud providers and does, and we don’t demand exclusive tech rights.”

San Francisco-based Anthropic was founded in 2021 by siblings Dario and Daniela Amodei, who previously worked at ChatGPT maker OpenAI. The company has focused on increasing the safety and reliability of AI models. Google reportedly agreed last year to make a multibillion-dollar investment in Anthropic, which has a popular chatbot named Claude.

Anthropic said it’s cooperating with the regulator and will provide “the complete picture about Google’s investment and our commercial collaboration.”

“We are an independent company and none of our strategic partnerships or investor relationships diminish the independence of our corporate governance or our freedom to partner with others,” it said in a statement.

The U.K. regulator has been scrutinizing a raft of AI deals as investment money floods into the industry to capitalize on the artificial intelligence boom. Last month it cleared Anthropic’s $4 billion deal with Amazon and it has also signed off on Microsoft’s deals with two other AI startups, Inflection and Mistral.

The Canadian Press. All rights reserved.
