Lush Is Selectively Quitting Social Media, Can You?


Last week, Lush Cosmetics announced they were removing themselves from four major social media platforms: Facebook, Instagram, Snapchat and TikTok. They claim they’re protesting the “serious effects” social media has on users’ mental health and wellbeing. Ironically, their announcement garnered them plenty of free publicity from traditional media.

“We wouldn’t ask our customers to meet us down a dark and dangerous alleyway—but some social media platforms are beginning to feel like places no one should be encouraged to go,” the retailer wrote. Lush, which has 240 stores across Canada and the U.S., said it will remain off the platforms until they ensure a safer environment for users. They’re calling their “boycott” their Global Anti-Social Media Policy. However, they’re staying on Pinterest, Twitter, and YouTube, so they aren’t entirely removing themselves from social media.

Lush’s timing is suspect. Sticking with Twitter makes me wonder whether their Global Anti-Social Media Policy is just a pre-holiday PR stunt. Twitter has virtually no filter: people can easily share misinformation, disinformation, hate speech, inappropriate content and more, all of which stays on the platform until it’s reported.

On the surface, Lush’s decision is both principled and brave. However, I’m sure Lush questioned the ROI of their social media spend against their bottom line before “taking a stand” against social media’s harmful effects.

Mark Constantine, Lush’s co-founder and CEO, must have decided the ROI from Pinterest, Twitter, and YouTube outweighed taking a stance against all social media platforms for tolerating toxic behaviour. As long as a platform serves Lush’s bottom line, they’ll stick with it. Am I suggesting that Lush selectively “quitting social media” is a business move? Of course it is! Why else would they stay on some social platforms and not others when all platforms are toxic to a certain degree?

It’s always business profits before taking an ethical stand. Therefore, brands tend to virtue signal; there’s no real risk in doing so, and there’s the upside of enhancing the brand’s image amongst the naïve. Lush isn’t really quitting social media; they’re trying to appear as if they are. The headline of their press release reads: “Lush Cosmetics to Deactivate Global Social Media Accounts.”

Unless you’re a hardcore e-commerce site like Amazon, eBay or Walmart (an estimated 468.96 million monthly visitors), social media usually isn’t an efficient sales channel. Social media is great for increasing brand awareness. However, accurately tracking new sales from social media platforms is nearly impossible. Vanity metrics such as “likes” and “comments” don’t necessarily result in sales. I’m still waiting for a brand to show how much a “like” influences their bottom line.

Undeniably, social media has proven not to be great for our collective mental health. People don’t congregate on social media because of brands; they do so because of their dire need to be accepted or admired. People use social media in the unhealthiest way possible: trying to gauge whether their peers and community accept or reject them. Tribalism is the nature of social media. Lush selectively avoiding specific platforms isn’t a significant step toward decreasing social media’s toxic nature. All social media platforms tolerate toxic behaviour.

Lush hasn’t offered any concrete solutions for how to mitigate, if not eradicate, the toxicity appearing on social media sites. Tackling the issue will require a collective effort between lawmakers, associations, sociologists, social media companies and those who hold the money social media companies depend on—brands. As with climate change, it’s too late for minor acts. Major (READ: drastic) action is needed.

In July, I wrote an article entitled “Will Social Media Companies Ever Make Fighting Online Abuse a Priority?” in which I offered several suggestions for how social media companies can reduce toxic behaviour on their platforms. One solution I proposed is requiring credit card and/or phone number authentication to create a social media account, which would prevent anonymous accounts from being created. Knowing they can easily be traced, social media users would rein in their toxic behaviour.

Every social media platform relies on advertising revenue for its survival and profitability. Eyeballs are what keep social media free for you and me. Attracting as many eyeballs as possible is why social media companies allow toxicity that stays within their respective “user guidelines” to exist on their platforms.

Toxic behaviour is a cost-effective way for social media companies to attract and hold our attention. As someone once said, “If it’s free, then you are the product.” Who among us doesn’t like aggressive theatrics morphing into a flame war of insults, labelling and accusations, all in futile attempts to prove the other person wrong? Has insulting someone or calling them a “racist” ever changed their mind?

Then there’s the lack of discussion regarding algorithms designed to prioritize sensationalized content over mundane content, ensuring that anything that encourages debate is presented to the masses. Why? Because this type of content creates arguments in the comments section, which counts as “engagement.”

What’s the likelihood that more companies walk away from social media—or from all of it, not just selectively as Lush has done? I’m not holding my breath. We’re unable to unite to face obvious dangers such as climate change. The damage social media causes, particularly to the emotional development of adolescents, is Machiavellian in nature and therefore not widely accepted as factual. Meanwhile, social media companies and brands have self-serving financial agendas worth unimaginable billions.

What’s never talked about is that social media usage is a choice and a user responsibility. Attention and reaction are choices. We’re choosing to chase emotions on digital platforms, giving social media companies the eyeballs they need to attract advertisers.

____________________________________________

 

Nick Kossovan, a self-described connoisseur of human psychology, writes about what’s on his mind from Toronto. You can follow Nick on Twitter and Instagram @NKossovan.


Cheeky social media posts from City of Prince George resonate with residents – CBC.ca



After experiencing a record snowfall earlier this January, the city of Prince George shared an important message with its residents through one of its social media channels: 

“It’s not our fault.”

As complaints came in to city hall about mounds of powder making it difficult to drive, the city’s official social media manager explained the situation with a tongue-in-cheek tone on Facebook, writing, “We would throw it back up in the sky if we could but this is not a municipal service we offer at this time.”

The post went on to explain the services the city does offer, including clearing the end of driveways, and shared safety tips on navigating the snowy streets. 

Residents responded with more than a thousand reactions and shares, primarily positive, with some praising the post’s humorous take and conversational messaging.

And many have wondered who’s behind the more light-hearted messages they’ve been seeing from the city recently.

The answer is a four-person communications team that’s taking a new approach to engaging Prince George residents.

Julie Rogers, the city’s new communications manager hired in October, says a municipal government cannot have a real conversation with people if it communicates in a language that is difficult to understand. Prior to Prince George, she worked for the municipalities of Fort St. John in northeastern B.C. and Sechelt on the province’s Sunshine Coast.

Julie Rogers joined the City of Prince George as the communications manager in October. She credits her four-person communications team with making municipal government more approachable. (Submitted by Julie Rogers)

“When you start off with ‘please be advised,’ ‘you are hereby notified,’ it’s intimidating and it’s not nice,” she told CBC Radio West host Sarah Penton.

“It feels like the government, and you know, as much as we are the government, we are also your neighbours.”

Other highlights include a message to dog owners to “scoop your poopsicles” from the snow.

“Come spring our parks are going to STINK,” the post warned, once again garnering positive responses.

Rogers says followers and engagement on the city’s social media channels have skyrocketed since they adopted the more humorous tone.

“We’re really happy that we’ve had a positive response from the public.”

Not everything is a joke, though: the city still uses serious messaging for issues such as budget processes and public safety, although Rogers tries to keep the language clear and straightforward so municipal issues are easier for everyone to understand.

And she says if the city does something wrong, it will apologize.

“We’ve screwed up and we’re sorry, and here’s how we’ll do better,” she said. “That’s crisis communications 101.”

Rogers says most of the people who leave comments on the city’s Facebook page are nice, but she asks people to stay civil in online discussions.

“You’re entitled to your opinion, thanks for sharing it,” she said. “Don’t attack people … we’re not going to please everybody.”



Why the nature of TikTok could exacerbate a worrisome social media trend – Boise State Public Radio



MICHEL MARTIN, HOST:

In recent years, parents and policymakers alike have started to focus on the negative effects social media can have on young people, on everything from introducing them to hate groups to encouraging eating disorders. So we want to tell you about another debate that’s emerging around what some think might be a new threat, the self-diagnosing of mental health issues. There doesn’t seem to be any hard data on this, but if you spend any time on TikTok or Reddit or other platforms, you can easily see videos documenting mental health symptoms – sometimes from health professionals, often not. On TikTok, for example, if you type #DID, which stands for dissociative identity disorder, you’ll find videos that total 1.5 billion views on that topic alone. And while some people think the increased discussion of mental health is a good thing, others worry it’s creating a misunderstanding about certain mental disorders.

To help us break down this trend and what’s being done to address it, we’ve called Taylor Lorenz, a technology and culture reporter. And she’s with us now. Welcome. Thanks so much for joining us once again.

TAYLOR LORENZ: Thank you for having me.

MARTIN: Now, before we dive in, I’m sorry if this sounds super-basic, but for folks who don’t know, what exactly is TikTok? And how does something like self-diagnosis work on a platform like that?

LORENZ: So TikTok is a short-form video app where people can go and post videos around 15 seconds to three minutes long set to music, usually. A lot of these are just dance videos, cooking videos, things like that. And then a significant portion are people sort of sharing information about their lives. So they might be talking about struggles that they have, interpersonal relationships and mental health.

When it comes to mental health, a lot of people on TikTok are very open about their own mental health struggles. So, you know, for instance, hey, I struggle with ADHD. Here’s how it makes my life hard. Here’s how I’m trying to address it. And sometimes they can get pretty personal. So viewers then watch that, and, you know, some of it resonates with them to the point that they wonder if they themselves are struggling with that mental issue.

MARTIN: So, you know, I mean, it’s not like it’s new for people to use media to talk about any kinds of health issues – I mean, blogs and, you know, YouTube, for example. What do you think it is about TikTok that’s making this such a big deal right now? – because, as we said, kind of, there’s really a debate opening up about this, and some professionals in this field are really concerned. Is there something about TikTok that you think makes this particularly worrisome?

LORENZ: TikTok is so fundamentally different than every other social platform out there right now because the primary way that you consume content is through this algorithmically generated feed. You could actually be on TikTok all day and never follow a single person and you would just be getting fed this feed of content. Every other social platform – Twitter, Instagram, YouTube – you have to subscribe to someone else’s content. So when things go viral on TikTok, suddenly they’re sort of just shoved into the feeds of millions of people that might not have ever followed that stuff before. So I think it just – the mechanisms of it allow certain things to spread and allows certain videos to really take hold in a way that they couldn’t.

MARTIN: As we said, there are a lot of professionals who say, you know what? We’ve talked so much about destigmatizing mental health issues. People are finally doing it, and now people are mad about that. But other people are concerned that people aren’t doing the second step, which is getting confirmation or actually seeking out professionals who know more about it, or that they may be misdiagnosing themselves, which, as we know, can always be…

LORENZ: Yeah.

MARTIN: …You know, worrisome. So do we know something about this trend? Has this been studied in some way? Like, do – you know what I mean?

LORENZ: Yeah.

MARTIN: Is there anything we actually know as opposed to what we suspect?

LORENZ: Researchers are just beginning to study the effects of these things. And I don’t have any kind of specific numbers. I will say that, like, I do think the conversation sometimes gets flipped a little bit backwards, where people say, like, if they’re going on TikTok and they’re getting the idea that they have certain conditions, but then they don’t go and get the diagnosis.

A lot of times the reason that they’re going on TikTok is because there are all of these hurdles in the health care system to getting these mental health diagnoses. It’s incredibly hard, especially during the pandemic, to even get an appointment for a lot of people. And a lot of these people are minors, so it can be even harder for them to kind of get mental health care. So I think it’s kind of like TikTok is a symptom of the broader problem instead of TikTok is necessarily causing the problem, if that makes sense.

MARTIN: And as we mentioned, that, you know, policymakers have gotten very interested in social media – people from across the political spectrum. So wondering whether social media companies have said anything about this kind of content?

LORENZ: Well, they definitely regulate it in certain ways. So, you know, there are hashtags that are banned on TikTok for certain – you know, self-harm. If you post on Instagram about – you know, joking about killing yourself, you’ll get a pop up asking if you need help. You know, social media companies are trying to kind of incorporate some of these things, I think, because of that regulatory pressure from Washington. But at the end of the day, it’s a little bit hard because some of these people are just – it’s not against any kind of, like, terms and policies to share about your own mental health journey. And so I think it’s, like – it’s a little hard to regulate.

I will say another thing that I know these companies take pretty seriously is eating disorder content, which is so rampant on social media. TikTok came under fire for having a lot of this content in the feed, being distributed. And I know that they’ve since kind of banned a lot of it. So I think it’s a bit of a whack-a-mole situation.

MARTIN: So before we let you go, given how long you’ve covered tech and how we use it, is this something that parents should be concerned about in some way? And is this something that you think tech companies should do – should be paying more attention to?

LORENZ: In terms of the, like, diagnoses of things, I think it’s just really important to have these open conversations with your kid, you know, and say, like, hey, well, what do you think? Or why do you think that? You know, so many people – when I was looking into doing a different story kind of related to this last year, I talked to one therapist, and she was saying, you know, so many people come on to TikTok looking for, you know, what’s wrong with them. And they might get – think that they have a diagnosis and then kind of come to the therapist. And it actually turns out there is very much something going on. It might not be what they think it is, but it got – it’s what got them in the door, you know, to see someone.

And so I think parents need to, like, be comfortable taking that second step instead of just dismissing their child’s concerns. Oh, you saw a bunch of those videos on TikTok. But that’s nonsense. You know, say, hey, why do you feel like that stuff is resonating so much? And how can I help you solve some of these problems?

MARTIN: And then the tech companies – do you think that there’s more they need to be paying attention to?

LORENZ: What the tech companies – they need to focus on is surfacing relevant information. You know, if you search the hashtag #ADHD, for instance, sometimes you don’t always get the most relevant information. And so I think, you know, maybe they need to be highlighting, you know, like, sort of experts and vetted experts and make sure that there’s not disinformation flowing around. I think some of the stuff can just be, yeah, so subjective. I would love for them to have hubs for some of these different things and really, you know, integrate some of the doctors that are using these platforms already to kind of provide, you know, valuable and fact-checked information.

MARTIN: That’s Taylor Lorenz. She reports on tech and culture, and she’s currently writing a book about the world of online creators. Taylor Lorenz, thanks so much for your time.

LORENZ: Thanks for having me.

MARTIN: And let’s say this. If you or someone you know may be considering suicide, please contact the National Suicide Prevention Lifeline at 1-800-273-8255 or the Crisis Text Line by texting HOME to 741741.

(SOUNDBITE OF MUSIC) Transcript provided by NPR, Copyright NPR.



Edmonton Police Service taking on ‘complete review’ of social media after recent backlash – Globalnews.ca



The Edmonton Police Service said it will be reviewing its social media practices and accounts after a Facebook post in December drew a lot of negative attention.

During Thursday’s police commission meeting, Chief Dale McFee answered questions about a controversial post on the Edmonton Police canine unit’s Facebook page from December.

The now-deleted post detailed the arrest of a naked man who broke into a Home Depot on Christmas Eve and was caught by a police service dog. The post was widely criticized for being insensitive.

“It doesn’t reflect the values of the EPS,” McFee said.

Read more:

Edmonton police delete online post detailing K9 arrest: ‘The post was inappropriate’

The police chief, however, defended the use of the police dog to take down the suspect.

“It was a commercial break and enter of a serious violent offender on meth. Like all other canine engagements, that part will be reviewed,” he explained.

The post, however, he said was not acceptable.

“It didn’t need to be posted the way it was,” he said.

“That was not the canine handler. That was somebody different and that individual has been spoken to about his behaviour in relation to posting and I think we’ve gotten to a good outcome on that and we consider this closed.”

During Thursday’s meeting, the head of communications for EPS said it’s undergoing a “complete review” of its social media accounts.

“We’re also looking at how many accounts we have. Do we have the right accounts? We are looking at quality over quantity, making sure that they abide by our social media policy, that they have the right training,” Michael James said.

It wasn’t the first time a social media post on EPS channels received backlash.

Read more:

Edmonton Police Service gets backlash for ‘misogynistic’ Tik Tok video

In August, a video on the EPS community engagement TikTok account struck a nerve.

It showed an officer impersonating Stone Cold Steve Austin — a professional wrestler known for beer drinking.

In the video, the officer caught what was later clarified as water before driving off.

Text on the video read: “When you get a text from your wife that a guy is at home picking up your daughter for a date”.

“It doesn’t seem like they have a strong social media plan in place,” said social media strategist Brittney Le Blanc.

She said EPS online policies should have been reviewed a while ago and suggested the force focus on being informative rather than trendy.

“I don’t think that this is just to blame on one person,” Le Blanc said.

“I don’t think it just comes down to communications. I think this is a problem with the culture within EPS completely.”

It was suggested Thursday social media posts on Edmonton Police accounts should go through a vetting process — a practice EPS admits is not mandatory.

Le Blanc said that should be rule number one.

“I think you want to have that sober second opinion that can say maybe we don’t post this, maybe we don’t write this kind of content,” Le Blanc added.

© 2022 Global News, a division of Corus Entertainment Inc.
