
We Got Social Media Wrong. Can We Get AI Right?


Interactions that dehumanize us.

Disinformation that misleads us.

Algorithms that manipulate us.

These are the risks posed by the explosion in generative artificial intelligence: AI systems, built on “large language models” trained on massive amounts of pre-existing content, that generate text, images, and code, and that provide information and answers to an ever-growing range of questions.

They’re also the risks that made many people worry about social media.

What We Missed about Social Media

I wish I had worried about social media more. In 2005, my partner and I launched what would now be called a social media agency, at a time when few had even heard the term “social media.” Like a lot of people working on the nascent social web at that time, we were a lot more attuned to its potential than to its risks.

Before the advent of YouTube, Facebook, and Twitter, social media was decentralized, not very corporate, and pretty small: It felt more like a club of people exploring the way user-created content could fuel activism, community, and creativity than the next gold rush. I was so confident that this new medium was intrinsically biased towards social engagement that I used to tell companies that they would have a hard time competing with the grassroots causes and callings that drove most online participation at that time.

But I forgot about this little thing called money. It turns out that if you’re prepared to buy attention with ads and celebrity spokespeople and an endless array of contests and prizes, you can absolutely pry attention away from social advocacy and creativity and direct it towards buying stuff and reviewing stuff and even unboxing stuff on camera.

Money and Media

Once people figured out that there was money to be made with social media—and a lot of it—the dynamics changed quickly. “With digital ad revenues as their primary source of profit,” Douglas Guilbeault writes in “Digital Marketing in the Disinformation Age,” “social-media companies have designed their platforms to influence users on behalf of marketers and politicians, both foreign and domestic.”

Advertising became more sophisticated, to recover the eyeballs and attention that TV and newspapers were losing to social networks and web browsing. In turn, “digital platforms driven by ad revenue models were designed for addiction in order to perpetuate the stream of data collected from users,” as L. M. Sacasas puts it in “The Tech Backlash We Really Need.”

And content became more sensational and more polarizing and more hateful, because sensational and polarizing is what attracted the traffic and engagement that advertisers were looking for; an explosion in hate speech was the result. As Bharath Ganesh notes in “The Ungovernability of Digital Hate Culture,” “[i]n a new media culture in which anonymous entrepreneurs can reach massive audiences with little quality control, the possibilities for those vying to become digital celebrities to spread hateful, even violent, judgements with little evidence, experience, or knowledge are nearly endless.”

Most of the terrible, destructive impacts of social media stem from this core dynamic. The bite-sized velocity of social media has made it endlessly distracting and disruptive to our families, communities, relationships, and mental health. As an ad-driven, data-rich, and sensational medium, it’s ideally suited to the dissemination of misinformation and the explosion of anti-democratic manipulation. And as a space where users create most content for free, while companies control the platforms and the algorithms that determine what gets seen, it has put creators at the mercy of corporate interests and made art subservient to profits.

Where We Went Wrong

Now we’re getting ready to do it all again, only faster and with far more wide-reaching implications. As Allen and Thadani note in “Advancing Cooperative AI Governance at the 2023 G7 Summit,” “the transition to an AI future, if managed poorly, can…displace entire industries and increase socioeconomic disparity.”

We’re embracing technologies that create content so rapidly and so cheaply that even if that content is not yet quite as good as what humans might create, it will be more and more difficult for human creators to compete with machines.

We’re accepting opaque algorithms that deliver answers and “information”—in quotes, because AIs often present wholly invented “hallucinations” as facts—without much transparency about where this information came from or how the AI decided to construct its answers.

We’re sidestepping crucial questions about bias in the ways these AIs think and respond, and we’re sidestepping crucial decisions about how we deploy these AIs in ways that mitigate rather than compound existing inequalities.

How To Do AI Better

If all this makes me sound like a terrible pessimist, it’s only because I have to fight so hard against my innate fascination with emergent tech. I’m falling hard for the magic and power of AI, just like I fell hard for social media and like I fell hard for my first experiences of the web, of the internet, of the personal computer.

Those of us who are truly inspired and enchanted by the advent of new technologies are the ones who most need to rein in our enthusiasm, anticipate the risks, and learn from our past mistakes.

And there’s a lot we can learn from, because we know what we were warned about last time, what we disregarded, and how we missed the opportunities to avert the worst excesses of social media.

That begins with the companies driving this transformation. Instead of fighting regulation, AI companies could advocate for effective regulation so that they’re less tempted to sideline ethical and safety issues in order to race ahead of the competition. Some AI leaders are already signaling their support for regulation, as we saw when OpenAI’s Sam Altman appeared at a recent Senate hearing.

But we’ll still be in a dangerous position if regulators depend on the technical advice of AI executives in order to set appropriate rules, because even well-intentioned execs are going to be less than objective about regulations that constrain their potential for profit. AI is also a much more complicated, much faster-moving area to regulate; legislators who were hard-pressed to comprehend and regulate social media are unlikely to do better with AI.

That’s why, as King and Shull argue in “How Can Policy Makers Predict the Unpredictable,” “policy makers must prioritize developing a multidisciplinary network of trusted experts on whom to call regularly to identify and discuss new developments in AI technologies, many of which may not be intuitive or even yet imagined.”

It’s going to take international coordination and investment to develop a source of regulatory advice that is genuinely independent and capable of offering meaningful guidance: Think of an AI equivalent of the World Health Organization, with the expertise and resources to guide AI policy and response at a global level.

Becoming a Smarter User of AI

It’s just as crucial for ordinary folks to improve their own AI literacy and comprehension. We need to be alert to both the risks and opportunities AI poses for our own lives, and we need to be informed and effective citizens when it comes to pressing for government regulation.

Here, again, the example of social media is instructive. Social networks made massive investments in understanding how to capture, sustain, and monetize our attention. We only questioned this effort once we saw the impact it had on our mental health, our kids’ wellbeing, and the integrity of our democracies. By then, these networks were so embedded in our personal and professional lives that extracting oneself from social media imposed very real social and professional costs.

This time, let’s figure out how to be the agents who use the tools, rather than the subjects who get manipulated. We won’t get there by avoiding ChatGPT, DALL-E and the like. Avoidance only makes us more vulnerable to manipulation by artificially generated content or to replacement by AI “workers.”

Instead, we human workers and tech users need to become quickly and deeply literate in the tools and technologies that are about to transform our work, our daily lives, and our societies—so that we can meaningfully shape that path. In a delightful paradox, the AIs themselves can speed us along that path to AI literacy by acting as self-documenting guides to what’s newly possible.

How AI Helps Build Mastery

If you have yet to delve deep into the potential of generative AI, here’s one place you can start: ask an AI for some examples of how it can transform your own work.

For example, you might prompt ChatGPT with something like:

You are a productivity consultant who has been hired to support the productivity and well-being of a team of policy analysts. You have been asked to identify ten ways these policy analysts can use ChatGPT to facilitate or support their work, which includes reading news stories and academic articles, attending conferences, booking briefings, drafting briefing notes and recommendations, and writing reports. Please provide a list of ten ideas for how to use ChatGPT to support these functions.

Once ChatGPT provides you with a list of options, pick one that you’d like to try out. Then ask ChatGPT to give you step-by-step instructions on how to use it for that particular task. You can even follow up your request for step-by-step instructions with a prompt like:

You are an automation researcher. Review the previous conversation and note five risks or considerations when automating these tasks or adopting this approach.

Seeing how generative AI analyzes your own work or personal tasks, and how it can help automate them, is a great way to understand how AI works, where its limits lie, and how it might transform your own corner of the world.
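If you would rather experiment in code than in a chat window, the same exercise can be scripted. Below is a minimal sketch using OpenAI’s Python SDK, assuming you have the openai package installed and an OPENAI_API_KEY environment variable set; the model name is illustrative and the prompts are abbreviated versions of the ones above.

```python
# Minimal sketch: sending the "productivity consultant" prompt and the
# "automation researcher" follow-up through OpenAI's Python SDK.
# Assumes `pip install openai` and an OPENAI_API_KEY environment variable;
# the model name is illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

prompt = (
    "You are a productivity consultant hired to support a team of policy "
    "analysts. List ten ways they can use ChatGPT to support their work, "
    "which includes reading news stories and academic articles, attending "
    "conferences, booking briefings, drafting briefing notes and "
    "recommendations, and writing reports."
)

ideas = client.chat.completions.create(
    model="gpt-4o-mini",  # any chat-capable model works here
    messages=[{"role": "user", "content": prompt}],
)
print(ideas.choices[0].message.content)

# Follow up in the same conversation: ask for risks of automating one task.
follow_up = (
    "You are an automation researcher. Review the previous conversation and "
    "note five risks or considerations when automating these tasks."
)
risks = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "user", "content": prompt},
        {"role": "assistant", "content": ideas.choices[0].message.content},
        {"role": "user", "content": follow_up},
    ],
)
print(risks.choices[0].message.content)
```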

That understanding is what will allow you to use AI instead of getting used by it, and it’s what will allow you to participate meaningfully in the public conversation about how to shape AI, right now. And now is when we need to hear many thoughtful, informed, human voices engaging with the question of how to regulate and use AI.

Otherwise, our voices will be drowned out by the ever louder, ever more pervasive voices of our new AI companions.


 


Sutherland House Experts Book Publishing Launches To Empower Quiet Experts


Sutherland House Experts is Empowering Quiet Experts through Compelling Nonfiction in a Changing Ideas Landscape

TORONTO, ON — Almost one year after its launch, Sutherland House Experts is reshaping the publishing industry with its innovative co-publishing model for “quiet experts.” This approach, where expert authors share both costs and profits with the publisher, is bridging the gap between expertise and public discourse. Helping to drive this transformation is Neil Seeman, a renowned author, educator, and entrepreneur.

“The book publishing world is evolving rapidly,” publisher Neil Seeman explains. “There’s a growing hunger for expert voices in public dialogue, but traditional channels often fall short. Sutherland House Experts provides a platform for ‘quiet experts’ to share their knowledge with the broader book-reading audience.”

The company’s roster boasts respected thought leaders whose books are already gaining major traction:

• V. Kumar Murty, a world-renowned mathematician and past Fields Institute director, just published “The Science of Human Possibilities” under the new press. The book has been declared a 2024 “must-read” by The Next Big Ideas Club and is receiving widespread media attention across North America.

• Eldon Sprickerhoff, co-founder of cybersecurity firm eSentire, is seeing strong pre-orders for his upcoming book, “Committed: Startup Survival Tips and Uncommon Sense for First-Time Tech Founders.”

• Dr. Tony Sanfilippo, a respected cardiologist and professor of medicine at Queen’s University, is generating significant media interest with his forthcoming book, “The Doctors We Need: Imagining a New Path for Physician Recruitment, Training, and Support.”

Seeman, whose recent and acclaimed book, “Accelerated Minds,” explores the entrepreneurial mindset, brings a unique perspective to publishing. His experience as a Senior Fellow at the University of Toronto’s Institute of Health Policy, Management and Evaluation, and academic affiliations with The Fields Institute and Massey College, give him deep insight into the challenges faced by people he calls “quiet experts.”

“Our goal is to empower quiet, expert authors to become entrepreneurs of actionable ideas the world needs to hear,” Seeman states. “We are blending scholarly insight with market savvy to create accessible, impactful narratives for a global readership. Quiet experts are people with decades of experience in one or more fields who seek to translate their insights into compelling non-fiction for the world.”

This fall, Seeman is taking his insights to the classroom. He will teach the new course, “The Writer as Entrepreneur,” at the University of Toronto, offering aspiring authors practical tools to navigate the evolving book publishing landscape. To enroll in this new weekly night course starting Tuesday, October 1st, visit:
https://learn.utoronto.ca/programs-courses/courses/4121-writer-entrepreneur

“The entrepreneurial ideas industry is changing rapidly,” Seeman notes. “Authors need new skills to thrive in this dynamic environment. My course and our publishing model provide those tools.”

About Neil Seeman:
Neil Seeman is co-founder and publisher of Sutherland House Experts, an author, educator, entrepreneur, and mental health advocate. He holds appointments at the University of Toronto, The Fields Institute, and Massey College. His work spans entrepreneurship, public health, and innovative publishing models.

Follow Neil Seeman:
https://www.neilseeman.com/
https://www.linkedin.com/in/seeman/

Follow Sutherland House Experts:

https://sutherlandhouseexperts.com/
https://www.instagram.com/sutherlandhouseexperts/

Media Inquiries:
Sasha Stoltz | Sasha@sashastoltzpublicity.com | 416.579.4804
https://www.sashastoltzpublicity.com


What to stream this weekend: ‘Civil War,’ Snow Patrol, ‘How to Die Alone,’ ‘Tulsa King’ and ‘Uglies’


Hallmark launching a streaming service with two new original series, and Bill Skarsgård out for revenge in “Boy Kills World” are some of the new television, films, music and games headed to a device near you.

Also among the streaming offerings worth your time as selected by The Associated Press’ entertainment journalists: Alex Garland’s “Civil War” starring Kirsten Dunst, Natasha Rothwell’s heartfelt comedy for Hulu called “How to Die Alone” and the second season of Sylvester Stallone’s “Tulsa King.”

NEW MOVIES TO STREAM SEPT. 9-15

Alex Garland’s “Civil War” is finally making its debut on MAX on Friday. The film stars Kirsten Dunst as a veteran photojournalist covering a violent war that’s divided America. She reluctantly allows an aspiring photographer, played by Cailee Spaeny, to tag along as she, an editor (Stephen McKinley Henderson) and a reporter (Wagner Moura) make the dangerous journey to Washington, D.C., to interview the president (Nick Offerman), a blustery, rising despot who has given himself a third term, taken to attacking his citizens and shut himself off from the press. In my review, I called it a bellowing and haunting experience, smart and thought-provoking with great performances. It’s well worth a watch.

— Joey King stars in Netflix’s adaptation of Scott Westerfeld’s “Uglies,” about a future society in which everyone is required to have beautifying cosmetic surgery at age 16. McG directed the film, streaming on Friday, in which King’s character inadvertently finds herself in the midst of an uprising against the status quo. “Outer Banks” star Chase Stokes plays King’s best friend.

— Bill Skarsgård is out for revenge against the woman (Famke Janssen) who killed his family in “Boy Kills World,” coming to Hulu on Friday. Moritz Mohr directed the ultra-violent film, of which Variety critic Owen Gleiberman wrote: “It’s a depraved vision, yet I got caught up in its kick-ass revenge-horror pizzazz, its disreputable commitment to what it was doing.”

AP Film Writer Lindsey Bahr

NEW MUSIC TO STREAM SEPT. 9-15

— The year was 2006. Snow Patrol, the Northern Irish-Scottish alternative rock band, released an album, “Eyes Open,” producing the biggest hit of their career: “Chasing Cars.” A lot has happened in the time since — three, soon to be four quality full-length albums, to be exact. On Friday, the band will release “The Forest Is the Path,” their first new album in seven years. Anthemic pop-rock is the name of the game across songs of love and loss, like “All,” “The Beginning” and “This Is the Sound Of Your Voice.”

— For fans of raucous guitar music, Jordan Peele’s 2022 sci-fi thriller, “NOPE,” provided a surprising, if tiny, thrill. One of the leads, Emerald “Em” Haywood, portrayed by Keke Palmer, rocks a Jesus Lizard shirt. (Also featured throughout the film: Rage Against the Machine, Wipers, Mr. Bungle, Butthole Surfers and Earth band shirts.) The Austin noise rock band is a less-than-obvious pick, having been signed to the legendary Touch and Go Records and having stopped releasing new albums in 1998. That changes on Friday the 13th, when “Rack” arrives. And for those curious: The Jesus Lizard’s intensity never went away.

AP Music Writer Maria Sherman

NEW SHOWS TO STREAM SEPT. 9-15

— Hallmark launched a streaming service called Hallmark+ on Tuesday with two new original series, the scripted drama “The Chicken Sisters” and unscripted series “Celebrations with Lacey Chabert.” If you’re a Hallmark holiday movies fan, you know Chabert. She’s starred in more than 30 of their films and many are holiday themed. Off camera, Chabert has a passion for throwing parties and entertaining. In “Celebrations,” deserving people are surprised with a bash in their honor — planned with Chabert’s help. “The Chicken Sisters” stars Schuyler Fisk, Wendie Malick and Lea Thompson in a show about employees at rival chicken restaurants in a small town. The eight-episode series is based on a novel of the same name.

— Natasha Rothwell of “Insecure” and “The White Lotus” fame created and stars in a new heartfelt comedy for Hulu called “How to Die Alone.” She plays Mel, a broke, go-along-to-get-along single airport employee who, after a near-death experience, makes the conscious decision to take risks and pursue her dreams. Rothwell has been working on the series for the past eight years and described it to The AP as “the most vulnerable piece of art I’ve ever put into the world.” Like Mel, Rothwell had to learn to bet on herself to make the show she wanted to make. “In the Venn diagram of me and Mel, there’s significant overlap,” said Rothwell. It premieres Friday on Hulu.

— Shailene Woodley, DeWanda Wise and Betty Gilpin star in a new drama for Starz called “Three Women,” about entrepreneur Sloane, homemaker Lina and student Maggie, who are each stepping into their power and making life-changing decisions. They’re interviewed by a writer named Gia (Woodley). The series is based on a 2019 best-selling book of the same name by Lisa Taddeo. “Three Women” premieres Friday on Starz.

— The second season of Sylvester Stallone’s “Tulsa King” debuts Sunday on Paramount+. Stallone plays Dwight Manfredi, a mafia boss who was recently released from prison after serving 25 years. He’s sent to Tulsa to set up a new crime syndicate. The series is created by Taylor Sheridan of “Yellowstone” fame.

Alicia Rancilio

NEW VIDEO GAMES TO PLAY

— One thing about the title of Focus Entertainment’s Warhammer 40,000: Space Marine 2 — you know exactly what you’re in for. You are Demetrian Titus, a genetically enhanced brute sent into battle against the Tyranids, an insectoid species with an insatiable craving for human flesh. You have a rocket-powered suit of armor and an arsenal of ridiculous weapons like the “Chainsword,” the “Thunderhammer” and the “Melta Rifle,” so what could go wrong? Besides the squishy single-player mode, there are cooperative missions and six-vs.-six free-for-alls. You can suit up now on PlayStation 5, Xbox X/S or PC.

— Likewise, Wild Bastards isn’t exactly the kind of title that’s going to attract fans of, say, Animal Crossing. It’s another sci-fi shooter, but the protagonists are a gang of 13 varmints — aliens and androids included — who are on the run from the law. Each outlaw has a distinctive set of weapons and special powers: Sarge, for example, is a robot with horse genes, while Billy the Squid is … well, you get the idea. Australian studio Blue Manchu developed the 2019 cult hit Void Bastards, and this Wild-West-in-space spinoff has the same snarky humor and vibrant, neon-drenched cartoon look. Saddle up on PlayStation 5, Xbox X/S, Nintendo Switch or PC.

Lou Kesten


Trump could cash out his DJT stock within weeks. Here’s what happens if he sells


Former President Donald Trump is on the brink of a significant financial decision that could have far-reaching implications for both his personal wealth and the future of his fledgling social media company, Trump Media & Technology Group (TMTG). As the lockup period on his shares in TMTG, which owns Truth Social, nears its end, Trump could soon be free to sell his substantial stake in the company. However, the potential payday, which makes up a large portion of his net worth, comes with considerable risks for Trump and his supporters.

Trump’s stake in TMTG comprises nearly 59% of the company, amounting to 114,750,000 shares. As of now, this holding is valued at approximately $2.6 billion. These shares are currently under a lockup agreement, a common feature of initial public offerings (IPOs), designed to prevent company insiders from immediately selling their shares and potentially destabilizing the stock. The lockup, which began after TMTG’s merger with a special purpose acquisition company (SPAC), is set to expire on September 25, though it could end earlier if certain conditions are met.

Should Trump decide to sell his shares after the lockup expires, the market could respond in unpredictable ways. The sale of a substantial number of shares by a major stakeholder like Trump could flood the market, potentially driving down the stock price. Daniel Bradley, a finance professor at the University of South Florida, suggests that the market might react negatively to such a large sale, particularly if there aren’t enough buyers to absorb the supply. This could lead to a sharp decline in the stock’s value, impacting both Trump’s personal wealth and the company’s market standing.

Moreover, Trump’s involvement in Truth Social has been a key driver of investor interest. The platform, marketed as a free speech alternative to mainstream social media, has attracted a loyal user base largely due to Trump’s presence. If Trump were to sell his stake, it might signal a lack of confidence in the company, potentially shaking investor confidence and further depressing the stock price.

Trump’s decision is also influenced by his ongoing legal battles, which have already cost him over $100 million in legal fees. Selling his shares could provide a significant financial boost, helping him cover these mounting expenses. However, this move could also have political ramifications, especially as he continues his bid for the Republican nomination in the 2024 presidential race.

Trump Media’s success is closely tied to Trump’s political fortunes. The company’s stock has shown volatility in response to developments in the presidential race, with Trump’s chances of winning having a direct impact on the stock’s value. If Trump sells his stake, it could be interpreted as a lack of confidence in his own political future, potentially undermining both his campaign and the company’s prospects.

Truth Social, the flagship product of TMTG, has faced challenges in generating traffic and advertising revenue, especially compared to established social media giants like X (formerly Twitter) and Facebook. Despite this, the company’s valuation has remained high, fueled by investor speculation on Trump’s political future. If Trump remains in the race and manages to secure the presidency, the value of his shares could increase. Conversely, any missteps on the campaign trail could have the opposite effect, further destabilizing the stock.

As the lockup period comes to an end, Trump faces a critical decision that could shape the future of both his personal finances and Truth Social. Whether he chooses to hold onto his shares or cash out, the outcome will likely have significant consequences for the company, its investors, and Trump’s political aspirations.
