

Ideon Media Acquires Cue Digital Media, Creating One of the Largest Independent Digital Media Companies in Canada – Yahoo Finance

TipRanks

3 Stocks Trading at Rock-Bottom Prices; Analysts Say ‘Buy’

A new year, a new addition to the stock portfolio – what can make more sense than that? The right time to buy, of course, is when stocks are priced at the bottom. Buying low and selling high may be a bit hackneyed, but it’s true, and truth has staying power.

But the markets are up. The NASDAQ rose 43% in 2020, and the S&P 500 showed a gain of 16%. With a market environment like that, finding stocks that are caught in the doldrums is harder than it looks. That’s where the Wall Street pros can lend a hand.

We used TipRanks’ database to pinpoint three stocks that fit a profile: a share price that has dropped over 30% in the last 12 months, but with at least double-digit upside potential, according to analysts. Not to mention each has earned a Moderate or Strong Buy consensus rating.

Esperion (ESPR)

We will start with Esperion, a company that specializes in therapies for the treatment of elevated low-density lipoprotein cholesterol levels – a major factor contributing to heart disease. The company’s main product, bempedoic acid, is now available in tablet form under the brand names Nexletol and Nexlizet.

In February 2020, both Nexletol and Nexlizet were approved as oral treatments to lower LDL-C. Bempedoic acid remains in clinical trials assessing its efficacy in risk reduction for cardiovascular disease. The trial, called CLEAR Outcomes, is a large-scale, long-term study tracking more than 14,000 patients, with top-line data expected in the second half of 2022. The study covers 1,400 locations in 32 countries around the world.

Esperion shares peaked last February, after the FDA approvals, but since then the stock has declined. Shares are down 65% since their peak. Along with the drop in share value, the company showed a fall in revenue from Q2 to Q3, with the top line collapsing from $212 million to $3.8 million. Since the Q3 report, Esperion announced pricing on a $250 million offering of senior subordinated notes, at 4%, due in 2025.
The offering gives the company a boost in available capital for further work on its development pipeline and its marketing efforts for bempedoic acid.

Chad Messer, covering ESPR for Needham, sees the note offering as a net positive for Esperion. “We believe this cash position will be sufficient to support Esperion through 2021 and to profitability in 2022… We believe this financing should help put to rest concerns regarding Esperion’s balance sheet. Despite a challenging launch for NEXLETOL and NEXLIZET, product growth has continued in 3Q against the backdrop of a contracting LDL-C market. This growth trajectory suggests potential for a rapid acceleration when conditions improve,” Messer wrote.

To this end, Messer rates ESPR shares a Strong Buy, and his price target, at $158, suggests the stock has room for huge growth this year – up to 481% from current levels. (To watch Messer’s track record, click here)

Overall, Esperion has 6 recent reviews on record, with a breakdown of 5 Buys and 1 Hold to give the stock a Strong Buy rating from the analyst consensus. The shares, trading at $27.16, have an average price target of $63.33, implying a one-year upside of 133%. (See ESPR stock analysis on TipRanks)

Intercept Pharma (ICPT)

Liver disease is a serious health threat, and Intercept Pharma is focused on developing treatments for some of the more dangerous chronic liver conditions, including nonalcoholic steatohepatitis (NASH) and primary biliary cholangitis (PBC). Intercept has a research pipeline based on FXR, a regulator of bile acid pathways in the hepatic system.

FXR’s action affects not just the bile acid metabolism, but also the glucose and lipid metabolisms, and inflammation and fibrosis around the liver. The lead compound, obeticholic acid (OCA), is an analog of the bile acid CDCA, and as such can take a role in the FXR pathways and receptors implicated in chronic liver disease.
Treating liver disease through the FXR biology has direct applications for PBC, and is showing promise in treating complications from NASH.

ICPT shares dropped sharply last summer, when the FDA rejected the company’s application to approve OCA for treatment of NASH-related liver fibrosis. This delays the drug’s potential entry to a lucrative market; there is no current treatment for NASH, and the first drug to win approval will have the lead in reaching a market estimated at $2 billion to $5 billion in potential annual sales. The effect on the stock is still felt, and ICPT remains at its 52-week low point.

In reaction, in December of 2020, Intercept announced major changes in top-level management, as CEO and President Mark Pruzanski announced he’s stepping down effective January 1 of this year. He is succeeded by Jerome Durso, formerly the company’s COO, who will also take a post on the Board of Directors. Pruzanski will remain as an advisor, and will hold a director’s position on the company’s Board.

Piper Sandler analyst Yasmeen Rahimi takes a deep dive into Intercept’s continuing efforts to expand applications of OCA and to resubmit its New Drug Application to the FDA. She sees the leadership transition as part of these efforts, and writes, “[We] believe that Dr. Pruzanski’s dedication to transform the liver space is still strong, and that he will continue to guide ICPT’s progress as an advisor and Board member. Additionally, we have had the pleasure of working closely with Jerry Durso and believe that he will transform the company and lead ICPT’s success in growing the PBC market and the path to potential approval and commercial launch of OCA in NASH.”

Rahimi takes a long-term bullish stance on ICPT, giving the stock an Overweight (i.e. Buy) rating and an $82 price target. This figure indicates an impressive 220% upside for the next 12 months. (To watch Rahimi’s track record, click here)

Wall Street is somewhat more divided on the drug maker.
ICPT’s Moderate Buy consensus rating is based on 17 reviews, including 8 Buys and 9 Holds. Shares are priced at $25.82, and the average price target of $59.19 suggests an upside potential of 132% for the next 12 months. (See ICPT stock analysis on TipRanks)

Gilead Sciences (GILD)

Gilead has had a year like a firework – fast up and fast down. The gains came in 1H20, when it appeared that the company’s antiviral drug remdesivir would become a prime treatment for COVID-19. By November, however, even though remdesivir had been approved, the World Health Organization (WHO) was recommending against its use, and the COVID vaccines now on the market have made remdesivir irrelevant to the pandemic.

This was only one of Gilead’s recent headwinds. The company has been working, in conjunction with Galapagos (GLPG), on development of filgotinib as a treatment for rheumatoid arthritis. While the drug received EU and Japanese approval in September 2020, the FDA has withheld approval, and Gilead announced in December that it was suspending US development efforts on the drug.

Even so, Gilead retains a diverse and active research pipeline, with over 70 research candidates at varying stages of the development and approval process for a wide range of diseases and conditions, including HIV/AIDS, inflammatory and respiratory diseases, cardiovascular disease, and hematology/oncology.

On a positive note, Gilead posted Q3 earnings above estimates, with top-line revenue of $6.58 billion beating the forecast by 6% and growing 17% year-over-year. The company raised its full-year 2020 product sales guidance from $23 billion to $23.5 billion.

Among the bulls is Oppenheimer analyst Hartaj Singh, who gives GILD shares an Outperform (i.e. Buy) rating and a $100 price target. Investors stand to pocket a 69% gain should the analyst’s thesis play out.
(To watch Singh’s track record, click here)

Backing his stance, Singh writes, “We continue to believe in our thesis of (1) a dependable remdesivir/other medicines business against SARS-CoV flares, (2) a base business (HIV/oncology/HCV) growing low-single digits over the next couple of years, (3) operating leverage providing greater earnings growth, and (4) a 3-4% dividend yield.”

What does the rest of the Street think? Looking at the consensus breakdown, opinions from other analysts are more spread out. 10 Buys, 12 Holds and 1 Sell add up to a Moderate Buy consensus. In addition, the $73.94 average price target indicates 25% upside potential from current levels. (See GILD stock analysis on TipRanks)

To find good ideas for beaten-down stocks trading at attractive valuations, visit TipRanks’ Best Stocks to Buy, a newly launched tool that unites all of TipRanks’ equity insights.

Disclaimer: The opinions expressed in this article are solely those of the featured analysts. The content is intended to be used for informational purposes only. It is very important to do your own analysis before making any investment.
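The upside percentages quoted for all three stocks are the same simple calculation: the gain implied by moving from the current share price to the analyst price target. A minimal sketch, using the ESPR numbers the article quotes ($27.16 share price, $63.33 average target):

```python
def implied_upside(current_price, price_target):
    """Percent gain implied if a stock moves from its current
    price to the analyst price target."""
    return (price_target / current_price - 1) * 100

# ESPR: $63.33 average target vs. $27.16 share price -> ~133% upside
print(round(implied_upside(27.16, 63.33)))
```

The same ratio reproduces the other figures in the piece, e.g. Messer’s $158 target works out to roughly the 481% quoted.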



Social media companies should face new legal duty to 'act responsibly,' expert panel finds – The Tri-City News

Social media companies can’t be trusted to moderate themselves, so it falls to the government to enforce new restrictions to protect Canadians from harmful content online, according to a report currently under review by the federal heritage minister.

The Canadian Commission on Democratic Expression, an expert panel of seven members, including former chief justice Beverley McLachlin, said it had become difficult to ignore the fact that too many real-world manifestations of online interactions are turning violent, destructive or hateful, despite social media’s parallel role in empowering positive social movements.

The panellists were particularly struck by the role they saw social media play last fall in “sowing distrust” in the aftermath of the U.S. presidential election, culminating in the lethal invasion of the U.S. Capitol. And they found, with the Quebec mosque shooting, the Toronto van attack and the armed invasion of Rideau Hall, that “Canada is not immune.”

“We recognize the charter, we recognize the ability of people to express themselves freely,” said Jean La Rose, former chief executive officer of the Aboriginal Peoples Television Network (APTN) and one of the seven commissioners, in an interview.

“But there must be limits at one point. There has to be limits as to where free speech becomes a racist discourse, or a hurtful discourse, or a hateful discourse.”

‘We have been at the receiving end of racist threats’

These limits would come in the form of a new law passed by Parliament, the commission recommended, that would force social media platforms like Twitter and Facebook, search engines like Google and its video-sharing site YouTube and others to adhere to a new “duty to act responsibly.”

The panel purposefully did not spell out what responsible behaviour should look like. Instead, it said this determination should be left to the government — as well as a new regulator that would oversee a code of conduct for the industry and a new “social media council” that would bring together the platforms with civil society and other groups.

La Rose said his experience in the journalism world demonstrated how there needed to be reasonable limits on what people can freely express so they are not permitted to call for the killings of Muslims, for example, or encourage violence against an individual by posting their home address or other personal details online.

“Having worked in media, having worked at APTN, for example, we have been at the receiving end of racist threats, of severe injury to our people, our reporters and others because of the view we present of the situation of the Indigenous community in Canada,” he said.

“Literally, we’ve had some reporters run off the road when they were covering a story because people were trying to block the telling of that story. So as a news entity, we have seen how far sometimes misinformation, hate and hurtful comments can go.”

Rules must reflect issue’s ‘inherent complexity’: Google

Canadian Heritage Minister Steven Guilbeault has himself recently indicated that legislation to address “online hate” will be introduced “very soon.”

The minister has pointed to the popularity of such a move: a recent survey by the Canadian Race Relations Foundation (CRRF), for example, found that fully four-fifths of Canadians are on board with forcing social media companies to rapidly take down hateful content.

“Canadians are now asking their government to hold social media companies accountable for the content that appears on their platforms,” Guilbeault said after the CRRF survey was published.

“This is exactly what we intend to do, by introducing new regulations that will require online platforms to remove illegal and hateful content before they cause more harm and damage.”

Guilbeault has met with the commission to discuss their recommendations and is currently reviewing their report, press secretary Camille Gagné-Raynauld confirmed.

Representatives from Facebook Canada and Twitter Canada were among several people who provided witness testimony and participated in commission deliberations, the report said. Twitter declined comment to Canada’s National Observer.

“We haven’t reviewed the full report yet, so we can’t comment on the specific recommendations,” said Kevin Chan, global director and head of public policy for Facebook Canada. “We have community standards that govern what is and isn’t allowed on our platform, and in most cases those standards go well beyond what’s required by law.”

Chan also said Facebook agreed regulators should make “clear rules for the internet” so private companies aren’t left to make decisions themselves.

Google spokesperson Lauren Skelly said the company shares Canadians’ concerns about harmful content online and said YouTube takes its responsibility to remove content that violates its policies “extremely seriously.” She said the company has significantly ramped up daily removals of hate speech and removed millions of videos last quarter for violations.

“Any regulation needs to reflect the inherent complexity of the issue and the scale at which online platforms operate,” said Skelly. “We look forward to continuing our work with the government and local partners on addressing the spread of online hate to ensure a safer and open internet that works for all Canadians.”

Incentives ‘not aligned with the public interest’: Jaffer

The nine-month study by the commission, an initiative led by the Public Policy Forum, found that with everything from disinformation campaigns to conspiracy theories, hate speech and people targeted for harm, toxic content was being “amplified” by the actions of social media companies.

The study rejected the notion that social media platforms are “neutral disseminators of information,” finding instead that they curate content to serve their own commercial interests.

“The business model of some of the major social media companies involves keeping people engaged with their platforms as much as possible. And it turns out that keeping people engaged means feeding them sensational content because that’s what keeps people clicking,” said Jameel Jaffer, executive director of the Knight First Amendment Institute at Columbia University and another commissioner.

“The incentives for social media companies are not aligned with the public interest. These are private companies whose obligation is to make money for their shareholders.”

The commission also proposed a tribunal to deal with dispute resolutions quickly, as well as a “transparency regime” that would require social media companies to make certain information available to the regulator, including the “algorithmic architecture used to identify problematic content.”

Jaffer wrote a “concurring statement” in the report, where he confessed it was difficult to endorse the commission’s proposed “duty to act responsibly” without going further to define how that duty will work in reality. He said defining it will require “difficult tradeoffs” between free speech, privacy and other issues.

Carl Meyer / Local Journalism Initiative / Canada’s National Observer




What the Capitol Riot Data Download Shows about Social Media Vulnerabilities – Scientific American

Pro-Trump protesters at the Capitol used their phones to record and post photos and videos on social media. Credit: Lev Radin Getty Images

During the January 6 assault on the Capitol Building in Washington, D.C., rioters posted photographs and videos of their rampage on social media. The platforms they used ranged from mainstream sites such as Facebook to niche ones such as Parler—a social networking service popular with right-wing groups. Once they realized this documentation could get them in trouble, many started deleting their posts. But Internet sleuths had already begun downloading the potentially incriminating material. One researcher, who publicly identifies herself only by the Twitter handle @donk_enby, led an effort that she claims downloaded and archived more than 99 percent of all data posted to Parler before Amazon Web Services stopped hosting the platform. Scientific American repeatedly e-mailed Parler’s media team for comment but had not received a response at the time of publication.

Amateur and federal investigators can extract a lot of information from this massive trove, including the locations and identities of Parler users. Although many of those studying the Parler data are law enforcement officials looking into the Capitol insurrection, the situation provides a vivid example of the way social media posts—whether extreme or innocuous—can inadvertently reveal much more information than intended. And vulnerabilities that are legitimately used by investigators can be just as easily exploited by bad actors.

To learn more about this issue, Scientific American spoke with Rachel Tobac, an ethical hacker and CEO of SocialProof Security, an organization that helps companies spot potential vulnerabilities to cyberattacks. “The people that most people are talking about when they think of a hacker, those are criminals,” she says. “In the hacker community, we’re trying to help people understand that hackers are helpers. We’re the people who are trying to keep you safe.” To that end, Tobac also explained how even tame posts on mainstream social media sites could reveal more personal information than many users expect—and how they can protect themselves.

[An edited transcript of the interview follows.]

How was it possible to download so much data from Parler?

Folks were able to download and archive the majority of Parler’s content … through automated site scraping. [Parler] ordered their posts by number in the URL itself, so anyone with any programming knowledge could just download all of the public content. This is a fundamental security vulnerability. We call this an insecure direct object reference, or IDOR: the Parler posts were listed one after another, so if you just add “1” to the [number in the] URL, you could then scrape the next post, and so on. This specific type of vulnerability would not be found in mainstream social media sites such as Facebook or Twitter. For instance, Twitter randomizes the URLs of posts and requires authentication to even work with those randomized URLs. This [IDOR vulnerability]—coupled with a lack of authentication required to look at each post and a lack of rate limiting (rate limiting basically means the number of requests that you can make to pull data)—means that even an easy program could allow a person to scrape every post, every photo, every video, all the metadata on the Web site.
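The enumeration Tobac describes needs almost no code. If each post’s URL ends in a sequential integer (the URL pattern below is hypothetical, used only to mirror the IDOR she describes), a scraper simply counts upward:

```python
def sequential_post_urls(base_url, first_id, count):
    """Yield candidate post URLs by incrementing a numeric ID --
    the enumerable pattern behind an insecure direct object
    reference (IDOR). A scraper fetches each URL in turn; with no
    authentication and no rate limiting, nothing stops it from
    walking the entire site."""
    for post_id in range(first_id, first_id + count):
        yield f"{base_url}/{post_id}"

for url in sequential_post_urls("https://example.com/posts", 1, 3):
    print(url)
```

Randomized, unguessable post identifiers plus required authentication, as Tobac notes mainstream platforms use, defeat exactly this loop.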

What makes the archived data so revealing?

The images and videos still contained GPS metadata when they went online, which means that anyone can now map the detailed GPS locations of all the users who posted. This is because our smartphone logs the GPS coordinates and other data, such as the lens and the timing of the photo and video. We call this EXIF data—we can turn this off on our phones, but many people just don’t know to turn that off. And so they leave it embedded within the files that they upload, such as a video or a photo, and they unknowingly disclose information about their location. Folks on the Internet, law enforcement, the FBI can use this information to determine where those specific users live, work, spend time—or where they were when they posted that content.
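To illustrate how location falls out of that metadata: EXIF stores GPS coordinates as degree/minute/second values plus a hemisphere reference, and converting them to a map-ready decimal coordinate is one line of arithmetic. This sketch assumes the values have already been read out of a file (the tag parsing itself is library-specific):

```python
def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style GPS degrees/minutes/seconds plus an
    'N'/'S'/'E'/'W' hemisphere reference into a signed decimal
    coordinate; south and west are negative."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if ref in ("S", "W") else value

# e.g. 38 deg 53' 23.0" N -> 38.8897 decimal degrees
print(round(dms_to_decimal(38, 53, 23.0, "N"), 4))
```

A pair of such values per photo is all an investigator needs to drop every post onto a map.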

Can investigators extract similar information from posts on more mainstream platforms?

This EXIF data is scrubbed on platforms such as Facebook and Twitter, but we still have a lot of people who don’t realize how much they’re compromising their location and information about themselves when they’re posting. Even if Parler did scrub the EXIF data, we saw on a lot of posts during this event that people were geolocation tagging their Instagram Stories to the Capitol Building that day or broadcasting their actions on Facebook Live publicly and tagging where they were located. I think it’s a general lack of understanding or maybe not realizing just how much data they’re leaking. And I think plenty of folks also didn’t realize that maybe they wouldn’t want to geolocation tag during that event.

Under more normal circumstances, is there a problem with geolocation tagging?

Many people think, “Well, I’m not doing anything wrong, so why would I care if I post a photo?” But let’s just take a really innocuous example, such as going on vacation. [If] you geolocation tag the hotel, what could I do as an attacker? Well, the obvious thing is: you’re not home. But I feel like most people get that. What they don’t probably get is that I can social engineer: I can gain access to information about you through human systems at that hotel. I could call up your hotel pretending to be you and gain information about your travel plans. I could steal your hotel points. I could change your room. I could do all this nefarious stuff. We can do so much and really manipulate because our service providers don’t authenticate the way that I would recommend that they authenticate over the phone. Can you imagine if you could log into your Gmail account, your calendar or something like that by just using your current address, your last name and your phone number? But that’s how it works with a lot of these different companies. They don’t use the same authentication protocols that they would use, say, on a Web site.

How can people protect themselves?

I don’t think it would be fair to tell people that they couldn’t post. I post on Twitter multiple times a day! Instead of saying, “You can’t do this,” I would recommend being what I call “politely paranoid” about what we post online. For instance, we can post about the vacation, but we don’t want location- or service-provider-identifying markers within the post. So how about you post a picture of the sunset and the margarita but don’t geolocation tag the hotel? These very small changes can help folks protect their privacy and safety in the long run while still getting everything that they want out of social media. If you really want a geolocation tag, you can save the city that you’re in rather than the hotel: [then] I can’t call up the city and try and get access to your hotel points or change your plans.

Should social media sites just prevent geolocation tagging? What responsibilities do platforms have to protect their users?

I think it’s really important that all platforms, including social media platforms, follow best practices regarding security and privacy to keep their users safe. It’s also a best practice to scrub metadata for your users before they post their photos or videos so they don’t unknowingly compromise themselves. All of that is the platform’s responsibility; we have to hold them to that [and] make sure that they do those things. After that, I would say individuals get to choose how much risk they would like to take. I work hard to ensure nonsecurity folks understand risks: things such as geolocation tagging, [mentioning] service providers [and] taking pictures of their license, credit cards, gift cards, passports, airplane tickets—now we’re seeing COVID-19 vaccination cards with sensitive data on them. I don’t think it’s the social media company’s responsibility, for instance, to dictate what somebody can or cannot post when it comes to their travel photos. I think that’s up to the user to decide how they would like to use that platform. And I think it’s up to us as [information security] professionals to clearly communicate what those risks are so people can make an informed decision.

