The Islamabad High Court (IHC) has raised questions over the new social media rules imposed by the Pakistan government, the media reported on Saturday.
The court on Friday heard a petition filed by the Pakistan Bar Council (PBC) against the recently approved regulations titled ‘Removal and Blocking of Unlawful Online Content (Procedure, Oversight and Safeguards) Rules 2020’, The Express Tribune reported.
The PBC contended that the rules violated rights guaranteed by the Constitution.
IHC Chief Justice Athar Minallah expressed his displeasure, saying that if the new rules discouraged criticism, it would “discourage accountability”.
“Criticism is very important for democracy,” he said, adding that the Pakistan Telecommunication Authority (PTA) should encourage it instead.
The IHC later issued directions to the PTA to take into account the objections raised by PBC and satisfy the court at the next hearing on December 18.
The government approved the social media rules last month despite criticism from human rights activists and organisations.
One of the new rules requires a social media company to remove, suspend or disable access to any online content within 24 hours, or within six hours in emergency situations, of being notified by the authority.
Amnesty International extends deadline for 26th annual Media Awards – Canada NewsWire
OTTAWA, ON, Jan. 15, 2021 /CNW/ – Canadian journalists have an extra week to submit their stories to Amnesty International’s Media Awards in Canada, the human rights organization announced today.
The English-speaking branch of Amnesty International’s Canadian section will now accept submissions up to 11:59 p.m. EST on Jan. 22, 2021.
These awards honour outstanding reporting on human rights issues by journalists in Canada and Canadian journalists abroad, while also increasing awareness and understanding of human rights issues for all in Canada.
If you are a Canadian journalist or working as a journalist in Canada, we invite you to review the judging criteria below and submit your 2020 human rights stories with the link provided. We look forward to hearing from you.
All entries must be published or broadcast in Canada between Jan. 1, 2020 and Dec. 31, 2020. Unfortunately, we can only accept English submissions at this time.
Categories for 2020-2021:
Written News: A written story of 2,000 words or fewer on a current or breaking news story relating to a human rights issue.
Written Feature: A written story of more than 2,000 words on a human rights issue. Investigative pieces and multi-part series are also welcome.
Short-Form Video: A filmed news story of no longer than 10 minutes relating to a human rights issue.
Long-Form Video: A documentary or film relating to a human rights issue with a runtime of more than 10 minutes.
Audio News: A radio or podcast news story highlighting a human rights issue with a maximum runtime of 35 minutes.
Long-Form Audio: A radio or podcast feature, or series, highlighting a human rights issue with a maximum runtime of 70 minutes. *If submitting a series, please select 2-3 examples to highlight the series. The total runtime of the selected works must not exceed 70 minutes.
Mixed Media: A combination of at least two of the abovementioned elements: text, video and audio.
Post-Secondary Youth Award: A text, audio, video or mixed media story about a human rights issue created by a student attending a post-secondary school in Canada. The piece must be published or broadcast in a school publication.
Secondary Youth Award: A text, audio, video or mixed media story about a human rights issue created by a student attending a secondary school in Canada. The piece must be published or broadcast in a school publication.
Please complete the electronic form, answer all the required questions and ensure you have URLs for your media work.
The Amnesty International Media Awards winners will be announced in late February or early March 2021. Due to the ongoing pandemic, we are opting to host the awards ceremony online again. The virtual ceremony will be held in May 2021, with an exact date to be determined.
1. Is there a human rights issue at the heart of this story? This is a yes-or-no question; if no, do not go any further, and no points are awarded.
2. Does it advance the voice and agency of individuals or communities whose experience is at the centre of the story? Maximum 10 points.
3. Is the story told in ways that advance and promote diversity and equity, and avoid maintaining stereotypes or narratives that are racist, oppressive, sexist or otherwise discriminatory? Maximum 10 points.
4. Is there a solution suggested or being worked on by different stakeholders? Or does the story simply point out the abuse or violation without going further to suggest what needs to change? Maximum 10 points.
5. How much research and enterprise reporting was involved in the story? Maximum 10 points.
6. What is the level of professionalism of the story? i.e. Is it accurate, fair, and well-written? Maximum 10 points.
7. What is the impact of the story? Has it resulted in a change to law or policy? Has it positively impacted the lives of those who are at the centre of the story? Maximum 10 points.
SOURCE Amnesty International
For further information: Lucy Scholey, Media Relations, Amnesty International Canada, 613-853-2142, [email protected]
Media survey request becomes discussion topic for local councils – OrilliaMatters
A move by MidlandToday to tap the mood of local elected officials has brought a simple survey request to the council debate floor.
Over the past month, Community Editor Andrew Philips sent along a short survey to all council members in North Simcoe. The survey became a topic of discussion at Penetanguishene’s council meeting Wednesday.
The question: Should councillors be given the freedom to respond to it on their own or should staff craft an answer with council input?
“I have had discussions through the mayor’s teleconference with neighbouring municipalities and it’s kind of a mix,” said Mayor Doug Leroux. “We all know that Midland has been doing it individually. Tiny Township has decided they don’t want to do it individually. They want to send in a response as council as a whole.
“In Tay’s case, they were hoping to do the same,” he added, “but Mayor Ted Walker informed me that a couple of them wished to do their own and then he was also told that there were two of them that had no intentions of responding at all.”
Leroux said he was bringing it up at council to ask council members how they wanted to proceed.
“If (you) want to do a combined one or if (you) want to do one on your own, or there might be those that don’t want to do one at all,” he said. “It’s up to each councillor what their wish and desire is.”
Coun. Brian Cummings was the first one to speak up.
“These questions are personal opinion, personal political views and I don’t believe staff really should be providing answers that members of council are providing,” he said. “I think we all have our own little ideas of what’s going on, even though we are united with our strategic plan, there are some things that may come up. I believe we should be free to answer on our own if we choose to.”
Deputy Mayor Anita Dubeau and councillors Debbie Levy and Michel Mayotte showed support for what Cummings said.
“I will be doing my own, thank you very much,” said Levy.
Leroux said this was exactly why he asked the question.
“If you all want to, you can just go ahead and proceed, all do your own,” he said. “If there are those that don’t want to do one, then you’re free to do that as well.”
Leroux did add a clarification: “It’s not staff’s ideas or recommendations. It’s input from members of council that staff would prepare.”
The survey Philips sent out to council members in the four municipalities is a common year-end practice in newsrooms. It gives council members a chance to reflect on their year in office and focus on the gains and losses. It also helps in realigning their ideas moving forward.
Five responses to the survey by Midland council members have already been shared on MidlandToday’s website.
Philips received a call from Tiny Township staff indicating all of council will provide a combined response to the exercise.
So far, four Tay Township councillors have also sent in their responses, but a Thursday committee discussion indicated they still sought clarity around how council members should deal with members of the media. (Story to follow.)
The Penetanguishene motion around the survey died on the floor as there was no mover or seconder.
The survey provided to councillors (question #4 varied by municipality):
Question 1. What are you most proud of, personally as a councillor, that you/council have been able to accomplish in the first half of your mandate?
Question 2. What is your biggest disappointment as it relates to a council decision/direction or issue?
Question 3. Nobody saw the pandemic coming. Specifically, as a councillor, what is the biggest challenge the pandemic has created and how have you tried to tackle that challenge?
Question 4. What is your vision for the area known as Midland Bay Landing (i.e. a full park, half a park and the balance development, mostly development)?
Question 5. Are you doing enough as a council to be transparent, to encourage public input and to listen? How so? How could that be improved during the second half of your mandate?
Question 6. What is the biggest challenge council faces in the second half of its mandate (i.e., staff retirements, promised tax freeze, capacity) and what are your top priorities?
Question 7: Lastly, do you intend to seek re-election? Why or why not?
Social Media Companies Should Self-Regulate. Now. – Harvard Business Review
The world witnessed the worst example of the impact digital platforms can have on society with the debacle at the U.S. Capitol on January 6, 2021. Not only did supporters of Donald Trump try to disrupt the certification of the Electoral College votes, but this deplorable incident was, in large part, fomented over social media.
In the past, Twitter and Facebook have been reluctant to censor posts about conspiracy theories and fake news. Digital platforms also have benefitted from a 1996 law, Section 230 of the Communications Decency Act, that grants them immunity from liabilities related to third-party hosted content. Nevertheless, prompted by false accusations of rigged elections and other fake news, the leading digital platforms in social media recently began tagging some posts as unreliable or untrue and removing some videos. Following the January 6th insurrection attempt, Twitter and Facebook also banned Trump from using their platforms because promotion of violence and criminal acts violates their terms of service. For similar reasons, Apple and Google removed the alternative Parler social media platform from their app stores, and Amazon stopped hosting the service.
How did we get into this mess?
Digital platforms can be highly profitable businesses that connect users and other market actors in ways not possible before the internet. When they are successful, they generate powerful feedback loops called network effects and then monetize them by selling advertisements. But what happened at the U.S. Capitol illustrates how digital platforms can be double-edged swords. Yes, they have generated trillions of dollars in wealth. But they have also enabled the distribution of fake news and fake products, manipulation of digital content for political purposes, and promotion of dangerous misinformation on elections, vaccines, and other public health matters.
The social dilemma is clear: Digital platforms can be used for evil as well as good.
What’s the solution? Should platform companies wait for governments to impose potentially intrusive controls and respond defensively? Or should they act pre-emptively?
Governments will inevitably get more engaged in oversight. However, we believe that platforms should become more aggressive at self-regulation now. To explore the feasibility of self-regulation, we researched the history of self-regulation before and after the widespread adoption of the internet. We found that companies have often risked creating a “tragedy of the commons” when they put their short-term, individual self-interests ahead of the good of the consuming public or the industry overall, and, in the long run, destroy the environment that made them successful in the first place.
Before the internet era, several industries, such as movies, video games, broadcasting content, television advertising, and computerized airline reservation systems, faced similar issues and managed to self-regulate with some success. At the same time, these historical examples suggest that self-regulation worked best when there were credible threats of government regulation. The bottom line: Self-regulation may be the key to avoiding a potential tragedy of the commons scenario for digital platforms.
What is “self-regulation”? This refers to the steps companies or industry associations take to preempt or supplement governmental rules and guidelines. For an individual company, self-regulation ranges from self-monitoring for regulatory violations to proactive “corporate social responsibility” (CSR) initiatives. Leaving it up to companies to monitor and restrain themselves can sometimes devolve into a self-regulatory or regulatory “charade.” But that doesn’t need to be the case.
For many decades, companies in the business of producing movies, video games, and television shows and commercials all have faced issues around the appropriateness of “content” in a way that resembles today’s social media platforms. To keep regulators at bay, the movie and video game industries resorted to a self-imposed and self-monitored rating system, still in operation today. The broadcasting and advertisement sectors in the 1950s and 1960s faced pushback on the appropriateness of advertisements, with issues resembling what we see today in online advertising. The computerized airline reservation industry, led by American Airlines’ Sabre system (launched in 1960), introduced self-preferencing in search results, drawing complaints similar to those made against Google and Amazon today. Self-regulation in these cases often delivered effective and inexpensive guidelines for company operations as well as forestalled more intrusive government intervention.
History provides several lessons for today’s digital platforms.
First, our leading technology companies need to anticipate when government regulation is likely to become a key factor in their businesses. In movies, radio and television broadcasting, airline reservations via computers, and other new industries, there often occurs a vacuum in regulation in the early years. Then, after a kind of “wild west” environment, governments step in to regulate or pressure firms to curb abuses. To avoid problematic government regulation, platform companies need to introduce their own controls on behavior and usage before the government revokes all Section 230 protections, which is currently under debate in Congress. Technology that exploits big data, artificial intelligence, and machine learning, with some human editing, will increasingly give digital platforms the ability to curate what happens on their platforms. The issue is really to what extent the big platforms have the will to self-regulate. The decisions by Facebook, Twitter, Amazon, Apple, and Google during the first week in January 2021 were steps in the right direction.
Second, we find that firms in new industries tend to eschew self-regulation when the perceived costs imply a significant reduction in revenues or profits. Managers rarely like industry regulations that appear “bad for business.” However, this strategy can be self-defeating. If bad behavior undermines consumer trust, then digital platforms will not continue to thrive. Look closely at Section 230. It states that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” This act gave online intermediaries broad immunity from liability for user-generated content posted on their sites. Company lawyers generally interpreted this legislation as providing protection as long as they did not engage in curation. However, Section 230 also included a “good Samaritan” exception. This allowed platforms to remove or moderate content deemed obscene or offensive, as long as it was done in good faith. There have been growing calls from both Democrats and Republicans to repeal Section 230 because of accusations of bias (i.e., not acting in good faith) and very little curation over the prior decade by Twitter, Facebook/Instagram, and other platforms. More explicit and transparent self-regulation, like we observed after the U.S. Capitol debacle, might well produce a better outcome for social media platforms, at least compared to leaving their fate up to Congress.
Third, proactive self-regulation was often more successful when coalitions of firms in the same sector worked together. We saw this coalition-type activity in movie and video-game rating systems limiting violent, profane, or sexual content; television advertisement rules curbing unhealthy products like alcohol and tobacco; and computerized online airline reservations giving equal treatment to airlines, without favoring the system owners. Similarly, social media companies implemented codes of conduct on terrorist activity. Since individual firms may hesitate to enact self-regulation if they incur added costs that their competitors do not, industry coalitions have the benefit of reducing free-riding. Now is the ideal time for more “coopetition,” where platforms compete as well as cooperate with rivals.
Fourth, we found that firms or industry coalitions get serious about self-regulation primarily when they see a credible threat of government regulation, even if it may hurt short-term sales and profits. This pattern occurred with tobacco and cigarette ads, airline reservations, social media ads for terrorist group recruitment, and pornographic material. That threat should be clear and obvious to digital platforms in 2021.
In sum, history suggests that modern digital platforms should not wait for governments to impose controls; they should act decisively and pro-actively now. While the costs of government action in the internet era have been modest so far, the regulatory environment is changing fast. Given the increasing likelihood of government action, the goal of self-regulation should be to avoid a tragedy of the commons, where a lack of trust destroys the environment that has allowed digital platforms to thrive. Going forward, governments and digital platforms will also need to work together more closely. Since more government oversight over Twitter, Facebook, Google, Amazon, and other platforms seems inevitable, new institutional mechanisms for more participative forms of regulation may be critical to their long-term survival and success.