
London, Ont. researchers examining use of A.I. to diagnose COVID-19


A research team at London’s Lawson Health Research Institute is working to determine whether trained artificial intelligence can be used to diagnose COVID-19 by comparing lung ultrasound scans of positive patients to those of patients without the disease.

As part of the project, team members will train an artificial neural network, designed to loosely mimic how brains compute, to detect small patterns in the ultrasounds that humans wouldn’t otherwise see, said researcher Dr. Robert Arntfield.

Researchers say that ultrasound scans of patients with COVID-19-related pneumonia produce a “highly abnormal imaging pattern,” but one that isn’t unique to the disease.

“We’re hoping to establish whether there is a uniqueness to what COVID-19 puts on the lungs as opposed to other similar diseases, whether it be influenza or other causes of pneumonia,” said Arntfield, who is also medical director of LHSC’s Critical Care Trauma Centre, in an interview Wednesday with 980 CFPL’s Devon Peacock.

 

Such a breakthrough would, in theory, allow health-care workers to link COVID-19 to a patient’s lung problems much more quickly than a standard swab test, he says.

The A.I. technology being used is not unlike the systems deployed by social media websites and smartphones to distinguish faces and objects in photographs.

“The difference here is we’re actually trying to harness the power of the computer to see if there’s some difference between a COVID lung and a non-COVID lung. Humans are actually not able to do this task,” Arntfield said.

“We believe, or we’re hopeful, that at the pixel level, at a level that exceeds human vision and human cognition, that the machine will actually detect a pattern in all of that noise.”

Researchers are currently in the process of training the neural network by showing it a large quantity of lung ultrasound scans from patients who have been critically ill from COVID-19, as well as scans from patients with other types of lung infections taken before the pandemic.

“In so doing, we’re training it to develop the eventual capacity to receive what’s called a test set, or a validation phase, where we will show it these pictures without the labels on them and look to it to perform accurately by sorting the images,” Arntfield said.
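In broad strokes, the train-then-validate workflow Arntfield describes looks like the sketch below, written here in PyTorch. Everything in it is an illustrative assumption rather than Lawson’s actual code: the folder layout, the image size, and the small convolutional network are placeholders for whatever the team built.

# Minimal sketch of a supervised image classifier of the kind described
# above. Hypothetical folder layout: ultrasound/train/covid,
# ultrasound/train/other_pneumonia, and a matching ultrasound/test tree.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Grayscale ultrasound frames, resized to a fixed input size.
preprocess = transforms.Compose([
    transforms.Grayscale(),
    transforms.Resize((128, 128)),
    transforms.ToTensor(),
])

train_set = datasets.ImageFolder("ultrasound/train", transform=preprocess)
test_set = datasets.ImageFolder("ultrasound/test", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)
test_loader = DataLoader(test_set, batch_size=32)

# A small convolutional network with two output classes
# (COVID-19 pneumonia vs. other lung infection).
model = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 32 * 32, 2),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Training phase: show the network labelled images and nudge its
# weights toward lower classification error.
model.train()
for epoch in range(10):
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()

# Validation phase: present held-out images "without the labels on
# them"; the labels are consulted only afterward, to score how
# accurately the network sorted the images.
model.eval()
correct = total = 0
with torch.no_grad():
    for images, labels in test_loader:
        predictions = model(images).argmax(dim=1)
        correct += (predictions == labels).sum().item()
        total += labels.numel()
print(f"Held-out accuracy: {correct / total:.1%}")

In practice a medical-imaging team would more likely fine-tune a pretrained backbone and evaluate per patient rather than per frame, but the train/validate structure is the same.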

The research team hopes to have results in the next two to three weeks, with publication to come soon afterward. According to Lawson, many on Arntfield’s team have backgrounds in computer programming and wrote the code for the neural network being used.

Arntfield says such technology could be deployed in other areas of medical diagnostics, and in some cases already has been.

“There’s been lots of work already… particularly in imaging, such as CAT scans or the X-rays in showing the capacity for machines to often exceed human performance in recognizing either subtle findings, or as a backstop to shore up diagnostic accuracy.

“The future for A.I. in medicine is wide open and is very exciting.”

 


Sega Celebrates Its 60th Anniversary With A Micro Version Of The Game Gear – Nintendo Life



Sega has just announced the Game Gear Micro. As the name suggests, this is a mini version of its original 1990 system.

According to a Twitter image within the source code of the official teaser website, this micro device will launch in Japan on 6th October for ¥4,980, which is expected to translate to about $50 / €50 here in the west.

This new but old system will be available in black, blue, yellow, and red.



Safari in iOS and iPadOS 14 Might Include Built-In Translator, Full Apple Pencil Support – MacRumors



Apple is planning to add a built-in language translation feature and full Apple Pencil support to Safari in iOS and iPadOS 14, according to details found by 9to5Mac in a leaked version of iOS 14.


Safari’s built-in translation feature would allow users to translate web pages without using a third-party app or service. If such a feature comes to iOS 14, we can probably count on it coming to the next-generation version of macOS as well.

The code suggests the translation option will be offered on each website a user visits, but that an automatic translation feature, similar to Chrome’s, can also be turned on. Apple also appears to be testing translation options for other apps, such as the App Store, allowing users to do things like read reviews in other languages.

Apple’s translations are powered by the Neural Engine and may work with or without an internet connection.

As for the ‌Apple Pencil‌, Apple may be planning to add full support for ‌Apple Pencil‌ input on websites, which would allow it to be used for drawing and marking up. This feature would be limited to ‌iPadOS‌ 14 as the ‌Apple Pencil‌ does not work on iPhones.

Earlier this year, MacRumors discovered new PencilKit features that will allow users to handwrite text in any text input field using the ‌Apple Pencil‌, with the handwritten content then converted into standard text.

The code also indicates Apple is working on a kind of “Magic Fill” feature that will let users draw a general shape in an app and have it filled in by the operating system.

The leaked version of iOS 14 that’s been floating around the internet is an early build of the software, and it’s not clear whether Apple’s development plans have changed or whether some features might be delayed due to the global health crisis.

We’ll find out what we can expect in ‌iOS 14‌ on June 22, which is when Apple’s virtual WWDC event is set to kick off.


Judge tosses former Maryland basketball players' Fortnite dance lawsuit – ESPN



A federal judge has dismissed a lawsuit in which two former University of Maryland men’s basketball players accused makers of the Fortnite video game of misappropriating a dance move that the ex-teammates popularized.

U.S. District Judge Paul Grimm in Maryland ruled Friday that the Copyright Act preempts claims that Jared Nickens and Jaylen Brantley filed in February 2019 against Epic Games Inc., creator of the wildly popular online shooting game.

Nickens and Brantley claimed the Cary, North Carolina-based company misappropriated their identities by digitally copying the “Running Man Challenge” dance that they performed in social media videos and on “The Ellen DeGeneres Show” in 2016.

Their copyright infringement lawsuit claimed the “Running Man” emote, a celebratory dance that players can purchase for their Fortnite characters, is identical to the dance that Nickens and Brantley took credit for creating.

The judge said the key question is whether plaintiffs have a claim that is “qualitatively different” from the rights protected by the Copyright Act.

“And here Plaintiffs’ claim is based on Epic Games allegedly ‘capturing and digitally copying’ the Running Man dance to create the Fortnite emote that ‘allows the player’s avatars to execute the Running Man identically to Plaintiffs’ version.’ This is squarely within the rights protected by the Copyright Act,” he wrote.

Brantley, of Springfield, Massachusetts, and Nickens, of Monmouth Junction, New Jersey, were seeking more than $5 million in damages.

Epic Games spokesman Nick Chester declined to comment Monday on the judge’s ruling.

While the game itself is free to play, players can purchase emotes and other character customizations.

Other artists, including Brooklyn-based rapper 2 Milly and “The Fresh Prince of Bel-Air” star Alfonso Ribeiro, also have sued Epic Games over other dances depicted in the shooting game. Ribeiro dropped his lawsuit against Epic Games last year after the U.S. Copyright Office denied him a copyright for the “Carlton” dance that his character performed on the 1990s sitcom.

Nickens and Brantley appeared on DeGeneres’ talk show alongside two New Jersey high school students who were posting videos of the dance online before the two University of Maryland basketball players filmed their own version. Brantley told DeGeneres that Nickens first showed him the dance in a video on Instagram.

“We dance every day for our teammates in the locker room,” Brantley said. “We were like, ‘Hey, let’s make a video and make everybody laugh.'”

One of their dance videos has millions of views on Instagram, YouTube and Facebook, their lawsuit said.

The judge dismissed their lawsuit’s claims for invasion of privacy, unfair competition and unjust enrichment based on preemption under the Copyright Act. He also threw out their trademark claims and claims accusing the company of unfair competition and “false designation of origin” under the Lanham Act.

“Plaintiffs seek to place the same square peg into eight round holes in search of a cause of action against Epic Games for its use of the Running Man dance in its game Fortnite. But Plaintiffs’ claims that Epic Games copied the dance do not support any of their theories,” the judge wrote.

Plaintiffs’ attorney Richard Jaklitsch said his clients may not be able to afford the costs of appealing the judge’s ruling. He said it seems “un-American” for the company to “profit off the backs of” Nickens and Brantley.

“Epic can still step up and do the right thing. Epic can still step up and acknowledge what these kids did,” he said.

Nickens was playing professional basketball in Canada and Brantley was working as a sports agent when they sued last year, according to Jaklitsch.
