Science

Facebook Introduces Robo-Dog That Is Nearly Impossible To Stop – Tech Gaming Report

Researchers from Facebook and US universities have unveiled a robotic dog driven by a machine learning model that is prepared for “mishaps” such as difficult surfaces, changing heights, and heavy loads.


Unless you’ve been living under a rock for the past few years, you’ve likely seen at least one video of Spot, the Boston Dynamics robo-dog now owned by the Hyundai Corporation. If you are a devoted geek reader, you may even have seen one of these in action right here in Israel, built with Israeli technology. Now robotics and artificial intelligence researchers at Facebook are taking the concept a step further with their very own robotic dog, one that knows how to deal with real-world conditions and is armed with some particularly advanced models.

The AI that will improve robots?

A team of researchers from Facebook’s AI division and the American universities Berkeley and Carnegie Mellon have introduced a new artificial intelligence model called Rapid Motor Adaptation (or RMA for short), designed to improve the mobility of robots. The model allows robots to correct their movement in real time, under different conditions and circumstances.

The model the research teams created for the robots is based on two very familiar techniques from the world of AI and machine learning: the first is reinforcement learning (RL) and the second is supervised learning. Using these techniques, the researchers created a situation in which the robot adjusts to changes in real time, such as changes in the surface it walks on or loads placed on it without warning, all without the help of visual feedback. In other words, while Spot the dog scans its surroundings with computer vision, Facebook’s robot is prepared for failure and adapts on the fly.

The researchers behind the new model point out that today’s robots are either manually programmed for the environment in which they are designed to operate, or partially programmed manually, with learning techniques used to teach them to navigate the environment. By contrast, RMA, according to the researchers, is the first model based solely on learning techniques, allowing robots to adapt to different environments from scratch by moving through space and interacting with their surroundings.

According to the researchers, the robots running RMA achieved better results than competing systems when it comes to walking on different surfaces, including various gradients and obstacles, and carrying different loads as they walk. “This is true even compared to sophisticated manual programming, because it is difficult or impossible to pre-program a robot for the full range of environments in the world,” the researchers wrote.

Preparing the robot for real life

The researchers who authored the paper on the model developed together with Facebook state that no matter how good the teams that develop robots are, full or partial manual programming will always succeed under laboratory conditions but will not hold up in real-life testing. According to them, only a model like RMA can give robots the ability to move through space while carrying different loads, without needing their software repaired each time; to keep walking properly even after suffering damage to one of their “feet”; and to adapt to the countless changes that can occur in real time.

To address the various challenges of moving through space and correcting in real time, RMA relies on two subsystems. The first is a base policy created through RL-based learning in simulation: the researchers fed in a large amount of information about different environments (such as the amount of friction on each surface or the different loads placed on the robot), and from this information the model learned to predict which adjustments the robot must make.






However, in real time it is impossible to know exactly which surfaces the robot will encounter, and this is where the second subsystem comes into play: the adaptation module. The base policy is what drives the robot in real time, and it is designed to operate quickly; otherwise the robot would freeze in place or crash. Alongside it, the adaptation module runs in the background, taking the information collected from the robot’s sensors and making the necessary corrections accordingly. The two subsystems operate asynchronously, which also keeps the computation needed to run RMA small enough that, in practice, it places little load on the robot’s onboard hardware.
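The asynchronous split described above can be sketched in code. This is a minimal illustration, not Facebook’s implementation: both learned networks are replaced by hypothetical random linear maps, and all dimensions are invented. Only the data flow mirrors the description — the base policy runs every control step, while the adaptation module refreshes its estimate from recent sensor history an order of magnitude less often.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the two trained networks (random linear maps
# here, just to show the data flow; dimensions are invented).
W_policy = rng.standard_normal((12, 30 + 8))   # state + extrinsics -> 12 joint targets
W_adapt = rng.standard_normal((8, 50 * 42))    # 50-step history -> 8-dim extrinsics

def base_policy(state, z):
    """High-frequency controller: current state + extrinsics estimate -> action."""
    return np.tanh(W_policy @ np.concatenate([state, z]))

def adaptation_module(history):
    """Slower estimator: recent (state, action) history -> extrinsics estimate."""
    return np.tanh(W_adapt @ history.ravel())

z = np.zeros(8)               # extrinsics estimate, refreshed in the background
history = np.zeros((50, 42))  # rolling window of (state, action) pairs
for step in range(100):
    state = rng.standard_normal(30)        # stand-in for sensor readings
    action = base_policy(state, z)         # runs every step, never waits
    history = np.roll(history, -1, axis=0)
    history[-1] = np.concatenate([state, action])
    if step % 10 == 0:                     # adaptation runs 10x slower
        z = adaptation_module(history)
```

Because the policy only reads the latest `z` rather than waiting for it, the heavier adaptation computation never stalls the control loop — the property the researchers credit for keeping the onboard compute requirements small.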

Using all of these capabilities, the teams were able to march their RMA-equipped robot through environments such as sand, mud, hiking trails, tall grass, and even a dirt pile, with only a single trial failing. Among other things, the robot succeeded in 70% of its attempts to walk down tall stairs encountered along its route, and in 80% of its attempts to walk across piles of gravel and cement. The robot was also able to walk while carrying a 12-kilogram load, equal to its entire body weight. Additionally, the robo-dog managed to walk steadily on an oiled surface that caused a dog without the model to slip instantly.

The complete study is available here.

Oshri Alexelsi

Your geek-friendly neighborhood. Do you have a tech story? Talk to me: [email protected]





Facial Recognition—Now for Seals – Hakai Magazine


Have you ever looked at a seal and thought, Is that the same seal I saw yesterday? Well, there could soon be an app for that based on new seal facial recognition technology. Known as SealNet, this seal face-finding system was developed by a team of undergraduate students from Colgate University in New York.

Taking inspiration from other technology adapted for recognizing primates and bears, Krista Ingram, a biologist at Colgate University, led the students in developing software that uses deep learning and a convolutional neural network to tell one seal face from another. SealNet is tailored to identify the harbor seal, a species with a penchant for posing on coasts in haulouts.

The team had to train their software to identify seal faces. “I give it a photograph, it finds the face, [and] clips it to a standard size,” says Ingram. But then she and her students would manually identify the nose, the mouth, and the center of the eyes.

For the project, team members snapped more than 2,000 pictures of seals around Casco Bay, Maine, during a two-year period. They tested the software using 406 different seals and found that SealNet could correctly identify the seals’ faces 85 percent of the time. The team has since expanded its database to include around 1,500 seal faces. As the number of seals logged in the database goes up, so too should the accuracy of the identification, Ingram says.
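The identification step can be sketched as an embed-and-compare pipeline. This is a hedged illustration, not SealNet’s actual code: the `embed` function is a hypothetical stand-in for the trained convolutional network (a fixed random projection here, so the example runs without a model), and the similarity threshold is invented. It only shows how a standardized face crop could be matched against a database of known seals, with a `None` result signalling that no confident match was found and a manual check is needed.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for SealNet's CNN: a fixed random projection that
# maps a standardized 32x32 face crop to a 64-dim feature vector.
P = rng.standard_normal((64, 32 * 32))

def embed(face_crop):
    return P @ face_crop.ravel()

def identify(face_crop, database, threshold=0.8):
    """Return the best-matching seal name by cosine similarity, or None."""
    q = embed(face_crop)
    q = q / np.linalg.norm(q)
    best_name, best_sim = None, threshold
    for name, ref in database.items():
        ref = ref / np.linalg.norm(ref)
        sim = float(q @ ref)
        if sim > best_sim:
            best_name, best_sim = name, sim
    return best_name  # None -> no confident match; check manually

# Toy database of two known seals (random arrays standing in for photos).
clove = rng.standard_normal((32, 32))
petal = rng.standard_normal((32, 32))
db = {"Clove": embed(clove), "Petal": embed(petal)}
```

Growing the database, as the team is doing, gives the comparison step more references to match against — which is why accuracy should improve as more seals are logged.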

The developers of SealNet trained a neural network to tell harbor seals apart using photos of 406 different seals. Photo courtesy of Birenbaum et al.

As with all tech, however, SealNet is not infallible. The software saw seal faces in other body parts, vegetation, and even rocks. In one case, Ingram and her students did a double take at the uncanny resemblance between a rock and a seal face. “[The rock] did look like a seal face,” Ingram says. “The darker parts were about the same distance as the eyes … so you can understand why the software found a face.” Consequently, she says it’s always best to manually check that seal faces identified by the software belong to a real seal.

Like a weary seal hauling itself onto a beach for an involuntary photo shoot, the question of why this is all necessary raises itself. Ingram believes SealNet could be a useful, noninvasive tool for researchers.

Of the world’s pinnipeds—a group that includes seals, walruses, and sea lions—harbor seals are considered the most widely dispersed. Yet knowledge gaps do exist. Other techniques to track seals, such as tagging and aerial monitoring, have their limitations and can be highly invasive or expensive.

Ingram points to site fidelity as an aspect of seal behavior that SealNet could shed more light on. The team’s trials indicated that some harbor seals return to the same haulout sites year after year. Other seals, however, such as two animals the team nicknamed Clove and Petal, appeared at two different sites together. Increasing scientists’ understanding of how seals move around could strengthen arguments for protecting specific areas, says Anders Galatius, an ecologist at Aarhus University in Denmark who was not involved in the project.

Galatius, who is responsible for monitoring Denmark’s seal populations, says the software “shows a lot of promise.” If the identification rates are improved, it could be paired with another photo identification method that identifies seals by distinctive markings on their pelage, he says.

In the future, after further testing, Ingram hopes to develop an app based on SealNet. The app, she says, could possibly allow citizen scientists to contribute to logging seal faces. The program could also be adapted for other pinnipeds and possibly even for cetaceans.


NASA launches nanosatellite in preparation for lunar 'Gateway' station – Yahoo News Canada

The rocket carrying the Capstone satellite lifts off. (NASA)

Nasa this week launched a tiny CubeSat to test an orbit that will soon be used by Gateway, a lunar space station.

It’s all part of the space agency’s plan to put a woman on the moon by 2025.

The Cislunar Autonomous Positioning System Technology Operations and Navigation Experiment (Capstone) mission launched from New Zealand on Tuesday.

Jim Reuter, associate administrator for the Space Technology Mission Directorate, said: “Capstone is an example of how working with commercial partners is key for Nasa’s ambitious plans to explore the moon and beyond.

“We’re thrilled with a successful start to the mission and looking forward to what Capstone will do once it arrives at the Moon.”

Read more: Astronomers find closest black hole to Earth

The satellite is currently in low-Earth orbit, and it will take the spacecraft about four months to reach its targeted lunar orbit.

Capstone is attached to Rocket Lab’s Lunar Photon, an interplanetary third stage that will send it on its way to deep space.

Over the next six days, Photon’s engine will periodically ignite to accelerate it beyond low-Earth orbit, where Photon will release the CubeSat on a trajectory to the moon.

Capstone will then use its own propulsion and the sun’s gravity to navigate the rest of the way to the Moon.

The gravity-driven track will dramatically reduce the amount of fuel the CubeSat needs to get to the Moon.

Read more: There might once have been life on the moon

Bradley Cheetham, principal investigator for Capstone and chief executive officer of Advanced Space, said: “Our team is now preparing for separation and initial acquisition of the spacecraft in six days.

“We have already learned a tremendous amount getting to this point, and we are passionate about the importance of returning humans to the Moon, this time to stay!”

At the moon, Capstone will enter an elongated orbit called a near rectilinear halo orbit, or NRHO.

Once in the NRHO, Capstone will fly within 1,000 miles of the moon’s north pole on its near pass and 43,500 miles from the south pole at its farthest.

It will repeat the cycle every six and a half days and maintain this orbit for at least six months to study the dynamics of the orbit.
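For metric readers, the orbit’s two extremes quoted above convert as follows — a rough arithmetic sketch using only the figures from this article:

```python
MILES_TO_KM = 1.609344

perilune_miles = 1_000    # closest pass, over the Moon's north pole
apolune_miles = 43_500    # farthest point, over the south pole

perilune_km = perilune_miles * MILES_TO_KM
apolune_km = apolune_miles * MILES_TO_KM
print(f"perilune ≈ {perilune_km:,.0f} km, apolune ≈ {apolune_km:,.0f} km")
print(f"apolune is ~{apolune_miles / perilune_miles:.1f}x farther out than perilune")
```

The large gap between the two distances is what makes the NRHO “near rectilinear”: the highly elongated path looks almost like a straight line from some angles.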

“Capstone is a pathfinder in many ways, and it will demonstrate several technology capabilities during its mission timeframe while navigating a never-before-flown orbit around the Moon,” said Elwood Agasid, project manager for Capstone at Nasa’s Ames Research Center in California’s Silicon Valley.

“Capstone is laying a foundation for Artemis, Gateway, and commercial support for future lunar operations.”

Nasa estimates the cost of the whole Artemis mission at $28bn.

It would be the first time people have walked on the moon since the last Apollo moon mission in 1972.

Just 12 people have walked on the moon – all men.

Nasa flew six manned missions to the surface of the moon, beginning with Neil Armstrong and Buzz Aldrin in July 1969, up to Gene Cernan and Jack Schmitt in December 1972.

The mission will use Nasa’s powerful new rocket, the Space Launch System (SLS), and the Orion spacecraft.



The year’s biggest and brightest supermoon will appear in July & here’s when you’ll … – Curiocity

Summer is here and with it? Sunshine – and some serious moonshine (of the visible variety, of course). This upcoming month, look up in anticipation of the biggest and brightest event of the year, the July Buck supermoon – which will hover over North America on July 13th.

Appearing 7% larger and lower in the sky, this particular event will be one well worth keeping an eye on when it rises above the horizon.

This will be the closest we’ll get to our celestial neighbour in 2022 (357,418 km), and while North America won’t get to see it when it reaches peak illumination at 2:38 pm ET, it’ll still look pretty dang impressive after the sun sets.
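The “7% larger” figure can be sanity-checked with one line of arithmetic: apparent size scales inversely with distance, so comparing the quoted perigee distance to the Moon’s well-known average distance (about 384,400 km — a standard value, not from this article) lands close to the same number. A quick sketch:

```python
MEAN_DISTANCE_KM = 384_400       # average Earth-Moon distance (standard value)
SUPERMOON_DISTANCE_KM = 357_418  # July 13th perigee distance from the article

# Angular size is inversely proportional to distance, so the relative
# size increase is just the ratio of the two distances, minus one.
size_increase = MEAN_DISTANCE_KM / SUPERMOON_DISTANCE_KM - 1
print(f"~{size_increase:.1%} larger than a full moon at the average distance")
```

The exact percentage depends on the baseline chosen (average distance versus a full moon at apogee), which is why quoted figures for supermoons vary a little from source to source.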


Not sure when the moon rises in your area? Here’s the earliest that you’ll be able to see the moon in various cities across the continent, according to the Farmers’ Almanac.

  • Seattle, Washington – 9:50 pm PDT
  • Vancouver, British Columbia – 10:02 pm PDT
  • Calgary, Alberta – 10:35 pm MDT
  • Edmonton, Alberta – 10:49 pm MDT
  • Toronto, Ontario – 9:34 pm EDT
  • Montreal, Quebec – 9:18 pm EDT

Until then, cross your fingers for a clear sky, friends! It’s going to be incredible.

Happy viewing.

JULY BUCK SUPERMOON 

When: Wednesday, July 13th
