Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):
US National Robotics Week – April 7-17, 2018 – United States
Xconomy Robo Madness – April 12, 2018 – Bedford, Mass., USA
NASA Swarmathon – April 17-19, 2018 – Kennedy Space Center, Fla., USA
RoboSoft 2018 – April 24-28, 2018 – Livorno, Italy
ICARSC 2018 – April 25-27, 2018 – Torres Vedras, Portugal
NASA Robotic Mining Competition – May 14-18, 2018 – Kennedy Space Center, Fla., USA
ICRA 2018 – May 21-25, 2018 – Brisbane, Australia
RSS 2018 – June 26-30, 2018 – Pittsburgh, Pa., USA
Ubiquitous Robots 2018 – June 27-30, 2018 – Honolulu, Hawaii, USA
MARSS 2018 – July 4-8, 2018 – Nagoya, Japan
Let us know if you have suggestions for next week, and enjoy today's videos.
We were at the 2018 Human-Robot Interaction conference all this week, and on Wednesday there was a special video session. The audience, which was provided with popcorn, voted by applause, and here are the top three videos.
“Social Interaction With Drones Using Human Emotion Recognition,” by Eirini Malliaraki, Imperial College, London.
The proposed demonstration suggests a mapping from five human emotional states (i.e., anger, happiness, sadness, surprise, and fear) to human-interpretable drone motions (e.g., changes in speed or rotation), rather than anthropomorphizing the drone.
“Perceptions of a Soft Robotic Tentacle in Interaction,” by Jonas Jørgensen, IT University of Copenhagen, Denmark.
Soft robotics technology has been proposed for a number of applications that involve human-robot interaction. This video documents a platform created to explore human perceptions of soft robots in interaction. The video presents select footage from an interaction experiment conducted with the platform and the initial findings obtained.
And the winner:
“Social Relationship Development Between Human and Robot Through Real-Time Face Identification and Emotional Interaction,” by WonHyong Lee and Jong-Hwan Kim from the Korea Advanced Institute of Science and Technology.
We developed an interactive humanoid robotic platform with a real-time face learning algorithm for user identification and an emotional episodic memory that associates emotional experiences with users, so that the robot can differentiate its reactions according to each user's emotional history. In this video, we demonstrate how a robot can develop a social relationship with humans through face identification and emotional interaction.
[ HRI 2018 ]
I don't play basketball, so I tend to assume that shooting free throws is a cinch. This robot proves me right, I think.
The robot was made by 17 volunteers from the Toyota Technology Association, a sort of hobby club. It's not directly affiliated with Toyota the company, as far as I can tell, but the robot is intended to celebrate the company's 70th anniversary. Apparently, it's modeled on "an inspiring character named Hanamichi Sakuragi (桜木花道) in a very famous Japanese anime called Slam Dunk."
[ Asahi ]
ANYmal carries an onboard microphone with which it can perceive music. The beat of the music is analyzed and a suitable sequence of dance motions is choreographed. The desired and actual motion trajectories are compared so that the delay between music and motion can be minimized.
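The delay-minimization step in that description can be sketched in a few lines: pair each observed motion event with its nearest beat, average the lag, and command the next motion that much earlier. This is a hypothetical illustration, not ANYmal's actual controller, and the timestamps below are made up for the example.

```python
def estimate_delay(beat_times, motion_times):
    """Average lag between each motion event and its nearest musical beat."""
    lags = [t - min(beat_times, key=lambda b: abs(t - b)) for t in motion_times]
    return sum(lags) / len(lags)

beats = [0.0, 0.5, 1.0, 1.5, 2.0]         # detected beat times (seconds)
motions = [0.08, 0.58, 1.08, 1.58, 2.08]  # robot's motion peaks, lagging ~80 ms

delay = estimate_delay(beats, motions)
# To stay on the beat, trigger the next motion earlier by the estimated delay.
next_beat = 2.5
command_time = next_beat - delay
```

In a real system, the beat times would come from an online beat tracker and the loop would update the delay estimate continuously rather than in one batch.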
[ ANYmal ]
At some point, robots that can quickly solve Rubik's Cubes kinda stop being relevant. Like, the solving part is totally incidental to the impressive thing, which is fast manipulation, but there are way more impressive things that you can do with fast manipulators than just spinning the sides of a cube around.
Having said that, the engineering is really quite interesting, and worth a look at the link below.
[ Blogspot ]
This "corporate video" from Suitable Technologies is worth watching for approximately 5 seconds worth of what appear to be Beam prototypes, including one with what looks like a front-mounted leaf blower (?)
[ Suitable ]
If you're at SXSW next week, you can "engage in Jedi Battle with 2 UR5 cobots equipped with Robotiq 3-Finger Adaptive Robot Gripper and FT 300 Force Torque Sensor." Here's a preview:
[ Robotiq ]
You'll have to crank the volume way up to hear the commentary, but it's worth it to listen to the researchers giggling while trying to use their squishy robot fingers to mildly molest bizarre deep sea creatures.
Very nice to see Uber doing the sensible thing with its self-driving freight trucks: let the autonomous systems handle the dull but well-defined highway driving, and then let humans take over for the last few miles, which require the greatest amount of skill.
[ Uber ATG ]
It doesn't get much more gratuitous than drones vs. airsoft, especially when the drones are launching explosives.
[ Team BlackSheep ]
Far above the Earth, Canada’s space robotics helped assemble the International Space Station, an orbiting outpost that pushes the boundaries of science and human knowledge. Canadian expertise in the design and development of robotic systems like Canadarm2 has found new applications inside the operating room. In partnership with MDA, Synaptive Medical developed Drive, a robotic imaging system that increases surgical efficiency and shortens patient recovery time.
[ CSA ]
In the FIRST LEGO League tournament, middle school teams mentored by Penn Engineering students worked to design and build robots related to the theme of water.
[ GRASP Lab ]
A compilation of PAL Robotics' TIAGo manipulating stuff. Loving the red.
[ PAL Robotics ]
Virtually explore a Mars simulation facility used by engineers to practice operating NASA's InSight lander, slated to launch in May 2018. Hear from engineer Marleen Martinez Sundgaard as you explore the In-Situ Instrument Lab at the Jet Propulsion Laboratory in Pasadena, California, and see how the spacecraft will deploy its seismometer.
[ Mars Insight ]
Students in CMU's Introduction to Robotics course build LEGO robots for urban search and rescue. Gotta save those minifigs!
[ CMU ]
The MIT Intelligence Quest will advance the science and engineering of both human and machine intelligence. Launched on February 1, 2018, this effort seeks to discover the foundations of human intelligence and drive the development of technological tools that can positively influence virtually every aspect of society.
[ MIT ]
We've written about Aaron Parness and the innovative gripping systems he builds for robots at JPL. Aaron gave a talk about his work at the Smithsonian Air and Space Museum, and brought along a few robots to demo. And there were some adorable geckos, too!
[ Aaron Parness ]
This week's CMU RI Seminar comes from Naomi Ehrich Leonard at Princeton University, on "Bio-inspired dynamics for multi-agent decision-making."
I will present distributed decision-making dynamics for multi-agent systems, motivated by studies of animal groups, such as house-hunting honeybees, and their extraordinary ability to make collective decisions that are both robust to disturbance and adaptable to change. The dynamics derive from principles of symmetry, consensus, and bifurcation in networked systems, exploiting instability as a means to flexibly transition from one stable solution to another. Feedback dynamics are derived for the bifurcation control, a variable representing social effort, such that flexible transition is made a controlled adaptive response.
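The bifurcation mechanism in the abstract can be illustrated with the pitchfork normal form, where a parameter playing the role of social effort controls whether the undecided state is stable. This is my own toy sketch of the general idea, not Leonard's actual multi-agent model:

```python
def decide(lam, x0=0.01, dt=0.01, steps=2000):
    """Euler-integrate the pitchfork normal form dx/dt = lam*x - x**3.

    x stands in for the group's collective opinion; lam plays the role
    of the social-effort bifurcation parameter from the talk abstract.
    """
    x = x0
    for _ in range(steps):
        x += dt * (lam * x - x**3)
    return x

# lam <= 0: the undecided state x = 0 is stable (deadlock).
# lam > 0:  x = 0 goes unstable and the group commits to +-sqrt(lam).
undecided = decide(-1.0)  # stays near 0
committed = decide(1.0)   # settles near 1
```

Raising `lam` past zero is the "exploiting instability" idea: a small bias in the initial condition gets amplified into a definite collective choice.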
[ CMU RI ]
In this week's episode of Robots in Depth, Per interviews Franziska Kirstein, a human-robot interaction researcher at Blue Ocean Robotics.
Franziska Kirstein talks about her background as a linguist and in human-computer interaction. We get to hear about what works and what doesn't when non-engineer users are tasked with teaching robots different movements. Franziska also describes some of the challenges with kinesthetic guidance and alternative methods that can be used. She then talks about some of the projects she is involved in, including one on robot-assisted health care and one involving social robots.
[ Robots in Depth ]