Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next two months; here's what we have so far (send us your events!):
IEEE CASE 2017 – August 20-23, 2017 – Xi'an, China
IEEE ICARM 2017 – August 27-31, 2017 – Hefei, China
IEEE RO-MAN – August 28-31, 2017 – Lisbon, Portugal
CLAWAR 2017 – September 11-13, 2017 – Porto, Portugal
FSR 2017 – September 12-15, 2017 – Zurich, Switzerland
Singularities of Mechanisms and Robotic Manipulators – September 18-22, 2017 – Johannes Kepler University, Linz, Austria
ROSCon – September 21-22, 2017 – Vancouver, B.C., Canada
IEEE IROS – September 24-28, 2017 – Vancouver, B.C., Canada
RoboBusiness – September 27-28, 2017 – Santa Clara, Calif., USA
Drone World Expo – October 2-4, 2017 – San Jose, Calif., USA
Let us know if you have suggestions for next week, and enjoy today's videos.
Prior to deployment, robots must be extensively trained and tested. With physical prototypes, this can be expensive and impractical: creating the complete environment a robot will interact with can be unsafe or very complex, and modeling all possible interactions between a robot and its surroundings can be highly time consuming. The Isaac robot simulator addresses these challenges by providing an AI-based software platform that lets teams train robots in highly realistic virtual environments and then transfer that knowledge to real-world units.
Isaac is built on an enhanced version of Epic Games' Unreal Engine 4 and uses NVIDIA's advanced simulation, rendering and deep learning technologies. Working within this virtual environment, developers can set up extensive test scenarios using deep learning training, and then simulate them in minutes -- which would otherwise take months to perform. Once a simulation is satisfactorily complete, the information can be quickly transferred to real-world robots. Developers can then iterate and tweak the robot testing methodology, trading intelligence between the two environments. Since simulations within Isaac are highly realistic and can be performed quickly, fewer adjustments are needed to the final product compared to traditional development.
As you can tell, the simulation is sophisticated enough to accurately incorporate robot sounds. Impressive. While NVIDIA's web pages on this project all feature hockey-playing PR2s for some reason, it looks like Isaac is intended to help train a Baxter instead, because look at those arms, right? We're looking forward to seeing how well this works in practice, since there's always the danger of relying too much on simulation, and then having the physical robot whack you upside the head for your trouble.
[ NVIDIA ]
iCub needs to clear a stuffed octopus off of the table, but it can't reach! Oh noes! Luckily, iCub is clever enough to use a tool to help it complete the task:
The goal of this video is to show the ability of iCub to incorporate tools in an autonomous manner, and use them for everyday activities such as cleaning a table. Under the hood, it demonstrates the ability to apply previously learned models of tool affordances, tool classification from vision, automatic tool-pose detection, object segmentation, and full/empty hand classification to achieve its task.
Dongjun Lee from Seoul National University wrote in to share some of the aerial manipulation robots his lab has been working on:
Typical multi-rotor drones (e.g., quadrotors) are under-actuated, i.e., they cannot translate without rotating or rotate without translating. This is fine for aerial photography (with a gimbal), but not for aerial manipulation (e.g., the drone cannot maintain a pushing or contact task while holding its attitude when a sideways wind blows, particularly since the attached arm typically has only a few degrees of freedom so as not to further strain the already-tight payload budget). By using quadrotors attached to a mainframe via passive spherical joints as rotating thrust generators, this SmQ (Spherically-connected multiple Quadrotor) system is fully actuated (e.g., it can resist sideways wind without tilting) and also backdrivable (e.g., impedance control is possible for compliant interaction). The SmQ system is modular and can provide a high-level position/force control loop, all desirable for real deployment and commercialization. The limited range of commercial spherical joints is also addressed by proper control design.
Using optimally aligned and distributed rotors and reversible ESCs, the ODAR (Omni-Directional Aerial Robot) system can generate any translation/rotation motion and any force/torque vector simultaneously and independently, which is crucial for aerial manipulation in 3D space. With design optimization to address the tight weight-thrust margin of current rotor and battery technologies, along with proper control design, the ODAR system can demonstrate such "real" manipulation capabilities as 1) a downward pushing force greater than 6 kg (much larger than its own 2.6 kg weight) and 2) peg-in-hole teleoperation with a radial tolerance of only 0.5 mm, both unprecedented among other aerial manipulation systems (e.g., drone-manipulators).
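The under-actuation point is easy to quantify with a back-of-the-envelope sketch (the numbers below are illustrative, not from SNU's papers): a quadrotor's thrust is fixed along its body z-axis, so resisting a sideways force requires tilting, whereas a fully actuated platform like SmQ or ODAR can allocate a full 6-D wrench directly.

```python
import math

def quadrotor_tilt_for_lateral_force(f_lateral, mass, g=9.81):
    """Tilt angle (rad) an under-actuated quadrotor needs so the
    horizontal component of its single body-z thrust cancels a lateral
    force while the vertical component still supports its weight."""
    return math.atan2(f_lateral, mass * g)

def quadrotor_thrust_magnitude(f_lateral, mass, g=9.81):
    """Total thrust (N) required once the vehicle has tilted."""
    return math.hypot(f_lateral, mass * g)

# Illustrative numbers: a 2.6 kg vehicle resisting a 5 N sideways gust.
tilt = quadrotor_tilt_for_lateral_force(5.0, 2.6)
print(f"quadrotor must tilt {math.degrees(tilt):.1f} degrees")
# A fully actuated design instead solves w = A u for rotor commands u,
# where A maps rotor thrusts to a full 6-D force/torque wrench, so the
# same lateral force is produced at zero tilt.
```

Any nonzero sideways force forces a nonzero tilt, which is exactly why a quadrotor can't hold its attitude while pushing on something.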
[ SNU ]
The latest gymnastics robot from hinamitetu performs on the balance beam, meaning that the robot is female (obviously):
Just like with real gymnasts, it quite often doesn't go exactly how you'd like:
[ hinamitetu ]
Raytheon demonstrates an excellent reason not to fly a drone anywhere near a military base:
In case you were wondering, one single Stinger missile costs about $38,000.
[ Raytheon ]
Fetch Robotics presents more from the Weaponized Plastic Fighting League, which is just like BattleBots except affordable and funnier. Keep an eye out for a weaponized bristlebot, and for Diminutive Diplodocus, which has the cutest name and may also win the award for shattering into the most separate pieces.
[ Fetch Robotics ]
It's going to take some useful applications and real-world success for me to really get behind social home robots, but I can still appreciate the uniqueness of BIG-i's design. And now it can auto-dock to recharge, so that's good.
[ NXROBO ]
From Harvard's SSR Lab:
Our underwater robot design uses low-cost magnet-in-coil actuators, which have a small profile and minimal sealing requirements. This allows us to create a small (10 cm) robot with multiple flapping fin propulsors that independently control robot motions in surge, heave, and yaw. The robot is designed to form the basis for underwater swarm robotics testbeds, where low cost and ease of manufacture are critical, and 3D maneuverability allows testing complex coordination inspired by natural fish schools.
This 32-second video, taken in September 2001, shows two NASA F/A-18s performing a flight test over California's Mojave Desert during the Autonomous Formation Flight (AFF) program at Dryden Flight Research Center (now Armstrong). AFF is intended to allow aircraft to fly in close formation over long distances using advanced positioning and controls technology. It uses Global Positioning System satellites and inertial navigation systems to position two or more aircraft in formation with an accuracy of a few inches. This capability is expected to yield fuel efficiency improvements.
[ NASA Armstrong ]
A PAL Robotics TIAGo, modified by the Technical University of Munich, was one of the competitors at RoboCup@Home this year:
[ PAL Robotics ]
In this final project of the WVU Mobile Robotics course, students were asked to program a SMART robot to find magnets (simulated landmines) taped to the lab floor.
[ WVU IRL ]
After following the curriculum to create an underwater Remotely Operated Vehicle (ROV) from the SeaPerch kit, student teams earn the right to compete nationally by winning their local and regional SeaPerch Challenges. These teams will gather to test what they have learned by putting their SeaPerch ROVs through a series of underwater challenges, and will showcase their design in a juried poster competition.
Wow. Remember being kid-level excited about things? These kids deserve it, though, and congrats to the winners, Team Giapac 101 from Puerto Rico!
[ SeaPerch ]
During his spring internship [at Fetch Robotics], Michael Janov worked on two electrical engineering projects. In the first project, he developed the printed circuit board assembly (PCBA) and initial firmware for a prototype accessory board that allows easier integration of hardware devices with Freight and FetchCore. One of the challenges of hardware projects is the time it takes to have the hardware made and assembled. While waiting for the PCBA to come back, Michael evaluated several off-the-shelf wireless charging modules.
Based on the rest of the video description, I'm guessing that Fetch would very much like me to mention that they have openings for interns right now.
[ Fetch ]
Ten years ago, researchers thought that getting a computer to tell the difference between a cat and a dog would be almost impossible. Today, computer vision systems do it with greater than 99 percent accuracy. How? Joseph Redmon works on the YOLO (You Only Look Once) system, an open-source method of object detection that can identify objects in images and video -- from zebras to stop signs -- with lightning-quick speed. In a remarkable live demo, Redmon shows off this important step forward for applications like self-driving cars, robotics and even cancer detection.
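The talk explains this better, but the "only look once" part boils down to one forward pass producing a fixed-size grid tensor, after which detection is just decoding that tensor. Here's a toy sketch with a made-up layout loosely based on the original YOLO; real YOLO also converts box coordinates back to image space and applies non-max suppression:

```python
import numpy as np

# Toy decode of a YOLO-style output tensor: one network pass emits, for
# every cell of an S x S grid, B boxes (x, y, w, h, confidence) plus C
# shared class probabilities. All numbers are made up for illustration.
S, B, C = 7, 2, 3  # grid size, boxes per cell, number of classes

# Stand-in for a network output: zeros everywhere except one confident
# box planted in grid cell (3, 4).
output = np.zeros((S, S, B * 5 + C))
output[3, 4, 4] = 0.9          # confidence of box 0 in that cell
output[3, 4, B * 5 + 2] = 0.8  # probability of class 2 in that cell

def decode(output, threshold=0.5):
    """Return (row, col, box_index, class_index, score) detections."""
    detections = []
    for row in range(S):
        for col in range(S):
            cell = output[row, col]
            class_probs = cell[B * 5:]
            cls = int(np.argmax(class_probs))
            for b in range(B):
                conf = cell[b * 5 + 4]  # per-box objectness confidence
                score = float(conf * class_probs[cls])
                if score > threshold:
                    detections.append((row, col, b, cls, score))
    return detections

dets = decode(output)
print(dets)  # one detection: cell (3, 4), box 0, class 2, score 0.9 * 0.8
```

Since the whole image is processed in a single pass rather than with thousands of sliding-window evaluations, this style of detector can run at video rates, which is what the live demo shows off.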
[ TED ]
Hero of Deep Learning Andrew Ng interviews Hero of Deep Learning Pieter Abbeel, for a new series of videos on Heroes of Deep Learning:
There are six more of these interviews in the playlist below.
[ YouTube ]
This is the 2017 RoboCup Standard Platform World Championship Final between B-Human and Nao-Team HTWK:
[ Nao-Team HTWK ]
Here's the RoboCup 2017 3D simulation league final between magmaOffenburg (Offenburg University of Applied Sciences, Germany) and UT Austin Villa (University of Texas at Austin, USA).
You can tell it's a simulation because the robots move quickly and can accurately kick the ball long distances, and it's not particularly realistic in other ways either. But it's a little more active than the real world, right?
[ UT Austin Villa ]