Video Friday: ANYmal Robot, Jibo Unboxing, and Anki Overdrive Fast & Furious

ANYmal quadruped robot from ANYbotics
Image: ANYbotics via YouTube

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next two months; here’s what we have so far (send us your events!):

ROSCon – September 21-22, 2017 – Vancouver, B.C., Canada
IEEE IROS – September 24-28, 2017 – Vancouver, B.C., Canada
RoboBusiness – September 27-28, 2017 – Santa Clara, Calif., USA
BtS Maritime Robotics – October 1-8, 2017 – Biograd na Moru, Croatia
Drone World Expo – October 2-4, 2017 – San Jose, Calif., USA
HAI 2017 – October 17-20, 2017 – Bielefeld, Germany

Let us know if you have suggestions for next week, and enjoy today’s videos.


This video shows some impressively robust autonomous rough-terrain locomotion from ANYmal, but it’s mostly worth watching for the epic ANYmal back massage (!) at the end:

We present an architecture for rough-terrain locomotion with quadrupedal robots. All sensing, state estimation, mapping, control, and planning runs in real-time onboard the robot. The method is implemented on the quadrupedal robot ANYmal and we present experiments of climbing stairs, steps, and slopes and show how the robot can adapt to changing and moving environments.

By Péter Fankhauser, Marko Bjelonic, Takahiro Miki, Tanja Baumann, C. Dario Bellicoso, Christian Gehring, and Marco Hutter.

[ ANYmal ]


Some of you have been waiting a long, long, long time for this: an actual Jibo unboxing and setup video!

Jibos are shipping now for early Indiegogo backers. I’m not sure how Sean got his first, but it’ll be interesting to see what the typical level of performance is.

[ Jibo ]


The Lockheed Martin Advanced Test High Energy Asset (ATHENA) prototype laser weapon system proved that an advanced system of sensors, software and specialized optics can deliver decisive lethality against unmanned aerial vehicle threats. In tests conducted at White Sands Missile Range in New Mexico, ATHENA destroyed five Outlaw unmanned aerial systems in August 2017.

They may have been Outlaws, but I still feel bad for those drones :(

[ Lockheed Martin ]


Anki is on a roll. After launching Overdrive a few years ago and, more recently, Cozmo and Code Lab, the San Francisco robotics company is announcing a new version of Overdrive today: Fast & Furious Edition, to be released September 24. It’s now available for pre-order for $170 at Amazon, Best Buy, and other retailers.

Anki OVERDRIVE pits you against Artificially Intelligent Supercars in a pulse-pounding game of speed and strategy. Anki OVERDRIVE: Fast & Furious Edition merges the best elements of Anki’s hit robotic battle-racing system with the adrenaline-fueled world of the Fast & Furious. Take control of two high-octane vehicles from the movies - Dom’s Ice Charger and Hobbs’ MXT - and battle alongside (or against) members of the Toretto crew. Whether you’re battling A.I. opponents or friends, your tactical options are unlimited. Players can also expand their Fast & Furious battles with Anki OVERDRIVE Supercars, Supertrucks, and Expansion Kits!

[ Anki Overdrive: Fast & Furious ]


Canadian robotics company Kinova is introducing a mobile manipulator, based on its Jaco arms. It’s designed as a research platform and will launch next year.

MOVO BETA is a mobile manipulator platform designed to aid researchers and enable the discovery of innovative approaches and applications for mobile manipulation. Kinova has been a leader in the robotics space for more than a decade and is best known for its JACO robotic arm. MOVO, scheduled to launch in 2018, will be the first of Kinova’s mobile manipulator product line.

[ Kinova MOVO ]


Simone Giertz, always full of ideas that we would call “good” if that wouldn’t be opening us up to some sort of lawsuit:

[ YouTube ]


The Matternet Station is the third and final technology component for enabling Matternet’s vision of distributed, peer-to-peer logistics networks. It is integrated with Matternet’s autonomous M2 Drone and Matternet’s Cloud platform to provide an intuitive user interface for sending and receiving packages through Matternet.

The Matternet Station occupies a small footprint of approximately 2 square meters and can be installed at ground or rooftop locations. It is equipped with technology that guides the Matternet M2 Drone to precision landing on the Station’s platform. After landing, the Station locks the drone in place and automatically swaps its battery and payload. A user is able to send a package to another location by simply scanning it into the Matternet Station, or receive a package from the Station by scanning a QR code. Each Station comes with its own automated aerial deconfliction system that manages drone traffic over the Station.

[ Matternet ] via [ TechCrunch ]


Knightscope has unveiled slightly more of its K7 robot, a very impractical-looking concept for a mobile (as in, off-road) security drone:

In unrelated news, Knightscope would also like you to know that the last day to buy pre-IPO shares in their company is October 10th. And if you invest $10,000, Knightscope will reward you with an “Audio Robot Selfie” to share a message of your choice on social media.

[ Knightscope ]


Automatica is a music video by Nigel Stanford featuring a bunch of Kuka robot arms playing instruments rather violently:

It’s very cool, but there’s obviously a bunch of special effects and CGI, so are the robots really playing all the instruments? Here’s a “behind the scenes” clip showing tests Stanford did with bass and drums. 

[ Nigel Stanford ]


This is the final from the second best RoboCup soccer league (mid-size), with Tech United squaring off against Water:

The best league is small size, of course, but I haven’t been able to find a nicely edited video of the final. Here’s an unedited version, though, which is still fun to watch:

Seer Robotics Club beat ER-Force 2-1.

[ RoboCup 2017 ]


Researchers at Columbia Engineering have solved a long-standing issue in the creation of untethered soft robots whose actions and movements can help mimic natural biological systems. A group in the Creative Machines lab led by Hod Lipson, professor of mechanical engineering, has developed a 3D-printable synthetic soft muscle, a one-of-a-kind artificial active tissue with intrinsic expansion ability that does not require an external compressor or high voltage equipment as previous muscles required. The new material has a strain density (expansion per gram) that is 15 times larger than natural muscle, and can lift 1000 times its own weight.

To achieve an actuator with high strain and high stress coupled with low density, lead author of the study Aslan Miriyev, a postdoctoral researcher in the Creative Machines lab, used a silicone rubber matrix with ethanol distributed throughout in micro-bubbles. The solution combined the elastic properties and extreme volume change attributes of other material systems while also being easy to fabricate, low cost, and made of environmentally safe materials.

After being 3D-printed into the desired shape, the artificial muscle was electrically actuated using a thin resistive wire and low power (8 V). It was tested in a variety of robotic applications where it showed significant expansion-contraction ability, being capable of expanding up to 900 percent when electrically heated to 80 °C. Via computer controls, the autonomous unit is capable of performing motion tasks in almost any design.

[ Nature Communications ] via [ Columbia ]


While I’m not sure it’s necessarily “revolutionizing flight,” the Plimp (under development by Egan Airships) is an interesting hybrid between a blimp and a drone. It’s not lighter than air, but rather uses a helium envelope for lift augmentation, helping with payload and also making sure that crashes are a little bit tamer than they would be otherwise:

From Popular Science:

And the ultimate dream is to move beyond a stable, safe drone into a new form of human transportation. The Egan brothers gushed about the possibility of a 150-foot-long Plimp shuttle that could take 12 people from Seattle’s airport to the Microsoft campus in six minutes (a journey of just under 16 miles, as the Plimp flies), or a cargo and transport mission that could deliver medicine across the Amazon, where cars and boats can’t go, needing only a small clearing to land.

[ Plimp ] via [ PopSci ]


“RoboThespian wants to plunder your booty.” Oh, er, good...?

I think we all learned something today.

[ RoboThespian ]


Audi used KUKA’s highest-payload robot, a KUKA KR 1000 titan, to create spectacular cinematography by attaching an Audi A8 to the end of the KR titan and projection mapping on top of it, synced with gigantic screens behind it.

[ Kuka Titan ]


The Robotic Systems Lab and the Autonomous Systems Lab are competing as a team at the ERL Emergency Robotics 2017 in Piombino, Italy.

Aww, ANYmal looks so happy frolicking on the beach!

[ ERL Emergency Robots ]


At the University of Texas at Arlington Research Institute (UTARI), Nao is learning to perform Shakespeare (among other things) to help keep elderly folks engaged and entertained:

Ok but OMG WHAT WAS IN THE SNACK FOOD

[ UTARI ]


This video demonstrates the payload (weight carrying) capability, ability to go over small bumps, and total runtime of an AmigoBot mobile robot from Omron Adept MobileRobots. AmigoBot is a small, easy-to-use mobile robot ideal for college and university robotics and computer science teaching and research.

[ Omron Adept ]


This week’s CMU Robotics Institute Seminar comes from assistant professor David Held, on Robots Learning to Understand Environmental Changes:

Robots today are typically confined to operate in relatively simple, controlled environments. One reason for these limitations is that current methods for robotic perception and control tend to break down when faced with occlusions, viewpoint changes, poor lighting, unmodeled dynamics, and other challenging but common situations that occur when robots are placed in the real world. I argue that, in order to handle these variations, robots need to learn to understand how the world changes over time: how the environment can change as a result of the robot’s own actions or from the actions of other agents in the environment. I will show how we can apply this idea of understanding changes to a number of robotics problems, such as object segmentation, tracking, and velocity estimation for autonomous driving as well as various object manipulation tasks. By learning how the environment can change over time, we can enable robots to operate in the complex, cluttered environments of our daily lives.

[ CMU RI ]



Automaton

IEEE Spectrum’s award-winning robotics blog, featuring news, articles, and videos on robots, humanoids, drones, automation, artificial intelligence, and more.
Contact us:  e.guizzo@ieee.org

Editor
Erico Guizzo
New York City
Senior Writer
Evan Ackerman
Washington, D.C.
 
