In case you haven’t yet managed to find the perfect social robot for your home, this is BIG-i. BIG-i is going to stare at you without blinking until you decide that you want it. Watching, always watching. Seriously though, BIG-i should get your attention if for no other reason than it’s a design that’s completely different (and significantly softer) than anything we’ve seen before. It’s also mobile, with what looks to be simple and useful if-this-then-that-style verbal programming. The Kickstarter just kicked off and has already nearly reached its goal, but if giant eyeballs are your thing (and let’s be honest, everyone has a thing for giant eyeballs), this robot is probably worth a look.
BIG-i is a natural-interaction robot with mobility, 3D vision, voice programming, and active perception. It can manage smart appliances to suit your needs.
BIG-i can act as the bridge between family members. It will improve itself, becoming more and more thoughtful and intelligent through your instructions. Even when you are away, BIG-i will still take care of your loved ones. BIG-i takes charge of the small daily grind so you are free to enjoy every precious moment. It means everyone feels the love and the strong bond of the family.
As we always always always do our best to remind you, this video represents an absolute best-case scenario of how you can expect BIG-i to perform in your house. BIG-i is not a full-fledged product ready to be shipped to you; it’s a project that you’re supporting with your hard-earned money because you think it could be a great thing. To its credit, NXROBO has been posting plenty of progress videos on YouTube, so here are a few examples of how BIG-i looks to be performing currently. This first video shows audio localization and some movement; note the jump cut before the TV control:
And here’s a conversation about weather and news, from a month ago:
Again, props to NXROBO for posting candid videos of its progress. We also spoke with NXROBO’s founder, Dr. Tin Lun Lam (the same guy who invented the Treebot, how cool is that?!), about the project, including how BIG-i compares with other robots and what developers will be able to do with it. Here’s our full interview with him.
IEEE Spectrum: There have been several other social home robots introduced recently. Why did you decide to make your own home robot, and what makes BIG-i unique?
We think that the biggest challenge of a home robot is to handle different user needs and different household environments. We can’t preset all the functions or know everything before we really understand individual user requirements. So we decided to make a robot that aimed at adapting itself to all these differences.
You can teach BIG-i about the basic information of your family, such as recognizing family members and the interior layout of your home. With this information and using the voice programming function, you can simply generate new applications to fit your needs with one sentence. BIG-i can understand your individual requirements and provide specific services. We do not see any robot in the market that has this highly personalized capability.
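NXROBO hasn’t published details of how voice programming works under the hood, but the “one sentence generates a new application” idea maps naturally onto an if-this-then-that rule engine: a spoken sentence gets parsed into a trigger, a condition, and an action. Here’s a minimal sketch of that pattern; all names, the rule format, and the example sentence are our own assumptions, not NXROBO’s actual API.

```python
# Hypothetical sketch of if-this-then-that style voice programming.
# The Rule structure and event names are assumptions, not NXROBO's API.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Rule:
    trigger: str                          # event name, e.g. "person_arrives"
    condition: Callable[[Dict], bool]     # predicate over the event data
    action: Callable[[Dict], str]         # command the robot should issue


class RuleEngine:
    def __init__(self) -> None:
        self.rules: List[Rule] = []

    def teach(self, rule: Rule) -> None:
        """Add a rule, as a parsed spoken sentence might."""
        self.rules.append(rule)

    def handle(self, trigger: str, data: Dict) -> List[str]:
        """Fire every rule matching this trigger and collect its actions."""
        return [r.action(data) for r in self.rules
                if r.trigger == trigger and r.condition(data)]


# "When Dad gets home after 6 p.m., turn on the living-room light."
engine = RuleEngine()
engine.teach(Rule(
    trigger="person_arrives",
    condition=lambda d: d["who"] == "dad" and d["hour"] >= 18,
    action=lambda d: "turn_on:living_room_light",
))

print(engine.handle("person_arrives", {"who": "dad", "hour": 19}))
# → ['turn_on:living_room_light']
```

The appeal of this structure is that each new sentence only appends a rule; nothing about the existing behavior has to be re-programmed, which matches the “improves itself through your instructions” pitch.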
BIG-i looks much different than most other robots. How did you come up with the design?
When designing BIG-i, we did not use any films, books, or existing robots as reference. Instead, we simply looked for a proper way for a robot to exist in a family. When creating a family robot, we have to deal with the relations between the home, the robot, and family members. We therefore wanted to make a robot whose appearance brings a reassuring, warm, and comfortable feeling. That’s why we chose fabric material and a soft cushion for BIG-i’s exterior, and focused on simplicity for the external form design.
We expect that the robot’s integration into the family will be natural and unobtrusive. We’d like to respect the users, to put them at the same level as the people who define the product. We designed the shape of BIG-i by focusing on interaction. Eyes are the most important channel through which creatures communicate, and gaze can carry many different meanings, so we concentrated on the motions of BIG-i’s eye.
What was a difficult problem that you had to solve while creating BIG-i?
Integrating art and technology in one place. We take the appearance of BIG-i as seriously as the technology inside. As a result, we needed to overcome many technical issues that other robots will not need to consider. For instance, we wanted to put all of the sensors inside the eyeball of BIG-i, and as a result, we had to recalibrate all the optics-related sensors to compensate for the distortion. We spent extra time on it, but we think it is worth it.
Can you describe the sensors on BIG-i? What kinds of things will developers be able to get the robot to do?
BIG-i has many sensors for different purposes. For robot safety, BIG-i has sensors at the chassis to avoid falling down stairs, tilting sensors to detect if BIG-i is going to fall, and obstacle sensors to prevent collisions. BIG-i also has environmental sensors such as temperature sensors, humidity sensors, pressure sensors, and light sensors. Developers can use these to determine what kind of actions BIG-i should take.
A 360-degree microphone array is also embedded in BIG-i, such that it can know which directions voice commands come from. Developers can uses the RGB-D camera inside BIG-i’s eye to do a lot of things: it lets you do object tracking and recognition, 3D-map construction, human body motion tracking, and more. Developers can apply their genius and imagination to create many fun and useful applications.
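NXROBO doesn’t describe its localization algorithm, but the basic idea behind a microphone array like this is time-difference-of-arrival: sound reaches the microphones at slightly different times, and the delay reveals the direction. Here’s a minimal sketch of that geometry for a single microphone pair; the spacing and delay values are illustrative assumptions, and a real 360-degree array would combine several pairs.

```python
# Hypothetical sketch of sound-source direction finding from the
# time-difference-of-arrival (TDOA) between two microphones.
# Spacing and delay values below are illustrative, not BIG-i's specs.
import math

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature


def arrival_angle(delay_s: float, mic_spacing_m: float) -> float:
    """Estimate the source angle (degrees from broadside) for a mic pair.

    A source directly ahead hits both mics at once (delay 0, angle 0);
    the larger the delay, the further off-axis the source.
    """
    ratio = SPEED_OF_SOUND * delay_s / mic_spacing_m
    ratio = max(-1.0, min(1.0, ratio))  # clamp numerical noise
    return math.degrees(math.asin(ratio))


# With mics 0.1 m apart, a ~146-microsecond delay puts the source
# near 30 degrees off broadside:
print(round(arrival_angle(0.0001458, 0.1)))
# → 30
```

In practice the delay itself is estimated by cross-correlating the two microphone signals, and combining pairs around a circular array resolves the full 360-degree bearing.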
NXROBO is planning on giving users access to an app store, and if you pre-order a developer edition of BIG-i, you’ll be able to mess around with the SDK, which looks to be ROS-based. If you just want to enjoy the robot, you can pre-order a regular edition for $750 if you hurry, with an estimated delivery date of April 2017.