April 8
I dig robots, especially ones created to serve you. Apparently humans feel the same way. How else to explain these two robots built in canine form? First off, we have the military-industrial complex's vision of man's best friend.
Dubbed BigDog, it is powered by a petrol engine that drives a hydraulic actuation system. Its legs are articulated like an animal’s, and have compliant elements that absorb shock and recycle energy from one step to the next.
BigDog has an on-board computer that controls its locomotion through servos on its legs. The control system manages the dynamics of its behaviour to keep it balanced, as well as to steer it.
On-board sensors measure BigDog's joint positions, joint forces, ground contact, and ground load. They are complemented by a stereo vision system and a laser gyroscope that help with navigation. Other sensors monitor BigDog's internal state: hydraulic pressure, oil temperature, engine temperature, rpm, and battery charge.
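To get a feel for how a legged robot turns gyro readings into leg commands, here is a minimal sketch of a balance loop. Everything in it, the class name, the gains, the hip-servo convention, is my own illustrative assumption; it is not Boston Dynamics' actual controller, just a textbook-style PD loop of the kind the description above hints at.

```python
# Illustrative balance-control sketch: a PD loop that turns a gyro's
# body-tilt reading into a corrective servo command. All names and
# gain values are invented for illustration.

class BalanceController:
    def __init__(self, kp=8.0, kd=1.5):
        self.kp = kp          # proportional gain on body tilt
        self.kd = kd          # derivative gain on tilt rate
        self.prev_tilt = 0.0  # last tilt reading, for the derivative term

    def step(self, tilt, dt):
        """Return a corrective hip-servo command from a tilt reading (radians)."""
        tilt_rate = (tilt - self.prev_tilt) / dt
        self.prev_tilt = tilt
        # Push the legs against the direction of fall, damped by tilt rate.
        return -(self.kp * tilt + self.kd * tilt_rate)

controller = BalanceController()
command = controller.step(tilt=0.05, dt=0.01)  # robot leaning slightly forward
```

Run at a few hundred hertz against real joint-force and gyro data, a loop like this is what "manages the dynamics of its behaviour" boils down to: lean forward, and the controller pushes back.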
And all this time I was using my brain to do the same thing. Silly me.
In separate trials, BigDog has shown that it can run at 4 mph, climb slopes of up to 35 degrees, walk across rubble, and carry a 340 lb load.
Yeah, but can it take a dump in the middle of the living room and survive?
While Boston Dynamics aims its BigDog line at the working-dog class, Sony Electronics targets the lazy home-dog class, of which I am a member, with its Aibo robotic dog.
Sony Electronics has bred the third generation of its Aibo robotic dog to be faster, smarter — and floppy-eared.
Looks like they’re breeding robots doggy style, eh?
The company’s Entertainment Robot America division on Thursday announced that the ERS-7 model of Aibo is more responsive to voice and touch commands than previous models. And with improved infrared sensors, it is better able to avoid walls, obstacles and edges.
I too have become more responsive to voice and touch commands as the years have gone by; however, my sensors for avoiding walls and obstacles are on the fritz. Just feel the bumps on my head for evidence.
The ERS-7 can understand nearly 180 voice commands and, using visual-pattern recognition technology, can find its Energy Station and recharge itself when its battery runs low, the company said. It also features Illume-Face, an LED (light-emitting diode) face panel that lets it better express its feelings, emotions and current conditions.
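The self-recharging behavior described above amounts to a simple priority rule: obey voice commands until the battery gets low, then override everything and go find the Energy Station. Here is a toy sketch of that logic. The threshold, the command list, and the action names are all my own assumptions, not Sony's actual Aibo firmware.

```python
# Toy sketch of the low-battery override described above. The threshold,
# the command vocabulary, and the action names are illustrative
# assumptions only.

LOW_BATTERY = 0.15  # assumed cutoff, as a fraction of full charge

# A tiny stand-in for the ERS-7's ~180-command vocabulary.
KNOWN_COMMANDS = {"sit", "come", "shake", "fetch"}

def choose_action(voice_command, battery_level):
    """Pick the robot's next action from a voice command and battery state."""
    if battery_level < LOW_BATTERY:
        # Self-preservation wins: seek the Energy Station and recharge.
        return "seek_energy_station"
    if voice_command in KNOWN_COMMANDS:
        return voice_command
    return "idle"  # unrecognized command: do nothing

print(choose_action("fetch", battery_level=0.80))  # fetch
print(choose_action("fetch", battery_level=0.05))  # seek_energy_station
```

In the real robot, "seek the Energy Station" is of course the hard part, done with visual-pattern recognition rather than a string; the point here is just the battery-first priority ordering.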
It’s amazing how lifelike they can make these things. I too understand tens of words, use visual-pattern recognition technology to identify treats, and know when to recharge my batteries by sleeping all day. I choose, however, not to show my emotions. That’s a sign of weakness.
The only question humans need to ask themselves is, “When Skynet becomes self-aware in 2010 at 2:14am, will these robots remain man’s best friend?”