The Persis student team won third place in the first nationwide student UAV competition held in northwest Iran, organized by Malek Ashtar University of Technology in Urmia in Azar 1392 (November-December 2013).
Tags: mikail shapoory, میکائیل شاپوری, UAVs
A few months ago, we heard rumors that Google was planning something big in robotics. We also heard that Andy Rubin, the engineer who spearheaded the development of Android at Google, was leading this new robotics effort at the company. Rubin, we were told, is personally interested in robots, and now he wants Google to have a major role in making robotics happen. Not just robotic cars, but actual robots. Today, an article in the New York Times has revealed more about Google's plans: according to the article, the company is funding a major new robotics group, and that includes acquiring a bunch of robotics startups, quite a few of which we're familiar with.
You'll definitely want to read the entire New York Times story, where Rubin talks a little bit too vaguely about what Google is actually planning on doing with these as-yet hypothetical robots that they're apparently working on over there, but here's the bit about the acquisitions:
Mr. Rubin has secretly acquired an array of robotics and artificial intelligence start-up companies in the United States and Japan.
Among the companies are Schaft, a small team of Japanese roboticists who recently left Tokyo University to develop a humanoid robot, and Industrial Perception, a start-up here that has developed computer vision systems and robot arms for loading and unloading trucks. Also acquired were Meka and Redwood Robotics, makers of humanoid robots and robot arms in San Francisco, and Bot & Dolly, a maker of robotic camera systems that were recently used to create special effects in the movie “Gravity.” A related firm, Autofuss, which focuses on advertising and design, and Holomni, a small design firm that makes high-tech wheels, were acquired as well.
The seven companies are capable of creating technologies needed to build a mobile, dexterous robot. Mr. Rubin said he was pursuing additional acquisitions.
Some brief highlights:
Industrial Perception spun out of Willow Garage back in March of 2012; read our Startup Spotlight post on them here.
Meka Robotics builds research robots with series elastic actuators in them; they're probably best known for the M1 humanoid (pictured above in front of the Google logo) and Dreamer, which you can read about here.
Redwood Robotics is (was) a collaboration between Willow Garage, SRI, and Meka that was supposedly designing a very low cost robotic arm. We've been asking around and haven't heard much for the last year or so, maybe now we know why.
And of course, there's Bot & Dolly, which uses robot arms for precise and repeatable camera control, making things way more awesome than "precise and repeatable camera control" probably makes you think of.
Obviously, we're curious about what other acquisitions Rubin is pursuing, and more generally, just what Google is actually working on. Fortunately for us, the Google robotics group will at least initially be based right here in Palo Alto, meaning that I'll get a chance to put my spy drones and ninja outfit to good use.
Tags: Google Acquires Seven Robot Companies Wants Big Ro, mikail shapoory
It's that time of year again, when you go out on a huge shopping spree buying all kinds of cool robot gifts for your family, friends, and Secret Santa coworkers. Or maybe you just go out and get something robotic for, you know, yourself. Because you sure deserve an awesome US $12,000 advanced humanoid robot, right? If, however, that's a bit above your budget (it certainly is way above ours!), we have lots of other options for you. And note that we're trying to bring you different stuff (slightly different, at least) than what we featured on our 2012 Robot Gift Guide. So check out our list, and if you think we missed something good, let everyone know in the comments.
On to the gifts!
Anki Drive has the first spot on our list (which is in no particular order) because it's cool, it's fun, you can appreciate it whether or not you really care about robots, and at $200, it's not crazy expensive (as far as robots go). Anki consists of a racetrack mat plus two little robot cars that use your iOS device for brains. Each car can localize on the mat itself, meaning that it drives autonomously, and your input comes in the form of high-level direction and telling it to use various weapons and defensive abilities. Gameplay is supposed to get more strategic and immersive as you get farther into the game, although our hope is that Anki's tech will show up in a wider diversity of products over time. For now, though, you can find it in Apple stores without too much trouble.
Roombas always make good gifts for people who never knew that they were missing a robot in their lives, and the Roomba 880 is the newest, latest, and best. A redesigned cleaning system makes the 880 require a lot less maintenance, even as a more powerful vacuum picks up dirt much more effectively. We also like the slick new look, and in our testing, we feel like it's one of the quietest Roombas yet. It may not be ready to replace an upright vacuum, but it definitely means that you (or whoever you get it for) will have to vacuum a whole lot less.
Robots don't have to be crazy expensive (even though most of them are). Hexbugs are totally cheap, but they're also totally robots, with sensors to detect what's going on in their environment along with the capacity to react to it by changing their behaviors. There are plenty of different ones to choose from (some autonomous and some controllable), with a variety of colorful designs, and they're pretty much all affordable.
Lego Mindstorms EV3
The Mindstorms EV3 kit combines the approachability of Lego with the sophistication of a robotics kit that you can do all sorts of things with. Three servos, a color sensor, an IR sensor, and a touch sensor are paired with USB and WiFi connectivity, all backed by an ARM processor and command and control directly on the robot. Because it's Lego, there's a gigantic community to help you out if you ever get stuck, or help you find inspiration if you're looking for the next thing to build.
AR Drone 2.0 Power Edition
While it's fundamentally the same robot, the "Power Edition" of the AR Drone gives us an excuse to include it in our gift guide again this year, because for the money, this is just about the most fun robot you can possibly buy. It's a cinch to fly straight out of the box, and while it doesn't have much in the way of autonomy built in, it will handle the tricky parts (takeoff, landing, and stable hovering) for you. Best of all, enclosed props make it safe to use around people and even indoors, although we wouldn't recommend trying to get too fancy around anything fragile or expensive.
The newest version of Orbotix' Sphero is even faster and even more capable than the original. It may look mostly the same, but inside there's new hardware that makes it twice as fast as the first Sphero, with a top speed of two meters per second, indoors and outdoors and over water (it floats!). Besides being a robotic ball that you can drive around with your phone, Sphero also comes with all kinds of apps that should keep things interesting, and there are a variety of programming interfaces when you're ready to take things to the next level.
Neato Signature Edition
As much as we like Roombas, the Neato XV Signature has a laser turret on it, and robots with lasers are kinda the best thing ever. The Neato uses this laser turret to map out all of the rooms that it cleans, enabling it to clean everything in just one pass, making it faster and more efficient than a Roomba (although, for the record, whether it actually cleans better is debatable). The Neato is a good idea for someone who might like a Roomba, but might also appreciate the technology inherent in the Neato. The Signature Edition, by the way, features a fancy new color scheme that makes it look significantly more like a ninja.
Romo is a smartphone dock on tank treads, but as soon as you plug your phone into it, it turns into all kinds of other things. It has a customizable personality that you can train to recognize you, or you can let it wander around on its own and get into trouble. By leveraging all of the connectivity and brain power in your smartphone (even the old smartphone that you don't use anymore), you can also use Romo as a full-fledged tiny little telepresence platform. For the best performance, you'll want to give Romo an iPhone 4S or better, but it'll be happy even with just a fourth-gen iPod Touch.
3D Robotics Iris
3D Robotics makes serious quadrotors, but you don't have to have a serious amount of experience to use one of them, thanks to a sophisticated autopilot system that does all of the actual flying for you (if you want it to). Using an included wireless ground station and Android tablet adapter, you can give the 3D Robotics Iris commands to take off, fly a series of waypoints, and then come right back to you and land. Mount a GoPro on the front (which is what the Iris is built for), and you've got a remote camera platform that's ready to go right out of the box.
We had Aldebaran Robotics' Nao on our list last year, and this year we're featuring DarwIn-OP, the impressive little humanoid robot designed by ROBOTIS and Virginia Tech. Darwin is most definitely research-grade, which is just another way of saying that it's a-lot-of-fun-grade, as long as you can afford it. The robot has many of the same capabilities as Nao does, including 6 DOF legs, 3 DOF arms, a 2 DOF neck, cameras, mics, sensors, LEDs, and some awfully cute eyes.
Sony AIBO ERS-7
Our last gift is something that's going to be very, very difficult to find. It's likely also going to be very, very expensive if you do manage to find one, but totally worth it. It's the final generation of AIBO, Sony's robot dog: the ERS-7 (or ERS-7M2 or 7M3). The AIBO is, arguably, still one of the most sophisticated consumer robots that you can buy, even though the very last ERS-7 was released way back in 2005. It can walk, chase objects, recognize people, follow voice commands, charge itself, and even fetch your email and read it to you. It knows over 1,000 English words, and a sophisticated artificial intelligence along with arrays of LEDs in AIBO's face and body let it express emotion. Brand new in 2005, you could buy an AIBO for $1,600, but the price has only gone up. The best deal you can hope for is to find a used one online from someone who has no idea what it is. And even if you find one in less than stellar shape, AIBOs remain popular enough that you can send it off to an AIBO hospital for refurbishment. Our advice is to try Craigslist or eBay, and keep your fingers crossed.
Tags: Mikail Shapoory, 2013 Robot Gift Guide
When iRobot acquired Evolution Robotics (the company behind the Mint cleaning bot) just over a year ago, Evolution CEO Paolo Pirjanian was brought on as iRobot's new CTO. Basically, this means that Paolo's job is to come up with cool new stuff for robots to do, and cool new ways for them to do it.
We got a chance to ask Paolo a few questions at RoboBusiness last month, and we sat down with him and Matthew Lloyd (iRobot's director of communications) to talk about the future of robotics and where he sees iRobot going from here.
IEEE Spectrum: The most exciting thing for us, when we heard that iRobot was acquiring Evolution Robotics, was the idea of localization and navigation. Can you talk about how those technologies might influence the direction in which iRobot is heading?
Paolo Pirjanian: You’re right, navigation is absolutely something that is very important to us and our firm belief is that navigation has reached a level of maturity now that is going to be influencing a lot of our products. We already see it: you have AVA at the high end, and you have Braava at the consumer level, and so navigation is the next wave of capabilities that we are adding to our product portfolio. Specifically, I cannot talk about roadmaps of products, but that’s for sure an area that we are focused on. And we do consider ourselves the leader. With the combination of the high-end navigation technology from iRobot, and the consumer-grade navigation technologies [Evolution Robotics] brought to the table, we are on the leading edge of technology, and these are mature technologies now that are going into products.
We talked to Nancy Dussault Smith a few years ago, right after the Neato XV-11 came out. I was asking her about the technology there: the Neato is a robot that can navigate around rooms in straight lines, and she said "we don’t want to do that because it’s more effective cleaning if you have a random pattern of multiple coverage." How does that reconcile with the idea of navigation being the future, and intelligence being the future?
So, there’s floor care and there’s things beyond floor care. For floor care, Consumer Reports actually just released a test they did, and Roomba, in terms of cleaning, definitely beats everyone. And that’s what iRobot has prided itself on about Roomba: Roomba gets the job done. iAdapt, which is the current diffusion-based technology from iRobot in Roomba, has certain advantages that allow it to do a good job of cleaning, and the evolution of navigation can complement that if combined properly. So we can improve the performance of the robot and speed up coverage and size of the areas we can clean with the same battery usage without compromising cleaning quality. And if you just look at the reports, the reports show that the LG, Samsung, and Neato robots that are doing systematic coverage are getting the task finished faster, but the cleaning quality suffers significantly, and we are not going to let that happen.
That’s something that other companies seem to focus on a lot: speed and efficiency.
This behavior may have been caused by the development of standards for robotic vacuum cleaners. One of the metrics that is used for measuring performance is area covered as a function of time, and so they are optimizing that metric, but as a result, compromising cleaning. And we think cleaning is the key function of the product, and we cannot compromise that. But, at the same time, we believe that, with a combination of the technologies of the two companies, we can keep improving our cleaning technology while improving all the other parameters as well.
As far as cleaning technology goes, how is combining these two companies changing that, since the way that each robot cleans is so different?
Braava's cleaning is complementary to both Roomba and Scooba because it deals with the fine dust and dirt that’s left behind. I would say, from a mechanical, suction, pick-up perspective, iRobot is the leader; we didn't bring anything to the table. The cloth-based mopping and sweeping is something that is complementary and I think it’s a good fit, that way.
When you say that they’re complementary, in your ideal robot-filled world, do you then see people having a Roomba and a Scooba and a Braava all working together in different areas of the house at different times?
We see a lot of customers that buy one product and then buy the second product that they feel is complementary. Usually, the third one is less likely, because Mint [he means Braava!] does both mopping and sweeping, so it depends on the need. Scooba does more like scrubbing, so if someone needs something deeper with dirtier floors, they will probably go with something like a Roomba and Scooba combination. Others that have a lot of hard floors in their home that require day to day maintenance will probably go with Roomba and Mint. And the majority go with one or the other, not the combination.
So do you see the future of these home care robots as everything converging on one robot that’s going to sweep and mop and vacuum? Or is it going to be these teams of smaller, more specialized robots working together instead?
I think that it’s hard to make that prediction, but one thing that we are very much focused on is that we want to make floor cleaning be a seamless task that happens in the background. The analogy I use is the sprinkler system for your lawn: you set it up and all you care about is a green lawn. I never worry about my sprinkler system unless I see a spot that’s getting yellow, and I think floor care is going to go there. We are getting, every year, one step closer to that dream.
Part of the reason I was asking is that I think the latest Roomba, or one of the more recent ones, had the wireless command center. Not something line-of-sight, but actual wireless, so isn't there a lot of potential there for communication between robots and, say, your phone, or other robots?
Right, so this, again, I think it could go either way. It’s hard to speculate whether it could be one robot combined with all of the functionality to clean any kind of floor, or whether you will have a combination of different robots. My prediction would be to start at single purpose robots and then gradually, later on, you’ll start combining functions into the same robot, when efficiencies of cost and technology are maturing. So you’ll probably end up with a little bit of both, that'd be my guess.
A lot of companies are starting to steer away from the term "robot." Are you focused on selling clean floors, or are you willing to say "we’re going to sell you a robot?"
Matt Lloyd: When we first came out with the Roomba, we didn't call it a robot, because there was a bit of trepidation or intimidation. But the media, the consumers, took it back to "robot" for us.
Paolo: Yeah. And that’s funny because when we launched Mint, I purposefully avoided the use of the word "robot" in any of our marketing materials. But now, for iRobot, our core identity is that we are the robot company. Consumers understand that, and I think there is a value associated with that, for them to know that what we do and live and breathe every day is robots. We understand robotics. We may not be a Samsung or a company of that scale, but we are 100 percent focused on robotics.
But when you go down that road, then you're competing with this perception that people have of robots that they get from science fiction and popular culture. Roombas aren't like that. Is that an issue that comes up? That people expect them to be able to do things that they can't do?
Paolo: No, I think we have changed the perception in the market. People do look at Roombas and say "those are robots." Robotic floor cleaners. People understand the notion of single purpose robots now. So we have helped shape the perception also, to ground it into reality.
Matt: Eighty percent of our customers name their products.
Tags: iRobot CTO Paolo Pirjanian Talks Present and Futur, Mikail Shapoory
The theme for this year's International Robot Exhibition (IREX) in Tokyo was "Making a Future with Robot." We're not exactly sure what that means, but we're definitely in favor of it, and here are some of the coolest things that we saw.
There's one caveat with our IREX coverage, and that's the fact that there was a bit of a language barrier going on most of the time. With the exception of some big international robotics companies, there simply wasn't a lot of information available on many of the robots that we saw. We're following up as best we can, but in the meantime, enjoy this highlight video and gallery that we've put together for you.
Tags: Highlights From the International Robot Exhibition, Mikail Shapoory
As amazing as flying robots are, there's a limited amount of useful stuff that they can do today. Oh, they're great for surveillance and inspection, there's potential to use them to deliver stuff, and in some specialized circumstances we've seen them cooperatively building structures. But to really be useful in the way that we've come to expect from robots, they're going to need to be able to move a variety of objects at will, picking them up and putting them down whenever and wherever they need to. We saw some of the first examples of this at IROS, giving a whole new meaning to the term “mobile manipulator.”
The easiest way to put a mobile manipulator on a UAV is to just bolt a robot arm right on there, which is what DLR (the German Aerospace Center) is trying out. They've got a 7-DOF KUKA industrial arm mounted upside-down underneath an autonomous, turbine-powered mini helicopter. Using an on-board vision system, the robot is able to detect and grasp a pole stuck into the ground. This is trickier than it sounds, because even as the helicopter moves the base of the arm, whenever the arm moves, it changes the center of gravity of the system and moves the helicopter as well.
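To see why moving the arm disturbs the helicopter, consider the combined center of gravity of vehicle and payload. The sketch below uses made-up masses and positions (not DLR's numbers): extending an assumed 10 kg payload 1 m forward shifts the CoG of a 50 kg system by 20 cm, which the flight controller then has to compensate for.

```python
def combined_cog(masses, positions):
    """Mass-weighted average position of a set of point masses."""
    total = sum(masses)
    return tuple(
        sum(m * p[i] for m, p in zip(masses, positions)) / total
        for i in range(len(positions[0]))
    )

# Hypothetical numbers: 40 kg helicopter body at the origin,
# 10 kg arm payload hanging 0.5 m below it.
masses = [40.0, 10.0]

# Arm retracted: payload roughly under the body.
cog_retracted = combined_cog(masses, [(0.0, 0.0), (0.0, -0.5)])
# Arm extended 1 m forward: payload mass moves ahead of the body.
cog_extended = combined_cog(masses, [(0.0, 0.0), (1.0, -0.5)])

shift = cog_extended[0] - cog_retracted[0]  # horizontal CoG shift in meters
print(round(shift, 3))  # 10/50 * 1.0 m = 0.2
```

The same mass-ratio argument explains DLR's interest in fully actuated arms: the heavier the payload relative to the vehicle, the larger the disturbance each arm motion induces.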
In terms of practicality, this is simply the very first step, but DLR is looking forward to some heavy lifting:
We think that for many practical applications the usage of a fully actuated arm with a payload of about 10 kg is required. So the setup we presented in this paper is a starting point for practical investigations of these applications and for developing of corresponding technologies.
“First Analysis and Experiments in Aerial Manipulation Using Fully Actuated Redundant Robot Arm,” by Felix Huber, Konstantin Kondak, Kai Krieger, Dominik Sommer, Marc Schwarzbach, Maximilian Laiacker, Ingo Kossyk, Sven Parusel, Sami Haddadin, and Alin Albu-Schaffer from the Institute of Robotics and Mechatronics, DLR, was presented earlier this month at IROS 2013 in Tokyo, Japan.
Tags: UAVs Get a Grip With Full-Size Robot Arms, Mikail Shapoory
When I told the organizers of the FutureMed conference that I couldn't fly to San Diego, Calif., to attend their meeting early this month, I expected that to be the end of the conversation. Instead they came back with an unusual proposition: Since I couldn't be there in the flesh, would I care to attend via robot?
It seemed that the conference would be hosting five of the Beam telepresence robots from Suitable Technologies, the robotics company that absorbed most of Willow Garage this summer. All I'd need to do is complete a training session and I'd be able to wheel a robot avatar through the exhibitor hall, although the main speaker hall would be inaccessible—the crush of conference-goers coming and going from the room would make that space too dangerous for the robot and the humans nearby.
My training session took all of 10 minutes. I was given an account that allowed me to log in to one of the Beams in Suitable Technologies' Palo Alto headquarters, and a cheerful employee named Greg Hamilton instructed me in the basics of steering. It's quite intuitive, and easily managed with a keyboard's arrow keys. The Beam has two cameras, one facing forward and one pointed down at the floor to help the user avoid obstacles.
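The arrow-key steering can be sketched as a simple key-to-velocity mapping for a differential-drive base. Everything here (key names, speeds, the mapping itself) is an illustrative assumption, not Suitable Technologies' actual software:

```python
# Hypothetical arrow-key-to-motion mapping for a telepresence base.
# Each key contributes (forward speed in m/s, turn rate in rad/s).
KEY_TO_TWIST = {
    "up":    (0.4, 0.0),
    "down":  (-0.2, 0.0),
    "left":  (0.0, 0.6),
    "right": (0.0, -0.6),
}

def twist_for_keys(pressed):
    """Sum the contributions of all currently pressed arrow keys,
    so up+left produces a forward arc rather than a pure turn."""
    v = sum(KEY_TO_TWIST[k][0] for k in pressed)
    w = sum(KEY_TO_TWIST[k][1] for k in pressed)
    return v, w

print(twist_for_keys({"up", "left"}))  # (0.4, 0.6)
```

Summing contributions is what makes keyboard driving feel intuitive: holding two keys blends their motions instead of forcing the user to alternate between driving and turning.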
After a few minutes I rolled myself up to a mirror in the office lobby to check myself out. “You’re about 5’3", you weigh about 100 pounds, and you are gorgeous,” said the smooth-talking Hamilton. In the mirror, I saw that the Beam's large screen displayed my face, using the video feed from the webcam in my laptop. Once I'd completed the training session, Hamilton extended his fist toward my front-facing camera. “We don’t shake hands, we just fist bump,” he said.
IEEE Spectrum has tried out telepresence robots before: In 2010, editor Erico Guizzo steered an Anybots QB around the magazine office for a week, trying out the system for remote work. His experience was fairly positive, as the bot allowed him to take part in meetings, initiate informal discussions between colleagues, and generally be a part of office life. The experience might have been unusually smooth, however, since Guizzo was trying out the bot at a technology magazine where colleagues expect such strangeness.
As I found out when I logged onto a Beam parked in the FutureMed conference hall, the experience can be challenging when people aren't accustomed to dealing with robots. The hall was crowded with attendees, many of whom assumed that there was no human presence within my machine, and felt no compunction about pushing past my Beam or blocking its way. A few people did react with surprise to my video face and smiled or waved, but in general I didn't make much progress. The Beam doesn't contain any safeguards in terms of impact-avoidance—I was in full control of the machine, and if I tried to proceed it seemed likely that I'd ram into people, run over their toes, and in general cause havoc.
I decided to Beam back in to the room during a quieter hour, when most of the attendees were listening to a seminar. When I did, I had a very nice interaction with a woman who works for a startup called Wellpepper. The company sells an app to improve the doctor-patient relationship with reminders and advice for the patient, reports for the doctors, and direct communication channels. While I couldn't take the woman's business card or snack on the candy that was set out in a dish, I'd say the experience was comparable to being there in person. Until, that is, I tried to roll on down a corridor and got stopped by a taped-down electrical cable that my bot-self couldn't roll over.
To get a little more info about the Beam's potential to revolutionize conference-going, I next logged back in to a Beam at Suitable Technologies headquarters to interview Scott Hassan, the company's founder and CEO. During our interview (pictured in the photo above) he revealed his dream: He wants to have 10,000 Beams at the 2015 International CES, to allow another 100,000 people to attend the world's preeminent consumer electronics show. "If you can't dream it, you can't do it," he said cheerfully.
So should you try to attend a conference via a robot? At this point I'd say that you won't get the same experience as being there in the flesh, but if you can't attend in person and have a chance to try a telepresence robot, "beaming" yourself into a distant robotic body is worthwhile in itself. And as companies like Suitable improve their robots—and as people get used to them—robot avatars will become a common sight at any conference. Just watch those toes.
Tags: Should I Attend a Conference Via a Telepresence Ro, Mikail Shapoory
Having a robot obey your commands is hard. Having a group of robots obey your commands is really, really hard. The good news is roboticists are making good progress in solving this problem. Research presented at IROS early this month shows how we may be getting to the point where teams of robots can pay attention to what we’re saying, where we’re looking, and what we’re doing, leading to a much more natural way to control them.
Think for a second about how you’d direct a subset of a group of people to do something. You’d probably look at each person in turn, saying something like “you, you, and you” while perhaps pointing to emphasize what you wanted them to do. It’s an intuitive thing for humans, because we convey attention through our gaze, and researchers from Simon Fraser University (which, incidentally, has an awesome bagpipe band) are teaching robots to pay attention to what we pay attention to.
Here’s a video illustrating the concept.
The AR Drones are watching the human with their cameras, and cooperatively using face detection to decide which one of them the human is looking at while listening for specific audio commands. Based on where the human’s attention is focused when verbal commands are issued, the robots can then decide which one of them the human is directing. It works if you want to select multiple robots (one at a time), or single robots, whether or not they’re already in the group.
The photos below show the camera views for the three robots. The robot that the user is staring at has the highest "face score" (left image).
The user gets feedback from the robots by watching their LEDs, which change color when the robot thinks it’s being paid attention to.
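The selection logic described above can be sketched in a few lines: each robot reports a face-detection confidence, and the robot with the highest score above some threshold is the one being addressed. The robot names, scores, and threshold below are illustrative assumptions, not the researchers' actual code:

```python
def select_addressed_robot(face_scores, threshold=0.5):
    """Given per-robot face-detection scores, return the id of the
    robot the user appears to be looking at, or None if no robot's
    score is confident enough."""
    best_id, best_score = None, threshold
    for robot_id, score in face_scores.items():
        if score > best_score:
            best_id, best_score = robot_id, score
    return best_id

# Three drones report face scores from their onboard cameras;
# the user is facing drone_b, so it sees the most frontal face.
scores = {"drone_a": 0.12, "drone_b": 0.87, "drone_c": 0.31}
print(select_addressed_robot(scores))  # drone_b
```

The threshold matters: without it, some robot would always be "selected" even when the user isn't looking at any of them, which is why the LED feedback loop is useful for confirming the system's guess.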
This same basic principle is just as effective with gestures.
The point of all this is to develop an intuitive command system that even an untrained user can quickly pick up on. To that end, future work will allow robots to be grouped into named teams, so that it’ll be possible to easily direct specific robots to join or abandon different groups. And gesture control will be expanded to allow for directional control. Put it all together, and you can imagine a scenario where you might be able to say to a group of six quadrotors: “you and you are Blue Team, go fly over there. You three are Red Team, patrol this area. Oh, and you, you’d better go join Blue Team.” It sounds so easy to do, and thanks to some very hard work by roboticists, we're getting there.
“‘You Two! Take Off!’: Creating, Modifying, and Commanding Groups of Robots Using Face Engagement and Indirect Speech in Voice Commands,” by Shokoofeh Pourmehr, Valiallah (Mani) Monajjemi, Richard Vaughan, and Greg Mori, and "HRI in the Sky: Creating and Commanding Teams of UAVs with a Vision-mediated Gestural Interface," by Valiallah (Mani) Monajjemi, Jens Wawerla, Richard Vaughan, and Greg Mori, all from Simon Fraser University in Burnaby, B.C., Canada, were presented at IROS 2013 in Tokyo, Japan.
Tags: IROS 2013, Robots That Pay Attention To You, Mikail Shapoory
The Uncanny Valley is a topic of much fascination not only in robotics, where it originated, but also in other scientific circles as well as in popular culture. Roboticists often allude to it, and so do computer scientists, psychologists, artists, and media theorists. In 2008, it was mentioned in the TV series "30 Rock." More recently, the Uncanny Valley was used to explain why several animation movies failed, and an Atlantic article referred to it to describe Mitt Romney. The term has also been used to name everything from a literary magazine to a painting of a baboon embracing Nicolas Cage. Some even suggest that the Uncanny Valley has become a meme. But just what is the Uncanny Valley?
At a recent robotics gathering in Japan we had the perfect opportunity to ask that question. "The Uncanny Valley Revisited" was a tribute to Masahiro Mori, the robotics professor who came up with the concept in 1970. The event featured speakers with a wide range of backgrounds. At the end we cornered some of the presenters and asked them to explain the Uncanny Valley in less than a minute. Here's how they did.
Our intrepid explainers are: Minoru Asada, a robotics professor at Osaka University; Ken Goldberg, a roboticist and artist at UC Berkeley; Hiroshi Ishiguro, a robotics professor at Osaka University; Elizabeth Jochum, co-founder of University of Copenhagen's Robot Culture and Aesthetics Research Group; Peter Lunenfeld, a professor of media design at UCLA; Marek Michalowski, co-founder of BeatBots; and Todd Murphey, a professor of mechanical engineering at Northwestern University.
If you think you have a good and short explanation, post it in the comments section below.
Tags: Explain the Uncanny Valley in Less Than 1 Minute, Mikail shapoory
NEC's cute communication robot, Papero, is getting a new lease on life. Japanese electronics giant NEC has announced an initiative called the Papero Partner Program, calling for research and business partners to help develop apps and distribute the robot to end users. Along with this announcement, NEC debuted the Papero Petit, the newest model of the robot.
What exactly is a communication robot? In recent years, NEC has positioned PaPeRo as a robotic assistant for the elderly. In experiments conducted at nursing homes, the robot would pipe up from time to time to remind people of their daily routines, like when to take their medication. It also connected with a pedometer to encourage a more active lifestyle by announcing how far a person walked in a day, and could take measurements like blood pressure.
The new robot, Papero Petit, stands 24 centimeters (9.4 inches) tall and weighs 1.3 kg (2.8 lbs)—about half the size of earlier models. It combines multiple sensors (cameras, ultrasonic range finders, temperature sensor, and microphones) to detect people and look in their direction even in complete darkness. The robot can recognize faces and has an 80 to 90 percent success rate at speech recognition.
But the biggest change is that, unlike earlier versions, Papero Petit is stationary, so it can no longer follow you from room to room like a cute little R2-D2. NEC hopes the robot can be improved with new features by connecting to the cloud to access software and computing resources. Family members could, for example, send text messages to the robot, which would read them aloud to their grandparents. NEC has already developed the app with NTT Docomo, one of Japan's largest telecommunications companies.
NEC is looking for more partners to help develop apps targeting a wide assortment of uses, from home security to health care. The company also wants to find partners to provide Papero to end users, renting the robot on a monthly basis. NEC says the monthly fee will likely be less than 10,000 yen (approximately US $100), and could include NTT Docomo wireless Internet, since many nursing home residents aren't online. NEC hopes to grow the business to 10 billion yen over the next three years.
Papero, whose name stands for Partner-type Personal Robot, was originally developed in 2000, following a prototype a few years earlier. Since then, NEC has improved the robot's software capabilities, revised its hardware, and used it in numerous experiments at care facilities and in smart homes. Despite these ongoing developments, the robot has never been made available to the public. The new business model seeks to change that, and if things work out, it could lead to Papero Petit arriving in homes and stores across Japan.
Tags: NEC Shows Off New Papero Petit Robot, Mikail Shapoory