News about Robotic Sciences
 
 
Scientific articles and news about robotics
 
http://spectrum.ieee.org/img/air-shepherd-drones2-1427730829221.jpg
Source: http://spectrum.ieee.org/automaton/robotics/aerial-robots/drones-wildlife-poachers-africa

Last year, 25,000 elephants were killed by poachers throughout the African continent. In South Africa, more than 1,200 rhinos were slain, a record year. The vast open expanses and dense undergrowth make it easy for illegal hunters to elude the authorities on the ground. Recently, though, conservationists have found a technology they believe could help them turn the tables on poachers: drones.

Anti-poaching groups operating in South Africa have successfully flown drones capable of giving them an “eye in the sky” and allowing patrolling rangers to locate and catch wildlife poachers. But although such missions could potentially save hundreds of animals every week, the drones have been grounded.

Early last year, the South African Civil Aviation Authority (SACAA) announced that flying unmanned aerial vehicles (UAVs) with cameras, for commercial gain, was against the law. The SACAA said it needed time to consider how best to regulate drones, including those used by anti-poaching organizations, in its airspace. The agency expects to make an announcement about the new regulations next month.

Filmmakers have used commercially available drones to capture aerial footage of African wildlife, but deploying them to search for poachers requires specialized technology. According to Eric Schmidt from Wildlife Protection Solutions, a Denver, Colo.-based company that has assisted conservation groups in South Africa, anti-poaching UAV missions need to be planned with the African bush in mind. 

“There’s no landing point and no runway out there,” Schmidt says. “A ranger needs something small enough to fit in a backpack and that will launch in 5 minutes.”

He explains that drones need to fly for long periods and distances, and for that reason fixed-wing models are more suitable than rotor-based machines. They must be able to operate autonomously or via remote control, and for night missions, they need to carry thermal imaging cameras.

From a ground station, drone operators monitor the images, looking for telltale signs of poaching activity: fences cut, trucks following known animal tracks, or the color blue, which in the bush invariably means a pair of jeans, says Schmidt.

Conservationists hope drone technology can help not only locate poachers as they move through their hunting grounds and hiding spots, but also anticipate their next steps. The idea is to predict where poachers and animals will be, identifying known trouble spots before an incident occurs.

A conservation group called Air Shepherd is collaborating with researchers at the University of Maryland to develop a data analytics system that can do just that. 

“We can place rangers and drones in the areas where an incident is most likely to occur,” says Tom Snitch, a visiting professor at the University of Maryland Institute for Advanced Computer Studies.

Snitch explains that the behaviors of poachers and animals follow patterns. “There are more attacks, for instance, between 6:30 and 8 p.m., or when a full moon provides extra light,” he says. 

The drone’s flight plan would be calculated by a computer cluster in Maryland, and sent to a warden in a vehicle equipped with a UAV control station. The warden launches the drone, which then flies fully autonomously.
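To make the data-driven idea more concrete, here is a minimal sketch, in Python, of the kind of heuristic risk scoring Snitch describes: weighting time of day, moonlight, past incidents, and animal density to rank areas for patrols. The weights, field names, and thresholds are all hypothetical illustrations, not the actual Maryland analytics system.

```python
# Illustrative sketch only: a toy patrol-priority score in the spirit of the
# patterns Snitch describes (time of day, moonlight, known hotspots). All
# weights and field names here are hypothetical, not the Maryland system.
from dataclasses import dataclass

@dataclass
class CellObservation:
    hour: float               # local time, 0-24
    moon_illumination: float  # 0.0 (new moon) to 1.0 (full moon)
    past_incidents: int       # historical poaching incidents in this grid cell
    animal_density: float     # relative density of target animals, 0.0-1.0

def risk_score(obs: CellObservation) -> float:
    """Combine a few hand-picked factors into a single patrol-priority score."""
    # The evening window (roughly 18:30-20:00) gets extra weight.
    evening = 1.0 if 18.5 <= obs.hour <= 20.0 else 0.0
    return (
        2.0 * evening
        + 1.5 * obs.moon_illumination
        + 0.5 * obs.past_incidents
        + 1.0 * obs.animal_density
    )

# Rank grid cells and send rangers and drones to the highest-scoring ones.
cells = [CellObservation(19.0, 0.9, 4, 0.7), CellObservation(14.0, 0.9, 1, 0.3)]
patrol_order = sorted(cells, key=risk_score, reverse=True)
```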

http://spectrum.ieee.org/img/air-shepherd-computer-1-1427486836363.png

Air Shepherd has launched a crowdfunding campaign to raise funds for a series of drone missions based on the data analytics approach.

It’s difficult to quantify the precise potential impact of drones across the whole of South Africa. Anti-poaching UAVs had been used for only a short while before the ban, and in many parts of Africa, conservation groups can’t afford drones. Instead, they rely on rangers and trackers to locate and catch the poachers.

Snitch says that using the Maryland system, rangers in one private nature reserve managed to stamp out poaching altogether. Previously, that reserve averaged nine killings a month.

“This doesn’t mean we’re catching nine poachers a month,” Snitch says. “We just created a deterrent, so the poachers moved somewhere else.”

 


Tags: Robot, Drones
|+| Posted on 2015/3/31 at 10:51 AM by Mikail |

 
Source: http://spectrum.ieee.org/view-from-the-valley/robotics/diy/march-madness-at-stanford-robots-shoot-and-dunk

As college basketball teams around the country began playing their way down to the Final Four, Stanford engineering students had their own March Madness, complete with a cheering crowd and heated competition. This year, Stanford’s annual battle-of-the-bots, a competition held since 1995 as part of a mechatronics class, went basketball-crazy, with the theme “scoring machines.”

Inspired by this year’s Golden State Warriors players Steph Curry and Klay Thompson, the students set out to build robots capable of shooting balls into baskets as efficiently as those basketball stars.  (Thompson recently set an NBA record of 37 points in a quarter; Curry had a 50-point game.)

Each robot had two minutes, starting from a random position at the end of the miniaturized court, and could score 1, 2, or 3 points for each ball in a basket, depending on the distance from the “ball inbounding zone” to the selected basket.

The 34 teams had 22 days to build their robots. The winning team went for the long shot (literally), and its robot made 18 three-point shots, good for 54 points. A close competitor sunk 20 two-point shots (40 points), according to Stanford mechanical engineering professor and basketball fan Thomas Kenny, who ran this year’s project.

The students competing included sophomores, juniors, seniors, and graduate students, from mechanical engineering, electrical engineering, computer science, civil engineering, aero/astro, engineering physics, math, philosophy, business, and even fine arts.  Says Kenny: “We find that students from all across the university are seeking some of these skills to enable them to work on the problems that they care about and find the jobs that they want.”

See some of the action in the video above.

Correction to photo credit made 30 March.


Tags: Stanford, Robot
|+| Posted on 2015/3/31 at 10:48 AM by Mikail |

http://spectrum.ieee.org/img/grapheneblog-1427380918381.jpg


Source: http://spectrum.ieee.org/nanoclast/semiconductors/materials/3d-hybrid-supercapacitor-made-with-graphene

By combining sheets of graphene with a traditional battery material, scientists have created hybrid supercapacitors that can store as much charge as lead acid batteries but can be recharged in seconds compared with hours for conventional batteries.

Supercapacitors now play an important role in hybrid and electric vehicles, consumer electronics, and military and space applications. However, they are often limited in terms of how much energy they can store.

Now researchers at the University of California, Los Angeles, have developed a hybrid supercapacitor that is based on graphene, which is made of single layers of carbon atoms. Graphene is flexible, transparent, strong and electrically and thermally conductive, qualities that have led to research worldwide into whether the material could find use in advanced circuitry and other devices.

The scientists combined graphene with manganese dioxide, which is widely used in alkaline batteries and is both abundant and environmentally friendly. The manganese dioxide formed microscopic flowers made of flakes only 10 to 20 nanometers thick. The supercapacitors also incorporated electrolytes that can operate at high voltages.

The graphene provides a highly conductive structure for the manganese dioxide that is also very porous, helping ensure that more of the manganese dioxide can undergo electrochemical reactions. The resulting 3-D hybrid supercapacitor has an energy density of up to 42 watt-hours per liter, superior to many commercially available supercapacitors and comparable to lead acid batteries, the researchers said. Furthermore, the new supercapacitors can provide power densities of up to roughly 10 kilowatts per liter, about 100 times that of high-power lead acid batteries and 1,000 times that of a lithium thin-film battery. The new devices also retained their energy capacity over 10,000 charge-discharge cycles.
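As a rough sanity check on the "recharged in seconds" claim, the quoted energy and power densities imply an ideal full recharge time of about 15 seconds. The snippet below just works through that arithmetic; it ignores charger limits and losses.

```python
# Back-of-the-envelope check of the "recharged in seconds" claim, using only
# the figures quoted above (42 Wh/L energy density, ~10 kW/L power density).
energy_density_wh_per_l = 42.0   # watt-hours per liter
power_density_kw_per_l = 10.0    # kilowatts per liter

# time = energy / power; convert hours to seconds
recharge_time_s = energy_density_wh_per_l / (power_density_kw_per_l * 1000.0) * 3600.0
print(f"Ideal full recharge time: {recharge_time_s:.0f} s")  # roughly 15 s
```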

The scientists demonstrated they could integrate their supercapacitor with solar cells for efficient solar energy harvesting and storage. They also noted their supercapacitors can be assembled in air without the need for the expensive dry rooms needed for manufacturing today's supercapacitors. They detailed their findings online March 23 in the journal Proceedings of the National Academy of Sciences.


Tags: Supercharges, Hybrid, Supercapacitor
|+| Posted on 2015/3/31 at 10:14 AM by Mikail |

Source: http://spectrum.ieee.org/automaton/robotics/robotics-hardware/self-folding-printable-origami-robot


 

At the IEEE International Conference on Robotics and Automation (ICRA) last year, Harvard's Sam Felton introduced us to his printed, self-folding inchworm robot. With some external infrastructure and the addition of a motor, the inchworm could autonomously transform from a flat sheet to a crawling robot by folding itself into a 3D structure with flexible joints.

Today, Felton and colleagues from Harvard and MIT are publishing a new paper in Science featuring a much more complex self-folding robot that can go from flat to folded and walking in four minutes without any human intervention at all.

Let's be clear about what's autonomous here, and what's not: the folding process by which the robot changes from flat to less flat is completely autonomous, and this includes the ability to begin walking on its own, as shown in the video above.

But the layered structure of the robot takes a lot of work to prepare (printing, bonding, laser cutting, and so on), and the motors, batteries, and some electronic components all need to be installed by hand—a series of steps that takes almost 2 hours.

That's a very long and complex process [see illustration below], but what's most relevant is to think about how much of it can be made autonomous. The researchers say that "the assembly time could be substantially reduced and completely automated with the use of pick-and-place electrical component assembly machines and automated adhesive dispensers." And if that's the case, we're looking at potentially very cheap, easily mass-producible robots.

 http://spectrum.ieee.org/img/origami-robot-folding-1407425909804.png

 

 

The secret sauce that allows this robot to assemble itself from a flat sheet is a very carefully computed folding design, combined with a structure composed of resistive circuits embedded in a flexible PCB between layers of paper and heat-activated shape-memory polymer (the "PSPS" in the above image). This sandwich is laser cut where the joints will be, and when the embedded resistors heat up, the shape-memory polymer around them contracts.

Depending on where and how the cuts are made, this contraction can result in permanent (when cooled) controlled bending of up to 120 degrees in either direction. Flexible joints (like hinges) come from cutting out both the paper and the PSPS, leaving just the flexible circuit board to connect two structural elements. And by combining flat elements, rigid bends, and flexible joints, you can create complex linkages that can translate (say) the rotary motion from a motor into the cyclical motion of a set of legs. (DASH is an excellent example of this.)

http://spectrum.ieee.org/img/origami-robot-motor-1407426107285.png

The motor and alignment mechanism of the robot: (A) The linkages are fabricated in plane with the composite, and the crank arms are oriented upward. (B) The legs and linkages fold into position, and the alignment tab folds into place. (C) The motor rotates 180°, pushing the crank arm pin into the alignment notch. (D) The locking tab folds over the pin, coupling the pin to the linkage. In (C) and (D) the obscuring linkage is displayed in outline only for clarity.

The self-folding process with shape-memory composites: (A) The self-folding shape-memory composite consists of five layers: two outer layers of PSPS, two layers of paper, and a layer of polyimide (PCB) bearing a copper circuit in the middle. Cutting a gap into the upper paper layer allows controlled folding of the polyimide, and slits in the bottom layers of paper and PSPS prevent antagonistic forces. (B) A structural hinge, designed to fold once when activated and then become static. (C) When activated, the PSPS on the concave side pulls the two faces together, bending the polyimide along the hinge. (D and E) A dynamic hinge, designed to bend freely and repeatably. (F) A self-folding crawler built with the shape-memory composite. This robot includes both (G) self-folding and (H) dynamic hinges.

The really hard part here is the design of the origami structure (which is done by a computer program) and the manufacturing of the composite sheet. Once the composite is completed, the final assembly step just involves attaching the batteries, motors, and microcontroller to the robot using a 3D printed motor mount and some screws. Still, arriving at the right design required over 40 iterations of the robot's structure.

The microcontroller took care of sending current through the series of embedded resistive traces at the right time, to ensure that the folding process took place in the correct order. A particularly clever bit is how the motors attach to the structure of the robot, using tabs that sequentially fold to align the motor and then lock it into place [see image above].
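As an illustration of what such sequencing can look like, here is a minimal Python sketch that heats a few resistive traces one after another. The channel numbers, timings, fold order, and the stand-in GPIO layer are assumptions for the example; this is not the authors' actual firmware.

```python
# Minimal sketch of sequenced fold activation. Channel numbers, timings, and
# fold order are assumptions for illustration; the GPIO layer is a stand-in.
import time

class HeaterChannel:
    """Stand-in for a GPIO pin that switches current through one resistive trace."""
    def __init__(self, channel: int):
        self.channel = channel
    def on(self):
        print(f"channel {self.channel}: heating (shape-memory polymer contracting)")
    def off(self):
        print(f"channel {self.channel}: cooling (fold locks into place)")

# Each entry: (trace driving one fold group, heating time in seconds).
FOLD_SEQUENCE = [
    (HeaterChannel(0), 60),  # e.g., body folds first
    (HeaterChannel(1), 45),  # then leg linkages
    (HeaterChannel(2), 30),  # finally motor-alignment and locking tabs
]

def run_folding_sequence():
    """Heat each group's trace in order so folds happen in the correct sequence."""
    for trace, duration in FOLD_SEQUENCE:
        trace.on()
        time.sleep(duration)
        trace.off()

run_folding_sequence()
```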

Overall, creating a structural fold using this technique had a success rate of about 97 percent. The researchers built three robots and were successful with just one; in each of the two failures, a single hinge did not fold with the necessary precision. The robot itself can walk at 5.4 centimeters per second (0.43 body lengths per second) and turn at about 320 degrees per second.


Tags: Self Folding Origami Robot Goes From Flat to Walki
|+| Posted on 2014/8/11 at 17:56 by Mikail |

Source: http://spectrum.ieee.org/automaton/robotics/aerial-robots/autonomous-quadrotor-flight-based-on-google-project-tango

 


 

Early this year, Google unveiled its Project Tango smartphone, a mobile device equipped with a depth sensor, a motion tracking camera, and two vision processors that let the phone track its position in space and create 3D maps in real time. The device is particularly useful for robots, which have to navigate and locate themselves in the world. Indeed, a video showed how Google and its partners were putting the smartphone on different kinds of robots, including mobile platforms and manipulator arms.

Now researchers at the University of Pennsylvania led by Professor Vijay Kumar are taking things one step further. After getting a Tango device from Google, they put it on one of their quadrotors and let it loose inside their lab.

Kumar says that a big challenge for researchers working with flying robots is not building them but rather developing hardware and software capable of making them autonomous. Many robots use GPS for guiding themselves, or, when flying indoors, they rely on motion tracking systems like Vicon and OptiTrack, which offer great accuracy but require that you install sensors on walls and ceilings.

A device capable of localizing itself in space without GPS or external sensors, as the Tango phone does, opens new possibilities for flying robots. Kumar says that the Google device is remarkable because it lets you "literally velcro it to a robot and have it be autonomous."

Giuseppe Loianno, a PhD student in Kumar's group, has made a video showing their initial tests with the device. In the first part of the video, Loianno sets the quadrotor to hover at a fixed position and then perturbs it by moving it around, but the drone promptly returns to the starting point. Next Loianno commands the drone to go to different places in the room and, even if disturbed, the drone recovers and stays on its programmed path.

Kumar says the only measurement from the Tango phone is its pose, which is its position plus orientation with reference to a starting coordinate system (captured at a rate of 30 Hz), and the only other sensor used is the IMU onboard the drone. (The laptop is not controlling flight autonomy in any way; it's only used to send a desired trajectory to the drone and to render a visualization of its position in space. And the quadrotor is a machine that Kumar's group designed and built with off-the-shelf components.)
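To show how an external pose estimate like Tango's can close a position loop, here is a toy proportional-derivative position-hold sketch running at the 30 Hz pose rate. The gains and structure are invented for illustration; this is not the UPenn group's controller, just the general idea of feeding pose back into flight control.

```python
# Toy illustration of position hold from an external pose estimate (30 Hz, as
# with Tango) plus a PD law. Gains are made up; not the UPenn controller.
import numpy as np

KP, KD = 4.0, 2.5   # proportional and derivative gains (assumed values)
DT = 1.0 / 30.0     # pose updates arrive at roughly 30 Hz

def position_hold_command(pose_xyz, prev_error, target_xyz):
    """Return a desired acceleration command and the updated position error."""
    error = np.asarray(target_xyz) - np.asarray(pose_xyz)
    d_error = (error - prev_error) / DT
    accel_cmd = KP * error + KD * d_error  # fed to the attitude/thrust loop
    return accel_cmd, error

# Example: the drone has been perturbed 0.5 m away from its hover point.
cmd, err = position_hold_command([0.5, 0.0, 1.0], np.zeros(3), [0.0, 0.0, 1.0])
```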

The researchers now plan to study Tango's accuracy of localization (and compare it to external motion tracking systems), but from their initial tests they estimate the accuracy to be within a centimeter. If that proves to be the case (and if Tango can be made cheap enough), it will be an impressive capability for the Google device, which could revolutionize how mobile robots and drones navigate indoor spaces.

Kumar says that the convergence of computation, communication, and consumers has a huge potential for the robotics industry, and a device like Tango is a key advance because it's "lowering the barrier to entry for autonomous robots."


Tags: This Quadrotor Uses Google's Project Tango to Fly Autonomously, mikail shapoory, Robot
|+| Posted on 2014/5/23 at 21:30 by Mikail |

One night Majnun broke off his prayer / and sat, without ablution, in Layla's alley
That night love had made him utterly drunk / and freed him from the cup of the covenant of Alast
He bowed down at the edge of her doorway / and his sighing heart filled up with Layla
He said: O Lord, why have You brought me so low? / Why have You hung me on the cross of love?
You placed Layla's cup in my hand / and in this game You dealt me defeat
You drive the lancet of her love into my soul / my pain is from Layla, yet it is me You strike
I am weary of this love; do not leave my heart bleeding / I am already Majnun; do not madden me further
I am no longer a man for this game / here is You, here is Your Layla ..... count me out
He said: O madman, I am your Layla / I am in your veins, hidden and plain
For years you endured Layla's cruelty / I was at your side, and you did not know Me
I cast Layla's love into your heart / and lost a hundred wagers of love in one throw
I made you a wanderer of the desert, and still nothing / I thought you would come to your senses, but you did not
I burned with longing for a single "O Lord" from you / yet nothing but "Layla" ever left your lips
Day and night you called her name, but... / tonight I saw you were with Me, so I said: Yes...
I was certain you would come to Me / that you would knock at the door of My house
Now this Layla who had brought you low, / whose lesson of love had left you restless:
be true to her path and I will make you a king / I will lay a hundred like Layla down along your way

|+| Posted on 2014/5/17 at 23:24 by Mikail |

Source: http://spectrum.ieee.org/automaton/robotics/aerial-robots/north-dakota-first-faa-approved-drone-test-site


Whether or not the U.S. Federal Aviation Administration has the authority to approve or regulate or monitor or oversee or whatever they want to do about unmanned aerial systems (UAS), the fact is that more and more people are flying these systems everywhere.

Technically, you're not supposed to fly drones out of visual range, more than 400 feet in the air, or closer than five miles to any sort of controlled airspace (including the Class B airspace that's in place over most urban areas), without getting an experimental airworthiness certificate (which specifically precludes carrying cargo) and applying for a Certificate of Waiver or Authorization (COA).

Most people, of course, do not pay any attention to any of this whatsoever, because flying drones is cheap, easy, and fun, and everybody is doing it, so why worry?

The worry, of course, is that some drone somewhere is going to crash and hurt someone or damage property or unlawfully spy on someone or, much worse, collide with a manned aircraft and cause an accident. This could lead to an immediate and harsh crackdown that sets drone use back.

The FAA is not clueless about any of this, and while the fact that they're a government bureaucracy doesn't make them the most nimble organization in existence, they're at least trying to push for some progress toward intelligent and flexible UAS guidelines.

At the end of last year, the FAA announced the selection of six UAS research and test sites that'll be used to figure out how to properly regulate drones, and the first of these, in North Dakota, will be open for flight operations next month.

The FAA picked North Dakota because it was the only potential area to offer "a test range in the Temperate (continental) climate zone and included a variety of different airspace which will benefit multiple users."

By the summer of this year, drones will have access to two separate areas: one at North Dakota State University's Carrington Research Extension Center, and one in Sullys Hill National Game Preserve. The first UAS to get tested out at the North Dakota site will be the Draganflyer X4-ES, mounting a multispectral imaging system from Tetracam for soil quality assessment and crop health mapping.

The main goal of this site’s initial operations is to show that UAS can check soil quality and the status of crops in support of North Dakota State University/Extension Service precision agriculture research studies. Precision agriculture is one of many industries that represent areas for significant economic opportunity and UAS-industry expansion.
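Crop-health mapping from multispectral imagery is commonly done with vegetation indices such as NDVI, computed from the near-infrared and red bands. The sketch below shows the standard formula; it is a generic example, not the Tetracam/NDSU processing chain, and the band values are placeholders.

```python
# Generic example of turning multispectral frames into a crop-health map via
# the normalized difference vegetation index (NDVI). Not the actual
# Tetracam/NDSU pipeline; just the standard formula on placeholder data.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red); healthy vegetation scores closer to 1."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / np.clip(nir + red, 1e-6, None)

# Two single-band images captured by the drone (values here are made up).
nir_band = np.array([[0.6, 0.7], [0.2, 0.8]])
red_band = np.array([[0.1, 0.1], [0.15, 0.1]])
health_map = ndvi(nir_band, red_band)  # higher values = denser, healthier canopy
```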

While supporting the precision agriculture project, the Northern Plains Unmanned Aircraft Systems Test Site also will collect safety-related operational data needed for UAS airspace integration. The information will help the FAA analyze current processes for establishing small UAS airworthiness and system maturity. Maintenance data collected during site operations will support a prototype database for UAS maintenance and repair.

Agriculture is likely to be one of the first areas in which drones might be able to establish themselves as a viable business, which is part of the reason the FAA is starting there. Companies like SenseFly already have mature systems ready to go: the eBee Ag will do fire-and-forget multispectral crop imaging right out of the box, and the only thing holding it back right now is FAA regulations. Such systems are already being used (quite successfully) in Europe, while we're all waiting around here in the United States twiddling our thumbs.


Tags: mikail shapoory, Robot, First FAA-Approved Drone Test Site Goes Live Next Week in No
|+| Posted on 2014/5/13 at 23:04 by Mikail |

Source: http://spectrum.ieee.org/automaton/robotics/robotics-hardware/japanese-quadruped-robot-pneupard


Last year, Boston Dynamics' Wildcat quadruped robot managed to "escape" its lab in spectacular style, galloping in a parking lot at up to 25 km/h (16 mph). But Wildcat is just one of a growing pack of quadrupeds under development in robot laboratories around the world.

Another example, hailing from Osaka University, is Pneupard, a biologically-inspired quadruped robot powered by pneumatic muscles. When we last saw this robot over a year ago, it was far from complete. Now a new version, equipped with all four limbs, is taking its first steps.

To make the robot walk, the Osaka University researchers had to individually fine-tune the air pressure of each muscle as the robot moved its legs. You would think that the more muscles you add to the legs, the more capable the robot becomes. That is true, but the trade-off is that the more muscles you have to manage, the more complex and time-consuming the job becomes.

The original version of Pneupard that the team designed had a lot of pneumatic muscles and controlling the robot became a huge challenge. So the researchers decided to adopt a "lean and mean" approach to its anatomy and built a second version with fewer muscles. This made it easier to control the machine and explore different gaits.

The work takes place at the laboratory of Prof. Koh Hosoda and the team includes Andre Rosendo, Shogo Nakatsu, Xiangxiao Liu, and Masahiro Shimizu.

The researchers say the new approach allowed them to shift focus from individual limbs to their collective behavior. In other words, now they can study how the legs work together to create stable motions and efficient gaits. In one of their first experiments, the 4.8-kilogram (10.5-pound) robot can be seen walking on a treadmill, daintily lifting its mechanical paws with each step, helped by a supporting shaft that prevents it from falling over sideways.

"Our belief is that locomotion is created not only by the brain, but also the brainstem, spine, and muscles, which also 'have a say' on how the body moves," Rosendo, the project leader, tells IEEE Spectrum in an email.

As an example of how the brain is not the only entity responsible for body movements, Rosendo mentions the famous case of Mike the Headless Chicken, a chicken that allegedly lived for 18 months after its head had been cut off. He adds that bioinspired robots let researchers explore how locomotion works without having to use live animals. While far from perfect models of their biological counterparts, these robots may reveal important clues by exhibiting similar principles.

For example, in the video above, note that the robot does not require a complex brain, nor does it use any ground feedback or sensors (it's an "open loop" experiment, in control systems parlance). It walks through the physical interaction of its skeletal structure and muscles, which are controlled by a simple, rhythmic controller called a central pattern generator, or CPG.
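For a sense of how simple an open-loop CPG can be, here is a toy example: four phase-shifted oscillators, one per leg, mapped to pneumatic-muscle pressure setpoints. The frequency, phase offsets, and pressure range are invented for illustration; this is not Pneupard's actual controller.

```python
# Toy central pattern generator: four phase-shifted oscillators, one per leg,
# mapped to muscle pressure setpoints. Frequency, phases, and pressure range
# are assumed values; this is not Pneupard's actual controller.
import math

FREQ_HZ = 1.0                              # gait frequency
LEG_PHASES = [0.0, math.pi, math.pi, 0.0]  # trot-like phase offsets (FL, FR, HL, HR)
P_MIN, P_MAX = 0.1, 0.5                    # muscle pressure bounds in MPa (assumed)

def muscle_pressures(t: float) -> list[float]:
    """Open-loop pressure setpoints for the four legs at time t (seconds)."""
    pressures = []
    for phase in LEG_PHASES:
        s = 0.5 * (1.0 + math.sin(2.0 * math.pi * FREQ_HZ * t + phase))  # 0..1
        pressures.append(P_MIN + s * (P_MAX - P_MIN))
    return pressures

# Sample the rhythm over one second at 100 Hz (no sensory feedback anywhere).
trajectory = [muscle_pressures(i / 100.0) for i in range(100)]
```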

The team has already installed some force sensors on the robot's "paws" (not used in the video), which should help further their work with CPGs, as well as gait transitions and the contribution of the limb stretch-reflex.

Now that Google has bought out Boston Dynamics, it's hard to say what the future holds for its quadruped robots, and we wonder if we'll ever see them again. At least we can still look forward to more cat-like action from the team at Osaka University and also from groups at MIT and EPFL, both of which are working on feline robots of their own.


Tags: Mikail Shapoory, mikail shapoory, Robot, Quadruped Robot Pneupard Takes Its First Steps
|+| Posted on 2014/4/25 at 10:11 AM by Mikail |

Source: http://spectrum.ieee.org/automaton/robotics/aerial-robots/kmels-hexrotors-put-on-autonomous-musical-spectacular



We love KMel Robotics because they're a fantastic example of how it really is possible to take robots straight out of a research environment and use them to do awesome stuff that also (we assume) is to some extent commercially viable. This is an incredibly hard jump to make for any company, but KMel has done it in style, and their latest performance piece has a large swarm of hexrotors playing (and controlling) a symphony of musical instruments.

Make sure to stick around until the end for the finale on this one.

KMel Robotics presents a team of flying robots that have taken up new instruments to play some fresh songs. The hexrotors create music in ways never seen before, like playing a custom single string guitar hooked up to an electric guitar amp. Drums are hit using a deconstructed piano action. And there are bells. Lots of bells.

Just a reminder about how this all works: each one of these robots has its position measured down to the millimeter (or better) at a very high frequency using an external localization system. The individual hexrotors themselves are being centrally controlled by a computer somewhere else, sort of like a puppeteer system. It takes a lot of coordination and precision to put on a show like this, but as we well know, these KMel guys are pros.

You can catch this performance live, for free, at the USA Science and Engineering Festival in Washington, D.C. this weekend.


Tags: Mikail Shapoory, mikail shapoory, Robot, KMel's Hexrotors Put on Autonomous Musical Spectacular
|+| Posted on 2014/4/25 at 10:00 AM by Mikail |

Source: http://www.livescience.com/44264-new-xprize-for-artificial-intelligence.html


Imagine a day when a form of artificial intelligence could deliver a speech as compelling as one given by a human.

A nonprofit organization called XPRIZE, which designs competitions to encourage the development of innovative technology for the benefit of humanity, announced it will award a prize to anyone who can develop an artificial intelligence, or "AI," that could give an inspiring talk at the TED (Technology, Entertainment, Design) conference without any human assistance.

"Advances in machine learning and artificial intelligence have made extraordinary progress over the past decade, but we've barely scratched the surface," Peter Diamandis, chairman and CEO of XPRIZE, said in a statement. Diamandis and Chris Anderson, curator of TED, announced the prize Thursday (March 20) at the TED2014 Conference in Vancouver, British Columbia.

The prize organizers are requesting input from the public about the talk's topic and length, whether the AI should be a physical robot or a disembodied voice, how the competition should be judged and whether the prize should be awarded to the first team to meet certain criteria or as part of an annual competition. More information is available on the prize's website.

"We're entering a future in which humans and machines must learn new ways to work with each other," Anderson said in a statement. "I predict that within a few years, we'll be blown away by what artificial intelligences can do." [Super-Intelligent Machines: 7 Robotic Futures]

XPRIZE, founded in 1995, organizes high-profile prizes in five areas: learning, exploration, energy and environment, global development, and life sciences.

Current prizes include the $30 million Google Lunar X PRIZE for safely landing a private spacecraft on the moon, the $10 million Qualcomm Tricorder XPRIZE and the $2.25 million Nokia Sensing XCHALLENGE for portable health care sensing, and the $2 million Wendy Schmidt Ocean Health XPRIZE to understand ocean acidification.


Tags: mikail shapoory, Mikail Shapoory, Robot
|+| Posted on 2014/4/20 at 10:10 AM by Mikail |
 