Sunday 26 May 2013

FYI: Can A Bionic Eye See As Well As A Human Eye?

Previously blind patients who receive the recently FDA-approved Argus II bionic eye system will regain some degree of functional sight. The retinal implant technology, developed and distributed by Second Sight, can improve quality of life for patients who have lost functional vision due to retinitis pigmentosa, a disease that causes retinal cells to die. But the implant doesn’t facilitate a sudden recovery of 20/20 vision.

Just as cochlear implants don’t have the sensitivity to accurately relay the complex mix of frequencies in music, bionic vision through the Argus II will be closer to a very grainy black-and-white film than an HD movie.

The multi-piece system starts with a digital camera mounted on a pair of eyeglasses. Images from the camera are translated into data by a miniature computer and sent via wireless transmitter to a computer chip on the side of the eyeball. From there, the chip activates an eyelash-thin electrode array implanted on the surface of the retina, which then stimulates retinal cells to send visual information to the brain.
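
One way to picture the processing chain is that each camera frame gets reduced to a coarse grid of brightness values, one per electrode. Below is a minimal sketch of that idea, assuming a 6x10 electrode layout (roughly the "50 and 60 pixels" Second Sight cites later in this piece) and simple block averaging; the actual video processor's algorithm is Second Sight's own and is not public.

```python
import numpy as np

def frame_to_electrode_grid(frame, rows=6, cols=10, levels=8):
    """Reduce a grayscale camera frame (2-D array, values 0-255) to a
    coarse brightness grid with one value per electrode.

    Illustrative only: assumes a 6x10 electrode layout and simple block
    averaging; Second Sight's actual video processing is proprietary.
    """
    h, w = frame.shape
    grid = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            block = frame[r * h // rows:(r + 1) * h // rows,
                          c * w // cols:(c + 1) * w // cols]
            grid[r, c] = block.mean()
    # Quantize to a handful of stimulation levels (shades of gray)
    return np.round(grid / 255 * (levels - 1)).astype(int)

# Example: a synthetic 480x640 frame that is dark except for a bright
# region on the right, such as a doorway
frame = np.zeros((480, 640))
frame[:, 400:] = 255
print(frame_to_electrode_grid(frame))
```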

While the prosthesis sounds complex, it doesn’t compare to the intricacy of the natural human eye. Shawn Kelly, a scientist at Carnegie Mellon University, is currently working on his own retinal implant system that includes the same basic technology as the Argus II. He explains that, with the technology available now, the electrode array isn’t finely tuned enough to produce the same detail and clarity as a fully functional human retina.

“It’s not going to be anything like normal vision,” Kelly says, describing the expected results of his implant. “It’s going to be very difficult to convey colors, and it’s going to be difficult to convey the type of visual resolution that we have in the center of our vision.”

The eye’s retinal cells, commonly known as rods and cones, feed three opposing receptor channels that detect contrast between light and dark, red and green, and blue and yellow. Electrical signals from the retinal cells travel through the optic nerve to the brain to form a complete picture of what the eye is looking at.

Instead of trying to imitate all of that activity, Second Sight’s Argus II implant focuses on stimulating retinal cells to show light-dark contrast. For a blind patient using an artificial retina, being able to detect the relative darkness of a hole or an approaching wall is far more important to navigating independently than knowing whether that wall is painted red or blue.

Brian Mech, vice president of business development at Second Sight, reports that in clinical trials, all of the company's 30 patients had some degree of vision improvement after receiving the retinal implant. “Basically they see black and white, shades of gray and they’re getting somewhere between 50 and 60 pixels of information on average,” Mech says.

In terms of acuity, the patient with the best result went from no useful sight to 20/1260 vision. This ratio means that to get the same amount of information a healthy human eye can discern from 1,260 feet, this patient would need to be standing 20 feet away.
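
As a quick worked example of how to read that Snellen fraction (the arithmetic here is ours, not the company's):

```python
def snellen_to_decimal(test_distance_ft, reference_distance_ft):
    """Convert a Snellen fraction like 20/1260 to decimal acuity."""
    return test_distance_ft / reference_distance_ft

patient = snellen_to_decimal(20, 1260)  # ~0.016
normal = snellen_to_decimal(20, 20)     # 1.0
print(f"patient acuity: {patient:.4f}")
print(f"factor coarser than 20/20: {normal / patient:.0f}x")  # ~63x
```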

Mech admits this level of acuity is limited. “[It] is still pretty poor considering normal vision is 20/20, but it’s pretty amazing when you’re completely blind to begin with,” he says. But he explains that as research and testing continues, there is plenty of room to improve the system’s performance just by working on the external technology. Even color vision is a possibility with improvements to the video processor and wireless transmitter. “I don’t want to make it sound too easy,” Mech says. “There’s a lot of work to be done before we can do color vision, but the good news is that they won’t need a new implant for it.”

Second Sight is currently developing more advanced retinal implants, but the timeline is still unclear. Mech says it will be five to seven years or more before the next generation of implants becomes a reality.

Google And NASA Team Up In Quantum Artificial Intelligence Lab

We believe quantum computing may help solve some of the most challenging computer science problems, particularly in machine learning. Machine learning is all about building better models of the world to make more accurate predictions. If we want to cure diseases, we need better models of how they develop. If we want to create effective environmental policies, we need better models of what’s happening to our climate. And if we want to build a more useful search engine, we need to better understand spoken questions and what’s on the web so you get the best answer.

So today we’re launching the Quantum Artificial Intelligence Lab. NASA’s Ames Research Center will host the lab, which will house a quantum computer from D-Wave Systems, and the USRA (Universities Space Research Association) will invite researchers from around the world to share time on it. Our goal: to study how quantum computing might advance machine learning.

Machine learning is highly difficult. It’s what mathematicians call an “NP-hard” problem. That’s because building a good model is really a creative act. As an analogy, consider what it takes to architect a house. You’re balancing lots of constraints -- budget, usage requirements, space limitations, etc. -- but still trying to create the most beautiful house you can. A creative architect will find a great solution. Mathematically speaking, the architect is solving an optimization problem, and creativity can be thought of as the ability to come up with a good solution given an objective and constraints.
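
In the standard form implied by that analogy, the architect’s task is a constrained optimization problem: choose the design $x$ that minimizes a cost $f$ (or maximizes beauty) while satisfying constraints $g_i$ such as budget and space:

$$
\min_{x \in \mathcal{X}} f(x) \quad \text{subject to} \quad g_i(x) \le 0, \qquad i = 1, \dots, m
$$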

Classical computers aren’t well suited to these types of creative problems. Solving such problems can be imagined as trying to find the lowest point on a surface covered in hills and valleys. Classical computing might use what’s called “gradient descent”: start at a random spot on the surface, look around for a lower spot to walk down to, and repeat until you can’t walk downhill anymore. But all too often that gets you stuck in a “local minimum” -- a valley that isn’t the very lowest point on the surface.
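
Here’s a minimal sketch of that failure mode on a toy one-dimensional landscape (illustrative only, not Google’s code): descent from different random starting points gets stuck in different valleys, and most of them aren’t the global minimum.

```python
import math
import random

def landscape(x):
    """A toy hilly surface: a gentle bowl with ripples, so there are
    many local valleys besides the one true lowest point."""
    return 0.1 * x * x + math.sin(3 * x)

def gradient_descent(f, x0, lr=0.01, steps=5000, eps=1e-6):
    """Walk downhill along a numerical gradient until the steps run out."""
    x = x0
    for _ in range(steps):
        grad = (f(x + eps) - f(x - eps)) / (2 * eps)
        x -= lr * grad
    return x

random.seed(1)
for x0 in [random.uniform(-10, 10) for _ in range(5)]:
    x = gradient_descent(landscape, x0)
    print(f"start {x0:6.2f} -> stuck at x = {x:5.2f}, f(x) = {landscape(x):5.2f}")
```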

That’s where quantum computing comes in. It lets you cheat a little, giving you some chance to “tunnel” through a ridge to see if there’s a lower valley hidden beyond it. This gives you a much better shot at finding the true lowest point -- the optimal solution.
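
A quantum annealer can’t be reproduced in a few lines, but its classical cousin, simulated annealing, captures the same escape-the-valley intuition in code: occasionally accept an uphill move, with a probability that shrinks as a “temperature” cools. The sketch below is a rough classical stand-in for tunneling, not the D-Wave machine’s actual physics:

```python
import math
import random

def landscape(x):
    """Same toy hilly surface as in the previous sketch."""
    return 0.1 * x * x + math.sin(3 * x)

def simulated_annealing(f, x0, steps=20000, temp0=5.0):
    """Sometimes step uphill, with probability exp(-delta/temperature),
    so the search can cross ridges early on and settle down later."""
    x, best = x0, x0
    for i in range(steps):
        temp = temp0 * (1 - i / steps) + 1e-9  # cool toward zero
        candidate = x + random.gauss(0, 0.5)
        delta = f(candidate) - f(x)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            x = candidate
        if f(x) < f(best):
            best = x
    return best

random.seed(1)
x = simulated_annealing(landscape, x0=9.0)
print(f"annealing from x = 9.00 -> x = {x:5.2f}, f(x) = {landscape(x):5.2f}")
```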

We’ve already developed some quantum machine learning algorithms. One produces very compact, efficient recognizers -- very useful when you’re short on power, as on a mobile device. Another can handle highly polluted training data, where a high percentage of the examples are mislabeled, as they often are in the real world. And we’ve learned some useful principles: e.g., you get the best results not with pure quantum computing, but by mixing quantum and classical computing.

Can we move these ideas from theory to practice, building real solutions on quantum hardware? Answering this question is what the Quantum Artificial Intelligence Lab is for. We hope it helps researchers construct more efficient and more accurate models for everything from speech recognition, to web search, to protein folding. We actually think quantum machine learning may provide the most creative problem-solving process under the known laws of physics. We’re excited to get started with NASA Ames, D-Wave, the USRA, and scientists from around the world.

NASA Inspects Ion Engine Prototype For Asteroid-Hauling Rocket

Photo caption: An ion propulsion thruster, developed at the Jet Propulsion Laboratory, under consideration for NASA's Asteroid Retrieval Initiative. This thruster uses xenon ions. Credit: NASA/JPL-Caltech

When NASA sends a rocket out to tow an asteroid into Earth orbit, it'll be ion propelled.

Ion propulsion engines use electric and magnetic fields to accelerate charged particles and create thrust, instead of depending on chemical combustion the way conventional rocket engines do. NASA engineers are developing such an engine for the agency's plan to bring an asteroid into Earth's orbit and then send astronauts there to study it. NASA Administrator Charles Bolden went to see a prototype engine yesterday at the Jet Propulsion Laboratory in Southern California, the Associated Press reported.

Bolden also praised the bring-it-to-us asteroid study plan. Originally, the Obama Administration wanted to send astronauts out to an asteroid in its native orbit, but bringing an asteroid here instead will be cheaper. The technologies the Asteroid Retrieval Initiative develops will help NASA shuttle people to Mars in the future, Bolden told the Pasadena Sun.

Ion thrusters are more fuel-efficient and last longer than chemical engines, characteristics the U.S.'s asteroid-hauling rocket will need, John Brophy, an electric propulsion engineer at the Jet Propulsion Laboratory, told the Pasadena Sun. NASA began researching the technology in the 1950s and first flew it as a spacecraft's primary propulsion on Deep Space 1, launched in 1998.
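
The fuel-efficiency gap can be made concrete with the Tsiolkovsky rocket equation. The numbers below are assumptions for illustration (a specific impulse of roughly 3,000 seconds for a xenon ion thruster versus roughly 450 seconds for a high-end chemical engine, and a made-up 5 km/s maneuver), not mission figures:

```python
import math

G0 = 9.81  # standard gravity, m/s^2

def propellant_fraction(delta_v_m_s, isp_s):
    """Tsiolkovsky rocket equation: the fraction of initial mass that
    must be propellant to deliver a given delta-v."""
    return 1 - math.exp(-delta_v_m_s / (isp_s * G0))

DELTA_V = 5000  # m/s -- an assumed maneuver, not the mission's actual budget
for name, isp in [("chemical, Isp ~450 s", 450), ("ion, Isp ~3000 s", 3000)]:
    frac = propellant_fraction(DELTA_V, isp)
    print(f"{name}: {frac:.0%} of starting mass is propellant")
```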

The Asteroid Retrieval Initiative will cost an estimated $2.6 billion. It should have people on an asteroid in six to 10 years. The project's leaders are looking to bring in an asteroid that's 20 to 25 feet in diameter—something that would burn up in the Earth's atmosphere in the event that it strays too close to home.

Saturday 25 May 2013

The Nissan Nuvu is a concept for Nissan's next-generation city car, built around a rear-mounted electric motor and a lithium-ion battery pack. The design is unusual and compact, and for city use the Nuvu's performance is more than adequate. A dozen small solar panels span the all-glass roof, and the cabin is made largely from natural, organic, and recycled materials. The steering is very direct, for quickness and maneuverability in town; crowded roads and limited parking slots are no problem for this city car of tomorrow.


NUVU: RESHAPING THE CITY

Within just a few years, cities all over the world will be at near bursting point. If mankind wants to retain the level of personal mobility it currently enjoys – and if the city is to survive – the only way forward is for a radical rethink of the type of cars driven there. One solution could be a car like Nuvu, designed for the city of the not-too-distant future.

“Nuvu is literally a ‘new view’ at the future of the city car. It is electric, of course, but as far as Nissan is concerned, for tomorrow’s city cars that is a given. No, the most important aspect of Nuvu is the interior design which provides great comfort and space in an intelligent package designed to make best use of our crowded roads and limited parking slots.”

François Bancon, General Manager, Exploratory and Advance Planning Department, Product Strategy and Product Planning Division, Nissan Motor Co., Ltd.

At a glance

- 2+1 seating in compact 3m package
- Unique platform for Nuvu
- Zero emissions from EV drivetrain
- Drivetrain previews production EV due soon
- X-By-Wire control for all dynamic functions
- Extensive use of natural, organic and recycled materials
- An urban oasis complete with its own tree inside, which…
… provides shade for the interior, and
… generates solar energy via its ‘leaves’

Thursday 23 May 2013

This Exploit Lets Strangers Hack Into Your Google Glass And See What You See

Google Glass, a wearable computer that sees everything you see, has attracted its fair share of privacy concerns. Is it okay to use Glass in a bathroom? Should you ask a stranger's permission before taking a photo or video?

But those are all questions about the privacy of the people around the Google Glass wearer. Not much thought has been given to the wearers themselves. And maybe that's a mistake.

Part of the promise of Google Glass is in its flexibility; it'll rely on skilled developers to take advantage of the possibilities of an always-on wearable computer. Google's counting on that. But with flexibility comes hackability, and one "security consultant," otherwise known as a white hat hacker, has figured out a way to hack Google Glass so that he can see what the Glass wearer sees.

Jay Freeman, who goes by Saurik, is a security consultant who's best known as the father of Cydia, a sort of alternate app store for jailbroken iPhones. If you want to overclock your iPhone, use it as a wireless hotspot without paying your carrier extra, or play Super Nintendo games on it, Cydia is for you. Freeman's Glass hack requires that the Glass be put in "debug" mode, but that doesn't take much. According to Ars Technica, which has a great piece on the subject, Freeman can pick up the Glass hardware and set it in this mode without the user ever knowing. And it doesn't take much to then get unfettered access to everything on that user's Glass; says Freeman, "This exploit is simple enough that you can pull it off with just a couple files, and without any specialized tooling."

After you've gotten access, you can install software that'll deliver any photos or voice recordings taken by the device. It's not a seamless hack; the way it currently works would wipe Glass of everything it had recorded until then, which would be pretty obvious to a user putting it back on. But that could well change, and Freeman recommends that Google institute some kind of protection--a PIN, perhaps--to stop this from happening.

John McGinnis thinks ordinary families would rather skip the airport and fly themselves. So he is trying to reinvent the personal airplane with the help of his father, son, and a rotating crew of about two dozen volunteers. Unlike small aircraft today—which can cost more than a house—McGinnis says Synergy could be cheaper, quieter, and, at more than 40 mpg, three times as fuel-efficient.

McGinnis, a 47-year-old composite manufacturer, flew his first airplane in second grade. Perplexed by the inefficiencies of personal aircraft, he taught himself aeronautical engineering and fluid dynamics over two decades. One day, while perusing scientific studies at a desk in his daughters’ bedroom, he read a NASA researcher’s paper challenging a classic aerodynamic drag equation. McGinnis could see the possibilities. “I came out of the girls’ bedroom ranting like a madman to my wife,” he says. “‘Honey, you’re never going to believe this. I think I just solved a problem I’ve been working on since I was a little kid.’”

Synergy’s wings bend upward and into a box shape for minimum drag and maximum efficiency. The top half of each wing swoops behind the body to function as a tail while providing greater in-flight stability. The double-box tail design also makes gliding easier by counteracting tornado-like vortices at the wings’ tips. And instead of a front-mounted propeller, an impeller placed behind the bullet-shaped body quiets noise while adding thrust.
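
For reference, the “classic aerodynamic drag equation” at issue is presumably the induced-drag relation, in which drag due to lift falls as the span-efficiency factor $e$ rises; nonplanar layouts like Synergy’s box tail aim to push $e$ higher than a flat wing of the same span can manage:

$$
C_{D,i} = \frac{C_L^2}{\pi \, e \, AR}
$$

Here $C_L$ is the lift coefficient and $AR$ is the wing’s aspect ratio.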

1) A 200hp turbodiesel engine expels heat below the impeller, adding thrust.
2) Large wings allow slower takeoffs and landings.
3) Box tails create airflow patterns that reduce drag and increase flight stability.
4) An autopilot computer can land Synergy at a nearby runway during an emergency; a ballistic parachute can also be deployed.

McGinnis works on Synergy in his father’s garage, where he uses CNC machines and custom molds to fabricate components and 3-D software to rapidly model new ideas. Family members serve among the core build crew, with McGinnis’s son, Kyle, second-in-command. A quarter-scale prototype made from fiberglass, carbon fiber, and Kevlar suggests that both the team’s manufacturing process and unusual wing configuration work. Using about $80,000 in crowdfunded cash, they hope to finish a full-scale, five-person aircraft this year. “I work on it 90 hours a week, with a few hours of sleep,” McGinnis says. “What drives me to do it is that no one else will.”

Inventor: John McGinnis
Company: Synergy Aircraft
Invention: Synergy
Cost to develop: Undisclosed
Maturity: 2/10

Scientists Engineer Extreme Microorganisms To Make Fuel From Atmospheric Carbon Dioxide

To find a way of fending off global warming, scientists sometimes look to nature. Plants, after all, use photosynthesis to snap up carbon dioxide, the biggest source of our climate change woes. So we get inventions like artificial leaves and ambitious projects like a plan to give fish photosynthesizing powers. One of the more interesting plans: genetically alter microorganisms so they can chow down on some CO2, too.

University of Georgia researchers recently used the mighty Pyrococcus furiosus, which usually eats carbohydrates and lives in super-heated waters or volcanic marine mud (ideally, for it, at about 100 degrees Celsius). By toying with the genome-sequenced microorganism's genetic material, they were able to make it comfortable in much cooler waters and get it to eat carbon dioxide. After that, by supplying hydrogen gas to drive a chemical reaction inside the microorganism, the researchers got it to produce 3-hydroxypropionic acid, a common chemical used in household products. That's been done before, but the researchers are looking into turning the process into one that could eventually produce fuel.

If it is able to produce fuel, that wouldn't make it the first bacteria-like organism to do so. Others have been able to make that happen in a lab. But for anyone working on it, the next move after proving it works is scaling up. Then, ideally, we'll start getting water bottles that can power our homes.



For the U.S. Navy and Northrop Grumman, it's shaping up to be a banner year in unmanned flight. While the carrier-based autonomous X-47B continues to hit milestones aboard the USS George H.W. Bush somewhere off the East Coast, out west in Palmdale, Calif., the Navy today flew its MQ-4C Triton maritime drone for the first time, marking the beginning of a sea change (pardon the pun) in the way the U.S. military patrols the oceans. The drone flew for 80 minutes and reached an altitude of 20,000 feet.

The Triton isn’t a completely new platform. If it looks familiar, that’s because everyone from the U.S. Air Force to NASA has been using its cousin--Northrop Grumman’s reliable Global Hawk--for years now, for intelligence, surveillance, reconnaissance, environmental monitoring, and meteorological data gathering, among other things. Triton is essentially an upgrade of the Global Hawk, optimized for maritime environments, with a strengthened airframe and de-icing features that allow it to rapidly ascend to and descend from high altitudes.

Those upgrades allow Triton to fly at altitudes nearly ten miles above sea level (its ceiling is listed as 60,000 feet, though it will likely stick to 53,000-55,000 feet for most missions) for 24 hours at a time. That high vantage point allows its advanced sensors to take in a 2,000-nautical-mile view of the ocean in every direction. Carrying the Broad Area Maritime Surveillance (BAMS) sensor package (Popular Science awarded BAMS a Best of What’s New award last year) along with a classified advanced radar system, Triton will be able to both detect and identify ships on the water.

That is, rather than registering as a simple blip on the radar screen, BAMS will be able to generate a picture of the shape of a ship and use that to identify the vessel by profile. In that way, it will be able to tell a container ship from a Chinese frigate from a surfacing Russian submarine--from up to 2,000 nautical miles away (we felt that point was worth stressing here). Triton's strengthened airframe, augmented with de-icing technology, will then allow it to rapidly descend and ascend, so it can swoop in for a closer look at vessels of particular interest.
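
As a loose illustration of what identify-by-profile means (the real BAMS classifier and its radar data are classified, so the templates and numbers below are entirely invented), matching a sensed return against stored ship profiles can be as simple as a nearest-neighbor comparison:

```python
import numpy as np

# Hypothetical 1-D radar "profiles" (relative return strength along a hull).
# These templates and values are made up for illustration only.
TEMPLATES = {
    "container ship": np.array([1, 3, 3, 3, 3, 3, 1], dtype=float),
    "frigate":        np.array([1, 2, 4, 2, 2, 1, 0], dtype=float),
    "submarine":      np.array([0, 1, 2, 2, 1, 0, 0], dtype=float),
}

def identify(profile):
    """Return the template class closest to the sensed profile
    (smallest Euclidean distance after normalization)."""
    def norm(v):
        return v / (np.linalg.norm(v) + 1e-12)
    p = norm(profile)
    return min(TEMPLATES, key=lambda k: np.linalg.norm(p - norm(TEMPLATES[k])))

sensed = np.array([1, 3, 3, 2, 3, 3, 1], dtype=float)
print(identify(sensed))  # -> "container ship"
```
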
That’s if everything works as advertised, and both Triton and BAMS are still in the early stages of development. The first flight by Triton is a big step forward. Though it’s built on the back of the tested Global Hawk platform, the tweaks that have been made to the design are significant. In fact, a Global Hawk lent to the Navy by the Air Force for testing crashed at Naval Air Station Pax River last year--an event that was seen at the time as a potential setback for Triton and BAMS. So today’s first flight is significant, as it marks the first airborne test of a Triton and the beginning of the shift toward a brand-new maritime capability.

That new capability is also quite significant. The Navy wants 68 Tritons based at five locations around the globe. Flying in rotations, they will be able to keep unprecedented tabs on the world’s critical sea lanes and important littorals, working alongside and supporting the manned P-8A Poseidon mission (the Poseidon is replacing the P-3 Orion anti-sub warfare aircraft; basically, the unarmed Triton will conduct the ISR and the Poseidon will handle any kinetic strikes or electronic warfare, should they be necessary). And because the Triton is unmanned and autonomous, it will require less human labor to fly and pose less risk to human pilots.

"When operational, the MQ-4C will complement our manned P-8 because it can fly for long periods, transmit its information in real-time to units in the air and on ground, as well as use less resources than previous surveillance aircraft," said Rear Adm. Sean Buck, Patrol and Reconnaissance Group commander, in a statement. "Triton will bring an unprecedented ISR capability to the warfighter."

That’s still a few years away, but today marks a critical step for the maritime capability, and a second huge leap forward for autonomous flight in just over a week.