Posted on 10/10/2010 4:36:17 PM PDT by lbryce
Anyone driving the twists of Highway 1 between San Francisco and Los Angeles recently may have glimpsed a Toyota Prius with a curious funnel-like cylinder on the roof. Harder to notice was that the person at the wheel was not actually driving.
The car is a project of Google, which has been working in secret but in plain view on vehicles that can drive themselves, using artificial-intelligence software that can sense anything near the car and mimic the decisions made by a human driver.
With someone behind the wheel to take control if something goes awry and a technician in the passenger seat to monitor the navigation system, seven test cars have driven 1,000 miles without human intervention and more than 140,000 miles with only occasional human control. One even drove itself down Lombard Street in San Francisco, one of the steepest and curviest streets in the nation. The only accident, engineers said, was when one Google car was rear-ended while stopped at a traffic light.
Autonomous cars are years from mass production, but technologists who have long dreamed of them believe that they can transform society as profoundly as the Internet has.
Robot drivers react faster than humans, have 360-degree perception and do not get distracted, sleepy or intoxicated, the engineers argue. They speak in terms of lives saved and injuries avoided: more than 37,000 people died in car accidents in the United States in 2008. The engineers say the technology could double the capacity of roads by allowing cars to drive more safely while closer together. Because the robot cars would eventually be less likely to crash, they could be built lighter, reducing fuel consumption.
(Excerpt) Read more at nytimes.com ...
Think of it like cruise control. The computer is controlling the vehicle, but there is a driver behind the wheel ready to override if he doesn't like what the computer is doing.
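The cruise-control analogy boils down to a simple arbitration rule: the computer's command is used by default, and any driver input immediately wins. A minimal sketch (the function and command names here are hypothetical, not from any real vehicle system):

```python
# Hypothetical sketch of cruise-control-style override arbitration:
# the computer drives by default, but any driver input takes priority.

def arbitrate(computer_cmd, driver_cmd):
    """Return the command to send to the actuators.

    driver_cmd is None while the driver's hands are off the controls;
    any non-None driver input immediately overrides the computer.
    """
    return driver_cmd if driver_cmd is not None else computer_cmd

# Computer is driving; driver hasn't touched the wheel.
assert arbitrate({"steer": 0.1}, None) == {"steer": 0.1}
# Driver grabs the wheel: the driver's input wins.
assert arbitrate({"steer": 0.1}, {"steer": -0.5}) == {"steer": -0.5}
```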
Kubuntu Kars?
A car with no driver would be a car that would be really easy to steal.
Then it will spread down to luxury vehicles, and kits for handicapped and seniors and so on. Yes there will be errors and problems, but I expect that they will be rare from the outset - much lower than the risks of having human drivers. (I work as a quality engineer in automotive safety electronics.) This stuff is being developed over a long period, step by step, so there is time to refine the testing techniques and recognize the odd scenarios that could stump or fool the software.
I would strongly agree that two things be kept inviolate: that drivers be able to take control of the vehicle, and that the autonomous controls be strictly tied to driver input/control (not subject to remote manipulation or command).
True, and delivery vehicles could become extremely lightweight, designed to carry only the cargo and no passengers.
Teamsters are gonna love it - all bennies, no work.
Even people who can drive could spend the commute time doing other things. For example, putting on their makeup, eating breakfast, or texting...oh...well, you could do those things without being a hazard. You could also read, work, etc.
I don't think anyone is suggesting letting Microsoft design car operating systems.
My '96 Chevy has a computer that controls the engine and the cruise control. Almost all modern-day cars do. In the event of a computer failure, the car goes into "limp mode": the individual components fail safe to preset values that are still driveable but far from optimal.

There will be computer failures, and some redundancy will have to be built into the cars. But relative to humans, the computers don't tire, don't get distracted (non-Microsoft OS), and can do more things at once than a human.
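The limp-mode idea above can be sketched in a few lines: when a sensor or module stops reporting, the controller substitutes a conservative preset rather than shutting down. The preset names and values below are made up for illustration only:

```python
# Hypothetical illustration of "limp mode": on a sensor failure the
# controller falls back to a conservative preset instead of stopping dead.

LIMP_PRESETS = {"throttle_limit": 0.3, "max_rpm": 2500}

def engine_command(sensor_reading, normal_value, preset):
    # Use the live computed value while the sensor is healthy;
    # otherwise fail safe to a driveable but far-from-optimal preset.
    return normal_value if sensor_reading is not None else preset

# Healthy sensor: normal operation.
assert engine_command(42.0, 0.8, LIMP_PRESETS["throttle_limit"]) == 0.8
# Failed sensor (no reading): limp-mode preset.
assert engine_command(None, 0.8, LIMP_PRESETS["throttle_limit"]) == 0.3
```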
There is that downside. It would have been awesome when we were young. But as a parent, I'd be looking for that vehicle chaperone program.
Generally agreed, but they should be options.
I can think of exceptions to this. A car owned by a DWI offender or other unsafe driver shouldn't allow the driver to take control.
Conversely, couriers, taxis, business vehicles, parents, guardians, may want remote control. It just needs to be secure enough to keep terrorists at bay. I think there are situations you should be able to alter the route, remotely, but you shouldn't be able to alter the programming remotely.
Once the technology is in place, how do you keep a terrorist from programming cars to attack? All he has to do is gain physical possession of the car and be able to make alterations. Some thought needs to be given to that.
Do we want to allow car owners to hack their own vehicle's software? Allowing it could make for vast improvements, as every car owner could become a source of technical development. But without controlled testing, they could introduce errors that would be unsafe.
I repeat: How did it become legal for these guys to NOT be in control of their automobiles?