Saturday, July 23, 2016

Q: Car involuntarily accelerates 2-car lengths 1 out of 100 times
I purchased a Lexus-certified used hybrid car and it is exhibiting involuntary acceleration. Two to three times out of 20 when I start to brake, my car accelerates slightly on its own. My car has regenerative braking, so I thought that perhaps the slight involuntary acceleration was normal. It is startling when it happens, but I apply hard pressure to the brake and it stops the acceleration. But “every” time I brake, I have to stay focused and aware that it might happen. When it started doing this, I thought I was hitting both pedals at the same time, but I’m sure that I’m not doing that. The dangerous problem is the distance that it accelerates 1 in 100 times when I brake: my car will accelerate “two” car lengths before I can stop it. Naturally, when the mechanic at the dealership rode with me, it didn’t happen! I was told to keep a watch on this, and they recorded my complaint. This two-car-length acceleration only occurs during the hot summer months, like now. There has been no change in the frequency. Have you heard of this in other Lexus hybrids, and can you advise me how I can get this fixed (before an accident happens)?



Saturday, July 9, 2016

Toyota Prius collides with Joe Mack's Beach Shack in Friday evening crash


Posted: Friday, July 8, 2016

by Michelle Jostmeyer 


Just after 6:30 p.m. Friday, July 8, the Russellville Police Department, the Russellville Fire Department and Pope County EMS were dispatched to a motor vehicle accident in the parking lot of the University Plaza, located at 2005 North Arkansas Avenue.
RPD officers and RFD Engine 3 firefighters advised dispatch they were en route to the call and a short time later, advised they had arrived on scene. 
RFD firefighters gave dispatch a size-up of the scene, stating that a small passenger car, a gold Toyota Prius, had struck the storefront of Joe Mack's Beach Shack and entered the store. The car sustained minor damage in the collision, but the storefront was heavily damaged.
Firefighters made contact with the driver of the vehicle as well as employees of the store to inquire if anyone had sustained injuries. Upon the arrival of Pope County EMS all parties advised they were not injured and denied transport or treatment by EMS.
RPD officers on scene made contact with the driver to obtain his license, vehicle registration and insurance information in order to begin investigating the crash and complete their report.
RPD officers spoke to employees who were in the store when the accident occurred and took their statements. Officers also obtained a written statement from the driver of the Toyota describing how the accident had occurred and then began their investigation.
After completing the investigation, the officers learned that the driver of the Toyota was parked directly in front of Joe Mack's Beach Shack and was attempting to back out of the parking spot. 
The driver accelerated to back out, believing the car was in reverse, but the car traveled forward because it was in drive. When the car accelerated, it jumped the curb, crossed the sidewalk and struck the glass storefront. The car entered Joe Mack's Beach Shack, where it came to rest just inside the front of the store. The Toyota was later pulled from the store by a wrecker and towed from the scene.
Owners and employees of the store began cleaning up the glass and debris left from the collision as they waited for assistance to arrive to begin temporary repairs to secure the store. 
After completing the accident report and assisting on the call, both firefighters and police officers ended the call, advising dispatch they were available for service.

As of 8 July 2016: 

Vehicle Speed Control Problem on the 2016 TOYOTA PRIUS [1]

2010 – 2015 TOYOTA PRIUS Vehicle Speed Control Problems (as of: 8 Jul 2016)

Vehicle Speed Control Problem on the 2015 TOYOTA PRIUS [3]
Vehicle Speed Control Problem on the 2014 TOYOTA PRIUS [7]
Vehicle Speed Control Problem on the 2013 TOYOTA PRIUS [11]
Vehicle Speed Control Problem on the 2012 TOYOTA PRIUS [9]
Vehicle Speed Control Problem on the 2011 TOYOTA PRIUS [15]
Vehicle Speed Control Problem on the 2010 TOYOTA PRIUS [264]

2001 – 2009 TOYOTA PRIUS Vehicle Speed Control Problems (as of: 8 Jul 2016)

Vehicle Speed Control Problem on the 2009 TOYOTA PRIUS [65]
Vehicle Speed Control Problem on the 2008 TOYOTA PRIUS [140]
Vehicle Speed Control Problem on the 2006 TOYOTA PRIUS [77]
Vehicle Speed Control Problem on the 2005 TOYOTA PRIUS [49]
Vehicle Speed Control Problem on the 2004 TOYOTA PRIUS [49]
Vehicle Speed Control Problem on the 2003 TOYOTA PRIUS [3]
Vehicle Speed Control Problem on the 2002 TOYOTA PRIUS [11]
Vehicle Speed Control Problem on the 2001 TOYOTA PRIUS [23]










Driver Automation to Be Scrutinized in NTSB Probe of Tesla Crash



By Alan Levin and Jeff Plungis



  • Safety board seeking to learn about safety of automation
  • Agency joins NHTSA in studying fatal accident in Florida


    For years, U.S. investigators have been calling for more automation on motor vehicles, such as sensors that slam on the brakes to prevent a crash.

    At the same time, the National Transportation Safety Board, in its probes of transportation mishaps, has warned that such devices may also have a down side: the technology can confuse operators if it’s poorly designed or lead to complacency that breeds its own hazards.

    Now, for the first time in a highway accident, those two potentially contradictory themes will be put to the test as the NTSB opens an investigation into a fatal accident involving a Tesla Motors Inc. sedan that was driving with a feature called Autopilot enabled.

    “It’s very significant,” said Clarence Ditlow, executive director of the Center for Auto Safety advocacy group in Washington. “The NTSB only investigates crashes with broader implications.”

    Team Dispatched



    The Safety Board will be sending a team of five investigators to Florida next week, agency spokesman Christopher O’Neil said Friday.

    While the U.S. National Highway Traffic Safety Administration is conducting its own review of the May 7 incident, the NTSB wants to take a more comprehensive look at whether the crash reveals any systemic issues with driverless car technology, O’Neil said. NHTSA is a regulatory agency and the NTSB is an independent investigative body that only has the power to make policy recommendations.

    “It’s worth taking a look and seeing what we can learn from that event, so that as that automation is more widely introduced we can do it in the safest way possible,” O’Neil said.

    A 40-year-old Ohio man died when his 2015 Tesla Model S struck an 18-wheeler on a highway near Williston, Florida, according to a Florida Highway Patrol statement. The Model S drove under the truck’s trailer, shearing off its top.

    First Fatality

    The Autopilot, a semi-autonomous feature that can guide the vehicle in certain conditions, didn’t notice the white side of the tractor trailer as it turned in front of the car against a brightly lit sky, so the brakes weren’t applied, according to Tesla. The system may have confused the truck with an overhead highway sign.
    The crash was the first with a known fatality in more than 130 million miles of Autopilot driving, according to the carmaker.
    Ditlow said that the NTSB rarely opens investigations into highway accidents, so the announcement that it was looking at the Tesla crash is significant.
    “They’re not looking at just this crash,” he said. “They’re looking at the broader aspects. Are these driverless vehicles safe? Are there enough regulations in place to ensure their safety?”

    Beta System

    “And one thing in this crash I’m certain they’re going to look at is using the American public as test drivers for beta systems in vehicles. That is simply unheard of in auto safety,” he said.
    Tesla has installed the software for Autopilot on all 70,000 of its cars since October 2014 even though it is still a so-called beta version. Tesla said in a June 30 blog post that vehicle owners must acknowledge that the system is new technology that is “still in a public beta phase” before it will switch it on.
    “Autopilot is by far the most advanced driver-assistance system on the road, but it does not turn a Tesla into an autonomous vehicle and does not allow the driver to abdicate responsibility,” the company said. “Since the release of Autopilot, we’ve continuously educated customers on the use of the feature, reminding them that they’re responsible for remaining alert and present when using Autopilot and must be prepared to take control at all times.”
    The NTSB opens highway investigations about 25 to 30 times a year, according to O’Neil. By comparison, it is required by law to investigate the more than 1,000 aviation accidents a year.

    Tracy Morgan

    The NTSB has for decades called on vehicle manufacturers to install more features to automatically prevent accidents. After a Wal-Mart Stores Inc. truck struck a limo van carrying comic Tracy Morgan, the NTSB examined why the truck’s automatic braking system wasn’t switched on at the time of the impact.
    Accidents involving automation also have been a growing issue in aviation and other transportation modes, according to NTSB case files.
    The board concluded that an autopilot feature on the Boeing Co. 777 that crashed short of the runway in San Francisco on July 6, 2013, contributed to the accident that killed three. Pilots didn’t realize that they had accidentally shut off a feature that normally ensured they would maintain a safe speed, allowing the plane to get dangerously slow.
    “We have learned that pilots must understand and command automation, and not become over-reliant on it,” NTSB Chairman Christopher Hart said after the board reached its conclusions. “The pilot must always be the boss.”


    NTSB to Investigate Crash of Tesla With

    Autopilot Engaged

    By JOAN LOWY, ASSOCIATED PRESS

    WASHINGTON — Jul 8, 2016

    A federal official says a second safety agency will investigate the fatal crash of a Tesla sedan while the vehicle's Autopilot self-driving mode was in use.

    Rob Molloy heads the National Transportation Safety Board's highway investigations division. Molloy says a team of investigators will go to Florida next week to examine the wreckage of the Tesla Model S.
    The National Highway Traffic Safety Administration announced last week that it is investigating the May 7 crash near Gainesville to see if the Autopilot system is defective.
    The agency has said a truck made a left turn in front of the car, but neither the Autopilot nor the driver applied the brakes. The driver was killed.
    Molloy says NTSB's investigation will take a comprehensive look at the entire incident.
    http://abcnews.go.com/Politics/wireStory/ntsb-investigate-crash-tesla-autopilot-engaged-40441252



    Consumer Watchdog: Tesla “rushing,” should disable Autopilot immediately

    Friday, July 8, 2016

    UPDATED: EE Times: Tesla's Fatal Crash: 6 Unanswered Questions






    Tesla's Fatal Crash: 6 Unanswered Questions

    7/7/2016



    MADISON, Wis.—Two months after a Tesla Model S in Autopilot mode killed its driver, the fatal crash still has many automotive industry experts wondering what exactly happened and why the self-driving system failed.




    Of course, the accident was a surprisingly well-kept secret until last week. The delay raises the questions of when Tesla knew, and why the firm, together with the federal agency, waited nearly two months to give the tragedy its due publicity.

    Although Tesla learned about the May 7 crash in Florida "shortly" afterward, it did not disclose it to the government until May 16. It was June 30 when the National Highway Traffic Safety Administration (NHTSA) announced its probe into the fatality.

    Separately, it’s been revealed that there was another Tesla crash on July 1, involving a Model X that rolled over on the Pennsylvania Turnpike. NHTSA just announced on Wednesday (July 6) that it’s investigating the Pennsylvania accident to determine if automated functions were involved.

    More questions than answers
    Without Tesla’s full disclosure and results from NHTSA investigations, nobody can say for sure what caused the fatal crash on May 7.

    Several automotive experts reached by EE Times are coming up with more questions than answers. Among their questions:
    • Which part of Tesla’s Autopilot system failed to recognize the imminent danger of a white trailer truck: the CMOS image sensor, the radar or the vision processor?
    • How is Tesla’s Autopilot system designed to interpret sensory data?
    • Which hardware was responsible for sensor fusion?
    • Who wrote the sensor fusion algorithms that determine which data overrides another, and what happens when contradictory information comes from different sensors?
    • More specifically, how did Tesla integrate Mobileye’s EyeQ3 vision processor into its Automatic Emergency Braking (AEB) system?
    • Algorithms for perception systems are getting pretty good. But how far along has the automotive industry come with algorithms for motion planning?
    • There are many corner cases (like the Tesla case) that are almost impossible to test. Considering the infinite number of potential scenarios that could lead to a crash, how does the car industry plan to meet the challenge of modeling, simulation, test and validation?

    It’s premature to point a finger at any specific technology failure as the culprit. But it’s time for the automotive industry -- technology suppliers, tier ones and car OEMs – to start discussing the limitations of driving on autopilot.

    This was suggested by Amnon Shashua, Mobileye's co-founder, CTO and chairman.  He said during the BMW/Mobileye/Intel press conference last Friday: “Companies need to be very transparent about limitations of the system. It’s not enough to tell the drivers to be alert, it needs to tell them why they need to be alert.”
    In pursuit of such transparency, the following pages attempt to dissect which technologies were at play inside the Model S Autopilot, and how the limitations of each technology were manifested in the crash.


    First, the Tesla Model S is equipped with sensing hardware that includes a front radar, a front monocular optical camera, Mobileye’s EyeQ3 vision SoC, and a 360-degree set of ultrasonic sensors. The vehicle also has plenty of computing power: it uses Nvidia modules, one based on the Tegra 3 processor for media control in the head unit and another based on the Tegra 2 for instrument control.

    1. What did the front camera actually see just before the crash?
    This is the first question that came to everyone’s mind. When the tractor-trailer made a left turn in front of the Tesla at an intersection on a non-controlled-access highway, what exactly did Tesla’s front camera see?


    [Figure: Florida Traffic Crash Report]

    Direct sunlight flashing in front of the camera could have caused the CMOS image sensor not to see the truck clearly, said one industry analyst. “Of course, it all depends on the CMOS image sensor’s sensitivity and contrast.”

    Phil Magney, founder and principal advisor of Vision Systems Intelligence, LLC, told us, “It’s odd that the camera could not identify the truck.” The effect of the sun on the front camera at 4:30 p.m. on May 7 in Florida, when the Tesla crashed, is unknown. But Magney added, “My guess is that the vision sensor [EyeQ3] simply didn’t know how to classify” whatever the camera saw.

    “The cameras are programmed to see certain things (such as lanes) but ignore others that they cannot identify.”

    It appears that the camera ignored what it couldn’t classify.

    2. Why didn’t radar see the white truck?
    Vision Systems Intelligence’s Magney made it clear, “The radar did recognize the truck as a threat. Radar generally does not know what the object is but radar does have the ability to distinguish certain profiles as a threat or not, and filter other objects out.”

    If so, what was radar thinking? 

    Tesla’s blog post, followed by Elon Musk’s tweet, gives us a few clues as to what Tesla believes the radar saw. Tesla understands that the vision system was blinded (the CMOS image sensor was seeing “the white side of the tractor trailer against a brightly lit sky”). Although the radar shouldn’t have had any problems detecting the trailer, Musk tweeted, “Radar tunes out what looks like an overhead road sign to avoid false braking events.”

    Mike Demler, a senior analyst at The Linley Group, disagrees.

    He told EE Times that if Tesla’s theory is correct, “Tesla has some serious problems in their sensor system and software.”

    Demler noted, “According to the Department of Transportation, overhead signs should be a minimum of 17 feet above the pavement, and most of them are green. The Tesla Model S involved in the crash is only 56.5 inches tall, and it hit the underside of the truck.”

    He added, “The radar may have seen open road under the truck, but it should have indicated the distance and height was inconsistent with a sign.”
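    (To make Demler's numbers concrete, here is a minimal arithmetic sketch, in Python, of the kind of plausibility check he is describing -- not Tesla's or Mobileye's actual logic, and the one-foot margin is an invented illustrative parameter. An object should only be dismissed as an overhead sign if its underside clears the vehicle.)

        # Hypothetical sketch: can the car pass under a detected object?
        FT_PER_INCH = 1.0 / 12.0
        MIN_SIGN_CLEARANCE_FT = 17.0               # DOT minimum for overhead signs, per Demler
        MODEL_S_HEIGHT_FT = 56.5 * FT_PER_INCH     # about 4.7 ft

        def can_pass_under(object_bottom_ft, vehicle_height_ft=MODEL_S_HEIGHT_FT, margin_ft=1.0):
            """True only if the object's underside clears the car with some margin."""
            return object_bottom_ft >= vehicle_height_ft + margin_ft

        print(can_pass_under(17.0))   # True: a real overhead sign clears easily
        print(can_pass_under(4.0))    # False: a trailer floor ~4 ft up is an obstacle, not a sign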

    Demler doesn’t believe radar was at fault.

    He explained that radar is “most commonly used for adaptive cruise control. It prevents rear-end collisions by detecting the distance to a vehicle up ahead. For autonomous vehicles, radar can provide longer distance object detection than cameras and Lidar.”

    “Radar doesn’t provide detail for object identification,” he acknowledged. But radar also provides better sensing at night and through fog and precipitation, he added.

    Demler pointed to a statement Tesla issued to the media last Friday, which explained that its autopilot system “activates automatic emergency braking in response to any interruption of the ground plane in the path of the vehicle that cross-checks against a consistent radar signature.”

    In that case, Demler said, “I suspect the radar missed the truck because it is focused too close to the ground and only saw the opening.”

    3. Which hardware was responsible for sensor fusion?
    Experts we talked to agree that Mobileye’s EyeQ3 is capable of handling camera/radar fusion. But it’s not clear if the EyeQ3 chip was used in the Tesla for this purpose.

    It has never been made clear which parts of Autopilot are built by Tesla and which by Mobileye, other than that Tesla’s Autopilot uses Mobileye’s EyeQ chip.

    A statement Tesla issued last Friday said:
    Tesla’s autopilot system was designed in-house and uses a fusion of dozens of internally- and externally-developed component technologies to determine the proper course of action in a given scenario.

    Since January 2016, Autopilot activates automatic emergency braking in response to any interruption of the ground plane in the path of the vehicle that cross-checks against a consistent radar signature. In the case of this accident, the high, white side of the box truck, combined with a radar signature that would have looked very similar to an overhead sign, caused automatic braking not to fire.
    Vision Systems Intelligence’s Magney told us, “I think most of the perception for Autopilot is done by camera. But it does use radar for confirmation. It is also my understanding that the emergency braking solution in the Tesla requires a confirmation of both the radar and the camera.”

    In summary, because both sensors misread the scene, the required cross-check of the two streams of sensory data did not trigger the Autopilot system to take any action.
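    (The confirmation logic described above can be sketched in a few lines of Python. This is an illustrative toy model, not Tesla's implementation; the names and labels are invented. The point is simply that an AND-style cross-check stays silent when both sensors misread the same object.)

        from dataclasses import dataclass

        @dataclass
        class Detection:
            is_threat: bool   # does this sensor classify the object as a braking-worthy obstacle?
            label: str        # what the sensor thinks it is seeing

        def should_brake(camera: Detection, radar: Detection) -> bool:
            # Automatic emergency braking fires only when BOTH sensors confirm a threat.
            return camera.is_threat and radar.is_threat

        # In the scenario described: the camera saw an unclassifiable bright white plane,
        # and the radar filed the return away as an overhead sign.
        camera = Detection(is_threat=False, label="unclassified bright region")
        radar = Detection(is_threat=False, label="overhead sign (filtered out)")
        print(should_brake(camera, radar))   # False -> no automatic braking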

    The Linley Group’s Demler believes that the more pertinent question should be “how Tesla integrated EyeQ3 into its own Automatic Emergency Braking (AEB) system.”

    Tesla’s description of how its AEB works does not align with Mobileye’s statement.

    Mobileye’s chief communication officer Dan Galves told the media that “today’s collision avoidance technology, or Automatic Emergency Braking (AEB) is defined as rear-end collision avoidance, and is designed specifically for that.”

    In Mobileye’s opinion, the Tesla’s crash “involved a laterally crossing vehicle, which current-generation AEB systems are not designed to actuate upon.” Galves said, “Mobileye systems will include Lateral Turn Across Path (LTAP) detection capabilities beginning in 2018, and the Euro NCAP safety ratings will include this beginning in 2020.”

    Tesla, however, insisted that its own Autopilot system is designed to trigger AEB regardless of where it senses trouble, rear or front. The AEB activates “in response to any interruption of the ground plane in the path of the vehicle.”

    In summary, Demler said that the problem [of sensor fusion in Tesla] appears to have been in Tesla’s implementation.

    4. Who writes perception software?
    It is entirely possible that the crash was caused by “perception software” that wasn’t capable of deciphering what it was sensing. Who actually writes those algorithms?

    The Linley Group’s Demler said, “It depends.”

    He explained that Mobileye sells aftermarket systems as complete packages, but generally works with the tier ones to develop the systems for new cars.

    Meanwhile, “Tesla acts as their own Tier 1, so they developed their own software.” Demler added, “There are several software companies like ADASworks that specialize in this.”

    Indeed, Tesla is in a league of its own. Behaving more like a tech company than a traditional carmaker, Tesla famously skips working with tier ones. This policy is turning into a trend: in the recently announced alliance of BMW, Intel and Mobileye, no tier ones are included.

    5. What about algorithms for motion planning?
    Sensor fusion is vital to getting things right and avoiding catastrophes, as Vision Systems Intelligence’s Magney explained. But what about algorithms for motion planning, he asked.

    In his opinion, “sensing for safety systems is pretty good and the sensor technologies to support it are also pretty good… However, as you move up the ladder of autonomy, the challenges grow exponentially.”

    More specifically, “there are a lot of other hard problems related to motion planning,” Magney said. “For example, once you have detected a scene, what you do next is critical – how much steering angle, how much torque, how to factor in low-grip conditions, etc.”

    Mobileye, which already holds the lion’s share of the vision SoC market, is positioning its upcoming chip as a super SoC that combines processing for camera, radar and Lidar. Elchanan Rushinek, Mobileye’s senior vice president of engineering, explained in a recent interview with us what’s inside the EyeQ5, scheduled to sample in 2018.

    However, it isn’t likely that such a super fusion sensor will also have room for motion planning algorithms. Given where NXP’s BlueBox and Audi’s zFAS are heading, Magney envisions a platform featuring multiple processors, each partitioned with different domain responsibilities. Such domains include safety, motion control and decision making, in addition to perception sensor fusion.

    6. How to handle corner cases
    What the Tesla crash exposed isn’t just the limits of individual sensor hardware or software. That’s a given. Rather, it exposed Tesla’s failure to prepare its system for an unanticipated event, or for multiple failures within its system.

    For example, who at Tesla imagined that its own vehicle sensors would have problems distinguishing a white trailer truck from an overhead road sign?

    Did Tesla engineers foresee the possibility that a truck situated perpendicular to the car’s path, neither approaching the car nor pulling away, would manifest the Doppler speed signature of a stopped object?

    Further, what was the planned reaction of Tesla’s Autopilot system when multiple sensors failed to decipher what they were detecting?
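    (The Doppler question hinges on simple geometry: a radar's Doppler return measures only the component of the target's velocity along the line of sight. A small Python illustration with assumed speeds, not data from the actual crash:)

        import math

        def radial_speed(speed_mph, angle_deg):
            """Component of a target's speed along the radar's line of sight."""
            return speed_mph * math.cos(math.radians(angle_deg))

        print(radial_speed(30.0, 0.0))    # 30.0 -> head-on traffic shows its full closing speed
        print(radial_speed(30.0, 90.0))   # ~0.0 -> a truck crossing perpendicular looks stationary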

    “Sensor fusion requires a lot of steps in development -- modeling, simulation, test and validation -- before you can deploy them,” said Magney. “There are so many corner cases (like the Tesla case). It is almost impossible to test for every possibility.”

    Making Autopilot driving work 99 percent of the time might be possible. “But to improve the probability to 99.99 percent is a monumental task,” Magney noted.
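    (A rough back-of-the-envelope calculation puts those percentages in perspective. The per-encounter interpretation and the event count below are assumptions for illustration only, not figures from Magney or Tesla.)

        # If "works 99 percent of the time" means 1 failure per 100 hazardous encounters,
        # then over a million encounters:
        encounters = 1_000_000
        for reliability in (0.99, 0.9999):
            failures = encounters * (1.0 - reliability)
            print(f"{reliability:.2%} reliability -> {failures:,.0f} expected failures")
        # 99.00% -> 10,000 failures; 99.99% -> 100. Each extra "nine" cuts failures tenfold.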

    Considering an infinite number of potential scenarios that could lead to a crash, the automotive industry needs an agreed-upon methodology for modeling, simulation, test and validation.

    Duke University robotics professor Mary Cummings testified at a U.S. Senate hearing in March that the self-driving car community is “woefully deficient” in its testing programs, “at least in the dissemination of their test plans and data.”

    In particular, she said she’s concerned about the lack of “principled, evidenced-based tests and evaluations.”

    We know that Google is doing its own thing. Tesla is certainly pursuing its dream. Mobileye has its own, jealously guarded sensor algorithms. But when it comes to testing and validation of autonomous driving, these industry leaders need to find a way to cooperate.

    What will happen next?
    Beyond testing and validation, many automotive industry analysts agree that more redundancy in sensors is the key to safer systems.

    Beyond image sensors and radar, they mentioned Lidar, V2X and high-definition maps as additional sensors. Magney said, “An integrated high-attribute map could have helped too, as stationary physical objects and their precise locations can be used to improve sensor confidence rather than filtering them out.”

    Luca De Ambroggi, principal analyst for automotive semiconductors at IHS Technology, noted that as the industry goes forward, “redundancy in sensors is a must.” He added that redundancy will also be required for other electronic components to ensure a fallback solution in case of failure.
    Further, De Ambroggi suspects car OEMs will start asking for “ASIL D” – Automotive Safety Integrity Level D – certification in technology used in autonomous driving vehicles.

    ASIL D refers to the highest classification of initial hazard (injury risk) defined within ISO 26262. “ASIL D will require a much stricter control and documentation on system functional safety,” De Ambroggi said.

    But there is one more thing to consider, said Demler.

    I don’t think it’s just a matter of the number of sensors. I think the problem is that Elon Musk and Tesla have been too cavalier about using human drivers as autopilot beta testers, which is what their own documents call them. They say drivers are “required” to keep their hands on the wheel, when clearly they are not. From a technical point-of-view, the system they’ve implemented is only capable of partial autonomous driving, under a very limited set of conditions, yet they call it autopilot.








    Tesla Auto Club touts the Tesla Model S’s Autopilot. On a virtually empty highway, it looks smooth and almost invincible.


    — Junko Yoshida, Chief International Correspondent, EE Times





    (selected comments):
     
    Re: In my humble opinion
    junko.yoshida   7/8/2016 11:16:06 AM
     
    Olaf, that's a good point. No, owners don't read thick manuals anymore. Everyone expects an "out of the box" experience. If you had to look it up in the manual or search for answers on wikiHow, yeah, that's definitely a turn-off.

    What actually bothers me about this case is that the whole Tesla "fanboy" community really encouraged reckless driving behaviors -- i.e., using an ADAS system as "autonomous" hands-free driving. Posting their own "look ma, no hands" video clips on YouTube, enthralled by the number of clicks they get. On top of that, a retweet from Elon Musk makes the driver feel as though he has died and gone to heaven.

    You know how the circular nature of social media goes. Everyone follows what everyone else seems to be doing, and kids himself that it's OK to drive his Tesla hands-free.

    Calling ADAS "Autopilot" is the biggest misnomer of this decade, in my humble opinion.
     
    ----------------------------------------------
    Re: good article
    junko.yoshida   7/8/2016 11:03:28 AM
     
    Eric, thanks for chiming in. This message coming from you -- really one of the pioneers of CMOS image sensors -- means a lot to us.

    Yes, I am sure that Tesla and all the technology suppliers have a better handle on what went wrong. But you raise a good question here.

    If you, as a driver, are blinded by the sun, you slow down -- rather than telling yourself, "I can't classify what's in front of me."

    The car manufacturer -- and tech suppliers, too -- must do a much better job of spelling out the limitations of their products.

    The public in this day and age tends to be too trusting of technologies.
    ----------------------------------------------
    Re: Truck safety regulations
    sixscrews   7/7/2016 7:41:00 PM
     
    Trucks do have rear crash barriers but not side barriers, although I have seen more and more trailers with side airfoils intended to reduce aerodynamic drag - reinforcing these as crash barriers would be a no-brainer. On the other hand, no matter what kind of barrier you have, a 65 mph impact is likely to be lethal, or at least result in serious injury to the front-seat occupants.

    I appreciate the endorsement of certification of automotive software. Carmakers and others will carry on about gov't interference, but if you had a chance to read some of the code in the Toyota TCM it would make your eyes spin - spaghetti code doesn't even begin to describe it. I wrote better code (I think) back in 1978 when we were using punch cards.

    There are lots of excellent software engineers, and managers who know how to get them to produce excellent code - all it takes is for the auto industry to get smart and hire them.

    But they are cheap and dumb, IMHO, and more focused on marketing than product development (with some exceptions - Ford's focus on turbochargers is impressive, but I have no idea what their software looks like).

    And then there is Volkswagen's diesel emission cheat software - you can do all kinds of things in code provided nobody but your supervisors looks at it.
     
    wb/ss
     
    ----------------------------------------------
    Re: Who is responsible for the accident?
    sixscrews   7/7/2016 4:47:40 PM
     
    Your comments are dead on - what was going on here?
     
    However, if an autopilot system cannot compensate for stupid drivers of other vehicles on the same roadway, then it's worse than useless - it's a loaded weapon pointed straight at the head of the vehicle operator, whose primary responsibility is to keep aware of the situation.

    So, if Tesla is marketing this as 'play your DVD and let us do the driving,' then they are worse than fools - they are setting their clients up for fatal crashes.

    At this time there is no software that can anticipate all the situations that will occur on a roadway, and anyone who claims otherwise is ignorant of the state of software circa 2016.

    And this brings me to one of my favorite dead horses - the NHTSA has no authority to certify vehicular control software, and this allows manufacturers to release software that is developed by a bunch of amateurs with no idea of the complexities of real-time control systems (see the Toyota throttle control issues revealed in the Oklahoma case 18 months ago).

    Congress - or whatever group of clowns masquerade as Congress these days - must implement, or allow NHTSA or another agency to create, a set of regulations similar to those applying to aircraft flight control systems. A vehicle operating in a 2-D environment is just as dangerous as an aircraft operating in a 3-D environment, and any software associated with either one should be subject to the same rules.

    Automakers are behaving as if the vehicles they manufacture are still running on a mechanical distributor-points-condenser-coil system with a driver controlling manifold pressure via the 'gas pedal,' as was true in 1925. They fought air bags, collapsible steering columns, safety glass and dozens of other systems from the '20s to the present. Enough is enough.

    Mine operators who intentionally dodge safety rules spend time behind bars (or maybe - provided the appeals courts aren't controlled by their friends). So why shouldn't automakers be subject to the same penalties? Kill your customer - go to prison, do not pass GO, do not collect $200.

    The engineering of automobiles is not a game of lowest bidder or cheapest engineer; given the cost of vehicles today, it should be the primary focus of company executives, not something to be passed off as an annoyance and cost center with less importance than marketing.

    It's time for automakers to step up and take responsibility for the systems they manufacture and live with certification of the systems that determine life or death for their customers.
     
    wb/ss
     
    ===============================
    Tesla Crashes BMW-Mobileye-Intel Event
    Recent auto-pilot fatality casts pall
    Junko Yoshida, Chief International Correspondent
    7/1/2016 05:32 PM EDT
    ---------------------------------------------
    (selected comment - by the article author):
     
    Re: Need information
    junko.yoshida   7/2/2016 7:18:45 AM
    @Bert22306 thanks for your post.
     
    I haven't had time to do a full story, but here's what we know now:
     
    You asked:
     
    1. So all I really want to know is, why did multiple redundant radar and optical sensors not see the broad side of a trailer?
     
    The Tesla Model S doesn't come with "multiple redundant radar and optical sensors." It has one front radar, one front monocular Mobileye optical camera, and a 360-degree set of ultrasonic sensors.

    In contrast, cars in a similar class, like the Mercedes-Benz, come with a lot more sensing hardware, including short-range radar, multi-mode radar and a stereo optical camera, in addition to what the Tesla Model S has.

    But that's beside the point. More sensing devices obviously help, but how each carmaker is doing sensor fusion and what sort of algorithms are at work are not known to us. There are no good yardsticks available to compare different algorithms against one another.
     
    2. Must have been a gymongous radar target, no?

    Yes, the truck was a huge radar target. But according to Tesla's statement, "The high, white side of the box truck, combined with a radar signature that would have looked very similar to an overhead sign, caused automatic braking not to fire."

    In other words, Tesla's Autopilot system believed the truck was an overhead sign that the car could pass beneath without problems.

    Go figure.
     
    3. Why the driver didn't notice is easily explained.

    There have been multiple reports that the police found a DVD player inside the Tesla driver's car. There have been suggestions -- but it isn't confirmed -- that the driver might have been watching a film when the crash happened. But again, we don't know this for sure.
     
    4. Even though this was supposed to be just assisted driving, one might assume that the driver was distracted.

    You would think. It's important to note, however, that the driver killed in the Tesla Autopilot crash was known to be a big Tesla fan, and he had posted a number of viral videos of his Model S.

    Earlier this year, he posted one video clip showing his Tesla avoiding a close-call accident with Autopilot engaged. (Which obviously got attention and a tweet back from Elon Musk...)

    Here's the thing. Yes, what Autopilot did in that video is cool, but it's high time for Musk to start tweeting the real limitations of his Autopilot system.
     
    5. Saying things like "We need standards" is so peripheral!

    I beg to differ, Bert.

    As we try to sort out what exactly happened in the latest Tesla crash, we realize that there isn't a whole lot of information publicly available.
     
    Everything Google, Tesla and others do today remains private, and we have no standards to compare them against. The autonomous car industry is still living in the dark age of siloed communities.

    I wrote this blog because I found interesting what Mobileye's CTO had to say in the BMW/Mobileye/Intel press conference Friday morning. There were a lot of good nuggets there.

    But one of the things that struck me is this: whether you like it or not, auto companies must live with regulations. And "regulators need to see the standard emerging," as he put it.

    As each vendor develops its own sensor fusion, its own software stack, etc., one can only imagine the complexities multiplying in standards for testing and verifying autonomous cars.

    Standards are not peripheral. They are vital.
     
    ===============================