This is a monthly column by CAA Board Member Dan Alvarez, addressing technology issues in the banking world, for non-tech professionals. 
 
Please feel free to contact him at chasealumtech@gmail.com.
 
Click here to read the first column, on artificial intelligence.
 
 
 

#3  Autonomous Cars (November 2023)                        

 

This month’s subject was requested by one of our readers. Don’t forget you can also submit your ideas to have them featured in future issues.

 

What are Autonomous Vehicles?

Autonomous vehicles, also known as self-driving cars or driverless cars, are vehicles equipped with advanced systems that allow them to navigate and operate without human input. These vehicles combine a variety of sensors to perceive their surroundings, such as radar, lidar, sonar, GPS, odometry and inertial measurement units. Advanced control systems interpret sensory information to identify appropriate navigation paths, as well as obstacles and relevant signage.

 

It’s worth noting that there are varying levels of automation in vehicles. The Society of Automotive Engineers has defined six levels, numbered 0 through 5, to help classify vehicles:

  • Level 0 (No Automation): At this level, the human driver performs all driving tasks. There might be system warnings or momentary intervention, but they do not replace driver control. The vast majority of cars on the road today fall into this bucket.
  • Level 1 (Driver Assistance): This level includes vehicles that have a single automated system for driver assistance, such as adaptive cruise control or parking assistance. The driver remains fully in control of the vehicle except for the assisted function.
  • Level 2 (Partial Automation): Here, the vehicle has multiple automated functions like acceleration and steering, but the driver must remain engaged with the driving task and monitor the environment at all times. Examples include Tesla's Autopilot and Cadillac's Super Cruise.
  • Level 3 (Conditional Automation): At Level 3, the car can handle all aspects of the driving task under certain conditions, but the driver must be ready to take control when those conditions are no longer met. Audi's Traffic Jam Pilot is a great example.
  • Level 4 (High Automation): Vehicles at this level can operate in self-driving mode without human input, but only in specific conditions or areas (like geo-fenced urban areas – more on this later). The driver has the option to manually override the automation. Google’s Waymo service is a fitting example of this – offering taxi service…with no driver!
  • Level 5 (Full Automation): This is the highest level of autonomous driving. Vehicles can perform all driving tasks under all conditions without any human intervention. There are currently no commercially available Level 5 vehicles on the market, though the DARPA Grand Challenge famously pitted fully autonomous vehicles against a 132-mile race course across the desert.
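For readers who like to see things spelled out, the six levels above can be summarized in a toy lookup table (a simplified sketch for illustration, not any automaker’s real software):

```python
# Toy summary of the SAE automation levels described above.
# Maps each level to a short note on who is responsible for driving.
SAE_LEVELS = {
    0: "No Automation - human performs all driving tasks",
    1: "Driver Assistance - one assisted function (e.g. adaptive cruise control)",
    2: "Partial Automation - car steers and accelerates, human must monitor",
    3: "Conditional Automation - car drives in limited conditions, human on standby",
    4: "High Automation - no human input needed inside a defined area",
    5: "Full Automation - no human input needed anywhere",
}

def human_must_monitor(level: int) -> bool:
    """Through Level 2, the human must watch the road at all times."""
    if level not in SAE_LEVELS:
        raise ValueError(f"SAE levels run 0-5, got {level}")
    return level <= 2

print(SAE_LEVELS[4])
print(human_must_monitor(2))  # True: Level 2 still requires an attentive driver
```

The key boundary is between Levels 2 and 3 – that is where responsibility for watching the road starts to shift from the human to the machine.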
 

What does JPMC have to do with autonomous vehicles?

One of the key emerging trends identified by JPMorgan is the market for in-vehicle purchases of things like apps and streaming services. Cars are becoming more digitized and now contain entire digital ecosystems and experiences centered around letting the driver…do less driving. The latest iteration of the UK Highway Code will let drivers legally watch TV, check email and enjoy other apps on an in-vehicle infotainment system, and you can download apps to your car the same way you would download them to your smartphone. JPMC expects the market for in-vehicle payments to hit $86B by 2025. Don’t forget about self-driving trucks either: research from the Investment Bank shows that autonomous trucks stand to reduce costs by over 40 percent per mile compared with their traditional 18-wheeler counterparts with a driver. Considering that over 70 percent of all freight in the United States moves by truck, that’s quite a lot in potential savings.
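To put that trucking figure in perspective, here is a rough back-of-the-envelope illustration. Only the “over 40 percent” savings rate comes from the research above; the per-mile cost and annual mileage are made-up round numbers for the sake of the arithmetic:

```python
# Hypothetical baseline: assume a traditional 18-wheeler costs $2.00 per mile
# to operate (an illustrative round number, not a figure from the research).
traditional_cost_per_mile = 2.00
savings_rate = 0.40  # "over 40 percent per mile" per the Investment Bank research

autonomous_cost_per_mile = traditional_cost_per_mile * (1 - savings_rate)
annual_miles = 100_000  # assumed yearly mileage for a long-haul truck

savings_per_truck_year = (traditional_cost_per_mile - autonomous_cost_per_mile) * annual_miles
print(f"Autonomous cost: ${autonomous_cost_per_mile:.2f}/mile")   # $1.20/mile
print(f"Savings per truck, per year: ${savings_per_truck_year:,.0f}")  # $80,000
```

Multiply a number like that across a national fleet hauling 70 percent of all freight, and the scale of the opportunity becomes clear.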

 

How does a series of computers compare to a human driver?

This is one of those questions where the answer can be very short or very long. Of course, a computer can “think” more quickly than a human can, but it isn’t really that simple.

 

Autonomous vehicles come with many different types of sensors that typical vehicles do not, such as radar, cameras, GPS, proximity and ultrasonic sensors. The data from these devices is fed into a series of redundant computers that process it and react accordingly. Because evolution has not yet gifted us with eyes in the backs of our heads, this makes a self-driving vehicle generally more aware of its surroundings than a human driver. Here’s a great example of a Tesla Model 3 slamming on the brakes as it anticipates an accident between two other vehicles ahead of it.
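To make the idea of redundant sensors concrete, here is a deliberately simplified sketch (with made-up sensor names and readings) of how several independent distance estimates might be combined, with a cautious fallback when sensors fail:

```python
def fuse_distances(readings):
    """Combine distance-to-obstacle estimates (in meters) from several sensors.

    readings: dict of sensor name -> distance, with None for a failed sensor.
    Returns the most cautious (smallest) valid reading, or None if every
    sensor has failed - in which case a real system would slow to a safe stop
    or hand control back to the human.
    """
    valid = [d for d in readings.values() if d is not None]
    if not valid:
        return None  # all sensors failed: trigger the safe fallback
    # Act on the closest reported obstacle - the conservative choice.
    return min(valid)

# Three sensors report on the same obstacle; the ultrasonic sensor has failed.
readings = {"radar": 42.0, "lidar": 40.5, "ultrasonic": None}
print(fuse_distances(readings))  # 40.5 - lidar sees the nearest obstacle
```

Real perception stacks are vastly more sophisticated than this, but the principle is the same: no single sensor is trusted on its own, and disagreement is resolved in favor of caution.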

 

There’s also all that pesky stuff that makes humans…human. Autonomous vehicles don’t get tired or distracted, and they don’t require bathroom breaks or stops for coffee (unless the passengers require that!). Because of this, self-driving cars exhibit more consistent driving behavior, which is generally safer for everyone on the road – especially when you consider that human error is estimated to cause 97 percent of all accidents, with mechanical faults accounting for the remaining 3 percent.

 

There’s space for artificial intelligence here as well: AI models can be used to predict and anticipate the behavior of other drivers and pedestrians on the road, based on historical data from normal and abnormal driving. For example, the training dataset Tesla uses to improve its Autopilot system is 40 petabytes (that is, 40 quadrillion bytes) and will eventually grow to 200 petabytes.

 

Will self-driving cars reduce accidents?

The numbers don’t lie – autonomous vehicles are indeed safer than human drivers. According to Tesla’s Q4 2022 Vehicle Safety Report, vehicles using their proprietary Autopilot technology have one accident every 4.85 million miles driven. Compare that with the U.S. average of 652,000 miles between accidents, and I think we have an answer.
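As a quick sanity check on those two figures, the gap works out to roughly seven to one:

```python
# Figures cited above, from Tesla's Q4 2022 Vehicle Safety Report.
autopilot_miles_per_accident = 4_850_000
us_average_miles_per_accident = 652_000

ratio = autopilot_miles_per_accident / us_average_miles_per_accident
print(f"Autopilot goes about {ratio:.1f}x farther between accidents")  # about 7.4x
```

In other words, by Tesla’s own accounting, a car on Autopilot travels about seven and a half times farther between accidents than the average U.S. driver.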

 

It’s worth calling out why there’s such a stark difference in the numbers:

  1. Self-driving cars always follow the rules of the road. It’s hard-coded into the application running on the computers within the vehicle, which shouldn’t do anything to break the law – run a red light, cross a double yellow line or speed. This produces predictable, safe behavior on the road.
  2. Humans, aka the vast majority of drivers on the road today, are unpredictable. The rules of the road exist to make us all play the same game, but predictability goes out the window once you decide to break the traffic laws and open yourself and those around you to accidents.
  3. A number of accidents are caused by the human factor, as we noted above in the third question. Computers don’t get tired or distracted in the way a human will – and modern cars are also outfitted with plenty of sensors to keep tabs on the surrounding environment.

What if humans become overreliant on self-driving cars in the future?

With vehicles becoming significantly more advanced, there’s a genuine possibility that humans will eventually “forget” how to drive safely. Some long-term consequences might arise:

  • Safety Concerns: In emergency situations where manual driving is required, lack of driving skills could pose safety risks.
  • Dependence on Technology: Overreliance on autonomous systems might lead to challenges if these systems fail or are unavailable.
  • Legal and Insurance Implications: The legal framework and insurance policies would need to evolve to address liability in a world where human driving skills are diminished.
  • Cultural Shift: Driving as a skill and cultural activity might change, impacting industries like driver's education and traditional car manufacturing.
  • Skill Transition: New skills related to managing and interacting with autonomous vehicles could become more important than traditional driving skills.

There was a point in time where people got from A to B with horses and buggies; I wonder if they were asking themselves the same question.

 

Fun Fact: The autoflight systems on a modern commercial airliner (autoflight includes the lateral, vertical and throttle systems) are so advanced that they can control almost all phases of flight, across all weather conditions, with little or no input from the pilots. I was once milling around waiting for a flight at JFK and ran into a pilot who told me, “In the future there will be no pilots, just a man and a dog in the cockpit: a man to feed the dog and the dog to bite the man if he touches anything.”

 

* * *

 

Please fill out our brief survey so we can know what our readers are looking for.

 

 


About Dan Alvarez

 

Dan Alvarez began at JPMorgan Chase in June 2016 as a summer technology analyst/infrastructure engineer, and left in April 2022 as a Senior Software Engineer in Global Technology Infrastructure - Product Strategy and Site Reliability Engineering (SRE). Since May 2022, he has worked for Amazon Web Services as an Enterprise Solutions Architect.

     He is also an avid guest lecturer for the City University of New York and has given lectures on artificial intelligence, cloud computing and career progression. Dan also works closely with Amazon's Skills to Jobs team and the NY Tech Alliance with the goal of creating the most diverse, equitable and accessible tech ecosystem in the world.

     A graduate of Brooklyn College, he is listed as an Alumni Champion of the school and was named one of Brooklyn College's 30 Under 30. He lives in Bensonhurst, Brooklyn.

 
----------------------------------
 
Comments?