While Elon Musk confidently predicted Tesla robotaxis in 2019, it's Waymo and GM's Cruise that took the leap in 2023. But was it too soon? This video examines the highs and lows, from major technological breakthroughs to the troubling events on the streets of San Francisco, highlighting the potential dangers of robotaxis. Join us as we unpack the complex challenges facing self-driving cars and question the feasibility of achieving full autonomy on our roads.
#Wallstreetmillennial #tesla #waymo #cruise #robotaxi #selfdriving
Limited time: get 5 free stocks when you sign up to moomoo and deposit $100 and 15 free stocks when you deposit $1,000. Use link https://j.moomoo.com/00iPZo
Email us: Wallstreetmillennial @gmail.com
Check out our new podcast on Spotify: https://open.spotify.com/show/4UZL13dUPYW1s4XtvHcEwt?si=08579cc0424d4999&nd=1
All materials in these videos are used for educational purposes and fall within the guidelines of fair use. No copyright infringement intended. If you are or represent the copyright owner of materials used in this video and have a problem with the use of said material, please send me an email, wallstreetmillennial.com, and we can sort it out.
––––––––––––––––––––––––––––––
Buddha by Kontekst https://soundcloud.com/kontekstmusic
Creative Commons — Attribution-ShareAlike 3.0 Unported — CC BY-SA 3.0
Free Download / Stream: http://bit.ly/2Pe7mBN
Music promoted by Audio Library https://youtu.be/b6jK2t3lcRs
––––––––––––––––––––––––––––––
In 2019, Elon Musk said he was very confident that there would be Tesla robotaxis on the road with no human drivers the following year. It's now been more than four years, and there's still no Tesla robotaxi. In 2023, Google-owned Waymo and GM-owned Cruise beat Tesla to the punch, securing regulatory approval to launch commercial robotaxi services 24/7 in San Francisco. Technologists were quick to proclaim this a great victory and a big step towards the widespread rollout of self-driving cars. However, within just a few weeks of operation, the San Francisco autonomous driving experiment has already been an unmitigated disaster, with self-driving cars causing crashes, killing dogs, creating traffic jams, and even disrupting emergency vehicles. On August 18th, 2023, just a week after receiving approval for 24-hour operations, a Cruise autonomous vehicle crashed into a fire truck, causing at least one injury. The San Francisco fire chief says there have been numerous incidents of robotaxis disrupting emergency vehicles, driving over fire hoses, and driving into active crime and fire scenes.
It's only a matter of time before something catastrophic happens. Over the past decade, companies like Google, GM, and Tesla have poured billions of dollars into developing self-driving cars, with the promise that they'll be far safer than human drivers. In this video, we'll look at the technological challenges of autonomous cars and why full self-driving is probably never going to happen. In the US alone, around 40,000 people die every year in traffic-related incidents. Countless more are left with life-changing injuries. Theoretically, a self-driving car could be safer than a human driver and prevent many of these deaths. For example, a robot car will never get drunk, never fall asleep, and never text.
A computer can run calculations far faster than a human, with far fewer mistakes. When you drive a car, you take in visual and auditory information through your eyes and ears: Where are the road lanes? Where are the stop lights? Where are the other cars? Is a car near you honking? If you put a bunch of cameras and microphones on a car, a computer would theoretically have all the information a human driver has. Thus, in theory, the computer should be able to do everything a human driver does, faster and with less chance of error.
In fact, if you put radar or lidar sensors on a car, the computer could have even more information than a human has. If fully autonomous cars are theoretically possible, it makes sense that investors would be interested, as the commercial opportunity is enormous. Consumers would be willing to pay a lot of money for a self-driving car. Or, even better, you could create an Uber-like service where people hail autonomous cabs without the expense of a human driver. The profit margins would be extremely attractive.
For the past decade, tens of billions of dollars have been poured into the space. Google's self-driving car project, which later became Waymo, dates back to 2009. In 2016, General Motors acquired Cruise, which eventually secured additional funding from SoftBank. In 2014, Tesla unveiled its Autopilot feature, its first step towards full self-driving. Between 2017 and 2020, Ford and Volkswagen each invested more than $1 billion into a self-driving venture called Argo AI. These companies collectively spent tens of billions of dollars hiring tens of thousands of talented engineers and AI researchers from around the world. They retrofitted regular cars with the cameras and radars necessary for self-driving and tested them on real streets, but with human safety drivers always in the car, ready to take over if necessary. They used the data to train their AI models and refine their self-driving software. In terms of training data, Tesla seemed to have a clear advantage.
Since 2015, all new Tesla cars have shipped with cameras and radars built in. Driving data is sent back to Tesla HQ over Wi-Fi, giving Tesla billions of miles' worth of real-world driving data. According to Elon Musk, this data allows their self-driving technology to improve exponentially, and it's only a matter of time before they have full self-driving cars safer than human drivers. By the late 2010s, it looked like things were progressing pretty well. In 2017, Waymo started fully autonomous taxi rides in Arizona without a human safety driver. According to Waymo's own analysis, their autonomous cars were safer than human drivers.
Tesla claims that its Autopilot feature is far safer than unassisted driving. In most quarters, there's less than one crash per 4 million miles driven with Autopilot engaged, compared to the US average of about one crash every half million miles. Autopilot is only available on highways; their so-called Full Self-Driving beta is available on city streets as well.
They claim that Teslas using full self-driving are 80% less likely to get in a crash than regular cars. So everything is great. Autonomous cars are already safer than human drivers, so we should all switch to using self-driving cars today and save tens of thousands of lives, right? Unfortunately, it's not that simple. The problem with data and statistics is that they can easily be manipulated and taken out of context to give a misleading impression.
In the most recent quarter available, Tesla claims that vehicles with Autopilot engaged record one crash every 4.85 million miles driven. This compares to one crash for every 1.4 million miles driven for Teslas without Autopilot engaged, and one crash per 652,000 miles driven for all cars in the US. This suggests that with Autopilot, Teslas are more than seven times safer than regular cars, and that even without Autopilot engaged, they're more than twice as safe. There are a few major problems with this analysis.
Firstly, Autopilot is only engaged on highways, while the vast majority of automobile accidents are fender benders that happen on city streets. Thus, it's an apples-to-oranges comparison, which is close to meaningless. Secondly, Teslas are expensive, much more so than the average gasoline-powered car, so the people who buy them tend to be wealthier and older, and wealthier, older drivers are less likely to get into crashes in the first place. We know this from Tesla's own data: even without Autopilot engaged, Teslas get into less than half as many crashes as the US average. Tesla has access to a huge amount of driver data. If it wanted to, it could create an apples-to-apples comparison of vehicle safety with and without Autopilot, but instead it only gives us these useless apples-to-oranges comparisons.
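For concreteness, the headline ratios discussed above can be reproduced directly from the per-crash mileage figures. This is a quick sketch using only the numbers quoted in the video; the group labels are paraphrased, not Tesla's exact report wording:

```python
# Reproducing the headline crash-rate ratios cited in the video.
# Each figure is miles driven per reported crash.
miles_per_crash = {
    "Tesla, Autopilot engaged": 4_850_000,
    "Tesla, no Autopilot": 1_400_000,
    "US average (all cars)": 652_000,
}

us_avg = miles_per_crash["US average (all cars)"]
for group, miles in miles_per_crash.items():
    ratio = miles / us_avg
    print(f"{group}: {ratio:.1f}x the US-average miles per crash")
# Prints roughly 7.4x, 2.1x, and 1.0x respectively -- the "seven times
# safer" and "more than twice as safe" claims. The ratios are real; the
# problem is that the underlying populations aren't comparable.
```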
Why don't they give us better data? Probably because they themselves don't like the results. In April of 2020, the company disclosed that Tesla drivers had driven an aggregate 3 billion miles on Autopilot since the product launched in 2014. There's a website called TeslaDeaths.com which keeps track of all Tesla-related fatalities and whether or not Autopilot was involved. According to its data, there were 15 fatalities with suspected Autopilot involvement during that period.
That gives a rate of 0.5 deaths per 100 million miles driven. So how does that compare to unassisted human driving? In 2019, there were 3.26 trillion total vehicle miles traveled in the US and 36,096 total traffic-related fatalities. That translates to 1.11 fatalities per 100 million miles driven. So, based on this data, Tesla Autopilot looks about 55% safer than the average car. But we have to make some adjustments. Even when a Tesla has Autopilot engaged, the driver is still required to keep their hands on the wheel and be prepared to take control at any time. And as we explained before, Tesla drivers are safer than average due to their socioeconomic and age characteristics: in the first quarter of 2020, the last full quarter in the relevant period, Tesla drivers without Autopilot engaged were 2.6 times less likely to crash per mile driven than the US average.
Another factor is the weather. According to the US Department of Transportation, roughly 16% of all traffic-related fatalities happen during inclement weather. Tesla Autopilot refuses to operate in conditions of poor visibility such as rain, fog, or snow, and it also refuses to operate on narrow or winding roads and in other unfavorable conditions.
After applying the driver-quality adjustment and the inclement-weather adjustment, the average US death rate goes down to roughly 0.36 deaths per 100 million miles. This is significantly lower than the Autopilot death rate of 0.5. Thus, Autopilot is more dangerous, not less. The main reason Autopilot is so dangerous is that its AI models often fail to recognize unusual objects on the road, or things they haven't seen before. For example, in 2022, a Tesla with Autopilot engaged crashed into a motorcycle, killing the motorcyclist. Motorcycle tail lights are shaped differently than those of regular cars; the Tesla erroneously classified it as a car and misjudged the distance. Telling the difference between a motorcycle and a car is easy for a human but surprisingly difficult for a computer. Tesla's claims about its Full Self-Driving feature are even more misleading.
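The adjustment arithmetic above can be laid out explicitly. This is a back-of-the-envelope sketch using only the figures quoted in this video; the 2.6x driver-quality factor and 16% weather share are the video's numbers, applied here as simple multipliers:

```python
# Replicating the video's adjusted fatality-rate comparison.
autopilot_deaths = 15      # suspected Autopilot fatalities (TeslaDeaths.com)
autopilot_miles = 3e9      # aggregate Autopilot miles, per Tesla (April 2020)

us_deaths_2019 = 36_096    # US traffic fatalities, 2019
us_miles_2019 = 3.26e12    # US vehicle miles traveled, 2019

per_100m = 100e6           # normalize to deaths per 100 million miles

autopilot_rate = autopilot_deaths / autopilot_miles * per_100m  # = 0.50
us_rate = us_deaths_2019 / us_miles_2019 * per_100m             # ~ 1.11

# Adjustment 1: Tesla drivers crash ~2.6x less often even without Autopilot.
# Adjustment 2: Autopilot won't run in bad weather, where ~16% of deaths occur.
adjusted_us_rate = us_rate / 2.6 * (1 - 0.16)                   # ~ 0.36

print(f"Autopilot:            {autopilot_rate:.2f} deaths per 100M miles")
print(f"US average:           {us_rate:.2f} deaths per 100M miles")
print(f"US average, adjusted: {adjusted_us_rate:.2f} deaths per 100M miles")
```

On these assumptions the comparable human baseline (0.36) ends up below the Autopilot rate (0.50), which is the video's point: the raw comparison flips once the driver population and operating conditions are matched.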
In their 2022 impact report, Tesla claimed that Teslas with Full Self-Driving activated experienced 80% fewer crashes per mile than the US average. While this may be technically true, it means very little: for the majority of this period, Full Self-Driving was only available to drivers who had perfect safety scores based on Tesla's proprietary scoring system. Comparing these drivers to the US average is grossly misleading.
Tesla has disclosed very little data about the performance of Full Self-Driving. Why would they want to withhold data about their brilliant new technology? In May of 2023, former Tesla employees leaked Full Self-Driving data to a German newspaper. The leak showed the automaker had already received thousands of complaints from customers, including hundreds of cases of the car braking abruptly to avoid obstacles that did not exist. In other cases, the car would randomly slam on the accelerator, sending it into ditches, walls, or even oncoming traffic. Tesla is actively selling its Full Self-Driving software for up to $200 per month, and Elon Musk frequently touts the company's self-driving technology to pump up the share price and, by extension, his own net worth.
People are paying for this with their lives. Waymo, the self-driving car venture owned by Google, has been testing its fully autonomous vehicles in multiple states since at least 2017. They claim to have driven over 1 million miles fully autonomously without a single reported injury and with only 18 minor contact events, and in every single vehicle-to-vehicle contact event, the other driver was at fault, not Waymo. Based on this data, you might think that Waymo cars are infallible supercars incapable of making a mistake, but we need more context.
The reason Waymo cars have such a great safety record is that they are programmed to drive extremely conservatively. When one encounters a complex traffic situation it doesn't understand, it often resorts to abruptly slamming the brakes. This has resulted in numerous cases of Waymo cars being rear-ended. While in such situations the car behind is technically at fault, it's undeniable that Waymo's unexpected halts introduce an element of unpredictability on the road, potentially heightening risks even when the Waymo car isn't directly to blame.
Because of this, many residents in areas where Waymo operates say they hate the driverless cars. In April of 2023, both Waymo and Cruise received regulatory approval to operate their driverless cars in San Francisco 24 hours a day, 7 days a week. Within the first few months, the experiment has already been a disaster. There have been numerous incidents of both Waymo and Cruise vehicles stopping randomly in the middle of the street, causing traffic jams and in some cases obstructing emergency vehicles. The San Francisco fire chief says there have been numerous incidents of robotaxis blocking emergency vehicles, driving over fire hoses, and driving into active crime and fire scenes. It's only a matter of time before something catastrophic happens. This raises the question: why are autonomous vehicles still so bad at driving? The best analysis I found was published by the University of Copenhagen in April of 2023.
The researchers analyzed 16 hours of dashcam footage from Tesla and Waymo self-driving cars to assess their competence at handling real-world traffic situations. When you drive a car on a public road, you may not realize it, but you communicate with other drivers dozens, if not hundreds, of times per hour. For example, when you turn at an intersection or change lanes, you look at what the other cars around you are doing and make a decision: should I yield or go? How will the other cars react to my movements? If you're an experienced driver, you make these decisions subconsciously, in a fraction of a second.
Based on the slightest movements of surrounding cars, you can interpret their intentions and predict their future actions. The computers powering autonomous vehicles find it incredibly difficult to interpret the intentions of human drivers. In one example from the University of Copenhagen analysis, a Waymo car trying to cross an intersection sees pedestrians on the side of the road. The Waymo notices the pedestrians and stops, presumably in anticipation of them crossing.
The pedestrians repeatedly wave at the Waymo, signaling for it to go first. The Waymo completely fails to recognize their hand signals and remains stopped for a full 11 seconds. This is a clear example of a self-driving car failing to understand simple gestures that a human driver would instantly recognize without a second thought. In a separate example, a Tesla using Full Self-Driving struggles to navigate a crowded four-way stop.
This is a complex maneuver because the various cars need to collectively decide in which order they will cross the intersection. The Tesla eventually gets through, but only after starting and stopping multiple times and crawling at 1 mph. While it did not result in a crash, traffic was severely disrupted. Driving is a surprisingly complex task. For traffic to flow smoothly and safely, everyone needs to act in a predictable manner. When you add a bunch of robotaxis that act erratically and unpredictably, at best you'll get severe disruptions to traffic, and at worst you'll get a public safety hazard. On August 22nd, 2023, Cruise was forced to slash its fleet of robotaxis in San Francisco by 50% after one of its vehicles crashed into a fire truck. It's probably just a matter of time before the city pulls the plug on this ill-fated experiment. The technologists claim that they just need a little more time and data, and eventually robotaxis will be safer than humans. But they've already been working on this for over a decade, collected billions of miles of data, and spent tens of billions of dollars. How many more people need to die before we wake up to the reality that self-driving cars are never going to happen?
All right guys, that wraps it up for this video. What do you think about autonomous cars? Let us know in the comments section below. As always, thank you so much for watching, and we'll see you in the next one! Wall Street Millennial, signing out.
self driving cars are not safe, period. There will never be a time when self driving cars are safe. It's literally the 21st century version of snake oil
Self driving cars will be the last nail in the coffin of personal freedom.
Absolute waste of money & utter scam/vaporware that's really there to WOW dummies with money (I mean "investors")
A REAL good use of that money can go towards actually developing better public transport. Reduce traffic on the road, reduce pollution, reduce roadway footprint, encourage people to interact with each other more, get a greater density of people to their destination sooner, since they're not driving, they can do tasks while in transport…
I mean, it's the obvious answer here, but no, everyone wants their own futuristic pod. People pay money to be lazier & lazier & more isolated from others.
you're an idiot if you think it's never going to be possible especially with all the growing technologies as evidence.
I've watched a few live streams of Waymo taxis and I was very surprised how advanced their system is. It's not perfect but it does look promising. Seems like they have the best software right now.
Self-driving flying taxis are already in the air (carrying missiles). 🎉🎉🎉
"It will be like owning a horse in 3 years."
Sure it will, Elon.
I've long said that self-driving cars will need to be in a paradigm where they are not only driving the car and using GPS or visually seeing road signs and other vehicles, but are also in communication with one another, with infrastructure built into roadways, and even where needed a central traffic processing computer/A.I. In this situation there will be no intersection faux pas. In fact cars won't even stop at intersections and will seamlessly weave through with one another.
This is not an impossible goal. Autonomous vehicles will become a reality and the norm in time. When the vast majority of cars on the road are autonomous (along with infrastructure designed for them), they will be FAR safer than human drivers.
The problem is these models need a decade or so of data, including catastrophic failures and hundreds of thousands of hours of humans training that data – in order to be sufficiently robust. So we have to sacrifice people and streets to generate data that will ultimately only benefit rich capitalists. My fear is that rather than putting rigid requirements on autonomous vehicles, cities will instead begin ticketing people and human drivers aggressively for not behaving in a way that autonomous vehicles can understand. SF and Arizona have both already decided that gifting information to aid the profits of private companies is more valuable than their own citizens. America is truly a nightmare hellhole.
Self driving cars will never happen until they train their cars in south and southeast asia.
Out in AZ it seems like everyone in a Tesla drives like a dickhead and get in wrecks constantly. I worked a wreck that no shit was two Teslas that hit each other because one person pulled out knowing they couldn't see past the truck coming towards them.
I love how you dissect all the nonsense overhyped bluffs fed to the markets and the public one after another. Congratulations, great work.
Someone once thought airplanes were a dangerous pipe dream, or that posting videos of your own making and expecting to be paid per view was a pipe dream.
But self driving cars are EASY!
Caveat: nothing around the car moves, on a perfectly straight road, in perfect conditions.
self driving cars and EVs are a fool's errand. neither do anything to improve urban living or decrease the risks of climate change.
Well ! What did you think was going to happen ?
I'm sure it will happen some day but we are not there yet.
Yours is a good article, but you don't understand why what this fool Elon is doing is so dangerous. AI networks are only as good as the data they are trained on, and even with varied real-world data there is no logical way to know why they are sensitive to different factors like texture. Also, it is not one network they use but several, and each can have a different level of sensitivity to external factors that are simply not predictable. And his insistence on using only camera data is downright reckless; we all know how well your average camera works in different lighting conditions.
Huge amounts of copium from AI advocates and apologists alike. WSM is actually correct when it said that FSD will “never” happen.
The thing is that it already doesn't work. Good ideas do not start out as failures and then fix themselves up later. Good ideas prove their concept on a micro scale first, and then scale upwards once they have gained the trust of society.
As was pointed out, countless amounts of capital, and R&D and God knows what else has already been poured into the idea. As far as resources and funding goes, there’s no more space to go up. Many financial black hole project ideas have already shown where this goes- nuclear fusion being one of them.
Don't forget Uber's modified XC90, that hit and killed a woman.
BTW whatever comes out Elon's mouth is crap.
Self driving cars will totally happen, but the infrastructure has to change. Everything on the road has been set up for humans, and computers don't think like humans. The infrastructure has to be changed so that it makes sense for computers.
As someone who is much older than a millennial, I'll offer that "never" is a long time. I believe it is inevitable that self driving vehicles will be a thing. But not in the next few years and likely, not within a decade or even two.
Here's a simple tip. If Softbank invests in it, it is, at best, a losing bet and at worst, a scam.
Real talk: Catastrophic casualties for new significant technology is normal. Thousands of humans lives; which i will remind you are completely unique; should be expected to be sacrificed before this technology is truly considered safe or reliable. This is dark but its the truth; and I only speak about it so flippantly because it seems to be REALITY across our entire history
Excellent report and analysis
You are not considering that the development of autonomous technology is accelerating as AI technology and the computing power available to it accelerates. Last year, one could have predicted that ChatGPT wouldn't work. It is true that the development in previous years went down many dead ends, but that is typical in the development of new technologies. It remains to be seen whether Tesla's Dojo computer and new training approach (currently called version 12) will finally succeed in taking autonomous driving technology to a safe level. In any case, this video makes it clear that you have no particular expertise or inside knowledge about the state of development of the technology.
The only way self-driving cars are going to work is if you coordinate and control the movement of all vehicles on the road simultaneously. It's a brilliant idea. I thought of it first. I'll call it .. Skynet.
This is grossly overstated. The point this video misses is that self driving cars can be far, far safer than human drivers without being perfect. The only reason why self driving cars may be much delayed is because people would rather see 100 people killed by human drivers than one person killed by a self driving car.
I think it's harsh to say that self driving cars are never going to happen. It may take more time than we think, but self-driving cars feel a lot more feasible than, for example, airplanes did in the 1890s. And we did manage to make airplanes…
You cannot say never. Like AI research in the 1970s led to a dead end, suddenly we got computers beating humans at chess, and then ChatGPT.
I am a software engineer and I worked for an AI company, so I know more than most. But even the uninitiated can see that it's not a dream.
We might not have the necessary breakthrough to make it happen now, and it's dangerous. Still, it's disingenuous to lie about it and say "never".
It might happen 50 years from now, who knows ? Probably depends on whether we kill each other before that or not I suppose.
It is not difficult for a computer to properly distinguish a motorcycle from a car. The problem is that Tesla's system is fundamentally badly programmed. If it were well programmed, it would not differentiate objects purely by class; it would take the object's physical characteristics from radar and fold that data into its analysis. A motorcycle would then become an obstruction at the correct distance, and at a fundamental level such a self-driving car could not make that kind of mistake. Sure, it could stall and refuse to proceed on its own, but if radar data is tied into its calculations via real-time world evaluation, it will always register an object and refuse to drive through it.
The downside is that this adds issues when there is something on the road the car could safely pass, for example trash blocking the way that could safely be driven over. That's where the programming complications in the system I'm describing come from.
The only thing I'd add is that autonomous driving is best suited to long-distance driving outside the cities. That driving is monotonous and simple, which these cars could handle, and if they encounter a situation they are not sure about, they could call a human to guide them through. This is one unexplored and brilliant niche that very few companies, if any, have thought about. Don't try to invent the car in one go; invent the wheel first and go from there. In simpler terms, start from simpler tasks and gradually build up the expertise. Trucking and enhanced driver assistance for normal drivers is where this thing is the real deal.
It can only work if every vehicle on the road is a computer driven car, directed by a centralized computer system. At home, you type in your destination and the system directs your car to it. But it will never happen. We love our carfreedom too much and how many times do we go somewhere and underway we decide to take a detour to visit a shop or restaurant on the way to our destination ?
Those who trust this technology need to be tested again.
Seems like the major challenge is getting the autonomous cars to talk with each other. Not a hard comms issue compared to the other engineering challenges inherent to self-driving. If comms hardware are mandatory in a given area for all cars, self-driving will probably be improved markedly – if the researchers are right that car-to-car intention is a core issue at present.