Thursday, March 28, 2024

Why Tesla is Wrong to Drop Radar

Given all the benefits that radar can offer, why then would Tesla be moving away from it and focusing on a camera-only sensor suite?

Why Tesla ditched radar

Tesla say that radar will occasionally give a mismeasurement, for example, a manhole cover being mistaken for an obstacle. This leads to a phenomenon called phantom braking, where the emergency braking response is triggered for no real reason. Tesla's approach is to use examples of good radar data to train their neural network so that the cameras can make the same depth and velocity measurements as radar. They say this has been a success, and under the right conditions the technique can work very well. But what about when the conditions are not right?
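As a rough sketch of what such a training setup could look like, assuming radar measurements are used as supervision labels for a camera network (the toy model, L1 loss, and random placeholder data below are illustrative assumptions, not Tesla's actual pipeline):

```python
# Illustrative sketch only: a toy camera network trained to regress the
# depth and radial velocity that a "good" radar would have reported.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CameraDepthVelocityNet(nn.Module):
    """Maps an RGB crop of a detected object to (depth, radial velocity)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 2)  # outputs: depth (m), radial velocity (m/s)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = CameraDepthVelocityNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Placeholder batch: camera crops of detected objects, with the radar's
# depth/velocity measurements standing in as the training labels.
camera_crops = torch.randn(8, 3, 64, 64)
radar_labels = torch.randn(8, 2)

for step in range(100):
    pred = model(camera_crops)
    loss = F.l1_loss(pred, radar_labels)  # cameras learn to mimic the radar
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

The quality of such a system is capped by how well the cameras can reproduce those radar-quality labels once the radar itself is gone.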

When the radar was first disabled, Tesla let customers know that there would be some temporary limitations on the capabilities of their ADAS systems. Tesla limited the Autosteer functionality to below 75 mph, increased the minimum following distance, disabled emergency lane departure avoidance, and set the high beams to come on automatically at night (presumably to counter the cameras' poor night vision). In addition, some customers reported reduced performance in the rain. This highlights some of the key advantages that radar has over cameras. Unlike cameras, radar is largely unaffected by poor lighting and visibility conditions. The wavelength at which radar operates means it does not see ambient occlusions such as dust and rain particles, and since it emits a signal and looks for its own echo, it does not matter whether it is day or night, or whether the sensor is facing direct sunlight.

Qualitative attribute performance. Source: IDTechEx – "Automotive Radar 2022–2042"

Other manufacturers have not shown the same enthusiasm for camera-only sensing. In fact, according to IDTechEx research in "Automotive Radar 2022–2042", the number of radars per vehicle will grow. This growth is being driven by the adoption of technologies such as blind-spot detection and cross-traffic alert, which use radars to monitor the perimeter of the vehicle for other road users concealed in blind spots.

In multiple discussions that IDTechEx has had with key players in the automotive industry, it even seems likely that radar could start to replace ultrasonic sensors, which are typically used in parking assistance systems. This would see the number of radars per vehicle potentially grow past five. Additionally, radar is a sensor heavily used by companies working on robotaxis, with some companies using as many as 21 radars per vehicle. So, if Tesla are looking to set a trend, it is not one that appears to be catching on.

Software moves faster than hardware

In a presentation from Tesla's director of artificial intelligence in June 2021, it was noted that situations such as driving under overpasses are tricky for radars because of their poor elevation resolution. This was true for the radar that Tesla was using. The problem is that with poor elevation resolution it is hard for the radar to tell that there is free space underneath the overpass, so the vehicle slows as a precautionary measure. You could teach the system that a big signature, like the one caused by an overpass, should be ignored (as it is probably something that can be driven under); however, this creates issues if there is a parked vehicle underneath. The radar would still not be able to distinguish the overpass from the vehicle, a situation that could potentially lead to a collision.
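To give a feel for why, here is a back-of-the-envelope sketch: at a given range, the elevation resolution sets how finely the radar can separate heights, so a coarse resolution blurs an overhead bridge and a stopped car together. The range, heights, and resolution values below are illustrative assumptions, not specifications of any particular radar.

```python
# Illustrative numbers only: height ambiguity of a radar detection at range,
# for a coarse ("legacy") and a fine ("imaging") elevation resolution.
import math

def height_ambiguity(range_m: float, elevation_resolution_deg: float) -> float:
    """Approximate vertical extent that a single elevation cell spans at range."""
    return range_m * math.sin(math.radians(elevation_resolution_deg))

range_m = 150.0        # distance to the return (assumed)
bridge_height_m = 5.0  # underside of the overpass (assumed)
car_height_m = 1.0     # rough height of a car's radar return (assumed)

for res_deg in (10.0, 1.0):  # coarse vs fine elevation resolution (assumed)
    blur = height_ambiguity(range_m, res_deg)
    separable = blur < (bridge_height_m - car_height_m)
    print(f"{res_deg:4.1f} deg resolution -> ~{blur:.1f} m height blur, "
          f"overpass vs car separable: {separable}")
```

With these assumed numbers, the coarse radar smears roughly 26 m of height into a single cell, so the overpass and a parked car underneath look the same; at around 1 degree of elevation resolution the two can, in principle, be separated.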

Tesla Model S PLAID. Courtesy of Tesla

Tesla was using a Continental ARS4-B radar, which was a perfectly good radar…in 2014. Since then, radar technology has come a long way. One measure of a radar's potential imaging performance is the number of virtual channels it has. This is the product of the number of transmitting channels and the number of receiving channels and is analogous to the number of pixels in a camera. The Continental ARS4-B used by Tesla had 8 virtual channels, which was the norm in 2014. Since then, the industry has moved to 12 virtual channels, but the latest radars from Continental have 192 virtual channels. Start-ups like Arbe and Uhnder, and others covered in "Automotive Radar 2022–2042", have upwards of 200 virtual channels, with room to grow to over 2,000.
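Because virtual channels are just the product of transmit and receive channels, the jump in imaging capability can be shown with simple arithmetic. The Tx/Rx splits below are assumptions chosen to reproduce the channel counts quoted above; they are not published antenna configurations.

```python
# Virtual channels = transmit channels x receive channels.
# The Tx/Rx splits here are illustrative assumptions, not vendor specs.
generations = {
    "circa-2014 radar (8 virtual channels)": (2, 4),
    "typical mid-generation (12 channels)":  (3, 4),
    "latest imaging radar (192 channels)":   (12, 16),
}

for name, (tx, rx) in generations.items():
    print(f"{name}: {tx} Tx x {rx} Rx = {tx * rx} virtual channels")
```

Reaching the 2,000-plus virtual channels promised by some start-ups follows the same multiplication with larger arrays; a hypothetical 48 Tx by 48 Rx design, for instance, would give 2,304.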

This is not to blame Tesla though; many new vehicles are in the same boat. Part of the problem is the long life cycle that vehicles have, typically 10 years. This means if a carmaker releases a new vehicle today and a game-changing radar hits the market tomorrow, it will be up to 10 years before that radar can be put on the new vehicle. In other words, for any new vehicle towards the end of its product cycle, the hardware on it is likely to be 5-10 years out of date, or possibly more. Tesla’s sensor suite was defined in 2016, so it will likely be 2026 before big changes are made to the hardware.

Tesla can combat this by making much of the vehicle software-defined. This allows them to iteratively improve their products through their lifecycle using over-the-air updates. For camera-based systems, this works well, as cameras produce a wealth of data, and software improvements are still available to make the most of that data.

The difference that this increase in virtual channels makes to a radar's imaging potential is enormous. The latest radars on the market, and those being developed by start-ups, produce images much closer to LiDAR than the ambiguous scans of the past.

Part of this improvement is down to a transition in semiconductor technologies. SiGe BiCMOS-based radars, like the one Tesla used, have dominated for the past decade. This is because, compared with Si-CMOS-based radars, they were able to generate higher signal-to-noise ratios. However, as transistor sizes have come down, Si-CMOS-based radars have been able to match and exceed BiCMOS performance. The bonus is that the reduced transistor sizing enables more functionality and more virtual channels per radar. These Si-CMOS radars only entered the market in 2019 and are not yet widely used, and the highest-performing radars from start-ups are yet to hit the market. However, the new heights of performance that these radars bring might persuade Tesla to re-evaluate its attitude towards radar.

Tesla regularly makes the argument that humans drive with vision only, therefore a vehicle should be able to do so too. While this is true, it seems to be a bit of a closed mindset. Yes, humans drive with just two eyes, but we do not have much of a choice. Tesla may well be able to get by without radar and plow ahead with a vision-only approach. In the opinion of IDTechEx, this will hamper Tesla's performance potential. With higher-performing radars coming to market, Tesla may well re-evaluate its decision.

IDTechEx – http://www.idtechex.com/
Since 1999 IDTechEx has provided independent market research, business intelligence and events on emerging technology to clients in over 80 countries. Our clients use our insights to help make strategic business decisions and grow their organizations. IDTechEx is headquartered in Cambridge, UK with subsidiaries in Japan, Germany and the USA.