Tesla Drops Parking Sensors From Its Safety Suite, Leaving Just Tesla Vision

After dumping radar and then ultrasonic close-range sensors, Tesla has decided to rely on cameras alone for its driver assistance and collision avoidance features going forward. The vast majority of automakers offering comparable features (adaptive cruise control, lane-keep assist, blind-spot monitoring, pedestrian detection, collision warning and automated emergency braking) use a combination of forward-facing (and sometimes rearward-facing) radar, cameras, and ultrasonic sensors that detect objects at varying distances and, critically, overlap as backups for one another. Tesla believes cameras alone can do it all.

You may remember that Tesla caused similar waves when it claimed its electric vehicles didn’t need radar to detect objects ahead; cameras alone would suffice. Again, most automakers pursuing greater autonomy in driver assistance features, such as hands-free highway driving systems that steer, brake, and accelerate for you (some even on pre-mapped streets), agree it is preferable to build in multiple redundancies via an overlapping set of sensors. And beyond redundancy, there is a case for separate, specialized sensors in their own right.

For example, a radar sensor picking up an object can be validated by a camera “seeing” the same thing. Radar is handy for detecting objects at greater distances, while cameras pick up information like road markings and the shapes of objects, making it easier to distinguish pedestrians, cyclists, and other vehicles. Ultrasonic sensors, meanwhile (the parking sensors long associated with Tesla vehicles, 12 per car), provide a near-field, 360-degree picture of what surrounds the car at any moment, a job Tesla believes it can now hand to cameras alone. That near-field awareness is crucial for features like Tesla’s Autopark self-parking system and Smart Summon, in which the car leaves its parking spot and drives to the person who “summoned” it via the Tesla app, with nobody inside.
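To make the redundancy idea concrete, here is a minimal, hypothetical sketch of how a fusion layer might cross-check a radar return against a camera detection before trusting it. This is not Tesla’s (or anyone’s) actual software; the `Detection` format and the agreement thresholds are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """A single object report from one sensor (hypothetical format)."""
    sensor: str            # "radar" or "camera"
    bearing_deg: float     # angle off the vehicle centerline
    range_m: float         # estimated distance to the object
    label: str = "unknown" # cameras can classify; radar generally can't

def cross_validate(radar: Detection, camera: Detection,
                   max_bearing_err: float = 2.0,
                   max_range_err: float = 1.5) -> bool:
    """Trust a radar return only if a camera detection roughly agrees
    on bearing and distance (thresholds here are made up)."""
    return (abs(radar.bearing_deg - camera.bearing_deg) <= max_bearing_err
            and abs(radar.range_m - camera.range_m) <= max_range_err)

# Radar sees something ~40 m ahead; the camera sees a cyclist there too.
r = Detection("radar", bearing_deg=1.2, range_m=40.3)
c = Detection("camera", bearing_deg=0.8, range_m=39.5, label="cyclist")
print(cross_validate(r, c))  # True: the two sensors corroborate each other
```

Drop one of the two sensor types and this kind of corroboration disappears; the system has to trust a single source, which is exactly the trade-off Tesla is making.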

Without ultrasonic sensors, Teslas will instead depend on eight cameras: three facing forward behind the windshield, one on each front fender, one on each B-pillar looking sideways, and one at the rear. Given Tesla’s description of the cameras and their placement, we can deduce that the rear- and side-facing units aid blind-spot monitoring (detecting cars in or approaching the Tesla’s blind spot) and, by extension, Autopilot’s automatic lane-change capability, as well as parking. And indeed, those cameras will eventually take over the parking duties, but only at some hazy future date.

During the transition to building cars without ultrasonic sensors, several features will be disabled or limited on newly delivered, camera-only Teslas (they will continue to work on vehicles already built and sold with ultrasonics):

Park Assist: warns the driver of objects around the car while it is traveling at less than five miles per hour, per Tesla.

Autopark: the self-parking technology that steers a Tesla into either a parallel or perpendicular parking spot.

Summon: using the Tesla app, remotely creep the vehicle forward or backward into or out of a parking space.

Smart Summon: brings the vehicle to you across a driveway or parking lot without you lifting a finger.

Not surprisingly, all of these gaps involve low-speed, close-range operation. Our best guess is that Tesla still has work to do perfecting its cameras’ ability to “see” things immediately around the car. It’s much like how cameras struggle to pick out distant objects: is that speck a vehicle rushing toward us, or a fly close to the lens? Likewise, detecting items at the camera’s periphery, such as near the corners of the car, is not as simple as detecting objects at the center of the frame.
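A rough bit of geometry illustrates the near-field problem: a camera mounted high on the car cannot see the ground close to its own base, because anything below its downward field of view is simply out of frame. The mounting height and angle below are invented for illustration, not Tesla’s specs:

```python
import math

# Invented mounting numbers, purely for illustration (not Tesla specs):
camera_height_m = 1.4          # camera mounted high behind the windshield
max_downward_angle_deg = 35.0  # steepest angle the lens can see downward

# Ground closer to the camera's base than this never appears in the image.
blind_radius_m = camera_height_m / math.tan(math.radians(max_downward_angle_deg))
print(f"Near-field blind zone: ~{blind_radius_m:.1f} m")  # ~2.0 m
```

Under those assumed numbers, a curb or low bollard within about two meters of the camera would never show up in the image at all; low-mounted ultrasonic sensors have no such blind ring, which is presumably why the parking features are the ones on hold.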

Tesla predicts steady improvement for this “occupancy network,” the vision system’s model of the space around the car. The same philosophy underpins the camera-only approach: camera-equipped Teslas should gain better “vision” over time as more driving situations are uploaded to Tesla’s cloud computing network for analysis, with the resulting software improvements delivered through over-the-air updates.
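Conceptually, an occupancy model discretizes the space around the car into cells and estimates, from camera imagery, whether each cell is occupied. Here is a toy two-dimensional sketch of that idea; the grid size, cell size, and update rule are all assumptions, and Tesla’s real occupancy network is a learned 3-D model, not a hand-written grid like this:

```python
import numpy as np

# Toy 2-D grid: 20 x 20 cells of 0.5 m each, with the car at the center.
GRID, CELL = 20, 0.5
occupancy = np.zeros((GRID, GRID))  # 0.0 = free, values near 1.0 = occupied

def mark_detection(x_m: float, y_m: float, confidence: float) -> None:
    """Fold a (hypothetical) vision-derived obstacle estimate into the grid,
    keeping the highest confidence reported so far for that cell."""
    col = int(x_m / CELL) + GRID // 2
    row = int(y_m / CELL) + GRID // 2
    if 0 <= row < GRID and 0 <= col < GRID:
        occupancy[row, col] = max(occupancy[row, col], confidence)

# The vision stack thinks something sits 1.5 m ahead, 0.5 m to the left.
mark_detection(x_m=-0.5, y_m=1.5, confidence=0.8)
print(occupancy[GRID // 2 + 3, GRID // 2 - 1])  # 0.8
```

The pitch, in effect, is that a representation like this, filled in ever more accurately as fleet data accumulates, can eventually match what the ultrasonic sensors measured directly.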