The Transportation Department projects the new rule could save 360 lives a year and prevent 24,000 injuries.
The Biden administration plans to require that all new cars and trucks come with pedestrian-collision avoidance systems that include automatic emergency braking technology by the end of the decade.
In an interview, Transportation Secretary Pete Buttigieg said the requirement is designed to reduce pedestrian deaths, which have been on the rise in the post-Covid-19 era.
…
The new standards will require all cars to avoid contact at up to 62 mph and mandate that they be able to detect pedestrians in the dark. They will also require braking at up to 45 mph when a pedestrian is detected.
If it’s radar-based it should be very reliable. The big issue is camera-based systems. Cameras can’t measure much, only colour and brightness; everything else is inferred, not measured. Inferring things isn’t inherently bad, but the errors need to be accurately known and considered. They probably are; it’s just that they are not weighted correctly relative to cost.
Radar doesn’t work for stopped vehicles at high speeds though. You’ll still need cameras and/or lidar. OEMs rely so heavily on radar, though, that their cameras don’t detect this well enough in emergency situations either.
You need either an impeccable vision algorithm, or lidar for this scenario.
That recent Blue Cruise crash, for example: even if the conditions had been perfect, and it hadn’t been at night with no lights, that’s a failure condition for the system. A vehicle moving out of the way of a stopped vehicle ahead of it is a failure condition on all existing L2 systems. In this case, given the specific conditions, it’s a failure condition for the human as well.
This is an actual area we can improve on in terms of mandated safety features: solving this problem. There are tons of rear-endings of stopped vehicles at high speed. It’s very dangerous for police/emergency services doing stops, or for broken-down vehicles. If the vehicle could react to this even when an L2 system isn’t engaged, that would be big.
If it were lidar-based, it wouldn’t even need to be a lidar capable of supporting semi-autonomous driving features; it could be a narrow, forward-facing unit used solely for this purpose in emergency situations.
It seems pointless to argue with these people because it seems they’ve convinced themselves that they know exactly how to construct an infallible or nearly infallible autonomous system even though actual experts with billions of dollars at their disposal have yet to create one.
Stereoscopic camera systems exist and they can work very well (like on my ten-year-old car).
They still don’t measure distance; they only infer it by comparing two images. This still has the same issue. It’s just a more reliable way to infer distance than a single camera. It also requires less processing, hence it was popular for earlier computer vision applications.
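The stereo inference described here can be sketched in a few lines: depth is triangulated from the disparity between matched pixels, so a small matching error turns into a large depth error at range. The focal length, baseline, and disparities below are illustrative numbers, not from any real system.

```python
# Sketch of stereo depth-from-disparity, illustrating why two cameras
# infer rather than directly measure distance. All parameters are
# illustrative assumptions.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth Z = f * B / d, triangulated from two matched image points."""
    return focal_px * baseline_m / disparity_px

f = 1000.0   # focal length in pixels (assumed)
B = 0.3      # camera baseline in metres (assumed)

# Suppose a target at range produces a disparity of 5 px:
z_est = depth_from_disparity(f, B, 5.0)   # 60.0 m

# A 1 px matching error (easy in low contrast or odd lighting)
# shifts the estimate substantially:
z_off = depth_from_disparity(f, B, 4.0)   # 75.0 m

print(z_est, z_off)
```

The matching step (finding the same point in both images) is where lighting, texture, and materials bite: the trigonometry is exact, but the disparity feeding it is an inference.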
Two cameras can measure distance as well. It’s how our eyes work and how things like 3D scanners work.
You misunderstand. Two cameras can infer distance; they do not and cannot measure it.
Two cameras can produce more accurate and reliable distance estimates than one, but they still cannot measure distance, only infer it.
In certain lighting conditions this can be very reliable, as with 3D scanners. But cars operate in a staggering range of lighting conditions, across a large variety of environments and materials.
They absolutely can calculate distance; it’s the same principle that lets us measure the distance of distant planets and galaxies using parallax and trigonometry. Furthermore, if you’re in conditions where cameras can’t see anything, then your car shouldn’t even be on the road, because your eyes certainly won’t be able to see anything either. Cameras aren’t limited to visible light frequencies the way the human eye is. My security cameras, for example, can see very clearly in pitch-black, low-light, or bright conditions.
Cameras are (generally) passive systems. They do not send out light and analyze the returns. They just absorb light and reconstruct a scene.
Radar systems are active. They send out pulses or continuous waves of EM energy at different frequencies than light (much lower) and analyze the returns. They do not need existing light to do their jobs because they’re sending out and receiving their own specific emissions. Because they run at lower frequencies than light, they are able to “see” through certain weather phenomena like fog and are relatively unfazed by darkness.
That all being said, you can measure things both passively and actively. Radar is pretty damn good at measuring distances, as that’s the entire reason behind its invention. I’d say it’s much more reliable and accurate compared to optical systems.
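A minimal sketch of the active-measurement point above: pulse radar recovers range directly from the round-trip time of its own emission, rather than inferring it from scene appearance.

```python
# Sketch of pulse-radar ranging: distance from the round-trip time of
# an emitted pulse. A direct time measurement, independent of ambient
# light. The echo time below is an illustrative assumption.

C = 299_792_458.0  # speed of light in m/s

def range_from_echo(round_trip_s: float) -> float:
    """Range = c * t / 2 (the pulse travels out and back)."""
    return C * round_trip_s / 2.0

# An echo returning after 1 microsecond puts the target ~150 m away:
print(range_from_echo(1e-6))  # ≈ 149.9 m
```

Real automotive radars are FMCW rather than simple pulse systems, but the principle is the same: the quantity of interest is measured from the system’s own emissions, not reconstructed from passive imagery.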
You misunderstand. I never said they can’t calculate distance. They just don’t measure it; it’s inferred. Inferring produces estimates whose unreliability is not accounted for appropriately in camera-based autonomy systems.
It’s not an issue in astronomy, because people can examine the results further and discard any clearly incorrect distance calculations.
It’s not just about not seeing anything. Outdoors is one of the most variable and challenging lighting conditions for most systems. Too much light, shadows, etc.
You only need one frequency of light for camera-based distance measurements. They require visual features rather than any specific wavelength.
Your security camera does not operate in pitch black. It has infrared lights that illuminate the scene; you just can’t see them. Your security cameras are subject to problems in both high- and low-light conditions. They can make adjustments throughout the day to compensate for changing light. However, none of that makes them measure distance.