Thursday, February 08, 2018

Velodyne Talks about LiDAR Advantages, Tesla Denies the Need

Velodyne publishes a video on LiDAR advantages in automotive applications:

SeekingAlpha publishes the Tesla Q4 2017 earnings call transcript, with CEO Elon Musk saying:

Q: "Elon, on your autonomous vehicle strategy, why do you believe that your current hardware set of only camera plus radar is going to be able to get you to fully-validated autonomous vehicle system? Most of your competitors noted that they need redundancy from lidar hardware to given the robustness of the 3D point cloud and the data that's generated. What are they missing in their software stack and their algorithms that Tesla is able to obtain from just the camera and plus radar?

Further, what would be your response if the regulatory bodies required that level of redundancy from incremental lidar hardware?
"

Elon Musk: "Well, first of all, I should say there's actually three sensor systems. There are cameras, redundant forward cameras, there's the forward radar, and there are the ultrasonics for near field. So, the third is also – the third set is also important for near-field stuff, just as it is for human.

But I think it's pretty obvious that the road system is geared towards passive optical. We have to solve passive optical image recognition, extremely well in order to be able to drive in any given environment and the changing environment. We must solve passive optical image recognition. We must solve it extremely well.

At the point at which you have solved it extremely well, what is the point in having active optical, meaning lidar, which does not – which cannot read signs; it's just giving you – in my view, it is a crutch that will drive companies to a local maximum that they will find very difficult to get out of.

If you take the hard path of a sophisticated neural net that's capable of advanced image recognition, then I think you achieve the goal maximum. And you combine that with increasingly sophisticated radar and if you're going to pick active photon generator, doing so in 400 nanometer to 700 nanometer wavelength is pretty silly, since you're getting that passively.

You would want to do active photon generation in the radar frequencies of approximately around 4 millimeters because that is occlusion penetrating. And you can essentially see through snow, rain, dust, fog, anything. So, it's just I find it quite puzzling that companies would choose to do an active photon system in the wrong wavelength. They're going to have a whole bunch of expensive equipment, most of which makes the car expensive, ugly and unnecessary. And I think they will find themselves at a competitive disadvantage.

Now perhaps I am wrong. In which case, I'll look like a fool. But I am quite certain that I am not.
"

22 comments:

  1. He is right, provided Tesla has a 10 Mpx radar :)

  2. Interesting discussion between what is necessary and what is possible. Elon Musk is very convincing, and possibly has the right strategy (and communication) for the automotive manufacturing company he is managing. There are other types of companies / business models in the space of Autonomous Driving. For the likes of Waymo, Uber, Lyft and Baidu, lidars are the best way to deliver a Level 5 robotic vehicle today, and they do not have to be produced at 50k units per quarter... Pierre Cambou - Yole Développement

  3. For a long time I thought lidar was a must. But taking Elon's thoughts and thinking about the baseline: the human eye can do the job perfectly (everything other than radar issues), so ultimately passive should be sufficient...

  4. Further on to my last post: interference of active systems in highly populated traffic situations has to be considered. That applies to lidar as well as radar systems!

  5. Humans make mistakes, crash, and are liable. Computers are not allowed to make mistakes.

  6. Anonymous writes:
    "interference of active systems in highly populated traffic situations has to be considered. That applies to lidar as well as radar systems!"

    WHY IS THIS NOT A THING?

  7. That Velodyne video is an insult to the community here on this website...

  8. The Velodyne video is bad. Very bad. On the other hand, Musk's comment is good. Very good. Now the question in this community ought to be whether ToF can work at all. "Anonymous" above recognizes that "interference" between ToF systems is an issue. Velodyne's dream is that every autonomous vehicle will be beaming lidar into the air. How can their dream not become a nightmare? It seems that it will be impossible to disambiguate the return echoes from multiple proximate sources. Why is this not a thing?

    Replies
    1. What about radar? Doesn't it suffer from the same "impossible to disambiguate the return echoes from multiple proximate sources" and "interference" problems, not to mention sidelobe artifacts? Still, Tesla seems to endorse it.

    2. Thank you, Vladimir. Radar has advanced since 1938, when it began to emerge. Today microwave signals can be modulated and tagged; "which one is which" can be determined at the receiver. This is not true for lasers. They barely make it to coherence, much less to identifying subsignals or Angstrom-scale frequency modulation. At best they are amplitude modulated to provide better range accuracy, but that trick only makes it harder to tell two simultaneous pings apart. Compared to radar, lidar is in its infancy and transmits a coarse ping. All receivers pick up all others' pings. It looks like "fratricide" to me. If you have the inside scoop on how these animals disambiguate, please elucidate.
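
      To make the "tagged signal" idea concrete, here is a rough Python sketch (illustrative only, not any particular radar's implementation) of how a receiver that knows its own pseudo-random phase code can pick its own echo out of a mixture of pulses:

        # Illustrative sketch: each transmitter tags its pulse with its own
        # pseudo-random +/-1 phase code; the receiver correlates the received
        # mixture against its own code and finds only its own echo delay.
        import numpy as np

        def bpsk_code(n_chips, seed):
            """Pseudo-random +/-1 phase code unique to one transmitter."""
            return np.random.default_rng(seed).choice([-1.0, 1.0], size=n_chips)

        def place(code, delay, total_len):
            """Put a code into a longer listening window at a given sample delay."""
            sig = np.zeros(total_len)
            sig[delay:delay + len(code)] = code
            return sig

        n_chips, window = 127, 1000
        own_code = bpsk_code(n_chips, seed=1)
        other_code = bpsk_code(n_chips, seed=2)   # a nearby transmitter's code

        rng = np.random.default_rng(0)
        # Received mixture: our echo at delay 300, an interferer's echo at 450, plus noise.
        rx = (place(own_code, 300, window)
              + place(other_code, 450, window)
              + 0.5 * rng.standard_normal(window))

        # Matched filter against our own code: the peak marks our echo only (~300).
        corr = np.correlate(rx, own_code, mode="valid")
        print("estimated own-echo delay:", int(np.argmax(np.abs(corr))))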

      Delete
    3. Well, radars work well when seeking targets in the sky. There are not that many multipath errors there. In an urban environment, there is a lot of multipath, both discrete and continuous. All radio systems suffer from it, albeit the symptoms are different: wrong location in the case of GPS, signal fading in the case of mobile phones or WiFi, etc.

      Lidars have their own ways to deal with multipath, although nothing is perfect. Velodyne lidars register from 3 to 5 reflections, depending on the model. The software can decide which reflection is more important than the others, or leave it as a soft decision for a neural processor, for example.
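
      As a generic sketch of what "choosing among returns" might look like in software (illustrative only, not Velodyne's actual driver logic), a simple per-firing policy could be:

        # Generic multi-return handling sketch (not any vendor's real code).
        # One laser firing may yield several echoes; pick one by policy,
        # or hand all of them to a downstream (e.g. neural) stage.
        from dataclasses import dataclass
        from typing import List

        @dataclass
        class Echo:
            range_m: float     # measured distance of this echo
            intensity: float   # return signal strength

        def select_return(echoes: List[Echo], policy: str = "strongest") -> Echo:
            """Reduce the echoes of one firing to a single point."""
            if policy == "strongest":
                return max(echoes, key=lambda e: e.intensity)
            if policy == "last":    # solid target behind dust, rain or foliage
                return max(echoes, key=lambda e: e.range_m)
            if policy == "first":   # nearest obstacle, conservative choice
                return min(echoes, key=lambda e: e.range_m)
            raise ValueError(f"unknown policy: {policy}")

        # Example firing with three echoes: dust cloud, branch, wall.
        firing = [Echo(4.2, 0.10), Echo(11.7, 0.35), Echo(18.3, 0.90)]
        print(select_return(firing, "strongest"))   # the wall at 18.3 m
        print(select_return(firing, "first"))       # the dust cloud at 4.2 m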

    4. Aircraft carry forward-looking radar and must disambiguate one source from another. There are many ways to overcome possible interference at radio frequencies, because sources can be identified: there is room in the ping for codes. If aircraft were flying with lidar, the interference problem would be addressed by collision avoidance, as it is with Ethernet. Each laser would be activated when it was allowed to, and the rest of the aircraft would have to wait their turn.

      Velodyne and the other lidar competitors act as if the interference problem does not exist. It should be a hot topic. They ought to be setting standards in cross-platform communication to blank each other's lidars, so that proximate vehicles get access in an orderly sequence. I don't see this in the on-line literature. Why not? Musk's comments suggest that he has figured out that once every vehicle has lidar, there will be an interference problem that does not exist in the passive illumination case.

      I look to your blog, Vladimir, for guidance on ToF developments. I am expecting to read about how multiple ToF systems interact, but as far back as I have been able to study these entries, I see nothing. It is surprising.

    5. Don't be surprised. Many issues are not covered in the blog, and interference is just a very small part of the non-covered stuff. After all, it's a news blog rather than an encyclopedia.

      Nowadays, interference is a basic design concern that is considered from day 1 in modern lidar designs. While no solution is perfect, most of them give some degree of improvement.

      Same thing with radars: no solution is perfect. There is always a scenario where a radar or lidar is blinded by a nearby strong signal source. Every design has a limited dynamic range, no matter what you do.

      This is why many autonomous system designers require redundancy. If some of the sensors fail, this should at least be detected, so that the central CPU can decide what to do next.

    6. Billions of investment dollars are being thrown at lidar, so a fatal flaw should be telegraphed to the community. You point to the multi-shot acquisition as a likely solution, but the reason multi-shot was adopted has to do with increasing the resolution of a solitary system and not interference removal. The more shots, the more confusion. The problem also grows with increases in the number of lidar units at work in proximity. They would have to communicate with each other to allow clean readings. Much as Ethernet slows as more terminals attach, so the lidar data would slow with increased autonomous vehicle traffic. Obviously, the more traffic, the greater the need for 3D data acquisition to prevent autonomous vehicle collision. Success in deploying lidar with all vehicles would spell failure for the whole concept.

      Vladimir, this is a valuable blog. It has guided me for years, and I thank you for maintaining it. However, it can only be as good as its contributors, so I'm jumping into the fray. It is time to call out lidar's fatal flaw.

      As 3D becomes increasingly a focus, your readers owe it to themselves to examine lidar, not only in the autonomous vehicle application, but in general, wherever success in wide adoption will cause interference. Indeed, all structured light systems are doomed at a certain level of market penetration, because they get in each other's way. Success in deploying billions of devices will cause collisions that can only be avoided by networking every device and suppressing their emissions.

      Musk put his finger on a needed solution: robust passive illumination 3D. In that scenario, all receivers will share a common source of illumination. Typically this passive illumination is taken from ambient light.

      Light field cameras point to a way, but their lenses make them expensive and ultimately impractical. No wonder Ren Ng ended up in diffuser studies at UC Berkeley. Lytro optics are best in microscope applications where the primary objective is larger than the subject. That application is, in fact, where Lytro originated at Stanford.

      That said, light field works in ambient light, and this is what Musk is suggesting is needed. Image sensor optics are required for 3D acquisition. Do not take them for granted.

  9. @Zierot:
    There are already many millions of cars with lidar on our streets: Volvo City Safety.

    Last time I checked (six(?) years ago), the radar systems in cars did not identify their signal. They use other ways to get rid of interference.

  10. Thank you for your guidance on this issue. As far as I can make out, "Volvo City Safety" addresses radar, not lidar. There are many ways for radar to avoid interference. Radar does not concern me, nor Elon Musk for that matter.

    3D lidar produces motion images for an onboard fusion system to process. Invoking the number of vehicles ("many millions") does not help me to understand how interference is avoided in circumstances where multiple lidars simultaneously illuminate a shared scene. There are not millions of cars with lidar sharing the road simultaneously. That is the dream of the marketing departments of lidar companies, but their dream might become a nightmare if there is a fatal flaw in the technology. The Velodyne video is proof that truth is not required when selling their product.

    If you know how multiple simultaneously operating lidar systems do not interfere, please explain.

  11. As I understand it, City Safety uses radar when the car has a radar anyway (for ACC) and lidar in all other cases. As all Volvo cars built in the last 10 years have City Safety, there are a lot of them on the streets.

    There is a simple solution against interference (as Vladimir said): do multiple measurements. If the measurements start at random times, the wrong targets will appear at random distances, while the real targets will always appear at the same distance.
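
    As a toy sketch of that idea (illustrative only, with made-up numbers, not a real lidar pipeline): fire several shots at randomized start times, tally the measured ranges, and keep only the ranges that repeat across shots, since an interferer's pings land at random apparent distances.

      # Toy model of interference rejection by repeated, randomly timed shots.
      import random
      from collections import Counter

      def one_shot(true_range_m, interferer_prob=0.5):
          """Ranges detected in one shot: the real target plus, sometimes,
          a bogus range caused by another lidar's pulse arriving at a
          random time within our listening window."""
          detections = [round(true_range_m, 1)]
          if random.random() < interferer_prob:
              detections.append(round(random.uniform(1.0, 100.0), 1))
          return detections

      def robust_range(true_range_m, shots=8, min_votes=5):
          """Keep only ranges that show up in most of the shots."""
          votes = Counter()
          for _ in range(shots):
              for r in one_shot(true_range_m):
                  votes[r] += 1
          return [r for r, n in votes.items() if n >= min_votes]

      random.seed(0)
      print(robust_range(42.0))   # -> [42.0]; interferer ranges rarely repeat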

  12. Thank you for elucidating. If you are aware of any studies that confirm this method of multiple pings for alias removal, please provide the references. Surely it has been studied.

    As I responded to Vladimir, the multiple-shot process was developed to improve resolution, not to disambiguate competing sources. Every shot increases the likelihood of collision with other pings, so it compounds the problem that concerns me. Years ago I studied multiple-shot lidar, but all of that was about resolution enhancement. If all vehicles are increasing the rate at which they interrogate the scene - for whatever reason - then overlaps are more likely.

    Furthermore, if processing the received data requires making judgment calls as to which 3D scenes are noise and which are real, these calls become life and death for vehicles dependent upon lidar navigation. An obstacle may appear to be further away than it is if a competing ping illuminates it from a greater distance and the local lidar fails to illuminate it. The incidence of such a failure does not need to be frequent, yet it must be factored into the evaluation of lidar as a reliable means to acquire 3D in autonomous vehicle navigation applications. Errors can be unforgiving and cannot be allowed.

    Again, if you or Vladimir can point me to the literature that deals with multiple proximate lidar interference, I would be most grateful. Until I see both theory and testing, it remains (in my estimation) a fatal flaw that has been left undisclosed, perhaps by nefarious marketeers who would rather talk about "eye safe" lasers and other selling points.

    Replies
    1. Have you seen Facet Technology's proposals:

      https://www.pr.com/press-release/703312

      The company was not particularly successful business-wise, but their technology seems to work.

    2. I first saw this used against interference some 15 years ago. Sadly, I've never seen a paper about this, but I think the idea comes from radar; in fact, you can find something about it on Wikipedia:
      https://en.wikipedia.org/wiki/Radar_signal_characteristics#Staggered_PRF

    3. In aircraft applications the return signal is coming from significantly greater distances than with near-field lidar. There is more time to set up staggered rep rates both in reflection return and in antenna rotation speed. Resolution requirements are also lower, that is to say, distance readings are presumed to be coarse. Typically, there are fewer competitors in the illumination space, unless the competitor is a deliberate jammer. In that latter situation, jamming can win. Staggered PRF gives the aircraft a fighting chance but does not guarantee success.

      Clearly interference IS A THING. It applies to ToF and structured light triangulation. Success of these technologies in market penetration will bring out the problem, but we can discuss it now, because we're open to the assertion that there might be a fatal flaw in these approaches to 3D data cloud acquisition. Comments in this blog often come from skeptics.

  13. Thanks, Vladimir. The Facet patents cover a variety of lidar techniques. One does use multi-shot to increase rangefinder resolution by identifying characteristics of the acquired targets. None of the patents issued or pending are explicitly for disambiguation, although clearly the coding methods they developed could be used in the same manner as radar I.D. That said, they haven't published a solution. Their press release suggests that the problem is understood by the engineers building lidar systems and that solutions are being sought. This is not something that lends itself to secrecy, since all lidars must share the methods used to distinguish one from another.

    Press releases and promotional videos are not required to be rigorous. Of course, in-the-field solutions have no choice but to work robustly.

    BTW, just for amusement's sake, note that one of Facet's patents is for "Accessories for use during or after slaughtering for classifying or grading carcasses; for measuring back fat". After this, they got into the autonomous vehicle business. Slaughter, indeed...

