Short to medium range LiDAR

Discussion in 'Electronics & Electrics' started by PheonixV2, Jan 24, 2019.

  1. PheonixV2

    PheonixV2 Member

    Joined:
    Nov 9, 2013
    Messages:
    15
    Location:
    Melbourne
    Hello there,
I'm currently designing a short to medium range LiDAR that I could attach to my drone and use to track a GPS location. One issue (and I'm not sure it matters in this use-case): I can't find a time-to-digital converter chip with a faster response time than 12 ns, which gives a minimum distance of approx. 3.6 m (the TDC7200). Above this minimum distance it has a resolution of 55 ps, which is damn good.

    3.6m is still a fairly large distance, and there could easily be objects in that range that the drone might need to avoid. My main question is: how do modern LiDAR systems overcome this minimum distance, and what sort of chips are used in devices such as 3D laser scanners, since they have a much shorter active range?
    Any replacement ICs people can recommend, or other methods that might yield better results?
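
    For reference, the dead-time-to-range arithmetic is just the round-trip conversion (a sketch, assuming c ≈ 3×10^8 m/s; note that 12 ns of light travel is 3.6 m of path, i.e. roughly 1.8 m out to the target and back):

    ```python
    # Sketch: converting a TDC's minimum measurable interval into a "blind" range.
    # Assumes c = 3e8 m/s; the 12 ns / 55 ps figures are the TDC7200 values above.

    C = 3.0e8  # speed of light, m/s

    def blind_range_m(t_min_s):
        """Closest measurable target: the echo must arrive after t_min_s,
        and the light covers the target distance twice (out and back)."""
        return C * t_min_s / 2.0

    def range_resolution_m(t_res_s):
        """Range step corresponding to one timing LSB (round trip)."""
        return C * t_res_s / 2.0

    # 12 ns dead time  -> 3.6 m of light travel, ~1.8 m out to the target
    # 55 ps resolution -> ~8 mm range steps
    ```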

    Cheers
     
    shredder likes this.
  2. _zak

    _zak Member

    Joined:
    Oct 12, 2009
    Messages:
    341
    Is there any reason you're looking at a DIY option rather than grabbing an off-the-shelf LiDAR module? OCAU's very own Bleckers makes a breakout for the VL53L1X (Tindie link), and if you're looking for something that's ready to go, Seeed have the DE-LIDAR and RPLidar (which will do 360° mapping).
     
    bleckers likes this.
  3. OP
    OP
    PheonixV2

    PheonixV2 Member

    Joined:
    Nov 9, 2013
    Messages:
    15
    Location:
    Melbourne
    I'm going DIY because I'd like the experience of building one from scratch. It gives me a better understanding of how it works, so I know what the current limitations are and what needs to be done to improve things.
    Thanks for the links, I'll check them out. The MappyDot seems like it would be useful for short range.
     
  4. Technics

    Technics Member

    Joined:
    Apr 29, 2002
    Messages:
    1,730
    Location:
    Brisbane, AU
    I might be missing something here, but why not start your TDC 12 ns before emitting your pulse?
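
    In pseudocode terms (a sketch with illustrative names; the pre-trigger is just a constant you subtract back out, assuming c ≈ 3×10^8 m/s):

    ```python
    # Sketch of the pre-trigger idea: start the TDC a fixed delay before firing
    # the laser, then subtract that delay from every measured interval.
    # Names and values are illustrative, not a real driver API.

    C = 3.0e8              # speed of light, m/s
    PRE_TRIGGER_S = 12e-9  # start the TDC this long before the laser pulse

    def raw_time_to_distance_m(t_meas_s):
        """t_meas_s is TDC start -> echo detected. The laser actually fired
        PRE_TRIGGER_S after the TDC started, so remove that first."""
        time_of_flight = t_meas_s - PRE_TRIGGER_S
        return C * time_of_flight / 2.0

    # A target at 1 m has a true round trip of ~6.7 ns; with the pre-trigger
    # the TDC sees ~18.7 ns, comfortably above its 12 ns minimum.
    ```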
     
  5. OP
    OP
    PheonixV2

    PheonixV2 Member

    Joined:
    Nov 9, 2013
    Messages:
    15
    Location:
    Melbourne
    That would be a very elegant solution. Would you know how I could implement something that creates that delay?
    I don't think I would be able to keep track of that accurately on the board I'm planning to use (a Raspberry Pi Zero).
     
  6. OP
    OP
    PheonixV2

    PheonixV2 Member

    Joined:
    Nov 9, 2013
    Messages:
    15
    Location:
    Melbourne
    Found the DS1100LZ-60, which has a 12 ns delay on Tap 1; that should do the trick.
    Will have to set up some standard distances to test everything.
     
    Technics likes this.
  7. Technics

    Technics Member

    Joined:
    Apr 29, 2002
    Messages:
    1,730
    Location:
    Brisbane, AU
    It could also be longer than 12 ns if you are willing to sacrifice maximum range for simplicity. It just needs to be consistent so you can remove the offset.
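
    As a sketch of that offset removal (illustrative names, assuming c ≈ 3×10^8 m/s): measure a target at a known distance once, and the difference between the measured interval and the true time of flight is your fixed offset.

    ```python
    # Sketch: since the extra delay just needs to be consistent, it can be
    # calibrated out with a single known-distance measurement. Illustrative only.

    C = 3.0e8  # speed of light, m/s

    def calibrate_offset_s(t_meas_s, known_distance_m):
        """Measured interval minus the true round-trip time = fixed offset."""
        true_tof = 2.0 * known_distance_m / C
        return t_meas_s - true_tof

    def distance_m(t_meas_s, offset_s):
        """Apply the calibrated offset to every subsequent measurement."""
        return C * (t_meas_s - offset_s) / 2.0
    ```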
     
  8. _zak

    _zak Member

    Joined:
    Oct 12, 2009
    Messages:
    341
    You may have seen this already, but Hackaday just posted an open-source LiDAR project. There may be some useful details in there?
     
    shredder likes this.
  9. Technics

    Technics Member

    Joined:
    Apr 29, 2002
    Messages:
    1,730
    Location:
    Brisbane, AU
    If you look at that design, the gate driver used has a typical propagation delay of 13 ns, so the laser pulse will typically be delayed by at least that (plus however long the MOSFET itself takes to switch on). The range given in the gate driver datasheet is quite wide, so the total best- and worst-case switching times for driver + MOSFET need to be considered. I suspect you would not need to delay the laser pulse at all unless you have some other fantastically quick way of turning it on. You will, however, need to consider how the unknown and possibly variable delay gets calibrated out. This probably needs to be done per device (unless batches exhibit good consistency), and perhaps with temperature compensation if you need the best accuracy.
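
    A sketch of what that calibration might look like (purely illustrative: a linear offset model offset(T) = a + b·T, fitted from two known-distance runs at different temperatures; real drivers may need a fancier model):

    ```python
    # Sketch: per-device calibration with a simple linear temperature model for
    # the driver/MOSFET delay, offset(T) = a + b*T. Fit it from two calibration
    # runs against a known-distance target. All names here are illustrative.

    C = 3.0e8  # speed of light, m/s

    def fit_offset_model(cal_points):
        """cal_points: [(temp_C, measured_time_s, known_distance_m), ...]
        Returns (a, b) for offset(T) = a + b*T, fitted from the first two."""
        (t1, m1, d1), (t2, m2, d2) = cal_points[:2]
        o1 = m1 - 2.0 * d1 / C   # observed offset at temperature t1
        o2 = m2 - 2.0 * d2 / C   # observed offset at temperature t2
        b = (o2 - o1) / (t2 - t1)
        a = o1 - b * t1
        return a, b

    def distance_m(t_meas_s, temp_C, a, b):
        """Remove the temperature-dependent offset, then convert to range."""
        offset = a + b * temp_C
        return C * (t_meas_s - offset) / 2.0
    ```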
     
