How to compensate for asymmetric sensor array response?
I have an array of three VL53L0X sensors, mounted at -20, 0, and +20 degrees with respect to each other. The idea is to use the difference between the distances measured by the three sensors to determine the orientation of my autonomous wall-following robot with respect to a nearby wall. I compute a 'steering value' by dividing the difference between the two outside sensor measurements by the center sensor measurement, i.e. (R-L)/C. This value should be zero when the array is oriented parallel to the target wall.
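For reference, the computation is simple; a minimal sketch (the function and argument names are just mine for illustration, with readings assumed to be in mm):

```python
# Sketch of the steering-value computation described above.
# left_mm, center_mm, right_mm are the range readings from the
# -20, 0, and +20 degree VL53L0X sensors respectively.

def steering_value(left_mm: float, center_mm: float, right_mm: float) -> float:
    """Return (R - L) / C; should be zero when the array is parallel to the wall."""
    if center_mm <= 0:
        raise ValueError("center reading must be positive")
    return (right_mm - left_mm) / center_mm
```

For example, parallel to a wall 200 mm away, the two 20-degree sensors should both read about 200/cos(20°) ≈ 213 mm, giving a steering value of zero.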
I have performed a number of automated and manual rotary scans with this array offset about 20 cm from a target 'wall', and I have noticed a consistent asymmetry in the array measurements, as shown in the attached Excel plot. The left sensor data is unusable beyond about -30 deg relative to the parallel orientation, but the right sensor provides good data out to almost +60 deg. In addition, the orientation at which the steering value passes through zero is skewed to the right by about 6-12 degrees from where the array is physically parallel to the target wall. For all the gory details, including a number of scan plots and videos of scan operations, please see my 'Paynters Palace' blog post.
The sensors do not have glass coverings, but I have visually verified they are free of any obstructions.
So, how should I address these issues? I suppose I could compensate for the steering-value offset in software, but I'm not sure how well that would work, and it still wouldn't address the asymmetric performance issue. I have not calibrated the sensors, because quite frankly I get lost every time I try to work through the API documentation.
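The software compensation I had in mind would be something like the following sketch, where the offset constant would come from a calibration scan (the 0.15 value below is made up purely for illustration, not a measured number):

```python
# Hypothetical software compensation for the zero-crossing skew:
# subtract the steering value observed while the array is known
# (by independent means) to be physically parallel to the wall.

STEERING_OFFSET = 0.15  # placeholder; would be measured during calibration

def corrected_steering(left_mm: float, center_mm: float, right_mm: float) -> float:
    raw = (right_mm - left_mm) / center_mm
    return raw - STEERING_OFFSET
```

This would re-center the zero crossing, but as noted above it wouldn't do anything about the left sensor's reduced usable angular range.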
Any thoughts or suggestions would be appreciated.
Regards,
Frank
