Anyone have a simple Python example using the RVR SDK, ToF sensors, camera movement, and GPS in one Python app? I've tried researching and experimenting without success so far.
It appears the pipenv shell isn't set up to incorporate the ToF and GPS libraries. Simply blending the various Python apps doesn't work.
Quick question: which GPS are you using? There have been a few people on here wondering how to get real-time GPS location tracking for their RVR.
The GPS (Titan X1) module included with the Advanced Autonomous Kit.
I am using ToF lidar distance (I2C), radar (human presence), and ultrasonic distance sensors in one app, with the GPIO Zero and Adafruit libraries. I added the necessary libraries to the pipenv.
Which libraries? I tried searching GitHub, but there seem to be many Adafruit libraries. What is the GPIO Zero library for? Which of these are for ToF through the Qwiic mux board? I added sparkfun-qwiic but then got a "module not found" error when initializing the ToF sensor. How did you find the "necessary" libraries?
Is it the import statement that identifies the libraries? Also, pipenv depends on the base directory used, I suppose.
Does it also use the Sphero SDK to move the RVR? I'm using the Advanced Autonomous RVR Kit and trying to merge the sample libraries provided in the different directories. Is that your situation?
I am using a ToF sensor (Adafruit VL6180X) and a Doppler sensor (RCWL-0516) via Adafruit libraries installed into the pipenv with pip3 install. The GPIO Zero library provides easy access to and programming of sensors; I use it to read the ultrasonic sensor. Refer to https://gpiozero.readthedocs.io (https://gpiozero.readthedocs.io/en/stable/migrating_from_rpigpio.html). I control the robot via the Sphero SDK in Python, given feedback from the sensors.
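To make that pattern concrete, here is a minimal sketch of one possible structure, not the poster's actual code: a pure helper that maps a ToF range to a drive speed, plus a loop that wires in the hardware libraries named above (Adafruit VL6180X, GPIO Zero, Sphero SDK). The pin numbers, speed scaling, and loop timing are all assumptions. The hardware imports are deferred inside `drive_loop()` so the file can be read and the helper tried out off-robot.

```python
def speed_for_range(range_mm, stop_mm=100, max_speed=64):
    """Scale drive speed with the ToF range; stop inside stop_mm."""
    if range_mm <= stop_mm:
        return 0
    # Ramp linearly back up to max_speed by 2 * stop_mm.
    return min(max_speed, int(max_speed * (range_mm - stop_mm) / stop_mm))


def drive_loop():
    """Run this on the RVR's Raspberry Pi; imports are deferred so the
    helper above can be tested on any machine."""
    import time
    import board
    import busio
    import adafruit_vl6180x                   # Adafruit ToF driver
    from gpiozero import DistanceSensor       # ultrasonic via GPIO Zero
    from sphero_sdk import SpheroRvrObserver  # Sphero Python SDK

    i2c = busio.I2C(board.SCL, board.SDA)
    tof = adafruit_vl6180x.VL6180X(i2c)
    sonar = DistanceSensor(echo=17, trigger=4)  # pins are an assumption
    rvr = SpheroRvrObserver()
    rvr.wake()
    time.sleep(2)
    try:
        while True:
            # React to whichever sensor sees the nearer obstacle.
            mm = min(tof.range, sonar.distance * 1000)
            rvr.drive_with_heading(speed=speed_for_range(mm),
                                   heading=0, flags=0)
            time.sleep(0.1)
    finally:
        rvr.close()
```

The point of splitting out `speed_for_range()` is that the decision logic can be exercised without the robot; only `drive_loop()` needs the sensors attached.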
I think I’m over my head but will check this out. Thanks.
Yup, after reading for a while, I don't have a clue. I guess I'm just looking for Python code that will work with all the bits included with the Advanced Autonomous RVR Kit, plus whatever pre-steps such as pip installs are necessary. All I've got is the examples included with the kit. Reading specs and data sheets is about as useful as an automatic transmission manual when I just want to put it in Drive.
I’ve coded for the iPod and iPad but not by reading the Apple specs.
If you can help, thanks. Otherwise I don’t want to take any more of your time. I appreciate what you’ve offered so far.
I would recommend picking up a Python book such as Learning Python by Mark Lutz from O'Reilly. I am mainly a .NET developer and learned Python by using this book as well as playing with examples from the SDK. Following the Sphero SDK guide to setting up the Python environment is very important.
Thanks. Ordered the Lutz book.
I think the issue is that Sphero recommends you run their stuff in the pipenv, but the SparkFun libraries aren't installed there. So when you're in the pipenv, the Sphero examples like the keyboard drive all work fine, but the SparkFun examples don't, and vice versa.
My solution was to add all the dependencies to the system Python install and not use pipenv at all. I'm OK with managing any version changes myself.
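For anyone trying the system-install route, it might look something like this. The package names below are assumptions based on common PyPI naming, not a verified list; check each vendor's install docs, and note the Sphero SDK itself is normally set up from its GitHub repo rather than from PyPI.

```shell
# Install sensor libraries into the system Python (no pipenv).
# Package names are assumptions; verify against the vendors' docs.
sudo pip3 install gpiozero adafruit-circuitpython-vl6180x sparkfun-qwiic
```

With those in the system Python, the SparkFun and Adafruit examples and the Sphero examples can at least share one interpreter, at the cost of managing versions yourself.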
I’m basically trying to do the same thing as you. For example, I want to add another set of arrows and keybindings to the web interface for the camera, to let you drive the robot and move the camera from one web UI.
Yes, I agree. So far I haven't found the specific dependencies and steps either to install everything in Python without pipenv or to install the SparkFun dependencies into the pipenv.
It may not be best practice, but you can place the library for the sensor in the same folder as your project code; Python will be able to locate it via import. I had to do this with an Adafruit ToF sensor, because for some reason the pip3 install and Git install weren't updating dependencies properly. I would still do a pip3 install of the library for its dependencies, but also copy the sensor library into your code folder.
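The reason this works is that Python puts the running script's own directory at the front of sys.path, so a driver file copied beside your code wins the import lookup. A self-contained demonstration, using a throwaway fake_sensor module standing in for a real driver file:

```python
import os
import sys
import tempfile

# Simulate "copy the sensor library into the project folder": write a
# tiny stand-in module into a directory, put that directory on the
# import path (for a real script, its own folder is already first on
# sys.path), and import it like any installed package.
project_dir = tempfile.mkdtemp()
with open(os.path.join(project_dir, "fake_sensor.py"), "w") as f:
    f.write("def read_range():\n    return 42\n")

sys.path.insert(0, project_dir)
import fake_sensor

print(fake_sensor.read_range())  # the local copy is found and used
```

The same lookup order is why a stale copy in the project folder can shadow a newer pip-installed version, so it's worth deleting the local copy once the pip install behaves.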