Working with sensor results

After some mucking around with the sensor data I finally figured out how to get at the actual values, so I wanted to post it here in case someone else has trouble with this.
The getting-started section of the GitHub repository only has a print line in the sensor handlers. I think it would be a good idea to at least add variables there and assign the data from each sensor to them, just to show people how to work with it.

But here is an example of how you can get the data.
Using the locator sensor stream with this handler:

async def locator_handler(locator_data):
    print('Locator data response: ', locator_data)

Prints out a line like this:

Locator data response: {'Locator': {'is_valid': True, 'X': 38400, 'Y': 44800}}

Took me a bit to figure out, but this means the data object is a dictionary nested in a dictionary. In this case there is just one entry on the first level, with the key 'Locator'. Its value is another dictionary with the actual values from the sensor.
So to get the actual sensor data you can do something like this:

loc_X = 0
loc_Y = 0
loc = (0, 0)

async def locator_handler(locator_data):
    global loc_X
    global loc_Y
    global loc
    print('Locator data response: ', locator_data)
    loc_dict = locator_data['Locator']
    if loc_dict['is_valid']:
        loc_X = loc_dict['X']
        loc_Y = loc_dict['Y']
        loc = (loc_X, loc_Y)

And with this you have the sensor values in global variables that you can use elsewhere in your code.
Keep in mind that each sensor has its own data, and the key of the first-level dictionary is named after the sensor used. So each one looks a bit different, but you can see what to do if you look at the line you get from the print().
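To make that concrete, here is a small sketch of a sensor-independent way to unpack any of those messages (plain Python; the example message just reuses the Locator line from above):

```python
def unpack_sensor_data(message):
    """Split a streaming message of the form {'<SensorName>': {...}}
    into the sensor name and its dictionary of values."""
    sensor_name, values = next(iter(message.items()))
    return sensor_name, values

# Example with the Locator message printed above:
name, values = unpack_sensor_data(
    {'Locator': {'is_valid': True, 'X': 38400, 'Y': 44800}})
```

The same helper works inside any handler, whatever sensor it is registered for, since the first-level key always names the sensor.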


The way I figure it, the binary value returned by the locator function is a number between 0 and 2^32 - 1, or 4,294,967,295. So the returned number needs to be divided by 4,294,967,295 to get the fraction of the output range, which runs from -16000 to +16000.

As the wheels turn the value increases or decreases along the Y axis in centimeters.
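If that guess is right, the conversion would look something like this (a sketch of the idea only; as discussed further down the thread, the exact input range is still up for debate):

```python
def scale_raw(raw, out_min=-16000.0, out_max=16000.0):
    """Map a raw 32-bit reading in [0, 2**32 - 1] onto [out_min, out_max]."""
    return raw / (2**32 - 1) * (out_max - out_min) + out_min

low = scale_raw(0)             # bottom of the range
high = scale_raw(2**32 - 1)    # top of the range
```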



That’s a great find. As of now it seems like it takes some detective work to get the sensor readings out, but this shows that they are definitely in there and accessible!

Thanks for sharing.


This is discussed in this thread, with pointers to two code bases that show how to scale. I also have it on my Wiki with a funky ASCII formula.

The scaling factor requested is supposed to change the size of the results. If you ask for format 0 you'll only get byte-wide data. Your divisor needs to adjust for the scale requested.

Also, per my testing the token can be any value, not just what is listed in the table.



The locator actually also returns negative values, so I guessed the minimum should be -2147483648 and the maximum 2147483647. But using the helper function to normalize from sphero-sdk-raspberrypi-python/sphero_sdk/common/ returns odd values.

Did a short test with this code:

    loc = (normalize(loc_X, -2147483648, 2147483647, -16000, 16000),
           normalize(loc_Y, -2147483648, 2147483647, -16000, 16000))

But even without moving, with loc_X and loc_Y being 0, it returns:
(3.725290298461914e-06, 3.725290298461914e-06)
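For what it's worth, that tiny value looks like it is just the asymmetric signed range rather than a bug: with in_min = -2147483648 and in_max = 2147483647, the exact midpoint of the input range is -0.5, so an input of 0 sits half a unit above the middle and scales to about 16000 / 4294967295 ≈ 3.7e-06. A sketch, assuming the SDK helper is a plain linear map (I have not checked its source):

```python
def normalize(value, in_min, in_max, out_min, out_max):
    """Plain linear interpolation from [in_min, in_max] to [out_min, out_max]."""
    return (value - in_min) / (in_max - in_min) * (out_max - out_min) + out_min

# Raw 0 is half an input unit above the range midpoint (-0.5), which scales
# to roughly 16000 / 4294967295, i.e. the ~3.7e-06 seen above:
offset = normalize(0, -2147483648, 2147483647, -16000, 16000)
```

So the odd value at rest is just rounding from the signed range, not a problem with the helper itself.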

Then I let the RVR drive forward for 1 second at speed 64 and got the location again.
First odd thing was that the X location also changed, to -512… well ok, it steered a little to the left, so there seems to be some issue with driving forward with heading 0 somewhere. But let's ignore that for now.

Y was 72704, so I pushed those values through the normalize function and got this:
Location: (-0.003810971975326538, 0.5416907371363777)
I put a ruler beside the RVR and it had driven 53 cm.

Looks like the min and max values are not right. It would be really nice to get some input from the Sphero coders at this point.


No, the values are unsigned long integers, so numbers greater than 2,147,483,648 are positive and numbers less than that are negative.

x = (2147483648 / 4294967296) * (16000 - (-16000)) + (-16000)
x = 0

x = (2147481984 / 4294967296) * (16000 - (-16000)) + (-16000)
x = -0.01239



In Python I get negative values. Does Python interpret them wrong?

Mmh, adding the values to your formula it would be:

loc_X = -512
x = ((2147483648 + -512) / 4294967296) * (16000 - (-16000)) + (-16000)
x = -0.003814697265625

loc_Y = 72704
y = ((2147483648 + 72704) / 4294967296) * (16000 - (-16000)) + (-16000)
y = 0.54168701171875

It moved by about 53 cm. Maybe I got the measurement a little wrong and there is a small error, but the 0.54 looks about right; it just has to be multiplied by 100 to get centimeters.
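As a quick check of that arithmetic (the helper name to_range is mine, just wrapping the offset-and-divide-by-2^32 formula above):

```python
def to_range(raw_signed):
    """Offset-binary interpretation: raw + 2**31 treated as unsigned,
    then scaled linearly onto [-16000, 16000]."""
    return (2**31 + raw_signed) / 2**32 * (16000 - (-16000)) + (-16000)

x = to_range(-512)      # small negative drift on X
y = to_range(72704)     # forward distance on Y
y_cm = y * 100          # treating the scaled value as metres gives ~54 cm,
                        # close to the 53 cm measured with the ruler
```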


I have my streaming code working and the results look good. The Locator confused me for a bit. The RVR is on a test stand, so it is not actually moving when given a drive command.

  1. Driving with equal values, Y increases but X doesn't.
  2. Driving with unequal values, Y increases but X doesn't.
  3. Changing the heading and then driving with equal values, same.
  4. Physically turning the RVR 90 degrees (roughly), Y increased some but X also increases.

It appears the Locator is calculated using the yaw from the IMU, not just by reporting the motor encoders.

I haven’t compared the reading to absolute distances, i.e. has it moved 1 meter when it reports 1 meter.

It appears you can't directly read the encoders, which would have been nice.



Testing the Locator data today with the RVR moving on the floor.

First, I set up my 'drive' commands to work on a percentage of full speed. It is easier to think in those terms, IMO, than what 65 means on a scale of 0 to 255. The test is run at 25% for a distance of 40 inches. I don't have my metric tape, so converting to inches is easier. Test code is:


    // set to receive locator every 30 ms, which is the fastest RVR allows

    float forty_in = 40 * in_to_m;
    double sp { 25 };

    while (l.y < forty_in) {
        drive(0, sp);    // have to repeat the drive command because the run takes longer than the 2 second timeout
        std::this_thread::sleep_for(30ms);    // wait for new locator reading
        l = sen_s.locator();
    }

    // reading right after the stop
    l = sen_s.locator();
    tout << code_loc << "locator: " << l.x / in_to_m << mys::sp << l.y / in_to_m;

    // final position reading
    l = sen_s.locator();
    tout << code_loc << "locator: " << l.x / in_to_m << mys::sp << l.y / in_to_m;

A typical result right after the stop is (0.96, 40.33) for (x, y), with (1.038, 40.72) for the final position.

More test results later.


Seems you're getting the same results as me. Driving forward does not really keep its heading. I thought that with encoders on the motors there would be some course correction in action to make sure the RVR drives straight, but it rather looks like we have to take care of that ourselves.
I already made some functions for this that correct the heading every few cycles, which means more overhead on the serial line, and since the heading is only in full degrees it is not overly exact either.
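In case it is useful to anyone doing the same kind of correction: the fiddly part is getting the error across the 0/359 wrap right. A small sketch (the function name is mine, not from the SDK):

```python
def heading_error(target, current):
    """Smallest signed difference from current to target heading, in degrees.
    Result is in (-180, 180]; positive means increase the heading."""
    diff = (target - current) % 360
    if diff > 180:
        diff -= 360
    return diff

# Crossing the wrap: from a heading of 350 back to 0 is a +10 degree turn,
# not a -350 degree one.
```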


I think I mentioned this elsewhere, but someone from Sphero said movement using the heading call is controlled by a PID to give straight movement. The raw movement is not.


Oh, I am using the drive with heading call, but it still does not drive straight.


I am trying to figure out the values I am getting, similar to what you have discussed.
For instance, my velocity values are very small; they never make it to 1.
I guess I am a little confused, since I saw SensorStreamingControl, which did show that the values were normalized.
Is that the right thought?

from sphero_sdk.common.sensors.sensor_streaming_control import SensorStreamingControl

await rvr.sensor_control.add_sensor_data_handler(
    'Velocity', velocity_handler
)


My testing until now, getting the serial API working, has been with the RVR on a test stand with the tracks not touching a surface, i.e. not actually moving. Recently I have been testing on a tile floor.

I just observed it at different speeds using both raw and heading commands. My RVR goes straight. Using heading it will move at 15% (38), but it won't using the raw command. IMO that is because the PID is actively controlling the speeds.

The issue I have is turning with the raw commands. My right motor doesn't start working until over 25% (64), and then it fluctuates. With higher inputs it works better. (I'm in contact with customer support about this.) I think when moving straight the left motor drags the right one enough to get it started.

I've been experimenting with yaw and locator measurements to see if I can create a PID controller that will maintain a constant difference in the motor speeds, e.g. 0 for straight and some other value for turning.

I tried using heading commands, but there is no control over the rotation speed, so it won't do what I want.
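For what it's worth, a minimal sketch of the kind of PID I have in mind, regulating the difference between left and right speeds (plain Python, no SDK calls; the speed readings and gains are made-up placeholders):

```python
class Pid:
    """Textbook PID on a scalar error, stepped at a fixed interval."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive straight: hold the left/right speed difference at 0.
left_speed, right_speed = 64, 60          # hypothetical readings
pid = Pid(kp=0.5, ki=0.1, kd=0.05)        # arbitrary gains
correction = pid.update(0.0 - (left_speed - right_speed), dt=0.03)
# 'correction' would then be split between the two raw motor commands.
```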


Control over rotation speed has been a problem for me too. I'm finding that the command you used previously can influence the rotation speed: using raw motors before a turn command results in a rapid turn, but calling the turn command after a roll stop seems to result in a smooth turn.


I want to use the color values detected by the sensors as events.
For example: turn left, turn right, stop and start, or remain in a state when a color is detected until the next color. This is achievable with the Sphero app, but I have not managed it with a Raspberry Pi and Python.
Can someone please help me with this?
Thank you
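Not a full answer, but here is one way the event side could be structured once you get a color reading into a handler, the same way as the other sensor streams in this thread. The color names and the dispatch table are made up for illustration, and I have not verified how the color sensor reports its data in the Python SDK:

```python
# Map detected colors to the state the RVR should switch to; any color not
# in the table leaves the current state unchanged ("remain in a state
# until the next color").
COLOR_ACTIONS = {
    'red': 'stop',
    'green': 'start',
    'blue': 'turn_left',
    'yellow': 'turn_right',
}

state = 'stopped'

def on_color_detected(color_name):
    """Update and return the current state based on a detected color."""
    global state
    state = COLOR_ACTIONS.get(color_name, state)
    return state
```

A real color handler registered with the SDK would then just call on_color_detected(...) with whatever name or classification the sensor message carries, and your drive loop would act on the current state.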


Can someone please help with this?
