RVR Firmware Update Preview

Hello RVR users!

We’ve been talking about improvements to RVR’s control system for quite a while, and we appreciate your patience while the team here has been hard at work bringing those to life. Schedule details may still shift, but at the moment it looks like we’ll be able to release new RVR firmware in early summer. Due to resource limitations, our public SDKs may not support any new commands until late summer or early fall, but the features will be present in firmware and documented for those who don’t mind low-level programming. Here’s an update on the firmware changes you can expect:

Control System

Previously, one driving control system was available to users through roll commands, requiring a target heading and a target speed, with an optional reverse flag.

This update replaces the “roll drive” controller with a higher-performing, more configurable implementation and supplements it with 3 selectable alternate control systems. All control systems present a normalized interface, similar to current roll commands, and an SI units interface where linear velocities are specified in m/s and yaw angular velocity (if applicable) is specified in degrees/s.

The list below does not include everything targeted for the release, but represents the features that are complete at this time.

Improvements: Drive with Heading

  • Vector drive state machine
    • State 1: If currently stopped, spin in place to face the target heading
    • State 2: Drive along the target heading
  • User-adjustable yaw and linear velocity slew rates.
  • Default yaw slew behavior depends on the magnitude of the commanded linear velocity (to make deliberate, slow driving easier). See video for comparison to released firmware.
  • Yaw targets are typically hit within +/- 1 degree of the IMU reading; the previous tolerance was +/- 3 degrees. See video for comparison to released firmware.
  • Steady-state yaw error during driving tracks to zero across the full linear velocity range (a bug had previously reduced yaw control accuracy at high linear velocities).
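As a rough illustration of how slew-rate limiting works (the rates and timestep here are hypothetical, not the firmware's actual values), a heading target can be approached in rate-capped steps:

```python
def slew_toward(current_deg: float, target_deg: float,
                max_rate_deg_s: float, dt_s: float) -> float:
    """Step current_deg toward target_deg, moving at most
    max_rate_deg_s * dt_s per call, taking the shorter way around."""
    # wrap the error into [-180, 180) so we always turn the short direction
    error = (target_deg - current_deg + 180.0) % 360.0 - 180.0
    step = max(-max_rate_deg_s * dt_s, min(max_rate_deg_s * dt_s, error))
    return (current_deg + step) % 360.0

# e.g. slewing from 0 toward 90 degrees at 180 deg/s with a 10 ms loop:
heading = 0.0
for _ in range(10):
    heading = slew_toward(heading, 90.0, 180.0, 0.01)
# 10 steps of 1.8 degrees each -> heading is now 18.0
```

Lowering the rate for slow linear velocities is what makes deliberate driving feel less twitchy.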

New: Drive to X-Y Position

  • Provide a target position and orientation as (X,Y,heading), along with a maximum linear velocity, and RVR will rotate to face the target position, drive to the target position, and then turn to the specified orientation, sending an API async when done. See video for demonstration.
  • Defaults to driving forward to the target position, but supports options for reverse driving, or automatic selection of forward or reverse to minimize the required initial turn.
  • Supports relative or absolute coordinates
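The forward/reverse auto-selection can be sketched as picking whichever direction needs the smaller initial spin. The coordinate and heading conventions below are assumptions for illustration, not the documented API:

```python
import math

def initial_turn_deg(heading_deg, x, y, tx, ty, reverse):
    """Size of the spin-in-place turn needed before driving toward (tx, ty)."""
    bearing = math.degrees(math.atan2(ty - y, tx - x))
    if reverse:
        bearing += 180.0  # driving backwards means facing away from the target
    return abs((bearing - heading_deg + 180.0) % 360.0 - 180.0)

def pick_direction(heading_deg, x, y, tx, ty):
    """Choose forward or reverse driving to minimize the initial turn."""
    fwd = initial_turn_deg(heading_deg, x, y, tx, ty, reverse=False)
    rev = initial_turn_deg(heading_deg, x, y, tx, ty, reverse=True)
    return "reverse" if rev < fwd else "forward"
```

A robot facing away from its target would back up rather than spin half a circle first.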

New: RC Drive

  • Provide a linear velocity, and a yaw angular velocity, and RVR will follow that command until it times out (default 2 seconds) or a new command is received.
  • Supports multiple adjustable options for linear acceleration rates, so your project can keep delicate payloads safe, or put the pedal to the metal.
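An adjustable linear acceleration limit boils down to ramping the commanded velocity toward the target at a capped rate. A minimal sketch (the rates are hypothetical examples):

```python
import math

def ramp_velocity(current_mps, target_mps, max_accel_mps2, dt_s):
    """Move the commanded velocity toward target, capped at max_accel_mps2."""
    max_step = max_accel_mps2 * dt_s
    delta = target_mps - current_mps
    if abs(delta) <= max_step:
        return target_mps
    return current_mps + math.copysign(max_step, delta)

# a gentle 0.5 m/s^2 ramp reaches 1 m/s in 2 s; 4 m/s^2 gets there in 0.25 s
```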

New: Tank Drive

  • Provide left and right tread linear velocity targets in normalized or SI units form, and RVR will track these targets. If your goal is to build an externally hosted control system, this is a much more useful interface layer to use than raw motor commands, as the onboard velocity controllers update at 1kHz and the maximum streaming data update rate to provide feedback to an external control system is 100 Hz.
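For an externally hosted controller, the usual differential-drive conversion from a body velocity command to per-tread targets looks roughly like this (the track width value and sign convention are assumptions, not RVR's real constants):

```python
import math

def tank_targets(linear_mps, yaw_deg_s, track_width_m):
    """Convert a body linear velocity and yaw rate into left/right
    tread velocity targets (standard differential-drive kinematics)."""
    yaw_rad_s = math.radians(yaw_deg_s)
    half = 0.5 * track_width_m * yaw_rad_s
    # positive yaw taken as counterclockwise here: right tread runs faster
    return linear_mps - half, linear_mps + half
```

Sending velocity targets like these and letting the onboard 1 kHz loops do the tracking avoids fighting the 100 Hz feedback ceiling with raw motor commands.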

Encoders

  • Encoder position resolution has increased 4x due to counting all quadrature edges as ticks rather than full quadrature cycles.
  • Velocity measurement has improved with changes to the encoder driver.
  • Tick counts for the left and right encoders are available as 32 bit signed integer values.
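Since the counts are 32-bit signed values, external code tracking long distances should difference them with wraparound in mind. A sketch (the ticks-per-revolution and wheel circumference below are placeholders, not RVR's real constants):

```python
def tick_delta(prev: int, curr: int) -> int:
    """Signed difference between two 32-bit tick counts, wraparound-safe."""
    return (curr - prev + 2**31) % 2**32 - 2**31

def ticks_to_meters(delta_ticks, ticks_per_rev, wheel_circumference_m):
    """Convert a tick delta to linear travel; ticks_per_rev should reflect
    the new 4x quadrature decoding (every edge counted)."""
    return delta_ticks / ticks_per_rev * wheel_circumference_m
```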

Locator

  • RVR’s locator precision is improved. There was a bug that caused loss of precision at low tread velocities, which has been fixed. Precision is now independent of tread velocity.

Other Bug Fixes

  • Idle to “soft sleep” (standby) transition now occurs after 5 minutes, as designed, to reduce power consumption.
  • Resolved “lurching” bug in the roll drive controller.

If you have any questions, feedback, or concerns, please let us know in this thread. We are very excited to share these details, and look forward to the actual release to you, our RVR users.

-JimK
Senior Firmware Engineer, Sphero

4/3/20: changed “roll drive” to “drive with heading” for consistency with SDK naming.


I am not sure I understand the reasoning for not releasing an update. Not enough engineers? This is a good reason to take advantage of the open source community… especially right now, when there are some who might want something to take their mind off things… No need to wait until the end of summer and continue to lose interest from those who believe in this product…


Hi @wegunterjr,

I left some background out of my post. If we had a mechanism for distributing optional firmware updates to users, we would have already released at least one, maybe multiple firmware updates this year. Without that mechanism, we limit the frequency of our app and firmware updates because there are logistical considerations for teachers around updating a classroom full of robots. In the interest of limiting updates and maximizing continuity for our education users, any update has to be thoroughly tested across all supported mobile and desktop platforms to avoid shipping a bug that interferes with classroom activities.

It’s been great to see the variety of open source projects showing up in these forums, either building on top of the public SDKs we’ve developed, or working towards complete alternate SDKs for platforms we never anticipated, and wouldn’t have the resources to adequately support. That was always part of the vision for RVR. Our apps and our firmware, however, have to remain closed-source to protect sensitive IP. Another piece of the timeline puzzle is that we need our docs for new features to adequately support open-source SDK development.

We will ship this update at the earliest possible opportunity, given the constraints of finite resources supporting multiple product lines in the field. Although we’d have preferred to release it sooner, we hope that users will find it was worth the wait. We thank you in advance for understanding, and are hoping you will still build wonders with the robot in the meantime!

-JimK


If I understand correctly, this will be changing the control method and giving us the option to drive it more like a real RC car, so we don’t have to use the same controls as the Spheros do?

Driving using the EDU app won’t be changing for now, since there were other priorities for the spring EDU releases and the timing didn’t line up. We will eventually have a way to drive more like an RC car in an app, but in the meantime the firmware will support these new modes, and DIY solutions can take advantage of them.

As an example: in my test setup, I have a dual joystick USB game pad connected to our internal test software sending the commands over BLE. I put linear velocity on the left stick Y axis, and angular velocity on the right stick X axis, with the angular velocity polarity flipped for driving backwards to mimic a car with a steering linkage. This setup feels great to me for manual driving, as it works similarly to a quadcopter in angle mode with a Mode 2 transmitter. Side note: If you are used to another RC vehicle, try to map your inputs to be as similar as possible to what you already know. Otherwise you’ll have a lot of brain retraining to do.
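That mapping can be sketched as below (the axis conventions and scaling are my test-rig choices, not anything the SDK prescribes):

```python
def sticks_to_drive(left_y, right_x, max_linear_mps, max_yaw_deg_s):
    """Map two gamepad axes (each in -1..1) to an RC drive command,
    flipping yaw polarity in reverse to mimic a steered car."""
    linear = left_y * max_linear_mps
    yaw = right_x * max_yaw_deg_s
    if linear < 0:
        yaw = -yaw  # steering feels car-like when backing up
    return linear, yaw
```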

I have something similar set up for tank commands with the left and right velocities assigned to the left and right joysticks, but that’s considerably more difficult to drive manually unless you cap the velocity commands at a low value. I’m guessing it’ll be used more for programmatic driving, taking the place of the raw motor commands in many projects.

This project could easily be adapted to the new commands, and will likely be updated after we ship the firmware: https://github.com/sphero-inc/sphero-sdk-raspberrypi-python/tree/master/projects/rc_rvr

-Jim


Is there a (versioning?) reason why a manual firmware transfer, a beta program (like with Testflight under iOS) or an additional non-edu app would not be possible?

I just unboxed my new RVR, expecting all the controlled driving from Lightning, some Bolt orientation features, and additional precision and measurements.

I ran some tests (repeated squares at slow speeds) and it was quite OK, but not what I had hoped for. I couldn’t find the magnetometer sensor block or absolute positioning without polling, and was a bit disappointed.

Now, it seems you really fixed everything in the new firmware, added the precision I was looking for, enabled the compass and implemented the absolute positioning. Everything sounds great, except for the date :expressionless:

So, while I don’t have much hope to see it in time for this update (which is really sad given the essential features you mentioned), I wonder if you could find a way to distribute earlier and more often to the early adopters (did I mention my Gen 1 Sphero?) that are also a part of the community.

I totally understand the edu quality needs and I am glad that this gives you a stable market. Still, I’d really like to play as comfortably as possible with the device, as this is why I bought it in the first place.

Thanks for the heads-up. Happy to see so many genes from my other bots in RVR.



Hi @Stev!

It all comes down to developer bandwidth. Distributing the firmware through some other means would have to be implemented by the same engineers who are currently busy with the spring EDU releases. Optional firmware updates are on our roadmap so that we can have more flexibility in the future, but the app and infrastructure changes required just haven’t made it to the top of the queue yet.

We on the firmware team are really excited to get this out and onto users’ robots as soon as we can. Thanks for your patience!

-JimK


Hey @JimK,

Ok, waving pom poms instead then. Go, go, Sphe-roww…

Great to hear that’s a planned option for the future, though.
When I look at the XY positioning video, I am still impatient since it’s exactly what I hoped RVR would do right from the start and I can’t wait for it to be released :slight_smile:

Some detail questions that came up over the Easter weekend concerning the current and future FW:

  • Will the smoothness for slow driving be improved by the updated hall and velocity precision? Right now it’s close to the Lightning McQueen behaviour at slow speeds (not as silent, though, turning it a bit into a siren signal). (BTW, I’d really love to see that Lightning McQueen in the list of Edu app bots, though I know that it lacks the turtle behaviour. Maybe a turn-on-the-spot car implementation is a good intern project.)

  • Can you say something about the current and upcoming speeds that still allow for full/high hall sensor/distance precision (or typical precision limits per speed setting)? Currently it’s not clear whether the processing is always fast enough and how to choose a good speed setting (eg for color detection, distance measurement).

  • For the color sensor, will there be a precision setting? Or is there one that I overlooked? On slightly color-textured ground it’s sometimes hard to hit the right color and not miss the trigger while driving (in contrast to static picking of the color), despite big differences in the overall color or brightness of the different markers/ground which would be easy to differentiate. I can poll, calculate threshold values manually and check them in a loop, but events are a great paradigm (and I’d love to see more for all kinds of sensor values, BTW), especially in the block programming model.

  • Will there be/is there active braking support for manual/coded control when the distance/duration is not pre-determined by the command/block?

Cheers
Stev


Thanks! :slight_smile:

The encoder changes increased the position resolution by 4x, but there are tradeoffs with using that full resolution for velocity feedback, as it tends to produce noisier measurements. There are also tradeoffs around measuring very low velocities (with long no-feedback periods between encoder edges), vs detecting a stopped motor quickly so that the control system can respond to the stop.
We’ve tried to pick the best trade-off for the hardware.

It’s been a while since I’ve used the field firmware and done a direct comparison - I think the smoothness of the tread velocity control loops is improved somewhat at low speeds, but we’re still limited by the resolution of the encoders, as it dictates the update rate of velocity measurements. The main improvements to the velocity loops are reduced settling time and steady state error. Turning smoothness is significantly improved, however, and we’re giving users a lot more control over the turning characteristics of the robot.

Yeah, it would have been nice to add it, but I don’t see it ever happening, since we’re no longer partnered with Disney and selling the product.

The current distance tracking variation isn’t a processing speed issue, it’s the result of integrating a filtered estimated velocity. The velocity estimate is less accurate at low speeds, resulting in accumulated position error. The newer firmware doesn’t use velocity in any of the position calculations, so that source of error is eliminated. However, faster driving and particularly fast turning will generally cause more bouncing and slippage of the treads on the driving surface and therefore reduce final positioning accuracy. As you might imagine, the magnitude of this effect is surface-dependent.
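In other words, position can be advanced directly from per-step tread displacements instead of an integrated velocity estimate. A simplified sketch of the idea (real odometry would also update heading from the tread difference and fuse the IMU, which is omitted here):

```python
import math

def dead_reckon(x, y, heading_deg, left_m, right_m):
    """Advance (x, y) from raw per-tread displacements for this step,
    with no velocity estimate anywhere in the loop."""
    d = 0.5 * (left_m + right_m)  # body displacement this step
    h = math.radians(heading_deg)
    return x + d * math.cos(h), y + d * math.sin(h)
```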

When you say a precision setting, do you mean an adjustable range for on-board classification of sensor values as a particular color? I don’t believe there’s anything like this for the EDU app, but I’d have to look to see if there’s something already implemented that could be used for this with the SDK. I haven’t touched color sensing since Specdrums so I’m a bit out of the loop on that.

Yes! There are 2 new stop commands - you can stop with the default deceleration rate, or specify your own rate in m/s^2. The specified rate is applied to the faster tread at the time the command is received, and the slower tread gets a proportionally lower deceleration rate so they stop at the same time. I should edit the post to include this feature, as it was introduced after I wrote up the initial list.
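The proportional scaling described above works out to something like this (a sketch of the idea, not the firmware implementation):

```python
def stop_decels(left_mps, right_mps, decel_mps2):
    """Apply decel_mps2 to the faster tread and scale the slower tread's
    rate proportionally, so both treads reach zero at the same time."""
    fastest = max(abs(left_mps), abs(right_mps))
    if fastest == 0.0:
        return 0.0, 0.0  # already stopped
    return (decel_mps2 * abs(left_mps) / fastest,
            decel_mps2 * abs(right_mps) / fastest)

# left at 1.0 m/s, right at 0.5 m/s, 2 m/s^2 requested:
# left decelerates at 2.0, right at 1.0 -- both stop after 0.5 s
```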

Thanks for the great questions!

-JimK


> The encoder changes increased the position resolution by 4x, but there are tradeoffs with using that full resolution for velocity feedback, as it tends to produce noisier measurements. There are also tradeoffs around measuring very low velocities (with long no-feedback periods between encoder edges), vs detecting a stopped motor quickly so that the control system can respond to the stop. We’ve tried to pick the best trade-off for the hardware.

> It’s been a while since I’ve used the field firmware and done a direct comparison - I think the smoothness of the tread velocity control loops is improved somewhat at low speeds, but we’re still limited by the resolution of the encoders, as it dictates the update rate of velocity measurements. The main improvements to the velocity loops are reduced settling time and steady state error. Turning smoothness is significantly improved, however, and we’re giving users a lot more control over the turning characteristics of the robot.

That’s great news. Sometimes it’s tricky to decide between remote-control needs and robotics needs; sometimes it converges quite nicely.

I had the bot do an emergency stop during slow speed turning on a not-that-fluffy carpet and go full speed on another occasion when picking it up, so I guess those are the extremes of the motor control reaction and I will try to work around them.

> Yeah, it would have been nice to add it, but I don’t see it ever happening, since we’re no longer partnered with Disney and selling the product.

But, but, the Star Wars bots, and it reads “Sphero” not “Disney” prominently on the box… Ok. I won’t bother any further. Just in case, should somebody get bored during some holidays and have one of the red cars lying around, the firmware sources pop up and… the usual things.

> The current distance tracking variation isn’t a processing speed issue, it’s the result of integrating a filtered estimated velocity. The velocity estimate is less accurate at low speeds, resulting in accumulated position error. The newer firmware doesn’t use velocity in any of the position calculations, so that source of error is eliminated. However, faster driving and particularly fast turning will generally cause more bouncing and slippage of the treads on the driving surface and therefore reduce final positioning accuracy. As you might imagine, the magnitude of this effect is surface-dependent.

That’s an interesting topic to look at, once the firmware is out. Wide nubby rubber treads on different surfaces…

> When you say a precision setting, do you mean an adjustable range for on-board classification of sensor values as a particular color? I don’t believe there’s anything like this for the EDU app, but I’d have to look to see if there’s something already implemented that could be used for this with the SDK. I haven’t touched color sensing since Specdrums so I’m a bit out of the loop on that.

Yes, there are always tricky surfaces, like color-textured floors or carpets, in educational environments too (fewer carpets, though).

Maybe moving RVR over a surface during color picking could collect samples and define a range or set (for the EDU app) instead of a single color.

In a second step, a collection of possible color definitions (e.g. those of the surfaces involved in the current environment/lesson/experiment) that could be referenced in code might come in handy, where you could use a mini AI to cluster the measured colors and select the nearest neighbour. It might also show conflicts in too-similar color sets. Maybe as some kind of complex variable, or an additional group besides the functions and variables.

(This might also work without an AI, but the “AI” label is rarely earned more easily than that :slight_smile:)

This might also work for other “fuzzy” parameters for sensor events, though a multi-channel comparison and range definition (like with RGB) is trickier to implement with the given coding blocks, the main reason to bring it up.
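The nearest-neighbour idea could be sketched roughly like this (the palette values are entirely made up for illustration):

```python
def classify_color(sample_rgb, palette):
    """Pick the palette entry closest to the sample in RGB space."""
    def dist2(a, b):
        return sum((s - p) ** 2 for s, p in zip(a, b))
    return min(palette, key=lambda name: dist2(sample_rgb, palette[name]))

# hypothetical palette collected by driving over each marker
palette = {"red": (200, 40, 40), "green": (40, 180, 60), "floor": (120, 110, 100)}
```

A noisy sample then snaps to the closest known surface instead of having to match a single exact color.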

> Yes! There are 2 new stop commands - you can stop with the default deceleration rate, or specify your own rate in m/s^2. The specified rate is applied to the faster tread at the time the command is received, and the slower tread gets a proportionally lower deceleration rate so they stop at the same time. I should edit the post to include this feature, as it was introduced after I wrote up the initial list.

Nice. So most things will be programmable in real-world units? Can’t wait :slight_smile:

> Thanks for the great questions!

Thanks for the peek and details. Great to see where it’s going and to hear your take on the inherent issues.


It’s mid-summer now… any updates on the firmware?


Hi @bitrunner, welcome to our community forum!

The QA team found some must-fix bugs in our latest release candidate, so we’re working on fixes and then have to go back through QA. We’re pushing to get this out the door as quickly as we can. On the bright side, it looks like a significant Python SDK update will be ready to release alongside the firmware when it’s ready.

Happy programming!
Jim


Glad to hear you guys are still getting some dev time with all the virus slowdowns. Any more news since this was posted? Looking forward to the update.


The firmware is finalized and we’re putting the finishing touches on the SDK update. Stay tuned! :grin:

-Jim


WOOHOO!!! How about the URDF? So needy, huh? Hahaha

Well, I don’t have a URDF for you (though there have been rumblings around the forum from a few people interested in writing a ROS node), but here’s the firmware and SDK update!

Enjoy!
Jim


Excellent. My arms were already aching and those pom poms… Fringe abrasion is a real thing!

ROS package! Here :raising_hand_man:t3:
