Life with Deep Racer - Part 1


My Deep Racer has finally arrived back in Australia



There are a few things I have learned already, and I will share them as I go along.

Charging the batteries

Power

The charger for the compute battery pack simply needs a 3-pin appliance cable to connect it to non-US power outlets.  Yes, you can use a plug adapter, but I find they get loose over time.

The LiPo charger needs a 9V 0.5A plug pack.  The centre pin is positive. 

There is a considerable power drain on the CPU battery even when the module is off.  My advice is to disconnect both batteries when not in use, because that big pack takes a long time to charge.

Tail light colours

Note that these colours might vary depending on the version of software you're running.  When I first unpacked it, there were no lights at all.


Red - Something is wrong.  The usual cause is failure to connect to WiFi.  The official documentation lists some troubleshooting tips

Blue - Connecting to the WiFi network

Green - Ready to go.  Motor battery is also connected.

wifi-creds.txt

I'm not too happy about this because it does not seem to be consistent.  If there is a SPACE in the SSID or password, you must use single quotes, but you can use double quotes if there are no spaces.  I am not sure whether this also applies to special characters.
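For reference, a file with a space in both fields ends up looking something like this (the network name and password are obviously placeholders, and I am quoting the field names from memory, so check them against the supplied example rather than taking my word for it):

ssid: 'My Home Network'
password: 'a passphrase with spaces'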

I have observed, but am still investigating, that the wifi-creds method of configuration does not appear to be happy connecting to networks with a hidden SSID.  I will update these notes when I understand what is happening.

Console login

Connect an HDMI monitor, USB keyboard and mouse to the relevant ports and power on.  You will be greeted with a standard Ubuntu login screen.  The password is the same as the default user, but you will be required to change it on first login.

The track

The track is huge!  It takes up a good portion of our open space in the office, but all made with masking tape so it's removable if need be.  Make sure you add a dotted centre line to the layout, because the car's current software is definitely looking for one.

It does work!

Take a look at this (rough) first test of the car on our track

 
 



Deep Racer - A first look




Today, Amazon Web Services (AWS) announced the AWS "Deep Racer" at its annual re:Invent conference in Las Vegas.  I was fortunate enough to get hold of one of the first batch of production units, and since most of you will not yet have seen one, I thought I would give you a rundown, as well as my first impressions.

What is it?
Deep Racer is a 1/18th scale AWD buggy with a cute body shape finished in black and silver-grey.  The body shape is quite interesting, but we'll come back to that later.  The chassis appears similar in design and components to many other low-cost remote control buggies.

You're no doubt interested in what's inside.  There is a single-board controller carrying an Intel Atom processor, 4GB of RAM, 32GB of storage, 802.11ac WiFi, a 1080p camera and battery packs.  It can also be used as a remote control vehicle through an app.

(updated November 29) The controller board looks a lot like an UP Board, but I can't confirm this without pulling the car apart completely.  The specs match the UP Board, as does the overall layout.  The one difference I have found so far is that the power connector is not the barrel socket of the UP.  Given the price point of this car, and that Intel exited the single-board market, it's almost certainly an OEM item.

In the launch presentation, it was mentioned the storage is expandable (presumably using either USB or SD-Card), but the documentation makes no such statements.

I am confident, though prepared to be proven wrong, that it is not the same board as used in Deep Lens.  The specs alone don't match.  In the end, it doesn't really matter; it's simply an interesting technical challenge.

The controller runs Ubuntu Linux 16.04, and has the Intel OpenVINO tools installed.  The middleware solution is Robot Operating System (ROS).  At the moment, I don't have details on what, if any, customisation may have been done.

After evaluating a model, I discovered that the network is built with TensorFlow, but any hard details about it are scarce at present.

AWS say that Deep Racer is a platform to allow more people to experiment with Machine Learning, especially Reinforcement Learning.

What it isn't
Here's where things actually get more interesting.  Straight out of its box, Deep Racer is not capable of much at all.  It needs to be trained first, and right now (at least) the only way you can do this is via the Deep Racer console (https://aws.amazon.com/deepracer).  This training leverages Amazon SageMaker and AWS RoboMaker (announced on Sunday night) to do the hard work for you.

Curiously, it's not a fully-fledged machine learning kit either.  You don't have access to the models themselves, nor can you run arbitrary code on the controller.  It was hinted that these limitations are likely to be removed sooner rather than later.

Getting Started
Right now, the services are in closed beta, and to be honest they are not quite ready for the big time.  That aside, let's walk through the process.

Firstly, you will need to pull down the documentation from GitHub (https://github.com/aws-samples/aws-deepracer-workshops), but be aware that by the time you can actually get one for yourself, that documentation and the lab exercises will most likely have been changed and moved.

There is only one track available today for you to use, called "MGM Speedway", because that's where all the workshops took place.

You then need to create a "reward function", which is a pretty simple Python function where you establish outcomes for the conditions the car finds itself in.  Having completed your function (or simply pasted in the example), you select "train", and SageMaker fires up in the background.  If you haven't worked in this space before, it might come as a surprise that training time is usually measured in hours; the current documentation recommends a run of three hours to properly train your model.
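To give you a feel for what a reward function looks like, here is a minimal one along the lines of the "stay near the centre line" example in the console.  The exact signature and parameter names have been changing between releases of the service, so treat the keys used below as illustrative rather than definitive:

def reward_function(params):
    # Illustrative only - the parameter names may differ in your console version.
    distance_from_center = params['distance_from_center']
    track_width = params['track_width']
    all_wheels_on_track = params['all_wheels_on_track']

    # Heavily penalise leaving the track.
    if not all_wheels_on_track:
        return 1e-3

    # Reward staying close to the centre line, in three bands.
    if distance_from_center <= 0.1 * track_width:
        return 1.0
    elif distance_from_center <= 0.25 * track_width:
        return 0.5
    elif distance_from_center <= 0.5 * track_width:
        return 0.1
    return 1e-3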

The training display looks like this:


On the right is a video representation of the model output, and the chart is the historical training score.  In the current release, the video is buggy and often freezes up or times out.  I am sure this will get better when the services reach GA.

Once training is complete, or you stop it because you're impatient, you need to evaluate the model's performance against the simulated track.  The evaluation step will attempt to drive the course and record the time to complete a lap.  My first attempt was 1:20 after a twenty minute training run, and using the default reward function.  It was able to complete the circuit on two out of three passes.  A more advanced model got this down to 1:13, and it is still very basic.




Finally, you download the trained model into Deep Racer via a USB drive, and off you go.

Some additional trivia for you. 
There are two battery packs.  One is for the controller, and is claimed to give it a running time of up to four hours.  The other pack is for the motor, and its run time is in the order of 15-20 minutes.

The curious body shape is a result of getting the plastic body far enough away from the passive heatsink on the controller.  Early prototypes apparently could melt or distort the body.

AWS also announced the "Deep Racer League", which will conduct heats at various events around the world; a prototype league is being run at re:Invent this week.  Sadly, I finished too late to have my final times counted.

Conclusions
It's exciting to see AWS putting Deep Racer out there for the world to experiment with.  At $US399 ($299 if you pre-order), plus shipping and ongoing AWS fees for building and testing models, it's not the cheapest solution for learning, but they do make it very easy.  To be fair, you get a lot for your money, and I suspect AWS are selling this pretty much at cost, or at most with a very slim margin.

I do hope that AWS relax the restrictions and share more knowledge around Deep Racer in the future.  Since it is based on OpenVINO (and presumably OpenCV as well), it would be fascinating to add a Movidius NCS to the machine and see what improvement that makes.

If you want the cheapest way to get into autonomous vehicles, this is probably not it.  That said, AWS and Intel have done a lot to lower the barrier to entry into this world.

Auric - The Software Side



Auric is a combination of hardware and software.

Last time we looked at the hardware required, and in this installment we will consider the software components.

Like my previous self-driving car, there are two controllers in Auric.  The main processor and the motion control processor.

Let's start with Motion Control because it is much simpler.   Motion Control is based on a NodeMCU (v1.0) and an associated L293D H-Bridge driver.  The software loop monitors the incoming serial port for a character stream from the main controller.

This character stream is in the form of 'channel:value'.  Each channel ID is a single ASCII character (case-insensitive), and the value data is dependent on the channel.  For example, R:512 is interpreted as Right Motor with a value of 512.

It is the responsibility of the channel code to act and generate a response to the main processor.
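To make that protocol concrete, the parsing and dispatch logic looks roughly like the sketch below.  The real code is an Arduino sketch running on the NodeMCU, but the logic is easier to show in Python; the set_right_motor() and set_left_motor() helpers are stand-ins for the actual H-Bridge PWM calls, and the L channel is my assumption by analogy with R:

def set_right_motor(pwm):
    # Stand-in for the real H-Bridge call on the NodeMCU.
    return '0'

def set_left_motor(pwm):
    # Stand-in for the real H-Bridge call on the NodeMCU.
    return '0'

def handle_command(line):
    """Parse a 'channel:value' command and return the response string
    ('0' for OK, '1' for an error), mirroring the protocol described above."""
    try:
        channel, value = line.strip().split(':', 1)
        value = int(value)
    except ValueError:
        return '1'                     # malformed command

    channel = channel.upper()          # channel IDs are case-insensitive
    if channel == 'R':
        return set_right_motor(value)
    if channel == 'L':
        return set_left_motor(value)
    return '1'                         # unknown channel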

The main processor, a FriendlyARM NanoPC-T4, is primarily tasked with navigation, as well as edge and object detection.  Edge detection takes a continual stream of still images (as opposed to a video stream).  Each image is processed to identify edge boundaries (such as the sides of a pathway or road), and this edge data is used to determine the relative location of the camera between the edges.

An important assumption at this stage is that the camera faces forward along the centre-line of the chassis.  Given the size of the chassis, any misalignment is most likely to be negligible.
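To illustrate the idea (this is not Auric's actual code), an edge-based centring pass can be as simple as the sketch below: find edges in each still image, look along a scan line near the bottom of the frame for the left and right boundaries, and compare their midpoint with the centre of the image, which, given the assumption above, is also the centre-line of the chassis:

import cv2
import numpy as np

def offset_from_centre(frame, scan_row=-40):
    """Return a signed offset (in pixels) of the path centre from the image centre.

    Negative means the path centre is to the left of the chassis centre-line,
    positive means it is to the right.  Illustrative only - not Auric's real code.
    """
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(grey, 50, 150)

    row = edges[scan_row, :]                # a scan line near the bottom of the frame
    columns = np.flatnonzero(row)           # columns where an edge was detected
    if columns.size < 2:
        return None                         # no usable edges in this frame

    left, right = columns[0], columns[-1]   # outermost edges = path boundaries
    path_centre = (left + right) / 2.0
    return path_centre - (frame.shape[1] / 2.0)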

New for this release is support for a compass/magnetometer module on the I2C bus.  This gives the software a heading value.  For the moment, this is reference data only, but in the future it will be used to determine an on/off-course condition.
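Reading the heading itself is straightforward.  The sketch below shows the general idea for the HMC5883L module used on Auric, using the smbus2 library.  The register addresses come from the device's datasheet, but the I2C bus number is an assumption that depends on how the module is wired, and there is no tilt compensation or declination correction here:

import math
from smbus2 import SMBus

HMC5883L_ADDR = 0x1E   # default I2C address from the datasheet

def read_heading(bus_num=2):
    """Return a rough magnetic heading in degrees (0-360).  Illustrative only."""
    with SMBus(bus_num) as bus:
        bus.write_byte_data(HMC5883L_ADDR, 0x02, 0x00)          # mode register: continuous
        data = bus.read_i2c_block_data(HMC5883L_ADDR, 0x03, 6)  # X, Z, Y (MSB first)

    def to_int16(msb, lsb):
        value = (msb << 8) | lsb
        return value - 65536 if value > 32767 else value

    x = to_int16(data[0], data[1])
    y = to_int16(data[4], data[5])

    heading = math.degrees(math.atan2(y, x))
    return heading + 360 if heading < 0 else heading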

A second process runs on the main processor to support the LIDAR unit.  At present, this is data gathering to help me understand the nature of the LIDAR data, and because of the relatively high volume (720 range values per second), I am sending it off-board using MQTT for storage, since I would rapidly exhaust the on-board storage otherwise.  I plan to write a follow-up article on interpreting the LIDAR point cloud data once I get a handle on it for myself.
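The publishing side of that off-board logging is very simple; the interesting part will be interpreting the data.  A rough sketch using paho-mqtt is below, where read_scan() is a stand-in for whatever yields one sweep of (angle, range) pairs from the X4, and the broker address and topic are placeholders:

import json
import time
import paho.mqtt.client as mqtt

def publish_scans(read_scan, broker='192.168.1.10', topic='auric/lidar'):
    """Push each LIDAR sweep off-board as a JSON payload.

    read_scan is assumed to be a generator yielding one list of (angle, range)
    tuples per rotation - a stand-in for the real X4 driver code.
    """
    client = mqtt.Client()
    client.connect(broker)
    client.loop_start()                      # handle network traffic in the background

    for sweep in read_scan():
        payload = json.dumps({'t': time.time(), 'points': sweep})
        client.publish(topic, payload, qos=0)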

News Flash

Take a look at Auric's first day out on YouTube

Want to help?

If you would like to make a financial contribution to the project, please go to my Patreon Page at:




My new favourite tool


Life is much easier when you can see what's going on

Sometimes I think these machines are getting a little too autonomous for my liking...

Right now, Auric is being a downright pain in the rear.  It just seems to be uncooperative.

It all started when I observed that the directional control was erratic.  Given that the steering process is a stream of adjustments based on image processing, it is normal that there will usually be a mismatch in the speeds of the motors on each side.  But when the camera is shown a static image of a receding path, I would expect it to steer more or less straight, or to make only small adjustments.  This, however, was not the case.

More digging hinted that not all of the commands sent from the main controller to the motor controller were being processed.  This boiled down to one of three possibilities:
  • The main controller was not sending the commands;
  • The motor controller was not receiving all the commands being sent; or
  • The motor controller was not acting on some of the commands being received.
I was able to eliminate the first option by connecting a serial interface to the output of the main controller and comparing what hit the wire with the logging output.  They matched, so I ruled out the controller and its code as the culprit.

This left the blame on the motor controller.  Unfortunately, since it has only a single serial port, debugging options were limited, and it operates in what is basically a tight loop.  With no console available to me, I modified the controller code to flash the onboard LED (LED_BUILTIN) each time a message was received.  Firing it up again, the LED was flickering constantly, so that looked OK.

I then modified the code again to flash the LED each time it executed a command, and the result was a much slower flicker. Aha!

Or at least so I thought.  The commands follow a simple formula:

direction:pwm_speed

So R:512 would mean set the PWM value on the right side to 512, and the motor controller will respond with "0" for OK or "1" for an error.  If I ran minicom on the main controller, and entered the commands manually, it worked fine every time.

Then I wrote a smaller test program using libserialport to send a command to each motor.  What was curious here was that consistently only the first command was executed!  Now we are getting somewhere, but why?

I suspected that the line speed might be a problem, so I lowered the serial rate to 9600 baud, but the same symptoms were there.

Time for the big guns

A while ago I purchased a BitScope Micro through a special deal from TronixLabs.  I hadn't really put it to serious use, but if ever there was a time, this was it.  The BS05 (as it's known) has a range of software, from a basic oscilloscope through to a logic analyser and protocol analysis.  It also has tiny little probes which can get between pins.  The software works under Windows, MacOS and Linux.  As a bonus, they are a local company, which is even better as far as I am concerned.

Images courtesy of BitScope
With everything running, I was going to use the protocol analysis features to decode the serial stream, but I didn't need to.  Modifying serialtest to loop through its transmission and report the number of bytes sent, I put one channel on the transmit line from the main controller, and another channel on its receive line.

This little device has helped protect my sanity!
I could see the blocks of data on both channels, and the size of the blocks seemed about right.  But wait!  The relative positions of the blocks were changing: sometimes there was a response in between the commands; other times responses seemed to overlap with a command.

This was the break I needed (see what I did there?).  I modified serialtest yet again to report when a frame was received.  Now I could understand what was happening.  The RockChip UART has a deep transmit buffer, so when we wrote a command, the write was acknowledged instantly as sent and the application was free to send the next command.  As a result, commands would just keep arriving in the motor controller's RX buffer faster than it could deal with them, and because there was no handshaking, some commands were simply lost.

The tiny probes make it easier to grab the right part on a PCB
I can't change the buffer behaviour, and I have no capacity to add hardware handshaking between the two controllers, so the solution is to make the main controller actively wait for the acknowledgement from the motor controller before sending the next command.  By making this change, I could ramp the data speed back up and things held together fine.
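In code, the change amounts to making each command a blocking round trip.  A simplified version of the idea is below, shown with pyserial rather than the libserialport-based test program; the newline terminator, device name and timeout are assumptions for the sake of the example:

import serial

def send_command(port, command):
    """Send one 'channel:value' command and block until the motor controller
    acknowledges it ('0' = OK, '1' = error)."""
    port.write((command + '\n').encode('ascii'))
    port.flush()
    reply = port.read(1)          # blocks for up to the port's timeout
    if not reply:
        raise TimeoutError('no acknowledgement for ' + command)
    return reply == b'0'

# One command at a time - the next one is not sent until the previous
# acknowledgement arrives, so the motor controller's RX buffer can never be overrun.
link = serial.Serial('/dev/ttyS4', 115200, timeout=1.0)   # device name is illustrative
send_command(link, 'R:512')
send_command(link, 'L:512')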

It was definitely a software problem of my own making; but the symptoms were pointing to anything but that.

The odd thing was that this same code model had held up just fine in the previous car, even though my poor code was present, so what is the difference?  There are a few explanations I can offer, and I am not yet sure which is the most correct:
  • Because the Raspberry Pi is slower, perhaps it simply didn't manifest;
  • There may be a difference in either the UART driver code or the UART itself between the two boards;
  • Maybe it was there, but because of the way the steering mechanism worked, it just wasn't obvious.

So what did I learn?

Sometimes a console just isn't enough, and no console is even harder
Debugging embedded systems has unique challenges because you can't always have the controller report what it's thinking.  Using a spare GPIO port to indicate state will usually cost very little in performance, but might be all you need.
Don't assume.  Check
Maybe if I had been less rushed in writing the code in the first place, I would not have spent a day trying to solve this, but I fell into the trap of "what the heck, it works!"
The right tools make all the difference
I have a couple of perfectly good CROs (cathode ray oscilloscopes), but they are big and bulky, and I couldn't be bothered to drag them out.  The BitScope, on the other hand, is a little USB device, which makes it portable and powerful enough for my needs.

There is a lot of functionality I have yet to explore, but I am sure that with Auric's bad attitude I am going to get the opportunity!

Want to help?

If you would like to make a financial contribution to the project, please go to my Patreon Page at:




Please note:  I am not associated in any way with BitScope Designs, MetaChip Pty Ltd, or their resellers.  I am just a very happy customer, and sought their permission before writing.




Introducing Auric


Looks like something an evil genius should have, right?
In my last update, I mentioned that the "Auric"* chassis was being used as part of my self-driving experiment.  Now it's time to unveil it properly.

Auric is a T-900 tank chassis made by SZ-Doit.  It's not cheap, but you get a lot for your money:
  • The chassis itself, in a choice of anodising colours
  • Twenty (20) idler wheels
  • Four driven wheels
  • Plastic track
  • Bearings
  • Motors (9V with Hall-Effect sensors, or 12V without)
  • 18650 Battery Holder (no batteries)
  • Screws, nuts, washers
  • Allen keys
The only problem is that there are no instructions supplied.  SZ-Doit have a Wiki on GitHub, but for the most part you are referring to photographs of the finished product in order to assemble the machine.

Also available is a motor shield which is used with a NodeMCU module.  Since this was featured in the company's build I added one of those as well.

Some build notes

This thing is big!  It is close to 50cm long and 30cm wide, so you need plenty of space when working on this chassis.

It's normal to have two idler wheels left over when building the quad-track version.

The motor wires are not long enough to reach the controller.  I added a pair of terminal strips to the underside of the chassis, then connected from there to the controller.

Take your time lining up the drive wheel with the idler wheels.  If they are not lined up well, there will be a tendency to throw tracks.

There are two ridges on the inside of the tracks.  These go inside the idler wheels.  There is also a notch closer to one edge of the track; this faces out.  If you keep these things in mind, lining up the track on the drive and idler wheels becomes much easier.

With the mechanical side done, I loaded the demo software onto the NodeMCU, but I could not get it to work, so I didn't bother with the Bluetooth control either.  My GitHub repo contains a simple sketch that does work and will give you a starting point.

The shield uses an L293 driver, which is different in operation to the Uno + L298 combination I usually run.  One thing I noticed was that the motor speed is much lower with the NodeMCU + L293 driver, so you do need to take this into account.

Auric chatting with my Logic Analyser

A Guided Tour

In a more-or-less central position is the new controller.  I like the Raspberry Pi, but these projects have simply outgrown its capabilities.  After a lot of evaluation, I settled on the NanoPC-T4 from FriendlyArm.

Images courtesy of FriendlyElec/FriendlyArm

This board uses a RockChip RK3399 SoC, so it's still ARM, and it runs Ubuntu.  Its biggest advantages for me were the processor speed and 4GB of RAM.  It also supports adding an NVMe storage device, but I have not enabled that just yet.  There is also the usual WiFi (2.4/5GHz), Bluetooth, HDMI, etc.  As a bonus, the 40-pin I/O header is pin-compatible with the Raspberry Pi.

At the rear of the chassis is a NodeMCU attached to an SZ-Doit Motor Shield.  A simple serial connection (115200bps) connects the two devices together, and they use the same basic command/response protocol defined in my earlier self-driving experiments.

At the front are the two most visually interesting features.  The first is a forward-facing camera; you've seen this one before.  It's connected to the T4 and drives the image-processing functions.

Long exposure shows the LIDAR in operation

The blue cylinder is the X4 LIDAR unit from YDLIDAR.  It generates 720 samples per rotation, or 2 samples per degree, which is rather high resolution for an inexpensive device ($US99).

To mount the LIDAR and forward camera, I had to make a bracket from two back-to-back aluminium angle pieces.




LIDAR is great for knowing what is around you, but it is even more useful if you know where you are, and where you are going at the time.  Two small modules make this possible.  The first is an HMC5883L I2C compass/magnetometer module.  This provides a heading record which is saved along with the LIDAR data.

Also included is a uBlox GPS receiver.  This device is only for logging at this point, but in the future I want to use it for navigation as well.

All this hardware needs a considerable amount of power.  The main processor needs 12V, which is a bit annoying.

Battery size and weight is always an issue with mobile devices, and Auric is no exception.  Fortunately RC cars have faced similar problems, so I am presently using a 4C LiPo battery pack.

This provides 7.4V, so for 12V I used a small switch-mode regulator to run the NanoPC-T4.  The motors are slightly under voltage, but I don't see this as a big issue for now, anyway.

Battery goes underneath
That's enough of the hardware for now.  In the next update, I will look at how the self-driving software was integrated into this platform, as well as how few changes were actually needed.

Want to help?

If you would like to make a financial contribution to the project, please go to my Patreon Page at:




*Auric gets its name from Auric Goldfinger, a criminal genius obsessed with gold, in the James Bond novel and film "Goldfinger".  Since the chassis was a golden colour, it just made sense at the time.




DIY Self-Driving - Part 9 - New Imaging

Please Note:  Unlike most of my projects which are fully complete before they are published, this project is as much a build diary as anything else.  Whilst I am attempting to clearly document the processes, this is not a step-by-step guide, and later articles may well contradict earlier instructions.

If you decide you want to duplicate my build, please read through all the relevant articles before you start.

We have a lot of work to do in a pretty short space of time now, with new competitive opportunities in the very near future.

To be honest, the line detection has simply not been good enough, and this is leading to some of the poor performance I have been experiencing with the car.  Whilst it really does work, it is not accurate enough.  There are a couple of issues here which need to be dealt with:

  1. The core processor is just not powerful enough; and
  2. The optics aren't good enough

Processing

As I noted in my "Next Iteration" article, I am looking at other processor boards.  At the moment, I am experimenting with four different units:
  1. Intel NUC;
  2. HardKernel ODROID XU4;
  3. FriendlyARM NanoPC T4;
  4. UP Board.
These are being benchmarked against the Raspberry Pi 3B from the current platform on a number of criteria:
  1. Overall Performance;
  2. Power Efficiency;
  3. Ease of Integration
Performance and Integration are the key factors for me.  OpenCV is pretty demanding, and when we start adding some of the learning-based functionality, there are going to be a lot of demands on the system.  Integration means we need to be able to interface the new processor platform with the rest of the car, so it needs to have well-documented GPIO capabilities.  More news on this as it comes to hand.

Imaging

I am also working with a pretty amazing new camera unit manufactured by ELP.  This module includes two physical cameras and an onboard CPU (ARM, but the specs are unclear) which stitches the two images together.  It is intended for use in VR appliances and similar roles, but I like it because if we split the image into its components, and then mathematically add the images, we get a single image with depth information:

(This image was built at 25 fps using the ODROID XU4)

It is very reminiscent of those old red/blue "3D" posters.  The further away the objects, the more aligned the images are.  The camera is also much faster than the older single-lens ELP device previously attached to the car.

Clearly, just adding the two images together is not the end of the story, and I am looking very closely at the OpenCV 3 stereo image functions to extract the depth information.  I can then massage this into something the navigation layer can use.
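As a starting point, OpenCV's block-matching stereo correspondence is probably the simplest of those functions to try.  A rough sketch is below, assuming the left and right halves of the combined frame have already been split out and that the cameras are reasonably well aligned (a real pipeline would calibrate and rectify them first):

import cv2

def depth_map(left_bgr, right_bgr):
    """Compute a disparity map from the two halves of the ELP frame.

    Rough sketch only; larger disparity values mean closer objects.
    """
    left = cv2.cvtColor(left_bgr, cv2.COLOR_BGR2GRAY)
    right = cv2.cvtColor(right_bgr, cv2.COLOR_BGR2GRAY)

    # numDisparities must be a multiple of 16; blockSize must be odd.
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = matcher.compute(left, right)

    # Scale to an 8-bit image so it can be displayed or logged easily.
    return cv2.normalize(disparity, None, 0, 255, cv2.NORM_MINMAX).astype('uint8')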

The other option is to use the images in a "Stitch" mode and create a single wide image with less fish-eye effect than I would get with a wide-angle lens.

In any case, it's been a long night, but the car is definitely getting smarter!

Want to help?

If you would like to make a financial contribution to the project, please go to my Patreon Page at:



Shameless Plug:
I use PCBWAY.com for my printed circuit boards.  If you need PCB manufacturing, consider the sign-up link below.  You get the service, and I get credits.  Potentially a win for both of us!




DIY Self-Driving - The next iteration

Please Note:  Unlike most of my projects which are fully complete before they are published, this project is as much a build diary as anything else.  Whilst I am attempting to clearly document the processes, this is not a step-by-step guide, and later articles may well contradict earlier instructions.

If you decide you want to duplicate my build, please read through all the relevant articles before you start.

Things have been a little quiet on the self-driving front for the last month, but the project is still moving ahead.

Mechanical Issues

I guess it had to happen.  Prolonged running of this old car has taken its toll on the drivetrain and the gearbox has stripped.  As you can imagine, this makes it rather hard to move anywhere.  It's not the end of the world, because these units are pretty common.  I am going to try and get a pair of them and convert the car to two-wheel drive into the bargain.  All the cutouts are already there, so in theory at least it's not that big a task.

Design Changes

However, the gearbox failure has led to another problem.  If I am going to go to two motors, I need to revisit how everything is powered.  I am also looking at adding a lot more electronics which will make my 6V electrical system rather doubtful, so I am planning on reworking the whole car to 12 volts.

This is going to cause me some challenges driving the 6 volt motors, so I have a couple of choices.  I could regulate the motor voltage down with a switch-mode supply or similar and retain the same basic drive scheme, or I could change the motor drive to a PWM scheme and use a higher-frequency drive for the motors.  I am not sure which approach is better at this point, and I would welcome suggestions from anyone with experience.
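If I go down the PWM route, the arithmetic is at least simple: the average voltage the motors see is roughly the supply voltage multiplied by the duty cycle, so on a 12 volt rail the duty cycle needs to be capped at around 50% to keep the 6 volt motors happy.  A quick sanity check (the full-scale values are just the common 8-bit and 10-bit PWM ranges, to cover whichever controller ends up doing the driving):

V_SUPPLY = 12.0      # proposed new electrical system
V_MOTOR_MAX = 6.0    # rating of the existing motors

duty_cap = V_MOTOR_MAX / V_SUPPLY            # 0.5

# Convert the duty-cycle cap to a raw PWM value for each controller type.
for name, full_scale in (('8-bit PWM', 255), ('10-bit PWM', 1023)):
    print(f"{name}: cap PWM at {int(duty_cap * full_scale)} of {full_scale}")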

The basic software does an OK job of finding the path to follow, but at present it has its limitations.  Firstly, it is not aware of the environment around it, which makes it less autonomous than I am aiming for.

An obvious solution to this is to implement a Machine-Learning (ML) based image recogniser which can identify traffic lights, signs, vehicles and the like, and then command the driving logic to react accordingly.

After AWS Summit, I have been offered a number of really interesting devices to further the car's development.  I am now the proud owner of a pair of Intel Movidius Neural Compute Sticks (NCS):


This device is not much bigger than your typical USB drive, but it contains a visual processor designed for deep learning and analysis in edge systems.  Using models developed on a host system, the processing can be done locally instead of having to either send data over the network or carry around a huge GPU-equipped machine.  Also, thanks to my friends at Intel, I have a NUC i5 system to host it with.  There are going to be a few challenges with power (the NUC needs 19 volts), but I am sure it's achievable.  The NCS will also work with a Raspberry Pi, so I will be testing back-to-back to see which is the best solution for the car.

Further into sensing the world around us, I also now have a YDLIDAR X4 LIDAR unit.


To be fair, I have had very little time with the X4 yet, but it does seem very impressive for its $US99 price tag.  It is intended for smart appliances and small mobile robots, which makes it a good start for my car.  While it's not in the same league as the Velodyne unit, I feel like I am now running with the big guys on this project.
These devices are, however, quite power-hungry.  The NUC/NCS combination requires a little over 2.5A, and the LIDAR unit alone draws 1.6A from its micro-USB connector, so clearly it is back to the drawing board as far as power supplies go.

Software Changes

These new components mean that a comprehensive rewrite of the car's software is in order.  The LIDAR produces an order of magnitude (or more) more data than the current camera setup, and to be honest I am not quite sure how to deal with it yet.  There is a lot of experimentation and article writing to come on this subject.

In summary, bear with me, folks: we're about to go down a load of new roads with our little car, and I promise to take you all along for the ride!




Meerkat is famous!


I want to give a big shout-out to the folks over at DIYODE Magazine, because in their latest issue (#11) they have run a feature on Meerkat, my obstacle-avoiding robot.

The little guy has dressed up really well for the story, and my self-driving car even makes a brief appearance.


Read for yourself here, and please think about supporting this local publisher by purchasing the issue or, better yet, subscribing.



DIY Self-Driving - A night out!

Please Note:  Unlike most of my projects which are fully complete before they are published, this project is as much a build diary as anything else.  Whilst I am attempting to clearly document the processes, this is not a step-by-step guide, and later articles may well contradict earlier instructions.

If you decide you want to duplicate my build, please read through all the relevant articles before you start.

Well, we made it...

The car is a very long way from being finished, but its official debut at Sydney's Darling Harbour coincided with the Amazon Web Services AWS Summit for 2018, so I had to take advantage of that as well.

This is a collection of photos from the big night out before we go back to the hard work again.






Wasting your and my time

I had a really interesting experience recently which I hope might enlighten others as much as it did me: I was approached (via LinkedIn) by ...