Make: Raspberry Pi First Person View Robot Car

For MakerFaire UK 2017 we decided to enter a project, which was then accepted! The project is a Raspberry Pi-controlled car connected to a virtual reality viewer and a Wii remote.

The user drives the car with the Wii remote while seeing everything from the car's point of view. For added complexity, we mounted the camera on a servo, so when the driver turns their head from left to right, the camera turns too.

Please note this is PART 1 of the build; the code may change before PART 2.

Parts

We used the following:

  • Agobo2 robot kit from 4tronix
  • Raspberry Pi 2, with a WiFi dongle and a Bluetooth dongle
  • Raspberry Pi camera, a servo, and two 3D printed mounts
  • NeoPixels for the lights
  • Wii remote
  • Phone and virtual reality viewer

Build – the car

The Agobo2 is a kit car: it comes with all the parts and you build it following the instructions from 4tronix. We used a Raspberry Pi 2 with the kit, plus a WiFi dongle and a Bluetooth dongle.

We 3D printed the holder for the servo (the black block below) and the adaptor for the camera (the blue wing) that sits on the servo.

[Photo: the 3D printed servo holder and camera mount fitted to the car]

Build – the city

We have a 6 foot by 4 foot table at MakerFaire. We wanted people to be able to drive around a “city” passing landmarks/streets on the way.

We bought an 18mm board and cut it down to 6 foot by 4 foot. We then had to cut it into four pieces, as it wouldn't fit in our car in one piece!

For the walls we got four sheets of 6mm hardboard, each 4 foot by 1 foot 8 inches, and cut them into 4 foot by 8 inch strips.

This is what it looked like after cutting it into four pieces and adding the walls:

[Photo: the board cut into four pieces with the walls slotted in]

We used a router to cut grooves in the board for the walls to slot into, then 3D printed some connecting blocks to hold everything steady.

TO UPDATE:

  • Putting holes in the walls to connect the streets
  • Adding pictures to the walls

Code

There are three devices that are sending/receiving information.

  1. The car (Nora the Explorer), which includes a:
    1. camera
    2. servo
    3. motors
    4. NeoPixels
  2. The phone
  3. The Wii remote

Nora runs her own web server using the Python Twisted framework. The phone connects to this server and tells Nora which direction it is pointing. Nora sends the phone a picture of what she is currently looking at and moves the camera to face the same direction as the phone. The Wii remote also talks to Nora, telling her whether to move forwards, backwards, left or right, and whether to turn the lights on or off.
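To give a rough idea of how that fits together, here is a minimal sketch of a Twisted web server with two endpoints: one serves the latest camera frame, the other accepts the phone's heading. The endpoint names, the set_heading helper and the shared latest_frame buffer are illustrative assumptions, not the actual code from the repo.

```python
# Minimal sketch (not the real Nora code): a Twisted web server with two
# illustrative endpoints - /frame returns the most recent camera JPEG and
# /heading accepts the direction the phone is pointing.
from twisted.web.resource import Resource
from twisted.web.server import Site
from twisted.internet import reactor

latest_frame = b""           # updated elsewhere by the camera capture loop
def set_heading(degrees):    # hypothetical helper that would drive the servo
    print("phone heading:", degrees)

class Frame(Resource):
    isLeaf = True
    def render_GET(self, request):
        request.setHeader(b"Content-Type", b"image/jpeg")
        return latest_frame

class Heading(Resource):
    isLeaf = True
    def render_POST(self, request):
        degrees = float(request.args[b"degrees"][0])
        set_heading(degrees)
        return b"OK"

root = Resource()
root.putChild(b"frame", Frame())
root.putChild(b"heading", Heading())

reactor.listenTCP(8000, Site(root))
reactor.run()
```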

Libraries used:

  1. picamera, using its rapid capture and streaming recipes to copy the most recent image from the camera and hold it in memory (see the first sketch after this list): http://picamera.readthedocs.io/en/release-1.10/recipes2.html#rapid-capture-and-streaming
  2. WiringPi to control the motors: http://wiringpi.com/
  3. ServoBlaster to control the servo: https://github.com/richardghirst/PiBits/tree/master/ServoBlaster
  4. Full Tilt to detect movement of the phone: https://github.com/adtile/Full-Tilt
  5. cwiid to allow Bluetooth control of the car from a Wii remote (see the second sketch after this list): https://github.com/abstrakraft/cwiid
  6. NeoPixel library for some cool lights!
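As an illustration of the rapid capture technique from the picamera docs, here is a minimal sketch that keeps only the most recent JPEG in memory. The latest_frame name and the resolution and framerate values are assumptions for the example, not Nora's actual settings.

```python
# Sketch of the picamera "rapid capture" recipe: continuously capture JPEGs
# into an in-memory stream and keep only the most recent frame.
# In the real robot this loop would run alongside the web server (e.g. in
# its own thread) so the latest frame is always ready to send to the phone.
import io
import picamera

latest_frame = b""  # the web server hands this to the phone on request

with picamera.PiCamera(resolution=(640, 480), framerate=24) as camera:
    stream = io.BytesIO()
    # use_video_port=True trades a little image quality for much faster captures
    for _ in camera.capture_continuous(stream, format="jpeg",
                                       use_video_port=True):
        latest_frame = stream.getvalue()
        stream.seek(0)
        stream.truncate()
```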
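And here is a minimal cwiid sketch for reading the Wii remote's buttons over Bluetooth. The button-to-action mapping is just printed as a placeholder for the real motor and NeoPixel code.

```python
# Sketch of reading a Wii remote with cwiid: press 1+2 on the remote to make
# it discoverable, then poll the button state in a loop.
import time
import cwiid

print("Press 1+2 on the Wii remote now...")
wm = cwiid.Wiimote()            # blocks until the remote pairs
wm.rpt_mode = cwiid.RPT_BTN     # ask the remote to report button presses

while True:
    buttons = wm.state["buttons"]
    if buttons & cwiid.BTN_UP:
        print("drive forwards")       # placeholder for the motor code
    elif buttons & cwiid.BTN_DOWN:
        print("drive backwards")
    elif buttons & cwiid.BTN_LEFT:
        print("turn left")
    elif buttons & cwiid.BTN_RIGHT:
        print("turn right")
    if buttons & cwiid.BTN_A:
        print("toggle the lights")    # placeholder for the NeoPixel code
    time.sleep(0.05)
```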

One big hint:

You have to run ServoBlaster in PCM mode, not PWM mode, otherwise it interferes with the NeoPixel library. That was a fun one to debug!
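For reference, ServoBlaster's servod daemon is controlled by writing simple position commands to /dev/servoblaster, and a minimal sketch of that is below. The --pcm flag and the servo number and position values are assumptions based on the ServoBlaster README, so check them against your own setup.

```python
# Sketch of driving the camera servo through ServoBlaster's device file.
# The daemon would be started beforehand with something like:
#   sudo ./servod --pcm
# (--pcm uses the PCM hardware for pulse timing, leaving PWM free for the
# NeoPixels - this is the PCM-vs-PWM conflict mentioned above.)

def set_servo(servo, percent):
    """Move the given servo to a position expressed as a percentage."""
    with open("/dev/servoblaster", "w") as dev:
        dev.write("{}={}%\n".format(servo, percent))

set_servo(0, 50)   # servo 0 (assumed to be the camera servo) to mid position
```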

All the code is on GitHub: https://github.com/furbrain/nora
