(I went by memory in selecting the size M coax plug, but fortunately it turned out to be the right size.)
This photo shows the robot platform with the top plate removed, revealing the four motors attached to the wheels. The two left motors are wired together, as are the two right motors.
This is the top plate of the robot platform. An Arduino Uno with a motor shield is attached to the front (left) of the plate with short standoffs. The wireless SD shield (which I use mainly as a platform to hold a 16-pin socket) is not attached. On the right is a small circuit board attached to the top plate with longer standoffs. I usually velcro the Raspberry Pi and its battery pack here.
Here's the underside of the top plate, showing the six-AA battery pack attached to the plate and wired to a two-position switch. When the robot is jostled, a battery sometimes falls out of the holder, and in any case the pack doesn't provide a lot of power.
So, here's the new battery pack, shown in green lying on the bottom of the robot platform. The pack comes with a male Molex connector that attaches to the charger. Because that connector isn't quite compatible with the male and female Molex connectors I got from Radio Shack, I removed it and attached the Radio Shack male connector in its place. Then I wired the Radio Shack female connector to the coax socket and the two-position switch (both of which were included with the Sainsmart 4WD robot platform). If you look closely, you can trace out the wiring.
The new battery pack is intended to power both the Arduino stack and the robot motors. Even in my initial test drive (without a full charge on the new battery), the robot was pretty swift.
Here's the top plate. To the right of the "On" label, you can see the newly installed coax socket.
In these two photos, you can see the newly assembled robot charging. On the left is the charger that came with the battery pack. I took the male Molex connector that I had disconnected from the battery and wired it to the Radio Shack size M coax plug with a black and red pair of wires.
The Arduino wireless SD shield sits at the top of the Arduino stack, with the Radio Shack ultrasonic range sensor looking a bit like a pair of eyes. The Raspberry Pi and its battery pack are velcro'd to the platform in back. Also of note are a Belkin Retractable High Speed USB 2 Cable connecting the Raspberry Pi to the Arduino stack and a short USB cable (half disconnected in the photo) between the Raspberry Pi and the battery pack velcro'd beneath it. I ordered both cables from Amazon.com to shorten cable lengths and reduce clutter.
This is a big thing to note: a variety of Raspberry Pi -- Arduino robots described on the web use the GPIO ports on the Raspberry Pi to communicate with the pins on the Arduino. Because the Raspberry Pi's pins run at 3.3 V while the Arduino's run at 5 V, those designers have to put an opto-isolator between the two devices. I don't see the advantage of this over connecting the USB ports on both devices and letting the serial communications libraries on each platform handle the traffic. To me it's much more straightforward to send and receive ASCII commands than to control pins directly.
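For what it's worth, here's a minimal sketch of what the Python side of that serial link might look like, using the pyserial package. The port name, baud rate, and the single-letter command are placeholders for illustration, not my actual code; the Arduino sketch would need matching Serial.begin() / Serial.read() logic on its end.

    # Minimal sketch of USB serial communication from the Raspberry Pi side.
    # Assumptions: the Arduino enumerates as /dev/ttyACM0 (check with
    # `ls /dev/ttyACM*`), runs at 9600 baud, and understands a made-up
    # one-letter ASCII command like 'F' for "drive forward".
    import time
    import serial

    arduino = serial.Serial('/dev/ttyACM0', 9600, timeout=1)
    time.sleep(2)  # give the Uno time to reset after the port opens

    arduino.write(b'F\n')  # send the hypothetical command

    # Read back whatever acknowledgement the Arduino sketch prints.
    reply = arduino.readline().decode('ascii', errors='ignore').strip()
    print('Arduino replied:', reply)

    arduino.close()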
Finally, mounted on the wireless SD shield with a standoff and encased in a clear plastic case behind the Radio Shack ultrasonic range sensor "eyes" is a Raspberry Pi Camera, connected to the Raspberry Pi by its white flat ribbon cable.
-----
I still can't get the Raspberry Pi to act as a WiFi access point that I can log into, so I'm still forced to use my home WiFi network for communications.
I still can't get the Raspberry Pi camera to display photos directly from Python using show() (or image.show()), either over SSH or via a VNC connection. (Actually, I'm not sure show() works even when the Raspberry Pi is connected directly to a monitor.) Instead, I have to capture an image to a .jpg file and then have Python make an external call to Image Viewer. That works, but it feels pretty clunky.
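For the record, the workaround looks roughly like this. This is a sketch, not my exact script: it assumes the picamera library, and it assumes gpicview is the program behind the "Image Viewer" entry on the Raspbian desktop (adjust the name if yours differs).

    # Capture a still to a .jpg file, then hand it to an external viewer.
    import subprocess
    import picamera

    with picamera.PiCamera() as camera:
        camera.capture('/home/pi/snapshot.jpg')  # write the frame to disk

    # Since show() doesn't work for me, launch the desktop image viewer
    # (assumed here to be gpicview) on the saved file instead.
    subprocess.call(['gpicview', '/home/pi/snapshot.jpg'])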
Oh, yes, and I've also started using a VNC connection to the Raspberry Pi.
-----
My next project is to mount the Radio Shack ultrasonic range sensor on a swivel platform driven by a stepper motor under the control of the Arduino stack. Then I'll extend the Arduino code so the sensor can sweep back and forth and build a rudimentary range map of the objects in front of the robot. I've been playing with the pylab library in Python, and it seems promising for plotting that map.
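As a first cut at the plotting side, here's the kind of thing I have in mind with pylab. The angle and distance readings below are made-up placeholders; the real values will come over the USB serial link once the sweep code exists.

    # Rough sketch of the range-map idea: plot (angle, distance) readings
    # from the sweeping ultrasonic sensor as a polar plot with pylab.
    import numpy as np
    import pylab

    # Placeholder sweep: 0 degrees = hard right, 180 = hard left,
    # distances in centimeters.
    angles_deg = np.arange(0, 181, 15)
    distances_cm = np.array([90, 85, 70, 60, 45, 40, 42,
                             55, 75, 80, 95, 100, 110])

    # pylab's polar() wants angles in radians.
    pylab.polar(np.radians(angles_deg), distances_cm, 'o-')
    pylab.title('Rudimentary range map (placeholder data)')
    pylab.show()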