I've made some pretty good progress on project Spartacus (no special acronym meaning, just a good name). Building on previous robotics system experience, I've designed a tabletop experimentation platform that's also fun to observe. Over the last few months of working on this project, I've learned quite a bit.
USB power management is critical to having a system that doesn't generate random errors from slight power fluctuations. I really wanted my robot to be wireless, but that added weeks of design work on top of trial-and-error testing with commercial hubs, USB dongles and configurations. I ended up using the following hardware list:
- Anker 15,000mAh 5V 2A USB battery pack
- HooToo USB hub
- USB WiFi dongle, RTL8192CU chipset
- Syba SD-CM-UAUD USB stereo audio adapter, C-Media chipset
- Logitech QuickCam Messenger webcam
- Beaglebone Black
- Adafruit 16-channel I2C PWM driver
- My foamboard box robot platform
I also needed to connect my mini powered speaker to the headphone jack on the USB sound card. The wires running down the neck of my robot were starting to add up, so I opted to simply keep the speaker charged separately since it has its own built-in battery pack. In addition, the BeagleBone Black and the Adafruit 16-channel I2C PWM board need 5V, which is delivered from the USB battery pack through modified USB cables. The sum of these parts makes for quite a few devices to power in my small robotics package. In retrospect, it would have been useful to study the technical specifications of each device in more detail and to test the current and voltage outputs of the battery pack and the hub under load. That way, I could have avoided ordering parts online that ended up in my spare parts bin. Regardless, some parts didn't perform according to their specifications, so trial and error was unavoidable.
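That kind of power budgeting is easy to sketch up front. Here's a minimal example of the check I wish I'd done before ordering parts; the per-device current draws below are illustrative placeholders, not measured values or datasheet figures for the parts listed above.

```python
# Rough USB power-budget check. The draws are hypothetical numbers for
# illustration only -- real values come from datasheets or a USB power meter.
DRAWS_MA = {
    "BeagleBone Black": 460,
    "WiFi dongle": 250,
    "USB sound card": 100,
    "Webcam": 230,
    "PWM board logic": 50,
}

SUPPLY_MA = 2000  # the battery pack's rated 5V 2A output


def power_margin_ma(draws, supply_ma):
    """Remaining current headroom in mA; negative means the supply is overloaded."""
    return supply_ma - sum(draws.values())


margin = power_margin_ma(DRAWS_MA, SUPPLY_MA)
print(f"Total draw: {sum(DRAWS_MA.values())} mA, margin: {margin} mA")
```

Even a back-of-the-envelope table like this flags whether a 2A pack has headroom for inrush current and servo stalls before anything gets soldered.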
For example, I tested three different USB sound cards before settling on one with capture quality good enough for speech recognition. I even tried audio capture with a high-quality USB mic, but the $10 sound card combined with an equally reasonably priced PC mic performed quite well. For speech-to-text (STT), I used PyAudio to capture a WAV file when a sound-level threshold was reached, converted the file to FLAC using SoX and finally sent it to Google to convert it to text. For text-to-speech (TTS), I used eSpeak, invoked from Python via the command line. It worked great.
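The core of the threshold trigger can be sketched as a pure function, which is how I'd test it without the mic attached. The RMS math below is standard; the threshold value and the shell commands in the comments are assumptions standing in for my actual capture loop.

```python
import math
import struct

def rms(chunk: bytes) -> float:
    """Root-mean-square level of a chunk of signed 16-bit little-endian samples,
    as delivered by a PyAudio stream opened with format=paInt16, channels=1."""
    count = len(chunk) // 2
    if count == 0:
        return 0.0
    samples = struct.unpack("<%dh" % count, chunk[: count * 2])
    return math.sqrt(sum(s * s for s in samples) / count)

THRESHOLD = 500  # hypothetical; tune empirically for your mic and sound card

def is_speech(chunk: bytes) -> bool:
    """True when a chunk is loud enough to start recording."""
    return rms(chunk) > THRESHOLD

# Once is_speech() fires, the capture loop writes a WAV file and then, as
# described above, shells out to the external tools, e.g.:
#   sox capture.wav capture.flac     # convert for the STT upload
#   espeak "hello there"             # speak the reply
```

Keeping the level detection separate from PyAudio made it much easier to tune the threshold against recorded samples rather than live audio.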
One benefit of using Adafruit's PWM controller is that it can also control tri-color LEDs. I ended up using diffused 10mm tri-color LEDs for the eyes. They take three output pins each, but in my case just three total, with the two LEDs wired in parallel, since I didn't have a use case for eyes of different colors. Since I only have six servos on the robot, there are still a few channels to spare for potential future expansions.
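Driving the eye color comes down to scaling each 8-bit RGB component to the controller's 12-bit duty cycle (0–4095 per channel). A minimal sketch of that mapping follows; the channel numbers and common-anode flag are assumptions about my wiring, not part of the board's spec.

```python
# Hypothetical channel assignment: R, G, B on channels left free by the servos.
EYE_CHANNELS = (9, 10, 11)

def rgb_to_duty(rgb, common_anode=False):
    """Scale each 0-255 color component to a 0-4095 PWM duty count.

    Common-anode LEDs sink current through the driver, so their duty
    cycle is inverted: full brightness is a count of 0, not 4095.
    """
    duty = [value * 4095 // 255 for value in rgb]
    if common_anode:
        duty = [4095 - d for d in duty]
    return tuple(duty)

# With Adafruit's driver object (here called pwm), setting the eyes to
# orange would then look something like:
# for channel, d in zip(EYE_CHANNELS, rgb_to_duty((255, 64, 0))):
#     pwm.setPWM(channel, 0, d)
```

Since both eyes share the same three channels in parallel, one call per color component updates them together.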
I used the Logitech QuickCam Messenger for a couple of reasons. First, I had originally purchased a pair of these webcams for a stereo vision project a few years ago and knew they worked well with Ubuntu Linux. I also still had them around. I learned from my home security system project that a TTL serial camera simply couldn't capture enough frames per second for useful applications in my current projects. The camera successfully captures frames using OpenCV. I plan on using this machine vision library to identify the coordinates of people's faces in front of my robot. For my next project, I'll use the Logitech HD Pro Webcam C920, which is used in many BeagleBone projects on the web.
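The face-coordinate step splits neatly in two: OpenCV's detector returns (x, y, w, h) bounding boxes, and turning those into normalized face centers is plain arithmetic. Here's a sketch of that second half; the OpenCV capture calls are shown only as comments since they need the webcam and a Haar cascade file present, and the file name used there is the stock OpenCV sample.

```python
def face_centers(boxes, frame_w, frame_h):
    """Normalized (cx, cy) in [0, 1] for each (x, y, w, h) detection box.

    (0.5, 0.5) means a face dead-center in the frame, which is handy
    for steering the robot's head servos toward the person.
    """
    return [((x + w / 2) / frame_w, (y + h / 2) / frame_h)
            for (x, y, w, h) in boxes]

# Typical use with OpenCV (assumes the camera and cascade XML are available):
# import cv2
# cap = cv2.VideoCapture(0)
# ok, frame = cap.read()
# cascade = cv2.CascadeClassifier("haarcascade_frontalface_default.xml")
# boxes = cascade.detectMultiScale(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
# print(face_centers(boxes, frame.shape[1], frame.shape[0]))
```

Normalizing by frame size keeps the downstream servo math independent of whatever capture resolution the webcam negotiates.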
With servo control, LED control, speech recognition, text-to-speech, chatbot APIs, WiFi, battery and webcam components all working individually and integrated into my robotics platform programmed in Python, it's now time to put it all together into a wonderful Spartacus robotics system package. I hope to share some more good news with you soon.