Tuesday, October 7, 2014

Open Worker - Robotics Project Part 1

Media attention on robotics is exploding, with reputable newspapers starting to dedicate special attention to my favourite topic.  Along the same lines, it has been an exciting summer for robotics publications, such as the Pew Research Internet Project report and the widely viewed and discussed video documentary, Humans Need Not Apply.  In short, the robots are coming.  We've come to a point where a future of robotic cars, drones, pets and even workers will cross over from fiction to reality.

As I have explored on this young blog, innovations in robotics can now be implemented by entrepreneurs more easily than ever.  Start-ups and hobbyists alike can use readily available embedded systems hardware like the BeagleBone, working in tandem with commodity computing platforms running open source robotics software such as the Robot Operating System, to build the workers of the future.

I subscribe to the view that we cannot stop the inevitable in technology innovation - if you can't beat them, join them.  I thought about this all summer, and one Friday evening, sitting on my couch in the living room, I started dreaming up my next robotics project.  Thinking definitely makes me thirsty, and after a long work week, I couldn't resist grabbing a cold, locally brewed IPA.  Then I asked myself, why should I be the one to get up and get it?  Isn't it humankind's ultimate goal to have a robot fetch beer for its human creators?  Isn't this why I studied engineering in the first place?  Yes, of course.

Leveraging my favourites in open source technology - uArms, lots of Makeblock, ROS and a Primevision camera - I have toiled away to finally introduce Open Worker:

You can anthropomorphize Open Worker as much as his human interaction test cousin, Spartacus, but this advanced autonomous robot is much more functional than cute.  Understandably so - bringing beer to humans is serious business, and the goal is worthy.

In my next post, I will explore the open source hardware details of this platform, allowing others to build upon my ideas and create more advanced versions of Open Worker.  The third post of this series will explain the software side of the project.  This includes programming ROS on a laptop that connects to three different Arduino boards.  Stay tuned!

Tuesday, August 26, 2014

Review: uArm by UFactory

It's already the end of August and the nights are getting cooler.  Luckily, my latest robotics project is keeping me busy during these shortening evenings.  I ordered a couple of assembled uArm kits from UFactory a few months ago, with their latest manufacturing run ready in time for the summer.  To my surprise, I received an email from one of the sales reps who offered to send me four unassembled kits instead of the two assembled arms.  I figured this was a great opportunity to learn about the engineering of the arms and decided to jump right in.

uArm was the result of a very successful Kickstarter funding campaign.  Similar to the Makeblock team, it looks like the Chinese company was founded on the approach of using Kickstarter to fund its initial product line.  The uArm is a fully open source hardware and software platform that brings robotics to the masses.  I had been looking for a good robotic arm for experiments and found that arms with comparable robustness, precision and accuracy tend to run a couple hundred dollars more per unit, so at $299, it's not a bad deal.

The unboxing was pretty straightforward, which was to be expected.  The acrylic parts were easy to remove from their sheets and the hardware (screws, bolts, etc.) was usually well labelled.  The PDF instructions were also easy to follow.  I have to admit, it was a little frustrating driving the tapping screws through the servo horns into the acrylic.  Metal horns would be a better fit with the metal gears and precision-cut acrylic, but this isn't enough for me to say that the kits aren't great.  Although it took me a few hours to assemble my first arm, the second one was much easier since I knew what to expect at each step.  I ended up building one arm with the vacuum pump end effector and one with a claw.

Turning on the connected uArm Arduino board for the first time, the buzzer emitted a loud, continuous tone.  This is normal; it stopped once my initial program was loaded.  Using Ubuntu, it was pretty easy to install the uArm libraries and the initial calibration software.  Unfortunately, one of my arms had a defective servo, but after calling customer support, they assured me that they would send replacement servos.  Having built a second arm, it wasn't too difficult to figure out how to swap the main servos out.  The servos come with the potentiometer wire already integrated, so position feedback can be read from the board.  This is quite important for more advanced uses.

True to their open source hardware and software company policy, UFactory publishes all of their schematics and code.  This helps cultivate the community, and some people have improved upon the initial code.  For example, for the calibration step, Scott Grey has thankfully released his version, which helps prevent the arm from damaging its servos during the initial setup.

Scott Grey's calibration:

Another useful test utility:


I hope the software continues to improve, especially in UFactory's branch.  The arms have a lot of untapped potential, which makes them great for robotics, from weekend hobbyists to university researchers alike.

Sunday, August 10, 2014

Review: Makeblock

It's summer time and the living's easy.  When I'm not trying a new IPA or attending wedding/pool activities, I'm spending my leisure time on my post-Spartacus era robotics project.  Being a strong supporter of open source software and hardware, my research on finding the right prototyping hardware for my ambitious new project led me to Makeblock.cc.  They ran a successful Kickstarter campaign back in January 2013 and from what I could tell, it looked like the industrious entrepreneurs had designed a high quality product.  I decided to order their Lab and Robotics electronic kits to get a full sense of the capability of this open source construction platform.

It's difficult for me to review a construction set without comparing it to LEGO blocks.  I've used LEGO Mindstorms products for hobby projects, engineering studies and even in the design of work training programs, and I love that product series.  The Makeblock platform takes a similar approach, but with high quality aluminium parts instead of plastic, and electronics that are Arduino based or compatible.  All of the common robotics sensors and motors are there, from ultrasonic sensors to stepper motor control, and they all connect quite nicely with RJ25 wires - no soldering required.  This makes for a very versatile kit that is useful for beginners and professionals alike.  It's a real pleasure to work with the parts, and the quality feel of the anodized aluminium is impressive.  Of course, this comes at a price: the parts get expensive in large robotic prototypes.  For my application, it is well worth the premium.

From a software point of view, after I imported the Makeblock Arduino library, it was a breeze to test out some motor controls.  True to the open source nature of this platform, all of the code is available on GitHub.  For a couple hundred dollars, it's possible to have a small robot roaming your home in a matter of hours.

The folks at Makeblock have the right customer service concepts in place, from forums to wikis.  Email-wise, I also had some good back and forth with a sales rep (thanks Tony).  The Leave a Message feature on their site didn't yield a reply to my initial questions, but hopefully this was an isolated instance and others have used the form with more success.

I'm looking forward to seeing what the robotics or maker community dreams up.  I'm quite satisfied with the parts and my project is coming along quite nicely.  Good job Makeblock, now if you could just get prices down a bit...

Solid robot base that is fully customizable

Sunday, July 6, 2014

Spartacus - Robotics Project Part 3

Spartacus has arrived and he's one sharp cookie!  He can move, chat and pretty much pass for an honest, good robot (he appreciates it when you tell him that he's a good robot).  However, the processing power required for facial recognition, the time lag in communicating with various web services, USB device/component quality and a fragile frame limit how far I can take this project.  In any case, I've learned quite a bit, and as you can see from this video, who wouldn't want a Spartacus?


So what's next?  Well, I've just received a few unassembled uArms, about to order Makeblocks, installed ROS on my laptop, have a spare HD webcam, etc.  This is going to be a great summer!  Stay tuned friends.

Wednesday, April 23, 2014

Spartacus - Robotics Project Part 2

Hello interweb,

I've made some pretty good progress on project Spartacus (no special acronym meaning, just a good name). Building upon previous robotics system experience, I've designed a table top experimentation platform that's also fun to observe. Over the last few months of working on this project, I've learned quite a bit.

USB power management is critical to having a system that doesn't generate random errors due to slight power fluctuations. I really wanted my robot to be wireless, but this added weeks of design on top of trial and error testing with commercial hubs, USB dongles and configurations. I ended up using the following hardware list:
  • Anker 15,000mAh @ 5V 2A USB battery pack
  • HooToo USB hub
  • USB WiFi dongle, RTL8192cu chipset
  • Syba SD-CM-UAUD USB stereo audio adapter, C-Media chipset
  • Logitech QuickCam Messenger webcam
  • BeagleBone Black
  • Adafruit 16-channel I2C PWM board
  • My foam board box robot platform
I also needed to connect my mini powered speaker to the headphone jack on the USB sound card.  The wires running down the neck of my robot were starting to add up, so I opted to simply keep the speaker charged separately since it has its own built-in battery pack.  In addition, the BeagleBone Black and the Adafruit 16-channel I2C PWM board each need 5V, which is delivered from the USB battery pack through modified USB cables.  The sum of these parts makes for quite a few devices to power in my small robotics package.  In retrospect, it would have been useful to study the technical specifications of each device in more detail and to test the current and voltage outputs of the battery pack and the hub under load.  This way, I could have avoided ordering parts online that ended up in my spare parts bin.  Regardless, some parts didn't perform according to their specifications, and trial and error was needed.
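A quick power budget would have flagged the marginal parts before ordering.  Here's a minimal sketch of that sanity check - the per-device mA figures below are rough illustrative assumptions, not measured values, so check each datasheet:

```python
# Rough power budget for the USB tree.  All mA figures are assumptions
# for illustration - replace them with datasheet or measured values.
BATTERY_OUTPUT_MA = 2000  # Anker pack: 5V @ 2A per port

DEVICES_MA = {
    "BeagleBone Black": 460,
    "USB WiFi dongle": 250,
    "USB sound card": 100,
    "webcam": 230,
    "PWM board + idle servos": 300,
}

def power_budget(devices_ma, supply_ma):
    """Return (total draw, headroom) in mA for a set of devices."""
    total = sum(devices_ma.values())
    return total, supply_ma - total
```

Negative headroom means the pack can't feed everything at once and something will brown out under load - exactly the kind of random error I was chasing.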

For example, I tested three different USB sound cards before finding one with the capture quality needed for speech-to-text (STT) and text-to-speech (TTS).  I even tried audio capture with a high quality USB mic, but the $10 sound card combined with an equally reasonably priced PC mic performed quite well.  For speech recognition, I used PyAudio to capture a WAV file when a sound level threshold was reached, converted the file to FLAC using SoX and finally sent it to Google to convert it to text.  For TTS, I used eSpeak, called directly from the command line in Python.  It worked great.
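The sound level trigger amounts to computing the RMS level of each incoming audio chunk and comparing it to a threshold.  A minimal sketch of just that step, assuming signed 16-bit little-endian PCM chunks (the function names and the threshold value are mine, picked for illustration):

```python
import math
import struct

def rms(chunk):
    """Root-mean-square level of a chunk of signed 16-bit PCM audio."""
    samples = struct.unpack("<%dh" % (len(chunk) // 2), chunk)
    if not samples:
        return 0.0
    return math.sqrt(sum(s * s for s in samples) / float(len(samples)))

def speech_detected(chunk, threshold=500):
    """True when the chunk is loud enough to start recording."""
    return rms(chunk) > threshold
```

In the real loop, the chunks come from a PyAudio stream; once triggered, you keep recording until things quiet down again, then hand the WAV off to SoX and Google.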

One benefit of using Adafruit's PWM controller is that it can also drive tri-color LEDs.  I ended up using diffused 10mm tri-color LEDs for the eyes.  They take three output pins each, but in my case the two LEDs are wired in parallel since I didn't have a use case for eyes of different colors.  Since I only have 6 servos on the robot, there are still a few channels to spare for potential future expansions.
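Since the PCA9685 chip behind Adafruit's board uses 12-bit channels (0 to 4095), driving the eyes comes down to scaling an 8-bit RGB color onto three channels.  A minimal sketch - the function names and channel assignments are my own, not Adafruit's API:

```python
def rgb_to_pwm(rgb):
    """Scale an 8-bit (r, g, b) color to 12-bit PWM duty values (0-4095)."""
    return tuple((c * 4095) // 255 for c in rgb)

def eye_channels(rgb, base_channel=9):
    """Map a color to {channel: duty} for three consecutive PWM channels.

    base_channel=9 is an assumption: with servos on channels 0-5,
    the upper channels are free for the LED eyes.
    """
    return {base_channel + i: d for i, d in enumerate(rgb_to_pwm(rgb))}
```

The returned duty values would then be written to the board with Adafruit's driver library.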

I used the Logitech QuickCam Messenger for a couple of reasons.  I had originally purchased a pair of these webcams for a stereo vision project a few years ago, knew they worked well with Ubuntu Linux, and still had them around.  I had also learned from my home security system project that a TTL camera simply couldn't capture enough frames per second to be useful in my current projects.  The camera successfully takes pics using OpenCV, and I plan on using this machine vision library to identify the coordinates of people's faces in front of my robot.  For my next project, I'll use the Logitech HD Pro Webcam C920, which is used in many BeagleBone projects on the web.
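Once OpenCV reports a face's center, pointing the head at it is a linear map from pixel coordinates to servo angles.  A sketch of the idea - the frame size, servo travel limits and function names here are illustrative assumptions, not measured from Spartacus:

```python
FRAME_W, FRAME_H = 640, 480   # assumed webcam capture resolution
PAN_RANGE = (30.0, 150.0)     # assumed safe neck servo travel, degrees
TILT_RANGE = (60.0, 120.0)

def pixel_to_angle(value, size, angle_range):
    """Linearly map a pixel coordinate onto a servo angle range."""
    lo, hi = angle_range
    return lo + (hi - lo) * (float(value) / size)

def face_to_pan_tilt(cx, cy):
    """Convert a face center (cx, cy) into (pan, tilt) servo angles."""
    pan = pixel_to_angle(cx, FRAME_W, PAN_RANGE)
    tilt = pixel_to_angle(cy, FRAME_H, TILT_RANGE)
    return pan, tilt
```

A face detected dead center of the frame maps to the middle of both servo ranges, so the head stays put; anything off-center nudges the neck toward it.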

With servo control, LED control, voice recognition, text-to-speech, chatbot APIs, WiFi, battery and webcam components all working individually and integrated into my Python-programmed robotics platform, it's now time to put it all together into a wonderful Spartacus robotics system package.  I hope to share some more good news with you soon.

Friday, February 28, 2014

Review: BeagleBone Home Automation by Juha Lumme

I've been reading a lot of books from Packtpub lately and just finished BeagleBone Home Automation by Juha Lumme.  This 178 page book has a lot of information on setting up your single board computer, inputs and outputs, client/server programming, scheduling and finally creating an Android client.  Although I don't agree with all of the concepts presented by the author, such as creating your own communication protocol (use REST...), I still thought this book provided a lot of useful information that could help a hobbyist/DIY tinkerer interface sensors and output pins on the BeagleBone.  Many of the concepts presented also apply to the Raspberry Pi, so they are quite transferable.  This is a great book to help the uninitiated get started with home automation using the BeagleBone.

Saturday, February 22, 2014

Review: Building a Home Security System with BeagleBone by Bill Pretty

I read another book on the BeagleBone last week.  Building a Home Security System with BeagleBone by Bill Pretty focuses on using the GPIO pins available on the BeagleBone Black to create a multi-zone security system.  I was pretty excited when I started reading the first few chapters.  The author provides very useful tips and tricks about electronics hardware, with a light addition of JavaScript as the programming language of choice.  There are useful explanations behind the IC selection, but none of the circuit diagrams are explained in great detail.  To fully understand them, I feel you would need a full background in electrical engineering, but someone with only hobby-level knowledge and experience could still build, test and use the circuits.  At chapter 7 of 9, the book jumps into a how-to guide on installing some basic open source network intrusion detection packages on the BeagleBone Black.  I wish the author had continued the book in the hardware direction, with add-on modules that could be stacked onto the core project.  Instead, the book is 75% about the steps to build a single (fun) home security system project and 25% about installing network monitoring software, which seems loosely tacked on at the end.  This short 120 page book started off strong, but didn't keep it up all the way to the end.

Friday, February 21, 2014

Spartacus - Robotics Project Part 1

I've undertaken my biggest robotics project yet: Spartacus.  Feeling empowered by the rich feature set of the BeagleBone Black and the apparent vast support of its online community, I decided to build my own robotics research platform.

I had an idea for a box-looking robot that would sit around, move its big square head and move its arms based on learned behaviours.  To do so, I needed a good prototyping building material.  I ended up deciding to use foam board since I could play around with it more than with 3D printed plastic parts.  I can always model the final product once I have the design finalized and stick to foam board for the prototyping.

I once received some valuable advice about cutting wallpaper from a wise friend that I didn't end up following: use a sharp blade!  Only after I replaced my X-Acto knife blade did I realize the hours I could have saved cutting foam board to shape with a good sharp edge.  Write this one down if you ever end up using this building material.

I also chose to use basic servo pan/tilt modules from robotshop.ca for the head and arm movements.  This gave my robot 6 degrees of freedom.  With a USB webcam and future upgrades planned for the head, I ended up having to change the neck pan/tilt setup to use standard size servos instead of micro servos like the ones used for the arms.  It made a world of difference.

Here's a pic of the work in progress.  This post will be part of an ongoing series!  Stay tuned friends.

Saturday, February 15, 2014

Review: BeagleBone Robotics Projects by Richard Grimmett

I just finished reading BeagleBone Robotics Projects by Richard Grimmett.  I wish this book had been available a few months ago when I was building my home security system.

This book could be seen as a set of polished articles that build upon each other to create a very customizable robotics platform.  It covers replacing the default image with Ubuntu, sensor inputs, speech processing and synthesis, servo motor controller interfacing, GPS and more.  Countless hours of work can be saved by taking the time to read this culmination of insightful, tested steps to building robots.  The author appears to have a preference for USB type interfaces, which tend to be ideal for the BeagleBone Black embedded computer.

On the software side, the book focuses on using Ubuntu, Python and OpenCV.  The author is concise in his writing and doesn't spend too much time explaining the reasoning behind his design choices.  This is good for someone who wants to hit the ground running and do some of the deeper exploring on his/her own.  I'm a fan of open-source everything and the author appears to share the same values.

This book is worth every penny for anyone undertaking or interested in a wide range of BeagleBone based projects.  I highly recommend this one.

Saturday, February 8, 2014

BeagleBone Black TTL serial camera home security system with private motion activated tweets

In 2013, I bought my first BeagleBone Black with the intention of jumping head first into new robotics projects.  I had been reading a lot of books and blogs on robotics over the past few years, with my career taking precedence over my passion projects, but it was time to get back into it.  My last robot was a maze navigating, flame detecting and extinguishing mobile robot.  Embedded devices have evolved quite a bit since that HC12 project, with the recent popularity of the Arduino, Raspberry Pi, BeagleBone and similar products.  It's a perfect storm of embedded electronics fun.

To get started, I bought the BeagleBone Black starter kit from Adafruit (awesome company) and a series of parts and tools from Amazon and The Robot Shop (Canadian!) - more on this in another post.

I also needed a good introductory project to follow the classic embedded equivalent of Hello World: making an LED blink.  Since I was about to head out of the country on a two week vacation, I decided to build a home security system.

Here's my part list:
  • Weatherproof TTL Serial JPEG Camera with NTSC Video and IR LEDs
  • Adafruit BeagleBone Black Starter Pack
  • USB WiFi dongle
  • 2 port USB hub
  • 32GB micro SD card
  • That's it!
I've always believed in code over hardware in design decisions.  Even with my electrical engineering background, I find it less costly, more flexible and more robust to code solutions rather than to rely on hardware.

Here's a Python code snippet of my project.

import datetime
import serial
from threading import Thread

# Shell command used once to enable UART1 through the cape manager.
enable_tty01 = 'sudo sh -c \'echo ttyO1_armhf.com > /sys/devices/bone_capemgr.9/slots\''
filepath = "/home/ubuntu/"

def initialize():
    # Block until the camera prints its "Init end" banner.
    resp = ""
    while "Init end\r\n" not in resp:
        resp += ser.read()
    print "Ready"
    # Set image size to 640 x 480, then wait for the ACK bytes
    # (the command itself is sent through the VCam helper, omitted here).
    resp = ""
    while '\x76\x00\x54\x00\x00' not in resp:
        resp += ser.read()
    print "Size set"

# Picture function
def takePic():
    print "Take Picture"
    # Take picture
    # Get JPG size
    print "Get image size from serial device"
    nbytes = cam.getbufferlength()
    # Read the image out of the camera's buffer
    print "Read image from serial device"
    return cam.readbuffer(nbytes)

# Initialize serial connection.
ser = serial.Serial("/dev/ttyO1", baudrate=38400)
cam = VCam(ser)
# Initialize the camera settings for the first picture.
initialize()
tweet = Tweet()
while 1:
    # Continuously check for motion detection.
    if cam.motionDetected():
        # If motion is detected, take a pic.
        frame1 = takePic()
        string_frame = ''.join(frame1)
        now = datetime.datetime.now()
        filename = "%d.%02d.%02d.%02d.%02d.%02d.jpg" % \
            (now.year, now.month, now.day, now.hour, now.minute, now.second)
        try:
            # Write image to file
            f = open(filepath + filename, 'w')
            f.write(string_frame)
            f.close()
        except IOError:
            print "Error writing the file to the system."
        # Post the image to Twitter in a new thread to resume motion detection.
        t = Thread(target=tweet.postimage, args=(filepath, filename))
        t.start()

I had a number of challenges in getting my program to work.  The main two were:

  1. The camera's baud rate is rather slow, so I wasn't able to implement motion detection on the computer and had to rely on the camera's built-in feature.  In the future, I will use a USB or IP camera instead.
  2. The WiFi dongle was difficult to install, but with enough googling, I was able to find working steps.  While I was working on this project, the BeagleBone Black was still relatively new compared to the Raspberry Pi, so there were limited examples available online.  This is no longer the case.
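With a USB camera delivering full frame rates, the motion detection could move onto the computer: compare consecutive grayscale frames and trigger when the average per-pixel change crosses a threshold.  A library-free sketch of that idea (with OpenCV, the flat pixel lists would come from captured frames; the threshold is a guess for illustration):

```python
def mean_abs_diff(frame_a, frame_b):
    """Mean absolute per-pixel difference between two flat grayscale frames."""
    if not frame_a or len(frame_a) != len(frame_b):
        raise ValueError("frames must be non-empty and the same size")
    return sum(abs(a - b) for a, b in zip(frame_a, frame_b)) / float(len(frame_a))

def motion_detected(prev_frame, frame, threshold=8.0):
    """True when the scene changed more than `threshold` gray levels on average."""
    return mean_abs_diff(prev_frame, frame) > threshold
```

Averaging over the whole frame makes the trigger robust to single-pixel noise, though gradual lighting changes (like my sunrise tweets) would still set it off unless you also track a slow-moving background model.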

And finally, the working prototype!

While I was away, I had pictures of the sunrise and sunset lighting changes against the wall of my home office tweeted to me on a daily basis (they were detected as motion).  It was nice to see that everything was good at home while being away.

Until next time!