Sunday, July 23, 2017

How not to make millions predicting cryptocurrency prices with social media sentiment scores


Sometimes hobbies can teach you basic survival skills.  In a world where machines are likely to replace many of today's jobs, I figured that rolling up my sleeves and jumping into deep learning during my recent summer vacation would be both fun and useful.  In any case, I find it helpful to understand the work that I will eventually need to manage in my career.  Since this is a passion and hobby blog, here are my musings on machine learning and deep learning.


I started with a business problem that I wanted to solve.  I've always been curious about cryptocurrencies and hypothesized that the market could be more open than the traditional stock market: a prime opportunity for data to help predict prices.  Moreover, investment firms have long hired the best mathematicians and computer scientists to model trade data, so with only a few days to dedicate to this project, I knew I wasn't going to come up with anything too groundbreaking.  As an avid Redditor, I often wonder: what if the information contained in reddit comments could serve as features to predict the prices of cryptocurrencies like bitcoin, litecoin and ethereum?


First off, I delved into the Machine Learning for Trading course by Georgia Tech, delivered through Udacity.   I really love MOOCs and wish they had existed back when I was in high school and university.  This MOOC was very informative, and the programming language used, Python, is my second favourite (after Groovy).  The concepts behind making stock market trading decisions with probabilistic machine learning approaches translate well to cryptocurrency trading decisions.  I also decided to use pythonanywhere.com as my cloud IDE.  It's a cheap, clean and efficient dev environment with great technical support.


Next, I needed data, and lots of it!  This was definitely the most challenging step.  Getting highly granular cryptocurrency time series data for free took quite a bit of googling.  I eventually settled on writing a little script to access the API from cryptocompare.com.  Using Python Pandas, I was able to save dataframe data into csv files.  For the social media information, I used the popular PRAW Python Reddit API Wrapper to pull the top 25 comments from the top 25 posts of the top 25 subreddits.  I saved each comment's upvote score into a table and used TextBlob to get the sentiment score of the comment itself.  This would be the basis of my main feature for predicting the currency prices.  On a side note, I also looked at using the Watson API from IBM's Bluemix.  It provided a lot more information on the emotions found in the reddit comments, but the API call limitations meant that there would be a cost associated with obtaining the sentiment for each of the thousands of Reddit comments.  Needless to say, my decision to use TextBlob was a no-brainer.
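
For the curious, here is a minimal sketch of what the comment-and-sentiment collection could look like with PRAW and TextBlob.  The credentials are placeholders and the exact traversal is my reconstruction, not the original script:

    import praw                      # Python Reddit API Wrapper
    import pandas as pd
    from textblob import TextBlob

    # Placeholder credentials; PRAW requires a registered Reddit app.
    reddit = praw.Reddit(client_id="YOUR_ID",
                         client_secret="YOUR_SECRET",
                         user_agent="crypto-sentiment-scraper")

    rows = []
    for subreddit in reddit.subreddits.popular(limit=25):
        for post in subreddit.top(limit=25):
            post.comment_sort = "top"
            post.comments.replace_more(limit=0)   # drop "load more" stubs
            for comment in list(post.comments)[:25]:
                rows.append({
                    "utc": comment.created_utc,
                    "score": comment.score,
                    # polarity ranges from -1 (negative) to 1 (positive)
                    "sentiment": TextBlob(comment.body).sentiment.polarity,
                })

    pd.DataFrame(rows).to_csv("reddit_sentiment.csv", index=False)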


With the reddit comments data and associated sentiment scores, I created a time series table (dataframe) that lined up each comment's timestamp with the price timestamps.  A little googling and Stack Overflow later, I had my data ready for analysis.
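
As an illustration, pandas' merge_asof does exactly this kind of alignment: it matches each comment with the most recent price at or before its timestamp.  The cryptocompare endpoint and column names below are assumptions based on their public API, so check the current docs before reusing this:

    import requests
    import pandas as pd

    # Hourly BTC/USD candles; endpoint and params assumed from the public API.
    url = "https://min-api.cryptocompare.com/data/histohour"
    resp = requests.get(url, params={"fsym": "BTC", "tsym": "USD", "limit": 2000})
    prices = pd.DataFrame(resp.json()["Data"])
    prices["time"] = pd.to_datetime(prices["time"], unit="s")

    comments = pd.read_csv("reddit_sentiment.csv")
    comments["time"] = pd.to_datetime(comments["utc"], unit="s")

    # Each comment gets the most recent price at or before its timestamp.
    merged = pd.merge_asof(comments.sort_values("time"),
                           prices[["time", "close"]].sort_values("time"),
                           on="time")
    merged.to_csv("aligned.csv", index=False)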


Now the fun part could begin.  Using Google's open source TensorFlow library, I created a neural net model that took some of the cryptocurrency prices and sentiment scores as inputs and produced the price of a single cryptocurrency as output.  Luckily, the fine folks who created Keras made it very easy to build neural net models from data.  Keras truly lowers the barrier to entry into deep learning.  Combined with Dr. Jason Brownlee's tutorials, anyone with basic programming experience can start experimenting with deep learning.
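
To give a flavour of how little code this takes, here is a minimal Keras regression sketch.  The layer sizes and the placeholder data are mine for illustration, not my actual model; the real features would come from the aligned price/sentiment table:

    import numpy as np
    from keras.models import Sequential
    from keras.layers import Dense

    # Placeholder data with plausible shapes; in practice the features are
    # lagged prices plus sentiment scores, the target a single coin's price.
    X_train = np.random.rand(1000, 8)
    y_train = np.random.rand(1000)

    model = Sequential([
        Dense(32, activation="relu", input_dim=X_train.shape[1]),
        Dense(16, activation="relu"),
        Dense(1),                      # linear output for price regression
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(X_train, y_train, epochs=50, batch_size=32, validation_split=0.2)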


Unfortunately, I didn't make millions (of course) with my predictive model.  My hypothesis that the "internet's" sentiment of positivity vs. negativity, derived from reddit comments, was a predictor of bitcoin, litecoin or ethereum prices was not supported by the models generated from my data.  I did learn first hand that data really are king and predictive algorithms are a dime a dozen.  Eventually, I remembered to leave my home office to enjoy the warm summer weather!  Maybe next year, I'll try to get higher quality data with a more granular analysis of emotions from social media combined with longer time series.  Or, you never know, a machine may be writing my next blog article...








Wednesday, July 8, 2015

Review: Mastering BeagleBone Robotics by Richard Grimmett


I have to say that I'm somewhat conflicted about writing my review of Mastering BeagleBone Robotics by Richard Grimmett. I definitely found the author's previous book on BeagleBone robotics to be a gem. You can imagine my anticipation in reading a book that I assumed would take the basic project introductions of the previous publication and get into much more advanced topics for seasoned hobby robotics practitioners. This wasn't quite the case.

There are advanced projects presented in this 175 page (Kindle format) publication. After a brief intro, the book gets into the projects, with some chapters expanding on the previous one and others starting afresh. Code is presented and downloadable. I would recommend that Packt Publishing edit the black terminal window screenshots to make them black on white, or simply use text to present the typed commands and output. It would make for a far more enjoyable reading experience.

In one chapter, where the basics of a robotic sailboat are presented, the book states "... you can now build a robot that sails autonomously". I find the statement to be on the presumptuous side. My limited understanding of sailing leads me to believe that it's a very complex activity. I wish that more of the theory had been presented directly in the book instead of referenced through links to external websites. Using my Kindle Paperwhite at a local coffee shop, I'm simply not going to click on the links, and I therefore miss out on some of the learning potential. Perhaps I am a difficult reader...


Overall, key BeagleBone concepts for multiple types of hobby robotics are presented. The information provided is enough to give novice hobby enthusiasts the background required to get started. Should you buy this book? Absolutely, if you haven't bought any previous BeagleBone robotics books from the same publisher and author. In my always humble opinion, it's a very strong second edition.

Tuesday, October 7, 2014

Open Worker - Robotics Project Part 1

Media attention on robotics is exploding, with reputable newspapers starting to dedicate special attention to my favourite topic.  In the same vein, it has been an exciting summer for robotics publications, such as the Pew Internet Report and the widely viewed and discussed video documentary, Humans Need Not Apply.  In short, the robots are coming.  We've come to a point where a future of robotic cars, drones, pets and even workers will cross over from fiction to reality.

As I have explored on this young blog, innovations in robotics can now be implemented by entrepreneurs more easily than ever.  Start-ups and hobbyists alike can use readily available embedded systems hardware like the BeagleBone, working in tandem with commodity computing platforms running open source robotics software such as the Robot Operating System, to build the workers of the future.

I subscribe to the view that we cannot stop the inevitable in technology innovation - if you can't beat them, join them.  I thought about this all summer, and one Friday evening, sitting on my couch in the living room, I started dreaming up my next robotics project.  Thinking definitely makes me thirsty, and after a long work week, I couldn't resist grabbing a cold locally brewed IPA.  I also asked myself: why should I have to get up and get it?  Isn't it humankind's ultimate goal to have a robot fetch beer for its human creators?  Isn't this why I studied engineering in the first place?  Yes, of course.

Leveraging my favourites in open source technology, including uArms, lots of Makeblock, ROS and a Primevsion camera, I have toiled away to finally introduce Open Worker:



You can anthropomorphize Open Worker as much as his human interaction test cousin, Spartacus.  This advanced autonomous robot is much more functional than cute.  Understandably so: bringing beer to humans is serious business, and the goal is worthy.

In my next post, I will explore the open source hardware details of this platform, allowing others to build upon my ideas and create more advanced versions of Open Worker.  The third post of this series will explain the software side of the project.  This includes programming ROS on a laptop that connects to three different Arduino boards.  Stay tuned!

Tuesday, August 26, 2014

Review: uArm by UFactory

It's already the end of August and the nights are getting cooler.  Luckily, my latest robotics project is keeping me busy during these shortening evenings.  I ordered a couple of assembled uArm kits from UFactory a few months ago, with their latest manufacturing run ready in time for the summer.  To my surprise, I received an email from one of the sales reps who offered to send me four unassembled kits instead of the two assembled arms.  I figured that this was a great opportunity to learn about the engineering of the arms and decided to jump right in.

uArm was the result of a very successful Kickstarter funding campaign.  Similar to the Makeblock team, it looks like the Chinese company was founded with the approach of using Kickstarter to fund their initial product line.  The uArm is a fully open source hardware and software platform that brings robotics to the masses.  I had been looking for a good robotic arm for experiments and found that arms with comparable robustness, precision and accuracy tend to run a couple hundred dollars more per unit, so at $299, it's not a bad deal.

The unboxing was pretty straightforward, which was to be expected.  The acrylic parts were easy to remove from their sheets and the hardware (screws, bolts, etc.) was usually well labelled.  The pdf instructions were also easy to follow.  I have to admit, it was a little frustrating screwing the tapping screws through the servo horns into the acrylic.  Metal horns would be a better fit with the metal gears and precision cut acrylic.  This isn't enough for me to say that the kits aren't great.  Although it took me a few hours to assemble my first arm, the second one was much easier as I knew what to expect at each step.  I ended up building one arm with the vacuum pump end effector and one with a claw.

Turning on the connected uArm Arduino board for the first time, there was a loud, constant tone from the buzzer.  This is normal; it stopped once my initial program was loaded.  Using Ubuntu, it was pretty easy to get the uArm libraries installed along with the initial calibration software.  Unfortunately, one of my arms had a defective servo, but after calling customer support, they assured me that they would send replacement servos.  Having built the second arm, it wasn't too difficult to figure out how to swap the main servos out.  The servos come with the potentiometer wire already integrated, so position feedback can be read from the board.  This is quite important for more advanced uses.

True to their open source hardware and software company policy, UFactory publishes all of their schematics and code.  This helps cultivate the community, and some people have improved upon the initial code.  For example, for the calibration step, Scott Grey has thankfully released his version, which helps prevent the arm from damaging the servos during the initial setup.

Scott Grey's calibration:
https://dl.dropboxusercontent.com/u/37860507/UF_uArmSG.zip

Other useful test utility:
https://github.com/kenaaker/uArm-test-utility

Official:
https://github.com/UFactory

I hope the software continues to improve, especially UFactory's own branch.  The arms have a lot of untapped potential, which makes them great for robotics, from weekend hobbyists to university researchers alike.

Sunday, August 10, 2014

Review: Makeblock


It's summer time and the living's easy.  When I'm not trying a new IPA or attending wedding/pool activities, I'm spending my leisure time on my post-Spartacus era robotics project.  Being a strong supporter of open source software and hardware, my research into the right prototyping hardware for my ambitious new project led me to Makeblock.cc.  They ran a successful Kickstarter campaign back in January 2013, and from what I could tell, it looked like these industrious entrepreneurs had designed a high quality product.  I decided to order their Lab and Robotics electronic kits to get a full sense of the capability of this open source construction platform.

It's difficult for me to review a construction set without comparing it to LEGO blocks.  I've used LEGO Mindstorms products for hobby projects, engineering studies and even in the design of work training programs.  I love that product series.  The Makeblock platform takes a similar approach, but with high quality aluminium parts instead of plastic, and electronics that are Arduino based or compatible.  All of the common robotics sensors and motors are there, from ultrasonic sensors to stepper motor control.  They also all connect quite nicely with RJ25 wires - no soldering required.  This makes for a very versatile kit that is useful for beginners and professionals alike.  It's a real pleasure to work with the parts, and the quality feel of the anodized aluminium is impressive.  Of course, this comes at a price: large robotic prototypes get expensive overall.  For my application, it is well worth the premium.

From a software point of view, after I imported the Arduino Scratch library, it was a breeze to test out some motor controls.  True to the open source nature of this platform, all of the code is available on GitHub.  For a couple hundred dollars, it's possible to have a small robot roaming your home in a matter of hours.

The folks at Makeblock have the right concepts in place for customer service, from forums to wikis.  Email-wise, I also had some good back and forth with a sales rep (thanks Tony).  The Leave a Message feature on their site didn't yield a reply to my initial questions, but hopefully this was a one-off and others have used the form with more success.

I'm looking forward to seeing what the robotics or maker community dreams up.  I'm quite satisfied with the parts and my project is coming along quite nicely.  Good job Makeblock, now if you could just get prices down a bit...


Solid robot base that is fully customizable

Sunday, July 6, 2014

Spartacus - Robotics Project Part 3

Spartacus has arrived and he's one sharp cookie!  He can move, chat and pretty much pass for an honest, good robot (he appreciates it when you tell him that he's a good robot).  However, the processing power required for facial recognition, the time lag in communicating with various web services, USB device/component quality and the fragile frame limit how far I can take this project.  In any case, I've learned quite a bit and, as you can see from this video, who wouldn't want a Spartacus?


So what's next?  Well, I've just received a few unassembled uArms, I'm about to order Makeblock parts, I've installed ROS on my laptop, I have a spare HD webcam, etc.  This is going to be a great summer!  Stay tuned, friends.

Wednesday, April 23, 2014

Spartacus - Robotics Project Part 2

Hello interweb,

I've made some pretty good progress on project Spartacus (no special acronym meaning, just a good name). Building upon previous robotics system experience, I've designed a table top experimentation platform that's also fun to observe. Over the last few months of working on this project, I've learned quite a bit.

USB power management is critical to having a system that doesn't generate random errors due to slight power fluctuations. I really wanted my robot to be wireless, but this added weeks of design on top of trial and error testing with commercial hubs, USB dongles and configurations. I ended up using the following hardware list:
  • Anker 15,000mAh @ 5V 2A USB battery pack
  • HooToo USB hub
  • USB WiFi dongle, RTL8192cu chipset
  • Syba SD-CM-UAUD USB stereo audio adapter, C-Media chipset
  • Logitech QuickCam Messenger webcam
  • BeagleBone Black
  • Adafruit 16 channel I2C PWM board
  • My foamboard box robot platform
I also needed to connect my mini powered speaker to the headphone jack on the USB sound card. The wires running down the neck of my robot were starting to add up, so I opted to simply keep the speaker charged separately since it has its own built in battery pack. In addition, the BeagleBone Black board and the Adafruit 16 channel I2C PWM board need 5V, which is delivered from the USB battery pack through modified USB cables. The sum of these parts makes quite a few devices to power for my small robotics package. In retrospect, it would have been useful to study the technical specifications of each device in more detail and to test the current and voltage outputs of the battery pack and hub under load. This way, I could have avoided ordering parts online that ended up in my spare parts bin. Regardless, some parts didn't perform according to their specifications, and trial and error was needed.

For example, I tested three different USB sound cards before ending up with one that had the capture quality needed for speech-to-text (STT) and text-to-speech (TTS). I even tried audio capture with a high quality USB mic, but the $10 sound card combined with an equally reasonably priced PC mic performed quite well. For speech recognition, I used PyAudio to capture a wav file when a sound level threshold was reached, converted the file to flac using SoX and finally sent it to Google to convert it to text. For TTS, I used espeak driven by command lines directly from Python. It worked great.
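
For readers who want to try the same pipeline, here is a rough sketch of the capture-and-convert portion. The threshold value and the overall flow are my assumptions, and the upload to Google's speech service is left out; only the PyAudio, SoX and espeak pieces shown here are standard:

    import audioop
    import subprocess
    import wave

    import pyaudio

    RATE, CHUNK, THRESHOLD = 16000, 1024, 500   # threshold tuned by ear

    pa = pyaudio.PyAudio()
    stream = pa.open(format=pyaudio.paInt16, channels=1, rate=RATE,
                     input=True, frames_per_buffer=CHUNK)

    # Record once the sound level passes the threshold, stop at silence.
    frames = []
    while True:
        data = stream.read(CHUNK)
        if audioop.rms(data, 2) > THRESHOLD:
            frames.append(data)
        elif frames:
            break

    stream.stop_stream()
    stream.close()
    pa.terminate()

    wf = wave.open("utterance.wav", "wb")
    wf.setnchannels(1)
    wf.setsampwidth(2)                # 16-bit samples
    wf.setframerate(RATE)
    wf.writeframes(b"".join(frames))
    wf.close()

    # SoX converts the wav to flac; the STT upload step is omitted here.
    subprocess.call(["sox", "utterance.wav", "utterance.flac"])

    # TTS: espeak speaks a string straight from the command line.
    subprocess.call(["espeak", "Hello, I am Spartacus."])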

One benefit of using Adafruit's PWM controller is that it can also control tri-color LEDs. I ended up using diffused 10mm tri-color LEDs for the eyes. Each takes three output pins, but in my case both eyes share three channels in parallel since I didn't have a use case for eyes of different colors. Since I only have 6 servos on the robot, there are still a few channels to spare for potential future expansions.
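
As a sketch of the eye control, Adafruit's legacy Python driver for this board (the PCA9685) exposes setPWM per channel. The channel numbers below are my example wiring, not necessarily the robot's:

    # Legacy Adafruit library for the PCA9685 16-channel PWM board.
    from Adafruit_PWM_Servo_Driver import PWM

    pwm = PWM(0x40)          # the board's default I2C address
    pwm.setPWMFreq(60)       # 60 Hz suits both servos and LED dimming

    RED, GREEN, BLUE = 13, 14, 15   # example channels; both eyes in parallel

    def set_eye_color(r, g, b):
        # r, g, b are duty cycles from 0 (off) to 4095 (full brightness)
        pwm.setPWM(RED, 0, r)
        pwm.setPWM(GREEN, 0, g)
        pwm.setPWM(BLUE, 0, b)

    set_eye_color(0, 4095, 0)   # green eyes for a good robot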

I used the Logitech QuickCam Messenger for a couple of reasons. I had originally purchased a pair of these webcams for a stereo vision project a few years ago and knew they worked well with Ubuntu Linux, and I still had them around. I learned from my home security system project that a TTL camera simply couldn't capture enough frames per second for useful applications in my current projects. The QuickCam can successfully take pics using OpenCV, and I plan on using this machine vision library to identify the coordinates of people's faces in front of my robot. For my next project, I'll use the Logitech HD Pro Webcam C920, which is used in many BeagleBone projects on the web.
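
Locating face coordinates with OpenCV takes only a few lines; this sketch uses the stock Haar cascade that ships with OpenCV (the file path varies by install):

    import cv2

    # Path to the bundled frontal face cascade; adjust for your install.
    cascade = cv2.CascadeClassifier(
        "/usr/share/opencv/haarcascades/haarcascade_frontalface_default.xml")

    cap = cv2.VideoCapture(0)        # first USB webcam
    ret, frame = cap.read()
    cap.release()

    if ret:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
        for (x, y, w, h) in faces:   # (x, y) is the box's top-left corner
            print("face at", x, y, "size", w, "x", h)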



With servo control, LED control, voice recognition, text-to-speech, chatbot APIs, WiFi, battery and webcam components all working individually and integrated into my robotics platform programmed in Python, it's now time to put it all together into a wonderful Spartacus robotics system package. I hope to share some more good news with you soon.