Machine Learning Color Classifier

My line-following robot needed to differentiate between 5 colors to choose the line to follow. Instead of hard-coding thresholds and using trial and error to figure out what the sensor is seeing, I decided to use a Machine Learning algorithm. It was surprisingly easy to implement on my Arduino Uno.

For a recent line-following robot competition at the RSSC, I was presented with a new challenge. The organizers made the path change colors along the way, forcing the robot to make decisions about which path to take when it came to an intersection. My typical line-following sensor of choice is an array of five generic infrared reflective sensors, which work great when there's high contrast between the lines and the field. However, for this challenge, those sensors don't have the color-spectrum range to get the job done.

Adafruit manufactures a breakout circuit board based on the TCS34725 sensor, which can detect the full RGB visible spectrum, has an on-board LED to reduce the effects of ambient light on the sensor output, and conveniently communicates via I2C. The basic line-following algorithm I used is to have 3 sensors in a line array constantly scanning the ground and reporting back the color they see, while the robot uses that data to try to keep itself centered on the right colored line. The TCS34725 chip has a fixed I2C address, so I also bought an I2C multiplexer breakout board, which lets me access all three sensors without address conflicts on the bus.

The data from the color sensors comes in the form of an integer each for Red, Green, and Blue, plus another integer for Clear light. My algorithm needs to take in that raw data and produce a prediction of what color the sensor is actually looking at, all within the limitations of the Arduino. It would be somewhat difficult to train the algorithm and run it all on the Arduino, but the reality is: I don't have to.

The whole process is actually two smaller processes, training and prediction. Training requires more processing power than prediction because it has to work through my entire collection of known data to fit the prediction model to it. Prediction, on the other hand, is simple: multiply the raw readings by a set of weights, add them up, and pass the sum through the sigmoid function. That is easy to implement on the Arduino Uno and processes very quickly. I don't need to train the algorithm on the Arduino; that can be done on the computer. Here's an outline of the whole process:

Training

  1. Use the color sensors and an Arduino, connected to my computer, to collect raw data.
    • This is supervised learning (i.e. a person collects the data and does the work of labeling what color it is up front).
    • The data will be output over the serial port and saved in a text file
    • I need to know which color the sensor is trained on for each data set, so I’ll make one text file for each color
    • To make the prediction robust, I’ll have to frequently change the position of the sensor and the lighting conditions
    • The colored lines will always be near a white background, so I’ll need to collect data near those borders also
  2. Use Octave to import the data from the text files and use the fminunc function to quickly find the best-fit weights (fminunc handles the gradient-based optimization for me).
    • When I import the data, I create an additional column that represents the known output from each set and assign a value; either 1 for match or 0 for not a match.
      For example, if I’m training for finding Red, I’ll set all of the known outputs for the Red data set to 1 and all of the known outputs for the Blue, Green, Black, and White data to 0.
    • Before training, I randomize the order of the data, mixing matches and non-matches, then set aside about 30% of the data points which I won’t use during training. After I’m done training, I’ll use the trained algorithm on this data to test if it’s working or not.
  3. The key part of the trained algorithm is the weights that I will use in the Arduino during real-time operation.
Training Process

Prediction

  1. During real-time operation, the Arduino will only use the prediction model.
    • Unlike an unsupervised learning algorithm, this prediction model will not "learn" any more than it already has at the time I program the Arduino
    • The Arduino takes in the raw data, multiplies it by the weights, then passes the sum through the sigmoid function to decide if the color matches.
    • I have to break the classification problem into a series of smaller problems. Instead of predicting “which color is it”, the Arduino will consider each color one-at-a-time as a yes/no question.
  2. The output isn’t guaranteed to be clear. For example, if the sensor is looking at a red/blue transition, it might produce a positive result for red and blue at the same time.
    • I consider this possibility when writing the rest of my Arduino program.
Prediction Process
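To make the one-at-a-time idea concrete, here is a rough sketch of how the five yes/no answers could be boiled down to a single color on the Arduino. It assumes you already have the five probabilities from the predict() function shown in the prediction code further down; the 0.5 cutoff and the tie margin are just reasonable guesses on my part, not values I tuned:

// Resolve five one-vs-all probabilities into a single color guess.
// Returns the index of the winning color, or UNKNOWN when nothing is
// confident or two colors are nearly tied (e.g. at a red/blue transition).
enum Color { BLACK = 0, WHITE, BLUE, RED, GREEN, UNKNOWN = -1 };

int classifyColor(double p[5]) {
int best = 0;
for (int i = 1; i < 5; i++) {
if (p[i] > p[best]) best = i; // find the highest probability
}
if (p[best] < 0.5) return UNKNOWN; // no classifier says "yes"
for (int i = 0; i < 5; i++) {
if (i != best && p[i] > 0.5 && (p[best] - p[i]) < 0.2) return UNKNOWN; // ambiguous result
}
return best;
}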

If you're new to machine learning and you want to dive deeper into using it on your own, I recommend you check out Prof. Andrew Ng's Machine Learning course on Coursera.

The algorithm worked out well for me. When I gathered data in my garage under LED light, the trained algorithm worked in that setting, probably very close to a 99% success rate. On the day of the competition, the lights tended more toward the red end of the spectrum, so I gathered more data on the day and reran my training algorithm. After that, it worked fine. In general, gathering more data in more lighting conditions would help build robustness into the algorithm, but for a one-time-use project, it was good enough to get the job done. One major limitation I experienced was with the color sensor itself. It uses an integration (accumulation) time to gather data, and that really hurt my loop time: every call to the sensor had a built-in 154ms delay, multiply that by 3 sensors polled sequentially, and the robot was very difficult to keep on track. If I had more time, I might have tried something other than a blocking delay, perhaps letting the sensors accumulate in parallel in the background so the total delay would be 154ms instead of three times that.
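For what it's worth, here is a rough sketch of one mitigation I considered but didn't implement: poll just one sensor per pass through the loop, so each pass only pays the ~154ms integration delay once instead of three times, and steer using the most recent reading from each sensor. It assumes the tcs0/tcs1/tcs2 objects and tcaselect() function from the prediction code further down; the trade-off is that each sensor's reading can be a couple of passes stale:

// Round-robin polling: each pass reads just one of the three sensors, so the
// blocking integration delay is paid once per pass instead of three times.
// lastColor[] always holds the freshest reading from each sensor.
uint16_t lastColor[3][5]; // [sensor][constant, R, G, B, C]
uint8_t nextSensor = 0;

void pollOneSensor() {
lastColor[nextSensor][0] = 1; // constant term for predict()
tcaselect(nextSensor); // select that sensor's multiplexer channel
Adafruit_TCS34725 &s = (nextSensor == 0) ? tcs0 : (nextSensor == 1) ? tcs1 : tcs2;
s.getRawData(&lastColor[nextSensor][1], &lastColor[nextSensor][2], &lastColor[nextSensor][3], &lastColor[nextSensor][4]);
nextSensor = (nextSensor + 1) % 3; // move on to the next sensor next pass
}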

If you need a little help trying this out on your own, here is some sample code you can use to get started:

Arduino code for collecting the data to a text file:

#include <Wire.h>
#include "Adafruit_TCS34725.h"

/* Initialize with specific int time and gain values */
Adafruit_TCS34725 tcs = Adafruit_TCS34725(TCS34725_INTEGRATIONTIME_154MS, TCS34725_GAIN_1X);

void RawColorPrint(uint16_t x[5]){
//debug code – color data for classifier
//Serial.print(x[0],DEC);Serial.print(","); // always = 1
Serial.print(x[1],DEC);Serial.print(",");
Serial.print(x[2],DEC);Serial.print(",");
Serial.print(x[3],DEC);Serial.print(",");
Serial.print(x[4],DEC);Serial.println("");
}

void setup() {
Serial.begin(115200);
while(!Serial); //wait for the serial port to be ready
delay(1000);

Wire.begin();

Serial.println("Starting...");
delay(1000);

// Initialize the sensor
if (tcs.begin()) {
Serial.println("Found sensor");
} else {
Serial.println("No TCS34725 found... check your connections");
while (1);
}

}

void loop() {
//debug code – color data for classifier
uint16_t col0[5];
col0[0] = 1;
tcs.getRawData(&col0[1], &col0[2], &col0[3], &col0[4]);
RawColorPrint(col0);
}

Octave code for training the prediction algorithm:

sigmoid.m

function g = sigmoid(z)
g = zeros(size(z));
g = 1 ./ (1 + exp(-z)); % element-wise logistic function
end

predict.m

function p = predict(theta, X)
m = size(X, 1); % Number of training examples
p = zeros(m, 1);
p = round(sigmoid(X*theta));
end

costFunction.m

function [J, grad] = costFunction(theta, X, y)

m = length(y); % number of training examples
J = 0;
grad = zeros(size(theta));
J = -((1/m)*sum(y.*log(sigmoid(X*theta))+(1.-y).*log(1.-sigmoid(X*theta))));
grad = (1/m) * (X' * (sigmoid(X*theta) - y)); % gradient as a column vector, same shape as theta
end

Main code:

%% Initialization
clear ; close all; clc

%% Load Data
Whidata = load('White.txt');
Bladata = load('Black.txt');
Bludata = load('Blue.txt');
Reddata = load('Red.txt');
Gredata = load('Green.txt');

fprintf('Training for Red\n')

%% Assign Match(1) to Red, and No Match(0) to all the rest
data = [Reddata ones(size(Reddata)(1),1)];
data = [data; Bladata zeros(size(Bladata)(1),1)];
data = [data; Bludata zeros(size(Bludata)(1),1)];
data = [data; Whidata zeros(size(Whidata)(1),1)];
data = [data; Gredata zeros(size(Gredata)(1),1)];

%% Randomize Order
data = data(randperm(size(data,1)),:);

%% Sequester 30% of the data for testing after training
datacutoff = int16(size(data)(1)*0.7);
X = data([1:datacutoff], [1:4]); y = data([1:datacutoff], 5);
Xtest = data([datacutoff + 1:size(data)(1)], [1:4]);
ytest = data([datacutoff + 1:size(data)(1)], 5);

%% Add constant term to X and Xtest
[m,n] = size(X);
X = [ones(size(X)(1), 1) X];
Xtest = [ones(size(Xtest)(1),1) Xtest];

%% Initialize weights
initial_theta = zeros(n + 1, 1);

%% Set options for fminunc
options = optimset('GradObj', 'on', 'MaxIter', 10000);

%% Run fminunc to obtain the optimal weights
[theta, cost] = ...
fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);

%% Print weights to screen in Arduino-compatible format
fprintf('{%f, %f, %f, %f, %f} \n', ...
theta(1),theta(2),theta(3),theta(4),theta(5));

%% Use the optimal weights to predict the color of the checking data set
ycheck = predict(theta, Xtest);
fprintf('Test Accuracy: %f\n', mean(double(ycheck == ytest)) * 100);

Arduino code for the prediction algorithm:

#include <Wire.h>
#include "Adafruit_TCS34725.h"

#define TCAADDR 0x70

/* Initialize with specific int time and gain values */
Adafruit_TCS34725 tcs0 = Adafruit_TCS34725(TCS34725_INTEGRATIONTIME_154MS, TCS34725_GAIN_1X);
Adafruit_TCS34725 tcs1 = Adafruit_TCS34725(TCS34725_INTEGRATIONTIME_154MS, TCS34725_GAIN_1X);
Adafruit_TCS34725 tcs2 = Adafruit_TCS34725(TCS34725_INTEGRATIONTIME_154MS, TCS34725_GAIN_1X);

/* Trained algorithm for 154ms delay and 1x gain */
double WhiteTh[5] = {-17.197105, -0.007517, 0.033887, -0.040166, 0.010233};
double BlackTh[5] = {13.863546, -0.011893, -0.022478, -0.013213, 0.005284};
double BlueTh[5] = {0.113073, -0.031746, -0.112839, 0.120412, 0.010445};
double RedTh[5] = {-0.051582, 0.077581, -0.036699, 0.042285, -0.027792};
double GreenTh[5] = {-12.740958, -0.187098, 0.227913, -0.174424, 0.033591};

void ClassifiedColorPrint(){
uint16_t col0[5], col1[5], col2[5];
col0[0] = 1;
col1[0] = 1;
col2[0] = 1;

tcaselect(0);
tcs0.getRawData(&col0[1], &col0[2], &col0[3], &col0[4]);
tcaselect(1);
tcs1.getRawData(&col1[1], &col1[2], &col1[3], &col1[4]);
tcaselect(2);
tcs2.getRawData(&col2[1], &col2[2], &col2[3], &col2[4]);

double B1Pred, B2Pred, WPred, RPred, GPred, MPred;

B1Pred = predict(col0,BlackTh);
B2Pred = predict(col0,BlueTh);
WPred = predict(col0,WhiteTh);
RPred = predict(col0,RedTh);
GPred = predict(col0,GreenTh);

//debug code – checking color classifier
Serial.println(" ");
Serial.println("Sensor 0");
Serial.print("Black: "); Serial.println(B1Pred * 100.0);
Serial.print("White: "); Serial.println(WPred * 100.0);
Serial.print("Blue: "); Serial.println(B2Pred * 100.0);
Serial.print("Red: "); Serial.println(RPred * 100.0);
Serial.print("Green: "); Serial.println(GPred * 100.0);

B1Pred = predict(col1,BlackTh);
B2Pred = predict(col1,BlueTh);
WPred = predict(col1,WhiteTh);
RPred = predict(col1,RedTh);
GPred = predict(col1,GreenTh);

Serial.println(" ");
Serial.println("Sensor 1");
Serial.print("Black: "); Serial.println(B1Pred * 100.0);
Serial.print("White: "); Serial.println(WPred * 100.0);
Serial.print("Blue: "); Serial.println(B2Pred * 100.0);
Serial.print("Red: "); Serial.println(RPred * 100.0);
Serial.print("Green: "); Serial.println(GPred * 100.0);

B1Pred = predict(col2,BlackTh);
B2Pred = predict(col2,BlueTh);
WPred = predict(col2,WhiteTh);
RPred = predict(col2,RedTh);
GPred = predict(col2,GreenTh);

Serial.println(" ");
Serial.println("Sensor 2");
Serial.print("Black: "); Serial.println(B1Pred * 100.0);
Serial.print("White: "); Serial.println(WPred * 100.0);
Serial.print("Blue: "); Serial.println(B2Pred * 100.0);
Serial.print("Red: "); Serial.println(RPred * 100.0);
Serial.print("Green: "); Serial.println(GPred * 100.0);
}

void tcaselect(uint8_t i) {
if (i > 7) return;

Wire.beginTransmission(TCAADDR);
Wire.write(1 << i);
Wire.endTransmission();
}

double sigmoid(double z) {

return (double)1.0/((double)1.0 + exp(-z));
}

double predict(uint16_t x[5], double theta[5]) {
double answer = 0;

answer = ((double)x[0]*theta[0] + (double)x[1]*theta[1] + (double)x[2]*theta[2] + (double)x[3]*theta[3] + (double)x[4]*theta[4]);
answer = sigmoid(answer);
return answer;
}

void setup() {
Serial.begin(115200);
while(!Serial); //wait for the serial port to be ready
delay(1000);

Wire.begin();

Serial.println("Starting...");
delay(1000);

// Initialize the sensors
tcaselect(0);
if (tcs0.begin()) {
Serial.println("Found sensor");
} else {
Serial.println("No TCS34725 found on channel 0... check your connections");
while (1);
}

tcaselect(1);
if (tcs1.begin()) {
Serial.println("Found sensor");
} else {
Serial.println("No TCS34725 found on channel 1... check your connections");
while (1);
}
tcaselect(2);
if (tcs2.begin()) {
Serial.println("Found sensor");
} else {
Serial.println("No TCS34725 found on channel 2... check your connections");
while (1);
}
}

void loop() {
//Print Classifications
ClassifiedColorPrint();
delay(1000);
}

For my code and these examples, I used the TCS34725 library from Adafruit and leaned heavily on their examples for the TCS34725 and TCA9548A.

FTC disclosure: this article contains affiliate links

If you liked this project, check out some of my others:

Instant Parade!

The ThrAxis – Our Scratch-Built CNC Mill

Fume Extractor

Did you like It’s Project Day? You can subscribe to email notifications by clicking ‘Follow’ in the side bar on the right, or leave a comment below.

Fume Extractor

If you tinker with electronics, then you're certainly familiar with solder fumes. You probably do your best to avoid breathing them, but let's face it: like I used to, you probably just accept them as a necessary evil. I got weary of dealing with them, so I built this fume extractor project as an excellent way to filter the smoke and breathe a little easier.

As a prototype, I put the pair of fans in series (one in front of the other) with an activated charcoal filter on the front of both, but I found that the air throughput was about the same as with a single fan. I should have expected this, since each fan has a maximum speed that sets the limit on airflow. The cable I used for the power cord had a thumbwheel switch that was pretty difficult to turn, so that had to go. I also learned that it would have been easier to have a handle and a shorter run of power cord, since my desk is small with a power strip right next to my soldering workspace.

The final design needed a handle, a short power cord, a rocker switch on the body, both fans in parallel (next to each other), and a fuse for short-circuit protection. I designed the whole assembly in Fusion 360 starting with the fan housings. The two fans are sandwiched  with a 3D printed handle between two laser-cut 1/4″ plywood panels. The activated charcoal filter is held in front of the fans by another laser-cut plywood panel and four 3D printed brackets. The brackets have a nice snap-in action. If you’d like to make your own version of my design, they’re hosted on Thingiverse.com: 

https://www.thingiverse.com/thing:2748762

That was my project! If you liked this project, check out some of my others:

Instant Parade!

Applications and Fabrication with Plastics

The ThrAxis – Our Scratch-Built CNC Mill

Did you like It’s Project Day? You can subscribe to email notifications by clicking ‘Follow’ in the side bar on the right, or leave a comment below.

 

Entry into Machine Vision

To paraphrase Helen Keller: the only thing worse than being a blind robot is to be a robot with sight, but no vision. 

Machine vision is the collection of methods for taking in video or still-image data and translating it into some meaningful interpretation of the world that a computer can understand. Some of the most popular applications include face recognition, glyph tracking, 2D barcoding, navigation, and in recent years, using artificial intelligence algorithms to describe the objects in a picture. My thinking is that if I can understand the tools of machine vision, then I can extend those tools to robotics problems I'm working on, like object grasping, manipulation, and navigation. Machine vision algorithms are built into many sophisticated software packages like MatLab or LabView, but these are typically VERY expensive, making them completely inaccessible to me. Fortunately, the principles of machine vision are well-documented and published outside of these expensive software packages, so at least there's hope that if I work for it, I can build up my own library of machine vision tools without spending a fortune.

Since I'm building up this library for myself, I want to avoid having to rewrite the programs to adapt them to any hardware that I might have connected to a robot, including Windows and Linux machines or system-on-a-chip computers like Raspberry Pi, BeagleBone Black, or Intel Edison. My programming experience has been based in Windows computers, so I realize that the languages I'm familiar with won't be directly useful. I chose Python because it's open source, free, well-supported, popular, and has versions across all of the hardware platforms I'm concerned with.

I chose finding a color marker as my first venture into the murky waters of machine vision. The problem is pretty simple: use the webcam to find a color, send the coordinates to a robot arm, and move the robot arm. Au contraire, mon frere. That is not all.

This looks easy, doesn’t it?

The first hurdle to overcome is capturing the webcam data. Fortunately, the Python module Pygame is incredible for doing just that. It allowed me to capture and display a live feed of images from the webcam and superimpose drawing objects over the images so I could easily understand what I was seeing. Most of the code I used came from the tutorial programs in one form or another. In the picture above, you can see the webcam image shown with the superimposed green dot representing the center of the marker.

The second battle to face is lighting. When we look at an object, we are actually seeing the light it's reflecting. What that means is that when you change the light level or the color content of your light source (for example: use yellow instead of white), suddenly your robot will get lost because the color it "sees" is different from the color it expects. So, now we have to add a step to calibrate the vision system with its marker every time we run the program in case the light levels have changed between runs.

The next challenge comes in the form of coordinate systems. When we look at the image that the webcam captures, we can get the position of the object in what I’ll call webcam coordinates. Webcam coordinates are basically the count of pixels in the x- and y-axis from one of the corners. However, the robot arm doesn’t know webcam coordinates. It knows the space around it in what I’ll call robot coordinates, or the distance around the base measured in inches. In order for the computer to provide the arm with something that makes sense, we’ll have to be able to translate webcam coordinates into robot coordinates. If your webcam and arm have parallel x- and y-axes, then conversion may be just scaling the pixel count linearly. If the axes aren’t parallel, then a rotation will be needed. I kept the axes parallel and simply put a ruler down on the surface and used that to “measure” the distance that the webcam sees, then divided by the number of pixels in that axis.
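My program was written in Python, but the conversion itself is language-agnostic, so here is a minimal C-style sketch of that scaling step. The frame size and field dimensions below are made-up numbers standing in for whatever your ruler measurement gives you, and it assumes the webcam and robot axes are parallel, as mine were:

// Convert webcam coordinates (pixels from a corner) to robot coordinates
// (inches), using a scale factor measured by laying a ruler in the scene.
const double FIELD_WIDTH_IN = 20.0; // visible width measured with the ruler (example value)
const double FIELD_HEIGHT_IN = 15.0; // visible height measured with the ruler (example value)
const int FRAME_W = 640, FRAME_H = 480; // webcam resolution (example value)

void webcamToRobot(int px, int py, double &rx, double &ry) {
rx = px * (FIELD_WIDTH_IN / FRAME_W); // inches per pixel in x
ry = py * (FIELD_HEIGHT_IN / FRAME_H); // inches per pixel in y
}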

The final roadblock is when you go to make the arm move to the location you've given it. The solution to this problem could be as simple as geometric equations or as complicated as inverse kinematic models. Since both methods relate to the movement of the arm, I'll just call them both kinematics. Even though this will probably be the hardest challenge to overcome, you should probably take it on first since you'll be able to use the kinematic model of the arm to simplify many other programs you may have for the same arm.

Geometric solution to the kinematic equations

The idea behind kinematic modeling is that you want to write a group or system of equations that tell you what angles to move the joints to so that the end of the arm ends up in a particular position and orientation. In general terms, if you want a robot that can move to any position in a plane (2D), it needs to have at least 2 degrees of freedom (meaning it has 2 moving joints), and if you want to move to any position in a space (3D), it needs to have at least 3 degrees of freedom. In my case, my arm is a 5 degree of freedom arm (meaning it has 5 moving joints), and that makes it particularly complicated to use a purely mathematical inverse kinematic model: I could move to any position in the space around my robot (3D), but I'd have multiple solutions to the equations. I chose to constrain the arm to 3 degrees of freedom by forcing the gripper to have a particular orientation. Then it became easier to model it geometrically.
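For anyone curious what the geometric approach looks like in code, here is a minimal sketch of the classic law-of-cosines solution for a simple 2-link planar arm. This is an illustration of the method, not the model of my actual arm (which has different link lengths and the extra, constrained joints described above):

// Inverse kinematics for a hypothetical 2-link planar arm.
// L1, L2 are the link lengths; (x, y) is the target point in the arm's plane.
// Returns false if the target is out of reach.
#include <math.h>

bool solveTwoLink(double L1, double L2, double x, double y, double &shoulder, double &elbow) {
double c = (x*x + y*y - L1*L1 - L2*L2) / (2.0 * L1 * L2); // law of cosines
if (c < -1.0 || c > 1.0) return false; // target out of reach
elbow = acos(c); // elbow angle (use -acos(c) for the other solution)
shoulder = atan2(y, x) - atan2(L2 * sin(elbow), L1 + L2 * cos(elbow));
return true;
}

That "other solution" comment is the same multiple-solutions problem mentioned above in its simplest form: even a 2-link arm can usually reach a point either elbow-up or elbow-down.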

The program I wrote works fairly well. It’s able to find objects of many colors and it’s pretty entertaining to watch it knock over the marker and chase it outside the field of view. I demonstrated it at the August meeting of the Robotics Society of Southern California which meets every month on the second Saturday.

RSSC Machine Vision Demo

If you have any suggestions on an application for color marker tracking or if you’d like to know more about this project, please leave a comment below.

That was my project day!

If you liked this project, check out some of my others:

My Thanks to Mr. Clavel

Instant Parade!

The ThrAxis – Our Scratch-Built CNC Mill

Did you like It’s Project Day? You can subscribe to email notifications by clicking ‘Follow’ in the side bar on the right, or leave a comment below.

 

A Fun Recipe for Kitchen Chemistry

Chemistry is all around us every day. And now, it's all over my father-in-law's refrigerator, too.

My father-in-law is a chemical engineer and a kitchen chemist in his free time. My wife and I thought that the chemistry equivalent of poetry fridge magnets would be the perfect Christmas gift for him. The concept is pretty simple: find a set of molecules that are interesting, imprint their molecular models in some aluminum flat bar, and stick some magnetic strips to the back.

In honor of the kitchens they end up in, here is our recipe:

1  –   1″ x 1/16″ aluminum bar, cut to length

1  –   Sharpie marker

1  –   polymer magnetic strip, cut to length

1  –   1/8″ letter/number metal stamping set

Find a set of molecules that you find interesting and figure out their molecular models. Draw them out to scale on a sheet of paper to make sure they fit on the bar stock.

Cut each medallion to length from the aluminum bar with a hacksaw. Leave yourself a little extra around the edges for a cleaner look. Make sure the cut is square to the bar stock and be sure to debur the edges with either a file or a deburring tool.

Plan out the lettering on each medallion by drawing the letters out with the Sharpie marker. My wife figured out this cool trick: press down on the aluminum with the stamps and it will leave the slightest ghost of the image, then trace that with the marker.

Glucose Medallion Before Stamping

Imprint the letters on the surface of the medallion with the metal stamps and a hammer. Use a steel bar as a backing when stamping to make sure the aluminum doesn't stretch and draw (drawing is when the edges of the metal curl upward and the spot where the die is struck forms a bowl shape). Wash the marker off with rubbing alcohol or nail-polish remover.

Finish the surface of the medallion whichever way looks good to you. Aluminum can be worked with fine-grit sandpaper to make a brushed finish, hit with a steel wire wheel to give a rough finish, or polished until it shines.

Stick the self-adhesive side of the polymer magnet to the back of each medallion. Make sure the magnetic strip doesn’t delaminate by stacking the finished medallions in width order (widest on the bottom) and clamp the stack to a table for 24 hours or more. Use cardboard to keep from marring the surface.

Refrigerator Magnets!

That was my project day!

If you liked this project, check out some of my others:

Featured Artist Nameplates

Wooden Time Machine

Set your Creativity Adrift

Did you like It’s Project Day? You can subscribe to email notifications by clicking ‘Follow’ in the side bar on the right, or leave a comment below.

 

My Thanks to Mr. Clavel

I spent some time working on this backburner project over my Christmas vacation. It’s nowhere near finished, but it functions, so I thought I’d share what I have.

This robot arm configuration is called a Clavel Delta Arm; where you might think of a typical robot arm as a series of links, this is a parallel design. Well, strictly speaking, my robot arm is missing the parallelogram lower links that are a key feature of Reymond Clavel's design, but that's a future improvement and I'll get there.

My goal for this project is to develop the mechanics and controls to make this arm functional. I’ll be adding the parallelogram lower links and developing the kinematic equations so I can drive this arm by cartesian (x, y, and z position) or cylindrical (r, theta, and z position) coordinates.

That was my project day!

If you liked this project, check out some of my others:

Instant Parade!

The ThrAxis – Our Scratch-Built CNC Mill

Give Aging Technology a Chance

Did you like It’s Project Day? You can subscribe to email notifications by clicking ‘Follow’ in the side bar on the right, or leave a comment below.

Seven Segments… What More Could You Need?

Seven-segment displays are pretty useful tools for taking a project from indicator lamps to actual human-machine interaction. The only problem is, one digit of one display adds 7 outputs that you have to drive (or sink), and that doesn't leave many signals for the rest of the process. How do you overcome this hurdle? Well, electronics engineers thought of that and came up with display driver IC chips. Basically, these take a more condensed set of data and deal with the overhead of controlling the display for you. For this project day, my goal was to drive a 2-digit seven-segment display using the fewest possible outputs from a microcontroller (I used an Arduino Uno). Read on to see how I did.

 

Prototype Display Circuit

For the uninitiated, a seven-segment display is just an array of LEDs arranged to form the number 8. Well, actually any light source could be used, but I’m focusing on LED displays. By choosing which of the LEDs to light, the arrangement can articulate any numeral. This technology is used pretty much anywhere cheap numbers are needed, so in digital watches, calculators, thermometers, etc. LED displays in particular come in one of two flavors, common cathode and common anode. The type is important in selecting your driver, so watch out.

Seven-Segment LED Display

My display is a two-digit display with decimal points (Avago Technologies HDSP-5621, common anode, 2.5V, 30mA per segment).

There are a number of choices when it comes to display drivers. Input types include BCD, parallel binary, and serial. The output types vary from driving particular display types like LED, LCD, or CRT to driving a generic output, and from one up to eight digits. The driver I had on hand was a BCD to common-anode LED driver for one digit (Texas Instruments SN74LS47). Part of my goal was to use the components I had to reduce my costs, but if I were to buy a driver from scratch, I'd want to go with a two-digit common-anode driver with serial input.

At this point, I could have driven all 14 LEDs from 8 inputs to the display driver ICs, but 8 pins is a lot to ask on a small microcontroller, so the only way to reduce again was to look at serial. I chose to use a simple parallel-out shift register IC (takes in serial data and a clock pulse and outputs the last 8 serial bits as parallel data). I didn’t have anything on hand, so I selected Texas Instruments SN74HC164N.

Prototyping was pretty straightforward. I selected 220 Ohm resistors to limit the current between each segment and the driving IC (assuming a 1.2V-2.5V drop across each LED segment, the resistor sees the remaining 3.8V-2.5V of my 5V supply, which works out to roughly 17mA-11mA per segment). Also, I had to choose if the serial data would come from the microcontroller as LSb first or MSb first. The choice was completely mine since I wouldn't be able to use the Serial method in the Arduino due to the clock requirement on the shift register. I opted to stick with convention and send the LSb first.

Here is a drawing of my prototype circuit:

Schematic of Seven-Segment Display Driver

Here is the Arduino code I used to drive the display:

// define all constant parameters here

int LEDPin = 2;
int CLKPin = 3;
int i = 0;
int j = 0;

//Debugging settings
#define DebugBaud 9600

void setup()
{
// Debugging Setup
Serial.begin(DebugBaud);

pinMode(LEDPin, OUTPUT);
pinMode(CLKPin, OUTPUT);
}

void loop()
{
//Debugging Message
// Serial.println("Online");

for (j = 0; j <= 9; j++){
Serial.print(j);
Serial.println(" is on the display");
Transmit((byte) j);
Serial.println();
delay(1000);
}

}

void Transmit(byte Msg)
{
for (i = 0; i <= 7; i++){
writeBit(bitRead(Msg,i));
Serial.print(bitRead(Msg,i));
// delay(100);
}
}

void writeBit(boolean Val) {
digitalWrite(CLKPin, LOW);
// delay(1);

if (Val) {digitalWrite(LEDPin, HIGH);}
else {digitalWrite(LEDPin, LOW);}

digitalWrite(CLKPin, HIGH);
}
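The demo above only counts 0 through 9 on a single digit. With two BCD-to-seven-segment drivers fed by the 8-bit shift register, a two-digit number can be shown by packing one BCD digit into each nibble before shifting it out. Which nibble ends up on which digit depends on how the driver inputs are wired to the shift register outputs, so treat the nibble order here as an assumption:

// Show a two-digit number (0-99), assuming the low nibble of the shift
// register drives the ones digit and the high nibble drives the tens digit.
// Swap the nibbles if your wiring is the other way around.
void ShowNumber(int n)
{
byte ones = n % 10; // BCD value for the ones digit
byte tens = (n / 10) % 10; // BCD value for the tens digit
Transmit((byte)((tens << 4) | ones));
}

For example, ShowNumber(42) shifts out 0x42, which the two BCD drivers would decode as a '4' and a '2'.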

That was my project day!

If you liked this project, check out some of my others:

Integration… It’s not just about that silly ‘S’

Instant Parade!

The ThrAxis – Our Scratch-Built CNC Mill

Did you like It’s Project Day? You can subscribe to email notifications by clicking ‘Follow’ in the side bar on the right, or leave a comment below.

 

Integration… It’s not just about that silly ‘S’

Many years ago, I got about a dozen 16×16 bi-color LED display panels that were part of a broken segment on a marquee. Until now, they've been sitting in my spare parts cabinet just waiting for a project to come up. Nothing has come up, and now space is at a premium, so it's time to do something with them or get rid of them. Since I haven't used them before and I don't have a display driver, I don't know how useful they will be or even if they will work.

When these panels came to me, they were mounted on a frame and had small FPC (flexible printed circuit) ribbon cables connecting each row in series. The backs are printed with two part numbers KLM-163-CAH and KLM-163-CAN which made tracking down a datasheet challenging. I found one on datasheetdirect.com. The panels are powered by 5VDC and take 5VDC TTL logic levels (making them 100% compatible with Arduino or any other 5V microcontroller).

In terms of basic operation, you have to select the line, write a serial string to indicate the LEDs to light, then flip an enable bit to lock in that sequence until you get done with the next line. The serial data is timed by an external clock, so the exact timing of the serial sequence isn't critical. In fact, I bit-banged my way through the example code and it worked just fine. The display only shows one line at a time, so you have to be able to scan through the sequence pretty fast. The data sheet has a pretty clear timing diagram of the sequence of operations, so figuring out the timing shouldn't hold you up. The only thing that I got hung up on (and still haven't figured out) is an offset on the line addressing, but I've worked around it with a 15-line offset.

I also put a bit in the code that allows the Arduino to accept a serial sequence (through the USB port) and it will display that sequence. I tested it using HyperTerminal and it showed the bit-wise ASCII codes for the keys I pressed (so that’s success, right?)

I plan on writing at least one more post about these displays, so look for that in the future.

Here is the Arduino code I used to drive the display:

#define SBaud 9600

#define OE 2 //Out Enable pin
#define Latch 3 //Latch pin
#define CLK 4 //Clock pin
#define GData 5 //Green data pin
#define RData 6 //Red data pin
#define Ad0 8 //Address bit 0 pin
#define Ad1 9 //Address bit 1 pin
#define Ad2 10 //Address bit 2 pin
#define Ad3 11 //Address bit 3 pin
#define Timer 0 //Delay Timer

#define RowMin 0 // Decimal min in row
#define RowMax 15 // Decimal max in row
#define ColMin 0 // Decimal min in col
#define ColMax 15 // Decimal max in col

#define SMax 32 // number of bytes to read from serial

word Display[] = {0x0000, 0x03C0, 0x0FF0, 0x1FB8, 0x3F1C, 0x3F8C, 0x7EC6, 0x7C46, 0x760E, 0x631E, 0x303C, 0x387C, 0x1CF8, 0x0FF0, 0x03C0, 0x0000};
int Di = 0;

int Row = 0;
int Col = 0;

void setup() {
pinMode(OE, OUTPUT);
pinMode(Latch, OUTPUT);
pinMode(CLK, OUTPUT);
pinMode(GData, OUTPUT);
pinMode(RData, OUTPUT);
pinMode(Ad0, OUTPUT);
pinMode(Ad1, OUTPUT);
pinMode(Ad2, OUTPUT);
pinMode(Ad3, OUTPUT);

//Start with all bits low
digitalWrite(OE, LOW);
digitalWrite(Latch, LOW);
digitalWrite(CLK, LOW);
digitalWrite(GData, LOW);
digitalWrite(RData, LOW);
digitalWrite(Ad0, LOW);
digitalWrite(Ad1, LOW);
digitalWrite(Ad2, LOW);
digitalWrite(Ad3, LOW);

//Start the serial monitor
Serial.begin(SBaud, SERIAL_8N1);
Serial.setTimeout(1); //stop checking the port if no data is available after 1 ms

}

void loop() {

//check the serial buffer for data, if there is some, use it
//otherwise continue putting data on the screen
if (Serial.available() >= SMax) {
//when enough bytes are available to fill the screen, load them into the image buffer
for (Di = 0; Di < SMax ; Di++){
if (Di%2 == 1) {
Display[Di/2] = word(highByte(Display[Di/2]) , Serial.read());
}
else {
Display[Di/2] = word(Serial.read() , lowByte(Display[Di/2]));
}
}
for (Di=0;Di<SMax;Di++){
if (Di%2==1){
Serial.print(lowByte(Display[Di/2]));
}
else{
Serial.print(highByte(Display[Di/2]));
}
}
Serial.println("");
}

// initialize enable to be high (disable)
// Step 1. Bring both Latch and Enable High to prepare for new data
digitalWrite(OE, HIGH);
digitalWrite(Latch, HIGH);

for (Row = RowMin; Row <= RowMax; Row++){

// Step 2. With Latch and Enable High, set the row address
writeAddress(Row+15); // work around the 15-line addressing offset mentioned above

// Step 3. Enable goes low to activate the display
digitalWrite(OE, LOW);

delay(Timer);

// Step 4. The bitwise data is written to red and green
for (Col = ColMin; Col <= ColMax; Col++){
writeBit(bitRead(Display[Row],Col), 0);
}

// Step 5. After the row is done, bring Enable High and
//Latch Low to set the data for that row
digitalWrite(CLK, LOW);
digitalWrite(OE, HIGH);
digitalWrite(Latch, LOW);
digitalWrite(Latch, HIGH);
}
}

void writeBit(boolean RVal, boolean GVal) {
digitalWrite(CLK, LOW);

if (RVal) {digitalWrite(RData, LOW);}
else {digitalWrite(RData, HIGH);}

if (GVal){digitalWrite(GData, LOW);}
else {digitalWrite(GData, HIGH);}

digitalWrite(CLK, HIGH);
}

void writeAddress(int RowNum) {
if (bitRead(RowNum,0)==1){ digitalWrite(Ad0, HIGH);}
else {digitalWrite(Ad0, LOW);}
if (bitRead(RowNum,1)==1){ digitalWrite(Ad1, HIGH);}
else {digitalWrite(Ad1, LOW);}
if (bitRead(RowNum,2)==1){ digitalWrite(Ad2, HIGH);}
else {digitalWrite(Ad2, LOW);}
if (bitRead(RowNum,3)==1){ digitalWrite(Ad3, HIGH);}
else {digitalWrite(Ad3, LOW);}
}

That was my project day!

If you liked this project, check out some of my others:

Instant Parade!

The ThrAxis – Our Scratch-Built CNC Mill

Give Aging Technology a Chance

Did you like It’s Project Day? You can subscribe to email notifications by clicking ‘Follow’ in the side bar on the right, or leave a comment below.

 

Featured Artist Nameplates

As engineers, my wife and I are constantly using our left brains, so we find it's nice to quit analyzing and get creative. I'm betting that these nameplates will add some sophistication to our little hallway amateur art gallery.

Between my dad’s and my sister’s photography, my wife’s paintings and drawings, and my wife’s best friend’s paintings, we have a very diverse little art gallery growing in our hallway. We’re very proud of the artists in our family and they deserve to be recognized for their work. I started thinking about this project as just simple aluminum plates with names stamped into them. While that concept would have been good enough to fit the function, the aesthetic just wasn’t what I wanted.

Featured Artist Nameplate

I used layers of material to make the nameplates 3-dimensional and used the textures and shapes of different kinds of materials to give them character. I used plain poplar for the base (closest to the wall), perforated stainless steel for the second layer, and of course the aluminum for the top layer. Thankfully, the aluminum polished up fairly well and makes the letters pop out, so they are pretty easy to read. I used simple washers to put spacing between the plates.

Stamping Jig

It took a while to get the lettering just right. At first, I tried stamping free-hand, but the smallest turn of the stamp or the smallest misalignment between letters ruined the whole plate. Ultimately, I made a jig out of spare aluminum plates and some angle shapes. Getting the spacing and alignment just right was the hardest part of the project. I also used hash marks on my workbench to get the spacing down.

That was my project day!

If you liked this project, check out some of my others:

Instant Parade!

Lukewarm Drill Press Bench

Set Your Creativity Adrift

Did you like It’s Project Day? You can subscribe to email notifications by clicking ‘Follow’ in the side bar on the right, or leave a comment below.

 

Lukewarm Drill Press Bench

Space is at a premium in small workshops, and there are many ways to overcome the limit that space puts on tool sizes. One way to fit big tools like my drill press in my garage work space is to make them mobile, so they're available when needed and can be stored when they're not.

 

Finished Drill Press Table

This project has been in the back of my mind for quite a while. You see, I have a small bench-mounted drill press that is very useful and, for some projects, necessary. The problem with it is that in my garage workshop, I don’t have a bench to mount my bench-mounted drill press to. So, every time I use it, I have to pick it up off the ground and set it up on a small portable bench, then tear it all down again when I’m done. Even if I just built a bench for it, I don’t have any wall-space left, so the bench would have to sit in the middle of the workshop taking up valuable floor space that I will probably need for something else. It’s also to my advantage to design my workspace so it’s reconfigurable since my projects range from small-scale electronics and widgets smaller than a credit card to furniture-sized constructions. What that means for a drill press is that sometimes, I may only need to have the working space of the table (that’s the suspended platform below the spindle and chuck) and other times, I’ll need several feet of space on either side. The key to make all of these things possible without turning my garage into a logistical nightmare is to make the drill press bench mobile.

I've made a few workbenches before and the easiest and most economical materials I know are 2x4s and plywood. As long as you have some basic dimensions in mind, 2x4s and plywood make building a bench so easy I feel like it practically builds itself. In fact, this build was made even easier because it's based on the plans for a drill press bench I had built previously. If you have the skills, I suggest you make a sketch of the final version of anything you build because you never know when you will want to rebuild it or use the design for something else. As for the mobility portion of this design, I knew the bench would have to lock into place when I went to use it. In the planning stages, I decided against using locking casters since there is quite a bit of movement you can get out of them even if they are locked. Instead, I opted to have wheels on one side so I could tilt the bench to wheel it into place, then tilt it upright to set it down. The original plan was to put a steel rod through some of the 2x4s, hold it in with cotter pins, buy some Harbor Freight wheels, and hold them on with washers and more cotter pins. That plan didn't pan out because buying the materials for it would have been only slightly cheaper than buying a dolly on sale. Honestly, buying the dolly feels a little like cheating, but in spite of what you might think while reading my blog, sometimes it's just better to let someone else do the work.

Table with Dolly Attached

One of the big parts of designing this table is that the drill press is top-heavy and so I can’t make the table too narrow otherwise it’s at risk of falling over. To overcome this risk, I performed a tilt analysis as part of my design. Using geometry, you make some educated guesses about where the center of gravity of the table plus drill press is, then make some more educated guesses about how far the table should tilt before it falls over, then make a footprint that matches those two conditions. Because this table is designed to tilt to move, I used a scant 10 degrees as the maximum tilt angle and assumed the CG was pretty high, justifying the 24″ base.
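If you want to repeat the tilt analysis yourself, the geometry boils down to a single arctangent: the bench starts to tip once it leans past atan(half the base width / height of the center of gravity). The numbers in this little sketch are illustrative guesses, not measurements from my build:

#include <math.h>
#include <stdio.h>

int main(void)
{
double half_base_in = 12.0; // half of a 24" base
double cg_height_in = 36.0; // assumed CG height - a guess, not a measurement
// Tilt angle at which the CG passes over the pivoting edge
double tip_deg = atan2(half_base_in, cg_height_in) * 180.0 / 3.14159265358979;
printf("Tips over past about %.1f degrees of tilt\n", tip_deg);
return 0;
}

With those guesses the table tips at roughly 18 degrees, which leaves a comfortable margin over the 10 degrees I planned to tilt it while moving.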

The build process is fairly straightforward. Measure and cut the 2x4s and plywood, then make like the Avengers and assemble. I used a circular saw to make the cuts, but a chop saw would work, too if you have one. One trick when cutting lengths of material down is to cut the pieces in order from longest to shortest. In this way, if you make a mistake, you can cut the mistaken piece into the smaller pieces and still minimize waste. During assembly, I try to predrill as much as is practical. In this case, the 3″ drywall screws work best when you predrill the lumber nearest the head of the screw with an 1/8″ hole. The pilot hole allows the screw to go through at the angle exactly how you want it to and the two boards will squeeze in tight as the screws are tightened to make a solid structure. Rather than constantly swap the 1/8″ drill bit for the phillips screw attachment in my cordless drill, I had my 120V corded drill set up with the screw attachment and the cordless set up with the drill bit. It went way faster that way, but the corded drill was severely overpowered and it made the bit skip in the screw head. In retrospect, I probably should have had those two reversed so the right tool was used for the right job. Primitive Pete strikes again!

I made a lot of missteps on this build. I started with the plan to build the square top and the shelf borders first, then secure them to each other with the legs and finish with the plywood tops. As I built, I changed my mind and decided that it would be best to try to build the whole table from one side to the other starting with two legs and one quarter of the square top and shelf borders. The idea was that when the top and shelf were 3/4 built, the plywood could be put in and clamped in place with the last quarter of the platform. Instead of being the elegant solution I thought it would be, it was a disaster. Finally, I reverted back to the original plan which ended up working much much better.

I finished the table with some gray paint that I randomly chose at the hardware store. I wanted to use a neutral color that wouldn't be overly dark or light, so marks wouldn't show. I didn't go to the store with a plan and decided while there that the middle gray color the store had chosen for their display bases was just right. All the way home from the hardware store, I wondered if I had made the right choice, but the color has worked out pretty well, so far. If I find I don't like it as time goes on, I can always repaint it. I also used some chrome corner caps to keep from bashing knees, because the top of the table is at the perfect knee-bashing height.

I’ve mounted the drill press and I’m letting the table settle into the garage space, but I don’t think it is a huge success. I think I misjudged how large the top of the table needed to be and so it takes up a considerable amount of room. Much more room than I expected. I’m seriously considering cutting it down to be much closer to just the footprint of the drill press.

UPDATE!

Since the original posting, I’ve reduced the size of this table from the original 27″ x 27″ to a much more modest 14″ x 19″. At that size, the dolly is just the right width and there’s just enough room underneath for my toolbox. I’ve used it a couple of times so far and I’m really happy with the height and how easy it is to move around.

That’s Much Better!

That was my project day!

If you liked this project, check out some of my others:

Wooden Time Machine

A Twist to Build a Dream On

Growing Projects One Dimension at a Time

Did you like It’s Project Day? You can subscribe to email notifications by clicking ‘Follow’ in the side bar on the right, or leave a comment below.

Hot Rod Red Robot Controller

I’ve always believed in working harder by working smarter. So when I do a task more than once, I take a minute to consider if I could make life easier for myself by making a tool, gadget, or program like this controller that will keep me moving forward instead of being stuck in the doldrums.

When I'm working with my PicAxe, Arduino, Raspberry Pi, or even just PLCs, I keep finding myself building the same kinds of prototype circuits over and over again. Circuits like switches, buttons, and potentiometers used as voltage dividers or as current-limiting devices come up all the time. I'm sure if you're electronically-oriented, you've had the same thing happen to you. Instead of having these parts clutter up my solderless breadboard, I decided to make a controller that would house the devices so I could simply wire them into my prototypes... and do it with style!

Since I've been building prototype circuits with these components for years and it's mostly straight connections, it didn't take any effort at all to make the electrical plan. The real challenge of this project was planning out how the components I wanted would all fit on a single panel. On one extreme, I could make it a big, obnoxious contraption with everything I could possibly ever need, but completely unwieldy; on the other end of the spectrum, something so small and specific that it's not useful. Aside from the use / aesthetic spectrum, I also have more than enough prototyping components, so one self-imposed limitation was that I didn't want to go nuts buying all new stuff. That brought the challenge that I'd have to build the project around these two massive industrial joysticks that I have. If space is at such a premium, then why two joysticks, you ask? "To control robots", I would answer. In the end, the limiting factor for every dimension of the panel was the size of the joysticks. I managed to fit two switches, two potentiometer / rotary selector switch knobs, and five push buttons in the space between.

Enclosure Base Complete with Unused Hinges

The build started off as a box with feet and a hinged lid which the components would be mounted to. That was going to give me the flexibility to easily open the cover and make changes, if needed. The hurdle with that design is that the lid, being made from very thin aluminum, would need to be reinforced so it didn’t flex every time you touched it. Also, there are the pointy corners to consider. Every iteration of a supported lid that I came up with was either clunky or complicated or both, so I decided that a fixed lid was the way to go and I’d just have to deal with reaching through the controller to make changes. Between the easy-to-manage handy panel siding and the square material used for bracing and the legs, it took very little time to build the enclosed bottom of the controller. The hardest part was visualizing interacting with the controls and planning where to put them and how to plan for the possibility of changes in the future. For this project and any others you might have dealing with sheet metal and drilling holes, I recommend you buy a set of step drills. Not only do they make much larger holes than you can practically make with general purpose drill bits, but they will also debur the hole after they cut it.

Finished in Shiny Red

After I had the enclosed base fitted with the aluminum plate, I test-fitted all of the parts to make absolutely sure everything fit, and painted the whole thing with red automotive paint. It's not my best paint job, but it gets the job done. I may repaint the base or the plate with a different color to give it some personality.

Going forward, this will be great for prototyping. If I need a joystick in a permanent build, though, I think I’ll go for the mini joysticks available from Parallax or others instead of using a joystick almost as large as the robots I build.

That was my project day!

If you liked this project, check out some of my others:

Instant Parade!

The ThrAxis – Our Scratch-Built CNC Mill

Give Aging Technology a Chance

Did you like It’s Project Day? You can subscribe to email notifications by clicking ‘Follow’ in the side bar on the right, or leave a comment below.