Sunday, December 21, 2008
RepStrap: first parts made...
I began building some of the bulk pieces last week. So far I have used a dremel / dremel drill press, coping saw, and drill. I already had the dremel; all of the other tools cost me a combined $80. The parts themselves are made from poplar, wood glue, and aluminum square stock. I got the idea for building pieces this way from Bruce Wattendorf. He has a lot of images posted here, and they were a lot of help when deciding how to create the replicated parts myself.
A quick calculation shows that I have made about 40% of the replicated parts I need at this point. I would say that is good progress for a few weeks' worth of spare-time work.
Thursday, December 11, 2008
RepStrap: I am not spending that...
I have several build options for a RepStrap at this point.
The first option I considered is also the most feasible: build a McWire. It is cheap and easy to construct, and it is sturdy enough to run a mill. All around it is a solid solution for a RepStrap. However, because of the threaded rod everywhere, it is slow. It also isn't very photogenic, is it? And it will always be a RepStrap. (A RepStrap is so named because it cannot reproduce itself, although it can produce a machine that can.)
The second option is to follow Forrest Higgs' route and build a traditional CNC machine. I could theoretically build one for cheap. However, Forrest has yet to extrude much with it. I may give this a try at a later date.
All other options are centered on the Darwin design. I like the Darwin because it is... pretty. That may sound shallow, but in my opinion pretty very often translates to good mechanical engineering. It is also a proven design. And... it has the potential to become a RepRap once you start printing parts to replace the ones you purchase or make.
OK, so I've settled on the Darwin design. Now all that is left is to decide whether to purchase or build the replicated parts. If you refer to the image from the previous post, the white and green parts are what I am talking about here. As far as purchasing goes, it looks like the only viable option today is to have them made at Ponoko. I am not willing to spend over $400 on parts that I plan on replacing as soon as I can print them.
It looks like I am stuck with making these parts by hand. But how?
Sunday, December 7, 2008
RepStrap: Why I am building a RepStrap
While attempting to build a web cam mount in one of my other blogs I realized that I did not have the tools or the resources for a legitimate solution. Looking around the Internet a bit revealed no company willing to prototype small quantities of anything at a small price. That is when I started wondering whether building my own rapid prototyping machine was within reach. A quick Google search confirmed, to my surprise, that it was! This brings us to the RepRap, or Replicating Rapid Prototyper, conceived by Adrian Bowyer. Below is a picture of the first iteration, the Darwin model.
Image can be found at RepRap.org.
Since their website is top notch I will not attempt to rehash the purpose of the project. Let us just say that it has a print head that can extrude plastic while moving in three dimensions. This allows it to build up layers of plastic into a 3D object that is accurate to 0.1 mm (in theory).
I will begin building after a bit more research. Oh, here is a video of Adrian Bowyer talking about the economic and social impacts of a Replicating Rapid Prototyper: RepRap Pop!Cast
Saturday, December 6, 2008
AllSeeing: stuck with an empty wallet...
Did you know... that hot glue does not make a strong enough connection between a miniature web cam and a servo drive gear? I was foolish enough to try this. I guess I forgot that proofs of concept should stay in their own directory. In order to get this rig driving a web cam around it is time for mechanical engineering to step in. Only, a proper mechanical solution to servos driving a web cam would require gear trains, bearings, and some type of housing. Have you ever priced gears?
So far this project required the following funds:
- servo motors - $12 x 2
- transistors - $0.20 x 8
- wire / connectors - around $4
- web cam - had it lying around
Put gears, bearings, and housing on that list and you are looking at a shoddy solution running $60 or more on top of that. I am not willing to spend that on components that cannot be reused and on a project that cannot occupy my time for more than a month.
Here is where things get interesting. I sat on this dilemma for a few days and decided to start searching Google for a rapid prototyped solution. That is, I searched for companies willing to prototype custom parts for cheap. I could not believe what I found. Henceforth I am abandoning this project in order to make my own rapid prototyping machine for under $500, unlocking SO MANY possibilities! Once this prototyping machine is complete I will eventually revisit this... I am sure.
Wednesday, November 26, 2008
AllSeeing: two degrees of freedom...
Two H-bridges built out of eight transistors and one week of research later, I have the following.
This rig, which is held together by hot glue, gives a proof of concept with two degrees of freedom. To push the proof of concept further I needed a C application with simple user input. Because I have a little background in GLUT graphics, I wrote up a simple window that takes the X, Y position of the mouse and converts it to polar coordinates. Attached is the source code. I will post a video of the end result shortly.
[file]
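If you don't want to open the attachment, here is a minimal sketch of the kind of thing I am describing: a GLUT window whose passive-motion callback converts the mouse (X, Y) into polar coordinates around the window center. This is not the attached source; the window size, callback names, and the printf output are my own placeholders.

#include <stdio.h>
#include <math.h>
#include <GL/glut.h>

#define WIN_W 400
#define WIN_H 400

/* Called by GLUT whenever the mouse moves inside the window. */
static void mouse_moved(int x, int y)
{
    /* Shift the origin to the window center; GLUT's y axis points down. */
    double dx = (double)x - WIN_W / 2.0;
    double dy = WIN_H / 2.0 - (double)y;

    double radius = sqrt(dx * dx + dy * dy);
    double angle  = atan2(dy, dx);          /* radians, -pi .. pi */

    printf("r = %6.1f  theta = %6.3f rad\n", radius, angle);
}

static void draw(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutInitWindowSize(WIN_W, WIN_H);
    glutCreateWindow("polar demo");
    glutDisplayFunc(draw);
    glutPassiveMotionFunc(mouse_moved);
    glutMainLoop();
    return 0;
}

The real application feeds the resulting radius and angle to the two servos instead of printing them.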
Saturday, November 22, 2008
AllSeeing: basic electronics, a refresher...
There are two types of motors that can be used for precision work: stepper motors and servo motors. They both have their benefits. Steppers are easy to interface with but do not tell you when they skip a step. Servos are basically DC motors with a feedback mechanism and a gear box. The feedback mechanism is their strong suit, though they are more complicated to interface with. I will be using servos like this one.
In order to power these servos I will need some form of amplifier. Parallel ports only provide enough current for logic. To keep it simple and basic I will be using good old transistors like this one.
And, of course, I will need a way to put it all together.
Here is a picture of my workshop and my current research. :)
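To get a feel for the software side of all this before anything is wired up, here is a rough sketch of driving one parallel-port data pin from user space on Linux, the pin that would feed a transistor stage and, through it, a servo. The base address (0x378), the choice of pin, and the timing are assumptions, not my actual driver, and software-timed pulses like this jitter enough that they are only good for a proof of concept. It needs root for ioperm().

#include <stdio.h>
#include <unistd.h>
#include <sys/io.h>

#define LPT_DATA 0x378   /* typical base address of the first parallel port */

int main(void)
{
    int i;
    int pulse_us = 1500;  /* hobby servos: 1-2 ms pulse every ~20 ms; 1.5 ms is center */

    if (ioperm(LPT_DATA, 1, 1) != 0) {
        perror("ioperm");
        return 1;
    }

    for (i = 0; i < 250; i++) {            /* hold the center position for ~5 seconds */
        outb(0x01, LPT_DATA);              /* data pin 0 high */
        usleep(pulse_us);
        outb(0x00, LPT_DATA);              /* data pin 0 low  */
        usleep(20000 - pulse_us);          /* rest of the 20 ms frame */
    }
    return 0;
}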
Tuesday, November 4, 2008
AllSeeing: an idea...
Knowing that in our senior year of college we would be required to complete extensive research on a topic of our choice was enough to inspire late-night brainstorming sessions among my colleagues in the Computer Science department. Most of these sessions resulted in us throwing outrageous ideas on the table and talking them to death.
Out of these sessions came the following idea. Would it not be cool to teach a computer to see? More specifically, get a web cam to follow people and look them in the face? This has been done before. There are commercial products that can keep a face centered. Still, it would be pretty cool to get a system like this up and running with nothing more than a web cam, some servo motors, and a parallel port.
It is my opinion that most AI today is a bunch of smoke and mirrors. That is not a bad thing; smoke and mirrors can do some pretty impressive things. The reason my colleagues never implemented this idea is that, from a Computer Science perspective, it had already been done. If you have a subscription to the ACM Library you can read about detailed algorithms for Skin Detection, Face Detection, and Face Recognition. Surprisingly, Skin Detection can be implemented with only a few lines of C. Observe my go at this:
#include <stdio.h>
#include <stdlib.h>
#include <math.h>
#include <Magick++.h>

using namespace Magick;

int main(int argc, char **argv)
{
    Image image;
    image.read(argv[1]);

    Color white = Color(MaxRGB, MaxRGB, MaxRGB, 0);
    Color black = Color(0, 0, 0, 0);

    int width = image.columns();
    int height = image.rows();
    printf("Width: %d Height: %d \n\n", width, height);

    int x = 0;
    int y = 0;
    for (x = 0; x < width; x++) {
        for (y = 0; y < height; y++) {
            ColorRGB pixel = image.pixelColor(x, y);
            double red = pixel.red();
            double green = pixel.green();
            double blue = pixel.blue();

            /* Simple ratio test: skin tones carry more red than blue. */
            double a = (red + green + blue) / red;
            double b = (red + green + blue) / blue;

            if (a < 2.5 && a > 2 && b < 4 && b > 3) {
                /* looks like skin: leave the pixel untouched */
                //image.pixelColor(x, y, white);
            } else {
                /* not skin: black it out */
                image.pixelColor(x, y, black);
            }
        }
    }

    image.write(argv[2]);
    return 0;
}
I ran this very crude and very fast solution on Angela Lansbury and Michael Jordan, choosing these two people partly at random and partly to show the versatility of the solution. Below is the result.
As you can see, the result looks very promising. So... skin detection is a check. All that would be left as far as algorithms would be the face detector, which I feel would also be trivial. This is where Computer Science ends and Electrical Engineering begins. In order to have a complete system the web cam would have to be mounted on a platform that would allow it to follow a face. This platform would have to be controlled from the same executing C code that is analyzing the web cam feed. Sounds like I need to dig into my basic electronics books, something I haven't done since junior year of high school.
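For a sense of how the pieces would eventually fit together, here is a rough sketch of that closed loop: given the center of a detected face in the camera frame, nudge the pan and tilt angles so the face drifts back toward the image center. The function and variable names, the gain, the frame size, and the servo angle limits are all placeholder assumptions, not working code from the project.

#include <stdio.h>

#define FRAME_W 320
#define FRAME_H 240
#define GAIN    0.05   /* degrees of correction per pixel of error */

static double clamp(double v, double lo, double hi)
{
    return v < lo ? lo : (v > hi ? hi : v);
}

/* Update pan/tilt (degrees) from the face center reported by the detector. */
static void track_face(double face_x, double face_y,
                       double *pan_deg, double *tilt_deg)
{
    double err_x = face_x - FRAME_W / 2.0;   /* positive: face is right of center */
    double err_y = face_y - FRAME_H / 2.0;   /* positive: face is below center    */

    /* Signs depend on how the camera is mounted; these are assumptions. */
    *pan_deg  = clamp(*pan_deg  - GAIN * err_x, 0.0, 180.0);
    *tilt_deg = clamp(*tilt_deg + GAIN * err_y, 0.0, 180.0);
}

int main(void)
{
    double pan = 90.0, tilt = 90.0;          /* start at the servo midpoints */

    /* Pretend the detector found a face up and to the right of center. */
    track_face(220.0, 80.0, &pan, &tilt);
    printf("pan = %.1f  tilt = %.1f\n", pan, tilt);
    return 0;
}

In the real system the fake detector output would be replaced by the skin/face detection running on each web cam frame, and the pan/tilt values would go out to the servos.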