~ Machine Design.
~ post date: 2017.03.27, last updated: 2017.06.11

Background

dìshūBot is a water calligraphy device inspired by the modern Chinese custom of painting calligraphy on public sidewalks with a water brush, a “practice [that] corresponds to both a socializing need and an individual search for self accomplishment or improvement” (Francois Chastanet). Fabricated at FablabO in Shanghai as part of the 2017 Fab Academy session, dìshūBot is a composition of 3D printed and laser cut materials and employs a combination of stepper motors, a water pump, Arduino Uno x CNC Shield and a custom G-code transcoder (Grasshopper) to endlessly trace passages composed of «hanzi» that slowly disappear as water evaporates.

In this post, I will survey some precedents and document the prototyping of the mechanical design and fabrication from my point of view. The collective effort of the project will be posted to a Shanghai fabpage.

Because it is fun to skip to the end of the story first (SPOILER ALERT), this is a video of a working prototype of dishuBot featured at Maker Faire Bay Area 2017 writing a poem by Tao Qian titled Drinking Wine.

Contents
  • Shanghai FabLabO Machine Design Post.
  • Research : Machine precedent; Dishu
  • Fabricating and modifying Gestalt stages
  • dishuBot: Prototyping the infinity axis.
  • dishuBot: Prototyping the variable axis.
  • dishuBot: Prototyping the instrument.
  • Walking up some Mega RAMPS (and falling down).
  • About the day the Arduino Uno CNC Shield arrived in the mail...
  • Custom g-code writer, why and how.
  • Jump : Index
  • Project files
  • download dishuBot wheel armature SVG
  • download dishuBot stepper side STL
  • download dishuBot pump side STL
  • download dishuBot stage STL
  • download dishuBot effector shield STL
  • download dishuBot effector gear STL
  • download dishuBot effector brush STL stem
  • download dishuBot effector sharpie stem STL
  • Research : Machine precedent; Dishu

    Over the past couple months we have been learning to use tools for fabrication and coding. Now we face the challenge of pulling these tools together into a fabbed robot. My initial concept is a Seymour Papert "turtle"-inspired drawing robot that can use water to write poetry in Chinese characters on outdoor concrete ground surfaces.

    First I will survey the drawing bot landscape. Dishu is "earth writing or practicing ephemeral calligraphy on the ground using clear water as ink." The imagined robot would be mobile (legs or wheels), battery-powered, and would apply water through a brush to make drawings translated from computer-generated code.

    The following excerpt is from Francois Chastanet's book, Dishu: Ground Calligraphy in China.

    "The endlessly tracing texts composed of «hanzi» signs slowly disappear as water evaporates. This phenomenon, called «dishu», appeared in the beginning of the 1990s in a north Beijing park and soon spread to most major Chinese cities. Based on classic Chinese literature, poetry or aphorisms, these monumental letterings, ranging from static and regular to highly cursive styles, make the whole body break into spontaneous dance and infinite formal renewals. This street calligraphic practice corresponds to both a socializing need and an individual search for self accomplishment or improvement."

    Hanzi is typically written top to bottom and left to right, and line thickness is controlled by the force (or speed?) of the brush against the surface. I think it would be wonderful if the robot were capable of mimicking this sequence and line-width control.

    The inspiration and direction for this robot comes from Seymour Papert's LOGO "turtle".

    A wonderful television exposé of Seymour Papert's LOGO and turtle drawing robot. If you can, I recommend listening to something like Radiohead's "Treefingers" or Mum's "Slow Bicycle" while watching the video. It helps direct your mind into the right space.

    From the video, it appears this version of the turtle is made of an acrylic base, two wheels individually driven by motors connected via timing belts, two ball-bearing casters on the perpendicular axis, and a sharpie marker near the center, perhaps connected via a servo bracket. Data and power are provided externally via the serial cable. The previous image shows a vacuum-formed plastic globe and some additional finishing materials under the base, which give the robot character.

    Design considerations for our Dishu Robot.

    Design components of our Dishu Robot.

    Thoughts on uses.

    I think this type of robot could be used for many things beyond Dishu. For instance, the robot could draw permanent artistic patterns on floors in paints or inks. It could draw patterns on large stock material for various uses such as construction. Maybe the robot does not use a pen and is instead equipped with a knife for large-scale cutting applications; maybe it cuts patterns of mega-origami. That could be fun.

    Regarding motion.

    Gears driven by motors. One gear to move the brush holder along an axis and another to move the assembly along the perpendicular axis, à la the Itty Bitty Drawing Robot. This could be used to move the brush on one axis independent of the movement of the wheels on another axis.

    A Gestalt stage, developed by Nadya Peek and James Coleman, used for motion in one direction, while something else, wheels or legs, freely move the robot in the perpendicular direction.

    This fabricatable machine by Jens Dyvik with the linear rails, rack, pinion and glide blocks all fabricatable on conventional CNC milling machines. Fantastic.

    Motors attached directly to wheels. Rubber is added around the outside face of the wheel.

    Or motors attached to belts attached to wheels. I think this option might make operation less efficient, but it looks better. Also, the motor could be positioned with greater freedom in relation to the wheel, whereas the previous option requires the motor to be centered on the wheel. This seems to be the technique used in the LOGO "turtle".

    Further, this could be a motorized gear connected to two wheels on each side for balancing the machine. This diagram from this gear train design blogpost displays a [stepper] motorized gear driving two wheels.

    Carmen found this drawbot archetype, 3&Dbot, which uses two pairs of wheels oriented along perpendicular axes. Omni-directional wheels are used to permit full motion.

    Carmen found this similar execution of splitting the brush and movement axes. However, this robot moves the paper instead of the machine. Our robot is intended to move itself.

    Theo Jansen inspired walking legs. Which, by the way, is infinitely more dynamic and sensational than wheels! Can the movement be discretely controlled by stepper motors? To what increment?

    Carmen found this diagram of a large-scale cable camera movement system from a blog entry. One possibility would be to erect a tent-like structure, connect these cables, and then execute the drawings.

    Often considered the first of the hanging wall drawing machine type, Hektor is a machine consisting of two motors, toothed belts and a spray can holder. It is controlled by software written in Scriptographer running inside Adobe Illustrator. From the Hektor webpost, "A geometric path-finding algorithm calculates the motion paths required by the fragile mechanical installation to move smoothly and not lose the battle against gravity. The algorithm then translates these paths into rotations of the two stepper motors that position the spray-can, and coordinates the pressing of its nozzle." Between the stadium scale of the Camera's Cradle and the wall scale of Hektor, could a robot draw on a floor surface using a tension system while remaining mobile on legs or wheels?

    About the brush holder and effector.

    The simplest, most widely used variant seems to involve a micro-servo that pushes on a sleeve (wrapped around the brush handle) to move the brush onto and away from the surface through an extruded rail. These images are from the Low-cost, Arduino-compatible drawing robot post on Instructables. I think one improvement on this design would be to flip the direction of the sleeve so it slides into the rail. This way the two can be calibrated together to help minimize jiggle, while the sleeve remains adjustable to multiple brush sizes.

    This is a more elegant solution to brush movement found in the "Polar Plotter on Arduino and MakerBeams" project found by Carmen. In this case, the servo has two positions, up and down. Could it be modified for pressure variation?

    Watercolor Bot has an effector that uses two stressed plastic planes in combination with a servo motor and an adjustable collar (screw). The stress in the plastic helps keep the brush in constant contact with the paper and is located very near the crossing point of the X,Y axes (0,0).

    FabAcademy student Keith Tan designed an adjustable end effector with a spring. Rubber bands could also be used.

    And finally the overkill solution. The entire armature of the Interactive Robotic Painting Machine moves up and down along the vertical axis (relative to drawing plane). This is certainly capable of pressure variance.

    What might it look like?

    Despite being crude, these sketches portray an overall strategy for the dishu robot. This first sketch shows the movement systems. The servo controlled brush is connected to a movement axis while the stepper motor controlled wheels operate on another axis. The robot draws characters via a combination of axial movement by the two systems. At the end of a line, the robot swings around by moving one wheel at a greater rate than the other.

    A crazy variation on this may be to use walking legs. Also, this sketch at least recognizes the need for water, battery and controller space on the robot. Legs may work better for walking over cracks and small stones common to outdoor floors. It seems difficult to implement though.

    This sketch variation uses a tension wire system articulated with four motors for four cables, two for wheel motion, and one for the brush effector. I am now thinking there may be a delta (three-motor) variation on this, provided the wheel configuration does not interfere with the drawing surface.

    I hope to develop a machine that can stand amongst the following calligraphy inspired machine projects. Wonderful work by Nicolas Hanna.

    Junjiao Gan and Xu Zhang using a Kuka arm.

    Training a robot using a motion copy system.

    Cable driven system.

    Professor Xu Yangshang's calligraphy arm.

    I will post links to resources I have found helpful here.

  • Timeline of Computer History : Snapshots of the story. 1967, Seymour Papert designs LOGO, Star Trek debuts with multiple computation.
  • Brief History of the LOGO turtle : Beginning in 1948 with Elmer and Elsie...
  • Low-Cost, Arduino-Compatible Drawing Robot : Instructable for a drawing robot.
  • Dood : 2014 Fab Academy student, Matthieu, builds a drawing robot as his final project.
  • Ideal qualities in a drawing robot pen holder : Great research post on the MakerBlock blog.
  • Microbot : Building instructions for the v2 Miro(draw)bot kit.
  • Der Krizler : Detailed post about a tension based wall drawing bot.
  • Fabricating a Gestalt Stage

    After much intense strategic consternation, we decided to begin our project by developing through the Gestalt framework. The framework is developed; we are beginners. Can we connect rotary stages on either side of a linear stage equipped with a brush effector?

    The idea is similar to a common CNC machine. Wheels mobilize the whole assembly along an infinite X axis. A stage holding an effector moves a limited distance between two wheels (Y). The effector moves a minimal distance along the Z axis.

    The first step is to fabricate a linear stage. Download the Rhinoceros file; you may need the Grasshopper file as well. I had a problem opening the Grasshopper file on MacOS. After opening the file on a recent Windows build and saving it, it is now MacOS ready; I will include this version of the file in my links. Use it with caution: while it appears to be in working condition, the graph is thick and I did not double check everything. I had 5mm thick cardboard on hand, so the material thickness was updated to this specification (the Gestalt file is formatted in inches). Bake the Frame, Platform and Supports. Do some linework cleanup. Export the file to your laser cut format. If you have 5mm thick cardboard, save yourself some time and use my SVG linked below.

    Roughly 813mm x 830mm without padding.

    ♻ Gestalt Linear Stage 5mm thickness CUT! ♻
    ♻ Parastage grasshopper MacOS compatible ♻

    I had to use a laser cutter from another lab because the biggest piece exceeded our laser cutter's bed. I rushed optimizing the settings because a security guard was trying to close shop. That decision lit a small fire that ruined some pieces. Remember, poor decisions do not make pretty things.

    Score: speed55, powerMin80, powerMax85
    Cut:   speed10, powerMin100, powerMax100

    Cancelled that job. Increased the speed of the laser movement, reoriented the origin and started again. This time no fires and the pieces were mostly cut. Some pieces did not cut all the way through, maybe due to warping. In the future, I might consider cutting along the same lines twice with a combination of less power and greater speed. The settings I used:

    Score: speed60, powerMin80, powerMax85
    Cut:   speed20, powerMin95, powerMax100

    With a boxcutter, I liberated the unfolded Gestalt Stage. Tomorrow I will assemble.

    Constructing a Linear Gestalt Stage was fairly straightforward. There were a couple of issues which I think may have been related to the 5mm cardboard thickness I used and how the Grasshopper graph accounts for that. I did not encounter any problems that required me to re-cut any of the pieces. First issue: these two holes were not cut on the end of the main piece. The holes are necessary for sliding in the rails. Glad I caught this before I folded the frame.

    In the inner stage structure, there is not enough space for the piece that grabs the threaded rod attached to the motor. This would be less of an issue if that piece was positioned to the outside, but the cardboard would still need some trimming around the bolt heads.

    This looks like a face.

    In the frame, the stage was out of alignment with the aluminum shafts and leadscrew. There was too much cardboard. I trimmed off the last flap on the frame and some material off the bottom side of the stage.

    If I need to fabricate more of this Gestalt module with 5mm cardboard, I can quickly adjust the pattern file to pick up these sizing mods. Regardless, everything is ready to be motorized with this piece of our robot mockup.

    To help with visualizing the other components of the machine, I added low-tech wheels and an end effector. At least it is humorous.

    And I tested the movement of the stage using the motor attached to a Gestalt hub and Python. While it worked, it was extremely finicky: the motor would become unresponsive after running the script a few times. Afterwards, if I disconnected everything and reconnected, it would work again. Plus, we were unable to network multiple nodes.

    Update: With the linear stage working, I went to work constructing wheel armatures. Looking at the length of the screw attached to the motor driving the stage, we thought it would not permit a wide enough range of motion on that axis. Therefore, in addition to providing for wheels and motors, I built in attachments for a forward-facing motor that could drive a toothed belt to move the stage.

    I returned to the Rhinoceros file and rearranged some of the elements of the linear stage into a foldable armature using the same 5mm cardboard.

    I incorrectly accounted for material thickness in the folds of the inside layer. Fortunately, cutting with the laser cutter is quick. I added the extra width to all the modules within the fold and recut.

    Everything fit together tightly without need for extra adhesives.

    I was anxious to get some kind of testable prototype together for wheels connected to motors before I ended the evening, so I thought a quick solution would be to connect the wheel directly to the motor shaft. This turned out to be more troublesome than expected for a couple of reasons. First, I did not do a kerf test because I thought that after cutting one test, I would be able to adjust the dimensions. Ultimately, I was forced to cut many tests. I used 4mm acrylic, and cooling after laser cutting can change the material size enough to throw off a good fit. Further, the material wears loose after only a few uses. A far easier solution would have been to use a metal connection on the pin which could be screwed in through the face of the wheel. Then the screw holes in the acrylic would not need to be perfect, and with the connections spread across multiple points, the connection would retain its fit longer. Nonetheless, for purposes of testing the wheel movement these wheels worked.

    I added some multidirectional wheels to help the machine balance. We hope that the machine will be able to turn and thought to place the wheels at the center of the linear axis. Not a great idea for balance. This caster is not great either.

    Carmen printed an effector from a previous OpenDot project. While it will need some adjustment for our machine, it is helpful to at least go through some fabrication and see how one works in our hand.

    ZhaoWei returned to the lab, wired the motors and tested their operation. He used an Arduino Mega 2560 with a RAMPS shield running GRBL. I will comment more on the programming in another section of this post. All the motors are working: two for the wheels, one for the stage, one for the effector movement and another for pumping water through the brush.

    The machine does not balance well, so the next thing I will focus on is improving the wheel mechanics.

    I will post links to resources I have found helpful here.

  • [m]MTM : Modular Machines that Make : Cardboard CNC.
  • pyGestalt Github : The goods.
  • [modular] Machines that make : More Gestalt information.
  • OpenDot Effector, 2015 : Work by Simone Boasso printed for our prototype.
  • dishuBot: Prototyping the infinity axis.

    Now we have an operational prototype. Now we have a barely operational prototype. We decided I would focus on the mechanics of the second prototype. I wanted to improve several aspects of the first prototype. One, the robot should be easily portable, which I think works best if it can be easily disassembled and assembled. Two, the wheels do not balance the robot. Three, the effector and stage should be customized to the dimensions and requirements of this robot.

    In this section I will focus on the wheels, or the (Y) axis, the infinity axis. First, Saverio found this video which inspired the design.

    I thought it would be interesting if the metal rods were freed from the cardboard frame, enabling them to be any length, perhaps even variable depending on circumstance. Instead of using a screw to drive motion on that axis (X), we could use a toothed belt, which may easily be adjusted for length. The wheels needed to be detached from the motor and repositioned for balance. I thought we could use two pieces of material sandwiching the wheels and a belt to translate motor movement to the wheels. The belt would also protect the wheels from wear and give greater friction. Finally, the stage needed an element to grip and tighten the open end of the belt and an effector for the brush. The brushes we have are large enough to run a tube down the center of the stem, so maybe it makes sense to increase the diameter of the vertical guide. For future flexibility, this entire effector armature may be detached from the stage and replaced with either an improved version of itself or something with a different use. With this in mind, I made this freeform sketch in Rhinoceros. It is completely out of scale and loose with interrelationships, but it proved to be a helpful reference through the next few steps of fabrication.

    So how about more Grasshopper practice? I can make one hell of a mess in this Grasshopper graph. In the future, I will have to explore best practices for keeping the graph organized. Fortunately, I can read it, and this script made tweaking the wheel armature through prototyping significantly easier. I started with two wheels, the motor with a pulley attached to its shaft, and a single bearing between the wheels. After thinking ahead a little to positioning the motor for the X axis, I decided to move the wheel motor below the rods. When doing so, we were concerned about the tightness of the belt against the motor's gear, so I added two more bearings between the wheels and motor.

    Over to the lasercutter and minutes later I had a working version in cardboard.

    Unfortunately, we did not have bearings or closed toothed belts in stock. To the streets of Shanghai where dreams can be bought on the cheap.

    With a fresh batch of parts, I returned to the lab and the cardboard prototype and put everything together. I love it when this happens.

    Now I need teeth in the wheels to better catch the belt. I found this Grasshopper gear maker referenced on Moritz Begle's Fab Academy page. What better point of reference than a Fab Academy page? Use a gear maker in Grasshopper to cut wheels. There are many tools included in that package; I only needed the gear tool. I used a little maths to find the circumference of the pitch circle from the number of teeth on the gear. My purchased belt had a 5mm pitch. I had to cut a couple of test wheels to calibrate for laser kerf.
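
    As a quick sanity check on that math, a minimal sketch (the 20-tooth count is only an example; the 5mm pitch is from my belt):

    import math

    def pitch_diameter(teeth, belt_pitch_mm):
        """Pitch circumference = teeth * belt pitch; diameter = circumference / pi."""
        circumference = teeth * belt_pitch_mm
        return circumference / math.pi

    # Example: a 20-tooth wheel on a 5 mm pitch belt (tooth count is illustrative)
    print(round(pitch_diameter(20, 5.0), 2))  # ~31.83 mm, before kerf compensation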

    With properly coordinated gear wheels, I returned to my cardboard armature and tested. Brilliant.

    Finally, I wanted to use stiffer material than cardboard for this armature, so I went to the acrylic stock I was using for the wheels. The material is far less forgiving than cardboard, so I needed to adjust the bottom middle bearing location to relieve belt tension. I did this first on the drill press and then translated my findings into grasshopper and back out to the lasercutter.

    There are some [mechanically] viable wheel armatures. By this point I had already begun sketching a piece to hold the rods and motor for the perpendicular axis, which I will document in the following section of the page. The joints cut into the top of the wheel armature were made to slot into this next piece. Because I was working quickly, I made some decisions and followed through on that next piece. I decided this armature will slot in and the rod will help pin it in place for extra structure. And I added holes facing the motor screws on the opposite wall so I could unscrew or tighten without dismantling the entire armature.

    A few final notes on this build. The laser cutter makes a mess on the bottom side of the acrylic, so I oriented my cut files so the messy side would become the inside face of the armature. Also, there seems to be a balance in when to remove the acrylic from the stock after cutting. Too fast, and it smells horrific. Too slow, and it seems to lightly bond back to the stock.

    Download project files

    I will post links to resources I have found helpful here.

  • Evolvent & Gears : Grasshopper gear script I used.
  • Grasshopper gear script : Another gear script I tried but did not like as much. This is better in that it is clear what is going into making the gear.
  • Grasshopper Gear simulator : I thought this would be interesting to try but never had time throughout this build.
  • How to read a screw thread callout : Because sometimes we are doe-eyed beginners.
  • Generic Step Motor Datasheet : The basic dimensions of these motors are all very similar, and I had a hell of a time finding datasheets for my specific steppers, so I used this to build a virtual reference model.
  • CNC Tank : Youtube design that inspired the wheel design.
  • dishuBot: Prototyping the variable axis.

    The Gestalt linear stage used a screw drive mechanic which we thought would be too limiting dimensionally for our purposes. Therefore, we decided to change this axis to a belt drive mechanic. I also wanted to see if this axis could be freed from the cardboard structure of the Gestalt model so it might have interchangeable rod lengths.

    I went about designing these pieces thinking 3D printing would be the optimal method because I could make non-planar things without worrying about connections and alignments. 3D printing is slow. It is, however, also passive, which lets me focus on other things while the machines are working (designing the wheel armature and effector). These sections are not organized based on time.

    My design was a modification of the gestalt stage, stripping away unnecessary parts and slotting connections for my wheel armature, stepper to drive the (x) axis and DC water pump. That in mind, the design of the first piece was straightforward.

    Keep in mind, 3D printing can sometimes take hours to produce a single piece; you had better test the critical parts of the print first. By testing the rod fit, I found the printer was adding 0.4mm of material with my settings. Rather than try to properly calibrate my settings to the printer, I decided to adjust my 3D model. This is an easier thing for me to control.
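
    A minimal sketch of that compensation (the 0.4mm is the value I measured; the 8mm rod is only an example):

    PRINTER_OVERSHOOT_MM = 0.4  # measured: holes came out about 0.4 mm tighter than modeled

    def compensated_hole_diameter(nominal_mm, overshoot_mm=PRINTER_OVERSHOOT_MM):
        """Enlarge the modeled hole so the printed hole lands on the nominal size."""
        return nominal_mm + overshoot_mm

    # e.g. an 8 mm rod would get a modeled hole of about 8.4 mm with these settings
    print(compensated_hole_diameter(8.0))  # 8.4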

    Then I printed the first piece. I printed this piece with Makerbot tough PLA and a 0.2mm layer height, 3 layers on the external surfaces and 20% infill. Along with the overall wall thicknesses, the layer height was overkill and made this an 8 hour print when it easily could have been far less. In fact, with future pieces, I used 0.3mm layer height and the prints of similar size were finished in under four hours.

    I made an assumption about how much space I would need for the wheels inside the two layers of acrylic before I purchased materials. Happily, I was correct. These photos were taken before I resolved the wheel belt tensioning issue.

    On the other side I made an M8 sized hole for a TBD belt bearing and made space for the water pump and wires. I also added expansion slots for a water shelf and zip ties.

    I printed this with the Smart Extruder+ and normal PLA at 0.3mm layer height. Much quicker, with no noticeable structural downside.

    Finally, for the stage, I started to think about ways to carve away unnecessary material and give the robot some style. I used a U-shaped channel to tighten the belt and deal with the slack. It works, but it is tricky to push the belt into place under tension. On another try I may make it something like S-shaped, so I can tighten the belt, hold it in place and then seat it into a toothed section. Or, how about this? I might look for great ideas already in use.

    I test printed the belt and rod bearing fits. Initially I had teeth going all the way around the belt fitting. It was overkill. Glad I tested first.

    And the final fit of the belt tensioner.

    The robot is coming together.

    ♻ dishuBot stepper side ♻
    ♻ dishuBot pump side ♻
    ♻ dishuBot stage ♻

    I will post links to resources I have found helpful here.

  • 4xiDraw drawing machine : A machine I checked out while making.
  • 28BYJ-48 Step Motor Axis : A machine I checked out while making.
  • dishuBot: Prototyping the instrument.

    The effector pushes the brush into the drawing surface and pulls it away. Included in the mechanism is the flow of water from the reservoir into the brush tip. This design uses a 28BYJ-48 stepper motor we had in stock in the lab. Precision of movement is not needed.

    In the design of the stage, I left some gaps that could be used to attach an effector shield. We also looked at an effector designed by Simone Boasso previously. With these starting points, I designed this:

    I created a larger guide for the vertical movement, with a perpendicular slide for alignment. Further, the center is hollow for the water tube and the end is customized for our brush. The gaps are adjustments for wiggle and printer calibration.

    By this point, my 3D printing game was on point. This piece was a breeze. The rail bearings fit snugly and the belt fastener holds.

    The stage shield fits in nicely without adhesive. I think I could have added a cross at the connection points to make it one step better although there are no problems with that operation whatsoever.

    The gear is a little loose so I think it would be better tapped with a small tightening screw. The effector column sticks a little when moving down. We discovered that a sharp corner in the sleeve was catching in the gear teeth. When that corner was filed down, operation smoothed significantly.

    Not the easiest Grasshopper graph to navigate...

    Brushes can easily be swapped out or have a new stem printed to adjust for length and thickness. Alternate shields can also easily be swapped into place. With that, the whole is together. Next I need to return to Arduino and GRBL...

    Update: Overall, the weakest point of the project has been the effector. The water flow through the pump is out of control, the movement is fragile and inconsistent. I think it is time for upgrades.

    I first tested a smaller tube diameter for the water flow, which holds the water well and only releases it when the pump motor is engaged. Upon deciding to use this pipe, I quickly made some acrylic fittings to connect the smaller pipe to the larger pump nipples.

    Next, I lined the 3D print to brush connection with a generous coating of hot glue. Water is no longer leaking out of this connection.

    The connection of the 3D printed gear to the stepper motor in the stage has to this point been press fit. I added space for a nut and bolt tightener and increased the thickness of the gear wall. The added thickness reduces the chance of missing a connection, and space was available in the existing design. Print again. 18 minutes later...

    The connection is strong, but when I adequately tighten the bolt, the gear skews towards that connection point. Then, when working within the system, at a certain rotation it creates too much friction and the motor loses steps working through that rotation.

    After discovering and diagnosing the skewing, I mirrored the nut and bolt connection to the other side. When I tighten the bolts on the motor, I do so slowly and evenly. The gear action is smoother and stronger. I like it.

    I noticed in testing that the Z axis was still missing steps. I wrote some G-code to test the Z movement. This code roughly varies the movement speed and height changes and adds some pauses. While working through this, I adjusted the V-ref on the corresponding stepper driver, adjusting between 0.400 and 0.600. Around 0.500 the stepper seemed to do as desired without getting warm to the touch. Above 0.550 the motor starts to get slightly warm, not enough to be alarming in my use case, so I will not hesitate to increase the voltage more as a first level of troubleshooting if I notice more missed steps.

    G90
    G0 Z10
    G4 P0.5
    Z0
    Z20
    Z10
    Z20
    Z0
    G4 P0.5
    Z15
    Z20
    Z25
    G1 Z15
    Z7
    Z9
    Z2
    Z20
    Z30
    G4 P0.5
    Z10
    Z25
    Z4
    G0 Z9
    Z19
    Z4
    Z24
    Z9
    Z2
    Z10
    Z0
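
    For repeated testing, the same kind of Z exercise could also be generated parametrically rather than typed by hand. A minimal sketch of that idea (the heights, feeds and dwell values here are arbitrary examples, not the values used above):

    import random

    def z_exercise(heights, dwell_s=0.5, feeds=(300, 800)):
        """Build a Z-axis test: absolute millimeter moves, mixed rapid/feed moves, occasional dwells."""
        lines = ["G90", "G21"]  # absolute positioning, millimeters
        for i, z in enumerate(heights):
            if i % 5 == 0:
                lines.append(f"G4 P{dwell_s}")  # pause
            if i % 2 == 0:
                lines.append(f"G0 Z{z}")  # rapid move
            else:
                lines.append(f"G1 Z{z} F{random.choice(feeds)}")  # feed move at a varied speed
        return "\n".join(lines)

    print(z_exercise([10, 0, 20, 10, 25, 4, 30, 2, 19, 9]))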

    We had always thought of the possibility of swapping in different mediums, and in fact I designed both the stage shield and effector stem to be quickly and easily swappable. Plug and play. With that in mind, I made a first version of this sharpie effector. There is a spring hidden in the top that helps keep consistent pressure on the invariably warped drawing surface. This first version has some correctable tolerance issues (in spite of successful partial test prints).

    Up to this point, I am still working with some non-parametrically designed objects, which are now beginning to hamper my development. It might take longer initially to set up a parametric system for generating a model as complex as this, but afterwards making adjustments could not be easier, whereas with the current setup even the most minor change may take tens of minutes. Until I find time to make adjustments and reprint, this one will work.

    Download project files

    I will post links to resources I have found helpful here.

  • BYJ48 Stepper Motor : Instructable post about the effector stepper.
  • 28BYJ-48 – 5V Stepper Motor Datasheet : Datasheet.
  • Customizable Wheel for 28BYJ-48 stepper motor : A better gear connection.
  • OpenDot Effector, 2015 : Work by Simone Boasso printed for our prototype.
  • Walking up some Mega RAMPS (and falling down).

    TBD.

    Regarding FTDI drivers: Finding the matching FTDI drivers for the Arduino you are using can be frustrating, to say the least. If you are fortunate, your computer will immediately recognize your board's serial port. If not, and you have a board with the CH340G USB/serial chip, i.e. Arduino x China, start here; then, if using macOS and still having connection issues, try this Sparkfun post. If neither works, I highly recommend you go directly to learning which USB/serial chip your board is using. The forums are full of people taking wild swings without this one crucial piece of information.

    About the day the Arduino Uno CNC Shield arrived in the mail...

    TBD.
    SPOILER: It's wonderful.

    Custom g-code writer, why and how.

    When writing Chinese characters, there are rules. You either abide by them or you will be revealed as a fraud. When we first tested the machine, we tried several online g-code generators (link?). One thing we did not find was a good way to specify stroke ordering and directionality. I was excited to show some Chinese people the first successful tests of our machine, and everyone criticized the machine's stroke order!

    I started to investigate the possibility of writing code we could use to generate g-code which follows the basic principles for patterning strokes in Chinese characters. This was also a good excuse, as the thought of being able to control g-code throughput has intrigued me since 3D printing a couple months ago. Throughout the past couple months, I have had fun learning the Grasshopper plugin for Rhinoceros, so I decided to start with Jens Dyvik's Bark Beetle, a parametric toolpath plugin for Grasshopper. Unfortunately, there is a plugin compatibility issue with Firefly on MacOS I could not solve, and the graph was beyond my comprehension level. I will return to it soon.

    Searching around, I found two fantastic webposts: the instructable Make Awesome 3D Geometry by Programming CNC-code by Siemenc, and G-Code Writer for 2D Shape Milling. The latter was mostly outdated by Grasshopper updates, but the concepts Andy Payne (the same person behind Firefly) discusses in the video are still relevant; I ultimately built systems based on these concepts. I started with Siemenc's open-source Grasshopper code for controlling ShopBots, only slightly outdated, and adapted it to our purposes. By now, the original Grasshopper code is nearly unrecognizable.

    First, you need to develop single-line strokes of the characters. We have used Inkscape and Illustrator tools to generate vectors from rasters, drawn strokes into the computer using a mouse, and created single-line vectors from typefaces. All are imperfect methods and I hope to find a better way. Handwriting on the left and the Rhinoceros single-line version on the right.

    Each character, which is a set of strokes in succession, is input into a single Grasshopper curve module. Right click on the module, choose "Set Multiple Curves", then, in the Rhinoceros window, choose each set of strokes in succession from first to last. The ordering can be double checked in the preview cluster.

    If you are not using all the characters, turn off the extra modules. If you want more, duplicate a module and connect its output to the Clean module. Hold down shift when you do this, or it will replace all the previous connections. The character modules must be connected to the Clean module in order, first to last.

    Next, check that the machine moves along each stroke in the correct direction. This can be recognized by reading the combination of red (machine repositioning) and green (Z movements) lines. This is easier to see in perspective than top down.

    If your robot is painting in the wrong direction, it can be changed with the "Flip" or "Dir" command in Rhinoceros. "Dir" shows helpful arrows.

    Change this slider to review stroke sequence. If strokes in a character are out of sequence, correct the order of the list in the corresponding character module, for instance, reselect. Bear in mind 0 is first.

    Nobody in the world wants to wait on a slow machine. For your speed needs, dishuBot can book it, assuming you understand its limitations. First, tune the stepper drivers and ensure the mechanical operation is excellent; this is discussed in another section. Using GRBL and the CNC Shield, the most important thing to understand is serial communication delay. GRBL queues around 10 commands; if the machine exhausts that queue faster than the serial link can refill it, it will stutter while it awaits transmission. The RepRap Wiki covers this buffering behavior.

    A precision setting adjusts the segmentation of the curves, as all the G-code is currently parsed in segments. The lower the precision setting, the more segmented the polylines that make up the curves will be. Each point along the polyline is converted to a machine coordinate: more points, more coordinates, longer code. However, the list length shown is a placeholder reference that does not mean much; it is just there for feel. The real thing to do might be to calculate the length of every set of queued commands, predict how swiftly the machine will draw it, and factor in the serial delay. Then the precision adjustment may not be necessary, as the code would automatically adjust precision to the serial transmission rate vis-à-vis movement speed.
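
    To get a feel for that calculation, here is a rough sketch (the baud rate and bytes-per-line figures are assumptions for illustration; only the 5000 mm/min feed comes from my setup):

    def will_stutter(segment_lengths_mm, feed_mm_per_min=5000,
                     baud=115200, bytes_per_line=25, queue_depth=10):
        """Rough check: does the machine finish queued segments faster than serial can refill them?"""
        send_time_s = bytes_per_line * 10 / baud       # ~10 bits on the wire per byte
        feed_mm_per_s = feed_mm_per_min / 60.0
        for i in range(0, len(segment_lengths_mm), queue_depth):
            batch = segment_lengths_mm[i:i + queue_depth]
            draw_time_s = sum(batch) / feed_mm_per_s   # time to execute the queued batch
            refill_time_s = len(batch) * send_time_s   # time to stream the next batch
            if draw_time_s < refill_time_s:
                return True                            # the machine outruns the serial link
        return False

    # Many very short segments (a very fine segmentation) are the risky case:
    print(will_stutter([0.1] * 200))  # True with these assumptions
    print(will_stutter([5.0] * 200))  # False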

    Maximum movement speed is calibrated within the Arduino code or GRBL; I set the maximum to 5000 mm/min. I am sure it can move faster, so I intend to return to this setting in the future. Commands following G0 are jogging movements and move the machine at the maximum speed automatically. Commands following G1 are painting strokes; these are dictated by the move speed slider. Near the top of the G-code, I have G94, which sets the machine to a units-per-minute feed rate, and G21 for millimeters.

    Before using any G-code with dishuBot, move the stage near the left side of the wheel axis and set the tip of the effector medium just touching the painting surface. Safe Z height is the height the effector moves up to between strokes. Z depth is the distance the effector moves down on strokes; this could be thought of as stroke pressure.
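
    Putting the last two paragraphs together, this is roughly the shape of G-code the writer emits for a single stroke, sketched here in Python with made-up coordinates and example values for safe Z, Z depth and feed:

    SAFE_Z = 10.0      # height the effector lifts to between strokes (example value)
    Z_DEPTH = -2.0     # distance the effector presses down on a stroke (example value)
    PAINT_FEED = 1500  # painting speed from the move speed slider (example value)

    HEADER = ["G94", "G21", "G90"]  # units-per-minute feed, millimeters, absolute coordinates

    def stroke_gcode(points):
        """One painted stroke: jog to the start, press the brush down, feed along the points, lift."""
        x0, y0 = points[0]
        lines = [f"G0 Z{SAFE_Z}",                 # lift to the safe height
                 f"G0 X{x0} Y{y0}",               # reposition at maximum speed
                 f"G1 Z{Z_DEPTH} F{PAINT_FEED}"]  # press the brush onto the surface
        lines += [f"G1 X{x} Y{y}" for x, y in points[1:]]  # paint along the polyline
        lines.append(f"G0 Z{SAFE_Z}")             # lift before the next stroke
        return lines

    print("\n".join(HEADER + stroke_gcode([(0, 0), (40, -5), (80, -12)])))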

    Pump timing sets the amount of time the pump runs before stopping. M03 is relay low (pump on). G4 is a delay and P(number) is the timing. M05 is relay high (pump off). Calibrate this to your surface material and effector medium. Post pump delay is an optional machine delay to let the effector medium saturate; during this time, the machine will do nothing before resuming painting. Both settings are in seconds. If not using the pump, switch off the Pump Code panel to the left: select it, press the space bar once, and select "Disable". If you want to re-enable the pump, follow the same procedure and press "Enable".

    These are the points in the code the pump is activated. Currently, the pump is not active on the first stroke, assuming the brush is saturated before beginning the job. If you would like to activate the pump on the first character, add a connection between the top First Value and Curve holding shift. Disconnect it to deactivate.

    This code searches the G-code for the starting points of each character and inserts the four lines of pump control code there.
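
    A minimal sketch of that search-and-insert step in Python (the Grasshopper graph does the real work; the character start indices and timing values here are assumed inputs):

    def insert_pump_code(gcode_lines, char_start_indices, pump_s=0.1, post_pump_s=1.0):
        """Insert the four pump-control lines before the first stroke of each character."""
        pump_block = ["M03",                  # relay low: pump on
                      f"G4 P{pump_s}",        # run the pump for pump_s seconds
                      "M05",                  # relay high: pump off
                      f"G4 P{post_pump_s}"]   # optional pause while the brush saturates
        starts = set(char_start_indices)
        out = []
        for i, line in enumerate(gcode_lines):
            if i in starts:
                out.extend(pump_block)
            out.append(line)
        return out

    # Skipping the first character, since the brush is saturated before the job starts:
    # painted = insert_pump_code(lines, char_start_indices[1:])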

    Finally, there is a panel on the right side of the graph with all the G-code. Right click on this panel, set up a stream destination with the *.nc format, and enable stream contents. Now a file will be automatically updated as modifications are made in the Grasshopper graph.

    And the file can next be loaded into GRBL for painting.

    Update: I found two mobile apps that work well for drawing vectors on a touch screen, i.e. Chinese calligraphy, which can then be used in the Grasshopper graph.

  • CREATE : Pros: cheaper, imports directly into Rhino. Cons: restrictive drawing canvas; sometimes does not export vectors in the PDF (?).
  • Vector Touch : Pros: customizable canvas (i.e. set up to machine dimensions), imports to scale, exports as SVG. Cons: price; file encoding unrecognizable by Rhino (MacOS tested).
    With either of these apps, you can transfer drawings from your fingertips to dishuBot in just a couple of steps. This eliminates the tricky process of converting image scans or typefaces to single-line strokes for the G-code script. Both apps require in-app purchases to export vector files. CREATE can export PDF via email. Vector Touch can export SVG and PDF via email. Rhinoceros can read most PDF files. I will break down my use of each app.

    CREATE exports PDFs which can be imported directly into Rhinoceros. In my tests, Grasshopper remembers stroke order and direction; I recommend you double check this each time regardless. Follow the steps in the GIF below: open the app, press the plus button, choose the curvy line, remove the fill, draw, then press the square-with-the-angle button, choose vectors and email the file to yourself. This application does not have good options for changing the artboard beyond the screen size. Sometimes CREATE does not properly export the vectors; when this happens, I return to the application and export again, or draw another shape on the screen and then export. I have yet to isolate a pattern in the error.

    Vector Touch exports PDFs which Rhinoceros does not like. You will need to open these in another application, e.g. Illustrator, and save the file; then Rhinoceros will have no problem. Vector Touch can also export SVG; however, Rhinoceros does not have the capability to import this type, so open the file in another program, e.g. Inkscape, and save it in an acceptable format for Rhinoceros (there are many). Vector Touch, in my testing, is also capable of exporting the vectors into Grasshopper in sequence and direction. You can set a canvas size in the app and import the drawing to scale in Rhinoceros. And this app supports standard iOS pan and zoom gestures. Choose the plus sign to make a new drawing, or just use a custom canvas already in place. Then choose the canvas dimensions in mm. The maximum Y dimension is 722.4mm, but if you set the X dimension to some factor of the machine's dimensions, it will scale the canvas. Choose the freeform curve draw tool, set the fill to none, and draw characters. Pinch and two-finger pan inputs are present. Press the square-with-the-arrow button and export the file via email.

    In Rhinoceros, import the vector drawing. Depending on how you tune the import settings, the drawing will likely be out of scale. One quick fix is to select all the curves and use the boundingbox command, which will draw a rectangle fitted to the extents of the group of curves.

    Then use the move command, choose the upper left hand corner and type in 0, press return. This will move the curves to the machinable area. Next use the scale command, choose the upper left corner of the box followed by the upper right corner and type in your desired width. Now the characters are properly scaled for the robot.

    If using VECTOR TOUCH, it is possible to export to scale if you setup the canvas correctly. In my example, I set the X dimension half of the machine working area then fixed the scale on import. The drawing will be imported in the X+, Y+ zone of Rhinoceros working area. Simply move the drawing to the X+, Y- zone. In this case, Y -1444.8mm.

    If using CREATE, the vectors seem to come into Rhinoceros in sequence if you follow these steps. Right click on a Character input in Grasshopper. Choose "Set Multiple Curves".

    Next, draw a window around the character in Rhinoceros. Good to go. I recommend double checking direction and sequence as described above.

    If using VECTOR TOUCH, do the same in reverse. First select the character in Rhinoceros with a bounding box, then choose "Set Multiple Curves" on a character input in Grasshopper.

    Update (11 June) : Based on more experience using VECTOR TOUCH with volunteers, it sometimes reverses the order of the characters, perhaps depending on the file format used to prepare the file for Rhinoceros. Either check the stroke order on the first character input, or reverse the modules as needed.

    As mentioned in the effector section, I now want a relationship in the code between the length of the character drawn and the pump timing. For this, I use Grasshopper to define the pause length between setting the relay low and high, i.e. the amount of time the pump is activated, which correlates to the amount of water pumped. Perhaps in the future, a sensor could be used to make a closed-loop system from the brush reservoir level to the amount and timing of water pumped.

    M03 G4 P0.072172 M05 G4 P1.0
    M03 G4 P0.100443 M05 G4 P1.0
    M03 G4 P0.117917 M05 G4 P1.0
    M03 G4 P0.108700 M05 G4 P1.0
    M03 G4 P0.027820 M05 G4 P1.0

    For some time now, I have had trouble understanding how to correctly handle lists in Grasshopper. I went to The Grasshopper Primer and read the sections on lists and trees. Upgraded abilities. The new section looks like this: on the left side, what was once one component per character is now one component for any number of characters.

    Pump timing was changed to a water factor. Finding the correct factor, without a sensor embedded in the effector stem to check water level, is done by feel, based on a couple of test runs, the drawing medium, brush pressure and other environmental variables. When calibrated, the water factor variable is a divisor of the total stroke length of the previously drawn character. I have not had a chance to test it on the machine yet; however, the programming is effective in generating the code and inserting the variable pump timing correctly within the movement code.
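
    A sketch of the arithmetic behind the water factor (the factor and stroke length below are made-up values, picked so the dwell lands near the numbers shown above):

    def pump_block(prev_char_stroke_length_mm, water_factor, post_pump_s=1.0):
        """Pump time scales with how much was just drawn: dwell = stroke length / water factor."""
        pump_s = prev_char_stroke_length_mm / water_factor
        return ["M03", f"G4 P{pump_s:.6f}", "M05", f"G4 P{post_pump_s}"]

    # e.g. 180 mm of total stroke length with a water factor of 2500
    print(pump_block(180.0, 2500))  # the dwell comes out near the P0.07xx values shown above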

    This chunk of programming finds the insertion points for the pump timing. Again, it is largely simplified from the previous version based on a better understanding of data trees and lists. Essentially, a list is a set of data that would be considered a single branch of a tree. Branches (each containing a list) can bifurcate, potentially many times, forming complicated trees.

    At this level, I think I will next focus my efforts on transitioning to Firefly to control the outputs instead of GRBL, and start prototyping the "Yong" strokes and character mapping. I recently exhibited the machine at the 2017 Maker Festival in Xi'an and welcomed tremendous feedback, including suggestions to enable twisting and tilting of the brush, to add movement sensors (which Neil also suggested), encouragement to explore different styles of strokes, and wide enthusiasm for the lofty goal of digitizing the Chinese character library in strokes.

    Currently, I am interested in these next steps of development:

    Future intention: There are a limited number of basic strokes which comprise all Chinese characters. Some systems of identification find up to 37 different strokes, while others have distilled it down to just eight. I intend to optimally code brush movement for each of the basic strokes in code modules, then map configurations according to the desired character output. The Chinese character "Yong", meaning "permanence", is often used to teach the eight basic strokes (because they are all present). With defined rulesets for stroke ordering and knowledge of the strokes (components), code may be able to scan characters and find the stroke composition automatically. I foresee a problem in that simply using this approach might output characters that are too "machined". Perhaps there is a secondary set of rules that create unique distortions based on stroke proximities, i.e. water/ink saturation points, momentum, or even the emotional content of what is being written. Another further step could be to use the stroke and character library as a basis for the code to learn stylistic distinctions of people and adapt those across its own stroke modules.

    Download project files

    I will post links to resources I have found helpful here.

  • G-code Wiki!! - Do not screw around, this is your number one resource for writing g-code.
  • Stroke Order for Chinese Characters : Background for Chinese handwriting.
  • The Importance of Strokes in Chinese Characters : Background of the eight basic strokes that compose most Chinese characters.
  • Relative vs Absolute Coordinates : An explanation of use cases for each in writing g-code.
  • Treesloth : Advanced set of tools for managing Grasshopper lists.
  • Grasshopper list searching : Index searches require an exact match of all components within an indexed item while Member index searches work simply on values.
  • CREATE App : Draw vectors on your iDevice and export to Grasshopper G-code writer
  • Vector Touch : Another way to draw vectors on iDevice and export to Grasshopper G-code writer
  • Jump : Index

    J.travis Russett © 2017
    Creative Commons License All the work contained within is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0) License
    You may remix, tweak, and build upon my work non-commercially, as long as you credit me and license your new creations under the identical terms.