Homework for these weeks:
I did the tasks described below as part of our group assignment on machine building. The text below is therefore largely reproduced from our machine page.
In preparation for installing the Gestalt node library, I followed the tutorials here and here to install wxGestalt. The installation source for wxGestalt is this wxGestalt ZIP file. The README.md file inside told me that I first had to install PySerial, unidecode, jsonpickle and wxPython. Additionally, I needed to install pip: sudo easy_install pip.
thomas ~/Desktop/FabAcademy/Week9/Gestalt $ git clone https://github.com/openp2pdesign/wxGestalt.git
Cloning into 'wxGestalt'...
remote: Counting objects: 446, done.
remote: Total 446 (delta 0), reused 0 (delta 0), pack-reused 446
Receiving objects: 100% (446/446), 83.29 KiB | 0 bytes/s, done.
Resolving deltas: 100% (286/286), done.
Checking connectivity... done.
thomas ~/Desktop/FabAcademy/Week9/Gestalt $ cd wxGestalt
thomas ~/Desktop/FabAcademy/Week9/Gestalt/wxGestalt $ git submodule update --init gestalt
Submodule 'gestalt' (https://github.com/imoyer/gestalt.git) registered for path 'gestalt'
Cloning into 'gestalt'...
remote: Counting objects: 573, done.
remote: Total 573 (delta 0), reused 0 (delta 0), pack-reused 573
Receiving objects: 100% (573/573), 13.00 MiB | 809.00 KiB/s, done.
Resolving deltas: 100% (252/252), done.
Checking connectivity... done.
Submodule path 'gestalt': checked out 'b5848c8fe97fd390c5ddb8e3b79a68b62f851c65'
To control a machine from within wxGestalt, you need to install wxPython, a widget toolkit for Python. Although you can download an installer for Mac OS X from this place, you will be unable to install it on Mac OS X El Capitan. When you double-click the dmg icon on the Desktop, the Mac installer will pop up but display at least one of the following errors:
The first error is easy to solve. It comes up when you try to install an application from a DMG file whose vendor is not, or no longer, signed by Apple. You can work around this safety measure by loosening the security settings in System Preferences/Security & Privacy.
I followed the procedure from this post.
The second error appears because the file structure in the *.dmg file is too old and not appropriate for El Capitan.
The solution for this is to modernize the file structure. How this can be accomplished is detailed here.
Note: the author of the tutorial has put the *.dmg file in the user's home directory. It is fine to put it somewhere else, e.g., on the Desktop. If you do put it somewhere other than your home directory, do not forget to adjust the file paths in the shell commands accordingly.
I paraphrase here the shell commands necessary to rebuild the file structure and to install the new DMG:
# base workdir
mkdir ~/wxpython_elcapitan
cd ~/wxpython_elcapitan
# download the wxPython dmg
curl -L "http://downloads.sourceforge.net/project/wxpython/wxPython/3.0.2.0/wxPython3.0-osx-3.0.2.0-cocoa-py2.7.dmg?r=http%3A%2F%2Fwww.wxpython.org%2Fdownload.php&ts=1453708927&use_mirror=netix" -o wxPython3.0-osx-3.0.2.0-cocoa-py2.7.dmg
# mount the dmg
hdiutil attach wxPython3.0-osx-3.0.2.0-cocoa-py2.7.dmg
# copy the dmg package to the local disk
mkdir ~/wxpython_elcapitan/repack_wxpython
cd ~/wxpython_elcapitan/repack_wxpython
cp -r /Volumes/wxPython3.0-osx-3.0.2.0-cocoa-py2.7/wxPython3.0-osx-cocoa-py2.7.pkg .
# unmount the dmg
dmgdisk="$(hdiutil info | grep '/Volumes/wxPython3.0-osx-3.0.2.0-cocoa-py2.7' | awk '{ print $1; }')"
hdiutil detach ${dmgdisk}
# prepare the new package contents
mkdir ~/wxpython_elcapitan/repack_wxpython/pkg_root
cd ~/wxpython_elcapitan/repack_wxpython/pkg_root
pax -f ../wxPython3.0-osx-cocoa-py2.7.pkg/Contents/Resources/wxPython3.0-osx-cocoa-py2.7.pax.gz -z -r
cd ~/wxpython_elcapitan/repack_wxpython
# prepare the new package scripts
mkdir ~/wxpython_elcapitan/repack_wxpython/scripts
cp wxPython3.0-osx-cocoa-py2.7.pkg/Contents/Resources/preflight scripts/preinstall
cp wxPython3.0-osx-cocoa-py2.7.pkg/Contents/Resources/postflight scripts/postinstall
# delete the old package
rm -rf ~/wxpython_elcapitan/repack_wxpython/wxPython3.0-osx-cocoa-py2.7.pkg
# build the new one:
pkgbuild --root ./pkg_root --scripts ./scripts --identifier com.wxwidgets.wxpython wxPython3.0-osx-cocoa-py2.7.pkg
# put the package on Desktop, and clean workdir
mv ~/wxpython_elcapitan/repack_wxpython/wxPython3.0-osx-cocoa-py2.7.pkg ~/Desktop/
cd ~
rm -rf ~/wxpython_elcapitan
# install it ! it will ask for your password (to become superuser/root)
sudo installer -pkg ~/Desktop/wxPython3.0-osx-cocoa-py2.7.pkg -target /
# EOF
Gestalt needs to be installed in the Python path; I did not use pip for this.
I installed the Eclipse plugin PyDev.
For our PixelPlanter, we want to read the recipe for the machine from a 16x16 pixel bitmap image. The two shades in the image represent the states "seeds" and "no seeds" (see our project description). As Python does not come with an image processing library, I tried to install either OpenCV for Python or the Python Imaging Library. I first tried to install OpenCV for Python from "http://www.pyimagesearch.com/2015/06/15/install-opencv-3-0-and-python-2-7-on-osx/", but failed. After that, I successfully installed the Python Imaging Library (PIL). The project's homepage is here. I used this tutorial: http://stackoverflow.com/questions/9070074/how-can-i-install-pil-on-mac-os-x-10-7-2-lion However, before this worked, I needed to do the following:
xcode-select --install
ln -s /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.9.sdk/System/Library/Frameworks/Tk.framework/Versions/8.5/Headers/X11 /usr/local/include/X11
sudo pip install pil
The process is detailed in this tutorial: http://stackoverflow.com/questions/19532125/cant-install-pil-after-mac-os-x-10-9.
For the sake of comfort, I decided to use the Eclipse IDE and pydev from now on and to skip wxPython.
I used the following python fragment under Eclipse to read the image:
import svgParse.svg2path as svg2path
import mtm.Utils as utils
import mtm.fabMachine as plotter
from PIL import Image
if __name__ == '__main__':
# Import an image
img = Image.open("/Users/thomas/Desktop/FabAcademy/Week9/pixelPlanter_template.png")
pixels = img.load() # create the pixel map
for i in range(img.size[0]): # for every pixel:
for j in range(img.size[1]):
print str(pixels[i,j])
img.show()
...and got the following error messages from Eclipse:
Traceback (most recent call last):
File "/Users/thomas/Documents/workspace/PixelPlanterControl/mtm/main.py", line 39, in
pixels = img.load() # create the pixel map
File "/Library/Python/2.7/site-packages/PIL/ImageFile.py", line 164, in load
self.load_prepare()
File "/Library/Python/2.7/site-packages/PIL/PngImagePlugin.py", line 381, in load_prepare
ImageFile.ImageFile.load_prepare(self)
File "/Library/Python/2.7/site-packages/PIL/ImageFile.py", line 231, in load_prepare
self.im = Image.core.new(self.mode, self.size)
File "/Library/Python/2.7/site-packages/PIL/Image.py", line 37, in __getattr__
raise ImportError("The _imaging C module is not installed")
ImportError: The _imaging C module is not installed
The problem is due to a missing library, _imaging.so (or _imagingmodule.so), that belongs to the Python Imaging Library. I found more information here. I first checked whether a file named "_imaging.so" already existed on my hard disk. Moreover, I read here that I would need to add the file to "app.yaml", so I also looked for files with that name.
icds-MacBook-Pro:~ thomas$ locate _imaging.so
/Applications/Inkscape.app/Contents/Resources/lib/python2.7/site-packages/PIL/_imaging.so
/Applications/Inkscape.app/Contents/Resources/lib/python2.7/site-packages/sk1libs/imaging/_imaging.so
/Library/Python/2.7/site-packages/PIL/_imaging.so
/Users/thomas/Imaging-1.1.7/build/lib.macosx-10.11-intel-2.7/_imaging.so
icds-MacBook-Pro:~ thomas$ locate app.yaml
/Users/thomas/.p2/pool/plugins/org.python.pydev.customizations_4.5.5.201603221110/templates/google_app_engine/ask_login/app.yaml
/Users/thomas/.p2/pool/plugins/org.python.pydev.customizations_4.5.5.201603221110/templates/google_app_engine/hello_webapp_world/app.yaml
/Users/thomas/.p2/pool/plugins/org.python.pydev.customizations_4.5.5.201603221110/templates/google_app_engine/hello_world/app.yaml
icds-MacBook-Pro:~ thomas$
Our new FabLab guru, Daniele Ingrassia, pointed out that I might be running into trouble because I was using an old version of PIL. He pointed me to its successor, which goes by the name Pillow. I installed it with:
icds-MacBook-Pro:~ thomas$ sudo pip install pillow
Password:
The directory '/Users/thomas/Library/Caches/pip/http' or its parent directory is not owned by the current user and the cache has been disabled. Please check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
The directory '/Users/thomas/Library/Caches/pip' or its parent directory is not owned by the current user and caching wheels has been disabled. check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
Collecting pillow
Downloading Pillow-3.2.0-cp27-cp27m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl (3.0MB)
100% |████████████████████████████████| 3.0MB 355kB/s
Installing collected packages: pillow
Successfully installed pillow-3.2.0
This was the solution. With Pillow, my modified seed of a Python Gestalt node control program worked in Eclipse right away. I wrote a small function, "plantImage(image, velocity)", that accepts an input bitmap image and a velocity value as parameters. So far, it does nothing more interesting than move to location (i, j) whenever the value 255 is encountered in the pixel matrix. There are many things left to do: move the Z-axis, check whether the input image qualifies as a bitmap of the right dimensions, scale the coordinates so that they fit the bed, and so on. To be continued...
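The early plantImage function itself is not reproduced here. A minimal sketch of the behavior described above, with the Gestalt move command stood in by a plain callable (the `move` parameter and the returned coordinate list are illustrative additions, not part of the original program):

```python
def plantImage(image, velocity, move=None):
    """Visit every pixel with value 255 ("seeds") of a bitmap image.

    `image` is anything with PIL's load()/size interface; `velocity`
    would be handed to the Gestalt node and is unused in this sketch.
    `move` stands in for the Gestalt move command; the visited
    coordinates are collected and returned for inspection.
    """
    pixels = image.load()
    visited = []
    for i in range(image.size[0]):      # for every pixel:
        for j in range(image.size[1]):
            if pixels[i, j] == 255:     # "seeds" state
                if move is not None:
                    move([i, j], 0)     # move the head to (i, j)
                visited.append((i, j))
    return visited
```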
We decided to use Ilan Moyer's Gestalt hardware and software framework as suggested in the FabAcademy class.
The Python code was developed using the PyDev plugin for Eclipse in the Eclipse IDE.
Image Processing
Our function plantImage_grayscale accepts an RGB image suitable for the Python Imaging Library (Pillow), converts it into true grayscale, then scales it down to 16x16 pixels. A pixel in the image is equivalent to a rectangular area ('box') within the bed. The higher the pixel value, the fewer seeds will be deployed on the corresponding box.
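The mapping from gray value to seed amount can be sketched as follows. The function name is hypothetical, but the ranges (0..255 gray values mapped onto 16 seed levels) and the inversion (brighter pixel, fewer seeds) follow the description above:

```python
def shadeToSeedLevel(shade):
    """Map an 8-bit gray value (0..255) to one of 16 seed levels (0..15).

    The brighter the pixel, the fewer seeds: white (255) yields level 0,
    black (0) yields the maximum level 15.
    """
    return (255 - shade) * 16 // 256
```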
Machine control strategy
The extruder head is positioned by two stepper motors, one for the x-direction, the other for the y-direction.
Each stepper motor is attached to a hardware Gestalt node (see photographs). The hardware Gestalt nodes are
linked serially. The first one is attached to the FabNet bridge board that Karsten Nebe built following this tutorial.
The FabNet bridge board is attached to the computer via USB.
On the software side, I created a new Python project named PixelPlanterControl and imported the Gestalt libraries.
I prepared a custom machine definition by copying and modifying "fabMachine.py".
The hardware Gestalt nodes are controlled by an xyNode object.
Although an xyNode can potentially drive two physical Gestalt nodes (i.e., two stepper motors) simultaneously, I have not used this feature.
For our machine, serialized movements are adequate. Both nodes are set to the same velocity, which is one of the parameters of the function plantImage_grayscale.
Main function
The main function main.py is shown below.
The code for our custom software control for the PixelPlanter is imported, as is the Python Imaging Library, which is used for the basic image manipulation tasks.
The serial library needs to be imported to facilitate the serial communication between the Arduino and the Gestalt library.
First, the port to which the Arduino is attached is reserved. In the code fragment below a dummy name is used. The source image is opened, and our custom function
plantImage_grayscale from "pixelPlanter.py" is called with the image, the velocity of the axes, the box width, offsetX, offsetY, and the serial connection as parameters.
import mtm.pixelPlanter as pixelPlanter
import serial
from PIL import Image

if __name__ == '__main__':
    # Open a serial port for communicating with the Arduino
    arduinoCom = serial.Serial("/dev/tty.jPhone-WirelessiAP")
    img = Image.open("/Users/thomas/Desktop/FabAcademy/repository/fablabkamplintfort/students/125/week9_files/images/fabLab_logo.png")
    pixelPlanter.plantImage_grayscale(img, 7, 18, 0, 0, arduinoCom) # Arguments: image, velocity of the axes, box width, offsetX, offsetY, serial connection
I wrote a first rendition of the function (see the screenshot below) to test my hypotheses on how the library would steer the machine.
When observing the movements of the threaded rods, I realized that they moved in patterns different from those I was expecting.
The explanation was simple: I had misunderstood the available documentation of the move() command in Gestalt
and used relative instead of absolute coordinates as parameters for move().
I reworked the code (see below) to use absolute positioning. The function plantImage_grayscale() expects the following arguments as input:
Algorithm for the x-y-movement of the extruder head
The head is initially positioned at the starting location (offsetX, offsetY) in the machine coordinate system. After finishing the planting
procedure on the entire bed, it returns to this starting location. It is convenient to place this location in the middle of the upper-left box.
The head moves through the bed in a meandering trajectory: from left to right in sixteen discrete steps, then one step down, then
sixteen discrete steps from right to left, then one step down again. This pattern is repeated eight times. In each discrete step, the head stops in the middle of
a box. Prior to each movement, the program waits for the respective stepper motor to stop. The extent of each move is determined by the parameter boxWidth.
At each stop, the extruder stepper motor is moved as many steps as necessary to release a number of seeds proportional to the gray value of the
respective pixel in the source image. This functionality is delegated to the function releaseSeeds(stages, pixels[x, y], arduinoCom). The first parameter
is the base Gestalt object that represents the machine. The second parameter is the gray value of the pixel (x, y) in the source image that corresponds
to the current box in the bed. The third parameter is a Serial object used for communicating with the Arduino board to which the extruder stepper motor is attached.
As we have since moved to a new machine design, the exact number of steps required is still an open question.
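The releaseSeeds helper itself is not shown in this write-up. Under the assumption that it simply forwards the pixel's shade to the Arduino as a single byte (which is what the Arduino sketch further below expects), a sketch could look like this:

```python
def releaseSeeds(stages, shade, arduinoCom):
    """Send one gray value to the Arduino that drives the extruder.

    `stages` (the Gestalt machine object) is accepted for symmetry with
    the other machine calls but is not needed for the extruder itself;
    the Arduino turns the received byte into a number of stepper steps.
    """
    arduinoCom.write(chr(shade))  # one byte: the shade of the current pixel
```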
# The bed consists of 16x16 rectangular spots ('boxes'). A spot either receives seeds, or it remains empty.
# This is controlled by the input bitmap image. A shade in the range 0..255 is equivalent to the density
# of seeds in a square within the bed. The interval is mapped onto the range 0..15, as we assume that
# one cannot distinguish more nuances in the planting.
# The parameter boxWidth is a scaling factor that maps the image coordinate system onto the machine coordinate system.
# A stepper motor that is attached to an Arduino board is used for controlling the extruder head that emits the seeds.
# The communication between the program and the Arduino is established via a serial connection.
def plantImage_grayscale(image, velocity, boxWidth, offsetX, offsetY, arduinoCom):
    imageSize = 16, 16 # The bed is divided into 16x16 boxes, and so is the source image
    startX = offsetX # Initial position of the head
    startY = offsetY
    gsImage = image.convert('L') # Convert the image into a grayscale image
    thumbImage = gsImage.resize(imageSize) # Scale the image down to 16x16 pixels
    pixels = thumbImage.load() # Create the pixel map
    thumbImage.save("/Users/thomas/Desktop/testImage.png")
    stages = virtualMachine(persistenceFile = "pixelPlanter.vmp")
    stages.xyNode.setVelocityRequest(velocity) # Universal speed for the combined node that handles the x-y-plane
    # Move the head to its starting location. This is not necessarily the origin of the machine coordinate system.
    # Ideally, this location lies in the middle of the leftmost box in the first row.
    stages.move([startX, 0], 0)
    waitForMovementX(stages)
    stages.move([startX, startY], 0) # Absolute coordinates: keep x at startX while moving y
    waitForMovementY(stages)
    releaseSeeds(stages, pixels[0, 0], arduinoCom) # Seed the starting box before moving on
    # Move the head and plant the seeds
    # In each iteration, the head moves from left to right, then down, then from right to left, then down again
    # While the head moves in x-direction, it stops in the center of every box and releases seeds
    currentX = startX # Holds the current position of the head
    currentY = startY
    for i in range(1, 9): # Eight double rows cover the sixteen rows of the bed
        # Move the head to the right across row 2*i-2
        for step in range(1, 16):
            currentX = currentX + boxWidth
            stages.move([currentX, currentY], 0)
            waitForMovementX(stages)
            releaseSeeds(stages, pixels[step, 2*i-2], arduinoCom)
        # Move the head one step down in y-direction and seed the rightmost box of the next row
        currentY = currentY + boxWidth
        stages.move([currentX, currentY], 0)
        waitForMovementY(stages)
        releaseSeeds(stages, pixels[15, 2*i-1], arduinoCom)
        # Move the head to the left across row 2*i-1
        for step in range(1, 16):
            currentX = currentX - boxWidth
            stages.move([currentX, currentY], 0)
            waitForMovementX(stages)
            releaseSeeds(stages, pixels[15-step, 2*i-1], arduinoCom)
        if i < 8: # Move one step down into the next row, unless the last row is done
            currentY = currentY + boxWidth
            stages.move([currentX, currentY], 0)
            waitForMovementY(stages)
            releaseSeeds(stages, pixels[0, 2*i], arduinoCom)
    # Move the head back to the starting position (x first, then y)
    stages.move([startX, currentY], 0)
    waitForMovementX(stages)
    stages.move([startX, startY], 0)
    waitForMovementY(stages)
For machine control, the computer is expected to control through a serial connection
a stepper motor that is linked to an Arduino Uno board. Whenever the computer sends
the shade of a pixel to the Arduino, the stepper motor shall move a number of steps
that is proportional to the shade. The exact relationship between shade and the number
of steps still needs to be decided about in an empirical study.
However, in order to make sure that a serial connection between Arduino and the Computer has
been established, a rudimentary "handshake" is made upon setting up the Arduino sketch. In
function setup(), the sketch waits for the receipt of byte 255 over the serial line (see
function waitForSerialAndReply()), then answers with byte 255. It then continues with function
loop(). There, the program keeps waiting for incoming bytes that it interprets as the shade
of a pixel. It moves the attached stepper motor that emits seeds from the extruder.
The Arduino sketch below is modified from an example by Tom Igoe. It makes use of the
Stepper motor library that comes with the Arduino IDE.
/*
 * ExtruderMotorControl by Thomas Laubach
 * Modified from Tom Igoe's example sketch
 * "Stepper Motor Control - one revolution"
 */
#include <Stepper.h>

int inByte = 0;
const int stepsPerRevolution = 200; // Change this to fit the number of steps per revolution of your motor

// Initialize the stepper library on pins 8 through 11
Stepper myStepper(stepsPerRevolution, 8, 9, 10, 11);

void setup()
{
  myStepper.setSpeed(60); // Set the speed of the stepper motor to 60 rpm
  Serial.begin(9600); // Start the serial port at 9600 bps
  while (!waitForSerialAndReply()) {} // Wait until the computer has made contact
}

void loop()
{
  int shade = 0;
  int numSteps = 0; // Empirically found number of steps for our extruder, dependent on the shade
  if (Serial.available() > 0) // Grab the next available byte from the serial line
  {
    shade = (byte) Serial.read();
    numSteps = 1.5 * shade; // TODO: find the necessary number of steps
    myStepper.step(numSteps);
    delay(500);
  }
}

boolean waitForSerialAndReply()
{
  Serial.println("Waiting for the computer to send a 255 byte...");
  while (Serial.available() == 0) { } // Wait for a byte on the serial line
  do // Inspect the next byte from the serial line...
  {
    inByte = Serial.read();
    Serial.print("\nByte received from the computer: ");
    Serial.print(inByte, DEC);
  }
  while (inByte != 255); // ...as long as it is not byte 255
  Serial.println("Received a 255 from the computer. The computer is now serially linked to the Arduino board");
  Serial.write(255); // Reply with byte 255
  delay(2000);
  return true;
}
I extended the main function in Python with a basic "handshake" mechanism between the Arduino and the computer.
The Python program first opens a serial connection to the attached Arduino Uno. Then it empties the serial input
and output buffers and waits for three seconds. After that, it sends byte 255 across the serial line in order
to tell the Arduino that it is there. It then waits for an answer from the Arduino until it receives byte 255 on
the serial line. Only when this handshake has taken place does it open an image file and process it.
Whenever the Arduino's USB plug is reconnected, the device name is likely to change, which means that
the name in the program needs to be replaced. This is annoying and could be automated.
# -*- coding: utf-8 -*-
'''
Created on April 18th, 2016
@author: Thomas Laubach
'''
import mtm.pixelPlanter as pixelPlanter
import serial
from PIL import Image
from time import sleep

if __name__ == '__main__':
    # Open a serial port for communicating with the Arduino
    arduinoCom = serial.Serial("/dev/tty.usbmodemFD1321", 9600) # Speed 9600 baud
    # Send a byte across the serial port in order to establish a connection to the Arduino
    arduinoCom.flushInput()
    arduinoCom.flushOutput()
    sleep(3) # Wait for three seconds
    arduinoCom.write(chr(255)) # Send a byte to the Arduino
    print "Sent byte 255 to the Arduino."
    # Wait for the Arduino to respond with another 255, then move on
    print "Now waiting for the Arduino to answer with byte 255"
    newByte = 0
    while newByte != 255:
        newByte = ord(arduinoCom.read()) # Read one byte from the serial port
        print "Received a new byte from the Arduino: "
        print newByte
        sleep(0.30)
    print "Received byte '255' from the Arduino. Going on..."
    # Open an image, then plant it
    img = Image.open("/Users/thomas/Desktop/FabAcademy/repository/fablabkamplintfort/students/125/week9_files/images/fabLab_logo.png")
    pixelPlanter.plantImage_grayscale(img, 7, 18, 0, 0, arduinoCom) # Arguments: image, velocity of the axes, box width, offsetX, offsetY, serial connection
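The device-name annoyance could be reduced by scanning the attached serial devices instead of hard-coding the name. A sketch (the helper name and the "usbmodem" pattern, typical for an Arduino Uno on a Mac, are assumptions); the device names would come from pySerial's serial.tools.list_ports.comports():

```python
def findArduinoPort(devices, pattern="usbmodem"):
    """Return the first device name containing `pattern`, or None.

    `devices` is a list of device names, e.g. the .device attributes of
    the entries returned by serial.tools.list_ports.comports().
    """
    for device in devices:
        if pattern in device:
            return device
    return None
```

The hard-coded serial.Serial("/dev/tty.usbmodemFD1321", 9600) call could then become serial.Serial(findArduinoPort([p.device for p in list_ports.comports()]), 9600).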
As an experiment, I converted a FabLab logo for the PixelPlanter (see the image below). For the final machine, we simplified our approach so that it takes a halftone image, i.e., a binary image, as input. Where the machine encounters a white pixel, seeds are planted; where it encounters a black pixel, nothing is seeded.
For a movie of the Pixel Planter in action and several photographs, please refer to our machine page.
Assignment for today: automate your machine.

Closed loop (you tell the machine where to go, and the machine makes the necessary steps; slower) is much better than open loop (you tell the machine the number of steps to go, but you don't know if the machine reaches the position exactly; faster). Open loop vs. closed loop: http://www.atp.ruhr-uni-bochum.de/rt1/syscontrol/node4.html

Classic control theory. Proportional control measures the difference between where you want to be and where you are: out = A * (want - is). Problem: the output can never go to zero, as you always need an error term to drive the system. PID = proportional + integral + derivative: the proportional term measures the error relative to where you want to be, the derivative term measures the rate of change, and the integral term adds up the total error, thereby creating a base line so that the error can go to zero. Error term (drives you to the goal) + derivative term (how fast) + integral term (makes the base line). You can buy a PID controller, but you can also implement one in any of the microcontrollers we are using. Model predictive control is beyond the scope of this class. You want to regulate the controls, not simply turn them on or off!

TinyG is a G-code controller. You need to configure it: it needs to know which motor controls what and with which degrees of freedom. Problem: you need to change the code if you want to add an axis, etc. (they put a lot of state into the machine).

Gestalt is a software framework for virtual control. It defines machines in software; you write Python programs; the machine is a real-time network; it makes the machine controls reconfigurable. One way to do this week's assignment: use the package that every FabLab has got (FabNet) and write Gestalt programs. In Gestalt, the node code handles node communication. The Python library requires an RS-485 chip, which we should have.

You need a compound node to call movements on all axes at once. Kinematics maps machine steps to physical space: e.g., 50 steps = the machine moves one centimeter along the y-axis. Serial kinematics: x moves, then z. Parallel kinematics: x and y move simultaneously. wxPython could be used to write a GUI for Gestalt. The potentiometer on the node hardware sets the motor current. The supplied motor can run at two amps, but then you need to add a heat sink.

Options for this year's FabAcademy: 1. either use Gestalt in Python, or use a stripped-down Gestalt; 2. use the Mods framework to plan what you want to do, then use Gestalt; 3. (forgotten); 4. for the future: eliminating nodes. Gestalt is stable, Mods is experimental. FabNet: RS-485 with hardware timing. For a new potentiometer, you would need a new (hardware) node in the network.
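The proportional and PID ideas from the notes above can be sketched in a few lines of Python; the class name, gains, and time step are illustrative, not values tuned for any machine:

```python
class PID(object):
    """Minimal PID controller: out = Kp*error + Ki*integral + Kd*derivative.

    The proportional term reacts to the current error (want - is), the
    integral term accumulates the total error (the "base line" that lets
    the steady-state error go to zero), and the derivative term reacts
    to the rate of change of the error.
    """
    def __init__(self, kp, ki, kd, dt=0.01):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, want, have):
        error = want - have                               # the "want - is" term
        self.integral += error * self.dt                  # accumulated error
        derivative = (error - self.prev_error) / self.dt  # rate of change
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)
```

With kp alone this is plain proportional control; adding ki is what removes the residual error that a pure P controller can never get rid of.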