week 5

3D Scanning and Printing

ASSIGNMENTS:

  • Testing the design rules of the 3D printer as a group project.
  • 3D Scan an object (and optionally print it).
  • 3D Print an object that cannot be made subtractively.

3D scanning

The assignment was done using two techniques in the lab. The methods are:

  1. Light Coding
  2. Photogrammetry

 

The details, benefits, and drawbacks of each are described below.

3D scanning:

light coding

The 'Light Coding' method uses a Microsoft Kinect as the scanner and software to process the data. In this case I have used a Kinect Model 1414 with Kinect for Windows SDK 1.7 and KScan3D as the scanning software.

Kinect for Windows SDK 1.7 can be found here and KScan3D is freely available here.
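Under the hood, the light-coding sensor returns a depth image that the scanning software back-projects into a 3D point cloud. As an illustration only (not what KScan3D actually runs), here is a minimal back-projection sketch in Python with NumPy, assuming nominal Kinect v1 intrinsics rather than calibrated ones:

    import numpy as np

    # Nominal Kinect v1 depth-camera intrinsics; these are assumed values,
    # a real pipeline would read the calibration from the SDK.
    FX, FY = 594.2, 591.0   # focal lengths in pixels
    CX, CY = 339.5, 242.7   # principal point in pixels

    def depth_to_point_cloud(depth_mm):
        """Back-project a 480x640 depth image (in millimetres) to Nx3 points in metres."""
        h, w = depth_mm.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        z = depth_mm.astype(np.float64) / 1000.0   # mm to m
        x = (u - CX) * z / FX                      # pinhole camera model
        y = (v - CY) * z / FY
        points = np.dstack((x, y, z)).reshape(-1, 3)
        return points[points[:, 2] > 0]            # drop pixels with no depth reading

    # Quick check with a synthetic frame; a real frame would come from the Kinect SDK.
    fake_depth = np.full((480, 640), 1500, dtype=np.uint16)   # everything at 1.5 m
    print(depth_to_point_cloud(fake_depth).shape)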

apparatus

  A Kinect sensor (Model 1414), a USB cable for the Kinect, and a laptop

For this method, I scanned myself using the Kinect and KScan3D as a proof of concept.

methodology

The Kinect was kept static while I rotated myself as precisely as I could. The software was set up for a batch process, so it captured a scan at each step.

  Due to the lack of a tripod socket on Model 1414, it was placed on a stool for capturing.

  If the Kinect is connected properly, you will see a window like this.

  Test the RGB and depth channels from the Kinect and recalibrate if necessary.

  Set up the batch scan and hit Scan while moving slowly.

  It will gather scans according to my position in front of the sensor.


  Now align all the scans.

  Combine scans into one and export in STL format.
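KScan3D handles the alignment and merging internally. Just as a reference for what aligning the scans means, a pairwise ICP registration can be sketched with the Open3D library (file names are placeholders and the threshold is an arbitrary choice; this is not the algorithm KScan3D itself uses):

    import numpy as np
    import open3d as o3d

    # Two overlapping scans exported from the scanner (placeholder file names).
    source = o3d.io.read_point_cloud("scan_01.ply")
    target = o3d.io.read_point_cloud("scan_02.ply")

    # Point-to-point ICP: find the rigid transform that best aligns source to target.
    threshold = 0.02   # maximum correspondence distance in metres (assumed)
    result = o3d.pipelines.registration.registration_icp(
        source, target, threshold, np.identity(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())

    print("fitness:", result.fitness)
    source.transform(result.transformation)   # move source into target's frame
    o3d.io.write_point_cloud("scan_01_aligned.ply", source)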

The mesh is open in some places and not yet ready to be printed, so it is taken into Autodesk Meshmixer for editing and clean-up.

 

Autodesk Meshmixer can be downloaded for free from here.

  From the Edit menu, the mesh is made solid and cut from the bottom.

The final model.

Final 3D printable watertight mesh can be downloaded from here.
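Before slicing, it is worth confirming that the exported mesh really is watertight. Here is a small sketch with the trimesh Python library (file names are placeholders; this mirrors, rather than replaces, the Meshmixer clean-up above):

    import trimesh

    # Load the combined scan exported from KScan3D (placeholder file name).
    mesh = trimesh.load("kinect_scan.stl")
    print("watertight before repair:", mesh.is_watertight)

    # Close small holes and fix flipped normals; large openings such as the
    # cut at the bottom still need manual treatment in Meshmixer.
    trimesh.repair.fill_holes(mesh)
    trimesh.repair.fix_normals(mesh)

    print("watertight after repair:", mesh.is_watertight)
    mesh.export("kinect_scan_repaired.stl")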

In this process, if the motion of either the sensor or the subject is not smooth, there is room for error and an inaccurate scan result, although it takes very little time to scan. These drawbacks were addressed in the next method.

3D scanning:

photogrammetry

Photogrammetry uses photographs of the same object taken from different angles with an imaging device, preferably under uniform lighting, to generate a 3D model. This process takes a lot of computing time and power and requires accurately captured images. Agisoft PhotoScan and Autodesk ReCap 360 are used in this section. Both have their own advantages.

apparatus

  A Nikon D5500 camera and a Lenovo Z51-70 laptop (Core i5, 16 GB RAM)

The first test was done in Agisoft PhotoScan using Gautam as the subject, taking 87 photos of him from all possible angles; his fatigue after a certain number of shots made the data somewhat inaccurate to process. Still, it came out to be a good result, and the process is described here.

First, photos were taken all around him; then Agisoft PhotoScan was fired up.

  The interface looks like this. We need to head to the 'Workflow' menu.

  Import the photos and then hit Align photos from the Workflow menu.

  After that, one should see a similar screen, where the photos are aligned and a point cloud is generated. The next step is to generate the dense cloud, from which the mesh will be built.

  This is how it should look.

  From the same menu, click generate mesh to get the mesh created from the cloud.

  The last step is to create a texture out of it for display. Then export the model in OBJ format so that both the mesh and the texture are written out.

  I imported the mesh to Autodesk Maya 2016 to check for quality. It has 1 million mesh faces.

  Mesh Preview with texture.

  Amount of detail (click on the image for a larger version).
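The same Workflow steps (align, dense cloud, mesh, texture, export) can also be driven from PhotoScan's built-in Python console instead of the menus. A rough outline is below; the method names and defaults vary between PhotoScan versions and the photo paths are placeholders, so treat it as a sketch rather than the exact procedure I used:

    import PhotoScan

    doc = PhotoScan.app.document
    chunk = doc.addChunk()

    # Add the photos (placeholder paths), then run the Workflow steps in order.
    chunk.addPhotos(["photos/IMG_%03d.JPG" % i for i in range(1, 88)])
    chunk.matchPhotos()        # feature detection and matching
    chunk.alignCameras()       # camera poses and sparse point cloud
    chunk.buildDenseCloud()    # dense point cloud
    chunk.buildModel()         # mesh from the dense cloud
    chunk.buildUV()
    chunk.buildTexture()       # texture for display
    chunk.exportModel("gautam_scan.obj")
    doc.save("gautam_scan.psz")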

I discarded this method, despite the very accurate mesh it generates, because of the time it takes and because it uses 100% of the CPU and locks up my laptop. I then looked for a cloud-computing solution, and Autodesk ReCap 360 came to notice.

The texture and the file are too heavy to host on this server, but you can access the file on Dropbox here.
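If a lighter copy of the roughly one-million-face mesh is needed for previewing or sharing, quadric decimation can reduce the face count while keeping most of the detail. Here is a sketch with Open3D (the target face count and file names are arbitrary choices, not part of my actual workflow):

    import open3d as o3d

    # Load the textured OBJ exported from PhotoScan (placeholder file name).
    mesh = o3d.io.read_triangle_mesh("gautam_scan.obj")
    print("faces before:", len(mesh.triangles))

    # Reduce to roughly 100k triangles; the detail loss is gradual and mostly
    # invisible at preview scale.
    light = mesh.simplify_quadric_decimation(target_number_of_triangles=100000)
    print("faces after:", len(light.triangles))

    o3d.io.write_triangle_mesh("gautam_scan_light.obj", light)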

autodesk recap 360

When I found Autodesk ReCap 360, I was traveling to Mumbai. Before this week, our mentor, Ohad, advised us to capture heritage using the latest cutting-edge technology, which can later be used for conservation and preservation. Having a minor in architectural conservation and a background in the arts, I chose to take this up and scanned two artifacts from the Chhatrapati Shivaji Maharaj Vastu Sangrahalaya, Mumbai.

 

One is a Shiva statue from the 11th century CE and the other a Mahishasuramardini, also from the 11th century CE.

I took photos all around the structures, around 10 degrees apart: one set from eye level, one from below eye level, and one from above eye level. So the photos look like this after arranging them and deleting some bad ones.

Then I went to https://recap360.autodesk.com/ and logged into my account to use the service. The steps are as follows.

  The welcome screen prompts to add photos.

  Add all the photos; 91 in my case.

  Set the mesh format and resolution. Pressing Create queues the job on the cloud server, which sends a mail when it is completed.

I received this file.

 

I made another project using this too. You can find that here.

Upon opening the file in Rhino, I saw some unwanted mesh around the model, like this.

  The model

  The model in rendered view with texture

I joined the mesh and made a plane in the XY plane to cut the mesh.

Using MeshTrim, I cut the mesh from the bottom at that plane.

  The trimmed model in rendered view without texture. Only mesh in default light.

  The animation shows the detail in 360 degrees.

  The model was analysed in Meshmixer, found to have only one open edge, and subsequently fixed into a solid.
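For reference, the Rhino MeshTrim step can also be approximated in code. Here is a rough sketch with trimesh that cuts the ReCap mesh at the XY plane (file names and the plane height are assumptions; the open bottom still has to be closed afterwards, as I did in Meshmixer):

    import trimesh

    # Load the ReCap 360 mesh (placeholder file name) and keep only what is
    # above the XY plane, mirroring the MeshTrim step in Rhino.
    mesh = trimesh.load("shiva_recap.obj")
    cut = mesh.slice_plane(plane_origin=[0.0, 0.0, 0.0],
                           plane_normal=[0.0, 0.0, 1.0])   # keep the +Z side

    # The slice leaves an open edge at the bottom, just like in Rhino; it still
    # needs to be closed before the mesh is a printable solid.
    print("watertight:", cut.is_watertight)
    cut.export("shiva_trimmed.stl")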

[  3d printing group work  ]

3D printing:

I decided to print the model I created from the scan. The scan has such intricate details on the front and the back, and the erosion over time makes it impossible to replicate exactly by subtractive means. This is why I chose to make it through 3D printing.

We have an Ultimaker 2+ in our lab, so I downloaded Cura 2.4.2 from this link and then opened the 3D-printable file I made in the last section in Cura.

  Scaled the model to 200% to make it a bit bigger than it came out of the scan.

  Red portions show the overhangs in the model.

  The details will be visible in the actual 3D print.

  Cura settings.

  All 658 Layers in Cura.

3D printing time was 4 hours; the details are as follows.
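Cura's red overhang highlighting comes from a simple face-normal test: any face that slopes past the support angle relative to the vertical gets flagged. Here is a rough reimplementation of that check with trimesh and NumPy (the 45-degree threshold and the file name are assumptions, and this is not Cura's actual code):

    import numpy as np
    import trimesh

    mesh = trimesh.load("shiva_trimmed.stl")
    threshold_deg = 45.0   # typical Cura overhang angle (assumed)

    # The overhang angle is measured from the vertical, so a face needs support
    # when its normal lies within (90 - threshold) degrees of straight down.
    down = np.array([0.0, 0.0, -1.0])
    cos_to_down = mesh.face_normals @ down
    needs_support = cos_to_down > np.cos(np.radians(90.0 - threshold_deg))

    print("overhanging faces:", int(needs_support.sum()), "of", len(mesh.faces))
    print("overhang area: %.1f square mm" % mesh.area_faces[needs_support].sum())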

To test a print that can only be made additively, I tried the marching cubes algorithm and made the file.

Some random lines were drawn in 3D to make a closed mesh through marching cubes.

The marching cubes algorithm was run.

The mesh was baked into Rhino and exported to STL to be 3D printed.

The file can be downloaded from here.
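The Rhino setup itself is not reproduced here, but the same idea, turning a scalar field around some 3D lines into a closed triangle mesh, can be sketched in Python with scikit-image's marching cubes and trimesh. The line positions, grid resolution, and iso-level below are made-up values for illustration:

    import numpy as np
    from skimage import measure
    import trimesh

    # Scalar field: the value at each voxel is the distance to the nearest of a
    # few "random" line segments, so an iso-surface at a small level wraps the
    # lines in a closed, tube-like mesh.
    n = 64
    grid = np.stack(np.meshgrid(*[np.linspace(0, 1, n)] * 3, indexing="ij"), axis=-1)
    segments = [((0.2, 0.2, 0.2), (0.8, 0.8, 0.5)),
                ((0.8, 0.2, 0.6), (0.3, 0.7, 0.8))]

    def dist_to_segment(p, a, b):
        a, b = np.asarray(a), np.asarray(b)
        t = np.clip(((p - a) @ (b - a)) / ((b - a) @ (b - a)), 0.0, 1.0)
        return np.linalg.norm(p - (a + t[..., None] * (b - a)), axis=-1)

    field = np.min([dist_to_segment(grid, a, b) for a, b in segments], axis=0)

    # Extract the iso-surface at "radius" 0.08 and export a printable STL.
    verts, faces, _, _ = measure.marching_cubes(field, level=0.08)
    mesh = trimesh.Trimesh(vertices=verts, faces=faces)
    print("watertight:", mesh.is_watertight)
    mesh.export("marching_cubes_test.stl")

A surface extracted this way is closed by construction, which is exactly what makes it printable additively but very hard to machine subtractively.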

print result
