
Posts Tagged ‘kyle mcdonald’

From Kinect to MakerBot Guide at Make: Projects

Head on over to the Make: Projects site to catch Brian Jepson’s From Kinect to MakerBot guide-in-progress, which shows how to transform captured Kinect data into the STLs you need to 3D print with your MakerBot Thing-O-Matic.

His guide picks up where Kyle McDonald’s excellent 3D Printing with Kinect post leaves off — a tutorial that takes you from the initial STL you create with Kyle’s KinectToSTL tool through to a scaled-down, MakerBot-printable STL. Bonus points for using only open source tools for the entire chain!

Those looking to learn more about the Open Kinect movement should check out the Open Kinect Project (offers MeetUps in certain cities) and consider attending conferences such as Art && Code 3D: Kinect-Hacking Conference, on October 21-23rd at Carnegie Mellon University.

A few people have asked for easier-to-install binary releases of Kyle’s KinectToSTL tool, compiled for Windows and Linux as well. There are some complications that take fussing over, not least the code changes needed to build against the latest OpenFrameworks release (according to Matt and Kyle). If you accomplish this work, drop a comment back here and we will happily trumpet your triumph to the world.

 

Popular Mechanics Features MakerBot Thing-O-Matic in 10 Coolest DIY Projects From Maker Faire 2011

Popular Mechanics: "Top 10 Coolest DIY Projects from MakerFaire"

The MakerBot crew was in attendance at MakerFaire 2011 – Bay Area this past weekend — where the MakerBot Thing-O-Matic made quite a splash. Check out a little of what Popular Mechanics has to say about our latest DIY 3D printer kit:

MakerBot Industries is a pure example of the maker ethos: Not only has the company created an interesting machine, but its machine’s sole purpose is to create things. Provided with the right instructions, it can print just about any 3D shape into plastic.

It can be hard to explain why exactly someone might want a 3D printer, so the crew put together a demo: a facial scanning system housed inside a 7-foot-tall dome built by maker Michael Felix, the joints of which were created with a 3D printer. Inside, fellow maker Kyle McDonald uses the infrared camera bar from a Microsoft Kinect, along with software he wrote himself, to capture 3D models of attendees’ faces, which are then printed into plastic statues. The whole process, from flesh to plastic, takes only 45 minutes.

 

3rd Ward MakerBot Make-A-Thon Photos!


The 3rd Ward MakerBot Make-A-Thon was awesome! Special thanks to MakerBot User Group New York for showing off their designs and introducing 3rd Ward members to MakerBot. Kyle McDonald used the interior of Michael Felix’s geodesic dome for his 3D Photo Booth. If Kyle scanned you, be sure to find yourself on Thingiverse here. Stay tuned for our next MakerBot User Group meeting and a continued collaboration between MakerBot and 3rd Ward.

 

NYC Elevation Map by kylemcdonald – Machine Halts FTW

Now...let’s just see...where am I? Brooklyn!

MakerBot Artist-in-Residence Kyle McDonald is already an expert in mapping objects with point clouds, given his extensive work on structured light and 3D scanning with a Kinect via Processing. But he is also exploring another kind of mapping — grabbing topological[1] data and 3D printing it with a MakerBot Thing-O-Matic.

He’s not the first to attempt this — Lake and Mountain Topography and Mt Everest are other notable examples from the Thingiverse kingdom — but his hack makes the topography easier to read.

He uses the G-code command M1 (machine halt[2]) to pause the print and give you time to switch filaments. The result is a two-color topography where the critical elevation gain above the bottom filament color is easier to read. It works great for sea-level topography, where the transition maps to an easy-to-understand reference elevation. (Though I’d love to see some sea trenches!)

This hack only works on a “tethered” MakerBot printing from ReplicatorG at the moment, but it is worth the hassle:

  • The topographical data is freely available here.
  • Generate the STL with Kyle’s custom app here.
  • Then search through the G-code for the first G1 command at the Z-height where you’d like the transition, and drop an “M1” on its own line just before the preceding M101 command.
  • I’d suggest sandwiching a “20 mm up, 20 mm down” waiting-position script around the M1 command to keep the nozzle from oozing while you fiddle with the filament, but code to taste.
  • If there is an active M101 command you will be extruding while you switch filament! You need to find a sweet spot between an M103 and an M101 (see the sketch below).
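
If you’d rather not hunt through the file by hand, the same edit can be scripted. Here is a rough, hedged sketch in Python (not Kyle’s tool): the file names, the transition height, and the 20 mm lift are placeholder assumptions, and it expects skeinforge-style G-code where M103/M101 turn the extruder off and on.

    #!/usr/bin/env python
    # Hypothetical sketch of the manual edit described above: find the first G1
    # move at (or above) a chosen Z height and drop an M1 pause just after the
    # most recent M103 (extruder off), so nothing extrudes while you swap filament.
    import re

    TRANSITION_Z = 10.0                 # placeholder: mm where the color change happens
    INFILE, OUTFILE = "map.gcode", "map_two_color.gcode"   # placeholder file names

    lines = open(INFILE).readlines()
    last_m103 = None                    # index of the most recent "extruder off" command
    insert_at = None

    for i, line in enumerate(lines):
        if line.startswith("M103"):
            last_m103 = i
        elif line.startswith("G1"):
            z = re.search(r"Z(-?\d+\.?\d*)", line)
            if z and float(z.group(1)) >= TRANSITION_Z and last_m103 is not None:
                insert_at = last_m103 + 1   # pause right after the extruder shuts off
                break

    if insert_at is None:
        raise SystemExit("no G1 move found at the requested Z height")

    # Lift 20 mm, pause, then come back down to roughly the transition height,
    # mirroring the "waiting position" suggestion above so the nozzle doesn't ooze.
    pause = ["G1 Z%.2f F400\n" % (TRANSITION_Z + 20), "M1\n", "G1 Z%.2f F400\n" % TRANSITION_Z]
    lines[insert_at:insert_at] = pause

    open(OUTFILE, "w").writelines(lines)
    print("inserted M1 pause at line %d of %s" % (insert_at + 1, OUTFILE))

Print the modified file tethered from ReplicatorG, swap filament during the pause, and resume.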

Kyle was standing somewhere just above the thumb when he took this picture!

  1. From topology to topography
  2. That might be vanishing as an option soon
 

3rd Ward MakerBot Make-A-Thon!

Get scanned by MakerBot Artist-in-Residence Kyle McDonald at the 3rd Ward MakerBot Make-A-Thon

MakerBot User Group New York – it’s time to meetup!

The 3rd Ward MakerBot Make-A-Thon is your chance to meet other MakerBot users, print awesome 3D objects and even a 3D portrait of yourself.

MakerBot Artist-in-Residence Kyle McDonald will be presenting his work turning the Xbox Kinect into a 3D Scanner. He will scan you in his 3D Photo Booth, then print you using the MakerBot.

Learn more about Kyle McDonald’s Xbox Kinect hack in this 3rd Ward blog post.

Bring your MakerBots and your favorite objects for a MakerBot user show and tell.

Giveaways! There will be giveaways, including LEDs!

MakerBot User Group New York

3rd Ward MakerBot Make-A-Thon
Saturday, May 14th
2:00 pm – 6:00 pm
3rd Ward – 195 Morgan Avenue, Brooklyn, NY 11237
FREE EVENT!

 

MakerBot User Group New York!

Thanks to all the MakerBot Operators who came to the Botcave last night for our MakerBot User Group New York meeting. We had a blast meeting all of you, hearing your MakerBot experiences, and sharing pizza and LEDs!

MakerBot users explored the Botfarm and were 3D scanned and printed with Kyle McDonald’s 3D Photo Booth. Highlights also included a MakerBot user show and tell, and a demonstration by our first Artist-in-Residence, Marius Watz.

Stay tuned to the blog for future MakerBot User Group events!

 

Interview with Kyle McDonald, creator of ThreePhase

Kyle McDonald, structured light scanning researcher and Taylor Goodman, creator of the Makerbot 3D Scanner v1.0 Kit

Taylor Goodman recently interviewed Kyle McDonald, the creator of ThreePhase.[1] ThreePhase is an open source 3D scanning program that creates a point cloud using a projector and a camera, a system termed “structured light.”
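
As a rough illustration of what that decoding involves (a hedged sketch, not Kyle’s actual code), the “wrapped” phase at each pixel can be recovered from three camera photos of sinusoidal fringe patterns shifted by 120 degrees. The image file names and the quality threshold below are placeholder assumptions, and the phase unwrapping and depth calibration that ThreePhase performs are omitted.

    # Sketch of the wrapped-phase step in three-phase structured light decoding.
    import numpy as np
    from PIL import Image

    def load_gray(path):
        # Load an image as a grayscale float array in [0, 1].
        return np.asarray(Image.open(path).convert("L"), dtype=np.float64) / 255.0

    # Placeholder file names: three photos of the scene under fringes shifted by 120 degrees.
    i1, i2, i3 = (load_gray(p) for p in ("phase1.jpg", "phase2.jpg", "phase3.jpg"))

    # Wrapped phase in (-pi, pi] for every pixel; unwrapping it into a continuous
    # surface (and scaling it to depth) is where the rest of the work lives.
    phase = np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)

    # A per-pixel quality measure proportional to the fringe modulation, handy for
    # masking out pixels the projector never reached.
    modulation = np.sqrt(3.0 * (i1 - i3) ** 2 + (2.0 * i2 - i1 - i3) ** 2)
    mask = modulation > 0.1        # placeholder threshold

    print("decoded wrapped phase for", int(mask.sum()), "reliable pixels")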

Taylor Goodman (TG): What inspired you to write the three phase decoder? What did you want yourself or others to use it for?

Kyle McDonald (KM): Last year I heard about a choreographer who was using the DAVID line laser scanner with some Lego motors for creating 3D stop motion animations. Every scan took about one minute, so it was a painstaking process. The idea of 3D stop motion had me interested, but one minute per frame seemed way too long!
I wrote the three phase decoder because I wanted to make something that would allow more people to experiment with 3D stop motion using resources already available to them.

TG: How did you write it, i.e. how and where did you learn everything about structured light 3D scanning? How long did it take?

KM: When I started I didn’t have a laser, so I thought I’d use a projector instead. This helped me realize that the fastest scanning technique would record information about every pixel simultaneously rather than one line per frame. I created a scanner based on binary subdivision, which takes around 8-10 frames, one or two orders of magnitude faster than laser scanning.
I thought that was the best you could do, but then I discovered “structured light”: an umbrella term for the kind of projector-camera 3D scanning system I was working with. I learned that people had been doing this for decades, and they even had some techniques for using fewer frames to 3D scan motion in real time (like three phase scanning).

While the initial subdivision scanner took a few days from idea to demo, the three phase scanner took a few weeks. It wouldn’t have happened without some code by Alex Evans from Media Molecule (ported to Processing by Florian Jennett), and some great research papers from Song Zhang at Iowa State University (who worked on the technology for Radiohead’s “House of Cards” music video).

After that initial development, it’s been over a year of brainstorming with people, reading papers, and completely rewriting the code multiple times. And there’s still a lot of work to do.

TG: Why did you make it open source, completely free to anyone interested?

KM: I think we need as many people as possible doing what they love. Open source is one way to get people the tools they need when they wouldn’t otherwise have access.

I’d also like to help overcome the novelty associated with new technologies. More people using 3D scanning means more diverse perspectives on what can be done with the technology.

TG: Any further/deeper applications for ThreePhase?

KM: One advantage of a 3D printer is that you can resize while you replicate. I’d love to see some very large things scanned and made very small, or vice versa.

There is also a malleability to 3D scanned data that isn’t available in the physical world. It’d be nice to have some objects that are combinations of averages of multiple items. Maybe a Katamari ball made from real household objects?

TG: What is the future of structured light 3D scanning? What do you wish to see happen next with it?

KM: While the three phase technique comes primarily from academic papers and is relatively unencumbered by patents, I have an idea for a completely open source scanning technique that would allow a more flexible trade-off between accuracy and speed. It could be adapted to high resolution still scans or lower resolution motion scans.

TG: Can ThreePhase be improved?  Why and how?

KM: The Processing three phase decoder is meant more as a demo, and lacks a lot of features for automated decoding. There is a more robust version built with OpenFrameworks where the majority of my work is focused.

But for both apps I’d really like to get some people involved who have a stronger mathematics and computer vision background. The decoding process is currently very naive and doesn’t account for the various parameters inherent to the projector and camera.

TG: Did you ever imagine this would be a project worked on at Makerbot or another 3D printing company?

KM: I’m regularly surprised by the ways this work is used. So, in a way, this makes perfect sense.

TG: What is your next big project?

KM: I’ve been looking into projection mapping, or using a projector to augment a scene. This is another technique that is currently thriving on its novelty, so I’m working on a toolkit that makes it easier to scan and projection map arbitrary scenes and objects. There’s also a specific interactive environment I’d like to create with this technique that plays with our understanding of light sources.

For updates, see my website http://kylemcdonald.net or follow me @kcimc

  1. The list of projects on Kyle’s site is truly amazing; you should really check it out.
 
 