The work that has come out of the Met MakerBot Hackathon has been genuinely groundbreaking, especially since it has inspired people who weren’t even with us a couple of weekends ago.
Matthew Plummer-Fernandez is one of those artists. I posted about him during the Hackathon, and it generated a lot of interest in his ideas of “remixing” and “sampling” physical objects.
The video above, “We Met Heads On,” is a new piece from Matthew, drawing on what has so far been captured with 123D Catch and uploaded to Thingiverse. Here’s the description of the video from its Vimeo page:
This video titled ‘We Met Heads On’ is my remix of the 3D scan hackathon at the Metropolitan Museum of Art in NY organised by Makerbot. The public were invited to scan artifacts to then modify and 3d print derivatives. The files ended up on Thingiverse, giving me access to the scans, in particular ‘decimation study – met heads’ by scotta3d which is a derivative from another thingiverse user tbuser. To continue the lineage of derivatives, I have placed the low-polygon heads from scotta3d into a Processing sketch that distorts the meshes in realtime in response to sound and outputs the modified stl objects. The soundwave is analysed from the streaming audio and used to force the mesh to twist to the strength of the soundwave. Performed and recorded in real-time.
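Matthew’s actual sketch is written in Processing, and we don’t have his code. But as a rough illustration of the technique he describes (analyze the loudness of an audio window, twist the mesh vertices by an amount proportional to that loudness, then write out the modified geometry as STL), here is a minimal Python sketch. All function names and details here are our own assumptions, not his implementation:

```python
import math

def rms_amplitude(samples):
    """Root-mean-square loudness of one window of audio samples."""
    if not samples:
        return 0.0
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def twist_mesh(vertices, strength):
    """Rotate each (x, y, z) vertex about the vertical Y axis by an
    angle proportional to the audio strength and the vertex's height,
    so louder sound twists the head harder toward the top."""
    twisted = []
    for x, y, z in vertices:
        angle = strength * y
        c, s = math.cos(angle), math.sin(angle)
        twisted.append((c * x + s * z, y, -s * x + c * z))
    return twisted

def triangle_to_ascii_stl(tri):
    """One facet of an ASCII STL file. The normal is left at zero;
    most tools recompute normals from the vertex winding order."""
    lines = ["facet normal 0 0 0", "  outer loop"]
    for x, y, z in tri:
        lines.append(f"    vertex {x:.6f} {y:.6f} {z:.6f}")
    lines += ["  endloop", "endfacet"]
    return "\n".join(lines)
```

In a real-time loop you would call `rms_amplitude` on each incoming audio window and feed the result to `twist_mesh` every frame, exactly the feedback Matthew describes: the stronger the soundwave, the harder the mesh twists.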
We are going crazy over this video right now. Imagine what could be done with some of these ancient figures, animating them (re-animating them?) to appropriate music: Renaissance sculpture to ancient Greek music, Oceanic sculpture to Oceanic music.
It’s time to make art dance.