
Towards Interactive Fabrication of 3D Prints with FabAR—Modifying and Creating 3D Models through Gestures

Abstract FabAR enables end-users to modify and create 3D models based on gestural input and augmented reality, allowing the interactive fabrication of 3D prints directly inside a 3D printer. We outline the concept of FabAR and give detailed insights into our early prototype.

Author Keywords Interactive Fabrication; 3D Printers; Gestures; Augmented Reality

ACM Classification Keywords H5.2. [User Interfaces]: Haptic I/O

General Terms Design; Human Factors

Introduction Current developments from the maker culture, fab labs, the open source movement, and crowd-funding platforms make novel technologies like 3D printers and laser cutters accessible and available to everybody at low cost, thus opening space for new paradigms. Personal Digital Fabrication [3], the idea that everybody can produce, modify, or repair their own physical goods, already marks the advent of a third industrial revolution.

Copyright is held by the author/owner(s). TEI 2014, Feb 16 – 19, 2014, Munich, Germany.

Mirko Fetter, Christoph Beckmann, Tom Gross Human-Computer Interaction Group University of Bamberg Bamberg, Germany <firstname>.<lastname>(at)uni-bamberg.de


Rapid Prototyping, on the other hand, enables designers and engineers to quickly produce scale models of design variations and so can massively improve the design process.

However, the current process of personal digital fabrication separates creators from their workpiece by introducing a digital layer in the form of GUI-based CAD and 3D modelling tools [10]. With fast interaction, undo mechanisms, and precision [5], those tools provide benefits over traditional tools like plow and chisel. However, for personal digital fabrication the learning curves of such tools for end-users are steep, and for designers the act of designing and sculpting loses its embodied quality. The notion of interactive fabrication [10] aims at overcoming these hurdles by providing embodied interfaces to personal fabrication tools.

In the following we present the concept for the interactive fabrication of 3D prints through FabAR (cf. Figure 1), allowing end-users to directly modify or create 3D models in a 3D printer based on gestural input and augmented reality (AR). Furthermore, we give detailed insights into a first prototype that we built to investigate the technical feasibility and to explore the design space.

Related Work Willis et al. [10] outline the notion of interactive fabrication with a series of prototyping devices that link physical input and output for personal fabrication. Various researchers have explored the design space of combining physical in- and output for interactive fabrication from a technical perspective. For instance, Constructable [5] provides precise and direct input to a laser cutter via laser pointers. FreeD V2 [11] is a technically sophisticated handheld digital milling tool that helps users carve pre-modelled 3D models out of a Balsa foam block. CopyCAD [1] allows integrating geometries of existing physical objects into geometries of new 3D models. However, concepts for interactive fabrication on 3D printers are sparse.

Others looked into the creation of 3D models from gestural input in the form of freehand sketches, in order to increase the embodiment and materiality of the digital design process. For example, Spatial Sketch [9] as well as Sketch Furniture [2] enable users to fabricate their own furniture like lampshades, tables, or chairs by recording pen strokes in mid-air. 3Doodler1 and Doodle3D2 already provide basic products for this form of physical doodling. With Surface Drawing, Schkolne et al. [7] developed an early system that allows designing organic 3D shapes in a semi-immersive virtual environment. While the presented work is promising, as it combines ease-of-use with a degree of embodiment, the resulting output lacks precision.

Figure 1. The large picture (1) shows a user interacting with both a physical object and a 3D model using FabAR on the print bed of a 3D printer. The user wears an AR headset and modifies the 3D model by gesturing with a LEAP Motion controller in front of the 3D printer. The picture in the upper right corner (2) shows the stereoscopic representation that makes up the user's view inside the AR headset.

Finally, some commercial products like Autodesk's 123D apps3 Tinkercad and 123D Design offer easy-to-use modelling tools for end-users. While these tools allow for precise creation of 3D models, they only provide limited functionality and, more importantly, lack the possibility for interactive fabrication. This often leads to trial-and-error approaches where a 3D model needs to be printed, modified, and re-printed a few times until a satisfying result with the right dimensions is reached. With 3D prints often taking up to a few hours to complete, this is a time- and material-wasting process.

1 http://www.the3doodler.com/
2 http://www.doodle3d.com/
3 http://www.123dapp.com

Concept of FabAR In order to provide intuitive and lightweight support for the interactive fabrication of 3D printed objects, we identified challenges that we want to address with FabAR. As these are a result of introducing a layer in the form of CAD tools between the user and the physical output, our aim is to remove this layer to allow for direct, embodied, and interactive manipulation of 3D models. FabAR achieves this by using AR to directly display the model to be printed on the print bed of the 3D printer and by using gestural input for the interaction with the 3D model.

This setup allows end-users to better envision upfront what the final 3D print will look like. By placing real objects on the print bed and positioning the virtual 3D model next to them, users can better estimate the final form, size, etc. of the 3D print without a need for time-consuming trial-and-error. Additionally, the gestural input provides direct and intuitive control for modifying and creating 3D models and hence a gentle learning curve as opposed to professional CAD tools. In the following we provide more details on what the interaction with FabAR looks like.

Interaction Design of FabAR To frame the design space, we conceptualised an interaction design with two modes called Modification and Creation, designing for low floor and high ceiling.

Figure 2. A user selects a part of a downloaded model of a bolt by pointing (1) and uses a pinch gesture to scale the part (2).

The first mode named Modification enables novice users to browse and download 3D models from a collection of standardised 3D objects and components (comparable with MakerBot Thingiverse4), which can be adapted to personal needs. From these models the users can either select parts of a 3D model by pointing and then use gestures to scale, move, or rotate these parts (cf. Figure 2), or combine several 3D models into a new object (e.g., combine a tripod mount with a Kinect stand).

Figure 3. A user sketches a star with the index finger (1) and uses a swipe gesture to extrude the created outlines (2).

4 http://www.thingiverse.com
5 http://www.shapeways.com/creator

The second mode named Creation provides users with a set of simple tools to generate new and original 3D models from scratch, with results that can be compared to those of services like the Shapeways creator5. The creation of a 3D model consists of two steps (cf. Figure 3). First, users draw an outline as a freehand sketch (i.e., a plane curve) with their index finger on the print bed. Second, they use a gesture to generate a volumetric geometry from the sketch. For example, a circle gesture with the index finger generates a solid of revolution (e.g., a vase) and a swipe-up gesture generates an extrusion solid (e.g., a star like in Figure 3) based on the initial sketch. Users can fine-tune the result by the way they perform the gesture. For instance, circling clockwise or counter-clockwise with the finger (comparable to a jog wheel) allows users to deviate from a full 360-degree rotation (e.g., a section of a vase). Swiping up or down regulates the height of the extrusion.
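To illustrate the geometry step, the following minimal C++ sketch shows how a sweep angle from the circle gesture and a height from the swipe gesture could parameterise the two solids. All names and the vertex-list representation are illustrative assumptions, not the actual FabAR code.

    // Illustrative sketch only, not the actual FabAR code: generate vertex
    // positions for the two gesture-parameterised solids described above.
    #include <cmath>
    #include <vector>

    struct Vec2 { float x, y; };      // point of the sketched outline/profile
    struct Vec3 { float x, y, z; };

    // Solid of revolution: sweep the sketched profile around the z-axis.
    // sweepAngle < 2*pi (set via the jog-wheel-style circle gesture)
    // yields a section of the solid, e.g. a section of a vase.
    std::vector<Vec3> lathe(const std::vector<Vec2>& profile,
                            float sweepAngle, int segments) {
        std::vector<Vec3> vertices;
        for (int s = 0; s <= segments; ++s) {
            float a = sweepAngle * s / segments;
            for (const Vec2& p : profile)
                vertices.push_back({p.x * std::cos(a), p.x * std::sin(a), p.y});
        }
        return vertices;
    }

    // Extrusion solid: lift the sketched outline to the height set by the
    // swipe-up/down gesture, producing bottom and top rings of vertices.
    std::vector<Vec3> extrude(const std::vector<Vec2>& outline, float height) {
        std::vector<Vec3> vertices;
        for (const Vec2& p : outline) vertices.push_back({p.x, p.y, 0.0f});
        for (const Vec2& p : outline) vertices.push_back({p.x, p.y, height});
        return vertices;
    }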

Of course, the 3D models generated in Creation mode can be combined in Modification mode with other created or downloaded models, for instance to decorate a downloaded 3D model of an iPhone case with a custom ornament designed by the user from scratch.

Prototype User Interaction In order to explore the technical feasibility of our approach and to collect early user feedback, we built a first prototype of FabAR, which provides some basic functionality and gives an initial impression of the final system. The system allows users to directly see a true-to-scale 3D model, placed on the print bed inside the 3D printer, through AR glasses, and provides basic functionalities to manipulate the 3D model through gestural input. In the following we describe our prototype's initial four options of manipulation: selection, placement, scaling, and rotation.

To select a 3D model from a catalogue of available models, users can make a horizontal swipe gesture in front of the 3D printer. They can place real-world objects on the print bed, so they can directly see the selected model in relation to them (cf. Figure 4).

Users can scale a 3D model via a vertical swipe gesture. Swiping from bottom to top translates into increasing the size of the 3D model on the print bed, and vice versa. Besides the already mentioned use for estimating the ratio in size between existing physical objects and a model, users can evaluate the fit of internal dimensions of 3D models for placing physical objects inside (cf. Figure 5).

Figure 4. In-printer view for estimating the ratio in size between an existing physical object (car) and a 3D model (action figure).

Figure 5. In-printer view for fitting existing physical objects (scissors and pens) into a 3D model of a case.

To rotate 3D models on the print bed, users can make a circle gesture with their index finger. The gesture direction is synchronised with the rotation direction of the model and allows fine-grained control of the rotation by applying a non-isomorphic translation.

To position a 3D model on the print bed, users employ a pointing gesture. Moving the index finger sideways results in an x-axis translation of the model; moving it back and forth results in a y-axis translation.
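As a sketch of how these two mappings could be realised in code: the circle gesture applies a speed-dependent gain, making the mapping non-isomorphic, while the pointing gesture maps finger motion one-to-one onto the bed plane. The gain curve, units, and names are our illustrative assumptions, not the actual implementation.

    // Illustrative mapping from finger input to model transforms; the gain
    // curve and units are assumptions, not the actual FabAR values.
    #include <algorithm>
    #include <cmath>

    struct ModelTransform {
        float x = 0.0f, y = 0.0f;  // position on the print bed (mm)
        float yawDeg = 0.0f;       // rotation around the vertical axis
    };

    // Circle gesture: slow circling gives fine, precise rotation; fast
    // circling covers large rotations quickly (non-isomorphic mapping).
    void onCircleGesture(ModelTransform& m, float deltaDeg) {
        float speed = std::fabs(deltaDeg);              // degrees per frame
        float gain = std::clamp(speed / 5.0f, 0.2f, 3.0f);
        m.yawDeg = std::fmod(m.yawDeg + deltaDeg * gain, 360.0f);
    }

    // Pointing gesture: sideways finger motion moves the model along the
    // x-axis, back-and-forth motion along the y-axis.
    void onPointingGesture(ModelTransform& m, float dxMm, float dyMm) {
        m.x += dxMm;
        m.y += dyMm;
    }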

Prototype Implementation For our prototype we use off-the-shelf hardware: a MakerBot Replicator 26 as 3D printer, a LEAP Motion7 as gesture input device, and a self-built, low-cost AR headset consisting of two PS3 Eye8 cameras functioning as a single stereo camera and an Oculus Rift9 as stereo display; for the latter we printed a custom mount for attaching both cameras. The software implementation follows the modular software design of the openFrameworks [6] toolkit, with one controller and four specific subsystems: the FARFrameGrabber, FARAR, FARDraw, and FARView. Subsequently, we briefly describe our processing steps.
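A hypothetical skeleton of this structure, in the openFrameworks style of a single app class delegating to subsystems, might look as follows; the subsystem names come from the paper, but their interfaces and responsibilities here are our reading of the text, not the actual FabAR sources.

    // Hypothetical skeleton of the described architecture; subsystem names
    // are from the paper, their interfaces are assumptions.
    #include "ofMain.h"

    class FARFrameGrabber { /* obtains and preprocesses both camera images */ };
    class FARAR           { /* detects AR markers, yields model matrices   */ };
    class FARDraw         { /* overlays the 3D models on the camera images */ };
    class FARView         { /* warps the result for the stereo display     */ };

    class FARController : public ofBaseApp {
    public:
        void setup() override  { /* init cameras, load the STL catalogue */ }
        void update() override { /* frame grabbing, marker tracking, gestures */ }
        void draw() override   { /* FARDraw overlay, then FARView warping */ }
    private:
        FARFrameGrabber frameGrabber;
        FARAR           ar;
        FARDraw         drawSubsystem;
        FARView         view;
    };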

6 http://www.makerbot.com
7 http://www.leapmotion.com
8 http://www.playstation.com
9 http://www.oculusvr.com

For processing the video images, we use two distinct cameras mounted in front of the stereo display in vertical position. In our software, the FARFrameGrabber obtains images from each camera. After rotating and mirroring these images once, we create black-and-white copies of the images that are clipped at a threshold. We then process these images in the FARAR subsystem to detect AR markers on both sides individually using the ARToolKit+ [8]. The original camera images are drawn within a single OpenGL context and overlaid with the augmented 3D models in the FARDraw subsystem. Finally, in the FARView subsystem we use vertex and fragment shaders to create a warped representation of the final images, ready for presentation on the stereo display at about 60 fps (cf. Figure 1.2).
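The clipping step can be pictured as follows; this is a minimal sketch, assuming an 8-bit grayscale buffer, since the prototype's actual buffer layout and threshold handling are not specified in the paper.

    // Minimal sketch of the clipping step: turn an 8-bit grayscale camera
    // image into a binary image for marker detection. Buffer layout and
    // threshold value are assumptions.
    #include <cstddef>
    #include <cstdint>
    #include <vector>

    std::vector<uint8_t> binarise(const std::vector<uint8_t>& gray,
                                  uint8_t threshold) {
        std::vector<uint8_t> binary(gray.size());
        for (std::size_t i = 0; i < gray.size(); ++i)
            binary[i] = (gray[i] >= threshold) ? 255 : 0;
        return binary;
    }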

For augmenting the camera images with a 3D model, we assign a collection of stereolithography (STL) files; however, we only activate the user-selected 3D model for AR overlay. To allow AR overlays from different viewing perspectives, we apply two markers to the print bed, one facing up and one facing front. This allows an optimal registration of the 3D model on the print bed. In both cases we obtain the ModelMatrix from the ARToolKit+ and then apply individual transformations for positioning the model on the print bed.
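The paper does not detail how the two marker poses are combined; one plausible reading, sketched below under that assumption, is that each marker carries a fixed offset to the bed origin, and whichever marker is currently tracked provides the pose. All structs and the fallback order are hypothetical.

    // Illustrative sketch of the two-marker registration described above;
    // the marker structs and offsets are assumptions, not the FabAR code.
    #include <array>
    #include <cstddef>

    using Mat4 = std::array<float, 16>;  // column-major, as OpenGL expects

    // Standard column-major 4x4 matrix product.
    Mat4 multiply(const Mat4& a, const Mat4& b) {
        Mat4 r{};
        for (std::size_t c = 0; c < 4; ++c)
            for (std::size_t row = 0; row < 4; ++row)
                for (std::size_t k = 0; k < 4; ++k)
                    r[c * 4 + row] += a[k * 4 + row] * b[c * 4 + k];
        return r;
    }

    struct Marker {
        bool visible;      // currently tracked by ARToolKit+?
        Mat4 modelMatrix;  // pose obtained from the tracker
        Mat4 bedOffset;    // fixed transform from this marker to the bed origin
    };

    // Prefer the up-facing marker; fall back to the front-facing one.
    bool bedPose(const Marker& up, const Marker& front, Mat4& outPose) {
        const Marker* m = up.visible ? &up : (front.visible ? &front : nullptr);
        if (!m) return false;
        outPose = multiply(m->modelMatrix, m->bedOffset);
        return true;
    }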

For detecting gestures, we rely on swipe and circle gestures that we query from the LEAP's C++ API [4]. User interaction events for manipulating 3D models, such as detected gestures and keyboard inputs, are propagated to the active, user-selected 3D model. Changes are thus visible immediately and remain persistent on a 3D model while browsing for other models. User interaction events for selecting models are handled in the FARController and result in activating the selected 3D model from the catalogue.
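Against the Leap SDK v1 C++ API documented in [4], the gesture query could look roughly like this; the dispatch comments are illustrative placeholders for FabAR's event handling, and gestures must first be enabled on the Leap::Controller.

    // Rough sketch of polling swipe and circle gestures via the Leap C++
    // API [4]; the dispatch comments stand in for FabAR's event handling.
    #include <cmath>
    #include "Leap.h"

    void pollGestures(const Leap::Controller& controller) {
        // Assumes controller.enableGesture(...) was called for both types.
        const Leap::Frame frame = controller.frame();
        for (const Leap::Gesture& g : frame.gestures()) {
            if (g.type() == Leap::Gesture::TYPE_SWIPE) {
                Leap::SwipeGesture swipe(g);
                bool horizontal = std::fabs(swipe.direction().x) >
                                  std::fabs(swipe.direction().y);
                // horizontal swipe: browse the catalogue;
                // vertical swipe: scale the active 3D model.
                (void)horizontal;
            } else if (g.type() == Leap::Gesture::TYPE_CIRCLE) {
                Leap::CircleGesture circle(g);
                float turns = circle.progress();  // fraction of full circles
                // rotate the active 3D model according to turns.
                (void)turns;
            }
        }
    }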


Conclusions In this paper we presented our concept for the interactive fabrication of 3D prints by end-users through the embodied modification and creation of 3D models in a 3D printer. With our early prototype of FabAR we showed the technical feasibility of the proposed concept by combining gestural input and AR. From a technical perspective, we can already share one insight from the current prototype: the self-built AR headset, consisting of two PS3 Eye cameras and the developer version of the Oculus Rift, offers room for enhancements towards a more natural perception by introducing higher-resolution cameras and displays for more detail and sharpness, as well as less distortion and displacement for a more accurate perspective. Upcoming affordable technologies like castAR10 promise to fill this gap. The FabAR prototype allows us to collect early feedback from users and thus informs the design of the final system. In upcoming studies we will address both the complexity and the precision of gestures to further refine our interaction design for Modification and Creation.

10 http://technicalillusions.com

Acknowledgements We thank the members of the Cooperative Media Lab.

References [1] Follmer, S., Carr, D., Lovell, E. and Ishii, H. CopyCAD: Remixing Physical Objects With Copy and Paste From the Real World. In Adjunct Proceedings of UIST 2010 (Oct. 3-6, New York, USA). ACM Press, New York, USA, 2010. pp. 381-382.

[2] Front Design. Sketch Furniture. http://www.designfront.org/category.php?id=81&product=191, 2013 (Last accessed: 05/11/2013).


[3] Gershenfeld, N. Fab: The Coming Revolution on Your Desktop--from Personal Computers to Personal Fabrication. Basic Books, Cambridge, USA, 2005.

[4] Leap Motion Inc. Leap::Frame Class Reference. http://developer.leapmotion.com/documentation/Languages/C++/API/class_leap_1_1_frame.html, 2013 (Last accessed: 13/11/2013).

[5] Mueller, S., Lopes, P. and Baudisch, P. Interactive Construction: Interactive Fabrication of Functional Mechanical Devices. In Proceedings of UIST 2012 (Oct. 7-10, Cambridge, USA). ACM Press, New York, USA, 2012. pp. 599-606.

[6] openFrameworks. About openFrameworks. http://www.openframeworks.cc/about/, 2013 (Last accessed: 12/11/2013).

[7] Schkolne, S., Pruett, M. and Schröder, P. Surface Drawing: Creating Organic 3D Shapes with the Hand and Tangible Tools. In Proceedings of CHI 2001 (Mar. 31 - Apr. 5, Seattle, USA). ACM Press, New York, USA, 2001. pp. 261-268.

[8] Wagner, D. and Schmalstieg, D. ARToolKitPlus for Pose Tracking on Mobile Devices. In Proceedings of CVWW 2007 (Feb. 6-8, Sankt Lambrecht, Austria). 2007. pp. 1-8.

[9] Willis, K.D.D., Lin, J., Mitani, J. and Igarashi, T. Spatial Sketch: Bridging Between Movement & Fabrication. In Proceedings of TEI 2010 (Jan. 25-27, Cambridge, USA). ACM Press, New York, USA, 2010. pp. 5-12.

[10] Willis, K.D.D., Xu, C., Wu, K.-J., Levin, G. and Gross, M.D. Interactive Fabrication: New Interfaces for Digital Fabrication. In Proceedings of TEI 2011 (Jan. 23-26, Funchal, Portugal). ACM Press, New York, USA, 2011. pp. 69-72.

[11] Zoran, A., Shilkrot, R. and Paradiso, J. Human-Computer Interaction for Hybrid Carving. In Pro-ceedings of UIST 2013 (Oct. 8-11, St. Andrews, Scotland, UK). ACM Press, New York, USA, 2013. pp. 433-440.