
graphics.ewha.ac.kr/SSK2/ICRA19_0376_FI.pdf (2019-02-28)


Distortion-free Robotic Surface-drawing using Conformal Mapping

Daeun Song and Young J. Kim

Abstract— We present a robotic pen-drawing system that is capable of faithfully reproducing pen art on an unknown surface. Our robotic system relies on an industrial, seven-degree-of-freedom manipulator that can be both position- and impedance-controlled. In order to estimate a rough geometry of the continuous target surface, we first generate a point cloud of the surface using an RGB-D camera, which is filtered to remove outliers and calibrated to the physical canvas surface. Then, our control algorithm physically reproduces the digital drawing on the surface by impedance-controlling the manipulator. Our impedance-controlled drawing algorithm compensates for the uncertainty and incompleteness inherent to a point-cloud estimation of the drawing surface. Moreover, since drawing 2D vector pen art on a 3D surface requires a surface parameterization that does not destroy the original 2D drawing, we rely on least squares conformal mapping. Specifically, the conformal map reduces angle distortion during surface parameterization. As a result, our system can robustly and faithfully create distortion-free, complicated pen drawings on general surfaces with many unpredictable bumps.

I. INTRODUCTION

In the early twentieth century, contemporary art challenged conventional approaches by experimenting with and exploring new techniques. Since then, numerous attempts have emerged to fuse science and technology into an art form. Along the way, mechanical machines have also become means and subjects of these art forms, adopted by a few artists [1], [2], [3], [4]. Robots now operate in our daily lives and have become a subject of art. Due to the recent impressive development of robotic technology, the artistic roles of robots are becoming more diverse, and the possibilities for expressing artists' intentions through robotic mechanisms and interaction with them are expanding [5].

In this paper, we present our new research work on a robotic pen-drawing system that draws on human creativity to produce new artwork. In our previous work [6], we introduced an artistic robotic drawing system that can create visually pleasing and complicated artistic pen drawings on general surfaces without explicit surface reconstruction or visual feedback. In order to make our system scalable in terms of canvas size and drawing time, we now employ an RGB-D camera to estimate the shape of the canvas surface.

Simply adding this vision capability to our system, however, may not only complicate the drawing pipeline but even worsen the robotic drawing results, as the sensor introduces additional sources of noise from depth estimation and calibration. Moreover, in our previous work, we ignored

The authors are with the Department of Computer Science and Engineering at Ewha Womans University in Korea. [email protected], [email protected]

a mapping artifact from a 2D vector-graphic drawing to a 3D robotic drawing, which can introduce serious image distortion on an arbitrary, bumpy surface.

Robotic surface drawing without inducing bad visual artifacts is not limited to artistic drawing; it is also applicable to more practical tasks. For instance, in a robotic cleaning application, a robot needs to follow a trajectory (or a cleaning pattern) while making contact with the target cleaning surface [7]. In this case, the cleaning pattern is pre-determined in 2D space (e.g., a space-filling curve in 2D), while the cleaning surface is not known a priori and can be defined in 3D (e.g., for a window-cleaning robot). The task objective here is to sweep the 3D surface following the given patterned trajectory as closely as it would be followed on the 2D surface; i.e., the pre-determined cleaning pattern in 2D should be faithfully reproduced in 3D without distorting the pattern. A similar scenario is possible for a surface-probing robot that needs to evaluate and reconstruct an unknown surface by following a pre-determined trajectory, also known as manifold learning [8], [9].

Main Results: In contrast to our previous work [6], we now use minimal vision support, an RGB-D camera, to estimate the unknown target canvas surface and represent it as a point cloud. However, since this estimation is often too noisy for drawing pen art on an arbitrarily bumpy surface, we rely on an impedance-control technique to counteract the potential uncertainties caused by sensor noise as well as numerical noise in surface estimation. Moreover, in order to map the original 2D vector drawing to a general surface in 3D, we use conformal mapping to minimize the mapping distortion. Since conventional conformal mapping techniques developed in raster-based computer graphics are not directly applicable to our problem, as our input drawing is not rasterized but consists of piecewise-continuous curves, we employ bi-linear interpolation to compute a proper conformal parameterization. Finally, we successfully demonstrate that our robotic drawing system can generate visually pleasing pen drawings on real-world surfaces with minimal distortion.

The rest of this paper is organized as follows. We survey works relevant to robotic surface drawing in Sec. II and give an overview of our system in Sec. III. We propose our surface estimation method in Sec. IV and the robotic surface drawing method with minimal distortion in Sec. V. We explain the robotic incarnation of this technique using an impedance-control technique in Sec. VI. We show our implementation results and discuss them in Sec. VII, and conclude the paper in Sec. VIII.


(a) Robotic setup (b) Coordinate system setup (c) Tool frame setup

Fig. 1. Robotic Surface-drawing System Setup

II. PREVIOUS WORK

A. Robotic Surface Drawing

A good application of robotic surface drawing is artistic robotic drawing. Tinguely and Aaron [10] pioneered this area by creating drawing machines in the 1990s. Earlier prototypes of drawing robots were mostly special-purpose plotter robots, while the recent trend is to use a high-DoF robotic manipulator, which is more versatile in terms of artistic expression.

A telerobotic work such as the PumaPaint project [11] allows remote users to draw paintings using a classical PUMA robot. The HOAP2 humanoid robot was also employed to draw human portraits in human-characteristic styles [2]; that system also relies on speech recognition and synthesis to create an interactive experience. Paul the robot [4] is a robotic installation that creates observational portrait drawings, mimicking an artist's stylistic signatures. The robot itself is a planar robotic arm using a pen as an end-effector, with a tilt webcam for visual feedback; it was also publicly displayed during the ICRA 2018 conference. eDavid [3] is a response to robotic art from the computer graphics community and can draw non-photorealistic paintings from an image input using an industrial robot. A drone-type robot was introduced to create stippling effects from an image input [12]. The drone is equipped with an ink-soaked sponge, uses a centroidal Voronoi diagram to generate stippling effects, and minimizes its motion by approximating the traveling salesman problem. Song et al. [6] used a collaborative robot to draw artistic pen drawings on an unknown surface using impedance control, but ignored other sensory information, including robotic vision, and was unable to generate a distortion-free rendering of drawings on an unknown surface.

B. Distortion-free Surface Parameterization

Automatic parameterization of a non-parametric surface, such as a polygonal or point-set surface, is a non-trivial problem and has been extensively studied in geometric modeling and processing [13], [14]. Typical applications of parameterization include texture mapping in computer graphics, and mesh editing and re-meshing in geometry processing. In particular, the former application is closely related to our problem, where a curved drawing in 2D should be faithfully reproduced on curved or bumpy surfaces in 3D.

For a polygonal surface with disk-like topology and a convex boundary, Barycentric mapping [15] is the most popular method in the literature. However, this method can induce serious distortion in the parameterization if the boundary is highly non-convex. To cope with this problem, conformal mapping has been introduced, based on complex analysis. At a high level, conformal mapping can be classified into analytic and geometric methods [14]. The former is relatively easy to implement using energy minimization [16] but can suffer from unbalanced distortion if the underlying surface has high Gaussian curvature, while the latter can resolve this issue [17].

C. Impedance-controlled Robot

Hogan et al. [18] first suggested the idea of using impedance control for controlling the interaction between a manipulator and the surrounding environment. Since then, robot interaction techniques using impedance control have been explored in the robotics community [19], [20]. The popularity of impedance control is partly due to the demand for robotic systems able to interact with uncertain environments in practical applications. Thus, many impedance-controlled collaborative robots have been introduced by both industry and academia, for instance, the LBR IIWA from KUKA Robotics, Baxter and Sawyer from Rethink Robotics, and Justin from DLR. These robots are able to handle classically challenging robotic manipulation such as peg-in-hole assembly. Impedance control can also be used for unscrewing a can with the humanoid robot Justin [21], as well as for bimanual tasks [22]. The use of impedance control for drawing is very new, and Song et al.'s work [6] is, to the best of our knowledge, the only prior work.

III. SYSTEM OVERVIEW

As illustrated in Fig. 1-(a), our robotic drawing system consists of a robotic manipulator equipped with a 3D-printed pen-gripper to hold a pen, a non-planar target surface S, and an RGB-D camera. In Fig. 1-(b), we also show the different frames used in our system implementation, where the robot base frame B coincides with the inertial frame W. The input drawing of our system consists of a sequence of control points C in 2D, defining quadratic Bezier curves, which are drawn by an


artist using tablet-like pen interfaces such as [6]. To project this drawing onto the target surface S, we obtain point cloud data P representing the surface S using an RGB-D camera. After calibrating P with S with respect to W, we estimate the normal vector for each point in P, later used for conformal mapping as well as for orienting the robot's pen frame. Then, the surface S is parameterized into a 2D domain Ω by first approximating it via mesh reconstruction from P and then performing conformal mapping on it.

The original input drawing defined by C is made to fit in the domain of Ω, and the corresponding parameters are found by inverse-mapping Ω to P and performing bi-linear interpolation on it. Finally, the robot performs the drawing with impedance control to compensate for possible mapping and estimation errors.

IV. SURFACE ESTIMATION

In order to reproduce the drawing, originally provided as a sequence of 2D vectors, on a target surface, we first acquire depth information of the target surface. Various types of RGB-D cameras, such as stereo-vision and time-of-flight cameras, can reconstruct the depth information of a 3D surface with an accuracy of about ±5% and a resolution of less than 1 mm.

Using this depth data, we generate a point cloud of the scene and calibrate it with the target surface by defining a local frame, called the calibration frame, attached to the target surface using three non-degenerate points on it. Further, we also estimate the surface normals for the calibrated points to orient the robot's tool frame with respect to the robot base frame. This enables the robot's pen tip to be aligned with the surface normal of the target surface and to exert proper forces at the pen tip to draw. However, since these estimated results may not be precise due to sensor noise, we use impedance-based control to compensate for the estimation error, as will be explained in Sec. VI.

A. Point Cloud Calibration

Calibrating the captured point cloud data P with the source surface S is crucial for distortion-free robotic drawing. However, common calibration methods based on planar markers or on mounting a camera on the robot arm are not suitable for our purpose, because our target surfaces are often non-planar and are placed so close to the robot manipulator that the RGB-D camera's required minimum distance is violated. So we devise our own approach as follows. Given a captured point cloud ^C P relative to the camera frame C, we transform it into the robot's base frame B, which also corresponds to the inertial frame W in our approach.

To calibrate, we define a local calibration frame L attached to the surface S using three non-degenerate position markers p, q, r on the surface (Fig. 1-(b)). We define −→pq as the x-axis and the normal vector n of the triangle △pqr as the z-axis of L. By registering the position markers in the point cloud P, the frame L is defined with respect to C. Similarly, by picking p, q, r using the robot's end-effector and applying inverse kinematics, L is redefined with respect to B. The relative transformations from C or B to L are then defined

respectively. Finally, ^B_C T, the transformation from C to B, is obtained and applied to the point cloud data:

^B_C T = ^B_L T (^C_L T)^{-1},    (1)

^B P = ^B_C T ^C P.    (2)
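The calibration steps above can be sketched in a few lines; `frame_from_markers` and `calibrate` are hypothetical helper names, and the three markers are assumed to be known in both the camera and base frames. This is a minimal sketch of Eqs. (1)-(2), not the system's actual code:

```python
import numpy as np

def frame_from_markers(p, q, r):
    """Build the 4x4 pose of the calibration frame L from three
    non-degenerate markers: x-axis along pq, z-axis along the
    normal of triangle pqr, origin at p."""
    x = (q - p) / np.linalg.norm(q - p)
    n = np.cross(q - p, r - p)
    z = n / np.linalg.norm(n)
    y = np.cross(z, x)
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = x, y, z, p
    return T

def calibrate(points_C, markers_C, markers_B):
    """Transform a point cloud from camera frame C to base frame B
    using the same three markers registered in both frames."""
    T_L_in_C = frame_from_markers(*markers_C)      # ^C_L T
    T_L_in_B = frame_from_markers(*markers_B)      # ^B_L T
    T_C_in_B = T_L_in_B @ np.linalg.inv(T_L_in_C)  # Eq. (1)
    pts_h = np.c_[points_C, np.ones(len(points_C))]
    return (T_C_in_B @ pts_h.T).T[:, :3]           # Eq. (2)
```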

Fig. 2 illustrates a dense point cloud captured in the camera frame C and transformed into the robot base frame B.

B. Normal Estimation

To perform robotic surface drawing on an arbitrary surface, the robot needs to decide both the position p and orientation r of its end-effector. In particular, to calculate the orientation r that ensures the robot makes stable contact at a surface point, we calculate the surface normals n of the point cloud and make them parallel to the approaching direction (i.e., z-axis) of the end-effector (i.e., pen).

One can consider two options for estimating the normals of a point set P. A rather straightforward way is to use a surface-meshing technique to obtain a mesh representation of P and then extract the per-vertex normals. Alternatively, one can infer per-point normals directly from P by performing least-squares plane fitting [23], which reduces to principal component analysis of a local neighborhood of points. Even though either way would work in our case, the surface parameterization method we have chosen in Sec. V requires surface meshing, and thus we opt for the meshing technique.
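The plane-fitting alternative can be sketched as follows; `pca_normal` is a hypothetical helper, and selecting the local neighborhood (e.g., via k-nearest neighbors) is assumed to be done elsewhere:

```python
import numpy as np

def pca_normal(neighbors):
    """Estimate the normal of a local point neighborhood by
    least-squares plane fitting: the normal is the eigenvector of
    the covariance matrix with the smallest eigenvalue."""
    centered = neighbors - neighbors.mean(axis=0)
    cov = centered.T @ centered
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    return eigvecs[:, 0]                    # direction of least variance
```

Note the sign of the returned normal is ambiguous; in practice it is flipped to point toward the camera or away from the surface.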

Fig. 2. Calibrated Dense Point Cloud

V. DRAWING FROM 2D TO 3D

To reproduce a 2D drawing on an arbitrary surface in 3D without severe image distortion, one can employ conformal texture mapping. In other words, we can preserve the quality of the original 2D drawing during the mapping from a 2D digital vector drawing to a robotic drawing on a 3D surface by minimizing the angle distortion of the local geometry. As a result, we get a distortion-free drawing regardless of whether the drawing is executed on a large or small, stretched or shrunken surface. However, conventional conformal texture mapping methods used in rasterization-based computer graphics are not directly applicable to our problem


as our input drawing data is composed of a sequence of spline curves in 2D defined by a set of control points C, corresponding to pen strokes. Moreover, our target surface is a set of 3D points with estimated normals, but without an explicit surface parameterization.

Fig. 3. Conformal Mapping

A. Conformal Mapping

Before explaining our use of conformal mapping, we first introduce some theoretical background. Given two surfaces with similar topology, it is possible to compute a one-to-one correspondence between them [13]. The problem of computing such a mapping is referred to as surface parameterization.

Conformal mapping is a surface parameterization technique that preserves both angles and shapes. Let a continuous surface S in 3D space be parameterized over a parametric domain Ω ⊂ R2. As illustrated in Fig. 3, a function f mapping from the 2D domain Ω to the 3D surface S is said to be conformal if, for each (u, v) ∈ Ω, the tangent vectors along the horizontal and vertical lines, ∇u and ∇v (the red and blue lines in Fig. 3), forming a regular grid, are orthogonal on S and have the same norm [14]:

∇v = n×∇u, (3)

where n denotes the unit normal to the surface S. In other words, a conformal mapping locally corresponds to a similarity transform; i.e., it transforms an elementary circle in the (u, v) domain into an elementary circle on the surface.
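The condition of Eq. (3) can be checked numerically for any analytic parameterization. The sketch below, with hypothetical helper names, uses central finite differences and a unit cylinder, a classic conformal example (a uniformly stretched cylinder, by contrast, fails the check):

```python
import numpy as np

def tangents_and_normal(f, u, v, h=1e-6):
    """Numerical tangent vectors and unit normal of f(u, v) -> R^3."""
    fu = (f(u + h, v) - f(u - h, v)) / (2 * h)
    fv = (f(u, v + h) - f(u, v - h)) / (2 * h)
    n = np.cross(fu, fv)
    return fu, fv, n / np.linalg.norm(n)

def is_conformal_at(f, u, v, tol=1e-4):
    """Check Eq. (3): the v-tangent equals n x (u-tangent)."""
    fu, fv, n = tangents_and_normal(f, u, v)
    return np.allclose(fv, np.cross(n, fu), atol=tol)

# A unit cylinder unrolled onto the (u, v) plane is conformal:
cylinder = lambda u, v: np.array([np.cos(u), np.sin(u), v])
```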

B. Surface Drawing with Minimal Distortion

To realize conformal mapping in our work, we adopted least squares conformal mapping (LSCM) [16] to parameterize the target surface, which is based on minimizing the energy of the non-conformality of the mapping function. Once we unfold the target surface into the 2D parameter space (u, v) ∈ Ω, we search for the proper parameter values of the 2D drawing data in the parameter space, and refold the surface into 3D space. As mentioned in Sec. IV-B, surface normals can be estimated in different ways, and conformal mapping can accordingly be done differently as well. We choose to compute gradients with a local orthonormal basis of triangles, which requires the surface to be reconstructed as a triangular mesh. Alternatively, one could use a meshless technique for conformal mapping of a point cloud without surface reconstruction, using the Laplace-Beltrami (LB) operator [24]. Both approaches yield a set of coordinates (ui, vi) ∈ Ω associated with each point in P that satisfies Eq. 3. In our implementation, we have chosen the meshing technique, as robust surface-meshing implementations are readily available [25] and it is easier to implement this way.

After the surface parameterization is done, we need to decide the proper parameter coordinates for the set of control points C, which will then be mapped back to 3D using the parameterization. Since our captured point set P is unorganized and does not necessarily form a uniform grid, in order to parameterize every control point to a corresponding (u, v) coordinate, we need to solve an inverse mapping/parameterization problem. Note that this is not a big issue for raster-based texture mapping, as one has a clear idea of which rasterized point on the surface needs to be parameterized. In our case, on the other hand, the control points are originally defined in 2D, and we do not know in advance where these points will be mapped on the 3D surface. To solve this inverse mapping problem, we simply perform bi-linear interpolation in the parametric domain to estimate the (u, v)'s for C as follows.

Given a desired drawing space W in 3D along with the desired drawing scale, we can compute the parametric scale in Ω and fit the 2D drawing to Ω. Then, for each control point ci ∈ C, we search for the four nearest points in P that form a quadrilateral in Ω containing ci. By performing bi-linear interpolation on this quadrilateral using our previous work [6], we can parameterize ci so that it is mapped to both a position pi and a surface normal ni on P. Note that we calculate the surface orientation ri only for the control points of the drawing, not for all the points on the surface, which are denser than the set of control points. Finally, a set of control points mapped to the target surface with the corresponding surface orientations, C′ = {c′i = (pi, ri) | pi ∈ R3, ri ∈ SO(3)}, is generated.
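A simplified version of this bi-linear inverse mapping can be sketched as follows, under the simplifying assumption of an axis-aligned parameter quadrilateral (the quads found in a real point cloud are generally irregular); `bilinear_map` is a hypothetical name:

```python
import numpy as np

def bilinear_map(c, quad_uv, quad_pos, quad_nrm):
    """Map a 2D control point c = (u, v) inside an axis-aligned
    parameter quad to a 3D position and unit normal by bilinear
    interpolation. Corners are ordered counter-clockwise from the
    (u_min, v_min) corner."""
    (u0, v0), (u1, v1) = quad_uv[0], quad_uv[2]   # min and max corners
    s = (c[0] - u0) / (u1 - u0)
    t = (c[1] - v0) / (v1 - v0)
    w = np.array([(1-s)*(1-t), s*(1-t), s*t, (1-s)*t])  # corner weights
    p = w @ quad_pos                               # interpolated position
    n = w @ quad_nrm                               # interpolated normal
    return p, n / np.linalg.norm(n)
```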

Fig. 4. A compliant force (f) is generated by the impedance-controlled manipulator. The thick black line represents spline curves on the physical surface that need to be drawn, with a set of black dots for control points, also corresponding to the origin of the pen frame P. Meanwhile, the thin line with a set of circular dots represents the target spline curves on the virtual surface, located slightly under the physical surface by δz, corresponding to the origin of the virtual frame V. The orientations of P and V are identical, but their z-axes are offset by δz.


(a) Grid with Hilbert space-filling curve (b) Sierpinski arrowhead curve (c) Koch snowflake curve

Fig. 5. Drawing Results on a Bumpy, Circular Column Wall. The first row shows the original fractal curves in 2D, including the Hilbert space-filling curve, the Sierpinski arrowhead curve, and the Koch snowflake curve, from left to right. The second row shows the robotic drawing results: the black lines use our method, and the orange lines use projection mapping. Note that the orange lines are distorted compared to the original 2D drawings.

VI. IMPEDANCE-CONTROLLED DRAWING

Using surface estimation and mapping, our drawing robot is now fully provided with a set of Bezier curves in 3D that can be drawn on the target surface. However, to exert a proper compliant force at the pen tip (the end-effector) as well as to compensate for possible depth-estimation and calibration errors, we adopted an impedance-control method. Impedance control is a hybrid position- and force-control method that was proposed for interacting with an unstructured environment [18]. By employing a mass-spring-damper-like system, impedance control allows a robot to react in a compliant manner to external obstacles. By considering a certain offset value (i.e., impedance) for each control point, corresponding to the estimation errors, the impedance control results in continuous contact motion with the surface during the entire drawing session.

We configured the impedance controller in such a way that the robot is compliant only in the normal direction of the surface, as the pen tip attached to the robot manipulator is oriented opposite to the estimated surface normal. Additionally, in order to exert an appropriate amount of compliant force at the pen tip, a small deviation between the target position and the physical position of the pen tip needs to be provided to the impedance controller.

Simply letting the set of control points C′ = {c′i = (pi, ri)} define the target drawing frames can result in a lack of sufficient pen pressure and is very sensitive to the surface estimation error. Therefore, the target position pi of c′i is modified to:

p′i = pi + δzni, (4)

where δz is a user-defined gain value that controls the pen pressure and ni is parallel to the z-axis Pz of the pen frame P. This deviation results in a compliant force fz = kδz along Pz, where k is the spring stiffness. To be more efficient, instead of calculating new positions for every c′i, we attach

a virtual frame V, as shown in Fig. 1-(c), to the physical pen, aligned with the pen-tip frame P attached to the end of the pen, except that V is slightly offset from P by δz along the z-direction, which has the same effect as moving P to p′i.
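The target offset of Eq. (4) and the resulting contact force can be sketched as below; `offset_targets` is a hypothetical helper, and the spring model fz = kδz is the idealized one from the text:

```python
import numpy as np

def offset_targets(points, normals, delta_z, stiffness):
    """Offset each drawing target along the pen's approach axis,
    as in Eq. (4), so the impedance controller generates a
    compliant normal force f_z = k * delta_z at the pen tip."""
    targets = points + delta_z * normals   # p'_i = p_i + dz * n_i
    f_z = stiffness * delta_z              # expected contact force
    return targets, f_z
```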

As a result, the pen-type end-effector traces out the positions of the spline curves while maintaining contact with the surface, exerting a nearly uniform compliant force regardless of the shape of the surface.

VII. RESULTS AND DISCUSSIONS

A. Implementation Details

As shown in Fig. 1-(a), our robotic surface-drawing system consists of a KUKA LBR IIWA 7 R800 manipulator equipped with a 3D-printed gripper as an end-effector, solid enough to exert a force, that holds various types of pens. We use an Intel RealSense ZR300 RGB-D camera to capture a 3D point cloud of the target surface. It is a stereo-type camera augmented with an infrared projection system to detect mono-colored objects more precisely.

We use C++, Java, and MATLAB on a PC equipped with a 6-core Intel Xeon E5 CPU and 16 GB of RAM under the Windows 10 64-bit and Ubuntu 16.04 64-bit operating systems. We also use the Point Cloud Library (PCL) [25] to generate point cloud data and to reconstruct a triangular mesh from the point cloud.

To benchmark our robotic drawing system, we used two types of input drawings, patterned drawings and artistic drawings, to effectively show both distortion-free and complicated pen-drawing results. For the patterned drawings, we generated several fractal curves, including a Hilbert space-filling curve, a Sierpinski arrowhead curve, and Koch snowflake curves, which form uniform square, triangular, and hexagonal shapes. The artistic drawing data sets in Fig. 6 were acquired using the vector graphics engine proposed in our previous system [6].
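One standard way to generate such a patterned test curve, not necessarily the tool the authors used, is an L-system; the sketch below produces the Hilbert curve as a polyline of unit steps:

```python
def hilbert_lsystem(order):
    """Generate a Hilbert space-filling curve via the L-system
    A -> +BF-AFA-FB+, B -> -AF+BFB+FA-, then trace the turtle
    commands (F: step, +/-: turn 90 degrees) into 2D points."""
    s = "A"
    rules = {"A": "+BF-AFA-FB+", "B": "-AF+BFB+FA-"}
    for _ in range(order):
        s = "".join(rules.get(ch, ch) for ch in s)
    x, y, dx, dy = 0, 0, 1, 0
    pts = [(x, y)]
    for ch in s:
        if ch == "F":
            x, y = x + dx, y + dy
            pts.append((x, y))
        elif ch == "+":
            dx, dy = -dy, dx   # turn left
        elif ch == "-":
            dx, dy = dy, -dx   # turn right
    return pts
```

An order-n curve visits all 4^n cells of a 2^n by 2^n grid exactly once, which makes length distortion of the mapped result easy to inspect visually.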


B. Experimental Results

Fig. 5 shows examples of robotic pen-drawing results using our system on a bumpy circular column wall, with the patterned data sets in black compared to the simple projection mapping used in [6] in orange. Compared to the results using projection mapping, our method reproduces the original drawing faithfully on the surface. The projection mapping method, on the other hand, does not preserve the original lengths, which gradually increase as the drawing moves away from the center of projection. As can be seen from the grid-pattern results shown in Fig. 5-(a), our method preserves the side lengths of the grid, whereas under projection mapping the side length increases by more than 20% in the worst case.

The point cloud of the target surface is generated at a 480 × 360 resolution, yielding over 172K points for a typical scene. The statistics of our experimental results, including the number of control points, the size of the mapping surface (i.e., canvas size), the mapping calculation time, and the robot drawing-execution time, are provided in Table I. The drawings can be stretched to any smaller or bigger size as long as it lies within the robot's workspace. Even though we parallelize some of the mapping tasks, the drawing time can be further reduced by adopting a few acceleration techniques - for instance, a more efficient neighborhood search based on an efficient data structure such as Delaunay triangulation.

TABLE I
EXPERIMENTAL STATISTICS

Drawings                      (a)        (b)        (c)
# of Control Points           6,930      2,360      4,128
Mapping Surface Size (mm)     384×216    432×216    384×192
Mapping Time (sec.)           635        308        523
Execution Time (min.)         54         14         19

We also show experimental results of artistic drawings in Fig. 6, drawn using the vector graphics engine. The drawings contain over 90K control points, and the robotic drawing on a circular column wall took about 5 hours for each drawing.

C. Discussion

Maintaining continuous contact with an arbitrary surface with an intended amount of contact force is not an easy robotic task. Our system, however, is able to perform robotic surface drawing with nearly constant compliant forces, while minimizing the distortion of the original input vector drawings.

Currently, we rely on the depth information captured with an RGB-D camera, so our system is not suitable for a transparent or reflective surface, such as a window or a mirror. Additionally, the depth and normal estimation process can be greatly accelerated by adopting a more efficient nearest-neighbor search method, such as ANN [26]. Another limitation is that our robot has a limited workspace, even though the input drawing, consisting of a sequence of vectors, could be scaled indefinitely. We want to address this issue in the future by adding mobility to our current manipulator.
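The nearest-neighbor acceleration suggested above can be sketched with a k-d tree; here SciPy's cKDTree stands in for ANN [26] (assuming SciPy is available), with eps > 0 enabling a (1+eps)-approximate search:

```python
import numpy as np
from scipy.spatial import cKDTree  # assumes SciPy is installed

rng = np.random.default_rng(1)
cloud = rng.random((5000, 3))    # stand-in for the surface point cloud
queries = rng.random((10, 3))    # e.g., pen-tip positions needing a neighbor

tree = cKDTree(cloud)                        # one-time O(n log n) build
d_tree, i_tree = tree.query(queries)         # exact nearest neighbors
d_ann, i_ann = tree.query(queries, eps=0.1)  # approximate, faster search

# Brute force over all 5,000 points, for comparison.
i_brute = np.argmin(
    np.linalg.norm(cloud[None, :, :] - queries[:, None, :], axis=-1), axis=1)
assert np.array_equal(i_tree, i_brute)
```

The exact k-d tree query matches brute force while replacing a linear scan per query with a logarithmic-depth traversal; the approximate variant trades a bounded error for further speed, which is the trade-off ANN exposes.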

Fig. 6. Artistic Drawing Results

VIII. CONCLUSION

We presented a robotic surface drawing system that can generate pen drawings on physical surfaces with minimal drawing distortion, realized by conformal mapping and effective surface parameterization. Our distortion-free mapping method successfully maps a sequence of 2D vectors to 3D real-world space. With impedance control, we compensate for possible estimation or calibration errors and generate continuous contact motions. The experimental results show that our system has the potential to be extended to robotic applications other than drawing that require a robot to follow a given trajectory while maintaining contact with the underlying surface.

ACKNOWLEDGEMENTS

This project was supported by the National Research Foundation (NRF) of South Korea (2017R1A2B3012701).

REFERENCES

[1] J. Reichardt, “Machines and art,” Leonardo, vol. 20, no. 4, pp. 367–372, 1987.

[2] S. Calinon, J. Epiney, and A. Billard, "A humanoid robot drawing human portraits," in IEEE-RAS International Conference on Humanoid Robots, 2005.

[3] T. Lindemeier, S. Pirk, and O. Deussen, "Image stylization with a painting machine using semantic hints," Computers and Graphics, vol. 37, 2013.

[4] P. Tresset and F. F. Leymarie, "Portrait drawing by Paul the robot," Computers and Graphics, vol. 37, 2013.

[5] D. Herath and C. Kroos, Robots and Art. Springer, 2016.

[6] D. Song, T. Lee, and Y. J. Kim, "Artistic pen drawing on an arbitrary surface using an impedance-controlled robot," in IEEE International Conference on Robotics and Automation (ICRA), 2018.

[7] C. Hofner and G. Schmidt, "Path planning and guidance techniques for an autonomous mobile cleaning robot," Robotics and Autonomous Systems, vol. 14, no. 2, pp. 199–212, 1995. [Online]. Available: http://www.sciencedirect.com/science/article/pii/092188909400034Y

[8] Y.-B. Jia, L. Mi, and J. Tian, "Surface patch reconstruction via curve sampling," in IEEE International Conference on Robotics and Automation, May 2006, pp. 1371–1377.

[9] J.-D. Boissonnat, L. J. Guibas, and S. Oudot, "Learning smooth objects by probing," in Proceedings of the Twenty-first Annual Symposium on Computational Geometry, ser. SCG '05. New York, NY, USA: ACM, 2005, pp. 198–207. [Online]. Available: http://doi.acm.org/10.1145/1064092.1064124

[10] P. McCorduck, AARON'S CODE: Meta-Art, Artificial Intelligence, and the Work of Harold Cohen. W. H. Freeman & Co, 1990.

[11] M. Stein and C. Madden, "The PumaPaint project: Long term usage trends and the move to three dimensions," in IEEE International Conference on Robotics and Automation (ICRA), 2005.

[12] B. Galea, E. Kia, N. Aird, and P. G. Kry, "Stippling with aerial robots," in Computational Aesthetics in Graphics, Visualization and Imaging, 2016.

[13] K. Hormann, B. Levy, and A. Sheffer, "Mesh parameterization: Theory and practice," 2007.

[14] M. Botsch, L. Kobbelt, M. Pauly, P. Alliez, and B. Levy, Polygon Mesh Processing. AK Peters/CRC Press, 2010.

[15] W. T. Tutte, "Convex representations of graphs," in Proceedings of the London Mathematical Society, 1960.

[16] B. Levy, S. Petitjean, N. Ray, and J. Maillot, "Least squares conformal maps for automatic texture atlas generation," ACM Transactions on Graphics (TOG), vol. 21, no. 3, pp. 362–371, 2002.

[17] A. Sheffer and E. de Sturler, "Parameterization of faceted surfaces for meshing using angle-based flattening," Engineering with Computers, vol. 17, no. 3, pp. 326–337, Oct. 2001. [Online]. Available: https://doi.org/10.1007/PL00013391

[18] N. Hogan, "Impedance control: An approach to manipulation," in American Control Conference. IEEE, 1984, pp. 304–313.

[19] S. A. Schneider and R. H. Cannon, "Object impedance control for cooperative manipulation: Theory and experimental results," IEEE Transactions on Robotics and Automation, vol. 8, no. 3, pp. 383–394, 1992.

[20] F. Caccavale, C. Natale, B. Siciliano, and L. Villani, "Six-DOF impedance control based on angle/axis representations," IEEE Transactions on Robotics and Automation, vol. 15, no. 2, pp. 289–300, 1999.

[21] T. Wimbock, C. Ott, and G. Hirzinger, "Impedance behaviors for two-handed manipulation: Design and experiments," in IEEE International Conference on Robotics and Automation, Apr. 2007, pp. 4182–4189.

[22] J. Lee, P. H. Chang, and R. S. Jamisola, "Relative impedance control for dual-arm robots performing asymmetric bimanual tasks," IEEE Transactions on Industrial Electronics, vol. 61, no. 7, pp. 3786–3796, 2014.

[23] R. B. Rusu, "Semantic 3D object maps for everyday manipulation in human living environments," KI - Künstliche Intelligenz, vol. 24, no. 4, pp. 345–348, 2010.

[24] J. Liang, R. Lai, T. W. Wong, and H. Zhao, "Geometric understanding of point clouds using Laplace-Beltrami operator," in IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2012, pp. 214–221.

[25] R. B. Rusu and S. Cousins, "3D is here: Point Cloud Library (PCL)," in IEEE International Conference on Robotics and Automation (ICRA), 2011, pp. 1–4.

[26] D. M. Mount and S. Arya, "ANN: A library for approximate nearest neighbor searching," 1998.