
    Visual Effects in Film and Television

Ian Kezsbom
Binghamton University

    State University of New York

Abstract: Since the creation of the motion picture

    camera, people have been trying to amaze and

    astonish audiences with spectacular effects on the

    screen. As film effects became more complicated,

    new methods were needed to create them. With the

    introduction of computers a whole new array of

    visual effects was introduced to the motion picture

    and television industries. In recent years, these

    computer effects, also called digital effects, have

    become an integral part of film and television

    productions. These digital effects have allowed

viewers to travel to far-away lands and witness creatures never before imagined. This paper will

    cover both the history and process of digital effects.

    The progress of visual effects since the

    invention of the motion picture camera will be

    traced. This paper will focus on digital effects and

    how computers are used to create them.

    Comparisons will be made between the first films

    and television shows in which effects were used and

    the films and television shows of today. Several

    leading industries in the field of digital effects and

    the processes that they use will be discussed.

    The processes used to create digital effects

will be covered in depth. Topics such as computer animation and morphing will be focused upon.

    Several modern films and television shows will be

    used as examples. Specific effects of these films will

    be discussed in a step-by-step process. This will

    include discussion of some of the software and

    hardware used to create these effects. This will also

    incorporate software available only to the

    production companies and software that is readily

available on the market today. The impact these effects have on cinema, television and the viewer will also be discussed.

    Finally, the future of digital effects will be

covered. Possible future techniques will be examined, as well as the impact they might have on the

    entertainment industry. The possibility of filming a

    movie or television show with only computer

    generated actors and actresses will be discussed as

well as the possibility of bringing past actors and

    actresses back to life.

    I. Introduction

Since the invention of the motion picture camera, movies have been a major form of

entertainment throughout the world. Filmmakers have attempted to transport viewers to foreign lands, show them the past, present and future, and have them witness wonderful,

fantastic and sometimes horrifying creatures. The invention of television has allowed viewers to take these journeys in their own homes. As a result, the field of special effects has grown to be a major factor in the production of films and television shows. With the introduction of computers, a whole new world of visual effects known as digital effects became possible. These computerized effects have allowed filmmakers to create

images never before thought possible. In December of 1997, director James Cameron used digital effects to help create his film, Titanic. In figure 1, a screen shot from the movie is shown. While the ship is a model, everything else in the picture is computer generated, including the water and smoke. Creating these effects, however, is not a


Figure 1. The model of the Titanic, surrounded by digital water, waves, smoke and sky [1].


simple and quick process. It takes time, effort and numerous types of software packages to create.

    II. Historical Perspective

Even before the 1900's, filmmakers were experimenting with visual effects. In 1896, a French filmmaker named Georges Méliès created the short film The Vanishing Lady. With the use of camera tricks he was able to make a woman disappear from the screen. In 1936 Technicolor was introduced and allowed films to be shot in color. This opened up a whole new doorway to visual effects realism, since images would now

appear more true to life. Some of the early visual effects were created with a process known as Stop-Motion Animation. A model was created and a picture taken of it. The object was slightly moved and then another picture was taken. This process was repeated, and when all the shots were put together, it appeared as if the object was moving or alive.

Around 1978 computers first entered the film industry as George Lucas set up the computer division for his special effects house Industrial Light and Magic (ILM). In 1982, the Genesis sequence in Star Trek II: The Wrath of Khan became the first computer graphics sequence put into a feature film. At that time, however, three-dimensional computer graphics were too slow and expensive to be used regularly in a motion picture. In recent years, digital effects have grown to be an almost necessary part of feature films. In an industry that is constantly changing, new computer graphics systems and software are constantly being designed to handle filmmakers' needs [2].

Digital effects have also become an integral part of television production. The tools used in developing effects for film and television are generally the same. At first, effects for television were not on as grand a scale as movies, but in recent years more shows have been "upgrading" their effects in an attempt to gain audiences in the digital age.

    III. The Companies

In 1976 George Lucas created his visual effects studio Industrial Light and Magic. Its computer division opened in 1978, and ILM has since grown to be the largest computer effects house in the world. However, it is not alone in the industry. Companies such as Digital Domain and Pixar have been doing their share as well.

Pixar was founded by Ed Catmull and his group after they left ILM to form their own company in 1986. It was named after the image-processing system they had developed while with ILM. In recent years more and more special effects houses have opened up. Digital Domain is one of these newer companies. Recently, they did the majority of work on the film Titanic. The Titanic job was enormous, however, and in all over fifteen visual effects houses joined in. Some of these included ILM, POP Film, Matte World Digital and Cinesite.

Silicon Graphics has been important in designing the systems and workstations these graphics run on. Adobe has licensed a number of software packages that are used in both the industry and the home.

    IV. The Tools of the Trade

What does it take to create the digital effects used in films and television? Over the years numerous types of software and hardware have been created to fit the needs of directors. Some of these tools have also been released for commercial use.

One of the most powerful computer graphics tools created was Renderman, a three-dimensional rendering tool. Renderman allows the user to control things like motion blur and depth of field. It also gives the user


the capacity to create exceptionally complex environments. It was created by Pixar and is considered one of the greatest contributions to computer graphics software so far. Renderman is written in the C programming

language. Renderman supports many different applications and has many different tools. One main tool is Alfred, a task processing system that is specifically designed to manage network-distributed rendering. Another tool is Combiner, which is a script-based compositing application. It allows the user to specify any number of Still Frame, Image Sequence, Display Server and Alias Wirefile (Alias is a tool that designs models and scene data) input elements. Finally, there is the Ator tool, which moves Alias model and scene data into the Renderman environment where it can then be worked with [3].

One of the most valuable benefits of Renderman is its shading language. It is both programmable and extensible and can create any type of surface appearance. Shading is a very important part of creating computer-generated images. Phong shading, or normal-vector interpolation shading, is presented here as an example of how a shading algorithm might work.

In Phong shading, three steps must be carried out. First, the average unit normal vector at each polygon vertex must be determined. Next, the vertex normals must be linearly interpolated over the surface of the polygon. Finally, an illumination model must be applied along each scan line in order to calculate projected pixel intensities for each surface point. To put it simply, a path is made from the screen's pixel to the object, and how the light hits the object at that point determines how the pixel will be shaded. Once this is completed, a three-dimensional object will appear shaded on a two-dimensional screen [4].
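To make these steps concrete, here is a minimal Python sketch of Phong shading for one surface point of a triangle (the function name, weights and exponent are illustrative choices, not Renderman's actual shading language):

    import numpy as np

    def normalize(v):
        return v / np.linalg.norm(v)

    def phong_shade(n0, n1, n2, bary, light_dir, view_dir):
        # Step 1 happens upstream: n0, n1, n2 are the averaged unit
        # normals stored at the triangle's three vertices.
        u, v, w = bary
        # Step 2: linearly interpolate the vertex normals over the surface.
        n = normalize(u * n0 + v * n1 + w * n2)
        # Step 3: apply an illumination model to get the pixel intensity.
        l = normalize(light_dir)
        diffuse = max(np.dot(n, l), 0.0)
        r = 2.0 * np.dot(n, l) * n - l               # light mirrored about n
        specular = max(np.dot(normalize(r), normalize(view_dir)), 0.0) ** 32
        return 0.1 + 0.7 * diffuse + 0.2 * specular  # ambient + diffuse + specular

A scan-line renderer would evaluate this once per covered pixel, with the barycentric coordinates identifying where the pixel's ray meets the polygon.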

The original ILM computer graphics division created the Pixar Laser Scanner and Editdroid. The Pixar Laser Scanner allowed images to be digitally scanned into a computer. Editdroid was a digital editing system.

Another major breakthrough in software was the invention of Photoshop, which was eventually published by Adobe and packaged for home use (see figure 2). It was created by ILM employee John Knoll and his brother Thomas. Photoshop was designed in the late 1980's to manipulate pictures and received its first extensive use in The Abyss (1989). In The Abyss it was used to manipulate digitized photographs of background plate sets and to create a digital model in order to integrate a water creature also generated by computers (see figure 3).

Photoshop allows for the editing of digital pictures. It allows artists to rotate these digital images, change their size and color, or touch them up. It is a versatile piece of

Figure 3. The water creature from The Abyss [6].

    Figure 2. Adobe Photoshop is now available for home use [5].


software and is used in various stages of the film and video making process. A typical screen shot from Photoshop can be seen in figure 4. The Domino, created by Quantel Inc., is a production suite for feature film work. It

    was given its first thorough testing in the

movie Stargate. ILM found this system slightly limited for their use, so they created the Sabre System. The Sabre System was an open architecture system as opposed to Domino's closed architecture system. This means that outside and proprietary software could not be added to the Domino but could be added to the Sabre System.

The Sabre System featured the "Flame" and "Inferno" software packages produced by the Canadian company Discreet Logic. Sabre is also very user friendly and allows non-computer specialists to use the system. It can be used for digital matte painting, editing and image processing.

Kodak created its Cineon system to support many different aspects of digital film making. The system contains Kodak's Lightning Scanner, which uses three arrays of 4096-by-1 devices. Each one is equipped with a filter: one red, one green and one blue. A xenon lamp is used as the light source for the scanner, unlike other scanners which use tungsten or quartz halogen lamps. The blue energy from the xenon lamp helps to compensate for the orange color of the film negative being scanned. The light is transmitted through an optical-fiber tube filled with doped water to keep down loss and aberration. It is then diffused by an array of lenses and an integrating cylinder. This way, the illumination is a soft, rather than a sharply focused, beam.

With the complicated tasks of Terminator 2: Judgment Day, the Kodak scanner was replaced with the Trilinear Multicolor High Resolution Charge-Coupled Device (CCD) Digital Input Scanner. It was the first high-end digitizing device that was able to output a multi-color, high-resolution image and allow the image to be inter-cut with production footage [2]. It could scan and digitize in all 35mm formats. Terminator 2: Judgment Day led to

other software breakthroughs as well. To create the smooth surfaces of a humanoid creature, "Body Sock" was created by ILM. It allowed for the smoothing out and blending together of the edges of a digitally modeled surface. Also created during the filming of this movie was the "Make Sticky" software. It allows computer graphics artists to project two-dimensional images onto three-dimensional computer models from a live action plate. It was originally developed for the checkerboard floor scene in the movie, when the liquid robot rises up from the ground (see figure 5).
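The core idea of projecting a flat plate onto a model can be sketched in a few lines. The Python fragment below is an assumption about the general mechanism, not ILM's actual Make Sticky code: each model vertex is pushed through a 3x4 pinhole camera matrix (points are assumed to lie in front of the camera) and picks up the color of the plate pixel it lands on.

    import numpy as np

    def stick_plate_to_model(vertices, camera, plate):
        # For each 3-D vertex, find the plate pixel the original live-action
        # camera saw there, and "stick" that color to the vertex.
        h, w, _ = plate.shape
        colors = []
        for vertex in vertices:
            x, y, z = camera @ np.append(vertex, 1.0)  # 3x4 pinhole projection
            u = int(np.clip(x / z, 0, w - 1))          # plate column
            v = int(np.clip(y / z, 0, h - 1))          # plate row
            colors.append(plate[v, u])
        return np.array(colors)

As long as the virtual camera matches the one that shot the plate, the projected texture lines up with the live-action footage even as the model deforms.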

Figure 5. Using the Make Sticky software, the liquid robot seems to grow out of the ground [7].

    Figure 4. A typical screen shot from Photoshop [5].


Film and computer companies are constantly trying to improve old software and systems and create new ones. The above-mentioned systems are only a small sample of what is used in the industry. However, they give a good idea of the basis of systems used to create visual effects.

    V. "Simple" Tricks

Since the beginning of film, frames have been touched up by tinting. In the digital age, however, there are many other ways to touch up a piece of film. Products like Photoshop are used to clean up a frame or perhaps add an image here or there. This allows a lot of freedom in the movie making process. As a result, many simple tricks can be done to make the film more appealing.

One of the most common "tricks" is to make wires disappear. Often stuntmen or models are hung from wires, and in many older movies these wires can still be seen. Today, however, most wires can easily be removed from the image.

Wire removal is, for the most part, just painting and merging. For both of these tasks the main tool is a computerized brush: a software-configurable, pressure-sensitive pen cursor and tablet. They are provided with Matador software, which is designed by Parallax Software in London. There are different kinds of brushes, but the one most often used is called the merge brush.

Imagine two sheets of paper, one containing an image (the bottom sheet) and the other containing an image to be added to the first. By placing the second sheet on top of the first sheet, the bottom sheet makes an exact duplicate of the new image on itself, pixel by pixel. This is what the merge brush does.
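A bare-bones version of that stamp operation might look like the following Python sketch (the function name is invented, and Matador's pressure sensitivity and soft brush edges are omitted):

    import numpy as np

    def merge_brush(bottom, top, center, radius):
        # Stamp the brush once: wherever the circular footprint falls,
        # copy the top sheet's pixels onto the bottom sheet, pixel by pixel.
        h, w = bottom.shape[:2]
        rows, cols = np.ogrid[:h, :w]
        footprint = (rows - center[0]) ** 2 + (cols - center[1]) ** 2 <= radius ** 2
        merged = bottom.copy()
        merged[footprint] = top[footprint]
        return merged

Dragging the brush is just repeated stamping along the pen's path.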

How does this help with wire removal? If a wire holding an actor or actress is moving across a sky background, an image of the same scene without the actor/actress and wire can be merged over the spot in the original film with the wire. This gives the impression that the wire was never there in the first place. As a result, this can make it appear as if the

actor/actress is actually flying. Another common "trick" is to make the actors appear to be somewhere they actually are not, by changing the background. This is done with the use of masks (also known as mattes and keys). Before the introduction of computers, masks were made by placing pieces of opaque film over portions of the frames in unexposed film. Then further images were shot relative to the initial images and mixed with them. Often the lines between the films, called mask lines, were noticeable. Also, the film quality was degraded because this process required multiple exposures.

Today, a mask is harder to detect. Masks can be planned, in which case the scene is initially shot in front of a blue or green screen. These colors are used because they can easily be detected and removed. Bluescreening was developed in the 1950's. It is most often used on television weather reports, when the weatherman seems to stand in front of a map. He is actually standing in front of a bluescreen. The images are then masked onto the blue spaces behind him.

Masks can also be unplanned. In this case specific portions of the film must be dealt with in each frame, since there is not one color to remove. This procedure is known as rotoscoping, and it creates what is called a traveling matte. In rotoscoping, a computer artist must outline the areas that need to be kept. Then the computer will digitally remove the rest of the frame and place the new scene in the now empty space [8].

    VI. Morphing

    Morphing is "having the computertransform pixels attached to a geometric lattice


overlaid on an image to a differently shaped lattice overlaid on the same or another image. The second image can simultaneously be faded in as the first image is faded out" [9]. What this allows is the transformation of one object into another object. It was first introduced in the movie Willow (1988) to transform a woman into a tiger and other creatures.

To create the effect, the different images were first digitally scanned in on a blue background. Once the background was digitally removed, the elements were melded together. In a two-dimensional morph, the beginning and end frames are selected and the computer fills in the frames in the "in-between" stages. Next, key frames representing major poses in the sequence were selected. This allowed the artists to manipulate grids of points and curves and to spatially distort them over each input image, so that the edge and internal features of the different frames exactly overlap. The effect created was a seamless transition between two creatures and opened up a whole new door to the digital effects realm.

The underlying method of Morphing is called Warping. It allows pixels to be mapped onto a geometric grid and deformed in a planned fashion. Elastic Reality Inc. created the first package to perform this, and it is currently available to the public for Macs and PCs for less than $1000.
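The warp-and-dissolve recipe can be condensed into a short Python sketch. Everything here is illustrative: a single global affine transform stands in for the artist's local grid deformation, and nearest-neighbor sampling stands in for proper filtering.

    import numpy as np

    def affine_from_points(src, dst):
        # Least-squares 2x3 affine transform mapping src lattice points to dst.
        A = np.hstack([src, np.ones((len(src), 1))])
        M, _, _, _ = np.linalg.lstsq(A, dst, rcond=None)
        return M.T

    def warp(image, M):
        # Inverse-map every output pixel through the affine (nearest neighbor).
        h, w = image.shape[:2]
        inv = np.linalg.inv(np.vstack([M, [0.0, 0.0, 1.0]]))[:2]
        rows, cols = np.indices((h, w))
        coords = np.stack([cols.ravel(), rows.ravel(), np.ones(h * w)])
        x, y = (inv @ coords).round().astype(int)
        x = np.clip(x, 0, w - 1)
        y = np.clip(y, 0, h - 1)
        return image[y, x].reshape(image.shape)

    def morph_frame(img_a, img_b, pts_a, pts_b, t):
        # Distort both images toward the in-between lattice, then
        # fade image A out while image B fades in.
        pts_t = (1.0 - t) * pts_a + t * pts_b
        warped_a = warp(img_a, affine_from_points(pts_a, pts_t))
        warped_b = warp(img_b, affine_from_points(pts_b, pts_t))
        return ((1.0 - t) * warped_a + t * warped_b).astype(img_a.dtype)

Calling morph_frame for t running from 0 to 1 yields the in-between frames of the transition.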

Morphing is also currently used in television advertisements and shows. It was used in the 1991 Michael Jackson video Black or White, in which many different faces transformed into one another. In recent years, Morphing has become a commonplace technique in film and television.

VII. Donovan's Destruction - Indiana Jones and the Last Crusade

At the end of Indiana Jones and the Last Crusade, Walter Donovan meets his doom by drinking from the false Holy Grail. The sequence that follows was dubbed "Donovan's Destruction" by ILM and was the first all-digital composite.

    The challenge of creating Donovan's

quick aging was to make the scene one continuous shot instead of many cutaways. It was decided that the blending of the heads in the different stages of his death would be done with morphing. To create the aging head, a rubber latex cast of the actor's head was made, and motion control mechanics were added to make Donovan's cheeks come in and his nose go back. With the head in this most decayed position, a new cast was made from it that made the head look even older. From this

head, a third cast was made to make the transition to a skeleton. The heads were attached to a special motion control rig made to look like Walter Donovan's torso and neck. The different heads used the same rig in order to go through the exact same motions in the scene. The images were filmed in front of a blue screen and digitized much like in Willow, but unlike Willow, all the main elements in the scene would be manipulated and composited in the digital realm instead of just specific elements. The morphing of the heads was complicated by the need for the first puppet to have hair. To integrate the second and third stages, texture mapping of the master head was needed. After the shot was digitally composited and scanned back out to film, the first all-digital composite was complete. What resulted was a very realistic quick aging of a man into a skeleton.

VIII. The Magic of Forrest Gump

Forrest Gump (1994) contained many incredible visual effects. These ranged from placing the actor Tom Hanks alongside President Kennedy to removing an actor's legs.


Forrest Gump called for many scenes in which an actor interacted with characters from much older footage. In the scene involving President Kennedy, different footage of the actual president was used to perform the different actions needed in the film. Tom Hanks performed the actions he needed to make in front of a blue screen. The background images were filmed in another studio and formed another layer of the final composite. These layers are usually referred to as plates. Keyframing (discussed in section X, Digital Clones) was used to line up the hands of the President and Tom Hanks at the beginning and the end of the shake. The computer then interpolated the positions of Tom Hanks' forearm between the two keyframes. Tom Hanks' thumb was digitally painted over to make it look like he was actually shaking hands with President Kennedy in a room with a number of onlookers. Finally, the three different layers were all made to match the look of the newsreel's granularity and black-and-white tones.

Morphing was also used in this scene to make President Kennedy say the words the directors wanted him to say. Large sections of the President's face were warped from one frame to the next, while smaller, more intricate details, such as the teeth and lips, were digitally painted on each new frame.

Removing the legs of Lieutenant Dan (Gary Sinise) required the actor to wear blue stockings. These can be compared to a portable blue screen and made it easy to mask out the legs. Objects that his real legs would normally go through were added in the final sequences. To fill in the areas of the ground that his legs concealed, identical shots of the scene without the actor were used to replace the missing areas. To ensure that the two passes of the scene (with and without the actor) were identical, digital camera control was used.

    IX. Creatures From Another World

Creating incredible and amazing creatures has always been a fascination of filmmakers. In recent years, creating these "monsters" has turned more and more to the use of computers. This can easily be seen in three different films: The Abyss, Terminator 2: Judgment Day and Jurassic Park (as well as its sequel, The Lost World).

In The Abyss (1989), a fantastic water creature had to be created. The challenge was to create a creature that appeared to take on some form while retaining water's qualities. Before computer animation was decided upon, director James Cameron toyed with the idea of using methods like stop-motion animation and hydraulic water systems. ILM was chosen to create the creature

with a team led by Dennis Muren, John Knoll, Mark Dippe and Jay Riddle. Silicon Graphics workstations and Alias software were used to perform the animation. First, the model shop produced a clear resin model. This model allowed the animators to determine how light would reflect off the surface. Then a wire-frame model was built on the computer. Around this frame, the creature was designed (see figure 6). It was decided that instead of dripping water, the creature would appear to have a rippling effect that was created by the

Figure 6. The stages of development of The Abyss water creature. First a wire-frame model is made and then it is fleshed out [2].


program's filter. The final challenge was to make the light reflect off the creature. Light had to both pass through and reflect off the water being. Instead of using actual ray-tracing (which is more accurate) to determine the lighting, all the information was simply derived from the background plate. This was quicker to process and the end result was still effective: a very life-like creature made from water.

Ray-tracing is a slower but more accurate process. In ray-tracing, an imaginary ray is bounced around a scene in an attempt to collect intensity contributions. This allows global reflection and transmission effects to be obtained. A basic ray-tracing algorithm provides for visible surface detection, shadow effects, transparency, and multiple light source illumination.

The basic ray-tracing algorithm is effective but slow. It is similar to the Phong shading mentioned earlier. The algorithm sets up a reference frame that has pixel positions designated in the xy plane. A ray path is then determined that passes through the center of each screen-pixel position. Any illumination effects accumulated along this ray are assigned to the pixel. For each pixel ray, each surface of the scene is tested to determine if it intersects the ray. If it does, the distance from the surface intersection point to the pixel is calculated. The ray is then reflected off the surface along a specular path (a path where the angle of reflection equals the angle of incidence; it is used to determine how the light reflects). If the surface is transparent, secondary rays are sent through it. As the rays from the pixel travel through the scene, a binary ray-tracing tree is created, with the left branches representing reflection paths and the right branches representing transmission paths. The intensity of the pixel is then determined by accumulating the intensity contributions starting at the bottom of the tree. The intensity determines how the pixel looks, and as a result a three-dimensional object appears to reflect light correctly on a two-dimensional screen [4].
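Condensed into code, the algorithm might look like the Python sketch below. It is a toy, not a production renderer: directions are unit vectors, the scene is a list of spheres, and only the left (reflection) branch of the ray tree is followed; shadow and transmission rays are omitted. A full renderer would call trace once for the ray through the center of every screen pixel.

    import numpy as np

    def normalize(v):
        return v / np.linalg.norm(v)

    def nearest_hit(origin, direction, spheres):
        # Test every surface in the scene against the ray; keep the closest hit.
        best_t, best_sphere = None, None
        for sphere in spheres:
            oc = origin - sphere["center"]
            b = 2.0 * np.dot(direction, oc)
            c = np.dot(oc, oc) - sphere["radius"] ** 2
            disc = b * b - 4.0 * c
            if disc >= 0.0:
                t = (-b - np.sqrt(disc)) / 2.0
                if t > 1e-4 and (best_t is None or t < best_t):
                    best_t, best_sphere = t, sphere
        return best_t, best_sphere

    def trace(origin, direction, spheres, light, depth=0):
        # One node of the ray tree: local diffuse light at the hit point,
        # plus the reflection branch, accumulated recursively.
        t, sphere = nearest_hit(origin, direction, spheres)
        if sphere is None:
            return 0.05                               # background intensity
        point = origin + t * direction
        normal = normalize(point - sphere["center"])
        intensity = 0.1 + 0.8 * max(np.dot(normal, normalize(light - point)), 0.0)
        if depth < 3 and sphere["reflectivity"] > 0.0:
            # Specular path: angle of reflection equals angle of incidence.
            reflected = direction - 2.0 * np.dot(direction, normal) * normal
            intensity += sphere["reflectivity"] * trace(
                point, normalize(reflected), spheres, light, depth + 1)
        return intensity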

Ray-tracing involves many calculations to determine how the light reflects off the surface of an object. This was the main reason that ray-tracing was not used in The Abyss. Instead, approximations of reflected light were used, saving a large amount of time.

In 1991 a new challenge awaited with the film Terminator 2: Judgment Day, also directed by James Cameron. Here, the ILM special effects teams had to create a liquid metal cyborg from the future. The film resulted in forty-four computer graphics shots. These required 100 elements and 7,965 frames of three-dimensional imaging. The design team was led by Dennis Muren and Doug Chiang. The most important tool used in making these visual effects was the Trilinear Multicolor High Resolution CCD Digital Input Scanner. This allowed the images to be scanned in clearly and correctly. Just like in The Abyss, a model was created by hand to give the computer artists a reference to work with.

One main challenge was to have the liquid metal cyborg change into the actor Robert Patrick. The Human Motion Group of ILM had this job. First, Robert Patrick was filmed with a black grid painted on his body. This would help the computer assimilate the physical data. Then the reference footage was digitized and a frame-by-frame digital skeleton was built. "Body Sock" software was used to smooth out and blend together the edges of the digitally modeled surface. Using this software, the liquid metal creature could be fleshed out and transformed into the actor.

The other breakthrough in Terminator 2: Judgment Day was the Make Sticky software. This helped with the checkerboard floor scene mentioned earlier. It also allowed the animators to have the liquid robot pass through a set of bars. This scene was done by first filming the hallway with the bars and no actor, and then with the actor and no bars. When the


two scenes were digitized and merged, the three-dimensional model of the actor was match-moved to the actual footage of the actor walking. The final result was a man seeming to pass through the bars.

In 1993 Jurassic Park hit the movie theaters. The biggest challenge with this film was to create a number of creatures that had not existed for millions of years and make them look real. In 1997 the sequel to Jurassic Park, The Lost World, was released, and the same problems were faced in filming it.

To begin with, in Jurassic Park, wire-frame models of the dinosaurs were built (see figure 7). Next, the texture maps of the skin's

surface had to be painted. This was done using Viewpaint (a new ILM software program written by John Schlag and Brian Knep). Viewpaint allowed artists to paint the surface of the dinosaur as if it were a real sculpture. The digital artist could look at the three-dimensional model rendered on the screen and add the color with texture-map coordinates while bending it around the complex finished shapes. Also designed was a tool called

Enveloping. When used in conjunction with Body Sock software, the flesh of the dinosaur seemed to move with actual realism. Therefore, if a wrist were moved, the program would automatically move the rest of the arm in a natural response to the movement of the skeleton. Digital Rotoscoping was then used to place the dinosaurs into the scene. Finally, technical directors touched up the film to make it seem more realistic. Camera bounces were added to make it look as if the creature was actually stomping by, and film grain was added to make the film look more natural. The final shots were cleaned up by software paint programs like Photoshop (The Abyss and Terminator 2: Judgment Day used Photoshop in this way as well).

The Lost World contained about forty percent more three-dimensional work than Jurassic Park. Also, many of the shots were more complicated due to background effects or multiple dinosaurs interacting. The models were built with Softimage and Alias software, but a new tool called I-Sculpt was developed that allowed the animators to work with the models as if they were clay. The dinosaurs of The Lost World were developed much in the same way as in Jurassic Park, only more attention was paid to detail. There were now more close-ups and many different dinosaurs, each of which had to look unique. In all, it was a greater undertaking with fantastic results [10].

    X. Digital Clones

Human motion is one of the most difficult things to animate convincingly. It is very hard to make an animated image appear lifelike. Often it will lack the subtle gestures and movements that make us human. As a result, the characters will have a certain stiffness about them.

Research on human animation is a major topic in visual effects. Human clones would allow directors to have an actor or actress who never misses a shoot, never argues and never leaves to take up a better offer elsewhere. Already digital actors can be seen in films, especially in long shots where subtle aspects of the character would

Figure 7. The wire-frame models of the T-Rex from Jurassic Park [2].


not be noticed. In figure 8, a close-up of a group of digital actors can be seen standing on the deck of the Titanic in the film Titanic.

    There are three main ways to animate a

human being. The first is called Keyframing. In this technique, the animator specifies the critical, or key, positions for the objects. The computer will then fill in the missing frames by smoothly interpolating between the positions. The characters of the movie Toy Story (1995) were animated this way. There were more than 700 controls for each main character. Separate controls allowed the movement of each character's different parts.
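The in-betweening step itself can be tiny. The following Python sketch uses linear interpolation for clarity (production systems interpolate with smoother splines, and the handshake numbers here are invented for illustration):

    import numpy as np

    # Keyframes: the animator specifies only the critical poses;
    # the computer fills in every frame in between.
    key_frames = np.array([0, 24])        # frame numbers of the key positions
    key_angles = np.array([70.0, 15.0])   # e.g. a forearm angle at each key
    all_frames = np.arange(25)

    # Smooth in-betweening: interpolate the control value over all frames.
    in_betweens = np.interp(all_frames, key_frames, key_angles)

With more than 700 such controls per Toy Story character, every control is interpolated this way between its keyed values.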

Another technique is called Motion Capture. In this technique, magnetic or vision-based sensors are used to record the actions of a human or animal subject in three dimensions. Usually sixty to seventy points are marked on the figure to help the computer know where to scan the data. The more points that are used, the more exact the computer-generated clone will be. The computer then uses the data to animate the character. A similar technique called Laser Range Finding with Structured Light uses a laser scanner, much like those found in supermarket checkouts, to record the contours and shapes of the subject [11].

The final and newest technique is called Simulation. Simulation uses the laws of physics to generate the movements needed. Major research on this technique is being done by Jessica K. Hodgins at the Georgia Institute of Technology [12]. The computer is able to simulate movements of the clone by physically determining how the body parts will move. For example, running can be shown by having the computer determine the torque needed to swing a leg forward (see figure 9). As a result, actual weights and sizes of the different parts of the body must be determined and input. Still, the result is only an approximation of the human body. Simulation allows slightly different sequences to be produced, without the need to film a new subject, while maintaining physical realism. It also allows for real-time motion, which is most important in virtual environments.
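As a deliberately crude illustration of the idea (not Hodgins' actual controllers), the Python sketch below models a leg as a rigid rod pivoting at the hip and integrates its equation of motion under a constant hip torque; the mass and length defaults are invented:

    import numpy as np

    def swing_leg(torque, mass=8.0, length=0.9, dt=0.005, steps=200):
        # Integrate I * theta'' = torque - m * g * (L / 2) * sin(theta),
        # the equation of motion of a rod pivoting at the hip.
        g = 9.81
        inertia = mass * length ** 2 / 3.0     # thin rod about one end
        theta, omega = 0.0, 0.0                # angle from vertical, angular velocity
        poses = []
        for _ in range(steps):
            alpha = (torque - mass * g * (length / 2.0) * np.sin(theta)) / inertia
            omega += alpha * dt                # semi-implicit Euler integration
            theta += omega * dt
            poses.append(theta)
        return poses                           # the leg's pose at each time step

Because the motion comes from physics rather than recorded footage, changing the torque produces a slightly different but still physically plausible swing.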

Unfortunately, creating a character with the simulation method is time consuming and requires great expertise. Also, since actual footage is not always used, Simulation is currently not realistic enough for motion picture use.

The need for digital clones was great in the recent film Titanic. Hundreds of digital extras were needed for the crew of the famous ship. To design these "actors," a mixture of Keyframing and Motion Capturing, called Rotocapturing, was used [1]. This technique began with Motion Capturing using a reduced marker set. Instead of sixty to seventy points on the figure, only about twenty were used. The image skeleton was brought into

Figure 9. The top row is an actual human running. The bottom row is a cloned human using the Simulation technique [12].

Figure 8. The digital clones on the Titanic. This is a close-up taken from the movie; it is not an actual scene [1].


Softimage, where a puppet was rotoscoped against it to make the Keyframing poses. Since fewer points were used to motion-capture, the characters were made more quickly and with fewer problems.

Both Keyframing and Motion Capturing are commonly used to create human clones. Simulation is a newer technique that is still being researched. In all three techniques, human clones are still not flawless and can often be easily recognized in a film.

    XI. IMAX Theaters

IMAX is the largest film format in history, using 70mm, 15-perforation film frames as opposed to the traditional 35mm, 4-perforation film frames. In 1995 ILM produced the visual effects for the IMAX film Special Effects. Due to the difference in film type, new challenges needed to be overcome.

To work with the IMAX film, a third generation scanner had to be built. This new scanner was compatible with both types of film (35mm and 70mm). The team that designed the scanner at ILM was headed by Lincoln Hu and Josh Pines. The new scanner employed the same Trilinear CCD imaging technology as the second generation scanner, but new electronic, optical, mechanical and illumination designs were introduced to improve scan speed, image quality and flexibility [2]. Due to the size of the film, processing time was multiplied by a factor of four, but the system did allow the new footage to be digitally scanned into a computer. Since the film is so large, film defects are also magnified. This means that more care must be taken in "cleaning" them up.

IMAX offers a whole new experience in movie going and in visual effects design. The sheer size of the film is what makes it both so impressive to watch and so difficult to work with. While IMAX films can be digitally processed, work is still occurring on the digital scanning system for the 70mm film in order to improve it and make it easier to use.

    XII. The Future of Digital Effects

What does the future hold for digital effects? Already films have allowed us to travel to strange and wonderful places, and let us watch people change shape and size. Directors have already incorporated digital clones into their films, so what is left to do?

The one major problem with digital effects is that the grander projects often end up losing realism. Even if a viewer has never seen a creature before, the viewer can claim it looks fake. This is often because the creature looks more like a cartoon than a living, breathing being. The same is the case for many digital clones, especially when they need to be seen close up. One of the reasons Toy Story's digital animation was successful was that it was a digital cartoon and not a digital representation of actual life.

However, as effects research goes on, effects begin to look more realistic. It is possible that in a few years a film will be made entirely with the use of digital actors and no audience member will know the difference. This would incite a change in film and video making as it is known today. However, it is believed that no digital clone could ever bring the human qualities to the screen the way a real actor does.

It is not too far-fetched to believe that one day interactive movies will be made and become commonplace in the world. Virtual Reality simulations are becoming increasingly popular in training people to fly planes, drive tanks, etc. It may not be long before these simulations hit the entertainment field in the form of virtual movies. One thing is certain, however: film and television effects will continue to attempt to amaze audiences in any way possible.


    XIII. Conclusions

Visual effects have become an important, if not necessary, part of film and video making. As the requirements of directors grow, so does the technology used to create these effects. With the introduction of computers into the film world, a whole new era of film effects, called digital effects, came into use. Digital effects give filmmakers the freedom to do things they never thought possible, except in their dreams. Actors can be placed in a far-away land without ever actually going there. Stunts too dangerous to be performed by a human can now be shown. Even aliens are

now allowed to invade and destroy the earth, all within a film or television studio. Digital effects allow artists to do things

ranging from simple wire removal to joining two pieces of film into one, as seen in Forrest Gump. Morphing, one of the most exciting technologies, allows creatures and people to change shape in one seamless shot. The use of virtual actors has allowed filmmakers to add hundreds of extras when it would be too expensive to hire them individually.

What does the future of visual effects hold? The possibilities are endless. Toy Story became the first fully computer animated major motion picture. Perhaps one day a film or television show will be made without the use of actors and actresses. Whether or not this is a good thing remains to be seen. There is probably a sense of realism that a computer-generated actor can never achieve. However, digital effects have opened new doors and allowed many incredible achievements. If they are wielded correctly, digital effects make for a very powerful and effective film and video making weapon.

    References

[1] Visual Magic, web site devoted to information on visual effects, "Are You Ready to Go Back to Titanic," http://www.visualmagic.awn.com (1998).
[2] M. Vaz, Industrial Light and Magic: Into the Digital Realm, New York: Ballantine Books, 1996.
[3] Pixar's web page, information on the company and its products, http://www.pixar.com (1998).
[4] D. Hearn and M. P. Baker, Computer Graphics, New Jersey: Prentice Hall Inc., 1997.
[5] Adobe's web page, web site devoted to the products of Adobe, http://www.adobe.com (1998).
[6] Web site with images from the movie The Abyss (unofficial), http://www.lysator.iliu.se/~hakgn/ab-images2.html (1998).
[7] Web site devoted to images from the movie Terminator 2: Judgment Day (unofficial), http://www.peakaccess.net/~gurvits/fx/terminator/doc/imagefxthumb.html (1998).
[8] R. Braham, "The Digital Backlot," IEEE Spectrum, v. 32, pp. 50-63, July 1995.
[9] R. Braham, "The Digital Backlot," IEEE Spectrum, v. 32, p. 52, July 1995.
[10] B. Robertson, "Something Has Survived," Computer Graphics World, pp. 34-38, June 1997.
[11] C. Davidson, "Darlings, Clones, You Were Wonderful," New Scientist, p. 31, 14 May 1994.
[12] J. K. Hodgins, "Animating Human Motion," Scientific American, v. 278, pp. 64-69, March 1998.

Ian Kezsbom is a senior at Binghamton University working towards his bachelor's degree in Computer Science. He is a graduate of Stuyvesant High School in New York and is a member of Upsilon Pi Epsilon, the computer science honor society. He has been a course assistant for the introduction to computer programming class as well as a computer science tutor. He has also directed and created many of his own short videos and has experience working with both analog and digital editing systems. He was a member of Binghamton University's Varsity Swim Team during his first three years of college, specializing in the butterfly events. Over the past summers, he has been a lifeguard and swim instructor and has worked as a video production assistant.