DREAMSPACE_D6.1.4_V3_300916.docx

D6.1.4 Final Prototype Dreamspace Performance Environment

Deliverable due date: 31.09.2016
Actual submission date: 06.10.2016
Project no: FP7 - 61005
Project start date: 01.10.13
Lead contractor: The Foundry


Table of Contents

1 Executive Summary
2 Introduction
3 Scope of Work
   3.1 Context
   3.2 Specifications for Creating Holographic and Immersive Experiences
      3.2.1 Review of D5.2.1
   3.3 Unity Authoring Pipeline
      3.3.1 Overview
      3.3.2 Capture Projection Surface Geometry
      3.3.3 Author Mixed Media Content
      3.3.4 Render onto Immersive Display Surface
      3.3.5 Listing of Developed Technical Components
      3.3.6 Methods of Distribution
   3.4 State-of-the-Art and Achievements
      3.4.1 Review of D5.2.2
      3.4.2 Immersive Multi-Projector Calibration
      3.4.3 A Unified Workflow Enabling Cross-Fertilization between Film/Broadcast and Installation Art
      3.4.4 Progressing Virtual Set Building
4 User Tests and Evaluation
   4.1 Mixed-Media Authored Content Tests
      4.1.1 Celestial Bodies (CREW)
      4.1.2 Workshop Immersive Spaces (FAAI)
      4.1.3 Longing for Wilderness, SIGGRAPH (FAAI)
      4.1.4 CAPE_Buchmesse (CREW)
   4.2 Immersive Display Tests
      4.2.1 Testing the Coal Track (external art project)
      4.2.2 The Pavilion (CREW)
5 Public Demonstrators
   5.1 CVMP 2015
   5.2 Battleground Final Demonstrator
      5.2.1 Overview
      5.2.2 System Components
      5.2.3 Conclusion
   5.3 IBC 2016
6 Conclusion
7 Appendices
8 References


1 Executive Summary

Building immersive and holographic view-dependent experiences by projecting 3D content onto arbitrary surfaces can be a cumbersome process. Currently this is done by combining different ad hoc tools, such as dedicated calibration software and custom graphics engines, to complete the task. Besides numerous artistic possibilities, these experiences can provide crucial feedback and novel viewpoints when building a mixed 3D CG/film virtual set, as they offer a collaborative and open environment with a preview-quality, view-dependent render of the scene. This enhances the understanding of the space and enables artistic interaction with the layout and position of its assets.

In Dreamspace Work Package 5 Task 2 (WP5T2), iMinds developed the tools to construct an immersive environment in which filmed real-world imagery is displayed on arbitrary surfaces to create a truly holographic experience for a motion-tracked viewer walking around the space, without the need for a head-mounted display. The target of the work was to define and build a unified workflow that enables a user to quickly capture the layout of a space, author 3D content to be displayed onto this space, and render it back out through the display system in a holographic way. This work focused on democratizing the creation of immersive spaces by providing the building blocks and a unified toolset at a level of cost, effort and simplicity that allows exploitation in performance and installation art.

This deliverable presents a detailed overview of the toolset's specifications and the current state of the art in the field, lays out the achievements made in this task, and presents an in-depth overview of the developed technical components, evaluated through several user field tests. The final Dreamspace demonstrator Battleground was built using the immersive display in conjunction with the LiveView integrated system. The immersive environment displaying free-viewpoint video content was showcased at CVMP 2015, and the immersive virtual set using a calibrated display setup was evaluated as part of the Battleground final production showcased at IBC 2016.

The developed toolset is made publicly available and can be downloaded at the following link: http://www.github.com/HasseltVR/Holo_Toolset


2 Introduction

Task WP5T2 of the Dreamspace project concerned the development and integration of tools for creating immersive environments in which navigable filmed content (depth-enhanced omnidirectional video) is displayed in an environment allowing the spectator to navigate that content physically. In particular, we aimed for view-dependent multi-projection environments, creating a "holographic" experience space. The experience space is "holographic" in the sense that the projected imagery adapts in real time to the viewing position of a spectator, creating the illusion that the projection screen itself disappears, while the reproduction of motion parallax increases the experience of "being inside".

Viewpoint-dependent display of synthetic 3D CG content has been familiar since the beginning of the 1980s, and became well known with the virtual reality hype and CAVE environments of the 1990s. Dreamspace, in contrast, aims at facilitating the display of "filmed" rather than "synthetic" free-viewpoint content in viewpoint-tracking display environments, creating navigable experiences with a truly photorealistic look at affordable cost, both in terms of required hardware and in terms of person-cost for creating and processing the content.

Unlike ordinary stereoscopic 3D video, the spectator experiences a higher sense of immersion thanks to the reproduction of motion parallax when "moving" in the environment. The reproduction of motion parallax is similar to a hologram, hence the term "holographic". Unlike true holograms, however, Dreamspace addresses experiences in full-color moving imagery, using off-the-shelf IT technology. The technical setup of such an experience is quite a challenging task. The process consists of combining different, often monolithic and proprietary, software and hardware components that have to be matched in order to provide a true holographic sensation.

From an artistic point of view, the Dreamspace project[1] focused on "enabling a cross-fertilization between cinema, TV, entertainment and performance and installation art", on "help[ing] artists to use the technology with little or no technical support", and finally noted that "immersive projection may also be beneficial in the context of virtual broadcast production".

From a technical point of view, the original objectives of Dreamspace[2] were to "integrate free-viewpoint video resulting from multi-view fused depth- and dynamic-range enhanced omnidirectional video with previously developed viewpoint-dependent immersive multi-projection technology". It is also stated that the resulting tools will be "highly modular and configurable" and that they will enable the viewer to "physically walk around in filmed dynamic real-world scenery", as opposed to synthetic 3D CG content.

[1] Dreamspace Description of Work, p. 22.
[2] Dreamspace Description of Work, p. 22.


WP5T2 aimed at resolving all these points by proposing a unified workflow of modular components using low-cost hardware, simplifying the pipeline involved in making holographic experiences. This workflow does not impose a fixed procedure but enables the artist to combine the necessary building blocks and tweak their parameters without having to worry about the underlying technical processes. The workflow is able to work with high-resolution, high-frame-rate, depth-enhanced content and contains the tools to link up with external hardware such as motion capture devices, 6DOF tracking and other interaction devices. Using the workflow, the user is able to combine media of different sorts and gather them in one common 'playground'. This environment also serves as a base when building and designing virtual sets, allowing depth-enhanced and view-dependent video textures to be mixed. The viewer experiences this environment from a tracked point of view, enabling him to walk around in a 'filmed space'.

It was vital to gain user feedback during the development process in order to fine-tune the research using the needs and questions of people working with these environments. As such, we applied a 'design-by-doing' methodology and pushed knowledge gained through various small user tests back into the implementation process.

The following sections document the research realizing the final prototype Dreamspace Performance Environment, defined as a unified performative platform for building Holographic Immersive Experiences, which we will from now on abbreviate as HIE.


3 Scope of Work

3.1 Context

In order to understand immersive and holographic experiences and how they are realized using multiple projectors, the following section discusses the setup for the installation No Horizon (2011), designed by Belgian-based theatre company and project partner CREW.

No Horizon places the viewer in a space where 6 projectors project content on all the surrounding walls, the floor and the ceiling. A chariot is attached to a rail system on the floor that guides to the far end of the space. Various objects are placed against the walls. A spherical grid[3] serves as a source image. When the viewer moves through the space by pushing the chariot, a laser sensor measures the distance to the viewer and triggers a new perspective on the display computer. This space has been pre-calibrated with an omnidirectional camera, capturing probes at different distances. Moving the chariot triggers an interpolation between these different calibrations. Each calibration adjusts the output of each projector in order for the viewer to feel surrounded by one image. As a result, the user feels completely inside the projected environment and forgets the physical layout of the space he is in while pushing the chariot forwards or backwards.

Figure 1 clearly shows that each projector's output is warped and blended together, so they form one seamless image. Calibrating the surface in order to generate the data to warp and blend each projector is fundamental when working with multi-projector setups.
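The distance-triggered interpolation described above can be sketched as follows. This is an illustrative sketch, not CREW's actual code: it assumes each pre-captured calibration is stored as a warp mesh (a fixed grid of 2D output positions) and linearly blends the two calibrations that bracket the measured viewer distance.

```python
# Illustrative sketch (assumed data layout, not the CREW/iMinds code):
# interpolating between pre-captured projector calibrations by distance.

def interpolate_calibrations(calibrations, distances, viewer_distance):
    """Blend the two calibrations bracketing viewer_distance.

    calibrations: list of warp meshes, each a list of (x, y) tuples
                  for the same fixed grid of mesh vertices.
    distances:    sorted probe distances at which each mesh was captured.
    """
    # Clamp outside the captured range.
    if viewer_distance <= distances[0]:
        return list(calibrations[0])
    if viewer_distance >= distances[-1]:
        return list(calibrations[-1])
    # Find the bracketing pair and the blend factor t in [0, 1].
    for i in range(len(distances) - 1):
        d0, d1 = distances[i], distances[i + 1]
        if d0 <= viewer_distance <= d1:
            t = (viewer_distance - d0) / (d1 - d0)
            return [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
                    for (x0, y0), (x1, y1) in zip(calibrations[i],
                                                  calibrations[i + 1])]

# Two probe calibrations, captured 1 m and 3 m along the rail; viewer at 2 m
# sits exactly halfway, so each mesh vertex is averaged:
meshes = [[(0.0, 0.0), (1.0, 0.0)], [(0.2, 0.0), (0.8, 0.0)]]
print(interpolate_calibrations(meshes, [1.0, 3.0], 2.0))
# -> [(0.1, 0.0), (0.9, 0.0)]
```

In the installation, the laser-measured chariot distance would drive `viewer_distance` every frame, giving a smooth transition between the probe calibrations.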

Figure 1: Setup for No Horizon (CREW, 2011)

[3] A spherical grid is an image of a grid that perfectly wraps around a sphere.


Work Package 5 Task 2 (WP5T2) aimed at democratizing this setup by offering a unified pipeline that enables artistic users to calibrate arbitrary environments and use them as immersive displays. This task also addressed the interoperation of such immersive and interactive displays with a professional film pipeline, such as the Dreamspace pipeline built around LiveView. The LiveView system is the integrated demonstrator system that provides live visualization in a virtual production environment, connected to each of the technical innovations in the Dreamspace project: light capture, live depth capture, real-time global illumination, on-set intuitive tools, and the immersive display environment created in WP5T2.

3.2 Specifications for creating holographic and immersive experiences

Building immersive and holographic view-dependent experiences by projecting 3D content onto arbitrary surfaces can be a cumbersome process. Currently this is done by combining different ad hoc tools, such as dedicated calibration software and custom graphics engines, to complete the task. It is not unusual for artists and designers to incorporate hardware and software in their work, since it allows them to explore new possibilities in content generation, creating digital experiences and letting users interact with these worlds.

In WP5T2 of the Dreamspace project, iMinds has developed tools to construct an immersive environment in which filmed real-world imagery and 3D CG content is displayed on arbitrary surfaces to create a truly holographic experience for a motion-tracked viewer walking around the space without the need for a head-mounted display. The target of the work was to define and build a unified workflow that enables a user to quickly capture the layout of a space, author 3D content to be displayed onto this space, and render it back out through the display system in a holographic way. This work focused on democratizing the creation of immersive spaces by providing the building blocks and a unified toolset at a level of cost, effort and simplicity that allows exploitation in performance and installation art. The primary goal of WP5T2 is to facilitate the display of immersive video on low-cost, easier-to-set-up multi-projection systems made from common off-the-shelf hardware. We believe this will drive creative exploration more strongly than addressing high-end environments using high-end hardware.

The work done in task WP5T2 builds upon earlier work of Hasselt University, iMinds and project partner CREW, most notably in the 2020 3D Media FP7 integrated project, on multi-projection environment calibration, rendering and display, as well as Dreamspace WP3T2 and WP4T4 work on capture, processing and rendering of high-resolution depth- and dynamic-range enhanced omnidirectional video.

3.2.1 Review of D5.2.1

WP5T2 required an initial research phase to draft a list of specifications for a toolset allowing the creation of holographic and immersive experiences, in short a HIE. This research was documented in deliverable D5.2.1 and aimed at getting an overview of which components are necessary for realizing such a pipeline.

Several building blocks or components were defined that need to be present when building HIEs. The selection of these components is strongly influenced by the 'design-by-doing' methodology mentioned earlier and results from the experience of iMinds/Hasselt University in supporting project partner CREW and other artistic and creative research. The different technical components are listed as follows:

1. Multi-projection calibration by means of camera feedback (2020 3D Media work and later refinements at iMinds):

   a. Calculation and display (projection) of calibration patterns
   b. Analysis of captured patterns into camera-projector maps
   c. Filtering of such maps, to eliminate noise and reflections in the projection environment
   d. Calculation of template blending masks and warping meshes

   These algorithms have been developed in the 2020 3D Media EU project and later refined in various regional R&D collaborative projects at iMinds. A basic description is found in 2020 3D Media deliverable D6.6, section 4.[4]

2. Mesh-based content warping: using 2D mesh distortion it is possible to warp the output pixels to compensate for the display surface's shape.

3. Video warping to parametric models: working with immersive content and virtual worlds means stepping out of the rectangular screen and using a virtual sphere around the viewer's head to project content upon. Different mathematical models exist to map this spherical representation of the virtual world to a 2D surface (like a display): rectilinear, equidistant fish-eye, stereographic, orthographic, equirectangular, cylindrical, ...

4. Masking and blending: dynamic blending so that the overlap between projectors is compensated for and one seamless image is formed.

5. Interaction: choice of the viewer's pan/tilt/zoom and position in space.

6. Multi-pass rendering pipeline, allowing the pipeline to work with 3D CG but also with video-based content.
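As a concrete illustration of component 3, the equirectangular model maps a direction on the virtual sphere around the viewer's head to 2D texture coordinates. The sketch below uses one common convention (+z forward, +y up, u and v normalized to [0, 1]); the actual tools may use different axes or ranges.

```python
import math

def equirectangular_uv(direction):
    """Map a 3D view direction to equirectangular texture coordinates.

    Convention assumed here: +z is forward, +y is up; u in [0, 1] spans
    longitude -pi..pi, v in [0, 1] spans latitude -pi/2..pi/2.
    """
    x, y, z = direction
    r = math.sqrt(x * x + y * y + z * z)
    lon = math.atan2(x, z)            # longitude around the up axis
    lat = math.asin(y / r)            # latitude above the horizon
    u = lon / (2.0 * math.pi) + 0.5
    v = lat / math.pi + 0.5
    return u, v

print(equirectangular_uv((0.0, 0.0, 1.0)))   # straight ahead: centre of the map
print(equirectangular_uv((0.0, 1.0, 0.0)))   # straight up: top edge of the map
```

The other parametric models listed (fish-eye, stereographic, cylindrical, ...) differ only in how `lon`/`lat` are turned into `u`/`v`.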

In order to realize all these components in one pipeline, a review of existing 3D authoring platforms was necessary. A creative 3D authoring platform is a software environment where one can freely combine media of all sorts and place them in a 3D world. These platforms support different ways of navigating through this world and support rendering to different output destinations. They are creative and artistic in nature and are as such regularly used by artists and designers for building interactive art, museum applications, Virtual Reality experiences and video art.

D5.2.1 reports on an evaluation of several of these software tools in order to get a detailed overview of existing platforms on the market. It was crucial to base further research on commonly available software in order to be accessible to artistic developers. Again, the listing of the specifications is the result of a 'design-by-doing' methodology: instead of focusing on a formal a priori specification sheet, the necessary developments are outlined based on previous experience and current demands.

We summarize the findings of D5.2.1 in the following figure and paragraph (Appendices B/C contain more readable versions of the following figure).

[4] http://www.20203dmedia.eu/materials/PU4/20203DMedia-Deliverable-D6.6-V6-Final.pdf

Figure 2: Overview of the 3D authoring platforms evaluated in D5.2.1


Most of the platforms evaluated are non-free, downloadable application binaries that bundle functionality for working with rich media content (Max, Touch Designer, Ventuz, PixelConduit and Isadora). These applications focus on enabling their users to rapidly prototype and deploy their applications. All of them use a node-based visual-programming style and predefine certain functionality in blocks that can be freely combined to form a global workflow operating on input data. These applications are not always cross-platform. The fact that most of them only work on Windows is probably closely linked to the fact that Windows machines allow for easy hardware upgrades. Windows systems are also still the standard in the event industry when looking at high-end video and lighting applications.

Others (The Foundry Blink, OpenFrameworks, Cinder and OpenSceneGraph) are purely code-based and propose a set of libraries to work with multimedia and visualize 3D content. They are mostly cross-platform and can support different render backends, mainly OpenGL and DirectX, whose APIs a user can access directly. Being code-based, these frameworks can be extended to support any application, but they lack a starting template for the user to begin with. As such, there is a serious startup cost involved in building multimedia applications, mainly when it comes down to making a user interface. The paid platforms do provide such an interface and tend to present the user with an artistic 'tabula rasa', ready to be filled using the controls already in the interface.

Unity sets itself apart from the others, since it is a free (for non-commercial use) game engine, platform-agnostic from the ground up and fully scriptable. It scales itself to the specifications of the host machine. It has a well-structured interface and provides the user with an editor where all the authoring happens, and a game view that displays the result from a defined virtual camera point.

All of the tools, to a lesser extent PixelConduit and The Foundry Blink, can place assets in 3D and control their visual presence. High-resolution, high-frame-rate video playback seems to be only gradually supported, and depends greatly on platform-specific implementations, as no cross-platform video playback implementation exists.[5] Playing back multiple streams in parallel proved to be unstable. Some platforms, such as Touch Designer, support tight integration with output devices such as the widely used Matrox or Blackmagic hardware. All platforms except PixelConduit support remote control through network messages. All tools can be extended through scripts or through native code in the form of plugins.[6]

The deliverable concluded[7] that Unity was the best candidate to build our unified pipeline upon. From now on in this document we will refer to this pipeline as the Unity Authoring pipeline. This pipeline connects all necessary components for building HIEs. The next section, 3.3, describes in detail how all components listed in section 3.2.1 were incorporated into the Unity Authoring pipeline.

[5] The GStreamer open source multimedia framework is the only true cross-platform video engine that we know of. It is able to play the majority of video codecs existing today, but its API for constructing graphs is rather complex. Rendering to texture for fast GPU operation is only partially supported.
[6] V. Jacobs, P. Bekaert, D5.2.1, Specification and description of tools for holographic live performance and installation art, p. 10.
[7] V. Jacobs, P. Bekaert, D5.2.1, p. 10.

Deviation from the Description of Work

The choice of the Unity platform as a base deviates from the Dreamspace Description of Work, where it was stated that the building blocks would be developed for Max/MSP and The Foundry Nuke.

Max/MSP is indeed the "de facto standard for node-based visual programming environment for real-time audio..."[8] and provides a fast way to combine different computing blocks that deal with video, light and sound, but it was assessed[9] that its video and 3D content processing blocks lack performance compared to others and are not on par with the current demand for functionality.[10]

The Foundry Nuke is a powerful compositing engine with an exhaustive image processing toolset, but it lacks the possibility to interact with and preview live what has been composed. In WP4T2, The Foundry has developed a live compositing toolset based on Nuke, and this has been integrated into the LiveView system for the Dreamspace project. This provides an offline workflow to set up CG content and the compositing pipeline using the high-end tools Nuke and Katana. In the workflow we propose, Unity is used as a simple, accessible 3D workspace. This workspace can be connected to Nuke running as a live compositing system through LiveView, as outlined in Section 3.4.4.

3.3 Unity Authoring pipeline

3.3.1 Overview

Unity is a modern and powerful environment for operating in real time and live with 3D content. It has gained much acclaim in game design and 3D prototyping. It provides an extensive toolset and can be easily extended using scripts (written in JavaScript or C#) or by creating native plugins in C++, where hardware resources can be used at full performance. Furthermore, Unity responds well to different kinds of (physical) live input and is made cross-platform from the bottom up.[11]

[8] Dreamspace Description of Work, p. 22.
[9] V. Jacobs, P. Bekaert, D5.2.1, Specification and description of tools for holographic live performance and installation art, p. 6.
[10] V. Jacobs, P. Bekaert, D5.2.1, p. 8-9.
[11] V. Jacobs, P. Bekaert, D5.2.1, p. 6.

The Unity Authoring pipeline connects all necessary components for building HIEs. iMinds developed a calibration tool to capture the physical layout of the space with a low-cost, off-the-shelf omnidirectional camera and feed this data into Unity, which serves as the central 3D playground. The platform was extended to make it understand multi-projection calibrations and able to warp and blend each projector, with the option of performing these operations on the GPU through NVIDIA's NVAPI. iMinds developed extensions to work with high-resolution, high-fps and view-dependent video textures and enabled Unity to understand different kinds of virtual reality peripheral devices. Where possible, developments were made cross-platform. As a result, the user is presented with a unified pipeline and intuitive workflow consisting of off-the-shelf low-cost components to creatively build novel immersive experiences and present them in a view-dependent and holographic way.

The Unity Authoring pipeline is able to:

• capture the physical layout of the space through a low-cost probe camera
• author multimedia content and interaction
• render this content onto the physical space using a multi-projection setup, adapting to a moving viewer

Sections 3.3.2 to 3.3.4 present each of these three stages in more detail.

3.3.2 Capture projection surface geometry

In order to use any projection surface as an immersive display surface, there needs to be a system to determine the position of each projector and to define the overlap with neighbouring units. The process of determining this information is called geometric calibration.

The surface can be a straight wall, but also any other irregular surface, as can be seen in Figure 3. The setup in this figure consists of 3 projectors, yielding a fair amount of overlap. Each projector's output strikes the surface at an angle, so some parts are closer to the projector than others. To account for the curvature of the surface, we need to find the correct distorting mesh to compensate for the shape. This process is called warping. The process that compensates for the overlap between the projectors is called blending.

A warped and blended system can be seen in Figure 4. The image shows the same curved setup as in Figure 3, but now presenting one gapless, seamless image. From the point of calibration (sometimes referred to as the 'sweet-spot'), the user will perceive one image around him, as if he were standing in the middle of a sphere. The geometric calibration process specifies how to warp and blend pixels in order to form one unified and seamless image on the display surface.
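A minimal sketch of the blending idea: inside the overlap region between two projectors, each projector's contribution is faded with a linear ramp so the two weights always sum to one. Real systems typically also gamma-correct the ramp; the function below shows only the basic principle, not the iMinds implementation.

```python
# Illustrative blending sketch: a linear 'feathering' ramp across the
# overlap region between two horizontally adjacent projectors.

def blend_weight(x, overlap_start, overlap_end):
    """Blend weight for the left projector at horizontal position x.

    Inside the overlap the weight ramps linearly from 1 down to 0; the
    right projector uses the complementary weight, so the two
    contributions always sum to one and the seam disappears.
    """
    if x <= overlap_start:
        return 1.0
    if x >= overlap_end:
        return 0.0
    return (overlap_end - x) / (overlap_end - overlap_start)

# Overlap region between pixel columns 800 and 1000:
for x in (700, 850, 900, 950, 1100):
    left = blend_weight(x, 800, 1000)
    right = 1.0 - left
    print(x, left, right)
```

In a full calibration the blending masks computed this way are stored per projector, alongside the warping meshes, and applied in the final shader pass.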


Figure 3: Uncalibrated multi-projector setup consisting of 3 units

Figure 4: The same setup calibrated through the iMinds tools, showing one seamless image

3.3.3 Author mixed media content

iMinds developed plugins for the popular Unity engine allowing it to parse the calibration data of the capture stage and providing the user with the ability to present any scene, with any content made in Unity, on the calibrated display surface. The focus here is on being able to mix live, real-time animated content with pre-recorded assets and present them on the fly on the calibrated immersive display.

The Unity Authoring pipeline is a platform where the designer can load rich media content, define its position in the 3D world and define its interaction, driven by an external peripheral device. The platform is able to work with high-resolution video content, e.g. 360-degree captured video, and has the ability to attach this content to 3D geometry defined in the scene.[12] The software is also able to translate the movement of the viewer into depth-enhanced and view-dependent rendering of content provided by the tools discussed in WP3[13] and WP4[14].

After defining assets and positioning them in 3D, the pipeline is then able to capture the scene by placing a virtual camera in the middle and capturing it from that point. This way, the user gains maximum freedom in combining different layers of content, both CG and video, without having to worry about a final compositing step. The scene will always look good from the point of the capturing camera. Preferably, materials should be freely assignable to all the assets so that their lighting and color parameters can be adjusted on the fly. This virtual probing camera generates a spherical representation of the scene as a render texture that is subsequently fed to a shader stage, where it serves as the source for the display output.
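The shader stage described above essentially performs a texture lookup: for every warp-mesh vertex the calibration supplies (u, v) coordinates into the spherical render texture. Below is a minimal CPU-side sketch of such a bilinear lookup; the GPU does the equivalent in hardware, so this is not the actual shader code.

```python
# Illustrative sketch of the bilinear texture lookup performed by the
# shader stage when sampling the spherical render texture.

def sample_bilinear(texture, u, v):
    """Bilinearly sample a row-major grid of values at normalized (u, v).

    texture: 2D list, texture[row][col]; u, v in [0, 1].
    """
    rows, cols = len(texture), len(texture[0])
    fx, fy = u * (cols - 1), v * (rows - 1)
    x0, y0 = int(fx), int(fy)
    x1, y1 = min(x0 + 1, cols - 1), min(y0 + 1, rows - 1)
    tx, ty = fx - x0, fy - y0
    # Interpolate horizontally on both rows, then vertically between them.
    top = texture[y0][x0] * (1 - tx) + texture[y0][x1] * tx
    bottom = texture[y1][x0] * (1 - tx) + texture[y1][x1] * tx
    return top * (1 - ty) + bottom * ty

tex = [[0.0, 1.0],
       [2.0, 3.0]]
print(sample_bilinear(tex, 0.5, 0.5))   # centre of a 2x2 texture -> 1.5
```

In the real pipeline the same lookup runs per output pixel, with the (u, v) coordinates interpolated across the warping mesh and the result multiplied by the blending mask.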

Figure 5: Mixed media: 360-degree video (left), 3D CG (right top) and panoramic view-dependent (right bottom)

[12] 360-degree pre-stitched video is a very good candidate for use inside immersive experiences, as it is an easy way of immersing a viewer inside a photo-realistic environment.
[13] Reported on in D3.2.1: "Depth- and dynamic-range enhanced omnidirectional capture system for dynamic locations".
[14] Reported on in D4.4.1: "Multi-view fusion of depth- and dynamic-range enhanced omnidirectional video".


Figure 6: The different parametric models and equirectangular scene capture in Unity

3.3.4 Render onto immersive display surface

The tools developed in WP5T2 create a true holographic display by capturing and calibrating a physical space as a display environment, then projecting omni-directional content back onto the space. Spectators can explore this space with a tracked point of view when a tracking device is installed. This tracked point of view can deliver true motion parallax depending on the displayed content and calibration. When displaying free-viewpoint video or a pure 3D scene, the viewer will experience motion parallax when tracked by an external motion capture mechanism. Moreover, when the camera was placed at different locations for each shot, the warp and blend engine can also adapt to a moving viewer by interpolating between different calibrations. This means that the 'sweet spot' will move with the viewer.

During practical testing we found that when the content allows a virtual camera to move around and 'see' the space from another point of view, this already provides a satisfying feeling of motion parallax. When the warping and blending engine is also updated, the motion parallax is more correct, but users usually do not perceive a markedly increased benefit.


Figure 7: Conceptual overview of the Unity authoring pipeline and its 3 stages


Figure 7 shows a conceptual overview with a clear outline of the three stages. In the capture stage the display surface is calibrated for use as an immersive display. The user can mix different kinds of content together in the author stage, preparing them for rendering onto the display. The render stage takes care of outputting the authored Unity scene with the correct warping and blending to present one seamless experience.

3.3.5 Listing of developed technical components

Many different technical sub-components need to exist in order for the pipeline to process calibration data from the capture phase, read different sorts of media in the author stage and render out onto the immersive surface in the render stage. Unity's versatile and powerful plugin system allowed extending its core with native (C++) and managed (C#) plugins. Figure 8 summarises the functionality of each of the tools we have developed.

STAGE | TOOL | TYPE | DESCRIPTION
CAPTURE | HoloCalib | Standalone application | Standalone Windows OpenGL application that uses the low-cost Ricoh Theta S as a probe camera to calibrate arbitrary multi-projection setups. Calibrating from different positions allows interpolating between 'sweet spots'.
AUTHOR | HPV_Manager.cs | Unity C# script | Singleton HPV compressed-video manager. Connects to low-level functionality through the native plugin.
AUTHOR | HPV_Node.cs | Unity C# script | HPV player object script which plays one movie file and updates the texture of connected geometry or material. Many nodes can exist next to each other.
AUTHOR | HPV_Unity_Bridge.dll | Unity native C++ plugin | Native HPV player backend.
AUTHOR | HPV_Encoder | Standalone application | Native HPV compressor application. Video frames are compressed by using DXT texture compression. Future support for ASTC is also present.
AUTHOR | VDR_Streamer.cs | Unity C# script | View-dependent render streaming script.
AUTHOR | VDR_Streamer.dll | Unity native C++ plugin | Native view-dependent render streamer backend.
AUTHOR | LensModel.cs | Unity C# script | Lens model abstraction script, providing the mathematical ground for parametric model conversion.
AUTHOR | ArcBall.cs | Unity C# script | Enables user interaction through mouse gestures.
AUTHOR | Capture.cs | Unity C# script | Captures the result to disk.
AUTHOR | Cubemap2Projection | Unity shader pack | Shader suite doing conversion of the spherical capture to different parametric models.
AUTHOR | VRPN_Manager.cs | Unity C# script | Manages the connection to the VRPN peripheral hardware server.
AUTHOR | VRPN_Object.cs | Unity C# script | Singleton VRPN object connected to a Unity game object.
RENDER | ProjectionManager.cs | Unity C# script | Manages spherical capture and the connection to the shader conversion pipeline.
RENDER | NVAPI_Bridge.cs | Unity C# script | Provides the connection from Unity to NVAPI.
RENDER | NVAPI_Bridge.dll | Unity native C++ plugin | Native backend for the NVAPI bridge script.

Figure 8: Schematic overview of the Unity authoring pipeline

Each of the tools is discussed in detail in deliverable 5.2.1, together with a user guide on how they should be used. The user will most likely not use all of these blocks at any one time, but will configure them for his specific scenario. Each of the tools can be plugged in or out at any time without blocking the global process.

Section 3.2 discusses the specifications for a workflow to create immersive experiences, as specified in D5.2.1. Figure 9 gives an overview of which specification was handled by which tool(s) in the final developed toolset.

Specification | Developed component
Multi-projector calibration | HoloCalib
Mesh-based content warping | ProjectionManager / NVAPI_Bridge
Video warping to parametric models | ProjectionManager / Cubemap2Projection shaders
Masking and blending | Cubemap2Projection shaders
Interaction | VRPN_*
Multi-pass rendering pipeline | HPV_* / VDR_* / Cubemap2Projection shaders

Figure 9: Specification vs developed components

The HPV (High Performance Video) suite of tools specifically deals with playing back multiple streams of high-resolution video content, which can be of the planar or spherical type. The tools offer a cross-platform solution for video playback in Unity by applying widespread DXT texture compression algorithms that can be decoded on the fly on the GPU. Future support for mobile platforms is anticipated by supporting the ASTC texture compression.¹⁵

3.3.6 Methods of distribution

The developed tools in WP5T2 are made publicly available through the online code-sharing platform GitHub. Providing the tools in such a way fosters a large user group that is able to use and evaluate the tools. Any (user-suggested) improvements will be pushed to this central platform so that the tools can grow together with the Unity platform they were created for. The tools are made open source and can be downloaded from the following link: http://www.github.com/HasseltVR/Holo_Toolset

3.4 State-of-the-art and achievements

3.4.1 Review of D5.2.2

In WP5T2, iMinds developed different components to provide a framework for creating immersive experiences. These components deal with calibrating a multi-projector setup (immersive multi-projector calibration), authoring mixed-media content for creating an immersive scene (in a unified workflow) and finally rendering this scene back onto the display surface in a holographic way. The calibration and immersive content playback tools build on previous iMinds experience with 360˚ video capture systems and calibration research. The state of the art concerning the developed components and the key innovations were documented in deliverable 5.2.2, "Tools for creating holographic live performance and installation art". This deliverable states all objectives for making the tools, gives a detailed overview of the developed tools and their technical internals and discusses their impact on the current state of the art.¹⁶

The following sections summarize these findings and give an overview of the key innovations that were realized.

3.4.2 Immersive multi-projector calibration

The research done for this section builds upon developments by iMinds made in the context of a previous FP7 project, 2020 3D Media¹⁷. The project document D6.6

¹⁵ V. Jacobs, D5.2.2, Tools for creating holographic live performance and installation art, p. 23.
¹⁶ V. Jacobs, D5.2.2, p. 7-18.
¹⁷ This project, with ID ICT-FP7-215475, focused on getting immersive 3D experiences more readily available for the home-consumer market.


"Immersive display for home environments" outlines an automated workflow where an omnidirectional camera is used for calibrating several dome-like multi-projection setups. The document states: "Previous art closest to our work concerns a similar non-parametric approach for planar-like displays using a regular linear-perspective camera, described in US patent US 2004/0257540 A1. Our work can be viewed as an extension of that work, but for surround video displays, and using a surround video camera, involving stitching."¹⁸ By using stitched images, the technology allows for calibrating any screen setup, ranging from frontal and spherical to cylindrical. The calibration patterns used "combine properties of several known pattern sequences used in 3D scanning, camera calibration and former multi-projection calibration methods."¹⁹ The pattern sequence used fuses the advantages of different calibration purposes: camera lens distortion calibration, intrinsic and extrinsic camera calibration and structured-light depth scanning. This combination speeds up the whole process and provides the user with an automatic immersive calibration system.

The cameras used in former research, a Point Grey Ladybug 3 or other custom rigs, have been replaced by a much lower-priced consumer-level Ricoh Theta S, which did not exist at the time. This small, easily mountable device carries two fisheye lenses, one on each side of its body, and as a result can capture full spherical photos and videos. Pictures are stitched on the device internally; videos have to be stitched with an offline tool. The resolution for stitched photos (equirectangular format, 5376x2688) is more than high enough for the purpose. Camera parameters like ISO, shutter speed and exposure mode can be wirelessly controlled and fine-tuned through simple HTTP commands that the camera receives over its own ad-hoc wireless network. Downloading pictures after triggering the shutter happens over that same link. As such, our system is able to match each calibration pattern with the equirectangular picture that was captured at the same time.

Figure 10: Left: Point Grey Ladybug (15K euro). Right: Ricoh Theta S (400 euro)

¹⁸ P. Bekaert, C. Weissig, 2020 3D Media, D6.6, Immersive display for home environments, 2012, p. 33.
¹⁹ P. Bekaert, C. Weissig, p. 33.


The key innovation in WP5T2 is to present creative users with a multi-projector automatic calibration system consisting entirely of off-the-shelf components, repackaged in the form of easily re-usable software components that allow mixing and matching to needs and taste, in place of a monolithic software system.

After a calibration procedure has completed, the host platform needs to warp and blend all sub-images using its own rendering pipeline. This means that a seamless image only appears when that specific application is running. The desktop of the OS running on the machine will not appear as one seamless image.

NVidia recently (~2012) introduced their Warp & Blend API (we will from now on refer to it as NVAPI) for their Quadro series of cards, even the lower-priced K1200 (~500 euro). This allows a developer to launch code that effectively sends warping and blending information to the graphics card. It can be configured to save this information between intermittent power cycles. Moving the warping and blending to the GPU means that the display outputs of the machine will always create a seamless image, independent of the application that is running. All the necessary calculations can now be offloaded to a separate stage in the GPU, which frees up valuable resources.

The key innovation and main benefit of this achievement lies in decoupling the warping and blending from the application side and sending it to a dedicated stage on the GPU. The research done in WP5T2 bridges the resulting data from the automatic calibration system to the NVAPI so that an application can enable/disable warping and blending from user space. By defining a correct UV-coordinate relation between the input images, any application going fullscreen can effectively cover a whole multi-projector setup with a seamless image, including color equalization.

The base coordinate system for the input image has been fixed to the equirectangular map projection to be able to cover the whole projection area. This rectangular projection covers the full virtual sphere and has a 1:1 relation with the captured image coming from the surround camera that was posed at the sweet-spot(s) during calibration. When the user moves and is tracked through motion capture or other means, we can interpolate between different calibrations by updating the information stored in the warping/blending stage on the GPU. This way, the user will always have the correct perspective on the virtual scene, creating a true HIE.

3.4.3 A unified workflow enabling cross-fertilization between film/broadcast and installation art

Project deliverable 5.2.1 evaluates a list of software platforms for building HIEs. It was mandatory to do an evaluation of the software tools first in order to choose a building ground for further research.


The deliverable concluded²⁰ that Unity was the best candidate to build our unified pipeline upon. The final prototype Dreamspace performance environment is subsequently called the Unity authoring pipeline. This pipeline connects all necessary components for building HIEs.

The key innovation of this section is providing the user with a unified and intuitive workflow for building immersive experiences as a low-cost, readily accessible platform. This pipeline consists of several component building blocks that can be combined ad hoc to form a basis without having to rely on expensive proprietary software systems. The tools developed in WP5T2 create a true holographic display by capturing and calibrating a physical space as a display environment, then projecting omni-directional content back onto the space. Spectators explore this space with a tracked point of view.

The Unity pipeline as such fosters cross-fertilization between film production and installation art by democratizing the setup for immersive environments where users can 'walk' around in filmed content.

3.4.4 Progressing virtual set building

Virtual set building is the process of creating one virtual space where 3D CG assets are placed next to traditional camera-captured photorealistic assets. The composition can happen on the basis of a depth relation, for example by shooting assets on a green screen and adding virtual backgrounds. Virtual assets can also be animated, and require lighting and material calculations to appear as natural and realistic as possible. A viewer is able to navigate through the virtual world and look at assets from different perspectives. This enables, for example, a 3D designer to judge the quality of the assets and their placement inside the virtual world. A common practice for reviewing virtual spaces is using HMDs like the Oculus Rift or HTC Vive. Newcomers on the market like the Microsoft HoloLens add an augmented layer by being able to composite 3D content on top of the real environment.

The key innovation here is that the Unity authoring pipeline offers a novel, open and collaborative way of presenting virtual sets by using a larger display surface (using multiple projectors), opening up a collaborative approach in contrast to the individual experience when using HMDs. By standing inside a calibrated display surface, the viewer (or director) can navigate through the set with a much wider peripheral view, look at certain assets from different points of view and discuss shot framing, lens choices or other changes with a director of photography or designer. Motion parallax is supported through tracking, so depth can be perceived. The Unity authoring pipeline also supports high-resolution omnidirectional and view-dependent assets, which adds support for photo-realistic immersive and holographic assets in the virtual scene. Section 4.2.2 about the Battleground final demonstrator illustrates this purpose in depth with example documentation.

²⁰ V. Jacobs, P. Bekaert, D5.2.1, p. 10.


4 User tests and evaluation

As set out in the Description of Work, the project is expected to deliver "Evidence-based evaluation of the advances in creative expression and efficiency, based on work carried out with the technology in experimental and real-world productions." These evaluations complement the 'expected advances' set out in the Description of Work and reported on in D5.2.2.²¹

The five unifying high-level evaluation questions are:²²

• The suitability of the concept and whether it supports the real-time integration of real-world and CG elements - 'do users think this is the right approach, with the right functions?'
• The quality and reliability of the technologies for designing, directing, navigating, editing and modifying Virtual Productions in a collaborative environment - 'how well do the tools work?'
• The usability and effectiveness of the components - 'are they easy to use and do they increase creativity and productivity?'
• The interoperability and integration of the software components and functions, with each other and third-party tools - 'do the tools function in a professional creative workflow?'
• The creative quality of the results achieved in experimental productions - 'are the resulting visual experiences convincing, novel and original?'

Throughout Dreamspace, the tools developed in WP5T2 were regularly examined in field tests and workshops organized by the project's creative partners, or in joint demonstrators where all Dreamspace technology was evaluated as a whole. The evaluation of the user cases is documented in detail in D6.2.2 and is out of the scope of this document. The following sections however use these questions as a guide to specify which conclusions could be drawn that drove further technical developments.

4.1 Mixed-media authored content tests

These tests focused on the playback of high-resolution and high-speed free-viewpoint or 360° video frames inside Unity and the inclusion of such plugins next to an existing pipeline.

4.1.1 Celestial Bodies (CREW)

Celestial Bodies is an immersive installation for one spectator at a time. This spectator receives a Virtual Reality headset with headphones while being immersed

²¹ Volker H., D2.2.1, Creative Experiments and Evaluation plan, p. 7-8.
²² Volker H., D2.2.1, p. 8.


in a virtual version of our Milky Way in space. Travelling through space, the viewer can discover the different planets from closer by.

Experiments were done with the Unity 360˚ HPV video player that was created in WP5T2. 'Celestial Bodies' needed to work with 4K equirectangular video textures containing alpha channels. This was not possible with any other Unity tool existing at that moment. The aim was to put the viewer in the middle of a sphere and let the top of that sphere open to allow the spectator to see the CG assets in the rest of the scene.

Figure 11: Scene setup for Celestial Bodies (2012)

4K video textures were made in After Effects and an alpha channel was added to serve as a mask, as shown in figure 12.

Figure 12: Resulting 4K video + alpha playback in Unity, attached to a sphere. The northern pole is made transparent.


Evaluation sheet:

High-level evaluation question | Feedback
Right approach, with right functions? | The WP5T2 tools proved the right approach, since they provided stable playback of 4K video with support for alpha.
How well do tools work? | The tools performed well: some low-level issues arose and were solved in later developments.
Easy to use, and increasing creativity/productivity? | The HPV plugins have low impact on the host Unity system and allowed CREW to import video into Unity.
Functioning in professional workflow? | The system for Celestial Bodies was standalone, so integration into other pipelines was not tested.
Resulting in convincing, novel and original visual experiences? | The WP5T2 tools made it possible for CREW to create a novel immersive experience where spherical video and 3D CG elements co-exist.

Outcome/conclusion:
The WP5T2 tools made it possible to mix video and 3D CG worlds effectively: since the geometry that holds the video texture was transparent, the user could experience the 3D content behind it. CREW could experiment with different layers of 4K spherical video by switching between different streams in real time. During 'Celestial Bodies' issues occurred with the low-level rendering plugin which kept it from working on AMD and Intel GPUs. These issues were addressed during the course of the production. The HPV player was tested in real-world conditions and proved to be stable and reliable. It was felt that the ability to work with 360˚ video in an immersive installation greatly enhanced the processes of creative research and the design of a novel immersive experience.

4.1.2 Workshop Immersive Spaces (FAAI)

The workshop 'Immersive Spaces' was a 4-day workshop with hands-on state-of-the-art Virtual and Augmented Reality technology and content, held at the Filmakademie Baden-Württemberg (FAAI). iMinds was part of this workshop and presented one of their latest 360˚ camera rigs allowing for live monitoring of the real-time stitched content. iMinds also used the WP5T2 tools to quickly iterate over ideas and interactive scenarios. There was no projection surface: instead, the content was played back to an HTC Vive headset with positional tracking.


Figure 13: Left: overview of 3D assets in a spherical video scene. Right: viewer inside the video scene with no 3D assets, after colliding with a trigger by moving around.

Evaluation sheet:

High-level evaluation question | Feedback
Right approach, with right functions? | The WP5T2 tools enabled the students to research interaction using 360˚ videos to immerse the viewer in changing environments.
How well do tools work? | The tools performed well, although local adjustments had to be made to support reading video textures with changing resolutions.
Easy to use, and increasing creativity/productivity? | The HPV plugins had a low impact on the host Unity system and allowed the students to try out the succession and interaction of different 360˚ videos.
Functioning in professional workflow? | Was not tested in this context.
Resulting in convincing, novel and original visual experiences? | The WP5T2 tools made it possible to explore truly interactive immersive spaces that change according to actions performed by the viewer (in particular, changing position).

Outcome/conclusion:
The WP5T2 tools allowed quick reconversion to support the HTC headset, which enabled the viewer to navigate through a scene that mixed spherical video and 3D CG assets. When the spectator crossed the positions of virtual objects placed throughout the 3D scene, 360˚ videos were triggered in real time. The result was a truly interactive immersive experience. Workshops are good environments for testing the turnover time and flexibility of a technology. The need to switch between videos at runtime (a feature not initially supported) resulted in further developments of the HPV player technology.


4.1.3 Longing for Wilderness, Siggraph (FAAI)

'Longing for Wilderness' is a 360˚ VR experience created by Filmakademie and Marc Zimmerman. The experience takes the user from the noisy city through the slowly transforming forest towards a calm and airy landscape. It seeks to express our innate longing to experience nature in its rawest forms. To achieve this, the artists have made use of the latest technology: phones/tablets, virtual reality head-mounted displays (HMDs), 360° imagery, interactive binaural sound, and a seatback tactile bass system to transmit low frequencies to the user's body, to create a truly immersive experience that addresses all senses. Longing for Wilderness uses the HPV VR player technology developed by iMinds in the scope of Dreamspace, allowing the playback of 4K content at speeds up to 50 Hz. The player technology has successfully run without issues at FMX 2016 and at Siggraph 2016.

Figure 14: Viewer experiencing the 'Longing for Wilderness' VR experience

Evaluation sheet:

High-level evaluation question | Feedback
Right approach, with right functions? | The WP5T2 tools proved the right approach, since they provided stable playback of 4K video at 50 Hz.
How well do tools work? | The tools performed well, after making the necessary adjustments to support DirectX 11 next to OpenGL; no issues were reported after FMX or Siggraph.
Easy to use, and increasing creativity/productivity? | The HPV plugins have low impact on the host Unity system and allowed FAAI to use high-resolution 360˚ video with binaural sound and other external input.
Functioning in professional workflow? | The WP5T2 tools functioned as a true plugin alongside all the other technologies used in Unity.
Resulting in convincing, novel and original visual experiences? | The WP5T2 tools allowed research into high-framerate 360˚ video, resulting in a more fluid and convincing VR experience.


Outcome/conclusion:
This use case tested the interoperability of the tools with other plugins, dealing with 3D sound and externally connected devices such as a wind machine, during long periods of operation. The technology proved stable and reliable in such an environment. In order for the experience to run with all the plugins combined, the HPV system had to be adjusted to work with DirectX 11 in addition to OpenGL. Since Unity supports both renderers, we were able to provide a rendering plugin for both. As a result, we could show the HPV technology working truly cross-platform between Windows, Mac and UNIX.

4.1.4 CAPE_Buchmesse (CREW)

Dreamspace creative partner CREW was commissioned to create two new immersive installations for the Buchmesse 2016 in Frankfurt, Germany. To bring their CAPE format (a mobile personal VR experience) up to par with modern technologies, CREW decided to invest in rebuilding the backpack that users carry while being immersed in the experience and in switching over to the Samsung Gear VR playback device for playing back the 360˚ content. iMinds developed the technology for the remote control application which runs on a tablet attached to the backpack. We use the WP5T2 HPV technology to synchronize 4K video with the view in the immersant's headset. The synchronization is wireless, since the Samsung Gear VR is a self-contained device. The HPV player allows real-time seeking without any delay, so we can achieve a precise frame synchronisation.

Figure 15: Two spectators with CREW's new mobile CAPE backpack, using iMinds WP5T2 technology


Evaluation sheet:

High-level evaluation question | Feedback
Right approach, with right functions? | The developed WP5T2 technology was tested outside of Unity, in a traditional C++ context. The HPV video playback has been tested in a standalone OpenGL application.
How well do tools work? | The tools performed well, allowing 4K video playback on a tablet with a mobile processor and no dedicated GPU.
Easy to use, and increasing creativity/productivity? | The use of the HPV technology enabled the system to jump and seek in real time between different frames, allowing for externally syncing the video playback.
Functioning in professional workflow? | The HPV technology was tested as standalone C++ software, meaning it can be integrated in any other code-based solution.
Resulting in convincing, novel and original visual experiences? | Not evaluated: this technology only indirectly contributed to the viewer's experience.

Outcome/conclusion:
This use case tested the HPV player technology in a non-Unity environment, as a standalone set of C++ plugins. We were able to fine-tune the inner parts in an environment where each advance could be monitored in more detail than when used inside the Unity platform, where all monitoring goes through Unity, and we achieved performance improvements to both the encoder and the decoder. The HPV technology could, as a result, adjust itself more effectively to platforms with fewer GPU resources.

4.2 Immersive display tests

The following user tests focused on the workflow of using multiple projectors to create an immersive display, and the practical usage of the calibration tool and Unity authoring platform.

4.2.1 Testing the Coal Track (external art project)

iMinds was asked to assist in a project commissioned by the Flemish ministry of town and country planning. This project focused on a region in Belgium that has been extensively used in the past by the mining industry. This industry has had a deep impact on the land and space of the province. Since the mining industry perished in the 1980s, this space has been unused and its rail track infrastructure (the coal track was a rail track connecting different mining sites in the region) has run wild. The art project was commissioned for the International Architecture Biennale of 2016, held in Rotterdam, to inform architects and investors about the opportunities for urban development present in this region.

iMinds entirely relied on the Unity authoring pipeline to realize this interactive immersive installation. 360˚ video captures were made with the latest iMinds


camera rig attached to a railbike driving over the coal track. A 20-minute 4K equirectangular video composition was made, which was played in a loop. Interactive 3D elements were composited in real time on top of the video, giving more background information about certain parts of the track. The spectator could move forwards and backwards in the video by using a joystick attached to a stand. The 3D content would appear and disappear depending on the part of the track that was shown. The content was displayed on a round U-shaped wooden projection surface that was calibrated with the WP5T2 tools. Standing in the center, the spectator would feel as if he were immersed in the middle of the scene.

Figure 16: Calibrated multi-projector environment for 'Testing the Coal Track', displaying interactive 4K spherical video with 3D elements composited on top, made entirely with the Unity authoring pipeline.


Figure 17: 'Testing the Coal Track', close-up of 3D content composited on top of video.

Evaluation sheet:

High-level evaluation question | Feedback
Right approach, with right functions? | The Coal Track installation was a perfect use case for almost all components developed in WP5T2. The tools allowed for calibrating the arbitrary projection surface, while all the content could be authored in Unity and played back onto the surface.
How well do tools work? | The tools performed very well; there were no issues at build-up or during the three-month operation.
Easy to use, and increasing creativity/productivity? | The most challenging task in this installation was designing the 3D UI elements that were composited on top of the video. The design process was done together with external designers who were not familiar with Unity, so some conversion process needed to take place. All in all, Unity's UI system proved to be a powerful design component, making it suitable for presenting assets in interactive installations.
Functioning in professional workflow? | Unity worked together with traditional video editing software like Adobe Premiere and traditional design software like Adobe Illustrator. Unity could handle their output files well; some conversions needed to take place.
Resulting in convincing, novel and original visual experiences? | The WP5T2 tools did make the Coal Track installation stand out as a novel immersive experience where spherical video and 3D CG elements live next to each other, and are interactively controlled by the user.


Outcome/conclusion: Testing the Coal Track was in retrospect the perfect try-out for the global workflow and performance of the Unity authoring pipeline tools developed in WP5T2. This installation provided a first opportunity to try out and iterate the calibration software on a truly irregular and ad-hoc surface. iMinds was able to try out different calibration patterns and judge their outcome, resulting in a set of patterns more suitable for all kinds of curved and non-curved display surfaces. A mixture of high resolution 360° video content (edited in Adobe Premiere) and CG UI items composited on top was authored in Unity (using Unity's own UI layout tools) and made interactive through simple ad-hoc Unity scripts. This allowed the viewer to experience an immersive space while being able to interact with it and alter its course through the joystick. Testing the Coal Track also allowed iMinds to fine-tune the render/output components, yielding a better quality for the content and especially the text-based assets that were being used.

The installation ran without problems for 3 consecutive months, without a break. Such tests prove that the system can be used in museums and other more permanent location setups.

4.2.2 The Pavilion (CREW)

The Pavilion is a large immersive installation using multiple projectors. The viewer enters the space with a tracked point-of-view (using an OptiTrack optical motion capture system). The scene adapts to the spectator's point of view. The installation uses a free viewpoint module that uses WP5T2 plugins to play back in real-time the correct portion of the recorded datasets. Figures 18 and 19 show how the plugin adapts to a new point of view. The figures show static frames, but the module is playing video content that adapts to the position of the user, creating a holographic experience. The Pavilion also uses WP5T2 tools to distribute the tracking data to all clients that need it.
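The core of such view-dependent playback can be illustrated with a nearest-neighbour lookup: given the tracked head position, pick the closest recorded viewpoint for the current frame. This is an illustrative sketch, not CREW's actual plugin code; the ring of recorded viewpoints is invented for the example.

```python
# Illustrative sketch of free-viewpoint view selection: choose the recorded
# camera position closest to the tracked head position for each frame.

import math

def nearest_view(head_pos, view_positions):
    """Return the index of the recorded viewpoint closest to the tracked head.

    head_pos and each entry of view_positions are (x, y, z) tuples in the
    same coordinate frame as the tracking data.
    """
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(range(len(view_positions)),
               key=lambda i: dist2(head_pos, view_positions[i]))

# Hypothetical ring of 8 recorded viewpoints around the projection space.
views = [(math.cos(a), 0.0, math.sin(a))
         for a in [i * 2 * math.pi / 8 for i in range(8)]]

print(nearest_view((0.9, 0.1, 0.1), views))   # 0: viewer near (1, 0, 0)
```

Running this selection every frame, fed with live tracking data, is what turns a set of static recordings into a view that follows the spectator.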


Figure 18: The Pavilion setup, with free-viewpoint video integrated. Tracked view from the right.

Figure 19: The Pavilion setup, with free-viewpoint video integrated. Tracked view from left/front.

Evaluation sheet:

High level evaluation question / Feedback

Right approach, with right functions?
The WP5T2 tools provided a calibration suite to calibrate the multi-projector setup and provided tools to display free viewpoint video.

How well do tools work?
The tools performed well.

Easy to use, and increasing creativity/productivity?
The calibration of the multi-projector setup took 30 minutes. Through Unity's plugin system, the free viewpoint module could be easily imported.

Functioning in professional workflow?
No conventional film pipeline tools are used in the Pavilion.


Resulting in convincing, novel and original visual experiences?
The WP5T2 tools did give CREW the possibility to create a novel immersive experience where spherical video, free viewpoint video and 3D CG elements live next to each other and can be viewed in a holographic way.

Outcome/conclusion: The aim of using the Unity authoring pipeline in the Pavilion was two-fold: trying out the calibration software on the CAVE-based projection setup of the Pavilion, and using the OptiTrack motion capture data to allow for holographic rendering of the scene via the view-dependent module that was incorporated. True holographic rendering not only means rendering the content from a new point of view but also adjusting the warping and blending parameters of each projector. Developments were made to do this all in Unity. The calibration tools performed well, providing accurate blending and warping, also in the 90° middle corner. The free viewpoint plugin provided stable playback of the pre-recorded content.


5 Public demonstrators

The Unity pipeline for authoring immersive spaces builds on a mixed-media approach that places itself in a reciprocal relation between performance art practices (real-time and interactive) and film/broadcast centric pipelines (high quality but generally non real-time). Each side of this connection has been closely researched in the demonstrators that were organized in the scope of the Dreamspace project.

The final demonstrator Battleground has been built using the immersive display in conjunction with the LiveView integrated system. This has been tested in two scenarios: first, to create a truly immersive space using filmed real-world images for use in installation or performance art, and second, to create an immersive environment on-set to assist creative professionals in virtual production. The filmed immersive environment was showcased at CVMP 2015 and the immersive virtual set was evaluated as part of the Battleground final production showcased at IBC 2016.

5.1 CVMP 2015

The connection of film centric pipelines towards performance and installation art was examined at the CVMP demonstrator held in year 3. CVMP presented one Full HD immersive display that provided true HMD-less holographic playback of free viewpoint video captured by iMinds, with 3D CG content composited on top of it. The installation was running LiveView, the core compositing and high quality playback engine developed by The Foundry in Dreamspace. LiveView adds real-time rendering components to traditional film compositing schemes. For CVMP, LiveView was linked up to interactive input coming from a motion capture system through plugins from the Unity authoring pipeline. As such, traditional film pipelines were exposed to artistic and interactive use, both real-time in nature, and allowed a viewer to witness high quality content with 3D composited elements in an interactive and view-dependent way.

The content for the view-dependent backlot was captured with the iMinds WP4T4 multi-camera setup. Each camera stream was further processed to be able to render out 400 intermediate frames for each timestamp. Several Unity scripts (VDR_Streamer.*) were written to link these frames to the correct position of the viewer so that a true holographic perspective is perceived. The 3D CG scene was also authored inside the Unity authoring pipeline.

Once satisfactory results were achieved in Unity, the code was transferred to LiveView to be able to play back the same view-dependent content at the demo booth, using the same OptiTrack tracking system. The necessary scripts to link up the OptiTrack motion capture system to LiveView formed the base for the VRPN_* scripts of the Unity authoring pipeline. The VPET tools developed in WP5T1, running on a tablet, enabled the user to interact live with the position and layout of the 3D CG assets.
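One plausible way to link the 400 per-timestamp frames to the viewer, sketched here with invented names (the actual VDR_Streamer.* scripts may work differently), is to quantize the viewer's horizontal angle around the captured scene to a frame index:

```python
# Hedged sketch: map the tracked viewer's angle around the scene origin to
# one of the 400 pre-rendered view frames available per video timestamp.

import math

FRAMES_PER_TIMESTAMP = 400  # from the WP4T4 processing described above

def view_frame_index(viewer_x, viewer_z, arc_degrees=360.0):
    """Quantize the viewer's angle around the scene origin to a frame index."""
    angle = math.degrees(math.atan2(viewer_z, viewer_x)) % 360.0
    index = int(angle / arc_degrees * FRAMES_PER_TIMESTAMP)
    return min(index, FRAMES_PER_TIMESTAMP - 1)

print(view_frame_index(1.0, 0.0))    # 0   (viewer straight ahead on +x)
print(view_frame_index(0.0, 1.0))    # 100 (a quarter turn: 400/4)
print(view_frame_index(-1.0, 0.0))   # 200 (half turn)
```

The chosen index, combined with the current video timestamp, identifies exactly one frame to stream, which is what gives each head position its own holographic perspective.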


Figure 20: CVMP frontal view

Figure 21: Live control of CVMP 3D CG assets through WP5T1 tablet tools

The key outcome of the CVMP demonstrator is the use of a filmed immersive environment in conjunction with interactive installation art. The LiveView system enabled a generally real-time immersive installation to be run on a high-quality render and compositing system.

The Unity authoring pipeline was tested alongside more traditional film pipelines, running Nuke for compositing and Katana for authoring. The Unity plugins enabled the filmed immersive content (free-viewpoint video) to be authored and evaluated in Unity, together with the 3D CG assets. The content was then re-authored inside Katana to run in higher quality and with more precise lighting inside LiveView.


5.2 Battleground final demonstrator

5.2.1 Overview

Dreamspace's final experimental production Battleground explored the application of the Unity authoring pipeline in the context of film and broadcast workflows. Battleground relies heavily on virtual 3D CG set building and mixes both real captured footage and animated CG content. An immersive display system, consisting of two projectors and an ad-hoc display surface in a 90° V-shape, was built in the same room where several of the (green key) film and motion capture shoots took place. The display surface was calibrated using the iMinds tools in order to create one seamless image, a process that took less than 30 minutes. The virtual 3D scene, consisting of several 3D assets designed by a 3D artist in pre-production, was distributed to all Dreamspace systems on set, including the immersive display running the Unity authoring pipeline. An OptiTrack motion capture setup and tools from the Unity authoring pipeline were used to feed live mocap data to different connected network clients. This live data was used to animate several of the 3D assets in real-time.

The combination of motion tracking and an immersive display allowed the director and his team to explore and frame different perspectives onto the virtual world: the absence of HMDs made it easy for them to collaborate, and the director could translate new ideas into camera shots on the adjacent set. The immersive display fostered cross-fertilization between performers, the director's team and the digital film pipeline.

Battleground also established a connection with FAAI's Virtual Production Editing Tools (VPET), which allow the global 3D scene to be distributed to different Unity instances running the same network client software. All clients, typically tablets with a touch interface, were able to interact with and animate editable elements in the 3D scene. The Unity authoring pipeline was linked to the same system so that all changes made on set by any of the VPET tablets were mirrored on the calibrated immersive display.

5.2.2 System components

Figure 22 shows the core Dreamspace pipeline and the components that connect to a conventional film pipeline. Figure 23 shows (to scale) a plan view of the production setup, which indicates how the immersive performance system, using a calibrated multi-projector setup (light green), connects to this Dreamspace pipeline (light purple).


Figure 22: Dreamspace (conventional) film pipeline

Figure 23: Connection of final immersive performance system to Dreamspace film pipeline


In summary:

• The same virtual scene is shared between the Dreamspace film pipeline and the immersive display.

• The immersive display allows the Director and his team to navigate through the virtual scene with a tracked point-of-view and to frame shots collaboratively. The display offers a broad peripheral view of the virtual scene and the final immersive performance environment is built so that the viewer can navigate 'inside' the virtual scene.

• The immersive display is connected to the central SyncServer, allowing it to react to and display changes made to the virtual scene by the on-set VPET tools.

• The immersive display is calibrated using WP5T2 tools and runs the Unity authoring pipeline.

As a result, the immersive space presents a novel viewing experience allowing the director and his team to frame the virtual scene rapidly and collaboratively.

Figure 24: Frontal view of immersive display


Figure 25: Tracked viewfinder simulation application, using the immersive display as source, with much more peripheral view than an HMD

Figure 26: View on camera and mocap set


5.2.3 Conclusion

For the Battleground production, the immersive display realized with WP5T2 tools and running in Unity was installed next to a film centric pipeline with LiveView running a film quality 3D renderer, Nuke compositing schemes, an optimized mini render farm for global illumination through ray tracing, and an NCam suite for depth-tracking. The aim of the Unity authoring pipeline was to present the same virtual scene running in LiveView in an immersive and holographic way. While LiveView renders all the content in high quality, the immersive display provides a unique way of showing the same virtual scene, albeit in a lower quality. The Director specifically profited from the wider field of view of the immersive display compared to an HMD. He was able to navigate through the virtual scene and collaboratively frame different parts while designing shots from inside the projection environment. These 'pre-shots', captured for example by a mobile device or a viewfinder simulation application, could then be translated to actual camera shots on set. By linking the tools with the VPET framework realized by creative partner FAAI, the scene was also modifiable by the Director or Director of Photography. Assets could of course be easily adjusted directly in Unity as well.

Unity, traditionally a game engine, is opening up more towards film oriented pipelines in order to build novel storytelling techniques. 3D assets are becoming increasingly important in the filmmaking process when working with virtual sets. The film crew, as a result, wants to be able to enter and navigate these virtual sets in a collaborative way. This is exactly what the WP5T2 tools, made and running in Unity, offer. The quality is not up to par with the final render, but the experience is real-time, interactive and doesn't require an HMD to be worn. WP5T2 tools are designed to be interactive: they are aware of external sensors, motion-capture information and other triggers. The Unity authoring pipeline offers a unified way to design, direct and navigate virtual sets. The invited user group reacted positively and welcomed the presence of a multi-projector environment on set.


5.3 IBC 2016

Dreamspace showcased the virtual production technology used to create Battleground at IBC 2016 (with minor modifications to meet the space constraints imposed by the booth). iMinds used a mini-immersive display (using two plasma screens instead of a projector) next to the virtual scene to present an overview of the CVMP demo and the HPV video playback tools, running demo videos of 4K at 50Hz, while an Intel RealSense sensor tracked the motion of the viewer inside the setup.

Figure 27: Immersive setup at Dreamspace booth, with Intel RealSense sensor attached between the two screens, IBC 2016


6 Conclusion

This document reports on the research done and tools developed in Task 2 of Work Package 5 of the Dreamspace project. This task was aimed at designing a unified pipeline that enables artists to build performative immersive and holographic experiences using multi-projector setups, enabling a viewer to 'walk around' in filmed content. This task also addresses the interoperation of such immersive and interactive displays with a professional film pipeline when shooting virtual mixed CG and film content scenes. The work resulted in the final Dreamspace performance environment.

After researching the specifications for such a pipeline (documented in D5.2.1) and documenting the current state-of-the-art both from an artistic and technical point of view (documented in D5.2.2), it was concluded that there was a need for a more low-cost and accessible way of calibrating multi-projector setups, in contrast to the current high-end systems. It was furthermore concluded that there was a need for a central creative 3D authoring platform that is equipped to understand and handle the media that are played back in such holographic environments. Such media include high-resolution and view-dependent video textures, but also regular 3D CG assets. Such a platform should be able to communicate with the hardware side to control the setup of the multi-projection system, including its warping and blending state, to form one seamless image. Finally, the platform should be able to connect with all kinds of peripheral hardware providing interaction input.

After evaluating the solutions currently available on the market, it was decided that the freely available Unity 3D engine was the best candidate for building the final prototype. The final prototype Dreamspace performance environment is subsequently called the Unity authoring pipeline. This pipeline allows users to capture any arbitrary screen setup, curved, straight or cylindrical, author content to be displayed on this surface, and render the content back out with the correct warping and blending, optionally on the GPU by using the NVidia API.

The Unity authoring pipeline has been evaluated in several user field tests, organized by the internal Dreamspace creative partners but also by externally commissioned art projects. Each field trial was evaluated by means of a fixed set of criteria defined in D2.2.1, which tested the pipeline on its suitability, quality, reliability, usability, effectiveness, interoperability with other Dreamspace tools and creative potential. The pipeline was designed from the ground up to fit these evaluation points and its development was constantly pushed by actual evaluation critiques.

The final demonstrator Battleground has been built using the immersive display in conjunction with the LiveView integrated system. This has been tested in two scenarios: first, to create a truly immersive space using filmed real-world images for use in installation or performance art, and second, to create an immersive environment on-set to assist creative professionals in virtual production. The filmed immersive environment was showcased at CVMP 2015 and the immersive virtual set was evaluated as part of the Battleground final production showcased at IBC 2016.
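The warp-and-blend render stage that the pipeline performs per projector can be illustrated on a toy 1D image; the lookup-table warp and cross-faded blend masks below are a simplified stand-in for the per-fragment GPU implementation.

```python
# Hedged sketch of the render stage: each projector warps the content through
# a lookup table and attenuates it with a blend mask, so that overlapping
# projectors sum to one seamless image. The real pipeline does this per
# fragment on the GPU; this toy version works on a tiny 1D "image".

def render_projector(content, warp_lut, blend_mask):
    """Warp content through a lookup table and weight it with a blend mask.

    warp_lut[i] gives the content sample index for output pixel i;
    blend_mask[i] is the projector's weight in [0, 1] at that pixel.
    """
    return [content[warp_lut[i]] * blend_mask[i] for i in range(len(warp_lut))]

content = [10.0, 20.0, 30.0, 40.0]

# Two projectors overlap on the third pixel; their blend masks cross-fade
# so the overlap region sums to full intensity.
proj_a = render_projector(content, [0, 1, 2, 3], [1.0, 1.0, 0.5, 0.0])
proj_b = render_projector(content, [0, 1, 2, 3], [0.0, 0.0, 0.5, 1.0])

seamless = [a + b for a, b in zip(proj_a, proj_b)]
print(seamless)   # [10.0, 20.0, 30.0, 40.0] -> one seamless image
```

In the real system the lookup table comes from the calibration step (the captured screen geometry) and the blend masks from the measured projector overlap, but the summing principle is the same.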


The Unity authoring pipeline presents a unified workflow that enables creative users to design immersive spaces with low-cost and off-the-shelf components that can be mixed and matched to taste. This pipeline fosters cross-fertilization between film production and installation art by democratizing the setup for immersive environments where users can 'walk around' in filmed content. The Unity authoring pipeline also offers a novel, open and collaborative way of presenting virtual scenes by using a larger display surface (using multiple projectors) and opening up for a collaborative approach, in contrast to the individual experience when using HMDs.


7 Appendices

Appendix A: Evaluation of software multi-projection calibration environments


Appendix B: Evaluation of software 3D authoring platforms, part I


Appendix C: Evaluation of software 3D authoring platforms, part II


8 References

Bekaert, P., Weissig, C. (2012, April 18). D6.6 Immersive Display for home environments, 2020 3D Media project (ICT-FP7-215475)

Bekaert, P. (2012, April 19). D5.15 Surround video manipulation and interaction, 2020 3D Media project (ICT-FP7-215475)

Everitt, C. Projective Texture Mapping. Retrieved from http://developer.download.nvidia.com/assets/gamedev/docs/projective_texture_mapping.pdf

Van Waveren, J.M.P., Castano, I. (2007, September 14). Real-Time YCoCg-DXT Compression. Retrieved from http://www.nvidia.com/object/real-time-ycocg-dxt-compression.html

3D Creative Authoring platforms:

Cinder: https://libcinder.org/
Isadora: http://troikatronix.com/
Max 7.0: https://cycling74.com/max7/
OpenFrameworks: http://openframeworks.cc/
OpenSceneGraph: http://www.openscenegraph.org/
PixelConduit: http://pixelconduit.com/
Resolume: https://resolume.com/
The Foundry: https://www.thefoundry.co.uk/
TouchDesigner: http://www.derivative.ca/
Unity: https://unity3d.com/
Ventuz: http://www.ventuz.com/

Multi-projection calibration and play-out:

Blendy-Dome: http://www.blendydomevj.com/
DomeProjection AutoCal: http://www.domeprojection.com/projectiontools-auto-cal-suites/


ImmersaView: https://www.immersaview.com/
OmniDome: http://www.omnido.me/
Scalable Display Technologies: http://www.scalabledisplay.com/
VisioAnyblend: http://www.vioso.com/