
Superhigh-Definition Digital Cinema Distribution System with 8-Million-Pixel Resolution

Takahiro Yamaguchi,1 Tatsuya Fujii,2 Mitsuru Nomura,1 Daisuke Shirai,1 Kazuhiro Shirakawa,1 and Tetsuro Fujii1

1NTT Network Innovation Laboratories, Yokosuka, 239-0847 Japan

2Business Communications Headquarters, NTT East Corp., Tokyo, 112-0004 Japan

SUMMARY

We developed a prototype superhigh-definition (SHD) digital cinema system that can store, transmit, and display extrahigh-quality movies with a resolution of 8 million pixels, using the JPEG2000 coding algorithm. The spatial resolution is four times that of HDTV and surpasses that of 35-mm cinefilm. The system comprises three principal apparatuses designed for SHD movies: a video server, a real-time decoder, and a projector. The movie data digitized by scanning the cinefilm is compressed using the JPEG2000 coding algorithm and stored in the video server in advance. Using Gigabit Ethernet (GbE) links and TCP/IP protocols, the server transmits the compressed digital movie data stream of 300 to 450 Mbps to a real-time decoder. The decoder receives the compressed data stream and outputs a raw, uncompressed, 10-bit R/G/B digital video signal in real time using a parallel-processor mechanism. The projector reproduces the SHD movie using three reflective LCD panels (D-ILA), one for each 10-bit R/G/B color, while receiving the raw digital video signal through special video interfaces. This system enables us to distribute digital cinema of original 35-mm cinefilm quality via the GbE network. © 2006 Wiley Periodicals, Inc. Syst Comp Jpn, 37(4): 35–44, 2006; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/scj.20465

Key words: digital cinema; superhigh-definition image; content distribution; JPEG2000; Gigabit Ethernet.

1. Introduction

The growth of broadband networks that use optical fibers has stimulated demand for communications with higher-quality images, beyond what conventional broadcasting technologies provide. Motion picture distribution is expected to be the breakthrough application for broadband network services. Currently, the highest quality demand is for the distribution of digital cinema, the crown jewel of motion picture content [1].

Currently, 35-mm cinefilms are conventionally used for shooting, production, and distribution. In the United States in 1999, George Lucas's "Star Wars: Episode I" was screened digitally, marking the beginning of digital cinema (D-cinema, also E-cinema), using Texas Instruments (TI) DLP projectors and/or Hughes-JVC's ILA projectors. In "Star Wars: Episode II," released in 2002, the flow from shooting to screening was achieved in a film-less, fully digital environment using Panavision-Sony's digital video cameras and TI's DLP projectors. Lucas's digital cinema format was 1080/24p, which is almost the same as that used in high-definition (HD) digital broadcasting. Cineastes call this format "2K" on the basis of its horizontal resolution of roughly 2000 pixels. Current digital cinema has been put to practical use in "2K (1080/24p)" and has been commercially shown using the DLP Cinema Projector with three SXGA-size (1280 × 1024 pixel) DMDs [2, 3].

© 2006 Wiley Periodicals, Inc.

Systems and Computers in Japan, Vol. 37, No. 4, 2006. Translated from Denshi Joho Tsushin Gakkai Ronbunshi, Vol. J88-D-I, No. 2, February 2005, pp. 361–370.

The image quality of current 2K digital cinema is roughly equal to that of a 35-mm release print. However, many cineastes demand even higher-quality images, equivalent to an answer print (the first print struck from the original negative). They do not accept digitization based on the 2K format, which cannot satisfy this quality demand [4–6].

Superhigh-definition (SHD) digital cinema, with a resolution of 8 million (4K × 2K) pixels, quadruple that of the 2K standard, achieves the demanded quality. This format is called "4K" on the basis of its horizontal resolution of roughly 4000 pixels. Comparisons between the temporal and spatial resolutions can be seen in Fig. 1.
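The resolution comparison can be checked with simple arithmetic (a quick sketch; the frame sizes are the ones quoted in this paper):

```python
# Pixel-count arithmetic behind the "8 million pixels" and "four times HDTV"
# claims: SHD frame (3840 x 2048) versus an HDTV/2K frame (1920 x 1080).
hd_pixels = 1920 * 1080      # 1080/24p ("2K") frame
shd_pixels = 3840 * 2048     # SHD ("4K") frame used in this system
print(shd_pixels)            # 7864320 -> roughly 8 million pixels
print(shd_pixels / hd_pixels)   # roughly quadruple
```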

In this paper, we introduce the development of an SHD digital cinema system that can distribute cinema contents via IP networks using Gigabit Ethernet (GbE) and project them onto a screen with an 8-million-pixel resolution using a large-venue projector. Section 2 gives a brief overview of the SHD digital cinema distribution and projection process. Section 3 explains the principal apparatuses of the developed system. The system's preliminary performance and the results of the network transmission experiments are provided in Section 4, and our conclusions are given in Section 5.

2. Digital Cinema Distribution and Projection Process

The overall SHD digital cinema distribution and projection process is depicted in Fig. 2. A 35-mm cinefilm is digitized using a film scanner, and the digital data is then stored in advance in the distribution server after color control and compression. These steps are done offline, before distribution and projection.

Fig. 1. Positioning of SHD digital cinema.

Fig. 2. Process of SHD digital cinema distribution and projection.


The SHD digital cinema distribution and projection processes are as follows. Compressed/encoded data is streamed from a video server to a real-time decoder via IP networks over GbE, and the uncompressed/decoded data is then projected onto a screen using a projector.

Although digital rights management processes, such as watermarking and encryption, were omitted at our experimental stage, they would be required in actual business use.

2.1. Film digitization

In the film industry, various stages of film development exist, such as the original negative, inter-positive (master positive), inter-negative (dupe negative), and release print. A sufficient SHD image resolution cannot be obtained simply by scanning the release print, which has passed through the most generations of duplication. Higher-quality data can be obtained by scanning a film generation closer to the original negative. However, scanning the original negative itself is a big gamble, because only one exists. Therefore, inter-positives or inter-negatives are usually digitized.

For material that demands higher quality, such as SHD digital cinema, both the resolution and the color bit-depth are important. A quantization of 10 bits or more is required for each primary color of the RGB color space. In our experiments, we usually used IMAGICA's film scanner "IMAGER XE" to digitize the 35-mm cinefilms. This scanner can yield image files of up to 4096 × 3112 pixels with 14-bit R/G/B. We used a 10-bit log format called a Cineon file to store the scanned images. The aspect ratio of the movie was wider than that of the SHD projector described later. The number of horizontal pixels was set to 3840, the same as that of the projector, and the number of vertical pixels was adjusted to the value corresponding to the movie's aspect ratio. For example, the aspect ratio of Cinemascope is 2.35:1, so the number of vertical pixels is 1634.
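The scan-size arithmetic above can be checked directly (a quick sketch of the calculation, not production code):

```python
# Vertical pixel count from a fixed 3840-pixel width and the film's aspect
# ratio, as described in the text; Cinemascope (2.35:1) yields 1634 lines.
def vertical_pixels(width, aspect_ratio):
    return round(width / aspect_ratio)

print(vertical_pixels(3840, 2.35))   # 1634
```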

2.2. Color control

The color steps and tones of the scanned images cannot be reproduced faithfully by the projector as-is. Therefore, the colors must be adjusted to conform to the projector. In order to reproduce the color space that a film projector expresses from the film, several hundred colors of the digital projector used for SHD digital cinema projection were first measured to obtain a color profile. Next, based on this color profile, we made a 3D LUT to convert the color space between film projection and digital projection and applied it to the scanned images. Finally, IMAGICA's color specialist confirmed the color correction on still and motion pictures selected from each main cut, and the final color was decided.

In our experiment, we applied IMAGICA's color management tool "Galette" to control the colors [7, 8]. The color-corrected data was output to 16-bit TIFF files of 3840 × 2048 pixels (the upper and lower blank bands, caused by the difference between the content's aspect ratio and the projector screen, were filled with black) and stored on tapes or HDDs.

2.3. Compression/coding

The volume of SHD digital cinema data is huge. For a 2-hour movie, we require about 5 Tbyte without compression. This was calculated using the following conditions: 2 hours = 172,800 frames, 3840 × 2048 pixels/frame, R/G/B 10 bits/pixel. Therefore, data compression/encoding is essential.
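The 5-Tbyte figure can be reproduced from the stated conditions (a simple sketch of the arithmetic):

```python
# 2 hours at 24 fps, 3840 x 2048 pixels per frame, 10 bits per R/G/B sample.
frames = 2 * 3600 * 24                    # 172,800 frames
bits_per_frame = 3840 * 2048 * 3 * 10     # three 10-bit primaries per pixel
total_bytes = frames * bits_per_frame // 8
print(frames)                             # 172800
print(round(total_bytes / 1e12, 1))       # 5.1 (Tbyte)
```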

For motion picture coding, interframe coding such as MPEG-2/4, which obtains high compressibility using motion prediction and the correlation of successive frames, is generally used. Intraframe coding achieves a smaller compression ratio but allows easier frame-by-frame access.

Since the development of the first SHD digital cinema system, we have used JPEG instead of MPEG for the following reasons [9, 10]: (1) it is much easier to implement parallel processing based on image tiling, which is necessary to process SHD images in real time; (2) the decoder is robust against data errors caused by packet losses in network transmission; for example, a data error in one frame does not propagate to neighboring frames. The newly developed SHD digital cinema system uses JPEG2000 [11] to construct a real-time decoder for the following additional reasons: (3) 10 bits of R/G/B data, or more, can be handled; (4) a "one-source multiuse" system can be achieved that reproduces images of suitable resolution and quality (SNR) from a single encoded data set, appropriate to the network bandwidth and display capability (screen size and resolution).

To match the parallel processing of the real-time decoder described below, the actual JPEG2000 encoding was performed after each frame image had been divided into 128 × 128-pixel tiles. For digital cinema, encoding does not require real-time processing. Even if the encoding process takes time, it can be sped up by distributing the processing over a PC cluster.

In the experiment, the compressed JPEG2000 data was within a 300 to 450 Mbps range, where degradation in quality is not discernible to the human eye, known as "visually lossless coding." In this range, the block noise caused by the image tiling is not perceptible.

2.4. Distribution and projection

It is assumed that a sufficient network bandwidth can be obtained using Gigabit Ethernet. We use TCP/IP protocols for SHD digital cinema streaming. Transmitting/receiving data via an IP network and TCP protocol processing are handled by the Linux OS of the server and decoder. Buffering of the transmitted/received data and the playback timing at 24 frames per second (fps) are controlled in the application layer.

In conventional video streaming, the video server side is the master for data transmission. In such cases, missing or duplicated frames can be caused by a small clock difference between the server and decoder. In our system, the server's data transmission rate is precisely controlled by the decoder over a TCP connection. In this way, the decoder becomes the system master, yielding smooth playback without missing or duplicated frames.
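The decoder-paced scheme can be sketched in a few lines (a toy model, not the authors' implementation; the frame size and count are illustrative): the receiver reads exactly one frame at a time, and TCP flow control blocks the sender whenever the receiver falls behind.

```python
# Minimal sketch of decoder-as-master pacing over a TCP-like byte stream.
import socket
import threading

FRAME = 1024          # toy frame size; real SHD frames are far larger
N_FRAMES = 24

def server(sock):
    # The sender just writes frames; sendall() blocks when the receiver's
    # window fills, so the effective rate is set by the receiving side.
    for _ in range(N_FRAMES):
        sock.sendall(b"x" * FRAME)
    sock.close()

def recv_frame(sock):
    # The "decoder" pulls exactly one frame's worth of bytes per call.
    buf = b""
    while len(buf) < FRAME:
        chunk = sock.recv(FRAME - len(buf))
        if not chunk:
            break
        buf += chunk
    return buf

a, b = socket.socketpair()
t = threading.Thread(target=server, args=(a,))
t.start()
frames = [recv_frame(b) for _ in range(N_FRAMES)]   # receiver paces the reads
t.join()
print(len(frames), all(len(f) == FRAME for f in frames))
```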

Unlike encoding, decoding requires real-time processing, because it is directly tied to playback. In view of this, a real-time decoder is a key component of the system configuration.

After the real-time decoding process, a 10-bit R/G/B uncompressed digital video signal with an actual bit rate of about 5.7 Gbps is input from the decoder to the projector. Then, the SHD digital cinema is projected onto a screen using the projector.

The audio data is also distributed over a TCP connection from the server to the decoder, like the video data, and is reproduced using the decoder's sound card. The audio uses uncompressed PCM data, because its bit rate is quite low compared to that of the video data. Audio is not discussed further in this paper.

3. Principal Apparatuses

The appearance of our prototype SHD digital cinema distribution system is depicted in Fig. 3. This system is comprised of three principal apparatuses: (1) a video server that transmits encoded movie-data streams, (2) a real-time decoder that decompresses the received data stream, and (3) a projector that projects an actual 8-million-pixel image onto a screen. An explanation of each apparatus is given below [12, 13].

3.1. PC video server

The video server is an IA-32-based Linux (kernel ver. 2.4) PC with dual CPUs (Intel Pentium III 1.26 GHz), a high-performance software RAID system (IDE HDD × 4–8, RAID0, striping mode), and a Gigabit Ethernet network interface card (GbE-NIC). Packed data files, each containing 1000 frames of encoded movie data, are stored on the RAID in advance. In order to achieve stable high-bit-rate SHD movie-data streaming, the two main threads (data read from disk and data write to the network sockets) run in parallel on the dual CPUs. Fluctuations in data reading are absorbed by securing a large ring buffer in the user memory space of the PC's main memory. In the data read thread, the data amount for one frame is read from the header of the packed data file, and then the data is read from the RAID to the ring buffer, one frame at a time. When no unused ring buffer space remains, the data read thread is blocked. In the data write thread, the data of one frame is read from the head of the ring buffer and written to the TCP sockets. When the write operation finishes, a counter showing the number of unused buffers is incremented by one before the next frame's data is processed. When the ring buffer is empty, the data write thread to the TCP sockets is blocked. The mounted RAID has a disk-read speed of 80 Mbyte/s or more, which is sufficient for reading SHD movie data of 300 to 450 Mbps; therefore, the ring buffer does not normally empty.
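The two-thread ring-buffer scheme can be sketched with a bounded queue (a simplified model using only the Python standard library; real disk reads and socket writes are replaced by toy stand-ins):

```python
# Producer/consumer sketch of the server's two main threads: a read thread
# fills a bounded ring buffer from "disk", a write thread drains it to the
# "network"; each side blocks when the buffer is full or empty, respectively.
import queue
import threading

ring = queue.Queue(maxsize=8)    # bounded: put() blocks when the ring is full
sent = []

def disk_reader(n_frames):
    for i in range(n_frames):
        ring.put(f"frame-{i}")   # blocks when no unused buffer space remains
    ring.put(None)               # end-of-stream marker

def net_writer():
    while True:
        frame = ring.get()       # blocks when the ring buffer is empty
        if frame is None:
            break
        sent.append(frame)       # stands in for writing to the TCP sockets

r = threading.Thread(target=disk_reader, args=(100,))
w = threading.Thread(target=net_writer)
r.start(); w.start()
r.join(); w.join()
print(len(sent))    # 100
```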

3.2. SHD/JPEG2000 real-time decoder

The real-time decoder is an IA-32-based Linux (kernel ver. 2.4) PC with dual CPUs (Intel Pentium III 1.26 GHz), four newly developed 64-bit PCI boards carrying JPEG2000 decoder chips [JPEG2000 decoder boards, Fig. 4(a)], a GbE-NIC, and a digital sound card, as shown in Fig. 4(b).

The encoded movie data, received by the GbE-NIC, is distributed to the four JPEG2000 decoder boards over a PCI64 bus. Thirty Analog Devices JPEG2000 processors (ADV-JP2000) are mounted on each decoder board, and each ADV-JP2000 decodes one 128 × 128-pixel tile of the frame image at a time. In order to achieve real-time decoding of the SHD movie data, each 3840 × 2048-pixel frame is divided into 480 tiles, and 120 JPEG2000 processors function in parallel. The decoder supports both YCbCr 4:2:2 (with the color components Cb and Cr subsampled two-to-one horizontally) and RGB 4:4:4 (without subsampling) data modes. These modes are used depending on the network bandwidth between the server and decoder.

Fig. 3. Photograph of SHD digital cinema distribution system. (From the left: video server, GbE switch, and real-time decoder; behind: projector)

Each decoder board processes one quarter of the whole image area of an SHD movie. First, the data tiles of the quarter-frame area are recombined in the frame buffer in the final stage of each board. Next, in the YCbCr 4:2:2 mode, interpolation of the subsampled Cb and Cr data and conversion from YCbCr to RGB are performed. (In the RGB 4:4:4 mode, these functions are skipped.) Finally, raw, uncompressed 10-bit R/G/B digital video signals corresponding to the quarter-frame area are output from each board. This signal has a frame rate of 48 fps: every frame of the 24-fps movie is simply output twice from the frame buffer. The four decoder boards share one synchronizing signal and output synchronized video signals in parallel to the projector for the entire image area.
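The tiling arithmetic behind this parallel decoding is easy to verify (a sketch using the figures given in the text):

```python
# A 3840 x 2048 frame split into 128 x 128 tiles, decoded by 120 ADV-JP2000
# processors (30 per board x 4 boards): 4 tiles per processor per frame.
tiles = (3840 // 128) * (2048 // 128)
processors = 30 * 4
print(tiles)                    # 480
print(processors)               # 120
print(tiles // processors)      # 4
```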

To decode the high-bit-rate SHD movie data stably, the two main threads (data read from the network sockets and data write to the decoder boards) run in parallel on the dual CPUs. Fluctuations in data reading are absorbed by securing a large ring buffer in the user memory space of the PC's main memory. A message stating the data amount for one frame is sent over a TCP connection. In the data read thread, data is read from the TCP sockets to the ring buffer one frame at a time. When no unused ring buffer space remains, the data read thread is blocked. In the data write thread, the data of one frame is read from the head of the ring buffer and written to the decoder boards using DMA. When the writing finishes, a counter showing the number of unused buffers is incremented by one, and the data of the next frame is processed. With the ring buffer continuously filled, the data read thread is blocked every 1/24 second at the same time the data write thread is blocked, following the movie frame rate of 24 fps. At the same time, the server-side data write thread to the network sockets is blocked by TCP flow control, and the server-side data read thread from disk is blocked in turn. Thus, the entire system is synchronized to the decoder-side processing for SHD digital cinema distribution and projection. However, if the network throughput is insufficient, the ring buffer may empty and a buffer underrun may occur, resulting in a dropped frame.

3.3. SHD D-ILA projector

We developed an SHD projector that has a special digital video interface for the raw RGB video signal from the SHD decoder (Fig. 5). Recently, the Victor Company of Japan (JVC)'s D-ILA device (a kind of reflective LCD panel) technology has progressed sufficiently to allow the development of an SHD projector. The SHD projector uses three prototype 1.7-inch-diagonal 3840 × 2048-pixel D-ILA devices, one for each 10-bit R/G/B color. Projection in a movie theater requires brightness sufficient to fill a screen of 200 to 400 inches. By using a 1600-W xenon lamp, the effective brightness exceeds 5000 ANSI lumens, which is bright enough to show images on screens as large as 300 inches.

This prototype SHD projector has three components: a video interface unit, a frame buffer unit, and a projection head unit, as shown in Fig. 6.

The raw video signal output bit rate from the decoder reaches 11.3 Gbps (= 8 million pixels/frame × 10 bits/pixel × 3 RGB × 48 fps) in total. To transmit/receive this data, we designed 4-channel special digital video interfaces with TMDS signaling that support up to 16-bit R/G/B colors (18.1 Gbps in total). Each channel corresponds to one quarter (1920 × 1024 pixels) of the whole image area (3840 × 2048 pixels). The transmitting side (decoder output) has two TMDS transmitters, while the receiving side (projector input) has two TMDS receivers. Through their 16-bit data lines, an effective 10-bit video signal for each primary color is transmitted, divided into the high 8 bits and the low 2 bits (the remaining 6 bits are zeros). The prototype projector consisted of three separate bodies/units, which should be integrated in the completed product. In addition, the LVDS signaling lines, which ran between the units under our experimental conditions, should become internal projector wiring.

Fig. 4. Photograph of JPEG2000 decoder board (a) and block diagram of an SHD real-time decoder (b).
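The interface bit rates follow from the stated frame format, and the 10-bit-in-16-bit packing can be sketched as below (the exact bit positions in the real hardware are an assumption for illustration; the text only states a high-8-bit/low-2-bit split with the remaining 6 bits zero):

```python
# Interface arithmetic and a toy model of the 10-bit sample packing.
pixels = 3840 * 2048
raw_48fps = pixels * 10 * 3 * 48     # decoder output at 48 fps, bits/s
link_cap = pixels * 16 * 3 * 48      # 16-bit-per-color interface capacity
print(round(raw_48fps / 1e9, 1))     # 11.3 (Gbps)
print(round(link_cap / 1e9, 1))      # 18.1 (Gbps)

def pack10(v):
    """Split a 10-bit sample into (high 8 bits, low 2 bits); 6 lanes unused."""
    return (v >> 2) & 0xFF, v & 0x03

def unpack10(hi, lo):
    return (hi << 2) | lo

sample = 0b1010110111                # arbitrary 10-bit value
print(unpack10(*pack10(sample)) == sample)   # True
```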

The projector operates as follows. First, the raw video signals corresponding to the four divisions of the entire image area, output from the decoder, are received through the special digital video interfaces on the video interface unit. Only the effective 10-bit data of each primary color is retrieved from these signals. Next, at the frame buffer unit, the data corresponding to the four divisions are permuted and accumulated in each color's frame buffer as combined data for the whole image area. Then, the projection head's D-ILA devices are refreshed by the data, which is read from the frame buffer and sent to the corresponding driving circuits at a rate of 96 Hz. As a result, every frame of the 24-fps movie is displayed four consecutive times, without any interpolation between adjacent frames (i.e., every frame of the 48-fps decoder output is displayed twice). The refresh rate is an exact multiple of the 24-fps movie rate, thus avoiding the motion artifacts caused by the complicated 2:3 pull-down frame rate conversion (which converts a 24-fps movie into 60-field-per-second television).
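The cadence argument can be made concrete (a sketch; the 2:3 field pattern is the standard pull-down cadence, shown here only for comparison):

```python
# 96 Hz is an exact multiple of 24 fps: each film frame is shown 4 times.
print(96 % 24 == 0, 96 // 24)                 # True 4
# 2:3 pull-down to 60 fields/s maps film frames to fields unevenly:
fields = [2 if i % 2 == 0 else 3 for i in range(4)]   # fields per film frame
print(fields, sum(fields))                    # [2, 3, 2, 3] 10
```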

4. Evaluation

The SHD digital cinema image quality depends on the overall data rate of the transmitted data stream. We evaluated the server and decoder performance and the relationship between the data rate and the image quality. Moreover, through network transmission experiments, we confirmed stable SHD digital cinema transmission using TCP/IP.

Fig. 5. Photograph of SHD D-ILA projector. (upper left: projection head unit, upper right: video interface unit, lower right: frame buffer unit)

Fig. 6. Block diagram of prototype SHD digital cinema projection system.


4.1. Server/decoder performance

We used a software RAID system combining four HDDs (80 Gbyte, 7200 rpm, UltraATA-100) in the server. When digital cinema data was read continuously from the RAID without fragmentation, a throughput of 80 Mbyte/s or more was obtained. The decoder was designed to process SHD images of 3840 × 2048 pixels at up to 48 fps (twice the frame rate of conventional movies). When the server's and decoder's GbE-NICs were directly connected, data encoded at an average bit rate of 250 Mbps at a frame rate of 24 fps could be transmitted and displayed at twice the frame rate, 48 fps. We therefore confirmed that the data handling capacity in the server, from the RAID to the GbE-NIC via the PCI64 bus, and in the decoder, from the GbE-NIC to the decoder boards via the PCI64 bus, is 500 Mbps or more, given sufficient network bandwidth. This means that data encoded with the compression ratio lowered to about 11:1 can be transmitted and displayed at a frame rate of 24 fps.
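The 11:1 figure follows from the measured capacity versus the uncompressed 24-fps source rate (a quick check of the arithmetic):

```python
# Uncompressed 24-fps source rate versus the ~500-Mbps confirmed capacity.
uncompressed_bps = 3840 * 2048 * 3 * 10 * 24   # bits/s at 24 fps
capacity_bps = 500e6                           # confirmed end-to-end capacity
print(round(uncompressed_bps / 1e9, 2))        # 5.66 (Gbps)
print(round(uncompressed_bps / capacity_bps, 1))   # 11.3 -> "about 11:1"
```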

Initially, we used a YCbCr 4:2:2 mode compressed JPEG2000 stream at a data rate of 300 Mbps. Recently, we have used an RGB 4:4:4 mode stream at a data rate of 450 Mbps to show higher-quality movies.

4.2. Data rate versus image quality

As mentioned before, this decoder uses ADV-JP2000 processors. The tile size of 128 × 128 pixels and 3-level frequency decomposition with the 5/3 wavelet filter are fixed specifications of this implementation. To extract the highest coding performance without such restrictions, we would use the whole image area as one tile and 5-level frequency decomposition with the 9/7 wavelet filter. To check the coding efficiency, we compared the coding performance of 128 × 128-pixel tiles with the 3-level, 5/3 wavelet filter against a full tile with the 5-level, 9/7 wavelet filter. The test image was scanned from an original 35-mm negative film of "Circle of Love," provided by ARRI in Germany, and digitized as a 10-bit R/G/B image. The peak signal-to-noise ratio (PSNR) in the color space, obtained from the average noise power and the maximum signal power of the three primary color components (RGB), was used as the metric to evaluate the color images [14]. The results are shown in Fig. 7(a). If the bit rate is less than 100 Mbps, the difference becomes larger and the coding loss increases to more than 2.0 dB. The system showed good coding performance at high transmission rates, where there is only about a 1.0-dB coding loss.
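A color PSNR of the kind described above might be computed as follows (a sketch; the exact formula in [14] may differ in detail, but the assumed form is peak signal power over the noise power averaged across the three primaries):

```python
# Color PSNR for 10-bit R/G/B frames (peak value 1023), averaging the
# squared error over all three primary components.
import math

def color_psnr(ref, dec, peak=1023):
    """ref, dec: lists of (r, g, b) 10-bit tuples of equal length."""
    n = len(ref) * 3
    mse = sum((a - b) ** 2
              for p, q in zip(ref, dec)
              for a, b in zip(p, q)) / n
    return float("inf") if mse == 0 else 10 * math.log10(peak ** 2 / mse)

ref = [(512, 512, 512)] * 4          # toy reference pixels
dec = [(511, 513, 512)] * 4          # toy decoded pixels with small errors
print(round(color_psnr(ref, dec), 1))
```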

Next, we evaluated the coding performance of RGB 4:4:4 and YCbCr 4:2:2. For the YCbCr 4:2:2 mode, the YCbCr 4:2:2 image converted from the RGB 4:4:4 image was encoded and decoded using JPEG2000. We then calculated the PSNR using the RGB 4:4:4 image converted back from the decoded YCbCr 4:2:2 image. The results are shown in Fig. 7(b). If the coding rate is higher than 400 Mbps, RGB 4:4:4 is better than YCbCr 4:2:2, but below about 300 Mbps, YCbCr 4:2:2 is better than RGB 4:4:4. This agrees with our expectations [15].

Fig. 7. Comparison of JPEG2000 coding performance: (a) developed system (128 × 128 tiles, level 3, 5/3 filter) versus ideal condition (full tile, level 5, 9/7 filter); (b) RGB 4:4:4 versus YCbCr 4:2:2.

4.3. Network transmission experiments

To evaluate the system's availability, we conducted field experiments transmitting SHD digital cinema over actual networks, including some router-based ones. TCP-window-based flow control has a theoretical throughput limit: the data rate decreases as the round-trip time (RTT) caused by network delays grows. We used an enlarged TCP window and multiple TCP connections between the server and the decoder to improve the throughput. In the initial server configuration, where the compressed data of each frame was sent to the GbE-NIC in a single write operation, the traffic was extremely bursty. The peak rate was about 800 Mbps, subjecting the network routers to a heavy load [16]. A traffic shaping function was therefore built into the socket writing process of the server application to suppress the peaks.
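The TCP window limit is simply window size divided by round-trip time (the numbers below are illustrative, not measurements from the experiments):

```python
# Single-connection TCP throughput ceiling: window_bytes / RTT.
def max_rate_bps(window_bytes, rtt_s):
    return window_bytes * 8 / rtt_s

default = max_rate_bps(64 * 1024, 0.060)       # 64-KB window, 60-ms RTT
print(round(default / 1e6, 1))                 # 8.7 (Mbps) -- far too slow
# An enlarged window (and/or N parallel connections) raises the ceiling:
enlarged = max_rate_bps(4 * 1024 * 1024, 0.060)
print(round(enlarged / 1e6))                   # 559 (Mbps)
```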

Using these methods, we succeeded in transmitting SHD digital cinema content data at about 300 Mbps between Chicago and Los Angeles via the Internet2 network [17, 18]. We also succeeded in transmitting the data at about 450 Mbps between Makuhari (in Chiba Prefecture) and Ginza (in Tokyo) using a commercially available Gigabit Ethernet link (NTT EAST Corp., Metro Ether) [19].

4.4. Subjective evaluation test

Cineastes in Hollywood and Europe subjectively evaluated our digital cinema system using high-quality movie-data sequences provided by Hollywood studios. Because the SHD digital cinema system was still under development, the demonstrations were primarily concerned with verifying the excellent resolution, rather than the contrast or brightness, which are still being improved. Nevertheless, the demonstrations underscored the potential benefits and feasibility of the SHD digital cinema system. As a result, the concept of our system was approved by Hollywood directors and cinematographers and cited in the draft of the "Digital Cinema System Specification v. 3" written by Digital Cinema Initiatives, LLC (DCI) [6, 20].

5. Conclusions

We developed a prototype superhigh-definition (SHD) digital cinema system that can store, transmit, and display extrahigh-quality, 8-million-pixel-resolution movies using the JPEG2000 coding algorithm. SHD digital cinema fully matches the quality of the original 35-mm negatives. Because the data path from the server to the projector is completely digital, the SHD digital cinema system can deliver 2048 scanning lines and a 3840-pixel-wide resolution with 10-bit R/G/B color.

The circulation process for cinema content will change greatly once digital cinema can maintain the quality of the original 35-mm cinefilms. For instance, digital archiving and on-demand streaming of current masterpiece/premiere movies will become possible. Our system is a prototype designed to realize these requirements.

To further improve image quality, we must improve the throughput of the entire system in order to handle coded streams with lower compression ratios and higher SNR. Furthermore, we must improve the projector's performance in areas such as brightness, contrast, and the range of color reproduction.

Launching a digital cinema distribution business still involves many problems that must be solved, such as copyright issues and piracy. Moreover, additional study is needed to achieve a scalable distribution platform covering all locations, from large theaters to homes and mobile/handheld terminals, before a larger market can be obtained.

Acknowledgments. We express our sincere gratitude to all those at the Victor Company of Japan (JVC) for helping with the development of the projectors, and all those at IMAGICA Corp. for helping with the color management of the digital cinema content. We also thank Chief Director Tomonori Aoyama, Deputy Director Sadayasu Ono, and the other members of the Digital Cinema Consortium of Japan (DCCJ) for promoting our demonstrations.

REFERENCES

1. Fujii T. Current communication technologies for broadband networks. J Imaging Soc Japan 2003;52:51–55. (in Japanese)

2. Special Edition. Digital cinema there. J Inst Image Inf Television Eng 2003;57:178–202. (in Japanese)

3. Akiyama M. Motion picture and television engineering, No. 618, p 39–45, 2004. (in Japanese)

4. Baroncini V, Mahler H, Sintas M. The image resolution of 35 mm cinema film in theatrical presentation. SMPTE Motion Imaging, p 60–66, 2004.

5. Dettmer R. Digital cinema: A slow revolution. IEE Review, p 46–50, Nov. 2003.

6. Nikkei Electronics, No. 861 (2003-11-24), p 117–124, 2003. (in Japanese)

7. Ishii A. Color management technology for producing a theatrical release print from the digital image. Color Forum Japan '02 Proceedings, 9-1, p 131–135. (in Japanese)

8. Ishii A. Color management technology for digital film mastering. Proc IS&T/SID 11th Color Imaging Conference, p 319–325, 2003.

9. Fujii T, Nomura M, Suzuki J, Furukawa I, Ono S. Super high definition digital movie system. Proc SPIE, VCIP'99, Vol. 3653, p 1412–1419.

10. Fujii T, Nomura M, Shirai D, Yamaguchi T, Fujii T, Hagimoto K, Ono S. IP transmission system for digital cinema using 2048 scanning line resolution. Proc IEEE Globecom 2002, p 1643–1647.

11. ITU-T Rec. T.800/ISO/IEC 15444-1:2000, Information technology. JPEG2000 image coding system—Part 1: Core coding system, 2002.

12. Fujii T, Nomura M, Shirai D, Yamaguchi T, Fujii T, Ono S. Digital cinema system using JPEG2000 movies of 8 million pixel resolution. Proc SPIE, Electronic Imaging 2003, Vol. 5022-07, p 50–57.

13. Fujii T, Shirai D, Yamaguchi T, Nomura M, Fujii T, Ono S. IP transmission system for 8 million pixel super high definition images and its application to digital cinema. IIEEJ 9th VMA Conference, 2002. (in Japanese)

14. Matoba N, Aoki T, Ikeda H. IIEEJ, 7th VMA Conference, 2001. (in Japanese)

15. Fujii T, Shirai D, Yamaguchi T, Nomura M, Fujii T. SHD digital cinema delivery system with 8 million pixel. 1st International Workshop on Interactive Rich Media Content Production, RichMedia 2003, p 79–87.

16. Murooka T, Hashimoto M, Miyazaki T. IEICE Tech Rep, NS2003-46, Vol. 1–3, No. 122, p 49–52, 2003. (in Japanese)

17. Shirai D, Shimizu T, Murooka T, Yamaguchi T, Fujii T, Fujii T, Takahara A, Oguchi K. 3000 km/300 Mbps transmission of 800M pixels digital cinema stream using TCP multiple connections. Proc 2003 IEICE General Conference, B-7-68, p 328. (in Japanese)

18. Yamaguchi T, Shirai D, Fujii T, Nomura M, Fujii T, Ono S. SHD digital cinema distribution over a long distance network of Internet2. Proc SPIE, VCIP2003, Vol. 5150, p 1760–1769.

19. Ishimaru K, Shirai D, Yamaguchi T, Shimizu T, Shirakawa K, Suzuki J, Fujii T, Aoyama T. 8M pixels digital cinema transmission trial via JGN. Proc 2003 IEICE Communications Society Conference, B-7-37, p 218. (in Japanese)

20. Digital Cinema Initiatives, LLC. Digital Cinema System Specification v.3. Nov. 25, 2003.

AUTHORS (from left to right)

Takahiro Yamaguchi received his B.E., M.E., and Ph.D. degrees in electronic engineering from the University of Electro-Communications in 1991, 1993, and 1998. He joined NTT Optical Network System Laboratories in 1998 and has been researching superhigh-definition imaging systems and their digital cinema applications. He is currently a research engineer with NTT Network Innovation Laboratories. He is a member of IEICE, ITE of Japan, and SID.

Tatsuya Fujii received his B.S., M.S., and Ph.D. degrees in electrical engineering from the University of Tokyo in 1986, 1988, and 1991. He joined NTT, Japan, in 1991 and has been researching image processing and image communication networks. In 1996, he was a visiting researcher at Washington University in St. Louis. He is currently a manager in the Broadband Business Department at the Business Communication Headquarters of NTT East Corp. Japan. He is a member of IEICE, ITE of Japan, and IEEE.



Mitsuru Nomura received his B.E. and M.E. degrees from Kyushu Institute of Design in 1981 and 1985, and Ph.D. degree from Tohoku University in 1992. He is currently a senior engineer with NTT Network Innovation Laboratories. His research interests are in the areas of high-quality imaging, video transmission, and image signal processing. He is a member of IEICE.

Daisuke Shirai received his B.E. degree in electrical engineering and M.E. degree in computer science from Keio University in 1999 and 2001. He joined NTT Network Innovation Laboratories in 2001 and has been researching superhigh-definition digital cinema systems. He is a member of IEICE.

Kazuhiro Shirakawa received his B.E. and M.E. degrees in electrical engineering from Osaka University in 1985 and 1987. He joined NTT's Atsugi Electrical Communication Labs. in 1987. He is currently a senior research engineer and supervisor at NTT Network Innovation Laboratories. His research has included programmable digital transport systems and their design methodology, and he is currently studying a superhigh-definition digital cinema distribution platform. He is a member of IPSJ, IEICE, and IEEE.

Tetsuro Fujii received his B.S., M.S., and Ph.D. degrees in electrical engineering from the University of Tokyo in 1979, 1981, and 1984. He joined NTT, Japan, in 1984. He has been engaged in research on adaptive acoustic signal processing, parallel digital signal processing, and image processing. Currently he heads the Media Innovation Laboratory of the Network Innovation Laboratories. He is a member of IEICE and IEEE.
