
Real Time Tracking of Geostationary Satellites in Sub Arc Second Accuracy

Jizhang Sang 1 and Craig Smith 2

EOS Space Systems Pty Ltd, Weston Creek, ACT 2611, Australia

EOS Space Systems conducted a demonstration project of optically tracking geostationary and high-altitude satellites using a CCD camera on its 1.8 m telescope at its Mt Stromlo Space Research Centre in August 2006. The images of the tracked satellite and stars were processed in real time. First, the stars in the images were identified using the UCAC2 catalog. The rotation and scale parameters for transforming the camera pixel system to the azimuth/elevation system were then estimated from the positions of the identified stars. Finally, the azimuth and elevation of the tracked object were determined using the transformation parameters and the positions of the identified stars. The experiments showed that such a system could complete the tracking of a geostationary satellite within 2-3 minutes, including the telescope slewing time, and generate, on average, 40 to 80 normal point positions of the tracked object. The accuracy of the generated azimuth and elevation observations was shown to be better than 1 arc second.

I. Introduction

EOS Space Systems (EOS) conducted a CCD-camera-based optical tracking project in August 2006 to demonstrate EOS's capability of precision tracking of high-altitude Earth-orbiting objects, in addition to the capability of laser tracking of LEO debris objects proven in the RazorView and NEOT projects in 2004 and 2005, respectively.

This paper presents a description of the optical tracking system for GEO objects and the results of the demonstration tracking operations. This includes the determination of the camera transformation parameters using stars, the estimation of normal point observations from the imaged positions of the target object and stars, and the star identification method. The tracking results from the three nights of 21 to 23 August 2006 are then discussed.

II. EOS Space Debris Tracking System

The EOS Space Debris Tracking System has an optical tracking subsystem and a laser ranging subsystem, among others, as core components. The optical tracking subsystem can be operated independently and works for objects at low or high altitudes. Because the laser ranging subsystem needs reasonably accurate position information of objects for laser signals to be returned and successfully detected, it is at present operated together with the optical tracking subsystem, which provides the real-time angular observations used in an orbit update procedure (for objects with known approximate elements) or an initial orbit determination procedure (for objects without any a priori orbit information).

Two milestone projects, RazorView and NEOT, were successfully completed in 2004 and 2005, respectively. RazorView demonstrated that an object with approximate orbital elements (such as TLE elements) could be tracked by the EOS laser tracking system with a success rate over 85%. The NEOT project aimed at tracking a totally new object, of which no a priori information was known. EOS achieved laser tracking of newly detected debris objects with a high success rate, and a newly detected object could be reacquired after 36 hours. An example of the returned (left) and processed (right) laser signals is shown in Figure 1. Figure 2 shows the laser beam pointed at a target (top-left) and the image of the sky with the target at the image center (top-middle). Figure 3 shows the EOS Space Research Centre at Mount Stromlo.

1 Research Engineer, EOS Space Systems Pty Ltd, EOS House, Cotter Road, Weston Creek, ACT 2611, Australia, AIAA Senior Member.
2 Technical Director, EOS Space Systems Pty Ltd, EOS House, Cotter Road, Weston Creek, ACT 2611, Australia.


Figure 1. Example of Laser Ranging to Debris Object

Figure 2. Space Debris Acquisition

Figure 3. EOS Space Research Centre at Mt Stromlo

III. Optical Tracking System for Geostationary Satellites

The optical tracking subsystem within the EOS Debris Tracking System consisted of three CCD cameras, with their main properties listed in Table 1. It is almost certain that the WFOV camera is not suitable for achieving the sub-arc-second measurement accuracy because of its poor pixel resolution (~11.6" per pixel). In the following, the algorithms implemented for the NFOV camera are discussed. These algorithms could also be implemented for the MFOV camera.

Table 1: Cameras in the EOS Optical Tracking System

Name    Aperture (mm)   Field of View        Use
WFOV    355             1.65 x 1.65 degree   Initial Detection
MFOV    406             21 x 21 arc min      Angles Tracking
NFOV    1800            6 x 6 arc min        Beam Locking

A. MFOV and NFOV Camera

The resolution of the MFOV camera was 2.46" per pixel. This would require an image centroiding accuracy better than 0.4 pixels to make the pointing measurement accuracy better than 1" (0.4 pixel x 2.46"/pixel ≈ 1"). On the other hand, the pixel resolution of the NFOV camera was about 0.65" per pixel, so it would be much easier to achieve the sub-arc-second measurement accuracy. In practice, images from the NFOV camera were used to generate angular observations. The basic procedure of estimating the azimuth and elevation of a target object from an image can be summarized as: (1) identify stars; (2) calibrate the camera scale and rotation parameters; (3) estimate the azimuth and elevation of the target object using the positions of the identified stars and the transformation parameters.

In the following, the methods of determining the scale and rotation parameters of the camera using stars are discussed first, followed by the algorithm for estimating the azimuth and elevation observations of the target object. Finally, the algorithm for identifying stars is presented.

B. Determination of Scale and Rotation Parameters for NFOV Camera

The scale and rotation parameters for computing the azimuth and elevation of the tracked object from the pixel positions of the object and the identified stars were needed. They could be determined either off-line or in real time using the pixel and azimuth/elevation positions of stars. Both methods were used to determine these transformation parameters for the NFOV camera.

The AZ-EL coordinate system appeared as a left-handed coordinate system, the same as the image X-Y system, so that the transformation equations between the relative pixel position and the relative azimuth and elevation are

• $(\Delta x, \Delta y) \rightarrow (\Delta\tilde{Az}, \Delta El)$:

$$\Delta\tilde{Az} = \Delta x \cdot s_x \cos\theta + \Delta y \cdot s_y \sin\theta, \qquad \Delta El = -\Delta x \cdot s_x \sin\theta + \Delta y \cdot s_y \cos\theta \qquad (1)$$

• $(\Delta\tilde{Az}, \Delta El) \rightarrow (\Delta x, \Delta y)$:

$$\Delta x = (\Delta\tilde{Az}\cos\theta - \Delta El\sin\theta)/s_x, \qquad \Delta y = (\Delta\tilde{Az}\sin\theta + \Delta El\cos\theta)/s_y \qquad (2)$$

where the rotation angle $\theta$ was a function of the telescope pointing azimuth and elevation:

$$\theta = Az_{telescope} + El_{telescope} + \theta_D \qquad (3)$$

$\theta_D$ is the derotation angle, depending on the optics configuration between the sky and the camera, and $s_x$ and $s_y$ are the pixel scale factors. $\Delta\tilde{Az}$ is the absolute angle in the sky in the azimuth direction. When used in practice, it should be converted into the conventionally meant $\Delta Az$, that is $\Delta Az = \Delta\tilde{Az}/\cos(El)$.
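The sketch below illustrates Eqs. (1)-(3) in code. It is a minimal sketch, not the paper's implementation; the function names are ours, and the example values (a scale of 0.65"/pixel and a derotation angle near 168 degrees) are only of the order reported later in the paper.

```python
# A minimal sketch of the pixel <-> angle transformation of Eqs. (1)-(3).
# Names and the example values are illustrative, not taken from the paper's code.
import math

def rotation_angle(az_telescope_deg, el_telescope_deg, theta_d_deg):
    """Eq. (3): theta = Az_telescope + El_telescope + theta_D (degrees)."""
    return az_telescope_deg + el_telescope_deg + theta_d_deg

def pixel_to_angle(dx, dy, s_x, s_y, theta_deg, el_deg):
    """Eq. (1): pixel offsets (dx, dy) -> angular offsets in arcsec.
    s_x, s_y are in arcsec/pixel; the returned dAz is the conventional
    azimuth offset, i.e. dAz_tilde / cos(El)."""
    th = math.radians(theta_deg)
    d_az_tilde = dx * s_x * math.cos(th) + dy * s_y * math.sin(th)
    d_el = -dx * s_x * math.sin(th) + dy * s_y * math.cos(th)
    d_az = d_az_tilde / math.cos(math.radians(el_deg))
    return d_az, d_el

def angle_to_pixel(d_az_tilde, d_el, s_x, s_y, theta_deg):
    """Eq. (2): on-sky angular offsets (arcsec) -> pixel offsets."""
    th = math.radians(theta_deg)
    dx = (d_az_tilde * math.cos(th) - d_el * math.sin(th)) / s_x
    dy = (d_az_tilde * math.sin(th) + d_el * math.cos(th)) / s_y
    return dx, dy

if __name__ == "__main__":
    # Illustrative check: a (59, 16) pixel offset with s ~ 0.65 "/pixel maps to ~40" in elevation.
    theta = rotation_angle(88.789, 28.697, 167.8)
    print(pixel_to_angle(dx=59.0, dy=16.0, s_x=0.65, s_y=0.65, theta_deg=theta, el_deg=28.697))
```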

B.1 Off-line Determination

The off-line determination of the scale factor (assuming $s_x = s_y = s$) and derotation angle was based on star movement measurements on images. A star movement measurement could be made by tracking a star (in sidereal tracking mode) and recording its pixel position. A bias was then manually applied to the azimuth, the elevation, or both, causing the star to move to a new position, and the new pixel position was recorded. In this way, a pair of $(\Delta\tilde{Az}, \Delta El)$ had a corresponding pair of $(\Delta x, \Delta y)$. Multiple star movement measurements resulted in multiple estimates of the scale factor and derotation angle; the mean of the multiple estimates was taken as the final estimate, and the RMS about the mean could also be computed.


The scale factor was determined as

$$s = \sqrt{\frac{\Delta\tilde{Az}^2 + \Delta El^2}{\Delta x^2 + \Delta y^2}} \qquad (4)$$

The rotation angle was determined using the following equations:

$$\tan\theta = -\frac{\Delta x}{\Delta y} \ \text{when}\ \Delta\tilde{Az} = 0, \qquad \tan\theta = \frac{\Delta y}{\Delta x} \ \text{when}\ \Delta El = 0 \qquad (5)$$
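A minimal sketch of this off-line determination follows; one commanded single-axis bias and the observed pixel displacement yield one estimate of $s$ and $\theta$. Function names and the example numbers are illustrative assumptions, not the paper's code.

```python
# A minimal sketch of the off-line determination in Eqs. (4)-(5).
import math

def scale_factor(d_az_tilde, d_el, dx, dy):
    """Eq. (4): s = sqrt((dAz~^2 + dEl^2) / (dx^2 + dy^2)), in arcsec/pixel."""
    return math.sqrt((d_az_tilde ** 2 + d_el ** 2) / (dx ** 2 + dy ** 2))

def rotation_angle_from_movement(d_az_tilde, d_el, dx, dy):
    """Eq. (5), assuming a positive single-axis bias:
    tan(theta) = -dx/dy when dAz~ = 0, tan(theta) = dy/dx when dEl = 0."""
    if d_az_tilde == 0.0:
        return math.degrees(math.atan2(-dx, dy)) % 360.0
    if d_el == 0.0:
        return math.degrees(math.atan2(dy, dx)) % 360.0
    raise ValueError("Eq. (5) applies only to single-axis biases")

# Hypothetical measurement: a 40" elevation bias moved the star by (59, 16) pixels.
s = scale_factor(0.0, 40.0, 59.0, 16.0)                       # ~0.65 arcsec/pixel
theta = rotation_angle_from_movement(0.0, 40.0, 59.0, 16.0)   # on-sky rotation angle (deg)
# The derotation angle then follows from Eq. (3): theta_D = theta - (Az_telescope + El_telescope).
```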

Table 2 shows an example of determining the scale and rotation parameters of the NFOV camera by measuring star movements on 4 August 2006, 17 days before the tracking operations described in Section IV. Three stars were used. The estimated parameters were used as a priori values in the real-time parameter calibration.

Table 2: Off-line Determination of Transformation Parameters of NFOV Camera

                                      Az (deg)   El (deg)   Pixel x   Pixel y   s ("/pixel)   θ_D (deg)
Star 1   Unbiased pixel position       88.789     28.697      259       234
         ΔÃz = 0,   ΔEl = 40"          88.7598    28.749      318       250      0.6547        167.8
         ΔÃz = 40", ΔEl = 0            88.7196    28.811      276       172      0.6221        167.8
Star 2   Unbiased pixel position      177.609     71.906      256       237
         ΔÃz = 0,   ΔEl = 40"         177.796     71.921      203       269      0.6462        169.2
         ΔÃz = 40", ΔEl = 0           177.969     71.911      288       288      0.6644        168.1
Star 3   Unbiased pixel position                              248       230
         ΔÃz = 20", ΔEl = 20"         271.978     54.174      202       233      0.6135
         ΔÃz = 30", ΔEl = 30"         271.959     54.145      184       231      0.6629
         ΔÃz = 40", ΔEl = 40"         271.943     54.121      161       232      0.6502
                                                                        Mean     0.6449        168.2
                                                                        RMS      0.0197        0.67
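As a quick numerical check of Eq. (4), assuming the Star 1 rows of Table 2 pair the unbiased pixel position with the $\Delta El = 40''$ row:

$$\Delta x = 318 - 259 = 59, \quad \Delta y = 250 - 234 = 16, \quad s = \sqrt{\frac{0^2 + 40^2}{59^2 + 16^2}} \approx 0.654''/\text{pixel},$$

which is close to the tabulated value of 0.6547.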

B.2 Real-time Calibration

When there were two or more stars identified, an estimation procedure could be applied to determine the transformation parameters. In this process, separate scale factors for the x and y axes could be estimated, depending on the number of identified stars. In practice, when an object was tracked for about 1-2 minutes, hundreds of images (the imaging rate was 5 Hz for the NFOV camera) were taken, and the stars identified in all the images were used to estimate the parameters in a batch least-squares procedure. The actual procedure was, for the available N images taken during the tracking of a particular object:

• For the i-th image ($1 \le i \le N$), if there were $n_i\ (\ge 2)$ stars identified, the following $n_i - 1$ sets of measurement equations could be formed:

$$\delta\tilde{Az}_j = \delta x_j \cdot s_x \cos\theta + \delta y_j \cdot s_y \sin\theta, \qquad \delta El_j = -\delta x_j \cdot s_x \sin\theta + \delta y_j \cdot s_y \cos\theta, \qquad j = 1, \ldots, n_i - 1 \qquad (6)$$


where

$$\delta\tilde{Az}_j = (Az_j - Az_0)\cos El_0, \quad \delta El_j = El_j - El_0, \quad \delta x_j = x_j - x_0, \quad \delta y_j = y_j - y_0$$

The star index started from 0, star 0 serving as the reference star.

• The above measurement equations were expanded into linear terms of the Taylor series about the given approximate values $s_x^0$, $s_y^0$ and $\theta^0$:

$$\delta\tilde{Az}_j = \delta x_j s_x^0\cos\theta^0 + \delta y_j s_y^0\sin\theta^0 + \Delta s_x\,\delta x_j\cos\theta^0 + \Delta s_y\,\delta y_j\sin\theta^0 + \Delta\theta_D\,(-\delta x_j s_x^0\sin\theta^0 + \delta y_j s_y^0\cos\theta^0)$$
$$\delta El_j = -\delta x_j s_x^0\sin\theta^0 + \delta y_j s_y^0\cos\theta^0 - \Delta s_x\,\delta x_j\sin\theta^0 + \Delta s_y\,\delta y_j\cos\theta^0 + \Delta\theta_D\,(-\delta x_j s_x^0\cos\theta^0 - \delta y_j s_y^0\sin\theta^0) \qquad (7)$$

• The normal equations for $(\Delta s_x, \Delta s_y, \Delta\theta_D)$ were then formed and solved to obtain the estimated parameters as

$$\hat{s}_x = s_x^0 + \Delta s_x, \qquad \hat{s}_y = s_y^0 + \Delta s_y, \qquad \hat{\theta}_D = \theta_D^0 + \Delta\theta_D$$
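The following is a minimal sketch of this batch calibration. The input layout is an assumption, and numpy's least-squares solver is used here in place of the explicitly formed and solved normal equations described above.

```python
# A minimal sketch of the real-time batch least-squares calibration of (s_x, s_y, theta_D)
# corresponding to Eqs. (6)-(7). Data layout and names are illustrative assumptions.
import numpy as np

def calibrate(images, s_x0, s_y0, theta_d0_deg, iterations=3):
    """images: list of dicts with
         'az_tel', 'el_tel' : telescope boresight azimuth/elevation (deg),
         'stars'            : list of (d_az_tilde, d_el, dx, dy) of identified stars
                              relative to star 0 (angles in arcsec, offsets in pixels).
    Returns the estimated (s_x ["/pixel], s_y ["/pixel], theta_D [deg])."""
    s_x, s_y, theta_d = s_x0, s_y0, np.radians(theta_d0_deg)
    for _ in range(iterations):
        a_rows, misclosures = [], []
        for img in images:
            theta = np.radians(img['az_tel'] + img['el_tel']) + theta_d   # Eq. (3)
            ct, st = np.cos(theta), np.sin(theta)
            for d_az, d_el, dx, dy in img['stars']:
                # predicted relative offsets with the current parameters (Eq. 6)
                pred_az = dx * s_x * ct + dy * s_y * st
                pred_el = -dx * s_x * st + dy * s_y * ct
                # partial derivatives w.r.t. (s_x, s_y, theta_D), per Eq. (7)
                a_rows.append([dx * ct, dy * st, -dx * s_x * st + dy * s_y * ct])
                a_rows.append([-dx * st, dy * ct, -dx * s_x * ct - dy * s_y * st])
                misclosures.extend([d_az - pred_az, d_el - pred_el])
        corrections, *_ = np.linalg.lstsq(np.array(a_rows), np.array(misclosures), rcond=None)
        s_x += corrections[0]
        s_y += corrections[1]
        theta_d += corrections[2]      # this correction comes out in radians
    return s_x, s_y, np.degrees(theta_d)
```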

This real-time calibration method might have the following advantages:

o It was mathematically rigorous if the telescope was stable over the exposure time;
o It provided real-time error statistics of the pointing measurement accuracy.

However, it could have the following disadvantages:

o Real-time star identification could be erroneous;
o Real-time access to a star catalogue was required;
o Positions of catalogue stars had to be reliable and accurate;
o Image timing accuracy had to be better than 0.01 s;
o To make the centroiding of star objects as accurate as possible, the exposure time had to be short enough that the stars appeared as points in the image (assuming the telescope was tracking a geostationary satellite);
o The tracking system needed real-time weather data for the refraction correction.

It will be seen in Section IV that the parameters calibrated in real time were highly accurate.

C. Determination of Object Azimuth and Elevation

Given the transformation parameters and the pixel positions of the tracked object and the identified stars in an image, it was an easy process to determine the azimuth and elevation of the object. For the i-th image with $n_i\ (\ge 1)$ identified stars, the azimuth and elevation observations of the object from this image were determined as

$$Az_i = \frac{1}{n_i}\sum_{j=0}^{n_i-1} Az_{i,j}, \qquad El_i = \frac{1}{n_i}\sum_{j=0}^{n_i-1} El_{i,j} \qquad (8)$$

where

$$Az_{i,j} = Az^{star}_{i,j} + \frac{\Delta x_{i,j}\,\hat{s}_x\cos\hat{\theta}_i + \Delta y_{i,j}\,\hat{s}_y\sin\hat{\theta}_i}{\cos El_{i,j}}, \qquad El_{i,j} = El^{star}_{i,j} - \Delta x_{i,j}\,\hat{s}_x\sin\hat{\theta}_i + \Delta y_{i,j}\,\hat{s}_y\cos\hat{\theta}_i, \qquad j = 0, \ldots, n_i - 1 \qquad (9)$$

$$\hat{\theta}_i = Az^{telescope}_i + El^{telescope}_i + \hat{\theta}_D \qquad (10)$$

$$\Delta x_{i,j} = x^{object}_i - x^{star}_{i,j}, \qquad \Delta y_{i,j} = y^{object}_i - y^{star}_{i,j}, \qquad j = 0, \ldots, n_i - 1 \qquad (11)$$

where $Az^{star}_{i,j}$ and $El^{star}_{i,j}$ are the apparent azimuth and elevation of the j-th star in the i-th image, $Az^{telescope}_i$ and $El^{telescope}_i$ are the azimuth and elevation of the telescope boresight for the i-th image, $x^{star}_{i,j}$ and $y^{star}_{i,j}$ are the pixel position of the j-th star in the i-th image, and $x^{object}_i$ and $y^{object}_i$ are the pixel position of the object in the i-th image.

Normal point observations might be generated when the image frame rate was high. For the NFOV camera used in the tracking system, the image frame rate was 5 per second, so there could be as many as 5 sets of azimuth and elevation observations in a second. The normal point azimuth within a second was defined as the mean or zero-order term of a linear fit of all the azimuths within that second. The normal point elevation was determined in the same way. The time of the first observation within the second was taken as the observation time for the normal point.
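A minimal sketch of Eqs. (8)-(11) and of one reading of the normal-point definition follows. The input layout and names are assumptions, not the paper's code; the normal point is taken here as the intercept of the linear fit evaluated at the first observation time within the second.

```python
# A minimal sketch of the per-image object Az/El estimate (Eqs. 8-11) and normal-point formation.
import math
import numpy as np

def object_az_el(stars, x_obj, y_obj, s_x, s_y, theta_deg):
    """stars: list of (az_star, el_star, x_star, y_star) -- apparent star positions (deg)
    and their pixel positions. Scale factors in arcsec/pixel.
    Returns the object's (Az, El) in degrees, averaged over the stars (Eq. 8)."""
    theta = math.radians(theta_deg)
    az_list, el_list = [], []
    for az_s, el_s, x_s, y_s in stars:
        dx, dy = x_obj - x_s, y_obj - y_s                                    # Eq. (11)
        d_el = (-dx * s_x * math.sin(theta) + dy * s_y * math.cos(theta)) / 3600.0
        el = el_s + d_el                                                      # Eq. (9)
        d_az = ((dx * s_x * math.cos(theta) + dy * s_y * math.sin(theta)) / 3600.0
                / math.cos(math.radians(el)))
        az_list.append(az_s + d_az)
        el_list.append(el)
    return sum(az_list) / len(az_list), sum(el_list) / len(el_list)          # Eq. (8)

def normal_point(times_s, values_deg):
    """One-second normal point: zero-order term of a linear fit of the values within
    the second, time-tagged with the first observation time of that second."""
    t0 = times_s[0]
    slope, intercept = np.polyfit(np.asarray(times_s, float) - t0,
                                  np.asarray(values_deg, float), 1)
    return t0, intercept
```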

D. Star Identification

Obviously, star identification was key to the success of the whole process. Because the azimuth and elevation of the telescope and the camera transformation parameters were known to high accuracy, it was a relatively easy and straightforward task to design a star identification algorithm for the optical tracking of geostationary satellites. The algorithm can be summarized as follows:

• The azimuth and elevation of a star-like object were calculated using the pixel position of the star-like object, the pixel position and the azimuth and elevation of the telescope boresight (which was usually at the centre of the camera), and the scale and rotation parameters of the NFOV camera. In fact, the pixel positions of the anticipated stars could be estimated before the actual tracking operations.
• The calculated azimuth and elevation of the star-like object were transformed into the coordinate system of the UCAC2 catalog (USNO CCD Astrograph Catalog, second version), which was used in the EOS optical tracking system.
• The catalog was searched for stars with azimuths and elevations close to the estimated azimuth and elevation of the star-like object.
• The relative azimuth and elevation of a catalog star pair were compared with the relative azimuth and elevation of a pair of star-like objects. If their differences were less than a few arc seconds, the star pair was saved.
• The above step was repeated for each pair of star-like objects. Because of the NFOV camera's narrow field of view and short exposure time, usually only a few stars appeared in an image, so the comparison of the relative azimuths and elevations could be completed very quickly.
• Finally, the stars could be identified based on the number of appearances in the saved star pairs.

It can be seen that the above algorithm would require three stars in an image. After the star identification, the catalog azimuths and elevations of the identified stars were transformed into apparent azimuths and elevations with respect to the tracking station.
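A minimal sketch of this pair-matching and voting idea is given below. The data layout, the 3-arcsec tolerance, and the voting tie-break are illustrative assumptions rather than the paper's implementation.

```python
# A minimal sketch of pair-based star identification: compare relative Az/El between
# detected star-like objects with relative Az/El between their catalogue candidates,
# save matching pairs, and identify each detection by the most frequent candidate.
from itertools import combinations
from collections import Counter

def identify_stars(detections, candidates, tol_arcsec=3.0):
    """detections: list of (az, el) estimated for each star-like object (deg).
    candidates:  per detection, a list of nearby catalogue stars (id, az, el) in deg.
    Returns a dict mapping detection index -> catalogue id, based on pair votes."""
    votes = Counter()
    for i, j in combinations(range(len(detections)), 2):
        d_az = (detections[i][0] - detections[j][0]) * 3600.0
        d_el = (detections[i][1] - detections[j][1]) * 3600.0
        for cid_i, caz_i, cel_i in candidates[i]:
            for cid_j, caz_j, cel_j in candidates[j]:
                if (abs((caz_i - caz_j) * 3600.0 - d_az) < tol_arcsec and
                        abs((cel_i - cel_j) * 3600.0 - d_el) < tol_arcsec):
                    votes[(i, cid_i)] += 1          # save the matching pair as votes
                    votes[(j, cid_j)] += 1
    best = {}
    for (det, cid), n in votes.items():
        if det not in best or n > best[det][1]:
            best[det] = (cid, n)
    return {det: cid for det, (cid, _n) in best.items()}
```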

IV. Results

A. Tracking Operations

The tracking demonstration operations were performed on the nights of 21-23 August 2006. Table 3 gives a summary of the tracking operations, with information on the numbers of tracked satellites, the numbers of normal-point observations, and the duration of each tracking pass. The operations on 22 and 23 August were affected by bad weather, particularly on 23 August. The exposure time was 100 ms. Usually 3 to 5 stars of magnitudes between 12 and 14 were identified in each image.

Table 3: Tracking Operations Summary

Date     Session Duration   Number of Objects   Number of Passes   Number of NP Observations
21 Aug   8h 14m             79                  166                11295
22 Aug   4h 23m             77                  93                 6305
23 Aug   0h 42m             25                  25                 1327


The number of normal point observations and the tracking duration of each individual object (pass) are shown in Figures 4 to 6. The mean number of normal points and the mean effective tracking duration were 72 and 108 seconds, respectively, with only 7 tracking passes being 15 seconds or shorter. It should be pointed out that not every single image resulted in an effective observation of the object azimuth and elevation.

Figure 4. Number of Normal Point Observations and Tracking Duration, 21 August 2006

Figure 5. Number of Normal Point Observations and Tracking Duration, 22 August 2006

Figure 6. Number of Normal Point Observations and Tracking Duration, 23 August 2006

(Each figure plots the number of NP observations and the duration in seconds against the object index.)


B. Transformation Parameters

Figures 7 to 9 show the transformation parameters estimated in real time on the three nights. The mean values of the three parameters from all the estimates are 0.647529"/pixel, 0.647611"/pixel, and 168.0848 degrees, with RMS values of 0.0024"/pixel, 0.0018"/pixel, and 0.0861 degrees, respectively, for $s_x$, $s_y$ and $\theta_D$.

Figure 7. Scale and Derotation Parameters, 21 August 2006

Figure 8. Scale and Derotation Parameters, 22 August 2006

Figure 9. Scale and Derotation Parameters, 23 August 2006

(Each figure plots the x and y scale factors in arcsec/pixel and the derotation angle in degrees against the object index.)

C. Accuracy


The RMS value of the observations of each tracking pass of a satellite was evaluated. It was assumed that the satellite was either static or drifting slowly during the tracking. This means that the normal point azimuths from one tracking pass could be fitted to a constant or a linear function, and the RMS value of the fitting residuals was then computed. A similar fitting procedure was applied to the normal point elevations from a tracking pass. Figures 10-12 show the RMS values on the three nights. It can be seen that, of the 568 RMS values in total, only 5 were larger than 1 arc second.
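A minimal sketch of this RMS evaluation is shown below; the function name and the choice of numpy's polynomial fit are ours, but the idea follows the text: fit the pass to a constant or linear function of time and take the RMS of the residuals.

```python
# A minimal sketch of the per-pass residual RMS evaluation described above.
import numpy as np

def residual_rms(times_s, values_arcsec, degree=1):
    """times in seconds, values in arcsec; degree=0 for a constant fit, 1 for a linear fit.
    Returns the RMS of the fit residuals in arcsec."""
    t = np.asarray(times_s, dtype=float) - times_s[0]
    v = np.asarray(values_arcsec, dtype=float)
    coeffs = np.polyfit(t, v, degree)
    residuals = v - np.polyval(coeffs, t)
    return float(np.sqrt(np.mean(residuals ** 2)))
```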

Figure 10. Observation RMS Values, 21 August 2006

Figure 11. Observation RMS Values, 22 August 2006

Figure 12. Observation RMS Values, 23 August 2006

(Each figure plots the azimuth and elevation residual RMS values in arc seconds against the object index.)


The high accuracy of the observations can also be seen in Figure 13, where the azimuths and elevations from 4 tracking passes of TELKOM1 (NORAD ID 25880) on the night of 21 August 2006 are shown. The first tracking pass started at 10h19m19s, lasted 41 seconds and resulted in 35 normal points; the second started at 11h22m49s, lasted 51 seconds and generated 36 normal points; the third started at 11h29m51s, lasted 125 seconds and had 104 normal points; and the last started at 14h53m07s, lasted 60 seconds and resulted in 41 normal points. The base values for the azimuth and elevation are 303.5760 and 30.5060 degrees, respectively. It is seen that the observations within each pass lie almost on a level line. It can also be seen that the satellite was slowly drifting. When the azimuths and elevations were fitted to linear functions, the drift rates were 0.108" per minute and -0.165" per minute for the azimuth and elevation, respectively. Some objects were found to be drifting at much higher rates; for example, Ekran21 (NORAD ID 26736) had drift rates of -39.408" per minute and 33.978" per minute for the azimuth and elevation, respectively (see Figure 14, where the base azimuth and elevation are 295.0018 and 24.8500 degrees, respectively).
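The quoted drift rates follow directly from the slope of such a linear fit; a minimal sketch (with an assumed time base in seconds and offsets in arcsec) is:

```python
# A minimal sketch of deriving a drift rate from the slope of a linear fit of the
# normal-point series (arcsec versus seconds), scaled to arcsec per minute.
import numpy as np

def drift_rate_arcsec_per_minute(times_s, values_arcsec):
    slope_per_second, _intercept = np.polyfit(np.asarray(times_s, float),
                                              np.asarray(values_arcsec, float), 1)
    return 60.0 * slope_per_second
```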

Figure 13. Observations of 4 Tracking Passes of TELCOM1 (NORAD ID 25880), 21 August 2006

Figure 14. Drifting Ekran21, 21 August 2006

(Both figures plot the azimuth and elevation differences in arc seconds from the base values; Figure 14 uses an epoch axis in seconds.)

V. Conclusions

The EOS Space Debris Tracking System was primarily designed to track LEO debris objects using laser ranging technology, as demonstrated in the RazorView and NEOT projects. Within the whole system, the optical tracking subsystem has mainly been used as an aiding tool for generating orbit data of the accuracy required by the laser tracking subsystem.

However, the optical tracking subsystem can be operated independently for both high and low altitude objects. The accuracy of the observations from tracking LEO objects has been about 1-2", and the accuracy of observing geostationary satellites has been at the sub-arc-second level, as shown in this paper. The observation rates are about 1 Hz and 5 Hz for LEO and GEO objects, respectively. All optical observations are generated in near real time.