


ERA Operations Verification: Results and Lessons Learned

C. Heemskerk(1), H. Petersen(1), L. Aris(1), R. Stott(2), F. Didot(2)

(1) Dutch Space BV, P.O. Box 32070, 2303 DB Leiden, The Netherlands E-mail: [email protected], [email protected], [email protected]

(2) ESA Directorate of Human Spaceflight & Exploration, Noordwijk, The Netherlands Email: [email protected], [email protected]

ABSTRACT

The European Robotic Arm has recently completed its Acceptance Programme. The last major system level tests were completed in 2003, including the Operations Reference Mission test on the ERA Flight Model, and several Operations Verification tests performed on the ERA Mission Preparation and Training Equipment (MPTE) at ESTEC.

Final regression testing was completed in 2004. The paper describes the various test campaigns that were performed, reporting on the test results and lessons learned. Specific attention is given to the test approach, which included several overlapping end-to-end tests, the role of astronauts as test conductor controlling ERA, the testing of failure scenarios, the testing of operational procedures, and the role of the on-line mission support system.

INTRODUCTION

The European Robotic Arm (ERA) has been developed by a European consortium led by Dutch Space, under contract from ESA. The arm will be used on the Russian Segment (RS) of the International Space Station (ISS). The delivery of the MPTE and the ERA Flight Model to ESA is taking place in October 2004, and launch is now scheduled together with the Multipurpose Laboratory Module (MLM) on a Russian Proton launcher from Baikonur in November 2007 (Fig. 1).

Fig. 1: ERA in the Charlie Chaplin launch configuration on the new Multipurpose Laboratory Module (MLM)

In Proceedings of the 8th ESA Workshop on Advanced Space Technologies for Robotics and Automation 'ASTRA 2004' ESTEC, Noordwijk, The Netherlands, November 2 - 4, 2004



ERA's primary role is external assembly and servicing of the Russian Segment (RS) of the International Space Station (ISS), and support of human Extra Vehicular Activity (EVA) tasks. The ERA Flight Segment consists of:

- The manipulator arm, with seven joints in an anthropomorphic configuration and an end effector on each end, making the arm symmetric and relocatable.

- Control software for commanding the arm from inside the Space Station, the ERA IVA Man-Machine-Interface, or IMMI. This software runs on one of the laptops at the RS Central Post in the Service Module. The creation of a second Control Post in the MLM is under investigation.

- A control panel for commanding the arm during EVA, called the ERA EVA Man-Machine-Interface, or EMMI. The EMMI can be operated from a Portable Working Platform (PWP).

- Supporting infrastructure on the station, consisting of ERA basepoints (BP) mounted on the external surface of the RS, grapple fixtures (GF) mounted on payloads to be handled by ERA, and electrical cabling to the various BPs. In the recently approved baseline for the Russian Segment, a total of 11 basepoints is planned, allowing ERA to roam from the MLM, via the Docking Compartment (DC), to the Service Module, the Solar Power Module (SPM) and the re-instated Research Module (RM). Fig. 2 shows the approved RS configuration.

The ERA Ground Segment consists of generic Ground Control Equipment (GCE) provided by the Russian Mission Control Centre in Moscow (MCC-M) and ERA-specific Mission Preparation and Training Equipment (MPTE). The GCE uplinks ERA ground commands, ground prepared autosequences and software updates. It also handles and distributes ERA telemetry data.

On the MPTE, ERA missions are designed and tested, trained, supported and evaluated. There are three instantiations of MPTE. MPTE-A is installed at ESTEC, and is used to train instructors and maintain the ERA SW. MPTE-B will be located at the Gagarin Cosmonaut Training Centre (GCTC) in Star City, and is used to train ground operators and flight crew. MPTE-C is shared between the Mission Control Centre and RSC/Energia, providing On-Line Mission Support (OLMS) and an environment to prepare and validate new missions.

Fig. 2: Officially approved baseline for the Russian Segment of ISS, showing the MLM (2007) at the nadir port of the FGB (Zarya), the Solar Power Module SPM (2009) at the zenith node of the Service Module (Zvezda), and the Research Module RM (2011) below it. ERA is shown in various poses, stepping from one basepoint to another.



In the ASTRA 2002 Workshop we reported on the ERA Qualification Test programme, which at that time had already been completed on the EQM [1-6]. The FM Qualification Test programme was almost finalised, with only the compliant motion test and the Operational Reference Mission test remaining. Furthermore, the MPTE Acceptance Test had to be completed before we could run the final Operational Verification Tests with an astronaut in the command loop.

The rest of the paper describes the last two years of system level test campaigns in chronological order, starting with a description of the operations verification approach, and concluding with some lessons learned.

OPERATIONS VERIFICATION APPROACH

The primary objective of the ERA Operations Verification Test campaign was to show that the ERA system does allow cosmonauts to successfully conduct and complete a representative set of missions. Normally, an ERA mission is pre-planned on ground. A pre-planned ERA mission mainly consists of an autosequence with ERA commands (AS), and a corresponding set of written procedures in Operations Data File (ODF) format for the crew. Both Autosequence and procedures have to be carefully verified.

The ERA Mission Preparation operator composes a Mission Specific ERA Autosequence (AS) and Operations Plan from a default set of generic task descriptions (the building blocks of each AS) and from generic procedures which are part of the MPTE library.

In an exceptional case, a mission can also be performed unplanned. Unplanned ERA missions are composed on the spot by the operator, from simple manual motion commands like Jog, and from semi-automatic procedures called Mini Auto Sequences (MAS), which are a default part of the flight S/W. Also for the MAS, generic tasks and procedures are available. This manual mode is supported for both IVA and EVA.
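The composition of an Autosequence from a library of generic task building blocks can be sketched as follows. This is a hypothetical illustration only; the names (`GENERIC_TASKS`, `compose_autosequence`, the task and frame identifiers) are invented for this sketch and do not reflect the actual MPTE data model or ERA command set.

```python
# Hypothetical sketch: assembling a mission-specific Autosequence from a
# library of generic task building blocks. All identifiers are illustrative.

GENERIC_TASKS = {
    "GRAPPLE": ["APPROACH_GF", "CLOSE_LATCHES", "CHECK_RIGIDIZE"],
    "MOVE":    ["PLAN_PATH", "EXECUTE_TRAJECTORY", "CHECK_POSE"],
    "RELEASE": ["OPEN_LATCHES", "RETRACT", "CHECK_CLEARANCE"],
}

def compose_autosequence(mission_plan):
    """Expand a high-level mission plan into a flat ERA command list."""
    commands = []
    for task, target in mission_plan:
        for step in GENERIC_TASKS[task]:
            commands.append(f"{step}({target})")
    return commands

# A minimal pick-and-place mission: grapple a payload, move it, release it.
autosequence = compose_autosequence([
    ("GRAPPLE", "PAYLOAD_GF"),
    ("MOVE", "TARGET_BP"),
    ("RELEASE", "PAYLOAD_GF"),
])
print(len(autosequence))  # 9
print(autosequence[0])    # APPROACH_GF(PAYLOAD_GF)
```

The same expansion idea applies to unplanned missions, where the operator would chain Mini Auto Sequences and manual commands on the spot instead of a ground-prepared plan.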

To handle contingency situations, an elaborate set of diagnostic and recovery procedures has been developed. The verification and training of these contingency procedures required a combination of dedicated contingency missions , the use of special test procedures with a Training Instructor stimulating specific failure cases using the Failure Injection GUI on MPTE, and repair rehearsals on hardware.
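The failure-injection principle described above (corrupting the communication flow between simulated subsystems and the control computer, rather than the hardware itself) can be sketched as a simple telemetry filter. This is an assumption-laden illustration; the message layout and names are invented and do not reflect the actual MPTE Failure Injection GUI.

```python
# Hedged sketch of failure injection between a simulated subsystem and the
# control computer. The message layout and names are illustrative only.

def make_injector(failures):
    """Return a filter that overwrites selected telemetry fields.

    `failures` maps (subsystem, field) -> forced value, e.g. a latch
    status stuck at "OPEN" or a frozen joint angle.
    """
    def inject(subsystem, message):
        patched = dict(message)  # leave the nominal message untouched
        for (sub, field), value in failures.items():
            if sub == subsystem and field in patched:
                patched[field] = value
        return patched
    return inject

# Force the latch status of (hypothetical) end effector EE1 to stay "OPEN":
inject = make_injector({("EE1", "latch_status"): "OPEN"})
nominal = {"latch_status": "CLOSED", "motor_current": 0.8}
print(inject("EE1", nominal)["latch_status"])  # OPEN
print(inject("EE2", nominal)["latch_status"])  # CLOSED
```

Because the injection sits in the data flow rather than the simulation model, the same nominal Autosequence can be re-run with different hidden failures, which is what the contingency tests described later exploit.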

ERA operations planning and verification in the flight situation using the MPTE is done in several stages as indicated in Fig. 3.

Fig. 3: Stages in ERA operations planning using MPTE: (1) create the mission-specific Autosequence and procedures; (2) verify ERA operations with the MPTE dynamics simulator, without a man in the loop; (3) verify ERA operations with the MPTE dynamics simulator, with astronauts in the loop.



Before a full scale, operator-in-the-loop verification of the ERA operations is conducted, the correctness of the generated AS is verified and some basic dynamic performance parameters are checked. This is typically done by the Mission Preparation operator, using the ERA dynamics simulator in MPTE with a HW model of the ERA Control Computer (ECC) in the loop, running the actual flight SW, but without a man in the loop. In this verification stage, operator commands are generated from a pre-planned file.

The purpose of this important verification step is to establish that all commands in the AS are given in the right context (no semantic command checks trigger), to verify that the ERA performance is within nominal bounds (no dynamic motion checks trigger), and to verify that the path is collision free. Note that a first check on potential collisions is already performed while the trajectory is planned with the ERA Path Planner, using a kinematic arm model; the collision check during the verification stage takes into account the dynamics of ERA and the full functionality of the Collision Avoidance check that also runs in the ECC during operations on-orbit.
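This man-out-of-the-loop replay can be pictured as the following loop over a pre-planned command file. The sketch is schematic and all names are invented; the real ECC runs far richer semantic, dynamic-motion and collision-avoidance checks than the three toy checks shown here.

```python
# Illustrative sketch of the verification pass without a man in the loop:
# replay pre-planned commands against simulated telemetry and flag any
# triggered checks. Check logic and names are invented for illustration.

def verify_autosequence(commands, allowed_after, limits, clearance_fn):
    """Return a list of (command, violation) pairs; an empty list means pass."""
    violations = []
    prev = None
    for cmd, telemetry in commands:
        # Semantic command check: command only valid in the right context.
        if prev is not None and cmd not in allowed_after.get(prev, {cmd}):
            violations.append((cmd, "semantic: wrong context"))
        # Dynamic motion check: performance must stay within nominal bounds.
        for key, (lo, hi) in limits.items():
            if not lo <= telemetry[key] <= hi:
                violations.append((cmd, f"dynamic: {key} out of bounds"))
        # Collision check on the dynamically simulated pose.
        if clearance_fn(telemetry) <= 0.0:
            violations.append((cmd, "collision: clearance <= 0"))
        prev = cmd
    return violations

# Toy run: the second command is in the right context but moves too fast.
report = verify_autosequence(
    [("APPROACH_GF", {"joint_rate": 0.05, "clearance": 0.30}),
     ("CLOSE_LATCHES", {"joint_rate": 0.20, "clearance": 0.30})],
    allowed_after={"APPROACH_GF": {"CLOSE_LATCHES"}},
    limits={"joint_rate": (0.0, 0.1)},
    clearance_fn=lambda t: t["clearance"])
print(report)  # [('CLOSE_LATCHES', 'dynamic: joint_rate out of bounds')]
```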

The final stage of Operations Verification is to check the ERA Operations Plan with the MPTE dynamics simulator and a man in the loop. The purpose is:

- to verify mission specific operational procedures

- to verify operator visibility issues

- to verify operator commandability issues

- to verify the time needed

To cover the entire spectrum of operations, several verification tests were developed:

- Operational Reference Mission (ORM-ETF)

- Operations Verification Test (OVT-MPTE)

- Contingency Tests (on MPTE and ETF)

- Ground-Space Integrated Mission Test (GSIMT)

ORM-ETF

For all operations that require the presence of real hardware, a system level test was designed to be executed with the ERA Flight Model on the ERA Flat Floor Test Facility (ETF). Also here, worst-case operational conditions were selected. The Operational Reference Mission (ORM) is a reduced set of operations in the 2-D world of the test floor, consisting of a payload pick-and-place operation, a payload inspection, a shoulder relocation and a yield test. A graphical illustration of the ORM-ETF, carried out with the ERA Flight Model on the flat test floor, is shown in fig. 4.

[Fig. 4 diagram: position of equipment on the Flat Floor, showing the WORLD (Armbase Interface) frame, the Payload Test Model (PTM), the Sidewall Simulator, the Pillar and the flat floor boundary, with the Latching Interface at X = -5.730, Z = 7.153 w.r.t. WORLD. Legend: SAP = Safe Approach Pose, ISP = Insertion Pose, GF = Grapple Frame, FOR = Frame Of Resolution.]

Fig. 4: Typical operation during the ORM-ETF test: ERA is approaching the Payload Test Model



OVT-MPTE

The Operations Verification Test (OVT) on MPTE is a complex mission composed of several Autosequences. The objective was to cover the entire nominal scope of ERA capabilities. The OVT simulates the installation of a solar array on the Russian SPP module on ISS and, for the purpose of the test, specifies part of the command sequence as a series of manual actions, imitating an unplanned mission.

The test was held in a very intense two-week session, early April 2003 at ESTEC. The tests involved an extended test team including a facility maintenance operator, training instructor, EMMI operator, IMMI operator and OLMS operator, plus test witnesses. Some of the operator roles were played by future users: Mrs. L. Purtova of RSC/E (Manager of Operations Preparation for ERA), Mr. O. Pushkar of GCTC (EVA Cosmonaut Instructor), ESA astronauts Mr. P. Duque, Mr. T. Reiter, and Mr. U. Guidoni, and Mr. R. Stott (ESA Operations). Prior to the actual test, the test subjects received basic training and instruction in ERA operations. The duration of the test was optimised by alternating instruction and test sessions with two parallel teams. Because none of the ESA astronauts could participate for the whole two-week period, the test programme for the European crew was adapted to accommodate nominal Autosequence operations as well as Manual operations and Collision Avoidance testing during an abbreviated one-week programme. During the second week, the second European crew re-ran the abbreviated programme, while the Russian crew completed the original test programme in the foreseen way.

CONTINGENCY TESTS ON MPTE AND ETF

Contingency procedures were verified in a number of tests on MPTE and ETF. Because real failures should not be introduced in the ERA hardware, anomaly procedures are tested by injecting failures in the communication flow between simulated subsystems and the ERA Control Computer. The process of software maintenance in flight has also been tested. Hardware repair procedures, of course, are all verified on the real hardware, though their final verification will take place with the ERA WET model in the Neutral Buoyancy facility at the Gagarin Cosmonaut Training Centre (GCTC) in Star City.

Many contingency tests were uneventful. Contingency handling procedures (in ODF format) were well-established, and guided the IMMI operator in a straightforward way from the event message, via a clear diagnostic procedure, to the correct recovery strategy.

Fig. 5: Test campaign on MPTE-A. On the right the Instructor Station, on the left On-Line Mission Support. In the background a mock-up of the Service Module Zvezda with the Russian Command Post.

Fig. 6: The Command Post inside the Service Module mock-up: on the left a laptop running the IVA MMI, the second laptop on the right is for camera control, and two B/W video monitors in the background.



One of the most challenging and interesting contingency tests was an end-to-end test run on MPTE, in which the test conductor injected failures at an arbitrary moment during the execution of a nominal autosequence. The time of insertion and the nature of the failure were kept hidden from the ERA operators using EMMI, IMMI and the OLMS. It was the combined task of the IMMI operator and the ERA specialist behind the OLMS station to diagnose the failure at hand within a reasonable time, and to select the correct recovery strategy. With experienced operators at the helm, most failures were correctly diagnosed in a matter of minutes. Getting to the right diagnosis in the shortest possible time became a challenge. At one point there was a real sense of victory, when the OLMS operator claimed that the test conductor had inserted a stuck latch failure, long before any check triggered. The operator had correctly observed an unexpected trend in extraction force levels.
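The kind of trend observation credited to the OLMS operator, spotting a steadily rising extraction force well before any hard limit check triggers, can be sketched with a simple least-squares slope estimate over a sliding window. The window size, slope threshold and force values below are purely illustrative assumptions, not ERA parameters.

```python
# Hedged sketch of early trend detection on telemetry: flag a sustained
# upward trend long before a fixed limit check would trigger.
# Window size and slope threshold are illustrative assumptions.

def rising_trend(samples, window=5, slope_threshold=0.05):
    """Return True if the last `window` samples show a clear upward slope."""
    if len(samples) < window:
        return False
    recent = samples[-window:]
    n = len(recent)
    xbar = (n - 1) / 2
    ybar = sum(recent) / n
    # Least-squares slope of force versus sample index.
    num = sum((i - xbar) * (y - ybar) for i, y in enumerate(recent))
    den = sum((i - xbar) ** 2 for i in range(n))
    return num / den > slope_threshold

# Extraction force creeping up while still far below any hard limit:
forces = [1.0, 1.0, 1.1, 1.2, 1.35, 1.5]
print(rising_trend(forces))        # True
print(rising_trend([1.0] * 6))     # False
```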

GROUND-SPACE INTEGRATED MISSION TEST (GSIMT)

The last major test was the Ground-Space Integrated Mission Test. GSIMT was intended to be the ultimate end-to-end test, including mission preparation and verification on the ERA MPTE at ESTEC, uploading the resulting Datasets into the ERA Qualification Model, running the mission on this flight representative hardware in the ERA Test Facility (ETF) at Dutch Space in Leiden, recording live telemetry from this test and finally replaying and evaluating the mission telemetry again on MPTE.

The test took much longer than expected. Initiated in February 2003, the test team quickly ran into trouble. Some of the early challenges were specific to the nature of the test, in which the simulated world of MPTE had to be matched to the existing HW world of the ERA EQM on the ETF. Since the HW world of ETF differs significantly from the flight environment of ERA FM on ISS, the whole process of creating World Models, deriving dedicated visualisation models, collision models and synoptic display models, and deriving payload and world frame related database parameters had to be performed from scratch. Operators with limited experience struggled with ambiguities in procedures, system instabilities and a large set of complex tools. Also, the mission preparation had to take into account some peculiarities of operating ERA in the ETF world, e.g. out of plane motion has to be prevented and several check limits have to be set wider, to allow for friction effects.

When the teething problems were over, quite a few minor and several major problems in both the flight S/W and the MPTE S/W were uncovered. The most elusive of these problems were related to the inadvertent triggering of the offset correction check and apparently minute differences in database settings. These last problems were not (and could not be) discovered in the earlier stand-alone tests with FM/ETF or MPTE, because the command sequences and detailed parameters for those tests were generated with the specific characteristics of the world in mind. GSIMT forced us to re-think world modelling conventions and the database maintenance process. As a result, the procedure to calculate offset errors and the database maintenance process were revised.

After S/W corrections were made, the test was restarted in October 2003, basically from scratch. The relative ease with which the preparatory activities could now be performed showed that once the operator has some familiarity with the system and the tools, much faster mission preparation is indeed feasible.

The test on the flat floor (GSIMT Part 2) was relatively uneventful. Some observations of incorrect behaviour were made, e.g. some unexpected events that were reported on the operator consoles. All these issues could be traced back to known problems, which could be resolved independently from GSIMT with an already planned Regression Test on MPTE.

In January 2004, part 3 of the GSIMT test was completed with a successful mission evaluation on MPTE. In the meantime, the ERA flight S/W and the MPTE S/W had been updated again, mainly correcting errors unrelated to GSIMT. This time no problems with the simulator occurred; the MPTE system remained stable throughout the test. But again, the end-to-end character of the GSIMT proved its value. Using a new script tool to re-transmit the ERA TM data at a representative frame rate in the MPTE, the On-Line Mission Support System was exposed to its first live telemetry. Several errors in the OLMS data displays were uncovered, problems which had escaped detection in earlier tests with artificial telemetry.
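Replaying recorded telemetry at a representative frame rate, as the script tool did for OLMS, can be sketched as a paced transmission loop. The sketch below is an assumption-based illustration: the 1 Hz rate, the frame contents and the `transmit` hook are invented, and the clock and sleep functions are injectable so the pacing logic can be exercised without real delays.

```python
# Illustrative sketch of replaying recorded telemetry frames at a fixed
# rate, so a downstream display sees "live" data. Rate and frame contents
# are assumptions; clock/sleep are injectable for testability.

import time

def replay(frames, rate_hz, transmit, clock=time.monotonic, sleep=time.sleep):
    """Send recorded frames at a fixed rate, pacing against a clock."""
    period = 1.0 / rate_hz
    start = clock()
    for i, frame in enumerate(frames):
        transmit(frame)
        # Pace against absolute deadlines to avoid cumulative drift.
        deadline = start + (i + 1) * period
        delay = deadline - clock()
        if delay > 0:
            sleep(delay)

# Usage: collect frames instead of transmitting, with stubbed timing.
sent = []
replay(["TM1", "TM2", "TM3"], rate_hz=1.0, transmit=sent.append,
       clock=lambda: 0.0, sleep=lambda s: None)
print(sent)  # ['TM1', 'TM2', 'TM3']
```

Pacing against absolute deadlines rather than sleeping a fixed period per frame keeps the replayed stream from drifting relative to the original timeline.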



LESSONS LEARNED

Use trained operators in system level test campaigns. Many of the ERA system level test campaigns were run with one or more inexperienced operators behind some of the operator consoles. Often, the participation of an inexperienced operator was driven by reasons beyond the test objectives, e.g. to familiarise new staff with the system. While testing with untrained operators teaches a lot about system robustness and user friendliness, it distracts from testing the system capabilities against the agreed specifications and procedures. A lot of time was spent guiding novice operators through basic operations. As will be the case for the flight training programme, potential operators should have basic skills and basic system knowledge before getting involved in complex system level operations. System robustness and user friendliness should certainly not be neglected, but should be checked in dedicated tests before formal testing with end users begins.

Separate testers and observers. Some of the early test campaigns were run while many observers were looking over the shoulders of the operators performing the test. Sometimes the questions were trivial, due to a clear lack of understanding of ERA system basics; sometimes the questions were very much to the point, but required a lengthy explanation. The situation often stimulated a heated debate on the adequacy of the implementation while the test was progressing. The operators were distracted, and tempted to participate in the discussion, losing momentum in the test execution. A set-up that was used in later tests appeared much more efficient: while the test was conducted by a small team in the MPTE room, observers were guided through the tests in an adjacent briefing room, using test documents projected on a white board and switching between video channels with IMMI, OLMS, EVA and instructor views to keep track of the test in progress. In this way, valuable observations were made and discussed in parallel with steady test progress.

Adopt the HW test approach. The overall test approach commonly used in Flight HW testing can be well adapted to S/W acceptance testing and Operations testing at system level. During the two years of system level test campaigns we have fully adopted this HW test approach, which starts from a customer approved test specification, written against the user level requirements to be closed by the test, and governed by an overall test plan. From the test specification, a detailed test procedure is derived, with enough white space for manual annotation. The procedure is red-lined by the test conductor during the test and issued under a new number as the as-run test procedure. The test is kicked-off with a Test Readiness Review, in which the test configuration is recorded with minute detail. For system level SW and Operations tests, this implies a detailed discussion on all open non conformances and software problem reports possibly affecting the test. The test is wrapped up in a formal Post Test Review, and the results are reported in a test report. Once established, this approach ensures a good control over the test progress.

Keep the agreed test objectives in mind during test execution. Especially when a test is well-prepared, with a very detailed test procedure, it is easy to lose sight of the original test objectives. Executing the tests in the presence of one or more witnesses, the operator is often tempted to deviate from the procedure: "Where can I see this?", "Could you just show me, or check that?" With the test objectives clearly in mind, it is much easier for the operator to either accommodate a valid request or politely refer the question to the back room. For exploratory operations, to test the system without explicit procedures, separate sessions should be organised.

Stick to a single system of non-conformance reporting. In the early stages of the MPTE Acceptance Test and in the first iterations of the ORM-ETF and Operations Verification Test on MPTE, many observations were made that could not be directly attributed to a system failure or a deficiency in the requirements, or that were simply a matter of flavour. Long lists of these so-called findings were generated and separately maintained. Since then, these lists have taken on a life of their own, partly independent of the formal requirements flow-down, implementation and test process. Some findings were duplicated into Software Problem Reports (SPRs or SLIPs), others into NCRs or RIDs, making the task of tracing and cross-referencing in successive Regression Tests a job in itself. In the meantime, all critical issues have been resolved, and system stability, robustness and user friendliness have been dramatically improved. In retrospect, the first MPTE Acceptance Test should have been declared failed. All findings should have been discussed immediately, separating clear non-conformances from operational improvements. The NCRs should have been merged back into the baseline system development (repair before retry). Items that could not be dispositioned as clear non-conformances should have been dropped immediately.

Reserve time to perform a dry run. Complex end-to-end tests involve large test teams, with several test conductors, a test director, observers, PA, and interested managers. A dry run performed by a limited test team will help to iron out the wrinkles in the test procedure and speed up the execution of the formal test.



CONCLUSIONS

The European Robotic Arm has completed its Acceptance Programme. The last major Operations Verification Tests were performed on the ERA Mission Preparation and Training Equipment (MPTE) at ESTEC in 2003. Final regression testing was completed in 2004.

The delivery of the MPTE and the ERA Flight Model to ESA is taking place in October 2004 and launch is now scheduled together with the Multipurpose Laboratory Module (MLM) on a Russian Proton launcher from Baikonur in November 2007.

Important lessons learned were:

- Use trained operators in system level test campaigns

- Separate testers and observers in complex tests

- Stick to a single system of non-conformance reporting

- Adopt the HW test approach in S/W acceptance testing and Operations testing at system level

- Keep the agreed test objectives in mind during test execution

- Reserve time to perform a dry run

REFERENCES

[1] "ERA Performance measurements test results", P. Verzijden, H. Petersen, M. Visser, in: Proceedings of the 7th ESA Workshop on Advanced Space Technologies for Robotics and Automation, ASTRA 2002.

[2] "The ERA System: Control Architecture and Performances Results", F. Didot, M. Oort, J. Kouwen, P. Verzijden, in: Proceedings of the Sixth International Symposium on Artificial Intelligence, Research & Development in Space, i-SAIRAS 2001, Montréal, Canada, June 2001.

[3] "Boosted Modal Survey Test on the European Robotic Arm", E. v.d. Heuvel, G. Gloth, M. Degener, in: 4th International Symposium on Environmental Testing for Space Programmes, ESA SP-467, Liège, Belgium, June 2001.

[4] "Thermal Balance Testing of the European Robotic Arm", E. v.d. Heuvel, J. Doornink, in: 4th International Symposium on Environmental Testing for Space Programmes, ESA SP-467, Liège, Belgium, June 2001.

[5] "ERA EQM and FM test results", P. Verzijden, W.J. Admiraal, J. Kouwen, in: Proceedings of the 6th ESA Workshop on Advanced Space Technologies for Robotic Applications, ASTRA 2000.

[6] "How to build a Space Robot; ERA Lessons Learned", M. Oort, F. Meiboom, C. Heemskerk, in: Proceedings of the 6th ESA Workshop on Advanced Space Technologies for Robotic Applications, ASTRA 2000.
