FRONT PAGE

European Research Consortium for Informatics and Mathematics
www.ercim.org

Number 38, July 1999

SPECIAL: Financial Mathematics 7

Branislav Rovan, Director of the Slovak Research Consortium for Informatics and Mathematics and a founding member of the Department of Computer Science at the Comenius University in Bratislava.

Next Issue: Special: 10 years ERCIM

The Slovak Research Consortium for Informatics and Mathematics (SRCIM) joined ERCIM in May 1998. It always takes time for a new partner to fully integrate into the co-operative work of the member institutes. The Familiarisation Day held during the recent ERCIM Board of Directors and ERCIM Executive Committee meetings in Bratislava will certainly help to speed up this process.

SRCIM aspires to both benefit from and contribute to the melting pot of information and experience embodied in ERCIM. Slovakia is in the midst of a difficult economic and social transformation. It will take a number of years before local industry becomes strong enough to look for challenges in the more distant future and to recognise the importance of research, development, and education. Some of the problems in the area of IT that Slovak society and industry are going to face in the future are meanwhile being recognised in many European countries and addressed by the ERCIM institutes. SRCIM intends to participate in looking for solutions to these problems and thus become ready to apply these solutions in the local context.

Historical circumstances inhibited advanced research in most applied areas of IT in Slovakia. Theoretical research, less dependent on hardware, managed to stay in touch with current developments, and the results achieved are recognised and appreciated by the international community. Strong theoretical research influenced computer science education at the leading universities. The educational paradigm 'through abstraction to flexibility', applied over many years, resulted in a strong base of IT professionals in Slovakia who are ready, many years after their graduation, to embrace the newest technologies and paradigms of software development. SRCIM member institutes are eager to find areas of common interest with other ERCIM member institutes. Their well-trained research and development teams are looking forward to contributing to and gaining experience from joint projects.

The challenges posed by our vision of the information society of the future transcend borders, and the solutions our community needs to find will require teams that transcend borders too. It is vital that partners of varied expertise and societal background look for solutions that will indeed bring the benefits of IT to everyone. ERCIM has a large enough geographical coverage of Europe to find such partners and to form such teams. The member institutes of SRCIM have many years of experience in co-operation with countries in Central and Eastern Europe and are ready to share this experience.

The broad base of R&D, the geographical spread, and the multicultural outlook give ERCIM the potential to be one of the few organisations that can identify key issues and influence the strategy of European IT R&D. SRCIM welcomes the chance and the challenge to take part in this process. SRCIM is a junior partner in ERCIM, both in its size and in the duration of its membership. Having as members the key R&D institutions in IT in Slovakia, and representing a base of well-trained and flexible researchers, SRCIM has the ambition to contribute to finding solutions to the IT challenges on the ERCIM agenda.

Branislav Rovan

ERCIM will celebrate its 10th anniversary with a two-day event in Amsterdam, 4-5 November 1999. See the announcement on page 3.

CONTENTS
Joint ERCIM Actions 2
The European Scene 4
Special Theme: Financial Mathematics 7
Research and Development 23
Technology Transfer 34
Events 38
In Brief 39







5th ERCIM Environmental Modelling Group Workshop
by Jean-Paul Berroir

The fifth workshop of the ERCIM Environmental Modelling Group, dedicated to Information Systems for Environmental Modelling, was held on 3-4 June 1999. It was organized by INRIA, hosted at the Palais des Congrès, Versailles, France, and attracted some 20 participants from six countries. The workshop chairman was Isabelle Herlin from INRIA.

The lectures and discussions focused on information systems designed for environmental modelling. More specifically, several issues were addressed, all crucial for the operational implementation of environmental models (such as systems for air quality monitoring, coastal zone management, hydrology, and climate): system architecture, data collection on the Internet, data management, access to distributed geographic data sources, and GIS applications over the Internet.

The workshop was divided into three sessions: the first concerned applications of information systems to the environment (air quality, risk management), the second focused on the systems themselves, and a final session concerned ongoing European projects sharing the concern of designing systems for environmental modelling. Four such projects were presented, related to the European Telematics or Inco programmes.

The workshop ended with a lecture by Achim Sydow, GMD, chairman of the working group, summarizing the IST/Telematics Environment Concertation Meeting. This was an opportunity to discuss ideas for future projects to be formed within the working group. The DECAIR project, dedicated to the use of remote sensing data for air quality simulation, which recently started under the framework of the CEO programme (Centre for Earth Observation), is a good example of the success of the collaboration between members of the working group. See http://www-air.inria.fr/decair/ for information about this project.

Detailed information about the workshop program can be found at: http://www-air.inria.fr/ercim.

Please contact:

Thomas Lux – GMD
Tel: +49 30 6392 1820
E-mail: [email protected]

Ninth DELOS Workshop focuses on Distance Learning
by Pasquale Savino and Pavel Zezula

The 9th DELOS Workshop on Digital Libraries for Distance Learning was held in Brno, Czech Republic, 15-17 April 1999. The objective of the DELOS Working Group, part of the ERCIM Digital Library Initiative, is to promote research into the further development of digital library technologies. This year, Brno Technical University celebrated its 100th anniversary; it also recently became an associated partner of DELOS. The workshop was organized in celebration of these two events.

The workshop addressed two relatively new areas: Digital Libraries and Distance Learning. Access to education has become increasingly important for individuals who need to gain a competitive edge in the labour market through the acquisition of specialized or new knowledge. This demand for new information, coupled with the ever increasing quantity of information available in digital form, has led to a change in traditional teaching methods.

Face-to-face teaching is gradually being replaced by distance education. In order to make this form of education both effective and efficient, advanced information and communication technologies must be exploited. Digital libraries of distributed complex multimedia data can serve as suitable repositories of continuously changing, up-to-date information, which are indispensable for distance education.

The DELOS organizers cooperated with the Czech Association of Distance Learning Universities and the European Association of Distance Learning Universities in preparing the programme for the workshop.

The final programme contained contributions from nine countries. The invited talk, by John A.N. Lee, concentrated on distance learning experiences at the Department of Computer Science at Virginia Tech, USA. The remaining presentations can be divided into two categories. Papers in the first category concentrated on conceptual issues of distance learning, emphasizing the position of digital libraries in the global process of knowledge acquisition. Papers in the second category presented information about actual prototypes for distance learning or addressed some of the advanced technology tools necessary to meet this aim. The workshop attendees also greatly appreciated the session dedicated to prototype demonstrations; six different prototypes were presented. The workshop inspired numerous, very lively discussions.

For more information on the DELOS Working Group, see: http://www.iei.pi.cnr.it/DELOS/
The Proceedings of the Workshop have been published in the DELOS Workshop series and can be found at: http://www.ercim.org/publication/ws-proceedings/DELOS9/

Please contact:

Pasquale Savino – IEI-CNR
Tel: +39 050 593 408
E-mail: [email protected]

Pavel Zezula – Technical University, Brno
Tel: +420 5 4214 1202
E-mail: [email protected]


ERCIM - a Virtual Laboratory for ICT Research in Europe
Amsterdam, Thursday 4 November 1999

Under this slogan, scientists of the ERCIM institutes will be given the opportunity to present their ideas on matters that are closely related to IT research. It is not research itself that will be targeted by these presentations but rather the issues that come up at a meta-level. To give some examples: presentations will be given on the pros and cons of open source software development, on the state of the art in a number of ICT research areas, on new paradigms and prospects in particular fields, and so on. A full program will be available at the ERCIM website soon.

ERCIM - Leveraging World Class R&D for Business and Society
Amsterdam, Friday 5 November 1999

The November 5 event is targeted towards the European industrial and political community. It aims at taking stock of information technology, its advancement, and its applications in business and society. Presentations will be given by J.T. Bergqvist (BoD, NOKIA), Gottfried Dutiné (Director of Alcatel/SEL Germany), Jacques-Louis Lions (President of the French Academy of Sciences), Roger Needham (Director of Microsoft Research Europe), Gerard van Oortmerssen (President of ERCIM), and Alexander Rinnooy Kan (BoD, ING Bank). Alongside these presentations, major achievements of the ERCIM institutes will be demonstrated throughout the day.

For more information see: http://www.ercim.org/

Please contact:

ERCIM office
Tel: +33 1 3963 5303
E-mail: [email protected]



A new Manager for ERCIM

During their recent meeting in Bratislava, the ERCIM Board of Directors nominated Jean-Eric Pin as the new Manager of ERCIM. Jean-Eric Pin is a 46-year-old director of research at CNRS, and he currently heads a research team at LIAFA (Laboratoire d'Informatique Algorithmique : Fondements et Applications) at the University of Paris 7. As a former director of LIAFA, he is experienced in research management. He has also gathered knowledge in research transfer during the two years spent at the IT group Bull and through his activities as a consultant on data compression for the French space agency CNES. He is well-versed in European programs such as ESPRIT and now IST. Pin first studied mathematics and then moved to computer science. In 1989, he received the IBM France Scientific Prize in Computer Science for his work in automata theory.

“I would like to thank ERCIM for the trust it puts in me. I am very enthusiastic about joining ERCIM and I hope to prove equal to this challenging new task. I am especially delighted to have an anniversary to celebrate so soon after being nominated! It is not only an exciting festivity, but also a unique opportunity for our consortium to become an indispensable entity at the European level. So, don't forget to tell your friends, your colleagues, and your industrial partners about that very special event that will take place in Amsterdam at the beginning of November. This anniversary is going to be a very rich event, both internally and externally, and I am sure that everybody is ready to help make it a success!”

Jean-Eric Pin


ERCIM 10th Anniversary Event Amsterdam, 4-5 November 1999

ERCIM will celebrate its 10th anniversary with a two-day event in the “Beurs van Berlage” in Amsterdam, 4-5 November 1999. The first day will be an internal event for ERCIM-member personnel only, while the second day is targeted towards Information and Communication Technology (ICT) users in European industry and leading people from the political community.




IT Training in a Changing Society
by Josef Kolář

The process of change in the countries of Central and Eastern Europe has removed barriers in their political, economic, and social life. In the Czech Republic, we are experiencing the creation of a new environment in which both industrial companies and educational institutions are subject to the conditions of an open market. This article presents some hypotheses concerning recent trends in the student population at one of the faculties of the Czech Technical University in Prague.

The Department of Computer Science and Engineering (CS&E) at the Faculty of Electrical Engineering was the first to offer a comprehensive university education in IT in the former Czechoslovakia. The study program has always been a balanced mixture of software- and hardware-oriented courses, so that graduates were attractive to a relatively wide sector of the job market.

After the removal of the communist regime, the computer market opened to a massive import of technologies whose supply had been strictly controlled before. Free imports eliminated the need for the technologically obsolete IT systems produced in the former COMECON countries and caused a peak demand for IT personnel capable of quickly adapting to new technologies. Western companies started to build their local offices, hiring mostly Czech personnel since they were cheaper and knew the local environment. Graduates from the Department of CS&E were among the most successful in getting such jobs, and in many cases they gradually reached top positions in the Czech branches of many important companies (eg IBM, Microsoft, Oracle). Apart from this, the continuous development in IT and telecommunications has been attracting young people to enroll for computer studies at the department.

The figure shows how three indicators we consider interesting have evolved over the last decade. They represent the overall student population of the faculty, the number of students of CS&E, and the number of women in the student population. The indicator values have been normalized in order to compare their trends (the actual starting values are 4037, 362, and 293, respectively). We see that after an initial stagnation, the population grows, yet not as quickly as the number of CS&E students. The difference could have been even more remarkable if all students applying for the CS&E study program had been accepted, which is not possible due to the limited space and personnel capacity of the department. While quite satisfactory for us, this situation reflects a serious drain-off effect on other study programs and departments, both in student numbers and quality.

The critically decreasing number of women is something the university is not pleased with, even though there is probably no chance for a technical university to achieve a close-to-balanced population with respect to sex. The decrease in the women's population is even more alarming if percentages are considered. The student population had 7.3% women in 1988, but only 1.6% in 1997. We have tried to formulate possible hypotheses as to the reasons for this situation.

Girls do not like computers - The way children get their first exposure to IT favors boys. It is not only that most computer games are competition-oriented (fighting, war games), but also that the technical aspects attract more boys than girls. More publicity is needed to stress the fact that there is enough space in IT applications for creativity, cooperation, and social communication, both in usage and in design (as eg in WWW pages or human-computer interfaces), in which the female factor can be fully appreciated.

Girls do not like electrical engineering (EE) - Even accepting that technical disciplines (and specifically EE) are perhaps more attractive to males, how can one explain the latest trend, which has led from a modest 7.3% to an almost complete disappearance of women from the student body? Our hypothesis is that there is nowadays a richer offering in the educational market, so that most girls actually select the study programs they like more.

Another fact derived from indicators not depicted in the diagram is that the average time needed to graduate (if ever) has grown remarkably. Our hypothesis, whose verification would need more data, is that the reason is not the difficulty of the program but mostly a deliberate decision by the students. Since they do not pay any fees and enjoy important advantages, they often stay 'studying' while actually working for some company. The university thus offers a shelter for a smooth start into their professional life.


There are many traditions and myths in university life that, surprisingly, quickly disappear when the society experiences a deep social transition. Although some of the changes are positive and others are inevitable, we still have a chance to influence them provided that we find the real reasons.

Please contact:

Josef Kolář
Tel: +420 2 2435 7403
E-mail: [email protected]

Evolution of the student population at the Faculty of Electrical Engineering (FEL), the number of students of the Department of Computer Science and Engineering (CS&E), and the number of women in the student population.




A Successful Effort to Increase the Number of Female Students in Computer Science
by Truls Gjestland

The Norwegian University of Science and Technology (NTNU) observed a steady decline in the number of female students in subjects related to computer science. In 1996 only 6 percent of the students in Computer Science were women. On the other hand, female students with a degree in computer science were in high demand, reflecting a general Norwegian trend towards a balanced workforce.

Why worry?

It is considered important that both men and women are among the well-qualified computer science and IT graduates who work in the R&D projects that will color our future. Good qualifications in computer science are the gateway to interesting, well-paid careers. More females should be employed in this market. Both Norwegian industry and the public sector recognize that competent staff with IT skills are essential. When half the applicants to higher education are female, we should make use of the resources and scientific talents that women possess to educate well-qualified female computer science graduates.

University initiative

In 1997 a special program was launched by NTNU to increase the number of young women in computer science. First of all, a special extra quota was established, reserved exclusively for female students. Some would argue that having special quotas would lead to students with inferior qualifications. This has not been the case. In 1997 and 1998, a total of 36 and 37 women respectively were admitted on this special quota. At NTNU students are admitted to the various faculties according to their grades from high school. Different faculties may have different qualification requirements. All of the 'quota girls' belong, gradewise, to the upper quarter of all the students at NTNU; definitely not a minor league team.

Information material especially designed for women was distributed to all the high schools in Norway, and all the women who expressed an interest in studying computer science at NTNU were invited to participate in an all-expenses-paid 'girls' day' at the university. During this visit they would meet with students and faculty, and be given all relevant information as a hands-on experience.

The results were promising. One of the problems earlier was that only 40% of the young women who were accepted actually started their studies at the university. Now this percentage increased to 80. At the semester start in 1996 only 6 out of 101 students in computer science were women. In 1997 the ratio was 50 out of a total of 171. In 1998 the efforts were further increased: in the fall of 1998 the number of women starting to study computer science at NTNU had increased to 69 out of 230. The percentage of young women admitted for the fall semester 1999 is now 29.6%. The experiment that started at NTNU has now been expanded into a national initiative. Four universities are currently involved.
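The intake shares implied by the counts quoted above can be checked directly; the short sketch below simply recomputes the percentages from the numbers given in the text (no other data are assumed).

```python
# Women starting computer science at NTNU, counts taken from the text:
# 1996: 6 of 101, 1997: 50 of 171, 1998: 69 of 230.
cohorts = {1996: (6, 101), 1997: (50, 171), 1998: (69, 230)}

for year, (women, total) in cohorts.items():
    share = 100.0 * women / total
    print(f"{year}: {women}/{total} = {share:.1f}% women")
```

This confirms the jump from roughly 6% in 1996 to around 30% in 1998, consistent with the 29.6% admission figure reported for fall 1999.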

Measures directed at the upper secondary schools were implemented in the summer of 1998. The project engaged the services of a natural science teacher at this level. A common information campaign was launched by the four universities to get more young women into computer science. This comprised a brochure, advertising, web-based information, and a special postcard:

• 25 000 copies of the campaign brochure were distributed to universities and 380 upper secondary schools all over Norway. It was also sent to teachers of mathematics in the third year at secondary schools who participated in a special conference, Damer@Data (Females@Computing), at the University of Tromsø in March 1998.

• The campaign postcard was printed in 60 000 copies and distributed in cafes, discos, and similar places where young students gather in most large towns in Norway. A further 10 000 were sent to universities. At NTNU, the Department of Computer and Information Science sent a personal postcard to all the young women in upper secondary school in Norway who had taken the necessary subjects in mathematics and physics to be qualified for admission. Professor Reidar Conradi, head of the Department of Computer and Information Science, wrote to them and urged them to consider studying computing at NTNU.

• The project had double and single page ads in the press, especially in magazines for young people. There were also ads in the student newspapers at the four universities.

• The project was also written about in the local and national media and in specialized computer magazines.

It is not enough to have a high percentage of women at the beginning of their studies. You also have to make sure that they complete the courses. This was also part of the initiative.

A computer lab for female students is part of an initiative at NTNU to increase the number of young women in computer science.




NTNU does not have any computer classes exclusively for women. Certain actions, however, are specifically aimed at the female students. There is a computer lab for women with six designated assistants (female students at senior level), and there are two assistants whose prime task is to make sure that the new female students are having a good time! They arrange special courses, visits to computer businesses, social meetings with female industrial leaders, etc. In order to emphasize the role-model aspect, a female associate professor has also been engaged. Another important element has been a series of lectures, 'Know your subject', in which the relevance of the computer science subjects is discussed to give the students a broader perspective.

The project has received financial support from the Norwegian research council, and several large industrial firms in Norway act as sponsors. For further information see: http://www.ntnu.no/datajenter/engl.html

Please contact:

Kirsti Rye Ramberg – Norwegian University of Science and Technology
Tel: +47 73 59 09 32
E-mail: [email protected]

Basic Research, Information Technologies, and their Perspectives in the Czech Academy
by Milan Mareš

As in other countries in Central Europe, research in the Academy of Sciences of the Czech Republic (formerly the Czechoslovak Academy of Sciences), its management, and the position of researchers have displayed significant changes since the early nineties. Beyond the general and generally known conditions of the former regime, there existed additional problems connected specifically with R&D in informatics, information sciences, and information technologies. Namely, the embargo on advanced technologies forced researchers to 'repeat' work already done elsewhere in developing even simple elements of high electronic technology. A certain disregard for the copyrights of software products led to uncontrollable illegal 'imports'. A general lack of interest in the industrial production of advanced information technologies essentially limited the career possibilities of gifted young specialists outside the universities and basic research facilities; demand for them was rather limited. All that changed almost overnight.

However beneficent these changes are from the general point of view, they bring qualitatively new problems to be solved by the managers of research. The grant system of financing research projects has led some researchers to feel that their position is less stable.

Their ability and readiness to start risky research in quite new fields, connected with the possibility of failure or, at least, with a relatively long decrease in publication output (with all the consequences for success in grant competitions), becomes much lower. 'Safe' research in well-known areas seems more attractive. The mobility of researchers and research teams, as a natural reaction to the flexibility of support, is rather difficult in a small country like the Czech Republic, and this difficulty is increased further by the extremely limited possibilities of finding adequate accommodation for a researcher's family. Last but not least, the demand for information and computer specialists in industry, business, and banking has rapidly increased. The salaries offered by these new potential employers are much higher than those achievable in an academic institute or university. For young families this argument becomes very cogent.

Gifted postgraduate students frequently understand their study as an opportunity to increase their value on the labor market.

This is not wrong, generally, but it would be desirable to keep at least some (preferably the most gifted) in the institutes. All these new circumstances confronted the managements of the research institutes (also usually new) with the problem of coping with the instability of the research staff while guaranteeing its smooth regeneration. The way to manage this situation is simple in its general formulation and difficult in practical realization. It can be expected that the labor market in information science and technology will become more saturated and that this will contribute to an equilibrium between the supply and demand for researchers in the institutes. But this expectation cannot be the starting point for the management of IT research in the coming years.

First, it is necessary to build a stable core of permanent researchers in the institute. This core need not be very large, but it must be continuously replenished, and its members have to be creative personalities who are sure that the institute counts on them. This core can be surrounded by a staff of researchers moving between institutes and applied research, even with the risk that some moves are irreversible, or increasing their qualifications. Such a system cannot be effective without the mobility of researchers - in the case of Czech science, international mobility in both directions - including the joint solution of research and grant projects. Close cooperation with universities and participation in education are also necessary for the sound life of a research institute of this type. Cooperation with industry and other consumers of applied results is effective only if it concerns original, non-standard solutions of very specific problems. An academic institute cannot (and should not) compete with the routine products of specialized firms. Achieving such dynamic stability of the research system in academic institutes cannot be done quickly or with simple tools, but it must be on the horizon of our endeavor if we want to manage basic IT research at the level demanded by the contemporary world.

Please contact:

Milan Mareš – CRCIM
Tel: +420 2 6884 669
E-mail: [email protected]

Page 7: [Finance] Financial Mathematics



Financial Mathematics
by Denis Talay

Financial markets play an important economic role, as everybody knows. It is not well known (except by specialists) that traders now use not only huge communication networks but also highly sophisticated mathematical models and scientific computation algorithms. Here are a few examples:

The trading of options represents a large part of financial activity. An option is a contract which gives its buyer the right to buy or sell a primary asset (for example, a stock or a bond) at a price and at a maturity date which are fixed at the time the contract is signed. This financial instrument can be seen as an insurance contract which protects the holder against undesirable changes in the primary asset price.

A natural question of practical importance is: does there exist a theoretical price for any option within a coherent model of the economy? It is beyond the scope of this short introduction to give a precise answer to such a difficult problem, which indeed requires an entire book to be treated in depth (see Duffie '92). This introduction is limited to one element of the answer: owing to stochastic calculus and the notion of no arbitrage (one supposes that the market is such that, starting with zero wealth, one cannot obtain a strictly positive future wealth with positive probability), one can define rational prices for options. Such a rational price is given as the initial amount of money invested in a financial portfolio which exactly replicates the payoff of the option at its maturity date. The dynamic management of this portfolio is called the hedging strategy of the option.

It seems that the idea of modelling a financial asset price by a stochastic process is due to Bachelier (1900), who used Brownian motion to model a stock price, but the stochastic branch of Financial Mathematics was actually born in 1973, with the celebrated Black and Scholes formula for European options and a paper by Merton; decisive milestones were then the papers by Harrison and Kreps (1979) and Harrison and Pliska (1981), which provide a rigorous and very general conceptual framework for the option pricing problem, particularly owing to an intensive use of stochastic integration theory. As a result, most traders in trading rooms now use stochastic processes to model the primary assets and deduce theoretical optimal hedging strategies which help them take management decisions. The related questions are varied and complex, such as: is it possible to identify stochastic models precisely? Can one efficiently approximate the option prices (usually given as solutions of Partial Differential Equations or as expectations of functionals of processes) and the hedging strategies? Can one evaluate the risks of severe losses corresponding to given financial positions, or the risks induced by the numerous misspecifications of the models?
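The celebrated formula itself is compact enough to state in a few lines of code. The following is a generic sketch of the Black-Scholes price of a European call; the parameter values are illustrative only, not taken from any of the papers in this issue.

```python
# Minimal sketch of the Black-Scholes (1973) price of a European call
# on a non-dividend-paying asset.
from math import log, sqrt, exp
from statistics import NormalDist

def black_scholes_call(s, k, r, sigma, t):
    """s: spot price, k: strike, r: risk-free rate,
    sigma: volatility, t: time to maturity in years."""
    d1 = (log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    n = NormalDist().cdf
    return s * n(d1) - k * exp(-r * t) * n(d2)

# The hedging strategy mentioned above holds N(d1) units of the asset.
price = black_scholes_call(s=100, k=100, r=0.05, sigma=0.2, t=1.0)
```

For an at-the-money call with 20% volatility, a 5% rate and one year to maturity, this gives a price of about 10.45.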

These questions are the subject of intensive current research, both in academic and financial institutions. They require expertise in statistics, stochastic processes, Partial Differential Equations, numerical analysis, software engineering, and so forth. Of course, several research groups in the ERCIM institutes participate in the rapidly growing scientific activity raised by financial markets and insurance companies, motivated by at least three factors:

• this economic sector is hiring an increasing number of good students

• it is rich enough to fund research

• it is a source of fascinating new open problems which challenge science.

The selection of papers in this special theme gives a partial activity report of the ERCIM groups, preceded by an authoritative opinion by Björn Palmgren, Chief Actuary and member of the Data Security project at SICS, on the need for mathematical models in Finance. The papers can be separated into three groups, corresponding to three essential concerns in trading rooms:

• how to identify models and the parameters in the models: papers by Arno Siebes (CWI), Kacha Dzhaparidze and Peter Spreij (University of Amsterdam), József Hornyák and László Monostori (SZTAKI)

• how to price options or evaluate financial risks: papers by Jiri Hoogland and Dimitri Neumann (CWI), László Gerencsér (SZTAKI), Michiel Bertsch (CNR), Gerhard Paaß (GMD), Valeria Skrivankova (SRCIM), Denis Talay (INRIA)

• efficient methods of numerical resolution and software: papers by David Sayers (NAG Ltd), Claude Martini (INRIA) and Antonino Zanette (University of Trieste), Mireille Bossy (INRIA), Arie van Deursen (CWI), László Monostori (SZTAKI). Several of these papers mention results obtained jointly by researchers working in different ERCIM institutes.

■Please contact:

Denis Talay – INRIA
Tel: +33 4 92 38 78 98
E-mail: [email protected]

CONTENTS
The Need for Financial Models by Björn Palmgren 8

Mining Financial Time Series by Arno Siebes 9

Statistical Methods for Financial and other Dynamical Stochastic Models by Kacha Dzhaparidze and Peter Spreij 9

Genetic Programming for Feature Extraction in Financial Forecasting by József Hornyák and László Monostori 10

Taming Risks: Financial Models and Numerics by Jiri Hoogland and Dimitri Neumann 11

Stochastic Systems in Financial Mathematics – Research activities at SZTAKI by László Gerencsér 13

Understanding Mortgage-backed Securities by Michiel Bertsch 14

ShowRisk – Prediction of Credit Risk by Gerhard Paaß 15

Stochastic Methods in Finance: Evaluating Predictions by Valeria Skrivankova 16

Model Risk Analysis for Discount Bond Options by Denis Talay 17

Numerical Algorithms Group by David Sayers 17

Premia: An Option Pricing Project by Claude Martini and Antonino Zanette 19

Life Insurance Contract Simulations by Mireille Bossy 20

Using a Domain-Specific Language for Financial Engineering by Arie van Deursen 21

Subsymbolic and Hybrid Artificial Intelligence Techniques in Financial Engineering by László Monostori 22



The Need for Financial Models
by Björn Palmgren

Against a background in insurance and finance, and with my present experience in supervision of the financial sector, I would like to give an overview of and some reflections on the role of mathematics and statistics in finance. The emphasis will be on the need for models and a discussion of what may make models useful. There are other important areas, such as secure handling of information and related questions covered by the field of cryptography and protocols, which will be left out here.

Cash flows

One way to understand the need for financial models is to look at what the financial sector deals with. What we as customers see are the products and services offered by banks, securities firms and insurance companies. The financial institutions receive our deposits, savings and insurance premiums, and offer management of investments, loans, insurance cover and pensions. In a more abstract description, we could say that these institutions handle cash flows in and out. What is more important is that some of these cash flows may be uncertain at a given moment in time. Certain cash flows may be of a size that cannot be predicted with certainty, such as the yield on bonds or equity. In particular, some future cash flows may turn out to be nil or non-existent, due to the default of those who should provide the cash flow, or because the conditions for payment are not satisfied, eg in insurance when no damage covered by the insurance contract occurs.

Uncertainty and stability

It is the duty of a financial institution to find a balance, or at least an acceptable level of imbalance, between the cash flows that it manages. This balance is a condition for the fulfilment of liabilities to customers, and the corresponding goal of stability of the financial sector motivates special legislation for the sector and a system of authorisation, monitoring and supervision. It is the uncertainty about this balance, subject to financial and operational risk, that is one of the motivations for an increasing interest in financial models useful for achieving this balance or stability. Talking of risk, it is worth mentioning the other side of the coin, opportunity. Opportunity is another good reason for trying to understand financial processes using financial models, at least as a complement to everything else that is of value for success in the financial sector: information, knowledge and competence in the field.

Having identified uncertainty as a characteristic feature of financial activity, we turn next to aspects of managing it. Here it would seem reasonable to make some distinction between methods, tools and models, although they are quite intertwined. For the moment, however, we will make no particular effort to keep these aspects apart. Instead we will look more closely at the types of uncertainty or risk that may occur and put them into a wider context, in order to be able to say something non-trivial about the usefulness of and need for financial models.


It is important to bear in mind that the practical use of models should be judged with reference to some decision situation or context. Such a context necessarily depends on some horizon or period within which decisions have to be made. This aspect of horizon has consequences for the choice of model for describing the uncertainty or risk. Many processes in industry call for reactions or decisions in real time, or at least with a relatively short horizon for decisions or monitoring. Similar processes do occur in certain financial markets, such as different kinds of trading activities. Most other financial activities work, however, with considerably longer horizons, ranging from days and weeks to months and years. With a longer horizon and less frequent data, it may be problematic to use models that were designed to handle continuous or high-frequency processes, mainly because the underlying reality will be too unstable or inhomogeneous to fit such a model. This highlights another aspect of the use of models. Will they be used for predictions, or rather for descriptions of experience or projections of assumptions made about the future? For processes in real time there is a need for models with predictive power for at least the very near future. There is also a need for financial models in situations where there is little hope of safe prediction, for several reasons. The process modelled may be poorly understood or just intrinsically inhomogeneous. The process may depend on unpredictable market behaviour or external events, resisting any attempt to find a truthful model.

For this reason it is important to realise that many, if not most, financial models cannot be used as sharp predictive instruments. There are, however, a number of other respectable uses of financial models. These include projections of assumptions made; assessment of possible uncertainty, risk or opportunity, including different kinds of sensitivity analysis; and calculation of buffers or margins that may be needed to compensate for adverse developments, ie when things do not go your way. Such approaches are important for defining regulatory minimum capital requirements and for capital allocation and performance measurement.

Some models and methods

With the background given, I would finally like to mention some concrete approaches that seem fruitful for further research. A general reference that gives a critical overview of a part of this vast field is 'Risk Management and Analysis, Vol. 1', edited by Carol Alexander, Wiley 1998.

It is a general experience that a deep understanding of the phenomenon to be modelled is the best starting point. Models with elements of market behaviour satisfy this requirement to a certain extent. The assumption of no arbitrage has been fruitful for the area of stochastic financial calculus, including models for derivative instruments. These models are used in pricing and are put to the test there.




Still, actual behaviour may differ from theoretical assumption. In fields such as credit or counterparty risk there seems to be room for more analysis. First, there is a need to link default risk to properties of the debtor. Much has been done in credit scoring, where the law of large numbers seems to be working, but there are several areas where default is relatively scarce or comes in batches. There is a need to sort out risk-determining factors and find more frequent proxies for default. Given sufficient and relevant data, this is an area for statistical analysis, including cluster analysis and various kinds of structure-finding methods. There are connections with non-life insurance, which faces similar problems in pricing insurance risk, but usually with more statistics available. The increasing capacity of computers makes certain methods or approaches more practical than before. One example is methods based on the Bayesian approach that can be combined with empirical data rather than subjective a priori information. Here we have, eg, credibility methods in insurance and the area of stochastic simulation for Bayesian inference, known as the Markov chain Monte Carlo approach.
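As a toy illustration of the Bayesian approach just mentioned, the sketch below updates a Beta prior for a portfolio default rate with observed default counts. In this conjugate case the update is exact, so no Markov chain Monte Carlo is needed; all figures are invented for illustration.

```python
# Beta-Bernoulli update for a default rate: the posterior mean is a
# credibility-weighted blend of the prior default rate and the data.
def posterior_default_rate(defaults, loans, prior_a=1.0, prior_b=99.0):
    """Posterior mean of the default rate under a Beta(a, b) prior.

    The prior mean a / (a + b) plays the role of the a priori default
    rate (1% with the defaults shown here).
    """
    a = prior_a + defaults
    b = prior_b + loans - defaults
    return a / (a + b)

# With a 1% prior, observing 5 defaults in 200 loans pulls the estimate
# part of the way towards the empirical rate of 2.5%.
rate = posterior_default_rate(defaults=5, loans=200)
```

The same weighting idea underlies the credibility methods from insurance mentioned in the text; MCMC becomes necessary only when the model loses this conjugate structure.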

Models describing inhomogeneous processes, especially rare or catastrophic events, are of interest, although there are limits to what can be said in such cases. Information is scarce and it may take a very long time to evaluate whether decisions based on the models were correct. Extreme value theory can be explored further, but perhaps best within the framework of sensitivity testing rather than prediction.

When measuring the total risk exposure of a financial entity, it is clear that models should reflect various kinds of dependencies. Such dependencies occur between consecutive periods of time and between various types of activities. Models incorporating dynamic control mechanisms can explain some of the dependencies over time. In a more descriptive approach, there seems to be further work to be done in finding and describing correlation between asset types and, in the case of insurance, correlation between types of business. One area where such interactions are studied is that of asset-liability models, where there is interaction between the two sides of the balance sheet. Further development of and experience with such models can be expected.

■Please contact:

Björn Palmgren – Chief Actuary
Finansinspektionen, the Financial Supervisory Authority of Sweden, and a member of the Data Security project at SICS
Tel: +46 8 787 80 00
E-mail: [email protected]

Mining Financial Time Series
by Arno Siebes

A lot of financial data is in the form of time series, eg, the tick data from stock markets. Interesting patterns mined from such data could be used for, eg, cleaning the data or spotting possible market opportunities.

Mining time-series data is, however, not trivial. Simply seeing each individual time series as a (large) record in a table presupposes that all series have the same length and sampling frequency. Moreover, straightforward application of standard mining algorithms to such tables means that one forgets the time structure in the series. To overcome these problems, one can work with a fixed set of characteristics derived from each individual time series. These characteristics should be such that they preserve similarity of time series. That is, time series that are similar should have similar characteristics and vice versa. If such a set of characteristics can be found, the mining can be done on these characteristics rather than on the original time series.

A confounding factor in defining such characteristics is that similarity of time series is not a well-defined criterion. In the Dutch HPCN project IMPACT, in which CWI participates, we take similarity to mean similarity to the human eye, and we use wavelet analysis to define and compute the characteristics. One of the attractive features of this approach is that different characterisations capture different aspects of similarity. For example, Hölder exponents capture roughness at a pre-defined scale, whereas a Haar representation focuses on local slope.
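As a minimal illustration of the Haar representation (a generic sketch, not code from the IMPACT project): one level of the Haar transform splits a series into pairwise averages and pairwise differences, and a large difference coefficient signals a sudden local change.

```python
# One level of the (unnormalised) Haar wavelet transform for an
# even-length series: pairwise means (trend) and half-differences
# (local slope / detail).
def haar_step(series):
    avgs = [(series[i] + series[i + 1]) / 2
            for i in range(0, len(series), 2)]
    dets = [(series[i] - series[i + 1]) / 2
            for i in range(0, len(series), 2)]
    return avgs, dets

# An isolated spike in otherwise flat tick data stands out as one huge
# detail coefficient -- the idea behind Haar-based error filtering.
avgs, dets = haar_step([100.0, 100.2, 250.0, 100.1, 100.3, 100.2])
```

Applying the step recursively to the averages yields detail coefficients at coarser and coarser scales, which is what a full Haar representation uses.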

Currently, experiments are under way with the Dutch ABN AMRO bank to filter errors from on-line tick data. In the first stage, a Haar representation is used to identify spikes in the data. In the next stage, clustering on Hölder exponents and/or Haar representations will be used to identify smaller-scale errors.

■Please contact:

Arno Siebes – CWI
Tel: +31 20 592 4139
E-mail: [email protected]

Statistical Methods for Financial and other Dynamical Stochastic Models
by Kacha Dzhaparidze and Peter Spreij

The high capacity of present-day computers has enabled the use of complex stochastic models, because data on the system under study can be obtained in huge amounts and analyzed by simulation techniques or other numerical methods. For instance, at the stock exchanges, time and price are recorded for every single trade. Mathematical finance is an example of a field with a vigorous development of new models. The development of statistical methods for stochastic process models, however, lags behind, with the result that far too often statistical methods have been applied that, although relatively sophisticated, suffer from shortcomings



because they do not fully take into account and exploit the structure of the new models. Researchers at CWI aim to make a major contribution to the theory of statistical inference for stochastic processes.

The research is carried out in close collaboration with many researchers in The Netherlands and elsewhere in Europe. The theoretical work uses the methods of modern probability theory, including stochastic calculus. A more applied project objective is the statistical analysis and modelling of financial data such as stock prices, interest rates, exchange rates and prices of options and other derivative assets, and the development of more realistic models for these than those presently used in the financial industry. There are increasing demands (including new legislation) that banks and other financial institutions improve the management of the risk arising from holding positions in securities. This will require the use of more realistic and sophisticated mathematical models as well as improved statistical procedures to evaluate prices of financial assets.

Mathematical finance is an example of a field where data analysis is, in practice, very often done by means of traditional discrete-time models, whereas most of the models used for pricing derivative assets are continuous-time models. Continuous-time models have the additional advantage that they can be analysed by means of the powerful tools of stochastic calculus, so that results can often be obtained even for very complicated models. In many applications, however, one has to take into consideration that data are obtained at discrete time points, so inference methods for discretely observed continuous-time processes are to be applied. In recent years, statistical methods for discrete-time observations from diffusion-type processes have started to attract attention, and it appears that there are many challenging mathematical problems involved. A survey paper on this subject by Dzhaparidze, Spreij and Van Zanten will soon appear in Statistica Neerlandica.
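A generic sketch of this setting (not the CWI methods themselves): simulate a continuous-time model, here geometric Brownian motion, observed only at discrete time points, and estimate its volatility from those discrete observations. All parameter values are invented.

```python
# Discretely observed geometric Brownian motion and a simple
# volatility estimator based on the log returns of the observations.
import random
from math import exp, log, sqrt

def simulate_gbm(s0, mu, sigma, dt, n, rng):
    """n discrete observations of GBM, using the exact transition."""
    path = [s0]
    for _ in range(n):
        z = rng.gauss(0.0, 1.0)
        path.append(path[-1] * exp((mu - 0.5 * sigma ** 2) * dt
                                   + sigma * sqrt(dt) * z))
    return path

def realised_vol(path, dt):
    """Annualised volatility estimate from discrete log returns."""
    rets = [log(path[i + 1] / path[i]) for i in range(len(path) - 1)]
    mean = sum(rets) / len(rets)
    var = sum((r - mean) ** 2 for r in rets) / (len(rets) - 1)
    return sqrt(var / dt)

rng = random.Random(42)
path = simulate_gbm(s0=100.0, mu=0.05, sigma=0.2, dt=1 / 252, n=5000, rng=rng)
sigma_hat = realised_vol(path, dt=1 / 252)  # should be close to 0.2
```

For this simple model the estimator works well; the hard problems mentioned in the text arise for richer diffusion-type models, where the transition density is unknown and such naive estimators break down.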

Very often the complexity of the models in question prevents exact calculation of the statistical properties of the methods developed. An example is the calculation of the variances of estimators, which are often used to choose the most efficient member of a family of estimators. Computer simulations are then a useful tool, but it is important to have a mathematical theory with which simulation results can be compared. Asymptotic statistical theory can play this role, and is therefore an important research objective at CWI. In recent years Dzhaparidze and Spreij have published a number of papers on parameter estimation problems in a general context of semimartingales.

Asymptotic methods can also be used to approximate complex models by simpler ones for inferential purposes. Moreover, the theory of asymptotic equivalence of experiments will be used to simplify decision problems for complex stochastic models to those of Gaussian or Poisson models that approximate them in the deficiency distance. This method can also be applied to the approximation of discrete-time models by continuous-time models. Certain rudimentary ideas and facts on the relationship between these models have been reported by Dzhaparidze in a series of three papers in CWI Quarterly. These papers gave rise to a textbook on option valuation, recently completed and intended for publication at CWI.

The research described above will be further developed in close collaboration with research teams in, eg, Paris, Berlin, Copenhagen, Freiburg, Helsinki and Padova. Most of these teams have been involved in the HCM research programme 'Statistical Inference for Stochastic Processes'. Contacts between the members of these teams are maintained or reinforced at annual workshops, most recently in Munzingen (Freiburg). The collaboration with E. Valkeila (Helsinki), in particular, has proved quite fruitful. A number of joint papers on general parametric families of statistical experiments have been published, and others are scheduled for this year.

■Please contact:

Kacha Dzhaparidze – CWI
Tel: +31 20 592 4089
E-mail: [email protected]

Genetic Programming for Feature Extraction in Financial Forecasting
by József Hornyák and László Monostori

Artificial neural networks (ANNs) have received great attention in the past few years because they were able to solve several difficult problems with complex, irrelevant, noisy or partial information, as well as problems which were hardly manageable in other ways. The usual inputs of ANNs are the time series themselves or their simple descendants, such as differences, moving averages or standard deviations. The applicability of genetic programming for feature extraction is being investigated at SZTAKI, as part of a PhD project.

During the training phase, ANNs try to learn associations between the inputs and the expected outputs. Although back-propagation (BP) ANNs are appropriate for non-linear mapping, they cannot easily realise certain mathematical relationships. On the one hand, appropriate feature extraction techniques can simplify the mapping task; on the other hand, they can enhance the speed and effectiveness of learning. On the basis of previous experience, the user usually defines a large number of features, and automatic feature selection methods (eg based on statistical measures) are applied to reduce the feature set. A different technique for feature creation is the genetic programming (GP) approach. Genetic programming provides a way to search the space of all possible functions composed of certain terminals and primitive functions to find a function that satisfies the initial conditions.

The measurement of the goodness of individual features or feature sets plays




a significant role in all kinds of feature extraction techniques. Methods can be distinguished by whether the learning/classification/estimation phases are incorporated in the feature extraction method (filter and wrapper approaches).

In fact, most of the financial technical indicators (Average True Range, Chaikin

Oscillator, Demand Index, Directional Movement Index, Relative Strength Index, etc.) are features of time series in a certain sense. Feature extraction can lead to similar indicators. An interesting question is, however, whether such an approach can create new, better indicators.
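As a concrete example of such an indicator-as-feature, here is a simple-average variant of the Relative Strength Index, one of the indicators named above. This is a generic illustration, not the implementation used in the project.

```python
# Relative Strength Index over the last `period` price changes.
# This is the simple-average variant, not Wilder's smoothed version.
def rsi(closes, period=14):
    """Returns a value in [0, 100]: the share of recent price movement
    that was upward, rescaled as 100 * gains / (gains + losses)."""
    changes = [closes[i + 1] - closes[i] for i in range(len(closes) - 1)]
    recent = changes[-period:]
    gains = sum(c for c in recent if c > 0)
    losses = sum(-c for c in recent if c < 0)
    if losses == 0:
        return 100.0
    return 100.0 - 100.0 / (1.0 + gains / losses)

# A steadily rising series scores 100; a steadily falling one scores 0.
value = rsi([float(x) for x in range(1, 21)])
```

A GP system searching over terminals (prices, volumes) and primitive functions (differences, sums, ratios) can in principle rediscover formulas of exactly this shape, which is why such indicators are a natural benchmark for GP-created features.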

The techniques were demonstrated and compared on the problem of predicting the direction of changes in the next week's average of daily closes for the S&P 500 Index. The fundamental data were the daily S&P 500 High, Low and Close Indices, Dow Jones Industrial Average, Dow Jones Transportation Average, Dow Jones 20 Bond Average, Dow Jones Utility Average and NYSE Total Volume from 1993 to 1996.

Three ANN-based forecasting models have been compared. The first one used ANNs trained on historical data and their simple descendants. The second was trained on historical data and technical indicators, while the third model used new features extracted by GP as well. Plain ANN models did not provide the necessary generalization power. The examined financial indicators showed interclass distance measure (ICDM) values better than those of raw data and enhanced the performance of ANN-based forecasting. By using GP, much better inputs for ANNs could be created, improving their learning and generalization abilities.

Nevertheless, further work on the forecasting models is planned, for example:

• extension of the functions and terminals for GP

• direct application of GP to the extraction of investment decisions

• committee forecasts, where several different forecasting systems work on the same problem and their forecasts are merged.

This project is partially supported by the Scientific Research Fund OTKA, Hungary, Grant No. T023650.

■Please contact:

László Monostori – SZTAKI
Tel: +36 1 466 5644
E-mail: [email protected]

Taming Risks: Financial Models and Numerics
by Jiri Hoogland and Dimitri Neumann

The increasing complexity of the financial world and the speed at which markets respond to world events require both good models for the dynamics of the financial markets and proper means to use these models at the high speed required in present-day trading and risk management. Research at CWI focuses on the development of models for high-frequency data and applications in option pricing, and on tools to allow fast evaluation of the complex simulations required for option pricing and risk management.

The modeling of equity price movements started as early as 1900 with the work of Bachelier, who modeled asset prices as Brownian motion. The seminal papers by Merton, Black, and Scholes, in which they derived option prices on assets modeled as geometric Brownian motions, spurred the enormous growth of the financial industry, with a wide variety of (very) complex financial instruments, such as options and swaps. These derivatives can be used to fine-tune the balance between risk and profit in portfolios. Wrong use of them may lead to large losses. This is where risk management comes in. It quantifies potentially hazardous positions in outstanding contracts over some time horizon.

Option pricing requires complex mathematics. It is of utmost importance to try to simplify and clarify the fundamental concepts and mathematics required, as this may eventually lead to simpler, less error-prone, and faster computations. We have derived a new formulation of the option-pricing theory of Merton, Black, and Scholes, which leads to simpler formulae and potentially better numerical algorithms.



Figure: ANN-based forecasting of stock prices.




Brownian motion is widely used to model asset prices. High-frequency data clearly show a deviation from Brownian motion, especially in the tails of the distributions. Large price jumps occur in practice more often than in a Brownian-motion world. Thus big losses also occur more frequently. It is therefore important

to take this into account by more accurate modeling of asset-price movements. This leads to power laws, Lévy distributions, etc.
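A small simulation (with invented sample sizes and seed, not drawn from the CWI work) illustrates the point: returns from a heavy-tailed Student-t distribution with 3 degrees of freedom produce far more large moves than Gaussian returns.

```python
# Count moves beyond 4 standard units under a Gaussian model and under
# a heavy-tailed Student-t model with 3 degrees of freedom.
import random
from math import sqrt

def student_t(df, rng):
    """Sample Student-t via a normal / chi-square ratio."""
    z = rng.gauss(0.0, 1.0)
    chi2 = sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(df))
    return z / sqrt(chi2 / df)

rng = random.Random(1)
n = 20000
normal_tails = sum(abs(rng.gauss(0.0, 1.0)) > 4 for _ in range(n))
t_tails = sum(abs(student_t(3, rng)) > 4 for _ in range(n))
# For a standard normal, P(|Z| > 4) is only about 6e-5, so normal_tails
# is typically 0-3; the t(3) count is orders of magnitude larger.
```

Under a Gaussian model such moves would be dismissed as near-impossible, which is exactly why heavy-tailed models matter for loss estimates.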

Apart from options on financial instruments like stocks, there exist options on physical objects. Examples are options to buy real estate, options to exploit an oil well within a certain period of time, or options to buy electricity. Like ordinary options, these options should have a price. However, the writer of such an option (the one who receives the money) usually cannot hedge his risk sufficiently. The market is incomplete, in contrast with the assumptions of the Black-Scholes model. In order to attach a price to such an option, it is necessary to quantify the residual risk to the writer. Both parties can then negotiate how much money should be paid to compensate for this risk. We explore ways to partially hedge in incomplete markets.

A relatively new phenomenon in the financial market has been the introduction of credit risk derivatives. These are instruments which can be used to hedge against the risk of default of a debtor. It is obvious that this kind of risk requires a different modeling approach. The effect of the default of a firm is a sudden jump in the value of the firm and its liabilities, and should be described by a jump process (for example, a Poisson process). In practice, it is difficult to estimate the chance of default of a firm, given the information which is available. For larger firms, creditworthiness is assessed by rating agencies like Standard and Poor's. We are looking at methods to estimate and model the default risk of groups of smaller firms, using limited information.

The mathematics underlying financial derivatives has become quite formidable. Sometimes prices and related properties of options can be computed using analytical techniques; often one has to rely on numerical schemes to find approximations. This has to be done very fast. The efficient evaluation of option prices, greeks, and portfolio risk management is very important.

Many options depend on the prices of several assets. Often they allow the owner to exercise at any moment up to the maturity of the (so-called) American-style option. The computation of prices of these options is very difficult. Analytically it seems to be impossible. Numerically, too, they are a tough nut to crack. For more than three underlying assets it becomes very hard to use tree or PDE methods. In that case Monte Carlo methods may provide a solution. The catch is that this is not easily done for American-style options. We are constructing methods which indirectly estimate American-style option prices on multiple assets using Monte Carlo techniques.

Monte Carlo methods are very versatile, as their performance is independent of the number of underlying dynamic variables. They can be compared to gambling with dice in a casino many, many times, hence the name. Even if the number of assets becomes large, the amount of time required to compute the price stays approximately the same. Still, the financial industry demands more speedy solutions, ie faster simulation methods. A potential candidate is the so-called Quasi-Monte Carlo method. The name stems from the fact that one gambles with hindsight (prepared dice), hence the 'Quasi'. It promises a much faster computation of the option price. The problems one has to tackle are the generation of the required quasi-random variates (the dice) and the computation of the numerical error made. We try to find methods to devise optimal quasi-random number generators. Furthermore, we look for simple rules of thumb which allow for the proper use of Quasi-Monte Carlo methods.
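The contrast between the two methods can be sketched in a few lines: pricing a European call under Black-Scholes dynamics, once with pseudo-random draws and once with a base-2 Van der Corput low-discrepancy sequence (the simplest "prepared dice"). This is a generic illustration with invented parameters, not the CWI methods.

```python
# Plain Monte Carlo vs quasi-Monte Carlo pricing of a European call.
import random
from math import exp, sqrt
from statistics import NormalDist

def van_der_corput(n, base=2):
    """n-th element of the base-`base` Van der Corput sequence in (0, 1)."""
    q, bk = 0.0, 1.0 / base
    while n > 0:
        n, rem = divmod(n, base)
        q += rem * bk
        bk /= base
    return q

def call_price(uniforms, s0=100.0, k=100.0, r=0.05, sigma=0.2, t=1.0):
    """Discounted average call payoff over the supplied uniform variates."""
    inv = NormalDist().inv_cdf
    total = 0.0
    for u in uniforms:
        z = inv(u)  # map uniform draw to a standard normal variate
        st = s0 * exp((r - 0.5 * sigma ** 2) * t + sigma * sqrt(t) * z)
        total += max(st - k, 0.0)
    return exp(-r * t) * total / len(uniforms)

rng = random.Random(0)
mc = call_price([rng.random() for _ in range(20000)])
qmc = call_price([van_der_corput(i) for i in range(1, 20001)])
# both estimates land near the Black-Scholes value of about 10.45
```

In one dimension the low-discrepancy points fill the unit interval far more evenly than pseudo-random draws, which is the source of the faster convergence; the practical difficulties mentioned in the text appear in high dimensions and in bounding the error.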

For more information see http://dbs.cwi.nl:8080/cwwwi/owa/cwwwi.print_themes?ID=15

■Please contact:

Jiri Hoogland or Dimitri Neumann – CWI
Tel: +31 20 592 4102
E-mail: {jiri, neumann}@cwi.nl

Options were traded at the Beurs in Amsterdam (building by Hendrick de Keyser) already in the early 17th century.




Stochastic Systems in Financial Mathematics – Research activities at SZTAKI
by László Gerencsér

Financial mathematics and mathematical methods in economics have attracted a lot of attention within academia in Hungary in recent years. The potential of this new area has also been recognized at SZTAKI: an inter-laboratory virtual research group has been established under the name 'Financial Mathematics and Management'. The participating laboratories are the Laboratory of Applied Mathematics, the Laboratory of Operations Research and Decision Systems, and the Laboratory of Engineering and Management Intelligence. The participants have committed themselves to carrying out research, among other things, in the areas of option pricing, economic time series and portfolio analysis. This article gives a short overview of the activity of the Stochastic Systems Research Group, the Laboratory of Applied Mathematics and the Laboratory of Operations Research and Decision Systems in the stochastic aspects of financial mathematics.

Our activity in this area started with my discussions with Tomas Björk (Department of Finance, Stockholm School of Economics) and Andrea Gombani (CNR/LADSEB) in the summer of 1996, while visiting Lorenzo Finesso at CNR/LADSEB. A prime theme of these discussions was financial mathematics, which has attracted many people working in stochastic analysis, both in Europe and the USA, in recent years. To put our specialized skills to use, a formal procedure was initiated at SZTAKI to establish a project in financial mathematics. The initiative was accepted, and the inter-laboratory virtual research group 'Financial Mathematics and Management' was established.

Our research efforts, in the stochastic aspects, are focused on market incompleteness due to uncertainties such as poor volatility estimates in modeling the stock processes. Under too much modeling uncertainty the market is incomplete, and replicating a contingent claim requires a non-self-financing portfolio. We have analyzed the path-wise add-on cost and used it in formulating a stochastic programming problem which yields a performance index for any given price on which the seller and buyer agree. This approach has been motivated by my earlier research with Jorma Rissanen in the area of stochastic complexity, on the interaction of statistical uncertainty and performance. The method is a result of my joint work with György Michaletzky, head of department at the Eötvös Loránd University (ELTE), Budapest, a part-time researcher at SZTAKI and an international authority on stochastic realization theory, and with Miklós Rásonyi, the youngest member of the Stochastic Systems Research Group. To get a data-driven procedure we also consider the analysis of financial data using on-line statistical analysis, including adaptive prediction and change-detection. Zsuzsanna Vágó, member of the Laboratory of Operations Research and Decision Systems, has obtained a János Bolyai research scholarship for three years to study these problems.

In addition to research, we have started an educational program. First, we set up a one-semester course on derivative pricing. An adequate place for this course was the Department of Probability Theory and Statistics at the Eötvös Loránd University, headed by György Michaletzky.

A major thrust to our educational activity was a thrilling one-week minicourse, 14-20 September 1998, held by Tomas Björk, with the title ‘Arbitrage pricing of derivative financial securities’. The course attracted some 30 enthusiastic participants from industry and academia. Taking advantage of this visit, we restructured our educational program and now have a two-semester course, including more material on interest rate theory.

We are looking forward to our next minicourse in financial mathematics, to be held next September, with the title ‘Optimal Portfolios - Risk and Return Control’, given by Ralf Korn, Department of Mathematics, University of Kaiserslautern.

We have been co-operating with Manfred Deistler, Technical University of Vienna, in the area of time-series analysis, especially with respect to co-integration. A joint project with Youri Kabanov, Department of Mathematics at the Université de Franche-Comté, Besançon, France, on problems of option pricing and hedging under transaction costs is just under way. We also see risk-sensitive control, an area that has been significantly enriched by Jan van Schuppen, CWI, as a potentially useful tool for portfolio design and an area for further cooperation. We are looking forward to developing a co-operative project with the group on the research theme ‘Mathematics of Finance’ at CWI, headed by Hans Schumacher.

■Please contact:

László Gerencsér – SZTAKI
Tel: +36 1 4665 644
E-mail: [email protected]




Understanding Mortgage-backed Securities by Michiel Bertsch

A research project on the mathematical modeling of fixed income markets has recently begun at the CNR institute for applied mathematics in Rome (Istituto per le Applicazioni del Calcolo – IAC-CNR). The aim is to combine ‘real world problems’ with high quality research in mathematical finance, in order to obtain a better and more efficient understanding of the correct pricing of complicated fixed income products such as mortgage-backed securities. The project is intrinsically interdisciplinary, and uses techniques varying from the statistical analysis of financial data to the development of basic models and their numerical simulation.

IAC has started a project on financial mathematics in collaboration with INA SIM S.p.A. (INA is a major insurance company in Italy). The aim of the project is both to study existing mathematical and statistical models for the correct pricing of fixed income financial products, and to develop new ones. In the early stage we focus on the one hand on the analysis of the relevant statistical data and, on the other, on the study of existing advanced models in the academic literature. In a second stage, these two activities are intended to ‘meet’ in order to develop accurate models for the pricing of complicated financial products and their numerical implementation.

A particular example of such products are the so-called mortgage-backed securities (MBS’s). Roughly speaking, the US fixed income market is divided into three areas: treasury bills, corporate bonds and MBS’s, but nowadays the latter area is the biggest one. MBS’s are liquid and they are securitized against default risk. Their only disadvantage is the prepayment risk, and it is exactly this point which makes MBS’s difficult to price and creates a challenge for financial modelers. Someone with a mortgage usually does not optimize the moment at which he exercises the prepayment option of the mortgage, and even pooling several mortgages together does not average out this effect. In the academic literature only very few advanced pricing models have been proposed; however, after more than 30 years of experience, the US market is a source of considerable data. This means that the necessary ingredients are present to improve the methods of quantitative analysis of MBS’s. In this context, we observe that quantitative analysis becomes a particularly powerful tool in the case of new emerging markets, in which even aggressive traders may lack the necessary experience to be as efficient as usual. In the future, in the new European context, MBS’s could very well form such an emerging market.
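To make the prepayment effect concrete, the sketch below prices a simplified mortgage pool under the textbook constant-prepayment-rate (CPR) convention. This is a deliberately naive stand-in for the advanced models discussed above, and all figures are illustrative.

```python
def mbs_price(principal, wac, n_months, cpr, discount_rate):
    """Price of a mortgage pool under a constant-prepayment-rate assumption.

    wac: weighted-average coupon (annual); cpr: annual prepayment rate.
    Each month the pool pays scheduled interest and principal plus an
    extra prepayment of a fixed fraction of the surviving balance."""
    smm = 1 - (1 - cpr) ** (1 / 12)   # single monthly mortality
    r = wac / 12                      # monthly mortgage rate
    d = discount_rate / 12            # monthly discount rate
    balance = principal
    pv = 0.0
    for m in range(1, n_months + 1):
        remaining = n_months - m + 1
        # level payment recomputed on the surviving balance (annuity formula)
        payment = balance * r / (1 - (1 + r) ** -remaining)
        interest = balance * r
        sched_prin = payment - interest
        prepay = (balance - sched_prin) * smm
        balance -= sched_prin + prepay
        pv += (interest + sched_prin + prepay) / (1 + d) ** m
    return pv
```

The real difficulty, as the article notes, is that the prepayment rate is not constant at all: it reacts to interest rates and borrower behaviour, which is exactly what makes MBS’s hard to price.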

A closing remark regards the dramatic problem of the almost complete absence of good research in applied mathematics in Italian industry. The project on MBS’s is attracting first rate students and postdocs. Some of them will become academic researchers, but I am convinced that others will find jobs in Italian financial institutions. Having researchers with a PhD degree in mathematics in strategic positions in private companies would be an important step towards further high-quality collaboration with Italian industry.

■Please contact:

Michiel Bertsch – University of Rome ‘Tor Vergata’ and CNR-IAC
Tel: +39 06 440 2627
E-mail: [email protected]

ShowRisk – Prediction of Credit Risk by Gerhard Paaß

The growing number of insolvencies as well as intensified international competition calls for reliable procedures to evaluate the credit risk (risk of insolvency) of bank loans. GMD has developed a methodology that improves on current approaches in a decisive aspect: the explicit characterization of the predictive uncertainty for each new case. The resulting procedure does not merely deliver a single number as its result; it also describes the uncertainty of this number.

Credit scoring procedures use a representative sample to estimate the credit risk, the probability that a borrower will not repay the credit. If all borrowers had the same features, the credit risk could be estimated directly from the observed frequency of insolvencies; the uncertainty of this estimate is reduced as the number of sample elements grows. In the general case, complex models (eg neural networks or classification trees) are required to capture the relation between the features of the borrowers and the credit risk. Most current procedures are not capable of estimating the uncertainty of the predicted credit risk.

Prediction with Plausible Models

We employ Bayesian theory to generate a representative selection of models describing the uncertainty. For each model a prediction is performed, which yields a distribution of plausible predictions. As each model represents another possible relation between inputs and outputs, all these possibilities are taken into account in the joint prediction.

A theoretical derivation shows that the average of these plausible predictions in general has a lower error than a single ‘optimal’ prediction. This was confirmed by an empirical investigation: for a real database of several thousand enterprises with more than 70 balance sheet variables, the GMD procedure rejected only 35.5% of the ‘good’ loans, whereas other methods (neural networks, fuzzy pattern classification, etc) rejected at least 40%.
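The averaging argument can be checked numerically. In the toy sketch below (invented numbers, standing in for a real posterior sample of models), the squared error of the averaged prediction never exceeds the average squared error of the individual plausible models — a direct consequence of Jensen’s inequality.

```python
import random
random.seed(42)

def true_risk(x):
    """Hypothetical 'true' relation between a feature and the default risk."""
    return min(max(0.05 + 0.4 * x, 0.0), 1.0)

# A set of plausible models: perturbed coefficients, standing in for
# draws from a Bayesian posterior over model parameters.
models = [(0.05 + random.gauss(0, 0.03), 0.4 + random.gauss(0, 0.1))
          for _ in range(50)]

def predict(model, x):
    a, b = model
    return min(max(a + b * x, 0.0), 1.0)

xs = [i / 20 for i in range(21)]
avg_sq_err_of_mean = 0.0   # error of the averaged prediction
mean_sq_err = 0.0          # average error of the individual models
for x in xs:
    preds = [predict(m, x) for m in models]
    mean_pred = sum(preds) / len(preds)
    avg_sq_err_of_mean += (mean_pred - true_risk(x)) ** 2
    mean_sq_err += sum((p - true_risk(x)) ** 2 for p in preds) / len(preds)
```

The spread of the individual predictions around their mean is exactly the predictive uncertainty that the GMD procedure reports alongside the point estimate.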

Expected Profit as Criterion

The criterion for accepting a credit is a loss function specifying the gain or loss in the case of solvency/insolvency. Using the predicted credit risk we may estimate the average or expected profit. According to statistical decision theory, a credit application should be accepted if this expected profit is positive. Depending on the credit conditions (interest rate, securities) this defines a decision threshold for the expected profit.
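In code, this decision rule is a one-liner once the predictive distribution is available. The sketch below uses hypothetical gains, losses and model predictions; the function names are invented for the example.

```python
def expected_profit(p_default, gain_if_repaid, loss_if_default):
    """Expected profit of granting a loan, given a predicted default probability."""
    return (1 - p_default) * gain_if_repaid - p_default * loss_if_default

def decide(p_samples, gain_if_repaid, loss_if_default):
    """Accept when the profit, averaged over all plausible model predictions,
    is positive -- the decision threshold of statistical decision theory."""
    profits = [expected_profit(p, gain_if_repaid, loss_if_default)
               for p in p_samples]
    return "accept" if sum(profits) / len(profits) > 0 else "reject"
```

The implied threshold on the credit risk itself is gain/(gain + loss): with a gain of 10 and a loss of 100, applications whose average predicted default probability exceeds about 9% are rejected.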

In figures 2 and 3 the decision threshold for a given credit condition is depicted: if the predicted average credit risk is above the threshold, a loss is to be expected on average and the loan application should be rejected. Figure 2 shows a predictive distribution where the expected credit risk is low. The uncertainty about the credit risk is low, too, and the loan application could be accepted without further investigation. The expected credit risk of the predictive distribution in figure 3 is close to the decision threshold.

The actual credit risk could be located in the favourable region, the intermediate range or the adverse region. The information in the training data is not sufficient to assign the credit risk to one of these regions. Obviously the database contains too few similar cases for this prediction, resulting in an uncertain prediction. Therefore in this case there is a large chance that additional information, especially a closer audit of the customer, will yield a favorable credit risk.


Under a contract, the credit scoring procedure was adapted to the data of the German banking group Deutscher Sparkassen- und Giroverband and is currently in a test phase. For each new application it is possible to modify the credit conditions (interest rate, securities) to find the conditions under which the credit will on average yield a profit. The computing time for a prediction is about a second. Currently an explanation module is being developed which will explain to the customer and the bank officer, in terms of plausible rules and concepts, why the procedure generated a specific prediction.

■Please contact:

Gerhard Paaß – GMD
Tel: +49 2241 14 2698
E-mail: [email protected]

Figure 2: Distribution of credit risk with low expected credit risk and low uncertainty.

Figure 3: Distribution of credit risk with medium expected credit risk and high uncertainty.

Figure 1: Steps of a prognosis.





Stochastic Methods in Finance: Evaluating Predictions by Valeria Skrivankova

Stochastic methods in finance are mainly connected with risky financial operations, for example security market trading. Relevant decisions are affected by a prediction of some quantity, but an adequate judgment on the future fulfilment of the expectation is often a difficult problem. Common methods for the evaluation of judgments are based on long term observations. The presented method of evaluation, called Reflexive Evaluation Of Predictive Expertises (REOPE), is also applicable to unrepeatable expertises.

Financial market models are based on the premise that investors like return and dislike risk. So financial management wants to maximize the return and minimize the risk. For this purpose it is necessary to have the best forecast of expected return and risk. The definition of risk used in the classical Markowitz ‘Mean-Variance’ model for an effective portfolio is a measure of the variability of return called the standard deviation of return. So the main task is to predict (estimate) the expected return and the standard deviation.

What Forecasting Can and Cannot Do

One should not expect any forecasting theory or technique to predict the precise value at which a future price will settle tomorrow, or on any given day, or what the exact high or low will be. A good forecasting method will on average have a small forecast error; that is, the difference between the forecast price and the actual market price will be small. Further, the forecast must be unbiased, which means that the errors should overshoot the actual price as often and by as much as they undershoot it.

Measuring Talent by Common Methods

Talent can be differentiated from luck only by examining results averaged over many periods. Investors and management cannot afford to evaluate future performance, and the reasons for it, merely on the basis of a one-period forecast. They must consider such things as the expected contribution of a security analyst over time, how to estimate it and how to organize to make the most of it.

Formulation of the Problem for REOPE

Consider the predicted quantity as a random variable X (eg portfolio return). Suppose that the quality of a judgement of X is evaluated only according to the correspondence of the estimate with the subsequently realized value of X. Let t(X) be a relevant parameter of the distribution of X in the sense of the expert’s opinion. The problem of judgement evaluation is generally based on an evaluation function h = h(x, estim t), where x is the realized value of X and estim t is the expert’s estimate of t. The expert’s criterion of optimality is fulfilled if he gives an unbiased estimate of t. Suppose that estim t is fully determined by the expert’s effort to optimize his criterion C, which is connected only with the evaluation h(X, estim t) of his work. So we have to find the concrete evaluation function as the solution of a certain equation. The expert’s performance evaluation:

• optimizes the expert’s criterion of utility C if he delivers an unbiased judgement

• reflects only the correspondence between the single estimate of some parameter and the subsequently realized value of the predicted quantity

• motivates the expert to put a reasonable deal of his effort into the judgement.

Mean Value Judgements

Let X be the observed random variable and let E represent the mean value operator; the parameter t of the distribution of X is E(X). Here the expert’s criterion of optimality consists in the maximization of the mean value of his future evaluation. We search for a function h such that E[h(X, E(X))] is the maximum of E[h(X, estim t)]. We can show that the function h given by h(x, estim t) = a − b(estim t − x)², where a and x are real numbers and b is positive, fulfils this condition. The parameters a and b can be chosen by higher level management (management of expertises).
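The optimality of this quadratic evaluation function is easy to verify by simulation. In the sketch below (with invented distribution parameters), the truthful estimate of E(X) receives a higher average evaluation than any biased one, so a rational expert is motivated to report his honest expectation.

```python
import random

def score(x, estim, a=1.0, b=1.0):
    """Quadratic evaluation function h(x, estim t) = a - b (estim t - x)^2."""
    return a - b * (estim - x) ** 2

def mean_score(estim, samples, a=1.0, b=1.0):
    """Average evaluation an expert receives for reporting 'estim'."""
    return sum(score(x, estim, a, b) for x in samples) / len(samples)

random.seed(0)
# Simulated realizations of the predicted quantity (eg portfolio returns).
samples = [random.gauss(0.07, 0.02) for _ in range(20000)]
mean = sum(samples) / len(samples)   # the truthful estimate of E(X)
```

Any constant shift of the estimate lowers the average score by b times the squared shift, which is the sense in which h rewards unbiased judgements.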

Common methods for the evaluation of judgements are based on statistical analysis of the adequacy of past judgements. Ferguson (1975) uses simple regression methods which require long term observations. These models are not suitable for unrepeatable expertises. The presented method of evaluation is always applicable if the manager knows the expert’s criterion C and the expert knows the evaluation function h. This method reflects the expert’s success immediately and so motivates him to optimal performance in every judgement.

The given solution of the problem does not claim completeness. Probability distribution judgements and manager’s utility optimization were published by Skrivanek (1996) and Skrivankova (1998). Statistical regulation of estimates and hypothesis testing of their suitability are being studied.

■Please contact:

Valeria Skrivankova – SRCIM
Tel: +421 95 62 219 26
E-mail: [email protected]




Model Risk Analysis for Discount Bond Options by Denis Talay

Researchers of the Omega research group at INRIA Sophia Antipolis and of the University of Lausanne started in 1998 a study on model risk for discount bond options. This research is funded by the Swiss RiskLab institute. The aim of the project is to see how model risk affects the risk management of interest rate derivatives and how to manage this risk.

RiskLab is a Swiss inter-university research institute, concentrating on precompetitive, applied research in the general area of (integrated) risk management for finance and insurance. The institute, founded in 1994, is presently co-sponsored by the ETHZ, the Crédit Suisse Group, the Swiss Reinsurance Company and UBS AG. Several research projects are being funded by RiskLab. Among them is the project on model risk analysis for discount bond options proposed by researchers at the University of Lausanne (Rajna Gibson and François-Serge Lhabitant) and the Omega research group at INRIA Sophia Antipolis (Mireille Bossy, Nathalie Pistre, Denis Talay, Zheng Ziyu).

Model risk is an important question for financial institutions. Indeed, trading, hedging and managing strategies for their books of options are derived from stochastic models proposed in the literature to describe the evolution of the underlying assets. Of course these models are imperfect and, even if they were not, their parameters could not be estimated perfectly since, eg, market prices cannot be observed in continuous time. For discount bond options, additional misspecifications occur: for example, it seems difficult to discriminate between models and to calibrate them from historical data of the term structure. Thus a trader cannot make use of perfectly replicating strategies to hedge such options. The purpose of the study is to provide an analytical framework in which we formalize the model risk incurred by a financial institution which acts either as a market maker, posting bid and ask prices and replicating the instrument bought or sold, or as a trader who takes the market price as given and replicates the transaction until a terminal date (which does not necessarily extend until the maturity of his long or short position).

The first part of the study is to define the agent’s profit and loss due to model risk, given that he uses an incorrect model for his replicating strategy, and to analyse its distribution analytically (or numerically) at any time. This allows us to quantify model risk for path-independent as well as for path-dependent derivatives. The main contribution of the study is to decompose the profit and loss (P&L) into three distinct terms: the first representing a pricing degree of freedom arising at the strategy’s inception (date 0), the second representing the pricing error evaluated as of the current date t, and the final term defining the cumulative replication error, which is shown to be essentially determined by the agent’s erroneous ‘gamma’ multiplied by the squared deviation between the two specifications of the forward rate volatility curve segments. We furthermore derive the analytical properties of the P&L function for some simple forward rate volatility specifications, and finally conduct Monte Carlo simulations to illustrate and characterize the model error properties with respect to the moneyness, the time to maturity and the objective function chosen by the institution to evaluate the risk related to the wrong replicating model. A specific error analysis has been made for the numerical approximation of the quantiles of the P&L.
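The flavour of such P&L simulations can be conveyed on the simplest possible case: a stock option delta-hedged under Black-Scholes with a misspecified volatility (the study itself concerns the far harder term-structure setting). All parameters below are illustrative; the average loss comes from the replication error term driven by the erroneous gamma and the squared volatility misspecification.

```python
import math, random
from statistics import NormalDist

nd = NormalDist()

def bs_delta(s, k, sigma, tau):
    """Black-Scholes call delta (zero rates), used as the hedge ratio."""
    if tau <= 0:
        return 1.0 if s > k else 0.0
    d1 = (math.log(s / k) + 0.5 * sigma**2 * tau) / (sigma * math.sqrt(tau))
    return nd.cdf(d1)

def bs_price(s, k, sigma, tau):
    d1 = (math.log(s / k) + 0.5 * sigma**2 * tau) / (sigma * math.sqrt(tau))
    return s * nd.cdf(d1) - k * nd.cdf(d1 - sigma * math.sqrt(tau))

def hedge_pnl(sigma_true, sigma_model, s0=100.0, k=100.0, t=1.0, steps=100):
    """P&L of selling a call at the model price and delta-hedging with a
    (possibly wrong) model volatility, while the asset follows sigma_true."""
    dt = t / steps
    s = s0
    cash = bs_price(s0, k, sigma_model, t)      # premium received
    delta = bs_delta(s0, k, sigma_model, t)
    cash -= delta * s0                           # buy the initial hedge
    for i in range(1, steps + 1):
        z = random.gauss(0.0, 1.0)
        s *= math.exp(-0.5 * sigma_true**2 * dt
                      + sigma_true * math.sqrt(dt) * z)
        new_delta = bs_delta(s, k, sigma_model, t - i * dt)
        cash -= (new_delta - delta) * s          # rebalance the hedge
        delta = new_delta
    return cash + delta * s - max(s - k, 0.0)    # unwind, pay the claim

random.seed(1)
pnl_wrong = [hedge_pnl(0.3, 0.1) for _ in range(2000)]
pnl_right = [hedge_pnl(0.3, 0.3) for _ in range(2000)]
mean_wrong = sum(pnl_wrong) / len(pnl_wrong)
mean_right = sum(pnl_right) / len(pnl_right)
```

With the correct volatility the average P&L is close to zero (only discretization noise remains); underestimating the volatility produces a systematic loss, which is the kind of distribution the study quantifies.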

Aside from providing a fairly general yet conceptual framework for assessing model risk for interest rate sensitive claims, this approach has two interesting properties: first, it can be applied to a fairly large class of term structure models (all those nested in the general Heath-Jarrow-Morton specification). Secondly, it shows that model risk does indeed encompass three well defined steps, that is, the identification of the factors, their specification and the estimation of the model’s parameters. The elegance of the HJM term structure characterization is that those three steps can all be recast in terms of the specification and the estimation of the proper forward volatility curve function.

The second part of the study concerns model risk management. We construct a strategy which minimizes the trader’s losses universally with respect to all the possible stochastic dynamics of the term structure within a large class of models. This leads to complex stochastic game problems, hard to study theoretically and to solve numerically; this work is in progress.

RiskLab: http://www.risklab.ch/
Omega research team: http://www.inria.fr/Equipes/OMEGA-fra.html

■Please contact:

Denis Talay – INRIA
Tel: +33 4 92 38 78 98
E-mail: [email protected]

Numerical Algorithms Group by David Sayers

Numerical Algorithms Group Ltd (NAG™), a not-for-profit software house, is the UK’s leading producer and supplier of mathematical computer software for business, education and industry. A key focus and growth area is the complex world of finance. Here, according to NAG technical consultant David Sayers, the role that his company’s technology could play in delivering competitive advantage is tantalisingly ripe for discovery.

NAG was founded in an academic research environment. It was created in 1970 by numerical analysts co-ordinated from the University of Nottingham, moved to Oxford in 1973, then expanded to become an international group. Today NAG continues to be driven by a network of research professionals from around the world. Its successes to date have always depended on integrating this world effectively with the ‘real’ world as experienced by the end users of its technology. NAG is committed to securing future successes by adopting the same approach. For example, there is little point forging ahead with research to heighten accuracy when customers have a more pressing need for speed of delivery. NAG has recently launched a proactive initiative to investigate the financial customer base in more detail, to direct its research network to deal more closely with the real life problems financial analysts have to solve today... and tomorrow.

NAG’s numerical libraries are already used extensively in financial institutions around the world, where NAG is prized for the high quality, reliability and speed of its software, its scope (in terms of the range of solutions available) and its attentive level of technical support. The customer base is wide ranging: some users work on the smallest PC while others manage the most modern supercomputers, and they use a variety of computing languages. A key requirement these institutions have in common is the need to use NAG routines to develop unique in-house trading and pricing strategies, something that is not possible with off-the-shelf complete packages.

For those with particularly complex financial challenges, NAG also offers a consultancy service. Recent work, for example, involved a sophisticated portfolio tracking program and the provision of a bespoke module for trading in global currency and bond markets.

The same NAG mathematical consultants are constantly considering general trends in the marketplace to direct software development.

Key trends already identified include interest in the single European currency. NAG has already predicted a refinement in investment strategies based on a much larger global portfolio of shares than at present in Europe. The indices will now cross the European spectrum of shares, not just those quoted on the local exchanges. Problem sizes will be larger, leading to a greater demand for more powerful routines capable of solving larger problems. Here NAG’s multi-processor libraries (SMP and Parallel libraries) are the ideal solution.

Another interesting development under scrutiny is the inclusion of transaction costs in portfolio modelling. This leads to the minimisation of numerically difficult discontinuous functions. Accordingly, major software systems will need to rely on NAG’s expertise and quality to solve complex problems.

Derivatives are also becoming more complex, with simple option pricing giving way to the more complicated problem of pricing exotic derivatives. Black-Scholes models are now starting to give way to more sophisticated models. As European markets change, so will the regulatory bodies and the surrounding legislation. Dealers will need to know how their books stand at the end of the day, to meet both the regulatory requirements and the ‘risk of exposure’ requirements of their own managers. With NAG’s flexible solvers, adaptation to changing circumstances is made possible. NAG is also already anticipating new breeds of programmers graduating from universities. These people are moving away from the traditional library approach to problem solving. They will need either more sophisticated components or solution modules that interface to ‘packages’ or ‘problem solving environments’. Users will have ever increasing amounts of data to analyse and assess. This will require good visualisation capabilities and a system capable of performing meaningful and powerful statistical analysis of that data.

NAG’s visualisation package IRIS Explorer illustrates the type of visual analysis the financial community increasingly needs. Complex information is easily digested in a second, and the ability to view data in different dimensions reveals pertinent relationships that could otherwise go overlooked.

Looking ahead, NAG is committed to meeting financial analysts’ need for speedier, accurate solutions by enhancing the numerical libraries that have already gained a considerable following in this community. The company will also deliver the security and flexibility these customers require. As architectures change, so the libraries will change, to fully exploit new features and to embrace the increasing need for thread-safety. At the same time, NAG will enhance the libraries with newer and more powerful solvers, keeping pace with the rapid advances in numerical techniques. In addition, further work will focus on presenting NAG’s numerical techniques in new ways, ensuring that the power of this technology can be accessed by new types of user.

NAG also anticipates a surge in awareness of the competitive advantage of using visualisation packages, again a key area for the new types of user. NAG’s own package, IRIS Explorer™, can be combined with the reliable engines of the company’s libraries to form a bespoke computational and visualisation program. This is a vital development in the financial world where, for example, dealers are under pressure to absorb the results of a calculation at a glance. Numbers are not sufficient. NAG is set to develop more visualisation modules to meet the expected demand for increasingly powerful tools in this area.

Further focus areas and challenges will doubtless emerge. NAG anticipates with relish that the rate of change and pace of software development will be phenomenal. For more information on NAG, see http://www.nag.co.uk.

■Please contact:

David Sayers – NAG Ltd
Tel: +44 186 551 1245
E-mail: [email protected]

Premia: An Option Pricing Project by Claude Martini and Antonino Zanette

The main purpose of the Premia consortium is to provide routines for pricing financial derivative products together with scientific documentation. The Premia project is carried out at INRIA and CERMICS (Centre d’Enseignement et de Recherche en Mathématiques, Informatique et Calcul Scientifique).

The Premia project focuses on the implementation of numerical analysis techniques to compute the quantities of interest rather than on the financial context. It is an attempt to keep track of the most recent advances in the field from a numerical point of view in a well-documented manner. The ultimate aim is to assist professional R&D teams in their day-to-day duty. It may also be useful for academics who wish to perform tests on a new algorithm or pricing method without starting from scratch.

The Premia project is three-fold:

• The first component is a library designed to describe derivative products, models and pricing methods, and which provides basic input/output functionalities. This library is written in the C language and is object-oriented.

• The second component is the pricing routines themselves. Each routine is written in a separate .c file which contains the code of the routine; this part of the code is what matters for users who want to plug the routines of Premia into other software.

• The third component is the scientific documentation system. It is created from hyperlinked PDF files which discuss either a pricing routine (every routine has its own PDF doc file) or a more general topic like Monte Carlo methods, lattice methods, etc. This web of PDF files also includes a PDF version of the whole C source code with easy jumps from the source file to the documentation file.

The most valuable component of this project is the documentation, which makes use of the scientific and numerical knowledge of our institutions. This documentation will complement in an important way books devoted to theoretical option pricing. The routines themselves come second: we feel that on any given pricing issue some professional R&D team will certainly have much better and more competitive software or algorithms; nevertheless, on average Premia should be of interest to them. Lastly, the object-oriented software is only there to provide an easy way to test things. It was mainly designed for the use of the Premia team, but it makes Premia more attractive than a plain library of C routines.

Current State and Perspectives

We have already programmed and documented a fairly large set of routines computing the prices and the hedges of stock options. These routines mainly use explicit, lattice or finite difference methods. Current work deals with Monte Carlo and quasi-Monte Carlo methods. We plan to start implementing algorithms for interest rate options in early 2000.
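As an illustration of the lattice methods mentioned above (a generic Cox-Ross-Rubinstein textbook scheme, not a routine taken from Premia), here is a binomial pricer for a call option; all names and parameters are invented for the example.

```python
import math

def crr_call(s0, k, r, sigma, t, steps=200, american=False):
    """Cox-Ross-Rubinstein binomial lattice price of a call option."""
    dt = t / steps
    u = math.exp(sigma * math.sqrt(dt))    # up factor per step
    d = 1.0 / u                            # down factor per step
    p = (math.exp(r * dt) - d) / (u - d)   # risk-neutral up probability
    disc = math.exp(-r * dt)
    # option values at the terminal nodes of the lattice
    values = [max(s0 * u**j * d**(steps - j) - k, 0.0)
              for j in range(steps + 1)]
    # backward induction towards the root
    for i in range(steps - 1, -1, -1):
        for j in range(i + 1):
            cont = disc * (p * values[j + 1] + (1 - p) * values[j])
            if american:
                # early exercise check at node (i, j)
                cont = max(cont, s0 * u**j * d**(i - j) - k)
            values[j] = cont
    return values[0]
```

With 200 steps the European price agrees with the Black-Scholes closed form to within a few hundredths; the same backward induction handles American-style early exercise, which is what makes lattice methods a workhorse of pricing libraries.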

This project is funded by a group of financial institutions called the Premia consortium. Members of the consortium are Crédit Agricole Indosuez, Crédit Lyonnais, Caisse Centrale des Banques Populaires, Union Européenne du CIC and Caisse des Dépôts et Consignations. The funding members have access to the complete software with the source and the documentation. Other interested financial institutions are welcome to join the consortium.

A web site describing in more detail the aims of the project and the way to join the consortium is available at: http://cermics.enpc.fr/~premia/

■Please contact:

Claude Martini – INRIA
Tel: +33 1 39 63 51 01
E-mail: [email protected]




Life Insurance Contract Simulations by Mireille Bossy

A common feature of life insurance contracts is the early exit option, which allows the policy holder to end the contract at any time before its maturity (with a penalty). Because of this option, the usual methodologies fail to compute the value and the sensitivity of the debt of the insurance company towards its customers. Moreover, it is now commonly admitted that an early exit option is a source of risk in a volatile interest rate environment. The OMEGA research team at INRIA Sophia Antipolis studies risk management strategies for life insurance contracts which guarantee a minimal rate of return augmented by a participation in the financial benefits of the company.

A preliminary work of OMEGA consisted in studying how the value of the insurance company’s debt towards a given customer depends on various parameters, such as the policy holder’s criterion of early exit and the financial parameters of the company’s investment portfolio. Statistics of the value of the debt are obtained by means of a Monte Carlo method and simulations of the random evolution of the company’s financial portfolio, the interest rates and the behaviour of a customer.

More precisely, the debt at the exit time t from the contract (with an initial value of 1) is modeled by D(t) = p(t)[exp(rt) + max(0, A(t) − exp(rt))]. Here, r is the minimal rate of return guaranteed by the contract and exp(rt) stands for the guaranteed minimal value of the contract at time t. A(t) is the value of the assets of the company invested in a financial portfolio. A simplified model is A(t) = a S(t) + b Z(t), where S(t) (respectively Z(t)) is the value of the stocks (respectively of the bonds) held by the company; a and b denote the proportions of the investments in stocks and in bonds respectively. Finally, the function p(t) describes the penalty applied to the policy holder in the case of an anticipated exit from the contract. Two kinds of exit criteria are studied: the ‘historical’ customer chooses his exit time by computing mean rates of return on the basis of the past of the contract; the ‘anticipative’ customer applies a more complex rule which takes the conditional expected returns of the contract into account. In both cases, a latency parameter is introduced to represent the customer’s rationality with respect to his exit criterion. (The simulation of a large number of independent paths of the processes S and Z makes it possible to compute the different values of assets and liabilities in terms of the parameters of the market, a and b, and the strategy followed by the policy holder.)
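The debt formula is straightforward to code. The sketch below (with illustrative figures) shows that the policy holder always receives at least the guaranteed amount, scaled by the penalty factor p(t); the option-like max term is the participation in the financial benefits.

```python
import math

def debt_value(t, a_t, r, penalty):
    """Insurer's debt at exit time t for a contract of initial value 1:
    D(t) = p(t) [exp(r t) + max(0, A(t) - exp(r t))],
    where a_t is the portfolio value A(t) and penalty is p(t)."""
    guaranteed = math.exp(r * t)   # guaranteed minimal contract value
    return penalty * (guaranteed + max(0.0, a_t - guaranteed))
```

Equivalently, D(t) = p(t) max(exp(rt), A(t)): if the portfolio underperforms the guarantee the customer still gets the guaranteed amount (times the penalty factor), which is exactly why the early exit option is a risk for the insurer when interest rates move.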

In our first simulations, the assets of the Company were extremely simplified: S(t) is the market price of a unique share (described by the Black-Scholes model) and Z(t) is the market price of a unique zero-coupon bond (derived from the Vasicek model). Even in this framework, the computational cost is high, and we take advantage of the Monte Carlo procedure to propose a piece of software (named LICS) which attempts to demonstrate the advantage of parallel computing in this field. This software was developed within the FINANCE activity of the ProHPC TTN of HPCN.

The computational cost corresponding to more realistic models can become huge. Started in March 1999, the AMAZONE project is part of the G.I.E. Dyade (BULL/INRIA). Its aim is to implement LICS on the NEC SX-4/16 Vector/Parallel Supercomputer. This version will include a large diversification of the financial portfolio (around a thousand lines) and an aggregation of a large number of contracts mixing customers' behaviours.

In parallel to this, the OMEGA team studies the problem of optimal portfolio allocation in the context of simplified models for life insurance contracts. For more information, see: http://www-sop.inria.fr/omega/finance/demonst.html, http://www.dyade.fr

■Please contact:

Mireille Bossy – INRIA
Tel: +33 4 92 38 79 82
E-mail: [email protected]

LICS: a Life Insurance Contract Simulation software.

Page 21: [Finance] Financial Mathematics



Using a Domain-Specific Language for Financial Engineering
by Arie van Deursen

In order to reduce the lead time to market for new types of financial products, it is crucial that a bank's financial and management information systems can be quickly adapted. The use of domain-specific languages has proved a valuable tool to achieve this required flexibility.

Financial engineering deals, amongst other things, with interest rate products. These products are typically used for inter-bank trade, or to finance company take-overs involving triple-comma figures in multiple currencies. Crucial for such transactions is the protection against, and the well-timed exploitation of, the risks coming with interest rate or currency exchange rate fluctuations.

The simplest interest rate product is the loan: a fixed amount in a certain currency is borrowed for a fixed period at a given interest rate. More complicated products, such as the financial future, the forward rate agreement, or the capped floater, all aim at risk reallocation. Banks frequently invent new ways to achieve this, giving rise to more and more interest rate products. This, however, affects the bank's automated systems, which perform contract administration and provide management information concerning the bank's on- and off-balance position, interest and exchange rate risks, etc.
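The idea that a product description reduces to a cash-flow schedule, the core concept of the language described below, can be sketched in a few lines. This Python fragment is purely illustrative and is not the actual domain-specific language (which compiled to COBOL, VSAM and CICS); the Loan type and its fields are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Loan:
    principal: float   # amount borrowed
    rate: float        # annual interest rate
    years: int         # fixed period of the loan

def cash_flows(loan: Loan):
    """Yearly cash flows (amount, year) from the lender's point of view."""
    flows = [(-loan.principal, 0)]               # pay out the principal
    for year in range(1, loan.years + 1):
        flows.append((loan.principal * loan.rate, year))   # receive interest
    flows.append((loan.principal, loan.years))   # principal repaid at maturity
    return flows

print(cash_flows(Loan(1_000_000, 0.05, 3)))
```

More complicated products (forward rate agreements, capped floaters) would be described by richer schedules of conditional flows, but the principle is the same.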

To make the required modifications to these systems, the bank MeesPierson and the software house Cap Gemini decided to describe the essence of their interest rate products in a high-level language, and to generate the software automatically from these product descriptions. To that end, a small domain-specific language was designed, especially suited to describing the so-called cash flows following from products.

The use of this language is illustrated in the Figure. At the heart of the system is a compiler for the domain-specific language. Given a high-level product description, it is able to generate automatically data structures in VSAM format, data entry screens in CICS format, and COBOL routines for registering modifications in the product or for yielding management information.

The key enabler for the compiler is a COBOL library formalizing domain concepts such as cash flows, intervals, interest payment schemes, date manipulations, etc. The domain-specific language provides ways to combine these concepts into products, using a notation that is familiar to financial engineers. The language has been developed in collaboration with CWI, using the ASF+SDF Meta-Environment to construct a prototype tool suite.

The language has been included in Cap Gemini's Financial Product System (FPS), which is in use at several Dutch financial institutions. Time to market has gone down from months to days, and IT cost reductions of up to 50% have been reported.

At the time of writing, CWI is involved in a major research project addressing the tooling, application, and methodology of domain-specific languages in general – a project that is carried out as part of the Dutch Telematics Institute, in collaboration with several Dutch businesses.

■Please contact:

Arie van Deursen – CWI
Tel: +31 20 592 4075
E-mail: [email protected]

Subsymbolic and Hybrid Artificial Intelligence Techniques in Financial Engineering
by László Monostori

The major difficulties in financial engineering, the most computationally intense subfield of finance, are as follows. The problems are usually multivariate, nonlinear, and difficult to model. The information available is imprecise and incomplete, and can have both numerical and linguistic forms. Chaotic features and a changing environment can be mentioned as further barriers. Not surprisingly, intelligent techniques, more exactly techniques of artificial intelligence (AI) research, have become conspicuous in different fields of finance and business. The range of applications is rapidly increasing, and companies use intelligent systems even to automate parts of their core

Use of a language designed to describe the so-called cash flows following from products.




business. The fundamental goal of the investigations started some years ago at SZTAKI was to explore intelligent techniques, including intelligent hybrid systems and multistrategy learning, applicable in different fields of finance and business.

The potential users of intelligent techniques in finance and business are corporate finance, financial institutions, and professional investors. The applications can be grouped as follows: analysis of financial condition, business failure prediction, debt risk assessment, security market applications, financial forecasting, portfolio management, fraud detection and insurance, mining of financial and economic databases, and other (eg macroeconomic) applications.

Intelligent hybrid systems are a very powerful class of computational methods that can provide solutions to problems that are not solvable by an individual intelligent technique alone. Perhaps the most important class of hybrid systems is the integration of expert systems and neural networks. Several techniques have emerged over the past few years, ranging from stand-alone models, through transformational, loosely and tightly coupled models, to fully integrated expert system/neural network models. The integration of neural and fuzzy techniques, which can be considered a 'full integration', is an approach of high importance.

Software tools for technical and financial applications have been developed at the Research Group on Intelligent Manufacturing and Business Processes (IMBP) at SZTAKI. The general-purpose neural network simulator NEURECA is to be mentioned first; it provides an integrated framework including feature definition and real-time computation; automatic feature selection; various learning algorithms; classification and estimation of unknown patterns; standardized (DDE) interfaces to other programs, etc. NEURECA was written in C++, whose object-oriented nature makes it possible to vary the network structure dynamically during learning and to implement different hybrid AI models:

• a hierarchically connected hybrid AI system, HYBEXP, where neural networks work on the lower, subsymbolic level, coupled with an expert system (ES) on the higher, symbolic level

• the neuro-fuzzy version of NEURECA, a symbiotic type of hybrid AI system, which uses a multistrategy learning approach combining self-organized clustering for initializing the membership functions (MBFs), competitive learning for selecting the most important fuzzy rules, and backpropagation (BP) learning for fine-tuning the MBFs' parameters

• genetic-algorithm-supported rule selection within the neuro-fuzzy version of NEURECA.

The applicability of the developed techniques has been demonstrated through case studies:

• ANN-based forecasting of stock prices

• bankruptcy prediction by ANN and neuro-fuzzy techniques

• stock analysis and trading with neuro-fuzzy techniques.

Promising results have been achieved, illustrating the benefits of the hybrid AI approaches (eg comprehensible structures, some sort of explanation facility, higher convergence speed, etc) over conventional methods or pure ANN or ES solutions.
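As an illustration of the kind of subsymbolic component described above, the following sketch trains a one-hidden-layer network with plain backpropagation (BP) to predict the next value of a toy time series from a sliding window of past values. It is not NEURECA (which was written in C++); the series, network size and learning rate are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy price-like series; windows of 4 past values predict the next one
t = np.arange(300)
series = np.sin(0.1 * t) + 0.05 * rng.normal(size=t.size)
window = 4
X = np.stack([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]

def forward(X, W1, b1, W2, b2):
    h = np.tanh(X @ W1 + b1)          # hidden layer activations
    return h, (h @ W2 + b2).ravel()   # linear output unit

# One hidden layer of 8 tanh units, random initial weights
W1 = rng.normal(0, 0.5, (window, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1));      b2 = np.zeros(1)
mse_before = np.mean((forward(X, W1, b1, W2, b2)[1] - y) ** 2)

lr = 0.05
for _ in range(2000):
    h, pred = forward(X, W1, b1, W2, b2)
    g_pred = (2 * (pred - y) / len(y))[:, None]   # d(MSE)/d(pred)
    gW2 = h.T @ g_pred
    g_h = (g_pred @ W2.T) * (1 - h ** 2)          # back through tanh
    W2 -= lr * gW2;        b2 -= lr * g_pred.sum(0)
    W1 -= lr * X.T @ g_h;  b1 -= lr * g_h.sum(0)

mse_after = np.mean((forward(X, W1, b1, W2, b2)[1] - y) ** 2)
print(mse_before, "->", mse_after)
```

In the neuro-fuzzy variants described in the article, the same BP machinery would only fine-tune the membership function parameters after clustering and competitive learning have fixed the rule structure.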

Within SZTAKI, these and related activities in the field of financial engineering are coordinated by the inter-laboratory Research Group of Financial Mathematics and Management. Further impulses are expected from the ERCIM Working Group on Financial Mathematics. This project is partially supported by the Scientific Research Fund OTKA, Hungary, Grant No. T023650.

■Please contact:

László Monostori – SZTAKI
Tel: +36 1 466 5644
E-mail: [email protected]

The NEURECA ANN simulator.




Micro Analytic Simulation Models for Political Planning
by Hermann Quinke

Ever increasing complexity is a distinctive characteristic of tax and transfer laws. It is therefore impossible to evaluate proposed or enacted amendments without appropriate simulation models. GMD has a long-standing tradition of developing such models and applying them in the context of the political decision-making process. The project MIKMOD continues this tradition, because the demand for such models by the federal administration remains high and because their development poses interesting research problems.

MIKMOD focuses on the development and application of microsimulation models for taxes and transfer programs, and of other types of models that have proved useful for the analytical tasks of the administration. Microsimulation models are well suited for estimating the cost of law changes, ie the change in total program cost or tax revenue. They are based on a projected representative sample of the relevant population segment, the micro data base. These models are also capable of estimating the distributive impact of the amendments.
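The core of such a cost estimate can be sketched as follows: apply the current and the amended rule to every record of a weighted sample and gross up the difference with the sample weights. The sample, the weights and both tax schedules below are invented for the illustration and bear no relation to the actual MIKMOD models.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical micro data base: a weighted sample of tax units
income = rng.lognormal(mean=10.0, sigma=0.7, size=5_000)   # yearly income
weight = np.full(income.size, 8_000.0)   # each record represents 8,000 units

def tax_current(y):
    # Stylized current law: 25% above a 10,000 allowance
    return 0.25 * np.maximum(y - 10_000, 0)

def tax_reform(y):
    # Amendment under study: higher allowance, slightly higher rate
    return 0.27 * np.maximum(y - 12_000, 0)

# Change in total tax revenue, grossed up with the sample weights
delta = np.sum(weight * (tax_reform(income) - tax_current(income)))
print(f"estimated revenue change: {delta:,.0f}")

# Distributive impact: average change in tax per income decile
deciles = np.digitize(income, np.quantile(income, np.arange(0.1, 1.0, 0.1)))
for d in range(10):
    m = deciles == d
    print(d, (tax_reform(income[m]) - tax_current(income[m])).mean())
```

With these stylized schedules, low incomes gain from the larger allowance while high incomes pay more, which is exactly the kind of distributive breakdown the models deliver.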

For the German Federal Ministry for Education, Science, Research and Technology, we still maintain and regularly update the static micromodel BAFPLAN for the analysis of the federal training assistance act (BAföG). BAFPLAN is one of the few microsimulation models in Germany which has been used, in various versions, on a regular basis for over fifteen years. We have built a comparable model for East Germany, because the income level and other socio-economic characteristics in East and West Germany are, and continue to be, very divergent.

To keep the forecasting errors within reasonable bounds, the database of BAFPLAN has to be replaced by a more timely sample of eligible students on a regular basis. The student forecasts underlying the projection (ageing) of the sample are also regularly updated. Experience with BAFPLAN indicates that these models are of course still prone to error, but that the forecasting errors are significantly smaller than those of the more intuitive methods formerly used.

The Federal Ministry for Families, the Elderly, Women and Youth has a special need for tools to analyse the economic conditions of the elderly. To satisfy these needs, GMD developed AsA (Analysesystem Alterssicherung), a comprehensive model for simulating the formation of old-age income and its levying with social security contributions and taxes. It has been used extensively in the discussion about the radical change in the taxation of old-age income, as well as for analysing the reform of the core pension insurance (Gesetzliche Rentenversicherung).

AsA consists of two components. One component comprises a classical static microanalytical simulation model, in which specific variants of income tax and social contributions are specified and the incomes of a representative sample of persons receiving old-age income are exposed to these variants. A common database has been developed for the Federal Ministry for Families, the Elderly, Women and Youth and the Federal Ministry of Finance, derived from a survey by Infratest. The second component of AsA is a typical-case model. It allows for analysing the generation of old-age income depending on individual biographies and the membership of an individual in one or more of the different systems providing old-age income. Due to the particular usage of AsA in the policy-making process, special emphasis has been put on the parametrization of the institutional specifics in order to speed up the analysis of the economic effects of proposed legislative changes.

For the same ministry we developed APF, a decision support system for analysing direct and indirect transfers to families with dependent children. Like AsA, it contains an elaborate typical-case model, which is able to analyse the impact of direct and indirect taxation, as well as the effect of all major transfers, on the disposable income of families seen as typical by the user. It also comprises a micromodel for estimating the cost of transfers to families with children. Because stringent data protection laws prohibit access to a sample of tax returns for this ministry, the latest income and expenditure survey is used as the database. This survey had to be adjusted to key tables of the income tax statistics to fully reflect the income distribution of families. APF is the main analytical tool of the ministry in the ongoing discussion about public transfers to families.

The personal income model of the German Federal Ministry of Finance is based on a stratified sample of 123 thousand income tax returns, which was drawn and updated by MIKMOD. This model provides detailed analyses of changes in personal income tax. In order to estimate the distributive effects of decreasing income taxes and increasing indirect taxes, especially energy taxes, MIKMOD imputed the consumption pattern of all tax units in the sample by matching the income and expenditure survey and the sample of tax returns.

■Please contact:

Hermann Quinke – GMD
Tel: +49 2241 14 2727
E-mail: [email protected]


KALIF – Kernel Algorithms for Learning in Feature Spaces
by Bernhard Schölkopf

Kernel algorithms represent a novel means for the nonlinear analysis of complex real-world data. By providing a mathematical foundation for previous algorithms such as neural nets, they support systematic and principled improvements of learning algorithms for pattern recognition, function estimation, and feature extraction.

Learning algorithms are particularly useful in situations where an empirically observable dependency cannot be modelled explicitly, but ample empirical data are available. Examples thereof include pattern recognition problems ranging from pedestrian or traffic sign detection via internet text categorization to problems of financial time series analysis.

The KALIF project (Kernel Algorithms for Learning in Feature Spaces) develops and applies statistical methods for data analysis in nonlinear feature spaces. One can show that a certain class of kernel functions computes dot products in high-dimensional feature spaces nonlinearly related to input space. This way, a number of learning algorithms that can be cast in terms of dot products can implicitly be carried out in feature spaces – without the need for complex calculations in these high-dimensional spaces.
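This kernel property can be verified directly on a small example: for the homogeneous polynomial kernel of degree 2 in two input dimensions, the kernel value computed in input space equals an ordinary dot product after an explicit nonlinear feature map. The feature map below is the standard textbook one, not code from the KALIF project.

```python
import numpy as np

def phi(x):
    # Explicit feature map for the homogeneous polynomial kernel of degree 2:
    # (x1, x2) -> (x1^2, sqrt(2)*x1*x2, x2^2)
    x1, x2 = x
    return np.array([x1 * x1, np.sqrt(2) * x1 * x2, x2 * x2])

def k(x, y):
    # The kernel: evaluated in input space, no feature-space coordinates needed
    return np.dot(x, y) ** 2

x, y = np.array([1.0, 2.0]), np.array([3.0, -1.0])
print(k(x, y), np.dot(phi(x), phi(y)))   # identical values
```

For higher degrees or Gaussian kernels the implicit feature space becomes huge or infinite-dimensional, which is precisely why computing the kernel instead of the feature map pays off.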

An example thereof are Support Vector methods for classification (see Figure) and regression. Moreover, we have recently generalized a standard algorithm for high-dimensional data analysis, the algorithm of statistical principal component analysis (PCA), to the nonlinear case. To this end, we carry out standard linear PCA in the feature space, using kernels. The algorithm boils down to a matrix diagonalization. It can, for instance, be used for nonlinear feature extraction and dimensionality reduction.
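The diagonalization step can be sketched as follows, assuming a Gaussian kernel and toy data; this is a textbook rendering of kernel PCA, not the project's own code.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(50, 2))                      # toy input data

# Gaussian (RBF) kernel matrix: dot products in an implicit feature space
sq = np.sum(X**2, axis=1)
K = np.exp(-(sq[:, None] + sq[None, :] - 2 * X @ X.T) / 2.0)

# Centre the kernel matrix in feature space
n = K.shape[0]
one = np.full((n, n), 1.0 / n)
Kc = K - one @ K - K @ one + one @ K @ one

# Kernel PCA boils down to diagonalising the centred kernel matrix
eigvals, eigvecs = np.linalg.eigh(Kc)
idx = np.argsort(eigvals)[::-1]                   # largest eigenvalues first
eigvals, eigvecs = eigvals[idx], eigvecs[:, idx]

# Projections of the training points onto the first two nonlinear components
alphas = eigvecs[:, :2] / np.sqrt(eigvals[:2])    # normalise the expansion
components = Kc @ alphas
print(components.shape)
```

The columns of `components` are the nonlinear features extracted from the data; keeping only the leading ones gives the dimensionality reduction mentioned above.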

It is precisely the close encounter of theory and practice that renders this recently emerging research area particularly appealing: using complexity concepts of statistical learning theory and methods of functional analysis, we can now theoretically analyze and further develop a number of successful heuristic methods of data analysis. This has already led to record results on several benchmark problems, such as the digit recognition database of the American National Institute of Standards and Technology (NIST).

Potentially more important, however, is the fact that the increased accuracy of these algorithms helps to open up new application fields which had previously not been accessible to machine learning, such as, for instance, very high-dimensional data sets.

More information at:
http://svm.first.gmd.de/kalif.html

■Please contact:

Bernhard Schölkopf – GMD
Tel: +49 30 6392 1875
E-mail: [email protected]

Klaus-Robert Müller – GMD
Tel: +49 30 6392 1860
E-mail: [email protected]

Control Theory and Electro-elasticity: Realisation of an Active Damper
by Maurizio Brocato

Interaction between mechanics and electronics is paramount in the design of sensors and actuators. At IEI-CNR, a system based on the electrorheologic properties of a colloidal suspension, capable of damping mechanical vibrations of small amplitude, has been conceived and developed.

Sensors and actuators are often developed using materials with particular electromechanical properties; their modelling as continua with microstructure differs from ordinary cases because of the possibility of controlling, at least in part, their microstructure. The design and optimisation of such devices utilises the mathematical theory of distributed controls. We deal with a problem of this type, but the case at issue differs from those prevalently analysed in the literature (mainly after the work of J. L. Lions) in that some of the mechanical constitutive parameters may be altered, within limits, at will.

In a previous paper the author suggested an application of Pontryagin's Maximum Principle to find optimal control paths for such 'controllable' materials. An example was given of how mechanical waves can be damped, within given optimality requirements, in a body whose elastic response can be controlled. The optimal path to be followed in order to minimise the kinetic energy in the shortest time is of the type shown in Figure 1, on a deformation vs. rate of deformation plane: the body has to be as stiff as possible when its kinetic energy transforms into elastic energy, and as soft as possible under the opposite

By using a kernel to compute dot products, a Support Vector Classifier constructs a hyperplane in feature space, corresponding to a nonlinear decision rule in input space.




circumstances. Energy is thus dissipated due to the discontinuities of the elastic behaviour in time.
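The switching rule can be illustrated with a toy simulation of a unit mass on a spring whose stiffness jumps between two values: stiff while kinetic energy is being converted into elastic energy (displacement and velocity of equal sign), soft otherwise. All numbers below are invented for the illustration; the recorded energy drops at each stiffness switch, which is the dissipation mechanism described above.

```python
# Hypothetical parameters for a unit mass with switchable stiffness
k_soft, k_stiff = 1.0, 9.0
dt, n = 1e-3, 20_000
x, v = 1.0, 0.0          # initial displacement, released from rest

energies = []
for _ in range(n):
    # Bang-bang rule: stiff while kinetic energy turns into elastic energy
    # (x and v of equal sign), soft under the opposite circumstances
    k = k_stiff if x * v > 0 else k_soft
    v += -k * x * dt     # semi-implicit Euler integration step
    x += v * dt
    energies.append(0.5 * v**2 + 0.5 * k * x**2)

print("energy ratio (end/start):", energies[-1] / energies[0])
```

Each time the spring relaxes from stiff to soft at an extreme of displacement, the stored elastic energy drops by roughly the ratio of the two stiffnesses, so the oscillation decays rapidly without any viscous damping term.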

To implement this control strategy and build an active device capable of damping vibrations, we need a fit actuator: a material whose elastic modulus can be controlled and modified sufficiently quickly to switch from its softest to its stiffest configuration four times during each period of the unwanted oscillation. Electrorheologic (ER) materials meet these requirements.

ER materials are colloidal suspensions of dielectric particles in a dielectric fluid, characterised by the Winslow effect. When an electric field is applied, the suspension aggregates in columns orthogonal to the equipotential surfaces of the field. This phenomenon is primarily due to charges that accumulate at the interface between fluid and particles, causing the latter to behave like dipoles. Thus, by acting on the electric field, the microstructure of the mixture, and therefore its gross mechanical properties, can be modified. The mathematical model of such a sudden change of phase is particularly interesting and will be studied in the near future.

The particles' organisation in columns is reversible and ruled by a threshold of the electric field (a characteristic of the particular material employed, which typically ranges from 2 to 3 MV/m) below which it cannot be observed. The time needed for the organisation or its decomposition is in the order of 1 ms.

This effect has been well known since Winslow's work in 1949. We focus on a particular aspect of it which has so far been given little attention in applications: it gives the mixture – considered as a (perhaps composite) continuum – the typical behaviour of a Bingham body. When the suspension is organised in columns, on applying a small mechanical force the system reacts as a linear elastic solid, ie with a small displacement proportional to that force. This can be explained by the fact that the columns are neither destroyed nor rearranged during this process: they shear and bend as if they were very many small beams within the fluid matrix (whose dynamic influence is probably negligible for such small displacements).

In our experimental set-up, two conducting cylinders are free to move one inside the other, only along their common axis; both are in contact via a suspension of starch in silicone oil filling the thin space between them, but are otherwise electrically separate. When a relatively small motion occurs, the sheared ER material reacts elastically, provided there is a sufficient difference of potential between the cylinders.

To realise the control algorithm, an LVDT (Linear Variable Differential Transformer) sensor captures the relative displacement of the cylinders and informs a computer of the current kinematic state. Once converted in terms of the phase variables of Figure 1, this information is sufficient to decide whether to switch the electric field on or off and thus drive the process along the desired path. The switch needed to interrupt about 2 mA at 3 kV in a few ms has been implemented using a beam triode.

The plots in Figure 2 and Figure 3 show the effect on displacement and force of two opposite steps (on/off/on) of the electric field.

■Please contact:

Maurizio Brocato – IEI-CNR
Tel: +39 050 593 422
E-mail: [email protected]

Figure 2: Displacement vs. time.

Figure 3: Force vs. time.

Figure 1: Optimal paths to damp elastic waves.






The GRAFIS Word Processor for People with Disabilities
by Constantine Stephanidis

GRAFIS is a word processor specifically developed for disabled people at ICS-FORTH in the framework of the HORIZON ESTIA project. This project ran from January 1996 to June 1998 and focused on the vocational training of unemployed disabled people through the use of information technologies.

GRAFIS supports the typical word processing functionality through a simple interface, accessible through conventional as well as alternative input-output devices. The basic characteristics of the interface are: (a) a clear separation between the 'text-input' and 'function' areas, and (b) the elimination of overlapping objects on the screen through the grouping of functions in alternative function areas.

In addition to text input (in Greek and English), manipulation and formatting, GRAFIS features a simplified interface for storing, retrieving and otherwise managing document files. The user is supported by an extensive on-line help system which describes the interface and guides the user in the completion of common word processing tasks. GRAFIS also supports saving and loading documents in Rich Text Format (RTF), thus allowing users to share and exchange documents with their able-bodied counterparts employing mainstream word processors.

GRAFIS supports a variety of input devices, including conventional and special keyboards, mouse, trackball, mouse emulators and binary switches. Interaction through switches is made possible through the use of interaction scanning techniques developed at ICS-FORTH. These techniques allow users who can operate only a single binary switch to make full use of the word processor.

Text input for users employing binary switches is supported through virtual keyboards with alternative key arrangements, namely: (a) QWERTY, and (b) letter-frequency based (ie keys are arranged based on the frequency of letters and digraphs in each of the supported languages). Rate enhancement is also achieved by means of a word prediction function, which performs context-based prediction of the possible next words in a text, or of the continuation of the word currently being typed. Word prediction is based on a simple statistical technique applied to exemplary documents (eg general-case business letters) prior to the use of GRAFIS, or to documents that the user has edited, once a significant corpus of typical cases of such documents exists.
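A frequency-based predictor of the kind described can be sketched in a few lines. This is only an illustration of the statistical idea, not the GRAFIS implementation, and the toy corpus is invented.

```python
from collections import Counter, defaultdict

# Exemplary documents, eg general-case business letters (toy corpus)
corpus = ("dear sir we thank you for your letter "
          "we thank you for your order "
          "we look forward to your reply").split()

# Count which word follows which (bigram statistics)
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word, k=3):
    """Most likely next words in the text, given the last word typed."""
    return [w for w, _ in following[word].most_common(k)]

def complete(prefix, k=3):
    """Completions of the word currently being typed, by frequency."""
    vocab = Counter(corpus)
    return [w for w, _ in vocab.most_common() if w.startswith(prefix)][:k]

print(predict_next("your"))
print(complete("th"))
```

As the user edits more documents, retraining the counts on that growing corpus adapts the predictions to the user's own vocabulary, as described above.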

One of the primary design objectives of GRAFIS concerned the ease of customisation of the interface by both

Figure 2: An instance of the word processor GRAFIS-2 for users with learning difficulties (without the optional virtual keyboard).

Figure 1: An instance of the word processor GRAFIS-1 for users with motor impairment in their upper limbs.




facilitators and end users. Customisable characteristics include colour (background, virtual keyboards, etc), the dimension of characters in the virtual keyboard, activation/deactivation of acoustic feedback, and interaction scanning attributes (eg speed).

GRAFIS addresses two main categories of disability, for which there exist separate versions of the word processor: GRAFIS-1 for users with motor impairment in the upper limbs and GRAFIS-2 for users with learning difficulties. Both versions support the previously described functionality in its entirety, and have very similar interfaces, but differ in terms of the input techniques; the latter are provided on the basis of user characteristics and requirements. For instance, GRAFIS-1 supports virtual keyboards and scanning for access through mouse and binary switches, while GRAFIS-2 supports input through conventional keyboard and mouse, and offers the virtual keyboard optionally. Figure 1 depicts an instance of GRAFIS-1, and Figure 2 an instance of GRAFIS-2.

Both versions of the GRAFIS word processor have been tested and evaluated by end users during the vocational training and pilot employment phases of the ESTIA project. Additionally, after the completion of the ESTIA project, GRAFIS was distributed to selected disability organisations and end users for further evaluation. The results of the various evaluation phases have shown that GRAFIS is easy to learn and use, and that its characteristics are appreciated, in particular by users with no previous experience of word processing.

Partners in the HORIZON ESTIA project were: ICS-FORTH, University of Athens Department of Informatics, Idrima Kinonikis Ergasias, Panellinios Sindesmos Tiflon, Idrima Pammakaristos, Eteria Spastikon Voriou Ellados, Euroskills, and INESC-Portugal.

■Please contact:

Constantine Stephanidis – ICS-FORTH
Tel: +30 81 39 17 41
E-mail: [email protected]

Building Harmonised Semantic Lexicons
by Nicoletta Calzolari

SIMPLE is a project sponsored by EC DGXIII in the framework of the Language Engineering programme. To our knowledge, this project represents the first attempt to develop wide-coverage semantic lexicons for a large number of languages (12), with a harmonised common model that encodes structured 'semantic types' and semantic (subcategorisation) frames. Even though SIMPLE is a lexicon building project, it also addresses challenging research issues and provides a framework for testing and evaluating the maturity of the current state of the art in the realm of lexical semantics, grounded on, and connected to, a syntactic foundation.

Many theoretical approaches are currently tackling different aspects of semantics. However, such approaches have to be tested i) with wide-coverage implementations, and ii) with respect to their actual usefulness and usability in real-world systems of both a mono- and multi-lingual nature. The SIMPLE project addresses point i) directly, while providing the necessary platform to allow application projects to address point ii).

SIMPLE is coherent with the strategic EC policy that aims at providing a core set of language resources for the EU languages, and should be considered a follow-up to the PAROLE project (see http://www.ilc.pi.cnr); SIMPLE adds a semantic layer to a subset of the existing morphological and syntactic layers developed by PAROLE. The semantic lexicons (about 10,000 word meanings) are built in a harmonised way for the 12 PAROLE languages. These lexicons will be partially corpus-based, exploiting the harmonised and representative corpora built within PAROLE. In this way, the semantic encoding will respect actual corpus distinctions. The lexicons are designed bearing in mind a future cross-language linking: they share and are built around the same core ontology and the same set of semantic templates. The 'base concepts' identified by EuroWordNet (about 800 senses at a high level in the taxonomy) are used as a common set of senses, so that a cross-language link for all 12 languages is already provided automatically through their link to the EuroWordNet Interlingual Index (see: http://www.let.uva.nl/~ewn/).

The Model

In the first stage of the project, the formal representation of the 'conceptual core' of the lexicons was specified, ie the basic structured set of 'meaning types' (the SIMPLE ontology). This constitutes a common starting point on which to base the building of the language-specific semantic lexicons. The development of 12 harmonised semantic lexicons requires strong mechanisms for guaranteeing uniformity and consistency. The multilingual aspect translates into the need to identify elements of the semantic vocabulary for structuring word meanings which are both language-independent and able to capture linguistically useful generalisations for different NLP tasks.

The SIMPLE model is based on the recommendations of the EAGLES Lexicon/Semantics Working Group (http://www.ilc.pi.cnr.it/EAGLES96/rep2) and on extensions of Generative Lexicon theory. An essential characteristic is its ability to capture the various dimensions of word meaning. The basic vocabulary relies on an extension of 'qualia structure' for structuring the semantic/conceptual types, as a representational device for expressing the multi-dimensional aspect of word meaning. The model has a high degree of generality in that it provides the same mechanisms for generating broad-coverage and coherent concepts independently of their grammatical/semantic category (entities, events, qualities, etc).

In order to combine the theoretical framework with the practical lexicographic task of lexicon encoding, we have created a common 'library' of language-independent template types, which act as 'blueprints' for any given type, reflecting the conditions of well-formedness and providing constraints for lexical items belonging to that type. The relevance of this approach for building consistent resources is that the types both provide the formal specifications and guide subsequent encoding, thus satisfying theoretical and practical methodological requirements.
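How such template types might be rendered as data structures can be sketched as follows; the class names, slot names and example entries below are entirely hypothetical and are not the actual SIMPLE encoding format, but they illustrate how one blueprint constrains entries across languages.

```python
from dataclasses import dataclass

@dataclass
class TemplateType:
    name: str            # semantic type from the ontology
    formal: str          # qualia: what kind of entity it is (is-a)
    agentive: str = ""   # qualia: how it comes into being
    telic: str = ""      # qualia: its purpose or function
    constitutive: str = ""  # qualia: what it is made of

@dataclass
class LexicalEntry:
    lemma: str
    template: TemplateType
    language: str

# The template acts as a blueprint: lexical items of the same type share
# the same slot structure in every language, enabling cross-language linking.
instrument = TemplateType("Instrument", formal="concrete artifact",
                          agentive="manufacture", telic="use for a task")
knife_en = LexicalEntry("knife", instrument, "English")
coltello_it = LexicalEntry("coltello", instrument, "Italian")
print(knife_en.template.name, coltello_it.template.name)
```

Because both entries point at the same template type, well-formedness conditions and cross-language links need to be stated only once, at the level of the type.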

The large number of languages covered by SIMPLE is reflected in the size of its consortium: Università di Pisa (coordinator: A. Zampolli), Erli (now Lexiquest), Paris; Institute for Language and Speech Processing, Athens; Institut d'Estudis Catalans; University of Birmingham; University of Sheffield; Det Danske Sprog- og Litteraturselskab; Center for Sprogteknologi, Copenhagen; Språkdata, Göteborgs Universitet; University of Helsinki; Instituut voor Nederlandse Lexicologie, Leiden; Université de Liège BELTEXT; Centro de Linguística da Universidade de Lisboa; Instituto de Engenharia de Sistemas e Computadores, Lisboa; Fundacion Bosch Gimpera, Universitat de Barcelona; Institut für Deutsche Sprache; Istituto di Linguistica Computazionale – CNR, Pisa; University of Graz.

■Please contact:

Nicoletta Calzolari – ILC-CNR
Tel: +39 050 560 481
E-mail: [email protected]

Live Teleteaching via the Internet
by Konstantinos Fostiropoulos, Detlef Skaley, Hideyuki Inamura and Sepideh Chakaveh

The telecommunication sector for scientific use has developed in parallel to the classical telephone system. However, in the last decade, this sector has undergone a change from one confined to the relatively high-tech sector of science and media, to one accessible and easily available to the general public. This change was brought about by the rapidly falling prices of telecommunication networks and equipment.

Our planet is now covered by numerous networks with different transmission speeds – terrestrial, by copper cable or optical fibre, via satellite, or by directed radio transmission. The Internet has also changed as a result: from catering primarily to the scientific community’s need for data to addressing the multimedia needs of the general public – for education, work, consumer needs, entertainment and other personal matters. The high demand for these services means that the growth opportunities in this industry are tremendous.

The changes mentioned above have affected the educational facet of the Internet. There is an enormous demand for high-quality educational applications on the Internet in almost every subject. On the other hand, in spite of this high demand, there is no straightforward way to bring the teacher (service provider) and the student (customer) together for live lectures.

The various incompatible solutions which have been implemented differ both in hardware and in software. Small bandwidth, and the resulting slow connections, are the main reason why individual customers cannot receive these services easily at home. Institutions such as companies and schools, however, do have the necessary connection speeds and can work and learn together.

One possible implementation of the tele-education concept is in the form of the ISDN video conference system (standard H.320). This system has the capacity to support a variety of multimedia services, using several bundled ISDN B-channels for the transmission of audio, video and documents.

Another implementation of this concept is via the Internet. There are a number of application-specific teaching and learning tools developed by the lecturers themselves. However, these individual tools are difficult to extend to other applications.

In this article, we present a solution that integrates both ISDN (Integrated Services Digital Network) and IP (Internet Protocol) techniques on one platform in an uncomplicated manner. The result is an interactive distributed lecture that is also broadcast to the Internet.

Education Demand

Since the physical and economic borders in Europe have opened, educational institutions have been drawn into the whirlpool of the market, where they face competition from across the continent. For a student, the personality of a physically present teacher is no longer the only determining factor in choosing a school or university. The means employed to convey the teaching are also important. The student now has the opportunity to obtain a diploma from an internationally renowned educational institution without living there, at the very least saving on the cost of living. And he can choose the specific education for his needs, because he has the entire world to choose from.

In the former command economies of Eastern Europe, there is a strong interest in acquiring economic know-how from the West: in order to orient oneself in the vast global market and establish private industry, one needs detailed knowledge of the market and its structure. In this pilot project, we demonstrated that it is possible for people such as students, entrepreneurs, employees of companies and indeed their leaders to take part in lectures of their choosing over IP and to acquire the knowledge that they need.

Together with its partners, GMD succeeded in finding students in Bucharest and Sofia and in organising distributed lectures on a common platform with the content provider in Pennsylvania. The participants met in educational centres of the Faculty for Mathematics and Informatics (FMI), University of Sofia, and the Institute for Mathematics (ICI), University of Bucharest. There they first listened to the live lecture from Pennsylvania. Afterwards, there was the opportunity for discussions between the three locations on questions arising from the lecture. The content of the lecture was delivered by CAPE, a consortium of more than 100 faculties in Pennsylvania, on the following subjects:

• Venture Initiation

• Market Economies

• International Commerce

• Banking.


For this course, the participants use a specially designed Intranet with access restricted by passwords. To minimise the effort at the three locations, the existing facilities there have to be used. However, these facilities are not compatible. The American side uses a commercial system for ISDN video conferencing of the H.320 standard. With a similar system at GMD, a point-to-point connection to CAPE has been established via two ISDN B-channels at 128 kb/s. In principle, the bandwidth can be increased to 384 kb/s by bundling six channels. In addition, the number of participants can be increased by a multipoint control unit (MCU). The lecture notes can be accessed via the Internet from GMD's server.

The two Eastern European sites use IP networks, but the connection speeds via terrestrial means are inadequate. Using GMD's MULTISERVE satellite communication network via EUTELSAT IIF3, it is currently possible to connect GMD, ICI in Bucharest and the Academy of Sciences in Sofia at 512 kb/s. In this pilot project, we transmit 128 kb/s from GMD. On the back channels from the Eastern European sites, 64 kb/s could be used.

In addition, there is a 128 kb/s connection via telephone modem from the Academy to the FMI in Sofia. These two multicast video conference systems (VCS), via ISDN and via IP, are linked together into one network at GMD. To make this link, the input and output signals of the ISDN VCS are transformed for the Intranet using an SGI O2 with video capture and converter cards. A mixer had to be used to equalise the audio channels.

Software for the ISDN part of the network is fixed by the H.320 standard. The IP side had a variety of options to choose from. Most of these systems were not platform independent and, in addition, a server would be needed for a multi-point conference; they were therefore not adequate. The solution was to use the Mbone tools VIC v2.8ucl3 (Video Conference) and RAT v3.0.34 (Robust Audio Tool). These run on most of the common operating systems and distribute the data streams directly by multicast routers. In GMD and FMI an SGI O2 is used as the terminal. On the Bucharest side, a multimedia personal computer is used. All terminals receive video and audio inputs directly and are connected to each other via the Intranet.

Figure 1: GMD's Teleteaching network including ISDN as well as IP video conference systems. Through this network students in Sofia and Bucharest met professors in Pennsylvania.

Figure 2: Discussion between participants of the distributed course via satellite on questions arising from the lecture.


In this pilot project we held four teleteaching sessions of two hours each, one per week. The participants in Pennsylvania dialled in to GMD via an ISDN VCS. The students in Sofia and Bucharest had interactive access to the lecture via Mbone VCS, chat and electronic mail. In addition, the whole course was broadcast on the Internet using RealPlayer, and the participants had access to the HTML (Hypertext Markup Language) lecture notes on GMD's server.

Because of the limited bandwidth, we restricted the video transmission to five frames per second, which is acceptable for teleteaching purposes. The audio transmission was stable and its quality was sufficient for discussions.

By running this course we showed that a low-cost, low-bandwidth interactive distributed course can be implemented. Using the full transmission capacity of the network, the quality of the course will rise – as will its cost.

■Please contact:

Konstantinos Fostiropoulos – GMD
Tel: +49 2241 14 2667
E-mail: [email protected]

3D City – Adaptive Capture and Visualization of Cityscapes
by Thomas Jung and Ines Ernst

The project 3D City is concerned with the development of an innovative technique for visualizing cityscapes.

The capture of existing cityscapes is normally an extremely time-consuming matter. It involves surveying objects in three dimensions and providing them with surface textures for the purposes of visualization. Generation of the models requires a great deal of subsequent reworking by hand, and the resulting large three-dimensional models can only be rendered in reasonable quality by using high-end graphics computers.

On the other hand, there are techniques that allow panoramic shots to be visualized interactively in photo-realistic quality, eg QuicktimeVR. Here, however, the viewer cannot move freely, but only turn and alternate between different camera positions, as no depth information on the panoramic shots is available. Adding depth information to such shots should allow the viewer to move freely in cityscapes that have been captured photographically.

A new computer-graphics 2 1/2D rendering technique allows the generation of images for viewer positions from which no photograph was taken. On the basis of image and depth information, the rendering algorithm decides which object parts are visible from the current viewer position and renders them according to their appearance in the corresponding panoramas.
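The core geometric step – re-observing a photographed point from a new viewer position once its depth is known – can be sketched in a simplified 2D ground-plane version (the code and function names are ours, not the 3D City implementation):

```python
import math

def reproject(theta, depth, cam_pos, viewer_pos):
    """Given a panorama pixel seen from cam_pos at bearing theta
    (radians) with known depth, compute the bearing and distance at
    which the same world point appears from a new viewer position
    (simplified to the 2D ground plane)."""
    # World-space point of the pixel.
    px = cam_pos[0] + depth * math.cos(theta)
    py = cam_pos[1] + depth * math.sin(theta)
    # Re-observe it from the new position.
    dx, dy = px - viewer_pos[0], py - viewer_pos[1]
    return math.atan2(dy, dx), math.hypot(dx, dy)

# A point 5 m straight ahead of the original camera,
# viewed again after stepping 1 m to the side.
bearing, dist = reproject(0.0, 5.0, (0.0, 0.0), (0.0, 1.0))
```

For each output direction the renderer then keeps the nearest reprojected sample, which is how object parts hidden from the new viewpoint are suppressed.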

Using several panoramic shots, the depth values for individual pixels or image sections can be reconstructed automatically. Specialized tools allow depth information to be added semi-automatically for further image sections.

What emerges is a visualization environment with the following features:

• photo-realistic rendering quality: since the basic information is obtained from photographs, the image quality is comparable with that of multimedia environments such as QuicktimeVR

• free navigation in the parts of the scene that are rendered in the pictures taken; no fixed camera positions

• moderate computation time for rendering; it increases only slightly with the complexity of the scene

• use of modern 3D graphics cards, the techniques employed being based on 3D algorithms

• compatibility with 3D graphics: the technique can be used eg for rendering high-detail static components in VRML (Virtual Reality Modeling Language) scenes.

Screen shot of 3D City system with (a) original panoramic shot, (b) computer-generated image parts, (c) viewing position.

Possible application scenarios include:

• simple generation and updating of 3D city maps or 3D information systems

• helping architects by speeding up the job of drawing up tenders

• capturing the state of building and redevelopment areas

• supporting architectural presentations by enabling planned objects to be rendered in a natural, lifelike setting.

■Please contact:

Thomas Jung – GMD
Tel: +49 30 6392 1779
E-mail: [email protected]

AVOCADO – The Virtual Environment Framework
by Henrik Tramberend, Frank Hasenbrink, Gerhard Eckel and Uli Lechner

AVOCADO is a software framework designed to allow the rapid development of virtual environment applications for immersive and non-immersive display setups such as the CAVE (CAVE Automatic Virtual Environment), CyberStage, Responsive Workbench and Teleport. It supports the programmer in all tasks involved with these types of applications.

AVOCADO integrates a variety of different interface devices and is easily extensible and adaptable to new devices yet to be invented. It is highly interactive and responsive, supports a rapid prototyping style of application development, and will enable the development of truly distributed applications. It is targeted at high-end Silicon Graphics workstations and aims to deliver the best performance these machines are capable of.

AVOCADO includes the following concepts:

• Viewer: All kinds of configurations of input and output devices can be assembled into viewers. A viewer builds up the interface between the user and the virtual world. Typical elements of a viewer are the visual, auditory and tactile displays as output devices, and spatial trackers and audio or video sources as input devices. In a multi-user environment, every user configures his own viewer.

• Scripting: All relevant parts of the system's Application Programming Interface are mapped to Scheme, an interpreted scripting language. This enables the user to specify and change scene content, viewer features and object behavior in a running system.

• Streaming: All objects know how to read and write their state to and from a stream. This is the basic facility needed to implement object persistence and network distribution. Persistence together with streaming support for objects enables the user to write the complete state of the system to a disk file at any time. An initial system state can be read from a disk file as well.

• Distribution: All AVOCADO objects are distributable, and their state is shared by any number of participating viewers. Object creation, deletion and all changes at one site are immediately and transparently distributed to every participating viewer.

• Extensions: The system is extensible by subclassing existing C++ system classes. This concerns object classes as well as classes which encapsulate viewer features. Compiled extensions can be loaded into the system at runtime via Dynamically Shared Objects.

• Interaction: Viewers provide input/output services which can be mapped to objects in the scene. Objects can respond to events generated by input devices or other objects, and can deliver events to output devices.

• Visual Rendering: The different displays each have their own rendering mechanism applied to the modeling hierarchy. Only the visual rendering has direct access through the Performer pipeline. The auditory and tactile rendering can be calculated on a second computer, connected to the ‘master’ by a fast network.
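The streaming and persistence concepts above can be illustrated with a minimal sketch. AVOCADO itself is implemented in C++ with Scheme scripting; this Python analogue, including its class names, is ours:

```python
import io
import json

# Sketch of the streaming concept: every object can write its state to
# a stream and read it back, which is the basis for both persistence
# (writing the whole scene to a disk file) and network distribution.
class Streamable:
    def write_state(self, stream):
        json.dump(self.__dict__, stream)

    def read_state(self, stream):
        self.__dict__.update(json.load(stream))

class SceneObject(Streamable):
    def __init__(self, name='', position=(0.0, 0.0, 0.0)):
        self.name = name
        self.position = list(position)

# Persist one object to an in-memory stream and restore it.
buf = io.StringIO()
SceneObject('lamp', (1.0, 2.0, 0.5)).write_state(buf)
buf.seek(0)
restored = SceneObject()
restored.read_state(buf)
print(restored.name, restored.position)  # lamp [1.0, 2.0, 0.5]
```

Distribution then amounts to shipping the same streamed state changes to every participating viewer instead of to a file.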

Visual Data Processing

The visual data processing is organized in a pipeline and computed in parallel by Performer. This rendering pipeline consists of a set of optional units for:

• a database connection

• a user application

• the visual culling of the scene

• the intersection of objects

• the drawing of the scene.

After the modeling hierarchy is updated to its actual state in the application process, it is passed on to the culling process, which strips all invisible objects. It is important to support this technique by dividing large geometry into smaller, cullable objects. The part of the scene left over after the culling is passed on to the drawing process, where it is rendered to the screen with OpenGL. For configurations with more than one visual display system, the appropriate number of pipelines is used.

Auditory Rendering

Rendering the auditory scene has to take into account the position of the observer's head in the virtual world and in the auditory display, as well as the characteristics of the auditory display (ie the loudspeaker configuration). Auditory rendering is a two-stage process: in the first stage a source signal is synthesized, and in the second stage it is spatialized. The first stage needs only the sound model parameters. In the second stage, the signals driving the auditory display are computed as a function of the distance between observer and sound source, the radiation characteristics of the source, and the signature of the acoustic environment.

With these signals the auditory display produces the illusion of sound emitted from a certain position in a certain acoustic environment shared by the observer and the source. The sound rendering is a dynamic process that takes into account movements of the observer in the display, movements in the virtual world, and movements of the sound source. If these movements are faster than about 30 km/h, the pitch changes due to the Doppler shift are simulated as well.
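Two of the quantities entering this second stage can be sketched as standard acoustic computations: an inverse-distance gain and a Doppler factor. The formulas are textbook acoustics; the function names and the simplifications are our own, not AVOCADO's renderer:

```python
# Speed of sound in air at room temperature, in m/s.
SPEED_OF_SOUND = 343.0

def distance_gain(distance, ref=1.0):
    # Inverse-distance (1/r) attenuation relative to a reference
    # distance; clamped so the gain never exceeds 1.
    return ref / max(distance, ref)

def doppler_factor(radial_speed):
    # radial_speed > 0 means the source approaches the observer.
    # Perceived frequency = emitted frequency * factor.
    return SPEED_OF_SOUND / (SPEED_OF_SOUND - radial_speed)

# A source approaching at 30 km/h is pitched up by roughly 2.5 %.
factor = doppler_factor(30 / 3.6)
```

Below the 30 km/h threshold mentioned in the article, this factor stays so close to 1 that the pitch change can be ignored.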

Tactile Rendering

The CyberStage display has a set of low-frequency emitters built into its floor. This allows vibrations to be generated which can be felt through the feet and legs. There are two main areas of application for this display component. First, low-frequency sound (which cannot be localized) can be emitted to complement the loudspeaker projection. Second, specially synthesized low-frequency signals can be used to convey attributes such as roughness or surface texture.

The vibration display is handled like sound in the rendering process. Sound models are used to generate the low-frequency signals. Sound synthesis techniques generally referred to as granular synthesis are very well suited to producing the band-limited impulses that may represent surface features. Such features can be displayed through user interaction. For instance, a virtual pointing device can be used to slide or glide over a surface and produce vibrations. Additionally, higher-frequency sound can also be produced if necessary. Some of what is usually felt through the skin of our fingers when sliding over an object is presented to our feet. This sensation can complement sound and vision dramatically.

■Please contact:

Henrik Tramberend – GMD
Tel: +49 2241 14 2364
E-mail: [email protected]

Frank Hasenbrink – GMD
Tel: +49 2241 14 2051
E-mail: [email protected]

Gerhard Eckel – GMD
Tel: +49 2241 14 2968
E-mail: [email protected]

Uli Lechner – GMD
Tel: +49 2241 14 2984
E-mail: [email protected]

Immersive Telepresence
by Vali Lalioti, Frank Hasenbrink and Olaf Menkens

The aim of Immersive Telepresence is to give geographically dispersed groups of people the possibility to meet and work within projection-based Virtual Reality systems as if face-to-face.

In our approach, participants not only meet as if face-to-face, but also share the same virtual space and perform common tasks in order to reach a common goal. For this purpose, live stereo-video and audio of remote participants is integrated into the virtual space of another participant, allowing a geographically separated group of people to collaborate while maintaining eye contact, gaze awareness and body language.

Participants can use a wide range of projective Virtual Reality systems, such as CyberStage, Responsive Workbench, Cooperative Responsive Workbench and Teleport, resulting in symmetric or asymmetric collaboration scenarios. The scientific approach includes stereo camera calibration, in order to obtain the camera parameters, which are then used for integrating the stereo-video into the virtual space while preserving the stereo effect and perspective for the tracked viewer.

A remote participant being virtually present in a 3D virtual environment displayed in the CyberStage.

Immersive Telepresence in CyberStage

The first prototype of the environment was demonstrated in October 1997. A remote participant captured with a stereo camera was chroma-keyed into a 3D virtual space, becoming virtually present in a 3D virtual environment displayed in the CyberStage. For the first time ever, a fully immersive 3D virtual teleconference was demonstrated, based on the virtual studio technique of keying video images into computer-generated scenes.

The imported image sequences had to pass through several Silicon Graphics video options, delay units and chroma keyers before being taken as a dynamic texture into the AVOCADO software framework. AVOCADO handled the positioning and display of the remote participant within the virtual world of an operating theatre. The remote participant gave instructions to CyberStage visitors on how to operate various devices and instruments of this virtual operating theatre.

CyberStage and Immersive Telepresence

In integrating the live audio of the remote participant, the spatial audio functionality of AVOCADO was used (see previous article). The live audio was attached as an audio source to the geometry representing the live stereo-video in the virtual world. Audio from the remote participant was therefore spatialized, increasing in volume or fading away in direct response to moving nearer to or further away from the video image of the remote participant in CyberStage. This greatly enhances the immersive telepresence effect.

Immersive Telepresence at the Cooperative Responsive Workbench

The Cooperative Responsive Workbench extends the Responsive Workbench with a vertical screen, thus enlarging the viewing frustum and allowing remote collaboration and Immersive Telepresence.

On the vertical screen of the Cooperative Responsive Workbench, a user can see and communicate in real time with a person or team located at a different place and, at the same time, view and manipulate 3D stereoscopic virtual objects (eg the model of a car, seismic data for mining, or the medical data of a patient). Immersive Telepresence can be used in a variety of application areas, such as medical and geoscience applications.

■Please contact:

Vali Lalioti – GMD
Tel: +49 2241 14 2787
E-mail: [email protected]

Frank Hasenbrink – GMD
Tel: +49 2241 14 2051
E-mail: [email protected]

Olaf Menkens – GMD
Tel: +49 2241 14 2362
E-mail: [email protected]

BSCW sets Standard for Open Internet Cooperation
by Konrad Klöckner

Even today, platform-independent cooperation in geographically distributed projects is often a difficult problem, and so far the Internet has supported it only to a limited extent. With the BSCW (Basic Support for Cooperative Work) system, GMD in Sankt Augustin has developed a World Wide Web-based platform for cooperation support which extends the possibilities of the World Wide Web considerably.

The basic idea of the BSCW system is the autonomously managed shared workspace which the members of a working group install and use for the organization and coordination of their tasks. The members of the working group can upload documents from their local computers to the workspace, as well as access documents in the workspace, eg for processing them.

In addition to convenient document management, there are a notification service providing information about current activities and a great number of functions and object types for more extensive support of cooperation. The user of the system only requires one of the usual browsers, such as Netscape Navigator or Microsoft Internet Explorer.

BSCW is installed on some 300 servers around the world, both in the education area and in business. For several years now, GMD has been operating a public server on which more than 17,000 users have meanwhile registered and manage common projects.

Version 3.3 of the system, which has just been released, is another step in the direction of open Internet cooperation. This version provides an object browser written in Java which enables simple processing of files and documents in a similar way to the Windows file browser. To support information management, documents and World Wide Web links can now be annotated and rated individually; a median is calculated from the individual ratings and presented to the users.

The previous version was improved and extended in many respects, eg through improved e-mail integration, new locking mechanisms for distributed document processing, easier creation of large document archives and easier access rights management. Version 3.3 of the BSCW system thus fulfils a great number of requirements and requests from the large user group in the form of new and extended functionality. For further information visit our Web site at http://bscw.gmd.de/

■Please contact:

Konrad Klöckner – GMD
Tel: +49 2241 14 2055
E-mail: [email protected]




European Software Institute and IEI-CNR to launch a Qualification Scheme for Software Process Assessors
by José Arias, Fabrizio Fabbrini, Mario Fusani, Vinicio Lami, Giuseppe Magnani

The SPICE project (Software Process Improvement and Capability dEtermination) began in 1993 to support the development of an ISO/IEC standard for software process assessment and improvement. The project has been conducted within the context of ISO/IEC JTC1 SC7 WG10 by experts from all over the world. In 1998 it produced a Technical Report (ISO/IEC TR 15504), which includes a process reference model, an assessment model and assessment/improvement guidance. Experiments with the SPICE models and guides, called Trials, have been under way since 1995, and hundreds of assessment reports from around the world have now been collected. The results of the SPICE Trials are now supporting the current refinement work for the final standard, expected by the end of 2001.

As assessment/improvement activities in the SPICE framework are going to have a large impact on the software community (suppliers, organized customers, end users), the need for qualified human resources to enact the standard is becoming crucial. ISO/IEC TR 15504 contains requirements and curricula criteria for assessors but, once the standard is approved, people must be ready to work with it.

The organizations involved in the SPICE project are well aware of this. Among them, non-profit public organizations such as the European Software Institute (ESI), Bilbao, Spain, and IEI of the National Research Council, Pisa, Italy, are combining efforts to set up a body capable of running a Qualification Scheme for SPICE Assessors.

The main objectives for defining the Qualification Scheme are the following:

• guarantee the consistency, repeatability and quality of SPICE-based assessments

• design a qualification/registration programme with a common set of requirements to be fulfilled by all assessors

• establish a systematic improvement plan using a feedback system built into assessment methods, models and practices, and disseminate that knowledge among the software community.

The overall programme is managed by a Qualification Body, an independent authority led by ESI, to corroborate the compatibility of models and methods, provide training courses for potential and existing assessors, and support the operating activities and procedures with respect to registration.

Additional participation by other partners, such as the National Agencies for Quality (members of the European Organization for Quality) and other European non-profit organizations, will be encouraged. Upon completion of the Qualification Scheme, the ISO/IEC 15504 standard for software process assessment will be endorsed, with integrity during the assessment activities preserved by guaranteeing that those activities are performed by professionals with a high level of competence.

Thus, the Qualification Scheme does not just cover selection and training; it has been designed to ensure a high level of professionalism from assessors, to manage the database of assessors' activities, and to collect useful feedback in order to improve the Scheme itself, the SPICE assessments (including Trials) and all the assessment/improvement campaigns performed within any existing framework (such as, for example, ISO 9000).

■Please contact:

Mario Fusani – IEI-CNR
Tel: +39 050 593 512
E-mail: [email protected]

Giuseppe Magnani – ESI, Bilbao
Tel: +34 94 420 95 19
E-mail: [email protected]

Solving Implicit Differential Equations within a Modeling Environment
by Jacques de Swart

Many industrial applications can be modeled by sets of Implicit Differential Equations (IDEs). In joint work at CWI and Paragon Decision Technology (PDT), sponsored by the Technology Foundation STW, it is shown how the integration of an IDE solver in an algebraic modeling system can help to overcome many of the difficulties that a user currently encounters in trying to solve such systems of IDEs.

In industrial design processes, testing designs is of major importance. Before products are manufactured based on some design, one wants to know how a product based on this design would behave under various circumstances. For this purpose, one models the product in terms of its design; by solving the model, one simulates the working of the product. This procedure is much cheaper than building and testing prototypes. The modeling of a product, particularly of time-dependent processes, often results in IDEs – equations in which the derivatives of the unknowns with respect to one independent variable, typically time, appear implicitly. To solve IDEs, numerical methods are indispensable.

Examples of applications where simulation processes involve IDEs are testing the design of an electrical circuit and the formulation of safety requirements for trains. Other areas include the modeling of turbulent flows in water tube systems, the description of demand-supply curves in liberalized markets, and the simulation of chemical reactions.

On the one hand, the complexity and size of the applications require a user-friendly modeling environment which speaks the language of the modeler and offers the possibility to test and compare several scenarios and instances of the model data efficiently. Moreover, the modeler does not want to be involved in the often cumbersome interfacing with solvers. On the other hand, modern numerical techniques are required to solve ill-conditioned IDE systems of high dimension. How to meet these requirements is studied by integrating the novel IDE solver PSIDE, developed at CWI, into the advanced modeling environment AIMMS (Advanced Interactive Multi-dimensional Modeling System), a product of PDT.

If the time scales of the various solution components vary greatly, and if the rapidly changing components are physically irrelevant, we call the problem stiff. For example, if both high- and low-frequency signals are present in an electrical circuit, but the high-frequency signals are small in magnitude, then the modeling of such a circuit gives rise to a stiff system of IDEs. To solve such IDEs, an implicit method is required, which means that the numerical approximations are not directly available but have to be computed from nonlinear systems. This computation requires the evaluation of the Jacobians of the IDEs with respect to the unknowns. In AIMMS these Jacobians are available in analytical form.
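The mechanism of an implicit method – each step solving a nonlinear system by Newton iteration, using the Jacobians of F with respect to y and y' – can be sketched for a scalar IDE F(t, y, y') = 0 with the implicit Euler method (our illustrative code, not PSIDE's algorithm):

```python
def implicit_euler_ide(F, dFdy, dFdyp, y0, t0, t1, n):
    """Integrate a scalar IDE F(t, y, y') = 0 with implicit Euler.
    Each step solves F(t, y, (y - y_prev)/h) = 0 for y by Newton
    iteration, using the Jacobians of F with respect to y and y' --
    the quantities that AIMMS supplies in analytical form."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        t += h
        y_prev = y
        y_new = y_prev  # initial Newton guess: previous value
        for _ in range(20):
            yp = (y_new - y_prev) / h
            r = F(t, y_new, yp)
            if abs(r) < 1e-12:
                break
            # Newton matrix for implicit Euler: dF/dy + (1/h) dF/dy'
            slope = dFdy(t, y_new, yp) + dFdyp(t, y_new, yp) / h
            y_new -= r / slope
        y = y_new
    return y

# Stiff test problem in implicit form: y' + 100 y = 0, y(0) = 1.
y_end = implicit_euler_ide(lambda t, y, yp: yp + 100 * y,
                           lambda t, y, yp: 100.0,
                           lambda t, y, yp: 1.0,
                           1.0, 0.0, 1.0, 10)
# The exact solution exp(-100 t) decays to essentially zero; implicit
# Euler stays stable even with the large step h = 0.1, where an
# explicit method would blow up.
```

PSIDE adds, among much else, variable stepsize, error control and parallelism; the Newton-plus-Jacobian core is the part the article describes.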

If some solution components of an IDE are more sensitive to perturbations than others, the IDE is said to be of higher index. In order to integrate IDEs numerically with variable stepsize, one usually estimates a local error, in which the index has to be taken into account. Existing IDE solvers do not have a facility to compute the index. An automatic index determination facility for PSIDE, which uses the analytical Jacobian available in AIMMS, is currently under development.

The IDE solver has to know not only the values of all variables at the start of the integration interval, but also their derivatives. The latter, especially, are in practice often unknown to the modeler and have to be computed from an – often nonlinear – system of algebraic equations. A powerful commercial nonlinear solver available in AIMMS is CONOPT, which we used successfully to compute the missing initial values. For higher index problems, finding initial values is even more complicated, because they have to satisfy the differentiated equations as well. Based on the index determination facility and the capability of AIMMS to differentiate equations automatically, we are working on an automatic procedure for finding initial values for higher index problems.
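Finding the missing initial derivatives amounts to solving an algebraic system at the initial time. A scalar sketch of the idea follows (in practice CONOPT handles full nonlinear systems; this toy Newton iteration is ours):

```python
def initial_derivative(F, dFdyp, t0, y0, yp_guess=0.0, tol=1e-12):
    """Solve F(t0, y0, y') = 0 for the unknown initial derivative y'
    by Newton iteration -- a scalar stand-in for what a nonlinear
    solver does for full systems."""
    yp = yp_guess
    for _ in range(50):
        r = F(t0, y0, yp)
        if abs(r) < tol:
            break
        yp -= r / dFdyp(t0, y0, yp)
    return yp

# Example IDE: y' + y**3 - 9 = 0 with y(0) = 2  =>  y'(0) = 1.
yp0 = initial_derivative(lambda t, y, yp: yp + y**3 - 9,
                         lambda t, y, yp: 1.0,
                         0.0, 2.0)
# yp0 == 1.0
```

For higher index problems the same idea applies, except that the differentiated equations must be added to the system being solved.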

One problem in the CWI Test Set is the Transistor Amplifier, whose circuit diagram is shown in the Figure. The task is to compute the behaviour over time of the voltages in the nodes and the currents in the wires. There are several symbolic equations, such as Kirchhoff's law, and for every type of electrical component there is an equation describing its working. These equations are independent of the specific electrical circuit: to simulate another circuit, one only has to adapt the model data. The application shown in the figure is clickable. One can play the movie of the circuit's working and select the wires and nodes of which one wants to see more information. As additional information, the maximum of the currents over all wires is displayed. The ‘<<’, ‘<’, ‘>’ and ‘>>’ buttons serve to step through the movie.

■Please contact:

Jacques de Swart – CWI/PDT
Tel: +31 20 592 4176
E-mail: [email protected], [email protected]

Simulating the working of a transistor amplifier using AIMMS and PSIDE.




Multilevel Forecasting improves Corporate Planning and Operations
by Ilkka Karanta

Forecasting involves estimating future values of a process that is at least partially uncontrollable. Examples range from the weather to election results and stock prices. An important subfield is sales forecasting, where one tries to forecast the sales of products usually targeted at the consumer market. Although the different actors in the supply chain – producers, wholesale companies and retail companies – all have ways of affecting sales, such as marketing and pricing, important uncertainty factors remain, due eg to consumer behavior and the actions of competitors.

Forecasts are needed at all levels of supply chain planning. At the operational level, logistical decisions such as inventory levels and transportation schedules and routes are affected by sales forecasts. At the tactical level, yearly budgeting decisions are affected by estimated sales for the next year. And at the strategic level, investment decisions are affected by estimates of regional demand for products.

A pervasive feature of sales forecasting problems is that forecasts are needed for different hierarchies, and at different levels in these hierarchies. For example, product-level forecasts are needed by marketing; product-group level forecasts are needed in budgeting; forecasts by region and product are needed in logistical decisions; and forecasts by customer and product are needed by customer-relations management staff. On the other hand, forecasts are needed for different time spans (a year, a quarter, or a month) and for different sample rates (monthly, weekly or daily data). These forecasts should be consistent, and, in the best case, the models and data for each process should support the accuracy of forecasting in all the processes.

VTT Information Technology is doing research on automated modeling and hierarchical forecasting. In automated modeling, the emphasis is on selecting both an appropriate model class for a forecasting task (such as ARIMA, regression, exponential smoothing or neural networks) and an optimal model structure within that class. In hierarchical forecasting, the emphasis is on coordinating different models for different hierarchy levels so that forecasts are consistent and accuracy is improved.
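One simple way to make forecasts at two hierarchy levels consistent is bottom-up reconciliation: forecast each bottom-level series and define the upper-level forecast as their sum. The sketch below is a toy illustration, not VTT's system; the sales figures, the `ses_forecast` smoother and the single product group are all invented for the example.

```python
import numpy as np

# Invented monthly sales history for three products in one product group.
history = {
    "A": np.array([10., 12., 11., 13., 12., 14.]),
    "B": np.array([20., 19., 21., 22., 21., 23.]),
    "C": np.array([5., 6., 5., 7., 6., 8.]),
}

def ses_forecast(y, alpha=0.3):
    """One-step-ahead forecast by simple exponential smoothing."""
    level = y[0]
    for obs in y[1:]:
        level = alpha * obs + (1.0 - alpha) * level
    return level

# Bottom-up reconciliation: forecast each product series, then define
# the product-group forecast as the sum, so the two hierarchy levels
# agree by construction.
product_fc = {name: ses_forecast(y) for name, y in history.items()}
group_fc = sum(product_fc.values())
print(product_fc, group_fc)
```

More sophisticated reconciliation schemes instead fit models at several levels and adjust them towards each other, which can also improve accuracy.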

New product forecasting, and the effects of new and discontinued products and customers on the items in the hierarchies, have also been focal points of the research.

As a result, a forecasting system is under development where forecasts for different levels in hierarchies and different sample rates can be made. Some highlights of the system:

• multiple hierarchies are supported: the user can define for which nodes or levels the forecasts are needed in different hierarchies (eg product hierarchy, customer hierarchy, region hierarchy), and the forecasts at different levels are consistent with each other

• formulation of statistical models is automatic; the end user doesn't need to know anything about statistical models

• several classes of statistical models are supported, and adding new classes is simple

• external software components are used in model parameter estimation and some other statistical calculations

• the system is platform-independent

• the system is object-oriented and written in Java.
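A plug-in architecture of the kind described, where adding a new model class is simple, can be sketched with a small abstract interface. The system itself is written in Java; the Python sketch below is only illustrative, and the `ForecastModel` and `NaiveModel` names are invented.

```python
from abc import ABC, abstractmethod

class ForecastModel(ABC):
    """Hypothetical plug-in interface a new model class would implement."""

    @abstractmethod
    def fit(self, y):
        """Estimate model parameters from the series y; return self."""

    @abstractmethod
    def forecast(self, horizon):
        """Return point forecasts for the next `horizon` steps."""

class NaiveModel(ForecastModel):
    """Simplest possible model class: repeat the last observation."""

    def fit(self, y):
        self.last = y[-1]
        return self

    def forecast(self, horizon):
        return [self.last] * horizon

fc = NaiveModel().fit([3, 5, 4]).forecast(2)
print(fc)  # [4, 4]
```

Because every model class presents the same `fit`/`forecast` interface, the surrounding system can select among ARIMA, regression, smoothing or neural-network classes without knowing their internals.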

The system has been installed at a large Finnish company in the food manufacturing industry, and planning is under way for an implementation at a large Finnish wholesale company. Future plans include utilization of data in the whole supply chain, and the incorporation of subjective assessment as part of the forecasting process.

■Please contact:

Ilkka Karanta – VTT Information Technology
Tel: +358 9 456 4509
E-mail: [email protected]

Forecast for the sales of a cheese brand.



Page 37: [Finance] Financial Mathematics

Helping Company Founders: INRIA's Technological Outreach Strategy

by Bernard Larrouturou and Laurent Kott

The founding of companies stemming from INRIA started in 1984. Over fifteen years, thirty-seven companies have been created by researchers, engineers or young PhDs coming from INRIA's research teams. Twenty-six of them are still in activity today under their own name and together account for over one thousand highly qualified jobs.

INRIA's commitment to fostering the creation of technology companies has been reinforced in the last two years. In addition to various incentives and encouragement, INRIA now supports company founders through post-doctoral fellowships and through the incubator role played by its new subsidiary INRIA-Transfert. INRIA also played an essential role in setting up the I-Source Gestion company, which manages the first start-up fund in France devoted to the sector of Information and Communication Science and Technology (ICST). This commitment led to very satisfying results in 1998 and the beginning of 1999. Seven technology companies stemming from INRIA were created in just twelve months: Gene-IT, Liquid Market, Novadis Services, Polyspace Technologies, Realviz, Saphir-Control and Trusted Logic. These companies operate in fields as diverse as digital video special effects, software certification for smart cards, catalogue set-up for electronic commerce and genome sequence processing, to name a few. The skills and technological basis for several of these companies come from highly theoretical research. Two years ago, it was not always evident that such research had the potential required to enter the markets concerned today. Such company creations are thus very good examples of the breathtaking speed at which technological breakthroughs occur in ICST.


INRIA-Transfert

INRIA-Transfert is a company with a capital of 13.2 million euros, created in 1998 as a 100% subsidiary of INRIA. Its purpose is twofold: to play a major role in the setting up of a start-up fund in ICST, and to provide the reference structure for the incubation of innovative company projects in software-dominated information technology. On the start-up side, INRIA-Transfert is the reference stockholder in the newly established I-Source Gestion company (see below), but has no direct responsibility in the management of its start-up fund. Concerning incubation, the essential task is networking, that is to say setting up a network of professionals that INRIA-Transfert can call upon to detect, appraise and consolidate creation projects. The objective is to bring them to fruition in less than a year.

INRIA-Transfert is thus a structure that supports the project authors during the phase prior to the creation of the company itself. INRIA-Transfert provides help in the following steps:

• scientific appraisal of the project

• exchange of ideas to refine the project and narrow down the objectives

• search for financing

• market research

• verification that the final program proceeds as expected.

I-Source Gestion

The goal of this company is to finance the seed phase of technology start-ups stemming from public or private research, operating in the ICT markets. The first financing tool is a venture capital mutual fund with an amount of 15.2 million euros and a duration of 10 years. Funds come from the public sector (like the main stockholder INRIA-Transfert and the Caisse des Dépôts et Consignations) and private-sector subscriptions (institutional investors, in particular the insurance company AXA, and venture capital companies).

I-Source Gestion develops a genuine 'co-business' approach with the project authors it selects. Starting from a draft of the product, team and business plan, I-Source Gestion works with them on the formalisation of their business strategy, on the definition of the first objectives and on the finalisation of the financing plan. The goal is to make sure that the start-up takes off successfully so that the start-up fund can disengage itself when strong growth phases are reached. An active follow-up of the start-up is made possible through participation in the board of directors or the supervisory board. The planned average amount of an I-Source Gestion intervention is on the order of 450,000 euros in two instalments separated by 12 to 18 months. Project selection is extremely demanding at the marketing as well as the technical level, and I-Source Gestion does not hesitate to ask the advice of various experts, including INRIA scientists.

The first accomplished project that benefited from I-Source Gestion financing is the Polyspace Technologies start-up, which specialises in the development and marketing of verification and validation environments for on-board real-time software. The company was founded at the initiative of Daniel Pilaud – formerly at Verilog – and Alain Deutsch, an INRIA Rocquencourt researcher. Approximately twenty projects are currently under study. The objective of I-Source Gestion is to help start up some eight technology companies per year over the next five years.

For more information, see:

• I-Source Gestion: http://www.isourcegestion.fr/

• Industrial relations at INRIA: http://www.inria.fr/Partenariats/dev-rel-industrie-eng.html

■Please contact:

Laurent Kott – INRIA
Tel: +33 1 39 63 56 02
E-mail: [email protected]



Page 38: [Finance] Financial Mathematics


Joint Summer School on Extending Database Technology (EDBT)

by Serge Abiteboul and Nourredine Mouaddib

The EDBT (Extending Database Technology) school took place at La Baule-les-Pins, France, from 16 to 21 May 1999. The school, now a major and well-established event, was the fifth in a series of European biennial summer schools in databases. Lectures were given by leading researchers of the international database community.

The school was attended by 61 participants; together with the teachers and the organizers this brought the total to 75 persons. It should be noted that many teachers, as requested, attended most of the school. More than half of the participants were PhD students, and among the others the wide majority were rather young researchers. The courses, up to the last one, were fully attended. It is our impression that the students enjoyed the school and learnt a lot. Indeed, the PhD students requested an extra session so they would have more time to discuss the future of the field.


• Serge Abiteboul, INRIA: Electronic Commerce and Databases

• Georges Gardarin, Université de Versailles: Distributed Database Techniques: Architectures and Evolutions

• Gerti Kappel, Johannes Kepler University, Linz: UML at Work – Object-Oriented Software Development from Analysis to Implementation

• Mohamed Quafafou, Institut de Recherche en Informatique de Nantes: Knowledge Discovery

• Guido Moerkotte, University of Karlsruhe: Building Query Compilers

• C. Mohan, IBM Almaden: Workflow Management in the Internet Age

• Vassilis Christophides, FORTH: System Infrastructure for Digital Libraries: A Survey and Outlook

• Michel Scholl, Conservatoire National des Arts et Métiers, Paris: Spatio-temporal Databases

• Dennis Shasha, New York University: Time Series in Finance: the Array Database Approach

• Stefano Ceri, Politecnico di Milano: the Asilomar report and a discussion on the new directions of the field. The students requested an additional session to discuss the future of the field and led the discussion during that session. They are planning an answer to the Asilomar report in a report tentatively called the La Baule Report.

• Four PhD student sessions with short presentations of their work.

For more details, see: http://www-rocq.inria.fr/EDBT-school99
Students' presentations: http://www-rocq.inria.fr/EDBT-school99/students.html

■Please contact:

Serge Abiteboul – INRIA
Tel: +33 1 39 63 55 37
E-mail: [email protected]


5th ERCIM Workshop on User Interfaces for All

Dagstuhl, Germany, 28 November - 1 December 1999

Extended deadline for submitting papers: 6 September 1999.

The 5th ERCIM Workshop on 'User Interfaces for All' is on theories, methodologies, techniques and tools which contribute to the development of User-Tailored Information Environments.

The vision of User Interfaces for All advocates the proactive realisation of the design-for-all principle in the field of Human-Computer Interaction, and involves the development of user interfaces to interactive applications and telematic services which provide universal access and quality in use to potentially all users.

The invited speakers of this year's Workshop will be:

• Dr. Jon Gunderson, Coordinator of Assistive Communication and Information Technology, Division of Rehabilitation-Education Services, University of Illinois at Urbana-Champaign, USA

• Dr. Hans-Heinrich Bothe, Associate Professor at Örebro University, Sweden.

For more information, see: http://zeus.gmd.de/5-UI4ALL-Workshop/call.html or http://www.ics.forth.gr/ercim-wg-ui4all/index.html

■Please contact:

Constantine Stephanidis – ICS-FORTH
Chair, ERCIM Working Group 'User Interfaces for All'
Tel: +30 81 391741
E-mail: [email protected]

Michael Pieper – GMD-FIT
Local Organiser of the Workshop
Tel: +49 2241 14 2018
E-mail: [email protected]


Third European Conferenceon Research and Advanced Technology for Digital Libraries

Paris, 22-24 September 1999

After Pisa in 1997 and Heraklion in 1998, ECDL will take place in Paris at the prestigious location of the Bibliothèque Nationale de France. It is the third in a series of European conferences on research and technology for digital libraries, partially funded by the European TMR Programme. Its main objective is to bring together researchers from multiple disciplines to present their work on enabling technologies for digital libraries. The conference also provides an opportunity for scientists to develop a research community in Europe focusing on digital library development. More information, including the conference programme, the list of accepted papers, demos and tutorials as well as a registration form, is available on the conference web site at http://www-rocq.inria.fr/EuroDL99/

A special rate will be applied to registrations before 31 July 1999.

■Please contact:

Florence Balax – INRIA
Tel: +33 1 3963 5053
E-mail: [email protected]



Page 39: [Finance] Financial Mathematics

CLRC – AlphaGalileo, a new internet press centre, was formally launched by the UK's Minister for Science, Lord Sainsbury, at the British Association's Festival of Science at Cardiff University on 7th September. It was introduced in France on 7th October at the British Council, in the presence of Edouard Brezin, Chairman of the CNRS administrative council, and Sir Michael Jay, British Ambassador to France. It is a project run by the British Association for the Advancement of Science and supported by the Office of Science and Technology and a number of Research Councils. The project manager, Peter Green, is now based at CLRC Rutherford Appleton Laboratory. Over the next 18 months, AlphaGalileo will develop into a multi-language service for communicating the best of European research to journalists world-wide. The web address is: http://www.alphagalileo.org/

■ CWI – Piet Beertema, one of the Internet's founding fathers, was knighted on June 9 (Ridder in de Orde van de Nederlandse Leeuw) for his pioneering achievements in the development of the Internet in The Netherlands and in Europe. In the eighties a predecessor of the Internet was developed at CWI, in close cooperation with research institutes elsewhere in Europe, which served the research community in Europe. Beertema played a pivotal role here, through close contacts with colleagues in the USA. Later on he founded and managed the Internet Domain Registration for The Netherlands, which became a working model for registries abroad. When in 1995 the growth of the Internet made clear that registration as well as the handling of other aspects could no longer be a one-man business, the foundation Internet Domeinregistratie Nederland was created. When Beertema transferred his duties early in 1997 to this foundation (in which he keeps playing an important role), almost 10,000 names were registered (now 77,000).

SZTAKI – The Paks nuclear power plant becomes the first VVER 440 type plant to modernize its safety system using the most recent methods in control theory and I&C technology. Siemens and SZTAKI contributed to the modernization and refurbishment of the complete reactor protection system (RPS) of the Paks nuclear power plant, Hungary, over the past two years. By August 1999 the refurbishment of the first unit is to be completed. While Siemens provided a state-of-the-art distributed multi-processor-based I&C system, SZTAKI's team was entrusted with the complex task of the verification and validation (V&V) of the new system. The subject of V&V was the verification of hardware and software design both at the activity and at the document level. The verification of the testing methods provided by Siemens, as well as the elaboration of test cases independent from the supplier, were also parts of the project.

■ SZTAKI – A co-operation agreement has been signed between SZTAKI and the Faculty of Information Technology, Péter Pázmány Catholic University, Budapest, to establish an external Department on Information Techniques at SZTAKI and to take part in a Science-Technology-Education Center organized at the Faculty with research laboratories and leading companies. Other external university departments at SZTAKI are the Department of Decisions in Economy (Budapest University of Economic Sciences), the Department of Information Sciences (Eötvös Loránd University of Sciences, Budapest) and the Department of Integrated Production Information Systems (Technical University of Budapest). Furthermore, SZTAKI and the Faculty of Transportation, Technical University of Budapest, run a joint laboratory, the Dynamics and Control Systems Centre.

■ INRIA – Claude Samson, Research Director at INRIA Sophia Antipolis, has been awarded the 1999 Michel Monpetit award of the French Academy of Sciences for his remarkable contributions to the mechanical and mathematical modeling of complex robots (in particular, mobile robots and walking robots), as well as to the control and stabilization of such robots, which has been a source of difficulty in nonlinear mathematics.

■ INRIA – Odile Lausecker of INRIA's Multimedia Department of the Scientific Information and Communication Service received the Grand Prix of the Research Film Festival in the category 'Illustration of Research for Industry' (Nancy, March 1999). Her film 'Le Loria' is a guided tour of the Lorraine Research Laboratory in Computer Science and its Applications. The institutional aspect of the presentation was deliberately downplayed. A young actress gave the audience a new perspective of Loria by introducing six research teams and their industrial partners. INRIA received the prize for the second consecutive year.

■ VTT – Linus Torvalds, creator of the LINUX operating system, was the guest at a seminar aimed at young researchers and students on May 20, 1999, hosted by VTT together with the University of Helsinki. The highly interactive occasion covered subjects ranging from technological questions to philosophical issues. The event was held together by three motivational stages: survival, communication and entertainment. Even technology and its development passes through these stages, which is important to keep in mind when developing innovations.

The nuclear power plant in Paks,Hungary.



Page 40: [Finance] Financial Mathematics



Directeur de Publication: Jean-Eric Pin
Publication périodique réalisée par GEIE-ERCIM

ISSN 0926-4981

ERCIM Central Office:
Domaine de Voluceau, Rocquencourt
B.P. 105
F-78153 Le Chesnay Cedex
FRANCE
E-mail: [email protected]

The European Research Consortium for Informatics and Mathematics (ERCIM) is an organisation dedicated to the advancement of European research and development in the areas of information technology and applied mathematics. Through the definition of common scientific goals and strategies, its national member institutions aim to foster collaborative work within the European research community and to increase co-operation with European industry. To further these objectives, ERCIM organises joint technical Workshops and Advanced Courses, sponsors a Fellowship Programme for talented young researchers, undertakes joint strategic projects, and publishes workshop, research and strategic reports, as well as a newsletter.

ERCIM News is the in-house magazine of ERCIM. Published quarterly, the newsletter reports on joint actions of the ERCIM partners, and aims to reflect the contribution made by ERCIM to the European Community in Information Technology. Through short articles and news items, it provides a forum for the exchange of information between the institutes and also with the wider scientific community. ERCIM News has a circulation of 7,000 copies.

GMD – Forschungszentrum Informationstechnik GmbH
Schloß Birlinghoven
D-53754 Sankt Augustin
Tel: +49 2241 14 0
Fax: +49 2241 14 2889
http://www.gmd.de/

Consiglio Nazionale delle Ricerche
Via S. Maria, 46
I-56126 Pisa
Tel: +39 050 593 433
Fax: +39 050 554 342
http://www.iei.pi.cnr.it/

Centrum voor Wiskunde en Informatica
Kruislaan 413
NL-1098 SJ Amsterdam
Tel: +31 20 592 9333
Fax: +31 20 592 4199
http://www.cwi.nl/

Institut National de Recherche en Informatique et en Automatique
B.P. 105
F-78153 Le Chesnay
Tel: +33 1 39 63 5511
Fax: +33 1 39 63 5330
http://www.inria.fr/

Central Laboratory of the Research Councils
Rutherford Appleton Laboratory
Chilton, Didcot
GB-Oxon OX11 0QX
Tel: +44 1235 821900
Fax: +44 1235 44 5385
http://www.cclrc.ac.uk/

Swedish Institute of Computer Science
Box 1263
S-164 29 Kista
Tel: +46 8 633 1500
Fax: +46 8 751 7230
http://www.sics.se/

Stiftelsen for Industriell og Teknisk Forskning ved Norges Tekniske Høgskole
SINTEF Telecom & Informatics
N-7465 Trondheim
Tel: +47 73 59 30 00
Fax: +47 73 59 43 02
http://www.informatics.sintef.no/

Technical Research Centre of Finland
VTT Information Technology
P.O. Box 1200
FIN-02044 VTT
Tel: +358 9 456 6041
Fax: +358 9 456 6027
http://www.vtt.fi/

Foundation for Research and Technology – Hellas (FORTH)
Institute of Computer Science
P.O. Box 1385
GR-71110 Heraklion, Crete
Tel: +30 81 39 16 00
Fax: +30 81 39 16 01
http://www.ics.forth.gr/

Czech Research Consortium for Informatics and Mathematics

FI MU
Botanicka 68a
CZ-602 00 Brno
Tel: +420 2 6884669
Fax: +420 2 6884903
http://www.utia.cas.cz/CRCIM/home.html

Swiss Association for Research in Information Technology
Dept. Informatik
ETH-Zentrum
CH-8092 Zürich
Tel: +41 1 632 72 41
Fax: +41 1 632 11 72
http://www-dbs.inf.ethz.ch/sarit/

Slovak Research Consortium for Informatics and Mathematics
Dept. of Computer Science, Comenius University
Mlynska Dolina M
SK-84215 Bratislava
Tel: +421 7 726635
Fax: +421 7 727041

Danish Consortium for Information Technology
DANIT c/o CIT
Aabogade 34
DK-8200 Aarhus N
Tel: +45 8942 2440
Fax: +45 8942 2443
http://www.cit.dk/ERCIM/


Central Editor: Peter Kunz

Local Editors:
Gabriela Andrejkova (SRCIM)
Erzsébet Csuhaj-Varjú (SZTAKI)
Truls Gjestland (SINTEF)
Michal Haindl (CRCIM)
Kersti Hedman (SICS)
Bernard Hidoine (INRIA)
Pia-Maria Linden-Linna (VTT)
Siegfried Münch (GMD)
Henk Nieland (CWI)
Flemming Nielson (DANIT)
Carol Peters (CNR)
Martin Prime (CLRC)
Constantine Stephanidis (FORTH)

E-mail: [email protected]


Telephone: +33 1 3963 5040

+421 95 6221128 (SRCIM)
+36 1 209 6990 (SZTAKI)
+47 73 59 26 45 (SINTEF)
+420 2 6605 2350 (CRCIM)
+46 8633 1508 (SICS)
+33 1 3963 5484 (INRIA)
+358 0 456 4501 (VTT)
+49 2241 14 2255 (GMD)
+31 20 592 4092 (CWI)
+45 89 42 33 63 (DANIT)
+39 050 593 429 (CNR)
+44 1235 44 6555 (CLRC)
+30 81 39 17 41 (FORTH)

You can subscribe to ERCIM News free of charge by: sending e-mail to your local editor; posting paper mail to the address above; or filling out the form at the ERCIM web site at http://www.ercim.org/

Magyar Tudományos Akadémia – Számítástechnikai és Automatizálási Kutató Intézete
P.O. Box 63
H-1518 Budapest
Tel: +36 1 466 5644
Fax: +36 1 466 7503
http://www.sztaki.hu/