
Semiparametric hypotheses testing for dynamical systems with small noise

S. M. IACUS – YU. A. KUTOYANTS

Working Paper n. 00.02 – September 2000

Dipartimento di Economia Politica e Aziendale Università degli Studi di Milano Via Conservatorio, 7 20122 Milano Tel: ++39 02 76074501 Fax: ++39 02 76009695

Semiparametric hypotheses testing for dynamical systems with small noise

S. M. Iacus
Dipartimento di Economia Politica e Aziendale

Università di Milano, Via del Conservatorio, 7

20122 Milano, Italy

[email protected]

Yu. A. Kutoyants
Département de Mathématiques

Université du Maine, Avenue O. Messiaen

72085 Le Mans, Cedex 9, France

[email protected]

September 18, 2000

Abstract

A problem of hypothesis testing is considered for a model of diffusion process with diffusion coefficient tending to zero. A simple hypothesis and a semiparametric alternative are formulated in terms of the values of some functional of the nonperturbed trajectory of the dynamical system. The test proposed is locally asymptotically uniformly most powerful for a class of contiguous alternatives.

Key words: minimax test, diffusion processes, contiguous hypotheses.

1 Introduction

We are given an observation Xε = {Xt, 0 ≤ t ≤ T} (in continuous time) of the trajectory of a diffusion process, solution of the stochastic differential equation

dXt = S(Xt) dt + ε dWt, X0 = x0, 0 ≤ t ≤ T (1)


where {Wt, 0 ≤ t ≤ T} is the Wiener process, x0 is a non-random initial value and the trend coefficient S(·) is an unknown function satisfying the Lipschitz condition

|S(x)− S(y)| ≤ L |x− y| , L > 0. (2)

The diffusion coefficient ε2 ∈ (0, 1] is supposed to be known, and we consider a hypotheses testing problem in the asymptotics of small noise, i.e., as ε → 0. Condition (2) guarantees the existence and uniqueness of the solution of the equation (1) (see, e.g., [12]); moreover, it provides the uniform in t ∈ [0, T] convergence of the stochastic solution {Xt, 0 ≤ t ≤ T} of (1) to the deterministic solution {xt, 0 ≤ t ≤ T} of the following (limiting, nonperturbed) dynamical system

dxt/dt = S(xt), x0, 0 ≤ t ≤ T, (3)

as ε → 0. The model (1) is usually considered as a dynamical system perturbed by a small white Gaussian noise. Several problems of parametric and nonparametric estimation for such models are discussed in [8].
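
To make the small-noise convergence concrete, here is a minimal Python sketch (an addition, not part of the paper): it simulates (1) with an Euler-Maruyama scheme and compares the path with an Euler approximation of the limiting system (3). The drift S(x) = 1 + 0.5 sin(x) and all numerical constants are arbitrary illustrative choices.

    import numpy as np

    def euler_maruyama(S, x0, T, eps, n, rng):
        """Euler-Maruyama scheme for dX_t = S(X_t) dt + eps dW_t, X_0 = x0, on [0, T]."""
        dt = T / n
        X = np.empty(n + 1)
        X[0] = x0
        for k in range(n):
            X[k + 1] = X[k] + S(X[k]) * dt + eps * rng.normal(0.0, np.sqrt(dt))
        return X

    def euler_ode(S, x0, T, n):
        """Euler scheme for the limiting dynamical system dx_t/dt = S(x_t), x_0 = x0."""
        dt = T / n
        x = np.empty(n + 1)
        x[0] = x0
        for k in range(n):
            x[k + 1] = x[k] + S(x[k]) * dt
        return x

    S = lambda x: 1.0 + 0.5 * np.sin(x)   # illustrative Lipschitz drift, not taken from the paper
    x0, T, n = 1.0, 5.0, 5000
    rng = np.random.default_rng(0)
    x = euler_ode(S, x0, T, n)
    for eps in (0.5, 0.1, 0.01):
        X = euler_maruyama(S, x0, T, eps, n, rng)
        print(f"eps = {eps:4.2f}   sup_t |X_t - x_t| ~ {np.max(np.abs(X - x)):.4f}")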

Note that the solution x = {xt, 0 ≤ t ≤ T} of the equation (3) is completely determined by the function S(·) and the initial value x0, so we can use the notation x(S) = {xt(S), 0 ≤ t ≤ T} to indicate the dependence.

We are interested in verifying a simple hypothesis concerning the value of the functional

ϑ(S) = ∫_0^T G(xt(S)) dt, (4)

against a class of one-sided alternatives, i.e., the hypothesis H0 : ϑ = ϑ0

against H1 : ϑ > ϑ0, where the values of ϑ under the alternative are defined as {ϑ(S + εh), h(·) ∈ H} with a specially chosen (nonparametric) class H of functions h(·). The function G(·) is supposed to be known and continuously differentiable with a positive derivative.

This problem of hypothesis testing is interesting for the following reason. Suppose that one is interested in the value of the energy of a signal {xt, 0 ≤ t ≤ T} of the nonperturbed system (3) from the observations (1); then he (or she) has to consider the integral

ϑ = ∫_0^T xt^2 dt,

which is a particular case of (4).
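
As an added illustration, this energy functional (the case G(x) = x^2 of (4)) can be approximated from the nonperturbed trajectory by an Euler scheme and a trapezoidal rule; the drift used below is again an assumed choice, not one specified in the paper.

    import numpy as np

    def functional_theta(S, G, x0, T, n=10000):
        """theta(S) = int_0^T G(x_t(S)) dt for dx_t/dt = S(x_t), x_0 = x0 (Euler + trapezoid)."""
        dt = T / n
        x = np.empty(n + 1)
        x[0] = x0
        for k in range(n):
            x[k + 1] = x[k] + S(x[k]) * dt
        g = G(x)
        return np.sum(0.5 * (g[:-1] + g[1:])) * dt

    # the "energy" example: G(x) = x^2, with an illustrative drift
    S = lambda x: 1.0 + 0.5 * np.sin(x)
    print("theta =", functional_theta(S, lambda x: x ** 2, x0=1.0, T=5.0))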


The construction of the asymptotically optimal test is based on the traditional minimax approach, i.e., we replace the composite alternative by an ordered set of smaller alternatives, then we find the worst alternative in this restricted class of alternatives and use the Neyman-Pearson lemma to find the best test. This is one more realization of the idea of the local asymptotic minimax approach for semiparametric models initiated by B. Levit [11]; see as well the monographs by I. Ibragimov and R. Khasminskii [2] and P. Bickel et al. [1].

A similar (in some sense) problem of asymptotically optimal minimax estimation of the functional (4) for the same model of observations (1) was considered in [9]. See as well some generalizations of these results in [3] and [4].

The theory of asymptotically efficient nonparametric tests can be found in the monograph by Ya. Nikitin [13]. Our statement is close to hypotheses testing for semiparametric models, which has been developed by J. Pfanzagl and W. Wefelmeyer [14] and more recently by A. Janssen ([5], [6]); see [6] for further references.

The paper is organized as follows. In Section 2 we recall a problem of parametric hypotheses testing for diffusion processes with small diffusion coefficient (simple hypothesis against a one-sided parametric contiguous alternative) and (following G. Roussas [15]) construct a locally asymptotically uniformly most powerful test. In Section 3 we define a locally asymptotically uniformly most powerful (LAUMP) test for a semiparametric contiguous alternative and present a test that is optimal in this sense. We call this problem semiparametric (as usual in such situations) because the model is known (under the alternative) up to the value of some function (here S(·)), but the problem is formulated in terms of hypotheses testing for a one-dimensional parameter (here ϑ).

2 Testing of contiguous parametric hypotheses

Suppose that we observe the trajectory of a diffusion process Xε, solution of the following stochastic differential equation

dXt = S(ϑ, Xt) dt + ε dWt, t ∈ [0, T ], (5)

with non-random initial value X0 = x0 and ϑ ≥ ϑ0, where ϑ0 is some known value. The process Xε with different ϑ induces, in the measurable space (CT, BT) of continuous functions on [0, T] with its Borel σ-field, the family of measures {P^(ε)_ϑ, ϑ ≥ ϑ0}. We denote by Eϑ the expectation with respect to the measure P^(ε)_ϑ, and by Ṡ(ϑ, x) the partial derivative of S(ϑ, x) with respect to ϑ. We suppose that the function S(ϑ, x) satisfies the condition (2), so the solution of (5) converges, as ε → 0, to the solution of the equation

dx^ϑ_t/dt = S(ϑ, x^ϑ_t), x^ϑ_0 = x0. (6)

We will denote by {x^0_t, 0 ≤ t ≤ T} the solution of (6) corresponding to S(ϑ0, ·).

Condition A. The function S(ϑ, x) is continuously differentiable in ϑ at the point ϑ = ϑ0 and its derivative Ṡ(ϑ, x) is continuous with respect to both arguments.

We are interested in the following problem of hypotheses testing

H0 : ϑ = ϑ0 versus H1 : ϑ > ϑ0.

This is a problem of testing one simple hypothesis against a one-sided composite alternative.

A sequence of tests (or simply a test) is completely characterized by a measurable function φε : CT → [0, 1] called the critical function (see, e.g., E. Lehmann [10], p. 62). The value of φε(X) is the probability of rejecting H0 given the observation Xε. A test is said to be of asymptotic level α if

lim_{ε→0} E^(ε)_{ϑ0} φε(X) ≤ α.

We denote by Kα the class of critical functions associated with the tests of the same asymptotic level α.

In this section we consider the contiguous alternatives, which can be introduced with the help of the change of variables ϑ = ϑ0 + εu. The hypotheses testing problem can then be reformulated as follows:

H0 : u = 0, H1 : u > 0 .

The power function of a test based on φε is defined by

βε(φε, u) = E^(ε)_{ϑ0+εu} φε(X), u > 0.


Definition 1. In this setting, a test based on φ∗ε ∈ Kα is locally asymptotically uniformly most powerful (LAUMP) in the class Kα if, for any k > 0 and any other φε ∈ Kα, its critical function satisfies

lim_{ε→0} inf_{0<u<k} {βε(φ∗ε, u) − βε(φε, u)} ≥ 0. (7)

We construct the LAUMP test following G. Roussas [15]: first we show that under regularity conditions the family of measures {P^(ε)_ϑ, ϑ > ϑ0} is LAN at the point ϑ = ϑ0, and then we use the general result concerning such tests (see [15], Theorem 4.3.1).

Introduce the Fisher information

I(ϑ) = ∫_0^T Ṡ(ϑ, x^ϑ_t)^2 dt

and the statistic

∆ε(X) = (1/ε) ∫_0^T Ṡ(ϑ0, x^0_t) [dXt − S(ϑ0, Xt) dt].

Below, χ{A} is the indicator of the set A and z1−α is the 1 − α quantile of the standard Gaussian law.

Theorem 1. Let Condition A be fulfilled and I(ϑ0) > 0; then the test based on the critical function

φ∗ε(X) = χ{∆ε(X) > z1−α I(ϑ0)^{1/2}}

is LAUMP in the class Kα.
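
Before the proof, here is a minimal numerical sketch of this test (an addition, not part of the original argument). The parametric drift S(ϑ, x) = −ϑx, for which Ṡ(ϑ, x) = −x and x^0_t = x0 e^{−ϑ0 t}, as well as all numerical constants, are assumptions chosen only for illustration.

    import numpy as np

    def simulate(theta, x0, T, eps, n, rng):
        """Euler-Maruyama path of dX_t = -theta*X_t dt + eps dW_t (assumed drift S(theta, x) = -theta*x)."""
        dt = T / n
        X = np.empty(n + 1)
        X[0] = x0
        for k in range(n):
            X[k + 1] = X[k] - theta * X[k] * dt + eps * rng.normal(0.0, np.sqrt(dt))
        return X

    def laump_test(X, theta0, x0, T, eps, z=1.6449):
        """Reject H0: theta = theta0 when Delta_eps(X) > z_{1-alpha} I(theta0)^{1/2} (alpha = 0.05)."""
        n = len(X) - 1
        dt = T / n
        x_det = x0 * np.exp(-theta0 * np.linspace(0.0, T, n + 1))   # nonperturbed solution x^0_t
        Sdot = -x_det[:-1]                                          # derivative of S in theta along x^0
        I0 = np.sum(Sdot ** 2) * dt                                 # Fisher information I(theta0)
        incr = np.diff(X) + theta0 * X[:-1] * dt                    # dX_t - S(theta0, X_t) dt
        Delta = np.sum(Sdot * incr) / eps                           # the statistic Delta_eps(X)
        return Delta > z * np.sqrt(I0)

    rng = np.random.default_rng(1)
    theta0, x0, T, eps, n = 1.0, 2.0, 3.0, 0.05, 3000
    print("reject under H0:", laump_test(simulate(theta0, x0, T, eps, n, rng), theta0, x0, T, eps))
    print("reject under H1:", laump_test(simulate(theta0 + 5 * eps, x0, T, eps, n, rng), theta0, x0, T, eps))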

Proof. Note that under hypothesis H0 the random variable ∆ε(X) is Gaussian, i.e.,

L_{ϑ0}{∆ε(X)} = N(0, I(ϑ0)),

because with P^(ε)_{ϑ0}-probability one we have

∆ε(X) = ∫_0^T Ṡ(ϑ0, x^0_t) dWt ∼ N(0, I(ϑ0)).

Therefore E_{ϑ0} φ∗ε(X) = α for all ε ∈ (0, 1]. Under Condition A the measures {P^(ε)_ϑ, ϑ > ϑ0} are absolutely continuous with respect to P^(ε)_{ϑ0} [12], and for each u ≥ 0 the likelihood ratio admits (with P^(ε)_{ϑ0}-probability one) the following representation

dP^(ε)_{ϑ0+uε}/dP^(ε)_{ϑ0} (X) = exp{ (1/ε) ∫_0^T [S(ϑ0 + εu, Xt) − S(ϑ0, Xt)] dWt − (1/(2ε^2)) ∫_0^T [S(ϑ0 + εu, Xt) − S(ϑ0, Xt)]^2 dt }
= exp{ u ∆ε(X, ϑ0) − (1/2) u^2 I(ϑ0) + rε(u, X, ϑ0) }, (8)

where, for any bounded sequence uε, the random variables rε(uε, X, ϑ0) converge to zero under P^(ε)_{ϑ0}. Indeed, for the ordinary integral we have

(1/ε^2) ∫_0^T [S(ϑ0 + εuε, Xt) − S(ϑ0, Xt)]^2 dt = uε^2 ∫_0^T Ṡ(ϑ0 + γt εuε, Xt)^2 dt = uε^2 ∫_0^T Ṡ(ϑ0, x^0_t)^2 dt + uε^2 o(1),

where we used the convergence sup_{0≤t≤T} |Xt − x^0_t| → 0 and the continuity of the function Ṡ(ϑ, x); here γt ∈ [0, 1]. Similarly, it can be shown that for the stochastic integral we have

(1/ε) ∫_0^T [S(ϑ0 + εuε, Xt) − S(ϑ0, Xt)] dWt = uε ∆ε(X) + uε o(1).

Therefore the representation (8) is valid and rε(uε, X, ϑ0) → 0 for any bounded sequence uε. Now we can cite Corollary 3.1 of Theorem 4.3.1 by G. Roussas [15], and this proves the theorem. This means that the decision rule:

to accept H0 if ∆ε(X) ≤ z1−α I(ϑ0)^{1/2},

to accept H1 if ∆ε(X) > z1−α I(ϑ0)^{1/2}

is the LAUMP test in this problem. Note that such tests are discussed in [7] (see Proposition 1 and Example).

Remark. We have assumed that u is bounded by some constant k in order to have the optimality of the test in the sense (7). The class of alternatives can easily be extended to 0 < u < λε, where λε → ∞ but ελε → 0, i.e., the test proposed is optimal in the following sense as well:

lim_{ε→0} inf_{0<u<λε} {βε(φ∗ε, u) − βε(φε, u)} ≥ 0. (9)


Indeed, in this case under the alternative we have

∆ε(X) = ∆ + (1/ε) ∫_0^T Ṡ(ϑ0, x^0_t) [S(ϑ, Xt) − S(ϑ0, Xt)] dt
= ∆ + (1/ε) ∫_0^T Ṡ(ϑ0, x^0_t) [S(ϑ, x^ϑ_t) − S(ϑ0, x^ϑ_t)] dt + o(1)
= ∆ + (1/ε) I(ϑ0) + o(1) → ∞,

where ∆ = ∫_0^T Ṡ(ϑ0, x^0_t) dWt, as ε → 0, and condition A5 of Roussas [15], Section 4.1, is fulfilled.

For the wider class of alternatives (without the condition ελε → 0) we have to suppose that for all δ > 0

inf_{ϑ−ϑ0>δ} ∫_0^T Ṡ(ϑ0, x^0_t) [S(ϑ, x^ϑ_t) − S(ϑ0, x^ϑ_t)] dt > 0.

Then the condition ∆ε(X) → ∞ under the alternatives ϑ > ϑ0 will be fulfilled, and according to Theorem 4.3.1 by G. Roussas [15] the test proposed in Theorem 1 is asymptotically most powerful against all alternatives ϑ > ϑ0, i.e.,

lim_{ε→0} inf_{u>0} {βε(φ∗ε, u) − βε(φε, u)} ≥ 0. (10)
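
A rough Monte Carlo check of the level and of the power against a contiguous alternative can be coded as follows; this is again an added sketch, with the same assumed drift S(ϑ, x) = −ϑx as in the illustration after Theorem 1.

    import numpy as np

    def reject(theta_true, theta0, x0, T, eps, n, rng, z=1.6449):
        """One replication: simulate under theta_true, apply the test of Theorem 1 at level 0.05."""
        dt = T / n
        x_det = x0 * np.exp(-theta0 * np.linspace(0.0, T, n + 1))   # x^0_t under theta0
        Sdot = -x_det[:-1]                                          # dS/dtheta along x^0
        I0 = np.sum(Sdot ** 2) * dt
        X = np.empty(n + 1)
        X[0] = x0
        for k in range(n):
            X[k + 1] = X[k] - theta_true * X[k] * dt + eps * rng.normal(0.0, np.sqrt(dt))
        Delta = np.sum(Sdot * (np.diff(X) + theta0 * X[:-1] * dt)) / eps
        return Delta > z * np.sqrt(I0)

    rng = np.random.default_rng(4)
    theta0, x0, T, eps, n, M = 1.0, 2.0, 3.0, 0.05, 1000, 400
    level = np.mean([reject(theta0, theta0, x0, T, eps, n, rng) for _ in range(M)])
    power = np.mean([reject(theta0 + 3 * eps, theta0, x0, T, eps, n, rng) for _ in range(M)])
    print(f"empirical level ~ {level:.3f} (nominal 0.05); empirical power at u = 3 ~ {power:.3f}")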

3 Testing of contiguous semiparametric hypotheses

We now try to extend these results to a nonparametric setup. The observed process is

dXt = S(Xt) dt + ε dWt, X0 = x0, 0 ≤ t ≤ T, (11)

where S(·) is some smooth function.

Let us fix a function S0(·) and suppose that it is continuously differentiable, its derivative is bounded by a constant L > 0 and S0(x0) > 0. Further, we denote by {xt(S), 0 ≤ t ≤ T} the solution of the ordinary differential equation

dxt/dt = S(xt), x0,

and we write {x^0_t, 0 ≤ t ≤ T} if S(·) = S0(·). Introduce the functional ϑ(S) as follows:

ϑ = ϑ(S) = ∫_0^T G(xs(S)) ds,


where G(·) ∈ {f(·) ∈ C1 : f(·) > 0, f′(·) > 0} is a known function, and put ϑ0 = ϑ(S0).

We consider the problem of testing the hypothesis

H0 : ϑ = ϑ0

against the composite nonparametric contiguous alternative

H1 : ϑ = ϑ(S0 + εh), h ∈ H,

where H is a set of functions defined as follows:

H = {h(·) : h(·) ∈ C(L), h(x0) > 0}.

The process in (11) induces on the measurable space (CT, BT) of continuous functions on [0, T] the measures P^(ε)_0 and P^(ε)_h under H0 and H1, respectively.

A test based on a critical function {φε}ε>0 is said to be of asymptotic level α if

lim_{ε→0} E^(ε)_0 φε(X) ≤ α,

where E^(ε)_0 is the expectation with respect to P^(ε)_0. We denote by Kα the class of critical functions associated with the tests of the same asymptotic level α. The power function of a test based on φε is then defined as

βε(φε, h) = E^(ε)_h φε(X), h ∈ H,

where E^(ε)_h is the expectation with respect to P^(ε)_h for h(·) ∈ H. For any fixed constant 0 < K < ∞, we define the sets

HK = { h(·) : h ∈ H, ∫_0^T [h(x^0_s)/S0(x^0_s)] [G(x^0_T) − G(x^0_s)] ds ≤ K },

and note that H = ⋃_{K>0} HK.

The constant K plays a role similar to that of k in the definition (7), in the following sense.

Definition 2. A locally asymptotically uniformly most powerful (LAUMP) test in the class Kα in the problem of testing

H0 : ϑ = ϑ(S0)

H1 : ϑ = ϑ(S0 + εh), h ∈ H,


is a test based on {φ∗ε}ε>0 ∈ Kα such that for any K > 0 it holds

lim_{ε→0} inf_{h∈HK} {βε(φ∗ε, h) − βε(φε, h)} ≥ 0 (12)

for any other test with {φε}ε>0 ∈ Kα.

Roughly speaking, this testing problem corresponds to testing whether the observed trajectory is the solution of the stochastic differential equation

dXt = S0(Xt) dt + ε dWt, X0 = x0

against any other model

dXt = [S0(Xt) + εh(Xt)] dt + ε dWt, X0 = x0, h ∈ H.
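
The contiguity of these two models can be visualized with the short sketch below (an addition to the text): the functions S0 and h are arbitrary assumed choices with S0(x0) > 0 and h(x0) > 0, and both trajectories are driven by the same Brownian increments, so their difference is of order ε.

    import numpy as np

    x0, T, n, eps = 1.0, 5.0, 5000, 0.05
    S0 = lambda x: 1.0 + 0.5 * np.sin(x)    # assumed reference drift with S0(x0) > 0
    h = lambda x: 1.0 / (1.0 + x ** 2)      # assumed perturbation with h(x0) > 0

    rng = np.random.default_rng(2)
    dW = rng.normal(0.0, np.sqrt(T / n), size=n)   # common Brownian increments

    def euler_path(drift):
        """Euler-Maruyama path of dX_t = drift(X_t) dt + eps dW_t with the shared noise dW."""
        dt = T / n
        X = np.empty(n + 1)
        X[0] = x0
        for k in range(n):
            X[k + 1] = X[k] + drift(X[k]) * dt + eps * dW[k]
        return X

    X_null = euler_path(S0)                                 # trajectory under H0
    X_alt = euler_path(lambda x: S0(x) + eps * h(x))        # trajectory under the local alternative
    print("sup_t |X_t(H1) - X_t(H0)| ~", np.max(np.abs(X_alt - X_null)))   # of order eps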

We introduce the following quantities, which are useful for describing explicitly the rejection region of the optimal test:

κ(S0) = ∫_0^T ( [G(x^0_T) − G(x^0_s)] / S0(x^0_s) )^2 ds = ∫_{x0}^{x^0_T} [G(x^0_T) − G(y)]^2 / S0(y)^3 dy (13)

and

∆∗ε(X) = (1/ε) ∫_0^T [G(Xt) − G(x^0_t)] dt.

These quantities can be calculated under hypothesis H0.

Theorem 2. The test φ∗ε (X) based on the critical function

φ∗ε(X) = χ{∆∗ε(X) > z1−α κ(S0)^{1/2}}

is LAUMP in the class Kα. Here z1−α is the 1 − α quantile of the standard Gaussian distribution.
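
As an added illustration of this test (not part of the paper), the quantities κ(S0) and ∆∗ε(X) can be computed from a discretized observation. The drift S0, the functional G(x) = x^2 and the perturbation h defining the contiguous alternative are assumptions chosen only for demonstration.

    import numpy as np

    def nonperturbed(S0, x0, T, n):
        """Euler scheme for dx_t/dt = S0(x_t), x_0 = x0."""
        dt = T / n
        x = np.empty(n + 1)
        x[0] = x0
        for k in range(n):
            x[k + 1] = x[k] + S0(x[k]) * dt
        return x

    def semiparametric_test(X, S0, G, x0, T, eps, z=1.6449):
        """Reject H0 when Delta*_eps(X) > z_{1-alpha} kappa(S0)^{1/2} (alpha = 0.05)."""
        n = len(X) - 1
        dt = T / n
        x_det = nonperturbed(S0, x0, T, n)
        Gd = G(x_det)
        kappa = np.sum(((Gd[-1] - Gd[:-1]) / S0(x_det[:-1])) ** 2) * dt   # kappa(S0), cf. (13)
        delta_star = np.sum(G(X[:-1]) - Gd[:-1]) * dt / eps               # Delta*_eps(X)
        return delta_star > z * np.sqrt(kappa)

    S0 = lambda x: 1.0 + 0.5 * np.sin(x)   # assumed reference drift
    G = lambda x: x ** 2                   # the "energy" functional
    h = lambda x: 10.0 / (1.0 + x ** 2)    # assumed perturbation for the alternative S0 + eps*h
    x0, T, n, eps = 1.0, 5.0, 5000, 0.05
    rng = np.random.default_rng(3)
    dt = T / n
    for drift, label in ((S0, "under H0"), (lambda x: S0(x) + eps * h(x), "under H1")):
        X = np.empty(n + 1)
        X[0] = x0
        for k in range(n):
            X[k + 1] = X[k] + drift(X[k]) * dt + eps * rng.normal(0.0, np.sqrt(dt))
        print(label, "-> rejected:", semiparametric_test(X, S0, G, x0, T, eps))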

Proof. The proof is organized as follows. Fix a constant K and note that the problem (as ε → 0) is localized in the following sense: testing H0 against H1 with h ∈ HK is asymptotically equivalent to testing

H0 : ϑ = ϑ0

H1 : ϑ = ϑ0 + εδ + oε(1), 0 < δ < K,

where to each value of δ ∈ (0, K) there corresponds a set of functions h(·) ∈ Hδ ⊂ HK, with

Hδ = { h(·) : h ∈ H, ∫_0^T [h(x^0_s)/S0(x^0_s)] [G(x^0_T) − G(x^0_s)] ds = δ }.


Fix a function h∗(·) ∈ Hδ and consider the two simple hypotheses H0 and H1 with this h∗(·). Then, by the Neyman-Pearson lemma, we can construct a test φε that is optimal in Kα. Further, we find the worst function h∗(·) ∈ Hδ, and this gives us the test that is optimal in the minimax sense (within the class Hδ). The next step is to find the worst δ ∈ (0, K); the resulting test is again optimal by the Neyman-Pearson lemma, but for the worst alternative in the class HK.

Note as well that under hypothesis H0 the statistic ∆∗ε(X) is a Gaussian random variable, i.e.,

L_{ϑ0}{∆∗ε(X)} = N(0, κ(S0)).

Therefore the test φ∗ε (X) is of exact level α for all ε > 0.

For any h(·) ∈ HK, using the continuous differentiability of the functions G(·) and S(·), we can write the following expansion of ϑ^ε_h = ϑ(S0 + εh) in powers of ε:

ϑ^ε_h = ϑ0 + ε ∫_0^T G′(x^0_t) x^(1)_t dt + o(ε),

where G′(z) = dG(z)/dz and x^(1)_t is given by the equation

dx^(1)_t = S0′(x^0_t) x^(1)_t dt + h(x^0_t) dt, x^(1)_0 = 0.

Its solution is

x^(1)_t = S0(x^0_t) ∫_0^t [h(x^0_s)/S0(x^0_s)] ds.

Thus,

∫_0^T G′(x^0_t) S0(x^0_t) ∫_0^t [h(x^0_s)/S0(x^0_s)] ds dt = ∫_0^T [h(x^0_s)/S0(x^0_s)] ∫_s^T G′(x^0_t) S0(x^0_t) dt ds
= ∫_0^T [h(x^0_s)/S0(x^0_s)] [G(x^0_T) − G(x^0_s)] ds = δ.

Hence, for each K and h(·) ∈ HK, ϑ(S0 + εh) = ϑ0 + εδ + oε(1). Fix now a value of δ ∈ (0, K) and consider a function h∗(·) ∈ Hδ. Then for a fixed alternative given by this function h∗(·) we have to test two simple hypotheses:

H0 : h (·) = 0

H1 : h (·) = h∗ (·) .

By the LAN (local asymptotic normality) property of the model, the likelihood ratio can be written as (see, e.g., [8])

dP^(ε)_{h∗}/dP^(ε)_0 (X) = exp{ ∆ε(h∗, X) − (1/2) I(h∗) + rε(h∗, X) },


where

I(h∗) = ∫_0^T h∗(x^0_t)^2 dt,

is the Fisher information and the statistic

∆ε(h∗, X) = ε^{-1} ∫_0^T h∗(Xt) [dXt − S0(Xt) dt]

is asymptotically normal N(0, I(h∗)) under hypothesis H0, while under the alternative H1 it has the limit distribution N(I(h∗), I(h∗)). The quantity rε(h∗, X) converges to zero in probability.

The likelihood ratio is a monotone function of ∆ε(h∗, X), so the most powerful test can be based on this statistic, and the critical function of the test has the form (see, e.g., E. Lehmann [10], p. 68)

φ(X) = 1 if ∆ε(h∗, X) > c,
φ(X) = 0 if ∆ε(h∗, X) ≤ c,

where the threshold c is determined according to the condition φ(X) ∈ Kα. We have

αε(h∗) = P^(ε)_0 (∆ε(h∗, X) > c)

and

βε(h∗) = P^(ε)_{h∗} (∆ε(h∗, X) > c).

From the asymptotic normality (under the hypothesis H0) of the statistic ∆ε(h∗, X) we obtain the equality

lim_{ε→0} αε(h∗) = 1 − Φ(c/√I(h∗)) = α,

where Φ(x) is the cumulative distribution function of the standard Gaussian random variable. So c must be such that c = z1−α √I(h∗), where z1−α is the 1 − α quantile of the standard normal distribution. As for the limit of βε(h∗), plugging in this value of c we have that

lim_{ε→0} βε(h∗) = 1 − Φ(z1−α − √I(h∗)) ≡ β(h∗).

Recall that h∗ ∈ Hδ, so we now seek the worst function h∗(·) in the class Hδ. The limit form of β(h∗) suggests looking for a function h∗ ∈ Hδ with minimal Fisher information, because

inf_{h∗∈Hδ} β(h∗) = 1 − Φ( z1−α − inf_{h∗∈Hδ} (∫_0^T h∗(x^0_t)^2 dt)^{1/2} ).


Below we use the Cauchy-Schwarz inequality:

δ^2 = ( ∫_0^T [h∗(x^0_s)/S0(x^0_s)] [G(x^0_T) − G(x^0_s)] ds )^2 ≤ ∫_0^T h∗(x^0_s)^2 ds · ∫_0^T [G(x^0_T) − G(x^0_s)]^2 / S0(x^0_s)^2 ds = I(h∗) κ(S0),

where κ(S0) has been introduced in (13). Hence

I(h∗) ≥ δ^2 κ(S0)^{-1}.

The right-hand side of this inequality does not depend on h∗, so if we find an h such that I(h) = δ^2 κ(S0)^{-1}, it will be the worst alternative in the class Hδ. Recall that equality in the Cauchy-Schwarz inequality holds if

h(y) = δ [G(x^0_T) − G(y)] / [κ(S0) S0(y)].
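
A quick numerical sanity check (an added sketch, with assumed S0 and G) illustrates this: for a generic h the Cauchy-Schwarz bound I(h) ≥ δ^2 κ(S0)^{-1} is strict, while the extremal h above attains it with equality.

    import numpy as np

    S0 = lambda x: 1.0 + 0.5 * np.sin(x)   # assumed drift
    G = lambda x: x ** 2                   # assumed functional
    x0, T, n = 1.0, 5.0, 100000

    dt = T / n
    x = np.empty(n + 1)
    x[0] = x0
    for k in range(n):                     # nonperturbed trajectory x^0_t
        x[k + 1] = x[k] + S0(x[k]) * dt

    w = (G(x[-1]) - G(x[:-1])) / S0(x[:-1])   # weight (G(x^0_T) - G(x^0_s)) / S0(x^0_s)
    kappa = np.sum(w ** 2) * dt               # kappa(S0)

    def report(h_vals, name):
        """Compare I(h) with delta^2 / kappa(S0) for h evaluated along x^0."""
        delta = np.sum(h_vals * w) * dt       # position of the alternative within H_delta
        info = np.sum(h_vals ** 2) * dt       # Fisher information I(h)
        print(f"{name}: I(h) = {info:.4f},  delta^2/kappa = {delta ** 2 / kappa:.4f}")

    report(1.0 / (1.0 + x[:-1] ** 2), "generic h")     # strict inequality expected
    report(2.0 * w / kappa, "extremal h, delta = 2")   # equality expected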

The corresponding likelihood ratio is

dP^(ε)_h/dP^(ε)_0 (X) = exp{ δ ∆′ε(X)/κ(S0) − δ^2/(2κ(S0)) + rε(h, X) },

where

∆′ε(X) = ε^{-1} ∫_0^T [G(x^0_T) − G(Xt)]/S0(Xt) [dXt − S0(Xt) dt].

Therefore for any δ ∈ (0, K) the test based on the critical function

φε(X) = χ{∆′ε(X) > z1−α κ(S0)^{1/2}}

will be in the class Kα and has

β(h) = 1 − Φ(z1−α − δ κ(S0)^{-1/2}).

Note as well that instead of ∆′ε(X) we can use the test statistic

∆ε(X) = ε^{-1} ∫_0^T [G(x^0_T) − G(x^0_t)]/S0(x^0_t) [dXt − S0(Xt) dt],

which is asymptotically equivalent (in law) to ∆′ε(X) and has, under hypothesis H0, a Gaussian distribution for all ε > 0.


To show the optimality of the test proposed in the theorem, we just mention that under the hypothesis H0 we have the representation (see [9])

∆∗ε(X) = ∫_0^T G′(x^0_t) x^(1)_t dt (1 + o(1))
= ∫_0^T G′(x^0_t) S0(x^0_t) ( ∫_0^t dWs/S0(x^0_s) ) dt (1 + o(1))
= ∫_0^T [G(x^0_T) − G(x^0_t)]/S0(x^0_t) dWt (1 + o(1)),

and under alternative H1 we have similarly

∆∗ε(X) = ∫_0^T G′(x^0_t) S0(x^0_t) ( ∫_0^t dWs/S0(x^0_s) ) dt + ∫_0^T G′(x^0_t) S0(x^0_t) ( ∫_0^t [h(x^0_s)/S0(x^0_s)] ds ) dt (1 + o(1))
= ∫_0^T [G(x^0_T) − G(x^0_t)]/S0(x^0_t) dWt + ∫_0^T [G(x^0_T) − G(x^0_t)]/S0(x^0_t) h(x^0_t) dt (1 + o(1)).

Hence

lim_{ε→0} inf_{h∈HK} [βε(φ∗ε, h) − β(h)] = lim_{ε→0} inf_{0≤δ≤K} inf_{h∈Hδ} [βε(φ∗ε, h) − β(h)] = 0.

Remark. Note that the quantity κ(S0) plays the role of the Fisher information of the worst parametric family in the semiparametric estimation problem [9]; in this hypothesis testing problem its role is quite similar.

References

[1] P. J. Bickel, C. A. J. Klaassen, Y. Ritov and J. A. Wellner, Efficient and Adaptive Estimation for Semiparametric Models, Johns Hopkins University Press, Baltimore, 1993.

[2] I. A. Ibragimov and R. Z. Khasminskii, Statistical Estimation (Asymptotic Theory), Springer, New York, 1981.

[3] S. M. Iacus, Semiparametric estimation of a functional of the drift coefficient for a small diffusion process, Proceedings of Prague Stochastic'98, Vol. 1, 243-246, 1998.

[4] S. M. Iacus, Semiparametric estimation of the state of a dynamical system with small noise, submitted, 1998.

[5] A. Janssen, Global power functions of goodness of fit tests, Preprint, Heinrich-Heine University, Duesseldorf, 1997.

[6] A. Janssen, Nonparametric symmetry tests for statistical functionals, Mathematical Methods of Statistics, 8 (1999), 3, 320-343.

[7] Yu. A. Kutoyants, On a problem of testing hypotheses and asymptotic normality of stochastic integrals, Theory Prob. Appl., 20 (1975), 376-384.

[8] Yu. A. Kutoyants, Identification of Dynamical Systems with Small Noise, Kluwer, Dordrecht, 1994.

[9] Yu. A. Kutoyants, Semiparametric estimation for dynamical system with small noise, Mathematical Methods of Statistics, 7 (1998), 4, 457-465.

[10] E. L. Lehmann, Testing Statistical Hypotheses, Wiley, New York, 1959.

[11] B. Ya. Levit, On optimality of some statistical estimates, Proc. Prague Symp. Asymptotic Statistics, 2, 215-238, 1973.

[12] R. S. Liptser and A. N. Shiryayev, Statistics of Random Processes, Part 1, Springer-Verlag, New York, 1977.

[13] Ya. Nikitin, Asymptotic Efficiency of Nonparametric Tests, Cambridge University Press, Cambridge, 1995.

[14] J. Pfanzagl and W. Wefelmeyer, Contributions to a General Asymptotic Statistical Theory, Lecture Notes in Statistics, Vol. 13, Springer, Berlin-Heidelberg, 1982.

[15] G. G. Roussas, Contiguity of Probability Measures: Some Applications in Statistics, Cambridge University Press, Cambridge, 1972.
