

Journal of Mathematical Sciences, Vol. 108, No. 6, 2002

SOME PROPERTIES OF ELEMENTARY HAMILTONIAN MATRICES

R. Coleman UDC 512.7+512.2

Certain Hamiltonian matrices, which are building blocks for a canonical form, have a particularly simple form. Some of their properties are considered. Bibliography: 7 titles.

§1. Introduction

A symplectic linear space is a real vector space V of even dimension 2n equipped with a nondegenerate antisymmetric bilinear form ω. A basis (v_i) of such a space is said to be symplectic if

ω(vi, vj) =

+1 if 1 ≤ i ≤ n, j = n + i,

−1 if n + 1 ≤ i ≤ 2n, j = i − n,

0, otherwise.

Every symplectic space has a symplectic basis (cf. [6]).

Two types of operators on symplectic spaces are of particular importance. An operator L is symplectic if

ω(Lv_i, Lv_j) = ω(v_i, v_j),

and L is Hamiltonian if

ω(Lv_i, v_j) = −ω(v_i, Lv_j).

We write T for symplectic operators and A for Hamiltonian operators.

A real square (2n × 2n)-matrix M is symplectic if M^tJM = J, and M is Hamiltonian if M^tJ + JM = 0, where

J = J_{2n} = \begin{pmatrix} 0 & I_n \\ -I_n & 0 \end{pmatrix}.

We write T for symplectic matrices and A for Hamiltonian matrices. It is easy to see that the matrix of a symplectic (respectively, Hamiltonian) operator in a symplectic basis is symplectic (respectively, Hamiltonian).

A matrix A is Hamiltonian if and only if A = JS, where S is symmetric. A linear Hamiltonian vector field has the form f(x) = Ax, where A is a Hamiltonian matrix; the Hamiltonian of f is the quadratic form H(x) = (1/2) x^tSx, where S is as above.

A symplectic matrix is invertible and has determinant +1. (It is easy to see that the determinant is ±1, but to show that it is +1 requires a little work, cf. [4].) If a matrix T is symplectic, then so is its transpose, and we have the relation T^{-1} = -JT^tJ.
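These defining identities are easy to check numerically. The following sketch (pure Python; the symplectic T of the form ((I, B), (0, I)) with B symmetric and the symmetric S are arbitrary sample matrices, not taken from the paper) verifies T^tJT = J, the Hamiltonian condition A^tJ + JA = 0 for A = JS, and the inverse formula T^{-1} = -JT^tJ:

```python
# Numerical sanity check of the symplectic/Hamiltonian definitions (a sketch).
n = 2  # half-dimension, so all matrices below are (2n x 2n)

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(r) for r in zip(*A)]

# J = J_{2n} = ((0, I_n), (-I_n, 0))
J = [[0] * n + [1 if j == i else 0 for j in range(n)] for i in range(n)] + \
    [[-1 if j == i else 0 for j in range(n)] + [0] * n for i in range(n)]

# A symplectic sample: T = ((I, B), (0, I)) with B symmetric satisfies T^t J T = J.
B = [[1, 2], [2, 3]]
I = [[1 if i == j else 0 for j in range(n)] for i in range(n)]
Z = [[0] * n for _ in range(n)]
T = [I[i] + B[i] for i in range(n)] + [Z[i] + I[i] for i in range(n)]
assert matmul(matmul(transpose(T), J), T) == J          # T is symplectic

# A Hamiltonian sample: A = J S with S symmetric satisfies A^t J + J A = 0.
S = [[1, 0, 2, 0], [0, 1, 0, 0], [2, 0, 3, 1], [0, 0, 1, 1]]
A = matmul(J, S)
AtJ = matmul(transpose(A), J)
JA = matmul(J, A)
assert all(AtJ[i][j] + JA[i][j] == 0 for i in range(2 * n) for j in range(2 * n))

# T^{-1} = -J T^t J: check that T * (-J T^t J) = I_{2n}.
Tinv = [[-x for x in row] for row in matmul(matmul(J, transpose(T)), J)]
I2n = [[1 if i == j else 0 for j in range(2 * n)] for i in range(2 * n)]
assert matmul(T, Tinv) == I2n
```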

Conjugating a Hamiltonian matrix A with a symplectic matrix T, we obtain a Hamiltonian matrix Ã = T^{-1}AT. In this case, we say that A and Ã are symplectically similar or symplectically conjugate. Similarly, we say that two symmetric matrices S and S̃ are symplectically

Published in Zapiski Nauchnykh Seminarov POMI, Vol. 258, 1999, pp. 82–100. Original article submitted May 18, 1999.

1072-3374/02/1086-0951 $27.00 © 2002 Plenum Publishing Corporation


congruent if there is a symplectic matrix T such that S̃ = T^tST. Writing Ã = JS̃, we have S̃ = T^tST. Thus, two Hamiltonian matrices are symplectically similar if and only if the corresponding symmetric matrices are symplectically congruent.

Consider two symplectic spaces (V_1, ω_1) and (V_2, ω_2). A linear map L : V_1 → V_2 is a symplectomorphism if it preserves the bilinear form, i.e.,

∀u, v ∈ V_1 : ω_2(Lu, Lv) = ω_1(u, v).

In this case, the two spaces are said to be symplectomorphic.

Now, assume that we have two triples (V_1, ω_1, A_1) and (V_2, ω_2, A_2), where A_i is a Hamiltonian operator on the symplectic space (V_i, ω_i), i = 1, 2. If there exists a symplectomorphism L : V_1 → V_2 such that L ∘ A_1 = A_2 ∘ L, then we say that the two triples have the same type, and we write (V_1, ω_1, A_1) ∼ (V_2, ω_2, A_2). Clearly, ∼ is an equivalence relation; the corresponding equivalence classes are called types. Obviously, all vector spaces in triples in a given type have the same dimension, and so we can speak of the dimension of a type.

A subspace W of a symplectic space (V, ω) is symplectic if ω|W is nondegenerate. This means that (W, ω|W) is a symplectic space. Clearly, the dimension of W must be even. Two subspaces W and Z of V are orthogonal if

∀w ∈ W, ∀z ∈ Z : ω(w, z) = 0.

If, in addition, V = W ⊕ Z, then Z is called an orthogonal complement of W. It is easy to show that a subspace is symplectic if and only if it has an orthogonal complement [7].

A triple (V, ω, A) is decomposable if V is the direct sum of nontrivial orthogonal A-invariant subspaces. Otherwise, the triple is indecomposable. A decomposition of a triple α yields two triples α_1 and α_2, and we write α = α_1 + α_2. If a triple β is equivalent to α, then β is also decomposable, and we have β = β_1 + β_2 with β_1 ∼ α_1 and β_2 ∼ α_2. It follows that all elements of the same type are either decomposable or indecomposable, and so we can speak of decomposable and indecomposable types. Also, we can uniquely define a decomposition of a given type ∆ as follows: if α ∈ ∆ and α = α_1 + α_2, then we write ∆ = ∆_1 + ∆_2, where α_1 ∈ ∆_1 and α_2 ∈ ∆_2.

Clearly, for any type ∆, we have

∆ = ∆1 + ∆2 + · · ·+ ∆s,

where the ∆_i are indecomposable types. N. Burgoyne and R. Cushman [2] proved that this decomposition is unique (up to order).

Let two triples (V_i, ω_i, A_i), i = 1, 2, belong to the same type, and let L : V_1 → V_2 be a symplectomorphism making them equivalent. If (e_i) is a symplectic basis of V_1 and A is the (Hamiltonian) matrix of A_1 in (e_i), then (Le_i) is a symplectic basis of V_2 and A is also the matrix of A_2 in (Le_i). Thus, we can associate with a type a certain family of Hamiltonian matrices, namely, those representing the operator of any triple in the type in a symplectic basis. Since the transformation matrix from one symplectic basis to another is symplectic, this family is just the collection of Hamiltonian matrices that are symplectically similar to the matrix of the operator of any triple in a symplectic basis. It follows that, with any type ∆, we can associate the characteristic polynomial P_∆ of a matrix associated with ∆.


If ∆ is a decomposable (respectively, indecomposable) type, then any matrix associated with ∆ is said to be decomposable (respectively, indecomposable). It is easily seen that if A is a decomposable Hamiltonian matrix, then there exist a symplectic matrix T and a permutation matrix K such that

K^tT^{-1}ATK = diag(A_1, . . . , A_s),

where s > 1 and the blocks A_1, . . . , A_s are Hamiltonian matrices (see Appendix).

Here, we are concerned with particularly simple indecomposable matrices, which we call elementary Hamiltonian matrices.

§2. Elementary Hamiltonian matrices

The characteristic polynomial of a Hamiltonian operator is even [7]. It follows that if λ is an eigenvalue, then so are −λ and λ̄, and they have the same multiplicity as λ. Denoting by gen(λ) the generalized eigenspace of λ, we group the generalized eigenspaces and obtain subspaces of the following types:

P = gen(α) ⊕ gen(−α),
Q′ = gen(γ + iδ) ⊕ gen(γ − iδ) ⊕ gen(−γ + iδ) ⊕ gen(−γ − iδ),
R′ = gen(iβ) ⊕ gen(−iβ),
S = gen(0),

where α, β, γ, δ ∈ ℝ*₊. (Certainly, the spaces Q′ and R′ lie in the complexification of V.)

If A is a Hamiltonian operator on a symplectic space (V, ω) and λ and µ are eigenvalues such that λ + µ ≠ 0, then the corresponding generalized eigenspaces are orthogonal [5]. This is also the case if we replace the spaces Q′ and R′ by their real parts Q and R. It follows that the characteristic polynomial of an indecomposable type has one of the following four forms:

(X² − α²)^m,

[(X − γ)² + δ²]^m [(X + γ)² + δ²]^m,

(X² + β²)^m,

X^m, m even.

It turns out that, for a given number 2n, there are two indecomposable types if the characteristic polynomial is (X² + β²)^n or X^{2n} with n even, and there is one indecomposable type if the characteristic polynomial is (X² − α²)^n or X^{2n} with n odd. For a given number 4n, there is one indecomposable type with polynomial [(X − γ)² + δ²]^n [(X + γ)² + δ²]^n (see [2]). With each one of these types, we associate a particularly simple matrix; we call these matrices elementary Hamiltonian matrices.
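These polynomial forms can be checked on small examples. The sketch below (an illustration, not from the paper) computes the characteristic polynomial of the type (I) elementary matrix defined in §2 below, for the sample values n = 2 and α = 2, and confirms that it equals (X² − α²)²:

```python
# Characteristic polynomial of a small type (I) elementary Hamiltonian matrix.
# Polynomials are coefficient lists [c0, c1, ...] in the variable X.
from itertools import permutations

def polymul(p, q):
    r = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def polyadd(p, q):
    m = max(len(p), len(q))
    return [(p[i] if i < len(p) else 0) + (q[i] if i < len(q) else 0) for i in range(m)]

def charpoly(A):
    # det(X I - A) via the Leibniz formula (fine for small matrices)
    m = len(A)
    total = [0]
    for perm in permutations(range(m)):
        sign, seen = 1, [False] * m
        for i in range(m):          # sign from the cycle decomposition
            if not seen[i]:
                j, clen = i, 0
                while not seen[j]:
                    seen[j] = True
                    j, clen = perm[j], clen + 1
                if clen % 2 == 0:
                    sign = -sign
        term = [sign]
        for i in range(m):
            entry = [-A[i][perm[i]], 1] if perm[i] == i else [-A[i][perm[i]]]
            term = polymul(term, entry)
        total = polyadd(total, term)
    return total

alpha = 2
# Type (I), n = 2: A1 = ((α, 0), (1, α)), A2 = A3 = 0, A4 = -A1^t.
A = [[alpha, 0, 0, 0],
     [1, alpha, 0, 0],
     [0, 0, -alpha, -1],
     [0, 0, 0, -alpha]]
# (X^2 - α^2)^2 = X^4 - 2α^2 X^2 + α^4, i.e. coefficients [16, 0, -8, 0, 1] for α = 2
assert charpoly(A) == [16, 0, -8, 0, 1]
```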

All Hamiltonian matrices have the block structure

A = \begin{pmatrix} A_1 & A_2 \\ A_3 & A_4 \end{pmatrix}

(if A is a (2n × 2n)-matrix, then the blocks A_i have size n × n), where A_4 = −A_1^t and A_2 and A_3 are symmetric. Hence, we can describe a Hamiltonian matrix by giving the form of the blocks A_1, A_2, and A_3. Elementary Hamiltonian matrices have the following forms:


A_1 = \begin{pmatrix} \alpha & & & \\ 1 & \alpha & & \\ & \ddots & \ddots & \\ & & 1 & \alpha \end{pmatrix}, \qquad A_2 = A_3 = 0; \qquad (I)

A_1 = \begin{pmatrix} \Delta & & & \\ I_2 & \Delta & & \\ & \ddots & \ddots & \\ & & I_2 & \Delta \end{pmatrix}, \qquad A_2 = A_3 = 0, \qquad (II)

where

\Delta = \begin{pmatrix} \gamma & \delta \\ -\delta & \gamma \end{pmatrix} \quad \text{and} \quad I_2 = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix};

A_2 = \begin{pmatrix} & & 1 & \beta \\ & \iddots & \iddots & \\ 1 & \beta & & \\ \beta & & & \end{pmatrix}, \qquad A_3 = \begin{pmatrix} & & & -\beta \\ & & -\beta & -1 \\ & \iddots & \iddots & \\ -\beta & -1 & & \end{pmatrix}, \qquad A_1 = 0 \qquad (III)

(that is, (A_2)_{ij} = 1 for i + j = n, (A_2)_{ij} = β for i + j = n + 1, (A_3)_{ij} = −β for i + j = n + 1, and (A_3)_{ij} = −1 for i + j = n + 2, all other entries being zero), or (III′), the negative of (III);

A_1 = \begin{pmatrix} 0 & & & \\ 1 & 0 & & \\ & \ddots & \ddots & \\ & & 1 & 0 \end{pmatrix}, \qquad A_3 = \mathrm{diag}(0, \ldots, 0, 1), \qquad A_2 = 0 \qquad (IVev)

(n is even), or (IV′ev), which is (IVev) with 1 replaced by −1 in the block A_3;

A_1 = \begin{pmatrix} 0 & & & \\ 1 & 0 & & \\ & \ddots & \ddots & \\ & & 1 & 0 \end{pmatrix}, \qquad A_2 = A_3 = 0 \qquad (IVod)

(n is odd).

Remark. The elementary matrices of types (I) and (IV) are Burgoyne–Cushman and Bruno normal forms [2, 1] (the forms are the same), type (II) is a Burgoyne–Cushman form, and type (III) is a Bruno form. We have chosen what we consider to be the simplest matrix of each type.

We have already remarked that a Hamiltonian matrix A is the product of the matrix J and a symmetric matrix S. Below, we study the spectra of these symmetric matrices in the case where A is an elementary Hamiltonian matrix. We call these matrices elementary symmetric matrices.
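As a sanity check, the elementary matrices above can be assembled from their blocks (with A_4 = −A_1^t) and tested against the Hamiltonian condition A^tJ + JA = 0. A minimal sketch with arbitrary sample parameter values:

```python
# Assemble elementary Hamiltonian matrices from their blocks and verify
# A^t J + J A = 0 (a sketch; n and α are sample values).

def zeros(r, c):
    return [[0] * c for _ in range(r)]

def eye(m):
    return [[1 if i == j else 0 for j in range(m)] for i in range(m)]

def transpose(A):
    return [list(r) for r in zip(*A)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def assemble(A1, A2, A3):
    n = len(A1)
    A4 = [[-A1[j][i] for j in range(n)] for i in range(n)]   # A4 = -A1^t
    return [A1[i] + A2[i] for i in range(n)] + [A3[i] + A4[i] for i in range(n)]

def is_hamiltonian(A):
    m = len(A) // 2
    J = [r1 + r2 for r1, r2 in zip(zeros(m, m), eye(m))] + \
        [r1 + r2 for r1, r2 in zip([[-x for x in r] for r in eye(m)], zeros(m, m))]
    AtJ = matmul(transpose(A), J)
    JA = matmul(J, A)
    return all(AtJ[i][j] + JA[i][j] == 0 for i in range(2 * m) for j in range(2 * m))

n, alpha = 3, 2
# Type (I): A1 lower bidiagonal with α on the diagonal and 1 below it.
A1 = [[alpha if i == j else (1 if i == j + 1 else 0) for j in range(n)] for i in range(n)]
assert is_hamiltonian(assemble(A1, zeros(n, n), zeros(n, n)))

# Type (IVev): A1 the lower shift, A3 = diag(0, ..., 0, 1), A2 = 0.
n = 4
A1 = [[1 if i == j + 1 else 0 for j in range(n)] for i in range(n)]
A3 = [[1 if i == j == n - 1 else 0 for j in range(n)] for i in range(n)]
assert is_hamiltonian(assemble(A1, zeros(n, n), A3))
```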


§3. Spectra of elementary symmetric matrices

Now we analyze the spectra of the elementary symmetric matrices introduced in §2. If A = JS, then S has the block form

S = \begin{pmatrix} S_1 & S_2 \\ S_3 & S_4 \end{pmatrix},

where

S_1 = -A_3, \quad S_2 = -A_4, \quad S_3 = A_1, \quad S_4 = A_2.

Before considering the spectra of these matrices, we prove an elementary result.

Proposition 3.1. Let X be a (2n × 2n)-matrix with the block form

X = \begin{pmatrix} U & V \\ W & Z \end{pmatrix},

where U, V, W, and Z are (n × n)-matrices. If W and Z commute, then we have

det X = det(UZ − VW).

Proof. First, we assume that Z is invertible. Then we have

\begin{pmatrix} U & V \\ W & Z \end{pmatrix} \begin{pmatrix} Z & 0 \\ -W & Z^{-1} \end{pmatrix} = \begin{pmatrix} UZ - VW & VZ^{-1} \\ WZ - ZW & I_n \end{pmatrix} = \begin{pmatrix} UZ - VW & VZ^{-1} \\ 0 & I_n \end{pmatrix}

because W and Z commute. However,

det \begin{pmatrix} Z & 0 \\ -W & Z^{-1} \end{pmatrix} = 1,

whence

det \begin{pmatrix} U & V \\ W & Z \end{pmatrix} = det(UZ − VW).

Now, we consider the general case. The matrix Z − λI_n is invertible for an infinite number of values of λ (because Z has only a finite number of eigenvalues), and W commutes with Z − λI_n. Thus, for an infinite number of values of λ, we have

det \begin{pmatrix} U & V \\ W & Z - \lambda I_n \end{pmatrix} = det(U(Z − λI_n) − VW).

This means that the polynomials

det \begin{pmatrix} U & V \\ W & Z - x I_n \end{pmatrix} \quad \text{and} \quad det(U(Z − xI_n) − VW)

take the same value at an infinite number of points, and so they are equal. Setting x = 0, we obtain the required result. □
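Proposition 3.1 is easy to test numerically. A sketch with arbitrary sample blocks, where Z = W² guarantees that W and Z commute:

```python
# Numerical check of Proposition 3.1: det X = det(UZ - VW) when WZ = ZW.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def det(A):
    # recursive Laplace expansion along the first row (fine for small sizes)
    m = len(A)
    if m == 1:
        return A[0][0]
    total = 0
    for j in range(m):
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total

n = 2
U = [[1, 2], [3, 4]]
V = [[0, 1], [1, 5]]
W = [[2, 1], [0, 3]]
Z = matmul(W, W)                         # Z = W^2 commutes with W
assert matmul(W, Z) == matmul(Z, W)

X = [U[i] + V[i] for i in range(n)] + [W[i] + Z[i] for i in range(n)]
UZ = matmul(U, Z)
VW = matmul(V, W)
UZ_minus_VW = [[UZ[i][j] - VW[i][j] for j in range(n)] for i in range(n)]
assert det(X) == det(UZ_minus_VW)
```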

Now, we consider the spectra of the elementary symmetric matrices.


Type (I). In this case, we have

S_2 = \begin{pmatrix} \alpha & 1 & & \\ & \alpha & 1 & \\ & & \ddots & \ddots \\ & & & \alpha \end{pmatrix}, \qquad S_3 = S_2^t = \begin{pmatrix} \alpha & & & \\ 1 & \alpha & & \\ & \ddots & \ddots & \\ & & 1 & \alpha \end{pmatrix}, \qquad S_1 = S_4 = 0.

If Φ_S is the characteristic polynomial of S, then Proposition 3.1 implies

Φ_S(λ) = det(S − λI_{2n}) = det \begin{pmatrix} -\lambda I_n & S_2 \\ S_3 & -\lambda I_n \end{pmatrix} = det(λ²I_n − S_2S_3).

However, we note that

M = S_2S_3 = \begin{pmatrix} a & \alpha & & & \\ \alpha & a & \alpha & & \\ & \ddots & \ddots & \ddots & \\ & & \alpha & a & \alpha \\ & & & \alpha & \alpha^2 \end{pmatrix},

where a = α² + 1. The matrix M is tridiagonal with nonzero values on the upper and lower semi-diagonals. Therefore, the eigenvalues of M are distinct. Furthermore, if M′ is the corresponding matrix when S is a 2(n + 1) × 2(n + 1)-matrix, then the eigenvalues of M lie between those of M′ [3]. If µ_1 < µ_2 < · · · < µ_n are the eigenvalues of M, then

Φ_S(λ) = det(λ²I_n − M) = ∏_{i=1}^{n} (λ² − µ_i).

Hence, Φ_S is an even polynomial with distinct roots: if λ is a root, then so is −λ. In addition, the characteristic polynomials of the (2n × 2n)- and 2(n + 1) × 2(n + 1)-matrices S are distinct.

Remark. Note that the eigenvalues µ_i must be strictly positive: M = S_2S_2^t is positive semidefinite, and det M = α^{2n} ≠ 0.
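The claimed form of M can be verified directly. A sketch (with the sample values n = 5 and α = 3): M = S₂S₃ should be tridiagonal with a = α² + 1 on the diagonal, except α² in the last entry, and α on both semi-diagonals.

```python
# Verify the structure of M = S2 S3 for type (I).

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

n, alpha = 5, 3
# S2: α on the diagonal, 1 on the superdiagonal; S3 = S2^t.
S2 = [[alpha if j == i else (1 if j == i + 1 else 0) for j in range(n)] for i in range(n)]
S3 = [list(r) for r in zip(*S2)]

M = matmul(S2, S3)
a = alpha ** 2 + 1
expected = [[0] * n for _ in range(n)]
for i in range(n):
    expected[i][i] = a if i < n - 1 else alpha ** 2
    if i + 1 < n:
        expected[i][i + 1] = alpha
        expected[i + 1][i] = alpha
assert M == expected
```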

Type (II). Here, we have

S_2 = \begin{pmatrix} \Delta^t & I_2 & & \\ & \Delta^t & I_2 & \\ & & \ddots & \ddots \\ & & & \Delta^t \end{pmatrix}, \qquad S_3 = S_2^t = \begin{pmatrix} \Delta & & & \\ I_2 & \Delta & & \\ & \ddots & \ddots & \\ & & I_2 & \Delta \end{pmatrix}, \qquad S_1 = S_4 = 0.

Once again using Proposition 3.1, we obtain

Φ_S(λ) = det(S − λI_{4n}) = det \begin{pmatrix} -\lambda I_{2n} & S_2 \\ S_3 & -\lambda I_{2n} \end{pmatrix} = det(λ²I_{2n} − S_2S_3).


Also, note that if we write ε = γ² + δ², then we have

\Delta^t\Delta = \Delta\Delta^t = \varepsilon I_2,

whence

M = S_2S_3 = \begin{pmatrix} \sigma I_2 & \Delta & & & \\ \Delta^t & \sigma I_2 & \Delta & & \\ & \ddots & \ddots & \ddots & \\ & & \Delta^t & \sigma I_2 & \Delta \\ & & & \Delta^t & \varepsilon I_2 \end{pmatrix},

where σ = ε + 1.

Let us consider the spectrum of M. If µ is an eigenvalue and X^t = (X_1^t, . . . , X_n^t) is an eigenvector corresponding to µ such that X_i ∈ ℝ², then we have the system

(1 + ε)X_1 + ∆X_2 = µX_1,
∆^tX_1 + (1 + ε)X_2 + ∆X_3 = µX_2,
. . .
∆^tX_{n−2} + (1 + ε)X_{n−1} + ∆X_n = µX_{n−1},
∆^tX_{n−1} + εX_n = µX_n.    (∗)

Multiplying the ith equation by ∆^{i−1} for i = 2, . . . , n − 1 (and using ∆^{i−1}∆^t = ε∆^{i−2}), we obtain

∆X_2 = (µ − ε − 1)X_1,
∆²X_3 = (µ − ε − 1)∆X_2 − εX_1,
∆³X_4 = (µ − ε − 1)∆²X_3 − ε∆X_2,
. . .
∆^{n−1}X_n = (µ − ε − 1)∆^{n−2}X_{n−1} − ε∆^{n−3}X_{n−2}.

Letting P_1(x) = x − ε − 1, we have

∆X_2 = P_1(µ)X_1.

Also,

∆²X_3 = (µ − ε − 1)∆X_2 − εX_1 = (µ − ε − 1)P_1(µ)X_1 − εX_1.

Now, letting P_2(x) = (x − ε − 1)P_1(x) − ε, we obtain

∆²X_3 = P_2(µ)X_1.

Proceeding in this way, we obtain a sequence of polynomials P_1, . . . , P_{n−1} with deg P_i = i such that

∀i : ∆^iX_{i+1} = P_i(µ)X_1.


Multiplying the last equation in system (∗) by ∆^{n−1}, we have

ε∆^{n−2}X_{n−1} + ε∆^{n−1}X_n = µ∆^{n−1}X_n
⟹ (µ − ε)∆^{n−1}X_n = ε∆^{n−2}X_{n−1}
⟹ (µ − ε)P_{n−1}(µ)X_1 = εP_{n−2}(µ)X_1.

Setting

Q(x) = (x − ε)P_{n−1}(x) − εP_{n−2}(x),

we have deg Q = n and

Q(µ)X_1 = 0.

Using system (∗) and the fact that X is an eigenvector, we see that X_1 ≠ 0 (if X_1 = 0, the relations above force X = 0), and so Q(µ) = 0.

Now, we consider the matrix M − µI_{2n}. Eliminating the first two rows and the last two columns, we obtain the matrix

\begin{pmatrix} \Delta^t & \eta I_2 & \Delta & & \\ & \Delta^t & \eta I_2 & \ddots & \\ & & \ddots & \ddots & \Delta \\ & & & \Delta^t & \eta I_2 \\ & & & & \Delta^t \end{pmatrix},

where η = ε + 1 − µ. This matrix is clearly nonsingular (it is block triangular with the invertible block ∆^t on the diagonal). It follows that the dimension of any eigenspace is at most two. However, all eigenvalues are roots of Q, so that their number is at most n. Since M has size 2n × 2n, is symmetric, and is therefore diagonalizable, it follows that each root of Q is an eigenvalue of M of multiplicity two.

If µ_1, . . . , µ_n are the distinct roots of Q, then we have

Φ_S(x) = ∏_{i=1}^{n} (x² − µ_i)².

Every eigenvalue of S has multiplicity two, and if λ is an eigenvalue, then so is −λ.

Remark. Note that the eigenvalues µ_i must be strictly positive: M = S_2S_2^t is positive semidefinite, and det M = det(S_2) det(S_3) = ε^{2n} ≠ 0.
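The key identities used in the type (II) computation are easily checked. A sketch (with the arbitrary sample values γ = 2, δ = 3): for ∆ = ((γ, δ), (−δ, γ)) one has ∆^t∆ = ∆∆^t = εI₂ with ε = γ² + δ², so in particular ∆ and ∆^t commute, which is what justifies multiplying the equations of system (∗) by powers of ∆.

```python
# Verify ∆^t ∆ = ∆ ∆^t = ε I_2 for ∆ = ((γ, δ), (-δ, γ)), ε = γ^2 + δ^2.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

g, d = 2, 3                       # sample values of γ and δ
eps = g * g + d * d
D = [[g, d], [-d, g]]
Dt = [[g, -d], [d, g]]
assert matmul(Dt, D) == [[eps, 0], [0, eps]]
assert matmul(D, Dt) == [[eps, 0], [0, eps]]
assert matmul(D, Dt) == matmul(Dt, D)    # ∆ and ∆^t commute
```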

Type (III). In this case, we have S with

S_1 = \begin{pmatrix} & & 1 & \beta \\ & \iddots & \iddots & \\ 1 & \beta & & \\ \beta & & & \end{pmatrix}, \qquad S_4 = \begin{pmatrix} & & & \beta \\ & & \beta & 1 \\ & \iddots & \iddots & \\ \beta & 1 & & \end{pmatrix}, \qquad S_2 = S_3 = 0

(that is, (S_1)_{ij} = 1 for i + j = n and β for i + j = n + 1, while (S_4)_{ij} = β for i + j = n + 1 and 1 for i + j = n + 2, all other entries being zero),


or S′, the negative of S.

We have

det(S − λI_{2n}) = det(S_1 − λI_n) det(S_4 − λI_n) = det(S_1 − λI_n)²

(the last equality holds because S_4 = RS_1R, where R is the reversal matrix, so S_1 and S_4 are similar).

Now, we consider the spectrum of the matrix S_1. First, we note that

M = S_1^2 = \begin{pmatrix} b & \beta & & & \\ \beta & b & \beta & & \\ & \ddots & \ddots & \ddots & \\ & & \beta & b & \beta \\ & & & \beta & \beta^2 \end{pmatrix},

where b = 1 + β². The matrix M is tridiagonal with nonzero values on the semi-diagonals. Therefore, its eigenvalues are distinct. Furthermore, if M′ is the corresponding matrix when S is a 2(n + 1) × 2(n + 1)-matrix, then the eigenvalues of M lie between those of M′. This means that the eigenvalues of S_1 are distinct and differ from those of S_1 if S is a 2(n + 1) × 2(n + 1)-matrix.

We would like to know the number of positive and negative eigenvalues of S_1. Consider the matrix

S(ε) = \begin{pmatrix} & & \varepsilon & \beta \\ & \iddots & \iddots & \\ \varepsilon & \beta & & \\ \beta & & & \end{pmatrix}

(the matrix S_1 with 1 replaced by ε). Clearly, S(1) = S_1. Let Λ(ε)^t = (λ_1(ε), . . . , λ_n(ε)) be the vector of eigenvalues. Since S(ε) is symmetric, we have Λ(ε) ∈ ℝⁿ. Hence, φ : ε ↦ Λ(ε) is a continuous map from ℝ to ℝⁿ. For any value of ε, the eigenvalues λ_1(ε), . . . , λ_n(ε) are all nonzero (det S(ε) = ±βⁿ ≠ 0), whence it follows that Λ(ε) remains in the same orthant, i.e., the number of positive and negative eigenvalues does not change. Therefore, to find the number of positive and negative eigenvalues of S_1, it suffices to consider the matrix S(0).

If n = 2k, then we have

F_k = S(0) − λI_n = \begin{pmatrix} -\lambda & & & \beta \\ & \ddots & \iddots & \\ & \iddots & \ddots & \\ \beta & & & -\lambda \end{pmatrix}

(β on the antidiagonal, −λ on the diagonal). It is not difficult to find the recurrence relation

det F_k = (λ² − β²) det F_{k−1},

whence det F_k = (λ² − β²)^k. Thus, the number of positive eigenvalues of S_1 is equal to the number of negative ones.


Now, let n = 2k + 1. Then

G_k = S(0) − λI_n = \begin{pmatrix} -\lambda & & & & \beta \\ & \ddots & & \iddots & \\ & & \beta - \lambda & & \\ & \iddots & & \ddots & \\ \beta & & & & -\lambda \end{pmatrix}.

Developing the determinant of this matrix along the (k + 1)th column, whose only nonzero entry is β − λ, we obtain

det G_k = (β − λ) det F_k = (β − λ)(λ² − β²)^k,

so the eigenvalues of S(0) are β with multiplicity k + 1 and −β with multiplicity k. Hence, in this case, the number of positive eigenvalues of S_1 is one greater than that of the negative ones.

Now, we return to S. We have shown that S has n eigenvalues, each of multiplicity 2. If n is even, then the number of positive and negative eigenvalues is the same. If n is odd, the number of positive eigenvalues exceeds the number of negative ones by 1.

The matrix S′ is the negative of S, and the eigenvalues of S′ are the negatives of those of S. Here, also, there are n eigenvalues of multiplicity two; as before, if n is even, the number of positive and negative eigenvalues is the same. On the other hand, if n is odd, then the number of negative eigenvalues of S′ is one greater than that of the positive eigenvalues.

We also note that the eigenvalues of S and S′ must be distinct. Indeed, if S and S′ have an eigenvalue λ in common, then both λ and −λ are eigenvalues of S. However, the squares of the eigenvalues of S (and of S′) are eigenvalues of M. Since λ² = (−λ)², the number of distinct eigenvalues of M in this case cannot exceed n − 1, which is a contradiction.
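The S(0) argument can be summarized and tested as follows: S(0) = βR, where R is the reversal matrix, so S(0)² = β²I_n; hence the eigenvalues are ±β, and the trace (0 for n even, β for n odd) determines how the counts of +β and −β split. A sketch with the sample value β = 2:

```python
# Verify S(0)^2 = β^2 I_n and the trace values for even and odd n.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

beta = 2
for n in (4, 5):
    # S(0): β on the antidiagonal, zeros elsewhere.
    S0 = [[beta if i + j == n - 1 else 0 for j in range(n)] for i in range(n)]
    sq = matmul(S0, S0)
    assert sq == [[beta ** 2 if i == j else 0 for j in range(n)] for i in range(n)]
    trace = sum(S0[i][i] for i in range(n))
    # equal counts of ±β for n even (trace 0); one extra +β for n odd (trace β)
    assert trace == (0 if n % 2 == 0 else beta)
```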

Type (IVev). This time, we have

S_1 = \mathrm{diag}(0, \ldots, 0, 1), \qquad S_3 = \begin{pmatrix} 0 & & & \\ 1 & 0 & & \\ & \ddots & \ddots & \\ & & 1 & 0 \end{pmatrix}, \qquad S_2 = S_3^t, \quad S_4 = 0.

Let λ be an eigenvalue, and let (X, Y)^t = (x_1, . . . , x_n, y_1, . . . , y_n)^t be an eigenvector corresponding to λ. Then we have the systems

y_2 = λx_1,
. . .
y_n = λx_{n−1},
x_n = λx_n

and

0 = λy_1,
x_1 = λy_2,
. . .
x_{n−1} = λy_n.

Since A = JS is singular and J is nonsingular, we see that S is singular. Therefore, 0 is an eigenvalue of S. Considering the systems above with λ = 0, we easily see that

x_1 = · · · = x_n = y_2 = · · · = y_n = 0.

It follows that the eigenspace of 0 is the vector subspace of ℝ²ⁿ generated by the (n + 1)th element of the canonical basis.

Now, we assume that λ ≠ 0 and λ² ≠ 1. Then the two systems yield

y_2 = λ²y_2,
. . .
y_n = λ²y_n,

whence

y_2 = · · · = y_n = 0.

It easily follows that (X, Y)^t = 0, which contradicts the fact that (X, Y)^t is an eigenvector, and hence nonzero. Therefore, the only possible nonzero eigenvalues are ±1.

If λ = 1, then we have

y_1 = 0,
y_2 = x_1,
. . .
y_n = x_{n−1}.

We see that +1 is an eigenvalue with eigenspace generated by the vectors e_1 + e_{n+2}, . . . , e_{n−1} + e_{2n}, e_n, where (e_i)_{1≤i≤2n} is the canonical basis of ℝ²ⁿ.

If λ = −1, we have

x_n = 0,
y_1 = 0,
y_2 = −x_1,
. . .
y_n = −x_{n−1}.

We see that −1 is an eigenvalue with eigenspace generated by the vectors e_1 − e_{n+2}, . . . , e_{n−1} − e_{2n}. Note that the eigenspaces of +1 and −1 have different dimensions.

The case of (IV′ev) can be treated in the same way as that of (IVev). As before, the eigenvalues are 0 and ±1, and the dimension of the eigenspace of 0 is 1. However, this time the dimension of the eigenspace of −1 is one greater than that of +1.
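The eigenvectors found above can be verified by direct multiplication. A sketch for the sample size n = 4, with S assembled from the blocks S_1 = diag(0, ..., 0, 1), S_3 (lower shift), S_2 = S_3^t, S_4 = 0 as described:

```python
# Check the type (IVev) eigenvectors: S e_{n+1} = 0, S e_n = e_n,
# S(e_1 ± e_{n+2}) = ±(e_1 ± e_{n+2}).

def matvec(A, v):
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

n = 4
S = [[0] * (2 * n) for _ in range(2 * n)]
S[n - 1][n - 1] = 1                      # S1 = diag(0, ..., 0, 1)
for i in range(n - 1):
    S[n + 1 + i][i] = 1                  # S3: 1s on the subdiagonal
    S[i][n + 1 + i] = 1                  # S2 = S3^t

def e(i):                                # canonical basis vector e_i (1-indexed)
    v = [0] * (2 * n)
    v[i - 1] = 1
    return v

def add(u, v, s=1):
    return [a + s * b for a, b in zip(u, v)]

assert matvec(S, e(n + 1)) == [0] * (2 * n)                 # eigenvalue 0
assert matvec(S, e(n)) == e(n)                              # eigenvalue +1
w = add(e(1), e(n + 2))
assert matvec(S, w) == w                                    # eigenvalue +1
v = add(e(1), e(n + 2), -1)                                 # e_1 - e_{n+2}
assert matvec(S, v) == [-x for x in v]                      # eigenvalue -1
```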


Type (IVod). In this case, we have

S_3 = \begin{pmatrix} 0 & & & \\ 1 & 0 & & \\ & \ddots & \ddots & \\ & & 1 & 0 \end{pmatrix}, \qquad S_2 = S_3^t, \quad S_1 = S_4 = 0.

If λ is an eigenvalue and (X, Y)^t = (x_1, . . . , x_n, y_1, . . . , y_n)^t is an eigenvector corresponding to λ, then we have the systems

y_2 = λx_1,
. . .
y_n = λx_{n−1},
0 = λx_n

and

0 = λy_1,
x_1 = λy_2,
. . .
x_{n−1} = λy_n.

If λ = 0, then

x_1 = · · · = x_{n−1} = y_2 = · · · = y_n = 0,

and we see that 0 is an eigenvalue with eigenspace generated by the vectors e_n and e_{n+1} of the canonical basis.

It is easy to see that a nonzero eigenvalue λ with λ² ≠ 1 is impossible. If λ = 1, then we have

x_n = 0,
y_1 = 0,
y_2 = x_1,
. . .
y_n = x_{n−1}.

We see that +1 is an eigenvalue with eigenspace generated by the vectors e_1 + e_{n+2}, . . . , e_{n−1} + e_{2n}.

A similar analysis shows that −1 is also an eigenvalue and that the vectors e_1 − e_{n+2}, . . . , e_{n−1} − e_{2n} generate its eigenspace.


§4. Discussion

A natural question concerning Hamiltonian matrices is the relation between the spectra of a Hamiltonian matrix and the corresponding symmetric matrix. The above examples show that the two spectra may have very different properties. For example, a Hamiltonian (2n × 2n)-matrix A may have two eigenvalues, while the corresponding symmetric matrix has 2n (case (I)) or n (case (III)) distinct eigenvalues. It is particularly interesting to consider case (IV), where the Hamiltonian matrix is nilpotent. Here, the matrix A has a unique eigenvalue 0, and the matrix S has this eigenvalue with multiplicity at most two.

We also observe the fundamental difference between cases (I) and (III). If we have a real pair ±α, all eigenvalues are distinct; half of them are negative and half are positive. However, in the case of an imaginary pair ±iβ, all eigenvalues have multiplicity two, and, if n is odd, we do not have the same number of positive and negative eigenvalues. At least in this case, it cannot be argued that the difference comes from the choice of the simple Hamiltonian matrix: since the symmetric matrices corresponding to a given elementary Hamiltonian matrix are all symplectically congruent, they all have the same number of positive and negative eigenvalues.

This brings us to another point, namely, the rank and signature of the quadratic form H(x) = (1/2) x^tSx associated with an elementary Hamiltonian matrix. If r (respectively, s) is the number of positive (respectively, negative) eigenvalues, then the pair (r + s, r − s) determines the rank and the signature of the quadratic form. Here, we have

I : (2n, 0); II : (4n, 0); III (n even) : (2n, 0); III (n odd) : (2n, 2) or (2n, −2);
IVev : (2n − 1, 1) or (2n − 1, −1); IVod : (2n − 2, 0).

Since the rank and the signature of the quadratic form associated with any matrix in a given type are equal to those of the quadratic form associated with any other matrix in the type, it follows that we can associate rank and signature with any type, in particular with the indecomposable types. The ranks and signatures that we found here are those of the indecomposable types.

We consider the symmetric matrices of types (I) and (III) in more detail. In our analysis, we saw that in both cases we obtained a matrix M of the same form:

M = \begin{pmatrix} p & \pi & & & \\ \pi & p & \pi & & \\ & \ddots & \ddots & \ddots & \\ & & \pi & p & \pi \\ & & & \pi & \pi^2 \end{pmatrix},

where p = π² + 1. In case (I), the eigenvalues of the symmetric matrix of interest to us are the square roots of the eigenvalues of M. However, in case (III), we obtain half of them, and we obtain the other half in case (III′).

To conclude, we consider another property of the quadratic form H(x) = (1/2) x^tSx associated with an elementary Hamiltonian matrix. In all cases, except the trivial ones, where A is the zero (2 × 2)-matrix or the (2 × 2)-matrix (a_{ij}) with

a_{11} = a_{22} = 0, \quad a_{12} = β = −a_{21},


the quadratic form is not convex because a quadratic form is convex if and only if it is positive (see Appendix). Hence, the only vector fields corresponding to indecomposable Hamiltonian matrices and having convex Hamiltonians are those symplectically conjugate to these two simple matrices.

Appendix

Here, we prove two elementary results referred to in the main text. If K is the permutation matrix [e_1, e_{n+1}, . . . , e_n, e_{2n}], where (e_i)_{1≤i≤2n} is the canonical basis of ℝ²ⁿ, then we have the following result.

Proposition A1. A Hamiltonian matrix A is decomposable if and only if there exists a symplectic matrix T such that

K^tT^{-1}ATK = diag(A_1, . . . , A_s),

where s > 1 and the blocks A1, . . . , As are Hamiltonian matrices.

Proof. First, we assume that A is decomposable. Let (V, ω) be a symplectic space, and let A be a Hamiltonian operator on this space such that A is the matrix of A in a symplectic basis (y_i, x_i)_{1≤i≤n} = (y, x). Let

V = U_1 ⊕ · · · ⊕ U_s

be a decomposition of V into symplectic subspaces on which A is stable. For i ∈ {1, . . . , s}, let (q_i, p_i) be a symplectic basis of U_i. (The collections q_i and p_i consist of n_i vectors each.) The matrix of A in the basis (q_i, p_i) is a Hamiltonian matrix A_i.

The basis (q_1, . . . , q_s, p_1, . . . , p_s) = (q, p) is a symplectic basis of V, and so the transformation matrix T from the basis (y, x) to the basis (q, p) is symplectic. Let Ã be the matrix of A in the basis (q, p). Then Ã = T^{-1}AT and

Ã = \begin{pmatrix} A_{11} & & & A_{21} & & \\ & \ddots & & & \ddots & \\ & & A_{1s} & & & A_{2s} \\ A_{31} & & & A_{41} & & \\ & \ddots & & & \ddots & \\ & & A_{3s} & & & A_{4s} \end{pmatrix},

where the matrices A_{ji} are the (n_i × n_i)-blocks of the matrix A_i:

A_i = \begin{pmatrix} A_{1i} & A_{2i} \\ A_{3i} & A_{4i} \end{pmatrix}.

The matrix of A in the basis (q_1, p_1, . . . , q_s, p_s) is diag(A_1, . . . , A_s), whence the result.

Now, let A be a Hamiltonian matrix, and let there exist a symplectic matrix T such that

K^tT^{-1}ATK = diag(A_1, . . . , A_s),

where s > 1 and the blocks A_1, . . . , A_s are Hamiltonian matrices.


The matrix T^{-1}AT has the same form as the matrix Ã above, and it is Hamiltonian because T is symplectic. Furthermore, the columns T_i of T form a symplectic basis of the symplectic space (ℝ²ⁿ, ω_0), where ω_0 is the canonical antisymmetric bilinear form:

∀x, y ∈ ℝ²ⁿ : ω_0(x, y) = x^tJ_{2n}y.

Let the columns of the block A_{1i} have indices k_i, k_i + 1, . . . , l_i. Then the subspace V_i of ℝ²ⁿ generated by the columns T_j, T_{n+j}, where k_i ≤ j ≤ l_i, is symplectic, the Hamiltonian operator A defined by

∀x ∈ ℝ²ⁿ : A(x) = Ax

is stable on this space, and the matrix of A|V_i is A_i. Thus, the triple (ℝ²ⁿ, ω_0, A) is decomposable, which proves the converse. □

Proposition A2. A real quadratic form is convex if and only if it is positive.

Proof. Let q be a real quadratic form on a vector space V, and let φ be the polar form of q. If x, y ∈ V and t ∈ [0, 1], then we have

q[(1 − t)x + ty] = (1 − t)²q(x) + 2(1 − t)tφ(x, y) + t²q(y),

whence

(1 − t)q(x) + tq(y) − q[(1 − t)x + ty] = (1 − t)t[q(x) − 2φ(x, y) + q(y)] = (1 − t)t q(x − y),

and the result follows. □
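The identity in the proof can be spot-checked numerically. A sketch for a quadratic form q(x) = x^tSx on ℝ², with an arbitrary sample symmetric S and polar form φ(x, y) = x^tSy (exact arithmetic via Fraction):

```python
# Check (1-t)q(x) + t q(y) - q((1-t)x + ty) = (1-t)t q(x - y) at sample points.
from fractions import Fraction

S = [[2, 1], [1, 3]]                     # sample symmetric matrix

def q(x):
    return sum(x[i] * S[i][j] * x[j] for i in range(2) for j in range(2))

def phi(x, y):                           # polar form of q
    return sum(x[i] * S[i][j] * y[j] for i in range(2) for j in range(2))

x, y = [1, 4], [3, -2]
for t in (Fraction(0), Fraction(1, 3), Fraction(1, 2), Fraction(1)):
    cx = [(1 - t) * x[i] + t * y[i] for i in range(2)]
    lhs = (1 - t) * q(x) + t * q(y) - q(cx)
    rhs = (1 - t) * t * q([x[i] - y[i] for i in range(2)])
    assert lhs == rhs
```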

REFERENCES

1. A. Bruno, The Restricted 3-Body Problem: Plane Periodic Orbits, Walter de Gruyter (1994).
2. N. Burgoyne and R. Cushman, "Normal forms for real Hamiltonian systems," in: The Ames Research Center (NASA) Conference on Geometric Control Theory (1976).
3. P. G. Ciarlet, Introduction to Matrix Numerical Analysis and Optimization, Cambridge Univ. Press (1989).
4. H. Hofer and E. Zehnder, Symplectic Invariants and Hamiltonian Dynamics, Birkhäuser Verlag (1994).
5. A. Laub and K. R. Meyer, "Canonical forms for symplectic and Hamiltonian matrices," Celest. Mech. 9, 213–238 (1974).
6. P. Libermann and C.-M. Marle, Symplectic Geometry and Analytic Mechanics, D. Reidel (1987).
7. K. R. Meyer and G. R. Hall, Introduction to Hamiltonian Dynamical Systems and the N-Body Problem, Springer-Verlag (1992).
