
Solutions to Chapter 1


1.1 $|\mathbf{a}'\mathbf{b}| = |-9| = 9$, while $\|\mathbf{a}\|\,\|\mathbf{b}\| = \sqrt{6}\,\sqrt{22} \approx 11.489 > 9$.

1.2 To verify the Cauchy–Schwarz inequality, first see that the inequality holds trivially if $\mathbf{a}$ and $\mathbf{b}$ are zero vectors. We therefore assume that both $\mathbf{a}$ and $\mathbf{b}$ are nonzero. Let $\mathbf{c}$ be the vector $\mathbf{c} = x\mathbf{a} - y\mathbf{b}$, where $x = \mathbf{b}'\mathbf{b}$ and $y = \mathbf{a}'\mathbf{b}$. Clearly, $\mathbf{c}'\mathbf{c} \geq 0$. We express $\mathbf{c}'\mathbf{c}$ in terms of $x$ and $y$:
$$\mathbf{c}'\mathbf{c} = (x\mathbf{a} - y\mathbf{b})'(x\mathbf{a} - y\mathbf{b}) = x^2\,\mathbf{a}'\mathbf{a} - 2xy\,\mathbf{a}'\mathbf{b} + y^2\,\mathbf{b}'\mathbf{b}.$$
Since $\mathbf{c}'\mathbf{c} \geq 0$, and using the definitions of $x$ and $y$, we see that
$$(\mathbf{b}'\mathbf{b})^2(\mathbf{a}'\mathbf{a}) - 2(\mathbf{a}'\mathbf{b})^2(\mathbf{b}'\mathbf{b}) + (\mathbf{a}'\mathbf{b})^2(\mathbf{b}'\mathbf{b}) \geq 0,$$
and dividing by $\mathbf{b}'\mathbf{b}$ in the last inequality, we see that
$$(\mathbf{b}'\mathbf{b})(\mathbf{a}'\mathbf{a}) - (\mathbf{a}'\mathbf{b})^2 \geq 0,$$
which verifies the Cauchy–Schwarz inequality.

We use the Cauchy–Schwarz inequality to deduce the triangle inequality, which can be written in an equivalent form
$$\|\mathbf{a}+\mathbf{b}\|^2 \leq (\|\mathbf{a}\| + \|\mathbf{b}\|)^2.$$
The expression on the left is
$$\|\mathbf{a}+\mathbf{b}\|^2 = (\mathbf{a}+\mathbf{b})'(\mathbf{a}+\mathbf{b}) = \mathbf{a}'\mathbf{a} + 2\,\mathbf{a}'\mathbf{b} + \mathbf{b}'\mathbf{b} = \|\mathbf{a}\|^2 + 2\,\mathbf{a}'\mathbf{b} + \|\mathbf{b}\|^2,$$
while the expression on the right is
$$(\|\mathbf{a}\| + \|\mathbf{b}\|)^2 = \|\mathbf{a}\|^2 + 2\|\mathbf{a}\|\,\|\mathbf{b}\| + \|\mathbf{b}\|^2.$$
Comparing these two formulas, we see that the triangle inequality holds if and only if $\mathbf{a}'\mathbf{b} \leq \|\mathbf{a}\|\,\|\mathbf{b}\|$. By the Cauchy–Schwarz inequality, $|\mathbf{a}'\mathbf{b}| \leq \|\mathbf{a}\|\,\|\mathbf{b}\|$, so the triangle inequality follows as a consequence of the Cauchy–Schwarz inequality. The converse is also true; i.e., if the triangle inequality holds, then $\mathbf{a}'\mathbf{b} \leq \|\mathbf{a}\|\,\|\mathbf{b}\|$ holds for $\mathbf{a}$ and for $-\mathbf{a}$, from which the Cauchy–Schwarz inequality follows. If equality holds, i.e., if $\mathbf{a}'\mathbf{b} = \|\mathbf{a}\|\,\|\mathbf{b}\|$, then $\mathbf{b} = c\mathbf{a}$ for some scalar $c$. Hence, $\mathbf{a}'\mathbf{b} = c\|\mathbf{a}\|^2$, and $\|\mathbf{a}\|\,\|\mathbf{b}\| = |c|\,\|\mathbf{a}\|^2$. For nonnull $\mathbf{a}$, this implies that $c = |c|$, so that $c \geq 0$. If $\mathbf{b} \neq \mathbf{0}$, then $\mathbf{b} = c\mathbf{a}$, with $c > 0$.
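The algebra above can be spot-checked numerically. The sketch below (with arbitrary hypothetical vectors) forms the auxiliary vector $\mathbf{c} = x\mathbf{a} - y\mathbf{b}$ from the proof and verifies both inequalities with NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical test vectors; any nonzero real vectors work.
a = rng.standard_normal(5)
b = rng.standard_normal(5)

# The auxiliary vector from the proof: c = x*a - y*b with x = b'b, y = a'b.
x = b @ b
y = a @ b
c = x * a - y * b

# c'c >= 0 expands to (b'b)^2 (a'a) - (a'b)^2 (b'b) >= 0.
assert c @ c >= 0
assert (b @ b) ** 2 * (a @ a) - (a @ b) ** 2 * (b @ b) >= -1e-9

# Cauchy-Schwarz, and the triangle inequality that follows from it.
assert abs(a @ b) <= np.linalg.norm(a) * np.linalg.norm(b) + 1e-9
assert np.linalg.norm(a + b) <= np.linalg.norm(a) + np.linalg.norm(b) + 1e-9
```

The small tolerances guard only against floating-point roundoff; the inequalities themselves are exact.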

1.3 Since $\mathbf{x}'\mathbf{y} = 0 = \mathbf{x}'\mathbf{z} = \mathbf{y}'\mathbf{z}$, it follows that we must solve the equations $a^2 + b^2 = 1$ and $a^2 - b^2 = 0$, for which the solutions are $a = \pm 1/\sqrt{2}$ and $b = \pm 1/\sqrt{2}$.

1.4 Let
$$V = \begin{pmatrix} 1 & 2 & 0 \\ -1 & 0 & 2 \\ 0 & 1 & 1 \\ 1 & 1 & -1 \end{pmatrix}.$$
Solutions Manual for A First Course in Linear Model Theory, 2e, by Nalini Ravishanker, Zhiyi Chi, Dipak Dey (All Chapters)

Using elementary column transformations $C_2 - 2C_1$, and then $C_3 - C_2$, the matrix $V$ becomes
$$V = \begin{pmatrix} 1 & 0 & 0 \\ -1 & 2 & 0 \\ 0 & 1 & 0 \\ 1 & -1 & 0 \end{pmatrix},$$
so that the column rank of $V$ (or the dimension of its column space) is $2 < 3$. Therefore, $\mathbf{v}_1$, $\mathbf{v}_2$ and $\mathbf{v}_3$ are linearly dependent. It is easily seen that $\mathbf{v}_1$ and $\mathbf{v}_2$ are LIN, and that $\mathbf{v}_3 = \mathbf{v}_2 - 2\mathbf{v}_1$.
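The rank computation can be confirmed with NumPy, using $V$ as reconstructed above (columns $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3$):

```python
import numpy as np

# The matrix V from Exercise 1.4 (columns v1, v2, v3).
V = np.array([[ 1, 2,  0],
              [-1, 0,  2],
              [ 0, 1,  1],
              [ 1, 1, -1]])

# Column rank is 2 < 3, so the columns are linearly dependent ...
assert np.linalg.matrix_rank(V) == 2

# ... with v3 = v2 - 2*v1, while v1 and v2 alone are LIN.
v1, v2, v3 = V.T
assert np.array_equal(v3, v2 - 2 * v1)
assert np.linalg.matrix_rank(V[:, :2]) == 2
```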

1.5 $c_1\mathbf{v}_1 + c_2\mathbf{v}_2 + c_3\mathbf{v}_3 = \mathbf{0}$ results in the following three equations:
$$2c_1 + 8c_2 - 4c_3 = 0,$$
$$3c_1 - 6c_2 + 3c_3 = 0,$$
$$2c_1 + 5c_2 + c_3 = 0,$$
and the only solution is $c_1 = 0$, $c_2 = 0$, and $c_3 = 0$.
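A quick numerical check of the conclusion, with the coefficient matrix read off from the three equations above (a sketch; the signs of the coefficients are as reconstructed here):

```python
import numpy as np

# Coefficient matrix of the homogeneous system in Exercise 1.5.
M = np.array([[2,  8, -4],
              [3, -6,  3],
              [2,  5,  1]])

# A nonzero determinant means the only solution is c1 = c2 = c3 = 0,
# i.e. the three vectors are linearly independent.
assert abs(np.linalg.det(M)) > 1e-9
c = np.linalg.solve(M, np.zeros(3))
assert np.allclose(c, 0)
```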

1.6 $A$ is equivalent to the matrix
$$\begin{pmatrix} 1 & 3 & 3 \\ 0 & 4 & 4 \\ 0 & 0 & 1 \end{pmatrix},$$
so that the columns of $A$ are LIN.

1.7 Solve
$$\begin{pmatrix} 2 \\ 3 \end{pmatrix} = c_1 \begin{pmatrix} 1 \\ 2 \end{pmatrix} + c_2 \begin{pmatrix} 3 \\ 5 \end{pmatrix}$$
to get $c_1 = -1$ and $c_2 = 1$. Hence, $\mathbf{u}$ is in $\mathrm{Span}\{\mathbf{v}_1, \mathbf{v}_2\}$.
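The coefficients can be recovered numerically by solving the 2-by-2 linear system whose columns are $\mathbf{v}_1$ and $\mathbf{v}_2$:

```python
import numpy as np

# Express u = (2, 3)' in terms of v1 = (1, 2)' and v2 = (3, 5)'.
A = np.array([[1, 3],
              [2, 5]])   # columns are v1, v2
u = np.array([2, 3])

c = np.linalg.solve(A, u)
assert np.allclose(c, [-1, 1])   # c1 = -1, c2 = 1, so u is in Span{v1, v2}
```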

1.8 If $\mathbf{v}_1, \ldots, \mathbf{v}_m$ are linearly dependent, then there are scalars $c_1, \ldots, c_m$, not all zero, such that $\sum_{i=1}^m c_i\mathbf{v}_i = \mathbf{0}$. For any $k$ with $c_k \neq 0$, we then have $\mathbf{v}_k = -\sum_{i \neq k} (c_i/c_k)\mathbf{v}_i$, showing property 1 in Result 1.2.2. Suppose without loss of generality that $\mathbf{v}_1, \ldots, \mathbf{v}_s$ are linearly dependent and $c_1, \ldots, c_s$ are constants, not all zero, such that $\sum_{i=1}^s c_i\mathbf{v}_i = \mathbf{0}$. Let $c_j = 0$ for all $j = s+1, \ldots, n$. Then $c_1, \ldots, c_n$ are not all zero and $\sum_{i=1}^n c_i\mathbf{v}_i = \mathbf{0}$. Hence $\mathbf{v}_1, \ldots, \mathbf{v}_n$ are linearly dependent, showing property 2 in Result 1.2.2.

1.9 Let $S = \{\mathbf{v}_1, \ldots, \mathbf{v}_n\}$ denote a set of nonzero orthogonal vectors, and let $\mathbf{u}$ belong to the span of $S$ with
$$\mathbf{u} = c_1\mathbf{v}_1 + \cdots + c_n\mathbf{v}_n.$$
For a fixed $i = 1, \ldots, n$, take the inner product of each side with $\mathbf{v}_i$. Since $\mathbf{v}_i'\mathbf{v}_j = 0$ for $i \neq j$,
$$\mathbf{u}'\mathbf{v}_i = c_1(\mathbf{v}_1'\mathbf{v}_i) + \cdots + c_n(\mathbf{v}_n'\mathbf{v}_i) = c_i(\mathbf{v}_i'\mathbf{v}_i).$$
Hence, $c_i = (\mathbf{u}'\mathbf{v}_i)/(\mathbf{v}_i'\mathbf{v}_i)$. To verify linear independence, set $\mathbf{u} = c_1\mathbf{v}_1 + \cdots + c_n\mathbf{v}_n = \mathbf{0}$. This implies that $c_i = 0$, $i = 1, \ldots, n$, which in turn implies LIN of $\{\mathbf{v}_1, \ldots, \mathbf{v}_n\}$.


By definition, every $\mathbf{v} \in \mathcal{V}_1 + \cdots + \mathcal{V}_m$ can be written as $\mathbf{v}_1 + \cdots + \mathbf{v}_m$, where $\mathbf{v}_i \in \mathcal{V}_i$, $i = 1, \ldots, m$. Let $\mathbf{w}_i \in \mathcal{V}_i$, $i = 1, \ldots, m$, such that we also have $\mathbf{v} = \mathbf{w}_1 + \cdots + \mathbf{w}_m$. Then $(\mathbf{w}_1 - \mathbf{v}_1) + \cdots + (\mathbf{w}_m - \mathbf{v}_m) = \mathbf{0}$. For each $i$, pre-multiply both sides by $(\mathbf{w}_i - \mathbf{v}_i)'$. For $j \neq i$, since $\mathcal{V}_i \perp \mathcal{V}_j$, $(\mathbf{w}_i - \mathbf{v}_i)'(\mathbf{w}_j - \mathbf{v}_j) = 0$. As a result, $(\mathbf{w}_i - \mathbf{v}_i)'(\mathbf{w}_i - \mathbf{v}_i) = \|\mathbf{w}_i - \mathbf{v}_i\|^2 = 0$, giving $\mathbf{w}_i = \mathbf{v}_i$. Hence by definition, the sum of the $\mathcal{V}_i$'s is a direct sum.

1.10 Since $\{\mathbf{v}_1, \ldots, \mathbf{v}_m\}$ is a basis of $\mathcal{V}$, the vectors are LIN. So from
$$\mathbf{y}_1 = \mathbf{v}_1, \qquad \mathbf{y}_k = \mathbf{v}_k - \sum_{i=1}^{k-1} \frac{\mathbf{y}_i'\mathbf{v}_k}{\|\mathbf{y}_i\|^2}\,\mathbf{y}_i, \quad k = 2, \ldots, m,$$
we have $\mathbf{y}_k \neq \mathbf{0}$, implying that $\mathbf{z}_k = \mathbf{y}_k/\|\mathbf{y}_k\|$ are well-defined and each has length 1. On the other hand, for $1 \leq j < k \leq m$,
$$\mathbf{y}_j'\mathbf{y}_k = \mathbf{y}_j'\mathbf{v}_k - \sum_{i=1}^{k-1} \frac{\mathbf{y}_i'\mathbf{v}_k}{\|\mathbf{y}_i\|^2}\,\mathbf{y}_j'\mathbf{y}_i = -\sum_{i=1}^{j-1} \frac{\mathbf{y}_i'\mathbf{v}_k}{\|\mathbf{y}_i\|^2}\,\mathbf{y}_i'\mathbf{y}_j - \sum_{i=j+1}^{k-1} \frac{\mathbf{y}_i'\mathbf{v}_k}{\|\mathbf{y}_i\|^2}\,\mathbf{y}_j'\mathbf{y}_i,$$
since the $i = j$ term equals $\mathbf{y}_j'\mathbf{v}_k$ and cancels. If $k = 2$, then $j = 1$, and it is straightforward to see that $\mathbf{y}_1'\mathbf{y}_2 = 0$. Suppose we have shown that for all $i < j < k$, $\mathbf{y}_i'\mathbf{y}_j = 0$. Then the above identity shows that for all $j < k$, $\mathbf{y}_j'\mathbf{y}_k = 0$. By induction, the $\mathbf{y}_i$'s are orthogonal to each other. Then $\mathbf{z}_1, \ldots, \mathbf{z}_m$ are orthonormal. Since they are LIN from Exercise 1.9, and there are $m$ of them, they form an orthonormal basis of $\mathcal{V}$.
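The recursion in this exercise is the Gram–Schmidt procedure; a minimal NumPy sketch, assuming a hypothetical basis of $\mathbb{R}^3$:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a LIN list of vectors by the recursion in Exercise 1.10."""
    ys = []
    for v in vectors:
        # y_k = v_k - sum_i (y_i' v_k / ||y_i||^2) y_i
        y = v - sum((yi @ v) / (yi @ yi) * yi for yi in ys)
        ys.append(y)
    return [y / np.linalg.norm(y) for y in ys]   # z_k = y_k / ||y_k||

# Hypothetical basis of R^3.
vs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0]),
      np.array([0.0, 1.0, 1.0])]
Z = np.column_stack(gram_schmidt(vs))
assert np.allclose(Z.T @ Z, np.eye(3))   # the z_k form an orthonormal basis
```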

1.11 $\mathcal{W} \cap (\mathcal{W}^\perp \cap \mathcal{V}) \subseteq \mathcal{W} \cap \mathcal{W}^\perp = \{\mathbf{0}\}$. From $\mathcal{W} \subseteq \mathcal{V}$ and $\mathcal{W}^\perp \cap \mathcal{V} \subseteq \mathcal{V}$, $\mathcal{W} \oplus (\mathcal{W}^\perp \cap \mathcal{V}) \subseteq \mathcal{V}$. On the other hand, for any $\mathbf{v} \in \mathcal{V}$, there are unique $\mathbf{w} \in \mathcal{W}$ and $\mathbf{u} \in \mathcal{W}^\perp$ such that $\mathbf{v} = \mathbf{w} + \mathbf{u}$. Since $\mathbf{w} \in \mathcal{V}$, then $\mathbf{u} = \mathbf{v} - \mathbf{w} \in \mathcal{V}$, and so $\mathbf{u} \in \mathcal{W}^\perp \cap \mathcal{V}$. As a result, $\mathcal{V} \subseteq \mathcal{W} \oplus (\mathcal{W}^\perp \cap \mathcal{V})$. Then $\mathcal{V} = \mathcal{W} \oplus (\mathcal{W}^\perp \cap \mathcal{V})$. From the paragraph below Definition 1.2.8, $\dim \mathcal{V} = \dim \mathcal{W} + \dim(\mathcal{W}^\perp \cap \mathcal{V})$, completing the proof of property 1. Next, by $\mathcal{W}^\perp \cap \mathcal{V} \subseteq \mathcal{W}^\perp$, $(\mathcal{W}^\perp \cap \mathcal{V})^\perp \cap \mathcal{V} \supseteq (\mathcal{W}^\perp)^\perp \cap \mathcal{V} = \mathcal{W} \cap \mathcal{V} = \mathcal{W}$. On the other hand, if $\mathbf{v} \in \mathcal{V}$ and $\mathbf{v} \perp (\mathcal{W}^\perp \cap \mathcal{V})$, then by property 1, there are unique $\mathbf{w} \in \mathcal{W}$ and $\mathbf{u} \in \mathcal{W}^\perp \cap \mathcal{V}$ such that $\mathbf{v} = \mathbf{w} + \mathbf{u}$. From the assumption, $\mathbf{v} \perp \mathbf{u}$. Meanwhile, $\mathbf{w} \perp \mathbf{u}$. Then $\mathbf{u}'\mathbf{u} = \mathbf{u}'(\mathbf{v} - \mathbf{w}) = 0$, so $\mathbf{u} = \mathbf{0}$. Then $\mathbf{v} = \mathbf{w} \in \mathcal{W}$. As a result, $(\mathcal{W}^\perp \cap \mathcal{V})^\perp \cap \mathcal{V} \subseteq \mathcal{W}$. Then $(\mathcal{W}^\perp \cap \mathcal{V})^\perp \cap \mathcal{V} = \mathcal{W}$, showing property 2.

1.12 For $A = \{a_{ij}\}$, $i, j = 1, 2, 3$, to commute with the matrix $B$, we require that $AB = BA$. Computing the product on both sides, and equating them, we see that the conditions are $a_{11} = a_{22} = a_{33}$, $a_{12} = a_{23}$, and $a_{21} = a_{31} = a_{32} = 0$.

1.13
$$A^k = \begin{pmatrix} a^k & b(1 + a + \cdots + a^{k-1}) \\ 0 & 1 \end{pmatrix}.$$
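Assuming $A = \begin{pmatrix} a & b \\ 0 & 1 \end{pmatrix}$ (the form implied by the stated answer; the original exercise statement is not reproduced here), the formula can be spot-checked:

```python
import numpy as np

# Hypothetical scalar values for the check.
a, b, k = 2.0, 3.0, 5
A = np.array([[a, b], [0.0, 1.0]])

# Off-diagonal entry of A^k: b(1 + a + ... + a^{k-1}).
delta = b * sum(a ** j for j in range(k))
expected = np.array([[a ** k, delta], [0.0, 1.0]])
assert np.allclose(np.linalg.matrix_power(A, k), expected)
```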

1.14 $(A - B)C$ is equal to $A^k - B^k$, where $C = A^{k-1} + A^{k-2}B + \cdots + AB^{k-2} + B^{k-1}$.
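The telescoping factorization needs $A$ and $B$ to commute; a sketch using hypothetical commuting matrices (both polynomials in the same matrix $M$, so they commute):

```python
import numpy as np

# Hypothetical commuting pair: A = M + I and B = M.
M = np.array([[1.0, 2.0], [3.0, 4.0]])
A, B = M + np.eye(2), M
k = 4

# C = A^{k-1} + A^{k-2} B + ... + A B^{k-2} + B^{k-1}
C = sum(np.linalg.matrix_power(A, k - 1 - j) @ np.linalg.matrix_power(B, j)
        for j in range(k))

lhs = (A - B) @ C
rhs = np.linalg.matrix_power(A, k) - np.linalg.matrix_power(B, k)
assert np.allclose(lhs, rhs)   # (A - B) C = A^k - B^k
```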

1.15 $(A'A)' = A'(A')' = A'A$, and $(AA')' = (A')'A' = AA'$.

1.16 If $A = O$, then clearly $A'A = O$. To show the converse, let the column vectors of $A$ be $\mathbf{a}_1, \ldots, \mathbf{a}_n$. Since $A'A = \{\mathbf{a}_i'\mathbf{a}_j\}$, if $A'A = O$, then for all $i = 1, \ldots, n$, $\mathbf{a}_i'\mathbf{a}_i = 0$, giving $\mathbf{a}_i = \mathbf{0}$, and so $A = O$.

1.17 (a)
$$\mathrm{tr}\Big(A \sum_{i=1}^k \mathbf{x}_i\mathbf{x}_i'\Big) = \mathrm{tr}\Big(\sum_{i=1}^k A\mathbf{x}_i\mathbf{x}_i'\Big) = \sum_{i=1}^k \mathrm{tr}(A\mathbf{x}_i\mathbf{x}_i') = \sum_{i=1}^k \mathrm{tr}(\mathbf{x}_i'A\mathbf{x}_i) = \sum_{i=1}^k \mathbf{x}_i'A\mathbf{x}_i.$$
(b) $\mathrm{tr}(B^{-1}AB) = \mathrm{tr}(BB^{-1}A) = \mathrm{tr}(A)$.
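Both trace identities can be verified numerically on hypothetical inputs:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
xs = [rng.standard_normal(3) for _ in range(4)]

# (a) tr(A * sum_i x_i x_i') = sum_i x_i' A x_i
lhs = np.trace(A @ sum(np.outer(x, x) for x in xs))
rhs = sum(x @ A @ x for x in xs)
assert np.isclose(lhs, rhs)

# (b) tr(B^{-1} A B) = tr(A), by the cyclic property of the trace.
B = rng.standard_normal((3, 3)) + 3 * np.eye(3)   # shifted to keep B invertible
assert np.isclose(np.trace(np.linalg.inv(B) @ A @ B), np.trace(A))
```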

1.18 Let $A = \{a_{ij}\}$ and $B = \{b_{ij}\}$ be lower triangular matrices with $a_{ii} = a_i$ and $b_{ii} = b_i$. Then $a_{ij} = b_{ij} = 0$ for all $i < j$. Now $C = AB = \{c_{ij}\}$ with $c_{ij} = \sum_{k=1}^n a_{ik}b_{kj}$. If $i < j$, then for every $k = 1, \ldots, n$, either $i < k$ or $k < j$, so $a_{ik}b_{kj} = 0$, giving $c_{ij} = 0$. Then $C$ is lower triangular. If $i = j$, then $a_{ik}b_{ki} \neq 0$ only if $k = i$, so $c_{ii} = a_{ii}b_{ii} = a_i b_i$. This completes the proof for the lower triangular case. The proof for the upper triangular case is similar.
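A quick check of the triangular-product facts on hypothetical matrices:

```python
import numpy as np

# Hypothetical lower triangular matrices.
A = np.array([[2.0, 0.0, 0.0],
              [1.0, 3.0, 0.0],
              [4.0, 5.0, 6.0]])
B = np.array([[7.0, 0.0, 0.0],
              [8.0, 9.0, 0.0],
              [1.0, 2.0, 3.0]])

C = A @ B
# The product is again lower triangular ...
assert np.allclose(np.triu(C, k=1), 0)
# ... and its diagonal is the elementwise product of the diagonals: c_ii = a_i b_i.
assert np.allclose(np.diag(C), np.diag(A) * np.diag(B))
```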

1.19 Since $\mathrm{tr}(AB) = \sum_{i=1}^m \sum_{j=1}^n a_{ij}b_{ji} = \sum_{j=1}^n \sum_{i=1}^m b_{ji}a_{ij} = \mathrm{tr}(BA)$, property 2 follows. Property 4 is a direct consequence, since we can regard $ABC$ as the product of $AB$ and $C$, or of $A$ and $BC$. Both $\mathrm{tr}(AA')$ and $\mathrm{tr}(A'A)$ equal the sum of squares of the elements of $A$, which is nonnegative; this shows property 7, of which property 8 is an immediate consequence.

1.20 If $A = \{a_{kl}\}$ is lower triangular, then $a_{kl} = 0$ for all $k < l$. Now $M_{ij} = \{m_{kl}\}$, where
$$m_{kl} = \begin{cases} a_{kl} & \text{if } k < i,\ l < j, \\ a_{k+1,l} & \text{if } k \geq i,\ l < j, \\ a_{k,l+1} & \text{if } k < i,\ l \geq j, \\ a_{k+1,l+1} & \text{if } k \geq i,\ l \geq j. \end{cases}$$
If $i > j$, then for $k < l$, the second case in the display is not possible, and all the other cases are possible, each one giving value 0. Therefore, $m_{kl} = 0$, so that $M_{ij}$ is lower triangular. Furthermore, for any $k = j, \ldots, i-1$, $m_{kk} = a_{k,k+1} = 0$, so $M_{ij}$ has at least one diagonal element equal to zero. This verifies the first fact. From (1.3.1), the $i$th diagonal element of $A^{-1}$ is $|M_{ii}|/|A|$. If the diagonal elements of $A$ are $d_1, \ldots, d_n$, then the diagonal elements of $M_{ii}$ are $d_1, \ldots, d_{i-1}, d_{i+1}, \ldots, d_n$. Therefore $|M_{ii}| = d_1 \cdots d_{i-1}d_{i+1} \cdots d_n$ and $|A| = d_1 \cdots d_n$. Then the $i$th diagonal element of $A^{-1}$ is $1/d_i$.
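The conclusion about the diagonal of $A^{-1}$ can be spot-checked on a hypothetical lower triangular matrix:

```python
import numpy as np

# Hypothetical lower triangular A with nonzero diagonal d = (2, -3, 5).
A = np.array([[2.0,  0.0, 0.0],
              [4.0, -3.0, 0.0],
              [1.0,  6.0, 5.0]])

Ainv = np.linalg.inv(A)
# The inverse of a lower triangular matrix is lower triangular ...
assert np.allclose(np.triu(Ainv, k=1), 0)
# ... with ith diagonal element 1/d_i.
assert np.allclose(np.diag(Ainv), 1 / np.diag(A))
```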

1.21 $|A| = 8$.

1.22 The determinant equals $1 + a^2 + a^4 + \cdots + a^{2n} = [1 - a^{2(n+1)}]/[1 - a^2]$.

1.23, 1.24 $\sum_{i=1}^{n} a_i$.

1.25 $|(\lambda - 1)I_n + J_n|$. If $\lambda \neq 1$, then from 12 of Result 1.3.6, the determinant is
$$(\lambda - 1)^n\,|I_n + (\lambda - 1)^{-1}\mathbf{1}_n\mathbf{1}_n'| = (\lambda - 1)^n\,[1 + (\lambda - 1)^{-1}\mathbf{1}_n'\mathbf{1}_n] = (\lambda - 1)^n\,[1 + n(\lambda - 1)^{-1}] = (\lambda - 1)^{n-1}\,[\lambda + (n - 1)].$$
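The closed form can be spot-checked numerically for several values of $n$ and $\lambda$:

```python
import numpy as np

# Check |(lam - 1) I_n + J_n| = (lam - 1)^{n-1} (lam + n - 1) for lam != 1.
for n in (2, 3, 5):
    for lam in (0.5, 2.0, 4.0):
        M = (lam - 1) * np.eye(n) + np.ones((n, n))   # J_n = 1_n 1_n'
        det = np.linalg.det(M)
        formula = (lam - 1) ** (n - 1) * (lam + n - 1)
        assert np.isclose(det, formula)
```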

