EE239AS Homework 1: Linear algebra refresher


  1. Linear algebra refresher.
    •  Let A be a square matrix, and further let AA^T = I.
      1. Construct a 2 × 2 example of A and derive the eigenvalues and eigenvectors of this example. Show all work (i.e., do not use a computer’s eigenvalue decomposition capabilities). You may not use a diagonal matrix as your 2 × 2 example. What do you notice about the eigenvalues and eigenvectors?
      2. Show that A has eigenvalues with norm 1.
      3. Show that the eigenvectors of A corresponding to distinct eigenvalues are orthogonal.
      4. (3 points) In words, describe what may happen to a vector x under the transformation Ax.
    •  Let A be a matrix.
      1. What is the relationship between the singular vectors of A and the eigenvectors of AA^T? What about A^TA?
      2. What is the relationship between the singular values of A and the eigenvalues of AA^T? What about A^TA?
    •  True or false. Partial credit on an incorrect solution may be awarded if you justify your answer.
      1. Every linear operator in an n-dimensional vector space has n distinct eigenvalues.
      2. A non-zero sum of two eigenvectors of a matrix A is an eigenvector.
      3. If a matrix A has the positive semidefinite property, i.e., x^TAx ≥ 0 for all x, then its eigenvalues must be non-negative.
      4. The rank of a matrix can exceed the number of non-zero eigenvalues.
      5. A non-zero sum of two eigenvectors of a matrix A corresponding to the same eigenvalue λ is always an eigenvector.
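The claims above are easy to sanity-check numerically before proving them. A minimal sketch, assuming NumPy; the rotation matrix below is just one valid non-diagonal choice of an orthogonal A, not the required hand derivation:

```python
import numpy as np

# A 2x2 rotation matrix: orthogonal (A A^T = I) but not diagonal.
theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Eigenvalues of an orthogonal matrix lie on the unit circle (norm 1).
eigvals = np.linalg.eigvals(A)
print(np.abs(eigvals))  # both magnitudes are 1

# Singular values of M vs. eigenvalues of M M^T: sigma_i^2 = lambda_i(M M^T).
M = np.random.default_rng(0).normal(size=(3, 2))
sigma = np.linalg.svd(M, compute_uv=False)
lam = np.linalg.eigvalsh(M @ M.T)          # eigenvalues, ascending
print(np.allclose(sorted(sigma**2), sorted(lam[lam > 1e-10])))
```

The same check with M^T M instead of M M^T recovers the identical non-zero eigenvalues, which is the relationship the question is probing.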
  2. Probability refresher.
    •  A jar of coins is equally populated with two types of coins. One is type “H50” and comes up heads with probability 0.5. Another is type “H60” and comes up heads with probability 0.6.
      1. You take one coin from the jar and flip it. It lands tails. What is the posterior probability that this is an H50 coin?
      2.  You put the coin back, take another, and flip it 4 times. It lands T, H, H, H. How likely is the coin to be type H50?
  •  A new jar is now equally populated with coins of type H50, H55, and H60 (with probabilities of coming up heads 0.5, 0.55, and 0.6, respectively). You take one coin and flip it 10 times. It lands heads 9 times. How likely is the coin to be of each possible type?
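These coin posteriors all follow the same Bayes' rule pattern, so they can be checked with one small sketch. The helper `coin_posterior` is hypothetical (not part of the assignment), and the priors are uniform as the jars are equally populated:

```python
import numpy as np
from math import comb

def coin_posterior(p_heads, k, n):
    """Posterior over coin types after observing k heads in n flips,
    with a uniform prior over the types in p_heads."""
    likelihood = np.array([comb(n, k) * p**k * (1 - p)**(n - k) for p in p_heads])
    prior = np.full(len(p_heads), 1.0 / len(p_heads))
    unnorm = likelihood * prior
    return unnorm / unnorm.sum()

# One tails flip from the {H50, H60} jar: P(H50 | T) = 0.5 / (0.5 + 0.4).
print(coin_posterior([0.5, 0.6], k=0, n=1))
# T, H, H, H from the same jar:
print(coin_posterior([0.5, 0.6], k=3, n=4))
# 9 heads in 10 flips from the {H50, H55, H60} jar:
print(coin_posterior([0.5, 0.55, 0.6], k=9, n=10))
```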
  •  Consider a pregnancy test with the following statistics.
    • If the woman is pregnant, the test returns “positive” (or 1, indicating the woman is pregnant) 99% of the time.
    • If the woman is not pregnant, the test returns “positive” 10% of the time.
    • At any given point in time, 99% of the female population is not pregnant.

What is the probability that a woman is pregnant given she received a positive test?

The answer should make intuitive sense; give an explanation of the result that you find.
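The intuition is easiest to see by running the numbers stated above through Bayes' rule directly; a short sketch:

```python
# Bayes' rule with the test statistics given in the problem.
p_pregnant = 0.01          # prior: 1% of the population is pregnant
p_pos_given_preg = 0.99    # sensitivity
p_pos_given_not = 0.10     # false-positive rate

# Total probability of a positive test, then the posterior.
p_pos = p_pos_given_preg * p_pregnant + p_pos_given_not * (1 - p_pregnant)
posterior = p_pos_given_preg * p_pregnant / p_pos
print(posterior)  # ~0.09: most positives come from the much larger non-pregnant group
```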

  •  Let x1, x2, …, xn be identically distributed random variables. A random vector, x, is defined as

x = [x1, x2, …, xn]^T

What is E(Ax + b) in terms of E(x), given that A and b are deterministic?

  •  Let cov(x) = E[(x − E(x))(x − E(x))^T] denote the covariance of the random vector x.

What is cov(Ax + b) in terms of cov(x), given that A and b are deterministic?
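Both affine-transformation identities can be checked on sampled data, since the sample mean and sample covariance transform exactly the same way as their population counterparts. A sketch assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(1)
n_dim, m_dim, N = 3, 2, 1000

A = rng.normal(size=(m_dim, n_dim))
b = rng.normal(size=m_dim)
# Correlated samples of x (rows are observations):
x = rng.normal(size=(N, n_dim)) @ rng.normal(size=(n_dim, n_dim))

y = x @ A.T + b  # samples of Ax + b
cov_x = np.cov(x, rowvar=False)
cov_y = np.cov(y, rowvar=False)

# The shift b drops out of the covariance; A appears on both sides.
print(np.allclose(cov_y, A @ cov_x @ A.T))                   # True
print(np.allclose(y.mean(axis=0), A @ x.mean(axis=0) + b))   # True
```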

  3. Multivariate derivatives.
    •  Let x ∈ R^n, y ∈ R^m, and A ∈ R^(n×m). What is ∇_x x^TAy?
    •  What is ∇_y x^TAy?
    •  What is ∇_A x^TAy?
    •  Let f = x^TAx + b^Tx. What is ∇_x f?
    •  Let f = tr(AB). What is ∇_A f?
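Answers to gradient questions like these can always be verified against finite differences. A sketch, assuming NumPy, for the first bilinear form (the other gradients follow the same checking pattern):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 4, 3
A = rng.normal(size=(n, m))
x = rng.normal(size=n)
y = rng.normal(size=m)

# Candidate analytic gradients of f(x, y, A) = x^T A y:
grad_x = A @ y           # gradient with respect to x
grad_y = A.T @ x         # gradient with respect to y
grad_A = np.outer(x, y)  # gradient with respect to A

# Central finite-difference check of grad_x, coordinate by coordinate.
eps = 1e-6
fd = np.array([((x + eps * e) @ A @ y - (x - eps * e) @ A @ y) / (2 * eps)
               for e in np.eye(n)])
print(np.allclose(fd, grad_x, atol=1e-5))  # True
```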
  4. Deriving least-squares with matrix derivatives.

In least-squares, we seek to estimate some multivariate output y via the model

ŷ = Wx

In the training set we’re given paired data examples (x^(i), y^(i)) for i = 1, …, n. Least-squares is the following quadratic optimization problem:

minimize over W:   Σ_{i=1}^{n} ‖ y^(i) − W x^(i) ‖²
Derive the optimal W.

Hint: you may find the following derivatives useful:

∂/∂W tr(WA) = A^T

∂/∂W tr(WAW^T) = WA + WA^T
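The closed form that falls out of setting the matrix derivative to zero can be checked numerically. A sketch assuming NumPy, with X collecting the x^(i) as columns and Y the y^(i):

```python
import numpy as np

rng = np.random.default_rng(3)
n, p, q = 50, 4, 2            # samples, input dim, output dim
X = rng.normal(size=(p, n))   # columns are the x^(i)
W_true = rng.normal(size=(q, p))
Y = W_true @ X + 0.01 * rng.normal(size=(q, n))  # noisy targets

# Candidate closed form from zeroing the gradient of the objective:
# W* = Y X^T (X X^T)^{-1}
W_hat = Y @ X.T @ np.linalg.inv(X @ X.T)

# Cross-check against NumPy's least-squares solver (solves X^T W^T = Y^T).
W_lstsq = np.linalg.lstsq(X.T, Y.T, rcond=None)[0].T
print(np.allclose(W_hat, W_lstsq))   # True
print(np.allclose(W_hat, W_true, atol=0.05))  # recovers W up to noise
```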


  5. (30 points) Hello World in Jupyter.

Complete the Jupyter notebook linear regression.ipynb. Print out the Jupyter notebook and submit it to Gradescope.