## Description

**Linear algebra refresher.**

- Let **A** be a square matrix, and further let **AA**^{T} = **I**.
  - Construct a 2 × 2 example of **A** and derive the eigenvalues and eigenvectors of this example. Show all work (i.e., do not use a computer's eigenvalue decomposition capabilities). You may not use a diagonal matrix as your 2 × 2 example. What do you notice about the eigenvalues and eigenvectors?
  - Show that **A** has eigenvalues with norm 1.
  - Show that the eigenvectors of **A** corresponding to distinct eigenvalues are orthogonal.
  - (3 points) In words, describe what may happen to a vector **x** under the transformation **Ax**.
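As a numerical sanity check (not a substitute for the by-hand derivation the problem requires), you can verify the unit-norm and orthogonality properties with NumPy. The 90° rotation matrix below is just one possible non-diagonal choice of **A**, not necessarily the example you should use:

```python
import numpy as np

# A non-diagonal 2x2 orthogonal matrix: rotation by 90 degrees.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# Confirm the defining property from the problem: A A^T = I.
assert np.allclose(A @ A.T, np.eye(2))

eigvals, eigvecs = np.linalg.eig(A)

# Every eigenvalue should have norm 1 (here they are +/- i).
print(np.abs(eigvals))

# Eigenvectors for distinct eigenvalues should be orthogonal under the
# complex inner product (np.vdot conjugates its first argument).
inner = np.vdot(eigvecs[:, 0], eigvecs[:, 1])
print(abs(inner))
```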

- Let **A** be a matrix.
  - What is the relationship between the singular vectors of **A** and the eigenvectors of **AA**^{T}? What about **A**^{T}**A**?
  - What is the relationship between the singular values of **A** and the eigenvalues of **AA**^{T}? What about **A**^{T}**A**?

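To build intuition before answering, a small NumPy experiment comparing `np.linalg.svd` with the eigenvalues of **A**^{T}**A** can help; the matrix below is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2))

# Singular values of A.
s = np.linalg.svd(A, compute_uv=False)

# Eigenvalues of A^T A (symmetric PSD, so eigvalsh is appropriate).
evals = np.linalg.eigvalsh(A.T @ A)

# Compare the squared singular values with the eigenvalues of A^T A.
print(np.sort(s**2))
print(np.sort(evals))
```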
- True or False. Partial credit on an incorrect solution may be awarded if you justify your answer.
  - Every linear operator in an *n*-dimensional vector space has *n* distinct eigenvalues.
  - A non-zero sum of two eigenvectors of a matrix **A** is an eigenvector.
  - If a matrix **A** has the positive semidefinite property, i.e., **x**^{T}**Ax** ≥ 0 for all **x**, then its eigenvalues must be non-negative.
  - The rank of a matrix can exceed the number of non-zero eigenvalues.
  - A non-zero sum of two eigenvectors of a matrix **A** corresponding to the same eigenvalue *λ* is always an eigenvector.
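For statements like these, probing small concrete matrices numerically can sharpen your intuition before you commit to an answer. For instance, for the rank statement you might compare the rank of a non-symmetric matrix with its eigenvalues (the matrix below is one arbitrary probe, and the experiment is not a proof either way):

```python
import numpy as np

# Compare rank with the number of non-zero eigenvalues for a
# non-symmetric 2x2 matrix.
N = np.array([[0.0, 1.0],
              [0.0, 0.0]])

print(np.linalg.matrix_rank(N))  # rank of N
print(np.linalg.eigvals(N))      # eigenvalues of N
```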

**Probability refresher.**

- A jar of coins is equally populated with two types of coins. One is type “H50” and comes up heads with probability 0.5. Another is type “H60” and comes up heads with probability 0.6.
  - You take one coin from the jar and flip it. It lands tails. What is the posterior probability that this is an H50 coin?
  - You put the coin back, take another, and flip it 4 times. It lands T, H, H, H. How likely is the coin to be type H50?
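A few lines of Python applying Bayes' rule give you a way to check your hand-computed posteriors; the priors and likelihoods below come straight from the problem statement:

```python
# Prior: the jar is equally populated with H50 and H60 coins.
prior = {"H50": 0.5, "H60": 0.5}
p_heads = {"H50": 0.5, "H60": 0.6}

# (i) One flip lands tails.
lik_tails = {c: 1 - p_heads[c] for c in prior}
evidence = sum(prior[c] * lik_tails[c] for c in prior)
post_h50_tails = prior["H50"] * lik_tails["H50"] / evidence
print(post_h50_tails)  # 0.25 / (0.25 + 0.20) = 5/9

# (ii) A fresh coin lands T, H, H, H.
def seq_likelihood(p, flips):
    out = 1.0
    for f in flips:
        out *= p if f == "H" else 1 - p
    return out

lik_seq = {c: seq_likelihood(p_heads[c], "THHH") for c in prior}
evidence = sum(prior[c] * lik_seq[c] for c in prior)
post_h50_seq = prior["H50"] * lik_seq["H50"] / evidence
print(post_h50_seq)
```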

- A new jar is now equally populated with coins of type H50, H55, and H60 (with probabilities of coming up heads 0.5, 0.55, and 0.6, respectively). You take one coin and flip it 10 times. It lands heads 9 times. How likely is the coin to be of each possible type?
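The same Bayes'-rule pattern extends to three coin types. Note that the binomial coefficient C(10, 9) is common to every type, so it cancels in the normalization and only p^9 (1 − p) matters:

```python
# Equal priors over three coin types; observed 9 heads in 10 flips.
p_heads = {"H50": 0.50, "H55": 0.55, "H60": 0.60}

# Likelihood of the data is C(10,9) p^9 (1-p); the binomial coefficient
# cancels after normalization, so it is omitted here.
unnorm = {c: p**9 * (1 - p) for c, p in p_heads.items()}
total = sum(unnorm.values())
posterior = {c: v / total for c, v in unnorm.items()}
print(posterior)
```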

- Consider a pregnancy test with the following statistics.
- If the woman is pregnant, the test returns “positive” (or 1, indicating the woman is pregnant) 99% of the time.
- If the woman is not pregnant, the test returns “positive” 10% of the time.
- At any given point in time, 99% of the female population is not pregnant.

What is the probability that a woman is pregnant given she received a positive test?

The answer should make intuitive sense; give an explanation of the result that you find.
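To check your hand calculation, Bayes' rule with the numbers from the problem statement fits in a few lines:

```python
# Test characteristics and base rate from the problem statement.
p_pos_given_preg = 0.99   # sensitivity
p_pos_given_not = 0.10    # false-positive rate
p_not_preg = 0.99         # base rate of not being pregnant
p_preg = 1 - p_not_preg

# Total probability of a positive test, then Bayes' rule.
p_pos = p_pos_given_preg * p_preg + p_pos_given_not * p_not_preg
p_preg_given_pos = p_pos_given_preg * p_preg / p_pos
print(p_preg_given_pos)  # 0.0099 / 0.1089 = 1/11
```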

- Let *x*_{1}, *x*_{2}, …, *x*_{n} be identically distributed random variables. A random vector, **x**, is defined as

  **x** = (*x*_{1}, *x*_{2}, …, *x*_{n})^{T}

  What is E(**Ax** + **b**) in terms of E(**x**), given that **A** and **b** are deterministic?

- Let **cov**(**x**) denote the covariance matrix of the random vector **x**.

  What is **cov**(**Ax** + **b**) in terms of **cov**(**x**), given that **A** and **b** are deterministic?
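Once you have candidate identities for both questions, you can confirm them with sample moments in NumPy (the identities hold exactly for sample means and sample covariances, since both are affine/bilinear in the data):

```python
import numpy as np

# Check E(Ax + b) = A E(x) + b and cov(Ax + b) = A cov(x) A^T
# using sample moments.
rng = np.random.default_rng(1)
n, m, samples = 3, 2, 1000

A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# Correlated, non-zero-mean samples; rows are observations.
x = rng.standard_normal((samples, n)) @ rng.standard_normal((n, n)) + 5.0
y = x @ A.T + b

print(np.allclose(y.mean(axis=0), A @ x.mean(axis=0) + b))
print(np.allclose(np.cov(y.T), A @ np.cov(x.T) @ A.T))
```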

**Multivariate derivatives.**

- Let **x** ∈ R^{n}, **y** ∈ R^{m}, and **A** ∈ R^{n×m}. What is ∇_{**x**} **x**^{T}**Ay**?
- What is ∇_{**y**} **x**^{T}**Ay**?
- What is ∇_{**A**} **x**^{T}**Ay**?
- Let *f* = **x**^{T}**Ax** + **b**^{T}**x**. What is ∇_{**x**} *f*?
- Let *f* = tr(**AB**). What is ∇_{**A**} *f*?
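Whatever formulas you derive, a central finite-difference check is a cheap way to validate them. The sketch below checks one candidate gradient for the first question, the standard identity ∇_{**x**} **x**^{T}**Ay** = **Ay** (derive it yourself before trusting it); the same loop works for the other parts:

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 4, 3
A = rng.standard_normal((n, m))
x = rng.standard_normal(n)
y = rng.standard_normal(m)

f = lambda x: x @ A @ y

# Candidate gradient of x^T A y with respect to x: A y.
claimed = A @ y

# Central finite differences, one coordinate at a time.
eps = 1e-6
numeric = np.zeros(n)
for i in range(n):
    e = np.zeros(n)
    e[i] = eps
    numeric[i] = (f(x + e) - f(x - e)) / (2 * eps)

print(np.allclose(claimed, numeric, atol=1e-6))
```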

**Deriving least-squares with matrix derivatives.**

In least-squares, we seek to estimate some multivariate output **y **via the model

**ŷ** = **Wx**

In the training set we’re given paired data examples (**x**^{(i)}, **y**^{(i)}) for *i* = 1, …, *n*. Least-squares is the following quadratic optimization problem:

**W*** = argmin_{**W**} Σ_{i=1}^{n} ‖**y**^{(i)} − **Wx**^{(i)}‖^{2}

Derive the optimal **W**.

Hint: you may find the following derivatives useful:

*∂*tr(**WA**)/*∂***W** = **A**^{T}

*∂*tr(**WAW**^{T})/*∂***W** = **WA** + **WA**^{T}
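After deriving **W** in closed form by setting the gradient to zero, one way to sanity-check it numerically is against `np.linalg.lstsq`, which solves the same quadratic problem; the synthetic data below is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(3)
n, d_in, d_out = 50, 4, 2

# Columns-as-examples convention: X is d_in x n, Y is d_out x n.
X = rng.standard_normal((d_in, n))
Y = rng.standard_normal((d_out, n))

# np.linalg.lstsq uses rows-as-examples, so transpose both sides;
# it returns W^T, which we transpose back.
W_ref = np.linalg.lstsq(X.T, Y.T, rcond=None)[0].T

# The W you derive should match W_ref, and no small perturbation
# of it should achieve a lower loss.
loss = lambda W: np.sum((Y - W @ X) ** 2)
print(loss(W_ref) < loss(W_ref + 1e-3 * rng.standard_normal(W_ref.shape)))
```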

- (30 points) **Hello World in Jupyter.**

  Complete the Jupyter notebook linear regression.ipynb. Print out the Jupyter notebook and submit it to Gradescope.