

GEOMETRY OF DATA

EXAMPLES FOR STUDY & PRACTICE

1. One-sided Chebyshev: For Z ∈ L2 with EZ = 0 and var Z = 1, verify that, for any t ≥ 0:

P(Z > t) ≤ 1/(1 + t²).

There are at least three different ways to arrive at this result:

(a) Hint: Z > t implies (Z + s)² ≥ (t + s)² for every s > 0. Apply Markov.

(b) Hint: t − Z ≤ (t − Z)I(Z ≤ t). Apply Cauchy–Schwarz.

(c) Hint: (1 + t²)² I(Z > t) ≤ (tZ + 1)².
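
As a quick numerical sanity check (my own addition, not part of the hand-out), the bound can be probed by Monte Carlo; the standardized Exp(1) sample below is an arbitrary choice of Z.

```python
import numpy as np

# Monte Carlo check of the one-sided Chebyshev bound
#   P(Z > t) <= 1 / (1 + t^2)   for EZ = 0, var Z = 1.
rng = np.random.default_rng(0)

# Any distribution will do; standardize an Exp(1) sample so that EZ = 0, var Z = 1.
x = rng.exponential(1.0, size=1_000_000)
z = (x - x.mean()) / x.std()

for t in [0.0, 0.5, 1.0, 2.0, 3.0]:
    empirical = (z > t).mean()
    bound = 1.0 / (1.0 + t ** 2)
    print(f"t = {t}: P(Z > t) ≈ {empirical:.4f} <= {bound:.4f}")
```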

2. w.p.1-equality

For any X, Y ∈ R we say that X = Y w.p.1 (with probability one) iff P(X = Y) = 1.

(a) Verify that w.p.1-equality is an equivalence relation on R (a sketch for this part follows the problem).

(b) Verify

(c) Verify

(d) Verify
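
For part (a), the only step that needs an argument is transitivity; a minimal sketch (my own wording) via the union bound: {X ≠ Z} ⊆ {X ≠ Y} ∪ {Y ≠ Z}, so P(X ≠ Z) ≤ P(X ≠ Y) + P(Y ≠ Z) = 0, hence P(X = Z) = 1. Reflexivity and symmetry are immediate.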

3. Verify that, in L2, P is an orthogonal projection (onto W = Im P)

if and only if

it has the following three properties: it is linear, idempotent (P ∘ P = P), and self-adjoint (⟨PX, Y⟩ = ⟨X, PY⟩ for all X, Y in L2);

and, in particular, on Rⁿ, P is an orthogonal projection iff P = P² = Pᵀ (idempotent and symmetric).
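
As a numerical illustration of the finite-dimensional case (my own construction): build the projection onto the column space of a random matrix and check idempotence, symmetry, and that it fixes its image.

```python
import numpy as np

# Orthogonal projection onto the column space of A: P = A (A^T A)^{-1} A^T.
rng = np.random.default_rng(1)
A = rng.normal(size=(6, 3))              # full column rank with probability one
P = A @ np.linalg.solve(A.T @ A, A.T)

print(np.allclose(P @ P, P))             # idempotent: P^2 = P
print(np.allclose(P, P.T))               # symmetric:  P = P^T
print(np.allclose(P @ A, A))             # fixes its image: the columns of A are unchanged
```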

4. For an orthogonal projection, P, with P + Q = I, verify that

Use this result, or otherwise, to verify that

from which we have the special case of Pythagoras.
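
The usual chain of identities for P + Q = I with P an orthogonal projection is ⟨PX, QY⟩ = 0 for all X, Y, and hence ‖X‖² = ‖PX‖² + ‖QX‖² (Pythagoras); the quick numerical check below assumes that reading, since the exact statements are not reproduced here.

```python
import numpy as np

# Assumed reading: with P + Q = I and P an orthogonal projection,
# <Px, Qy> = 0 for all x, y, and hence ||x||^2 = ||Px||^2 + ||Qx||^2.
rng = np.random.default_rng(2)
A = rng.normal(size=(5, 2))
P = A @ np.linalg.solve(A.T @ A, A.T)    # projection onto col(A)
Q = np.eye(5) - P                        # projection onto the orthogonal complement

x, y = rng.normal(size=5), rng.normal(size=5)
print(np.isclose((P @ x) @ (Q @ y), 0.0))                          # Px ⟂ Qy
print(np.isclose(x @ x, (P @ x) @ (P @ x) + (Q @ x) @ (Q @ x)))    # Pythagoras
```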

5. For nested sub-spaces V < W in L2, if Q is the orthogonal projection onto V, while P is the orthogonal projection onto W, verify that
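
For nested subspaces the standard identity is PQ = QP = Q (projecting onto the larger space and then the smaller one lands where projecting onto the smaller space alone would); the sketch below assumes that this is the intended claim.

```python
import numpy as np

def proj(M):
    """Orthogonal projection onto the column space of M."""
    return M @ np.linalg.solve(M.T @ M, M.T)

rng = np.random.default_rng(3)
A = rng.normal(size=(6, 2))
Q = proj(A[:, :1])   # projection onto V = span of the first column
P = proj(A)          # projection onto W = span of both columns, so V < W

print(np.allclose(P @ Q, Q))   # assumed identity: PQ = Q
print(np.allclose(Q @ P, Q))   # assumed identity: QP = Q
```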

6. Suppose that X ~ Bin(2, 1/3), Y ~ Poisson(2/3), and X ⊥ Y (X and Y independent).

a) Given that , determine k.

b) Determine the ratio ||X−Y||/||X−EX||.

c) Determine the coefficient of correlation ρ(X − Y, X + Y).
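
Parts (b) and (c) only involve the first two moments of the two distributions; here is a Monte Carlo cross-check of those two parts (my own addition, not the intended pencil-and-paper route), with ‖Z‖ read as the L2 norm √(EZ²).

```python
import numpy as np

# X ~ Bin(2, 1/3), Y ~ Poisson(2/3), X and Y independent.
rng = np.random.default_rng(4)
n = 2_000_000
X = rng.binomial(2, 1/3, size=n)
Y = rng.poisson(2/3, size=n)

ratio = np.sqrt(np.mean((X - Y) ** 2)) / np.sqrt(np.mean((X - X.mean()) ** 2))
rho = np.corrcoef(X - Y, X + Y)[0, 1]
print(f"||X - Y|| / ||X - EX|| ≈ {ratio:.3f}")
print(f"rho(X - Y, X + Y)     ≈ {rho:.3f}")
```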

7. Suppose that X ~ N(1, 1), let Y = X³, and consider the simple linear model

Y = α + βX + W with EW = 0 = ρ(X, W).

a) Evaluate the constants α and β.

b) Determine the relative proximity of Y to its closest linear predictor.
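
A simulation cross-check (my own addition), reading "relative proximity" as the squared correlation between Y and its best linear predictor; that reading is an assumption, not something fixed by the problem text.

```python
import numpy as np

# X ~ N(1, 1), Y = X^3.  Best linear predictor: beta = cov(X, Y) / var X,
# alpha = EY - beta * EX.
rng = np.random.default_rng(5)
X = rng.normal(1.0, 1.0, size=2_000_000)
Y = X ** 3

beta = np.cov(X, Y)[0, 1] / np.var(X)
alpha = Y.mean() - beta * X.mean()
r2 = np.corrcoef(Y, alpha + beta * X)[0, 1] ** 2   # assumed meaning of "relative proximity"
print(f"alpha ≈ {alpha:.3f}, beta ≈ {beta:.3f}, R^2 ≈ {r2:.3f}")
```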

8. Let X ~ Exp(1), Y = exp(−X), and consider the simple linear model

Y = α + βX + W with EW = 0 = ρ(X, W).

a) Evaluate the constants α, β and γ.

b) Determine the relative proximity of Y to its closest linear predictor.
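
The same simulation template as in problem 7 applies (again my own cross-check, with the same assumed reading of "relative proximity").

```python
import numpy as np

# X ~ Exp(1), Y = exp(-X).  Best linear predictor: beta = cov(X, Y) / var X,
# alpha = EY - beta * EX.
rng = np.random.default_rng(6)
X = rng.exponential(1.0, size=2_000_000)
Y = np.exp(-X)

beta = np.cov(X, Y)[0, 1] / np.var(X)
alpha = Y.mean() - beta * X.mean()
r2 = np.corrcoef(Y, alpha + beta * X)[0, 1] ** 2   # assumed meaning of "relative proximity"
print(f"alpha ≈ {alpha:.3f}, beta ≈ {beta:.3f}, R^2 ≈ {r2:.3f}")
```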



