AI2100 – Indian Institute of Technology Hyderabad Solved

Everything should be made as simple as possible, but not simpler. – Albert Einstein (?)
Instructions:
• Use matplotlib to read and plot images – https://matplotlib.org/tutorials/introductory/images.html.
• Use numpy for basic functions like log, sqrt, power. Do not use other built-in functions.
• Please turn in Python Notebooks with the following notation for the file name: your-roll-number-hw1.ipynb.
• Do not turn in images. Please use the same names for images in your code as in the database (and as mentioned in the problem statement below). The TAs will use these images to test your code.
Problem Set:
1. Unit norm ball: Recall the definition of a unit norm ball from class that we defined in a normed linear space (X, ||·||): B̄ = {x ∈ X : ||x|| ≤ 1}. Assume X = R^2.
(a) Write a function that accepts p and plots the ||.||p unit norm ball. Test with integer p ≥ 1 as well as 0 < p < 1. (3)
(b) We claimed that for integer p ≥ 1 the unit norm ball is convex. Is this clear from the unit norm ball plots? What happens when 0 < p < 1? Print your observations. (2)
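As a rough sketch of part (a) (not the official solution — the function name plot_unit_ball is illustrative): the boundary {x ∈ R^2 : ||x||_p = 1} can be traced by sweeping directions (cos t, sin t) and rescaling each direction so its p-norm equals 1. The same formula also works for 0 < p < 1, where ||·||_p is only a quasi-norm.

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_unit_ball(p, n=400):
    """Plot the boundary {x in R^2 : ||x||_p = 1} by sweeping angles."""
    t = np.linspace(0.0, 2.0 * np.pi, n)
    # For direction (cos t, sin t), dividing by its p-norm puts the
    # point exactly on the unit-norm-ball boundary.
    scale = (np.abs(np.cos(t)) ** p + np.abs(np.sin(t)) ** p) ** (1.0 / p)
    x, y = np.cos(t) / scale, np.sin(t) / scale
    plt.plot(x, y, label=f"p = {p}")
    plt.gca().set_aspect("equal")
    return x, y

for p in [0.5, 1, 2, 4]:
    plot_unit_ball(p)
plt.legend()
plt.show()
```

For p ≥ 1 the plotted boundaries enclose convex regions (diamond at p = 1, circle at p = 2, rounded square as p grows); for 0 < p < 1 the region is visibly non-convex, with "pinched" axes.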
2. Completeness: Recall from class that a metric space (X, d) is said to be complete if all Cauchy sequences in X converge to a point in X. Show with a numerical example that the space of continuous functions defined on the closed interval [0, 1], denoted C[0, 1], is incomplete with respect to the metric derived from the L1 norm (i.e., d(f, g) = ||f − g||_1 = ∫_0^1 |f(x) − g(x)| dx for any f, g ∈ C[0, 1]). Code your example and demonstrate the result either using a plot or numerically. (5)
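For intuition, one standard example (sketched here with the integral approximated on a uniform grid) is the ramp sequence f_n that rises from 0 to 1 over a width-1/n interval at x = 1/2: it is Cauchy in the L1 metric, but its pointwise limit is a discontinuous step function, which lies outside C[0, 1].

```python
import numpy as np
import matplotlib.pyplot as plt

def f(n, x):
    # Continuous ramp: 0 on [0, 1/2], slope n on [1/2, 1/2 + 1/n], then 1.
    return np.clip(n * (x - 0.5), 0.0, 1.0)

x = np.linspace(0.0, 1.0, 100001)

def l1_dist(g, h):
    # Approximate the L1 distance on [0, 1] with a uniform-grid average.
    return np.mean(np.abs(g - h))

# Cauchy: d(f_n, f_2n) = 1/(4n) -> 0 as n grows ...
for n in (10, 100, 1000):
    print(n, l1_dist(f(n, x), f(2 * n, x)))

# ... but the L1 limit is the discontinuous step function 1{x > 1/2}.
plt.plot(x, f(10, x), label="f_10")
plt.plot(x, f(100, x), label="f_100")
plt.legend()
plt.show()
```

The printed distances shrink like 1/n, while the plots show the ramps steepening toward a jump, so the sequence has no continuous limit.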
3. Entropy of a discrete RV: Recall the definition of entropy of a discrete RV X from class, H(X) = − ∑_{x∈X} p(x) log p(x), where p(x) is the probability mass function (PMF) of X, and X is the set of possible values that the random variable X can take.
(a) Write a function that accepts the PMF of a discrete RV as input and returns its entropy H(X).
(b) Now use the above function to plot the entropy of X ∼ Bern(p) as a function of p. Where does this plot attain its maximum? (2)
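A minimal sketch for this problem, using log base 2 (entropy in bits — the base is an assumption, since the problem statement leaves it unspecified):

```python
import numpy as np
import matplotlib.pyplot as plt

def entropy(pmf):
    # Shannon entropy in bits; terms with p(x) = 0 contribute 0.
    pmf = np.asarray(pmf, dtype=float)
    nz = pmf[pmf > 0]
    return -np.sum(nz * np.log2(nz))

ps = np.linspace(0.0, 1.0, 201)
H = np.array([entropy([p, 1.0 - p]) for p in ps])
plt.plot(ps, H)
plt.xlabel("p")
plt.ylabel("H(Bern(p)) [bits]")
plt.show()
print("maximum at p =", ps[np.argmax(H)])  # attained at p = 0.5
```

The Bernoulli entropy curve is symmetric about p = 0.5, where it attains its maximum of 1 bit.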
4. Image entropy: Download a gray scale image from the link provided in the instructions. By gray scale, we mean that the image has one intensity channel, with pixel intensities in the range [0, 255].
(a) Write a function that accepts an image as input and returns its normalized histogram. Note that the normalized histogram is found by dividing the original histogram by the total number of pixels in the image. (3)
(b) Use your entropy function from the previous problem to find the image entropy. Experiment with different gray scale images from the aforementioned link and note your observations. (2)
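One way to sketch part (a), assuming 8-bit intensities as stated (the filename in the usage comment is a placeholder, not one from the assignment's database):

```python
import numpy as np
import matplotlib.pyplot as plt

def normalized_histogram(img):
    # Count each intensity 0..255, then divide by the number of pixels
    # so the result is a valid PMF over intensity values.
    counts = np.bincount(img.astype(np.uint8).ravel(), minlength=256)
    return counts / img.size

# Usage sketch (matplotlib loads PNGs as floats in [0, 1], hence the rescale):
# img = (plt.imread("some_gray_image.png") * 255).astype(np.uint8)
# p = normalized_histogram(img)   # sums to 1
# print(entropy(p))               # entropy function from the previous problem
```

Feeding this PMF to the entropy function from problem 3 gives the image entropy directly.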
5. Joint PMF and joint entropy: For this problem, work with the given stereo image pair labeled left.png and right.png respectively. As in the previous question, both images are gray scale.
(a) Write a function that accepts this stereo image pair as input, and outputs the normalized joint histogram in addition to plotting it. (3)
(b) Write a function that accepts the joint PMF of a pair of random variables as input and outputs the joint entropy. (1)
(c) Test your joint entropy function using the normalized joint histogram computed in Problem 5 (a). (1)
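A sketch for this problem, treating the co-located pixel pair (left(i, j), right(i, j)) at each location as one draw from the joint PMF (function names are illustrative, not prescribed):

```python
import numpy as np

def joint_histogram(left, right):
    # Encode each co-located pixel pair (l, r) as a single index l*256 + r,
    # count occurrences, and normalize by the number of pixel pairs.
    idx = left.astype(np.int64).ravel() * 256 + right.astype(np.int64).ravel()
    counts = np.bincount(idx, minlength=256 * 256).reshape(256, 256)
    return counts / left.size

def joint_entropy(joint_pmf):
    # H(X, Y) = -sum p(x, y) log2 p(x, y) over the nonzero entries.
    nz = joint_pmf[joint_pmf > 0]
    return -np.sum(nz * np.log2(nz))
```

With left.png and right.png loaded as equal-shape 8-bit arrays, the joint histogram can be displayed with plt.imshow; a log scale often helps, since most of the 256×256 bins are empty.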
6. Conditional PMF and conditional entropy: Continue to work with the stereo image pair.
(a) Write a function that accepts as input the joint PMF of a pair of random variables, the index of the conditioning random variable, and the value of the conditioning random variable. The function must output the appropriate conditional PMF. (3)
(b) Write a function that accepts as input the joint PMF and the index of the conditioning RV, and outputs the conditional entropy. (1)
(c) Test your conditional entropy function using the normalized joint histogram computed in Problem 5 (a), for your choice of the conditioning RV. (1)
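One hedged sketch for this problem, with axis 0 meaning the row variable of the joint PMF conditions and axis 1 the column variable (this indexing convention is an assumption):

```python
import numpy as np

def entropy_bits(pmf):
    # Shannon entropy in bits over the nonzero entries of any-shape PMF.
    nz = pmf[pmf > 0]
    return -np.sum(nz * np.log2(nz))

def conditional_pmf(joint, axis, value):
    # p(other | conditioning RV = value): slice the joint PMF and renormalize.
    row = joint[value, :] if axis == 0 else joint[:, value]
    total = row.sum()
    return row / total if total > 0 else np.zeros_like(row)

def conditional_entropy(joint, axis):
    # Chain rule: H(Y | X) = H(X, Y) - H(X), with X the conditioning RV.
    marginal = joint.sum(axis=1 - axis)
    return entropy_bits(joint) - entropy_bits(marginal)
```

Using the chain rule avoids summing p(x) H(Y | X = x) explicitly, though either route gives the same value.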
7. KL divergence: We showed in class that for PMFs p and q defined on X, D(p||q) ≥ 0 and, in general, D(q||p) ≠ D(p||q). This problem explores these properties experimentally.
(a) Write a function that accepts two PMFs p and q as input, and outputs D(p||q). (1)
(b) As discussed in class, let p ∼ Bern(r) and q ∼ Bern(s). For a fixed value of r, vary s and do the following: (4)
i. Plot D(p||q), D(q||p).
ii. Verify that D(p||q) and D(q||p) are indeed non-negative.
iii. Verify that D(p||q) ≠ D(q||p) in general, and that both are equal to zero only when r = s.
iv. Finally, find D(p||q) and D(q||p) where p and q are the normalized histograms of left.png and right.png respectively. Do you think D(p||q) is a good metric for image similarity? Print your response.
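A sketch for parts (a) and (b), again with base-2 logs by assumption; D(p||q) is computed over the support of p, which presumes q(x) > 0 wherever p(x) > 0:

```python
import numpy as np
import matplotlib.pyplot as plt

def kl_divergence(p, q):
    # D(p||q) = sum p(x) log2(p(x)/q(x)); 0·log 0 terms are dropped.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

r = 0.3
ss = np.linspace(0.01, 0.99, 99)
dpq = np.array([kl_divergence([r, 1 - r], [s, 1 - s]) for s in ss])
dqp = np.array([kl_divergence([s, 1 - s], [r, 1 - r]) for s in ss])
plt.plot(ss, dpq, label="D(p||q)")
plt.plot(ss, dqp, label="D(q||p)")
plt.axvline(r, linestyle="--")
plt.legend()
plt.show()
# Both curves are non-negative, touch zero only at s = r,
# and are not mirror images of each other (asymmetry).
```

The asymmetry visible in the plot is one reason D(p||q) on its own is a questionable "metric" for image similarity: it violates symmetry, and it blows up wherever q has empty histogram bins that p does not.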