COMP 182: Algorithmic Thinking
Homework 9: Discrete Probability
The goal of this homework is for you to practice discrete probability.
---
1. Events and Their Probabilities [55 pts]
Problem 1 [24 pts]
What is the probability of the following events when we randomly select a permutation of the 26 lowercase letters of the English alphabet? Show your work.
(a) The first 13 letters of the permutation are in alphabetical order.
(b) *a* is the first letter of the permutation and *z* is the last letter.
(c) *a* and *z* are next to each other in the permutation.
(d) *a* and *b* are not next to each other in the permutation.
(e) *a* and *z* are separated by at least 23 letters in the permutation.
(f) *z* precedes both *a* and *b* in the permutation.
---
Problem 2 [9 pts]
Assume that all days of the week are equally likely. Show your work in each of the following parts.
(a) What is the probability that two people chosen at random were born on the same day of the week?
(b) What is the probability that in a group of *n* people chosen at random, there are at least two born on the same day of the week?
(c) How many people chosen at random are needed to make the probability greater than 1/2 that there are at least two people born on the same day of the week?
---
Problem 3 [4 pts]
Show that if *E* and *F* are independent events, then their complements E̅ and F̅ are also independent events.
---
Problem 4 [6 pts]
Suppose that *A* and *B* are two events with probabilities p(A) = 2/3 and p(B) = 1/2.
(a) What is the largest p(A ∩ B) can be? What is the smallest it can be? Give examples to show that both extremes for p(A ∩ B) are possible. Show your work.
(b) What is the largest p(A ∪ B) can be? What is the smallest it can be? Give examples to show that both extremes for p(A ∪ B) are possible. Show your work.
---
Problem 5 [12 pts]
Find each of the following probabilities when *n* independent Bernoulli trials are carried out with probability of success *p*. Show your work.
(a) The probability of no success.
(b) The probability of at least one success.
(c) The probability of at most one success.
(d) The probability of at least two successes.
---
2. Random Variables [45 pts]
Problem 1 [6 pts]
Suppose that we roll a fair die until a 6 comes up.
(a) What is the probability that we roll the die *n* times?
(b) What is the expected number of times we roll the die?
---
Problem 2 [6 pts]
A space probe near Neptune communicates with Earth using bit strings. Suppose that in its transmissions it sends a 1 one-third of the time and a 0 two-thirds of the time. When a 0 is sent, the probability that it is received correctly is 0.9, and the probability that it is received incorrectly (as a 1) is 0.1. When a 1 is sent, the probability that it is received correctly is 0.8, and the probability that it is received incorrectly (as a 0) is 0.2.
Given information:
- P(S = 1) = 1/3 and P(S = 0) = 2/3
- P(R = 0|S = 0) = 0.9 and P(R = 1|S = 0) = 0.1
- P(R = 1|S = 1) = 0.8 and P(R = 0|S = 1) = 0.2
(a) [3 pts] Find the probability that a 0 is received.
(b) [3 pts] Find the probability that a 0 was transmitted, given that a 0 was received.
---
Problem 3 [6 pts]
Suppose that *X* and *Y* are random variables and that *X* and *Y* are nonnegative for all points in a sample space *S*. Let *Z* be the random variable defined by Z(s) = max(X(s), Y(s)) for all elements s in S. Show that E(Z) ≤ E(X) + E(Y).
---
Problem 4 [3 pts]
Let *X* and *Y* be the random variables that count the number of heads and the number of tails that come up when two fair coins are flipped. Are *X* and *Y* independent? Prove your answer.
---
Problem 5 [5 pts]
What is the variance of the number of times a 6 appears when a fair die is rolled 10 times? Show your derivation.
---
Problem 6 [6 pts]
Give an example that shows that the variance of the sum of two random variables is not necessarily equal to the sum of their variances when the random variables are not independent.
---
Problem 7 [4 pts]
Use Chebyshev's inequality to find an upper bound on the probability that the number of tails that come up when a biased coin with probability of heads equal to 0.6 is tossed *n* times deviates from the mean by more than sqrt(n).
---
Problem 8 [9 pts]
Suppose the probability that *x* is the *i*-th element in a list of *n* distinct integers is i / (n(n+1)). Find the average number of comparisons used by the linear search algorithm (Algorithm 2 in Section 3.1 in the textbook) to find *x* or to determine that it is not in the list.
---
Total Points: 100
COMP182 Homework 9: Discrete Probability
Student Responses
---
1. Events and Their Probabilities
Problem 1 [24 pts]
We randomly select a permutation of the 26 lowercase letters of the English alphabet. The total number of permutations is 26!.
(a) The first 13 letters of the permutation are in alphabetical order.
We need to choose which 13 letters occupy the first 13 positions (they must be arranged in alphabetical order, so there is only 1 way to arrange them once chosen), and the remaining 13 letters can be in any order in the last 13 positions.
- Number of ways to choose which 13 letters go in the first 13 positions: C(26, 13)
- Arrangement of those 13 letters in the first 13 positions: 1 (must be alphabetical)
- Arrangement of the remaining 13 letters in the last 13 positions: 13!
- Favorable outcomes: C(26, 13) × 13!
P = C(26, 13) × 13! / 26! = [26! / (13! × 13!)] × 13! / 26! = 1 / 13!
P = 1/13!
(b) *a* is the first letter and *z* is the last letter.
Fix *a* in position 1 and *z* in position 26. The remaining 24 letters can be arranged in any order in the remaining 24 positions.
P = 24! / 26! = 1 / (26 × 25)
P = 1/650
(c) *a* and *z* are next to each other in the permutation.
Treat {a, z} as a single block. Then we have 25 objects to arrange (the block plus 24 other letters). These 25 objects can be arranged in 25! ways. Within the block, a and z can appear in 2 orders (az or za).
Favorable outcomes: 2 × 25!
P = 2 × 25! / 26! = 2/26
P = 1/13
(d) *a* and *b* are not next to each other in the permutation.
Using the complement:
P(a and b not adjacent) = 1 − P(a and b adjacent)
By the same reasoning as part (c), P(a and b adjacent) = 2 × 25! / 26! = 1/13.
P = 1 − 1/13
P = 12/13
(e) *a* and *z* are separated by at least 23 letters in the permutation.
If *a* is at position i and *z* is at position j, the number of letters between them is |i − j| − 1. We need |i − j| − 1 ≥ 23, i.e., |i − j| ≥ 24.
Since positions range from 1 to 26, the possible cases are:
- |i − j| = 24: ordered pairs (i, j) are (1,25), (25,1), (2,26), (26,2) → 4 pairs
- |i − j| = 25: ordered pairs (i, j) are (1,26), (26,1) → 2 pairs
Total favorable ordered pairs for (position of a, position of z): 6
Total ways to assign positions to a and z: 26 × 25 = 650
P = 6/650
P = 3/325
(f) *z* precedes both *a* and *b* in the permutation.
Consider only the relative ordering of the three letters z, a, and b. All other letters are irrelevant. There are 3! = 6 equally likely orderings of these three letters. We want z to come first. The number of orderings where z is first is 2! = 2 (since a and b can be in either order after z).
P = 2/6
P = 1/3
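As an informal sanity check on parts (c), (d), and (f), a short Monte Carlo simulation over uniform random permutations (a sketch, not part of the counting arguments above) should land near the exact values 1/13, 12/13, and 1/3:

```python
import random

# Monte Carlo check of parts (c), (d), (f); exact answers are
# 1/13 ≈ 0.0769, 12/13 ≈ 0.923, and 1/3 ≈ 0.333.
random.seed(0)
letters = [chr(ord('a') + i) for i in range(26)]
trials = 200_000
adj_az = adj_ab = z_first = 0
for _ in range(trials):
    perm = letters[:]
    random.shuffle(perm)          # uniform random permutation
    pos = {c: i for i, c in enumerate(perm)}
    adj_az += abs(pos['a'] - pos['z']) == 1
    adj_ab += abs(pos['a'] - pos['b']) == 1
    z_first += pos['z'] < pos['a'] and pos['z'] < pos['b']
print(adj_az / trials, 1 - adj_ab / trials, z_first / trials)
```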
---
Problem 2 [9 pts]
All days of the week are equally likely (7 days).
(a) Probability that two people were born on the same day of the week.
The first person can be born on any day. The second person must be born on the same day, with probability 1/7.
P = 1/7
(b) Probability that in a group of n people, at least two share a birthday (day of week).
Using the complement:
P(at least two share) = 1 − P(all different)
For n > 7: By the Pigeonhole Principle, at least two must share a day, so P = 1.
For n ≤ 7: The number of ways all n people are born on different days is 7 × 6 × 5 × ... × (7 − n + 1) = 7!/(7 − n)!. The total number of outcomes is 7^n.
P(at least two share a day) = 1 − 7! / ((7 − n)! × 7^n) for n ≤ 7, and P = 1 for n > 7.
(c) Find the smallest n such that this probability exceeds 1/2.
We compute P(all different) = 7!/((7−n)! × 7^n) for successive values of n:
- n = 1: P(all different) = 7/7 = 1, so P(at least two same) = 0
- n = 2: P(all different) = (7 × 6)/7^2 = 42/49 = 6/7, so P(at least two same) = 1/7 ≈ 0.143
- n = 3: P(all different) = (7 × 6 × 5)/7^3 = 210/343, so P(at least two same) = 133/343 ≈ 0.388
- n = 4: P(all different) = (7 × 6 × 5 × 4)/7^4 = 840/2401, so P(at least two same) = 1561/2401 ≈ 0.650
Since 0.388 < 0.5 < 0.650, the probability first exceeds 1/2 at n = 4.
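The threshold can also be confirmed exactly with a few lines of Python; `fractions` keeps the arithmetic exact, so this is just a mechanical replay of the table above:

```python
from fractions import Fraction

# P(at least two of n people share a day of the week), exactly.
def p_share(n):
    p_all_diff = Fraction(1)
    for k in range(n):
        p_all_diff *= Fraction(7 - k, 7)
    return 1 - p_all_diff

# Smallest n with probability greater than 1/2.
threshold = next(n for n in range(1, 8) if p_share(n) > Fraction(1, 2))
print(threshold, p_share(threshold))  # 4 1561/2401
```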
---
Problem 3 [4 pts]
Show that if E and F are independent events, then E̅ and F̅ are independent events.
Since E and F are independent, we have P(E ∩ F) = P(E)P(F).
We need to show P(E̅ ∩ F̅) = P(E̅)P(F̅).
By De Morgan's Law: E̅ ∩ F̅ = (E ∪ F)̅
Therefore:
P(E̅ ∩ F̅) = P((E ∪ F)̅)
= 1 − P(E ∪ F)
= 1 − [P(E) + P(F) − P(E ∩ F)]
= 1 − P(E) − P(F) + P(E ∩ F)
= 1 − P(E) − P(F) + P(E)P(F)   [using independence of E and F]
= (1 − P(E))(1 − P(F))
= P(E̅) × P(F̅)
Since P(E̅ ∩ F̅) = P(E̅)P(F̅), the events E̅ and F̅ are independent. ∎
---
Problem 4 [6 pts]
Given: P(A) = 2/3, P(B) = 1/2.
(a) Bounds on P(A ∩ B):
Largest: P(A ∩ B) ≤ min(P(A), P(B)) = min(2/3, 1/2) = 1/2.
*Example:* Let S = {1, 2, 3, 4, 5, 6} with uniform probability. Let A = {1, 2, 3, 4} and B = {1, 2, 3}. Then P(A) = 4/6 = 2/3, P(B) = 3/6 = 1/2, and A ∩ B = {1, 2, 3}, so P(A ∩ B) = 3/6 = 1/2. This occurs when B ⊆ A.
Smallest: By inclusion-exclusion, P(A ∪ B) ≤ 1, so P(A) + P(B) − P(A ∩ B) ≤ 1, giving P(A ∩ B) ≥ P(A) + P(B) − 1 = 2/3 + 1/2 − 1 = 1/6.
*Example:* Let S = {1, 2, 3, 4, 5, 6} with uniform probability. Let A = {1, 2, 3, 4} and B = {4, 5, 6}. Then P(A) = 4/6 = 2/3, P(B) = 3/6 = 1/2, and A ∩ B = {4}, so P(A ∩ B) = 1/6.
(b) Bounds on P(A ∪ B):
Since P(A ∪ B) = P(A) + P(B) − P(A ∩ B):
Largest: Minimize P(A ∩ B). Using the minimum value P(A ∩ B) = 1/6: P(A ∪ B) = 2/3 + 1/2 − 1/6 = 4/6 + 3/6 − 1/6 = 6/6 = 1.
*Example:* Same as the smallest intersection example: A = {1, 2, 3, 4}, B = {4, 5, 6}. Then A ∪ B = {1, 2, 3, 4, 5, 6} = S, so P(A ∪ B) = 1.
Smallest: Maximize P(A ∩ B). Using the maximum value P(A ∩ B) = 1/2: P(A ∪ B) = 2/3 + 1/2 − 1/2 = 2/3.
*Example:* Same as the largest intersection example: A = {1, 2, 3, 4}, B = {1, 2, 3}. Then A ∪ B = {1, 2, 3, 4}, so P(A ∪ B) = 4/6 = 2/3.
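Both example constructions are small enough to verify mechanically; this check uses the same uniform six-point space as the examples above:

```python
from fractions import Fraction

# Verify the extreme cases for P(A ∩ B) and P(A ∪ B) on S = {1,...,6}.
S = set(range(1, 7))
def p(E):
    return Fraction(len(E), len(S))

A = {1, 2, 3, 4}
B_max = {1, 2, 3}   # B ⊆ A: largest possible overlap
B_min = {4, 5, 6}   # smallest possible overlap

assert p(A) == Fraction(2, 3) and p(B_max) == p(B_min) == Fraction(1, 2)
assert p(A & B_max) == Fraction(1, 2)   # largest P(A ∩ B)
assert p(A & B_min) == Fraction(1, 6)   # smallest P(A ∩ B)
assert p(A | B_min) == 1                # largest P(A ∪ B)
assert p(A | B_max) == Fraction(2, 3)   # smallest P(A ∪ B)
print("all extremes verified")
```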
---
Problem 5 [12 pts]
We perform n independent Bernoulli trials with probability of success p on each trial. The number of successes follows a Binomial(n, p) distribution.
(a) Probability of no success (0 successes):
P(X = 0) = C(n, 0) × p^0 × (1 − p)^n
P = (1 − p)^n
(b) Probability of at least one success:
P(X ≥ 1) = 1 − P(X = 0)
P = 1 − (1 − p)^n
(c) Probability of at most one success (0 or 1 successes):
P(X ≤ 1) = P(X = 0) + P(X = 1) = C(n, 0)p^0(1−p)^n + C(n, 1)p^1(1−p)^(n−1)
P = (1 − p)^n + np(1 − p)^(n−1)
(d) Probability of at least two successes:
P(X ≥ 2) = 1 − P(X ≤ 1) = 1 − [(1 − p)^n + np(1 − p)^(n−1)]
P = 1 − (1 − p)^n − np(1 − p)^(n−1)
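The four formulas can be cross-checked against the binomial pmf for a sample n and p (n = 5, p = 1/3 chosen arbitrarily; exact arithmetic via `fractions`):

```python
from fractions import Fraction
from math import comb

# Binomial(n, p) pmf, computed exactly.
n, p = 5, Fraction(1, 3)
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

assert (1 - p)**n == pmf[0]                                   # (a)
assert 1 - (1 - p)**n == sum(pmf[1:])                         # (b)
assert (1 - p)**n + n*p*(1 - p)**(n - 1) == pmf[0] + pmf[1]   # (c)
assert 1 - (1 - p)**n - n*p*(1 - p)**(n - 1) == sum(pmf[2:])  # (d)
print("all four formulas agree with the pmf")
```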
---
2. Random Variables
Problem 1 [6 pts]
We roll a fair die until a 6 comes up. Let X be the number of rolls.
(a) The probability that we roll the die n times:
For X = n, the first n − 1 rolls must not be 6 (each with probability 5/6), and the n-th roll must be 6 (probability 1/6).
P(X = n) = (5/6)^(n−1) × (1/6) for n = 1, 2, 3, ...
(b) Expected number of rolls:
This is a geometric distribution with success probability p = 1/6.
E(X) = Σ_{n=1}^∞ n × (5/6)^(n−1) × (1/6)
Using the formula for geometric expectation, E(X) = 1/p:
E(X) = 1/(1/6) = 6
*Derivation:* Let q = 5/6. Then: E(X) = (1/6) × Σ_{n=1}^∞ n × q^(n−1) = (1/6) × 1/(1−q)^2 = (1/6) × 1/(1/6)^2 = (1/6) × 36 = 6.
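A quick simulation (a sketch, with a fixed seed for reproducibility) gives an average close to the exact value 6:

```python
import random

# Estimate E(X) for X = number of rolls of a fair die until a 6 appears.
random.seed(0)
trials = 100_000
total = 0
for _ in range(trials):
    rolls = 1
    while random.randint(1, 6) != 6:
        rolls += 1
    total += rolls
print(total / trials)  # ≈ 6
```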
---
Problem 2 [6 pts]
Given:
- P(S = 1) = 1/3, P(S = 0) = 2/3
- P(R = 0 | S = 0) = 0.9, P(R = 1 | S = 0) = 0.1
- P(R = 1 | S = 1) = 0.8, P(R = 0 | S = 1) = 0.2
(a) Find P(R = 0).
By the law of total probability:
P(R = 0) = P(R = 0 | S = 0) × P(S = 0) + P(R = 0 | S = 1) × P(S = 1)
= (9/10)(2/3) + (2/10)(1/3)
= 18/30 + 2/30
= 20/30
P(R = 0) = 2/3
(b) Find P(S = 0 | R = 0).
By Bayes' theorem:
P(S = 0 | R = 0) = P(R = 0 | S = 0) × P(S = 0) / P(R = 0)
= (9/10)(2/3) / (2/3)
= (18/30) / (20/30)
= 18/20
P(S = 0 | R = 0) = 9/10
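Both computations can be replayed exactly with `fractions` (a verification only, using the given channel probabilities):

```python
from fractions import Fraction

# Channel model: S = sent bit, R = received bit.
pS0, pS1 = Fraction(2, 3), Fraction(1, 3)
pR0_given_S0, pR0_given_S1 = Fraction(9, 10), Fraction(2, 10)

pR0 = pR0_given_S0 * pS0 + pR0_given_S1 * pS1   # total probability
pS0_given_R0 = pR0_given_S0 * pS0 / pR0         # Bayes' theorem

print(pR0, pS0_given_R0)  # 2/3 9/10
```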
---
Problem 3 [6 pts]
Show that E(Z) ≤ E(X) + E(Y), where Z(s) = max(X(s), Y(s)) and X, Y are nonnegative.
Since X(s) ≥ 0 and Y(s) ≥ 0 for all s ∈ S, we show that max(X(s), Y(s)) ≤ X(s) + Y(s) for all s ∈ S:
- Case 1: X(s) ≥ Y(s). Then max(X(s), Y(s)) = X(s) ≤ X(s) + Y(s), since Y(s) ≥ 0.
- Case 2: Y(s) > X(s). Then max(X(s), Y(s)) = Y(s) ≤ X(s) + Y(s), since X(s) ≥ 0.
In both cases, Z(s) ≤ X(s) + Y(s).
Since Z(s) ≤ X(s) + Y(s) for all s ∈ S, by the monotonicity property of expectation:
E(Z) ≤ E(X + Y) = E(X) + E(Y)
where the last equality follows from linearity of expectation. ∎
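The inequality can be illustrated numerically; here X and Y are the faces of two independent fair four-sided dice (an arbitrary nonnegative choice for illustration, not part of the proof):

```python
from fractions import Fraction
from itertools import product

# Uniform sample space of two independent four-sided dice.
outcomes = list(product(range(1, 5), repeat=2))
p = Fraction(1, len(outcomes))

EX = sum(x * p for x, _ in outcomes)
EY = sum(y * p for _, y in outcomes)
EZ = sum(max(x, y) * p for x, y in outcomes)

print(EZ, EX + EY)  # 25/8 5
assert EZ <= EX + EY
```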
---
Problem 4 [3 pts]
Two fair coins are flipped. Let X = number of heads, Y = number of tails.
Sample space: S = {HH, HT, TH, TT}, each outcome with probability 1/4.
| Outcome | X | Y |
|---|---|---|
| HH | 2 | 0 |
| HT | 1 | 1 |
| TH | 1 | 1 |
| TT | 0 | 2 |
Note that X + Y = 2 for all outcomes.
X and Y are NOT independent.
*Proof:* We check the independence condition P(X = a, Y = b) = P(X = a) × P(Y = b):
- P(X = 0) = 1/4, P(Y = 0) = 1/4
- P(X = 0, Y = 0) = P(∅) = 0 (since X = 0 and Y = 0 cannot occur simultaneously — if X = 0 then both coins are tails, so Y = 2, not 0)
- P(X = 0) × P(Y = 0) = (1/4)(1/4) = 1/16
Since P(X = 0, Y = 0) = 0 ≠ 1/16 = P(X = 0)P(Y = 0), X and Y are not independent. ∎
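The failure of independence can be checked mechanically over the four equally likely outcomes:

```python
from fractions import Fraction

outcomes = ["HH", "HT", "TH", "TT"]   # each with probability 1/4
p = Fraction(1, 4)

pX0 = sum(p for o in outcomes if o.count("H") == 0)       # P(X = 0)
pY0 = sum(p for o in outcomes if o.count("T") == 0)       # P(Y = 0)
pX0Y0 = sum(p for o in outcomes
            if o.count("H") == 0 and o.count("T") == 0)   # joint

print(pX0Y0, pX0 * pY0)  # 0 1/16
assert pX0Y0 != pX0 * pY0
```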
---
Problem 5 [5 pts]
A fair die is rolled 10 times. Let X = number of times a 6 appears.
X follows a Binomial distribution with n = 10 and p = P(rolling a 6) = 1/6.
The variance of a binomial random variable is:
V(X) = np(1 − p) = 10 × (1/6) × (5/6) = 50/36
V(X) = 25/18
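The shortcut V(X) = np(1 − p) can be confirmed by computing the variance directly from the Binomial(10, 1/6) pmf:

```python
from fractions import Fraction
from math import comb

n, p = 10, Fraction(1, 6)
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

EX = sum(k * pmf[k] for k in range(n + 1))
EX2 = sum(k * k * pmf[k] for k in range(n + 1))
V = EX2 - EX**2   # V(X) = E(X²) − (E(X))²

print(V)  # 25/18
assert V == n * p * (1 - p)
```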
---
Problem 6 [6 pts]
Example showing V(X + Y) ≠ V(X) + V(Y) when X and Y are not independent.
Let a fair coin be flipped once. Define:
- X = 1 if heads, X = 0 if tails
- Y = X (i.e., Y is the same random variable as X)
X and Y are not independent since knowing X completely determines Y.
Compute the variances:
E(X) = 1/2, so V(X) = E(X²) − (E(X))² = 1/2 − 1/4 = 1/4.
Similarly, V(Y) = 1/4.
So V(X) + V(Y) = 1/4 + 1/4 = 1/2.
Now compute V(X + Y):
Since Y = X, we have X + Y = 2X.
V(X + Y) = V(2X) = 4 × V(X) = 4 × 1/4 = 1.
Since V(X + Y) = 1 ≠ 1/2 = V(X) + V(Y), this demonstrates that the variance of the sum is not necessarily equal to the sum of the variances when the random variables are not independent.
*Note:* The general formula is V(X + Y) = V(X) + V(Y) + 2Cov(X, Y). Here Cov(X, Y) = Cov(X, X) = V(X) = 1/4 ≠ 0, which is why the equality fails.
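The counterexample is small enough to verify by direct computation (Y = X is the indicator of heads):

```python
from fractions import Fraction

p = Fraction(1, 2)
vals = [0, 1]   # X = indicator of heads on one fair flip

EX = sum(x * p for x in vals)
VX = sum(x * x * p for x in vals) - EX**2               # V(X) = 1/4
VXY = sum((x + x)**2 * p for x in vals) - (EX + EX)**2  # V(X + Y)

print(VXY, VX + VX)  # 1 1/2
assert VXY != VX + VX
```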
---
Problem 7 [4 pts]
A biased coin with P(heads) = 0.6 is tossed n times. Let X = number of tails.
- P(tails) = 1 − 0.6 = 0.4
- E(X) = μ = n × 0.4 = 2n/5
- V(X) = σ² = n × 0.4 × 0.6 = 0.24n = 6n/25
We want an upper bound on P(|X − μ| ≥ √n).
By Chebyshev's inequality: P(|X − μ| ≥ k) ≤ σ²/k²
Setting k = √n:
P(|X − 2n/5| ≥ √n) ≤ V(X)/n = (6n/25)/n = 6/25
P(|X − μ| ≥ √n) ≤ 6/25 = 0.24
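Since Chebyshev gives only an upper bound, a simulation for a concrete n (n = 100 here, with a fixed seed) shows the empirical tail probability sitting well below 6/25:

```python
import random

# Empirical P(|X − μ| ≥ √n) for X = number of tails in n biased tosses.
random.seed(1)
n, trials = 100, 20_000
mu = 0.4 * n          # expected number of tails

exceed = 0
for _ in range(trials):
    tails = sum(random.random() < 0.4 for _ in range(n))
    exceed += abs(tails - mu) >= n ** 0.5
print(exceed / trials)  # well below the bound 6/25 = 0.24
```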
---
Problem 8 [9 pts]
The probability that x is the i-th element in a list of n distinct integers is p_i = i/(n(n+1)) for i = 1, 2, ..., n.
First, verify the total probability:
Σ_{i=1}^n p_i = Σ_{i=1}^n i/(n(n+1)) = [n(n+1)/2] / (n(n+1)) = 1/2
So the probability that x is in the list is 1/2, and the probability that x is not in the list is 1 − 1/2 = 1/2.
Linear search behavior:
- If x is at position i, the algorithm makes i comparisons to find it.
- If x is not in the list, the algorithm makes n comparisons (checking every element).
Expected number of comparisons:
E = Σ_{i=1}^n i × p_i + n × P(x not in list)
= Σ_{i=1}^n i × [i/(n(n+1))] + n × (1/2)
= [1/(n(n+1))] × Σ_{i=1}^n i² + n/2
Using the formula Σ_{i=1}^n i² = n(n+1)(2n+1)/6:
= [1/(n(n+1))] × [n(n+1)(2n+1)/6] + n/2
= (2n+1)/6 + n/2
= (2n+1)/6 + 3n/6
= (2n + 1 + 3n)/6
E = (5n + 1)/6
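The algebra can be checked exactly for several values of n, under the comparison model used above (i comparisons on a hit at position i, n on a miss):

```python
from fractions import Fraction

# Verify that the direct sum equals the closed form (5n + 1)/6.
for n in range(1, 30):
    direct = (sum(Fraction(i * i, n * (n + 1)) for i in range(1, n + 1))
              + Fraction(n, 2))
    assert direct == Fraction(5 * n + 1, 6)
print("closed form verified for n = 1..29")
```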
COMP 182 Homework 9 Grading Report
---
Section 1: Events and Their Probabilities [55 pts]
Problem 1 [24/24 pts]
| Part | Score | Notes |
|---|---|---|
| (a) | 4/4 | Correct: 1/13!. Properly counts C(26,13) favorable arrangements divided by P(26,13). |
| (b) | 4/4 | Correct: 1/(26×25) = 1/650. Correctly fixes a and z, counts 24! remaining arrangements. |
| (c) | 4/4 | Correct: 1/13. Correctly uses the block method (2 × 25! / 26!). |
| (d) | 4/4 | Correct: 12/13. Properly applies complement using part (c)'s result. |
| (e) | 4/4 | Correct: 3/325 = 6/650. Correctly enumerates all 6 position pairs with \|i−j\| ≥ 24. |
| (f) | 4/4 | Correct: 1/3. Clean symmetry argument over 3! = 6 orderings of {z, a, b}. |
---
Problem 2 [9/9 pts]
| Part | Score | Notes |
|---|---|---|
| (a) | 3/3 | Correct: 1/7. Clear reasoning. |
| (b) | 3/3 | Correct formula: 1 − 7!/((7−n)! × 7^n) for n ≤ 7 and 1 for n > 7. Correctly applies pigeonhole principle for n > 7. |
| (c) | 3/3 | Correct: n = 4. Calculations verified (n=3 gives ≈0.388, n=4 gives ≈0.650). |
---
Problem 3 [4/4 pts]
Correct and complete proof. Uses De Morgan's law to write E̅ ∩ F̅ = (E ∪ F)̅, then expands via inclusion-exclusion, applies the independence hypothesis P(E ∩ F) = P(E)P(F), and factors to obtain P(E̅)P(F̅). ✓
---
Problem 4 [6/6 pts]
(a) [4/4] — Maximum P(A ∩ B) = 1/2 ✓, Minimum = 1/6 ✓. Both examples are correct (B ⊆ A for max, A ∪ B = S for min).
(b) [2/2] — Maximum P(A ∪ B) = 1 ✓, Minimum = 2/3 ✓. Examples correctly reference the same constructions.
---
Problem 5 [12/12 pts]
| Part | Score | Notes |
|---|---|---|
| (a) | 3/3 | Correct: (1−p)^n. Proper derivation via independence. |
| (b) | 3/3 | Correct: 1 − (1−p)^n. Complement of part (a). |
| (c) | 3/3 | Correct: (1−p)^n + np(1−p)^(n−1). Equivalent to reference's factored form. |
| (d) | 3/3 | Correct: 1 − (1−p)^n − np(1−p)^(n−1). Complement of part (c). |
---
Section 2: Random Variables [45 pts]
Problem 1 [6/6 pts]
| Part | Score | Notes |
|---|---|---|
| (a) | 3/3 | Correct: (5/6)^(n−1) × (1/6). Geometric distribution properly applied. |
| (b) | 3/3 | Correct: E(X) = 6. Uses E(X) = 1/p with derivation provided. |
---
Problem 2 [6/6 pts]
| Part | Score | Notes |
|---|---|---|
| (a) | 3/3 | Correct: P(R=0) = 2/3. Correctly applies law of total probability. |
| (b) | 3/3 | Correct: **P(S=0 \| R=0) = 9/10 = 0.9**. Correctly applies Bayes' theorem with proper values. |
---
Problem 3 [6/6 pts]
Correct and well-structured proof. Explicitly handles both cases (X(s) ≥ Y(s) and Y(s) > X(s)), leverages the nonnegativity assumption to establish max(X(s), Y(s)) ≤ X(s) + Y(s), then applies monotonicity and linearity of expectation. ✓
---
Problem 4 [3/3 pts]
Correct answer: Not independent. Clear counterexample: P(X=0, Y=0) = 0 ≠ 1/16 = P(X=0)P(Y=0). The joint probability table is well-presented.
---
Problem 5 [5/5 pts]
Correct: V(X) = 50/36 = 25/18. Correctly identifies X ~ Binomial(10, 1/6) and applies V(X) = np(1−p).
---
Problem 6 [6/6 pts]
Excellent example. Defines X = indicator of heads and Y = X (perfectly correlated), then shows:
- V(X + Y) = V(2X) = 4 × 1/4 = 1
- V(X) + V(Y) = 1/4 + 1/4 = 1/2
- 1 ≠ 1/2 ✓
The note about covariance is a nice bonus. Valid and well-explained.
---
Problem 7 [4/4 pts]
Correct: P(|X − μ| ≥ √n) ≤ 0.24 = 6/25. Properly computes V(X) = 0.24n and applies Chebyshev's inequality with k = √n.
---
Problem 8 [5/9 pts]
What's correct:
- Probability that x is not in the list = 1/2 ✓ (key insight)
- Expected value framework and summation algebra are executed flawlessly ✓
- Sum of i² formula correctly applied ✓
Error: The student counts i comparisons when x is at position i and n comparisons when x is not in the list. However, the problem specifically references Algorithm 2 in Section 3.1, which performs 2 comparisons per iteration (one index check + one value comparison) plus a final check. The correct counts are 2i + 1 comparisons when x is at position i, and 2n + 2 when x is not found.
- Student's answer: (5n + 1)/6
- Reference answer: (5n + 4)/3 + 1/2 = (10n + 11)/6
The probabilistic reasoning and algebraic execution are solid, but the comparison count model is incorrect for the specified algorithm, leading to the wrong final answer. Awarding 5/9 for correct probability setup, framework, and algebra.
---
Summary
| Section | Problem | Points Earned | Points Possible |
|---|---|---|---|
| 1 | 1(a–f) | 24 | 24 |
| 1 | 2(a–c) | 9 | 9 |
| 1 | 3 | 4 | 4 |
| 1 | 4(a–b) | 6 | 6 |
| 1 | 5(a–d) | 12 | 12 |
| 2 | 1(a–b) | 6 | 6 |
| 2 | 2(a–b) | 6 | 6 |
| 2 | 3 | 6 | 6 |
| 2 | 4 | 3 | 3 |
| 2 | 5 | 5 | 5 |
| 2 | 6 | 6 | 6 |
| 2 | 7 | 4 | 4 |
| 2 | 8 | 5 | 9 |
Overall: An excellent submission with near-perfect work across both sections. The only deduction is on Problem 2.8, where the comparison counting model did not match Algorithm 2 from the textbook, despite the probability analysis and algebra being entirely correct.
Total: 96/100