
INSTRUCTOR’S SOLUTIONS MANUAL

AN INTRODUCTION TO MATHEMATICAL STATISTICS AND ITS APPLICATIONS

FIFTH EDITION

Richard J. Larsen, Vanderbilt University

Morris L. Marx, University of West Florida

This manual should be distributed only free of cost. If you paid for it from an online solution-manual vendor, you have been cheated.

Copyright © 2012, 2006, 2001 Pearson Education, Inc. Publishing as Prentice Hall, 75 Arlington Street, Boston, MA 02116.

All rights reserved. This manual may be reproduced for classroom use only.

ISBN-13: 978-0-321-69401-
ISBN-10: 0-321-69401-

Contents

  • Chapter 6: Hypothesis Testing
    • 6.2 The Decision Rule
    • 6.3 Testing Binomial Data - H0: p = p0
    • 6.4 Type I and Type II Errors
    • 6.5 A Notion of Optimality: The Generalized Likelihood Ratio
  • Chapter 7: Inferences Based on the Normal Distribution
    • 7.3 Deriving the Distribution of (Ȳ − μ)/(S/√n)
    • 7.4 Drawing Inferences about μ
    • 7.5 Drawing Inferences about σ²
  • Chapter 8: Types of Data: A Brief Overview
    • 8.2 Classifying Data
  • Chapter 9: Two-Sample Inference
    • 9.2 Testing H0: μX = μY
    • 9.3 Testing H0: σ²X = σ²Y - The F Test
    • 9.4 Binomial Data: Testing H0: pX = pY
    • 9.5 Confidence Intervals for the Two-Sample Problem
  • Chapter 10: Goodness-of-Fit Tests
    • 10.2 The Multinomial Distribution
    • 10.3 Goodness-of-Fit Tests: All Parameters Known
    • 10.4 Goodness-of-Fit Tests: Parameters Unknown
    • 10.5 Contingency Tables
  • Chapter 11: Regression
    • 11.2 The Method of Least Squares
    • 11.3 The Linear Model
    • 11.4 Covariance and Correlation
    • 11.5 The Bivariate Normal Distribution
  • Chapter 12: The Analysis of Variance
    • 12.2 The F Test
    • 12.3 Multiple Comparisons: Tukey's Method
    • 12.4 Testing Subhypotheses with Contrasts
    • 12.5 Data Transformations
    • Appendix 12.A The Distribution of [SSTR/(k − 1)]/[SSE/(n − k)] When H1 Is True
  • Chapter 13: Randomized Block Designs
    • 13.2 The F Test for a Randomized Block Design
    • 13.3 The Paired t Test
  • Chapter 14: Nonparametric Statistics
    • 14.2 The Sign Test
    • 14.3 Wilcoxon Tests
    • 14.4 The Kruskal-Wallis Test
    • 14.5 The Friedman Test
    • 14.6 Testing for Randomness

Chapter 2: Probability

Section 2.2: Sample Spaces and the Algebra of Sets

2.2 In order for the shooter to win with a point of 9, one of the following (countably infinite) sequences of sums must be rolled: (9, 9), (9, neither 7 nor 9, 9), (9, neither 7 nor 9, neither 7 nor 9, 9), ...

2.2 Let (x, y) denote the strategy of putting x white chips and y black chips in the first urn (which results in 10 − x white chips and 10 − y black chips being in the second urn). Then S = {(x, y) : x = 0, 1, ..., 10, y = 0, 1, ..., 10, and 1 ≤ x + y ≤ 19}. Intuitively, the optimal strategies are (1, 0) and (9, 10).

2.2 Let Ak be the set of chips put in the urn at 1/2^k minute until midnight. For example, A1 = {11, 12, 13, 14, 15, 16, 17, 18, 19, 20}. Then the set of chips in the urn at midnight is ⋃_{k=1}^{∞} Ak − ⋃_{k=1}^{∞} {k + 1} = ∅.

2.2 If x² + 2x ≤ 8, then (x + 4)(x − 2) ≤ 0 and A = {x : −4 ≤ x ≤ 2}. Similarly, if x² + x ≤ 6, then (x + 3)(x − 2) ≤ 0 and B = {x : −3 ≤ x ≤ 2}. Therefore, A ∩ B = {x : −3 ≤ x ≤ 2} and A ∪ B = {x : −4 ≤ x ≤ 2}.

2.2 A ∩ B ∩ C = { x : x = 2, 3, 4}

2.2 The system fails if either the first pair fails or the second pair fails (or both pairs fail). For either pair to fail, though, both of its components must fail. Therefore, A = ( A 11 ∩ A 21 ) ∪ ( A 12 ∩ A 22 ).

2.2 (a), (b), (d): intervals sketched on the real number line (graphical answers); (c): the empty set.

2.2 40

2.2 (a) { E 1, E 2} (b) { S 1, S 2, T 1, T 2} (c) { A , I }

2.2 (a) If s is a member of A ∪ ( B ∩ C ) then s belongs to A or to B ∩ C. If it is a member of A or of B ∩ C , then it belongs to A ∪ B and to A ∪ C. Thus, it is a member of ( A ∪ B ) ∩ ( A ∪ C ). Conversely, choose s in ( A ∪ B ) ∩ ( A ∪ C ). If it belongs to A , then it belongs to A ∪ ( B ∩ C ). If it does not belong to A , then it must be a member of B ∩ C. In that case it also is a member of A ∪ ( B ∩ C ).


(b) If s is a member of A ∩ ( B ∪ C ) then s belongs to A and to B ∪ C. If it is a member of B , then it belongs to A ∩ B and, hence, ( A ∩ B ) ∪ ( A ∩ C ). Similarly, if it belongs to C , it is a member of ( A ∩ B ) ∪ ( A ∩ C ). Conversely, choose s in ( A ∩ B ) ∪ ( A ∩ C ). Then it belongs to A. If it is a member of A ∩ B then it belongs to A ∩ ( B ∪ C ). Similarly, if it belongs to A ∩ C , then it must be a member of A ∩ ( B ∪ C ).

2.2 Let B = A1 ∪ A2 ∪ ... ∪ Ak. Then A1^C ∩ A2^C ∩ ... ∩ Ak^C = (A1 ∪ A2 ∪ ... ∪ Ak)^C = B^C, so the expression is simply B ∪ B^C = S.

2.2 (a) Let s be a member of A ∪ ( B ∪ C ). Then s belongs to either A or B ∪ C (or both). If s belongs to A , it necessarily belongs to ( A ∪ B ) ∪ C. If s belongs to B ∪ C , it belongs to B or C or both, so it must belong to ( A ∪ B ) ∪ C. Now, suppose s belongs to ( A ∪ B ) ∪ C. Then it belongs to either A ∪ B or C or both. If it belongs to C , it must belong to A ∪ ( B ∪ C ). If it belongs to A ∪ B , it must belong to either A or B or both, so it must belong to A ∪ ( B ∪ C ).

(b) Suppose s belongs to A ∩ ( B ∩ C ), so it is a member of A and also B ∩ C. Then it is a member of A and of B and C. That makes it a member of ( A ∩ B ) ∩ C. Conversely, if s is a member of ( A ∩ B ) ∩ C , a similar argument shows it belongs to A ∩ ( B ∩ C ).
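The two element-chasing proofs above are easy to spot-check numerically. The sketch below (not part of the original solution) verifies both the distributive and the associative laws on small example sets; the particular sets are arbitrary and any finite sets could be substituted.

```python
# Quick numeric sanity check of the set identities proved above.
A = {1, 2, 3, 4}
B = {3, 4, 5, 6}
C = {4, 6, 7}

# Distributive laws
assert A | (B & C) == (A | B) & (A | C)
assert A & (B | C) == (A & B) | (A & C)

# Associative laws
assert A | (B | C) == (A | B) | C
assert A & (B & C) == (A & B) & C

print("All four identities hold for this example.")
```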

2.2 (a) AC ∩ BC ∩ CC (b) A ∩ B ∩ C (c) A ∩ BC ∩ CC (d) ( A ∩ BC ∩ CC ) ∪ ( AC ∩ B ∩ CC ) ∪ ( AC ∩ BC ∩ C ) (e) ( A ∩ B ∩ CC ) ∪ ( A ∩ BC ∩ C ) ∪ ( AC ∩ B ∩ C )

2.2 A is a subset of B.

2.2 (a) {0} ∪ {x : 5 ≤ x ≤ 10} (b) {x : 3 ≤ x < 5} (c) {x : 0 < x ≤ 7} (d) {x : 0 < x < 3} (e) {0} ∪ {x : 3 ≤ x ≤ 10} (f) {0} ∪ {x : 7 < x ≤ 10}

2.2 (a) B and C (b) B is a subset of A.

2.2 (a) A 1 ∩ A 2 ∩ A 3 (b) A 1 ∪ A 2 ∪ A 3

The second protocol would be better if speed of approval matters. For very important issues, the first protocol is superior.

2.2 Let A and B denote the students who saw the movie the first time and the second time, respectively. Then N(A) = 850, N(B) = 690, and N[(A ∪ B)^C] = 4700 (implying that N(A ∪ B) = 1300). Therefore, N(A ∩ B) = number who saw the movie twice = 850 + 690 − 1300 = 240.

2.2 (a) (Venn diagram) (b) B = (A ∩ B) ∪ (A^C ∩ B) (c) A = (A ∩ B) ∪ (A ∩ B^C)

2.2 Let A be the set of those with MCAT scores ≥ 27 and B be the set of those with GPAs ≥ 3. We are given that N(A) = 1000, N(B) = 400, and N(A ∩ B) = 300. Then N(A^C ∩ B^C) = N[(A ∪ B)^C] = 1200 − N(A ∪ B) = 1200 − [N(A) + N(B) − N(A ∩ B)] = 1200 − [1000 + 400 − 300] = 100. The requested proportion is 100/1200.

2.2 N(A ∪ B ∪ C) = N(A) + N(B) + N(C) − N(A ∩ B) − N(A ∩ C) − N(B ∩ C) + N(A ∩ B ∩ C)

2.2 Let A be the set of those saying "yes" to the first question and B be the set of those saying "yes" to the second question. We are given that N(A) = 600, N(B) = 400, and N(A^C ∩ B) = 300. Then N(A ∩ B) = N(B) − N(A^C ∩ B) = 400 − 300 = 100, and N(A ∩ B^C) = N(A) − N(A ∩ B) = 600 − 100 = 500.

2.2 N[(A ∪ B)^C] = 120 − N(A ∪ B) = 120 − [N(A^C ∩ B) + N(A ∩ B^C) + N(A ∩ B)] = 120 − [50 + 15 + 2] = 53

Section 2.3: The Probability Function

2.3 Let L and V denote the sets of programs with offensive language and too much violence, respectively. Then P ( L ) = 0, P ( V ) = 0, and P ( L ∩ V ) = 0. Therefore, P (program complies) = P (( L ∪ V ) C ) = 1 − [ P ( L ) + P ( V ) − P ( L ∩ V )] = 0.

2.3 P ( A or B but not both) = P ( A ∪ B ) − P ( A ∩ B ) = P (a) + P (b) − P ( A ∩ B ) − P ( A ∩ B ) = 0 + 0 − 0 − 0 = 0.


2.3 (a) 1 − P ( A ∩ B ) (b) P (b) − P ( A ∩ B )

2.3 P ( A ∪ B ) = P (a) + P (b) − P ( A ∩ B ) = 0; P (a) − P ( A ∩ B ) = 0. Therefore, P (b) = 0.

2.3 No. P(A1 ∪ A2 ∪ A3) = P(at least one "6" appears) = 1 − P(no 6's appear) = 1 − (5/6)³ ≠ 1/2. The Ai's are not mutually exclusive, so P(A1 ∪ A2 ∪ A3) ≠ P(A1) + P(A2) + P(A3).

2.3 P(A or B but not both) = 0 − 0 = 0.

2.3 By inspection, B = (B ∩ A1) ∪ (B ∩ A2) ∪ ... ∪ (B ∩ An).

2.3 (a), (b): Venn-diagram answers.

2.3 P(odd man out) = 1 − P(no odd man out) = 1 − P(HHH or TTT) = 1 − 2/8 = 3/4
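As a quick illustrative check (not in the original manual), the 3/4 can be confirmed by enumerating the eight equally likely outcomes of three fair coin tosses:

```python
from itertools import product

# Enumerate the 8 equally likely outcomes of three fair coin tosses and count
# those in which one player's coin differs from the other two ("odd man out").
outcomes = list(product("HT", repeat=3))
odd_man = [o for o in outcomes if o.count("H") in (1, 2)]   # excludes HHH and TTT
print(len(odd_man), "/", len(outcomes))                      # 6 / 8 = 3/4
```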

2.3 A = {2, 4, 6, ..., 24}; B = {3, 6, 9, ..., 24}; A ∩ B = {6, 12, 18, 24}. Therefore, P(A ∪ B) = P(A) + P(B) − P(A ∩ B) = 12/24 + 8/24 − 4/24 = 16/24.

2.3 Let A : State wins Saturday and B : State wins next Saturday. Then P (a) = 0, P (b) = 0, and P (lose both) = 0 = 1 − P ( A ∪ B ), which implies that P ( A ∪ B ) = 0. Therefore, P ( A ∩ B ) = 0 + 0 − 0 = 0, so P (State wins exactly once) = P ( A ∪ B ) − P ( A ∩ B ) = 0 − 0 = 0.

Section 2.4: Conditional Probability

2.4 P(A ∪ B) = 0 and P(A ∪ B) − P(A ∩ B) = 0, so P(A ∩ B) = 0. Also, P(A|B) = 0 = P(A ∩ B)/P(B), so P(B) = 1/3 and P(A) = 2/3.

2.4 Let Ri be the event that a red chip is selected on the i th draw, i = 1, 2.

Then P(both are red) = P(R1 ∩ R2) = P(R2|R1)P(R1) = (3/4)(1/2) = 3/8.

2.4 P(A|B) = P(A ∩ B)/P(B) = [P(A) + P(B) − P(A ∪ B)]/P(B) = [a + b − P(A ∪ B)]/b. But P(A ∪ B) ≤ 1, so P(A|B) ≥ (a + b − 1)/b.

2.4 Let Wi be the event that a white chip is selected on the i-th draw, i = 1, 2. Then P(W2|W1) = P(W1 ∩ W2)/P(W1). If both chips in the urn are white, P(W1) = 1; if one is white and one is black, P(W1) = 1/2. Since each chip distribution is equally likely, P(W1) = 1·(1/2) + (1/2)·(1/2) = 3/4. Similarly, P(W1 ∩ W2) = 1·(1/2) + (1/4)·(1/2) = 5/8, so P(W2|W1) = (5/8)/(3/4) = 5/6.
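A short exact recomputation of this answer, assuming (as in the solution above) that the two urn compositions are equally likely and that the two draws are made with replacement:

```python
from fractions import Fraction as F

# The urn is equally likely to hold {white, white} or {white, black}.
configs = {("W", "W"): F(1, 2), ("W", "B"): F(1, 2)}

def p_white(chips):                    # chance one draw is white for a given urn
    return F(chips.count("W"), len(chips))

p_w1 = sum(p * p_white(c) for c, p in configs.items())          # 3/4
p_w1w2 = sum(p * p_white(c) ** 2 for c, p in configs.items())   # 5/8
print(p_w1, p_w1w2, p_w1w2 / p_w1)                              # 3/4 5/8 5/6
```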

2.4 P[(A ∩ B) | (A ∪ B)^C] = P[(A ∩ B) ∩ (A ∪ B)^C]/P[(A ∪ B)^C] = P(∅)/P[(A ∪ B)^C] = 0

2.4 (a) P ( AC ∩ BC ) = 1 − P ( A ∪ B ) = 1 − [ P (a) + P (b) − P ( A ∩ B )] = 1 − [0 + 0 − 0] = 0.

(b) P [( AC ∩ B ) ∪ ( A ∩ BC )] = P ( AC ∩ B ) + P ( A ∩ BC ) = [ P (a) − P ( A ∩ B )] + [ P (b) − P ( A ∩ B )] = [0 − 0] + [0 − 0] = 0.

(c) P ( A ∪ B ) = 0.

(d) P [( A ∩ B ) C ] = 1 − P ( A ∩ B ) = 1 − 0 = 0.

(e) P{[(A^C ∩ B) ∪ (A ∩ B^C)] | A ∪ B} = P[(A^C ∩ B) ∪ (A ∩ B^C)]/P(A ∪ B) = 0.70/0 = 70/

(f) P[(A ∩ B) | A ∪ B] = P(A ∩ B)/P(A ∪ B) = 0.25/0 = 25/

(g) P(B | A^C) = P(A^C ∩ B)/P(A^C) = [P(B) − P(A ∩ B)]/[1 − P(A)] = [0 − 0]/[1 − 0] = 30/

2.4 P(No. of heads ≥ 2 | No. of heads ≤ 2) = P(No. of heads ≥ 2 and No. of heads ≤ 2)/P(No. of heads ≤ 2) = P(No. of heads = 2)/P(No. of heads ≤ 2) = (3/8)/(7/8) = 3/7
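The 3/7 can be confirmed by brute-force enumeration of the eight outcomes; this check is illustrative and not part of the original solution:

```python
from itertools import product
from fractions import Fraction as F

# Enumerate three fair tosses; condition on "at most 2 heads".
outcomes = list(product((0, 1), repeat=3))
le2 = [o for o in outcomes if sum(o) <= 2]
both = [o for o in le2 if sum(o) >= 2]
print(F(len(both), len(le2)))   # 3/7
```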


2.4 P(first die ≥ 4 | sum = 8) = P(first die ≥ 4 and sum = 8)/P(sum = 8) = P({(4, 4), (5, 3), (6, 2)})/P({(2, 6), (3, 5), (4, 4), (5, 3), (6, 2)}) = 3/5

2.4 There are 4 ways to choose three aces (count which one is left out). There are 48 ways to choose the card that is not an ace, so there are 4 × 48 = 192 sets of cards where exactly three are aces. That gives 193 sets where there are at least three aces. The conditional probability is (1/270,725)/(193/270,725) = 1/193.
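A combinatorial double-check of the 1/193, using the same counts as the argument above:

```python
from math import comb

total = comb(52, 4)                 # 270,725 four-card hands
exactly_three = comb(4, 3) * 48     # 192 hands with exactly three aces
at_least_three = exactly_three + 1  # plus the single hand with all four aces
print((1 / total) / (at_least_three / total), 1 / at_least_three)   # both 1/193
```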

2.4 First note that P ( A ∪ B ) = 1 − P [( A ∪ B ) C ] = 1 − 0 = 0. Then P (b) = P ( A ∪ B ) − P ( A ∩ BC ) − P ( A ∩ B ) = 0 − 0 − 0 = 0. Finally P ( A | B ) = P ( A ∩ B )/ P (b) = 0.1/0 = 1/

2.4 P ( A | B ) = 0 implies P ( A ∩ B ) = 0 P (b). P ( B | A ) = 0 implies P ( A ∩ B ) = (0) P (a). Thus, 0 P (b) = 0 P (a) or P (b) = 0 P (a). Then, 0 = P (a) + P (b) = P (a) + 0 P (a) or P (a) = 0.9/1 = 0.

2.4 P [( A ∩ B ) C ] = P [( A ∪ B ) C ] + P ( A ∩ BC ) + P ( AC ∩ B ) = 0 + 0 + 0 = 0. P ( A ∪ B |( A ∩ B ) C ) = P [( A ∩ BC ) ∪ ( AC ∩ B )]/ P (( A ∩ B ) C ) = [0 + 0]/0 = 2/

2.4 P(sum ≥ 8 | at least one die shows 5) = P(sum ≥ 8 and at least one die shows 5)/P(at least one die shows 5) = P({(5, 3), (5, 4), (5, 6), (3, 5), (4, 5), (6, 5), (5, 5)})/(11/36) = (7/36)/(11/36) = 7/11

2.4 P (Outandout wins|Australian Doll and Dusty Stake don’t win) = P (Outandout wins and Australian Doll and Dusty Stake don’t win)/ P (Australian Doll and Dusty Stake don’t win) = 0.20/0 = 20/

2.4 Suppose the guard will randomly choose to name Bob or Charley if they are the two to go free. Then the probability the guard will name Bob, for example, is P (Andy, Bob) + (1/2) P (Bob, Charley) = 1/3 + (1/2)(1/3) = 1/2. The probability Andy will go free given the guard names Bob is P (Andy, Bob)/ P (Guard names Bob) = (1/3)/(1/2) = 2/3. A similar argument holds for the guard naming Charley. Andy’s concern is not justified.
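The argument can also be checked by simulation. The sketch below (an illustration, not part of the original solution) implements the guard's behavior exactly as described: he names Bob or Charley only if that prisoner goes free, flipping a fair coin when both do.

```python
import random

# Monte Carlo check: two of Andy, Bob, Charley are chosen at random to go free.
random.seed(1)
named_bob = andy_also_free = 0
for _ in range(200_000):
    free = set(random.sample(["Andy", "Bob", "Charley"], 2))
    if free == {"Bob", "Charley"}:
        named = random.choice(["Bob", "Charley"])
    else:
        named = "Bob" if "Bob" in free else "Charley"
    if named == "Bob":
        named_bob += 1
        andy_also_free += "Andy" in free
print(andy_also_free / named_bob)   # close to 2/3
```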

2.4 P(BBRWW) = P(B)P(B|B)P(R|BB)P(W|BBR)P(W|BBRW) = (4/15)(3/14)(5/13)(6/12)(5/11) = 1800/360,360. P(2, 6, 4, 9, 13) = (1/15)(1/14)(1/13)(1/12)(1/11) = 1/360,360.

2.4 Let Ki be the event that the i-th key tried opens the door, i = 1, 2, ..., n. Then P(door opens first time with 3rd key) = P(K1^C ∩ K2^C ∩ K3) = P(K1^C)·P(K2^C | K1^C)·P(K3 | K1^C ∩ K2^C) = [(n − 1)/n]·[(n − 2)/(n − 1)]·[1/(n − 2)] = 1/n.
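The telescoping product can be sanity-checked numerically for a few values of n:

```python
from fractions import Fraction as F

# The product [(n-1)/n][(n-2)/(n-1)][1/(n-2)] always collapses to 1/n.
for n in (3, 5, 10):
    p = F(n - 1, n) * F(n - 2, n - 1) * F(1, n - 2)
    print(n, p)   # always 1/n
```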

2.4 (1/52)(1/51)(1/50)(1/49) = 1/6,497,400

2.4 (1/2)(1/2)(1/2)(2/3)(3/4) = 1/16


2.4 No. Let B denote the event that the person calling the toss is correct. Let AH be the event that the coin comes up Heads and let AT be the event that the coin comes up Tails.

Then P(B) = P(B|AH)P(AH) + P(B|AT)P(AT). If the caller says "heads" with probability q, this equals q(1/2) + (1 − q)(1/2) = 1/2, whatever the value of q.

2.4 Let B be the event of a guilty verdict; let A be the event that the defense can discredit the police. Then P (b) = P ( B | A ) P (a) + P ( B | AC ) P ( AC ) = 0(0) + 0(0) = 0.

2.4 Let A 1 be the event of a 3.5-4 GPA; A 2 , of a 3.0-3 GPA; and A 3 , of a GPA less than 3. If B is the event of getting into medical school, then P (b) = P ( B | A 1 ) P ( A 1 ) + P ( B | A 2 ) P ( A 2 ) + P ( B | A 3 ) P ( A 3 ) = (0)(0) + (0)(0) + (0)(0) = 0.

2.4 Let B be the event of early release; let A be the event that the prisoner is related to someone on the governor’s staff. Then P (b) = P ( B | A ) P (a) + P ( B | AC ) P ( AC ) = (0)(0) + (0)(0) = 0.

2.4 Let A 1 be the event of being a Humanities major; A 2 , of being a Natural Science major; A 3 , of being a History major; and A 4 , of being a Social Science major. If B is the event of a male student, then P (b) = P ( B | A 1 ) P ( A 1 ) + P ( B | A 2 ) P ( A 2 ) + P ( B | A 3 ) P ( A 3 ) + P ( B | A 4 ) P ( A 4 ) = (0)(0) + (0)(0) + (0)(0) + (0)(0) = 0.

2.4 Let B denote the event that the chip drawn from Urn II is red; let AR and AW denote the events that the chips transferred are red and white, respectively.

Then P(AW|B) = P(B|AW)P(AW)/[P(B|AR)P(AR) + P(B|AW)P(AW)] = (2/4)(2/3)/[(3/4)(1/3) + (2/4)(2/3)] = 4/7.

2.4 Let Ai be the event that Urn i is chosen, i = I, II, III. Then, P ( Ai ) = 1/3, i = I, II, III. Suppose B is the event a red chip is drawn. Note that P ( B | A 1 ) = 3/8, P ( B | A 2 ) = 1/2 and P ( B | A 3 ) = 5/8.

Then P(A3|B) = P(B|A3)P(A3)/[P(B|A1)P(A1) + P(B|A2)P(A2) + P(B|A3)P(A3)] = (5/8)(1/3)/[(3/8)(1/3) + (1/2)(1/3) + (5/8)(1/3)] = 5/12.
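The same posterior falls out of a direct Bayes' rule computation; the snippet below is an illustrative check with the priors and likelihoods stated above:

```python
from fractions import Fraction as F

prior = [F(1, 3)] * 3                      # each urn equally likely
likelihood = [F(3, 8), F(1, 2), F(5, 8)]   # P(red | urn i)
total = sum(p * l for p, l in zip(prior, likelihood))
posterior = [p * l / total for p, l in zip(prior, likelihood)]
print(posterior[2])                        # 5/12
```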

2.4 If B is the event that the warning light flashes and A is the event that the oil pressure is low, then

P(A|B) = P(B|A)P(A)/[P(B|A)P(A) + P(B|A^C)P(A^C)] = (0)(0)/[(0)(0) + (0)(0)] = 0.

2.4 Let B be the event that the basement leaks, and let AT , AW , and AH denote the events that the house was built by Tara, Westview, and Hearthstone, respectively. Then P ( B | AT ) = 0, P ( B | AW ) = 0, and P ( B | AH ) = 0. Also, P ( AT ) = 2/11, P ( AW ) = 3/11, and P ( AH ) = 6/11. Applying Bayes’ rule to each of the builders shows that P ( AT | B ) = 0, P ( AW | B ) = 0, and P ( AH | B ) = 0, implying that Hearthstone is the most likely contractor.


2.4 Let B denote the event that Francesca passed, and let AX and AY denote the events that she was enrolled in Professor X ’s section and Professor Y ’s section, respectively. Since P ( B | AX ) = 0, P ( B | AY ) = 0, P ( AX ) = 0, and P ( AY ) = 0,

P(AX|B) = (0)(0)/[(0)(0) + (0)(0)] = 0.

2.4 Let B denote the event that a check bounces, and let A be the event that a customer wears sunglasses. Then P(B|A) = 0, P(B|A^C) = 1 − 0 = 0, and P(A) = 0, so P(A|B) = (0)(0)/[(0)(0) + (0)(0)] = 0.

2.4 Let B be the event that Basil dies, and define A 1 , A 2 , and A 3 to be the events that he ordered cherries flambe, chocolate mousse, or no dessert, respectively. Then P ( B | A 1 ) = 0, P ( B | A 2 ) = 0, P ( B | A 3 ) = 0, P ( A 1 ) = 0, P ( A 2 ) = 0, and P ( A 3 ) = 0. Comparing P ( A 1 | B ) and P ( A 2 | B ) suggests that Margo should be considered the prime suspect:

P(A1|B) = (0)(0)/[(0)(0) + (0)(0) + (0)(0)] = 0.

P(A2|B) = (0)(0)/[(0)(0) + (0)(0) + (0)(0)] = 0.

2.4 Define B to be the event that Josh answers a randomly selected question correctly, and let A 1 and A 2 denote the events that he was 1) unprepared for the question and 2) prepared for the question, respectively. Then P ( B | A 1 ) = 0, P ( B | A 2 ) = 1, P ( A 2 ) = p , P ( A 1 ) = 1 − p , and

P(A2|B) = 0 = P(B|A2)P(A2)/[P(B|A1)P(A1) + P(B|A2)P(A2)] = 1·p/[(0)(1 − p) + 1·p],

which implies that p = 0.7 (meaning that Josh was prepared for (0.7)(20) = 14 of the questions).

2.4 Let B denote the event that the program diagnoses the child as abused, and let A be the event that the child is abused. Then P(A) = 1/90, P(B|A) = 0, and P(B|A^C) = 0, so P(A|B) = (0)(1/90)/[(0)(1/90) + (0)(89/90)] = 0.

If P (a) = 1/1000, P ( A | B ) = 0; if P (a) = 1/50, P ( A | B ) = 0.

2.4 Let A 1 be the event of being a Humanities major; A 2 , of being a History and Culture major; and A 3 , of being a Science major. If B is the event of being a woman, then

P(A2|B) = (0)(0)/[(0)(0) + (0)(0) + (0)(0)] = 225/

Section 2.5: Independence

2.5 Six equally likely orderings are possible for any set of three distinct random numbers: x1 < x2 < x3, x1 < x3 < x2, x2 < x1 < x3, x2 < x3 < x1, x3 < x1 < x2, and x3 < x2 < x1. By inspection, P(A) = 2/6 and P(B) = 1/6, so P(A ∩ B) = P(A)·P(B) = 1/18.

2.5 (a) 1. P(A ∪ B) = P(A) + P(B) − P(A ∩ B) = 1/4 + 1/8 − 0 = 3/8
        2. P(A ∪ B) = P(A) + P(B) − P(A)P(B) = 1/4 + 1/8 − (1/4)(1/8) = 11/32

    (b) 1. P(A|B) = P(A ∩ B)/P(B) = 0/P(B) = 0
        2. P(A|B) = P(A ∩ B)/P(B) = P(A)P(B)/P(B) = P(A) = 1/4

2.5 (a) P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A)P(B) − P(A)P(C) − P(B)P(C) + P(A)P(B)P(C) (b) P(A ∪ B ∪ C) = 1 − P[(A ∪ B ∪ C)^C] = 1 − P(A^C ∩ B^C ∩ C^C) = 1 − P(A^C)P(B^C)P(C^C)

2.5 Let Ai be the event of i heads in the first two tosses, i = 0, 1, 2. Let Bi be the event of i heads in the last two tosses, i = 0, 1, 2. The A's and B's are independent. The event of interest is (A0 ∩ B0) ∪ (A1 ∩ B1) ∪ (A2 ∩ B2), and P[(A0 ∩ B0) ∪ (A1 ∩ B1) ∪ (A2 ∩ B2)] = P(A0)P(B0) + P(A1)P(B1) + P(A2)P(B2) = (1/4)(1/4) + (1/2)(1/2) + (1/4)(1/4) = 6/16 = 3/8

2.5 A and B are disjoint, so they cannot be independent.

2.5 Equation 2.5: P(A ∩ B ∩ C) = P({(1, 3)}) = 1/36 = (2/6)(3/6)(6/36) = P(A)P(B)P(C). Equation 2.5: P(B ∩ C) = P({(1, 3), (5, 6)}) = 2/36 ≠ (3/6)(6/36) = P(B)P(C)

2.5 Equation 2.3: P(A ∩ B ∩ C) = P({2, 4, 10, 12}) = 4/36 ≠ (1/2)(1/2)(1/2) = P(A)P(B)P(C). Equation 2.5: P(A ∩ B) = P({2, 4, 10, 12, 24, 26, 32, 34, 36}) = 9/36 = 1/4 = (1/2)(1/2) = P(A)P(B); P(A ∩ C) = P({1, 2, 3, 4, 5, 10, 11, 12, 13}) = 9/36 = 1/4 = (1/2)(1/2) = P(A)P(C); P(B ∩ C) = P({2, 4, 6, 8, 10, 12, 14, 16, 18}) = 9/36 = 1/4 = (1/2)(1/2) = P(B)P(C)

2.5 11 [= 6 verifications of the form P ( Ai ∩ Aj ) = P ( Ai ) ⋅ P ( Aj ) + 4 verifications of the form P ( Ai ∩ Aj ∩ Ak ) = P ( Ai ) ⋅ P ( Aj ) ⋅ P ( Ak ) + 1 verification that P ( A 1 ∩ A 2 ∩ A 3 ∩ A 4 ) = P ( A 1 ) ⋅ P ( A 2 ) ⋅ P ( A 3 ) ⋅ P ( A 4 )].

2.5 P(A) = 3/6, P(B) = 2/6, P(C) = 6/36, P(A ∩ B) = 6/36, P(A ∩ C) = 3/36, P(B ∩ C) = 2/36, and P(A ∩ B ∩ C) = 1/36. It follows that A, B, and C are mutually independent, because P(A ∩ B ∩ C) = 1/36 = P(A)·P(B)·P(C) = (3/6)(2/6)(6/36), P(A ∩ B) = 6/36 = P(A)·P(B) = (3/6)(2/6), P(A ∩ C) = 3/36 = P(A)·P(C) = (3/6)(6/36), and P(B ∩ C) = 2/36 = P(B)·P(C) = (2/6)(6/36).
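Mutual independence can also be verified mechanically. The sketch below checks all pairwise products and the triple product over the 36-outcome two-dice sample space; the events A, B, and C shown are stand-ins chosen only to have the same probabilities as in the exercise (the exercise's own events are defined in the textbook statement).

```python
from itertools import product
from fractions import Fraction as F

# Generic mutual-independence check on the two-dice sample space.
space = list(product(range(1, 7), repeat=2))

def prob(event):
    return F(sum(1 for s in space if event(s)), len(space))

A = lambda s: s[0] <= 3          # P = 1/2 (stand-in event)
B = lambda s: s[0] in (3, 4)     # P = 1/3 (stand-in event)
C = lambda s: s[1] == 5          # P = 1/6 (stand-in event)

pairs_ok = all(
    prob(lambda s, X=X, Y=Y: X(s) and Y(s)) == prob(X) * prob(Y)
    for X, Y in [(A, B), (A, C), (B, C)]
)
triple_ok = prob(lambda s: A(s) and B(s) and C(s)) == prob(A) * prob(B) * prob(C)
print(pairs_ok and triple_ok)    # True -> mutually independent
```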


2.5 P(A ∩ B ∩ C) = 0 (since the sum of two odd numbers is necessarily even) ≠ P(A)·P(B)·P(C) > 0, so A, B, and C are not mutually independent. However, P(A ∩ B) = 9/36 = P(A)·P(B) = (3/6)(3/6), P(A ∩ C) = 9/36 = P(A)·P(C) = (3/6)(18/36), and P(B ∩ C) = 9/36 = P(B)·P(C) = (3/6)(18/36), so A, B, and C are pairwise independent.

2.5 Let Ri and Gi be the events that the i-th light is red and green, respectively, i = 1, 2, 3, 4. Then P(R1) = P(R2) = 1/3 and P(R3) = P(R4) = 1/2. Because of the considerable distance between the intersections, what happens from light to light can be considered independent events. P(driver stops at least 3 times) = P(driver stops exactly 3 times) + P(driver stops all 4 times) = P((R1 ∩ R2 ∩ R3 ∩ G4) ∪ (R1 ∩ R2 ∩ G3 ∩ R4) ∪ (R1 ∩ G2 ∩ R3 ∩ R4) ∪ (G1 ∩ R2 ∩ R3 ∩ R4) ∪ (R1 ∩ R2 ∩ R3 ∩ R4)) = (1/3)(1/3)(1/2)(1/2) + (1/3)(1/3)(1/2)(1/2) + (1/3)(2/3)(1/2)(1/2) + (2/3)(1/3)(1/2)(1/2) + (1/3)(1/3)(1/2)(1/2) = 7/36.
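An exact enumeration over the sixteen red/green patterns for the four independent lights confirms the 7/36 (illustrative check, not part of the original solution):

```python
from itertools import product
from fractions import Fraction as F

# P(red) for the four independent lights.
p_red = [F(1, 3), F(1, 3), F(1, 2), F(1, 2)]
total = F(0)
for pattern in product((0, 1), repeat=4):        # 1 = stopped at that light
    if sum(pattern) >= 3:
        pr = F(1)
        for stop, p in zip(pattern, p_red):
            pr *= p if stop else (1 - p)
        total += pr
print(total)   # 7/36
```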

2.5 Let M, L, and G be the events that a student passes the mathematics, language, and general knowledge tests, respectively. Then P(M) = 6175/9500, P(L) = 7600/9500, and P(G) = 8075/9500. P(student fails to qualify) = P(student fails at least one exam) = 1 − P(student passes all three exams) = 1 − P(M ∩ L ∩ G) = 1 − P(M)·P(L)·P(G) = 1 − (0.65)(0.80)(0.85) = 0.558.

2.5 Let Ai denote the event that switch Ai closes, i = 1, 2, 3, 4. Since the Ai's are independent events, P(circuit is completed) = P((A1 ∩ A2) ∪ (A3 ∩ A4)) = P(A1 ∩ A2) + P(A3 ∩ A4) − P((A1 ∩ A2) ∩ (A3 ∩ A4)) = 2p² − p⁴.
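For a numerical sanity check of the formula 2p² − p⁴, one can sum over all sixteen open/closed switch patterns for any particular p; the value p = 0.3 below is arbitrary:

```python
from itertools import product

# Each switch closes independently with probability p.
p = 0.3
prob = 0.0
for s1, s2, s3, s4 in product((0, 1), repeat=4):
    if (s1 and s2) or (s3 and s4):          # circuit completed
        pr = 1.0
        for s in (s1, s2, s3, s4):
            pr *= p if s else (1 - p)
        prob += pr
print(round(prob, 10), round(2 * p**2 - p**4, 10))   # both 0.1719
```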

2.5 Let p be the probability of having a winning game card. Then 0 = P(winning at least once in 5 tries) = 1 − P(not winning in 5 tries) = 1 − (1 − p)⁵, so p = 0.

2.5 Let AH, AT, BH, BT, CH, and CT denote the events that players A, B, and C throw heads and tails on individual tosses. Then P(A throws first head) = P(AH ∪ (AT ∩ BT ∩ CT ∩ AH) ∪ ···) = 1/2 + (1/8)(1/2) + (1/8)²(1/2) + ··· = (1/2)·1/(1 − 1/8) = 4/7. Similarly, P(B throws first head) = P((AT ∩ BH) ∪ (AT ∩ BT ∩ CT ∩ AT ∩ BH) ∪ ···) = 1/4 + (1/8)(1/4) + (1/8)²(1/4) + ··· = (1/4)·1/(1 − 1/8) = 2/7. P(C throws first head) = 1 − 4/7 − 2/7 = 1/7.
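A simulation of the cyclic game (illustrative only, not part of the original solution) reproduces the probabilities 4/7, 2/7, and 1/7:

```python
import random

# A, B, C toss a fair coin in turn, repeating the cycle until someone gets heads.
random.seed(2)
wins = {"A": 0, "B": 0, "C": 0}
n_games = 140_000
for _ in range(n_games):
    while True:
        winner = next((p for p in ("A", "B", "C") if random.random() < 0.5), None)
        if winner is not None:
            wins[winner] += 1
            break
print({p: round(w / n_games, 3) for p, w in wins.items()})  # roughly 0.571, 0.286, 0.143
```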

2.5 P (at least one child becomes adult) = 1 − P (no child becomes adult) = 1 0 − n.

Then1 2−≥ n 0 implies ln 0. ln 0. n ≥ or n ≥0, so take n = 1.
