
Saturday, July 18, 2020

RANDOM VARIABLE

INTRODUCTION:

Some think that probability is very easy, and some think it is very tough.
      If you think probability is tough, then proceed through our probability blogs to learn it easily.

Let's proceed to our topic.
✦ A random variable is often described as a variable whose values are determined by the outcomes of a random experiment.
OR,
      We assign a real value to each outcome of a sample space; this assignment is called a random variable.
✦ A random variable is a function X : S ➝ R, i.e. a random variable is a function whose domain is the sample space of a random experiment (S) and whose range is the real line (R).

 Example: Let X be the random variable counting the number of heads in the experiment of tossing a coin twice. In this case S = { HH, HT, TH, TT } and X(HH) = 2, X(HT) = 1, X(TH) = 1, X(TT) = 0. Thus the domain of X is S and its range is {0, 1, 2}.
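As a quick sketch (the code and names are illustrative, not from the original post), the same example can be checked in Python by treating X literally as a function on S:

    from fractions import Fraction

    S = ["HH", "HT", "TH", "TT"]              # sample space of two tosses
    X = {w: w.count("H") for w in S}          # X assigns a real number to each outcome
    print(X)                                  # {'HH': 2, 'HT': 1, 'TH': 1, 'TT': 0}

    # P(X <= 1): count the equally likely outcomes w with X(w) <= 1
    print(Fraction(sum(1 for w in S if X[w] <= 1), len(S)))   # 3/4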


✦ For a mathematical and rigorous definition of the random variable, let us consider the triplet (S, B, P), where S is the sample space, B is the σ-field of subsets of S and P is the probability function on B.
   So, mathematically, a random variable is a function X(ω) with domain S and range (-∞, ∞), such that for every real number 'a', the event { ω : X(ω) ≤ a } ∈ B.
✦ Probability statements about a random variable X are written in the form P( X ≤ a ) for a real number 'a'.
     For the simple example given above, P( X ≤ 1 ) = P{ HT, TH, TT } = 3/4.

NOTATION:

 A one-dimensional random variable will be denoted by capital letters X, Y, Z, ... etc. A typical outcome of the experiment (i.e. a typical element of the sample space) will be denoted by ω or e. Thus X(ω) represents the real number which the random variable X associates with the outcome ω.
 The values of a random variable will be denoted by small letters x, y, z, ... etc.

SOME PROPERTIES OF RANDOM VARIABLE :

✦ If X1, X2 are random variables and C is a constant, then CX1, X1 + X2, X1 - X2, X1 * X2 are also random variables.
✦ If X is a random variable, then
  • 1/X, where (1/X)(ω) = ∞ if X(ω) = 0,
  • X⁺, where X⁺(ω) = max[ 0, X(ω) ],
  • X⁻, where X⁻(ω) = -min[ 0, X(ω) ],
  • | X |
are also random variables.
✦ If X1, X2 are random variables, then (i) max[ X1, X2 ] and (ii) min[ X1, X2 ] are also random variables.
✦ If X is a random variable and f(.) is a continuous function, then f(X) is a random variable.
✦ If X is a random variable and f(.) is an increasing function, then f(X) is a random variable.
✦ If f is a function of bounded variation on every finite interval [a, b] and X is a random variable, then f(X) is a random variable.
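A minimal sketch of the first few properties (an assumed illustration in Python, reusing the coin-toss variable X from the example above together with a second variable Y = X - 1):

    S = ["HH", "HT", "TH", "TT"]
    X = {w: w.count("H") for w in S}          # number of heads
    Y = {w: X[w] - 1 for w in S}              # another random variable, Y = X - 1

    # each derived quantity is again a function on S, hence a random variable
    sum_XY  = {w: X[w] + Y[w] for w in S}     # X + Y
    prod_XY = {w: X[w] * Y[w] for w in S}     # X * Y
    Y_plus  = {w: max(0, Y[w]) for w in S}    # Y+ = max[0, Y]
    Y_minus = {w: -min(0, Y[w]) for w in S}   # Y- = -min[0, Y]
    abs_Y   = {w: abs(Y[w]) for w in S}       # |Y|
    max_XY  = {w: max(X[w], Y[w]) for w in S} # max[X, Y]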


Sunday, June 28, 2020

THEOREMS ON PROBABILITY OF EVENTS (PART-2)

[If you are a beginner in probability, you should read Theorems on Probability of Events (Part-1) before reading the content below.]

THEOREM- 4:

                   For any 3 events A, B, C with P( C ) > 0,  P( A ⋃ B | C ) = P( A | C ) + P( B | C ) - P( A ⋂ B | C ).
 

PROOF:

      We have P( A ⋃ B ) = P( A ) + P( B ) - P( A ⋂ B ).
      Replacing A by A ⋂ C and B by B ⋂ C, we get
 ⇒ P[( A ⋂ C ) ⋃ ( B ⋂ C )] = P( A ⋂ C ) + P( B ⋂ C ) - P( A ⋂ B ⋂ C )
         Dividing both sides by P( C ), where P( C ) > 0, we get
 ⇒ P[( A ⋂ C ) ⋃ ( B ⋂ C )] / P( C ) = P( A ⋂ C ) / P( C ) + P( B ⋂ C ) / P( C ) - P( A ⋂ B ⋂ C ) / P( C )
 ⇒ P[( A ⋃ B ) ⋂ C ] / P( C ) = P( A | C ) + P( B | C ) - P( A ⋂ B | C )
 ⇒ P[( A ⋃ B ) | C ] = P( A | C ) + P( B | C ) - P( A ⋂ B | C )
                                                                        [Proved]
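A quick numerical check of Theorem 4 (an assumed example, not from the post): on two dice, take A = "first die is even", B = "total is 8", C = "first die ≤ 3", and compare both sides by counting outcomes.

    from fractions import Fraction
    from itertools import product

    S = list(product(range(1, 7), repeat=2))             # 36 equally likely outcomes
    A = {w for w in S if w[0] % 2 == 0}                   # first die even
    B = {w for w in S if sum(w) == 8}                     # total is 8
    C = {w for w in S if w[0] <= 3}                       # first die <= 3

    def cond(E, F):                                       # P(E | F) by counting
        return Fraction(len(E & F), len(F))

    lhs = cond(A | B, C)
    rhs = cond(A, C) + cond(B, C) - cond(A & B, C)
    print(lhs, rhs, lhs == rhs)                           # both sides agree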

THEOREM - 5:

           For any 3 events A, B, C with P( C ) > 0,  P( A ⋂ B' | C ) + P( A ⋂ B | C ) = P( A | C ), where B' denotes the complement of event B.

PROOF :

          We have
         P( A ⋂ B' | C ) + P( A ⋂ B | C )
     = P( A ⋂ B' ⋂ C ) / P( C ) + P( A ⋂ B ⋂ C ) / P( C )
     = [ P( A ⋂ B' ⋂ C ) + P( A ⋂ B ⋂ C ) ] / P( C )
     = P( A ⋂ C ) / P( C )          [ ∵ ( A ⋂ B' ⋂ C ) and ( A ⋂ B ⋂ C ) are disjoint and their union is A ⋂ C ]
     = P( A | C )                                  [Proved]
 

 THEOREM - 6:

         For any 3 events A, B, C defined on the sample space S such that B ⊂ C and P( A ) > 0, we have P( C | A ) ≥ P( B | A ).

 PROOF:

      P( C | A ) = P( C ⋂ A ) / P( A )      [ By definition of conditional probability ]

     = P[ ( B ⋂ C ⋂ A ) ⋃ ( B' ⋂ C ⋂ A ) ] / P( A )

     = P( B ⋂ C ⋂ A ) / P( A ) + P( B' ⋂ C ⋂ A ) / P( A )
     = P( B ⋂ C | A ) + P( B' ⋂ C | A )
    Since B ⊂ C ⇒ B ⋂ C = B, so
           ⇒ P( C | A ) = P( B | A ) + P( B' ⋂ C | A )
           ⇒ P( C | A ) ≥ P( B | A )          [ ∵ P( B' ⋂ C | A ) ≥ 0 ]                               [Proved]

 Pair-wise Independent Events:

         Definition:  A set of events A1, A2, ..., An is said to be pair-wise independent if
          P( Ai ⋂ Aj ) = P( Ai ) . P( Aj )   ∀ i ≠ j.

 Conditions for Mutual Independence of ' n ' Events :

    Let S denote the sample space for a number of events. The events are said to be mutually independent if the probability of the simultaneous occurrence of any finite number of them is equal to the product of their separate probabilities.
               If A1, A2, ..., An are 'n' events, then for their mutual independence we should have
       (i)  P( Ai ⋂ Aj ) = P( Ai ) . P( Aj ),    [ i ≠ j ; i, j = 1,2,...,n ]
      (ii) P( Ai ⋂ Aj ⋂ Ak ) = P( Ai ) . P( Aj ) . P( Ak ),  [ i ≠ j ≠ k ; i, j, k = 1,2,...,n ]
       and so on, up to
            P( A1 ⋂ A2 ⋂ ... ⋂ An ) = P( A1 ) . P( A2 ) . ... . P( An ).
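A classic illustration of why condition (i) alone is not enough (an assumed example, not from the post): on two fair coin tosses, three events can be pair-wise independent without being mutually independent.

    from fractions import Fraction
    from itertools import product

    S = list(product("HT", repeat=2))                    # {HH, HT, TH, TT}
    P = lambda E: Fraction(len(E), len(S))               # equally likely outcomes

    A = {w for w in S if w[0] == "H"}                    # first toss is a head
    B = {w for w in S if w[1] == "H"}                    # second toss is a head
    C = {w for w in S if w.count("H") == 1}              # exactly one head

    print(P(A & B) == P(A) * P(B))                       # True
    print(P(A & C) == P(A) * P(C))                       # True
    print(P(B & C) == P(B) * P(C))                       # True
    print(P(A & B & C) == P(A) * P(B) * P(C))            # False -> not mutually independent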

 

Remarks : 

       1. It may be observed that pair-wise or mutual independence of events A1, A2, ..., An is defined only when P( Ai ) ≠ 0, for i = 1,2,...,n.
       2. If the events A and B are such that P( A ) ≠ 0, P( B ) ≠ 0 and A is independent of B, then B is independent of A.
      Proof :  We are given that P( A | B ) = P( A )
                   [ ∵ A is independent of B ]
                   ⇒ P( A ⋂ B ) / P( B ) = P( A )
                   ⇒ P( A ⋂ B ) = P( A ) . P( B )
                   ⇒ P( B ⋂ A ) / P( A ) = P( B )    [ ∵ P( A ) ≠ 0 and A ⋂ B = B ⋂ A ]
                   ⇒ P( B | A ) = P( B ),  i.e. B is independent of A.


Friday, June 26, 2020

THEOREMS ON PROBABILITY OF EVENTS (PART-1)

  Before you read this post, I suggest you read its previous session, i.e. "Introduction to Probability"; CLICK HERE to read.

 THEOREM-1:

             Probability of the impossible event is zero, i.e. P( Φ ) = 0.
 

    PROOF:

              The impossible event contains no sample point, and hence the certain event S and the impossible event Φ are mutually exclusive.
     Hence      S ⋃ Φ = S,  [ where S = sample space ]
             ∴      P( S ⋃ Φ ) = P( S )
             ⇒    P( S ) + P( Φ ) = P( S )        [ ∵ S and Φ are mutually exclusive ]
             ⇒    P( Φ ) = 0                                   [Proved]

REMARK:

    It may be noted that P( A ) = 0 does not imply that A is necessarily an empty set. In practice, probability 0 is assigned to events which are extremely rare. For example, consider the random tossing of a coin: the event that the coin will stand erect on its edge is assigned the probability 0.

 

THEOREM -2:

           Probability of the complementary event Ā of A is given by P( Ā ) = 1 - P( A ).

    PROOF:

                 We know that A and Ā are disjoint events.
             Moreover,          A ⋃ Ā = S
                                    ⇒ P( A ⋃ Ā ) = P( S )
                                    ⇒ P( A ) + P( Ā ) = 1          [ ∵ P( S ) = 1 ]
                                    ⇒ P( Ā ) = 1 - P( A )                                   [Proved]

                                                

THEOREM - 3:

                      For any event A, 0 ≤ P( A ) ≤ 1.

PROOF:

         Clearly, for any event A,
               we have  Φ ⊆ A ⊆ S
                         ⇒ | Φ | ≤ | A | ≤ | S |          [ taking the number of sample points, for a finite sample space with equally likely outcomes ]
                         ⇒ | Φ | / | S |  ≤  | A | / | S |  ≤  | S | / | S |
                         ⇒  0  ≤ P( A ) ≤ 1        [Proved]

ADDITION RULE:

    For any two events A and B                

            P( A ⋃ B ) = P( A ) + P( B ) - P( A ⋂ B ).

  ★    If A and B are two mutually exclusive  events then,
                      P( A ⋃ B ) = P( A ) + P( B )                                            [ ∵ P( A ⋂ B ) = 0 ]
 ★    For any three events A  , B , C ,
                        
                  P( A ⋃ B ⋃ C ) = P( A ) + P( B ) + P( C ) - P( A ⋂ B ) - P( B ⋂ C ) - P( C ⋂ A ) + P( A ⋂ B ⋂ C )
 
    PROOF:
           Let us take A ⋃ B = D
      P( A ⋃ B ⋃ C ) = P( D ⋃ C )
      = P( D ) + P( C ) - P( D ⋂ C )
      = P( A ⋃ B ) + P( C ) - P( ( A ⋃ B ) ⋂ C )
      = P( A ⋃ B ) + P( C ) - P( ( A ⋂ C ) ⋃ ( B ⋂ C ) )
      = P( A ) + P( B ) - P( A ⋂ B ) + P( C ) - [ P( A ⋂ C ) + P( B ⋂ C ) - P( ( A ⋂ C ) ⋂ ( B ⋂ C ) ) ]
      = P( A ) + P( B ) + P( C ) - P( A ⋂ B ) - P( B ⋂ C ) - P( C ⋂ A ) + P( A ⋂ B ⋂ C )
                                                                                                                                            [Proved]
 ★  If A , B , C are the pair-wise mutually exclusive events , then 
                            
             P( A ⋃ B ⋃ C )=P( A ) + P( B ) + P( C ) .
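A quick numerical check of the three-event formula (an assumed example on a single fair die, not from the post):

    from fractions import Fraction

    S = set(range(1, 7))
    P = lambda E: Fraction(len(E), len(S))     # equally likely faces

    A = {1, 2, 3}                              # assumed events
    B = {2, 4, 6}
    C = {3, 4, 5}

    lhs = P(A | B | C)
    rhs = (P(A) + P(B) + P(C)
           - P(A & B) - P(B & C) - P(C & A)
           + P(A & B & C))
    print(lhs, rhs, lhs == rhs)                # both sides agree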

 Example:
          A die is thrown twice. Find
          i) P( total is 8 )
         ii) P( total is 10 )
        iii) P( total is 8 and 10 )
        iv) P( total is 8 or 10 )
         v) P( total is an even number )
        vi) P( total is not 8 )
       vii) P( total is an even number but not 8 )

Answer-
  Since the die is thrown twice, the sample space S has | S | = 36 equally likely outcomes.
       i) Let A be the event that the total is 8.
         A = { (2,6), (3,5), (4,4), (5,3), (6,2) },  | A | = 5
                 so, P( A ) = | A | / | S | = 5/36.

      ii) Let B be the event that the total is 10.
                   B = { (4,6), (5,5), (6,4) },  | B | = 3
                 so, P( B ) = | B | / | S | = 3/36 = 1/12.

     iii) P( total is 8 and 10 )
           = P( A ⋂ B ) = P( Φ ) = 0

      iv) P( total is 8 or 10 )
          = P( A ⋃ B ) = P( A ) + P( B ) - P( A ⋂ B )
          = 5/36 + 3/36 - 0 = 8/36 = 2/9

       v) Let C be the event that the total is an even number.
              P( C ) = 18/36 = 1/2

      vi) P( total is not 8 )
            = 1 - P( A ) = 1 - (5/36) = 31/36

   vii) P( total is an even number but not 8 )
             = P( total is an even number ) - P( total is 8 )
             = (18/36) - (5/36) = 13/36.
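A short sketch verifying answers (i)-(vii) by enumerating the 36 outcomes (illustrative code, not from the original post):

    from fractions import Fraction
    from itertools import product

    S = list(product(range(1, 7), repeat=2))
    P = lambda E: Fraction(len(E), len(S))

    A = {w for w in S if sum(w) == 8}          # total is 8
    B = {w for w in S if sum(w) == 10}         # total is 10
    C = {w for w in S if sum(w) % 2 == 0}      # total is even

    print(P(A))            # 5/36
    print(P(B))            # 1/12
    print(P(A & B))        # 0
    print(P(A | B))        # 2/9
    print(P(C))            # 1/2
    print(1 - P(A))        # 31/36
    print(P(C - A))        # 13/36  (even total but not 8)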
 
    
 
  

Friday, June 19, 2020

Introduction to Probability (Chapter-1)

Probability:
Probability is the numerical description of how likely an event is to occur.

Types of Probability:
1. Classical Probability:
In this type of probability, the outcomes are equally likely and can be found by simply listing them.
Ex. Tossing a fair coin: P(heads) = 1/2.
2. Experimental/Empirical Probability:
This is based on the number of times an event occurs divided by the total number of trials.
Ex. A coin is flipped 50 times and lands on heads 28 times, so the empirical probability of heads is 28/50.
3. Subjective Probability:
It is based on a person's own reasoning and judgement.
Ex. A cricket fan may predict his/her favourite team's score and guess that it will win.
4. Theoretical Probability:
It is an approach based on reasoning about the possible chances of something happening, i.e. the number of favourable outcomes divided by the total number of possible outcomes.
Ex. The probability of getting a 3 on a die is 1/6.
5. Axiomatic Probability:
Axiomatic probability is a unifying probability theory. It sets down a set of axioms (rules) that apply to all types of probability, including frequentist probability and classical probability. These rules, based on Kolmogorov's three axioms, set the starting points for mathematical probability. (They are discussed below.)
Elements of a Probabilistic Model:
1. Experiment:
Any process that produces uncertain outcomes is called an experiment.
Ex. Tossing a coin.
2. Sample Space:
The set of all possible outcomes of an experiment.
3. Event:
An event is a subset of the sample space, i.e. a collection of the outcomes we are interested in.
4. P(A):
It is called the probability of the event A; for equally likely outcomes it is the ratio of the number of outcomes in A to the number of outcomes in the sample space.
It is always between 0 and 1.


Types Of Events:

1. Simple Event/Elementary Event

If the event has only one sample point of a sample space, it is called a simple event or an elementary event. It is an event that consists of exactly one outcome. Let us understand this with an example: say you throw a die; the possibility of 2 appearing on the die is a simple event, given by E = {2}.

2. Compound Event

As opposed to a simple event, if an event consists of more than one sample point of the sample space, it is called a compound event. It involves combining two or more outcomes together and finding the probability of such a combination.

For example, when we throw a die, the possibility of an even number appearing is a compound event, as there is more than one possibility; there are three possibilities, i.e. E = {2, 4, 6}.


3. Certain Event/Sure Event

Just as the name suggests, an event which is sure to occur in any given experiment is a certain event. The probability of this type of event is 1.

4. Impossible Event

On the other hand, when an event cannot occur, i.e. there is no chance of the event occurring, it is said to be an impossible event. The probability of this event is 0. For example, the event that a card drawn from a deck is both red and black is an impossible event.

5. Equally likely Events

When the outcomes of an experiment are equally likely to happen, they are called equally likely events. Like during a coin toss you are equally likely to get heads or tails.

6. Complementary Events

The non-occurrence of an event A is called the complementary event of A. It is denoted by " A' " or " Ac ".

7. Mutually Exclusive Events

Two events A and B are called mutually exclusive events if A ⋂ B = Φ,

i.e. they have no common elements.

Probability Axioms:
  1. (Non-negativity)  P(A) ≥ 0 for every event A.
  2. (Additivity) If A and B are two disjoint events, then the probability of their union is  P(A ⋃ B) = P(A) + P(B).
  3. (Normalization) The probability of the entire sample space S is equal to 1, i.e. P(S) = 1.

Conditional Probability

The conditional probability of an event B is the probability that the event will occur given the knowledge that an event A has already occurred. This probability is written P(B|A), the notation for the probability of B given A, and is defined by P(B|A) = P(A ⋂ B) / P(A), provided P(A) > 0. In the case where events A and B are independent (where event A has no effect on the probability of event B), the conditional probability of event B given event A is simply the probability of event B, that is P(B).
[Note: P(A) in P(A|B) is called the prior probability.]

Example:
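A minimal sketch of the definition (an assumed example): computing P( total ≥ 10 | first die is 5 ) on the two-dice sample space, straight from P(B|A) = P(A ⋂ B) / P(A).

    from fractions import Fraction
    from itertools import product

    S = list(product(range(1, 7), repeat=2))
    A = {w for w in S if w[0] == 5}                   # first die shows 5
    B = {w for w in S if sum(w) >= 10}                # total is at least 10

    P_B_given_A = Fraction(len(A & B), len(A))        # P(A ⋂ B) / P(A)
    print(P_B_given_A)                                # 1/3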

Independence:

Two events A and B are independent if and only if P( A ⋂ B ) = P( A ) . P( B ).
EXAMPLE:

Law of Total Probability

For two events A and B associated with a sample space S, the sample space can be divided into the sets A ∩ B′, A ∩ B, A′ ∩ B, A′ ∩ B′. This collection is said to be mutually disjoint or pairwise disjoint because any pair of sets in it is disjoint. These sets are better known as a partition of the sample space.

This can be represented by the Venn diagram as shown in fig. 1. In cases where the probability of occurrence of one event depends on the occurrence of other events, we use total probability theorem.

In other words, for a partition B1, B2, ..., Bn of the sample space with P( Bi ) > 0,

          P( A ) = P( A | B1 ) P( B1 ) + P( A | B2 ) P( B2 ) + ... + P( A | Bn ) P( Bn ).
EXAMPLE:

I have three bags that each contain  marbles:

  • Bag 1 has  red and  blue marbles;
  • Bag 2 has 0 red and 0 blue marbles;
  • Bag 3 has red and 5 blue marbles.

I choose one of the bags at random and then pick a marble from the chosen bag, also at random. What is the probability that the chosen marble is red?

Solution:

Let R be the event that the chosen marble is red, and let Bi be the event that I choose Bag i. We already know that P( B1 ) = P( B2 ) = P( B3 ) = 1/3, since a bag is chosen at random, and each P( R | Bi ) can be read off from the marble counts in bag i.

We choose our partition as { B1, B2, B3 }. Note that this is a valid partition because, firstly, the Bi's are disjoint (only one of them can happen), and secondly, their union is the entire sample space, as one of the bags will be chosen for sure. Using the law of total probability, we can write

          P( R ) = P( R | B1 ) P( B1 ) + P( R | B2 ) P( B2 ) + P( R | B3 ) P( B3 ).

Bayes' Rule

Suppose that we know P(A|B), but we are interested in the probability P(B|A). Using the definition of conditional probability, we have

P(A|B) P(B) = P(A ⋂ B) = P(B|A) P(A).
Dividing by P(A), we obtain
P(B|A) = P(A|B) P(B) / P(A),
which is the famous Bayes' rule. Often, in order to find P(A) in Bayes' formula we need to use the law of total probability, so sometimes Bayes' rule is stated as
P(Bj|A) = P(A|Bj) P(Bj) / Σi P(A|Bi) P(Bi),
where B1, B2, ..., Bn form a partition of the sample space.


Example:
In the above example, suppose we observe that the chosen marble is red. What is the probability that Bag 1 was chosen? By Bayes' rule, P( B1 | R ) = P( R | B1 ) P( B1 ) / P( R ).
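A sketch of both rules in code, with assumed (hypothetical) marble counts for the three bags, since the point here is the mechanics rather than the specific numbers:

    from fractions import Fraction

    # assumed counts (red, blue) per bag -- hypothetical, not the post's numbers
    bags = {1: (7, 3), 2: (2, 8), 3: (5, 5)}

    P_bag = Fraction(1, 3)                                    # each bag chosen at random
    P_red_given_bag = {i: Fraction(r, r + b) for i, (r, b) in bags.items()}

    # law of total probability: P(R) = sum_i P(R | B_i) P(B_i)
    P_red = sum(P_red_given_bag[i] * P_bag for i in bags)
    print(P_red)

    # Bayes' rule: P(B_1 | R) = P(R | B_1) P(B_1) / P(R)
    print(P_red_given_bag[1] * P_bag / P_red)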

Conditional Independence

Two events A and B are conditionally independent given an event C with P(C) > 0 if P( A ⋂ B | C ) = P( A | C ) . P( B | C ).
  The conditional probability of A given B is represented by P(A|B). A and B are said to be independent if P(A) = P(A|B) (or, alternatively, if P(A ⋂ B) = P(A) P(B), because of the formula for conditional probability).

Example 1: Suppose Norman and Martin each toss separate coins. Let A represent the variable "Norman's toss outcome", and B represent the variable "Martin's toss outcome". Both A and B have two possible values (heads and tails). It would be uncontroversial to assume that A and B are independent: evidence about B will not change our belief in A.
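A minimal enumeration check of this example (with an assumed encoding of the outcomes):

    from fractions import Fraction
    from itertools import product

    S = list(product("HT", repeat=2))                 # (Norman, Martin)
    P = lambda E: Fraction(len(E), len(S))

    A = {w for w in S if w[0] == "H"}                 # Norman tosses a head
    B = {w for w in S if w[1] == "H"}                 # Martin tosses a head

    print(P(A & B) == P(A) * P(B))                    # True: A and B are independent
    print(Fraction(len(A & B), len(B)) == P(A))       # P(A | B) == P(A): True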