Theory 1

Two events are independent when information about one of them does not change our probability estimate for the other.

Independence

Events A and B are independent when these (logically equivalent) equations hold:

  • P[B|A] = P[B]
  • P[A|B] = P[A]
  • P[B ∩ A] = P[B] P[A]

Note that the last equation is symmetric in A and B:

  • Check: B ∩ A = A ∩ B and P[B] P[A] = P[A] P[B]
  • This symmetric version is the preferred definition of independence, since it does not require P[A] or P[B] to be nonzero.
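The three conditions can be checked exactly on a small sample space. The following sketch uses a hypothetical example (not from the notes above): one roll of a fair six-sided die, with A = "the roll is even" and B = "the roll is at most 4".

```python
from fractions import Fraction

# Hypothetical example: one roll of a fair six-sided die.
omega = set(range(1, 7))
A = {2, 4, 6}        # "roll is even"
B = {1, 2, 3, 4}     # "roll is at most 4"

def P(event):
    # Uniform probability: |event| / |omega|, as an exact fraction.
    return Fraction(len(event & omega), len(omega))

# The three logically equivalent conditions:
print(P(A & B) == P(A) * P(B))        # product rule
print(P(A & B) / P(A) == P(B))        # P[B|A] = P[B]
print(P(A & B) / P(B) == P(A))        # P[A|B] = P[A]
```

Here P[A] = 1/2, P[B] = 2/3, and P[A ∩ B] = 1/3 = (1/2)(2/3), so all three checks print True.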

Multiple-independence

A collection of events A1, …, An is mutually independent when every subcollection Ai1, …, Aik satisfies:

P[Ai1 ∩ ⋯ ∩ Aik] = P[Ai1] ⋯ P[Aik]

A potentially weaker condition for a collection A1, …, An is called pairwise independence, which holds when all 2-member subcollections are independent:

P[Ai ∩ Aj] = P[Ai] P[Aj]   for all i ≠ j
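Pairwise independence is genuinely weaker than mutual independence. A classic illustration (an assumption here, not taken from the notes above): flip two fair coins, and consider the events "first flip is heads", "second flip is heads", and "the two flips match". Each pair satisfies the product rule, but the full triple does not.

```python
from fractions import Fraction
from itertools import product

# Sample space: two independent fair coin flips, e.g. ('H', 'T').
omega = set(product("HT", repeat=2))

A1 = {w for w in omega if w[0] == "H"}    # first flip is heads
A2 = {w for w in omega if w[1] == "H"}    # second flip is heads
A3 = {w for w in omega if w[0] == w[1]}   # the two flips match

def P(event):
    # Uniform probability on the four equally likely outcomes.
    return Fraction(len(event), len(omega))

# Every 2-member subcollection is independent:
print(P(A1 & A2) == P(A1) * P(A2))   # True
print(P(A1 & A3) == P(A1) * P(A3))   # True
print(P(A2 & A3) == P(A2) * P(A3))   # True

# ...but the 3-member product rule fails:
print(P(A1 & A2 & A3) == P(A1) * P(A2) * P(A3))   # False: 1/4 != 1/8
```

Each event has probability 1/2, and each pairwise intersection is the single outcome ('H', 'H') with probability 1/4 = (1/2)(1/2); but the triple intersection is also ('H', 'H'), so its probability 1/4 differs from (1/2)³ = 1/8.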

One could also define 3-member independence, or more generally k-member independence, by requiring the product rule only for subcollections of that size. Plain ‘independence’ means any-member independence, i.e. mutual independence.