Let the \(m\) events \(B_1, B_2, \ldots, B_m\) constitute a partition of the sample space \(\mathbf{S}\). That is, the \(B_i\) are mutually exclusive:
\(B_i\cap B_j=\emptyset\) for \(i\ne j\)
and exhaustive:
\(\mathbf{S}=B_1\cup B_2\cup \ldots \cup B_m\)
Also, suppose the prior probability of each event \(B_i\) is positive, that is, \(P(B_i)>0\) for \(i=1, \ldots, m\). Now, if \(A\) is any event, then \(A\) can be written as the union of \(m\) mutually exclusive events, namely:
\(A=(A\cap B_1)\cup(A\cap B_2)\cup\ldots\cup (A\cap B_m)\)
Therefore, by additivity and the multiplication rule \(P(A\cap B_i)=P(B_i)\times P(A|B_i)\):
\begin{align} P(A) &= P(A\cap B_1)+P(A\cap B_2)+\ldots +P(A\cap B_m)\\ &= \sum\limits_{i=1}^m P(A\cap B_i)\\ &= \sum\limits_{i=1}^m P(B_i) \times P(A|B_i) \end{align}
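For concreteness, here is a minimal Python sketch of that last sum, the so-called law of total probability. The three events and every number in it are made up purely for illustration; they are not from an example in these notes.

```python
# Hypothetical partition of S into three events B_1, B_2, B_3.
priors = [0.2, 0.3, 0.5]       # P(B_i): positive and summing to 1
likelihoods = [0.9, 0.5, 0.1]  # P(A | B_i) for some event A

# P(A) = sum over i of P(B_i) * P(A | B_i)
p_a = sum(p * l for p, l in zip(priors, likelihoods))
print(p_a)  # 0.2*0.9 + 0.3*0.5 + 0.5*0.1 = 0.38
```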
And so, as long as \(P(A)>0\), the posterior probability of event \(B_k\), given that event \(A\) has occurred, is:
\(P(B_k|A)=\dfrac{P(B_k \cap A)}{P(A)}=\dfrac{P(B_k)\times P(A|B_k)}{\sum\limits_{i=1}^m P(B_i)\times P(A|B_i)}\)
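Continuing the illustrative numbers from the sketch above, the posterior is just one term of that sum divided by the whole sum. The `posterior` function and its inputs here are hypothetical, assumed only for this sketch.

```python
def posterior(priors, likelihoods, k):
    """P(B_k | A) by Bayes' Theorem: prior times likelihood, over total probability."""
    p_a = sum(p * l for p, l in zip(priors, likelihoods))  # denominator: P(A)
    return priors[k] * likelihoods[k] / p_a                # numerator: P(B_k) * P(A | B_k)

priors = [0.2, 0.3, 0.5]
likelihoods = [0.9, 0.5, 0.1]
print(posterior(priors, likelihoods, 0))  # 0.18 / 0.38, about 0.474
```

Note that the same denominator \(P(A)\) serves every \(B_k\), which is why the posteriors \(P(B_1|A), \ldots, P(B_m|A)\) automatically sum to 1.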
Now, even though I've presented the formal Bayes' Theorem to you, as I should have, the reality is that I still find "reverse conditional probabilities" using the brute force method I presented in the example on the last page. That is, I effectively re-create Bayes' Theorem every time I solve such a problem.