Suppose I want to write code to derive formulas using basic algebra and the rules of probability theory (the general multiplication rule, inclusion-exclusion, the law of total probability, conversion to odds/information, etc.). For example, Bayes' theorem is derived by applying the general multiplication rule in two different ways to $\Pr(A\ \cap\ B)$, equating the results, and sometimes further expanding one of the unconditional probabilities using the law of total probability. For higher-dimensional intersections such as $\Pr(A\ \cap\ B\ \cap\ C)$ and $\Pr(A\ \cap\ B\ \cap\ C\ \cap\ D)$, however, there are many more ways to carry out such a derivation. I'd like to write code to generate large lists of such formulas (hundreds or thousands, perhaps).
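To make the Bayes' theorem example concrete, here is a minimal sketch in Python using SymPy (the symbol names `pAB`, `pA_given_B`, etc. are my own shorthand, not standard notation): the two multiplication-rule expansions of $\Pr(A\ \cap\ B)$ are written as equations, and eliminating the joint probability yields Bayes' theorem automatically.

```python
from sympy import Eq, solve, symbols

# Shorthand symbols (names are mine):
#   pAB = P(A ∩ B), pA_given_B = P(A|B), pB_given_A = P(B|A),
#   pA = P(A), pB = P(B)
pAB, pA_given_B, pB_given_A, pA, pB = symbols(
    "pAB pA_given_B pB_given_A pA pB", positive=True
)

# General multiplication rule applied two ways to P(A ∩ B):
eq1 = Eq(pAB, pA_given_B * pB)  # P(A ∩ B) = P(A|B) P(B)
eq2 = Eq(pAB, pB_given_A * pA)  # P(A ∩ B) = P(B|A) P(A)

# Eliminate P(A ∩ B) and solve for P(A|B); the result is Bayes' theorem:
# P(A|B) = P(B|A) P(A) / P(B)
bayes = solve([eq1, eq2], [pAB, pA_given_B], dict=True)[0]
print(bayes[pA_given_B])
```

A driver for the larger project could apply such eliminations systematically across many rule applications rather than to one hand-picked pair of equations.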
I don't know how common a practice this type of exercise is, or whether math-focused languages such as Mathematica explicitly provide tools for it. One obvious but tedious approach would be to store equations as trees of operators and operands and build all of the probability-rule logic from the ground up. Are there higher-level tools in Mathematica, MATLAB, etc., better suited to such a task? Do mathematicians ever find useful formulas by mining large decision trees' worth of computations in this way?
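For what it's worth, here is a small sketch (in Python, with function and variable names of my own invention) of why the number of derivations grows quickly: each ordering of the events gives a distinct chain-rule factorization of the joint probability, so $n$ events already yield $n!$ factorizations, and equating any two of them produces a Bayes-style identity.

```python
from itertools import permutations


def chain_rule_factorizations(events):
    """Enumerate every chain-rule factorization of P(e1 ∩ ... ∩ en).

    Each permutation of the events peels off one conditional at a time,
    e.g. P(A ∩ B ∩ C) = P(A|B,C) P(B|C) P(C).
    """
    results = []
    for order in permutations(events):
        factors = []
        for i, e in enumerate(order):
            given = order[i + 1:]
            if given:
                factors.append(f"P({e}|{','.join(given)})")
            else:
                factors.append(f"P({e})")
        results.append(" * ".join(factors))
    return results


# 3 events -> 3! = 6 factorizations; 4 events -> 24; and so on.
for f in chain_rule_factorizations(["A", "B", "C"]):
    print(f)
```

This only covers the chain rule; layering in inclusion-exclusion and total-probability expansions multiplies the branching further, which is exactly the "decision tree" of derivations described above.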