Tabletop role-playing games and probability generating functions
Tabletop role-playing games like Dungeons & Dragons and Pathfinder are as much about inventiveness and imagination as they are about throwing and adding dice. Any attack, action or reaction that requires some skill basically goes like this: throw the appropriate dice and add the results, apply bonuses and penalties, and check whether the resulting number is equal to or larger than some target - higher numbers are better.
For example, in Pathfinder, to attack a zombie with a shortsword, you would roll a twenty-sided die (called a d20), add your Strength modifier and check whether the result is equal to or greater than 12. If it is, then you would roll a regular six-sided die (called a d6) and add your Strength modifier to determine how much damage you deal.
Different weapons and characters will have different dice, modifiers, and choices involved in even a single action. But which action is "best"? Let's say you're offered two options: to roll either a single d8, or two d4s and add the results. Which would you pick, and why?
Your first approach might be to look at the average results of both options, known as the expected value of the two options - a weighted average over all the possible outcomes, where the weights are just the probabilities of the associated outcomes. (The expected value of a variable X is written as E[X].) For the given options, these are:

E[d8] = (1 + 2 + … + 8)/8 = 4.5
E[2d4] = 2 · (1 + 2 + 3 + 4)/4 = 5

So rolling two four-sided dice is better than a single eight-sided die, right? Well, it depends. If we know that dealing 7 or 8 damage is required to slay the monster, then choosing the d8 might be preferable, since that would make it more likely to slay the monster in a single action:

P(d8 ≥ 7) = 2/8 = 25%
P(2d4 ≥ 7) = 3/16 = 18.75%

There is a third way by which we might compare distributions of die rolls: we could say that one roll beats another when it is more likely to land higher than the other way around. In other words, we can check the probability of either roll being larger than the other. After crunching some numbers, this turns out to be:

P(2d4 > d8) = 1/2 = 50%
P(d8 > 2d4) = 3/8 = 37.5%

The remaining 1/8 of the time, the two rolls tie. Interestingly, this final method of comparison turns out to be "intransitive". In other words, you would expect that if A is better than B and B is better than C, then A would be better than C, but this is generally not true when comparing distributions in this way!
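As a quick sanity check, we can brute-force these numbers by enumerating all equally likely outcomes of both options (a minimal sketch; the helper names are made up and this is not part of the calculator built later):

// All equally likely outcomes of a d8, and of 2d4.
const d8Outcomes = Array.from({length: 8}, (_, i) => i + 1)
const twoD4Outcomes = []
for (let a = 1; a <= 4; a++)
  for (let b = 1; b <= 4; b++)
    twoD4Outcomes.push(a + b)

// Expected values.
const average = xs => xs.reduce((sum, x) => sum + x, 0) / xs.length
console.log(average(d8Outcomes))     // 4.5
console.log(average(twoD4Outcomes))  // 5

// Probability of dealing at least 7 damage.
const atLeast7 = xs => xs.filter(x => x >= 7).length / xs.length
console.log(atLeast7(d8Outcomes))    // 0.25
console.log(atLeast7(twoD4Outcomes)) // 0.1875

// Probability that either option rolls strictly higher than the other.
let d8Wins = 0, twoD4Wins = 0
for (const x of d8Outcomes)
  for (const y of twoD4Outcomes) {
    if (x > y) d8Wins++
    if (y > x) twoD4Wins++
  }
const pairs = d8Outcomes.length * twoD4Outcomes.length
console.log(d8Wins / pairs)          // 0.375
console.log(twoD4Wins / pairs)       // 0.5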
There are a bunch of different ways by which we can compare distributions, but before we can even compare them, we need a way to reliably determine them in the first place. A good approximation method is the Monte Carlo method, which relies on performing a lot of independent random samples and aggregating the results; this approach is used, for example, to calculate odds in poker. However, for our area of interest we don't need approximations or simulations - we can use an exact method. We just need one concept from mathematics: the probability generating function.
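For contrast, a Monte Carlo estimate of the 2d4 distribution might look like the sketch below (the sample count is arbitrary); it only ever approximates the true probabilities, whereas the approach developed next is exact:

// Estimate the distribution of 2d4 by random sampling.
const rollD4 = () => Math.floor(Math.random() * 4) + 1
const sampleCount = 100000
const counts = Array(9).fill(0)
for (let i = 0; i < sampleCount; i++) {
  counts[rollD4() + rollD4()] += 1
}
// counts[total] / sampleCount approximates the probability of rolling that
// total, but it converges slowly and never becomes exact.
console.log(counts.map(count => count / sampleCount))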
Probability generating functions
Let's say we have some stochastic variable X which can, with some distribution, randomly take some non-negative integer value (0, 1, 2, et cetera). We then define its probability generating function as:

G_X(x) = P(X = 0) + P(X = 1)·x + P(X = 2)·x^2 + P(X = 3)·x^3 + …

which is a power series (a sort of infinite polynomial) where the exponents represent the possible values the stochastic variable can take, and their coefficients are simply the probability of taking that particular value. The variable x holds no real meaning - you shouldn't think of G_X as a function in which you will plug in values, but just as a fancy mathematical way to represent the distribution, which will turn out to be very convenient for certain operations.
For example, let's consider again the d8 and 2d4; their probability generating functions are, respectively:

G_d8(x) = (1/8)(x + x^2 + x^3 + x^4 + x^5 + x^6 + x^7 + x^8)
G_2d4(x) = (1/16)(x^2 + 2x^3 + 3x^4 + 4x^5 + 3x^6 + 2x^7 + x^8)

Another probability generating function that is quite illustrative is that of a variable that has a 100% probability to take a single specific value. The following probability generating function belongs to the "stochastic variable" that always equals three:

G(x) = x^3

Probability generating functions have an interesting property, which is that for independent variables X and Y: G_(X+Y)(x) = G_X(x)·G_Y(x). So we could have determined G_2d4 by squaring the probability generating function of a single d4, G_d4(x) = (1/4)(x + x^2 + x^3 + x^4) - and this works for any distributions! You can simply "add" the underlying stochastic variables by multiplying their probability generating functions!
Furthermore, we can create a mixture of two independent variables X and Y by taking the weighted sum of their probability generating functions (assuming the weights are non-negative and sum to 1). Hence, if Z equals X with probability p and equals Y with probability 1 - p, then:

G_Z(x) = p·G_X(x) + (1 - p)·G_Y(x)

Using just these two facts, we can determine the exact distribution of combined dice results!
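For instance (a small made-up example, anticipating the attack modelling below), an attack that deals no damage half of the time and a d6 of damage otherwise has the probability generating function:

G(x) = (1/2)·x^0 + (1/2)·(1/6)(x + x^2 + x^3 + x^4 + x^5 + x^6) = 1/2 + (1/12)(x + x^2 + x^3 + x^4 + x^5 + x^6)

where the x^0 = 1 term represents the outcome of dealing no damage at all.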
For a more complicated example, consider the puzzle posed in the XKCD comic from earlier. Grabbing two arrows at random out of ten when five of them are cursed means there is a 5/10 · 4/9 = 2/9 probability of not grabbing any cursed arrows. The question is whether this corresponds to the probability of landing 16 or higher on the dice roll from the comic. Armed with our new tools, this becomes pretty easy! Simply determine the roll's probability generating function, discard all terms with exponent 15 or lower, and sum the remaining coefficients - which turns out to be exactly equal to the arrow-grabbing probability!
Modeling attacks
With this new mathematical tool at our disposal, let's return to TTRPGs. When making an attack, one of three things can happen: you miss, you make a regular hit, or you make a critical hit. Whether you hit or not is based on a d20 roll (see the sketch after this list):
- rolling a 1 means you miss, dealing zero damage;
- rolling a 20 results in a critical hit, meaning you roll your normal damage two times and add the results to determine your damage. In some cases, lower rolls may already result in a critical hit, or you may have a higher multiplier. Pathfinder requires you to make a second roll to "confirm" the critical hit, where you only deal critical damage if this roll also hits, otherwise dealing regular damage;
- for any other result, it depends on characteristics of the enemy you are attacking whether you hit or not.
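In code, that decision could be sketched roughly like this (classifyAttackRoll and its parameters are illustrative names, not part of the calculator built later; confirming a critical threat is a separate second roll):

/**
 * Classifies a single d20 attack roll. A natural 1 always misses, a natural
 * 20 always hits, and a hit within the weapon's critical range is a critical
 * threat that still needs to be confirmed by a second attack roll.
 */
function classifyAttackRoll(roll, attackBonus, armorClass, criticalRange) {
  if (roll === 1) return 'miss'
  const hits = roll === 20 || roll + attackBonus >= armorClass
  if (!hits) return 'miss'
  return roll >= criticalRange ? 'critical threat' : 'hit'
}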
Consider a Pathfinder fighter with a +2 Strength modifier and +1 base attack bonus attacking a Zombie (Armor Class 12) with a shortsword (shortswords have a critical hit range of 19-20). First we need to determine probabilities and probability generating functions:
- P(miss) = 8/20, since 8 out of 20 rolls would cause you to miss (rolling a 9 would still make you hit the target due to the total +3 bonus on the attack roll);
- P(crit) = 2/20 · 12/20 = 3/50, corresponding to rolling a critical threat (a natural 19 or 20) and then confirming it with a second roll that hits, and;
- P(hit) = 1 - P(miss) - P(crit) = 27/50, which is the remainder;
- G_hit(x) = (1/6)(x^3 + x^4 + x^5 + x^6 + x^7 + x^8), which corresponds to a d6 roll with the +2 Strength modifier added to it;
- G_crit(x) = G_hit(x)^2, which means that if you land a critical hit, you roll damage twice.
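Putting these together with the mixture rule from before (the name G_attack is mine), the damage distribution of a single attack is the weighted mixture of the three outcomes:

G_attack(x) = (8/20)·x^0 + (27/50)·G_hit(x) + (3/50)·G_crit(x)

The x^0 = 1 term is the miss, which always deals 0 damage. Expanding this expression gives the exact damage distribution of one attack - and that is precisely what the code below computes.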
Implementation
Let's get programming! From a programmer's perspective, a probability generating function can simply be represented by an array of floating point numbers with addition and mixing defined as before. Assuming the indices of the entries in the array correspond to the exponents in the generating function, we can add two distributions by multiplying their probability generating functions as follows:
/**
 * Adds two independent distributions
 */
function addDistributions(distribution1, distribution2) {
  const newLength = distribution1.length + distribution2.length - 1
  const newDistribution = Array(newLength).fill(0)
  distribution1.forEach((outcome1, index1) => {
    distribution2.forEach((outcome2, index2) => {
      newDistribution[index1 + index2] += outcome1 * outcome2
    })
  })
  return newDistribution
}
Here we go through every combination of exponents from both generating functions, and accumulate the results in the appropriate term. The worst-case time complexity of this approach is O(n·m), where n is the length of distribution1 and m is the length of distribution2.
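For example, adding a d4 to itself (with the d4 written out as a literal probability array, where the index is the outcome and the value its probability) reproduces the 2d4 generating function from before:

const d4 = [0, 0.25, 0.25, 0.25, 0.25]
console.log(addDistributions(d4, d4))
// [0, 0, 0.0625, 0.125, 0.1875, 0.25, 0.1875, 0.125, 0.0625]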
For the earlier mentioned mixing of distributions, we require a helper method, since the arrays representing the generating functions we are mixing may not have the same length yet. That's what zipWithDefault takes care of: it iterates over the pairs of entries of the two arrays, yielding defaultValue in place of the missing entry once the shorter of the two arrays has been exhausted. Using it, we can easily implement mixing:
/**
 * Used to zip two arrays; if either array runs out of elements before the
 * other, its elements are replaced by defaultValue.
 */
function zipWithDefault(array1, array2, defaultValue) {
  return Array.from(Array(Math.max(array1.length, array2.length)), (_, i) =>
    [array1[i] ?? defaultValue, array2[i] ?? defaultValue])
}

/**
 * Mixes two independent distributions by sampling from distribution1 with the
 * given probability, and sampling from distribution2 with 1-probability.
 */
function mixDistributions(distribution1, distribution2, probability) {
  return zipWithDefault(distribution1, distribution2, 0).map(outcomes =>
    probability * outcomes[0] + (1 - probability) * outcomes[1])
}
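As a quick usage sketch, mixing the distribution that always yields 0 with a d6, fifty-fifty, gives the small mixture example from earlier:

const alwaysZero = [1]
const d6 = [0, 1/6, 1/6, 1/6, 1/6, 1/6, 1/6]
console.log(mixDistributions(alwaysZero, d6, 0.5))
// [0.5, then six entries of 1/12 ≈ 0.083]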
And that already wraps up the main logic! We can go one step further by defining some helper functions for convenience: one to generate uniform die distributions, one for "constant" distributions (like the distribution that yields 3 in 100% of the cases), and one to "multiply" a distribution by a whole number of rolls (which is just repeated addition):
/**
 * Used to generate the distribution of a regular die. Each outcome is
 * equiprobable, with a probability of 1/numberOfSides of occurring. Since
 * counting starts at 0, the first outcome has probability 0 of occurring.
 */
function die(numberOfSides) {
  return [0, ...Array(numberOfSides).fill(1/numberOfSides)]
}

/**
 * Used to generate the distribution for sampling the given value with
 * certainty.
 */
function constant(value) {
  return [...Array(value).fill(0), 1]
}

/**
 * Adds the given number of independent, identically distributed copies of the
 * distribution (i.e. rolls it that many times and sums the results).
 */
function multiplyDistribution(distribution, number) {
  if (number === 0) return constant(0)
  if (number === 1) return distribution
  return addDistributions(distribution, multiplyDistribution(distribution, number - 1))
}
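To tie everything together, here is how the fighter's shortsword attack from before could be modelled with these building blocks (a sketch; the variable names are mine, and the probabilities are the ones derived above):

// Damage on a regular hit: d6 + 2 (the Strength modifier).
const hitDamage = addDistributions(die(6), constant(2))
// Damage on a confirmed critical hit: roll the hit damage twice.
const critDamage = multiplyDistribution(hitDamage, 2)
// A miss always deals 0 damage.
const missDamage = constant(0)

// mixDistributions only mixes two distributions at a time, so first mix the
// critical and regular hits using the conditional probability of a critical
// hit among non-misses, (3/50) / (1 - 8/20) = 0.1, and then mix in the
// 8/20 chance of missing.
const hitOrCrit = mixDistributions(critDamage, hitDamage, (3/50) / (1 - 8/20))
const attackDamage = mixDistributions(missDamage, hitOrCrit, 8/20)

// Probability of dealing at least 7 damage with a single attack.
console.log(attackDamage.slice(7).reduce((sum, p) => sum + p, 0)) // ≈ 0.238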
All that remains is combining it into a nice UI, and we have a blazingly fast exact distribution calculator!
Damage distribution calculator
Below you will find a tool that uses the above approach to determine the damage distribution of specific attacks.