I have been aware, pretty much from the time I learned what a “normal distribution” was, that dice don’t quite fit the curve. This is apparently because die rolls are so strongly quantized. I have long suspected that, as the number of dice rolled increases, the shape of the result curve gets squarer and squarer. Over the last week or two, in the spirit of “looking busy when I should really be doing something else”, I have done some actual research.

To begin, I generated the result curves for sets of up to ten dice, which is pretty trivial with a spreadsheet (it takes about five minutes per set). Then I started trying to make sense of the data without doing too much actual work. Stuff I have found that seems to support the “increasingly square” hypothesis:
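(For anyone who would rather not do this in a spreadsheet: the result curve for N dice is just the single-die distribution convolved with itself N times. The sketch below is my own code, not the original spreadsheet, and assumes ordinary six-sided dice.)

```python
def dice_counts(n, sides=6):
    """Map each possible sum of n dice to its number of outcomes,
    by repeatedly convolving in one more die."""
    counts = {0: 1}  # zero dice: one way to roll a total of 0
    for _ in range(n):
        new = {}
        for total, ways in counts.items():
            for face in range(1, sides + 1):
                new[total + face] = new.get(total + face, 0) + ways
        counts = new
    return counts

# e.g. dice_counts(2)[7] == 6: six of the 36 ways to roll 2d6 total 7
```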

The greatest slope of the curve increases (after correcting for scaling) with each additional die.

The central N+1 values of the curve account for a greater percentage of the results with each additional die, ranging from 33% for the central two values of a single die to 69% for the central eleven values of the ten-die set.
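(That second observation is easy to check by machine. This is my own sketch, again assuming six-sided dice: it counts the outcomes that land in the N+1 sums centered on the mean and divides by the total number of outcomes.)

```python
def central_fraction(n, sides=6):
    """Fraction of all outcomes of n dice that fall in the
    central n+1 possible sums."""
    # Build the full result curve by repeated convolution.
    counts = {0: 1}
    for _ in range(n):
        new = {}
        for total, ways in counts.items():
            for face in range(1, sides + 1):
                new[total + face] = new.get(total + face, 0) + ways
        counts = new
    # The mean sum is n*(sides+1)/2; the n+1 central sums are the
    # ones symmetric about it (e.g. 3 and 4 for one die, 30..40 for ten).
    start = (n * (sides + 1) - n) // 2
    central = sum(counts.get(s, 0) for s in range(start, start + n + 1))
    return central / sides ** n
```

For one die this gives 2/6 ≈ 33%, matching the figure above.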

Being lazy, I am inclined to accept this as proof of the original hypothesis. (The shape of the result curve for N similar dice approaches a square wave shape as N increases.)