The Linear Congruential Generator (LCG) is one of the most common algorithms for deterministic generation of "random" numbers. It is defined by (in Python syntax):

```
def random_sequence(multiplier, increment, modulus, seed):
    value = seed
    for index in range(modulus):  # yields exactly `modulus` values
        value = (multiplier * value + increment) % modulus
        yield value
```

Each value of the sequence is an integer between 0 (inclusive) and the modulus (exclusive).
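To make that concrete, here is the generator run with small illustrative parameters (these particular values of a, c, and m are chosen only for this example, not recommended):

```
def random_sequence(multiplier, increment, modulus, seed):
    value = seed
    for index in range(modulus):
        value = (multiplier * value + increment) % modulus
        yield value

# a=5, c=3, m=8, seed=0: every value falls in the range 0..7.
print(list(random_sequence(5, 3, 8, 0)))
# prints [3, 2, 5, 4, 7, 6, 1, 0]
```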

For dozenal counting, an obvious choice for the modulus is 12 (*10). This yields a sequence of single-digit numbers. The hard part is deciding on the multiplier and increment.

Some choices are quite obviously bad. For example, the multiplier a=3 and increment c=3 (starting from a seed of 0) give the sequence [3, 0, 3, 0, 3, 0, ...]. Not only is the pattern far too obvious, it also uses only two of the twelve possible values.
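Running the generator above with those parameters shows the degenerate cycle directly:

```
def random_sequence(multiplier, increment, modulus, seed):
    value = seed
    for index in range(modulus):
        value = (multiplier * value + increment) % modulus
        yield value

# a=3, c=3, m=12, seed=0: the output just alternates 3, 0, 3, 0, ...
print(list(random_sequence(3, 3, 12, 0)))
# prints [3, 0, 3, 0, 3, 0, 3, 0, 3, 0, 3, 0]
```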

The most basic desirable property of an LCG is a full period of *m* values, each used exactly once per period. This ensures a uniform distribution of values.

With m=12, there are only four possible combinations of parameters meeting this criterion:

- a=1, c=1: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 0, ...
- a=1, c=5: 5, 10, 3, 8, 1, 6, 11, 4, 9, 2, 7, 0, ...
- a=1, c=7: 7, 2, 9, 4, 11, 6, 1, 8, 3, 10, 5, 0, ...
- a=1, c=11: 11, 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0, ...
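A brute-force search confirms that these four are the only full-period pairs for m=12. This sketch simply checks, for every (a, c), whether one pass of the generator visits all twelve values:

```
def random_sequence(multiplier, increment, modulus, seed):
    value = seed
    for index in range(modulus):
        value = (multiplier * value + increment) % modulus
        yield value

def full_period_pairs(m):
    """Return all (a, c) pairs whose LCG visits all m values exactly once."""
    pairs = []
    for a in range(m):
        for c in range(m):
            if sorted(random_sequence(a, c, m, 0)) == list(range(m)):
                pairs.append((a, c))
    return pairs

print(full_period_pairs(12))
# prints [(1, 1), (1, 5), (1, 7), (1, 11)]
```

Note that every full-period pair has a=1: the sequence is just counting by a fixed step, which is why none of these look very random.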

So, let's try *two* dozenal digits. With m=144 (*100), there are 576 (*400) possible combinations of *a* and *c* to choose from. How do we choose?

*To be continued...*