We're going to simulate "Rule 110", which is essentially a way of turning one
string of bits into another string of bits. (You won't need any background
knowledge to complete the problem, but if you're curious, check out
https://en.wikipedia.org/wiki/Rule_110)

The program should take one argument N on the command line, and should then
display a possibly-infinite sequence of rows of N digits each. A digit may be
either zero or one.

Create the first row randomly. Then, to construct the digit at position x of row
y, consider the digits at positions (x-1), x, and (x+1) of row (y-1), and select
the new digit according to the following table (a dictionary encoding of the
table is sketched just after it):

| Pattern | New Digit for Center Cell |
| ------- | ------------------------- | 
| 111     | 0                         |
| 110     | 1                         |
| 101     | 1                         |
| 100     | 0                         |
| 011     | 1                         |
| 010     | 1                         |
| 001     | 1                         |
| 000     | 0                         |
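The table above translates directly into a lookup structure. The sketch below
uses Python and a dictionary named `RULE`; the name and the string-keyed
representation are illustrative choices, not part of the problem:

```python
# Rule 110 transition table: "left center right" pattern -> new center digit.
RULE = {
    "111": "0", "110": "1", "101": "1", "100": "0",
    "011": "1", "010": "1", "001": "1", "000": "0",
}
```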

Wrap around at the edges, so the pattern for position 1 is obtained by looking
at positions N, 1, and 2.
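One common way to implement the wrap-around is modular indexing. This sketch
builds on the `RULE` dictionary above and assumes rows are strings of '0'/'1'
characters; note that it uses 0-based indices, while the text above counts
positions from 1:

```python
def next_row(row: str) -> str:
    """Compute the next row from the current one, wrapping at the edges."""
    n = len(row)
    # (x - 1) % n sends position 0's left neighbor to position n - 1,
    # and (x + 1) % n sends position n - 1's right neighbor to position 0.
    return "".join(
        RULE[row[(x - 1) % n] + row[x] + row[(x + 1) % n]]
        for x in range(n)
    )
```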

Stop after printing a row that consists entirely of zeros or entirely of ones.
Note that, depending on your random initial row, this might never happen!
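Putting the pieces together, a minimal driver might look like the sketch below,
which reuses `next_row` from above. Reading N from `sys.argv` and building the
first row with `random.choice` are assumptions about one reasonable setup, not
requirements of the problem:

```python
import random
import sys

def main() -> None:
    n = int(sys.argv[1])
    # Random initial row of n digits.
    row = "".join(random.choice("01") for _ in range(n))
    while True:
        print(row)
        # Stop after printing a row that is all zeros or all ones;
        # depending on the initial row, this may never trigger.
        if row == "0" * n or row == "1" * n:
            break
        row = next_row(row)

if __name__ == "__main__":
    main()
```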

For example, if N is 3, one possible run is:

001
011
111