Calculate $140$ digits of Conway's Constant from the Look and Say Sequence


The look-and-say sequence is the sequence of numbers $1, 11, 21, 1211, 111221, 312211, …,$ in which each term is constructed by “reading” the previous term in the sequence. For example, the term $1$ is read as “one $1$”, which becomes the next term: $11$. Then $11$ is read as “two ones”, which becomes the next term: $21$, and so on...
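For illustration, here is a minimal sketch (separate from the digit-list code further down) that generates the first few terms as strings:

step[s_String] := StringJoin[
  (* each run of equal characters becomes "<count><digit>" *)
  ToString[Length[#]] <> First[#] & /@ Split[Characters[s]]]

NestList[step, "1", 5]
(* {"1", "11", "21", "1211", "111221", "312211"} *)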


If $L_n$ is the number of digits in the $n$th term in the sequence, then

$$\lim_{n \to \infty} \frac{L_{n+1}}{L_n}=\lambda$$

where $λ = \color{red}{1.303577269034...}$ is Conway's Constant, an algebraic number of degree $71$.

I tried to calculate it, but it's very slow:

(* memoized run-length encoding of a digit list into {value, count} pairs *)
RunLengthEncode[x_List] := 
 RunLengthEncode[x] = (Through[{First, Length}[#1]] &) /@ Split[x]

(* the first n terms of the sequence, each as a list of digits, starting from d *)
LookAndSay[n_, d_: 1] := 
 NestList[Flatten[Reverse /@ RunLengthEncode[#]] &, {d}, n - 1]

(* number of digits of the nth term *)
nthdigit[n_] := Length /@ LookAndSay[n] // Last

(nthdigit[60]/nthdigit[59])~N~15 // AbsoluteTiming

$\{46.6713, \color{red}{1.3035}5090959742\}$

How can I speed up this calculation to find $140$ digits of Conway's Constant?

$140$ digits of $\lambda$ are quite enough to find the minimal polynomial:

lambda = 1.3035772690342963912570991121525518907307025046594048757548613906285508878524615571268157668644252255534713930470949026839628498935515543473758248566910891;


MinimalPolynomial[RootApproximant[lambda], x]

(image: the degree-$71$ minimal polynomial of $\lambda$ returned by MinimalPolynomial)
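Once RootApproximant has found the exact algebraic number (a Root object), further digits come cheaply via N, as Chip Hurst suggests in the comments below. A sketch, assuming the $140$ digits above pin down the correct polynomial:

(* RootApproximant returns a Root object, which N evaluates to any precision *)
N[RootApproximant[lambda], 1000]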

vito

Posted 2017-03-12T13:52:31.437

Reputation: 8 518

Possibly better suited to [Math.SE]? – Mr.Wizard – 2017-03-12T14:12:43.427

A minor point, but instead of reversing {First, Length} why not just do {Length, First} ? – Simon Woods – 2017-03-12T14:32:53.213

If you're able to get $\lambda$ as a Root object, why not use N to get the digits? – Chip Hurst – 2017-03-13T14:00:18.257

You could use the continued fraction form. The sequence of partial quotients can be found here: https://oeis.org/A014715. The following (JavaScript) code works correctly up to 190 digits:

c = [1, 3, 3, 2, 2...]; // about 200 elements
b = r = BigInt(10)**BigInt(200);
for (i = c.length - 1; i >= 0; i--) { r = BigInt(c[i])*b + b*b/r; }
// r now contains 10**200 * Conway's constant

– Peter Robinson – 2018-08-03T10:19:04.547
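The same continued-fraction route can be sketched in Mathematica, assuming lambda carries the $140$-digit value from the question: ContinuedFraction recovers the partial quotients (cf. OEIS A014715), and FromContinuedFraction rebuilds the number from them:

cf = ContinuedFraction[lambda, 100]  (* first 100 partial quotients *)
N[FromContinuedFraction[cf], 50]     (* reconstructs lambda to ~50 digits *)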

Answers


I think you haven't thought this through, but let us assume it were possible to calculate $\lambda$ by such a direct method. First, we need to speed up your high-level function, which you probably took from Rosetta Code. As the numbers grow larger and larger, Split becomes a bottleneck, and I don't know how to speed it up except by rewriting the run-length encoding as a low-level compiled function.

Therefore, here is an implementation that calculates the run-length encoding with an admittedly ugly combination of two loops:

rl = Compile[{{l, _Integer, 1}},
  Module[{b = Internal`Bag[Most[{0}]], (* an empty integer bag *)
    n = Length[l], count, value, i = 1},
   While[i <= n,
    (* measure the current run of equal values *)
    count = 1;
    value = l[[i++]];
    While[i <= n && value == l[[i]],
     count++;
     i++
     ];
    (* append the run as "count, value" *)
    Internal`StuffBag[b, count];
    Internal`StuffBag[b, value];
    ];
   Internal`BagPart[b, All]
   ], CompilationTarget -> "C", RuntimeOptions -> "Speed"
  ]
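A quick sanity check that the encoding emits the count first, then the value:

rl[{1, 1, 2}]
(* {2, 1, 1, 2}, i.e. "two 1s, one 2" *)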

For comparison, let's take a slightly adapted version of the core of your function:

RunLengthEncode[x_List] := Flatten[Through[{Length, First}[#1]] & /@ Split[x]]

While yours takes

res1 = Nest[RunLengthEncode, {1}, 60]; // AbsoluteTiming
(* {35.2265, Null} *)

the compiled version only needs

res2 = Nest[rl, {1}, 60]; // AbsoluteTiming
(* {0.92388, Null} *)

and the results are the same

res1 == res2
(* True *)

Speed, unfortunately, is not our biggest problem. Let me show the byte sizes of the results for all sequences up to the 50th:

sizes = Table[ByteCount[Nest[rl, {1}, n]], {n, 1, 50}];

and fit an exponential model to the data:

nlm = NonlinearModelFit[sizes, a*Exp[b x], {a, b}, x]
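As a quick plausibility check of the fitted growth rate: the number of digits $L_n$ grows like $\lambda^n$, and the byte count is proportional to it, so $e^b$ should come out near $\lambda \approx 1.3036$:

Exp[b] /. nlm["BestFitParameters"]
(* should be close to 1.3036 *)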

Without quantifying the quality of the model fit, let us instead just look at the combined plot:

Show[
 ListLinePlot[sizes, PlotStyle -> Thick],
 Plot[nlm[x], {x, 0, 50}, PlotStyle -> Directive[Dotted, Red]]
 ]

(plot: measured byte counts with the fitted exponential model overlaid as a dotted red curve)

That looks promising if you like to eat up all your system memory, but since we are a very optimistic bunch, let's estimate at which sequence we will need 16 GB of precious RAM:

FindRoot[Normal[nlm]/2^30 - 16, {x, 60}]
(* {x -> 78.3621} *)

This means that at around 79 steps, we will cross the 16 GB line. Let me remind you that you are using memoization to store all previous results, so your computer is going to burst into flames even sooner. I have 32 GB, so mine will probably die a bit later. Let's be cautious and calculate $n=75$:

seq75 = Nest[rl, {1}, 75]; // AbsoluteTiming
(* {44.1891, Null} *)
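As a sanity check, the exponential model from above predicts a size in the same ballpark (the exact value depends on the fitted parameters):

nlm[75]/2.^30  (* predicted size in GB for n = 75 *)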

This monster already uses 6.6 GB of RAM and has 881,752,750 elements. Let's calculate $\lambda$ using the 76th sequence:

N[Length[rl[seq75]]/Length[seq75], 15]
(* 1.30358560492156 *)

Still as bad as your approximation. Therefore, I don't think that speeding your approach up will help you. I believe a completely different method is required to get $\lambda$ with more precision.

halirutan

Posted 2017-03-12T13:52:31.437

Reputation: 109 574