One Collatz Coincidence

59 points, posted 3 days ago
by 7373737373

21 Comments

wruza

4 hours ago

Is it really a coincidence? Collatz-like problems are among the simplest (in terms of operations), so it's no surprise that BB(low n) chooses them. It's probably more surprising that we found Collatz before analyzing BBs.

tocs3

3 days ago

Is there a more "popular science" description of what is being said here? I would like to understand what is meant by "Collatz-like". Is it a function like:

  f(n) = {n/2 : for even n,
          3n+1 : for odd n}  
Can I sit round for days making trivial calculations looking for patterns?

What does

  M(n) = 0^inf < A 1^n 0^inf.
mean?

I have really enjoyed the Busy Beaver stuff and play with simple code to try and learn what I can, but when I read about it I run into the brick wall of math reading comprehension. Numberphile on YouTube is sometimes pretty good with explanations, but it is not a reference. I do not know where to turn (I might ask ChatGPT just to see what happens).

7373737373

3 days ago

The definition used on https://wiki.bbchallenge.org/wiki/Collatz-like for a Collatz-like function is:

> a partial function defined piecewise depending on the remainder of an input modulo some number

(for the best known Collatz conjecture specifically: 2)

and

> A Collatz-like problem is a question about the behavior of iterating a Collatz-like function.

(Regarding the first part, there are other functions that do not use a modulus to decide which "path" to take but some other property of the input number (e.g. its primality: https://mathoverflow.net/questions/205911/a-collatz-like-fun...); these may perhaps also be described as Collatz-like.)
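For concreteness, the classic Collatz function tocs3 wrote above is exactly such a function with modulus 2. A minimal sketch of iterating it in Python (my own illustration, not code from the wiki):

  def collatz(n: int) -> int:
      # Piecewise on the remainder of n modulo 2, per the wiki's definition.
      if n % 2 == 0:
          return n // 2
      return 3 * n + 1

  # Iterating from 7: 7, 22, 11, 34, 17, 52, 26, 13, 40, 20, 10, 5, 16, 8, 4, 2, 1
  n = 7
  while n != 1:
      print(n, end=" ")
      n = collatz(n)
  print(1)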

The notation for tape states is documented here: https://wiki.bbchallenge.org/wiki/Directed_head_notation

I think part of the reason why it's so difficult to learn more about this kind of problem is that humanity has simply not found the right language for it yet. And in the case of not only describing, but solving them, a terminology/classification for the "shapes" of halting or non-halting behavior of such systems is also still largely missing.

isaacfrond

9 hours ago

That's right, the function in the article is:

Given an input n, if n is of the form 3k then output 5k+6

if n is of the form 3k+1 then output 5k+9

if n is of the form 3k+2 then halt

So starting with 0, which is of the form 3x0, we go to 5x0+6 = 6. Since 6 = 3x2, we go to 5x2+6 = 16. Because 16 is 3x5+1, we then go to 5x5+9 = 34, etc.
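A small sketch of iterating that map in Python (my own illustration, not the article's code), which reproduces the 0, 6, 16, 34, ... orbit above and stops when it reaches a number of the form 3k+2:

  def step(n: int):
      # Branch on n mod 3, as described above; None signals the halt case.
      k, r = divmod(n, 3)
      if r == 0:
          return 5 * k + 6
      if r == 1:
          return 5 * k + 9
      return None  # r == 2: halt

  n = 0
  while n is not None:
      print(n)  # prints 0, 6, 16, 34, ...
      n = step(n)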

The BB program is built so that one evaluation of this function takes an insanely long time.

By the way, Conway proved [0] that any computer program can be rewritten in the form of these Collatz-like functions. So they are pretty powerful.

[0]: https://en.wikipedia.org/wiki/Collatz_conjecture#Undecidable...

NeoTar

5 hours ago

Interesting - if any computer program can be rewritten in the form of a Collatz-like function, is it at all surprising that the BB problems appear in this form?

LegionMammal978

4 hours ago

The interesting part is how simple the resulting Collatz-like functions are to describe, suggesting that the Collatz-like form is the most 'natural' form for hard problems given the TM formalism. E.g., any program can also be rewritten in the form of a Diophantine equation, but most of those would be monstrously large.

empath75

5 hours ago

> By the way, Conway proved [0] that any computer program can be rewritten in the form of these Collatz-like functions. So they are pretty powerful.

I don't think that article supports that claim, but maybe I'm misunderstanding it.

isaacfrond

5 hours ago

The Wikipedia article perhaps doesn't do it full justice, though the essence is there. The reason the halting problem is equivalent to solving Collatz-like problems is that you can rewrite any Turing machine in terms of a generalized Collatz function. The FRACTRAN Wikipedia page has more info.

(Conway's original article is also very readable: "FRACTRAN: A Simple Universal Programming Language for Arithmetic".)
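For a flavor of that correspondence: a FRACTRAN program is just a list of fractions, and one step multiplies the current integer by the first fraction that yields an integer, halting when none does. A minimal interpreter sketch (my own illustration; the one-fraction program [3/2] is the standard toy example that adds exponents, turning 2^a * 3^b into 3^(a+b)):

  from fractions import Fraction

  def fractran(n, program):
      # Repeatedly multiply n by the first fraction giving an integer result;
      # halt when no fraction in the program applies.
      while True:
          yield n
          for f in program:
              if (n * f.numerator) % f.denominator == 0:
                  n = n * f.numerator // f.denominator
                  break
          else:
              return

  # 72 = 2^3 * 3^2; the run ends at 3^5 = 243.
  for state in fractran(72, [Fraction(3, 2)]):
      print(state)  # 72, 108, 162, 243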

dmichulke

8 hours ago

Re the link to the wiki and its introductory paragraph:

If you ask yourself why the 2nd part of the original Collatz function is:

  c(2k+1) = 3k+2
the reason is that:

3n+1 applied to n = 2k+1 gives 6k+4, which can immediately be divided by 2, yielding 3k+2.
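A tiny check of that identity (my own snippet, not from the wiki):

  # For odd n = 2k+1, one 3n+1 step gives 6k+4, and the forced halving lands on 3k+2.
  for k in range(10):
      n = 2 * k + 1
      assert (3 * n + 1) // 2 == 3 * k + 2
  print("c(2k+1) = 3k+2 checked for k = 0..9")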

tocs3

2 days ago

Thank you. I have been looking a little at your GitHub Busy_Beaver repository. I think I will stick with my own code writing for a little while (amateurish, but I am learning a bit by writing it).

throwaway81523

6 hours ago

" M(n) = 0^inf < A 1^n 0^inf."

I read that as an infinite string of 0's (0^inf), then the tape head (<), then a string of exactly n 1's (1^n), then infinitely many 0's. So a block of n 1's starting at the tape head, surrounded by endless 0's to the left and right.

tocs3

3 days ago

OK, you were not fast enough. I asked ChatGPT. I think it helped, but there is still a lot I do not understand.

  M(n) = 0^inf <A 1^n 0^inf.
is, in pseudo math/English: some Turing machine in state n is equal to

  An infinite list of zeros up to the position of the machine pointer followed by some number of ones (maybe zero ones?) followed by another infinite list of zeros.
  
I think at this point ChatGPT gets something wrong, but I hit the free-plan limit for GPT-4o and will have to wait till 2:13 PM.

tux3

10 hours ago

A Turing machine has a position on a tape and an internal state.

The <A means the machine is in state A and facing left. To its left are infinitely many zeroes (0^inf), to its right are n 1s, and then infinitely many zeroes (the initial tape is infinite, and starts filled with zeroes).

throwaway81523

6 hours ago

That's surprisingly good, though yeah it misses a detail.

empath75

5 hours ago

I wonder if this is something generally true for all BBs and if Collatz-like functions represent some kind of limit of decidability.

jerf

3 hours ago

"I wonder if this is something generally true for all BBs"

No. A BB is going to use "all" its states. When you're up into the hundreds, it's not going to be anything recognizably Collatz-like, because Collatz is so simple.

People love what I think of as "squinting until everything blurs and everything is the same", and will probably try to salvage this conjecture for whatever reasons people tend to do that. But unless you're going to claim that all Turing machines are "just" computing some sort of "Collatz-like computation", degenerately, by blurring your definition of "Collatz-like" until the hand-written Turing machine from school that reverses a string is "Collatz-like" and the hand-written TM that simulates a multitape machine is "Collatz-like", I don't think there's any reason to believe BB will continue to be Collatz-like indefinitely.

To put it another way, we've got a pretty good sense that even some hand-written functions can exceed these Collatz machines pretty handily... we just need enough states to represent them. The odds of a Collatz machine being the optimal machine, once you get enough states to represent other ultra-super-exponential functions (e.g., TREE), are basically zero. It's just an artifact of us only being able to analyze such small Busy Beavers.

The most likely thing that BB(100) "does" is going to be something utterly humanly incomprehensible, not some sort of cute Collatz-like computation that can be characterized in any manner that would make sense to a human.

tromp

3 hours ago

There's no sign of Collatz-like behaviour in any of the 37 known values of BBλ [1], so it appears specific to Turing machines, which cannot compute exponentials so easily.

[1] https://oeis.org/A333479

tocs3

34 minutes ago

Thank you for the link. I have been wondering whether the Lambda Calculus has BB-like values. I do not know much about the Lambda Calculus and this might be a good reason for me to learn more.

The values shown in the OEIS link look different than the values for the TM BBs. Am I just reading it wrong?

tromp

29 minutes ago

No, you're reading it right. It's a different function based on a different model of computation so the values are going to be quite different.

klyrs

2 hours ago

If I were to generalize this in an optimistic view, I'd say that there's a decent chance for open problems in math to be busy beaver candidates. The Collatz conjecture is an early family, I'd bet Diophantine equations show up, and I think I recall the Goldbach conjecture being a record at some point...

And I call this 'optimistic' because it supposes that mathematicians have identified the actual hard problems.