(I couldn’t find a sub for hypothetical questions…)

  • Hjalmar@feddit.nu

    You can’t send bits at a constant rate in this case. You essentially get to send one very large number: the amount of time elapsed since your agreed starting time (plus the one bit you were actually meant to send). The bit count grows logarithmically with time.

    Thus, the number of bits n you can send over t time steps is

    n = floor(log2(t)) + 2

    (that is, the floor(log2(t)) + 1 binary digits needed to write down the delay, plus the one bit that actually gets sent).
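
    A minimal sketch of that count in Python (assuming the delay t is a whole number of time steps):

        import math

        def bits_conveyed(t: int) -> int:
            # floor(log2(t)) + 1 binary digits to write down the delay,
            # plus the single bit that is actually sent
            return math.floor(math.log2(t)) + 2

        print(bits_conveyed(100))  # 8: the delay 1100100 plus one bit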

    As an example, say they wait 8 seconds before sending you a 1. You have then received the number 1000 (8 in binary) plus the bit 1, a total of 5 bits.

    If they choose to wait twice as long, 16 seconds, they have in effect transmitted the number 10000 plus one additional bit, a total of 6 bits. Double the time, but only one additional bit.
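
    To make the doubling concrete, here is a small illustrative sketch (the encode/decode helpers are hypothetical, just standing in for "wait, then send one bit" and "read the elapsed time plus the bit"):

        def encode(delay_steps: int, payload_bit: int) -> tuple[int, int]:
            # sender: wait delay_steps time steps, then transmit payload_bit
            return delay_steps, payload_bit

        def decode(delay_steps: int, payload_bit: int) -> str:
            # receiver: read the delay as a binary number, append the bit
            return format(delay_steps, "b") + str(payload_bit)

        for t in (8, 16, 32):
            message = decode(*encode(t, 1))
            print(f"wait {t:2d} steps -> '{message}' ({len(message)} bits)")

        # wait  8 steps -> '10001' (5 bits)
        # wait 16 steps -> '100001' (6 bits)
        # wait 32 steps -> '1000001' (7 bits)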