Computational complexity of neural networks (2022)

(lunalux.io)

18 points | by mathattack 13 hours ago

4 comments

  • constantcrying 21 minutes ago

    "If we once again assume that there are the same number of neurons in each layer, and that the number of layers equal the number of neurons in each layer we find:"

    These are terrible assumptions.

    Why not compute the runtime from the actual layer sizes? Then the comparison would also make more sense.
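
    A minimal sketch of what that would look like, assuming a fully connected network with arbitrary layer widths (the widths below are illustrative, not from the article):

        # Forward-pass multiply-add count using the actual layer widths,
        # rather than assuming every layer has the same size.
        def forward_macs(widths):
            """widths[0] is the input dimension; widths[1:] are the layer sizes.
            Each dense layer costs (inputs x outputs) multiply-adds."""
            return sum(n_in * n_out for n_in, n_out in zip(widths, widths[1:]))

        print(forward_macs([784, 256, 128, 10]))  # 784*256 + 256*128 + 128*10 = 234752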

  • duvenaud 12 hours ago

    This is simply wrong. Backprop has the same asymptotic time complexity as the forward pass.
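
    For intuition, a sketch of why the two costs match for a dense layer y = W @ x (dimensions here are illustrative): the forward pass is one matrix-vector product, and the backward pass is two products of the same dimensions, so backprop costs only a constant multiple of the forward pass.

        import numpy as np

        n_in, n_out = 512, 256
        W = np.random.randn(n_out, n_in)
        x = np.random.randn(n_in)

        # Forward: one matrix-vector product -> n_out * n_in multiply-adds.
        y = W @ x

        # Backward, given the upstream gradient dy (same shape as y):
        dy = np.random.randn(n_out)
        dW = np.outer(dy, x)  # grad w.r.t. weights: n_out * n_in multiply-adds
        dx = W.T @ dy         # grad w.r.t. input:   n_out * n_in multiply-adds

        # Forward: n_out*n_in ops; backward: 2 * n_out*n_in ops -> same O(n_out * n_in).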

    • bobmarleybiceps 9 hours ago

      I think they're misusing "forward propagation" and "backward propagation" to basically mean "post-training inference" and "training". They seem to be assuming n iterations of the backward pass, which is why it's larger...
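
      On that reading, the quantities being compared would be roughly the following (a sketch; the cost model and iteration count are assumptions, not the article's):

          # Dense network of `layers` layers, each of width n:
          def forward_cost(n, layers):
              return layers * n * n           # O(layers * n^2) per forward pass

          def backward_cost(n, layers):
              return forward_cost(n, layers)  # same asymptotic cost as forward

          def inference_cost(n, layers):
              return forward_cost(n, layers)  # one forward pass

          def training_cost(n, layers, iterations):
              # If `iterations` is taken to be n, as the article seems to assume,
              # training looks a full factor of n more expensive than inference.
              return iterations * (forward_cost(n, layers) + backward_cost(n, layers))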

      • vrighter 6 hours ago

        n iterations would be a constant factor, which is omitted from asymptotic complexity
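
        For the asymptotics here (a sketch of the reasoning, not from the thread): if the iteration count c is fixed independently of the network size n, then

            c * T_forward(n) = O(T_forward(n)),   e.g.  3 * n^2 = O(n^2),

        whereas if the iteration count itself grows as n, on the parent's reading of the article, it contributes a genuine extra factor of n rather than a constant.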