floofloof@lemmy.ca (OP) · 7 months ago

> Neuronal firing is often understood as a fundamentally binary process, because a neuron either fires an action potential or it does not. This is often referred to as the “all-or-none” principle.

Isn’t this true of standard multi-bit neural networks too? This seems to be what a nonlinear activation function achieves: translating input values into an all-or-nothing activation.
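
For illustration, here’s a minimal NumPy sketch of the extreme case, a hard-threshold activation (the threshold and numbers are made up): the weights stay full-precision and graded, while the neuron’s output is all-or-nothing.

```python
import numpy as np

def step_activation(x, threshold=0.0):
    # All-or-none output: the neuron "fires" (1.0) or it doesn't (0.0).
    return (x > threshold).astype(np.float32)

weights = np.array([0.73, -1.2, 0.05])  # full-precision, graded weights
inputs = np.array([1.0, 0.5, 2.0])
print(step_activation(weights @ inputs))  # 1.0 or 0.0, nothing in between
```

(Common activations like sigmoid are smooth approximations of this step rather than strictly binary, but the thresholding idea is the same.)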

The characteristic of a 1-bit model is not that its activations are recorded in a single bit but that its weights are. There are no gradations of connection weights: each one is just on or off. As far as I know, that’s different from both standard neural nets and from how the brain works.
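
To make the contrast concrete, here’s a rough sketch of sign-based weight binarization in NumPy (a BinaryConnect-style scheme; actual 1-bit nets differ in detail, and BitNet b1.58, for instance, is really ternary with weights in {-1, 0, +1}):

```python
import numpy as np

def binarize_weights(w):
    # Keep only each weight's sign (one bit per weight), plus a single
    # shared scale so the average magnitude is preserved.
    scale = np.abs(w).mean()
    return np.where(w >= 0, 1.0, -1.0) * scale

w = np.array([0.73, -1.2, 0.05, -0.4])
print(binarize_weights(w))  # every weight becomes +scale or -scale
```

After binarization there are exactly two possible weight values, so no connection can be “slightly stronger” than another except via its sign.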