Short-term Hebbian learning can implement transformer-like attention

Ellwood, Ian T. and Robinson, Emma Claire (2024) Short-term Hebbian learning can implement transformer-like attention. PLOS Computational Biology, 20 (1). e1011843. ISSN 1553-7358


Abstract

Transformers have revolutionized machine learning models of language and vision, but their connection with neuroscience remains tenuous. Built from attention layers, they require a mass comparison of queries and keys that is difficult to perform using traditional neural circuits. Here, we show that neurons can implement attention-like computations using short-term, Hebbian synaptic potentiation. We call our mechanism the match-and-control principle. It proposes that when activity in an axon is synchronous, or matched, with the somatic activity of a neuron that it synapses onto, the synapse can be briefly and strongly potentiated, allowing the axon to take over, or control, the activity of the downstream neuron for a short time. In our scheme, the keys and queries are represented as spike trains, and comparisons between the two are performed in individual spines, allowing for hundreds of key comparisons per query and roughly as many keys and queries as there are neurons in the network.
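The sketch below illustrates the correspondence the abstract describes between softmax attention and a multiplicative, short-term Hebbian potentiation, in a simplified rate-based setting rather than the spike-train implementation developed in the paper. All dimensions, variable names, and the exponential potentiation rule are illustrative assumptions for this sketch, not quantities taken from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not values from the paper).
d = 16          # dimensionality of key/query vectors
n_keys = 8      # number of stored key/value pairs

keys = rng.normal(size=(n_keys, d))
values = rng.normal(size=(n_keys, d))
query = rng.normal(size=d)

# Standard scaled dot-product attention, for reference.
scores = keys @ query / np.sqrt(d)
attn = np.exp(scores) / np.exp(scores).sum()
attention_output = attn @ values

# Hebbian-style analogue: each "spine" measures the match between the
# query (somatic activity) and its key (axonal activity); a matched
# synapse is briefly potentiated in proportion to exp(match), so the
# best-matching inputs dominate (control) the downstream response.
match = keys @ query / np.sqrt(d)     # per-spine coincidence measure
potentiation = np.exp(match)          # short-lived Hebbian boost
hebbian_output = (potentiation[:, None] * values).sum(0) / potentiation.sum()

# In this rate-based simplification the two computations coincide.
assert np.allclose(attention_output, hebbian_output)
```

In the paper itself, keys and queries are spike trains and the match is computed at individual spines; this sketch only shows why a potentiation that grows exponentially with the match reproduces the softmax weighting of attention.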

Item Type: Article
Subjects: Eprints STM archive > Biological Science
Date Deposited: 23 Mar 2024 11:05
Last Modified: 23 Mar 2024 11:05
URI: http://public.paper4promo.com/id/eprint/1898
