    AI models are powerful, but are they biologically plausible? | MIT News

    By bibhuti | Tech | November 24, 2023 | 6 min read

    Artificial neural networks, ubiquitous machine-learning models that can be trained to complete many tasks, are so called because their architecture is inspired by the way biological neurons process information in the human brain.

    About six years ago, scientists discovered a new type of more powerful neural network model known as a transformer. These models can achieve unprecedented performance, such as by generating text from prompts with near-human-like accuracy. A transformer underlies AI systems such as ChatGPT and Bard, for example. While incredibly effective, transformers are also mysterious: Unlike with other brain-inspired neural network models, it hasn’t been clear how to build them using biological components.

    Now, researchers from MIT, the MIT-IBM Watson AI Lab, and Harvard Medical School have produced a hypothesis that may explain how a transformer could be built using biological elements in the brain. They suggest that a biological network composed of neurons and other brain cells called astrocytes could perform the same core computation as a transformer.

    Recent research has shown that astrocytes, non-neuronal cells that are abundant in the brain, communicate with neurons and play a role in some physiological processes, like regulating blood flow. But scientists still lack a clear understanding of what these cells do computationally.

    With the new study, published this week in open-access format in the Proceedings of the National Academy of Sciences, the researchers explored the role astrocytes play in the brain from a computational perspective, and crafted a mathematical model that shows how they could be used, along with neurons, to build a biologically plausible transformer.

    Their hypothesis provides insights that could spark future neuroscience research into how the human brain works. At the same time, it could help machine-learning researchers explain why transformers are so successful across a diverse set of complex tasks.

    “The brain is far superior to even the best artificial neural networks that we have developed, but we don’t really know exactly how the brain works. There is scientific value in thinking about connections between biological hardware and large-scale artificial intelligence networks. This is neuroscience for AI and AI for neuroscience,” says Dmitry Krotov, a research staff member at the MIT-IBM Watson AI Lab and senior author of the research paper.

    Joining Krotov on the paper are lead author Leo Kozachkov, a postdoc in the MIT Department of Brain and Cognitive Sciences; and Ksenia V. Kastanenka, an assistant professor of neurobiology at Harvard Medical School and an assistant investigator at the Massachusetts General Research Institute.  

    A biological impossibility becomes plausible

    Transformers operate differently than other neural network models. For instance, a recurrent neural network trained for natural language processing would compare each word in a sentence to an internal state determined by the previous words. A transformer, on the other hand, compares all the words in the sentence at once to generate a prediction, a process called self-attention.
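
    To make that operation concrete, here is a minimal NumPy sketch of the standard scaled dot-product self-attention step. It is a generic textbook formulation rather than code from the paper, and the variable names are illustrative: each word's embedding is projected into a query, a key, and a value, and every output is a mixture of all the values, weighted by how well that word's query matches each key, so the entire sentence is consulted at once.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence.

    X          : (n_tokens, d_model) embeddings, one row per word
    Wq, Wk, Wv : projections from embeddings to queries, keys, and values
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # every word scored against every word
    weights = softmax(scores, axis=-1)       # each row sums to 1
    return weights @ V                       # each output mixes all the values

# Toy usage: a "sentence" of 5 tokens with 8-dimensional embeddings.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)  # (5, 8)
```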

    For self-attention to work, the transformer must keep all the words ready in some form of memory, Krotov explains, but this didn’t seem biologically possible due to the way neurons communicate.

    However, a few years ago, scientists studying a slightly different type of machine-learning model (known as a Dense Associative Memory) realized that this self-attention mechanism could occur in the brain, but only if there were communication between at least three neurons.
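
    One way to see that link, loosely and outside the specifics of this paper, is that the retrieval step of a modern Dense Associative Memory can be written in the same form as an attention lookup: a cue is compared against every stored pattern, and the softmax-weighted blend of those patterns becomes the new state. The sketch below uses illustrative toy names of my own, not the authors' notation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def dense_memory_retrieval(patterns, cue, beta=1.0):
    """One retrieval update of a softmax (modern Hopfield) Dense Associative Memory.

    patterns : (n_patterns, d) stored memories
    cue      : (d,) current state, e.g. a noisy or partial memory
    beta     : sharpness of the match; large beta gives near winner-take-all recall
    """
    similarities = patterns @ cue           # compare the cue with every memory
    weights = softmax(beta * similarities)  # the same form as an attention weight
    return weights @ patterns               # attention-style weighted readout

# Toy usage: recover a stored +/-1 pattern from a noisy cue.
rng = np.random.default_rng(1)
patterns = rng.choice([-1.0, 1.0], size=(4, 16))
noisy_cue = patterns[2] + 0.5 * rng.normal(size=16)
recalled = dense_memory_retrieval(patterns, noisy_cue, beta=4.0)
print(np.mean(np.sign(recalled) == patterns[2]))  # overlap with the true memory, close to 1.0
```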

    “The number three really popped out to me because it is known in neuroscience that these cells called astrocytes, which are not neurons, form three-way connections with neurons, what are called tripartite synapses,” Kozachkov says.

    When two neurons communicate, a presynaptic neuron sends chemicals called neurotransmitters across the synapse that connects it to a postsynaptic neuron. Sometimes, an astrocyte is also connected — it wraps a long, thin tentacle around the synapse, creating a tripartite (three-part) synapse. One astrocyte may form millions of tripartite synapses.

    The astrocyte collects some neurotransmitters that flow through the synaptic junction. At some point, the astrocyte can signal back to the neurons. Because astrocytes operate on a much longer time scale than neurons — they create signals by slowly elevating their calcium response and then decreasing it — these cells can hold and integrate information communicated to them from neurons. In this way, astrocytes can form a type of memory buffer, Krotov says.
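
    As a purely illustrative toy, far simpler than the biophysical model in the paper, that buffering role can be pictured as a leaky integrator whose decay time is much longer than the fast neuronal input driving it, so brief synaptic events leave a slowly fading calcium trace that summarizes recent activity.

```python
import numpy as np

def astrocyte_calcium_trace(synaptic_input, tau_astro=200.0, dt=1.0):
    """Toy leaky integrator standing in for a slow astrocytic calcium signal.

    synaptic_input : fast neuronal drive per time step (e.g. spike counts)
    tau_astro      : decay time constant, much longer than neuronal timescales
    Returns a slowly varying trace that effectively remembers recent input.
    """
    trace = np.zeros(len(synaptic_input))
    ca = 0.0
    for t, drive in enumerate(synaptic_input):
        ca += dt * (-ca / tau_astro + drive)  # fast drive, slow decay
        trace[t] = ca
    return trace

# Toy usage: brief bursts of input leave a calcium trace that lingers long afterward.
events = np.zeros(1000)
events[[100, 101, 102, 400, 401, 700]] = 1.0
trace = astrocyte_calcium_trace(events)
print(round(trace[110], 2), round(trace[300], 2), round(trace[999], 2))
```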

    “If you think about it from that perspective, then astrocytes are extremely natural for precisely the computation we need to perform the attention operation inside transformers,” he adds.

    Building a neuron-astrocyte network

    With this insight, the researchers formed their hypothesis that astrocytes could play a role in how transformers compute. Then they set out to build a mathematical model of a neuron-astrocyte network that would operate like a transformer.

    They took the core mathematics that comprise a transformer and developed simple biophysical models of what astrocytes and neurons do when they communicate in the brain, based on a deep dive into the literature and guidance from neuroscientist collaborators.

    Then they combined the models in certain ways until they arrived at an equation of a neuron-astrocyte network that describes a transformer’s self-attention.

    “Sometimes, we found that certain things we wanted to be true couldn’t be plausibly implemented. So, we had to think of workarounds. There are some things in the paper that are very careful approximations of the transformer architecture to be able to match it in a biologically plausible way,” Kozachkov says.

    Through their analysis, the researchers showed that their biophysical neuron-astrocyte network theoretically matches a transformer. In addition, they conducted numerical simulations by feeding images and paragraphs of text to transformer models and comparing the responses to those of their simulated neuron-astrocyte network. Both responded to the prompts in similar ways, confirming their theoretical model.

    “Having remained electrically silent for over a century of brain recordings, astrocytes are one of the most abundant, yet less explored, cells in the brain. The potential of unleashing the computational power of the other half of our brain is enormous,” says Konstantinos Michmizos, associate professor of computer science at Rutgers University, who was not involved with this work. “This study opens up a fascinating iterative loop, from understanding how intelligent behavior may truly emerge in the brain, to translating disruptive hypotheses into new tools that exhibit human-like intelligence.”

    The next step for the researchers is to make the leap from theory to practice. They hope to compare the model’s predictions to those that have been observed in biological experiments, and use this knowledge to refine, or possibly disprove, their hypothesis.

    In addition, one implication of their study is that astrocytes may be involved in long-term memory, since the network needs to store information to be able to act on it in the future. Additional research could investigate this idea further, Krotov says.

    “For a lot of reasons, astrocytes are extremely important for cognition and behavior, and they operate in fundamentally different ways from neurons. My biggest hope for this paper is that it catalyzes a bunch of research in computational neuroscience toward glial cells, and in particular, astrocytes,” adds Kozachkov.

    This research was supported, in part, by the BrightFocus Foundation and the National Institutes of Health.


