
Meta might've done something useful, pioneering an AI model that can interpret brain activity into sentences with 80% accuracy

Depending on what areas of the internet you frequent, perhaps you were under the illusion that thoughts-to-text technology already existed; we all have that one mutual or online friend that we gently hope will perhaps one day post slightly less. Well, recently Meta has announced that a number of their research projects are coming together to form something that might even improve real people's lives—one day. Maybe!

Way back in 2017, Meta (at that time just called 'Facebook') talked a big game about “typing by brain.” Fast forward to now and Meta has shared news of two breakthroughs that make those earlier claims seem more substantial than a big sci-fi thought bubble (via MIT Technology Review). Firstly, Meta announced research that has created an AI model which "successfully decodes the production of sentences from non-invasive brain recordings, accurately decoding up to 80% of characters, and thus often reconstructing full sentences solely from brain signals."

The second study Meta shared then examines how AI can facilitate a better understanding of how our brains slot the Lego bricks of language into place. For people who have lost the ability to speak after traumatic brain injuries, or who otherwise have complex communication needs, all of this scientific research could be genuinely life-changing. Unfortunately, this is where I burst the bubble: the 'non-invasive' device Meta used to record brain signals so that they could be decoded into text is huge, costs $2 million, and makes you look a bit like Megamind.

Dated reference to an animated superhero flick for children aside, Meta has been all about brain-computer interfaces for years. More recently they've even demonstrated a welcome amount of caution when it comes to the intersection of hard and 'wet' ware.

This time, the Meta Fundamental Artificial Intelligence Research (FAIR) lab collaborated with the Basque Center on Cognition, Brain and Language to record the brain signals of 35 healthy volunteers as they typed. Those brain signals were recorded using the aforementioned hefty headgear, specifically a MEG scanner, and then interpreted by a purposefully trained deep neural network.

Meta wrote, "On new sentences, our AI model decodes up to 80% of the characters typed by the participants recorded with MEG, at least twice better than what can be obtained with the classic EEG system."

This essentially means that recording the magnetic fields produced by the electrical currents within the participants' brains resulted in data the AI could more accurately interpret, compared to just recording the electrical activity itself via an EEG. However, by Meta's own admission, this does not leave the research in the most practical of places.
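To make the "decodes up to 80% of the characters" figure concrete, here's a minimal sketch of a character-level accuracy comparison between a typed sentence and a decoded one. The strings are hypothetical, and Meta's actual evaluation is based on character error rate over MEG recordings, not this toy position-by-position comparison:

```python
# Illustrative sketch: what "decoding 80% of characters" roughly means.
# Hypothetical example strings; not Meta's actual evaluation protocol.

def char_accuracy(reference: str, decoded: str) -> float:
    """Fraction of character positions where the decoded sentence
    matches the reference sentence (unmatched tail counts as wrong)."""
    length = max(len(reference), len(decoded))
    if length == 0:
        return 1.0
    matches = sum(1 for r, d in zip(reference, decoded) if r == d)
    return matches / length

typed = "the quick brown fox"
decoded = "the quick brpwn fqx"  # hypothetical model output
print(f"{char_accuracy(typed, decoded):.0%}")  # → 89%
```

A real system would score whole test sets of sentences this way (or via its edit-distance cousin, character error rate); the point is that "80%" is measured per character, not per whole sentence correctly reconstructed.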

For one, a MEG scanner is far from a helmet you can just pop on and off: it's specialised equipment that requires patients to sit still in a shielded room. Besides that, this study used a comparatively tiny sample of participants, none of whom had a known traumatic brain injury or speech difficulties. This means it's yet to be seen just how well Meta's AI model can interpret signals for those who really need it.

Still, as a dropout linguist myself, I'm intrigued by Meta's findings on how we string sentences together in the first place. Meta begins by explaining, "Studying the brain during speech has always proved extremely challenging for neuroscience, in part because of a simple technical problem: moving the mouth and tongue heavily corrupts neuroimaging signals." In light of this practical reality, typing instead of speaking is kind of genius.

So, what did Meta find? It's exactly like I said before: Linguistic Lego bricks, baby. Okay, that's an oversimplification, so I'll quote Meta directly once more: "Our study shows that the brain generates a sequence of representations that start from the most abstract level of representations—the meaning of a sentence—and progressively transform them into a myriad of actions, such as the actual finger movement on the keyboard [...] Our results show that the brain uses a ‘dynamic neural code’—a special neural mechanism that chains successive representations while maintaining each of them over long time periods."

To put it another way, your brain starts with vibes, unearths meaning, daisy-chains those Lego bricks together, then transforms the thought into the action of typing… yeah, I would love to see the AI try to interpret the magnetic fields that led to that sentence too.
