The brain reacts to different words using electrical signals and blood flow. Researchers built an AI model that can read those reactions and translate them into writing.

But the researchers point out that it took 16 hours of training, with someone listening to podcasts in an MRI machine, for the computer model to understand their brain patterns and interpret what they were thinking.

People were also able to ‘sabotage’ the technology, using methods like mentally listing animals’ names, to stop it reading their thoughts.

Jerry Tang, lead author of the study from the University of Texas at Austin, said he did not want to give ‘a false sense of security’ that the technology might not have the potential to eavesdrop on people’s thoughts in the future, and said it could be ‘misused’.

But he said: ‘We take very seriously the concerns that it could be used for bad purposes. And we want to dedicate a lot of time moving forward to try to avoid that.

‘I think, right now, while the technology is in such an early state, it's important to be proactive and get a head-start on, for one, enacting policies that protect people's mental privacy, giving people a right to their thoughts and their brain data.

‘We want to make sure that people only use these when they want to, and that it helps them.’