3 things direct from the future

Edition 60

Once every two weeks I will deliver “3 things direct from the future”: a two-minute read that will always give you:

  • one thing that can help,
  • one thing to be wary of, and
  • one thing to amaze.

If this sounds interesting to you then please subscribe.


1. One thing that helps

Social Fish Autism

I know you know the names of all your genes, so today we will be paying special attention to the old EGR1 gene. Mutations in the EGR1 gene have been linked to mental health conditions such as schizophrenia and depression, and, it is now thought, autism. A team of researchers has been investigating this “social” gene using zebrafish.

Like us, zebrafish are social creatures – they like to get close to each other and catch up over some bubbles. To test their social graces, scientists put two fish in separate adjacent tanks. The fish could see each other but could not sense water movement or chemical signals. Those with normal EGR1 genes tended to swim and orient towards the other fish. Those with mutations did not exhibit the same social tendencies.

But why do we study these genes in other species, in this case a fish? While our brain is more complex, the EGR1 gene is present in both zebrafish and humans. Observing how genes correlate with behavior in animals with a less complex cortex than ours lets us study the effects of those genes more clearly. This, in turn, may one day allow us to adjust differences in these genes to better assist people with conditions such as autism.

2. One to be wary of

Faces from Voices

“In some ways, then, the system is a bit like your racist uncle,” writes photographer Thomas Smith. “It feels it can always tell a person’s race or ethnic background based on how they sound — but it’s often wrong.”

We’ve seen AI that creates images from text, and it’s genuinely good at it. Now we have AI that can draw your portrait from your voice alone. It’s amazing, but also creepy.

Scientists at MIT trained an AI called Speech2Face on millions of videos from YouTube and the wider Internet. It learned to correlate the facial features of speakers with their voices. Based on these correlations, the AI makes an inference about the speaker’s gender, age, and ethnicity. Notably, no human labelling was involved in the training process – just loads of videos for the model to learn from.
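To give a feel for how that kind of self-supervised setup can work, here is a toy sketch in PyTorch. This is my illustration, not the MIT team’s code: the network shapes, the L1 loss, and the idea of training a voice encoder to match a frozen face encoder’s embedding are all simplified assumptions.

```python
# Toy sketch of the Speech2Face idea: learn a voice encoder whose output
# lands near a pretrained face encoder's embedding of the same speaker.
# The matching face supplies the training signal, so no human labels anything.
import torch
import torch.nn as nn

class VoiceEncoder(nn.Module):
    def __init__(self, n_mels=80, embed_dim=512):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(n_mels, 256, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(256, 256, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),            # average over time
        )
        self.proj = nn.Linear(256, embed_dim)   # map into "face-embedding space"

    def forward(self, spec):                    # spec: (batch, n_mels, time)
        return self.proj(self.conv(spec).squeeze(-1))

voice_enc = VoiceEncoder()
opt = torch.optim.Adam(voice_enc.parameters(), lr=1e-4)

# Stand-ins for one training batch: spectrograms of speech clips, plus the
# matching speakers' face embeddings from a frozen face-recognition network.
spectrograms = torch.randn(8, 80, 300)
face_embeddings = torch.randn(8, 512)

pred = voice_enc(spectrograms)                  # voice -> predicted face embedding
loss = nn.functional.l1_loss(pred, face_embeddings)
loss.backward()
opt.step()
print(f"one training step done, loss = {loss.item():.3f}")
```

The key point is that the video itself pairs each voice with a face, which is why millions of unlabelled clips are enough.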

With a few enhancements, the AI was able to make (relatively) accurate portraits out of voice recordings. See for yourself.

[Image: a portrait generated by Speech2Face from a voice recording]

However, this tech raises some issues. For one, like most AI, it has biases. It identified young boys with high voices as female, and identified low-voiced speakers as male even when the speaker was a woman. There are also biases tied to accent and language: an Asian man speaking English looked more Caucasian than the same man speaking Chinese.

I like the idea of seeing representations of people I’m talking to but don’t know, or being able to have video chats without showing my actual face. But I am concerned about this technology being used to further stereotype people calling in for services, based purely on the AI’s interpretation of their voice.

Still fascinating stuff.

3. One to amaze

Beats By Brain


Dr. Dre has nothing on these headphones! They can literally read your brain while you wear them. NextSense, a company born out of Google’s moonshot program, has created headphones equipped to record electroencephalogram (EEG) signals. In movies, we often see people with electrodes stuck to their heads for an EEG; it is used to monitor neurological activity in the brain and to diagnose diseases and disorders. Needless to say, having electrodes attached is uncomfortable for some. These headphones replace the wired electrodes and open the door to continuously monitoring the brain for issues.

One wonderful application of this technology right now is reading people’s brain signals to help predict when they might be about to have a seizure. The headphones may well provide the data we need to predict all sorts of things. Much like the ECU in your car reporting faults, we could spot anything out of the ordinary in the data and act on it before it becomes critical.
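As a rough illustration of that ECU analogy, here is a toy Python sketch that flags unusual readings in a stream of EEG data. This is my own simplification, not NextSense’s method: the simulated signal, the rolling-baseline window, and the 4-sigma threshold are all made-up assumptions.

```python
# Toy "ECU for the brain": compare each new EEG reading to a rolling
# baseline and raise an alert when it deviates too far from normal.
import numpy as np

rng = np.random.default_rng(0)
eeg_power = rng.normal(10.0, 1.0, 600)   # simulated per-second EEG band power
eeg_power[450:460] += 8.0                 # injected anomaly, e.g. pre-seizure surge

WINDOW, N_SIGMA = 120, 4.0                # baseline length (s) and alert threshold
for t in range(WINDOW, len(eeg_power)):
    baseline = eeg_power[t - WINDOW:t]    # the last two minutes of readings
    mean, std = baseline.mean(), baseline.std()
    if abs(eeg_power[t] - mean) > N_SIGMA * std:
        print(f"t={t}s: reading {eeg_power[t]:.1f} deviates from baseline "
              f"{mean:.1f}, raise an alert before it becomes critical")
```

Real seizure prediction uses far richer features and models, but the basic monitoring loop is the same shape: compare each new reading to a baseline and alert on large deviations.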

In the future, the headphones may be used instead of invasive implants to facilitate brain–machine interfaces, like what Elon Musk is trying to do with Neuralink. Headphones are no longer just for listening to music; they’re gonna be reading your brain too!

Have a great week.

Daniel J McKinnon

Connect on LinkedIn

