The Alarming Effect of Manufactured Voices

So yesterday I came home from work to find a message on my Facebook account from a stranger. He said that he had ordered shoes and socks on my website and demanded a refund. After I politely told him that Trend Gear doesn’t sell anything, he became annoyed and threatened to go to the authorities. I wasn’t sure what was going on either, so I reported his account immediately. Why? Because scenarios like this instantly alarm me. This might be one of those fake accounts trying to steal god knows what from me. Being a little overdramatic, I immediately shut down my computer and went back to covering my webcam.

Over the last year, ‘Fake News’ has become a negatively impactful phenomenon. I never really looked into it, but as soon as I felt alarmed (yesterday) I became interested in its effects and started digging into this fascinating topic. What I found, however, was even more alarming than I’d expected. If you thought written fake news was threatening our society, you have no idea what’s coming.

Fake faces

A few years back, Apple bought some of the most innovative companies in 3D technology, facial recognition and augmented reality. Nobody really understood why Apple made such big purchases, but now we know: these tech-savvy people helped create the iPhone X and its most prominent feature, Animoji.


It might seem harmless, but the tech behind Animoji is extremely expensive. It’s not new by any means: Hollywood has been using facial capture for some time now. If you’ve ever watched a behind-the-scenes clip of a big blockbuster movie, you might be familiar with actors whose faces are covered in dots. These dots, called ‘markers’, enable machines to compute facial expressions more easily. This is how the public was blessed with a movie like Avatar.
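To get a feel for what those markers actually give a machine, here is a toy sketch (not a real motion-capture pipeline; the marker names and coordinates are made up for illustration): markers are just tracked 2D points, and comparing their positions across frames yields the motion that drives a digital character’s expression.

```python
def marker_displacements(frame_a, frame_b):
    """Return the per-marker (dx, dy) movement between two frames."""
    return {
        name: (frame_b[name][0] - frame_a[name][0],
               frame_b[name][1] - frame_a[name][1])
        for name in frame_a
    }

# Hypothetical marker positions (pixel coordinates) in two frames.
neutral = {"mouth_left": (40, 80), "mouth_right": (60, 80)}
smiling = {"mouth_left": (38, 74), "mouth_right": (62, 74)}

moves = marker_displacements(neutral, smiling)
# Both mouth corners moved up (negative dy in image coordinates),
# which an animation rig would map onto a smile for the character.
print(moves)
```

Animoji replaces the painted dots with depth-sensing hardware, but the principle is the same: track points on a face, turn their movement into an expression.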

However, with Animoji you don’t need to draw any dots on your face. In fact, your face is the only marker you need. So while today we use this technology for cute and harmless emoji, tomorrow we might be able to use it on actual human faces. Algorithms can already build hyper-realistic 3D models of faces from a single photo. Pinscreen, the start-up behind this project, says Animoji are just the beginning. Soon we’ll see people using the face of the president of the United States. While this can be seen as a fun extension of Animoji, the thought of this technology reaching a bigger audience is quite scary, yet imminent.


Fake voices

Though scarily accurate faces are sinister enough, the combination with manufactured voices will leave us extremely vulnerable. Several technologies can use a recorded speech, a voice memo or even a phone call to mimic your voice. Not only does this strip away the uniqueness of an individual, it’s also a security nightmare.

With the Internet of Things, all our devices are connected. Add the fact that more and more gadgets can be voice-controlled and you’ve got the perfect ingredients for a possible disaster. Artificial voices are quite easy to create for giants like Google and Facebook: their algorithms recognise not only the unique vibration of our voice but also our intonation and pacing.
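That “unique vibration” isn’t magic: it boils down to measurable properties of the audio signal. As a heavily simplified sketch (a pure sine tone standing in for a voice, and a basic autocorrelation instead of the far richer models real systems use), here is how a machine can recover one such property, the fundamental frequency or pitch:

```python
import math

SAMPLE_RATE = 8000  # samples per second

def estimate_pitch(samples, sample_rate, lo_hz=80, hi_hz=400):
    """Estimate pitch by finding the lag with the highest autocorrelation.

    A periodic signal correlates strongly with itself when shifted by
    exactly one period, so the best lag reveals the frequency.
    """
    best_lag, best_score = 0, float("-inf")
    for lag in range(sample_rate // hi_hz, sample_rate // lo_hz + 1):
        score = sum(samples[i] * samples[i + lag]
                    for i in range(len(samples) - lag))
        if score > best_score:
            best_lag, best_score = lag, score
    return sample_rate / best_lag

# A pure 200 Hz tone as a stand-in for a (very simplified) voice signal.
tone = [math.sin(2 * math.pi * 200 * n / SAMPLE_RATE) for n in range(2000)]
print(round(estimate_pitch(tone, SAMPLE_RATE)))  # → 200
```

A voice-cloning system models many more features than pitch (timbre, intonation contours, pacing), but the point stands: everything that makes your voice yours can be measured, and whatever can be measured can be reproduced.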

With voice, misinterpreting a message is nearly impossible. Not only because of the words we choose, but also because of the emotion we carry a message with. These subconscious meanings are why we love to use emoji: they give a clue on how to interpret a written message. Speaking with one another still remains the most efficient way of communicating. Now imagine fake voices so well manufactured that they can fool us all.

Audio fakery can have a shocking impact on our society. Given the current distrust in media institutions, a fabricated clip of an actor or president saying something threatening could trigger political conflict. Or worse, it could trigger a war. But how can we protect ourselves? Are we able to distinguish the fake from the real?

Pop the Bubble 

In the coming years, I think it’ll become increasingly important to deal with our own filter bubbles. The only way to deal with fake news, fake audio and fake faces is to gain perspective. A filter bubble is, to a certain degree, also a form of fake news. Facebook might be filtering the dark side of the internet into oblivion, but the real truth still isn’t always cushy.

When filters determine what we see and what we do not see based on what we like, our own opinion is constantly validated, making us blind to other possible truths. In my opinion, this is extremely dangerous. As people, we sometimes need to take a critical look at how we experience and see the world. Ask yourself: ‘What do others see?’, ‘Have I been looking at this the wrong way?’, ‘OMG, there is way more to this topic than I initially thought! GASP’. Talking to others and sharing your opinions is a significant part of widening your perspective.






So, since it’s December, let me say this: try to pop your bubble in 2018. Discover more truths, read all kinds of newspapers and remember: there is not always one truth.

Want to dig deeper into this topic? Visit the following links:


With this technology, the scammers of the future will call you with your mother’s voice

