
The weaponization of AI and machine learning

Those who dare to look beyond the hype will quickly discover that the partnership of AI and machine learning is already taking center stage. Although its capabilities at present are often exaggerated, a quick look toward the future reveals that the long-term potential of these emerging technologies is, if anything, underrated.

This tech dream team is beginning to transform businesses across multiple industries through new ways of interacting with data. The ability to apply complex mathematical calculations to an organization’s big data automatically, repeatedly, and at scale offers an array of mouth-watering opportunities.

However, while everyone is getting excited about future possibilities, we also need to remember that there is no light without darkness. Unfortunately, an algorithmic war is also brewing. What makes this battle unique is that there will be no boundaries or geographical borders, and sometimes there won’t even be any humans involved.

With great power comes great responsibility, and a simple oversight today could quickly be your undoing tomorrow. The butterfly effect could have a much more significant impact on the future of businesses and our lifestyle than many realize. As emerging technologies converge, connect disparate elements and remove data silos, some will undoubtedly attempt to weaponize AI and machine learning too.

Hacker AI vs. Enterprise AI

There is a battle brewing in the cybersecurity arena as both the good guys and the bad guys turn to AI and machine learning in an elaborate game of cat and mouse. AI-enabled cyber threats will attempt to outsmart your protection systems using malicious tactics.

There is an argument that although AI can add value, it is also creating new vulnerabilities at the same time. The problem is that there is no silver bullet that can protect an organization. A combination of AI and machine learning will only take you so far, but deep learning and its combative capabilities will provide the best protection against future cyberattacks.

Together, these technologies enable defenses that learn continuously without explicit programming. The main problem for the good guys is that although AI and its algorithms are excellent at detecting variations in known patterns, they currently struggle to identify entirely new ones. Finding the unknown is something computers still struggle with.
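
To make that limitation concrete, here is a minimal, hypothetical sketch of pattern-based detection using scikit-learn’s IsolationForest (the feature names and traffic numbers are invented for illustration). A detector like this scores how far new events deviate from the traffic it was trained on; it has no inherent way of recognizing a genuinely novel attack technique.

# Minimal sketch: flagging deviations from a learned pattern of "normal" traffic.
# The model only knows the data it was fitted on, so it detects variations,
# not fundamentally new kinds of attack.
import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical features: [requests per minute, bytes transferred, failed logins]
normal_traffic = np.array([
    [120, 5000, 0],
    [115, 4800, 1],
    [130, 5200, 0],
    [125, 5100, 0],
])

detector = IsolationForest(contamination=0.1, random_state=42)
detector.fit(normal_traffic)

new_events = np.array([
    [128, 5050, 0],      # close to the learned pattern
    [900, 250000, 40],   # large deviation, likely flagged
])

print(detector.predict(new_events))  # 1 = fits the known pattern, -1 = anomaly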

Pulling the plug on a nation’s power grid

When talking about cyber warfare, we have already highlighted that battles are much more likely to be fought online by machines rather than by humans. In a world that is heavily reliant on technology, nations are beginning to build smart cities as IoT sensors transmit data via 5G connections.

With everything online, the easiest way to disable an entire nation would be to pull the plug on its access to power. Back in 2015, a team of cunning hackers struck vital power plants in Ukraine, leaving 230,000 residents in the dark. They also disabled backup power supplies which left the engineers stumbling in the dark too.

Weaponized AI and machine learning could target power grids and the critical infrastructure of any country. Our reliance on electricity has become our most significant vulnerability. Simply hitting the off switch means hotel key cards stop working, retailers cannot process payments, and within hours, people are unable to communicate with each other.

Facial weaponization

High-speed facial recognition processing enables software to identify individuals from databases containing millions of images. This is only made possible by AI and machine learning algorithms. Companies such as Microsoft have already quietly deleted their massive facial recognition databases. But this is a drop in the ocean compared to the 300 million images posted on Facebook and 95 million photos posted on Instagram every day.
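
To give a rough sense of how that kind of identification works under the hood, here is a minimal sketch using the open-source face_recognition library (the image file names are invented). Each face is reduced to a compact numerical encoding, and identification is essentially a comparison of a new encoding against the millions already sitting in a database.

# Minimal sketch of face matching against stored encodings (hypothetical file names).
import face_recognition

# An encoding a real system would have harvested from photos posted online.
known_image = face_recognition.load_image_file("profile_photo.jpg")
known_encoding = face_recognition.face_encodings(known_image)[0]

# A new image, for example a frame from a surveillance camera.
frame = face_recognition.load_image_file("camera_frame.jpg")

for encoding in face_recognition.face_encodings(frame):
    # True if the distance between the encodings falls under the tolerance threshold.
    if face_recognition.compare_faces([known_encoding], encoding, tolerance=0.6)[0]:
        print("Match: the person from profile_photo.jpg appears in this frame")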

Facebook can recognize your identity every time an image of you is posted online. Our privacy settings no longer seem to protect us either. For example, the location-based marketing platform Hyp3r was recently caught scraping public data from Instagram. In the same week, Biostar was forced to admit that the biometric data of a million people had been compromised too.

What if your own image could be weaponized?

Imagine combining the thousands of pictures that you have posted online. Now throw in footage from the numerous facial recognition systems that captured you, from your shopping trip to an afternoon visit to a museum. All of this data could potentially be fed into a system to create deep-fake videos of you.

It’s already been done to Mark Zuckerberg and Barack Obama. But what happens when a competitor or someone with a grudge decides to discredit you in the same way? If you are not in the public eye with a team of people to support you, it could be much harder to prove your innocence.

What about surveillance cameras?

The surveillance cameras that protect us could also be weaponized. If you were deemed a person of interest, it would be incredibly easy to detect and track you. If Facebook knows when photos of you are posted on their platform, what would happen if this was cross-referenced with every other system using facial recognition?

The bottom line

Without even realizing it, we have fed photos, videos, and even our speech patterns into large databases. Your digital footprint can never be erased. If weaponized, it could cause reputational damage in the future.

Just because you have nothing to hide, it doesn’t mean you have nothing to fear. The moment you or your business becomes a threat, it will be weaponized technology that is waiting to take you down. We are currently like a small child playing with a very dangerous toy, blissfully unaware of the consequences.

Every action we take today could have much broader implications tomorrow.
