Now that the dust has settled on the hype around ChatGPT, it may be a good time to unpack the full implications of this technology.

While it certainly helps sleep-deprived college students ace term papers and gives copywriters a creative boost, it has a potentially dark underbelly.

David Carvalho, CEO and co-founder of Naoris Protocol, unpacks some of the not-so-pretty aspects of emerging AI technology and its potential to wreak havoc on businesses globally.

How can ChatGPT be used to exploit code, and can it really create code?

The short answer is yes.

OpenAI’s ChatGPT is a large language model (LLM)-based artificial intelligence (AI) text generator; it requires only a prompt phrased as a normal English-language query.

GPT stands for Generative Pre-trained Transformer. It is trained on a vast sample of text from the internet, containing billions of words, from which it learns about every subject covered in that data.

It can ‘think up’ everything from essays, poems, and emails to, yes, computer code.

It can generate code from a plain-English description, or take new and existing code as input.

This code can, however, be exploited for malicious purposes.

While Google can show you an article on how to solve a specific coding problem, ChatGPT could write the code for you.
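
As a rough illustration, the sketch below shows how a developer might make that request programmatically. It assumes the official openai Python client (v1.x), an API key supplied via the OPENAI_API_KEY environment variable, and the gpt-3.5-turbo model name; the prompt itself is purely illustrative.

```python
# Minimal sketch: asking ChatGPT's underlying model to write code from a
# plain-English prompt. Assumes the official openai Python client (v1.x)
# and an API key in the OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY automatically

prompt = (
    "Write a Python function that checks whether a password is at least "
    "12 characters long and contains letters, digits, and symbols."
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # illustrative model name
    messages=[{"role": "user", "content": prompt}],
)

# The generated code comes back as ordinary text in the first choice.
print(response.choices[0].message.content)
```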

It would also enable companies to change their deployment processes, making them more thorough prior to launch.

What are some of the current limitations?

The downside is that bad actors can use AI to find vulnerabilities in, and exploit, any popular existing coding standard, Smart Contract code, or even well-known computing platforms and operating systems.
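
The same interface can just as easily be pointed at existing code. The hedged sketch below, under the same assumptions about the openai Python client, API key, and model name, shows how someone could hand the model a snippet and ask it to list exploitable weaknesses; the vulnerable snippet and the prompt are hypothetical examples.

```python
# Minimal sketch of the flip side: handing existing code to the model and
# asking it to point out exploitable weaknesses. The vulnerable snippet and
# the prompt are purely illustrative.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

suspect_code = '''
def run_report(user_input):
    import os
    os.system("generate_report --name " + user_input)  # unsanitised input
'''

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # illustrative model name
    messages=[{
        "role": "user",
        "content": "List any security vulnerabilities in this code and "
                   "explain how they could be exploited:\n" + suspect_code,
    }],
)

print(response.choices[0].message.content)
```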

AI is not conscious; it is an algorithm based on mathematical principles, weights, and biases.

It will miss the basic preconceptions, knowledge, emotions, and subtleties that only humans can see.