
Pros and Cons of Using ChatGPT for DevOps - from the AI itself

We asked ChatGPT: ‘Could you write me a blog post suitable to the Adaptavist series of blog posts that outlines the pros and cons of ChatGPT in software development and DevOps?’ Here’s what it had to say… (and don’t forget to keep scrolling for the human response!)

What is ChatGPT?

ChatGPT is a language model developed by OpenAI that has revolutionised how we interact with AI. It is trained on vast amounts of data and can generate human-like responses to a wide range of questions. This has made it an attractive tool for software development and DevOps teams who are looking for ways to automate repetitive tasks and improve the efficiency of their work. In this post, we'll explore the pros and cons of using ChatGPT in these areas.


Pros

  • Automates repetitive tasks: ChatGPT can be programmed to perform repetitive tasks such as responding to customer inquiries or running routine checks on systems, freeing up valuable time for developers and DevOps professionals.
  • Increases efficiency: With its ability to understand natural language, ChatGPT can quickly analyse data and provide accurate information, reducing the time it takes to complete tasks.
  • 24/7 availability: Unlike humans, ChatGPT doesn't need rest, making it available 24/7 to handle tasks and respond to inquiries.
  • Improved accuracy: ChatGPT is trained on vast amounts of data, making it capable of generating accurate responses and reducing the possibility of human error.


Cons

  • Limitations in understanding: While ChatGPT is highly trained, it still has limitations in its understanding of context and meaning, making it vulnerable to misinterpretation and incorrect responses.
  • Reliance on data: The accuracy of ChatGPT's responses is directly linked to the quality and quantity of data it was trained on, making it necessary for developers to update the training data constantly.
  • Privacy concerns: The use of ChatGPT raises privacy concerns as the tool is trained on vast amounts of data and may have access to sensitive information.
  • High cost: The development and maintenance of ChatGPT systems require significant investment, making it an expensive solution for some organisations.


In conclusion, ChatGPT has the potential to revolutionise the way software development and DevOps teams work by automating repetitive tasks and increasing efficiency. However, it also has limitations in its understanding and raises privacy concerns, making it necessary for organisations to weigh the pros and cons before implementing it into their workflow.

And now for the human part…

We asked ChatGPT a question, and it responded – the information above has been posted without any edits. It's clear the AI can shed some light on its strengths and weaknesses, but how accurate do you find its response?

Here are some other questions ChatGPT's content might have you asking:

  • What real-world pros and cons have developers who've already used ChatGPT for software development and DevOps noticed?
  • For those who haven’t, are they considering using it in the future? At Adaptavist, we are already working on incorporating AI capabilities into our products and services. How about you?
  • Are the above statements meaningful and accurate?
  • How will ChatGPT impact the DevOps industry moving forward?
  • Will ChatGPT replace developers, or is it just a fad?
DevOps Decrypted Podcast

Learn more about ChatGPT in Ep. 12 of our DevOps Decrypted Podcast!

Listen to our experts as they talk about OpenAI's ChatGPT, and how powerful it is (when we can actually access it) – as well as some of its predictions for DevOps in 2023!

Listen now!

Rise of the machines

To remain competitive in today's software development landscape, automation has become essential. Any tool that eliminates or speeds up manual processes effectively deserves serious consideration from developers. Failing to use one could leave you lagging behind everyone else.

Of course, the less time spent writing responses to customer questions or creating boilerplate code, for example, the more time developers have to get on with building complex application architecture or coming up with new features. And for now, it looks like ChatGPT is a faster option than, say, a Google search (more on Google in a second).

The key word here is 'effectively'. ChatGPT needs to be prompted – someone has to know what to put in to get the right result out, a skill set that will undoubtedly be in high demand as AI becomes more commonplace in development.
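To make that concrete, here is a minimal, purely illustrative Python sketch of what a programmatic prompt might look like. The message structure follows OpenAI's chat-completions format, but the model name, wording, and helper function are all hypothetical examples, and no request is actually sent – a real call would need the `openai` client library and an API key:

```python
# Illustrative only: assemble the kind of request body a chat-completion
# API expects. The system message frames the model's role, and its precise
# wording is exactly the prompt-crafting skill described above.
def build_prompt(task_description: str, code_snippet: str) -> dict:
    """Build a chat-style payload asking the model for boilerplate code."""
    return {
        "model": "gpt-3.5-turbo",  # hypothetical model choice
        "messages": [
            {
                "role": "system",
                "content": "You are a senior DevOps engineer. Reply with code only.",
            },
            {
                "role": "user",
                "content": f"{task_description}\n\n{code_snippet}",
            },
        ],
        "temperature": 0,  # lower temperature -> more deterministic output
    }

payload = build_prompt(
    "Write a GitHub Actions workflow that runs these tests on every push:",
    "pytest tests/ --maxfail=1",
)
```

Small changes to the system or user message can produce very different output, which is why knowing what to put in matters as much as the tool itself.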

While ChatGPT is still not mature enough to write complex code, it's becoming more proficient every day. However, over-reliance isn't advised. While ChatGPT responded that it's capable of generating accurate responses, just like humans, it's also capable of generating inaccurate ones. Taking everything offered up by the AI as fact is naive and could lead you into trouble.

Malicious intent

ChatGPT said it itself: the accuracy of what it spews out is directly linked to the data it's been trained on – and right now, that's data from no later than 2021 (it's not hooked up to the internet).

It also means that without serious oversight, you could end up using AI-generated code that’s bug-filled at best, setting you back and slowing down deployment, and malicious at worst – code that’s designed to cause damage, potentially leaving your own software vulnerable to hackers.

ChatGPT is right to mention privacy, but it doesn't touch on one of the wider issues. With the AI running on whatever data is fed into it, you need to think carefully before inputting your own code and asking it to make small tweaks. Is that proprietary information? Does it belong to your organisation or a client? Could ChatGPT regurgitate your work to others or leave your own code more vulnerable?

Legal implications

And what about the legal and ethical implications of relying on AI-generated code? OpenAI, ChatGPT's developer, is currently facing a class-action lawsuit alongside Microsoft, which has invested billions into OpenAI and GitHub. The case has been brought by developers who accuse the companies of scraping licensed code to build GitHub Copilot.

Launched in 2021, this tool uses OpenAI's tech to generate and suggest lines of code inside a developer's own code editing program. Sure, it might speed up how quickly you can get the job done, but at what cost? How can you be sure the code you’re using is open-source or licensed? How can you know you’re not stealing from someone else?

What's next?

What ChatGPT failed to acknowledge is that it’s not the only super-smart robot on the market. There are plenty of other chatbots out there, some specifically developed to generate code. And that’s all before Google’s stepped into the picture.

Bard is Google’s AI chat service, which was introduced on 6 February this year. Unlike ChatGPT, which isn’t connected to the internet, Google’s service uses its Language Model for Dialogue Applications (LaMDA), which draws on web content to create responses. It’s not been rolled out to the general public yet, so time will tell how impressive this rival is and how it might better benefit developers.

At Adaptavist, we believe in harnessing the power of the latest technologies to transform the way organisations work. But, whatever the bots are saying, we also see huge value in human interactions. After all, it is people first in the PPT (people, process, and technology) framework. An AI evolution is underway, but we need experts to guide it on the right path.

Want to know more about what AI could do for your organisation?

Get in touch!

About the authors

Jobin Kuruvilla

Jobin Kuruvilla is a DevOps subject matter expert, an experienced solutions expert, and an app developer. He holds several certifications, including Atlassian products, GitLab (certified PSE), AWS, Kubernetes, and Jenkins, to name a few, and has spearheaded digital transformation for teams and enterprises.