Sonatype reveals DevOps and SecOps leaders’ views on generative AI

While the tech community remains divided on the potential of generative AI tools, many believe their impact on the industry will be comparable to the adoption of cloud technology.

Software engineers are harnessing generative AI to explore libraries, create new code, and enhance their development process, while application security professionals employ it for code analysis and security testing.

A recent survey conducted by Sonatype in the US sheds light on how generative AI is influencing software engineers and the software development lifecycle.

The survey engaged 400 DevOps and 400 SecOps leaders, revealing near-universal adoption of generative AI. However, some respondents expressed concerns about security risks and potential job displacement.

Role-based perspectives

DevOps and SecOps leaders alike cite security (52%) and job loss (49%) as their top concerns regarding generative AI. These concerns reflect the ongoing debate about AI’s potential to replace technical roles.

Additionally, a significant portion (74%) of respondents feel pressured to adopt generative AI, with DevOps leads feeling this pressure more intensely than their counterparts.

Both groups agree that, in the absence of copyright law governing AI-generated output, its creators should own the copyright (40%), and they support compensating developers (90%).

Most respondents (97%) currently use generative AI in their workstreams to some degree, with 84 percent using it regularly and 41 percent using it daily.

Most popular generative AI tools

Generative AI tools are gaining popularity, with engineers and security teams extensively employing them for various tasks.

ChatGPT is the preferred tool for 86 percent of respondents, followed by GitHub Copilot at 70 percent.

Sentiment differences

DevOps and SecOps leaders differ in their sentiment toward generative AI.

While 61 percent of DevOps leads believe it’s overhyped, 90 percent of SecOps leads think its impact on the industry will be comparable to that of cloud technology.

SecOps leads also report more significant time savings (57% save at least 6 hours per week compared to 47% for DevOps) and a higher rate of full implementation (45% vs 31%).

Security concerns

Unsurprisingly, security concerns are prevalent in both groups: 77 percent of DevOps leads and 71 percent of SecOps leads feel pressured to use generative AI despite their security worries.

DevOps leaders are more pessimistic about the technology’s potential to lead to more security vulnerabilities (77%), particularly in open-source code (58%). They also anticipate it making threat detection more complex (55%).

“The AI era feels like the early days of open source, like we’re building the plane as we’re flying it in terms of security, policy and regulation,” comments Brian Fox, Co-founder and CTO at Sonatype.

“Adoption has been widespread across the board, and the software development cycle is no exception. While productivity dividends are clear, our data also exposes a concerning, hand-in-hand reality: the security threats posed by this still-nascent technology.

“With every innovation cycle comes new risk, and it’s paramount that developers and application security leaders eye AI adoption with an eye for safety and security.”

Responsible use and regulation

Organisations are addressing concerns through generative AI policies and are keenly awaiting regulation in a space that lacks comprehensive governance.

Notably, 71 percent of respondents report that their organisations have established policies for generative AI use, while 20 percent are in the process of developing them.

In terms of regulation, 15 percent of DevOps leads believe the government alone should regulate generative AI, a view shared by just six percent of SecOps leads. A majority (78% of SecOps and 59% of DevOps) suggest that both the government and individual companies should play a role in regulation.

Copyright and compensation

Respondents agree that developers should be compensated (90%) when their open-source code is used in LLMs. Furthermore, 67 percent believe the organisation using such code in its software should pay the developers.

As generative AI continues to evolve, it will be essential to strike a balance between its potential benefits and the need for responsible and ethical implementation.

You can find a full copy of Sonatype’s report here.

(Photo by Agence Olloweb on Unsplash)

See also: Niantic 8th Wall enhances WebAR with powerful GenAI modules


Author

  • Ryan Daws

    Ryan is a senior editor at TechForge Media with over a decade of experience covering the latest technology and interviewing leading industry figures. He can often be sighted at tech conferences with a strong coffee in one hand and a laptop in the other. If it's geeky, he’s probably into it. Find him on Twitter (@Gadget_Ry) or Mastodon (@gadgetry@techhub.social)
