
AI Policies are Low, Use is High, and Adversaries are Taking Advantage, Says New AI Study

SCHAUMBURG, Ill.--(BUSINESS WIRE)--A new poll of global digital trust professionals reveals a high degree of uncertainty around generative artificial intelligence (AI), few company policies around its use, a lack of training, and fears around its exploitation by bad actors, according to Generative AI 2023: An ISACA Pulse Poll.


Digital trust professionals from around the globe—those who work in cybersecurity, IT audit, governance, privacy and risk—weighed in on generative AI—artificial intelligence that can generate text, images and other media—in a new pulse poll from ISACA that explores employee use, training, attention to ethical implementation, risk management, exploitation by adversaries, and impact on jobs.

Diving in, even without policies

The poll found that many employees at respondents’ organizations are using generative AI, even without policies in place for its use. Only 28 percent of respondents say their organizations expressly permit the use of generative AI, only 10 percent say a formal comprehensive policy is in place, and more than one in four say no policy exists and there is no plan for one. Despite this, over 40 percent say employees are using it regardless—and the percentage is likely much higher given that an additional 35 percent aren’t sure.

These employees are using generative AI in a number of ways, including to:

  • Create written content (65%)
  • Increase productivity (44%)
  • Automate repetitive tasks (32%)
  • Provide customer service (29%)
  • Improve decision making (27%)

Lack of familiarity and training

However, despite employees quickly moving forward with use of the technology, only six percent of respondents’ organizations are providing training to all staff on AI, and more than half (54 percent) say that no AI training at all is provided, even to teams directly impacted by AI. Only 25 percent of respondents indicated they have a high degree of familiarity with generative AI.

“Employees are not waiting for permission to explore and leverage generative AI to bring value to their work, and it is clear that their organizations need to catch up in providing policies, guidance and training to ensure the technology is used appropriately and ethically,” said Jason Lau, ISACA board director and CISO at Crypto.com. “With greater alignment between employers and their staff around generative AI, organizations will be able to drive increased understanding of the technology among their teams, gain further benefit from AI, and better protect themselves from related risk.”

Risk and exploitation concerns

The poll explored the ethical concerns and risks associated with AI as well, with 41 percent saying that not enough attention is being paid to ethical standards for AI implementation. Fewer than one-third of respondents’ organizations consider managing AI risk to be an immediate priority, 29 percent say it is a longer-term priority, and 23 percent say their organization has no plans to consider AI risk at the moment, even though respondents note the following as top risks of the technology:

  1. Misinformation/Disinformation (77%)
  2. Privacy violations (68%)
  3. Social engineering (63%)
  4. Loss of intellectual property (IP) (58%)
  5. Job displacement and widening of the skills gap (tied at 35%)

More than half (57 percent) of respondents indicated they are very or extremely worried about generative AI being exploited by bad actors. Sixty-nine percent say that adversaries are using AI as successfully or more successfully than digital trust professionals.

“Even digital trust professionals report a low familiarity with AI—a concern as the technology iterates at a pace faster than anything we’ve seen before, with use spreading rampantly in organizations,” said John De Santis, ISACA board chair. “Without good governance, employees can easily share critical intellectual property on these tools without the correct controls in place. It is essential for leaders to get up to speed quickly on the technology’s benefits and risks, and to equip their team members with that knowledge as well.”

Impact on jobs

Examining how current roles are involved with AI, respondents believe that security (47 percent), IT operations (42 percent), and risk and compliance (tied at 35 percent) are responsible for the safe deployment of AI. Looking ahead, one in five organizations (19 percent) plan to open AI-related job roles in the next 12 months. Forty-five percent believe a significant number of jobs will be eliminated due to AI, but digital trust professionals remain optimistic about their own jobs, with 70 percent saying it will have some positive impact on their roles. To realize that positive impact, 80 percent think they will need additional training to retain their jobs or advance their careers.

Optimism in the face of challenges

Despite the uncertainty and risk surrounding AI, 80 percent of respondents believe AI will have a positive or neutral impact on their industry, 81 percent believe it will have a positive or neutral impact on their organizations, and 82 percent believe it will have a positive or neutral impact on their careers. Eighty-five percent of respondents also say AI is a tool that extends human productivity, and 62 percent believe it will have a positive or neutral impact on society as a whole.

Learn More

Read more in the infographic and other AI resources, including the AI Fundamentals Certificate, the complimentary white paper The Promise and Peril of the AI Revolution: Managing Risk, and a free guide to AI policy considerations, at www.isaca.org/resources/artificial-intelligence.

About ISACA

ISACA® (www.isaca.org) equips individuals and enterprises with the knowledge, credentials, education, training and community to progress their careers, transform their organizations, and build a more trusted and ethical digital world. ISACA leverages the expertise of its more than 165,000 members who work in digital trust fields such as information security, governance, assurance, risk, privacy and quality. It has a presence in 188 countries, including 225 chapters worldwide. Through its foundation One In Tech, ISACA supports IT education and career pathways for underresourced and underrepresented populations.

Contacts

communications@isaca.org
Emily Ayala, +1.847.385.7223
Bridget Drufke, +1.847.660.5554
