
GENERATIVE AI
Despite safeguards designed to prevent abuse, cybercriminals are adopting large language models to build malicious subscription-based services. Chatbots such as FraudGPT and WormGPT are lowering the skill level required to run complex and convincing campaigns.
How can leaders help ensure that AI is developed securely? Guidelines for Secure AI System Development, published by the NCSC and developed with the US's Cybersecurity and Infrastructure Security Agency (CISA) and agencies from 17 other countries, advise on the design, development, deployment and operation of AI systems.
The NCSC says that, crucially, keeping AI systems secure is as much about organisational culture, process and communication as it is about technical measures. “Security should be integrated into all AI projects and workflows in your organisation from inception. This is known as a ‘secure by design’ approach, and it requires strong leadership that ensures security is a business priority, and not just a technical consideration,” it says.
“Leaders need to understand the consequences to the organisation if the integrity, availability or confidentiality of an AI system were to be compromised. There may be operational and reputational consequences, and your organisation should have an appropriate response plan in place. As a manager, you should also be particularly aware of AI-specific concerns around data security. You should understand whether your organisation is legally compliant and adhering to established best practices when handling data related to these systems.”
The NCSC says the burden of using AI safely should not fall on the individual users of AI products; customers typically won't have the expertise to fully understand or address AI-related risks. Instead, developers of AI models and systems should take responsibility for the security outcomes of their customers.
Ensuring the secure development and deployment of AI systems is paramount as they become increasingly prevalent.