The Importance of Diligence for Businesses in Utilising Generative AI Applications

2024-09-09

By Kevin Shepherdson, Founder & CEO of Straits Interactive


With its ability to generate text, images, and even code, I believe that generative AI offers businesses unparalleled avenues for creativity, automation, and personalisation.

This year, as businesses in various sectors increasingly embrace generative AI technologies to drive digital transformation, I believe we can expect enhanced productivity and value creation across many verticals including marketing, customer service, and HR. 

This mainstream adoption of generative AI has boomed, creating a need for greater awareness and diligence across organisations.

According to a recent internal study, the use of this breakthrough innovation has produced a 62 percent increase in productivity, resulting in a time saving of 26 hours per week.

Make no mistake, though, as Generative AI adoption becomes more widespread, the risks to organisations are heightened in tandem, particularly when it comes to data privacy, security, and ethics.

It is important to remember that not all apps are created equal. From industry leaders like OpenAI, to startups leveraging their APIs, to existing apps that integrate generative AI functionalities, each type of software poses unique challenges. Therefore, thorough due diligence must be conducted when adopting generative AI apps across core, clone, and combination applications.

Core applications, developed by pioneering leaders in the generative AI sector, such as OpenAI and Midjourney, drive innovation with relentless research and development. However, the recent leakage of user conversations highlights the need for heightened governance alongside advancements.

When it comes to clone apps, startups and individual developers, funded by venture capitalists, leverage the APIs of core apps to create solutions for specific niches or industries. While they play a pivotal role in the democratisation and commercialisation of generative AI, questionable privacy practices persist in this category.

Last but not least, combination apps, existing applications that have incorporated generative AI features such as Microsoft Copilot, can often expose less tech-savvy users to potential data privacy breaches.

Our research on 100 mobile clone apps using OpenAI's GPT APIs revealed significant discrepancies between declared data safety practices and actual behaviour, posing potential privacy risks. According to a recent study covering 113 popular apps, a significant number of generative AI apps currently fall short of GDPR standards.

These findings underscore the urgency for businesses to exercise caution and stringent due diligence in their AI adoption strategies. I believe businesses need to place greater emphasis on verifying software providers' data handling policies, including privacy policies and terms of use, as well as compliance with local data protection laws in the user's country, in order to establish an app's trustworthiness.

Utilising generative AI apps that fall short of these standards not only jeopardises data privacy and security but also undermines organisational integrity and trust. Businesses that develop or deploy generative AI must exercise caution, especially in areas like customer service, where AI-powered chatbots might inadvertently expose sensitive information.

Developers of generative AI apps must remain vigilant against biases in training data sets to prevent the propagation of biased patterns into outputs. I would advise them to implement ongoing monitoring and mitigation efforts, if they have not already, to ensure their systems remain fair, ethical, and reliable.

Robust governance frameworks should be established to oversee the development, deployment, and monitoring of AI applications within an organisation. This involves clearly defining roles and responsibilities, establishing accountability mechanisms, and implementing transparency measures to ensure stakeholders understand the AI systems' capabilities and limitations.

Additionally, data privacy and security must be prioritised, adhering to relevant regulations such as the newly passed EU AI Act or Singapore’s PDPC Advisory Guidelines on the use of personal data in AI Recommendation and Decision Systems, as well as industry-specific standards.

Regular audits and assessments should be conducted to evaluate the ethical and legal implications of AI usage within the organisation, with mechanisms in place to address any identified issues promptly. By integrating AI governance into broader organisational strategies, organisations can harness the benefits of AI while safeguarding against potential pitfalls and promoting responsible innovation.

The integration of AI technologies across the organisation and the advancement of generative AI's multimodal capabilities call for data professionals to upskill, evolving from mere custodians of information into guardians of ethical AI practices. Multimodal capabilities demand enhanced privacy management skills to mitigate risks across different data types and operational aspects of generative AI, such as image and video processing, voice recognition, and other sensory data.

According to a recent Data Protection Excellence Network study tracking data protection and governance jobs in Singapore, demand for data governance roles grew 173 percent year on year from 2022 to 2023. This suggests that data protection officers and data governance professionals who enhance their AI and AI governance knowledge will prove invaluable as firms navigate an ever-evolving generative AI landscape.

By practising the requisite diligence and prioritising sound governance practices, businesses can harness the potential of generative AI while safeguarding against its risks and vulnerabilities, for sustainable success.


This article was first published in SME Magazine on 18 July 2024. No online version is available yet.


