40 Percent of Aussie Employees Worry About Losing Job Due to AI

Almost four in ten employees in Australia worry about job losses caused by AI. (Courtesy of GetApp)
About four in ten employees in Australia worry about losing their jobs to artificial intelligence (AI), a new study has revealed.

According to a recent survey by GetApp, a business app and software discovery platform, around 41 percent of employees in Australia are concerned that generative AI could take their jobs. Additionally, 55 percent of ChatGPT users agree to some extent that ChatGPT-generated content can compete with human creations.

Generative AI refers to a type of artificial intelligence capable of generating new, original content such as images, videos, music, code, or text. It typically uses deep learning techniques and neural networks to analyse and learn from large datasets, then draws on that learning to produce content that resembles human creations. Examples of generative AI tools include ChatGPT, Bard, and DALL-E.

The survey, titled “ChatGPT: A friend or foe in the workplace?”, is based on an online poll of 463 Australian employees conducted in June. The respondents were all Australian residents employed full- or part-time who use a computer or laptop to perform daily tasks at work and use generative AI tools in their work.

According to the survey results, 33 percent of participants said they were somewhat concerned about losing their jobs to generative AI, while eight percent described themselves as apprehensive.

About a third (32 percent) of respondents believe that generative AI output could replace 11-20 percent of their professional role. Meanwhile, 36 percent of respondents said the technology has given them more time to focus on higher-value tasks.

AI (Artificial Intelligence) letters and robot miniature on June 23, 2023. (Dado Ruvic/Reuters)

The survey also found that 45 percent of respondents cited privacy and data security as their top ethical concern about using generative AI tools in the workplace, followed by misuse of AI-generated content (30 percent). Additionally, 51 percent of respondents predicted that generative AI would expose businesses to cybersecurity risks, while 41 percent foresaw regulatory compliance risks.

Meanwhile, 41 percent of ChatGPT users were concerned that they might become over-reliant on the tool and other AI tools to complete tasks, while 31 percent worried about spreading incorrect information because users treat ChatGPT’s answers as definitive. A total of 89 percent of respondents said they check ChatGPT results for errors, with 42 percent meticulously reviewing and verifying each answer before using it.

About a third (34 percent) of respondents use ChatGPT several times a week, while 30 percent use it three to 10 times a day. Text editing is the most popular use (40 percent), but the tool also serves many other purposes, such as idea generation and data analysis, each selected by 33 percent of respondents.

Nearly all (98 percent) ChatGPT users rated the technology as effective to some degree, with 37 percent describing the results as “highly effective” based on their experience using it at work. Just over half (55 percent) of users said it improves their workflow, while 44 percent cited time savings as a benefit.


Andrew Blair, Content Analyst at GetApp Australia, suggested that companies adopt policies governing employees’ use of AI technology in the workplace.

“Companies that are exploring the use and uptake of generative AI tools, particularly ChatGPT, must ensure complete transparency about the abilities and limitations of the technology and implement appropriate security measures as employees start to use it in the workplace,” he said.

Cindy Li
Author
Cindy Li is an Australia-based writer for The Epoch Times focusing on China-related topics. Contact Cindy at [email protected]