
OpenAI just won a second Australian government contract after being the only company invited to bid

OpenAI is steadily embedding itself within the Australian government, with the US tech giant winning its second contract without any competition as the public sector is encouraged to embrace public generative AI tools.

The Digital Transformation Agency (DTA) last week issued new guidance recommending agencies and departments be encouraged to use public generative AI tools such as OpenAI's ChatGPT for work involving OFFICIAL-level government information.

OpenAI won its first Australian government contract in June when it landed a small $50,000 one-year deal with Treasury for "software-as-a-service".

It has now won a second, smaller $25,000 contract with the Commonwealth Grants Commission for the "provision of AI" for 12 months.

Because both contracts were worth under $80,000, the agencies were not required to open them to competitive tender, and OpenAI was the only company invited to bid.

While modest in size, the deals mark OpenAI's first foothold in the federal government and could pave the way for bigger, longer-term engagements.

OpenAI comes to Canberra

OpenAI has been actively expanding its presence in Canberra.

The company recently hired Bourke Street Advisory as its local lobbyist, according to the federal register, and sent senior executives to Australia last week to discuss potential data centre deals.

OpenAI chief global affairs officer Chris Lehane, speaking at SXSW Sydney last week, said Australia could play a pivotal role in the global AI infrastructure race.

OpenAI chief global affairs officer Chris Lehane spoke at SXSW Sydney last week about the role Australia could play. Photo: Hanna Lassen/SXSW

“Australia could create a frontier-class inference model that embeds local languages, customs and culture,” Lehane said at the conference.

“You’d have an Australian-sovereign model that boosts productivity here and exports it overseas.

“It’s chips, it’s data, it’s energy and it’s talent – that’s the new stack of power.

“Whichever nation can marshal these resources will determine whether the world is built on democratic or autocratic AI.”

GenAI in the public sector

The new OpenAI contract was revealed in the same week the DTA released new guidance for the Australian government's use of public generative AI tools, such as ChatGPT.

The advice encourages the expanded use of these tools for a range of work involving information up to the OFFICIAL level of classification.

“Generative AI is here to stay,” DTA deputy CEO Lucy Poole said in a statement.

“This guidance gives our workforce the confidence to use generative AI tools in their roles while keeping security and public trust at the centre of everything we do.

“We don’t want to be in a situation where staff, from any agency, are using these tools without proper advice.

“Ensuring staff have clear guidance on what information they can share with these services, and how, is essential to minimise risks and maximise the opportunities that AI presents to the public service.”

The framework introduces three overarching principles: protect privacy and safeguard government information; use judgement and critically assess generative AI outputs; and be able to explain, justify and take ownership of your advice and decisions.

The guidance permits public sector staff to use public generative AI tools on OFFICIAL-level government information to help with brainstorming, research, identifying publicly available research papers, suggesting ways to present program information, and assisting with data analysis and pattern identification, among other tasks.

It outlines that generative AI should not be used for any work involving sensitive information, for assessing applications or in the procurement process.

It builds on last year's Technical Standards for Government's Use of Artificial Intelligence, which defines 42 requirements across the AI system lifecycle, from design and data through to monitoring and decommissioning, to ensure responsible and consistent adoption.

The move follows an Australian National Audit Office report revealing at least 20 government entities were using AI last year without any formal policies in place.

