July 31, 2024
Explore the challenges facing Australia in developing its own large language model, including funding issues, technological constraints, and data availability. Learn why this ambitious AI project remains a distant dream.

Recently, there's been a growing discussion in tech circles about the need for an Australian-owned and developed large language model (LLM). As an Australian who keeps up with the latest tech news, I find this idea intriguing but also quite challenging. While a homegrown LLM tailored to Australian culture and sensitivities is appealing, several significant hurdles make this a distant possibility.

 

The development of LLMs has been dominated by tech giants like OpenAI, Google, and Microsoft, primarily based in the United States. This has led to concerns about these models' cultural bias and relevance for users outside the US, including Australians. However, the path to creating an Australian LLM is fraught with challenges beyond technological hurdles.

 

[Image: a robot wearing an Australian flag as a cape]

 

The Case for an Australian LLM

 

Language models are inherently biased by the data they are trained on. Most current LLMs, such as those developed in the US, reflect American cultural norms and perspectives. This bias can lead to outputs that are not culturally sensitive or relevant to Australians. The argument for an Australian LLM trained on local data therefore makes sense: it could deliver more accurate, culturally aware, and contextually relevant responses for Australian users.

 

Research has shown that LLMs can exhibit biases related to race, gender, and cultural background. A study published in the journal "Nature Machine Intelligence" in 2021 found that popular language models like GPT-3 displayed significant biases, often reflecting the societal biases in their training data. For Australians, this could mean misunderstandings of local idioms, misrepresentations of Indigenous cultures, or incorrect interpretations of Australian history and politics.

 

An Australian LLM could potentially address these issues by incorporating the following (a rough data-mix sketch follows the list):

  1. Australian English: Including local slang, pronunciations, and grammatical nuances.
  2. Indigenous languages and cultures: Properly representing and respecting Aboriginal and Torres Strait Islander languages and cultural concepts.
  3. Australian history and politics: Accurately reflecting the nuances of Australian governance, historical events, and political landscape.
  4. Local context: Understanding Australian geography, flora and fauna, sports, and cultural events.
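
To make that concrete, here is a minimal sketch of how a training corpus might be weighted across those kinds of sources. The source names, document counts, and proportions are purely illustrative assumptions, not a real recipe.

```python
# Illustrative sketch only: weighting an Australian training corpus across source types.
# The source names, document counts, and proportions are assumptions for demonstration.
from dataclasses import dataclass

@dataclass
class CorpusSource:
    name: str
    documents: int   # documents available from this source
    weight: float    # relative sampling weight in the training mix

sources = [
    CorpusSource("australian_english_web", documents=5_000_000, weight=0.45),
    CorpusSource("indigenous_language_texts", documents=50_000, weight=0.10),
    CorpusSource("australian_history_politics", documents=400_000, weight=0.25),
    CorpusSource("local_context_media", documents=1_200_000, weight=0.20),
]

def sampling_plan(sources: list[CorpusSource], total_samples: int) -> dict[str, int]:
    """How many documents to draw from each source for one pass over the mix."""
    total_weight = sum(s.weight for s in sources)
    return {s.name: round(total_samples * s.weight / total_weight) for s in sources}

print(sampling_plan(sources, total_samples=1_000_000))
```

In practice, of course, the hard part is not the sampling arithmetic but sourcing the Indigenous-language and local-context material ethically and at sufficient scale.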

 

Additionally, having such technology owned by an Australian entity would reduce reliance on foreign tech giants. That goal is consistent with the Australian Government's "Digital Economy Strategy 2030", which aims to position Australia as a leading digital economy and society by 2030.

 

The Funding Challenge

 

However, the primary obstacle to developing an Australian LLM is funding. Australian venture capitalists (VCs) are notoriously risk-averse, and that conservatism, reinforced by the cultural "tall poppy syndrome", discourages investment in ambitious, high-growth projects.

 

The "tall poppy syndrome" is a cultural phenomenon where people of high status are resented, attacked, cut down, or criticised because their talents or achievements distinguish them from their peers. This attitude has been widely discussed in Australian business and entrepreneurial circles, and its impact on the tech sector is significant.

 

According to a report by StartupAUS, an advocacy group for Australian startups, total venture capital investment in Australia was AUD 1.96 billion in 2020. While this might seem substantial, it pales in comparison with the US, where venture capital investments totalled around USD 130 billion in the same year. This disparity in funding availability makes it extremely challenging for Australian startups to compete globally, especially in resource-intensive fields like AI and machine learning.

 

Historically, successful Australian tech companies like Canva and Atlassian have had to look overseas for funding because they could not secure the necessary capital locally, and some have listed on foreign exchanges. Canva, for instance, raised significant funding from US-based investors such as Sequoia Capital, alongside local firm Blackbird Ventures, before reaching unicorn status. Atlassian, another Australian tech success story, chose to list on the NASDAQ rather than the Australian Securities Exchange (ASX) because of better access to capital and tech-savvy investors.

 

Given this context, it is highly unlikely that Australian VCs would fund the expensive and resource-intensive development of a large language model. Training GPT-3, one of the most advanced LLMs of its generation, reportedly cost OpenAI around USD 4.6 million in computing power alone. This figure doesn't include research costs, data acquisition, or ongoing maintenance.

 

Technological Constraints

 

Apart from funding, the technological infrastructure required for training an LLM is another significant barrier. Training a state-of-the-art LLM necessitates access to thousands of high-performance GPUs and substantial computing power, typically available only in large data centres.

 

To put this into perspective, GPT-3, which has 175 billion parameters, required an estimated 3.14 × 10^23 floating-point operations (FLOPs) of compute during pre-training. That is equivalent to running thousands of high-end GPUs continuously for several weeks.
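
As a back-of-the-envelope check, the commonly used approximation of roughly six floating-point operations per parameter per training token reproduces that figure and translates it into GPU time and cost. The GPU throughput, utilisation, and price per GPU-hour below are assumptions for illustration only.

```python
# Back-of-the-envelope training-compute estimate. All hardware and price figures
# below are illustrative assumptions, not quoted numbers.
params = 175e9                      # GPT-3 parameter count
tokens = 300e9                      # approximate tokens GPT-3 was trained on
flops = 6 * params * tokens         # common ~6 * N * D approximation
print(f"Training compute: {flops:.2e} FLOPs")        # ~3.15e23, close to the figure above

peak_flops_per_gpu = 312e12         # assumed A100-class BF16 peak, FLOPs per second
utilisation = 0.30                  # assumed sustained fraction of peak in practice
gpu_seconds = flops / (peak_flops_per_gpu * utilisation)
gpu_days = gpu_seconds / 86_400
print(f"About {gpu_days:,.0f} GPU-days, e.g. ~{gpu_days / 30:,.0f} GPUs for a month")

dollars_per_gpu_hour = 2.0          # assumed cloud price per GPU-hour
cost = gpu_days * 24 * dollars_per_gpu_hour
# The result depends heavily on the GPU generation and pricing assumed, which is why
# published cost estimates for the same training run vary by millions of dollars.
print(f"Roughly USD {cost / 1e6:.1f} million in compute at that rate")
```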

 

While Microsoft's investment in OpenAI gives it access to exactly these resources through Azure's data centres, Australia lacks comparable facilities. The data centres that do exist here may not have the capacity or the latest GPU hardware required for workloads of this kind.

 

According to the Australian Government's "State of the Data and Digital Nation" report, while Australia has progressed in digital infrastructure, it still lags behind global leaders in areas like high-performance computing. The report notes that Australia ranks 68th globally for average fixed broadband speeds, which could pose challenges for data-intensive operations like training LLMs.

 

Moreover, the energy consumption required for training and running large AI models is substantial. A study published in the journal "Energy and Environmental Science" estimated that training a large AI model can emit as much carbon as five cars over their lifetimes. Given Australia's commitments to reducing carbon emissions, the environmental impact of developing and maintaining an LLM could be a significant concern.
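
For a sense of scale, a crude emissions estimate can be built from GPU power draw, data-centre overhead, and a grid emissions factor. Every number in the sketch below is an assumption chosen for illustration, not a measurement of any real training run.

```python
# Crude training-energy and emissions estimate. Every value is an assumption
# chosen for illustration, not a measurement of any real training run.
gpu_count = 1_000                 # GPUs in the hypothetical training cluster
gpu_power_kw = 0.4                # assumed average draw per GPU (~400 W)
pue = 1.4                         # assumed data-centre overhead (power usage effectiveness)
training_days = 30                # assumed length of the training run

energy_kwh = gpu_count * gpu_power_kw * pue * training_days * 24
emission_factor = 0.7             # assumed kg CO2-e per kWh for a coal-heavy grid
tonnes_co2 = energy_kwh * emission_factor / 1_000

print(f"Energy: {energy_kwh:,.0f} kWh, emissions: {tonnes_co2:,.0f} tonnes CO2-e")
```

With those assumptions the run lands in the hundreds of tonnes of CO2-equivalent, roughly the order of magnitude the five-car comparison implies.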

 

[Image: a safe locked up, bursting at the seams with newspaper clippings]

 

Data Availability

 

Finally, data availability is a crucial factor. Training an LLM requires vast amounts of diverse, high-quality data. In Australia, news companies have guarded their content closely, even seeking payment from platforms like Facebook for linking to their articles.

 

The News Media Bargaining Code, introduced by the Australian Government in 2021, requires tech giants like Google and Facebook to pay news publishers for content. While this legislation aims to support the Australian news industry, it could create barriers for AI researchers seeking to use news content to train language models.

 

Moreover, the diversity of data needed to create a truly representative Australian LLM is difficult to obtain. With a population of about 25 million, Australia generates far less online content, across far fewer domains, than countries like the US or China.

 

Acquiring the necessary data for training would likely involve significant costs. Without sufficient and accessible data, creating an Australian LLM that can compete with its international counterparts becomes even more challenging.

 

Legal and Ethical Considerations

 

Beyond the technical and financial challenges, there are also legal and ethical considerations. Australia has strict privacy laws, including the Privacy Act 1988 and the Australian Privacy Principles, which could impact the collection and use of data for training an LLM.
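
Complying with those obligations would, at a minimum, mean scrubbing personal information from any scraped text before it enters a training corpus. The patterns below are a minimal, illustrative sketch, not a compliance solution; real de-identification under the Privacy Act requires far more than regular expressions.

```python
# Minimal, illustrative sketch of stripping obvious personal identifiers from raw text
# before it enters a training corpus. Genuine Privacy Act compliance requires far more
# than regular expressions; this only shows the kind of preprocessing step involved.
import re

PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "au_phone": re.compile(r"(?:\+61|0)[2-478](?:[ -]?\d){8}"),  # rough Australian phone shapes
    "id_number": re.compile(r"\b\d{3}[ -]?\d{3}[ -]?\d{3}\b"),   # nine-digit strings resembling TFNs
}

def redact(text: str) -> str:
    """Replace anything matching the patterns above with a placeholder token."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label.upper()}]", text)
    return text

print(redact("Contact Jo on 0412 345 678 or jo@example.com.au"))
```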

 

There are ongoing debates about the ethical implications of AI and large language models. Any Australian LLM project would need to address global concerns such as bias, misinformation, and the potential misuse of AI-generated content.

 

The Australian Human Rights Commission has called for a national strategy on AI, emphasising the need to protect human rights in the development and use of AI technologies. Any Australian LLM project would need to navigate these complex ethical and legal landscapes, adding another layer of challenge to an already difficult endeavour.

 

Potential Alternatives

 

While a fully Australian-developed LLM might be out of reach, there are potential alternatives that could address some of the concerns:

 

  1. Fine-tuning existing models: Instead of building an LLM from scratch, Australian researchers could focus on fine-tuning existing models with Australian data (a rough sketch of what this might look like follows the list). This approach would be far less resource-intensive while still improving a model's performance on Australian-specific tasks.
  2. Collaboration with international partners: Australian institutions could partner with global tech companies or research institutions to develop LLMs that are more inclusive of Australian perspectives.
  3. Focus on niche applications: Rather than attempting to create a general-purpose LLM, Australian researchers could develop smaller, specialised models for specific Australian contexts, such as legal or healthcare applications.
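
To illustrate the first option, adapting an existing open-weights model with the Hugging Face libraries might look something like the sketch below. The base model name, data file, and hyperparameters are placeholders, and a real run would need substantial GPU memory or parameter-efficient methods such as LoRA.

```python
# Illustrative sketch only: adapting an existing open-weights model to Australian text
# with the Hugging Face libraries. The model name, data file, and hyperparameters are
# placeholders; a real run needs substantial GPU memory (or parameter-efficient methods).
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

base_model = "mistralai/Mistral-7B-v0.1"          # placeholder open-weights base model
tokenizer = AutoTokenizer.from_pretrained(base_model)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(base_model)

# Hypothetical corpus of Australian text, one document per line.
dataset = load_dataset("text", data_files={"train": "australian_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="aus-llm-finetune",
                           per_device_train_batch_size=1,
                           num_train_epochs=1),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```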

 

Conclusion

 

While the concept of an Australian-centric large language model is undoubtedly appealing, several formidable obstacles make its realisation unlikely, at least for now. The lack of risk appetite among local VCs, insufficient technological infrastructure, and limited access to the necessary data collectively hinder the development of such a model.

 

The challenges facing the development of an Australian LLM reflect broader issues in the country's tech ecosystem, including the need for more investment in digital infrastructure, a more supportive environment for high-risk, high-reward tech ventures, and strategies to compete in the global AI race.

 

Until these fundamental issues are addressed, the dream of an Australian LLM will remain just that—a dream. However, this doesn't mean Australia can't contribute to the field of AI and language models. By focusing on alternatives like fine-tuning existing models, international collaborations, or developing specialised applications, Australia can still play a significant role in shaping the future of AI technology.

 

Australia must find its niche and leverage its unique strengths as the global AI landscape evolves. While a homegrown LLM might not be on the horizon, Australia can still strive to be at the forefront of ethical AI development, specialised applications, and AI governance frameworks that reflect its values and priorities.

 
