Federal and state governments have just released a national framework for generative AI in schools. This paves the way for generative AI – algorithms that can create new content – to be used routinely in classrooms across the country.

This provides much-needed guidance, a full year after the launch of ChatGPT. Over the past 12 months, schools have had a variety of responses to the technology, from outright banning it to trying to incorporate it into learning.

What is in the framework and what's missing?

What is the framework?

The framework was agreed by state and federal education ministers in October and released publicly last week.

It is designed to help schools use generative AI "in a safe and effective way". It notes it has "great potential to help teaching and learning and reduce administrative workload in Australian schools". But at the same time it warns of risks and consequences, including

the potential for errors and algorithmic bias in generative AI content; the misuse of personal or confidential information; and the use of generative AI for inappropriate purposes, such as to discriminate against individuals or groups, or to undermine the integrity of student assessments.

Federal Education Minister Jason Clare also stressed "schools should not use generative AI products that sell student data".

What is in the framework?

The framework itself is just two pages long, and includes six overarching principles and 25 “guiding statements”. The six principles are:

  • teaching and learning, including schools teaching students about how these tools work, including their potential limitations and biases

  • human and social wellbeing, including using tools in a way that avoids reinforcing biases

  • transparency, including disclosing when tools are used and their impact

  • fairness, including access for people from diverse and disadvantaged backgrounds

  • accountability, including schools testing tools before they use them, and

  • privacy, safety and security, including the use of "robust" cyber-security measures.

The framework will be reviewed every 12 months.

Caution is required

The framework does important work acknowledging the opportunities of this technology, while noting the importance of wellbeing, privacy, safety and security.

However, some of these concepts are much less straightforward than the framework suggests. As experts in generative AI in education, we have moved from optimism to a much more cautious stance about this technology over the past 12 months. As UNESCO has recently warned,

the speed at which generative AI technologies are being integrated into education systems in the absence of checks, rules or regulations, is astonishing.

The framework puts an extraordinary onus on schools and teachers to do high-stakes work for which they may not be qualified, or do not have the time or funding to complete.

For example, the framework calls for “explainability” – but even the developers of AI models struggle to fully explain how they work.

The framework also calls on schools to do risk assessments of algorithms, design appropriate learning experiences, revise assessments, consult with communities, learn and apply intellectual property rights and copyright law, and generally become expert in the use of generative AI.

It is not clear how this can possibly be achieved within existing workloads, which we know are already stretched. This is especially so when the nature and ethics of generative AI are complex and contested. We also know the technology is not foolproof – it makes mistakes.

Here are five areas we think should be included in any future version of this framework.

1. A more honest stance on generative AI

We need to be clear that generative AI is biased. This is because it reflects the biases of its training materials, including what is published on the internet.

Such limited datasets are created largely by those that are white, male and United States or Western-based.

For example, a current version of ChatGPT does not speak in or use Australian First Nations words. There may be valid reasons for this, such as not using cultural knowledges without permission. But this highlights the whiteness of its "voice" and the problems inherent in requiring students to use or rely on it.

2. More evidence

The use of technology does not automatically improve teaching and learning.

So far, there is little research demonstrating the benefits of generative AI use in education. In fact, a recent UNESCO report confirmed there is little evidence of any improvement to learning from the use of digital technology in classrooms over many years.

But we do have research showing the harms of algorithms. For example, AI-driven feedback narrows the kinds of writing students produce and privileges white voices.

Schools need support to develop processes and procedures to monitor and evaluate the use of generative AI by both staff and students.

3. Acknowledging dangers around bots

There is long-standing research demonstrating the risks of chatbots and their capacity to harm human creativity and critical thinking. This happens because humans appear to automatically trust bots and their outputs.

The framework should aim to clarify which (low-stakes) tasks are and are not suitable for generative AI, for both students and teachers. High-stakes marking, for instance, should be done by humans.

4. Transparency

So far, the framework seems to focus on students and their activities.

All use of generative AI in schools should be disclosed. This should include teachers using generative AI to prepare teaching materials and plan lessons.

5. Acknowledging teachers’ expertise

The global education technology ("edtech") market was estimated to be worth about US$300 billion (A$450 billion) as of 2022. Some firms argue edtech can be used to monitor students' progress and take over roles traditionally done by teachers.

Australia's national education policies need to ensure teachers' roles are not downgraded as AI use becomes more common. Teachers are experts in more than just subject matter. They are experts in how to teach various disciplines and in their students' and communities' needs.

This article was originally published at theconversation.com