Students across Australia have begun the new school year using pencils, pens and keyboards to learn to write.

In workplaces, machines are also learning to write, so effectively that within a few years they may write better than humans.

Sometimes they already do, as apps like Grammarly demonstrate. Indeed, much of the everyday writing humans now do may soon be done by machines with artificial intelligence (AI).

The predictive text commonly used by phone and email software is a type of AI writing that countless humans use every day.
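At its simplest, this kind of next-word prediction can be sketched as a bigram frequency model: count which word most often follows each word in some training text, then suggest the most frequent follower. This is a toy illustration only, not how commercial keyboards actually work, and all names and the tiny corpus below are invented for the example:

```python
from collections import defaultdict, Counter

def train_bigrams(text):
    """Count, for each word, which words follow it in the training text."""
    words = text.lower().split()
    following = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        following[current][nxt] += 1
    return following

def suggest(model, word, n=3):
    """Return up to n words most often seen after `word`."""
    return [w for w, _ in model[word.lower()].most_common(n)]

corpus = (
    "students learn to write and students learn to read "
    "and teachers learn to teach"
)
model = train_bigrams(corpus)
print(suggest(model, "learn"))  # "to" always follows "learn" in this corpus
```

Real predictive-text systems use far larger models and context windows, but the underlying idea is the same: predict the likeliest continuation from past data.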

According to industry research organisation Gartner, AI and related technology will automate the production of 30% of all content found on the web by 2022.

Literacy increasingly means and includes interacting with and critically evaluating AI.

This means our children should no longer be taught just formulaic writing. Instead, writing education should encompass skills that transcend the capacities of artificial intelligence.

Back to basics, or further away from them?

After the 2019 PISA (Programme for International Student Assessment) results showed Australian students sliding backwards in numeracy and literacy, then education minister Dan Tehan called for schools to go back to basics. But computers already have the basics mastered.

Three major reports (from the NSW Teachers' Federation, the NSW Education Standards Authority, and the NSW, QLD, Victorian and ACT governments) have criticised school writing for having become formulaic, to serve NAPLAN (the National Assessment Program – Literacy and Numeracy).



In some schools, students write essays with sentences fulfilling specified functions, in specified orders, and in specified numbers and arrangements of paragraphs. These can then be marked by computers to demonstrate progress.

This template writing is exactly the sort of standardised practice robot writers can do.

Are you scared yet, human?

In 2019, the New Yorker magazine ran an experiment to see if IT company OpenAI's natural language generator GPT-2 could write an entire article in the magazine's distinctive style. The attempt had limited success, with the generator making many errors.

But by 2020, GPT-3, the new version of the machine, trained on much more data, wrote an article for The Guardian newspaper with the headline "A robot wrote this entire article. Are you scared yet, human?"


This latest, much improved generator has implications for the future of journalism, as the Elon Musk-funded OpenAI invests ever more in research and development.

Robots have voice but no soul

Back in school, teachers experience pressure to teach writing for student success in narrowly defined writing tests.

But instead, the prospect of human obsolescence or "technological unemployment" must drive urgent curriculum developments based on what humans are learning AI cannot do, especially in relation to creativity and compassion.

AI writing is said to have voice but no soul. Human writers, as the New Yorker's John Seabrook says, give "color, personality and emotion to writing by bending the rules". Students, therefore, must learn the rules but be encouraged to break them.



Creativity and co-creativity (with machines) should be fostered. Machines are trained on a finite amount of data, to predict and replicate, not to innovate in meaningful and deliberate ways.

Purposeful writing

AI cannot yet plan and does not have a purpose. Students must hone skills in purposeful writing that achieves their communication goals.

Unfortunately, the NAPLAN regime has hampered the teaching of writing as a process that involves planning and editing. This is because it favours time-limited, exam-style writing for no audience.

Students must practise writing in which they are invested, that they care about, and that they hope will effect change in the world as well as in their real, known readers. This is what machines cannot do.

AI is not yet as complex as the human brain. Humans detect humour and satire. They know words can have multiple and subtle meanings. Humans are capable of perception and insight; they can make advanced evaluative judgements about good and bad writing.

There are calls for humans to become expert in sophisticated forms of writing and in editing writing created by robots, as vital future skills.

Robots have no morality

Nor does AI have a moral compass. It doesn't care. OpenAI's managers originally refused to release GPT-3, ostensibly because they were concerned about the generator being used to create fake material, such as product reviews or election-related commentary.

AI writing bots have no conscience and may need to be eliminated by humans, as happened with Microsoft's racist Twitter prototype, Tay.

Critical, compassionate and nuanced assessment of what AI produces, the management and monitoring of content, and decision-making and empathy with readers are all part of the "writing" roles of a democratic future.

Skills for the future

As early as 2011, the Institute for the Future identified social intelligence ("the ability to connect to others in a deep and direct way"), novel and adaptive thinking, cross-cultural competency, transdisciplinarity, virtual collaboration and a design mindset as essential skills for the future workforce.

In 2017, a report by the Foundation for Young Australians found complex problem-solving skills, judgement, creativity and social intelligence would be vital for students' futures.



This is in stark contrast to parroting irrelevant grammar terms such as "subordinate clauses" and "nominalisations", being able to spell "quixotic" and "acaulescent" (words my daughter learnt by rote in primary school recently), or writing to a formula.

Teaching and assessment of writing must catch up with the real world.

This article was originally published at theconversation.com