News Roundup: July 3 to July 15
Major PTE Changes, Australia Changes Looming, New TOEFL Format, HOELT News... and more!
Major PTE Changes Coming in August!
Pearson has announced some major changes to the PTE Academic test. I am finding it hard to summarize them all, as descriptions of the changes are spread over a few different PDFs and pages, but I think the following covers the main adjustments:
Two new speaking tasks will be added to the test starting in August. First is the “summarize a group discussion” task. In this task, test takers will listen to a conversation between three people for up to three minutes, and then summarize it in their own words for up to two minutes. Next is the “respond to a situation” task. In this task, test takers will listen to (and read) a description of a situation, and then answer a question about it in 40 seconds. The test will include two of each new item type.
The press release says that the test’s scoring model has been “enhanced.” I think that most of the changes are described in various documents like the new “Score Guide,” which explains how the score scales for many items have been adjusted. Where some items were once graded on a 0-3 or 0-5 scale, many are now graded on a 0-6 scale. The way that integrated questions contribute to overall section scores has also been adjusted, with some items now counting toward just one of the four main skills (instead of several).
Human raters have been added to some writing tasks. This means that both the speaking and writing sections now contain tasks that are scored by both humans and AI.
The distribution of test items has been adjusted. For instance, there will now be 5 “describe the picture” items (up from 3) and 2 “retell a lecture” items (up from 1). The total number of items on the test will increase from 52 to 65.
The test will now take about 2 hours and 15 minutes to complete (up from two hours).
You can read more on this page from Pearson.
Regular readers will know that I’ve been raving about the “summarize a group discussion” task since I first encountered it in Pearson’s “Versant by Pearson English Certificate” product (RIP) back in 2024. It’s a fun integrated task that demands strong speaking skills and some really sharp listening skills. I’m not too familiar with the “respond to a situation” task.
PTE/IELTS Concordance
I dug into the new IELTS/PTE Score Concordance. Note that the 2020 IELTS/PTE concordance study didn’t include concordance tables for the four skills. It only included a table to convert between overall PTE and IELTS scores. The new concordance study has tables for all four skills.
It is perhaps for this reason that, while the Australian Department of Home Affairs currently requires minimum section scores for visa applications, the required PTE score is the same for every section. For instance, the department defines “proficient English” as reaching 65 in each section of the PTE. “Competent English” is defined as reaching 50 in each section. And so on.
(In comparison, note how applicants who use TOEFL scores must achieve a somewhat confusing 12/13/21/18 line to reach “competent” proficiency.)
If the department adjusts required scores based on the new concordance study, some requirements could go up. Especially if IELTS is used as a sort of anchor.
For instance, “Proficient English” can currently be achieved by reaching 7 in each section of the IELTS or 65 in each section of the PTE. If PTE requirements are adjusted based on the new concordances and IELTS is used as an anchor, PTE users might need to submit section scores of: Listening 58, Reading 59, Speaking 76, Writing 69.
A higher speaking score requirement could impact the number of people who opt for Pearson’s test when pursuing an Australian visa. As has been pointed out many times, “perceived easiness” is often top of mind when it comes to test choice.
The new concordance could impact other levels as well. Currently, “Competent English” can be met with an IELTS score of 6 in every band, or a PTE score of 50 in each section. The new concordance could push the PTE requirement to Listening 47, Reading 48, Speaking 54, Writing 51.
The PTE numbers I’ve cited are the absolute lowest end of the range that compares to each IELTS band. If the DHA decided to pick a requirement from the middle of each range, the requirements could be set even higher than what I’ve suggested.
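For readers who like to tinker, here is a minimal sketch of how such a concordance-based check might work, written in Python. The minimums are simply the bottom-of-range figures quoted above; the table, the function, and the example candidate are my own illustrative inventions, not anything published by Pearson or the DHA.

```python
# Hypothetical sketch: lowest PTE section scores that concord with an
# IELTS band, using the bottom-of-range figures quoted above.
# These are illustrative numbers, not official DHA requirements.
PTE_MINIMUMS_BY_IELTS_BAND = {
    # IELTS band: minimum PTE score per section
    7.0: {"Listening": 58, "Reading": 59, "Speaking": 76, "Writing": 69},
    6.0: {"Listening": 47, "Reading": 48, "Speaking": 54, "Writing": 51},
}


def meets_requirement(pte_scores: dict, ielts_anchor_band: float) -> bool:
    """Return True if every PTE section score meets the minimum that
    concords with the given IELTS anchor band."""
    minimums = PTE_MINIMUMS_BY_IELTS_BAND[ielts_anchor_band]
    return all(pte_scores[skill] >= cut for skill, cut in minimums.items())


# Example: a hypothetical candidate aiming for "Proficient English"
# (anchored to IELTS 7 in every band).
candidate = {"Listening": 70, "Reading": 66, "Speaking": 74, "Writing": 71}
print(meets_requirement(candidate, 7.0))  # False: Speaking 74 falls short of 76
```

The point of the toy example is simply that an anchored conversion is a per-section lookup, so a single skill (here, speaking) can become the binding constraint even when the other three scores comfortably clear their cuts.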
So… Australia Visa Changes Real Soon?
It dawns on me that new requirements from the Australian Department of Home Affairs will probably kick off around August 7. That’s the date that the big PTE changes will take effect… changes which were partially (largely?) mandated by the DHA. It will be nice to finally stop checking the DHA website for updates after making my morning coffee.
A few things come to mind:
LANGUAGECERT, CELPIP and MET have all been through the arduous DHA acceptance process. As part of that process, they have all published concordance studies linking their tests to IELTS. I don’t see any reason why these three products won’t be added to the list of acceptable tests. These tests currently have fairly small volumes, but they are backed by organizations with very deep pockets (PeopleCert, Prometric and Cambridge University Press & Assessment) and will grow over time. They will draw customers away from PTE and IELTS.
As described above, the DHA now has access to concordance tables for speaking, writing, reading and listening. Accordingly, we might see adjustments to the required section scores for Australian visa applications. Notably, we might see higher PTE speaking requirements, which could slow the use of the PTE for Australian visas. In recent years that test has become somewhat dominant among individuals going to Australia. That’s partly because the required PTE scores are perceived to be easier to meet than the required IELTS scores.
Obviously the enhanced TOEFL (launching January 2026, see below) has not been approved by the DHA. Given the scope of the changes to that test, it probably never will be. One imagines that ETS will maintain a version of the classic TOEFL iBT solely for Australia-bound students (and for the handful of other use cases that are unlikely to accept the new test), but it is sometimes hard to gauge what the folks in New Jersey are thinking these days.
More on the Pearson English Express Test
In other Pearson news, a webinar about the new “Pearson English Express Test” was held a few weeks ago. Additional details about the test were provided. A few are worth repeating here. Just note that the test hasn’t launched yet, and stuff could change.
Here goes:
1. The price will be $70 in every market.
2. A second camera will be required.
3. Scoring is by AI, but includes “verification” related to security and review of possible flags.
4. Pearson is promising a generous retake policy in cases of security issues, unless blatant malfeasance is detected.
5. Initial (unofficial) scores will be reported within minutes of the test being completed. Confirmed scores will be available within 48 hours.
6. Score reports can be sent to an unlimited number of institutions at no cost.
7. The test will launch in a bunch of European and Latin American markets and Japan on October 25. It will launch worldwide in 2026.
8. Some institutions have confirmed their acceptance of the test.
Number five is interesting, as that’s something that will set Pearson’s product apart from the Duolingo English Test.
As regular readers know, at launch this test will only be used for admissions to American institutions. Accordingly, its main competitors will be the Duolingo English Test and the TOEFL Test. With a second test entering the “super low cost” category that Duolingo has had all to itself for quite a few years, I’m optimistic that we might see a price freeze for at least a little while. I’m also optimistic that the TOEFL team, facing the prospect of a second low-cost competitor in its own backyard just as it relaunches its flagship test, might be inspired to get a little more competitive (and creative) when it comes to pricing.
New TOEFL Format Revealed!
ETS has announced the new format for the TOEFL iBT. Below is a detailed rundown of what the test will contain starting January 21, 2026. Interestingly, the integrated speaking and writing tasks the test is known for will be removed. The test won’t contain an essay task, either. As has been noted, this format is extremely similar to the existing TOEFL Essentials Test. For all the details, start reading here.
Meanwhile, here’s a quick breakdown of the various items the test will contain, based on the sample tests that have been published:
Reading Tasks (18-27 minutes)
Complete the Words. This is a “fill in the missing letters” task, like on the Duolingo English Test.
Read in Daily Life. Test takers read a non-academic text of between 15 and 150 words, such as a poster, a menu, or an invoice. Then they answer multiple-choice questions about it.
Read an Academic Text. This is a roughly 200-word academic text followed by five multiple-choice questions.
Listening Tasks (18-27 minutes)
Listen and Choose a Response. Test takers hear a single sentence and choose the correct response from among four choices.
Listen to a Conversation. Test takers hear a short conversation (ten turns in the sample) and answer multiple-choice questions about it. Topics include everyday life situations.
Listen to an Announcement. Test takers listen to a campus or classroom announcement and answer multiple-choice questions about it.
Listen to an Academic Talk. Test takers listen to a short lecture (100 to 250 words) and answer multiple-choice questions about it.
Writing Tasks (23 minutes)
Build a Sentence. Test takers unscramble a mixed-up sentence. The sentence is part of an exchange between students.
Write an Email. Test takers have seven minutes to write an email regarding a specific scenario.
Writing for an Academic Discussion. Same as the current TOEFL.
Speaking Tasks (8 minutes)
Listen and Repeat. Test takers listen to seven sentences and repeat each one.
Take an Interview. Test takers will be asked four questions about a given topic. They will have 45 seconds to answer each one. No preparation time is provided.
The whole test will take between 67 and 85 minutes to complete.
ETS is being a little cagey with phrasing, but it appears that the revised test will be wholly scored by AI (which has been trained on human ratings). They note:
“The Speaking and Writing responses will be scored by the ETS proprietary AI scoring engine according to the criteria outlined in the scoring guides. These engines integrate the most advanced natural language processing (NLP) techniques, combining cutting edge research with extensive operational expertise for enhanced performance.”
And:
“Human rating remains a critical component of the overall scoring process of TOEFL’s Writing and Speaking tasks because the automated scoring engines are trained on human ratings. Human ratings not only set the standard for machine learning but also provide oversight to ensure the accuracy and reliability of our scoring.”
Thoughts on the TOEFL Changes
I penned a few thoughts on the changes for the blawg a few days after the sample tests were published:
The revisions seem to be a turning of the page on the “iBT Era” (2005 to 2025) of TOEFL. I think that everyone reading this is familiar with the characteristics that differentiate the TOEFL iBT from the original TOEFL (and from other tests of its era). Three things come to mind. First is the inclusion of so-called “integrated” tasks which test multiple skills at the same time. Second is the fact that the TOEFL purports to be made up of “100% academic content.” And third is that all test items are meant to simulate, as closely as possible, the sorts of things students do in an academic environment. These features have been mentioned again and again in marketing material used to promote the TOEFL iBT.
None of the above things will be true of the revised TOEFL. That’s not a complaint. But it is worth noting. Some other day we can talk about how, from a business perspective, the iBT never really worked out. It could be argued that the switch from “classic TOEFL” to “TOEFL iBT” is what gave IELTS the opening to become the juggernaut that it is today. But again… that’s a conversation for another day.
Speaking of the business and history of testing, it is worth noting that the revised TOEFL is almost identical to the TOEFL Essentials Test, which launched in 2021 as a low-cost alternative to the main TOEFL iBT. But the Essentials Test seems to have been largely rejected by both test takers and score users. That ETS has plucked this particular product from benign obscurity in the back pages of the TOEFL website to serve this new function is utterly fascinating.
I am very interested in cost. Everything we’ve learned so far suggests that the new TOEFL will be cheaper to develop, cheaper to deliver, and cheaper to score. Some of those savings ought to be passed on to test takers.
I would love to see more research about score equivalencies. The test will maintain the traditional 0-120 score scale for two years. But this is a wholly different test. Can I be sure that 73 points on the old TOEFL is totally equivalent to 73 points on the new test?
Some clarification on AI vs human scoring would be welcome.
At this point, ETS should begin the process of winding down the test centers. The idea of hauling ass to a test center for an 85-minute test is bonkers. Eliminating test centers will be a long and tedious process, but it ought to be done. Maintain a few test centers where necessary, but do away with the rest. If someone really needs a test center and one isn’t available… they can take a different test.
Eliminating the test centers would be the first step in disentangling ETS from the NEEA. For everyone’s sake, that needs to happen eventually.
I’m curious what the NABP will think about this test. This could represent a golden opportunity for test makers like Michigan Language Assessment who already have a focus on health care professionals.
Did anyone tell IELTS about the changes when they were deciding whether to spend all that time and money on a concordance study comparing IELTS to the old TOEFL?
TOEFL Office Hours #3
Now that we’ve gotten a look at the enhanced TOEFL iBT, I’ve scheduled a third “Office Hours” meeting. That will go down on July 23.
You can register for the meeting right here. Remember that since this is a meeting (not a webinar) Zoom won’t send you a reminder. Please set a reminder in your own calendar app.
These chats have traditionally been focused on people in the test prep space, but for this one I’d like to widen the audience to anyone with a general or professional interest in tests. About seventy-five people stuck around for the full hour in June, and hopefully we can do a bit better this month.
We’ll talk about the new item types and share our thoughts about the changes. The changes are even bigger than most of us figured (indeed, just about everything will change) so I am sure a spirited discussion will emerge. As always, to ensure that everyone feels comfortable sharing their thoughts, I won’t record anything.
Officially these meetings run for one hour, though sometimes keen participants like to stick around and shoot the breeze for some time after. Office Hours is a social event!
HOELT Might Be Test Center + Remote?
The UK Home Office has posted a fourth request for information regarding the HOELT. As always, Polly Nash has written up all the key details in The PIE.
Interestingly, according to the Home Office the updated request is being undertaken to “understand the viability of transitioning to a digital service model for English Language Testing” and more specifically “to gather market insights on newly available and emerging technology in relation to remote testing.”
That’s a bit of a shocker. The original tender did not mention remote testing (nor did any of the earlier updates).
But even if this approach is deemed viable, the HOELT is unlikely to be wholly remote, as the tender also mentions “that there are 268 test centres operating across 142 countries globally.”
That’s an oddly specific pair of numbers and a curious verb tense. But maybe I’m missing something. In any case, note that there are 1389 SELT-approved test centers outside the UK right now.
Cambridge Study on Tests Now Available
The long, long, long awaited Cambridge study about test use in the UK is now available.
For all the heavy lifting the preliminary results have been doing in IELTS marketing over the past year, there is surprisingly little in here about the value (or lack thereof) of specific new tests. That’s not meant to be a criticism, of course. The authors of the study seem to have had a higher purpose.
That’s not to say it is totally bereft of that sort of thing. The study contains statements like this:
“There is a notable divergence in the perceived value of various tests among different groups within institutions. While tests like Cambridge Qualifications are praised for their ability to prepare students for academic study, others, such as the Oxford International Education Group’s (OIEG) ELLT and Duolingo, are viewed with scepticism. Specific concerns were raised about the validity, security, and overall suitability of these newer, more efficient or less established tests. One survey respondent expressed dissatisfaction with ‘the recent decision to accept OEIG’s online ELLT for the China market only (in order to boost recruitment)’ due to its lack of credibility and associated security concerns.”
And this:
“For instance, one respondent noted that ‘students who came with the Duolingo award were not in practice equipped to deal with HE life and study’, echoing concerns found in studies about the adequacy of such tests.”
And this:
“One of the most consistent findings is that IELTS is widely regarded as the international standard or ‘common currency’.”
One imagines that the IELTS partners will continue to lean on this research study when crafting marketing materials in the years ahead. Score users might be wise to keep in mind that the criticisms mentioned in the study are anecdotal and not presently supported by comparative data about actual student outcomes. Often, the statements seem to be based on the perspectives of very small numbers of individuals.
Apart from the above, most of the study highlights concerns about English fluency on campus (quite separate from the use of particular tests) and provides recommendations for how to properly assess the worth of new tests.
Paper IELTS Discontinued in Uzbekistan
According to press reports, paper-based IELTS tests have been suspended in Uzbekistan. This comes amidst concerns about test questions leaking ahead of certain administrations of the IELTS in that country. Test takers who have already registered for a paper-based IELTS have been offered refunds and free transfers to a computer-based administration.
Regular readers will know that the IELTS partners have made several changes to paper-based testing policies in recent months. Earlier this year, a policy was instituted requiring test takers opting for a paper-delivered test to have legal residency in the country of testing. This change came after analysis by the IELTS partners linked non-resident test takers to fraudulent test-day activity, as reported by the South China Morning Post.
The paper format was completely eliminated in Vietnam at the end of March, and later this month IELTS for UKVI in Bangladesh will no longer have a paper-based option.
TOEIC Cheating Scandal Gets Worse
This TOEIC cheating scandal in Japan is starting to spiral into something really dreadful. According to the Japan Times, 803 TOEIC scores have been cancelled (so far) by the administrators of the TOEIC (IIBC and ETS). The scores go back as far as May 2023. All of the test takers with cancelled scores registered for the test using the same address, which made it possible for them to take the test in the same test center.
There is an assumption in some quarters that in-person testing is necessarily better than at-home testing. Given the above – and ongoing concerns related to paper-based IELTS testing – that might not be true.
Any test delivery method is only as good as the procedures established by a test maker and the enforcement of them. And with thousands upon thousands of test centers around the world, enforcement can sometimes be a challenge.
Elderly readers might recall that the SELT system was created partially in response to the abject bungling of test center delivery by the Educational Testing Service between 2011 and 2014.
Pearson Holds Nursing Summit
I saw that Pearson recently wrapped up the first “PTE Global Nursing Pathway Expo” in the Philippines. Below this item is a picture I stole from LinkedIn.
One of the strengths of Pearson is its ability to identify and foster niche(ish) use cases where the PTE has room to grow (and quickly).
They do this, partially, by going to the people. Meeting them where they are, and all that. Other testing firms could follow their lead.
As I’ve written a few times here, from 2022 to 2024 the percentage of nurses submitting PTE scores to the CGFNS (which does visa screening for nurses who wish to head to the USA) increased from 7% to 50%. The percentage submitting IELTS scores decreased from 84% to 35% in the same period.
Supporting this Newsletter
I’ve received some very generous pledges of financial support from readers already. However, due to onerous banking regulations here in Korea, it is unlikely that I’ll ever be able to turn on Substack’s monetization feature. Anyone who really wants to make a donation can do so via a Ko-Fi page I’ve set up. You can sign up to make a one-time or monthly donation.