If language schools are to maximise their students’ learning, then an accurate initial language skills assessment may be the best start they can give. And, today, it is not only language schools, but also many of their study travel advisor partners who are putting efforts into achieving this.
For some advisors, a language assessment is, indeed, regarded as a key part of their service. “From my point of view,” says Emanuela Marianecci of SC Inglese in Italy, “this is somehow essential for giving an excellent and quality consultancy.” Albanian advisor, Adela Makashi of Ande-LM Agency, finds it is a part of the job that school partners expect of advisors “before ever taking the initiative to present them to any school”, she says.
However, this is probably the exception rather than the general rule. There is clear value in testing prior to booking for students wishing to enrol on courses that have a minimum language requirement, such as Ielts or Cambridge preparation, or special purposes language, such as English for Lawyers. In such a case, comments Marianecci, “testing a student’s level properly…becomes vital for the right and final result of the course chosen.” It is also helpful in establishing the required duration of the course, as Cyrille Marshall of Marshall Language Studies in Switzerland highlights. “For example, an intermediate level student needs less time to pass the FCE [First Certificate in English] than an elementary level student,” she points out.
“I believe it is crucial to test the student’s level before…enrolment, if the student has specific goals to achieve, for example a specific level to reach,” agrees Françoise Mantel, Director of Agence Babel Séjours Linguistiques, Switzerland. She observes, however, that the results are only of limited use: first, because the student’s language level can change before arrival at the school, and, “most importantly”, because the test does not identify how fast the student will learn. “It is therefore a rough idea of how long he/she will need to achieve his/her goals.”
At the post-booking stage, many advisors are involved in carrying out tests required by a student’s chosen school. From the language school perspective, while testing on arrival is the norm, there are a number of reasons why they may require some or all students to be tested prior to this. As highlighted above, one example would be courses with a minimum language requirement. Study Group spokesperson, Vic Richardson, also points out that those requiring a UK entry visa need to have their language level verified as part of the visa process.
The tightening of UK visa requirements has encouraged language schools to upgrade their pre-arrival assessment processes, and, comments Carsten Sallmann, Marketing Manager at Sprachcaffe Languages Plus, the three Sprachcaffe centres in England started working with Skype interviews in 2009. This has proved a positive move, resulting in “a considerable improvement” in the assessment process, he notes. Hence, despite the UK Border Agency having since announced that a B1 Secure English Language Test (Selt) is now required for a Tier 4 visa and that a Skype interview will no longer be acceptable, Sprachcaffe will continue to use Skype interviews, “simply to get a better picture of the student’s overall level of the language,” says Sallmann.
While some schools use pre-arrival testing selectively, others use it in all cases. Sprachcaffe, which has its own online test, uses this for all students. Sallmann enumerates several reasons. “Students save time and can start with the lessons very quickly on their first day. [In addition,] we have a better means of planning the levels for our classes and assigning teachers and can even direct students with a certain level to suitable starting dates. We can [also] ensure [having] enough teachers for all levels at all starting dates. [Added to that,] fluctuation on the first day is limited to a minimum and we can react swiftly if certain modifications still seem necessary.”
Genki Japanese and Culture School (GenkiJACS) in Fukuoka, Japan, asks all students upon booking to self-report their approximate level (e.g. low beginner, high beginner, pre-intermediate), advises Director of Marketing, Evan Kirby. Then, at least two weeks before arrival, they take an online placement test. In addition to this, the school has begun offering all confirmed students two free online video-chat lessons via Skype before they arrive. “Many students are, of course, happy to have a chance to brush up on their skills a little before coming to Japan. But these lessons also have the added advantage of letting us get to know the students’ skills in detail before they arrive, so we can be sure of their class placement in advance,” says Kirby.
Kai Language School in Tokyo, Japan, is another school that always pre-tests. Its assessment consists of reading and writing elements, as well as a telephone interview to test listening and speaking. By using this combination, “We have almost no students changing levels after the start of the course,” attests spokesperson Tae Yamaguchi.
In most cases, prior testing is in addition to, rather than instead of, testing on arrival. Most or all of the four key skills of speaking, listening, writing and reading (see box page 33) are generally tested before allocating students to a level. At Sprachcaffe, the online test is supplemented by an oral test on arrival, “so that we can ensure that the spoken knowledge of the language is at a similar level as the written skills”, says Sallmann. Then, at school chain Eurocentres, a mixture of self-assessment and online testing is used at the pre-arrival stage, followed by a further test upon arrival. However, this may change in the future, as, comments Head of Academic Development, Brian North, “We are currently piloting online use of our main entry test with students going straight in to class on arrival.”
The number of levels offered by language schools varies from school to school, and can also depend upon numbers in attendance. At Study Group, for example, seven levels are offered, of which “we typically always run six”, according to Richardson.
Many European schools, such as the UK-based Colchester English Study Centre (CESC), Spain’s Malaca Instituto and the German did deutsch-institut, work according to the CEFR (Common European Framework of Reference) model, which identifies six levels of attainment. At did in Frankfurt, Germany, says Sales Manager Almir Krupic, separate classes are run corresponding to each of these levels. Some schools split these into further sub-levels, and at Spanish language school, Malaca Instituto, spokesperson Bob Burger reports that there are at least 16 differentiated levels available.
Elsewhere, schools use other systems to suit their programme profile. Three separate tests of Japanese are offered at GenkiJACS, appropriate to low-beginner, high-beginner and intermediate levels. Based on the results, students are placed into one of 12 class levels. “We always run a class appropriate to the relevant level, even if only one student will attend,” says Kirby.
At International House Sydney in Australia, there are two levels for Ielts, and four levels for General English, while FCE is upper-intermediate and CAE [Certificate in Advanced English] is advanced, explains spokesperson Shellie Hansen. Cambridge courses always have separate classes; otherwise, they are not run, she notes.
Although schools vary significantly in their provision of levels, according to Cristina Maenaka at Scala Mundi Turismo in São Paulo, Brazil, it is “not very often” that clients raise this topic when choosing a school. Students who do ask about the number of levels are often those who already have a high level of language attainment, observes Alvaro Benevides of Ready for You, a consultancy based in Malta, Ireland and the UK.
For Ros Slemint of IEC Estudios in Madrid, Spain, nevertheless, the number of class levels is an important consideration in choosing a school for a particular client since, she points out, too few class levels means too great a range of language levels within the classroom, leading to dissatisfaction among students. Mantel endorses this view, commenting that, although students rarely ask, “I consider it my job to verify [that there are sufficient levels]”.
Once assessed, students are then generally allocated to one single level, judged on their overall performance. “The only difficulty comes from students who have very unbalanced skill sets,” comments Kirby, “for example, people who have only ever studied Japanese from textbooks but have never spoken it, or people who lived in Japan previously, so have great conversation skills, but little formal grammar training.”
However, many schools are loath to place students in a different class level for one skill than another, and address any imbalance in other ways. At GenkiJACS, Kirby says that any areas of weakness “are noted by the tester, and sometimes students are given free ‘catch-up’ classes, if they are lacking one specific skill”.
“We do as most schools who offer full-time courses,” comments Richardson at Study Group, “that is, place students in their overall level for ‘core’ lessons…and then offer electives.” These include skills development, he notes, which will often focus on a single skill, which is specific to the student’s level for that skill. Meanwhile, at Sprachcaffe, Sallmann explains that, given its small class sizes, “We fully trust in our teachers’ abilities to judge a student’s language skills in the first few lessons and put emphasis on weaknesses on an individual basis.”
Agent Marina Martins, Campus Brazil, Pinheiros, Brazil, endorses the single-level approach, commenting, “it is important to be surrounded by students with different skills, as they challenge you in areas you are not so strong at.” There is, however, an opposing view, voiced by Slemint, that, “Students shouldn’t just be grouped to the average mark for all four skills”; this approach “can cause frustration” argues Mantel.
Accordingly, at some schools, there are certain circumstances in which students may study the different skills at different levels. For example, at Kai Language School, a student may take the Kanji (Chinese characters) class at a different level, “if necessary” says Yamaguchi. Such decisions clearly depend on degrees of disparity between the skills, a factor often linked to the similarity of the target language to the learner’s native tongue (see box page 32). Burger affirms that Malaca Instituto will also place students into different levels where significant disparities occur.
In any event, feedback from study travel advisors suggests that students are generally happy with the placement process. Only rarely, it seems, are they asked to intervene over any problem with the assessment process, as ATW Rome spokesperson, Gabriella D’Urso, testifies.
Some students get nervous during the placement test, resulting in lower than expected results, highlights Maenaka. However, where, for any reason, it transpires that a student has been allocated to the wrong class level, it is in most cases quite straightforward for them to be moved, as Marianecci confirms. Hence, “Initial assessment is important,” says Vanessa Navarro of English Language Center in Boston and Los Angeles, USA, “but we allow for students to exceed their initial assessment, and, if needed, they may be given permission to change levels.” Likewise, for Susan Sharpe of Milton College in Australia, the initial assessment is “a starting-point only”.
Others accord key importance to the initial assessment. “Having to move levels after the start of the course means a big loss for both students and us, time-wise as well as money-wise,” Yamaguchi observes.
However, perhaps this is merely a difference of emphasis. Few would disagree with Burger’s observation that students being placed in appropriate levels, “is crucial in their successful language learning”.
The nationality factor
Language students can sometimes display a significant discrepancy in ability across the four skills of reading, writing, listening and speaking. This can be linked to educational and cultural differences between countries, and can also be a result of the similarity or lack of it between the learner’s native tongue and their target language.
According to Evan Kirby, Director of Marketing at Genki Japanese and Culture School in Japan, there is a huge split between ‘Western’ and ‘Asian’ students, in both learning styles and skills. “Students from many Asian countries have high reading and writing ability, but lower conversation skills, whereas those from many Western countries are [the] opposite,” he claims.
Clear differences are also observed between nationalities, with Tae Yamaguchi at Kai Language School in Japan, noting, “Koreans are stronger in grammar, because their grammar is very similar to Japanese grammar. Chinese are stronger in writing kanji (Chinese characters), as they use it in their own language.”
In learning English, “Arabic speakers tend to be strong in speaking and weak in grammar and writing,” says Sarah Greatorex, Principal of Colchester English Study Centre in the UK. “Japanese tend to be strong in grammar and weak in spelling.”
“There are lots of theories here!” comments Shellie Hansen of International House in Sydney, Australia. She confirms that European and South American students tend to be stronger in verbal and listening skills. “This can be due to a number of factors,” she observes, including the way they have learned English and their experience with it, as well as cultural attitudes towards learning and to teachers. However, she says Asian students tend to be stronger in grammatical work and reading.
“Some Asian cultures place more importance on clarity and precision than ‘just getting the message across’ and perhaps making mistakes, which, while correct, can slow down the communicative process.” She further points out, “Asian languages are also often syllabic, which can cause more difficulty in pronunciation than [for] students whose first language is stress-timed, like English.”
Sprachcaffe spokesperson Carsten Sallmann confirms that Asian students’ pronunciation in European languages is often not as developed as their grammar. “We feel that this is not necessarily due to their educational system at home,” he says, “but, rather, due to the fact that their native languages include sounds which are rather different from the European languages they are learning.”
At the US-based International Language Institute of Massachusetts, Director of Programmes, Caroline Gear, finds that, in general, speakers of Arabic and Asian languages, lacking the cultural and linguistic advantages enjoyed by native speakers of Latin languages, struggle more in all areas of English, but especially speaking and writing. “In particular, our Middle Eastern students struggle with the most basic spelling and sentence structure, making their writing especially problematic,” she observes. “Another example would be [that] students from the Pacific Rim countries, who test high on our grammar section, may test low on speaking.”
By contrast, closeness of the target language to the student’s mother tongue can confer advantages, as Bob Burger of Spanish language school, Malaca Instituto, highlights. In particular, he says, “Brazilians and Italians can [generally] communicate orally at a higher level than would be expected for their overall language level.”
How levels are assessed
Most schools test each of the four skills of speaking, listening, writing and reading. Many, like Harrogate Tutorial College in the UK, also separately take grammar knowledge into consideration. However, the system of assessing levels varies between schools, and is also often dependent on the type of course the student has chosen.
At International House in Sydney, Australia, for instance, there are four separate tests, depending on whether the chosen course is general English, Ielts or English for Academic Purposes (EAP), Cambridge or English for Teaching Young Learners (ETYL), explains spokesperson, Shellie Hansen. Each test has a different writing component. For Ielts/EAP, this takes the form of an essay, for FCE a narrative, and for general English a personal response.
General English and ETYL students also take a 40-minute language assessment. This focuses on vocabulary, reading and grammar skills, she notes. Ielts/EAP students also take a reading test with two sections, one academic and one general. In addition, all students have a speaking interview lasting approximately 10 minutes, in which listening comprehension is assessed.
Many other schools use one standard test for all students. The International Language Institute of Massachusetts in the USA has recently updated its assessment system. All students now take a three-part test, assessing grammar, speaking and writing, but not reading. As Director of Programmes, Caroline Gear, explains, “We have found that a reading assessment did not really help in the overall levelling of our students.”
While the nature of the test is one important factor in ensuring that the results are accurate, another is the way in which the answers are assessed. After testing, says Gear, staff evaluate each section based on an in-depth nine-level rubric that assesses speaking, writing and grammar. “Because of the time put into assessing language production, we feel that the level assigned is accurate and realistic,” she comments. Regular updating is also important, according to Vanessa Navarro of US-based English Language Center schools in Boston and Los Angeles: “We revisit the test every few years, making sure it is the best assessment of a student’s level for our courses.”
At Sprachcaffe schools, the online test taken by all students prior to arrival has been used since 2008. It consists of approximately 100 questions, focusing on all aspects of grammar and spelling in the target language. “Our directors are thus able to assess the student’s level in written use of language on quite an exact basis,” claims Marketing Manager, Carsten Sallmann. “The past three years have shown that its results are 80 per cent accurate, and the oral test makes up for the remaining 20 per cent,” he asserts.