Reliability on ARMA Examinations: How we do it at Miklós Zrínyi National Defence University

AARMS Vol. 3, No. 2 (2004) EDUCATION

ILONA VÁRNAINÉ KIS
Miklós Zrínyi National Defence University, Budapest, Hungary

Being a NATO member country at the gate of joining the European Union presents new challenges for the whole of Hungarian society, and especially for the defence sector. According to the new career model, there are certain language requirements that military personnel must meet. In accomplishing this mission the MZNDU Language Testing Centre has an important role.

The Language Testing Centre

The Testing Centre was set up as an independent organisation in 2001, after our military language exams had been accredited at the national level, since the reliability and validity of language tests are invariably judged from the conditions and operation of the testing centre. We organise three types of examinations: STANAG 6001 exams; general language exams (English, French, German, Italian, Russian, and Hungarian as a foreign language); and military language exams, which we call ARMA exams, in English, French, German, Italian, Russian, Croatian, Serbian, Slovakian and Ukrainian. In this article I would like to write about the ARMA military exam, which was accredited by the Hungarian National Accreditation Board.

The ARMA examinations

The ARMA exam is a professional military language exam, offered at three levels of proficiency: basic, intermediate and advanced. To meet graduation requirements, the Hungarian university system stipulates that students must sit and pass a language exam at the intermediate level. The same holds true for students at the National Defence University. One exception is that students of Security and Defence Policy are required to pass an intermediate and an advanced language exam, one of which must be in English.
Received: March 24, 2004
Address for correspondence: ILONA VÁRNAINÉ KIS, Miklós Zrínyi National Defence University, P.O. Box 15, H-1581 Budapest, Hungary

The name of our exam, ARMA, has its roots in the Latin word for arms (weapons). It is what is known in Hungary as a bilingual exam, in that both Hungarian and English are used, e.g. for translation tasks. Bearing in mind the specific needs of military personnel, we knew that certain assignments demand specific language skills. This was the rationale for dividing the language exam into three different types: A, B and C. The A-type exam is an oral exam for those who primarily intend to use the language communicatively. The B-type exam is a written exam focusing on writing and reading skills, whilst the C-type exam combines the requirements of both A and B. I would now like to highlight each of the skill areas to give information about the task types and exam content.

The oral exam

The oral exam begins with an interview, a dialogue between the examiner and the candidate on 2-3 topics of general interest (e.g. family, work, travelling, hobbies). This is followed by an interview on one or two military topics (e.g. the function of the armed forces, Hungary's participation in peacekeeping missions, regions of strategic importance to NATO, regional conflicts, current international political-military events). At the basic level, the middle section involves describing a picture, followed by a role-play dialogue based on a true-to-life situation. At the intermediate and advanced levels the third task involves expressing opinions for or against a statement on a military-political topic. The two-member board of examiners evaluates the candidate's communication skills, vocabulary and grammar in each of the three or four tasks. Listening comprehension tasks form part of the oral exam at both intermediate and advanced levels. There are two tasks.
In the first task the candidates listen to a 90-second recording 2 or 3 times, depending on the level of the exam, and on the basis of this recording are required to answer questions, posed in Hungarian, in Hungarian. The second task is a multiple-choice task. It goes without saying that the complexity of the texts and items differs at the two levels. A candidate is deemed successful only if his total score is 60% or above.

The written exam

At the basic level, the written exam consists of three tasks:
- a 20-item grammatical and lexical multiple-choice test;
- a reading comprehension task (a military text with 3 items in Hungarian to be answered in Hungarian);
- a guided letter in which the information to be communicated is specified.

The duration of the exam is 20 minutes for the multiple-choice test and 90 minutes for the remaining two tasks.

At the intermediate level, the exam lasts for three and a half hours. The multiple-choice test should be completed in 30 minutes; for the remaining tasks there is no specification of time.
- The first task is a fifty-item multiple-choice test on grammatical and lexical problems.
- The second task is a translation of a general Hungarian text into the foreign language.
- Next is a reading comprehension task: candidates are given a newspaper article on a military topic in the foreign language and have to answer 5 items in Hungarian.
- The last task is a guided composition on a military topic. Two topics are offered and the candidate must choose one.

The duration of the advanced-level exam is four hours. The multiple-choice test should be completed in 30 minutes; for the other tasks there is no specification of time.
- The first task is a fifty-item multiple-choice test on grammatical and lexical problems.
- The second task is a guided summary (the candidate has to sum up a Hungarian military text in the foreign language).
- The third task is a translation of a military-political text from the foreign language into Hungarian.
- The last task is a guided composition on a military topic.

The candidate passes if his total score is 60% or above. Since October 2001, we have tested some 2200 candidates using the ARMA military examinations.

Reliability on the ARMA Exams

1. To make our exam more reliable, we test the candidate's ability in more than one way. Grammatical knowledge is tested by the multiple-choice test, the guided composition and the oral exam; communication is checked by oral conversation, giving opinions and the guided composition. As ARMA is a bilingual exam, we also check the candidate's translation and précis skills in the foreign language. Reading skills are tested by a reading comprehension task; listening skills are tested by listening comprehension tasks and during the oral exam. We employ both receptive and productive tasks to test the four skills:

                 RECEPTIVE    PRODUCTIVE
   Oral exam     listening    speaking
   Written exam  reading      writing

2. All of our tasks are controlled. This means that during oral exams examiners have a set of questions they have to ask, so that an examinee will perform in nearly the same way before any examination board. In the written exam the composition, the summary and the reading comprehension are guided. We restrict the freedom of candidates because the greater the restrictions imposed on them, the more directly comparable the performances of different candidates will be.

3. It is very important to write unambiguous items. The candidate should not be presented with items with unclear meanings, items to which there is an acceptable answer the test writer did not think of, or items to which there is more than one acceptable answer.
For this reason, we always ask our colleagues to moderate new items and to look for alternative interpretations to the ones intended. We also try to pretest new items on a group of people comparable to those for whom the test is prepared. Unfortunately we are unable to pretest all the items in every language, as we have only a few students learning Italian or Serbian, for instance.

4. It is essential to provide clear and explicit instructions in both written and oral exams. Weak candidates especially often misinterpret what they are asked to do. In our tests, we write all rubrics in Hungarian, indicating the time allowed, the score, the length of the task, and whether dictionary use is permitted. We also provide an answer sheet on which the questions are presented again.

5. We ensure that tests are well laid out and perfectly legible.

6. We expect candidates to be familiar with the format and testing techniques employed in the ARMA exams and have provided all relevant information on task types, scoring, time, and topics on the university website ( képzés/nyelvvizsgaközpont). Additionally, practice tests are freely available.

7. We take great care to ensure ideal conditions for exam administration. The written exam is always organised in a 100-seat hall, which is very quiet and well lit. There are 5 or 6 trained invigilators present during the exam. The listening exams always take place in language laboratories.

8. Scorer reliability is essential to test reliability. Therefore, we regularly train our examiners in how to use our marking guides. Scoring of the multiple-choice test is the most objective, as there can be only one correct answer. The marking guide is very detailed, and markers are also provided with a model answer for the translation and summary tasks. The set of symbols for correcting grammar, vocabulary, style, etc. is compulsory for every marker. Each test is double marked.
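Double marking can also be screened numerically before scripts are handed in. A minimal sketch, assuming hypothetical candidate numbers, hypothetical scores, and a hypothetical 3-point tolerance (the article specifies none of these):

```python
# Sketch: screen double-marked scripts for marker disagreement.
# Candidate numbers, scores, and the tolerance below are illustrative only.

TOLERANCE = 3  # hypothetical maximum acceptable gap between the two marks


def flag_disagreements(marker_a, marker_b, tolerance=TOLERANCE):
    """Return candidate numbers whose two marks differ by more than tolerance."""
    flagged = []
    for candidate, score_a in marker_a.items():
        score_b = marker_b[candidate]
        if abs(score_a - score_b) > tolerance:
            flagged.append(candidate)
    return flagged


# Candidates are identified by number, not name.
marker_a = {101: 42, 102: 55, 103: 38}
marker_b = {101: 44, 102: 48, 103: 39}

print(flag_disagreements(marker_a, marker_b))  # candidate 102 differs by 7
```

Scripts flagged this way would be set aside for the two markers to reconcile.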
Before handing in the corrected scripts, markers have to discuss the points on which they disagreed. We also give markers a questionnaire asking whether the marking guide has worked well.

9. We identify our candidates by numbers, not names, so the marker does not know whose work he or she is correcting.

We try to ensure the reliability of the ARMA military exam system on the basis of three elements:
- observance of the regulations for the conduct of examinations;
- continuous pretesting and renewal of items;
- continuous training and observation of testers.

The block diagram of Figure 1 illustrates the procedure of conducting ARMA exams. The diagram describes an organised activity, based on disciplined work and unified principles. The purpose is to decide, over as wide a spectrum as possible and in the most reliable way, whether a candidate's knowledge meets the requirements of the exam. The greater part of the process, though, can ensure only the suitability of the measuring instruments. The actual measurement, namely the reliability of performance in an exam situation (that is, the correlation between real knowledge and the result of the exam), also depends on the scorers' subjectivity. The key task is to ensure that the measuring of knowledge is always based on the same criteria. What does that mean? The use of a control which is independent of the procedure itself, which can judge the validity of the test and the performance of the testers, and which can demonstrate possible mistakes in an objective way. This control is statistics, which indicates unforeseen problems in the examination process objectively, on the basis of test results.

Figure 1. Diagram of organising exams

The applied procedures fall into two groups: on the one hand, the multiple-choice tests are measured by so-called scale analyses; on the other hand, the partial scores and the final scores are measured by ANOVA (Analysis of Variance).
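The two kinds of check named above can be sketched with hypothetical data. The facility calculation stands in for the scale analysis of multiple-choice items, and `anova_f` is a textbook one-way ANOVA over board scores; the article does not give its actual formulas, data, or thresholds:

```python
# Sketch: item ("scale") analysis for multiple-choice tests, plus a
# one-way ANOVA F statistic over final scores. All data are illustrative.


def item_facility(responses):
    """Proportion of candidates answering the item correctly (0..1)."""
    return sum(responses) / len(responses)


def anova_f(groups):
    """One-way ANOVA F statistic for k groups of scores."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))


# Each row: one candidate's 1/0 results on a 5-item multiple-choice test.
matrix = [
    [1, 1, 0, 1, 0],
    [1, 0, 0, 1, 1],
    [1, 1, 1, 1, 0],
    [0, 1, 0, 0, 0],
]
for i, item in enumerate(zip(*matrix), start=1):
    print(f"item {i}: facility {item_facility(item):.2f}")

# Final scores from three hypothetical examination boards: a small F
# suggests no board is systematically harsher than the others.
boards = [[62, 71, 65], [60, 68, 70], [64, 66, 72]]
print(f"F = {anova_f(boards):.2f}")
```

An item with a facility near 0 or 1 (too hard or too easy), or a large F across boards or sessions, would be exactly the kind of "unforeseen problem" the statistics are meant to surface.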
In the interests of the reliability of our examination system, we carry out statistical analyses after each session and use the results in the preparation of the following exam. There are several types of information that our Testing Centre collects and analyses in order to decide whether the test was satisfactory. The statistician who works for the institution reports the relevant statistics and interprets the figures. The results indicate whether different parts of the test were performing as intended and, if not, where the major problem areas seem to be. This quantitative statistical information is combined with qualitative feedback collected from administrators, candidates, and examiners.

Conclusion

The first thing that testers have to be clear about is the purpose of testing in any particular situation. The basic problem is to develop tests which are valid and reliable, which have a beneficial backwash effect on teaching, and which are practical. They are designed to test the ability of students with different language training backgrounds. The content of a proficiency test (and ARMA is a proficiency test) is not based on the content or objectives of language courses. Rather, it is based on a specification of what candidates have to be able to do in the language in order to be considered proficient. Proficient means having sufficient command of the language for a particular purpose. The purpose of the ARMA military language test is to prove that the candidates have sufficient proficiency and are linguistically ready to perform their job in the target situation (for example, to decide whether a candidate is able to work in NATO peacekeeping missions).

References
1. J. C. ALDERSON, C. CLAPHAM, D. WALL: Language Test Construction and Evaluation, Cambridge University Press.
2. A. HUGHES: Testing for Language Teachers, Cambridge University Press.
3. L. F. BACHMAN, A. S. PALMER: Language Testing in Practice, Oxford University Press.