{"id":761,"date":"2017-07-15T12:34:10","date_gmt":"2017-07-15T12:34:10","guid":{"rendered":"http:\/\/wp.lancs.ac.uk\/ltrg\/?page_id=761"},"modified":"2022-01-11T10:34:11","modified_gmt":"2022-01-11T10:34:11","slug":"language-assessment-literacy-survey","status":"publish","type":"page","link":"http:\/\/wp.lancs.ac.uk\/ltrg\/projects\/language-assessment-literacy-survey\/","title":{"rendered":"Language Assessment Literacy Survey"},"content":{"rendered":"<p>Benjamin Kremmel (University of Innsbruck) and Luke Harding (Lancaster University) have developed the <strong>Language Assessment Literacy Survey<\/strong>. The aim of this project was to create a comprehensive survey of language assessment literacy (LAL) which can be used for needs analysis, self-assessment, reflective practice, and research. The survey was conducted in several stages. Some background information (including conference presentations) is provided at the bottom of this page.<\/p>\n<p>The survey is live; you can participate by clicking the following link: <a href=\"https:\/\/lancasteruni.eu.qualtrics.com\/jfe\/form\/SV_dgUnkyGlDhQGtNz\" target=\"_blank\" rel=\"noopener\">https:\/\/lancasteruni.eu.qualtrics.com\/jfe\/form\/SV_dgUnkyGlDhQGtNz<\/a><\/p>\n<p>We would also be grateful if you could share the link to the survey with any interested colleagues or networks of acquaintances. The survey is targeted at: language teachers, professional examiners, language assessment developers, language assessment researchers, score users (e.g. university admissions officers), policy-makers, learners\/candidates, and parents of candidates.<\/p>\n<p><strong>Survey items<\/strong><\/p>\n<p>The full set of items used in the current survey (launched May 2017) is shown below. We are happy for others to make use of these items; however, if you do, please provide the following citation\/attribution:<\/p>\n<p><strong>Kremmel, B. &amp; Harding, L. (2020). 
Towards a comprehensive, empirical model of language assessment literacy across stakeholder groups: Developing the Language Assessment Literacy Survey. <em>Language Assessment Quarterly, 17<\/em>(1), 100-120.<\/strong><\/p>\n<p>1) how to use assessments to guide learning or teaching goals<\/p>\n<p>2) how to use assessments to evaluate progress in language learning<\/p>\n<p>3) how to use assessments to evaluate achievement in language learning<\/p>\n<p>4) how to use assessments to evaluate language programs<\/p>\n<p>5) how to use assessments to diagnose learners\u2019 strengths and weaknesses<\/p>\n<p>6) how to use assessments to motivate student learning<\/p>\n<p>7) how to use self-assessment<\/p>\n<p>8) how to use peer-assessment<\/p>\n<p>9) how to interpret measurement error<\/p>\n<p>10) how to interpret what a particular score says about an individual\u2019s language ability<\/p>\n<p>11) how to determine if a language assessment aligns with a local system of accreditation<\/p>\n<p>12) how to determine if a language assessment aligns with a local educational system<\/p>\n<p>13) how to determine if the content of a language assessment is culturally appropriate<\/p>\n<p>14) how to determine if the results from a language assessment are relevant to the local context<\/p>\n<p>15) how to communicate assessment results and decisions to teachers<\/p>\n<p>16) how to communicate assessment results and decisions to students or parents<\/p>\n<p>17) how to train others about language assessment<\/p>\n<p>18) how to recognize when an assessment is being used inappropriately<\/p>\n<p>19) how to prepare learners to take language assessments<\/p>\n<p>20) how to find information to help in interpreting test results<\/p>\n<p>21) how to give useful feedback on the basis of an assessment<\/p>\n<p>22) how assessments can be used to enforce social policies (e.g., immigration, citizenship)<\/p>\n<p>23) how assessments can influence the design of a language course or 
curriculum<\/p>\n<p>24) how assessments can influence teaching and learning materials<\/p>\n<p>25) how assessments can influence teaching and learning in the classroom<\/p>\n<p>26) how language skills develop (e.g., reading, listening, writing, speaking)<\/p>\n<p>27) how foreign\/second languages are learned<\/p>\n<p>28) how language is used in society<\/p>\n<p>29) how social values can influence language assessment design and use<\/p>\n<p>30) how pass-fail marks are set<\/p>\n<p>31) the concept of reliability (how accurate or consistent an assessment is)<\/p>\n<p>32) the concept of validity (how well an assessment measures what it claims to measure)<\/p>\n<p>33) the structure of language<\/p>\n<p>34) the advantages and disadvantages of standardized testing<\/p>\n<p>35) the history of language assessment<\/p>\n<p>36) the philosophy behind the design of a relevant language assessment<\/p>\n<p>37) the impact language assessments can have on society<\/p>\n<p>38) the relevant legal regulations for assessment in your local area<\/p>\n<p>39) the assessment traditions in your local context<\/p>\n<p>40) the specialist terminology related to language assessment<\/p>\n<p>41) different language proficiency frameworks (e.g., the Common European Framework of Reference [CEFR], American Council on the Teaching of Foreign Languages [ACTFL])<\/p>\n<p>42) different stages of language proficiency<\/p>\n<p>43) different types of purposes for language assessment (e.g., proficiency, achievement, diagnostic)<\/p>\n<p>44) different forms of alternative assessments (e.g., portfolio assessment)<\/p>\n<p>45) your own beliefs\/attitudes towards language assessment<\/p>\n<p>46) how your own beliefs\/attitudes might influence your own assessment practices<\/p>\n<p>47) how your own beliefs\/attitudes may conflict with those of other groups involved in assessment<\/p>\n<p>48) how your own knowledge of language assessment might be further developed<\/p>\n<p>49) using statistics to 
analyse the difficulty of individual items (questions) or tasks<\/p>\n<p>50) using statistics to analyse overall scores on a particular assessment<\/p>\n<p>51) using statistics to analyse the quality of individual items (questions)\/tasks<\/p>\n<p>52) using techniques other than statistics (e.g., questionnaires, interviews, analysis of language) to get information about the quality of a language assessment<\/p>\n<p>53) using rating scales to score speaking or writing performances<\/p>\n<p>54) using specifications to develop items (questions) and tasks<\/p>\n<p>55) scoring closed-response questions (e.g. Multiple Choice Questions)<\/p>\n<p>56) scoring open-ended questions (e.g. short answer questions)<\/p>\n<p>57) developing portfolio-based assessments<\/p>\n<p>58) developing specifications (overall plans) for language assessments<\/p>\n<p>59) selecting appropriate rating scales (rubrics)<\/p>\n<p>60) selecting appropriate items or tasks for a particular assessment purpose<\/p>\n<p>61) training others to use rating scales (rubrics) appropriately<\/p>\n<p>62) training others to write good quality items (questions) or tasks for language assessments<\/p>\n<p>63) writing good quality items (questions) or tasks for language assessments<\/p>\n<p>64) aligning tests to proficiency frameworks (e.g., the Common European Framework of Reference [CEFR], American Council on the Teaching of Foreign Languages [ACTFL])<\/p>\n<p>65) determining pass-fail marks or cut-scores<\/p>\n<p>66) identifying assessment bias<\/p>\n<p>67) accommodating candidates with disabilities or other learning impairments<\/p>\n<p>68) designing scoring keys and rating scales (rubrics) for assessment tasks<\/p>\n<p>69) making decisions about what aspects of language to assess<\/p>\n<p>70) piloting\/trying-out assessments before their administration<\/p>\n<p>71) selecting appropriate ready-made assessments<\/p>\n<p><strong>Background<\/strong><\/p>\n<p>The initial idea for the survey was to empirically 
validate the suggested language assessment literacy levels provided in Taylor (2013). Our first version was a simple survey which explored stakeholders&#8217; views of their own LAL needs, and the LAL needs of other stakeholders. The results of this first survey are described in the presentation below (given at the EALTA 2015 conference in Copenhagen).<\/p>\n<p><a href=\"http:\/\/www.ealta.eu.org\/conference\/2015\/presentations\/Friday\/Friday%20room%2023.0.50\/KremmelHarding.pdf\" target=\"_blank\" rel=\"noopener\">http:\/\/www.ealta.eu.org\/conference\/2015\/presentations\/Friday\/Friday%20room%2023.0.50\/KremmelHarding.pdf<\/a><\/p>\n<p>Following that presentation, we decided that a more expansive survey would be required, with multiple items targeting each hypothesised dimension of LAL. The second version of the survey went through multiple iterations (including two stages of expert review and pre-testing). Part of that process is described in the presentation below (given at the CRELLA 2016 summer seminar).<\/p>\n<p><a href=\"https:\/\/www.beds.ac.uk\/__data\/assets\/pdf_file\/0008\/509264\/Luke-Harding-CRELLA-7-July-2016.pdf\" target=\"_blank\" rel=\"noopener\">https:\/\/www.beds.ac.uk\/__data\/assets\/pdf_file\/0008\/509264\/Luke-Harding-CRELLA-7-July-2016.pdf<\/a><\/p>\n<p>The survey was refined further throughout 2016 and 2017, and officially launched in May 2017. The initial results from the first large-scale administration of the survey will be presented at LTRC in Bogot\u00e1 in July 2017.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Benjamin Kremmel (University of Innsbruck) and Luke Harding (Lancaster University) have developed the Language Assessment Literacy Survey. The aim of this project was to create a comprehensive survey of language assessment literacy (LAL) which can be used for needs analysis, self-assessment, reflective practice, and research. The survey was conducted in several stages. 
Some background information [&hellip;]<\/p>\n","protected":false},"author":175,"featured_media":0,"parent":61,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"page-templates\/template-full-width.php","meta":{"footnotes":""},"class_list":["post-761","page","type-page","status-publish","hentry","post-preview"],"_links":{"self":[{"href":"http:\/\/wp.lancs.ac.uk\/ltrg\/wp-json\/wp\/v2\/pages\/761","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/wp.lancs.ac.uk\/ltrg\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"http:\/\/wp.lancs.ac.uk\/ltrg\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"http:\/\/wp.lancs.ac.uk\/ltrg\/wp-json\/wp\/v2\/users\/175"}],"replies":[{"embeddable":true,"href":"http:\/\/wp.lancs.ac.uk\/ltrg\/wp-json\/wp\/v2\/comments?post=761"}],"version-history":[{"count":11,"href":"http:\/\/wp.lancs.ac.uk\/ltrg\/wp-json\/wp\/v2\/pages\/761\/revisions"}],"predecessor-version":[{"id":1337,"href":"http:\/\/wp.lancs.ac.uk\/ltrg\/wp-json\/wp\/v2\/pages\/761\/revisions\/1337"}],"up":[{"embeddable":true,"href":"http:\/\/wp.lancs.ac.uk\/ltrg\/wp-json\/wp\/v2\/pages\/61"}],"wp:attachment":[{"href":"http:\/\/wp.lancs.ac.uk\/ltrg\/wp-json\/wp\/v2\/media?parent=761"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}