Final Performance Report 1996-2000
The Second Language Assessment Project
During the third year of funding (1998-1999) and the extension year (1999-2000), project staff Cheryl Alcaya and Ursula Lentz focused on converting reading and writing assessment instruments developed in previous years of the grant to computer-delivered format, developing listening assessments, and creating new instruments at higher proficiency levels for computer delivery. They also made great strides in developing an efficient method for evaluating performance-based writing assessments online. In addition, fully enrolled summer institutes on assessment development were offered to teachers in July 1999 and July 2000. Each of these areas of endeavor is addressed in more detail below.
Development of computer-administered assessment instruments
Reading, writing, and listening assessments have been successfully
converted to a computer-administered format. Although some assessment
instruments and evaluation systems are not yet in final form, the
Assessment Project made significant progress in understanding how
to enhance and simplify its computer-administered instruments so
that they are appealing and easy to use for both learners and test
administrators. Project members received a great deal of feedback
on the performance of the various pilot versions of the assessments
through repeated trials of the instruments and through presentations
to language teaching professionals at conferences and workshops.
This feedback was incorporated to ensure that the instruments play
a useful role in meeting the nationwide need for standards-based assessment.
The status of the various assessment instruments is described in
Figure 1.
Students who have taken the computer-delivered tests have also reported that they enjoyed the tests. One student who took the pilot of the listening test in French stated:
"I really like the format of this test ... I like being able to work at my own pace, and have the time I need to think about each question, instead of listening to a cassette tape and having a set amount of time to answer both easy and difficult questions. I liked the pictures too, to put together a setting. I felt very comfortable taking the test."
Figure 1: Status of assessment instruments

Proficiency benchmarks:
- Intermediate-Low: Intermediate-Low proficiency was selected as the appropriate benchmark for articulation between secondary and post-secondary language programs. There is a vital need for effective articulation in the U.S. if learners are to become proficient in a second language.
- Intermediate-Mid: Learners can be expected to attain at least Intermediate-Mid proficiency in writing and speaking at the conclusion of the equivalent of two years of language study in a post-secondary institution.
- Intermediate-High: Learners can be expected to attain at least Intermediate-High proficiency in reading and listening at the conclusion of the equivalent of two years of language study in a post-secondary institution.

Reading
- Intermediate-Low: One form (version) of the French, German, and Spanish MLPA is available for purchase, and examination copies have been distributed in response to inquiries. The instruments are computer-administered and scored. The CD-ROM includes a manual for test administrators describing procedures for analyzing data and setting standards.
- Intermediate-High: One form each of French, German, and Spanish is completed and available for purchase. Examination copies have been distributed in response to inquiries. The instruments are computer-administered and scored. The CD-ROM includes a manual for test administrators describing procedures for analyzing data and setting standards.

Writing
- Intermediate-Low: Two forms of French, German, and Spanish have been pilot tested in computer-administered format. Another form has been developed and will be included in future pilot testing. Improvements have been implemented to make the instruments more visually appealing and easier to navigate. The online evaluation module will be simplified.
- Intermediate-Mid: One form of French, German, and Spanish has been pilot tested in computer-administered format. A second form has been developed and will undergo large-scale pilot testing. Improvements have been implemented to make the instruments more visually appealing and easier to navigate.

Listening
- Intermediate-Low: One form of French, German, and Spanish has been pilot tested, and the instruments are available for purchase in beta form (purchasers will receive free upgrades). The instruments are computer-administered and scored.
- Intermediate-High: One form of French, German, and Spanish has been developed and is undergoing large-scale pilot testing. The instruments will be available for purchase after a large-scale administration. The instruments are computer-administered and scored.

Speaking
- Intermediate-Low: Three forms of French, German, and Spanish have been pilot tested in pencil-and-paper format. Software options for computer delivery continue to be explored.
- Intermediate-Mid: Additional funding must be sought for development of Intermediate-Mid speaking proficiency tests.
Dissemination of the Minnesota Language Proficiency Assessments
The Assessment Project's instruments - the Minnesota Language Proficiency
Assessments (MLPA) - have drawn considerable attention because they
address the need for benchmarks in standards-based language programs
at both the secondary and post-secondary levels. For example, at
ACTFL '99:
- Patricia Barr-Harrison, current president of NADSFL (National
Association of District Supervisors of Foreign Languages), enthusiastically
described the MLPA to NADSFL members at their annual meeting;
- June Phillips, ACTFL President Elect, discussed the visual
support of reading texts in the MLPA as a model during a session
on revisions to the ACTFL Guidelines;
- Dorry Kenyon, Associate Director of the National Capital Language Resource Center, cited the use of MLPA item types by NAEP (National Assessment of Educational Progress) in its development of a Spanish language assessment to be administered nationwide.
During the past year, CARLA staff met with the World Languages Learning Area Supervisor and other officials of the State of Minnesota's Department of Children, Families, and Learning to discuss the role the MLPA might play in measuring attainment of Minnesota's World Languages Standard. As a follow-up to these discussions, the Assessment Project was invited to present the MLPA at the Minnesota Educational Effectiveness Program's Winter 2000 conference so that more of the state's teachers and administrators would become familiar with the assessments and the purpose for which they were developed.
The Assessment Project has received numerous inquiries about the MLPA from institutions across the United States. In Arizona, Karina Colentine of Yavapai College presented the assessments at a statewide meeting of members of the Foreign Languages Committee in February 2000. A demonstration of the assessments in an exhibitors' session at ACTFL 2000 generated significant interest in the instruments. To date, the MLPA have been used in Arizona, California, Colorado, Florida, Georgia, Indiana, Maine, Minnesota, New Jersey, North Carolina, Pennsylvania, Tennessee, and Wisconsin.
Online rating tutorial and scoring module
Evaluation criteria and rater training are key components of assessments
of oral and written performance. Many teachers of world languages
would like to do more assessment of learners' productive skills,
but the logistics make this difficult. The Assessment Team developed
an online scoring system for its writing assessments that addresses
some of these difficulties by providing:
- An online tutorial with numerous examples of rated samples and practice rating opportunities;
- A rating module that guides the rater through the process one step at a time;
- Access to rated samples and raters' comments explaining why the raters made the decisions they did;
- A record-keeping function that generates score reports.
Valuable feedback on this scoring module was collected and used to create a version that is more efficient and more adaptable to the many different computer configurations found in secondary and post-secondary institutions. A holistic approach to rating, developed by the Assessment Project for use in a large-scale testing environment at the University of Minnesota, will be adapted for online scoring because it allows raters to evaluate written performance more quickly and efficiently than was possible with the initial model.
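The workflow described above (guided step-by-step rating, access to raters' comments, and a record-keeping function that generates score reports) can be sketched in a few lines of code. The sketch below is purely illustrative: the `Rating` and `ScoreBook` names, the holistic scale labels, and the report format are invented here for clarity, and the actual MLPA module was delivered as an online application rather than as this code.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: names and formats are hypothetical,
# not the actual MLPA scoring module.
HOLISTIC_SCALE = ("Not yet Intermediate-Low", "Intermediate-Low",
                  "Intermediate-Mid", "Intermediate-High")

@dataclass
class Rating:
    sample_id: str
    rater: str
    level: str      # one holistic level from HOLISTIC_SCALE
    comment: str    # why the rater made this decision

    def __post_init__(self):
        if self.level not in HOLISTIC_SCALE:
            raise ValueError(f"unknown level: {self.level}")

@dataclass
class ScoreBook:
    """Record-keeping: stores ratings and generates score reports."""
    ratings: list = field(default_factory=list)

    def record(self, rating: Rating) -> None:
        self.ratings.append(rating)

    def report(self, sample_id: str) -> str:
        """List each rater's decision and comment for one sample."""
        lines = [f"Score report for sample {sample_id}"]
        for r in self.ratings:
            if r.sample_id == sample_id:
                lines.append(f"  {r.rater}: {r.level} ({r.comment})")
        return "\n".join(lines)

book = ScoreBook()
book.record(Rating("S001", "Rater A", "Intermediate-Mid",
                   "connected discourse; errors do not impede meaning"))
book.record(Rating("S001", "Rater B", "Intermediate-Mid",
                   "handles the task fully at the expected level"))
print(book.report("S001"))
```

The design mirrors the holistic approach mentioned above: each rater records a single overall level per sample rather than analytic subscores, which is what makes large-scale evaluation faster.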