TEL@York Conference 2007
May 1, 2007
Is Online Testing Viable for Language Courses?
Norio Ota, Noriko Yabuki-Soh, Alison Devine-Tanimura (DLLL), Mike Street (IT Consultant, ATS)
On-line Testing using Moodle
Panel

This is Part 2 of our report on developing on-line tests using Moodle, presented last year. Some of the information will be repeated for a new audience. First, I will talk about the background and make general remarks on the Japanese Section's on-line testing. Next, Alison Devine will discuss the pros and cons of using Moodle to create and implement on-line tests for Japanese from a novice user's viewpoint. Her presentation will help would-be users understand what is involved in developing on-line tests. Mike Street will then discuss technical issues, including problems unique to Moodle and those related to administering tests. He will also comment on some of the ongoing and future developments of Moodle. Noriko Yabuki-Soh will discuss how we have developed various types of questions for the Japanese tests and examine students' feedback. Lastly, I will show how Moodle has been used to create advanced-level tests in Japanese and how they were implemented in distant locations, including Halifax and Japan.
The Japanese Section has been developing web-based courses and instructional materials for self-study over the past 11 years, and has offered an experimental distance education course for four years. In September 2006 it offered an advanced-level Japanese language course in a distance education format, via video-conferencing, for students at St. Mary's University in Halifax. The TEL initiatives by the Japanese Section of DLLL are as follows:
- Server-based web course development
- Developing interactive instructional materials for self-study
- Developed a distance education course for the elementary-level Japanese language course (tested at Glendon for 4 yrs) using videoconferencing
- Introducing Media Site Live for video-streamed lectures
- Developing web-based on-line tests for the elementary Japanese course, assisted by ATS (2005-07)
- Developed a distance education course for the advanced Japanese course for St. Mary's University in Halifax (2006-07)
Web-based Testing (WBT)
One of the most challenging aspects of web-based on-line language courses is how to implement testing on-line. Web-based testing (WBT) has overcome some of the restrictions of Computer-Based Testing (CBT) and opened the door to developing browser-based tests, which are more flexible and user-friendly. Many on-line course delivery software products have been developed and made available for creating on-line tests, such as WebCT, Hot Potatoes, Sakai, FLE3 and Moodle. All of these products have a quiz- and test-creating component. Given the uncertainty over the future of WebCT and other products, we have chosen Moodle to put first-year language tests on-line for the following reasons.
- Free
- Open source – customizable
- Non-proprietary
- User-friendly
- Ease of installation
- Flexible
- Language support
- Comprehensive
The main rationale for this project was that implementing on-line tests would save considerable time and energy for the two instructors, who would otherwise have had to mark 270 tests four times a year. On-line testing is also a must for developing distance education courses.
Challenges for on-line testing for languages
Language teaching professionals are often reluctant to develop on-line tests due to the following restrictions.
- limited types of questions
- lack of analytical tools (natural language parsing)
- lack of qualitative evaluation
- lack of evaluation for communicative competence
- security issues
- technical issues
- administering issues
These are still legitimate concerns, which will be discussed in this session; however, on-line testing is used mainly to assess learners' learned knowledge.
Implementation and Objectives
On-line testing is NOT comprehensive. On-line testing is to assess each learner's knowledge and recognition of:
- Vocabulary
- Expressions
- Conjugations
- Sentence structures
- Basic kana characters and basic Sino-Japanese characters (kanji)
- Simple context (communicative understanding)
- Sociolinguistic and pragmatic aspects
It is to be underlined that the purpose of these tests is to assess each learner's knowledge and recognition of vocabulary, expressions, conjugations, sentence structures, basic kana characters and basic Sino-Japanese characters (kanji). Questions that test learners' understanding of context are included in the form of a short dialogue. Understanding of various sociolinguistic and pragmatic aspects, such as honorifics and speech acts, is also tested in terms of learned knowledge. This position can be justified because other aspects of language learning, such as communicative activities, listening recognition and comprehension, reading and writing, are covered in the classroom in terms of experiential knowledge. The Section plans to incorporate audio or video files for listening comprehension and to test sociolinguistic and pragmatic knowledge in 2007-08.
We embarked on this project by transforming the written versions of the tests into on-line versions with certain modifications and by inventing new types of questions that would be more suitable to the nature of the program.
- Transforming paper-based tests into web-based versions
- Modifying types of questions
- Developing new types of questions
- Reviewing students' answers and modifying possible answers
- Readjusting answers to introduce a partial marking system
- Developing a questionnaire for students' feedback

As is always the case with early-stage attempts, we have observed pros and cons ourselves and through students' responses to the questionnaire.
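The partial marking system mentioned above can be illustrated with a minimal sketch. This is not Moodle's actual grading code; the function names, the answer table, and the Japanese example answers are all hypothetical, but the idea matches how short-answer grading with credit fractions works in general: each accepted answer string carries its own fraction of the full marks.

```python
# Illustrative sketch only (not Moodle's actual grading code): a short-answer
# grader that awards partial credit by matching a normalized response against
# a table of accepted answers, each carrying its own credit fraction.

def normalize(text):
    """Trim whitespace and collapse internal spaces so trivial typing
    differences do not cost marks."""
    return " ".join(text.strip().split())

def grade(response, answer_table, max_marks=1.0):
    """Return the marks earned for one question.

    answer_table maps accepted answer strings to a credit fraction
    between 0.0 and 1.0 (1.0 = fully correct, 0.5 = half credit, ...).
    """
    normalized = normalize(response)
    for accepted, fraction in answer_table.items():
        if normalized == normalize(accepted):
            return max_marks * fraction
    return 0.0

# Hypothetical question: accept the polite form for full marks and
# the plain form for half marks.
table = {"tabemasu": 1.0, "taberu": 0.5}
print(grade("  tabemasu ", table))  # full credit: 1.0
print(grade("taberu", table))       # partial credit: 0.5
print(grade("tabeta", table))       # no credit: 0.0
```

Reviewing students' actual answers, as listed above, is what populates such a table: answers that are nearly right get added with a fractional credit rather than being marked simply wrong.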
- Learn more about limitations, bugs and positive features of Moodle
- Why were test scores lower this year?
- Academic honesty issues
- Learning process for faculty
- Faculty's willingness to learn, without immediate resistance to Moodle
- Cooperation between IT consultant and faculty, with a good working relationship and initiative
- Technical problems are hard for faculty to deal with alone: a tech support person is required at the test site
- Limitations regarding the test site
A Novice’s Observations
Advantages Disadvantages Suggestions for improvement
Paperless = Green

Remote access means students can:
- see their final exam
- learn from their mistakes
- query grades via email
- access grades anytime / anywhere

Instructors can:
- reconsider grades
Design Advantages (1)
- No black streaks from the photocopier, etc.
- GIFs for high-resolution visuals
- Colour instead of B&W
Colour-coded Q-prompts increased clarity of instruction for the test-taker.
GIFs & Colour Coding
Design Advantages (2)
- Re-ordering Q.s
- Delete / replace Q.s
- Display a decided number of Q.s per page
Sorting Question Order
- Automatic grading: ease & speed for large courses
- Feedback function: reduced writing time & fair distribution of comments
- Regrade function
- Ergonomic: typed answers = legible
Question Design Disadvantages
- GIFs for visuals require cut & paste of clip art & formatted Excel tables
- Suggestion: create an online community for GIF sharing
Formatting of Q text
Answer Base Disadvantages (1)
- Need to input ALL POSSIBLE strings, or manually grade ALL students' tests = REDUNDANT
- Suggestion: have ALL students' answers saved into the Q-Answer Base, or display ALL answers for Q.1, Q.2, Q.3
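The need to enumerate every possible answer string can be softened by canonicalizing responses before matching. The sketch below is our own illustration under stated assumptions, not a Moodle feature: the `canonical` helper and the romanized example variants are hypothetical, but they show why answer tables balloon without normalization, since each spacing, case, or full-width/half-width variant otherwise needs its own slot.

```python
# A sketch (our own assumption, not a Moodle feature) of why answer tables
# balloon: every orthographic variant must be listed separately unless
# responses are canonicalized first. Folding Unicode compatibility forms,
# case, and spacing collapses many variants into one key.

import unicodedata

def canonical(text):
    """Apply NFKC folding (e.g. full-width -> ASCII), lower-case, and strip
    spaces and punctuation, so trivial variants map to one key."""
    text = unicodedata.normalize("NFKC", text)
    return "".join(ch.lower() for ch in text if ch.isalnum())

# Without canonicalization, each of these would need its own answer slot:
variants = ["Watashi wa", "watashiwa", "WATASHI WA", "ｗａｔａｓｈｉｗａ"]
keys = {canonical(v) for v in variants}
print(keys)  # all four collapse to the single key: {'watashiwa'}
```

With this kind of folding, the ten-slot limit discussed below bites far less often, because one stored answer covers a whole family of typed variants.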
Answer Base Disadvantages(2)
- ONLY (!) 10 answer slots in Q-design
- Suggestion: a [How many slots?] function to set the number of answer boxes
What is tested?
Evaluation criteria for JP1000
- Attendance & Participation: Sept.-April
- Oral Presentation: 4 times a year
- Quizzes (dictation) & Homework (exercises in Japanese characters): every week
- Tests (on-line): 4 times a year
- grammar & structure; vocabulary; idiomatic expressions
- reading of Japanese scripts
Improvement from last year
- More ease of, and control over, making and conducting tests in general
- Improved format and instructions for each test
- Carefully selected test items (e.g. types and number of questions)
- Effective use of visual aids
- Decided number of questions per page
- Two attempts only
Test Average (%)
        Paper (2004-05)   Online (2005-06)   Online (2006-07)
Test 1  78.44             64.57              64.43
Test 2  55.16             59.09              57.39
Test 3  64.90             59.52              61.11
Test 4  66.98             62.50              64.97
Questionnaire Results: Student Background
1. What is your level of comfort with computers?
   1: 1%  2: 8%  3: 13%  4: 29%  5: 46%
2. How enthusiastic are you about the use of information technology in your classes?
   1: 7%  2: 14%  3: 25%  4: 39%  5: 13%
3. How satisfied are you with the time and effort you allotted yourself to prepare for the tests?
   1: 6%  2: 15%  3: 40%  4: 29%  5: 7%
Questionnaire Results: Online Testing
4. How would you rate (your satisfaction with) the number of questions included in each JP1000 test?
   1: 11%  2: 25%  3: 32%  4: 22%  5: 7%
5. How would you rate (your satisfaction with) the time you were given to complete each test?
   1: 17%  2: 29%  3: 25%  4: 18%  5: 7%
6. How would you rate (your satisfaction with) the content of the tests?
   1: 3%  2: 13%  3: 29%  4: 36%  5: 15%
7. How would you rate (your satisfaction with) the organization of the tests?
   1: 8%  2: 7%  3: 26%  4: 40%  5: 14%
Questionnaire Results: Online Testing (Cont’d)
8. How would you rate the range of knowledge and skills assessed by the computer-based testing?
   1: 6%  2: 17%  3: 38%  4: 29%  5: 8%
9. In your opinion, how accurately did the computer-based assessment measure your knowledge and abilities in comparison to a conventional paper-based test?
   1: 11%  2: 26%  3: 38%  4: 19%  5: 6%
10. Please compare your performance in the computer-based assessment with how you feel you would perform in a paper-based test.
   1: 4%  2: 21%  3: 42%  4: 22%  5: 11%
Questionnaire Results: Open-ended:
11. Features of the on-line testing that you liked:
“Avoidance of pen errors/conflicts with messy writing”
“Easy legibility”
“It wasn’t messy… i wasn’t writing…”
“Tests were clear. I liked the ability to save before sending.”
“It was organized and was easy to do.”
“There were no ambiguous answers”
“It is more direct and provide ease to answer the question.”
“I liked the fact that we have two attempts on the test.”
“I liked the use of images to help bring up ideas in the questions [...] the comic book dialogues on the test were cool.”
“I can relax more somehow when I’m doing the online tests, and concentrate better.”
Questionnaire Results: Open-ended (Cont’d):
12. Features of the on-line testing that you did not like:
“I absolutely hated the timer in the corner of the test. […] it feels like a bomb is going to explode.”
“The timer!”
“Saving answers after every 5-10 questions was a bit annoying”
“Too many possible variations for a correct answer”
“Could only check answers twice.”
“Numbers were often hard to differentiate from each other (eg. 25 & 26 looked similar)”
“a small character can make a full answer wrong.”
“i sometimes found the technical issues to be quite annoying.”
“# of questions asked in 50 mins”
“Doesn’t really test writing skills (kanji)”
“Not having my test evaluated by my instructor”
Questionnaire Results: Open-ended (Cont’d):
13. How could those features (that you did not like) be improved?
“Remove the timer, and make the test easier to navigate by putting all the questions on 1 long page.”
“Clearer, larger fonts.”
“I have no idea. Maybe instead of moodle, another program created strictly for Jp classes with the same concepts as moodle but not on a shared database.”
“clearer instructions, quicker system”
“Have a written section for hiragana, katakana, and kanji.”
“Auto-save the answers as they are typed.”
“Turn down the brightness on the computer screens!”
“Allow answers to be typed in hiragana.”
“just by increasing the time limit maybe by 10 minutes or so, nothing too much”
Questionnaire Results: Open-ended (Cont’d):
14. Any other comments:
“I know that some students complain about the number of questions but the difficulty level is much lower than written tests from last year. I don't think that online tests make it difficult for students to perform well. If they're well prepared and have studied hard, they should be able to perform well no matter what. So, overall, I think that online tests are good.”
“the online testing was pretty good, i think it is a more efficient system than paper, and would prefer online testing over paper testing any day.”
“Thank you for your efforts to create an efficient testing system.”
“Less questions and/or more time. For a 50-minute test, there should be 50 questions.”
“All other courses give 2 hours for a 100 multiple choice question test. These tests were so compressed in time that it made it stressful to think and answer questions.”