
Evaluation Methods
(for Mobile Learning Research)
and
a Sample Research Roadmap

Prof. Dr.-Ing. Kalamullah Ramli, M.Eng.
About Me
• National Reviewer for
  –   Hibah Riset Kemitraan PT, Industri dan Pemda
  –   Riset Unggulan Perguruan Tinggi dan Industri (RAPID)
  –   Penelitian Unggulan Strategis Nasional (PUSNAS)
  –   Penelitian Unggulan Internasional
• Technical Reviewer
  – International Journal on Learning
  – IEEE Malaysia International Conference on
    Communications (MICC) 2008
  – IEEE Malaysia International Conference on
    Communications (MICC) 2009
  – International Conference on Quality in Research (QiR)
Instant Broadband
Anywhere! Anytime!

Dense Wireless Data Network

Increasing Demand in Mobile Data Access
What is mobile learning?
• Learning with portable technology
  – Focus on the technology
  – Could be in a fixed location, such as a classroom
• Learning across contexts
  – Focus on the learner
  – Could use portable or fixed technology
  – How people learn across locations and transitions
• Learning in a mobile world
  – Focus on the mobile society
  – How to understand people and technology in constant
    mobility
  – How to design learning for the mobile society
Can mobile learning be effective?
• We think so!
  –   Classroom response systems
  –   Group learning with wireless mobiles and phones
  –   Classroom handheld simulation games
  –   Mobile guides
  –   Connecting learning in formal and informal settings
• Lack of convincing studies of mobile learning
  – Attitude surveys and interviews: “they say they enjoy it”
  – Observations: “they look like they are learning”
  – With a few exceptions
Issues in evaluating mobile learning
• It may be mobile
  – Tracking activity across locations
• It may be distributed
  – Multiple participants in different locations
• It may be informal
  – How can we distinguish learning from other activities?
• It may be extended
  – How can we evaluate long-term learning?
• It may involve a variety of personal and
  institutional technologies
  – Mobile and fixed phones, desktop machines, laptops,
    public information systems
• There may be specific ethical problems
  – How can and should we monitor everyday activity?
What do you want to know?
• Usability
   – Well-tested methods:
      • Expert evaluations (e.g. Heuristic evaluation and Cognitive
        Walkthrough)
      • Lab-based comparisons
• Usefulness
   – Hard: depends on the educational aims and context
      • Field-based interviews, observations and walk-throughs
         – Ethnographic analysis
         – Critical incident studies (including focus group replay)
      • Learning outcome measures
         – Control group
          – Pre-test, intervention, post-test, delayed post-test (see the
            analysis sketch after this list)
      • Logbooks and diaries
         – Logbooks of activity
          – Diary / diary-interview method used successfully for intensive
            study of everyday learning over time
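The pre-test / post-test / delayed post-test design above is usually analysed by comparing gain scores between an intervention group and a control group. A minimal sketch in Python under stated assumptions: the score lists, group names and sample sizes below are invented for illustration; only the SciPy t-test call is a real library function.

```python
# Minimal sketch of a learning-outcome comparison (hypothetical data).
# A real study would also analyse the delayed post-test; scores are 0-100.
from statistics import mean
from scipy import stats  # independent-samples t-test

treatment_pre  = [42, 55, 48, 60, 51, 47]   # used the mobile tool
treatment_post = [68, 74, 70, 81, 66, 72]
control_pre    = [45, 52, 50, 58, 49, 46]   # conventional teaching only
control_post   = [55, 60, 57, 66, 54, 58]

# Gain score: each student's improvement from pre-test to post-test.
gain_treatment = [post - pre for pre, post in zip(treatment_pre, treatment_post)]
gain_control   = [post - pre for pre, post in zip(control_pre, control_post)]

print(f"Mean gain, treatment: {mean(gain_treatment):.1f}")
print(f"Mean gain, control:   {mean(gain_control):.1f}")

# Compare the two groups' gains with an independent-samples t-test.
result = stats.ttest_ind(gain_treatment, gain_control)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```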
Some evaluation methods (contd.)
• Usefulness (contd.)
   – Other feedback methods
       •   Telephone probes
       •   Snap polls
       •   Interviews
       •   Focus groups
   – Automatic logging
       • Recording where, when and how a mobile device is used
       • Quantitative analysis of student learning action (Trinder et al., 2005)
   – Learning outcome measures
       • Control group
       • Pre-test, intervention, post-test, delayed post-test
• Attitude
   – Attitude surveys
        • General attitude surveys are of little use: almost all innovations are
          rated between 3.5 and 4.5 on a 5-point Likert scale
        • Specific questions can indicate issues (e.g. interface problems); see
          the item-level sketch after this list
   – Microsoft Desirability Toolkit
       • Users indicate their attitudes through choice of cards
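Because overall Likert means cluster between 3.5 and 4.5, it is the per-item breakdown that surfaces problems. A minimal sketch, assuming hypothetical 5-point responses keyed by question; all question texts and numbers are invented for illustration.

```python
# Minimal sketch: per-item Likert summary instead of a single overall mean.
# All question texts and responses below are hypothetical.
from statistics import mean
from collections import Counter

responses = {
    "The device was easy to carry":           [5, 4, 4, 5, 3, 4, 5],
    "The on-screen keyboard was easy to use": [2, 3, 2, 1, 3, 2, 2],
    "I would use the organiser next term":    [4, 3, 4, 5, 4, 3, 4],
}

for question, scores in responses.items():
    dist = Counter(scores)
    low = sum(v for k, v in dist.items() if k <= 2)   # ratings of 1 or 2
    print(f"{question}: mean {mean(scores):.1f}, "
          f"{low}/{len(scores)} rated 2 or lower")
# A single overall mean would hide the keyboard problem; the item-level
# view exposes it.
```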
Case studies

• Student Learning Organiser
  – Long term learning
• MyArtSpace
  – Learning across contexts
• PI: Personal Inquiry
  – Ethics
Interactive Logbook project
Corlett, D., Sharples, M., Chan, T., & Bull, S. (2005). Evaluation of a Mobile
Learning Organiser for University Students. Journal of Computer Assisted
Learning, 21, 162–170.

• 17 MSc Students, University
  of Birmingham
• Academic year 2002-3
• Loaned iPAQ with wireless
  LAN for personal use
• Learning organiser
     – Time manager
     – Course manager
     – Communications
     – Concept mapper
• Standard tools
     – Email
     – Instant messenger
     – Web browsing
• Free to download further
  software from the web
Evaluation methods
•   Questionnaires
    –   administered at 1, 4, 16 weeks, and 10 months
•   Focus groups, following each of the questionnaires
•   Logbooks
    –   Students kept logbooks for six weeks, covering:
    –   Students’ attitudes towards the learning organiser
    –   Patterns of usage of the various applications, including any
        they had downloaded themselves (aggregation sketched after this
        slide)
    –   Patterns of usage of the technology, particularly with
        respect to wireless connectivity
    –   Ease of use issues
    –   Issues relating to institutional support for mobile learning
        devices
•   Videoed interactions
    –   To compare the concept map tools, three students were videoed
        carrying out an exercise and later commented on it while
        reviewing the video
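The logbook and logging data above reduce to counting sessions per application and location over time (compare the tables on the following slides). A minimal sketch with hypothetical log entries; the field names and values are illustrative only, not the project's actual logging format.

```python
# Minimal sketch: aggregating logbook entries into usage patterns.
# Entries, field names and values are hypothetical.
from collections import Counter

log_entries = [
    {"week": 1, "location": "Department", "app": "Email",       "minutes": 15},
    {"week": 1, "location": "Home",       "app": "Web browser", "minutes": 30},
    {"week": 2, "location": "Travelling", "app": "Timetable",   "minutes": 5},
    {"week": 2, "location": "Department", "app": "Email",       "minutes": 10},
]

sessions_by_location = Counter(e["location"] for e in log_entries)
minutes_by_app = Counter()
for e in log_entries:
    minutes_by_app[e["app"]] += e["minutes"]

# Rank locations by number of sessions, as in the location table below.
for rank, (location, count) in enumerate(sessions_by_location.most_common(), 1):
    print(f"{rank}. {location}: {count} sessions")
print(dict(minutes_by_app))
```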
Data
• Usability
  – Size, memory, battery life, speed, software
    usability, integration
• Usefulness
  – of PDAs
  – of Learning Organiser
  – of concept mapping tools
• Patterns of use
  – Locations
  – Changes over time
Frequency of use
Use of PDA in specific locations
Rank order for coursework (rank for other activities in brackets)

                         4 weeks   16 weeks   10 months
Home                     1= (1)    2 (1)      2 (1)
Department               1= (2)    1 (2)      1 (3)
University (elsewhere)   3 (4)     4 (4)      3 (4)
Travelling               4 (3)     3 (3)      4 (2)
Perceived usefulness of tools
(rated “useful” or “very useful”; number of students in brackets)

                          4 weeks    16 weeks   10 months
Timetable                 59% (10)   64% (9)    82% (14)
Web browser               65% (11)   64% (9)    71% (12)
Instant messaging         59% (10)   50% (7)    71% (12)
Email                     76% (13)   79% (11)   65% (11)
Course materials          59% (10)   43% (6)    41% (7)
Supplementary materials   53% (9)    43% (6)    24% (4)
Concept mapper            35% (5)    14% (2)    0% (0)
Perceived impact on activities
Number of students naming tool as having greatest
impact

Learning                      Personal Organisation         Entertainment
Course materials (6)          Timetable and deadlines (6)   Media player (7)
Browser (3)                   Calendar (5)                  Games (3)
Timetable and deadlines (2)   Writing/note taking (2)       Messenger (2)
Writing/note taking (1)       Email (2)                     Browser (1)
Calendar (1)                  Task manager (1)              Writing/note taking (1)
                                                            Reader (1)
Results
• Some usability problems
    – Especially battery life
•   Most use of calendar, timetable and communications
•   PDA-optimised content was well used
•   Importance of connectivity
•   No clear demand for a specific “student learning
    organiser”
•   Concept mapping tools were not widely used
•   Not generally used while travelling
•   Ownership is important
•   Need for institutional support
MyArtSpace
• Service on mobile phones for
  enquiry-led museum learning
• Aim to make school museum visits
  more engaging and educational
• Students create their own
  interpretation of a museum visit
  which they explore back in the
  classroom
• Learning through structured enquiry,
  exploration
• Museum test sites
  – Urbis (Manchester)
  – The D-Day Museum (Portsmouth)
  – The Study Gallery of Modern Art (Poole)
• About 3000 children during 2006
How it works
•   In class before the visit, the teacher sets an inquiry topic
•   At the museum, children are loaned multimedia phones
•   Exhibits in the museum have 2-letter codes printed by them
•   Children can use the phone to
    –   Type the code to ‘collect’ an object and see a presentation about it
    –   Record sounds
    –   Take photos
    –   Make notes
    –   See who else has ‘collected’ the object
• All the information collected or created is sent automatically to a
  personal website showing a list of the items (flow sketched after this
  slide)
• The website provides a record of the child’s interpretation of the
  visit
• In class after the visit, the children share the collected and
  recorded items and make them into presentations
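The collect-by-code flow on this slide can be pictured as a small lookup-and-record loop. A minimal sketch under stated assumptions: the exhibit codes, the collect() helper and the in-memory "personal site" list are hypothetical stand-ins for the real MyArtSpace service, not its actual API.

```python
# Hypothetical sketch of the MyArtSpace collection flow.
# Exhibit codes, data and the in-memory store are invented for illustration.
from datetime import datetime

EXHIBITS = {
    "DD": "D-Day landing craft model (hypothetical)",
    "UR": "Urbis city-history panel (hypothetical)",
}

personal_site = []   # stands in for the student's personal website


def collect(student, code):
    """Record a 'collected' exhibit and append it to the personal site."""
    exhibit = EXHIBITS.get(code.upper())
    if exhibit is None:
        print(f"Unknown code: {code}")
        return
    personal_site.append({
        "student": student,
        "code": code.upper(),
        "exhibit": exhibit,
        "collected_at": datetime.now().isoformat(timespec="seconds"),
    })
    # 'See who else has collected the object', as on the slide above.
    others = [e["student"] for e in personal_site
              if e["code"] == code.upper() and e["student"] != student]
    print(f"{student} collected '{exhibit}'. Also collected by: "
          f"{others or 'nobody yet'}")


collect("Alice", "dd")
collect("Bob", "DD")
```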
Lifecycle evaluation
• Micro level: Usability issues
  – technology usability
  – individual and group activities
• Meso level: Educational Issues
  – learning experience as a whole
  – classroom-museum-home continuity
  – critical incidents: learning breakthroughs and
    breakdowns
• Macro level: Organisational Issues
  – effect on the educational practice for school
    museum visits
  – emergence of new practices
  – take-up and sustainability
Evaluation
At each level
• Step 1 – what was supposed to happen
  – pre-interviews with stakeholders (teachers, students,
    museum educators),
  – documents provided to support the visits
• Step 2 – what actually happened
  – observer logs
  – focus groups held after the visit
  – analysis of video diaries
• Step 3 – differences between 1 & 2
  – reflective interviews with stakeholders
  – critical incident analysis
Summary of results
• The technology worked
  – Photos, information on exhibits, notes, automatic
    sending to website
• Minor usability problems
• Students liked the ‘cool’ technology
• Students enjoyed the experience more than
  their previous museum visit
• The students indicated that the phones made
  the visit more interactive
• Teachers were pleased that students
  engaged with the inquiry learning task
Usability Issues
+ Appropriate form factor
  + Device is a mobile phone, not a typical handheld museum
    guide
+ Collecting and creating items was an easy and
  natural process
– Mobile phone connection
– Text annotations
– Integration of website with commercial software,
  e.g. PowerPoint
Educational Issues
+ Supports curriculum topics in literacy and media
  studies
+ Encourages meaningful and enjoyable pre- and post-
  visit lessons
+ Encourages children to make active choices in what
  is normally a passive experience
– Teacher preparation
  – Need for teacher to understand the experience and run an
    appropriate pre-visit lesson
– Where to impose constraints
  – Structure and restrict the collecting activity, or learn from
    organising the material back in the classroom
– Support for collaborative learning
  – “X has also collected” wasn’t successful
Summary of methods
• Interactive logbook
   – Usability
       • Videoed interactions with comparative systems and reflective discussion
   – Usefulness
       • Questionnaires, focus groups, user logbooks
   – Attitude
       • Questionnaires
• MyArtSpace
   – Usability
       • Heuristic evaluation
   – Usefulness
       • Structured interviews with stakeholders
       • Videotaped observations and notes, critical incident analysis
       • Focus group interviews with learners to discuss incidents
   – Attitude
       • Interviews with stakeholders
• PI: Personal Inquiry
   – Still to be determined, but will include: stakeholder panels, videotaped
     observations and critical incident analysis, comparative tests of learning
     process and outcomes for selected tasks
If time permits ...
Problems of Indonesian ICT Companies

Limited availability of resources, especially financial capital,
during the development/scale-up phase
The Grand Strategy
Innovation lifecycle: from basic idea to marketable products
• Stages: basic research → proof-of-concept/invention → scale-up
  technology development → product development → product marketing
• Milestones along the way: patents → functional invention → business
  validation → innovation (new firms) → viable business
• Actors: the university (research, training) feeds an incubator, which
  provides technology development, business-skill development and market
  access so that an SME can grow into a full-scale business
The Grand Strategy
Ecosystem around the incubatee (SME):
• The university and technology vendors contribute human resources and
  share knowledge and products
• The incubatee (SME) receives facilities, trainings and market channels,
  and delivers a socio-economic contribution
• The government provides regulation and services