Reading System Evaluation – Work Plan 2014-15

Sub project: Reading System Evaluation

Submitted by: George Kerscher

Last updated on: October 09, 2014 by Avneesh Singh (Refined 2015 project plan)

 

Table of contents

Reading Systems Accessibility Screening Methodology Tool

Brief introduction

Outline of the plan for 2014-15

Milestones or priorities

Dependencies & Risks

 

Reading Systems Accessibility Screening Methodology Tool

Nature of activity: part of the TIES Chartered project

Brief introduction

The goal of the Reading Systems Accessibility Screening Methodology project is to encourage the development of accessible reading systems by providing an accessibility evaluation methodology that is accepted as the de facto benchmark for evaluating the accessibility of EPUB reading systems by industry, DAISY members, disability groups, education systems & governments.

 

This activity is a sub project under the TIES project (Transition to Inclusive EPUB 3 Ecosystem). The overarching objective of the sub project is to increase the worldwide availability of accessible EPUB 3 reading systems.

 

The Reading System Accessibility Screening Methodology was originally a white paper developed by the DAISY Consortium in conjunction with Technology For All (TFA) and distributed to the DAISY Community in June 2013 in Copenhagen. The question has been how best to take this methodology and implement it in a way that will have the most worldwide impact.

 

In support of the EPUB 3 standard, the IDPF has built an “EPUB 3 Test Suite” of valid EPUB 3.0.1 publications: a large collection of .epub files that demonstrate all features in the standard. In conjunction with this, the Book Industry Study Group (BISG) has migrated from its Excel-based EPUB 3 Grid to a new website for testing reading system features using the EPUB 3 Test Suite. This means that the IDPF site is now the home for both the EPUB 3 Test Suite and the BISG reading system evaluations: one place to go for authoritative information about the capabilities of reading systems.

The DAISY Consortium collaborated with BISG to create a third area on the same website for Reading System Accessibility Screening, building on the fine work of the IDPF and BISG. The accessibility results are shown in the same table as the other test results, i.e. the accessibility results are now integrated.

Outline of the plan for 2014-15

In 2014-15, the project will have the following overarching objectives:

a)    Establish crowd sourced testing & evaluation system for the reading system methodology: The evaluations will be done largely on a crowd-sourced basis. This will require deploying the online version of the evaluation tool, setting up the management system, & marketing to seek contributions from members & the community.

b)   Maintain & update the methodology: The reading system methodology will be updated periodically based on feedback from experts. A tracking tool or mechanism will be deployed to track changes, updates based on ongoing feedback, evaluation of the effectiveness of the system, etc. The shortcomings of evaluated reading systems will also be communicated to the respective technology developers, such as the Readium Foundation & other open source & commercial reading system developers.

c)    Acceptance of the methodology: The ultimate objective of the methodology is to encourage & direct reading system developers towards making their reading systems accessible. This requires that the methodology be accepted as the de facto benchmark for evaluating the accessibility of EPUB reading systems. It should gain acceptance from reputed universities, major mainstream ICT providers, DAISY members & related government agencies.

d)   Ensure the community has the most up-to-date evaluations of reading systems: To achieve results with the crowd-sourced approach, constant marketing & communications efforts will be important. Furthermore, the project will use the resources of the DAISY Consortium’s training & support team to evaluate reading systems in the early stage, which will also act as a fallback mechanism.

Milestones or priorities

Milestones are concrete deliverables set forth on the basis of confirmed resources (either we have staff, or we have confirmed funds and/or resource commitments). Milestones are formulated on a half-yearly basis for 2014 & 2015.

Priorities are deliverables based on external contributions, our influence on others, available opportunities, etc. Priorities are formulated on a yearly basis for 2014 & 2015.

 

Milestone May 2014

a)     Establish crowd sourced testing & evaluation system.
  • Decide on the website that will host the reading systems methodology tool. Candidates are the DAISY website & the IDPF website.
  • Make the online version of the methodology live on the selected website.
  • Identify sample content to be used for testing. (Create the fundamental accessibility test book.)
  • Set up the crowd-sourcing management system (deploy the system & identify the person responsible for managing the evaluation results & overall monitoring).
  • Start a promotion campaign to seek volunteers for reading system evaluation.

b)     Maintenance & tracking for updates based on ongoing feedback, evaluation of the effectiveness of the system, etc.

Remark: Activities will start from the 2nd half of 2014.

c)      Acceptance of the methodology

Remark: It is dependent on external influences and therefore listed in the priorities.

d)     Ensure the community has the most up-to-date evaluations of reading systems

Evaluation of 4 reading systems by the training & support team.

Milestone September/October 2014

a)     Establish crowd sourced testing & evaluation system.

Update the fundamental accessibility test book.

b)     Maintenance & tracking for updates based on ongoing feedback, evaluation of the effectiveness of the system, etc.
  • Deploy the tracking tool that captures shortcomings & suggestions.
  • Update the UI based on users’ feedback (twice in 6 months).
  • Update the methodology based on new EPUB revisions & suggestions from experts in the community (twice in 6 months).
  • Communicate the shortcomings of reading systems to the respective technology developers (periodically).
  • Perform an analysis of the effectiveness of the evaluation system.
  • Revise plans for moving forward based on the analysis.
c)      Acceptance of the methodology

Remark: It is dependent on external influences and therefore listed in the priorities.

d)     Ensure the community has the most up-to-date evaluations of reading systems

Remark: It is dependent on external influences and therefore listed in the priorities.

 

Priorities based on external dependencies & availability of external contributions, 2014

a)     Establish crowd sourced testing & evaluation system.

Motivate DAISY members, the IDPF & other partner organizations to contribute to & promote the methodology.

b)     Maintenance & tracking for updates based on ongoing feedback, evaluation of the effectiveness of the system, etc.

Identify more sample content/publications for testing.

c)      Acceptance of the methodology

Commence work on gaining acceptance of the methodology by DAISY members, key universities & government education departments.

d)     Ensure the community has the most up-to-date evaluations of reading systems

Evaluate 10 reading systems via contributions from organizations & individuals.

Milestone May 2015

a)     Establish crowd sourced testing & evaluation system.

Create an accessibility test book for advanced accessibility features.

b)     Maintenance & tracking for updates based on ongoing feedback, evaluation of the effectiveness of the system, etc.
  • Improve the UI based on feedback from users (once in 6 months).
  • Update the methodology, web interface & test books based on feedback from evaluators, suggestions from experts in the community, & new EPUB revisions (the average interval between updates should be 3 months).
  • Communicate the shortcomings of reading systems to the respective technology developers (periodically).
  • Perform an analysis of the effectiveness of the evaluation system.
  • Revise plans moving forward based on the analysis.
c)      Acceptance of the methodology

Remark: It is dependent on external influences and therefore listed in the priorities.

d)     Ensure the community has the most up-to-date evaluations of reading systems

Remark: It is dependent on external influences and therefore listed in the priorities.

Milestone September/October 2015

a)     Establish crowd sourced testing & evaluation system.

N/A

b)     Maintenance & tracking for updates based on ongoing feedback, evaluation of the effectiveness of the system, etc.
  • Improve the user interface of the website based on feedback from users (once in 6 months).
  • Update the methodology, web interface & test books based on feedback from evaluators, suggestions from experts in the community, & new EPUB revisions (the average interval between updates should be 3 months).
  • Communicate the shortcomings of reading systems to the respective technology developers (periodically).
  • Perform an analysis of the effectiveness of the evaluation system.
  • Revise plans moving forward based on the analysis.
c)      Acceptance of the methodology

Remark: It is dependent on external influences and therefore listed in the priorities.

d)     Ensure the community has the most up-to-date evaluations of reading systems

Remark: It is dependent on external influences and therefore listed in the priorities.

Priorities based on external dependencies & availability of external contributions, 2015

a)     Establish crowd sourced testing & evaluation system.

Conduct a promotion campaign to seek volunteers for testing reading systems.

b)     Maintenance & tracking for updates based on ongoing feedback, evaluation of the effectiveness of the system, etc.

Remark: Listed in milestone

c)      Acceptance of the methodology

The methodology is recognized as a suitable approach for evaluating the accessibility of EPUB 3 reading systems by DAISY members, industry, disability groups, education systems & related government departments.

d)     Ensure the community has the most up-to-date evaluations of reading systems

20 more evaluations via contributions from organizations & individuals. This should also include re-evaluations performed on new versions/models of reading systems that were already evaluated.

Note: To ensure adequate availability of evaluations in the stated timeframe, US$ 5,000 is allocated for outsourcing the testing, if required.

Dependencies & Risks

Dependencies

  • The results are highly dependent on contributions from DAISY members & the DAISY Community.
  • The project depends on collaboration with BISG.

Risks

  • Emergence of a similar evaluation methodology with stronger political backing or market forces.
  • The methodology is in English; this can become a limitation in gaining acceptance & contributions in non-English-speaking regions.