If you conduct multiple choice testing, consider the following questions:

- Are all of our questions fair and well-written?
- Does a test result reliably indicate competence or otherwise?
- Can we prove we have tried to ensure our tests are effective?
- Does our testing programme provide useful feedback on the effectiveness of our training?

The iOTA (intelligent objective test analysis) programme provides an easy way to analyse your multiple choice testing and answer all of these questions. Provided you currently use an automated testing system (e.g. AdTest, TestMaster) or you transfer your manual marking to a database or spreadsheet, iOTA can be geared to work automatically on data you already possess.
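For illustration only, the sketch below shows one way sitting data exported to a spreadsheet might be read and scored in Python. The layout (one row per candidate, one column per question, plus a one-row answer-key file), the file names and the function names are all hypothetical; they are not iOTA's actual input format.

```python
import csv

# Hypothetical layout, not iOTA's actual input format:
#   responses.csv - one row per candidate, one column per question,
#                   each cell holding the option letter the candidate chose
#   key.csv       - a single row giving the correct option for each question
def load_sitting(responses_path="responses.csv", key_path="key.csv"):
    with open(key_path, newline="") as f:
        key = next(csv.DictReader(f))           # e.g. {"Q1": "B", "Q2": "D", ...}
    with open(responses_path, newline="") as f:
        responses = list(csv.DictReader(f))     # one dict of answers per candidate
    return responses, key

def score(responses, key):
    # One mark per question answered with the keyed option, zero otherwise.
    return [{q: int(row.get(q) == correct) for q, correct in key.items()}
            for row in responses]
```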
The software analyses the available sitting data for each multiple choice question and test, producing a straightforward report that comments on:

- Facility – how easy is the question, and the test overall?
- Discrimination – how well does the question distinguish between a more able and a less able candidate? (A generic sketch of how facility and discrimination are typically calculated follows this list.)
- Soundness – how far does it appear that the correct answer has been identified and is being marked as correct?
- Distractor effectiveness – which, if any, of the distractors (incorrect options) are not credible and are therefore reducing test effectiveness?
- Correlation – how does the ability of the candidates compare with previous sittings? Has the training been effective for this group?
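Building on the hypothetical data layout sketched above, the following generic example shows how facility, discrimination (as a corrected item-total correlation) and distractor counts are commonly calculated in classical item analysis. It illustrates the concepts only; it is not iOTA's own algorithm, and the function names are invented for this example.

```python
from statistics import mean, pstdev

def pearson(xs, ys):
    # Pearson correlation; returns 0.0 when either series is constant.
    mx, my = mean(xs), mean(ys)
    sx, sy = pstdev(xs), pstdev(ys)
    if sx == 0 or sy == 0:
        return 0.0
    cov = mean((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (sx * sy)

def item_statistics(scored, responses, key):
    totals = [sum(row.values()) for row in scored]      # total score per candidate
    report = {}
    for q, correct in key.items():
        item = [row[q] for row in scored]                # 0/1 per candidate on this question
        rest = [t - i for t, i in zip(totals, item)]     # total score excluding this question
        chosen = [row.get(q) for row in responses]
        report[q] = {
            # Facility: the proportion of candidates answering correctly.
            "facility": mean(item),
            # Discrimination: correlation between success on this question and
            # performance on the rest of the test (higher values mean the question
            # separates more able from less able candidates).
            "discrimination": pearson(item, rest),
            # Distractor effectiveness: how often each incorrect option was chosen;
            # a distractor that nobody picks adds nothing to the question.
            "distractors": {opt: chosen.count(opt)
                            for opt in set(chosen) if opt != correct},
        }
    return report
```

Soundness checks and cross-sitting correlation go further, since they compare results against the answer key and against earlier sittings; that broader analysis is what an automated tool handles for you.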
In financial services, iOTA represents an excellent resource with which to face the Regulator. You can provide evidence that proper steps have been taken to ensure that testing produces a reliable indication of competence.