I've been meaning to post here for a while about what I am working on, but I keep posting in the Moodle Quiz Forum instead, because what I have to say seems more relevant there.
Anyway, the short answer is that I am rewriting the Moodle Question Engine. That document explains the what and the why. I want to post here about the how.
The question engine is the part of Moodle that processes what happens as a student attempts questions as part of a quiz (or other activity). What makes this a relatively difficult problem is that there are two independent degrees of variation. A question may be one of many different question types (multiple choice, short-answer, essay, ...) and there are various ways that a student may interact with it (adaptive mode, non-adaptive mode, manually graded, ...). Plus, people seem to think that when quizzes are used for summative assessments, it is quite important that there are no bugs and that performance is good ;-).
The complexity of the problem, and my natural bias, mean that I am taking a very object-oriented approach, more so than is normally used in Moodle. Separate objects, with clearly defined responsibilities, help me understand the problem and give me confidence that what I am doing will work.
If you read the document linked to above, you will learn that I have been dissatisfied with this part of the quiz code for years; however, it is really only within the last year that I have worked out the overall approach to solving this problem nicely. The key realisation is that you need to apply the strategy pattern twice: once for the bits that vary with the question type, and once for the bits that vary with the interaction model. The other important part of the plan is a cleaner database structure that more closely maps to the data we need to store.
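To give a flavour of what I mean by applying the strategy pattern twice, here is a rough sketch. None of these names are the real API; they are just my illustration of how the two axes of variation stay independent of each other.

<?php
// A rough sketch only: these interfaces and this class are illustrative, not
// the real question engine API.

// Strategy 1: the bits that vary with the question type.
interface question_type_strategy {
    /** Grade a response, returning a fraction between 0 and 1. */
    public function grade_response(array $response);
}

// Strategy 2: the bits that vary with the interaction model.
interface interaction_model_strategy {
    /** Process one action by the student, updating the attempt as appropriate. */
    public function process_action(question_attempt $attempt, array $submitteddata);
}

// The attempt delegates to both strategies, so the two axes of variation
// never need to know about each other.
class question_attempt {
    protected $qtype;
    protected $model;

    public function __construct(question_type_strategy $qtype,
            interaction_model_strategy $model) {
        $this->qtype = $qtype;
        $this->model = $model;
    }

    public function process_action(array $submitteddata) {
        $this->model->process_action($this, $submitteddata);
    }

    public function grade_response(array $response) {
        return $this->qtype->grade_response($response);
    }
}

The point is that adding a new question type should never require touching the interaction models, and vice versa.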
However, implementing all that requires quite a lot of code. I need to build the core system that can support all the different interactions and question types. Then I need to create all the interactions, to replicate Moodle's existing functionality and to add the new features we want (Certainty Based Marking, and what we are calling the Interactive mode). Then I need to modify all the existing Moodle question types to fit into the new system. And I need to do all this with robust, bug-free code that we can confidently use for summative assessment.
Over the last year I have become increasingly converted to test-driven development. Certainly I wrote a lot of tests while eating my elephant, and this time round I am being very thorough with testing, although I must admit I still don't always write the tests first.
I am also proceeding very iteratively. I started with a test that took a single true-false question through being attempted according to the simplest of the interactions, and wrote some code that used roughly the right classes to make that test pass. Then for about a week I did little more than refactor that code.
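To give an idea of what such a walk-through test looks like, here is a sketch in the style of the SimpleTest library that Moodle uses. The question-engine methods called here (start, process_action, finish, and so on) and the two helper functions are just my illustration of the shape of the test, not the actual API.

<?php
// A sketch of a walk-through test, assuming SimpleTest's UnitTestCase has been
// loaded by the test framework. The method names and helpers are illustrative.

class truefalse_walkthrough_test extends UnitTestCase {
    public function test_truefalse_deferred_feedback() {
        // Create a true-false question and start an attempt at it, using the
        // simplest interaction (submit everything, grade at the end).
        $question = make_a_truefalse_question(); // Hypothetical test helper.
        $attempt = make_a_question_attempt($question, 'deferredfeedback'); // Also hypothetical.
        $attempt->start();

        // The student selects 'True', then submits the attempt for grading.
        $attempt->process_action(array('answer' => 1));
        $attempt->finish();

        // The attempt should end up graded correct with full marks.
        $this->assertEqual($attempt->get_state(), 'gradedright');
        $this->assertEqual($attempt->get_fraction(), 1.0);
    }
}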
It turned out that most of my variables and classes had the wrong names. For example, at one point I had classes called question_state and question_states, which was clearly a recipe for confusion. It was only after I had working code that I was able to think of the name question_attempt_step (one step in a question_attempt) for one of them. I also came to realise that grades were stored in three separate ways in different places, and so I ought to make sure I was using a consistent naming scheme for my variables. Then I did a bit of moving methods between classes as the exact boundaries of responsibility of different classes became clearer. Should I have been able to get all that right first time? Well, I think that would have been impossible.
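In case the naming is hard to follow in the abstract, this is roughly the structure I have ended up with, sketched with illustrative field names rather than the final ones: a question_attempt holds an ordered list of question_attempt_steps, and each step records the state, the grade (as a fraction), and the data submitted at that point.

<?php
// A rough sketch of the relationship between a question_attempt and its
// question_attempt_steps. The field names are illustrative, not final.

class question_attempt_step {
    /** @var string the state the question moved into at this step, e.g. 'complete'. */
    public $state;
    /** @var float|null the grade, as a fraction between 0 and 1, if one was assigned. */
    public $fraction;
    /** @var array the data submitted (by student or teacher) at this step. */
    public $data;

    public function __construct($state, $fraction, array $data) {
        $this->state = $state;
        $this->fraction = $fraction;
        $this->data = $data;
    }
}

class question_attempt {
    /** @var array of question_attempt_step, in the order they happened. */
    protected $steps = array();

    public function add_step(question_attempt_step $step) {
        $this->steps[] = $step;
    }

    public function get_last_step() {
        return end($this->steps);
    }
}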
I also started gradually removing the simplifying assumptions that worked for my initial true-false test question, but not more generally. That was the point at which I started extending my code to other question types and interactions. For example, I wrote a test case walking an essay question through the manual grading interaction, which exposed some new issues that had to be accounted for; and just today I introduced multiple-choice questions, which randomise the order of the choices when you start an attempt, which means you then have to store that random order somewhere so it can be used throughout the attempt. I have also already implemented the certainty based marking interaction.
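To make the multiple-choice example concrete: the trick is to work out the shuffled order once when the attempt starts, and store it in the first step's data, so that every later step sees the same order. Something like this, where the '_order' key and the method names are just my illustration.

<?php
// A sketch of remembering the shuffled choice order: compute it once when the
// attempt starts, store it in the first step's data, read it back later.

class multichoice_question {
    /** @var array ids of the available choices. */
    public $choiceids = array(11, 12, 13, 14);

    /** Called when the attempt starts; returns data to be stored in the first step. */
    public function get_start_data() {
        $order = $this->choiceids;
        shuffle($order);
        // A leading underscore marks data the question stored itself, as
        // opposed to data submitted by the student (my convention here).
        return array('_order' => implode(',', $order));
    }

    /** Later on, recover the same order from the stored first-step data. */
    public function get_order(array $firststepdata) {
        return array_map('intval', explode(',', $firststepdata['_order']));
    }
}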
There is still much to do. My code that outputs the HTML for a question is still a rough hack of the current code that lets the tests pass. It will need several rounds of refactoring before it is cleaned up to my satisfaction. There are still five more interactions to write, although each one is getting quicker than the one before. Then I have to convert all the existing question types. I need code that stores the current state of everything in the database and later reads it back. Finally, the nasty bits: upgrading from the current database structure to the new one, and re-implementing backup and restore, including restore of old backups.
If you were paying attention in that last paragraph, you will have deduced that my code does not yet store anything in the database! I am taking an approach where the domain objects are completely unaware of the database (which makes testing very easy), and I am planning to use the data mapper pattern to coordinate loading and saving the necessary data efficiently. I am pretty sure I know how to do that; however, I have not tried to write it yet, because I want to re-read the relevant chapters of Patterns of Enterprise Application Architecture, and my copy only recently got back from Australia, and I won't be able to retrieve it from my parents' house until this weekend.
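For the avoidance of doubt, here is the shape of what I have in mind, with placeholder table and column names, a $db object standing in for Moodle's database layer, and question_attempt methods that I have not actually written yet.

<?php
// A minimal sketch of the data mapper idea: the domain objects never touch the
// database; a separate mapper turns them into rows and back. Everything named
// here is a placeholder, not the final schema or API.

class question_attempt_mapper {

    /** Save an attempt and all its steps as rows in two tables. */
    public function save($attempt, $db) {
        $attemptrow = new stdClass();
        $attemptrow->questionid = $attempt->get_question_id();
        $attemptid = $db->insert_record('question_attempts', $attemptrow);

        foreach ($attempt->get_steps() as $sequencenumber => $step) {
            $steprow = new stdClass();
            $steprow->questionattemptid = $attemptid;
            $steprow->sequencenumber = $sequencenumber;
            $steprow->state = $step->state;
            $steprow->fraction = $step->fraction;
            $db->insert_record('question_attempt_steps', $steprow);
        }
        return $attemptid;
    }

    /** Load an attempt back, re-assembling the domain objects from raw rows. */
    public function load($attemptid, $db) {
        $attemptrow = $db->get_record('question_attempts', array('id' => $attemptid));
        $steprows = $db->get_records('question_attempt_steps',
                array('questionattemptid' => $attemptid), 'sequencenumber');

        $attempt = new question_attempt($attemptrow->questionid); // Hypothetical; the real constructor will need more.
        foreach ($steprows as $steprow) {
            $attempt->add_step(new question_attempt_step(
                    $steprow->state, $steprow->fraction, array()));
        }
        return $attempt;
    }
}

The important property is that question_attempt and question_attempt_step never contain a single database call; only the mapper knows about tables.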
In terms of time-scales, I am really hoping to get this done by about Christmas, and in time to go into Moodle 2.0, since it breaks a number of APIs. However, I first have to implement this in the OU's Moodle 1.9 code-base and then port it to Moodle 2.0, so no promises. This may have to wait until Moodle 2.1.
Anyway, that is what I have been doing, and so far it has been very enjoyable. Writing tests, making them pass, then refactoring aggressively with the tests as a safety net is an approach that is working really well for me. These are quite big, scary changes, but right now I am feeling confident that it will all work out, and that we will end up with a really solid question engine upon which to base an enhanced quiz and other Moodle activities.