Tuesday, October 1, 2013

Two-Week Project Showcase

The focus of my sophomore-level Advanced Programming course (CS222) is a large project at the end of the semester. In the past, this has been a six-week project, delivered in two three-week iterations. It is a project of the students' own invention: they have to pitch a project to satisfy a set of constraints I provide. This semester, I decided to expand the project to nine weeks with three iterations. This goes hand-in-glove with the revised, achievement-oriented assessment model that I am using this semester.

Most students have never worked in a programming team prior to CS222, much less defined their own projects. To warm them up, I give a two-week programming project before starting the big project. The two-week project is done in pairs, and I provide the requirements analysis as user stories. This provides a model for them to follow when they pitch their own projects using user stories.

This semester, I gave a new two-week project that was inspired by my 2013 Global Game Jam project. The students were challenged to write an application that takes a word from the user and then hits Wikipedia's API to determine the last modified date of the corresponding page. The curriculum provides no prior exposure to Web programming or networking, and so I provided a very short sample program that demonstrated how to open a URL in Java and read the stream. This project touches on some very important ideas, including Web services and structured data.
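That sample was along these lines. This is a reconstruction rather than the actual handout, and the query URL is my best guess at the MediaWiki call; the class and method names are my own:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.URL;

public class UrlReader {

    // Open the URL's stream and read the whole response into one string.
    public static String readAll(URL url) throws IOException {
        StringBuilder response = new StringBuilder();
        BufferedReader in = new BufferedReader(
                new InputStreamReader(url.openStream(), "UTF-8"));
        try {
            String line;
            while ((line = in.readLine()) != null) {
                response.append(line);
            }
        } finally {
            in.close();
        }
        return response.toString();
    }

    public static void main(String[] args) throws IOException {
        // Ask the MediaWiki API for the timestamp of a page's latest revision.
        URL api = new URL("http://en.wikipedia.org/w/api.php"
                + "?action=query&prop=revisions&rvprop=timestamp"
                + "&titles=soup&format=xml");
        System.out.println(readAll(api));
    }
}
```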

In the past, I have evaluated the two-week project in a rather conventional way: I provide a grading rubric at the start of the project, then I check out students' projects from the department's Mercurial server and give each pair a grade. I wanted to do it differently this semester, in part because of the achievement-oriented assessment. The two-week project provides a vehicle for students to earn achievements and write reflections: I'm not evaluating the project itself but rather how students use it to explore course concepts as articulated through the achievements and essential questions.

I decided to devote a class day at the end of the two-week project to a showcase. Each pair had to have a machine running a demo alongside a summary poster. We rearranged the classroom, moving all the tables to the perimeter and clustering the extra tables in the center. To foster peer interaction, I printed ballot sheets on which students could vote for the team with the best UI, the best poster design, the cleanest code, and the most robust input handling.

The students enjoyed this format. There was an energy in the room as the students explored each other's solutions, talking about implementation strategies and UI decisions. A few students had trouble getting their projects working at all, and I heard one student say how disappointed he was, because it left his team unable to fully participate in the activity. This represents positive peer pressure and project orientation, which can be starkly contrasted against instructor pressure and grading orientation.

I had recommended two strategies in class: using Joda Time to handle time formatting and using Java's DOM parser to deal with Wikipedia's output. I was surprised to see that almost every team used Joda Time (and used it to earn the Third Party Librarian achievement) but only one team used DOM. Every other team read the output stream as a single string and then searched it using Java's String class. This provided an excellent opportunity to teach about input validation. My sample Wikipedia URL queried the "soup" page for its last modified time, and the result looks like this:

<?xml version="1.0"?>
<api>
  <query-continue>
    <revisions rvcontinue="574699285" />
  </query-continue>
  <warnings>
    <revisions xml:space="preserve">Action 'rollback' is not allowed for the current user</revisions>
  </warnings>
  <query>
    <normalized>
      <n from="soup" to="Soup" />
    </normalized>
    <pages>
      <page pageid="19651298" ns="0" title="Soup">
        <revisions>
          <rev timestamp="2013-09-27T05:20:10Z" />
        </revisions>
      </page>
    </pages>
  </query>
</api>

Keep in mind that this is coming in as one continuous stream without linebreaks. Aside from the one group that did an appropriate and simple DOM lookup, students used String#indexOf(String) to search for "timestamp=", and then did manual string parsing using that reference point. This approach works for most cases, but it opens the application up to an attack that I'll explain in the next paragraph, giving the reflective reader a moment to consider it.
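The pattern looked roughly like this; this is my sketch of the general shape, not any team's actual code:

```java
public class NaiveParser {

    // Fragile: search the raw response text for the timestamp attribute
    // and slice it out with manual index arithmetic.
    public static String extractTimestamp(String response) {
        String marker = "timestamp=\"";
        int start = response.indexOf(marker) + marker.length();
        int end = response.indexOf('"', start);
        return response.substring(start, end);
    }
}
```

For ordinary page titles, this does return the right answer.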

If you ask the application for the last modified info of the Wikipedia page "timestamp=", you get a phenomenon similar to an SQL injection attack: the indexOf operation picks up an unintended location, and the manual string manipulations fail. I had seen this when meeting with a pair the previous week who were working on their Advice Seeker achievement. They had thought their solution to be rock solid, and they were appropriately excited when I showed them how to crash it. They became my covert hitmen during the showcase, crashing solution after solution by finding holes in string parsing logic. So, while few students took the opportunity to learn XML parsing in the two-week project, maybe they learned something even better: the embarrassment of doing it the lazy way and seeing your system go down in public!
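For comparison, the DOM lookup is only a few lines, and it is immune to this trick because the parser distinguishes an element's attributes from text that happens to appear inside some other attribute's value. A sketch, parsing from a string here rather than the live stream (the extracted value could then go to Joda Time for formatting):

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

public class DomTimestampParser {

    // Robust: let the XML parser find the <rev> element, then read its
    // timestamp attribute, no matter what the page title contains.
    public static String extractTimestamp(String xml) throws Exception {
        DocumentBuilder builder =
                DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document doc = builder.parse(new InputSource(new StringReader(xml)));
        NodeList revs = doc.getElementsByTagName("rev");
        if (revs.getLength() == 0) {
            return null; // the response carries no revision data
        }
        return ((Element) revs.item(0)).getAttribute("timestamp");
    }
}
```

On a query for the page literally named "timestamp=", this still extracts the real revision timestamp, whereas the indexOf approach latches onto the `from="timestamp="` text in the `normalized` element and slices out garbage.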

When I explained to the students what we were doing at the showcase, I had expected teams to show up at their stations when I came by. However, it seems they were so excited to see each other's work that they didn't think about this. Since not every team had a person at their station when I came around, I was not able to give my expert evaluation to each group, nor was I able to model for the students how to give critical feedback. On the other hand, the students got a lot of peer feedback and I was able to meld into the group, becoming just one of many people interested in seeing demonstrations and code. I am not yet sure if I would do this part differently next time or not.

One aspect that is still unclear to me is the extent to which students were working for intrinsic motivation versus extrinsic reward. I was approached after class by a student from one of the teams whose solution was not working. During the hour, they had talked to other students and realized what they did wrong, which is an activity I certainly want to foster. The student asked, clearly in a state of agitation, if his group could fix their application even though it was due to be completed the previous night. I confirmed that this would be fine, and the student went away expressing joy and thanksgiving. I suspect his perspective was that he had just been given an opportunity to save his grade. What I really gave him was an opportunity to make his project work and feel good about getting it done, even if a bit late. I don't think he realized that there's no entry in the course grading policy for the two-week project, that the whole thing was just a fun context for us all to play and learn together. I hope that when he figures this out, he sees this as a reflective learning opportunity and not simply smoke and mirrors.

In conclusion, I am very happy with the showcase format. It was definitely worth using a class meeting for this event. I think this two-week project was particularly well-suited to the showcase format since it's fairly small, permits multiple solutions, helps students build better understandings of the modern computing ecosystem, and can have interesting failure states. Perhaps next time around I need to add an achievement related to XML parsing, since the Third Party Librarian achievement seemed to promote students' use of Joda Time quite well.

(I have some nice pictures of the showcase that I took while standing on top of the teaching station, but I feel like I can't post them here without my students' permission. Sorry.)
