Closing In on the End of Summer Semester

It’s been a while since I last updated – things have been moving very quickly.

Here’s how we’ve spent the last few weeks:

Week 6:

I prepared the first drafts of the testing documents, then asked for feedback. I passed responsibility for several of them to another teammate, which went well until I realized the team had entered an endless feedback loop. It would have helped these documents reach a more “final” state if we had specified either a final decision-maker for each one or a date by which they had to be finished. Having taken on a leadership role in guiding the user testing and creating the documents, though, I ended up as the one with final say over everything, which is a lot of responsibility. I didn’t feel comfortable handing that off, but an email exchange with Kyle really helped clear up what I should and shouldn’t be responsible for, and things went much more smoothly after that.

Week 7:

  • Finalized testing documents
  • Contacted and confirmed with Kyle’s recruited participants
    • 1 expert tester, Aaron Schmidt
    • 7 users registered for or interested in the course
  • Pilot user testing session with adviser Linda Braun

A lot of this week was spent on administrative tasks like contacting and confirming with participants, as well as ironing out how we were going to use Google Hangouts to facilitate the sessions. One teammate also stepped up and said she was prepared to moderate several tests, which was a relief! We agreed that the best approach to moderation was to make each session feel like a conversation about the site rather than a formal process, and the pilot test, conducted this way, went very well.

Week 8:

  • 6 user testing sessions via Google Hangouts on Air (1 cancelled)
    • Several technology issues came up; for future sessions, I created a Best Practices document
  • Site walkthrough with expert tester Aaron Schmidt of Walking Paper

After the first few sessions, we had a sense of what the red flags were: the primary one was navigating back and forth between the course site and the WordPress dashboard, along with a handful of small functionality and minor visual issues. We also got great commentary on the look and feel from our expert reviewers during both the pilot test and the walkthrough. Google Hangouts turned out to be a great tool for facilitating user sessions via screensharing, although a number of participants had technical issues we couldn’t anticipate: outdated plugins without the admin rights to install new ones, malfunctioning microphones – you name it.

Going into Week 9:

  • I put together a User Testing Data spreadsheet, where we’ll enter our notes on the user testing sessions. I asked teammates to commit to entering data by Sunday evening.
  • We’ll use that data to write our Recommendations Report. I asked teammates to commit to drafting their pieces of the results and recommendations by Wednesday so we can all review them, and we’ll submit those pieces by Sunday night. Other pieces, like the related research, we’ll hand in on Sunday if we’ve had time to complete them; otherwise, they’ll be submitted at the end of the semester along with our handoff report.

Weeks 5 & 6: Heuristics and Use Cases

After a very productive synchronous meeting on Tuesday night, our team has a revised schedule that accommodates:

  • Team learning about UX processes and collaboration on testing tasks
  • A deeper focus on use cases as an artifact that can be shared across teams
  • Time for Kyle to finish adjustments to the interface


One teammate and I spent Week 5 working on a Heuristics document that we anticipate will also be shared across teams; it lays the groundwork for the testing tasks, which we’ll be writing up for Week 6.

Meeting with Kyle also clarified a lot of what’s important to him – I think his expectation is that we’ll provide ongoing heuristic evaluation for all aspects of the site, although as a team we’ve agreed to continue with plans for formal(ish) usability testing to make sure the site aligns with user expectations.

He also made the point that the audience we’re focusing on is LIS-oriented, but that’s not what’s important in designing a learning experience for them – some of them may be very new to online learning, and it’s that aspect of these users’ experience that we should be focusing on. I thought this was a really interesting point – the LIS aspect of our users’ backgrounds will be important in developing content, but it’s their experience and expectations as learners that we need to focus on.

SLIS 298: Weeks 3 and 4 – Personas, Heuristics and Use Cases

After submitting our reports on personas, heuristic evaluation and use cases, the instructor evaluated our group’s direction. He’d like us to focus our energy on use cases and heuristic checklists, because personas require a lot of investment and the return might not be that great in such a short timeframe. I think his idea is that we’ll provide this information to other teams and we can all participate in a kind of self-evaluation, and my thinking is that we’ll be able to use those ideas as a framework for testing. Meanwhile, any test prep we can do outside of the actual interface is useful.

We have a Google+ hangout scheduled for 6:30 pm tomorrow evening (Tuesday, 7/2), so I’m hoping to get the following ironed out then:

  • What the instructor means by “heuristics” and what kind of deliverable he’s expecting there
  • How we should be working with the other teams, or at least communicating our role to them
  • What the timeframe is for a class-directed heuristic evaluation, followed by user testing
    • My concern is that it will take a long time to collect and make changes based on class feedback, and we may not have time for both as envisioned
  • A revised schedule

I did tell the instructor in last week’s progress report that I was concerned about the testing timeframe, and he responded that he needs two weeks to finish the first round of the interface. I think we’ll need a week after that to make sure the testing questions align with what’s happening in the interface, so those should be the big considerations in revising the schedule.

SLIS 298: Week 2 – User Experience Basics

So as it turns out, the agenda for this week has changed – the professor does want to follow the UX-heavy schedule I set last week, but we’re taking a time out this week to put together writeups on personas, heuristic evaluations and use cases. These reports include:

  • A simple, one-sentence definition of your concept;
  • An elevator pitch (an extended definition with a brief description of the concept);
  • Some research on the concept (contributed to the group’s Zotero library), with a summary of the pros and cons and what researchers say;
  • An outline of how putting this concept into practice this semester might be helpful and how to make it work.

I’m assigned personas, which is a bit of a review for me, but it’s useful to catch up on current thought in the field (although it hasn’t changed much since I studied them last fall). Writing the concept up to be consumed by someone without my background is good experience, though – so far this semester looks like it may be a lot more about co-teaching than actually doing the UX work I put on the schedule, which is different from what I expected but not bad.

Update: after Google Hangout with Linda, 6/20

I’m going to update the schedule to reflect what we talked about, including divvying up work instead of working collaboratively, which should allow us to push forward with the schedule we’ve set.

Week 4, next week:

  • We’ll each provide a deliverable based on our research from this week, week 3
  • I’ll suggest a Google hangout to review/provide feedback, and by the end of the week we should be ready to perform the heuristic eval and write testing tasks
  • I’ll share the Personas document, team schedule and any other deliverables with Linda

Week 5:

  • We’ll each perform heuristic evaluation
  • One person will write up testing tasks, one will recruit and one will put together a plan for how tests are conducted
  • At the end of the week (Sunday), we’ll be ready to conduct a pilot test with Linda

Week 6:

  • Conduct testing

Ideas for a product/article:

  • different models of interaction in an online learning environment
    • leader
    • follower
    • quiet worker
    • different learning styles
    • how the instructor’s interaction makes a difference
  • education “startup” feel
    • sense of hurry, urgency
    • following a development model
    • usability in education
    • carving out a “space” in academia

SJSU SLIS 298: MOOC Research and Development

I will be using this blog to post updates, progress reports and general thoughts from my experience in SJSU SLIS 298: MOOC Research and Development, which I’m taking as an independent study through Simmons. This will help me communicate with my advisor, Linda Braun, as well as provide Simmons with an artifact at the end of the semester.

Progress Report

We are finishing up Week 2 of the course, and so far it’s been a whirlwind. I have:

  • been assigned to a team (Web Design & User Experience)
  • collected research articles relevant to my team’s folder
  • started the reading
  • started creating documents that will serve as artifacts throughout the course
  • volunteered to write this week’s progress report for our team, due tomorrow
  • created a team schedule for the 10-week semester

Research Articles

Here are the two articles I’ve contributed to our team’s reading list:

  • Kop, R. (2011). The challenges to connectivist learning on open online networks: Learning experiences during a massive open online course. The International Review of Research in Open and Distance Learning, 12(3), 19-38. Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/882/1689
  • Fini, A. (2009). The technological dimension of a massive open online course: The case of the CCK08 course tools. The International Review of Research in Open and Distance Learning, 10(5). Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/643/1402

I’ve also pointed my teammates to some user experience and usability resources we can use as a framework to make decisions, primarily from NNG and usability.gov.

On my reading list:

  • W3C’s Accessibility Evaluation. (2013). Easy Checks – A First Review of Web Accessibility. Retrieved from http://www.w3.org/WAI/eval/Overview.html
  • There are 175 other articles in the class Zotero folder – I need to go through and pick some out that will help provide background for writing up personas and use cases.

Team Schedule

Below is an overview of what that schedule looks like. I wrote it following the user experience design and usability testing process I’m familiar with, and tried to fit everything in in a way that was reasonable for such a short timeframe. When the professor reviewed the schedule, he approved it but mentioned that terms like personas and use cases might be unfamiliar vocabulary for some of my group. I’m a little concerned that I’ve put together a schedule that’s too heavy on user experience design and not heavy enough on what other team members focus on, so this schedule may change over the course of the semester. In addition, if my teammates aren’t familiar with user experience practices, we may not be able to accomplish everything I’ve set out to do!

Week 2

  • Review and post plan and schedule on team space
  • Research MOOC audience and user experience

Week 3

  • Research MOOC audience and user experience; begin writing up use cases and personas

Week 4

  • Finish writing up use cases and personas; begin writing user testing tasks
  • Review use cases and personas
  • Share use cases and personas

Week 5

  • Finish writing user testing tasks
  • Review user testing tasks

Week 6

  • Evaluate MOOC based on usability/user experience heuristics; recruit 5 participants for user testing
  • Write up recommendations based on evaluation
  • Review recommendations
  • Share recommendations

Week 7

  • Make changes to MOOC based on recommendations
  • Review changes; update user testing tasks as necessary

Week 8

  • User testing
  • Write up user testing results and recommendations
  • Share results and recommendations

Week 9

  • Make changes to MOOC based on user testing
  • Review changes

Week 10

  • Write Handoff Report
  • Review Handoff Report
  • Submit Handoff Report


Next Steps

Here are the things I’ll be focusing on in the next week:

  • Reading to support personas and use cases
  • Developing a method of communicating with the team – this may mean continuing to post in a group blog, meeting synchronously, emailing, or some combination
  • Figuring out who will be responsible for what in the coming weeks

And the week after:

  • Writing personas and use cases with my team members and reviewing them together
  • Starting to think about user testing tasks

Posts from LIBR281

Just a quick clarifying update: the posts from January-May 2013 are from a WISE course I’m just finishing up, Transformative Learning and Technology Literacies, with Professor Michael Stephens. It’s been a great experience, and I’ll be working with him on an independent study over the summer to build a MOOC out of another course he teaches, the Hyperlinked Library.


Jolicloud, for those who have a BUNCH of cloud services in their PLN

Thought this might be a useful tool for anyone else who’s using multiple cloud services in their personal and professional lives:

http://unclutterer.com/2013/05/02/organize-all-your-cloud-services-with-jolidrive/


I’m going to give it a try – let me know if you are too, and how you like it!