Monday, October 13, 2008
ITM 353 Slides
I gave a talk to the ITM 353 class on Oct 10th. You can view and download the slides at http://docs.google.com/Presentation?id=dccvgnrf_41c4gk6dk4.
Thursday, August 14, 2008
Data Issues: Crosslisted courses
We're actually dealing with this one right now. There's a meeting scheduled on Monday for it. The problem is the difficulty of identifying crosslisted courses and then knowing which department is the primary sponsor of the crosslisted course.
A crosslisted course is a single class that students can register for using two different CRNs (Course Reference Numbers). For example, Asian Studies 608 and Political Science 645C are the same class, with the same teacher, hours, etc. Two students registering one under AS and the other under PS will find themselves in the same class.
The problem we are encountering for eCAFE is that there doesn't seem to be a reliable way to identify a crosslisted course. There is a field in the database named crosslist_group which holds a two-character code, either two letters or a letter and a number. The code is used to group sections together as either crosslisted or concurrent (another story). We were initially told that two letters meant a course was crosslisted and a letter-number code meant it was concurrent. This turned out not to be true. Apparently other campuses use a different convention, and to top it off, the outreach college at UHM has its own convention as well.
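To make the grouping semantics concrete, here's a toy sketch of how sections sharing a crosslist_group code cluster together. The table and column names (other than crosslist_group itself) are my guesses for illustration, not the real ODS schema, and I'm using SQLite in place of the actual database:

```python
import sqlite3

# Miniature stand-in for the section data; names besides
# crosslist_group are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE section (
    crn TEXT, subject TEXT, course TEXT, crosslist_group TEXT)""")
conn.executemany("INSERT INTO section VALUES (?, ?, ?, ?)", [
    ("12345", "ASAN", "608",  "AB"),  # crosslisted pair sharing code "AB"
    ("23456", "POLS", "645C", "AB"),
    ("34567", "HIST", "151",  None),  # ordinary section, no group code
])

# Sections that share a crosslist_group code are (supposedly) one class.
rows = conn.execute("""
    SELECT crosslist_group,
           group_concat(subject || ' ' || course, ', ')
    FROM section
    WHERE crosslist_group IS NOT NULL
    GROUP BY crosslist_group""").fetchall()
for code, sections in rows:
    print(code, "->", sections)
```

Of course, as described above, knowing which sections share a code still doesn't tell you whether the group is crosslisted or concurrent, or which department is the primary sponsor.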
So we asked if there was another method of getting this data, and were told that there was. We were given access to an internal-use table that contained the information we needed. However, when I went to look at it, there was no information for the semester that starts in a few weeks. Upon further inquiry, it turns out that table isn't filled with the current semester's data until a month into the semester. Far too late for our needs. Blocked again.
Let's hope the meeting on Monday provides some illumination to the problem.
Data Issues: changing ids
After discussing my woes about ODS (where our class/student/instructor data comes from), a developer pointed out I should be noting all this stuff down. It's a good reference for explaining why our project took some of the turns that it did, and it may save another developer some headaches. So, here's the first of these.
When we first started designing eCAFE, we were told that the person_uid and id_number fields were the primary values used to identify a person in the database. I don't recall if someone told us this explicitly, or we just assumed it, but it was our belief that the numbers were unchanging. This turned out not to be the case.
While both numbers were fairly constant, during our updates each semester we would inevitably find a few that had changed. Since we built our tables using person_uid as the foreign key that all other tables referenced, this caused big problems. I finally had to write a script to do the following:
1.) Remove the unique index from the person table on the id_number field.
2.) Add a new record with the new person_uid (with same id_number, hence #1).
3.) Go to all dependent tables and change the person_uid to the new value.
4.) Remove the original record in the person table.
5.) Restore the unique index.
What a pain. In the redesigned eCAFE we autogenerate a unique key for each person and use that as the FK in the dependent tables. Now we can change the person_uid field with a simple update statement.
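A minimal sketch of the redesigned layout, using SQLite for illustration: dependent tables reference our own autogenerated surrogate key rather than person_uid, so when ODS renumbers someone the fix is a single UPDATE. Table and column names other than person_uid and id_number are made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.execute("""CREATE TABLE person (
    id INTEGER PRIMARY KEY,     -- our autogenerated surrogate key
    person_uid INTEGER UNIQUE,  -- the external (not-so-stable) id
    id_number TEXT UNIQUE)""")
conn.execute("""CREATE TABLE evaluation (
    person_id INTEGER REFERENCES person(id),  -- FK to surrogate, not person_uid
    crn TEXT)""")
conn.execute("INSERT INTO person VALUES (1, 12345, 'A0001')")
conn.execute("INSERT INTO evaluation VALUES (1, '54321')")

# ODS changed this person's uid: no index drops, no cascading fix-ups,
# just one update against the person table.
conn.execute("UPDATE person SET person_uid = 67890 WHERE id_number = 'A0001'")

new_uid = conn.execute("""SELECT p.person_uid FROM person p
    JOIN evaluation e ON e.person_id = p.id""").fetchone()[0]
```

The five-step index-juggling script above collapses into that one UPDATE statement.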
Wednesday, July 30, 2008
MySQL Notes
I just posted on one of my project-specific blogs about some of the issues we've encountered trying to move data from an Oracle db to MySQL.
I was banging my head against a wall trying to figure out why this:
create table ecafetest.temp_stats (
statistic_id int not null
, survey_id int
, section_id int
, org_id int
, question_id int
, answer_id int
, answer_order_num int
, response_id int
, srs_id int
, urs_id int
, cnt int not null default '0'
, FOREIGN KEY (statistic_id) REFERENCES statistic(id) ON DELETE CASCADE
, FOREIGN KEY (survey_id) REFERENCES survey(id) ON DELETE CASCADE
, FOREIGN KEY (section_id) REFERENCES section(id) ON DELETE CASCADE
, FOREIGN KEY (org_id) REFERENCES org(id) ON DELETE CASCADE
, FOREIGN KEY (question_id) REFERENCES question(id) ON DELETE CASCADE
, FOREIGN KEY (answer_id) REFERENCES answer(id) ON DELETE CASCADE
, FOREIGN KEY (response_id) REFERENCES response(id) ON DELETE CASCADE
, FOREIGN KEY (srs_id) REFERENCES section_response_set(id) ON DELETE CASCADE
, FOREIGN KEY (urs_id) REFERENCES user_response_set(id) ON DELETE CASCADE
, primary key(statistic_id, question_id, answer_id, response_id)
) Engine=InnoDB;
was generating a table where question_id, answer_id, and response_id were declared "not null," so that my insert failed whenever response_id was null. Imagine how silly I felt when I finally noticed the primary key. MySQL doesn't allow null values in a primary key, although they are allowed in all other indexes. So, while I never wrote "response_id int not null," the primary key did it for me.
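For the record, SHOW CREATE TABLE makes the silent rewrite visible. And if NULLs genuinely need to be stored in those columns, one workaround (a sketch of the general technique, not necessarily what we ended up doing) is to swap the primary key for a plain unique index, which MySQL does allow NULLs in:

```sql
-- SHOW CREATE TABLE temp_stats; reveals the silently added NOT NULLs,
-- e.g.:  `response_id` int NOT NULL
-- Workaround sketch: drop the PK, restore nullability on the columns
-- it touched, and use an ordinary unique index instead. Note that
-- MySQL permits multiple rows containing NULLs under a unique index.
ALTER TABLE ecafetest.temp_stats DROP PRIMARY KEY;
ALTER TABLE ecafetest.temp_stats
  MODIFY question_id int NULL,
  MODIFY answer_id int NULL,
  MODIFY response_id int NULL;
ALTER TABLE ecafetest.temp_stats
  ADD UNIQUE KEY stats_uniq (statistic_id, question_id, answer_id, response_id);
```

The trade-off is that a unique index treats NULL-containing rows as distinct, so it enforces less than the primary key did.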
I was dealing with this at the end of yesterday, and wasn't making any progress on it. I finally went home, and within a minute of looking at it this morning, saw my problem. It's amazing how often walking away from a problem brings you the solution.
Thursday, March 27, 2008
User Centered Design
I went to a seminar today titled "Conducting User-Centered Design Tests in 5 Minutes or Less." The speaker was Matthew Winkel, Communications Officer for Web and New Media at The College of New Jersey. The presentation took about 75 minutes, including questions.
The gist was that most people don't have a big budget for design testing, or a lot of time. To top it off, most users have an attention span that tops out at 15 minutes. This led to the idea of quick and dirty user design tests, which I summarize below. The first three can be accomplished with a printout of your interface and a pencil or highlighter; the last two are a little more involved.
The short version of the different types of tests:
1.) Label Test: Hand out a printed screenshot and have the user circle or highlight the labels they find confusing. This shows which labels aren't adequately descriptive or are ambiguous.
2.) Visual Affordance Test: On a printed screenshot, have the users highlight which areas of the page are selectable. This helps you pinpoint which parts of the site users don't realize are clickable, and which parts fool them into thinking they are.
3.) Brand Test: Hand the user a sheet of paper with a list of attributes (ex: trustworthy, cluttered, simple, contrived, etc.), and have them circle the ones they think apply to your site. Of course, this test only works with people who have actually seen/used the site. You could also provide a printed screenshot if your site lends itself to that.
4.) Single Question Web Poll: A popup dialog that asks a single question. Figure out the one thing you most want to know. Since it's only one question, people are more likely to respond than if there were more.
5.) One or Two Task Test: Have the user sit down and assign them a single task (or two, max). Record what they say/do. Users are more likely to participate if they know it will only take 5 minutes or so.
What to ask about? Identify your key task or most desired task. Look at the search logs to see what people are most interested in. Look at Google Analytics (or other tracker/logger) to see what questions people navigate to in the FAQ. That tells you what parts of your site are the most confusing.
How to find people for tests? Approach people in common areas, or have student helpers recruit their friends and friends of friends. Do not use the same students repeatedly.
Thursday, March 20, 2008
Status Report: 3/17-3/20
What I said/did:
- Assess and start training the new guy
<snip>
- Meet w/ ASUH representative to get student-oriented feedback.
Had the meeting and wrote it up on the blog: http://uhmis-ecafe.blogspot.com/2008/03/asuh.html.
The short version is that they pretty much approved the system, liked what they saw, and are very happy we talked to them. They also said that it appeared that privacy concerns were addressed and they had no issues there. We spent the majority of the meeting talking about possible methods of increasing response rates.
- Implementing staff features.
Writing code? What's that?
- Reading: MySQL and ?
The ? turned out to be Ruby. My eyes are tired. Fluorescent lights suck. Let's declare the Manoa Starbucks a "Remote Office"; they have windows.
In other news:
- I mentioned how there needs to be a small barrier for requesting features. I'm trying this out. We'll see how it goes.
- The BSAR specs were posted to the blog and I sent an email to D on Wednesday.
- Some advisory board members were saying that they like the blog, but want notification when posts are made. So, I started a Google Group which users can join and opt for email notification. When a post is made, the blog automatically sends notice to the group, which then disseminates it by email to all members.
Next week:
- Reading (will it EVER end?)
- Working on eCAFE Staff features
- Going to that "User Centered Design Tests" lecture
- Looking into how we can "hide" grades when students don't do their surveys.
Friday, March 14, 2008
Status Report: 3/10-3/14
What I said/did:
- Factor D's most recent comments into the BSAR specs.
Done. I'll be posting it to the site and asking for his review on Monday.
- Implementing staff features.
Didn't really do much here. Ended up doing some edits to the database layout and looking into how/if I can accommodate a feature request. In contemplating how the feature would work, I generated a number of questions I needed them to answer. When I forwarded the questions, the request got dropped.
At one of the sessions I went to at the conference, there was a discussion on users asking for features and how it quickly bloats software. The determination was that users should have to clear some sort of hurdle when making a feature request. If it's too easy, then they'll toss up any old idea, even if hardly anyone would use it. The above example bore that out, and now I'm debating how to incorporate a small "barrier" for future feature requests to make sure it's something that people have really thought about and need, rather than just a whim.
- Reading up on MySQL and maybe some other technologies for BSAR.
I'm working my way through some of the MySQL books. Will be ongoing for a while.
Next week:
- Assess and start training the new guy
- Meet w/ ASUH representative to get student-oriented feedback.
- Implementing staff features.
- Reading: MySQL and ?
While I wrote the above as a guide, it's going to be pretty fluid depending on what happens with the new guy. I'll be playing things by ear next week.