October 18th – live. This webcast is being recorded for anyone who wants to watch later. Presenter = Cinthya Ippoliti.
1. navigation – how many times to try an action before getting the concept (browser interruption and no sound). Discussion from whiteboard: learning is scary, change is difficult, navigation isn't always obvious, user inexperience, anxiety, lack of confidence.
2. about half of audience has experience creating tutorials
3. challenges we have encountered (whiteboard): marketing, time, committee agreement about what to cover in tutorial, content, what to include
4. Where to start? They had a subcommittee (about 5 people) that met to determine the elements of the tutorial, and they did a needs assessment survey of faculty and students. They ended up adapting the TILT tutorial. The needs assessment showed that faculty want info on accessing info and evaluating info, with some wanting students to know how to define an info need and use info ethically/legally. Interactivity and ease of use were the features they thought most important. Students were interested in where to look for info and how to find reliable info (90%). 74% said they would use the tool on their own without prompting.
5. 2-year process. Used HTML, Flash CS3/CS5 with ActionScript 2.0, and some XML. (The campus needs to understand there may be $ involved and a learning curve for the person learning the new technology... so how committed is the institution to creating this tool?)
6. lessons learned – things don't always work better by committee, and don't be afraid to ask for assistance. Collaboration = great, but you don't always need it during the creation process. They wanted something basic, no bells and whistles – more of a "classic" rather than something with technology that would quickly look dated.
7. most literature came from the IT and education sectors. "Interactive" meant something different to everyone (i.e., text-based, home-grown and maintained, does not allow free user input, linear). Many were "click on a page of text, read it, and that's how you'll learn." This was what not to do.
8. some element of free-form navigation, varying levels of difficulty, facilitate the acquisition and use of strategies, encourage and develop inductive learning (here's a specific thing, but here's how you can apply it to a broader set of concepts), help is provided but the answer is not given. In addition (these are a little more advanced because they are course-related and branching): it should give educational objectives, course/assignment-related functions, branching capabilities, problem-based and concept-based elements, and assessment and feedback.
9. whiteboard comments ensued. Here is her definition: present the content in a visual manner that allows students to directly interact with the material rather than simply reading text – whether that consists of a rollover, drag and drop, or clicking. Students are doing something throughout the entire tutorial – they have to be engaged in order to use it.
10. what content to include? Four areas of IL. How to conceptualize these abstract ideas? (How do you make it concrete?) How to boil them down to their core elements? (The ACRL standards are complex and thorough, like our Oregon proficiencies – but how do you boil them down?) Less is more! Get away from the idea that they have to learn everything. Use self-monitoring to make sure content is as streamlined as possible.
A. came up with raw content (storyboarding) page by page, and each page had a set of objectives (brainstorming phase – not all objectives ended up in the final product).
B. finished product comparison (text and then what it actually ended up looking like – link is in the webcast recording). (My note: approach examples were narrative, causal analysis, compare/contrast, or argument, to fit with their writing assignments.) Go to http://www.pvc.maricopa.edu/researchhelp/swfs/selecttopic.swf
(my note- yesterday’s activity of watching a class take a tutorial)
12. every module has forward/back buttons on each page. Flash opens in its own window – starts off as a website, icons maintain the look and feel throughout. What to make interactive? Every module has interactivity depending on the content covered – e.g., picking a topic does not require students to do anything, but recognizing how to narrow or broaden a topic was a perfect chance for them to practice with a drag-and-drop activity.
13. each module designed as a stand-alone. 4 modules = getting started, finding resources, evaluating info, plagiarism 101.
14. used a food analogy to show Boolean operators (AND/OR/NOT) and then a menu drag-and-drop activity
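The food analogy maps neatly onto set operations, which is presumably what the menu activity was teaching. A minimal sketch of my own (the menu items are invented for illustration, not from the webcast):

```python
# Boolean search operators illustrated as set operations, mirroring the
# food-menu drag-and-drop activity. Menu items are hypothetical examples.
dishes_with_cheese = {"pizza", "lasagna", "quesadilla", "mac and cheese"}
dishes_with_tomato = {"pizza", "lasagna", "salsa", "gazpacho"}

# AND narrows the search: only dishes matching both terms
both = dishes_with_cheese & dishes_with_tomato

# OR broadens the search: dishes matching either term
either = dishes_with_cheese | dishes_with_tomato

# NOT excludes a term: cheese dishes without tomato
cheese_not_tomato = dishes_with_cheese - dishes_with_tomato

print(sorted(both))              # ['lasagna', 'pizza']
print(sorted(cheese_not_tomato)) # ['mac and cheese', 'quesadilla']
```

The point of the analogy: AND shrinks the result set, OR grows it, NOT carves pieces out – the same behavior students will see in a library catalog search.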
15. without a server there's no long-term data, and they rely on students to print out their results. Pre- and post-quizzes allowed flexibility for classroom use. Students can go back and review, and they get immediate feedback on every answer. Note – doesn't give the answer if the student chooses wrong, just gives a hint (best practice note). Note – because there's no database, they can't force a student into the appropriate module, only strongly recommend a next step based on the quiz results. Another note – tutorial is specific to the Maricopa catalog and library. Note – uses Camtasia for a quick embedded video tutorial.
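The hint-instead-of-answer feedback pattern noted above is easy to sketch as simple branching logic. This is my own illustration (the question, options, and hint text are hypothetical, not taken from the Maricopa tutorial):

```python
# Sketch of the "hint, not answer" quiz feedback pattern described above.
# The question content is a made-up example.
QUESTION = {
    "prompt": "Which search returns FEWER results?",
    "options": ["cats OR dogs", "cats AND dogs"],
    "answer": 1,  # index of the correct option
    "hint": "Think about which operator narrows a search.",
}

def give_feedback(question, choice):
    """Return immediate feedback: confirm a correct answer, or give a
    hint on a wrong one without revealing the answer (the best practice
    noted in the webcast)."""
    if choice == question["answer"]:
        return "Correct!"
    return "Not quite. Hint: " + question["hint"]

print(give_feedback(QUESTION, 0))  # wrong choice -> hint only
print(give_feedback(QUESTION, 1))  # right choice -> confirmation
```

The design choice matters: returning a hint keeps the student working the problem, while revealing the answer would let them click through without engaging.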
17. at the end of the tutorial there is an option to email the librarian at their school and get more help finding resources on their topic. My question – what is the relationship among the partner schools? Note – Robin had a different impression of the research cited in reference to the number of students tested.
18. started in fall of 2009; had students complete a series of tasks (so they did complex task observation) and talk while working through the tutorial ("I'm thinking of clicking here..." etc.). Did a demographic questionnaire and a post-test questionnaire. Students were welcomed and assured that they were not being tested. Complex tasks = where would you click in order to go to the first module of the tutorial? Where would you click in order to begin taking the pre-quiz? ...etc. (see all questions in the webcast). Then a post-questionnaire to gauge ease of use (informative? directions clear? feedback? where would it be most helpful – library page/error screen...etc.?).
They said it was informative and clear but wanted more feedback. The look was OK. Responses varied in terms of using it as a stand-alone tool. Overall, ease of use and modular content were the best features.
19. students cared about quiz scores almost as much as completing the assigned task. They wanted feedback on the pre-quiz as much as on the post-quiz. Spell everything out – color matters (students will try to click on text that looks like it might be hyperlinked even if it is not). Screen size is an issue – many students did not realize they had to scroll to get to information located further down the page (so they tried to limit info so little or no scrolling was required).
20. results – most students needed assistance with evaluating info, so they covered only that content in the review. My note – the librarian said "you guys don't know how to evaluate information," and then students were more engaged and trying to score 100%. Hmmm. Reported that it got away from students who felt they had already "had the library presentation." Interesting – use in conjunction with face-to-face instruction. My note – ask students to take the test and email results to me before the f2f meeting.
21. they would love to have a database behind the tutorial so they can track student progress longitudinally.
22. from Educational Context slide- “commitment from one college to dedicate resources and time” = my question- look into college partnerships.
23. note- chose not to have much audio because “distracting”
24. student testers made it clear they wouldn't use the tutorial unless it was incorporated into the course and had a grading component or was in some way required. So, if not stand-alone, then they could concentrate on creating a tool that tests student learning.
25. look at Planning Handout slides at end of webcast
26. link to tutorial: http://www.maricopa.edu/researchhelp/ – the PowerPoint slides are available with the recording of the session.
Our discussion: how useful is it to present this in person, or should they just send out the URL?