Documentation:Learning Design/Workshop Notes

From Kumu Wiki - TRU
< Documentation:Learning Design
Revision as of 12:38, 7 May 2014 by Smauricio (talk | contribs) (Created page with "Agenda Morning 1. Review what we have so far - templates, activities, how can we use them as a department? Sequencing (when to do activities) - what are the strategies for ...")

Agenda

Morning

1. Review what we have so far - templates, activities, how can we use them as a department?


Sequencing (when to do activities) - what are the strategies for each stage of the course? The LAMS outline - naming the activities (creating a shared vocabulary) - how prescriptive do we want to be?

Small working groups to work on specific templates (we have both generic/specific examples)


2. Review the feedback from the student surveys (not yet compiled, but we can look at some generalities). What does this info tell us? As we move forward to gather more student data, what would be useful?

So how can we use this data? Focus on the new courses - do we continue to gather this kind of data for all paced cohort courses? What might we want to change?

Do we still want to pursue getting specific feedback on activities? Yes, we should - we still need a process for this. Embed it into BB and then pull the data? Put a Vovici link into BB? How do we get students to do it (enter names in a draw)? Get Irwin/Barbara on board. Get the process in place.

Two things we want to look at in analytics: activities (for us), and the side that looks at student success (where students are in the course, whether they are participating) - build the best learning activities for student success.


Janet: reviewing the courses that were analyzed - what were the commonalities? What does "well organized" mean? What are the characteristics of the courses that students provided positive feedback on?

We need to follow the "blue/white" trail (some disagreement) - either check the course or follow up with students; when we get the analysis done, have whoever is doing this make the linkages... maybe look at analytics (participation rates in BB, who was participating).

Discuss the results with each of the faculty members, and then get their feedback on how things were going.

ACTION: Do this survey again in April with new cohorts; try to develop a process to get feedback on individual activities; have the analysis of this data done (Lauren, research assistant); follow up with faculty (time? short interview - research assistant?).


3. How can we integrate the feedback into our practice? It would be good to make a report with some specific recommendations and general observations - come up with a "doctrine" or common code of practice for design and delivery. Monterey had a template (things they believed in); Royal Roads has twelve principles for instructors (for facilitation).

Also, we can do some workshops or seminars (as follow-up with instructors/designers).

Set some minimum level of engagement that we should be striving for.

We can use the information we gather to bolster our perceptions.



Lunch

Afternoon

Work on a few activities/templates that we could use across courses - is there literature/feedback that we could use for these?