Friday, 15 June 2007

Reflective Commentary: Course Development Document and Instructional Design

On reflection, I wonder whether at least this reflective commentary, if not all of them, is part of a project on instructional design, Adam?

1. There were strategies, resources, and processes that I found really helpful as I constructed my Course Development Document, and others that did not work for me.

I found the reflective commentaries very useful (?irony) as a process for collecting my thoughts- I think 'collection of thoughts' was the single most difficult aspect of processing and applying education theory that I encountered. I also have a couple of thoughts, further on, about how to improve this collection of thoughts.

Although I think the peer-evaluation task was useful with respect to receiving evaluation, I think I may have missed the point and the utility of giving it until I received feedback on my own project. I actually feel a little embarrassed by the quality of the peer review Sanya received- it didn't reflect my deepest thoughts, and I think part of this was the directed/prescriptive nature of the tool/task (Herrington, Herrington, Oliver, Stoney & Willis, 2001)- or at least my interpretation of the task. I wonder whether providing feedback on the quality of our feedback (as formative feedback for education practitioner development) might be useful.

Now, with respect to the collection of thoughts:

Firstly, I would like to learn how to better aggregate quotes and references- not in terms of citing references, but a strategy to better remember and summarise the gist of a paper and the parts of it which were of note- such as a model, theory, or insight; also a way of categorising the relevance of the paper. I think that one strategy might be an annotated bibliography- but I'm not too sure about this.

Secondly, I found it difficult to bring together effectively the process of writing the draft, reflecting on it, undertaking peer or tutor review, and implementing the suggested changes. Perhaps this is part of the concept of praxis that Robyn referred to in her Instructional Design Visualisation (http://clineds.blogspot.com/) and that both Kolb and Freire seemed to be fond of.

Compartmentalising the course design into parts (i.e. modules) made me think (and perhaps I've read about this elsewhere too, but I can't remember where) that the concept of learning objects, as it is used in reference to learning technology (Wiley, n.d.), is too narrow, and that the same idea of modularising units of learning for educational purposes could be applied to other areas of instructional design.

The first of these- the narrow definition of learning objects- appears to be a well-discussed issue. I think the key things are that 1) learning objects do not have to be 'technological' or, in particular, digital; 2) there is no particular size, shape, format, presentation, or preconception that should qualify an object, except that it is intended for use in learning or education; 3) the key advantage of learning objects (reusable learning objects; RLOs) and learning object repositories is the use of meta-descriptors, and the ability to integrate the object within more than one context.

I think healthcare has a key head-start in implementing a standardised meta-language/ontology (?ISO... - if it hasn't already been done) - i.e. the US National Library of Medicine (NLM) Medical Subject Headings (MeSH) (see http://en.wikipedia.org/wiki/Medical_Subject_Headings), which is the standard for indexing the biomedical literature; the most obvious equivalent for educational descriptors might be the ERIC thesaurus.

By modularisation, I'm referring to units of something, for example 'assessment objects', or even 'pedagogical objects' (or 'teaching support' objects). Originally, when I thought about assessment objects, the idea of a repository of multiple choice questions came to mind, all meta-coded against several categories (possibly including the higher-level MeSH terms, combined with level of 'difficulty' or complexity or some other descriptors); however, this is quite a boring and predictable idea. I think it would be more beneficial for instructional designers, particularly practitioner-designers, to be able to search a list of assessment objects (assessment technology; made searchable with the use of a semantic/vertical search engine- see endnote 1), determine who has implemented problem-based learning exercises in paediatrics (or MCQs or projects or OSCEs etc.), and approach the authors for a summary of their implementation, or perhaps collegial advice- i.e. the repository does not have to physically hold a piece of technology, but holds a description, ownership, licensing, and links.
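
To make this a bit more concrete, here is a rough sketch (in Python, with entirely hypothetical field names, authors, links and MeSH terms- none of this exists anywhere) of the kind of descriptor record such a repository might hold, and of how it could be filtered by descriptor rather than by holding the assessment technology itself:

# A rough sketch only: an 'assessment object' descriptor record and a simple
# search over it. Field names, MeSH terms, contacts and links are hypothetical;
# the repository holds descriptions, ownership, licensing and links rather than
# the assessment technology itself.

repository = [
    {
        "title": "Approach to fever in a child (PBL exercise)",
        "object_type": "problem-based learning exercise",
        "mesh_terms": ["Fever", "Pediatrics"],        # higher-level MeSH descriptors
        "difficulty": "intermediate",                  # or some other complexity descriptor
        "author_contact": "someone@example.ac.nz",     # for a summary or collegial advice
        "license": "CC Attribution-NonCommercial-NoDerivs 2.0",
        "link": "http://example.org/objects/fever-in-a-child",
    },
    {
        "title": "Paediatric asthma MCQ set",
        "object_type": "multiple choice questions",
        "mesh_terms": ["Asthma", "Pediatrics"],
        "difficulty": "basic",
        "author_contact": "someone.else@example.ac.nz",
        "license": "unknown",
        "link": "http://example.org/objects/paediatric-asthma-mcq",
    },
]

def find_assessment_objects(records, mesh_term, object_type=None):
    """Return the records tagged with a given MeSH descriptor (and, optionally, type)."""
    hits = [r for r in records if mesh_term in r["mesh_terms"]]
    if object_type is not None:
        hits = [r for r in hits if r["object_type"] == object_type]
    return hits

# e.g. who has implemented problem-based learning exercises in paediatrics?
for record in find_assessment_objects(repository, "Pediatrics",
                                      "problem-based learning exercise"):
    print(record["title"], "-", record["author_contact"])

The point is simply that the meta-descriptors, not the object itself, are what make the repository searchable.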

Now to get back on track:

Another area I found difficult was when my conception of the aim of a collaborative task differed from my partner's- not that disagreeing was a concern- but it would be useful in this circumstance to negotiate the aim of the collaboration as a part of the task.

There were two other areas which probably weren't as helpful as they should have been- the referencing aspect of the rubric and the article-resources provided.

Although I tended to be a bit slack on the referencing sub-task (and hence scored less on this, probably completely tingeing my view), I think this was partly because of the hard-copy style of referencing that was used (i.e. APA); on many of the reflection tasks I had referenced quite a bit using hyperlinking- which, although not as formal, does seem appropriate for an e-learning task, and follows my style of learning ('bite sized'-good, 'bloated'-bad).

Finally, I found it difficult to read most of the articles on-screen, and found myself printing off most of the early articles (time pressure forced me to put up with on-screen reading toward the end); I wonder whether formats like FlashPaper or even cut-down HTML would be better; I think the A4 PDF format is terrible to read on-screen, and even a change in paper size to A5 might improve readability.


2. Considering my project and the various elements of e-learning that I explored in this course, the depiction or metaphor for instructional design that I created in Module 9 provides a useful model of instructional design and the role it plays in teaching, learning, and professional development.

Of all the ClinEd 711 tasks (displayed below; figures 1-3), I think this was the one that I learned from least- quite distinct from the overall module, which I found quite useful. I found the article by Cennamo, Abell, and Chung (1996) particularly useful in terms of applying constructivism to the practice of instructional design, in an attempt to create a constructivist model of instructional design (?this is praxis!).

The reason I didn't find the task that useful was that, after reading Schwier's thoughts (n.d.) and reflecting on them, I felt that I had gained as much as was relevant to me at this time- the idea that instructional design could contribute to 'social reform' (either medically and/or in society at large) is the key reason why I'm interested and involved in the paediatrics curriculum committee, and a strong driver of my interest in clinical education- this is a new clarification following completion of this course; to me this idea is fundamental and puts my learning from the entire course in perspective.

Therefore I think the task was slightly redundant given my reflections.

What I did find useful, and would like to learn more about, is the process of course development (instructional design)- I think in the long term this is a key area for enhancing my involvement in and contribution to clinical course design. I would have liked to look at the Institute of Electrical and Electronics Engineers, Inc. (IEEE) Reference Guide for Instructional Design and Development (n.d.), but as we found, this appears to have disappeared from their website.


Figure 1. What is your conception of 'instructional design'?
This was my original depiction. There is an error where the course evaluation node is connected to the individual knowledge constructs node via a directional line through "Which influences"- the arrow is meant to be bidirectional, and the course evaluation node is only meant to affect the instructional design 'super-node', not the individual knowledge constructs node.

Figure 2. What is the greater goal of instructional design?
This was the visual summary of what I took from the article by Richard Schwier (n.d.)- A Grand Purpose for ID?

Figure 3. Layers of Negotiation Model.
This is a re-depiction of that presented by Cennamo et al. (1996); it was a little unclear as to whether the Layers of Negotiation Model and the Spiral Model were one and the same- I couldn't figure out how to recreate the spiral.



3. The strategy underlying my learning design reflects the view of teaching and learning evidenced by my Teaching Perspectives Inventory results, but also reflects new insights I have gained into clinical education, e-learning, and instructional design.

This was my first course in education, and one of only a few courses I've taken involving non-technical (not sure if that term completely represents what I mean) thought. I understood Leah's perspective in grizzling about this somewhat 'abstract' and 'theoretical' task, but I found it insightful, interesting and profoundly useful for clarifying my thoughts as they relate to this course, and in general. It helped me develop a language for articulating my views (perspectives) on teaching.

When I came into the course, I was really only able to articulate my strongest view on instruction/teaching- an immense dislike for tasks involving rote learning for the purposes of short-term recall- indeed, I think I reflected on this prior to beginning the teaching perspectives task. Early on in the course I identified this somewhat simplistically (and possibly erroneously) with the transmission ('of information') and apprenticeship ('of abilities') perspectives; I linked replication of information and behaviours with the objectivist epistemology and the behaviourist view of learning, and developed an instant separation from these ideas.

At the beginning of the course, I also saw the 'classroom' as a place for 'teaching students what they need to know'- which in retrospect feels dangerously close to the transmission view that I was so ardently trying to avoid. And thus I doubted the place of the social reform perspective anywhere other than as the subject of instruction- i.e. in socially orientated areas such as public health. I think that part of this may have come from my interpretation of this low-scoring aspect of the Teaching Perspectives Inventory (TPI). On reflection I (somewhat conveniently) doubt the validity of the questions in respect to my new conception of the social reform 'construct'- that is, that social reform relates more broadly to the application of new ideas, new cognitions, and new skills, as well as to the direct challenge to 'society'.

I think one of the strongest forms of social reform in medical education over the last half-decade has been the rapid progression of communication instruction; I think that the peer-review tasks in my course are strongly social, in that they will influence the interaction between peers and the acceptance of constructive criticism- key to building professionalism and maintaining constructive working relationships (Hafferty, 2006; Stern & Papadakis, 2006). Sanya's course development project implicitly involves a social reform perspective, in the sense that she is quite clearly driving the 'pharmacist-as-business-owner' agenda; I think that in a different time the idea of teaching business skills in a health professional course like pharmacy would have been scoffed at; undergraduate medicine at the University of Auckland certainly has no explicit people or business management instruction that I'm aware of, although it has definitely developed strong instruction in the area of communication skills.

As the course progressed, my 'exclusivist' conception of the teaching perspectives began to change- I think it's clear that I felt the teaching perspectives were discrete containers of substance to be dipped into for varying amounts at my whim; in one of the rubrics Adam reflected on my 'exclusivist' interpretation of the teaching perspectives. And thus, as a part of this progression, I began to realise that the 'recessive' perspectives in my TPI were of a real magnitude and significance, and not simply 'noise'.

On leaving the course I had recognised the place that the social reform perspective held in my own mind- indeed, the instructional design task discussed above helped me realise the role of instructional design in this respect. I began to better understand the role of the apprenticeship perspective in clinical education, and in my conception of the blurred lines between formal and informal education in clinical practice, and between education and practice (service). I think that my ideas of developmentalism and nurturance are slightly troubled- I'm slightly uncertain about the lack of distinction in my mind between these and the apprenticeship perspective. Finally, although I feel I know what 'transmission' is, I'm not certain that my conception of a transmission perspective is entirely formed.

I could go on, but I'm not sure of the extent to which I've blurred the lines between 'thinking out loud' and 'reflection', or indeed whether there is a difference, and whether it even matters anyway. I think, coming back to my earlier comments, that I've learned an immense amount about both instructional design and education in general; I think that this course could alternatively be described in many ways, including without any technology-related terms, and with extra emphasis on the instructional design component.

Endnote 1

Vertical search engines:
see http://www.theregister.co.uk/2007/05/04/semantic_web_breakthrough/ for a relevant article; http://en.wikipedia.org/wiki/Vertical_search for a decent summary;

Semantic Web:

see http://en.wikipedia.org/wiki/Semantic_Web

Some examples:

http://swoogle.umbc.edu/

http://www.semanticwebsearch.com/query/

References

Cennamo, K.S., Abell, S.K., & Chung, M. (1996). A Layers of Negotiation Model for Designing Constructivist Learning Materials. Educational Technology, 36(4), 39-48. [obtained on interloan]

Hafferty, F. W. (2006). Professionalism--the next wave. N Engl J Med, 355(20), 2151-2152.

Herrington, A., Herrington, J., Oliver, R., Stoney, S., & Willis, J. (2001). Quality guidelines for online courses: The development of an instrument to audit online units. In G. Kennedy, M. Keppell, C. McNaught & T. Petrovic (Eds.), Meeting at the crossroads: Proceedings of ASCILITE 2001 (pp. 263-270). Melbourne: The University of Melbourne. Retrieved September 6, 2006, from http://elrond.scam.ecu.edu.au/oliver/2001/qowg.pdf

Institute of Electrical and Electronics Engineers, Inc. (n.d.). IEEE Reference Guide for Instructional Design and Development. Retrieved September 6, 2006, from http://www.ieee.org/organizations/eab/tutorials/refguide/refGuide.pdf .

Schwier, R.A. (n.d.). A Grand Purpose for ID? Retrieved September 6, 2006, from http://www.indiana.edu/~idt/shortpapers/documents/IDTf_Schwier.pdf

Stern, D. T., & Papadakis, M. (2006). The developing physician--becoming a professional. N Engl J Med, 355(17), 1794-1799.

Wiley, D. (n.d.). The Definition Debate. In Advanced Topics in Learning Object Design and Reuse. Utah State University. Retrieved January 7, 2007, from http://ocw.usu.edu/Instructional_Technology/Advanced_Topics_in_Learning_Object_Design_and_Reuse/debate.htm

Monday, 28 May 2007

Learning Objects

It's strange concentrating now on learning objects. The last task involved reflection on assessment, and the idea of 'assessment-centred learning' came to be quite important in my thinking (Race, 2003; Anderson, 2004). It seems interesting to me today, then- with the knowledge that assessment is quite possibly the single most important factor in driving learning- that with the new focus on learning objects and the discovery of a whole industry of commercial learning-products, there is very little available in the area of assessment objects.

I think this is probably worth considering at an undergraduate level of assessment, given the medical council requirements for standardisation of learning outcomes- since assessment drives learning, then wouldn't standardisation of assessment modules converge learning outcomes (e.g. USMLE - United States Medical Licensing Examination - http://www.usmle.org/)? But then I suppose the logistics of high-quality, reliable, valid and transparent assessments (Race, 2003) become compromised by the scale on which these assessments must be made (~300 medical students per year, across six levels, throughout New Zealand).

Much of what is easily found on e-learning throughout the Internet seems to be about learning objects. I think there are a few reasons for this. Firstly, the Internet is a technological beast, and as such the techies (computer scientists and engineers) tend to have a lot more of a presence than other professionals- thus, the focus of e-learning standards, organisations etc. is largely on the technological factors, rather than on content or pedagogy.

Secondly, as alluded to in one of my previous posts, whether many of these learning objects recognise good pedagogy or not is overridden by the commercial forces driving uptake- both reduced cost for educational institutes and generation of profits by e-learning companies.

However, now onto my actual search.

Learning object search strategy and tips
1A. Which repositories did you visit, and what process/strategy did you use to locate an appropriate learning object?

I started off using the resources listed in the course guide. Following this, I searched using Google. I had originally written this reflection with the exact process, but on revision I think suffice it to say that there just isn't the availability of resources for this course. Indeed, the private/commercial learning object collections such as IVIMEDS do not appear to have gone further than the pre-clinical sciences in an organ systems-based approach.

However, one of the learning objects I did find which was of note was a learning object on paediatric asthma, which gave license and author details and was actually pretty good, although it was largely didactic, disease-based instruction followed by case-based instruction. There appeared to be no standardised meta-data by which to search for the object within a repository. I found this object by searching for "pediatric (I used both pe- and pae-) learning objects" in Google (http://www.google.com/search?q=pediatrics+learning+objects&rls=com.microsoft:en-nz:IE-SearchBox&ie=UTF-8&oe=UTF-8&sourceid=ie7), followed the first link (http://www.gwumc.edu/healthsci/faculty_resources/health_science_learning_objects.cfm), and despite my previous comments, found it conveniently half-way down the page (http://learn.gwumc.edu/hscidist/LearningObjects/PediatricAsthma/index.htm).

But:

If you are unable to locate a satisfactory learning object, specify in detail your 'ideal' learning object for the learning task you had in mind.

So, my ideal learning object.

Taking the content of an 'Approach to...' topic (e.g. approach to fever in a child), I would use a narrative-based approach with a non-linear interface (storyteller approach), in order to create a PBL exercise. The technology would need to be cross-platform, web-browser based, light on resources, and amenable to visual niceties.

Therefore, the tool I would use to produce such a learning object would be:
The Flash Based RPG Game Engine (https://eduforge.org/projects/gameflashobjs/). This uses the Adobe Flash environment (http://en.wikipedia.org/wiki/Macromedia_Flash), the built-in Actionscript programming/scripting language (http://en.wikipedia.org/wiki/ActionScript), a simple XML back-end file (http://en.wikipedia.org/wiki/XML), is quite simple to use and meets the aforementioned tech requirements.
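
To give a feel for what the back-end data of such an object might hold, here is a rough sketch of the branching, narrative structure involved- written in Python for consistency with my other sketches rather than in the engine's actual XML schema (which I haven't reproduced), and with entirely made-up node names and clinical content:

# A rough sketch of a non-linear, narrative ('storyteller') structure of the kind
# the learning object's XML back-end would encode. Node names, text and choices
# are hypothetical; the real Flash RPG engine schema is not reproduced here.

story = {
    "start": {
        "text": ("A 14-month-old boy presents with a fever of 39.2 C and irritability. "
                 "What would you like to do first?"),
        "choices": [
            {"label": "Take a focused history", "next": "history"},
            {"label": "Order a full septic screen immediately", "next": "septic_screen"},
        ],
    },
    "history": {
        "text": "The history reveals two days of coryza and reduced feeding...",
        "choices": [{"label": "Examine the child", "next": "examination"}],
    },
    "septic_screen": {
        "text": "The registrar asks you to justify the investigations before ordering them.",
        "choices": [{"label": "Go back and take a history first", "next": "history"}],
    },
    "examination": {
        "text": "End of this branch of the exercise.",
        "choices": [],
    },
}

def walk(story, node="start"):
    """Step through the narrative, prompting for a choice at each branch point."""
    while story[node]["choices"]:
        print(story[node]["text"])
        for i, choice in enumerate(story[node]["choices"], start=1):
            print(f"  {i}. {choice['label']}")
        picked = int(input("Choice: ")) - 1
        node = story[node]["choices"][picked]["next"]
    print(story[node]["text"])

# walk(story)  # uncomment to step through a branch interactively

The appeal to me is that the narrative lives in a simple data file, separate from the presentation, so clinical teachers could write or edit branches without touching the Flash/ActionScript layer.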

Ideally, I would like to collect the story selection choices (i.e. MCQ results) and collate them for the class and tutor to see. The tutor would then be able to focus discussion on areas in which students' incorrect answers tended to cluster- I think this is where SCORM (http://en.wikipedia.org/wiki/SCORM) would come in, but this is beyond my technological understanding and skill- and I imagine it would add considerable time and effort to the production timetable.
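
Setting aside how the results actually get back to the LMS (SCORM or otherwise being beyond me, as I said), the tutor-side collation I have in mind might look something like this rough sketch, with hypothetical question identifiers, options and student responses:

# A rough sketch of the collation step only: tallying the incorrect options chosen
# for each question so the tutor can see where wrong answers cluster. Question
# identifiers, options and responses are hypothetical; how this data travels from
# the learning object to the LMS (e.g. via SCORM) is not addressed here.

from collections import Counter, defaultdict

correct_answers = {"q1": "a", "q2": "c"}

student_responses = [
    {"student": "s01", "q1": "a", "q2": "b"},
    {"student": "s02", "q1": "d", "q2": "b"},
    {"student": "s03", "q1": "a", "q2": "c"},
]

def cluster_incorrect(responses, answers):
    """Count how often each incorrect option was chosen, per question."""
    clusters = defaultdict(Counter)
    for response in responses:
        for question, correct in answers.items():
            chosen = response.get(question)
            if chosen is not None and chosen != correct:
                clusters[question][chosen] += 1
    return clusters

for question, tally in cluster_incorrect(student_responses, correct_answers).items():
    print(question, dict(tally))   # e.g. q2 {'b': 2} - a cluster worth discussing in class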

I would also like to integrate a free-text box, so that descriptive answers could be submitted within the learning object environment and posted on the class discussion forum. The purpose of this would be to generate further discussion outside of the learning object environment, and thereby generate discussion around the differences in answers and the reasons underlying the differences (where they are significant).

1B. What tips would you offer to somebody else undertaking their own search?

I would advise them to prepare for a long and difficult search. I would advise them to start by searching for repositories, then searching within repositories. I would also advise them to search outside repositories, using Google etc. There seem to be a lot of good learning objects which do not fit into a particular course or strategy, and do not comply with the learning object standards, but which could reduce the time taken in producing standards-based resources and could be ported relatively easily.

Linking learning object to learning objective(s)
2. What learning objective(s) will the learning object help your students achieve? How?

These learning objects will help satisfy the third learning outcome, as part of the professional, clinical and research skills domain:

Formulate logical problem lists for a range of paediatric patients.

  • Develop a differential diagnosis list for a patient;
  • Determine the most likely working diagnosis;
  • Select appropriate tests that will confirm or alter the working diagnosis;

The problem-based learning exercises will specifically focus on diagnostic formulation, rather than on management issues- thereby satisfying the learning outcomes. 

Access and copyright
3. If you did not locate an appropriate learning object, what were the access/re-use terms and conditions for one of the repositories you visited that you found notable?

For the pediatric asthma management learning object cited above, the license was clearly linked to at the bottom of the page. It used one of the Creative Commons licenses, "Attribution-NonCommercial-NoDerivs 2.0".

I find the Creative Commons format of presenting licenses exceptional, especially given how often I scroll past EULAs/copyright agreements/GPLs etc. without really taking any notice- simply because there has been almost no attempt made to make the license easily and quickly comprehensible by non-solicitors.

To quickly re-present what it says:

 

You are free:
  • to Share — to copy, distribute and transmit the work
Under the following conditions:
  • Attribution. You must attribute the work in the manner specified by the author or licensor (but not in any way that suggests that they endorse you or your use of the work).

  • Noncommercial. You may not use this work for commercial purposes.

  • No Derivative Works. You may not alter, transform, or build upon this work.

For any reuse or distribution, you must make clear to others the license terms of this work. The best way to do this is with a link to this web page.

Any of the above conditions can be waived if you get permission from the copyright holder.

Nothing in this license impairs or restricts the author's moral rights.

Full code

And proper attribution means:

...the proper way of accrediting your use of a work when you're making a verbatim use is: (1) to keep intact any copyright notices for the Work; (2) credit the author, licensor and/or other parties (such as a wiki or journal) in the manner they specify; (3) the title of the Work; and (4) the Uniform Resource Identifier for the work if specified by the author and/or licensor.

This means that my idea of deriving my own learning objects from the content in this learning object is technically out of the question. Having said that, once the object is broken down into its separate parts (i.e. content and presentation), I think it's difficult to establish that this work would be sufficiently original to justify its licensing- but then I'm not a lawyer, I don't have a clue about IP/copyright law, and the last thing I'd want to get involved in is a copyright dispute.

I have a couple of difficulties with the Creative Commons license- it is new, so the quirks in applying the license in practice may not be entirely worked out (perhaps as evidenced by the development of a replacement for this license- http://creativecommons.org/licenses/by-nc-nd/3.0/). The changes are solely in the legal code section, i.e. 2.0 cf. 3.0, which probably just proves the saying, the devil's in the detail... For a summary of the changes between 2.0 and 3.0 see here: http://creativecommons.org/weblog/entry/7249. For more detail see: http://wiki.creativecommons.org/Version_3. Finally, for a good overview (in my opinion) of the whole CC affair, see: http://en.wikipedia.org/wiki/Creative_Commons_License- which is kind of fitting given that most (?all) Wikipedia content is open-licensed in some way, shape, or form: http://en.wikipedia.org/wiki/Wikipedia:Copyrights, largely under the GNU Free Documentation License (GFDL).

Also, although there is an adaptation of the license for Australia, there does not appear to be any work in progress to develop the license for official use in New Zealand. I find that kind of strange, because there is extensive discussion about the document, including citation in New Zealand government discussion documents.

Finally, in my search through CC, I came across the discussion of an educational-specific license creation project which might be of interest: http://creativecommons.org/weblog/entry/3633.

Learning object integration/adaptation
4. How will you integrate the learning object into your course design? Can it be used exactly as is, or does it (or your course) require changes? Are changes permissible/realistic?

The learning object which I described previously would be integrated into the Clinical knowledge in paediatrics module.

The module as I described, would ideally be integrated as a unit within the LMS.

I think, as discussed above, that some of the changes are likely to be logistically difficult but not unrealistic. However, it is likely that the changes would not be fully implemented.

Anticipation of positives and negatives
5. What knowledge, experience and attitudes of your particular student group do you anticipate might help or hinder the integration of your learning object? How can you best harness or overcome these factors?

These students are fairly tech-savvy. They are also fairly used to being spoon-fed with rote-based materials, with a focus on memorising details immediately prior to examinations. Thus, I think that it's reasonable to expect that some students may face difficulties with motivation and with milestones.

These students are also used to almost exclusively independent learning. The idea of being required to discuss the issues in a democratic way, may be challenging in this respect.

Finally, I think that students' attitudes toward e-learning will be fairly damaged. The reason I say this is that most of the existing use of learning technologies may appear to have been for the advantage of teachers and administrators, rather than for the pedagogical benefits.

I'm not really sure how these can be taken advantage of. I think the best way to overcome these factors is to design a clear, efficient and high-quality learning environment, so that the reputation of the course is anticipated in a positive way. That sounds a bit idealistic on revision, but I think other things might help too. I've found during this course that, despite my difficulties with certain technicalities, the milestones have forced me to simply move on (although I've submitted most of the assessments considerably late).

In terms of the knowledge of the students, I think this will be a positive feature. These students will have had at least 14 months of clinical exposure at this stage. They will understand some of the conventions of clinical practice and the diagnostic process. Thus, the narrative of the learning object will be intuitive. If it is not intuitive to all students- which would be surprising- then the environment itself would serve to further immerse them in this narrative process.

References

Anderson, T. (2004). Toward a theory of online learning. In T. Anderson & F. Falloumi (Eds.), Theory and practice of online learning (pp. 273-294). Athabasca (AB): Athabasca University.

Race, P. (2003). Why fix Assessment? – a discussion paper [Electronic Version], 9. Retrieved 14 May 2007 from http://www.scu.edu.au/services/tl/why_fix_assess.pdf.

Reflective Commentary: Course Development Document - Assessment & Feedback (Part 3; Column 7)

Just going back a little to reflecting on the learning tasks, I've only just realised that my CDD draft has presented the learning design in a content/domain-centred way, rather than in the learning-task-centred way as requested. I plan to change this for the final version, as in reading over it I realise that the tasks I've envisaged are certainly not crystal clear.

Now onto assessment.

On reading Phil Race's (2003) article on assessment, several things have resonated with me. I've read several times that good education emphasises assessment-centred learning (including Anderson, 2004). I'm not sure of all of the arguments for this, but I know that with the volume of information needing to be memorised- usually with little processing going on- the challenge and the strategy of med school became 'how to learn about what we were tested on'; there was little desire to 'learn for the sake of learning'; peer discussion was centred on 'what you need to learn', rather than on 'what does this mean' and 'why is this important'. The aforementioned article quotes the following:

“Assessment is the engine which drives student learning” (John Cowan). “And our feedback is the oil which can lubricate this engine”

To me, this simply reiterates the importance of aligning learning outcomes with assessment. There seem to be a myriad of other reasons for 'assessment-centred learning', but I'm yet to figure out their place amongst 'student-centred learning' and the like.

1.  The form of assessment I have chosen for each learning activity is consistent with its learning objectives, and is integrated into the learning activity.

The assessment tasks are somewhat of a compromise. As such, I haven't integrated all learning tasks with assessment. In fact, I'm not too sure whether I have changed the course in this respect at all.

In module 1, it's displayed quite ambiguously, but the learning tasks are self-directed PBLs (problem-based learning exercises), combined with peer-assisted mini-CEX ('mini' clinical examinations). The assessment in this later module is the peer-submitted evaluations- i.e. the task is also the assessment. I'm not quite sure how to explain how this will be assessed, perhaps because I'm not quite certain myself- I think this needs to be finalised as part of the discussions with the committee, i.e. NOT finalised as part of the proposal.

This task is most definitely consistent with the learning outcomes (Pinnock, R. 2007):

Domain: Professional, Clinical and Research Skills
Evaluate paediatric patients presenting with a range of clinical problems...

I think both the task and this kind of assessment are a key aspect missing from the existing course, and that the teaching perspectives of development and nurturance are thoroughly expressed in this task. I think that students will receive formative feedback from each other on not only clinical skills but also wider professional skills and communication. I think it will also contribute to an environment where students feel that peer-feedback is part of professional development rather than a 'summative' criticism.

As far as the PBLs go, these are not assessed as part of the module. This was probably because, as part of my original conception of the course, transmission would be through didactic and linear materials. As this idea has evolved toward active learning tasks, the assessment aspect has lingered. I'm still not sure how I could assess this part of the course without increasing the tutor workload too heavily. I also don't know how you could mark for completion without having it as part of the LMS- also a considerable undertaking. Although, if we were to integrate the PBLs into the LMS, I think we could probably mark on a grade-based system, given that taking the time to collaborate doesn't necessarily mean that you've "cheated"- perhaps this could even be considered part of the task. I'm not sure if the PBLs could be submitted collaboratively? Going in circles now, I'm also not certain what the benefit of LMS submission would be, other than the logistical ones, unless the PBL answers were in an MCQ form and instantly analysed by the LMS.

The second form of summative assessment mentioned in my CDD is the end-of-course OSCE (Objective Structured Clinical Examination). This is the ?'main-stay' of the current assessment. OSCEs seem to have developed an air of absoluteness about them; that they are valid, reliable, transparent and authentic (Race, 2003):

Valid. Assessment should measure what it purports to measure, namely the intended and published learning outcomes for a given module or course.

Reliable. Assessment should be objective, and consistent across students and assessors.

Transparent. Students (and assessors) should know exactly which aspects of a task will be assessed, and what will constitute a satisfactory or a poor performance.

Authentic. Assessment should measure a student's own, non-plagiarised work.

I've not read extensively in the literature around the OSCE, although I'm planning on enrolling in Jennifer Weller and Alison Jones's Assessment Course next semester (ClinEd 704). My understanding is that the OSCE becomes ?more reliable with an increased number of stations.
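
For what it's worth- and this is my own gloss rather than something from the course readings- I gather the usual way this relationship is expressed is the Spearman-Brown prediction: if a single station has reliability R_1, then an OSCE of k roughly equivalent stations has predicted reliability

R_k = \frac{k \, R_1}{1 + (k - 1)\, R_1}

so, for example, ten stations each with R_1 = 0.3 would predict an overall reliability of about 0.81- though I'll reserve judgement on the detail until ClinEd 704.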

To me, the existing OSCE suffers from several issues. The reliability suffers because the station numbers are restricted by the available staff. It is also affected by inter-rater differences, and by the subjective nature of some of the questions- it's odd how, despite a complete jungle of opinions surrounding many clinical questions in practice, many of these questions are expected to be answered in a binary way.

The OSCE suffers from issues of validity, in that many of the tasks are abstractions. It suffers from decreased transparency, because for some reason there is a fear that if students know what they will be assessed on, this will not distinguish between levels of 'ability'. As a result of the reduced transparency it suffers from issues of authenticity, since students from previous groups compile lists of 'remembered questions'- I was forwarded a copy (Anon., 2005) as a student of the same course, but unfortunately hadn't checked my email prior to sitting; however, my disappointment with the OSCE was not with the issues of validity or reliability, but with the fact that other students had been privy to what I see as a basic requirement of any assessment- i.e. being told what you're going to be assessed on.

Thirdly, the logbooks/vignettes. To me the task should be that the student takes a history, examines a patient, attempts a formulation, then compares that formulation with what the patient's team thought and reflects on the differences and similarities. In reality there will likely be a difference in order- the team will say, 'go see that boy with x'- but the essence is the same: practice of the 'hypothetico-deductive' method of formulation, filling gaps in knowledge, comparing ideas with peers and superiors.

Thus, ideally these aspects would be reported in fluent language (potentially preceded by rough, hand-written notes), and assessed/reflected on by the students, their peers and tutors- throughout the course. Note that the students themselves would have fairly specific instructions on how to formulate the reflection/instruction. The final grade would consist of a combination of cumulative assessments and global grading.

To me, this is different from the existing 'case-history' style of assessment. These are long, and somewhat detached from clinical practice; they allow the student to avoid any responsibility for combining the information in the paper in any way other than in its printed presentation, ordered in the recognisable "presenting complaint, history of presenting complaint..." format.

Furthermore, I think the assessment relates to the learning outcomes in the sense that it not only serves as an exercise in developing the cognitive skill of formulation, but also develops the skill of fluent and efficient presentation (given a maximum word count).

Finally, the community and Hauora Maori collaborative tasks. I think that, formulated and presented in the optimum way, these problem-based learning tasks could prove to be quite useful and innovative. The existing tasks are centred on the individual's reflections on aspects of community health. But to me, that task is rather abstract and therefore not engaging. I think these tasks could improve on this state of affairs by testing students' assumptions about their ability to effectively and efficiently problem-solve 'social' issues.

The assessment is intertwined with the learning task, and I feel is quite successful in its consistency with the learning outcomes (Pinnock, R. 2007):

Domain: Hauora Maori
Identify key health issues for Maori children and adolescents and explain the approaches to addressing the issues;

Domain: Population and Community Based Practice
Summarise the roles, responsibilities and collaborative processes of child health professionals.

2. Students will have opportunities to undertake self-assessment and peer critique as well as receiving instructor feedback.

I think my attempt to involve peer-assessment wherever possible has become quite evident. I recently read a quote from J. M. Coetzee's Disgrace (Coetzee, 2000), which struck a chord with me:

The irony does not escape him: that the one who comes to teach learns the keenest lessons, while those who come to learn learn nothing.

To me the main reason for wanting students to assess each other is not for the increased learning as it relates to the subject content, but to develop awareness of the skill of communicating clinical information between each other- i.e. fluid presentation.

I think that there are several areas that would particularly benefit in terms of 'increased learning'. Clinical examination skills and patient communication are areas in which peer supervision could help not only with evaluation and feedback, but also with socialising students into this 'routine'.

3. The strategy underlying the assessment approaches I have chosen reflects the view of teaching and learning evidenced by my Teaching Perspectives Inventory results, but also reflects new insights I have gained into assessment and e-learning.

Development and nurturance were the original two key perspectives I felt I held. However, following my recent reflection on the teacher role, I realised the significance of what was originally an equivalent 'score' for apprenticeship in the Teaching Perspectives Inventory (TPI).

To me, the underlying strategies of my assessment approaches are to assist in the development of key cognitive skills, and to do this in a way which is nurturing and which socialises peers into the requirements of the task rather than the (often misaligned) expectations of the (often absent and/or disinterested and/or overcommitted) clinical supervisor.

Thus, the way the student interacts with the teacher is to clarify issues which cannot be easily dealt with by suitable alternatives- i.e. textbooks or peers.

I think these definitely reflect insights I have gained into assessment and learning, but not necessarily e-learning. What I am quite confident of is that e-learning provides the excuse to implement many of these advantageous course changes.

References

Anderson, T. (2004). Toward a theory of online learning. In T. Anderson & F. Falloumi (Eds.), Theory and practice of online learning (pp. 273-294). Athabasca (AB): Athabasca University.

Anonymous. (2005). Paediatric OSCE Cheat Sheet.

Coetzee, J. M. (2000). Disgrace. London: Vintage.

Pinnock, R. (2007). 5th Year Book: University of Auckland Department of Paediatrics Undergraduate Curriculum Committee.

Pratt, D. D., & Collins, J. B. Teaching Perspectives Inventory. Retrieved 14 March, 2007, from http://teachingperspectives.com/

Race, P. (2003). Why fix Assessment? – a discussion paper [Electronic Version], 9. Retrieved 14 May 2007 from http://www.scu.edu.au/services/tl/why_fix_assess.pdf.

Reflective Commentary: Course Development Document - The Role of the Teacher (Part 2; Columns 5-6)

Unfortunately I had read the resources for this module quite some weeks prior to completing this reflection, thus I'm not entirely sure whether my ideas have incorporated the reading or whether I merely 'validated' my existing ideas against the reading.

The following links to the other 'role of the teacher' task which Leah and I worked on together (rather asynchronously to begin with)- Principles: The role of the teacher in clinical education (http://dylanandleah.pbwiki.com/ and http://bubbl.us/view.php?sid=14398&pw=ya.8QMLbhmKbUMTEyLjBacjQxQnQ0WQ).

Our principles, and the following embedded diagram, drew heavily from "The good teacher is more than a lecturer- the twelve roles of the teacher" (Harden & Crosby, 2000).

 

1. The teaching presence I intend to enact to enable my students to achieve the learning outcomes specified in the Needs Analysis Document will acknowledge the importance of my students' prior knowledge, and encourage them to take ownership of their own learning.

The role of the teacher I would like to maintain is that of a master and apprentice, in which the apprentice is encouraged to tackle a problem, and seek validation of a proposed solution to that problem, or to seek advice in order to solve a problem.

Moreover, as a manager of the team the master has a responsibility to ensure efficiency, productivity, morale, and that the team is up-to-date. Thus, the teaching presence should include reminders about deadlines, with support and flexibility when there are difficulties due to workload and deadlines.

This model seems ideal- students begin learning from where they left off; they seek advice when new questions arise, and continually reinforce that learning with repeated experiences- i.e. experiential learning (Kolb, 1984)- see infed.org for a good summary, especially of the weaknesses of the theory!!! (http://www.infed.org/biblio/b-explrn.htm)

Experiential Learning Diagram (above)

Although it would be nice to leave it at this apprenticeship analogy, the issue of quality of teaching seems to be of concern when the implementation of teaching activities is left to 'teachers' intuition' (Herrington, Herrington, Oliver, Stoney, & Willis, 2001). Indeed, the continuing erosion of quality master-apprenticeship interaction in the health sector has led to claims of the apprenticeship model being 'abandoned' in certain aspects of clinical practice, such as surgical procedures, in favour of strategies such as simulation-based training (e.g. www.vrmedical.com)- I can't help but feel like a salesman for putting this link in, although I do acknowledge my confusion and indeed skepticism over some of the simulation rhetoric that exists. As recently discussed, the commercial imperative of high-end simulation can be seen as somewhat self-perpetuating, irrespective of need.

2. The supports (e.g. strategies, templates, announcements) I intend to build into the course materials and contribute during the course will model critical thinking and reflection appropriate to clinical practice.

Problem-based learning is the closest description to the process of learning in hospital clinical practice. The materials and learning activities focus on this style of learning. What I haven't made explicit, is the role of the teacher in characterising this model of learning.

Therefore, I need to develop a well-defined role for the teacher in this process, without overwhelming the tutors.

Thus, the idea of having students work with other students to develop solutions to these PBLs has this implicitly built-in. I think that it's a reasonable expectation that if the students are appropriately inducted into this method of problem-solving, that there will be relatively few instances in which they will need to consult about the problems with the tutors.

Thus, what I need is a way in which the tutors can observe the problem-solving process. Small-group online discussion groups seem like a good way to do this, and by including individuals from different geographical areas (Starship, KidsFirst, Northland, Waikato), it would encourage (or essentially force) students to rely on the online interaction in order to complete these tasks.

These peer-peer and peer-tutor (/master) interactions are very close to what happens in the hospital. The online learning environment simply allows for an interaction which is not hospital-specific, and thereby allows discussion to occur over the differences between 'taught', text-book, and hospital-specific practice; it also encourages discussion of the differences in practices and the factors which influence those differences- i.e. resources, expertise, patient population etc.

3. The strategy underlying the teaching presence I intend to enact reflects the view of teaching and learning evidenced by my Teaching Perspectives Inventory results, but also reflects new insights I have gained into the role of the teacher and e-learning.

In my original post about the Teaching Perspectives Inventory (Pratt & Collins), I struggled to explain the meaning of the apprenticeship perspective in my results (see below). Indeed, my discussion focused on development and nurturance and how they related to my own perception of my teaching perspectives- this conveniently explained the transmission and social reform perspectives not registering.

Transmission total: (Tr)   25.00
  B=8; I=7; A=10
Apprenticeship total: (Ap)  32.00
  B=10; I=11; A=11
Developmental total: (Dv)  32.00
  B=10; I=11; A=11
Nurturance total: (Nu)  33.00
  B=13; I=11; A=9
Social Reform total: (SR)  21.00
  B=8; I=6; A=7
----------------------
Beliefs total: (B)  49.00
Intention total: (I)  46.00
Action total: (A)  48.00
----------------------
Mean: (M)  28.60
Standard Deviation: (SD)    4.76
HiT: (HiT)   33.00
LoT: (LoT)   24.00
----------------------
Overall Total: (T) 143.00

However, on reflection I feel that I have neglected to define the place of the apprenticeship perspective throughout the subsequent discussions. In some respects, I now find it difficult to distinguish completely between these perspectives. Somehow, I must have thought that the apprenticeship perspective could be avoided (and certainly the transmission perspective).

What I have found myself leaning toward in this course development document is modelling the teaching perspective on the apprenticeship model, given that this is essentially how learning occurs in clinical practice- a problem is recognised, information is learned about that problem, and expertise is developed experientially.

Furthermore, the information that is learned can be presented (taught) in such a way so as not to question the validity (or evidence), or in such a way that the best possible and most up-to-date information is used to develop a solution to the problem- i.e. social reform.

Thus, the CDD reflects my new focus on apprenticeship, whilst retaining the focus on developmentalism. The learning activities are designed in such a way (peer assisted), so as to nurture a less-threatening learning environment than tutor-focused learning. I've also begun to recognise the importance of the social reform perspective, and although the tasks in community paediatrics have some of this perspective built-in, this perspective has not been widely included elsewhere.

The position of transmission in my CDD seems slightly uncertain now- I've included 'information' learning tasks via problem-based learning exercises, which have tried to avoid traditional case-based or rote learning exercises. I think the ability to include learning activities like this is advantageous, however there seem to be two difficulties with this strategy:

  1. Problem-based learning exercises are time-consuming and challenging to develop
  2. Students may rote-learn the examples, relying on their understanding of these examples to help with assessment, rather than on their understanding of the broader principles, skills and underlying knowledge.
References

Harden, R., & Crosby, J. (2000). AMEE Guide no. 20: The good teacher is more than a lecturer - the twelve roles of the teacher [Electronic version]. Medical Teacher, 22(4), 334-347.

Herrington, A., Herrington, J., Oliver, R., Stoney, S., & Willis, J. (2001). Quality guidelines for online courses: The development of an instrument to audit online units [Electronic Version]. Meeting at the crossroads: Proceedings of the 18th Annual Conference of the Australasian Society for Computers in Learning in Tertiary Education (ASCILITE), 263-270. Retrieved September 6, 2006, from http://elrond.scam.ecu.edu.au/oliver/2001/qowg.pdf

Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development. Englewood Cliffs, N.J.: Prentice-Hall.

Pratt, D. D., & Collins, J. B. Teaching Perspectives Inventory. Retrieved 14 March, 2007, from http://teachingperspectives.com/

Thursday, 26 April 2007

Reflective Commentary: Course Development Document - Learning Tasks, Student Activities, Delivery mode(s) & Resources (Part 1; Columns 1-4)

I have to be honest, this project is very challenging in respect of all the relationships entailed between the different aspects, and the people, that will need to be involved.

Although I've certainly moved from my original conception of an e-learning project- which was focused on the mere replication/updating/improvement of the pre-existing digital lecture series- I have certainly found it difficult to construct an e-learning course which can challenge the multifaceted learning environment that is undergraduate clinical paediatrics.

Indeed, there is a definite knowledge (transmission) component directly relating to the first learning outcome:

Determine the essential knowledge base for paediatrics.

 Apply knowledge of basic physiology and pathology to the management of paediatric patients.

 Use knowledge of growth and development in children and adolescents to interpret manifestations of disease.

But creating resources to satisfy this outcome seems relatively easy to accomplish in a simple way, and is certainly open to more interesting interpretations than simple digital lectures. What I have found difficult to integrate in my mind is the combined needs of a module which involves a clinical placement (development of skills), information transmission (Pratt et al., 2001), and assessment. Indeed, the existing assessment of the knowledge component of the undergraduate clinical paediatrics course seems set in stone for a few reasons, which are unlikely to be overcome.

However, now onto the main reflection...


1. The learning activities I intend to use to enable my students to achieve the learning outcomes specified in the Needs Analysis Document will actively engage them in problem-solving, and reflect the way that the learning outcomes will be applied in real world settings

The learning activities specifically address each of the seven learning outcomes, with a large focus remaining within the knowledge domain. Despite this, several aspects of the design refocus on the areas not related to mere information-content: the case-history and related formulation exercises, the collaborative community paediatrics project, and the peer-tutor formative assessment of mini clinical examinations- the core changes to the existing course.

Thus, the difference between the proposed modules and the existing modules is the move away from information transmission as the primary focus, and toward skills development- both clinical and cognitive (i.e. formulation and reasoning) (Pratt et al., 2001).

The first of these mentioned above is the case-history. Interestingly, the description of the expected learning outcomes associated with the case history (in the paediatrics course handbook- Pinnock, 2007) is as follows-

"The discussion should be relevant to the patient you have seen with emphasis on either diagnosis or management pertaining to the patient rather than the text-book regurgitation"

This made me wonder what exactly the proposed learning outcomes were, in detail. A quick literature search on PubMed and Google certainly didn't elucidate anything worthwhile either- I think this relates to the ubiquity of the term case history, and perhaps the meaninglessness of the term in respect to learning outcomes. From my perspective, the educational value of the case history relates to the direction in which the author takes it- that having been said, the strongest influence in directing case-history discussion is usually a tutor (either consultant, registrar, or occasionally house officer). This is certainly an area I'd like to research in more detail (given time).

However, one of the advantages of the case-history is that, when done well, it clearly summarises the real-world process (although often very much absent of any concept of time- tests on paper seem very easily obtained as a student, but when you have to negotiate with a radiologist for a scan...). The case-history in the context of this project would specifically direct the discussion around articulating the basis for the diagnosis that was made and/or the management that was undertaken. Students would be asked to causally relate clinical findings, and relate the findings to the underlying processes (anatomy, physiology, pathology); they would be asked to explain why a medication would be expected to be efficacious in a certain situation, rather than simply regurgitate the epidemiological/trial evidence for its use.

Secondly, the collaborative community paediatrics project would move the existing, individually completed project to a more real-world platform, where unilateral decisions about care and welfare are fiction. The idea would be to present a clinical problem where the focus is 'non-medical', and where a solution needed to be arrived at with discussion and input from several people. Group members would assign each other to investigate the available services for the particular problem, and construct (Pratt et al., 2001) a solution that would be best-fit, to a problem which would have no perfect solution, and would come together dynamically (i.e. with tutor input- Chickering & Ehrmann, 1996-7).

The point of this task is to actively engage students in solving a community/social problem, not merely by constructing hypothetical solutions, but by investigating the availability and suitability of the services. The task would allow for a much greater scope of learning than the individual project due to the collaborative process, but would also develop reciprocity and cooperation amongst students.

Finally, the mini clinical examinations would focus time on task (Chickering & Ehrmann, 1996-7), introduce collaboration in the learning of clinical skills, and provide a level of peer-mentoring and feedback not previously encouraged, nor indeed present- feedback being a key catalyst to focused learning and motivation.

2. The learning activities I intend to use will require my students to articulate and justify their understandings, and to collaborate to create meaningful products.

It seems to be strongly held that 'peer-to-peer' (p2p) and 'peer-to-tutor' (p2t) interaction in an e-learning context (Chickering & Ehrmann, 1996-7; Anderson, 2004) is a strong catalyst for a constructive learning experience. Given that many clinical decisions are not an individual process but an activity of collective reasoning- even if one individual is deemed to hold more information than another, and especially in a multi-disciplinary setting where content experts somehow agree on a generalised solution- this seems an important activity. Thus, a (small-group) p2p project would combine data collection- individual transmission ('banking'- Freire, 1970) of information- with a collective clinical reasoning activity- the rubric focusing not merely on the accuracy of the information or the appropriateness of the decision, but on the quality of the reasoning behind the clinical decision, and the democracy or appropriateness of the input from each individual.

Indeed, the p2p strategy will also be used in both the community paediatrics project and in the peer-tutor formative feedback mini clinical examinations (mini-CEX).

I feel that the major problem is going to be moderation of a discussion forum, and that there will be difficulty in determining an appropriate tutor. I certainly think there needs to be some tutor input into the process in order to enhance the learning experience (Anderson, 2004; Harden & Crosby, 2000). Fundamentally though, I think the optimum learning will occur with prompt and dynamic feedback and discussion from the tutor (Chickering & Ehrmann, 1996-7).


3. The resources I intend to offer my students to help them complete the learning activities represent a variety of perspectives and use a medium that is engaging and well-suited to their message.

The resources I intend to offer are formulated to present the most important information in the simplest and most efficient way possible, whilst remaining non-linear and interactive. Thus, the perspectives recognised are information transmission (because one simply can't compute or formulate without data), apprenticeship, development, nurturance and social reform.

I find it difficult to separate these in some respects, although previously I hadn't recognised some of the aspects as being distinct. Indeed, previously I had perhaps shunned social reform as a necessary perspective in teaching. Having said that, I did recognise that social reform had an important role to play in more socially focused specialties such as public health. Thus, I think the collaborative learning project would be ideal for introducing a social reform perspective. Students would be asked to evaluate the suitability of, or their satisfaction with, their solution, and what could be done to improve the delivery of the solution or the services used. Indeed, this would introduce the idea of limited resources, of organisational politics, and of wider principles of public health funding.

Further perspectives are development, nurturance and apprenticeship. I find these hard to distinguish completely from one another. To me, apprenticeship is part of the behavioural model of learning/psychology, in that it encourages replication of behaviour- in the same way that transmission encourages the replication/assimilation of information. Thus, just as formulation cannot occur without data, data cannot be obtained without first being collected- in particular through the collection of clinical histories and examinations. Replication of behaviour, to the extent that students understand what the process is, is therefore important.

The apprenticeship perspective does also relate to the cognitive skill of formulation. However, it seems to me that formulation is a developed skill rather than a replicated one, and thus fits into a constructivist epistemology.

In combining all this discussion, then, these tasks are designed around the perspective of learning that best suits the knowledge to be developed. Clinical skills are to be practised, with peer feedback given on the correctness of the order, thoroughness and quality of the examination (replication of skills- apprenticeship). The information is then processed, with the transparency of logic and the quality of the formulation being emphasised (development of skill- formulation). However, the information is not merely examined in a 'sink or float' examination environment; practice/replication and formulation are nurtured by the process of peer mentoring and feedback, and by collaborative learning (nurturance).

4. The technologies I intend to use to facilitate my students' learning activities are appropriate when considered in light of the Bates & Poole (2003) SECTIONS model and the technology principles I helped to formulate during Module 3.

Students:

The intended audience of this course is the fifth-year class of the undergraduate medical program at the University of Auckland. In background, these students will have studied at least two years of pre-clinical sciences (some students obtain alternative entry, waiving the first year), followed by at least a year of clinical studies in their fourth year as medical students. The students at this stage have minimal time together in lecture theatres or tutorials; the year is roughly nine months long, with roughly four weeks of "lecture time". They will be used to working in small groups and to some degree of self-directed learning. There is minimal use of learning management systems, and minimal requirement for using computers, other than performing literature searches and writing assignments.

The above is taken from my course development document. The key point is that medical students are intelligent and highly adaptive, but are not currently immersed in any significant e-learning environment. Thus, whilst they will likely learn quickly, the course medium will still be a challenge. Having said that, it is my belief that the proposed environment will be more familiar to them than the existing digital lectures and independent, individual learning activities.

Ease of use and reliability: this will be a significant challenge. The platform will likely need to be CECIL (cecil.auckland.ac.nz), which certainly has its challenges. One of the 'saving graces' is that this should be a reasonably familiar interface for students, that it is a reliable enterprise application, and that future developments of the interface will benefit students.

Costs: Students will need no more resources than are currently required, and will likely have lower computing demands. The main difference between this e-learning strategy and the existing 'm-learning' strategy is that internet access will be required to access the content at home.

In terms of the demands on faculty, the project would not require significantly more input than at present, but would require that content specialists participate in the initial development of resources. This could be challenging to coordinate, but should not put a significant burden on faculty, given the relative simplicity of the content and the continuing reduction of ongoing lecturing requirements initiated with the original digital lecture concept.

Teaching and Learning: I won't spell them all out here partly because of potential copyright implications (believe it or not!), but the learning outcomes fall generally in the following categories: Professional, Clinical and Research Skills, Acquisition and Application of Medical Knowledge, and Population and Community Based Practice.

The current approach to these learning outcomes tends to be either entirely assessment focused, or focused on the linear transmission of information. My approach is to refocus the learning perspectives more toward the construction of knowledge and cognitive frameworks- through a variety of methods including collaboration, peer mentoring, and dynamic (non-linear) resources- as well as remaining focused on assessment.

Technologies which support collaborative learning seem to fit within their own category of e-learning. Perhaps the most traditional would be the forum or the wiki, but others might include concept mapping and document workflow environments.

Peer mentoring could, in truth, happen in person, given the personal requirement of supervision; however, electronically submitted feedback would be available for reflection by the student at a future date, for verification of completion by the tutor for assessment, and for tutor input/reflection. The supporting technologies would include a private messaging or journalling system.
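As a purely illustrative aside, the kind of record I have in mind for a submitted peer-feedback entry (e.g. for a mini-CEX) might look something like the TypeScript sketch below. All of the names and fields are my own invention for illustration- none of this corresponds to any existing CECIL feature- the point is simply that a submitted entry persists, so the student can revisit it and the tutor can verify completion.

// Hypothetical sketch only: one way an electronically submitted peer-feedback
// entry might be structured so the student can revisit it later and the tutor
// can verify completion. Field names are invented for illustration.

interface PeerFeedbackEntry {
  studentId: string;
  observerId: string;       // the peer who observed the examination
  skill: string;            // e.g. "cardiovascular examination"
  comments: string;         // narrative feedback from the observer
  submittedAt: Date;
  tutorVerified: boolean;   // flipped by the tutor once reviewed
}

const entry: PeerFeedbackEntry = {
  studentId: "student-A",
  observerId: "student-B",
  skill: "cardiovascular examination",
  comments: "Logical sequence; remember to assess the JVP before auscultation.",
  submittedAt: new Date(),
  tutorVerified: false,
};

console.log(`${entry.skill}: verified by tutor? ${entry.tutorVerified}`);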

Finally, technologies which encourage dynamic learning involve streaming audio and visual content, non-linear text structures such as hyperlinks, images, and information visualisation techniques such as concept mapping. Technologies which build these are many, but the frameworks on which they are based are relatively few, and include Flash, AJAX, and Silverlight. Flash in particular is a widely accepted technology, especially for streaming audio and video. One particular concern with the Flash medium is accessibility (particularly visual), but this will not be a concern with this population.
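To make the idea of 'non-linear' resources a little more concrete, here is a purely hypothetical sketch (in TypeScript- the eventual implementation would of course depend on the learning technology unit and the chosen framework) of a learning resource modelled as a graph of linked nodes rather than a fixed sequence. None of the names, URLs or content below are real; it is only meant to illustrate the structural difference from a linear digital lecture.

// Illustrative sketch: a "dynamic" learning resource as a small graph of
// linked nodes (text, streamed media, concept maps) rather than a sequence.
// All ids, titles and URLs are made up for illustration.

interface LearningNode {
  id: string;
  title: string;
  mediaType: "text" | "video" | "conceptMap";
  source: string;      // inline text, or a URL to streamed media
  links: string[];     // ids of related nodes the learner may jump to
}

const nodes: LearningNode[] = [
  { id: "croup", title: "Croup: overview", mediaType: "text",
    source: "Barking cough, stridor, coryza...", links: ["stridor-clip", "airway-map"] },
  { id: "stridor-clip", title: "Stridor (streamed clip)", mediaType: "video",
    source: "http://example.org/stridor.flv", links: ["croup"] },
  { id: "airway-map", title: "Paediatric airway (concept map)", mediaType: "conceptMap",
    source: "http://example.org/airway-map", links: ["croup"] },
];

// List every node a learner could reach from a starting point, in any order-
// navigation is driven by the links, not by a fixed sequence.
function reachableTitles(startId: string): string[] {
  const byId = new Map(nodes.map(n => [n.id, n] as const));
  const seen = new Set<string>();
  const stack = [startId];
  const titles: string[] = [];
  while (stack.length > 0) {
    const id = stack.pop()!;
    if (seen.has(id)) continue;
    seen.add(id);
    const node = byId.get(id);
    if (!node) continue;
    titles.push(node.title);
    stack.push(...node.links);
  }
  return titles;
}

console.log(reachableTitles("croup"));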

Interaction: This is discussed above. There are many frameworks and specific applications. The particular strategy I plan on using for collaboration is the wiki, given its ubiquity and relative simplicity. Journalling is conveniently built into CECIL. Flash is a presentation framework which is ideal for producing dynamic, multimedia, interactive presentations.

Organisational Issues: Integration with the CECIL LMS will be the key limiting factor.

Novelty: As far as web technologies go, these are well tested- except for the CECIL environment which is quirky at best and could definitely be improved and standardised.

Speed: These are relatively simple technologies. The content specialists will likely not know how to use Flash for development, but this is a one-off, relatively simple task. It is likely that the learning technology unit would need to be involved.

5. The strategy underlying the learning activities I have chosen reflects the view of teaching and learning evidenced by my Teaching Perspectives Inventory results, but also reflects new insights I have gained into learning theory and e-learning.

I think what Leah has been saying about the TPI is correct to a certain extent. However, I do think that the TPI was useful both in teaching the principles of the teaching perspective constructs, and thereby in helping me identify what I think is important in a given situation. I have reflected on this extensively in the previous paragraphs, but have retraced it below.

Concept map (bubbl.us): http://bubbl.us/view.php?sid=17402&pw=ya.8QMLbhmKbUMTF6bmgxeThyL2IwVQ

My feeling is that information is required on which to act. Thus, students require information transmission to allow them to function in the most basic way possible- as a walking textbook. Following this, and in a similarly behaviourist way, is the apprenticeship model, where certain skills, activities (and behaviours) are replicated in order to function properly as a clinician- independent of clinical decision making, formulation, or any functioning beyond the level of mimicking activity.

Then comes what I feel is most important: the development of the cognitive skills which enable the appropriate use of information and ability (knowledge). Nurturance, to me, is merely a strategy to catalyse the transmission, apprenticeship and development aspects- the perspective encourages optimisation of the learning environment and of motivation. Certainly social reform fits into the higher functioning, in the sense that once some of the higher skills have been taught, some of the previously transmitted information and abilities can be critically evaluated. More particularly though, social reform is a perspective which encourages evaluation both of the environment in which the learning occurs, and of the environment the learning is about.

My strategy is thus to build a layer of information on which to develop the higher skills of formulation, clinical decision making, self- and peer-critical evaluation, and psychosocial/multidisciplinary interaction, formulation and consideration.

Apologies for the referencing quirks- I’ve given up on figuring out EndNote’s quirky Cite While You Write.


References

Anderson, T. (2004). Teaching in an online learning context. In T. Anderson & F. Elloumi (Eds.), Theory and practice of online learning (pp. 273-294). Athabasca, AB: Athabasca University.

Bates, A.W. & Poole, G. (2003). Effective teaching with technology in higher education. San Francisco: Jossey-Bass.

Chickering, A. W., & Ehrmann, S. C. (1996-7). Implementing the seven principles: Technology as lever [Electronic Version]. AAHE Bulletin, 49, 3-6 from http://www.tltgroup.org/programs/seven.html.

Freire, P. (1970). Pedagogy of the oppressed. [New York]: Herder and Herder.

Harden, R., & Crosby, J. (2000). AMEE Guide no. 20: The good teacher is more than a lecturer - the twelve roles of the teacher [Electronic version]. Medical Teacher, 22(4), 334-347.

Pinnock, R. (2007). 5th Year Book: University of Auckland Department of Paediatrics Undergraduate Curriculum Committee.

Pratt, D., Arseneau, R., & Collins, J. (2001). Reconsidering "good teaching" across the continuum of medical education. Journal of Continuing Education in the Health Professions, 21(2), 70-81.

Wednesday, 18 April 2007

Copyright...

What experiences have you had in dealing with copyright issues in your work as an educator?
To be honest it's not something I've had an awful lot to do with, and I tend to avoid copyright issues as much as possible.

What are the copyright guidelines and protocols in place in your institution, in relation to e- learning?
Interestingly, I couldn't find a specific reference to copyright within the ADHB intranet. Of course, within the UoA there is the document which has been provided as part of the course pack. Even though surveys occur, I wonder what the differences would be between the surveys and an unnotified audit in some courses.

Do these differ from those you have read about in the course? Do you think these will change over the next decade or so?
It seems to me that there probably won't be a huge change in law and institutional rules. The socialist inside me would like to think that more 'freely available' resources would prevail- for example, the IVIMEDS (http://www.ivimeds.org/) learning objects project in medicine, or its equivalents in dentistry (IVIDENT) and nursing (IVINURS, http://www.ivinurs.org/), or perhaps the open-access, peer-reviewed medical journals (http://medicine.plosjournals.org).

What copyright issues do you anticipate in relation to resources that you would like to use for your current e- learning project?
Ideally I'd like to focus on creating or using open learning objects with licences such as Creative Commons (http://creativecommons.org/ - which a few years ago I thought was just a small open source project that would be missed by most of the world, but now...). Interestingly, there hasn't been a New Zealand-specific port of the generic licence developed- that I can find.

Monday, 9 April 2007

Test Post

It just seems to be the thing to do- to submit a test post...