Enhancements to the SOLE Toolkit – now version 3.5

May 31, 2015

I have no idea what the protocol is for naming versions of things. I imagine that, like me, someone has an idea of what the stages are going to look like, and when something truly fresh and new is going to happen. For me, I have a sense that version 4.0 of the SOLE Toolkit will incorporate what I am currently learning about assessment and ‘badges’, self-certification and team marking. But for now I’m not there yet and am building on what I have learnt about student digital literacies, so I will settle for version 3.5.

This version of the SOLE Toolkit, 3.5, remains a completely free, unprotected and macro-free Excel workbook with rich functionality to serve the learning designer. In version 3.0 I added more opportunities for the student to use the toolkit as an advance organiser, offering ways to record their engagement with their learning. It also added some ability to sequence learning so that students could plan their learning better, although I maintained this was guidance only and should allow students to determine their own pathways for learning.

Version 3.5 has two significant enhancements. Firstly, it introduces a new dimension, providing a rich visualisation of the learning spaces and tools that students are to engage with in their learning. This provides an alternative, fine-grained view of students’ modes of engagement in their learning. It permits the designer to plan not only for a balance of learning engagement but also for a balance of environments and tools. This should allow designers to identify where ‘tool-boredom’ or ‘tool-weariness’ is a possible danger to learner motivation, and to ensure that a range of tools and environments allows students to develop based on their own learning preferences.
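
To make the idea concrete (the Toolkit itself does this with spreadsheet formulas and charts, not code), here is a minimal sketch of how one might tally tool and environment use across a design to spot a looming imbalance; the activity and tool names are hypothetical.

```python
from collections import Counter

# Hypothetical design: each learning activity is tagged with the tool or space it uses.
activities = [
    {"name": "Reading: core chapter",  "tool": "e-text"},
    {"name": "Discussion: key themes", "tool": "forum"},
    {"name": "Quiz: self-check",       "tool": "forum"},
    {"name": "Reflective journal",     "tool": "forum"},
    {"name": "Group case study",       "tool": "wiki"},
]

tool_use = Counter(activity["tool"] for activity in activities)

# Flag any tool that dominates the design; the 50% threshold is an arbitrary illustration.
for tool, count in tool_use.most_common():
    share = count / len(activities)
    flag = "  <- possible tool-weariness?" if share > 0.5 else ""
    print(f"{tool}: {count} activities ({share:.0%}){flag}")
```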

Secondly, it allows for a greater degree of estimation of staff workload, part of the original purpose of the SOLE Model and Toolkit project back in 2009. These faculty-time calculations for design and facilitation are based on the learning spaces and tools to be used. This function allows programme designers and administrators, as well as the designers themselves, to calculate the amount of time they are likely to need to design materials and to facilitate learning around those materials.
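
Again purely as an illustration of the underlying arithmetic, and not of the Toolkit’s actual worksheet formulas or built-in figures, here is a short sketch that estimates design and facilitation hours from assumed per-tool times.

```python
# Illustrative per-tool time assumptions in hours; the figures and tool names are
# invented for this sketch and are not the Toolkit's values.
TIME_PER_TOOL = {
    "video lecture":    {"design": 4.0, "facilitate": 0.5},
    "discussion forum": {"design": 1.0, "facilitate": 2.0},
    "online quiz":      {"design": 2.0, "facilitate": 0.5},
}

def estimate_workload(planned):
    """Sum design and facilitation hours for a list of (tool, count) pairs."""
    design = sum(TIME_PER_TOOL[tool]["design"] * n for tool, n in planned)
    facilitate = sum(TIME_PER_TOOL[tool]["facilitate"] * n for tool, n in planned)
    return design, facilitate

design_h, facilitation_h = estimate_workload(
    [("video lecture", 6), ("discussion forum", 10), ("online quiz", 4)]
)
print(f"Estimated design time: {design_h} h, facilitation time: {facilitation_h} h")
```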

I invite you to explore the SOLE Toolkit on the dedicated website for the project and would welcome any comments or feedback you might have.


Why ‘learning analytics’ is like a sewer

May 2, 2015

Back in the late northern-hemisphere summer of 2013 I drafted a background paper on the differences between Educational Data Mining, Academic Analytics and Learning Analytics. Entitled ‘Adaptive Learning and Learning Analytics: a new design paradigm‘, it was intended to ‘get everyone on the same page‘ as many people at my University, from very different roles, responsibilities and perspectives, had something to say about ‘analytics’. Unfortunately for me I then had nearly a year’s absence through ill-health, and I came back to an equally obfuscated landscape of debate and deliberation. So I opted to finish the paper.

I don’t claim to be an expert on learning analytics, but I do know something about learning design, about teaching on-line and about adapting learning delivery and contexts to suit different individual needs. The paper outlines some of the social implications of big data collection. It looks to find useful definitions for the various fields of enquiry concerned with collecting learner data and making something useful with it to enrich the learning process. It then suggests some of the challenges that such data collection involves (decontextualisation and privacy) and the opportunities it represents (self-directed learning and the SOLE Model). Finally it explores the impact of learning analytics on learning design and suggests why we need to re-examine the granularity of our learning designs.

I conclude:

“The influences on the learner that lie beyond the control of the learning provider, employer or indeed the individual themselves are extremely diverse. Behaviours in social media may not be reflected in work contexts, and patterns of learning in one discipline or field of experience may not be effective in another. The only possible solution to the fragmentation and intricacy of our identities is to have more, and more interconnected, data, and that poses a significant problem.

Privacy issues are likely to provide a natural brake on the innovation of learning analytics. Individuals may not feel that there is sufficient value to them personally in revealing significant information about themselves to data collectors outside the immediate learning experience, and that information may simply be inadequate to make effective adaptive decisions. Indeed, the value of the personal data associated with the emerging learning analytics platforms may soon see a two-tier pricing arrangement whereby a student pays a lower fee if they engage fully in the data-gathering process, providing the learning provider with social and personal data as well as their learning activity, and a higher fee if they wish to opt out of the ‘data immersion’.

However sophisticated the learning analytics platforms, algorithms and user interfaces become in the next few years, it is the fundamentals of the learning design process which will ensure that learning providers do not need to ‘re-tool’ every 12 months as technology advances, and that the optimum benefit for the learner is achieved. Much of the current commercial effort, informed by ‘big data’ and ‘every-click-counts’ models of Internet application development, is largely devoid of any educational understanding. There are rich veins of academic tradition and practice in anthropology, sociology and psychology, in particular, that can usefully inform enquiries into discourse analysis, social network analysis, motivation, empathy and sentiment study, predictive modelling and visualisation, and engagement and adaptive uses of semantic content (Siemens, 2012). It is scholarship- and research-informed learning design itself, grounded in meaningful pedagogical and andragogical theories of learning, that will ensure that technology solutions deliver significant and sustainable benefits.

To consciously misparaphrase the American satirist Tom Lehrer, learning analytics and adaptive learning platforms are ‘like sewers: you only get out of them what you put into them’.”

Download the paper here, at AcademiaEdu or ResearchGate

Siemens, G. (2012). Learning analytics: envisioning a research discipline and a domain of practice. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (pp. 4–8). New York, NY, USA: ACM. doi:10.1145/2330601.2330605


Visualisation of Educational Taxonomies

August 22, 2013

Sharing a paper today on the visualisation of educational taxonomies. I have finally got around to putting into a paper some of the blog postings, discussions, tweets and ruminations of recent years on educational taxonomies. I am always struck, in talking to US educators (and faculty training teachers in particular), by the very direct use made of Bloom’s original 1956 educational taxonomy for the cognitive domain. They seem oblivious, however, to other work that might sit (conceptually) alongside Bloom as a way to support their practice.

http://www.academia.edu/4289077/Taxonomy_Circles_Visualizing_the_possibilities_of_intended_learning_outcomes

In New Zealand, whilst at Massey, I got into some fascinating discussions with education staff about the blurring of the affective and cognitive domains, significant in cross-cultural education, and this led me to look for effective representations of the domains. I came across an unattributed circular representation that made instant sense to me, and set about mapping other domains in the same way. In the process I found not only a tool that supported and reinforced the conceptual framework represented by Constructive Alignment, but also a visualisation that supported engagement with educational technologies and assessment tools. I hope this brief account is of use to people and am, as always, very open to feedback and comment.

I’m very grateful to those colleagues across the globe who have expressed interest in using these visual representations and hope to be able to share some applicable data with everyone in due course.


Visualising Outcomes: domains, taxonomies and verbs

October 17, 2012
Circular representations of educational taxonomies

Four ‘Domains’ of educational objectives represented in a circular form

I think being able to visualise things is important. Faculty and learning designers need to be able to see Intended Learning Outcomes (ILOs) take shape, and many find existing lists uninspiring. It’s not uncommon for faculty and instructional designers to get tired and weary of ILOs; they can feel restrictive, repetitive, formulaic and sometimes obstructive. In previous posts I’ve tried to suggest that the bigger picture, the challenges of effective 21st-century university-level learning design, makes them not only useful, but essential. If you don’t agree, don’t bother reading. I’m not going to try and persuade you. If you think there’s some truth in the argument and you want to engage with ILOs to make your teaching more focussed, your students increasingly autonomous and your graduates equipped with meaningful evidence, then I hope I have something worthwhile to share and will welcome your thoughts.

My argument is that a module (a substantial unit of a full year’s undergraduate study), and the programme of which it is part, should have clearly articulated outcomes in four domains:

  • Knowledge and understanding – or the knowledge domain
  • Intellectual Skills – or the cognitive domain
  • Professional Skills – or the affective domain
  • Transferable Skills – or the psychomotor domain

I’m suggesting one SHOULD expect to see a different distribution of ILOs between the outcomes in these domains depending on the focus of the module and the level of study. One might expect to see a second year anthropology module on ‘theoretical perspectives’ emphasising cognitive outcomes and a module being studied alongside it on ‘research design and techniques’ emphasising affective and psychomotor outcomes. One might reasonably expect to see more foundational ‘knowledge and understanding’ outcomes in the first year of a programme of study, and more ‘cognitive’ outcomes at the end of the programme. The lack of this ‘designed articulation’ in many modules undermines their value to the student and ultimately to faculty.
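
To illustrate what such a distribution might look like on paper, here is a small, entirely hypothetical tabulation of ILO counts per domain for two modules of the kind described above; the numbers are invented for the example, not drawn from any real programme.

```python
# Hypothetical ILO counts per domain for the two second-year modules mentioned above.
modules = {
    "Theoretical Perspectives": {"knowledge": 2, "cognitive": 4, "affective": 1, "psychomotor": 0},
    "Research Design and Techniques": {"knowledge": 1, "cognitive": 1, "affective": 3, "psychomotor": 3},
}

# Print each module's profile so the balance (or imbalance) across domains is visible.
for name, ilos in modules.items():
    total = sum(ilos.values())
    profile = ", ".join(f"{domain} {count}/{total}" for domain, count in ilos.items())
    print(f"{name}: {profile}")
```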

The basic principle is that an outcome should be assessable. Lots of great stuff can happen in your teaching and students’ learning that DOESN’T need to be assessed. It can be articulated in the syllabus, it just isn’t a measured outcome. A student should be able, at the end of this course of study (module or programme), to evidence that they have attained the intended learning outcomes. This evidence has been assessed in some way and the student is then able to point to the ILOs amassed throughout their programme and say “I can demonstrate that I learnt to DO this”.

Representing Taxonomies

There has been a significant shift in the language we now use from the original work in the 1950s by Bloom and colleagues. The passively descriptive language of Bloom’s Taxonomy has become the active language of Anderson and Krathwohl (Anderson & Krathwohl, 2001). The taxonomies have moved from Evaluation to Evaluate, from Analysis to Analyse. This is significant in that the emphasis has moved away from describing what the focus of the teaching is supposed to be, to the demonstrable outcomes of the learning.

The illustration above consists of four visual ‘wheels’ that I have used to discuss learning outcomes with faculty in the context of module and programme design at Massey University in New Zealand and at the LSE and BPP University College in the United Kingdom. These visual representations were inspired by work done elsewhere, on the cognitive domain in particular. The first documented example of this circular representation I have been able to find is attributed to Barbara Clark in 2002, but a great many people have since represented Bloom’s original, and the revised, cognitive domain in this way.

The circular representation has the higher-level terms at the centre, proto-verbs if you will, surrounded by a series of active verbs that articulate actions an individual might undertake to generate evidence of their ability to represent the proto-verb. The circular visualisation also serves to create a more fluid representation of the stages, or divisions, in the proto-verbs. Rather than a strict ‘step-by-step’ representation where one advances ‘up’ the proto-verbs, one might consider this almost like the dial on an old telephone: in every case one starts at the ‘foundational’ and dials up through the stages to the ‘highest’ level. Each level relies on the previous. It may be implicit that to analyse something, one will already have acquired a sense of its application, and that application is grounded in subject knowledge and understanding. So the circle is a useful way of visualising the interconnected nature of the process. Most importantly in my practice, it’s a great catalyst for debate.

The circular representations of the domains and associated taxonomies also serve to make learning designers aware of the language they use. Can a verb be used at different levels? Certainly. Why? Because context is everything. One might ‘identify’ different rock samples in a first-year geology class as part of applying a given classification of rocks to samples, or one might identify a new species of insect as part of a postgraduate research programme. The verb on its own does not always denote level. I talk about the structure of ILOs in a subsequent post.

Circular representation of Educational Taxonomies

Structure of the circular representations of Educational Taxonomies

More recent representations have created new, more complex forms that include the outer circle illustrated here. I’ve found these rather useful, in part because they often prove contentious. If the inner circle represents (in my versions) the proto-verbs within our chosen taxonomies, and the next circle represents the active verbs used to describe the Intended Learning Outcomes (ILOs) AND the Teaching and Learning Activities (TLAs), the outermost circle represents the evidence and assessment forms used to demonstrate that verb. Increasingly I’ve used this to identify educational technologies and to get faculty thinking more broadly about how they can assess things online as well as in more traditional settings. The outermost circle will continue to evolve as our use of educational technologies evolves. In Constructive Alignment one might reasonably expect students’ learning activity to ‘rehearse’ the skills they are ultimately to evidence in assessment (Biggs & Collis, 1982; Boud & Falchikov, 2006), and the forms to enable that are becoming increasingly varied.
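
For those who prefer to see the structure rather than the picture, here is a minimal sketch of how one slice of a wheel might be captured as data, mapping a proto-verb (inner ring) to active verbs (middle ring) and on to evidence and assessment forms (outer ring); the particular verbs and forms are my illustrative assumptions, not a transcription of the wheels themselves.

```python
# One slice of a 'wheel' captured as data: proto-verb -> active verbs -> evidence forms.
# The entries are illustrative choices, not a transcription of the actual diagrams.
cognitive_wheel = {
    "Analyse": {
        "active_verbs": ["compare", "differentiate", "organise", "attribute"],
        "evidence_forms": ["annotated bibliography", "concept map", "forum critique"],
    },
    "Evaluate": {
        "active_verbs": ["critique", "judge", "justify"],
        "evidence_forms": ["peer review", "blog post with rationale", "viva"],
    },
}

# For example, list the evidence forms a designer might pair with the verb 'critique'.
for proto_verb, rings in cognitive_wheel.items():
    if "critique" in rings["active_verbs"]:
        print(proto_verb, "->", rings["evidence_forms"])
```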

Re-visioning Taxonomies

One of my favourite representations of the relationship between the knowledge dimension and the cognitive domain is from Rex Heer at Iowa State University’s Center for Excellence in Learning and Teaching (http://www.celt.iastate.edu/teaching/RevisedBlooms1.html). It’s an interactive model that articulates the relationship, as Anderson and Krathwohl saw it, rather well. My own interest, as we look to effective ILOs, is to separate out the knowledge dimension as a subject or knowledge domain and have faculty articulate this clearly for students, before reconnecting it to the other domains, a process I’ll talk about subsequently.

Here are my four ‘working circles’, using adaptations of taxonomies from Anderson and Krathwohl (Knowledge and Understanding, and Cognitive), Krathwohl et al. (Affective) and Dave (Psychomotor). I have adapted the Knowledge Dimension of Anderson and Krathwohl to do two things: to describe the dimension in terms of active verbs rather than as a definition of the nature of the knowledge itself, and to incorporate a stage I believe is under-represented in their articulation. I have added the ability to ‘contextualise’ subject knowledge between the ability to specify it (Factual) and the ability to conceptualise (Conceptual). I have also rearticulated the original ‘Metacognitive’ as the ability to ‘Abstract‘. This will doubtless need further work. My intent is not to dismiss the valuable work already in evidence around the relationship between a knowledge dimension and the cognitive domain; rather it is to enable faculty, specifically when writing learning outcomes, to identify the subject, discipline or knowledge to be enabled in more meaningful ways.

These are provided as JPG images. If you would like me to email the original PowerPoint slides (very low-tech!) so that you can edit, amend and enhance them, I am happy to do so. I only ask that you enhance my practice by sharing your results with me.

I hope these provoke thought, reflection and comment. Feel free to use them with colleagues in discussion and let me know if there are enhancements you think would make them more useful to others.

Cognitive Domain – Intellectual Skills

Affective Domain – Professional and Personal Skills

Psychomotor Domain – Practical, Technical and Transferable Skills

Knowledge Domain – Subject and Discipline Knowledge

The next post will illustrate the usefulness of these visualisations in drafting Intended Learning Outcomes with some examples.

………………………………………………………………………………………

Anderson, L. W., & Krathwohl, D. R. (2001). A taxonomy for learning, teaching, and assessing: a revision of Bloom’s taxonomy of educational objectives. New York: Longman.

Biggs, J. B., & Collis, K. F. (1982). Evaluating the Quality of Learning: Structure of the Observed Learning Outcome Taxonomy. Academic Press Inc.

Boud, D., & Falchikov, N. (2006). Aligning assessment with long‐term learning. Assessment & Evaluation in Higher Education, 31(4), 399–413.

………………………………………………………………………………………..

Edited October 19th 2012 in response to feedback.


Distance Education Conference Madison-Wisconsin

August 11, 2011

SOLE and DiAL-e Sites August 2011

The 27th Annual Distance Learning and Teaching Conference at Madison-Wisconsin this August offered a diverse and varied programme, attended by some 900 distance educators from all sectors, from K-12 to professional education. My contribution was a half-day DiAL-e Workshop with Kevin Burden (University of Hull), attended by some 24 people. The workshop went relatively well but also gave us an insight into a variety of cultural differences in such settings. I was able to learn from this, and the 45-minute Information Session on the SOLE model the following day definitely had a better ‘buzz’. In addition I contributed to a new format this year, a 5+10 Videoshare session, where participants had (supposedly!) produced a 5-minute video and then made themselves available to discuss it for 10 minutes.

All the sessions went well but the SOLE model and toolkit seemed to grab some serious interest and I will hope to have the opportunity to go back to the States and work with colleagues on learning design projects in the future.


LAMS Learning Design Conference Presentation

September 6, 2010

Finally this weekend I got around to putting the slides to the audio that was recorded at the European LAMS & Learning Design Conference 2010. I’ve uploaded the presentation to YouTube in two parts. Part 1 essentially introduces the Student-Owned Learning-Engagement (SOLE) model itself and Part 2 highlights the most recent version of the toolkit in Excel.

Part 1: The Model

Part 2: The Toolkit


Releasing SOLE Toolkit v1.1

May 18, 2010

With thanks to colleagues from the GradDip Primary programme at Massey University who yesterday provided some great feedback, comment and criticism on the Student-Owned Learning-Engagement model. I briefly presented the SOLE model and explained the underlying rationale, and then showed the ‘rough’ version of the Excel workbook that constitutes the ‘toolkit’.

The toolkit (see ‘pages’) is in some respects rather simple, but it appears to have captured the imagination of the group and as such was a spur to further development. So, after incorporating some minor amendments, I’ve taken the plunge and released version 1.1 to the world! I have also created a short YouTube video to explain the basic structure and plan to develop some other resources soon.

Here’s the video – a new channel has been created at http://www.YouTube.com/TheSOLEmodel

