Defining Transferable Skills

March 19, 2017

Recently I have been advising colleagues on how to write Intended Learning Outcomes (ILOs) across all five educational domains (cognitive, knowledge, affective, psychomotor and interpersonal) while conforming to the QAA guidance (UK). This guidance (widely adopted across the UK higher education sector) breaks ILOs into:

  • Knowledge and Understanding
  • Intellectual Skills
  • Practical and Professional Skills
  • Transferable Skills.

I don’t agree with this guidance and would prefer learning designers to identify a balance of outcomes appropriate to the nature of the discipline, the focus of the module and the module’s shape or purpose within a programme. I suggest it makes more sense to do this by using five distinct domains rather than the existing four vaguely defined categories. Pragmatically, though, it is possible to map the five distinct domains onto the four existing categories. This is illustrated in Table 1 below, with a small code sketch after the table.

Table 1.         Mapping educational domains to QAA categories

Domain | QAA Category | Description
Knowledge | Knowledge and Understanding | Knowledge describes the scope of the subject, intended to represent the ‘nature’ of the discipline with reference to the personal-epistemological and metacognitive development of students.
Cognitive | Intellectual Skills | Cognitive, often referred to as intellectual skills, refers to ‘knowledge structures’ in cognition: the progressively complex use of knowledge artefacts.
Affective | Practical and Professional Skills | Affective, sometimes referred to as professional ‘skills’ or attributes, concerns the perception of value issues, ranging from simple awareness (Receiving) through to the internalization of personal value systems.
Psychomotor | Transferable Skills | Psychomotor, referred to as practical skills, covers progressively complex manual or physical skills, such as the ability to use a complex piece of software, an instrument or paper documentation.
Interpersonal | Transferable Skills | Interpersonal, referred to as communication skills, covers progressively complex levels of interpersonal communication, conflict resolution, collaboration and cross-cultural communication.
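For designers who keep their ILO inventories in code or spreadsheets, the pragmatic five-to-four mapping in Table 1 can be expressed as a simple lookup. This is a minimal sketch only; the dictionary and function names are my own illustration, under the assumption that you want to record five domains while still filing outcomes under the four QAA categories.

```python
# Hypothetical lookup expressing the Table 1 mapping of five educational
# domains onto the four QAA categories. Names are illustrative only.
DOMAIN_TO_QAA = {
    "Knowledge": "Knowledge and Understanding",
    "Cognitive": "Intellectual Skills",
    "Affective": "Practical and Professional Skills",
    "Psychomotor": "Transferable Skills",
    "Interpersonal": "Transferable Skills",
}

def qaa_category(domain: str) -> str:
    """Return the QAA category that a given educational domain maps onto."""
    try:
        return DOMAIN_TO_QAA[domain]
    except KeyError:
        raise ValueError(f"Unknown educational domain: {domain!r}")

for domain in DOMAIN_TO_QAA:
    print(f"{domain:13s} -> {qaa_category(domain)}")
```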

As I have stated elsewhere, I think higher education fails to accurately describe the skills, attributes and knowledge that students are intended to acquire through their studies. Creating meaningful ILOs is the beginning of well-designed, constructively aligned curricula.


Simplifying Assessment Alignment

November 22, 2016

Some recent work with programme designers in other UK institutions suggests to me that quality assurance and enhancement measures continue to be appended to the existing policies and practices of UK HEIs rather than prompting a revitalising redesign of the entire design and approval process.

This is a shame because it has produced a great deal of work for faculty in designing and administering programmes and modules, not least when it comes to assessment. Whatever you feel about intended learning outcomes (ILOs) and their constraints or structural purpose, there is nearly universal agreement that the purpose of assessment is not to assess students’ ‘knowledge of the content’ of a module. Rather, the intention of assessment is to demonstrate higher learning skills, most commonly codified in the intended learning outcomes. I have written elsewhere about the paucity of effective ILO writing and the tendency to focus ILOs almost entirely on the cognitive domain (intellectual skills), omitting the other skill domains, notably the affective (professional skills) and the psychomotor (transferable skills). Here I want to identify the need for close proximity between ILOs and assessment criteria.

It seems to me that well-designed intended learning outcomes lead to cogent assessment design. They also lend themselves to a transparent marking rubric, used by both markers and students, which creates a simpler process.

To illustrate this, I want to share two alternative approaches to aligning assessment with the outcomes of a specific module. To preserve the confidentiality of the module in question some elements have been omitted, but hopefully the point will still be clearly made.

A Complex Approach to Assessment Alignment

[Image: Complex assessment alignment]

I have experienced this process in several universities.

  1. Intended Learning Outcomes are written (normally at the end of the ‘design’ process)
  2. ILOs are mapped to different categorizations of domains: Knowledge and Understanding, Intellectual Skills, Professional Skills and Attitudes, and Transferable Skills.
  3. ILOs are mapped against assessments, sometimes even mapped to subject topics or weeks.
  4. Students get first sight of the assessment.
  5. Assessment Criteria are written for students using different categories of judgement: Organisation, Implementation, Analysis, Application, Structure, Referencing, etc.
  6. Assessment Marking Schemes are then written for assessors. Often with guidance as to what might be expected at specific threshold stages in the marking scheme.
  7. General Grading Criteria are then developed to map the scheme’s outcomes back to the ILOs.

 

Streamlined Version of Aligned Assessment

[Image: Streamlined marking rubric]

I realise that this proposed structure is not suitable for all contexts, all educational levels and all disciplines. Nonetheless, I would advocate it as the optimal approach wherever it can be applied.

  1. ILOs are written using a clear delineation of domains: Knowledge, Cognitive (Intellectual), Affective (Values), Psychomotor (Skills) and Interpersonal. These use appropriate verb structures tied directly to appropriate levels. This process is explained in this earlier post.
  2. A comprehensive marking rubric is then shared with both students and assessors. It identifies all of the ILOs that are being assessed; in principle, in UK higher education we should only be assessing the ILOs, NOT content. The rubric differentiates the types of response expected to achieve each grading level.
    • There is an option to automatically sum the grades given against specific outcomes or to take a more holistic view.
    • It is possible to weight specific ILOs as being worth more marks than others (a minimal code sketch of this weighting follows the list).
    • This approach works for portfolio assessment, but also for a model of assessment where there are perhaps two or three separate pieces of assessment, assuming each piece is linked to two or three ILOs.
    • Feedback is given against each ILO on the same rubric (I use Excel workbooks).
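To make the summing and weighting options above concrete, here is a minimal sketch of the calculation a rubric might perform. It is a Python illustration under assumed inputs: the ILO labels, weights, marks and the function name are my own and are not taken from my Excel workbooks.

```python
# Hypothetical weighted rubric: each ILO receives a mark out of 100 and a
# weight reflecting its share of the overall grade. All values are invented.
ilo_weights = {
    "ILO1 (Cognitive)": 0.30,
    "ILO2 (Affective)": 0.20,
    "ILO3 (Psychomotor)": 0.25,
    "ILO4 (Interpersonal)": 0.25,
}

student_marks = {
    "ILO1 (Cognitive)": 72,
    "ILO2 (Affective)": 58,
    "ILO3 (Psychomotor)": 65,
    "ILO4 (Interpersonal)": 80,
}

def overall_grade(marks, weights):
    """Weighted sum of per-ILO marks; assumes the weights sum to 1.0."""
    return sum(marks[ilo] * weight for ilo, weight in weights.items())

for ilo, weight in ilo_weights.items():
    print(f"{ilo}: mark {student_marks[ilo]}, weight {weight:.2f}")
print(f"Overall grade: {overall_grade(student_marks, ilo_weights):.1f}")
```

Setting every weight equal reproduces the simple ‘sum the grades’ option; adjusting the weights implements the weighting bullet without changing the feedback structure.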

I would suggest that it makes sense to use this streamlined process even if it means rewriting your existing ILOs. I’d be happy to engage in debate with anyone about how best to use the streamlined process in their context.


Using Learning Design to Unleash the Power of Learning Analytics

December 4, 2015


Atkinson, S.P. (2015). Using Learning Design to Unleash the Power of Learning Analytics. In T. Reiners, B.R. von Konsky, D. Gibson, V. Chang, L. Irving, & K. Clarke (Eds.), Globally connected, digitally enabled. Proceedings ascilite 2015 in Perth (pp. 358-364 / CP:6-CP:10).


 

A very enjoyable presentation made this week at ascilite 2015 in Perth, Australia. Wonderful to engage with this vibrant and hospitable community. Amongst some fascinating presentations exploring the theoretical and information-management dimensions of learning analytics and academic analytics, my very foundational work on constructively aligned curricula and transparency in design was, I believe, welcomed.

I said in my paper that I believed “New learning technologies require designers and faculty to take a fresh approach to the design of the learner experience. Adaptive learning, and responsive and predictive learning systems, are emerging with advances in learning analytics. This process of collecting, measuring, analysing and reporting data has the intention of optimising the student learning experience itself and/or the environment in which the experience of learning occurs… it is suggested here that no matter how sophisticated the learning analytics platforms, algorithms and user interfaces may become, it is the fundamentals of the learning design, exercised by individual learning designers and faculty, that will ensure that technology solutions will deliver significant and sustainable benefits. This paper argues that effective learning analytics is contingent on well structured and effectively mapped learning designs.”



Graduate Competencies, Employability and Educational Taxonomies: Critique of Intended Learning Outcomes

July 18, 2015

We hear much about the changing world of work and how slow higher and professional education is to respond. So, in an increasingly competitive global market of Higher Apprenticeships and work-based learning provision, I began to take a particular interest in students’ ‘graduateness’. What had begun as an exploratory look for examples of intended learning outcomes (ILOs) with ‘employability’ in mind ended up as a critical review published as ‘Graduate Competencies, Employability and Educational Taxonomies: Critique of Intended Learning Outcomes’ in the journal Practice and Evidence of the Scholarship of Teaching and Learning in Higher Education.

I randomly identified 20 UK institutions and 80 undergraduate modules and examined their ILOs. This resulted in 435 individual ILOs being taken by students in current modules (academic year 2014-2015) across different stages of their undergraduate journey (ordinarily in the UK this takes place over three years, through Levels 4, 5 and 6). This research reveals the lack of specificity of ILOs in terms of the skills, literacies and graduate attributes that employers consistently say they want from graduates.

The table below, taken from the full paper, describes the post-analysis attribution of ILOs to domains of educational objectives (see the paper for the methodology), and I found the results rather surprising. The first surprise was the significant percentage of ILOs that are poorly structured, given the weight of existing practice guidance and encouragement for learning designers and validators (notably from the UK Higher Education Academy and the UK Quality Assurance Agency). Some 94 individual ILOs (21.6%) had no discernible active verbs in their construction. A further 64 ILOs (14.7%) did not contain any meaningful verbs and so could not be mapped to any educational domain; this included the infamous ‘to understand’ and ‘to be aware of’. As a result, only 276 ILOs (64%) were deemed ‘well-structured’ and were then mapped against four domains of educational objectives.

Table 8.          Post-analysis attribution of ILOs to Domains of Educational Objectives

Domain | Level 4 | Level 5 | Level 6 | Total
Knowledge (Subject Knowledge) | 14 | 5 | 11 | 30
Cognitive (Intellectual Skills) | 46 | 91 | 61 | 198
Affective (Professional Skills) | 1 | 4 | 1 | 6
Psychomotor (Practical/Transferable Skills) | 12 | 18 | 13 | 43
No verbs | 35 | 32 | 27 | 94
Not classifiable | 23 | 30 | 11 | 64
Totals | 131 | 180 | 125 | 435

Remember that what I had originally been looking for were examples of ILOs representing the skills that the literature on employability and capabilities suggests should be there. These could have been anticipated to fall into the affective or psychomotor domains.

So it was rather surprising to see that, of the 64% of the full sample that was codeable, a sizeable percentage were cognitive (45.4%), a relatively small percentage fell into the psychomotor domain (9.8%), even fewer into the knowledge domain (6.8%), and a remarkably small number could be deemed affective (1.4%).
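For anyone who wants to reproduce the kind of tally behind these figures, here is a minimal sketch working from the Total column of Table 8. It is illustrative only, not the study’s analysis code, and the counts and percentages reported in the paper may differ marginally from this simple tally because of rounding and the coding decisions described there.

```python
# Illustrative tally of the Table 8 totals: the share of the 435 sampled ILOs
# falling into each category. Not the published study's analysis script.
totals = {
    "Knowledge (Subject Knowledge)": 30,
    "Cognitive (Intellectual Skills)": 198,
    "Affective (Professional Skills)": 6,
    "Psychomotor (Practical/Transferable Skills)": 43,
    "No verbs": 94,
    "Not classifiable": 64,
}

sample_size = sum(totals.values())  # 435 ILOs in the sample

for category, count in totals.items():
    print(f"{category}: {count} ({count / sample_size:.1%} of the sample)")

codeable = sum(count for category, count in totals.items()
               if category not in ("No verbs", "Not classifiable"))
print(f"Codeable ILOs: {codeable} ({codeable / sample_size:.1%} of the sample)")
```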

I say remarkable because the affective domain, sometimes detailed as personal and professional skills, covers very much the skills that employers (and most graduates) prize above all else. These refer to the development of values and the perception of values, including professionalism, inter-personal awareness, timeliness, ethics, inter-cultural sensitivity, and diversity and inclusivity issues.

Apparently, despite all the sterling work going on in our libraries and careers services, employment-ready priorities within programmes and modules in higher education are not integrated with teaching and learning practices. I suggest that, as a consequence, this makes it difficult for students to extract from their learning experience within modules the tangible skill development required of them as future employees.

There is an evident reliance by module designers on the cognitive domain, most commonly associated at a lower level with ‘knowing and understanding’ and at a higher level with ‘thinking and intellectual skills’. The old favourites ‘to critically evaluate’ and ‘to critically analyse’ remain perennial.

There is much more to the picture than this single study attempts to represent, but I think it is remarkable that more attention is not being paid to the affective and psychomotor domains in module creation.

More analysis, and further data collection, will be done to explore the issue at programme level and in stage outcomes (is it plausible that module ILOs are simply unmapped and unrelated, yet all is well at programme level?). I would also be interested to explore the mapping of module and programme ILOs to the specified graduate attributes that many institutions make public.

In the full paper I go on to discuss the relative balance of different ILOs in each of the domains depending on the nature of the learning, whether it is a clinical laboratory module, a fieldwork module or a literature-based module.

The reason I think this is important, as I have written here before (it is about semantics!), is that students are increasingly demanding control over their choices, their options, the shape of their portfolios and their ‘graduateness’. They need to be able to identify their own strengths and weaknesses and make meaningful module choices to modify the balance of the skills acquired in a ‘practical’ module compared with those in a ‘cerebral’ one. I conclude that the ability to consciously build a ‘skills profile’ is a useful graduate attribute in itself… which, incidentally, would be an affective ILO.

You can download the full paper here LINK.

Also available on ResearchGate and Academia.edu

Full citation:
Atkinson, S. 2015 Jul 9. Graduate Competencies, Employability and Educational Taxonomies: Critique of Intended Learning Outcomes. Practice and Evidence of the Scholarship of Teaching and Learning in Higher Education [Online] 10:2. Available: http://community.dur.ac.uk/pestlhe.learning/index.php/pestlhe/article/view/194/281


Enhancements to the SOLE Toolkit – now version 3.5

May 31, 2015

I have no idea what the protocol is for naming versions of things. I imagine that, like me, someone has an idea of what the stages are going to look like and when a truly fresh release is going to happen. I have a sense that version 4.0 of the SOLE Toolkit will incorporate what I am currently learning about assessment and ‘badges’, self-certification and team marking. But I am not there yet; for now I am building on what I have learnt about student digital literacies, so I will settle for version 3.5.

This version of the SOLE Toolkit, 3.5, remains a completely free, unprotected and macro-free Excel workbook with rich functionality to serve the learning designer. In version 3.0 I added more opportunities for the student to use the toolkit as an advance organiser, offering ways to record their engagement with their learning. It also added some ability to sequence learning so that students could plan their learning better, although I maintained that this was guidance only and should still allow students to determine their own pathways for learning.

Version 3.5 has two significant enhancements. Firstly, it introduces a new dimension, providing a rich visualization of the learning spaces and tools that students are to engage with in their learning. This provides an alternative, fine-grained view of students’ modes of engagement in their learning. It permits the designer to plan not only for a balance of learning engagement but also for a balance of environments and tools. This should allow designers to identify where ‘tool-boredom’ or ‘tool-weariness’ is a possible danger to learner motivation and to ensure that a range of tools and environments allows students to develop based on their own learning preferences.

Secondly, it allows for a greater degree of estimation of staff workload, part of the original purpose of the SOLE Model and Toolkit project back in 2009. These faculty-time calculations, for both design and facilitation, are based on the learning spaces and tools to be used. This function allows programme designers and administrators, as well as designers themselves, to calculate the amount of time they are likely to need to design materials and facilitate learning around those materials.
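To show the kind of calculation this supports, here is a minimal sketch of a tool-driven workload estimate. The tools listed and the hours per use are invented placeholders for illustration; they are not the SOLE Toolkit’s built-in values, and the real workbook works activity by activity rather than in a few lines of Python.

```python
# Hypothetical staff-workload estimate driven by the learning spaces/tools
# chosen in a design. All hours below are invented placeholder values.
tool_hours = {
    # tool/space: (design hours per planned use, facilitation hours per use)
    "discussion forum": (1.0, 2.0),
    "recorded lecture": (4.0, 0.5),
    "face-to-face seminar": (2.0, 2.0),
    "wiki": (1.5, 1.0),
}

# Planned uses of each tool across a hypothetical ten-week module.
planned_uses = {
    "discussion forum": 10,
    "recorded lecture": 8,
    "face-to-face seminar": 5,
    "wiki": 6,
}

design_total = sum(tool_hours[tool][0] * uses for tool, uses in planned_uses.items())
facilitation_total = sum(tool_hours[tool][1] * uses for tool, uses in planned_uses.items())

print(f"Estimated design time:       {design_total:.1f} hours")
print(f"Estimated facilitation time: {facilitation_total:.1f} hours")
print(f"Estimated total staff time:  {design_total + facilitation_total:.1f} hours")
```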

I invite you to explore the SOLE Toolkit on the dedicated website for the project and would welcome any comments or feedback you might have.


Visualisation of Educational Taxonomies

August 22, 2013

Sharing a paper today on the visualisation of educational taxonomies. I have finally got around to putting into a paper some of the blog postings, discussions, tweets and ruminations of recent years on educational taxonomies. I am always struck, in talking to US educators (and faculty training teachers in particular), by the very direct use made of Bloom’s original 1956 educational taxonomy for the cognitive domain. They seem oblivious, however, to other work that might sit (conceptually) alongside Bloom as a way to support their practice.

http://www.academia.edu/4289077/Taxonomy_Circles_Visualizing_the_possibilities_of_intended_learning_outcomes

In New Zealand, whilst at Massey, I got into some fascinating discussions with education staff about the blurring of the affective and cognitive domains, significant in cross-cultural education, and this led me to look for effective representations of the domains. I came across an unattributed circular representation that made instant sense to me, and I set about mapping other domains in the same way. In the process I found not only a tool that supported and reinforced the conceptual framework represented by Constructive Alignment, but also a visualisation that supported engagement with educational technologies and assessment tools. I hope this brief account is of use to people and am, as always, very open to feedback and comment.

I’m very grateful to those colleagues across the globe who have expressed interest in using these visual representations and hope to be able to share some applicable data with everyone in due course.


Re-visioning Learning Spaces: Evolving faculty roles and emerging learning spaces

May 16, 2013

Published a short working paper today entitled “Re-visioning Learning Spaces: Evolving faculty roles and emerging learning spaces”. In July I’ll be running some face-to-face workshops to explore the ideas in the paper and there will be a version two.

New builds and refurbishments of educational spaces can be significant financial commitments and often represent ‘flagship’ investments for many universities. However, apart from their marketing-brochure appeal and the contemporary feel-good factor for current students of ‘being there’, we should question whether they are really supporting effective learning. This paper advocates that truly effective spaces need to be more closely associated with the particular learning contexts one is seeking to enrich. Re-visioning our learning spaces requires universities to create and engage with a conceptual model of the learner and faculty, to develop not just new spaces but also support for new roles within those spaces. The SOLE model is presented as a conceptual framework through which new spaces and new faculty roles are considered.

The paper can be downloaded from Academia.edu or directly from the BPP University College pages (ISBN 9781 4453 5457 6 / publication date: May 2013).
