Defining Transferable Skills

March 19, 2017

Recently I have been advising colleagues on how they should write Intended Learning Outcomes (ILOs) across all five educational domains (cognitive, knowledge, affective, psychomotor and interpersonal) while conforming to the QAA guidance (UK). This guidance (widely adopted across the UK higher education sector) breaks ILOs into:

  • Knowledge and Understanding
  • Intellectual Skills
  • Practical and Professional Skills
  • Transferable Skills.

I don’t agree with this guidance and would prefer learning designers to identify a balance of outcomes appropriate to the nature of the discipline, the focus of the module and the module’s shape or purpose within a programme. I suggest it makes more sense to do this by using five distinct domains, rather than the existing four vaguely defined categories. Pragmatically, though, it is possible to map five distinct domains onto the four existing categories. This is illustrated below.

Table 1.         Mapping educational domains to QAA categories

 Domain | QAA Category | Description
Knowledge | Knowledge and Understanding | Knowledge often describes the scope of the subject, intended to represent the ‘nature’ of the discipline with reference to the personal-epistemological and metacognitive development of students
Cognitive | Intellectual Skills | Cognitive, often referred to as intellectual skills, refers to ‘knowledge structures’ in cognition: the progressively complex use of knowledge artefacts
Affective | Practical and Professional Skills | Affective, sometimes referred to as professional ‘skills’ or attributes, concerns the perception of value issues, ranging from simple awareness (Receiving) through to the internalisation of personal value systems
Psychomotor | Transferable Skills | Psychomotor, often referred to as practical skills, refers to progressively complex manual or physical skills. This could be the ability to use a complex piece of software, instrument or paper documentation
Interpersonal | Transferable Skills | Interpersonal, often referred to as communication skills, refers to progressively complex levels of interpersonal communication, conflict resolution, collaboration and cross-cultural communication
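The mapping in Table 1 can be expressed as a simple lookup. A minimal sketch in Python (the function name `categorise` is my own, purely for illustration):

```python
# Mapping the five educational domains onto the four QAA categories (Table 1).
QAA_CATEGORY = {
    "Knowledge": "Knowledge and Understanding",
    "Cognitive": "Intellectual Skills",
    "Affective": "Practical and Professional Skills",
    "Psychomotor": "Transferable Skills",
    "Interpersonal": "Transferable Skills",
}

def categorise(domain: str) -> str:
    """Return the QAA category for a given educational domain."""
    return QAA_CATEGORY[domain]

# Psychomotor and Interpersonal collapse into a single QAA category,
# which is why the five-domain view is the finer-grained one.
assert categorise("Psychomotor") == categorise("Interpersonal")
```

The collision on ‘Transferable Skills’ is the crux of the argument: the four-category scheme cannot distinguish physical skills from interpersonal ones.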

As I have stated elsewhere, I think higher education fails to accurately describe the skills, attributes and knowledge that students are intended to acquire through their studies. Creating meaningful ILOs is the beginning of well-designed, constructively aligned curricula.


Simplifying Assessment Alignment

November 22, 2016

Some recent work with programme designers in other UK institutions suggests to me that quality assurance and enhancement measures continue to be appended to the policies and practices carried out in UK HEIs rather than seeing a revitalising redesign of the entire design and approval process.

This is a shame because it has produced a great deal of work for faculty in designing and administering programmes and modules, not least when it comes to assessment. Whatever you feel about intended learning outcomes (ILOs) and their constraints or structural purpose, there is nearly universal agreement that the purpose of assessment is not to assess students’ ‘knowledge of the content’ of a module. Rather, the intention of assessment is to demonstrate higher learning skills, most commonly codified in the intended learning outcomes. I have written elsewhere about the paucity of effective ILOs and the tendency to focus them almost entirely on the cognitive domain (intellectual skills), omitting other skill domains, notably the affective (professional skills) and the psychomotor (transferable skills). Here I want to identify the need for close proximity between ILOs and assessment criteria.

It seems to me that well-designed intended learning outcomes lead to cogent assessment design. They also suggest that a transparent marking rubric, used by both markers and students, creates a simpler process.

To illustrate this I wanted to share two alternative approaches to aligning assessment to the outcomes of a specific module. In order to preserve the confidentiality of the module in question some elements have been omitted but hopefully the point will still be clearly made.

A Complex Approach to Assessment Alignment

I have experienced this process in several universities.

  1. Intended Learning Outcomes are written (normally at the end of the ‘design’ process)
  2. ILOs are mapped to different categorizations of domains, Knowledge & Understanding, Intellectual Skills, Professional Skills and Attitudes, Transferable Skills.
  3. ILOs are mapped against assessments, sometimes even mapped to subject topics or weeks.
  4. Students get first sight of the assessment.
  5. Assessment Criteria are written for students using different categories of judgement: Organisation, Implementation, Analysis, Application, Structure, Referencing, etc.
  6. Assessment Marking Schemes are then written for assessors. Often with guidance as to what might be expected at specific threshold stages in the marking scheme.
  7. General Grading Criteria are then developed to map the schemes outcomes back to the ILOs.

 

Streamlined version of aligned assessment


I realise that this proposed structure is not suitable for all contexts, educational levels and disciplines. Nonetheless, I would advocate it as the optimal approach.

  1. ILOs are written using a clear delineation of domains: Knowledge, Cognitive (Intellectual), Affective (Values), Psychomotor (Skills) and Interpersonal. These use appropriate verb structures tied directly to appropriate levels. This process is explained in this earlier post.
  2. A comprehensive marking rubric is then shared with both students and assessors. It identifies all of the ILOs that are being assessed. In principle, in UK higher education we should only be assessing the ILOs, NOT content. The rubric differentiates the type of responses expected to achieve various grading levels.
    • There is an option to automatically sum grades given against specific outcomes or to take a more holistic view.
    • It is possible to weight specific ILOs as being worth more marks than others.
    • This approach works for portfolio assessment but also for a model of assessment where there are perhaps two or three separate pieces of assessment assuming each piece is linked to two or three ILOs.
    • Feedback is given against each ILO on the same rubric (I use Excel workbooks)
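The summing and weighting options above can be sketched very simply. A minimal illustration in Python (the ILO names, marks and weights are invented; the Toolkit I describe actually lives in Excel workbooks):

```python
# Combining per-ILO marks using per-ILO weights, as in a weighted marking rubric.
def weighted_grade(marks: dict, weights: dict) -> float:
    """Combine per-ILO marks (0-100) using per-ILO weights that sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(marks[ilo] * weights[ilo] for ilo in marks)

marks = {"ILO1": 65, "ILO2": 72, "ILO3": 58}       # marker's grade against each ILO
weights = {"ILO1": 0.5, "ILO2": 0.3, "ILO3": 0.2}  # ILO1 weighted most heavily
print(round(weighted_grade(marks, weights), 2))    # 65.7
```

The holistic alternative mentioned above would simply replace the weighted sum with the marker’s overall judgement informed by the same per-ILO feedback.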

I would suggest that it makes sense to use this streamlined process even if it means rewriting your existing ILOs. I’d be happy to engage in debate with anyone about how best to use the streamlined process in their context.


Using Learning Design to Unleash the Power of Learning Analytics

December 4, 2015


Atkinson, S.P. (2015). Using Learning Design to Unleash the Power of Learning Analytics. In T. Reiners, B.R. von Konsky, D. Gibson, V. Chang, L. Irving, & K. Clarke (Eds.), Globally connected, digitally enabled. Proceedings ascilite 2015 in Perth (pp. 358-364 / CP:6-CP:10).


 

A very enjoyable presentation made this week at ascilite 2015 in Perth, Australia. It was wonderful to engage with this vibrant and hospitable community. Amongst some fascinating presentations exploring the theoretical and information-management dimensions of learning analytics and academic analytics, my very foundational work on constructively aligned curricula and transparency in design was, I believe, welcomed.

I said in my paper that I believed “New learning technologies require designers and faculty to take a fresh approach to the design of the learner experience. Adaptive learning, and responsive and predicative learning systems, are emerging with advances in learning analytics. This process of collecting, measuring, analysing and reporting data has the intention of optimising the student learning experience itself and/or the environment in which the experience of learning occurs… it is suggested here that no matter how sophisticated the learning analytics platforms, algorithms and user interfaces may become, it is the fundamentals of the learning design, exercised by individual learning designers and faculty, that will ensure that technology solutions will deliver significant and sustainable benefits. This paper argues that effective learning analytics is contingent on well structured and effectively mapped learning designs.”



Graduate Competencies, Employability and Educational Taxonomies: Critique of Intended Learning Outcomes

July 18, 2015

We hear much about the changing world of work and how slow higher and professional education is to respond. So, in an increasingly competitive global market of Higher Apprenticeships and work-based learning provision, I began to take a particular interest in students’ ‘graduateness’. What had begun as an exploratory look for examples of intended learning outcomes (ILOs) with ‘employability’ in mind ended up as a critical review published as an article entitled ‘Graduate Competencies, Employability and Educational Taxonomies: Critique of Intended Learning Outcomes’ in the journal Practice and Evidence of the Scholarship of Teaching and Learning in Higher Education.

I randomly identified 20 UK institutions and 80 undergraduate modules, and examined their ILOs. This yielded 435 individual ILOs being taken by students in current modules (academic year 2014–2015) across different stages of their undergraduate journey (ordinarily in the UK this takes place over three years, through Levels 4, 5 and 6). The research reveals the lack of specificity of ILOs in terms of the skills, literacies and graduate attributes that employers consistently say they want from graduates.

The table below, from the full paper, describes the post-analysis attribution of ILOs to domains of educational objectives (see the paper for the methodology), and I found it rather surprising. The first surprise was the significant percentage of ILOs which are poorly structured, given the weight of existing practice guidance and encouragement for learning designers and validators (notably from the UK Higher Education Academy and the UK Quality Assurance Agency). Some 94 individual ILOs (21.6%) had no discernible active verbs in their construction. A further 64 ILOs (14.7%) did not contain any meaningful verbs and so could not be mapped to any educational domain; these included the infamous ‘to understand’ and ‘to be aware of’. As a result, only 276 ILOs (64%) were deemed ‘well-structured’ and were then mapped against four domains of educational objectives.

Table 8.          Post-analysis attribution of ILOs to Domains of Educational Objectives

 Domain | Level 4 | Level 5 | Level 6 | Total
Knowledge (Subject Knowledge) | 14 | 5 | 11 | 30
Cognitive (Intellectual Skills) | 46 | 91 | 61 | 198
Affective (Professional Skills) | 1 | 4 | 1 | 6
Psychomotor (Practical/Transferable Skills) | 12 | 18 | 13 | 43
No Verbs | 35 | 32 | 27 | 94
Not classifiable | 23 | 30 | 11 | 64
Totals | 131 | 180 | 125 | 435
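The headline percentages can be re-derived from the Table 8 totals. A small Python sketch (the variable names are mine):

```python
# Re-deriving the percentages quoted in the text from the Table 8 totals.
counts = {"Knowledge": 30, "Cognitive": 198, "Affective": 6,
          "Psychomotor": 43, "No Verbs": 94, "Not classifiable": 64}

total = sum(counts.values())
print(total)  # 435 ILOs in the sample

def pct(n: int) -> float:
    """Share of the full sample, as a percentage to one decimal place."""
    return round(100 * n / total, 1)

print(pct(counts["No Verbs"]))          # 21.6 - no discernible active verbs
print(pct(counts["Not classifiable"]))  # 14.7 - verbs unmappable to a domain

codeable = total - counts["No Verbs"] - counts["Not classifiable"]
print(pct(codeable))  # roughly 64% deemed 'well-structured'
```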

Remember, what I had originally been looking for were examples of ILOs representing the skills that the literature on employability and capabilities suggests should be there. These could have been anticipated to be in the affective or psychomotor domains.

So it was rather surprising to see that of the 64% of the full sample that was codeable, a sizeable percentage were cognitive (45.4%), a relatively small percentage fell into the psychomotor domain (9.8%), even fewer into the knowledge domain (6.8%), and a remarkably small number could be deemed affective (1.4%).

I say remarkable because the affective domain, sometimes detailed as personal and professional skills, covers very much the skills that employers (and most graduates) prize above all else. These refer to the development of values and the perception of value issues, including professionalism, inter-personal awareness, timeliness, ethics, inter-cultural sensitivity, and diversity and inclusivity issues.

Apparently, despite all the sterling work going on in our libraries and careers services, employment-ready priorities within programmes and modules in higher education are not integrated with teaching and learning practices. I suggest that, as a consequence, this makes it difficult for students to extract from their learning experience within modules the tangible skill development required of them as future employees.

There is an evident reliance by module designers on the cognitive domain, most commonly associated at a lower level with ‘knowing and understanding’ and at a higher level with ‘thinking and intellectual skills’. ‘To critically evaluate’ and ‘to critically analyse’ are perennial favourites.

There is much more to the picture than this single study attempts to represent, but I think it is remarkable that more attention is not being paid to the affective and psychomotor domains in module creation.

More analysis, and further data collection, will be done to explore the issue at programme level and stage outcomes (is it plausible that module ILOs are simply unmapped and unrelated, and all is well at programme level?). I would also be interested to explore the mapping of module and programme ILOs to the specified graduate attributes that many institutions make public.

In the full paper I go on to discuss the relative balance of ILOs across the domains depending on the nature of the learning, whether it is a clinical laboratory module, a fieldwork module or a literature-based module.

The reason I think this is important, and I have written here before that it is (it is about semantics!), is that students are increasingly demanding control over their choices, their options, the shape of their portfolios and their ‘graduateness’. They need to be able to identify their own strengths and weaknesses and make meaningful module choices to modify the balance of the skills acquired in a ‘practical’ module compared with those in a ‘cerebral’ one. I conclude that the ability to consciously build a ‘skills profile’ is a useful graduate attribute in itself… which incidentally would be an affective ILO.

You can download the full paper here LINK.

Also available on ResearchGate and Academia.edu

Full citation:
Atkinson, S. (2015). Graduate Competencies, Employability and Educational Taxonomies: Critique of Intended Learning Outcomes. Practice and Evidence of the Scholarship of Teaching and Learning in Higher Education [Online] 10(2). Available: http://community.dur.ac.uk/pestlhe.learning/index.php/pestlhe/article/view/194/281


Enhancements to the SOLE Toolkit – now version 3.5

May 31, 2015

I have no idea what the protocol is for naming versions of things. I imagine, like me, someone has an idea of what the stages are going to look like and when a truly fresh version is going to happen. I have a sense that version 4.0 of the SOLE Toolkit will incorporate what I am currently learning about assessment and ‘badges’, self-certification and team marking. But I’m not there yet; for now I am building on what I have learnt about student digital literacies, so I will settle for version 3.5.

This version of the SOLE Toolkit, 3.5, remains a completely free, unprotected and macro-free Excel workbook with rich functionality to serve the learning designer. In version 3.0 I added more opportunities for the student to use the toolkit as an advance organiser, offering ways to record their engagement with their learning. It also added some ability to sequence learning so that students could better plan their learning, although I maintained this was guidance only and should allow students to determine their own pathways for learning.

Version 3.5 has two significant enhancements. Firstly, it introduces a new dimension, providing a rich visualisation of the learning spaces and tools that students are to engage with in their learning. This provides an alternative, fine-grained view of students’ modes of engagement in their learning. It permits the designer to plan not only for a balance of learning engagement but also for a balance of environments and tools. This should allow designers to identify where ‘tool-boredom’ or ‘tool-weariness’ could endanger learner motivation, and to ensure that a range of tools and environments allows students to develop based on their own learning preferences.

Secondly, it allows for a greater degree of estimation of staff workload, part of the original purpose of the SOLE Model and Toolkit project back in 2009. These faculty-time calculations for design and facilitation are based on the learning spaces and tools to be used. This function allows programme designers and administrators, as well as designers themselves, to calculate the amount of time they are likely to need to design materials and facilitate learning around those materials.
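The kind of calculation involved can be sketched as a per-tool tally. This is purely illustrative: the tool names and hour figures below are invented, not the Toolkit’s actual values, and the real Toolkit does this in an Excel workbook.

```python
# Hypothetical per-tool time multipliers (hours per instance), invented for
# illustration: design effort up front, facilitation effort during delivery.
DESIGN_HOURS = {"video": 4.0, "discussion forum": 0.5, "quiz": 2.0}
FACILITATION_HOURS = {"video": 0.0, "discussion forum": 1.5, "quiz": 0.5}

def staff_workload(plan: list) -> tuple:
    """Sum estimated design and facilitation hours for a list of (tool, count)."""
    design = sum(DESIGN_HOURS[tool] * n for tool, n in plan)
    facilitation = sum(FACILITATION_HOURS[tool] * n for tool, n in plan)
    return design, facilitation

# A module plan: 3 videos, 10 forum activities, 4 quizzes.
plan = [("video", 3), ("discussion forum", 10), ("quiz", 4)]
print(staff_workload(plan))  # (25.0, 17.0)
```

Separating design from facilitation matters because the two costs fall on different people at different times, which is exactly what programme administrators need to see.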

I invite you to explore the SOLE Toolkit on the dedicated website for the project and would welcome any comments or feedback you might have.


Why ‘learning analytics’ is like a sewer

May 2, 2015

Back in the late northern-hemisphere summer of 2013 I drafted a background paper on the differences between educational data mining, academic analytics and learning analytics. Entitled ‘Adaptive Learning and Learning Analytics: a new design paradigm’, it was intended to ‘get everyone on the same page’, as many people at my university, from very different roles, responsibilities and perspectives, had something to say about ‘analytics’. Unfortunately for me, I then had nearly a year’s absence through ill health, and I came back to an equally obfuscated landscape of debate and deliberation. So I opted to finish the paper.

I don’t claim to be an expert on learning analytics, but I do know something about learning design, about teaching online and about adapting learning delivery and contexts to suit different individual needs. The paper outlines some of the social implications of big data collection. It looks to find useful definitions for the various fields of enquiry concerned with collecting learner data and making something useful with it to enrich the learning process. It then suggests some of the challenges such data collection involves (decontextualisation and privacy) and the opportunities it represents (self-directed learning and the SOLE Model). Finally, it explores the impact of learning analytics on learning design and suggests why we need to re-examine the granularity of our learning designs.

I conclude:

“The influences on the learner that lie beyond the control of the learning provider, employer or indeed the individuals themselves are extremely diverse. Behaviours in social media may not be reflected in work contexts, and patterns of learning in one discipline or field of experience may not be effective in another. The only possible solution to the fragmentation and intricacy of our identities is to have more, and more interconnected, data and that poses a significant problem.

Privacy issues are likely to provide a natural brake on the innovation of learning analytics. Individuals may not feel there is sufficient value to them personally in revealing significant information about themselves to data collectors outside the immediate learning experience, and that information may simply be inadequate to make effective adaptive decisions. Indeed, the value of the personal data associated with the emerging learning analytics platforms may soon see a two-tier pricing arrangement, whereby a student pays a lower fee if they engage fully in the data-gathering process, providing the learning provider with social and personal data as well as their learning activity, and higher fees for those who wish to opt out of the ‘data immersion’.

However sophisticated the learning analytics platforms, algorithms and user interfaces become in the next few years, it is the fundamentals of the learning design process which will ensure that learning providers do not need to ‘re-tool’ every 12 months as technology advances and that the optimum benefit for the learner is achieved. Much of the current commercial effort, informed by ‘big data’ and ‘every-click-counts’ models of Internet application development, is largely devoid of any educational understanding. There are rich veins of academic tradition and practice in anthropology, sociology and psychology, in particular, that can usefully inform enquiries into discourse analysis, social network analysis, motivation, empathy and sentiment study, predictive modelling and visualisation, and engagement and adaptive uses of semantic content (Siemens, 2012). It is the scholarship and research informed learning design itself, grounded in meaningful pedagogical and andragogical theories of learning, that will ensure that technology solutions deliver significant and sustainable benefits.

To consciously misparaphrase the American satirist Tom Lehrer, learning analytics and adaptive learning platforms are ‘like sewers: you only get out of them what you put into them’.”

Download the paper here, at Academia.edu or ResearchGate

Siemens, G. (2012). Learning analytics: envisioning a research discipline and a domain of practice. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (pp. 4–8). New York, NY, USA: ACM. doi:10.1145/2330601.2330605


ePortfolios: for whom and wherefore 

April 12, 2015

(unformed thoughts…) I’m exploring a student e-portfolio solution. I probably didn’t need to install an instance of Moodle (2.5.9) and one of Mahara (1.10.2) to know it’s a match made in heaven. It’s been fun patching the two systems together (I also linked Moodle to Google Drive to test the outputs there too – plain and simple), but then I began to wonder… Who’s it for? Whose version of heaven does it represent?

Mahara Test Environment

The very notion of an ‘e-portfolio’ might produce obstacles rather than opportunities. The word ‘portfolio’ means various things in different contexts to a variety of audiences. A portfolio might refer to a physical (or virtual) briefcase; to a financial portfolio as a collection of stocks, shares and asset notifications; to an artist’s portfolio as a collection of works in progress or final outputs; or to a body of responsibilities or projects currently held by someone in the workplace. Why, then, does the vast majority of scholarly literature on e-portfolios in a university context represent a uniform interpretation – an assemblage of reflections and representational artefacts?

I am afraid I’m going to bang an old drum: the need for learning design to consider the foundational aspects of epistemological beliefs. There is an ongoing debate as to the extent to which the portfolio is owned by the student (the majority view) or the institution. Where a portfolio is independent of any formal assessment processes, it is fairly easy to define it in terms that give students full control of its structure and output (within whatever restrictions the technology imposes). However, if there is any relationship between the representational space and formal assessment processes, the ownership (of process if not products) is at least shared. This changes the way we advocate the use of a portfolio. Ideally, student and tutor share a perception of learning as a personal reflective journey, one that the tutor can encourage while removing themselves from the process of meta-cognitive growth we hope students might experience.

There is also an important cultural dimension to our expectation that students will want to record and reflect in a portfolio context. I don’t mean ‘cultural’ with respect to international students, as important as that is, but I confess I am ignorant as to the existence of a diurnal recording tradition outside the occidental world. If there is no historical context for writing down one’s daily occurrences, experiences and reflections, it is a ‘big ask’ to expect students to engage in such a process from ‘scratch’. There is, however, a global tradition of recording thoughts and observations, and of travel writing, so perhaps a more suitable metaphor for the majority of learners to grasp would be the learning journey (re: journal).

Surely we want our graduates to have ‘basic’ digital literacy skills, but isn’t a template-driven portfolio solution really short-changing them? Shouldn’t they leave university with the ability to set up a digital presence on the web for themselves, to select a service that suits their particular context, be it LinkedIn for the ‘career driven’ or Academia.edu for the apprenticed faculty? Would not a WordPress solution suit most for a public-facing self-representation? Shouldn’t institutions be getting out of the way of learners and, rather than seeking to curate learning outputs, instead be enabling learners to take digital flight?

