Soon after these rounds of consultations, the three of us working on the staff development site were also asked to redesign the Brookes Virtual landing page. This is a sort of portal into all the support for technology that staff might need, and is actually the most visited page on the Brookes site. At the moment it mainly links to information on the tools, but we also wanted to integrate advice on the student digital experience. I think this is part of a larger, maybe even ideological, issue with technology-enhanced learning. It’s often seen as a matter of just getting the kit working: what plugs into what and which button to press. I think the pedagogical issues are far more important: what is the student supposed to get from it? What teaching skills are needed to ensure it is used effectively? This is partly because this aspect gets overlooked and yet has a huge impact on the learning that takes place, and partly because this is the bit I get paid to do and I want to keep my job. However, it is fair enough to say that getting the stuff to work is probably what people focus on first.
This then leads into the fourth design principle for what we’ve put together:
#4 Integrating as many different perspectives into the tool as possible (alternatively: keep as many people happy as possible).
The gateway has gone through some redesigns, but essentially it has always aimed to do two things:
- provide a point of entry for people who know what technology they want to use, but not how to use it, in parallel with a point of entry for those who know what change they want to make to the student digital experience, but not how to achieve it.
- keep the CPD angle at the forefront of people’s minds, by having links throughout the site to advice on how to do it.
The structure, then, looks like this (there are more options within the technologies and CPD sections, but if I included everything it wouldn’t fit in the post).
The structure of the new Brookes Virtual Gateway
The other aspect is how to provide the filtered view. Applying the general principle of keeping as many people as possible happy, we also wanted to have explicit links to Brookes’s TEL framework. This was an ideal opportunity to do both simultaneously, by having the different views match the different lines of the framework. The assumption is that if the TEL framework is successfully implemented, people will need advice on whichever line of the framework they’ve been tasked with implementing, and the DC matrix would then provide the right targeted help.
There are 12 principles in the Brookes TEL framework and 12 in the Jisc NUS benchmarking tool. Eight of these map pretty closely, with a one-to-one correspondence (I can post about this in detail if anyone is interested). There’s also a line in the Brookes TEL framework on assessment, which is split across several of the benchmarking tool’s principles. Add the assessment mapping to those eight direct matches and that’s nine of the twelve lines, so roughly three quarters of each framework lines up closely with the other.
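For anyone who prefers to see the shape of that mapping rather than read it, here’s a tiny sketch. The labels (TEL-1, NUS-1 and so on) are placeholders, not the real wording of either framework, and which particular lines pair up is assumed purely for illustration.

```python
# Purely illustrative: placeholder labels, not the real principles of either
# framework, and the pairings are assumed for the sake of example.

# Eight TEL framework lines map one-to-one onto benchmarking tool principles...
one_to_one = {f"TEL-{i}": f"NUS-{i}" for i in range(1, 9)}

# ...and the TEL assessment line is split across several NUS principles.
assessment_split = {"TEL-9 (assessment)": ["NUS-9", "NUS-10", "NUS-11"]}

matched_tel_lines = len(one_to_one) + len(assessment_split)   # 8 + 1 = 9 of 12
print(matched_tel_lines / 12)                                 # 0.75, i.e. roughly three quarters
```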
This made creating the filtered views very simple. All that was needed was to come up with a question that reflected both the TEL framework principle and the corresponding Jisc NUS benchmarking tool principle; the user of the site picks the question that most closely matches the aspect of the student digital experience they want to (or have to) develop, and this links them to the right grid in the DC matrix. That grid then shows the complete range of components that address this aspect of the experience, divided into levels of complexity, with links (eventually) to resources and examples of good practice that support that particular component.
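To make the question-to-grid lookup concrete, here is a minimal sketch in Python. Everything in it is hypothetical: the question text, principle names, levels and components are placeholders rather than the real content of the DC matrix or either framework; it only illustrates the structure described above.

```python
# A minimal, purely illustrative sketch of the filtered-view lookup.
# All names below (questions, principles, levels, components) are
# hypothetical placeholders, not the real DC matrix content.

# Each filtered view pairs a question with the DC matrix grid for the
# corresponding TEL framework / benchmarking tool principle. A grid is
# just components grouped by level of complexity, each (eventually)
# carrying links to resources and examples of good practice.
filtered_views = {
    "How do students develop the digital skills their course needs?": {
        "principle": "Placeholder principle 1",
        "grid": {
            "Starting out": ["Component A", "Component B"],
            "Developing":   ["Component C"],
            "Developed":    ["Component D", "Component E"],
        },
    },
    # ...one entry per question, i.e. per line of the framework
}

def grid_for(question: str) -> dict:
    """Return the DC matrix grid linked to the question the user picked."""
    return filtered_views[question]["grid"]

# A user choosing the first question would land on that grid and see the
# components, by level, that address this aspect of the student experience.
print(grid_for("How do students develop the digital skills their course needs?"))
```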
One thing the TEL framework has focused on, though, which the Jisc NUS benchmarking tool hasn’t, is the idea of transformative, mixed, coalescent spaces. When adapting the DC matrix to take account of this, our first thought was to add it to each of the existing principles. The reaction from the community, though, was not to mess with what was already there. So instead we’ve created a 13th principle (otherwise the filtered view would link to a blank page).
Before getting to this point in the development of the DC matrix, though, I’d already worked on putting together something similar, as part of a group set up by Katherine Jensen at the University of Huddersfield. She’s interested in these sorts of in-between spaces, and set up a Google hangout on the subject, inviting me, Andrew Middleton, Liz Falconer, Catherine Cronin and too many others to list here (but who don’t read this blog). Andrew, Liz and others contributed to putting together a set of benchmarks for people to progress through if they are thinking of developing these sorts of spaces. We adopted the First Steps, Developing, Developed and Outstanding categories, as they’re useful for structuring thinking about benchmarking, and I half-expected it might be useful for the DC matrix.
Richard also realised that it would be useful to show how these cross-referenced with the existing principles. Between us we tweaked and merged the grid the Google hangout group had come up with, to produce the grid below.
We’ve tested this out a couple of times already, with workshops called Course Design Intensives. These are similar to the Carpe Diem sessions run at Leicester and the CAIeRO sessions at Northampton. A group of people from a programme team get together to discuss the issues they are facing with the programme, and between us we resolve those issues. We’ve actually found it more useful to have programme teams from two different faculties taking part simultaneously, as the cross-fertilisation of ideas adds a lot to the process.
The DC matrix has proven really useful for that initial breakdown of the student experience. When a question crops up such as “but what are the digital literacies students need to develop for their course?”, we can just click on the right principle and there’s the answer, subdivided into various stages. All we need to do now is open it up for everyone to start submitting their examples and resources for how to achieve those stages.