Changing the student digital experience pt 5

One thing that developing a framework does is enable you to see more clearly where there are gaps. Sort of like Mendeleev and his periodic table. Putting the 13th principle together with the in-between spaces group, and then mapping what we came up with to the Jisc NUS benchmarking tool, showed that those categories overlapped with the 7th, 8th, 9th, 10th and 11th principles of the tool. Which then raises the question: what about the overlap between blended, coalescent spaces and the 12th principle, i.e. digital well-being?

It’s not something we were focusing on, but it is a key aspect of introducing blended and virtual spaces – there is a range of elements to digital well-being that only emerge when we start transitioning between physical and virtual spaces, or merging the two.

To some extent, this is relatively advanced stuff compared to simply looking after yourself while online, but virtual spaces usually require an avatar for interaction, and that opens up a whole new area of digital experience. (Augmented reality too, although perhaps this isn’t so much digital well-being as physical, inasmuch as you need to watch where you’re going when you’re hunting Pokémon.) One of the things that emerged when I was looking at virtual worlds is how the sense of presence exposes the user in ways that don’t occur when you’re a disembodied presence in a forum.

One of these is presentation of self. With the whole variety of choices available to you, what you choose has a big impact. Do you choose your physical-world sex for your avatar? Do you choose your physical-world gender? What if those are different from each other? Do you want to take the opportunity to explore identity by adopting a different ethnicity, sex or species? Will you expose yourself to hostility if you adopt an animal avatar, or a mechanical one? (I spent a lot of time in Second Life as an airship – that got some weird reactions, though the spider was the only avatar that generated outright hostility.)

If we as educators introduce virtual worlds to our students, we bear some responsibility for their continued interactions with virtual worlds, even outside of the learning situations. If they’ve become interested, and developed an online identity, as a result of our teaching, they may decide to continue and explore more. And there’s some weird stuff in there, which they need to learn about so they’re comfortable before engaging with it (or perhaps be resilient enough to be OK with being uncomfortable). That level of embodiment also enables people to form relationships, and there can be a mismatch in the significance that people attach to those, which can lead to people being hurt.

There is reputation management too. If you want to be taken seriously in online interactions, maybe a giraffe isn’t the best choice of avatar. But then, I did get to know one academic simply because he and I were dressed as punks at an ESRC event in SL when everyone else was in suits. So representation of self is something to be consciously engaged with, and many people first entering virtual worlds tend to be oblivious to the relevance of avatar design.

At the moment, perhaps these concerns aren’t huge ones for educators, but at least we know where to put them on the site once we do start thinking about them.


Changing the student digital experience pt 4

Soon after these rounds of consultations, the three of us working on the staff development site were also asked to redesign the Brookes Virtual landing page. This is a sort of portal into all the support for technology that staff might need, and is actually the most visited page on the Brookes site. At the moment it mainly links to information on the tools, but we wanted to integrate the student digital experience advice too. I think this is part of a larger, maybe even ideological, issue with technology-enhanced learning. It’s often seen as an issue of just getting the kit working: what plugs into what and which button to press. I think the pedagogical issues are far more important, such as: what is the student supposed to get from it? What teaching skills are needed to ensure it is used effectively? This is partly because this aspect gets overlooked and yet has a huge impact on the learning that takes place, and partly because this is the bit I get paid to do and I want to keep my job. However, it is fair to say that getting the stuff to work is probably what people focus on first.

This then leads into the fourth design principle for what we’ve put together:

#4 Integrating as many different perspectives into the tool as possible (alternatively: keep as many people happy as possible).

The gateway has gone through some redesigns, but essentially it has always aimed to do two things:

  • provide a point of entry for people who know what technology they want to use but not how to use it, in parallel with a point of entry for those who know what change they want to make to the student digital experience, but not how to achieve it.
  • keep the CPD angle in the forefront of people’s minds, by having links to advice on how to do it throughout the site.

The structure, then, looks like this (there are more options within the technologies and CPD sections, but if I included everything it wouldn’t fit on the post):

TEL gateway

The structure of the new Brookes Virtual Gateway

The other aspect is how to provide the filtered view. Applying the general principle of keeping as many people as possible happy, we also wanted to have explicit links to Brookes’s TEL framework. This was an ideal opportunity to do both simultaneously and have the different views match the different lines of the framework. The assumption is that if the TEL framework is successfully implemented, people will need advice based on whichever line of the framework they’ve been tasked with implementing, and the DC matrix would then provide the right targeted help.

There are 12 principles in the Brookes TEL framework, and 12 in the Jisc NUS benchmarking tool. Eight of these actually map pretty closely, with a one-to-one correspondence (I can post about this in detail if anyone is interested). There’s also a line in the Brookes TEL framework on assessment, which is split across many of the benchmarking tool’s principles. So three-quarters of each framework is actually very similar to the other.

This made creating the filtered views very simple. All that was needed was to come up with a question that reflected both the TEL framework principle and the corresponding Jisc NUS benchmarking tool principle; the user of the site picks the question that most closely matches the aspect of the student digital experience they want to (or have to) develop, and this links them to the right grid in the DC matrix. That grid then shows the complete range of components that address this experience, divided into levels of complexity, with links (eventually) to resources and examples of good practice that support that particular component.
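To make the wiring concrete, here’s a minimal sketch of that idea in Python. The question wordings, principle numbers and URLs are invented placeholders for illustration; they are not the actual Brookes content or how the site is built.

```python
# Hypothetical filtered views: each pairs a question with the TEL framework
# line and the Jisc NUS benchmarking principle it reflects, plus the DC matrix
# grid it opens. All values below are invented for illustration.
FILTERED_VIEWS = [
    {
        "question": "How do I build digital well-being into my course?",
        "tel_framework_line": 12,       # hypothetical line number
        "benchmarking_principle": 12,   # hypothetical principle number
        "grid_url": "/dc-matrix/principle-12",
    },
    {
        "question": "How do I help students develop their digital identity?",
        "tel_framework_line": 7,
        "benchmarking_principle": 7,
        "grid_url": "/dc-matrix/principle-7",
    },
]

def grid_for_question(question: str) -> str:
    """Return the DC matrix grid the user lands on for the question they pick."""
    for view in FILTERED_VIEWS:
        if view["question"] == question:
            return view["grid_url"]
    raise KeyError(f"No filtered view defined for: {question!r}")

print(grid_for_question("How do I build digital well-being into my course?"))
```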

One thing, though, that the TEL framework has focused on, which the Jisc NUS benchmarking tool hasn’t, is the idea of transformative, mixed, coalescent spaces. When adapting the DC matrix to take account of this, our first thought was to add to each of the existing principles. The reaction from the community, though, was not to mess with the original structure. So instead we’ve created a 13th principle (otherwise the filtered view would link to a blank page).

Before getting to this point in the development of the DC matrix, though, I’d already worked on putting together something similar, as part of a group set up by Katherine Jensen at the University of Huddersfield. She’s interested in these sorts of in-between spaces, and set up a Google hangout on the subject, inviting me, Andrew Middleton, Liz Falconer, Catherine Cronin and too many others to list here (but who don’t read this blog). Andrew, Liz and others contributed to putting together a set of benchmarks for people to progress through if they are thinking of developing these sorts of spaces. We adopted the First Steps, Developing, Developed and Outstanding categories as they’re useful for structuring thinking about benchmarking, and I half-expected they might be useful for the DC matrix.

Richard also realised that it would be useful to show how these cross-referenced to the existing principles. Between us we tweaked and merged the grid the Google hangout group had come up with, to produce the grid below.

13th principle

We’ve tested this out a couple of times already, with workshops called Course Design Intensives. These are similar to the Carpe Diem sessions run at Leicester and the CAIeRO sessions at Northampton: a group of people from a programme team get together to discuss the issues they are facing with the programme, and between us we resolve those issues. We’ve actually found it more useful to have programme teams from two different faculties taking part simultaneously, as the cross-fertilisation of the ideas being generated adds a lot to the process.

The DC matrix has proven really useful in terms of that initial breakdown of the student experience. When a question crops up such as “but what are the digital literacies we need students to know for their course?” we can just click on the right principle, and there’s the answer, subdivided into various stages. All we need to do now is open it up for everyone to start submitting their examples and resources for how to achieve those stages.


Changing the student digital experience pt 3

With the groundwork done of turning the Jisc / NUS benchmarking tool into a structured website of resources, and incorporating a CPD element, the next step was to ensure the DC matrix met the following criterion.

Goal #3: ensure that the programme meets users’ needs

The DC matrix had already been used with DMELDs, but getting an idea of whether this was actually the approach that other people would find useful was obviously something to address as soon as possible. Between us, George Roberts, Richard Francis and I showed it to a range of people for feedback. These included:

  • academic staff at Brookes
  • librarians at Brookes
  • the other academic and staff developers within OCSLD
  • the wider community through the Jisc Student Experience Experts Group meetings

We had a mixture of responses. The first was that looking at staff capabilities from the perspective of student experiences wasn’t helpful, in that staff usually start the process of engaging with technology-enhanced learning by identifying a technology that could help, and then needing support with that specific tool. I’d found, too, that when I had an example of good practice to share via the site, my tendency was to think (for example) "where would webinars go?" rather than think about the student experience it was providing. The search function helps here, in that you can just search for the relevant element, but it’s not really in the spirit of reframing TEL from the student perspective. That shift in perspective will take time, I think. In the meantime, this suggested we needed alternative routes to the resources.

The second response was that there is a lot in the DC matrix as a whole. This is unavoidable, as there are so many aspects to the student digital experience and we didn’t want to deliberately leave any of them out. Suggestions were to highlight some principles, or to produce filtered views depending on who was looking.

The third response was that there are other models, and other repositories of resources, and this is just competing with them. For example, there is the TEL framework at Brookes and the Brookes graduate attributes, and this proliferation of models is confusing. Also, what’s the point of having structured access to resources if no-one ever populates the structure?

The response from the wider community addressed some of these issues, however. The creation of a site based on the Jisc / NUS benchmarking tool meant that it was shareable across all institutions, rather than tied to one TEL strategy; there was considerable resistance, therefore, to the idea of changing the DC matrix away from the original structure of the benchmarking tool. That shareability also addressed the problem of how to populate it: if everyone uses it, then the resources should soon accumulate. The problem of it appearing too large was acknowledged, but we got the advice to develop walkthroughs for the principles that might be more commonly used. Finally, what people really liked was how the site also incorporated advice on how to use any TEL intervention for CPD. The "one cell from the top and one from anywhere else" model for staff development seemed to go down very well (although it would probably go down better coming from Rachel Riley).


Changing the student digital experience pt 2

Continuing to chart the progress behind Brookes’s new staff developmental programme for TEL. With a structure for mapping the student digital experience in place, courtesy of Jisc and the NUS, there are other concerns to address, the first of which is:

Goal #2: Integrating developments in learning and teaching with staff CPD

One of the things I didn’t want to do for the new programme was set in place a whole new set of courses. There are plenty of effective developmental opportunities already: from online courses that are part of the Postgraduate Certificate for Learning and Teaching in Higher Education (which I teach on) to one-off workshops on how to use the various tools needed for TEL. Adding to these would add to the workload for staff, but also, who’s to say what should be on a course? There are over 200 different elements to the student digital experience tool as put together by Jisc and the NUS; we shouldn’t be choosing on behalf of the teacher which is the important one for them to address.

Rather than have a course, therefore, it made sense to me to just add different types of continuous professional development to the DC matrix, alongside the 12 principles of student digital experience that were already there. The intention is that a member of staff would plan to do some sort of TEL intervention and use this intervention as the basis for some CPD activity. This creates a model that can be light-touch, incremental, and adaptable to meet the specific needs of a staff member (and ultimately his or her students) at a particular time. It also follows the practice of not focusing on the staff member’s digital capability as a goal in itself, but as a means to improve the student digital experience, and it borrows a bit from the idea of activity-led learning: you’re not learning any skills for their own sake, but to achieve a specific goal.

This also recognises that people don’t (normally) just do professional development for the sake of it; they are directed towards it for different reasons. I wanted the programme to actually meet the needs of people who had a CPD-related task on their plate already, rather than try to generate an additional rationale for engagement. These were the four reasons I thought people might need the site:

  • Having something in their own courses that needed improvement, or a need to be addressed. This could be identified by them, by their line manager in a performance review, or in student feedback.
  • Wanting to get into publishing research, and thinking of using a TEL intervention as a good basis for this. Brookes has a lot of internal mechanisms for publication, for those just starting out, and of course there’s the whole range of TEL journals and conferences out there.
  • Going for HEA accreditation as a Fellow, Senior Fellow or Associate Fellow and running into the criterion of “using and valuing appropriate learning technologies”.
  • Actually having already done something, but wanting to get into dissemination of their practice and not knowing where to start.

The basic structure of the Jisc NUS benchmarking tool (First Steps, Developing, Developed, Outstanding) is helpful in developing ideas, so I adopted that and formed another table similar to the 12 that already exist. When integrated into Richard Francis’s site it looks like this:

zeroth principle


The idea is that if it sits alongside the other 12, as a “zeroth principle”, then people who come to the site looking for ways to improve the student experience will be drawn into considering how to make something of their TEL development as CPD. Ideally they’d pick a cell from somewhere on principles 1 to 12, and one from the matrix above, and in combination those two elements will form their CPD for that academic year. It’s similar to the numbers round in Countdown – one from the top and one from anywhere else.
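Purely to illustrate that combination, here’s a minimal sketch in Python. The component names and the pairing below are invented; the real cells are the Jisc NUS attributes and the CPD activities in the table above, and this isn’t how the site itself works.

```python
from dataclasses import dataclass

@dataclass
class Cell:
    principle: int   # 0 = the CPD ("zeroth") principle; 1-12 = student experience
    component: str   # invented component names below, for illustration only
    level: str       # First Steps, Developing, Developed or Outstanding

def cpd_plan(experience_cell: Cell, cpd_cell: Cell) -> str:
    """Combine one student-experience cell with one CPD cell into a year's CPD."""
    if cpd_cell.principle != 0:
        raise ValueError("The CPD cell must come from the zeroth principle")
    if not 1 <= experience_cell.principle <= 12:
        raise ValueError("The experience cell must come from principles 1 to 12")
    return (f"Work on '{experience_cell.component}' "
            f"({experience_cell.level}, principle {experience_cell.principle}) "
            f"and use it as the basis for '{cpd_cell.component}' ({cpd_cell.level}).")

# A hypothetical pairing: one cell from "anywhere else", one from "the top".
print(cpd_plan(
    Cell(principle=4, component="online assessment feedback", level="Developing"),
    Cell(principle=0, component="present at an internal TEL showcase", level="First Steps"),
))
```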

Changing the Student Digital Experience

One of my roles at Oxford Brookes University has been to come up with a staff development programme for Technology-Enhanced Learning (TEL). The requirement is laid out in the university’s TEL framework, under the line:

Redesign and implement a staff developmental programme for TEL based on the Brookes Attribute of Digital and Information Literacy.

The structure for this programme is just about to go live, so this is a useful point to reflect on why it looks the way it does, and what the aims of it are.

Goal # 1 – put the student experience first

Staff developmental programmes usually look at what skill sets staff need to develop. The approach suggested to me (by George Roberts – my line manager and also a Principal Lecturer for the Student Experience at Brookes) was to start with the Jisc / NUS Student Digital Experience Benchmarking tool, which you can see at this link. The tool breaks down students’ experiences into 12 principles, and each principle is further subdivided into around 20 different attributes, on a scale from First Steps, to Developing, to Developed, to Outstanding. It’s quite comprehensive and has a good provenance (supported by Jisc and the NUS, and collated by Helen Beetham). Furthermore, it reframes the whole debate about TEL from the perspective of "what should a student get out of it?" rather than "what do we need to put into it?"

Fortuitously, a colleague at Brookes, Richard Francis, who is the Digital Services and Learning Technology Manager, had already developed an online site based on the Jisc / NUS tool. This site replicates all 12 principles as separate webpages, and for each cell of the matrix on the page there is a link to a further page which can be populated with resources. Richard had even begun populating the site with resources through a series of workshops with a group of people Brookes calls DMELDs, or Digital Media and e-Learning Developers.

The two images below show what this looks like:

DC matrix

As resources are added to a cell, that cell is ticked, which means that users can quickly see where the resources are. Clicking on the text in a cell links to a page such as this:

resource page
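Putting those pieces together, here’s a rough sketch of how the content could be modelled: principles as pages, each a grid of components against the four levels, with every cell linking to a page of resources that gets ticked once something has been added. The field names and the component are my own assumptions for illustration, not how Richard’s site is actually implemented.

```python
from dataclasses import dataclass, field

LEVELS = ["First Steps", "Developing", "Developed", "Outstanding"]

@dataclass
class CellPage:
    """The page behind one cell of a principle's grid."""
    resources: list = field(default_factory=list)   # links, examples of practice

    @property
    def ticked(self) -> bool:
        # A cell shows a tick once at least one resource has been added to it
        return bool(self.resources)

@dataclass
class Principle:
    """One of the 12 principle webpages: a grid of components by level."""
    number: int
    title: str
    cells: dict = field(default_factory=dict)   # (component, level) -> CellPage

# Example with an invented component name:
p1 = Principle(number=1, title="Example principle")
p1.cells[("induction to core digital tools", "First Steps")] = CellPage(
    resources=["/resources/induction-checklist"]
)
print(p1.cells[("induction to core digital tools", "First Steps")].ticked)  # True
```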

Looking for a name for this website tool, we toyed with Digital Capabilities matrix, Digital Capacity matrix and Digital Competencies matrix. Unable to choose, we just went for DC matrix. However, as we went out to talk to more people about it, we (quite reasonably) met resistance to the idea of yet another abbreviation. So we went for Digital Choices matrix, as that seemed to actually describe what it did, rather than what it was for.

The idea, then, is that a member of staff who wants to make a change to their students’ learning experience goes through the DC matrix, looks for where they need to develop their practice, and, by clicking on the link, is provided not only with a list of resources but also with a link to a forum of people who are also engaged in that area. Ideally, once they’ve completed the intervention, they can upload the results and so provide resources for people who come after.

The site could also be used as a personal audit tool. A staff member looks through the matrices, decides where they are at, and looks to the cell to the right to identify where next to develop their practice.
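A tiny sketch of that "look one cell to the right" idea, using the four levels from the benchmarking tool (the example call is hypothetical):

```python
LEVELS = ["First Steps", "Developing", "Developed", "Outstanding"]

def next_level(current: str):
    """Return the level one cell to the right, or None if already Outstanding."""
    i = LEVELS.index(current)
    return LEVELS[i + 1] if i + 1 < len(LEVELS) else None

print(next_level("Developing"))   # -> Developed
print(next_level("Outstanding"))  # -> None
```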

The next step in the development continues in the following post.