Immersion, presence and immersiveness

This post is prompted by a discussion I've been having on LinkedIn with many of the delegates from the Experiential Learning in Virtual Worlds conference in Lisbon earlier this month. It's extracted from the various posts I made there, but also shaped by their comments, so thanks to them for the discussion.

The question was really about the role of immersion in general, and in virtual worlds in particular: whether it's different in different environments, what immersion actually is, and how it differs from other forms of experience.

I think the problem with much of this is that we're trying to explain experiences we're not necessarily used to, because the technology does provide new sorts of experience. On top of that, these things are defined differently by different authors, so we're not always talking about the same thing.

For me, immersion is a very precise metaphorical term for that sense of feeling submerged in an experience. It's like being immersed in water when you're taking a bath. Setting a certain set of technologies apart because they're so-called immersive technologies is pointless as far as this is concerned, because any technology is immersive. You can lose yourself in a book; that's becoming immersed in it. You can do the same in a play or a film. In those media it's called the diegetic effect: the fictional world of the narrative becomes real just for the period that you're part of it.

Is immersion the same as presence? I think it probably is. While you're feeling immersed, you're transported to that fictional world. There's a paper by Sheridan (1992) in MIT's journal Presence in which he talks about the sense of actually being there when we experience these media. There's a sense of departure from one reality and arrival at the other. We get into the flow of the text, of the narrative or whatever, but if something intrudes (someone talking in the cinema, or a cat jumping on your lap), then that connection with that fictional space is lost.

I rant about that a bit in a post on a previous blog, in response to the BBC placing a trailer for a TV show over the top of the climax of Doctor Who: http://blogs.warwick.ac.uk/markchilds/entry/responses_to_nortongate/ It worried me not just because it ruined the experience, but also because anyone who can do that obviously doesn't get a large part of what art and entertainment are for, which is that sense of transportation and immersion.

Is immersion necessary for learning, or for engagement? On the whole, I don't think it is. In fact some entertainment deliberately avoids immersion; Brecht called that the Verfremdungseffekt. I'm reading Midnight's Children at the moment; it's a good book and I'm enjoying it. But the frequent intrusion of the author into the narrative, and the jumping from one scenario to the next, precludes that sense of flow, of being caught up in the story. The reader isn't submerged in the same way. Actually, that distant, sometimes critical, reflective position is often referred to as engagement, and there's a great paper on how that works in Grand Theft Auto: http://www.jorisdormans.nl/article.php?ref=theworldisyours Moving between a sense of immersion and a sense of engagement is perhaps how we get the most out of something. Experiencing both at once is supposedly possible too, a state called metaxis.

Two people can watch the same piece or experience the same technology and one can feel immersed and the other not. Ultimately immersion happens in your head, not on the screen. Technology has something to do with it, but the problem with the idea of immersive technology is that it implies the technology somehow creates that sense of immersion. It doesn't, but it can help. It's more useful therefore to think of immersiveness as a series of technological factors that can contribute to immersion (resolution, frame rate, field of view, surround sound, haptics, etc., the so-called depth and breadth of the senses engaged), as objective measures, without getting hung up on the fact that they don't actually cause immersion.

I think one of the clarifications that can help is the difference between perceptual immersion and psychological immersion. This is in At the Heart of it All by Lombard and Ditton http://jcmc.indiana.edu/vol3/issue2/lombard.html which, together with Biocca's The Cyborg's Dilemma http://jcmc.indiana.edu/vol3/issue2/biocca2.html, is probably the most seminal article on this. Immersive technologies lead to perceptual immersion, but this might not necessarily lead to psychological immersion. And psychological immersion can take place without any recourse to messing with your perceptions. It depends on the individual. How it depends on the individual is one of the things I'm particularly interested in looking at. But more on that some other time.

Another thing that gets bundled into the same package as immersion is immediacy. Sometimes immersion is defined as the perception of non-mediation. I don't think these are equivalent at all. Sure, if you're in an environment where you don't notice the technology it can seem real (if technology ever gets that sophisticated), but the things that mediate information can actually help you feel more immersed. An example: minimaps in Second Life. They pop up on screen (so you're aware of something between you and the virtual space), but once you're accustomed to them, and incorporate them into the automatic way you interact with the world, they become extensions of your perception. They help you wayfind around the space, and so add to the sense of immersion.

So we have three factors that are linked, but also have differences: immersion (=presence), immediacy (=non-mediation) and immersiveness (=realness, vividness).

I'm using the word presence for "being there" and I'm deliberately avoiding the word telepresence because that's become an ambiguous word. It was originally coined by Minsky to mean the ability to act at a distance http://web.media.mit.edu/~minsky/papers/Telepresence.html but has since been expanded to mean any situation in which you feel present at a remote location (like feeling a videoconference was actually a face-to-face meeting). Recent developments in technology have reappropriated the word to mean specifically technologies that enable you to act at a distance, not just experience being at a distance. For that I'm trying to get into using the phrase "distal presence", since that's not ambiguous. But I just wish people would come up with a definition for a word that's distinct from their definition of a different word. And stick to it.

So if any technology can cause immersion, why get hung up on the more immersive technologies? Good question, but I’ve run out of space. Some other time.

Sheridan, T. (1992) Musings on telepresence and virtual presence. Presence: Teleoperators and Virtual Environments, 1(1), 120–126.

Mobile learning in hospitals

Yesterday I had the opportunity to visit Birmingham Children's Hospital. The children there still receive an education, and so it's a site of James Brindley School (which has 14 sites around Birmingham). They asked me to come in to get them started on evaluating the impact that iPads have had on their teaching there. I think there's an amazing number of things that can be done with tablets (I'm agnostic about specific devices; although I'm director of research for the iPad Academy UK, I actually own a Transformer Prime), and this was an opportunity to see some of the real advances that can be made with them.

There are three main modes they teach in. There's a primary classroom and a secondary one, for children who are well and mobile enough to leave their wards; each ward has a separate room for teaching in too, and then a lot is done at the bedside. They have children from around Europe visiting, so language can be a barrier, but there's Google Translate just a tap away. They do a lot of maths and art education too, so it's less of a problem in those subjects.

The devices integrate directly with the other work they do, so in animation they draw out the storyboards on paper, then use the storyboards to create animations using an animation app. The primary children showed me a video they'd made (the iPad integrates seamlessly with the reflectors and smart boards, with no annoying plugging in of data projectors, which never seem to reach and need rebooting a couple of times to get them to recognise each other).

In the heart surgery ward the teacher showed me the maths apps she uses; Meteor Maths is a popular one (you have to tap on the two numbers that make the solution before the meteors bump into each other). She had to first of all persuade the boy she was teaching that he didn't have to be scared, because I'm not that sort of a doctor. She says she has to be careful with that one because it can raise the heart rates of the children too much.

In the cystic fibrosis ward there was a little girl who at first could only use her head (she now has the use of her arms too), but she was able to interact with the iPad using a stylus held in her mouth, and the one thing she wanted when she left was one of her own. Neurology also finds them useful, since even if the children can't hold a pen, they can trace letters with their fingers. It's also useful because they can record children's progress, which can sometimes be only small increments, but by keeping the work, or videoing reactions when, for example, patterns are touched on their hands, these can form a record over months, all integrated into one place.

Everyone was doing Easter-themed work; one girl in an isolation ward was making little chicks, and I could see the work because it had been photographed, printed out and stuck in her workbook (another example of it integrating seamlessly with the usual practice). Another advantage: it takes hours to clean up a PC enough to take it into an isolation ward, and books can never be made clean enough, but after a wipe down with an antiseptic wipe a tablet is ready to go. The downside is on the neurological ward: the built-in magnets (which are only there to hold covers in place) interfere with shunts if they're fitted, so the tablets can't be used there. Another girl I spoke to (I think on the nephrology ward, but it was towards the end and I was feeling slightly swamped by then) said the best thing about it was the games, but according to the teacher these were actually games she was learning with.

This was the theme all the way through: the children took to it because it was interactive, and the apps were often so game-like that they weren't aware that they were learning. With it being tactile and visual, there was a pick-me-up-ness (which is a real phrase, I know, I just googled it) to it that generated engagement. This development in practice has only taken a few months. It has applied to the teachers too: in the staff room they are drawn to each other's practice through the sounds and visuals of the iPads they're playing with, and although they've always shared their practice, they said that this has increased since introducing the iPads.

The aspect of mobile devices I'm particularly interested in is the way that their use becomes embodied (of course I am, it's in the title of the blog). I think the reason why mobile devices are a step change in our relationship with technology is the greater and faster degree to which they become extensions of ourselves. They easily make the jump from tool to prosthesis because they're tactile, they're flexible, we carry them close to us constantly, and, well, of course because they're mobile. The depth of this is indicated by the relative comfort we have with letting someone else use our desktop PCs (no problem), our laptops (slightly uneasy) and our tablets/phones (feels very much like an invasion of our space). The iPads were kept in amongst the toys, books and so on, in plastic crates and in bags. They were just another part of the kit, albeit the one piece that brought all the other bits together.

Anyway, at the moment I'm looking for funding to expand the degree to which the evaluation can take place. There is definitely a lot of awesome practice going on that more people need to hear about. It's also a very moving environment to be in. The children there were going through stuff that's worse than anything I've gone through, and yet all were smiling (even the boy in the heart ward, once he realised I was a PhD and not an MD). It's very difficult to find the words without lapsing into cliché or sentimentality, but if you think you're having a crap time, it really is the best place to force you to get a grip.

Tips on editing

Today I am mostly editing book chapters. This is the fourth book I've edited, and I've also done two sets of conference proceedings, so there are some things I now do by rote that might not be obvious. Although most of them are:

The obvious ones are good discipline with organising directories with the various versions in them. If I'm writing my own stuff it's easier: the date of the file goes in the file name (in a YYYYMMDD format, obviously) so the most recent one is always at the bottom. When you have loads of different authors, all using different naming conventions, and when you might have to take a break of a month or so while they do rewrites, or while you go off and do things that earn you money, then it's important to make sure the files are always sorted into the right directory and properly labelled, so you know where everything is when you come back to it. And have another directory with things like author email addresses and so on, so it's to hand. And a list of what everything is and which chapter it belongs to, too.
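Just to illustrate why that date format matters (a throwaway Python sketch with made-up file names), a plain alphabetical sort of YYYYMMDD-prefixed names is also a chronological one, which is why the newest draft always lands at the bottom of the listing:

# Made-up draft file names; the YYYYMMDD prefix means an ordinary
# alphabetical sort is also a chronological one.
drafts = [
    "20130402_chapter3_smith.docx",
    "20121118_chapter3_smith.docx",
    "20130315_chapter3_smith.docx",
]
for name in sorted(drafts):
    print(name)
# prints the November 2012 draft first and the April 2013 draft (the newest) last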

Create a style sheet. It’s a bit laborious but the minutes you spend doing it at the start will save you hours at the end. Create a template using that style sheet and send it out to the authors. You’ll probably need to do some tidying up at the end, but it will save you a lot of hassle.

Get the authors to submit pictures as separate files, and also inserted into the text. Publishers want them as separate files, and that saves you having to mess about at the end, but it's good to see where they belong. Find out what minimum dpi your publisher insists on, too.
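Once the figures start piling up, a short script can flag the low-resolution ones before anything goes off to the publisher. This is only a rough sketch, assuming Python with the Pillow library installed and the figures sitting in a local "figures" directory; the 300 dpi threshold is just an example, so substitute whatever your publisher actually specifies:

from pathlib import Path
from PIL import Image

MIN_DPI = 300  # example threshold only; use the figure your publisher specifies

for path in sorted(Path("figures").glob("*")):
    if path.suffix.lower() not in {".png", ".jpg", ".jpeg", ".tif", ".tiff"}:
        continue
    with Image.open(path) as img:
        # Many formats carry resolution in the 'dpi' metadata; some files
        # simply don't have it, in which case they need a manual check.
        dpi = img.info.get("dpi")
    if dpi is None:
        print(f"{path.name}: no dpi metadata, check manually")
    elif min(dpi) < MIN_DPI:
        print(f"{path.name}: {dpi} is below {MIN_DPI} dpi")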

Here's the least obvious one: tables should be submitted as images, not spreadsheets. You want the author deciding which bits go where and how it should look, not the publisher; leaving it to that stage creates all sorts of possibilities for error, particularly if the table includes images. As few files as possible is always good practice.

Check that references tie up with citations. The easiest way I've found to do this is to go through checking that each citation ties up with a reference at the back, and while doing so highlight each reference the first time it's cited. When a reference is missing, it's easy to flag, and by the end you have a mass of yellow (or whatever colour) in the reference list. Where there are gaps, do a word search on the name. If it doesn't return anything, then you've got an uncited reference; flag that for the author. Then remove the highlights.
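If a chapter can be exported to plain text, the same cross-check can be roughed out as a script. This is only a sketch, assuming an author-date citation style and a "References" heading before the reference list; the file name and the regular expressions are made up for illustration and would need adjusting to real chapters, and it doesn't replace the highlighting pass, but it catches the obvious gaps quickly:

# Rough automation of the manual cross-check described above. Assumes an
# author-date citation style and a plain-text export of the chapter with a
# "References" heading before the reference list; adjust to taste.
import re

def split_chapter(text):
    """Split the chapter body from the reference list at the 'References' heading."""
    body, _, refs = text.partition("\nReferences")
    return body, refs

def cited_names(body):
    """Collect surnames appearing in (Surname, 1999) or Surname (1999) citations."""
    in_brackets = re.findall(r"\(([A-Z][A-Za-z'-]+)(?: et al\.)?,? \d{4}", body)
    before_brackets = re.findall(r"([A-Z][A-Za-z'-]+)(?: et al\.)? \(\d{4}\)", body)
    return set(in_brackets) | set(before_brackets)

def reference_names(refs):
    """Take the first surname of each non-blank line in the reference list."""
    names = set()
    for line in refs.splitlines():
        match = re.match(r"\s*([A-Z][A-Za-z'-]+),", line)
        if match:
            names.add(match.group(1))
    return names

if __name__ == "__main__":
    with open("chapter.txt", encoding="utf-8") as f:  # hypothetical file name
        body, refs = split_chapter(f.read())
    cited, referenced = cited_names(body), reference_names(refs)
    print("Cited but no reference entry:", sorted(cited - referenced))
    print("In reference list but never cited:", sorted(referenced - cited))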

The most common mistakes academics make? Plurals. "Data is." "Media is." Or pluralising "medium" as "mediums". Mixing up phenomenon and phenomena. I even saw "dices" the other day. Sure, there are some tricky ones (octopus, for example, but that one doesn't crop up often), but those are always worth checking.

It’s worth looking up some of the references too. Sometimes they’ll misquote, or misinterpret. Usually common sense will flag those. If you’re thinking “really?” it’s worth a look.

Oh yeah, the reviewing thing as a whole is worth another post. In this one I'm really thinking about the bits that are specific to editing, organising and copy editing, since that's the stage I'm at with this book.

EDIT: Some more points occur to me.

Don't use the built-in bullet point button to create bullet points. Create your own style called "bullet" and mark the bulleted lists up with that. Chances are you'll change your mind about how to lay them out, and then you'd have to go through changing them all manually. Or you might change the Normal style and find it re-sets all your bulleted lists, or just some of them.

Leave the formatting till last. There are lots of reasons for this; one is that during the editing process your authors may add new stuff (so you have to do it again) or delete stuff (so you've wasted your time). It's unlikely that they will do the formatting themselves: if they haven't started off using your template, then they won't. The golden rule of anything to do with editing anyway is that it's less effort to do it yourself than to get someone else to do it.

The other reason is that there is something exhilarating (by my standards anyway) about seeing all of the disparate chapters, with their various naming schemes and formatting, and with variations in spelling, all coming together into a uniform look. It's at that point you really feel like you've got a book. At the moment I'm about half way through and have them all, for the first time, in a single directory with all the figures etc. under a standard naming scheme. With the chapters that aren't formatted properly (and bless, the author of the current one I'm working on has tried, but the styles on the headings and subheadings are switched), I usually paste the chapter into a previously created file, save it as a new file and then delete the old text. It's the easiest way to import an established style sheet.

Oh and if possible, if it’s a book in a series, have an earlier edition in the series to check against.

A rant about bioethics

Anyone who ever read my last blog (at http://blogs.warwick.ac.uk/markchilds/) will know that I sometimes go a bit off-topic in order to let off a little steam about something. And once off-topic, I run the risk of running into territory I know little about. But I read this http://www.christian.org.uk/news/bioethics-expert-warns-against-gm-babies-plan/ and the full report here http://www.christian.org.uk/wp-content/downloads/3-parents.pdf written by someone who is apparently an expert, and it wound me up so much, because it's such a bad (or good) example of what happens when you let your need to arrive at a particular position influence your opinion on something, that I thought I'd comment.

The discussion is about mitochondrial transplantation in order to address mitochondrial diseases. They're rare but pretty devastating. It involves taking the mitochondria out of an ovum and replacing them with donor mitochondria. The report does a good job of describing the procedure. Here's the thing though: although mitochondria are passed from mother to child, they contribute nothing to the physical traits of the individual; that's the nuclear DNA's job. The report seems to blur over this. Anyway, addressing the points one by one.

Biomedical risks: Yes, it sounds like there are some, but the point about any procedure is that it carries risks, and the researchers take those into account. Making this a special case simply because it's about genetics seems disingenuous.

Similarity to cloning: It’s similar inasmuch as you’re messing with ova, but it’s not replacing anything to do with the chromosomes … there could be some ethical problems with creating a human clone, but that’s something to be considered when that’s suggested, not at this point.

Similarity with the 'male egg' proposal: This just seems like an excuse to appeal to the homophobes in the audience. It's not really similar at all.

Moral status of the embryo: Like any procedure to do with embryology, this produces spare embryos. This isn't exceptional, so it isn't grounds for any additional concerns. Plus embryos aren't actually legally defined as alive. Get over it.

Modifying the genetic inheritance: Again, the sleight of hand with the genetic inheritance of the offspring. Mitochondrial DNA doesn't influence appearance, behaviour or whatever; mitochondria just handle energy production in the cell. That's it. Sure, they're passed on, but it's not like you'll inherit any of the traits of the donor. Oh wait, that's the next argument.

It's eugenics: OK, eugenics does have a bad name; it's associated with Nazis breeding supermen and ethnic "cleansing", but this is about removing some really debilitating diseases and isn't about creating new communities of übermenschen. Again, this is an appeal to some pretty nasty reactionism.

Kinship issues: This is where the report goes from some dodgy appeals to knee-jerk reactionism to some seriously unhealthy worldviews explicitly stated: "a genuine risk exists that future children may be deeply confused and distressed in their understanding of who their parents really are. This may have serious repercussions on the manner in which they define their identity and self-understanding." Identity is my field. Identity is a complex and individually negotiated idea that each person works with and comes to terms with in their own way. The idea that someone may feel the person who donated their mitochondria is in any way a parent is remote, but then people feel connections with the families of organ donors, so it's possible. But so what? Kinship varies. I know people with three parents, four, none. Adopted, fostered. People who have wards, or just very close ties. I don't want to get into knocking religion here (because it's important to a lot of people who are important to me), but this is often where ethics advisers from a religious viewpoint get it wrong. Humanity, society, people are far more varied than they want to admit. We're far more able to adjust to diversity than they want to accept, and they try to impose their own limited viewpoint of what is good and bad for people on the rest of us. I don't mean religious people in general, I mean the ones who take it on themselves to advise the rest of us. It's petty and small-minded.

I also should point out my own political ethical standpoint here (of course) and why maybe this is relevant to a blog called The Body Electric, but as I’ve said before, I’m a transhumanist. I see something like altering people or society and mixing things up and my first thought isn’t “ugh how scary, let’s stop it”,  it’s “wild, bring it on, let’s see what happens”. Luckily, I think my side is winning.

Finally we get to the part that compelled me to write this post: the claim that sperm and eggs represent the whole person. The quote: "When parents procreate in a normal way they also give of themselves in love wholly and unconditionally in the sense that it is not only a portion of the person that takes part in the procreation. It is the whole person that takes part, with his or her whole body and soul." This is quasi-mystical bullshit. We are talking about addressing real people with real problems, and this kind of comment is precisely why, if you're making decisions about actual real things, you need to leave your fairy stories at the door. It would be very worrying if this report, and particularly this final statement, had any influence on the decisions being made, and it illustrates why something like The Christian Institute is the least competent organisation at addressing moral issues. In short, people like Dr Calum MacKellar need to grow the hell up before opening their mouths in public.

An Edutechy Wonderland

This is written in response to a post about re-entering Second Life, and the changes (and lack of changes) there after a two-year break, written by Bex Ferriday at http://mavendorf.tumblr.com/post/45827913344/second-life-second-attempt

Firstly, the problems Bex relates about the course weren't really due to the design of the course, or the design of SL, in my opinion. I think with anything like that there's often a problem of commitment from the people taking it. People just over-estimate how much time they have, other things crop up, and so participation wanes. Look at the dropout rate from MOOCs, and they only use tried and tested technologies. I got a lot from it anyway.

I think the biggest advantage and disadvantage of using virtual worlds for education was that, for most of the latter part of the noughties, virtual worlds were synonymous with just one platform: Second Life. The advantage was that nearly everyone you knew teaching and researching in the field was in the one place. If you wanted to visit their build, observe what they were doing, or guest lecture in their teaching, you didn't need to learn to use a new interface (unless Linden Lab itself decided to screw around with it); you could use your own avatar, inventory and so on. If they held a social event, you could meet up with everyone you knew and worked with, and invite other people over to see what you were doing. If your work involved a social dimension (like exploring digital culture, or digital identity) then you had a living, complex world to send your students out into, full of tens of thousands of people. There was a real sense of a community of educators working together.

The disadvantage of course was that it was all operating under the discretion of one software company, and when they pulled the plug, it all fell apart.

Well, "pull the plug" is a slight exaggeration. For anyone who doesn't work in the field: Linden Lab, the company that runs Second Life, ended the educational subsidy, so most institutions could no longer afford to stay there, and a lot of cheaper options emerged.

Last year I was trying to organise a tour for a group of students, and so went through the usual list of landmarks to show them different resources. Fewer than half were still there. The number of people using it is down, but apparently revenue is up, so the customer base is a smaller number of more committed people. Which I guess suits the provider. Not so helpful for those of us using it for education, though.

The impact on education, in terms of making it more mainstream, has been negative. The fragmentation of the community means it's more difficult to show colleagues the range of things virtual worlds can be used for. It's more difficult to find good examples of practice, because you first of all have to know where to look.

Bex's other point is that the technology hasn't moved on at all. I'm less worried about this. As long as it's good enough to give you a sense of immersion (and it can be) and a sense of copresence (and it does), then overall tech quality isn't a problem. A lot of people's equipment is still not great, so keeping the graphics at the lower end gives the majority of users a chance to catch up. I've given up on IT departments ever doing so, though. What I was hoping for is for the problems to be resolved. But the lag is as bad as ever. In a session I was teaching last week, it was the worst I've ever seen; I got booted out several times and struggled to get back in.

But there are still fascinating things to see there, which reassures me that the technology is here to stay and is an essential part of the educator's kit. Just the ones I'm involved with: there's the palaeontology course at the Field Museum of Natural History in Chicago, the Science Ethics course at the University of Iowa, the digital cultures course at Newman University, the Human Behaviour course at the University of Southern Maine, and the Extract / Insert performance and installation by Stelarc, Joff Chafer and Ian Upton. All fascinating. All excellent from an education (or performance) perspective, and all only really possible in a virtual world. And all, maybe coincidentally, maybe not, taking place in Second Life.

I think what will emerge is either another single platform that will replace SL, so that everyone can migrate back to it and recreate that single community, or the technology for hypergridding (i.e. linking together the different platforms) will fill the same role. In a thread responding to Bex's post on Facebook, Anna Peachey said she always thought of SL as the fluffer for the bigger event. In the physical world the work of the fluffer has been made redundant by Viagra; hopefully the field of virtual worlds will see a similar game-changing technology.

Belief and the impossible

I saw the Daily Post prompt today http://dailypost.wordpress.com/2013/03/18/daily-prompt-impossibility/ and immediately went off on an internal rant: bah, belief and all that. Reading others' blog posts in response, I realise that the spirit of the challenge is to list things that are seemingly impossible (feats of physical endurance, forgiveness, stuff like that) and choose to believe in them. I took a completely different tack with my thoughts, maybe taking it too literally, but this is where I went with it.

Firstly, there's the notion of philosophical scepticism, which is that nothing can be absolutely known. Even "I think, therefore I am" assumes too much (how do you really know it's you doing the thinking?). The Universe may actually be a holographic projection of a 2D surface http://en.wikipedia.org/wiki/Holographic_principle, it may not exist, this may just be a simulation, or it may have been brought into existence randomly a millisecond ago complete with memories of the past.

The rational person is aware of all of this, and brings it to mind occasionally, but it would be pretty difficult to allow it to weigh on all of one's decisions. When we say "is", therefore, we're using that as shorthand for "is, to the best of our knowledge". To the best of our knowledge I exist, this laptop exists, the Universe began about 13.8 billion years ago in a Big Bang, and perhaps arose out of the final heat death of a previous one. And so on. These aren't a matter of belief though; to say these things are true just means that, looking at the evidence, these are the best explanations we have. That's really what truth is.

It's therefore true that there is no God. Or no afterlife. To the best of our knowledge there isn't. That's not to say that there definitely, 100%, isn't one. It's always possible that there is an omnipotent divine being who just doesn't seem to have an impact on anything. But accepting that doesn't make me an agnostic, any more than accepting that the universe may be a hologram has an impact on my daily life. I will act as though the truth is that there isn't one. To the best of our knowledge. That, for me, is the essential difference between an atheist and an agnostic. An atheist has made that observation, but is prepared to change his mind (and, in the case of Tim Minchin, carve "fancy that" on his cock with a compass if proved wrong). An agnostic has put off making that observation. If it were about choosing dessert (the majority of my metaphors include chocolate at some point), an atheist would have ordered their dessert, but be prepared to pick another one if they saw one that's better. An agnostic is still looking at the menu and not picking one.

So is anything impossible? No, actually it isn't. There's a very, very, very tiny possibility that anything can happen. Even God. Do I actually hold any beliefs about anything? No, I don't. There are decisions made by weighing evidence, and there are observations. But none of these constitute beliefs. This is what rational people (and rationality is the most human thing we can aspire to) do.

On creativity

This is another blog post following up on one Grainne Conole has written (at http://e4innovation.com/?p=661), which is ironic, I suppose, given the nature of the topic. I wanted to chip in on the conversation because I wanted to offer a slightly different perspective on what creativity is, and what constitutes a particularly creative person. I think our culture is obsessed with the lonely creative genius who works away creating rare works of art, and I think this is both limiting and off-putting for those of us who aren't actually geniuses. So I'll offer some examples of what else comprises creativity, using as an example the person I'd consider to be one of the most creative people, if not the most creative person, whose work I follow: Gregg Taylor.

You might not have heard of Gregg Taylor, because he's not someone held up as one of the great creative geniuses of our time; what he does doesn't fit in with that image. Gregg is the force behind Decoder Ring Theatre, which produces podcasts in the style of 1940s and 1950s radio serials. He's been doing that for eight years, and I first came across them about seven years ago.

These podcasts come out twice a month. So that's 24 a year, of which he writes 18. That's 18 a year, for 8 years, without fail. We underestimate quantity as an aspect of creativity. Sure, it's important to have the novelist who spends his entire lifetime creating one world-altering novel. But to be able to sit down and come up with something new every fortnight? That's an incredible achievement. I think more of us should look at the amount someone produces as a mark of a creative person.

That's not to say the quality isn't there. Sure, there are better writers. I'm reading Midnight's Children at the moment, by Salman Rushdie, and there's a great writer. But the content of the podcasts is entertaining: there's character development, nearly always a plot (as much as you can get into 25 minutes), some fun lines, poignancy. They have the lot. And Gregg is a better writer than most. And he comes up with that every two weeks. For eight years. Very few creative outputs have those attributes of quality and consistency.

But I think what also makes the truly creative people stand out isn't just the ability to succeed in one area. There's a good team of actors in these podcasts, of whom Gregg Taylor is one. He acts, directs, does the post-production and markets them. He's also written novels based on the characters and has now launched the first comic book, to great reviews. Specialism is over-rated; adaptability is a mark of a very creative person.

I think, though, the most unhelpful of the characteristics we associate with creativity is the idea of the emotionally erratic soul suffering for his art. We all know people who are jerks, and who others let get away with being jerks, simply because they are creative and innovative. This probably happens more in the academic world than the art world. It happens a lot in movie making too. They're perfectionists, or they're obsessed, or any one of a number of excuses we give for their bad behaviour. But if it's such an effort for them to create, then really they're not that good at being creative. Sure, everyone needs to put their work first occasionally. I get ratty if I'm interrupted in the middle of thinking about something. But actually, that's because if I lose my train of thought it takes me ages to get it back, and sometimes I never do. So that's a case in point: I'm actually not that creative, otherwise I could recall it whenever I wanted. In contrast, the DRT troupe engage with their audiences through Twitter and Facebook, and there's an approachability there that you wouldn't get, say, with other writers, actors etc. I'm not sure what I'd call this as a quality, but maybe not being really up yourself; humour or humility would cover it.

While on the subject of humility, I first heard someone describe themself as a Creative at a seminar day a couple of years back (at the University of Hull actually). Talk about lack of humility. To describe yourself as a Creative is, by default, implying that you’re somehow different from everyone else, that you’re creative and they’re not. No. You’re just lucky enough to be in a job that supports you to be creative, that doesn’t make you special. I get to spend a big chunk of my time writing. Sometimes I get paid for that. I am therefore a jammy bastard, http://www.urbandictionary.com/define.php?term=jammy%20bastard and I never forget that. If you ever refer to yourself as A Creative, you’re pronouncing it wrong, it’s actually pronounced “wank-er”.

Finally, and maybe the most controversial of the pre-requisites for creativity: the DRT output is free. One of the mistakes a lot of people who create things make is thinking that because they are talented, the world owes them the opportunity to put those talents to use. No it doesn't. Work, for the vast majority of people, is doing stuff they hate that they get paid for. If you like doing it, it's not work, and essentially there's no reason to be paid for it. People need you to stack shelves, mend roads, grow food. They don't need your book or your music in the same way. So yes, the majority will share your music, download your movie, pass on PDFs of your book chapter. That's tough, but it's a fact of life, and you probably need to just face up to that rather than whinging about it and trying to come up with legislation to stop it.

I've never actually taken anything like that for free; I pay for the music I listen to and the TV shows I watch, and I donate to DRT and SomaFM and any of the free content that's out there, but I do so because I see it as a moral obligation, not a legal necessity. And from a selfish point of view, I want to see them continue. If enough people value your work, then they will pay for it and the work will continue. If they don't, then they won't, and it won't. The truly creative will therefore make their stuff open source; they'll share it for free and then see what happens. A freemium model, whereby you find ways to make money by selling extra content, works too – "monetizing the long tail", as I almost managed to say at a recent transmedia conference, before having an attack of self-respect at the last minute. (Other people ripping off your stuff and making money off it is another matter; that's out-and-out theft.) It's inappropriate to rail against having to create music, or write books, or make movies under those conditions, because you knew that's the way things are when you got into it. If you don't like it, there's always stacking shelves, mending roads, or teaching to fall back on. If you're really creative, you'll still feel driven to create anyway.

Oh and if you want to check out DRT, their website is at http://www.decoderringtheatre.com/