Sorry for the lack of posts recently. I’ve a couple of things I’ve been meaning to write since about October (Eek!) and am determined to do so. Until then, I wanted to post about what I’ve been working on recently.
I met Puffles the dragon fairy on Twitter quite recently. They blog here about politics, news and anything else that takes their fancy, and with past Whitehall experience, their insights are often quite keen. They’re also a bit fab at encouraging young people, who are a bit Catch-22’d by needing experience to get experience. I took part, along with Michelle Brook and Ceri Jones, in the second set of commissions (the first lot being WordPress and Facebook). Our topics were An Introduction to Twitter and An Introduction to Social Media Analytics, and I’m really proud of the final results.
Puffles’ blogposts on the process are here: planning, scripting, screencasting, publishing/editing.
It was definitely a learning experience, not least because I’m most comfortable on PCs and view Macs with some suspicion. Some of the initial attempts to convert things between the two operating systems left me flailing as filetypes claimed to be incompatible and I kept hitting the wrong shortcut keys. There was also much “where the hell is this button/function?!?!?”, but we got there in the end. During the process we were usually on two computers, and Dropbox was invaluable for keeping files current. All the editing was eventually done on Puffles’ Mac, as it also had Adobe Premiere, which was what I was most familiar with. I cannot stress enough the importance of using what you know. Obviously, it was exciting to play about with unfamiliar tools, but when you need to produce something, it does help to work with your experience. The best way to learn any program is essentially to fiddle about until it does what you want; as a bonus, you often find interesting things you hadn’t thought of yet. (That said, there were frantic interweb searches for how-to guides at points!) But I get ahead of myself.
These things do take some time. What we found very useful was spending the first day exploring our parameters: discussing the topic at hand and bullet-pointing what we needed to cover. This also gave us time to get to know each other. Fortunately, we all worked really well together and got on fabulously. Taking this time to talk about the video meant we avoided the awful state of realising later that we’d missed something crucial and having to wedge it in. The script, written on the second day, was continuously rewritten throughout the experience (up until the final audio recording!) to rephrase things more clearly or to replace awkward-to-say passages.
Initial audio quality was horrendous. It was recorded off a laptop microphone and you could tell from how echoey it was. There was also a point where the lovely roaring fire wanted in on the action and crackled as we spoke, as well as an attention-seeking cat wandering about the house looking for food.
Due to where the laptop had its mic, the difference in volume between the three voices was pretty stark, and playing with the master volume didn’t mitigate it as much as necessary. Luckily, as we got more familiar with the script (having done it enough times) and rephrased things, stumbles became less frequent, replaced by periodic giggles, which were necessary, I think, to keep us mostly focussed throughout the day.
The second set of audio was so much clearer, and so it should have been, as we used a proper mic (an Apogee MiC) on a tripod facing us. It’s astounding the difference the right equipment can make. GarageBand, which the new mic preferred to be fed into, was surprisingly simple to use, even for PC users like myself and Ceri. The default effect settings were quite odd though, so we settled for “no effects”.
On a related note, Ceri has now made subtitled versions of the guides; this is an important thing to do to make them as accessible as possible. Here are links to the Twitter Guide with Subtitles and the Subbed Analytics video.
All of the footage was captured on one of the Macs using Screencasting Tool, from the Good Luck Corporation, a company that sounds like it should be from some dystopian video game. Do make sure the screencast frame rate matches the frame rate of the Premiere file, or you’ll have to do it again at points, like we did, lest it export to a flickering mess.
Important to note: creating a storyboard (even just writing down what to cast for which bit of audio) is essential. Taking the time to rename the screencasts for their content and their number in the running order meant that editing was so much simpler, as we could import the folder and literally go down the list, building the bridge slat by slat.
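As a rough sketch of that renaming step (the folder layout, clip names and running order here are entirely made up for illustration), something like this in Python would prefix each screencast with its position, so an alphabetical sort of the folder matches the edit order:

```python
from pathlib import Path

# Hypothetical running order: each screencast's content name,
# listed in the order it appears in the final edit.
RUNNING_ORDER = [
    "signing-up",
    "writing-a-tweet",
    "following-people",
]

def rename_for_edit(folder):
    """Prefix each screencast with a zero-padded running-order number,
    so sorting the folder alphabetically matches the edit order."""
    folder = Path(folder)
    for position, content in enumerate(RUNNING_ORDER, start=1):
        clip = folder / f"{content}.mov"
        if clip.exists():
            clip.rename(folder / f"{position:02d}-{content}.mov")
```

We did ours by hand, of course, but the principle is the same: number the clips so the import list reads as the running order.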
We exported that first video as a trial run. Getting feedback was important, but also painful. Essentially, a) the audio was unclear and, more importantly, b) because of the export settings plus the initial set-up of the Premiere file, it was too small and too blurry to be legible.
It took some frantic exporting and fiddling about with settings before we could figure out what the matter was. Tip: it’s best to export snippets occasionally to check that everything’s going OK. Premiere comes with a load of export presets, including one for Vimeo, but this is misleading: it exports at 640 x 480, so fullscreening will break it. We tried exporting the original and cropping the frame, but the damage was done. We’d have to start from scratch.
One thing that really worried me was that the previews in Premiere itself of the screencast files were a bit blurry (probably just down to rendering capacity), so at one point I assumed we’d have to cast it all again, which would have taken a significant amount of time. Luckily, I was reminded on Twitter that this might be the case and checked the files themselves, which were legible and fine. Thank goodness. They were also fine sped up, which was useful, as one of the key tricks we used to match footage to audio was essentially fitting the video length to the audio length.
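The arithmetic behind that length-matching trick is simple. Premiere’s clip speed is expressed as a percentage, and the figure you want is just the clip’s length divided by the audio’s length (the example durations below are made up):

```python
def speed_percentage(clip_seconds, audio_seconds):
    """Speed (as a percentage) at which a clip must play
    so it runs for exactly the length of the audio.
    Over 100 speeds the clip up; under 100 slows it down."""
    return 100 * clip_seconds / audio_seconds

# e.g. a 45-second screencast over a 30-second voiceover
# needs to run at 150% speed.
```

In practice we mostly just dragged clips to fit, but it’s handy to know what the percentage in the speed dialogue actually means.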
Our settings, for reference, were: H.264, 1280 x 720 at 25 fps, with 64 kbps, 48 kHz audio.
Notes and Tips for Premiere
Set up all the things you’ll need. I personally keep it so I can see the files to drag in, the Timeline and the film bit, and have the effects dropdown in another tab, so it’s easier to drag in the most useful things ever: the fade-to-black and cross-dissolve video transitions. We used the former for most transitions, and the latter for cuts within the same segment: say, if we cut out a bit in the middle but it’s ostensibly the same thing.
Keyboard shortcuts are invaluable! Become friends with x, c and v, which flick the cursor between normal selecting (which’ll allow you to drag-and-crop clips), the incredibly useful Rate Stretch tool (the time-smushing one that speeds up/slows down clips as you drag them to fit a specific length) and the razor. The razor is useful at times, but not as helpful as Control (or the Apple key) + K, which cuts at the position of the playhead and so lets you be much more accurate. There’s a key for this in GarageBand too; off the top of my head it’s Apple + T. A note on Apple + K: it will cut all the tracks you’ve got highlighted, which can be useful or can be a pain, so keep an eye on it. Also, render as you go along (hit the Enter key); it’ll make things faster in the end. Also? Save. Save a lot.
There were a lot of things we could have done to make it a faster process: having the right audio equipment from the start, setting up the video file correctly at the beginning… small but very important things that’ll come back to haunt you if they’re done wrong initially. There were also a few screencasts we hadn’t renamed, which meant spending time looking for particular things we knew we had done. All in all, the Analytics guide went much more smoothly, having made the mistakes on the Twitter guide and learnt from them. It doesn’t take too long to edit as long as everything is set up to be easily found. Working technically five-hour days (though generally spending much longer) allowed us to be calmer about the whole process, taking lunch breaks and loosening up in the evenings.
Definitely very fun to do. Having finished uni I’ve not really had the chance to use the skills I honed there, instead concentrating on dull but gainful temping. Working on these learning guides feels like what I really should be doing, and I’m certainly going to be on the lookout for positions that involve either the creation of such guides or the implementation of digital strategy more generally, particularly given that the feedback we’ve received on the final versions has been very positive.
Where to go from here / How does this fit into traditional roles?
It’s quite difficult to figure out what this sort of work really is. Technically we were making digital strategy learning resources that organisations can use to introduce their staff to the importance of social media, a bit like Sue Llewellyn’s guides for the BBC’s College of Journalism. It’s definitely a field that’ll need more consideration from anyone wishing to be in the public eye.
But where does it fit within traditional business models? Many of the issues that arise with social media blow-ups come down to bad customer service, because the people using the tools are perhaps focussed on keeping them as broadcast marketing channels. There’s nothing so frustrating as those in customer-facing roles hearing complaints/suggestions/feedback and having little recourse but to feed it up the chain, which can be far too slow given the pace of technological change. Some middle ground needs to be sought, perhaps using the framework of a kind of un/solicited out-sourcing to aid decision making. Too many cooks are said to spoil the broth, but you can never have too many cookbooks from which to draw inspiration </convoluted analogy>.