I, for one, welcome our new screencasting overlords
Or, how I used robots to make my screencasts.
I don’t like screencasts. I don’t like watching them, and I hate making them.
However, I know a lot of people do like watching them, especially people who are trying to evaluate software. It had been pointed out to me that there was no way to see the internals of 88 Miles without signing up (personally, I would suggest signing up, but whatever). So I thought I’d investigate a way to make a screencast without wanting to stab myself in the face.
So why do I hate making them?
- If you change any part of the UI, you have to go through and re-record the whole thing. This becomes tedious. That said, having an out of date screencast is probably worse than having no screencast.
- Making typos during the recording looks unprofessional, so either you need to be perfect (not going to happen) or you need to spend ages editing out the typos (I’ve got better things to do).
- Not only do you have to write the content, you also have to record the voiceover audio and the video. They are a lot of work.
I’ve spent a lot of time on the look and feel of 88 Miles, and I wasn’t going to produce a crappy screencast – it had to look professional. So I needed a way to automate as much of the process as I could, so that I could replicate the video easily.
After putting out a call on Twitter, Max pointed me at a Ruby gem called Castanaut that basically wraps AppleScript, allowing the automation of both the screencasting software (in my case: iShowU) and Safari. Success! Sort of. There was a little bit of work to get it all working.
The first thing to do is install the gem:
gem install castanaut
I started out using the screenplay from the Castanaut site, but had to make some changes to get it working. First of all, I don’t have Mousepos installed, so I removed that plugin.
Next, Castanaut seemed to miss clicks randomly, which was a pain. After a bit of digging, it looked like the problem was the way it was calling AppleScript. Rather than debugging that, I just installed cliclick, which is a command-line app that controls the mouse without AppleScript. I had to write a small plugin to override the move and click functions. (Save this to plugins/cliclick.rb)
module Castanaut
  module Plugin
    # Replaces Castanaut's AppleScript-based mouse control with the
    # cliclick command-line tool, which proved more reliable.
    module Cliclick
      # Click the mouse button at the current cursor location
      def click(btn = "left")
        `cliclick c:+0,+0`
      end

      # Double-click at the current cursor location
      def doubleclick(btn = "left")
        `cliclick dc:+0,+0`
      end

      # Move the cursor to the given screen coordinates
      def cursor(*options)
        options = combine_options(*options)
        apply_offset(options)
        @cursor_loc ||= {}
        @cursor_loc[:x] = options[:to][:left]
        @cursor_loc[:y] = options[:to][:top]
        `cliclick m:#{@cursor_loc[:x]},#{@cursor_loc[:y]}`
      end
    end
  end
end
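cliclick itself doesn’t ship with OS X; I grabbed it with Homebrew (this assumes you already have Homebrew set up):

```shell
# Install the cliclick command-line mouse tool
brew install cliclick

# Quick sanity check: print the current mouse position
cliclick p
```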
Castanaut will use say (the built-in speech synthesis software on a Mac) for timing voiceovers. You really don’t want to be using that in your final screencast, unless you are actually Stephen Hawking. To solve this problem, I wrote another plugin that automatically generates a subtitle file that, when played in VLC, displays the text, allowing me to read along with the video.
module Castanaut
  module Plugin
    # Writes SRT and WebVTT subtitle files alongside the narration, so the
    # synthesised voice can be replaced with a human voiceover later.
    module Subtitle
      def start_subtitles(filename)
        @filename = filename
        @start = Time.now
        @sequence = 1
        @srt = ''
        @webvtt = "WEBVTT\n\n"
      end

      def stop_subtitles
        @start = nil
        @sequence = 0
        File.write "#{@filename}.srt", @srt
        File.write "#{@filename}.vtt", @webvtt
        @srt = ''
        @webvtt = ''
      end

      # Times the given block, then appends a cue (wrapped at roughly
      # 42 characters per line) to both subtitle formats.
      def subtitle(narrative, &blk)
        start = Time.now - @start
        yield
        stop = Time.now - @start
        @srt += "#{@sequence}\n"
        @srt += "#{time_diff(start)} --> #{time_diff(stop)}\n"
        @srt += "#{narrative.scan(/\S.{0,40}\S(?=\s|$)|\S+/).join("\n")}\n"
        @srt += "\n"
        @webvtt += "#{time_diff(start).gsub(',', '.')} --> #{time_diff(stop).gsub(',', '.')}\n"
        @webvtt += "#{narrative.scan(/\S.{0,40}\S(?=\s|$)|\S+/).join("\n")}\n"
        @webvtt += "\n"
        @sequence += 1
      end

      def say_with_subtitles(narrative)
        subtitle narrative do
          say(narrative)
        end
      end

      def while_saying_with_subtitles(narrative, &blk)
        subtitle narrative do
          while_saying narrative, &blk
        end
      end

      protected

      # Formats a duration in seconds as an SRT timestamp (HH:MM:SS,mmm)
      def time_diff(time)
        micro = ((time.to_f - time.to_i) * 1000).floor
        seconds = (time.abs % 60).floor
        minute = (time.abs / 60 % 60).floor
        hour = (time.abs / 3600).floor
        (time != 0 && (time / time.abs) == -1 ? "-" : "") +
          hour.to_s.rjust(2, '0') + ":" +
          minute.to_s.rjust(2, '0') + ":" +
          seconds.to_s.rjust(2, '0') + ',' +
          micro.to_s.rjust(3, '0')
      end
    end
  end
end
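The narration is wrapped into display lines by that scan regex, which greedily takes chunks of up to 42 characters that end on a word boundary. A quick sketch of it in isolation, using one of the narration lines from my screenplay:

```ruby
# The same regex the subtitle plugin uses to wrap narration:
# a chunk of up to 42 characters ending at a word boundary,
# falling back to a single (overlong) word if nothing shorter fits.
WRAP = /\S.{0,40}\S(?=\s|$)|\S+/

narrative = "Hi, my name is Myles Eftos, and I'm the creator of Eighty Eight Miles"
lines = narrative.scan(WRAP)

lines.each { |line| puts line }
# Hi, my name is Myles Eftos, and I'm the
# creator of Eighty Eight Miles
```

Because the match always ends on a word boundary, the wrapping never splits a word in half.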
This will create both an SRT and a VTT (web subtitle) file. Here is a screenshot of the subtitles overlaid onto the video:
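In plain text, the cues the plugin writes out look something like this (the timings here are illustrative, not from a real run):

```
1
00:00:01,000 --> 00:00:04,800
Hi, my name is Myles Eftos, and I'm the
creator of Eighty Eight Miles
```

The .vtt file is the same, except it opens with a WEBVTT header and uses full stops instead of commas in the timestamps.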
Here is an excerpt from my screenplay file:
#!/usr/bin/env castanaut
plugin "safari"
plugin "keystack"
plugin "cliclick"
plugin "subtitle"
plugin "ishowu"
plugin "sayfast"
launch "Safari", at(120, 120, 1024, 768)
url "http://88miles.net/projects"
pause 5
ishowu_start_recording
start_subtitles "/Users/myles/Movies/iShowU/tour"
pause 1
say_with_subtitles "Hi, my name is Myles Eftos, and I'm the creator of Eighty Eight Miles"
say_with_subtitles "a time tracking application for designers, developers and copywriters."
say_with_subtitles "This short video will show you how Eighty Eight Miles tracks your time"
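With those plugins saved, the screenplay runs via the castanaut binary the gem installs; assuming the file is saved as tour.rb (a name I’ve picked for illustration):

```shell
# Run the screenplay (this will drive Safari and the mouse, so
# keep your hands off the keyboard while it plays out)
castanaut tour.rb

# Alternatively, the shebang line means it can be run directly
chmod +x tour.rb
./tour.rb
```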
Oh, one last thing – I found the synthesised voice was too slow, so I made another plugin that speeds up the voice (saved in plugins/sayfast.rb):
module Castanaut
  module Plugin
    # Overrides say to speak at 240 words per minute; set the SHHH
    # environment variable to silence narration during dry runs.
    module Sayfast
      def say(narrative)
        run(%Q`say -r 240 "#{escape_dq(narrative)}"`) unless ENV['SHHH']
      end
    end
  end
end
I recorded the voiceover using Audacity. I wasn’t too fussed about an exact sync, so I just hit record in Audacity, and play in VLC. If you are worried about sync, just make a noise into the microphone when you hit play (tapping the mic will do it), and you can use that as a sync mark.
Protip: Don’t use the built in microphone on your laptop, unless you are going for the “I’m recording this in a toilet” aesthetic. Ideally, you’d have a decent studio mic with a pop filter (I have a Samson C01U), but you know what? A gaming headset mic will still be orders of magnitude better than your laptop microphone.
Now you should have an MP4 and a WAV file (one for video, one for audio) that need to be mashed together. I use Adobe Premiere Pro for this, but iMovie works great too. You will need to remove the existing audio track from the video file, as it will have the robot voice on it, and replace it with your voiceover track.
Finally, I topped-and-tailed the video with some titles for that last bit of fancy.
After exporting the final render, I used FFmpeg to encode the file into final MP4 and WEBM files so I could drop them into a video tag. To install FFmpeg:
brew install ffmpeg --with-libvpx --with-libvorbis --with-fdk-aac
Then run the following commands:
ffmpeg -i [input file] -c:v libx264 -crf 10 -b:v 1M -c:a libfdk_aac screencast.mp4
ffmpeg -i [input file] -c:v libvpx -crf 10 -b:v 1M -c:a libvorbis screencast.webm
You can upload those files somewhere, then reference them like so:
<video autoplay class="tour" controls height="768" preload="auto" width="1024">
  <source src="/videos/screencast.mp4" type="video/mp4">
  <source src="/videos/screencast.webm" type="video/webm">
  <track default kind="captions" label="English" src="/videos/screencast.vtt" srclang="en">
</video>
Want to see the output? Here is the final render embedded on the internets.