@madpilot makes

Yep, Software Engineering is dead

You know that feeling you get when something you’ve been taught to believe in gets discredited, and because your belief was tenuous at best, the walls come tumbling down around you, and then a huge weight is lifted off your shoulders?

Pascal just posted this on the 220 mailing list. Amen. It’s something that I’m pretty sure I’ve been articulating for a long time. Whenever someone has asked me why software is hard, I always use this analogy:

If you ask a Civil Engineer to build you a bridge, it is easy to spec out. You know how far the bridge has to span, what sort of foundations you need, and as a result you can make a recommendation about what sort of bridge you need. The Engineer can build you a little model – you can look at the model and say “Yes! That is a bridge. That will do nicely”. They can mathematically model the bridge to make sure it doesn’t fall down. They build the bridge and if it allows things to cross from one bank to the other, you have a success.

Unless you are building “Hello World”, a Software Engineer’s life isn’t so simple. You have different platforms, users, stakeholders, contexts – it gets exponentially harder with every feature that gets added. I once did a unit at Uni called Formal Methods, which tried to mathematically model software. It was stupid. The code we modelled was, like, nine lines long, and required a 32-page proof (I didn’t even get close). Stupid.

Of course, academics have been trying to shoehorn software into engineering forever. In first year, they taught us UML, which I guess is similar to architectural drawings or flow diagrams or something. I’m sure UML works really well when working with the waterfall model of software design, which has strong ties to old school, proper engineering. I couldn’t imagine having to go and update hundreds of UML documents every time a minor change was required. We were also taught in first year that the waterfall model is pants in the real world, which by association makes UML nothing more than a nice thought experiment. (I’m still bemused by the number of software firms that put it as a requirement for graduate Software Engineers – basically because coming up with job descriptions for inexperienced programmers is really hard.)

Sure, you can argue that testing is a software technique that we (should) use, but it is the exception to the rule. I guess the conclusion we need to come to is that software isn’t an engineering problem – it’s a people problem. (Some may say it’s a creative problem – that’s also true, but buy me a beer and I’ll explain that traditional engineering is too, so the argument doesn’t further my point). This in itself is a problem, as (gross generalisation ahead) boffins who like coding tend not to deal with real people very well.

Further discussion on our internal list suggested that creating software products is the way to go. I think I want to agree with this – there are many examples of off-the-shelf products that are extremely popular: Microsoft Office, Adobe Photoshop etc. In these situations, the customer works within the workflow of the software, and that seems to work. So do we as developers need to convince our clients that the feature they want may not be needed? Do our clients actually know what they need? Of course, this view is not without its flaws either – users will generally be working against the software, rather than with it. Is working with a sub-optimal solution better than battling with requirements and budget overruns?

I can’t help but think that there is something we are missing. It would seem there is a disconnect between what our clients want and what we can provide. If you look at the classic project triangle, your client wants to minimise price and time, and maximise good (I hope my English teacher isn’t reading this), whereas we want to maximise all three. So the crucial “pick two” part flies out the window. Either we start sacrificing the good, re-negotiate the price, or try to stretch out the project to restore the balance – none of which makes for happy clients.

Well, how about adding fat to the quote? In theory, this is fine – if a client sees value in an “inflated” (but more likely realistic) price then everyone is happy, right? Well, not really – software development is much like homework assignments: you start out with plenty of time and the best intentions, and then end up pulling an all-nighter to get it finished – and you still only get a C at best. I suspect this is because it’s impossible to lock down the requirements of an abstract problem. This isn’t only because of the difficulty of describing what we don’t understand, but because we don’t even know what half of the problems are going to be.

And this is our quandary – how can we estimate unknowns? Not just “we haven’t seen this before but it looks like X” unknowns, but “What the hell? How is that even possible?!” unknowns. Other areas of engineering encounter these problems occasionally – we get them all the time. So, the solution (he says, as if there is one) is to minimise the risks and/or consequences of these unknowns. Jobs that deal with people do this all the time. If you work in marketing, you can postulate all you like – you can’t be sure how a campaign will work until it does. Marketing is reactive.

When you make a change you can’t be sure what will happen. Sure, you can put an ad in the Yellow Pages year after year, because it has brought in on average Y leads per year – but there is no guarantee this year will be the same. It seems that the humanity-based sciences are happy with this, but quantitative-loving geeks don’t like that. Hell, binary is black and white, not Gray.

So, perhaps the key is to treat software as a living, breathing thing. Agile programming and iterative development can help, but they are means to an end – they don’t work without communication and understanding between people. We need to break down the barriers between provider and client – the question is: is that even possible?

The first SchwaCMS goes live!

After the last announcement of MadPilot’s new CMS, I’m proud to follow it up with the announcement of the first site to use Schwa as its backend: Greenvale Mining NL.

Greenvale was designed by the ever-so-talented Adrianne from bird.STUDIOS, and was sliced by the latest addition to the twotwenty family: Niaal Holder from Speak.

We have a number of sites being launched over the next couple of weeks, so watch this space!

Dear clients…

Please watch:

Ideas 5: The accessibility edition

The Australian Web Industry Association, together with the Web Industry Professionals Association, present Ideas 5. This year’s Ideas is focusing on Understanding WCAG 2.0 & Preparing Websites with Improved Accessibility. If you are a web developer and you aren’t thinking about accessibility, then you REALLY need to get your butt down to the Melbourne Hotel in Perth on 22 April 2009. Tickets are only $40 for AWIA members ($55 if you aren’t. In unrelated news, AWIA memberships are pretty cheap.)

The two talks will be given by Roger Hudson and About Andrew, both experts in their fields, so seriously, it’s a great opportunity to hear from people that know what they are talking about.

Go to the website and get more info. Go on. Seriously.

SchwaCMS goes live

For the past couple of months, I’ve been locked in my room, hacking away at a (probably not-so) top secret project – which I have just pulled the big switch on. So ladies and gentlemen, let me introduce you to SchwaCMS. It’s a hosted CMS product that has taken ideas from many years of toiling away on other people’s CMSs.

There is a complete feature list on the website, but some of the geek highlights:

  • Upload progress bar
  • Proper use of HTTP status codes (including 410 – Gone)
  • Full UTF-8 support (Check it out — the ə isn’t an HTML entity, it’s a real schwa)
  • Weight based keyword extraction from content
  • Full-text search, including a Did you mean? function
  • Integrated spell checking
  • Just-in-time scaling and caching of images
  • HTML is automatically cleaned on save
  • The ability to export the menu as an ordered list for inclusion in third-party apps, such as blogs or forums
  • A demo that is tied to the browser session, so as soon as the user logs out, their demo site gets blown away

I’m pretty excited about this release. It’s built on a custom PHP framework, and is hosted on my very own hosting box – it almost feels like MadPilot is a real web company now! (It’s only taken 8 years :P)

Go check it out: http://www.schwacms.com

Using SSH to run remote commands using PHP. A cheat guide.

I’m working on a soon-to-be-released project that needed to run commands on a Linux server. Whilst it would be possible to use something like the exec command to run them, this would mean that the user Apache runs as would have to have permission to run the commands, which is less than cool. I could have messed around with sudo, but even that would open up some gaping holes, as all other websites hosted on the same box could theoretically run the same commands.

As it turns out, there is a PECL project that allows you to remotely login to a server using SSH, which would actually kill a number of birds with one stone:

  1. I can sandbox the commands that get run, by setting a special user that only has access to commands that are needed (using sudo)
  2. The web app would be able to talk to multiple servers, which wouldn’t have been possible with exec alone
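As a sketch of point 1, here is what a locked-down sudoers entry might look like. This is purely illustrative – the “webops” user and the command paths are made up, not from the actual project – but it shows the shape of the restriction:

```
# Hypothetical /etc/sudoers fragment (edit with visudo, never directly).
# "webops" and the two command paths are placeholders for illustration.
# The SSH user can run exactly these commands as root, and nothing else.
webops ALL = (root) NOPASSWD: /usr/local/bin/rebuild-site, /usr/sbin/apache2ctl graceful
```

Anything outside that list gets refused, so even if the web app is compromised, the blast radius stays small.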

The flow is simple: log in to the server – I’m using a username/password pair at the moment, but only because I haven’t been able to get public key exchange working on the server yet (interestingly, it works if I call the code from the command line) – run the command, then check the output and response. There was a slight issue here: ssh2_exec returns a pointer to a stream, which needs to be read. If there is no response (some programs complete without returning anything), the process would block indefinitely. Also, if the program fails, it might not output anything to stdout, instead writing to stderr, AND you miss out on checking the return status code (which quite often gives you some interesting information about the status of the program).

To get around this, I wrote this really simple bash script, that runs the command on your behalf and wraps the stdout, stderr, pwd and result in an XML envelope ready for parsing. Because you will always get the envelope returned (unless the process daemonises) you won’t get the blocking problem.

#!/bin/sh
tmp_stderr=`mktemp`
output=`$* 2>$tmp_stderr`
result=$?
error=`cat $tmp_stderr`
rm $tmp_stderr

echo "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
echo "<xmlsh>"

if [ -n "$output" ]
then
    echo "  <stdout>"
    echo "    <![CDATA["
    echo "$output"
    echo "   ]]>"
    echo "  </stdout>"
fi

if [ -n "$error" ]
then
    echo "  <stderr>"
    echo "    <![CDATA["
    echo "$error"
    echo "   ]]>"
    echo "  </stderr>"
fi

echo "  <meta>"
echo "    <pwd>$PWD</pwd>"
echo "    <return>$result</return>"
echo "  </meta>"
echo "</xmlsh>"

In a nutshell, when you call the script, it runs the program supplied as its arguments, redirecting stderr to a temporary file and capturing stdout in a variable. It then wraps these, along with the current working directory and return value, in XML and prints the lot out. Pretty simple, but it works.
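If you want to play with the capture trick on its own before wiring it into the wrapper, here is a tiny standalone sketch – the path is deliberately bogus (I’m assuming it doesn’t exist on your box) so that both stderr and a non-zero return code get exercised:

```shell
#!/bin/sh
# Run a command, catching stdout in a variable, stderr in a temp file,
# and the exit status before anything else can clobber $?.
tmp=`mktemp`
out=`ls /nonexistent-path-for-demo 2>$tmp`
code=$?
err=`cat $tmp`
rm -f "$tmp"

echo "return=$code"
[ -n "$err" ] && echo "stderr captured"
```

The important bit is grabbing $? immediately after the command substitution – any command you run in between (even cat) would overwrite it.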

MythTV + XBMC = HOT SAUCE

In December, I blogged about my new Atom-based MythTV setup. Whilst it was OK, I’ve since bought a Jetway J9F2 coupled with a Core2 T5500 (1.6GHz) CPU. Let me state this now: it is the best MiniITX-based media centre setup there is. It has HDMI and DVI outputs, two gigabit network ports and digital audio, and has enough power to happily decode free-to-air HDTV. The layout suits my case better than both previous boards, as there seems to be more airflow – with two caveats:

  1. You NEED to use low-profile RAM if you want the DVD player to fit. I got some Kingston dual-channel sticks off eBay (2GB worth).
  2. Because I use a right-angle PCI riser for my trusty AverMedia DVB-T card, I had to slice some of the plastic off the SATA cable to make it fit (the SATA ports are in a really bad spot). The SATA cable that comes with the board is slightly non-standard, which made it possible to perform the surgery – I don’t think it’ll work on regular SATA plugs, as they are thicker.

But besides that, it really is an awesome rig. Oh, it turns out that the “power” issue I had was actually because the DVD wasn’t sitting properly with the old board, which was causing something to jam, which in turn drew too much power and shut the system down. With the new board (and the low profile RAM) DVD playback works perfectly. I’m yet to try burning.

Anyway, since everything is now working properly, I thought it time to mess around with some more software! It is no secret that the MythTV UI is pretty bad (understatement!) – particularly the video and music plugins. Whilst we use the PVR features A LOT, we also watch videos and listen to music quite a bit, and with the library growing and growing it was getting harder and harder to find what we wanted to watch or listen to.

My little brother introduced me to XBMC a while ago, and I was really impressed. It actually feels like a real media center – it has slick effects, nice themes and just feels better. Up until November last year, it was a Windows only affair, but now the port to Linux has been released, so I have installed it. And it is ace. It’s not perfect, but it sure beats MythTV for video and music watching. Some highlights:

  1. When you are watching a movie, and need to go back to the menu, the video continues playing in a smaller window
  2. You can browse music by Artist, Album and Song (unlike MythMusic which is a horrendous tree setup)
  3. It is smart enough to group TV shows together AND it can pull metadata not only for TV shows, but for the episodes in those TV shows
  4. You can group different directories, so, if you are like me and have a couple of external drives, you can group movie directories from both drives into one list
  5. I’ve said it before, but (almost counter-intuitively) having nice animations and transitions makes things feel more polished.

Although it does partially support the MythTV protocol for watching live TV, it isn’t ready for prime time yet – you can’t easily change channels or view the EPG, nor can you change signal inputs or hit record to record a program, which means I still need to run MythTV. Mind you, when they implement the API fully, I would happily drop mythfrontend for XBMC.

So to make life a little more remote-control friendly, I’ve added a custom menu item for XBMC into my /usr/share/mythtv/library.xml that just runs xbmc -fs (full screen mode), so I can select it from the Media menu item in MythTV.
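For reference, the menu item is just another button entry in that file – something along these lines. I’m writing this from memory, so treat the type and wording as a sketch of MythTV’s menu XML rather than gospel:

```xml
<!-- Sketch of a custom entry in /usr/share/mythtv/library.xml.
     The EXEC action hands the command line straight to a shell. -->
<button>
    <type>VIDEO_BROWSER</type>
    <text>XBMC</text>
    <action>EXEC xbmc -fs</action>
</button>
```

When XBMC exits, you drop straight back into the MythTV menu, which is exactly the behaviour you want from the couch.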

There are still some outstanding issues though:

  1. I need to get my remote mapped properly – for some reason it ignores my arrow keys, which is really annoying. I guess I just need to mess with the Lircmap.xml file to sort that out.
  2. I need to work out how to add an exit menu item on the main menu – I haven’t got a key I can bind to the shutdown menu, which makes dropping back to MythTV impossible without a keyboard
  3. There is no deinterlacing support in the Linux version yet (that I can find anyway) so HD TV is unwatchable – no biggie – I use MythTV for that.
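On issue 1, the mapping file is fairly self-explanatory once you see one. Here is a hypothetical Lircmap.xml fragment – the device name and button names are placeholders standing in for whatever your lircd.conf reports (check with irw), so don’t copy it verbatim:

```xml
<!-- Each tag is an XBMC control; its text is the button name your
     remote reports via LIRC. "mceusb" is a placeholder device name. -->
<lircmap>
  <remote device="mceusb">
    <up>Up</up>
    <down>Down</down>
    <left>Left</left>
    <right>Right</right>
  </remote>
</lircmap>
```

If the arrow keys are being ignored, the usual culprit is the device name in the remote attribute not matching the name in lircd.conf.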

Other than that, give it a go – it is what MythTV SHOULD be. 4 1/2 stars.

Global search and replace using the command line

If you have ever used Dreamweaver, you are probably familiar with the Global Search and Replace feature, which allows you to search and replace amongst all files within a site – very handy if you are doing a static site. If you are too hardcore for Dreamweaver though, and you spend your whole day buried in a terminal window, how can you achieve the same thing? With this piece of bash-trickery, that’s how:

find . | grep html | xargs -t -I {} sh -c "cat {} | sed 's/Stuff to find/Replace with this/' > {}"
  1. find . – finds all the files from the current directory down
  2. grep html – filters the output to include only filenames with html in them
  3. xargs -t -I {} sh -c – runs the quoted shell command once per file, substituting the filename wherever {} appears
  4. sed 's/Stuff to find/Replace with this/' – just a search and replace
  5. > {} – save the output back to the original file name

A word of warning though: leave the last bit off until you are sure your output is correct, because there is no undo feature :)
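One extra gotcha with the > {} trick: the shell can truncate the file before cat has finished reading it, so on a bad day you can lose the contents entirely. If you have GNU sed handy, its -i flag does the temp-file dance for you. Here is a self-contained variant (it builds its own throwaway file in /tmp, so it’s safe to try):

```shell
# Safer spin on the same idea using GNU sed's in-place flag, which
# writes via a temp file instead of clobbering the file mid-read.
mkdir -p /tmp/grdemo && cd /tmp/grdemo
echo '<p>Stuff to find</p>' > page.html

# -print0 / -0 keep filenames with spaces intact, too.
find . -name '*.html' -print0 | xargs -0 sed -i 's/Stuff to find/Replace with this/'

cat page.html
```

Same effect, one fewer foot-gun – though it still has no undo, so the “check your sed expression first” advice stands.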

MythTV on an Intel Atom

I’ve been using MythTV for about 4 or 5 years now, first as just a DVD, video and music player, and more recently as an actual television replacement.

Unsurprisingly, my old VIA M10000 was starting to get a bit long in the tooth (it IS a 4 year old motherboard that was underpowered when it was new), so when the D945GCLF was released by Intel a few months ago, I decided to give it a go.

Just like all the netbooks out there at the moment, it’s an Atom 1.6GHz, so it’s still underpowered, but it surely has to beat the old 1GHz Nehemiah…

The board was so cheap – about AU$90, compared to the AU$300 I paid for the M10000 – that the extra dollars I had to fork out for new RAM, a HDD (I needed a SATA one), a power supply and an extra heatsink didn’t hurt as much as they could have.

The last two items were required because the Travla case I have only has a 60W PSU, which it turns out wasn’t quite gutsy enough (more on this in a moment), and because the large factory heatsink (for the Southbridge or Northbridge, whichever one ISN’T the CPU) stopped the DVD player from fitting – thankfully an after-market low-profile heatsink seems up to the job.

Problem 1: The Realtek 8165 network card wasn’t recognised by the Gentoo 2008.1 live CD, since the kernel was too old. Throwing in a spare PCI network card allowed me to bootstrap it, and kernels 2.6.27+ seem to support the card. Thankfully the rest of the base install was relatively painless – well, as much as a Gentoo install can be… As of that kernel, there isn’t yet a kernel optimisation option specifically for the Atom, so I picked the dual-core defaults, which seem fine.

Problem 2: I spent WEEKS trying to fix this before I gave up. As usual, XvMC – the interface that makes DVD and digital TV playback less CPU intensive – would segfault in Xine, MPlayer and MythTV (from memory I spent two years trying to get the M10000 working – I’m obviously less persistent now). I tried different gcc flags and getting the latest versions from the relevant repositories, but nothing seemed to work. However, since the CPU would happily decode SD over-the-air broadcasts and DVDs, I was tempted to cut my losses and forget about it. The fact that XvMC wouldn’t have helped with HD content anyway (it theoretically maxes out for anything bigger than 1024×768, I think. Oh, and it doesn’t do MPEG4) made the decision for me.

Interestingly, Xine would happily decode a Channel 10 “Full HD” DVB-T recording, but MythTV seems a little more CPU hungry, so live TV is too choppy to watch. I wonder whether the dual-core version might have enough headroom – I might try that in the new year (although I have a feeling the larger CPU heatsink will stop it fitting in my case).

Problem 3: The board would randomly (or not so randomly, as it turns out) reboot itself. You would have thought, after spending so much time in class during electronics units at Uni, that I would have worked out that the 60W wall adapter I was using couldn’t supply enough juice. If the second USB TV tuner and the DVD player were in use at the same time, the picoPSU 120 would shut down. Thankfully eBay came to the rescue, and an 80W adapter is sitting on a delivery dock somewhere in Hong Kong. Hopefully it will find its way to my house.

Problem 4: The volume from the Intel HD chipset is REALLY low. Normal listening has the volume at ~80-90, rather than the 30-40 of the old board. I can probably fix that, but it’s tolerable, so I’m not too worried…

Problem 5: The GPU fan IS LOUD. It’s OK when there is something on the TV at a decent volume, but when it’s off, it sounds like a really small 747 in there. Mind you, if we are in the lounge room, the telly is probably on, so again, no biggie.

Problem 6: The latest version of Gentoo has trouble compiling mjpegtools-1.8 which is required by mytharchive. I had to compile it by hand, after applying some patches.

Problem 7: There is a bug in the network card driver, where sometimes, if the system performs a warm boot, the network card will stop working, which can only be fixed by a cold reboot (sometimes multiple times).

Problem 8: Clutch – the web interface to Transmission – has removed the ability to have Transmission download .torrent files for you. Now you have to download them to the desktop, then upload them. Dumb. The developers said something about not wanting AJAX calls to have to wait, which sounds like a dubious excuse to me – file uploads aren’t AJAX calls; they have to make a full round trip to the server.

So after all that (oh, come on – 8 issues is a walk in the park for installing Linux! :P), the system is up and running. Overall, it’s not bad.

Advantage 1: The menus are much, MUCH snappier than the old machine. I can press a button on the remote, and I no longer have enough time to make a cup of coffee before the menu item changes.

Advantage 2: The standard transitions in mythphotos now work. The OpenGL ones don’t though.

Advantage 3: One of the plus sides of not having to use XvMC is that the OSD is in colour and not distorted (because the overlay got mixed in before the scaling was done, the font rendering always looked weird).

So is this a good buy? If you aren’t afraid of a compiler, and aren’t going to miss HD OTA (Blu-ray WILL NOT WORK), then sure – it’s cheap and mostly easy to get working. It would be nice if there was a DVI output on the board – Intel could easily have replaced the serial or parallel port with one. It also would have been nice if they had put in a better GPU – either a lower-powered one or a better-spec’d one; I’d be happy with either.

But all in all, for a cheap, interim board it’s quite nice. 3 stars.

Now do I make a bad pun about Christmas or summer BBQs…

You know what, I’ll save you the pain.

The reason I’m being tempted by such high-brow literary devices is the AWIA Web Mixed Grill – 24 short web articles from now until Christmas – a web advent calendar, if you will. Today saw Miles Burke giving hints and tips about being a successful freelancer, and tomorrow I’m blabbering on about HTTP request codes or something, and I hear from a small avian creature that there are a number of other hawt-sauce articles coming up, so it’s well worth the RSS subscription.

Of course, if you think that you have what it takes to impart some knowledge on the web world at large, then there are still a few spots left, so email mixedgrill@webindustry.asn.au and pony up some grey matter.
