Where is my iPhone Mini?

I’ve been an iPhone user and fan ever since the original came out, and I’ve used one for the past four and a half years: first the original iPhone, then the 3G and the 3GS, and, having skipped a couple of models, I now have an iPhone 5. I’ve smashed the screen, obviously, by dropping a dumbbell onto it, but it seems unfashionable to have an iPhone with an intact screen these days and the dumbbell thing* gives me man points.

Smashed screen aside, the iPhone 5 is a very capable smartphone. However, I’m at the point where I believe it is in fact so capable that I’m struggling to justify owning it. I find that I actually use very little of what it has to offer: the phone itself, text messages, e-mail, Facebook, Twitter, Foursquare, Maps, the camera, the iPod occasionally*, National Rail Enquiries and a handful of other apps on an occasional basis. Although my old 3GS was slow, there was nothing in that list it couldn’t do, and there is nothing I use my iPhone 5 for now that I didn’t use my 3GS for (with the exception of the camera, which I didn’t use on the 3GS because it was properly awful). I use mobile apps on my iPad much, much more than on my smartphone; my iPad is where I need the mobile computing power and features.

My point is that I’m paying £45 per month on a lease for this massive, overpowered pocket computer and carrying it with me everywhere I go, with its fragile screen, poor battery life and a relatively high chance that I’ll get mugged for it one day, when I barely use its capabilities. When Apple launched the iPad Mini earlier this year I had very high hopes that they would follow suit with a smaller iPhone, the iPhone Mini or whatever: a device which isn’t as powerful as a full-blown iPhone but is smaller, has better battery life and can do the basics, like phone calls, text messages, basic social media apps, the iPod and a reasonable (if not overly fancy) camera.

My hope was that they would base it on the iPod Nano:

The iPod Nano

This device has a small colour multitouch screen with an iOS-like interface which is clearly capable of handling a form of application selection. I cannot imagine it would be hard to fit the necessary mobile phone and wifi electronics into a package this size, even if it perhaps had to be slightly thicker than a plain iPod Nano (in the same way that the iPod Touch is thinner than the iPhone). It would have been perfect for me, so I got quite excited when I saw the rumours about the iPhone 5C. Perhaps the “C” stands for “compact”?

But no.

The iPhone 5C is nothing more than a re-packaged iPhone 5, except that they’re making it out of plastic, which will arguably be more robust, but is really a decision made mainly to reduce cost. Despite this, the 5C is by no means a bargain, offering a saving of just £80 over the even more powerful (and even more expensive) flagship iPhone 5S, which has been introduced to replace the iPhone 5. The top-of-the-range 64GB model costs an eye-watering £700-plus.

They’ve missed a trick here. I’m not normally underwhelmed by Apple launches (although I am by no means a frothing fanboy before, during or after them), but this one may as well never have happened.

* I have, incidentally, eliminated the possibility of future dumbbell-related screen smashes with the purchase of an iPod Shuffle for use in the gym. It’s not possible to smash the screen on this because it does not have one.


A very Angular learning curve

Recently my team at work have been working with Angular JS, a Javascript framework created, used and published by Google. We’ve used it extensively in our new website, which is built from static HTML and Javascript files with no server-side page generation: all the work is done by the browser, and user interaction is processed via a REST API.

The Angular JS logo

I didn’t actually do any of the coding on the website, so I didn’t have the opportunity to learn Angular JS during the project as the rest of my team did. In order not to fall behind on the skill, I decided to learn it in my own time by creating a web-based tool which generates DHCPd configuration files. The application is boring (although actually useful if you run such a server), but that’s not the point; it was a learning exercise.

Angular JS has a bit of a learning curve. It works differently to other Javascript libraries and frameworks, and when you’ve started from scratch it takes a while to “think Angular” rather than in the ways you may have become accustomed to with things like jQuery. jQuery was itself revolutionary in the world of Javascript, but Angular takes things to a whole new level. Once you are “thinking Angular”, everything becomes much clearer and easier and you find yourself in a very natural-feeling flow.
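
To give a flavour of the difference: with jQuery you find DOM elements and push new content into them; with Angular you declare the binding in the markup and simply change the model. A minimal sketch (not taken from the DHCPd tool):

<script src="https://ajax.googleapis.com/ajax/libs/angularjs/1.0.8/angular.min.js"></script>

<div ng-app="demo" ng-controller="GreetCtrl">
    <input type="text" ng-model="name">
    <p>Hello, {{name}}!</p>
</div>

<script>
    // No DOM manipulation anywhere: typing in the input updates
    // $scope.name, and the paragraph re-renders automatically.
    angular.module('demo', []).controller('GreetCtrl', function ($scope) {
        $scope.name = 'world';
    });
</script>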

I’ve made the exercise available on Github. You may find the tool itself useful if you’re a system administrator, but if you’re a developer you will probably see more value in it as a demonstration of a simple Angular application.

I have some larger extra-curricular projects around the corner which I intend to base on Angular JS, expanding my knowledge further. We’ll also continue to use it at work, almost certainly including the re-implementation of the user interface of the company’s internal browser-based management system.


MRTG Dashboard

I’m one of those die-hards who’s been using MRTG for almost as long as I’ve had a computer with a network connection. It’s an old tool for monitoring network traffic, and it’s not pretty by modern standards, but it still does that job very well. However, its blocky output does leave rather a lot to be desired in this day and age of interactivity, so I’ve knocked together an MRTG Dashboard.

It’s a single PHP script which you just pop in your MRTG output directory (workdir) on your PHP-enabled web server. That’s all you need; all the required libraries are loaded from CDNs. It’s not perfect, but it is an improvement.
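
Deployment really is that simple. A sketch, assuming your MRTG workdir is /var/www/mrtg (the path and script name here are illustrative; check the Github repository for the actual filename):

# copy the dashboard script into MRTG's output directory
cp mrtg-dashboard.php /var/www/mrtg/
# then browse to http://yourserver/mrtg/mrtg-dashboard.php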

MRTG Dashboard screenshot

You will find that the timescales on the interactive graphs can be a little hit-and-miss. This is because Highcharts expects data points at consistent intervals when creating time-based graphs, whereas MRTG’s data is anything but consistently spaced. I will try to improve this at some point in the future.
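
The sort of fix I have in mind is to resample the irregular samples onto a fixed grid by linear interpolation before handing them to Highcharts. A rough sketch (the function and variable names are mine, not from the dashboard itself):

// points: array of [timestamp, value] pairs, sorted by timestamp,
// as parsed from an MRTG log. Returns evenly-spaced points.
function resample(points, intervalMs) {
    var out = [];
    var i = 0;
    var last = points[points.length - 1][0];
    for (var t = points[0][0]; t <= last; t += intervalMs) {
        // advance to the pair of samples that straddle time t
        while (i < points.length - 2 && points[i + 1][0] < t) { i++; }
        var a = points[i], b = points[i + 1];
        // linear interpolation between the two samples
        var frac = (t - a[0]) / (b[0] - a[0]);
        out.push([t, a[1] + (b[1] - a[1]) * frac]);
    }
    return out;
}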

You can get MRTG Dashboard from Github.


Driven to drop Google Drive for Dropbox

Cloud computing is a wonderful thing, whether you are a business or a consumer. It isn’t the answer to everything, but it’s certainly solved some common problems, not least of which is the issue of back-ups. These days, for a few dollars per month, everybody can transparently back up most, if not all, of their important files to servers on the Internet and have those files synchronised between multiple computers and mobile devices such as smartphones and tablets.

There’s also no shortage of companies willing to offer their cloud storage services. Some services, like Amazon’s S3 service, are geared towards developers for integration into software (although Amazon now have a consumer offering), but there are many aimed at consumers who want a simple way of achieving transparent backup of their personal files. Microsoft, Symantec and Google all offer solutions, although not all are cross-platform.

Google Drive

Up until last week I used Google Drive, having taken up the service when it launched earlier in the year. It costs $4.99 per month for 100GB of storage and comes with software which you install on your computer; this automatically manages the synchronisation of your files, so long as you save them in the special “Google Drive” directory.

However, Google Drive was not without its problems from the very start. The software is not particularly well written and it apparently has some bugs. It suffers from massive memory management problems and is prone to crashing without warning. This was especially annoying during my initial upload of files, which would have taken around a week had the software remained running; instead it would quit every few hours, and because I was either asleep or not at home to restart it each time it crashed, the upload took far longer.

But it got there in the end, and for around six months it successfully kept my files safe and synchronised between my computers. I still had the memory issues (it typically used between 700MB and 1GB of RAM even when idle), so I often found myself having to quit the software to free up some RAM when I needed it. This wasn’t ideal, as it meant I had to remember to restart Google Drive to ensure my files were kept up to date, but I lived with it.

Restoration test

Then, at the end of November, came a real test of the value of Google Drive. The hard disk in my desktop Mac Mini developed unrecoverable hardware problems, and I had to replace it. Although this was a time-consuming process it was not a disaster, as I had all my important data in one cloud service or another: all my music in iTunes Match, all my development work on Github and all the other files I would be upset about losing in Google Drive. Files that aren’t in any cloud service are stored on an external hard drive; these could be replaced relatively easily if I had to, and it’s not worth backing them up.

So I merrily removed the old hard disk without attempting to recover any of my data from it and installed the new one in its place (putty knives and swearing are always involved when upgrading an old-shape Mac Mini). I installed the operating system and all my software from scratch on the new disk and then began restoring my data from the various cloud services. Github and iTunes Match worked like a charm straight off the bat, but Google Drive was, unfortunately, an entirely different story.

I installed the latest version of the software and entered my Google account details. It thought about it for a bit, allocated itself a whopping 3.25GB of RAM, and then started to download my files. “OK”, I thought, “the RAM thing is even more annoying than it was before, but whatever”, and left it to do its thing. After downloading around 700MB, it displayed a window saying that “An unknown issue occurred and Google Drive needs to quit”. The window also said that if this happens repeatedly I should disconnect my account.

It did this seven further times; each time it downloaded around 100MB of data before displaying the error again. After the seventh time it didn’t download any more data, no matter how many times I ran it. It had downloaded just 1.3GB of my 55GB of data. So I tried disconnecting my account and logging in again. It insisted on starting the download from scratch, forcing me to discard the 1.3GB already downloaded, and then did exactly the same thing: repeated errors, then “maxing out” at around 1.3GB of files after numerous restarts. It was, frankly, ridiculous.

Out of frustration I called upon Google’s support, to which as a paying customer I was entitled. Their suggestion, which came 48 hours later, was to uninstall and re-install the software. Needless to say I was not particularly impressed. I did not believe for a second that this would fix the problem, and suspected I was simply being taken through a standard support script. This was the final straw with Google Drive: after all the upload issues and memory issues, now an apparent inability to restore from my precious backup when I needed it.

If the console messages were anything to go by, I am 99% sure that it was crashing due to poor memory management (i.e. it was running out of memory). On that basis I considered that following their reinstallation advice would be a waste of my time, as would attempting to explain my technical suspicions to them. I needed my files back and I needed my cloud service back, on my timescale and not on Google’s.

Dropbox

I am fortunate to own two computers, and this was my saving grace. I still had the copy of the Google Drive directory on my other computer, so I had a local and up-to-date copy of all my files. Had I owned only one computer, I would have been entirely at the mercy of Google to get my files back. That was not something I was comfortable with, so I decided I had two choices:

  1. Persevere with Google’s support and, assuming they manage to fix the issue, continue to tolerate their piss-poor software going forward.
  2. Use the other copy of my files I had, find an alternative cloud storage service, upload them to it, and dump Google Drive.

I chose the latter. I had heard good things about Dropbox. They are a small firm for whom online storage is the entire business, rather than just another product, as it is for Google. It is absolutely in their interest to get their offering right, because if they don’t, they don’t have a dominant global search engine business (for example) to fall back upon. I wouldn’t be surprised if Google Drive grew, half-arsed, out of a project that a Google developer created on his “do your own thing” day of the week, a privilege extended to Google developers as standard, to the envy of most others.

Dropbox is twice the price of Google Drive, costing $9.99 per month for 100GB instead of $4.99. In my opinion that isn’t a high price to pay for a reliable solution. Like Google Drive, it comes with software to be installed on your computer(s) which creates a special directory into which you save your files; it sits in the background and uploads and downloads files as required. The difference is that the Dropbox software does so without using all your RAM and without quitting every few hours. Amazeballs!

It took around seven days to upload my files to Dropbox, during which the software did not crash even once and used no more than 400MB of RAM at its peak. I was supremely impressed; this is how Google Drive should have been from the very beginning, and the fact that Dropbox can do it means there is no excuse for Google Drive. (Google Drive’s memory management was so poor that it never released memory it no longer needed; its RAM usage just kept going up and up and up.) I am currently downloading these newly-uploaded files to my other computer en masse, and guess what: still no crashes, and it doesn’t seem to think that downloading 55GB is an insurmountable task, so it doesn’t give up after the first 1.3GB.

Other things I like about Dropbox:

  1. Great mobile app for iPhone and iPad. This, too, Just Works, and allows viewing of a wide range of file types. It also backs up the camera photos from each device, which is a nice touch.
  2. It has an API, which allows it to be integrated into other software and services, such as IFTTT. This is more exciting for me than it probably would be for most people, but it’s something that Google Drive doesn’t have.

Of course, Dropbox may well have problems of its own that are not yet apparent. If any transpire I will of course report on them, but initial testing and use of the service are very promising, and certainly far better than my comparable early days with Google Drive.

So there you are. If you’re looking for advice on which cloud backup service to use, I recommend Dropbox. It’s compatible with Mac OS, Linux, Microsoft Windows, iOS (iPhone, iPad) and Android. Enjoy.


Why I’ve cancelled my TV licence

Image credit: Computer Active

I have cancelled my television licence. My current licence is valid up to and including 30th November 2012. This post is about why I have made this decision; it sets out the facts and information I used to make it, with the intention of helping others make similarly informed decisions about their produced entertainment choices.

First off, I would like to make it clear that I am not anti-BBC. I believe that the BBC is a fantastic institution, the gold-standard envy of the world when it comes to broadcasting. Recent events may well have given the BBC a tremendous bloody nose, but I still fundamentally believe that it deserves the respect it commands.

Twenty years ago you obtained your produced entertainment from a relatively small number of sources: radio, a handful of terrestrial television channels, cinema, theatre and the printed press. These days it’s very different, the proliferation of the Internet having changed everything in ways that were unimaginable two decades ago. Now you can choose how you consume produced entertainment, and, most importantly, the laws regarding your consumption vary according to the method you choose. Television programmes are no longer available only as scheduled broadcasts; they are also available on demand over commonplace domestic Internet connections, viewable on a range of domestic equipment including computers, televisions and games consoles.

The point is that these days you do not have to watch or record television programmes while they are being broadcast in order to enjoy them. This is revolutionary for many people who have limited time to watch television and so cannot commit to being in front of their sets at pre-determined times. People can’t and don’t plan their lives around the Radio Times any more.

Why do you need a licence?

You need a television licence if you use a television, video recorder or digital recorder to view or record (i.e. receive) television broadcasts from any broadcaster in the United Kingdom. That’s it. You do not need a licence for any other reason, including:

  1. Radio, analogue and digital.
  2. Watching DVDs, Blurays or other formats of physical video media.
  3. On-demand TV services from any provider, including BBC’s iPlayer.

The third item is critical. Why aren’t such services covered when, especially in the case of the BBC’s iPlayer, the content has still been produced and made available using funds from the TV licence? The answer is simple: when you view content through such services it is not a broadcast, it is a unicast, and is therefore not covered by the legislation applicable to television broadcasts. The BBC even admit this themselves, stating clearly:

You do not need a television licence to catch-up on television programmes in BBC iPlayer, only when you watch or record at the same time (or virtually the same time) as it is being broadcast or otherwise distributed to the public. In BBC iPlayer, this is through the Watch Live simulcast option.

Anyone in the UK watching or recording television as it’s being broadcast or simulcast on any device – including mobiles, laptops and PCs – must, by law, be covered by a valid TV licence.

A ‘live’ TV programme is a programme, which is watched or recorded at the same time (or virtually the same time) as it is being broadcast or otherwise distributed to members of the public. As a general rule, if a person is watching a programme on a computer or other device at the same time as it is being shown on TV then the programme is ‘live’. This is sometimes known as simulcasting.

If you are using the live rewind function to either restart the current live programme or to rewind any live stream for up to 2 hours, a television license is required as you are still accessing the live simulcasts.

This is backed up by the TV Licensing website, which also answers the question of whether or not you can use a DVD/Bluray player without a licence. It states:

The law states that you need to be covered by a TV Licence if you watch or record television programmes, on any device, as they’re being shown on TV. This includes TVs, computers, mobile phones, games consoles, digital boxes and Blu-ray/DVD/VHS recorders.

You don’t need a licence if you don’t use any of these devices to watch or record television programmes as they’re being shown on TV – for example, if you use your TV only to watch DVDs or play video games, or you only watch ‘catch up’ services like BBC iPlayer or 4oD.

The second sentence of the first paragraph only applies to DVD, Bluray and VHS recorders, i.e. if your DVD or Bluray player is capable of recording and you use it to record broadcast television programmes. Just because it is capable of doing so does not mean that you must purchase a television licence, in the same way that if you don’t use your television to receive broadcasts you also do not need to purchase a licence, even though your television is capable of receiving them.

Consumer choice

At this point the decision whether or not to purchase a television licence becomes a normal consumer choice. It is not mandatory to buy a licence; one is only required if you choose to watch television as it is being broadcast. There is no other reason.

I paid £145.50 for my last television licence. I estimate that my lifestyle permits me to watch perhaps one, maybe two broadcast television programmes per week, and that 90% of my television consumption is through on-demand services such as BBC iPlayer and Netflix, with the other 10% made up of a combination of broadcast television and trips to the cinema. £145.50 per year is £12.13 per month, and assuming two programmes per week (104 per year) it works out at £145.50 ÷ 104, an average cost of around £1.40 per broadcast programme viewed. I do not consider this good value and I have concluded that I can easily live without this source of entertainment.

The same consumer rights and laws apply to this decision as to any other. I go to the gym five times per week, but if, like many people, I paid for a gym subscription and only went once or twice a month (at best), I would probably consider cancelling it. It is not compulsory to have a gym subscription: if you don’t use it, don’t pay for it. A more relatable example may well be a Sky television subscription. Would you pay a monthly subscription for Sky channels if you never had time to watch them? It isn’t compulsory to have a Sky subscription, so if you don’t need one, don’t have one.

The point is that I do not need to be able to watch broadcast television. I do not have to watch it, it is not mandatory to do so, and the law states that if I do not do so then I do not need to purchase a TV licence.

So on 30th November 2012 I will disconnect the aerial cables on both my televisions and switch entirely to on-demand services, which both my television sets are capable of receiving themselves without the need to connect a computer. The only disadvantage is that I will have to wait up to two hours longer than everyone else to watch a programme. I believe I can live with that.

The TV Licensing Gestapo will get you!

No, they will not.

Don’t get me wrong, I’m fully expecting a barrage of nasty, threatening letters from them. But sending nasty, threatening letters is the absolute limit of their powers unless they can prove to a judge, beyond any reasonable doubt, that I am watching broadcast television without a licence. There is absolutely no way they will be able to prove that, chiefly because I won’t be. The burden of proof is upon them, rather than the burden of disproof upon me.

TV Licensing will threaten to send an inspector to my home. They may actually do so; in contrast to TV detector vans, TV Licensing inspectors do actually exist. The problem for TV Licensing is that their inspectors have no more right to enter a private home than someone selling dusters door-to-door: they require a warrant to enter and search private premises. The BBC states that a search warrant would never be applied for solely on the basis of non-cooperation with TV Licensing, and that in the event of being denied access to an unlicensed property it will use “detection equipment” rather than a search warrant. The BBC has also admitted that “TVL has not, to date [as of 01/04/2011], used detection evidence in Court”. Since a warrant application would presumably have to be supported by detection evidence, this means that no judge has at any point issued TV Licensing with a search warrant.

In short, all I have to do is ignore the letters from TV Licensing, no matter how nasty they become. Others have done this successfully. The letters are all mouth and no trousers, but for the most part they work: many less well informed people will capitulate after several of them, having been convinced that owning a television licence is absolutely mandatory, akin to paying council tax or income tax, rather than an optional consumer choice.

What if everyone did this?

While I think it’s unlikely that “everyone” will do likewise, I expect that a significant number will follow suit in the next few years. Television habits are changing drastically, and I think more people will make the same consumer choice. It will reach a tipping point where the BBC will simply not be able to function to the standard everyone expects because of the combined reduction in TV licence revenues. At that point there will probably be a change in the law so that a TV licence becomes necessary to watch catch-up services from the BBC and Channel 4 (commercial providers may be different).

At that point I will have absolutely no problem paying the TV licence fee and will be happy to start doing so again. It will be excellent value for money.

Let me know how it works out for you

I will do. I’m not going to be one of those people who scans and publishes every nasty letter or evangelises about not having a TV licence on this blog or on social media; people like that irritate me. As I have said, this is a consumer choice. I wouldn’t bang on endlessly about cancelling my gym subscription. If anything significantly interesting happens as a result of this decision I will follow up this post.


Airplay with Raspberry Pi

I bought a Raspberry Pi this week. For those who don’t know, this is a tiny ARM-based computer, the size of a credit card, supplied as a bare board without case, power supply or mass storage, for £30 (delivered). It’s been in the media, described as a universally affordable spiritual successor to the popular 1980s BBC Micro, having been designed to teach school kids how to program computers.

It ships with 256MB of RAM, an SD card slot, two USB ports, an ethernet port and an HDMI port. It’s powered via micro-USB and so will work with any micro-USB cable (and therefore many phone chargers). You have to add an SD card for mass storage, onto which the operating system is installed, and connect an HDMI display and a USB keyboard. You can easily spend the original purchase price again on accessories, but that still doesn’t make it expensive.

Raspberry Pi running RISC OS 5

The primary intention of its makers is for it to run a special Linux distribution called Raspbian, which is based on Debian, but it is by no means limited to this. In theory it can run anything compiled for the ARM architecture, although in practice it’s not quite that simple. Already a group is working on a port of Android, an obvious choice since that operating system is designed for ARM-based smartphones and tablets. Someone has even made a RISC OS 5 distribution available (RISC OS 5 is the older fork of RISC OS which was open-sourced; RISC OS 6 remains a commercial product and is not available in the same way). This gave me a few hours of delightful nostalgia, as I lived and breathed RISC OS for five years back in the early 1990s. I’m hoping to be able to use it to recover some of my old files and convert them to PDF.

But this isn’t the real reason why I’ve bought my Raspberry Pi. Nor have I bought it, as many will, just to dick about with it. Unlike some others I don’t have any grand delusions that it will replace either my desktop computer or my home server, because it’s frankly not up to either task. Its low cost and the fact that you can run it off a USB port mean that it’s actually rather slow, but that’s fine; it was never designed or meant to be a fast computer. It is, however, small, cheap and perfect for what I want to use it for.

Alternative Airplay device

Airplay is the system through which Apple devices can play music through remote speakers connected to devices on the local network. These can be Apple TVs or an Airport Express. The Apple TV represents great value at £99, but the Airport Express less so at £80, an increase on the previous price since the new model came out. Most people already have a wireless network, and £80 just to connect your stereo to your network is a little steep if you don’t need the wireless features of an Airport Express.

Here’s how the budget stacks up: the Raspberry Pi is £29.95 delivered from Farnell. On top of that you’ll need an SD card (£3.38 delivered from Play.com), a case (various options on eBay; I found one for £4.23 delivered) and, if you don’t have a spare already, a micro-USB charger (£2.40 delivered from Play.com). This all comes to £39.96 delivered, almost exactly half the cost of an Airport Express.

You will also need an audio cable and an ethernet cable, but I’m not including these in the budget since neither is included with an Airport Express either. I would point out, however, that the Raspberry Pi solution is not a wireless solution without the addition of a USB wireless dongle, themselves no more than a fiver on eBay.

Instructions

  1. Install Raspbian. You can do this using one of the pre-built images if you want, but if you’re capable I recommend installing it using the network installer, so you can control what goes on and it uses as little space as possible (you will, however, find this method much slower). You’ll need at least a 2GB SD card for either method; I tried to shoehorn an install onto a 1GB card by removing the swap partition, but it didn’t boot. If using the network installer you need only the default options, no extras required.
  2. I recommend that you update the firmware and the operating system (using aptitude) at this point. There have been some recent improvements to the firmware which bring performance increases and better wireless support.
  3. Log in as root and run the following commands:

aptitude update
aptitude upgrade
aptitude install sudo ntp build-essential pkg-config alsa-utils git libao-dev libssl-dev libcrypt-openssl-rsa-perl libio-socket-inet6-perl libwww-perl avahi-utils wireless-tools wpasupplicant unzip wget
mkdir /root/build
cd /root/build
git clone https://github.com/albertz/shairport.git shairport
cd shairport
make
make install
cp shairport.init.sample /etc/init.d/shairport
cd /etc/init.d
chmod a+x shairport
update-rc.d shairport defaults

  4. Add these lines to /etc/rc.local. The second line forces the audio through the 3.5mm jack rather than the HDMI port. If for some reason you require the latter then omit the second line.
modprobe snd_bcm2835
amixer cset numid=3 1
  5. Change the line starting DAEMON_ARGS in /etc/init.d/shairport so that it reads the following (you can change “Raspberry-Pi” to a string of your choice):
DAEMON_ARGS="-w $PIDFILE -a Raspberry-Pi"

Reboot, and you should now see a new entry in the Airplay menu on your device. At this point my SD card was using 783MB on its root partition. I’ve made an image of this with a view to making it available for download, but even compressed it came out at 658MB, and since I pay for my bandwidth by the gigabyte I won’t be uploading it, not when the instructions are so easy.
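
Incidentally, if the entry doesn’t appear, you can check that the Pi is actually advertising the RAOP (Airplay audio) service on the network; avahi-utils was installed by the commands above, so on the Pi itself:

# browse for RAOP services, resolving names, then exit
avahi-browse -rt _raop._tcp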

I would note that if you are geeky enough to achieve this, think twice before building them for your friends to save them a few quid. If you build and supply it you will have to support it, and you won’t have the option of sending them to the Apple Store should it go wrong. I speak as a reluctant Apple help desk for many of my friends and family; I certainly will not be making any of these little rods for my own back for anyone who can’t do it themselves :)

Portable wireless boombox

Despite this little triumph I don’t actually require an Airplay device at the moment. I have two already and no need for a third, so while this is useful it’s not especially useful to me as a home device right now. What I want to do is take this project further and build a portable wireless boombox.

This would be a self-contained system which depends on nothing more than a 12 volt power source (car battery, boat, caravan, solar panels, mains adaptor or a collection of D-cell batteries). It would provide its own wireless network, to which users connect their Airplay devices. It would contain a small power amplifier and a pair of speakers. I’ve found a power amplifier that even has a USB port from which I can power the Raspberry Pi, saving me having to worry about stepping down from 12 volts to 5 volts.

Because it isn’t intended for connection to an existing wireless infrastructure, it could be used anywhere there’s a 12 volt power source: great for camping, barbecues, boats, festivals or simply down at the bottom of the garden. I’ve identified the parts I will need (and indeed ordered most of them), but my biggest challenge remains what sort of box to build to house them and how to manufacture it. I’ve a feeling my prototype won’t be particularly pretty, even if entirely functional.

I’ll keep you posted on this project as I make progress.


Mac, Apache, MySQL and PHP (MAMP)

Mac OS X serves as an excellent development environment, even if you are not actually developing Mac OS or iOS applications. It is the darling of many a LAMP (Linux, Apache, MySQL and PHP) developer who enjoys a slick desktop operating system with good UNIX-like underpinnings but doesn’t necessarily want to put up with the various limitations and complications of running a Linux desktop, consistent improvements in this regard over recent years notwithstanding.

The only trouble with this is that if you want to develop LAMP applications and work on a Mac, you’ve traditionally needed a two-box setup: a Mac on your desk and Linux on a development server. For many this isn’t an issue, and indeed with a team of developers it’s optimal, but what if you wanted a self-contained development environment restricted to just one box? What if you wanted that box to be your laptop, so you could take it anywhere?

Solutions

“Virtual machine!”, I hear you cry. Yes, this is a possible solution, and for many it works well. Good virtualisation software is free these days, but using a local VM has drawbacks. Not only does it consume a large slice of your RAM, it also puts a lot of strain on the CPU, meaning that if you are running off your battery, your battery life will suffer. It’s also cumbersome: you have to start up the VM when you need it, and there can be complications with networking; for example, if you have connected to a public wireless network it’s possible that your VM might not be extended the same access.

There is a software package for Mac OS called MAMP (the M for Mac OS replacing the L for Linux). This is a point-and-click installer which bundles Apache, MySQL and PHP for installation on Mac OS. I don’t like this solution, for a number of reasons, including:

  1. Limited functionality unless you “go pro” (at quite considerable cost). Any self-respecting developer will require multiple virtual hosts as a minimum and won’t need or want a clicky-button interface to get what they want.
  2. You are entirely at the mercy of the distributors of MAMP with regards to component software versions that are made available to you and when.

Alternative solution

There’s an alternative to this. You don’t have to fork out £39 for a package of what is otherwise freely and widely available software. With the help of my friend and colleague Ben Nimmo, I present the following assembled and tested instructions for turning your Mac into a native MAMP server without using the packaged download.

MySQL

  1. Download the latest MySQL .dmg and install both the *.pkgs within it (don’t use the TAR/GZ archives). You may wish to install the Workbench too; it’s really good these days.
  2. Find where the mysql.sock file is expected to be in /etc/php.ini (should be /var/mysql/mysql.sock)
  3. Create the folder and link the socket file to the expected location.
sudo mkdir /var/mysql
sudo ln -s /private/tmp/mysql.sock /var/mysql/mysql.sock
  4. Add MySQL to the command line by editing /Users/username/.bash_profile, adding this line, and then either restarting Terminal or source-ing the file:
export PATH=$PATH:/usr/local/mysql/bin

PHP

PHP comes with Mac OS, so it’s not necessary to download and install it; however, there are a couple of steps needed to configure it:

  1. Copy the default php.ini file:
sudo cp /etc/php.ini.default /etc/php.ini
  2. Edit /etc/php.ini and uncomment this line to enable xdebug (not essential, but recommended):
zend_extension="/usr/lib/php/extensions/no-debug-non-zts-20090626/xdebug.so"

Apache

Apache too comes with Mac OS, so again there’s no need to download and install it. Its configuration, however, is a little more complex, though nothing scary. The configuration described here provides a special Apache “sandbox” environment for your projects, using the existing “Sites” directory in your Mac OS home directory.

  1. Create a subdirectory in this directory for each of your projects, ensuring that the directory name does not contain any characters that would be illegal in a URL. Within each of these subdirectories create another subdirectory called “web”; this will become the web root of each project. The extra subdirectory is in case you wish to use a framework which keeps some of its files outside the web server root (Symfony is a good example of this). There’s an example layout after these steps.
  2. Create a subdirectory called “logs” in your “Sites” directory; Apache will maintain two log files, access and error, for all the sandbox sites.
  3. Enable PHP5 with Apache by editing /etc/apache2/httpd.conf and uncommenting the following line:
LoadModule php5_module libexec/apache2/libphp5.so
  4. Change the user and group to your username and “staff” respectively, also in /etc/apache2/httpd.conf:
User sbf
Group staff
  5. While still in /etc/apache2/httpd.conf, find the following configuration and change “Deny from all” to “Allow from all”:
<Directory />
    Options FollowSymLinks
    AllowOverride None
    Order deny,allow
    Deny from all
</Directory>
  6. Create and edit /etc/apache2/users/user.conf with the following, changing “sbf” to your username:
<VirtualHost *:80>

    ServerName dev.local
    DocumentRoot /Users/sbf/Sites/

    RewriteEngine on
    RewriteLogLevel 1
    RewriteLog /var/log/apache2/rewrite.log

    # sites in the format http://[site].dev.local
    RewriteCond %{HTTP_HOST} ^[^.]+\.dev\.local
    RewriteCond %{REQUEST_URI} !^/error/.*
    RewriteCond %{REQUEST_URI} !^/icons/.*
    RewriteRule ^(.+) %{HTTP_HOST}$1 [C]
    RewriteRule ^([^.]+)\.dev\.local/(.*) /Users/sbf/Sites/$1/web/$2

    # Logging
    CustomLog /Users/sbf/Sites/logs/sandbox.access.log combined
    ErrorLog /Users/sbf/Sites/logs/sandbox.error.log

</VirtualHost>
  7. Restart Apache:
sudo apachectl restart
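
Once done, the sandbox layout in your home directory will look something like this (the project names are examples):

Sites/
    logs/
    myproject/
        web/
    anotherproject/
        web/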

Then, for each of your sites, add an entry in /etc/hosts with the format “name.dev.local” pointing to 127.0.0.1, where name corresponds to a subdirectory in your “Sites” directory. Don’t forget that the public subdirectory of each site is assumed to be “web”, so make a symlink to this if the framework you use has a different convention.
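
Continuing the example project names above, the /etc/hosts entries would be:

127.0.0.1    myproject.dev.local
127.0.0.1    anotherproject.dev.local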

You should then be able to access each of your sites from URLs using the convention http://name.dev.local/ – where “name” again is a subdirectory within your “Sites” directory.

I’ve tested this setup procedure and It Works For Me [tm]. If, however, it doesn’t quite work for you as described, please let me know where it goes wrong and how, if you were able, you resolved it, and I will update these instructions accordingly.


Call centre HUD for Asterisk

At work every fortnight or so we hold what we call Evenings Of Code. The company buys the development team pizza and beer and in exchange we work on projects and ideas that are of particular interest to us and that the company would benefit from, but doesn’t really need right now, so normal weekday work time can’t be allocated to them.

My recent project has been to create a HUD (Heads-Up Display) for the call centre. Of my team I’m particularly suited to this because I have quite a lot of experience with Asterisk, the popular open-source PBX on which the company’s telephone system is based. The system, designed for a 1080p display (we’ve not yet decided whether to go for a plasma or a projector), uses call detail records (CDRs) and the Asterisk Manager interface to gather statistics on the day’s calls. It’s not quite realtime, but the key queue statistics from the Manager are updated every five seconds, and the cumulative statistics based on the CDRs since midnight every 20 seconds.

A script runs constantly in the background during the working day (8.00am to 8.00pm) to perform the updates, saving the results in a database table. The HUD itself is loaded in a browser and uses AJAX calls every five seconds to read the contents of this table; this allows multiple instances of the HUD to run without each hammering the CDRs and the Manager. At the moment, as it’s in testing, it’s being loaded into web browsers by certain staff members, so it initially loads at quarter size. When we finally put it up on the wall we’ll hover over the top right corner and an option will appear to enlarge it to full 1080p, basically by doubling the values of key CSS properties.
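
The browser side of this is nothing exotic. A minimal sketch of the polling loop in jQuery (the stats.php endpoint and the field names are illustrative, not the actual code):

// fetch the pre-computed statistics every five seconds and
// push the values into the relevant elements on the HUD
setInterval(function () {
    $.getJSON('stats.php', function (stats) {
        $('#calls-waiting').text(stats.callsWaiting);
        $('#calls-answered').text(stats.callsAnswered);
    });
}, 5000);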

It looks slick too. Because it uses AJAX there are no page loads; you would expect nothing less in this day and age. If a digit changes after an update, the old digit is faded out before the new digit is faded in; if a digit doesn’t change between updates it simply remains static.

We’re now trying to ascertain whether my algorithms for calculating the statistics are accurate, using a bit of manual record keeping on the part of the call centre staff, who have been asked to record the number of calls they take per day for a week or so, independently of the records on the PBX. I’m confident that I’ve got it right, however, and I’m actually quite pleased with the information I’ve been able to calculate, given that Asterisk CDRs aren’t all that great to work with.

Assuming the project gets full approval following testing, my intention is to get it running on a Raspberry Pi, which can be strapped to the back of the screen or on top of the projector, or easily hidden in the ceiling in either case. Given that modern screens and projectors come with USB ports, they will even be able to power the Raspberry Pi. This means that the screen or the projector, whichever is chosen, represents the largest cost of the project, as the Raspberry Pi eliminates the need to dedicate a whole PC to the purpose. My research indicates that the Raspberry Pi is capable of running Google Chrome, so that’s all good.
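
If it comes to it, something like this (assuming Raspbian’s chromium package and a hypothetical internal URL) should be all the “kiosk” setup the Pi needs:

# launch the browser full-screen with no window chrome
chromium --kiosk http://hud.example.internal/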

Screenshot of the system during a quiet period (no calls in the queues)
