User Journal

Journal: How Microsoft Fights for Hearts & Minds

Journal by PhillC

Just received the following email in my corporate inbox (my real company name replaced with "My Company Name"):

Microsoft Office at Home

"My Company Name" is pleased to announce a new scheme for employees arranged by the IT Department.

"My Company Name" has an agreement for software licensing with Microsoft. Because of that agreement any "My Company Name" employee is eligible to participate in the Microsoft Home Use Program (HUP). This program enables you to get a licensed copy of Microsoft Office to install and use on your home computer, for the price of the CD, postage and packing (credit card required online).

What is the Home Use Program?

The Microsoft Home Use Program is a benefit of one of Microsoft's volume licensing programs that "My Company Name" has procured. It provides a simple way for staff to work at home with the same Microsoft products they use at work.

This really removes many of the incentives for the average user (within my company) to bother with OpenOffice, when they can get MS Office for the cost of the CD and packaging - which incidentally costs £17.31 (about US$38).

User Journal

Journal: Installing Tcl-0.95 module from CPAN

Journal by PhillC

On my new Debian Testing system, I have installed Tcl8.5 and Tk8.5, rather than the older Tcl8.4 and Tk8.4.

This has led to problems when trying to install the Tcl-0.95 module from CPAN - http://search.cpan.org/dist/Tcl/

The errors I was seeing looked like the following:

> perl Makefile.PL
> tclsh=/usr/bin/tclsh
tcl_library=/usr/share/tcltk/tcl8.5
tcl_version=8.5
LIBS = -ltcl8.5
Use of uninitialized value in concatenation (.) or string at Makefile.PL line 204.
INC =
DEFINE =
Use of uninitialized value in string at Makefile.PL line 220.
Checking if your kit is complete...
Looks good
Writing Makefile for Tcl

Make then subsequently fails with:

> make
cp Tcl.pm blib/lib/Tcl.pm
/usr/bin/perl /usr/share/perl/5.8/ExtUtils/xsubpp -typemap /usr/share/perl/5.8/ExtUtils/typemap -typemap typemap Tcl.xs > Tcl.xsc && mv Tcl.xsc Tcl.c
Please specify prototyping behavior for Tcl.xs (see perlxs manual)
cc -c -D_REENTRANT -D_GNU_SOURCE -DTHREADS_HAVE_PIDS -DDEBIAN -fno-strict-aliasing -pipe -I/usr/local/include -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -O2 -DVERSION=\"0.95\" -DXS_VERSION=\"0.95\" -fPIC "-I/usr/lib/perl/5.8/CORE" Tcl.c
Tcl.xs:33:17: error: tcl.h: No such file or directory
Tcl.xs:127: error: expected ')' before '*' token
*snip* Lots more similar errors
make: *** Tcl.o Error 1

I overcame this by digging around to find exactly where my Tcl config file resided and what the correct include path was. My new Makefile.PL command looked like this:

>sudo perl Makefile.PL --tclsh /usr/bin/tclsh8.5 --tclconfig /usr/lib/tcl8.5/tclConfig.sh --include /usr/include/tcl8.5

The module was then made and installed without error. I think it was probably the inclusion of the include path that did the trick, as that's where the missing tcl.h is to be found.
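For anyone hitting the same wall, the fix boils down to three paths. A minimal sketch (the locations are Debian-specific assumptions; on Debian, dpkg -L tcl8.5-dev will list the real ones):

```shell
#!/bin/sh
# Sketch: the three paths Makefile.PL needed, assembled into the argument
# string used above. These locations are Debian-specific assumptions --
# adjust them for your distribution.
TCLSH=/usr/bin/tclsh8.5
TCLCONFIG=/usr/lib/tcl8.5/tclConfig.sh
TCLINC=/usr/include/tcl8.5          # this is where the missing tcl.h lives

ARGS="--tclsh $TCLSH --tclconfig $TCLCONFIG --include $TCLINC"
echo "perl Makefile.PL $ARGS"
```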

User Journal

Journal: Quick FFMPEG Codec Comparison Test: x264 vs xVid

Journal by PhillC

Today I took the time to run a quick comparison test using FFMPEG for producing content with the x264 and xVid codecs. x264 produces H.264 content, while xVid encodes in MPEG4. Frankly, I was a little surprised with the results.

The input file was sourced from BBC Motion Gallery. It was an MPEG2 Program Stream with I-frames, encoded at approximately 50Mbps. It also contained a single MP2 audio track at approximately 356kbps. The clip was chosen because it is only 2 seconds long, so I could transcode it quickly, and there is lots of movement, so I expected artifacts.

The output file container is QuickTime MOV, the video bitrate around 2Mbps, and the audio codec AAC (through libfaac) at 128kbps. I performed a 1-pass and a 2-pass encode with both x264 and xVid. The output file was to be 720x404 in size, which is a 16:9 aspect ratio. All files were played back on a Windows XP machine using QuickTime Player 7.1.3. Some cropping of the original MPEG2 was required to remove VITC at the top and some black padding left and right. An example FFMPEG command line I used is outlined below. As you can see, there are very few optimisations beyond the base settings.

FFMPEG command for the second pass x264 encode:

ffmpeg -i 2573-9.mpg -vcodec libx264 -f mov -b 2000k -acodec libfaac -ab 128k -s 736x442 -croptop 34 -cropbottom 4 -cropleft 8 -cropright 8 -deinterlace -pass 2 2573-9_h264.mov
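Only the pass-2 line is shown above; the matching pass-1 line below is an assumption, following ffmpeg's usual two-pass convention (identical options with -pass 1). The sketch also checks the crop arithmetic: scaling to 736x442 and cropping 8+8 from the width and 34+4 from the height gives the 720x404 frame mentioned earlier.

```shell
#!/bin/sh
# Sketch of the full two-pass x264 encode. The pass-1 command is an
# assumption (same options with -pass 1); nothing is executed here,
# the command lines are just assembled.
COMMON="-vcodec libx264 -f mov -b 2000k -acodec libfaac -ab 128k \
-s 736x442 -croptop 34 -cropbottom 4 -cropleft 8 -cropright 8 -deinterlace"
PASS1="ffmpeg -i 2573-9.mpg $COMMON -pass 1 2573-9_h264.mov"
PASS2="ffmpeg -i 2573-9.mpg $COMMON -pass 2 2573-9_h264.mov"

# Crop arithmetic: 736 - (8+8) = 720 wide, 442 - (34+4) = 404 high.
W=$((736 - 8 - 8))
H=$((442 - 34 - 4))
echo "output frame: ${W}x${H}"
```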

The final output file sizes are as follows:

  • x264 1 pass: 599.558 kilobytes
  • x264 2 pass: 553.767 kilobytes
  • xVid 1 pass: 577.232 kilobytes
  • xVid 2 pass: 559.947 kilobytes
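From the sizes above, the second pass's saving can be computed directly (a quick sketch using awk for the floating-point division):

```shell
#!/bin/sh
# How much the second pass saved relative to the first, per codec,
# using the file sizes listed above (in kilobytes).
x264_saving=$(awk 'BEGIN { printf "%.1f", (599.558 - 553.767) / 599.558 * 100 }')
xvid_saving=$(awk 'BEGIN { printf "%.1f", (577.232 - 559.947) / 577.232 * 100 }')
echo "x264 2-pass saved ${x264_saving}% over 1-pass"   # 7.6%
echo "xVid 2-pass saved ${xvid_saving}% over 1-pass"   # 3.0%
```

So the second pass buys x264 roughly two and a half times the size reduction it buys xVid on this clip.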

So, while the 1-pass xVid file is smaller than the 1-pass x264 file, the 2-pass x264 file is smaller than the 2-pass xVid file. Due to the short duration of the input file, no real comparison of encoder speed could be made. However, anecdotally from other ad hoc encoding jobs, xVid does seem to be quicker.

Now, to the real proof of the pudding: what was the quality like? Have a look at the following image file:

http://farm3.static.flickr.com/2331/2166176406_44357465ed_o.png

This is where I was surprised. The x264 files are on the left: 1 pass bottom left, 2 pass top left. The xVid files are on the right: 1 pass bottom right, 2 pass top right. Clearly, the 1-pass xVid file is superior to the 1-pass x264 file. I also believe that the 2-pass xVid file is slightly better quality than the 2-pass x264 file. Areas for close inspection (we're looking at the two top files here):

  • Bottom right corner. There is more blocking and there are more artifacts in the x264 file.
  • Top right wing tip. The xVid file has better definition here.
  • The underside of the wings and tail. Again the xVid file has better definition.
  • The background fire. I think the xVid file has fewer artifacts and better definition in general.
  • As there aren't a lot of colours in the example video, it's hard to say which codec handles colour better, but there is obviously more depth in each of the 2 pass samples, compared to the 1 pass output.

That's pretty much it. I was surprised that, to my eye, the xVid content appeared superior to the x264 output. Perhaps with a more complex FFMPEG command and options this would have changed.

I'd be interested to hear others' thoughts and opinions on these two codecs.

User Journal

Journal: FFMPEG Bash script to create H.264/MPEG-4 files

Journal by PhillC

I've spent some time putting together a Bash script for creating H.264 and MPEG-4 files, using x264 and xvid codecs, with FFMPEG.

The script takes you through a few questions, such as output filename, output container, video codec, height, width, bitrate, audio codec, cropping requirements, etc.

You should just be able to follow the user prompts. Some are multiple choice, some you have to type things in.

The script is located here:

http://www.kapitalworks.com/projects/ffmpeg/kapitaltranscode

Copy and paste this into a text document saved somewhere locally. For the sake of this post, let's call the file "kapitaltranscode" without an extension.

Give the file execution permissions:

chmod 755 kapitaltranscode

Move the file to your path for ease of execution later:

mv kapitaltranscode /usr/bin

You can then execute the script simply by typing "kapitaltranscode" at your terminal prompt.

You will need to specify an input video. So usage is:

kapitaltranscode video_name.avi
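The script itself lives at the URL above; as a rough sketch of its question-and-answer structure (the function and defaults here are hypothetical, not taken from the real script):

```shell
#!/bin/sh
# Hypothetical sketch of the kind of prompt-driven logic the script uses:
# map a chosen container to a sensible default video codec.
default_vcodec() {
    case "$1" in
        mov|mp4) echo libx264 ;;
        avi)     echo libxvid ;;
        *)       echo "unknown container: $1" >&2; return 1 ;;
    esac
}

# In the real script the choice would come from a user prompt, e.g.:
#   printf "Container (mov/mp4/avi): "; read container
container=mov
echo "video codec: $(default_vcodec "$container")"
```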

There are a couple of other things you need to know.

Obviously this script depends on FFMPEG having been built in such a way as to allow x264 and xVid transcoding. This is how I did it on Debian Etch:

http://slashdot.org/~PhillC/journal/190325

Within the FFMPEG Tools directory (/ffmpeg/tools) is a qt-faststart file. You'll need to make this separately:

make qt-faststart

For the script to work this file needs to have execute permissions and also be in your user path. See the commands above regarding how to do this with the script itself.

qt-faststart allows for a QuickTime file to be played back over the Internet as a Progressive Download file. That is, it will start playing before the whole file has been downloaded. This option will only appear in the script if you've chosen a mov or mp4 container.
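As a usage sketch (the filenames are examples; qt-faststart takes an input file and an output file):

```shell
#!/bin/sh
# qt-faststart rewrites a QuickTime file so its index (moov atom) comes
# first, enabling progressive download. Example invocation; nothing is
# executed here, the command line is just assembled.
IN=output.mov
OUT=output-faststart.mov
CMD="qt-faststart $IN $OUT"
echo "$CMD"
```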

That's pretty much it. The script isn't very attractive. I could have written it much nicer using functions I guess. Initially I wrote it only for my own use and creating x264 QuickTime mov files with aac audio. So, this is the output that has been most extensively tested.

It's up to you, the user, to choose audio codecs that work with your video codec choices. There's no checking of this in the script. For example, the script lets you choose aac audio with an avi file container, which produces only silence.

I'd really like feedback. Is this a useful script? What else could be added to it? More formats perhaps? I didn't include Ogg Theora, because there is already ffmpeg2theora. But I'm happy to include it if it's a useful addition. There's no aspect ratio conversion, but it could be added. Can my ffmpeg execution be improved?

Thanks and good luck!

User Journal

Journal: Install FFMPEG on Ubuntu Gutsy

Journal by PhillC

I wanted to install FFMPEG on my Ubuntu Gutsy Gibbon (7.10) desktop machine. This is so I can encode and transcode video files to various formats locally, and also render projects from the non-linear editor (NLE) PiTiVi.

This post will mainly cover just the commands I used to install FFMPEG on Gutsy, with very little commentary regarding why or how things work. If you want a more in-depth look at installing FFMPEG, you can read about the installation of FFMPEG on my Debian Etch server earlier today - which ultimately moves me closer to on-the-fly video transcoding of user-submitted content on Kapital Moto TV.

Installing FFMPEG on Ubuntu Gutsy:

sudo apt-get build-dep ffmpeg

sudo apt-get install liblame-dev libfaad2-dev libfaac-dev libxvidcore4-dev liba52-0.7.4 liba52-0.7.4-dev libx264-dev libdts-dev checkinstall build-essential subversion

svn checkout svn://svn.mplayerhq.hu/ffmpeg/trunk ffmpeg

cd ffmpeg

make distclean (I used this because I already had an older SVN snapshot of FFMPEG downloaded, configured and made from last night)

./configure --enable-gpl --enable-pp --enable-libvorbis --enable-libtheora --enable-liba52 --enable-libdc1394 --enable-libgsm --enable-libmp3lame --enable-libfaad --enable-libfaac --enable-libxvid --enable-pthreads --enable-libx264

make

sudo checkinstall

Some things you might want to do when prompted to by checkinstall:

  • Select 0 change maintainer name
  • Select 3 to set version name. I used svn11213-ubuntu-gutsy-20071213

And that's it: FFMPEG installed on Ubuntu Gutsy.
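For convenience, the steps above can be collected into one script. This is a sketch rather than a tested whole; the package names are as given for Gutsy and will differ on later releases:

```shell
#!/bin/sh
# Collect the Gutsy install steps above into a single script so the
# order is explicit. Nothing is run here; the script is only written out.
cat <<'EOF' > /tmp/install-ffmpeg-gutsy.sh
#!/bin/sh
set -e
sudo apt-get build-dep ffmpeg
sudo apt-get install liblame-dev libfaad2-dev libfaac-dev libxvidcore4-dev \
    liba52-0.7.4 liba52-0.7.4-dev libx264-dev libdts-dev checkinstall \
    build-essential subversion
svn checkout svn://svn.mplayerhq.hu/ffmpeg/trunk ffmpeg
cd ffmpeg
make distclean || true   # harmless on a fresh checkout
./configure --enable-gpl --enable-pp --enable-libvorbis --enable-libtheora \
    --enable-liba52 --enable-libdc1394 --enable-libgsm --enable-libmp3lame \
    --enable-libfaad --enable-libfaac --enable-libxvid --enable-pthreads \
    --enable-libx264
make
sudo checkinstall
EOF
echo "wrote /tmp/install-ffmpeg-gutsy.sh"
```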


User Journal

Journal: Install FFMPEG on Debian Etch

Journal by PhillC

I have to say that the video manipulation program FFMPEG, while very powerful, is not very user-friendly when it comes to installation. While many Linux programs can be happily installed either from a pre-compiled package or by downloading the source and compiling it yourself, this isn't necessarily the case with FFMPEG. The ease of FFMPEG installation largely depends on how many different video codecs and containers you want to be able to input or output: the greater the number, the harder the installation becomes. My main need was for FFMPEG to accept a wide range of input formats, while outputting H.264 encoded QuickTime (MOV) files. Here's how I achieved this on a Debian Etch server.

I'm going to assume that you are familiar with using the Linux command prompt, moving between directories, editing text files and have at least some experience compiling programs.

The first thing I would recommend doing is making an addition to your source repository lists.

pico /etc/apt/sources.list

Add the following line:

deb http://www.debian-multimedia.org stable main

This repository contains some essential libraries for xvid and x264 (an open source H.264 codec) amongst other things. You'll need to install some software from here. The software may well be available from other repositories too, that are already in your sources.list file, but add this one to be safe.

Next rebuild your sources:

apt-get update

I would also recommend installing checkinstall. This program can be used instead of a regular "make install" command and produces a deb package file that will make re-installation or multiple machine installs much easier. If checkinstall isn't already on your machine download it from:

http://www.asic-linux.com.mx/~izto/checkinstall/download.php

Maybe navigate here with lynx, maybe use wget once you've found the actual file you need, maybe download it with a GUI based web browser and then copy it to your desired directory. It's your choice. I grabbed the latest .deb package. After the download, execute the following as root:

dpkg -i checkinstall_1.6.1-1_i386.deb

Checkinstall should have happily installed on your system. Now it's time to really get into FFMPEG.

Build the dependencies:

apt-get build-dep ffmpeg

Next we're going to install a whole lot more useful software that will allow FFMPEG to output many more than just the minimal file types.

apt-get install liblame-dev libfaad-dev libfaac-dev libxvidcore4-dev liba52-0.7.4 liba52-0.7.4-dev libx264-dev build-essential subversion

We've also ensured that you have the necessary tools installed to compile from source (build-essential) and obtain files from the Subversion version control repositories.

We're ready to checkout FFMPEG itself:

svn checkout svn://svn.mplayerhq.hu/ffmpeg/trunk ffmpeg

At the time of writing the latest revision was 11212. If you'd feel more comfortable not using the latest bleeding edge version of FFMPEG, issue the Subversion command as follows:

svn checkout -r 11212 svn://svn.mplayerhq.hu/ffmpeg/trunk ffmpeg

This will ensure that you are also downloading the 11212 revision. Once downloaded, move into the ffmpeg directory (cd ffmpeg) and configure:

./configure --enable-gpl --enable-pp --enable-libvorbis --enable-liba52 --enable-libdc1394 --enable-libgsm --enable-libmp3lame --enable-libfaad --enable-libfaac --enable-pthreads --enable-libx264 --enable-libxvid

So, what have we done here? Each --enable flag compiles in support for an additional library, such as the libmp3lame, libfaac, libx264 and libxvid encoders.

The essence of this information, and many more options, can be found by typing ./configure --help first.

(You might also consider including libtheora in your configuration, but I forgot at the time)

We're now ready to make the installation files, so at the command prompt:

make

If something goes wrong, and you need to start again, a useful command to know is:

make distclean

Make sure you do this first and then run the configure command again.

And finally:

checkinstall

You will be asked a few questions, which should be straightforward enough to answer - yes to creating the documentation, choose a name, select D for Debian package, lastly select number 3 and type a version name that means something to you. Mine was svn11212-etch-20071213. Checkinstall will now create a Debian package of FFMPEG, bespoke for your system with the configuration options you've selected earlier. Checkinstall WILL NOT install the package, so don't forget to do that:

dpkg -i ffmpeg_svn11212-etch-20071213-1_i386.deb

With some small amount of luck, you should now have a working version of FFMPEG installed on your Debian Etch server. You will be able to output H.264 encoded files in a variety of containers.
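One way to sanity-check the build is to grep ffmpeg's own codec listing for the libraries you enabled. The sketch below runs the check against a stub of that listing, since the exact output format varies between ffmpeg revisions; on a real system you would pipe the output of `ffmpeg -formats` in instead:

```shell
#!/bin/sh
# Check that the encoders we configured actually made it into the build.
# ffmpeg_output is a stand-in for real `ffmpeg -formats` output (the
# real format varies by revision -- this is an illustrative stub).
ffmpeg_output="
 DEV    libx264
 DEV    libxvid
 DEA    libfaac
"
check() {
    echo "$ffmpeg_output" | grep -q "$1" && echo "$1: ok" || echo "$1: MISSING"
}
check libx264
check libfaac
```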

Now the fun part really begins as you spend days tinkering with commands to output the best possible files. Documentation for using FFMPEG can be found at:

http://ffmpeg.mplayerhq.hu/ffmpeg-doc.html

Have fun!

(Credit for getting me started in the right direction goes to Paul Battley and his FFMPEG Ubuntu Feisty install how-to)

User Journal

Journal: Blasting Away at Windows

Journal by PhillC

On Sunday I took the final plunge and completely removed Windows XP from my laptop, replacing it with Ubuntu Studio. It's not quite how I planned it, but that's how it worked out.

I was having some problems booting Ubuntu from the external USB drive I was using. Desktop icons would strangely disappear and applications would freeze. I decided it was time to put Linux directly on the laptop's hard drive. I have a Dell Inspiron 6400: Intel Core 2 Duo T5500 1.66GHz processor, 1GB RAM, ATI graphics card and a 120GB hard drive. I also decided, in my wisdom, to try a different flavour of Ubuntu and downloaded the new GeUbuntu ISO.

What's GeUbuntu? It's basically Ubuntu plus Enlightenment. I like the way Enlightenment looks so I was prepared to give it a shot. In short, I screwed up the hard drive partitioning. The GeUbuntu installer froze for more than 30 minutes at 0% of the hard drive partitioning. I finally decided to exit and reboot, to be greeted by the bad news that I had a blank hard drive. Fortunately, I'd backed up all the important files from my Windows XP install prior to attempting this. Sure, I could have probably recovered XP, but ultimately I decided to just get on with Linux. I inserted the XP install disk, booted to a DOS prompt and formatted c:.

Back to the GeUbuntu install and everything went smoothly this time. Once up and running I quite liked many aspects of GeUbuntu, but after half a day's playing I was finally too frustrated by the small things. For example, MPlayer wouldn't launch, just popped up an error screen. The same behaviour was witnessed for video editor PiTiVi after installation via Synaptic. Logging out of the system also always resulted in a pop-up telling me that certain applications were taking too long to close and did I want to continue; this was with no visible, user initiated applications open. So I gave up.

Back to Ubuntu Studio for me. Installing this Ubuntu flavour to the hard drive was a piece of cake and I was up and running after about an hour, first thing Monday morning. Ubuntu seemed to recognise pretty much everything the laptop had to offer, so no problems there. However, the first improvement I did make was to install the ATI graphics driver from the restricted drivers list. I could have achieved this in a number of ways, such as following the manual tutorial, installing Envy or just selecting the restricted driver from the list under Administration. Easily done, and now the laptop is in widescreen 1280x800 mode.

There are a few things that I may still need Windows for. One of them is synchronising my iPhone. iTunes is not available for Linux and, as my iPhone is not "jailbroken", I don't believe there is any way to synchronise with any native Linux applications. iTunes 7 is known not to work with Wine, so I didn't bother attempting that route. Instead I installed VirtualBox. Downloading, installing and setting up VirtualBox was straightforward and I had a Windows XP virtual machine running in no time. However, as detailed in many, many threads I subsequently discovered, there's a problem with VirtualBox (and VMware) recognising the iPhone's USB connection. In short, Windows seems to recognise the iPhone's USB drivers and then VirtualBox crashes and exits. So, it appears for now I'm stuck with the music and contacts on my iPhone in their present state.

As an aside, the F-Spot personal photo manager comes standard as part of the Ubuntu Studio application set. With F-Spot it is possible to access and transfer any photos stored on the iPhone.

I also followed some pretty straightforward and simple steps to decrease the required boot time, from 27 seconds to 23 seconds according to bootchart.

So, the current state of play is roughly:

  • Ubuntu Studio installed with all included applications. Internet works.
  • ATI graphics driver installed from the restricted drivers list
  • VirtualBox installed and a Windows XP Virtual Machine available
  • iPhone not working with XP VM. Connection of iPhone causes the VM to fatally crash.
  • GStreamer codecs installed via Synaptic to facilitate playback of a broad range of videos.
  • PiTiVi upgraded to latest version 0.11.1, by downloading the tarball, configuring, making and installing. Unfortunately, the application will not start.

At the moment there is no great urgency to go back to Windows, but there are some annoying problems that I can probably only live with until after Christmas if they aren't resolved. Gotta get that iPhone synched!

User Journal

Journal: Breaking out of Jail

Journal by PhillC

Last night I spent a few hours attempting to "jailbreak" my iPhone. Essentially, after a "jailbreak" the iPhone is in a mode that will allow the installation of native third party applications. The main problem is that all current jailbreak methods require the phone to be running Apple's firmware version 1.1.1. All new iPhones sold since the beginning of November run firmware 1.1.2. Upgrading older phones through iTunes also sets them to firmware 1.1.2.

So, the first step is to try downgrading existing firmware from 1.1.2 to 1.1.1. It's relatively easy enough and I won't go into all the details here. I was following the steps in the following, slightly confusing, tutorial on the Hacktint0sh forums:

http://hackint0sh.org/forum/showthread.php?t=18353

See that line right near the end that says, "This will not work with brand new out of the box iPhones that already come with firmware version 1.1.2"? Well I didn't read that first and set off down the road to firmware downgrade.

In actuality, I managed to successfully downgrade to 1.1.1, restored the phone using the Windows command line utility iPhuc, connected to the Net with WiFi, loaded the AppSnap installer and installed a few choice programs. Happy days it would seem! Not quite. My iPhone did not re-activate with O2. Essentially I now had a rather expensive Internet tablet device that only connects with WiFi. I'd lost the essential phone functionality and thus a great deal of the data connectivity with it.

I attempted a hard reboot of the iPhone, a step I'd needed to take 24 hours after the phone was first activated with O2, before service was received. Last night, however, this step simply put the phone back in restore mode. After going through the whole firmware installation process again, there was still no network activation.

Around 1am, I decided to restore the factory settings, which is a helpful option in iTunes. Sure enough, my phone was back on the O2 network pretty much as soon as the firmware was back to the original 1.1.2. To be clear, that's not 1.1.2 upgraded from 1.1.1: I had to completely restore the base factory settings, which effectively wiped all my data on the iPhone, meaning email had to be re-configured, Safari bookmarks were gone and all those helpful SMS conversation threads trashed. Fortunately, syncing the phone with iTunes meant that contacts and music were easily recovered.

Just as the above tutorial states at the very end, it doesn't look like iPhones that came from the factory with firmware 1.1.2 can currently escape jail and retain their mobile network connectivity. I'll just keep waiting for a simple one-click 1.1.2 jailbreak that doesn't require firmware downgrades first.

User Journal

Journal: Wide Blue Sea in a Shell

Journal by PhillC

After recently acquiring my new gadget toy, that also conveniently doubles as a mobile phone and portable music player, there were some things I wanted to do with this device. My earlier post outlined my disappointment at not being able to remove certain applications. I've also been waiting patiently for someone to hack the newest firmware, allowing me to unlock the device and thus install applications. I've been waiting two weeks now!

So, I needed to find my own solutions. One thing I wanted was a terminal program that would provide me with secure access to my web server. Since I can't install native applications, I'd have to do something via a browser interface.

There's a couple of third party, ready-to-use, online solutions out there that take your IP address, username and password, then try to connect via SSH to your server. An example can be found at the following address: http://churchturing.org/w/iphone-ssh/. Giving my server login details to an unknown online service? I don't think so!

The alternative was to install something like Ajaxterm, Anyterm or WebShell on the Kapital Moto TV server itself. While there's still some risk here that malicious code in any of those applications could call home, sending my credentials to the bad men, they are all open source applications, so there has to be at least some level of trust and security.

Such a web based terminal application is going to be vulnerable to a brute force attack on my passwords. Then again someone could open PuTTY on their Windows machine, connect to my server's IP address and try exactly the same thing.

I decided to go with WebShell as it didn't require any Apache mods and installation seemed straightforward enough. Of course, things are never quite as they seem.

Here's what's needed to run WebShell:

  • Python - I installed the latest version 2.5.1 direct from python.org
  • OpenSSL - Again I went direct to openssl.org for version 0.9.8g, which is the latest.
  • pyOpenSSL - These are Python extensions for OpenSSL. This is where the problems started. I just could not get this to install directly from source at http://pyopenssl.sourceforge.net/. Instead, I did possibly the easiest thing and just installed the package from the Debian repository with apt-get install python-pyopenssl.
  • WebShell - The important bit. The latest version at the time of writing is 0.9.5, obtained from http://www-personal.umich.edu/~mressl/webshell/. Unzip all the files into a convenient directory and follow the two steps on the Installation page of the above website.

The installation instructions for WebShell are sparse and totally assume you know what you're doing in terms of installing the dependencies Python, OpenSSL and pyOpenSSL.

As a real quick overview, for the totally new, in general installing a Linux application from source can be achieved in the following manner:

  • Download the source file you require. It will most likely end in a .tar.gz, .tar.bz2, .tgz or .zip extension.
  • Copy the file into your home directory on the server.
  • Unzip the file. As a starting point, try a command like tar -xvf filename.extension (substitute in the real filename, of course). This will extract the archive into a local directory. If this doesn't work, use Google to learn more about unzipping files on Linux. You might need to use unzip or bzip2 instead.
  • Navigate into the newly created application directory using cd directoryname
  • You'll now need to be logged in as root or SuperUser. Depending on your system type, su or sudo. You'll be asked for the root password, which you'll need to know before proceeding further.
  • Type ./configure
  • Type make
  • Type make install
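The steps above can be sketched as shell helpers. Note the conventional order: ./configure, then make, then make install as root. The extract_cmd function only prints the command it would run for a given archive name:

```shell
#!/bin/sh
# Sketch of the generic build-from-source steps. extract_cmd maps an
# archive name to the matching extraction command (printed, not run).
extract_cmd() {
    case "$1" in
        *.tar.gz|*.tgz) echo "tar -xzf $1" ;;
        *.tar.bz2)      echo "tar -xjf $1" ;;
        *.zip)          echo "unzip $1" ;;
        *)              echo "unknown archive: $1" >&2; return 1 ;;
    esac
}

extract_cmd Python-2.5.1.tgz   # prints: tar -xzf Python-2.5.1.tgz
# Then, inside the extracted directory, the conventional order is:
#   ./configure
#   make
#   sudo make install   # or run this last step via su
```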

Now all being well you should have successfully installed your new application. These are the very simple steps I followed for installing the latest versions of Python and OpenSSL. If it doesn't work for you, I'm sorry I can't help. Not because I don't want to, but more likely because I can't. Installing from source certainly didn't work for me with pyOpenSSL, so I had to revert to the Debian repositories.

Next, start WebShell using the two commands on their installation page. The first will generate a secure server certificate, the second will start WebShell. Once WebShell is started, it can be connected to remotely via your server's IP address and port 8022, e.g. https://my.server.address:8022. Take note of the S in https. Forget that and of course you won't be able to connect.

Another important thing to remember, if you'd like WebShell to run in the background, is to execute the start command in the following manner:

./webshell.py -d

Without this, WebShell will quit every time you close the regular terminal window.

Note: I'm running Debian 4.0

User Journal

Journal: Got me a New Gadget

Journal by PhillC

With my birthday just a few days away, I've convinced my long suffering better half to plump for my present early. During an otherwise mundane clothes and food shopping excursion on Sunday, we found ourselves in an O2 store fondling an iPhone. I admit to being impressed.

My biggest reservation, without having seen the device itself, was the ease of SMS messaging and using the touch pad keyboard. It is easier than I imagined. I don't have the fattest fingers, but they aren't tiny toothpicks either, and the keyboard works fine. I'm probably a little slower using it than on my old Nokia N70, but in time I'm sure that'll improve. So, for my 34th birthday, I have received a new Apple iPhone.

The device itself is not especially light, weighing a little more than the new Nokia N95 using the "one-in-each-hand" weight test. The iPhone is thin though, about half the thickness of the N95. The screen is big, colourful and glorious. I've never been a particular Apple fanboy, have never owned an iPod and don't have any other Apple products in the household (Although my first personal computer, way back in 1986 or 1987 was an Apple IIe) but I will admit to the iPhone being quite aesthetically pleasing.

The activation process was reasonably straightforward, although when first plugged into my laptop via USB, the iPhone automatically associated itself with the Sonic DVD burner rather than iTunes. No problem, I just uninstalled Sonic as it was OEM software that I never use anyway. Upgrade to iTunes 7.5 and away we go. A few minutes after completing the iTunes based registration process, I received an email saying that O2 had now activated my phone. However, I didn't actually receive service on my phone until some 12 hours later, and this was only achieved by hard rebooting the device.

The iPhone picked up my WEP encrypted home wireless network no problem. After entering my password I was happily browsing away. My first disappointment was moments later. The Kapital Moto TV website uses an embedded QuickTime player to display QuickTime H.264 MOV files. In the iPhone's Safari browser these do not play, it looks like a plugin is missing. This is exceptionally odd to me, given that QuickTime is Apple technology! Somebody didn't really think on that one.

My first real joy with the iPhone wasn't far away though. Setting up my Gmail account in the iPhone's mail client using IMAP settings was a breeze. The mail application works quite well, although you do lose Gmail's message threading capabilities. Reading my Gmail was one thing I did quite regularly on my Nokia N70, via Google's Symbian based mail reader. The iPhone's mail reader is faster and looks better. All good so far.

I thought I'd be frustrated by the inability to install third party applications without unlocking the phone, and maybe I will be in the future. At the moment, I'm OK with it and it stops me cluttering the thing up with slightly useful, but ultimately pointless applications. What I'd really like to see now is an iPhone SSH client for accessing my web server, but this isn't essential and I can probably live without it. What annoys me the most is that I can't remove the YouTube and Stocks icons from the iPhone home menu. I have absolutely zero interest in either. I'll probably take the plunge, invalidate my warranty and jailbreak the phone when the hacks are finished for 1.1.2 firmware - I'm not going to downgrade to 1.1.1.

In summary, this is how my iPhone experience looks so far:

Good:

  • WiFi - works really well and I'm looking forward to making use of free access to The Cloud's hotspots.
  • Email - Gmail on the iPhone with IMAP works a treat.
  • Form factor - it's attractive
  • Threaded SMS Conversations - handy, but might encourage me to keep old, redundant text messages.
  • Landscape - the big screen is great for photo and video viewing. Also not bad for regular web pages using the zoom function.

Bad:

  • Default Home icons - can't remove the useless YouTube and Stocks icons
  • Can't Install New Apps - minor annoyance at the moment, but I'd like an SSH client on my iPhone
  • Can't Remove Apps - only old people use YouTube or want Stock prices in the UK.
  • Slippery - I am going to drop it at some point, but have ordered a rubberised case and screen protector already.
  • Calendar - can't sync Google Calendar with iPhone Calendar application. Seems like it can only be done via a web based app.
User Journal

Journal: Transient Linux Love

Journal by PhillC

After proclaiming success with my Ubuntu Studio install yesterday, the wheels all fell off last night.

Feeling confident, but not entirely happy, with Ubuntu Studio I decided to experiment a little. I've quickly learned that fucking around with Linux leads to fuck ups, especially if you only partially know what you're doing.

I wanted to try KDE. Ubuntu Studio comes with Gnome, but of course KDE is available from Synaptic. The problem is that there are about a million different packages with "KDE" in the title. Instead, from the terminal command line, I typed apt-get install kubuntu-desktop. That all worked fine and KDE packages were duly downloaded and installed. The problem is, it seemed impossible to switch to KDE from Ubuntu Studio Gnome. Choosing the desktop window manager at login didn't produce the desired result. While it looked like KDE was loading, everything suddenly switched back to Gnome.
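For anyone attempting the same thing, the install step really is just one command; the session switch is supposed to happen from the login screen (this is how it's meant to work on Gutsy's GDM - clearly it didn't for me):

```shell
# Pull in the full KDE desktop alongside Ubuntu Studio's Gnome
sudo apt-get install kubuntu-desktop
# When prompted, pick either gdm or kdm as the display manager.
# At the login screen, choose Options -> Select Session -> KDE,
# log in, and optionally make KDE the default for future sessions.
```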

Looking at my installed packages I determined that there were specific Ubuntu Studio Desktop packages. Right, they'd best be removed then. Not an entirely smart idea. After the next reboot, selection of KDE and login, many things had now disappeared entirely! The Applications menu was completely empty to start with.

I wanted KDE. Don't ask why, I just did. Thinking that I'd not really installed much in this OS instance yet (mostly just the ATI driver package, and that was pretty straightforward), I decided to give a KDE specific distro a chance. But which one? After spending some time over at Distro Watch and checking out various distributions' websites, I decided to try Mandriva 2008. It wasn't a Debian based distro, which would probably make life harder as the Kapital Moto TV server runs Debian. I simply saw this as a chance to expand my Linux knowledge.

With the ISO downloaded and burned to CD, I started the fresh install on my external hard drive. Things were looking good. Mandriva 2008 loads as a Live CD, but then installs persistently in about 15 minutes. There aren't many options to choose - language, keyboard, timezone and partitioning information are about it - except it does ask a few questions at the very, very end of the process about the GRUB bootloader.

Once the install was complete, I rebooted and was ready to experience the rapturous joy of Mandriva 2008. It wasn't going to be that easy! GRUB failures. Total GRUB failures. Nothing would boot. Not my Windows OS on the built in hard drive, not my Linux OS on the external disk. So, rinse and repeat, this time choosing different GRUB options at the end of the install process.

No joy there though. I gave it a third chance, choosing LILO this time. Nope. No booting. So, back to the Ubuntu Studio disk it was. The good news here is that Studio now boots again from the external hard drive. Sadly, Windows still doesn't. GRUB has royally screwed my Master Boot Record. Booting into Linux, loading Firefox and turning to Google found the answers I needed. Fortunately I still have my original OEM Windows XP install disk. Boot into this, select R from the first menu and at the command prompt type fixmbr. Easy as that. If it doesn't work for you (if you've stumbled over this post and have a ruined Master Boot Record), just try Google as I did.

With Windows back, I was still stuck with an unwanted Ubuntu Studio install and a tea coaster Mandriva 2008 disk. I'm downloading Kubuntu right now, determined to give KDE a chance.

User Journal

Journal: Giving it to Ubuntu

Journal by PhillC

Many years ago, in the early 1990s, I tried to install a version of Slackware on my old 386 machine. It was a miserable failure. Over the years I've tried Linux again a few times, but never with much success. Something always didn't work correctly and after a couple of days I went back to Windows. This time I promise it'll be different.

I had a 120GB external 2.5" hard drive knocking about, not being used for much. This seemed like an ideal candidate for trying a Linux distro, booting from USB. Ubuntu is in all the press and they have a "Studio" edition specifically for multimedia work. Whether it's any good at multimedia is another question, but downloading the ISO and burning it to disk was easy enough.

The install process went reasonably smoothly. The only thing I had to really think deeply about was which hard drive to install on - was it SCSI1 or SCSI4? I chose correctly, of course, and didn't accidentally destroy my Windows OS. I must admit to also initially screwing up the X windows config. Choosing 1280x800 as the only option for X to try resulted in total load failure. I went through the install process a second time, leaving all the resolution options selected, although there's probably an easier way that I just don't know about.

Then I was in! Ubuntu Studio (Gutsy Gibbon 7.10 of course) in all its glory. The only problem was that X would only load at a maximum resolution of 1024x768. Damn those ATI drivers. The same thing happened with Feisty Fawn and Breezy Badger in the past. However, Ubuntu is now popular enough that other smart people have figured out exactly how to sort this out, and ATI has made the necessary drivers easily available. I followed the user-written How-To on the Ubuntu forums and problem solved! Crisp, clear fonts and a 1280x800 widescreen resolution. Nice.

So, with that out of the way I've customised a few Gnome settings to have a desktop looking closer to how I like it. I've installed some of the GStreamer plugins to ensure various video containers play correctly. I still need to do the same for sound as nothing using AAC currently works. I've had a quick look at the various video editing tools included in Ubuntu Studio and frankly they're all a bit crap. I was hoping PiTiVi might be more advanced than where it currently is. My next likely bet is to install Kdenlive and see how that measures up.
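For the AAC gap, my current understanding (an assumption - check the exact package names with apt-cache search gstreamer) is that the extra GStreamer plugin sets cover it:

```shell
# The "good", "bad" and "ugly" plugin sets between them handle most
# audio and video formats; AAC decoding (via faad) should live in the
# "bad" set on Gutsy
sudo apt-get install gstreamer0.10-plugins-good \
                     gstreamer0.10-plugins-bad \
                     gstreamer0.10-plugins-ugly
```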

I'm sure there's also lots more configuration I'll be doing over the next few days. Maybe even give VirtualBox a chance as there are some Windows apps that I truly do need, that there are no Linux equivalents for - Trackvision for example. For now though, bring it on.

User Journal

Journal: The Search for a Simple Online Backup Service

Journal by PhillC

Before starting this entry, I should note that Kapital Moto TV is currently just a two man operation. I'm the business and online video brains, my partner Antoine is the developer and server admin. Sometimes, depending on time and availability the lines are blurred. Any simple Linux errors in the processes described below are through my own ignorance and lack of knowledge.

It's probably gross negligence, but we haven't really had any sort of automated backup schedule in place for the Kapital Moto TV website. Video files served via the website are "backed up" by a local copy on my home PC. Website code is "backed up" by local copies on Antoine's and my home PCs. These may or may not be the latest code versions. The rule we have in place is that before working on any live code, we download the file from the website first. In this way, we'll always be working on the most recent version. MySQL database "backups" are handled by me remembering to download an extract via PHPMyAdmin whenever I think of it.

Overall, a pretty crap backup regime. So, I decided to do something about it. I wanted automated, online and easy to restore. Perhaps easier said than done, but it didn't need to be.

After much research, and reading this Slashdot article from about a year ago, I decided to use Amazon's S3 service. This seemed to be the cheapest storage available, they've just launched the service in Europe and in my mind it seemed like a pretty safe bet, not likely to disappear anytime soon.

The only problem I could see with S3 is that it's difficult to connect directly to the service. Either you write your own application using their APIs, or you connect through a third party tool. I wasn't about to write my own application, so the two most likely third party options were Jungledisk or S3Sync.

I decided to try Jungledisk first. It all seemed pretty straightforward to begin with. I followed what appeared to be an easy online tutorial, installed Davfs2, mounted S3 as a new disk locally, then attempted to use rsync for copying across data. Initially it seemed to work fine on my couple of small test files. However, when running a "live" test on the 10GB of data that needs backing up, rsync threw errors. Initially I suspected a Jungledisk install problem, but that didn't seem to be the case. Unfortunately, the Jungledisk support forums provided very little help. One post from "JungleDave" helped troubleshoot a Jungledisk startup issue, but at the time of writing there have been no further responses for four days. Another user with similar problems also doesn't seem to be getting the answers they need. For what is ultimately a commercial tool, the level of support provided is pretty minimal and disappointing.
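For the record, the setup that failed for me looked roughly like this - the Jungledisk daemon exposes a local WebDAV endpoint which davfs2 can mount like a disk (the address, port and mount point below are placeholders from my notes, so check your own Jungledisk config):

```shell
# Mount the local WebDAV endpoint the Jungledisk daemon provides
# (localhost port is a placeholder -- see your jungledisk settings)
sudo mount -t davfs http://localhost:2667/ /mnt/jungledisk
# Then treat it like any other directory. This is the step that
# worked on small test files but threw errors on the real 10GB of data.
rsync -av /www/kmtv /mnt/jungledisk/kmtv
```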

So, unmount, uninstall and off to S3Sync I went. S3Sync is a Ruby script that works much like the standard rsync but uses Amazon's S3 as the storage medium. I didn't have Ruby installed on the server, so that was the first step. Following another online tutorial for getting S3Sync up and running, all seemed to be going well. Appearances can be deceptive, though, and problems did strike. At the time of writing S3Sync hasn't been updated to work with S3's European storage. The KMTV server is located in the UK, and while I wanted the backups held remotely, for a decent transfer speed I was only really considering European based storage options. It looked like I was stuck again. I can't criticise the guys at S3Sync, but I needed a backup regime in place. The more I worked on getting one up and running, the more paranoid I became about disaster striking. I had to get something sorted as soon as possible.

What's left? Perhaps what I should have done to start with! Find a service that allows direct rsync connections from my server. A good few hours of online research narrowed the options down to the following contenders:

  • Strongspace - part of Joyent so not likely to go away anytime soon. However, at US$15 for 5GB they aren't the cheapest.
  • Gigaserver - European based, but again the price isn't right - EUR25 for 5GB
  • Rsync.net - a favourite with the Slashdot crowd in the earlier discussion I linked to. The price is right and they offer European storage in Switzerland. Sadly, a much more recent blog post elsewhere didn't have a lot of good things to say about this service.

I'm sure there are more options out there and I could have spent days researching all the relevant services. In the end I needed to get something up and running quickly. I went with Rsync.net and signed up for 12GB of storage; at US$19.20 per month it's not as cheap as S3, but it won't break the bank either.

After signing up, it took a couple of hours for my login details to come through, but this was flagged during the sign-up process. The instructions in the initial email were extremely helpful. Connecting via SSH, using certificates and thus not needing to input a password each time, was explained in straightforward terms. A few simple rsync tests worked a treat. Everything looking pretty good. It was time for the big test. Using rsync I started to copy across the 10GB of video files. Left to run overnight, the morning logs showed no errors. A few more rsync tests with small test files dumped in the video directory also worked correctly, with only the new files being copied across. It looks like Rsync.net was ultimately the right choice.
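The certificate (really, SSH key) setup they walk you through boils down to something like this, using the same placeholder account details as in my scripts:

```shell
# Generate a key pair; leave the passphrase empty so cron jobs can use it
ssh-keygen -t rsa
# Install the public key on the rsync.net account
# (rsyncdotnetuser and servername are placeholders)
scp ~/.ssh/id_rsa.pub rsyncdotnetuser@servername.rsync.net:.ssh/authorized_keys
# Confirm the connection no longer prompts for a password
ssh rsyncdotnetuser@servername.rsync.net ls
```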

My backup regime now looks something like this:

Step 1.

mysqldump used to extract copies of the relevant databases

mysqldump -u user --password=XXXXXXXXXX --opt --databases kmtv scottish | gzip > /user/mysqldumps/mysqlbackup.sql.gz

Insert your own "user" instead. Insert the relevant password to access the databases on your server.

This command will extract dumps of the databases kmtv and scottish (another project) to the folder /user/mysqldumps as a GZip file.

I created this as a simple shell script:

#!/bin/bash

mysqldump -u user --password=XXXXXXXXXX --opt --databases kmtv scottish | gzip > /user/mysqldumps/mysqlbackup.sql.gz

Saved in a folder under my user's directory.
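One of my stated goals was "easy to restore", so it's worth recording the reverse direction too. Since the dump is taken with --databases (which includes the CREATE DATABASE statements), restoring should just be a matter of piping it back in - untested by me so far, same placeholder user and password as above:

```shell
#!/bin/bash

# Restore: decompress the dump and feed it straight back into MySQL
gunzip < /user/mysqldumps/mysqlbackup.sql.gz | mysql -u user --password=XXXXXXXXXX
```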

Step 2.

Use rsync to copy relevant data across to the rsync.net server. Again I created these as a shell script, after testing them on the command line.

#!/bin/bash

rsync -aHvz /user/mysqldumps/mysqlbackup.sql.gz rsyncdotnetuser@servername.rsync.net:mysqldumps

rsync -aHvz /www/kmtv rsyncdotnetuser@servername.rsync.net:kmtv

Again you'll need to insert your own rsync.net username and allocated servername in the above lines.

If you can't get your shell scripts to work correctly, check the permissions. Chmod 755 worked for me.

Step 3.

Set up crontab to handle the transfer in an automated manner. I use crontab -e to make the necessary changes.

# Every morning at 3.00am run mysqldump backups
0 3 * * * /user/scripts/mysqldump.sh

# Every morning at 3.15am run rsync backups to rsync.net
15 3 * * * /user/scripts/rsync.sh

Insert your own correct path to the location of your shell scripts.

And that's it! There are other places on the Net to find rsync, mysqldump and crontab tutorials if you need help with that.

I'll probably add to and fine tune the above over time.

Feedback from those more knowledgeable than myself is always appreciated.

And now, with this post, I have a record of what I've done for future reference.
