Business card printing by

No Comments

I printed some new business cards for GlobalGoat Consultants this week, using my shiny new logo as designed by . I used for this service and can say that I was really happy with the results.

Their interface was both simple to use and flexible enough to let me get exactly what I wanted, and their prices and card quality were excellent. I’ve used them for other vanity projects in the past because, as well as using your own images (or their stock ones if you like), you can use anything from your Flickr library. I have thousands of Flickr images built up over the years, and I used a selection of 8 of my favourites (mixed landscapes of the UK and Sweden) for the back of the business cards.

If you come and chat to me next week at Kista arbetsmarknadsdag then you might even get one 🙂

I’ll be speaking at Kista Arbetsmarknadsdag (KAM) next week – 28th March

No Comments

I’ll be speaking at Kista Arbetsmarknadsdag (KAM) next week on 28th March. It’s not a technical talk, since it’s a career day for students; it’s a “this much I’ve learnt” type of talk about career building in the IT sector. I’ll be there representing Basefarm AB, as I help them with their Windows recruitment, knowing the recruitment market and the company from my time in Stockholm.

I’m lucky enough to have had a fairly varied career in the IT sector, ranging from start-ups to Microsoft, and in the talk I compare and contrast some of the different things that come along in the course of an IT career.

I’ll also be on the Basefarm stand from time to time during the day, so please come along and say hello if you’re attending.

LAMP for a beginner, lessons from my WordPress build

No Comments

When I finally got round to rebuilding this site into its current format, I searched around for a while before choosing WordPress as the platform to put it together with. As a complete newbie to this platform, and also as a person who has mostly used Windows software for the past 15 years, there were a few things to learn along the way, and here’s some information about some of those points.

Firstly, this breaks into 2 areas: client and server. When I say client in this case I refer to a laptop I use as a general machine for mail and stuff, but which is also the development environment. This machine is a very old Toshiba M70 from 2005 running a vanilla install of Ubuntu 11.10. Being a complete novice in terms of putting LAMP together, I looked for suitable instructions online and found this

Which I can’t recommend enough as a great step-by-step guide. I had one problem, which I documented in the comments of that article (it was not the fault of the article but a subsequent config issue in Webmin). The problem was that one couldn’t log in to Webmin at all after the install, even as root. When this was occurring I felt completely useless; I’m such a Linux noob that I didn’t really know where to start troubleshooting. If this type of thing occurs in Windows I just work it out using various tools, but in Linux I’m quite stuck where to start. Anyway, the solution was in the Ubuntu forums here:

Beyond that the client build was very smooth and is happily up and running on my crappy old laptop.

On the server side I’m hosting with who have been very good to me and were very efficient when I did my domain transfer last year from another provider who I won’t mention here 😉 The only thing you could say against them is that you only get one MySQL database for your entire domain, which could cause an issue for bigger, more complex installs and sites, but it works just fine for me. The install was super simple, exactly as the WordPress documentation says it should be. I FTP’d the files over to the site, extracted them, and was pretty much up and running.

In terms of custom configuration I’m using the following:

Cruz theme – purchased through – not the most complex theme compared to some, but well worth the small fee to purchase it. Very well documented as well.

BackupBuddy – this is a great plugin. It’s not cheap, but it does exactly what it says: it backs everything up with a click once it’s set up, plugins and all. Being a SQL guy, you can be sure that I tested the restore process as well, and I can confirm that it was very simple and seems very stable. This also comes highly recommended. You can move the backup files here, there and everywhere, either automated or manually, and I push mine to Amazon S3 automatically from the plugin.

This is what I would consider the bare minimum, in that it’s deployed, it runs, I can develop on it in a separate environment, and more importantly I can back it up.

I like to be plugin-light currently, as it keeps maintenance lower for me and is less risk, I feel; I hate risk and don’t want to spend lots of time troubleshooting compatibility issues. So the only other things I run are

Google XML Sitemaps – which simply generates a sitemap – no more no less


SyntaxHighlighter Evolved – which deals with the rather nice code syntax for many different languages.

I did consider an SEO plugin, but decided for the moment that my needs were not worthy of such granular control, and also the more competent ones seemed to require a certain amount of configuration time (unsurprisingly, considering the subject) which I wasn’t prepared to devote just now!

Overall I’m fairly happy with the experience. I did pay for a few things, but the prices were very reasonable and were well worth it in my opinion (especially the backup).

SQL Server script to populate a table of sequential dates in multiple formats

No Comments

I was writing lots of TSQL today for something I was working on, which in itself is a rather rare occasion since I do so much more infrastructure and architecture stuff nowadays. I needed to write a script to populate a table with a series of rows containing sequential dates. I had to create a table with 2 columns: one of [date] datatype and the other a 6-character string representing the year and the month as numbers concatenated together; for example, 15th March 2012 would become 201203.

Then this table needed a row for every day over a number of years. Here’s the code just in case anyone fancies reusing it.

create table #dates (
	CustomMonth char(6),
	lookupdate date
)

declare @date_increment date

--change your start date here if you fancy
set @date_increment = '2005-01-01'

--change your end date here if you fancy
while @date_increment <= '2012-12-31'
begin

	insert #dates (CustomMonth, lookupdate)
	select convert(char(4),(datepart(year,@date_increment)))
		+ RIGHT('0' + CONVERT(VARCHAR(2), DATEPART(MM, @date_increment)), 2),
		@date_increment

	set @date_increment = DATEADD(day,1,@date_increment)

end

--check it looks ok
select * from #dates
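For anyone who wants to sanity-check the output outside SQL Server, here’s a rough Python equivalent of the loop above. This is only a sketch; the column names mirror the temp table, everything else is my own choice:

```python
from datetime import date, timedelta

def build_dates(start, end):
    """Build (CustomMonth, lookupdate) pairs for every day in [start, end]."""
    rows = []
    current = start
    while current <= end:
        # CustomMonth is the year plus the zero-padded month,
        # e.g. 2012-03-15 becomes "201203"
        rows.append((f"{current.year:04d}{current.month:02d}", current))
        current += timedelta(days=1)
    return rows

rows = build_dates(date(2005, 1, 1), date(2012, 12, 31))
print(rows[0])    # ('200501', datetime.date(2005, 1, 1))
print(len(rows))  # 2922 days, both endpoints included
```

Comparing `len(rows)` against `select count(*) from #dates` is a quick way to confirm the T-SQL loop covers both endpoints.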

Create automated PGP task in SSIS using GnuPG to decrypt files

1 Comment

Previously I wrote about my efforts to automate the decryption of files with SSIS using the gpg2.exe which comes as part of the GnuPG package. The original article is here

SSIS Task to decrypt PGP files automatically

However, after deploying the working solution into production to be run as a scheduled task, I found that this package and solution still had some issues. It behaved rather differently when deployed into the production environment as opposed to running in the BIDS environment. When executing the exact same code in production that worked processing the exact same files in development (and I mean the exact same: same account, same files, same everything) I got an error which looked like this (sanitised for security)

Error: 2012-03-13 11:16:07.10 Code: 0xC0029151 Source: run decrypt Execute Process Task Description: In Executing "C:\Program Files (x86)\GNU\GnuPG\gpg2.exe" "--batch --passphrase-fd 0 --decrypt-files [myfilename]" at "", The process exit code was "2" while the expected was "0". End Error DTExec: The package execution returned DTSER_FAILURE (1). Started: 11:16:02 Finished: 11:16:07 Elapsed: 4.609 seconds. The package execution failed. The step failed.

So I was again getting error code 2, as I had previously, when the –batch switch had resolved the issue in development. The error code was the same, but the reason obviously had to be different now, and this required a little more investigation to get to the bottom of. Firstly I ran Process Monitor, which is often my first port of call in such scenarios, to check whether I was hitting some obscure permissions error when running in live with the SQL Agent. It turned out totally clean. (As an aside, I had done the same when initially installing GnuPG, to resolve an issue where it couldn’t access a temp directory it required for decryption.)

A bit of research through the web and the full documentation of GnuPG left me using a further switch:

--status-fd
which allowed me to look at some of the status messages from the output which were previously being swallowed by the SSIS task when run in production. SSIS logging was enabled, but it wasn’t getting anything back from the gpg2 executable beyond the status code.

I used a couple of different versions of this switch which looked like this

gpg2 --batch --status-fd 2 --decrypt-files test.gpg 2> c:\gk\test\output.txt

which redirects the status messages to c:\gk\test\output.txt (with --status-fd 2 they are written to stderr), or you can do this

gpg2 --batch --status-fd 2 --decrypt-files test.gpg

which outputs the messages to the console

Either way you end up with the following output (again slightly sanitised)

[GNUPG:] FILE_START 3 test.gpg
[GNUPG:] ENC_TO [hexstring] 1 0
[GNUPG:] USERID_HINT [hexstring] [mykeyname] (mykeyname) <>
[GNUPG:] NEED_PASSPHRASE [hexstring] [hexstring] 1 0
[GNUPG:] PLAINTEXT 62 [integer] test_.txt
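Status lines in this [GNUPG:] format are machine-readable, so if you capture them in a log you can scan for the interesting keywords automatically rather than eyeballing them. A minimal Python sketch (the sample text is just an illustration):

```python
def parse_gpg_status(output):
    """Extract (keyword, arguments) pairs from gpg --status-fd output."""
    prefix = "[GNUPG:] "
    events = []
    for line in output.splitlines():
        if line.startswith(prefix):
            # After the prefix, the first token is the status keyword,
            # the remainder (if any) is its arguments
            parts = line[len(prefix):].split(" ", 1)
            events.append((parts[0], parts[1] if len(parts) > 1 else ""))
    return events

sample = """[GNUPG:] FILE_START 3 test.gpg
[GNUPG:] NEED_PASSPHRASE deadbeef deadbeef 1 0
gpg: decryption failed: No secret key"""

events = parse_gpg_status(sample)
print([k for k, _ in events])  # ['FILE_START', 'NEED_PASSPHRASE']
```

Plain `gpg:` diagnostic lines (like the decryption failure) deliberately fall through the filter, since only the [GNUPG:] lines carry the stable machine interface.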

but unfortunately this still didn’t give me anything to go on, as it still worked in the test environment but not in the production one. Eventually, by playing with the logging levels and these switches in production, I got the details out in the SSIS log, which contained this key string

gpg: decryption failed: No secret key

I then realised that I was being an idiot: the service account that I was running the SQL Agent under did not have the certificate registered under that user ID. I had only imported the certificate into Kleopatra for the development user I logged in with, and not for the service account. I simply imported the certificate into the service account’s profile and then everything worked. This meant that the original instructions and code were valid, but I thought I’d put this post up in case anyone does the same stupid thing as me. It’s worth remembering that certificates are by default imported at a user level in Kleopatra.

SSIS Task to decrypt PGP files automatically

1 Comment

Update 2012-03-13 – If you still get error code 2 after running this code in production (but it continues to work in your development environment) you might like to look at the subsequent post I did about further troubleshooting of this issue

This is something that I spent a few hours on recently which I wanted to share. The requirement here is to create a SSIS task to automatically decrypt a variable number of files that have been encrypted with PGP. This task will live within a larger SSIS package which does other typical SSIS tasks; fetching files from FTP, moving them around a file system, streaming them into a database and so forth.

The key here is that the task needs to be completely automated so that no user interaction is required, i.e. typing in the passphrase or other such matters. Whilst working this out I browsed around the web and found various solutions, but none was 100% perfect for my particular requirements. Initially, all the options I tried either required me to enter the passphrase or returned error codes even on success. This post assumes a certain familiarity with SSIS development and PGP.

The PGP tool I used was the latest GPG4WIN, installed to the default location, which means that the actual executable is:

C:\Program Files (x86)\GNU\GnuPG\gpg2.exe

The PGP files I was receiving were encrypted with the public key I had passed to the external source, and were simply decrypted using the GUI, or the command line if I was prepared to type in the passphrase.

The way I automated this in SSIS was as follows:

Create a Foreach Loop to allow the processing of multiple files. The collection properties looked like this:

Foreach loop collection

The variable mapping look like this

foreach loop variable mappings

Inside this Foreach Loop I create an Execute Process Task. The process properties look like this:

Execute process task

The Expressions properties look like this.

Execute process expressions

It’s important to note that the arguments property on the Process page is set by the expression, not hard-coded, although it subsequently appears here. It’s critical to form the arguments in the expression builder to get them to work properly. The expression in text format is:

"--batch --passphrase-fd 0 --decrypt-files " + @[User::receive_file_name]

Part of this syntax is undocumented in the GPG help files and had to be picked up from the web. The confusion I had was that I found an article which used gpg.exe rather than gpg2.exe, and my version seemed to behave differently. The passphrase here is held in a variable in the package and then passed to the command line as the StandardInputVariable; this is what the [-fd 0] part of the syntax achieves. However, this still doesn’t work properly unless you pass the --batch parameter. If you don’t pass --batch then you still get challenged for the passphrase. If you run the package in debug mode you get the dialog box challenge, which you can then type into, but if you run in production mode the task just fails with error code 2.
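Outside SSIS, the same pattern (--batch to suppress the prompt, the passphrase fed on stdin via --passphrase-fd 0) can be scripted directly. This is only a sketch of the idea, not part of the SSIS package: the gpg2 path matches the default GPG4WIN install mentioned above, but the helper names and file name are my own, and decrypt() will obviously only work on a machine with GnuPG installed:

```python
import subprocess

GPG2 = r"C:\Program Files (x86)\GNU\GnuPG\gpg2.exe"  # default GPG4WIN location

def build_decrypt_command(file_name):
    """Build the gpg2 argument list matching the SSIS expression."""
    # --passphrase-fd 0 tells gpg2 to read the passphrase from stdin (fd 0);
    # --batch stops it from ever raising an interactive prompt
    return [GPG2, "--batch", "--passphrase-fd", "0",
            "--decrypt-files", file_name]

def decrypt(file_name, passphrase):
    # stdin carries the passphrase; a non-zero return code means failure,
    # just like the exit code SSIS checks on the Execute Process Task
    result = subprocess.run(build_decrypt_command(file_name),
                            input=passphrase + "\n", text=True,
                            capture_output=True)
    return result.returncode == 0

print(build_decrypt_command("test.gpg"))
```

The argument list mirrors the expression exactly, which makes it a handy way to reproduce (and debug) the SSIS behaviour from a plain script.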

Whilst looking at this problem I also experimented with storing the passphrase in a file and various other options. Although the above option worked for me, I also noted a custom component available for purchase at which might be worth investigating if you have a budget to acquire the tools and an enterprise-grade ETL requirement.