I DRAW COMICS REWARDED

A few days ago, I received the reward for backing the Kickstarter project I DRAW COMICS. I really liked the idea of creating a guide on how to start drawing, showing you a few of the tricks the pros use in their day jobs.

As this was Matt’s second project, there was quite a good chance of getting a high-quality result out of it. As this was the very first non-technical project I had backed, I was really interested in the outcome.

I DRAW COMICS Sketchbook and Reference Guide 

If you are interested in the sketchbook as well, head over to their website; the sketchbook will be available for pre-order there soon.

Heading for SAP HANA

A couple of weeks ago, I started a new position as a software development manager, building a team for a new cloud-based product dealing with big data. While evaluating various storage solutions, we came across the SAP HANA database, a large-scale in-memory database with fascinating computing capabilities. And it is not only SAP HANA: all kinds of other great technologies sit along the way, waiting to be picked up on the upcoming journey.

We will focus a lot on the SQL-92 standard and the new SQLScript language provided by SAP HANA, on R, JavaScript and HTML5, but also on Eclipse and Java for writing plug-ins for third-party components. We will deal a lot with RESTful Web services, the JSON format, the OData protocol and huge amounts of data. So if you feel at home with these technologies and are looking for new challenges in Karlsruhe (Mannheim area), drop me a line.
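
To give a flavor of that stack: an OData service exposes its data over plain HTTP, so a first experiment is a single request away. A minimal sketch, assuming a hypothetical HANA XS service URL:

curl -H 'Accept: application/json' 'https://hanahost:8000/sales.xsodata/Orders?$top=10'

The $top=10 query option is part of the OData protocol and limits the response to the first ten entries.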

Changing the technology stack? Not really….

That said, for my day job I am heading away from C# and .NET quite a bit. I still use C# a lot for my side projects. However, for quite a while now I have been using Python more and more for many tasks I previously performed with .NET. I will still write for the .NET magazine dotnetpro, though, even if my recent article series is about JavaScript frameworks.

During my time at Microsoft Research, we investigated a lot of heterogeneous technologies, used them and even developed some. Besides C#, we used F# a lot, as well as ANSI C, R and even ANTLR. We developed a new scripting language called Vedea, fully compatible with the Processing syntax. Before that, we worked on visual programming languages and collaborated with the CCR/DSS team on concurrent programming models.

Afraid of technology, you should not be, Yoda might say…

One thing I learned during my time at MSRC was not to be narrow-minded and not to be frightened by technology. Our architect at MSRC once told me not to be afraid of technology. It might take some time, and it may be neither easy nor very convenient; however, someone built it, so you can figure it out.

What I realize more often than not is the fact that developers stick to their comfort zone. They want to stay with a particular technology because it’s easy, they feel at home, they already know everything and so on…

Time to change, or maybe not (yet)?

Recently a friend told me he thinks it’s impossible to change to another technology after coding in C# for several years… That’s just not right. Once you have understood the concepts behind a certain kind of technology, there is no reason not to move on. So here are some excuses I have heard during the last few years…

  • I don’t like [put in any kind of technology]
  • I don’t like the syntax of [same as above] 
  • I can never learn everything I missed during the last [put in any timeframe]
  • That’s a step backwards that [put in any high-tech company] took with this technology
  • This [put in any kind of technology] has no future

In fact, any of these statements is just an excuse not to learn something new, to stick with the well-known and not to move out of your comfort zone. That said, stand up, learn something new and take a step forward… technologies change rapidly, so you should change as well…

2011 Spam Statistics

As I recently moved my mail server to a new cloud provider, I used this opportunity to check the mail statistics for 2011. Altogether I had

64,097 mails processed, from which

42,469 were spam mails and

375 included viruses.

Altogether this makes 66.25% spam. However, only 0.58% of the processed mails included viruses, which is quite a surprising fact. Over the last year I encountered only two to five spam mails a day in my inbox, and I have not spotted a single false positive. In contrast, I pick up false positives in my GMX junk mailbox on a weekly basis. For my personal mail server, I am using SpamAssassin with settings I have refined over two to three years, as well as a set of various DNS blacklists.
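
To give an idea of what such tuning looks like, here is a minimal local.cf sketch; the values are illustrative assumptions, not my actual settings:

# /etc/spamassassin/local.cf – illustrative values only
required_score 4.5
# enable the Bayesian classifier and let it train itself on clearly scored mails
use_bayes 1
bayes_auto_learn 1
# keep DNS blacklist lookups enabled
skip_rbl_checks 0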

Quite a part of the regular mail originates from various mailing lists and newsletters, which leaves me with about 60 mails a day to process. A number definitely to be improved (i.e. reduced) in 2012.

I See Clouds of White

For several years, I have run my own local server as well as a root server hosted online. I ran all kinds of services: some I used on a regular basis, some from time to time, and others I just set up to learn and experiment. However, when I set up most of these, there were not many choices if you wanted to host something online. So I ran my local repository, Microsoft Team Foundation Server, my own mail server, my FTP and Web server and many other services.

As maintaining all these services became almost a full-time job, I finally decided to move everything into the cloud – or at least somewhere online. In some ways this is an experiment, as I try to run everything I need (or let’s say, want) somewhere online while staying within the budget of my Web and local servers.

For the local server I calculate $42 a month for maintenance and electricity, while the monthly rent for the Web server is $70. Altogether that is $112 a month, which amounts to nearly $1,350 in fixed costs a year, not including software licenses and the time invested to maintain and update the servers.

Step by step, I am now moving my services to various online services (free and paid). First of all, I moved my blog to wordpress.com. That was a rather easy decision, as I had already switched to the WordPress software on my own server several months ago. Exporting and importing the content therefore was quite an easy job. Finally, I picked domain mapping for http://www.hack-the-planet.net, which is about $12 a year.

To keep track of stuff to do, I have used Remember the Milk for quite some time now. $25 a year is not that cheap for a simple list of todos; however, I get the Web application, a fine app for iPhone and iPad, as well as Gmail and Google Calendar gadgets, synced all over the place.

A critical step, however, is my source code repositories. I have maintained all the code I’ve ever written in CVS and Subversion for ages. Without your own server, it’s not that easy to grant rights on repositories to friends and colleagues you work with. Here, I decided to move to two different platforms. First, I started a new project called aheil code (to keep the corporate identity in sync with aheil blog) at CodePlex. That’s the place where I plan to share everything under the Ms-PL license. Closed source, however, I am going to store with Assembla. They provide a limited but free plan for private repositories, which should be sufficient for my needs.

Instead of using my own FTP server to exchange files between machines (and people), DropBox appeared to be a great solution. I joined DropBox in a very early beta state and I am still very happy. (If you don’t have a DropBox account yet, follow http://db.tt/kNZcbyI, which gives you and me 250MB of extra free space.) I use about 4GB of space at the moment for free. However, once you need more, there is always the possibility to switch to a paid account. The client is available for almost any platform and I use it in various scenarios across many of them, including Web, Windows, Mac and iOS. Before that I used Microsoft Live Mesh; however, canceling the beta, changing the name, running two Microsoft services (Mesh and SkyDrive) at the same time that could not be combined, and finally changing the software eventually drove me to DropBox.

In terms of productivity tools, I completely switched to Google Calendar, as it syncs nicely with iPhone and iPad and even iCal on my Mac. I used (and really liked) Outlook for many years, but the lack of syncing with third-party online services seems to be an epic fail in 2011. I can tell you that you won’t notice this fact within Microsoft (living in a happy Exchange- and SharePoint-equipped world), but out there in the wild World Wide Web, connectivity is all that counts.

Also, I joined Evernote to sync, copy and share notes and documents. Again, client software is available for all major platforms, including iOS, Windows and Mac. I am still trying to figure out how to use Evernote on a daily basis, but at the moment the maintenance costs (manual sorting, organizing etc.) exceed the benefit.

So far, I have not been able to cover all the services I need. For example, I am still looking for a good (and secure) online backup solution, a way to host my IMAP and Web servers, as well as a local storage solution. At least the last point seems to be almost solved by my new router, which allows you to use an external HDD as a network drive. With my previous setup, I was able to connect to my local network from anywhere using OpenVPN in a very convenient way. Here as well, I am looking for an alternative solution where maybe a router might take care of this.

So far, the experiment of moving everything to the “cloud” has been quite a success. I was able to migrate quite a lot of my services and have spent only 3% of my available budget on services so far.

Hello world, again!

After running my own blog for several years on a self-hosted Windows server, I finally decided to move my software engineering blog to wordpress.com. For both security and maintenance reasons, I decided to use the WordPress service instead of running my own installation.

Over time, maintaining the WordPress installation turned out to be a quite time-consuming task. Frequent updates of WordPress itself, the plugins and themes, the hosting operating system, and the Perl, PHP and .NET packages almost became a full-time job.

At the time I set up my own server, there were few possibilities to host all the services I used on the Web (or, let’s say meanwhile, the cloud). While I used to host my own Web servers (IIS and Tomcat), SVN repository, SMTP server and various databases, nowadays you can find most of these offered by third parties, most of the time for free. Most of these offers are sufficient for the experiments you perform to keep up to date with technology or to try out new approaches.

As a first step, I decided to purchase the domain mapping feature from wordpress.com to use my own subdomain http://www.hack-the-planet.net. Following the technical documentation and Matthias’ post, all you have to do is set up a CNAME entry with your current registrar (if you just want to map a subdomain). As I use domainFactory as my registrar, this was pretty easy using their Web interface.
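
In zone file notation, such an entry looks roughly like this (myblog.wordpress.com is a made-up placeholder; wordpress.com shows you the actual target for your blog):

www   IN   CNAME   myblog.wordpress.com.   ; maps www.hack-the-planet.net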

Moving to wordpress.com, I hope to free up more time for research and blogging again in the future.

Fish’n’Chips No More

Following All Good Things…, it was the day I chose to leave Microsoft Research in Cambridge for the sake of new challenges and adventures. England is not that bad, considering the weather, the living conditions, the food and the traffic…

The last five years with Microsoft Research have been very intense, exciting, but also quite exhausting. Our dev team went twice to the Microsoft TechFest, in 2008 even showing the Microsoft Computational Science Studio to the press. I met Alan Alda, known as Capt. Benjamin Franklin “Hawkeye” Pierce from the TV drama M*A*S*H and meanwhile host of the TV show Scientific American Frontiers.

A highlight in the development of the Microsoft Computational Science Studio was without doubt its presentation by Craig Mundie during the Microsoft College Tour 2009. Not only did Craig Mundie hold my code in his hands, he also had to show it off at some of the top universities in the US (among them Cornell and Harvard). That is a whole new kind of pressure, considering that it was a research prototype. Here, too, there were one or two articles in the press, for example in the Seattle Times. As a surprise, there was also a Channel 9 video nobody really knew about until it suddenly was online.


After having my own booth at PDC05 for Coding4Fun in 2005, my first time at a PDC, we also managed to score with Vedea at PDC09 in 2009.

In the middle of this year, I was able to show both prototypes developed in Cambridge once more to a small circle at the .NET User Group in Karlsruhe, and I very much enjoyed the interesting discussions there. Twice at PDC and twice at TechFest is not a bad record, one that was even “upgraded” by an acknowledgment in the article Predictive Models of Forest Dynamics in the journal Science (Science, vol. 320, June 13, 2008).

In between there was even a visit by Bill Gates, to whom some of the (research) work was presented (I sat in the chair Bill Gates had sat in). Everyday life had its moments as well, whether a discussion with Don Syme (one of the most brilliant minds around, known for generics and F#) or a morning “Hi Tony” to Tony Hoare (well known to every first-semester computer science student for Quicksort and the Hoare calculus); you don’t get that everywhere.

In the end, we were able to hand over one or another development to various product groups in Redmond, and so some lines of code may soon show up again in one of the future Microsoft products.

Sick about URIs? My Wishlist URI

Following the idea of having clear HTTP URIs, I have now extended my list of https://www.hack-the-planet.net/freebusy and https://www.hack-the-planet.net/software with my wishlist, currently hosted with Amazon: https://www.hack-the-planet.net/wishlist. What’s cool about this approach? First, it looks much better than http://www.amazon.de/gp/registry/Z5LA1EEWOT64 and is far more intuitive. Second, when moving to another country, my wishlist URI stays the same, even when changing from amazon.de to amazon.co.uk. Finally, whatever happens to Amazon, I don’t have to care: my URI just stays the same. About 10 years ago, in 1998, I was quite a good customer of Telebuch.de, one of the very first German online bookstores, before Amazon bought them and consequently all the URIs changed.
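
Under the hood, all it takes is an HTTP redirect from the stable URI to wherever the list currently lives. An Apache-style sketch (assuming an Apache front end; other Web servers offer equivalent directives):

# temporary (302) redirect, so the target may change at any time
Redirect temp /wishlist http://www.amazon.de/gp/registry/Z5LA1EEWOT64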

GPL as a Business Model

How stupid I was: despite having dealt with licenses for years, Dirk Riehle finally gave me a reality check. In an interview with Software Engineering Radio, he talked about his recent work at SAP and his experience and research with open source business models. Exaggerating, I would say: GPL is great, because GPL is the most capitalistic license you can think of.

As Dirk explained in the podcast, the whole dual licensing model is built on the GPL. You sell your product, but you also want to be pseudo open source, so you open source it under the GPL. All your competitors can read your code and extend it, but then they have to release everything under the GPL again. So there is no benefit for your competitors. If customers or competitors want to extend your code without these obligations, they have to purchase the second license you offer.

Personally, I prefer either closed source or quite permissive licenses (FreeBSD, Ms-PL etc.). If you are going to give your code away, do it right. If you want to build a business on your code base, keep it closed. So I was always quite careful about not reading GPL code or, even worse, copying code snippets that might be under the GPL into my code base.

Since yesterday, however, I have a different view on the whole topic: if you are open sourcing a commercial project and you perform dual licensing, the GPL is used as a pure instrument of your business methods. If you open source a community project under the GPL, you probably have not understood the concepts at all.

Also interesting in this podcast were the facts about shifting revenues: licensing is a tool for shifting revenues among various business areas. If you are selling a database, you are probably interested in all operating systems being free of charge, so the customer has more money left to pay for your product and your service. If you are a company similar to SAP, you are probably interested in all operating systems and databases being free: consequently, the customer has more money left to spend on your product and services. If you are selling an operating system, you are surely interested in all programs running on top of your system being free. That way, the customer has more money left to spend on your operating system and services.

The next time you read about some company switching from Windows to Linux, the question is not about saving money on licenses. In the end, I personally don’t think that the corresponding IT budget will be cut down due to the saved licensing fees. I rather think the budget is shifted to some other area.

I just realized I had been looking at the GPL from a developer’s point of view for too long. If you feel the same, I highly recommend the interview with Dirk.

Got Root #4

It is finally done. After some exciting days, I moved my web site, blog and domain to my new server. A few seconds ago, I was informed that the domain transfer succeeded. Now I am just waiting for all the DNS entries to be updated. Since everything is already set up and prepared for the domain name, the overall project is accomplished.

To make sure such a relocation goes off without a hitch, just follow a few simple rules:

  1. Terminate the contract with your current provider, preferably in written form or by fax.
  2. Inform your current provider about the domain transfer.
  3. Request the domain transfer with your new provider.
  4. Set up your Web applications on the new site – and test them!
  5. Make the site listen to the domain name to be moved.
  6. Make sure you already have a mail server set up, listening to the domain name being moved.

That’s all.
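
To check whether the updated DNS entries have propagated, a quick dig (part of the BIND utilities) against the domain does the job:

dig +short www.hack-the-planet.net A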

Got Root #2

In a previous post, I told about the first steps in virtualizing a Windows Server 2008. In this article I describe how to proceed after the request for my own RIPE subnet was approved. Now I can concentrate on the next point: installing VMware. Since I want to set up this machine for virtualization, I have to perform a few steps first. Consequently, this post will be mostly about my fight with Debian Linux, which is the host system.

After logging in, I realize that updating the package database might not be the worst idea. Consequently, I do so and install a Norton Commander-like tool for real men:

apt-get update
apt-get install mc

This actually makes things much easier.

Now, I have to activate IP forwarding in /etc/sysctl.conf by adding

net.ipv4.ip_forward=1

and bring up the additional IP on the host system by adding

up ip addr add 192.168.1.1/29 dev eth0

to /etc/network/interfaces. Additionally, I have to add a host route (using my gateway 192.168.0.1) so my new subnet is reachable, by adding

pointopoint 192.168.0.1 

to the eth0 stanza in /etc/network/interfaces. Installing iproute with

apt-get install iproute 

and restarting the interface by calling

/etc/init.d/networking restart

finally makes my IP ping-able. Quite a fight so far if you don’t do this on a regular basis. Additionally, I installed the powersave package and reconfigured several settings in /etc/powersave to increase performance.
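
For reference, the resulting eth0 stanza in /etc/network/interfaces then looks roughly like this (the main address and netmask are placeholders, not my actual configuration):

auto eth0
iface eth0 inet static
    # placeholder for the server's main address and netmask
    address 192.168.0.2
    netmask 255.255.255.0
    gateway 192.168.0.1
    # host route to the gateway, as described above
    pointopoint 192.168.0.1
    # bring up the additional IP of the new subnet
    up ip addr add 192.168.1.1/29 dev eth0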

I just got the tip to put my virtual machines on a separate disk. Since I have one spare 400GB disk, I have to create a partition and format it:

cfdisk /dev/sdb
mkfs -t ext3 /dev/sdb1

Let’s create a directory for the virtual machines and mount the disk:

mkdir /VMs
mount -t ext3 /dev/sdb1 /VMs

Now a final tweak to /etc/fstab, adding

/dev/sdb1 /VMs ext3 defaults 0 0

and I am done.
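
A quick check shows whether the new disk is mounted where expected:

df -h /VMs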

Finally, I start installing the VM. I was pointed to a German how-to written by Till Brehm, which includes quite detailed instructions.

Some prerequisites are required before I start. I install the required 220 MB of packages with

apt-get install linux-headers-`uname -r` libx11-6 libx11-dev x-window-system-core x-window-system xspecs libxtst6 psmisc build-essential

VMware can be downloaded from http://www.vmware.com/download/server/. I skip the management console since I will use it on my Windows workstation and focus only on the server and management interface binaries using:

wget http://download3.vmware.com/...
tar xvfz VMware-server-*.tar.gz
cd vmware-server-distrib
./vmware-install.pl

Now, I simply accept the defaults during the following installation; only at one point did I have to tell the script that my virtual machines will be located at /VMs. Now I have to continue with the management interface:

tar xvfz VMware-mui-*.tar.gz
cd vmware-mui-distrib
./vmware-install.pl

The Web-based management interface seems to work perfectly after installation.

VMware Management Interface

After installing the management console on Windows, I ran into some trouble. During compilation of the corresponding modules, the VMware script was not able to start the inetd service; therefore, I was not able to connect to the VMware server. After restarting the service manually, it worked perfectly and I set up my virtual machine.
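
For the record, restarting inetd manually boils down to something like the following; the exact init script name depends on which inetd package Debian has installed:

# for the openbsd-inetd package; netkit-inetd ships /etc/init.d/inetd instead
/etc/init.d/openbsd-inetd restart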

VMware Management Console

Now, I have to copy the installation files for Windows Server 2008.