Friday 29 February 2008

The Ugly Reality of the Internet

I recently added to my blog a little map (see the right-hand side) provided by ClustrMaps Ltd which shows the location of the people reading or ending up on your website.

This is very nice, and it is done by geo-locating the IP address of each reader.
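
As an aside, you can play with the same kind of lookup from the command line. Here is a minimal sketch using geoiplookup from the Debian geoip-bin package (the IP below is an arbitrary example, and the exact output depends on the version of the GeoIP database installed on your machine):

#> apt-get install geoip-bin
#> geoiplookup 66.249.64.1
GeoIP Country Edition: US, United States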

My blog gets little traffic, but I believe that this map shows quite well the people from the countries that have embraced the Internet.
They are the people using the Internet daily at work.

The analysis is interesting, as you are kind of hit in the face by reality.
Indeed, you can see that the dominant areas are once again the rich countries: the US, then Europe and the other Anglo-Saxon countries such as South Africa, Australia, New Zealand, etc. Then you see a few spots in India, China, Russia, ... These are mostly the emerging countries that have not yet completely closed the technological gap with the Western countries.

For Japan, it might be different: the problem is probably English, and the fact that they have built a kind of parallel Internet and mobile network in Japanese.
Anyway, the Internet probably marks the definitive victory of English over the other contending languages (apart from Chinese, maybe), and to follow Internet trends you've got to speak English.

In the end, this map is very sad, because Africa is totally absent from it, not represented at all.

Another point is that the Internet seems to be present only in cities and not in rural areas.
Or is it that people living close to wild mother nature still have better things to do than read dumb articles on technicalities :-)

Let me know your thoughts on that.

Wednesday 30 January 2008

VirtualBox: How to boot from an existing Win XP partition under Ubuntu

I recently arrived in a new working environment, and the first thing I did was to run Ubuntu Linux instead of Windows. However, some things are only accessible under Windows, so I googled a bit to find an easy way to boot Windows XP from the existing Windows partition that comes by default with the PCs we have here.

This is possible with VirtualBox 1.5.4, which can be downloaded from the VirtualBox website, and is generally called booting from a raw partition.
Once VirtualBox is installed, you can use the VBoxManage tool to create a disk image (vmdk) pointing to your partition.


#>VBoxManage internalcommands createrawvmdk -filename ./WinXP.vmdk -rawdisk /dev/sda
VirtualBox Command Line Management Interface Version 1.5.4
(C) 2005-2007 innotek GmbH
All rights reserved.

RAW host disk access VMDK file ./WinXP.vmdk created successfully.
my-desktop:>

Note that if you want to directly register your image with VirtualBox, you can also use the -register option.
The previous VBoxManage command will work but is extremely dangerous: inside the virtual machine you have access to all your partitions, and you could inadvertently boot the host OS as a guest OS (the host being Ubuntu in my case)!!!
This could cause irreparable damage to your installation, so the way to avoid it is to create the disk image restricted to the Windows partition only.
You will also have to create a new master boot record in a file, covering only the Windows partition, and use this new MBR when creating the disk image.

1) Create a Master Boot Record manager in a file

To do this, you need install-mbr, which is part of the Debian package mbr, and call it with the --force option:

#> apt-get install mbr

#> install-mbr --force myBootRecord.mbr

The produced file myBootRecord.mbr should be exactly 512 bytes (the size of a classic MBR).
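
You can quickly verify that, for instance with stat (a simple ls -l works too):

#> stat -c %s myBootRecord.mbr
512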

2) Call VBoxManage with the right options

First, we need to know which one is our WinXP partition:

#>fdisk -l /dev/sda

Disk /dev/sda: 82.3 GB, 82348277760 bytes
255 heads, 63 sectors/track, 10011 cylinders
Units = cylinders of 16065 * 512 = 8225280 bytes
Disk identifier: 0xf3c1f3c1

   Device Boot      Start         End      Blocks   Id  System
/dev/sda1   *            1        2623    21069216    7  HPFS/NTFS
/dev/sda2             2624        9704    56878132+  83  Linux
/dev/sda3             9705       10011     2465977+   5  Extended
/dev/sda5             9705       10011     2465946   82  Linux swap / Solaris

For me, it is partition number 1. Here is the magic command:

#>VBoxManage internalcommands createrawvmdk -filename ./WinXP.vmdk -rawdisk /dev/sda -partitions 1 -mbr ./myBootRecord.mbr -relative -register
VirtualBox Command Line Management Interface Version 1.5.4
(C) 2005-2007 innotek GmbH
All rights reserved.

Now we have a virtual disk called WinXP.vmdk which will allow us to boot our existing partition.
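
Since the command above used -register, the image should already be known to VirtualBox; you can double-check with the following (the exact listing format varies between VirtualBox versions):

#> VBoxManage list hdds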

3) Use the VirtualBox GUI to create a new Virtual Machine Profile

Now let's launch VirtualBox to create and configure a virtual machine.
The first thing to do is to use the Virtual Disk Manager to add the newly created disk.

Next, let's create a virtual machine using this new disk.

Finally, you will have to tweak the options, and especially tick the Enable IO APIC checkbox.
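
For the command-line minded, the same VM creation can be sketched with VBoxManage. This uses the single-dash option syntax of the 1.5.x series; the VM name WinXP and the 512 MB of memory are arbitrary choices of mine (check the VBoxManage help output if your version's options differ):

#> VBoxManage createvm -name WinXP -register
#> VBoxManage modifyvm WinXP -hda ./WinXP.vmdk -memory 512 -ioapic on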

4) The Conclusion

If you are lucky, you should then be able to run WinXP and access all the network disks already configured by your wonderful Win Admin => Good!!
Note that you should create another hardware profile in WinXP so as not to confuse it too much, as the virtual machine comes with somewhat cruder hardware devices (sound card, graphics card, etc.).
There could also be an issue with write permissions on your "hard drive" if you want to run your virtual machine as a standard user. You might need a chmod 777 /dev/sda, with /dev/sda unmounted.
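
Concretely, that means something like the following (adapt the partition number; note that chmod 777 is a blunt instrument, and adding your user to the disk group is a cleaner alternative on Ubuntu):

#> umount /dev/sda1
#> chmod 777 /dev/sda /dev/sda1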

5) References

Here are the two threads which helped me find a solution to my problem:

http://forums.virtualbox.org/viewtopic.php?t=333&highlight=createrawvmdk

http://forums.virtualbox.org/viewtopic.php?t=2019

Friday 17 August 2007

Committee Standards and De Facto Standards in IT

Here is the question:
Is there a single usable IT standard that has been brainstormed and created from scratch by committees and other consortiums?
Since I started doing software development (wow, more than 10 years now), I've seen several standards released by committees and then discarded or beaten by a contender, because the so-called standard only exists on paper, isn't practical, is too complex or is too ambitious. This has happened many times and will happen again in the future. Let's take some concrete examples:

1. X.25 and TCP/IP

Who remembers X.25, which strictly followed the OSI layer model (by implementing layers 2 and 3) and should have been used to create the backbone of what we now call the Internet? However, the winner was TCP/IP, mostly because TCP/IP was easier to understand and more manageable, even if it did not fully follow the fabulous 7-layer OSI model. TCP/IP also became the de facto standard because it was adopted by the research community to create a WAN (CERN, NASA, American universities, ...).

2. OMG CORBA

In the mid-nineties, the Internet was emerging and becoming important, object-oriented programming was the dominant paradigm, and naturally there was a need for a standard for building object-oriented applications distributed over several sites. So the Object Management Group (OMG), a consortium regrouping the influential players in IT such as Hewlett-Packard, IBM, Sun Microsystems, Apple Computer and others, gave birth to CORBA (Common Object Request Broker Architecture). Everybody knows that giving birth can be quite painful, and here it was the case, as the different partners started to fight and push different ideas into CORBA. Moreover, Microsoft was not part of the club and had decided to develop a competitor called DCOM based on Windows technology.
In the end, the resulting standard was very cluttered and covered too many topics. It became a nightmare for CORBA implementors, as it was impossible to build a complete CORBA implementation covering all the different services. It was also very easy to have two compliant implementations that were not interoperable. Does it ring a bell? Yes, the WS-I standard. One thing also totally missed by these computer science experts was that HTTP would become the only way to communicate between two sites, because of the appearance of the firewall. Indeed, these little beasts were configured to only let HTTP through from the WAN to the DMZ.
In the end, CORBA progressively disappeared and has been replaced by simpler object protocols relying on HTTP (REST for example, as SOAP will also probably vanish).

3. The Rest of the herd

These days, the list could be very long. Obviously, you will have understood that I do not particularly fancy the web services infrastructure (WS-I, SOAP, WS-Security, ...) and most of the XML standards, which seem to have been born in endless meetings, as most of the time they are totally impractical (XSD and XSLT are among my favorites here). On the other hand, you have the REST stack, which is slowly maturing and being implemented to provide real services, while taking on board the same good ideas as the ones behind WS-I.

As you can see, the same story seems to repeat itself all the time.

4. OGC Standards: the current annoyance

You could now ask: where does this guy want to end up? In fact, I am currently working on a very interesting project whose main goal is to create a framework and deploy an infrastructure for discovering, distributing and sharing weather and climate data. All this is closely linked to the Geoweb, as these are spatio-temporal data. In our world there are 3 different contenders. You have a pragmatic effort started by the Earth science community at the beginning of the 21st century and led by American universities and the main research centers in climatology (NetCDF, OPeNDAP, etc.). You have the Geoweb led by Google to display geospatial data on the web (Google Maps, Google Earth, KML). You also have a set of standards pushed by a consortium called OGC (Open Geospatial Consortium), which defines everything: the protocols and web services used to exchange geospatial data, how to orchestrate these services, how to discover these data and how to represent them (GML).
The issue comes from the last one, because it is creating lots of disruption and interference within our project. The OGC standard is a patchwork of ideas which in the end forms a clumsy model. The OGC web services are somehow broken, as they offer no way to request data asynchronously, embed no security mechanisms and try to do too much. I am not even talking about the discovery part, which is based on the idea of having a metadata standard for all these datasets, when the same idea applied to the Web never worked.
GML is the jewel in the crown there, but its "competitor" KML is controlled by a single body (Google) and can evolve very quickly. KML also has the Google Earth advantage, as it is a shiny GIS client.
Lots of people would like to push us to be compliant with the OGC standards, but when you look at it, you can guess that in a few years' time these will have vanished or mutated to use de facto standards such as KML and friends. So we should probably concentrate on creating a pragmatic solution that is efficient for our users, and slowly migrate to mature and proven technology when necessary.
After all, our goal is not to be compliant with standard A or B but to improve the users' lives.

To conclude, here is my proposal: to avoid losing time and money and producing tons of CO2 through travel, we should ban, or at least minimize, the committees created to regulate the IT world, as they will irremediably fail and provide very little satisfaction to their participants.




Saturday 12 May 2007

Project Management Fuss

I just had a course on project management, and the instructor, who was actually quite good, served us the usual things regarding process, planning, resources, etc.

Frankly, I don't believe that managing a software project using analytic methods is the right way to do it, because here we are mainly talking about human resources, human interactions and human creativity, and the model describing this reality has yet to be created.

The fundamental problem with analytic methods (which are perfectly valid in other circumstances, by the way) is that building software cannot be treated like building a car or a plane, because our industry is very, very young compared to the manufacturing industry.
These people tend to think that we have reached the Taylorization stage and should consider developers as blue-collar workers on a semi-automatic assembly line, fixing screws on a car door.

But we are millions of miles away from this stage. I would say that we have only reached the CRAFTSMAN stage, and this is why so many people feel excited about software development. Indeed, a craftsman is a bit of an artist, with some techniques and light methodologies, and his ultimate goal is to create his masterpiece, his "chef d'oeuvre" (think of the Asian culture of masters).

The instructor talked a lot about how achievement is important for human beings, and it is especially important in computer science. Creativity is also crucial in software development. People write software because they can be creative and elegantly solve complex issues never solved before.

All these software management methodologies tend to destroy creativity by overloading developers with progress reports, time sheets, ... in order to increase control over them. This will probably give more visibility and information to management, but you might end up with a standard, DULL product using very common techniques and no innovation at all.

So I would say yes to organizing the work of your developers jointly with them in a lightweight fashion, and no to heavy methodologies which give more weight to process and control.

I suppose that the people behind Extreme Programming (XP) have a similar view to mine.