Monday, January 25, 2016

Visiting the Dark Side (first part - MacBookPro)

After a very long time I decided to write a few quick posts about my latest visit to the Dark Side (or Apple, as other people might call it). From the freedom point of view Apple is, for me, the worst alternative when it comes to technology-related choices - they are the Dark Side, and compared to them Microsoft is just the Grey Side. I will mention here that there is exactly one aspect where Apple is on the consumer side (actually leading the way on it), and that is encryption on mobile phones - but that will be discussed in the second part.

At my current job we now have some extremely ambitious projects that encompass high-end custom hardware, device firmware and computer software, and the last part now also includes access from mobile devices - smartphones and tablets. Of course some of our best-paying customers are already using iOS devices, so our entire line of mobile apps has to start with iOS. That in turn requires devices to run the apps - and even more so devices to design, develop and test them - and the main platform for developing iOS apps is of course OSX, so now you see where I am getting: my company provided the latest MacBookPro and the latest iOS device, with the clear goal of getting accustomed to the platform and being 100% certain that our hardware devices and their firmware are not only fully compatible with Apple platforms but also work in the smoothest way possible.

I can still remember that my previous experience on the Dark Side did not go as well as I expected (see ten years ago here and here), so it is easy to understand why I was a little uneasy about how things would go this time. The good news is that 10 years of technology advancement and being provided with the highest-end model really did make a positive difference; the bad news is that some things do not change THAT much. The lucky part for me on the MacBookPro is that the entire Apple walled-garden paranoia is now heavily focused on iOS, so the OSX side gets away with an almost decent level of default restrictions, and most of them can be customized one way or another.

So what is my impression of the 15'' MacBookPro Retina late 2015 model? It is a well-built system, but that is to be expected given the inflated price - the model was ordered for me around December 2015, and at that point it cost around 3000 EUR for the 512GB SSD model, or almost 3700 EUR for the 1TB model. At the same time there was a special offer on the latest Dell XPS 15'' with a newer/better (Skylake) CPU, 32GB RAM instead of just 16GB and a 1TB SSD for under 2300 EUR, so the MacBookPro was about 60% more expensive for a slightly inferior hardware configuration. If Xcode had not been needed, or if I had to pay for it myself, that alone would have settled the matter and there would be no further posts on Apple stuff.

But I did get it and I am writing this post on it, so things were not that bad after all. In fact other than the keyboard (which is inferior to the one in my Thinkpad T420s - old, but very inexpensive by comparison) everything is generally OK, and most notably the screen is good. Even on the keyboard the main problem is not so much the keys themselves as the number of them - or the lack thereof: Apple is still more interested in catering to a segment of people they believe might become nervous if presented with too many keys, so instead of having separate keys, power users are forced into cumbersome multi-key combinations (the Delete key still does Backspace, a true Delete needs Fn+Delete, and there are many other things like this). It is also a little surprising that at this price point you do not get a fingerprint scanner, and entirely unsurprising for Apple that you have to pay extra (quite a bit extra if you buy from Apple) if you need more USB ports or a wired Ethernet port - but I got a good portable 3-port USB3 hub + Gigabit Ethernet adapter for about half of what just the hub would cost from Apple, so standardization definitely works for the consumer.

OSX itself is, to my surprise, much improved compared to 10 years ago and finally looks reasonably stable (even if I have seen some strange quirks in a few of the settings dialogs). The part I like most is Mission Control, which Apple now gets quite right - together with the 3-finger gestures on the large touchpad it makes window and virtual desktop management quite pleasant and quick.

The software selection is still nowhere close to what I am used to, but for the most important things it is easy to find good programs (I definitely like Chrome or Firefox better than Safari, VLC is still the best media player, Tunnelblick seems a good VPN client, Double Commander is an open-source alternative to Total Commander and MacPass is a usable alternative to KeePass; I have also found Contexts useful enough to pay for it).

Last but not least, Windows 10 (another significant extra expense, together with VMware Fusion, which we also needed for some of our legacy Linux build systems for older products) runs quite well both natively via Bootcamp and as a VMware virtual machine backed by the Bootcamp partition - the latter improves productivity a bit in most scenarios, since I do not have to reboot from one OS to the other but can just move to a separate desktop. In VMware I also have various Linux virtual machines and of course some Windows 7 and 8 configurations (which are no longer natively supported by Bootcamp).

But as I said, not all things changed for the better, and some of the changes are just more of the same "milking the customer for as much as possible" that is such a constant of the entire overpriced Apple walled-garden philosophy - the RAM is now soldered and cannot be upgraded, and the battery also cannot be replaced by the user in any way.

Almost all the other interesting features require paying more money to Apple (with more and more stuff being pushed towards iCloud, so you will be forever stuck with Apple and a monthly payment to them), but I was lucky enough to already have an excellent Asus RT-68U router, so I got TimeMachine network backup basically "for free", and in general things work fine - except for the price, the overall impression is reasonably (I should probably say surprisingly) good. Am I now an OSX fanboy? Definitely not, but I can live with it (side by side with Linux and Windows). And I still keep around my old T420s "just in case" :)

Friday, September 28, 2012

Thinkpad T420s - cool if you know some tricks :)

This will be just a short post with a few ideas about how modern Thinkpad notebooks boot into Win7 and how you can best add a Linux boot to that.

There is a lot of valuable information at ThinkWiki - but that information is somewhat dated and might not provide enough clarity for people without extensive previous experience with Thinkpad notebooks.

The first idea that is not well explained upfront is that the Master Boot Record / partition table area in T420s notebooks (and probably other similar models with Windows 7) contains a sort of proprietary IBM/Lenovo boot manager, which I will call BMGR for the rest of this post - if you are more familiar with Linux, think of it as a completely different kind of GRUB (or LILO, or EXTLINUX). It is a very different program from any of the Linux boot managers - for instance all those above use a text-file configuration somewhere in one of the partitions, while BMGR uses some kind of binary configuration / extension located in the first 8 (in some instances maybe even 16) sectors of the disk - so the MBR plus 7 (or 15) more standard 512-byte sectors. The bottom line is that you do not have a dumb small/standard DOS boot sector there, but something a little bigger.
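Since that binary BMGR data lives in the first sectors of the disk, it is a good idea to save those sectors before experimenting with other boot loaders. A minimal sketch - the device and file names are my assumptions, and the commands below demonstrate the idea on a scratch image rather than a real disk:

```shell
# Create a scratch "disk" so the example is safe to run; on the real
# Thinkpad the source would be the disk itself, e.g. if=/dev/sda (as root).
dd if=/dev/zero of=/tmp/disk.img bs=512 count=64 2>/dev/null
# Save the MBR plus the next 15 sectors (16 x 512 = 8192 bytes),
# which is where BMGR keeps its binary configuration / extension:
dd if=/tmp/disk.img of=/tmp/bmgr-backup.bin bs=512 count=16 2>/dev/null
# The backup can later be restored with if/of swapped (keeping count=16).
```

Keep in mind that restoring those sectors over a repartitioned disk would also overwrite the partition table, so keep the backup paired with a note of the partition layout it belongs to.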

With the original partition table, modern Thinkpads have at least 3 NTFS partitions: one "System" partition (where the actual Win7 boot loader is located - a partition that I believe is normally hidden, or named S:), the standard Windows_OS partition (where the system is located, but which Microsoft calls "Boot" = C:), and another NTFS-formatted partition (Lenovo_Recovery) which I believe appears as drive Q: in Windows and holds the original installation / recovery image from which your Win7 can be reinstalled "from clean". Those images are contained in some WIM files and can currently be written to 1 CD + 3 writable DVDs - which is something you should absolutely do as the very first thing after you get your Thinkpad! Since I am a little paranoid I also made a second copy - the original program only lets you write one set, but of course it cannot stop you from later duplicating those disks :)

So why is all this relevant? Well, mainly because BMGR can do some extra tricks - if you press the blueish ThinkVantage button before the Windows loader starts, you will be taken to a separate "Rescue and Recovery" mode - a Windows RE (Recovery Environment), which is an extension of the older Windows PE (Preinstallation Environment), in which you have programs that let you do minor repairs or even a backup / restore (including a full restore to the original "clean configuration"). All that stuff is basically booted from the first partition (SYSTEM, S:), but to restore things to the original configuration it will use two very large files from Q: (Lenovo_Recovery). And if you also get a new version of the Rescue and Recovery program you can do some specific recovery operations from inside that mode!

Another trick is that on the small SYSTEM partition you have another small WinRE boot target, which can be started automatically if Windows fails to boot normally - in that environment you have the pretty standard "System Recovery" for Windows 7, but since this one (unlike most default Win7 installations) lives in the separate SYSTEM partition (which is very, very rarely written to), chances are it could come in handy more than once (for a CHKDSK, or when a very bad driver is installed and Windows no longer starts).

The problem is that the installation of the Windows-based part of "ThinkVantage Rescue and Recovery" (v4.5, which is not small and was not preinstalled on my computer) REQUIRES the presence of BMGR - and if BMGR has been replaced with something else (for instance GRUB), that R&R installation will fail! So, as a bottom line: if you really want R&R, install it very early! (The program is OK if you do not have something better, but IMHO it seems somewhat slow, since it does not appear to take full advantage of NTFS "tricks" like shadow copies and restore points.)

And that gets us to the multiple-boot Linux problem: if you still want all the above features, you cannot let GRUB/LILO/EXTLINUX overwrite the MBR (which is the normal thing to do when there is a "dumb boot sector" there). If you have already installed, for instance, GRUB, you can restore the old Lenovo BMGR (with a little work - maybe I will do a separate post about that), but IMHO the ideal configuration is one where you place GRUB in the boot sector of the Linux partition and just add a second entry to the normal Win7 boot menu with BCD or EasyBCD (you can call that entry Linux or Grub). I actually added a BCD entry for Grub4DOS and then chained that one to the GRUB2 from my Mint13 installation - which, I have to say, installed pretty well, and I got Compiz working just fine!
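For illustration, the Grub4DOS end of that chain is just a menu.lst entry like the one below (a sketch - the (hd0,4) partition number is an assumption; Grub4DOS counts partitions from 0, so adjust it to wherever your Linux partition with GRUB2 in its boot sector actually lives):

```
# menu.lst entry for Grub4DOS - chainload the GRUB2 that sits in the
# boot sector of the Linux partition (here assumed to be partition 5)
title Linux Mint (chainload GRUB2)
root (hd0,4)
chainloader +1
```

The BCD/EasyBCD entry then points at Grub4DOS (grldr), and from there this entry hands control over to GRUB2.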

But more about that kind of stuff in a later post :)

Monday, September 13, 2010

Fighting FakeRAID and GRUB problems

Everybody knows that RAID can be an effective solution for redundancy and general availability, but in a real professional environment it is clearly not enough - a very careful analysis of all the possible points of failure should be made (and people who believe that having a single expensive SAN with RAID5 inside "covers everything" at enterprise level should be quickly and quietly removed from decision positions).

However, for very cheap setups (home or really-small-business) some compromises have to be made, and given the abysmal reliability but acceptable price of the more recent generations of hard disks, it is now a decent solution to use some form of low-cost RAID 1 (mirroring) even at home.

Now, if you have some basic technical skills with computers you already know about the "integrated RAID" that is provided at basically no extra cost on most modern motherboards, and if you also have some decent Linux skills you know that this type of RAID needs some (serious) OS-side support, which is why it is referred to in the Linux world as FakeRAID and always derided in purist circles. However, that is a very narrow and slightly shortsighted view (of the same kind as "I don't see SMP becoming mainstream ...") - for certain scenarios FakeRAID can be a very effective solution if you know what you are doing!

So how bad is FakeRAID? Well, right now it is not as polished as some of the OS-level software approaches (like volumes in Windows or Linux) and it is not as fast as some of the very expensive dedicated hardware solutions, however:

a) it is much better for interoperability among different Operating Systems than any of the other software solutions - think of it as hardware vendors forcing Linux and Microsoft to come to an agreement on a (basic) volume format :) On top of that it also neatly solves the initial boot problem (since the BIOS is aware of it, unlike for instance a Microsoft proprietary solution).

b) it can be a LOT cheaper and also 'easier to recover' than a dedicated hardware solution.

So what problems should you be aware of ?

First of all there is still some lock-in, but this time from the hardware vendors - there are a number of FakeRAID formats and they do not "play nice" with one another, so a FakeRAID array from one type of motherboard will most likely not work on a totally different motherboard. You can see a more detailed discussion here, but as a general quick rule of thumb remember that ideally you should replace the motherboard in such a system with another motherboard with the same chipset for a full "instant replacement"!

If that kind of drop-in replacement is not possible, things are still not completely lost, since Linux can still see such FakeRAID volumes even on slightly different hardware (because the differences come from the BIOS level) - so you can still perfectly read all your data. But in that case you might need some extra storage to save the data to; then you can re-arrange things in the BIOS for the new FakeRAID format and restore the saved data - far more time-consuming, but still doable if you care about your data - which is precisely what happened to one of my systems :)

So what are the problems mentioned in the title? Well, once the theory above was well clarified (and I was no longer trying to get a "perfect solution", since the old chipset could no longer be found) I decided to reinstall clean versions of the Operating Systems. Things worked very well for Windows (except Windows 2000, which seems to no longer be supported by the FakeRAID vendors), but surprisingly the boot process was looking very tricky on Linux, specifically with some of my USB "recovery tools" and with new installations. The problem was apparently related to RAID support (since in some early attempts it was booting OK without the RAID), so I lost a huge amount of time experimenting with a number of distributions - most of which started OK from the installation DVD or LiveDVD but were rather tricky in setting up the partitions, and all of them, without exception, finally reached the same end-point: a non-bootable system at GRUB level !!!

I will not list all the stupid tests I did to clarify that strange behavior, but will instead fast-forward to the actual conclusion - the problem WAS related to RAID, but not to the actual OS-level RAID (well, most of the time at least - see the paragraph on Ubuntu below), but to something simpler and far more surprising: the free-memory pattern of the low 640k of memory, which changed when the RAID BIOS was activated !!!

So the bottom line is that even to this day there is a (dumb) bug in both GRUB and GRUB2 which "assumes" that the low memory is free at some fixed (???) addresses, from (linear) 0x90000 to about 0x9A000 - where some initial parts of the kernel (and also memtest) are loaded/started (in real mode) !!! And of course that assumption was NOT true once the extra RAID BIOS was active - so no surprise that the system was never bootable! You can easily check for that problem with displaymem at the GRUB legacy command prompt (GRUB legacy is still a LOT better than GRUB2 when something goes wrong) - if you see any reserved block in that range, you will see booting problems!

So how on earth were those Linux DVDs still starting so well? Well, that also explained something I had observed starting a few years ago - pretty much all serious distributions use ISOLINUX to boot from CD/DVD (even though GRUB looked easier to set up and handle) - and of course ISOLINUX / SYSLINUX / EXTLINUX all work fine even with strange low-640k memory maps. So, probably without ever getting to describe the full (potential) problems, the maintainers were just automatically choosing the boot method that "was working" !!!

Once that became clear, the obvious solution was to switch to EXTLINUX - which is harder than you would think, since as far as I know there is no large-scale friendly distribution using it as the final HDD bootloader. Many distributions include it somewhere, but they are optimized around having GRUB (and more recently GRUB2) as the general system bootloader, to be "automagically" updated after installing a new kernel or anything :(

So the final solution was to install Ubuntu. One extra "trick requirement" was that the above system was also somewhat used for certain tests and I wanted both a 32-bit and a 64-bit system side by side - so I "forced" GRUB legacy on both (but with different targets: the MBR of the entire disk for the 32-bit version and the actual Linux64 partition for the 64-bit version), and then "chained" from GRUB to a manually-maintained EXTLINUX for the actual booting. Things work OK as long as I remember to manually update the EXTLINUX configuration file after major kernel updates. There was also an extra complication - somewhere along my many tests I ended up with a separate small /boot partition (located in the first 8 GB of the disk - at some point in the tests I was ready to accept even the most fantastic explanations, like for instance the BIOS not being able to read in real mode past 8 GB), where EXTLINUX is also placed (starting from the boot sector of that partition) and where I have to copy the kernel and initrd files. While doing that I also noticed that Ubuntu kernel/initrd files do not carry any "architecture mark" in the name - like x86 or x64 - so basically I keep them in separate folders to avoid the inherent name clashes :)
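As an illustration, that manually-maintained EXTLINUX configuration is conceptually something like the fragment below (a sketch only - the folder names match the x86/x64 convention from above, but the label names and the root= devices are placeholders, not my actual values):

```
# extlinux.conf on the small /boot partition - updated BY HAND after
# every major kernel update (copy the new kernel/initrd, bump the paths)
DEFAULT linux64
PROMPT 1
TIMEOUT 50

LABEL linux64
    KERNEL /x64/vmlinuz
    APPEND root=/dev/mapper/fakeraid_vol6 ro quiet
    INITRD /x64/initrd.img

LABEL linux32
    KERNEL /x86/vmlinuz
    APPEND root=/dev/mapper/fakeraid_vol5 ro quiet
    INITRD /x86/initrd.img
```

On a FakeRAID system the root devices are the device-mapper names that dmraid creates, which is why they look different from the usual /dev/sdaN.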

The other tricky part with Ubuntu was handling RAID partitions - it seems that the current 10.04 LTS, in either the LiveCD or the AlternateCD, is not really prepared to handle the changes after you create or seriously modify partitions on FakeRAID - so ideally you just create everything on the first boot from one of those CDs and then reboot, after which the installer will detect the partitions just fine and go ahead without any problems! Surprisingly, the Alternate x64 CD also has some bugs, so I ended up installing from Alternate x86 and Live x64 - but the resulting installs worked equally well once correctly set up with EXTLINUX!

So the bottom line is that sometimes you might still see unexpected things with Linux, but a little persistence (and good internet searching skills) can provide a quick workaround - and hopefully in the long run the GRUB bug above will be fixed and I will no longer need to keep both GRUB and SYSLINUX on my USB recovery tools :)

Friday, June 25, 2010

Ubuntu LTS - from 'huge disappointment' to 'acceptable Lazy Lynx' ?

There have been a number of previous Ubuntu versions that were clear disappointments, with 8.04 LTS at the top of the list (see for instance here), so the launch of the latest Ubuntu LTS version - 10.04 Lucid Lynx - was awaited with understandable curiosity.

Unfortunately, after a promising beta the first release candidate started to expose ugly problems which unfortunately remained in the final version :( It is however possible to get past those and, with a little effort, end up with a very acceptable installation - but that is obviously not simple for a newcomer, nor when deploying on more than a small number of computers - so the correct codename for this version IMHO should have been 'Lazy Lynx' :)

The first and most serious problem is a very unfortunate (and IMHO far from smart) change regarding X11 on a number of video cards - most notably some VERY common Intel models used mainly in notebooks - with the result that many systems that were very stable under the previous 9.10 cannot even boot the 10.04 LiveCD !!!

So if you just tried to start the 10.04 LTS LiveCD and after 5-10 minutes you get a computer that just hangs with a black or garbled screen and no feedback at all (most notably no startup sound), you are most likely a 'victim' of that problem (one other possibility is RAID-related stuff, but that one most often just 'blocks' the computer for about 3-5 minutes, after which it starts OK). One ugly but GENERIC workaround for the video problem is to start in 'safe video mode' - unfortunately in the current version that option is no longer exposed in an obvious friendly menu, so you basically need to press F6, erase the 'splash quiet --' part of the options and replace it with 'xforcevesa'. This can solve plenty of problems, but unfortunately you are then stuck with the minimal VESA video driver (which will remain after installation) and all acceleration and 3D is gone :(

The real workaround for certain Intel video cards is to use another option - 'i915.modeset=1'. Apparently the older X11 code was failing on maybe 3% of the Intel video cards with the default modesetting behavior (and for those cases you needed 'i915.modeset=0'), so somebody decided to reverse (disable) that default - which now seems to fail in something like 10% of the cases :( Anyway, if with that option your LiveCD starts nicely, the installation will work just fine but booting will fail on the first restart - so again add 'i915.modeset=1' manually (once) to the boot options, and then, once booted OK, run from the command line:
echo "options i915 modeset=1" | sudo tee /etc/modprobe.d/i915-kms.conf
(which will set that option persistently; note that the often-suggested sudo echo ... > file form would NOT work, because the redirection is done by your unprivileged shell, not by sudo). Some other information on the matter can be found at this page - but note that the page may describe the older cases where you had to disable modesetting (which is now the default).

As I said, another (rare) problem might appear in certain RAID configurations - but patience and attention (plus eventually some Google skills) will get you over that :)

If you want optimal interoperation with Windows, another 'pre-installation trick' is to format the Linux partition in advance as EXT3 with the older 128-byte inodes (mkfs.ext3 -I 128) and not let the installer reformat the partition (and in the installer you should ALWAYS use the advanced/manual partitioner!) - that way you will be able to read and write that partition from Windows with Ext2 IFS!
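The effect of that flag is easy to check on a scratch image before committing the real partition to it (the image path below is just an example; on the real system you would run mkfs.ext3 -I 128 directly on the partition device, e.g. /dev/sda5 - an assumed name):

```shell
# Build a small scratch image and format it with 128-byte inodes -
# the inode size the old Ext2 IFS Windows driver can understand:
dd if=/dev/zero of=/tmp/ext3-test.img bs=1M count=16 2>/dev/null
mkfs.ext3 -q -F -I 128 /tmp/ext3-test.img
# Confirm the filesystem really got 128-byte inodes:
dumpe2fs -h /tmp/ext3-test.img 2>/dev/null | grep "Inode size"
```

The same dumpe2fs check on the real partition is a quick way to find out, after the fact, whether an installer silently reformatted it with the newer 256-byte inodes.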

On certain hardware configurations (Broadcom cards) WiFi will not be configured by default, and you will need a wired Ethernet connection at least once so that the 'hardware wizard' can fetch the (redistribution-restricted) Broadcom firmware. After that your network will start, you can enable time-sync (which will install ntp), and most certainly you will have over 200 MBytes of updates :)

While the default general color combination is a small improvement over previous versions, the 'new looks' are actually worse - apparently there is some huge 'OSX envy' somewhere at a very high level at Ubuntu, and that leads to some very dumb usability/efficiency decisions. One of the most talked-about subjects regarding 10.04 is how to get back to the classic button order, so here it is again - start gconf-editor, go to apps → metacity → general and change the value of button_layout to 'menu:minimize,maximize,close'! Of course that is only part of the road to a decent look - I also MUST change the theme from the new Ambiance to Clearlooks (which allows me to configure the color of the title bars, since it is the same as the color of the 'selected item').

I also always make certain the 'desktop effects' are set to 'Advanced' (which means Compiz) and then install compizconfig-settings-manager, where I have a huge number of personal productivity settings (including a large number of hotkeys). I also need my favorite Firefox plugins + my passwords + my bookmarks, then I install VLC (which in this version on my notebook seems to need manual configuration for OpenGL video output, since otherwise it will crash the system), and of course WINE, 'just in case'.

The look and usability will be more than acceptable in the end:

Friday, January 29, 2010

When no amount of reality distortion field can hide the fact that the emperor is naked

After a long media frenzy about how the Apple tablet would revolutionize the world and be the best thing since the invention of the PC, the actual official announcement of the iPad has indeed smashed all records - I mean, all records for the number of new jokes (most of them surprisingly funny) launched in just 24 hours !!!
There are many 'firsts' in regard to this latest launch, but none of them is really a pure technological achievement - and most important of all is the fact that, for the very first time, Apple has launched a product that even the most die-hard fanboys see as uninspiring and not particularly desirable in any way. This is a huge problem, since with Apple it was always a matter of selling a product which was visibly and vastly overpriced but at least so desirable that people would skip rent and still buy one just to have the latest 'status symbol'. That is no longer the case - and only in rather small part because the name resembles an article of feminine personal hygiene. It could be that for the first time there was too much hype prior to the launch, but more likely it is also related to the total lack of anything technologically new or at least exciting - it is just a much bigger iPod, one that you can NOT carry in your pocket to listen to your music on your daily commute ...
There IS however one potential market segment where the iPad might fit - but again I wonder if Apple really would like to go there - the product could be an almost-perfect device for your grandmother, or anybody else with poor sight or hand coordination. But that would instantly make it the most non-cool device Apple has ever sold, and I am curious whether the company would survive the wave of jokes on that matter.
There are plenty of other things wrong with the device - unfortunately, at the very top is the actual US economy :( The device would have done well in 2005 or 2006, but this is 2010, and with little chance to see a real recovery before 2020 the sticker shock might be a little too much for everyone but the most brainwashed fans.
And finally let's get to the actual huge technological problem. With the original iPod the real innovation was a legal market for a product which until then the RIAA would have considered totally illegal (and certainly fought in court); with the iPhone the trick was to grab some small market share from other totally-closed devices (and get most of the money in hidden long-term fees from the cellphone providers). Today, however, the iPad is trying something that even most mactards see as the wrong direction - closing down a class of devices perceived as personal computers, which until now people could assume they own. That was never entirely true, given that both Windows and OSX are actually licensed, but at least in theory it was possible to 'break free'. Well, not any more - the new device is just another totally-closed ecosystem, but unlike the iPhone (where pretty much all the other players were equally closed at the moment when Apple was grabbing market share), on this one there are quite a number of cheaper and better devices already on the market or coming very soon. And just as with the cellphones - where the actual iPhone killers are open-source devices like Android phones or the Nokia N900 - on the tablets the new wave of Linux tablets and netbooks will probably marginalize the iPad long before most people forget the original jokes about it.
The only other segment (besides the grandma-device one) where the iPad might eventually compete is electronic books - there ALL the publishers will agree that the total closedness is a huge advantage - but unfortunately the device is 1-2 years too late even for that (not to mention that a Kindle will kick major iPad ass in direct sun).
So - will it fail or not? Well, for 1-2 years the sales to fanboys might keep it afloat, but in the long term it might remain in history as the device that sank the myth of Steve Jobs ...

Thursday, November 12, 2009

Ubuntu finally getting a good version this year ...

After the half-failure of the current Long-Term-Support (LTS) 8.04 version, the guys from Canonical finally managed a decent version with 8.10, but then 9.04 was again a disappointment, so everybody was expecting 9.10 - which I am happy to say is (finally) another solid release !!! (even if the transition to GRUB2 is not without surprises).

However, the major question in the mind of everybody involved in using Ubuntu for something more than getting email and surfing the web at home is about the next Long-Term-Support version - and we all hope that 10.04 LTS will be the version that FINALLY allows real business use of Linux ... but of course that should not be taken as certain - some last-moment stupid decision is still possible :(

Saturday, January 03, 2009

Predictions 2009

Just a very quick post for now - as you can see, my 2008 predictions were over 80% correct - I only missed the one regarding a high-profile assassination (which however remains a possibility for 2009) and the success of the second-generation iPhone.

I will only have one general prediction for 2009 and you probably already know about that - things will get A LOT WORSE - among other things expect 1 EUR = 2 USD, more war and generally the shit will hit the fan ... and I don't (yet) speak about the global warming shit - that will start hitting us in about 5-6 years :(

I will also provide only one technical prediction for 2009 - things will look bad for the large players, especially Apple and Google, but it might be that precisely because things will be so ugly we will finally see some actual progress - at least better batteries, but I really hope for (at least) one major innovation ...


Tuesday, December 02, 2008

Apple 'behind' the most hilarious Simpsons episode in years ...

The funniest Simpsons episode in years aired recently - you can also take a look at the best part here, but let's just say that every single thing that is wrong with Apple is in there (except maybe the recent action where Apple directly attacked free speech in a way that certainly makes Microsoft proud). There are many, many small references, including one to the famous '80s commercial ... just take a look for yourself (and don't be fooled by mactards - it IS incredibly funny).

Another 'funny' thing from Apple - as predicted, they are getting closer and closer to spawning their own antivirus industry - !!!

UPDATE - the knowledge base article above was removed by Apple - but I guess that is now far more embarrassing :)

Also even more embarrassing is the OFFICIAL LEGAL POSITION of Apple that ONLY A FOOL WOULD BELIEVE APPLE 3G ADS !!!

Sunday, June 01, 2008

Mandriva 2008 Spring vs. Ubuntu 8.04 LTS ... and how neither one wins :(

You might remember from my old Ubuntu 7.10 vs. Mandriva 2008 story, and later from the Fedora vs. OpenSUSE vs. Ubuntu comparison on both x86 and PPC, that the winner of the 'end of 2007 Linux competition' was Ubuntu 7.10 (but only by a whisker or so) - and I actually mentioned that the competition was only starting to get more interesting - so now that we are at the end of spring 2008 it's time to see how things have evolved ...

Mandriva launched the new 2008 Spring version (or 2008.1) around April 9th 2008, and just 2 weeks later, around April 24th, Ubuntu 8.04 LTS was out, clogging the internet pipes in a download frenzy :) Needless to say, I tested some of the betas and release candidates before those launch dates, and very soon after I also installed both final versions on a number of computers - but as always there were too many other, more urgent things to finish first, and I wanted a slightly longer-term experiment with these, so only now will I attempt a (rather short) review of the two.

I will start with the good part - Mandriva 2008.1 got slightly better than the previous version with the new 2.6.24 kernel, and the entire color theme and user interface remains very good, polished and usable - which probably makes Mandriva one of the best distributions around if you want both KDE and GNOME.

In a rather similar way, Ubuntu 8.04 has integrated the newer 2.6.24 kernel and all the latest software, and has also made a number of small usability improvements - for instance, for a newcomer from Windows, WUBI might be a nice way to take a first look at Linux.

Unfortunately, here is where the constructive criticism must start - BOTH Mandriva 2008.1 AND Ubuntu 8.04 LTS just scream 'unfinished work' all over. It might be a general Linux trait that every distribution is an evolving target, but the 'managers' at both Ubuntu and Mandriva somehow decided to replace the major winning principle of the entire open-source philosophy of building software - we will ship when it is ready - with fixed release dates (something not even Microsoft is able to pull off most of the time). That is a huge mistake - and it is mind-blowing that the entire media is still missing the fact that by officially abandoning the 'we will ship when it is ready' approach, open-source just handed a major victory to the closed-source camp :(

Among the worst parts in Mandriva:
  1. On clean installations on certain notebook video cards (and NOT the 'latest stuff', just ordinary Intel cards that worked perfectly with the previous installer ...) the dual-output detection leads to a bad graphics configuration and the system can not start in GUI mode - a more experienced user is able to fix things very quickly from the command line, but it's easy to see how somebody without any previous Linux/X11 experience will run away scared and never look back;
  2. There is still a problem with suspend/hibernate which seems to be related to a 'video suspend script' - the 'suggested workaround' is to remove the script /usr/share/pm-utils/sleep.d/20video ???
  3. CPU SpeedStep is OK for 1.2 GHz Low-Voltage Pentium-M models but not for the slightly-newer 1.4 GHz Low-Voltage Dothan ??? (which works fine in Ubuntu or Windows).
  4. HUGE problem in certain configurations with the default inode size on EXT3 - 'normal/older' versions of GRUB can not even boot from those !!!
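That inode-size problem can actually be reproduced without touching a real disk - here is a minimal sketch, assuming a Linux box with e2fsprogs installed (the image path is just an example), that formats a loopback image with the old 128-byte inodes that 'normal/older' GRUB can still read, instead of the newer 256-byte default:

```shell
# Create a small file-backed ext3 image - no root needed
dd if=/dev/zero of=/tmp/ext3-demo.img bs=1M count=8 2>/dev/null

# Force 128-byte inodes so legacy GRUB can still read the filesystem
# (-F allows mkfs to run on a plain file, -q keeps the output quiet)
mkfs.ext3 -q -F -I 128 /tmp/ext3-demo.img

# Confirm the inode size actually written to the superblock
tune2fs -l /tmp/ext3-demo.img | grep 'Inode size'
```

On a real install the same -I 128 option can be passed when formatting the /boot partition, which sidesteps the whole problem until GRUB gets patched.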

Good or at least better-than-Ubuntu stuff in Mandriva 2008.1:

  1. Ndiswrapper still works;
  2. Compiz seems to be working for the first time stable with KDE;
  3. notebook HDD problems seem to be handled better than in Ubuntu;
  4. Default UI is better than in Ubuntu.

Worst parts in Ubuntu 8.04:

  1. Ndiswrapper is no longer working (there is an ugly workaround here, but it does not work in all Broadcom configurations);
  2. Suspend/resume is again not always working :(
  3. Many things that look 'unfinished' or 'untested';
  4. Still using the old/unpatched GRUB that can not boot from newer EXT3 filesystems with big inodes;
  5. Default UI is still hurting my eyes :)

So who is the winner of the spring contest ? If I were forced to pick just between the two above I would say Mandriva might now be a whisker ahead on some configurations - but the actual, unexpected winner is Ubuntu 7.10, which still remains the main Linux on my ultraportables !!!

This shows a dangerous closed-source precedent now moving to Linux - just as many people still favor XP over Vista or OSX 10.4 over 10.5, this is the first time a newer 'generation' of Linux distributions fails to become clearly better than the previous one ... That is definitely related to the fixed release schedules imposed by the above distributions, but it also raises a small question - is it possible that the complexity of modern distributions (meaning full operating systems, and I include Vista and OSX 10.5 here) has now reached a point where it is no longer possible to 'get them right' the first time ?

Saturday, March 01, 2008

What can you do when you need the portability of the MacBook Air but you only have 500 USD to spend ...

First things first - if you just want the HotAir in order to "make you look cool" (as probably 95% of the owners do) you have already lost that battle, time to go away, nothing for you here ... However if you really like the idea of a very portable notebook, you are not crazy about glamour and you only have very little money - keep reading - there is still hope :)

For a smart buyer there ARE a few low-cost alternatives but you will need to set your priorities straight - first real question is "does size matter to you ?" :) Or in other words - how good are you with very small screens and keyboards ? If you don't mind a VERY small screen and a small keyboard you can actually get something SMALLER than the MacBook Air starting NEW around 300-400 US$ - either the already famous Asus eeePC or the newer Everex Cloudbook - get the first if a more normal touchpad is important to you, the second if disk size is essential.

However there is a serious problem with both of the above - the screen has roughly a third of the Air's screen area, and the keyboard can feel too small at first, so you will NOT be writing your novel on a 7'' screen (and generally you should also avoid reading one on it if you care about your eyes).

If money were no object there would be many other choices a LOT better than the HotAir - the Panasonic W or Y series, Lenovo X61 and X61T, Dell Latitude XT, and soon the Lenovo X300, which seems quite nice ... but obviously those are in the same price range as the Air (just without skimping on any important features and then selling that as 'major progress') ... so the only route left is pre-owned. Amazingly, you can get something with BETTER FEATURES than the HotAir for about 500 US$ - just head for ebay or craigslist and look for a Dell X300 or Dell X1. The 'secret' is to buy from a bigger seller that probably got a (very) large batch retired by a bank or something like that - they will have a decent description of the item, 10000+ feedback (so a scam is very unlikely), and most often you will get 2 weeks of 'warranty', so if things go wrong you can still send it back ...

The X300 is cheaper, and I recommend the models with the 1.4 GHz Low-Voltage CPU (but the older 1.2 GHz Low-Voltage CPU is also just fine and runs circles around the 600 MHz Low-Voltage Celeron from the eeePC). The X1 comes with an Ultra-Low-Voltage CPU around 1-1.1 GHz - one that was only marginally superseded by some Intel models in 2007, with the first major step forward actually coming in 2008, so it is still almost as good as it gets among the ultra-cool CPUs!

You can probably get a good X300 with 640 MB RAM, 30-40 GB HDD, docking station, combo CDRW/DVD and two batteries for under 400 US$ delivered. If you want more RAM you should actually try to get one of the cheaper models with only 128 or 324 MB RAM, since any memory upgrade means throwing away the stick that is inside and adding a 1 GB SODIMM from Crucial or similar for around 80 US$ - for a total of around 1.1 GB RAM, which should be OK for any decent scenario. The same goes for the HDD if you need LOTS of space - you can upgrade to a 120 GB WD or Samsung 2.5'' model for well under 100 US$. The total will most likely come to less than 550 US$ even if you do both upgrades, and you will have an amazing subnotebook that is LIGHTER than the HotAir and has ALL the extensions you will ever need - USB, FireWire, PCMCIA, SD card reader, Ethernet, WiFi, video out, swappable battery, even modem and IRDA :)

Most of these machines will come with the original XP sticker, so you will be able to install a clean, legal copy of the Dell OEM XP - and since the hardware configuration is rather classic by now, most modern Linux distributions will also work just fine (possibly with a little tweaking on the WiFi and suspend-to-RAM side).

Another HUGE advantage over the Air or any of the 2000-3000 US$ new subnotebooks is that you can actually get 2 notebooks + all upgrades for under 1000 US$, and at that point you can keep them 'cloned' - the moment one fails you just switch to the other - unlike a new expensive model where you might have a warranty, but that means sending the notebook (most likely together with ALL your confidential information) off for repair, and AT BEST getting it back in 1-3 weeks ...

Finally, here is another somewhat similar post - Cloudbook vs. eeePC vs. X31 Thinkpad - the IBM Thinkpads are probably even better built than the X300/X1 but are slightly heavier and do not have a touchpad :( (that last one being the reason why the X61T was not the absolute best TabletPC ever on the price/performance ratio - but I hope the Lenovo X71 will fix that :) ).