Monthly Archives: September 2009

Ubuntu 9.10 boot stats

Bear in mind this is alpha 6. I timed from the end of the BIOS load (which takes about 8 seconds, so it’s 46 seconds from power-on to idle desktop):

0s – OS starts to boot
24s – at logon screen
38s – desktop loaded, hdd idle

This is not a fresh install as I’ve been using it for a few days; however, I did stop postfix and samba from loading at boot (they aren’t installed by default anyway). I’ve also added KVM.

This is pretty impressive performance, but not enough to make sleep or hibernate redundant, and it doesn’t really blow Windows 7 out of the water either.

Specs:

  • Dell E4300
  • Core 2 Duo 2.26GHz
  • Seagate 7200.4 500GB laptop hard drive

Bring on the SSDs – at $900 the Intel 160GB X25-M G2 is still way too expensive and would have to drop by about 60% before I’d even consider one.

Nokia N900 availability in New Zealand

So. They’re not out yet but I want one, and it doesn’t look like it’s going to be easy to get hold of one.

I enquired with a friend who’s a computer dealer this week. He has an account with a major wholesaler that distributes Nokia devices, so I figured it would be a good place to start. The reply I got back was rather interesting.

Hi Alex,

By some wondrous decision, without consultation, we are now NOT allowed to sell any product from the importer that has Cell capabilities.

We have approached the commerce commission but they are swamped with bigger matters!

Long live New Zealand’s free trade… It appears that the big brothers still run the country.

[…]

Say what?

So I decided to ask Nokia themselves. And this is the response:

Hi Alex,

I am pleased to hear of your keen interest in the Nokia N900.

At this stage, there are no updates when this phone will be launched and release in Asia Pacific, which is including New Zealand. Hence, I do apologise as I am unable to confirm if the Nokia N900 will be available for sale in New Zealand once it is launched in Asia Pacific.

Kindly be advised that all new product launches are carrier and market dependent in all countries due to the tests carried out to ensure compatibility with network and government regulations. Hence, the launch dates are still not available as it is still in tests and awaiting approval from the respective network providers and government.

As a suggestion, you can subscribe to our Nokia e-Newsletter. The e-newsletter will provide any latest updates on our products as well dates for new product launches. You may refer to the link below to register for the e-Newsletter subscription:

http://www.nokia.co.nz/subscribe

We thank you for your interest in Nokia products and hope for your continued support.

Hope the above helps to clarify your query.

Thank you for emailing Nokia Careline! Please help us serve you better by providing your valuable feedback at:

[Link removed, has UID]

Do you know you can now update your phone software at your own convenience?
Visit www.nokia.com.au/support to check if your phone model is supported and download the “Nokia Software Updater”.

Kind regards,

Suba
Nokia Careline
Please contact us at 0800 665 421
www.nokia.co.nz/support

Well, I’m 100% sure that no government regulations are going to get in the way of a generic HSDPA device. In other words, they need to wait for Vodafone to test whether an HSDPA 900/2100 device will work on an HSDPA 900/2100 network.

Can’t the consumer take some responsibility here? What if we want a phone that’s not locked to a particular carrier’s network? Paying $1000+ for a phone that only works on Vodafone’s network? I don’t think so. What if we want to buy a phone at market value rather than at the exorbitant markup Telecom and Vodafone put on their phones? Vodafone charges $1800 for the N97, which is close to a 100% markup and totally absurd.

I’m sure the high markup is there to make the contracts that include the phone more attractive, but it completely shafts anyone who doesn’t want to be locked in.

This market needs to change. Networks are built on standards, and so long as a device is compliant with those standards there should be no need for the carrier to “approve” it and control the market. Cellular devices are not just phones anymore, they’re computers, and the market isn’t reflecting that. I think the wholesaler’s decision not to sell cellular devices to computer retailers is strongly influenced by another party – and it’s fairly obvious who this benefits (hint: not the wholesaler, the consumer or the computer retailer, and I’d be dubious about whether it benefits Nokia in any way).

It looks like I will have to get one from a parallel importer. But I’m not particularly happy about it.

HDD failure warning in Ubuntu Karmic (9.10)

I started to write a blog post about my backup solution, but didn’t actually finish it before this happened. I only got it running on Wednesday this week, and today my laptop (running Ubuntu 9.04) refused to boot! I was getting a lot of I/O and “DRDY ERR” error messages. The boot process mounted the drive read-only, dropped me to a shell and told me to run fsck manually (not terribly helpful for inexperienced users, I might add).

Anyway, instead of doing that I elected to reboot from a flash drive with 9.10 alpha6 on it, and examine the disk from a properly working system. After booting Karmic, I was greeted with the following message:

[Screenshot: disk failure warning from the gdu notification daemon]

How thoughtful!

The “icon” it’s referring to is a little disk icon in the top right of the screen with an exclamation mark on it. Clicking on it brings up the new Palimpsest Disk Utility – a nice step forward from 9.04, which only included gparted. There’s not really anything wrong with gparted, but its main focus is on partitioning and it doesn’t have other disk management features such as SMART monitoring. And Palimpsest does present a nice interface:

[Screenshot: Palimpsest Disk Utility]

Bad sectors are not a good sign, so it would seem that this not-very-old 500GB hard drive is on the way out.

To “repair” the bad sectors (i.e. make sure the filesystem doesn’t use them), I ran “fsck -c /dev/sda5” (sda5 is my root partition, the one that was giving me trouble). This runs the filesystem check in conjunction with the badblocks tool. For now it’s up and running again, but I’ll be replacing the drive and restoring my data before sending it off for RMA!
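
For reference, the command-line side of this looks roughly as follows (a minimal sketch: smartctl, from the smartmontools package, gives a command-line view of the same SMART data Palimpsest shows; device names are from my machine, and the partition must be unmounted before running fsck):

# Overall SMART health verdict and the full attribute table
sudo smartctl -H /dev/sda
sudo smartctl -A /dev/sda

# Filesystem check with a read-only badblocks scan; any bad sectors found
# are added to the filesystem's bad block list so they won't be used again
sudo fsck -c /dev/sda5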

It looks like I won’t need to go back to a backup, but this certainly shows the value of regular backups – when my laptop failed to boot I was extremely glad I had them!

Ubuntu 9.10 beta is only a week away, and so far “Karmic Koala” is shaping up to be a solid release.

Identity Management

(Warning: if you’re not an IT nerd this blog post may make rather dry reading)

Identity Management is a pretty big topic these days – some might say it’s the new IT buzzword. From an organisational perspective it is highly desirable for users to have as few passwords to remember as possible, as this reduces the need for them to write them down. Centralised management and provisioning of user rights also provides more certainty and reduces overheads.

With the use of authentication services such as Facebook Connect, Windows Live ID, and Google accounts becoming more widespread on the web, we’re starting to see the web trending away from the “one identity per service” model towards fewer identity providers supplying authentication services for other sites.

Recently I’ve been asked to investigate SAML-based single sign on solutions, so I’ve collected some of my thoughts in this blog post. Please note that this is based on my own research and should not be considered authoritative in any way!

The Web Perspective

One of the problems with the web today is the sheer number of usernames and passwords that people have to remember. You need to create an account for almost every online service you use, as sites need access to certain information about you in order to provide a useful service, and they need a way to ensure that you keep the same identity on the site the next time you visit. E-Commerce is a very significant example of an area where this is needed as you can’t accept payments without a fair bit of information.

Microsoft tried to solve the problem with their Passport service back in 1999 (it may actually have been even earlier). The idea was that your “passport” could be used to sign in to other Passport-enabled sites, and could contain enough information to allow e-commerce transactions to take place without having to enter your details every time. The problem, in typical Microsoft fashion, was that this was a centralised Microsoft service – they wanted to hold all the information. It should have come as no surprise then that adoption was rather limited, and fortunately as a result the current Windows Live ID service is a different beast.

What was needed was an open model not tied to a particular service, and that model is OpenID. All the aforementioned services support (or have committed to supporting) OpenID, which is in layman’s terms an open way of logging in to one site using credentials from another. So what this means is that theoretically you could use your Facebook account to login to any site that supports logging in with OpenID.

“Brilliant! Now I can use one identity for everything!”

There’s a small problem though.

The major Identity Providers (holders of your information) all want to be providers, but they don’t want to be consumers (i.e. accept logins from other sites). So while you can log on to Gmail with your Google ID, and digg.com with your Facebook ID, you can’t login to Facebook with your Google ID or Gmail with your Windows Live ID. We’re a long way from the OpenID dream of being able to sign in to any service with any ID, and there’s little stopping it but branding and marketing. But we are at least moving towards needing fewer logins, as smaller sites tend to be happy to accept logins from the major providers, and OpenID adoption is growing so it’s not all bad.

Organisational Needs

Large corporate networks mainly want a single place to manage user access to company resources. They also generally want their users to have as few passwords to remember as possible, and to have to enter their passwords only when really necessary. LDAP solves the first problem by providing that central repository of user information which services can outsource their authentication to, and most applications that would be used on a large network can do this. It doesn’t solve the second problem however, as the user still has to type their password for each service. But at least it’s the same password.
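
To make that concrete: a service that outsources authentication to LDAP typically just attempts a bind as the user – if the bind succeeds, the password is good. You can test the same thing by hand with ldapsearch (a sketch only; the server, DN and search base here are hypothetical):

# Attempt a simple bind as the user; -W prompts for their password
ldapsearch -x -H ldaps://ldap.example.com \
    -D "cn=jbloggs,ou=staff,o=example" -W \
    -b "ou=staff,o=example" "(cn=jbloggs)" cn mail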

OpenID works well for the web, where services are available to anyone with an email address – basically they don’t care who the user is as long as they’re the same person each time. However, the Identity Management needs of organisations are somewhat different. You generally don’t want to grant any old OpenID access to a company network, but you may want to grant employees or members of other organisations access to certain resources. What is needed, therefore, is a framework that refers to a centralised directory service, provides single sign on, and can grant access to users of other trusted organisations.

The solution to this is “Security Assertion Markup Language”, or SAML. SAML introduces the concepts of an Identity Provider (provider of assertions) and a Service Provider (consumer of assertions). In a SAML authentication session the user’s web browser tries to access the app and gets redirected to the login page of their Identity Provider, which returns a token to the browser upon login. The browser then forwards the token to the Service Provider, which verifies it and grants access. The best diagram I’ve seen explaining this process is on Google’s SAML reference implementation page for Google Apps.
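
You can watch the first leg of this exchange with curl if you’re curious (the URLs are hypothetical, and the later steps need a real browser session so they’re only described in comments):

# 1. Request a protected page at the Service Provider. The response is a
#    302 redirect to the Identity Provider's login page, with a SAMLRequest
#    parameter in the Location header.
curl -i https://app.example.com/protected

# 2. The user logs in at the IdP, which returns an HTML form containing a
#    signed SAMLResponse (the token).
# 3. The browser auto-submits that form (an HTTP POST) to the SP's Assertion
#    Consumer Service URL; the SP verifies the signature and grants access.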

The Identity Provider (IdP) part is the easy bit. The software is available (Shibboleth and SimpleSAMLphp are two examples), and once you get your head around the concepts and set it up correctly you can point it at a directory service and go. The problem currently is at the Service Provider (SP) end (the part labelled ACS in Google’s diagram), as few services actually support SAML. Google Apps is one of the first notable examples, and I’m hoping its adoption will solve the chicken-and-egg problem by driving adoption of SAML and providing the install base for other software developers to jump on board and add SAML to their services.

Software such as Novell Access Manager (which supports SAML) attempts to get around the problem by effectively acting as a gateway to the service, blocking access to unauthenticated users. That way the service doesn’t have to support SAML, and you can only reach it if you have permission. However, I don’t know how the target web service is supposed to handle authentication if it needs to know who you are (for example, to edit a wiki). I think the logical way would be for it to insert login credentials into the HTTP request, but hopefully this will become apparent when I start playing with it.

Conclusion

OpenID isn’t perfect, and like any username/password scheme it is particularly vulnerable to phishing attacks (only the stakes are higher as a successful attack results in access to multiple sites). The battle between the major providers to be the provider of your identity also threatens to reduce the benefits. But regardless of the risks it seems like a step forward for the web.

For organisations that need single sign on and a federated trust model, SAML seems to be the way to go. But it requires much broader adoption by software developers and service providers before it will truly eliminate multiple logons in organisations. Heck, many don’t even support LDAP yet.

Ubuntu 9.10 Alpha 6 Impressions

So it’s Saturday night and… I’m blogging about Karmic Koala. My social life has really taken off recently.

But on a more serious note, I took alpha 6 for a spin on my E4300, and so far I’m impressed. I haven’t actually installed it to the hard drive yet, just booted from a USB key. But everything’s working well so far, and kernel mode setting is just the bee’s knees. It’s amazing how much of a difference it makes when switching terminals – it’s instantaneous. You will definitely want to be running an Intel or ATI card for this version.

I’ll be upgrading permanently once the beta comes out, so I’ll go into more detail then. I’ll also be refreshing my Mythbuntu media PC (Athlon II 250, GeForce 8200 motherboard), older laptop (HP nx6120), and maybe my old desktop (Intel P35 + ATI 4850), which gives pretty broad coverage in terms of hardware testing. I’m looking forward to seeing if battery life has improved – when Vista gets 5 hours and Ubuntu just over 3, you know something’s wrong.

It will also be nice to have an up-to-date browser again – Firefox 3.5 under Jaunty is not well integrated. I can’t comment on the boot speed as my flash drive is rather slow (and the live distro is not really indicative anyway). I tried to have a go with the new gnome-shell too, but unfortunately couldn’t get it to load. All I did was aptitude install gnome-shell from the live USB distro, so hopefully I’ll be able to get it working after installing the beta.
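
For anyone wanting to try the same thing, the attempt amounted to little more than this (a rough sketch – the --replace switch, which swaps gnome-shell in for the running window manager, is my assumption about how it’s meant to be launched):

# Install the gnome-shell preview and try to swap it into the running session
sudo aptitude install gnome-shell
gnome-shell --replace &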

I’ve decided it’s time to finally wipe Windows too – I never boot into it, so it’s just a waste of 80GB. Believe it or not, this will actually be the first time I’ve not had Windows installed on my main computer, so it’s quite a milestone really. It’s been over 3 years since I switched to using Ubuntu as my main OS, and looking back at Ubuntu 3 years ago it has come a long way. Edgy Eft (6.10) was usable but rough (wireless networking was a huge pain), and 7.04 was a big improvement. 7.10 was one of the high points, and was when I first started seriously recommending Ubuntu to others as a replacement for Windows. Then 8.04 with PulseAudio was a bit of a mixed bag but otherwise pretty solid, and 8.10 was a rather unexciting steady improvement. 9.04 was a big step forward with much faster boot times but big problems with the Intel graphics driver. 9.10 looks to resolve most of the Intel graphics regressions, but I think we’ll find there’s still room for 10.04 to improve again.

That’s one of the things I like about following Ubuntu – we get new toys to play with twice a year.

Techfest ’09

[Photo: Oscar Kightly was the MC – originally uploaded by Al404]

I’m not Microsoft’s biggest fan, but I’m not going to turn down food, drink and entertainment at their expense! And to their credit, this was a good event.

I snapped a few pics on my camera phone, and they show it’s well and truly past its limits in this sort of environment. A few people had SLRs, and I wish I’d taken mine, as most concerts won’t let you take that sort of equipment in. Maybe if I score a ticket next year as well…!

Katchafire, The Septembers and ElemenoP all put on solid performances, and the comedy act Ben Hurley was hilarious. Two thumbs up.

Full set is here.

Update: Some of my pics were added to the official Teched gallery.

Automatically update DansGuardian Filter Groups List from LDAP

Update 20th March 2011: Heath has made some modifications to the original script and made it more efficient, see the comments below.

Here’s a script I wrote today, which updates the filtergroupslist file of DansGuardian. If you’re using LDAP authentication and want to give different levels of protection to certain groups of users, you need to update the list somehow, as DansGuardian doesn’t support LDAP groups. See this page for more info on filter groups.

The school I wrote this for is a Novell eDirectory site, and the script will require a bit of modification to work elsewhere. In particular you will need to alter the parameters of the ldapsearch command (filter string, server name, user credentials). Other LDAP servers may not support the ufn attribute, which this is based on. If your directory is well maintained and up to date you would probably be better off using the uid attribute, but this particular school hasn’t populated it for all users yet (only users created with ConsoleOne and iManager populate the uid attribute by default). If you do use uid, be sure to remove the cut command – there’s a sketch of that variant after the scripts below.

ldapsearch outputs data in LDIF format, which is difficult to use in scripts. The tool to convert it is awk, which unfortunately is a language I haven’t learnt yet. So I found a premade awk script which converts LDIF to CSV (from here), removed all the attributes and replaced them with just ufn (you may want to use uid instead).

If you use this script and modify or improve it, I’d appreciate you contributing the modifications back, as they may be useful to others (myself included)!

updateFilter.sh

#!/bin/bash 
#
# Dansguardian filter group update script
# Alex Forbes, Edtech Ltd
# Updated 9th September 2009
#

## Variables
# Dansguardian filtergroupslist file
DESTFILE=/root/filtergroupslist-test
LOGFILE=/var/log/dansfgupdate.log

# LDAP settings
LDAPFILTER="(&(objectClass=Person)(|(groupMembership=cn=ALL-TEACHERS,ou=TCHR,o=HWK)(groupMembership=cn=ALL-ADMIN,ou=ADM,o=HWK)(groupMembership=cn=OESAdmins,o=HWK)))"

# Which filtergroup do you want the users to be a member of
FILTERGROUP=filter2

# Path to the awk script (converts the ldif file to parseable text). I modified one from
# http://www.yolinux.com/TUTORIALS/LinuxTutorialLDAP-ScriptsAndTools.html
AWKSCRIPT=/opt/ldif2csv.awk
TMP=/tmp

# Make a temp directory for the working files. There are probably better ways of doing it.
WIP=$TMP/dgFilterUpdate
mkdir -p $WIP

# Header message
echo "## This file is automatically updated, any changes will be overwritten" > $WIP/4final
echo "## See /opt/edir2dansg.sh" >> $WIP/4final
echo "" >>$WIP/4final

# Perform LDAP search. Outputs ldif file.
ldapsearch -uxvA -H ldaps://fs2.howick.school.nz -b "o=HWK" -S '' -s "sub" -D cn=ldapauth,o=hwk -w password "$LDAPFILTER" ufn > $WIP/1ldif

# Picks out the ufn attribute using a modified awk script I found:
awk -F ': ' -f $AWKSCRIPT < $WIP/1ldif > $WIP/2txt

# Picks the first field of the ufn attribute to generate a clean list of users
cut -d, -f1 $WIP/2txt > $WIP/3userlist

# Add the values required to meet the dansguardian filter format
for u in `cat $WIP/3userlist`; do
	echo "$u=$FILTERGROUP" >> $WIP/4final
done

# Finally, copy the file to overwrite the dansguardian list.
# I've done a simple check to make sure the file isn't too small in case of error, but it could be handled better.
SIZE=`stat -c %s $WIP/4final`
if [ $SIZE -gt 2500 ]; then
	cp $WIP/4final $DESTFILE
	echo $(date +"%Y/%m/%d %H:%M"): Updated filter groups list "("size $SIZE bytes")" >> $LOGFILE
else echo $(date +"%Y/%m/%d %H:%M"): Output file is too small, list not updated >> $LOGFILE
fi

# Gentle reload of dansguardian
dansguardian -g

And the modified awk script, ldif2csv.awk:

BEGIN {
        ufn = ""
      }
/^ufn: /              {ufn=$2}
/^dn/ {
        if(ufn != "") printf("%s\n",ufn)
        ufn     = ""
      }
# Capture last dn
END {
        if(ufn != "") printf("%s\n",ufn)
}
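
If your directory does have uid populated, the search and extraction steps would change roughly as follows (an untested sketch: -u and -A are dropped so that attribute values are actually returned, the awk script needs to match on ^uid: instead of ^ufn:, and the cut step disappears because uid is already a clean username):

# Request the uid attribute instead of relying on the ufn form of the DN
ldapsearch -xv -H ldaps://fs2.howick.school.nz -b "o=HWK" -S '' -s "sub" -D cn=ldapauth,o=hwk -w password "$LDAPFILTER" uid > $WIP/1ldif

# Extract the uid values straight into the user list (no cut needed)
awk -F ': ' -f $AWKSCRIPT < $WIP/1ldif > $WIP/3userlist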

Update 9-9-09: Fixed a few dumb mistakes.

Simple File Backup to Email Script

Here’s a file backup script I installed for a client. The original outline came from a post on the Ubuntu forums (I forget exactly where), but it’s simple enough: it creates an archive in /tmp, zips it up, emails it, then deletes the archive. If your target is a Linux computer it makes more sense to gzip it by adding a “z” to the tar options (i.e. tar -czf) and removing the zip line – a sketch of that variant follows the script.

#!/bin/bash
#
# Simple file backup script, creates archive in /tmp and emails it.
#
# Software required:
#  zip
#  tar
#  mutt

# Variables
MAILADDR=user@example.com  # where the backup gets sent (use your own address)
SOURCE="/home/user1 /home/user2"
SERVERNAME=server.example.com
MAIL=`which mutt`
ZIP=`which zip`
DATE=`date +%Y_%m_%d`
FILE=myfiles-$DATE.tar
DESTINATION=/tmp/$FILE
ZIPFILE=$DESTINATION.zip

# Actions
tar -cf $DESTINATION $SOURCE 2> /dev/null
$ZIP $ZIPFILE $DESTINATION
$MAIL -a $ZIPFILE -s "$SERVERNAME backup $DATE" $MAILADDR < /dev/null
rm $ZIPFILE $DESTINATION
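
As mentioned above, if the destination is a Linux machine you can drop zip entirely and let tar do the compression. A minimal sketch of the lines that change:

# Create a gzipped tarball directly and email that instead
FILE=myfiles-$DATE.tar.gz
DESTINATION=/tmp/$FILE
tar -czf $DESTINATION $SOURCE 2> /dev/null
$MAIL -a $DESTINATION -s "$SERVERNAME backup $DATE" $MAILADDR < /dev/null
rm $DESTINATION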

For mutt to work you need an MTA (mail transport agent) such as postfix. If it’s not installed and you don’t need it for anything else, configure it as a satellite system (the Ubuntu/Debian packages prompt you on install and satellite system is an option). This prevents spammers from using it as a relay, and ensures the mail goes to your real mail server.
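
On Debian/Ubuntu the satellite setup boils down to a couple of commands (the relay host shown is a placeholder for your real mail server):

# Re-run the package configuration and choose "Satellite system" when prompted
sudo dpkg-reconfigure postfix

# The end result is a relayhost entry in /etc/postfix/main.cf, e.g.
#   relayhost = [mail.example.com]
# which you can verify with:
postconf relayhost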

Don’t visit al4.com …

Noticed today that my .com namesake is an adult search/spam site, and since it was registered in 2002 it probably has been for a while. The main reason, of course, is that it’s a three-letter domain (visit any three-letter .com domain and it’s guaranteed to be registered).

Messing with the N900

Apparently Nokia believes the mobile network carriers won’t be interested in selling the N900 because it won’t let them mess with the operating system.

Strangely enough, this is one of the reasons why I will be buying one.

It’s also interesting to see that Nokia doesn’t consider the N900 to be the “next generation” of computers. That honour is reserved for their fifth-generation tablet – the model after the N900.