Points of Authority

Posted on 27th May 2011

Back in February I did a presentation for the Birmingham Perl Mongers, regarding a chunk of code I had been using to test websites. The code was originally based on simple XHTML validation, using the DTD headers found on each page. I then expanded the code to include pattern matching so I could verify key phrases existed in the pages being tested. After the presentation I received several hints and suggestions, which I've now implemented and have set up a GitHub repository.

Since the talk, I have started to add some WAI compliance testing. I got frustrated trying to find online services that claimed to validate full websites, but which either didn't work or charged for the service. There are some downloadable applications, but most require you to have Microsoft Windows installed or, again, charge for the service. As I already had the bulk of the DTD validation code, it seemed a reasonable step to add the WAI compliance code. There is a considerable way to go before all the compliance tests that can be automated are written into the distribution, but some of the more immediate ones are now there.

As mentioned in my presentation to Birmingham.pm, I still have not decided on a name. Part of the problem is that the front-end wrapper, Test::XHTML, is written using Test::Builder so you can use it within a standard Perl test suite, while the underlying package, Test::XHTML::Valid, uses a rather different approach and provides a wider API than just validating single pages against a DTD specification. Originally, I had considered releasing these two packages separately, but now that I've added the WAI test package, I plan to expose more of the functionality of Test::XHTML::Valid within Test::XHTML. If you have namespace suggestions, please let me know, as I'm not sure Test-XHTML is necessarily suitable.

Ultimately I'm hoping this distribution can provide a more complete validation utility for web developers, which will be free to use and will work cross-platform. Those familiar with the Perl test suite structure can use it as such, but as it already includes a basic stand-alone script to perform the DTD validation checks, it should be usable from the command line too.
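
For anyone wondering how a Test::Builder-based wrapper slots into a normal test suite, here's a minimal sketch. The function names and arguments are purely illustrative assumptions, not necessarily the real Test::XHTML interface, so check the distribution's documentation for the actual API:

    #!/usr/bin/perl
    # t/website.t - hypothetical example; valid_xhtml() and like_content()
    # are illustrative names, not necessarily the real Test::XHTML exports.
    use strict;
    use warnings;

    use Test::More tests => 2;
    use Test::XHTML;

    # validate a single page against the DTD declared in its headers
    valid_xhtml( 'http://www.example.com/', 'home page validates against its DTD' );

    # check that a key phrase exists in the page content
    like_content( 'http://www.example.com/', qr/Contact Us/,
        'home page mentions the contact page' );

Because the wrapper is built on Test::Builder, a script like this runs under prove or make test alongside any other .t files in the suite.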

If this sounds interesting to you, please feel free to fork the GitHub repo and try it out. If you have suggestions for fixes and more tests, you are very welcome to send me pull requests. I'd be most interested in anyone who has the time to add more WAI compliance tests and can provide a better reporting structure, particularly when testing complete websites.

File Under: modules / opensource / perl / technology / testing / usability / web


The Sanity Assassin

Posted on 12th May 2011

An update to my recent post.

With thanks to fellow Perler Smylers, who informs me that a Flash Cookie refers to the cookie used by Flash content on a site, which saves state on the user's machine, bypassing the browser's preferences. Odd that the advice singles out this type of cookie by name though, and not the others.

In a Wall Street Journal article I came across after posting my own, I was interested to discover that the ICO themselves use Google Analytics. So after 25th May, if you visit the ICO website and see no pop-up, I guess that means Google Analytics is good to go. Failing that, they'll see a deluge of complaints that their own website fails to follow the EU directive.

I also recommend reading StatCounter's response. They too note the problem with the way hosting locations are (not) covered by the directive, and the fact that the protection from behavioural advertising has got lost along the way.

After a discussion about this at the Birmingham.pm Social meeting last night, we came to the considered opinion that this would likely just be a wait and see game. Until the ICO brings a test case to court, we really won't know how much impact this will have. Which brings us back to the motives for the directives. If you're going to take someone to court, only big business is worth fining. Bankrupting an individual or a small business (the ICO now has powers to fine up to £500,000) is going to give the ICO, the government and the EU a lot of really negative press.

Having tackled the problem in the wrong way, those the directives sought to bring into line are only going to use other technologies to retrieve and store the data they want. It may even affect EU hosting companies, if a sizeable portion of their market decides to register and host their websites in non-EU countries.

In the end the only losers will be EU businesses, and thus the EU economy. Did anyone seriously think these directives through?

File Under: government / law / security / technology / usability / web / website


The Planner's Dream Goes Wrong

Posted on 11th May 2011

From May 26th 2011, UK websites must adhere to an EU directive regarding cookies that still hasn't been finalised. Other member states of the EU are also required to have laws in place that enforce the directive.

Within the web development world this has caused a considerable amount of confusion and annoyance, for a variety of reasons, and has enabled media outlets to scaremonger about the doom and gloom that could befall developers, businesses and users. It wouldn't be so bad if there was a clear piece of legislation that could be read, understood and followed, but there isn't. Even the original EU directives are vague in the presentation of their requirements.

If you have the time and/or inclination, the documents to read are Article 2 of Directive 2009/136/EC (the Directive), which amends the E-Privacy Directive 2002/58/EC (the E-Privacy Directive), both of which are part of the EU Electronic Communications Framework (ECF).

Aside from the ludicrous situation of trying to enforce a law with no actual documentation to abide by (George Orwell would have a field day), and questioning why we are paying politicians for this shambolic situation, I have to question the motives behind the creation of this directive.

The basic Data Protection premise for tightening up the directive is a reasonable one; however, the way it has been presented is potentially detrimental to the way developers, businesses and users, particularly in the EU, are going to browse and use the internet. The directive needed tightening due to the way advertisers use cookies to track users as they browse the web and target adverts. There has been much to complain about in this regard, and far beyond the use of cookies, with companies such as Phorm trying to track information at the server level too. However, the directive has ended up being too vague and covering too wide a perspective to tackle the problem effectively.

Others have already questioned whether it could push users towards non-EU websites to do their business, because they get put off using EU-based sites. Continually being asked whether you want to have information stored in a cookie every time you visit a website is going to get pretty tiresome pretty quickly. You see, if you do not consent to the use of cookies, that refusal cannot be saved in a cookie, and so when you revisit the site, it doesn't know you said no and will ask you all over again. For those happy to have simple preferences and settings stored in cookies, you'll be asked once and never again. If you need an example of how bad it could get, Paul Carpenter took a satirical look at a possible implementation.
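
To make the re-asking problem concrete, here is a minimal sketch in Perl using the core CGI and CGI::Cookie modules. The cookie and parameter names are assumptions for illustration only; the point is that the only answer a compliant site can ever remember is a yes, never a no:

    #!/usr/bin/perl
    # Sketch of cookie-consent logic; the 'consent' cookie and parameter
    # names are illustrative assumptions, not taken from any real site.
    use strict;
    use warnings;
    use CGI;
    use CGI::Cookie;

    my $cgi     = CGI->new;
    my %cookies = CGI::Cookie->fetch;

    if ( $cookies{consent} && $cookies{consent}->value eq 'yes' ) {
        # visitor consented on a previous visit, so preference and
        # analytics cookies may now be set as normal
        print $cgi->header;
    }
    elsif ( ( $cgi->param('consent') || '' ) eq 'yes' ) {
        # visitor has just clicked "yes" - only now can that fact be remembered
        my $consent = CGI::Cookie->new(
            -name    => 'consent',
            -value   => 'yes',
            -expires => '+1y',
        );
        print $cgi->header( -cookie => $consent );
    }
    else {
        # no consent given, and the refusal cannot be stored in a cookie,
        # so the visitor will be asked again on every single visit
        print $cgi->header;
        # ... render the consent banner here ...
    }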

On Monday 9th May 2011, the Information Commissioner's Office (ICO) issued an advice notice to UK businesses and organisations on how to comply with the new law. However, even their own advice states the document "is a starting point for getting compliant rather than a definitive guide." They even invent cookie types that don't exist! Apparently "Flash Cookies" is a commonly used term, except that in the web technology world there are just two types of cookie, Persistent Cookies and Session Cookies. They even reference the website AllAboutCookies, which makes no mention of "Flash Cookies". Still not convinced this is a complete shambles?

The directives currently state that only cookies that are "strictly necessary" to the consumer are exempt from the ruling. In most cases shopping carts have been used as an example of cookie usage which would be exempt. However, it doesn't exempt all 1st party cookies (those that come from the originating domain), and especially targets 3rd party cookies (from other domains). The advice states "The exception would not apply, for example, just because you have decided that your website is more attractive if you remember users' preferences or if you decide to use a cookie to collect statistical information about the use of your website." Both of which have significant disruption potential for both websites and their visitors.

Many of the 1st party cookies I use are Session Cookies, which either store an encrypted key to keep you logged into the site, or store preferences to hide/show elements of the site. You could argue both are strictly necessary or not, depending on your view. Of the 3rd party cookies, like many people these days, I use Google Analytics to study the use of my websites. Of particular interest to me is how people find the site, and the search terms that brought the visitor to the site. It could be argued that these are strictly necessary to help site visitors find the site in the first place. Okay, it's a weak argument, but the point remains that people use these types of analysis to improve their sites and make the visitor experience more worthwhile.

Understandably, many people have questioned the implications of using Google Analytics, and on one Google forum thread, the Google-approved answer seems to imply that websites will only have to make it clearer that they use Google Analytics. However, this is at odds with the ICO advice, which says that isn't enough to comply with the law.

If the ruling had been more explicit about consent for the storing of personal data in cookies, such as a name or e-mail address, or the use of cookies to create a personal profile, such as with advertiser tracking cookies, it would have been much more reasonable and obvious what is permissible. Instead it feels like the politicians are using a wrecking ball to take out a few bricks, but then aiming at the wrong wall.

For a site like CPAN Testers Reports, it is quite likely that I will have to block anyone using the site unless they explicitly allow me to use cookies. The current plan is to redirect people to the static site, which will have Google Analytics switched off and has no other cookies requiring consent. It also doesn't have the full dynamically driven content of the main site. In Germany, which already has much stricter requirements for data protection, several personal bloggers have chosen not to use Google Analytics at all in case they are prosecuted. I'm undecided at the moment whether I will remove GA from my websites, but will watch with interest whether other bloggers use pop-ups or remove GA from their sites.

Perhaps the most frustrating aspect of the directives and the advice is that they discuss only website compliance. They don't acknowledge that the websites and services may be hosted on servers outside the EU, even though the organisation or domain may have been registered within the EU. They also don't differentiate between commercial businesses, voluntary organisations or individuals. Personal bloggers are just as at risk of prosecution as multinational, multibillion [currency of choice] businesses. The ICO is planning to issue separate guidance on how they intend to enforce these Regulations, but no timescale is given. I hope they make it absolutely clear that commercial businesses, voluntary organisations and individuals will all be treated differently from each other.

In their eagerness to appear to be doing something, the politicians, in their ignorance, have crafted a very misguided ruling that will largely fail to prevent the tracking of information and the creation of personal profiles, which was the original intent of the changes. When companies such as Phorm can compile all this personal information on their servers, using the same technology to capture the data but sending it back to a server rather than saving a cookie, have these directives actually protected us? By and large the answer will be a resounding no. Have they put in place a mission to disrupt EU business and web usage, and deter some from using EU-based websites? Definitely. How much this truly affects web usage remains to be seen, but I suspect initially there will be an increase in pop-ups appearing on websites asking to use cookies.

It will also be interesting to see how many government websites adhere to the rulings.

File Under: government / law / security / technology / usability / web / website


Addicted to Chaos

Posted on 31st March 2011

Some time ago, a website I was working on needed the ability to view images on the current page from a thumbnail. Many websites now feature this functionality, but at the time only a few seemed to offer it, and the assumption was that the JavaScript required was rather complex. As such, I did a search of the viewer libraries available, either as Open Source or as a free download, that I could use for a commercial website.

The initial search returned rather more limited results than I expected, and seemed to imply that the complexity had put people off developing such a library. However, in retrospect it seems that a market leader has become so popular, stable and robust that others have chosen to provide different or limited presentations based on similar designs.

Back last year I began writing a review of some of the viewers, but never got around to finishing it. Having some time recently, I decided to both complete the review and revisit the viewers to see what improvements have been made since I first investigated them.

Before I begin the individual reviews, I should note the requirements I was looking for in a viewer. Firstly, the viewer needed to be self-contained, both in files and directory structure, so that the feature could be added or removed with minimal changes to other website files. The viewer needed to run completely on the client side; no AJAX or slow loading of large images would be acceptable. However, the most significant requirement was that all code needed to work in IE6. Unfortunately this latter requirement was non-negotiable.

I was quite surprised by the results of my search, and although there are likely to be other solutions now, the following is a brief review of each of the four immediate ones I found, and my experiences with them.

Lightbox

Possibly the best known thumbnail viewer library available, and now a clear market leader. The original review was of v2.04, which had been the stable release since 2008. This month (March 2011) has seen a v2.05 release with added IE9 support. Lightbox is licensed under the Creative Commons Attribution 2.5 License, and is free to use for commercial projects, although a donation would be very much appreciated.

While this viewer works in most browsers, and the features of image sets and loading effects looked great, it proved unworkable in many of the IE6 installations I tried across multiple platforms. Despite searching forums and various howtos, there didn't seem to be an obvious fix to the problem. The viewer would either not load at all, load with a black layer over the whole web page, or begin to load and crash the browser. I know there are many problems and faults with IE6 and its JavaScript rendering engine, but these were supposedly stable releases.

As Lightbox makes use of the Prototype Framework and Scriptaculous Effects Library, which were already being used within the website the viewer was for, the library initially seemed to be the best fit. Disappointingly, failing IE6 so dramatically and consistently meant it couldn't be pursued further.

Slimbox

Slimbox is a Lightbox clone written for the jQuery JavaScript Library. v2.04 is the latest stable release, and the release that was originally reviewed. Slimbox is free software released under the MIT License.

Slimbox is based on Lightbox 2, but utilises more of the jQuery framework and is thus slightly less bulky. While working well in the other browsers I tried, it flickered several times in IE6 when loading the image. Anyone viewing the effect who has epilepsy might well have felt ill, and even for someone not affected by epilepsy this strobing effect was extremely off-putting. I suspect this problem may well be an alternative side-effect to those seen with the original Lightbox, but again forums and howtos didn't provide a suitable fix to remedy the problem.

Dynamic Drive Thumbnail Viewer

This is the first of the two thumbnail viewers that Dynamic Drive have available (the second is an inline viewer rather than the overlay I was after), and is the version made available on July 7th, 2008. Scripts by Dynamic Drive are made available under their Terms of Use, and are free to use for commercial projects.

This is a very basic viewer, relying on core functionality rather than flashy effects. As such, it is simple in design and presentation. Rather than create a full browser window overlay, as both Lightbox and Slimbox do, the Dynamic Drive viewer simply contains the viewing image within a simple DIV layer. There is the possibility to add visual effects, but these can be easily turned off.

This seemed to work in most of the browsers tried, except when clicking the image in IE6. The image appeared, but then a JavaScript error immediately popped up. After quickly reviewing the configuration and turning off the animation, the viewer opened and worked seamlessly across all the browsers tested.

Highslide JS

Highslide JS is a very feature-rich library, which provides much more than an image viewer. Highslide JS is licensed under a Creative Commons Attribution-NonCommercial 2.5 License, which means you are free to use the library for non-commercial projects. For commercial projects two payment options are available: $29 for a single website, and $179 for unlimited use.

The feature set for displaying images includes the style of animation used to open images, the positioning of text, and the linking of image sets. In addition, it also provides many features for regular content, which can then be used for tooltip-style pop-ups using embedded HTML, IFrames and AJAX. Another standard feature is the ability to allow the user to move the pop-up around the screen, to wherever might be convenient.

However, there is a downside. While this works well in most browsers, even just loading the Highslide JS website in IE6 throws up several errors. With the library being so feature-rich, it has a considerably larger codebase, although removing comments can bring this down to just over 8KB, and I suspect some of the older browsers may not be able to handle some of the complexity. Their compatibility table suggests that it works all the way back to IE 5.5, but in the tests performed for IE6, when the site did open without crashing the browser, the viewer itself felt rather clunky when an image was opened and several of the visibility settings just didn't work. You also frequently get an 'Unterminated string constant' error pop-up, which feels disconcerting considering they are asking you to pay for commercial usage.

If IE6 wasn't a factor, this may have been a contender, as the cost is very reasonable for a commercial project that would utilise all its features.

Conclusion

These are just the four viewers that were prominent in searches for a "thumbnail viewer". They all seem to have the same, or at least a similar, style of presentation of images, which is likely due to the limited ways images can be displayed as an overlay. However, the basic functionality of displaying an image seems to have been overshadowed by how many cool shiny features some can fit into their library, with configuration seeming to be an afterthought.

With the ease of configuration to disable the IE6 error, the basic functionality and the freedom to use it for commercial projects, the Dynamic Drive solution was ultimately chosen for the project I was working on. If IE6 wasn't a consideration, I would have gone with Lightbox, as we already use Prototype and Scriptaculous. With IE6 usage dwindling on the website in question (down from 38.8% in June 2010 to 13.2% in March 2011), it is quite possible that we may upgrade to a more feature- and effect-rich viewer in the future, and Lightbox does seem to be a prime candidate.

Consider this post a point of reference, rather than a definitive suggestion of which image viewer library to use. There may be other choices that suit your needs better than these, but these four are worth initial consideration at the very least.

Browsers & Operating Systems

For reference, these were the browsers I tried, and the respective operating systems. And yes, I did test IE6 on Linux, where it occasionally stood up better than the version on Windows! Though this may be due to the lack of ActiveX support.

  • IE6 (WinXP, Windows7, Linux)
  • IE7 (Windows7)
  • IE8 (Windows7)
  • Firefox 3.6 (WinXP, Windows7, Linux)
  • Opera 9.8 (Linux)
  • Opera 10.52 (Linux)
  • Chrome 5 (Windows7, Darwin)
  • Chromium 6 (Linux)
  • Safari 4 (Darwin, iOS)

File Under: opensource / review / technology / usability / web / website


Living By Numbers

Posted on 13th September 2010

Maisha, now with OAuth support.

A project I started back last year is Maisha, a command line client for social micro-blogging networks such as Twitter. On 31st August this year, Twitter deprecated the Basic Authentication method of allowing applications to log in users with a simple username and password combination. In its place they now use OAuth. (See also the blog post by Marc Mims, author of Net-Twitter.)

On the face of it, OAuth seemed a bit confusing, and even the documentation is devoid of decent diagrams to explain it properly. Once I did get it, it was surprising to discover just how easy the concept and implementation are. For the most part Marc Mims has implemented all the necessary work within Net-Twitter, so Maisha only needed to add the code to provide the right URL for authorisation, and to allow the user to enter the PIN# that then allows the application to use the Twitter API.

The big advantage to OAuth is that you don't need to save your password in plain text for an application. Once you enter the authorisation PIN#, the token is then saved, and reused each time you start up Maisha to access your Twitter feed.
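
For anyone curious what that PIN-based flow looks like in code, here is a rough sketch based on the Net::Twitter OAuth documentation. It is not Maisha's actual code, and the key handling is simplified, but the shape is the same:

    #!/usr/bin/perl
    # Sketch of the PIN-based OAuth flow via Net::Twitter (not Maisha's own code).
    use strict;
    use warnings;
    use Net::Twitter;

    my $nt = Net::Twitter->new(
        traits          => [qw/API::REST OAuth/],
        consumer_key    => 'your-consumer-key',
        consumer_secret => 'your-consumer-secret',
    );

    # first run: send the user to Twitter to authorise the application
    print 'Authorise this application at: ', $nt->get_authorization_url, "\n";
    print 'Then enter the PIN here: ';
    chomp( my $pin = <STDIN> );

    # exchange the PIN for tokens, which can be saved and reused later,
    # so the user's password never needs to be stored anywhere
    my ( $access_token, $access_token_secret ) =
        $nt->request_access_token( verifier => $pin );

    # on subsequent runs, restore the saved tokens instead of asking again
    $nt->access_token($access_token);
    $nt->access_token_secret($access_token_secret);

    # the application can now use the Twitter API on the user's behalf
    $nt->update('Maisha is talking to Twitter over OAuth');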

As Identi.ca implements an Open Source version of the Twitter API, they have also implemented OAuth in their interface. However, a slight modification to Net::Twitter is needed, so I will wait for Marc to implement that before releasing the next version of Maisha.

So if you have been using Maisha and have been frustrated that you can no longer access Twitter, you now only need to upgrade to App-Maisha-0.14 and all should work again (once you've entered the PIN# of course).

If you are using Maisha, and have any feedback or wishlist suggestions please let me know.

File Under: life / opensource / perl / technology


Back On Line

Posted on 16th February 2009

After the last few weeks of trying to access Twitter from the command line, I set about writing something that I could expand to micro-blog to any social networking site that supports many of the Twitter API type commands. At the moment it only works with Twitter and Identi.ca, but my plan is to look at creating plugins, or more likely to allow others to create plugins, that can enable the tool to interact with other micro-blogging sites.

After trying to think of a decent name, I finally settled on Maisha. It's a Swahili word meaning "life". You can grab the code from CPAN as App-Maisha.

Currently you'll need to use the standard Perl install toolset to install the application, but ultimately I'd like to have something that you can install just about anywhere without having to go through all the headache of installing dependencies. I'll have a go at doing an .rpm and a .deb package release, and will also try using PAR. It would be nice to have this as a standalone application that just about anyone can use, but for now CPAN will have to do.

My next immediate step is to look at writing something that interfaces to Facebook without requiring a developer key or any such nonsense. It will probably have to involve a bit of screen scraping, unless there is a more official API, but as yet I haven't found one. Everything regarding Facebook applications seems to centre around the developer application that can do all sorts of dubious things, but mine is purely for the user to control from their desktop, not a 3rd party website/server. Thus giving them a developer API key assigned to me is wholly inappropriate. It would be nice if they had a restricted User API, which allows you to update your status and look at your friends' statuses, but I think I'll be in the minority wanting it.

File Under: community / internet / opensource / perl / technology


You Know My Name

Posted on 9th February 2009

Having mentioned Twitter in a recent post, I thought I would mention a project I decided to look at recently. The original project, twittershell, is not my own, but it appeared to be the closest to what I was after: a command line interface to Twitter, with the bonus of it being written in Perl. As a consequence of the latter, I was able to hack on the code and submit a patch to do a lot of what I wanted. I currently run the patched version, and it works rather nicely for me. However, there is something missing.

Twitter is no longer the only micro-blogging service, and as such I've also signed up to identi.ca. With the APIs being pretty much the same, it should theoretically be simple to plug an identi.ca interface into twittershell. Except it isn't. Unfortunately twittershell is written with only the Twitter API in mind. To integrate identi.ca and other micro-blogging services, it requires a rewrite. So that's where I'm currently at. The original twittershell project hasn't been touched in over a year, so I'm hoping the original author won't be offended by me forking the code into a new project.

However, what do I call the new project? I would rather it not be something that identifies itself with any specific blogging service, as I would like it to have a broader appeal that encourages others to add plugins should a new service come along. I realise this project will likely have limited appeal, as iPhone and GUI apps seem to be the in thing, but I want something that I can run via ssh/screen on my home box and not have to worry about watching some app running on the desktop.

One idea I had was to call it 'Mazungumzo' (Swahili for talk or conversation ... an idea stolen from Joomla!), then I thought of 'Maisha' (Swahili for life). I did look up some Welsh words, but doubt anyone would be able to pronounce them ;) I also thought of 'Rambler', but that might have too many connections to someone who goes walking across hill and dale of a weekend. So, any good ideas for project names?

File Under: internet / perl / technology


It's Oh So Quiet!

Posted on 5th February 2009

Something that has bugged me recently is my lack of regular posts to my personal blog. I rarely write about the projects I work on, as most are featured in some form or another on the various Perl sites that I work on. But I feel I ought to make a note of snippets of ideas and thoughts about some here too. Not that I want this to become a technical blog, but there are random thoughts that would fit better here than over there. So expect a few more project related posts in the future.

I also have a considerable backlog of gig photos as well as the family type photos that I want to go through, and at least put a few photos online. So that might help make my postings a little more regular :)

File Under: opensource / technology


All Moving Parts (Stand Still)

Posted on 7th March 2008

What a breath of fresh air: a comedian, actor and presenter who actually has an interest in the computer world beyond a source of writing inspiration. I recently came across a post on Stephen Fry's blog (for American readers, Stephen is the other member of the comedic duo Fry and Laurie, with Hugh Laurie currently making a name for himself in House). The post that I picked up on is entitled "Deliver us from Microsoft". Reading back through other posts, it appears he is quite a strong supporter of Open Source software, and to my mind, for all the right reasons.

The article in question looks at the Asus EEE PC, which was also recently (December 2007) reviewed by the LUGRadio presenters in their "Inspirational Muppetational" episode (Season 5, Episode 7). Both Stephen and the LUGRadio guys came out praising the machine, and although they all found some form of criticism for it, their view was healthily put into perspective by the fact that the aim is to provide a cheap machine for educational purposes. It isn't aimed at power users, such as myself, but at those who want a laptop that can connect to the internet, enabling them to browse the web, chat to friends, and edit or write office documents.

However, the most significant thing about the laptop, which is hinted at in Stephen's blog post title, is the fact it runs on Open Source software, from the Debian base (although tailored to the Asus EEE PC) through to the OpenOffice and Firefox applications. The machine is perhaps the first ever to be sold commercially from the outset with Linux as the only operating system available, and no Microsoft product installed. Vendors are starting to realise that users are buying their machines and installing Linux on them, wiping any hint of Microsoft off, as has been apparent from the news reports of people contacting them for refunds. The choice isn't perhaps as widespread as some of us would like, but it is getting better.

Stephen thinks that the change will happen within 5 years, and I would certainly welcome a shift in the balance, with many more people running Linux as their operating system. Linux on the desktop has long been a challenge, and Open Source developers have been making many dramatic changes to improve it. DanDan and Nicole both use Ubuntu on their laptops, and I have heard of many people getting their parents, spouses, siblings and offspring to use some flavour of Linux with great results. There are still lots of gains to be made, particularly in the area of closed source drivers and getting many devices (especially wireless network devices) working out of the box, but credit where credit is due: we have a lot to thank the developers of all the Linux distributions and Open Source applications for. We have come a very long way in the last 5 years, and now perhaps more than ever Linux on the desktop has a real chance of challenging Microsoft's dominance in the market. I don't expect a complete takeover, as I think Stephen was hinting, but I would like to see consumers being given a better, more considered option to buy an operating system that works for them.

I do accept that Microsoft can be better in some areas, particularly with games, but I can see that advantage disappearing once games developers realise that a large portion of their current geek market will switch to non-Microsoft platforms. It might even challenge Microsoft to finally listen to many of the opponents and actually evaluate their security and product quality, enabling them to release more stable and reliable products. For myself, I choose Open Source partly because I find it more secure and reliable, but also because it gives me the freedom to investigate and hopefully fix problems, and potentially give back to the wider community. I already contribute to Open Source and I'd like to think that offsets all the benefits I've gained by using Open Source software.

I don't read the Guardian, but I think I'll be reading more of Stephen Fry's blog in the future. It's been an enlightening read.

File Under: computers / laptop / lugradio / opensource / technology


From Russia Infected

Posted on 6th March 2008

Yesterday MessageLabs got a mention on the BBC News site, under the title of Infective Art. The Metro newspaper in the UK also ran with the story, Cyber crime art revealed.

I'm currently touring the UK with a presentation entitled Understanding Malware, which takes the six types of malware, and using the MessageLabs "Know Your Enemy" campaign images, explains a little more about what they are. The presentation has gone down very well so far and there have been some healthy discussions afterwards, with attendees trying to understand how we can get better at getting rid of malware threats from the inbox. It's unlikely to happen altogether any time soon, but with companies like MessageLabs on the case we are making it harder for the malware to get through.

I shall be taking the presentation to more parts of the UK, so if you have a user group that might be interested, please feel free to get in touch and invite me along. Note that the presentation is not a programming language or operating system talk; it is more about technology and social engineering. I shall be submitting it to LUGRadio Live, YAPC::NA and YAPC::Europe this year, so if I don't make it to your local user group, hopefully you'll be able to make one of those conferences. As an added bonus I also have some freebie giveaways for anyone who can answer the questions during my presentation, courtesy of MessageLabs :)

File Under: computers / internet / malware / security / spam / technology


Neon Knights

Posted on 5th March 2008

Following on from my previous post, I had two conversations recently where we got onto subjects relating to the demise of technophobia. In one instance I was discussing protection against viruses and malware, while the other was to do with just generally understanding how computers work and using them to your advantage. In both cases, I considered how DanDan (and Ethne to a degree) had reacted to getting his own laptop for Christmas. In 10 (maybe more) years' time he will be looking for a job. By then, if you don't have a decent level of computer understanding, you are going to find it very difficult to get one, as computers and technology are becoming ever more prevalent in just about every industry, even where you might not think it.

Children growing up today, certainly in the UK, are being exposed to computer technology in most (if not all) primary schools. With desktop and laptop computers getting cheaper, more children are getting them to help with their homework, to play games or to send messages to friends and family. Those that have a slightly more technically inquisitive nature are likely to want to do more, and either write software to do cool stuff or figure out how to get the machine to work even better. It's how I got started, although not until I was 15.

In 10 years or less the way we see the world, particularly through the eyes of our children, is going to be very different than it is today. My dad has no trouble working his way around a circuit board, and worked for Lyons computers (later ICL), on their LEO III among other things, whereas I have only basic knowledge of hardware. I have a good grasp of software design (and hopefully user interface design), and have been very fortunate to work on some very worthwhile projects, whereas my dad has no knowledge of any programming language. I wonder what DanDan and Ethne will be able to do in 20-30 years' time, and what they will be able to do far better than me. I'm quite looking forward to seeing them flourish, as I have been with some of my current colleagues, who often leave me wishing I had their vision at the command line.

The world is changing, and I, for one, welcome our new 7 year old future overlords ;)

File Under: computers / technology


Cry Wolf

Posted on 28th February 2008

In a recent BBC news article, Microsoft set to open up software, it is reported that Microsoft plan to release the technology behind some of their software in order to provide better interoperability with rival products. It also states that they promise "not to sue open source developers for making that software available for non-commercial use."

Now some may be extremely dubious, as that just doesn't seem to fit Microsoft's business model. There has to be something unusual here for them to feel they can release something to the world for free. It wouldn't surprise me if they released their back catalogue of software that is now 10+ years out of date. As this software is now end of life, it does make sense to lift restrictions on the old file formats, so that those who have to support Win95 and Win98 machines have a chance of getting some support from the Open Source community. It benefits Microsoft in that they will likely still require credit for any software that uses their file formats, but it also allows them to virtually forget about support for older formats in their newer products.

If the second statement holds true, then it will hopefully mean less of the table thumping and general smoke clouds of threats, which never amounted to anything anyway. It might also mean older Microsoft products get their own special Open Source security release with all the holes repaired ;)

I'll be intrigued to hear what software/technology they are releasing, but I suspect that there will be an overwhelming wave of derision from some of the more outspoken Open Source protagonists. A pity really, as to my mind it may well add value to many Open Source projects. Open Source is no longer a hobby. Serious investment is made by the likes of Sun, Red Hat, Novell and many others. The future for Linux as a reliable alternative desktop is getting better and better. No doubt there will still be plenty of FUD about, but consumers are becoming more and more educated about the choices available to them, and Microsoft is slowly waking up to the fact that they can use the Open Source community to their advantage, and still keep their name on every desktop, just not necessarily in one of their own product releases.

File Under: opensource / technology


Sing Blue Silver

Posted on 25th February 2008

The battle is over. For some time now there has been a competition in the high definition video disk market, reminiscent of the VHS/Betamax video format war, between Blu-ray, pioneered by Sony (pun intended), and HD DVD, championed by Toshiba. As of last week, "Toshiba drops out of HD DVD war". As a consequence the HD DVD players and recorders, as well as the HD DVD disks themselves, are to disappear from our shelves. I have no preference for a particular high definition format, but I am glad that an early decision has been made. Perhaps not early enough for some, but I suspect the vast majority now discovering the improved formats will be able to buy players and disks with more confidence that they will be able to play them on equipment in the future.

I remember the war back in the 70s and 80s between the VHS and Betamax video cassette formats, and although Betamax was widely regarded as the better format, VHS dominated the market so much that regular consumers were hardly given a choice between the two. In this current change of heart, it appears that Toshiba have re-evaluated their format, and with the majority of film studios electing to use the Blu-ray format, have admitted it was going to be a difficult race to win. They've gracefully conceded, and allowed consumers to win.

I've only recently bought an HD Ready TV, and thankfully it has all the proposed format connection sockets. I hadn't quite got as far as thinking about the player/recorder to go with it, but now I don't have to worry about making the wrong choice ... at least not with the format.

File Under: technology


Maps And Legends

Posted on 18th January 2008

On the Birmingham LUG mailing list recently there was an announcement about the Birmingham Mapping Party, which is being organised by some of the guys at OpenStreetMap. Previously Alex has been over to Birmingham.pm to give us a bit of background about GPS and mapping, and seeing as there is a distinct lack of mapping data round where I live, I thought it might be a good idea to find out how to get involved.

First off was to check whether I had the right equipment. I have a Nokia N95, and although it has GPS, I had no idea whether it could record data and allow me to upload it to OpenStreetMap. Reading the notes, the N95 does indeed have the capability to record the mapping data; however, it needs an additional (free) app to do it. I headed off to the Nokia Research Labs website and read up on Sportstracker, an app that allows joggers, etc to monitor their progress. As a by-product it also records the route you take in the GPX format, which can then be uploaded to OpenStreetMap. Using my local Wifi network, I logged onto the website and installed the software directly onto the phone. Having only had a quick look at the app, it does look quite cool.

So now I'm ready to record. However, I've previously mentioned to JJ that the GPS connection takes ages to triangulate my position when I switch on the GPS, and he mentioned A-GPS, which is also mentioned on the Nokia website, so I figured I ought to try and upgrade that too. Another download of the latest Software Updater, this time to the PC, and I'm ready to update. When I first tried a month ago, I had problems connecting to the Nokia website; this time around it connected without a problem. It also detected the phone and correctly identified that it has version 11.0.026 of the firmware. The latest version listed on the website is 20.0.015, and for A-GPS support version 12.0.013 or newer is required, so I was expecting a download and upgrade. Unfortunately, it would seem the Software Updater doesn't agree, as it is claiming that the firmware is up to date with the latest version, 11.0.026. This is a bit annoying, and I have yet to find any way to update the phone to the latest firmware. I've now emailed Nokia customer support to see whether they can shed any light.

JJ did mention previously that I could go into any Vodafone shop and they would upgrade it for me, but I doubt they would do anything that different from what I've tried, unless they completely wipe the OS and reinstall with the latest version. Just in case I do have to take this back to the shop, I've also downloaded the Nseries PC Suite to back up my data. I don't have too much on there, but I would rather keep the phonebook and messages, and I should back them up every so often anyway.

Although I've had the phone for several months, this is the first time I've actually looked at it from a lower level. I'm starting to look at other possible applications, such as using it as a laptop input controller, and really getting the most out of the phone. Seeing as it has all these gadgets installed, it would be a shame not to use them ;)

Incidentally, I'm not planning to be at the Mapping Party, but I do hope to contribute to the mapping effort once I've figured out how to use SportsTracker. And maybe I'll be able to do a bit of Gloucestershire too seeing as I work there.

File Under: gps / maps / opensource / phones / technology


Look Through Any Window

Posted on 11th January 2008

A friend recently pointed me at "Head Tracking for Desktop VR Displays using the WiiRemote", a video on YouTube. While the current headset might not be to everyone's taste, the software and the idea are fantastic. It shouldn't be difficult to imagine all sorts of real world uses, never mind gaming fun. It really impresses me how some people can look at something from a completely different angle and do something which the creator never intended or expected.

And just in case you think this might be a one-off, Johnny Chung Lee has several project ideas, and a few more for the Wii too. I expect Mr Lee will make quite an impact at Nintendo when they recruit him, and if they don't, I'm sure there will be plenty of hi-tech companies that will be delighted to have him on their research staff.

File Under: computers / technology


New Europeans

Posted on 16th September 2007

Back last year I was invited to EuroFOO. Having never attended this type of event, I was a bit wary of what to expect. As it turned out it was rather an interesting couple of days. For those who have never been, the event is a mini conference with the schedule more or less decided after the welcome session, on two large whiteboards, with the attendees allocating themselves to the available timeslots. To a degree it is a free-for-all, but there were enough clever people there, including several who were well prepared, to pretty much fill all the sessions within a few minutes.

The sessions themselves were a complete mixture of ideas. Some were an opportunity to show off cool apps, some focused on "mashups", others were discussion forums and several others were just whatever seemed like a good idea. Although there were a few sessions that stood out as worth attending for me, there were plenty of others that I could drop in or out of and either enter discussions or just play the part of observer. From a personal point of view I took a lot away with me, and I think if I'm ever invited again, there are a couple of presentations I could bring with me. I'd certainly feel more confident about suggesting a session next time. When it's your first experience of something like this, it's a bit daunting to stand up in front of so many talented people.

One aspect of the event I enjoyed was spending breakfast with Allison Randal and Gnat Torkington, and being introduced to Tim O'Reilly. Being quite a quiet person, I'm not the sort to stand out at something like this, but it was nice to realise that I did know quite a few people. On the last evening it was also great to meet Robert Lefkowitz, as it gave me the opportunity to say how much I enjoyed his talks on "The Semasiology of Open Source", which I heard via IT Conversations.

I also got time to chat to Damian Conway, Piers Cawley and Mark Fowler, which was great as I don't often get to see them these days, and when I do they're often busy preparing for talks or only standing still for a short amount of time. The weekend for me was a great success and if you're ever invited, I heartily recommend going along.

File Under: conference / opensource / technology


Both Ends Burning

Posted on 13th July 2007

During José's talk, 'The Acme Namespace - 20 minutes, 100 modules', at YAPC::NA in Houston, he mentioned one of the Acme modules that accesses the info for a Playboy Playmate, Acme::Playmate. After he mentioned it, Liz "zrusilla" Cortell noted that she used to work for Playboy and worked on the site that was screen scraped by the Acme module, informing us that she wrote the backend in Perl too, "so you see it was Perl at both ends". At this point the room erupted, Liz got rather red, and I'm sure she wished the ground would swallow her up :)

Despite the rather salacious connotation that can be drawn from that remark, it was a phrase that struck me later as being rather more descriptive of the state of Perl. I started to think about the community, business and the way Perl is perceived. Drawing a line with the individual at one end, moving into the community, through small businesses and on to corporations at the far end, we can see Perl is not only used at both ends, but all the way through. But people still ask: isn't Perl dead?

Perl hasn't died; in fact it's probably more vibrant now than it has been for several years. The difference now though is that it isn't flavour of the month. I did a Perl BOF at LUGRadio at the weekend, and it was a subject that got brought up there. Is Perl still being used? It would seem that Perl publicity to the outside world is extremely lacking, as several non-Perl people I've spoken to over the past few months have been surprised to learn that Perl is used in pretty much every major financial institution, in email filtering and network applications, for the Human Genome project (and bioinformatics in general) and in pretty much every type of industry you can think of. It isn't dead, it just isn't sticking its head above the parapet to say "I'm still here".

Last year at YAPC::Europe, Dave Cross talked about speaking in a vacuum. Inside the Perl community we all know that perl is great and gets the job done, but what about the people who are struggling with other languages, or project managers and technical architects who are looking at what skill set they should be using to write their new applications? What about big business that is continually confronted with the marketing of Java from Sun or .Net from Microsoft?

I see Python gaining momentum simply because several in the Linux and Open Source communities started using it to see how good it was, and now, with Ubuntu using it pretty much exclusively, it has gained a large foothold with the wider developer community. Ruby has been seen as great for creating flashy websites, but beyond 37signals, I've not heard of any big name sites that have been created with it. It gets featured at every Open Source conference and developers generally seem to think it's really cool, but I'm still waiting to hear of any big take-up outside of the cool, hip and trendy set. Maybe that's Perl's problem. It isn't cool, hip and trendy anymore; it's part of the establishment, part of the furniture. Does the job, does it well and without any fuss.

Perl has generated such a great community that we seem to have forgotten that there are other communities out there, and they've partly forgotten us too. YAPCs are great conferences, but they grew out of the desire to have more affordable conferences for developers, students and the self-employed. Their success has come at the cost of Perl people wanting to go to other Open Source events such as OSCON, and of keeping a Perl presence going in the wider developer communities. As a consequence Perl is almost seen as an add-on to those conferences, kept for legacy reasons.

Looking back at that line I drew at the beginning, although I see Perl in our community, it doesn't feature very much in the wider communities, and as such small businesses don't notice it so much and look to other languages to develop their applications. The individual or hobbyist still uses it, and the corporations would struggle to remove it now, so to the outside world Perl is very much at both ends, but only at both ends. It's lost its focus in the middle ground.

At LUGRadio this year, I felt rather relieved that the people who came and spoke to me knew me for being part of the Perl community. Most of these people are hardcore Linux, C or Python developers, and although several know Perl, they don't often use it. I've spent a lot of time speaking at Linux User Groups this year, and plan to speak at more later in the year. I've also been invited to speak to the PHP West Midlands User Group, been invited to attend PyCon, and will be attending GUADEC next week, but it's hard work to try and remind these other communities that Perl is still there. Although the personal touch certainly does help, I can't help but think there needs to be another way to promote Perl. This isn't about success stories (although they do help) or about talking at conferences and user groups (although they are just as important), but about reaching out to the other communities, and thus small businesses, to remind them that Perl is still a viable choice, and that rather than competing for market share, the different languages can work together.

Having spoken to some developers of other languages, I'm amazed that the FUD that all Perl is unreadable, obfuscated and too hard for the beginner to learn properly is still being peddled. Challenging that mentality is a bit of a battle, but I've had to state on several occasions that you can write unreadable, obfuscated and unmaintainable code in any language, and in fact most of the respected Perl community and much of CPAN strives to write readable, clear and maintainable code. It seems the Perl code from over 10 years ago and the dodgy scripts of certain archives are still poisoning the well.

Part of the problem (possibly fueled by the above FUD) that we have in the UK is overcoming the fact that several new Open Source initiatives don't even feature Perl when they talk about Open Source languages. If the networks that work between the communities and small business aren't promoting us, then it's going to be a tough slog. I've already written emails to the National Open Centre and tried to get OpenAdvantage to be more inclusive, but there are other similar initiatives, both here in Europe and in the US that need reminding too. Once they're helping to promote Perl, then it might just be something that Universities and Colleges include in the curriculums again. From there small businesses will be able to see that there is a pool of Perl developers they can employ and Perl again becomes a viable choice.

I firmly believe Perl 5 will still be around in 10 years' time. Whether it's running on Parrot, within Perl 6 or as it is now remains to be seen. I was asked to describe Perl 6 at the weekend and responded with a generalisation of "Perl 6 is to Perl 5 as C++ is to C". C++ took C into another realm, but C is still around. I just hope that the constant confusing information given out about Perl 6 to non-Perl people isn't the reason why some think Perl 5 is all but dead.

The theme for the 2005 YAPC::Europe in Braga was "Perl Everywhere". I don't think that's true, but I wish it was :)

(this has been cross-posted from my use.perl journal)

File Under: education / opensource / perl / rant / technology


Wishing (If I Had A Photograph Of You)

Posted on 9th July 2007

The Crew

Four Large Gents

As if I haven't mentioned it enough, this weekend I went along to LUGRadio Live in Wolverhampton. It was a fantastic event, as always, and I had a great time meeting people, seeing some interesting talks and taking lots of photos. I was a little disappointed to hear Ade has decided to leave LUGRadio as a regular presenter, but I'm sure Chris Procter will do an admirable job in his place. To read my more technical writeup of the event see my use.perl journal. To see my photos, click the links below :)

File Under: conference / lugradio / opensource / technology


I'm Bad, I'm Nationwide

Posted on 5th July 2007

The 2007 YAPC::NA Organisers

Finally got the time to sort through my photos from last week. From over 2,000 photos, I've got them down to just over 700. There are still a few in there that aren't quite as good as I'd like, but then until I can freeze people in time before taking the shot, I'm going to struggle with the current camera. I'm looking at getting a DSLR at some point, so hopefully I won't get so many blurred pictures then. Still, I'm pleased I managed to get quite a selection that I did like.

For those who discover this entry by searching for YAPC::NA, here are all the photos I have online:

I also took some videos of Luke Closs and the Lightning Talks, so once I've converted them I'll get those online too.

Last week was a lot of fun, and I'm glad I got to go. Looking forward to YAPC::Europe now :)

File Under: conference / houston / opensource / perl / photography / space / technology / yapc


When We Are Free

Posted on 14th June 2007

JJ made a point last night that I also agree with. When I got home, I followed a chain of blog links and came across an article written by Martin Belam, about his wife's feelings towards an aspect of DRM. She makes a very good point, one that JJ, Brian and I had coincidentally been discussing at length yesterday evening at the Birmingham Perl Mongers meeting. I hope Martin's wife doesn't mind me requoting it here:

"The thing I don't get is this core of people that want everything for free. Artists still have to eat. Why do these people think that they are entitled to get everything for free for ever?"

JJ's point was that the biggest failing of the Linux community is the expectation that everything they want on their desktop should be free. As a consequence the Linux community, to a large extent, has become a very closed one. The idea of Open, to me, is more about encompassing different forms of expression, being inclusive rather than exclusive. In terms of software that can also mean different forms of distribution. As corporates, the likes of Sun, Novell, etc can afford to give away parts of their software portfolio, as they have gained a credible market share for their brand, allowing other large corporates to want to buy support contracts and services at very high rates. Ubuntu has been able to come into existence because Mark Shuttleworth was willing to put the money down to make it happen. Big players and very rich people can afford to do that, if they choose. But what about the little guy?

Certainly in the UK, and probably in the rest of the world, the people that take risks are the individuals and small businesses. They can because there often isn't the risk or outlay that would be required of a large business. As a consequence, when an idea does work it has often taken a lot of research, time and effort to get it into a state worthy of release. That's research, time and effort for which the designer, developer or company gets nothing back. Suppose, as an individual, I create a piece of software that manages websites. It takes 4 years to get that product stable and complete enough to release. Why should I be expected to just give it away?

The failing of the Open Source community is the expectation that everything should be free. While developers may choose to release their software as free, if they don't they are derided or sneered at. If my piece of software revolutionised the way websites could be created, and gives value for money, then why shouldn't I ask a nominal fee for it? The argument that the Open Source community seems to favour is that I should charge for a support contract. But that argument fundamentally fails to understand how business works. Support contracts work for big business because they need someone to blame when it all goes wrong. JJ gave the example of the supply chain for Vodafone, where one software supplier they use has no support contract with Vodafone directly, but goes via another supplier, because the software supplier is too small to guarantee 24/7 support. Even then, the intermediate supplier can only provide a 24/7 telephone answering service, and still passes the details on to the software supplier when they turn up for work in the morning.

I, as an individual, wouldn't get any support contracts from businesses around the world for my product. And even if I did, the chance of me providing a realistic level of support is minimal. However, I could charge for my software and allow others to reap the benefit. While I wouldn't necessarily reap great rewards, at least I would be getting some reward for all that research, time and effort spent getting the product into a state that others can take advantage of.

I find I keep having to ask every so often, 'why is it such a crime to make money?'. I have a family, I have a house and I have a life. If I want to have my own business, am I expected to work for nothing for 4 years and then give the software away for free and expect the support contracts to come rushing in, while in the meantime my family starves, I lose my house and end up with no life? The biggest part of the UK's economic growth comes from the SMB (Small Medium Business) or SME (Small Medium Enterprise) markets. They employ a large part of the working population, but also help feed many of the larger businesses and corporations, thus helping to employ the remaining part of the working population. When MG Rover collapsed down the road here in Longbridge, the knock-on effect on the smaller businesses who made parts for MG Rover was devastating. Several went out of business, while others had to cut their workforce. They can't work for free in the hope that other manufacturers might use their products. And exactly the same is true of the software market. Individuals and small businesses create many products that are used by bigger companies. Sometimes those products might be suitable for release to the general public, but shouldn't it be their choice whether they make a living from it, and how?

Part of this closed mindset also means commercial developers are less likely to support Linux, which is a bad thing. While I personally like what Linux and the Open Source community have to offer, and dislike DRM, I'm also able to be realistic and understand that people want to protect something they have created. I dislike DRM, not because I think the concept is bad, but because all the implementations of it are flawed and misunderstand the demands of both the retailer and the consumer. However, the problem that things like DRM have uncovered is that the Open Source community's resistance to anything commercial for "their" operating system has reduced the choice available, and has not allowed developers to work with the community to help make Linux a vibrant alternative for governments, emerging markets and the like. Currently Microsoft are able to offer great incentives to the decision makers, simply because many of the vendors of peripheral devices and software only support Microsoft products. That's not allowing freedom of choice. It's also not allowing decision makers to make informed decisions on the systems they wish to deploy.

An individual or small business wishing to make a commercial product available on Linux is currently met with derision and considered to be evil. Until this mindset opens up and accepts that we can all work together, Linux on the desktop is always going to be playing catch-up, and even Linux on the server is occasionally going to have to accept that it cannot compete when a requirement is to run a piece of software that isn't available for it. Freedom is also about Freedom Of Choice. If there isn't a choice, then is it any wonder so many restricted or flawed installations occur?

Although, just to be clear, the website management tool I've written, called Labyrinth, which has taken over 4 years of my free time in research and development, will be available as Open Source Software in the future. I don't believe I have a product that would warrant selling commercially, as I don't feel I can devote the time and effort to making it into a marketable product. I will, however, be looking to encourage potential clients who want me to design and develop their website to come to me. The fact that I will use Labyrinth is incidental, but the fact that I created it and know it better than anybody else is my unique selling point.

There are other products out there that do website management. Some are free, some are not. Some do much, much more than Labyrinth, while others are very basic. I'm not interested in trying to compete with them, as Labyrinth was written to fulfil my requirements for administering websites that I created. The fact that I've been able to use it for other sites has been great. But had I not taken that attitude and decided instead to make it a commercial product, why should I expect the ridicule and scorn of the Open Source community because I decided to make money?

Libre and Freedom is about choice and open minds not about money.

When We Are Free.

File Under: commerce / opensource / technology
1 COMMENT


The Real Me

Posted on 13th June 2007

I've had to turn down a fantastic opportunity today. One of the LUGRadio presenters isn't able to make the recording of the show tomorrow, and Aq contacted me to see if I'd be up for being a guest presenter. I'm gutted as they've been meaning to get me on the show for a while, and now would have been an ideal opportunity to plug YAPC::NA and YAPC::Europe.

Not sure who isn't able to make it, but as Adam Sweet is now a regular, they can't call on him to be their stand-in guest any more :) Hence my name cropped up. Hopefully they'll manage to recruit another member of the WolvesLUG massive, but I'll definitely be up for another chance to stand in. I just hope I haven't scotched my golden opportunity.

File Under: linux / lugradio / perl / technology
NO COMMENTS


This Property Is Condemned

Posted on 8th June 2007

I spotted the story of Julie Amero on the BBC News site this morning. While I'm glad there has been enough sense to provide a second trial, with more appropriate evidence, I'm also disappointed that this should ever have come to trial in the way it has. While I totally agree that minors shouldn't be exposed to the kind of images these sites promote, I also don't agree that a single SUBSTITUTE teacher should be held accountable in the way that she has been.

Firstly, she's a substitute teacher, meaning that her knowledge of the computer security systems is likely to be extremely limited at best, and more likely non-existent. Did the school fully brief her on the security measures they have in place? Perhaps she should be suing the school or the state for not putting reasonable security measures in place to prevent children being exposed to this sort of thing in the first place. However, that perhaps also isn't fair, as in far too many cases the school or local government doesn't have any idea about computer security. It's why there are specialist computer security companies that are called in to investigate and secure companies and organisations.

I work for a company called MessageLabs. We work in an industry where stopping malicious content is part and parcel of the job. When you consider that in email alone we stop over 70% of mail as spam, viruses, inappropriate content or illegal images, and are also seeing increasing numbers within our web scanning and instant messaging services too, computer security is a huge and very specialised business. MessageLabs is the largest company of its kind in the world, and as such, every minute we stop hundreds of messages with the sort of payloads that would cause this kind of content to pop up on unsuspecting computers. Are you really expecting a substitute teacher to have that level of knowledge and skill?

Part of the problem is education, and that isn't meant to be ironic. In Julie Amero's case, if the prosecution wins, then we are expecting every single person to be accountable for ensuring every single aspect of their work environment is not going to get them arrested. By implication, we're also stipulating that every single individual MUST become a security expert. That ain't gonna happen. In my opinion this focus is totally misplaced. The responsibility for protection in the workplace lies solely with the employer. In this instance the school or state should have taken reasonable steps to ensure that computer security measures were deployed so that the desktop computers were adequately protected, and that their network was also appropriately protected, both from intrusion and in restricting the sites that can be viewed by any computer in the school. But whether you take action against the individual, the school or the state, you are still prosecuting the victims.

Taking a step back, the law basically stipulates that minors should not be exposed to this sort of imagery, which I agree with. However, as the law is very bad at holding those truly responsible accountable, it goes after easy prey. I do believe, though, that the law could be better written to make this sort of thing virtually disappear overnight.

This kind of promotion typically comes from the pornographic, gaming and drug industries, none of which a minor should be exposed to. What if the law held the owners of those sites personally accountable for the distribution of harmful material to minors? What if institutions, such as schools, colleges and libraries, or businesses, such as internet cafes, and maybe even individuals in the right circumstances, were able to prosecute the site owners? How quickly do you think this sort of invasion would disappear? Unfortunately, those three industries are extremely big business, and can employ people to ensure that bills don't get passed that would affect them in this way. As such the justice system becomes corrupted by allowing victims such as Julie Amero to be held up as a scapegoat.

I really hope that the prosecution's case fails, as otherwise the kind of precedent it will set really isn't something I want to think about.

File Under: education / law / security / technology
NO COMMENTS


Fan Mail

Posted on 20th May 2007

Dear Spammer,

Why are you bothering to try and spam this system? If you even bothered to check back after you'd posted, you'd note that your scheming spamming tricks don't work. This site has a very strict filtering system that you will not get through, so please don't bother.

This site gets hit by spammers at least once a day. However, the more of these spammers the backend gets to see, the fewer I get to see. Hopefully it'll get to the point where I see maybe the occasional post once a week, or once a month, or even better never. The Scooter Do also has similar detection in the backend, and both are accumulating a notable amount of knowledge. I've been wary of using something like SORBS, as for email it is not reliable enough, but seeing as most spamming systems tend to use these open proxies, the chances are that legitimate posters won't be on those lists. So I've started to look at using Net-DNSBLLookup, to see how well I can integrate it with what I have. It will hopefully mean I only need to clean the database once in a blue moon :)

Regards,
Barbie.
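
For the curious, the DNSBL idea boils down to something like the rough sketch below. It isn't the code running on this site, and it uses Net::DNS directly rather than Net-DNSBLLookup; the zones listed are just illustrative examples.

#!/usr/bin/perl
use strict;
use warnings;
use Net::DNS;

# Illustrative zones only - pick whichever lists suit your comment checks
my @zones = ('dnsbl.sorbs.net', 'zen.spamhaus.org');

# Returns the first zone that lists the IP, or nothing if it's clean
sub ip_is_listed {
    my ($ip) = @_;
    my $reversed = join '.', reverse split /\./, $ip;   # 1.2.3.4 -> 4.3.2.1
    my $resolver = Net::DNS::Resolver->new;

    for my $zone (@zones) {
        my $reply = $resolver->query("$reversed.$zone", 'A');
        return $zone if $reply && $reply->answer;       # any A record means listed
    }
    return;
}

# 127.0.0.2 is the conventional test address most DNSBLs list
if (my $zone = ip_is_listed('127.0.0.2')) {
    print "Listed in $zone - treat the comment as likely spam\n";
}
else {
    print "Not listed - let the usual filters decide\n";
}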

File Under: spam / technology
NO COMMENTS


Somebody's Watching Me

Posted on 13th May 2007

I've had tracking code in Labyrinth for some time, but it's mostly to track popular galleries and photos. It does count pages, but nothing as detailed as Google Analytics. I'd heard interesting comments about this Google service, and seeing as I can't use their AdSense service for any practical purpose, I thought I'd give it a try. So for the past few days I've been adding the appropriate code into several of my sites. I was looking at the reports this morning for some of the more popular sites and they make interesting reading.

Many of the sites are specifically aimed at a UK audience, so it's not too surprising to see the majority of visits are from UK residents. However, some, particularly my Perl sites, are of global interest, so I'm hoping to spot any interesting trends and identify the popular pages. It's early days yet, but so far my CPAN Testers Statistics site is popular in Germany and the US. It'll be interesting to see what the analytics report when the CPAN Testers Wiki finally goes live.

However, the biggest benefit to using Google Analytics is that I can show anyone I do sites for a more detailed picture of the response to their site. Kev is always quite keen to see what the response is like after The Scooter Do has an event. The gallery for the night always seems popular, but now we'll be able to see whether that's true and whether site visitors browse the rest of the site.
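
As for the simpler counting Labyrinth itself does, it amounts to little more than the sketch below. This isn't the actual Labyrinth code, and the table and page names are invented, but it shows the general idea of bumping a per-page counter for each request.

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Invented schema: one row per page, with a simple view counter
my $dbh = DBI->connect('dbi:SQLite:dbname=hits.db', '', '', { RaiseError => 1 });

$dbh->do(<<'SQL');
CREATE TABLE IF NOT EXISTS hits (
    page    TEXT PRIMARY KEY,
    counter INTEGER NOT NULL DEFAULT 0
)
SQL

# Bump the counter for a page, creating the row on first sight
sub record_hit {
    my ($page) = @_;
    $dbh->do('INSERT OR IGNORE INTO hits (page, counter) VALUES (?, 0)', undef, $page);
    $dbh->do('UPDATE hits SET counter = counter + 1 WHERE page = ?', undef, $page);
}

record_hit('/photos/gallery/42');

my ($count) = $dbh->selectrow_array(
    'SELECT counter FROM hits WHERE page = ?', undef, '/photos/gallery/42');
print "/photos/gallery/42 has been viewed $count times\n";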

File Under: google / technology / usability / web
NO COMMENTS


Dead And Bloated

Posted on 30th April 2007

If you've bought a desktop or laptop in recent years that came with a version of Windows on it, the chances are that there is an awful lot of "bloatware" preinstalled, taking up valuable resources and often hindering the performance of the machine. It's often a reason why I've heard non-IT people complain about Windows. Now, a technically savvy person can generally get rid of most of the unwanted applications, but I am seeing far too many getting in under the guise of helper and support functions.

My sister had a problem with her machine, and asked me to take a look. Apparently it took ages to load up and wasn't particularly fast when it did finally load. Considering it's a 2.70GHz machine, this wasn't a good sign. I did suggest getting some more memory, so before I called round she bought a 512MB memory stick to complement the 256MB she already had.

I started by turning on the machine and watching it load. It took nearly 10 minutes! She was running Windows XP, and even though it's sluggish on my laptop, it's nowhere near that bad. Then trying to open anything caused the disk drive to be almost permanently spinning. A first look at the Taskbar and Start Menu items revealed a large collection of apps that mostly just sit there, then come alive to "check things" every few minutes. I immediately removed them all, except a couple of essential ones. I then installed TweakAll, which I've often found to be a handy utility for finding all the "invisible" start-up apps. Several featured, which on closer inspection were phone-home type apps. The worst offender turned out to be Hewlett-Packard. They have a "Motive Chorus Daemon" application that gets installed when you install the drivers and imaging apps from the CD that came with my sister's All-In-One Scanner/Printer. I've blocked some of the network traffic, but I suspect there's more.
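
If you fancy poking at those hidden start-up entries yourself, a rough Perl sketch along the lines of the one below will list the standard Run keys by shelling out to Windows' own reg tool. It's nothing to do with TweakAll, and it only covers the two most common keys; RunOnce, services and the like would need adding.

#!/usr/bin/perl
use strict;
use warnings;

# The two standard Run keys; RunOnce, services etc. are not covered here
my @run_keys = (
    'HKLM\Software\Microsoft\Windows\CurrentVersion\Run',
    'HKCU\Software\Microsoft\Windows\CurrentVersion\Run',
);

for my $key (@run_keys) {
    print "== $key ==\n";
    # 'reg query' ships with Windows XP and later
    my @lines = `reg query "$key"`;
    for my $line (@lines) {
        next unless $line =~ /REG_(?:EXPAND_)?SZ/;   # keep only the value lines
        print $line;
    }
}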

It really is horrendous how many spyware and intrusive applications are bundled with software these days. The unwanted apps on my sister's machine were all either preinstalled or installed by driver CDs with new devices. It took 5 hours to clean the machine, after which I'd reclaimed over 1GB of disk space. The machine loaded in roughly 1 minute, and opening a browser window now happened in seconds with the disk drive barely spinning. In fact, if you blinked you'd probably miss the orange flash of the LED. Not surprisingly my sister is very relieved, as it had been a cause of frustration for some time.

I recently bought a new laptop from Dell, and although I specifically said I wanted a bare-bones system, I still got bloatware on there. Thankfully not very much, but enough to be a nuisance to uninstall. However, on both the laptop (even though I made a point of explicitly saying 'remove it') and my sister's machine, there was a little app that appears under different names but does exactly the same thing: Remote Assistant. If you ever see anything like it on your machine, I would advise you to get rid of it as soon as possible. It allows someone to remotely log on to your machine, without you asking or even accepting, and alter it. This cropped up recently in a thread on a LUG mailing list and was thought to be a hoax. Unfortunately not. I'm absolutely amazed that vendors have actually got away with this, but then Microsoft have finally found a way to sell you software to cripple your machine, so why not the vendors too.

Incidentally, the BBC reported that Dell are offering XP again on some models. If you email them directly, like I did, you can get XP on any model you want. There is no way I wanted Vista installed anywhere near my machines, and from reports around the internet there are too many driver and device incompatibility issues for me to ever be encouraged to use it. The fact that it also comes with the inbuilt "security protection" of DRM is just another reason not to go near it. I don't think I've ever seen such a negative response to a new Windows OS. At a recent Birmingham Perl Mongers technical meeting, the comment made about the fancy graphics was that if you wanted XGL that badly, why not just install Linux. I installed Ubuntu :)

File Under: rant / technology / usability
NO COMMENTS


On The Air

Posted on 18th April 2007

Seeing as I live in the area, this got plastered all over the local news. Two people were cautioned for using someone else's wireless network. Now, I don't care for the sensationalist reporting, but the heart of the article does convey that we require better education for home users about this. However, tradition dictates that readers and viewers only care if the media scares them enough! If someone is going to put up cardboard to hide their activities, then yes, they are probably doing something they shouldn't, but scaremongering that anyone with a laptop outside your house is going to be hacking your network or accessing illegal sites is just irresponsible.

If the network is routed via a broadband connection, as is suggested here (pretty much most of Redditch and South Birmingham have cable), the network owner is paying a flat fee. Additional usage from someone else incurs no extra charge. Admittedly this may be different for some other providers that charge for bandwidth, or if the user of your network decides to download large files. In pretty much every instance, these wardrivers are not hacking your network, they are just using the transport mechanism to access the internet. While there might be some who are now considering accessing illegal sites, the traditional wardriver is more interested in standard web surfing, checking email or accessing their own servers when they are away from home.

The upshot is that situations like the one Grep found himself in, and pretty much every wardriver, are now going to be seen as illegal in the public's eyes. I've used other open wireless networks, and it was extremely useful when I was looking for second-hand cars, as I could search the AutoTrader site and look up maps to figure out where to go next. My wireless network is open by choice, as I don't live somewhere you can just casually drive by anyway, and it's convenient when anybody stays over for them just to connect, rather than mess about with configuring secure wireless settings.

However, wouldn't it be much better to try and educate owners of wireless networks about the implications of running an open wireless network, and how to do basic security? It's not difficult to secure a wireless router enough to make it unavailable to the casual wardriver. If someone is dedicated enough to want to crack the router's security, then there isn't much you are going to do about it anyway. Although there is an explanation of how to set up security in the user guide for probably every wireless router, it's easy to forget that many owners don't understand it and just want it to work.

Apparently this is the first case of its kind. It's a shame it's happened at all, as the encouragement for open wireless networks (such as SOWN) or wireless community projects is potentially going to suffer. If these people were doing something illegal, then fine, they should accept the consequences. The sensationalism attached to the story, though, is largely illusory, as the reporters have no idea what sites or activities these individuals were using the networks for.

The story has now gone national. One interesting point that has been noted elsewhere is how this might affect owners of Nokia and Motorola WiMAX phones. Members of my family haven't asked yet, but I suspect I may be busy this weekend.

File Under: redditch / technology / wardriving / wifi
NO COMMENTS


Some Rights Reserved

Unless otherwise expressly stated, all original material of whatever nature created by Barbie and included in the Memories Of A Roadie website and any related pages, including the website's archives, is licensed under a Creative Commons Attribution Non-Commercial License. If you wish to use material for commercial purposes, please contact me for further assistance regarding commercial licensing.