Let's Go Crazy

Posted on 31st May 2016

Last weekend saw me in Rugby for the 9th QA Hackathon. This is a Perl event where the key developers for CPAN, PAUSE, MetaCPAN, CPAN Testers, Dist::Zilla, Test2 and many other Perl 5 and Perl 6 projects get together to discuss problems and future plans, and collaborate on code.

Although I was a co-organiser of the event, I really would like to thank my fellow co-organisers: Neil Bowers (NEILB) and JJ Allen (JONALLEN). Without these guys, organising this QA Hackathon would have been tough, as they really did all the hard work. Also many thanks to Wendy for keeping us fed with nibbles, keeping notes and generally making sure we all stayed focused. An event like this needs a team, and they are an awesome team.

My main aim for this event was to meet Doug Bell (PREACTION). Back last summer, the CPAN Testers server had some severe problems, which meant we had to switch to a new physical server. It was at this moment I realised that I couldn't do this alone any more. Doug stepped up and started to take over the reins, and has done a great job since. However, I'd never met Doug, so this was going to be the first opportunity to catch up in person. After only a few moments of saying hello, I knew we had found the right person to take over CPAN Testers. Doug has a lot of great ideas, and is more than capable of taking the project to the next level, which is where I wanted to see it grow, but knew it needed fresh eyes to take it there. I feel immensely confident that I have left the keys in capable hands, and with the ideas Doug has already shown me, I expect bigger and better things for CPAN Testers' future. Please look after him :)

On the first day Oriol Soriano Vila (UREE) introduced himself to Doug and me. Oriol was suggested to the organisers by his employer, and after we explained what the event was about, Oriol was even more enthusiastic to attend. I'm glad he did, as he is another great asset to both CPAN Testers and Perl. Although we have referred to him as our "intern", Oriol proved he was simply one of the team. He has some great ideas, asked all the right questions and had results by the end of the hackathon too! You can read more on his own blog.

So once we got our introductions out of the way, we started looking at a high-priority problem, one that had been reported in two different ways but was in fact the same issue: the Summary RAG bars and the release database (as used by MetaCPAN). It turns out the problem was straightforward. After the server crash last year, the database schema used to rebuild the new server was missing a new column in the release summary table, which thus wasn't getting updated. Once that was fixed, it was "simply" a matter of rebuilding the table. Sadly it took the whole weekend to rebuild, but once completed, we were able to start regenerating the release SQLite database. That took a week, but I'm pleased to say all is now updating and available again.

While that was rebuilding, I started to take a look at some other issues. After introducing Oriol to our family of websites, he registered for the wiki, and spotted a problem with the registration process. After some tinkering, I got that working again. I've no idea how long it's been a problem, but apologies to anyone affected.

In the introductions at the beginning of the event, Leo Lapworth (LLAP) mentioned that he was hoping to refine MetaCPAN's use of Fastly, and was interested in helping anyone else who might be interested in using the service for their project. I got Leo to sit with me for a while and he gave me a good run through of what the service is, what it can do, and why we should use it for CPAN Testers. I didn't take much convincing, and quickly opened an account and started to move the main family of websites to it. We have since seen a slight drop in hits to the server, but I expect that to improve as the static pages (the individual reports) are cached. Even the dynamic pages can benefit from caching, as although many will change throughout the day, only a small portion are updated more than once an hour. Once we learn more about Fastly, expect to see better response times for your page hits.

Talking with Christian Walde (MITHALDU), he mentioned he wanted to help with the performance of the websites, particularly the Reports website. However, with the rebuilding ongoing, the server wasn't in the best place to really evaluate performance. He did happen to mention that the reports he was getting from the mailer were coming through as garbage. After some investigation, I discovered that the mailer had not been upgraded to use Sereal, which is now our serialiser of choice for the reports stored in the database. With that fixed, together with some further improvements and with all tests passing, we put it live and waited. The following morning Christian reported he had readable reports coming through again.

One aspect of testing the Reports site, and one that restricted Christian's ability to evaluate its performance, is that apart from my and Doug's development machines, there is no stable, installable full instance of the CPAN Testers Reports site, including databases and cron scripts. As such, Doug has been working on providing exactly that. It has been on my TODO list for some time, as some of the bug reports and issue requests would have been quashed much more efficiently had others been able to fire up a working site and send a pull request. You can read more about Doug's progress on his blog, and hopefully this will encourage more people in the longer term to get involved with CPAN Testers development work.

Throughout the weekend I worked on cleaning up some of the templates on the various websites, ensuring that sponsors were correctly attributed, and fixed several bugs in some of the associated distributions. Not all have been pushed to CPAN, but work is ongoing.

Having finally met, Doug and I went through all the website logins and social media accounts, and made sure he had all the keys. The handover process has been a longish one, but I didn't want to overwhelm Doug, and wanted him to find his feet first. After this weekend, expect more posts and updates from him rather than me. Please look after him :)

I also joined in some of the discussions regarding the CPAN River and the naming of the QA Hackathon. Neil has written up both admirably, and while I didn't contribute much, it was good to see a lot of healthy discussion on both subjects. Regarding the naming of the event, I do think it's a shame that the likes of Google have turned the word "Hackathon" into the concept of a competition event, which the QA Hackathon definitely is not. Ours is about collaboration and planning for the future, with many of the key technical leads for the various toolchain and associated projects within Perl 5 and Perl 6. I don't have a suitable name to suggest, but I would recommend ensuring the acronym could not be used negatively.

In the coming weeks, I hope to collate all the website tests I run prior to updating the CPAN Testers family of websites, and hand them over to Doug for his new CPAN Testers development environment. This will hopefully enable easier access for anyone wanting to help fix problems on the websites and backends in the future.

In short, my completed tasks during the hackathon were:

  • Fixed the registrations for the CPAN Testers Wiki.
  • Got CPAN Testers Reports running on the Fastly (http://fastly.com) service, allowing us to cache some of the pages and reduce the load on the webserver when recreating reasonably static pages. It also means improved routing for anyone viewing the site from outside Europe, reducing page load times.
  • Fixed some bugs in the Reports Mailer, refreshed the tests and test data, and tidied up the notifications.
  • Fixed the Reports Mailer for sending individual reports, due to the DB storage now using Sereal. Note this had no effect on the summary reports.
  • Fixed a long-running bug with the Summary panel (and release summary table), which it turns out has also been affecting MetaCPAN.
  • Continued to hand over the final keys to Doug Bell (PREACTION), who is now carrying the torch for CPAN Testers.
  • Fixed a few bugs in other distributions, a couple related to CPAN Testers.
  • Cleaned up some of the CPAN Testers family website templates.
  • Joined discussions for The Perl River, the (re)naming of the QAH and the future of CPAN Testers.

It was a very productive event, and for CPAN Testers, I'm pleased it gave Doug and me a chance to knowledge share, and ensure he has everything he needs to not only keep the project going, but help develop new ideas to solve some of the big data problems that CPAN Testers sometimes throws up. Over the past 6 months or so, I have been taking a back seat, for various reasons, and in the coming months you will hear much less from me regarding CPAN Testers. Occasionally, I may pitch in to discussions to help give some background to decisions that were made, to give some context to why we wrote code a certain way, or designed a DB table the way we did, but this is now Doug's project, and he will be the main point of contact.

During the wrap-up at the end of the event, where we each got to say a little piece about what we achieved, Chris Williams (BINGOS) made an announcement to say thank you to me for 10 years of CPAN Testers. After taking on the challenge to grow CPAN Testers, and make it more interesting for people to get involved, I think I've achieved that. The project is well respected throughout the Perl community, I've had some kind words from people in the wider open source community too, and with over 68 million test reports in the database, I think I can safely say it has been a success. I wish Doug all the best taking it to the next level, and hope he gains as much knowledge and experience (if not more) from the project as I've done. Thanks to everyone who has supported the project, me, and all those who came before.

The QA Hackathon would not have been possible without the Sponsors. No matter what they have contributed, we owe them all our thanks for enabling the participants the time and ability to work together for the benefit of all. Thank you to FastMail, ActiveState, ZipRecruiter, Strato, SureVoIP, CV-Library, OpusVL, thinkproject!, MongoDB, Infinity, Dreamhost, Campus Explorer, Perl 6, Perl Careers, Evozon, Booking, Eligo, Oetiker+Partner, CAPSiDE, Perl Services, Procura, Constructor.io, Robbie Bow, Ron Savage, Charlie Gonzalez, and Justin Cook.

File Under: hackathon / opensource / perl / qa / rugby

Crash Course in Brain Surgery

Posted on 22nd March 2015

A Year of CPAN Uploads

On Thursday, 19th March 2015 I uploaded my 366th consecutive release to CPAN. To most that may well be "meh, whatever!", but for me it has been an exhausting yet fulfilling exercise. The last 60 days, though, were undoubtedly the hardest to achieve.

When I started this escapade, I did it without realising it. It was several days before I noticed that I had been committing changes every day, just after the QA Hackathon in Lyon. What made it worse was that I then discovered I had missed a day, and could have had a 3 day head-start beyond the 9 days I already had in hand. Just one day behind me was Neil Bowers, and the pair of us set about trying to reach 100 consecutive days. It took a while for us to get into the flow, but once we did, we were happily committing each day.

Both of us created our own automated upload scripts, to help us maintain the daily uploads. This was partly to ensure we didn't forget, but also allowed us to be away for a day or two and still know that we would be able to upload something. In my case I had worried I would miss out when I went on holiday to Cornwall, but thankfully the apartment had wifi installed, and I was able to manage my releases and commits every morning before we left to explore for the day.

I mostly worked at weekends and stocked up on releases, sometimes with around 10 days prepared in advance. Most of the changes centred around bug fixes, documentation updates and test suite updates, but after a short while, we both started looking at our CPANTS ratings and other metrics around what makes a good packaged release. We both created quests on QuestHub, and ticked off the achievements as we went. There were plenty of new features along the way too, as well as some new modules and distributions, as we both wanted to avoid making only minor tweaks, just for the sake of releasing something. I even adopted around 10 distributions from others, who had either moved on to other things or sadly passed away, and brought them all up to date.

Sadly, Neil wasn't able to sustain the momentum, and had to bail out after 111 consecutive uploads. Thankfully, I still had plenty of fixes and updates to work through, so I was hopeful I could keep going for a little while longer at least.

One major change that happened during 2014 was to the CPANTS analysis code. Kenichi Ishigaki updated the META file evaluations to employ a stricter rendition of the META Specification, which meant the license field in most of my distributions on CPAN now failed. As a consequence this gave me around 80 distributions that needed a release. On top of this, I committed myself to releasing 12 new distributions, one each month, for a year, beginning in March 2014. Although I've now completed the release of the 12 distributions, I have yet to complete all the blog posts, so that quest is still incomplete.

I made a lot of changes to Labyrinth (my website management framework) and the various ISBN scrapers I had written, so these formed the bedrock of my releases. Without these I probably wouldn't have been able to make 100 consecutive releases, and definitely not a full year. But here I am 366+ days later and still have releases yet to do. Most of my future releases will centre around Labyrinth and CPAN Testers, but as both require quite in-depth work, it's unlikely you'll see such a frequent release schedule. I expect I'll be able to get at least one release out a week, to maintain and extend my current 157 week stretch, but sustaining a daily release is going to be a struggle.

Having set the bar, Mohammad S Anwar (MANWAR) and Michal Špaček (SKIM) have now entered the race, and Mohammad has said he wants to beat my record. Both are just over 200 days behind, and judging from my experience, they are going to find it tricky once they hit around 250, unless they have plenty of plans for releases by then. After 100, I had high hopes of reaching 200, however I wasn't so sure I would make 300. After 300, it really was much tougher to think of what to release. Occasionally, I would be working on a test suite and bug fixes would suggest themselves, but mostly it was about working through the CPAN Testers reports. Although, I do have to thank the various book sites too, for updating their sites, which in turn meant I had several updates I could make to the scrapers.

I note that Mohammad and Michal are both sharing releases against the Map-Tube variants, which may keep them going for a while, but eventually they will need to think about other distributions. Both have plenty of other distributions in their repertoire, so it's entirely possible for them both to overtake me, but I suspect it will be a good while before anyone else attempts to tackle this particular escapade. I wish them both well on their respective journeys, but at least I am safe in the knowledge that I was the first to break 1 year of daily consecutive CPAN uploads. Don't think I'll be trying it again though :)

File Under: cpan / opensource / perl

100 Nights

Posted on 13th July 2014

100 in more ways than one!

100 #1

11 years ago I was eager to be a CPAN Author, except I had nothing to release. I tried thinking of modules that I could write, but nothing seemed worth posting. Then I saw a post on a technical forum, and came up with a script to give the result the poster was looking for. Looking at the script I suddenly realised I had my first module. That script was then released as Calendar::List, and I'm pleased to say I still use it today. Although perhaps more importantly, I know of others who use it too.

Since then, I have slowly increased my distributions to CPAN. However, it wasn't until I got involved with CPAN Testers that my contributions increased noticeably. Another jump was when I wrote some WWW::Scraper::ISBN driver plugins for the Birmingham Perl Mongers website to help me manage the book reviews. I later worked for a book publishing company, during which time I added even more. My next big jump was the release of Labyrinth.

In between all of those big groups of releases, there have been several odds and ends to help me climb the CPAN Leaderboard. Earlier this year, with the idea of the Monthly New Distribution Challenge, I noticed I was tantalisingly close to having 100 distributions on CPAN. I remember when Simon Cozens was the first author to achieve that goal, and it was noted as quite an achievement. Since then Adam Kennedy, Ricardo Signes and Steven Haryanto have pushed those limits even further, with Steven having over 300 distributions on CPAN!

My 100th distribution came in the form of an adoption, Template-Plugin-Lingua-EN-Inflect, originally written by the sadly departed Andrew Ford.

100 #2

My 100th distribution came a few days before I managed to complete my target of 100 consecutive days of CPAN uploads, a run I started accidentally. After the 2014 QA Hackathon, I had several distribution releases planned. However, had I realised what I could be doing, I might have been a bit more vigilant and not missed the day between what now seems to be my false start and the real run. After 9 consecutive days, I figured I might as well try to reach at least a month's worth of releases, and take the top position from ZOFFIX (who had previously uploaded for 27 consecutive days) for the once-a-day CPAN regular releasers.

As it happened, Neil Bowers was on a run that was 1 day behind me, but inspired by my new quest, decided he would continue as my wingman. As I passed the 100 consecutive day mark, Neil announced that he was to end his run soon, and finally bowed out after 111 days of releases. My thanks to Neil for sticking with me, and additionally for giving me several ideas for releases, both as suggestions for package updates and a few ideas for new modules.

I have another quest to make 200 releases to CPAN this year, and with another 20 releases currently planned, I'm still continuing on. We'll see if I can make 200, or even 365, consecutive days, but reaching 100 was quite a milestone that I didn't expect to achieve.

100 #3

As part of my 100 consecutive days of CPAN uploads challenge, I also managed to achieve 100 consecutive days of commits to git. I had been monitoring GitHub for this, and was gutted to realise that just after 101 days, I forgot to commit some changes over that particular weekend. However, I'm still quite pleased to have made 101 days. I have a holiday coming up soon, so I may not have been able to keep that statistic up for much longer anyway.

100 #4

As part of updates to the CPAN Testers Statistics site, I looked at some additional statistics regarding CPAN uploads. In particular looking at the number of distributions authors have submitted to CPAN, both over the life of CPAN (aka BackPAN) and currently on CPAN. The result was two new distributions, Acme-CPANAuthors-CPAN-OneHundred and Acme-CPANAuthors-BACKPAN-OneHundred.

When I first released the distributions, I only featured in the second. For my 100th consecutive day, I released the latest Acme-CPANAuthors-CPAN-OneHundred up to that day, and with my newly achieved 100th distribution, was delighted to feature in the lists for both distributions.

File Under: opensource / perl

Time Waits For No One

Posted on 10th May 2014

When I relaunched the CPAN Testers sites back in 2008, I was in a position to be responsible for 3 servers: the CPAN Testers server, the Birmingham Perl Mongers server, and my own server. While managing them wasn't too bad, I did think it would be useful to have some sort of monitoring system that could help me keep an eye on them. After talking to a few people, the two systems most keenly suggested were Nagios and Munin. Most seemed to favour Munin, so I gave it a go. Sure enough it was pretty easy to set up, and I was able to monitor the servers from my home server. However, there was one area of monitoring that wasn't covered: the performance of the websites.

At the time I had around 10-20 sites up and running, and the default plugins didn't provide the sort of monitoring I was looking for. After some searching I found a script written by Nicolas Mendoza. The script not only got me started, but helped to make clear how easy it was to write a Munin plugin. However, the script as it was didn't suit my needs exactly, so I had to make several tweaks. I then found myself copying the file around for each website, which seemed a bit unnecessary. So I wrote what was to become Munin::Plugin::ApacheRequest. Following the Hubris and DRY principles, copying the script around just didn't make sense, and being able to upgrade via a Perl module on each server was far easier than updating the 30+ scripts for the sites I now manage.

Although the module retains the original intent of the script, how it achieves it has changed. The magic still happens in the script itself.

To start with an example, this is the current script to monitor the CPAN Testers Reports site:

#!/usr/bin/perl -w
use Munin::Plugin::ApacheRequest;
my ($VHOST) = ($0 =~ /_([^_]+)$/);
Munin::Plugin::ApacheRequest::Run($VHOST, 1000);

Part of the magic is in the name of the script. This one is 'apache_request_reports'. The script extracts the last section of the name, in this case 'reports', and passes that to Run() as the name of the virtual host. If you wish to name the scripts slightly differently, you only need to amend this line to extract the name of your virtual host as appropriate. If you only have one website you may wish to name the host explicitly, but then if you create more it does mean you will need to edit each file, which is what I wanted to avoid. All I do now is copy an existing file to one to represent the new virtual host when I create a new website, and Munin automatically adds it to the list.

Munin::Plugin::ApacheRequest does make some assumptions, one of which is where you locate the log files, and how you name them for each virtual host. On my servers '/var/www/' contains all the virtual hosts (/var/www/reports, in this example), and '/var/www/logs/' contains the logs. I also use a conventional naming system for the logs, so '/var/www/logs/reports-access.log' is the Access Log for the CPAN Testers Reports site. Should you have a different path or naming format for your logs, you can alter the internal variable $ACCESS_LOG_PATTERN to the format you wish. Note that this is a sprintf format, and the first '%s' in the format string is replaced by the virtual host name. If you only have one website, you can change the format string to the specific path and file of the log, and no string interpolation is done.
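As an illustration, a variant of the script might set the pattern before calling Run(). The path and filename layout here are hypothetical, and this assumes $ACCESS_LOG_PATTERN is a package variable you can set from outside the module:

```perl
#!/usr/bin/perl -w
use Munin::Plugin::ApacheRequest;

# Hypothetical layout: logs live under /srv/logs and are named
# <vhost>_access.log; the %s is replaced by the virtual host name.
$Munin::Plugin::ApacheRequest::ACCESS_LOG_PATTERN = '/srv/logs/%s_access.log';

my ($VHOST) = ($0 =~ /_([^_]+)$/);
Munin::Plugin::ApacheRequest::Run($VHOST, 1000);
```

The override sits before the call to Run(), so the module picks up the new pattern when it goes looking for the log file.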

The log format used is quite significant, and when you describe the LogFormat for your Access Log in the Apache config file, you will need to use an extended format type. The field to show the time taken to execute a request is needed, which is normally set using the %T (seconds) or %D (microseconds) format option (see also Apache Log Formats). For example my logs use the following:

LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-agent}i\" %T %v"

The second to last field is our time field. In Munin::Plugin::ApacheRequest, this is stored in the $TIME_FIELD_INDEX variable. By default this is -2, assuming a similar log format to the above. If you have a different format, where the execution time is in another position, you can change this variable in your script before calling Run(), just as with $ACCESS_LOG_PATTERN. A positive number indexes columns left to right, while a negative number indexes right to left.
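To make the indexing concrete, here is a small standalone sketch (the log line is made up) showing how a negative index counts fields from the right, so -2 lands on the %T value no matter how many fields precede it:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# A fabricated access-log line using the extended LogFormat above;
# the second-to-last field (here '3') is the %T request time in seconds.
my $line = '192.0.2.1 - - [10/May/2014:12:00:00 +0000] "GET / HTTP/1.1" '
         . '200 1234 "-" "Mozilla" 3 reports.example.org';

my $TIME_FIELD_INDEX = -2;        # count fields from the right
my @fields = split ' ', $line;    # naive whitespace split, as a plugin might do
print "request time: $fields[$TIME_FIELD_INDEX] seconds\n";
```

Note that the quoted request string is itself split on whitespace here, which is why counting from the right is the robust choice: the fields after the time value never contain spaces.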

The last number passed to the Run() method determines the number of lines read from the access log to calculate the average execution time. For high hit rate sites, you may wish this to be a higher number, but as most of my sites are not that frequently visited, 1000 seems to be a reasonable number.

The config statements that are generated for the Munin master monitor are currently hardcoded with values. This will change in a future version. For the example above the config produced reads as:

graph_title reports ave msecs last 1000 requests
graph_args --base 1000
graph_scale no
graph_vlabel Average request time (msec)
graph_category Apache
graph_info This graph shows average request times for the last 1000 requests
images.warning 30000000
images.critical 60000000
total.warning 10000000
total.critical 60000000

The values 'reports' and 1000 above are interpolated from the arguments passed to Run(). In a future version I want to allow you to reconfigure the warning and critical values and the graph base value, should you wish to.

I have now been using Munin::Plugin::ApacheRequest and the associated scripts for 6 years, and it has proved very successful. I have thought about releasing the module to CPAN before, and have made several attempts to contact Nicolas over the years, but have never had a reply. I know he was working for Opera when he released his script, but have no idea of his whereabouts now. As the script contained no licensing information, I was also unsure what licence he had intended for the code. I hope he doesn't mind that I have adapted his original script, and that I'm now releasing the code under the Artistic License v2.

Although I haven't been able to contact Nicolas, I would like to thank him for releasing his original script. If I hadn't found it, it is unlikely I would have found a way to write a Munin plugin myself to do Apache website monitoring. With his head start, I discovered how to write Munin plugins, and can now set up monitoring of new websites within a few seconds. Thanks Nicolas.

File Under: opensource / perl / website

History Of Modern (part I)

Posted on 23rd February 2014

Neil Bowers recently unleashed CPAN::ReleaseHistory on the world. Internally the distribution uses a BACKPAN index, which records every release to CPAN. I was already interested in this kind of representation, as I wanted to add a similar metric to each Author page of the CPAN Testers Reports website, but hadn't got around to it. Neil then wrote about the script included in the distribution, cpan-release-counts, in an interesting post: What's your CPAN release history?.

After a quick download, I ran the following for myself:

barbie@kmfdm:~$ cpan-release-counts --char = --width 30 --user barbie
 2003 ( 12) ==
 2004 ( 26) =====
 2005 ( 80) ===============
 2006 (  6) =
 2007 ( 59) ===========
 2008 ( 62) ===========
 2009 (122) =======================
 2010 (148) ============================
 2011 ( 89) =================
 2012 (156) ==============================
 2013 (123) =======================
 2014 ( 11) ==

So my most prolific year was 2012. I'll have to see if I can change that this year. However, it does give a nice yearly snapshot of my releases.

As it turns out, for CPAN Testers I don't need the BACKPAN index, as I already generate and maintain an 'uploads' table within the 'cpanstats' database. I do need to write the code to add this metric to the Author pages. Thanks to Neil's script though, he has given me a starting point. Being able to see the releases for yourself (or a particular Author) is quite cool, so I may adapt that to make any such metric more dynamic. It might also be worth adding a more generic metric for all of CPAN to the CPAN Testers Statistics website. Either way, I now have two more things to add to my list of projects for the QA Hackathon next month. Neil will be there too, so I hope he can give me even more ideas, while I'm there ;)

File Under: hackathon / opensource / perl


Some Rights Reserved Unless otherwise expressly stated, all original material of whatever nature created by Barbie and included in the Memories Of A Roadie website and any related pages, including the website's archives, is licensed under a Creative Commons by Attribution Non-Commercial License. If you wish to use material for commercial purposes, please contact me for further assistance regarding commercial licensing.