UVN800 Wireless Remote Sensor Review

Posted April 13, 2014 by waynedgrant
Categories: Meteorology, Weather Station


Until recently I lived in Scotland where UV is hardly ever an issue given the hideous weather we tend to get there. However, I now live in New Jersey in the USA where the summers can be very hot and the risks of UV exposure are more of a concern for me. Given that I took my weather station with me when I relocated (it comprises both the WMR88 and WMR200 base stations) it made sense to consider expanding its capability to record UV.

One of the advantages of Oregon Scientific’s range of wireless weather stations is the ability to add extra sensors. I have, for example, previously taken advantage of this by adding an additional temperature and humidity sensor in the form of a THGR810 unit.

Both of my base stations also support a UV sensor in the form of the UVN800. This sensor is not normally bundled with the WMR88 or WMR200 but I purchased one recently as an add-on. The unit normally retails for $59.99 but I was able to pick it up for $43.79 from Amazon (not including sales tax). Both the full and discounted prices are on par with what I would expect to pay for other additional sensors such as the THGR810 mentioned above. It arrived undamaged and well packaged in a relatively small box:

UVN800 boxed

Upon unboxing I was presented with the UVN800 unit itself, a wall mount with two screws, a ground spike, AA batteries and instructions.

UVN800 unboxed

This provided me with two different installation options: either use the ground spike to plant the sensor in the ground, or use the wall mount and screws to attach it to a wall or pole. I chose the wall mount option and attached it to my existing sensor pole. However, I appreciate the flexibility of the ground spike option. The trick with the UVN800 is to orient it such that the UV sensor on top has a constant, uninterrupted view of the sky, which I could more easily achieve with the pole mounting.

UVN800 mounted

Installation of the UVN800 is fairly straightforward, if a little more awkward than it could be due to some weird choices made by the unit’s designers. First of all, accessing the battery compartment requires the removal of four small screws from the base of the unit. Why the compartment is not accessed by a sliding mechanism, like most of Oregon’s sensors, is a mystery.

Secondly, the wall mount is screwed into place at the bottom of the unit, obscuring the battery compartment and reset button. Given my pole-mounted configuration, changing batteries will be far more time-consuming than it should be. I will have to unscrew the wall mount from the pole, detach the sensor from the wall mount and remove the screws from the battery compartment. Only then can I change the batteries, after which I will have to reverse the procedure to reinstall the unit.

So far this is my only gripe with the UVN800 and it is not a deal breaker by any means. Once the batteries were installed, pairing it with my base stations was as simple as hitting the sensor’s reset button and initiating a search from each base station. They both started displaying UV Index readings straight away. On both the WMR88 and WMR200 this takes the form of a live UV Index display and a graph of the last 10 hours of values.

Having UV Index values displayed on my base stations was just the start, however. I publish weather data to my own website and wanted to add UV Index information to it. I use a Meteo Sheeva connected to my WMR200 as a data logger and to upload weather readings and graphs to my website. As expected, it was a snap to get it to start logging data from the UVN800. As an example, here is one of several graphs I have configured on the Meteo Sheeva, which displays the last 7 days of maximum UV Index values:

7day_uv
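
For the curious, the aggregation behind a chart like this is simple: collapse the logged readings into one maximum per day. Here is a minimal Python sketch of that idea; the function name and sample data are mine for illustration and say nothing about MeteoHub’s internals.

```python
from collections import defaultdict
from datetime import datetime

def daily_maxima(readings):
    """Collapse (timestamp, uv_index) pairs into one maximum value per day."""
    maxima = defaultdict(float)
    for timestamp, uv in readings:
        day = timestamp.date()
        maxima[day] = max(maxima[day], uv)
    return dict(sorted(maxima.items()))

# Hypothetical logged readings:
readings = [
    (datetime(2014, 4, 7, 12, 0), 5.2),
    (datetime(2014, 4, 7, 15, 0), 3.1),
    (datetime(2014, 4, 8, 13, 0), 6.8),
]
print(daily_maxima(readings))  # one maximum UV Index per day
```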

My Meteo Sheeva also uploads data in WD Live’s clientraw.txt format which includes UV Index readings. I have rearranged my existing WD Live console on my website to incorporate a UV Index bar:

wdlive
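
A clientraw.txt file is just a single line of space-separated values, which is what makes it so easy for other software to consume. Here is a minimal Python sketch of pulling the UV Index out of one; 79 is the commonly documented clientraw field index for UV (worth verifying against your own file) and the URL is a placeholder:

```python
import urllib.request

UV_FIELD = 79  # commonly documented clientraw.txt field index for UV Index

def fetch_uv_index(url):
    """Fetch a clientraw.txt file and return its UV Index reading."""
    with urllib.request.urlopen(url) as response:
        fields = response.read().decode("ascii", errors="replace").split()
    return float(fields[UV_FIELD])

# Hypothetical URL for illustration:
print(fetch_uv_index("http://example.com/clientraw.txt"))
```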

Finally, I wanted to be able to see up-to-date UV readings when I am outside. To do this I ideally want to be able to see the current UV Index on my phone. One of my winter projects was to develop an Android widget which displays the values found in online WD Live clientraw.txt files. The data the widget displays is user-configurable and one of the options is for colour-coded UV Index values:

Cirrus_ori_portrait
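
The colour coding follows the standard WHO UV Index exposure bands (low, moderate, high, very high, extreme). A sketch of that mapping, with a function name of my own choosing rather than anything from the widget’s source:

```python
def uv_colour(uv_index):
    """Map a UV Index value to the standard WHO exposure band colour."""
    if uv_index < 3:
        return "green"   # low
    if uv_index < 6:
        return "yellow"  # moderate
    if uv_index < 8:
        return "orange"  # high
    if uv_index < 11:
        return "red"     # very high
    return "violet"      # extreme

print(uv_colour(6.2))  # orange
```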

The widget is pretty much production-ready and supports 18 different weather data points and 16 different measurement units. I use the widget extensively, both for my own weather station and for others nearby or in places I travel to. However, at this time I am not sure whether or not I will publish it on the Google Play Store given how many similar apps are already available.

Returning to the UVN800 itself, there is one more thing to note. The reviews on Amazon for the UVN800 indicate that many units permanently fail just after a year of operation. At the time of writing I have only been operating the sensor for a few days but will add a note to this review if and when it fails.

Should the sensor last at least a couple of years (as all my other sensors have already done) then I would not hesitate to recommend the UVN800 as a useful, easy-to-use addition to an existing Oregon Scientific wireless station.

Sprint Retrospective Techniques Roundup

Posted March 28, 2014 by waynedgrant
Categories: Agile, Scrum


Over the last three years I have written a number of posts detailing various Sprint Retrospective techniques. Each technique’s writeup describes what it is good for, the steps to run it and includes pictures of real sessions. This post represents a roundup of all of the techniques I have written about with links to the original posts.

retros roundup

Enjoy.

Sprint Retrospective Techniques 3

Posted February 9, 2014 by waynedgrant
Categories: Agile, Scrum


This post presents three more sprint retrospective techniques to add to the six I have already detailed in my previous posts Sprint Retrospective Techniques 1 and Sprint Retrospective Techniques 2. Why present three more techniques? Surely six is enough to drive continuous improvement in any team? First of all, I have found these techniques to be useful additions to my arsenal for reasons I outline below. Secondly, variation is one of the keys to maintaining the effectiveness of a team’s retrospectives and more techniques make for more variety.

Technique 1 – 4Ls

The 4Ls is shorthand for the following:

  • What was Liked? What were the things that the team really appreciated about the sprint?
  • What was Learned? What were things that the team learned that they did not know before the sprint?
  • What Lacked? What were the things that the team thinks could have been done better in the sprint?
  • What was Longed For? What were the things the team desired or wished for but were not present during the sprint?

A 4Ls retrospective can be run by following these steps:

  • Create a poster for each of the Ls and stick them up on the wall (easel paper is good for this purpose but drawing sections on a whiteboard is a good alternative).
  • Explain the meaning of each of the Ls to the team.

  • Hand out sticky notes and markers to the team.
  • Encourage the team to place stickies with ideas onto each relevant poster and wait until everyone has posted all of their ideas.

  • Have the team group similar ideas together on each poster.

  • Discuss each grouping as a team and note any corrective actions.

The reason I like using the 4Ls is that it has the potential to cover a wide range of topics in a compact session. It addresses both the positive and negative aspects of the sprint (Liked and Lacked) but also specifically calls out the team’s growing experience (Learned) and problematic gaps that can be filled (Longed For).

Technique 2 – Satisfaction Histograms

There are many variations of the Satisfaction Histogram. I came across the version outlined here when a team member on a project where I was Scrum Master offered to run a couple of retrospective sessions. This is how he ran one of the sessions and it turned out to be very effective.

  • As preparation, pick around four topics that you want to gauge the team’s satisfaction with. These can be practices, behaviors or anything else you can think of.
  • For example: Testing practices, Keeping a clean build, Standup effectiveness, Accuracy of estimates, etc.
  • Draw a satisfaction histogram for each of the topics on a whiteboard. Label the x-axis 1-5 for each and add the topic name as a heading.

  • Explain the meaning behind each of the topics to the team.
  • Explain the meanings of the 1-5 scale to the team. For example, for “Our Team Communication is…”
    • 1 - “…disastrous, actively impeding the team”
    • 2 - “…bad, not being done effectively”
    • 3 - “…satisfactory, requires improvement”
    • 4 - “…mostly very good, could still be improved further in small ways”
    • 5 - “…awesome, little or no room for improvement”
  • Distribute sticky notes to the team, one per topic.
  • Invite the team to place one sticky note in each topic’s histogram to grade how satisfied they are with the team’s performance for that topic. Sticky notes placed on the same topic and number are stacked.
  • Wait until everyone has placed their stickies.
satisfaction histogram 2

  • Discuss the results for each topic in turn. Where there is low satisfaction or a wide spread of satisfaction grades, dig into why this is.
  • As potential corrective actions are identified by the team, especially for topics with mostly low numbers, note them down.
satisfaction histogram 3

This version of the Satisfaction Histogram has a number of advantages. First of all, the selection of topics means that it can be targeted at certain problem areas. Note that the technique can also be varied to allow the team to suggest topics. This can be done by leaving one histogram blank for the team to fill in with a topic of their own during the session. As the team becomes more familiar with the technique you can allow them to suggest all of the topics to be scored for satisfaction.

Secondly, it is very visual. At a glance everyone can see the topics where the team is satisfied, dissatisfied or in disagreement. This allows the team to focus on the topics where they believe they are most lacking or conflicted.

Technique 3 – Circles

Circles is more commonly known as Circles and Soup. I dislike the “Soup” metaphor so I refer to the technique simply as Circles. In addition, when I run this technique I replace “Soup” with “Concern”. I understand that the technique is based on Stephen Covey’s book The Seven Habits of Highly Effective People.

The idea behind circles is to get the team to focus their energies on what they can change and not to waste time worrying about what they cannot affect.

A Circles retrospective can be run by following these steps:

  • Define what an impediment is to the team:
    • “An impediment is anything that prevents you or the team from delivering work as efficiently as possible. An impediment is anything that blocks you working or slows you down.”
  • Distribute sticky notes and markers to the team.
  • Ask the team to write down all of the impediments encountered in the sprint, one per sticky note, and have them post them onto a whiteboard.
  • Wait until everyone has posted all of their impediment ideas.
circles 1

  • Ask the team to identify and remove any duplicate impediment stickies.

  • Draw three concentric circles on the whiteboard and label them, from the inside out, “Control”, “Influence” and “Concern”.

  • Define what each of the circles means to the team:
    • Control - Impediments for which the team can take action to remediate.
    • Influence - Impediments for which the team can collaborate with or make a recommendation to an outside entity to remediate. For example, another team, group or line management.
    • Concern - Impediments over which the team has no ability to Control or Influence.
  • Invite the team to collaboratively place each of the impediments in the appropriate circle.
  • Encourage and guide any debate as to what should go where. 
  • Wait until all of the impediments have been placed in one of the circles.

  • Go through each Control impediment one at a time and have the team:
    • Gain an understanding of what each impediment is.
    • Identify remedial actions that are within the team’s control and write these on the whiteboard.
  • Go through each Influence impediment one at a time and have the team:
    • Gain an understanding of what each impediment is.
    • Identify contacts and recommendations to influence their remediation and write these on the whiteboard.
  • Review the Concern impediments to gain an understanding of what they are.
  • Note that as the impediments are discussed the team may identify actions that had not previously occurred to them. This can cause impediments to move inward. For example, an impediment that was initially placed in Concern may move to Influence or Control.

Circles is my new favourite technique but I use it sparingly. The reason I try not to overuse it is that it focuses very much on impediments and does not have the “celebration of success” aspect that is built into most other retrospective techniques. However, it is a powerful technique for sprint retrospectives as well as for other purposes. It is, for example, ideally suited to holding retrospectives on releases, especially those that proved to be problematic.

Definition of Done Workshop

Posted December 11, 2013 by waynedgrant
Categories: Agile, Scrum


A few months ago I underwent a career change by converting from being a Scrum Master to an Agile Coach. I am still adjusting to my new role but am, by and large, enjoying the challenge and change of pace that it entails. I work with a number of internal teams which represent a fairly broad spectrum of Agile experience from novice to intermediate.

When I engage with a team for the first time I ask them a number of initial questions to help me to gauge their Agile maturity. Some of these concern the Agile ceremonies they practice and the particular Agile artifacts they make use of. What has surprised me so far is that an overwhelming number of these Agile teams did not have a Definition of Done.

For those not familiar with what a Definition of Done (DoD) is, I will attempt to summarise the concept here. A DoD is a checklist of useful activities that are carried out by a software team every time they implement a user story. The DoD can, for example, include things like ‘Code’, ‘Unit Test’, ‘Integration Test’, ‘Peer Review’, etc. The idea is that the culmination of all of these activities against a given set of requirements will result in potentially releasable software. That is, quality software that satisfies the requirements. Until all of these activities are completed successfully for a given user story it cannot be considered ‘Done’.

It is important to note that every team’s DoD will almost certainly contain different activities because they do different types of work and have different means of ensuring quality. Therefore every team has to spend some time coming up with their particular DoD.

For Agile teams an agreed and published DoD is essential as it brings transparency to a team’s way of working. Firstly, it helps to ensure that everyone on the team is on the same page as to what “Done” means and how they should get there. Secondly, as the DoD is written down it can be discussed and subsequently altered as the team agree to change how they work, for example via their retrospectives, to improve quality and efficiency.

Some of the teams I coach had not heard of a DoD. Some had but had not written theirs down. However, they insisted that they knew what it was. As an experiment I asked several members of the same team what they thought their team’s DoD was. I got back as many different answers as the number of team members I asked. Although the teams thought they had a shared idea of their DoD they did not.

As a result of these findings I have been on a DoD offensive of late. This has involved explaining the benefits of having a DoD and helping teams to get their initial DoD published. I have helped so many teams define these recently that I have come up with a workshop format to help drive them out. I present the outline of this DoD Workshop format in this post.

The Workshop Format

The workshop is designed to be run in-person and ideally the entire team should be present. When not all of the team members can attend and the team is not cross-functional, then all functions should be represented. No specific preparation is required on the part of the team. I allow an hour for the workshop and hold it in a room with a whiteboard or similar usable wall space.

I start by explaining what the session is about, i.e. creating the team’s DoD which will subsequently be published on their wiki. This of course involves explaining what a DoD is and how it can help the team. This normally raises a few questions from the team and I address these before moving onto the next stage of the workshop.

Next I hand out sticky notes and sharpies. I request that the team think of all of the high-level activities that they normally engage in to get a story done. I ask them to write each of these onto the sticky notes, one activity per sticky note. I insist that everyone writes up at least a few sticky notes and that they only include the activities that they do now, not things they would ideally do in the future.

Once everyone has finished writing I get the team to post all of their items, in no particular arrangement, onto the whiteboard.


Next I ask the team to collaboratively group similar and identical items together. Team members may have used different terminology to describe an otherwise identical activity but by working together they can identify which stickies are actually the same thing.


With the groupings complete we go through each one and validate that all of the items within it are truly the same activity. I also check that the team actually does the particular activity now and that it is not simply an aspiration. Any such aspirational items are removed.

Next I work with the team to place each activity group into a timeline. The placement of the activities reflects the order in which each takes place within the team’s development process.


With the timeline in place we now turn our attention to creating a headline name for each activity. This is necessary because individual team members may refer to the same activity by different names. For example, “Implementation”, “Coding” and “Development” are all different terms for the same thing. Having the team agree on a single name for each activity introduces a consistent terminology that the team can use to enhance their communication when discussing their DoD. In addition, if a team struggle to create a headline name for one of their activity groups then it suggests that it may need to be broken up into several more nameable activities.

The agreed headline names are written against each activity.


With the headline names agreed I go through each activity again. This time I ask the team to define what is involved in accomplishing each of them, focusing on the deliverables, outcomes and conditions that represent success. These descriptions are written against each activity beside the headline name.


The combination of the ordered activities and the more detailed descriptions represents the team’s DoD.

Post Workshop

Finally I snap a picture of the board and collaborate with the Scrum Master to have it documented in the team’s wiki and communicated out to the team. If possible I also recommend that a paper copy be posted somewhere in the team’s physical work area.

Note my insistence throughout the process on only including the things the team does now. This may seem overly constraining when the team could be suggesting good adaptations. However, the main purpose of the exercise is to capture what the team does now. Immediately after the DoD is agreed and published the team is free to discuss it and make positive changes. This may even happen in a separate, later section of the workshop. The key is to write it down first, warts and all, so that the team’s improvements can be made against a known state.

When to Run the Workshop

So far I have only used this workshop format to help document the DoD for teams that are already sprinting. However, new teams should not wait until they are partway through a project before they create their DoD. They should instead put it in writing at the very beginning.

I envisage that this workshop format could be used just as effectively to create the initial DoD for a newly formed team. The only difference would be to have the team write up stickies for the activities they think they will need based on their previous experience. Otherwise the session could be run in much the same way.

Meteo Sheeva

Posted November 23, 2013 by waynedgrant
Categories: Meteorology, Weather Station


Late last year I purchased a Meteo Sheeva to log the data from my WMR200 weather station. While the WMR200 is supposed to function as a simple data logger, Oregon Scientific’s awful software renders the feature useless. Therefore, to log data from the WMR200, I required a separate connected computer. My requirements for a data logging machine were that it be:

  • capable of logging the data from my WMR200’s sensors
  • able to publish my weather data live to the internet
  • compact
  • low-cost
  • turn-key

The Meteo Sheeva fulfills all of these requirements and then some.

A Meteo Sheeva is actually a combination of two things. Firstly there is the “Sheeva” part, which takes the physical form of a SheevaPlug, a compact Linux-based computer. Secondly there is the “Meteo” part: the Meteo Sheeva ships with an SD Card preloaded with a demo version of the Linux-based MeteoHub data logger software. I purchased the Meteo Sheeva unit from a UK vendor called New IT for £130 and licensed the MeteoHub software from Smartbedded for £50.

For me this is a relatively low-cost purchase compared to dedicating a full-size machine to data collection. I could have created a data logger for less money by configuring a Raspberry Pi with, say, wview. However, I was more interested in my particular solution being turn-key in nature than in saving more money.

The SheevaPlug fulfills the compact requirement as it measures only 11 x 7 x 5 cm. This makes it bigger than a cased Raspberry Pi but still small enough to tuck out of sight. This picture shows my SheevaPlug next to a beer mat for comparison purposes.


Like the Raspberry Pi, the SheevaPlug runs off an SD Card. It is powered off the mains via a detachable radio-type power cable. For connectivity it has a USB socket to connect to a weather station and an ethernet port for networking.

The unit’s setup is indeed turn-key and couldn’t be much easier. I simply connected the SheevaPlug to the mains, attached it via USB to the WMR200 and to my router using an ethernet cable. Once powered up, all interactions with the MeteoHub software are made via a web browser and a simple web-based UI. All I had to do to start data logging was access the Weather Station section of the UI, specify that a WMR200 was connected and then select which of the automatically detected sensors I wanted to log data from.

The MeteoHub interface is simple, uncluttered and well organised. For example, here is the Sensors page, which shows the current status of all connected weather sensors:


The MeteoHub manual is excellent, covering every detail of the software’s capabilities. This is invaluable, not because the MeteoHub software is difficult to use, but because of the massive amount of functionality it provides. I will describe a small subset of the main features which I have found to be the most useful for my own requirements. Bear in mind that MeteoHub is capable of much more than I make use of.

The first feature I started experimenting with was the ability to specify charts from the logged sensor data. I focused on bar and line charts but there are many more options available as well as flexible configurations for time period, aggregation buckets, scales and colours. With limited UI-based configuration I was able to create charts for all of my weather data types over many different time periods.

Here is a selection of the charts I have specified for my own setup:

Last Day of Maximum Wind Speed, Gust Speed and Wind Direction:


Last Day of Temperature, Humidity and Dew Point:


Last Month of Daily Surface Pressure Ranges:


Last Month of Daily Maximum Wind and Gust Speeds:


Last Year of Total Monthly Rainfall (data logging started in September):


Any of the charts can be displayed via the MeteoHub UI at any time. However, the charting feature really comes into its own when combined with the automated FTP upload feature. This can be used to publish updated charts to a website on a pre-defined schedule. I use this feature to publish my setup’s charts to my own weather website.
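
MeteoHub performs these uploads itself, but for a sense of what one scheduled push amounts to, here is a minimal Python sketch using the standard library’s ftplib; the host, credentials and paths are placeholders of mine, not details of my actual setup:

```python
import ftplib

def upload_chart(local_path, remote_name):
    """Push a freshly rendered chart image to a web server via FTP."""
    with ftplib.FTP("ftp.example.com", "user", "password") as ftp:
        ftp.cwd("/public_html/weather/charts")  # hypothetical remote directory
        with open(local_path, "rb") as chart:
            ftp.storbinary(f"STOR {remote_name}", chart)

upload_chart("7day_uv.png", "7day_uv.png")
```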

MeteoHub can also automatically publish its data via FTP in WD-Live format. This makes it capable of feeding the Flash-based Weather Display Live dashboard. This is separate software from another vendor but costs only $40 and enables a dynamic, attractive-looking weather dashboard which is highly configurable. This is my particular online weather dashboard:


Weather data can also be pushed to more than two dozen online weather services including WeatherBug, Weather Underground and the UK Met Office. For example, here is my data displayed on the Met Office’s WOW website:

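For services like Weather Underground, uploads of this kind boil down to a simple HTTP request against the service’s personal weather station endpoint. A sketch of such a push in Python; the station ID and password are placeholders, and the exact parameters accepted may change over time:

```python
import urllib.parse
import urllib.request

def push_to_wunderground(temp_f, humidity, uv_index):
    """Send one observation to Weather Underground's PWS upload endpoint."""
    params = urllib.parse.urlencode({
        "ID": "KNJSTATION1",   # placeholder station ID
        "PASSWORD": "secret",  # placeholder station key
        "dateutc": "now",
        "tempf": temp_f,
        "humidity": humidity,
        "UV": uv_index,
        "action": "updateraw",
    })
    url = ("http://weatherstation.wunderground.com/"
           "weatherstation/updateweatherstation.php?" + params)
    with urllib.request.urlopen(url) as response:
        return response.read().decode().strip()  # "success" when accepted

print(push_to_wunderground(72.5, 40, 6.2))
```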

Publishing my weather station’s data to my website in WD-Live format had an interesting side benefit: I could use it to view my weather station’s latest readings from my smartphone. The paid-for but inexpensive Android Weather Watch Widget can be pointed at the WD-Live data file’s location and will display the latest conditions for my weather station:


In short, I cannot recommend the Meteo Sheeva highly enough as a weather station data logger. Even though I only take advantage of a subset of its capabilities, the features I do use make it more than worth the money I have invested in it. You can put together a cheaper and smaller solution using a Raspberry Pi. However, if you want something that works out of the box then the Meteo Sheeva cannot be beat.

KeyStore Explorer Now Open Source

Posted October 19, 2013 by waynedgrant
Categories: KeyStore Explorer, Open Source


KeyStore Explorer (KSE), in its various guises, is a PKI desktop application project I have worked solo on since 2001. While it started as open source software it has been a closed project for most of the subsequent time. For the last year development has stalled due to a lack of motivation on my part. My concern has been that my lack of activity would lead to the whole project dying.

I am therefore happy to report that KSE is now officially open source software. The new owner Kai Kramer answered my call to arms and has been busy since late July remediating all of the impediments to open sourcing the application, completing the features I had slated for version 4.2 and adding extra functionality. The result is KSE 5.0 which is licensed under GPL Version 3.

The old KSE website http://www.lazgosoftware.com/kse has been decommissioned and now redirects to KSE’s new home at SourceForge: http://keystore-explorer.sourceforge.net.

Release notes for 5.0 are available here. Executable downloads for the usual supported platforms are available here. Finally the highlight is the availability, for the first time since 2004, of the source code.

For my own part, I will be stepping back completely from the project to let Kai and the community take KSE forward.

Agile Bug Out Box

Posted September 29, 2013 by waynedgrant
Categories: Agile, Scrum


A month ago I relocated from Scotland to the US. This move necessitated a lot of planning (and heavy use of more than one personal Kanban board). Many decisions had to be made. In particular I had to decide what to take with me and what to leave behind.

One of the things I decided to leave behind was my rather voluminous Agile Kit. Everything in the kit has its use but it wasn’t going to fit in my luggage. Also by the time the main shipment of my belongings travelled between Scotland and the US I would have had to replace most of it anyway. I therefore gave away most of the kit to my local Scrum Master colleagues.

I say I gave away most of the kit because I would need a couple of materials to hand from day one of my move to help me facilitate planning and retrospective sessions in my new workplace.

I decided to have a little fun with the exercise and brand my mini Agile kit. I have an unhealthy fascination with weird reality shows and one of the shows I watch is Doomsday Preppers. Preppers are survivalists who prepare for extreme, sometimes civilisation-ending, events. One thing preppers constantly have to hand is a bug out bag. This contains everything a prepper needs to survive for a few days should the worst happen and they need to get out of Dodge fast.

Stretching the analogy more than a little I decided to create a prepper styled Agile Bug Out Box. I chose a box rather than a bag as it would protect its contents better and I liked the look of Corinna Baldauf’s Scrum Master Emergency Kit which also takes the form of a box. The box would contain everything I would need to perform my agile role for a few weeks.

The box itself is simply a small Tupperware box which is durable and fits easily into my luggage. I labelled it “Agile Bug Out Box” using the Old Stamper font and added some warning stripes to make it look suitably prepper-like:

box packed

With the box picked and labelled I had to decide what to put in it. This was tricky because the box is very small and I use a lot of stuff day-to-day. I had to consider what I absolutely needed not just what made my life easier. I finally settled on:

  • My custom planning poker cards.
  • Super sticky notes.
  • Mini sticky notes.
  • Four whiteboard markers, each a different colour.
  • Four black sharpie pens.

To round out the kit I used thin elastic hair ties to tie the pens and cards together so the kit would stay ordered:

box unpacked

So why did I settle on these particular items?

One of the planning techniques I use is planning poker so I needed a deck for that. While I gave away all of my other official decks I couldn’t part with my custom hand-made deck or risk it in surface transit.

The super stickies, whiteboard markers and sharpies are essential for me so that I can run retrospectives. Every technique I currently use utilises some or all of these items.

Finally there are the mini stickies. I use these in concert with larger stickies to create basic, compact personal Kanban boards for when I am on the road, like this example here:

on the road

I have been in the US for four weeks now and my Agile Bug Out Box has proved to be invaluable. However, there is a Staples store just over the road from my office so I will start building up a full agile kit again soon. I’ll keep my Agile Bug Out Box to hand though. After all you never know when you’ll need to bug out.

