Wednesday, 4 December 2024

Unravelling Conflict

An interesting question came up at a meet-up I recently attended, where they were discussing approaches to coaching and mentoring. The question was not overly interesting by itself, but the roller-coaster of a journey as we got to the essence of the real problem certainly was. The initial framing of the question by one of the attendees was this:

“How do I tell someone to optimise their code?”

The wording here immediately made me feel a little uncomfortable because I generally dislike the notion of telling anybody what they should do or how they should behave. This was a session about coaching and mentoring and I felt that before we could answer the question we first needed to talk about the question itself, and the language used to express it. It helps immensely if you don’t go into the process assuming you are inherently right. But maybe it was me that misread the tone of the question?

One of the most useful guidelines for navigating life (not just the work environment), I first heard from Jeffrey Snover – Chief Architect of PowerShell. He often used to tweet this saying:

“When faced with conflict, respond with curiosity.”

In this instance I’m the one facing conflict. The tone of the question is at odds with how I feel people should behave towards others. In the past I have occasionally had the misfortune to work with some people that had very different ideas about collaboration (see Fallibility) and so I also know that sometimes the answer is simply to walk away.

However, the attendee was very likely not a native English speaker, and so there was every possibility the choice of words was simply unfortunate. They also seemed like a nice person from previous conversations I’d had, so this really felt like there was something much deeper going on that was worth pursuing.

Unfortunately, the situation got slightly worse before it got better. Some of the other attendees immediately started to challenge the assumption that anything might even need optimising. After all, the general wisdom on this topic is “don’t do it”. And, for experts only, the second rule is “don’t do it, yet”. I was happy to see where this line of questioning went and even added my own typical questions, such as, “have you measured it?” and “what business value would rewriting it bring?” Ultimately this led to the even more uncomfortable statement:

“I just don’t like how it’s written.”

I’ve worked on a few codebases where there is a lot of noise in the history because people decided to arbitrarily rewrite code simply because it didn’t fit their personal preferences. In the worst case code flip-flops between two styles as any chance to touch the module becomes an opportunity to rewrite it, yet again. Doing this just creates conflict within the team and is ultimately very wasteful. In contrast, genuine refactoring that brings the code forward to adhere to the team’s current preferred principles and practices does have some merit. (See Sympathetic Change for more thoughts on this.)

Anyway, this is going badly as I’m finding it harder to empathise with the speaker, yet my opinion of them shouldn’t really cloud my decision to help them. After all, they’ve come to this meet-up because they want to learn, like the rest of us, and we should fully embrace that.

And then the magic starts to happen.

Someone asks what it is about the code that they dislike, and we find out that the code was written using some very poor choices. The speaker continues to describe how there are more modern idioms which are inherently more performant and in keeping with how code tends to be written in that language nowadays. Hence, this isn’t so much about personal preference as de facto standards.

Now we’re on a roll.

Further questioning reveals that this code was written by a relative newcomer, i.e. a more junior programmer. The original question was badly phrased and the real question was about how to approach a more junior member of the team and mentor them on how to write better code – code which will perform better out-of-the-box, and also fits in with the industry norms. (Writing incredibly naive code is known as pessimisation – the opposite of optimisation – and we shouldn’t make poor design choices purely to avoid being seen to be prematurely optimising.)

Hence, what had seemed at first like a somewhat selfish question was actually altruistic in nature, and entirely within the community spirit the meet-up wants to promote.

In retrospect I realised this was effectively an application of the classic Five Whys technique often used during incidents and post mortems to get to the “root” cause [1]. It’s easier to apply that technique to an incident because it’s less personal, whereas the continued questioning of a single person can feel more like an interrogation, so you need to be very careful not to appear to be attacking the person. Responding with curiosity is key to enabling that process as it allows you to create a warmer, safer space and therefore allow the other person to open up and become more reflective.

 

[1] There are problems with the notion of there ever being a single cause of an issue, hence the use of quotes. How Complex Systems Fail is mandatory reading on this topic.

 

Thursday, 28 November 2024

Using CoPilot-Like Tools is Not Pairing

Ever since the rise of LLM-based tools like ChatGPT and CoPilot there have been quips about what a great pairing partner they make. The joke is that these tools are passive and won’t trash your code, disagree with your approach, or smell of garlic.

I get the joke.

The problem for me is that these jokes suggest that using tools like CoPilot is akin to pairing, but it’s not. And, as a consequence, they continue to perpetuate misconceptions about what pairing is really about. This topic came up one Tuesday at a Cambridge Software Crafters meet-up when we were discussing mentoring & coaching and how to adopt pairing in companies that are hostile towards it, likely for these very reasons.

(If you’re after a longer read on this subject I can offer you my 2018 C Vu article To Mob, Pair, or Fly Solo, or if you’re a video kind of person then you might want to watch the talky version I gave at the ACCU 2024 Conference, also titled Mob, Pair, or Fly Solo.)

Tools like CoPilot are there to help you with the “mechanics” of writing code [1]. They complement the traditional tools like compilers and linters which help avoid a bunch of annoying problems that come with creating software as part of a solution. Your pairing partner may help you avoid these mistakes too, especially if you’re new to the language or tool, but that’s not primarily what they are there for.

Their input should focus on helping you to solve the actual problem. That might be at a design level, such as how best to represent the solution using common idioms or patterns, because they can see where it might be heading. They can also help come up with potential problems or edge cases that need to be factored in, and help decide how and when that might happen in the evolution of the feature you’re working on.

I’ve never paired in the strict navigator-driver sense, it’s always tended to be far more free-flowing, likewise when ensemble programming too. The keyboard tends to go back-and-forth with “type, don’t talk” being the trigger to switch, when it’s not blindingly obvious what my partner(s) are trying to convey.

Hence, in a TDD style loop the other half of the pair might already be thinking about the next test or where the design might be going for the refactoring step after the test passes. Sometimes one half is on a roll and it’s better to maintain momentum, other times both minds need to focus on the exact problem at hand, such as when the test fails unexpectedly. (If you both missed it then that is a smell worth exploring because something is clearly amiss somewhere.)

With modern IDEs and syntax highlighting the typist knows when there is a typo; they don’t need to be told by their partner. However, the way they drive the IDE and other tools can often be a point of discussion as there are often many shortcuts and extensions that provide easy wins. High-end tooling often has an ability to try and point out other features you might find useful, and ChatGPT-like tools could probably summarise related features in popular tools if asked.

Humans often remember things better when the correction happens at the point of triggering, and what your partner can see that your tooling can’t, is whether or not it’s an appropriate moment to make a suggestion. There is a time and a place to share knowledge and part of the art of pairing is to read the mood and only interject if it’s likely to be beneficial now, or note it for a retrospective conversation later. It may even resolve itself automatically, later.

Sharing knowledge takes practice. For instance, sharing the right level of detail, or the right techniques and idioms, to avoid overwhelming your partner while they are trying to focus on the problem at hand. The new generation of coding tools might be well versed in programming language details and common patterns but they don’t know anything about your organisation, team, or design principles. They might help with the small picture but they can’t help with the medium and bigger pictures, and that is the bit that makes software delivery hard. That is also the area where your partner should be adding the most value to the exercise.

Programming is so much more than just writing code, and programming in pairs and groups is about active participation to triangulate on a sustainable solution far quicker than working with passive tools and feedback.

 

[1] Maybe one day they’ll be able to consume your entire codebase, along with the VCS history, so that they can tell what’s old and what’s new and make better design suggestions too based on evolving trends. We’ll see…

 

Saturday, 5 October 2024

Crafters Meetup: Architecture Kata

Last time I wrote about the monthly Cambridge Software Crafters Meetup which I started attending almost a year ago. In that post I briefly mentioned the Architecture Kata session which I found particularly interesting as I’ve never done anything like that before.

I’ve been aware of katas for a long time, but they’ve tended to focus on solving simple problems in code, usually with a test-first bias, but this is the first time I’ve seen it applied to other aspects of software delivery.

Session Format

The attendees divided into groups of four or so and were given a problem statement for a system to design. We then spent an hour or so discussing it before presenting our design and approach to the other teams at the end. One of the organisers acted as a sort of product owner who we could ask for further background or clarification.

The problem we were given revolved around the overnight charging of a fleet of electric vehicles. The statement contained a fair bit of information about the charging rates and prices, along with other bits about the scale and monitoring.

Each team had a large sheet of white paper and a pack of sticky notes and pens to use.

Big Picture or Small Picture?

One of the things I’m trying to get out of these meetup sessions is to sit back and see how other people approach problems. Having 30 years of experience means you have plenty of biases to overcome, but at the same time I’m not going to sit there and watch my team spin its wheels stuck in Analysis Paralysis or get lost in the weeds. Hence I was prepared to try and nudge when I felt it was appropriate as we had limited time, but mostly I wanted to avoid appearing to take control of the group and drive the direction.

To me, architecture is generally about big picture stuff. In this particular case: what is a suitable overall shape for the solution, such as computation, storage, network connections, etc? For example, the problem statement suggested that efficient charging was desirable from a cost perspective, which implied to us some kind of scheduling component which could drive the remote charging stations. The potential for failures (of vehicles, charging stations, etc.) also implied to us that this scheduling must be dynamic. However, that’s as far as our team went, aside from noting the need to store tariffs somewhere and perhaps historical data if charging was not effectively uniform to allow the system to be adaptive.

One other team also looked at the problem at roughly the same “scale” we did, although they went as far as picking specific technology choices as they had a firmer idea of exactly what services they were building and how they wanted to connect them.

Somewhat surprisingly to me, two other teams focused deeply on the algorithmic element of how to efficiently charge the vehicles. Their presentations were all about what we had simply called “the scheduler”, and ignored all the other stuff that sits around it. In my mind these were implementation details rather than software architecture.

Clearly the point of the session was to learn, so there were no right or wrong answers, but in retrospect what I’d be interested in knowing now is what the other teams think software architecture is about. The other thing that occurred to me is that maybe I drove the direction of our team’s solution more than I thought, despite trying hard not to.

Actors

What also surprised me, and maybe this was one of the reasons why we ended up focusing on the big picture, was that our team was the only one which had any actors (i.e. users) in it. This was an early suggestion of mine as we were talking about the problem statement but nobody seemed to be looking at it from the user’s point of view. After all, software development is about solving people’s problems and it’s not always obvious who those people are at first.

There weren’t many users mentioned in the problem statement per se, but I was trying to use that question to open up a wider discussion about who the stakeholders really are, and how those might affect the shape of the system. And not just the shape, but also the evolution of it as it grows from a walking skeleton to a fully-fledged system – the journey is often far more important than the destination, which we know will change over time.

Design Surface

One thing I did do early on was to draw my actor on a post-it note and stick that on the sheet of paper after I noticed someone was about to draw directly onto the sheet. The other teams sketched elsewhere and then drew up a final diagram and notes on their large sheet. Our final presentation was essentially a sheet covered in sticky notes.

I’ll be honest, I love sticky notes :o), but I think one of the lessons of the architecture kata is that it should be fluid, and in the early days a system can be incredibly fluid, aka emergent design, therefore the temporary nature of sticky notes helps to explore and convey that. Once we had positioned some of the key actors (which weren’t necessarily people) at the edges we could then break down and refine what happens in the middle without having to re-draw from scratch or squeeze things in. (“You don’t need an electronic drawing board at this level” could have been another lesson.)

One of our team got some scissors and started cutting up the problem statement into little sections and attached them to the relevant part of the diagram. I really liked this as it gave a physical manifestation to the decomposition of the problem. It also helped us keep track of what we had covered and what we needed more detail on.

Epilogue

I really enjoyed this session, mostly because greenfield architecture is something which we rarely get to do as software developers. The vast majority of what we do is maintenance on existing systems and, even when adding new services, we are still designing within a space that is largely already well-defined. In my 30+ year career I’ve only been in this greenfield situation for real maybe three or four times.

My background is largely in financial systems so the problem setting was new to me, although once you break it down you find that dashboards, scheduling, etc. are all smaller problems I’ve met before so have other experiences I could apply to this particular scenario.

One consequence of the problem setting – the physical nature of chargers and electric vehicles – and my lack of knowledge of this technology, meant that these were big risks to me; so my general thoughts were around how to address those and whether that might have an impact on the architecture. Ever since reading Waltzing with Bears: Managing Risk on Software Projects I have tried to factor the identification and mitigation of risks early on in my design thinking.

I really look forward to another architecture kata session. One point of katas is to solve the same problem multiple times but by using different approaches. I’d love to do that but I suspect the desire of the meetup group would be to do something new each time.

Our team definitely didn’t make enough use of the product owner, which may have been because we weren’t sure whether that was really the role being played, or they were just there to provide some basic clarity. In the exercise I described in Choosing a Supplier: The Hackathon we had a literal product owner shared amongst teams so that was more like what I was expecting.

It would also be nice if we could find a little time to discuss what we even think “software architecture” means these days, and also reflect more on how we approach it as this session raised some interesting questions for me around the very nature of this level of design.

Tuesday, 3 September 2024

Cambridge Software Crafters Meetup

Despite living pretty close to Cambridge I’ve never actually worked there. For the past 25 years I’ve commuted to London because it was easier and faster than getting to Cambridge [1]. The daily rates for contract programmers in the finance industry might also have had an influence too :o).

Anyway, one consequence of working in London meant that I had access to all the London meetups in the evening. But I also had a wife and four young children, so wanted to keep the balance right and consequently I largely limited myself to the ACCU London and eXtreme Tuesday Club (XTC) events, only dabbling in others if there was a particularly interesting talk / topic.

Then the pandemic happened, and like many others I found myself working remotely full-time. I had worked from home before on previous contracts, and in some cases that was to facilitate being able to attend a meetup in Cambridge, most notably around .Net and Go, but I only attended a handful or so. Mark Dalgarno (via Software Acumen – organiser of many Agile related conferences, such as Agile Cambridge) was in Cambridge back then too, and through Software East also arranged various enjoyable gatherings.

Eventually some semblance of normality resumed but I still found myself working remotely and now Cambridge was my nearest point of social contact (IT wise) in the evenings. The pandemic meant that meetups had to stop for some considerable time (years!) and with people nervous to venture back out again the choice was naturally limited. Platform focused meetups like Cambridge .Net reappeared in-person sooner, as I guess they had a captive audience, but I wanted something that covered the wider topic of programming at large, not just one specific language / stack. (There was an ACCU Cambridge meetup once-upon-a-time but sadly that folded a long time ago.)

Tangentially, my eldest son had also recently entered a career in programming and he asked me about any local meetups I knew of as he was based around the same area. It didn’t take long to discover that the Cambridge Software Crafters group on Meetup was holding its first in-person gathering since November 2019 (a hiatus of four years). The topic was learning TDD and it was being held in the office where my son was working at the time so I suggested he had no excuse not to attend either :o).

I met some old faces at that event and, more importantly, some new faces too. (I even bumped into a young chap who lived a few houses down my street. I didn’t know he’d also started a career as a programmer.) A diversity of ages, experience, industries, culture, etc. is an important requirement for me in a more general craft-oriented meetup because “one size never fits all” and I like to be aware of what forces and constraints drive other people’s work.

For that first meetup we did a leap-year kata in pairs. I chose to work with someone young who wanted to use TypeScript as I’d never used that before. Having practiced TDD for almost 20 years I clearly wasn’t there as a beginner but I felt my experience might be useful and I always learn something new when pairing with people. (After pairing / mobbing heavily for the 6 years prior to my current contract I realised how much I had missed the sociability of ensemble programming.)

Sadly I missed the next few events but my son kept me informed on what they had got up to and it continued to sound interesting, with practicing TDD clearly a popular topic. I eventually attended the Architecture Kata, which I found interesting and intend to write up some particular observations in a separate post. Then we did an evening of lightning talks (10 mins) which is something I’ve always enjoyed at the ACCU and Python conferences as you always get a pretty diverse set of topics. This was no exception. (It also gave me an opportunity to resurrect a humorous short talk I did at ACCU 2014 – The Art of Code.)

The most recent meetup was another TDD kata, this time around the game of Noughts & Crosses. I ended up pairing with someone that was only just getting into programming and was attending the meetup to learn more about the craft as he had a background in construction. In the end we hardly did any programming per se and I mostly walked him through TDD, but more as a backdrop for talking about what sustainable delivery really entails. It wasn’t at all what I was expecting to do that evening but I was glad I could share some knowledge & experience, and it meant that everybody then got a suitable partner they could work with.

While the sociability factor is high on my list it would be remiss of me not to mention that the event has sponsorship from Codurance which means that the evening comes with free pizza and drinks. The venue, The Bradfield Centre, is a nice modern building and has an auditorium which was used for the lightning talks. Most events take place on the open plan ground floor area as we either pair or work in small groups and then share with everyone what we’ve done / learned.

I really hope this meetup continues to gain traction as I think we often focus too much on specific technologies and not enough on the craft as a whole. There are clearly a few regulars but also new faces each time which provides a nice mix and everyone has been really friendly. It also gives me an opportunity to “talk shop” with my son as that never goes down well at family gatherings :o).

 

[1] Things are much improved with the guided buses and upgraded A14 but what I wrote in “Missing the Daily Commute by Train” still holds as far as using the train goes.

 

Monday, 22 April 2024

Naming Functions: When Intent and Implementation Differ

Most of the time these days when I get into a conversation about naming it tends to be about tweaking the language, perhaps because I think there is a much better term available, or the author is a non-native speaker and they’ve transliterated the name and it’s ended up being quite jarring, or worse, ambiguous. I’m also on somewhat of a personal crusade to try and vastly reduce the lazy use of “get” in codebases by replacing it with far more descriptive names. (For a more humorous take on the overuse of “get” try my Overload back-pager “Acquire Carter”. Alternatively my C Vu article “In The Toolbox – Dictionary & Thesaurus” tackles the issue a little more traditionally.)

However, more recently I ended up in a conversation where I felt the whole point of “naming to reveal intent” had gone out of the window and the author and other reviewer were far too fixated on reflecting the implementation instead. Normally the intent and implementation align very closely, but in this instance they didn’t because it was the first step towards formalising a new feature, which was implemented temporarily [1] using a different one.

To anyone who’s read Arlo Belshee’s writings on Naming as a Process I think you’ll recognise what follows as a case of “Honest Naming” instead of “Naming for Intent”, and that comes at a cost to the reader.

Background

I currently work on a system used to generate a variety of different reports. When some of the reporting code runs it generates log messages which warn about missing data, which means it’s had to substitute something else instead. When the code was initially written it was being run interactively and so the missing data would get addressed quickly. However, now that it’s automated nobody sees these messages anymore, except when the code fails and anything written to stdout gets included in the failure alert.

Naturally, there should really be some better way of surfacing these anomalies, even if the report generation succeeds, and there will be, eventually. However, in the meantime we discovered there is a way of hooking into the report scheduler so that those warnings could at least be surfaced in a completion status email instead. This mechanism involved a bit of abuse though – using the “critical” log level instead of merely logging the missing data as a “warning”, because serious errors during runtime were still highlighted at completion.

First Attempt: Confusion Through Dishonesty

The simplest thing that could possibly work, and the initial change that was presented, was to directly change any log lines like this:

print("WARNING: Missing data for 'XYZ'")

to use an alternative logging function like this:

logCritical("Missing data for 'XYZ'")

My immediate reaction to this was one of “No, now the code is lying to us!”. The code previously made it clear that this condition was just an inconvenience but now it appears to be far more serious.

While the elevation in severity was unfortunate my real problem was that it was missing the reason why the logging call was being made in the first place. Even in the original code the use of print() was a means to another end, and ultimately we should make that end more obvious.

While I’m a fan of doing the simplest thing by default, that does not also mean doing something overly naïve. The cost of adding an extra level of indirection at this point in time was not onerous and would very likely minimise rework in the future as we evolved the real implementation.

Second Attempt: Too Honest

I thought I had expressed my opinion fairly clearly and there appeared to be agreement on the way forward so I was somewhat surprised when the code came back like this:

enrichEmailWithWarning("Missing data for 'XYZ'")

At this point I pulled out Arlo Belshee’s classic naming infographic to try and show how we were still a way off getting to the point where the code was saying what was really going on at a business level.

What makes this particularly confusing is that the code where this appears runs in a child process and so has no knowledge of the parent scheduler. Hence the function being invoked is talking about some obscure mechanism totally outside itself. The name of the function in this instance is too honest: it says what is going on under the covers but not what it really means. If you run this interactively again, what email is it even talking about?

On the plus side though we have at least encapsulated the underlying mechanism to stop lying about the severity, and we can reimplement this later to use a separate file, database, or whatever without touching all the call sites. Or can we?

Well, the problem with honest naming is that when the implementation changes, then so does the name. This means all the call sites will have to be updated yet again each time we reimplement it. Every time we update the code we run the risk of breaking it [2], and it adds unnecessary noise to the file’s history because the real intent hasn’t changed.

Third Attempt: Revealing Intent

Let’s take a step back and ask ourselves what this line of code is really trying to achieve:

print("WARNING: Missing data for 'XYZ'")

While you might say that it’s printing or logging, that is just the implementation bleeding through. A more general way of saying that might be “recording”. You could argue that “notifying” might be better because the information is included in a status email but I’d suggest that is being too specific at this point because in the future it may only get written somewhere and the retrieval mechanism is totally up for grabs and could easily involve not sending yet more emails.

Okay, so we’re recording something, but that’s not much better than print() or log() if we are just going to pass an arbitrary text string. We need to think more about what we’re recording to see if that can be included in the name too, or broken down into separate arguments.

The whole reason this conversation started was because the messages about missing data were not being seen and therefore we needed to surface them in some way. Hence what we’re recording is “missing data” and that record ideally needs to be different from the diagnostic trace messages and other more general logging output. Hence the function name / API I proposed was:

recordMissingData("XYZ")

To me this now expresses far better what the original author intended all along. The implementation has the ability to vary quite widely from simply printing to stdout, logging to a file, writing to a queue / database, and yet the same name will continue to reflect all those possibilities. By elevating the text “missing data” from the log message to the function name we are also being more explicit about the fact that handling missing data is formally recognised as a feature and can be given special treatment, unlike the general logging output.
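To make that final shape concrete, here’s a minimal sketch in Python (the function name, logger name, and docstring wording are my illustrative assumptions, not the real codebase’s API). The intent-revealing name encapsulates whichever mechanism currently surfaces the anomaly, so the call sites never need to change when the implementation does:

```python
import logging

# Illustrative stand-in for the codebase's logging infrastructure.
logger = logging.getLogger("reports")

def recordMissingData(item: str) -> None:
    """Record that data for 'item' was missing and a substitute was used.

    For now this piggybacks on the critical log level so the scheduler
    surfaces it in the completion status email; later it could write to
    a file, queue, or database without touching any call sites.
    """
    logger.critical("Missing data for '%s'", item)

# Call sites state the business intent, not the mechanism:
recordMissingData("XYZ")
```

The extra level of indirection is the whole trick: the honest-but-volatile detail (critical log, email, whatever comes next) lives in exactly one place, behind a name that only changes if the business intent changes.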

Anyone not involved in this conversation who runs across the code and has a similar requirement stands a better chance of using it correctly too, meaning less rework. When it’s missing, future readers may try to “second guess” what’s going on and then apply the same mechanism for different, inappropriate reasons and now, as they say, you have two problems.

I can’t say for sure we won’t need to revisit this again as we learn more about the nature of what data is missing and whether with more context we can automatically triage and notify the right people, but for now it feels like the cost / benefit ratio of “talking versus doing” is about right.

 

[1] Describing anything as “temporary” is a fool’s errand, we all know it will live far longer than planned :o).

[2] This is code written in a dynamic language so even a simple typo can go undetected unless it’s tested properly.

 

Friday, 9 February 2024

Our Star Baker

Just over 14 years ago I posted the eulogy I wrote for my father on this blog (So Long and Thanks For All the Onions), mostly because I had just started writing and this blog gave me the confidence to write. Sadly, a month ago my mother passed away too and yesterday I got to present my eulogy for her as well. The writing practice from the intervening 14 years undoubtedly made the mechanics easier, afforded me more time for reflection, and allowed me to better translate my thoughts and feelings to the page. So, thank you blog, but far more importantly, thank you Mum – you too will be with me always…

I was visiting Mum in hospital a few years ago, when she was having one of her knees or hips or something replaced, and a nurse came in and addressed her as “Jennifer”. For a brief moment I wondered who they were talking to, I’m not even sure Mum acknowledged her at first either. Mum hasn’t been a Jennifer for a very long time – she was always Jenny, or Jen to her friends. Of course, to myself and Jo she was “mum”, and to our various offspring she was Grandma, or Grandma Jenny. Millie and Ella came up with The Jenster but it never really gained any traction outside our family, for obvious reasons.

She was only ever Jennifer in an official capacity, such as when she returned to work part-time, initially doing market research for the BBC. One of my earliest memories was of tagging along with Mum to some local parade of shops where she would interview the public about their viewing and listening habits. We got to play on the slopes and the railings while she quizzed the public, which was a lot more fun than it might sound today. From there she switched to clerical work, most notably (for me at least) at Bowater-Scott where one of her colleagues offered to build Jo and me a wooden sledge, which he did! As I only remember visiting that office maybe once or twice, I’ve always assumed Mum must have made quite an impression there.

Eventually she ended up in the Audiology Department at West Hill Hospital in Dartford treating people that had lost their hearing. It was only supposed to be a two-week placement, but she ended up staying for 21 years (after cross-training as a Student Technician a year later). It was fairly apparent even to us kids that not every patient was easy to deal with, but she always put their needs first, to the extent that she would often bring her paperwork home to allow her to prioritise her time with the patients instead. If anyone was ever in any doubt that Mum was a people-person, her career at the hospital would surely stand as testament, backed up by the many cards of thanks she received over the years from the people she helped. Perhaps the one aspect of her work we regret her bringing home was the need to talk so much louder all the time.

The hospital wasn’t the most enjoyable place to hang around during the school holidays, but I soon discovered a tiny little computer shop at the top of West Hill which then made the trips to her workplace an absolute joy. When it came to buying me that first computer, I know Dad wasn’t so convinced, and it is Mum that I must thank for taking that leap of faith. Even forty years later Mum would still joke about whether it was the right thing to have done, as it could still just be a passing fad.

When I wrote the eulogy for my father, I suggested that the genes which probably contributed most to my career in computer programming came from him, but in this more recent time for reflection I am beginning to question if it wasn’t more from my mother’s side instead. For her generation Mum was very good with technology – the proverbial Silver Surfer. Although she might occasionally ask for technical advice, she often sorted out her own problems, along with those of her friends! She always wrote a good email and picked up WhatsApp with similar ease, if occasionally being a little over-zealous with the emojis. We had several different family WhatsApp groups with which she was very active and helped ensure she remained in constant contact with her grandchildren and could easily find out what they were up to. She took a genuine interest in their lives and they were always keen to share. It wasn’t unusual for Charlotte or me to mention what Mum was up to only to be met with a chorus of “yes, we know!” because they had been conversing directly with her about it.

This need to adapt to the ever-changing world was something which Mum embraced, not only on a technical level but also from a social perspective. Rather than dismiss young people because they hadn’t faced the same struggles or because their viewpoint didn’t match hers, she would instead engage with them to try and understand how and why the world was changing the way it was. Her grandchildren helped her move with the times and in effect helped her to remain young at heart. She very much believed the old saying about only being as old as you feel. Her body may have shown some signs of wear and tear as she reached her eighties, but her mind was still razor-sharp, along with her wit.

We probably shouldn’t be surprised that some of her joints needed replacing later in life because she was always such an active person! Her diary always seemed to be full – from dawn until dusk – whether that be out with friends and family, or abroad visiting another new country and making even more new friends. And not just for the duration of the trip, either; they often became lifelong friends, which speaks volumes about the kind of impression she left on everyone she met.

As a family we often joked when we visited somewhere new that grandma had probably already been there. If she had, you knew you could rely on her suggestions to fill your itinerary as they would include a range of beautiful vistas, buildings, galleries, restaurants, etc. Over the years she visited a whole variety of different places, from Alaska to Moscow, with Canada, Croatia, China, and India to name a few in between. We were always fascinated to see the photo album she would put together on her return and listen to the stories she told about the people she met.

When our children were much smaller we managed to convince her to come away with us on a couple of more relaxing beach holidays. Much as she enjoyed reading at home, she wasn’t the sort of person to curl up on a sun lounger with a book, not when there were places she could explore, and the grandkids also made sure it wasn’t going to be a relaxing holiday for any of us, least of all Grandma. I’m still not sure what possessed Mum and me to go paragliding in Turkey! Running, and then jumping, off a cliff while strapped to a stranger felt courageous enough for me, let alone Mum, who was in her mid-sixties by then. At the end she remarked the scariest part was the jeep ride up the mountain, not coming back down again by parachute!

What all her travelling proved was that she had a great sense of adventure and that was epitomised by her walk along the Inca Trail to Machu Picchu. This multi-day hike was essentially a marathon at high elevation, so a challenge even to the younger trekkers. Beforehand Mum was a little concerned about her age and fitness, but she put the training in and need not have worried, as she found herself at or near the front the entire time. In fact the porters nicknamed her “the nanny goat” on account of how well she acquitted herself. For once the trip didn’t just conclude with a photo album but ended up becoming a PowerPoint presentation too, which she gave twice at Isaac and Millie’s school as Machu Picchu was on their curriculum. That wasn’t the only lesson they got from her there either, as Mum and Dad also put us all to shame at the school’s 1940s night with a wonderful display of swing dancing.

I don’t think Mum was ever a passive bystander in anything she got involved in; she was always there to lend a hand and ultimately would get drawn in to fill whatever role needed her talents. While I’m sure she enjoyed watching us swim I think she preferred it when she could also be an active participant – initially helping out by decorating the float for the parade, to becoming a committee member, then club secretary, and then officiating at galas in various capacities, even after we’d flown the nest. (Her long-standing service to the Kent County Executive was recognised in 2002 when she received the Edward Maples trophy.) She even got to rekindle her netball skills in a couple of Mother & Daughter swimming club socials and we discovered along the way that she had briefly appeared on TV in her youth playing netball.

This wasn’t the only time she featured on TV; more recently her left arm made a guest appearance on the BBC during The Proms. Mum was a huge fan of the arts, both in the literal sense and the wider movement. Although we lived well over an hour away, London was middle ground for us both and that gave us the perfect opportunity to meet up and take in a West End show or a trip to the Royal Albert Hall. I was already well versed in musicals long before meeting Charlotte and have Mum to thank for knowing so much of The Sound of Music off by heart. While always a favourite in our house too, Mamma Mia became the musical of choice when the kids went to stay at Grandma’s after seeing the show together in London.

Even though she didn’t live on our doorstep that didn’t stop her from attending so many of the concerts and productions that her grandchildren featured in. She was always a big supporter of their talents and watched them whether it was a bit-part in the school Nativity or a paid concert in Ely Cathedral or Huddersfield Town Hall. (Or for that matter a freezing cold football pitch or rugby pitch, which is probably why she nudged Jo and me towards the warmth of a swimming pool.) For some of you this will be old news as she was keen to share their endeavours with her close friends as any doting grandmother would. She attended so many events in and around Godmanchester over the years that people were always surprised to learn that she actually lived 90 miles away!

During the pandemic this distance made it a little harder to meet up in person, but it didn’t stop her from socialising and even doing activities with the kids. Like everyone else we used Zoom to keep in touch and Ella and Mum created their own virtual Great British Bake Off to ensure those legendary cooking skills were still put to good use. I was never a big fan of baking per se, but I did enjoy squishing the sausage meat between my fingers when we made sausage rolls for Christmas. Likewise making mince pies was something I enjoyed too, and this Christmas baking tradition was passed down to my children while I took on the more important role of keeping Mum’s glass of Prosecco topped up. Her puddings definitely were legendary, but for The Oldwood family it’s undoubtedly her coffee flavoured birthday cake that she will be most sorely missed for, baking-wise.

I’m now two-thousand words in and have barely scratched the surface of memories I could talk about. At some point I need to stop and give you the opportunity to share your favourite memories with us, and with the other people here. And share them we must, because that’s how we keep her memory alive. Every time we plan a trip, or open a packet of biscuits, or play a game of Rummy, or use her baking tray, or pour a glass of red wine, or whatever, there will be another opportunity to share our love for the person we once knew as Jenny, or Mum, or Grandma.

Wednesday, 4 October 2023

Unpacking Code Ownership

This post was prompted by a document I read which was presented as a development guide. While the rest of it was about style, the section that particularly piqued my interest was one involving code ownership. For those of us who’ve been around the block, the term “code ownership” can bring with it connotations of protectionism. If you’ve never worked with people who are incredibly guarded about the code they write, may I recommend my 2017 blog post Fallibility, which contains two examples of work colleagues who erected a wall around themselves and their code.

While I initially assumed the use of the term was a proxy for accountability, some comments on my suggestion that Relentless Refactoring was an established practice in many teams hinted that there might be more to it than that. What came out of an online meeting of the team was that the term was carrying the weight of two different characteristics.

(I should point out that I’m always wary of this kind of discussion verging into bike-shedding territory. I like to try and ensure that language is only as precise as necessary, so only when I suspect there may be confusion or suboptimal behaviour as a consequence do I feel it’s worth digging deeper. In this instance I think “ownership” was referring to the following attributes and not about gatekeeping or protectionism for selfish reasons, e.g. job security.)

Accountability / Responsibility

When people talk about “owning your mistakes” what they’re referring to is effectively being accountable, nay responsible, for them. While there might be a legal aspect in the cases of Machiavellian behaviour, for the most part what we’re really after is some indication that changes were not made simply because “you felt like it”. Any code change should be justifiable which implies that there is an air of objectivity around your rationale.

For example, reformatting the code simply because you personally prefer a different brace placement is not objective [1]. In contrast, reformatting to meet the pre-agreed in-house style is. Likewise applying any refactoring that brings old code back in line with the team’s preferred idioms is inherently sound. Moreover, neither of these should even require any debate as the guide automatically confers agreement [2].
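To make that concrete, here is a minimal hypothetical sketch (the function and the idiom are my invention, not from the post): suppose the team’s agreed style prefers list comprehensions over manual accumulator loops, so restating old code that way is an objective, guide-backed refactoring rather than a matter of personal taste.

```python
# Before: legacy accumulator-loop style.
def active_names_legacy(users):
    names = []
    for user in users:
        if user["active"]:
            names.append(user["name"])
    return names

# After: identical behaviour, restated in the team's (assumed) preferred idiom.
def active_names(users):
    return [user["name"] for user in users if user["active"]]

users = [{"name": "Ada", "active": True}, {"name": "Bob", "active": False}]
assert active_names(users) == active_names_legacy(users) == ["Ada"]
```

Because the two versions are observably equivalent and the target style is pre-agreed, the change needs no debate – which is exactly the point about the guide conferring agreement.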

Where it might get more contentious is when it appears to be superfluous, but as long as you can justify your actions with a sense of objectivity I think the team should err on the side of acceptance [3]. The reason I think this kind of change can end up being rejected by default is that there is nothing in the development process to allow the status quo to be challenged. A healthy development process should include time for retrospection (e.g. a formal retrospective) and this is probably the place for further debate if it cannot quickly be resolved. (You should not build “inventory” in the form of open PRs simply because of unresolved conflict [4].)

One scenario where this can be less objective is when trying to introduce new idioms, i.e. experimental changes that may or may not set a new precedent. I would expect this to solicit at least some up-front discussion or proactive reviewing / pairing to weed out the obvious chaff. Throwing “weird” code into the codebase without consulting your teammates is disrespectful and can lead to unnecessary turf wars.

Being accountable also implies that you are mature enough to deal with the consequences if the decision doesn’t go your way, aka Egoless Programming [5]. That may involve seeing your work rejected or rewritten, either immediately or in the future, which can feel like a personal attack but shouldn’t.

Experience / Expertise

While accountability looks at ownership from the perspective of the person wanting to change the code, the flipside of ownership is about those people best placed to evaluate change. When we look for someone to act as a reviewer we look for those who have the most experience either directly with the code itself, or from working on similar problems. There may also be different people that can provide a technical or business focused viewpoint if there are both elements at play which deserve special attention, for example when touching code where the previous authors have left and you need help validating your assumptions.

In this instance what we’re talking about are Subject Matter Experts. These people are no more “owners” of the code in question than we are but that doesn’t mean they can’t provide useful insights. If anything, having people unrelated to the code reviewing it can be more useful because you know they will have no emotional attachment to it. If the change makes sense feature-wise, and does so in a simple, easy-to-understand way, does anything else really matter?

These days we have modern tooling like version control products which, assuming we put the right level of metadata in, allow us to see the evolution of the codebase along with the rationale even when the authors have long gone. Ownership doesn’t have to be conferred simply because you’re the only one that remembers how and why the code ended up the way it did. This leads into territory around fear of change which is not a sustainable approach to software delivery. In this day and age “consulting the elders” should really be a last resort for times when the record of events is lost in the sands of time. Approval should be a function of knowledge of the subject matter rather than simply years of service [6].

Shepherds, Not Owners

Ultimately what I find slightly distasteful about the term “shared ownership” is that it still conveys a sense of protectionism, especially for those currently “outside the team”.

From a metaphorical point of view what I think I described above is more a sense of shepherding. The desire should be to nurture contributors to understand the culture of the codebase and product to the extent that the conversations can focus on the essential, rather than accidental complexity.

I wonder if “shared mentorship” would work as a substitute?

 

[1] This is a good argument for using a standard code formatting tool as it can make these debates moot. 

[2] If the code is so performance sensitive that it should not be touched without consultation, then there should either be some performance tests or, at a minimum, some comments to make that obvious.

[3] The late Pieter Hintjens makes a compelling case in Why Optimistic Merging Works Better.

[4] This is where I favour the optimism of Trust, but Verify as an approach, or pairing / ensemble programming to reach early consensus.

[5] The Psychology of Computer Programming – Gerry Weinberg, 1971.

[6] One needs to be mindful of not falling into the Meritocracy trap though.