How We Got Agile: An Origin Story


My old copy of “The Mythical Man-Month”

When I was a young man, a college student in the Computer Science program at Virginia Polytechnic Institute, we were assigned a book to read. It was called The Mythical Man-Month, by Frederick P. Brooks, Jr., and I still have my copy from the 1980s. The point of the book, which you might be able to glean from the title, is that you can’t simply measure work effort in “man-months,” as if you could get proportionally more work done by adding more people to a project. For example, you couldn’t say that a project has a work effort of 120 man-months, meaning that with 10 men it will take 12 months to finish, and therefore with 20 men it will be done in 6 months.

If you had 10 men working on this hypothetical project, and added 10 more, you would not find that it completed 6 months sooner. It would, in fact, take longer than 12 months. The problem is, as you add more men (people) to a project, you need time to get new hires ramped up to where they understand the project well enough to be productive. You also multiply the lines of communication, which generates additional overhead in keeping everyone in sync on the specific information needed to make interacting components work together. In engineering, these pieces of information are called “specifications,” and they have to be tracked somehow. If you add more people to a technical project, you add more tracking effort. These complications are summarized in Brooks’s law: “Adding manpower to a late software project makes it later.”
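To make the communication overhead concrete, here is a minimal sketch in Python (my own illustration, not anything from the book) of how pairwise communication channels grow with team size: n people have n(n-1)/2 possible pairs.

```python
# Pairwise communication channels on a team of n people: n * (n - 1) / 2.
# Each pair of teammates is one more line of communication to keep in sync.
def channels(n: int) -> int:
    return n * (n - 1) // 2

for team_size in (10, 20):
    print(f"{team_size} people -> {channels(team_size)} channels")

# Output:
# 10 people -> 45 channels
# 20 people -> 190 channels
```

Doubling the team from 10 to 20 more than quadruples the channels, which is one reason the doubled team does not finish in half the time.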

As a software engineer in the early 21st century, it fascinates me to read the author’s description of how specifications were tracked on the project he worked on – the IBM System/360 – in the 1960s. They had huge manuals kept in binders, and as changes were made, the responsible engineers would have to go into the binders and update the appropriate pages – that is, take out the old pages and insert the new ones with the changed specs. This manual was the Bible of the system, and keeping it up to date was absolutely vital to the success of the project.

Modern day software engineers like me are not used to such meticulously maintained documentation. We consider ourselves lucky if there is any documentation at all for the software on which we are working. You’d think it would be easier, now that everything can be done online, but projects move too fast and the people working on them move around too much. No one is necessarily going to stay on top of documentation, and so long as software works as expected, that’s fine. It’s when it doesn’t work that you run into trouble.

Because personnel move around so frequently in the modern workforce, there is rarely anyone working on a software program who was there when it was originally programmed. But programmers still need to maintain it. Sometimes we are given requirements to modify existing software that has no documentation, with no one around who knows anything about it, and the only way to achieve that goal is through “reverse engineering.” This means poring over old code and documenting it from scratch, which is very time consuming. This underscores the point about the man-month: you can’t just insert a person into a project and expect them to get as much done in a given amount of time as a previous person on the project did. Certainly not if they are going to be reverse engineering the previous person’s work.

Since the start of the personal computing era and the long economic boom of the late 20th and early 21st centuries, computer software has been advancing at a faster pace than it did when Frederick P. Brooks worked as an engineer at IBM. The workforce has changed as well, with employees typically job hopping every few years, and often working as contractors through agencies rather than directly for the client that owns the software they are developing. So how do the software engineers of my generation handle project management in such a chaotic work environment?

The answer is “Agile” methodology, which came about around the start of this century. Agile is a lean or lightweight software development method that emphasizes individuals collaborating over plans and processes, and defines good software as software that works, not necessarily software that is well documented. At least, that’s the declaration in a famous “Manifesto for Agile Software Development” that was published in 2001.

The idea is that “Agile” is a mindset where you are focused as a team on communication and collaboration, continuous improvement, and responsiveness to change. In practice, it means breaking up the project work into short iterations called “sprints,” which typically last two weeks. Everyone’s tasks for the sprint are things that shouldn’t take more than a couple of weeks to finish. So right there the idea of a “man-month” is out; no one would work on one thing for a whole month!

Breaking the project work into chunks like this makes it easier to show progress, to evaluate how effective the team is from sprint to sprint, and to change processes and workflows as needed. It also makes it easier to accommodate personnel shifting around from project to project. It’s a way of coping with today’s volatile workplace, which makes long-term planning harder to achieve. A whole panoply of “frameworks” and “ceremonies” has developed around the original concept since it was first elucidated.

If you are in a white collar profession (not even necessarily Information Technology) you might have experience with Agile-related frameworks in your career. I was first exposed to Agile in the late 2000s, and have been at positions where it is used comprehensively since 2018. Every company does it a little differently, but I have always found it to be a useful way to structure project work.

The way I see it, Agile came about because a new generation of software engineers needed to adapt to a faster pace of work than what the generation of Frederick P. Brooks experienced in their careers. They needed to find their own solution to the problem of how to get people to work effectively when they are added, out of the blue, to a new project. If you look at the signatories of the 2001 Agile Manifesto, you will see that they are almost entirely Baby Boomers and Gen Xers. Today’s Millennials and Gen Zers in the IT workforce have possibly never worked on a project that wasn’t using Agile.

I’ll have more to say about the different generations and Agile in a future post.

Next Generation Board Gaming


I saw an article just recently about the release of a new version of Scrabble, friendlier and less competitive than the original. The article title indicated that it was designed to appeal to the younger generation, putting ‘less competitive’ and ‘inclusive’ in scare quotes, as though one should wonder why anyone would want such features in a board game. I encountered the article in the context of social media feeds where posters were mocking Gen Z and decrying this as “woke Scrabble.”

I gathered that these posters were Gen Xers, and that the editor who picked the title of the article probably is as well. My generation likes to pick on younger people for not being tough enough. But I don’t see what their problem is; this new Scrabble version, called “Scrabble Together,” seems like a perfectly cromulent game to me. I see it as simply part of a trend that’s been going on for years, in which cooperative and team-play games have grown in popularity. These games are suited for socializing in large groups, and I think they are a good fit for the peer personality of the Millennial generation.

As Neil Howe and William Strauss put it in Millennials Rising, this generation is special, sheltered, and team-oriented. A chiller version of Scrabble is perfect for a generation more interested in fitting in and playing it safe than in standing out and taking chances. In fact, Neil Howe identifies board gaming as one of many pastimes Millennials have favored as they have embraced youthful restraint, in contrast to the wild days of my generation’s youth.

The board gaming hobby has really taken off in the past couple of decades, as I have noted in other posts. I remember the very beginnings of the new wave of board games back in the 1990s, when Millennials were children. As the media caught on to the trend when Millennials became young adults, articles started appearing associating the board game revival with their generation. I’ve certainly enjoyed watching Millennials swarm into gaming conventions and game stores, and even sometimes feeling like the wise old guy teaching them a thing or two as we play a game together.

I would say that the board game revival belongs to both Millennials and Generation X, as this article by a Gen X board gamer describes. And in all fairness, the Boomer generation deserves credit for giving us many of the prominent designers of the tabletop games that are so popular today. But Millennials really have taken board gaming to a new level, folding the hobby in with social media and streaming video platforms, and adapting it to their mode of life.

It’s been quite remarkable to observe, and since board games are something of an obsession for me, I’m glad that it’s happened. I look forward to playing Scrabble Together some day, possibly chilling with some friendly Millennials at a game day hosted by a local craft brewery. ‘Cause all we’re trying to do here is get along and have a little fun.

AI at Work, for Better or Worse


A little robot guy I made with an AI image generator

As you surely know if you are a denizen of the online world like I am, artificial intelligence has made remarkable strides in the past few years. In particular, what they are calling generative AI has really taken off. This is a kind of advanced pattern matching software that grew out of machine learning. It lets you use prompts to create content: images, complicated text including stories, and at this point even videos and music. At the bottom of this post I’ve linked a YouTube video that explains generative AI really well, so check it out.

I played with AI image generators for a while, and had some fun. In their early iterations they produced really weird, often creepy looking stuff, but now they’ve gotten pretty advanced. The images they produce are intriguing, impressive even. I saved a lot of the ones I generated, but stopped messing with the programs when I saw how many of my artist friends were upset by the proliferation of AI-generated images on social media. I gathered they could sense their own work being made obsolete by an overwhelming supply of easily produced knock-off art. Why hire an illustrator when you can just describe what you want into a text box in an AI tool, and get the result in a few minutes? Plus there’s the troubling issue of these programs possibly being trained on copyrighted material without the consent of the copyright owners, meaning they are effectively stealing from artists.

Another thing you have to consider about the product of generative AI (and this is covered in the video below) is that it is subject to one of the rules about computer programming that I was taught as a lad: Garbage In, Garbage Out. That is, if you put bad data into a computer program, then you will get bad data out of it. Generative AI is trained on massive data sets, and one result of the way the current AI programs have been trained is that they produce content that tends to express a sort of lowest common denominator of its subject matter. You put in the vast quantity of data on the Internet, apply sophisticated pattern matching, and you get out something like an “Internet average” version of human knowledge.

For an example of what I mean, here is a fantastic article explaining how AI-generated images of selfies misrepresent culture. They do this because the pattern matching algorithms take the conventional way that selfies typically look and apply it to subjects where that wouldn’t make sense. So an AI-generated image of, say, a group selfie of medieval warriors makes them look like modern day humans. Now, since the idea of the existence of such a selfie is absurd on the face of it, maybe it’s pointless to worry about its inherent historical inaccuracy. But in a way, these kinds of images are erasing history.

The article goes even deeper; the AI generators tend to represent everyone as smiling into the camera the way that Americans do. But other cultures that do exist today and do take group selfies have different ways of expressing themselves when taking photos. So the AI programs aren’t just erasing history, they are also erasing existing modern cultures. They are turning everyone into Americans, because American culture dominates the Internet.

Here’s another way AI-generated content gravitates toward a dominant average mode, one you might have heard of already. It seems that AI chat programs, trained on the massive data of online conversations, will often produce racist, abusive comments. It’s like they inevitably turn into Internet trolls. This might seem like a mere annoyance, but AI programs generating racially biased content can have serious, life or death consequences.

With all of these concerns, it’s understandable that public perception of AI is not always favorable. Ted Gioia (who has an awesome Substack, by the way) wrote about this perception recently, starting with a story about the audience at SXSW booing an AI presentation. His article expands into a general discussion of the public’s current distrust of the technocracy, in contrast with the way technocrats like Steve Jobs were idolized in the past. Faith in “innovation” and “disruption” has waned in a society facing uncertainty and disorder, and sensing that technology is leading us toward a dystopian future.

Where does AI fit into my life, now that I’ve stopped playing with image generators? Well, I may not be able to avoid using it, as the company where I work has been promoting AI chat programs to help with day to day tasks. We are all being asked to look into them and come up with ways this new software can improve our productivity. Other folks who have a job like mine might be encountering similar pushes at their workplaces.

I think this is an honest effort by our management to ensure that our organization doesn’t get left behind in the AI wave they are convinced will revolutionize the workforce. Stay ahead of the disruption, and ride the wave I guess is the thinking. Surely it’s not the case, as Aileen and I joked when I brought this up to her, that I am training an AI to replace me. I mean, why pay a software tester when you can just describe the tests you need into a text box in an AI tool? Oh my.

Below is the very informative video that explains generative AI.

Truth is a Casualty in the Age of Performative Politics


If you watched President Biden’s State of the Union speech last week, and were aware of the Republican response by Senator Katie Britt, you probably know that the latter’s speech has been mocked for its insincere and performative nature. In fact, Britt’s rebuttal was so performative that even as she was giving it, the Internet was anticipating that SNL would parody it in their next cold open sketch, coming just a couple of days later. And indeed they did, though to be fair they also parodied the President.

I do agree that Senator Britt’s speech was performative, as well as inaccurate in its statements, but this whole affair reminded me of some important points about the state of politics today:

  • Politicians are performative because they are not arguing in good faith; they are rallying their side in a partisan conflict. Is Biden really going to enact policies for the long laundry list of liberal/left/blue zone causes he touted in his speech? How could he in this era of dysfunctional government? He is simply assuring his base that he represents their values.
  • The partisan conflict is rooted in the Culture Wars that emerged out of the last Awakening, as evidenced by the conservative/right/red zone trappings of Britt’s speech: Christian family values, nativism, domesticated femininity – all the backlash against the Consciousness Revolution. She is simply assuring her MAGA base that she and the rest of the opposition against President Biden represent MAGA values; she doesn’t need to use facts to do that, just feelings.

The simple truth is that politicians in each partisan faction are going to use whatever rhetoric works to reinforce the group feeling within their camp. There’s not much point in worrying about the nuance of what they say, or for that matter its accuracy or whatever hypocrisies are embedded in the rhetoric. We are past the point of anyone convincing anyone through reason. We are in a raw struggle for power, so pick a faction and stick with it. If you can’t or won’t pick a faction, you might want to keep your head down for a while.

It’s Really Been A Year Already?


This photo showed up in the memories feed which my smartphone helpfully throws in my face every once in a while. It was one year ago today that I went back to the corporate campus of my previous job to turn in my laptop. I took this photo because this building was still under construction when I left the campus to begin remote work in March 2020 – there was a lot of hubbub about it at the time – and I was excited to finally see it on my return.

I think it’s a pretty building. The campus has this striking architectural design that resembles modern art, and this building fits right in. It also has a lot of stairs (I mean the campus as a whole does), which makes it challenging to walk around in if you are not physically fit. When I walked on that campus I felt my age. I felt like I was obsolescing, surrounded by the aggressive energy of a workforce that keeps growing younger with every new job I take.

I did go up that formidable looking staircase and into the building. It was impressive on the inside, too, with a spacious lobby and some nice art installations. The security guy at the desk paid me no mind.

The campus was custom built for this corporation, and it must have cost a bundle. So I can understand why they wanted people returning to onsite work. Aileen and I speculated that maybe I was let go because I declined to go hybrid and wanted to stay 100% remote. They gave us the option to do either, and assured me that my decision to stay remote had nothing to do with my position being cut. But who knows.

I’m glad I made the choices I did, and that I amazingly was able to get a 100% remote job elsewhere after being let go. I feel very lucky to be in the position I am in today, and grateful for the support of my family here in Pennsylvania. I just can’t believe it’s been a year already at my new job. Tick tock.

Emojis at Work – How Social Media Infiltrated the Workplace


I still remember the excitement when the first iPhone came out in 2007; only a few people were using this new kind of mobile phone, but boy were they delighted with it. At the same time, everyone was jumping onto Facebook, which had just opened up to the general public in 2006.

Fast forward to a decade and a half later, and everyone has a touchscreen phone (I got my first one in 2014). Social media platforms have proliferated, and are a constant, pervasive feature of daily life.

Once, employers tried to prevent workers from browsing the Internet during the day, but such efforts have been abandoned. Everyone is on their phone all the time. In fact, the software used to officially collaborate in the workplace looks a lot like the apps we use in our personal lives.

At least, that’s been my experience as a white collar professional in a cubicle environment. I’m a middle-aged Gen Xer, and my career is split pretty evenly between the world before social media and the world after. I’ll explore what that’s been like for me a little more in this post.


I joined Facebook in 2008, because all of my coworkers were doing it and I didn’t want to be left out. It was a clear case of FOMO (Fear Of Missing Out), a term that had recently been coined to describe how social networks stimulate compulsive following and mimicking behavior. I friended all of my coworkers, and had fun setting up my profile and exploring the site.

Do you remember those days, and how primitive the Facebook interface was compared to today? Your main profile page was called your “wall” and to post on it, you “updated your status.” If you look back at your posts from fifteen years ago, you’ll see how different they were. They seem kind of awkward and tentative, like we all didn’t quite know what to do with this new way of communicating.

Back then, there was a site called “Please Rob Me” that tried to raise awareness about the dangers of sharing the fact that you weren’t home, like someone was wondering how anyone could be stupid enough to do that. The site is defunct now, and today it is routine for people to tag their locations when they go out, even though we all know we’re giving valuable information away to giant corporations (the ones who are really robbing us).

Back then, as employees found themselves sucked into their Facebook feeds, companies started blocking the website on their corporate networks. They established policies about what employees were allowed to post on social media platforms, warning them against representing the company or divulging corporate information.

In the late 2000s, the world was just getting used to social media and its implications. Today, a decade and a half later, social media is routine in our daily lives. Everyone accesses social media platforms from their smartphones on a more or less continuous basis, even while at work, and employers have no chance of stopping them.

One thing I’ve decided since those early days is that it is best to keep my work life and my personal life separated, where social media is concerned. I no longer send Facebook friend requests to my coworkers, as I did back when I first joined the site. But that’s just how I personally manage my online presence. For other people, depending on their line of work, it might be better or even necessary to network and post about work across all social media, and have less of a distinction between personal and professional social spaces.

A clever post about work I made on a social media app

That’s not to say that social media isn’t a part of my work life at all. There are, as you well know, work-specific social media sites, such as LinkedIn, where I do make sure to connect with my coworkers. The intranet at the company where I work uses software with posts and feeds that resemble those on any other social media platform, and while I’m not particularly active there I do browse, to get a feel for the corporate culture.

I also sometimes post about work on my personal social media accounts, but in a sly way. I don’t want to reveal where I work, but just say something about work that’s clever, maybe even vaguely subversive, hoping for likes and shares. I’ve included an example screenshot in this blog post. You can see that I got zero engagement.

Social media conventions have infiltrated workaday tasks as well, such as in the use of emojis and reactions in online conversations. I have long been using messaging software in the workplace; I remember Skype being in place in the office back in the early 2000s. I also remember that as emojis started coming into use in personal messaging, I was hesitant at first to include them in work conversations. It just seemed somehow unprofessional to use a smiley face in a work related chat.

But, in time, it simply became a norm. On the messaging software I use at work now, there are emoji reaction options, and my coworkers and I routinely “thumb up” and “heart” one another’s messages. It’s just a way of signaling agreement or showing appreciation. Workplace communication has become more friendly and informal than in the past, and I think this reflects a preferred mode for today’s mostly Gen X and Millennial workers.

For me, a Gen Xer who adopted less formal modes of communication in the latter portion of his career, it’s been an adjustment. But for many of my coworkers, who are Millennials twenty or thirty years younger than I am, it must just seem like the normal way people communicate in the digital space. For Boomers, experiencing these changes at the tail ends of their careers, it might seem too informal or alien to their expectations.

I suppose I shouldn’t speak for others, especially if they are from a different generation. These are just my thoughts on the matter. There’s no denying that the proliferation of smartphones, along with ubiquitous access to the Internet and its software platforms, has changed our daily routines, including our work routines. Please feel free to share your own experience in the comments below.

On Gratitude


Gratitude is difficult for people to express, because it requires admitting dependence on others. In that way it feels like surrendering autonomy, which everyone is loath to do. Ultimately, all conflict in human life is about power and autonomy, and the resistance to expressing gratitude is like a fortress people erect to defend their self-perception in that power struggle. In their egoic desire for power and autonomy, people convince themselves of their self-reliance and self-determination, and cannot face the truth that in our complex society we are all interdependent.

The use of money and market transactions to facilitate meeting basic survival needs helps to sustain these self-delusions. After all, so long as one has the mettle to maintain a money income through some skill or trade, one can exchange one’s money for needed goods and services. Therefore, one can believe that one is reliant only on oneself. The illusion of freedom is maintained.

But these market exchanges don’t change the fact that to eat, we depend on others to grow our food. To thrive, we depend on others to maintain basic infrastructure, roads and bridges and the utilities that deliver our power and water. We depend on others to extract and refine the minerals and metals and fossil fuels which form the material foundation of our civilization.

Our use of money to acquire these things via free market capitalism disguises these dependencies but does not eliminate them. And we depend on the authority of our government, which ultimately rests on the power of its military and police, to even make those markets work and that money usable as a currency of exchange. We are utterly dependent on other human beings, but we cannot acknowledge this or display even the simplest gratitude for what they do.

Even in our personal lives we are dependent on others. We are dependent on our friends and family for emotional support and for logistical support. We depend on their willingness to share their time with us. But then we get used to relying on them. We start to take them for granted, assuming they will always be there for us, and forget to show our gratitude.

We resist showing gratitude for what others do for us, whether people close to us or the myriad strangers who make our lifestyles possible, because that would be admitting our dependence. Our egos would rather believe in their own sovereignty, that we are in charge and others are fulfilling obligations to us. Expressing gratitude, for the ego, is like abdicating a throne. But that throne is a mirage – we are really held up by what others do for us. Other people who deserve our gratitude.

Heed the wisdom of the Buddha Bear.


I plan more of these Buddha Bear posts in the future. This was a format I was originally planning to post on another site™, which sadly has not heeded the Buddha Bear and has lost its way.

A Couple of Interesting 20th Century History Podcasts


I like having a podcast running in the background while I work. I work from home, alone in an office, and having a podcast going is like having some folks there in the room with me, discussing whatever. My favorite topics are culture and history, though sometimes I go for science or spirituality. I like something low-key, fairly non-intrusive, which podcasts tend to be in my experience, or at least the ones I listen to are. I might not fully absorb the content, as my focus is divided by work, but that’s OK. It’s just nice to have someone talking in the background.

The term “podcast” came about in the early 2000s, taking its name from the “iPod,” a common way to access digital content back then. All it means is some kind of digital streaming audio content, in episodic format. Episode lengths can range anywhere from about 15 minutes to over two hours. I listen to podcasts over the web on my laptop, or on my smartphone. I tend to be behind on the content, meaning I’m often listening to episodes made years ago, rather than recently produced ones. I’m always way behind on pop culture consumption. I mean, I only just recently watched the 50th anniversary Doctor Who special, and they’ve already made the 60th anniversary one.

In this post I wanted to call out a couple of podcasts I’ve enjoyed recently, both of which cover the history of the mid-twentieth century.

The first one is The Long Seventies Podcast, which covers, well, the 1970s. It uses the term “long seventies” to mean the period from about 1968 to 1983, which is understood as one cultural era. This is basically the Consciousness Revolution Second Turning as defined in Strauss & Howe generational theory. The hosts, Matt and Alex, are two guys who I’m guessing are core Gen Xers like me, based on their attitudes and how their own life experiences come up in their discussions.

The podcast episodes are long – about 2 hours each – and cover politics, media, and popular culture (so far as I’ve heard). Sometimes they talk about events, sometimes trends, sometimes specific cultural artifacts like movies or music albums. The two hosts are mild-mannered, kind of soft-spoken and a bit rambly. It makes for easy background listening. They are skeptical, vaguely reactionary, and often insightful, with heads full of pop culture trivia. Very Gen X.

I have only listened to episodes from a few years back, so I have no idea what they think of recent events – not that they would necessarily discuss current events, since the podcast is about the 1970s. Anyway, if you are interested in that decade and find the style I have described appealing, you should check them out.

The second podcast is titled From Boomers to Millennials. This is an ambitious project by a Millennial named Logan Roberts, covering modern U.S. history. The goal is to have an episode for every year from 1946 (the first Boomer birth year according to the U.S. Census) to the present – that is, to go from the dawn of the Boomer era to some point in the current Millennial era.

The episodes are usually about 45 minutes long, and mostly cover politics and major historical events. But because 45 minutes is not really enough time to cover a whole year, the “episodes” end up getting broken up into multiple parts anyway. Plus there are “supplemental” episodes, often in the form of mini-biographies of important people from the time period. At the time of this writing, the podcaster has reached the year 1961.

Though this podcast is slow going, I don’t mind, because Roberts is a great narrator. He is well spoken and very knowledgeable. As a Millennial, he seems to have absorbed a historical narrative that some might consider to be “woke” or “liberal,” but I don’t mind that either. I think he’s on the right side of history, and I hope he gets through the Millennial Saeculum, which should end in the early 2030s. By the time this podcast reaches the 2030s (remember, it’s at 1961), I might well be dead. But history, of course, will be marching on.

Entering 2024


I heard a Lewis Black bit on the Daily Show where he said that 2023 was the first year since the pandemic that felt almost normal. In our world, what with the return of live theater, it does feel that way, though you still see some people in audiences wearing face masks, since the pandemic isn’t really over. The COVID-19 pandemic may well continue for the rest of our lives, as the AIDS pandemic has – and COVID has killed almost as many people as AIDS has cumulatively, in a tenth of the time.

As for a return to normalcy, well, maybe, except I still worry about what will happen in this country on the political front. I do have a hope that our relatively high levels of prosperity will save us from a complete breakdown, though I have forebodings of a constitutional crisis to come. The first month of 2024 could be very eventful.

I’ve listed below the current ages of the living generations in the United States, as 2023 comes to an end. We are almost, but not quite, to the point where each archetype fills an age bracket. When that happens, we will be close to the end of the Crisis Era.

  • Greatest: 99+
  • Silent: 81-98
  • Boomer: 63-80
  • Gen X: 42-62
  • Millennial: 19-41
  • Homeland: 0-18
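The brackets above are just each generation’s Strauss–Howe birth years subtracted from 2023. Here is a minimal sketch in Python that derives them (my own illustration; the exact birth-year spans are an assumption, since published ranges vary slightly by source):

```python
# Ages at the end of 2023, derived from assumed Strauss-Howe birth-year spans.
GENERATIONS = {
    "Greatest": (1901, 1924),
    "Silent": (1925, 1942),
    "Boomer": (1943, 1960),
    "Gen X": (1961, 1981),
    "Millennial": (1982, 2004),
    "Homeland": (2005, 2023),
}

YEAR = 2023
for name, (first_birth, last_birth) in GENERATIONS.items():
    # Youngest member's age comes from the last birth year, oldest from the first.
    print(f"{name}: {YEAR - last_birth}-{YEAR - first_birth}")

# Output matches the list above (Greatest prints 99-122, shown there as 99+).
```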

Which means this era isn’t over yet, pandemic or no pandemic, normal or not normal. And people sense that, which is where memes like the one on the right are coming from.

So just be aware, and pay attention as these oh-so-interesting times unfold.

With all that said, I hope you and your family have a safe, prosperous, and happy New Year.

Hi, It’s Me, Your Friendly Individual Contributor


When I was unexpectedly laid off at the beginning of the year, I scrambled to update my LinkedIn profile and my resume. I was not prepared to suddenly be looking for work.

If you are a white collar professional like me, then you know the drill. When you start a job search, you have to review your resume, which has probably been languishing, untouched since the last time you got an offer. It did what you needed it to do then, and you promptly forgot about it.

But now you need to update it with your latest experience, maybe streamline it so it’s not too long. Tweak it a little to reflect what’s new in your industry, so you look like you’re keeping up with the changing times. I mean, you are, of course, since you are a brilliant professional.

When I was doing this at the beginning of the year, I was feeling vulnerable. As I stated in my blog post then, I was thrown off balance. There was no way of knowing how long it would take me to find another position. And I was anxious about age discrimination: the worry that the older you get, the harder it is to get hired.

Now they say that when you are doing up your resume, you should always phrase your experience in terms of how you were proactive and made a difference, rather than just list out the tasks that you performed. You’re trying to convince some hiring manager that you provide some special value. But “proactive” isn’t my vibe. My vibe is, I do the damn tasks and get the shit done. I am a worker bot. As I’ve told Aileen, my aspiration is to be like R2-D2: not a main character, but resourceful and reliable. The one you call on.

In the parlance of the corporate workforce, I am an “individual contributor.” I have never held a management or leadership position, nor have I ever sought one. I have worried a bit about what this means for my prospects, as I’ve noticed how other workers around me are younger than me by more and more years as time passes. Everyone my age, it seems, has moved on to management, to more impressive titles. But I am not seeking position or status; I just want to get paid to do work.

I know that it’s possible to finish my career this way, because I recall a job where there was an old-timer who was in the same position as me, but in his sixties. He was white haired and a little bit stooped and he was just doing his little low-level tasks under someone else’s direction. He actually retired while I was still working there. God willing, I thought, that could be me in twenty years. And that was ten years ago.

How could you let this guy go?

So I decided to embrace the idea of being an individual contributor. To own the brand, so to speak. I mention it explicitly in my LinkedIn profile, as well as how well I fit into any team (which is true, I believe). I also took a new profile pic, with the most puppy-dog-eyes look I could muster, like I want the hiring managers to see me as a rescue they just couldn’t turn away.

I guess it must have worked, since I got hired pretty quickly. I’m very lucky to be in a field where there is high demand for workers, and to have found a company that was in a hiring boom. The economy still works for some of us, and I just need it to keep working for me for another decade (or two?) so I can R2-D2 along, making a small difference.