Category: Work Life

Agile across the Generations

In a post last month I discussed the Agile method, and described an origin story for it. In my story, Agile was invented by a new generation of software developers for a new generation of software – the software being written in the fast-paced world of the networked personal computer. It started when an “Agile Manifesto” was declared in 2001, just as the dot-com boom was ending, after the software world had experienced a couple of decades of rapid growth amidst a profound shift in work patterns. A rising young generation (my own, Generation X) moved freely from job to job, eschewing loyalty to the company in favor of careers as “perma-temps.” Some system was needed to manage the frenetic chaos of this new working environment, and that’s where Agile came in.

This surely is a simplification and possibly off the mark. After all, innovation in workflow management precedes the Agile manifesto by generations. It has been a part of the evolution of the modern corporation for more than a century, going back at least to Taylorism and scientific management. Agile fits in with other conceptualizations of “lightweight” or “lean” approaches to project management, meant to avoid bogging everyone down with process and minutiae, and with earlier iterative development methodologies. These came about long before my generation was in the workforce.

My origin story came about because the Agile methodology strikingly fits the peer personalities of the generations who invented it – Baby Boomers and Generation X. If you look up the signatories of the Agile Manifesto, almost all of them are from those two generations, which constituted the bulk of the workforce at the time (Millennials were only just graduating from high school). These are both generations known for individualism, for focus on the self and the personal, and for short-term thinking. It makes sense that they would embrace a work methodology that emphasizes individuals over process, and adaptability over planning.

The very name “Agile” evokes the ideas of speed and flexibility, qualities which align with my generation’s reputation. Also aligning with Generation X is Agile’s way of defining success as developing software that works, not necessarily software that is perfectly crafted or meticulously documented. “Git-R-Done!” or “Just Do It!” as a Gen Xer might say. Or how about the Agile sub-type known as “extreme programming,” a hyper-focused iterative approach with very short cycles? What could be more Gen X than that?

My point is that this methodology was primed for the workforce of the time – a workforce consisting of young adult Gen Xers, managed by middle-aged Boomers. The hyper-focused individualists were doing the work while the visionaries were directing them. Agile, in theory, was a mindset, a whole philosophy of managing work in a fast-paced world. As long as no one worried too much about following a fixed process or plan, and everyone stayed adaptable and in constant communication, much could be accomplished.

Contrast this with Six Sigma, a methodology that came from the Silent Generation when they were the middle-aged managers of young adult Boomers. This faultfinding approach, which uses statistical methods to eliminate defects in processes, suits the Silent Generation’s reputation for fine-tuning expertise, as well as the Boomer Generation’s reputation for perfectionism.

Now what about Agile in the workforce today? It’s been over twenty years since the manifesto was published, and now it’s Gen Xers who are the middle-aged managers and Millennials who are the young adult workers. Does the Agile methodology suit a generation known more for hivemind thinking than for focused individualism? I think it does, though maybe not in exactly the way it was originally envisioned.

I have been using Agile at work for the better part of the last ten years, at all three of my most recent software development jobs. In my experience, the ideal of the “Agile mindset” doesn’t really stick. It’s fine to have an overall philosophy of work, but actually getting people to adopt a specific mindset requires coaching and attention, not simply declaring a vision. What does stick easily about Agile is the framework of dividing the work into short sprints and keeping the team aligned, using regular meetings (such as a daily scrum or stand up) and a system for tracking the work (such as user stories on a storyboard).

I think the structure provided by this framework is a good fit for the peer personality of the Millennial generation, who do best in an orderly work environment with clearly set expectations. They like to be given a well-defined task and rewarded for accomplishing it. A little praise and gratitude will do. They even get FOMO when they don’t have a specific assignment, which is understandable as it might be a sign that their position isn’t needed any longer.

Even as Agile methodology supplies structure, the short duration of the sprints and the iterative workflow continue to provide the benefits of flexibility as project priorities and personnel shift about. A plethora of practices and sub-methods has evolved out of the original idea, giving Gen X and Elder Millennial managers plenty of ways to tinker with the methodology to find the best fit for their teams.

It’s worth noting that there are limitations that come about when you have structure. If everything has to be tracked, work might not get done if no one remembers to track it. If expectations are clear, there might not be much motivation to go beyond expectations. A well ordered framework for defining and assigning work might be easy to navigate, but it can also foster complacency. No one is likely to go above and beyond, if there doesn’t seem to be any particular reward for doing so, and if doing so risks ruffling feathers by disrupting the expected workflow.

Continuing the story of Agile, it might be that what started as a methodology for producing results in a fast-paced environment has evolved into a methodology for governing work in an orderly manner, such that everyone can function in a well-defined role. That’s what my experience shows. Agile might not be as versatile in practice as it was originally envisioned to be, but it’s still a useful tool for keeping teams aligned and productive.

I do sometimes hear an old Gen Xer on a team complain that “we’re not practicing true Agile,” but I just think, “so what?” We’re getting stuff done (hopefully), and keeping tabs on it. That’s enough.

As far as I can tell, Agile, at least in name, is here to stay. The concept is entrenched in the Information Technology workplace, and will certainly outlast my career, which has not much more than a decade to go. Ten years from now the generation that comes after Millennials, the Homeland Generation, will fill the twenty-something age bracket and constitute the workforce’s youngest cohorts. I wonder what further evolution of the Agile method might come along with them.

How We Got Agile: An Origin Story

My old copy of “The Mythical Man-Month”

When I was a young man, a college student in the Computer Science program at Virginia Polytechnic Institute, we were assigned a book to read. It was called The Mythical Man-Month, by Frederick P. Brooks, Jr., and I still have my copy from the 1980s. The point of the book, which you might be able to glean from the title, is that you can’t simply measure work effort in “man-months,” on a scale such that you could conceivably get more work done by adding more people to a project. As an example, you couldn’t say that a project has a work effort of 120 man-months, meaning that with 10 men it will take 12 months to finish, and therefore with 20 men it will be done in 6 months.

If you had 10 men working on this hypothetical project, and added 10 more, you would not find that it completed 6 months sooner. It would, in fact, take longer than 12 months. The problem is that as you add more men (people) to a project, you need time to get new hires ramped up to where they understand the project well enough to be productive. You also multiply the lines of communication, which generates additional overhead in keeping everyone in sync on the specific information needed to make interacting components work together. In engineering, these pieces of information are called “specifications,” and they have to be tracked somehow. If you add more people to a technical project, you add more tracking effort. These complications are summarized in Brooks’s law: “Adding manpower to a late software project makes it later.”
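The arithmetic here can be sketched in a few lines of Python. The 120 man-month figure is the example from above; the channel count is the standard pairwise formula, n(n-1)/2, which is one way to quantify the communication overhead Brooks describes.

```python
# A toy illustration of why "man-months" mislead: the naive schedule
# math divides evenly, but the number of communication channels among
# n people grows quadratically. Constants are illustrative only.

def naive_months(effort_man_months: float, people: int) -> float:
    """The (wrong) assumption: effort divides evenly across people."""
    return effort_man_months / people

def comm_channels(people: int) -> int:
    """Pairwise lines of communication among n people: n*(n-1)/2."""
    return people * (people - 1) // 2

for n in (10, 20):
    print(f"{n} people: naive schedule = {naive_months(120, n):.0f} months, "
          f"communication channels = {comm_channels(n)}")
```

Doubling the team from 10 to 20 halves the naive schedule but more than quadruples the communication channels (45 to 190) – which is exactly the overhead the naive model ignores.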

As a software engineer in the early 21st century, it fascinates me to read the author’s description of how specifications were tracked on the project he worked on – the IBM System/360 – in the 1960s. They had huge manuals kept in binders, and as changes were made, the responsible engineers would have to go into the binders and update the appropriate pages – that is, take out the old pages and insert the new ones with the changed specs. This manual was the Bible of the system, and keeping it up to date was absolutely vital to the success of the project.

Modern day software engineers like me are not used to such meticulously maintained documentation. We consider ourselves lucky if there is any documentation at all for the software on which we are working. You’d think it would be easier, now that everything can be done online, but projects move too fast and the people working on them move around too much. No one is necessarily going to stay on top of documentation, and so long as software works as expected, that’s fine. It’s when it doesn’t work that you run into trouble.

Because personnel move around so frequently in the modern workforce, there is rarely anyone working on a software program who was there when it was originally programmed. But programmers still need to maintain it. Sometimes we are given requirements to modify existing software that has no documentation, with no one around who knows anything about it, and the only way to achieve that goal is through “reverse engineering.” This means poring over old code and documenting it from scratch, which is very time consuming. This underscores the point about the man-month: you can’t just insert a person into a project and expect them to get as much done in a given amount of time as a previous person on the project did. Certainly not if they are going to be reverse engineering the previous person’s work.

Since the start of the personal computing era and the long economic boom of the late 20th and early 21st centuries, computer software has been advancing at a faster pace than it did when Frederick P. Brooks worked as an engineer at IBM. The workforce has changed as well, with employees typically job hopping every few years, and often working as contractors through agencies rather than directly for the client that owns the software they are developing. So how do the software engineers of my generation handle project management in such a chaotic work environment?

The answer is “Agile” methodology, which came about around the start of this century. Agile is a lean or lightweight software development method that emphasizes individuals collaborating over plans and processes, and defines good software as software that works, not necessarily software that is well documented. At least, that’s the declaration in a famous “Manifesto for Agile Software Development” that was published in 2001.

The idea is that “Agile” is a mindset where you are focused as a team on communication and collaboration, continuous improvement, and responsiveness to change. In practice, it means breaking up the project work into short iterations called “sprints,” which typically last two weeks. Everyone’s tasks for the sprint are things that shouldn’t take more than a couple of weeks to finish. So right there the idea of a “man-month” is out; no one would work on one thing for a whole month!
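As a rough illustration of the sprint-sizing idea – no single task should exceed the sprint length – here is a minimal sketch. The task names and estimates are invented for illustration, not taken from any real Agile tool.

```python
# A minimal sketch of sprint sizing: any task estimated to take longer
# than the sprint itself must be broken down into smaller pieces.
# Task names and day estimates are hypothetical.

SPRINT_DAYS = 10  # a two-week sprint, in working days

tasks = {
    "write login form": 3,
    "add input validation": 2,
    "refactor session handling": 4,
    "rewrite billing module": 25,  # too big: must be decomposed first
}

def fits_in_sprint(estimate_days: int, sprint_days: int = SPRINT_DAYS) -> bool:
    """A task qualifies for a sprint only if it fits within the sprint."""
    return estimate_days <= sprint_days

oversized = [name for name, days in tasks.items() if not fits_in_sprint(days)]
print("Break these down further:", oversized)
```

In practice this decomposition happens in planning meetings rather than in code, but the rule itself is this simple: nothing on the board outlives the sprint that contains it.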

Breaking the project work into chunks like this makes it easier to show progress, and to evaluate how effective the team is from sprint to sprint, and change processes and workflows as needed. It also makes it easier to accommodate personnel shifting around from project to project. It’s a way of coping with today’s volatile workplace, which makes long term planning harder to achieve. A whole panoply of “frameworks” and “ceremonies” has developed around the original concept since it was first elucidated.

If you are in a white collar profession (not even necessarily Information Technology) you might have experience with Agile-related frameworks in your career. I was first exposed to Agile in the late 2000s, and have been at positions where it is used comprehensively since 2018. Every company does it a little differently, but I have always found it to be a useful way to structure project work.

The way I see it, Agile came about because a new generation of software engineers needed to adapt to a faster pace of work than what the generation of Frederick P. Brooks experienced in their careers. They needed to find their own solution to the problem of how to get people to work effectively when they are added, out of the blue, to a new project. If you look at the signatories of the 2001 Agile Manifesto, you will see that they are almost entirely Baby Boomers and Gen Xers. Today’s Millennials and Gen Zers in the IT workforce have possibly never worked on a project that wasn’t using Agile.

I’ll have more to say about the different generations and Agile in a future post.

AI at Work, for Better or Worse

A little robot guy I made with an AI image generator

As you surely know if you are a denizen of the online world like I am, artificial intelligence has made remarkable strides in the past few years. In particular, what they are calling generative AI has really taken off. This is a kind of advanced pattern matching software, grown out of machine learning, that lets you use prompts to create content: images, complicated text including stories, and at this point even videos and music. At the bottom of this post I’ve linked to a YouTube video that explains generative AI really well, so check it out.

I played with AI image generators for a while, and had some fun. In their early iterations they produced really weird, often creepy looking stuff, but now they’ve gotten pretty advanced. The images they produce are intriguing, impressive even. I saved a lot of the ones I generated, but stopped messing with the programs when I saw how many of my artist friends were upset by the proliferation of AI-generated images on social media. I gathered they could sense their own work being made obsolete by an overwhelming supply of easily produced knock-off art. Why hire an illustrator when you can just describe what you want into a text box in an AI tool, and get the result in a few minutes? Plus there’s the troubling issue of these programs possibly being trained on copyrighted material without the consent of the copyright owners, meaning they are effectively stealing from artists.

Another thing you have to consider about the product of generative AI (and this is covered in the video below) is that it is subject to one of the rules about computer programming that I was taught as a lad: Garbage In, Garbage Out. That is, if you put bad data into a computer program, then you will get bad data out of it. Generative AI is trained on massive data sets, and one result of the way the current AI programs have been trained is that they produce content that tends to express a sort of lowest common denominator of its subject matter. You put in the vast quantity of data on the Internet, apply sophisticated pattern matching, and you get out, as a result, something like an “Internet average” version of human knowledge.
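Here is Garbage In, Garbage Out in miniature – a toy “model” that just averages its inputs, with invented numbers. It faithfully reproduces whatever junk its training data contains, which is the same mechanism, vastly scaled up, behind the “Internet average” effect described above.

```python
# "Garbage In, Garbage Out" in miniature: a trivial model that averages
# its training data will faithfully reflect any junk in that data.
# The ratings below are invented for illustration.

clean_ratings = [4.0, 4.5, 5.0, 4.2]
garbage_ratings = clean_ratings + [-999.0]  # one corrupted record

def average(xs):
    """Our stand-in 'model': it just summarizes its training data."""
    return sum(xs) / len(xs)

print(f"clean data   -> {average(clean_ratings):.2f}")
print(f"with garbage -> {average(garbage_ratings):.2f}")
```

One bad record drags the output from a sensible value to nonsense; a real generative model is enormously more sophisticated, but the dependence on the quality of its training data is the same.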

For an example of what I mean, here is a fantastic article explaining how AI-generated images of selfies misrepresent culture. They do this because the pattern matching algorithms take the conventional way that selfies typically look and apply it to subjects where that wouldn’t make sense. So an AI-generated image of, say, a group selfie of medieval warriors makes them look like modern day humans. Now, since the idea of the existence of such a selfie is absurd on the face of it, maybe it’s pointless to worry about its inherent historical inaccuracy. But in a way, these kinds of images are erasing history.

The article goes even deeper; the AI generators tend to represent everyone as smiling into the camera the way that Americans do. But other cultures that do exist today and do take group selfies have different ways of expressing themselves when taking photos. So the AI programs aren’t just erasing history, they are also erasing existing modern cultures. They are turning everyone into Americans, because American culture dominates the Internet.

Here’s another way AI-generated content gravitates toward a dominant average mode, one you might have heard of already. It seems that AI chat programs, trained on the massive data of online conversations, will often produce racist, abusive comments. It’s like they inevitably turn into Internet trolls. This might seem like a mere annoyance, but AI programs generating racially biased content can have serious, life or death consequences.

With all of these concerns, it’s understandable that public perception of AI is not always favorable. Ted Gioia (who has an awesome substack, by the way) wrote about this perception recently, starting with a story about the audience at SXSW booing an AI presentation. His article expands into a general discussion of the public’s current distrust of the technocracy, in contrast with the way technocrats like Steve Jobs were idolized in the past. Faith in “innovation” and “disruption” has waned in a society facing uncertainty and disorder, and sensing that technology is leading us toward a dystopian future.

Where does AI fit into my life, now that I’ve stopped playing with image generators? Well, I may not be able to avoid using it, as the company where I work has been promoting AI chat programs to help with day to day tasks. We are all being asked to look into them and come up with ways this new software can improve our productivity. Other folks who have a job like mine might be encountering similar pushes at their workplaces.

I think this is an honest effort by our management to ensure that our organization doesn’t get left behind in the AI wave they are convinced will revolutionize the workforce. Stay ahead of the disruption, and ride the wave I guess is the thinking. Surely it’s not the case, as Aileen and I joked when I brought this up to her, that I am training an AI to replace me. I mean, why pay a software tester when you can just describe the tests you need into a text box in an AI tool? Oh my.

Below is the very informative video that explains Generative AI.

It’s Really Been A Year Already?

This photo showed up in the memories feed which my smartphone helpfully throws in my face every once in a while. It was one year ago today that I went back to the corporate campus of my previous job to turn in my laptop. I took this photo because this building was still under construction, amid a lot of hubbub, when I left the campus to begin remote work in March 2020, and I was excited to see it finished on my return.

I think it’s a pretty building. The campus has this striking architectural design that resembles modern art, and this building fits right in. It also has a lot of stairs (I mean the campus as a whole does) which makes it challenging to walk around in if you are not physically fit. When I walked on that campus I felt my age. I felt like I was obsolescing as I was surrounded by the aggressive energy of a workforce that keeps growing younger with every new job I take.

I did go up that formidable looking staircase and go into the building. It was impressive on the inside, too, with a spacious lobby with some nice art installations. The security guy at the desk paid me no mind.

The campus was custom built for this corporation, and it must have cost a bundle. So I can understand why they wanted people returning to onsite work. Aileen and I speculated that maybe I was let go because I declined to go hybrid and wanted to stay 100% remote. They gave us the option to do either, and assured me that my decision to stay remote had nothing to do with my position being cut. But who knows.

I’m glad I made the choices I did, and that I amazingly was able to get a 100% remote job elsewhere after being let go. I feel very lucky to be in the position I am in today, and grateful for the support of my family here in Pennsylvania. I just can’t believe it’s been a year already at my new job. Tick tock.

Emojis at Work – How Social Media Infiltrated the Workplace

I still remember the excitement when the first iPhone came out in 2007; only a few people were using this new kind of mobile phone, but boy were they delighted with it. At the same time, everyone was jumping onto Facebook, which had just opened up to the general public in 2006.

Fast forward to a decade and a half later, and everyone has a touchscreen phone (I got my first one in 2014). Social media platforms have proliferated, and are a constant, pervasive feature of daily life.

Once, employers tried to prevent workers from browsing the Internet during the day, but such efforts have been abandoned. Everyone is on their phone all the time. In fact, the software used to officially collaborate in the workplace looks a lot like the apps we use in our personal lives.

At least, that’s been my experience as a white collar professional in a cubicle environment. I’m a middle-aged GenXer, and my career is split pretty evenly between the world before social media, and the world after. I’ll explore what that’s been like for me a little more in this post.


I joined Facebook in 2008, because all of my coworkers were doing it and I didn’t want to be left out. It was a clear case of FOMO (Fear Of Missing Out), a term that had recently been coined to describe how social networks stimulate compulsive following and mimicking behavior. I friended all of my coworkers, and had fun setting up my profile and exploring the site.

Do you remember those days, and how primitive the Facebook interface was compared to today? Your main profile page was called your “wall” and to post on it, you “updated your status.” If you look back at your posts from fifteen years ago, you’ll see how different they were. They seem kind of awkward and tentative, like we all didn’t quite know what to do with this new way of communicating.

Back then, there was a site called “Please Rob Me” that tried to raise awareness about the dangers of sharing the fact that you weren’t home, as if asking how anyone could be stupid enough to do that. The site is defunct now, and today it is routine for people to tag their locations when they go out, even though we all know we’re giving valuable information away to giant corporations (the ones who are really robbing us).

Back then, as employees found themselves sucked into their Facebook feeds, companies started blocking the website from their local intranets. They established policies about what employees were allowed to post on social media platforms, warning them against representing the company or divulging corporate information.

In the late 2000s, the world was just getting used to social media, and its implications. Today, a decade and a half later, social media is routine in our daily lives. Everyone accesses social media platforms from their smartphones on a more or less continuous basis, even while at work, and employers have no chance of stopping them.

One thing I’ve decided since those early days is that it is best to keep my work life and my personal life separated, where social media is concerned. I no longer send Facebook friend requests to my coworkers, as I did back when I first joined the site. But that’s just how I personally manage my online presence. For other people, depending on their line of work, it might be better or even necessary to network and post about work across all social media, and have less of a distinction between personal and professional social spaces.

A clever post about work I made on a social media app

That’s not to say that social media isn’t a part of my work life at all. There are, as you well know, work-specific social media sites, such as LinkedIn, where I do make sure to connect with my coworkers. The Intranet at the company where I work uses software that has posts and feeds that resemble those on any other social media platform, and while I’m not particularly active there I do browse, to get a feel for the corporate culture.

I also sometimes post about work on my personal social media accounts, but in a sly way. I don’t want to reveal where I work, but just say something about work that’s clever, maybe even vaguely subversive, hoping for likes and shares. I’ve included an example screenshot in this blog post. You can see that I got zero engagement.

Social media conventions have infiltrated workaday tasks as well, such as the use of emojis and reactions in online conversations. I have long been using messaging software in the workplace; I remember Skype being in place in the office as far back as the mid-2000s. I also remember that as emojis started coming into use in personal messaging, I was hesitant at first to include them in work conversations. It just seemed somehow unprofessional to use a smiley face in a work related chat.

But, in time, it simply became a norm. On the messaging software I use at work now, there are emoji reaction options, and my coworkers and I routinely “thumb up” and “heart” one another’s messages. It’s just a way of signalling agreement or showing appreciation. Workplace communication has become more friendly and informal than in the past, and I think this reflects a preferred mode for today’s mostly Gen X and Millennial workers.

For me, a Gen Xer who adopted less formal modes of communication in the latter portion of his career, it’s been an adjustment. But for many of my coworkers, who are Millennials twenty or thirty years younger than I am, it must just seem like the normal way people communicate in the digital space. For Boomers, experiencing these changes at the tail ends of their careers, it might seem too informal or alien to their expectations.

I suppose I shouldn’t speak for others, especially if they are from a different generation. These are just my thoughts on the matter. There’s no denying that the proliferation of smartphones, along with ubiquitous access to the Internet and its software platforms, has changed our daily routines, including our work routines. Please feel free to share your own experience in the comments below.

Hi, It’s Me, Your Friendly Individual Contributor

When I was unexpectedly laid off at the beginning of the year, I scrambled to update my LinkedIn profile and my resume. I was not prepared to suddenly be looking for work.

If you are a white collar professional like me, then you know the drill. When you start a job search, you have to review your resume, which has probably been languishing, untouched since the last time you got an offer. It did what you needed it to do then, and you promptly forgot about it.

But now you need to update it with your latest experience, maybe streamline it so it’s not too long. Tweak it a little to reflect what’s new in your industry, so you look like you’re keeping up with the changing times. I mean, you are, of course, since you are a brilliant professional.

When I was doing this at the beginning of the year, I was feeling vulnerable. As I stated in my blog post then, I was thrown off balance. There was no way of knowing how long it would take me to find another position. And I was anxious about age discrimination; that the older you get, the harder it is to get hired.

Now they say that when you are doing up your resume, you should always phrase your experience in terms of how you were proactive and made a difference, rather than just list out the tasks that you performed. You’re trying to convince some hiring manager that you provide some special value. But “proactive” isn’t my vibe. My vibe is, I do the damn tasks and get the shit done. I am a worker bot. As I’ve told Aileen, my aspiration is to be like R2-D2: not a main character, but resourceful and reliable. The one you call on.

In the parlance of the corporate workforce, I am an “individual contributor.” I have never held a management or leadership position, nor have I ever sought one. I have worried a bit about what this means for my prospects, as I’ve noticed how other workers around me are younger than me by more and more years as time passes. Everyone my age, it seems, has moved on to management, to more impressive titles. But I am not seeking position or status; I just want to get paid to do work.

I know that it’s possible to finish my career this way, because I recall a job where there was an old-timer who was in the same position as me, but in his sixties. He was white haired and a little bit stooped and he was just doing his little low-level tasks under someone else’s direction. He actually retired while I was still working there. God willing, I thought, that could be me in twenty years. And that was ten years ago.

How could you let this guy go?

So I decided to embrace the idea of being an individual contributor. To own the brand, so to speak. I mention it explicitly in my LinkedIn profile, as well as how well I fit into any team (which is true, I believe). I also took a new profile pic, with the most puppy-dog-eyes look I could muster, like I want the hiring managers to see me as a rescue they just couldn’t turn away.

I guess it must have worked, since I got hired pretty quickly. I’m very lucky to be in a field where there is high demand for workers, and to have found a company that was in a hiring boom. The economy still works for some of us, and I just need it to keep working for me for another decade (or two?) so I can R2-D2 along, making a small difference.

Summer Update

And just like that, it was halfway through 2023.

I am over three months into my new remote job, and things are going swimmingly. It’s interesting because I get to work for a new kind of company (agricultural sector as opposed to finance), and also pick up on a new corporate culture. The IT department there isn’t very mature, in part because it has been expanding rapidly (how I got the job, essentially), so I get not only to prove my chops but also to help the folks who aren’t as seasoned as I am to understand the software development lifecycle. It’s very gratifying that my experience is being put to good use, and to know that despite my advanced years I am still relevant in the workforce.

Aileen, meanwhile, is working on the summer Arts Bubble musical, which this year will be City of Angels, a satirical noir comedy (not to be confused with a Nic Cage movie of the same name). As usual, she is committed 100% to all aspects of the production and putting in tons of work. Equally committed is our son, Tiernan, who is cast in his first lead role, as the hard boiled private eye from the movies. I hope you will be able to come see it (many friends and family already have confirmed they will, thank you all). The show dates are July 14-17; message me for details if you want to attend. But note that opening night is sold out. Woo hoo!

Our other son, Lionel, has just come back from a month in France, where he took a French immersion course with his University, and had a taste of life in another culture. This included going clubbing, and he had some interesting stories there. He’s becoming such a worldly young man. Back home, Gavin continues his relentless work maintaining the region’s water infrastructure. He is a wizard with programming PLCs (programmable logic controllers), which are these logical circuit board thingies that basically hold our entire civilization together. Aileen goes over to his house more often these days, since that’s where the best computer is, which is great for Potato, the cat who lives there, since it means she gets more attention now.

There is still a big hole in our heart and home that used to be filled by our sweet kitty, Sashimi, our magical girl. Aileen made this portrait of her after she died. It’s hard to believe it’s already been almost two months. Have we really moved on?

Is it ok to move on?

Is it ok to die?

We all will. Already this year two FB friends have died from cancer. Another, a very dear friend from back in the day, is sick and currently hospitalized. The clock is always ticking, ticking away to midnight.

Last night we watched a video on YouTube that informed us that the Doomsday Clock is the closest it’s ever been to midnight: 90 seconds away. The war in Ukraine is not helping here. The video we watched was actually about how scarily sophisticated A.I. is getting, and speculated on whether it might just decide to destroy the human race. It really terrified Aileen and gave her nightmares. I tried Stevesplaining to her that A.I. chatbots aren’t sentient beings with a will, just really impressive pattern-seeking algorithms, but I don’t think I reassured her.

In any event, just because A.I.s are “merely” computer programs doesn’t mean they won’t be put in charge of everything and then God knows what will happen. And if that doesn’t get us, we just might end up cooking to death anyway when Earth turns into a Venus-like planet. All we can do is carry on with our usual business while the summer broils us.

Oh dear, sorry to end on such a heavy note. Here’s a poem about cats by Jane Hirshfield to hopefully lighten your mood. Have a great summer, everyone, if you can. And come see our show!

The Rise and Fall of Drinking Culture


We’ve recently been watching Mad Men (available on Amazon Prime with our AMC+ subscription), a TV show about New Yorkers in the advertising business in the 1960s. It clearly is attempting to paint a portrait of what life was like in that bygone era, and how different social mores were back then. For example, everyone is constantly lighting up cigarettes, in any context, even in front of kids. The men unabashedly treat women like sex objects, and the women just accept it and learn to navigate what today would be considered a hostile work environment.

I know that a major premise of the TV show is to highlight these social differences between then and now. How accurate this portrayal of the period is, I can’t be sure, since I wasn’t there, but it seems plausibly realistic to me. And the show certainly has high production values, beautiful art design, and fine performances, making it a delight to watch.

What truly amazes me about the lifestyle of these advertising guys (as depicted on this TV show) is their capacity for consuming alcoholic beverages. They keep liquor in their offices, and take any opportunity to have a finger or two of scotch. If one of your coworkers comes into your office at, say, 10:30 AM, well – it would be rude not to offer them a drink! It’s a much different experience than I’ve had in my work life, which has occurred since our society moved on from the casual alcoholism of these Madison Avenue men.

For the duration of my young adult life, it would have been unthinkable to have alcohol in the workplace, or even to have a drink during the work day. It’s possible that this is because I spent those years living in the American South, which, while certainly known for its hard drinkers, is also known for puritanical restrictions on public life. Maybe up in the big cities in the North, people were still having three-martini lunches. But I suspect the real reason my work life was so different is my generational placement in history.

I do recall one early work experience which was like a glimpse of the last vestiges of the older generation’s casual work drinking. When I was a college student in the mid-1980s, I was in a work-study program, and worked at a major government agency in the DC area. The director of our department had an office suite that was behind a frosted glass window, so I never saw inside. One holiday season he opened up his suite for a company party, and lo and behold, he had a fully stocked dry bar in there. I even had a glass or two of something strong (I was 19 at the time, so I believe this was technically illegal), feeling a little bit guilty since I had to drive home afterwards. I was already internalizing the safety messages about drinking alcohol that were becoming predominant in the culture.

Logically, the director who presided over this dry bar would have been from the same generation as the “mad men” on the TV show, just twenty years older (since it was the 80s instead of the 60s). The way generations work, a cohort of people born at about the same time tends to retain the same attitudes and behavioral patterns throughout their lives, bringing those patterns with them into older and older age brackets as time passes. This old-timey office executive wasn’t going to give up his liquor unless they pried it from his trembling fingers.

By the 90s and 00s, the tenor of public life had changed. America was in a social era in which the Baby Boomer generation – a profoundly moralistic generation – was entering midlife; while my generation, Generation X – an opportunistic but disorganized generation – was entering adulthood. Society became safety-obsessed and health-obsessed, and drinking on the job was counter to this new values focus. While my generation may have chafed under the emerging neo-Puritanical values regime, we weren’t about to collectively do anything about it. We would just deal with it.

A similar dynamic occurred in an earlier era: the Roaring ’20s, when Prohibition under the terms of the Eighteenth Amendment was in place. In that time the midlife generation was the moralistic Missionary generation, while the young adults were the free-wheeling Lost generation. Prohibition didn’t exactly stop drinking, but it did drive it underground.

The Eighteenth Amendment was repealed around the same time that the Great Depression started. In the new social era that emerged, the generation that came of age – the Greatest Generation – developed a reputation for collegial drinking and smoking. These behaviors became associated with recreational pleasure in a context of sociability and solidarity, never mind all the health problems they were destined to lead to down the road.

This pattern of casual drinking and smoking in public continued into the postwar era in which the Mad Men live(d), until further generational change led to a more health-conscious society, and those habits fell out of favor. So the cycle goes. The era of the executive with a ready supply of liquor at the office came to an end.

During my young adulthood, drinking on the job became an underground activity, as during the Prohibition era. I say that because I do recall having a boss who was, in my opinion, a bit emotionally unstable, and I heard through the grapevine that he drank during the day. One time I found an empty bottle in a staircase, and took it as a sign that the rumors were true.

At a different job I had, there was a programmer who reputedly came to work drunk. His fate was to be sent to rehab; the company actually gave him a month off to clean up his act. I think they might have even paid for the rehab. He was really good at programming so I guess they couldn’t let him go. It just goes to show how much attitudes had shifted, and how drinking alcohol had become understood to be more of a pathology than a pastime.

Time has continued to pass, and I am no longer a young adult. Our society has recently gone through a financial crisis which can be likened to the 1929 stock market crash that was followed by the Great Depression. Has there also been a shift in attitudes towards drinking alcohol in the workplace, where it is now more acceptable as a social lubricant and source of conviviality, rather than being perceived as a personal moral failing?

I think so, at least to a limited degree, based on my experience in the workplace. In my recent positions, it has been common for the company to host parties where alcohol is served, sometimes but not always with a cap on the number of drinks per person. I’m not sure if age limits are enforced; it’s not impossible that an intern under the age of 21 has been able to sneak some drinks in. It isn’t exactly Mad Men, but it is an acceptance of drinking in the workplace, at least under controlled circumstances.

Media reports from the past decade or so have also suggested that this is happening, with the emergence of a new kind of startup culture where drinks are a perk, available in the break room. Not that I’ve ever had the luck to work at a place like that, but then my startup days were during the dot-com era, long ago.

That drinking at work may be on the rise makes sense in this social era. Instead of having moralizing Boomers in middle age, we now have practical Gen Xers, who will do whatever it takes to boost productivity and retain employees. Instead of having lone wolf Gen Xers in young adulthood, we now have sociable Millennials, who favor group activities, for which alcohol – since it lowers inhibitions and elevates mood, albeit temporarily – is a natural fit.

It must be noted, though, that the long-term trend is that younger generations are drinking less than we oldsters did at the same age. The party days of my Gen X youth are in the past, and today’s youth are more cautious, and more conscious of their future. In fact, it’s those crazy Boomers who are drinking the most these days. That is the real story behind the controversy over the “woke” marketing campaign by Anheuser-Busch: a major corporation is desperately trying to generate sales among the young demographic, and finding that its only customers are uptight old farts. “Anti-woke” alcoholism is for a generation that is currently in its sunset years.

It’s probably for the best that, in the long term, we are drinking less as a society. The harmful effects of alcohol, such as the health problems it creates and its contribution to car accidents and domestic violence, outweigh its benefits. Prohibition might not have worked (no one likes being told what to do and what not to do), but behavior can still change with time as beliefs and priorities change from generation to generation.

The question is, will this trend eventually reverse for future generations, in a future social era in which living for the present and taking chances with one’s health become fashionable once more? It’s hard to envision a completely alcohol-free future, given humanity’s long relationship with the pleasures and perils of consuming fermented beverages.

And Just Like That, He Was Overemployed Again


I hope the dear reader will excuse me for bragging, but I have to say that I am proud of myself for finding work again so quickly. My last day at my previous company was February 28th; I had an interview with my new company on March 1st, during which they made an offer. I accepted the position, which is 100% remote, with a start date of March 20.

There was a little tension as I waited for the background check to clear, as well as the drug test (!), which I haven’t had to take for a job in a long time. Meanwhile, I was “funemployed” for a couple of weeks, including a week which coincided with Aileen’s spring break from her University job. You would think I would have gotten a lot done, been super chill and relaxed all the time, but it didn’t seem to turn out that way. Aileen says I was a very cranky bear! All play and no work makes Steve a dull boy, I guess.

Once I was cleared and they shipped my equipment, it felt more like a sure thing, and I think my mood improved. It was kind of exciting to be doing a 100% remote onboarding, as this is my first time. I was feeling like I had mastered this new mode of remote work that has come with the 2020s, by proving I was able to switch jobs and stay remote. I was told to expect an email (to my personal account) early on Monday morning with login instructions for the work laptop. All I could do was set it up and wait over the weekend, which was filled with shows for the Independence Awards anyway.

Aileen made sure I got up early on Monday (thank you!) and sure enough, an email came just before 8 AM. I was able to log in, get oriented, and start meeting my colleagues and learning about the project(s) I will be on. In some ways, it’s the same as it ever was; it’s much the same kind of work, just with a different organization. This org, I will say, has embraced the remote work paradigm (as was explained to me during the interview), which partly explains how this all came together.

So now my days are filled with work once more, and then again my evenings, as high school theatre season is in full swing. When will I have time for books, games, and TV? (Aileen is laughing right now, because she knows how much of all those pastimes I squeeze into my waking moments). I’m very lucky, of course, that I have an email laptop job, which makes 100% remote work possible, and that I am able to work at this stage of my life, when I most need to be saving for retirement. Overemployed I may be, but life treats me well.

PS: Sashimi the cat is doing OK, eating well but she is very drooly.

2023 Update


The cat is eating much better, presumably thanks to the anti-inflammatory medicine she is taking. It must reduce the pain and irritation in her mouth. She still is eating mostly mushy food, though we have found that she has a fondness for ham, so she sometimes gets pieces of that to eat. She really likes ham, reminding me of Ponyo the way she tears into it.

Thanks to eating more, Sashimi has gained weight. But she still drools a lot, meaning she must still have that growth on her tongue. It is comforting, at least, to know she is not in danger of starvation.

Nor, it seems, are we in danger of income starvation, as I have been offered, and accepted, a new position. It is the same kind of work I always do (software testing), and it is a 100% remote position with a company in Minneapolis. Pretty excited to be onboarding 100% remote; that will be a new experience for me. Right now I feel like a remote-work champion.