
The Decade of My Childhood


I recently read The Seventies by Bruce J. Schulman, published in 2001, a history of the decade just after I was born. I call the 70s the decade of my childhood, not the decade of my youth, which I would say was the 80s. But my generation grew up fast, and I remember feeling all grown up in the late 70s, picking up on the free-wheeling energy of the times. Then again, possibly all teens feel this way.

Specific events of the Seventies that I remember include the Nixon Presidency – though I had only a vague sense that he was strongly disliked, since I was a very young child during his second term. I remember the Bicentennial and the Tall Ships, President Carter, the Three Mile Island disaster, the Soviet invasion of Afghanistan, and the Iran hostage crisis.

I remember disco, and even dancing to the contemporary disco hits when I was in junior high school. But I was really more into progressive rock, and nerdy stuff like Dungeons & Dragons (I started playing in 1979, even before the kids on Stranger Things). And I remember being infected by the rebellious, free-thinking spirit of the age.

Which takes me to what I thought was the most remarkable thing about this history of the decade, which is how well it aligns with what Strauss-Howe generational theory says about the time period. Any long-time follower of this blog knows how much credence I give to the generational approach. Schulman doesn’t specifically discuss generations, beyond acknowledging that the Baby Boomers were the young people during the 1970s. But a lot of his analysis fits with Strauss-Howe.

In Strauss-Howe theory, the 70s fit inside a social era they call an Awakening, whose dates they give as 1964-1984. Schulman also allows that the spirit of the 70s extended beyond the exact years of the decade, giving his boundaries as 1969-1984.

An Awakening era is characterized by spiritual fervor, new movements that question existing values and institutions, and a shift in focus from the collective to the individual, and from the public to the private. All of this is captured by Schulman, including how the 1970s saw increasing distrust in and revolt against government, culminating in the Reagan Revolution of the 1980s.

This book came out not long after Strauss and Howe published Generations and The Fourth Turning, but Schulman doesn’t seem to be aware of their existence. Which makes it all the more fascinating that he comes to the same conclusions as them.

My goodreads review follows.


The Seventies: The Great Shift in American Culture, Society, and Politics by Bruce J. Schulman

My rating: 5 of 5 stars

A definitive look at the decade of my childhood, the 1970s. Author Bruce J. Schulman, a professor of American history, is older than me, but not by much, and would also have lived through this decade during his youth. The book covers developments in politics and culture, and how the transformations of the 1970s led to the social regime current at the time of the book’s publication in 2001, just at the eve of the next transformative era.

The key points that Schulman repeats throughout his book are that the 1970s mark a shift in priority from society to the individual, and in trust from the public sphere to the private sphere. What’s fascinating to me is that this observation aligns with the framework of Strauss-Howe generational theory, including the time range he gives to what might be called the “long seventies,” 1969-1984. Schulman doesn’t explicitly discuss generational effects, except to acknowledge the existence and importance of the baby boomers (his own generation).

The book has nine chapters, covering a variety of topics. Nixon gets his own chapter, as his Presidency marks two important political turning points: the beginning of the disempowerment of the New Deal liberal establishment, and the planting of the seeds of deep public distrust in government that would blossom ten years later during the Reagan Revolution. Schulman makes a great point about Watergate: that the lesson Americans learned from it was not that Nixon was corrupt, but that all government was corrupt. One can easily see that this belief haunts us to this very day (I write this in 2025).

Trends covered in other chapters include the dawn of identity politics and the end of the 1960s-era dreams of integrationism, the shift in political and cultural power from the Northeastern United States to the South and West, the emergence of new styles of film and music, and the rise of new religious movements. The 1970s saw a relaxation of norms and standards and a turning away from traditional values, and a corresponding new brand of conservatism that developed in opposition to these trends.

As already noted, running through this decade was a society-wide movement away from the public and collective and toward the private and individual. It culminated in the Reagan re-election in 1984 and “Morning in America.” At that point the young generations had completed the “hippie to yuppie” transition. Business had replaced government as the trusted engine of productive achievement, and entrepreneurship had replaced political activism as the preferred mode of personal expression and agent of social change.

I was fascinated by Schulman’s claim that the 1970s have a lowly reputation as a dull and meaningless decade. Perhaps, living through it at his age, he recalls the disillusionment coming out of the previous socially charged 1960s. His attitude may be common for his generation; as a slightly younger Gen Xer, I have a warm nostalgia for the era that I think is more typical of my age cohort. Schulman clearly does have a personal relationship with the time period; this comes out the most in his write-up of the Punk and New Wave genres of rock music, which must have been his favorite growing up.

Overall, this is a nicely written and well-researched account of a social era, though I was a little annoyed that there was so much descriptive text in the end notes that I was constantly flipping back and forth between them and the main text. I suppose Schulman was trying to keep his narrative lean and on point, which he does achieve. A great read, and I do love how well this book aligns with my favorite generational theory.

View all my reviews


As American as Spooky Fun and Branded Merch


In a recent post, I praised the NFL for being woke by inviting a Spanish-language Puerto Rican rapper to headline the Super Bowl LX halftime show, and lambasted the MAGA reactionaries for throwing a hissy fit over it. I called out MAGA for wanting to bring the United States back to the white supremacy of what they think of as the “good old days” – Hispanics need not apply for the role of American.

In my argument, I brought up academic Michael Lind’s idea of how the United States has gone through periodic redefinitions of itself as a nation. As part of that evolution, Lind recognized the emergence of four cultural mainstays of our national identity: baseball, American football, Thanksgiving, and our unique way of celebrating Christmas.

It is because of football’s iconic status as an American pastime that it is so meaningful that the NFL made its gesture of inclusivity to Hispanic-Americans. By the same logic, this is why the gesture upsets MAGA partisans. Personally, I commend the NFL, and that’s all I have to say about that in this post.

Next, I wanted to speculate on what new cultural elements might now be considered essentially American, given the progress of recent decades.

In the realm of professional sports, surely we would have to add basketball. It is more popular than baseball now. It was propelled to international fame by the wild success of the Chicago Bulls in the 1990s, whose star player Michael Jordan is now a multi-billionaire. And college basketball’s “March Madness” NCAA tournament has been a staple of office betting pools for at least twenty years that I can remember.

I would also add the blockbuster film franchises that have emanated from Hollywood, and which also have global reach. They may be repetitive, each movie following the same formula as the last one, but that’s kind of how audiences want them. They are like a fast food version of entertainment – you know what you are going to get. Based on box office alone, the really big franchises are Star Wars and Marvel, and it was smart of Disney to buy them up, as the luster has come off of its original fairy-tale inspired brand.

For a new essentially American holiday, I nominate Halloween.

Our front porch this Halloween

Halloween, or All Hallows Eve, is one of those Christianized pagan holidays dating back to the Middle Ages. It is connected with the Celtic festival of Samhain, which marks the end of the harvest season, and came to the United States via Irish and Scottish immigrants in the 19th century.

By the early 20th century, familiar Halloween traditions such as parties, costumes, and trick-or-treating had developed in the U.S. But it was really with the post-WWII baby boom and the rise of suburbia that you started to see the annual spectacle of hordes of kids in costumes swarming neighborhoods on Halloween night.

Each postwar generation has had their own special experience of this holiday. Boomers were there at the inception of the modern mode of celebration. They were trick-or-treating in an era when the suburbs were safe enough for kids to wander unsupervised, and to prank middle-class homes without risking being shot to death. From their childhood comes the sentimental imagery of Linus from Peanuts waiting for the Great Pumpkin.

The Boomer childhood marks the rise of a Halloween costume industry, in parallel with the rise of television, as children wanted to dress as their favorite TV characters. Costumes then, and going into the era of my generation’s childhood (that would be Gen X), were cheaply made, and featured plastic masks and vinyl coveralls you wore over your clothes. They seem chintzy, even bizarre, in retrospect, but how could any Gen Xer like me look back at images of those days and not feel a twinge of nostalgia? Here’s a fantastic archive of these photos: Vintage Halloween Pictures of Generation X.

Those old costume companies have all gone out of business, replaced by the monolithic Spirit Halloween, whose retail outlets spring up perennially all around the nation each October. Meanwhile, the amount of pop culture intellectual property available as merchandise has exploded, with new icons being created each year (anyone dressing as a KPop Demon Hunter?). The industry is huge, set to reach new spending records this year.

In the lifetime of Millennials, Halloween has also grown into a celebration for adults, with new expectations. As the movie Mean Girls put it, it’s the one night a year when a girl can dress like a complete slut and not be judged for it. Any costume, apparently, can be made sexy with a little effort.

A more wholesome trend is the family Halloween costume cosplay, reflecting society’s growing family focus over the decades since Millennials started being born. In photos shared each year on social media, the young post-Millennial generation is enfolded into the holiday tradition with joy and creative spirit.

Halloween is so big now, I don’t see how it doesn’t have equal stature with Thanksgiving and Christmas. These three holidays together, coming at the end of every year, are part of the ineluctable rhythm of American life. Yeah, they’re highly commercialized. The way we celebrate them is unsophisticated, often to the point of complete kitsch.

That just makes them all the more American.

Reflections at the End of the World


Once more I am on the job hunt. So far, it doesn’t seem much different than in the past. There are positions out there for which I am qualified. I have registered successfully for unemployment compensation, as I have many times before in my life.

In my job searches, I am limiting myself to remote work. The convenience of it is too much to give up, if I don’t have to, and so far there have been multiple remote positions to apply for. But obviously, if the search drags on, I will have to cast my net wider and consider going back to commuting – perhaps only on certain days of the week, in the now common “hybrid” mode which combines working from home with working on site. The point is, I might not be so lucky as I was a couple years ago, when I easily transitioned from one remote job to another.

I worry, actually, that I might be really unlucky at this juncture. With insano-fascist-guy upending the American economy via his unhinged policies, the job market is in trouble. Companies facing the uncertainty of the times are freezing hiring. We might even be heading into a recession. I’m almost 60 years old, not a good age to have my career suddenly stalled.

I recently wrote on my substack about the problem of “gerontocracy” – our political leadership skewing older than the population, and therefore being out of touch with the needs of the American people. I made the point that this is a bigger problem for Democrats than for Republicans, and helps explain the Republican rise to power. Currently my generation, the middle-aged generation, is the primary source of Trump support. Democrats are either the older generations on their way out, or younger Millennials.

A generational shift in power is surely going to be a fallout of this tumultuous social era. I can see – in the long term – MAGA burning out and the Millennials taking over with a more progressive agenda. At that point, Gen-X will be sidelined. With the demographic collapse making Social Security less sustainable, we will probably also be impoverished. There won’t be a lot of sympathy for us, especially if we’re seen as the ones responsible for the worst of what is to come.

Those are some depressing thoughts, I know. It’s just where my head is at right now. It would be just like my generation to grow old just as the gerontocracy was being eliminated. Another boat missed.

I’m going to take advantage of the time I have been given to do more writing. Maybe some political activism.

And I would appreciate more subscribers to my Substack if you can: https://stevebarrera.substack.com/

Peace out.

Yes, the Boomers Did That


I was struck by this post on Thomas P. M. Barnett’s Global Throughlines substack, because of how well it dovetails with generations theory: [POST] Conjoined at birth, separated by death?

Here’s an excerpt so you can see what I mean:

In my reading, the Greatest Generation’s influence peaked around 1965 and began its slow surrender to the whims and desires of the Boomers from that point on. So, yeah, while we can say that the Boomers came of age during a period of immense social change, driving movements such as civil rights, anti-war protests, and shifts in cultural norms (e.g., feminism, environmentalism), they didn’t really dominate the workforce and political elite until the early 1980s (the shift from Hippies to so-called Yuppies).

Again, the Boomers were blessed from the start, growing up in a time of American confidence and expansion. Like most would-be “revolutionary” generations, after the raucous efforts of their youth, the real talent went into business and technology and the leftovers went into politics. As such, the Boomers remade our world with the information revolution but passed very little meaningful legislation, instead succumbing to a bizarrely counter-productive ideological polarization as they aged.

The Boomers accumulated vast wealth, but also presided over growing income inequality. Their consolidation of political power and economic resources created serious barriers for younger generations, and has contributed to a serious decline in our nation’s institutions — almost all of which now are held in low esteem as the Boomers begin to depart.

The Boomers’ long reign likewise distorted America’s perceptions and understanding of the Greatest Generation’s greatest legacy — namely, the international liberal trade order we now call globalization.

I see these parallels all the time with geostrategists like Barnett. They are pattern-seekers and they inevitably detect the same pattern of generational change found in Strauss-Howe generational theory. Barnett doesn’t see it in Strauss-Howe terms (that there are generational archetypes and cycles), but he does get that Boomers (Prophet archetype) grew up in post-war prosperity, and that they attacked the institutions established by the Greatest Generation (Hero archetype) in their youth (1960s), and dismantled them when they came into political power (now).

Different thinkers might come up with different explanations for these kinds of sociological patterns, but the fact that they all see similar ones tells me they are looking at something real.

Barnett laments what Boomers have done to the international liberal trade order, but it’s hardly surprising – they’ve been bitching about it for decades, at least in their right-wing camp. If you read Barnett’s substack posts, you will see that he sees some good in this destruction, as it pushes us along to needed reforms in the world order. And, of course, the destruction and recreation of the world order is exactly what Strauss and Howe predict as part of their “Fourth Turning.”

I will conclude this post by noting that I plan to start writing more on substack; I will probably post all my social commentary there, and only post more personal or lighter fare on this blog. Here is my substack link for some teaser posts I already have out there: https://stevebarrera.substack.com/

Happy New Year 2025 Generations


One of my New Year’s traditions is posting a list of the ages of the current living generations in the United States.

Arguably, on December 31st, everyone has had their birthday for the year. If generations are defined by birth year boundaries, then each generation fits neatly into an age bracket on that day (just ignore time zones, please). I use the birth years defined by Strauss-Howe generational theory, which gives us this age breakdown:

  • GI or Greatest Generation (b.1901-1924): 100+ years old
  • Silent Generation (b.1925-1942): 82-99 years old
  • Boomer Generation (b.1943-1960): 64-81 years old
  • Generation X (b.1961-1981): 43-63 years old
  • Millennial Generation (b.1982-2004): 20-42 years old
  • Homeland Generation (b.2005-20??): 0-19 years old
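The bracket arithmetic behind that list is simple enough to sketch in a few lines of Python (not part of the original post — the generation names and birth years are the Strauss-Howe ones from the list above, and `age_bracket` is just an illustrative helper):

```python
# Strauss-Howe birth year boundaries (first birth year, last birth year).
GENERATIONS = {
    "GI (Greatest)": (1901, 1924),
    "Silent": (1925, 1942),
    "Boomer": (1943, 1960),
    "Generation X": (1961, 1981),
    "Millennial": (1982, 2004),
}

def age_bracket(birth_start, birth_end, year=2024):
    """Youngest and oldest ages on Dec 31 of `year`, once everyone
    has had their birthday for the year."""
    return year - birth_end, year - birth_start

for name, (start, end) in GENERATIONS.items():
    youngest, oldest = age_bracket(start, end)
    print(f"{name}: {youngest}-{oldest} years old")
```

For example, the Silent Generation (b. 1925-1942) comes out to 82-99 years old on December 31, 2024, matching the list.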

All living members of the GI (or Greatest) Generation are now centenarians, a fact underscored by the death on December 29 of former US President Jimmy Carter at age 100. His generation will still be with us for years to come, as we always have a few people alive who are supercentenarians (110+). As I write this, the oldest living American is 114 years old. So if just one 100-year-old alive today makes it to that age, there will still be living members of the Greatest Generation in 2038.

Each generation’s age bracket currently lines up well with a phase of life. Meaning, Millennials fill the age bracket corresponding to young adulthood (21-41 by Strauss-Howe reckoning), Gen Xers the bracket corresponding to midlife (42-62), and so forth. This means we should be close to the end of the current social era, the Fourth Turning or Crisis Era. In the next era, the First Turning of the new saeculum, the generations will be aging into their new life phases (Millennials will become midlifers, Gen Xers will become elders, etc.).

This Crisis Era has been dragging on, probably because of the influence of the Silent generation, which is holding back change. They are just on the edge of leaving elderhood (63-83) but still in power; President Biden is a member of the Silent Generation, for example. You could think of it as the long shadow cast by the last generations that were alive in World War II, whose legacy defines the postwar order which is now coming to an end.

As the Silents age out of public life in the near future, we will lurch our way to the end of this era and into the next saeculum (the true New World Order), however chaotically and however painfully. The inexorable logic of time and generational change demands it.

Congratulations, living generations, you made it through 2024!

Good luck in 2025!

This Confounded State We’re In


One type of post I’ve made a lot on this blog is the “strategy review,” where I either review a theory of social and political change, or examine current events through the lens of such theories. Considering recent historical developments, I feel like it’s time for another one.


Over the years, I’ve gotten a lot of traction on this blog out of Philip Bobbitt’s concept of the “market state” – a new constitutional order which he theorized was forming in the wake of America’s Cold War victory. In his framework, this was caused by changes in the security environment. With the ideological conflicts of the World Wars to Cold War era resolved, and free market capitalism ascendant, the state no longer derived legitimacy from controlling the economy and maximizing benefits to its citizens, in competition with other economic systems. Instead, its purpose was to keep its citizens safe and free markets functioning, to maximize economic opportunity.

This jibes with what other strategists, like Thomas P.M. Barnett and Peter Zeihan, have identified as the grand bargain the United States made with the world after WWII: we opened up our vast consumer market and invited other countries to embrace free trade, in return for which we stood as a bulwark against the Soviet bloc. Then we simply outlasted the Communists’ failure of an economic system. With Great Power conventional warfare a bygone in the nuclear age (the MAD doctrine), Pax Americana reigned over the Earth. Some even called it “the end of history.”

Things got messy after 9/11. It seemed history wasn’t interested in ending after all. The way Bobbitt understood it, in terms of his market state theory, is that in the new security environment, the threat wasn’t other nations making war on the West. Instead, it was transnational organizations taking advantage of the open networks of market state societies to infiltrate and cause harm – the 9/11 terror attacks being a spectacularly dramatic example. The point is, the market state had to adapt and develop countermeasures against these threats, with minimal reduction of economic opportunity for its subjects: that would be the test of its legitimacy.

The War on Terror and nation-building efforts in Afghanistan and Iraq could be thought of as the emerging market state’s efforts to assert just such legitimacy, led by the hegemonic “sole superpower” United States. We would just reformat failed states and turn them into free market democracies like us, with a few tricks (like Guantanamo Bay) to get around any legal concerns. It ultimately didn’t turn out so well, and we gave up after the Bush era, but arguably there were a lot of lessons learned about the shape of modern warfare that carry forward to this day (send in the drones!).

I’ve argued in other posts on this blog that what Bobbitt calls the “market state” is really just the zeitgeist of the late twentieth and early twenty-first centuries – an inner-driven, individualistic, commerce-minded social era. It was the age of neoliberalism, brought on by the Reagan revolution: a regime of free market principles aggressively pursued by government, on a global scale. The term “neoliberalism” is a bit fuzzy, and generally is used in the pejorative these days. Ever since the Great Financial Crisis of 2008, there’s been kind of a consensus that neoliberalism was a bad idea, that it wrecked the middle class, and that we need to turn away from it, and from globalization in general.

In other words, what could be called the “neoliberal market state” was a creature of a relatively prosperous and stable era, when it was conceivable to have faith in markets and be comfortable with low regulation and an open, globalizing society. It wasn’t the end of history so much as a reprieve, during which the United States basked in its Cold War victory and enjoyed peak global hegemony. But the mood has shifted now. The public clamors for a more closed and orderly society, and a retreat from global affairs, which every President since Obama has provided.

This takes me to the recent Presidential election and the curious return of Donald Trump. Didn’t the people know that Biden-Harris was rolling back neoliberalism already, and was the best bet for the middle class? That Trump’s plans to cut taxes on the rich and impose tariffs on imports would hurt ordinary consumers? That his administration will deregulate capitalism to the benefit of the very wealthy, one of the hallmarks of the neoliberal regime we are supposedly rejecting? So why did they vote for him?

The election result could just be attributed to the incumbent-punishing effects of seething populism: everything sucks, and heads must roll! Alternately, the market state viewpoint might offer another explanation: informational warfare.

What I mean is, in the new constitutional order of the market state, the citizen is primarily a consumer. That includes being a consumer of media; that is, of information. In our somewhat free-for-all media environment, dominated by social networking sites, consumer-citizens tend to get pulled into either of two media bubbles, each one replete with the messaging of one of the two political factions vying for control of the government. It’s like two different versions of reality fighting for control over the minds of the masses. I’ve described this before as the “red-blue wars.”

It seems that in the recent skirmish that was the 2024 election, the red zone faction prevailed on the information warfare front. I have read post-mortem posts (there were so many this year!) that state just as much. The red zone faction simply has a more robust media ecosystem, which gives it a significant advantage. And, as I’ve noted before, they might also have more “group feeling,” or solidarity of purpose – another advantage.

But here’s another way to think about information war: it could be waged from outside! Meaning that, with the open and global nature of the Internet, “bad actors” who are not subjects of your government can infiltrate your media networks and influence your elections. This is a true test of the market state’s ability to sustain itself – is it even possible to govern at all in a wide-open society?

You might recall that this was the big story after the 2016 election: it was a successful Russian cyberwarfare operation, as Timothy Snyder bluntly put it. It was the first step to installing a Russian-style oligarchy in the U.S., and it seems like the 2024 election might be the last. In this interpretation, it wasn’t that the blue zone lost to the red zone. Instead, the United States lost to a foreign adversary, and was defeated in a market state war. The Russians outlasted us in the end, and we became like them!

I used to joke, during Trump’s first term, that we were transitioning from the “market state” to the “mafia state.” It doesn’t seem so funny now. The U.S. Constitution, stressed by decades of partisan gridlock, is fragile and might not survive a second Trump Presidency. He has no respect for the rule of law, and is enabled by cronies in the other branches of government. So it looks like we might end up with an entrenched criminal oligarchy. The only hope I have is that Trump is unfocused and distractible. But, as Tom Waits puts it, if you live in hope, you’re dancing to a terrible tune.

Arguably, “change voters” who put Trump in office this cycle were hoping for some kind of shake up that would at least put us on the path to fixing our broken system. That’s the only credit I can give them. But what will replace the market state that ostensibly has been trying to emerge these past decades? Trump’s cabinet of media personalities and tech bros are like a perverse enshrinement of the Reagan revolution – conservative pundits and Ayn Rand aficionados large and in charge. Isn’t that embracing the neoliberal market state?

Well, no, since the new regime promises to pull back from free trade, globalization, and military interventionism – all hallmarks of the neoliberal order. And the oligarchs at the top of the economic pyramid, like Bezos and Musk, are not interested in free markets. They want monopoly power, and the new administration will surely not stand in their way. It really is looking like we are reverting to isolationism and the rule of robber barons – because, you know, things were so great during the Gilded Age in the 19th century.

Were voters not aware that this was the future they were choosing? I mean, isn’t MAGA supposed to be a populist movement? Why did it put oligarchs in power? That’s where the idea of rightwing propagandists scoring an information warfare victory applies. Democracy is the tyranny of the uninformed.

Alternately, maybe MAGAs did intentionally vote for this bleak new order. Snyder has invented a term for this type of regime: sadopopulism. This is a kind of government that inflicts harm, but then deflects blame to stay in power. Certainly on brand for Trump. MAGA voters might be willing to suffer, so long as other people that they blame for their woes (immigrants, queers) suffer even more.

An even bleaker prospect: MAGA is an alliance between criminal oligarchy and a vicious backlash from social conservatives against the multiculturalism of the post-1960s era. It wants to replace the market state with a new version of the nation state that yokes powerful business interests to White Christian nationalism. If the nation state was legitimate because it looked out for the people’s welfare, then the Trumpian White Christian nation state is legitimate (in some people’s minds) because it looks out specifically for white Christians – maintaining their privilege over the rest of society.

At what point do we just go ahead and call it fascism?

If a MAGA takeover is resisted, it might only be because our judicial system allows that, in the “emerging market state” in the United States, consumer-citizens are empowered to define at the state level what their particular constitutional rights are. So states that are in the blue zone could reject White Christian nationalism, and institutionalize rights according to blue zone values – obvious examples being abortion access or sanctuary for immigrants.

This would amount to a fractionalizing of the U.S. along red zone-blue zone lines, which sounds quite plausible in today’s political environment. The problem with this, which Bobbitt himself has reflected on, is that it goes against the 14th Amendment’s guarantee of equal rights for all citizens under federal law. This may well be the direction in which our state is evolving. For many citizens of the United States, that would be a human rights disaster. There are already women dying in red states from lack of reproductive healthcare, and God help us if deportation camps become a reality.

Another problem with fractionalizing along red zone-blue zone lines is that it denies the United States a national identity. Can we then truly be a nation? Each side in our partisan conflict has a different vision of how our national identity should be defined. The red zone’s vision is exclusive and looks backwards in time, while the blue zone’s vision is inclusive and confronts the realities of today’s world. Obviously, I favor the latter vision. But until the conflict is resolved, one way or another, the definition of our national identity – and with it our understanding of what makes government legitimate – will be unclear. Until then, we can only keep dancing to that terrible tune.


Well, there you have it. Another long post that probably overthinks the politics of our time by trying to force fit it into theoretical frameworks. I mean, is “information warfare” really a feature unique to the new “market state” of the 21st century? Wasn’t propaganda a big part of the political struggles and wars of the 20th century as well? Haven’t other societies faced political conflict with an ideological dimension, where persuasion and the spread of ideas was a factor – for example, the Religious Wars of the 16th century, or the Enlightenment Era Revolutions of the late 18th century?

Theories are useful for making sense of events and for structuring narratives, but might also impose limitations on our thinking. And while the past can inform us of what is possible, it cannot be a perfect guide to the future. Ultimately, the shape of things to come is determined by our unique choices, based on our needs and perspectives, in our specific location in history. Whatever version of “the state” is coming into being, and whatever name we give it, it will be one that makes sense to today’s living generations.

All I know for sure is that everyone is getting a copy of this book in their stocking this Christmas:

An Age without Empathy


As I write this, authorities have just arrested a person of interest in the case of the “Healthcare assassin,” who murdered a CEO on his way to an investor meeting. This guy, if it is him, has been treated by the public like a folk hero. I’m sure you’ve seen the memes. People really hate the healthcare system in the United States.

The public reaction recalled my takeaway from this statement in an article I linked to in my election post-mortem post:

the second wave of newly aging-in Trump voters entered adulthood… hoping only to grind out a living through scams. But this is fundamentally an anti-social and anti-humanist mode of economic activity that contributes nothing to society and offers nothing but alienation to its victims. The result is people willing to vote for someone they know will cause immense harm to others, hoping it will help them personally.

As I put it, voters tapped into their inner Joker and embraced the breakdown of the society. This latest incident certainly supports that idea: if we can’t reform healthcare by legal means, well…shall we say the Purge is underway?

I will point out that insensitivity about the death of the rich has already been on display, during an earlier story that took place before the election. I’m referring to the submersible that imploded while taking some wealthy clients on a tour. There wasn’t much sympathy for them, either, and they were just some folks out on a lark, not supervillain-esque corporate executives on their way to plot how to ensure that the maximum proportion of a firm’s revenues went to its shareholders and not its customers.

A mural in Seattle, made after the OceanGate Titan implosion

Celebrating someone’s death is pretty harsh. Is Trump’s reelection making us all worse as a society, or is it that we’ve become less civil, making Trump’s rise possible? Arguably, Trump’s election win simply exposed us for the uncivil society that we’ve already become. I’m sure the two phenomena feed back on each other, in a vicious cycle. This is how social moods are reinforced: by collective reactions to events.

Generations theory has its own take on why this is an age of callous attitudes and lack of sensitivity: it has to do with the archetypes of the generations that fill the adult age brackets. The last remaining “sensitive artist”-type generation is the Silent Generation, but they are very old now, and on their way out of public life. President Biden is from that generation, and his departure when his Presidency ends will likely mark the end of his generation’s influence.

The next generation to fit that archetype is the current child generation, the Homelanders. Not until they have come of age in significant numbers will we see the return of an attitude of empathy and humaneness. By then, we will have entered another social era.

Agile across the Generations


In a post last month I discussed the Agile method, and described an origin story for it. In my story, Agile was invented by a new generation of software developers for a new generation of software – the software being written in the fast-paced world of the networked personal computer. It started when an “Agile Manifesto” was declared in 2001, just as the dot-com boom was ending, after the software world had experienced a couple of decades of rapid growth amidst a profound shift in work patterns. A rising young generation (my own, Generation X) moved freely from job to job, eschewing loyalty to the company in favor of careers as “perma-temps.” Some system was needed to manage the frenetic chaos of this new working environment, and that’s where Agile came in.

This surely is a simplification and possibly off the mark. After all, innovation in workflow management precedes the Agile manifesto by generations. It has been a part of the evolution of the modern corporation for more than a century, going back at least to Taylorism and scientific management. Agile fits in with other conceptualizations of “lightweight” or “lean” approaches to project management, meant to avoid bogging everyone down with process and minutiae, and with earlier iterative development methodologies. These came about long before my generation was in the workforce.

My origin story came about because the Agile methodology strikingly fits the peer personalities of the generations who invented it – Baby Boomers and Generation X. If you look up the signatories of the Agile Manifesto, almost all of them are from those two generations, which constituted the bulk of the workforce at the time (Millennials were only just graduating from high school). These are both generations known for individualism, for focus on the self and the personal, and for short-term thinking. It makes sense that they would embrace a work methodology that emphasizes individuals over process, and adaptability over planning.

The very name “Agile” evokes the ideas of speed and flexibility, qualities which align with my generation’s reputation. Also aligning with Generation X is Agile’s way of defining success as developing software that works, not necessarily software that is perfectly crafted or meticulously documented. “Git-R-Done!” or “Just Do It!” as a Gen Xer might say. Or how about the Agile sub-type known as “extreme programming,” a hyper-focused iterative approach with very short cycles? What could be more Gen X than that?

My point is that this methodology was primed for the workforce of the time – a workforce consisting of young adult Gen Xers, managed by middle-aged Boomers. The hyper-focused individualists were doing the work while the visionaries were directing them. Agile, in theory, was a mindset, a whole philosophy of managing work in a fast-paced world. So long as everyone was not worried too much about following a fixed process or plan, but instead was adaptable and constantly communicating, much could be accomplished.

Contrast this with Six Sigma, a methodology that came from the Silent Generation when they were the middle-aged managers of young adult Boomers. This faultfinding approach, which uses statistical methods to eliminate defects in processes, suits the Silent Generation’s reputation for fine-tuning expertise, as well as the Boomer Generation’s reputation for perfectionism.

Now what about Agile in the workforce today? It’s been over twenty years since the manifesto was published, and now it’s Gen Xers who are the middle-aged managers and Millennials who are the young adult workers. Does the Agile methodology suit a generation known more for hivemind thinking than for focused individualism? I think it does, though maybe not in exactly the way it was originally envisioned.

I have been using Agile at work for the better part of the last ten years, at all three of my most recent software development jobs. In my experience, the ideal of the “Agile mindset” doesn’t really stick. It’s fine to have an overall philosophy of work, but actually getting people to adopt a specific mindset requires coaching and attention, not simply declaring a vision. What does stick easily about Agile is the framework of dividing the work into short sprints and keeping the team aligned, using regular meetings (such as a daily scrum or stand up) and a system for tracking the work (such as user stories on a storyboard).

I think the structure provided by this framework is a good fit for the peer personality of the Millennial generation, who do best in an orderly work environment with clearly set expectations. They like to be given a well-defined task and rewarded for accomplishing it. A little praise and gratitude will do. They even get FOMO when they don’t have a specific assignment, which is understandable as it might be a sign that their position isn’t needed any longer.

Even as Agile methodology supplies structure, the short duration of the sprints and the iterative workflow continue to provide the benefits of flexibility as project priorities and personnel shift about. A plethora of practices and sub-methods has evolved out of the original idea, giving Gen X and Elder Millennial managers plenty of ways to tinker with the methodology to find the best fit for their teams.

It’s worth noting that there are limitations that come about when you have structure. If everything has to be tracked, work might not get done if no one remembers to track it. If expectations are clear, there might not be much motivation to go beyond expectations. A well ordered framework for defining and assigning work might be easy to navigate, but it can also foster complacency. No one is likely to go above and beyond, if there doesn’t seem to be any particular reward for doing so, and if doing so risks ruffling feathers by disrupting the expected workflow.

Continuing the story of Agile, it might be that what started as a methodology for producing results in a fast-paced environment has evolved into a methodology for governing work in an orderly manner, such that everyone can function in a well-defined role. That’s what my experience shows. Agile might not be as versatile in practice as it was originally envisioned to be, but it’s still a useful tool for keeping teams aligned and productive.

I do sometimes hear an old Gen Xer on a team complain that “we’re not practicing true Agile,” but I just think, “so what?” We’re getting stuff done (hopefully), and keeping tabs on it. That’s enough.

As far as I can tell, Agile, at least in name, is here to stay. The concept is entrenched in the Information Technology workplace, and will certainly outlast my career, which has not much more than a decade to go. Ten years from now the generation that comes after Millennials, the Homeland Generation, will fill the twenty-something age bracket and constitute the workforce’s youngest cohorts. I wonder what further evolution of the Agile method might come along with them.

How We Got Agile: An Origin Story


My old copy of “the mythical man-month”

When I was a young man, a college student in the Computer Science program at Virginia Polytechnic Institute, we were assigned a book to read. It was called The Mythical Man-Month, by Frederick P. Brooks, Jr., and I still have my copy from the 1980s. The point of the book, which you might be able to glean from the title, is that you can’t simply measure work effort in “man-months,” on a scale such that you could conceivably get more work done by adding more people to a project. As an example, you couldn’t say that a project has a work effort of 120 man-months, meaning that with 10 men it will take 12 months to finish, and therefore with 20 men it will be done in 6 months.

If you had 10 men working on this hypothetical project, and added 10 more, you would not find that it completed 6 months sooner. It would, in fact, take longer than 12 months. The problem is, as you add more men (people) to a project, you need time to get new hires ramped up to where they understand the project well enough to be productive. You also multiply the lines of communication, which generates additional overhead in keeping everyone in sync on the specific information needed to make interacting components work together. In engineering, these pieces of information are called “specifications,” and they have to be tracked somehow. If you add more people to a technical project, you add more tracking effort. These complications are summarized in Brooks’s law: “Adding manpower to a late software project makes it later.”
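The communication overhead Brooks describes grows quadratically, not linearly: a team of n people has n(n-1)/2 possible pairwise channels. A quick sketch (my own illustration, not from the book) makes the point:

```python
def channels(team_size: int) -> int:
    """Number of pairwise communication channels in a team of the given size."""
    return team_size * (team_size - 1) // 2

# Doubling the team from 10 to 20 people more than
# quadruples the number of channels to keep in sync.
print(channels(10))  # 45
print(channels(20))  # 190
```

So while the doubled team’s nominal capacity only doubles, its coordination burden grows much faster – one reason the added manpower doesn’t translate into proportionally added progress.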

As a software engineer in the early 21st century, it fascinates me to read the author’s description of how specifications were tracked on the project he worked on – the IBM System/360 – in the 1960s. They had huge manuals kept in binders, and as changes were made, the responsible engineers would have to go in to the binders and update the appropriate pages – that is, take out the old pages and insert the new ones with the changed specs. This manual was the Bible of the system, and keeping it up to date was absolutely vital to the success of the project.

Modern day software engineers like me are not used to such meticulously maintained documentation. We consider ourselves lucky if there is any documentation at all for the software on which we are working. You’d think it would be easier, now that everything can be done online, but projects move too fast and the people working on them move around too much. No one is necessarily going to stay on top of documentation, and so long as software works as expected, that’s fine. It’s when it doesn’t work that you run into trouble.

Because personnel move around so frequently in the modern workforce, there is rarely anyone working on a software program who was there when it was originally programmed. But programmers still need to maintain it. Sometimes we are given requirements to modify existing software that has no documentation, with no one around who knows anything about it, and the only way to achieve that goal is through “reverse engineering.” This means poring over old code and documenting it from scratch, which is very time consuming. This underscores the point about the man-month: you can’t just insert a person into a project and expect them to get as much done in a given amount of time as a previous person on the project did. Certainly not if they are going to be reverse engineering the previous person’s work.

Since the start of the personal computing era and the long economic boom of the late 20th and early 21st centuries, computer software has been advancing at a faster pace than it did when Frederick P. Brooks worked as an engineer at IBM. The workforce has changed as well, with employees typically job hopping every few years, and often working as contractors through agencies rather than directly for the client that owns the software they are developing. So how do the software engineers of my generation handle project management in such a chaotic work environment?

The answer is “Agile” methodology, which came about around the start of this century. Agile is a lean or lightweight software development method that emphasizes individuals collaborating over plans and processes, and defines good software as software that works, not necessarily software that is well documented. At least, that’s the declaration in a famous “Manifesto for Agile Software Development” that was published in 2001.

The idea is that “Agile” is a mindset where you are focused as a team on communication and collaboration, continuous improvement, and responsiveness to change. In practice, it means breaking up the project work into short iterations called “sprints,” which typically last two weeks. Everyone’s tasks for the sprint are things that shouldn’t take more than a couple of weeks to finish. So right there the idea of a “man-month” is out; no one would work on one thing for a whole month!

Breaking the project work into chunks like this makes it easier to show progress, to evaluate how effective the team is from sprint to sprint, and to change processes and workflows as needed. It also makes it easier to accommodate personnel shifting around from project to project. It’s a way of coping with today’s volatile workplace, which makes long term planning harder to achieve. A whole panoply of “frameworks” and “ceremonies” has developed around the original concept since it was first elucidated.

If you are in a white collar profession (not even necessarily Information Technology) you might have experience with Agile-related frameworks in your career. I was first exposed to Agile in the late 2000s, and have been at positions where it is used comprehensively since 2018. Every company does it a little differently, but I have always found it to be a useful way to structure project work.

The way I see it, Agile came about because a new generation of software engineers needed to adapt to a faster pace of work than what the generation of Frederick P. Brooks experienced in their careers. They needed to find their own solution to the problem of how to get people to work effectively when they are added, out of the blue, to a new project. If you look at the signatories of the 2001 Agile Manifesto, you will see that they are almost entirely Baby Boomers and Gen Xers. Today’s Millennials and Gen Zers in the IT workforce have possibly never worked on a project that wasn’t using Agile.

I’ll have more to say about the different generations and Agile in a future post.

Emojis at Work – How Social Media Infiltrated the Workplace


I still remember the excitement when the first iPhone came out in 2007; only a few people were using this new kind of mobile phone, but boy were they delighted with it. At the same time, everyone was jumping onto Facebook, which had just opened up to the general public in 2006.

Fast forward to a decade and a half later, and everyone has a touchscreen phone (I got my first one in 2014). Social media platforms have proliferated, and are a constant, pervasive feature of daily life.

Once, employers tried to prevent workers from browsing the Internet during the day, but such efforts have been abandoned. Everyone is on their phone all the time. In fact, the software used to officially collaborate in the workplace looks a lot like the apps we use in our personal lives.

At least, that’s been my experience as a white collar professional in a cubicle environment. I’m a middle-aged GenXer, and my career is split pretty evenly between the world before social media, and the world after. I’ll explore what that’s been like for me a little more in this post.


I joined Facebook in 2008, because all of my coworkers were doing it and I didn’t want to be left out. It was a clear case of FOMO (Fear Of Missing Out), a term that had recently been introduced to explain how social networks stimulate compulsive following and mimicking behavior. I friended all of my coworkers, and had fun setting up my profile and exploring the site.

Do you remember those days, and how primitive the Facebook interface was compared to today? Your main profile page was called your “wall” and to post on it, you “updated your status.” If you look back at your posts from fifteen years ago, you’ll see how different they were. They seem kind of awkward and tentative, like we all didn’t quite know what to do with this new way of communicating.

Back then, there was a site called “Please Rob Me” that tried to raise awareness about the dangers of sharing the fact that you weren’t home, like someone was wondering how anyone could be stupid enough to do that. The site is defunct now, and today it is routine for people to tag their locations when they go out, even though we all know we’re giving valuable information away to giant corporations (the ones who are really robbing us).

Back then, as employees found themselves sucked into their Facebook feeds, companies started blocking the website on their corporate networks. They established policies about what employees were allowed to post on social media platforms, warning them against representing the company or divulging corporate information.

In the late 2000s, the world was just getting used to social media, and its implications. Today, a decade and a half later, social media is routine in our daily lives. Everyone accesses social media platforms from their smartphones on a more or less continuous basis, even while at work, and employers have no chance of stopping them.

One thing I’ve decided since those early days is that it is best to keep my work life and my personal life separated, where social media is concerned. I no longer send Facebook friend requests to my coworkers, as I did back when I first joined the site. But that’s just how I personally manage my online presence. For other people, depending on their line of work, it might be better or even necessary to network and post about work across all social media, and have less of a distinction between personal and professional social spaces.

A clever post about work I made on a social media app

That’s not to say that social media isn’t a part of my work life at all. There are, as you well know, work-specific social media sites, such as LinkedIn, where I do make sure to connect with my coworkers. The Intranet at the company where I work uses software that has posts and feeds that resemble those on any other social media platform, and while I’m not particularly active there I do browse, to get a feel for the corporate culture.

I also sometimes post about work on my personal social media accounts, but in a sly way. I don’t want to reveal where I work, but just say something about work that’s clever, maybe even vaguely subversive, hoping for likes and shares. I’ve included an example screenshot in this blog post. You can see that I got zero engagement.

Social media conventions have infiltrated workaday tasks as well, such as in the use of emojis and reactions in online conversations. I have long been using messaging software in the workplace; I remember Skype being in place in the office back in the early 2000s. I also remember that, as emojis started coming into use in personal messaging, I was hesitant at first to include them in work conversations. It just seemed somehow unprofessional to use a smiley face in a work related chat.

But, in time, it simply became a norm. On the messaging software I use at work now, there are emoji reaction options, and my coworkers and I routinely “thumb up” and “heart” one another’s messages. It’s just a way of signalling agreement or showing appreciation. Workplace communication has become more friendly and informal than in the past, and I think this reflects a preferred mode for today’s mostly Gen X and Millennial workers.

For me, a Gen Xer who adopted less formal modes of communication in the latter portion of his career, it’s been an adjustment. But for many of my coworkers, who are Millennials twenty or thirty years younger than I am, it must just seem like the normal way people communicate in the digital space. For Boomers, experiencing these changes at the tail ends of their careers, it might seem too informal or alien to their expectations.

I suppose I shouldn’t speak for others, especially if they are from a different generation. These are just my thoughts on the matter. There’s no denying that the proliferation of smartphones, along with ubiquitous access to the Internet and its software platforms, has changed our daily routines, including our work routines. Please feel free to share your own experience in the comments below.