On Everyday Acts

Did you know that when you go to the store and pick an item off the shelf, choosing among the myriad options on display, you are performing a theatrical act? That you are putting on a little play?

Did you know that everything we do is a performance? Every choice we make and every action we take? It is just as the great bard wrote:

All the world’s a stage,
And all the men and women merely players;

William Shakespeare, As You Like It, Act 2, Scene 7

Let me explain.

In our ordinary state of consciousness, our actions are motivated by our personal agenda and our ego’s need to maintain its identity-image. Our identity-image is a persona that develops over the course of our lives. It is our self-image – “I’m a nice guy” or “I’m a tough guy” for example. It develops with conditioning as our experiences shape our habits of perception and behavior to form who we are as distinct individuals. In other words, we become a character.

Yes, like a character in a play. Do you see where I am going?

The conditioning that forms our character limits us, so that we think and act in specific ways, following a script. Just as a play has a script. The details of our script define who we are in society – our role. Just as each character in a play has a role. Our role determines how we dress, which is like our costume in the play, and what we own, which is like our stage props.

You know what I mean! People live according to their self-image. They dress a certain way and drive a certain car, and have particular beliefs and mannerisms. They fall into a pattern, and to come out of that pattern is difficult – it is very uncomfortable!

We refer to these patterns of dressing and acting and so forth as culture, and in our diverse society we even have sub-cultures, groups within the larger population whose identity-image conforms to some model. This enriches our collective lives, and also complicates our politics.

Some people are ostentatious about their self-image, like they are showing it off to those around them, who are the audience of their play. But even if a person is not ostentatious, even if they only want to be left alone and don’t care much how they look to others, even then they are putting on a performance.

For whom, then, you might wonder? Who is the audience for this solitary person, acting alone?

Why, they are their own audience, of course! The conditioned ego is performing for itself, making the choices necessary to maintain its self-image. It is always like this with every choice we make and every action we take; they are affirmations to ourselves of who we are.

Let’s say you are at the store and you are buying your bottle of shampoo, and you have to decide if you will buy the inexpensive generic brand or the costlier designer brand. You make the choice in accord with your self-image. Maybe you are successful and deserve the finer things in life, and certainly you can afford the more expensive option. Or maybe you are sensible and you understand the value of money, and you know the cheaper option works just as well.

You make the choice of either the generic or designer bottle, and so you reinforce your identity-image. Your consumer choice is a little performative act by yourself, for yourself – solipsistic theater, as it were. Marketers know this is how consumer behavior works, and they make sure to have different brands at different price points to capture greater market share, even though what is in each bottle is more or less the same.

So our everyday act is an act of theater. This is how we are in our ordinary state of ego-consciousness, and this is not a bad thing per se. We need to reinforce our identity-image and have a stable ego if we wish to function in society, and we should not be ashamed of our limiting conditioning.

I say that everything we do is a performance, because our actions are always witnessed, whether by others, or by ourselves. They always have an audience. And there is always an ultimate witness, which is the unitive consciousness that is the ground of reality.

It is within this unitive consciousness that all of manifestation occurs, the whole world that is the stage on which we perform. It is this unitive consciousness that ultimately is making the choice that we attribute to ourselves, bringing the world into manifestation as it does. This unitive consciousness is unconditioned; it has complete freedom of choice. But it is very difficult for us to access this freedom of choice, because of the limitations of our conditioning – our fears and our inertias that hold us back.

In our daily lives, we can strive to keep an open mind and to overcome our conditioning, and so expand the possibilities available to us in the choices we make. What is there to fear? We are merely players on a stage.

It is all just a performance!


Behold the wisdom of the Buddha Bear!

Agile across the Generations

In a post last month I discussed the Agile method, and described an origin story for it. In my story, Agile was invented by a new generation of software developers for a new generation of software – the software being written in the fast-paced world of the networked personal computer. It started when an “Agile Manifesto” was declared in 2001, in the waning days of the dot-com boom, after the software world had experienced a couple of decades of rapid growth amidst a profound shift in work patterns. A rising young generation (my own, Generation X) moved freely from job to job, eschewing loyalty to the company in favor of careers as “perma-temps.” Some system was needed to manage the frenetic chaos of this new working environment, and that’s where Agile came in.

This surely is a simplification, and possibly off the mark. After all, innovation in workflow management predates the Agile Manifesto by generations. It has been a part of the evolution of the modern corporation for more than a century, going back at least to Taylorism and scientific management. Agile fits in with other conceptualizations of “lightweight” or “lean” approaches to project management, meant to avoid bogging everyone down with process and minutiae, and with earlier iterative development methodologies. These came about long before my generation was in the workforce.

My origin story came about because the Agile methodology strikingly fits the peer personalities of the generations who invented it – Baby Boomers and Generation X. If you look up the signatories of the Agile Manifesto, almost all of them are from those two generations, which constituted the bulk of the workforce at the time (Millennials were only just graduating from high school). These are both generations known for individualism, for focus on the self and the personal, and for short-term thinking. It makes sense that they would embrace a work methodology that emphasizes individuals over process, and adaptability over planning.

The very name “Agile” evokes the ideas of speed and flexibility, qualities which align with my generation’s reputation. Also aligning with Generation X is Agile’s way of defining success as developing software that works, not necessarily software that is perfectly crafted or meticulously documented. “Git-R-Done!” or “Just Do It!” as a Gen Xer might say. Or how about the Agile sub-type known as “extreme programming,” a hyper-focused iterative approach with very short cycles? What could be more Gen X than that?

My point is that this methodology was primed for the workforce of the time – a workforce consisting of young adult Gen Xers, managed by middle-aged Boomers. The hyper-focused individualists were doing the work while the visionaries were directing them. Agile, in theory, was a mindset, a whole philosophy of managing work in a fast-paced world. So long as everyone was not worried too much about following a fixed process or plan, but instead was adaptable and constantly communicating, much could be accomplished.

Contrast this with Six Sigma, a methodology that came from the Silent Generation when they were the middle-aged managers of young adult Boomers. This faultfinding approach, which uses statistical methods to eliminate defects in processes, suits the Silent Generation’s reputation for fine-tuning expertise, as well as the Boomer Generation’s reputation for perfectionism.

Now what about Agile in the workforce today? It’s been over twenty years since the manifesto was published, and now it’s Gen Xers who are the middle-aged managers and Millennials who are the young adult workers. Does the Agile methodology suit a generation known more for hivemind thinking than for focused individualism? I think it does, though maybe not in exactly the way it was originally envisioned.

I have been using Agile at work for the better part of the last ten years, at all three of my most recent software development jobs. In my experience, the ideal of the “Agile mindset” doesn’t really stick. It’s fine to have an overall philosophy of work, but actually getting people to adopt a specific mindset requires coaching and attention, not simply declaring a vision. What does stick easily about Agile is the framework of dividing the work into short sprints and keeping the team aligned, using regular meetings (such as a daily scrum or stand-up) and a system for tracking the work (such as user stories on a storyboard).
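To make that framework concrete, here is a toy Python sketch of sprints and user stories on a storyboard. It is purely my own illustration – the class and field names are hypothetical, not the API of any particular Agile tool.

```python
from dataclasses import dataclass, field

@dataclass
class UserStory:
    """A unit of work small enough to finish within one sprint."""
    title: str
    assignee: str
    status: str = "todo"  # e.g. "todo" -> "in progress" -> "done"

@dataclass
class Sprint:
    """A short iteration, typically about two weeks of work."""
    name: str
    stories: list[UserStory] = field(default_factory=list)  # Python 3.9+

    def storyboard(self) -> None:
        # Print the board grouped by status -- roughly what a daily
        # stand-up walks through to keep the team aligned.
        for status in ("todo", "in progress", "done"):
            print(f"--- {status} ---")
            for story in self.stories:
                if story.status == status:
                    print(f"  {story.title} ({story.assignee})")

sprint = Sprint("Sprint 42", [
    UserStory("Add login page", "Priya"),
    UserStory("Fix checkout bug", "Marcus", status="in progress"),
])
sprint.storyboard()
```

Real tools pile plenty more on top of this (story points, epics, burndown charts), but this bare structure – short iterations, small tracked tasks, a shared board – is the part that, in my experience, teams actually keep.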

I think the structure provided by this framework is a good fit for the peer personality of the Millennial generation, who do best in an orderly work environment with clearly set expectations. They like to be given a well-defined task and rewarded for accomplishing it. A little praise and gratitude will do. They even get FOMO when they don’t have a specific assignment, which is understandable as it might be a sign that their position isn’t needed any longer.

Even as Agile methodology supplies structure, the short duration of the sprints and the iterative workflow continue to provide the benefits of flexibility as project priorities and personnel shift about. A plethora of practices and sub-methods has evolved out of the original idea, giving Gen X and Elder Millennial managers plenty of ways to tinker with the methodology to find the best fit for their teams.

It’s worth noting that there are limitations that come about when you have structure. If everything has to be tracked, work might not get done if no one remembers to track it. If expectations are clear, there might not be much motivation to go beyond expectations. A well-ordered framework for defining and assigning work might be easy to navigate, but it can also foster complacency. No one is likely to go above and beyond if there doesn’t seem to be any particular reward for doing so, and if doing so risks ruffling feathers by disrupting the expected workflow.

Continuing the story of Agile, it might be that what started as a methodology for producing results in a fast-paced environment has evolved into a methodology for governing work in an orderly manner, such that everyone can function in a well-defined role. That’s what my experience shows. Agile might not be as versatile in practice as it was originally envisioned to be, but it’s still a useful tool for keeping teams aligned and productive.

I do sometimes hear an old Gen Xer on a team complain that “we’re not practicing true Agile,” but I just think, “so what?” We’re getting stuff done (hopefully), and keeping tabs on it. That’s enough.

As far as I can tell, Agile, at least in name, is here to stay. The concept is entrenched in the Information Technology workplace, and will certainly outlast my career, which has not much more than a decade to go. Ten years from now the generation that comes after Millennials, the Homeland Generation, will fill the twenty-something age bracket and constitute the workforce’s youngest cohorts. I wonder what further evolution of the Agile method might come along with them.

Evolution within Consciousness

In my previous post in honor of the late philosopher of mind and consciousness, Daniel Dennett, I mentioned that I would post a follow-up. This post relates to a different philosophy of consciousness, from a different philosopher, one where consciousness is considered to be fundamental and all phenomena arise within it, rather than consciousness being a trait that emerges out of material interactions in the brain. So the brain and the mind exist within consciousness, not the other way around.

That philosopher is Amit Goswami, and I have long been a proponent of his model, since reading his seminal book The Self-Aware Universe on the advice of an old friend. I’ve read and re-read most of his books, and having just completed my second or third read of his book on evolution, I am just going to post my Goodreads review of it here. I hope it makes sense, and makes his arguments and line of thinking clear.

In this 2008 book, Amit Goswami applies his theoretical framework of science within consciousness to biological evolution and the origins of life. His hope is to reconcile creationism with evolution, in accord with his greater goal of reconciling science with spirituality. For the first time in this body of work, he repeatedly uses the term “God” (this book was published in the same year as another of his books, “God Is Not Dead”). He defines God as “objective cosmic consciousness” – unitive consciousness as the ground of all being.

He frames the problem of creationism vs. Darwinism as one of conflicting worldviews, both of which are ultimately untenable. The simplistic model of creationism is clearly contradicted by real world data, but the Darwinist model of random mutation and natural selection is also unable to explain much of what is observable about life. For example, it cannot explain life’s purposiveness, or the biological arrow of time with its progression from simpler to more complex life forms. Nor can it explain the subjective feeling of being alive.

The problem is basing science on a reductionist materialist ontology; this makes it impossible to explain subjective qualia of experience without running into paradoxes. In addition, with Darwinism, everything must arise from chance and necessity, so the theory runs afoul of huge improbabilities. How can organic molecules arrange themselves into complex life forms by chance alone? The doctrine of natural selection is inadequate because it too is paradoxical – it declares “survival of the fittest” but then defines “fittest” as that which survives. This is circular reasoning which fails to address the fundamental question – why survive at all?

Something is lacking in the materialist worldview on which Darwinism is based, and Goswami’s proposition is that what is missing is the idea of the universe arising within consciousness as a consequence of self-referential quantum measurement. Such a measurement can arise when there is a “tangled hierarchy,” where cause and effect are intertwined. This is a key concept in Goswami’s theory, an idea you may have already encountered in the work of Douglas Hofstadter. An example from biology is how DNA encodes for proteins but proteins are used to replicate DNA. Which comes first, if each depends on the other? Clearly the whole living system must arise as one.

In Goswami’s model this happens because consciousness itself – the ground of all being – actualizes the living system in manifest reality out of the myriad quantum possibilities available at the microscopic level. In other words, the biological complexity evolves in the uncollapsed wave function, unrestricted by the laws of entropy which make its manifestation via material interactions alone so unlikely. When the gestalt of a functioning living system is available in possibility, consciousness collapses the wave function into that state in a self-referential measurement, actualizing the living entity and identifying with it in the process. Thus arises a sense of self, an experience of being separate from the world. This explains the subjective feeling of being alive, and why life forms have a drive to survive.

Quantum measurement alone is not enough to explain how a life form can exist; somehow consciousness must be able to recognize the proper arrangement of biological matter to represent a living function. This is where Goswami reintroduces his idea of subtle bodies and psychophysical parallelism – consciousness simultaneously collapses correlated physical and vital bodies, with the vital body acting as a blueprint so that consciousness can recognize the possibilities of life available to be represented in material form. Our experience of feeling is the manifestation of this vital body.

Similarly, as evolution progresses up the Great Chain of Being, a mental body, correlated with our biological brain, gives us our experience of thought. Goswami explains how perception manifests from mental image representation in the brain. He presents an intriguing road map of the evolution of mind which is similar to that espoused by Ken Wilber, whom Goswami has referenced in earlier works. He suggests some tantalizing possibilities for future evolution, and also speculates that as a species humanity is stuck evolutionarily because we have not integrated our emotional and rational minds. He offers some ideas of how we could overcome this blocker.

Goswami’s thinking is unconventional, but it does connect physics and biology with spirituality using a consciousness-based resolution to the measurement problem in quantum mechanics. He postulates an objective cosmic consciousness as the equivalent of what religions call “God,” which fosters creativity in the manifest physical world with the aid of archetypes of form. He also postulates subtle bodies which exist in parallel with our material body, and which give us our inner experience of being alive, of having feelings and a mind. This is what religions call our “soul.” This is an idealist science as opposed to a materialist one, akin to the idealism of Plato, and it does indeed reconcile the idea of a creator God with the nitty-gritty of the physical sciences.

I’ve written a super long review here, the longest of mine yet for any of Amit Goswami’s books. Goswami’s ideas make sense to me, and I find his philosophy satisfying. I hope I have summarized his arguments here accurately and in a way that motivates the reader to check out this book, or any of his others. I recommend starting with “The Self-Aware Universe”.

Saying Goodbye to an Eminent Philosopher

Dennett’s books among some others in my collection.

One of the great philosophers of our time just passed away recently. His name was Daniel Dennett, and he was a cognitive scientist and researcher into the philosophy of mind. He was famously an atheist and a proponent of Darwinist evolutionary biology. I have read a few of his books, and have them on my bookshelf in my curated collection of what I think are among the best or most important books on the philosophy of mind and the meaning of life. Probably Dennett’s best known works are Darwin’s Dangerous Idea and Consciousness Explained, the latter of which lays out his understanding of what consciousness is.

He was a proponent of the Darwinist idea of traits arising through natural selection because of their adaptiveness, with consciousness being just one more trait that an organism can have. In his view, consciousness was something like an illusory experience that gives us a summary view of reality to help us get along, arising out of the interacting neurons in the brain. He was a materialist who believed that to study consciousness, you have to look in the brain, its ultimate cause. Below is an interview that will give you an idea of his train of thought.

I have great respect for Daniel Dennett, and admired his gentle and humane nature, and his deep thinking. I really appreciated that he ascribed consciousness to non-human animals, at least those with more advanced brains, and believed consequently that their suffering was real and we should take it seriously.

But I don’t agree with his philosophy. I think that with a materialist, upward causation model, you run into paradoxes when trying to explain consciousness. You can see what I mean if you watch the interview, where Dennett describes how human consciousness is more advanced than animal consciousness because our neurons have representations not just of our sensory data but also of the representations themselves. Layers upon layers. But how do you get to the actual meaning that is being represented; do you just add layers ad infinitum? The subjective experience of meaning is not explained.

I am a proponent of the ideas of a different philosopher, Amit Goswami, of whom I’ve written on this blog before. He has a better model, an idealist one, which puts consciousness ahead of matter instead of the other way around. It’s not a question of mind over matter or of matter over mind when both exist within fundamental consciousness. As the Beatles put it, “it’s all within yourself.” I have a follow up post based on one of his books, which will describe a different way of thinking about the evolution of the human mind.

But I give Dennett his due, as he was a great and wise thinker. I end this post with a link to a full-length album of avant-garde music featuring sampling from one of his lectures. Rest in Peace, o noble born.

How We Got Agile: An Origin Story

My old copy of “The Mythical Man-Month”

When I was a young man, a college student in the Computer Science program at Virginia Polytechnic Institute, we were assigned a book to read. It was called The Mythical Man-Month, by Frederick P. Brooks, Jr., and I still have my copy from the 1980s. The point of the book, which you might be able to glean from the title, is that you can’t simply measure work effort in “man-months,” on a scale such that you could conceivably get more work done by adding more people to a project. As an example, you couldn’t say that a project has a work effort of 120 man-months, meaning that with 10 men it will take 12 months to finish, and therefore with 20 men it will be done in 6 months.

If you had 10 men working on this hypothetical project, and added 10 more, you would not find that it completed 6 months sooner. It would, in fact, take longer than 12 months. The problem is, as you add more men (people) to a project, you need time to get new hires ramped up to where they understand the project well enough to be productive. You also multiply the lines of communication, which generates additional overhead keeping everyone in sync on specific information needed to make interacting components work together. In engineering, these pieces of information are called “specifications,” and they have to be tracked somehow. If you add more people to a technical project, you add more tracking effort. These complications are summarized in Brooks’s law: “Adding manpower to a late software project makes it later”.
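To put rough numbers on the communication overhead, here is a minimal Python sketch (my own illustration, not anything from the book) of how pairwise lines of communication grow as a team expands:

```python
def communication_channels(team_size: int) -> int:
    """Pairwise lines of communication on a team of n people: n * (n - 1) / 2."""
    return team_size * (team_size - 1) // 2

for n in (10, 20):
    print(f"{n} people -> {communication_channels(n)} channels")

# Output:
# 10 people -> 45 channels
# 20 people -> 190 channels
```

Doubling the team from 10 to 20 people more than quadruples the number of channels to keep in sync, which is one reason the hypothetical project above would not finish twice as fast.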

As a software engineer in the early 21st century, it fascinates me to read the author’s description of how specifications were tracked on the project he worked on – the IBM System/360 – in the 1960s. They had huge manuals kept in binders, and as changes were made, the responsible engineers would have to go into the binders and update the appropriate pages – that is, take out the old pages and insert the new ones with the changed specs. This manual was the Bible of the system, and keeping it up to date was absolutely vital to the success of the project.

Modern day software engineers like me are not used to such meticulously maintained documentation. We consider ourselves lucky if there is any documentation at all for the software on which we are working. You’d think it would be easier, now that everything can be done online, but projects move too fast and the people working on them move around too much. No one is necessarily going to stay on top of documentation, and so long as software works as expected, that’s fine. It’s when it doesn’t work that you run into trouble.

Because personnel move around so frequently in the modern workforce, there is rarely anyone working on a software program who was there when it was originally programmed. But programmers still need to maintain it. Sometimes we are given requirements to modify existing software that has no documentation, with no one around who knows anything about it, and the only way to achieve that goal is through “reverse engineering.” This means poring over old code and documenting it from scratch, which is very time consuming. This underscores the point about the man-month: you can’t just insert a person into a project and expect them to get as much done in a given amount of time as a previous person on the project did. Certainly not if they are going to be reverse engineering the previous person’s work.

Since the start of the personal computing era and the long economic boom of the late 20th and early 21st centuries, computer software has been advancing at a faster pace than it did when Frederick P. Brooks worked as an engineer at IBM. The workforce has changed as well, with employees typically job hopping every few years, and often working as contractors through agencies rather than directly for the client that owns the software they are developing. So how do the software engineers of my generation handle project management in such a chaotic work environment?

The answer is “Agile” methodology, which came about around the start of this century. Agile is a lean or lightweight software development method that emphasizes individuals collaborating over plans and processes, and defines good software as software that works, not necessarily software that is well documented. At least, that’s the declaration in a famous “Manifesto for Agile Software Development” that was published in 2001.

The idea is that “Agile” is a mindset where you are focused as a team on communication and collaboration, continuous improvement, and responsiveness to change. In practice, it means breaking up the project work into short iterations called “sprints,” which typically last two weeks. Everyone’s tasks for the sprint are things that shouldn’t take more than a couple of weeks to finish. So right there the idea of a “man-month” is out; no one would work on one thing for a whole month!

Breaking the project work into chunks like this makes it easier to show progress, to evaluate how effective the team is from sprint to sprint, and to change processes and workflows as needed. It also makes it easier to accommodate personnel shifting around from project to project. It’s a way of coping with today’s volatile workplace, which makes long-term planning harder to achieve. A whole panoply of “frameworks” and “ceremonies” has developed around the original concept since it was first elucidated.

If you are in a white collar profession (not even necessarily Information Technology) you might have experience with Agile-related frameworks in your career. I was first exposed to Agile in the late 2000s, and have been at positions where it is used comprehensively since 2018. Every company does it a little differently, but I have always found it to be a useful way to structure project work.

The way I see it, Agile came about because a new generation of software engineers needed to adapt to a faster pace of work than what the generation of Frederick P. Brooks experienced in their careers. They needed to find their own solution to the problem of how to get people to work effectively when they are added, out of the blue, to a new project. If you look at the signatories of the 2001 Agile Manifesto, you will see that they are almost entirely Baby Boomers and Gen Xers. Today’s Millennials and Gen Zers in the IT workforce have possibly never worked on a project that wasn’t using Agile.

I’ll have more to say about the different generations and Agile in a future post.

Next Generation Board Gaming

I saw an article just recently about the release of a new version of Scrabble, friendlier and less competitive than the original. The article title indicated that it was designed to appeal to the young generation, putting ‘less competitive’ and ‘inclusive’ in scare quotes, as though one should wonder why anyone would want such features in a board game. I encountered the article in the context of social media feeds where posters were mocking Gen Z and decrying this as “woke Scrabble.”

I gathered that these posters were Gen Xers, and that the editor who picked the title of the article probably is as well. My generation likes to pick on younger people for not being tough enough. But I don’t see what their problem is; this new Scrabble version, called “Scrabble Together,” seems like a perfectly cromulent game to me. To me, it’s simply part of a trend that’s been going on for years, where cooperative and team play games have grown in popularity. These games are suited for socializing in large groups, and I think they are a good fit for the peer personality of the Millennial generation.

As Neil Howe and William Strauss put it in Millennials Rising, this generation is special, sheltered, and team-oriented. A chiller version of Scrabble is perfect for a generation more interested in fitting in and playing it safe than in standing out and taking chances. In fact, Neil Howe identifies board gaming as one of many pastimes Millennials have favored as they have embraced youthful restraint, in contrast to the wild days of my generation’s youth.

The board gaming hobby has really taken off in the past couple of decades, as I have noted in other posts. I remember the very beginnings of the new wave of board games back in the 1990s, when Millennials were children. As the media caught on to the trend when Millennials became young adults, articles started appearing associating the board game revival with their generation. I’ve certainly enjoyed watching Millennials swarm into gaming conventions and game stores, and even sometimes feeling like the wise old guy teaching them a thing or two as we play a game together.

I would say that the board game revival belongs to both Millennials and Generation X, as this article by a Gen X board gamer describes. And in all fairness, the Boomer generation deserves credit for giving us many of the prominent designers of the tabletop games that are so popular today. But Millennials really have taken board gaming to a new level, folding the hobby in with social media and streaming video platforms, and adapting it to their mode of life.

It’s been quite remarkable to observe, and since board games are something of an obsession for me, I’m glad that it’s happened. I look forward to playing Scrabble Together some day, possibly chilling with some friendly Millennials at a game day hosted by a local craft brewery. ‘Cause all we’re trying to do here is get along and have a little fun.

AI at Work, for Better or Worse

A little robot guy I made with an AI image generator

As you surely know if you are a denizen of the online world like I am, artificial intelligence has made remarkable strides in the past few years. In particular, what they are calling generative AI has really taken off. This is a kind of advanced pattern-matching software that grew out of machine learning. It lets you use prompts to create content: images, complicated text (including stories), and at this point even videos and music. At the bottom of this post I’ve linked to a YouTube video that explains generative AI really well, so check it out.

I played with AI image generators for a while, and had some fun. In their early iterations they produced really weird, often creepy looking stuff, but now they’ve gotten pretty advanced. The images they produce are intriguing, impressive even. I saved a lot of the ones I generated, but stopped messing with the programs when I saw how many of my artist friends were upset by the proliferation of AI-generated images on social media. I gathered they could sense their own work being made obsolete by an overwhelming supply of easily produced knock-off art. Why hire an illustrator when you can just describe what you want into a text box in an AI tool, and get the result in a few minutes? Plus there’s the troubling issue of these programs possibly being trained on copyrighted material without the consent of the copyright owners, meaning they are effectively stealing from artists.

Another thing you have to consider about the product of generative AI (and this is covered in the video below) is that it is subject to one of the rules about computer programming that I was taught as a lad: Garbage In, Garbage Out. That is, if you put bad data into a computer program, then you will get bad data out of it. Generative AI is trained on massive data sets, and one result of the way the current AI programs have been trained is that they produce content that tends to express a sort of lowest common denominator of its subject matter. You put in the vast quantity of data on the Internet, apply sophisticated pattern matching, and you get out, as a result, something like an “Internet average” version of human knowledge.

For an example of what I mean, here is a fantastic article explaining how AI-generated images of selfies misrepresent culture. They do this because the pattern matching algorithms take the conventional way that selfies typically look and apply it to subjects where that wouldn’t make sense. So an AI-generated image of, say, a group selfie of medieval warriors makes them look like modern day humans. Now, since the idea of the existence of such a selfie is absurd on the face of it, maybe it’s pointless to worry about its inherent historical inaccuracy. But in a way, these kinds of images are erasing history.

The article goes even deeper; the AI generators tend to represent everyone as smiling into the camera the way that Americans do. But other cultures that do exist today and do take group selfies have different ways of expressing themselves when taking photos. So the AI programs aren’t just erasing history, they are also erasing existing modern cultures. They are turning everyone into Americans, because American culture dominates the Internet.

Here’s another way AI-generated content gravitates toward a dominant average mode, one you might have heard of already. It seems that AI chat programs, trained on the massive data of online conversations, will often produce racist, abusive comments. It’s like they inevitably turn into Internet trolls. This might seem like a mere annoyance, but AI programs generating racially biased content can have serious, life or death consequences.

With all of these concerns, it’s understandable that public perception of AI is not always favorable. Ted Gioia (who has an awesome Substack, by the way) wrote about this perception recently, starting with a story about the audience at SXSW booing an AI presentation. His article expands into a general discussion of the public’s current distrust of the technocracy, in contrast with the way technocrats like Steve Jobs were idolized in the past. Faith in “innovation” and “disruption” has waned in a society facing uncertainty and disorder, and sensing that technology is leading us toward a dystopian future.

Where does AI fit into my life, now that I’ve stopped playing with image generators? Well, I may not be able to avoid using it, as the company where I work has been promoting AI chat programs to help with day to day tasks. We are all being asked to look into them and come up with ways this new software can improve our productivity. Other folks who have a job like mine might be encountering similar pushes at their workplaces.

I think this is an honest effort by our management to ensure that our organization doesn’t get left behind in the AI wave they are convinced will revolutionize the workforce. Stay ahead of the disruption, and ride the wave I guess is the thinking. Surely it’s not the case, as Aileen and I joked when I brought this up to her, that I am training an AI to replace me. I mean, why pay a software tester when you can just describe the tests you need into a text box in an AI tool? Oh my.

Below is the very informative video that explains Generative AI.

Truth is a Casualty in the Age of Performative Politics

If you watched President Biden’s State of the Union speech last week, and were aware of the Republican response by Senator Katie Britt, you probably know that the latter’s speech has been mocked for its insincere and performative nature. In fact, Britt’s rebuttal was so performative that even as she was giving it, the Internet was anticipating that SNL would parody it in their next cold open sketch, coming just a couple of days later. And indeed they did, though to be fair they also parodied the President.

I do agree that Senator Britt’s speech was performative, as well as inaccurate in its statements, but this whole affair reminded me of some important points about the state of politics today:

  • Politicians are performative because they are not arguing in good faith; they are rallying their side in a partisan conflict. Is Biden really going to enact policies for the long laundry list of liberal/left/blue zone causes he touted in his speech? How could he in this era of dysfunctional government? He is simply assuring his base that he represents their values.
  • The partisan conflict is rooted in the Culture Wars that emerged out of the last Awakening, as evidenced by the conservative/right/red zone trappings of Britt’s speech: Christian family values, nativism, domesticated femininity – all the backlash against the Consciousness Revolution. She is simply assuring her MAGA base that she and the rest of the opposition against President Biden represent MAGA values; she doesn’t need to use facts to do that, just feelings.

The simple truth is that politicians in each partisan faction are going to use whatever rhetoric works to reinforce the group feeling within their camp. There’s not much point in worrying about the nuance of what they say, or for that matter its accuracy or whatever hypocrisies are embedded in the rhetoric. We are past the point of anyone convincing anyone through reason. We are in a raw struggle for power, so pick a faction and stick with it. If you can’t or won’t pick a faction, you might want to keep your head down for a while.

It’s Really Been A Year Already?

This photo showed up in the memories feed which my smartphone helpfully throws in my face every once in a while. It was one year ago today that I went back to the corporate campus of my previous job to turn in my laptop. I took this photo because this was a new building that wasn’t up yet when I left the campus to begin remote work in March 2020; it was still under construction then, and there was a lot of hubbub about it, so I was excited to see it on my return.

I think it’s a pretty building. The campus has this striking architectural design that resembles modern art, and this building fits right in. It also has a lot of stairs (I mean the campus as a whole does), which makes it challenging to walk around in if you are not physically fit. When I walked on that campus I felt my age. I felt like I was obsolescing, surrounded by the aggressive energy of a workforce that keeps growing younger with every new job I take.

I did go up that formidable-looking staircase and into the building. It was impressive on the inside, too, with a spacious lobby and some nice art installations. The security guy at the desk paid me no mind.

The campus was custom-built for this corporation, and it must have cost a bundle. So I can understand why they wanted people returning to onsite work. Aileen and I speculated that maybe I was let go because I declined to go hybrid and wanted to stay 100% remote. They gave us the option to do either, and assured me that my decision to stay remote had nothing to do with my position being cut. But who knows.

I’m glad I made the choices I did, and that I amazingly was able to get a 100% remote job elsewhere after being let go. I feel very lucky to be in the position I am in today, and grateful for the support of my family here in Pennsylvania. I just can’t believe it’s been a year already at my new job. Tick tock.

Emojis at Work – How Social Media Infiltrated the Workplace

I still remember the excitement when the first iPhone came out in 2007; only a few people were using this new kind of mobile phone, but boy were they delighted with it. At the same time, everyone was jumping onto Facebook, which had just opened up to the general public in 2006.

Fast forward to a decade and a half later, and everyone has a touchscreen phone (I got my first one in 2014). Social media platforms have proliferated, and are a constant, pervasive feature of daily life.

Once, employers tried to prevent workers from browsing the Internet during the day, but such efforts have been abandoned. Everyone is on their phone all the time. In fact, the software used to officially collaborate in the workplace looks a lot like the apps we use in our personal lives.

At least, that’s been my experience as a white collar professional in a cubicle environment. I’m a middle-aged Gen Xer, and my career is split pretty evenly between the world before social media and the world after. I’ll explore what that’s been like for me a little more in this post.


I joined Facebook in 2008, because all of my coworkers were doing it and I didn’t want to be left out. It was a clear case of FOMO (Fear Of Missing Out), a term that had recently been introduced to explain how social networks stimulate compulsive following and mimicking behavior. I friended all of my coworkers, and had fun setting up my profile and exploring the site.

Do you remember those days, and how primitive the Facebook interface was compared to today? Your main profile page was called your “wall” and to post on it, you “updated your status.” If you look back at your posts from fifteen years ago, you’ll see how different they were. They seem kind of awkward and tentative, like we all didn’t quite know what to do with this new way of communicating.

Back then, there was a site called “Please Rob Me” that tried to raise awareness about the dangers of sharing the fact that you weren’t home, like someone was wondering how anyone could be stupid enough to do that. The site is defunct now, and today it is routine for people to tag their locations when they go out, even though we all know we’re giving valuable information away to giant corporations (the ones who are really robbing us).

Back then, as employees found themselves sucked into their Facebook feeds, companies started blocking the website on their corporate networks. They established policies about what employees were allowed to post on social media platforms, warning them against representing the company or divulging corporate information.

In the late 2000s, the world was just getting used to social media, and its implications. Today, a decade and a half later, social media is routine in our daily lives. Everyone accesses social media platforms from their smartphones on a more or less continuous basis, even while at work, and employers have no chance of stopping them.

One thing I’ve decided since those early days is that it is best to keep my work life and my personal life separated, where social media is concerned. I no longer send Facebook friend requests to my coworkers, as I did back when I first joined the site. But that’s just how I personally manage my online presence. For other people, depending on their line of work, it might be better or even necessary to network and post about work across all social media, and have less of a distinction between personal and professional social spaces.

A clever post about work I made on a social media app

That’s not to say that social media isn’t a part of my work life at all. There are, as you well know, work-specific social media sites, such as LinkedIn, where I do make sure to connect with my coworkers. The Intranet at the company where I work uses software that has posts and feeds that resemble those on any other social media platform, and while I’m not particularly active there I do browse, to get a feel for the corporate culture.

I also sometimes post about work on my personal social media accounts, but in a sly way. I don’t want to reveal where I work, but just say something about work that’s clever, maybe even vaguely subversive, hoping for likes and shares. I’ve included an example screenshot in this blog post. You can see that I got zero engagement.

Social media conventions have infiltrated workaday tasks as well, such as in the use of emojis and reactions in online conversations. I have long been using messaging software in the workplace; I remember Skype being in place in the office back in the mid-2000s. I also remember that as emojis started coming into use in personal messaging, I was hesitant at first to include them in work conversations. It just seemed somehow unprofessional to use a smiley face in a work-related chat.

But, in time, it simply became a norm. On the messaging software I use at work now, there are emoji reaction options, and my coworkers and I routinely “thumbs-up” and “heart” one another’s messages. It’s just a way of signaling agreement or showing appreciation. Workplace communication has become more friendly and informal than in the past, and I think this reflects a preferred mode for today’s mostly Gen X and Millennial workers.

For me, a Gen Xer who adopted less formal modes of communication in the latter portion of his career, it’s been an adjustment. But for many of my coworkers, who are Millennials twenty or thirty years younger than I am, it must just seem like the normal way people communicate in the digital space. For Boomers, experiencing these changes at the tail ends of their careers, it might seem too informal or alien to their expectations.

I suppose I shouldn’t speak for others, especially if they are from a different generation. These are just my thoughts on the matter. There’s no denying that the proliferation of smartphones, along with ubiquitous access to the Internet and its software platforms, has changed our daily routines, including our work routines. Please feel free to share your own experience in the comments below.