What’s with all the ink?

A conversation with my BFF started with me Stevesplaining (a thing I do) the social history of tattoos.

You see, they were a Pacific Islander custom (think Maui from Moana) which Europeans encountered in the Age of Discovery. This is why tattoos are associated with sailors. They then became a custom among Europeans of the lower orders, and were for a long time in the West associated with lack of respectability and the criminal underclass.

Then, in the aftermath of the Sixties at the end of the twentieth century, young people sought to break free of cultural norms. Getting tattooed (and/or pierced) was a rebellious act of individualism practiced by a minority of youth. By the turn of the century it was a fad. And now, it has become a conventional rite of youth passage. Which is why you see almost every young adult with at least some tattoos.

My BFF replied, “Have you even talked with anyone who has tattoos and asked them why they have them?” I had to admit that I really had not. “It’s because they feel like it’s the only way in which they can truly express themselves,” she continued.

“You mean that their own bodies are the only space over which they have any control?” I was thinking of how the young generation must feel powerless in a world politically and culturally dominated by their elders.

“Yes.”

So that’s two different perspectives on the frequently tattooed Millennial generation. What do you think?

What is of this era

Assuming that the Fourth Turning began in 2008 with the Global Financial Crisis, what are some things that belong wholly to this era? Here are some that I can think of.

  1. Smartphones. Specifically, the large touchscreen form factor smartphone that made its famous debut with the first iPhone in the summer of 2007. I remember seeing people that summer who had one. They were few and far between, but they looked like the happiest people I had ever seen, delighted beyond belief with their shiny black rectangles. I got my black mirror in 2014 and I can’t imagine life without it.
  2. Social media. Yes, there were social media sites back in the early 2000s. I am willing to admit that I had a MySpace page. But the big wave of near universal adoption began when Facebook became open to all adults in 2006. It was 2008 when I noticed everyone around me was joining, and I jumped right on that bandwagon. With smartphones making it ever easier to share immediate experiences, there are now multiple services in widespread use.
  3. The Marvel Cinematic Universe. This one belongs squarely in the current era, as Iron Man was released in 2008. Only a few months later, the leaders of the free world were struggling to prevent global economic collapse. Things keep getting messier and messier in the real world, but in the MCU films the good guys always manage to avert catastrophe. At least until the after-credits sequence sets up the next plot twist.
My life (so far) in the IT crowd

Last spring I posted a little recent career history. This was when I lived in North Carolina, and worked at a corporate campus nestled in a lovely wooded area in the famed Research Triangle. I ended the post with the hopeful expectation of more corporate campuses in my future. It’s been over a year now, and I have moved to Pennsylvania and have a new gig – in Wilmington, Delaware of all places – reportedly an up-and-comer as an information technology hub.

I don’t work at a campus, but rather in a shiny blue office building on the city’s touristy riverfront. It is also a lovely work environment, although there are not nearly as many nearby conveniences. The location is a strange kind of wasteland of office buildings, overpriced restaurants, and parking. But undeniably it is delightful to step out to the riverfront on a beautiful day.

My coworkers are, once again, mostly from India. Actually, the percentage here is much higher than at my last position – probably 90% of the IT staff. No sign of “Hire American” here, though perhaps those visa reforms don’t apply to my company or industry.

And, once again, I am working as an IT contractor. I started contracting in the late 1990s, after finding that “full-time” positions in the boisterous dot com era tended to be short-lived. Of the past twenty years, only a few have been spent working as a full-time employee instead of a contractor, and that has happened only when my position was “converted” from a contractor position. This is a mark of prestige in my business, sort of like being made (not really).

My contracts are generally through an agency, which means I am paying payroll taxes, and often have some access to benefits. The contracts tend to last between one and three years, and then I am looking for work again. At each new position I enjoy the opportunity to meet new people, learn new workflows and processes, learn about a new type of business, and experience a company’s particular work culture. It can be a bit stressful adjusting to the new environment, but it is also exciting and satisfying when I learn the ropes and prove myself.

It is also stressful when the job search phase begins again, especially now that I am middle-aged and worried about age discrimination. But I’m not sure that being full-time versus a contractor would make any difference, since full-time positions are eliminated as surely as contracts come to an end. The main disadvantages of contracting are not receiving paid time off, and having to pay more for health insurance. For the latter issue, I have found the Affordable Care Act very helpful. For the former issue, judicious saving is required.

Way back in the 1990s, the rise of temporary employment was decried as a deprivation of workers’ rights. Perhaps it still is, part of an overarching trend toward greater corporate power at the expense of the people. Or it could be thought of as part of the free-agency style that my generation brought to the workforce in young adulthood. In my life, a career of many short stints has served me well, but that could be because I am in the field of Information Technology. It might not be so easy in other lines of work.

With more years behind me than ahead of me, I imagine I don’t have too many more stints left. But who knows what the future will bring.

Cricket Match

You may have seen a couple of my posts where I am participating in a cricket match. This was for a tournament played at my place of employment. I just wanted to briefly describe my experience.

Cricket, as you probably know, is a sport originating in England and played around the world, and is reminiscent of American baseball. As in baseball, someone hurls a ball at a batter who is then obligated to hit the ball, assuming that the throw isn’t way off or otherwise disqualified. In baseball the throwing is called “pitching,” but in cricket it is called “bowling,” and the ball is actually allowed one bounce off the ground (but only one). The bowler can also run for a bit to add more speed to the thrown ball.

Without going into too much detail, another huge difference is that the bowler and batter are in the center of the field, and are surrounded by the other fielding team members, so the batter can hit the ball in any direction. To the side, behind him, whatever. Also, there are two batters, one at each end of the pitch (the center bit) and they run back and forth, switching places, between the two creases (the ends of the pitch). And also there’s a wicket in each crease, which is a set of three wobbly poles, and it is bad for the batting team if the ball touches a wicket when a batter isn’t in the crease defending said wicket. Whew!

It’s all very complicated and I’m sure I haven’t explained it very well, despite the fact that I went to a training class and watched a couple of instructional videos. I joined my group’s team in the spirit of participation and camaraderie with my coworkers, and also because it is fun to learn new things. And also because of peer pressure, because the team needed me for this reason: in the tournament format, each team is required to have a certain number of “novice” players.

A “novice” player is defined as one who is from a “non-cricket-playing nation.” In practice, this means someone who is not from India. So the language of the rules dances around this kind of racial prejudice, which might seem justified by the fact that no one at work who is not from India has any effin’ idea how to play cricket. Oh, and each team also has to have at least one woman player, who can be from a cricket-playing nation (that is, India), and who is treated like a novice player, so there is some sexism added into the format to boot.

What it means to be treated like a novice player is that the bowler has to bowl softly when a novice is batting. This makes it easy to hit the ball, especially considering that it is a tennis ball instead of the real thing. I imagine this is for liability reasons. Hitting the ball is one thing, though; scoring is a bit harder, because if you don’t hit a good grounder, you won’t have the time to run to the other end of the pitch before the fielders make a play for your wicket. It’s the same as trying to reach first base in baseball.

Our team played two games, winning the first and then losing the second. A game consists of each team batting once (so sort of like one inning in baseball). The score that the first team to bat achieves then becomes the target for the other team to beat, and if the other team does beat the target then the game ends immediately. A batting team gets twelve “overs,” each of which consists of six balls – not counting “dead balls” or “no balls” – well, it was way too much for me to keep track of, and I basically had no idea what was going on.
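For my fellow programmers, here is a toy sketch of that match format as I understood it – purely illustrative, using the overs and balls from our tournament, and ignoring dead balls, no balls, wickets, and everything else I couldn’t keep track of (all the names here are my own made-up illustration, not anything official):

```python
# A toy model of our tournament format: 12 overs of 6 balls each.
# Simplified: ignores dead balls, no balls, wickets, and the rest.

OVERS_PER_INNINGS = 12
BALLS_PER_OVER = 6

def second_innings(target, runs_per_ball):
    """Play the chase: the game ends the moment the target is beaten."""
    score = 0
    for runs in runs_per_ball[:OVERS_PER_INNINGS * BALLS_PER_OVER]:
        score += runs
        if score > target:
            return score, "chasing team wins"
    return score, "team batting first wins (or it's a tie)"

# Example: the team batting first scored 40; the chasers manage a run a ball.
print(second_innings(40, [1] * 72))  # -> (41, 'chasing team wins')
```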

Suffice it to say that I had fun, got sunburned, and though I did not contribute much to my team, I did gain an appreciation for cricket which I’m sure will be put to good use the next time I see a game on the telly at the local pub. 🙂

[Photo: Warming up before a game]

Old Letters

I sorted through my collection of old letters and cards. Not to dwell on the past, but to follow the dictum of Socrates to examine one’s life. I tried to order them chronologically, first in a stack, but as that proved unwieldy I laid them out on the floor by year. The oldest was from the year 2000, so there is no record of my correspondence from the 1900s. Once the sorting was done, a pattern clearly emerged. There is not much from the early 2000s, then there is a big gap in the mid-2000s, when I was isolated and having mental health issues. Then around 2008 things pick up, as I stitched my social life back together and recovered from severe depression. The 2010s stack is nice and big.

Now it’s possible I simply lost older letters. Or maybe in my despondent period I was throwing them away. But the pattern jibes with this thing I have where, ever since the Global Financial Crisis and Great Recession, my life has just been getting better and better. Not even Trumpty-Dumpty has slowed it down.

So thanks to all the friends and coworkers who have sent me invites and holiday greetings over the years. It’s nice to be able to look back at all the memories and milestones. It’s fun to see my employment history reflected in the names signed on holiday greeting cards. Oh – and one thing, people – when you are sending those holiday cards, please make sure the year is written on them so that we OCD people can put them in the right order later. 🙂

On my third page

Last week there was a retirement party at work, for a man in his sixties who had been with the company for about a decade. There was some nattering from other employees conveying wonder that someone would actually have the opportunity to retire. It felt like an expression of the anxiety of younger workers about their future. My generation in particular is notoriously pessimistic about its retirement prospects.

I have a few sheets of notebook paper on which I have been tracking my computer career since it began when I was a college student. I write down where I worked, when, and what skills I developed. Each sheet covers about 14 years, and I am now a quarter of the way through the third page. When I reach the end of that page, I will be in my sixties.

One career, easily compressed onto college-ruled notebook paper. Will I need a fourth page?

Millennials as Consensus-builders on Social Media

In a recent post, Looking at the Generations, I listed patterns to look for among the living generations in the current social era, based on Strauss & Howe generational theory. I want to take a closer look at some of the items on that list in a series of posts, and I’ll start with one under that most talked-about of generations – the Millennials.

The item in particular is the second one in the Crisis era box – “look for the Millennial generation to enforce, among peers, a code of good conduct.” You can see this happening in that ubiquitous phenomenon that is defining the times – social media.

The rise of social media is part of the story of the maturation of the Internet, which first came into the public eye at a time when computer networks were the province of a small minority of socially outcast nerds. As adoption grew through the “you’ve got mail” era and into the dawn of today’s tech giants like Amazon and Google, going online became more and more mainstream.

Then, just around the start of the Crisis in 2008, came a new kind of computer that made being online essentially effortless – the smartphone. With it came an explosion of participation on Internet sites designed to promote social networking and interaction. Now, ten years later, what we call social media platforms dominate as a source of information and news.

The term “media” refers to an era’s primary means of mass communication. Adding the qualifier “social” suggests that a socializing role has been added to that of communicating, and perhaps that control of mass communication has been transferred from media elites (who are now mistrusted) to society at large.

The socializing role is evident in the familiar features of promoting posts (“liking” and “sharing”). Popular opinions rise to the top of feeds and are seen by the most viewers. Unpopular opinions are quashed. The consensus is reinforced through the use of signal-boosting hashtags like #metoo.

Another form of enforcement involves calling out bad behavior. A post demonstrates a transgression of social mores, which may, unfortunately for the transgressor, be taken out of context. Then a blast of comments shames the person. In extreme cases, the person may be identified in real life – called “doxxing” – which can be ruinous.

Perhaps the exemplary case in point is the store owner who posts an anti-gay sign, and then finds his or her business boycotted after a picture of the sign goes viral on social media. But how far might the phenomenon go? Blogger John Robb speculates about “weaponized social networks” and imagines their full potential.

As for the people being in charge of mass communication now, the “democratization of the media” if you will – that has proven fraught with challenges. Social networks are vulnerable to infiltration, and social engineering has swayed elections. Social media sharing makes the dissemination of false information much too easy, and so the term “fake news” has come into the zeitgeist.

There is also the question of whose consensus is being enforced, as there are competing “red-state” and “blue-state” networks, each attempting to persuade us with their values-promoting memes. What values prevail will be evident in time. And though all of the living generations are participating in this social evolution, ultimately it will be the rising Millennial generation that defines what conduct is considered correct.

A closer look at the Crisis era

Years ago I had another blog, Generation Watch, dedicated to looking at current events and news stories through the filter of Strauss-Howe generational theory. I was an avid reader of their work (still am, though their work is now mostly confined to Neil Howe’s column at forbes.com), and as part of the Generation Watch site I had a list of submission guidelines that included hints about what to look for in news stories about each generation – it was all taken right out of the book The Fourth Turning.

At the time I was writing (early to mid-2000s) I, along with many other Strauss & Howe aficionados, was trying to determine if and when we were going to transition from the Third Turning in the social cycle (the Unraveling) into the Fourth Turning (the Crisis). Well, the “official” word is that the Crisis began in 2008, so now that we are a decade in, I think I will reexamine the markers that I published so many years ago, in some new posts. Meanwhile, you can read them at the old site here:

http://home.earthlink.net/~generationwatch/gw_submission.html

Two animated films for your watch list

The latest Pixar offering, Coco, is a wonderful film which has instantly become one of our family’s favorites. It is technically brilliant, demonstrating how far computer animation has come in the more than 20 years since Pixar’s beginnings. The visual detail, the lighting and the color are superb, and it is all easy to take in despite being so complex, unlike much of the CGI that accompanies live action movies these days. The story is excellent as well, and I can’t write much at all about it without spoiling it, except to note that it is fancifully set in Mexico, and while it has action-adventure elements it is really a family drama.

Coco actually has some things in common with another recent animated feature, Kubo and the Two Strings, which uses stop-motion instead of CGI, giving it an older and artsier look. Kubo is also more of an adventure movie, a mythological tale whose narrative is not quite as engaging as Coco’s, though it is not without its own twists. But again – no spoilers!

Each of these films appropriates a facet of world culture to tell a high-stakes story that reminds us to cherish our loved ones. If you’re looking for a satisfying family movie night, you can’t go wrong with either one.

Scrambled Easter Eggs: A Review of Ready Player One

Why would the youth of 2045 be obsessed with the pop culture of the late 1900s? This was my thought as I sat in the theater and watched the movie Ready Player One. As the cultural references kept piling on, my partner commented, “this movie is made for us.” We are both Gen-Xers, born in the 1960s, and the movie’s plot was an endless series of shoutouts to the iconic movies, TV shows and video games of our youth.

My partner’s son, who was born in the early 2000s, was watching with us, and ironically, he knew more of the references than we did. He called them “Easter eggs,” from the term for messages or images that are often hidden in video games. In fact, Easter eggs in this sense are a major plot point in the movie, which is mostly set in a massive multiplayer virtual reality game. As the characters hunt online for the Easter eggs that are the MacGuffins of the story, we the audience enjoy recognizing the pop culture references that parade by in the form of avatars and items and scenery.

The people of 2045 apparently live online to escape the hellish dystopia the world has become, following some droughts and riots. Civilization has weathered events like these many times before, but we’ll have to assume they were really bad this time. We don’t get to see much of what this bleak future looks like outside of the VR, except for one vertically sprawling slum in Columbus, Ohio, where the residents live in stacked shipping containers supported by bare metal scaffolding. The visual design of this set is striking, which just underscores the fact that we are consuming visual art, and that the “real world” setting depicted is just as contrived as the “virtual world.”

The people in “the stacks” are mostly obsessed with putting on their VR gear and hanging out online, trying to rack up experience and in-game resources. Despite presumably being poor, they can all afford the gear, much as people of all classes today own smartphones. It’s not clear what is happening in the outside world, in the boring realm of politics, economics, and international relations. All that matters to anyone living in the stacks is this game world, as if humanity has abandoned all thought of civic renewal to focus on entertainment. Which actually fits the zeitgeist of the early 21st century quite well.

Really, this movie is an homage to the era of entertainment culture that has been presided over by its director, the hugely influential Steven Spielberg. His generation has dominated the cultural space, and let the political space go to ruin. The people of 2045 worship his generation’s cultural legacy because, in the Spielbergian vision, that legacy is the culminating achievement of our time. The movie audiences of 2018 may well agree.

In the Spielbergian weltanschauung, civic virtue amounts to willingness to join the scrappy underdogs in a fight against the uniformed forces of oppression, represented in Ready Player One by a greedy corporate conglomerate. “Welcome to the rebellion” is actually a line in this movie. It’s like Star Wars all over again. Again.

I write this review without any knowledge of the book on which the film is based, so apologies to the book’s author for missing his original intent, which may have been much different. He may have written a profoundly original and thought-provoking story. You’ll never know from watching this film. Watching this film, you will get an amusing mélange of your favorite pop culture nostalgia, packaged in a plot that has become routine for PG-rated action-adventure movies. And, of course, you will have fun.