Does anybody really know what time it is?  Does anyone care?  I know I don’t.  I’m increasingly living in a time-shifted dimension disconnected from time and season.

I realized how disconnected I am from live television a few weeks ago, when I sat down to watch HBO’s autism benefit and had no clue how to watch HBO live, despite being a 20-year subscriber.  I consume a lot of HBO content but almost always on HBO Go.  So when I wanted to watch the benefit, I couldn’t remember what, you know, “channel” the network was on, and had to go through the laborious process of finding that information from my cable provider’s website.

And then it occurred to me:  Except for sports and news, it’s been a long time since I watched any television show live.  In fact, I know the exact date I did so: Sunday, March 7, 2016, the series finale of “Downton Abbey.”  I was only watching live because I’d been recapping the show for a couple of years.  Before that, the last time I watched a show live because I absolutely HAD to was the series finale of “Mad Men.”

For the record, I’m not a cord-cutter.  We pay a lot to watch a full range of broadcast, cable, premium, and streaming channels.   I just don’t watch live.

This means I’ve completely lost track of when my favorite TV shows air and even what network they are on.  I literally have no idea what day “Brooklyn Nine-Nine” is on — never mind the time — and have to think hard to remember it’s on Fox.

The way we watch TV in our house is, we look at the DVR recording guide to see what shows are in the queue (“Oh, ‘Modern Family’ was on last night!”).  If nothing urgent is there, then we move on to HBO and Netflix.  And if I have a spare half hour and want to watch a screen but there’s nothing I particularly need to see on Netflix, the last thing I’d do is channel-surf.  Much more likely is that I’ll click over to YouTube and watch some favorite music videos, film clips or TV scenes.

People time-shift for many reasons.  The original draw for VCRs was that they allowed you to fast-forward through commercials — and go out in the evening and catch your favorite show when you came home.  Still, the understanding was that using a video recorder would be the exception, not the rule.

Two trends have turned me into a full-time time-shifter.   First, with all the high-quality television available today, everything I watch is “Must-See TV.”  I would never just turn on the TV and watch whatever’s on.

Just as important, the fragmentation of TV, with the broadcast network monopoly smashed to pieces, means I no longer feel compelled to watch a show when it’s live so I can talk about it with friends or colleagues the next day.  No one’s watching what I’m watching, so there’s no water-cooler chatter about TV.

It’s funny how easily old habits die.  I can barely recall what it was like to watch the clock to make sure I didn’t miss a favorite show.  And yet back when I was younger and had a vastly more active social life outside the house, I somehow managed to consume even more television than I do now.

What I can’t wrap my head around is whether I am an outlier or a harbinger of future viewing habits.  Clearly a lot of people are still watching live TV.  Nielsen’s most recent Total Audience Report shows that the average person still watches nearly four hours of TV a day.  That’s only down by about 15 minutes compared to the same period two years ago.   (This would be a good time to remind everyone that only about half the homes in America even have DVRs, and fewer subscribe to premium cable channels).

But I don’t feel unique as a full-time timeshifter, certainly not with a 25-year-old in the family.  He’s lived in his own apartment for three years and would no more own a television than a Sony Walkman.

So maybe I’m slightly ahead of the curve.  A decade ago I pish-poshed futurists who said that live TV would eventually go away.  But now that it’s happened to me, I’m not so sure.

After all, if an old-timer like me can abandon live TV, anyone can.


Louis C.K.


How else to react to the news that Louis C.K., famous for his self-flagellating comedy specials and his  emotionally raw series on FX, had been taking self-exposure to the extreme in his private life?

I feel sorry for my son, who so admired Louis’ comedic daring and honesty.  I know how it feels.  As a kid I practically memorized all of Bill Cosby’s records.  And then I graduated to Woody Allen movies in my late teens.  Now I feel that a large swath of my youthful enthusiasms is covered with slime.

Maybe there really is something in the DNA of comedians that causes bad choices.  One of the oldest clichés in the book is that people with difficult childhoods and damaged psyches find an outlet for their pain and self-loathing in stand-up comedy.  After all, a great deal of contemporary stand-up revolves around self-lacerating stories – stories that pick at a comedian’s most obvious wounds.

This cliché certainly does not apply to all comics.  If Jerry Seinfeld or Jim Gaffigan were accused of sexual deviancy, the shock would be so intense that I’d give up watching comedy altogether.  But there are many comedians who do seem to have a screw loose.  As Mark Twain purportedly said, “The secret source of humor itself is not joy, but sorrow. There is no humor in heaven.”

The list of sexually abusive comics is not short.  It goes all the way from C-listers like Andy Dick to stars like Al Franken.  The power they hold over audiences seems to embolden them to act out off-stage too.

Louis C.K., however, is in a class of his own.  He was known for years as a “comedian’s comedian,” using material that went right up to the edge of what an audience could stand. His FX show started out as a word-of-mouth hit among comedy nerds.  I mostly liked the show but it always made me uneasy, which I gathered was the point.  Don’t let the audience get too comfortable.

In the very first “Louie” episode I ever watched, there’s a scene where Louis is stopped by a TSA agent at the airport who finds a tube of gel in his luggage.  He straightforwardly explains that it’s the “lube” he’ll be using for self-pleasuring when he gets to the hotel.  The TSA agent is dumbfounded and mildly disgusted by the matter-of-fact way Louis owns up to behavior that is usually considered shameful.

I have to admit it creeped me out, but not enough to stop watching.  I was also unnerved by the frequent references to self-abuse in his comedy specials but assumed he was just pushing the envelope.  Who was to know that a comedian lauded for being a truth teller was actually telling the truth when confessing to audiences that he was a pervert?

Louis is now in celebrity purgatory.  The theatrical release of his new movie “I Love You, Daddy” has been cancelled and HBO has removed his specials and other material from its streaming services.   Kevin Spacey has suffered a similar fate for his own sexual abuse scandals.  Netflix halted production on the upcoming season of “House of Cards,” and he has been completely excised from Ridley Scott’s new movie “All the Money in the World,” with his scenes tossed out and reshot with Christopher Plummer.

The effort to make previously lauded entertainers disappear from our consciousness is typical of our overheated social media-driven culture.  In the old days we would stone sinners or cut off their hands.  Today we shame them on Twitter until they vanish.  I can understand that the entertainment business is a business and that no one particularly wants to see a new movie starring Louis C.K. or Kevin Spacey right now, but to pull existing content off HBO Go is vaguely reminiscent of those Soviet-era May Day parades, where Politburo members who fell out of favor were erased from photographs.

And to be honest, it’s a bit rich for HBO to get politically correct on Louis C.K. when it profits so fabulously from violence against women on “Game of Thrones” or “Westworld.”  Just saying.

These spasms of morality always seem to be applied unevenly too.  For example, we have one sitting President of the United States accused of sexual assault and one former President accused of rape.  Apparently we hold our comedians to a higher standard of conduct than we do our national leaders.

My guess is that Louis C.K.’s career is not over. At least he had the grace to admit his sins and ask for forgiveness. And unlike Bill Cosby, his behavior was not completely contrary to the persona he presented on stage.  I expect an apology tour in a year or two, with a less sexualized performance, and maybe even a grudging concession to the benefits of conventional bourgeois behavior. Because if one good thing comes out of these scandals it’s that being outrageous on stage doesn’t give you a free pass from basic human decency.


My father-in-law Frank Keane, who died this morning (on the 242nd birthday of the Marine Corps), was not old enough to be in the “Greatest Generation,” but he was still representative of a very good generation of men who built the country after World War II.

He was himself a Marine, an old school journalist, and the stepfather to five kids at a crucial time in their lives.  I forgave him for being a Yankees fan because he was an original Boston Patriots fan way back when they played at Fenway Park.  He knew his sports and history inside out and read the New York Times every day until the last weeks of his life.

He was widely admired within the newspaper world in Providence, Rhode Island, as outlined in this piece from the Providence Business Journal, which I’d encourage you to read here.

He was gentle, dignified, and kind but also a tower of strength within the family.

He was a loving grandfather too.  Here he is horsing around with my son 20 years ago.

Frank and Christian

Rest in peace.


With the huge ongoing success of “The Walking Dead,” “American Horror Story,” “Stranger Things 2,” “It” and “Get Out,” I’m tempted to say that horror is having a cultural moment, except that horror is always having a cultural moment.  There is hardly an era in which this supposedly disreputable genre hasn’t had a massive audience.

The popularity of movies that scare the bejeezus out of us goes back to the silent era, with “The Phantom of the Opera” and “Nosferatu.”  “Dracula” and “Frankenstein” were among the first blockbusters of the talkie era.  And every decade since then has had its own variation on horror movies.

As with any genre, there’s always a definitional issue with what is and isn’t horror, but classic horror seems to be about scaring viewers deeply enough to get their hearts pumping, using horrifying situations that involve a supernatural or non-rational event.  A scary movie with a psycho killer is a thriller.  A scary movie with a ghost is a horror movie.

TV is a relative latecomer to horror.  Given that horror exploits viewers’ revulsions and terrors, the powers-that-be used to believe that it was not suitable for TV, where unsuspecting kids might be watching with their kindly grandparents and end up scarred for life.  Those concerns seem hopelessly antiquated now, though, when any child with a smartphone can easily call up the most horrific videos of ISIS atrocities.

There were early TV shows that attempted to creep audiences out and scare them — within reason.  “The Twilight Zone” and “The Outer Limits” were occasionally disturbing but always kept in line by network censors.  It wasn’t really until 1990 that a truly frightening horror series made it onto the air:  “Twin Peaks.” That David Lynch series is usually not included in the horror canon, although it contains all the genre elements including fright, eeriness, and supernatural explanations.  Among its other impacts, that series did demonstrate that there was an appetite among many viewers for creepy dramas.

Horror as delivered by “Twin Peaks”

Before “Twin Peaks,” TV’s aversion to horror was that the genre concerns itself with a fearful topic that is rarely appropriate for a device that sits innocently in a living room – death.  And not just the kind of death you see on a medical or crime show, where it’s sad when someone dies but at least they stay dead. No, horror reflects a profoundly unsettling death where the natural order is disrupted and everything we thought we knew about the subject is turned upside down.

The barely submerged fear that there might not be a heavenly afterlife explains the enduring fascination with vampires or zombies — beings that were once dead but are now living — or inanimate creatures or animals that become animated with supernatural power.  Consequently horror is populated with ghosts, monsters, possessed children, werewolves, demons, Satanism, gore, vicious animals, evil witches, sadistic clowns, and cannibals.

The rise of cable TV and its niche targeting, combined with the loosening restrictions on televised violence, has created the opening for TV horror.  After decades without any truly terrifying TV shows, we’ve been deluged with them: “Penny Dreadful,” “Bates Motel,” “The Vampire Diaries,” “The Strain,” “Scream Queens,” “The Originals,” “Slasher,” etc., etc.

Personally, I think that horror is ill-suited to television, offering at best a watered-down version of the experience of watching horror at the movies.  Going to the movies is a proactive choice – you get out of the house, drive to a destination, pay money for tickets and find yourself in a dark space with a massive screen.  Usually this is an event that you plan with friends – maybe it’s even a group bonding experience like riding a roller coaster.  In other words, movie-going is an immersive event where the experience can be over-powering.  It gives you a shock that reminds you you’re still alive.

Watching TV is completely different.  The room is well-lit, the screen is smaller and half the time you’re watching by yourself and distracted by your smartphone.  It’s a solitary, not a social event and it doesn’t have the same impact as watching in the theater.  Viewers will frequently scream out loud at a horror movie, but rarely scream at home.

And yet, horror is very popular on TV.  There are people who watch murder, mutation and mutilation week after week.  All the philosophical justifications for horror – that it provides a cathartic release from death-related anxiety – melt away when watching horror transforms from an occasional thing into a weekly or even daily event.  How much catharsis does a person need?

There’s a legitimate concern that too much horror makes people numb to it and in need of bigger and bigger doses, like any sensation junkie.  And at a time when there are no cultural overlords to impose order, who knows where it will end.  Let’s hope it’s somewhere short of live executions and murders.  We’ve already got the Internet for that.



It’s been about ten years since smartphones, iTunes and the popularity of yakking personalities like Ricky Gervais, Bill Simmons, and Adam Carolla turned podcasting into a mainstream activity.

A decade later, podcasting is still a rising medium.  About 45 million Americans listen weekly and 70 million do so monthly.  That’s higher than movie attendance.  And with 350,000 podcasts to choose from, there’s a podcast for any interest or obsession.

There have been some legitimate break-out stars too.  The first season of “Serial” became a national obsession, with more than 230 million downloads.  Marc Maron’s “WTF” has become a must-have promotional spot for everyone from President Obama to Norm Macdonald. The podcast “Missing Richard Simmons” briefly launched hundreds of news reports about whether the former exercise mogul had been kidnapped by his own housekeeper.


President Obama on Marc Maron’s “WTF”

Advertising on podcasts is also growing fast, albeit from a minuscule to a tiny level.  According to a report from the IAB and PricewaterhouseCoopers, podcast ad revenue has grown by 85 percent since last year and is on track to reach more than $220 million in 2017.  But that’s only about one percent of the total ad market, not much penetration for a decade-old medium.  How, then, do we increase the value of those ads and make podcasting more profitable?

I’ve been listening to a lot of podcasts, which means listening to a lot of podcast ads. There are two phenomena that demonstrate this is still a nascent medium.  First, there’s a remarkable dearth of ads from traditional mainstream advertisers.  I’ve recently noticed that American Express and Gillette have started to dip their toes into podcast advertising but most advertisers are e-commerce companies or low-end brands: Squarespace, Stamps.com, Harry’s.com, Blue Apron, etc.  All great products, I’m sure, but nothing you’d expect to see advertised on a network TV show.

I also can’t help but notice that almost all the ads are read by the show hosts themselves.  The previously cited IAB and PricewaterhouseCoopers study claims that these host-delivered ads are the “most effective,” whatever that means.  I doubt the research is definitive and it wouldn’t surprise me to learn that the same argument was made back in the 1950s, when TV hosts routinely plugged advertisers themselves.

To me, a medium in which the hosts still read the ads reeks of amateur hour.  And to make matters worse, most of these ads direct listeners to a website where they can plug in a “promo code” to make a purchase and give the podcast credit for the sale.  This is like the early days of the Internet, when pop-ups were judged by their click-through rates.

Podcasts won’t be a mature advertising platform until major brands like Coke, General Motors and Procter and Gamble decide that podcasting is a good space for professionally produced brand-building ads.  And that won’t happen until there is good ad measurement to ensure that people are actually listening to their commercials.

Today no advertiser knows what the audience is for a podcast.  The standard measurement of a podcast’s popularity is downloads but that doesn’t tell you anything about actual consumption.  I subscribe to both “Fresh Air” and “Serial,” two of the most popular podcasts; I listen to about ten percent of the “Fresh Air” interviews but have consumed every second of both “Serial” podcasts.  But that’s me – maybe there are others who dote on Terry Gross’s every word. Only a metric that actually measures listens will tell us.

Podcast ads face another challenge too.  In television and radio you can more or less assume that the “average audience” for a show (the average number of people tuned in at any given moment during the entire episode) is roughly the number of people consuming the ads.  That’s because TV viewers and radio listeners are constantly tuning in and dropping off, so consumption is roughly the same throughout the entire length of the show (unless there’s a large amount of DVR playback).

But hardly anyone will start listening to a podcast half-way through playback.  And in certain genres, like celebrity interviews, the drop-off can be pretty significant.  I’ve almost never made it all the way to the end of a Marc Maron interview, for example, and have no idea whether there are even any ads at the end of his show.

The most obvious company to measure podcast consumption is Apple, which provides the major platform for podcast downloads.  If they could capture podcast playback on iPhones they would have the closest thing to a census-based (as opposed to panel-based) measurement that the media industry has ever had.

The next most obvious candidate to measure podcasts is Nielsen, which has the experience, methodology and technology for the job.

As it turns out, both companies are working on some form of measurement.  Apple has announced it would begin giving creators consumption metrics and Nielsen has begun to offer general insights on the buying habits of podcast listeners, with more detailed numbers reportedly on the way.

If these two companies can come up with reasonably credible metrics then podcasting might finally take off as an advertising medium.  Ironically this might mean fewer podcasts as advertisers flock to the biggest shows and leave the scraps for everyone else.  But more money in the medium can only mean a higher overall standard for all.  Bring it on!







Thank you Shailene Woodley for insulting all TV viewers and creating a moment of national unity that has eluded our national leaders.

Appearing on the red carpet at last month’s Emmys, she announced that she hasn’t owned a TV since she moved out of her parents’ house when she was 18 and thus couldn’t watch any of the nominated shows.

But even if she did own a TV, Woodley implied, she wouldn’t have watched the nominated shows anyway because she’s too busy pursuing more intellectual pursuits: “I always ask [friends who watch TV] — when do they have time to? When do people have time to? I’m a reader, so I always read a book instead of checking out my TV.”

The reaction was swift and unified, setting off a Twitter-storm, which boiled down to:  how could someone who was richly compensated for being on a TV series (“Big Little Lies”) and then nominated for said performance appear on a TV awards show to insult everyone tuned in on the very household appliance she found so time-wasting?  If TV is such a brain-suck why didn’t she just stay home that night and read Tolstoy?  Apparently one of the books she WASN’T reading was “Miss Manners’ Guide to Excruciatingly Correct Behavior.”

We can all laugh at the cluelessness of young actresses, but who among us hasn’t encountered that same anti-TV attitude at a cocktail party or around the office watercooler? Is it really possible that these people missed the memo that the most important work in the visual arts is being done on TV? I mean, isn’t that why Shailene Woodley, Nicole Kidman, Reese Witherspoon and Laura Dern were all in “Big Little Lies” in the first place – because TV is now the place where actors can really stretch?

I don’t know if Shailene Woodley thinks she’s special because she doesn’t own a TV, but she’s actually a conformist in her demographic. Of course she doesn’t have a TV! She’s 25 years old. I’d be more impressed by her originality if she didn’t have a smartphone.  In fact, what I’d really like to know is the proportion of time she spends reading vs. looking at a smartphone.  The iPhone, not the television, is what rots brains these days.

The anti-TV snobs like Shailene Woodley have always been with us – and in certain decades there was some justification for that attitude.  But what’s new these days are people who watch TV but actively disdain legitimately good shows – the anti-snobbery snobs.  I guess they think that viewers of “peak TV” are looking down on them so they get preemptively defensive, as in, “I watched 15 minutes of ‘Mad Men’ and thought it was boring.  I can’t understand why you like it.”

Somewhat related to the anti-snobbery snobs are the anti-popularity snobs – those who brag that they never watch the highest-rated shows.  A few weeks ago, some of us in the office were discussing “Game of Thrones” when our CFO, who was not part of the conversation by the way, felt it necessary to interject that he’d never watched it.  Now there are many legitimate reasons not to watch “Game of Thrones,” but he definitely left the impression that the fact that so many others were watching was a factor in his avoidance of it.

Of course he then undercut himself by then telling us that our highly regarded outside counsel had recently admitted that he was a “Game of Thrones” fan and that because this well-known lawyer was watching the show, well, maybe he’d check it out too.  Which prompted the rest of us to observe that when WE said we were watching “Game of Thrones” you weren’t interested but when the lawyer said he was watching you were willing to give it a try.

The bottom line is that people today are too quick to define themselves by what they don’t like on TV.  Maybe it’s time we all just stopped judging each other for our TV choices.  Instead of airily dismissing what someone else watches, maybe you could ask why they like that particular show or genre.  We might learn something about each other for once.


Here we go again.  We’re only a couple of weeks into the new football season and already everyone’s wringing their hands over the state of America’s favorite sport and television’s most important broadcasting product.

The games are uninspired, the ratings are weak, the players are domestic abusers, and the President of the United States is calling for a boycott.  This is a major issue for television because football is one of the last places where advertisers can reliably expect men to watch their ads in real time.  Football is also one of the remaining rationales that many families give themselves for not cutting the cord (even though most games could be watched live over-the-air).

This is a dramatic turnaround from just a few years ago, when football was still gaining in popularity and appeared to be television’s bulwark against the encroachments of the digital world. Of course what goes up must come down, and it was inevitable that some marginal fans would eventually peel off and move on to the next big fad, but football’s decline has been so precipitous that it can’t all be the fickleness of fans.

Many of the explanations offered this year are the same as they were last year.  For example, the Colin Kaepernick National Anthem protest against alleged police racism metastasized into a full-blown political controversy during the off-season when Kaepernick couldn’t land a job, even as a back-up quarterback.

This has put the NFL in the worst possible situation.  The (mostly conservative) white men who are the sport’s core base are still furious that football allowed itself to get embroiled in a political correctness controversy in the first place.  But black activists and the media keep the controversy alive by alleging that the NFL owners have conspired not to hire the one-time Super Bowl quarterback because he’s become such a lightning rod for the Black Lives Matter issue.  No one was happy – and that was even before President Trump weighed in with his usual brew of grievance, divisiveness and vitriol.

It’s a surprise it took Trump this long to recognize that the anthem protesters were pushing a hot button.  I would guess that he doesn’t even follow football and so didn’t appreciate the furor until ESPN’s Jemele Hill called him a racist, which launched a feud with ESPN that spilled over into football itself. Regardless of why he decided to take on the NFL, his comments threaten to cause schisms in America’s one true religion – watching football on Sunday.


But if the National Anthem imbroglio is turning off fans who don’t want to think about politics while they’re watching football, the ongoing revelations about the impact of football-related concussions are turning off fans who worry about their own personal morality.  These fans always knew that football was violent, but it wasn’t until players started publicly dying of Chronic Traumatic Encephalopathy (CTE) that they began to ponder their own complicity in cheering each violent hit.  And it didn’t help when Tom Brady’s wife revealed that even he’d suffered concussions, leaving us to contemplate the specter of one of the country’s most glamorous athletes not being able to remember his own name some day.

It’s not that Americans have suddenly become pacifists, though.  The biggest sporting event of the summer (if you can call it that) was the Floyd Mayweather/Conor McGregor fight, in which millions of Americans paid $90 to watch a retired prizefighter pummel a mixed martial arts fighter who’d never boxed before.  Given the gimmicky nature of the pairing no one could claim to be enjoying the “sweet science” of finesse, balance and strategy that supposedly lends a patina of respectability to boxing. Nope, this was straight-up bloodlust.

If anything, football’s problem with violence isn’t that it’s too violent but that its near monopoly on controlled aggression has been broken.  For years, Sunday was the one day of the week when working stiffs could get a catharsis by watching other guys brutalize each other on the gridiron. Sure, there was the occasional spinal cord-severing injury that resulted in a player becoming a lifelong quadriplegic, but in general the players reveled in hitting and being hit, and the viewers loved watching it. Somehow, having the players wear helmets and pads kept us from feeling bad about the three or four players who needed to be carried off the field each game.

But as any casual viewer of “Game of Thrones” or “The Walking Dead” knows, there’s a lot more violence on entertainment television now than there is in sports.  To say nothing of video games or YouTube videos where you can watch an actual, not just a metaphorical, beheading.  And social media now enables frustrated guys to channel their rage as Internet trolls when they might have simply spent Sunday afternoons yelling “Yes!” every time someone on the opposing team was knocked unconscious.

Football isn’t going anywhere, but it seems passé compared to basketball.  The league’s owners, led by their doofus Commissioner, seem out of touch and concerned only with protecting their investment.  Football will probably remain television’s biggest draw for years to come but it also seems to have entered a period of slow decline.  Whether that will be good for the soul of America is an open question but it will definitely be bad for television.