Was 1968 the greatest year in popular music? To me that seems self-evident, unless you want to claim 1967. Or maybe 1969.

OK, so I was 14 years old at the time and it is well-known that the most meaningful music in your life is the music that was popular when you were in adolescence and beginning to have a sexual awakening. But it wasn’t my hormones that made 1968 such a great year – it was the music itself.

At least that’s what I thought until I listened to a Slate.com podcast featuring music historian Chris Molanphy, who pointed out that many of the top songs from 1968 were little more than schlock or elevator music. In other words, for every fantastic Number One like Marvin Gaye’s “I Heard It Through the Grapevine,” there was a dog like Bobby Goldsboro’s “Honey.”

Molanphy’s theory is that music served as a refuge because 1968 was such a horrible year politically (assassinations, riots, war, etc.). Therefore some of the year’s most popular songs were mindless diversions from the evening news. Maybe that’s the reason, or maybe the truth is that every year is full of schlock and it takes a couple of decades to realize it. Looking at the full list of top hits in 1968, though, it seems that about half the songs aimed to change society through social commentary that you’d never find in pop music today, so I’m not sure how escapist it really was.

In any event, here are nine interesting nuggets drawn from Molanphy’s podcast and my own observations about the top hits of 1968.

1. “Hey Jude,” one of the all-time great songs, was the longest single ever to top Billboard’s pop chart up to that point. It was also the Beatles song that stayed at Number One the longest (nine weeks). At seven minutes and 11 seconds, it was twice as long as most pop hits, and every radio station played the whole thing. Even more unprecedented, the Beatles ended the song with a four-minute chant, giving pop music a rare sense of mysticism. I will never forget watching the “Hey Jude” clip (below) that appeared on The Smothers Brothers Comedy Hour in October 1968. In retrospect, that moment, even more than Woodstock, was the high point of the feel-good “flower power” movement.

2. Another really great hit from 1968 was Simon and Garfunkel’s “Mrs. Robinson,” a fragment of which had appeared in Mike Nichols’ “The Graduate” the year before. Paul Simon hadn’t finished the song when the movie premiered, and it wasn’t released until the next summer. The song was initially titled “Mrs. Roosevelt,” but when Simon showed it to Nichols, the director convinced him to change it to the name of the seductress in the movie. The famous line of the song, “Where have you gone, Joe DiMaggio?” was originally intended to refer to Simon’s boyhood hero Mickey Mantle, but the syllables didn’t match up. In a song so deeply contemptuous of 1960s America, it was probably better anyway to refer back longingly to DiMaggio’s generation.

3. There were two instrumental Number One hits in 1968, both by international artists. First came “Love Is Blue” by the French orchestra leader Paul Mauriat, who remains to this day the only French artist to have a chart-topping Billboard hit. The song was composed – with lyrics – for the Eurovision contest (as Luxembourg’s entry). It didn’t win at Eurovision but became a huge hit in the U.S. Molanphy dismisses this song as the greatest piece of elevator music ever composed, but I have to admit that I owned this record and played it constantly.

4. The other major instrumental hit of 1968 was “Grazing in the Grass” by the South African musician Hugh Masekela. Of course I’ve heard this song a million times; it arguably invented the smooth jazz genre. But I never knew the music was from South Africa. Partly that’s because The Friends of Distinction added words and released their own hit single, which is now better known than the original. (And “Love Is Blue” and “Grazing in the Grass” weren’t the only instrumental hits that year – just the only two to reach Number One. Other notable instrumental songs from 1968 include “The Good, The Bad, and The Ugly” and my favorite, “Classical Gas” by Mason Williams. I can’t remember any instrumental hits in the 21st Century.)

5. Another Number One hit that might as well have been an instrumental recording was “Tighten Up” by Archie Bell and the Drells. This is a proto-funk record in which Bell directs the band and the dancers on how to perform a dance called the Tighten Up. The remarkable thing about this song is that Bell had been drafted into the army and was recuperating in a German hospital from wounds suffered in Vietnam when the song hit Number One.

6. And then there’s Herb Alpert and the Tijuana Brass, a hugely popular instrumental band that had 17 Hot 100 hits before finally charting a Number One song with “This Guy’s in Love With You.” To demonstrate the oddity of 1968, this song was NOT an instrumental record. Nope, the band’s first Number One hit was sung by Herb Alpert himself. Originally inserted as a throwaway number in a CBS TV special, the song so charmed viewers that it was rushed out as a single. And not only was this the first Number One hit for Herb Alpert, it was the first Number One song written by Burt Bacharach and Hal David. Go figure.

7. Molanphy reserves his greatest scorn for Bobby Goldsboro’s weeper “Honey,” about a husband mourning his dead wife. He claims that many consider it the worst Number One song ever, although I’m sure the competition for that title is stiff. I have to admit that it’s pretty bad; consider these immortal lyrics: “She was always young at heart/Kinda dumb and kinda smart/And I loved her so.”

8. If “Honey” was notable for anything other than its schlock, it was for exemplifying the trend of country music crossing over into pop. A worthier country/pop entry in 1968 was Jeannie C. Riley’s “Harper Valley P.T.A.,” which scathingly attacked the hypocrisy of small-town life.

9. Then there are Number One songs from 1968 that seem downright dangerous. The Doors’ “Hello, I Love You” is ostensibly about Jim Morrison’s yearning for a girl walking down Venice Beach, but the aggressiveness of the lyrics and the pulsing way in which they’re delivered seem scary even today. In any event, it was the first 45 rpm stereo record.

So is 1968 the greatest year in music? I consistently liked more top songs from 1967 (Aretha’s “Respect,” The Monkees’ “I’m A Believer,” The Turtles’ “Happy Together,” The Doors’ “Light My Fire,” Bobbie Gentry’s “Ode to Billie Joe,” The Association’s “Windy,” The Supremes’ “The Happening,” even Lulu’s “To Sir With Love”). But any year in which “Hey Jude” could be heard on the radio for month after month has to rank high.

Suffice it to say that the Sixties really were the Golden Age of pop music. Almost every week another great new song appeared in the Top 40, and since we all listened to the same Top 40 format, we all had the same frame of reference. Those were the days, my friends. In fact, there was a big hit with that very title in 1968.


The recent news that Nielsen intends to eliminate paper diaries from the 140 ratings markets in which they are currently being used was met with relief in some quarters and with incredulity in others that such antediluvian methods were still being deployed to generate TV ratings this late into the 21st Century. For me, however, it brought back memories from more than a decade ago, when Nielsen first announced its intention to phase them out by 2011.

Paper diaries harken back to the dawn of the television age, when there were only a handful of TV networks and viewers could generally be counted on to remember what channels they watched throughout the day.  But even back then, Nielsen founder Art Nielsen was always searching for a better electronic measurement system that would record what people actually watched instead of what they said they watched.

Fast forward to June 2006.  Coming off a bruising fight with News Corp over the introduction of the Local People Meters that replaced these same paper diaries in the top ten local markets, Nielsen faced an even more implacable foe: the Internet.  People were beginning to find new ways to watch TV online and Nielsen needed a plan to adapt.

The result was an initiative called Anywhere Anytime Media Measurement, aka A2M2.  Looking back on that plan ten years later is a reminder of how visionary corporate aspirations are often restrained by more mundane considerations such as cost, technology and clients.  For example, it eventually turned out that clients weren’t interested in paying Nielsen to measure viewing outside the home (although that is apparently back on the table again).  And when it further turned out that iPods (remember them?) wouldn’t become the primary vehicle for mobile viewing, Nielsen dropped the quest to develop a “go meter.”  On the plus side, Nielsen did eventually integrate Internet viewing into its ratings, and it did expand the number of local markets measured with People Meters.

The nut it couldn’t crack was the complete elimination of diaries.  Nielsen’s ratings CEO Susan Whiting had pushed to introduce electronic measurement in all local markets by 2011, reasoning that the Nielsen staff would be spurred into action by an aggressive but firm deadline.  I was the PR representative on the A2M2 team, so I can testify that they were motivated, and yet the 2011 deadline came and went with diaries as firmly entrenched as ever.

It wasn’t for lack of trying.  For at least a couple of years, the A2M2 team met every Friday morning via a conference call between the project managers in the company’s business headquarters in New York City and its technology headquarters in Oldsmar, Florida.  They considered and tested a number of options, including a “mailable meter” that would be sent to viewers in lieu of a diary, placed next to a TV for a month to record what shows were being watched, and then returned to Nielsen for transcription.

A lot of work went into developing and testing this mailable meter, including research into packaging and shipping, but in the end it wasn’t good enough or cheap enough to replace diaries.  Because here’s the thing about diaries: they are really cheap. Nielsen lost money measuring many of these markets but it lost less than it would have with electronic measurement.

Here’s the other thing about diaries: they hung around so long because a lot of the people who pay the bills (i.e., the local stations) didn’t really want to change if it meant lower ratings.  TV viewers are most likely to mark down the big-name shows they “usually” watch and those are typically network shows, which inflates their ratings, so there was an incentive to maintain the status quo.

Or at least there used to be.  In a world of streaming, video playback, on-demand, channel-switching, premium cable stations, and limited attention spans, viewers are less likely to remember anything they watch, even the six o’clock news, and the diaries have finally lost any credibility among advertisers.   The incentive for abandoning diaries is now larger than the incentive for keeping them.

What’s coming instead is something that’s been in Nielsen’s toolbox for years – fusing together different datasets, including return path data from set top boxes and data from other electronic sources like the National People Meter sample.   The flaws of set top box data are numerous, including a lack of representativeness and no information on which person in the house is watching, but at this point anything would be better than paper diaries.

The adoption of fusion data based on modeling different datasets is a significant development.  Until now, almost all TV ratings have been based on quantifiable data from scientifically selected panels that can be double-checked.  You can go back and see how many people actually pushed People Meter buttons in a market or wrote in particular programs in their diaries.  There’s even a room at Nielsen’s Oldsmar facility where station managers can go to review the individual diaries from their markets and confirm that, yes, this 53-year-old white woman really did watch “Wheel of Fortune” instead of “Jeopardy!”  You can’t do that when data from set-top boxes are funneled into a computer, fused with other data, and modeled using an algorithm that only a handful of data scientists can understand.

Welcome to the 21st Century.  I’m sure my former Nielsen colleagues are thrilled they’ve finally convinced the market to phase out diaries, which symbolized the old, analog Nielsen in a digital age.  That’s an image no one wanted.

The new target date for eliminating diaries is now 2018, seven years after the first deadline.  But I’d bet on Nielsen making this one.  Not only is the technology there but so is the marketplace.





The so-called “Golden Age of Television” usually refers to dramas – hardly anyone has claimed we’re also in a Golden Age of comedy.  Indeed, the last great line-up of “must see” sitcoms seems like a long time ago, when “The Office,” “30 Rock,” and “Parks and Recreation” all appeared together one last time in the fall of 2012. Since then the comedy landscape has been hit or miss.

But the launch of the new TV season brings more enthusiasm than usual for sitcoms.  Critics have been excited about FX’s “Atlanta” and “Better Things,” ABC’s “Speechless,” NBC’s “The Good Place,” and Amazon’s “One Mississippi.”   If these shows turn out to be halfway successful, maybe they, combined with returning hits like “Modern Family,” “black-ish” and “Brooklyn Nine-Nine,” could begin to constitute a sitcom revival.

Still, it’s interesting that some of these series are not really that funny.  In fact, for some time now, prestige comedy has been synonymous with a dark, bleak world.  On “Louie” and “Girls,” two of the most highly praised “comedies” of the past five years, entire episodes go by without a single joke.

Consider the situations in a few of the new situation comedies.  In “Speechless” the parents are trying to balance the needs of their kids, one of whom is wheelchair-bound with cerebral palsy.  In “Atlanta,” a broke Princeton drop-out with serious money and relationship problems is trying to launch his cousin’s hip hop career.  In “One Mississippi,” a woman recovering from a double mastectomy and a debilitating digestive illness returns home to take her mother off life support.

On “Saturday Night Live,” Gilda Radner’s character Lisa Loopner used to say, “That’s so funny I forgot to laugh.”  After seeing some of these “traumedies,” I’m beginning to appreciate the unintended wisdom of the saying.  When a sitcom character slips on a rug or is outwitted by a wisecracking ten-year-old, that’s amusing.  But when a character is faced with the absurdity of a parent’s death or a child’s disability, it’s a profound cosmic joke.  The deep humor comes out of the need to keep moving forward in the face of tragedy, and this is not always laugh-out-loud comedy.

Like many programming trends on television, the turn toward bleak humor originates in both changing tastes and the fracturing of the once-monolithic television audience.  When there were only three major networks, every show needed to appeal to the lowest common denominator, and that meant set-up, joke, laugh, set-up, joke, laugh. Some of the resulting shows were transcendent but most were formulaic and mediocre.

HBO’s decision to move into original programming broke the broadcast networks’ creative monopoly, and one of its first series – Garry Shandling’s “The Larry Sanders Show” – became an ur-text for bleak comedies.  Shot in a single-camera format without a laugh track, the show highlighted the insecurities, selfishness, thwarted ambition and existential despair of its characters and guest stars.  Since then, dyspeptic shows have become increasingly popular as networks sought smaller niche audiences who could support a different kind of comedy.

Laugh-less comedies have also thrived as television’s business model has changed.  For decades the big money for a television series came when it went into syndication, and the goal was to get a series enough episodes (usually about 100) to make that possible.  That meant targeting a broad audience and generating high enough ratings to keep the show on the air for four or five years.

But as live viewing gives way to streaming, ratings are less important.  Networks and services like HBO, Showtime, Amazon Prime and Netflix are in the business of acquiring monthly paying subscribers and ratings are an afterthought as long as customers keep sending in those monthly checks.   Building content libraries is now the name of the game and it’s more important to have at least one show that each customer is passionate about than to have dozens of shows with moderate appeal.

The dark comedy is definitely a niche and not for everyone.  It’s also hard to sustain.  A show like “Louie,” which seemed fresh and original in its first seasons, felt downright depressing as the years went by.  Louis C.K. peeled back the façade that we all display to the world only to discover a confused, self-involved introvert underneath.  If audiences are going to stick with a show they have to find something redeeming in the main characters and that’s hard to do without at least a little humor.

Fortunately, these new dark sitcoms, as uncomfortable as they are, actually do have moments of real humor.  I really did laugh out loud at “Speechless,” “Atlanta,” and “One Mississippi.”  Sometimes real life is so ridiculous that all you can do is laugh.



In every modern Presidential election campaign there are three major hurdles to overcome: the primaries, the nominating conventions and the fall debates.

I think it’s fair to say that Donald Trump won the primaries and Hillary Clinton won the battle of the conventions. Now we come to the debates, and with two polar opposite candidates and a surprisingly large number of voters undecided, they are expected to be among the most widely watched televised political events in a long time.

The debates, of course, are not really “debates” as properly understood. And although they are ostensibly about issues, they are really about personalities and creating moments – sometimes as short as ten seconds – that can be replayed endlessly on TV and mocked or celebrated on social media for decades to come.

I’ve watched every Presidential debate since Jimmy Carter and Jerry Ford met in 1976 (although sometimes I was watching through my fingers).  I think this qualifies me to give debate tips, so here goes:

Beat expectations: The debates are the nation’s longest-running reality TV show, where performance counts more than substance.  And like much of life, performance is judged against expectations. In 2012, Mitt Romney performed much better than expected in his first debate against President Obama, which gave him a boost in the polls (that is, until he reverted to the mean in the second round).   In 2016, the “expectations game” advantage goes to Trump, who is widely considered to be, shall we say, a shallow policy thinker. So all he needs to do is articulate a few cogent arguments and not commit any major faux pas and he’ll perform better than expected.  This is also called “grading on a curve,” which drives the Clintonistas crazy, but that’s the dynamic of politics.

If you make a mistake, address it immediately. In 1976 Jerry Ford bizarrely claimed that the Soviet Union was not dominating Eastern Europe.  But what made it worse was the campaign’s refusal to acknowledge there had been any misstatement at all. That made it an issue for days afterward.  Same with Marco Rubio’s meltdown at the 2016 GOP New Hampshire debate, when he robotically repeated talking points after being accused by Chris Christie of robotically repeating talking points.  Instead of making a joke about his brain freeze, Rubio doggedly insisted that he’d done nothing wrong, and by the time he apologized for his poor performance, he’d fallen to fifth in the polls.

Be a gentleman to the lady:  Trump can’t afford to bully Clinton like he bullied his male competitors during the primaries.  Chivalry won’t allow it even in the 21st Century, as Barack Obama, a candidate as politically correct as they come, learned when he condescendingly said in one of their 2008 debates, “You’re likable enough, Hillary.”  She got so much sympathy that she surged in the polls (but not enough).  But in terms of gender block-headedness, no one beats Rick Lazio, who during their New York Senate debate in 2000 approached Clinton’s podium, thrust a piece of paper in front of her and insisted that she sign a pledge against soft money.  That looked aggressive and threatening, not manly.

Don’t stray from the podium:  As Rick Lazio demonstrated, very little good can come of wandering away from your assigned spot.  In 2000 Al Gore tried to assert dominance by getting in the face of George W. Bush, and it fell flat when Bush gave him a surprised and disdainful glance. Eight years later John McCain was caught on camera wandering around the stage while Obama was talking, which spawned a ton of online spoofs.

Use your humor: Everyone is so uptight at these debates that a little joke goes a long way.  Most famously, Ronald Reagan overcame the perception that he was too old to be president when he quipped that he wasn’t going to bring age into the campaign and wouldn’t exploit “the youth and inexperience” of his opponent (i.e., the well-traveled Walter Mondale).  This year Trump has been the jokester, but if Clinton could get off a self-deprecating zinger about one of her liabilities, it would help a lot.  But talk about grading on a curve! It wouldn’t take much for Clinton to beat expectations on her sense of humor, given that her most famous joke of the campaign was that she didn’t know who invented Pokemon Go, but wished someone would invent “Pokemon go to the polls.”  Yeesh.

Don’t fall in love with your poll-tested metaphors.  During one particularly painful debate in 2000, Al Gore kept promising to put Medicare in a “lock box” while George W. Bush repeatedly accused him of “fuzzy math.”   They obviously thought these were killer moments.  They weren’t.

Beware the “Town Hall” format.  I really hate these phony Town Halls, where “real” voters are selected to ask questions, no matter how off-the-wall. Plus candidates are expected to sit on stools and casually approach the audience, which is how McCain got in trouble with his wandering. The worst performer was then-President George H.W. Bush, who was understandably flummoxed when a voter asked him how he’d been “personally affected by the national debt.”  Huh? First he looked at his watch, as if he wanted to know how much longer this God-awful ordeal would last, and then he argued with the premise of the question.  If you are asked a dumb question, acknowledge the voter’s pain and then move on to your own messaging.

If someone asks if you’d be upset if a criminal raped and murdered your wife, say yes.  Got that, Michael Dukakis?




As Labor Day approaches, most of us past the age of consent are realizing we’ve been denied one of the season’s sweetest pleasures: that great summer film that everyone’s talking about.  It was left to television to produce 2016’s only terrific summer movie, “Stranger Things,” an homage to the great films of the ‘80s. Unfortunately it wasn’t shown in an actual cinema, but only on Netflix.

Film was the great mass communication medium during the first half of the 20th century, with the average American attending two to three movies per week.  The introduction of television in the 1950s dealt Hollywood a body blow, stripping away its monopoly on visual entertainment and significantly cutting into movie attendance.

But early TV didn’t kill the movies.  If anything, by drawing away viewers interested only in mindless entertainment, TV did cinema the favor of making it a more serious and ambitious medium.  For Baby Boomers, going to movies in the ‘70s and ‘80s was akin to attending the opera or the museum a century earlier, and going to a summer movie with your friends was a rite of passage.

Well, that was then.  By the time 2016 rolled around, the major movie studios had almost abandoned any hope of attracting adult audiences.  To the extent there are still serious movies, they are generally produced by independent film companies and shown in art houses to discerning but small audiences, or released at Christmas so they can be eligible for the Academy Awards.

For the last dozen years or so, Hollywood’s business model has been to create blockbusters that generate hundreds of millions of dollars at the box office.  To get a blockbuster, you need to attract repeat viewing, which has generally meant developing movies for teens or kids who are eager to get out of the house (it’s no coincidence that the groups most likely to go to the movies also watch the least amount of TV).

There was a time when blockbusters meant exciting original content (“Jaws,” “Star Wars,” “Raiders of the Lost Ark,” etc.). Today the industry is fixated on franchises, remakes or sequels.  So it’s no surprise that the top 10 movies of the year to date are literal or figurative cartoons (i.e., animation or action movies based on comic books).  What’s a little bit of a surprise, however, is that box office receipts are down from last year.  Maybe even kids and teens have had their fill of sub-par films.

Is television to blame for this sorry state of affairs?  Has it finally finished off what it started in the 1950s?

In some respects, TV didn’t kill cinema. The film industry itself committed suicide.  No one forced Hollywood to stop making movies that appeal to adults.

And yet you can’t help feeling that much of the talent and energy that would have gone into making general-appeal movies 20 years ago is now focused on TV.  The best blockbuster of the year is not “Captain America.”  It’s “Game of Thrones.”  And the best horror experience is “The Walking Dead.”  And the best documentary is ESPN’s “O.J.: Made in America.”

Hollywood may have been doomed by the artistic and commercial success of “The Sopranos,” which demonstrated there was a mature audience hungry for adult storytelling.  Soon thereafter, Alan Ball, who had won a screenwriting Academy Award for “American Beauty,” took his talents to HBO to produce “Six Feet Under.”  A decade later, David Fincher, the Oscar-nominated director of “The Curious Case of Benjamin Button” and “The Social Network,” produced “House of Cards” for Netflix.  We’ve reached a point where Martin Scorsese, the world’s greatest living director, is now doing occasional TV work, directing the series premieres of both “Boardwalk Empire” and “Vinyl.”

But what’s really drained Hollywood has been the renaissance of the TV mini-series and the anthology series.   Hugely popular in the 1970s and 80s, with such shows as “Roots,” “The Thorn Birds” and “The Winds of War,” these self-contained, multi-episode TV shows have returned with a vengeance.  Mini-series were once a rare and special TV event, but have now become a regular part of the TV diet.

It’s the mini-series that is really drawing star power to television.  Major movie stars like Matthew McConaughey, Woody Harrelson, Colin Farrell, Rachel McAdams and Vince Vaughn agreed to do TV work on “True Detective.”  And Billy Bob Thornton and Kirsten Dunst appeared in “Fargo.”

Storytellers have always craved time to tell their stories.  Exactly 100 years ago, D.W. Griffith brought forth the three-and-a-half-hour blockbuster “Intolerance,” and in 1924 Erich von Stroheim infamously produced the eight-hour-long silent movie “Greed.”  Since then, some of Hollywood’s greatest films (“Gone With The Wind,” the earlier, 1959 version of “Ben-Hur,” “Lawrence of Arabia,” “The Godfather” Parts I and II, and “The Lord of the Rings”) have all clocked in at three hours or more.   Hollywood rarely has the nerve to do that any longer, but HBO and Netflix, with hours of content to fill, are happy to give their storytellers as much time as they need.

Maybe the slate of fall movies will surprise me and the year will redeem itself, but it will be hard for any film to beat the experience and joy of watching “Stranger Things” this summer.

Good luck, Hollywood.  I like to get out of the house, too, so I’m rooting for you.


I recently received a message from Amazon urging me to watch its video programming, which is included in my Amazon Prime membership.  That certainly seemed like a good suggestion — until the time came when I actually wanted to watch something.

The video-viewing apparatus in my living room consists of a stack of machines and devices — but to my disappointment, none of them offers Amazon Prime, not even my much-hyped Apple TV.  Sure, I could watch Prime on my laptop, and last year I did, in fact, watch Amazon’s “Catastrophe” that way, but the experience convinced me that watching “TV” on a laptop while seated bolt-upright in front of a desk is about as satisfying as eating dinner standing in front of the sink.  From a utilitarian perspective they both accomplish their main task, but the aesthetics leave something to be desired.

Preferring to watch video entertainment on my HDTV monitor, I decided to solve the Amazon Prime problem by ordering a six-foot HDMI cable, which I can hook up to my computer when I want to stream onto the TV.

But what a pain in the neck.  Already sitting on my viewing stand are five remotes that control: 1) the monitor; 2) the DVR; 3) the DVD player; 4) my Apple TV; and 5) a cable-splitter device to switch from cable to cable. Now I have a separate cable to connect my laptop.

And it’s not as if these other devices are that easy to master.  The navigation on the Apple TV is so sensitive that I’m constantly landing on the wrong icon or the wrong show.  Of course, to get even this far I needed to go online many times to check Apple TV instructions, since there was no manual with the device itself.  And even now, after all these years, I don’t understand why the DVR will suddenly stop recording shows on my watch list, or why I get reruns when I specifically set it to record first-run broadcasts only.

Whenever I complain about the complexity of watching TV, I feel like the old coot yelling at the neighborhood ruffians to stop playing on his grass.  Why can’t I be more like my Millennial son, who watches TV while lying on his bed with his laptop propped on his stomach?  Get out of the way of progress, you geezer!

Of course we have to be careful not to romanticize the past.  One of the earliest television clichés was the image of the 1950s dad on the roof trying to position the antenna just right, so TV was frequently a pain in the neck even in the days of yore.  Cable solved the antenna problem but created its own challenges with the cable box, which required its own remote control.  And the VCR was so complicated that most people only used it to play videos, not to record anything.

It seems like every time we master one form of technology, the device industrial complex invents another must-have machine. We now live in a world where no one can go into another person’s home and confidently change the TV channel without screwing up the system.  That’s a lesson I’ve learned over too many Christmas visits to my parents’ house.

Figuring out how to work the devices is bad enough — but finding something to watch is even worse.  I know there’s a ton of content to watch, but where to find it?  I’d really like to watch “Orphan Black,” but have no idea how to do that.  I see from a Google search that it’s on BBC America.  Is that part of my cable package?  I guess I could look, assuming I can find my channel guide.  Or maybe it’s on Netflix, but Netflix’s search function is really hard to use.

I’m glad there are so many great shows to watch and so many ways to watch them, but it seems like “television” is about to collapse on itself from the weight of its own complexity.

In the meantime, maybe I’ll just stick to Colbert.  He’s on every night and is waiting for me on the DVR whenever I get home.  Sometimes the path of least resistance is the best option.


So Roger Ailes has been ejected from his throne at Fox News and even barred from entering the News Corporation building.  You won’t find me shedding a tear because eight years ago he tried to get me fired.  What happened to me wasn’t as bad as what has allegedly happened to Fox’s own employees, but it did provide a brief glimpse of Fox’s modus operandi.

At the time of the events in question I was the chief spokesman for Nielsen and caught in the middle of one of those adolescent spitball fights that periodically erupt between media companies.  In one corner was Fox News, which had recently launched Fox Business News, a financial cable network that was supposed to do for financial reporting what Fox News had done to political news.  In the other corner was CNBC, which Ailes had once led before being ushered out the door in 1996.

In 2007, Ailes launched Fox Business with great fanfare. This included a huge ad campaign that took direct aim at CNBC.  The day the network launched, Fox even sent a reporter to stand outside CNBC’s headquarters and announce that it was “hunting season.”

The problem was that the shenanigans that made Fox News a political powerhouse didn’t work with financial viewers, who, since they are making investment decisions involving real money, tend to prefer their financial news to actually be fair and balanced.  The result was that the ratings for Fox Business News were in the toilet.  For the first two months it was on the air, it had an average audience of 6,300 viewers, about as many people as you’d see at a small town’s Thanksgiving Day football game.

The folks at CNBC and NBC were overjoyed by Fox’s flop, but here’s the rub: under Nielsen rules, which had been carefully negotiated with all the media companies, no one can release viewing numbers with a rating below 0.1 (or 0.1 percent of the viewing audience), which in this case would have represented about 35,000 viewers.  This rule is designed to protect nascent cable networks so they aren’t humiliated by low numbers as they’re trying to get on their feet.

This rule usually protects networks that no one’s ever heard of, but Fox Business had launched with so much publicity that everyone in the TV world knew who they were.  CNBC wanted them humiliated but Nielsen wouldn’t release the 6,300 number and CNBC itself could have been sanctioned if they made it public.

Despite this rule, I was not surprised when someone actually did leak the number to New York Times media reporter Jacques Steinberg.  For years The Times and Fox had had a contentious relationship, to say the least.  Their values and biases were diametrically opposed and if there was any publication motivated and powerful enough to stand up to Fox it was The Times.

Steinberg’s call to Nielsen asking for confirmation came at the end of several weeks of furious calls among senior Nielsen, Fox and NBC executives, with NBC pressuring us to make the number public and Fox demanding that we quash the story.  Emotions were running high, with both networks acting like this story was on par with the Pentagon Papers.  Nielsen decided to stay neutral and enforce its own rule; eventually I ended up telling Steinberg that I would not confirm the number.  But I also reminded him that I would steer him away from erroneous information, which is what I would do for any reporter.

The resulting story reported the embarrassingly low numbers for Fox Business, with the Times sourcing it to “a person who saw those internal reports [and] vouched for their contents on Thursday, speaking on condition of anonymity.”  CNBC “declined comment” and Fox didn’t answer emailed inquiries.  I was quoted in the piece by name as confirming the rules around the minimum reporting requirements.

I don’t think I’m breaking any confidentiality agreements when I reveal that Fox is (or at least was) full of vindictive bullies.  Fox News almost always got great ratings but whenever there was a dip, Ailes and his lieutenants would call and complain, threatening some kind of unspecified retribution.  Eventually there would be a war or terrorist attack to drive Fox ratings back up and things would be fine again, but for those months when they were slumping Ailes would make life miserable for Nielsen.

Ailes and the rest of Fox News either believed, or pretended to believe, they were the victims of a left-wing conspiracy, which was ridiculous as far as Nielsen was concerned.  Our CEO David Calhoun was, according to public filings, a steady contributor to Republican candidates, and the rest of the executive team on balance leaned moderately right, to the extent they leaned any way at all.  As for me, it’s right there on my LinkedIn profile that I worked for a right-wing Congressman, served in the 1984 Reagan-Bush campaign and spent time in the Reagan White House.  So I had no ideological problems with Fox News.

In any event, Ailes (or his PR team) was exercised enough about the story to send Nielsen a letter, which, among other things, demanded my head.  The logic of the letter was that since CNBC and Fox had declined comment and I was quoted in the story explaining how the reporting requirements work, I must have been the one to have leaked the number to The New York Times.  I doubt that even Ailes believed this bit of fallacious logic; instead I think the purpose of the letter was to punish me for refusing to play along with Fox in killing the story, which would have been impossible without outright deception.

Surprise! Nielsen didn’t fire me, viewing this as another Ailes tantrum, and he seemed to get over his fit of pique pretty quickly since the name “Gary Holmes” never appeared in any future Ailes correspondence or conversations.

Jacques Steinberg was not quite as lucky.  Fox launched a nasty on-air campaign against him, at one point even featuring him in an anti-Semitically doctored photo.  Nice.  With Ailes gone, will these bullying tactics also disappear?  We can only hope.