I don’t think many journalists ever interviewed Pete Sheehy, but I was among the few who did. Pete, who was the clubhouse man at Yankee Stadium for about seven decades, didn’t like to talk, and I suppose that accounts for the fact that he made only rare appearances in print. I arranged an interview through a mutual friend, and I wasn’t with Pete for very long before I realized what a challenge I had taken on. In fact, Pete was forthright about it — in his way. He told me that he figured he had kept his job for so long, being in the confidence of members of the Yankees and, for a time, the football New York Giants, because he knew how to keep his mouth shut. Whatever he knew about Babe Ruth, Billy Martin, and Mickey Mantle, he kept it to himself.
When I asked him, though, to name the greatest player he had seen in all those years, he gave me a one-word answer: “Joe.” He didn’t have to say any more. “Joe” meant DiMaggio, and his choice didn’t surprise me. My father had been a Yankee fan since the Ruth era, too, and although I never asked him, I am confident that he would have said “Joe” as well — despite a reverence for Lou Gehrig.
DiMaggio had an outstanding career. He was among the very best hitters, baserunners, and outfielders of his time or any time. Not the very best, necessarily, but one of the best. As Kostya Kennedy mentions in his book, 56: Joe DiMaggio and the Last Magic Number in Sports, a poll taken in 1969 named DiMaggio the “greatest living baseball player.” DiMaggio believed it; he was that kind of a guy. But there were skeptics who noted, for instance, that Ted Williams, DiMaggio’s contemporary, outstripped the Yankee in every major hitting category and had a longer career, despite combat duty tours in two wars.
If there is an inequity in the way DiMaggio is regarded, it may be attributed at least in part to the fact that he played for the New York Yankees while they were the preeminent team in baseball if not in sports in general. DiMaggio appeared in 10 World Series in his 13 years in the majors.
But the primary reason for the aura around Joe DiMaggio may be the record he set 70 years ago this season — the record that was the occasion for Kennedy’s book. In the 1941 campaign, DiMaggio got a base hit in 56 consecutive games.
To put that record in context, Kennedy points out that more than 17,000 men have played Major League baseball, and only DiMaggio has hit safely in that many consecutive games. The only others to come close were Willie Keeler, who hit in 44 straight games in 1897 in the dead-ball era, and Pete Rose, who hit in 44 in 1978. (Keeler’s streak began on the first day of the ’97 season, so the hit he got in the last game of ’96 puts his official record at 45.)
The subtitle of Kennedy’s book refers to the fact that while DiMaggio’s record once formed a holy trinity with Babe Ruth’s single-season and lifetime home run records, Ruth’s marks have been exceeded several times and in some cases under questionable circumstances. DiMaggio’s 56 is the only individual record of its kind still standing.
Kennedy describes in his very literate book the atmosphere in which the streak occurred. It captured the attention of the whole country — and even folks in some other countries. DiMaggio’s sizable family, people who were tight with him, baseball fans, and people who didn’t know anything else about him or the game were all caught up in his day-to-day progress. Everywhere, Kennedy writes, people stopped to ask each other: “Did he get a hit today?”
And, as Kennedy artfully shows, this didn’t happen in a vacuum. In 1941, there was something far weightier on people’s minds — the increasing aggression of Nazi Germany. The idea that the United States could stay out of the war seemed more and more like wishful thinking as American plants turned out matériel to assist the European allies and as more and more American men were drafted into military service. DiMaggio’s streak was a welcome respite in such an atmosphere — the counterpart, in a way, to Susan Boyle’s triumph on Britain’s Got Talent in the midst of worldwide recession and seemingly pointless wars.
The streak served another purpose, too. It was something for Italian-Americans to cling to with pride as they — thanks to Benito Mussolini — came under the same kind of suspicion that was being directed at Americans of Japanese and German background. Indeed, DiMaggio’s own father, Giuseppe, who had made his living as a commercial fisherman, was placed under wartime restrictions that kept him from approaching San Francisco Bay.
In telling this story, Kennedy carefully constructs a portrait of DiMaggio that isn’t at all endearing. DiMaggio was a cold fish. He was known from his youth for his spells of silence. Kennedy writes a lot about DiMaggio’s relationship with his first wife, movie actress Dorothy Arnold, and that isn’t a happy tale. DiMaggio — in spite of the girls he invited to his hotel rooms — missed Dorothy when he was on the road. But when he was home, he stifled her, resented her, and often subjected her to his emotional and sometimes his physical absence.
This book is peppered with interesting characters who played large and small parts in DiMaggio’s life — his relatives, including his major league brothers, Dom and Vince; his somewhat “connected” Italian-American friends in Newark; his fans — not the least of whom were the boys Mario Cuomo and Gay Talese; and, of course, his fellow ballplayers: Gehrig, Phil Rizzuto, and DiMaggio’s wacky road-trip roommate, Lefty Gomez.
On the field, DiMaggio appeared impassive as the streak progressed. If a pitcher had boasted that he would stop DiMaggio, and DiMaggio got a hit off him, there would be none of the fist pumping that cheapens the game today. Inside, however, Kennedy writes, DiMaggio’s stomach was often in knots. And, of course, if he didn’t have to talk about the streak, he didn’t:
“‘You nervous about the streak?’ a reporter would call out, and it would be Lefty who would turn and reply, ‘Joe? Nah, he’s fine. Me? I threw up my breakfast.’”
In a post on May 14, I mentioned a song written in 1920 by Harry Ruby and Bert Kalmar: “So Long, Oolong (How Long Ya Gonna Be Gone?)” I didn’t mention that I happen to have a recording of that song, sung by Frank Crumit on the Columbia label. Crumit was a popular singer and radio personality who also wrote about 50 songs, including “Buckeye Battle Cry,” which is played at Ohio State University football games.
My recording of the Ruby-Kalmar song is a 78 rpm shellac disk. I could play it on the electric turntable that we use to listen to our 33 rpm LPs, but I don’t. I play it on our 1927 model wind-up Victrola. I have an odd assortment of records stored in the cabinet of that phonograph. Most of them are 10-inch disks, but there are a few 12-inch disks. Some of these are recorded on only one side – including a 12-inch Victor record of Giovanni Martinelli singing “Celeste Aida” from the Verdi opera. By the mid-1920s, Crumit was recording for Victor, so the recording I have must date from before that. The Martinelli recording was made at the Victor studios in Camden on Nov. 25, 1914. I was able to determine that at THIS LINK, which is a complete catalog of Victor recordings. An interesting detail is that the Martinelli recording – one side only – sold for $1.50 and the Crumit record sold for a buck.
I got on this subject because of a story I read today HERE, on the Boston Globe web site. The nut of this story by Sarah Rodman is as follows:
As consumers buy fewer and fewer CDs, an interesting phenomenon is occurring — artists who appeal to older listeners are showing up surprisingly high on the charts.
The reason: Adults are largely the ones buying CDs these days. Younger people tend to download in general and focus on singles.
The story makes it clear that while this isn’t universally true, it’s a clear trend. It’s also interesting to note that “surprisingly high on the charts” is a relative concept. Rodman points out that a reissue of a Rolling Stones album recently hit the charts in second place on the strength of about 76,000 sales. In the “early 2000s,” the writer explains, a recording had to achieve six figures just to be in the Top 10. The early 2000s are already “the old days.”
The acts the story cites as appealing to “older listeners” are an eclectic group that includes Sarah McLachlan, Sade, Barbra Streisand, Michael Bublé, and Susan Boyle.
There is a lot of discussion about the changes that have taken place in the recording industry. Like some other fields affected by rapidly evolving digital technology, this one presents a variety of challenges to everyone involved. And the challenged include people like me, who have lived through all of the developments in recording except wax cylinders — and who have accumulated evidence of every stage.
Besides the heavy shellac records and the acoustic talking machine, we have boxes of 45 rpm records in the garage — including a duet by Connie Francis and Marvin Rainwater — hundreds of LPs in the living room — dozens of cassette tapes (and several cassette players, including the one in my Beetle), CDs all over the house, and a couple of MP3 players. The only stage we skipped was 8-track.
There’s an episode of “Seinfeld” in which George Costanza, having seen “Les Misérables” on Broadway, can’t get the song “Master of the House” out of his head. We’ve all had a similar experience, and it can be annoying. I read a book by the neurologist Dr. Oliver Sacks in which he discusses the possible causes of this phenomenon. Pay attention. I think the next step in sound technology will be a chip implanted in the listener’s head and songs transmitted directly into the brain.
Meanwhile, is there a market for all these jewel cases?
July 11, 2009
The Christian Science Monitor has joined the chorus whose song is that Michael Jackson was likely one of the last “mega-stars.”
A story in the Monitor this week, written by Stephen Humphries, included these passages:
That Jackson could command such an audience is testament to the kind of globe-straddling star power that was possible in an earlier, simpler entertainment age. Amid today’s fragmented popular culture, in which an unlimited buffet of mass media has segregated consumers into niche-oriented tribes, Jackson was arguably one of the world’s last superstars.
“It isn’t just that Michael Jackson was the last superstar because he was one of the last people to benefit from an unfragmented media,” says Timothy Burke, a cultural historian at Swarthmore College in Pennsylvania. “He may also have been one of the last people who could surprise us with a stunning innovation where we didn’t have that sense already of being so jaded by the ubiquity of spectacularly good entertainment. That someone could just leap on the stage and do this thing, and you could go, ‘Wow, I’ve never seen that before!’ “
I don’t know that either niche marketing or a need for innovation supports the bold prediction that no one after Jackson will be able to appeal to a global audience.
Luciano Pavarotti, for example — whose estate was worth about a half billion dollars the last I read about it — appealed to millions of people all over the world, including people who knew nothing about opera, including people who did not want to know anything about opera, and he didn’t appeal to them because he was an innovator — certainly not in the sense that Michael Jackson was. Pavarotti’s performance was pretty much traditional. Whether he was, as a friend of mine claimed, “the second greatest tenor in history,” is a matter of conjecture — and conjecture, I might add, that has no real meaning. Most of those who bought Pavarotti’s recordings, attended his concerts, and watched his television appearances wouldn’t know if he were second greatest or not. What they knew was that they liked him, and that was all that mattered.

The implication of my friend’s remark was that Enrico Caruso was the greatest tenor in history, and Caruso and Pavarotti were alike in this: There was something about each of them that simply appealed to people, including those not normally in the opera crowd. The very fact that the something can’t be quantified, while both tenors’ enormous audiences and coincident earnings can be quantified, should tell us that it’s foolhardy to predict that no such performer will appear again.
Susan Boyle’s experience is also instructive. The record-setting video on YouTube featured Boyle, not Jackson. That doesn’t imply any parallel between the two as performers, and that’s exactly the point. Boyle’s appeal was unpredictable. No one saw it coming. And I dare say that even experts in the field, if they had heard Susan Boyle perform before her appearance on the British TV competition, would not have foreseen her appeal, which has cut across all the usual borders of musical taste and which, it is important to note, has been a function of a new mode of almost universal communications whose implications and whose future we can’t even imagine. Jackson only got to scratch the surface of the rapidly evolving technology. Even if Susan Boyle turns out to be a comet that will soon fade to black, we don’t know that there won’t be another Susan Boyle who will burst out into the world via YouTube or some unforeseen successor to it and redefine the concept of a “star” in ways we haven’t dreamed of.
June 3, 2009
Andy Burnham, the British culture secretary, wants the Office of Communications to investigate whether the television network and the producers of “Britain’s Got Talent” had acted responsibly toward Susan Boyle in the runup to the show’s finals. The implication is that the people behind the show that vaulted Boyle from the obscurity of a Scottish village to the limelight of YouTube should have done a better job of protecting her from the effects of sudden fame.
Burnham made reference to Britain’s broadcast code when he called for a determination that “duty of care” had been exercised with respect to Susan Boyle, who was briefly hospitalized for exhaustion after coming in second in the show’s finals. The Office of Communications doesn’t think the broadcast code covers what happened to Boyle, but Burnham said: “We are living in a world where it is not just about what happens on telly on a Saturday night. There is 360 degree scrutiny, 365 days a year. We need to look after people, not just around the camera. Broadcasters should always put people’s welfare first.”
This has prompted some bitter responses from readers of The Times of London, some sympathetic to Susan Boyle, some not. Some of the readers were outraged that the government would even think of becoming involved in a trivial, private matter. I liked the comment from Al of Manchester:
The UK is full of cruel people feasting on a diet of bile soaked Tabloid fodder and Reality TV trash. First they jeered and sneered at Susan for not looking like a singer and now they do the same because she not “tough enough to take it”. What a sad place and sad people we’ve become.
And Jessica of Eastbourne:
Can I just say that “they” did not treat Susan any differently than any of the other contestants. Susan was a victim of the throwaway celebrity culture that the UK and the US fawn over so much. If anyone “threw her away” it was the public, and the show’s producers are not as much to blame as we are.
What I loved about the reporting of this story is that after the universal handwringing and public penance over the snickering and eye-rolling when Susan Boyle first appeared on the show, the media couldn’t mention her without pointing out how “dowdy” she is, how unlikely a celebrity she is, or without calling attention again to the fact that she is a “virgin” who has “never been kissed.”
April 17, 2009
It’s interesting to listen to the vocabulary in media reports about the cruelty directed at Scottish villager Susan Boyle by the judges and audience on a British television show. Piers Morgan, one of the judges on the program, acknowledged after Boyle’s unexpected performance that while “everyone was laughing at you” before the song, “nobody’s laughing now.” Morgan hadn’t laughed, but the camera caught his wince when Boyle first appeared on stage. Amanda Holden, another judge, was not shown overtly reacting to Boyle, but when Boyle had finished singing and the pandemonium in the studio had died down, Holden offered what seemed like a sincere communal apology for the “cynicism” that had greeted Boyle.
In the Los Angeles Times today, writers Josh Collins and Janice Stobart report on the Boyle phenomenon, which has set records for YouTube hits. It was a balanced story over all, but I wonder about their lead: “Less than a week ago, she was just another 47-year-old Scottish virgin.” The ingenuous Boyle had revealed that detail to producers, and the media has gleefully latched onto that term – “Scottish virgin” – as though to demonstrate that, despite her talent, it’s still okay to ridicule this woman, to constantly call attention with an oh-so-wry wit to what may be a painful part of her most private life. In the second paragraph of the story – the one-two punch being an effective offense – the writers describe Boyle as “a stocky, beetle-browed woman who would not ordinarily rate a second glance on the street.” (Emphasis mine.) Boyle is stocky and beetle-browed, but I would describe her appearance as unexceptional; I don’t know who licensed the Times writers to judge who does or does not “rate a second glance,” but I’m sure they enjoyed exercising the privilege. Presumably, they don’t understand the implication that if Boyle couldn’t sing, she wouldn’t “rate” anyone’s notice. If that’s the standard, I, for one, am in trouble.
The writers quoted Tanya Gold’s commentary in the Guardian: “Why are we so shocked when ‘ugly’ women can do things, rather than sitting at home weeping and wishing they were somebody else? Men are allowed to be ugly and talented.” The quotation marks, I suppose, were Gold’s way of softening the expression, but I wouldn’t have used the term, with or without the quotes, to describe Susan Boyle. Most people’s faces are interesting – nice, in some unique way. This is something I notice when I’m holding the Communion cup at Mass. I’ve been doing that for many years; I haven’t seen an ugly face yet.
The writers also referred to Marie Dressler, a film and vaudeville actress who, very late in life, became such a major movie star that she appeared on the cover of Time. “I’m too homely for a prima donna,” she once said, “and too ugly for a soubrette” – “soubrette” being a coquettish sort of stock character in the theater and the opera. That was Dressler’s self-assessment, but you may be sure that once she re-emerged from obscurity and poverty in the 1930s, she was the only one calling Marie Dressler ugly. As for Susan Boyle, what’s wrong with that smile?
The sensation caused by Susan Boyle’s appearance on a British talent show should be embarrassing to more than those judges and that audience who initially dismissed her based on her appearance alone. They didn’t dismiss her because her appearance was somehow exceptional; if they had met her at a backyard party in her village they might not have given her appearance a second thought. Rather, they either assumed from the outset that a person that ordinary, a person of that age, could have nothing to offer in the way of talent, or they assumed that no matter what she had to offer in the way of talent, it couldn’t be enough if she looked like that. The judges and the audience weren’t alone in making such assumptions. They live in a larger world in which uncounted people of enormous talent go unnoticed while mediocrities like Britney Spears make headlines regardless of their lack of artistic gifts.
The Susan Boyle phenomenon calls to mind the experience of Kate Smith, one of my favorite singers of standards. Early in her career, she had to put up with ridicule, especially “fat girl” jokes, but in the end the public couldn’t ignore her musical prowess. Based on the only thing I’ve heard Boyle sing, she reminds me of Kate Smith in that the power and clarity of her voice are enhanced by her ability to deliver the song. Kate Smith was a favorite among lyricists for that reason, which explains why she introduced more than 650 hit songs during her radio and recording career.
My guess is that we didn’t learn anything – at least, not permanently – from the Susan Boyle incident, but it will be justification enough for her if she flourishes in a mid-life career. I, for one, would love to hear more.