Happy July 4th! Here is my American story.

Happy 4th of July!

Let me first note, transparent in my pedantry, that independence was actually approved by the Continental Congress on July 2, 1776. The Declaration of Independence itself, however, was dated July 4, 1776 and signed on August 2, 1776.

Allow me next to relate I was physically born (at long-since-closed Metropolitan Hospital, then at 3rd and Spruce) roughly 1/5 of a mile (about 4½ city blocks) southeast of Independence Hall, where both the Declaration of Independence and the Constitution of the United States were written.

And permit me to conclude with the fascinating coincidence that both the 2nd president of the United States, John Adams, and the 3rd president of the United States, Thomas Jefferson, died on this day in 1826—exactly 50 years after the date we designate as our official day of independence from England.

Or, rather, that is how I conclude these introductory paragraphs.

**********

A few hours ago, I began to write a thread on Twitter. It opened thus:

1/ For July 4, I present my American story.

I was born in Philadelphia–where the Declaration of Independence and Constitution of the United States were written.

I was adopted in utero in the late summer of 1966. Both of my (legal) grandfathers were born outside the US.

The thread ties together the various elements of my background into a single, “American” story. Regular readers of this site will not be surprised, given a series of posts I have written (collected here) telling parts of this same story.

Moving right along:

2/ Morris Berger was born in what is now Poland in 1894 and came to the US when he was 4 years old. A Yiddish speaker, he became a successful business owner and Jewish community leader in West Philadelphia.

His son David Louis was my (legal) father.

He went the other direction.

Two things here (besides proudly observing I was given the Hebrew name Moshe ben David Leib in his honor).

One, the year of my (legal) paternal grandfather’s birth is incorrect. Twitter, however, lacks an edit function, so I could not correct this tweet once it was posted.

Two, there is some uncertainty as to when, exactly, Morris Berger (and three of his siblings) was born.

Next:

3/ Charming, gregarious and generous, “Lou” spiralled down after his iron-willed mother died in 1972. A gambling addiction cost him the business his father and uncle had built. He also lost his marriage–though he never lost me. He died, broke, from a heart attack at 46 in 1982.

David Louis “Lou” Berger died on June 30, 1982, meaning the 37th anniversary of his death was four days ago. By an egregious act of bad timing, June 30 is also the birthday of a close cousin. In fact, my mother and I spent the evening he died at a birthday party for this cousin. As we walked in the front door of our apartment after the celebration, the phone was ringing shrilly. My mother walked behind her white-and-chrome desk to answer it. It was her ex-husband’s—what is the adult form of “girlfriend”?—calling from her hospital bed to inform us of Lou’s sudden passing.

At the time, he was driving a cab for a living (quite happily, I hasten to add, because it gave him a freedom he had rarely known). He was headed to Little Pete’s diner (which closed in 2017) to meet some fellow cabbies for a meal when he collapsed on the sidewalk in front of the Warwick Hotel (where my wife Nell and I stayed a few times early in our relationship). He was dead from his third heart attack in 10 years before he hit the ground.

Ignoring decades-old tears and moving on:

4/ Yisrael HaCohen was born in what is now Ukraine in 1904. He came to the United States when he was 7, speaking Yiddish. To join the Philadelphia Police Department in the 1930s, he changed his name to Samuel Kohn (sounded less Jewish) and changed his birthplace to Cleveland.

This story I have told before, so let us proceed:

5/ He served for nearly 20 years, rising to Detective. He ultimately retired to Atlantic City.

His daughter Elaine was my (legal) mother.

Serious reproductive health issues (and hysterectomy) led her only natural child (b. 1962) is “severely intellectually disabled.”

Again, one cannot edit a tweet—that should read, “led…to be.”

Because it is better to laugh than to cry, I sometimes tell the following “joke”: My mother had two miscarriages and a hysterectomy, and then I was born!

It was not until I became my sister’s sole legal guardian and began receiving her annual Life Enrichment Plans that I learned the extent of my mother’s reproductive miseries. Besides the two miscarriages—and a prolonged, painful labor that deprived her daughter of oxygen at critical moments during the birth process—Elaine Berger also had uterine cancer. Thus, the hysterectomy.

Oy.

Next:

6/ I am my sister’s legal guardian. She lives in a facility run through private-public partnership; she is funded through supplemental Social Security income. Thank you, FDR.

Elaine took the opposite path from Lou. After her marriage ended in 1977, she worked a minimum wage job.

She actually took that job—cold-calling folks on behalf of the A-1 Carpet Cleaning Company—sometime around October 1976, as her marriage was inexorably coming to an end.

And I must say this: the end of my (legal) parents’ marriage was about as amicable as such an event can be. As painful as it must have been (the night before they officially separated was the only time I saw my father cry), I will always be grateful to them for this civility.

Meanwhile, this is what I mean by “supplemental Social Security income.”

Moving on:

7/ Eventually, Elaine bought that business and, with some help from her own business-owning mother, made a good living for nearly 25 years.

But her reproductive issues returned, and she died from ovarian cancer, aged 66, in 2004.

Oh…her mother. Irene Gurmankin, later Goldman.

Yes, my (legal) maternal great-grandfather—or, at least, his four daughters—also Anglicized the family name.

Three years after Elaine Berger began as a minimum-wage-earning telephone solicitor, the owner—a lovely man named (if memory serves) Schwartz—retired. My mother worked out a deal with the man who owned the actual carpet-cleaning machinery to run the business together. A few years after that, this other man retired (or something, my memory defies interrogation on these points), and Elaine Berger took over the A-1 Carpet Cleaning Company (a two-person operation—three when I pitched in, mostly by filing or placing leaflets on car windshields—to be sure) for good.

Here she is in 1988 running that business (same desk, different apartment) with her two children framed in the background:

[Photograph: Elaine Berger at her desk, 1988]

Next:

8/ After divorcing Samuel Kohn in (I believe) 1964–a rarity in those days–she started a cosmetics and costume jewelry business. That business–and her own iron will and fierce work ethic–became fairly successful, allowing her to live comfortably until her death at 92 in 2007.

For some reason, Irene Kohn (she kept the surname) soon moved 60 or so miles west to Lancaster, Pennsylvania, where she set up shop at the newly-opened Host Farm. Because of her beauty and extroverted (if sometimes cruel—my relationship with her was complicated) charm, she quickly established herself as the unofficial hostess of the sprawling resort. This was a great boon to my cousins and me, who effectively had the run of the place (two pools, a game room, a gift shop, three great restaurants with employee discounts, endless hallways to explore, a superb daylong program called the Peppermint Parlor). Heck, I got to see my man Rupert Holmes perform in the Host Farm Cabaret (for free) in the summer of 1981!

She finally moved back to Philadelphia in 1984, though she never actually retired, running a mail-order business for loyal customers well into her 80s.

Next:

9/ Meanwhile, Morris Berger died, aged 61, in 1954 (correction, he was born in 1893–if only Twitter allowed editing), and Samuel Kohn died, aged 73, in 1978.

OK, that is my legal family, the only family (prior to marriage and parenthood) I have ever known.

I really wish I could have known my namesake—whose death was one of a series of blows to young Lou Berger, who was asked to shoulder more responsibility than he was prepared for. As for “Pop-Pop Sam,” for all his “combative personality” and temper, he was a kind and loving grandfather, and I miss him still.

The next few tweets in the thread speak for themselves:

10/ Here is what I know about my genetic family.

My maternal grandmother could trace her ancestry–and family presence in the United States–to the 1700s. English, Dutch. Her ancestors primarily lived in the southeasterern [sic] United States.

Where they fought for the Confederacy.

**********

11/ Alice Mulkey married an Irish Catholic Philadelphian named William Dixon, and moved to Philly. Their first child is my genetic mother.

They lived in what was then a working class area

At 19, while working at Philadelphia’s Drexel University, she met my genetic father.

**********

12/ This part is…fuzzy…so I elide it.

However, the man she met was almost certainly the only son of legendary naval historian Reuben Elmore Stivers. Assuming I am correct, my genetic father died in 2006.

The Stivers family also goes back in the United States to the 1700s.

I exaggerate only slightly when I use the word “legendary” to describe the man who is almost certainly my (genetic) paternal grandfather. When I explained to a different cousin, who serves his country ably and proudly as a Lieutenant Commander, Naval Intelligence, that “Smokey” Stivers was likely my ancestor, he said admiringly, “Oh, THAT Reuben Stivers!”

Continuing the thread:

13/ Except they were primarily in Kentucky.

And those men fought for the Union during the Civil War.

“My” branch settled in the Maryland suburbs of Washington, DC. This could explain my (legal) mother’s belief that my genetic father was Colombian.

I miss her (and my father).

Two points.

One, it was not just Kentucky. It was specifically around Lexington, Kentucky, based on what I have learned on Ancestry.com and through discussion with newly-discovered genetic cousins (who have been unfailingly gracious).

But more to the point, I was shocked to learn my genetic ancestors fought each other (perhaps literally, I do not know) in the American Civil War; ponder that counterfactual for a while. This discovery also fits well with the “split identity” theme of my first post.

Two, Elaine Berger was so convinced (after a bad game of Telephone: my genetic mother conveyed what she knew to Modell, who passed it on to his client, who probably misunderstood “District of Columbia”—which had only just received its three electoral votes—as “Colombia”) of my genetic paternal heritage that she went to the library to see what Colombian children looked like. I do not know what photographs she saw, but she told me numerous times she thought I would be black, or at least much darker-skinned.

She was one of a kind, my mother was.

14/ Upon learning she was pregnant, my genetic mother–unmarried and lacking means–chose to put me up for adoption.

That adoption was arranged through another child of Jewish immigrants, Herman Modell.

How, you ask, did my (legal) father and uncle know the powerful Mr. Modell?

I scrupulously avoid injecting my own political beliefs into this site, but I make an exception here.

Had I been conceived seven years later, my genetic mother could have had her fetus legally aborted, thanks to Roe v. Wade.

Now, because of her Catholic upbringing—and this is pure speculation on my part—my genetic mother may have carried me to term anyway. She also may have been living in different economic and/or personal circumstances after January 1973. The counterfactuals make my head spin.

And let me back up a second here.

Nell and I have discussed on more than one occasion how much of a role privilege (read: white privilege) plays here. Her own mother was raised with a modicum of wealth, and there is no doubt that if she had found herself with an unwanted pregnancy prior to 1973, her family would have quietly arranged an abortion for her. It is a near-certainty my genetic mother had no such option (which is why, as long as I am shouting from my soapbox, I have always been opposed to the Hyde Amendment—it denies less well-off women access to a Constitutionally-protected medical procedure and is thus, frankly, unconstitutional. Talk about an “undue burden”!).

But if, under ANY circumstances, my genetic mother had chosen to abort the fetus gestating in her womb—the fetus that would not really become yours truly until the end of September 1966—I would absolutely and unequivocally support that decision.

It was her body, so it was her choice. As it is for all women, everywhere. If you do not like abortions, do not have one, but do not sit in any sort of judgment on any woman who makes that most painful of decisions in private consultation with her medical providers and selected loved ones.

Just as I do not get to sit here, more than 50 years later, and judge my genetic mother for any decision she made (or did not make, or could/would have made). I did not yet exist as an autonomous being…and if I had never existed as an autonomous being, so be it. It was never my decision to make.

My (legal) mother would often remark something to the effect of “If men could get pregnant, you would be able to get an abortion on any street corner.”

For a woman with only a few years of post-high-school medical technician training, she saw things with exceptional clarity.

Returning to my Twitter thread:

15/ Through their simultaneous membership in La Fayette Lodge No. 71.

Yes, my (legal) father, his uncle and the powerful lawyer who arranged my adoption were brother Freemasons.

To be fair, my (legal) father was asked to leave La Fayette Lodge No. 71 for non-payment of dues.

I have told some of this story before, so let us move on; see also here. I would just add that anyone who knew my father—and realizes he was a Freemason for about 10 years—would find any belief in the myth of the Freemasons’ controlling influence quickly evaporating.

16/ But consider this.

When the unplanned child of two people who could trace (mostly) ancestry in the United States to the 1700s was placed for adoption, with whom was he placed?

The children of Yiddish-speaking immigrant fathers who had built successful lives in Philadelphia.

And there it is…thank you for continuing to “just bear with me.” Often lost in our collective squabbles over immigration: the descendants of recent immigrants frequently do better economically and socially than the longer-term “original settlers.”

Speaking of bearing with me:

End/ I was fortunate to be raised by loving parents of some means in the leafy suburbs north and west of Philadelphia. Nature and nurture cooperated successfully, and I enrolled in Yale College in 1984, sparking a fairly successful life of my own.

And that is #MyAmericanStory

Here is a photograph of those leafy suburbs, as my (legal) father holds his two children (backstory here):

[Photograph: Sue Ellen Drive, February 1967 or October/November 1967]

And here I am with my legal mother and maternal grandmother at my graduation from Yale in 1988.

[Photograph: Yale graduation with Nana and Mom, 1988]

Here is the first postscript:

PS/ I am writing a book (inspired by, of all things, trying to explain why I love #FilmNoir so much) detailing this history. Working title: Interrogating Memory: Film Noir, Identity and a Search for Truth.

For more, please see justbearwithme.blog.

Thank you, and Happy 4th!

Hmm, this is getting very circular.

And, finally:

PPS/ My profile picture is from my (legal) parents’ wedding in January 1960. Their wedding, literally and metaphorically, took place about half a mile south of City Line Avenue. They were on the Philadelphia side, but maybe they could see their future home in the suburbs.

For those of you who do not follow me on Twitter (tsk, tsk–@drnoir33), here is that photograph:

[Photograph: Elaine and Lou Berger with their parents, January 17, 1960]

I do not know who the gentleman on the far left is (a great-uncle?), but from left to right are Rae Caesar Berger (mother of the groom), Lou Berger, Elaine Kohn Berger (photograph taken after exchange of vows), Irene Kohn (mother of the bride) and Samuel Kohn (father of the bride).

I LOVE this photograph, even if the men on either end look dyspeptic.

Please have (or continue to have, or I hope you had) a safe and festive holiday!

Until next time…

Organizing by themes IV: Bipartisanship and civil discourse

This site benefits (or suffers, or both) from consisting of posts about a wide range of topics, all linked under the amorphous heading “data-driven storytelling.”

In an attempt to impose some coherent structure, I am organizing related posts both chronologically and thematically.

When I first launched this blog in December 2016, I decided that if I were going to write about American politics—however “objective” my analyses and transparent my methods—I should be careful not to be seen merely as a partisan or ideological hack.

Thus, in only my second post, I laid out what I considered to be my bipartisan bona fides, while also making clear that I am a proud liberal Democrat. The two are not inconsistent.

Over the next six months, as I wrote a great deal about American politics—particularly reflecting on the 2016 presidential campaign—I chose, with one exception, not to refer back to that post.

But as the resistance to President Donald Trump heated up in the spring and early summer of 2017, I began to be disturbed by the nascent tit-for-tat nastiness of some of my fellow liberals (or progressives, or whatever the label du jour is). I found myself writing long Facebook posts that were more or less erudite versions of “two wrongs don’t make a right.”

The end result was that in June 2017, I crafted what remains the post of which I am proudest: Two distinct restaurants. Two different conversations. One unanswered question.

One conversation (about gun rights) was with a cultural conservative in exurban Philadelphia (near where I was raised), while the other conversation (about the 2016 presidential candidacy of Democrat Hillary Clinton) was with an ardent progressive in Brookline, MA (where I live now). The former conversation was polite and informative, the latter confrontational and head-scratching.

And the question I still have is:

When do you stick to deeply-held principles, and when do you set them aside to advance the common good?

The answer may have something to do with lowering your voice, listening to other points of view and questioning your own certainty.

I have linked to this post on Twitter (less so on Facebook, which I have all but abandoned) more often than any other post. Granted, Twitter is not exactly renowned for being “where cooler heads prevail”—but that will not stop me from trying.

Four months passed, during which I spent a great deal of time (or so it felt) arguing for the repeal of Amendment II on Twitter (see caveat in previous paragraph). The…umm…pushback I received prompted me in October 2017 to write Unpacking Twitter arguments, both coherent and incoherent.

This was the “Featured Image” on that post. It still sits on my desk, where I can easily access it.

[Photograph]

I did not write specifically about bipartisanship again until April 2018, but the notion clearly suffused the following posts:

What if Dewey HAD defeated Truman?

Dynamics of the Party System

Manifest(o) Identity

The last of these posts, from May 2018, was a first response to what I saw as a rapidly growing and dangerous epistemological crisis (which still exists) in the United States: the division of American citizens into ideological media silos, wherein we only “accept as true” information we receive from our preferred sources.

As a recent birthday gift shows, I am not immune to such siloing; MSNBC rules our weekday evenings.

[Photograph: a recent birthday gift]

In June 2018, I began to proffer a specific form of bipartisan action as the cure for our epistemological crisis—a willingness to vote across party lines, while still staying true to one’s fundamental political views. In Bipartisanship as patriotism, I announced I would vote to reelect Republican Charlie Baker governor of Massachusetts; my wife Nell and I both followed through on that pledge with no regrets.

Just one week later, I published a hopeful piece about the vacancy on the United States Supreme Court created by the retirement of Associate Justice Anthony Kennedy. I did not really expect a more centrist nominee from President Trump, but neither did I expect to have a personal connection to his eventual choice.

Finally, my most recent posts dealing with bipartisanship (other than an exhortation to be involved in the process, whatever your political perspective) came after the deaths of two Republican icons I came greatly to admire (despite our ideological differences and their all-too-human foibles):

John McCain

George Herbert Walker Bush

Rest in peace, gentlemen. You served your country with honor—and did your best to act in accordance with what I wrote on my home page: “It really is possible to disagree without being disagreeable.”

Until next time…

Samuel Joseph Kohn: exemplar of the Jewish immigrant experience

He had been a powerfully-built man, which served him well when he spent nearly two decades as a Philadelphia police officer (rising as high as plainclothes detective in the late 1940s). His 1940 World War II draft card lists the then-36-year-old patrolman as 5’10” tall and 210 pounds, dark-complexioned with black hair and brown eyes. But the stroke he suffered at 73 had left the right half of his body paralyzed, making him seem much frailer. Nonetheless, as I stood next to his bed in the rehabilitation center, he had more than enough strength in his massive left hand to grip my smaller 11-year-old hand tightly.

As he held my hand, he made me swear to him that I would become either a doctor or a lawyer, the professional pinnacles for late 19th– and early 20th-century Jewish immigrants and their immediate descendants. At that age, I was far more interested in math and history—and not particularly good with blood (I still am not)—so neither option appealed to me.  However, I adored my grandfather, so I did as he asked.

None of us in that room, in the early fall of 1978, had any idea what “epidemiology” was, but that is the field in which I earned my doctorate 36 years later.

That counts, right?

**********

Wednesday, December 12, 2018, would have been Samuel Joseph Kohn’s 114th birthday. Here he is, with my grandmother Irene (whom he would divorce only a few years later), at his younger daughter Elaine’s 1960 wedding to David Louis Berger. Six years later, the young married couple would adopt a boy and name him Matthew…but that is an entirely different story.

[Photograph: Irene and Samuel Kohn, January 1960]

More precisely, what I learned growing up was that a man named Samuel Cohen had been born in a town near Kiev (in modern-day Ukraine) called Shpola (sometimes Shpolakievagubernia[1]) on December 12, 1904. And that date of birth is clearly recorded on his headstone:

[Photograph: Samuel Kohn’s headstone]

However, as I wrote with regard to my paternal grandfather Morris Berger (and his four younger siblings), dates of birth are hard to pin down when official American birth certificates are not available. Decades after the fact, researchers (and curious descendants) are forced to rely on documentation such as naturalization papers, military service documents and United States Census (“Census”) records.

But it is not just dates of birth that can be difficult to verify. Things as supposedly straightforward as name and place of birth may prove tricky as well…especially when those facts were deliberately altered to fit in better with an early 20th-century urban American milieu.

I do not mean this to sound sinister. It was no secret when I was a boy that sometime between 1930 (when the Census records 25-year-old “Samuel Cohen” living at 1842 N. 32nd Street) and 1934 (when his marriage record to Irene Goldman lists him as “Samuel Kohn”), my grandfather changed the distinctly-Semitic (and distinguished) last name of “Cohen” to “Kohn,” believing it to be more ethnically ambiguous. He had done this, supposedly, anticipating anti-Semitic resistance when he joined the police force (and even when that occurred is a bit of a mystery). When you consider that his father, Joseph Cohen, had been, variously, a rabbi, a shochet (Kosher butcher) and a Hebrew school teacher—and that Joseph and his 11 brothers and sisters were purported to be direct descendants of the legendary Shpoler Zaide[2] (“the Grandfather from Shpola” or, as I knew him as a boy, “the Dancing Rabbi of Shpola”), the surname change is even more striking.

It may not have been simply joining the police department, though. Sometime after landing in Philadelphia in 1911, Yaakov Gurmankin of Cherson (in modern-day Ukraine) became “Jack Goldman” of Philadelphia—and his four daughters (of whom Ida—or Irene—was the eldest, born August 11, 1914) became the “Goldman Girls.” Adopting a less “Jewish” sounding name was a fairly common occurrence for these newly-arrived immigrants, a natural part of their slow assimilation.

But let us return for a moment to his 1940 draft card. It lists his date of birth as December 12, 1904 (so far, so good) and his place of birth as…Cleveland, OH?

When I located this draft card via Ancestry.com sometime last year, as I began my family research, I was less surprised by the birth location than by how early Samuel Kohn had begun claiming it.

In February 2015, “Snowmageddon” shut down Boston’s Logan Airport, forcing me to spend two extra days in the San Francisco area (Burlingame, actually) following the conclusion of that year’s NOIR CITY film festival. As I enjoyed dinner (including a nice chianti) at Café Figaro my first night in Burlingame, I had a long text exchange with my maternal aunt and her children (which, unfortunately, I have since deleted). It was then I learned Samuel Kohn had changed his birthplace from Shpola to Cleveland, adding to his slow Americanization.

OK, so how do I know he was born in Shpola?

Let us start in 1979, when a grandson of Joseph Cohen (first cousin of my mother and her older sister) took three sheets of orange paper and wrote out the family tree of the descendants of Joseph Cohen and his wife Bat-sheva (later Bessie) Koslenko Cohen. I have no photographs of Joseph (so far as I know), but my great-grandmother struck quite a pose:

[Photograph: Bessie (Bat-sheva Koslenko) Cohen]

The 1979 record lists the eight children of Joseph and Bessie Cohen who eventually made their way to Philadelphia: Sima, Bella, Sarah, Benjamin, Sophie, Samuel, Anna, Jack. Some 60 years earlier, meanwhile, Joseph Cohen’s United States of America Petition for Naturalization (dated June 4, 1918) listed eight children (with dates of birth) of Joseph Cohen (and wife Bessie):

Sima (later Sarah; July 3, 1887[3]),

Rebecca (later Bella; November 10, 1890),

Sara (October 4, 1897[4]),

Benjamin (November 28, 1901),

Sofia (February 12, 1903),

Israel (November 22, 1905),

Anna (April 1, 1908[5]),

Jacob (January 1, 1912).

According to this same document, Joseph Cohen (and his wife Bessie) hailed from Shpola, Russia (forcing him to “renounce forever all allegiance and fidelity to any foreign prince, potentate, state, or sovereignty, and particularly to Nicholas II Emperor of all the Russias[6], of whom I am now a subject.”). The eldest daughter, Sima, also hailed from Shpola, according to the Naturalization Petition of her husband Leib (later Louis) Goldstein. Put two and two together…

Joseph, Bessie and six or seven of their children[7] were among the 1,091 steerage (I presume) passengers who sailed on the SS Haverford from Liverpool, England on November 18 (or 20), 1912, landing in Philadelphia on December 3, 1912.

[Photograph: the SS Haverford]

Photograph from here.

It was quite a harrowing journey, according to the front page of the December 4, 1912 issue of the Philadelphia Inquirer. I wonder how close my grandfather and his family truly came to perishing in the North Atlantic (only eight months after the sinking of the Titanic), which would have rather dramatically altered my family’s history.

[Image: Philadelphia Inquirer front page, December 4, 1912, on the Haverford’s rough voyage]

Wait, hold on, back up a second.

Who the bleepity-frick is “Israel Cohen,” born November 22, 1905?

**********

When my mother died in March 2004, I acquired a handful of documents relating to my grandfather. One of them was a small white piece of paper on which was written my grandfather’s Hebrew name as it was supposed to appear on his headstone; I recently came across it digging (again) through the “genealogy” folders in my filing cabinet.

[Photograph: the note with Samuel Kohn’s headstone information]

Clearly, at some point between December 3, 1912 and January 14, 1920—the date on which the 11-person Cohen household (including 15-year-old “Samuel”) at 729 Morris Street in South Philadelphia was enumerated by A. S. Burstein—”Yisrael (son of Yosef) HaCohen” became “Samuel” (and, eventually, “Samuel Joseph Kohn”). I do not know why Rabbi Levin did not put this Hebrew name on the headstone; perhaps he simply could not reconcile “Yisrael” with “Samuel” (whose Hebrew equivalent is Shmuel).

As for when a date of birth of “November 22, 1905” became “December 12, 1904,” it is telling that the latter date is more consistent with the ages listed for Samuel Cohen on the 1920 (15) and 1930 (25) Censuses. The 1920 Census (enumerated January 14, 1920) is consistent with a date of birth between January 15, 1904 and January 14, 1905, while the 1930 Census (conducted April 4-5, 1930) implies a date of birth between April 6, 1904 and April 5, 1905. December 12, 1904 falls within both windows; November 22, 1905 falls within neither.
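For anyone inclined to check that arithmetic, here is a minimal sketch (in Python; the function name and structure are my own, purely illustrative) of how an enumerated age and a census date translate into a window of possible birth dates:

```python
from datetime import date, timedelta

def implied_birth_window(census_date, reported_age):
    """Return the (earliest, latest) possible birth dates implied by a
    reported age on a given census enumeration date."""
    # Latest possible birth date: the person turned `reported_age` on the census date itself.
    latest = census_date.replace(year=census_date.year - reported_age)
    # Earliest possible birth date: one day after the birthday that would have made
    # the person `reported_age + 1` years old by the census date.
    earliest = census_date.replace(year=census_date.year - reported_age - 1) + timedelta(days=1)
    return earliest, latest

# 1920 Census, enumerated January 14, 1920, age 15:
print(implied_birth_window(date(1920, 1, 14), 15))  # earliest 1904-01-15, latest 1905-01-14

# 1930 Census, conducted April 4-5, 1930, age 25 (using April 5):
print(implied_birth_window(date(1930, 4, 5), 25))   # earliest 1904-04-06, latest 1905-04-05
```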

The bottom line is this: upon disembarking from the SS Haverford onto the Washington Avenue pier 106 years ago (what must the city have looked like to him, his parents and siblings, coming from a town that had a population of about 12,000 in 1897?), a sea-sick Yiddish-speaking seven-year-old boy named Yisrael HaCohen from Shpola, on the rural outskirts of the Russian Empire, slowly transformed into the English-speaking Philadelphia police officer Samuel Joseph Kohn from Cleveland, OH. Why he chose “Cleveland” and moved his date of birth back 11 months and 10 days remains a mystery.

**********

In April 1930, Samuel Joseph Kohn was an attendant at a Gulf Refining Station—and probably starting to play pinochle, which would become the great passion of his life. Sometime in the next four years, he met a lovely teenager who lived about five blocks north of his packed house at 1842 N. 32nd Street. He married Ida “Irene” Gurmankin, I mean Goldman, at the Jewish-catering Imperial Hotel in Atlantic City in the summer of 1934. They would have two daughters, including my mother Elaine in January 1938. It was a contentious marriage—two hotheads separated in age by eight or nine years—but he was a great father. Later, he would be a beloved grandfather—and family patriarch.

As I have noted, his time on the force remains a large black box, but a handful of articles in the Philadelphia Inquirer[8] and conversations with my aunt (plus this book) tell me he was stationed at 28th and Oxford (not far from his Strawberry Mansion house) in February 1937, where he was an incidental part of a post-shootout car chase. By 1948, he was a plain-clothes detective working on the Crime Prevention Squad (which targeted juvenile offenders); in November of that year, he and partner Jack Auerbach arrested two brothers at 23rd and Venango for running a numbers bank with a daily take of $1,200.[9] In 1951 and 1952, a patrolman again, he worked in South Philadelphia (7th and Catherine) busting rackets under Acting Staff Sergeant Frank Rizzo (who would serve as a very controversial police chief, then mayor from 1972 to 1980); Rizzo and my grandfather looked very much alike, actually. My aunt told me he always voted Republican (despite living in a family that adored Democrats like Franklin Roosevelt) in those years because he thought his career depended on it.

A few years later, he had retired from the force and begun to operate a series of taverns in “seedy neighborhoods” (according to my aunt), at first with his brother-in-law Harry Alterman. I would love to imagine David Goodis frequenting one of those taverns.

His father Joseph died in October 1930, followed by his mother Bessie in November 1941. In January 1922, his sister Sophie had died seven days after giving birth to her only child, a daughter named Evelyn; she was only a month away from her 20th birthday (as I put it to my wife Nell, this is when this type of research “gets real.”). His sister Sima died in October 1944.

And by 1930, his brother Benjamin had moved to New York City to start a family. This left Samuel the male head of a rapidly growing family that would meet every year (becoming known as the “Cousins Club”) to celebrate the first night of Passover with the ritual Seder meal. (This tradition would continue for decades; as a boy, I looked forward to seeing all of my cousins at the vast Doral Caterers—which closed in 1989—near the intersection of Bustleton and Cottman Avenues, where inevitably one of us would be injured after an evening of high-spirited shenanigans).

Here he is, standing alone in the back, running the show in 1946 (my eight-year-old mother is sitting alone in the bottom center)…

Cohen Family Seder, 1946

…and again in 1953 (my 15-year-old mother is in the white blouse, seated on the right edge).

Cohen Family Seder, 1953.jpg

Five years after this Seder, in 1958, both his brother Benjamin and his sister Bella died.

But on January 17, 1960, he proudly watched his daughter Elaine marry a charming and handsome young man named David Louis Berger. The first of his four grandchildren, Mindy, was born on March 8, 1962. And then I arrived (literally) in September 1966.

By 1964, meanwhile, he had divorced his wife of nearly 30 years and moved to Atlantic City, where he drove a jitney for a few years before retiring; I used to spend hours riding those jitneys up and down Pacific Avenue for only 35 cents in the summers of 1974 and 1975.

It is those summers I remember when I think about my grandfather. By then, he had settled into the Warwick Apartments, just off the beach on Raleigh Avenue; I would occasionally spend a weekend with him there over the winter (I loved it, but let us just say that my grandfather could give Oscar Madison a run for his money).

[Photograph: the Warwick Apartments]

Occasionally, he would take me for a ride on one of the double-decker boats (we always sat on top, in the open air) that departed from Captain Starn’s seafood restaurant and sailed lazily south along the beach, then north again. He would treat me to an ice-cold can of Coke or Dr. Pepper from a vendor with a cooler; they remain the most delicious sodas I have ever tasted.

And then there was the night—probably in the summer of 1974—when my mother and I had dinner with him in his apartment. At the start of the meal, I was served a steaming hot bowl of tomato soup (most likely Campbell’s cream of tomato). It was a hot night, so I sat at the table shirtless. Then, somehow, my grandfather tipped the entire bowl of soup onto my bare chest.

Owwww!!!!!

At this, my grandfather—tough-as-nails Philly cop, tavern owner and Cohen family patriarch—became completely distraught; I have never seen a man look so shattered. And while my chest was still stinging in pain, despite the butter (yes, butter) being rubbed on it, his reaction had me feeling sorrier for him than anything else.

Just over four years later, on November 15, 1978, Samuel Joseph Kohn (and Yisrael ben Yosef HaCohen) succumbed to complications from his stroke (“cardio respiratory collapse” from “myocardial infarction”), ending an extraordinarily rich life that typified the 20th-century immigrant experience. Less than one year later, I wore his yarmulke at my Bar Mitzvah. Despite being a Jewish-raised Atheist (married to an Episcopalian-raised Agnostic), I still wear it (with my father’s tallit) when I light the candles of the menorah on Chanukah.

[Photograph]

**********

I end where I began, with this excerpt from page iv of my doctoral thesis.

Dedication

This dissertation is dedicated to the memory of three late members of my family whose love and support I miss every day.

First is my maternal grandfather, Samuel Kohn. Toward the end of his life, he made me promise that someday I would become either a lawyer or a doctor.

Pop-Pop Sam, I kept my promise.

Until next time…

[1] “Shpola in the governing district of Kiev”

[2] In early November 2018, a man reached out to me on Ancestry.com, seeking information about Joseph Cohen. He believes (and there is some decent evidence in support) that his great-grandfather Yankel Cohen—a rabbi from a town just 10 miles south of Shpola called Zlatopol (now part of Novomirogrod), whose younger brothers were also rabbis—was my great-grandfather’s older brother. And he alerted me to the research of Dr. Jeffrey Mark Paull, who has traced, through Y-DNA, the male descendants of the Shpoler Zaide. Somewhere in here lies the truth of our descent (or not).

[3] Curiously, the date of birth listed on her husband Leib (Louis) Goldstein’s Naturalization Petition is July 10, 1886.

[4] Or was it July 4, 1894?

[5] This was actually a guess based upon her being born on the third day of Passover in 1908, which she later learned was April 19.

[6] The italicized words were handwritten on his Declaration of Intention, dated June 1, 1915.

[7] According to the 1930 Census, Sima Cohen arrived in 1914, though that date is almost certainly 1913, when she and her husband Leib arrived in Philadelphia on the SS Breslau. Meanwhile, the 1930 Census says Bella arrived in 1899, though she would only have been eight or nine years old then; I suspect that is a miscommunication.

[8] “GUNMEN FLEE POLICE SHOTS IN TWO DUELS,” Philadelphia Inquirer (Philadelphia, PA), February 28, 1937, pg 4; “Brothers Seized On Numbers Count,” Philadelphia Inquirer (Philadelphia, PA), November 27, 1948, pg 15.

[9] A little over $12,600 in 2018 dollars.

A Supreme opportunity to overcome partisan rancor

During my senior year at Yale, I took a seminar called “Political Uses of History.” The topic of my final paper (accounting for most of the course grade[1]) was the history lessons used to defend and critique the nomination of U.S. Court of Appeals for the District of Columbia Circuit (DC Appeals Court) Judge Robert Bork to the United States Supreme Court (SCOTUS). When President Ronald Reagan nominated Bork on July 1, 1987 to fill the seat vacated by Associate Justice Lewis Powell, Senate Democrats immediately expressed dismay at Bork’s “originalist” legal perspective (the Constitution of the United States means only what its original framers intended it to mean at the time).

They were also disturbed by Bork’s role as Solicitor General of the United States on October 20, 1973.

On the night now known as the “Saturday Night Massacre,” President Richard Nixon, alarmed by Watergate Special Prosecutor Archibald Cox’s request for secret White House recordings, demanded that Cox be fired–which only the Attorney General could do. When both Attorney General Elliot Richardson and Deputy Attorney General William Ruckelshaus resigned rather than comply, the next person in line was Bork, who promptly fired Cox.

Bork ultimately lost his nomination vote 58-42. Reagan then nominated DC Appeals Court Judge Douglas Ginsburg, but he quickly withdrew his name after reports about prior marijuana use surfaced.

Oh, how times have changed.

Finally, Reagan nominated Anthony Kennedy, a judge on the U.S. Court of Appeals for the Ninth Circuit, and he was confirmed by the United States Senate (Senate) on February 3, 1988 by a 97-0 vote.

And after serving as the “swing” vote on SCOTUS for years, Justice Kennedy announced his retirement on June 27, 2018.

The tumultuous reaction to this news—laser-focused on the possibility that President Donald Trump will choose an ultra-conservative jurist who would be the decisive vote on issues like LGBTQ rights, abortion, guns and Obamacare—reminded me of my political uses of history paper.

**********

Just bear with me, then, while I review some recent history.

First, whether or not you approve of the filibuster (a final up-or-down vote can only occur if, say, 60% of legislators agree) as a way to protect the rights of the minority party in a legislative body, it served to constrain judicial nominations by requiring a broad base of support.

Of course, it also meant that a determined minority could prevent any given nominee from receiving a final up-or-down vote. After then-minority Senate Republicans kept doing just that to President Barack Obama’s nominees, the Senate voted 52-48 on November 21, 2013 to abolish the 60-vote threshold to end debate for all judicial nominations except for SCOTUS. In retaliation (and after Trump SCOTUS nominee Neil Gorsuch, a judge on the U.S. Court of Appeals for the Tenth Circuit, fell five votes shy of the required 60), the now-majority-Republican Senate voted 52-48 on April 6, 2017 to end the 60-vote requirement to end debate on SCOTUS nominees.

Goose, meet gander.

Gorsuch was then quickly confirmed by a 54-45 vote, with three Democratic Senators—Joe Donnelly (IN), Heidi Heitkamp (ND), Joe Manchin (WV)—voting yes. All three face reelection in 2018 in very Republican states: R+16.3, R+29.4 and R+35.5, respectively.

Why Gorsuch was nominated in the first place is the second bit of recent history to review.

On February 13, 2016, SCOTUS Associate Justice Antonin Scalia died. Within hours, Senate Majority Leader Mitch McConnell (R-KY) announced that because Obama was in the last year of his presidency (and thus some sort of irrelevant lame duck), the Senate would not even hold hearings on ANY Obama appointment until after the November 2016 elections. President Obama nonetheless soon nominated DC Appeals Court Judge Merrick Garland to replace Scalia. Charles Grassley (R-IA), chair of the Senate Judiciary Committee—where any hearings would be held—concurred with McConnell, and the seat remained vacant until Gorsuch was confirmed.

[Photograph: Merrick Garland]

**********

Democrats, hamstrung by their current 49-51 minority in the Senate, appear to be taking two fundamental—and somewhat contradictory—stances on the vacancy created by Justice Kennedy’s retirement.

Some invoke the “McConnell Rule,” insisting no vote be held on a new SCOTUS nominee until after the 2018 midterm elections, even though there is no guarantee Democrats will net the two seats they need for a majority.

Others focus on defeating any nominee outright, homing in on the damage to their (and, full disclosure, my) priorities a solid 5-4 conservative majority could do, particularly the distinct possibility it would overturn Roe v. Wade, the 1973 SCOTUS decision that declared state laws banning abortion unconstitutional, effectively making abortion legal throughout the United States.

It should be noted that overturning Roe would not make abortion illegal everywhere in the United States. Rather, it would leave it to each individual state (and the District of Columbia) to decide whether abortion is legal within its borders. Still, many states have “trigger laws” that would immediately outlaw abortion to the extent legally possible the instant Roe is overturned.

Basically, then, the Democrats have two unpalatable options: try to delay the nomination until after the November 2018 elections, or assume a vote is inevitable and work to defeat it. The rub is that either option would require at least one Republican to buck her/his own party. For example, assuming Senator John McCain (R-AZ) is too ill to vote (and does not retire to give Republican Governor Doug Ducey the chance to pick a replacement), if the Democrats are unified, a single Republican “No” vote means the nomination is defeated 50-49. This, while not impossible, will not be easy either.

I feel compelled to note that this entire conversation is taking place BEFORE any nominee has even been announced. That in itself is worrisome.

**********

Let me address these two stances in turn before concluding with my own thoughts.

No political act enraged me more in the last few years than the theft of a SCOTUS seat by Senate Republicans. Barack Obama was president of the United States until noon on January 20, 2017, and the Senators chosen in the elections of 2010 through 2014 were the representatives duly elected to provide “advice and consent” on the nomination under Article II, Section 2. The people, whose will McConnell invoked, had already spoken by voting in the relevant elections. President Obama was thus denied a fair hearing and vote on his judicial nominee—that is theft.

As disgusted as I remain by that, however, I have deep concerns about the tit-for-tat invocation of the McConnell rule. Two wrongs do not make a right: as we remind our daughters, meanness by one to the other is not a license to be mean back.

I sympathize with the arguments that Democrats should not be a doormat, that McConnell brought this on himself, that turnabout is fair play, that the system is already broken…

And it is that last point that most gives me pause. With good reason, Democrats and like-minded Independents and Republicans decry the corruption and norm violations they see from the Trump Administration and its Congressional allies. But that powerful critique is severely undercut if the Democrats themselves use the violation of a norm (regardless of “who started it”) for their own partisan gain. This would simply be the rescinding of the judicial nominee filibuster all over again.

There is also the unpleasant whiff of “ends justifying the means” about invoking the McConnell rule. I recently called out the modern Republican Party for doing just that. It also recalls one of President Franklin Roosevelt’s worst moments: his 1937 scheme to expand SCOTUS by as many as six Associate Justices (which he would then appoint) to make it less hostile to the laudable New Deal.

It is fashionable to dismiss taking the high ground as weakness and some sort of “asymmetrical warfare.” And perhaps in this single instance—a uniquely pivotal SCOTUS seat following the theft of a prior seat—that is the correct conclusion. But that is a very slippery slope: if Democrats and their allies resort to using the same ruthless tactics to “win” this battle, how are they any better than the Republicans? Does that mean tribalist victory is all that matters now?

The argument may be moot—and mostly public posturing (pointing out the rank hypocrisy of blocking one nomination in an election year but not another)—since it is not clear the Democrats could actually prevent hearings and a vote, short of grinding the Senate to a halt.

And a far better argument for delaying hearings and votes is that a president who is the subject of a criminal investigation should not be allowed to nominate a SCOTUS justice who would almost certainly vote on questions pertinent to that investigation (e.g., Can a president pardon her/himself or be indicted while in office?).

The second stance is at least well within traditional Senate rules and has a successful recent precedent.

It still gives me pause, however, because I worry liberals and like-minded centrists have become too reliant—almost complacent—on SCOTUS (and the courts more generally) to do the heavy lifting of policy-making for them. Republicans, smelling blood on this point, successfully put SCOTUS front and center in the 2016 election.

It does not help that SCOTUS Justices have become (though not always) as entrenched in their ideologies as both major political parties—Justice Kennedy was the swing vote because the other eight Justices were so reliably liberal or conservative in their rulings. Gone are the days when President Dwight Eisenhower (supposedly) called his appointment of California Governor Earl Warren as Chief Justice “the biggest damned fool mistake I ever made.” Seriously, what would even be the point of arguing cases before SCOTUS if the outcome were always predetermined?

The more fundamental problem, however, is that the Democrats let too many state legislative seats get away from them in too many states over the last 10 years. It is in those very states that the most important policy outcomes—on abortion, LGBTQ rights, Medicaid expansion, gun control—actually get decided. And that is how it is supposed to be. I am far from an “originalist,” but Article I and Amendment X strongly imply policy is meant to be decided, umm, politically, in the legislative arena.

I know: both parties (despite bemoaning “activist judges”) seek policy victories in SCOTUS by arguing that this or that law or Executive order is unconstitutional—and the “right to privacy” articulated so elegantly in Griswold v. Connecticut has had a profound (mostly progressive) legislative impact.

My point is simply that if Democrats put as much work into winning back legislative seats (so far so good) as they will into blocking President Trump’s next SCOTUS nominee, they will greatly reduce their reliance on favorable SCOTUS decisions. They could even overturn many of those anti-abortion laws at the state level (not all of them, of course).

**********

I have previously called for cross-partisan dialogue—patriotic bipartisanship. After President Trump was elected, I also began proposing that a “coalition of the center” form in the Senate to wield an effective veto over legislation, forcing broad compromises by both parties. Such a group could consist of “red-state” Democrats like Donnelly, Heitkamp, Doug Jones (AL—R+28.4), Manchin, Claire McCaskill (MO—R+15.9) and Jon Tester (MT—R+18.6); Independent Angus King (ME—D+5.9); and Republicans like Susan Collins (ME—D+5.9), Lisa Murkowski (AK—R+19.2) and, perhaps, Cory Gardner (CO—D+2.2).

Were this bloc (or even the smaller bloc of Donnelly, Heitkamp, Jones, Manchin, Collins and Murkowski) to insist, unequivocally, that President Trump select

…a consensus nominee to replace Kennedy. “[Senator Heitkamp] told the president that he has a chance to unite the country by nominating a true non-ideological jurist who could gain strong support from senators on both sides of the aisle, rather than create more divisions…”

…they would elevate the traditional “advice and consent” role of the Senate above partisan rancor and force both parties to compromise, in effect restoring the judicial nomination filibuster.

Now, this would infuriate the conservatives who voted for Donald Trump (and President Trump himself) solely for the opportunity to remake SCOTUS in their image (though they still “won” with Gorsuch). And it would disappoint the liberal activists who want every Senate Democrat to resist President Trump at every turn (though this is easily the least-worst nominee they will get in 2018). But those may be the necessary costs of restoring civil order to our public discourse.

Plus, how poetically just would it be if that “non-ideological” jurist were…Merrick Garland!

Until next time…

[1] I received an A on both the paper and the seminar, with a special commendation from Professor Joseph Hamburger.

Bipartisanship as patriotism

I started quietly screaming here.

But my deep revulsion for what the United States government, my government, the government elegantly outlined in our founding documents, is doing along our southern border (not the northern border with majority-northern-European Canada, mind you) boiled over the other night in this (annotated) 1,000+-word reply to a similar cri de coeur on the Bone and Silver blog.

The US faces an epistemological crisis. Some 20-25% of the population–primarily rural white Protestant men with at most a high school diploma (culturally conservative, isolationist, economically populist)–has been conditioned by right-wing propaganda (Fox News, talk radio mostly) for 30+ years to believe that all of their problems are caused by a long list of “others”: blacks (dangerous criminals), Spanish-speaking immigrants (drug-lord rapists and murderers who want your jobs), Muslims (terrorists), LGBTQ folks (out to destroy your families), the mass media (lying to you), liberals (wimpy snowflakes who hate you and your values and *your* country) and the globalist-coastal elites (sending *your* jobs and country overseas, or something).

[Eds. note: I have no idea how large this segment of the population is. Trump’s 2016 share of the voting-age population was 25.0%, according to data from here and here. While not all Trump voters fit this characterization, an identical 25% (on average) support Trump’s recent immigration actions. And about 24% of American adults get their news solely from Fox News. The overlap between these groups is probably quite large, though well below 100%. Still, even if the percentage is only half of my upper limit—12.5%—that is still 1 in 8 Americans over the age of 18.]

The crisis is that these Americans literally live in a different reality, with different news sources and accepted truths. This self-contained echo chamber is the only way they can sustain their paranoid grievances. And what they most fear is not loss of economic status but loss of racial/cultural status. They see an encroaching diverse modernity in which they have little-to-no status, which existentially terrifies them.

And so they cultishly follow an autocrat who echoes and validates their worst fears:  Mexicans and Muslims and transgendered folks and black athletes and liberals and Democrats and the media and China and our allies (Canada? Really?) are out to get *them*.

They are so deep in this twisted (yet infinitely self-justified) worldview that they no longer see these “others” as human beings, at some primitive level. *They* are animals who will “infest” (in 45’s words) THEIR country and destroy THEIR way of life. 

Yeah, you say, but they are outnumbered at least 3-1, so why is this happening?

This 20-25% of the population has an outsized influence on the Republican Party (which has cynically nurtured their paranoia for political gain since Nixon was first elected president in 1968), particularly over which Republicans get nominated—and especially since the election of an urbane black man as president in 2008. That was a bridge too far for them, and for the Republican Party, which (to prevent losing nominations to further-right-wing candidates) vowed absolute opposition to him. They are also geographically dispersed across enough districts to elect enough like-minded Republicans to effectively control a majority of state houses and the United States House of Representatives. And, in a 17-person field, they coalesced around Trump early enough to allow him to win the nomination, sweeping aside an establishment that could not (or would not) coalesce around a more “mainstream” alternative (not that their choices were all that impressive). Once the Democrats nominated the equally-flawed Hillary Clinton, after Democrats had controlled the White House for 8 years…well, he still only won by 77,000 votes in three states (while losing the popular vote by 2.1 percentage points—the Electoral College’s Republican advantage at work again).

The thing is, 45’s policy advisors–including the all-but-Nazi Stephen Miller–truly think they beat Clinton not because she was a bad candidate at the wrong time, but because (they mistakenly believe) most of the country is as right-wing nationalist/racist as they are. Here, they are flat wrong, but for arcane structural reasons, it may still take a tidal wave of Democratic votes to wrest back the House this November (the Senate will be tougher, but I am optimistic).

And as with any tribalist cult, they make up in passion and cunning what they lack in numbers, including voting at higher rates, while using every trick to maximize their electoral advantage (less through gerrymandering than through suppression). They do this because they legitimately see the “not-them” as Manichean enemies who must be stopped at all costs. For them, ends justify cruel, immoral and, yes, anti-democratic means: when push comes to shove, safety/security generally trumps (pun intended) liberal democracy. 

The thing is, though, even if Democrats win back the House (likely) and the Senate (30% chance?) and a bunch of state houses…actually, many good things will happen (if only by preventing more bad things from happening). But the crisis will still exist. This squeaky-wheel minority will, if anything, feel more aggrieved and more isolated and more desperate to fight inexorable change. And Fox News and Rush Limbaugh and Alex Jones and the National Enquirer and Breitbart will continue to echo and amplify their increasingly-distorted reality, not only because it serves their own interests (and bottom-lines) to do so–they also genuinely fear the consequences of suddenly backing off decades of crazy-stroking. 

So how do we fix this? How do we get a sprawling, impossibly-diverse nation of nearly 330 million people back on the same “we are all in this together” page (raising the question of whether, besides WWII, we ever were)? How do we get these reality-denying folks to accept the reality of climate change, the trade-offs between secure borders and nurturing compassion, the tragic consequences of an overly-gun-permissive society (the unique Constitutional protection afforded guns has morphed into Constitutional protection of THEIR way of life—restricting the former is a direct assault on the latter), the value of expertise, the benefits of a multi-cultural/multi-ethnic society (a wider talent pool, if nothing else), and so forth?

I have absolutely no idea.

But as I see one California couple raise nearly $15 million almost overnight on Facebook to provide legal services for these newly-detained immigrants and their lost children, as I see more and more Republicans abandoning/staring down their party (thank you, Massachusetts Governor Charlie Baker), as I see the mainstream media absolutely refusing to back down from their Constitutionally-protected duty to investigate and report and expose, as I see Robert Mueller—a lifelong Republican—diligently pursuing his own investigations, as I watch previously apathetic citizens taking to the streets in protest…I have hope that the “sensible” (if not always ideologically-unified) 75+% will regain the “values” upper-hand and restore everything I have always loved about my country.

The aggrieved minority may never accept what we understand as reality, because it is too existentially painful. But they are still my fellow Americans, and I must share our nation with them, just as they have to share it with folks like me. All I can do is continue to call out their nonsense in the clearest possible terms in the perhaps-naive hope that enough of them will eventually snap out of it.

Otherwise…we may simply have to wait as their numbers shrink even further, as the demographers insist will happen. 

Do not give up on this country…we ARE better than this.

Upon further reflection, though, I do have one practical suggestion, though it may not appeal to everyone: active bipartisanship.

It is telling in this regard that my second-ever post presented my bipartisan bona fides. My goal was to insulate myself against criticism (yet to materialize) that my liberal Democratic views biased my political and cultural data analyses. My meticulous sourcing also serves that purpose—allowing critical readers to fact-check my assertions and draw their own conclusion. In this, my academic roots clearly show: transparency in methods, data and sources.

But I think that post also stemmed from my hope that sufficient elected Republicans would stand up to the newly-elected President, thwarting his most anti-democratic impulses.

Shockingly few Republican elected officials, however, have done so. Yes, Republican Senators Susan Collins (Maine), John McCain (Arizona) and Lisa Murkowski (Alaska) voted NOT to repeal the Affordable Care Act. And Republican Senators Bob Corker (Tennessee) and Jeff Flake (Arizona), both of whom chose not to seek reelection in 2018, have at times publicly expressed deep reservations about President Trump.

But those moments have been few and far between. The reality is that Republicans, for all their protestations, have mostly voted for whatever President Trump has wanted. According to the FiveThirtyEight vote tracker, Republican United States Senators (51 currently serving) have voted with the President’s position a median 93.2% of the time, with 41 (80.4%) voting with his position at least 90% of the time; the “least” loyal Republican Senators were Rand Paul (Kentucky) and Collins, who still supported the President on at least 75% of votes. The obeisance was slightly higher for Republican members of the United States House of Representatives (US House; 235 currently serving who have cast at least one vote[1]): median support was 96.2%, with 193 (82.1%) voting with the President at least 90% of the time; the two least-loyal Republican House members have voted with the President only about half of the time—Walter Jones (NC-3; 52.2%) and Justin Amash (MI-3; 53.0%). Curiously, the most vulnerable Republican House members, the 22 who represent congressional districts Clinton won in 2016, backed the President a median 97.0% of the time.
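For readers who want to check this sort of arithmetic themselves, here is a minimal sketch of how such summaries could be computed. It assumes a hypothetical CSV export of the tracker with made-up column names (“chamber”, “party”, “agree_pct”); the actual FiveThirtyEight data layout will differ.

```python
# Minimal sketch, not the actual FiveThirtyEight pipeline: summarize how often
# Republican members vote with the President, by chamber. Assumes a hypothetical
# file "trump_scores.csv" with columns "chamber", "party" and "agree_pct" (0-100).
import pandas as pd

scores = pd.read_csv("trump_scores.csv")
gop = scores[scores["party"] == "R"]

for chamber, group in gop.groupby("chamber"):
    median_support = group["agree_pct"].median()
    share_90_plus = (group["agree_pct"] >= 90).mean() * 100
    print(f"{chamber}: median support {median_support:.1f}%; "
          f"{share_90_plus:.1f}% of members at 90% or higher")
```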

Instead, the few “profiles in courage” have come from state houses. Thirty-three states currently have Republican governors, with 16 having Democratic governors; Alaska Governor Bill Walker is an Independent.

Ohio Governor John Kasich famously challenged Trump from the (relative) left during the 2016 Republican presidential primaries and caucuses; he remains a vocal thorn in the President’s side. Three other Republican governors—Baker, Larry Hogan (Maryland) and Phil Scott (Vermont)—remain enormously popular (68% approve/18% disapprove, on average) in states that are 24.1 percentage points more Democratic than the nation as a whole (using this calculation). Besides being genuinely likable, they remain popular by working—often in direct opposition to “their” President—closely with their states’ majority-Democratic legislatures, carving out socially moderate-to-liberal and fiscally conservative positions.

Although I have lived in Massachusetts for most of the last 30 years, I never really followed Baker’s ascent, though I knew he was the chief Republican “up-and-comer” after his successful stint directing Harvard Pilgrim Health Care starting in 1999. In 2010, he was the Republican nominee against incumbent Democratic Governor Deval Patrick; Baker lost 48.4 to 42.0%.

[Photo: Charlie Baker]

A few months later, I was sitting in a Boston restaurant having lunch with my then-supervisor, when she nudged my arm. “Isn’t that himself?” she asked. I turned around to see Baker walk right past our table. That was when I realized how TALL he is (6’6”).

On August 25 of the previous year, Democratic Senator Edward M. Kennedy had died, after serving in the US Senate for almost 47 years. A special election to fill the seat through January 2013 was held on January 19, 2010. Democratic Attorney General Martha Coakley and little-known Republican State Senator Scott Brown easily won their primaries, and the prevailing wisdom was that Coakley would easily prevail against Brown. Instead, Brown upset Coakley 51.9 to 47.1%. (I drove through central Massachusetts with both daughters the weekend before the election, seeing no Coakley signs but quite a few Brown signs; uh-oh, I thought).

Four years later, with Patrick term-limited, Coakley was now the Democratic nominee for governor, seemingly a stronger candidate after her upset defeat. Baker was again the Republican gubernatorial nominee. And this time he won, 48.4 to 46.5%.

I did not vote for Baker in 2014 (just as I did not vote for Republican gubernatorial nominee William Weld in 1990 when he was, in many ways, more liberal than Democratic nominee John Silber—I now regret that vote). However, watching the debates between Coakley and Baker, I was struck by how much I LIKED Baker. Where Coakley was robotic and stiff, Baker was warm and engaging. His Harvard-educated brilliance shone through, but with an appealing everyman demeanor: he was clearly enjoying himself.

Because I think Coakley, with her flaws, would still have been a good governor, I do not regret my vote. But neither was I particularly upset that Baker won.

And since then, I have only grown to respect Baker more. He is more fiscally conservative than I would prefer, but his consistent willingness to call out Trump when necessary, well, trumps those positions.

I was wavering on voting for him this November (regardless of who the Democratic nominee is) until he forcefully “revoked his decision to send National Guard helicopters and personnel to the Southwestern border,” citing the inhumane treatment of children by the Trump Administration.

That did it: Nell and I will be voting to reelect Baker this fall, even as we joyfully vote for Democratic Senator Elizabeth Warren and our member of Congress, Joseph P. Kennedy III, also a Democrat.

Here is also why I will be voting for Baker in four+ months.

If I am calling on select Republicans to defy their President and work in a bipartisan fashion with Democrats, it would be massively hypocritical for me not to support a more-than-reasonable Republican who has done exactly that. Every time I cheer a former Republican speaking out against the President on MSNBC, I need to be able to match that gesture with one of my own.

Simply put, I cannot ask someone to do something—be actively bipartisan—without being willing to do the same thing myself.

Moreover, the only way to break down the tribalist partisanship that causes us to see persons with the wrong “label” as a mortal enemy is to elevate bipartisanship into an act of patriotism.

The stakes of the Cold War were so monumental that partisanship was supposed to stop at the water’s edge: there was to be no squabbling over matters of life and death. While that was not always true, particularly as the Vietnam War divided the Democratic Party and Democrats took President Ronald Reagan to task for his aggressively anti-Soviet Union posturing, that credo still serves as an excellent model for reimagining bipartisanship as patriotism.

Would I still vote for Baker if he were not heavily favored to win, meaning Nell’s and my votes will in no way be decisive? I do not know, to be honest. But were he not so effective AND anti-Trump, he would not be so popular, so the question kind of answers itself.

It is exceptionally difficult for lifelong partisans like me—this will only be the second time I vote Republican—even to consider opposing points of view (though it can be done), let alone voting for a candidate of the opposite party. But I firmly believe these actions are the best—maybe the only—ways to begin to solve our current epistemological crisis.

Until next time…

[1] 240 overall

Manifest(o) Identity

Because I have written and thought a lot about the 2018 United States (US) midterm elections, the first things I read each day (after my e-mail) are Taegan Goddard’s invaluable Political Wire and, of course, FiveThirtyEight.

On May 19, 2018, Goddard linked to this commentary by Washington Post columnist Paul Waldman. Waldman argues Democrats should abandon the “naïve” notion they will be able to win the votes of certain white Republicans (presumably once-Democratic voters who preferred Republican Donald J. Trump in the 2016 US presidential election) by showing them more “respect.” The fallacy, Waldman believes, lies in ignoring “where the belief in Democratic disrespect comes from and to assume that Democrats have it in their power to banish it.”

Specifically,

“The right has a gigantic media apparatus that is devoted to convincing people that liberals disrespect them, plus a political party whose leaders all understand that that idea is key to their political project and so join in the chorus at every opportunity.

“If you doubt this, I’d encourage you to tune in to Fox News or listen to conservative talk radio for a week. When you do, you’ll find that again and again you’re told stories of some excess of campus political correctness, some obscure liberal professor who said something offensive, some liberal celebrity who said something crude about rednecks or some Democratic politician who displayed a lack of knowledge of a conservative cultural marker. The message is pounded home over and over: They hate you and everything you stand for.”

If I may editorialize a moment, the sheer cynicism of this political strategy, while hardly new (McCarthyism, the Southern Strategy[1]), is breathtaking. There is no substantive policy argument or coherent ideological framework being offered, only an ever-stoked resentment intended to pit one (non-elite) group against another, a devious bit of misdirection by an alternate elite trying to maintain political power. This, of course, in no way excuses those who all-too-willingly fall for this misdirection. Not to get overly (or overtly) Marxist, but this is a textbook example of “false consciousness.”

Waldman, a writer for the progressive The American Prospect and graduate of Swarthmore and the University of Pennsylvania, calls the target of this resentment “snooty liberal elitism.” Note the magazine for which he writes and his elite education (is he a Philadelphian like me?); I have no evidence regarding his snootiness.

Of course, I myself hold strong liberal views and attended Ivy League and other top schools (Yale, Harvard, Boston University School of Public Health), ultimately earning two Master’s Degrees and a PhD. I defer to others to decide how “snooty” I am.

Hold that thought.

Returning to Waldman’s article, his argument resonated with me for multiple reasons.

First, I have also written about whether Democrats should focus electorally more on “whites without a college degree” (President Trump’s core supporters) or on a coalition of younger, college-educated, non-white, urban and women voters. If pressed, I would choose the latter, though it is not necessarily a zero-sum choice (e.g., the decision by Democratic leaders to zero in on Trump Administration corruption could have broad appeal).

Second, I had just been thinking about “elites” in the context of explaining the choices in my seven-day Facebook book challenge. While discussing the third book, I highlighted Christopher Hayes’ compellingly-readable treatise, Twilight of the Elites: America After Meritocracy.

In Chapter 5 (“Winners”), Hayes attempts to determine who comprises the “elite.” The (self-serving, given the plethora of wealthy and powerful conservatives) right-wing view is that what marks the elite is not their “degree of power or influence, but rather their condescension, their worldview, their tastes, preferences, and cultural diet […] snobby cosmopolitans who look down on the ordinary Americans who unpretentiously and earnestly devote themselves to the bedrock values of faith, family, and flag.”[2]

Need I point out that nearly all Americans value all three? I may be an atheist now, but I attended Hebrew School three days a week for six years, was Bar Mitzvahed and attended many a large family Seder. I adore my family, even if individual members at times drive me crazy and my definition is a bit looser. And while I love my country, I see its flaws and seek to repair them; my love is not unconditional.

Third, as a student of epidemiology, in many ways a quantitative offshoot of epistemology (how do we know and how much can we know), I am alarmed by the partisan bifurcation of information sources and accepted truths. It is not quite as simple as Republicans watch Fox News and listen to conservative talk radio, while Democrats watch MSNBC and listen to NPR, though as oversimplifications go, that is not bad. For the record, while I regularly watch MSNBC, I rarely listen to NPR.

But such resentments can only be a winning political strategy if the “facts of the case” are in constant dispute, if we choose only to believe (as opposed to know) what we learn from “our” sources. That is one reason I noted Waldman’s (presumed) ideology and education: I always “consider the source” of anything I read, watch or hear.

All of these issues—Democratic electoral strategy, conservative populist resentment, an untenable fractured epistemology—are fascinating and of vital importance.

And they only obliquely relate to what I am trying to say here.

Just bear with me.

**********

In my first post, I presented two brief—and radically different—biographies. One was that of a “well-placed member of the coastal cultural elite,” while the other was that of “one of life’s losers whose ladder of opportunity is buried deep underground.”

Of course, this is my own bit of epistemological misdirection: both biographies are mine. I was trying to demonstrate both my story-telling style and the manipulative power of story-telling itself: both stories were, strictly speaking, true, but each included (and amplified) only those facts that advanced the story’s message.

Upon reflection, though, I think that particular choice of stories (and my book project) unwittingly revealed my own ambiguity about my identity. Much of my ambivalence (a term my psychotherapist loves to use) stems from my adoption at that time by that particular family, a sense bordering on guilt of how extremely (unfairly?) lucky I was.

Even within this blog, I have evinced this ambivalence. I literally mentioned that I attended Yale in the very first sentence of my post arguing that “we are not our resumes.”

You cannot make this stuff up.

**********

So what does this identity ambivalence have to do with conservative populist resentment at snooty liberal elites?

(Actually, the question pretty much answers itself.)

What most galls me is that it is nothing more than reverse snobbery. Whereas I do not “look down on them,” they clearly despise me…without ever meeting or otherwise getting to know me (I have been called a “libtard”—a term both ridiculous and highly offensive—more than once on Twitter).

It is also factually incorrect.

While I have, through both native ability and extremely hard work, earned my Ivy League and other degrees, I am hardly a member of the elite.

In his “Winners” chapter, Hayes expounds upon what he calls “fractal inequality.” One illustration: the economic distance between the bottom 99% and the top 1% is the same as that between the top 0.01% and the remaining 0.99% (both within the top 1%), which is in turn the same as that between the top 0.0001% and the remaining 0.0099% (both within the top 0.01%).

As Hayes describes it:

“Such a distributional structure reliably induces a dizzying vertigo among those ambitious souls who aim to scale it. The successful overachiever can only enjoy the perks of his [or her] relatively exalted status long enough to realize that there’s an entire world of heretofore unseen perks, power, and status that’s suddenly come within view and yet remains out of reach.”[3]

Something very much like this happened to me when I arrived at Yale in September 1984. I had always been one of the smartest kids in my class, even at a high school recognized for its academic excellence which regularly sent a few dozen graduates to the Ivy League and other top schools.

Yeah, I had no idea what being smart meant.

I had classmates who could play complex musical passages solely by ear…and were mildly surprised that I could not (though I did once work out the opening chords to The Stylistics’ “You Make Me Feel Brand New” on my portable electronic keyboard). My colleagues in the Yale Political Union had an astonishing mastery of debating techniques and policy details. One classmate (now one of my dearest friends) understood mathematics (and seemingly everything else) at a level that made the rest of us look like kindergarteners.

Basically, while I ultimately found my niche and performed well at Yale (cum laude, distinction in the major), I was average there. And while it helped launch what became my health-data-analysis career, that career was far from lucrative, though I suppose some of my Boston-inflated salaries were respectable.

**********

Speaking of which, let me end where I started, with whether “respect” should be part of a winning electoral strategy.

To begin with, EVERYONE deserves respect by virtue of their basic humanity.

But to the conservative populists who think that America is somehow not great because of me or folks like me or what they think folks like me are like (or something), I observe that respect goes both ways. You need to respect my triumphs and tragedies as well.

And do not for one minute think that my respect implies any kind of acceptance of retrograde and regressive beliefs.

Simply put, I will not overlook for the sake of electoral victory…

…the scapegoating of immigrants (undocumented or otherwise), Muslims or other non-white-Christian citizens: if you want my electoral respect, please show respect for everyone who does not look, sound or worship (or not worship) like you.

…the denial of basic science and the scientific method in the service of some half-baked conspiracy or religious doctrine: if you want my electoral respect, do not insult my intelligence or, for that matter, your own.

…the elevation of unborn fetuses over the lives of women: I am absolutely going there—if you want my electoral respect, stop objectifying, degrading and diminishing women, not only through anti-contraception and anti-abortion legislation but also through harassment and violence. Here I channel my late mother who firmly believed that if men could become pregnant, abortion clinics would be as plentiful as CVS or Walgreens.

…a preference for firearms over human beings: if you want my electoral respect, set aside your anti-government paranoia and mitigate my call for Amendment II repeal by taking serious steps to halt the US epidemic of gun violence (school shootings; police shooting unarmed civilians; homicides, suicides and accidents). I do not want your bleepity-frick guns, but neither do I want them anywhere but in your homes and on licensed shooting ranges.

…and any other latent or blatant racism, authoritarianism, xenophobia, homophobia, anti-Semitism, misogyny, ignorance and/or outright paranoia: if you want my electoral respect, take a long look in your own conscience first.

The bottom line is this: I may respect you as a person (and expect the same in return), but I will NOT respect all of your beliefs.

I may at times feel guilty about the breaks I have received (not least my gender and skin color), but I will never feel guilty or ashamed about anything I accomplished given those breaks and my natural abilities, nor about what I believe through my own research, careful thought and debate.

If that makes me a liberal elitist, and if calling me that somehow makes you feel better about your own life (and provides an excuse not to change it—the way you tell others to pull themselves up by their own bootstraps, whatever the heck that means), that is not my problem.

So…who am I?

Hello, my name is Matt Berger, and I am proud of my degrees from Yale and Harvard and Boston University, just as I am proud of my secular liberal belief system. And I ask you to respect me as much as I respect you.

Until next time…

[1] “In more recent elections, the Democratic coalition has been fractured, particularly by issues associated with race,” which then underlay a series of values conflicts. “Exploiting these newer issues, Republicans had won all but one presidential election in the past quarter century, making particularly notable gains among whites, men, southerners, and Catholics.” (Italics added) Pomper, Gerald M., “The Presidential Election” in Pomper, Gerald M., Arterton, F. Christopher, Baker, Ross K., Burnham, Walter Dean, Frankovic, Kathleen A., Hershey, Marjorie Randon and Wilson Carey McWilliams. 1993. The Election of 1992. Chatham, NJ: Chatham House Publishers, Inc., pg. 135.

[2] Hayes, Christopher L. 2012. Twilight of the Elites: America After Meritocracy. First Paperback Edition. New York, NY: Broadway Paperbacks, pg. 138.

[3] Hayes, pg. 156

Separating the art from the artist

The director David Lynch—whom I dressed as this past Halloween—gave this response to a question about the meaning of a puzzling moment toward the end of episode 15 of Twin Peaks: The Return.

“What matters is what you believe happened,” he clarified. “That’s the whole thing. There are lots of things in life, and we wonder about them, and we have to come to our own conclusions. You can, for example, read a book that raises a series of questions, and you want to talk to the author, but he died a hundred years ago. That’s why everything is up to you.”

On the surface, this is a straightforward answer, one Lynch has restated in different ways over the years: the meaning of a piece of art is whatever you think it is. Every individual understands a piece of art through her/his own beliefs and experiences.

I am reminded of a therapeutic approach to the interpretation of dreams that particularly resonates with me.

You tell your therapist what you remember of a dream. The therapist then probes a little more, attempting to elicit forgotten details. The conversation then turns to the “meaning” of the dream. Some therapists may pursue the Freudian notion of a dream as the disguised fulfillment of a repressed wish (so what is the wish?). Other therapists may look to the symbolism of characters and objects in the dream (is every character in a dream really a version of the dreamer?) for interpretation.

Then there is what you might call the Socratic approach; this is the approach that resonates with me. The therapist allows the patient to speculate what s/he thinks the dream means. Eventually, the patient will arrive at a meaning that “clicks” with her/him, the interpretation that feels correct. The therapist then accepts this interpretation as the “true” one.

That the “dreams mean whatever you think they mean” approach aligns nicely with Lynch’s musing is not surprising, given how central dreams and dream logic are to his film and television work.

We live inside a dream

However, there is a subtext to Lynch’s musing about artistic meaning that is particularly relevant today.

**********

The November 20, 2017 issue of The Paris Review includes author Claire Dederer’s essay “What Do We Do with the Art of Monstrous Men?”

I highly recommend this elegant and provocative essay.

For simplicity, I will focus on two questions raised by the essay:

  1. To what extent should we divorce the artist from her/his art when assessing its aesthetic quality?
  2. Does successful art require the artist to be “monstrously” selfish?

Dederer describes many “monstrous” artists, nearly all men (she struggles when cataloging the monstrosity of women, despite how odious she finds the impact of Sylvia Plath’s suicide on her children) before singling out Woody Allen as the “ur-monster.”

And here is where I discern a deeper meaning in Lynch’s “dead author” illustration.

Lynch’s notion that one brings one’s own meaning to any piece of art is premised on the idea that the artist may no longer be able to (or may choose not to) reveal her/his intent.

But that implies that something about the artist is relevant to understanding her/his art. Otherwise, one would never have sought out the artist in the first place.

The disturbing implication is that it is all-but-impossible to separate art from artist.

This is Dederer’s conundrum, and it is mine as well.

**********

A few years ago, a group of work colleagues and I were engaging in a “getting to know each other” exercise in which each person writes down a fact nobody else knows about them, and then everyone else has to guess whose fact that is.

I wrote, “All of my favorite authors were falling-down drunks.”

Nobody guessed that was me, which was a mild surprise.

Of course, the statement was an exaggeration, a tongue-in-cheek poke at the mock seriousness of the process.

Still, when I think about many of the authors I love, including Dashiell Hammett, Raymond Chandler, Edgar Allan Poe, John Dickson Carr, Cornell Woolrich, David Goodis[1]

…what first jumps to mind is that every author I just listed is male (not to mention inhabiting the more noir corners of detective fiction). So far as I know, my favorite female authors (Sara Paretsky, Ngaio Marsh and Agatha Christie, among others) do/did not have substance abuse problems.

Gender differences aside, while not all of these authors were alcoholics, they did all battle serious socially-repugnant demons.

Carr, for example, was a virulently racist and misogynistic alcoholic.

He also produced some of the most breathtakingly-inventive and original detective fiction ever written.

Woolrich was an agoraphobic malcontent who was psychologically cruel to his wife during and just after their brief, unconsummated marriage[2].

He also basically single-handedly invented the psychological suspense novel. More films noir (including the seminal Rear Window) have been based on his stories than those of any other author.

And so forth.

It is not just the authors I admire who are loathsome in their way.

I never cease to be amazed by the music of Miles Davis, who ranks behind only Genesis and “noir troubadour” Stan Ridgway in my musical pantheon. His “Blue in Green” is my favorite song in any genre, and his Kind of Blue is my favorite album.

But this is the same Miles Davis who purportedly beat his wives, abused painkillers and cocaine, was taciturn and full of rage, and supposedly once said, “If somebody told me I only had an hour to live, I’d spend it choking a white man. I’d do it nice and slow.”[3]

Moving on, my favorite movie is L.A. Confidential.

Leaving aside the shenanigans of co-star Russell Crowe, there is the problem of Kevin Spacey, an actor I once greatly respected.

Given the slew of allegations leveled at Spacey, the character arc of his “Jack Vincennes” in Confidential is ironic.

But first, let me warn any reader who has not seen the film that there are spoilers ahead. For those who want to skip ahead, I have italicized the relevant paragraphs.

Vincennes is an amoral 1950s Los Angeles police officer whose lucrative sideline is selling “inside” information to Sid Hudgens, publisher of Hush Hush magazine, reaping both financial rewards and high public visibility. Late in the film, he arranges for a young bisexual actor to have a secret (and then-illegal) sexual liaison with the District Attorney, a closeted homosexual. Vincennes and Hudgens would then catch the DA and the young actor in flagrante delicto.

Sitting in the Formosa Club that night, however, Vincennes has a sudden pang of conscience and leaves the bar (symbolically leaving his payoff—a 50-dollar bill—atop his glass of whiskey), intending to stop the male actor from “playing his part.” Unfortunately, he arrives at the motel room too late; the actor has been murdered.

Determined to make amends, he teams up with two other detectives to solve a related set of crimes, including the murder of the young actor. In the course of his “noble” investigation, he questions his superior officer, Captain Dudley Smith, one quiet night in the latter’s kitchen. Realizing that Vincennes is perilously close to learning the full extent of his criminal enterprise, Smith suddenly pulls out a .32 and shoots Vincennes in the chest, killing him.

OK, the spoilers are behind us.

**********

This listing of magnificent art made by morally damaged people demonstrates I am in the same boat as Claire Dederer: I have been struggling for years to separate art from artist.[4]

And that is before discussing the film that serves as Dederer’s Exhibit A: Woody Allen’s Manhattan.

Dederer singles out Manhattan (still one of my favorite films) because of the relationship it depicts between a divorced man of around 40 (Isaac, played by Allen himself) and a 17-year-old high school student named Tracy (Mariel Hemingway).

Not only is the relationship inherently creepy (especially in light of recent allegations by Hemingway and the fact that in December 1997, the 62-year-old Allen married the 27-year-old Soon-Yi Previn, the adopted daughter of his long-time romantic partner Mia Farrow[5]), but, as Dederer observes, the blasé reaction to it from other adult characters in the film makes us cringe even more.

As I formulated this post—having just read Dederer’s essay—I thought about why I love Manhattan so much.

My reasons are primarily aesthetic: the opening montage backed by George Gershwin’s Rhapsody in Blue (and Allen’s voiceover narration), Gordon Willis’ stunning black-and-white cinematography, the omnipresence of a vibrant Manhattan itself.

In addition, the story, a complex narrative of intertwined relationships and their aftermath, is highly engaging. The dialogue is fresh and witty—and often very funny. The characters are quirky (far from being a two-dimensional character, I see Tracy as the moral center of the film) but still familiar.

And then there is the way I saw the film for the first time.

The movie was released on April 25, 1979. At some point in the next few months, my father took me to see it at the now-defunct City Line Center Theater (now a T.J. Maxx) in the Overbrook neighborhood of Philadelphia. Given that I was 12 years old, it was an odd choice on my father’s part, but I suspect he wanted to see the film and seized the opportunity of his night with me (my parents had been separated two years at this point) to do so.

[Photo: the City Line Center Theater]

I recall little about seeing Manhattan with him, other than being vaguely bored. I mean, it was one thing for old movies and television shows to be in black-and-white (like my beloved Charlie Chan films), but a new movie?

I do not remember when I saw Manhattan again. At one of Yale’s six film societies? While flipping through television channels in the 1990s? Whenever it was, the film clicked with me that second viewing, and I have only become fonder of it since then.

Two observations are relevant here.

One, it is clear to me that the fact that I first saw Manhattan at the behest of my father, whom I adored in spite of his many flaws, heavily influenced my later appreciation of the film[6].

Two, this appreciation cemented itself years before Allen’s perfidy became public knowledge.

These two facts help explain (but not condone) why I still…sidestep…my conscience to admire Manhattan as a work of art.

**********

Ultimately, I think the following question best frames any possible resolution of the ethical dilemma of appreciating the art of monstrous artists:

Which did you encounter first, the monstrous reputation of the artist…or the art itself?

I ask this question because my experience is that once I hear that a given artist is monstrous, I have no desire to experience any of her/his art.

Conscience clear. No muss, no fuss.

That includes not-yet-experienced works by an artist I have learned is loathsome. I have not, for example, seen a new Woody Allen film since the execrable The Curse of the Jade Scorpion in 2001.

But if I learn about the artist’s monstrous behavior AFTER reacting favorably to a piece of her/his art, I will often find myself still drawn to the art.[7]

Conscience compartmentalized. Definitely some muss, some fuss.

My love of these works is just too firmly embedded in my consciousness to unwind. Thus, I still love the music of Miles Davis. L.A. Confidential remains my favorite movie. Manhattan may have dropped some in my estimation, but it is still in my top 10.

I am reminded of this line from “Seen and Not Seen” on the Talking Heads album Remain in Light:

“This is why first impressions are often correct.”

**********

And here is where I think Lynch’s impressionistic approach to finding meaning in art and the patient-centered approach to dream interpretation—art and dreams mean whatever we think they mean—relate to the question of loving art while loathing the artist.

Art is a deeply personal experience. The “Authority” Dederer so pointedly disdains in her essay can provide guidance, but (s)he cannot experience the art for you or me.

Put simply, each of us is an “Authority” on any given piece of art—and also on whether or not to seek out that art.

For example:

As a child, I found myself hating The Beatles simply because I was supposed to love them. However, once I discovered their music on my own terms, purchasing used vinyl copies of the “Red” and “Blue” albums (which I still own 30+ years later) along with Abbey Road, The Beatles (the “White” Album), Sgt. Pepper’s Lonely Hearts Club Band, Revolver and Rubber Soul…suffice to say I have 124 Beatles tracks (out of 9,504) in my iTunes, second only to Genesis (288). The Beatles also rank sixth in total “plays” behind The Cars, Steely Dan, Miles Davis (there he is again), Stan Ridgway and Genesis.

Each of us is also the Authority on our changing attitudes toward a given piece of art, including what we learn about the artist, knowledge which then becomes one more element we bring to the subjective experience of art.

**********

Dederer speculates about whether artists (particularly writers) somehow NEED to be monstrous to be successful.

(Upon writing that last sentence, the phrase “madness-genius” began to careen around my brain).

As a writer with advanced academic training in epistemology-driven epidemiology, I would suggest the following study to assess that question.

A group of aspiring artists who had not yet produced notable works would be identified. They would be divided into “more monstrous” and “less monstrous,”[8] definitions to be determined. These artists would be followed for, say, 10 years, after which time each artist still in the study would be classified as “more successful” or “less successful,” definitions again to be determined. The percentages of artists in each category who were “more successful” would then be compared, to see whether being “monstrous” made an aspiring artist more or less likely to be “successful,” or made no difference at all.

This would not settle the question of the link between monstrosity and art by any means, but it would sure be entertaining.
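To make the final comparison step concrete, here is a minimal sketch of the arithmetic, using entirely invented counts; the risk ratio is one standard way an epidemiologist might summarize such a result.

```python
# Minimal sketch with entirely invented counts: compare the proportion of
# "more successful" artists between the two exposure groups after 10 years.
more_monstrous_successful, more_monstrous_total = 18, 60
less_monstrous_successful, less_monstrous_total = 15, 75

risk_more = more_monstrous_successful / more_monstrous_total  # 0.30
risk_less = less_monstrous_successful / less_monstrous_total  # 0.20
risk_ratio = risk_more / risk_less                            # 1.50

print(f"'More monstrous' success proportion: {risk_more:.0%}")
print(f"'Less monstrous' success proportion: {risk_less:.0%}")
print(f"Risk ratio: {risk_ratio:.2f}")
```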

**********

When Dederer talks about the monstrous selfishness of the full-time writer, she focuses on the temporal trade-offs writers must make—time with family and friends versus time spent writing. Writing is an almost-uniquely solitary endeavor, as I first learned writing my doctoral thesis, and as I continue to experience in my new career.

Luckily, my wife and daughters remain strongly supportive of my choice to become a “writer,” so I have not yet felt monstrously selfish.

There is a different kind of authorial “selfishness,” though, that I would argue is both more benign and more beneficial to the author.

When I began this blog, my stated aim was to focus solely on objective, data-driven stories; my personal feelings and life story were irrelevant (outside of this introductory post).

Looking back over my first 48 posts, though, I was surprised to count 17 (35.4%) I would characterize as “personal” (of which three are a hybrid of personal and impersonal). These personal posts, I observed, have also become more frequent.

Even more surprising was how much more “popular” these “personal” posts were. As of this writing, my personal posts averaged 28.4 views (95% confidence interval [CI]=19.9-36.9), while my “impersonal” posts averaged 14.5 views (95% CI=10.8-18.1); the 95% CI around the difference in means (14.0) was 6.3-21.6.[9]
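For the statistically curious, here is a minimal sketch of how a confidence interval around a difference in means can be computed. The per-post view counts below are placeholders, not my actual data, and the method shown is the standard unequal-variance (Welch) interval, which may differ from the exact calculation I used.

```python
# Sketch of a 95% CI around the difference in mean views between two groups of
# posts, using Welch's unequal-variance method. View counts are placeholders.
import numpy as np
from scipy import stats

personal = np.array([12, 19, 23, 27, 30, 33, 41, 44])    # hypothetical views
impersonal = np.array([8, 10, 12, 14, 15, 17, 19, 21])   # hypothetical views

diff = personal.mean() - impersonal.mean()
var_p = personal.var(ddof=1) / len(personal)
var_i = impersonal.var(ddof=1) / len(impersonal)
se = np.sqrt(var_p + var_i)

# Welch-Satterthwaite approximation to the degrees of freedom
df = (var_p + var_i) ** 2 / (var_p ** 2 / (len(personal) - 1) +
                             var_i ** 2 / (len(impersonal) - 1))

t_crit = stats.t.ppf(0.975, df)
print(f"difference in means = {diff:.1f}, "
      f"95% CI = {diff - t_crit * se:.1f} to {diff + t_crit * se:.1f}")
```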

Moreover, the most popular post (77 views, 32 more than this post) is a very personal exploration of my love of film noir.

In other words, while none of my posts have been especially popular (although I am immensely grateful to every single reader), my “personal” posts have been twice as popular as my “impersonal” posts.

I had already absorbed this lesson somewhat as I began to formulate the book I am writing[10]. Initially inspired by my “film noir personal journey” post, it has morphed into a deep dive not only into my personal history, but also the history of my family (legal and genetic) going back three or four generations.

This, then, is the “selfish” part: the discovery that the most popular posts I have written are the ones in which I speak directly about my own life and thoughts, leading me to begin to write what amounts to a “hey, I really like film noir…and here are some really fun stories about my family and me” memoir-research hybrid. One that I think will be very entertaining.

Whether an agent, publisher and/or the book-buying public ever agree remains an open question.

**********

Just bear with me (I had to write that phrase at some point) while I fumble around for a worthwhile conclusion to these thoughts and memories.

I am very hesitant ever to argue that the ends justify the means, meaning that my first instinct is to say that art produced by monstrous artists should be avoided.

But I cannot say that because, having formed highly favorable “first (and later) impressions” of various works of art produced by “monstrous” artists, I continue to love those works of art. I may see them differently, but the art itself has not changed. “Blue in Green” is still “Blue in Green,” regardless of what I learn about Miles Davis, and it is still my favorite song.

And that may be the key. Our store of information about a piece of art may change, but the art itself does not change. It is fixed, unchanging.

Of course, if Lynch and the patient-centered therapists are correct that we each need to interpret/appreciate (or not) works of art as individuals, then how we react to that piece of art WILL change as our store of information changes.

Shoot. I thought I had something there.

Well, then, what about the “slippery slope” argument?

Once we start down the path of singling out certain artists (and, by extension, their works of art) for opprobrium, where does that path lead?

The French Revolution devolved into an anarchic cycle of guillotining because (at least as I understand it) competing groups of revolutionaries began to point the finger at each other, condemning rival groups to death as power shifted between the groups.

This is admittedly an extreme example, but my point is that once we start condemning monstrosity in our public figures, it is difficult to stop.

It is also the case that very few of us are pure enough to condemn others. We all have our Henry Jekyll, and we all have our Edward Hyde, within us. I think the vast majority of us contain far more of the noble Dr. Jekyll than of the odious Mr. Hyde, but we all have enough of the latter to be wary of hypocrisy.

And if THAT is not a good argument, then I have one more.

Simply put, let us all put on our Lynchian-therapeutic cloaks and make our own decisions about works of art, bringing to bear everything we know and feel and think, including our conscience…while also understanding that blatant censorship (through public boycott or private influence) is equally problematic…

These decisions may be ethically uncomfortable, but as “Authorities,” they are ultimately ours and ours alone.

Until next time…

[1] Fun fact about Goodis: Philadelphia-born-and-raised, he is buried in the same cemetery as my father.

[2] Woolrich was also a self-loathing homosexual.

[3] This quote is found on page 61 of the March 25, 1985 issue of Jet, in a blurb titled “Miles Davis Can’t Shake Boyhood Racial Abuse.” The quote is apparently from a recent interview with Miles White of USA Today, but I cannot find the actual USA Today article.

As a counter, and for some context, here is a long excerpt from Davis’ September 1962 Playboy interview.

Playboy: You feel that the complaints about you are because of your race?

Davis: I know damn well a lot of it is race. White people have certain things they expect from Negro musicians — just like they’ve got labels for the whole Negro race. It goes clear back to the slavery days. That was when Uncle Tomming got started because white people demanded it. Every little black child grew up seeing that getting along with white people meant grinning and acting clowns. It helped white people to feel easy about what they had done, and were doing, to Negroes, and that’s carried right on over to now. You bring it down to musicians, they want you to not only play your instrument, but to entertain them, too, with grinning and dancing.

Playboy: Generally speaking, what are your feelings with regard to race?

Davis: I hate to talk about what I think of the mess because my friends are all colors. When I say that some of my best friends are white, I sure ain’t lying. The only white people I don’t like are the prejudiced white people. Those the shoe don’t fit, well, they don’t wear it. I don’t like the white people that show me they can’t understand that not just the Negroes, but the Chinese and Puerto Ricans and any other races that ain’t white, should be given dignity and respect like everybody else.

But let me straighten you — I ain’t saying I think all Negroes are the salt of the earth. It’s plenty of Negroes I can’t stand, too. Especially those that act like they think white people want them to. They bug me worse than Uncle Toms.

But prejudiced white people can’t see any of the other races as just individual people. If a white man robs a bank, it’s just a man robbed a bank. But if a Negro or a Puerto Rican does it, it’s them awful Negroes or Puerto Ricans. Hardly anybody not white hasn’t suffered from some of white people’s labels. It used to be said that all Negroes were shiftless and happy-go-lucky and lazy. But that’s been proved a lie so much that now the label is that what Negroes want integration for is so they can sleep in the bed with white people. It’s another damn lie. All Negroes want is to be free to do in this country just like anybody else. Prejudiced white people ask one another, “Would you want your sister to marry a Negro?” It’s a jive question to ask in the first place — as if white women stand around helpless if some Negro wants to drag one off to a preacher. It makes me sick to hear that. A Negro just might not want your sister. The Negro is always to blame if some white woman decides she wants him. But it’s all right that ever since slavery, white men been having Negro women. Every Negro you see that ain’t black, that’s what’s happened somewhere in his background. The slaves they brought here were all black.

What makes me mad about these labels for Negroes is that very few white people really know what Negroes really feel like. A lot of white people have never even been in the company of an intelligent Negro. But you can hardly meet a white person, especially a white man, that don’t think he’s qualified to tell you all about Negroes.

You know the story the minute you meet some white cat and he comes off with a big show that he’s with you. It’s 10,000 things you can talk about, but the only thing he can think of is some other Negro he’s such close friends with. Intelligent Negroes are sick of hearing this. I don’t know how many times different whites have started talking, telling me they was raised up with a Negro boy. But I ain’t found one yet that knows whatever happened to that boy after they grew up.

Playboy: Did you grow up with any white boys?

Davis: I didn’t grow up with any, not as friends, to speak of. But I went to school with some. In high school, I was the best in the music class on the trumpet. I knew it and all the rest knew it — but all the contest first prizes went to the boys with blue eyes. It made me so mad I made up my mind to outdo anybody white on my horn. If I hadn’t met that prejudice, I probably wouldn’t have had as much drive in my work. I have thought about that a lot. I have thought that prejudice and curiosity have been responsible for what I have done in music.

[4] This has actually impacted me directly. Privacy concerns prevent me from using names, but I have had long and painful discussions with people close to me who were either related to, or knew very well, artists whose work they admired but who were/are loathsome human beings.

[5] Purportedly, Allen and his quasi-step-daughter (Allen and Farrow never married) had been having a long-term affair.

[6] And, perhaps, of black-and-white cinematography more generally.

[7] There are exceptions to this, of course. As much as I love the Father Brown stories by G.K. Chesterton, his blatant anti-Semitism has likely permanently soured me on his writing.

[8] Acknowledging that “monstrosity” is not binary, but a continuum. We have all had monstrous moments, and even the most monstrous people have had a moment or two of being above reproach.

[9] Using a somewhat stricter definition of “personal” made the difference even starker.

[10] Tentative title: Interrogating Memory: How a Love of Film Noir Led Me to Investigate My Own Identity.