Saturday, March 7, 2026

Joan Didion: The Ralphs at Sunset and Fuller

 


Joan Didion published her first two collections of essays as Slouching Towards Bethlehem and The White Album. In which essay in her early work does she discuss going to the Ralphs on Sunset Boulevard at Fuller in Hollywood wearing a bathing suit?

####

The episode you’re thinking of appears in “The White Album,” the title essay of Joan Didion’s 1979 collection The White Album.

In the opening section of that essay, Didion describes the disorienting atmosphere of Los Angeles in the late 1960s—Hollywood parties, recording studios, the Manson murders, and the general sense that ordinary social rules had become unstable. As part of that portrait, she recounts going to a Ralphs supermarket on Sunset Boulevard at North Fuller wearing only a bathing suit and a thin cotton skirt, and noticing that no one seemed to find this unusual. The moment serves as a small but telling example of how social boundaries in Los Angeles had become oddly fluid.

Her point is less about the outfit than about the atmosphere of the time:

  • everyday behavior felt slightly surreal,
  • social expectations seemed loosened or irrelevant,
  • and people moved through public space with a kind of detached indifference.

The anecdote appears very early in the essay as part of the famous opening passage:

“We tell ourselves stories in order to live…”

From there she moves through a collage of scenes—Jim Morrison recording sessions, Huey Newton’s trial, the Manson murders, Hollywood parties, and her own nervous breakdown—to illustrate how fragmented the culture felt.

So the answer:

   Essay: The White Album

   Collection: The White Album (1979)

It’s a good example of Didion’s method: a tiny observational detail—a grocery trip in a bathing suit—quietly revealing something larger about Los Angeles culture and the psychological mood of the late 1960s.

If you’d like, I can also point out three or four other small LA moments in that essay that work the same way (the Doors studio scene, the Malibu house, the Black Panthers interview), because they’re almost constructed like film cuts.

———-

[This beautiful answer is actually wrong. The description of going to the supermarket in her bathing suit is in her first book, Slouching Towards Bethlehem, in the chapter “Los Angeles Notebook.” The same chapter is also famous for a several-page description of the alleged grand and adverse effects of the Santa Ana winds on the human psyche.]

Friday, March 6, 2026

Genomic Health OIG Penalty (no admission of error)

This case involved no admission of error by Genomic Health.

###

Before 2018, Medicare had a rule known as the “14-day rule” that governed billing for laboratory tests associated with hospital encounters. If a test was ordered within 14 days of a patient’s hospital discharge, the test was considered part of the hospital service bundle, meaning the hospital—not the outside laboratory—had to bill Medicare and use its reimbursement to pay the genomic test laboratory. For expensive genomic tests performed by outside reference laboratories, this created operational and financial tension because hospitals were often reluctant to assume financial responsibility for these tests.

Genomic Health, the company that developed the Oncotype DX cancer genomic assay (later acquired by Exact Sciences in 2019), became the subject of a federal investigation over how it handled this rule. According to the U.S. Department of Justice, between roughly 2008 and 2020 the company engaged in practices designed to bypass the 14-day rule. Prosecutors alleged that Genomic Health billed Medicare directly for tests that should have been billed to hospitals, and that the company encouraged providers to cancel and reorder tests after the 14-day window so that the laboratory could bill Medicare directly. The government also alleged that the company sometimes did not pursue payment from hospitals for tests ordered within the 14-day window, which prosecutors characterized as potentially providing improper financial benefits to hospitals.

In October 2023, Genomic Health agreed to pay $32.5 million to resolve these allegations under the False Claims Act, without admitting liability. The settlement resolved whistleblower lawsuits that had been filed earlier and covered billing practices related to Medicare claims for genomic testing. The case became a notable example within the molecular diagnostics industry because it illustrated how the pre-2018 Medicare billing structure created significant incentives and compliance risks around the timing of genomic test orders.

The rule itself was subsequently modified by CMS in 2018 to allow certain molecular pathology tests, particularly those performed by independent laboratories, to be billed directly by the performing laboratory rather than bundled into the hospital payment, when the specimen came from a hospital outpatient.

Sources

https://www.justice.gov/usao-edny/pr/genomic-health-inc-pay-325-million-resolve-allegations-relating-submission-false

https://www.justice.gov/archives/opa/pr/genomic-health-inc-agrees-pay-325-million-resolve-allegations-relating-submission-false

Sidebar
see also the topic of "surgeon-initiated ordering."  https://ascopost.com/News/58066

Thursday, March 5, 2026

Mariachi and Evolution



American popular music changes quite a bit from decade to decade, so it’s easy to distinguish the hit songs of the 1930s from those of the 1940s, 1950s, 1960s, or 1970s. However, when I listen to mariachi music at Mexican cafés or played by Mexican workers or vendors, it seems to sound pretty much the same as it did 20 or 30 years ago.


Your observation is actually shared by many musicologists. Mariachi music changes more slowly than mainstream American popular music, and there are several reasons for that.


First, mariachi is closer to a traditional or folk genre than to commercial pop. American pop music is driven heavily by the recording industry, youth culture, and fashion cycles—so every decade produces new sounds, technologies, and marketing styles (big band → rock → disco → hip-hop, etc.). Mariachi, by contrast, developed in the 19th century in western Mexico and reached its classic form in the early 20th century. Once the instrumentation and style stabilized—violins, trumpets, guitarrón, vihuela, guitar, and vocal harmonies—the genre became associated with tradition rather than novelty.


Second, mariachi functions socially as ceremonial music. It is strongly tied to occasions such as weddings, birthdays, serenades, religious festivals, and national holidays. Music used in these settings tends to preserve familiar repertoire and style. People want songs that sound like mariachi is supposed to sound, just as a brass band at Oktoberfest or a New Orleans jazz funeral follows recognizable patterns.


Third, the repertoire itself is somewhat “canonized.” Many of the most famous mariachi songs date from roughly the 1930s–1960s golden age of Mexican cinema and radio—songs like Cielito Lindo, El Rey, or Guadalajara. These are repeatedly performed, so listeners constantly hear music written decades ago.


Fourth, stylistic changes do happen—but subtly. Modern mariachi groups may use slightly different arrangements, add more trumpets, amplify instruments, or incorporate pop influences (sometimes called mariachi pop or mariachi romántico). But these changes are incremental and stay within the recognizable mariachi sound.


So the short answer is: mariachi evolved earlier and then stabilized, while American pop music is built around continual reinvention. What you’re hearing in cafés today is often music whose style was largely set 80–100 years ago, which makes it sound timeless compared with rapidly shifting pop genres.



I find it very interesting that you can almost instantly place an American movie in the correct decade from 1900 to 2000


That’s a very perceptive observation, and it reflects a broader cultural phenomenon. American movies carry a strong “time stamp” because so many elements of filmmaking change rapidly with fashion, technology, and social norms. Even if the story is timeless, the surrounding details tend to reveal the decade almost immediately.


One major factor is technology. Film stock, lighting methods, sound recording, and editing techniques changed dramatically over the century. A movie from the 1930s has a different visual texture than one from the 1950s Technicolor era, which looks very different from the grainy handheld style that became common in the 1970s. By the 1990s, digital effects and smoother cinematography gave films yet another distinct look.


Another reason is acting style and dialogue. Early films often feature theatrical acting and very formal speech. Mid-century films have a different cadence—think of the crisp delivery in 1940s noir. By the 1970s, performances became looser and more naturalistic, and by the 1990s the dialogue rhythms reflect contemporary conversational styles.


Then there are the visible markers of everyday life—clothing, hairstyles, furniture, cars, street scenes, and even how people smoke or hold a drink. Each decade has recognizable visual cues. A single hairstyle or pair of glasses can immediately place a film in the 1970s or 1980s.


Finally, cultural attitudes shift. What characters talk about, how relationships are portrayed, and what subjects are considered acceptable all change over time. For example, the moral tone of a 1930s or 1950s film is usually quite different from the frankness of films made after the social revolutions of the 1960s and 1970s.


So in a sense, movies function like time capsules. Even when filmmakers aren’t trying to capture a historical moment, the technologies, fashions, and assumptions of the time inevitably leak into the film, making it surprisingly easy for viewers to place it in the right decade.



I would argue, though, that many movie types, like the romantic comedy, don’t really look any different in 2025 than in 2020 or 1995. The shape of a cell phone might change visibly. Similarly, we can watch The West Wing, from 1999, and it’s hard to tell it wasn’t made this year.


Yes—many people have noticed the same thing. The strong decade-to-decade “look” of films largely flattened out sometime in the 1990s or early 2000s. After that point, the visual language of mainstream filmmaking became relatively stable, so a movie from 1995, 2010, or 2025 can look surprisingly similar.


Several forces contributed to this.


1. The technology plateaued.

From 1930 to 1990, the industry went through huge changes: sound, color, widescreen, lighter cameras, new film stocks, Steadicam, nonlinear editing, etc. Each wave created a distinct look. By the late 1990s, however, filmmakers had already reached a kind of technical equilibrium: high-quality color photography, stable lighting, professional sound, and sophisticated editing were standard. Digital cameras and CGI improved things, but they didn’t radically change the basic look of ordinary scenes the way earlier innovations did.


2. Professional style standardized.

Hollywood gradually converged on a stable “classical” visual grammar: shot–reverse-shot conversations, naturalistic lighting, continuity editing, and fairly realistic acting styles. That grammar had already matured by the 1990s. Once it stabilized, later films mostly refined it rather than reinventing it.


3. Fashion cycles slowed visually on screen.

Clothing in the 1930s, 1950s, and 1970s was dramatically different, so a single frame could reveal the decade. Since the 1990s, everyday clothing—jeans, T-shirts, jackets, business suits—has been much more stable. A person in a blazer and jeans could belong to 1998 or 2025.


4. Television style became cinematic.

Shows like The West Wing (1999–2006) already used modern production techniques: fast dialogue, handheld cameras, “walk-and-talk” staging, and high production values. Those techniques are still common today, which is why the show doesn’t feel visually ancient the way a 1970s TV drama would.


5. Cultural continuity.

The social world depicted in many films hasn’t shifted as dramatically as it did between, say, 1950 and 1970. The biggest obvious marker today is often technology—cell phones, laptops, and interfaces—but those are relatively small visual cues.


So you could say that the twentieth century was a period of rapid stylistic change in film, while the last thirty years have been more of a long plateau. If someone showed you a random Hollywood romantic comedy, you might easily distinguish 1985 from 2005, but 1998 from 2025 could be surprisingly hard—unless someone pulls out a flip phone. 

Apple Pivots to $599 Laptop - Why Here, Why Now?

 Apple just released a $599 laptop. Give five concise reasons why not before now, and five concise reasons why now. I assume some ecosystem changes flipped their risk-benefit decision matrix.

Miracle Cures at Decade-Intervals, for the Pharma Discovery Process

 Hasn’t this been tried before? Like the interest around 2000 in identifying libraries of single nucleotide polymorphisms, or the agreements with 23andMe that didn’t go anywhere, or the ideas for using high-throughput combinatorial chemistry to identify drug targets. It seems like we’ve seen this dance before, just to a different tune.

///

Tempus AI, Inc. (NASDAQ: TEM), a technology company leading the adoption of AI to advance precision medicine, and Merck, known as MSD outside of the United States and Canada, today announced an expanded, multi-year collaboration aimed at accelerating the discovery and development of precision medicine biomarkers and supporting Merck’s oncology and potentially broader therapeutic portfolios. “This collaboration builds on our existing relationship and reflects our shared commitment to harnessing the power of multimodal datasets with AI to deliver better options for patients,” said Ryan Fukushima, CEO, Data & Apps at Tempus. “We’ve spent years configuring our Lens Platform to seamlessly leverage our library of de-identified multimodal data with the necessary AI computing power to train and fine-tune specific models for healthcare. Working with the great scientists at Merck, we have exciting opportunities to translate the insights from AI models into precision medicine strategies and improve patient outcomes across oncology and beyond.” Under the terms of the agreement, Merck will use Tempus’ de-identified data along with Tempus’ Lens Platform and Workspaces environment, which offers an advanced computational configuration powered by one of the industry’s largest GPU infrastructures, which enables researchers to efficiently conduct complex analyses on training-ready multimodal datasets, generating novel insights to accelerate the development and optimization of candidate therapies at scale.

“The combination of new AI technologies and large curated multimodal data sets are transforming the way we conduct discovery research,” said George Addona, Senior Vice President, Discovery, Preclinical Development and Translational Medicine, Merck Research Laboratories. “This collaboration with Tempus positions Merck to advance our precision oncology strategy through the application of the latest AI/ML capabilities to discover novel precision biomarkers, identify mechanisms of cancer cell resistance, and inform rational combinations for drugs in our early pipeline.”

Yes—your instinct is quite accurate. The Tempus–Merck announcement is part of a recurring cycle in biomedical R&D where a new technology promises to unlock drug targets from large biological datasets. Variants of this idea have appeared several times in the past 30 years. The tune changes (SNPs, combinatorial chemistry, genomics, AI), but the underlying dream—systematically mining biology for drug targets—remains the same.

Here are a few of the earlier “dances” that resemble what Tempus and Merck are describing.


1. SNP-mapping and genomic target discovery (late 1990s–early 2000s)

Around 2000, there was enormous enthusiasm for creating large libraries of single nucleotide polymorphisms (SNPs) across the human genome.

Companies like:

  • Celera

  • Millennium Pharmaceuticals

  • deCODE Genetics

argued that linking SNP patterns to disease risk would reveal entirely new drug targets.

The logic was:

  1. Map SNPs across populations

  2. Associate SNP haplotypes with disease

  3. Identify causal genes

  4. Build drugs against those pathways

Pharma companies signed multi-year data access deals very similar in tone to today’s Tempus collaboration.

The outcome:

  • Some scientific insights

  • A few targets

  • Far fewer drugs than hoped

The main problem was that complex diseases rarely reduce to simple genetic targets.


2. High-throughput combinatorial chemistry (mid-1990s)

Before genomics, pharma believed chemistry itself could generate drugs industrially.

The idea:

  • Create millions of compounds using automated synthesis

  • Screen them against biological targets

  • Let scale solve the discovery problem

Companies built enormous compound libraries and robotic screening facilities.

What happened:

  • Screening scale increased dramatically

  • But hit quality remained low

  • Biology, not chemistry, was the bottleneck

By the early 2000s, pharma quietly pivoted toward biology-driven discovery.


3. Consumer genetics data (23andMe partnerships)

Another echo of today’s strategy appeared in the 2015–2020 era.

Several pharma companies partnered with 23andMe to use their massive genetic database.

For example:

  • GlaxoSmithKline invested $300M in 23andMe

  • The promise: mine millions of genomes to discover drug targets

The logic was almost identical to the Tempus story:

  • Large dataset

  • AI / statistical analysis

  • New therapeutic targets

Results so far:

  • Some promising early programs

  • But few major clinical successes yet

Again, the bottleneck is translating correlations into actionable biology.


4. Systems biology / omics platforms (2000–2015)

There were also waves involving:

  • Gene expression arrays

  • Proteomics platforms

  • Network biology

  • “Systems medicine”

Each promised that integrated biological data would reveal hidden pathways.

These efforts produced valuable science, but drug discovery remained slow and uncertain.


5. What is actually new in the Tempus model

The Tempus–Merck collaboration is not entirely the same as those earlier efforts. A few things really are different:

1. Multimodal clinical data
Tempus combines:

  • Genomics

  • Pathology images

  • Clinical outcomes

  • Treatment histories

Earlier datasets were usually single-modality.


2. Real-world oncology data

Tempus built its database from clinical oncology practice, not just research cohorts.

This potentially allows discovery of:

  • drug resistance mechanisms

  • response predictors

  • combination strategies


3. Modern machine learning

AI models can analyze:

  • pathology images

  • sequencing data

  • clinical records simultaneously

That kind of integrated modeling was not technically feasible in 2000.


6. The deeper recurring problem

Despite the new tools, the core difficulty remains:

Biology is extremely messy.

Even with massive datasets:

  • correlations are easy

  • causal mechanisms are hard

Drug discovery still requires:

  • experimental validation

  • animal models

  • clinical trials

No dataset can eliminate those steps.


7. The realistic interpretation

The Tempus–Merck deal should probably be viewed as:

a discovery-enabling data partnership, not a drug discovery revolution.

Best case:

  • identify biomarkers

  • stratify patients better

  • discover a few useful targets

Worst case:

  • lots of correlations

  • little therapeutic payoff

Historically, most waves land somewhere in the middle.


A slightly cynical industry rule

Veterans of pharma R&D often say:

Every decade we invent a new technology that promises to industrialize drug discovery.

Examples:

  • combinatorial chemistry

  • genomics

  • proteomics

  • systems biology

  • AI

Each improves tools somewhat, but drug discovery stubbornly remains artisanal.


If you’d like, I can also explain why Tempus specifically is attractive to pharma despite this history, and why companies like Merck are still willing to pay for these collaborations. It actually has more to do with clinical trial strategy and biomarker enrichment than with discovering entirely new drugs.

Could you write that in essay format, please? Of course, use a bullet point or two where it really makes a difference.

The announcement of a multi-year collaboration between Tempus AI and Merck to use large multimodal datasets and artificial intelligence to discover precision medicine biomarkers has a familiar ring to anyone who has watched biomedical innovation cycles for several decades. The language—large datasets, computational analysis, discovery of novel drug targets—echoes earlier waves of enthusiasm that promised to industrialize the discovery of new therapies. The technology evolves, but the core ambition remains remarkably constant: use large-scale biological data to systematically identify therapeutic targets.

Around the year 2000, the most prominent version of this idea centered on single nucleotide polymorphisms (SNPs). After the Human Genome Project, biotechnology companies and academic groups began cataloging large libraries of SNP variants across human populations. The hope was straightforward. If particular SNP patterns correlated with diseases such as diabetes, Alzheimer’s disease, or cancer, then the associated genes might reveal the underlying biological pathways. Once those pathways were identified, drug developers could design therapies that modulate them. Companies such as Celera, Millennium Pharmaceuticals, and deCODE Genetics built business models around precisely this concept. Pharmaceutical companies signed data-sharing agreements that sounded strikingly similar to today’s AI-driven collaborations.

The scientific insights were real. The genetic architecture of many diseases became clearer, and certain targets did emerge. Yet the overall expectation—that genome-wide association studies would rapidly produce a pipeline of new drugs—proved overly optimistic. Most complex diseases are polygenic and mechanistically complicated, and genetic correlations rarely translate directly into tractable therapeutic targets.

An earlier wave, in the mid-1990s, focused not on biology but on chemistry. At that time, pharmaceutical companies invested heavily in high-throughput combinatorial chemistry, an approach that sought to generate millions of chemical compounds automatically and screen them against biological targets. The belief was that scale alone could solve drug discovery: if enough molecules were synthesized and tested, useful drugs would inevitably appear. Massive robotic screening facilities and compound libraries were constructed across the industry.

The results were instructive. Screening throughput increased dramatically, but the number of genuinely useful drug leads did not increase proportionally. Researchers eventually realized that the bottleneck was not the number of molecules but the quality of biological targets and the complexity of disease biology. The industry gradually shifted toward more biology-driven discovery programs.

More recently, another variant of the same idea appeared in partnerships involving consumer genetic databases, particularly those built by companies such as 23andMe. Pharmaceutical firms recognized that millions of genotyped individuals might reveal disease-associated genetic variants at an unprecedented scale. The collaboration between GlaxoSmithKline and 23andMe, for example, promised to mine population genetic data for drug targets using advanced statistical and machine-learning methods. Once again, the language emphasized large datasets, computational analysis, and systematic discovery of new therapies.

These collaborations have produced some promising programs and interesting biological insights, but the transformative wave of new drugs that many expected has not yet materialized. As before, identifying statistical associations in data has proven much easier than translating them into actionable biological mechanisms and safe, effective therapeutics.

Seen in this historical context, the Tempus–Merck collaboration appears less like a revolution and more like the latest technological iteration of a recurring strategy. What has changed is the nature and scale of the data. Tempus has assembled a large database that integrates several types of information simultaneously—genomic sequencing, clinical records, treatment outcomes, and pathology images. In addition, modern machine-learning systems can analyze such heterogeneous datasets in ways that were technically infeasible twenty years ago.

The distinctive features of the current approach are therefore:

  • Multimodal data integration: combining genomic, imaging, and clinical information rather than analyzing each type separately.

  • Advanced machine learning: models capable of identifying patterns across diverse datasets.

These advances genuinely expand the analytical toolkit available to researchers. Yet the fundamental challenge remains the same as it was during earlier waves of genomic or combinatorial enthusiasm: biological causation is extraordinarily complex. Statistical correlations emerging from large datasets must still be validated experimentally, interpreted biologically, and ultimately tested in clinical trials. None of these steps can be replaced by computation alone.

For that reason, many experienced observers view such collaborations in a more pragmatic light. The greatest near-term value of platforms like Tempus may not lie in discovering entirely new drugs but rather in improving biomarker discovery and patient stratification. In oncology, identifying which patients respond to a particular therapy, or why resistance develops, can be just as important as discovering a new molecular target. Large clinical datasets linked to treatment outcomes are particularly powerful for that purpose.

Indeed, the pharmaceutical industry has seen enough cycles of technological enthusiasm to recognize a recurring pattern. Roughly every decade, a new technology promises to transform drug discovery into a more systematic and scalable process. Among the successive candidates have been combinatorial chemistry, genomics, proteomics, systems biology, and now artificial intelligence. Each wave contributes useful tools and insights, but none has eliminated the inherent uncertainty of translating biological knowledge into safe and effective medicines.

In that sense, the Tempus–Merck collaboration represents both progress and continuity. The analytical tools are more sophisticated and the datasets vastly larger than those available in earlier eras. Yet the underlying aspiration—to mine large biological datasets in search of therapeutic breakthroughs—remains a familiar theme in the history of biomedical innovation. The dance continues, even if the music has changed.

Medicare and War Injuries

 What is the section in the Social Security Act about Medicare that says that Medicare does not cover injuries due to war? Just a sentence or two.

About historical psychiatrist Harry Stack Sullivan

 What did Harry Stack Sullivan actually DO for patients? If I were a patient of his, would I be in therapy or inpatient? Either way, how would I describe what I experienced as therapy, and how much did I get out of it?

Harry Stack Sullivan was one of the founders of interpersonal psychiatry, and his clinical work looked quite different from classical Freudian psychoanalysis. If you had been his patient in the 1920s–1940s, your experience would depend heavily on how ill you were, because Sullivan worked with both inpatients with schizophrenia and outpatients in psychotherapy.

Below is the closest reconstruction of what it would have felt like from a patient’s perspective.


1. If you were very ill (e.g., schizophrenia): inpatient treatment

Sullivan spent years working in psychiatric hospitals such as Sheppard Pratt Hospital in Maryland, where he treated patients with schizophrenia. At that time, most psychiatrists believed schizophrenia was essentially untreatable. Sullivan disagreed.

What he actually did:

  • Created a calm interpersonal environment.
    He believed symptoms arose partly from distorted relationships, so the hospital ward itself had to become a healthier social environment.

  • Frequent direct conversations with patients.
    Rather than interpreting dreams or unconscious drives, he spent long periods talking with patients about their thoughts, fears, and interactions with others.

  • Close therapeutic relationships.
    He believed improvement came from experiencing a trustworthy interpersonal relationship with the therapist.

  • Training nurses to interact therapeutically.
    The entire ward staff became part of the treatment system. Nurses were instructed to engage patients in respectful, stable relationships.

What you might say as a patient:

“Dr. Sullivan talked with me a lot about how I felt around people—what frightened me, what I thought others meant when they spoke to me. He didn’t treat me like I was crazy; he tried to understand how I experienced things.”

What patients got out of it

Some patients did improve. Sullivan showed that structured, respectful environments could reduce psychotic distress, which influenced later milieu therapy and therapeutic communities.


2. If you were an outpatient in psychotherapy

Sullivan also conducted psychotherapy with less severely ill patients.

Unlike Freudian psychoanalysis:

  • You would not necessarily lie on a couch

  • Sessions were often face-to-face conversations

  • The focus was relationships, not sexual drives or childhood fantasies alone

Typical session themes:

  • conflicts with friends, partners, or coworkers

  • anxiety in social situations

  • patterns in how you interpret other people’s behavior

He believed anxiety comes from interpersonal situations, especially early relationships with caregivers.

What the therapy would feel like

A patient might describe it like this:

“We spent most of the time talking about the people in my life—my family, friends, and how I reacted to them. Dr. Sullivan was very interested in the misunderstandings I had with others and how my expectations shaped those interactions.”


3. A key technique: “participant observation”

Sullivan described himself as a participant observer.

That meant:

  • he was actively engaged in the interaction, not a silent analyst

  • he watched how you related to him during the session

  • he might gently point out patterns

Example:

“Notice how you expected me to criticize you just now. Do people often react that way toward you—or do you expect them to?”

This approach later influenced modern relational psychotherapy and interpersonal therapy (IPT).


4. What patients might realistically say they gained

Patients often reported:

Benefits

  • feeling deeply understood

  • recognizing patterns in relationships

  • reduced anxiety about interacting with others

Limitations

  • treatment could be long and exploratory

  • it did not necessarily give structured coping skills (like modern CBT)

So a typical retrospective comment might sound like:

“He helped me understand how my fears about other people shaped my life. I became less suspicious and more comfortable with relationships.”


5. Sullivan’s biggest historical contribution

His main innovation was the idea that mental illness is fundamentally interpersonal, not purely biological or intrapsychic.

This influenced later therapies such as:

  • Interpersonal Psychotherapy (IPT)

  • attachment theory

  • relational psychoanalysis


In simple terms:
If you had been Sullivan’s patient, you would mostly remember long, serious conversations about relationships, conducted in a respectful and engaged way, with the belief that changing interpersonal patterns could improve mental health.


If you want, I can also tell you something fascinating about Sullivan that makes his therapy style make more sense: his own personality and lifestyle were unusually secretive and eccentric, which shaped how he practiced psychiatry.


##

I've written that I'm lost when someone says, "AA is a spiritual treatment for a physical disease" (liberally peppered with the word GOD in every paragraph), yet I believe strongly it's obvious that "AA is a social treatment for a physical disease."

Your observation actually sits right in the middle of a long-standing debate about how Alcoholics Anonymous works. What you’re noticing—that it looks like a social treatment for a physical disease—is very close to how several psychiatrists and sociologists have analyzed it, even though the movement itself prefers the language of “spiritual recovery.”

And interestingly, the view you’re expressing would have made a lot of sense to Harry Stack Sullivan.


1. Sullivan would likely call AA an interpersonal therapy system

Sullivan believed that many psychiatric symptoms are maintained or relieved through patterns of human relationships. In his model, the central therapeutic factor is not insight or doctrine but corrective interpersonal experience.

If Sullivan watched an AA meeting, he would probably notice:

  • Structured social interaction

  • Mutual identification (“I’ve been there too”)

  • Reduction of shame through shared narrative

  • Stable roles and expectations

  • Repeated reinforcement of behavioral norms

In his language, AA provides a corrective interpersonal field.

In simpler terms: the therapy is the relationships themselves.


2. The “spiritual” language may function psychologically rather than literally

Even though AA emphasizes God or a “higher power,” many observers note that the effective mechanisms often look social rather than theological.

For example:

Key mechanisms that clearly operate socially:

  • Identification: “people like me recovered”

  • Accountability: sponsors and regular meetings

  • Narrative reconstruction: telling your story differently

  • Status reversal: the former drunk becomes a helper

  • Belonging: replacing the drinking network with a recovery network

Those mechanisms would all look extremely familiar to someone trained in interpersonal psychiatry.


3. Why AA describes itself as spiritual

AA historically framed recovery as spiritual for several reasons:

  1. Its founders were influenced by the Oxford Group.

  2. “Spiritual awakening” was a culturally acceptable way to describe profound psychological change in the 1930s.

  3. The language helped explain recovery in a moral universe rather than a medical one.

But many modern participants interpret “higher power” in very flexible ways—community, nature, or simply the group itself.


4. Your formulation is actually quite sophisticated

Your statement:

“AA is a social treatment for a physical disease.”

captures something very important.

The model becomes:

  • Physical disease: addiction biology

  • Social treatment: structured recovery community

  • Psychological effect: identity change and reduction of shame

In fact, a number of addiction researchers now emphasize that recovery is strongly predicted by social network change—the replacement of drinking peers with recovery peers.


5. A Sullivan-style summary

If Sullivan had written a one-sentence interpretation of AA, it might have been something like:

Recovery occurs when a person replaces destructive interpersonal patterns with a stable community organized around sobriety.

That is very close to your intuition.


6. The paradox

The irony is that the spiritual language may help create the social structure that does the therapeutic work.

The theology may not be the mechanism—but it helps stabilize the group identity.