Social Question
Do you think there is too much societal emphasis on trauma?
I recently read an interesting article in the New Yorker about the preponderance of trauma narratives in fiction, i.e. a character always has a traumatic backstory that explains who they are and why they do what they do; they must work through the trauma and often conclude that they can never fully escape it. The criticism is that complex character motivations are often eschewed in favor of chalking everything up to trauma; the article cited a recent adaptation of The Turn of the Screw that inserts a rape into the governess's past to explain everything she does. In the original story, we don't know why she is the way she is; the appeal is the enigma.
PTSD is the most rapidly growing psychological diagnosis, in part because the definition has broadened. That's not to say trauma isn't real or that it's less common than it seems. But I'm wondering whether all the emphasis on it is healthy. There's often a sense, as I said above, that trauma can never be fully overcome and requires ongoing therapy. The cynical part of me sees a profit motive there. And it's not just personal trauma; the collective trauma of an entire demographic is often cited as the reason for certain societal problems.
Can someone ever truly overcome trauma? Or are they just lying to themselves? Do you think trauma can explain all our motivations and actions (not just in fiction)? Is having been traumatized a significant part of who you are and your relationships with others?
This is a complex and broad topic. Any answers welcome. You do not have to answer the specific questions I asked.