CN: this article mentions organized abuse, ritual abuse, mind control, abuser organizations, stigma and infuriating bullshit
There is a lot of misinformation out there about DID. Some of it is based on a lack of knowledge, and we can help by pointing towards scientific resources. In other situations, what we encounter is plain denial of science. People are misleading others intentionally and systematically to influence the way DID and the severe trauma behind it are perceived in society or in court. If patients are liars and their therapists frauds, then nobody has to listen to their testimony.
I want my readers to be able to identify science denial when they see it. That has become a core competence for anyone engaging in discourse online or trying to advocate for people with DID. We will look at an overview of techniques of science denial called FLICC.
FLICC is an acronym for the five categories of science denial techniques: Fake experts, Logical fallacies, Impossible expectations, Cherry picking and Conspiracy theories.
Fake experts
We are presented with the theories or opinions of people who seem to have expertise, but in reality they are not working in the field they are talking about, or they study a different area of that field. We saw this a lot during the corona pandemic, when lung doctors shared their opinions about the development of a pandemic, talking over epidemiologists who make that the main focus of their research. We see it regarding DID when people who study memory in healthy adults suddenly claim expertise about the way traumatic memory is processed in small children. Memory and traumatic memory are very different fields of expertise. People who study monotrauma and its treatment will not be experts for structural dissociation. It is important to pay close attention to the field people work in and study. Not every psychiatrist or psychologist is a DID expert.
Special problems concerning fake experts:
- Bulk Fake Experts: Sometimes fake experts organize. They start foundations or associations where a long list of supporters can be found. The conditions for being a supporter have nothing to do with scientific interest. Usually there is a financial reason, a wish for status or some other personal gain. These groups will show up everywhere they can to push their agenda and spread their misinformation. That way it looks like a common point of view. We can see this with the False Memory Foundation or the Satanic Panic movement.
- Magnified Minority: The opinions of a small group within the scientific community are presented out of proportion to other opinions. There is a minority that, for example, still thinks that DID has something to do with fantasy or that it is iatrogenic. These claims usually come from people who don't treat people with DID and have never studied people with DID. But their voices are amplified and blown out of proportion, while the majority of experts with differing opinions and better research are overlooked. DID scientists have no doubt that trauma is an essential factor in the development of DID.
- Fake Debate/False Balance: To create the image of a balanced discussion, scientists and pseudo-scientists are invited to debate. The real discourse that is going on in the scientific community is replaced with a fake discourse that includes false information. We are told that we need to hear 'both sides', but they don't have the same value, because one is based on research and the other isn't. That way people get the impression that there is an ongoing debate about something within the scientific community, that there is major disagreement, when really that is not the case. We can see this in discussions about recovered memories, the origin of DID, the validity of reports of extreme abuse etc. Presenting different points of view is important in journalism when it comes to opinions and political debate. It is misleading when it invites pseudo-science to the table.
- (Experts by Experience: People with lived experience are not researchers. They only work with a sample size of one and subjective personal experience. While that is valuable, it isn't scientific. There is a certain danger in declaring that personal experience is universal within DID and in creating personal concepts that could be harmful for others. Sometimes experts by experience openly spread misinformation because they take their experience literally and are unable to reflect on how that view might be distorted by trauma or dissociation. If that is presented as the Truth, we are just another fake expert. Making sure to explain when something is a personal concept helps to clarify. The problem is rarely with the experts by experience themselves; it is often in the way their followers share the tools and misrepresent them to a third party.)
Logical fallacies
Good logic connects true statements in a way that creates another true statement. I am female. I have a sister. Therefore my sister has at least one sister. In logical fallacies, the conclusion is faulty. 'The sky is blue' is a statement that is at least sometimes true. 'Blue is often used to represent cold temperatures' is true in our culture. 'Therefore the sky is cold' is not a logical conclusion that makes any sense. To detect logical fallacies it helps to identify the premises and to check whether there really is a logical connection between them. The statements 'people with DID are easy to hypnotize' and 'people who are hypnotized can be influenced to act like a different person while the hypnosis lasts, but not beyond that' do not logically result in 'DID is caused by therapists who implanted a new and lasting personality in people they hypnotized', especially when there is no proof this can be done, the symptomatic patient has never seen a therapist before, and there are recordings of every session since first contact to prove that no such thing happened. There is no valid way to get to that conclusion. There are a number of specific logical fallacies we can look at:
- Ad Hominem: In this technique a person or their character is attacked to distract from their arguments. We can see this when experts are criticized because of failures in their personal relationships or because of isolated incidents in therapy. Onno van der Hart's devastating treatment errors with one patient have led to people claiming that the whole theory of structural dissociation, which he only co-authored, is bad. The fact that it is peer-reviewed and widely accepted by DID scientists becomes unimportant. He is assigned a stereotypical role, the horrible perpetrator, and therefore everything he said is wrong and we don't even have to pay attention to anything he wrote in the decades before he messed up. There are valid things to criticize about the theory of structural dissociation, like how it leads to diagnostic criteria that are rubbish or its limited view of DID as an alter-disorder. It doesn't need ad hominem attacks for that. They just distract from the content.
Another way to attack experts is to call them obsessed with a topic and to suggest that their own belief in DID makes them see things that aren’t real. After all, they seem to find so many more people with that disorder than others who don’t believe in it…
- Misrepresentations: A situation is presented in a misleading way to create a desired impression. Usually this is done to make an argument easy to prove wrong: it isn't presented properly from the start. We can find this when DID deniers claim that therapists said there are multiple people living in one body and then ridicule that idea. But no proper expert has claimed that in a couple of decades. Their concept of parts of the personality was grossly misrepresented. That is called a straw man argument.
Sometimes an ancient hypothesis is presented and proven wrong when the scientific community has done that ages ago and nobody works with that hypothesis anymore.
- Ambiguous language: Words that are not properly defined or that can have multiple meanings are used to cause confusion, or the meanings of similar words are used interchangeably to misrepresent concepts. We see this a lot when integration and fusion are used interchangeably, even though there is quite a difference. When therapists tell us that being more integrated is the goal of therapy, they don't mean that they insist that fusion is our only option and that they will force us all the way there. The word dissociation is used for a whole collection of phenomena that cover a huge range of experiences. We end up with claims like 'every kind of dissociation can be controlled' that destroy treatment concepts. Sentences like 'everyone has parts' or 'everyone is multiple' are another common example. Vague language is used to obscure facts. Claiming that the theory of structural dissociation is 'just a theory', and therefore not important or rooted in research, makes use of the different meanings the word theory has in everyday language and in science.
- Oversimplification: Facts become distorted by leaving out nuance and differentiation or by cropping statements to create the illusion that a conclusion is valid. For example, the fact that there are usually child parts in DID can be reduced to people acting like children. People in age regression act like children. Therefore child parts in DID are only age regression. All the aspects of other dissociative symptoms are left out.
- False dichotomy: We are presented with only two possible explanations and pressured to pick one. That creates the illusion that there aren't more options or that two things cannot be true at the same time. DID is either always faked or always real, when in truth it can be faked but it is also often real. There is no reason why only one can be true.
- Single cause: Only one factor is considered to be of influence. The idea of several factors is rejected. When it comes to DID, the causes that people insist on are often simply wrong: fantasy and role playing, suggestion by therapists, tricks to get attention, socially reinforced behavior quirks etc. It's not just a single cause, it is also a false cause. Some people discuss the origin of DID using actual data. Opinions that insist that only disorganized attachment or only extreme abuse is the reason work within the single cause fallacy. It is pretty clear that there are multiple factors, and trauma and attachment are two of them. There is no single cause. Science is still busy figuring out how to weigh the factors, but trauma is not one that is considered uncertain in any way.
- False analogy: In this technique a similarity in one area is used to conclude that there must be similarities in other areas. People with DID hear voices, just like people with schizophrenia. Therefore they will both benefit from antipsychotics to manage their symptoms. The 5-6% of people with an overt presentation of DID draw a lot of attention with their behavior. Histrionic people create a lot of drama to draw attention to themselves. Therefore people with DID only want attention when they switch. Those are logical fallacies that obviously make no sense, but people use them all the time in the way they stigmatize people with DID.
- Red Herring: our attention is intentionally diverted from the main arguments and facts and directed towards weird stories, minor and unimportant details and different topics that are barely related. It is meant to lead us down a rabbit hole so that the main topic will not be presented and discussed. This can sometimes be combined with sea lioning. People pick an irrelevant detail and demand more and more information about it, even though it is unnecessary for understanding the main point and leads away from the real topic.
A special kind of red herring is called the blowfish strategy. Here, the methods of research are taken apart and minor problems are blown out of proportion. Studies are never perfect, but that doesn't mean they are useless either. Usually the problems are discussed within the article itself to make sure they are obvious. A common problem in DID research is the small sample size. There are just not enough people with DID who are stable enough to take part in research, or they are hard to reach, and the research is under-funded to begin with. Small sample sizes are a real problem. That doesn't mean the results are useless. Other studies, like those about the effectiveness of treatment, have no control group, because researchers were trying to get as many people to participate as they could. DID research is wide open to criticism of its methods because of the difficulties that come with it. While some of that is part of the scientific discussion that is happening for a good reason, DID deniers often attack findings that are good enough to make decent estimates. The issues are not as big as they try to make us believe.
Impossible expectations
People expect completely unrealistic standards or a level of certainty that cannot be accomplished by scientific methods. One of the big areas where we see this is the problem of correlation vs. causation. To prove a causation we would have to have complete control over a situation, manipulate one aspect and then observe and measure the effect. It is ethically impossible to do trauma science this way, since we can't isolate small children, traumatize them on purpose and then watch what happens. This is actually what nazi scientists did and what groups that make use of mind control are still doing today, but it is not an option for ethical science. That means we will never be able to prove a true causation. The only thing we will get is a strong correlation. Impossible expectation demands a causation before a result can be accepted, even when the correlation is much stronger than for other topics whose findings nobody attempts to deny. Empirical science works by certain rules, and we can't expect results that cannot be accomplished by our scientific methods. It is simply impossible.
A special version of impossible expectations is called 'moving the goal post'. This is where we can observe more sea lioning. We offer appropriate proof, but then new proof is demanded. It is never enough. If there is one study, then it needs to be replicated. But even if there are five studies with similar results, like when it comes to the role of fantasy proneness, that is still not enough. The hypothesis will never be abandoned, no matter how much proof we bring, because it is not about the facts.
On the flip side, the bar of proof for the preferred opinion is constantly lowered. Even research that is methodologically questionable is accepted if it shows the desired results. Suddenly we see people accept research where not a single participant actually had DID as valid proof for a hypothesis about DID. How can that be a DID study? In other situations, misremembering the presence of a word in a list of related words is considered proof enough that false memories of severe trauma can easily exist as well. The fact that memory is sometimes wrong is taken as enough to state that memories of abuse are also wrong.
Cherry Picking
In cherry picking, only small chunks of data are accepted and all the rest is ignored. DID deniers pick the kind of information they like and refuse to pay attention to the rest of the research and all the ways it contradicts the information they picked. We can see this when people insist that there is no recovery from DID because fusion doesn't always last. Newly fused people tend to fall apart again when under pressure. That is true, but it is also just a small part of the truth, and it ignores the testimonies of fused people that have been collected over the decades. Fusion is a real option, and people grow old and die seeing their great-grandchildren without falling apart again. It is just not true that it isn't possible. How far will the goal post be moved, beyond death? There is more information out there than the fact that fusion can be unstable at first. It takes a lot of denial and disregard for those who healed to invalidate their experience and their documented successes.
A special kind of cherry picking uses anecdotes to make a point. Because something was true in one case, it has to be true in every case. Some people fake DID. And sometimes there are false-positive diagnoses. That doesn't mean that everyone fakes DID or that all diagnoses are false positives. In fact, people fake other disorders too, and there are false positives for every diagnosis out there. These situations aren't even specific to DID. Yet the suggestion that the famous case of Sybil might not have been real is used as an argument against the whole diagnosis instead of being treated as an isolated case.
A similar way of cherry picking is called slothful induction. It shows up when people neglect information that would be available because it doesn't fit their opinion. They make it a point not to get a balanced view and leave out relevant information. We see this a lot in the media and even with therapists who deny recent research and insist on sticking to outdated ideas. Brain studies that show predictable changes in the brains of DID patients are ignored. Brain scans that prove that dissociative parts are distinctly different and cannot be imitated by actors are ignored. So-called professionals prefer to stick to their concept that it is all made up and are fully immune to data. There is a difference between not knowing, because it is impossible to keep up with all psychological science, and intentionally refusing to know pieces of information that are inconvenient.
Wishful thinking also belongs in this category. With a focus on details that are pleasing and the neglect of all other kinds of information, there is room for fantasies. Usually, therapy techniques that promise to heal even the worst of complex trauma in no time turn out to be wishful thinking. The participants are cherry picked and exclude the typical presentation of a diagnosis. Working with only the easy cases without comorbidities distorts the results of the study and presents a picture that is pleasing but too good to be true. Dropout rates are ignored to keep up the pretense. That way we get very clean studies of very unlikely patients and a promise that it will still work for everyone else.
Conspiracy theories
Organized science denial includes the creation of conspiracy theories around the topic. One big conspiracy was invented by the False Memory Foundation. The claim was that either the patients are making things up to ruin their perpetrators, or their therapists somehow implant these memories to harm other people. It never made a lot of sense, since most victims never tell anyone who their abusers are and don't press charges. Symptoms showed up long before people entered therapy, and often enough there are visible scars that prove the abuse happened. Theories around false memories have been disproven for years. It is possible to make someone believe they probably saw Bugs Bunny in Disneyland. But nobody suddenly develops memories of abuse just because someone told them a story. It isn't scientific. Today we know that a lot of abusers organized to discredit their victims in court. The False Memory Foundation is an abuser organization. But it seems like their conspiracy of implanted memories won't die with them.
Similarly, there is another movement under the cover of Satanic Panic that tries to discredit survivors of ritual abuse by calling it all fake and creating a conspiracy around it. A conspiracy that calls something else merely a conspiracy. Every argument against it is absorbed into the theory. Of course they would say that! That is how they operate! Openly ridiculing people who support survivors of ritual abuse and using all the techniques mentioned above is meant to distract attention away from abusive structures.
Then there is the mess made by the QAnon followers who share proof of extreme violence against children, material that is created by organized abusers, but then use it within their own conspiracy narrative. It makes real testimonies look like they are only part of that group and connects them to the weird teachings that are spread through the same channels. It destroys credibility, even though some of it is actually true, simply by putting it into an unacceptably unscientific context. There is some conspiracy happening in organized abuse. But not all conspiracies are connected. Once we look at DID we will encounter all kinds of wild stories, and it takes a lot of critical thinking to figure out what is real, what is misrepresented and what is humbug.
A specific problem that shows up with conspiracy theories is called quote mining. People search the literature for sentences or pieces of sentences they can take out of context to change their meaning. These quotes are then newly interpreted, and whole articles are written about that new interpretation, one that was never intended by the author. If you read this text, you will find that I wrote the words 'we are just another fake expert'. Quote mining would take that completely out of context, add my name to the quote and use it against me to tell the whole world that I admitted that I am somehow fake. It doesn't matter that I never called myself an 'expert' to begin with or that I was talking about something different. When we see fishy quotes, we need to check the context. There is no conspiracy here, no secret knowledge or confessions. Some people just have their own agenda, and they use whatever means they can to spread it.
This article is not meant to be a study of the misinformation about DID that is out there, so the examples are not exhaustive. My main point is to show you that there are tools used in science denial and that they are used by the people who deny DID. If we know the tools, it will be easier to recognize them when we see someone using them. It gives us an idea of how to steer the conversation back to actual science and how to confront logical fallacies. We might also notice our own fallacies and our own need for education. Everyone is at least slightly biased by existing knowledge and misses other aspects. By broadening our understanding we get a better idea of what the actual discourse looks like, and we can identify fake discourse that is just meant to keep us busy and distracted. Make sure to consider the possibility that someone is just uneducated and repeating something they heard. A lot of people don't look into the science; they just trust the wrong people and repeat their arguments.
It is ok not to argue with people who intentionally use techniques of science denial. They know exactly what they are doing and it is not likely that our arguments will change their mind. The problem might not be their lack of knowledge. It might be their general agenda. We can leave the arguing to the actual experts and save a lot of energy. Still, we are free to call out bullshit whenever we encounter it.
You can find a video series explaining FLICC with examples from climate change denial that we used as a base over here.