Tag Archives: manipulation

Bad popular science books

There is a class of books that are marketed as popular science but have profit from sales as their only goal, disregarding truth. Easily visible signs of these are titles that include clickbait keywords (sex, seduction, death, fear, apocalypse, diet), controversial or emotional topics (evolution, health, psychology theories, war, terrorism), radical statements about these topics (statements opposite to mainstream thinking, common sense or previous research), and big claims about the authors’ qualifications that are actually hollow (a PhD from an obscure institution or not in the field of the book). The authors typically include a journalist (or writer, or some other professional marketer of narratives) and a person who seems to be qualified in the field of the book. Of course these signs are an imperfect signal, but their usefulness is that they are visible from the cover.
Inside such a book, the authors cherry-pick pieces of science and non-science that support the claim that the book makes, and ignore contradicting evidence, even if that evidence is present in the same research articles that the book cites as supporting it. Most pages promise that soon the book will prove the claims that are made on that page, but somehow the book never gets to the proof. It just presents more unfounded claims.
A book of this class does not define its central concepts or claims precisely, so it can flexibly interpret previous research as supporting its claims. The book does not make precise what would constitute evidence refuting its claim, but sets up “straw-man” counterarguments to its claim and refutes them (mischaracterising the actual counterarguments to make them look ridiculous).
Examples of these books that I have read to some extent before becoming exasperated by their demagoguery: Sex at Dawn, Games People Play.

“What if” is a manipulative question

“What if this bad event happens?” is a question used as a high-pressure sales tactic (for insurance, maintenance, upgrades and various protective measures). People suffering from anxiety or depression also tend to ask that question – a habit called catastrophising. The question generates vague fears and is usually unhelpful for finding reasonable preventive or corrective measures for the bad event. Fearful people tend to jump on anything that looks like it might be a prevention or cure, which sometimes makes the problem worse (e.g. quack remedies for an imagined rare disease can worsen health).
A more useful question is: “What is the probability of this bad event happening?” This question directs attention to statistics and research about the event. Often, the fear-generating event is so unlikely that it is not worth worrying about. Even if it has significant probability, checking the research on it is more likely to lead to solutions than vague rumination along the lines of “what if.” Even if there are no solutions, statistics on the bad event often suggest which circumstances make it more likely, and thus which situations or risk factors to avoid.
These points have been made before, as exemplified by the aphorisms “Prepare for what is likely and you are likely to be prepared” and “Safety is an expensive illusion.”

News is gradually biased by re-reporting

The (science) news cycle occurs when the original source is quoted by another news outlet, which is quoted by another outlet, etc, creating a “telephone game”, a.k.a. “Chinese whispers” familiar from kindergarten. Each re-reporting introduces noise to the previous report, so the end result may differ diametrically from the original story. This news cycle has been identified and mocked before, e.g. by PhD Comics.
The telephone game of news outlets has an additional aspect that I have not seen mentioned, namely that the re-reporting does not add random noise, but noise that biases the previous source deliberately. Each news outlet, blog or other re-poster has a slant and focusses on those aspects of the story that favour its existing viewpoint.
A single outlet usually does not change the story to the complete opposite of the original, because outright lying is easy to detect and would damage the outlet’s reputation. However, many outlets in a sequence can each bias the story a little, until the final report is the opposite of the original. Each outlet’s biasing decision is difficult to detect, because the small bias is hidden in the noise of rephrasing and selectively copying the previous outlet’s story. So each outlet can claim to report unbiased news, if readers do not question why the outlet used second-hand (really n-th hand) sources, not the original article (the first in the sequence). A single manipulator thus has an incentive to create many websites that report each other’s stories in a sequence.
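To make this concrete, here is a minimal simulation sketch of the biased telephone game (the per-step bias and noise figures are made-up numbers, not estimates of real outlets): each re-reporting copies the previous slant, nudges it slightly towards the re-poster’s agenda, and adds rephrasing noise.

```python
import random

def rereport(slant, bias=-0.15, noise_sd=0.1):
    """One outlet re-reports a story: it copies the previous slant,
    nudges it towards its own agenda (bias) and adds rephrasing
    noise. No single step flips the story outright."""
    return slant + bias + random.gauss(0, noise_sd)

random.seed(1)
slant = 1.0  # the original article's stance: clearly positive
for outlet in range(1, 11):
    slant = rereport(slant)
    print(f"outlet {outlet:2d}: slant {slant:+.2f}")
# After roughly seven to ten re-reports the slant turns negative:
# the final story is the opposite of the original, although each
# step's bias was comparable to ordinary rephrasing noise.
```

The point of the sketch is only that a sign flip needs no single large lie, just a consistent direction of small edits.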
The moral of this text is that to get accurate information, read the original source. Whenever you see an interesting news article, work backward along the sequence of reports to see whether the claims are the same as in the first report. The first report is not guaranteed to be true, but at least the biases and honest errors introduced later can be removed this way.

Sugar-free, fat-free and low-salt claims

The three main ingredients of unhealthy food are sugar, salt and fat. The packaging of junk food often has claims of “sugar-free”, “fat-free” or “low-salt” in big colourful letters on the front. The trick is that the absence of one of the three ingredients is compensated for by a larger amount of the other two, as can be checked from the nutrition information label.
Sometimes the claims on the front of the pack directly contradict the nutrition label, so are downright lies. I have seen packaging with the claim “sugar-free” on the front, with sugars listed in significant quantity on the nutrition label. There are some legal sanctions for falsifying the nutrition information label, but almost no restrictions on what can be claimed elsewhere on the pack, so any contradictions should almost always be resolved in favour of the nutrition label.
I have seen a sugar-free claim on a pack on which the ingredient list included brown sugar. This suggests the existence of a legal loophole (brown sugar not equalling sugar somehow) that the manufacturer wanted to use.
If the manufacturer does not want to outright lie, then a trick I have seen is to claim “no added sugar” or “no sugar or artificial sweeteners” on the pack, but add other sweeteners, e.g. sugarcane juice, molasses, high fructose corn syrup. Similarly, “no added salt” can be bypassed by adding salty ingredients, for example dried salted meat or bacon to a snack mix.
Another trick is to create the sugar in the food during the manufacturing process. For example, heating starch for a long time or adding the enzyme amylase breaks the starch into smaller-molecule sugars. So a manufacturer can claim “no added sweeteners” and yet produce sugars in the food by processing the starch in it.
A similar trick for salt is to add sodium and chloride as parts of other ingredients and let them combine into NaCl in the food.

How superstition grows out of science

Priests in Ancient Egypt could predict eclipses and the floods of the Nile by observing the stars and the Moon and recording their previous positions when the events of interest happened. The rest was calculation, nothing magical. Ordinary people saw the priests looking at the stars and predicting events in the future, and thought that the stars magically told priests things and that the prediction ability extended to all future events (births, deaths, outcomes of battles). The priests encouraged this belief, because it gave them more power. This is one way astrology could have developed – by distorting and exaggerating the science of astronomy. Another way is via navigators telling the latitude of a ship using the stars or the sun. People would have thought that if heavenly bodies could tell a navigator his location on the open sea, then why not other secrets?
Engineers in Ancient Rome calculated the strength of bridges and aqueducts, and estimated the amount of material needed for these works. Ordinary people saw the engineers playing with numbers and predicting the amount of stones needed for a house or a fort. Numbers “magically” told engineers about the future, and ordinary people thought this prediction ability extended to all future events. Thus the belief in numerology could have been born.
When certain plants were discovered to have medicinal properties against certain diseases, swindlers imitated doctors by claiming that other natural substances were powerful cures for all manner of diseases. The charlatans and snake oil salesmen distorted and exaggerated medicine.
Doctors diagnosed diseases by physical examination before laboratory tests were invented. Thus a doctor could look at parts of a person’s body, tell what diseases the person had, and predict the symptoms that the person would experience in the future. Exaggerating this, palm readers claimed to predict a person’s future life course by looking at the skin of their palm.
In the 20th century, some medicines were discovered to be equally effective at somewhat lower doses than previously thought. Then homeopathy exaggerated this by claiming that medicines are effective when diluted so much that on average not a single molecule of the drug remains in the water given to the patient.
In all these cases, superstition only adds bias and noise to scientific results. Science does not know everything, but it is a sufficient statistic (https://en.wikipedia.org/wiki/Sufficient_statistic) for superstitious beliefs, in the sense that any true information contained in superstition is also contained in science. Nothing additional can be learned from superstition once the scientific results are known.
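The sufficiency claim can be given a loose formal sketch (this formalization is an illustration of the argument, not a standard theorem): if superstition is just a garbled function of the scientific record plus noise that is unrelated to the underlying truth, then observing the superstition on top of the science changes nothing.

```latex
% Loose sketch of the claim. Let \theta be the truth, T the
% scientific record and S the superstition. Assume S is a noisy
% garbling of T: S = f(T, \varepsilon), with \varepsilon
% independent of \theta given T. Then
\[
  P(\theta \mid T, S) = P(\theta \mid T),
\]
% i.e. the posterior over the truth is unchanged by also observing
% the superstition: any true information in S is already in T.
```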

Scientific thinking coordination game

If most people in a society use the scientific method for decision-making, then telling stories will not persuade them – they will demand evidence. In that case, bullshit artists and storytellers will not have much influence. It is then profitable to learn to provide evidence, which is positively correlated with learning to understand and use evidence. If young people respond to incentives and want to become influential in society (get a high income and social status), then young people will learn and use the scientific method, which reinforces the demand for evidence and reduces the demand for narratives.
If most people are not scientifically minded, but believe stories, then it is profitable to learn to tell stories. The skilled storytellers will be able to manipulate people, thus will gain wealth and power. Young people who want to climb the social and income ladder will then gravitate towards narrative fields of study. They will not learn to understand and use evidence, which reinforces the low demand for evidence.
Both the scientific and the narrative society are self-reinforcing, thus there is a coordination game in which people choose to become evidence-users or storytellers. Note that using the scientific method does not mean being a scientist. Most researchers I have met do not use science in their everyday decisions, but believe the stories they read in the media or hear from their friends. I have met Yale PhDs in STEM fields who held beliefs that most people in the world would agree to be false.
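A minimal sketch of this coordination game follows; the payoff numbers are illustrative assumptions, not estimates of any real society.

```python
# Sketch of the coordination game described above.
# Payoff numbers are illustrative assumptions only.

def payoff(skill: str, society: str) -> float:
    """Payoff to a young person's skill choice, given which kind of
    persuasion the surrounding society rewards."""
    table = {
        ("evidence", "scientific"): 3.0,  # evidence-providers are influential
        ("stories", "scientific"): 1.0,   # storytellers are not believed
        ("evidence", "narrative"): 1.0,   # evidence is ignored
        ("stories", "narrative"): 3.0,    # manipulation pays
    }
    return table[(skill, society)]

for society in ("scientific", "narrative"):
    best = max(("evidence", "stories"), key=lambda s: payoff(s, society))
    print(f"In a {society} society the best response is to learn {best}.")
# Each society type makes itself the best response, so both the
# all-evidence and the all-stories outcomes are self-reinforcing
# equilibria -- the defining feature of a coordination game.
```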
One signal of not thinking scientifically is asking people what the weather is like in some place one has not visited (I don’t mean asking in order to make small talk, but asking to gain information). Weather statistics for most places in the world are available online and are much more accurate than acquaintances’ opinions of the weather. This is because weather statistics are based on a much longer time series and on physically measured temperature, rainfall, wind, etc, not on a person’s guess of these variables.

Remembering the sacrifice

Many times and in many places I have seen a call to remember the sacrifice of the soldiers who died in some past conflict. Often, this call seems an attempt to direct attention away from the misguidedness of the particular conflict or the incompetence and selfish motives of the leadership who decided to enter the war. It tries to make people focus on the noble courage of the soldiers, not the sordid power-hunger of the rulers. The bravery of soldiers is discussed in another post (http://sanderheinsalu.com/ajaveeb/?p=595); here I would like to clarify this sacrifice business.

If the soldiers volunteered under reasonably accurate information about the reasons for the conflict and the chances of success, then indeed they chose to sacrifice themselves for the cause. Then we should remember their sacrifice. If, however, they were conscripted (dictatorships often call this volunteering) using the threat of punishment for them or their family, then they did not make the sacrifice any more than a sacrificial animal sacrifices itself. Others sacrificed the conscripts to further their own ends.

These ends are unlikely to prioritize defeating an evil regime and making the world a better place, although the propaganda claims this was the objective. Mostly the goal of leaders is to preserve and expand their power, whether by defending the country against takeover or conquering additional subjects and wealth. Even if this is acknowledged, current propaganda may point out some good side effect of sacrificing the soldiers, e.g. defeating an old enemy. This is again a distraction attempt. The world is complex and interconnected, so every event, including a mass death, has some beneficial side effects, just like every event has some negative side effects. One should consider the overall consequences of an event, not just one side effect.

If the soldiers genuinely volunteered, but due to being misled by propaganda, then they wanted to sacrifice themselves for one cause, but their leaders sacrificed them for another. Usually volunteers underestimate the length of the war and the probability of dying. Thus even when they know the true goal of the conflict, the sacrifice they are led to is larger than the one they intended to make.

The most clear and direct self-sacrifice is made by suicide bombers. They probably think that their bombing serves a good purpose, but such belief is almost always misguided. Religious indoctrination of the bombers manipulates them into believing in a noble cause, hiding the true goals of the leaders ordering the bombing.

I have not heard many calls to remember the sacrifice of present-day child soldiers. Rather, there are calls to pity and save them. The situation of many soldiers in many wars has been similar to that of children forced to fight – ignorance and fear of punishment. Obeying the conscription order often offers a greater survival probability than refusal.

Instead of remembering the sacrifice of conscripts, we should remember them being sacrificed. Remember with pity. Remember to prevent.

On accepting apologies

There seems to be a social convention that an apology has to be accepted and that someone who does not accept one is unfriendly and a bad person. This seems strange to me, because an apology often tries to undo deeds with words, or cancel unthinking words with considered ones.

The willingness of most people to trade words for deeds seems irrational to me – there is a qualitative difference between words and deeds, in that words can be neutralized within the hearer’s mind. If the hearer or reader does not understand, hear or attach emotional significance to words, then these have no effect. Deeds, on the other hand, have consequences that are not just in people’s heads. A punch causes bruising even if imagined to be a caress. An insult does not cause bad feelings if it is interpreted as a joke by all concerned.

Accepting words in compensation for deeds makes one manipulable. The perpetrator of bad actions can get away with them repeatedly by promising each time to change and to sin no more (Hitler’s “last territorial demand”). The social convention that words have to be accepted as compensation helps the unscrupulous. If instead good works in sufficient quantity were required to make up for misdeeds, then taking advantage of others would be less profitable. Some people would have to spend a lifetime undoing their crimes, which creates the incentive problem of how to make criminals work – perhaps by gradually easing ostracism and restrictions as the debt is worked off. The quantity of good actions required must be large enough to make the overall profit from a bad deed negative.

Cancelling unthinking words with a considered apology benefits impulsive liars who initially insult and then talk their way out of the opprobrium by pretending to be sorry. Every time I find in the media that a politician or a white collar criminal says sorry, I interpret it as them being sorry they were caught. If they were sorry about the deed itself, they wouldn’t have done it in the first place.

A good person who did something bad by accident would volunteer to make amends. They would not have to be forced to it as punishment. Of course, if volunteering to compensate starts being interpreted favourably enough by society, then selfish and manipulative people would also volunteer. Making amends is a costly signal of good intentions, but if the benefit of signalling is large enough, then even the bad types signal to imitate the good.

Which ideology is more likely to be wrong?

Exercise in Bayes’ rule: is an ideology more likely to be wrong if it appeals relatively more to poor people than to the rich?

More manipulable folks are more likely to lose their money, so less likely to be rich. Stupid people have a lower probability of making money. By Bayes, the rich are on average less manipulable and more intelligent than the poor.

Less manipulable people are less likely to find an ideology built on fallacies appealing. By Bayes, an ideology relatively more appealing to the stupid and credulous is more likely to be wrong. Due to such people being poor with a higher probability, an ideology embraced more by the poor than the rich is more likely to be fallacious.

Another exercise: is an ideology more likely to be wrong if academics like it relatively more than non-academics?

Smarter people are more likely to become academics, so by Bayes’ rule, academics are more likely to be smart. Intelligent people have a relatively higher probability of liking a correct ideology, so by Bayes, an ideology appealing to the intelligent is more likely to be correct. An ideology liked by academics is correct with a higher probability.
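Both exercises follow the same Bayes’-rule algebra; here is a worked version of the first, with made-up numbers purely for illustration.

```latex
% Illustrative numbers only. Let W denote "the ideology is wrong"
% and E denote "the ideology appeals more to the poor than to the
% rich". Assume P(W) = 0.5, P(E \mid W) = 0.8, P(E \mid \neg W) = 0.4.
\[
  P(W \mid E)
  = \frac{P(E \mid W)\,P(W)}{P(E \mid W)\,P(W) + P(E \mid \neg W)\,P(\neg W)}
  = \frac{0.8 \times 0.5}{0.8 \times 0.5 + 0.4 \times 0.5}
  = \frac{2}{3} > \frac{1}{2}.
\]
% The second exercise is the same computation with "appeals to
% academics" as the evidence and the likelihoods reversed, which
% pushes the posterior probability of being wrong below one half.
```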

Comparing dictatorships

Why compare evil regimes? Sometimes a choice must be made which one to support. Inaction and refusal to choose is also a choice and may favour one or another. The help or harm to some regime may be indirect, e.g. through the enemy of an enemy.

How to compare evil regimes? I have encountered people who compare based on words, justifying crimes against humanity by some countries with the argument that their goals were good or the ideology was good, just wrongly implemented. (The subtext here is that if it was wrongly implemented in the past, perhaps it should be tried again in the hopes of implementing it rightly.) I disagree. Actions should be the basis of judgement, not narratives. A failure of a political system is a negative signal about it. Regardless of whether it signals a fundamental flaw or a low likelihood of right implementation, until all other systems have been tried and have accumulated a similar weight of negative signals, the failed system should not be tried again. This can be mathematically formalized as optimal sequential control under incomplete information.
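One standard way to make “optimal sequential control under incomplete information” concrete is a Bayesian bandit model; the sketch below is one illustrative formalization with invented numbers, not the only possible one. Each political system is an arm with an unknown failure rate, and Thompson sampling is used as the decision rule.

```python
# Sketch: choosing which political system to try next as a
# Beta-Bernoulli bandit. The model and numbers are illustrative;
# Thompson sampling is one standard decision rule.
import random

posteriors = {"A": [1, 1], "B": [1, 1], "C": [1, 1]}  # Beta(a, b) per system
true_failure_rate = {"A": 0.9, "B": 0.4, "C": 0.3}    # unknown to the decider

random.seed(0)
for trial in range(300):
    # Draw a plausible success rate for each system from its
    # posterior and try the system that currently looks best.
    draws = {s: random.betavariate(a, b) for s, (a, b) in posteriors.items()}
    chosen = max(draws, key=draws.get)
    success = random.random() > true_failure_rate[chosen]
    posteriors[chosen][0 if success else 1] += 1

for s, (a, b) in posteriors.items():
    print(f"system {s}: tried {a + b - 2} times, "
          f"posterior mean success rate {a / (a + b):.2f}")
# A system that fails early (A here) is quickly set aside and not
# retried until the alternatives have accumulated a comparable
# weight of negative evidence -- the policy argued for above.
```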

I believe comparisons of countries should be based on objective criteria, preferably specified before the data is gathered (as in the scientific method). These objective criteria are for example the number of people killed, tortured, wrongly imprisoned, expropriated, the number and extent of wars started, the territory and population conquered and for how long, the economic and environmental damage caused. The number of ethnic or religious groups eliminated may also be counted, but this has the effect of weighting the deaths of people from smaller groups more.

The measures can be total or divided by time or by the number of supporters of the regime. The total of these criteria is generally larger for bigger countries. There is simply more opportunity to kill, torture, etc when there are more people available. The total measures are of interest because they show the whole negative impact on the world.

Division by time results in criteria that measure the flow of evil done. If the decision is which regime to eliminate first, it is optimal to focus on the one with the greatest predicted negative influence per unit of time. This strategy minimizes the total impact of evil regimes.

To find the expected number of crimes of a person from a dictatorship (or its leadership) without other data about them, the total crimes of the regime should be divided by its population (or number of leaders). Dividing by both people and time gives the expected flow of evil per person, suggesting an optimal strategy of removing leaders of criminal regimes.
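A small numeric sketch of how these normalizations can disagree (the regimes and figures are entirely hypothetical):

```python
# Hypothetical figures only, to show how the rankings can differ.
regimes = {
    # name: (total crimes, population in millions, years in power)
    "Big": (1_000_000, 200, 40),
    "Small": (300_000, 10, 20),
}

for name, (crimes, pop_m, years) in regimes.items():
    per_year = crimes / years               # flow of evil per year
    per_person = crimes / (pop_m * 1e6)     # expected crimes per inhabitant
    per_person_year = per_person / years    # flow per person per year
    print(f"{name}: total {crimes:,}, per year {per_year:,.0f}, "
          f"per person {per_person:.4f}, per person-year {per_person_year:.6f}")
# "Big" leads on the total and per-year measures, while "Small"
# leads per person and per person-year, so which regime counts as
# worse depends on which of the questions above is being asked.
```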

The above focusses on past evil, but for predictive purposes the attractiveness or “selling power” of the regime also matters. How likely is the dictatorship to survive and expand? If more people favour and justify it, including outside its borders, it has a greater opportunity to do evil in the future. So the niceness of the narrative used to excuse its actions is actually a negative signal about an otherwise criminal regime. If the stories the dictatorship tells about itself make people consider its goals good or ideology good, then the dictatorship is more dangerous than another that cannot manipulate audiences into supporting it.

Principles help in resisting the siren call of “the end justifies the means” – for example, the principle that nothing justifies crimes against humanity. No story about the greater good, no idealistic ideology. Another good principle is that actions speak louder than words. If a regime fails at good governance, excuses should be ignored.