Robustness is a form of efficiency

Efficiency means using the best way to achieve a goal. Mathematically, it means selecting the maximizer of an objective function. The goal may be anything. For example, the objective function may be a weighted average of performance across various situations.

Robustness means performing well in a wide variety of circumstances. Mathematically, performing well may mean maximizing the weighted average performance across situations, where the weights are the probabilities of those situations. Performing well may also mean maximizing the probability of meeting a minimum standard: this probability sums the probabilities of the situations in which the (situation-specific) minimum standard is reached. In either case, some objective function is being maximized, i.e. the best way to achieve a goal is being found. The goal is either a weighted average performance, the probability of exceeding a minimum standard, or some similar objective. Thus robustness is efficiency for a particular objective.

The robustness-efficiency tradeoff is just a tradeoff between different objective functions. One objective function in this case is a weighted average that puts positive weight on the other objective function.
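The two notions of "performing well" above can be made concrete with a small sketch. All numbers below are made up for illustration: three hypothetical situations, two hypothetical actions, and a threshold that stands in for the minimum standard.

```python
# Sketch (hypothetical numbers): two objective functions over the same
# situations -- maximize the probability-weighted average performance,
# or maximize the probability of meeting a minimum standard.

# Probabilities of three situations: normal times, mild shock, rare crisis.
probs = [0.70, 0.25, 0.05]

# Performance of each candidate action in each situation.
actions = {
    "specialized": [100, 40, 0],   # excellent in normal times, fails in a crisis
    "robust":      [70, 65, 60],   # decent everywhere
}

minimum_standard = 50  # situation-independent threshold, for simplicity

def weighted_average(perf):
    return sum(p * x for p, x in zip(probs, perf))

def prob_meeting_standard(perf):
    # Sums the probabilities of situations where the standard is reached.
    return sum(p for p, x in zip(probs, perf) if x >= minimum_standard)

efficient = max(actions, key=lambda a: weighted_average(actions[a]))
robust = max(actions, key=lambda a: prob_meeting_standard(actions[a]))
print(efficient, robust)  # specialized robust
```

With these numbers the weighted-average objective favors the specialized action (80 vs. 68.25), while the meet-the-standard objective favors the robust one (probability 1.0 vs. 0.70): the same maximization machinery, different objective functions.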

Whatever the goal, working towards it efficiently is by definition the best thing to do. The goal usually changes over time, but most of this change is a slow drift. Reevaluating the probabilities of situations usually changes the goal, in particular if the goal is a weighted average or a sum of probabilities that includes some of these situations. A rare event occurring causes a reevaluation of the probability of this event, and thus necessarily of the probability of at least one other event, because the probabilities must still sum to one. If the probabilities of rare events are revised up, then the goal tends to shift away from single-situation efficiency (performance in a small number of situations) towards robustness (efficiency for a combination of a large number of situations).
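This shift can be illustrated with hypothetical numbers: revising up the probability of a rare crisis (and revising the other probabilities down, so they still sum to one) moves the maximizer of the same weighted-average objective from a specialized action to a robust one.

```python
# Sketch (made-up numbers): revising up a rare event's probability
# shifts the weighted-average goal toward the robust action.

def weighted_average(probs, perf):
    return sum(p * x for p, x in zip(probs, perf))

# Performance in three situations: normal times, mild shock, rare crisis.
actions = {"specialized": [100, 40, 0], "robust": [70, 65, 60]}

before = [0.70, 0.25, 0.05]   # crisis thought very unlikely
after  = [0.55, 0.15, 0.30]   # crisis probability revised up; the others
                              # revised down, since probabilities sum to one

best_before = max(actions, key=lambda a: weighted_average(before, actions[a]))
best_after  = max(actions, key=lambda a: weighted_average(after, actions[a]))
print(best_before, best_after)  # specialized robust
```

The objective function itself never changed form; only the probability weights did, and that alone moved the optimum from single-situation efficiency toward robustness.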

To be better prepared for emergencies and crises, society should prepare efficiently. The most efficient method may be difficult to determine in the short term. If the expected time until the next crisis is long, then the best way includes gathering resources and storing them in a large number of distributed depots. These resources include human capital – the skills for solving emergencies. Such skills are produced by training, stored in people’s brains, and kept fresh with further training. Both the physical and the mental resources are part of a country’s economic production. Economic growth helps create emergency supplies, raise medical capacity and free up time for preparedness training. Unfortunately, economic growth is often wasted on frivolous consumption of goods and services, often to impress others. Resources wasted in this way may reduce preparedness by causing people to go soft physically and mentally.

Solving a crisis requires cooperation. Consumption of social media may polarize a society, reducing collaboration and thus preparedness.

On the optimal burden of proof

All claims should be considered false until proven otherwise, because lies can be invented much faster than they can be refuted. In other words, the maker of a claim bears the burden of providing high-quality scientific proof, for example by referencing previous research on the subject. Strangely enough, some people seem to believe marketing, political spin and conspiracy theories even after such claims have been proven false. One can only wish that everyone bore the consequences of their choices (so that karma works).

Considering all claims false until proven otherwise runs into a logical problem: a claim and its opposite cannot both be false. Priority for the presumption of falsity should be given to actively made claims, e.g. someone saying that a product or a policy works, or that there is a conspiracy behind an accident. Especially suspect are claims that benefit their maker if people believe them. A higher probability of falsity should also be attached to positive claims, e.g. that something has an effect in some direction (as opposed to no effect), or that an event is due to non-obvious causes rather than chance. The lack of an effect should be the null hypothesis. Similarly, ignorance and carelessness, not malice, should be the default explanation for bad events.

Sometimes two opposing claims are both actively made and belief in each benefits its maker, e.g. in politics or when competing products are marketed. This is the hardest case in which to find the truth, but a partial and probabilistic solution is possible. Until rigorous proof is found, one should keep an open mind. Keeping an open mind creates a vulnerability to manipulation: after a claim is proven false, its proponents often try to defend it by asking its opponents to keep an open mind, i.e. to ignore evidence. In such cases, the mind should be closed to the claim until its proponents provide enough counter-evidence for a neutral view to be reasonable again.

To find which of two opposing claims is true, the first test is logic. If a claim is logically inconsistent with itself, then it is false by syntactic reasoning alone. A broader test is whether the claim is consistent with other claims by the same person. For example, Vladimir Putin said that there were no Russian soldiers in Crimea, but a month later gave medals to some Russian soldiers, citing their successful operation in Crimea. At least one of the claims must be false, because either there were Russian soldiers in Crimea or there were not. The way people try to weasel out of such self-contradictions is to say that the two claims referred to different time periods, definitions or circumstances. In other words, they change the interpretation of words. A difficulty for the truth-seeker is that sometimes such a change in interpretation is a legitimate clarification. Tongues do slip. Nonetheless, a contradiction is probabilistic evidence of lying.

The second test for falsity is objective evidence. If there is a street fight and the two sides accuse each other of starting it, then sometimes a security camera video can refute one of the contradictory claims. What counts as objective evidence is, sadly, itself subject to interpretation. Videos can be photoshopped, though this is difficult and time-consuming. The objectivity of evidence is strongly positively correlated with the scientific rigour of its collection process. “Hard” evidence is a signal of the truth, but a probabilistic signal. In this world, most signals are probabilistic.

The third test of falsity is the testimony of neutral observers, preferably several of them, because people misperceive and misremember even with the best intentions. The neutrality of the observers is again up for debate and interpretation. In some cases, the observer is a statistics-gathering organisation. Just like objective evidence, testimony and statistics are probabilistic signals.

The fourth test of falsity is the testimony of interested parties, to which the above caveats apply even more strongly.

Integrating conflicting evidence should use Bayes’ rule, because it keeps probabilities consistent. Consistency helps glean information about one aspect of a question from data on other aspects. Background knowledge should be combined with the evidence, for example by ruling out physical impossibilities. If a camera shows a car disappearing behind a corner and immediately reappearing, moving in the opposite direction, then physics says that the original car could not have changed direction so fast; the reappearing car must be a different one. Knowledge of human interactions and psychology is part of the background information: if smaller, weaker and outnumbered people rarely attack the stronger and more numerous, then this provides probabilistic information about who started a fight. Legal theory incorporates background knowledge of human nature to get information about a crime, since human nature suggests motives. Asking “Who benefits?” has a long history in law.
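The street-fight example can be sketched with Bayes’ rule directly. All numbers here are hypothetical: a low prior from background knowledge (weaker parties rarely attack) is combined with the testimony of a witness of limited reliability.

```python
# Sketch (hypothetical numbers) of integrating evidence with Bayes' rule.
# Hypothesis H: the smaller, weaker party started the fight.

prior = 0.10                  # background knowledge: weaker parties rarely attack
p_testimony_if_true = 0.80    # witness says "the weaker party struck first" when H is true
p_testimony_if_false = 0.20   # witness says so even when H is false (error or bias)

# Bayes' rule: P(H | testimony) = P(testimony | H) * P(H) / P(testimony)
numerator = p_testimony_if_true * prior
evidence = numerator + p_testimony_if_false * (1 - prior)
posterior = numerator / evidence
print(round(posterior, 3))  # 0.308
```

The testimony roughly triples the probability that the weaker party started the fight, yet the posterior stays below one half: the testimony is a probabilistic signal, not proof, exactly as the tests above suggest.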