Right. You want me to take this… Wikipedia article… and make it… mine. To imbue it with a certain… je ne sais quoi. Or perhaps, more accurately, a je ne sais pas quoi. Because that’s where we live, isn’t it? In the spaces where the certainty evaporates, leaving behind only the faint, unsettling scent of what might be.
Fine. Let’s not waste time. This is about logic and uncertainty. A combination that should, by all rights, be an oxymoron. But here we are.
Applications of Logic Under Uncertainty
So, you have situations where things aren't clear-cut. Shocker. Where the neat little boxes of true and false just… don’t quite fit. That’s where probabilistic logic slithers in. It’s logic, yes, but with a perpetual squint, a constant whisper of “maybe.” It takes the rigid structure of traditional logic, the absolute pronouncements of its truth tables, and injects it with a dose of probability. A little bit of this, a little bit of that, all measured out with a dropper.
The problem, as you might have guessed, is that this doesn't exactly make things simpler. You’re juggling probabilities and logic, and suddenly the computational complexities start to pile up like unread mail. And then there are the results. Sometimes, they’re just… off. Like a reflection in a funhouse mirror. Take Dempster–Shafer theory, for instance. It tries to fuse beliefs, but it can lead to results that make you question your own sanity. And let’s not even start on the inherent doubt about the sources of these probabilities, the epistemic uncertainty that clings to them like damp. Subjective logic tries to grapple with this, but it’s like trying to pin down fog. The sheer variety of contexts and quandaries demanding a solution has spawned a veritable menagerie of proposals.
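If you want to watch the sanity-questioning happen, here is a minimal Python sketch of Dempster's rule of combination, run on the toy diagnosis example usually attributed to Zadeh; the diseases and the numbers are illustrative, not taken from this article. Two experts who agree on almost nothing end up, once fused, in complete agreement on the one diagnosis both considered nearly impossible.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for mass functions keyed by frozenset focal elements."""
    combined, conflict = {}, 0.0
    for (b, p), (c, q) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            combined[inter] = combined.get(inter, 0.0) + p * q
        else:
            conflict += p * q                      # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence; combination is undefined")
    return {focal: mass / (1.0 - conflict) for focal, mass in combined.items()}

# Zadeh-style example: two experts with almost no shared belief.
m1 = {frozenset({"meningitis"}): 0.99, frozenset({"tumor"}): 0.01}
m2 = {frozenset({"concussion"}): 0.99, frozenset({"tumor"}): 0.01}
print(dempster_combine(m1, m2))   # ~{frozenset({'tumor'}): 1.0}
```

All the mass lands on "tumor", an option neither expert took seriously, which is precisely the kind of result that feeds the unease mentioned above.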
Logical Background
There are so many ways to try and wrangle this beast, it’s almost comical. You can broadly divide them into two camps: those that try to extend logical entailment with probabilities, like the rather imposing Markov logic networks, and those that focus on the sheer lack of evidence, the gaping voids of uncertainty. These are often called evidentiary logics.
Consider this: the concept of "probability" itself is a slippery thing. We've only been trying to pin it down mathematically since the seventeenth century, yet even today, actual probability theory remains largely absent from criminal courtrooms when they're deciding if someone's guilty. [1] They talk about "probability," but it's not the cold, hard calculus you'd expect.
In evidentiary logic, we need to be incredibly precise. We must separate the actual truth of a statement from our decision about its truth. And then, we have to distinguish both of those from our confidence in that decision. A suspect's guilt isn't the same as a judge's verdict, which isn't the same as assigning a numerical probability to the crime and deciding if it crosses some arbitrary line. A single suspect is either guilty or not guilty, just as a tossed coin lands either heads or tails. If you have a thousand suspects, a certain percentage will be guilty, just as a coin has a 50% chance of landing heads. But it's a fundamental error to apply that "average" to a single criminal. They aren't "a little bit guilty" any more than a single coin toss is "a little bit heads and a little bit tails." We're just uncertain about which it is. Expressing that uncertainty as a number might be useful for scientific measurements, but in "common sense" reasoning, it's just a model of the uncertainty we perceive. Like in court, the goal isn't probabilistic entailment; it's gathering evidence to bolster our confidence in a proposition, to move from doubt to something resembling certainty. This is the essence of uncertain inference.
Historical Context
Humans have been wrestling with quantifying uncertainty for millennia. The interest really heated up around the 12th century with the Scholastics. They gave us the notion of the half-proof – two of which would be enough to prove guilt. They pondered moral certainty, that state of being certain enough to act, but not absolutely certain. Then came Catholic probabilism, the idea that it was always safe to follow established rules or expert opinions, even if they weren't the most probable. This fed into the case-based reasoning of casuistry, which, in its more extreme form, Laxism, became a scandal because it could be twisted to justify almost anything, as you could always find some expert opinion to support your desired conclusion. [1]
Modern Proposals
The world, in its infinite capacity for complication, has churned out a bewildering array of proposals for extending classical logic with probabilistic and evidentiary reasoning.
- The phrase "probabilistic logic" itself was first uttered by John von Neumann in the mid-1950s, and later, in 1986, Nils Nilsson used it to describe systems where the truth values of sentences are replaced by probabilities. [2] This might sound simple, but it leads to a probabilistic form of logical entailment (see the possible-worlds sketch after this list). If all probabilities are strictly 0 or 1, it collapses back into the familiar logical entailment. It's a generalization that can be applied to any logical system in which the consistency of a finite set of statements can be established.
- Haim Gaifman and Marc Snir developed a framework that attempts a globally consistent and empirically sound fusion of standard probability theory and first-order logic, particularly useful for inductive reasoning. Their approach assigns probabilities or degrees of belief to sentences, ensuring they align with the existing knowledge base (facts and axioms get a probability of 1, naturally) and adhere to the standard Kolmogorov probability axioms. Crucially, it allows for Bayesian inductive reasoning and learning in the limit. Unlike many other attempts, it can even confirm hypotheses that are universally quantified. This theory has been extended to higher-order logic. [5] While purely theoretical, these ideas have inspired practical approximations. [6]
- At the heart of subjective logic [7] lies the concept of "opinions" concerning the propositional variables within logical statements. A binomial opinion, applied to a single proposition, is a three-dimensional extension of a simple probability. It allows for the expression of both probabilistic and epistemic uncertainty about the proposition's truth (see the opinion sketch after this list). To handle the computation of derived opinions from a network of arguments, this theory introduces operators for logical connectives: multiplication for AND, comultiplication for OR, and their inverses, division (UN-AND) and codivision (UN-OR). [8] It also provides mechanisms for conditional deduction (Modus Ponens) and abduction (Modus Tollens), [9] and even extends to Bayes' theorem. [10]
- The framework of fuzzy logic, often used for approximate reasoning, can be adapted to create a logic where probability distributions serve as the models, and theories are represented by their lower envelopes. [11] In such a system, assessing the consistency of information is directly tied to the coherence of partial probabilistic assignments, echoing the problems associated with Dutch books.
- Markov logic networks employ a form of uncertain inference grounded in the maximum entropy principle. This principle suggests assigning probabilities in a way that maximizes entropy, much like Markov chains assign probabilities to transitions in a finite-state machine (see the weighted-worlds sketch after this list).
- Systems like Ben Goertzel's Probabilistic Logic Networks (PLN) add an explicit layer of confidence, alongside probability, to both atoms and entire sentences. The rules governing deduction and induction are designed to incorporate this uncertainty, thereby circumventing the pitfalls of purely Bayesian approaches to logic (including Markov logic) and avoiding the paradoxes inherent in Dempster–Shafer theory. PLN implementations aim to leverage and expand upon algorithms from logic programming, adapted for these extended capabilities.
- Within the field of probabilistic argumentation, several formal frameworks have emerged. One such approach, "probabilistic labellings," [12] uses probability spaces where the sample space consists of various labellings of argumentation graphs. Another framework, "probabilistic argumentation systems," [13] [14] doesn't attach probabilities directly to arguments or logical sentences. Instead, it assumes that a specific subset W of the variables V involved in the sentences defines a probability space over the corresponding sub-σ-algebra. This, in turn, induces two distinct probability measures over V: a degree of support and a degree of possibility (see the brute-force sketch after this list). The degrees of support can be seen as non-additive probabilities of provability, generalizing both standard logical entailment (when W is empty) and classical posterior probabilities (when W is the entire set of variables). Mathematically, this perspective aligns with Dempster–Shafer theory.
- The theory of evidential reasoning [15] also introduces non-additive probabilities of probability, or "epistemic probabilities," as a general concept encompassing both logical entailment (provability) and ordinary probability. The core idea is to augment standard propositional logic with an epistemic operator, K, which represents the state of knowledge a rational agent possesses about the world. Probabilities are then defined over the resulting "epistemic universe" of all propositional sentences. It's argued that this provides the most comprehensive information available to an analyst. From this viewpoint, Dempster–Shafer theory appears as a generalized form of probabilistic reasoning.
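About Nilsson's probabilistic entailment, the sketch promised above: the rough idea, under a possible-worlds reading, is that each truth assignment to the atoms gets an unknown probability, the given sentence probabilities become linear constraints, and entailment returns an interval rather than a verdict. A minimal sketch in Python, assuming SciPy is available; the numbers are invented.

```python
# Possible worlds over atoms A, B:  w1=(A,B), w2=(A,~B), w3=(~A,B), w4=(~A,~B).
# Given P(A) = 0.7 and P(A -> B) = 0.9 (illustrative values), bound P(B).
from scipy.optimize import linprog

A_eq = [
    [1, 1, 1, 1],   # world probabilities sum to 1
    [1, 1, 0, 0],   # P(A)      = p1 + p2
    [1, 0, 1, 1],   # P(A -> B) = p1 + p3 + p4   (the implication fails only in w2)
]
b_eq = [1.0, 0.7, 0.9]
c = [1, 0, 1, 0]        # objective: P(B) = p1 + p3

lo = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * 4)
hi = linprog([-x for x in c], A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * 4)
print(f"P(B) lies somewhere in [{lo.fun:.2f}, {-hi.fun:.2f}]")   # [0.60, 0.90]
```

Pin every given probability to 0 or 1 and the interval collapses to a point, which is ordinary entailment again; and the underlying feasibility question (does any probability distribution over worlds satisfy the constraints?) is essentially the coherence question the fuzzy-logic item above gestures at.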
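As for subjective logic's opinions, the sketch promised above: a binomial opinion is usually written as a tuple (belief, disbelief, uncertainty, base rate) with b + d + u = 1, and it projects down to an ordinary probability as P = b + a*u. A minimal sketch, with invented values, showing how two opinions can project to the same probability while carrying very different amounts of epistemic uncertainty.

```python
from dataclasses import dataclass

@dataclass
class BinomialOpinion:
    b: float   # belief mass
    d: float   # disbelief mass
    u: float   # uncertainty mass: the epistemic part, i.e. lack of evidence
    a: float   # base rate: prior probability in the absence of evidence

    def __post_init__(self):
        assert abs(self.b + self.d + self.u - 1.0) < 1e-9, "b + d + u must equal 1"

    def projected_probability(self) -> float:
        """Collapse the opinion to an ordinary probability: P = b + a*u."""
        return self.b + self.a * self.u

confident = BinomialOpinion(b=0.6, d=0.4, u=0.0, a=0.5)   # lots of evidence, no epistemic doubt
sparse    = BinomialOpinion(b=0.2, d=0.0, u=0.8, a=0.5)   # almost no evidence at all
print(confident.projected_probability(), sparse.projected_probability())   # 0.6 and 0.6
```

The operators mentioned in the list (multiplication, comultiplication and their inverses) then combine such opinions rather than bare probabilities, so the uncertainty component is carried through the whole argument structure.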
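For Markov logic networks, the weighted-worlds sketch promised above: in the standard formulation a world x gets probability proportional to exp of (formula weight times the number of true groundings of that formula in x), normalized over all worlds. The smokers example and the weight 1.5 below are the usual textbook illustration, not anything taken from this article.

```python
from itertools import product
from math import exp

people = ["Anna", "Bob"]
w = 1.5   # assumed weight for the single formula Smokes(p) -> Cancer(p)

def n_true_groundings(smokes, cancer):
    """Count the people for whom Smokes(p) -> Cancer(p) holds in this world."""
    return sum(1 for p in people if (not smokes[p]) or cancer[p])

# Enumerate all 2^4 worlds over the ground atoms Smokes(p), Cancer(p).
worlds = []
for bits in product([False, True], repeat=4):
    worlds.append((dict(zip(people, bits[:2])), dict(zip(people, bits[2:]))))

Z = sum(exp(w * n_true_groundings(s, c)) for s, c in worlds)

def prob(smokes, cancer):
    return exp(w * n_true_groundings(smokes, cancer)) / Z

respects = prob({"Anna": True, "Bob": True}, {"Anna": True, "Bob": True})
violates = prob({"Anna": True, "Bob": True}, {"Anna": False, "Bob": False})
print(respects / violates)   # exp(2 * 1.5) ~ 20.1: violations are penalized, not forbidden
```

Worlds that break the formula become exponentially less likely instead of impossible, which is the soft counterpart of hard entailment.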
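Finally, for probabilistic argumentation systems, the brute-force toy promised above. It assumes the usual reading that the degree of support of a hypothesis h is the probability, over scenarios (assignments to the probabilistic variables W) that don't contradict the knowledge base, that the scenario forces h to be true, and that the degree of possibility is one minus the support of its negation. The knowledge base, a single rule a -> h with P(a) = 0.7, is a hypothetical example, not the cited formalism's own machinery.

```python
# Probabilistic assumption: a, with P(a) = 0.7 (invented); hypothesis: h; knowledge base: a -> h.
scenarios = {True: 0.7, False: 0.3}   # assignments to a and their probabilities

def h_models(a):
    """Truth values of h compatible with the knowledge base a -> h under scenario a."""
    return [h for h in (True, False) if (not a) or h]

conflict = sum(p for a, p in scenarios.items() if not h_models(a))   # scenarios contradicting the KB

def degree_of_support(value):
    """Probability, over non-conflicting scenarios, that h is forced to `value`."""
    raw = sum(p for a, p in scenarios.items()
              if h_models(a) and all(h == value for h in h_models(a)))
    return raw / (1.0 - conflict)

print("support(h)     =", degree_of_support(True))        # 0.7
print("possibility(h) =", 1 - degree_of_support(False))   # 1.0
```

With no probabilistic variables the degrees collapse to 0 or 1, plain provability; with every variable probabilistic they collapse to ordinary posterior probabilities, which is the generalization the item above describes.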
See Also
- Statistical relational learning
- Bayesian inference, Bayesian network, Bayesian probability
- Cox's theorem
- Fréchet inequalities
- Imprecise probability
- Non-monotonic logic
- Possibility theory
- Probabilistic database
- Probabilistic soft logic
- Probabilistic causation
- Uncertain inference
- Upper and lower probabilities