#EECS#Python#Intro

Imperative Knowledge and Computation

The framework of propositional logic restricts our focus to declarative knowledge: statements of fact that strictly evaluate to \top or \bot. However, mathematical problem-solving and modern programming require imperative knowledge, which consists of definitive sequences of instructions for deducing new information.

At its fundamental level, a computer performs exactly two operations: it executes calculations, and it stores the results of those calculations. Computational thinking leverages the immense speed and memory of modern machines to execute imperative knowledge autonomously.

Definition 7 (Algorithm)

An algorithm is a finite sequence of “rigorously” defined instructions that, given an initial state and a set of inputs, proceeds through well-defined successive states to eventually produce an output and terminate.

Consider Heron of Alexandria’s method for computing the square root of a real number x, where x > 0.

  1. Initialise an arbitrary guess g > 0.
  2. If g^2 is sufficiently close to x, terminate and return g as the solution.
  3. Otherwise, assign a new value to the guess via the assignment operator introduced previously:
g := \frac{1}{2}\left(g + \frac{x}{g}\right)
  4. Return to step 2.

This procedure illustrates the core architecture of computation: a sequence of operations governed by control flow. Control flow dictates execution order based on logical tests (which evaluate strictly to \top or \bot), allowing for the dynamic repetition or bypassing of instructions.
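The steps above can be sketched directly in Python (a minimal sketch; the input x = 25 and the tolerance 0.01 are illustrative assumptions, anticipating the Python constructs introduced later in this lesson):

```python
# Heron's method, following steps 1-4 above (illustrative values)
x = 25.0   # the input: a real number x > 0
g = 1.0    # step 1: an arbitrary initial guess g > 0
while abs(g ** 2 - x) >= 0.01:   # step 2: is g**2 sufficiently close to x?
    g = (g + x / g) / 2          # step 3: reassign the guess
print(g)   # step 2 eventually terminates the loop; g approximates 5.0
```

Each pass through the loop body realises the jump back to step 2 dictated by the control flow.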

Flowchart for the square root algorithm
Problem 3

Let x = 25 and initial guess g = 1. Using Heron’s method, compute the first three iterations of g. Define “sufficiently close” as |g^2 - x| < 0.01. Does the algorithm terminate within these three iterations?

Computability and Turing Completeness

Early computing machines were fixed-program computers, physically wired to solve exactly one specific mathematical problem. The advent of the stored-program computer fundamentally altered this paradigm. By storing both the sequence of instructions (the program) and the data it manipulates within the same memory space, a central interpreter can execute any legal set of instructions. A program counter directs the interpreter sequentially, deviating only when control flow dictates a jump. Because the result of a computation can itself be a new sequence of instructions, a modern computer is capable of programming itself.

To formalise the limits of what can be computed within this architecture, we rely on the theoretical model of the Universal Turing Machine (created in 1936 by Alan Turing).

The Church-Turing Thesis. If a mathematical function is computable, there exists a Turing Machine that can be programmed to compute it.

A programming language, such as Python, is termed Turing complete if it possesses the capacity to simulate a Universal Turing Machine. All standard modern programming languages share this property, rendering them fundamentally equivalent in raw computational power; this equivalence means only that any algorithm expressible in one language can, in principle, be expressed in another.

We noted previously that propositional logic is algorithmically decidable. General computation, however, loses this property. It is mathematically impossible to construct a universal algorithm that takes an arbitrary program as input and determines, in finite time, whether it will eventually halt or run indefinitely. This inherent limitation is known as the Halting Problem.

Syntax and Semantics

Just as mathematical logic is governed by syntax (the valid composition of symbols) and semantics (the truth value of that composition), all programming languages strictly enforce these rules to eliminate ambiguity. When implementing algorithms, statements must pass three levels of structural verification:

  1. Primitive Constructs: The foundational building blocks of the language, including numeric literals (such as 3.2), text strings, and operators (such as + or \times).
  2. Syntax: The grammatical rules dictating how primitives may be combined to form well-formed expressions. For example, the sequence 3.2 + 3.2 is syntactically valid, whereas 3.2 3.2 is not. Syntax errors are immediately caught by the interpreter prior to execution.
  3. Static Semantics: The rules determining whether a syntactically valid string possesses a logical meaning. For example, attempting to add a number to a text string may be syntactically structured correctly (Operand Operator Operand) but is statically invalid.

If a program is syntactically correct and free of static semantic errors, it possesses exact semantics. Unlike human language, every valid Python program has exactly one deterministic meaning.

If an algorithm behaves unexpectedly, it is not a failure of the language’s semantics, but a failure of the programmer’s logic. Such logical errors manifest in three ways: the program may crash during execution, it may enter an infinite loop, or (most dangerously) it may run to completion and produce an incorrect output. Consequently, rigorous mathematical verification of algorithms is required to ensure that logical deviations are self-evident.

Problem 4

For each of the following, state whether it results in a syntax error, a static semantic error, or produces unintended behavior (a logical error). Briefly justify.

  1. x := 5 + * 3
  2. x := "hello" + 7
  3. An implementation of Heron’s method that returns g when g^2 > x instead of when |g^2 - x| is sufficiently small.

Instantiating Logic: Objects and Types

We now operationalise these concepts using Python. A Python program executes imperative knowledge via a sequence of definitions and commands, which are processed and evaluated sequentially by an interpreter.

A command, formally referred to as a statement, instructs the interpreter to perform a specific action. For instance, the print function outputs data to the execution environment. It accepts a variable number of arguments, which are by default separated by spaces when rendered. The output format can be controlled via keyword arguments, such as specifying end="" to suppress the default newline character.

# Sequential evaluation of commands
print("Algorithm")
print("terminated.")

# Supplying multiple arguments and controlling the end character
print("Algorithm", "terminated", end=".\n")

Computational Entities and Types

In mathematics, the validity of an operation depends entirely on the kind of entity it acts upon: addition is defined for numbers, conjunction for propositions, and so on. The computational analogue of such an entity is an object. Every object in Python possesses a specific type, which definitively dictates the operations legally permitted upon that object.

This strict typing system is the mechanism by which the interpreter enforces the static semantics discussed previously. Types bifurcate into two fundamental categories: scalar and non-scalar. Scalar objects are structurally indivisible, acting as the atomic elements of the language, whereas non-scalar objects (such as text strings) possess internal structure.

Python implements four primitive scalar types:

  1. int: Represents the integers. Literals are denoted standardly (e.g., 5, -12).
  2. float: Approximates the real numbers. Literals must include a decimal point (e.g., 3.0, -28.72) or use scientific notation. Because computational memory is finite, continuous reals are represented using discrete floating-point structures. This architectural necessity introduces slight deviations from pure real arithmetic, a fundamental limitation of discrete computation that we will address later.
  3. bool: The executable manifestation of the truth values established in Definition 2. It is inhabited by exactly two values: True (corresponding to \top) and False (corresponding to \bot).
  4. NoneType: A type inhabited by the single null value None, used to denote an empty state or the absence of a returned object.

Expressions and Relational Operators

When objects and operators are validly combined according to the language’s syntax, they form an expression. Just as a mathematical term can be simplified, an expression inherently evaluates to a singular object of a defined type. The built-in function type() permits the dynamic inspection of an object’s classification.

type(5) # Evaluates to: int

type(5.0) # Evaluates to: float

The construction of algorithms relies heavily on control flow, which in turn requires evaluating atomic propositions (Definition 3) to strictly \top or \bot. We achieve this computationally using relational operators.

Crucially, the syntax for assignment and logical equivalence diverge. In our earlier mathematical notation, we distinguished assignment (:=) from equality (=). Python syntax maps mathematical assignment to the single equals sign =. To test whether two expressions evaluate to identical values, we must use the logical equivalence operator ==, while non-equivalence is denoted by !=.

The evaluation of a relational operator always yields a bool. Thus, an expression such as 3 != 2 is a fully formed atomic proposition within the machine’s logic.

# Evaluating an arithmetic expression
3.0 + 2.0 # Evaluates to the float: 5.0

# Evaluating a relational proposition
3 != 2 # Evaluates to the bool: True
Problem 5

Consider the Python expressions type(4 == 4) and type(4.0). Determine the output of both. Explain how the distinction between = and == prevents static semantic errors when constructing the logical tests required for the control flow of algorithms, such as step 2 of Heron’s method.

Arithmetic Operators and Precedence

Computation requires the precise articulation of operations upon objects. The arithmetic operators applicable to int and float types closely mirror standard mathematical notation, yet they introduce critical distinctions necessitated by the discrete nature of machine architecture.

The standard arithmetic operators include addition (+), subtraction (-), and multiplication (*). Exponentiation is denoted by **.

A fundamental structural divergence from pure mathematics arises with division. Python implements two distinct division operators, together with a remainder operator:

  1. Standard Division (/): Invariably evaluates to a float, providing an approximation of division within R\mathbb{R}. For instance, 5 / 3 evaluates to 1.6666666666666667.
  2. Floor Division (//): Performs integer division, discarding the fractional remainder to return the largest integer less than or equal to the quotient. Thus, 5 // 3 evaluates to 1, and -4 // 3 evaluates to -2.
  3. Modulo (%): Evaluates to the remainder of the integer division. For example, 5 % 3 evaluates to 2. The modulo operator is foundational to discrete mathematics and will form the computational basis for our later exploration of modular arithmetic and Number Theory.
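These distinctions can be confirmed interactively; the values below restate the examples from the list:

```python
print(5 / 3)    # 1.6666666666666667 — standard division always yields a float
print(5 // 3)   # 1 — floor division discards the fractional remainder
print(-4 // 3)  # -2 — the floor, not truncation toward zero
print(5 % 3)    # 2 — the remainder of the integer division
```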
Problem 6

Verify that (a%b) is equivalent to (a - (a//b)*b).

When constructing compound expressions, the interpreter strictly adheres to an order of operator precedence. Exponentiation (**) binds most tightly, followed by multiplication and division operators (*, /, //, %), and finally addition and subtraction (+, -).

print(2 + 3 * 4)  # prints 14, not 20
print(5 + 4 % 3)  # prints  6, not 0 (% has same precedence as *, /, and //)
print(2 ** 3 * 4) # prints 32, not 4096 (** has higher precedence than *, /, //, and %)

Furthermore, operations of equal precedence are governed by associativity. While most operators associate strictly left-to-right, exponentiation associates right-to-left.

print(5-4-3)   # prints -2, not 4 (- associates left-to-right)

# Right-to-left associativity of exponentiation
4 ** 3 ** 2 # Evaluates to 4 ** 9 (262144), not 64 ** 2 (4096)

# Parentheses rigorously dictate evaluation order
(4 ** 3) ** 2 # Evaluates to 4096

Propositional Logic in Computation

The logical connectives established in our formal syntax—negation (¬\neg), conjunction (\land), and disjunction (\lor)—are implemented natively in Python via the reserved keywords not, and, and or. These primitive operators act strictly upon objects of type bool and strictly enforce the truth tables defined previously.

Because they are reserved keywords, they are an intrinsic part of the language’s syntax and cannot be redefined or used as variable names. This guarantees that logical tests within algorithms remain deterministic and free from semantic ambiguity.

# Evaluates a formal conjunction
(3 != 2) and (5 % 3 == 0) # Evaluates to: False (since 5 % 3 == 2, the second proposition is False)

Python Operator Summary

Category      Operators
Arithmetic    +, -, *, /, //, **, %, unary -, unary +
Relational    <, <=, >=, >, ==, !=
Assignment    =, +=, -=, *=, /=, //=, **=, %=, <<=, >>=
Logical       and, or, not

For now, we are not covering the bitwise operators (<<, >>, &, |, ^, ~, &=, |=, ^=). Since <<= and >>= depend on bitwise shifts, we will also defer using them in this lesson.

Variable Binding and State

In mathematics, an equation such as y = x^2 establishes a permanent, structural equivalence between the left and right sides. If x varies, y intrinsically varies with it.

In imperative programming, this is emphatically not the case. The assignment statement = dictates a temporal binding. A variable in Python is simply a human-readable identifier bound to a specific computational object in memory.

Consider the following sequence of instructions calculating the area of a circle:

pi = 3.14159
radius = 11.0
area = pi * (radius ** 2)
radius = 14.0

The third statement evaluates the expression pi * (radius ** 2) down to a singular float object (380.13239), and binds the identifier area to that object. When the fourth statement re-binds radius to 14.0, it has absolutely no effect on the value to which area is bound. The state of area remains fixed until it is explicitly reassigned.
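A direct check of this behaviour (a sketch reproducing the block above with a final print):

```python
pi = 3.14159
radius = 11.0
area = pi * (radius ** 2)   # area is bound to a specific float object
radius = 14.0               # rebinding radius...
print(area)                 # ...leaves area unchanged, approximately 380.132
```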

This strict separation of temporal states is what allows algorithms, such as Heron’s method, to iteratively update a guess (g) without destroying the fundamental properties of the initial input (x).

Code Semantics and Readability

Because programs dictate complex logical flow, their mathematical correctness must be verifiable by human readers. A syntactically valid program can still be semantically nonsensical if its identifiers obscure its logic. Naming identifiers clearly and using the # symbol to append non-executable comments are vital practices for maintaining mathematical clarity.

# Correct mathematical semantics but poor variable naming
a = 3.14159
b = 11.2
c = a * (b ** 2) 

# Identical execution, but logically verifiable
pi = 3.14159
diameter = 11.2
area = pi * ((diameter / 2.0) ** 2)

By reading the second block, a mathematician immediately identifies a logical error in the first block (failing to halve the diameter before squaring it).

Multiple Assignment

Python provides a mechanism for binding multiple variables simultaneously. The assignment sequence evaluates every expression on the right-hand side of the = operator completely before altering any of the bindings on the left-hand side.

This atomic evaluation enables elegant manipulations of algorithmic state, such as swapping the values of two variables without requiring an intermediate, temporary variable:

x, y = 2, 3

# The expressions on the right are evaluated to objects (3, 2)
# before the names x and y are rebound.
x, y = y, x 

print("x is", x) # Outputs: x is 3
print("y is", y) # Outputs: y is 2
Problem 7

Consider the multiple assignment a, b = 1, 1 followed by the iterative step a, b = b, a + b. If this step is executed three times in succession, what are the final bound values of a and b? Relate this sequence to a well-known mathematical sequence.

The Structure of Proof Systems

We have established a language of propositions, connectives, and truth tables. We have also built a computational toolkit capable of evaluating these constructs mechanically. The natural question is: what can we actually do with all of this?

Truth tables answer individual questions. Given a compound proposition, we can enumerate every possible assignment and read off the result. But this brute enumeration scales poorly. A formula with n atomic variables demands 2^n rows. More fundamentally, a truth table tells us that something is true without revealing why it is true, or how its truth connects to anything else. Mathematics demands more than verification; it demands structure.

A proof system provides exactly this structure. It consists of two classes of objects: statements (finite strings of symbols expressing propositions) and proofs (finite strings of symbols justifying those statements). The relationship between them is governed by two functions:

  1. Semantics (Truth): A rule determining whether a given statement is true or false. In our previous work, truth tables served this role.
  2. Verification (Syntax): A mechanical procedure that decides whether a specific string constitutes a valid proof for a given statement. This verification must be efficiently computable; a proof whose validity cannot be checked in reasonable time is operationally useless.

These two functions give rise to two critical properties. A proof system is sound if no false statement possesses a valid proof: the verification procedure never accepts a justification for an untruth. A proof system is complete if every true statement possesses a valid proof: nothing true escapes the reach of the system.

Example 3 (Primality)

Consider statements of the form “The integer n is composite.”

  • Semantics: The statement is true if n has a divisor d such that 1 < d < n.
  • Proof: A valid proof could simply be the divisor d itself.
  • Verification: Divide n by d and check whether the remainder is zero.

The verification step is computationally cheap, even when finding the divisor d is extraordinarily difficult. This asymmetry between the difficulty of discovery and the ease of verification is a recurring theme in both mathematics and computer science.

# Verification of compositeness: given n and a candidate divisor d
n = 91
d = 7
# The verification is a single arithmetic check
n % d == 0 and 1 < d < n # Evaluates to: True (91 = 7 * 13, so 91 is composite)

We now build a proof system for propositional logic itself.

Satisfiability

Before constructing proofs, we must classify propositions by the range of truth values they can assume. Not all compound propositions behave alike: some are always true, some are always false, and some depend on the assignment.

Definition 8 (Tautology)

A compound proposition is a tautology if it evaluates to \top under every possible assignment of truth values to its atomic variables.

Definition 9 (Contradiction)

A compound proposition is a contradiction if it evaluates to \bot under every possible assignment of truth values to its atomic variables.

Definition 10 (Contingency)

A compound proposition is a contingency if it is neither a tautology nor a contradiction; its truth value depends on the specific assignment.

A proposition is satisfiable if there exists at least one assignment under which it evaluates to \top. Every tautology and every contingency is satisfiable. A contradiction is unsatisfiable.

Example 4

The proposition p \lor \neg p is a tautology (by the Principle of Bivalence, one of p or \neg p must hold). The proposition p \land \neg p is a contradiction. The bare variable p is a contingency.
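These classifications can be confirmed by brute enumeration over the single atomic variable p (a minimal sketch using Python’s bool):

```python
assignments = (True, False)
print(all(p or not p for p in assignments))   # True: a tautology holds under every assignment
print(any(p and not p for p in assignments))  # False: a contradiction has no satisfying assignment
print(any(assignments) and not all(assignments))  # True: the bare p is a contingency
```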

To establish that a proposition is satisfiable, a single witness assignment suffices. To establish that it is unsatisfiable, one must check every assignment, which is precisely the expensive enumeration that motivates the algebraic approach developed below.

# Witness for satisfiability: one assignment suffices
# Is (p or not q) and (q or not r) and (r or not p) satisfiable?
p, q, r = True, True, True
(p or not q) and (q or not r) and (r or not p) # Evaluates to: True — one witness is enough
Problem 8

Determine the satisfiability of (p \lor q \lor r) \land (\neg p \lor \neg q \lor \neg r). If satisfiable, provide a witness assignment. If unsatisfiable, justify your answer.

Axioms and Equivalence Proofs

We now introduce the axiomatic foundation that allows us to reason about propositional equivalences without enumerating truth tables. The axioms of classical propositional logic specify the algebraic behaviour of the connectives \land, \lor, and ¬\neg. Each axiom is a logical equivalence, asserting that two expressions are interchangeable in all contexts. The first five axiom pairs define a structure known as a Boolean algebra.

\begin{array}{lll} \textbf{Axiom} & \textbf{Conjunctive Form} & \textbf{Disjunctive Form} \\ \hline \text{Identity} & \top \land p \equiv p & \bot \lor p \equiv p \\ \text{Complement} & \neg p \land p \equiv \bot & \neg p \lor p \equiv \top \\ \text{Commutativity} & p \land q \equiv q \land p & p \lor q \equiv q \lor p \\ \text{Associativity} & p \land (q \land r) \equiv (p \land q) \land r & p \lor (q \lor r) \equiv (p \lor q) \lor r \\ \text{Distributivity} & p \land (q \lor r) \equiv (p \land q) \lor (p \land r) & p \lor (q \land r) \equiv (p \lor q) \land (p \lor r) \end{array}

Two additional axioms reduce the conditional connectives to the core trio \neg, \land, \lor:

\begin{aligned} \textbf{Conditional:} \quad & p \to q \equiv \neg p \lor q \\ \textbf{Biconditional:} \quad & p \leftrightarrow q \equiv (p \to q) \land (q \to p) \end{aligned}

The Conditional axiom deserves special attention. It asserts that every implication can be rewritten using only negation and disjunction. This means the implication symbol \to is, strictly speaking, a syntactic convenience rather than a primitive operation. Its presence in the “antecedent position” introduces a hidden negation, which is the source of much confusion when manipulating implications algebraically.

From these axioms, we derive theorems. Recall Definition 4: a proof begins from known truths and proceeds via valid logical steps. An equivalence proof is a chain of equivalences, each justified by an axiom, a definition, or a previously established theorem:

A \equiv A_1, \quad A_1 \equiv A_2, \quad \ldots, \quad A_n \equiv B \quad \Longrightarrow \quad A \equiv B.

This method is vastly more efficient than truth tables for complex formulas, though it requires algebraic ingenuity rather than mechanical enumeration.

Fundamental Theorems

Theorem 1 (Uniqueness of Complements)

For any propositions p and q, if p \land q \equiv \bot and p \lor q \equiv \top, then \neg p \equiv q.

Proof

Let p and q be arbitrary propositions satisfying p \land q \equiv \bot and p \lor q \equiv \top. We show that both \neg p and q reduce to the same expression. First:

\begin{aligned} \neg p &\equiv \top \land \neg p && \text{by Identity} \\ &\equiv (\neg p) \land \top && \text{by Commutativity} \\ &\equiv (\neg p) \land (p \lor q) && \text{by assumption, } p \lor q \equiv \top \\ &\equiv (\neg p \land p) \lor (\neg p \land q) && \text{by Distributivity} \\ &\equiv \bot \lor (\neg p \land q) && \text{by Complement} \\ &\equiv \neg p \land q && \text{by Identity} \end{aligned}

Similarly:

\begin{aligned} q &\equiv \top \land q && \text{by Identity} \\ &\equiv q \land \top && \text{by Commutativity} \\ &\equiv q \land (p \lor \neg p) && \text{by Complement} \\ &\equiv (q \land p) \lor (q \land \neg p) && \text{by Distributivity} \\ &\equiv (p \land q) \lor (\neg p \land q) && \text{by Commutativity} \\ &\equiv \bot \lor (\neg p \land q) && \text{by assumption, } p \land q \equiv \bot \\ &\equiv \neg p \land q && \text{by Identity} \end{aligned}

Both \neg p and q equal \neg p \land q, hence \neg p \equiv q.

This theorem is the engine behind nearly every derivation that follows. It tells us that if something behaves like a complement (annihilates under \land, completes under \lor), then it is the complement. We now put it to work.
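Theorem 1 can also be sanity-checked over the truth values themselves; this enumeration is a sketch, not a substitute for the algebraic proof:

```python
# Whenever an assignment satisfies p ∧ q ≡ ⊥ and p ∨ q ≡ ⊤, q must agree with ¬p
for p in (True, False):
    for q in (True, False):
        if (p and q) == False and (p or q) == True:
            assert q == (not p)
print("Uniqueness of Complements holds on all assignments")
```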

Corollary 1

\top \equiv \neg \bot and \bot \equiv \neg \top.

Proof

By Commutativity and Identity, \bot \land \top \equiv \top \land \bot \equiv \bot. By Identity, \bot \lor \top \equiv \top. The premises of Theorem 1 are satisfied with p := \bot and q := \top, yielding \top \equiv \neg \bot. The second statement follows identically with p := \top, q := \bot.

Corollary 2 (Negative Equivalence)

For any propositions p and q, if p \equiv q, then \neg p \equiv \neg q.

Proof

Let p \equiv q. Then:

q \land \neg p \equiv p \land \neg p \equiv \bot \quad \text{(by assumption, then Complement)}
q \lor \neg p \equiv p \lor \neg p \equiv \top \quad \text{(by assumption, then Complement)}

By Theorem 1 (with q playing the role of p and \neg p playing the role of q), we conclude \neg q \equiv \neg p.

Corollary 3

For any propositions p, q, r, s such that p \equiv q and r \equiv s:

  1. p \land r \equiv q \land s
  2. p \lor r \equiv q \lor s
  3. p \to r \equiv q \to s
  4. p \leftrightarrow r \equiv q \leftrightarrow s
Remark

The proofs are straightforward applications of substitution and are omitted.

With Theorem 1 and its corollaries established, we can now derive the classical theorems of propositional logic.

Theorem 2 (Double Negation)

For any proposition p, p \equiv \neg \neg p.

Proof

By Complement (and Commutativity): \neg p \land p \equiv p \land \neg p \equiv \bot and \neg p \lor p \equiv p \lor \neg p \equiv \top. The premises of Theorem 1 are met with \neg p in the role of p and p in the role of q. Therefore p \equiv \neg(\neg p).

# Double Negation: quick computational check
p = True
p == (not (not p)) # Evaluates to: True

p = False
p == (not (not p)) # Evaluates to: True
Theorem 3 (Idempotence)

For any proposition p, p \land p \equiv p and p \lor p \equiv p.

Proof

For the conjunctive case:

\begin{aligned} p \land p &\equiv (p \land p) \lor \bot && \text{by Identity} \\ &\equiv (p \land p) \lor (p \land \neg p) && \text{by Complement} \\ &\equiv p \land (p \lor \neg p) && \text{by Distributivity} \\ &\equiv p \land \top && \text{by Complement} \\ &\equiv p && \text{by Identity} \end{aligned}

The disjunctive case follows analogously:

\begin{aligned} p \lor p &\equiv (p \lor p) \land \top && \text{by Identity} \\ &\equiv (p \lor p) \land (p \lor \neg p) && \text{by Complement} \\ &\equiv p \lor (p \land \neg p) && \text{by Distributivity} \\ &\equiv p \lor \bot && \text{by Complement} \\ &\equiv p && \text{by Identity} \end{aligned}
Theorem 4 (Domination)

For any proposition p, \top \lor p \equiv \top and \bot \land p \equiv \bot.

Proof

For the disjunctive fragment:

\begin{aligned} \top \lor p &\equiv (\neg p \lor p) \lor p && \text{by Complement} \\ &\equiv \neg p \lor (p \lor p) && \text{by Associativity} \\ &\equiv \neg p \lor p && \text{by Idempotence (Theorem 3)} \\ &\equiv \top && \text{by Complement} \end{aligned}

The conjunctive fragment:

\begin{aligned} \bot \land p &\equiv (\neg p \land p) \land p && \text{by Complement} \\ &\equiv \neg p \land (p \land p) && \text{by Associativity} \\ &\equiv \neg p \land p && \text{by Idempotence (Theorem 3)} \\ &\equiv \bot && \text{by Complement} \end{aligned}
Theorem 5 (Absorption)

For any propositions p and q, p \lor (p \land q) \equiv p and p \land (p \lor q) \equiv p.

Proof

For the disjunctive case:

\begin{aligned} p \lor (p \land q) &\equiv (p \land \top) \lor (p \land q) && \text{by Identity} \\ &\equiv p \land (\top \lor q) && \text{by Distributivity} \\ &\equiv p \land \top && \text{by Domination (Theorem 4)} \\ &\equiv p && \text{by Identity} \end{aligned}

The conjunctive case:

\begin{aligned} p \land (p \lor q) &\equiv (p \lor \bot) \land (p \lor q) && \text{by Identity} \\ &\equiv p \lor (\bot \land q) && \text{by Distributivity} \\ &\equiv p \lor \bot && \text{by Domination (Theorem 4)} \\ &\equiv p && \text{by Identity} \end{aligned}
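As with Double Negation, Absorption admits a quick computational check by enumerating all assignments (a small sketch):

```python
for p in (True, False):
    for q in (True, False):
        assert (p or (p and q)) == p   # disjunctive absorption
        assert (p and (p or q)) == p   # conjunctive absorption
print("Absorption verified on all assignments")
```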
Theorem 6 (De Morgan's Laws)

For any propositions p and q, \neg(p \land q) \equiv \neg p \lor \neg q and \neg(p \lor q) \equiv \neg p \land \neg q.

Proof

We prove \neg(p \land q) \equiv \neg p \lor \neg q by applying Theorem 1. We must show (p \land q) \land (\neg p \lor \neg q) \equiv \bot and (p \land q) \lor (\neg p \lor \neg q) \equiv \top.

The conjunctive branch:

\begin{aligned} (p \land q) \land (\neg p \lor \neg q) &\equiv p \land (q \land (\neg p \lor \neg q)) && \text{by Associativity} \\ &\equiv p \land ((q \land \neg p) \lor (q \land \neg q)) && \text{by Distributivity} \\ &\equiv p \land ((q \land \neg p) \lor \bot) && \text{by Complement} \\ &\equiv p \land (q \land \neg p) && \text{by Identity} \\ &\equiv p \land (\neg p \land q) && \text{by Commutativity} \\ &\equiv (p \land \neg p) \land q && \text{by Associativity} \\ &\equiv \bot \land q && \text{by Complement} \\ &\equiv \bot && \text{by Domination (Theorem 4)} \end{aligned}

The disjunctive branch:

\begin{aligned} (p \land q) \lor (\neg p \lor \neg q) &\equiv ((p \land q) \lor \neg p) \lor \neg q && \text{by Associativity} \\ &\equiv (\neg p \lor (p \land q)) \lor \neg q && \text{by Commutativity} \\ &\equiv ((\neg p \lor p) \land (\neg p \lor q)) \lor \neg q && \text{by Distributivity} \\ &\equiv (\top \land (\neg p \lor q)) \lor \neg q && \text{by Complement} \\ &\equiv (\neg p \lor q) \lor \neg q && \text{by Identity} \\ &\equiv \neg p \lor (q \lor \neg q) && \text{by Associativity} \\ &\equiv \neg p \lor \top && \text{by Complement} \\ &\equiv \top && \text{by Domination (Theorem 4)} \end{aligned}

By Theorem 1, \neg(p \land q) \equiv \neg p \lor \neg q. The proof of \neg(p \lor q) \equiv \neg p \land \neg q is analogous.

# De Morgan's Laws: computational verification
# (parentheses around "not ..." are needed: == binds more tightly than not)
p, q = True, False
(not (p and q)) == (not p or not q) # Evaluates to: True

p, q = False, True
(not (p or q)) == (not p and not q) # Evaluates to: True
Remark

The symbol \blacksquare at the end of a proof is a modern substitute for the traditional Q.E.D., an initialism for the Latin phrase quod erat demonstrandum, meaning “what was to be shown.”

Problem 9

Prove by equivalence proof that \neg(p \lor q) \equiv \neg p \land \neg q (the second De Morgan’s Law). Hint. Apply Theorem 1 as in the proof of Theorem 6.

Equivalences Involving Conditionals

The Conditional axiom (p \to q \equiv \neg p \lor q) and the Biconditional axiom allow us to derive every conditional equivalence by reducing implications to \neg, \land, \lor and applying the theorems above:

pq¬pqpq¬q¬p(Contrapositive)pq¬pqpq¬(p¬q)¬(pq)p¬q\begin{aligned} p \to q &\equiv \neg p \lor q \\ p \to q &\equiv \neg q \to \neg p \quad \text{(Contrapositive)} \\ p \lor q &\equiv \neg p \to q \\ p \land q &\equiv \neg(p \to \neg q) \\ \neg(p \to q) &\equiv p \land \neg q \end{aligned} (pq)(pr)p(qr)(pr)(qr)(pq)r(pq)(pr)p(qr)(pr)(qr)(pq)r\begin{aligned} (p \to q) \land (p \to r) &\equiv p \to (q \land r) \\ (p \to r) \land (q \to r) &\equiv (p \lor q) \to r \\ (p \to q) \lor (p \to r) &\equiv p \to (q \lor r) \\ (p \to r) \lor (q \to r) &\equiv (p \land q) \to r \end{aligned} pq(pq)(qp)pq¬p¬qpq(pq)(¬p¬q)¬(pq)p¬q\begin{aligned} p \leftrightarrow q &\equiv (p \to q) \land (q \to p) \\ p \leftrightarrow q &\equiv \neg p \leftrightarrow \neg q \\ p \leftrightarrow q &\equiv (p \land q) \lor (\neg p \land \neg q) \\ \neg(p \leftrightarrow q) &\equiv p \leftrightarrow \neg q \end{aligned}

Notice the pattern in the implication equivalences: when implications share the same premise, the conclusions combine with the same connective (\land or \lor). When they share the same conclusion, the premises combine with the opposite connective. This “flip” from \land to \lor (and vice versa) is a direct consequence of the hidden negation in the Conditional axiom: the antecedent of an implication sits behind a ¬\neg, so De Morgan’s Laws invert the connective when premises are merged.
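These identities submit to the same computational spot-checking used earlier. A minimal sketch, with `implies` as our own shorthand for the Conditional axiom (not a built-in):

```python
# Spot-check the conditional equivalences over every assignment
from itertools import product

def implies(a, b):
    # By the Conditional axiom, a -> b is (not a) or b
    return (not a) or b

for p, q, r in product([True, False], repeat=3):
    assert implies(p, q) == implies(not q, not p)                   # contrapositive
    assert (p or q) == implies(not p, q)
    assert (p and q) == (not implies(p, not q))
    assert (implies(p, q) and implies(p, r)) == implies(p, q and r) # same premise
    assert (implies(p, r) or implies(q, r)) == implies(p and q, r)  # same conclusion
```

All five assertions pass silently for each of the eight assignments.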

Contrapositive, Converse, and Inverse

From an implication pqp \to q, three related conditionals arise:

  • The contrapositive ¬q¬p\neg q \to \neg p is logically equivalent to pqp \to q. This equivalence (pq¬q¬pp \to q \equiv \neg q \to \neg p) is one of the most powerful proof techniques in mathematics.
  • The converse qpq \to p is not equivalent to pqp \to q; we established this nonequivalence via the counterexample accompanying Definition 6.
  • The inverse ¬p¬q\neg p \to \neg q is the contrapositive of the converse, hence equivalent to qpq \to p but not to pqp \to q.

The converse and inverse are equivalent to each other. Confusing an implication with its converse is a pervasive logical fallacy.
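A single assignment is enough to separate an implication from its converse; in the style of the earlier snippets:

```python
# p -> q holds but its converse q -> p fails at p = False, q = True
p, q = False, True
print((not p) or q) # True:  p -> q
print((not q) or p) # False: q -> p
```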

Equivalence Proofs in Practice

We now work through several examples that demonstrate the method and, simultaneously, show how Python can serve as a rapid sanity check.

Example 5

Show that ¬(p(¬pq))¬p¬q\neg(p \lor (\neg p \land q)) \equiv \neg p \land \neg q.

¬(p(¬pq))¬p¬(¬pq)by De Morgan (Theorem 6)¬p(p¬q)by De Morgan (Theorem 6) and Double Negation (Theorem 2)(¬pp)(¬p¬q)by Distributivity(¬p¬q)by Complement¬p¬qby Identity\begin{aligned} \neg(p \lor (\neg p \land q)) &\equiv \neg p \land \neg(\neg p \land q) && \text{by De Morgan (Theorem 6)} \\ &\equiv \neg p \land (p \lor \neg q) && \text{by De Morgan (Theorem 6) and Double Negation (Theorem 2)} \\ &\equiv (\neg p \land p) \lor (\neg p \land \neg q) && \text{by Distributivity} \\ &\equiv \bot \lor (\neg p \land \neg q) && \text{by Complement} \\ &\equiv \neg p \land \neg q && \text{by Identity} \end{aligned}
# Quick validation across all four assignments
from itertools import product

for p, q in product([True, False], repeat=2):
    print((not (p or (not p and q))) == (not p and not q)) # True
Example 6

Show that (pq)(pq)(p \land q) \to (p \lor q) is a tautology.

(pq)(pq)¬(pq)(pq)by Conditional axiom(¬p¬q)(pq)by De Morgan (Theorem 6)(¬pp)(¬qq)by Associativity and Commutativityby Complementby Domination (Theorem 4)\begin{aligned} (p \land q) \to (p \lor q) &\equiv \neg(p \land q) \lor (p \lor q) && \text{by Conditional axiom} \\ &\equiv (\neg p \lor \neg q) \lor (p \lor q) && \text{by De Morgan (Theorem 6)} \\ &\equiv (\neg p \lor p) \lor (\neg q \lor q) && \text{by Associativity and Commutativity} \\ &\equiv \top \lor \top && \text{by Complement} \\ &\equiv \top && \text{by Domination (Theorem 4)} \end{aligned}

The formula reduces to \top with no surviving variables, confirming it is a tautology regardless of assignment.

Example 7

Show that (pr)(qr)(pq)r(p \to r) \land (q \to r) \equiv (p \lor q) \to r.

(pr)(qr)(¬pr)(¬qr)by Conditional axiom(¬p¬q)rby Distributivity¬(pq)rby De Morgan (Theorem 6)(pq)rby Conditional axiom\begin{aligned} (p \to r) \land (q \to r) &\equiv (\neg p \lor r) \land (\neg q \lor r) && \text{by Conditional axiom} \\ &\equiv (\neg p \land \neg q) \lor r && \text{by Distributivity} \\ &\equiv \neg(p \lor q) \lor r && \text{by De Morgan (Theorem 6)} \\ &\equiv (p \lor q) \to r && \text{by Conditional axiom} \end{aligned}

This result captures a natural proof strategy: to show that a disjunction implies something, it suffices to show that each disjunct separately implies it.
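As with the previous examples, a quick check over all eight assignments confirms the result (the `implies` helper is our own shorthand, not a built-in):

```python
# (p -> r) and (q -> r) agrees with (p or q) -> r everywhere
from itertools import product

def implies(a, b):
    return (not a) or b

for p, q, r in product([True, False], repeat=3):
    print((implies(p, r) and implies(q, r)) == implies(p or q, r)) # True
```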

Example 8 (Negation of an Implication)

The negation of “If I think, then I am” (pqp \to q) is:

¬(pq)¬(¬pq)p¬q\neg(p \to q) \equiv \neg(\neg p \lor q) \equiv p \land \neg q

by the Conditional axiom and De Morgan (Theorem 6). So the negation is “I think and I am not.” Negating an implication does not produce another implication; it produces a conjunction where the premise holds and the conclusion fails.

Remark

This is the only scenario under which an implication is false, consistent with the truth table established earlier.

# The only way an implication fails
p, q = True, False
(p and not q) == (not (not p or q)) # Evaluates to: True
Example 9 (XOR Distributivity)

Is \oplus distributive over \land and \lor?

The claim p(qr)(pq)(pr)p \land (q \oplus r) \equiv (p \land q) \oplus (p \land r) holds. Expanding both sides via the definition ab(a¬b)(¬ab)a \oplus b \equiv (a \land \neg b) \lor (\neg a \land b):

Left side:

p(qr)p((q¬r)(¬qr))(pq¬r)(p¬qr)p \land (q \oplus r) \equiv p \land ((q \land \neg r) \lor (\neg q \land r)) \equiv (p \land q \land \neg r) \lor (p \land \neg q \land r)

Right side:

(pq)(pr)((pq)¬(pr))(¬(pq)(pr))((pq)(¬p¬r))((¬p¬q)(pr))(pq¬p)(pq¬r)(¬ppr)(¬qpr)(pq¬r)(p¬qr)(pq¬r)(p¬qr)\begin{aligned} (p \land q) \oplus (p \land r) &\equiv ((p \land q) \land \neg(p \land r)) \lor (\neg(p \land q) \land (p \land r)) \\ &\equiv ((p \land q) \land (\neg p \lor \neg r)) \lor ((\neg p \lor \neg q) \land (p \land r)) \\ &\equiv (p \land q \land \neg p) \lor (p \land q \land \neg r) \lor (\neg p \land p \land r) \lor (\neg q \land p \land r) \\ &\equiv \bot \lor (p \land q \land \neg r) \lor \bot \lor (p \land \neg q \land r) \\ &\equiv (p \land q \land \neg r) \lor (p \land \neg q \land r) \end{aligned}

Both sides reduce to the same expression. However, p(qr)(pq)(pr)p \lor (q \oplus r) \equiv (p \lor q) \oplus (p \lor r) fails.

A single counterexample suffices: set pp \equiv \top, qq \equiv \top, rr \equiv \top.

()\top \lor (\top \oplus \top) \equiv \top \lor \bot \equiv \top()()(\top \lor \top) \oplus (\top \lor \top) \equiv \top \oplus \top \equiv \bot
# Counterexample for disjunctive distributivity of XOR
p, q, r = True, True, True
lhs = p or (q != r)  # XOR via != on bools
rhs = (p or q) != (p or r)
lhs == rhs # Evaluates to: False — the equivalence fails
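By contrast, the conjunctive distributivity proved above survives an exhaustive check of all eight assignments (a sketch in the same spirit):

```python
# AND distributes over XOR: holds for every assignment
from itertools import product

for p, q, r in product([True, False], repeat=3):
    print((p and (q != r)) == ((p and q) != (p and r))) # True
```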
Problem 10

Using equivalence proofs (not truth tables), show that pq¬q¬pp \to q \equiv \neg q \to \neg p. This establishes the contrapositive equivalence from first principles.

Problem 11

Show, using an equivalence proof, that (pq)(pr)p(qr)(p \to q) \lor (p \to r) \equiv p \to (q \lor r). Identify precisely the step at which the two implications merge into a single one, and which axiom is responsible.

Problem 12

The negation of “Candidates must have a degree in mathematics or computer science” is not “Candidates must have a degree in mathematics and computer science.” Using the propositions pp = “The candidate has a degree in mathematics” and qq = “The candidate has a degree in computer science”, express the original statement formally as an implication and compute its negation. Identify the De Morgan’s Law involved.

Normal Forms

We finish this session with normal forms.

The equivalence proofs developed above demonstrate that any compound proposition can be transformed into an equivalent expression using only the core connectives ¬\neg, \land, and \lor. But this raises a question: given two arbitrary propositions, how do we determine whether they are equivalent without the algebraic ingenuity required for an equivalence proof? Truth tables answer this mechanically, but they scale exponentially.

A normal form solves this by imposing a fixed structural template on propositional formulas. If two propositions are equivalent, their canonical normal forms (for instance, the full DNF read directly off the truth table) are identical up to the order of terms. This converts the semantic question “do these formulas always agree?” into the syntactic question “do these strings match?” The practical consequences extend well beyond theorem-proving. Normal forms underpin circuit design (where logical gates physically instantiate \land, \lor, ¬\neg), automated verification systems, and, as we shall see, some of the open questions in computational complexity.

Before defining the two principal normal forms, we need precise terminology for their building blocks.

Definition 11 (Literal)

A literal is a propositional variable or its negation. If pp is a propositional variable, then both pp and ¬p\neg p are literals.

Definition 12 (Clause)

A clause is a disjunction or conjunction of literals. A disjunctive clause (or simply a clause in the context of CNF) is a disjunction of literals. A conjunctive clause (or term) is a conjunction of literals.

Disjunctive Normal Form

Definition 13 (Disjunctive Normal Form)

A propositional formula is in Disjunctive Normal Form (DNF) if it consists of a disjunction of one or more terms, where each term is a conjunction of literals. That is, DNF has the shape:

(l1,1l1,2)(l2,1l2,2)(l_{1,1} \land l_{1,2} \land \cdots) \lor (l_{2,1} \land l_{2,2} \land \cdots) \lor \cdots

where each li,jl_{i,j} is a literal.

The structure is: OR of ANDs. Each conjunctive term describes one specific scenario under which the formula holds; the overall disjunction asserts that at least one of these scenarios is realised.

Example 10

The following formulas are in DNF:

  1. (p¬q)(¬pq)(p \land \neg q) \lor (\neg p \land q). Two terms, each a conjunction of two literals.
  2. p(¬pq)p \lor (\neg p \land q). A single literal is a degenerate term (a conjunction of one literal).
  3. (p¬q¬r)(¬pqr)(p \land \neg q \land \neg r) \lor (\neg p \land q \land r).

The following formulas are not in DNF:

  1. (p¬q)¬(¬pq)(p \land \neg q) \lor \neg(\neg p \land q). The negation ¬(¬pq)\neg(\neg p \land q) is not a literal; it is a negated compound expression that must first be expanded via De Morgan’s Laws (Theorem 6).
  2. (pq)(¬pq)(p \lor q) \land (\neg p \land q). This is a conjunction of a disjunctive clause with a term: the outer connective is \land, not \lor.
  3. ¬(¬pq)\neg(\neg p \lor q). A negated disjunction, not a disjunction of conjunctions.

Constructing DNF from Truth Tables

Every compound proposition can be mechanically converted to DNF via its truth table. The procedure is:

  1. Construct the truth table for the proposition.
  2. Identify every row where the proposition evaluates to \top.
  3. For each such row, form a conjunctive term: include the variable pp if it is assigned \top in that row, or ¬p\neg p if it is assigned \bot.
  4. Take the disjunction of all such terms.

The resulting formula is in DNF by construction. Each term encodes exactly one satisfying assignment, and the disjunction collects them all.
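The four-step procedure can be mechanised directly. A minimal sketch: `dnf_from_table` below is our own hypothetical helper (not a library function), using `&`, `|`, `~` as ASCII stand-ins for \land, \lor, ¬\neg:

```python
# Sketch: read a DNF off a truth table, one term per satisfying row
from itertools import product

def dnf_from_table(f, names):
    terms = []
    for values in product([True, False], repeat=len(names)):
        if f(*values):
            # include the variable if assigned True, its negation if False
            lits = [n if v else "~" + n for n, v in zip(names, values)]
            terms.append("(" + " & ".join(lits) + ")")
    return " | ".join(terms)

# XOR: true exactly when p and q disagree
print(dnf_from_table(lambda p, q: p != q, ["p", "q"]))
# (p & ~q) | (~p & q)
```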

Example 11

Find the DNF of (pq)¬r(p \lor q) \to \neg r.

We construct the truth table:

pqr(pq)¬r\begin{array}{ccc|c} p & q & r & (p \lor q) \to \neg r \\ \hline \top & \top & \top & \bot \\ \top & \top & \bot & \top \\ \top & \bot & \top & \bot \\ \top & \bot & \bot & \top \\ \bot & \top & \top & \bot \\ \bot & \top & \bot & \top \\ \bot & \bot & \top & \top \\ \bot & \bot & \bot & \top \end{array}

Rows 2, 4, 6, 7, 8 evaluate to \top. Reading off each row:

(pq¬r)(p¬q¬r)(¬pq¬r)(¬p¬qr)(¬p¬q¬r)(p \land q \land \neg r) \lor (p \land \neg q \land \neg r) \lor (\neg p \land q \land \neg r) \lor (\neg p \land \neg q \land r) \lor (\neg p \land \neg q \land \neg r)

This is the full DNF. It can be simplified using the equivalence (pq)(p¬q)p(p \land q) \lor (p \land \neg q) \equiv p (which follows from Distributivity and Complement). Group the first two terms:

(pq¬r)(p¬q¬r)p¬r(p \land q \land \neg r) \lor (p \land \neg q \land \neg r) \equiv p \land \neg r

For the last three terms, observe that we may use Idempotence (Theorem 3) to duplicate the fifth term (¬p¬q¬r\neg p \land \neg q \land \neg r) without altering the formula. This allows two independent groupings:

(¬pq¬r)(¬p¬q¬r)¬p¬r(\neg p \land q \land \neg r) \lor (\neg p \land \neg q \land \neg r) \equiv \neg p \land \neg r(¬p¬qr)(¬p¬q¬r)¬p¬q(\neg p \land \neg q \land r) \lor (\neg p \land \neg q \land \neg r) \equiv \neg p \land \neg q

Combining:

(p¬r)(¬p¬r)(¬p¬q)¬r(¬p¬q)(p \land \neg r) \lor (\neg p \land \neg r) \lor (\neg p \land \neg q) \equiv \neg r \lor (\neg p \land \neg q)

Applying the Conditional axiom and De Morgan (Theorem 6) in reverse: ¬r(¬p¬q)¬r¬(pq)(pq)¬r\neg r \lor (\neg p \land \neg q) \equiv \neg r \lor \neg(p \lor q) \equiv (p \lor q) \to \neg r. The simplified DNF recovers the original formula, confirming equivalence.

# Verify DNF equivalence across all 8 assignments
from itertools import product

for p, q, r in product([True, False], repeat=3):
    original = (not (p or q)) or (not r)
    dnf = ((p and q and not r) or (p and not q and not r)
           or (not p and q and not r) or (not p and not q and r)
           or (not p and not q and not r))
    simplified = (not r) or (not p and not q)
    print(original == dnf == simplified) # True

Conjunctive Normal Form

Definition 14 (Conjunctive Normal Form)

A propositional formula is in Conjunctive Normal Form (CNF) if it consists of a conjunction of one or more clauses, where each clause is a disjunction of literals. That is, CNF has the shape:

(l1,1l1,2)(l2,1l2,2)(l_{1,1} \lor l_{1,2} \lor \cdots) \land (l_{2,1} \lor l_{2,2} \lor \cdots) \land \cdots

where each li,jl_{i,j} is a literal.

The structure is: AND of ORs, the dual of DNF. Each disjunctive clause represents a constraint that must be satisfied; the conjunction demands that every constraint holds simultaneously.

Example 12

The following formulas are in CNF:

  1. (pq)(¬pq)(p \lor q) \land (\neg p \lor q).
  2. (pq¬r)(¬p¬q¬r)(p \lor q \lor \neg r) \land (\neg p \lor \neg q \lor \neg r).

The following formulas are not in CNF:

  1. (p¬q)(¬pq)(p \land \neg q) \lor (\neg p \land q). The outer connective is \lor, and the operands are conjunctions, not disjunctions. This is DNF, not CNF.
  2. ¬(¬pq)\neg(\neg p \lor q). A negated disjunction, not a conjunction of disjunctive clauses.

Constructing CNF

There are two systematic methods for obtaining CNF.

Method 1: Algebraic manipulation. Eliminate all implications using the Conditional and Biconditional axioms. Push negations inward using De Morgan’s Laws (Theorem 6) and Double Negation (Theorem 2). Then distribute \lor over \land using the equivalence p(qr)(pq)(pr)p \lor (q \land r) \equiv (p \lor q) \land (p \lor r) (the disjunctive form of Distributivity) until every clause is a disjunction of literals.

Method 2: Truth table (via negation of DNF). Observe that by Double Negation (Theorem 2), any proposition ϕ\phi satisfies ϕ¬(¬ϕ)\phi \equiv \neg(\neg \phi). Construct the DNF of ¬ϕ\neg \phi by reading off the rows where ϕ\phi evaluates to \bot (equivalently, where ¬ϕ\neg \phi evaluates to \top). Then negate the resulting DNF. By De Morgan’s Laws (Theorem 6), negating a disjunction of conjunctions produces a conjunction of disjunctions: precisely CNF.
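Method 2 can be sketched in the same style as the DNF construction: collect the falsifying rows and, per De Morgan, flip every literal. Again, `cnf_from_table` is our own hypothetical helper, not a library call:

```python
# Sketch: CNF via the negated-DNF method, one clause per falsifying row
from itertools import product

def cnf_from_table(f, names):
    clauses = []
    for values in product([True, False], repeat=len(names)):
        if not f(*values):
            # negating the row's conjunction flips each literal (De Morgan)
            lits = ["~" + n if v else n for n, v in zip(names, values)]
            clauses.append("(" + " | ".join(lits) + ")")
    return " & ".join(clauses)

print(cnf_from_table(lambda p, q: p and q, ["p", "q"]))
# (~p | q) & (p | ~q) & (p | q)
```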

Example 13

Find the CNF of ¬(pq)(rp)\neg(p \to q) \lor (r \to p).

Method 1 (Algebraic):

¬(pq)(rp)¬(¬pq)(¬rp)by Conditional axiom(p¬q)(¬rp)by De Morgan (Theorem 6) and Double Negation (Theorem 2)(p¬rp)(¬q¬rp)by Distributivity(p¬r)(p¬q¬r)by Idempotence (Theorem 3)\begin{aligned} \neg(p \to q) \lor (r \to p) &\equiv \neg(\neg p \lor q) \lor (\neg r \lor p) && \text{by Conditional axiom} \\ &\equiv (p \land \neg q) \lor (\neg r \lor p) && \text{by De Morgan (Theorem 6) and Double Negation (Theorem 2)} \\ &\equiv (p \lor \neg r \lor p) \land (\neg q \lor \neg r \lor p) && \text{by Distributivity} \\ &\equiv (p \lor \neg r) \land (p \lor \neg q \lor \neg r) && \text{by Idempotence (Theorem 3)} \end{aligned}

Method 2 (Truth table):

pqr¬(pq)(rp)\begin{array}{ccc|c} p & q & r & \neg(p \to q) \lor (r \to p) \\ \hline \top & \top & \top & \top \\ \top & \top & \bot & \top \\ \top & \bot & \top & \top \\ \top & \bot & \bot & \top \\ \bot & \top & \top & \bot \\ \bot & \top & \bot & \top \\ \bot & \bot & \top & \bot \\ \bot & \bot & \bot & \top \end{array}

The formula evaluates to \bot in rows 5 and 7. Form the DNF of the negation by reading these rows:

(¬pqr)(¬p¬qr)(\neg p \land q \land r) \lor (\neg p \land \neg q \land r)

Simplify: (¬pqr)(¬p¬qr)¬pr(\neg p \land q \land r) \lor (\neg p \land \neg q \land r) \equiv \neg p \land r. Now negate:

¬(¬pr)p¬r\neg(\neg p \land r) \equiv p \lor \neg r

by De Morgan (Theorem 6) and Double Negation (Theorem 2). This is already in CNF.

Remark

Method 2 produced (p¬r)(p \lor \neg r), while Method 1 produced (p¬r)(p¬q¬r)(p \lor \neg r) \land (p \lor \neg q \lor \neg r). These are equivalent: the second clause in Method 1’s result is absorbed by the first via Absorption (Theorem 5), since any assignment satisfying (p¬r)(p \lor \neg r) automatically satisfies (p¬q¬r)(p \lor \neg q \lor \neg r).

# Verify both CNF forms match the original across all 8 assignments
from itertools import product

for p, q, r in product([True, False], repeat=3):
    original = (not (not p or q)) or (not r or p)
    cnf1 = (p or not r) and (p or not q or not r)
    cnf2 = p or not r
    print(original == cnf1 == cnf2) # True

The Cost of Canonicality

Normal forms guarantee a canonical structure, but this guarantee comes at a price. Converting between normal forms can cause an exponential blowup in the number of clauses.

Consider a formula already in DNF with nn terms:

(p1q1)(p2q2)(pnqn)(p_1 \land q_1) \lor (p_2 \land q_2) \lor \cdots \lor (p_n \land q_n)

To convert this to CNF, we must distribute \lor over \land repeatedly. Each application of Distributivity doubles the number of clauses. Starting with (p1q1)(p2q2)(p_1 \land q_1) \lor (p_2 \land q_2):

(p1q1)(p2q2)(p1p2)(p1q2)(q1p2)(q1q2)(p_1 \land q_1) \lor (p_2 \land q_2) \equiv (p_1 \lor p_2) \land (p_1 \lor q_2) \land (q_1 \lor p_2) \land (q_1 \lor q_2)

Two terms in DNF produce four clauses in CNF. Adjoining a third term (p3q3)(p_3 \land q_3) via disjunction and distributing again doubles to eight clauses. After nn terms, the CNF may contain up to 2n2^n clauses. The symmetric explosion occurs when converting CNF to DNF.
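The blowup is easy to exhibit: the fully distributed CNF takes one literal from each DNF term, so an nn-term DNF of two-literal terms yields 2n2^n clauses. A sketch (the labels `p1`, `q1`, … are illustrative):

```python
# Each CNF clause picks one literal from every DNF term: 2**n clauses
from itertools import product

def distribute(terms):
    # one clause per way of choosing a literal from each term
    return [" | ".join(choice) for choice in product(*terms)]

terms = [("p1", "q1"), ("p2", "q2"), ("p3", "q3")]
print(len(distribute(terms))) # 8, i.e. 2**3
```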

This exponential blowup is an intrinsic property of the normal forms themselves. It is also the reason why the satisfiability of CNF formulas (the SAT problem) occupies a central position in complexity theory: determining whether a CNF formula has a satisfying assignment is the canonical NP-complete problem, meaning that no known algorithm solves it efficiently in all cases.

Problem 13

Find the DNF of p(qr)p \to (q \land r) using the truth table method. Simplify the result.

Problem 14

Convert (pq)(¬pr)(p \lor q) \land (\neg p \lor r) into DNF using algebraic manipulation (Distributivity, not truth tables). Verify your result computationally.

Problem 15

Find the CNF of (pq)(¬pr)(p \land q) \lor (\neg p \land r) using both methods (algebraic and truth table). Confirm that both methods yield equivalent results.

Application: Knights and Knaves

Normal forms and propositional reasoning converge naturally in logic puzzles, where the challenge is to deduce facts from constrained declarations. Consider the following scenario.

On an island, every inhabitant is either a knight (who always tells the truth) or a knave (who always lies). You encounter two inhabitants, AA and BB.

  • AA says: “B is a knight.”
  • BB says: “The two of us are of opposite types.”

Let pp denote “AA is a knight” and qq denote “BB is a knight.” We translate the scenario into propositional logic.

If AA is a knight (pp is \top), then AA’s statement is true, so qq is \top. If AA is a knave (pp is \bot), then AA’s statement is false, so qq is \bot. In both cases, AA’s declaration encodes pqp \leftrightarrow q.

BB’s statement asserts that AA and BB are of opposite types, which means exactly one of p,qp, q is \top: this is pqp \oplus q. If BB is a knight (qq is \top), then pqp \oplus q must be \top. If BB is a knave (qq is \bot), then pqp \oplus q must be \bot. So BB’s declaration encodes q(pq)q \leftrightarrow (p \oplus q).

The full constraint is:

(pq)(q(pq))(p \leftrightarrow q) \land (q \leftrightarrow (p \oplus q))

Case 1: Assume pp is \top. From pqp \leftrightarrow q, we get qq is \top. Then pqp \oplus q \equiv \top \oplus \top \equiv \bot. But q(pq)q \leftrightarrow (p \oplus q) \equiv \top \leftrightarrow \bot \equiv \bot. The conjunction evaluates to \bot. Contradiction.

Case 2: Assume pp is \bot. From pqp \leftrightarrow q, we get qq is \bot. Then pqp \oplus q \equiv \bot \oplus \bot \equiv \bot. And q(pq)q \leftrightarrow (p \oplus q) \equiv \bot \leftrightarrow \bot \equiv \top. The conjunction evaluates to \top.

Therefore both AA and BB are knaves.

# Test all four assignments of (p, q)
p, q = True, True
print((p == q) and (q == (p != q))) # False

p, q = True, False
print((p == q) and (q == (p != q))) # False

p, q = False, True
print((p == q) and (q == (p != q))) # False

p, q = False, False
print((p == q) and (q == (p != q))) # True — both knaves
Problem 16

On the same island, you meet three inhabitants AA, BB, and CC. AA says: “All of us are knaves.” BB says: “Exactly one of us is a knight.” Using propositional variables pp, qq, rr for AA, BB, CC respectively, determine the types of all three inhabitants. Verify your solution computationally.