# Automata

## Lecture 1

• will study computability theory – whether a problem can be solved by a computer (ch 4, 5, 6)
• not studying complexity theory (if a problem can be solved, how much time/space it will take to solve) (ch 7, 8, 9, 10, CSCE 423, 424)
• Requires some preliminaries – ch 1-3
1. finite automata
2. push-down automata
3. Turing machines
• Ch 0 will be a review

## Lecture 2: Chapter 0

• review of
• sequence
• set
• string
• language

### Sequence

• a list of objects in some order (may be infinite)
• objects are referred to as elements or members
• parens and commas for notation
• two sequences are equal iff they have the same number of elements, the same elements, and the elements are in the same order
• $$A = (a_1, a_2, \ldots, a_n)$$
• $$B = (b_1, b_2, \ldots, b_m)$$
• equal when $$n = m$$ and $$a_1 = b_1$$, $$a_2 = b_2$$, $$\ldots$$, $$a_n = b_n$$
• Order matters
• repetition matters
• $$|A|$$ is the number of elements in A

### Sets

• A group of unordered objects (also called elements or members)
• braces are used, an empty set is the null set
• Two sets are equal iff they contain the same group of distinct elements – $$A = B \leftrightarrow \forall x (x \in A \leftrightarrow x \in B)$$
• Size again denoted w/ vertical bars, but is number of distinct elements in A
• A is a subset of B if every element of A is also an element of B
• $$P(A)$$ – the powerset of A – all subsets of A
• Cross product of A and B – $$A \times B = \{(a, b) | a \in A \land b \in B \}$$

### Strings

• A finite sequence of symbols over an alphabet
• An alphabet is a set of symbols
• empty string is denoted with $$\epsilon$$
• Concatenation $$xy$$, $$x = 01$$, $$y = 10$$, $$xy = 0110$$, the empty string is ignored
• Repetition $$x^i$$ – $$x$$ concatenated $$i$$ times, $$x=01$$, $$x^2 = 0101$$, $$x^0 = \epsilon$$
• Reverse $$x^R$$ – $$x = 01$$, $$x^R = 10$$, $$x = 001, x^R = 100$$, $$x = \epsilon, x^R = \epsilon$$
• $$x$$ is a palindrome if $$x = x^R$$
• dictionary order – sort by first symbol, then second for those with the same first, and so forth
• shortlex order – sort by length, and then in each length, sort by dictionary order
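
Both orderings can be sketched in Python; the word list over $$\{0, 1\}$$ is an illustrative assumption, not from the notes:

```python
from itertools import product

# All binary strings of length 0..2, in no particular order.
words = ["".join(p) for n in range(3) for p in product("01", repeat=n)]

# Dictionary order: plain string comparison, symbol by symbol.
dictionary = sorted(words)

# Shortlex order: compare by length first, then by dictionary order.
shortlex = sorted(words, key=lambda w: (len(w), w))

print(dictionary)  # ['', '0', '00', '01', '1', '10', '11']
print(shortlex)    # ['', '0', '1', '00', '01', '10', '11']
```

Note how `00` precedes `1` in dictionary order but follows it in shortlex.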

### Language

• A language is a set of strings over an alphabet
• empty language denoted by empty set
• size denoted similarly

## Lecture 3: Chapter 1

• Regular Languages
• deterministic FA
• non-deterministic FA
• regular expressions
• what exactly is a regular language – how does it connect to problems that are solvable by the three models
• models used by many search/parse/editing tools
• testing and verification/model checking
• modelling OS/network protocols

### Deterministic Finite Automata

• state transition diagram – inputs are edges from a state (a node) to a state (node)
• a finite automaton – a state diagram, plus a start state and one or more final states
• start state and final states together define the set of strings we’re interested in
• start state indicated by arrow pointing from nowhere
• final/accept states are indicated by a double circle
• languages recognised by finite automaton
• for a finite automaton $$M$$ and input $$x$$, if $$M$$ stops at a final state after reading $$x$$, we say $$M$$ accepts $$x$$, otherwise, $$M$$ rejects $$x$$
• the language $$L(M)$$ recognised by a finite automaton $$M$$ contains all the strings accepted by $$M$$ (all strings that can go from start to end)
• DFA is 5-tuple $$(Q, \Sigma, \delta, q_0, F)$$
$$Q$$
set of states
$$\Sigma$$
set of input symbols
$$\delta$$
transition function $$\delta : Q \times \Sigma \to Q$$
$$q_0$$
initial state
$$F$$
set of final/accepting states
• State transition function can be described as diagram
• As a function
• as a table – the first row is the start state, starred rows are final states
• swapping final states and non-final states, you get the complement of the original
• when all states are accepting, it accepts all strings
• when no states are accepting, it accepts nothing
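
The 5-tuple can be exercised directly; the even-number-of-1s machine below is an assumed example, not one from the lecture:

```python
def run_dfa(Q, Sigma, delta, q0, F, x):
    """Return True iff the DFA (Q, Sigma, delta, q0, F) accepts string x."""
    state = q0
    for sym in x:
        assert sym in Sigma           # input must be over the DFA's alphabet
        state = delta[(state, sym)]   # deterministic: exactly one next state
    return state in F                 # accept iff we stop in a final state

# Assumed example: binary strings with an even number of 1s.
Q = {"even", "odd"}
Sigma = {"0", "1"}
delta = {("even", "0"): "even", ("even", "1"): "odd",
         ("odd", "0"): "odd", ("odd", "1"): "even"}

print(run_dfa(Q, Sigma, delta, "even", {"even"}, "1101"))  # → False (three 1s)
```

Swapping the final-state set to `{"odd"}` accepts exactly the complement, matching the note above.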

## Lecture 4: Chapter 1, Continued

### Design of DFA

• If a page contains a keyword – the page is a string, and the keyword a substring
• contains unl, uno, or unk; also contains cse
• this is $$L$$, build DFA, minimize DFA to get minimum DFA
• Designing the DFA – 3 steps for $$M$$ from $$L$$
• Steps:
1. Determine information to remember about part of an input string that $$M$$ has already read
2. Represent information as a set of states
• determine start and final states
3. Assign transitions ($$\delta: Q \times \Sigma \to Q$$)
• $$L = \{x \text{ over } \{0, 1\} \mid x \text{ does not end in } 01\}$$
• Whether $$01$$ are the last two symbols
• States
• 0
• 1
• 01
• 0 is start
• 0, 1 are accept
• Transitions:
• $$\epsilon \to 0$$
• $$(0, 0) \to 0$$
• $$(0, 1) \to 01$$
• $$(01, 0) \to 0$$
• $$(01, 1) \to 1$$
• $$(1, 0) \to 0$$
• $$(1, 1) \to 1$$
• Strings containing 10
• States
• 1
• 0
• 10
• 0 is start
• 10 is accept
• Transitions
• $$\epsilon \to 0$$
• $$(0,0) \to 0$$
• $$(0,1) \to 1$$
• $$(1,0) \to 10$$
• $$(1,1) \to 1$$
• $$(10,_) \to 10$$
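
The contains-10 transitions above can be encoded and exercised directly (a sketch; state names follow the notes, with 0 as the start state):

```python
# State = progress toward seeing "10"; "10" is a trap accept state.
delta = {("0", "0"): "0", ("0", "1"): "1",
         ("1", "0"): "10", ("1", "1"): "1",
         ("10", "0"): "10", ("10", "1"): "10"}

def accepts(x, start="0", final=("10",)):
    state = start
    for sym in x:
        state = delta[(state, sym)]
    return state in final

print([w for w in ["10", "0101", "111", "0011"] if accepts(w)])  # → ['10', '0101']
```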

## Lecture 5: Chapter 1, continued

### DFA Design, continued

• L - over 0, 1, contains 10, doesn’t end with 01
• Combine from two others
• Requires same alphabet
• need to keep state of both machines
• use pairs, built by the cross product of the two sets $$Q$$
• State transition function built as:
• $$\delta_3((a, b), sym) = (\delta_1(a, sym), \delta_2(b, sym))$$
• again, the cross product
• ensure that invalid states are removed
• From the start state, add each transition iteratively, going by what is present, as if doing breadth-first search
• For a $$\cup$$ combination:
• do as above
• but change final state, $$F_4 = \{(s_1, s_2) | s_1 \in F_1 \lor s_2 \in F_2\}$$
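
The cross-product construction above can be sketched as follows, building only reachable pairs; the two component DFAs (even 1s, even 0s) are assumed examples:

```python
def run(delta, start, finals, x):
    state = start
    for sym in x:
        state = delta[(state, sym)]
    return state in finals

def product_dfa(d1, d2, sigma, union=False):
    """Run two DFAs in lockstep; d1 and d2 are (delta, start, finals) triples."""
    (t1, s1, f1), (t2, s2, f2) = d1, d2
    delta, start = {}, (s1, s2)
    todo, seen = [start], {start}            # only build reachable pairs
    while todo:
        a, b = todo.pop()
        for sym in sigma:
            nxt = (t1[(a, sym)], t2[(b, sym)])
            delta[((a, b), sym)] = nxt
            if nxt not in seen:
                seen.add(nxt); todo.append(nxt)
    if union:       # F = pairs where either component is final
        finals = {q for q in seen if q[0] in f1 or q[1] in f2}
    else:           # intersection: both components final
        finals = {q for q in seen if q[0] in f1 and q[1] in f2}
    return delta, start, finals

# Assumed examples: even number of 1s, even number of 0s.
ev1 = ({("e", "0"): "e", ("e", "1"): "o",
        ("o", "0"): "o", ("o", "1"): "e"}, "e", {"e"})
ev0 = ({("E", "0"): "O", ("E", "1"): "E",
        ("O", "0"): "E", ("O", "1"): "O"}, "E", {"E"})

both = product_dfa(ev1, ev0, {"0", "1"})     # intersection machine
print(run(*both, "0110"))                    # → True (two 1s, two 0s)
```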

### DFA Minimization

• Three steps
• steps:
1. Partition all states into two groups $$G_1$$ – final states, $$G_2$$ – all non-final states
2. For each group, check whether all states in the group go to the same next group for each symbol.
• If not, partition into smaller groups
• repeating until no more partitions
3. All states in a group can be combined to a single state
• groups with original final states become a single, new final state
• The group with the original start state becomes the new start state
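
The three steps can be sketched as partition refinement; the 4-state example DFA (with two mergeable pairs) is an assumption for illustration:

```python
def minimize(states, sigma, delta, finals):
    """Steps 1-3 above: split final/non-final, refine, then merge groups."""
    parts = [p for p in (set(finals), set(states) - set(finals)) if p]
    changed = True
    while changed:                            # step 2: refine until stable
        changed = False
        def group(q):                         # index of the group holding q
            return next(i for i, p in enumerate(parts) if q in p)
        new_parts = []
        for p in parts:
            buckets = {}                      # states agreeing on every symbol
            for q in p:
                sig = tuple(group(delta[(q, a)]) for a in sorted(sigma))
                buckets.setdefault(sig, set()).add(q)
            new_parts.extend(buckets.values())
            changed = changed or len(buckets) > 1
        parts = new_parts
    return parts                              # step 3: one state per group

# Assumed example: states 1,2 are interchangeable, as are 3,4.
delta = {(1, "0"): 2, (1, "1"): 3, (2, "0"): 2, (2, "1"): 4,
         (3, "0"): 3, (3, "1"): 3, (4, "0"): 4, (4, "1"): 4}
print(minimize({1, 2, 3, 4}, {"0", "1"}, delta, finals={3, 4}))
```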

## Lecture 6: Chapter 1, continued

### Regular languages

• a language is said to be regular if there exists some DFA recognizing it
• Theorem: For a regular language there exists a unique minimum DFA up to isomorphism
• def’n: two DFAs are isomorphic if they are identical up to renaming of the states (same transitions, different state names)

### NFAs

• why?
• many different models
• all languages recognizable by a DFA are regular, and the same holds for NFAs
• for some regular languages, NFAs are easier to design (e.g., all pages containing CSE, CS, or CE)
• a transition may go to multiple states
• transitions may be on the empty string
• NFAs are described similarly, but $$\delta: Q \times \Sigma_{\epsilon} \to \mathcal{P}(Q)$$
• So long as at least one of the states currently held is accepting, the string is accepted
• a decision tree/path forms

## Lecture 7: Chapter 1, continued

• all languages that can be recognized by a DFA are regular languages, likewise for NFA
• A proof that DFA and NFA are equivalent
• To show a language is regular, show either a DFA or an NFA
• DFAs are a special type of NFA
• So prove that every NFA can be converted into an equivalent DFA

### NFA to DFA

• Steps:
1. Information – copies of the NFA states, as sets of individual states
2. States – the powerset of $$Q$$ from the NFA, only consider the distinct states, accepting if any member of the state set was accepting
3. Transitions – $$\delta_d(R, sym) = \cup_{r \in R} \delta_n(r, sym)$$
• Remember to minimize the resulting DFA
• when handling epsilon transitions, use $$E(A) = A \cup \{ \text{all states reached from } A \text{ by following only epsilon transitions} \}$$ for the next state, $$\delta_d(R, sym) = E(\cup_{r \in R}\delta_n(r,sym))$$; the start state is $$E(\{q_{n1}\})$$
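
The subset construction with $$E(\cdot)$$ can be sketched as below; the ends-in-1 NFA at the bottom is an assumed example:

```python
def eclose(states, eps):
    """E(A): A plus all states reachable via epsilon transitions only."""
    out, stack = set(states), list(states)
    while stack:
        q = stack.pop()
        for r in eps.get(q, ()):
            if r not in out:
                out.add(r); stack.append(r)
    return frozenset(out)

def nfa_to_dfa(sigma, n_delta, eps, q0, n_finals):
    """Subset construction: DFA states are sets of NFA states."""
    start = eclose({q0}, eps)
    states, todo, d_delta = {start}, [start], {}
    while todo:
        R = todo.pop()
        for a in sigma:
            # union of the NFA moves from every state in R, then epsilon-close
            nxt = eclose(set().union(*(n_delta.get((r, a), set()) for r in R)), eps)
            d_delta[(R, a)] = nxt
            if nxt not in states:
                states.add(nxt); todo.append(nxt)
    d_finals = {R for R in states if R & set(n_finals)}   # any member accepting
    return d_delta, start, d_finals

# Assumed example NFA: binary strings ending in 1.
n_delta = {("q0", "0"): {"q0"}, ("q0", "1"): {"q0", "q1"}}
d_delta, start, d_finals = nfa_to_dfa({"0", "1"}, n_delta, {}, "q0", {"q1"})
print(d_delta[(start, "1")] in d_finals)  # → True
```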

## Lecture 8: Chapter 1, continued

### The Regular Expression

• Reasons
• new model
• used in many programming languages and software
• vi
• grep
• perl
• java
• python
• awk
• write a program to check whether a string matches a given regular expression
• An alphabet $$\Sigma$$

| RE | Language |
| --- | --- |
| $$a$$ | $$\{a\}$$ |
| $$\epsilon$$ | $$\{\epsilon\}$$ |
| $$R_1 \cup R_2$$ | $$L(R_1) \cup L(R_2)$$ |
| $$R_1 R_2$$ | $$L(R_1) \circ L(R_2)$$ |
| $$R_1^*$$ | $$L(R_1)^*$$ |
• Σ = {0, 1}
• $$11* \cup 0*00*$$
• $$*$$ means zero or more
• $$\circ$$ means concatenation: $$A \circ B = \left\{ xy \mid x \in A \land y \in B \right\}$$ – each string of $$A$$ followed by each string of $$B$$, analogous to the cross product
• if either operand is the empty set, the concatenation is the empty set
• $$A^n$$ is A concatenated with itself $$n$$ times
• $$A^*$$ is $$A^0 \cup A^1 \cup A^2 \cup A^3 \cup \cdots$$
• $$A^+$$ is $$A^1 \cup A^2 \cup A^3 \cup \cdots$$
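
The example $$11^* \cup 0^*00^*$$ (one or more 1s, or one or more 0s) can be checked with Python's `re` module, whose `*` and `+` match the $$A^*$$ and $$A^+$$ definitions above:

```python
import re

# 11* ∪ 0*00*  –  alternation plays the role of union
pattern = re.compile(r"11*|0*00*")

for s in ["", "1", "111", "0", "00", "01"]:
    print(repr(s), bool(pattern.fullmatch(s)))
# Only nonempty all-1 or all-0 strings match; "" and "01" do not.
```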

## Lecture 9: Chapter 1, continued

• Showing that REs describe regular languages
• RE to DFA/NFA
• $$a$$ - start state, accept a, to final state
• $$\epsilon$$ - the start state is the final state, with no transitions
• $$\emptyset$$ - start state, no transitions, no final state
• $$R_1 \cup R_2$$ - given a machine for each, use an epsilon transition to each as an NFA
• $$R_1 \circ R_2$$ - all final states of $$R_1$$ have an epsilon transition to the initial state of $$R_2$$
• $$R_1^*$$ - add a new accepting start state with an epsilon transition to the original start; add an epsilon transition from each final state back to the original start state

## Lecture 10: Chapter 1, continued

• DFA/NFA to RE
• Requires GNFA
• NFA + RE as transition labels
• Conversion:
1. DFA to GNFA in a special form
1. Start state only has outgoing transitions – often just add a new start state w/ ε transition
2. one final state, only incoming transitions – Give all final states an ε transition to a new final state
3. Between any two states: at most one transition – merge parallel transitions with union
2. Remove states one by one, except the start and final state – when removing a state, relabel each bypassing transition with the concatenation of the incoming label, the starred self-loop label, and the outgoing label
3. The label between start and final is the regular expression

## Lecture 11: Chapter 1, continued

• HW1 Due 09-21, No class 09-24, HW reviewed by TA 09-26, Exam 1 09-28
• The Pumping lemma
• describes a common property of all regular languages, and some non-regular languages
• if a language has the property is it regular?
• it may be regular
• if it doesn’t have the property, than it isn’t regular
• based on pigeonhole principle
• if more than $$p$$ pigeons are placed into $$p$$ holes, then at least one hole will have at least two pigeons
• Consider a reg. lang. $$A$$ and a DFA for $$A$$ as well as a string $$s \in A$$ of $$n$$ symbols
• The path must then consist of $$n + 1$$ states
• if $$n + 1 > |D|$$, then the path repeats some states
• $$xz$$ – $$x$$ takes the machine to the repeated state (9 in the slide example), $$z$$ takes it from 9 to the final state 13
• $$xyyz$$ – $$x$$ takes it to 9, each copy of $$y$$ loops from 9 back to 9, and $$z$$ finishes
• For any integer $$i \geq 0$$, $$xy^iz$$ is accepted by the machine
• If $$A$$ is a regular language, then there is a number $$p$$ (the pumping length) such that any string $$s$$ in $$A$$ with $$|s|\geq p$$ can be divided into three pieces $$s = xyz$$ satisfying the following conditions
1. for each $$i \geq 0$$, $$xy^iz \in A$$
2. $$|y| > 0$$
3. $$|xy|\leq p$$
• Number of states is often the pumping number $$p$$ (in particular of a minimized DFA)
• contrapositive of the pumping lemma
• normal: if $$A$$ is regular, then for any long enough string, some part of its first $$p$$ symbols can be pumped
• contrapositive: if for some long enough string $$s$$ in $$A$$ none of its first $$p$$ symbols can be pumped, then $$A$$ is not regular
• Language $$A$$ is not regular if, for every arbitrary number $$p$$, there exists a string $$s$$ in $$A$$ with $$|s|\geq p$$ having the following property:
• for any decomposition of $$s = xyz$$ in which $$|xy| \leq p$$ and $$|y| > 0$$, there is an integer $$i \geq 0$$ such that $$xy^iz \not\in A$$
• Do it.
1. Let $$p$$ denote an arbitrary number
2. pick a string $$s \in A$$ with $$|s| \geq p$$
3. identify all possible decomps of $$s$$ into $$xyz$$ with $$|xy| \leq p$$ and $$|y| > 0$$
4. show that for each decomp, there exists an $$i \geq 0$$ s.t. $$xy^iz \not\in A$$ (i may be different for each decomp, if it isn’t a good choice, try again)
5. conclude that $$A$$ is not regular.
• $$L_1 = \left\{ w \mid w \mathrm{ has the same number of a's and b's} \right\}$$
• let $$p$$ be arbitrary
• Let $$s = (ab)^p$$ and we have $$(ab)^p \in L_1$$, and $$|(ab)^p| = 2p \geq p$$
• wrong – exhibiting one decomposition is not enough; all decompositions must be considered
• $$s = xyz$$, $$x = \epsilon$$, $$y = aba$$, $$z = b(ab)^{p-2}$$
• let $$i = 0$$, then $$xy^iz = xz = b(ab)^{p-2} \not\in L_1$$
• Therefore, $$L_1$$ isn’t regular
• Identify all possible decomps of $$s$$ into $$xyz$$ with $$|xy| \leq p$$ and $$|y| > 0$$
• follow from here
• Try
• $$p$$ is arbit.
• $$s = a^pb^p$$ and we have $$a^pb^p \in L_1$$
• Ident all possible decomps of $$s$$ into $$xyz$$ with $$|xy| \leq p$$ and $$|y| > 0$$
• in general, $$x = a^n$$, $$y = a^k$$, $$z = a^{p-n-k}b^p$$
• with $$p \geq k \geq 1$$ and $$p-k \geq n \geq 0$$
• let $$i = 0$$, then $$xy^iz = xz = a^{p-k}b^p \not\in L_1$$ since $$k \geq 1$$; thus, the language isn't regular.
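
Steps 3 and 4 for $$s = a^pb^p$$ can be brute-force checked for one small concrete $$p$$ (a sanity check of the argument's pattern, not a substitute for the proof, which needs arbitrary $$p$$):

```python
def in_L1(w):                         # same number of a's and b's
    return w.count("a") == w.count("b")

p = 6                                 # one concrete value, for checking only
s = "a" * p + "b" * p
assert in_L1(s) and len(s) >= p

# every decomposition s = xyz with |xy| <= p and |y| > 0 fails at i = 0,
# because y = a^k with k >= 1, so xz has fewer a's than b's
for xy_len in range(1, p + 1):
    for x_len in range(xy_len):       # |y| = xy_len - x_len >= 1
        x, y, z = s[:x_len], s[x_len:xy_len], s[xy_len:]
        assert y and not in_L1(x + z)
print("no decomposition survives i = 0")
```
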
• Consider $$L_2 = \{ww | w \in \{a,b\}^*\}$$
• $$p$$ is arbitrary
• $$s = a^pba^pb$$
• $$y = a^k$$, $$p \geq k \geq 1$$, $$x = a^n$$, $$p-k \geq n \geq 0$$; $$z$$ is the remainder
• the argument goes through as before

## Lecture 12: Chapter 1, Continued

• proving non-regularity
• $$L_3 = \{a^mb^n | m > n\}$$
• $$p$$ is arbitrary
• $$s = a^{p+1}b^p$$
• $$y=a^k$$, $$p \geq k \geq 1$$
• $$x=a^n$$, $$p-k \geq n \geq 0$$, $$z$$ = the remaining symbols
• $$i = 0$$, $$xy^iz = xz = a^{p+1-k}b^p$$, $$\not\in L_3$$
• Closure properties
• $$A \cup B$$ – regular
• combine NFA/DFA – easy
• $$A \circ B$$ – regular
• combine NFA/DFA – similarly easy
• $$A^*$$ – regular
• Allow accepting empty string and repeats
• $$A \cap B$$ – regular
• see notes for DFA, can be extended to NFA
• $$\bar{A}$$ – regular
• switch final/non-final states
• The reverse of a language $$A$$ is defined as follows: $$A^R = \{w^R | w \in A\}$$
• change direction of transitions, start state becomes final, if multiple final states, add a new state w/ epsilon transitions to old final states
• Also regular
• $$\Sigma = \{a, b\}$$, $$\Gamma = \{0, 1\}$$
• $$h(a) = 00$$, $$h(b) = 01$$
• if $$A$$ is regular, then $$h(A)$$ is also regular
• Closure properties can be used as for a contrapositive w/ the pumping lemma.
• $$L_4 = \{w \mid w \text{ has a different number of a's and b's}\}$$
• We see that $$\bar{L_4} = L_1$$
• we have already proven that L1 is not regular, therefore L4 is not regular
• $$L_5 = \{a^nb^n | n \geq 0\}$$
• $$L_5 = L_1 \cap a^*b^*$$
• therefore, $$L_5$$ is not regular – wrong! Closure only says regular $$\cap$$ regular is regular; $$L_1$$ is not regular to begin with, so nothing follows!
• Subset property doesn’t apply either

## Lecture 13: Chapter 2, Context Free Languages

• is it possible to design a computer to solve a problem?
• chapter 1: whether or not it's possible to design a DFA/NFA/RE for a given language
• to simplify – consider only decision problems (e.g., is a number divisible by 5?)
• Is it possible to design a computer to solve a decision problem
• what does it mean to solve – determine yes or no given an input
• input must be encoded
• whether or not strings are accepted
• is it possible to design a computer to recognize lang of decision problem
• Models and Languages:
1. DFA/NFA/RE – smallest group, regular languages
2. PDA/CFG – Context Free Languages
3. Real Computers – problems solvable by computer
4. Turing Machines – Turing-recognizable languages
• The CFG
• mathematical model to describe languages
• widely used in many applications
• specify programming languages
• generate images (contextfreeart.org)
• automatically generate CS papers (pdos.csail.mit.edu/scigen)
• Ex:
• $$A \to 0A1$$
• $$A \to \epsilon$$
• LHS is a variable
• symbols that are not variables (and not the empty string) are terminals
• $$(V, \Sigma, R, S)$$
• $$V$$ – Set of Variables
• $$\Sigma$$ – Set of Terminals (alphabet)
• $$R$$ – Set of Rules
• $$S$$ – Start Variable
• Rules may be combined – $$A \to 0A1 | \epsilon$$
• Start from the start variable – apply a rule to replace a variable, and continue until no variables remain.
• $$\left\{ w \text{ over } \Sigma \mid S \Rightarrow^* w \right\}$$
• parse trees may be used to explain a CFG
• graphical/tree-based structure
• Ex:
• $$S \to S + S$$
• $$S \to a$$
• CFGs are more powerful than DFA/NFA/RE
• Convert REs to a CFG
1. $$a \iff S \to a$$
2. $$\epsilon \iff S \to \epsilon$$
3. $$\emptyset \iff S \to S$$ (or no rules)
4. $$R_1 \cup R_2 \iff S \to S_1 \mid S_2$$, where $$S_1$$ generates $$R_1$$ and $$S_2$$ generates $$R_2$$
5. $$R_1 \circ R_2 \iff S \to S_1 S_2$$
6. $$R_1^* \iff S \to \epsilon \mid S_1 S$$
• For NFA/DFA:
• $$S_i \to \textit{sym}\, S_j$$ for each transition from state $$i$$ to state $$j$$ on $$\textit{sym}$$; $$S_i \to \epsilon$$ for each final state $$i$$

## Lecture 14: Review

• be careful when designing DFAs
• use the examples
• make sure to minimize

## Lecture 15: Context Free Grammars

• $$L_1 = \left\{a^nb^n | n \geq 0 \right\}$$
• $$R \to \epsilon$$
• $$R \to a R b$$
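
The grammar $$R \to \epsilon \mid aRb$$ can be checked by brute-force derivation; the little generator below is an illustration, not part of the lecture:

```python
def generate(rules, start="R", max_len=8):
    """All terminal strings derivable from `start` with at most max_len symbols.
    rules: dict mapping a variable to a list of right-hand sides."""
    results, todo, seen = set(), [start], {start}
    while todo:
        form = todo.pop()
        var = next((c for c in form if c in rules), None)
        if var is None:
            results.add(form)                 # no variables left: terminal string
            continue
        i = form.index(var)                   # expand the leftmost variable
        for rhs in rules[var]:
            new = form[:i] + rhs + form[i + 1:]
            # prune: terminals already present can never disappear
            if sum(c not in rules for c in new) <= max_len and new not in seen:
                seen.add(new); todo.append(new)
    return results

print(sorted(generate({"R": ["", "aRb"]}), key=len))
# → ['', 'ab', 'aabb', 'aaabbb', 'aaaabbbb']
```
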
• Two methods:
• Union Method – union of multiple smaller languages
• concatenation method – concatenation of multiple smaller languages
• recursive – $$L_1L_2L$$
• special cases
• $$L_2 = \left\{a^m b^n | m > n \geq 0\right\}$$
• $$S_2 \to a S_2$$
• $$S_2 \to a S_1$$
• $$L_3 = \left\{a^mb^n | m \neq n\right\}$$
• $$S_3 \to S_2 | B$$
• $$B \to B b$$
• $$B \to S_1 b$$
• $$B^{\prime} \to B b | b$$
• $$L_4 = \{ a^ib^ja^k | j > i + k, i \geq 0, k \geq 0 \}$$
• $$S_4 \to S_1 B^{\prime} S_1^R$$
• $$L_5 = \{ x \text{ over } \{a, b\} \mid x \text{ is a palindrome} \}$$
• $$S_5 \to \epsilon$$
• $$S_5 \to a$$
• $$S_5 \to b$$
• $$S_5 \to b S_5 b$$
• $$S_5 \to a S_5 a$$
• $$L_6 = \{ x \text{ over } \{a, b\} \mid x \text{ is not a palindrome} \}$$
• $$S_6 \to a S_6 a$$
• $$S_6 \to b S_6 b$$
• $$S_6 \to a T b$$
• $$S_6 \to b T a$$
• $$T \to aT \mid bT \mid \epsilon$$
• $$L_7 = \{ x \text{ over } \{a, b\} \mid x \text{ has the same number of a's and b's} \}$$
• $$S_7 \to a S_7 b$$
• $$S_7 \to b S_7 a$$
• $$S_7 \to S_7 S_7$$
• $$S_7 \to \epsilon$$

## Lecture 16: PDA (Pushdown Automaton)

• a new model
• Adds a stack to a DFA/NFA
• CFG – specify a programming language
• PDAs can be used to design the compiler (see CSCE 425)
• PDAs may be built upon either DFA or NFA; the NFA-based version is more powerful
• Stacks have 2 ops
• push – push a symbol onto the top of the stack
• pop – pop the symbol off of the top of the stack
• Edges are now labeled differently: ($$a, B \to C$$)
• if the current state is $$P$$, $$a$$ is the input symbol, and $$B$$ is the top-of-stack symbol, then take the transition: read the input symbol, pop $$B$$, push $$C$$, and go to the new state $$Q$$
• with $$\epsilon$$ as $$B$$, the transition may be taken no matter the top symbol, and no symbol is popped
• with $$\epsilon$$ as $$C$$, don't push any symbol
• $$\epsilon, \epsilon \to \epsilon$$ – don't read, don't change the stack
• shortcuts do exist
• Formal Description:
• NFA: $$Q$$ – states, $$\Sigma$$ – alphabet, $$\delta : Q \times \Sigma_\epsilon \to P(Q)$$, $$q_0$$ – start state, $$F$$ – final states
• PDA: $$Q$$ – states, $$\Sigma$$ – input alphabet, $$\Gamma$$ – stack alphabet (all possible stack symbols), $$\delta : Q \times \Sigma_\epsilon \times \Gamma_\epsilon \to P(Q \times \Gamma_\epsilon)$$, $$q_0$$ – start state, $$F$$ – final states
• \$ is used to mark the bottom of the stack (popping it means the stack is empty)
• An input string is accepted by a PDA if at least one copy of the PDA stops at a final state after reading the input string.
• Use a ’configuration’ to show how it works in homework:
• (current state, remaining input symbols, contents of the stack)
• $$(1, aab, \epsilon) \Rightarrow (2, aab, \$) \Rightarrow (2, ab, A\$) \Rightarrow (2, b, AA\$) \Rightarrow (3, \epsilon, A\$) \Rightarrow (4, \epsilon, \$)$$

## Lecture 17: PDAs, Continued

• Designing a PDA for a language
• Split the input string
• $$S_1 = S_2^R$$
• $$|S_1| \leq |S_2|$$ or other comparison
• $$L_1 = \{ wcw^R \mid w \text{ over } \{a,b\}\}$$
• $$Q = \{1,2,3,4\}$$
• $$q_0 = 1$$
• $$F = \{4\}$$
• 2 makes a copy of the first part
• 3 compares the first part with the second
• transition to 4 on acceptance
• $$L_2 = \{ ww^R \mid w \text{ over } \{a,b\}\}$$
• $$Q = \{1,2,3,4\}$$
• $$q_0 = 1$$
• $$F = \{4\}$$
• 2 makes a copy of the first part
• $$\epsilon, \epsilon \to \epsilon$$ transition between 2 and 3
• 3 compares the first part with the second
• transition to 4 on acceptance
• $$L_3 = \{ w \mid w = w^R, w \text{ over } \{a, b\} \}$$
• add a transition between 2 and 3 that reads either symbol but doesn't change the stack

## Lecture 18: Equivalence of PDA and CFG

• PDA to CFG
• Special form of PDA as the first step
• Only one final state – epsilon transitions from the old final states to the new one
• Stack must be empty at the final state
• Ex: $$\Gamma = \{ \$, A, B \}$$
• Requires loop-back transitions to pop remaining stack symbols at the end
• A transition is either a push transition ($$a, \epsilon \to x$$) or a pop transition ($$a, x \to \epsilon$$)
• For states $$p, q$$ – CFG variable $$A_{pq}$$ (all strings that can take the PDA from $$p$$ to $$q$$ without changing the top stack symbol at $$p$$ – that symbol is never popped along the way) – Three types of rules:
• $$p \to r$$ ($$a, \epsilon \to x$$) (push), $$s \to q$$ ($$b, x \to \epsilon$$) (pop) – $$A_{pq} \to a A_{rs} b$$
• a middle state $$v$$ – $$A_{pq} \to A_{pv} A_{vq}$$
• any state $$p$$ – $$A_{pp} \to \epsilon$$

## Lecture 19: PDA to CFG

• Continue given previous rules
• Generating rules from the three types of rules
• In $$A_{pq} \to a A_{rs} b$$, $$A_{rs}$$ must satisfy the same conditions as $$A_{pq}$$
• push to pop in order for generating rules

## Lecture 20: The Pumping Lemma for CFGs

• Review of the Pumping Lemma for Regular Languages
• The loop
• Pumping Lemma for CFGs
• Assume each node has at most 3 child nodes
• If the height is 1, it has at most 3 leaf nodes
• If a tree has at least 3 leaf nodes, its height is at least 1
• $$S \to aSb$$, $$S \to \epsilon$$, $$ab$$
• The number of child nodes of a variable node is the number of symbols on the RHS of the rule
• Assume that each rule has at most $$b$$ symbols on the RHS; if a string has at least $$b^h$$ symbols, the height of the parse tree is at least $$h$$
• Generally,
• $$b$$ is the max number of symbols on the RHS of a rule
• $$v$$ is the total number of variables
• Consider a string $$s$$ that can be derived by the grammar, if $$s$$ has at least $$b^{v+1}$$ symbols, then
• the height of the tree is at least $$v + 1$$
• and the number of nodes on the longest path is at least $$v + 2$$
• The number of variables – the longest path has at least $$v + 1$$ variable nodes
• According to the pigeonhole principle, some variable occurs at least twice on the longest path
• the number of terminals – $$v + 1$$
• There are two subtrees under the repeated variable $$T$$ – one for the earlier occurrence and another for the later.
• The inner tree may be replaced with the outer, or likewise the outer with the inner
• Requires 2 pumped sequences?
• Lemma: If $$A$$ is a CFL, then there exists a pumping length $$p$$ such that any string $$s \in A$$ with $$|s| \geq p$$ can be divided into five pieces, $$s = uvxyz$$, satisfying the following conditions:
1. For any integer $$i \geq 0$$, $$uv^ixy^iz \in A$$
2. $$|vy| > 0$$ (one of $$v$$ and $$y$$ could be empty, but not both at the same time)
3. $$|vxy| \leq p$$ ($$vxy$$ are not necessarily the first $$p$$ symbols since $$|u|$$ is unknown)
• Contrapositive: if for any $$p$$ there exists a string $$s \in A$$ with $$|s| \geq p$$, having the following property: For any decomposition of $$s = uvxyz$$ in which $$|vy| > 0$$ and $$|vxy| \leq p$$, there is an integer $$i \geq 0$$ for which $$uv^ixy^iz \not\in A$$, then the language is not context free.
• Using the new Pumping Lemma
• Prove that $$L = \left\{ a^nb^nc^n | n \geq 0\right\}$$ is not context free
• Let $$p$$ be an arbitrary number,
• Consider $$s = a^pb^pc^p$$ and consider all possible decompositions as specified
• Cases:
1. $$vy$$ has only $$a$$
• if we set $$i = 0$$, we remove both $$v$$ and $$y$$
• note: any $$i \neq 1$$ also works
2. $$vy$$ consists of both $$a$$ and $$b$$
• let $$i = 0$$; in fact, any $$i \neq 1$$ works
3. $$vy$$ has only $$b$$
• See above
4. $$vy$$ consists of both $$b$$ and $$c$$
• See above
5. $$vy$$ has only $$c$$
• See above
• Likewise, $$\{w \text{ over } \{a, b, c\} \mid |w|_a = |w|_b = |w|_c\}$$ follows, using the same string

## Lecture 21: The Pumping Lemma, Continued

• For a string $$s$$ in a cfl, $$A$$, if $$s$$ is sufficiently long, then on the longest path from the root to a leaf in the parse tree, some variable appears more than once.
• If $$A$$ is a cfl, then there exists a pumping length $$p$$ such that any string $$s \in A$$ with $$|s| \geq p$$ can be divided into five pieces, $$s = uvxyz$$, satisfying the following conditions:
1. For any integer $$i \ge 0$$, $$uv^ixy^iz \in A$$
2. $$|vy| > 0$$
3. $$|vxy| \leq p$$
• Describes a common property
• if has the property, it may be cf, but if it doesn’t, it isn’t CF, see contrapositive above
• Prove that the following language is not context free: $$L = \{ ww \mid w \text{ over } \{a,b\}\}$$
• Given cases 1-8, if $$i \neq 1$$, then the pumped string is not in $$L$$
• For case 9:
• if $$v$$ and $$y$$ have the same number of $$a$$'s, the string $$a^pba^pb$$ isn't a good choice
• Let $$p$$ be arbitrary, and $$s = a^pb^pa^pb^p$$
• For each case, if $$i = 0$$, the pumped string is not in the language; in fact, any integer $$i \neq 1$$ works. The language is not context free.
• For $$L = \{ wcw \mid w \text{ over } \{a,b\}\}$$
• First Proof:
• $$s = a^pb^pca^pb^p$$
• 11 cases
• For each case, if $$i = 0$$, the pumped string doesn't fit the $$wcw$$ pattern
• Different Proof:
• List all possible decomps depending on whether $$vy$$ contains symbol $$c$$
• 2 cases:
• $$c \in vy$$
• $$i = 0$$: the new string $$uxz$$ doesn't have $$c$$, and thus is not in $$L$$; in fact, any $$i \neq 1$$ works.
• $$c \not\in vy$$
• Because $$|vxy| \leq p$$, pumping cannot change both copies of $$w$$ in the same way; with $$i = 0$$ the two sides of $$c$$ end up different. Therefore, the new string is not in $$L$$, and the language is not CF.

### Closure Property

• CLASS is closed under $$\odot$$ if for all $$A, B \in \mathrm{CLASS}$$, $$A \odot B \in \mathrm{CLASS}$$
• Properties:
• Union – Closed
• Concatenation – Closed
• Star – Closed
• Intersection – Not Closed
• Complement – Not Closed

### Proving $$L$$ is CF

• Design a PDA
• Design a CFG
• Use closure property
• To prove not CF
• use pumping lemma
• and closure property

## Lecture 22: Context-Free Languages, continued

### CFG to PDA

• Ex:
• $$S \to S_1 S_2$$
• $$S_1 \to a S_1$$
• $$S_1 \to a$$
• $$S_2 \to a S_2 b$$
• $$S_2 \to \epsilon$$
• Generate a transition for each rule, and for each terminal
• terminals pop items, rules may push as well as pop
• Use left-most derivation to follow the PDA

### Group Activity

• Be careful when traversing a tree from a PDA

### Midterm Review

• pumping lemma was a problem
• $$s$$ must be defined generally, and $$|s| \geq p$$
• Consider all possible decompositions, $$s = xyz$$, $$s$$ must be specific!

## Lecture 23: Chapter 3, The Turing Machine

• Initially only the standard Turing machine; variants come later
• Recall DFA/NFA and PDA/CFG
• Turing Machine
• Note: These are deterministic
• semi-infinite input tape, infinite to the right
• blank symbols after the original string, infinitely many
• finite number of states
• Input tape may be written to
• tape head can be moved left or right
• Transition, ex: from $$P$$ to $$Q$$, $$a \to b, c$$
• $$P$$ – current state
• $$Q$$ – next state
• $$a$$ is the current tape symbol
• $$b$$ is the new tape symbol
• $$c$$ is either $$L$$ or $$R$$, $$L$$ to move head to left, $$R$$ to move to right
• Moving left at the leftmost cell leaves the head where it is
• Recognizes Turing-recognizable languages (e.g., $$a^nb^nc^n$$)
• Has 2 final states – accept or reject
• When the machine enters the accept state, the machine accepts
• The machine rejects when it enters the reject state or has no more transitions to take
• Status may be described as a “configuration”, (symbols on lhs of tape head, current state, current symbol, symbol on rhs of tape head)
• Described as:
• $$Q$$ – States
• $$\Sigma$$ – Input Symbols
• $$\Gamma$$ – Tape Alphabet, $$\Sigma \subseteq \Gamma$$, Space $$\in \Gamma$$
• $$\delta$$ – $$Q \times \Gamma \to Q \times \Gamma \times \{L, R\}$$
• $$q_{0}$$ – Initial State
• $$q_{accept}$$ – Accept State
• $$q_{reject}$$ – Reject State
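
The 7-tuple can be simulated directly; the ends-in-0 example machine and its state names are assumptions for illustration, not from the slides:

```python
def run_tm(delta, q0, q_acc, q_rej, w, blank="_", max_steps=10_000):
    """Simulate a deterministic single-tape TM.
    delta maps (state, symbol) -> (new_state, new_symbol, 'L' or 'R')."""
    tape = dict(enumerate(w))          # sparse tape; missing cells are blank
    state, head = q0, 0
    for _ in range(max_steps):
        if state == q_acc:
            return True
        if state == q_rej:
            return False
        sym = tape.get(head, blank)
        if (state, sym) not in delta:  # no transition to take: reject
            return False
        state, tape[head], move = delta[(state, sym)]
        head = max(head + (1 if move == "R" else -1), 0)  # stay put at left edge
    raise RuntimeError("step limit hit; the machine may be looping")

# Assumed example: accept strings over {0,1} whose last symbol is 0.
delta = {("scan", "0"): ("scan", "0", "R"),
         ("scan", "1"): ("scan", "1", "R"),
         ("scan", "_"): ("back", "_", "L"),
         ("back", "0"): ("accept", "0", "R")}
print(run_tm(delta, "scan", "accept", "reject", "1010"))  # → True
```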

## Lecture 24: Chapter 3, The Turing Machine

• Design of a TM is like writing a program
• Algorithm is a high-level idea
• TM is the implementation
• $$\{a^{2^n} | n \geq 0\}$$
• Count number of $a$s, say $$m$$
• check if $$m$$ is a power of 2
• Unary number to represent $$m$$
• use a binary number to represent $$m$$
• Unary numbers, single symbol, $$m$$ is represented by $$m$$ symbols
• binary, two symbols, etc
• $$a$$ with unary number
• input string is already a unary number
• Checking whether a decimal number is a power of 2
• divide by 2 until an odd number is reached; if that odd number is $$1$$, $$m$$ is a power of 2
• Divide by 2, by replacing every other $$a$$ with a special symbol
• See turing machine from slides (TM1)
• $$a$$ with binary counter
• remove an a from the string, increase counter by one
• count should be lsb first
• check to make sure that only the last bit is 1
• only count $$m - 1$$, and then check only 1s in counter
• See TM2 from slides
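
The halving idea behind TM1 (cross off every other $$a$$; reject if a pass meets an odd count larger than 1) can be mirrored as an ordinary algorithm – a sketch, not the machine itself:

```python
def is_power_of_two_unary(w):
    """Decide {a^(2^n) | n >= 0}, mimicking TM1's repeated halving."""
    count = w.count("a")
    if count == 0 or count != len(w):
        return False              # must be a nonempty string of only a's
    while count > 1:
        if count % 2 == 1:
            return False          # odd count > 1: not a power of 2
        count //= 2               # one sweep crosses off half the a's
    return True                   # reduced to a single a

print([n for n in range(1, 20) if is_power_of_two_unary("a" * n)])  # → [1, 2, 4, 8, 16]
```
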
• $$\#x\#y\#z$$ where $$x = y + z$$ and $$x$$, $$y$$, and $$z$$ are unsigned binary numbers; the MSB is 1 and is the first bit
• sum = y + z; if sum == x, accept, else reject
• while y > 0: y--, z++; then if x == z, accept, else reject
• while y > 0: y--, x--; while z > 0: z--, x--; if x == 0, accept, else reject
• mark the left end with a submachine
• mini machine for initial validity (MSB check)
• mini machine for checking if $$y = 0$$
• mini machine for $$y$$--
• find the LSB; if 1, replace with 0
• if 0, replace with 1; if the next is 1, replace with 0; else, if the next is 0, loop from the start of this step (if 0, replace w/ 1…)
• mini machine for $$z$$++
• check for overflow; if so, shift $$z$$ right and add a 1, else continue
• Another machine to check if $$x = z$$

## Lecture 25: Chapter 3, The Turing Machine – Extensions

• Standard TM: Left, Right, state machine and rewrite rules
• Variants
• Stay
• bi-infinite tape
• jump option
• non-deterministic
• multiple tapes
• multiple tape heads
• 2-dimensional tape
• Stay option
• $$\delta : Q \times \Gamma \to Q \times \Gamma \times \{L, R, S\}$$
• simulated using an intermediate state
• which doesn't change the intermediate symbol
• Bi-infinite tape
• Mapping
• cell-to-cell – each cell of T2's tape is mapped to a cell of T1's tape – an every-other-cell mapping
• string mapping – write the string of T2's tape on T1's tape
• Will require special transitions, with one or more special intermediate states

## Lecture 26: Chapter 3, The Turing Machine – Extensions, Continued

• multiple tapes
• $$\delta : Q \times \Gamma \times \Gamma \to Q \times \Gamma \times \Gamma \times \{L, R\} \times \{L, R\}$$
• Considers the symbols on tapes one and two; new symbols for tapes 1 and 2; directions for tapes 1 and 2
• equivalent to a normal TM
• Use a tape mapping – cell-to-cell or string
• Use string mapping, and special (dotted) symbols to denote the current location of each head
• From here, a series of special transitions is required. These are ugly.
• Non-deterministic Turing Machine
• new outcome – loop: loops forever on $$w$$

## Lecture 27: Chapter 3, The Turing Machine, Variants: The Non-deterministic TM

• remember – rejecting $$w$$ means the machine either enters the reject state or reaches a point with no more transitions
• in an NDTM, $$\delta: Q \times \Gamma \to P(Q \times \Gamma \times \{L, R\})$$
• An NDTM accepts a string if at least one copy of the NDTM accepts the string.
• An NDTM rejects a string if every copy rejects $$w$$.
• An NDTM loops on $$w$$ if no copy accepts $$w$$ and at least one copy loops on $$w$$
• Designing a TM from an NDTM:
• $$T1$$ searches the computation tree of $$T2$$ on string $$w$$; if an accept node is found, $$T1$$ accepts $$w$$
• Search is done using BFS to avoid issues with looping
• searching is done in shortlex order
• Use a TM with three tapes – input string, node id, one copy of $$T2$$ – then convert to a standard TM

## Lecture 28: More Models – Enumerators

• Has multiple tapes, but the last one is write-only
• $$L(E)$$ – the language of all strings printed out on the write-only tape
• has no input string
• $$L(T)$$ – all strings accepted by a TM
• Enumerator to TM:
• $$T$$ accepts $$x$$ if $$x \in L(E)$$
• $$E$$ has $$m$$ tapes, $$T$$ has $$m + 1$$ tapes
• $$T$$ simulates $$E$$
• Every time $$E$$ prints out a string $$w$$ on the WO tape, $$T$$ checks if $$w = x$$; if so, $$T$$ accepts $$x$$, else it continues
• Thus, $$L(T) = L(E)$$
• TM to Enumerator:
• For $$i = 1$$ to $$\infty$$
• $$E$$ simulates $$T$$ on string $$S_i$$
• If $$T$$ accepts $$S_i$$, $$E$$ prints out $$S_i$$ on the write-only tape
• If $$T$$ loops on $$S_i$$, the simulation never finishes and later strings are never tried
• However, $$L(E) \neq L(T)$$ because of loops – remedy as follows:
• For $$n = 1$$ to $$\infty$$
• For $$i = 1$$ to $$n$$
• $$E$$ simulates $$T$$ on $$S_i$$ for $$n$$ steps
• If $$T$$ accepts $$S_i$$, E prints out $$S_i$$ on the output tape
• Then $$L(E) = L(T)$$
• These languages are recursively enumerable, a.k.a. Turing-recognizable
• A language is recursively enumerable iff it is Turing-recognizable
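The bounded two-loop simulation above (in round $$n$$, run $$T$$ for $$n$$ steps on each of $$S_1, \ldots, S_n$$) is called dovetailing. A minimal Python sketch, where `accepts_within(s, n)` is a hypothetical stand-in for "$$T$$ accepts $$s$$ within $$n$$ steps":

```python
from itertools import count, islice, product

def shortlex(alphabet):
    """Yield all strings over the alphabet in shortlex (dictionary-by-length) order."""
    for length in count(0):
        for tup in product(alphabet, repeat=length):
            yield ''.join(tup)

def enumerate_language(accepts_within, alphabet, rounds):
    """Dovetailing enumerator: in round n, simulate the recognizer for
    n steps on each of S_1..S_n, printing every string accepted in time.
    Looping inputs never block progress, since each round is bounded."""
    printed = []
    for n in range(1, rounds + 1):
        for s in islice(shortlex(alphabet), n):
            if accepts_within(s, n) and s not in printed:
                printed.append(s)
    return printed
```

With a toy recognizer that accepts even-length strings in $$|s|$$ steps (`lambda s, n: len(s) % 2 == 0 and len(s) <= n`), four rounds over $$\{a, b\}$$ already print `''` and `'aa'`.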
• Adding a stack to a DFA/NFA gives a PDA

### Multi-stack PDAs

• PDA-2 > PDA-1
• PDA-3 = PDA-2

### CFGs

• $$A \to \alpha$$ – $$A$$ can be replaced by $$\alpha$$
• Instead, generate a CSG
• $$xAy \to x\alpha y$$ – A can only be replaced by $$\alpha$$ if surrounded by $$x$$ and $$y$$
• Human languages are context sensitive
• Turing machine is more powerful

### More Models

• LBA
• $$\lambda$$-calculus
• $$\mu$$-recursive functions – similar to modern computers
• register machines
• Have same power as a TM
• Any algorithm that can be carried out at all (by a single person, a finite number of people, a computer, a computational model) can be carried out by a TM. – The Church-Turing Thesis

## Lecture 29: Languages and Closure Properties

• Regular Language Closure Properties
• Union
• Concatenation
• Star
• Intersection
• Complement
• Context Free
• Union
• Concatenation
• Star
• Turing Recognizable
• Union – run $$A$$ and $$B$$ in parallel, alternating one step of each
• Accept if either accepts
• Reject if both reject
• Loop if neither accepts and at least one loops
• Concatenation and star are part of homework
• Intersection – also holds
• Accept if both accept; reject if either rejects
• Complement – may or may not be recognizable – see Ch. 4, will be shown later
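The union construction above can be sketched in Python, with generators as illustrative stand-ins for recognizers: each `yield` is one TM step, and the generator's return value is accept (True) or reject (False); a looping machine just yields forever. This is a toy model, not a real TM simulator:

```python
def union_recognizer(machine_a, machine_b):
    """Recognizer for L(A) ∪ L(B): alternate one step of A and one of B,
    accepting as soon as either accepts, rejecting only if both reject.
    If neither accepts and one loops, this loop runs forever -- the union
    machine loops too, which is allowed for a recognizer."""
    a_result = b_result = None
    while True:
        if a_result is None:
            try:
                next(machine_a)
            except StopIteration as stop:
                a_result = bool(stop.value)
                if a_result:
                    return True        # A accepted
        if b_result is None:
            try:
                next(machine_b)
            except StopIteration as stop:
                b_result = bool(stop.value)
                if b_result:
                    return True        # B accepted
        if a_result is False and b_result is False:
            return False               # both rejected

def accept_after(k):
    for _ in range(k):
        yield
    return True

def reject_after(k):
    for _ in range(k):
        yield
    return False

def loop_forever():
    while True:
        yield
```

Note that running $$A$$ to completion before starting $$B$$ would be wrong: if $$A$$ loops on $$w$$ but $$B$$ accepts it, the sequential machine never answers.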

### Further Languages – The Decider

• A special case of the TM, recognizing a subset of the Turing-recognizable languages
• but a superset of the CFLs
• Languages outside the decidable ones cannot have a decider built; those that are Turing-recognizable can have a TM built, and outside of that, not even a TM exists
• Does DFA $$D$$ accept string $$w$$?
• Network protocol – a sequence of network messages
• soda machine – a sequence of actions
• Language $$A_{DFA} = \left\{ \langle D, w \rangle \mid D \text{ accepts } w \right\}$$
• Language is “red”
• Proof $$A_{DFA}$$ is decidable.
• Encoding 1: $$\#p,q\#a,b\#p,a,p\#p,b,q\#q,b,p\#q,a,q\#p\#p\#\#abb\#$$
• $$\Sigma = \left\{p,q,b,a,\#,','\right\}$$
• Encoding 2: Use $$0$$ and $$1$$ for symbols, and map to various options
• Assume an encoding and then design a decider
• Input: an encoding string
• simulate $$D$$ on $$w$$; if $$D$$ accepts, accept; if it rejects, reject
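The simulation step of the decider is trivially halting, which is the heart of the proof. A sketch in Python, assuming the encoding has already been decoded into a dictionary (the dict layout is this sketch's own convention); the transitions match the two-state machine from Encoding 1:

```python
def dfa_accepts(dfa, w):
    """Decider core for A_DFA: simulate the decoded DFA on w. This always
    halts, since each input symbol consumes exactly one transition."""
    state = dfa["start"]
    for symbol in w:
        state = dfa["delta"][(state, symbol)]
    return state in dfa["accept"]

# the DFA from Encoding 1: states p, q; start p; accept p
D = {
    "start": "p",
    "accept": {"p"},
    "delta": {("p", "a"): "p", ("p", "b"): "q",
              ("q", "b"): "p", ("q", "a"): "q"},
}
```

On the encoded input string `abb` the run is $$p \to p \to q \to p$$, ending in an accept state, matching the encoding example.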

## Lecture 30: More Languages, Continued

• $$L$$ is Turing Recognizable
• Design TM for $$L$$ such that if $$x \in L$$, $$M$$ stops and accepts $$x$$ (answer is yes)
• if $$M$$ stops and rejects $$x$$, answer no
• if $$M$$ runs forever, no answer
• $$L$$ is Decidable
• only two answers, yes and no
• Design a decider $$M$$ for $$L$$, such that
• if $$x \in L$$, $$M$$ stops and accepts $$x$$
• if $$x \not\in L$$, $$M$$ stops and rejects $$x$$
• The Decision problem
• Does a DFA $$D$$ accept a string $$x$$
• Language $$A_{DFA} = \left\{ \langle D, w \rangle \mid D \text{ accepts } w \right\}$$
• $$A_{DFA}$$ is decidable – a decider can be designed for the language
• Decider checks if encoding is valid – if not, reject
• then simulate $$D$$ on $$w$$ – if accept, accept, else, reject
• Doesn’t work for NFAs with this same system
• After checking validity, convert NFA to DFA, then follow DFA rules
• Similarly for RE, convert to DFA

### The Emptiness Testing problem

• Check if $$D$$ has $$L(D) = \emptyset$$
• Check if $$D$$ is valid, if not reject
• For $$i = 1$$ to $$\infty$$
• simulate $$D$$ on $$S_i$$
• If $$D$$ accepts, reject
• At the end, accept
• Not a decider – the loop never ends
• Instead, check whether any accept state is reachable from the start state; if one is, reject, else accept
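The reachability check can be sketched with a breadth-first search over the transition graph; the dict representation of the DFA is this sketch's own convention:

```python
from collections import deque

def dfa_language_empty(dfa):
    """Decider for E_DFA: L(D) is empty iff no accept state is reachable
    from the start state. BFS always halts, unlike simulating D on
    S_1, S_2, ... forever."""
    seen = {dfa["start"]}
    frontier = deque(seen)
    while frontier:
        state = frontier.popleft()
        if state in dfa["accept"]:
            return False               # reachable accept state: L(D) != empty
        for (src, _symbol), dst in dfa["delta"].items():
            if src == state and dst not in seen:
                seen.add(dst)
                frontier.append(dst)
    return True
```

This halts after at most $$|Q|$$ dequeues, so it is a genuine decider.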

### Equivalence problem

• given two machines, do they recognize the same language
• Problem is decidable
• For each string $$x = S_1, S_2, \ldots$$
• simulate $$D_1$$ on $$x$$
• simulate $$D_2$$ on $$x$$
• If both accept or both reject, continue; else reject
• Not a correct option – infinite loop
• Instead: minimize both DFAs and check if the results are identical; if so, accept, else reject
• Decider 2
• has 2 submachines
• 1
• Reads input string, outputs new string, $$\langle C \rangle$$
• Build a DFA for $$L(D_1) \cap \overline{L(D_2)}$$, and likewise with $$D_1$$ and $$D_2$$ switched
• Then union the two – this is $$C$$
• 2
• Uses Decider for emptiness test on $$\langle C \rangle$$
• Accept if this accepts
• reject if this rejects
• Will use this method for other problems
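Decider 2 above can also be run directly as a BFS over pairs of states (the product automaton): the machines disagree iff some reachable pair contains an accept state of exactly one of them, which is the same thing as the symmetric-difference machine $$C$$ having a nonempty language. A sketch, using the same illustrative dict representation:

```python
from collections import deque

def dfas_equivalent(d1, d2, alphabet):
    """Decider sketch for EQ_DFA: explore reachable state pairs of the
    product automaton; reject at the first pair where exactly one machine
    accepts, accept if no such pair is reachable."""
    start = (d1["start"], d2["start"])
    seen = {start}
    frontier = deque([start])
    while frontier:
        s1, s2 = frontier.popleft()
        if (s1 in d1["accept"]) != (s2 in d2["accept"]):
            return False               # some string reaching this pair is a witness
        for a in alphabet:
            nxt = (d1["delta"][(s1, a)], d2["delta"][(s2, a)])
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return True
```

At most $$|Q_1| \cdot |Q_2|$$ pairs exist, so this always halts.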

## Lecture 31: Properties of CFGs

### Decidability

• $$A_{CFG} = \left\{ \langle G, w \rangle \mid G \text{ generates } w \right\}$$
• $$G$$ is a PL spec
• $$w$$ is source code of a program
• Design a decider for $$A_{CFG}$$ – a compiler or interpreter
• Proving so
• A decider $$x$$, input is encoding
• Check if input string, reject if invalid
• two possibilities – accept and reject
• Convert $$G$$ to an equivalent PDA $$P$$
• If $$P$$ accepts $$w$$, then accept, if $$P$$ rejects, then reject
• Issue, what if $$P$$ is non-deterministic
• Issues – don’t try to generate all possible strings within the language
• Convert any CFG $$G$$ to Chomsky Normal Form $$G^{\prime}$$ – See chapter 2
• If a string $$w$$ can be generated by $$G^{\prime}$$, it is generated by $$G^{\prime}$$ in exactly $$2n-1$$ steps, where $$n = |w|$$
• Ex:
• $$S \to AB | CC | \epsilon$$
• $$A \to AB | CC$$
• $$B \to CC$$
• $$C \to a | b$$
• Each variable can be replaced by either exactly two variables or exactly one terminal; only the start variable may additionally be replaced by the empty string
• So, to decide for a CFG,
• If invalid, reject.
• Convert $$G$$ to $$G^{\prime}$$ in Chomsky Normal Form
• Try all derivations of exactly $$2n - 1$$ steps, where $$n = |w|$$
• Check if any of them generates $$w$$; if so, accept, else reject
• The number of steps is bounded by $$2|w| - 1$$
• And thus compilers work!
• PDAs may become a CFG, and then as before
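Enumerating all $$2n-1$$-step derivations is exponential; the standard practical decider for a CNF grammar (not covered in the lecture, but equivalent in outcome) is the CYK dynamic-programming algorithm. A sketch, using the lecture's example grammar, where a rule body is a 2-tuple of variables, a 1-tuple holding a terminal, or `()` for $$\epsilon$$:

```python
def cyk(grammar, start, w):
    """Membership test for a CFG in Chomsky Normal Form via CYK.
    table[i][j] holds the variables that derive w[i : i+j+1]."""
    n = len(w)
    if n == 0:
        return () in grammar.get(start, [])    # only S may derive epsilon
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, ch in enumerate(w):                 # length-1 substrings
        for var, bodies in grammar.items():
            if (ch,) in bodies:
                table[i][0].add(var)
    for length in range(2, n + 1):             # longer substrings, bottom up
        for i in range(n - length + 1):
            for split in range(1, length):     # left part has 'split' symbols
                for var, bodies in grammar.items():
                    for body in bodies:
                        if (len(body) == 2
                                and body[0] in table[i][split - 1]
                                and body[1] in table[i + split][length - split - 1]):
                            table[i][length - 1].add(var)
    return start in table[0][n - 1]

# the lecture's CNF example: S -> AB | CC | eps, A -> AB | CC, B -> CC, C -> a | b
G = {"S": [("A", "B"), ("C", "C"), ()],
     "A": [("A", "B"), ("C", "C")],
     "B": [("C", "C")],
     "C": [("a",), ("b",)]}
```

For this grammar every even-length string over $$\{a, b\}$$ is generated and no odd-length string is, which CYK confirms in $$O(n^3)$$ time.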

### Emptiness

• $$E_{CFG} = \left\{ \langle G \rangle \mid G \text{ is a CFG} \land L(G) = \emptyset \right\}$$
• Convert $$G$$ to PDA $$P$$
• Check $$P$$ for final state, if has it, reject, if not, accept
• But not quite!
• If invalid, reject
• Find a set $$T$$ that contains all variables that generate at least one string of terminals
• If $$S \in T$$, reject
• Else, accept
• To generate $$T$$
• $$T = \{\}$$
• For each rule $$A \to w$$
• If $$A \not\in T$$
• If $$w$$ doesn't have any variables, add $$A$$ to $$T$$
• Else, if every variable in $$w$$ is in $$T$$, add $$A$$ to $$T$$
• Repeat the above until no more new variables can be added to $$T$$
• This is therefore decidable
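The fixpoint loop for $$T$$ fits in a few lines of Python. In this sketch (its own convention, not from the lecture) rules are `(variable, body)` pairs where the body is a string whose uppercase letters are variables and everything else is a terminal:

```python
def cfg_language_empty(rules, start):
    """Decider for E_CFG: compute the set T of generating variables --
    those deriving some string of terminals -- by a fixpoint loop, then
    report empty iff the start variable never makes it into T."""
    T = set()
    changed = True
    while changed:                       # at most one pass per variable added
        changed = False
        for var, body in rules:
            if var not in T and all(s in T or not s.isupper() for s in body):
                T.add(var)
                changed = True
    return start not in T
```

Each pass either adds a variable or terminates the loop, so the whole procedure halts after at most (number of variables) passes.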

### Equivalence

• $$EQ_{CFG} = \left\{ \langle G_1, G_2 \rangle | L(G_1) = L(G_2) \right\}$$
• Chapter 5, will be proven that is not decideable

## Lecture 32: Acceptance Problem of Turing Machines

• Given a TM $$M$$ and a string $$w$$, know if $$M$$ accepts $$w$$
• For Machine Program $$M$$ in lecture, and $$w = ab$$, $$w = abc$$ and $$w = abcab$$ – works, but may not always
• Then:
• $$U$$ – a machine that takes in an encoding
• If encoding is invalid, reject
• Simulate $$M$$ on $$w$$
• If accepts, accept
• If rejects, reject
• Not a decider! But, is $$U$$ a TM?
• $$S_{accept}$$ of $$U$$ is exactly $$A_{TM}$$
• $$S_{reject}$$ – encodings where $$M$$ rejects $$w$$, plus invalid encoding strings
• Then $$A_{TM}$$ is Turing-recognizable, by $$U$$
• $$A_{TM}$$ is undecidable – proof by contradiction
• Assume that $$A_{TM}$$ is decidable, then there is a decider $$H$$ for $$A_{TM}$$
• Then $$H$$ accepts if $$M$$ accepts
• Reject otherwise.
• Then make a new machine, $$X$$ using $$H$$
• $$X$$ takes an encoding $$\langle M \rangle$$ of a TM and runs $$H$$ on $$\langle M, \langle M \rangle \rangle$$, i.e., asks whether $$M$$ accepts its own encoding
• If $$H$$ accepts, reject, else if $$H$$ rejects, accept
• $$X$$ rejects if $$M$$ accepts $$\langle M \rangle$$
• $$X$$ accepts if $$M$$ does not accept $$\langle M \rangle$$
• Will $$X$$ accept itself? No! Contradiction
• Accepts if $$X$$ rejects $$\langle X \rangle$$
• Rejects if $$X$$ accepts $$\langle X \rangle$$
• Thus, $$A_{TM}$$ is undecidable! (Turing, Church, 1936)
• Youtube video for Halting Problem
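The self-reference trick can be illustrated in Python (purely a toy analogy, not part of the lecture): model "machines" as functions and "encodings" as the function objects themselves. Given any claimed decider `h`, the contrarian `x` built from it does the opposite of what `h` predicts about a machine run on itself, so `h` is provably wrong about `x` on input `x`:

```python
def make_x(h):
    """Build the contrarian machine X from the proof: on input <M>, X does
    the opposite of what h claims M does on its own encoding. (In the real
    proof X loops/rejects instead of returning; returning keeps the toy runnable.)"""
    def x(machine):
        return not h(machine, machine)
    return x

def h(machine, w):
    """A candidate 'decider' for acceptance -- this one just guesses
    'never accepts'. Any candidate suffers the same fate on its own x."""
    return False

x = make_x(h)
```

Here `h` claims `x` does not accept `⟨x⟩`, yet `x(x)` returns True: whatever decider is plugged in, it misjudges the contrarian built from it, which is the contradiction.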

## Lecture 33: Review of HW 3/4

• Problem 1 – use recursive structure, likewise for problem 2
• PDA for $$L_1$$
• Push an $$A$$ for each $$a$$
• Then have a recognizer for one $$b$$, two $$b$$’s and three $$b$$’s
• Uses non-deterministic transitions, similar for $$L_2$$
• Do algebra on inequalities

### Chapter 4, Continued

• Problems that are not Turing-recognizable
• in contrast, the Turing-recognizable languages are all the languages that can be recognized by real computers
• There exists an infinite number of Turing-unrecognizable languages
• The number of languages is greater than the number of Turing machines
• even though there is an infinite number of Turing machines
• Definitions today
Function $$f$$
$$f$$ is a function from set $$A$$ to set $$B$$ if $$f$$ maps each element of $$A$$ to exactly one element of $$B$$. $$A \neq \emptyset$$, $$B \neq \emptyset$$; $$f: A \to B$$
Function $$f$$ is one-to-one (injective)
If $$\forall a_1, a_2 \in A$$, $$a_1 \neq a_2$$ implies $$f(a_1) \neq f(a_2)$$
Function $$f$$ is onto (surjective)
if for each $$b \in B$$, there is at least one $$a$$ such that $$f(a) = b$$
Function $$f$$ is correspondence (bijective)
iff $$f$$ is both injective and surjective
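For finite sets the three definitions are directly checkable; a small Python sketch (illustrative helper names, not from the lecture):

```python
def is_injective(f, A):
    """One-to-one: distinct elements of A get distinct images."""
    images = [f(a) for a in A]
    return len(images) == len(set(images))

def is_surjective(f, A, B):
    """Onto: every element of B is hit by some element of A."""
    return set(B) <= {f(a) for a in A}

def is_bijective(f, A, B):
    """A correspondence is both one-to-one and onto."""
    return is_injective(f, A) and is_surjective(f, A, B)
```

For infinite sets these checks become proofs rather than loops, which is exactly what the size arguments in the next lecture do.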

## Lecture 34: Chapter 4, Continued – Problems that aren’t Turing Recognizeable

• Theorem 1: if $$f: A \to B$$ is a correspondence, then $$f^{-1}: B \to A$$ is also a correspondence
• Sets $$A$$ and $$B$$ have the same size, $$|A| = |B|$$, iff there is a correspondence between $$A$$ and $$B$$ – applies to both finite and infinite sets
• Set $$A$$ has a smaller size than $$B$$, $$|A| < |B|$$, if there is a one-to-one function from $$A$$ to $$B$$, but no correspondence between $$A$$ and $$B$$ (again, for both finite and infinite sets)
• For an infinite set $$A$$:
• If there is a correspondence between $$A$$ and $$\mathbb{N} = \{1, 2, 3, \ldots\}$$, then $$|A| = |\mathbb{N}|$$ and $$A$$ is countably infinite.
• Theorem 2: If there is no correspondence between $$A$$ and $$\mathbb{N}$$, then $$|A| > |\mathbb{N}|$$, and $$A$$ is uncountably infinite
• Ex: $$E = \{ x \mid x \in \mathbb{Z}, x \bmod 2 \equiv 0 \}$$
• Countably infinite. Several options for mapping.
• $$O = \{ x \mid x \in \mathbb{Z}, x \bmod 2 \equiv 1 \}$$ – countably infinite
• $$\mathbb{Z} = E \cup O$$ – Still countably infinite
• Several options for correspondence
• $$S$$ – All possible strings over $$\Sigma$$
• Countably infinite if $$\Sigma$$ is finite
• $$B$$ – All possible infinite sequences over $$\Sigma = \{0, 1\}$$
• Uncountably infinite.
• By Contradiction. $$f: N \to B$$ is a correspondence
• Then there exists a $$b \in B$$ whose symbol at each index $$n$$ is the opposite of the symbol at index $$n$$ of $$f(n)$$
• Contradiction – there is no $$n$$ such that $$f(n) = b$$, so $$f$$ is not onto.
• Theorem 3: If $$|A| = |\mathbb{N}|$$ and $$B \subseteq A$$ is infinite, then $$|B| = |\mathbb{N}|$$
• If $$|A| = |B|$$ and $$|A| > |\mathbb{N}|$$, then $$|B| > |\mathbb{N}|$$
• Number of languages vs number of TMs
• $$M$$ is all possible TMs; $$L$$ is all possible languages
• $$|M| = |\mathbb{N}|$$
• $$|M| = |M_C|$$ where $$M_C$$ is the set of all possible encoding strings
• $$|M_C| = |\mathbb{N}|$$ as $$M_C \subseteq S$$
• $$|L| > |\mathbb{N}|$$
• $$|L| = |B| > |\mathbb{N}|$$
• A correspondence must be formed between $$L$$ and $$B$$ – See Slides
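The diagonal sequence from the uncountability argument can be sketched directly, modelling an infinite 0/1 sequence as a function from index to bit (0-based here, versus the lecture's $$\mathbb{N}$$ starting at 1; the names are this sketch's own):

```python
def diagonal(f):
    """Cantor's diagonal: given a purported correspondence f, where f(n)
    is the n-th infinite 0/1 sequence, build the sequence b that differs
    from every f(n) at index n -- so b is missed by the enumeration."""
    def b(n):
        return 1 - f(n)(n)
    return b

# toy enumeration: f(n) is the constant sequence of bit n % 2
f = lambda n: (lambda i, bit=n % 2: bit)
b = diagonal(f)
```

No matter how the enumeration `f` is chosen, `b(n) != f(n)(n)` for every `n`, so `b` cannot equal any `f(k)`.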

## Lecture 35: Chapter 4, Continued – A Specific Turing-Unrecognizable Language

• Theorem: A language $$L$$ is decidable iff $$L$$ is Turing-recognizable and $$\overline{L}$$ is Turing-recognizable ($$L$$ is co-Turing-recognizable).
• Proof (forwards):
• Given a decider $$D$$ for $$L$$: given $$x$$, it accepts if $$x \in L$$ and rejects if $$x \not\in L$$
• A TM $$M_1$$ for $$L$$ need only accept when $$x \in L$$; when $$x \not\in L$$ it may reject or loop
• Likewise a TM $$M_2$$ for $$\overline{L}$$
• For $$M_1$$, use $$D$$ itself – a decider is already a recognizer
• For $$M_2$$, use the decider with answers flipped: if $$D$$ accepts, reject; if $$D$$ rejects, accept
• Proof (backwards):
• While true:
• Sim $$M_1$$ for one step on $$x$$
• Sim $$M_2$$ for one step on $$x$$
• If $$M_1$$ accepts, accept
• If $$M_2$$ accepts, reject
• Then $$\overline{A_{TM}}$$ is Turing-unrecognizable
• Proof: Assume $$\overline{A_{TM}}$$ is Turing-recognizable. $$A_{TM}$$ is also Turing-recognizable. Then $$A_{TM}$$ would be decidable. $$\textreferencemark$$ $$\overline{A_{TM}}$$ is thus Turing-unrecognizable.
• Closure properties of decidable languages:
• Union
• Concatenation
• Star
• Intersection
• Complement

## Lecture 36: Chapter 5 – The Reduction Method

• Consider two problems, $$X$$ and $$Y$$
• Whether either is decidable, or is Turing-recognizable
• Assume a solution for $$Y$$ (a decider or TM)
• We say that $$X$$ can be reduced to $$Y$$ ($$X$$ is reducible to $$Y$$), $$X \leq Y$$, if we can design a solution to $$X$$ using the solution to $$Y$$.
• Assuming a decider for $$Y$$
• Decider for $$X$$ is built on this.
• A general reduction method
• The Mapping Reduction Method:
• Given an assumed decider to $$Y$$
• A Mapping function $$f$$ is used to take the input for $$X$$ and convert it to input for $$Y$$
• $$f$$ must be computable: there is a TM that halts on every input string $$x$$ and outputs $$f(x)$$ on its tape
• Application:
• If $$X \leq Y$$
• If $$Y$$ is decidable, then $$X$$ is decidable
• If $$X$$ is undecidable, then $$Y$$ is also undecidable
• If $$Y$$ is Turing-recognizable, then $$X$$ is also (see $$A_{TM}$$)
• If $$X$$ is Turing-unrecognizable, then $$Y$$ is likewise
• Example: $$X = EQ_{DFA} = \left\{ \langle D_1, D_2 \rangle \mid L(D_1) = L(D_2) \right\}$$, with $$Y = E_{DFA}$$ (emptiness)
• Decider for $$EQ_{DFA}$$
• Input is Encoding of 2 DFAs
• Sub-machine/Mapping function:
• Takes encoding, writes $$\langle C \rangle$$ for $$Y$$ – See previous proof of $$EQ_{DFA}$$
• $$Y = HALT_{TM}$$, $$X = A_{TM}$$
• $$X$$ is proven to be undecidable
• Show $$X$$ is reducible to $$Y$$
• $$X$$’s decider takes an encoding of $$\langle M, w \rangle$$ and decides acceptance and rejection
• If $$Y$$ accepts, simulate $$M$$ on $$w$$, finally accepting if $$M$$ accepts, rejecting otherwise, including if $$Y$$ rejects
• Formally proven:
• Assume, for a contradiction, that $$HALT_{TM}$$ is decidable and $$Y$$ is its decider.
• Design a decider $$X$$ on input $$\langle M, w \rangle$$
• Check validity of $$\langle M, w \rangle$$
• Reject if invalid
• Simulate $$Y$$ on $$\langle M, w \rangle$$
• If $$Y$$ accepts, simulate $$M$$ on $$w$$
• If $$M$$ accepts, accept.
• If $$M$$ rejects, reject.
• If $$Y$$ rejects, reject.
• Then, $$A_{TM}$$ is decidable. $$\textreferencemark$$
• Therefore, as the assumption is wrong, $$HALT_{TM}$$ is undecidable.
• Mapping Reduction for the above
• Define $$f$$ such that if $$Y$$ rejects, $$X$$ rejects, if $$Y$$ accepts, $$X$$ accepts
• To be continued
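The reduction $$A_{TM} \leq HALT_{TM}$$ above can be sketched with toy "machines" (Python functions returning True for accept, False for reject; `halts` and `run` are hypothetical stand-ins for the assumed decider $$Y$$ and the universal simulator):

```python
def make_accept_decider(halts, run):
    """If halts(M, w) decides the halting problem, this decides A_TM:
    only simulate M on w after Y confirms the simulation will finish."""
    def accepts(M, w):
        if not halts(M, w):
            return False        # M loops on w, so it certainly doesn't accept
        return run(M, w)        # safe: M halts, simulation returns M's verdict
    return accepts

# toy universe: one machine that loops, one that accepts even-length strings
LOOPER = "looper"
even = lambda w: len(w) % 2 == 0

def halts(M, w):                # toy oracle, valid only for these two machines
    return M is not LOOPER

def run(M, w):
    return M(w)

accepts = make_accept_decider(halts, run)
```

Since no real `halts` can exist (it would make $$A_{TM}$$ decidable), the wrapper is exactly the contradiction in the proof.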

## Lecture 37: Chapter 5

• If $$X \leq Y$$, and $$X$$ is undecidable, then $$Y$$ is undecidable.
• $$X = A_{TM}$$, $$Y = \overline{E_{TM}}$$
• Proof. For the purpose of contradiction, assume that $$Y$$ is decidable, and we have a decider for it.
• Then define $$f$$ taking input for $$X$$ and mapping it to input for $$Y$$ (this yields a decider for $$X$$)
• Accept if the decider for $$Y$$ accepts; likewise for rejection
• Most important step: Defining the Mapping
• $$\langle M, w \rangle \in A_{TM} \iff f(\langle M, w \rangle) = \langle T \rangle \in \overline{E_{TM}}$$
• In: $$L(T) \neq \emptyset$$
• Not in: $$L(T) = \emptyset$$ (an invalid $$\langle M, w \rangle$$ maps to an invalid $$\langle T \rangle$$)
• Function $$f$$ as follows:
• $$L(T) = \begin{cases} \neq \emptyset & \text{if } M \text{ accepts } w \\ \emptyset & \text{otherwise} \end{cases}$$
• $$f$$ outputs $$\langle T \rangle$$
• Simulate decider for $$\overline{E_{TM}}$$ on $$\langle T \rangle$$
• If accepts, accept; reject otherwise
• $$T$$ works by taking any string and simulating $$M$$ on $$w$$, accepting if $$M$$ accepts, rejecting if it rejects, looping if it loops
• $$X = \overline{A_{TM}}$$, $$Y = EQ_{TM}$$
• For the purpose of contradiction, assume $$Y$$ is recognizable and we have a TM for $$Y$$
• Then $$f$$ goes from $$\langle M, w \rangle$$ to $$\langle T_1, T_2 \rangle$$
• $$T_1$$
• takes in $$t_1$$
• Always rejects
• $$T_2$$
• takes in $$t_2$$
• Simulate $$M$$ on $$w$$
• Accept if accept
• reject otherwise
• Contradiction, as $$X$$ is unrecognizable
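The machine constructions behind both mappings in this lecture can be sketched with the same toy model as before (functions as machines; `run` is a hypothetical simulator; all names are this sketch's own):

```python
def make_T(M, w, run):
    """Mapping f for A_TM <= not-E_TM: T ignores its own input and behaves
    exactly as M does on w. Hence L(T) is everything if M accepts w, and
    empty otherwise -- so <M, w> in A_TM iff <T> in not-E_TM."""
    def T(_any_input):
        return run(M, w)        # accept iff M accepts w, whatever the input
    return T

def make_pair(M, w, run):
    """Mapping for the EQ_TM argument: T1 rejects everything, T2 behaves
    as M on w, so L(T1) = L(T2) iff M does not accept w."""
    def T1(_any_input):
        return False            # L(T1) is always empty
    return T1, make_T(M, w, run)

run = lambda M, w: M(w)
even = lambda w: len(w) % 2 == 0
```

In the real proof $$f$$ outputs an encoding of $$T$$ rather than the machine itself, but the closure over $$\langle M, w \rangle$$ captures the same idea: $$T$$'s language depends only on whether $$M$$ accepts $$w$$.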