# Theory of Computation - PowerPoint by theoryman

## Theory of Computation

• What types of things are computable?
• How can we demonstrate what things are computable?
## Foundations

• Cantor’s set theory: contradictions
  – There are multiple sizes of infinity.
  – There exists at least one set bigger than the universal set.
• Hilbert’s rigor
  – Find an algorithm that will generate proofs for all true statements.
• Gödel’s Incompleteness Theorem
  – No such algorithm exists.
  – Even worse, there exist true statements for which no proof can ever be found.
  – In any mathematical system there will either be true statements that cannot be proven or false statements that can.
## Computability

• The question then becomes: can we at least find an algorithm that can find any proof that does exist?
  – Church, Kleene, and Post showed the answer is no.
  – Turing developed the “universal algorithm machine”, now called the Turing machine.
## Languages and grammars

• A language is a set of strings.
• A grammar is the set of rules that define which strings are valid members of a language.
## Rule structure

• Rules consist of three types of symbols:
  – Terminals are symbols that cannot be further expanded.
  – Nonterminals are symbols that are not part of the language’s alphabet, but can be expanded into larger substrings.
  – ε, which represents the empty string.
• Assume the alphabet is {a, b}. An example grammar might be:

  S → aSb
  S → ba

  where a and b are terminals and S is a nonterminal.
• Note that rules can be defined recursively.
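The recursive rules above can be explored directly. Below is an illustrative sketch (not from the slides) that expands S using the two rules, up to a bounded number of rule applications, and lists the strings the grammar generates:

```python
# Illustrative sketch: expand S with the rules S -> aSb and S -> ba,
# up to a bounded number of rule applications, to list generated strings.
def generate(max_depth):
    results = set()

    def expand(form, depth):
        if "S" not in form:            # all terminals: a generated string
            results.add(form)
            return
        if depth == 0:
            return
        for rhs in ("aSb", "ba"):      # apply each rule to the first S
            expand(form.replace("S", rhs, 1), depth - 1)

    expand("S", max_depth)
    return sorted(results)

print(generate(3))  # ['aababb', 'abab', 'ba']
```

Notice how the recursive rule S → aSb wraps each shorter string in a matching a…b pair.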
## Formal grammars

• A grammar is thus the set of generation rules that define the valid strings of the language.
• Any string that can be generated by the grammar is a valid string in the language, and any valid string in the language can be generated by the grammar.
• The type of grammar used by a language is determined by the restrictiveness of the rules.
## Regular grammars

• Regular grammars are ones where the rules have a nonterminal on the left and, on the right, either:
  – the empty string,
  – a single terminal, or
  – a single terminal and a single nonterminal.
• Left regular grammars are ones where the nonterminal is to the left of the terminal; in right regular grammars, the opposite is true.
## Example regular grammars

Grammar 1:

  S → aB
  B → bB
  B → ε

What language does this define? ab*

Grammar 2:

  S → aB
  B → bB
  B → a

What language does this define? ab*a
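Since ab* and ab*a are themselves regular expressions, the two answers can be spot-checked with Python’s `re` module (the test strings are illustrative):

```python
import re

# Spot-check: the first grammar should generate ab*,
# the second grammar ab*a.
left = re.compile(r"ab*")
right = re.compile(r"ab*a")

assert all(left.fullmatch(s) for s in ["a", "ab", "abbb"])
assert all(right.fullmatch(s) for s in ["aa", "aba", "abbba"])
assert not right.fullmatch("ab")   # the second grammar's strings end in a
```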
## State machines

• Designed to allow for the modeling of transitions from one state to another.
• Consist of states representing the current state of the machine and transitions between those states.
  – Entry Action - the action to perform when the state is entered
  – Exit Action - the action to perform when the state is exited
  – Input Action - an action to perform in a particular state with a particular input (usually involves following a transition)
• Have a start state and an acceptor state.
• Transitions define how the machine changes from one state to another. They usually define the condition(s) under which the machine changes states.
  – Transition Action - the action to perform when following a transition
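The action types above can be sketched in code. This is a toy example (the states and actions are invented for illustration) showing the exit action, transition action, and entry action firing in order:

```python
# Toy sketch (states and actions invented): when a transition is followed,
# the old state's exit action, the transition action, and the new state's
# entry action fire in that order.
class StateMachine:
    def __init__(self, start):
        self.state = start
        self.log = []

    def transition(self, new_state, action=None):
        self.log.append(f"exit {self.state}")       # exit action of old state
        if action:
            self.log.append(f"do {action}")         # transition action
        self.state = new_state
        self.log.append(f"enter {new_state}")       # entry action of new state

m = StateMachine("idle")
m.transition("running", action="start-motor")
print(m.log)  # ['exit idle', 'do start-motor', 'enter running']
```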
## Finite Automata

• A particular type of state machine that has no actions at all: only states and transitions.
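A finite automaton for the earlier example language ab* can be written down in a few lines; here is a minimal sketch (state names are invented):

```python
# Minimal finite automaton sketch: only states and transitions, no actions.
# This one accepts the regular language ab*.
TRANSITIONS = {("start", "a"): "seen_a", ("seen_a", "b"): "seen_a"}
ACCEPT = {"seen_a"}

def accepts(s):
    state = "start"
    for ch in s:
        state = TRANSITIONS.get((state, ch))   # follow the transition
        if state is None:                      # no transition defined: reject
            return False
    return state in ACCEPT

print(accepts("abb"), accepts("ba"))  # True False
```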
## Limits of regular languages and finite automata

• What types of languages can’t FAs accept? In other words, what limits are there on the complexity of regular languages?
• FAs lack memory, so you can’t have one part of a regular language depend on another part.
## Context-free grammars

• In CFGs, the rules all take the form:

  N → w

  where N is a single nonterminal and w is some finite string of terminals and nonterminals, in any order.
• Whereas in regular languages, nonterminals were restricted as to where they could appear in the rules, now they can appear anywhere. Hence the term context-free.

  S → aSb
  S → ab
## CFG example

Grammar 1:

  S → aB
  S → bA
  A → a
  A → aS
  A → bAA
  B → b
  B → bS
  B → aBB

Grammar 2:

  S → U
  S → V
  U → TaU
  U → TaT
  V → TbV
  V → TbT
  T → aTbT
  T → bTaT
  T → ε
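Grammar 1 appears to generate exactly the nonempty strings with equal numbers of a’s and b’s. A small enumerator (a sketch; the rule table is transcribed from the slide, the helper names are illustrative) can spot-check that property over short leftmost derivations:

```python
# Hedged spot-check: grammar 1 appears to generate nonempty strings with
# equal numbers of a's and b's.  Enumerate short leftmost derivations and
# test that property.
RULES = {
    "S": ["aB", "bA"],
    "A": ["a", "aS", "bAA"],
    "B": ["b", "bS", "aBB"],
}

def derive(max_steps):
    results = set()

    def expand(form, steps):
        nt = next((c for c in form if c in RULES), None)
        if nt is None:                      # no nonterminals left: a sentence
            results.add(form)
            return
        if steps == 0:
            return
        for rhs in RULES[nt]:               # expand the leftmost nonterminal
            expand(form.replace(nt, rhs, 1), steps - 1)

    expand("S", max_steps)
    return results

assert all(s.count("a") == s.count("b") for s in derive(6))
```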
## Pushdown Automata

• Pushdown automata extend FA’s in one very important way: we are now given a stack on which we can store information. This works like a standard LIFO stack, where information gets pushed onto the top and popped off the top.
• This means that we can now choose transitions based not just on the input, but also based on what’s on the top of the stack.
• We also now have transition actions available to us. We can either push a specific element onto the top of the stack, or pop the top element off.
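For example, the stack lets a pushdown recognizer handle aⁿbⁿ, which no finite automaton can. A sketch (function and state names are illustrative):

```python
# Sketch: with a stack, a pushdown recognizer accepts a^n b^n (n >= 1),
# which no finite automaton can.
def pda_accepts(s):
    stack = []
    state = "push"                     # reading a's
    for ch in s:
        if state == "push" and ch == "a":
            stack.append("A")          # transition action: push a marker
        elif ch == "b" and stack:
            state = "pop"              # reading b's
            stack.pop()                # transition action: pop a marker
        else:
            return False               # no valid transition
    return state == "pop" and not stack   # accept: every a matched by a b

print(pda_accepts("aaabbb"), pda_accepts("aab"))  # True False
```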
## Limits on PDAs and CFGs

• Adding memory is nice, but there are still significant limits on what a PDA can accomplish.
• Can a PDA be constructed that can do arithmetic?
• How, or why not?
## The Turing Machine

• Four components:
  – A tape of infinite length, divided into cells, each of which can contain one character of data