COS 126 Markov Model of Natural Language
Programming Assignment Checklist
Use a Markov chain to create a statistical model of a piece of English text. Simulate the Markov chain to generate stylized pseudo-random text.
Perspective. In the 1948 landmark paper A Mathematical Theory of Communication, Claude Shannon founded the field of information theory and revolutionized the telecommunications industry, laying the groundwork for today's Information Age. In this paper, Shannon proposed using a Markov chain to create a statistical model of the sequences of letters in a piece of English text. Markov chains are now widely used in speech recognition, handwriting recognition, information retrieval, data compression, and spam filtering. They also have many scientific computing applications, including the GeneMark algorithm for gene prediction, the Metropolis algorithm for measuring thermodynamic properties, and Google's PageRank algorithm for web search. For this assignment, we consider a whimsical variant: generating stylized pseudo-random text.
Markov model of natural language. Shannon approximated the statistical structure of a piece of text using a mathematical model known as a Markov model. A Markov model of order 0 predicts that each letter in the alphabet occurs with a fixed probability. We can fit a Markov model of order 0 to a specific piece of text by counting the number of occurrences of each letter in that text, and using those frequencies as probabilities. For example, suppose that the input text is "gagggagaggcgagaaa". Then, the Markov model of order 0 predicts that each letter is 'a' with probability 7/17, 'c' with probability 1/17, and 'g' with probability 9/17 because these are the fractions of times each letter occurs. The following sequence of characters is a typical example generated from this model:

g a g g c g a g a a g a g a a g a a a g a g a g a g a a a g a g a a g ...

A Markov model of order 0 assumes that each letter is chosen independently. This independence does not coincide with the statistical properties of English text because there is a high correlation among successive characters in a word or sentence. For example, 'w' is more likely to be followed by 'e' than by 'u', while 'q' is much more likely to be followed by 'u' than by 'e'.
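For concreteness, here is a minimal sketch, separate from the assignment API, of fitting an order-0 model by counting character frequencies; run on the text above, it prints the three probabilities 7/17, 1/17, and 9/17. The class name OrderZero is just for illustration.

import java.util.TreeMap;

// Fit an order-0 model: count each character's occurrences and use
// frequency / length as its probability. Illustration only.
public class OrderZero {
    public static void main(String[] args) {
        String text = "gagggagaggcgagaaa";
        TreeMap<Character, Integer> counts = new TreeMap<>();
        for (char c : text.toCharArray())
            counts.merge(c, 1, Integer::sum);
        for (char c : counts.keySet())
            System.out.println(c + ": " + counts.get(c) + "/" + text.length());
    }
}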
We obtain a more refined model by allowing the probability of choosing each successive letter to depend on the preceding letter or letters. A Markov model of order k predicts that each letter occurs with a fixed probability, but that probability can depend on the previous k characters, which we refer to as a k-gram. For example, suppose that the input text has 100 occurrences of "th", with 60 occurrences of "the", 25 occurrences of "thi", 10 occurrences of "tha", and 5 occurrences of "tho". Then, the Markov model of order 2 predicts that the letter immediately following any occurrence of "th" is 'e' with probability 3/5, 'i' with probability 1/4, 'a' with probability 1/10, and 'o' with probability 1/20.
A brute-force solution. Claude Shannon proposed the following brute-force scheme to generate text according to a Markov model of order 1:
“To construct [a Markov model of order 1], for example, one opens a book at random and selects a letter at random on the page. This letter is recorded. The book is then opened to another page and one reads until this letter is encountered. The succeeding letter is then recorded. Turning to another page this second letter is searched for and the succeeding letter recorded, etc. It would be interesting if further approximations could be constructed, but the labor involved becomes enormous at the next stage.”

Your task is to write a Java program to automate this laborious task, in an efficient manner; Shannon's approach is prohibitively slow when the length of the input text is large.
Markov model data type. Create an immutable data type MarkovModel to represent a Markov model of order k from a given text string. The data type must implement the following API:
public class MarkovModel {

    // creates a Markov model of order k for the specified text
    public MarkovModel(String text, int k)

    // returns the order k of this Markov model
    public int order()

    // returns a string representation of the Markov model (as described below)
    public String toString()

    // returns the number of times the specified kgram appears in the text
    public int freq(String kgram)

    // returns the number of times the character c follows the specified
    // kgram in the text
    public int freq(String kgram, char c)

    // returns a random character that follows the specified kgram in the text,
    // chosen with weight proportional to the number of times that character
    // follows the specified kgram in the text
    public char random(String kgram)

    // tests this class by directly calling all instance methods
    public static void main(String[] args)
}

The following paragraphs describe the API requirements in more detail.
Include one line for each k-gram that appears in the text. Each line contains the k-gram, followed by a colon, followed by each character that appears in the text immediately after that k-gram and the number of times it appears, with a space between each component. The k-grams must appear in lexicographic order; the characters associated with each k-gram must appear in ASCII order. For example, here is the string representation of the Markov model of order 2 for the text "gagggagaggcgagaaa":

aa: a 1 g 1
ag: a 3 g 2
cg: a 1
ga: a 1 g 4
gc: g 1
gg: a 1 c 1 g 1
Note that the frequency of "ag" is 5 (and not 4) because we treat the string as circular. The frequencies and the resulting conditional probabilities for this model are:

                  frequency of     probability that
                    next char        next char is
kgram   freq      a    c    g       a     c     g
--------------------------------------------------
 aa       2       1    0    1      1/2    0    1/2
 ag       5       3    0    2      3/5    0    2/5
 cg       1       1    0    0       1     0     0
 ga       5       1    0    4      1/5    0    4/5
 gc       1       0    0    1       0     0     1
 gg       3       1    1    1      1/3   1/3   1/3
--------------------------------------------------
         17       7    1    9
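Here is one possible way to tabulate these counts in the MarkovModel constructor. This is only a sketch: it assumes java.util.TreeMap in place of the provided ST.java, and the class name KgramCounts is just for illustration. Appending the first k characters to the text implements the circular-string convention.

import java.util.TreeMap;

public class KgramCounts {
    public static void main(String[] args) {
        String text = "gagggagaggcgagaaa";
        int k = 2;
        // Treat the text as circular by appending its first k characters.
        String circ = text + text.substring(0, k);
        TreeMap<String, Integer> kgramFreq = new TreeMap<>();  // k-gram -> count
        TreeMap<String, Integer> nextFreq  = new TreeMap<>();  // k-gram + next char -> count
        for (int i = 0; i < text.length(); i++) {
            String kgram = circ.substring(i, i + k);
            kgramFreq.merge(kgram, 1, Integer::sum);
            nextFreq.merge(kgram + circ.charAt(i + k), 1, Integer::sum);
        }
        System.out.println(kgramFreq);  // {aa=2, ag=5, cg=1, ga=5, gc=1, gg=3}
        System.out.println(nextFreq);
    }
}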
Text generation client. A Markov chain is a stochastic process in which the next state depends only on the current state. For text generation, the current state is a k-gram. The next character is selected at random, using the probabilities from the Markov model. For example, if the current state is "ga" in the Markov model of order 2 discussed above, then the next character will be 'a' with probability 1/5 and 'g' with probability 4/5. The next state in the Markov chain is obtained by appending the new character to the end of the k-gram and discarding the first character. A trajectory through the Markov chain is a sequence of such states. Below is a possible trajectory consisting of 9 transitions.
trajectory:         ga --> ag --> gg --> gc --> cg --> ga --> ag --> ga --> aa --> ag
probability for a:  1/5    3/5    1/3     0      1     1/5    3/5    1/5    1/2
probability for c:   0      0     1/3     0      0      0      0      0      0
probability for g:  4/5    2/5    1/3     1      0     4/5    2/5    4/5    1/2

Treating the input text as a circular string ensures that the Markov chain never gets stuck in a state with no next characters.
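Each transition above corresponds to one call to random(kgram). Here is a sketch of one way such a method might work inside MarkovModel; the cumulative-count scan is an assumption, not a mandated technique, and StdRandom.uniform() is from the course's standard libraries (java.util.Random would work equally well).

// Select a random character that follows kgram, with probability
// proportional to its frequency: draw r uniformly from [0, freq(kgram))
// and scan characters in ASCII order until the cumulative count exceeds r.
public char random(String kgram) {
    if (freq(kgram) == 0)
        throw new IllegalArgumentException("kgram does not appear in text: " + kgram);
    int r = StdRandom.uniform(freq(kgram));
    int cumulative = 0;
    for (char c = 0; c < 128; c++) {
        cumulative += freq(kgram, c);
        if (r < cumulative) return c;
    }
    throw new AssertionError("unreachable if counts are consistent");
}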
To generate random text from a Markov model of order k, set the initial state to the first k characters of the input text and print them. Then, simulate a trajectory through the Markov chain by performing T − k transitions, printing the random character selected at each step. For example, if k = 2 and T = 11, then the following is a possible trajectory, leading to the output gaggcgagaag.
trajectory:  ga --> ag --> gg --> gc --> cg --> ga --> ag --> ga --> aa --> ag
output:      ga     g      g      c      g      a      g      a      a      g
Write a client program TextGenerator that takes two integer command-line arguments k and T; reads the input text from standard input; builds a Markov model of order k from the input text; and then, starting with the k-gram consisting of the first k characters of the input text, prints exactly T characters by simulating a trajectory through the corresponding Markov chain. Here are two sample runs:

% more input17.txt
gagggagaggcgagaaa

% java-introcs TextGenerator 2 11 < input17.txt
gaggcgagaag

% java-introcs TextGenerator 2 11 < input17.txt
gaaaaaaagag

Corner cases. You may assume that the length of the input text is at least k and that T ≥ k.
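Here is a minimal sketch of how TextGenerator might be structured, assuming the MarkovModel API above; reading all of standard input via java.util.Scanner is a stand-in for the course's StdIn.readAll().

import java.util.Scanner;

public class TextGenerator {
    public static void main(String[] args) {
        int k = Integer.parseInt(args[0]);
        int T = Integer.parseInt(args[1]);
        // Read the entire input text from standard input.
        String text = new Scanner(System.in).useDelimiter("\\A").next();
        MarkovModel model = new MarkovModel(text, k);
        String kgram = text.substring(0, k);     // initial state: first k characters
        System.out.print(kgram);
        for (int i = 0; i < T - k; i++) {        // T - k transitions
            char c = model.random(kgram);        // weighted random next character
            System.out.print(c);
            kgram = kgram.substring(1) + c;      // append c, discard first character
        }
        System.out.println();
    }
}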
Performance requirement. If k is a fixed constant, then the running time of TextGenerator must be proportional to n log n + T log n (or better), where n is the number of characters in the input text and T is the number of characters in the output.
Experimentation. Once you get the program working, test it on inputs of different sizes and with models of different orders. Does increasing the order have the effect you expect? Try your model on something that you have written or on some other text you know well. Make sure to test both long inputs (we provide several) and long outputs.
Files provided. The file markov.zip contains sample test files, the readme.txt template, the abbreviated partner readme.txt template, and ST.java.
Deliverables. Submit MarkovModel.java, TextGenerator.java, and readme.txt. If working in a pair, one student submits these files and the other submits only the abbreviated partner readme.txt. Include in your readme.txt two of the most entertaining language-modeling fragments that you discover.
Challenge for the bored 1. Extend your solution to handle Unicode text, not just ASCII.
Challenge for the bored 2. Imagine you receive a message in which some of the characters have been corrupted by noise. We represent unknown characters by the ~ symbol (and assume that the character ~ does not appear in the original text). Devise a scheme based on the Markov model to determine the most likely value for each corrupted character. Assume that unknown characters are at least k characters apart (and appear at least k characters away from the start and end of the message). Test your new method by writing a client program FixCorrupted.java that takes the model order and the noisy string as command-line arguments, builds the Markov model from text read from standard input, and prints the most likely original string:

Original:  it was the best of times, it was the worst of times.
Noisy:     it w~s th~ bes~ of tim~s, i~ was ~he wo~st of~times.

% java-introcs FixCorrupted 4 "it w~s th~ bes~ of tim~s, i~ was ~he wo~st of~times." < wiki_100k.txt
it was the best of times, it was the worst of times.
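One possible scheme, sketched below under the stated assumptions: to fix the corrupted character at position i, try every ASCII candidate c and score it by the product of the conditional probabilities of the k + 1 windows of length k + 1 that contain position i, keeping the candidate with the highest likelihood. The helper name mostLikely is hypothetical, and this is not the only valid approach.

// Hypothetical helper for FixCorrupted: returns the most likely character
// at corrupted position i, scored by the product of the conditional
// probabilities of every length-(k+1) window that contains position i.
static char mostLikely(MarkovModel model, String corrupted, int i) {
    int k = model.order();
    char best = '~';
    double bestLikelihood = -1.0;
    for (char c = 0; c < 128; c++) {
        if (c == '~') continue;                       // '~' marks corruption
        String s = corrupted.substring(0, i) + c + corrupted.substring(i + 1);
        double likelihood = 1.0;
        for (int j = i - k; j <= i; j++) {            // each window containing i
            String kgram = s.substring(j, j + k);
            if (model.freq(kgram) == 0) { likelihood = 0.0; break; }
            likelihood *= (double) model.freq(kgram, s.charAt(j + k))
                        / model.freq(kgram);
        }
        if (likelihood > bestLikelihood) {
            bestLikelihood = likelihood;
            best = c;
        }
    }
    return best;
}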
Example 1 input: Speech to Class of 2018, excerpts [full text]

Good afternoon and welcome to Opening Exercises. What a special pleasure it is to greet Princeton's Great Class of 2018! I also offer a warm welcome to our new graduate students, faculty and staff members, and all of you who are returning to campus after the summer.

Today we carry on a tradition that dates back at least to 1802, when Nassau Hall was the site of an opening exercise for Princeton students. The event switched to other sites before moving in 1929 to the University Chapel, where we gather today. Today's interfaith ceremony is far different from the Christian services that greeted students in 1929, but the chapel's soaring architecture and inspirational spaces continue to invite all of us, whatever our religious or ethical traditions might be, to reflect on the larger purposes that should guide our community as we begin another year on this glorious campus.

Today you join the ranks of students who have left their marks on the Princeton campus and the world for generations through their intellect, creativity and passion. You, the 1,312 members of the Class of 2018, are an extraordinarily accomplished, inspiring and diverse group. You hail from 46 states, as well as the District of Columbia. You come from 50 countries outside of the United States, from Chile to the Czech Republic, from Iceland to India, from Nigeria to New Zealand. You grew to become upstanding, compassionate citizens in Happy Valley, Oregon, and Niceville, Florida. You weathered the ups and downs of life in Boiling Springs, Pennsylvania, and Frostburg, Maryland. And you learned to appreciate the lyrical majesty of language in Ho Ho Kus, New Jersey, and Oologah, Oklahoma.
Example 1 output: random Eisgruber, using order 7 model, excerpts

Good afternoon and welcome to a universities around you here.

I often ask Princeton is a truly global institution. As scholars who matter most to you. And you here.

I often ask Princeton you have come to our social natures, and, more specifically, with drums and choirs and distinguished teachers, whose contributions will become upstanding.

This University's GREAT CLASS OF 2018! Welcome new members play indispensable roles in helping our University's GREAT CLASS OF 2018! I also offer insignificant, or puzzling, or uninteresting, or unsympathetic may turn out to be discourse in all disciplines here as rich with meaningful, not just of any story, can make it easy to feel without knowing exactly what he was destined to appreciate the lyrical majesty of language in Ho Ho Kus, New Jersey, and
Example 2 input: As You Like It, excerpts [full text]
[Enter DUKE SENIOR, AMIENS, and two or three Lords,
like foresters]
DUKE SENIOR Now, my co-mates and brothers in exile,
Hath not old custom made this life more sweet
Than that of painted pomp? Are not these woods
More free from peril than the envious court?
Here feel we but the penalty of Adam,
The seasons' difference, as the icy fang
And churlish chiding of the winter's wind,
Which, when it bites and blows upon my body,
Even till I shrink with cold, I smile and say
'This is no flattery: these are counsellors
That feelingly persuade me what I am.'
Sweet are the uses of adversity,
Which, like the toad, ugly and venomous,
Wears yet a precious jewel in his head;
And this our life exempt from public haunt
Finds tongues in trees, books in the running brooks,
Sermons in stones and good in every thing.
I would not change it.
AMIENS Happy is your grace,
That can translate the stubbornness of fortune
Into so quiet and so sweet a style.
DUKE SENIOR Come, shall we go and kill us venison?
And yet it irks me the poor dappled fools,
Being native burghers of this desert city,
Should in their own confines with forked heads
Have their round haunches gored.
Example 2 output: random Shakespeare, using order 6 model, excerpts [full text]
DUKE SENIOR Now, my co-mates and thus bolden'd, man, how now, monsieur Jaques,
Unclaim'd of his absence, as the holly!
Though in the slightest for the fashion of his absence, as the only wear.
TOUCHSTONE I care not for meed!
This I must woo yours: your request than your father: the time,
That ever love I broke
my sword upon some kind of men
Then, heigh-ho! sing, heigh-ho! sing, heigh-ho! sing, heigh-ho! unto the needless stream;
'Poor deer,' quoth he,
'Call me not so keen,
Because thou the creeping hours of the sun,
As man's feasts and women merely players:
Thus we may rest ourselves and neglect the cottage, pasture?
[Exit]
[Enter DUKE FREDERICK Can in his time in my heartily,
And have me go with your fortune
In all this fruit
Till than bear
the arm's end: I will through
Cleanse the uses of the way to look you.
Know you not, master,
Sighing like upon a stone another down his bravery is not so with his effigies with my food:
To speak my mind, and inquisition
And unregarded age in corners throat,
He will come hither:
He dies that hath engender'd:
And you to
the bed untreasured of the brutish sting it.