I've been looking into generating text.
What I've learned so far is that I'll have to use word-level Markov text generation. I've found a few examples of that on this site, here.
Knowing this probably wouldn't work as-is, I tried it anyway and copied it into Processing, which gave errors about not finding the correct libraries.
Is there anyone out there who has done this, or who can point me in a good direction to learn more about text generation with Processing? Or even somebody who wants to do a collab, it being open source and all.
What I want isn't that different from the example on the site, except that it should be word-based instead of letter-based, and the database should be built from words I put in there myself. That last part could be swapped for another source, which I'm still brainstorming about; it could really be anything involving words. If you have any ideas, please feel free to contribute.
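To make it concrete, what I'm after is basically a table that maps each word to the words that can follow it. A rough sketch of the structure I have in mind (the names are just placeholders I made up, not code from the example):

// every word points to the list of words that have followed it in the input
HashMap<String, ArrayList<String>> followers = new HashMap<String, ArrayList<String>>();

void learn(String text) {
  String[] words = text.split(" ");
  for (int i = 0; i < words.length - 1; i++) {
    if (!followers.containsKey(words[i])) {
      followers.put(words[i], new ArrayList<String>());
    }
    followers.get(words[i]).add(words[i + 1]);
  }
}
// e.g. learn("the cat sat on the mat"); gives "the" -> [cat, mat], "cat" -> [sat], ...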
I'll edit this post when I know more from other forums, so that when there's a solution I can pass it on to others.
EDIT: SOLUTION (CLICK-BASED GENERATION)
// Hashtable and Vector are not auto-imported by Processing, so import them explicitly
import java.util.Hashtable;
import java.util.Vector;

String inputFile = "Sonnet51.txt";  // source text, expected in the sketch's data folder
Markov markovChain1;                // the word-level Markov chain
String sentence = "";               // the sentence currently shown on screen
void setup() {
  size(900, 500);
  background(0);
  markovChain1 = new Markov();

  // load text
  String[] input = loadStrings(inputFile);
  for (String line : input) {
    markovChain1.addWords(line);
    println(line);
  }

  // generate a sentence!
  sentence = markovChain1.generateSentence();
  println("-------------");
}
void draw() {
  background(0);
  // noLoop();
  fill(255);
  text(sentence, 19, 190);
  fill(2, 255, 2);
  text("Please press mouse", 19, height-33);
}
void mousePressed() {
  // generate a sentence!
  sentence = markovChain1.generateSentence();
  println(sentence);
}
// ==========================================
class Markov {
  // maps each word to the list of words that have followed it in the input;
  // the special keys "_start" and "_end" hold possible sentence starts and ends
  Hashtable<String, Vector<String>> markovChain =
    new Hashtable<String, Vector<String>>();

  Markov() {
    markovChain.put("_start", new Vector<String>());
    markovChain.put("_end", new Vector<String>());
  }
  // Split a line into words and record, for every word, which word follows it.
  // The first word of a line is also stored under "_start", the last under "_end".
  void addWords(String line) {
    String[] words = line.split(" ");
    for (int i = 0; i < words.length; i++) {
      if (i == 0) {
        markovChain.get("_start").add(words[i]);
      }
      if (i == words.length-1) {
        markovChain.get("_end").add(words[i]);
      }
      if (i < words.length-1) {
        // every word except the last gets the next word added to its suffix list
        Vector<String> suffix = markovChain.get(words[i]);
        if (suffix == null) {
          suffix = new Vector<String>();
          markovChain.put(words[i], suffix);
        }
        suffix.add(words[i+1]);
      }
    }
  }
  // Build a sentence by starting from a random "_start" word and repeatedly
  // picking a random follower, until a word ends in '.' or has no followers.
  String generateSentence() {
    String newPhrase = "";
    Vector<String> startWords = markovChain.get("_start");
    String nextWord = startWords.get(int(random(startWords.size())));
    newPhrase += " " + nextWord;

    while (nextWord.charAt(nextWord.length()-1) != '.') {
      Vector<String> wordSelection = markovChain.get(nextWord);
      if (wordSelection == null) {
        break;  // no recorded follower for this word, stop here
      }
      nextWord = wordSelection.get(int(random(wordSelection.size())));
      newPhrase += " " + nextWord;
    }
    return newPhrase;
  }
} // class
//
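A note on the collections used: Hashtable and Vector are the older synchronized classes from java.util, which is why they need the explicit imports at the top. Processing's auto-imported HashMap and ArrayList would work just as well here without those imports.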
Use the following text for the generator:
Thus can my love excuse the slow offence
Of my dull bearer when from thee I speed
From where thou art why should I haste me thence
Till I return of posting is no need
O! what excuse will my poor beast then find
When swift extremity can seem but slow
Then should I spur though mounted on the wind.
In winged speed no motion shall I know.
Then can no horse with my desire keep pace.
Therefore desire of perfect'st love being made.
Shall neigh no dull flesh in his fiery race;
But love for love thus shall excuse my jade.
Since from thee going, he went wilful-slow
Towards thee I'll run, and give him leave to go.
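For anyone trying to run this: save the lines above as Sonnet51.txt in the sketch's data folder (Sketch > Add File... in the Processing IDE does this for you), since that is where loadStrings() looks for the file.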
It works completely, and now I can start changing it to make bigger texts. If anybody has ideas, let me know, but this case is solved for me. Thanks to ChrisIr from the Processing forum.
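The first change I'm thinking of for bigger texts is simply feeding several files into the same chain before generating. A rough sketch, with the second file name only as a placeholder:

// feed several source files into one chain (extra file names are placeholders)
String[] inputFiles = { "Sonnet51.txt", "Sonnet18.txt" };

void loadAll(Markov chain) {
  for (String f : inputFiles) {
    for (String line : loadStrings(f)) {
      chain.addWords(line);
    }
  }
}
// in setup(): call loadAll(markovChain1); instead of the single-file loop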