
I have a piece of code that has been working for years until today. After debugging I realized that the last token is not collected correctly. I think it is because of its length (more than 10k characters).

Code:

StringTokenizer tokens = new StringTokenizer(myString, "&&&&&&&");
(...)
String s = tokens.nextToken();
// Do something with s
s = tokens.nextToken();
// Do something with s
s = tokens.nextToken();
// Do something with s

// Now it's time for the last and biggest token
s = tokens.nextToken(); // --> s does not contain the entire string
daniherculano
  • I tried; I can take a token of length 20000 out of a `StringTokenizer` without it being shortened. There must be something more going on. – Ole V.V. Sep 07 '16 at 07:26

2 Answers


You are using the StringTokenizer in the wrong way. Your tokenizer does not split at "&&&&&&&" as one would expect, but at every single '&', because each character of the delimiter String delimits tokens on its own. It does not return the empty tokens between consecutive delimiters, which is why you still got the expected result. For example:

    StringTokenizer tokens = new StringTokenizer("a&&b&&c", "&&&");
    while (tokens.hasMoreTokens()) {
        System.out.println(tokens.nextToken());
    }

This prints:

a
b
c

So my suspicion is that there is an '&' somewhere within your 10k token. If that is the case, then msaint's suggestion, using String.split(), is the way to go if you can afford to modify your old code.
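
To make the failure mode concrete, here is a minimal sketch (the input string is made up, not your actual data): a single stray '&' inside the last part makes StringTokenizer cut it short, while splitting on the literal seven-'&' sequence keeps it whole.

    import java.util.StringTokenizer;
    import java.util.regex.Pattern;

    public class TokenizerVsSplit {
        public static void main(String[] args) {
            // Made-up input: the last part contains a single stray '&'.
            String myString = "first&&&&&&&second&&&&&&&big & long last part";

            // StringTokenizer treats every single '&' as a delimiter,
            // so the last token it returns is only " long last part".
            StringTokenizer tokens = new StringTokenizer(myString, "&&&&&&&");
            String last = null;
            while (tokens.hasMoreTokens()) {
                last = tokens.nextToken();
            }
            System.out.println(last);                    // " long last part"

            // split on the quoted literal delimiter keeps the last part intact.
            String[] parts = myString.split(Pattern.quote("&&&&&&&"));
            System.out.println(parts[parts.length - 1]); // "big & long last part"
        }
    }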

Ole V.V.

The API seems to have no limitation in terms of length. I tried to reproduce your case and couldn't. I was able to get a token of 7 million characters out of a StringTokenizer. Check your string first, then try split, since StringTokenizer is a legacy class.
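
A minimal sketch of how such a check might look (the class name and the exact token size here are only illustrative):

    import java.util.StringTokenizer;

    public class LongTokenTest {
        public static void main(String[] args) {
            // Build a token of about 7 million characters.
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < 7_000_000; i++) {
                sb.append('x');
            }
            String input = "small&&&&&&&" + sb;

            StringTokenizer tokens = new StringTokenizer(input, "&&&&&&&");
            tokens.nextToken();                // "small"
            String last = tokens.nextToken();  // the long token
            System.out.println(last.length()); // 7000000 - not truncated
        }
    }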

msaint
  • Indeed the docs say “`StringTokenizer` is a legacy class that is retained for compatibility reasons although its use is discouraged in new code. It is recommended that anyone seeking this functionality use the `split` method of `String` or the `java.util.regex` package instead.” – Ole V.V. Sep 07 '16 at 07:59