Location: jguru.com/faq/view.jsp?EID=127
Created: Sep 3, 1999
Author: Terence Parr (http://www.jguru.com/guru/viewbio.jsp?EID=1)
Applies to: Version 2.7.0 or greater; earlier versions didn't have the TokenStreamException hierarchy

Use a filter rule to trap invalid sequences and throw a TokenStreamRecognitionException, a TokenStreamException subclass indicating that an invalid token was found. The parser, or whatever method invokes the parser, must catch this TokenStreamException. Remember that the filter option changes the behavior of your lexer.
When you set a filter rule, there is no such thing as a syntax error: anything short of a completely matched (non-protected) lexer rule causes the filter rule to be invoked instead. In other words, the lexer enters the filter rule with its input state rewound to the way it was when the lexer was asked for a token.
The following parser/lexer combo illustrates the strategy. The language is just a set of IDs separated by whitespace.
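A minimal sketch of such a combo, assuming ANTLR 2.7's Java API; the grammar class names (P, L), the rule names, and the exact exception constructor arguments here are illustrative assumptions, not necessarily the original FAQ's code:

```
class P extends Parser;

start : ( ID )+ ;

class L extends Lexer;
options {
    filter = IGNORE;               // invoked when no other rule matches
    charVocabulary = '\3'..'\377';
}

ID : ('a'..'z' | 'A'..'Z')+ ;

WS : (' ' | '\t' | '\n' { newline(); })+
     { $setType(Token.SKIP); } ;

// The filter rule must be protected.  Instead of silently eating the
// bad character, wrap it in a recognition exception and rethrow it as
// a TokenStreamRecognitionException.
protected
IGNORE
    : c:. {
        throw new TokenStreamRecognitionException(
            new NoViableAltForCharException(c, getFilename(), getLine()));
      } ;
```

The code that invokes the parser then traps the exception. Since TokenStreamRecognitionException extends TokenStreamException, catch it before the more general clause:

```
import java.io.DataInputStream;
import antlr.*;

public class Main {
    public static void main(String[] args) {
        L lexer = new L(new DataInputStream(System.in));
        P parser = new P(lexer);
        try {
            parser.start();
        }
        catch (TokenStreamRecognitionException e) {
            // an invalid character sequence trapped by the filter rule
            System.err.println("invalid token: " + e);
        }
        catch (TokenStreamException e) {
            System.err.println("token stream problem: " + e);
        }
        catch (RecognitionException e) {
            System.err.println("syntax error: " + e);
        }
    }
}
```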