While there's some overhead in launching a grep subprocess, it's a fixed amount, unrelated to the size of the file. Unless you've got a very bad implementation of grep, you're not going to be able to improve the search time in pure Perl. In fact, searching a large file using Perl is likely to be slower than grep, which is optimized solely for searching files.
If you're only looking for the first match, you might want to add --max-count=1 to your grep command. This will cause grep to quit immediately, instead of reading the rest of the file to find additional matches.
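A minimal sketch of the idea (the sample file and pattern here are made up for illustration):

```shell
# Create a small sample file with two matching lines.
printf 'match one\nno hit here\nmatch two\n' > /tmp/grep_sample.txt

# --max-count=1 makes grep stop reading as soon as the first match is found,
# instead of scanning the remainder of the file for further matches.
grep --max-count=1 'match' /tmp/grep_sample.txt
```

On a large file where the match occurs early, this can cut the search short dramatically, since grep exits without touching the rest of the file.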
You didn't mention what your typical search patterns look like. Under certain circumstances you can speed things up with the -F option (treat the pattern as a fixed string rather than a regex) and by disabling NLS with the environment variable LANG=C; see the grep manual for details.
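A quick sketch of both suggestions combined (file name and pattern are hypothetical):

```shell
# Sample file: note that 'beta.gamma' contains a literal dot.
printf 'alpha\nbeta.gamma\nbetaXgamma\n' > /tmp/grep_fixed.txt

# -F searches for the literal string, so '.' matches only a dot, not any
# character, and grep can skip the regex engine entirely.
# LANG=C disables locale-aware (NLS) processing, which can speed up
# byte-oriented matching on some grep implementations.
LANG=C grep -F 'beta.gamma' /tmp/grep_fixed.txt
```

Without -F, the same pattern interpreted as a regex would also match 'betaXgamma', since an unescaped dot matches any character.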
The OP does seem to imply they have a fixed pattern, so -F is a very good suggestion - and even if the pattern isn't totally fixed, it's quite possible the regex used is unnecessarily complex. – Jefromi May 13 at 12:22.