Yes. HarvestMan respects the rules laid down by website managers in the robots.txt file on the web server.
These rules restrict crawling of certain areas of a website depending on the user agent of the client. (Some site owners block entire sections to all clients.) HarvestMan obeys the robot exclusion protocol by default.
It is possible to bypass this protocol by disabling the feature. However, it is a good idea to keep it enabled, both to follow Internet etiquette and to avoid being fined or sued by website owners for ignoring their robots.txt rules.
Support for parsing robots.txt rules is available in Python's standard library. HarvestMan uses a customised form of this module.
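As a minimal sketch of what that standard-library support looks like (using Python 3's urllib.robotparser; the example domain and rules are hypothetical, not HarvestMan's actual code):

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# In real use you would point at a live file and call rp.read():
#   rp.set_url("https://example.com/robots.txt"); rp.read()
# Here we parse an inline ruleset instead, to keep the sketch self-contained.
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# can_fetch() answers: may this user agent crawl this URL?
print(rp.can_fetch("MyCrawler", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("MyCrawler", "https://example.com/index.html"))         # True
```

A crawler that respects the protocol simply checks can_fetch() before each request and skips disallowed URLs.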