I have to agree with @dceljak.
Blocking user agents is not a strong security method – certainly not ‘BulletProof’. It’s akin to asking someone ‘are you a terrorist?’ and letting them in if they say ‘no’. Any HTTP client can set its user agent string to whatever it likes.
All this encourages is web crawlers (including search engines) to adopt alternate user agent strings so they aren’t arbitrarily blocked by plugins such as this. Anyone genuinely intent on causing harm to a website will already be masquerading as Chrome/Firefox/IE/Safari.
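To illustrate how trivial spoofing is, here’s a minimal sketch using only Python’s standard library (the URL and user agent string are placeholders):

```python
import urllib.request

# Any HTTP client can claim to be a mainstream browser simply by
# setting the User-Agent header. The URL below is a placeholder.
req = urllib.request.Request(
    "https://example.com/",
    headers={
        "User-Agent": (
            "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
            "AppleWebKit/537.36 (KHTML, like Gecko) "
            "Chrome/120.0.0.0 Safari/537.36"
        )
    },
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)  # the server sees an ordinary 'Chrome' request
```

From the server’s side that request is indistinguishable from a real browser hit, which is why user agent filtering only ever stops honest clients.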
This ultimately provides a false sense of ‘blocking’ against web crawlers/search/archiving engines. If webmasters genuinely don’t want their site indexed or stored, they should use the relevant directives in robots.txt or the robots meta tag, which are the industry de facto standard. Perhaps BPS could provide an option to set this up?
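For reference, the standard opt-out looks something like this (the blanket Disallow is just an example – narrow it to specific paths as needed):

```
# robots.txt – asks all compliant crawlers to stay out of the whole site
User-agent: *
Disallow: /
```

The per-page equivalent is a robots meta tag in the page head, e.g. `<meta name="robots" content="noindex, noarchive">`. These are advisory only – well-behaved crawlers honour them while the same bad actors who spoof user agents will ignore them – but they are the recognised mechanism, and an option in BPS to generate them would be genuinely useful.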