At first I thought this was a duplicate of your other post.
Dealing with scan robots is like pulling weeds in the garden: no matter what you do, they keep coming back. I understand the frustration, but retaliation is more likely to cause problems for you, since the other side will probably complain and your ISP may take action against you.
Fail2ban, which you referenced in your other thread, is a good way to go. If you find a particular IP doing this, you can block that address (or its ISP's range), but this can have side effects, and like the weeds, they will come up somewhere else.
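If you go the fail2ban route, a jail like this is a reasonable starting point. This is only a sketch: the `apache-noscript` filter ships with fail2ban, but the log path, thresholds, and ban time are assumptions you would tune for your own setup:

```ini
# /etc/fail2ban/jail.local -- example jail; numbers are illustrative
[apache-noscript]
enabled  = true
port     = http,https
filter   = apache-noscript
logpath  = /var/log/apache2/error.log
maxretry = 3
bantime  = 3600   ; ban for one hour, then let the ban expire on its own
```

Letting bans expire (rather than blocking forever) keeps the firewall ruleset from growing without bound as new scanners show up.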
I use OSSEC (and Snort), which will temporarily block an IP that tries to access too many invalid web pages (about 3). It watches the Apache error log and takes action accordingly.
You might also look at rate limiting in iptables, which will prevent them from establishing too many connections in too short a time, but again, this may interfere with your desired operation.
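A minimal sketch of that kind of rate limiting, using the iptables `recent` match. The numbers here (4 new connections per 60 seconds on port 80) are illustrative assumptions; set them well above what a legitimate browser generates, or you will lock out real visitors loading pages with many resources:

```shell
# Track each source IP that opens a new connection to port 80
iptables -A INPUT -p tcp --dport 80 -m state --state NEW \
         -m recent --set --name HTTP

# Drop the connection if that IP has opened more than 4 new
# connections in the last 60 seconds (thresholds are illustrative)
iptables -A INPUT -p tcp --dport 80 -m state --state NEW \
         -m recent --update --seconds 60 --hitcount 4 --name HTTP -j DROP
```

The `recent` module keeps a per-IP timestamp list, so bursts from one scanner are throttled without affecting everyone else, unlike the simpler `limit` match, which applies one global rate to the whole rule.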