I work for a medium-sized business unit in a fairly large organisation. Network-level DDoS protection has been under investigation for a while, but unfortunately it can't come soon enough for my users.
Over the past month someone has been scraping our website every hour, and they are getting progressively greedier with their requests. Given the periodic nature of the traffic and the data being requested, I'm hesitant to call it a DDoS attack (though they are using distributed, botnet-looking IP addresses). The problem is starting to cause service disruptions for our legitimate users. We typically have 100 internal and 100-300 external concurrent users generating 2000-3000 simultaneous requests. Today the third party's traffic crept up to about 10000 simultaneous requests every hour, to the point where our application servers started crashing, database locks stopped releasing properly, and our rendition server built up massive queues.
I need a short-term solution. The two ideas bandied about this afternoon were a) forcing all external users to register and log in (until the third party goes away), or b) putting a CAPTCHA or other human test on the particular request the third party is making. Is there anything else to consider here?
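For what it's worth, here is a rough sketch of what option b) might look like at the application layer: gate only the abused endpoint behind a human check, and leave everything else untouched. All the names here (`/report` as the scraped path, `verify_human`, the token value) are illustrative stand-ins, not our real handlers, and the verification itself is a placeholder for a real CAPTCHA provider round-trip.

```python
# Sketch: gate one heavily-scraped endpoint behind a human test.
# Sessions that pass the test are remembered; all other paths are open.
VERIFIED_SESSIONS = set()  # session ids that have passed the human test

def is_gated_path(path: str) -> bool:
    # Only the specific request the scraper is hammering is gated.
    # "/report" is a hypothetical stand-in for that endpoint.
    return path.startswith("/report")

def verify_human(session_id: str, token: str) -> bool:
    # Placeholder: a real deployment would validate the token
    # server-side with a CAPTCHA provider, not compare a constant.
    if token == "expected-token":
        VERIFIED_SESSIONS.add(session_id)
        return True
    return False

def handle_request(session_id: str, path: str) -> int:
    # Return an HTTP-style status: 403 until verified, 200 after.
    if is_gated_path(path) and session_id not in VERIFIED_SESSIONS:
        return 403
    return 200
```

The appeal of this over option a) is that legitimate users on unrelated pages see no change, and the friction is concentrated exactly where the scraper is hitting us.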

