Preventing denial of service attacks in public APIs

  • Question

  •  

    Hi Forum

    We are building some web services which will be available to our customers. What is the best way to prevent denial of service attacks?

    My worst fear is actually the accidental never-ending loop that a programmer might write. How does one prevent a never-ending loop by an external programmer?

    Is it something that should be programmed into the application?

    Max Object Limit on Inserts - Valid option?
    I can see that it would be quite easy to prevent too many inserts. We could decide on a limit of max entries per user for each object type and then do a count before each insert. This is by no means a way to prevent denial of service, but could it be one of the tools to use? Or is it a bad idea? It cannot do anything about reads, updates, calculations etc.
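    A minimal sketch of that per-user cap, in Python purely for illustration (the class and its names are hypothetical, not part of any real service): keep a count per (user, object type) and reject the insert once the limit is reached.

    ```python
    from collections import defaultdict


    class InsertCapError(Exception):
        """Raised when a user has hit their per-type object limit."""


    class ObjectStore:
        """Illustrative in-memory store that enforces a per-user cap
        on the number of objects of each type before every insert."""

        def __init__(self, max_per_user=100):
            self.max_per_user = max_per_user
            # counts[(user, object_type)] -> number of stored objects
            self.counts = defaultdict(int)

        def insert(self, user, object_type, obj):
            key = (user, object_type)
            if self.counts[key] >= self.max_per_user:
                raise InsertCapError(
                    f"{user} already has {self.max_per_user} "
                    f"objects of type {object_type}")
            self.counts[key] += 1
            # ... persist obj to the real backing store here ...
            return obj
    ```

    As the post says, this only caps stored data; it does nothing to throttle reads or expensive calculations.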

    Transaction table with count of number of previous requests - Valid option?
    One could of course write to a transaction table with a timestamp for each request from a session and then check how many previous requests have been made. But this would slow the performance of the application.
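    The timestamp-per-request idea is essentially a sliding-window rate limit. A rough Python sketch, assuming an in-memory deque stands in for the transaction table (a real implementation would query the database instead):

    ```python
    import time
    from collections import deque


    class RequestLog:
        """Illustrative sliding-window check: record a timestamp per
        request and count how many fall inside the last `window` seconds."""

        def __init__(self, max_requests=60, window=60.0):
            self.max_requests = max_requests
            self.window = window
            self.stamps = {}  # session id -> deque of timestamps

        def allow(self, session, now=None):
            now = time.monotonic() if now is None else now
            q = self.stamps.setdefault(session, deque())
            # Drop timestamps that have fallen out of the window.
            while q and now - q[0] > self.window:
                q.popleft()
            if len(q) >= self.max_requests:
                return False  # too many recent requests: reject
            q.append(now)
            return True
    ```

    Pruning old timestamps on each check keeps the per-session cost small, which addresses part of the performance worry, though every request still pays for the lookup.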

    Hardware? IIS?
    What I would really like to do is set a limit on how many requests are allowed within a session or within a certain time frame. Does IIS, or hardware like routers, include such features?
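    If nothing in the infrastructure offers this, the same limit can be enforced in the application with a token bucket; a hedged Python sketch (names and defaults are made up for illustration):

    ```python
    import time


    class TokenBucket:
        """Illustrative token-bucket limiter: allows short bursts up to
        `capacity` while capping the sustained rate at `rate` requests
        per second."""

        def __init__(self, rate=10.0, capacity=20.0, now=None):
            self.rate = rate
            self.capacity = capacity
            self.tokens = capacity
            self.last = time.monotonic() if now is None else now

        def allow(self, now=None):
            now = time.monotonic() if now is None else now
            # Refill in proportion to elapsed time, never above capacity.
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= 1.0:
                self.tokens -= 1.0
                return True
            return False
    ```

    One bucket per session (or per API key) gives exactly the "N requests per time frame" limit the post asks about.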

    I hope someone can help me in the right direction.

     

    Wednesday, January 31, 2007 6:53 AM

Answers

  •  

    If you have a good firewall, that should be able to help out; if not, stick an ISA Server in front:

    http://www.microsoft.com/technet/isa/2006/flood_resiliency.mspx

    I understand that mod_evasive for Apache may offer the same sort of protection - if you are after something cheaper.

    https://www.linux-magazine.com/issue/62/Charly_Column.pdf

    Wednesday, January 31, 2007 1:13 PM
  •  

    Ollie is right, logging is key to understanding security attacks, and yes, ISA can help you with that - not just doing the logging, but having something monitor it so you can be advised of suspicious activity. You'll want to think about how logging will affect performance and how you will manage those logs: how long do they need to be kept, should they be shipped to an internal system or left on the box, and does the box have enough disk space? Logging can cause an outage by accident - I saw a system last year that logged so much web traffic info that it ran out of disk space and fell over!
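    That disk-space failure mode is exactly what size-capped rotation prevents. A small Python example using the standard library's `RotatingFileHandler` (illustrative only - the original system here is .NET/IIS, and the file names are made up):

    ```python
    import logging
    from logging.handlers import RotatingFileHandler

    # Cap log growth: at most 5 backups of 10 MB each (~60 MB total),
    # so heavy traffic rotates old logs instead of filling the disk.
    handler = RotatingFileHandler(
        "requests.log", maxBytes=10 * 1024 * 1024, backupCount=5)
    handler.setFormatter(
        logging.Formatter("%(asctime)s %(levelname)s %(message)s"))

    logger = logging.getLogger("api.requests")
    logger.setLevel(logging.INFO)
    logger.addHandler(handler)

    logger.info("request GET /api/orders from session s1")
    ```

    Shipping the rotated files to an internal system then keeps the box's disk usage bounded while preserving history.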

    If you have outside developers programming against your system, you'll want to enforce some 'test' or 'sandbox' area. Have you seen PayPal's sandbox test system? You could then think up some sort of certification test before they are allowed to go live.

    Thursday, February 1, 2007 9:51 AM

All replies

  •  

    If you have a good firewall, that should be able to help out; if not, stick an ISA Server in front:

    http://www.microsoft.com/technet/isa/2006/flood_resiliency.mspx

    I understand that mod_evasive for Apache may offer the same sort of protection - if you are after something cheaper.

    https://www.linux-magazine.com/issue/62/Charly_Column.pdf

    Wednesday, January 31, 2007 1:13 PM
  • Experience in the past has taught me that trying to prevent DoS attacks from a coding point of view is pointless and a waste of time. I always rely on a 'proper' firewall that can filter out malicious requests before they ever reach your application.

    "My worst fear is actually the accidental never-ending loop that a programmer might write. How does one prevent a never-ending loop by an external programmer?" - this is not a DoS attack; this is a bug in the developer's solution that should be picked up by integration testing. Secondly, you should be logging (or be able to turn on logging for) all unacceptable behaviour in your system, so that if this did happen you could identify the problem and provide a solution.

    HTH

    Ollie Riches


    Wednesday, January 31, 2007 3:57 PM
  •  

    Hi Ben

    Thank you for your suggestion on ISA server. It sounds like a very good option.

     

    Wednesday, January 31, 2007 4:38 PM
  •  

    Hi Ollie

    Thank you for the reply.

    In regard to the accidental never ending loop.

    Our services will be available to a large number of customers and we do not have any contact with them.

    We provide docs and an API and they will be programming against the live production environment.

    But I'm guessing that something like ISA Server, as Ben suggests, would actually mitigate that?
    (Since it would be like someone trying to launch an HTTP DoS attack, which it seems ISA Server can prevent.)

    Wednesday, January 31, 2007 4:41 PM
  •  

    Ollie is right, logging is key to understanding security attacks, and yes, ISA can help you with that - not just doing the logging, but having something monitor it so you can be advised of suspicious activity. You'll want to think about how logging will affect performance and how you will manage those logs: how long do they need to be kept, should they be shipped to an internal system or left on the box, and does the box have enough disk space? Logging can cause an outage by accident - I saw a system last year that logged so much web traffic info that it ran out of disk space and fell over!

    If you have outside developers programming against your system, you'll want to enforce some 'test' or 'sandbox' area. Have you seen PayPal's sandbox test system? You could then think up some sort of certification test before they are allowed to go live.

    Thursday, February 1, 2007 9:51 AM