Handling Decimals in BO

  • Question

  • User1649856639 posted

    Hopefully I'm putting this under the correct category.

     

    I have a 3-tier architecture.

     

    I'm inserting data into SQL from an ASP.Net application using DAAB 4. I'm inserting into some decimal fields (precision 5, scale 4). If I exceed this range by entering a value of 10 or more, I get an error and the insert fails - obviously.

     

    Now here's the question (I am a newbie and have looked around for this): what is a proper way to handle this in a 3-tier environment? Right now I'm checking the data in my BO layer before it gets to the DA layer and throwing an error to the user. However, this is not very flexible or dynamic, because if the field precision or scale changes in the database table (which it shouldn't, but if it does) I'd have to change my BO and redeploy it.

     

    Any advice, or is this generally acceptable?
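    For concreteness, the kind of check I have in the BO today looks roughly like this (sketched in Java rather than my actual C#; the class and method names are just illustrative):

    ```java
    import java.math.BigDecimal;

    // Hypothetical BO-layer range check: does a value fit a SQL DECIMAL(precision, scale)
    // column? For DECIMAL(5,4) that means at most 1 integer digit and 4 fractional
    // digits, i.e. anything from -9.9999 to 9.9999.
    public class DecimalRangeCheck {

        public static boolean fits(BigDecimal value, int precision, int scale) {
            BigDecimal v = value.abs().stripTrailingZeros();
            if (v.scale() > scale) {
                return false; // too many fractional digits to store without rounding
            }
            // precision() - scale() gives the number of digits left of the decimal point
            int integerDigits = Math.max(v.precision() - v.scale(), 0);
            return integerDigits <= precision - scale;
        }

        public static void main(String[] args) {
            System.out.println(fits(new BigDecimal("9.9999"), 5, 4)); // true
            System.out.println(fits(new BigDecimal("10"), 5, 4));     // false: 2 integer digits
        }
    }
    ```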

    Tuesday, March 16, 2010 12:10 PM

Answers

  • User532898053 posted

    Hi,

    I would say that each object should validate its data and throw relevant exceptions if required. Your documentation or code comments could highlight any ripple effects to guide developers, although any failures should show up in your unit testing project. If you amend the test data to cover the changes to the database, the failed tests will direct you to the problem classes - not that you shouldn't review that separately, of course.

    I wouldn't worry about trying to cover changes like that, as you are not likely to be changing database types often; if you are, it may be worth reviewing them to a point where the frequency of changes is reduced.

    • Marked as answer by Anonymous Thursday, October 7, 2021 12:00 AM
    Wednesday, March 17, 2010 9:51 AM

All replies

  • User-952121411 posted

    However, this is not very flexible or dynamic, because if the field precision or scale changes in the database table (which it shouldn't, but if it does) I'd have to change my BO and redeploy it.
     

    This scenario is actually not specific to your described case. Often db changes are going to require changes to code, and there is no good way around it. You could try to refactor the rule into an external medium so that it could be changed on the fly, but that really defeats the purpose and responsibility assigned to your BO layer.
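    For completeness, externalizing the rule would mean reading the column's declared precision and scale from the database's own metadata at runtime instead of hard-coding them, for example via the standard INFORMATION_SCHEMA views. A rough sketch in Java/JDBC (the class and method names are made up, and it needs a live connection in practice):

    ```java
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    // Sketch: fetch a DECIMAL column's precision/scale from database metadata at
    // runtime, so the BO layer need not hard-code (5, 4). Table and column names
    // are supplied by the caller and are purely illustrative here.
    public class ColumnLimits {

        // INFORMATION_SCHEMA.COLUMNS is part of the SQL standard and supported by SQL Server.
        public static final String QUERY =
            "SELECT NUMERIC_PRECISION, NUMERIC_SCALE "
          + "FROM INFORMATION_SCHEMA.COLUMNS "
          + "WHERE TABLE_NAME = ? AND COLUMN_NAME = ?";

        public final int precision;
        public final int scale;

        public ColumnLimits(int precision, int scale) {
            this.precision = precision;
            this.scale = scale;
        }

        public static ColumnLimits load(Connection conn, String table, String column)
                throws SQLException {
            try (PreparedStatement ps = conn.prepareStatement(QUERY)) {
                ps.setString(1, table);
                ps.setString(2, column);
                try (ResultSet rs = ps.executeQuery()) {
                    if (!rs.next()) {
                        throw new SQLException("No such column: " + table + "." + column);
                    }
                    return new ColumnLimits(rs.getInt(1), rs.getInt(2));
                }
            }
        }
    }
    ```

    As noted above, though, moving the rule out of the BO layer trades away part of that layer's responsibility, so this is usually only worth it if the schema really does change often.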

    Thursday, March 18, 2010 4:15 PM