To use DevForce O/RM or Datasets?

  • Question

  • Hello,

    I am wrestling with the decision on how to design a new (Order Entry) application.  I am using VB 2005, and it will be a Windows Forms app.  I have about an intermediate+ level of expertise. 

    In the past I have used VB DataSets and the whole dataset designer which connects to the database.  Since then I have learned about O/R mapping tools and am considering using DevForce from IdeaBlade.  It seems to work well so far, connecting to the database and generating the classes that relate to the tables.  I like this concept and think it makes sense.

    This seems like a big change, or is it?  My question is, is this a good direction to go?  Everything you read in the VS documentation, as well as the examples, refers to typed DataSets.  What are the best practices?  The DevForce mapper seems to emphasize the use of base tables over stored procedures.  Is this a real disadvantage?

    Thanks in advance for any help!


    Wednesday, January 31, 2007 2:55 PM

All replies

  • I think it comes down to how you want to organize your business logic in the application.  In Martin Fowler's Patterns of Enterprise Application Architecture, three distinct patterns are identified: Transaction Script (stored procs, vbscript, php, vb6), Table Module (DataSets, record sets) and Domain Model (classes that define entities corresponding to the business/data model).  Each has its advantages and disadvantages.  The primary factor to consider is how well each pattern handles complexity versus how much development work/time it takes to implement.

    Transaction Script is the simplest to implement at first, but it quickly chokes when faced with moderately complex business logic.  Here you wouldn't need O/RM or DataSets, just something simple like SqlHelper.  Table Module is pretty easy to get going with VS's support for datasets and so forth.  Because the data-access logic is somewhat abstracted, you can get a little more mileage as far as handling complexity goes, but there is still a relatively low ceiling compared to Domain Model.

    Domain Model is arguably the preferred pattern for organizing moderately to highly complex business logic, due to its object-oriented style.  If you use Domain Model, you have to do some kind of O/R mapping, and it would be insane to try to build your own rather than use a commercial O/R mapper.
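    To make the contrast concrete, here is a hypothetical VB sketch of where the business logic lives in two of the patterns.  The names (Order, SqlHelper, the SQL strings) are illustrative only, not from any real library:

```vb
' Transaction Script: one procedure per use case; the logic lives in the script.
Public Sub ApplyVolumeDiscount(ByVal orderId As Integer)
    Dim total As Decimal = CDec(SqlHelper.ExecuteScalar( _
        "SELECT SUM(Quantity * UnitPrice) FROM OrderLines WHERE OrderId = " & orderId))
    If total > 1000D Then
        SqlHelper.ExecuteNonQuery( _
            "UPDATE Orders SET Discount = 0.1 WHERE OrderId = " & orderId)
    End If
End Sub

' Domain Model: the logic lives on the entity itself; an O/R mapper persists it.
Public Class Order
    Private _total As Decimal
    Private _discount As Decimal

    Public ReadOnly Property Total() As Decimal
        Get
            Return _total
        End Get
    End Property

    Public Sub ApplyVolumeDiscount()
        If _total > 1000D Then _discount = 0.1D
    End Sub
End Class
```

    In the first version the rule "orders over 1000 get a discount" is buried in SQL and a procedure; in the second it is a method on the Order entity, where related rules can accumulate and be inherited.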

    Look at NHibernate and WilsonORMapper, too.

    Wednesday, January 31, 2007 3:43 PM
  • Hi guys,

       OR/M is a way to deal with the impedance mismatch between OO and relational databases. However, some issues with execution performance and also, yes, later maintainability could arise.

       Without further delay, I just want to point to a thread opened by someone who got into ORM thinking only of its benefits and not of its drawbacks:


       I'm not against ORM, anyway; I just want to warn about some undesirable side effects.

    Wednesday, January 31, 2007 6:22 PM
  • Wow - there is just too much information to digest.  It seems like the more I read the less certain I am of how to proceed.  I have been doing so much research that I am losing sight of what I need to accomplish!

    Thanks for your help. 

    Wednesday, January 31, 2007 7:54 PM
  • Yeah, everybody said the same!!  
    Wednesday, January 31, 2007 10:50 PM
  • Hi Jim - It will be no surprise that I have an opinion given that I'm the product management guy at IdeaBlade and have been with the company and the product since its founding in 2001. The ALKI post cites the relevant passage from a wonderful book by Fowler. I agree with its conclusions which I render as: an Object Model would be by far the best approach if only we didn't have to build it. Fowler's calculus attempts to balance time-to-get-the-job-done versus the-architecture-you'd-like-to-have. What if there were no tradeoff? What if you could have an Object Model in less time than the (technically inferior) alternatives? That's the promise of an ORM tool. Various products do a better or worse job at delivering on that promise. I admire some of our rivals but I think we have a decided advantage in ease-of-development without compromising flexibility or performance. Don't take my word for it. Poke around as the other mails suggest.

    Diego D. alludes to concerns about performance and maintainability. I don't understand how maintainability could be any better than with an Object Model. Even a passable OM provides much-needed insulation of the UI from the schema and almost always implies a distinct data access layer (DAL). Keeping the intimate details of data storage and data access out of your UI (aka "Presentation Layer") makes an app vastly easier to understand and maintain. Check out Fowler's "Separated Presentation" pattern for discussion of this, perhaps the single most important enterprise design pattern.

    "Performance" is one of those bugaboos that anyone can trot out to cast a shadow over any technology. It's meaningless on its own. Cutting to the chase: anytime you take the time to hydrate/dehydrate data as an object, you'll bleed some cycles. The question to ask is "is that a good expenditure of cycles?" As always, the answer is "it depends".  Do you want to predict the weather by crunching a terabyte of data points? Don't use any object orientation whatsoever. Don't use a relational database either - an RDB is ridiculously slow relative to much older db technologies (e.g. Btrieve).

    On the other hand, if you are going to have read/write user interaction with (on average) a few thousand rows of rich data, with multiple properties and business rules, unpredictable access paths, and changing requirements - then you'd be spending your cycles wisely to improve the developer's productivity with an Object Model backed by an RDBMS.

    Remember it's only a performance problem if it fails a performance benchmark. Know your application. Know your benchmark.

    Does this sound like an evasion on perf? It's not meant to be. It's a flat-out claim that an Object Model will cost you between 1.5 and 2x versus a DataReader.

    Of course that's per trip to the database. Making no trip to the db at all is 1000x faster than any trip. If you issue the same query twice when you only needed to query once, your putative performance advantage just went negative. The Object Model approach facilitates caching: automated, intelligent, you-don't-have-to-program-it caching. Try to write it yourself and you'll bleed away all of the supposed performance gain from your custom data retrieval approach.
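    A toy illustration of the idea (this is not DevForce's actual mechanism; RunQuery and the cache shape are made up): remember the result of each distinct query the first time it runs, and answer repeats from memory:

```vb
' Toy query cache, VB 2005 style. RunQuery is a hypothetical data-access helper.
' A real O/RM automates this, plus the hard part: cache invalidation.
Private queryCache As New Dictionary(Of String, DataTable)

Public Function GetData(ByVal sql As String) As DataTable
    If Not queryCache.ContainsKey(sql) Then
        queryCache(sql) = RunQuery(sql)  ' database is hit once per distinct query
    End If
    Return queryCache(sql)               ' repeats are answered from memory
End Function
```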

    If your application reads a ton of data once, processes it, and discards it, you may have a point. That's not how line-of-business applications with user interaction really work. In these kinds of applications you revisit the same data over and over, often coming at them from different angles. If you try to build this kind of app with a DataReader, you'll deliver nothing but heartache and broken promises.

    The final P.S. on this subject is that our product (and other products) provide backdoors for the occasional query that simply must be super-optimized. You can call a stored proc., you can write SQL pass-thru, you can invoke a custom process on the data tier that does any imaginable kind of magic, or you can override the object data provider with your own amazing data retriever that does exactly the right thing. You just don't want to make a habit of writing special cases all the time. Without an ORM it's "special cases ALL of the time." When 99% of your app deals with a few thousand records at most .. it ... just ... isn't ... worth ... the .. trouble. So get yourself an ORM ... like ours (there's a free version) ... or anybody's ... because I think just about anybody's ORM is better than no ORM.
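    For reference, the plain ADO.NET version of such a backdoor looks roughly like this; the proc name, parameter, and connection string are hypothetical:

```vb
' Calling a stored procedure directly when one query must be hand-tuned.
Using conn As New SqlConnection(connectionString)
    Using cmd As New SqlCommand("usp_GetBigReport", conn) ' hypothetical proc name
        cmd.CommandType = CommandType.StoredProcedure
        cmd.Parameters.AddWithValue("@Year", 2007)
        conn.Open()
        Using reader As SqlDataReader = cmd.ExecuteReader()
            While reader.Read()
                ' consume rows here
            End While
        End Using
    End Using
End Using
```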

    I MUST ADD that getting the Object Model out of the way is only the start. Thanks to a good OM you'll spend most of your time building your UI and the lion's share of that effort involves moving data between screen widgets and your business objects. How easy is that?

    Many ORM products don't do much to help you. We think there is a similar opportunity for rich mapping between widgets and business objects ... not just the simple properties of the objects (employee.LastName) but also nested properties (employee.HomeAddress.State.StateName) and made-up-properties that support your UI (something that feels like "employee.SuspiciousBirthDateColor"). Take a look around; we think you'll like IdeaBlade's DevForce for this too.
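    In plain Windows Forms terms, those three cases look roughly like the fragments below. The employee object and property names are hypothetical, and how deep a nested path can go depends on the binding infrastructure you use; richer ORM-level binding support is what smooths over the nested and made-up cases:

```vb
' Simple property binding:
txtLastName.DataBindings.Add("Text", employee, "LastName")

' Nested property path; support for deep paths like this varies:
txtState.DataBindings.Add("Text", employee, "HomeAddress.State.StateName")

' A made-up UI-support property defined on the business object itself:
Public ReadOnly Property SuspiciousBirthDateColor() As Color
    Get
        If BirthDate > DateTime.Today Then Return Color.Red
        Return Color.Black
    End Get
End Property
```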

    Good luck with your search!

    Friday, February 2, 2007 6:01 AM
  • Thanks Ward,

    I think my problem is that I just don't know enough about this stuff.  Here I am at the beginning of a brand new project (which doesn't happen very often) and I am doing my best to start off on the right foot.  I have to confess that most of this stuff is over my head.  But I am trying, and it is starting to sink in.  So part of my dilemma is the risk of going down a path using an ORM and having it be more difficult than 'just using datasets'.  I enjoy a challenge and have always prided myself on a good clean design.  All of this theoretical stuff is very interesting.

    I like the idea of using an OM but am concerned about losing some of the advantages of datasets, then having to code those myself.  Just yesterday I read a good article on building a DAL using the dataset designer.

    You know, I just about had my mind made up to go ahead and proceed with datasets, but now I'm not sure again!  Arghh! 

    Thanks for your help,



    Friday, February 2, 2007 1:07 PM
  • Hi Jim,

       the article you mentioned is a very good one by Bryan Noyes (a well-reputed .NET architect). I don't know if you noticed, but Noyes there talks about a sort of "mid-walk" alternative: Typed DataSets. These are components which expose business-entity properties while being backed by DataSets. That way you don't lose the benefits of DataSets, and you can treat those components as business entities in your business layer.

       These links give you more info on them (top level and how-tos). Hope it's useful:

    Friday, February 2, 2007 3:03 PM
  • Diego,

    Yes, that is what I was referring to when I talked about 'just datasets'; I should have been clearer.  Without a doubt, typed datasets would be the way to go in my mind, if you choose this design over pure business objects.  I have seen the examples of adding business logic to the partial classes exposed by the typed datasets.  This seems to work well.
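    For anyone following along, the partial-class technique looks like this.  The OrdersDataSet/OrdersRow names are hypothetical stand-ins for whatever the DataSet designer generates, and IsShipDateNull() is the designer-generated null check for a nullable column:

```vb
' Extending a designer-generated typed DataSet row with business logic.
Partial Public Class OrdersDataSet
    Partial Public Class OrdersRow
        Public Function IsOverdue() As Boolean
            Return Not Me.IsShipDateNull() AndAlso Me.ShipDate < DateTime.Today
        End Function
    End Class
End Class
```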

    My question (and the title of this thread) should really be 'To use OO business objects or typed datasets in the DAL?'  I think that if I do end up going with objects, DevForce is probably as good as, if not better than, most mappers.  It seems pretty robust from what I have seen, and I already have too much time invested to start over.

    Thanks again for your help.


    Friday, February 2, 2007 3:53 PM
  • Hi,


    You may also want to consider using a code generation tool like CodeSmith 4.0.

    We use CodeSmith and a set of free generation templates called nettiers to generate immense amounts of extremely functional code. The data access/business logic functionality that the nettiers templates create would take literally MONTHS to develop by hand.

    It's not perfect, but I reckon we eliminate approximately 85% of our data access and business layer code by using these templates.

    Although I try not to be dogmatic about it (I used to be dogmatic, but then found that it was a trait I really hate in other people, so I try to be more laid back now! :-)), I really am much more a custom-business-layer sort of person. I will use the design tools for simple applications, prototypes, or apps that I don't expect to become more and more complicated, but otherwise I tend to avoid the design-time features of VS. My feeling is, they are really good if all you're after is speed at any cost - but if you're after a solid architectural framework following best-practice designs, they really don't cut the mustard.

    It's really unfortunate that the majority of examples on databinding from Microsoft demonstrate practices that tend to be frowned upon by more experienced architects.

    Just my two cents - there are very intelligent people I work with every day who have grown to think that there's no harm in using design-time-driven functionality.


    Friday, February 2, 2007 5:17 PM
    Check out Rocky Lhotka on the difference between DataSets - even Typed DataSets - and business objects. In brief, business objects have behavior - lots of behavior - behavior that, in many cases, should be inherited from superclasses. That's hard to do with DataSets (and Typed DataSets).

    Oh - and by the way - we implement our object model with DataSets. We insulate you from that implementation detail - because we all want to stay at a higher level as much as possible - but you can get the DataSet any time (e.g., myPersistenceManager.DataSet). 

    Friday, February 2, 2007 5:48 PM
  • <<It's really unfortunate that the majority of examples on databinding from Microsoft demonstrate practices that tend to be frowned upon by more experienced architects.>>

    That's exactly my point: they (MS) seem to steer you this way or that way, and maybe it is aimed at the novice developer.  It's hard to break away from that if, like me, you are beyond novice but not quite expert (yet...).  To their credit, they have to be everything to everybody, and I have read articles on MSDN on binding to objects.


    Friday, February 2, 2007 5:54 PM