I'm looking for a design pattern (preferably leveraging SQL Server 2005 features) to create a batch job which will process thousands of different user-defined search criteria against a large data warehouse.
The data warehouse schema is a classic star schema in a healthcare setting: Claims, BillLines, Diag Codes, Procedure Codes, Modifier Codes, and Providers.
The output of the batch job would be something like:

Search1 ClaimKeys: 1, 2, 3
Search2 ClaimKeys: 2, 3, 4, ...
...
SearchN ClaimKeys: 12334334, 12343235, 12323212, ...
The search criteria would be something like:

Search1: all Claims having Diag Codes (D1, D2, D3, D4) which have BillLines using Procedure Codes (P1, P2, P3) that do NOT use Modifier Codes (M2, M3).
Search2: all Claims having Diag Codes (D4, D5, D6, D7) which have BillLines using Procedure Codes (P2, P3, P4) that use Modifier Codes (M5, M6).
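To make the shape of a single criterion concrete, here is roughly what Search1 looks like as a set-based query. This is only a sketch; the table and column names (ClaimDiag, BillLine, BillLineModifier, etc.) are illustrative stand-ins, not our actual schema:

```sql
-- Hypothetical schema: ClaimDiag(ClaimKey, DiagCode),
-- BillLine(BillLineKey, ClaimKey, ProcCode),
-- BillLineModifier(BillLineKey, ModifierCode)
SELECT DISTINCT cd.ClaimKey
FROM ClaimDiag cd
JOIN BillLine bl
  ON bl.ClaimKey = cd.ClaimKey
WHERE cd.DiagCode IN ('D1', 'D2', 'D3', 'D4')
  AND bl.ProcCode IN ('P1', 'P2', 'P3')
  -- "do NOT use modifier codes (M2, M3)" on those bill lines
  AND NOT EXISTS (SELECT 1
                  FROM BillLineModifier m
                  WHERE m.BillLineKey = bl.BillLineKey
                    AND m.ModifierCode IN ('M2', 'M3'));
```

Search2 is the same shape with different code lists and the NOT EXISTS flipped to EXISTS; the only thing that varies between searches is the sets of codes and the include/exclude logic.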
The Claims and BillLines fact tables have about 5 million records, so this batch processing needs to be as optimized as possible. In the current system we have a separate stored procedure for each search criterion, and this is really dragging us down.
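For context, the current approach looks roughly like this (a hypothetical sketch; procedure and table names are illustrative): one hard-coded stored procedure per search, all executed sequentially by the batch job, each scanning the fact tables independently.

```sql
-- Hypothetical sketch of the current per-search design.
CREATE PROCEDURE dbo.Search0001
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.SearchResult (SearchId, ClaimKey)
    SELECT DISTINCT 1, cd.ClaimKey
    FROM ...  -- Search1's criteria hard-coded here
END
GO
-- ...repeated for Search0002 through SearchN, so N searches
-- mean N separate procedures and N passes over the facts.
```

It is this one-procedure-per-criterion proliferation, and the repeated passes over the same fact tables, that we are hoping a better pattern (perhaps a data-driven one) can eliminate.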
Any patterns, guidelines, or examples that anyone can provide would be of great help. Thanks, JGP