I am planning to write a custom sync provider because of limitations imposed by our database designers: I am restricted to UPDATE/INSERT/DELETE/SELECT on tables only through stored procedures, and those stored procedures only accept/return a DataSet.
With such limitations, I cannot allow the Sync Framework (and its stored procedures) to touch any of our tables directly. Do I have to write a custom provider, or could I use SqlSyncProvider?
The custom sync provider will serve as a 'bridge' between the Sync Framework implementation and our 'atom-based' SQL Server database design. The following lists the custom sync provider's goals and scope:
- The custom sync provider will sync molecules (i.e. DataSets) to a proprietary 3rd-party database for a large number of customers (30K+ customers overall in a 2-3 year time span).
- The server-side database only serves molecules (i.e. DataSets) and updates itself using molecules.
- The server-side database has no primary key or foreign key constraints.
- Only stored procedures can be used to serve/update atoms (i.e. tables).
- The custom sync provider design should be:
  - Transaction based (to address data integrity requirements in the face of network failures)
  - Compression driven (to address client bandwidth requirements for large amounts of data, i.e. 200 transactions/hour/connector)
  - Batch oriented (to support chunk-based downloads and resume after network failures)
  - Easy to deploy (to address the simplicity requirement for a non-technical customer base)
Server side = WCF / Windows Server 2008 / SQL Server 2008 / Sync Framework 4 CTP
Client side = Silverlight / 3rd-party database
I would recommend you inherit DbServerSyncProvider and override the GetChanges() and ApplyChanges() methods.
But I am wondering: without a primary key, how would you prevent data duplication and uniquely identify a record? You should at least add an identity column and use it as the primary key.
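If the schema truly cannot be changed, one fallback is to derive a surrogate key by hashing a row's column values and use it to filter duplicates that arrive via retried batches. Here is a minimal, language-agnostic sketch in Python (the `Atoms` rows and column names are hypothetical; an identity column, if you can add one, remains the safer choice, since two genuinely identical rows would collide under a content hash):

```python
import hashlib

def row_key(row, columns):
    """Derive a stable surrogate key by hashing the row's column values.

    Only a fallback for duplicate detection when the schema has no
    primary key; genuinely identical rows are indistinguishable.
    """
    payload = "|".join(str(row[c]) for c in columns)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

rows = [
    {"atom_id": 1, "name": "H"},
    {"atom_id": 2, "name": "O"},
    {"atom_id": 1, "name": "H"},  # duplicate delivered by a retried batch
]

seen = set()
unique = []
for r in rows:
    k = row_key(r, ["atom_id", "name"])
    if k not in seen:
        seen.add(k)
        unique.append(r)
# unique now holds only the two distinct rows
```

The same hashing idea ports directly to C# with SHA256 over the DataRow's item array.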
You can provide your stored procs in InsertCommand, DeleteCommand and UpdateCommand to update the tables.
Provide SPs for SelectIncrementalInsert, SelectIncrementalUpdate and SelectIncrementalDelete to select the changes to be downloaded to the clients.
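The incremental-select procedures all follow the same anchor pattern: return only rows whose change marker falls between the last sync anchor and the new one. A minimal sketch of that pattern, using an in-memory SQLite table with a hypothetical `change_ver` counter column standing in for your change-tracking mechanism:

```python
import sqlite3

# Hypothetical 'Atoms' table with a monotonically increasing change
# counter; your SelectIncremental* stored procedures would apply the
# same WHERE clause server-side.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Atoms (id INTEGER, name TEXT, change_ver INTEGER)")
conn.executemany("INSERT INTO Atoms VALUES (?, ?, ?)",
                 [(1, "H", 10), (2, "O", 11), (3, "C", 25)])

def select_incremental(conn, last_anchor, new_anchor):
    """Return rows modified after last_anchor, up to and including new_anchor."""
    cur = conn.execute(
        "SELECT id, name FROM Atoms WHERE change_ver > ? AND change_ver <= ?",
        (last_anchor, new_anchor))
    return cur.fetchall()

changes = select_incremental(conn, 11, 25)  # only rows changed since anchor 11
```

The anchors themselves are supplied by the framework between sync sessions; the stored procedure just has to honor the range.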
As for atomicity, it is taken care of by the Sync Framework. But if you have foreign keys, you might want to group the related tables in a SyncGroup to ensure the changes are applied for the entire group or rolled back together.
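The all-or-nothing behavior of such a group boils down to wrapping the related tables' changes in one database transaction. A sketch of that semantics using SQLite (the `Orders`/`OrderLines` tables and the CHECK constraint are invented purely to force a failure):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Orders (id INTEGER)")
conn.execute("CREATE TABLE OrderLines (order_id INTEGER, qty INTEGER CHECK (qty > 0))")

def apply_group(conn, order_id, quantities):
    """Apply changes to both related tables in one transaction,
    mirroring how a grouped change set commits or rolls back as a unit."""
    try:
        with conn:  # commits on success, rolls back on any exception
            conn.execute("INSERT INTO Orders VALUES (?)", (order_id,))
            conn.executemany("INSERT INTO OrderLines VALUES (?, ?)",
                             [(order_id, q) for q in quantities])
        return True
    except sqlite3.IntegrityError:
        return False

ok = apply_group(conn, 1, [2, 3])    # valid group: committed
bad = apply_group(conn, 2, [5, -1])  # violates CHECK: whole group rolled back
orders = conn.execute("SELECT COUNT(*) FROM Orders").fetchone()[0]
# orders == 1: the failed group left no partial rows behind
```

The key point is that the parent-table insert from the failed group disappears along with the child rows, which is exactly what grouping related tables buys you.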
Batching is a good approach to avoid sending a huge chunk of data at once.
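At its core, batching is just slicing the change set into fixed-size chunks so a failed transfer resumes from the last acknowledged batch instead of restarting. A trivial sketch (the batch size of 4 is arbitrary):

```python
def batches(rows, batch_size):
    """Yield successive fixed-size chunks of the change set, so a
    failed transfer can resume from the last acknowledged batch."""
    for i in range(0, len(rows), batch_size):
        yield rows[i:i + batch_size]

changes = list(range(10))
sent = list(batches(changes, 4))
# three batches: two full ones and a final partial one
```

In practice you would also persist the index of the last batch the client acknowledged, which doubles as your resume point after a network failure.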
As for compression, you have to handle it in the class that implements the interface your server exposes to the client. Similarly, you will have to make the corresponding changes in the client-side code generated by svcutil.exe so that both client and server understand each other.
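The principle is symmetric: serialize the change set, compress it before transmission, and decompress on the other end, with both sides agreeing on the scheme (which is why the generated client proxy needs the matching change). A language-agnostic sketch using gzip over a JSON-serialized payload (the payload shape is invented for illustration; in the WCF stack this role would typically be played by a custom message encoder):

```python
import gzip
import json

# Hypothetical change set: repetitive structured data compresses well.
changes = [{"id": n, "name": "atom-%d" % n} for n in range(1000)]

# Sender side: serialize, then compress before putting it on the wire.
raw = json.dumps(changes).encode("utf-8")
compressed = gzip.compress(raw)

# Receiver side: decompress, then deserialize back into the change set.
restored = json.loads(gzip.decompress(compressed).decode("utf-8"))
# restored is identical to changes, at a fraction of the wire size
```

For 200 transactions/hour/connector across 30K customers, shrinking repetitive DataSet payloads this way is where most of your bandwidth savings will come from.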