Friday, May 04, 2012 5:22 PM
I posted this in the SQLCe group, but thought perhaps it would be a better fit here:
I've created a WinForms time clock application which leverages SQL CE. It logs the time in / time out of all the employees at a business. I have a Windows Service that periodically (a few times a day) imports data (employee and schedule info) from another database (PostgreSQL). Since the front end is a WinForms app, employees need to go to one computer to clock in / clock out.
I would like to make this application accessible via the web, but with all the different technologies MS offers, I can't figure out what would be the best approach (MS Sync Services, WCF Data Services, SQL Azure, etc.). So given the following constraints, which I cannot change, what would be the best way to make this application more "cloud" friendly (accessible via the web)?
1. The Employee/Schedule application (PostgreSQL) has to run on premise.
2. There will be different businesses using this solution. I'd like to have one website and one database to maintain (I don't want to maintain a separate database for each business).
So I guess my question is: if I have multiple local SQL CE databases at different businesses, each importing data from the local PostgreSQL and logging in/out times throughout the day, what would be the best method to get that data up into the cloud efficiently so that I can serve up the same information via a website?
Saturday, May 05, 2012 5:52 PM
Based on the information you provided, what I can visualize is that you need a web application consuming an integrated SQL Server database, and that database needs to be exposed in a cloud-friendly way. Below are my recommendations:
- For integrating data you can use SSIS, or you can modify your Windows Service to push the data directly to SQL Server (I would prefer this if the service is under your control).
- For exposing data you can use WCF Data Services, which are easy to develop and the preferred way of exposing data over the cloud. Data Services use the REST pattern, which makes them device (mobile) friendly too.
- If you also need a web application to provide a UI, then I would suggest the ASP.NET MVC pattern. MVC is well supported on the cloud (Azure) and works for a wide range of devices too.
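A rough sketch of the "push directly from the service" option in the first bullet, under stated assumptions: Python with sqlite3 stands in for SQL CE and SQL Server, and the table/column names (`clock_events`, `event_time`, etc.) are illustrative choices, not from the thread.

```python
import sqlite3

def push_new_events(local: sqlite3.Connection, central: sqlite3.Connection,
                    business_id: int, last_pushed_at: str) -> int:
    """Copy punch records newer than the last push marker to the central table."""
    # Read only the rows the service has not forwarded yet.
    rows = local.execute(
        "SELECT employee_id, event_type, event_time FROM clock_events "
        "WHERE event_time > ?", (last_pushed_at,)).fetchall()
    # Tag each row with the business it came from before inserting centrally.
    central.executemany(
        "INSERT INTO clock_events (business_id, employee_id, event_type, event_time) "
        "VALUES (?, ?, ?, ?)",
        [(business_id, emp, typ, ts) for (emp, typ, ts) in rows])
    central.commit()
    return len(rows)
```

In the real service the `last_pushed_at` marker would be persisted between runs so a failed push can simply be retried.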
Hope this answers your question to some extent. Please let us know if you have a more specific query.
Tuesday, May 22, 2012 2:41 PM
Thank you for the reply. To clarify a bit: what I would like to do is have all the clients (say 300 of them), each with DIFFERENT data in their SQL CE db, send this information UPSTREAM to one central database, preferably many-to-one (M:1). I would like this data to be accessible via the web (i.e. Business A logs in to see only Business A's clock in/out data). Is this possible? Could I have a central Clock_In_Clock_Out table with a business_ID column? Could each SQL CE db sync its data with the Clock_In_Clock_Out table? I assume this cannot work like this based on the samples I've looked at; rather, you would have to create a table for each business (SQL CE db).
I'm guessing that I could simply have each client upload a .CSV and my web application could ingest that into the database, but that seems really old school and there is probably a better way. Any ideas?
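For what it's worth, the single shared Clock_In_Clock_Out table asked about above is a standard multi-tenant layout: one table, with a business_ID column discriminating the tenants. A minimal sketch, with sqlite3 standing in for the central database and all column names illustrative:

```python
import sqlite3

# One shared table for every business; business_ID separates the tenants.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE Clock_In_Clock_Out (
    business_ID INTEGER NOT NULL,
    employee_id INTEGER NOT NULL,
    clock_in    TEXT,
    clock_out   TEXT)""")
db.executemany(
    "INSERT INTO Clock_In_Clock_Out VALUES (?, ?, ?, ?)",
    [(1, 101, "08:00", "17:00"),   # Business A's row
     (2, 201, "09:00", "18:00")])  # Business B's row

# Business A's website query sees only Business A's rows.
rows = db.execute(
    "SELECT employee_id, clock_in, clock_out "
    "FROM Clock_In_Clock_Out WHERE business_ID = ?", (1,)).fetchall()
print(rows)  # only employee 101's punch appears
```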
Tuesday, May 22, 2012 7:27 PM
Based on the information you provided, and assuming the machine hosting the central database will have access to all 300 SQL CE databases, my proposed architecture is as follows:
- Create a central database table that has all the columns of the CE database plus some extra fields (business_ID plus some reserved fields). Since the data volume may grow very large, make it a partitioned table, partitioned on business_ID.
- Create a service (or application, or SSIS package) on the same server as your central database. This service will pull data from each SQL CE db based on a timestamp (i.e. only the new records that were not fetched earlier). The advantage of this model over CSV export is that if an import fails for any reason, you can retry automatically, with no need to follow up with your branch IT.
- Now create an OData service (WCF Data Service) that exposes your central table as a single entity.
- To enforce a one-to-one mapping between the logged-in branch user and the clock in/out data in the table, we can implement authorization at the gateway (if you have one) or else inside the service implementation. All you need to do is make business_ID a mandatory $filter parameter in the request; once the request reaches the server, first make sure the claimed business_ID belongs to the logged-in user (using AD or whatever other means you have in your organisation), and if so, relay the request to the Data Service for processing.
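The authorization step in the last bullet can be sketched as below. This is an illustrative stand-in only: the user-to-business lookup would really come from AD or a membership store, and the filter dict stands in for a parsed OData $filter.

```python
# Hypothetical directory lookup: which business each user belongs to.
USER_BUSINESS = {"alice@biz-a": 1, "bob@biz-b": 2}

def authorize(user: str, query_filter: dict) -> bool:
    """Allow the request only if it filters on the caller's own business."""
    claimed = query_filter.get("business_ID")
    if claimed is None:
        return False  # the business_ID $filter parameter is mandatory
    # Reject cross-tenant reads: the claimed ID must match the user's business.
    return USER_BUSINESS.get(user) == claimed
```

Only when `authorize` passes would the gateway relay the request on to the Data Service.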
We can detail this out further if this architecture works for you and there are no constraints that would block implementation.
Hope I have covered all the requirements presented above.
Tuesday, May 22, 2012 7:40 PM
Keeping a mandatory field for business_ID makes sense and should not be a problem. However, I believe that I will have to PUSH the data from the SQL CE databases, as the machine hosting the central database won't have a consistent connection to the 300 SQL CE databases. I'm hoping to host the central SQL database on a web host or perhaps in Azure; how does this change your solution #2?
Wednesday, May 23, 2012 4:46 PM
Based on the use case presented above, I would suggest modifying #2 as below:
- Create a central table in SQL Azure as suggested in #1 (above post).
- Ask your clients to upload CSV-formatted data to Azure Blob storage. That is very cheap and can be accessed via the Internet.
- Create a worker role on the server that will poll for new files.
- When it sees a new file, it will insert the data into SQL Azure and, after a successful insert, delete the file from the blob (to save storage).
- If the data insert fails, the worker will notify the administrator via email.
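The worker-role loop in the last three bullets can be sketched as follows. Assumptions are loudly stand-in here: a local directory takes the place of the Azure Blob container (a real worker would use the Azure storage SDK), sqlite3 takes the place of SQL Azure, and the CSV column names are invented for illustration.

```python
import csv
import sqlite3
from pathlib import Path

def notify_admin(filename: str, error: Exception) -> None:
    # Stub for the email notification mentioned above.
    print(f"import of {filename} failed: {error}")

def process_new_files(inbox: Path, db: sqlite3.Connection) -> int:
    """Poll the inbox for CSVs, load each, delete the file on success."""
    imported = 0
    for path in sorted(inbox.glob("*.csv")):
        try:
            with path.open(newline="") as f:
                rows = [(int(r["business_ID"]), int(r["employee_id"]),
                         r["event_type"], r["event_time"])
                        for r in csv.DictReader(f)]
            db.executemany("INSERT INTO clock_events VALUES (?, ?, ?, ?)", rows)
            db.commit()
            path.unlink()  # delete the "blob" only after a successful load
            imported += len(rows)
        except Exception as exc:
            db.rollback()
            notify_admin(path.name, exc)  # failed file stays put for retry
    return imported
```

Deleting the file only after a successful commit means a crash mid-import leaves the file in place, so the next poll retries it.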
Let me know if this answers your question.
Thursday, May 24, 2012 9:19 PM
Thank you Ashwini. When dealing with a .CSV to insert data, does it make sense to try to figure out how to update and delete rows, or should I just delete then insert? For example, if a .CSV has 5 days of data but the .CSV upload happens every day, would deleting and inserting the data in the .CSV file daily be faster than going through each line to determine whether the row for a given unique ID has been updated or deleted?
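For reference, the "just delete then insert" option asked about can be made idempotent by clearing the business's rows for the batch's date range before inserting, so re-uploading overlapping days never duplicates data. A sketch under the same stand-in assumptions as before (sqlite3 for the central database, invented table name `punches`):

```python
import sqlite3

def load_batch(db: sqlite3.Connection, business_id: int, rows: list) -> None:
    """rows: (employee_id, event_date, clock_in, clock_out) tuples."""
    if not rows:
        return
    dates = sorted({r[1] for r in rows})
    # Clear this business's rows for the days the batch covers...
    db.execute(
        "DELETE FROM punches WHERE business_ID = ? AND event_date BETWEEN ? AND ?",
        (business_id, dates[0], dates[-1]))
    # ...then insert the whole batch, no per-row comparison needed.
    db.executemany(
        "INSERT INTO punches VALUES (?, ?, ?, ?, ?)",
        [(business_id, *r) for r in rows])
    db.commit()
```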
Friday, May 25, 2012 6:54 PM
Daily uploading is a good practice, so prefer that.
First of all, you should export only the fraction of data that has not been uploaded yet, e.g. (SELECT * FROM MyTable WHERE UpdateTimestamp = Today). This will also save you bandwidth (on Azure you have to pay for bandwidth).
Second, prefer not to update the data in CE; it will make your life easier if updates are only allowed once the data has been moved to the central location. If updates do need to happen in CE, then you really need to do some extra work: the table and export schema will change to include a last-updated-time column.
Deletes also need to be handled separately: you can create a "deleted" table that you export to CSV at the end of the day.
However, even if you decide not to allow updates in the CE database, I would still suggest using the above schema for the CSV, because it will be more scalable in the future.
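The incremental daily export described above can be sketched like this: a last-updated-time column on the main table picks up inserts and updates since the previous export, and the separate "deleted" table carries deletions; each list would then be written to its own CSV for upload. As before, sqlite3 stands in for SQL CE, and all table/column names (`clock_events`, `deleted_events`, `LastUpdatedTime`) are illustrative.

```python
import sqlite3

def export_since(db: sqlite3.Connection, last_export: str):
    """Return (changed_rows, deleted_row_ids) touched after last_export."""
    # Inserts and updates: anything stamped after the previous export.
    changed = db.execute(
        "SELECT row_id, employee_id, event_time, LastUpdatedTime "
        "FROM clock_events WHERE LastUpdatedTime > ?", (last_export,)).fetchall()
    # Deletions travel in their own table, exported alongside.
    deleted = [row_id for (row_id,) in db.execute(
        "SELECT row_id FROM deleted_events WHERE deleted_at > ?", (last_export,))]
    return changed, deleted
```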