locked
store a huge quantity of data

  • Question

  • Hi all,

    I'm working on an AVL (automatic vehicle locator) system.

    In this scenario I have about 200 vehicles, each sending its coordinates, date, time, speed, and course to my server application once per second.

    The server app must store this data so that reports can be generated from it in the future.

    What is the best solution for storing it? I'm looking for the best performance.

    For example, I have two choices:

    1- Store the data in a database table (SQL Server 2005). In this case the size of the table grows rapidly, and I think that makes my queries slower.

    2- Store each vehicle's data in a binary file and build an index for it manually. In this case, when we want to generate a report by date, I think retrieval will be very slow.

    Please show me the way.

    Thanks a lot
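    For concreteness, here is a minimal sketch of what such a table might look like in SQL Server 2005 (the table name, column names, and types are hypothetical, based only on the data described above):

```sql
-- Hypothetical schema for the AVL readings described above (SQL Server 2005).
-- At 200 vehicles x 1 reading/second, this table receives about
-- 200 * 86,400 = 17,280,000 rows per day, which is why storage strategy matters.
CREATE TABLE VehiclePosition
(
    VehicleId   int          NOT NULL,  -- one of the ~200 vehicles
    ReadingTime datetime     NOT NULL,  -- date and time of the reading
    Latitude    decimal(9,6) NOT NULL,  -- coordinate
    Longitude   decimal(9,6) NOT NULL,  -- coordinate
    Speed       smallint     NOT NULL,  -- e.g. km/h
    Course      smallint     NOT NULL   -- heading in degrees, 0-359
);
```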

     

     



    At dawn, do you know why the crowing rooster laments? Because the mirror of morning has shown that a night of your life has passed, and you are unaware.
    Saturday, January 24, 2009 9:08 AM

Answers

  • You can still store the data in database tables, but you might need to think about caching the data locally and then sending it as a batch. You could use SQL Server Integration Services, which provides functionality to load data from external sources (files, for example), and it works faster than inserting records one by one. In the case of SQL Server you can also create partitioned tables and put the partitions on different hard drives, so writes can be done in parallel. Performance is a huge topic, and it is not something you get right in one single step. You know what your application collects, so prototype it first and then move in a specific direction depending on what is acceptable in your case.

    I would not store the data in some sort of custom binary files, since you would need to deal with parsing those files yourself, and that could be an even bigger issue than database performance.
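    As one illustration of the batch approach above, a file of cached readings can be loaded in a single operation instead of row-by-row inserts (a sketch only; the table name, file path, and file format are placeholders):

```sql
-- Sketch: load a batch of cached readings in one operation (SQL Server).
-- BULK INSERT reads an external file, much like an SSIS flat-file source,
-- and is typically far faster than one INSERT per reading.
BULK INSERT VehiclePosition
FROM 'C:\avl\readings_batch.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
```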


    Val Mazur (MVP) http://www.xporttools.net
    • Marked as answer by Hamid Moghadam Thursday, January 29, 2009 6:32 AM
    • Unmarked as answer by Hamid Moghadam Thursday, January 29, 2009 6:38 AM
    • Marked as answer by Hamid Moghadam Saturday, January 31, 2009 5:07 AM
    Wednesday, January 28, 2009 11:21 AM

All replies

  • Thanks, VMazur, for your professional guidance.

    But one more question:

    You talked about writing data to the database, but what is the best way to improve the performance of retrieving data from the database?


    Thursday, January 29, 2009 6:44 AM
  • Retrieval can be very fast as well if you have the proper indexes created for your data. There is no simple rule for how to achieve this, but in general you should have indexes on the table columns that are involved in search conditions. For example, if you have the following SQL statement:

    SELECT ..... FROM MyTable WHERE MyColumn = <some value here>

    then in most cases you would need to create an index on the MyColumn column, because the database engine will use it for the search. If that column is not indexed, the engine will most likely end up doing a full table scan, which is slow on large tables. On the other hand, if you have too many indexes, updates to the table will be slower, because each time you modify the data the indexes have to be maintained. These are just general guidelines; tuning a database is a huge topic and is not always easy to achieve in a short period of time.
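    As a concrete sketch of this advice (assuming a hypothetical VehiclePosition table and reports filtered by vehicle and date; all names are illustrative):

```sql
-- Sketch: an index matching the search condition of the report queries
-- (table and column names are hypothetical).
CREATE INDEX IX_VehiclePosition_Vehicle_Time
    ON VehiclePosition (VehicleId, ReadingTime);

-- A report query like this can then seek on the index instead of
-- scanning the whole table:
SELECT ReadingTime, Latitude, Longitude, Speed
FROM VehiclePosition
WHERE VehicleId = 42
  AND ReadingTime >= '2009-01-01' AND ReadingTime < '2009-01-02';
```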


    Val Mazur (MVP) http://www.xporttools.net
    Thursday, January 29, 2009 5:32 PM