Setting up a database to handle millions of records with speed

  • Question

  • Hello! Basically, my software collects data and inserts it as rows into the database. There can be millions of rows inserted into the database, across multiple tables, in just one session.

    What I'd like to do is pull all records associated with a particular session and have it be speedy. Maybe this is done with certain types of keys? I'm not really sure. I've only dealt with simple queries in the past.

    Right now, an hour's worth of data takes about a minute to grab. (Around 10M records.)

    I have set up the tables to be associated with whatever session they were recorded under. So I have a Sessions table and various other tables where the rows are tied to a session through a common SessionID.

    So I go through each table looking for any row that contains the particular SessionID.

    Any ideas on what I can do to speed things up? I've already increased the buffer pool to 80% of the server's memory, and the server is dedicated to the database.
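    The usual fix for this access pattern is an index on the SessionID column of each data table, so the database can seek straight to a session's rows instead of scanning the whole table. A minimal sketch using SQLite in Python (the `Readings` table and its columns are hypothetical stand-ins for the actual schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical schema: data tables tied to Sessions via a common SessionID.
cur.execute("CREATE TABLE Sessions (SessionID INTEGER PRIMARY KEY, StartTime TEXT)")
cur.execute("CREATE TABLE Readings (ReadingID INTEGER PRIMARY KEY, SessionID INTEGER, Value REAL)")

# Simulate a few sessions' worth of rows.
cur.executemany(
    "INSERT INTO Readings (SessionID, Value) VALUES (?, ?)",
    [(sid, float(i)) for sid in (1, 2, 3) for i in range(1000)],
)

# Without an index, "WHERE SessionID = ?" scans every row in the table.
# With this index, the engine seeks directly to the matching rows.
cur.execute("CREATE INDEX idx_readings_session ON Readings (SessionID)")

# The query plan should now show a search using the index, not a full scan.
plan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM Readings WHERE SessionID = ?", (2,)
).fetchall()
print(plan)

rows = cur.execute("SELECT * FROM Readings WHERE SessionID = ?", (2,)).fetchall()
print(len(rows))
```

    The equivalent on a server database is the same `CREATE INDEX ... ON TableName (SessionID)` statement per table; if nearly every query filters on SessionID, clustering the table on that column (or making SessionID the leading column of the primary key) can help further, since a session's rows then sit physically together on disk.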


    Tuesday, September 17, 2019 10:32 PM

All replies