We are optimizing the performance of a SQL Server spatial table with about 300 million polygons in it. The current average query response time is about 250ms.
Is this considered "good" performance, and what response time should we be shooting for? 50ms? 20ms?
The query we are using looks like:
select shape
from SpTable WITH (INDEX(Idx_SpTable))
where Shape.Filter(geometry::STGeomFromText('POLYGON ((-13120454.1218383 3981872.33241955, -13120454.1218383 3983477.51001331, -13118848.9442445 3983477.51001331, -13118848.9442445 3981872.33241955, -13120454.1218383 3981872.33241955))', 3857)) = 1
The database server has about 16 GB of RAM and 4 quad-core processors.
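In case it helps others compare, here is a rough sketch of how an average like ours can be cross-checked from the plan cache (the table name comes from the query above; the LIKE pattern is just an illustrative filter, and total_elapsed_time in sys.dm_exec_query_stats is reported in microseconds, so it is divided down to milliseconds):

-- Approximate average elapsed time for cached plans that reference SpTable.
-- Adjust the LIKE pattern to isolate the specific spatial query of interest.
SELECT
    qs.execution_count,
    qs.total_elapsed_time / qs.execution_count / 1000.0 AS avg_elapsed_ms,
    SUBSTRING(st.text, 1, 200) AS query_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
WHERE st.text LIKE '%SpTable%'
ORDER BY avg_elapsed_ms DESC;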
It really also depends upon how selective your queries are, the parameters of your spatial index, the complexity of your polygons, and how many results (matches) you expect to receive. There’s not really a good way to guess general specs like “X rows, X amount of RAM, X processors should get Y response”. You can see how “good” your spatial index is for a specific query sample (exemplar spatial value) by using the spatial index diagnostic procedures. I wrote a series of blog entries about how to use these and interpret the results, starting here: http://www.sqlskills.com/BLOGS/BOBB/post/Using-the-spatial-index-diagnostic-stored-procedures.aspx (or go to the general page on the blog and click on the “SQL Server Spatial” category).
Cheers,
Bob
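For what it's worth, a minimal sketch of running the diagnostic procedure Bob mentions against the table, index, and sample polygon from the original post might look like this (sp_help_spatial_geometry_index is the geometry-typed variant; assumes the table is in the default schema):

-- Build the query sample from the polygon used in the original query.
DECLARE @query_sample geometry = geometry::STGeomFromText(
    'POLYGON ((-13120454.1218383 3981872.33241955, -13120454.1218383 3983477.51001331, -13118848.9442445 3983477.51001331, -13118848.9442445 3981872.33241955, -13120454.1218383 3981872.33241955))', 3857);

-- Reports how well the spatial index serves this particular query sample,
-- including primary- and internal-filter efficiency figures.
EXEC sp_help_spatial_geometry_index
    @tabname = 'SpTable',
    @indexname = 'Idx_SpTable',
    @verboseoutput = 1,
    @query_sample = @query_sample;

The filter-efficiency rows in the output are usually the first thing to look at when judging whether the index's grid settings are a good fit for the query window.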
I came across your blog post earlier and it was very helpful; however, what we are trying to find out is how our results compare to results for similar databases. The spatial table stores the US parcel polygons, and we are hoping other people will share their results.
Our intent is to define a performance target that, once we hit it, lets us stop spending resources on performance tuning.