Can you do a monthly report including multiple files? And graphs and stuff

  • Question

  • User531388329 posted

    Hi again,

    Ok, I finally got it to work thanks to you and your STRONG suggestions to read up on it; greatly appreciated, by the way.  The dilemma I'm faced with now is that I've got multiple logs, one per day, giving me a multitude of columns like: date - time - client ip - server ip - server name - server - sc-status.  Would there be a way to bunch all 30-31 logs (one per day) based on sc-status?  That's the server-to-client status, returning a status code indicating whether the page was found, had missing info, or what not.

    I thought of a GROUP BY sc-status, but that would do it per day, for example:

    logparser -i:iisw3c -o:csv "select date, time, s-ip, cs-uri-stem, sc-status into MonthlyErrorLog.log from c:\webstats\web3-filter-jul2006\filter_ex060701.log"

     

    And I also would like to make a graphic for the whole month, BUT even for just one day it gives me an odd error.  It says it's expecting the FROM keyword instead of the token 'count(*)', YET it's really, really close to the example they give!


    logparser "SELECT s-ip, sc-status count(*) as [error code] into MonthlyGraphic.gif from c:\webstats\web3-filter-jul2006\filter_ex060701.log group by sc-status order by [error code] desc" -o:chart -charttype:column3D

     

    Can anyone help me?  Thanks lots.  It's a good program, though a tad touchy on the syntax.

    Eric

    Tuesday, August 21, 2007 2:21 AM

All replies

  • User531388329 posted
    I'm sorry, I'm not sure I understand your first question. One thing I thought you might be asking is how to query from a bunch of daily log files. You can do that by using normal file globbing, e.g. in this query:
    logparser -i:iisw3c -o:csv "select date, time, s-ip, cs-uri-stem, sc-status into MonthlyErrorLog.log from c:\webstats\web3-filter-jul2006\filter_ex060701.log"
    where you have filter_ex060701.log, you could just as easily have:
    logparser -i:iisw3c -o:csv "select date, time, s-ip, cs-uri-stem, sc-status into MonthlyErrorLog.log from c:\webstats\web3-filter-jul2006\filter_ex0607*.log"

    And it will query from all logs for July.
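
    If the logs ever end up spread across more than one folder, I believe LogParser will also accept a comma-separated list of files or patterns in the FROM clause, so something like this should work too (the second path here is just a made-up example):
    logparser -i:iisw3c -o:csv "select date, time, s-ip, cs-uri-stem, sc-status into MonthlyErrorLog.log from c:\webstats\web3-filter-jul2006\filter_ex0607*.log, c:\webstats\web3-filter-aug2006\filter_ex0608*.log"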

    For your second question, you have a simple syntax error in your query: you need a comma between sc-status and count(*).
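    With the comma added, the chart query would look roughly like the sketch below. Note that s-ip would also need to be either dropped from the SELECT or added to the GROUP BY, since it isn't aggregated; I've dropped it here:
    logparser "SELECT sc-status, COUNT(*) AS [error code] INTO MonthlyGraphic.gif FROM c:\webstats\web3-filter-jul2006\filter_ex060701.log GROUP BY sc-status ORDER BY [error code] DESC" -o:chart -chartType:Column3D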
    Wednesday, August 9, 2006 12:26 PM
  • User531388329 posted

    Think I could maybe take the top ten in the same query?  Or will it put them all there, and then I'd have to sort descending by counting the number of times that sc-status (the status of the page being requested, e.g. 200 or 302, ...) appears?  I think I answered my own question by putting it in descending order.  But the problem is: can we do that with your example even if it contains 30-31 logs, and then convert that into one log that is in descending order?

    This one example:


    logparser -i:iisw3c -o:csv "select date, time, s-ip, cs-uri-stem, sc-status into MonthlyErrorLog.log from c:\webstats\web3-filter-jul2006\filter_ex0607*.log"

    ??

    2nd part of the question: Ahhhh geez!  Didn't even notice that.  I went looking for the error but no results came up, and then I thought maybe it's my ' - ' characters that are causing the problems and it's trying to subtract.  Good eye, good eye!

     

    You are either brilliant or gifted to 'parse'!

    Tx a load.

    Eric

    Wednesday, August 9, 2006 1:07 PM
  • User531388329 posted

    Another error, and it's not a comma.  All I ultimately want to do is get the top ten.  I thought of using descending order and taking the ten errors from those pages manually.  The logs are one per day for the month, so I'll take them all (thanks for telling me you can do multiple files) into one log, where I'll take the ten manually.  Then from the ten I've taken I'd do a graph, probably with another query.  The error says: Semantic Error: presence of field(s) and aggregate function(s) in the SELECT clause with no GROUP BY clause.  I thought the count would take care of that, since I'd count the number of specific errors that page gives off, for example 200, or 402, or whatever the number may be.  So I thought of counting the number of times that page comes up with errors per month by counting it with this:

    logparser -i:iisw3c -o:csv "select date, time, count(sc-status) AS ErrorTotals, s-ip, cs-uri-stem, sc-status into MonthlyErrorLog.log from c:\webstats\web3-filter-ul2006\filter_ex060*.log order by ErrorTotals desc"

    Then I'd do a graphic with it afterwards, because I'd want both, but that's putting the cart before the horse (or in this case the mule, because I'm having so many syntax errors).

    Thanks again guru!

    Eric

    Thursday, August 10, 2006 8:48 AM
  • User531388329 posted
    You can't use an aggregate function like COUNT() in a SELECT clause with non-aggregated columns unless all of the non-aggregated columns are part of a GROUP BY clause.

    Unfortunately, I still don't understand exactly what you are looking for. You say you want the top ten, but the top ten what? What do you want your graph to show? Could you describe that to me in very clear simple terms? Something along the lines of:

    "I want the x axis of the chart to show each sc-status code that represents an error, and the y axis to show the total number of records that had that sc-status code."

    That chart could be produced with this query:
    SELECT sc-status, COUNT(*) AS requests FROM ex*.log WHERE sc-status >= 400 GROUP BY sc-status
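    Wrapped up as a full command line with the chart output, that might look something like this (the path and the MonthlyStatusChart.gif name are just placeholders based on earlier posts in this thread):
    logparser -i:iisw3c "SELECT sc-status, COUNT(*) AS requests INTO MonthlyStatusChart.gif FROM c:\webstats\web3-filter-jul2006\filter_ex0607*.log WHERE sc-status >= 400 GROUP BY sc-status" -o:chart -chartType:Column3D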
    Thursday, August 10, 2006 9:33 AM
  • User531388329 posted
    Quote: You can't use an aggregate function like COUNT() in a SELECT clause with non-aggregated columns unless all of the non-aggregated columns are part of a GROUP BY clause. Unfortunately, I still don't un...

    I would like the top 10 errors (sc-status) given by these webpages (cs-uri-stem).  I'd like to have a log file to review, and I'd like it in chart form with the top 10.  I found the keyword LIMIT.  Wondering if that would do the trick?

    logparser -i:iisw3c -o:csv "select date, time, s-ip, cs-uri-stem, sc-status into MonthlyErrorLog.log from c:\webstats\web3-filter-jul2006\filter_ex060701.log LIMIT 10"  ??

    Then, I'd like to have those 10 in a visual way.

    Thanks again

    Eric

    Thursday, August 10, 2006 10:16 AM
  • User531388329 posted
    Argh!

    You say "top 10 errors (sc-status)". sc-status is not just errors. It is the status for every request. Do you want only requests with a status >= 400?

    You say "given by these webpages (cs-uri-stem)". What webpages? Do you want to only show the error status for a certain list of pages?

    The LIMIT keyword is not part of LogParser's SQL syntax. LogParser uses the TOP keyword similar to the Transact-SQL dialect.

    The thing we keep coming back to is that what you are asking for doesn't make sense. If you were looking at only sc-status codes in your log files using a query like this:
    select sc-status from ex060701.log

    then you might see results along the lines of:
    200
    200
    200
    302
    404
    200
    200
    200
    302
    304
    304
    200
    500
    501
    404
    200
    200
    200
    304
    304
    304

    Now, what are the top 10 of that list? If you changed your query to be:
    SELECT TOP 10 sc-status FROM...
    then you would get just the sc-status for the first ten records that LP parsed.
    If you said
    SELECT TOP 10 DISTINCT sc-status
    then you would get a very short list of:
    200
    302
    404
    304
    500
    501

    because those are all the distinct sc-status codes.
    If you wanted to see how many requests had each of those codes then you would use a group by statement to group on sc-status.
    SELECT sc-status, COUNT(*) FROM... GROUP BY sc-status
    that would give you:
    200 10
    302 2
    404 2
    304 5
    500 1
    501 1

    If you grouped by both sc-status and cs-uri-stem then you would get the count of requests for each page by status code.
    SELECT cs-uri-stem, sc-status, COUNT(*) FROM... GROUP BY cs-uri-stem, sc-status
    would give you:
    /index.html 200 5
    /index.html 304 3
    /mypage.xxx 404 1
    /mypage.yyy 404 1
    /moved.html 302 2
    /error.1 500 1
    /error.2 501 1
    /page2.html 200 5
    /page2.html 304 2



    Finally, you mention you want a log file to review and in prior posts you mentioned you wanted a monthly log. So, you could write a query that would do just that:
    select date, time, s-ip, cs-uri-stem, sc-status into MonthlyErrorLog.log from c:\webstats\web3-filter-jul2006\filter_ex0607*.log where sc-status >= 400

    And then run those queries I mentioned above off of the MonthlyErrorLog.log (which would mean those queries would not see any 200 or 300 status codes, since they had already been filtered out).
    Thursday, August 10, 2006 1:05 PM
  • User531388329 posted

    It must sound odd without the actual information, I know.

    The whole thing is to determine why the error page is the most popular one.  The cs-uri-stem is the page, the sc-status is the webpage status, and you can see some at: http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html

    I want to return the top 10 error codes that occur the most often, hence the 'top 10'.  These codes are given out every time, depending on whether something is missing, not found, partial content, OK (full content found), and so on.  I'd like to retrieve the top 10 errors; mind you, there might be lots of 200s, so if possible ignore all those codes that start with two hundred something (2??, for example: 202, 206, 204...), and create a LOG file from the 30-31 logs where, as you said, I can use GROUP BY cs-uri-stem (because I'd want to group by how many times the page gives off errors and NOT by the number of times that error code comes up).

    Then I'd have a LOG file with the top ten pages that give off errors (the pages that give off the most errors), so I could rectify those pages.  Those top ten would also be put into a Column3D chart showing me the problems as a visual aid.

    Hope this clears things up.  Question: I'm guessing you're not doing this for fun, because you seem to know waaaaay too much about this program.  Am I correct?

    Thanks a BUNCH!
    Eric

    Friday, August 11, 2006 8:10 AM
  • User531388329 posted
    Okay, now we are getting somewhere.

    So the first thing we need to do is collect all the daily logs into one monthly file containing only error requests. I would still suggest using sc-status >= 400, because the 300 codes are still informational, things such as "Content not modified" or redirects, rather than true errors. The 400s are errors dealing with resources that the server will not deliver: 403 Forbidden if the user doesn't have access to the page, 404 if the page doesn't exist. Errors in the 500s are the worst ones; those are errors where the server attempted to deliver the document but some sort of exception or error occurred.

    So, the query that will create the monthly error log:
    select date, time, s-ip, cs-uri-stem, sc-status into MonthlyError.log from c:\webstats\web3-filter-jul2006\filter_ex0607*.log where sc-status >= 400

    Then, that log file (which is now in CSV instead of IISW3C format) can be grouped by page and error to give error counts.
    SELECT cs-uri-stem AS Page, sc-status AS ErrorCode, COUNT(*) AS ErrorCount INTO MonthlyErrorCountsByPage.log FROM MonthlyError.log GROUP BY cs-uri-stem, sc-status

    Next, we can run a query to get the top ten list that you so richly deserve.
    SELECT TOP 10 Page, Errors USING SUM(ErrorCount) AS Errors INTO MonthlyTop10ErrorPages.log FROM MonthlyErrorCountsByPage.log GROUP BY Page ORDER BY Errors DESC

    Finally, for those bad bad pages, we might want to see that error code breakdown. We can do that with a little subselect filter of the MonthlyErrorCountsByPage log by the MonthlyTop10ErrorPages:
    SELECT Page, ErrorCode, ErrorCount FROM MonthlyErrorCountsByPage.log WHERE Page IN (SELECT Page FROM MonthlyTop10ErrorPages) ORDER BY ErrorCode, Page
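
    For reference, the first three steps wrapped as full command lines might look roughly like this (the first reads the IISW3C daily logs, the later two read the CSV files produced by the step before; the file names follow the queries above):
    logparser -i:iisw3c -o:csv "SELECT date, time, s-ip, cs-uri-stem, sc-status INTO MonthlyError.log FROM c:\webstats\web3-filter-jul2006\filter_ex0607*.log WHERE sc-status >= 400"
    logparser -i:csv -o:csv "SELECT cs-uri-stem AS Page, sc-status AS ErrorCode, COUNT(*) AS ErrorCount INTO MonthlyErrorCountsByPage.log FROM MonthlyError.log GROUP BY cs-uri-stem, sc-status"
    logparser -i:csv -o:csv "SELECT TOP 10 Page, Errors USING SUM(ErrorCount) AS Errors INTO MonthlyTop10ErrorPages.log FROM MonthlyErrorCountsByPage.log GROUP BY Page ORDER BY Errors DESC"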


    Try running through this process and see if it might deliver some statistics you find useful.

    I use LogParser frequently in my job, but my work in the forums providing support for others is just something I do for "fun". I know a lot of SQL and because of this I just grok LogParser.
    Friday, August 11, 2006 1:15 PM
  • User531388329 posted

    Great!  I've been fiddling around with what you gave me and it works out beautifully!

    Thank you so much.  Now the only problem I have is putting it into a graph.  I didn't change a thing and it's saying it can't find the specified file.  Does it have to be in the exact same syntax (order) they put it in?

    I wrote :

    logparser -i:csv "select Page, Errors as [Errors] into ErrorChart.Gif from MonthlyTop10ErrorPages Group by Errors Order by [Errors] Desc" -o:chart -Charttype:column3d

    I doubt it's the order, but I'll try following the same one as shown.

    I do the graph and I'm done, thanks for everything.

    PS: for real hard stuff, what do you do?  Take on King Kong in a wrestling match? Hahaha

    Thanks again

    Eric

    Monday, August 14, 2006 11:11 AM
  • User531388329 posted
    If it is saying file not found, it is likely another typo.
    In the query you pasted, I noticed that you say FROM MonthlyTop10ErrorPages instead of FROM MonthlyTop10ErrorPages.log... could that be it?
    Monday, August 14, 2006 11:24 AM
  • User531388329 posted

    No, I checked that.  It says it's a semantic error.

    whole thing:

    Error: Semantic Error: Select clause field-expression "Page" is not an aggregate function and does not contain GROUP BY field expressions

     

    Yet I do have one, as you can see:

    logparser -i:csv "select Page, Errors [Errors] into ErrorChart.gif from MonthlyTop10ErrorPages.log group by Errors order by [Errors] desc" -o:chart -charttype:column3d

     

    My dreams of being done one day aren't as close as I thought...

    Monday, August 14, 2006 12:13 PM
  • User531388329 posted

    Fiddled around with it and it says "Unknown field 'Page'".  But when I go into the log file itself, it's there, named Page!  Is it just me, and I'm tired after a dull weekend?  If I can't figure it out by 3:00 I'm gonna switch tasks till tomorrow.

     

    Later

    Monday, August 14, 2006 1:10 PM
  • User531388329 posted
    This is pretty much my only concern at the moment, so I worked on it till now and it keeps giving me odd errors, yet it's pretty much the same as the example.  Guess I'll pick it up tomorrow.  Any helpful suggestions?
    Monday, August 14, 2006 2:57 PM
  • User531388329 posted
    The error telling you Page is not aggregated is because you are trying to group by Errors, and I don't think you need to.

    It looks like you are just wanting to graph the results of MonthlyTop10ErrorPages.log. I'd try this query:

    logparser -i:csv "select Page, Errors into ErrorChart.gif from MonthlyTop10ErrorPages.log order by Errors desc" -o:chart -charttype:column3d

    Monday, August 14, 2006 5:02 PM
  • User531388329 posted

    Just when I think I have the logic down, I'm proven wrong !

    In the top ten errors I'd like to include the error code, but it says it's not an aggregate function and does not contain a GROUP BY field-expression.  But it's already grouped by Page, no?  I'll keep looking, but we'll see what little I turn up.  I wrote this:

    logparser -i:csv -o:csv "select top 10 Page, ErrorCode as ErrorCode, Errors Using Sum(ErrorCount) as Errors into MonthlyTop10ErrorPages.log from MonthlyErrorCountsByPage.log group by Page order by Errors Desc"

    I'm trying to add the ErrorCode column; should I group by it as well if it's already grouped by Page?

    Thanks lots!

    Tuesday, August 15, 2006 9:36 AM
  • User531388329 posted
    You can't select fields in a grouped query that are not either grouped or aggregated. Let me try an example...

    Given the following table:
    Page ErrorCode
    /a 404
    /a 403
    /a 500
    /a 501
    /b 501
    /b 501
    /b 500

    If you group by page, then you are telling LP to scan the input and grab all the rows with the same page value. So, LP will grab all the rows with the page value /a. Now at this point, it can output a single row containing the value /a for Page, but you can't tell it to output ErrorCode because it has a jumbled handful of ErrorCodes for /a.

    You have two options if you are interested in ErrorCode: you can either aggregate it or you can group by it. If you choose to aggregate it, you might tell LP you want to see the MIN error code for each Page. Then, when it goes to output the /a record, it looks through the jumble of ErrorCodes, finds the one with the minimum value, 403, and outputs that row as /a, 403. If you told it to COUNT(ErrorCode) instead, then for the /a records it would output a single row containing the grouped value /a, count the jumble of ErrorCodes in its hand, and end up with the output row /a, 4.
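    In query form, those two aggregation choices might look like this (just sketches against the sample table above, with the FROM left off like the other examples in this reply):
    SELECT Page, MIN(ErrorCode) AS LowestError GROUP BY Page
    SELECT Page, COUNT(ErrorCode) AS Errors GROUP BY Page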
    Now, if instead of aggregating ErrorCodes you decide to group by them, then you are no longer telling LP to look at all the /a's together; you are telling it to scan through the table and find every distinct grouping of page and errorcode. So, given this query, the following is the grouped output:
    select Page, ErrorCode, COUNT(*) AS Errors GROUP BY Page, ErrorCode

    /a 404 1
    /a 403 1
    /a 500 1
    /a 501 1
    /b 501 2
    /b 500 1

    Notice that given my sample data, the grouping is very sparse. /b 501 is the only grouping that actually had more than one record as part of the group.


    So, back to the real world. Given the query you posted above:
    logparser -i:csv -o:csv "select top 10 Page, ErrorCode, Errors Using Sum(ErrorCount) as Errors into MonthlyTop10ErrorPages.log from MonthlyErrorCountsByPage.log group by Page order by Errors Desc"

    If you really do want ErrorCode as part of the output, then you need to add ErrorCode to your GROUP BY clause. However, you should be able to see that if that query were run on the results of my grouping query in the last example, the output would be identical to the input. Your SUM is ineffectual because the grouping is too discrete. In my example queries on the previous page of this message thread, that is specifically why I don't have ErrorCode as part of the summing query. This query:
    SELECT Page, SUM(ErrorCount) AS Errors GROUP BY Page
    will let you see the total number of errors for each page.
    /a 4
    /b 3

    Taking a TOP 10 of that query, ordered by that sum, will give you the ten pages that had the most errors.
    Trying to shove ErrorCode into this output will mess things up, basically giving you the same output as your input.
    If you want to see the top ten page and errorcode combinations ordered by ErrorCount, then no grouping is needed:
    SELECT TOP 10 Page, ErrorCode, ErrorCount ORDER BY ErrorCount DESC
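    As a full command line against the counts file, that might look something like this (MonthlyTop10ErrorRows.log is just a placeholder name):
    logparser -i:csv -o:csv "SELECT TOP 10 Page, ErrorCode, ErrorCount INTO MonthlyTop10ErrorRows.log FROM MonthlyErrorCountsByPage.log ORDER BY ErrorCount DESC"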
    Tuesday, August 15, 2006 10:05 AM
  • User531388329 posted

    Not too sure what you meant by "no grouping", if that's what I want.  I want the pages to be grouped (so they appear once) and to get the total number of times each was counted.  It still says no GROUP BY clause.  I know it's not the beginning of the statement, as nothing has changed there.  I understand what you said, that it's already in a grouping and there is no need, so I took out what I thought was unnecessary, but hmmmm, guess I'll have to look at it tomorrow.

    I'll review my whole command line tomorrow.  Do you have any examples close to my situation?

     

    Thanks again, I'll get out of your hair and stop bothering you soon enough...

    Tuesday, August 15, 2006 1:32 PM
  • User531388329 posted

    All I want to do is have the error number appear as well, and have it ordered by number.  It's odd that it'll show the top 10 but not in order, even with an ORDER BY, as you can see:

    logparser -i:csv -o:csv "Select Top 10 Page, Errors Using Sum(ErrorCount) as Errors into MonthlyTop10ErrorPages.log from MonthlyErrorCountsByPage.log group by Page order by Errors Desc"

     

    I'm reading someone else's guide and trying to find some clues as to what's going wrong.  His guide seems to have some good tips; guess we'll see...

     

    Thanks for everything, hopefully something will turn up

    Wednesday, August 16, 2006 8:09 AM
  • User531388329 posted

    Trying to get the total of errors by error code and I can't do it, yet I can do the top 10!  I'm trying to think of how I used to query in SQL for multiple columns, but I don't think I had to put anything between them like a ' ; ' or whatnot.


    logparser -i:csv -o:csv "Select Top 10 Page, Errors Using Sum(ErrorCount) as Errors into MonthlyTop10ErrorPages.log from MonthlyErrorCountsByPage.log group by Page order by Errors Desc"

    That works for the top 10, but when I want to get a list of all the problems with a count of each, it gives me an invalid field error for the CSV file.

    logparser -i:csv -o:csv "select cs-status as ErrorCode, StatusCountTotal Using Sum(cs-status) as ErrorCount into MonthlyErrorCount.log from MonthlyErrorCountsByPage.log Group by Page Order By ErrorCode Desc"

    I just modified my other query and it gives me an error.  I'd understand if it gave me something that I didn't want, but not this!  Got an idea, parser king?

    Wednesday, August 16, 2006 1:36 PM
  • User531388329 posted

    Ok, I finally took this project seriously and found my error.  Things are gonna move now that I can put my other projects aside and work on this one.

     

    Thanks again.  I know you've read this before, but thanks, seriously.

    Thursday, August 17, 2006 7:34 AM
  • User531388329 posted
    I'm sorry, I just can't help you on this any more.
    I've told you in every way I can think of that you can't group by one thing then try to display a non-aggregated field in the same query.
    If you group by page, you can't display the errorcode because it doesn't make sense. You group all the /a page records together into one output record, and you would end up with a big jumble of errorcodes. You can't display that big jumble.

    I don't mean to insult you, but I've explained every way I can think of that might give you the type of data I think you want, and I provided lots of example queries in my long replies to you. You keep coming back to this query where you group by Page and try to display ErrorCode and the sum of Errors. Did you ever try running the example queries I gave you?
    Thursday, August 17, 2006 11:32 AM
  • User531388329 posted

    No, no, no, you thought I was being sarcastic, but I really wasn't.  I'm actually looking at my SQL, and all along I thought it was a parser error.  Now I'm actually going through my code and focusing more on getting this done properly, not trying to rush through it.  I thank you lots for your help, and if I have any more questions I'll ask AFTER I give it a try.

    You honestly are a great help and I do thank you for it.  I'm not saying I might not need help later on, but now I'm really going to get to it before asking.

    Thanks lots!

    Thursday, August 17, 2006 11:59 AM
  • User531388329 posted

    Hi again,

    Things are moving along just great.  I might have asked you this before: could I get the top 10 error pages that don't repeat themselves, or can't that be done because you're already doing a GROUP BY on pages?

    I've fiddled around with it and it gives me all 302s.  I've taken away all the 200s (the OKs), but it gives me ten 302s, all from the same spot, visited 10 different times and giving off a 302 every time.  Would there be a way of using a DISTINCT in there?  It works, but when I try to put in a DISTINCT it messes up.

    logparser -i:csv -o:csv "select top 10 date, time, s-ip, cs-uri-stem, sc-status into LegisInfoTop10Errors2006.log from July2006MonthlyErrorLog.log where cs-uri-stem = '/legisinfo/index.asp' and sc-status <> 200"

     

    I have learned and tried out various ways, and I have taken your advice about reading!  I just want to make sure whether I can or can't select the pages (the field I want) and give their errors, but distinct errors (for example, not all 302s).  I just thought of doing it by error code, but then it would take a query for every possible error.

     

    Oh, parser king, think you can shed some light?

    Thanks

    Monday, August 21, 2006 1:09 PM