Wednesday, March 7, 2012 03:34
TRUNCATE TABLE Mobile.dbo.[Customer]

INSERT INTO Mobile.dbo.[Customer]
SELECT
    CST.[No_], CST.[Name], CST.[Address], CST.[Address 2], CST.[Post Code], CST.[City], CST.[County] AS [State],
    CST.[Country_Region Code] AS [Country Code], CST.[Contact], CST.[Phone No_], CST.[Fax No_], CST.[Telex No_] AS [Mobile No_],
    CST.[E-Mail], CST.[Home Page], CAST(CST.[Credit Limit] AS DECIMAL(38,2)) AS [Credit Limit],
    CASE
        WHEN CAST(INV.[Amount] AS DECIMAL) IS NOT NULL THEN CAST(INV.[Amount] AS DECIMAL(38,2))
    END AS [Balance],
    CST.[Salesperson Code], CST.[Payment Terms], CST.[Payment Code], CST.[Zone Code]
FROM Live.dbo.[Customer] AS CST
LEFT JOIN (
    SELECT ORIamt.[Customer No_], SUM(ORIamt.[Amount]) AS [Amount]
    FROM (
        SELECT CLE.[Entry No_], CLE.[Customer No_], CLE.[Document No_], DCLE.[Amount]
        FROM Live.dbo.[Entry] AS CLE
        INNER JOIN (
            SELECT [Cust_ Entry No_], SUM([Amount]) AS [Amount]
            FROM Live.dbo.[Detailed Entry]
            GROUP BY [Cust_ Entry No_]
        ) AS DCLE ON CLE.[Entry No_] = DCLE.[Cust_ Entry No_]
    ) AS ORIamt
    GROUP BY ORIamt.[Customer No_]
) AS INV ON INV.[Customer No_] = CST.[No_]
All replies
Wednesday, March 7, 2012 03:47
Make sure that this database's recovery model is set to Simple.
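The recovery model change suggested here can be applied with ALTER DATABASE; a minimal sketch, assuming the target database is named Mobile as in the query above:

```sql
-- Switch the Mobile database to the SIMPLE recovery model so the log
-- can be truncated automatically at checkpoints.
ALTER DATABASE Mobile SET RECOVERY SIMPLE;
```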
For every expert, there is an equal and opposite expert. - Becker's Law
Wednesday, March 7, 2012 04:47
Thanks for the response, Naomi. I have already set it to Simple, but the log file still grows quite fast.
Wednesday, March 7, 2012 04:52
Sounds like your batch is too large. Try breaking the insert statement up into multiple transactions of smaller batches and issuing checkpoints between them. Even in Simple mode everything must be written to the log file first, and the log cannot wrap around past an open transaction in case that transaction has to roll back.
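The batching idea above could be sketched like this; it is only an outline, not the poster's actual code. The table and key names follow the query at the top of the thread, the batch copies a simplified column list, and @BatchSize is an assumed value you would tune:

```sql
-- Hedged sketch: copy rows in small committed batches so that, under the
-- SIMPLE recovery model, the log space can be reused between batches.
DECLARE @BatchSize INT = 5000;  -- assumed batch size; tune as needed
DECLARE @Rows INT = 1;

WHILE @Rows > 0
BEGIN
    BEGIN TRANSACTION;

    -- Insert only rows not yet copied (assumes [No_] is the key).
    INSERT INTO Mobile.dbo.[Customer]
    SELECT TOP (@BatchSize) CST.*
    FROM Live.dbo.[Customer] AS CST
    WHERE NOT EXISTS (SELECT 1
                      FROM Mobile.dbo.[Customer] AS M
                      WHERE M.[No_] = CST.[No_]);

    SET @Rows = @@ROWCOUNT;

    COMMIT TRANSACTION;
    CHECKPOINT;  -- lets SIMPLE recovery truncate the inactive log
END;
```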
Wednesday, March 7, 2012 05:53
Hi Sean, actually it is not too large, but my client asked that this mobile log file not grow so fast, because this database is mostly for displaying data only; all the transactions have already been passed to another database.
Wednesday, March 7, 2012 08:09
If the log is growing too large for your requirements even in Simple mode, then, simply put, the transaction is modifying too much data at once and not allowing the log to wrap around fast enough, which causes it to grow. What size is the log after the insert completes?
SQL Server uses write-ahead logging, so any insert, update, or delete is written to the log file first.
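To answer the question about log size, the current size and percentage used of every database's log can be inspected with a standard DBCC command:

```sql
-- Report log file size (MB) and percent of log space in use
-- for each database on the instance.
DBCC SQLPERF (LOGSPACE);
```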
Thursday, March 8, 2012 01:36
Hi Sean, I have decided to change it to SELECT INTO. Before the SELECT INTO is triggered, I drop the table first, and I change the database recovery model to bulk-logged. Do you think that is better?
Wednesday, March 14, 2012 03:17
Sorry for the delay in my response.
Since SELECT INTO is a minimally logged operation, switching to bulk-logged, running the SELECT INTO statement, then switching back to full recovery mode and running a differential backup should help.
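The sequence described above could look roughly like the following. This is a sketch under assumptions: the Mobile/Live names come from the query at the top of the thread, the column list is simplified to CST.*, and the backup path is a placeholder you would replace:

```sql
-- Hedged sketch of the drop / bulk-logged / SELECT INTO sequence.
ALTER DATABASE Mobile SET RECOVERY BULK_LOGGED;

IF OBJECT_ID('Mobile.dbo.Customer', 'U') IS NOT NULL
    DROP TABLE Mobile.dbo.[Customer];

-- SELECT INTO is minimally logged under BULK_LOGGED recovery.
SELECT CST.*
INTO Mobile.dbo.[Customer]
FROM Live.dbo.[Customer] AS CST;

ALTER DATABASE Mobile SET RECOVERY FULL;

-- A differential backup afterwards keeps the backup chain usable;
-- the path below is a placeholder.
BACKUP DATABASE Mobile
    TO DISK = N'X:\Backups\Mobile_diff.bak'
    WITH DIFFERENTIAL;
```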
- Marked as answer by Maggie Luo (Microsoft Contingent Staff, Moderator), Monday, March 19, 2012 10:00