SqlTransaction has completed

后悔当初 2020-12-03 10:05

I have an application which potentially does thousands of inserts to a SQL Server 2005 database. If an insert fails for any reason (foreign key constraint, field length, etc.) …

9 answers
  • 2020-12-03 10:41

    This exception is thrown because the actual DB transaction has already been rolled back, so the .NET object representing it on the client side is already a "zombie".

    A more detailed explanation is here. That post explains how to write correct transaction rollback code for such scenarios.
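
    A minimal sketch of that guarded-rollback pattern, assuming a SqlCommand (cmd) and SqlTransaction (tran) already created by the caller:

        // Only roll back if the transaction is still alive on the client.
        // If the server has already rolled it back, tran.Connection is null
        // and calling Rollback() throws the "zombie" InvalidOperationException.
        try
        {
            cmd.Transaction = tran;
            cmd.ExecuteNonQuery();
            tran.Commit();
        }
        catch (SqlException)
        {
            if (tran.Connection != null)
            {
                try { tran.Rollback(); }
                catch (InvalidOperationException) { /* already rolled back on the server */ }
            }
        }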

  • 2020-12-03 10:43

    Keep in mind that your application isn't the only participant in the transaction - the SQL Server is involved as well.

    The error you quote:

    This SqlTransaction has completed; it is no longer usable. at System.Data.SqlClient.SqlTransaction.ZombieCheck() at System.Data.SqlClient.SqlTransaction.Commit()

    doesn't indicate the transaction has committed, only that it is complete.

    My first suggestion is that your server has killed off the transaction because it either took too long (elapsed wall time) or got too large (too many changes or too many locks).

    My second suggestion is to check that you're cleaning up connections and transactions appropriately. It's possible that you're running into problems because you are occasionally exhausting a pool of some resource before things get automatically recycled.

    For example, DbConnection implements IDisposable, so you need to ensure you clean up appropriately - with a using statement if you can, or by calling Dispose() directly if you can't. DbCommand is similar, as it also implements IDisposable.
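
    For illustration, a minimal cleanup sketch (the connection string, table name, and parameter value are placeholders):

        // The using blocks guarantee Dispose() even when an insert throws,
        // returning the connection to the pool instead of leaking it.
        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (SqlTransaction tran = conn.BeginTransaction())
            using (SqlCommand cmd = conn.CreateCommand())
            {
                cmd.Transaction = tran;
                cmd.CommandText = "INSERT INTO dbo.SomeTable (Col1) VALUES (@p1)";
                cmd.Parameters.AddWithValue("@p1", someValue);
                cmd.ExecuteNonQuery();
                tran.Commit();
            }
        }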

  • 2020-12-03 11:00

    You have proved your data is OK beyond all reasonable doubt.
    FWIW, I would prefer to move the insert into a stored procedure and not use the transaction at all.
    If you need the UI to stay responsive, use a background worker to do the database grunt work (see the sketch below).
    To me, a transaction is for interrelated activities, not a time-saving device. The insertion cost has to be paid somewhere along the line.

    I recently used ANTS Profiler on a database application and was amazed to see intermittent SqlClient exceptions showing up in solidly performing code. The errors occur deep in the framework when opening a connection; they never make it to the surface and aren't detectable by the client code. So... the point?
    It is not all rock solid out there; move the heavy work off the UI and accept the cost. HTH, Bob
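
    A minimal sketch of pushing the database work off the UI thread with BackgroundWorker, assuming a WinForms UI and a DoInserts method that wraps the bulk-insert routine (both are placeholders):

        // Run the inserts on a worker thread so the UI stays responsive,
        // then report success or failure back on the UI thread.
        BackgroundWorker worker = new BackgroundWorker();
        worker.DoWork += delegate { DoInserts(); };
        worker.RunWorkerCompleted += delegate(object sender, RunWorkerCompletedEventArgs e)
        {
            if (e.Error != null)
                MessageBox.Show(e.Error.Message);   // surface any SqlException to the user
        };
        worker.RunWorkerAsync();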

  • 2020-12-03 11:01

    According to this post: http://blogs.msdn.com/b/dataaccesstechnologies/archive/2010/08/24/zombie-check-on-transaction-error-this-sqltransaction-has-completed-it-is-no-longer-usable.aspx

    A temporary solution is to wrap the rollback or commit operation in a try/catch. The following code is enough to stop the exception from being thrown:

        // Rolls back, swallowing the exception thrown when the transaction has
        // already been rolled back on the server (the "zombie" case).
        public static void TryRollback(this System.Data.IDbTransaction t)
        {
            try
            {
                t.Rollback();
            }
            catch (Exception ex)
            {
                // log error in my case
            }
        }

        // Commits, swallowing the exception in the same way.
        public static void TryCommit(this System.Data.IDbTransaction t)
        {
            try
            {
                t.Commit();
            }
            catch (Exception ex)
            {
                // log error in my case
            }
        }
    

    Check this example from the MSDN website: http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqltransaction.aspx
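
    For illustration, a usage sketch of the extensions above (cmd and tran are assumed to be an existing SqlCommand and SqlTransaction):

        try
        {
            cmd.ExecuteNonQuery();
            tran.TryCommit();
        }
        catch (SqlException)
        {
            tran.TryRollback();   // safe even if the server has already rolled the transaction back
            throw;
        }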

  • 2020-12-03 11:06

    I also faced the same problem.

    This SqlTransaction has completed; it is no longer usable
    

    After doing some research, I found that my database log file was around 40 GB.
    That large volume of log data was causing my SQL transactions to fail to commit.

    So my solution was to shrink the log file, following this reference link.

        CREATE PROCEDURE sp_ShrinkLog_ByDataBaseName
        (
            @DataBaseName VARCHAR(200)
        )
        AS
        BEGIN

            DECLARE @DBName       VARCHAR(200)
            DECLARE @LogFileName  VARCHAR(200)
            SET @DBName = @DataBaseName

            SELECT @LogFileName = [Logical File Name]
            FROM
            (
                SELECT
                    DB_NAME(database_id) AS [Database Name],
                    type_desc            AS [File Type],
                    name                 AS [Logical File Name],
                    physical_name        AS [Physical File],
                    state_desc           AS [State]
                FROM sys.master_files
                WHERE database_id = DB_ID(@DBName)
                  AND type_desc = 'LOG'
            ) AS SystemInfo

            SELECT LogFileName = @LogFileName

            EXECUTE(
            'USE ' + @DBName + ';

            -- Truncate the log by changing the database recovery model to SIMPLE.
            ALTER DATABASE ' + @DBName + '
            SET RECOVERY SIMPLE;

            -- Shrink the truncated log file to 1 MB.
            DBCC SHRINKFILE (' + @LogFileName + ', 1);

            -- Reset the database recovery model.
            ALTER DATABASE ' + @DBName + '
            SET RECOVERY FULL;
            ')
        END

    Then execute it: EXEC sp_ShrinkLog_ByDataBaseName 'Your_DB_NAME'.

    If this does not work for you, here is another solution which I think will.

  • 2020-12-03 11:07

    Thanks for all the feedback. I've been working with someone from MSFT on the MSDN forums to figure out what's going on. It turns out the issue is caused by one of the inserts failing because of a datetime conversion problem.

    The odd part is that this error shows up only when the failure is a date conversion error; another error, such as a field being too long, doesn't cause it. In both cases I would expect the transaction to still exist so I could call Rollback on it.

    I have a full sample program that replicates this issue. If anyone wishes to see it, or the exchange with MSFT, you can find the thread on MSFT's newsgroups in microsoft.public.dotnet.framework.adonet, under the "SqlTransaction.ZombieCheck error" thread.
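
    (Not that sample program, but a minimal sketch of the pattern described; the table, column, and connection string are assumptions:)

        // When the failing statement dooms the transaction on the server (as the
        // datetime conversion error did here), an unguarded Rollback() afterwards
        // trips the zombie check, so test tran.Connection first.
        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            conn.Open();
            SqlTransaction tran = conn.BeginTransaction();
            try
            {
                SqlCommand cmd = new SqlCommand(
                    "INSERT INTO dbo.SomeTable (SomeDate) VALUES ('not-a-date')",
                    conn, tran);
                cmd.ExecuteNonQuery();   // fails: cannot convert the string to datetime
                tran.Commit();
            }
            catch (SqlException)
            {
                if (tran.Connection != null)   // guard against the zombie transaction
                    tran.Rollback();
            }
        }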
