August 04, 2008

Blocking and Deadlocking

Blocking
Blocking is a necessary side effect of using locks to control concurrent resource access. A lock is either compatible or incompatible with other locks, depending on the lock mode. Before a lock is granted, SQL Server first checks whether an incompatible lock exists on the same resource. If not, the lock request is granted and execution proceeds. When the requested lock mode is incompatible with an existing lock mode on the same resource, the requesting session is blocked and waits until one of the following occurs (a sketch of such a scenario follows the list):
1) The blocking session releases the lock
2) The LOCK_TIMEOUT threshold of the blocked session is reached (indefinite by default)
3) The blocked client cancels the query
4) The blocked client times out (30-second default in ADO/ADO.NET, technically the same as #3)
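For illustration, here is a minimal sketch of such a blocking episode. The table dbo.Account and its columns are hypothetical, purely for the example:

-- Session 1: takes an exclusive (X) row lock and holds it by leaving the transaction open
BEGIN TRAN;
UPDATE dbo.Account SET Balance = Balance - 100 WHERE AccountID = 1;
-- no COMMIT yet, so the X lock is still held

-- Session 2 (a separate connection): requests an incompatible lock on the same row
SET LOCK_TIMEOUT 5000;  -- wait at most 5 seconds instead of the indefinite default
UPDATE dbo.Account SET Balance = Balance + 100 WHERE AccountID = 1;
-- This statement blocks. If session 1 commits or rolls back within 5 seconds,
-- it proceeds; otherwise error 1222 (Lock request time out period exceeded) is raised.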

Deadlocking
A deadlock is a special blocking scenario in which two sessions are waiting on each other, directly or indirectly. Neither can proceed, so both would wait indefinitely unless a timeout or intervention occurs. Unlike a normal blocking scenario, SQL Server intervenes when it detects a deadlock and cancels one of the transactions involved. The locks of the cancelled transaction are then released so the other blocked session can proceed. By default, SQL Server chooses the transaction that is the least expensive to roll back as the deadlock victim. If SET DEADLOCK_PRIORITY has been issued, SQL Server chooses the session with the lowest priority as the victim.
A deadlock always starts as a normal block, with one session waiting while the other continues. It is only when the running session is later blocked by the waiting session that the deadlock occurs.
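A minimal sketch of the classic deadlock, assuming two hypothetical tables dbo.TableA and dbo.TableB, each with ID and Col1 columns:

-- Session 1
SET DEADLOCK_PRIORITY LOW;  -- volunteer to be the victim if a deadlock occurs
BEGIN TRAN;
UPDATE dbo.TableA SET Col1 = 1 WHERE ID = 1;  -- locks a row in TableA

-- Session 2 (a separate connection)
-- BEGIN TRAN;
-- UPDATE dbo.TableB SET Col1 = 1 WHERE ID = 1;  -- locks a row in TableB

-- Session 1 continues: blocks, waiting on session 2 (a normal block so far)
UPDATE dbo.TableB SET Col1 = 2 WHERE ID = 1;

-- Session 2 continues: blocks, waiting on session 1; now a cycle exists
-- UPDATE dbo.TableA SET Col1 = 2 WHERE ID = 1;

-- SQL Server detects the cycle, rolls back one transaction (here session 1,
-- because of its LOW priority), and raises error 1205 in that session:
-- "Transaction (Process ID ...) was deadlocked on lock resources with another
-- process and has been chosen as the deadlock victim. Rerun the transaction."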

You can view session and lock information during a blocking episode by executing sp_who2 and sp_lock from a new SSMS query window, or by using the Activity Monitor in the SSMS Object Explorer.
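For example, from a new query window (column names below are as they appear in SQL Server 2005/2008):

EXEC sp_who2;  -- the BlkBy column shows the SPID that is blocking a session
EXEC sp_lock;  -- lists granted (GRANT) and waiting (WAIT) lock requests per SPID

-- On SQL Server 2005 and later, the dynamic management views expose the same info:
SELECT session_id, blocking_session_id, wait_type, wait_time
FROM sys.dm_exec_requests
WHERE blocking_session_id <> 0;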
