Setting DEADLOCK_PRIORITY HIGH

At a customer's site, during their billing and exporting routine, which handles hundreds of orders, our software received the error: “Transaction was deadlocked on lock resources with another process and has been chosen as deadlock victim. Rerun the transaction (40001). Attempting to access a record from the dbo.orderrecords file. Returning to previous window.”

As far as I can tell, this is an MS SQL Server generated error. I found that within T-SQL I can use “SET DEADLOCK_PRIORITY HIGH” to ensure my billing process is never chosen as the deadlock victim again. However, neither the help, Google nor the search on Clarion Hub points me to a way of setting this high priority from within Clarion.

Our programmers use plain Clarion commands for their data access; hardly any raw SQL commands are used.

Is there also a way of detecting these errors within the code, so the routines can retry their commands in a loop and break out once there is no longer a deadlock?

Thank you in advance for your time & shared knowledge.

It's not a SQL Server error per se, as it's not listed here:
Database engine events and errors - SQL Server | Microsoft Docs

but it is an ODBC error:

Appendix A: ODBC Error Codes - ODBC API Reference | Microsoft Docs

| SQLSTATE | Error | Can be returned from |
| --- | --- | --- |
| 40001 | Serialization failure | **SQLBulkOperations**, **SQLColumnPrivileges**, **SQLColumns**, **SQLEndTran**, **SQLExecDirect**, **SQLExecute**, **SQLFetch**, **SQLFetchScroll**, **SQLForeignKeys**, **SQLGetTypeInfo**, **SQLMoreResults**, **SQLParamData**, **SQLPrimaryKeys**, **SQLProcedureColumns**, **SQLProcedures**, **SQLSetPos**, **SQLSpecialColumns**, **SQLStatistics**, **SQLTablePrivileges**, **SQLTables** |

SET DEADLOCK_PRIORITY (Transact-SQL) - SQL Server | Microsoft Docs

You should be able to do this by creating a stored procedure which issues this T-SQL command, and then calling the stored procedure from within your Clarion code, which we can do.
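
Roughly along these lines (only a sketch; dbo.BillOrders is a made-up procedure name, and OrderRecords stands for whichever DCT file sits on the right connection):

```clarion
! Sketch: dbo.BillOrders would begin with SET DEADLOCK_PRIORITY HIGH and then
! run the billing statements, so the setting is in effect while the work that
! can deadlock executes. PROP:SQL sends the statement over the connection of
! the named DCT file (the driver docs also describe a CALL syntax for procs).
OrderRecords{PROP:SQL} = 'EXEC dbo.BillOrders'
IF ERRORCODE()
  ! FILEERROR() / FILEERRORCODE() carry the backend message and error number
  MESSAGE('Billing procedure failed: ' & FILEERROR())
END
```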

That's probably how I would attack it, anyway.

Hi, when you use SQL commands, can you try WITH (NOLOCK)? Maybe that helps.

This is a very bad idea, especially as he is already intentionally running in a transaction.

It’s very likely your application is causing the deadlock, hopefully between two different instances. I would put the effort into determining where the deadlock is coming from and trying to resolve it.
sp_BlitzLock is part of the free SQL Server First Responder Kit, and you can use it to examine your deadlock after the fact. It helps track down which objects were involved in the deadlock and will hopefully point you to where to look in your code base.

One other thing I will add: when I was trying to import thousands of millions of records, the only acceptable approach, if you can even call it that considering the time it still took, was the bcp utility.
Bulk Import and Export of Data (SQL Server) - SQL Server | Microsoft Docs (mins)

A stored procedure using T-SQL BULK INSERT was too slow.

BULK INSERT (Transact-SQL) - SQL Server | Microsoft Docs (hours/days)

And adding the records individually from within a LOGOUT transaction using the Clarion MS SQL driver was the slowest (weeks).

As it's possible to call bcp from the command line, this may be the only acceptable way to keep the speed up, if it's that critical.
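
For example (just a sketch; server, database and output path are placeholders, and -T assumes a trusted Windows connection), the export can be kicked off straight from Clarion with RUN:

```clarion
! Shell out to the bcp utility: -c = character format, -T = trusted
! connection. The second RUN parameter asks Clarion to wait for the
! command to finish before carrying on.
RUN('bcp MyDb.dbo.OrderRecords out C:\Temp\orders.dat -S MyServer -T -c', 1)
```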

Obviously CPU, RAM and disk type were also factors here, and in my case I was running this off a development laptop, which does not have good performance characteristics.

I would love to, however my programmers don’t like calling T-SQL and the company does not want to use stored procedures… It’s a battle I’ve lost many times over.

We still use TryInsert and the like to talk to the MS SQL database. SQL commands are frowned upon.

That would have been an option; however, it has happened once now, the first time in about 6 months, so having the user start a tracing program ahead of every billing round is not viable. I was hoping for a result code that the program could act upon. I do not know if the ‘benign’ would suffice.
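
Something along these lines is what I am hoping is possible (only a rough sketch; I am assuming the driver reports the deadlock as Clarion error 90 with the backend detail available through FILEERRORCODE()/FILEERROR(), and ProcessBillingBatch stands in for our existing routine):

```clarion
Tries# = 0
LOOP
  Tries# += 1
  LOGOUT(1, OrderRecords)            ! our existing transaction frame
  DO ProcessBillingBatch             ! placeholder for the current TryInsert/Put logic
  ! Real code would check after each file operation; 1205 is SQL Server's
  ! native "chosen as deadlock victim" error (ODBC reports SQLSTATE 40001).
  IF ERRORCODE() = 90 AND FILEERRORCODE() = 1205
    ROLLBACK                         ! we lost the deadlock - undo and maybe retry
    IF Tries# < 3 THEN CYCLE.
    MESSAGE('Still deadlocking after ' & Tries# & ' attempts: ' & FILEERROR())
    BREAK
  END
  COMMIT
  BREAK
END
```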

Interesting reading, thank you. We are, however, just using the SQL database without issuing SQL commands; it is all still Clarion based, as if they were TPS files. With T-SQL commands I would set the deadlock priority to HIGH for these conditions.
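
If we were ever allowed to send a single raw statement, I gather it would only take something like this just before the billing run (untested; OrderRecords is one of our DCT files, and whether the setting holds for the whole run depends on the driver keeping that same connection):

```clarion
! Send the session-level setting over the connection the OrderRecords file uses.
OrderRecords{PROP:SQL} = 'SET DEADLOCK_PRIORITY HIGH'
IF ERRORCODE() THEN MESSAGE('Could not raise deadlock priority: ' & FILEERROR()).
```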

Stored procedures are very, very fast compared to using a Clarion process, although the logic does end up shared between Clarion and SQL Server once code is moved into a stored procedure. Another way that might stop the deadlock problem is to have some additional tables in the DCT and in SQL Server which are used for this batch process.

Copy the records/rows in question over into the temporary batch tables, process them using Clarion, then copy them back. That way it shouldn't matter or interfere with anything else. Some companies run a batch process like this as an End of Day function.
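
A rough sketch of the shape of it (table and column names are invented; OrderBatch would be a new table declared in both the DCT and SQL Server):

```clarion
! The copy in and out can be two set-based statements (or plain Clarion
! loops, if raw SQL really is off the table at your site).
! 1) Snapshot the rows to be billed into the work table.
OrderBatch{PROP:SQL} = 'INSERT INTO dbo.OrderBatch (OrderId, Amount) ' & |
                       'SELECT OrderId, Amount FROM dbo.OrderRecords WHERE Billed = 0'
! 2) Run the existing Clarion billing process against OrderBatch only, so it
!    never holds locks on the live dbo.OrderRecords rows while it works.
! 3) Write the results back in one statement at the end.
OrderBatch{PROP:SQL} = 'UPDATE o SET o.Billed = 1 FROM dbo.OrderRecords o ' & |
                       'JOIN dbo.OrderBatch b ON b.OrderId = o.OrderId'
```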

I saw this a lot with some IBM mainframes: once all the day-to-day functions (data input, orders received, orders sent) had been completed, an End of Day process was initiated which did all the totalling and processing, but it required everyone on site to be logged out of the system; once it completed, everyone could log back in.

Admittedly that's 30+ years ago and we are now, more than ever, a 24-hour society, but that might be another option.