SQL Server queries beyond a certain character length are timing out

Submitted by 淺唱寂寞╮ on 2021-02-11 16:30:26

Question


Recently I migrated an ETL platform from Python 2.7 to 3 and also updated its Docker image from Ubuntu 16.04 to 18.04.

After the migration I started encountering some really strange behavior when trying to reflect SQL Server tables using SQLAlchemy with the pyodbc driver.
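For context, the reflection is driven by code along these lines — the server, database, and credentials below are made-up placeholders, not the real ones:

```python
from urllib.parse import quote_plus

# Raw ODBC connection string for the Microsoft driver (placeholder host/creds).
odbc_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.example.com,1433;"
    "DATABASE=mydb;"
    "UID=etl_user;"
    "PWD=secret"
)

# SQLAlchemy accepts a raw ODBC string URL-encoded via the odbc_connect parameter.
url = "mssql+pyodbc:///?odbc_connect=" + quote_plus(odbc_str)

# The reflection itself needs a live server, so it is only sketched here:
#   from sqlalchemy import create_engine, MetaData, Table
#   engine = create_engine(url)
#   meta = MetaData()
#   table = Table("my_table", meta, autoload=True, autoload_with=engine, schema="dbo")
```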

Initially, I was seeing this error message:

[08S01] [FreeTDS][SQL Server]Read from the server failed (20004) (SQLExecDirectW)

I had updated SQLAlchemy from 0.9.8 to 1.3.10 (I hadn't touched pyodbc, which was at 3.0.10), so I followed the advice in the documentation and changed the ODBC driver from FreeTDS 7.0 to Microsoft ODBC Driver 17. All that did was change the error message to this:
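The driver swap amounted to changing the DRIVER keyword (and dropping the TDS-specific keywords) in the connection string — roughly as follows, with placeholder host and credentials:

```python
# FreeTDS-style connection string; TDS_Version pins the wire-protocol level.
freetds_str = (
    "DRIVER={FreeTDS};"
    "SERVER=myserver.example.com;PORT=1433;"
    "DATABASE=mydb;UID=etl_user;PWD=secret;"
    "TDS_Version=7.0;UseNTLMv2=true"
)

# Equivalent string for the Microsoft driver; no TDS_Version keyword,
# since the driver negotiates the protocol version itself.
msodbc_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.example.com,1433;"
    "DATABASE=mydb;UID=etl_user;PWD=secret"
)
```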

[08S01] [Microsoft][ODBC Driver 17 for SQL Server]TCP Provider: Error code 0x274C (10060) (SQLExecDirectW)

I confirmed that I could log in to the server using isql, so I tried directly running the query that SQLAlchemy issues to get the table info:

SELECT 
[INFORMATION_SCHEMA].[COLUMNS].[TABLE_SCHEMA]
, [INFORMATION_SCHEMA].[COLUMNS].[TABLE_NAME]
, [INFORMATION_SCHEMA].[COLUMNS].[COLUMN_NAME]
, [INFORMATION_SCHEMA].[COLUMNS].[IS_NULLABLE]
, [INFORMATION_SCHEMA].[COLUMNS].[DATA_TYPE]
, [INFORMATION_SCHEMA].[COLUMNS].[ORDINAL_POSITION]
, [INFORMATION_SCHEMA].[COLUMNS].[CHARACTER_MAXIMUM_LENGTH]
, [INFORMATION_SCHEMA].[COLUMNS].[NUMERIC_PRECISION]
, [INFORMATION_SCHEMA].[COLUMNS].[NUMERIC_SCALE]
, [INFORMATION_SCHEMA].[COLUMNS].[COLUMN_DEFAULT]
, [INFORMATION_SCHEMA].[COLUMNS].[COLLATION_NAME] 
FROM 
[INFORMATION_SCHEMA].[COLUMNS]
WHERE 
[INFORMATION_SCHEMA].[COLUMNS].[TABLE_NAME] = CAST(<table name> AS NVARCHAR(max)) 
AND [INFORMATION_SCHEMA].[COLUMNS].[TABLE_SCHEMA] = CAST(<schema name> AS NVARCHAR(max)) 
ORDER BY [INFORMATION_SCHEMA].[COLUMNS].[ORDINAL_POSITION];

When I run this query, the connection just hangs and eventually times out. I started playing around with the query and discovered that I can run queries just fine as long as they stay under a certain character limit. For example:

-- This works (and returns all columns)
SELECT
*
FROM 
[INFORMATION_SCHEMA].[COLUMNS]
WHERE 
[INFORMATION_SCHEMA].[COLUMNS].[TABLE_NAME] = CAST(<table name> AS NVARCHAR(max)) 
AND [INFORMATION_SCHEMA].[COLUMNS].[TABLE_SCHEMA] = CAST(<schema name> AS NVARCHAR(max)) 
ORDER BY [INFORMATION_SCHEMA].[COLUMNS].[ORDINAL_POSITION];

-- This also works
SELECT 
[INFORMATION_SCHEMA].[COLUMNS].[TABLE_SCHEMA]
, [INFORMATION_SCHEMA].[COLUMNS].[TABLE_NAME]
, [INFORMATION_SCHEMA].[COLUMNS].[COLUMN_NAME]
, [INFORMATION_SCHEMA].[COLUMNS].[IS_NULLABLE]
, [INFORMATION_SCHEMA].[COLUMNS].[DATA_TYPE]
, [INFORMATION_SCHEMA].[COLUMNS].[ORDINAL_POSITION]
, [INFORMATION_SCHEMA].[COLUMNS].[CHARACTER_MAXIMUM_LENGTH]
FROM 
[INFORMATION_SCHEMA].[COLUMNS]
WHERE 
[INFORMATION_SCHEMA].[COLUMNS].[TABLE_NAME] = CAST(<table name> AS NVARCHAR(max)) 
AND [INFORMATION_SCHEMA].[COLUMNS].[TABLE_SCHEMA] = CAST(<schema name> AS NVARCHAR(max)) 
ORDER BY [INFORMATION_SCHEMA].[COLUMNS].[ORDINAL_POSITION];

-- This causes the connection to hang after adding the line
-- , [INFORMATION_SCHEMA].[COLUMNS].[NUMERIC_PRECISION]
SELECT 
[INFORMATION_SCHEMA].[COLUMNS].[TABLE_SCHEMA]
, [INFORMATION_SCHEMA].[COLUMNS].[TABLE_NAME]
, [INFORMATION_SCHEMA].[COLUMNS].[COLUMN_NAME]
, [INFORMATION_SCHEMA].[COLUMNS].[IS_NULLABLE]
, [INFORMATION_SCHEMA].[COLUMNS].[DATA_TYPE]
, [INFORMATION_SCHEMA].[COLUMNS].[ORDINAL_POSITION]
, [INFORMATION_SCHEMA].[COLUMNS].[CHARACTER_MAXIMUM_LENGTH]
, [INFORMATION_SCHEMA].[COLUMNS].[NUMERIC_PRECISION]
FROM 
[INFORMATION_SCHEMA].[COLUMNS]
WHERE 
[INFORMATION_SCHEMA].[COLUMNS].[TABLE_NAME] = CAST(<table name> AS NVARCHAR(max)) 
AND [INFORMATION_SCHEMA].[COLUMNS].[TABLE_SCHEMA] = CAST(<schema name> AS NVARCHAR(max)) 
ORDER BY [INFORMATION_SCHEMA].[COLUMNS].[ORDINAL_POSITION];
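One way to pin down the exact threshold suggested by the examples above is to pad a trivial query with a trailing comment and bisect on the total length. A sketch — the pyodbc execution is stubbed out, since it needs a live connection:

```python
def padded_query(total_len: int) -> str:
    """Pad a trivial query with a trailing comment to an exact character length."""
    base = "SELECT 1 --"
    if total_len < len(base):
        raise ValueError("total_len too small")
    return base + "x" * (total_len - len(base))

def find_threshold(runs_ok, lo: int = 100, hi: int = 10000) -> int:
    """Bisect for the smallest query length that fails.

    `runs_ok` is a callable taking a query string and returning True on
    success; in practice it would wrap cursor.execute() with a short timeout.
    """
    while lo < hi:
        mid = (lo + hi) // 2
        if runs_ok(padded_query(mid)):
            lo = mid + 1  # still succeeds: threshold is above mid
        else:
            hi = mid      # fails: threshold is at or below mid
    return lo

# Against a live server it would be driven like this (placeholder odbc_str):
#   import pyodbc
#   conn = pyodbc.connect(odbc_str, timeout=5)
#   def runs_ok(q):
#       try:
#           conn.cursor().execute(q).fetchall()
#           return True
#       except pyodbc.Error:
#           return False
#   print(find_threshold(runs_ok))
```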

I am at a complete loss as to what is going on. Running the reflection from the old Docker image (Ubuntu 16.04, Python 2.7) works fine, and I can also run the query without issue when I connect to the database via DataGrip. I've tried pymssql as well, and it produces the first error message above. Is there some package or dependency I should be checking that could cause this? When I was using the FreeTDS driver, the connection string included UseNTLMv2=true; I'm not using TrustedConnection=Yes with the Microsoft driver because it wants me to set up Kerberos, and I don't want to go down that rabbit hole if the cause lies elsewhere, given that the FreeTDS driver exhibits the same issue.

Is this some weird network thing? Have I angered the SQL gods with my hubris? Is there any additional info I need to add?

Source: https://stackoverflow.com/questions/59636271/sql-server-queries-beyond-a-certain-character-length-are-timing-out
