One query many plans – SQL Server

When you see this happening in your system, here is a script that uses system DMVs to inspect the execution plan cache:

SELECT 
cplan.usecounts,
qryplan.[query_plan],
sqltxt.text,
planobj.pages_in_bytes / 1024 AS [PlanKB],
cplan.plan_handle,
cplan.cacheobjtype,
cplan.objtype
FROM sys.dm_exec_cached_plans cplan
OUTER APPLY sys.dm_exec_sql_text(cplan.[plan_handle]) AS sqltxt
OUTER APPLY sys.dm_exec_query_plan(cplan.[plan_handle]) AS qryplan
INNER JOIN sys.dm_os_memory_objects planobj
ON planobj.memory_object_address = cplan.memory_object_address
WHERE cplan.parent_plan_handle is NULL
AND cplan.cacheobjtype IN (N'Compiled Plan',N'Compiled Plan Stub')
ORDER BY cplan.objtype, cplan.plan_handle;

You typically see the many-plans scenario when dynamic SQL is being generated by an ORM. It helps to convert such ad-hoc queries into stored procedures (or parameterized queries) so that fewer plans are cached and existing plans get re-used.
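As a minimal sketch of the idea (the table and column names below are hypothetical), an ORM-style ad-hoc query with embedded literals can be replaced by a parameterized stored procedure so a single plan is compiled and re-used:

```sql
-- Ad-hoc dynamic SQL such as:
--   SELECT * FROM Sales.Orders WHERE CustomerID = 42
-- produces one cached plan per literal value. Wrapping the query
-- in a parameterized stored procedure avoids that:
CREATE PROCEDURE dbo.GetOrdersByCustomer
    @CustomerID INT
AS
BEGIN
    SET NOCOUNT ON;
    SELECT OrderID, OrderDate, TotalDue
    FROM Sales.Orders
    WHERE CustomerID = @CustomerID;
END;
GO

-- Every call now re-uses the same cached plan:
EXEC dbo.GetOrdersByCustomer @CustomerID = 42;
```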

Advantages of Kerberos over NTLM authentication

Kerberos provides several advantages over NTLM:
– More secure: No password stored locally or sent over the net.
– Better performance: faster authentication than NTLM, since tickets can be cached and re-used.
– Delegation support: Servers can impersonate clients and use the client’s security context to access a resource.
– Simpler trust management: avoids the need for point-to-point trust relationships in multi-domain environments.
– MFA support: Kerberos supports multi-factor authentication.

NTLMv1 hashes can be cracked in seconds with today's computing power, since they are always the same length and are not salted.

NTLMv2 is a little better, since it uses a variable-length, salted hash, but not by much: even though the hash is salted before it is sent, it is stored unsalted in the machine's memory.
And of course, NTLM is a challenge/response mechanism, which exposes the password to offline cracking when the client responds to the challenge.

What are the Kerberos dependencies?

Both the client and the server need to be running Windows 2000 or a later version, and need to be in the same domain or in trusted domains.

An SPN (Service Principal Name) needs to exist in Active Directory for the domain account used to run the service to which the client is authenticating.
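For example, for a SQL Server service running under a domain account, the SPN can be registered with the built-in setspn tool (the host name, port, and account below are placeholders):

```
REM -S checks for duplicate SPNs before adding the new one
setspn -S MSSQLSvc/sqlserver01.contoso.com:1433 CONTOSO\sqlsvcaccount
```

This requires sufficient permissions in Active Directory; without a valid SPN, authentication silently falls back to NTLM.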

What is NTLM?

NTLM is an authentication protocol. It was the default protocol in older Windows versions, but it is still used today: if Kerberos fails for any reason, Windows falls back to NTLM.

NTLM has a challenge/response mechanism.

What is Kerberos?

Kerberos is an authentication protocol. It has been the default authentication protocol since Windows 2000, replacing NTLM.

How do you detect if you have memory pressure in SQL Server?

The code below is a rudimentary way of finding out whether you have memory pressure; it uses SQL Server DMVs:

SELECT total_physical_memory_kb/1024 [Total Physical Memory in MB],
	   available_physical_memory_kb/1024 [Physical Memory Available in MB],
	   system_memory_state_desc
FROM sys.dm_os_sys_memory

SELECT physical_memory_in_use_kb/1024 [Physical Memory Used in MB],
	   process_physical_memory_low [Physical Memory Low],
	   process_virtual_memory_low [Virtual Memory Low]
FROM sys.dm_os_process_memory

SELECT committed_kb/1024 [SQL Server Committed Memory in MB],
	   committed_target_kb/1024 [SQL Server Target Committed Memory in MB]
FROM sys.dm_os_sys_info

Watch out if the value of Physical Memory Low or Virtual Memory Low is 1. If so, you should consider increasing the memory on the server.
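Another common complementary check, as a hedged sketch, is Page Life Expectancy from sys.dm_os_performance_counters; a value that drops and stays low relative to your own baseline also suggests buffer-pool memory pressure:

```sql
-- Page Life Expectancy: how long (in seconds) a page is expected
-- to stay in the buffer pool. Compare against your own baseline
-- rather than a fixed threshold.
SELECT [object_name], counter_name, cntr_value AS [PLE in Seconds]
FROM sys.dm_os_performance_counters
WHERE counter_name = 'Page life expectancy'
  AND [object_name] LIKE '%Buffer Manager%';
```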

Importing from a CSV to a database table

A common request that I get is to load the contents of a CSV file into a database. Until I came across dbatools, I used to perform this operation through a GUI.

dbatools offers two ways to do this:

  1. Import-DbaCsv
  2. Import-Csv and Write-DbaDataTable

Which one should you choose? It is largely a personal choice, but I have found that Import-DbaCsv is the better choice for larger CSVs because it is optimized to keep memory usage low.

Import-DbaCsv streams the data between the disk and SQL Server rather than loading the whole file into memory, which is what keeps its memory footprint small.

The output from the command shows the table information, the number of rows copied, and even how quickly they were copied. This is useful when you are testing a script with a smaller set of data, as you can extrapolate the time it will take to load the data in your production environment.

Get-ChildItem -Path C:\temp\data.csv | Import-DbaCsv -SqlInstance $sqlinstance1 -Database StagingDB -Table data1

ComputerName  : SQLDEV01
InstanceName  : MSSQLSERVER
SqlInstance   : SQLDEV01
Database      : StagingDB
Table         : data1
Schema        : dbo
RowsCopied    : 2450
Elapsed       : 55.16 ms
RowsPerSecond : 44663
Path          : C:\temp\data.csv

The beauty of the command is that you can even import multiple CSVs at once! You can do so using the command shown below:

Get-ChildItem C:\temp\data*.csv | Import-DbaCsv -SqlInstance sql2017 -Database StagingDB -AutoCreateTable

ComputerName  : SQLDEV01
InstanceName  : MSSQLSERVER
SqlInstance   : SQLDEV01
Database      : StagingDB
Table         : data-tracks1
Schema        : dbo
RowsCopied    : 2450
Elapsed       : 73.02 ms
RowsPerSecond : 33712
Path          : C:\temp\data-tracks1.csv

ComputerName  : SQLDEV01
InstanceName  : MSSQLSERVER
SqlInstance   : SQLDEV01
Database      : StagingDB
Table         : data-tracks2
Schema        : dbo
RowsCopied    : 1312
Elapsed       : 65.41 ms
RowsPerSecond : 20160
Path          : C:\temp\data-tracks2.csv

Using Import-Csv with Write-DbaDataTable

Import-Csv is a powerful command that turns the text within CSV files into objects. Generally, Import-Csv is piped straight to Write-DbaDataTable.

# Import the CSV and write to the database
Import-Csv -Path .\DatabaseServerinfo.csv | Write-DbaDataTable -SqlInstance $sqlinstance1 -Database StagingDB -Table Databases -AutoCreateTable

Piping to Write-DbaDataTable is convenient and extremely fast for small batches, but slows down for larger datasets (similar to SQL’s RBAR concept). If you intend to import a large dataset, use Import-DbaCsv or the following syntax instead:

Import-DbaCsv -Path \\NetworkPath\server\bigdataset.csv -SqlInstance sql2017 -Database StagingDB -Table NewTable -AutoCreateTable

Ports required to use dbatools – SQL Server

Protocol            | Default port | Sample command                      | Percentage of commands
--------------------|--------------|-------------------------------------|-----------------------
SQL Database Engine | 1433         | Get-DbaDatabase                     | 62%
WS-Management       | 5985 or 5986 | New-DbaClientAlias                  | 25%
SQL WMI             | 135          | Enable-DbaAgHadr                    | 4%
SMB                 | 139          | Invoke-DbaDbLogShipping             | 4%
SMB over IP         | 445          | Get-DbaPfDataCollectorCounterSample | <1%