Install the dbatools PowerShell module

The first time it runs, PackageManagement requires an internet connection to download the NuGet package provider. If your computer does not have an internet connection and you need the NuGet or PowerShellGet provider, you can download them on another computer and copy them to your target computer. Use the following steps to do this:

  • Open PowerShell (as Admin)
  • Run the following to set PowerShell to use TLS 1.2 and install the NuGet provider:
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
Install-PackageProvider -Name NuGet -MinimumVersion 2.8.5.201 -Force
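
If the target computer is offline, the provider that was just installed can be copied over. On the internet-connected computer it lands under the PackageManagement provider folder by default; copy it to the same folder on the target (the \\TargetComputer path below is a placeholder for however you reach that machine):

# Default provider location on the connected computer; copy to the same path on the target
Copy-Item -Path "$env:ProgramFiles\PackageManagement\ProviderAssemblies\NuGet" `
          -Destination '\\TargetComputer\c$\Program Files\PackageManagement\ProviderAssemblies' -Recurse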

If you run Windows 10 or Windows Server 2016, you already have support for the PowerShell Gallery and can simply issue the following command. It downloads the module and unzips it into the C:\temp\dbatools folder. The C:\temp folder must exist for this to work; if it does not, create it before running the script below (a one-liner for that follows the command).

Save-Module -Name dbatools -Path C:\temp
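
If C:\temp might not exist yet, this guard creates it first:

# Create C:\temp if it is not already there
if (-not (Test-Path 'C:\temp')) { New-Item -ItemType Directory -Path 'C:\temp' | Out-Null }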

Keep in mind that the downloaded DLLs will likely be blocked by default. If this happens, simply run:

Get-ChildItem -Recurse C:\temp | Unblock-File

Copy the dbatools folder from C:\temp to the C:\Program Files\WindowsPowerShell\Modules folder.
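
For example, on the target computer:

# Copy the saved module into the all-users module folder (default paths assumed)
Copy-Item -Path 'C:\temp\dbatools' -Destination 'C:\Program Files\WindowsPowerShell\Modules' -Recurse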

Open PowerShell and run the below command ($env:PSModulePath is PowerShell syntax and will not resolve in a plain command prompt):

$env:PSModulePath

Any module placed within a folder listed in $env:PSModulePath will load automatically once you execute a command from that module
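
To see the folders one per line and confirm the Modules folder above is listed:

# Split the semicolon-delimited path into one folder per line
$env:PSModulePath -split ';'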

Run the following command to test the import of dbatools PS module:

Import-Module dbatools

It should run without any errors. You are now free to run any of the dbatools commands for your DBA administrative tasks.
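
As an optional sanity check, you can confirm the module is discoverable and see how many commands it exports:

# Confirm the module is visible in the module path and count its commands
Get-Module dbatools -ListAvailable
Get-Command -Module dbatools | Measure-Object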

DBAmp and Salesforce API Call Counts

Like all third-party salesforce.com tools, DBAmp uses the salesforce.com API to send and receive data from salesforce.com. Every salesforce.com customer has a limit on the total number of API calls they can execute, org-wide, across all the tools they use. This limit is found on the Company Information screen in the salesforce.com application.

Here are some rough guidelines for API call counts for various operations in DBAmp:

SELECT against linked server tables, SF_Replicate and SF_Refresh – DBAmp requests data in batches of 2,000 records. The salesforce.com server may reduce that amount based on the width of the row. Our experience has been that the average batch size is 1,000, so roughly every 1,000 rows of data retrieved costs 1 API call.

UPDATE and INSERT statements – 1 API call for each record updated or inserted.

SF_TableLoader without the bulkapi switch – 1 API call for each batch of 200 records.

SF_TableLoader with the bulkapi switch – 1 API call for each batch of 10,000 records. If you use the batchsize option, then 1 API call per batch of batchsize records.
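
As a rough illustration using the guidelines above (the row counts here are made up purely for the arithmetic): replicating a 500,000-row table with SF_Replicate costs on the order of 500 API calls at roughly 1,000 rows per call. Updating those same 500,000 rows one record at a time costs 500,000 calls, while SF_TableLoader needs about 2,500 calls in batches of 200, or about 50 calls with the bulkapi switch in batches of 10,000.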

DBAmp performance considerations

Updating and Deleting rows using SQL

DBAmp supports updating and deleting Salesforce.com objects with SQL.
In order to get the maximum performance with your UPDATE and DELETE
statements, you need to understand how SQL Server handles
UPDATE/DELETE statements with a linked server (like DBAmp).
For example, take the following SQL UPDATE:

Update SALESFORCE...Account
Set AnnualRevenue = 4000
Where Id='00130000005ZsG8AAK'

Using the Display Estimated Execution Plan option from the Query
Analyzer, you can see that SQL Server will retrieve the entire Account
table from Salesforce and then search for the one row that has the Id of
00130000005ZsG8AAK. Then, SQL Server will update the AnnualRevenue of
that row.
Obviously, this UPDATE statement has poor performance, and it gets worse as the size of the Account table grows. What we need is a way to retrieve only the row with Id 00130000005ZsG8AAK and then update the AnnualRevenue of that row. To do this, use an OPENQUERY clause as the table name:

Update OPENQUERY(SALESFORCE,
'Select Id, AnnualRevenue from Account
where Id=''00130000005ZsG8AAK'' ')
set AnnualRevenue = 4000
Using an OPENQUERY clause ensures that we retrieve only the row with the
proper Id.
You can construct stored procedures that make your code more readable
and that use the above technique. See the Create SF_UpdateAccount.sql
file in the DBAmp program directory as an example. Using this stored
procedure, we can do updates to the Account table using the following SQL:

exec SF_UpdateAccount '00130000008hz55AAA','BillingCity','''Denver'''
or
exec SF_UpdateAccount '00130000008hz55AAA','AnnualRevenue','20000'
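
The shipped Create SF_UpdateAccount.sql defines the actual procedure; purely as a sketch of the pattern (the parameter names here are assumptions, and the linked server is assumed to be named SALESFORCE), such a procedure might look like this:

-- Sketch only; the Create SF_UpdateAccount.sql that ships with DBAmp may differ.
-- For production use, validate @FieldName rather than concatenating it blindly.
CREATE PROCEDURE SF_UpdateAccount
    @Id         NCHAR(18),
    @FieldName  SYSNAME,
    @FieldValue NVARCHAR(4000)   -- pass string values pre-quoted, e.g. '''Denver'''
AS
BEGIN
    DECLARE @q   NCHAR(1) = N'''';   -- single-quote character, kept in a variable for readability
    DECLARE @sql NVARCHAR(MAX);

    -- Build an OPENQUERY update so only the target row is fetched from Salesforce
    SET @sql = N'UPDATE OPENQUERY(SALESFORCE, '
             + @q + N'SELECT Id, ' + @FieldName
             + N' FROM Account WHERE Id = ' + @q + @q + @Id + @q + @q + @q + N')'
             + N' SET ' + @FieldName + N' = ' + @FieldValue;

    EXEC (@sql);
END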

You can use the SF_UpdateAccount stored procedure as a template for
building your own specialized stored procedures. See the file Create
SF_UpdateAnnualRevenue.sql for an example. Then, use the following
SQL to update the annual revenue of an account:

exec SF_UpdateAnnualRevenue '00130000009DCEcAAO', 30000
Deleting rows with SQL has the same caveats. For best performance with
deletion by Id, use an OPENQUERY clause in the SQL statement. An
example of a stored procedure that deletes Accounts by Id is in the file
Create SF_DeleteAccount.sql.
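
For instance, a single-row delete by Id follows the same OPENQUERY pattern as the update above:

Delete OPENQUERY(SALESFORCE,
'Select Id from Account
where Id=''00130000005ZsG8AAK'' ')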


For maximum scalability, please consider using the SF_TableLoader stored
procedure instead of SQL UPDATE or DELETE statements. The
SF_TableLoader stored procedure takes advantage of the ability to batch
together requests to the salesforce.com API.
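
As an illustration only (consult the DBAmp documentation for the exact arguments; the staging table name here is an assumption), a batched update might look like:

-- Account_Update is a local staging table holding Id plus the columns to change
exec SF_TableLoader 'Update', 'SALESFORCE', 'Account_Update'
-- the bulkapi switch moves to the 10,000-record batches described earlier
exec SF_TableLoader 'Update:bulkapi', 'SALESFORCE', 'Account_Update'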

Get a snapshot of indexes

Spreadsheet inventory of existing indexes

Because the script below uses @GetAllDatabases = 1, a single run covers every database on the instance:

sp_BlitzIndex @GetAllDatabases = 1, @Mode = 2

Inventory of missing indexes

sp_BlitzIndex @GetAllDatabases = 1, @Mode = 3

Find large unused indexes

/* To find large, unused, junk drawer objects: */

sp_BlitzIndex @Mode = 2, @SortOrder = 'size'
sp_BlitzIndex @Mode = 2, @SortOrder = 'rows'

/* For a management overview before & after doing index tuning: */

sp_BlitzIndex @Mode = 1

Database maintenance best practices

UPDATE: INDEXES & STATISTICS 

  • Default to a weekly update, escalating to daily for specific statistics that need it.
  • You will need to update column statistics separately; column stats do not get updated when you rebuild indexes or update index stats (see the example just after this list).
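
For example, after an index rebuild, something like this (the table name is a placeholder) refreshes the column statistics that the rebuild did not touch:

-- Index rebuilds refresh only index statistics; WITH COLUMNS targets column statistics
UPDATE STATISTICS dbo.YourTable WITH COLUMNS;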

A side note to be aware of: each time you update indexes or statistics on a table, the cached plans that reference that table are cleared from the plan cache. Outdated statistics can cause all sorts of issues, but updating statistics can also cause issues such as parameter sniffing. We are not looking for a perfect plan but a good plan: a balance between the problems of outdated statistics and the problems of parameter sniffing.

BACKUP SCHEDULE

Full Backups – WEEKLY

Differential Backups – DAILY (EVERY NIGHT)

Transaction Log Backups – every few hours (only if the database needs to meet an RPO of a few hours; the database must also be in the FULL recovery model)
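
A minimal sketch of the three backup types (database name, paths, and options are placeholders; real jobs would add retention, verification, and scheduling):

-- Weekly full backup
BACKUP DATABASE [YourDb] TO DISK = N'D:\Backup\YourDb_FULL.bak'
    WITH COMPRESSION, CHECKSUM;

-- Nightly differential backup
BACKUP DATABASE [YourDb] TO DISK = N'D:\Backup\YourDb_DIFF.bak'
    WITH DIFFERENTIAL, COMPRESSION, CHECKSUM;

-- Log backup every few hours (requires the FULL recovery model)
BACKUP LOG [YourDb] TO DISK = N'D:\Backup\YourDb_LOG.trn'
    WITH COMPRESSION, CHECKSUM;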
