Automating SQL Server User Removal with PowerShell and dbatools

Introduction

When an employee leaves or a service account is retired, it’s essential to remove their access cleanly and consistently from SQL Server.
Manually revoking access across multiple databases can be error-prone and time-consuming — especially in large environments.

In this post, we’ll look at how to use the dbatools PowerShell module to automatically remove a user from all databases (except system ones) and drop the server-level login, with full logging for audit purposes.


Prerequisites

  • Install dbatools (if not already installed): Install-Module dbatools -Scope CurrentUser -Force
  • Ensure you have sysadmin rights on the SQL instance.
  • Have the login name ready (domain or SQL account).
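
Before pointing the script at production, it can help to confirm that the module is installed and the instance is reachable. A minimal pre-flight check, with "SQLPROD01" standing in for your instance name, might look like this:

# Optional pre-flight check; "SQLPROD01" is a placeholder instance name
Get-Module -ListAvailable -Name dbatools | Select-Object Name, Version
Test-DbaConnection -SqlInstance "SQLPROD01"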

The PowerShell Script

<#
.SYNOPSIS
Removes a SQL Server login and its users from all user databases.
Works for both domain and SQL logins, with logging.
#>

param(
    [Parameter(Mandatory = $true)]
    [string]$SqlInstance,
    [Parameter(Mandatory = $true)]
    [string]$Login,
    [string]$LogFile = "$(Join-Path $PSScriptRoot ("UserRemovalLog_{0:yyyyMMdd_HHmmss}.txt" -f (Get-Date)))"
)

if (-not (Get-Module -ListAvailable -Name dbatools)) {
    Write-Error "Please install dbatools using: Install-Module dbatools -Scope CurrentUser -Force"
    exit 1
}

function Write-Log {
    param([string]$Message, [string]$Color = "White")
    $timestamp = (Get-Date).ToString("yyyy-MM-dd HH:mm:ss")
    $logEntry = "[$timestamp] $Message"
    Write-Host $logEntry -ForegroundColor $Color
    Add-Content -Path $LogFile -Value $logEntry
}

Write-Log "=== Starting cleanup for login: $Login on instance: $SqlInstance ===" "Cyan"

$UserDatabases = Get-DbaDatabase -SqlInstance $SqlInstance | Where-Object { -not $_.IsSystemObject }

foreach ($db in $UserDatabases) {
    try {
        $dbName = $db.Name
        $user = Get-DbaDbUser -SqlInstance $SqlInstance -Database $dbName -User $Login -ErrorAction SilentlyContinue
        if ($user) {
            Write-Log "Removing user [$Login] from [$dbName]" "Red"
            Remove-DbaDbUser -SqlInstance $SqlInstance -Database $dbName -User $Login -Confirm:$false -ErrorAction Stop
            Write-Log "✅ Removed from [$dbName]" "Green"
        }
        else {
            Write-Log "User [$Login] not found in [$dbName]" "DarkGray"
        }
    }
    catch {
        Write-Log "⚠️ Failed in [$dbName]: $_" "Yellow"
    }
}

try {
    $loginObj = Get-DbaLogin -SqlInstance $SqlInstance -Login $Login -ErrorAction SilentlyContinue
    if ($loginObj) {
        $loginType = $loginObj.LoginType
        Write-Log "Removing server-level login [$Login] ($loginType)" "Red"
        Remove-DbaLogin -SqlInstance $SqlInstance -Login $Login -Confirm:$false -ErrorAction Stop
        Write-Log "✅ Server-level login removed" "Green"
    }
    else {
        Write-Log "No server-level login [$Login] found" "DarkGray"
    }
}
catch {
    Write-Log "⚠️ Failed to remove login [$Login]: $_" "Yellow"
}

Write-Log "=== Completed cleanup for [$Login] on [$SqlInstance] ===" "Cyan"
Write-Log "Log file saved to: $LogFile" "Gray"


How It Works

  • Get-DbaDatabase lists all user databases.
  • Get-DbaDbUser / Remove-DbaDbUser checks for and removes the user from each DB.
  • Get-DbaLogin / Remove-DbaLogin cleans up the login from the instance.
  • All actions are written to a timestamped .txt log for compliance or auditing.

Example Usage

.\Remove-DbUserFromAllDatabases.ps1 -SqlInstance "SQLPROD01" -Login "Contoso\User123"

You can also specify a custom log path:

.\Remove-DbUserFromAllDatabases.ps1 -SqlInstance "SQLPROD01" -Login "appuser" -LogFile "C:\Logs\UserCleanup.txt"


Key Takeaways

  • Fully automated and non-interactive — perfect for offboarding workflows.
  • Handles both Windows and SQL logins gracefully.
  • Creates a detailed audit log for every action taken.
  • Safe to re-run — it skips users or logins that don’t exist.
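
If you need to offboard several accounts at once, one simple way to batch the script is to feed it a CSV of instances and logins. The snippet below is only a sketch and assumes a hypothetical offboarding.csv with Instance and Login columns:

# Sketch only: loops over a hypothetical offboarding.csv (columns: Instance,Login)
Import-Csv -Path .\offboarding.csv | ForEach-Object {
    .\Remove-DbUserFromAllDatabases.ps1 -SqlInstance $_.Instance -Login $_.Login
}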

Decoding a Python Script: An Improv-Inspired Guide for Beginners

By Vinay Rahul Are, Python Enthusiast & Improv Comedy Fan


Introduction

Learning Python can feel intimidating—unless you approach it with a sense of play! Just like improv comedy, Python is about saying “yes, and…” to new ideas, experimenting, and having fun. In this post, I’ll walk you through a real-world Python script, breaking down each part so you can understand, explain, and even perform it yourself!


The Script’s Purpose

The script we’ll explore automates the process of running multiple SQL files against an Amazon Redshift database. For each SQL file, it:

  • Executes the file’s SQL commands on Redshift
  • Logs how many rows were affected, how long it took, and any errors
  • Moves the file to a “Done” folder when finished

It’s a practical tool for data engineers, but the structure and logic are great for any Python beginner to learn from.


1. The “Show Description” (Docstring)

At the top, you’ll find a docstring: a triple-quoted string that describes what the script does, what you need to run it, and how to use it.

"""
Batch Redshift SQL Script Executor with Per-Script Logging, Timing, and Post-Execution Archiving

Pre-requisites:
---------------
1. Python 3.x installed on your machine.
2. The following Python packages must be installed:
    - psycopg2-binary
3. (Recommended) Use a virtual environment to avoid dependency conflicts.
4. Network access to your Amazon Redshift cluster.

Installation commands:
----------------------
python -m venv venv
venv\Scripts\activate        # On Windows
source venv/bin/activate     # On macOS/Linux
pip install psycopg2-binary

Purpose:
--------
This script automates the execution of multiple .sql files against an Amazon Redshift cluster...
"""

2. Importing the “Cast and Crew” (Modules)

Every show needs its cast. In Python, that means importing modules:

import os
import glob
import psycopg2
import getpass
import shutil
import time

  • os, glob, shutil: Handle files and folders
  • psycopg2: Talks to the Redshift database
  • getpass: Securely prompts for passwords
  • time: Measures how long things take

3. The “Stage Directions” (Configuration)

Before the curtain rises, set your stage:

HOST = '<redshift-endpoint>'
PORT = 5439
USER = '<your-username>'
DATABASE = '<your-database>'
SCRIPT_DIR = r'C:\redshift_scripts'
DONE_DIR = os.path.join(SCRIPT_DIR, 'Done')

  • Replace the placeholders with your actual Redshift details and script folder path.
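
Hard-coding connection details is fine for a first pass, but if you’d rather keep them out of the file, one option is to read them from environment variables. This is just a sketch; the variable names (REDSHIFT_HOST, REDSHIFT_USER, and so on) are examples, not anything the script requires.

# Optional sketch: pull settings from environment variables instead of hard-coding them
import os

HOST = os.environ.get('REDSHIFT_HOST', '<redshift-endpoint>')
PORT = int(os.environ.get('REDSHIFT_PORT', '5439'))
USER = os.environ.get('REDSHIFT_USER', '<your-username>')
DATABASE = os.environ.get('REDSHIFT_DATABASE', '<your-database>')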

4. The “Comedy Routine” (Function Definition)

The workhorse function, run_sql_script, is like a well-rehearsed bit:

def run_sql_script(script_path, conn):
    log_path = os.path.splitext(script_path)[0] + '.log'
    with open(script_path, 'r', encoding='utf-8') as sql_file, open(log_path, 'w', encoding='utf-8') as log_file:
        sql = sql_file.read()
        log_file.write(f"Running script: {script_path}\n")
        start_time = time.perf_counter()
        try:
            with conn.cursor() as cur:
                cur.execute(sql)
                end_time = time.perf_counter()
                elapsed_time = end_time - start_time
                rows_affected = cur.rowcount if cur.rowcount != -1 else 'Unknown'
                log_file.write(f"Rows affected: {rows_affected}\n")
                log_file.write(f"Execution time: {elapsed_time:.2f} seconds\n")
                conn.commit()
                log_file.write("Execution successful.\n")
        except Exception as e:
            end_time = time.perf_counter()
            elapsed_time = end_time - start_time
            log_file.write(f"Error: {str(e)}\n")
            log_file.write(f"Execution time (until error): {elapsed_time:.2f} seconds\n")
            conn.rollback()

  • Reads the SQL file
  • Logs what’s happening
  • Measures execution time
  • Handles success or errors gracefully

5. The “Main Event” (main function)

This is the showrunner, making sure everything happens in order:

def main():
    password = getpass.getpass("Enter your Redshift password: ")
    if not os.path.exists(DONE_DIR):
        os.makedirs(DONE_DIR)
    sql_files = glob.glob(os.path.join(SCRIPT_DIR, '*.sql'))
    conn = psycopg2.connect(
        host=HOST,
        port=PORT,
        user=USER,
        password=password,
        dbname=DATABASE
    )
    for script_path in sql_files:
        print(f"Running {script_path} ...")
        run_sql_script(script_path, conn)
        try:
            shutil.move(script_path, DONE_DIR)
            print(f"Moved {script_path} to {DONE_DIR}")
        except Exception as move_err:
            print(f"Failed to move {script_path}: {move_err}")
    conn.close()
    print("All scripts executed.")

  • Prompts for your password (no peeking!)
  • Makes sure the “Done” folder exists
  • Finds all .sql files
  • Connects to Redshift
  • Runs each script, logs results, and moves the file when done
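
One small refinement you might consider is wrapping the loop in try/finally so the connection is always closed, even if a script raises unexpectedly, and using os.makedirs(..., exist_ok=True) instead of a separate existence check. The version below is only a sketch, not part of the original script:

def main():
    password = getpass.getpass("Enter your Redshift password: ")
    os.makedirs(DONE_DIR, exist_ok=True)   # creates the folder only if it's missing
    sql_files = glob.glob(os.path.join(SCRIPT_DIR, '*.sql'))
    conn = psycopg2.connect(
        host=HOST,
        port=PORT,
        user=USER,
        password=password,
        dbname=DATABASE
    )
    try:
        for script_path in sql_files:
            print(f"Running {script_path} ...")
            run_sql_script(script_path, conn)
            try:
                shutil.move(script_path, DONE_DIR)
                print(f"Moved {script_path} to {DONE_DIR}")
            except Exception as move_err:
                print(f"Failed to move {script_path}: {move_err}")
    finally:
        conn.close()   # runs even if a script or move fails
    print("All scripts executed.")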

6. The “Curtain Call” (Script Entry Point)

This line ensures the main event only happens if you run the script directly:

if __name__ == "__main__":
    main()
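
Running it is as simple as calling the file from a terminal. The filename below is just an example; use whatever name you saved the script under:

python run_redshift_scripts.py

You’ll be prompted for your Redshift password, and then each .sql file in SCRIPT_DIR runs in turn, writing a matching .log file and moving the .sql file to the Done folder when it completes.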

7. Explaining the Script in Plain English

“This script automates running a bunch of SQL files against a Redshift database. For each file, it logs how many rows were affected, how long it took, and any errors. After running, it moves the file to a ‘Done’ folder so you know it’s finished. It’s organized with clear sections for setup, reusable functions, and the main execution flow.”


8. Why This Structure?

  • Imports first: So all your helpers are ready before the show starts.
  • Functions: Keep the code neat, reusable, and easy to understand.
  • Main block: Keeps your script from running accidentally if imported elsewhere.
  • Comments and docstrings: Make it easy for others (and future you) to understand what’s going on.

9. Final Thoughts: Python is Improv!

Just like improv, Python is best learned by doing. Try things out, make mistakes, and remember: if your code “crashes,” it’s just the computer’s way of saying, “Yes, and…let’s try that again!”

If you want to dig deeper into any part of this script, just ask in the comments below. Happy coding—and yes, and… keep learning!


ChatGPT Prompt: How-To Guide Builder

This prompt assists in creating a complete how-to guide for any topic, specifically tailored to the target audience’s skill level (beginner, intermediate, or advanced) and the desired content format (blog post, video script, infographic, etc.).

<System>
You are an expert technical writer, educator, and SEO strategist. Your job is to generate a full, structured, and professional how-to guide based on user inputs: TOPIC, SKILLLEVEL, and FORMAT. Tailor your output to match the intended audience and content style.
</System>

<Context>
The user wants to create an informative how-to guide that provides step-by-step instructions, insights, FAQs, and more for a specific topic. The guide should be educational, comprehensive, and approachable for the target skill level and content format.
</Context>

<Instructions>
1. Begin by identifying the TOPIC, SKILLLEVEL, and FORMAT provided.
2. Research and list the 5-10 most common pain points, questions, or challenges learners face related to TOPIC.
3. Create a 5-7 section outline breaking down the how-to process of TOPIC. Match complexity to SKILLLEVEL.
4. Write an engaging introduction:
   - Explain why TOPIC is important or beneficial.
   - Clarify what the reader will achieve or understand by the end.
5. For each main section:
   - Explain what needs to be done.
   - Mention any warnings or prep steps.
   - Share 2-3 best practices or helpful tips.
   - Recommend tools or resources if relevant.
6. Add a troubleshooting section with common mistakes and how to fix them.
7. Include a “Frequently Asked Questions” section with concise answers.
8. Add a “Next Steps” or “Advanced Techniques” section for progressing beyond basics.
9. If technical terms exist, include a glossary with beginner-friendly definitions.
10. Based on FORMAT, suggest visuals (e.g. screenshots, diagrams, timestamps) to support content delivery.
11. End with a conclusion summarizing the key points and motivating the reader to act.
12. Format the final piece according to FORMAT (blog post, video script, infographic layout, etc.), and include a table of contents if length exceeds 1,000 words.
</Instructions>

<Constraints>
- Stay within the bounds of the SKILLLEVEL.
- Maintain a tone and structure appropriate to FORMAT.
- Be practical, user-friendly, and professional.
- Avoid jargon unless it is explained in the glossary.
</Constraints>

<Output Format>
Deliver the how-to guide as a completed piece matching FORMAT, with all structural sections in place.
</Output Format>

<Reasoning>
Apply Theory of Mind to analyze the user's request, considering both logical intent and emotional undertones. Use Strategic Chain-of-Thought and System 2 Thinking to provide evidence-based, nuanced responses that balance depth with clarity. 
</Reasoning>
<User Input>
Reply with: "Please enter your {prompt subject} request and I will start the process," then wait for the user to provide their specific {prompt subject} request.
</User Input>

Prompt Use Case:

A database engineer wants to create a runbook to troubleshoot MySQL replication issues.
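
For that use case, the inputs handed to the prompt might look something like this (the values are purely illustrative):

TOPIC: Troubleshooting MySQL replication lag and broken replicas
SKILLLEVEL: Intermediate
FORMAT: Blog post (runbook style)

Given those inputs, the prompt should return a structured runbook: an outline pitched at an intermediate audience, step-by-step checks, a troubleshooting section with common mistakes, an FAQ, and a glossary of replication terms.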