Quickstart
Connect to a SQL Server instance, choose an object, run the analysis, and open your first evidence-backed report.
If you want the shortest successful first run, follow this order. It avoids the most common beginner mistake: trying to analyze before the app has both a working SQL connection and a working AI provider.
- Open Settings and complete AI / LLM first, so analysis features have a working model backend.
- Add one saved SQL Server connection in Settings > Database.
- Test the connection, save it, then click Connect.
- Start with Dashboard to confirm the instance is reachable and metrics are loading.
- Move to Query Statistics or Object Explorer for your first deeper analysis.
In this product, Add Connection means adding a saved connection profile. It does not create a new SQL database on the server.
- Open Settings > Database.
- Click Add Connection.
- Enter a clear Connection Name.
- Enter Server / Instance and confirm the correct Port.
- Choose the Default Database you want the profile to open first.
- Select SQL Server Authentication or Windows Authentication.
- If you chose SQL Server Authentication, enter Username and Password.
- Leave ODBC Driver on auto-select unless your DBA requires a specific driver.
- Keep Encrypt Connection enabled unless you have a documented reason not to.
- Enable Trust Server Certificate only when your environment explicitly requires it.
- Click Test Connection.
- If the test succeeds, click Save, then select the profile and click Connect.
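Under the hood, the form fields above map onto a standard keyword=value ODBC connection string. The sketch below is illustrative only: the function name is hypothetical, and the driver name is an assumption you should match to the drivers actually installed on your machine.

```python
def build_odbc_connection_string(server, port, database,
                                 auth="sql", username=None, password=None,
                                 driver="ODBC Driver 18 for SQL Server",
                                 encrypt=True, trust_server_certificate=False):
    """Assemble an ODBC connection string from the profile fields.

    Mirrors the Settings > Database form; the default driver name is
    an assumption, not something the app guarantees.
    """
    parts = [
        f"DRIVER={{{driver}}}",
        f"SERVER={server},{port}",   # SQL Server uses "host,port", not "host:port"
        f"DATABASE={database}",
        f"Encrypt={'yes' if encrypt else 'no'}",
        f"TrustServerCertificate={'yes' if trust_server_certificate else 'no'}",
    ]
    if auth == "windows":
        parts.append("Trusted_Connection=yes")   # Windows Authentication
    else:
        parts.append(f"UID={username}")          # SQL Server Authentication
        parts.append(f"PWD={password}")
    return ";".join(parts)
```

A string built this way can be passed to `pyodbc.connect(...)` if you want to reproduce the Test Connection step outside the app.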
Choose this path if you want model traffic to stay local. Make sure Ollama is already running and the model is already installed before you test it inside the app.
ollama serve
ollama pull codellama
- Open Settings > AI / LLM > Providers.
- Click Add AI Model.
- Choose Provider = Ollama.
- Enter a clear Name such as Local Ollama - SQL.
- Enter Host as your Ollama URL, usually http://localhost:11434.
- Enter the exact installed model name, for example codellama.
- Click Test. If the test succeeds, click Add.
- Select the provider in the list, click Set Default, then click Save Settings.
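If the in-app Test fails, you can check the same things by hand. Ollama exposes a `/api/tags` endpoint that lists installed models; the sketch below queries it and checks for a model name, matching either the exact tag or the bare name. The function names are illustrative, not part of the product.

```python
import json
from urllib.request import urlopen

def installed_ollama_models(host="http://localhost:11434"):
    """Return the model names Ollama reports at /api/tags."""
    with urlopen(f"{host}/api/tags") as resp:
        payload = json.load(resp)
    return [m["name"] for m in payload.get("models", [])]

def model_is_installed(model, models):
    """Match an exact tag ("codellama:latest") or a bare name ("codellama")."""
    return any(name == model or name.split(":")[0] == model
               for name in models)
```

For example, `model_is_installed("codellama", installed_ollama_models())` should be true before the app's Test button can succeed against that model.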
Choose this path if your team already uses a managed provider or if you want a centralized model backend. The exact fields vary by provider, but the workflow stays almost the same.
- Open Settings > AI / LLM > Providers and click Add AI Model.
- Choose a cloud provider such as OpenAI, Azure OpenAI, Anthropic, or DeepSeek.
- Enter a clear Name so you can distinguish production and test providers later.
- Enter the target Model value exactly as required by that provider.
- Enter the API key or credential field.
- If you use Azure OpenAI, also fill in Endpoint and Deployment.
- Click Test. If the test succeeds, click Add.
- Select the provider, click Set Default, then click Save Settings.
Typical required fields by provider:
- Most cloud providers (OpenAI, Anthropic, DeepSeek): API Key, Model
- Azure OpenAI: API Key, Endpoint, Deployment, and the Model label used by your deployment policy
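The per-provider field lists above amount to a small validation table. This sketch expresses them as a map so a missing field is caught before the Test step; the provider keys and field names are assumptions matched to the form labels, not an API of the product.

```python
# Required form fields per provider, following the lists above.
REQUIRED_FIELDS = {
    "OpenAI":       {"API Key", "Model"},
    "Anthropic":    {"API Key", "Model"},
    "DeepSeek":     {"API Key", "Model"},
    "Azure OpenAI": {"API Key", "Endpoint", "Deployment", "Model"},
}

def missing_fields(provider, filled):
    """Return, sorted, the required fields not yet filled in."""
    return sorted(REQUIRED_FIELDS.get(provider, set()) - set(filled))
```

For instance, an Azure OpenAI profile with only an API Key still needs Deployment, Endpoint, and Model before Test can succeed.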
- After you connect, open Dashboard first.
- Confirm that CPU, memory, IO, and TempDB metrics are populated.
- Then open Query Statistics for query-level evidence.
- If you want to inspect one procedure, function, or table, open Object Explorer.
- Re-check server name, port, and authentication mode.
- Confirm the account has the expected SQL permissions.
- Review encryption and certificate settings with your DBA.
- For Ollama, confirm the service is running and the model is installed.
- For cloud providers, re-check the API key, model, endpoint, or deployment fields.
- Remember to click Save Settings after the test succeeds and the provider is added.
- Check whether Query Store is enabled for the target database.
- Remember that DMV fallback has less historical depth.
- Ask the DBA to enable Query Store if the database is new to the platform.
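The Query Store check can be run directly against the target database. The sketch below pairs the T-SQL query with a lookup of the documented `actual_state` values from `sys.database_query_store_options`; the Python helper names are illustrative, and the state descriptions are a summary of SQL Server's documented values, so verify them against your version's documentation.

```python
# T-SQL that reports Query Store state for the current database.
QUERY_STORE_CHECK = """
SELECT actual_state, actual_state_desc
FROM sys.database_query_store_options;
"""

# actual_state values per the SQL Server documentation for
# sys.database_query_store_options (assumed current; verify for your version).
STATE_DESCRIPTIONS = {
    0: "OFF - Query Store disabled; the app falls back to DMVs",
    1: "READ_ONLY - existing history kept, new queries not captured",
    2: "READ_WRITE - Query Store fully enabled",
    3: "ERROR - Query Store hit an internal error",
}

def describe_query_store_state(actual_state):
    """Translate an actual_state value into a short explanation."""
    return STATE_DESCRIPTIONS.get(actual_state, f"Unknown state {actual_state}")
```

If the state is OFF, the DBA can enable Query Store with `ALTER DATABASE [YourDb] SET QUERY_STORE = ON;`, after which history starts accumulating for future analyses.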
- Click Save Settings after changing providers or connection defaults.
- Some settings affect only future connections or future analyses.
- Retry the action after saving instead of assuming the setting is already active.