This guide provides comprehensive instructions for setting up the Chat With Your Data Solution Accelerator for local development across Windows, Linux, and macOS platforms.
This application consists of four separate services that run independently:
- Admin Web - Streamlit application for data management
- Backend API - Python Flask application for REST endpoints
- Frontend - React-based user interface
- Batch Processing Function - Azure Function for document processing
⚠️ Critical: Each service must run in its own terminal/console window
- Do NOT close terminals while services are running
- Open 4 separate terminal windows for local development
- Each service will occupy its terminal and show live logs
Terminal Organization:
- Terminal 1: Admin Web - Streamlit server on port 8501
- Terminal 2: Backend API - Flask server on port 5050
- Terminal 3: Frontend - Development server on port 5174
- Terminal 4 (Optional): Batch Processing Function - Azure Functions runtime
All paths in this guide are relative to the repository root directory. Before starting any step, ensure you are in the repository root:
# Verify you're in the correct location
pwd # Linux/macOS - should show: .../chat-with-your-data-solution-accelerator
Get-Location # Windows PowerShell - should show: ...\chat-with-your-data-solution-accelerator
# If not, navigate to repository root
cd path/to/chat-with-your-data-solution-accelerator

Repository Structure:
chat-with-your-data-solution-accelerator/ ← Repository root (start here)
├── code/
│ ├── app.py ← Backend API entry point
│ ├── create_app.py ← Backend API factory
│ ├── backend/
│ │ ├── Admin.py ← Admin Web entry point
│ │ ├── batch/
│ │ │ ├── function_app.py ← Batch Function entry point
│ │ │ └── .env ← Batch function config (copy from main .env)
│ ├── frontend/ ← Frontend entry point
│ │ ├── vite.config.ts
│ │ └── src/
├── .azure/
│ └── <env-name>/
│ └── .env ← Main .env file (generated by azd provision)
└── docs/ ← You are here
⚠️ Important: Environment variables are automatically generated when you run `azd provision`. You may need to manually configure some values as described in the steps below.
# Install Python 3.11
winget install Python.Python.3.11
# Install Node.js LTS
winget install OpenJS.NodeJS.LTS
# Install Azure Developer CLI
winget install Microsoft.Azd
# Install Azure Functions Core Tools
winget install Microsoft.Azure.FunctionsCoreTools
# Install Visual Studio Code
winget install Microsoft.VisualStudioCode

# Install WSL2 first (run in PowerShell as Administrator)
wsl --install -d Ubuntu
# Then in WSL2 Ubuntu terminal:
sudo apt update && sudo apt install python3.11 python3.11-venv git curl nodejs npm -y
# Install Azure CLI
curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash
# Install Azure Functions Core Tools
wget -q https://packages.microsoft.com/config/ubuntu/$(lsb_release -rs)/packages-microsoft-prod.deb
sudo dpkg -i packages-microsoft-prod.deb
sudo apt-get update
sudo apt-get install azure-functions-core-tools-4
# Install Azure Developer CLI
curl -fsSL https://aka.ms/install-azd.sh | bash

# Install prerequisites
sudo apt update && sudo apt install python3.11 python3.11-venv git curl nodejs npm -y
# Install Azure CLI
curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash
# Install Azure Functions Core Tools
wget -q https://packages.microsoft.com/config/ubuntu/$(lsb_release -rs)/packages-microsoft-prod.deb
sudo dpkg -i packages-microsoft-prod.deb
sudo apt-get update
sudo apt-get install azure-functions-core-tools-4
# Install Azure Developer CLI
curl -fsSL https://aka.ms/install-azd.sh | bash

# Install prerequisites
sudo dnf install python3.11 python3.11-devel git curl gcc nodejs npm -y
# Install Azure CLI
sudo rpm --import https://packages.microsoft.com/keys/microsoft.asc
sudo dnf install azure-cli
# Install Azure Functions Core Tools
sudo dnf install azure-functions-core-tools-4
# Install Azure Developer CLI
curl -fsSL https://aka.ms/install-azd.sh | bash

# Install Homebrew if not already installed
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
# Install prerequisites
brew install python@3.11 node azure-cli azure-functions-core-tools@4
# Install Azure Developer CLI
curl -fsSL https://aka.ms/install-azd.sh | bash

Create `.vscode/extensions.json` in the workspace root and copy the following JSON:
{
"recommendations": [
"ms-python.python",
"ms-python.pylint",
"ms-python.black-formatter",
"ms-azuretools.vscode-azurefunctions",
"ms-azuretools.vscode-bicep",
"ms-python.vscode-pylance",
"ms-vscode.azure-account",
"ms-vscode-remote.remote-wsl",
"ms-teams-vscode.teams-toolkit"
]
}

VS Code will prompt you to install these recommended extensions when you open the workspace.
Create .vscode/settings.json and copy the following JSON:
{
"python.defaultInterpreterPath": "./.venv/bin/python",
"python.terminal.activateEnvironment": true,
"python.linting.enabled": true,
"python.linting.pylintEnabled": true,
"python.testing.pytestEnabled": true,
"python.testing.unittestEnabled": false,
"files.associations": {
"*.bicep": "bicep"
}
}

git clone https://github.com/Azure-Samples/chat-with-your-data-solution-accelerator.git
cd chat-with-your-data-solution-accelerator

📋 Note: Using Existing Azure Deployment
If you already have an existing deployment or resource group with the solution deployed to Azure, you can skip resource provisioning and extract environment variables directly from your deployed services:
Option 1: Extract from Existing Azure Deployment
Get environment variables from all three services using Azure CLI:
# Get environment variables from Web App
az webapp config appsettings list --name <web-app-name> --resource-group <resource-group-name> | ConvertFrom-Json | ForEach-Object { "$($_.name)=`"$($_.value)`""}

# Get environment variables from Admin App
az webapp config appsettings list --name <admin-app-name> --resource-group <resource-group-name> | ConvertFrom-Json | ForEach-Object { "$($_.name)=`"$($_.value)`""}

# Get environment variables from Function App
az functionapp config appsettings list --name <function-app-name> --resource-group <resource-group-name> | ConvertFrom-Json | ForEach-Object { "$($_.name)=`"$($_.value)`""}

Then create a `.env` file at `.azure/<env-name>/.env` (create the directory structure if needed) and copy all environment variables into it.

Option 2: Reuse Existing Local .env File
If you have previously deployed the solution locally using `azd up`, you can directly use the existing `.env` file located at `.azure/<env-name>/.env`.

If using existing resources, you can skip Step 4.2 (Provision Azure Resources) and proceed directly to Step 4.3.
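If you would rather script the conversion, the JSON emitted by `az ... appsettings list` (a list of name/value objects) can be reshaped into `.env` lines with a few lines of Python. This is a convenience sketch, not part of the accelerator:

```python
import json

def appsettings_to_env(json_text: str) -> str:
    """Convert `az ... appsettings list` JSON output -- a list of
    {"name": ..., "value": ...} objects -- into KEY="value" .env lines."""
    settings = json.loads(json_text)
    return "\n".join(f'{item["name"]}="{item["value"]}"' for item in settings)
```

Save the `az` output to a file, feed it to this function, and append the result to `.azure/<env-name>/.env`.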
Before running the application locally, you need to provision Azure resources and configure authentication.
# Login to Azure CLI
az login
# Set your subscription
az account set --subscription "your-subscription-id"
# Verify authentication
az account show

The application requires Azure resources to be provisioned before running locally. Use Azure Developer CLI (azd) to provision all resources:
# Initialize azd environment (first time only)
azd init
# Provision all Azure resources
azd provision

This command will:
- Create all required Azure resources (Storage, CosmosDB/PostgreSQL, AI Search, OpenAI, etc.)
- Generate a `.env` file in `.azure/<env-name>/.env` with all configuration values
- Set up the infrastructure using Bicep templates

⚠️ Important: The `azd provision` command can take 15-30 minutes to complete. Do not interrupt this process.
To run the application locally using RBAC authentication, your Azure account needs the following role assignments:
Get your Principal ID from Microsoft Entra ID, then assign these roles:
| Role | GUID | Scope |
|---|---|---|
| Cognitive Services OpenAI User | 5e0bd9bd-7b93-4f28-af87-19fc36ad61bd | Azure OpenAI Service |
| Cognitive Services User | a97b65f3-24c7-4388-baec-2e87135dc908 | Cognitive Services |
| Cosmos DB Built-in Data Contributor | 00000000-0000-0000-0000-000000000002 | Cosmos DB Account (How to assign) |
| Key Vault Secrets User | 4633458b-17de-408a-b874-0445c86b69e6 | Key Vault |
| Search Index Data Contributor | 8ebe5a00-799e-43f5-93ac-243d3dce84a7 | AI Search Service |
| Search Service Contributor | 7ca78c08-252a-4471-8644-bb5ff32d4ba0 | AI Search Service |
| Storage Blob Data Contributor | ba92f5b4-2d11-453d-a403-e96b0029c9fe | Storage Account |
| Storage Queue Data Contributor | 974c5e8b-45b9-4653-ba55-5f855dd0fb88 | Storage Account |
See Step 13: Assign Azure Roles Using Command Line for automated role assignment scripts.
You can also update the principalId value in infra/main.bicep with your Principal ID before running azd provision. This will automatically assign roles during provisioning.
Note: RBAC permission changes can take 5-10 minutes to propagate. If you encounter "Forbidden" errors after assigning roles, wait a few minutes and try again.
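Because propagation is eventual, any local script that calls Azure immediately after assigning roles can wrap its first request in a retry loop. This is a generic sketch, not code from the accelerator -- `PermissionError` stands in for whatever exception your SDK call raises on 403:

```python
import time

def retry_on_forbidden(call, attempts=5, delay_seconds=30):
    """Retry a zero-argument callable while it raises PermissionError,
    to ride out RBAC propagation delays. Re-raises on the final attempt."""
    for attempt in range(1, attempts + 1):
        try:
            return call()
        except PermissionError:
            if attempt == attempts:
                raise
            time.sleep(delay_seconds)
```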
- Ensure role assignments from Step 4.3 are created
- Navigate to your Search service in the Azure Portal
- Under Settings, select Keys
- Select either Role-based access control or Both
- Set `AZURE_AUTH_TYPE=rbac` in your `.env` file
- Set `APP_ENV=dev` in your `.env` file to use Azure CLI credentials locally
📋 Note: Python 3.11 is required for this project. Ensure it's installed before proceeding.
# Create virtual environment
python -m venv .venv
# Activate virtual environment
.\.venv\Scripts\Activate.ps1

# Create virtual environment
python3.11 -m venv .venv
# Activate virtual environment
source .venv/bin/activate

Install dependencies for all Python services:
# Navigate to backend directory
cd code/backend
# Install dependencies using Poetry
pip install --upgrade pip
pip install poetry
poetry self add poetry-plugin-export@latest
poetry export -o requirements.txt
pip install -r requirements.txt

# Navigate to batch directory
cd batch
# Install dependencies
pip install -r requirements.txt

# From repository root
.devcontainer/setupEnv.sh

This script will install all dependencies for the backend, batch, and utilities folders automatically.
📋 Terminal Reminder: Open a dedicated terminal window (Terminal 1) for the Admin Web (Streamlit). All commands assume you start from the repository root directory.
The Admin application is a Streamlit-based interface for managing documents, viewing data, and configuring the solution.
# From repository root
cd code/backend

- Press `Ctrl+Shift+D` to open the Run and Debug panel
- Select "Launch Admin site" from the dropdown
- Click the green play button or press `F5`
This will:
- Automatically start the Streamlit server
- Open the browser at http://localhost:8501
- Enable hot reload for code changes
- Allow setting breakpoints for debugging
# Make sure you're in code/backend directory
# Ensure virtual environment is activated
streamlit run Admin.py

The Admin interface will automatically open in your browser at:
http://localhost:8501
Admin Features:
- Upload and manage documents
- Configure search indexes
- View conversation logs
- Test embeddings and queries
- Monitor system health
📋 Terminal Reminder: Open a second dedicated terminal window (Terminal 2) for the Backend API (Flask). Keep Terminal 1 (Admin Web) running. All commands assume you start from the repository root directory.
The Backend API is a Python Flask application that provides REST endpoints for the frontend.
# From repository root
cd code

Ensure your virtual environment is activated and the `.env` file from `azd provision` is accessible. The Flask app will read environment variables from `.azure/<env-name>/.env`.
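The azd-generated file is plain `KEY="value"` lines, so if you ever need to inspect it from a helper script, a minimal stdlib parser is enough. This sketch is illustrative only -- the application has its own configuration loading:

```python
from pathlib import Path

def load_env(path):
    """Parse a dotenv-style file into a dict, skipping blanks and # comments.

    Surrounding double quotes are stripped from values.
    """
    values = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip().strip('"')
    return values
```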
- Press `Ctrl+Shift+D` to open the Run and Debug panel
- Select "Launch Backend (api)" from the dropdown
- Click the green play button or press `F5`
This will:
- Automatically activate the virtual environment
- Start the Flask development server
- Enable hot reload for code changes
- Allow setting breakpoints for debugging
# Make sure you're in code directory
# Ensure virtual environment is activated
poetry run flask run

The Flask API will start at:
http://127.0.0.1:5050
⚠️ Important: Make sure this port matches the proxy configuration in `vite.config.ts` (Step 9.3).
API Endpoints:
- Health check: http://127.0.0.1:5050/health
- API routes: http://127.0.0.1:5050/api/...
📋 Terminal Reminder: Open a third dedicated terminal window (Terminal 3) for the Frontend service. Keep Terminals 1 (Admin Web) and 2 (Backend API) running. All commands in this section assume you start from the repository root directory.
The UI is a React-based application located under code/frontend.
# From repository root
cd code/frontend
npm install

Update `vite.config.ts` to point to the Flask API URL. For local development, the default Flask API runs at http://127.0.0.1:5050.
Edit vite.config.ts:
export default defineConfig({
// ... other config
server: {
proxy: {
'/api': {
target: 'http://127.0.0.1:5050', // Update this to your API URL
changeOrigin: true,
}
}
}
})

- Press `Ctrl+Shift+D` to open the Run and Debug panel
- Select "Launch Frontend (UI)" from the dropdown
- Click the green play button or press `F5`
This will:
- Automatically install dependencies
- Start the Vite development server
- Enable hot reload for instant updates
- Allow setting breakpoints for debugging
# Make sure you're in code/frontend directory
npm run dev

The frontend will start at:
http://localhost:5174
(or whichever port Vite assigns - check the terminal output)
⚠️ Hot Reload: Changes to React components will automatically reload in the browser. No need to manually refresh!
📋 Terminal Reminder: The Batch Processing Function runs in the background. You can run it in a separate terminal if needed, or open a fourth terminal window (Terminal 4).
The Batch Processing Function is an Azure Function that handles document processing, chunking, and indexing.
Navigate to the batch function directory and configure the local settings:
# From repository root
cd code/backend/batch

# Copy the sample file
cp local.settings.json.sample local.settings.json # Linux/macOS
# or
Copy-Item local.settings.json.sample local.settings.json # Windows PowerShell

Edit `local.settings.json` and update the following:
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage__accountName": "<your-storage-account-name>",
"FUNCTIONS_WORKER_RUNTIME": "python"
}
}

The batch function needs access to the same environment variables as the other services:
# Copy .env from the azd-generated location
cp ../../../.azure/<env-name>/.env . # Linux/macOS
# or
Copy-Item ..\..\..\.azure\<env-name>\.env . # Windows PowerShell
⚠️ Important: Replace `<env-name>` with your actual azd environment name. You can find it by running `azd env list`.
# Make sure you're in code/backend/batch directory
pip install -r requirements.txt

# Make sure you're in code/backend/batch directory
# Ensure virtual environment is activated
poetry run func start

- Install the Azure Functions VS Code extension
- Press `F5` with the batch function folder open
- Or right-click on the `function_app.py` file and select "Execute Function Now..."
The Azure Functions runtime will start and display available HTTP trigger URLs:
Functions:
batch_push_results: [POST] http://localhost:7071/api/batch_push_results
batch_start_processing: [POST] http://localhost:7071/api/batch_start_processing
Triggering Functions:
- Use the URLs displayed in the terminal
- Use tools like Postman or curl to send HTTP requests
- Upload documents through the Admin interface (which will trigger the function)
⚠️ Important: You may need to stop the deployed function in the Azure Portal to ensure all requests are processed locally during debugging.
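From code, triggering the local function is a plain HTTP POST. The sketch below uses only the standard library; the JSON payload shown is a placeholder -- check `function_app.py` for the exact schema each trigger expects:

```python
import json
import urllib.request

# Placeholder payload -- inspect function_app.py for the real schema.
payload = json.dumps({"filename": "example.pdf"}).encode()
req = urllib.request.Request(
    "http://localhost:7071/api/batch_start_processing",
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)

# With the Functions host running, send it with:
# with urllib.request.urlopen(req) as resp:
#     print(resp.status, resp.read().decode())
```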
Before using the application, confirm all services are running correctly:
| Terminal | Service | Command | Expected Output | URL |
|---|---|---|---|---|
| Terminal 1 | Admin Web (Streamlit) | streamlit run Admin.py |
Local URL: http://localhost:8501 |
http://localhost:8501 |
| Terminal 2 | Backend API (Flask) | poetry run flask run |
Running on http://127.0.0.1:5050 |
http://127.0.0.1:5050 |
| Terminal 3 | Frontend (React/Vite) | npm run dev |
Local: http://localhost:5174/ |
http://localhost:5174 |
| Terminal 4 (Optional) | Batch Processing Function | poetry run func start |
Functions: batch_push_results: [POST] http://localhost:7071/api/... |
http://localhost:7071 |
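The table above can be checked automatically with a small stdlib script that probes each URL. This is a convenience sketch, not part of the repository:

```python
import urllib.error
import urllib.request

# Default local endpoints from the table above.
SERVICES = {
    "Admin Web": "http://localhost:8501",
    "Backend API": "http://127.0.0.1:5050/health",
    "Frontend": "http://localhost:5174",
    "Batch Function": "http://localhost:7071",
}

def is_up(url, timeout=3):
    """True if the URL answers any HTTP response, False if unreachable."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True   # the server answered, even if with an error status
    except OSError:
        return False  # connection refused or timed out

def report(services=SERVICES):
    for name, url in services.items():
        print(f"{name}: {'UP' if is_up(url) else 'DOWN'} ({url})")
```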
1. Check Backend API:
# In a new terminal
curl http://127.0.0.1:5050/health
# Expected: JSON response with health status

2. Check Frontend:
- Open browser to http://localhost:5174
- Should see the Chat interface
- Try typing a message (requires backend to be running)
3. Check Admin Interface:
- Open browser to http://localhost:8501
- Should see the Admin dashboard
- Navigate through different pages to verify functionality
4. Check Batch Function (if running):
# In a new terminal
curl http://localhost:7071/api/batch_push_results -X POST
# Should return a response (may be an error if no data, but confirms it's running)

Service not starting?
- Ensure you're in the correct directory
- Verify virtual environment is activated (Python services)
- Check that the port is not already in use
- Review error messages in the terminal
- Ensure `azd provision` completed successfully
Can't access services?
- Verify firewall isn't blocking the ports
- Try `http://localhost:<port>` instead of `http://127.0.0.1:<port>` (or vice versa)
- Ensure services show "startup complete" messages
- Check VS Code terminal output for errors
Environment variable errors?
- Verify the `.azure/<env-name>/.env` file exists and contains values
- Check that `APP_ENV=dev` is set for local development
- Ensure Azure CLI is logged in (`az account show`)
- Verify RBAC roles have been assigned and propagated
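A quick way to spot a malformed `.env` file (a common cause of these errors) is to flag any line that is neither blank, a comment, nor a `KEY=value` assignment. A small illustrative checker:

```python
import re

# A valid assignment starts with an identifier followed by '='.
_ASSIGNMENT = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*=")

def malformed_lines(env_text):
    """Return (line_number, line) pairs that are not blanks, comments,
    or KEY=value assignments."""
    bad = []
    for number, line in enumerate(env_text.splitlines(), start=1):
        stripped = line.strip()
        if stripped and not stripped.startswith("#") and not _ASSIGNMENT.match(stripped):
            bad.append((number, line))
    return bad
```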
You can deploy the full solution or individual services using Azure Developer CLI:
# Deploy all services
azd deploy
# Deploy individual services
azd deploy web # Frontend chat application
azd deploy adminweb # Admin Streamlit application
azd deploy function # Batch processing function

For a containerized local development environment:
- Ensure Docker Desktop is installed and running
- Provision Azure resources using `azd provision` to generate the `.env` file
- Locate the .env file:

  # Find your environment name
  azd env list
  # The .env file will be at:
  # .azure/<env-name>/.env

- Configure APP_ENV for local development:

  Edit `.azure/<env-name>/.env` and set:

  APP_ENV=dev # Use Azure CLI credentials for local development

  For production, set `APP_ENV=prod` to use Managed Identity.
- Add AzureWebJobsStorage (required):

  This value needs to be added manually. Get it from the Azure Portal:
  - Navigate to your Function App
  - Go to Settings → Configuration
  - Copy the `AzureWebJobsStorage` value
  - Add it to your `.env` file
# Option 1: Using Make
make docker-compose-up
# Option 2: Using docker-compose directly
cd docker
AZD_ENV_FILE=../.azure/<env-name>/.env docker-compose up
# Windows PowerShell:
$env:AZD_ENV_FILE="../.azure/<env-name>/.env"
docker-compose up

Note: By default, these commands will run the latest Docker images built from the main branch. To use custom images:
- Build your own images using the Dockerfiles in the `docker/` directory
- Update `docker/docker-compose.yml` with your image tags
- Run `docker-compose up`
For automated role assignment, use the following script. Replace the placeholder values with your actual Azure resource information:
#!/bin/bash
# Variables - Update these with your values
SUBSCRIPTION_ID="<your-subscription-id>"
RESOURCE_GROUP="<your-resource-group>"
PRINCIPAL_ID="<user-or-service-principal-id>"
SOLUTION_PREFIX="<your-solution-prefix>"
# Get your principal ID automatically (if using current user)
PRINCIPAL_ID=$(az ad signed-in-user show --query id -o tsv)
# Azure AI Search
az role assignment create --assignee $PRINCIPAL_ID --role "Search Index Data Contributor" --scope /subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.Search/searchServices/srch-$SOLUTION_PREFIX
az role assignment create --assignee $PRINCIPAL_ID --role "Search Service Contributor" --scope /subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.Search/searchServices/srch-$SOLUTION_PREFIX
az role assignment create --assignee $PRINCIPAL_ID --role "Search Index Data Reader" --scope /subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.Search/searchServices/srch-$SOLUTION_PREFIX
# Azure OpenAI
az role assignment create --assignee $PRINCIPAL_ID --role "Cognitive Services User" --scope /subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.CognitiveServices/accounts/oai-$SOLUTION_PREFIX
az role assignment create --assignee $PRINCIPAL_ID --role "Cognitive Services OpenAI User" --scope /subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.CognitiveServices/accounts/oai-$SOLUTION_PREFIX
# Computer Vision (if enabled)
az role assignment create --assignee $PRINCIPAL_ID --role "Cognitive Services User" --scope /subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.CognitiveServices/accounts/cv-$SOLUTION_PREFIX
# Speech Services (if enabled)
az role assignment create --assignee $PRINCIPAL_ID --role "Cognitive Services User" --scope /subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.CognitiveServices/accounts/spch-$SOLUTION_PREFIX
# Document Intelligence
az role assignment create --assignee $PRINCIPAL_ID --role "Cognitive Services User" --scope /subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.CognitiveServices/accounts/di-$SOLUTION_PREFIX
# Content Safety (if enabled)
az role assignment create --assignee $PRINCIPAL_ID --role "Cognitive Services User" --scope /subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.CognitiveServices/accounts/cs-$SOLUTION_PREFIX
# Storage Account
az role assignment create --assignee $PRINCIPAL_ID --role "Storage Blob Data Contributor" --scope /subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.Storage/storageAccounts/st$SOLUTION_PREFIX
az role assignment create --assignee $PRINCIPAL_ID --role "Storage Queue Data Contributor" --scope /subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.Storage/storageAccounts/st$SOLUTION_PREFIX
# Key Vault
az role assignment create --assignee $PRINCIPAL_ID --role "Key Vault Secrets User" --scope /subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.KeyVault/vaults/kv-$SOLUTION_PREFIX
# Cosmos DB (if using CosmosDB)
az cosmosdb sql role assignment create --account-name cosmos-$SOLUTION_PREFIX --resource-group $RESOURCE_GROUP --scope "/" --principal-id $PRINCIPAL_ID --role-definition-id 00000000-0000-0000-0000-000000000002
# PostgreSQL (if using PostgreSQL) - Add user as Microsoft Entra ID administrator
# Note: The CLI command may not work with older Azure CLI versions. If you encounter issues, use the Azure Portal method below.
# Option 1: Using Azure CLI (if 'ad-admin' command is not recognized, try updating Azure CLI with 'az upgrade')
az postgres flexible-server ad-admin create --server-name <server-name> --resource-group $RESOURCE_GROUP --object-id $PRINCIPAL_ID --display-name <display-name>
# Option 2: Using Azure Portal (Recommended if CLI fails)
# Step 1: Navigate to your PostgreSQL flexible server in Azure Portal (https://portal.azure.com)
# Step 2: Go to Settings -> Authentication
# Step 3: Select "PostgreSQL and Microsoft Entra authentication" or "Microsoft Entra authentication only"
# Step 4: Click "Add Microsoft Entra Admins"
# Step 5: Search for your user account by email or display name
# Step 6: Select your account and click "Select"
# Step 7: Click "Save" at the top of the page
# Step 8: Wait for the configuration to complete (1-2 minutes)
#
# You can get your display name from: Microsoft Entra ID -> Users -> Your User Account -> Display Name

Windows PowerShell Version:
# Variables - Update these with your values
$SUBSCRIPTION_ID = "<your-subscription-id>"
$RESOURCE_GROUP = "<your-resource-group>"
$SOLUTION_PREFIX = "<your-solution-prefix>"
# Get your principal ID automatically
$PRINCIPAL_ID = (az ad signed-in-user show --query id -o tsv)
# Azure AI Search
az role assignment create --assignee $PRINCIPAL_ID --role "Search Index Data Contributor" --scope /subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.Search/searchServices/srch-$SOLUTION_PREFIX
az role assignment create --assignee $PRINCIPAL_ID --role "Search Service Contributor" --scope /subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.Search/searchServices/srch-$SOLUTION_PREFIX
# Azure OpenAI
az role assignment create --assignee $PRINCIPAL_ID --role "Cognitive Services User" --scope /subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.CognitiveServices/accounts/oai-$SOLUTION_PREFIX
az role assignment create --assignee $PRINCIPAL_ID --role "Cognitive Services OpenAI User" --scope /subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.CognitiveServices/accounts/oai-$SOLUTION_PREFIX
# Storage Account
az role assignment create --assignee $PRINCIPAL_ID --role "Storage Blob Data Contributor" --scope /subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.Storage/storageAccounts/st$SOLUTION_PREFIX
az role assignment create --assignee $PRINCIPAL_ID --role "Storage Queue Data Contributor" --scope /subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.Storage/storageAccounts/st$SOLUTION_PREFIX
# Key Vault
az role assignment create --assignee $PRINCIPAL_ID --role "Key Vault Secrets User" --scope /subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.KeyVault/vaults/kv-$SOLUTION_PREFIX
# Cosmos DB (if using CosmosDB)
az cosmosdb sql role assignment create --account-name cosmos-$SOLUTION_PREFIX --resource-group $RESOURCE_GROUP --scope "/" --principal-id $PRINCIPAL_ID --role-definition-id 00000000-0000-0000-0000-000000000002
# PostgreSQL (if using PostgreSQL) - Add user as Microsoft Entra ID administrator
# Note: The CLI command may not work with older Azure CLI versions. If you encounter issues, use the Azure Portal method below.
# Option 1: Using Azure CLI (if 'ad-admin' command is not recognized, try updating Azure CLI with 'az upgrade')
az postgres flexible-server ad-admin create --server-name <server-name> --resource-group $RESOURCE_GROUP --object-id $PRINCIPAL_ID --display-name <display-name>
# Option 2: Using Azure Portal (Recommended if CLI fails)
# Step 1: Navigate to your PostgreSQL flexible server in Azure Portal (https://portal.azure.com)
# Step 2: Go to Settings -> Authentication
# Step 3: Select "PostgreSQL and Microsoft Entra authentication" or "Microsoft Entra authentication only"
# Step 4: Click "Add Microsoft Entra Admins"
# Step 5: Search for your user account by email or display name
# Step 6: Select your account and click "Select"
# Step 7: Click "Save" at the top of the page
# Step 8: Wait for the configuration to complete (1-2 minutes)
#
# You can get your display name from: Microsoft Entra ID -> Users -> Your User Account -> Display Name
⚠️ Important:
- Replace all placeholder values (`<your-subscription-id>`, `<your-resource-group>`, etc.) with your actual values
- Some services (Computer Vision, Speech, Content Safety) are optional and may not exist in your deployment
- RBAC changes can take 5-10 minutes to propagate
- Run `azd env get-values` to see your solution prefix and other values
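The repeated `az role assignment create` invocations above all follow one template, so they can also be generated from a table. The role/resource pairs below are a partial illustration using the same naming convention (`srch-`, `oai-`, `st`) as the scripts; extend the table to match your deployment:

```python
SCOPE_PREFIX = "/subscriptions/{sub}/resourceGroups/{rg}/providers"

# (role name, resource-id suffix) pairs -- extend as needed.
ASSIGNMENTS = [
    ("Search Index Data Contributor", "Microsoft.Search/searchServices/srch-{prefix}"),
    ("Cognitive Services OpenAI User", "Microsoft.CognitiveServices/accounts/oai-{prefix}"),
    ("Storage Blob Data Contributor", "Microsoft.Storage/storageAccounts/st{prefix}"),
]

def role_commands(sub, rg, prefix, principal_id):
    """Render one `az role assignment create` command per table row."""
    commands = []
    for role, resource in ASSIGNMENTS:
        scope = f"{SCOPE_PREFIX.format(sub=sub, rg=rg)}/{resource.format(prefix=prefix)}"
        commands.append(
            f'az role assignment create --assignee {principal_id} '
            f'--role "{role}" --scope {scope}'
        )
    return commands
```

Print the returned strings and paste them into a shell, or pass them to `subprocess.run` once you have verified the scopes.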
# Check available Python versions
python --version
python3.11 --version
# If python3.11 not found, install it:
# Ubuntu/Debian: sudo apt install python3.11 python3.11-venv
# Windows: winget install Python.Python.3.11
# macOS: brew install python@3.11

# Recreate virtual environment
rm -rf .venv # Linux/macOS
# or
Remove-Item -Recurse -Force .venv # Windows PowerShell
# Create new virtual environment
python3.11 -m venv .venv # Linux/macOS
python -m venv .venv # Windows
# Activate and reinstall dependencies
source .venv/bin/activate # Linux/macOS
# or
.\.venv\Scripts\Activate.ps1 # Windows
# Reinstall dependencies
cd code/backend
pip install --upgrade pip
pip install poetry
poetry export -o requirements.txt
pip install -r requirements.txt

# Check Azure CLI authentication
az account show
# If not authenticated, login
az login
# Set correct subscription
az account set --subscription "your-subscription-id"
# Verify RBAC roles are assigned
az role assignment list --assignee $(az ad signed-in-user show --query id -o tsv)
# Force refresh tokens (if getting 401/403 errors)
az account get-access-token --resource https://management.azure.com/

# Fix ownership of files
sudo chown -R $USER:$USER .
# Fix script permissions
chmod +x .devcontainer/setupEnv.sh
chmod +x scripts/*.sh

# PowerShell execution policy
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser
# Long path support (Windows 10 1607+, run as Administrator)
New-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Control\FileSystem" `
-Name "LongPathsEnabled" -Value 1 -PropertyType DWORD -Force
# SSL certificate issues
pip install --trusted-host pypi.org --trusted-host pypi.python.org --trusted-host files.pythonhosted.org -r requirements.txt

# Find process using a port (Linux/macOS)
lsof -i :5050 # Backend Flask
lsof -i :5174 # Frontend Vite
lsof -i :8501 # Admin Streamlit
lsof -i :7071 # Azure Functions
# Kill process by PID
kill -9 <PID>
# Find process using a port (Windows)
netstat -ano | findstr :5050
netstat -ano | findstr :5174
netstat -ano | findstr :8501
netstat -ano | findstr :7071
# Kill process by PID (Windows)
taskkill /PID <PID> /F

# Clear npm cache
npm cache clean --force
# Remove node_modules and reinstall
cd code/frontend
rm -rf node_modules package-lock.json # Linux/macOS
# or
Remove-Item -Recurse -Force node_modules, package-lock.json # Windows
npm install
# If npm install fails, try with legacy peer deps
npm install --legacy-peer-deps

# Check environment variables are loaded (Linux/macOS)
env | grep AZURE
# Check environment variables (Windows PowerShell)
Get-ChildItem Env:AZURE*
# Validate .env file format
cat .azure/<env-name>/.env | grep -v '^#' | grep '=' # Should show key=value pairs
# Check azd environment
azd env list
azd env get-values

# Ensure Azure Functions Core Tools is installed
func --version
# If not installed:
# Windows: winget install Microsoft.Azure.FunctionsCoreTools
# Linux/macOS: See Step 1 for installation instructions
# Check AzureWebJobsStorage configuration
cat code/backend/batch/local.settings.json
# Validate function app structure
ls code/backend/batch/
# Should see: function_app.py, local.settings.json, requirements.txt
# Clear Functions cache
rm -rf ~/.azure-functions-core-tools # Linux/macOS
Remove-Item -Recurse ~/.azure-functions-core-tools # Windows

# Clear Streamlit cache
streamlit cache clear
# Run with verbose logging
streamlit run Admin.py --logger.level=debug
# Check Streamlit version
streamlit --version
# Reinstall Streamlit
pip uninstall streamlit
pip install streamlit

# Check azd version
azd version
# Update azd to latest version
# Windows: winget upgrade Microsoft.Azd
# Linux/macOS: curl -fsSL https://aka.ms/install-azd.sh | bash
# Clean azd environment and retry
azd env delete <env-name>
azd init
azd provision
# Check Azure subscription quotas
# Many provision failures are due to quota limits
# Visit Azure Portal → Quotas to check and request increases
# Enable detailed logging
azd provision --debug

# Verify Cosmos DB RBAC role assignment
az cosmosdb sql role assignment list \
--account-name cosmos-<prefix> \
--resource-group <resource-group>
# Check if Cosmos DB is accessible
az cosmosdb show \
--name cosmos-<prefix> \
--resource-group <resource-group>
# Test connection using Azure SDK
python -c "from azure.cosmos import CosmosClient; print('Cosmos DB SDK imported successfully')"

# Verify Search service is running
az search service show \
--name srch-<prefix> \
--resource-group <resource-group>
# Check RBAC roles for Search
az role assignment list \
--scope /subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Search/searchServices/srch-<prefix>
# Test search index
az search index list \
--service-name srch-<prefix> \
--resource-group <resource-group>

# Verify OpenAI service
az cognitiveservices account show \
--name oai-<prefix> \
--resource-group <resource-group>
# Check model deployments
az cognitiveservices account deployment list \
--name oai-<prefix> \
--resource-group <resource-group>
# Verify RBAC roles
az role assignment list \
--scope /subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.CognitiveServices/accounts/oai-<prefix>

- Enable Verbose Logging:
  - Set `LOGLEVEL=DEBUG` in your `.env` file
  - Set `PACKAGE_LOGGING_LEVEL=DEBUG` for Azure SDK logging
  - Add `AZURE_LOGGING_PACKAGES=azure.core,azure.identity` to see authentication details
- Check Application Logs:
  - Flask: Check the terminal running `flask run`
  - Streamlit: Check the terminal running `streamlit run`
  - Functions: Check the Functions Core Tools terminal output
  - Frontend: Check the browser developer console (F12)
- Isolate the Issue:
  - Test each service independently
  - Verify Azure resources are accessible from the Azure Portal
  - Use Azure Storage Explorer to check blob/queue contents
  - Test API endpoints with curl or Postman
- Common Error Messages:

  | Error | Likely Cause | Solution |
  |---|---|---|
  | `401 Unauthorized` | RBAC roles not assigned or not propagated | Wait 5-10 minutes, verify role assignments |
  | `403 Forbidden` | Insufficient permissions | Check RBAC roles, verify authentication |
  | `404 Not Found` | Resource doesn't exist or wrong name | Verify resource names in the `.env` file |
  | `ModuleNotFoundError` | Missing Python dependency | Run `pip install -r requirements.txt` |
  | `EADDRINUSE` | Port already in use | Kill the process using the port (see above) |
  | `azd provision failed` | Quota limits or infrastructure error | Check quotas, review error details |
Once all services are running successfully, you can:
- Access the Application: Open http://localhost:5174 in your browser to start chatting with your data
- Upload Documents: Use the Admin interface at http://localhost:8501 to upload and manage documents
- Test API Endpoints: Use the Flask API at http://127.0.0.1:5050/docs to explore available endpoints
- Monitor Processing: Watch the Batch Function logs to see document processing in action
- Customize the Solution: Modify prompts, orchestration strategies, and UI components to fit your needs
- Deploy to Azure: Use `azd deploy` to deploy your local changes to Azure
- LOCAL_DEPLOYMENT.md - Detailed local deployment scenarios and Docker options
- best_practices.md - Best practices for production deployments
- model_configuration.md - Configure AI models and embeddings
- conversation_flow_options.md - Customize conversation orchestration
- integrated_vectorization.md - Set up integrated vectorization
- azure_app_service_auth_setup.md - Configure authentication
- web-apps.md - Deploy to Azure App Service
- supported_file_types.md - Supported document formats
- advanced_image_processing.md - Enable vision-based document processing