# Testing the Skraak MCP Server

## Quick Testing with Shell Scripts

The easiest way to test the server is using the provided shell scripts:

### Comprehensive Test (All Tools)

```bash
./test_mcp.sh [path-to-database]
```

Tests all functionality:

1. Server initialization
2. Tool listing
3. `get_current_time` tool
4. `query_datasets` tool

Default database path: `./db/skraak.duckdb`

### Quick Tool Tests

**Get Current Time:**

```bash
./get_time.sh [path-to-database]
```

**Query Datasets:**

```bash
./query_datasets.sh [path-to-database]
```

Both scripts output clean JSON using `jq`.

## Manual JSON-RPC Testing

You can send messages manually via stdin:

```bash
./skraak_mcp ./db/skraak.duckdb
```

Then type these JSON-RPC messages (one per line):

### 1. Initialize

```json
{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test","version":"1.0"}}}
```

### 2. List Tools

```json
{"jsonrpc":"2.0","id":2,"method":"tools/list","params":{}}
```

### 3. Call get_current_time

```json
{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"get_current_time","arguments":{}}}
```

### 4. Call query_datasets

```json
{"jsonrpc":"2.0","id":4,"method":"tools/call","params":{"name":"query_datasets","arguments":{}}}
```

## Expected Responses

### Initialize Response

```json
{"jsonrpc":"2.0","id":1,"result":{"capabilities":{"logging":{},"tools":{"listChanged":true}},"protocolVersion":"2024-11-05","serverInfo":{"name":"skraak_mcp","version":"v1.0.0"}}}
```

### List Tools Response

```json
{"jsonrpc":"2.0","id":2,"result":{"tools":[{"name":"get_current_time","description":"Get the current system time with timezone information","inputSchema":{"type":"object","additionalProperties":false},"outputSchema":{"type":"object","required":["time","timezone","unix"],"properties":{"time":{"type":"string","description":"Current system time in RFC3339 format"},"timezone":{"type":"string","description":"System timezone"},"unix":{"type":"integer","description":"Unix timestamp in seconds"}}}},{"name":"query_datasets","description":"Query all datasets from the database. Returns dataset information including ID, name, description, timestamps, active status, and type (organise/test/train).","inputSchema":{"type":"object","additionalProperties":false},"outputSchema":{"type":"object","required":["datasets","count"],"properties":{"datasets":{"type":"array","description":"Array of dataset records from the database"},"count":{"type":"integer","description":"Total number of datasets returned"}}}}]}}
```

### Get Current Time Response

```json
{"jsonrpc":"2.0","id":3,"result":{"structuredContent":{"time":"2026-01-25T16:30:00+13:00","timezone":"Local","unix":1769311800}}}
```

### Query Datasets Response

```json
{"jsonrpc":"2.0","id":4,"result":{"structuredContent":{"count":10,"datasets":[{"id":"U1khPsIN_r9-","name":"sorted data test","description":null,"created_at":"2025-08-26T09:01:04Z","last_modified":"2025-08-26T09:03:05Z","active":false,"type":"organise"}]}}}
```

## Testing with Claude Desktop

Configure the server in Claude Desktop:

1. Edit your MCP config file:
   - **Linux**: `~/.config/Claude/claude_desktop_config.json`
   - **macOS**: `~/Library/Application Support/Claude/claude_desktop_config.json`
   - **Windows**: `%APPDATA%\Claude\claude_desktop_config.json`
2. Add this configuration:

   ```json
   {
     "mcpServers": {
       "skraak_mcp": {
         "command": "/home/david/go/src/skraak_mcp/skraak_mcp",
         "args": ["/home/david/go/src/skraak_mcp/db/skraak.duckdb"]
       }
     }
   }
   ```

3. Restart Claude Desktop
4. Test by asking:
   - "What time is it?"
   - "Query all datasets"
   - "List the available datasets"

## Troubleshooting

- **Server immediately exits**: Normal - it waits for stdin input
- **"Usage: ./skraak_mcp <path>"**: You must provide the database path argument
- **JSON parsing errors**: Each JSON message must be on a single line
- **No response**: Server outputs to stdout; notifications may appear between responses
- **Tool not found**: Initialize the connection first before calling tools
- **Database connection failed**: Check that the database path exists and is readable
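Two quick shell checks cover the most common setup problems in the list above (an unreadable database, a malformed Claude Desktop config). A minimal sketch, using the default paths from this document; adjust them to your machine:

```bash
# Sketch: pre-flight checks before launching the server or restarting Claude Desktop.
# Paths are the defaults used in this document -- adjust if yours differ.
test -r ./db/skraak.duckdb && echo "database readable" || echo "database missing or unreadable"
jq . ~/.config/Claude/claude_desktop_config.json > /dev/null && echo "Claude config is valid JSON"
```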
#!/bin/bash
# Simple script to get current time from MCP server

DB_PATH="${1:-../db/skraak.duckdb}"

{
  # Initialize
  echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"cli-test","version":"1.0.0"}}}'
  sleep 0.2

  # Call get_current_time
  echo '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"get_current_time","arguments":{}}}'
  sleep 0.2
} | ../skraak_mcp "$DB_PATH" 2>/dev/null | grep '"id":2' | jq '.result.structuredContent'
#!/bin/bash
# Test script for all MCP prompts
# Tests all 6 prompts including the new location/cluster/file prompts

DB_PATH="${1:-../db/test.duckdb}"

if [ ! -f "$DB_PATH" ]; then
  echo "Error: Database not found at $DB_PATH" >&2
  exit 1
fi

echo "=== Testing All MCP Prompts ===" >&2
echo "Database: $DB_PATH" >&2
echo "" >&2

{
  echo "=== 1. Initialize ===" >&2
  echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test-client","version":"1.0.0"}}}'
  sleep 0.2

  echo "" >&2
  echo "=== 2. List Prompts ===" >&2
  echo '{"jsonrpc":"2.0","id":2,"method":"prompts/list","params":{}}'
  sleep 0.2

  echo "" >&2
  echo "=== 3. Get query_active_datasets ===" >&2
  echo '{"jsonrpc":"2.0","id":3,"method":"prompts/get","params":{"name":"query_active_datasets","arguments":{}}}'
  sleep 0.2

  echo "" >&2
  echo "=== 4. Get explore_database_schema (overview) ===" >&2
  echo '{"jsonrpc":"2.0","id":4,"method":"prompts/get","params":{"name":"explore_database_schema","arguments":{"focus_area":"overview"}}}'
  sleep 0.2

  echo "" >&2
  echo "=== 5. Get explore_location_hierarchy (no args) ===" >&2
  echo '{"jsonrpc":"2.0","id":5,"method":"prompts/get","params":{"name":"explore_location_hierarchy","arguments":{}}}'
  sleep 0.2

  echo "" >&2
  echo "=== 6. Get explore_location_hierarchy (with dataset_id) ===" >&2
  echo '{"jsonrpc":"2.0","id":6,"method":"prompts/get","params":{"name":"explore_location_hierarchy","arguments":{"dataset_id":"vgIr9JSH_lFj"}}}'
  sleep 0.2

  echo "" >&2
  echo "=== 7. Get query_location_data ===" >&2
  echo '{"jsonrpc":"2.0","id":7,"method":"prompts/get","params":{"name":"query_location_data","arguments":{}}}'
  sleep 0.2

  echo "" >&2
  echo "=== 8. Get analyze_cluster_files (with cluster_id) ===" >&2
  echo '{"jsonrpc":"2.0","id":8,"method":"prompts/get","params":{"name":"analyze_cluster_files","arguments":{"cluster_id":"oNI9jqszP4Bk"}}}'
  sleep 0.2

  echo "" >&2
  echo "=== 9. Get system_status_check ===" >&2
  echo '{"jsonrpc":"2.0","id":9,"method":"prompts/get","params":{"name":"system_status_check","arguments":{}}}'
  sleep 0.2

  echo "" >&2
  echo "=== 10. Test Error Handling (analyze_cluster_files without cluster_id) ===" >&2
  echo '{"jsonrpc":"2.0","id":10,"method":"prompts/get","params":{"name":"analyze_cluster_files","arguments":{}}}'
  sleep 0.2
} | ../skraak_mcp "$DB_PATH" 2>/dev/null
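If you only want to confirm which prompts are registered, rather than read the full `prompts/get` output, the script's stdout can be filtered with `jq`. A sketch, assuming the script above is saved as `test_all_prompts.sh` (the filename is not fixed by this dump):

```bash
# Sketch: list only the prompt names returned by the prompts/list call (id 2).
./test_all_prompts.sh ../db/test.duckdb | grep '"id":2' | jq -r '.result.prompts[].name'
```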
#!/bin/bash
# Simple test of import_audio_files tool registration
# Just checks if the server can start and the tool is registered

DB_PATH="${1:-../db/test.duckdb}"

echo "=== Testing import_audio_files Tool Registration ==="
echo "Database: $DB_PATH"
echo ""

# Create a test script that sends proper MCP initialization + tools/list
cat > /tmp/test_import_mcp.txt << 'EOF'
{"jsonrpc":"2.0","method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test","version":"1.0"}},"id":1}
{"jsonrpc":"2.0","method":"tools/list","id":2}
EOF

echo "Sending MCP commands..."
cat /tmp/test_import_mcp.txt | ../skraak_mcp "$DB_PATH" 2>&1 | grep -A 20 '"method":"tools/list"' | jq -r 'select(.result != null) | .result.tools[] | select(.name == "import_audio_files") | "✓ Tool registered: \(.name)\n Description: \(.description)\n Required inputs: \(.inputSchema.required | join(", "))"'

echo ""
echo "=== Test Complete ==="

# Cleanup
rm -f /tmp/test_import_mcp.txt
#!/bin/bash
# Test script for import_audio_files tool
# Tests tool registration and basic validation

# Database path - USE TEST DATABASE
DB_PATH="${1:-../db/test.duckdb}"

echo "=== Testing import_audio_files Tool ==="
echo "Database: $DB_PATH"
echo ""

# Test 1: List available tools (should include import_audio_files)
echo "Test 1: List available tools"
echo '{"jsonrpc":"2.0","method":"tools/list","id":1}' | ../skraak_mcp "$DB_PATH" | jq -r '.result.tools[] | select(.name == "import_audio_files") | "✓ Found: \(.name) - \(.description)"'
echo ""

# Test 2: Get tool schema
echo "Test 2: Get import_audio_files tool schema"
echo '{"jsonrpc":"2.0","method":"tools/list","id":2}' | ../skraak_mcp "$DB_PATH" | jq '.result.tools[] | select(.name == "import_audio_files") | .inputSchema.properties'
echo ""

# Test 3: Test validation with invalid folder path
echo "Test 3: Test validation - invalid folder path"
echo '{"jsonrpc": "2.0","method": "tools/call","params": {"name": "import_audio_files","arguments": {"folder_path": "/nonexistent/folder","dataset_id": "test123","location_id": "loc456","cluster_id": "clust789"}},"id": 3}' | ../skraak_mcp "$DB_PATH" | jq -r '.error.message // "Validation passed (unexpected!)"'
echo ""

# Test 4: Test validation with invalid dataset ID
echo "Test 4: Test validation - invalid dataset_id"
echo '{"jsonrpc": "2.0","method": "tools/call","params": {"name": "import_audio_files","arguments": {"folder_path": "/tmp","dataset_id": "invalidXXXXXX","location_id": "invalidXXXXXX","cluster_id": "invalidXXXXXX"}},"id": 4}' | ../skraak_mcp "$DB_PATH" | jq -r '.error.message // "No error (unexpected!)"'
echo ""

echo "=== Test Complete ==="
echo ""
echo "Note: For full functional testing with actual WAV files:"
echo "1. Create a test dataset, location, and cluster in the database"
echo "2. Place WAV files in a test folder"
echo "3. Run import with valid IDs and folder path"
{"jsonrpc":"2.0","id":1,"result":{"capabilities":{"logging":{},"prompts":{"listChanged":true},"resources":{"listChanged":true},"tools":{"listChanged":true}},"protocolVersion":"2024-11-05","serverInfo":{"name":"skraak_mcp","version":"v1.0.0"}}}{"jsonrpc":"2.0","method":"notifications/tools/list_changed","params":{}}{"jsonrpc":"2.0","method":"notifications/resources/list_changed","params":{}}{"jsonrpc":"2.0","method":"notifications/prompts/list_changed","params":{}}{"jsonrpc":"2.0","id":2,"result":{"content":[{"type":"text","text":"{\"columns\":[{\"database_type\":\"VARCHAR\",\"name\":\"id\"},{\"database_type\":\"VARCHAR\",\"name\":\"name\"},{\"database_type\":\"ENUM\",\"name\":\"type\"}],\"limited\":false,\"query_executed\":\"SELECT id, name, type FROM dataset WHERE active = true ORDER BY name LIMIT 1000\",\"row_count\":8,\"rows\":[{\"id\":\"wAJk9wuZN15x\",\"name\":\"Bluemine - Kiwi\",\"type\":\"organise\"},{\"id\":\"QZ0tlUrX4Nyi\",\"name\":\"Friends of Cobb - Kiwi\",\"type\":\"organise\"},{\"id\":\"RxajkKXz-w48\",\"name\":\"Lisa Whittle\",\"type\":\"organise\"},{\"id\":\"vgIr9JSH_lFj\",\"name\":\"MOK call site 1\",\"type\":\"organise\"},{\"id\":\"la-JpAf2nLKG\",\"name\":\"Manu o Kahurangi - Kiwi\",\"type\":\"organise\"},{\"id\":\"Yx0oNUDmP5ch\",\"name\":\"Pomona - Kiwi\",\"type\":\"organise\"},{\"id\":\"jWS-sw5RvM-j\",\"name\":\"Pure Salt - Kiwi\",\"type\":\"organise\"},{\"id\":\"gljgxDbfasva\",\"name\":\"Twenty Four Seven\",\"type\":\"organise\"}]}"}],"structuredContent":{"columns":[{"database_type":"VARCHAR","name":"id"},{"database_type":"VARCHAR","name":"name"},{"database_type":"ENUM","name":"type"}],"limited":false,"query_executed":"SELECT id, name, type FROM dataset WHERE active = true ORDER BY name LIMIT 1000","row_count":8,"rows":[{"id":"wAJk9wuZN15x","name":"Bluemine - Kiwi","type":"organise"},{"id":"QZ0tlUrX4Nyi","name":"Friends of Cobb - Kiwi","type":"organise"},{"id":"RxajkKXz-w48","name":"Lisa Whittle","type":"organise"},{"id":"vgIr9JSH_lFj","name":"MOK call site 1","type":"organise"},{"id":"la-JpAf2nLKG","name":"Manu o Kahurangi - Kiwi","type":"organise"},{"id":"Yx0oNUDmP5ch","name":"Pomona - Kiwi","type":"organise"},{"id":"jWS-sw5RvM-j","name":"Pure Salt - Kiwi","type":"organise"},{"id":"gljgxDbfasva","name":"Twenty Four Seven","type":"organise"}]}}}{"jsonrpc":"2.0","id":3,"result":{"content":[{"type":"text","text":"{\"columns\":[{\"database_type\":\"VARCHAR\",\"name\":\"id\"},{\"database_type\":\"VARCHAR\",\"name\":\"name\"}],\"limited\":false,\"query_executed\":\"SELECT id, name FROM location WHERE active = true ORDER BY name LIMIT 5\",\"row_count\":5,\"rows\":[{\"id\":\"EwyxfYPFMflt\",\"name\":\"A01\"},{\"id\":\"w5zig0ALH6a5\",\"name\":\"A05\"},{\"id\":\"GouXwoyjeFiq\",\"name\":\"A11\"},{\"id\":\"OS6xbBytkk_I\",\"name\":\"AC21\"},{\"id\":\"tcE-bZ0tcmFB\",\"name\":\"AC34\"}]}"}],"structuredContent":{"columns":[{"database_type":"VARCHAR","name":"id"},{"database_type":"VARCHAR","name":"name"}],"limited":false,"query_executed":"SELECT id, name FROM location WHERE active = true ORDER BY name LIMIT 
5","row_count":5,"rows":[{"id":"EwyxfYPFMflt","name":"A01"},{"id":"w5zig0ALH6a5","name":"A05"},{"id":"GouXwoyjeFiq","name":"A11"},{"id":"OS6xbBytkk_I","name":"AC21"},{"id":"tcE-bZ0tcmFB","name":"AC34"}]}}}{"jsonrpc":"2.0","id":4,"result":{"content":[{"type":"text","text":"{\"columns\":[{\"database_type\":\"VARCHAR\",\"name\":\"id\"},{\"database_type\":\"VARCHAR\",\"name\":\"name\"},{\"database_type\":\"DECIMAL(10,7)\",\"name\":\"latitude\"},{\"database_type\":\"DECIMAL(10,7)\",\"name\":\"longitude\"}],\"limited\":false,\"query_executed\":\"SELECT id, name, latitude, longitude FROM location WHERE dataset_id = ? AND active = true LIMIT 1000\",\"row_count\":1,\"rows\":[{\"id\":\"0t9JyiuGID4w\",\"latitude\":\"-40.826344\",\"longitude\":\"172.585079\",\"name\":\"call site 1 1.2 test\"}]}"}],"structuredContent":{"columns":[{"database_type":"VARCHAR","name":"id"},{"database_type":"VARCHAR","name":"name"},{"database_type":"DECIMAL(10,7)","name":"latitude"},{"database_type":"DECIMAL(10,7)","name":"longitude"}],"limited":false,"query_executed":"SELECT id, name, latitude, longitude FROM location WHERE dataset_id = ? AND active = true LIMIT 1000","row_count":1,"rows":[{"id":"0t9JyiuGID4w","latitude":"-40.826344","longitude":"172.585079","name":"call site 1 1.2 test"}]}}}{"jsonrpc":"2.0","id":5,"result":{"content":[{"type":"text","text":"{\"columns\":[{\"database_type\":\"VARCHAR\",\"name\":\"dataset\"},{\"database_type\":\"BIGINT\",\"name\":\"location_count\"}],\"limited\":false,\"query_executed\":\"SELECT d.name as dataset, COUNT(l.id) as location_count FROM dataset d LEFT JOIN location l ON d.id = l.dataset_id WHERE d.active = true GROUP BY d.name ORDER BY d.name LIMIT 20\",\"row_count\":8,\"rows\":[{\"dataset\":\"Bluemine - Kiwi\",\"location_count\":11},{\"dataset\":\"Friends of Cobb - Kiwi\",\"location_count\":0},{\"dataset\":\"Lisa Whittle\",\"location_count\":15},{\"dataset\":\"MOK call site 1\",\"location_count\":1},{\"dataset\":\"Manu o Kahurangi - Kiwi\",\"location_count\":23},{\"dataset\":\"Pomona - Kiwi\",\"location_count\":48},{\"dataset\":\"Pure Salt - Kiwi\",\"location_count\":6},{\"dataset\":\"Twenty Four Seven\",\"location_count\":35}]}"}],"structuredContent":{"columns":[{"database_type":"VARCHAR","name":"dataset"},{"database_type":"BIGINT","name":"location_count"}],"limited":false,"query_executed":"SELECT d.name as dataset, COUNT(l.id) as location_count FROM dataset d LEFT JOIN location l ON d.id = l.dataset_id WHERE d.active = true GROUP BY d.name ORDER BY d.name LIMIT 20","row_count":8,"rows":[{"dataset":"Bluemine - Kiwi","location_count":11},{"dataset":"Friends of Cobb - Kiwi","location_count":0},{"dataset":"Lisa Whittle","location_count":15},{"dataset":"MOK call site 1","location_count":1},{"dataset":"Manu o Kahurangi - Kiwi","location_count":23},{"dataset":"Pomona - Kiwi","location_count":48},{"dataset":"Pure Salt - Kiwi","location_count":6},{"dataset":"Twenty Four Seven","location_count":35}]}}}{"jsonrpc":"2.0","id":6,"result":{"content":[{"type":"text","text":"{\"columns\":[{\"database_type\":\"ENUM\",\"name\":\"type\"},{\"database_type\":\"BIGINT\",\"name\":\"count\"}],\"limited\":false,\"query_executed\":\"SELECT type, COUNT(*) as count FROM dataset WHERE active = true GROUP BY type LIMIT 1000\",\"row_count\":1,\"rows\":[{\"count\":8,\"type\":\"organise\"}]}"}],"structuredContent":{"columns":[{"database_type":"ENUM","name":"type"},{"database_type":"BIGINT","name":"count"}],"limited":false,"query_executed":"SELECT type, COUNT(*) as count FROM dataset WHERE active = true GROUP 
BY type LIMIT 1000","row_count":1,"rows":[{"count":8,"type":"organise"}]}}}{"jsonrpc":"2.0","id":7,"result":{"content":[{"type":"text","text":"only SELECT and WITH queries are allowed"}],"isError":true}}{"jsonrpc":"2.0","id":8,"result":{"content":[{"type":"text","text":"query contains forbidden keywords (INSERT/UPDATE/DELETE/DROP/CREATE/ALTER)"}],"isError":true}}
#!/bin/bash
# Test script for MCP server resources and prompts

DB_PATH="${1:-../db/skraak.duckdb}"

if [ ! -f "$DB_PATH" ]; then
  echo "Error: Database not found at $DB_PATH" >&2
  echo "Usage: $0 [path-to-database]" >&2
  exit 1
fi

echo "=== Testing MCP Resources and Prompts ===" >&2
echo "Database: $DB_PATH" >&2
echo "" >&2

# Start the server and send test messages
{
  echo "=== 1. Initialize ===" >&2
  echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test-client","version":"1.0.0"}}}'
  sleep 0.2

  echo "" >&2
  echo "=== 2. List Resources ===" >&2
  echo '{"jsonrpc":"2.0","id":2,"method":"resources/list","params":{}}'
  sleep 0.2

  echo "" >&2
  echo "=== 3. List Resource Templates ===" >&2
  echo '{"jsonrpc":"2.0","id":3,"method":"resources/templates/list","params":{}}'
  sleep 0.2

  echo "" >&2
  echo "=== 4. Read Full Schema (first 50 lines only) ===" >&2
  echo '{"jsonrpc":"2.0","id":4,"method":"resources/read","params":{"uri":"schema://full"}}'
  sleep 0.2

  echo "" >&2
  echo "=== 5. Read Dataset Table Schema ===" >&2
  echo '{"jsonrpc":"2.0","id":5,"method":"resources/read","params":{"uri":"schema://table/dataset"}}'
  sleep 0.2

  echo "" >&2
  echo "=== 6. Read Location Table Schema ===" >&2
  echo '{"jsonrpc":"2.0","id":6,"method":"resources/read","params":{"uri":"schema://table/location"}}'
  sleep 0.2

  echo "" >&2
  echo "=== 7. Try Invalid Table (should error) ===" >&2
  echo '{"jsonrpc":"2.0","id":7,"method":"resources/read","params":{"uri":"schema://table/invalid_table"}}'
  sleep 0.2

  echo "" >&2
  echo "=== 8. List Prompts ===" >&2
  echo '{"jsonrpc":"2.0","id":8,"method":"prompts/list","params":{}}'
  sleep 0.2

  echo "" >&2
  echo "=== 9. Get query_active_datasets Prompt ===" >&2
  echo '{"jsonrpc":"2.0","id":9,"method":"prompts/get","params":{"name":"query_active_datasets"}}'
  sleep 0.2

  echo "" >&2
  echo "=== 10. Get explore_database_schema Prompt (focus: dataset) ===" >&2
  echo '{"jsonrpc":"2.0","id":10,"method":"prompts/get","params":{"name":"explore_database_schema","arguments":{"focus_area":"dataset"}}}'
  sleep 0.2

  echo "" >&2
  echo "=== 11. Get system_status_check Prompt ===" >&2
  echo '{"jsonrpc":"2.0","id":11,"method":"prompts/get","params":{"name":"system_status_check"}}'
  sleep 0.2
} | ../skraak_mcp "$DB_PATH" 2>/dev/null | jq '.'
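The same run can be reduced to just the registered resource URIs and prompt names. A sketch, assuming the script above is saved as `test_resources_prompts.sh`:

```bash
# Sketch: pull the resource URIs (id 2) and prompt names (id 8) out of the run.
./test_resources_prompts.sh ../db/skraak.duckdb \
  | jq -r '(select(.id == 2) | .result.resources[].uri), (select(.id == 8) | .result.prompts[].name)'
```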
#!/bin/bash
# Test suite for execute_sql tool
# Tests various SQL queries including safety validation

DB_PATH="${1:-../db/skraak.duckdb}"

{
  # Test 1: Initialize MCP connection
  echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test","version":"1.0"}}}'
  sleep 0.2

  # Test 2: Simple SELECT without LIMIT (should auto-append LIMIT 1000)
  echo '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"execute_sql","arguments":{"query":"SELECT id, name, type FROM dataset WHERE active = true ORDER BY name"}}}'
  sleep 0.2

  # Test 3: SELECT with explicit limit parameter (5 rows)
  echo '{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"execute_sql","arguments":{"query":"SELECT id, name FROM location WHERE active = true ORDER BY name","limit":5}}}'
  sleep 0.2

  # Test 4: Parameterized query with ? placeholder
  echo '{"jsonrpc":"2.0","id":4,"method":"tools/call","params":{"name":"execute_sql","arguments":{"query":"SELECT id, name, latitude, longitude FROM location WHERE dataset_id = ? AND active = true","parameters":["vgIr9JSH_lFj"]}}}'
  sleep 0.2

  # Test 5: Complex JOIN query across multiple tables
  echo '{"jsonrpc":"2.0","id":5,"method":"tools/call","params":{"name":"execute_sql","arguments":{"query":"SELECT d.name as dataset, COUNT(l.id) as location_count FROM dataset d LEFT JOIN location l ON d.id = l.dataset_id WHERE d.active = true GROUP BY d.name ORDER BY d.name","limit":20}}}'
  sleep 0.2

  # Test 6: Aggregate query with GROUP BY
  echo '{"jsonrpc":"2.0","id":6,"method":"tools/call","params":{"name":"execute_sql","arguments":{"query":"SELECT type, COUNT(*) as count FROM dataset WHERE active = true GROUP BY type"}}}'
  sleep 0.2

  # Test 7: INSERT attempt - should FAIL with validation error
  echo '{"jsonrpc":"2.0","id":7,"method":"tools/call","params":{"name":"execute_sql","arguments":{"query":"INSERT INTO dataset (id, name) VALUES (\"test\", \"test\")"}}}'
  sleep 0.2

  # Test 8: SQL injection attempt with forbidden keywords - should FAIL
  echo '{"jsonrpc":"2.0","id":8,"method":"tools/call","params":{"name":"execute_sql","arguments":{"query":"SELECT * FROM dataset; DROP TABLE dataset;"}}}'
  sleep 0.2
} | ../skraak_mcp "$DB_PATH" 2>/dev/null
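To confirm that only the INSERT and injection attempts (ids 7 and 8) were rejected, the output can be summarised with `jq`. A sketch, assuming the script above is saved as `test_execute_sql.sh`:

```bash
# Sketch: print the error text for each rejected call -- expect exactly two
# lines, for ids 7 and 8, matching the safety validation messages.
./test_execute_sql.sh ../db/skraak.duckdb \
  | jq -r 'select(.result.isError == true) | "rejected id \(.id): \(.result.content[0].text)"'
```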
{"jsonrpc":"2.0","id":1,"result":{"capabilities":{"logging":{},"prompts":{"listChanged":true},"resources":{"listChanged":true},"tools":{"listChanged":true}},"protocolVersion":"2024-11-05","serverInfo":{"name":"skraak_mcp","version":"v1.0.0"}}}{"jsonrpc":"2.0","method":"notifications/prompts/list_changed","params":{}}{"jsonrpc":"2.0","method":"notifications/tools/list_changed","params":{}}{"jsonrpc":"2.0","method":"notifications/resources/list_changed","params":{}}{"jsonrpc":"2.0","id":2,"result":{"content":[{"type":"text","text":"{\"columns\":[{\"database_type\":\"VARCHAR\",\"name\":\"id\"},{\"database_type\":\"VARCHAR\",\"name\":\"name\"},{\"database_type\":\"ENUM\",\"name\":\"type\"}],\"limited\":false,\"query_executed\":\"SELECT id, name, type FROM dataset WHERE active = true ORDER BY name LIMIT 1000\",\"row_count\":8,\"rows\":[{\"id\":\"wAJk9wuZN15x\",\"name\":\"Bluemine - Kiwi\",\"type\":\"organise\"},{\"id\":\"QZ0tlUrX4Nyi\",\"name\":\"Friends of Cobb - Kiwi\",\"type\":\"organise\"},{\"id\":\"RxajkKXz-w48\",\"name\":\"Lisa Whittle\",\"type\":\"organise\"},{\"id\":\"vgIr9JSH_lFj\",\"name\":\"MOK call site 1\",\"type\":\"organise\"},{\"id\":\"la-JpAf2nLKG\",\"name\":\"Manu o Kahurangi - Kiwi\",\"type\":\"organise\"},{\"id\":\"Yx0oNUDmP5ch\",\"name\":\"Pomona - Kiwi\",\"type\":\"organise\"},{\"id\":\"jWS-sw5RvM-j\",\"name\":\"Pure Salt - Kiwi\",\"type\":\"organise\"},{\"id\":\"gljgxDbfasva\",\"name\":\"Twenty Four Seven\",\"type\":\"organise\"}]}"}],"structuredContent":{"columns":[{"database_type":"VARCHAR","name":"id"},{"database_type":"VARCHAR","name":"name"},{"database_type":"ENUM","name":"type"}],"limited":false,"query_executed":"SELECT id, name, type FROM dataset WHERE active = true ORDER BY name LIMIT 1000","row_count":8,"rows":[{"id":"wAJk9wuZN15x","name":"Bluemine - Kiwi","type":"organise"},{"id":"QZ0tlUrX4Nyi","name":"Friends of Cobb - Kiwi","type":"organise"},{"id":"RxajkKXz-w48","name":"Lisa Whittle","type":"organise"},{"id":"vgIr9JSH_lFj","name":"MOK call site 1","type":"organise"},{"id":"la-JpAf2nLKG","name":"Manu o Kahurangi - Kiwi","type":"organise"},{"id":"Yx0oNUDmP5ch","name":"Pomona - Kiwi","type":"organise"},{"id":"jWS-sw5RvM-j","name":"Pure Salt - Kiwi","type":"organise"},{"id":"gljgxDbfasva","name":"Twenty Four Seven","type":"organise"}]}}}{"jsonrpc":"2.0","id":3,"result":{"content":[{"type":"text","text":"{\"columns\":[{\"database_type\":\"VARCHAR\",\"name\":\"id\"},{\"database_type\":\"VARCHAR\",\"name\":\"name\"}],\"limited\":false,\"query_executed\":\"SELECT id, name FROM location WHERE active = true ORDER BY name LIMIT 5\",\"row_count\":5,\"rows\":[{\"id\":\"EwyxfYPFMflt\",\"name\":\"A01\"},{\"id\":\"w5zig0ALH6a5\",\"name\":\"A05\"},{\"id\":\"GouXwoyjeFiq\",\"name\":\"A11\"},{\"id\":\"OS6xbBytkk_I\",\"name\":\"AC21\"},{\"id\":\"tcE-bZ0tcmFB\",\"name\":\"AC34\"}]}"}],"structuredContent":{"columns":[{"database_type":"VARCHAR","name":"id"},{"database_type":"VARCHAR","name":"name"}],"limited":false,"query_executed":"SELECT id, name FROM location WHERE active = true ORDER BY name LIMIT 
5","row_count":5,"rows":[{"id":"EwyxfYPFMflt","name":"A01"},{"id":"w5zig0ALH6a5","name":"A05"},{"id":"GouXwoyjeFiq","name":"A11"},{"id":"OS6xbBytkk_I","name":"AC21"},{"id":"tcE-bZ0tcmFB","name":"AC34"}]}}}{"jsonrpc":"2.0","id":4,"result":{"content":[{"type":"text","text":"{\"columns\":[{\"database_type\":\"VARCHAR\",\"name\":\"id\"},{\"database_type\":\"VARCHAR\",\"name\":\"name\"},{\"database_type\":\"DECIMAL(10,7)\",\"name\":\"latitude\"},{\"database_type\":\"DECIMAL(10,7)\",\"name\":\"longitude\"}],\"limited\":false,\"query_executed\":\"SELECT id, name, latitude, longitude FROM location WHERE dataset_id = ? AND active = true LIMIT 1000\",\"row_count\":1,\"rows\":[{\"id\":\"0t9JyiuGID4w\",\"latitude\":\"-40.826344\",\"longitude\":\"172.585079\",\"name\":\"call site 1 1.2 test\"}]}"}],"structuredContent":{"columns":[{"database_type":"VARCHAR","name":"id"},{"database_type":"VARCHAR","name":"name"},{"database_type":"DECIMAL(10,7)","name":"latitude"},{"database_type":"DECIMAL(10,7)","name":"longitude"}],"limited":false,"query_executed":"SELECT id, name, latitude, longitude FROM location WHERE dataset_id = ? AND active = true LIMIT 1000","row_count":1,"rows":[{"id":"0t9JyiuGID4w","latitude":"-40.826344","longitude":"172.585079","name":"call site 1 1.2 test"}]}}}{"jsonrpc":"2.0","id":5,"result":{"content":[{"type":"text","text":"{\"columns\":[{\"database_type\":\"VARCHAR\",\"name\":\"dataset\"},{\"database_type\":\"BIGINT\",\"name\":\"location_count\"}],\"limited\":false,\"query_executed\":\"SELECT d.name as dataset, COUNT(l.id) as location_count FROM dataset d LEFT JOIN location l ON d.id = l.dataset_id WHERE d.active = true GROUP BY d.name ORDER BY d.name LIMIT 20\",\"row_count\":8,\"rows\":[{\"dataset\":\"Bluemine - Kiwi\",\"location_count\":11},{\"dataset\":\"Friends of Cobb - Kiwi\",\"location_count\":0},{\"dataset\":\"Lisa Whittle\",\"location_count\":15},{\"dataset\":\"MOK call site 1\",\"location_count\":1},{\"dataset\":\"Manu o Kahurangi - Kiwi\",\"location_count\":23},{\"dataset\":\"Pomona - Kiwi\",\"location_count\":48},{\"dataset\":\"Pure Salt - Kiwi\",\"location_count\":6},{\"dataset\":\"Twenty Four Seven\",\"location_count\":35}]}"}],"structuredContent":{"columns":[{"database_type":"VARCHAR","name":"dataset"},{"database_type":"BIGINT","name":"location_count"}],"limited":false,"query_executed":"SELECT d.name as dataset, COUNT(l.id) as location_count FROM dataset d LEFT JOIN location l ON d.id = l.dataset_id WHERE d.active = true GROUP BY d.name ORDER BY d.name LIMIT 20","row_count":8,"rows":[{"dataset":"Bluemine - Kiwi","location_count":11},{"dataset":"Friends of Cobb - Kiwi","location_count":0},{"dataset":"Lisa Whittle","location_count":15},{"dataset":"MOK call site 1","location_count":1},{"dataset":"Manu o Kahurangi - Kiwi","location_count":23},{"dataset":"Pomona - Kiwi","location_count":48},{"dataset":"Pure Salt - Kiwi","location_count":6},{"dataset":"Twenty Four Seven","location_count":35}]}}}{"jsonrpc":"2.0","id":6,"result":{"content":[{"type":"text","text":"{\"columns\":[{\"database_type\":\"ENUM\",\"name\":\"type\"},{\"database_type\":\"BIGINT\",\"name\":\"count\"}],\"limited\":false,\"query_executed\":\"SELECT type, COUNT(*) as count FROM dataset WHERE active = true GROUP BY type LIMIT 1000\",\"row_count\":1,\"rows\":[{\"count\":8,\"type\":\"organise\"}]}"}],"structuredContent":{"columns":[{"database_type":"ENUM","name":"type"},{"database_type":"BIGINT","name":"count"}],"limited":false,"query_executed":"SELECT type, COUNT(*) as count FROM dataset WHERE active = true GROUP 
BY type LIMIT 1000","row_count":1,"rows":[{"count":8,"type":"organise"}]}}}{"jsonrpc":"2.0","id":7,"result":{"content":[{"type":"text","text":"only SELECT and WITH queries are allowed"}],"isError":true}}{"jsonrpc":"2.0","id":8,"result":{"content":[{"type":"text","text":"query contains forbidden keywords (INSERT/UPDATE/DELETE/DROP/CREATE/ALTER)"}],"isError":true}}
#!/bin/bash
# End-to-end test: Create complete hierarchy (pattern → dataset → location → cluster)
# Uses test.duckdb to preserve production data

DB_PATH="${1:-../db/test.duckdb}"
SERVER_PATH="../skraak_mcp"

echo "=== End-to-End Write Tools Test ===" >&2
echo "Database: $DB_PATH" >&2
echo "" >&2

{
  # Initialize
  echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test","version":"1.0"}}}'
  sleep 0.2

  # Step 1: Create recording pattern
  echo '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"create_cyclic_recording_pattern","arguments":{"record_seconds":120,"sleep_seconds":300}}}'
  sleep 0.2

  # Step 2: Create dataset
  echo '{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"create_dataset","arguments":{"name":"E2E Test Dataset","description":"End-to-end test","type":"test"}}}'
  sleep 0.2

  # Step 3: Create location (using dataset ID from step 2)
  # NOTE: You need to extract the dataset ID from step 2 and use it here
  echo '{"jsonrpc":"2.0","id":4,"method":"tools/call","params":{"name":"create_location","arguments":{"dataset_id":"REPLACE_WITH_DATASET_ID","name":"Test Location Wellington","latitude":-41.2865,"longitude":174.7762,"timezone_id":"Pacific/Auckland","description":"Test location"}}}'
  sleep 0.2

  # Step 4: Create cluster (using dataset ID and location ID)
  # NOTE: You need to extract both IDs and use them here
  echo '{"jsonrpc":"2.0","id":5,"method":"tools/call","params":{"name":"create_cluster","arguments":{"dataset_id":"REPLACE_WITH_DATASET_ID","location_id":"REPLACE_WITH_LOCATION_ID","name":"Test Cluster Alpha","sample_rate":48000,"cyclic_recording_pattern_id":"REPLACE_WITH_PATTERN_ID","description":"Test cluster"}}}'
  sleep 0.2
} | "$SERVER_PATH" "$DB_PATH" 2>/dev/null
#!/bin/bash
# Simple test for write tools - tests happy path only
# Uses test.duckdb to preserve production data

DB_PATH="${1:-../db/test.duckdb}"
SERVER_PATH="../skraak_mcp"

echo "=== Simple Write Tools Test (Happy Path) ===" >&2
echo "Database: $DB_PATH" >&2
echo "" >&2

{
  # Initialize
  echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test","version":"1.0"}}}'
  sleep 0.2

  # Test 1: Create pattern
  echo '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"create_cyclic_recording_pattern","arguments":{"record_seconds":30,"sleep_seconds":90}}}'
  sleep 0.2

  # Test 2: Create dataset
  echo '{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"create_dataset","arguments":{"name":"Test Dataset 2026-01-27","description":"Automated test dataset","type":"test"}}}'
  sleep 0.2

  # Test 3: Invalid dataset (empty name) - should fail
  echo '{"jsonrpc":"2.0","id":4,"method":"tools/call","params":{"name":"create_dataset","arguments":{"name":" ","type":"test"}}}'
  sleep 0.2

  # Test 4: Invalid pattern (negative value) - should fail
  echo '{"jsonrpc":"2.0","id":5,"method":"tools/call","params":{"name":"create_cyclic_recording_pattern","arguments":{"record_seconds":-10,"sleep_seconds":90}}}'
  sleep 0.2
} | "$SERVER_PATH" "$DB_PATH" 2>/dev/null
#!/bin/bash
# Test script for write tools (create_dataset, create_location, create_cluster, create_cyclic_recording_pattern)
# Tests both valid and invalid inputs
# USES TEST DATABASE BY DEFAULT to preserve production data integrity

DB_PATH="${1:-../db/test.duckdb}"
SERVER_PATH="../skraak_mcp"

echo "=== Testing Write Tools for Skraak MCP Server ==="
echo "Database: $DB_PATH"
echo ""

# Initialize connection
echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test","version":"1.0"}}}'

echo ""
echo "=== Test 1: Create Cyclic Recording Pattern (Valid) ==="
echo '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"create_cyclic_recording_pattern","arguments":{"record_seconds":30,"sleep_seconds":90}}}'

echo ""
echo "=== Test 2: Create Cyclic Recording Pattern (Invalid - negative values) ==="
echo '{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"create_cyclic_recording_pattern","arguments":{"record_seconds":-10,"sleep_seconds":90}}}'

echo ""
echo "=== Test 3: Create Dataset (Valid - organise type) ==="
echo '{"jsonrpc":"2.0","id":4,"method":"tools/call","params":{"name":"create_dataset","arguments":{"name":"Test Dataset 2026","description":"Created by automated test script","type":"organise"}}}'

echo ""
echo "=== Test 4: Create Dataset (Valid - test type, no description) ==="
echo '{"jsonrpc":"2.0","id":5,"method":"tools/call","params":{"name":"create_dataset","arguments":{"name":"Test Dataset ML","type":"test"}}}'

echo ""
echo "=== Test 5: Create Dataset (Invalid - empty name) ==="
echo '{"jsonrpc":"2.0","id":6,"method":"tools/call","params":{"name":"create_dataset","arguments":{"name":" ","type":"test"}}}'

echo ""
echo "=== Test 6: Create Dataset (Invalid - bad type) ==="
echo '{"jsonrpc":"2.0","id":7,"method":"tools/call","params":{"name":"create_dataset","arguments":{"name":"Bad Type Dataset","type":"invalid_type"}}}'

echo ""
echo "=== Test 7: Query recently created datasets to get IDs ==="
echo '{"jsonrpc":"2.0","id":8,"method":"tools/call","params":{"name":"execute_sql","arguments":{"query":"SELECT id, name, type FROM dataset WHERE name LIKE '\''Test Dataset%'\'' ORDER BY created_at DESC LIMIT 2"}}}'

echo ""
echo "=== Test 8: Create Location (Valid) ==="
echo "NOTE: Replace DATASET_ID_HERE with actual ID from Test 7 results"
echo '{"jsonrpc":"2.0","id":9,"method":"tools/call","params":{"name":"create_location","arguments":{"dataset_id":"DATASET_ID_HERE","name":"Test Location Auckland","latitude":-36.8485,"longitude":174.7633,"timezone_id":"Pacific/Auckland","description":"Test location in Auckland"}}}'

echo ""
echo "=== Test 9: Create Location (Invalid - bad coordinates) ==="
echo '{"jsonrpc":"2.0","id":10,"method":"tools/call","params":{"name":"create_location","arguments":{"dataset_id":"DATASET_ID_HERE","name":"Invalid Coords","latitude":999,"longitude":174.7633,"timezone_id":"Pacific/Auckland"}}}'

echo ""
echo "=== Test 10: Create Location (Invalid - bad timezone) ==="
echo '{"jsonrpc":"2.0","id":11,"method":"tools/call","params":{"name":"create_location","arguments":{"dataset_id":"DATASET_ID_HERE","name":"Bad Timezone","latitude":-36.8485,"longitude":174.7633,"timezone_id":"Invalid/Timezone"}}}'

echo ""
echo "=== Test 11: Create Location (Invalid - non-existent dataset) ==="
echo '{"jsonrpc":"2.0","id":12,"method":"tools/call","params":{"name":"create_location","arguments":{"dataset_id":"NONEXISTENT1","name":"Orphan Location","latitude":-36.8485,"longitude":174.7633,"timezone_id":"Pacific/Auckland"}}}'

echo ""
echo "=== Test 12: Query recently created locations to get IDs ==="
echo '{"jsonrpc":"2.0","id":13,"method":"tools/call","params":{"name":"execute_sql","arguments":{"query":"SELECT id, name, dataset_id FROM location WHERE name LIKE '\''Test Location%'\'' ORDER BY created_at DESC LIMIT 1"}}}'

echo ""
echo "=== Test 13: Create Cluster (Valid) ==="
echo "NOTE: Replace DATASET_ID_HERE and LOCATION_ID_HERE with actual IDs from previous results"
echo '{"jsonrpc":"2.0","id":14,"method":"tools/call","params":{"name":"create_cluster","arguments":{"dataset_id":"DATASET_ID_HERE","location_id":"LOCATION_ID_HERE","name":"Test Cluster Alpha","sample_rate":44100,"description":"Test cluster with 44.1kHz sample rate"}}}'

echo ""
echo "=== Test 14: Create Cluster (Invalid - sample rate zero) ==="
echo '{"jsonrpc":"2.0","id":15,"method":"tools/call","params":{"name":"create_cluster","arguments":{"dataset_id":"DATASET_ID_HERE","location_id":"LOCATION_ID_HERE","name":"Bad Sample Rate","sample_rate":0}}}'

echo ""
echo "=== Test 15: Create Cluster (Invalid - location/dataset mismatch) ==="
echo "NOTE: Uses wrong dataset_id for the location"
echo '{"jsonrpc":"2.0","id":16,"method":"tools/call","params":{"name":"create_cluster","arguments":{"dataset_id":"WRONG_DATASET","location_id":"LOCATION_ID_HERE","name":"Mismatched Cluster","sample_rate":48000}}}'

echo ""
echo "=== Test 16: Query recently created clusters ==="
echo '{"jsonrpc":"2.0","id":17,"method":"tools/call","params":{"name":"execute_sql","arguments":{"query":"SELECT c.id, c.name, c.sample_rate, l.name as location_name, d.name as dataset_name FROM cluster c JOIN location l ON c.location_id = l.id JOIN dataset d ON c.dataset_id = d.id WHERE c.name LIKE '\''Test Cluster%'\'' ORDER BY c.created_at DESC LIMIT 1"}}}'

echo ""
echo "=== End of Write Tools Tests ==="
echo ""
echo "MANUAL STEPS REQUIRED:"
echo "1. Run this script and capture output: ./test_write_tools.sh > test_write_output.txt 2>&1"
echo "2. Extract IDs from Test 7 results (dataset IDs)"
echo "3. Extract ID from Test 12 results (location ID)"
echo "4. Edit Tests 8-16 to replace DATASET_ID_HERE and LOCATION_ID_HERE with actual IDs"
echo "5. Run individual tests with correct IDs to verify write operations"
echo ""
echo "VERIFICATION COMMANDS:"
echo " rg '\"result\"' test_write_output.txt | wc -l # Count successful responses"
echo " rg 'error' test_write_output.txt # Check for errors"
echo " rg 'Successfully created' test_write_output.txt # Check success messages"
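Steps 2-4 of the manual instructions can be partly automated: create a dataset through the server and capture its ID in a shell variable, then substitute it into the later location and cluster calls. A sketch, reusing the response shape seen in the other test scripts (`.result.content[0].text` containing JSON with a `dataset.id` field); adjust the extraction if your build returns a different payload:

```bash
#!/bin/bash
# Sketch: create a dataset and capture its ID so the DATASET_ID_HERE
# placeholders above can be filled without manual copy/paste.
DB_PATH="${1:-../db/test.duckdb}"
DATASET_ID=$({
  echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test","version":"1.0"}}}'
  sleep 0.2
  echo '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"create_dataset","arguments":{"name":"Scripted Test Dataset","type":"test"}}}'
  sleep 0.2
} | ../skraak_mcp "$DB_PATH" 2>/dev/null | grep '"id":2' | jq -r '.result.content[0].text | fromjson | .dataset.id // empty')
echo "dataset id: ${DATASET_ID:-<not created>}"
```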
#!/usr/bin/env bash
# Test script for import_ml_selections tool
# Usage: ./test_import_selections.sh [db_path]
# Default: ../db/test.duckdb (ALWAYS USE TEST DATABASE!)

set -euo pipefail

# Database path (default to test database)
DB_PATH="${1:-../db/test.duckdb}"

echo "=== Testing import_ml_selections Tool ==="
echo "Database: $DB_PATH"
echo ""

# Initialize MCP connection
initialize_request='{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test-client","version":"1.0.0"}}}'

# Test 1: Parse folder name
echo "Test 1: Parse ML folder name"
folder_name="Clips_opensoundscape-kiwi-1.0_2025-11-14"
echo "Folder: $folder_name"
echo "Expected filter: opensoundscape-kiwi-1.0"
echo "Expected date: 2025-11-14"
echo ""

# Test 2: Check if test database has required entities
echo "Test 2: Check test database entities"
check_query='{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"execute_sql","arguments":{"query":"SELECT COUNT(*) as cnt FROM dataset WHERE active = true"}}}'
echo "$initialize_request" | ../skraak_mcp "$DB_PATH" 2>/dev/null | grep -q "initialized" && echo "✓ Server initialized"
echo ""

# Test 3: List available filters
echo "Test 3: List available filters"
filter_query='{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"execute_sql","arguments":{"query":"SELECT id, name, active FROM filter WHERE active = true LIMIT 5"}}}'
echo ""

# Test 4: List available species
echo "Test 4: List available species"
species_query='{"jsonrpc":"2.0","id":4,"method":"tools/call","params":{"name":"execute_sql","arguments":{"query":"SELECT id, label, active FROM species WHERE active = true LIMIT 10"}}}'
echo ""

# Test 5: List available call types for Brown Kiwi
echo "Test 5: List call types for Brown Kiwi"
calltype_query='{"jsonrpc":"2.0","id":5,"method":"tools/call","params":{"name":"execute_sql","arguments":{"query":"SELECT ct.id, ct.label, s.label as species FROM call_type ct JOIN species s ON ct.species_id = s.id WHERE s.label = ? AND ct.active = true","parameters":["Brown Kiwi"]}}}'
echo ""

# Test 6: Validation - missing folder
echo "Test 6: Validation - missing folder"
missing_folder_test='{"jsonrpc":"2.0","id":6,"method":"tools/call","params":{"name":"import_ml_selections","arguments":{"folder_path":"/nonexistent/Clips_test_2025-01-29","dataset_id":"test123","cluster_id":"cluster123"}}}'
echo "Expected: folder does not exist error"
echo ""

# Test 7: Validation - invalid folder name
echo "Test 7: Validation - invalid folder name"
# This would require a real folder, skipping for now
echo "Skipped (requires real folder)"
echo ""

echo "=== Manual Test Instructions ==="
echo ""
echo "To test with real data:"
echo "1. Ensure test database has:"
echo " - Active dataset"
echo " - Location linked to dataset"
echo " - Cluster linked to location"
echo " - Filter record matching folder name"
echo " - Species records matching folder structure"
echo " - Call type records (if using call type subfolders)"
echo " - File records in cluster matching selection base filenames"
echo ""
echo "2. Create test folder structure:"
echo " mkdir -p /tmp/test_ml/Clips_test-filter_2025-01-29/'Brown Kiwi'/'Male - Solo'"
echo " cd /tmp/test_ml/Clips_test-filter_2025-01-29/'Brown Kiwi'/'Male - Solo'"
echo " touch file1-10-20.wav file1-10-20.png"
echo ""
echo "3. Run import:"
echo " MCP call: import_ml_selections"
echo " Arguments: {folder_path: '/tmp/test_ml/Clips_test-filter_2025-01-29', dataset_id: '<id>', cluster_id: '<id>'}"
echo ""
echo "4. Verify results:"
echo " SELECT COUNT(*) FROM selection WHERE active = true;"
echo " SELECT COUNT(*) FROM label WHERE active = true;"
echo " SELECT COUNT(*) FROM label_subtype WHERE active = true;"
echo ""
echo "=== Unit Tests Passed ==="
echo "✓ ParseSelectionFilename - 12 tests"
echo "✓ ParseMLFolderName - 8 tests"
echo "✓ ValidateWAVPNGPairs - 5 tests"
echo "✓ ExtractDateTimePattern - 9 tests"
echo "Total: 34 unit tests passed"
echo ""
echo "=== Tool Registration ==="
echo "Tool 'import_ml_selections' is registered in main.go"
echo "Run server with: ./skraak_mcp <db_path>"
echo ""
echo "=== Integration Testing ==="
echo "For full integration test, you'll need:"
echo "1. Actual ML-generated folder structure from inference pipeline"
echo "2. Database with matching filter, species, call_types, and files"
echo "3. Run through MCP protocol with proper initialization"
echo ""
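The folder layout described in step 2 of the manual instructions can be created in one go. A sketch; the filter name, species/call-type folder names, and the base filename are placeholders and must match `filter`, `species`, `call_type`, and `file` records in your test database:

```bash
# Sketch: build the ML selections folder layout used by import_ml_selections.
BASE="/tmp/test_ml/Clips_test-filter_2025-01-29"
mkdir -p "$BASE/Brown Kiwi/Male - Solo"
touch "$BASE/Brown Kiwi/Male - Solo/file1-10-20.wav" \
      "$BASE/Brown Kiwi/Male - Solo/file1-10-20.png"
find "$BASE" -type f   # inspect the layout before calling import_ml_selections
```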
#!/bin/bash
# Test script for the three new update tools: update_dataset, update_location, update_pattern
# Usage: ./test_update_tools.sh [db_path]
# Default: ../db/test.duckdb

DB_PATH="${1:-../db/test.duckdb}"

if [ ! -f "$DB_PATH" ]; then
  echo "Error: Database not found at $DB_PATH"
  exit 1
fi

echo "Testing update tools with database: $DB_PATH"
echo "================================================"
echo ""

# Navigate to the parent directory where skraak_mcp binary is located
cd "$(dirname "$0")/.." || exit 1

if [ ! -f "./skraak_mcp" ]; then
  echo "Error: skraak_mcp binary not found. Run 'go build' first."
  exit 1
fi

# Function to send MCP request
send_request() {
  local method="$1"
  local params="$2"
  (
    echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test","version":"1.0"}}}'
    echo "{\"jsonrpc\":\"2.0\",\"id\":2,\"method\":\"$method\",\"params\":$params}"
  ) | ./skraak_mcp "$DB_PATH" 2>&1 | tail -1
}

echo "Step 1: Create test records"
echo "----------------------------"

# Create a test dataset
echo -n "Creating test dataset... "
DATASET_RESULT=$(send_request "tools/call" '{"name":"create_dataset","arguments":{"name":"Test Update Dataset","type":"test","description":"Dataset for testing update tool"}}')
DATASET_ID=$(echo "$DATASET_RESULT" | jq -r '.result.content[0].text | fromjson | .dataset.id')
if [ "$DATASET_ID" != "null" ] && [ -n "$DATASET_ID" ]; then
  echo "✓ Created dataset: $DATASET_ID"
else
  echo "✗ Failed to create dataset"
  echo "$DATASET_RESULT" | jq '.'
  exit 1
fi

# Create a test location
echo -n "Creating test location... "
LOCATION_RESULT=$(send_request "tools/call" '{"name":"create_location","arguments":{"dataset_id":"'"$DATASET_ID"'","name":"Test Location","latitude":-41.2865,"longitude":174.7762,"timezone_id":"Pacific/Auckland","description":"Wellington, NZ"}}')
LOCATION_ID=$(echo "$LOCATION_RESULT" | jq -r '.result.content[0].text | fromjson | .location.id')
if [ "$LOCATION_ID" != "null" ] && [ -n "$LOCATION_ID" ]; then
  echo "✓ Created location: $LOCATION_ID"
else
  echo "✗ Failed to create location"
  echo "$LOCATION_RESULT" | jq '.'
  exit 1
fi

# Create a test pattern
echo -n "Creating test pattern... "
PATTERN_RESULT=$(send_request "tools/call" '{"name":"create_cyclic_recording_pattern","arguments":{"record_s":60,"sleep_s":240}}')
PATTERN_ID=$(echo "$PATTERN_RESULT" | jq -r '.result.content[0].text | fromjson | .pattern.id')
if [ "$PATTERN_ID" != "null" ] && [ -n "$PATTERN_ID" ]; then
  echo "✓ Created pattern: $PATTERN_ID"
else
  echo "✗ Failed to create pattern"
  echo "$PATTERN_RESULT" | jq '.'
  exit 1
fi

echo ""
echo "Step 2: Test update_dataset"
echo "----------------------------"

# Test 1: Update dataset name
echo -n "Test 1: Update dataset name... "
UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_dataset","arguments":{"dataset_id":"'"$DATASET_ID"'","name":"Updated Dataset Name"}}')
SUCCESS=$(echo "$UPDATE_RESULT" | jq -r '.result.content[0].text | fromjson | .success')
if [ "$SUCCESS" = "true" ]; then
  echo "✓ Success"
else
  echo "✗ Failed"
  echo "$UPDATE_RESULT" | jq '.'
fi

# Test 2: Update dataset type
echo -n "Test 2: Update dataset type to 'train'... "
UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_dataset","arguments":{"dataset_id":"'"$DATASET_ID"'","type":"train"}}')
SUCCESS=$(echo "$UPDATE_RESULT" | jq -r '.result.content[0].text | fromjson | .success')
if [ "$SUCCESS" = "true" ]; then
  echo "✓ Success"
else
  echo "✗ Failed"
  echo "$UPDATE_RESULT" | jq '.'
fi

# Test 3: Update multiple fields
echo -n "Test 3: Update multiple fields... "
UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_dataset","arguments":{"dataset_id":"'"$DATASET_ID"'","name":"Multi-Field Update","description":"Updated description","type":"organise"}}')
SUCCESS=$(echo "$UPDATE_RESULT" | jq -r '.result.content[0].text | fromjson | .success')
if [ "$SUCCESS" = "true" ]; then
  echo "✓ Success"
else
  echo "✗ Failed"
  echo "$UPDATE_RESULT" | jq '.'
fi

# Test 4: Invalid dataset ID
echo -n "Test 4: Invalid dataset ID (should fail)... "
UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_dataset","arguments":{"dataset_id":"INVALID_ID","name":"Should Fail"}}')
ERROR=$(echo "$UPDATE_RESULT" | jq -r '.error.message // empty')
if [ -n "$ERROR" ]; then
  echo "✓ Correctly failed: $ERROR"
else
  echo "✗ Should have failed but succeeded"
fi

# Test 5: No fields provided
echo -n "Test 5: No fields provided (should fail)... "
UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_dataset","arguments":{"dataset_id":"'"$DATASET_ID"'"}}')
ERROR=$(echo "$UPDATE_RESULT" | jq -r '.error.message // empty')
if [ -n "$ERROR" ]; then
  echo "✓ Correctly failed: $ERROR"
else
  echo "✗ Should have failed but succeeded"
fi

echo ""
echo "Step 3: Test update_location"
echo "----------------------------"

# Test 1: Update location name
echo -n "Test 1: Update location name... "
UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_location","arguments":{"location_id":"'"$LOCATION_ID"'","name":"Updated Location Name"}}')
SUCCESS=$(echo "$UPDATE_RESULT" | jq -r '.result.content[0].text | fromjson | .success')
if [ "$SUCCESS" = "true" ]; then
  echo "✓ Success"
else
  echo "✗ Failed"
  echo "$UPDATE_RESULT" | jq '.'
fi

# Test 2: Update coordinates
echo -n "Test 2: Update coordinates... "
UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_location","arguments":{"location_id":"'"$LOCATION_ID"'","latitude":-36.8485,"longitude":174.7633}}')
SUCCESS=$(echo "$UPDATE_RESULT" | jq -r '.result.content[0].text | fromjson | .success')
if [ "$SUCCESS" = "true" ]; then
  echo "✓ Success"
else
  echo "✗ Failed"
  echo "$UPDATE_RESULT" | jq '.'
fi

# Test 3: Update timezone
echo -n "Test 3: Update timezone... "
UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_location","arguments":{"location_id":"'"$LOCATION_ID"'","timezone_id":"Pacific/Fiji"}}')
SUCCESS=$(echo "$UPDATE_RESULT" | jq -r '.result.content[0].text | fromjson | .success')
if [ "$SUCCESS" = "true" ]; then
  echo "✓ Success"
else
  echo "✗ Failed"
  echo "$UPDATE_RESULT" | jq '.'
fi

# Test 4: Invalid latitude (should fail)
echo -n "Test 4: Invalid latitude (should fail)... "
UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_location","arguments":{"location_id":"'"$LOCATION_ID"'","latitude":91.0}}')
ERROR=$(echo "$UPDATE_RESULT" | jq -r '.error.message // empty')
if [ -n "$ERROR" ]; then
  echo "✓ Correctly failed: $ERROR"
else
  echo "✗ Should have failed but succeeded"
fi

# Test 5: Invalid longitude (should fail)
echo -n "Test 5: Invalid longitude (should fail)... "
UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_location","arguments":{"location_id":"'"$LOCATION_ID"'","longitude":181.0}}')
ERROR=$(echo "$UPDATE_RESULT" | jq -r '.error.message // empty')
if [ -n "$ERROR" ]; then
  echo "✓ Correctly failed: $ERROR"
else
  echo "✗ Should have failed but succeeded"
fi

echo ""
echo "Step 4: Test update_pattern"
echo "----------------------------"

# Test 1: Update record_s
echo -n "Test 1: Update record_s... "
UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_pattern","arguments":{"pattern_id":"'"$PATTERN_ID"'","record_s":120}}')
SUCCESS=$(echo "$UPDATE_RESULT" | jq -r '.result.content[0].text | fromjson | .success')
if [ "$SUCCESS" = "true" ]; then
  echo "✓ Success"
else
  echo "✗ Failed"
  echo "$UPDATE_RESULT" | jq '.'
fi

# Test 2: Update sleep_s
echo -n "Test 2: Update sleep_s... "
UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_pattern","arguments":{"pattern_id":"'"$PATTERN_ID"'","sleep_s":300}}')
SUCCESS=$(echo "$UPDATE_RESULT" | jq -r '.result.content[0].text | fromjson | .success')
if [ "$SUCCESS" = "true" ]; then
  echo "✓ Success"
else
  echo "✗ Failed"
  echo "$UPDATE_RESULT" | jq '.'
fi

# Test 3: Update both fields
echo -n "Test 3: Update both fields... "
UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_pattern","arguments":{"pattern_id":"'"$PATTERN_ID"'","record_s":180,"sleep_s":360}}')
SUCCESS=$(echo "$UPDATE_RESULT" | jq -r '.result.content[0].text | fromjson | .success')
if [ "$SUCCESS" = "true" ]; then
  echo "✓ Success"
else
  echo "✗ Failed"
  echo "$UPDATE_RESULT" | jq '.'
fi

# Test 4: Invalid record_s (should fail)
echo -n "Test 4: Invalid record_s <= 0 (should fail)... "
UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_pattern","arguments":{"pattern_id":"'"$PATTERN_ID"'","record_s":0}}')
ERROR=$(echo "$UPDATE_RESULT" | jq -r '.error.message // empty')
if [ -n "$ERROR" ]; then
  echo "✓ Correctly failed: $ERROR"
else
  echo "✗ Should have failed but succeeded"
fi

# Test 5: Invalid sleep_s (should fail)
echo -n "Test 5: Invalid sleep_s < 0 (should fail)... "
UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_pattern","arguments":{"pattern_id":"'"$PATTERN_ID"'","sleep_s":-1}}')
ERROR=$(echo "$UPDATE_RESULT" | jq -r '.error.message // empty')
if [ -n "$ERROR" ]; then
  echo "✓ Correctly failed: $ERROR"
else
  echo "✗ Should have failed but succeeded"
fi

echo ""
echo "Step 5: Verify final state with SQL"
echo "------------------------------------"

# Verify dataset
echo "Final dataset state:"
QUERY_RESULT=$(send_request "tools/call" '{"name":"execute_sql","arguments":{"query":"SELECT id, name, description, type FROM dataset WHERE id = ?","parameters":["'"$DATASET_ID"'"]}}')
echo "$QUERY_RESULT" | jq -r '.result.content[0].text | fromjson | .result.rows[0] | " ID: \(.id)\n Name: \(.name)\n Description: \(.description)\n Type: \(.type)"'

echo ""
echo "Final location state:"
QUERY_RESULT=$(send_request "tools/call" '{"name":"execute_sql","arguments":{"query":"SELECT id, name, latitude, longitude, timezone_id FROM location WHERE id = ?","parameters":["'"$LOCATION_ID"'"]}}')
echo "$QUERY_RESULT" | jq -r '.result.content[0].text | fromjson | .result.rows[0] | " ID: \(.id)\n Name: \(.name)\n Latitude: \(.latitude)\n Longitude: \(.longitude)\n Timezone: \(.timezone_id)"'

echo ""
echo "Final pattern state:"
QUERY_RESULT=$(send_request "tools/call" '{"name":"execute_sql","arguments":{"query":"SELECT id, record_s, sleep_s FROM cyclic_recording_pattern WHERE id = ?","parameters":["'"$PATTERN_ID"'"]}}')
echo "$QUERY_RESULT" | jq -r '.result.content[0].text | fromjson | .result.rows[0] | " ID: \(.id)\n Record: \(.record_s)s\n Sleep: \(.sleep_s)s"'

echo ""
echo "================================================"
echo "All tests completed!"
echo ""
#!/bin/bash
# Test suite for import_file tool
# Usage: ./test_import_file.sh [db_path]
# Always use test.duckdb for testing!

DB_PATH="${1:-../db/skraak.duckdb}"

# Get test IDs from database
DATASET_ID=$(echo "SELECT id FROM dataset WHERE active = true LIMIT 1" | duckdb "$DB_PATH" -json | jq -r '.[0].id')
LOCATION_ID=$(echo "SELECT id FROM location WHERE dataset_id = '$DATASET_ID' AND active = true LIMIT 1" | duckdb "$DB_PATH" -json | jq -r '.[0].id')
CLUSTER_ID=$(echo "SELECT id FROM cluster WHERE location_id = '$LOCATION_ID' AND active = true LIMIT 1" | duckdb "$DB_PATH" -json | jq -r '.[0].id')

# Get a real WAV file path
CLUSTER_PATH=$(echo "SELECT path FROM cluster WHERE id = '$CLUSTER_ID'" | duckdb "$DB_PATH" -json | jq -r '.[0].path')
TEST_FILE=$(echo "SELECT file_name FROM file WHERE cluster_id = '$CLUSTER_ID' AND active = true LIMIT 1" | duckdb "$DB_PATH" -json | jq -r '.[0].file_name')

# Construct full path (may or may not exist)
if [ -n "$CLUSTER_PATH" ] && [ "$CLUSTER_PATH" != "null" ] && [ -n "$TEST_FILE" ] && [ "$TEST_FILE" != "null" ]; then
  FULL_PATH="$CLUSTER_PATH/$TEST_FILE"
else
  FULL_PATH="/nonexistent/test.wav"
fi

{
  # Initialize MCP connection
  echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test","version":"1.0"}}}'
  sleep 0.2

  # Test 1: Import non-existent file (should error)
  echo "{\"jsonrpc\":\"2.0\",\"id\":2,\"method\":\"tools/call\",\"params\":{\"name\":\"import_file\",\"arguments\":{\"file_path\":\"/nonexistent/path/to/file.wav\",\"dataset_id\":\"$DATASET_ID\",\"location_id\":\"$LOCATION_ID\",\"cluster_id\":\"$CLUSTER_ID\"}}}"
  sleep 0.2

  # Test 2: Import non-WAV file (should error)
  echo "{\"jsonrpc\":\"2.0\",\"id\":3,\"method\":\"tools/call\",\"params\":{\"name\":\"import_file\",\"arguments\":{\"file_path\":\"/etc/passwd\",\"dataset_id\":\"$DATASET_ID\",\"location_id\":\"$LOCATION_ID\",\"cluster_id\":\"$CLUSTER_ID\"}}}"
  sleep 0.2

  # Test 3: Import with invalid dataset_id (should error)
  echo "{\"jsonrpc\":\"2.0\",\"id\":4,\"method\":\"tools/call\",\"params\":{\"name\":\"import_file\",\"arguments\":{\"file_path\":\"/tmp/test.wav\",\"dataset_id\":\"invalid_id123\",\"location_id\":\"$LOCATION_ID\",\"cluster_id\":\"$CLUSTER_ID\"}}}"
  sleep 0.2

  # Test 4: Import with invalid cluster_id (should error)
  echo "{\"jsonrpc\":\"2.0\",\"id\":5,\"method\":\"tools/call\",\"params\":{\"name\":\"import_file\",\"arguments\":{\"file_path\":\"/tmp/test.wav\",\"dataset_id\":\"$DATASET_ID\",\"location_id\":\"$LOCATION_ID\",\"cluster_id\":\"invalid_id123\"}}}"
  sleep 0.2

  # Test 5: Import real file (if it exists)
  if [ -f "$FULL_PATH" ]; then
    # Escape path for JSON
    ESCAPED_PATH=$(echo "$FULL_PATH" | sed 's/\\/\\\\/g' | sed 's/"/\\"/g')
    echo "{\"jsonrpc\":\"2.0\",\"id\":6,\"method\":\"tools/call\",\"params\":{\"name\":\"import_file\",\"arguments\":{\"file_path\":\"$ESCAPED_PATH\",\"dataset_id\":\"$DATASET_ID\",\"location_id\":\"$LOCATION_ID\",\"cluster_id\":\"$CLUSTER_ID\"}}}"
    sleep 0.2

    # Test 6: Import same file again (should be duplicate)
    echo "{\"jsonrpc\":\"2.0\",\"id\":7,\"method\":\"tools/call\",\"params\":{\"name\":\"import_file\",\"arguments\":{\"file_path\":\"$ESCAPED_PATH\",\"dataset_id\":\"$DATASET_ID\",\"location_id\":\"$LOCATION_ID\",\"cluster_id\":\"$CLUSTER_ID\"}}}"
    sleep 0.2
  fi

  # Test 7: Query files to verify
  echo "{\"jsonrpc\":\"2.0\",\"id\":8,\"method\":\"tools/call\",\"params\":{\"name\":\"execute_sql\",\"arguments\":{\"query\":\"SELECT file_name, xxh64_hash, duration, sample_rate, maybe_solar_night FROM file WHERE cluster_id = ? AND active = true ORDER BY created_at DESC LIMIT 3\",\"parameters\":[\"$CLUSTER_ID\"],\"limit\":3}}}"
  sleep 0.2
} | ../skraak_mcp "$DB_PATH" 2>/dev/null
{"jsonrpc":"2.0","id":1,"result":{"capabilities":{"logging":{},"prompts":{"listChanged":true},"resources":{"listChanged":true},"tools":{"listChanged":true}},"protocolVersion":"2024-11-05","serverInfo":{"name":"skraak_mcp","version":"v1.0.0"}}}{"jsonrpc":"2.0","method":"notifications/tools/list_changed","params":{}}{"jsonrpc":"2.0","method":"notifications/prompts/list_changed","params":{}}{"jsonrpc":"2.0","method":"notifications/resources/list_changed","params":{}}{"jsonrpc":"2.0","id":2,"result":{"content":[{"type":"text","text":"file validation failed: file does not exist: /nonexistent/path/to/file.wav"}],"isError":true}}{"jsonrpc":"2.0","id":3,"result":{"content":[{"type":"text","text":"file validation failed: file must be a WAV file (got extension: )"}],"isError":true}}{"jsonrpc":"2.0","id":4,"result":{"content":[{"type":"text","text":"file validation failed: file does not exist: /tmp/test.wav"}],"isError":true}}{"jsonrpc":"2.0","id":5,"result":{"content":[{"type":"text","text":"file validation failed: file does not exist: /tmp/test.wav"}],"isError":true}}{"jsonrpc":"2.0","id":8,"result":{"content":[{"type":"text","text":"{\"columns\":[{\"database_type\":\"VARCHAR\",\"name\":\"file_name\"},{\"database_type\":\"VARCHAR\",\"name\":\"xxh64_hash\"},{\"database_type\":\"DECIMAL(7,3)\",\"name\":\"duration\"},{\"database_type\":\"INTEGER\",\"name\":\"sample_rate\"},{\"database_type\":\"BOOLEAN\",\"name\":\"maybe_solar_night\"}],\"limited\":false,\"query_executed\":\"SELECT file_name, xxh64_hash, duration, sample_rate, maybe_solar_night FROM file WHERE cluster_id = ? AND active = true ORDER BY created_at DESC LIMIT 3\",\"row_count\":3,\"rows\":[{\"duration\":\"60\",\"file_name\":\"20231204_123000.WAV\",\"maybe_solar_night\":false,\"sample_rate\":\"250000\",\"xxh64_hash\":\"f51d08eb40779d25\"},{\"duration\":\"60\",\"file_name\":\"20231031_090000.WAV\",\"maybe_solar_night\":false,\"sample_rate\":\"250000\",\"xxh64_hash\":\"58e100c14b67c0f3\"},{\"duration\":\"59\",\"file_name\":\"20231102_110001.WAV\",\"maybe_solar_night\":false,\"sample_rate\":\"250000\",\"xxh64_hash\":\"d05da810db87d31b\"}]}"}],"structuredContent":{"columns":[{"database_type":"VARCHAR","name":"file_name"},{"database_type":"VARCHAR","name":"xxh64_hash"},{"database_type":"DECIMAL(7,3)","name":"duration"},{"database_type":"INTEGER","name":"sample_rate"},{"database_type":"BOOLEAN","name":"maybe_solar_night"}],"limited":false,"query_executed":"SELECT file_name, xxh64_hash, duration, sample_rate, maybe_solar_night FROM file WHERE cluster_id = ? AND active = true ORDER BY created_at DESC LIMIT 3","row_count":3,"rows":[{"duration":"60","file_name":"20231204_123000.WAV","maybe_solar_night":false,"sample_rate":"250000","xxh64_hash":"f51d08eb40779d25"},{"duration":"60","file_name":"20231031_090000.WAV","maybe_solar_night":false,"sample_rate":"250000","xxh64_hash":"58e100c14b67c0f3"},{"duration":"59","file_name":"20231102_110001.WAV","maybe_solar_night":false,"sample_rate":"250000","xxh64_hash":"d05da810db87d31b"}]}}}