ZVYOPUNH7UJL3YALGNNXQW2B4H4ONI5Z6XWAUZUONFG7LR55W4SQC 65G4H2V6262GLHTPQQ5H4NIPDJB7HRPBRVNAD2EL26N75YUM5PWQC 2GJMZ6YA6OPHNS5KFFFI6POQ2BJ33SSS3NIPXYBFTJSN4BZBVEVAC IFVRAERTCCDICNTYTG3TX2WASB6RXQQEJWWXQMQZJSQDQ3HLE5OQC VZGXBNYYO3E7EPFQ4GOLNVMRXXTQDDQZUU2BZ6JHNBDY4B2QLDAAC L4STQEXDGCPZXDHTEUBCOQKBMTFDRVXRLNFQHPDHOVXDCJO33LQQC OGLLBQQYE5KICDMI6EX7ZI4TZT5RB7UFHH7O2DUOZ44QQXVL5YAAC #!/bin/bash# Test script for write tools (create_dataset, create_location, create_cluster, create_cyclic_recording_pattern)# Tests both valid and invalid inputs# USES TEST DATABASE BY DEFAULT to preserve production data integrityDB_PATH="${1:-../db/test.duckdb}"SERVER_PATH="../skraak_mcp"echo "=== Testing Write Tools for Skraak MCP Server ==="echo "Database: $DB_PATH"echo ""# Initialize connectionecho '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test","version":"1.0"}}}'echo ""echo "=== Test 1: Create Cyclic Recording Pattern (Valid) ==="echo '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"create_cyclic_recording_pattern","arguments":{"record_seconds":30,"sleep_seconds":90}}}'echo ""echo "=== Test 2: Create Cyclic Recording Pattern (Invalid - negative values) ==="echo '{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"create_cyclic_recording_pattern","arguments":{"record_seconds":-10,"sleep_seconds":90}}}'echo ""echo "=== Test 3: Create Dataset (Valid - organise type) ==="echo '{"jsonrpc":"2.0","id":4,"method":"tools/call","params":{"name":"create_dataset","arguments":{"name":"Test Dataset 2026","description":"Created by automated test script","type":"organise"}}}'echo ""echo "=== Test 4: Create Dataset (Valid - test type, no description) ==="echo '{"jsonrpc":"2.0","id":5,"method":"tools/call","params":{"name":"create_dataset","arguments":{"name":"Test Dataset ML","type":"test"}}}'echo ""echo "=== Test 5: Create Dataset (Invalid - empty name) ==="echo '{"jsonrpc":"2.0","id":6,"method":"tools/call","params":{"name":"create_dataset","arguments":{"name":" ","type":"test"}}}'echo ""echo "=== Test 6: Create Dataset (Invalid - bad type) ==="echo '{"jsonrpc":"2.0","id":7,"method":"tools/call","params":{"name":"create_dataset","arguments":{"name":"Bad Type Dataset","type":"invalid_type"}}}'echo ""echo "=== Test 7: Query recently created datasets to get IDs ==="echo '{"jsonrpc":"2.0","id":8,"method":"tools/call","params":{"name":"execute_sql","arguments":{"query":"SELECT id, name, type FROM dataset WHERE name LIKE '\''Test Dataset%'\'' ORDER BY created_at DESC LIMIT 2"}}}'echo ""echo "=== Test 8: Create Location (Valid) ==="echo "NOTE: Replace DATASET_ID_HERE with actual ID from Test 7 results"echo '{"jsonrpc":"2.0","id":9,"method":"tools/call","params":{"name":"create_location","arguments":{"dataset_id":"DATASET_ID_HERE","name":"Test Location Auckland","latitude":-36.8485,"longitude":174.7633,"timezone_id":"Pacific/Auckland","description":"Test location in Auckland"}}}'echo ""echo "=== Test 9: Create Location (Invalid - bad coordinates) ==="echo '{"jsonrpc":"2.0","id":10,"method":"tools/call","params":{"name":"create_location","arguments":{"dataset_id":"DATASET_ID_HERE","name":"Invalid Coords","latitude":999,"longitude":174.7633,"timezone_id":"Pacific/Auckland"}}}'echo ""echo "=== Test 10: Create Location (Invalid - bad timezone) ==="echo '{"jsonrpc":"2.0","id":11,"method":"tools/call","params":{"name":"create_location","arguments":{"dataset_id":"DATASET_ID_HERE","name":"Bad Timezone","latitude":-36.8485,"longitude":174.7633,"timezone_id":"Invalid/Timezone"}}}'echo ""echo "=== Test 11: Create Location (Invalid - non-existent dataset) ==="echo '{"jsonrpc":"2.0","id":12,"method":"tools/call","params":{"name":"create_location","arguments":{"dataset_id":"NONEXISTENT1","name":"Orphan Location","latitude":-36.8485,"longitude":174.7633,"timezone_id":"Pacific/Auckland"}}}'echo ""echo "=== Test 12: Query recently created locations to get IDs ==="echo '{"jsonrpc":"2.0","id":13,"method":"tools/call","params":{"name":"execute_sql","arguments":{"query":"SELECT id, name, dataset_id FROM location WHERE name LIKE '\''Test Location%'\'' ORDER BY created_at DESC LIMIT 1"}}}'echo ""echo "=== Test 13: Create Cluster (Valid) ==="echo "NOTE: Replace DATASET_ID_HERE and LOCATION_ID_HERE with actual IDs from previous results"echo '{"jsonrpc":"2.0","id":14,"method":"tools/call","params":{"name":"create_cluster","arguments":{"dataset_id":"DATASET_ID_HERE","location_id":"LOCATION_ID_HERE","name":"Test Cluster Alpha","sample_rate":44100,"description":"Test cluster with 44.1kHz sample rate"}}}'echo ""echo "=== Test 14: Create Cluster (Invalid - sample rate zero) ==="echo '{"jsonrpc":"2.0","id":15,"method":"tools/call","params":{"name":"create_cluster","arguments":{"dataset_id":"DATASET_ID_HERE","location_id":"LOCATION_ID_HERE","name":"Bad Sample Rate","sample_rate":0}}}'echo ""echo "=== Test 15: Create Cluster (Invalid - location/dataset mismatch) ==="echo "NOTE: Uses wrong dataset_id for the location"echo '{"jsonrpc":"2.0","id":16,"method":"tools/call","params":{"name":"create_cluster","arguments":{"dataset_id":"WRONG_DATASET","location_id":"LOCATION_ID_HERE","name":"Mismatched Cluster","sample_rate":48000}}}'echo ""echo "=== Test 16: Query recently created clusters ==="echo '{"jsonrpc":"2.0","id":17,"method":"tools/call","params":{"name":"execute_sql","arguments":{"query":"SELECT c.id, c.name, c.sample_rate, l.name as location_name, d.name as dataset_name FROM cluster c JOIN location l ON c.location_id = l.id JOIN dataset d ON c.dataset_id = d.id WHERE c.name LIKE '\''Test Cluster%'\'' ORDER BY c.created_at DESC LIMIT 1"}}}'echo ""echo "=== End of Write Tools Tests ==="echo ""echo "MANUAL STEPS REQUIRED:"echo "1. Run this script and capture output: ./test_write_tools.sh > test_write_output.txt 2>&1"echo "2. Extract IDs from Test 7 results (dataset IDs)"echo "3. Extract ID from Test 12 results (location ID)"echo "4. Edit Tests 8-16 to replace DATASET_ID_HERE and LOCATION_ID_HERE with actual IDs"echo "5. Run individual tests with correct IDs to verify write operations"echo ""echo "VERIFICATION COMMANDS:"echo " rg '\"result\"' test_write_output.txt | wc -l # Count successful responses"echo " rg 'error' test_write_output.txt # Check for errors"echo " rg 'Successfully created' test_write_output.txt # Check success messages"
#!/bin/bash# Simple test for write tools - tests happy path only# Uses test.duckdb to preserve production dataDB_PATH="${1:-../db/test.duckdb}"SERVER_PATH="../skraak_mcp"echo "=== Simple Write Tools Test (Happy Path) ===" >&2echo "Database: $DB_PATH" >&2echo "" >&2{# Initializeecho '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test","version":"1.0"}}}'sleep 0.2# Test 1: Create patternecho '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"create_cyclic_recording_pattern","arguments":{"record_seconds":30,"sleep_seconds":90}}}'sleep 0.2# Test 2: Create datasetecho '{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"create_dataset","arguments":{"name":"Test Dataset 2026-01-27","description":"Automated test dataset","type":"test"}}}'sleep 0.2# Test 3: Invalid dataset (empty name) - should failecho '{"jsonrpc":"2.0","id":4,"method":"tools/call","params":{"name":"create_dataset","arguments":{"name":" ","type":"test"}}}'sleep 0.2# Test 4: Invalid pattern (negative value) - should failecho '{"jsonrpc":"2.0","id":5,"method":"tools/call","params":{"name":"create_cyclic_recording_pattern","arguments":{"record_seconds":-10,"sleep_seconds":90}}}'sleep 0.2} | "$SERVER_PATH" "$DB_PATH" 2>/dev/null
#!/bin/bash# End-to-end test: Create complete hierarchy (pattern → dataset → location → cluster)# Uses test.duckdb to preserve production dataDB_PATH="${1:-../db/test.duckdb}"SERVER_PATH="../skraak_mcp"echo "=== End-to-End Write Tools Test ===" >&2echo "Database: $DB_PATH" >&2echo "" >&2{# Initializeecho '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test","version":"1.0"}}}'sleep 0.2# Step 1: Create recording patternecho '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"create_cyclic_recording_pattern","arguments":{"record_seconds":120,"sleep_seconds":300}}}'sleep 0.2# Step 2: Create datasetecho '{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"create_dataset","arguments":{"name":"E2E Test Dataset","description":"End-to-end test","type":"test"}}}'sleep 0.2# Step 3: Create location (using dataset ID from step 2)# NOTE: You need to extract the dataset ID from step 2 and use it hereecho '{"jsonrpc":"2.0","id":4,"method":"tools/call","params":{"name":"create_location","arguments":{"dataset_id":"REPLACE_WITH_DATASET_ID","name":"Test Location Wellington","latitude":-41.2865,"longitude":174.7762,"timezone_id":"Pacific/Auckland","description":"Test location"}}}'sleep 0.2# Step 4: Create cluster (using dataset ID and location ID)# NOTE: You need to extract both IDs and use them hereecho '{"jsonrpc":"2.0","id":5,"method":"tools/call","params":{"name":"create_cluster","arguments":{"dataset_id":"REPLACE_WITH_DATASET_ID","location_id":"REPLACE_WITH_LOCATION_ID","name":"Test Cluster Alpha","sample_rate":48000,"cyclic_recording_pattern_id":"REPLACE_WITH_PATTERN_ID","description":"Test cluster"}}}'sleep 0.2} | "$SERVER_PATH" "$DB_PATH" 2>/dev/null
#!/bin/bash# Test script for the three new update tools: update_dataset, update_location, update_pattern# Usage: ./test_update_tools.sh [db_path]# Default: ../db/test.duckdbDB_PATH="${1:-../db/test.duckdb}"if [ ! -f "$DB_PATH" ]; thenecho "Error: Database not found at $DB_PATH"exit 1fiecho "Testing update tools with database: $DB_PATH"echo "================================================"echo ""# Navigate to the parent directory where skraak_mcp binary is locatedcd "$(dirname "$0")/.." || exit 1if [ ! -f "./skraak_mcp" ]; thenecho "Error: skraak_mcp binary not found. Run 'go build' first."exit 1fi# Function to send MCP requestsend_request() {local method="$1"local params="$2"(echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test","version":"1.0"}}}'echo "{\"jsonrpc\":\"2.0\",\"id\":2,\"method\":\"$method\",\"params\":$params}") | ./skraak_mcp "$DB_PATH" 2>&1 | tail -1}echo "Step 1: Create test records"echo "----------------------------"# Create a test datasetecho -n "Creating test dataset... "DATASET_RESULT=$(send_request "tools/call" '{"name":"create_dataset","arguments":{"name":"Test Update Dataset","type":"test","description":"Dataset for testing update tool"}}')DATASET_ID=$(echo "$DATASET_RESULT" | jq -r '.result.content[0].text | fromjson | .dataset.id')if [ "$DATASET_ID" != "null" ] && [ -n "$DATASET_ID" ]; thenecho "✓ Created dataset: $DATASET_ID"elseecho "✗ Failed to create dataset"echo "$DATASET_RESULT" | jq '.'exit 1fi# Create a test locationecho -n "Creating test location... "LOCATION_RESULT=$(send_request "tools/call" '{"name":"create_location","arguments":{"dataset_id":"'"$DATASET_ID"'","name":"Test Location","latitude":-41.2865,"longitude":174.7762,"timezone_id":"Pacific/Auckland","description":"Wellington, NZ"}}')LOCATION_ID=$(echo "$LOCATION_RESULT" | jq -r '.result.content[0].text | fromjson | .location.id')if [ "$LOCATION_ID" != "null" ] && [ -n "$LOCATION_ID" ]; thenecho "✓ Created location: $LOCATION_ID"elseecho "✗ Failed to create location"echo "$LOCATION_RESULT" | jq '.'exit 1fi# Create a test patternecho -n "Creating test pattern... "PATTERN_RESULT=$(send_request "tools/call" '{"name":"create_cyclic_recording_pattern","arguments":{"record_s":60,"sleep_s":240}}')PATTERN_ID=$(echo "$PATTERN_RESULT" | jq -r '.result.content[0].text | fromjson | .pattern.id')if [ "$PATTERN_ID" != "null" ] && [ -n "$PATTERN_ID" ]; thenecho "✓ Created pattern: $PATTERN_ID"elseecho "✗ Failed to create pattern"echo "$PATTERN_RESULT" | jq '.'exit 1fiecho ""echo "Step 2: Test update_dataset"echo "----------------------------"# Test 1: Update dataset nameecho -n "Test 1: Update dataset name... "UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_dataset","arguments":{"dataset_id":"'"$DATASET_ID"'","name":"Updated Dataset Name"}}')SUCCESS=$(echo "$UPDATE_RESULT" | jq -r '.result.content[0].text | fromjson | .success')if [ "$SUCCESS" = "true" ]; thenecho "✓ Success"elseecho "✗ Failed"echo "$UPDATE_RESULT" | jq '.'fi# Test 2: Update dataset typeecho -n "Test 2: Update dataset type to 'train'... "UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_dataset","arguments":{"dataset_id":"'"$DATASET_ID"'","type":"train"}}')SUCCESS=$(echo "$UPDATE_RESULT" | jq -r '.result.content[0].text | fromjson | .success')if [ "$SUCCESS" = "true" ]; thenecho "✓ Success"elseecho "✗ Failed"echo "$UPDATE_RESULT" | jq '.'fi# Test 3: Update multiple fieldsecho -n "Test 3: Update multiple fields... "UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_dataset","arguments":{"dataset_id":"'"$DATASET_ID"'","name":"Multi-Field Update","description":"Updated description","type":"organise"}}')SUCCESS=$(echo "$UPDATE_RESULT" | jq -r '.result.content[0].text | fromjson | .success')if [ "$SUCCESS" = "true" ]; thenecho "✓ Success"elseecho "✗ Failed"echo "$UPDATE_RESULT" | jq '.'fi# Test 4: Invalid dataset IDecho -n "Test 4: Invalid dataset ID (should fail)... "UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_dataset","arguments":{"dataset_id":"INVALID_ID","name":"Should Fail"}}')ERROR=$(echo "$UPDATE_RESULT" | jq -r '.error.message // empty')if [ -n "$ERROR" ]; thenecho "✓ Correctly failed: $ERROR"elseecho "✗ Should have failed but succeeded"fi# Test 5: No fields providedecho -n "Test 5: No fields provided (should fail)... "UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_dataset","arguments":{"dataset_id":"'"$DATASET_ID"'"}}')ERROR=$(echo "$UPDATE_RESULT" | jq -r '.error.message // empty')if [ -n "$ERROR" ]; thenecho "✓ Correctly failed: $ERROR"elseecho "✗ Should have failed but succeeded"fiecho ""echo "Step 3: Test update_location"echo "----------------------------"# Test 1: Update location nameecho -n "Test 1: Update location name... "UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_location","arguments":{"location_id":"'"$LOCATION_ID"'","name":"Updated Location Name"}}')SUCCESS=$(echo "$UPDATE_RESULT" | jq -r '.result.content[0].text | fromjson | .success')if [ "$SUCCESS" = "true" ]; thenecho "✓ Success"elseecho "✗ Failed"echo "$UPDATE_RESULT" | jq '.'fi# Test 2: Update coordinatesecho -n "Test 2: Update coordinates... "UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_location","arguments":{"location_id":"'"$LOCATION_ID"'","latitude":-36.8485,"longitude":174.7633}}')SUCCESS=$(echo "$UPDATE_RESULT" | jq -r '.result.content[0].text | fromjson | .success')if [ "$SUCCESS" = "true" ]; thenecho "✓ Success"elseecho "✗ Failed"echo "$UPDATE_RESULT" | jq '.'fi# Test 3: Update timezoneecho -n "Test 3: Update timezone... "UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_location","arguments":{"location_id":"'"$LOCATION_ID"'","timezone_id":"Pacific/Fiji"}}')SUCCESS=$(echo "$UPDATE_RESULT" | jq -r '.result.content[0].text | fromjson | .success')if [ "$SUCCESS" = "true" ]; thenecho "✓ Success"elseecho "✗ Failed"echo "$UPDATE_RESULT" | jq '.'fi# Test 4: Invalid latitude (should fail)echo -n "Test 4: Invalid latitude (should fail)... "UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_location","arguments":{"location_id":"'"$LOCATION_ID"'","latitude":91.0}}')ERROR=$(echo "$UPDATE_RESULT" | jq -r '.error.message // empty')if [ -n "$ERROR" ]; thenecho "✓ Correctly failed: $ERROR"elseecho "✗ Should have failed but succeeded"fi# Test 5: Invalid longitude (should fail)echo -n "Test 5: Invalid longitude (should fail)... "UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_location","arguments":{"location_id":"'"$LOCATION_ID"'","longitude":181.0}}')ERROR=$(echo "$UPDATE_RESULT" | jq -r '.error.message // empty')if [ -n "$ERROR" ]; thenecho "✓ Correctly failed: $ERROR"elseecho "✗ Should have failed but succeeded"fiecho ""echo "Step 4: Test update_pattern"echo "----------------------------"# Test 1: Update record_secho -n "Test 1: Update record_s... "UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_pattern","arguments":{"pattern_id":"'"$PATTERN_ID"'","record_s":120}}')SUCCESS=$(echo "$UPDATE_RESULT" | jq -r '.result.content[0].text | fromjson | .success')if [ "$SUCCESS" = "true" ]; thenecho "✓ Success"elseecho "✗ Failed"echo "$UPDATE_RESULT" | jq '.'fi# Test 2: Update sleep_secho -n "Test 2: Update sleep_s... "UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_pattern","arguments":{"pattern_id":"'"$PATTERN_ID"'","sleep_s":300}}')SUCCESS=$(echo "$UPDATE_RESULT" | jq -r '.result.content[0].text | fromjson | .success')if [ "$SUCCESS" = "true" ]; thenecho "✓ Success"elseecho "✗ Failed"echo "$UPDATE_RESULT" | jq '.'fi# Test 3: Update both fieldsecho -n "Test 3: Update both fields... "UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_pattern","arguments":{"pattern_id":"'"$PATTERN_ID"'","record_s":180,"sleep_s":360}}')SUCCESS=$(echo "$UPDATE_RESULT" | jq -r '.result.content[0].text | fromjson | .success')if [ "$SUCCESS" = "true" ]; thenecho "✓ Success"elseecho "✗ Failed"echo "$UPDATE_RESULT" | jq '.'fi# Test 4: Invalid record_s (should fail)echo -n "Test 4: Invalid record_s <= 0 (should fail)... "UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_pattern","arguments":{"pattern_id":"'"$PATTERN_ID"'","record_s":0}}')ERROR=$(echo "$UPDATE_RESULT" | jq -r '.error.message // empty')if [ -n "$ERROR" ]; thenecho "✓ Correctly failed: $ERROR"elseecho "✗ Should have failed but succeeded"fi# Test 5: Invalid sleep_s (should fail)echo -n "Test 5: Invalid sleep_s < 0 (should fail)... "UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_pattern","arguments":{"pattern_id":"'"$PATTERN_ID"'","sleep_s":-1}}')ERROR=$(echo "$UPDATE_RESULT" | jq -r '.error.message // empty')if [ -n "$ERROR" ]; thenecho "✓ Correctly failed: $ERROR"elseecho "✗ Should have failed but succeeded"fiecho ""echo "Step 5: Verify final state with SQL"echo "------------------------------------"# Verify datasetecho "Final dataset state:"QUERY_RESULT=$(send_request "tools/call" '{"name":"execute_sql","arguments":{"query":"SELECT id, name, description, type FROM dataset WHERE id = ?","parameters":["'"$DATASET_ID"'"]}}')echo "$QUERY_RESULT" | jq -r '.result.content[0].text | fromjson | .result.rows[0] | " ID: \(.id)\n Name: \(.name)\n Description: \(.description)\n Type: \(.type)"'echo ""echo "Final location state:"QUERY_RESULT=$(send_request "tools/call" '{"name":"execute_sql","arguments":{"query":"SELECT id, name, latitude, longitude, timezone_id FROM location WHERE id = ?","parameters":["'"$LOCATION_ID"'"]}}')echo "$QUERY_RESULT" | jq -r '.result.content[0].text | fromjson | .result.rows[0] | " ID: \(.id)\n Name: \(.name)\n Latitude: \(.latitude)\n Longitude: \(.longitude)\n Timezone: \(.timezone_id)"'echo ""echo "Final pattern state:"QUERY_RESULT=$(send_request "tools/call" '{"name":"execute_sql","arguments":{"query":"SELECT id, record_s, sleep_s FROM cyclic_recording_pattern WHERE id = ?","parameters":["'"$PATTERN_ID"'"]}}')echo "$QUERY_RESULT" | jq -r '.result.content[0].text | fromjson | .result.rows[0] | " ID: \(.id)\n Record: \(.record_s)s\n Sleep: \(.sleep_s)s"'echo ""echo "================================================"echo "All tests completed!"echo ""
{"jsonrpc":"2.0","id":1,"result":{"capabilities":{"logging":{},"prompts":{"listChanged":true},"resources":{"listChanged":true},"tools":{"listChanged":true}},"protocolVersion":"2024-11-05","serverInfo":{"name":"skraak_mcp","version":"v1.0.0"}}}{"jsonrpc":"2.0","method":"notifications/prompts/list_changed","params":{}}{"jsonrpc":"2.0","method":"notifications/tools/list_changed","params":{}}{"jsonrpc":"2.0","method":"notifications/resources/list_changed","params":{}}{"jsonrpc":"2.0","id":2,"result":{"content":[{"type":"text","text":"{\"columns\":[{\"database_type\":\"VARCHAR\",\"name\":\"id\"},{\"database_type\":\"VARCHAR\",\"name\":\"name\"},{\"database_type\":\"ENUM\",\"name\":\"type\"}],\"limited\":false,\"query_executed\":\"SELECT id, name, type FROM dataset WHERE active = true ORDER BY name LIMIT 1000\",\"row_count\":8,\"rows\":[{\"id\":\"wAJk9wuZN15x\",\"name\":\"Bluemine - Kiwi\",\"type\":\"organise\"},{\"id\":\"QZ0tlUrX4Nyi\",\"name\":\"Friends of Cobb - Kiwi\",\"type\":\"organise\"},{\"id\":\"RxajkKXz-w48\",\"name\":\"Lisa Whittle\",\"type\":\"organise\"},{\"id\":\"vgIr9JSH_lFj\",\"name\":\"MOK call site 1\",\"type\":\"organise\"},{\"id\":\"la-JpAf2nLKG\",\"name\":\"Manu o Kahurangi - Kiwi\",\"type\":\"organise\"},{\"id\":\"Yx0oNUDmP5ch\",\"name\":\"Pomona - Kiwi\",\"type\":\"organise\"},{\"id\":\"jWS-sw5RvM-j\",\"name\":\"Pure Salt - Kiwi\",\"type\":\"organise\"},{\"id\":\"gljgxDbfasva\",\"name\":\"Twenty Four Seven\",\"type\":\"organise\"}]}"}],"structuredContent":{"columns":[{"database_type":"VARCHAR","name":"id"},{"database_type":"VARCHAR","name":"name"},{"database_type":"ENUM","name":"type"}],"limited":false,"query_executed":"SELECT id, name, type FROM dataset WHERE active = true ORDER BY name LIMIT 1000","row_count":8,"rows":[{"id":"wAJk9wuZN15x","name":"Bluemine - Kiwi","type":"organise"},{"id":"QZ0tlUrX4Nyi","name":"Friends of Cobb - Kiwi","type":"organise"},{"id":"RxajkKXz-w48","name":"Lisa Whittle","type":"organise"},{"id":"vgIr9JSH_lFj","name":"MOK call site 1","type":"organise"},{"id":"la-JpAf2nLKG","name":"Manu o Kahurangi - Kiwi","type":"organise"},{"id":"Yx0oNUDmP5ch","name":"Pomona - Kiwi","type":"organise"},{"id":"jWS-sw5RvM-j","name":"Pure Salt - Kiwi","type":"organise"},{"id":"gljgxDbfasva","name":"Twenty Four Seven","type":"organise"}]}}}{"jsonrpc":"2.0","id":3,"result":{"content":[{"type":"text","text":"{\"columns\":[{\"database_type\":\"VARCHAR\",\"name\":\"id\"},{\"database_type\":\"VARCHAR\",\"name\":\"name\"}],\"limited\":false,\"query_executed\":\"SELECT id, name FROM location WHERE active = true ORDER BY name LIMIT 5\",\"row_count\":5,\"rows\":[{\"id\":\"EwyxfYPFMflt\",\"name\":\"A01\"},{\"id\":\"w5zig0ALH6a5\",\"name\":\"A05\"},{\"id\":\"GouXwoyjeFiq\",\"name\":\"A11\"},{\"id\":\"OS6xbBytkk_I\",\"name\":\"AC21\"},{\"id\":\"tcE-bZ0tcmFB\",\"name\":\"AC34\"}]}"}],"structuredContent":{"columns":[{"database_type":"VARCHAR","name":"id"},{"database_type":"VARCHAR","name":"name"}],"limited":false,"query_executed":"SELECT id, name FROM location WHERE active = true ORDER BY name LIMIT 5","row_count":5,"rows":[{"id":"EwyxfYPFMflt","name":"A01"},{"id":"w5zig0ALH6a5","name":"A05"},{"id":"GouXwoyjeFiq","name":"A11"},{"id":"OS6xbBytkk_I","name":"AC21"},{"id":"tcE-bZ0tcmFB","name":"AC34"}]}}}{"jsonrpc":"2.0","id":4,"result":{"content":[{"type":"text","text":"{\"columns\":[{\"database_type\":\"VARCHAR\",\"name\":\"id\"},{\"database_type\":\"VARCHAR\",\"name\":\"name\"},{\"database_type\":\"DECIMAL(10,7)\",\"name\":\"latitude\"},{\"database_type\":\"DECIMAL(10,7)\",\"name\":\"longitude\"}],\"limited\":false,\"query_executed\":\"SELECT id, name, latitude, longitude FROM location WHERE dataset_id = ? AND active = true LIMIT 1000\",\"row_count\":1,\"rows\":[{\"id\":\"0t9JyiuGID4w\",\"latitude\":\"-40.826344\",\"longitude\":\"172.585079\",\"name\":\"call site 1 1.2 test\"}]}"}],"structuredContent":{"columns":[{"database_type":"VARCHAR","name":"id"},{"database_type":"VARCHAR","name":"name"},{"database_type":"DECIMAL(10,7)","name":"latitude"},{"database_type":"DECIMAL(10,7)","name":"longitude"}],"limited":false,"query_executed":"SELECT id, name, latitude, longitude FROM location WHERE dataset_id = ? AND active = true LIMIT 1000","row_count":1,"rows":[{"id":"0t9JyiuGID4w","latitude":"-40.826344","longitude":"172.585079","name":"call site 1 1.2 test"}]}}}{"jsonrpc":"2.0","id":5,"result":{"content":[{"type":"text","text":"{\"columns\":[{\"database_type\":\"VARCHAR\",\"name\":\"dataset\"},{\"database_type\":\"BIGINT\",\"name\":\"location_count\"}],\"limited\":false,\"query_executed\":\"SELECT d.name as dataset, COUNT(l.id) as location_count FROM dataset d LEFT JOIN location l ON d.id = l.dataset_id WHERE d.active = true GROUP BY d.name ORDER BY d.name LIMIT 20\",\"row_count\":8,\"rows\":[{\"dataset\":\"Bluemine - Kiwi\",\"location_count\":11},{\"dataset\":\"Friends of Cobb - Kiwi\",\"location_count\":0},{\"dataset\":\"Lisa Whittle\",\"location_count\":15},{\"dataset\":\"MOK call site 1\",\"location_count\":1},{\"dataset\":\"Manu o Kahurangi - Kiwi\",\"location_count\":23},{\"dataset\":\"Pomona - Kiwi\",\"location_count\":48},{\"dataset\":\"Pure Salt - Kiwi\",\"location_count\":6},{\"dataset\":\"Twenty Four Seven\",\"location_count\":35}]}"}],"structuredContent":{"columns":[{"database_type":"VARCHAR","name":"dataset"},{"database_type":"BIGINT","name":"location_count"}],"limited":false,"query_executed":"SELECT d.name as dataset, COUNT(l.id) as location_count FROM dataset d LEFT JOIN location l ON d.id = l.dataset_id WHERE d.active = true GROUP BY d.name ORDER BY d.name LIMIT 20","row_count":8,"rows":[{"dataset":"Bluemine - Kiwi","location_count":11},{"dataset":"Friends of Cobb - Kiwi","location_count":0},{"dataset":"Lisa Whittle","location_count":15},{"dataset":"MOK call site 1","location_count":1},{"dataset":"Manu o Kahurangi - Kiwi","location_count":23},{"dataset":"Pomona - Kiwi","location_count":48},{"dataset":"Pure Salt - Kiwi","location_count":6},{"dataset":"Twenty Four Seven","location_count":35}]}}}{"jsonrpc":"2.0","id":6,"result":{"content":[{"type":"text","text":"{\"columns\":[{\"database_type\":\"ENUM\",\"name\":\"type\"},{\"database_type\":\"BIGINT\",\"name\":\"count\"}],\"limited\":false,\"query_executed\":\"SELECT type, COUNT(*) as count FROM dataset WHERE active = true GROUP BY type LIMIT 1000\",\"row_count\":1,\"rows\":[{\"count\":8,\"type\":\"organise\"}]}"}],"structuredContent":{"columns":[{"database_type":"ENUM","name":"type"},{"database_type":"BIGINT","name":"count"}],"limited":false,"query_executed":"SELECT type, COUNT(*) as count FROM dataset WHERE active = true GROUP BY type LIMIT 1000","row_count":1,"rows":[{"count":8,"type":"organise"}]}}}{"jsonrpc":"2.0","id":7,"result":{"content":[{"type":"text","text":"only SELECT and WITH queries are allowed"}],"isError":true}}{"jsonrpc":"2.0","id":8,"result":{"content":[{"type":"text","text":"query contains forbidden keywords (INSERT/UPDATE/DELETE/DROP/CREATE/ALTER)"}],"isError":true}}
{"jsonrpc":"2.0","id":1,"result":{"capabilities":{"logging":{},"prompts":{"listChanged":true},"resources":{"listChanged":true},"tools":{"listChanged":true}},"protocolVersion":"2024-11-05","serverInfo":{"name":"skraak_mcp","version":"v1.0.0"}}}{"jsonrpc":"2.0","method":"notifications/tools/list_changed","params":{}}{"jsonrpc":"2.0","method":"notifications/resources/list_changed","params":{}}{"jsonrpc":"2.0","method":"notifications/prompts/list_changed","params":{}}{"jsonrpc":"2.0","id":2,"result":{"content":[{"type":"text","text":"{\"columns\":[{\"database_type\":\"VARCHAR\",\"name\":\"id\"},{\"database_type\":\"VARCHAR\",\"name\":\"name\"},{\"database_type\":\"ENUM\",\"name\":\"type\"}],\"limited\":false,\"query_executed\":\"SELECT id, name, type FROM dataset WHERE active = true ORDER BY name LIMIT 1000\",\"row_count\":8,\"rows\":[{\"id\":\"wAJk9wuZN15x\",\"name\":\"Bluemine - Kiwi\",\"type\":\"organise\"},{\"id\":\"QZ0tlUrX4Nyi\",\"name\":\"Friends of Cobb - Kiwi\",\"type\":\"organise\"},{\"id\":\"RxajkKXz-w48\",\"name\":\"Lisa Whittle\",\"type\":\"organise\"},{\"id\":\"vgIr9JSH_lFj\",\"name\":\"MOK call site 1\",\"type\":\"organise\"},{\"id\":\"la-JpAf2nLKG\",\"name\":\"Manu o Kahurangi - Kiwi\",\"type\":\"organise\"},{\"id\":\"Yx0oNUDmP5ch\",\"name\":\"Pomona - Kiwi\",\"type\":\"organise\"},{\"id\":\"jWS-sw5RvM-j\",\"name\":\"Pure Salt - Kiwi\",\"type\":\"organise\"},{\"id\":\"gljgxDbfasva\",\"name\":\"Twenty Four Seven\",\"type\":\"organise\"}]}"}],"structuredContent":{"columns":[{"database_type":"VARCHAR","name":"id"},{"database_type":"VARCHAR","name":"name"},{"database_type":"ENUM","name":"type"}],"limited":false,"query_executed":"SELECT id, name, type FROM dataset WHERE active = true ORDER BY name LIMIT 1000","row_count":8,"rows":[{"id":"wAJk9wuZN15x","name":"Bluemine - Kiwi","type":"organise"},{"id":"QZ0tlUrX4Nyi","name":"Friends of Cobb - Kiwi","type":"organise"},{"id":"RxajkKXz-w48","name":"Lisa Whittle","type":"organise"},{"id":"vgIr9JSH_lFj","name":"MOK call site 1","type":"organise"},{"id":"la-JpAf2nLKG","name":"Manu o Kahurangi - Kiwi","type":"organise"},{"id":"Yx0oNUDmP5ch","name":"Pomona - Kiwi","type":"organise"},{"id":"jWS-sw5RvM-j","name":"Pure Salt - Kiwi","type":"organise"},{"id":"gljgxDbfasva","name":"Twenty Four Seven","type":"organise"}]}}}{"jsonrpc":"2.0","id":3,"result":{"content":[{"type":"text","text":"{\"columns\":[{\"database_type\":\"VARCHAR\",\"name\":\"id\"},{\"database_type\":\"VARCHAR\",\"name\":\"name\"}],\"limited\":false,\"query_executed\":\"SELECT id, name FROM location WHERE active = true ORDER BY name LIMIT 5\",\"row_count\":5,\"rows\":[{\"id\":\"EwyxfYPFMflt\",\"name\":\"A01\"},{\"id\":\"w5zig0ALH6a5\",\"name\":\"A05\"},{\"id\":\"GouXwoyjeFiq\",\"name\":\"A11\"},{\"id\":\"OS6xbBytkk_I\",\"name\":\"AC21\"},{\"id\":\"tcE-bZ0tcmFB\",\"name\":\"AC34\"}]}"}],"structuredContent":{"columns":[{"database_type":"VARCHAR","name":"id"},{"database_type":"VARCHAR","name":"name"}],"limited":false,"query_executed":"SELECT id, name FROM location WHERE active = true ORDER BY name LIMIT 5","row_count":5,"rows":[{"id":"EwyxfYPFMflt","name":"A01"},{"id":"w5zig0ALH6a5","name":"A05"},{"id":"GouXwoyjeFiq","name":"A11"},{"id":"OS6xbBytkk_I","name":"AC21"},{"id":"tcE-bZ0tcmFB","name":"AC34"}]}}}{"jsonrpc":"2.0","id":4,"result":{"content":[{"type":"text","text":"{\"columns\":[{\"database_type\":\"VARCHAR\",\"name\":\"id\"},{\"database_type\":\"VARCHAR\",\"name\":\"name\"},{\"database_type\":\"DECIMAL(10,7)\",\"name\":\"latitude\"},{\"database_type\":\"DECIMAL(10,7)\",\"name\":\"longitude\"}],\"limited\":false,\"query_executed\":\"SELECT id, name, latitude, longitude FROM location WHERE dataset_id = ? AND active = true LIMIT 1000\",\"row_count\":1,\"rows\":[{\"id\":\"0t9JyiuGID4w\",\"latitude\":\"-40.826344\",\"longitude\":\"172.585079\",\"name\":\"call site 1 1.2 test\"}]}"}],"structuredContent":{"columns":[{"database_type":"VARCHAR","name":"id"},{"database_type":"VARCHAR","name":"name"},{"database_type":"DECIMAL(10,7)","name":"latitude"},{"database_type":"DECIMAL(10,7)","name":"longitude"}],"limited":false,"query_executed":"SELECT id, name, latitude, longitude FROM location WHERE dataset_id = ? AND active = true LIMIT 1000","row_count":1,"rows":[{"id":"0t9JyiuGID4w","latitude":"-40.826344","longitude":"172.585079","name":"call site 1 1.2 test"}]}}}{"jsonrpc":"2.0","id":5,"result":{"content":[{"type":"text","text":"{\"columns\":[{\"database_type\":\"VARCHAR\",\"name\":\"dataset\"},{\"database_type\":\"BIGINT\",\"name\":\"location_count\"}],\"limited\":false,\"query_executed\":\"SELECT d.name as dataset, COUNT(l.id) as location_count FROM dataset d LEFT JOIN location l ON d.id = l.dataset_id WHERE d.active = true GROUP BY d.name ORDER BY d.name LIMIT 20\",\"row_count\":8,\"rows\":[{\"dataset\":\"Bluemine - Kiwi\",\"location_count\":11},{\"dataset\":\"Friends of Cobb - Kiwi\",\"location_count\":0},{\"dataset\":\"Lisa Whittle\",\"location_count\":15},{\"dataset\":\"MOK call site 1\",\"location_count\":1},{\"dataset\":\"Manu o Kahurangi - Kiwi\",\"location_count\":23},{\"dataset\":\"Pomona - Kiwi\",\"location_count\":48},{\"dataset\":\"Pure Salt - Kiwi\",\"location_count\":6},{\"dataset\":\"Twenty Four Seven\",\"location_count\":35}]}"}],"structuredContent":{"columns":[{"database_type":"VARCHAR","name":"dataset"},{"database_type":"BIGINT","name":"location_count"}],"limited":false,"query_executed":"SELECT d.name as dataset, COUNT(l.id) as location_count FROM dataset d LEFT JOIN location l ON d.id = l.dataset_id WHERE d.active = true GROUP BY d.name ORDER BY d.name LIMIT 20","row_count":8,"rows":[{"dataset":"Bluemine - Kiwi","location_count":11},{"dataset":"Friends of Cobb - Kiwi","location_count":0},{"dataset":"Lisa Whittle","location_count":15},{"dataset":"MOK call site 1","location_count":1},{"dataset":"Manu o Kahurangi - Kiwi","location_count":23},{"dataset":"Pomona - Kiwi","location_count":48},{"dataset":"Pure Salt - Kiwi","location_count":6},{"dataset":"Twenty Four Seven","location_count":35}]}}}{"jsonrpc":"2.0","id":6,"result":{"content":[{"type":"text","text":"{\"columns\":[{\"database_type\":\"ENUM\",\"name\":\"type\"},{\"database_type\":\"BIGINT\",\"name\":\"count\"}],\"limited\":false,\"query_executed\":\"SELECT type, COUNT(*) as count FROM dataset WHERE active = true GROUP BY type LIMIT 1000\",\"row_count\":1,\"rows\":[{\"count\":8,\"type\":\"organise\"}]}"}],"structuredContent":{"columns":[{"database_type":"ENUM","name":"type"},{"database_type":"BIGINT","name":"count"}],"limited":false,"query_executed":"SELECT type, COUNT(*) as count FROM dataset WHERE active = true GROUP BY type LIMIT 1000","row_count":1,"rows":[{"count":8,"type":"organise"}]}}}{"jsonrpc":"2.0","id":7,"result":{"content":[{"type":"text","text":"only SELECT and WITH queries are allowed"}],"isError":true}}{"jsonrpc":"2.0","id":8,"result":{"content":[{"type":"text","text":"query contains forbidden keywords (INSERT/UPDATE/DELETE/DROP/CREATE/ALTER)"}],"isError":true}}
#!/bin/bash# Test script for import_audio_files tool# Tests tool registration and basic validation# Database path - USE TEST DATABASEDB_PATH="${1:-../db/test.duckdb}"echo "=== Testing import_audio_files Tool ==="echo "Database: $DB_PATH"echo ""# Test 1: List available tools (should include import_audio_files)echo "Test 1: List available tools"echo '{"jsonrpc":"2.0","method":"tools/list","id":1}' | ../skraak_mcp "$DB_PATH" | jq -r '.result.tools[] | select(.name == "import_audio_files") | "✓ Found: \(.name) - \(.description)"'echo ""# Test 2: Get tool schemaecho "Test 2: Get import_audio_files tool schema"echo '{"jsonrpc":"2.0","method":"tools/list","id":2}' | ../skraak_mcp "$DB_PATH" | jq '.result.tools[] | select(.name == "import_audio_files") | .inputSchema.properties'echo ""# Test 3: Test validation with invalid folder pathecho "Test 3: Test validation - invalid folder path"echo '{"jsonrpc": "2.0","method": "tools/call","params": {"name": "import_audio_files","arguments": {"folder_path": "/nonexistent/folder","dataset_id": "test123","location_id": "loc456","cluster_id": "clust789"}},"id": 3}' | ../skraak_mcp "$DB_PATH" | jq -r '.error.message // "Validation passed (unexpected!)"'echo ""# Test 4: Test validation with invalid dataset IDecho "Test 4: Test validation - invalid dataset_id"echo '{"jsonrpc": "2.0","method": "tools/call","params": {"name": "import_audio_files","arguments": {"folder_path": "/tmp","dataset_id": "invalidXXXXXX","location_id": "invalidXXXXXX","cluster_id": "invalidXXXXXX"}},"id": 4}' | ../skraak_mcp "$DB_PATH" | jq -r '.error.message // "No error (unexpected!)"'echo ""echo "=== Test Complete ==="echo ""echo "Note: For full functional testing with actual WAV files:"echo "1. Create a test dataset, location, and cluster in the database"echo "2. Place WAV files in a test folder"echo "3. Run import with valid IDs and folder path"
#!/bin/bash# Simple test of import_audio_files tool registration# Just checks if the server can start and the tool is registeredDB_PATH="${1:-../db/test.duckdb}"echo "=== Testing import_audio_files Tool Registration ==="echo "Database: $DB_PATH"echo ""# Create a test script that sends proper MCP initialization + tools/listcat > /tmp/test_import_mcp.txt << 'EOF'{"jsonrpc":"2.0","method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test","version":"1.0"}},"id":1}{"jsonrpc":"2.0","method":"tools/list","id":2}EOFecho "Sending MCP commands..."cat /tmp/test_import_mcp.txt | ../skraak_mcp "$DB_PATH" 2>&1 | grep -A 20 '"method":"tools/list"' | jq -r 'select(.result != null) | .result.tools[] | select(.name == "import_audio_files") | "✓ Tool registered: \(.name)\n Description: \(.description)\n Required inputs: \(.inputSchema.required | join(", "))"'echo ""echo "=== Test Complete ==="# Cleanuprm -f /tmp/test_import_mcp.txt
{"jsonrpc":"2.0","id":1,"result":{"capabilities":{"logging":{},"prompts":{"listChanged":true},"resources":{"listChanged":true},"tools":{"listChanged":true}},"protocolVersion":"2024-11-05","serverInfo":{"name":"skraak_mcp","version":"v1.0.0"}}}{"jsonrpc":"2.0","method":"notifications/tools/list_changed","params":{}}{"jsonrpc":"2.0","method":"notifications/prompts/list_changed","params":{}}{"jsonrpc":"2.0","method":"notifications/resources/list_changed","params":{}}{"jsonrpc":"2.0","id":2,"result":{"content":[{"type":"text","text":"file validation failed: file does not exist: /nonexistent/path/to/file.wav"}],"isError":true}}{"jsonrpc":"2.0","id":3,"result":{"content":[{"type":"text","text":"file validation failed: file must be a WAV file (got extension: )"}],"isError":true}}{"jsonrpc":"2.0","id":4,"result":{"content":[{"type":"text","text":"file validation failed: file does not exist: /tmp/test.wav"}],"isError":true}}{"jsonrpc":"2.0","id":5,"result":{"content":[{"type":"text","text":"file validation failed: file does not exist: /tmp/test.wav"}],"isError":true}}{"jsonrpc":"2.0","id":8,"result":{"content":[{"type":"text","text":"{\"columns\":[{\"database_type\":\"VARCHAR\",\"name\":\"file_name\"},{\"database_type\":\"VARCHAR\",\"name\":\"xxh64_hash\"},{\"database_type\":\"DECIMAL(7,3)\",\"name\":\"duration\"},{\"database_type\":\"INTEGER\",\"name\":\"sample_rate\"},{\"database_type\":\"BOOLEAN\",\"name\":\"maybe_solar_night\"}],\"limited\":false,\"query_executed\":\"SELECT file_name, xxh64_hash, duration, sample_rate, maybe_solar_night FROM file WHERE cluster_id = ? AND active = true ORDER BY created_at DESC LIMIT 3\",\"row_count\":3,\"rows\":[{\"duration\":\"60\",\"file_name\":\"20231204_123000.WAV\",\"maybe_solar_night\":false,\"sample_rate\":\"250000\",\"xxh64_hash\":\"f51d08eb40779d25\"},{\"duration\":\"60\",\"file_name\":\"20231031_090000.WAV\",\"maybe_solar_night\":false,\"sample_rate\":\"250000\",\"xxh64_hash\":\"58e100c14b67c0f3\"},{\"duration\":\"59\",\"file_name\":\"20231102_110001.WAV\",\"maybe_solar_night\":false,\"sample_rate\":\"250000\",\"xxh64_hash\":\"d05da810db87d31b\"}]}"}],"structuredContent":{"columns":[{"database_type":"VARCHAR","name":"file_name"},{"database_type":"VARCHAR","name":"xxh64_hash"},{"database_type":"DECIMAL(7,3)","name":"duration"},{"database_type":"INTEGER","name":"sample_rate"},{"database_type":"BOOLEAN","name":"maybe_solar_night"}],"limited":false,"query_executed":"SELECT file_name, xxh64_hash, duration, sample_rate, maybe_solar_night FROM file WHERE cluster_id = ? AND active = true ORDER BY created_at DESC LIMIT 3","row_count":3,"rows":[{"duration":"60","file_name":"20231204_123000.WAV","maybe_solar_night":false,"sample_rate":"250000","xxh64_hash":"f51d08eb40779d25"},{"duration":"60","file_name":"20231031_090000.WAV","maybe_solar_night":false,"sample_rate":"250000","xxh64_hash":"58e100c14b67c0f3"},{"duration":"59","file_name":"20231102_110001.WAV","maybe_solar_night":false,"sample_rate":"250000","xxh64_hash":"d05da810db87d31b"}]}}}
#!/bin/bash# Comprehensive test suite for all write tools (create and update)# Tests: create_dataset, create_location, create_cluster, create_cyclic_recording_pattern# update_dataset, update_location, update_cluster, update_pattern# Usage: ./test_tools.sh [db_path]# Default: ../db/test.duckdb (ALWAYS USE TEST DATABASE!)# Get absolute paths before changing directorySCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"PROJECT_DIR="$(cd "$SCRIPT_DIR/.." && pwd)"DB_PATH="${1:-$PROJECT_DIR/db/test.duckdb}"if [ ! -f "$DB_PATH" ]; thenecho "Error: Database not found at $DB_PATH"exit 1fiecho "Testing write tools with database: $DB_PATH"echo "=========================================="echo ""# Navigate to the project directory where skraak_mcp binary is locatedcd "$PROJECT_DIR" || exit 1if [ ! -f "./skraak_mcp" ]; thenecho "Error: skraak_mcp binary not found. Run 'go build' first."exit 1fi# Function to send MCP requestsend_request() {local method="$1"local params="$2"(echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test","version":"1.0"}}}'sleep 0.2echo "{\"jsonrpc\":\"2.0\",\"id\":2,\"method\":\"$method\",\"params\":$params}"sleep 0.5) | timeout 10 ./skraak_mcp "$DB_PATH" 2>&1 | grep '"id":2' | head -1}echo "=== PART 1: CREATE TOOLS ==="echo ""# Test 1: Create cyclic recording patternecho "Test 1: Create cyclic recording pattern (valid)"echo "------------------------------------------------"PATTERN_RESULT=$(send_request "tools/call" '{"name":"create_cyclic_recording_pattern","arguments":{"record_seconds":120,"sleep_seconds":300}}')PATTERN_ID=$(echo "$PATTERN_RESULT" | jq -r '.result.structuredContent.pattern.id // empty')if [ -n "$PATTERN_ID" ]; thenecho "✓ Created pattern: $PATTERN_ID (120s record, 300s sleep)"elseecho "✗ Failed to create pattern"echo "$PATTERN_RESULT" | jq '.'fiecho ""# Test 2: Create pattern with invalid values (should fail)echo "Test 2: Create pattern with negative values (should fail)"echo "----------------------------------------------------------"INVALID_PATTERN=$(send_request "tools/call" '{"name":"create_cyclic_recording_pattern","arguments":{"record_seconds":-10,"sleep_seconds":300}}')ERROR=$(echo "$INVALID_PATTERN" | jq -r '.result.isError // .error.message // empty')if [ -n "$ERROR" ] && [ "$ERROR" != "false" ]; thenecho "✓ Correctly rejected invalid pattern"elseecho "✗ Should have rejected negative values"fiecho ""# Test 3: Create datasetecho "Test 3: Create dataset (organise type)"echo "---------------------------------------"DATASET_RESULT=$(send_request "tools/call" '{"name":"create_dataset","arguments":{"name":"Test Dataset 2026","description":"Automated test dataset","type":"organise"}}')DATASET_ID=$(echo "$DATASET_RESULT" | jq -r '.result.structuredContent.dataset.id // empty')if [ -n "$DATASET_ID" ]; thenecho "✓ Created dataset: $DATASET_ID"elseecho "✗ Failed to create dataset"echo "$DATASET_RESULT" | jq '.'fiecho ""# Test 4: Create dataset with invalid type (should fail)echo "Test 4: Create dataset with invalid type (should fail)"echo "-------------------------------------------------------"INVALID_DATASET=$(send_request "tools/call" '{"name":"create_dataset","arguments":{"name":"Bad Dataset","type":"invalid_type"}}')ERROR=$(echo "$INVALID_DATASET" | jq -r '.result.isError // .error.message // empty')if [ -n "$ERROR" ] && [ "$ERROR" != "false" ]; thenecho "✓ Correctly rejected invalid dataset type"elseecho "✗ Should have rejected invalid type"fiecho ""# Test 5: Create locationecho "Test 5: Create location (Wellington, NZ)"echo "-----------------------------------------"LOCATION_RESULT=$(send_request "tools/call" '{"name":"create_location","arguments":{"dataset_id":"'"$DATASET_ID"'","name":"Wellington Test Location","latitude":-41.2865,"longitude":174.7762,"timezone_id":"Pacific/Auckland","description":"Test location in Wellington"}}')LOCATION_ID=$(echo "$LOCATION_RESULT" | jq -r '.result.structuredContent.location.id // empty')if [ -n "$LOCATION_ID" ]; thenecho "✓ Created location: $LOCATION_ID"elseecho "✗ Failed to create location"echo "$LOCATION_RESULT" | jq '.'fiecho ""# Test 6: Create location with invalid coordinates (should fail)echo "Test 6: Create location with invalid coordinates (should fail)"echo "---------------------------------------------------------------"INVALID_LOCATION=$(send_request "tools/call" '{"name":"create_location","arguments":{"dataset_id":"'"$DATASET_ID"'","name":"Invalid Location","latitude":999,"longitude":174.7762,"timezone_id":"Pacific/Auckland"}}')ERROR=$(echo "$INVALID_LOCATION" | jq -r '.result.isError // .error.message // empty')if [ -n "$ERROR" ] && [ "$ERROR" != "false" ]; thenecho "✓ Correctly rejected invalid coordinates"elseecho "✗ Should have rejected invalid latitude"fiecho ""# Test 7: Create clusterecho "Test 7: Create cluster with pattern"echo "------------------------------------"CLUSTER_RESULT=$(send_request "tools/call" '{"name":"create_cluster","arguments":{"dataset_id":"'"$DATASET_ID"'","location_id":"'"$LOCATION_ID"'","name":"Test Cluster A01","sample_rate":250000,"cyclic_recording_pattern_id":"'"$PATTERN_ID"'"}}')CLUSTER_ID=$(echo "$CLUSTER_RESULT" | jq -r '.result.structuredContent.cluster.id // empty')if [ -n "$CLUSTER_ID" ]; thenecho "✓ Created cluster: $CLUSTER_ID"elseecho "✗ Failed to create cluster"echo "$CLUSTER_RESULT" | jq '.'fiecho ""# Test 8: Create cluster with invalid sample rate (should fail)echo "Test 8: Create cluster with invalid sample rate (should fail)"echo "--------------------------------------------------------------"INVALID_CLUSTER=$(send_request "tools/call" '{"name":"create_cluster","arguments":{"dataset_id":"'"$DATASET_ID"'","location_id":"'"$LOCATION_ID"'","name":"Bad Cluster","sample_rate":-1000}}')ERROR=$(echo "$INVALID_CLUSTER" | jq -r '.result.isError // .error.message // empty')if [ -n "$ERROR" ] && [ "$ERROR" != "false" ]; thenecho "✓ Correctly rejected invalid sample rate"elseecho "✗ Should have rejected negative sample rate"fiecho ""echo "=== PART 2: UPDATE TOOLS ==="echo ""# Test 9: Update dataset name and descriptionecho "Test 9: Update dataset name and description"echo "--------------------------------------------"UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_dataset","arguments":{"dataset_id":"'"$DATASET_ID"'","name":"Updated Test Dataset","description":"Updated description after test"}}')SUCCESS=$(echo "$UPDATE_RESULT" | jq -r '.result.structuredContent.success // empty')if [ "$SUCCESS" = "true" ]; thenecho "✓ Successfully updated dataset"elseecho "✗ Failed to update dataset"echo "$UPDATE_RESULT" | jq '.'fiecho ""# Test 10: Update dataset typeecho "Test 10: Update dataset type to 'train'"echo "---------------------------------------"UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_dataset","arguments":{"dataset_id":"'"$DATASET_ID"'","type":"train"}}')SUCCESS=$(echo "$UPDATE_RESULT" | jq -r '.result.structuredContent.success // empty')if [ "$SUCCESS" = "true" ]; thenecho "✓ Successfully updated dataset type"elseecho "✗ Failed to update dataset type"echo "$UPDATE_RESULT" | jq '.'fiecho ""# Test 11: Update location coordinatesecho "Test 11: Update location coordinates and name"echo "----------------------------------------------"UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_location","arguments":{"location_id":"'"$LOCATION_ID"'","name":"Updated Wellington Location","latitude":-41.2900,"longitude":174.7800}}')SUCCESS=$(echo "$UPDATE_RESULT" | jq -r '.result.structuredContent.success // empty')if [ "$SUCCESS" = "true" ]; thenecho "✓ Successfully updated location"elseecho "✗ Failed to update location"echo "$UPDATE_RESULT" | jq '.'fiecho ""# Test 12: Update cluster metadataecho "Test 12: Update cluster name and sample rate"echo "---------------------------------------------"UPDATE_RESULT=$(send_request "tools/call" '{"name":"update_cluster","arguments":{"cluster_id":"'"$CLUSTER_ID"'","name":"Updated Cluster A01","sample_rate":384000,"description":"Updated cluster description"}}')SUCCESS=$(echo "$UPDATE_RESULT" | jq -r '.result.structuredContent.success // empty')if [ "$SUCCESS" = "true" ]; thenecho "✓ Successfully updated cluster"elseecho "✗ Failed to update cluster"echo "$UPDATE_RESULT" | jq '.'fiecho ""# Test 13: Update recording pattern (using existing pattern from database)echo "Test 13: Update recording pattern durations"echo "--------------------------------------------"echo -n "Getting existing pattern from database... "EXISTING_PATTERN=$(send_request "tools/call" '{"name":"execute_sql","arguments":{"query":"SELECT id, record_s, sleep_s FROM cyclic_recording_pattern WHERE active = true ORDER BY created_at DESC LIMIT 1"}}')EXISTING_PATTERN_ID=$(echo "$EXISTING_PATTERN" | jq -r '.result.structuredContent.rows[0].id // empty')CURRENT_RECORD=$(echo "$EXISTING_PATTERN" | jq -r '.result.structuredContent.rows[0].record_s // empty' | sed 's/"//g')CURRENT_SLEEP=$(echo "$EXISTING_PATTERN" | jq -r '.result.structuredContent.rows[0].sleep_s // empty' | sed 's/"//g')if [ -n "$EXISTING_PATTERN_ID" ] && [ -n "$CURRENT_RECORD" ] && [ -n "$CURRENT_SLEEP" ]; thenecho "found $EXISTING_PATTERN_ID (${CURRENT_RECORD}s/${CURRENT_SLEEP}s)"# Generate unique values based on current timestamp to avoid duplicatesTIMESTAMP_OFFSET=$(($(date +%s) % 100))NEW_RECORD=$((CURRENT_RECORD + TIMESTAMP_OFFSET))NEW_SLEEP=$((CURRENT_SLEEP + TIMESTAMP_OFFSET))# Check if this combination already existsCHECK_RESULT=$(send_request "tools/call" '{"name":"execute_sql","arguments":{"query":"SELECT COUNT(*) as count FROM cyclic_recording_pattern WHERE record_s = '"$NEW_RECORD"' AND sleep_s = '"$NEW_SLEEP"' AND active = true"}}')EXISTS=$(echo "$CHECK_RESULT" | jq -r '.result.structuredContent.rows[0].count // "0"' | sed 's/"//g')if [ "$EXISTS" = "0" ]; thenUPDATE_RESULT=$(send_request "tools/call" '{"name":"update_pattern","arguments":{"pattern_id":"'"$EXISTING_PATTERN_ID"'","record_s":'"$NEW_RECORD"',"sleep_s":'"$NEW_SLEEP"'}}')SUCCESS=$(echo "$UPDATE_RESULT" | jq -r '.result.structuredContent.success // empty')if [ "$SUCCESS" = "true" ]; thenecho "✓ Successfully updated pattern (${NEW_RECORD}s record, ${NEW_SLEEP}s sleep)"elseecho "✗ Failed to update pattern"echo "$UPDATE_RESULT" | jq '.'fielseecho "✓ Pattern with values ${NEW_RECORD}s/${NEW_SLEEP}s already exists (skipping to avoid duplicate)"fielseecho "✗ Could not find existing pattern to update"fiecho ""# Test 14: Update with invalid ID (should fail)echo "Test 14: Update with non-existent dataset ID (should fail)"echo "-----------------------------------------------------------"INVALID_UPDATE=$(send_request "tools/call" '{"name":"update_dataset","arguments":{"dataset_id":"INVALID_ID_123","name":"Should Fail"}}')ERROR=$(echo "$INVALID_UPDATE" | jq -r '.result.isError // .error.message // empty')if [ -n "$ERROR" ] && [ "$ERROR" != "false" ]; thenecho "✓ Correctly rejected invalid dataset ID"elseecho "✗ Should have rejected invalid ID"fiecho ""echo "=== TEST SUMMARY ==="echo "All write tool tests complete!"echo "Check output above for any ✗ failures"echo ""
# Always use test.duckdb for testing!
# Default: ../db/test.duckdb (ALWAYS USE TEST DATABASE!)# Get absolute paths before changing directorySCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"PROJECT_DIR="$(cd "$SCRIPT_DIR/.." && pwd)"DB_PATH="${1:-$PROJECT_DIR/db/test.duckdb}"if [ ! -f "$DB_PATH" ]; thenecho "Error: Database not found at $DB_PATH"exit 1fiecho "Testing import_file tool with database: $DB_PATH"echo "================================================="echo ""# Navigate to the project directory where skraak_mcp binary is locatedcd "$PROJECT_DIR" || exit 1if [ ! -f "./skraak_mcp" ]; thenecho "Error: skraak_mcp binary not found. Run 'go build' first."exit 1fi# Function to send MCP requestsend_request() {local method="$1"local params="$2"(echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test","version":"1.0"}}}'sleep 0.2echo "{\"jsonrpc\":\"2.0\",\"id\":2,\"method\":\"$method\",\"params\":$params}"sleep 0.5) | timeout 10 ./skraak_mcp "$DB_PATH" 2>&1 | grep '"id":2' | head -1}echo "Setup: Getting test IDs from database"echo "--------------------------------------"# Get test IDs using execute_sqlDATASET_RESULT=$(send_request "tools/call" '{"name":"execute_sql","arguments":{"query":"SELECT id, name FROM dataset WHERE active = true LIMIT 1"}}')DATASET_ID=$(echo "$DATASET_RESULT" | jq -r '.result.structuredContent.rows[0].id // empty')DATASET_NAME=$(echo "$DATASET_RESULT" | jq -r '.result.structuredContent.rows[0].name // empty')if [ -z "$DATASET_ID" ]; thenecho "✗ No active datasets found in database"exit 1fiecho "✓ Using dataset: $DATASET_NAME ($DATASET_ID)"
DB_PATH="${1:-../db/skraak.duckdb}"
LOCATION_RESULT=$(send_request "tools/call" '{"name":"execute_sql","arguments":{"query":"SELECT id, name FROM location WHERE dataset_id = ? AND active = true LIMIT 1","parameters":["'"$DATASET_ID"'"]}}')LOCATION_ID=$(echo "$LOCATION_RESULT" | jq -r '.result.structuredContent.rows[0].id // empty')LOCATION_NAME=$(echo "$LOCATION_RESULT" | jq -r '.result.structuredContent.rows[0].name // empty')
# Get test IDs from databaseDATASET_ID=$(echo "SELECT id FROM dataset WHERE active = true LIMIT 1" | duckdb "$DB_PATH" -json | jq -r '.[0].id')LOCATION_ID=$(echo "SELECT id FROM location WHERE dataset_id = '$DATASET_ID' AND active = true LIMIT 1" | duckdb "$DB_PATH" -json | jq -r '.[0].id')CLUSTER_ID=$(echo "SELECT id FROM cluster WHERE location_id = '$LOCATION_ID' AND active = true LIMIT 1" | duckdb "$DB_PATH" -json | jq -r '.[0].id')
if [ -z "$LOCATION_ID" ]; thenecho "✗ No active locations found for dataset"exit 1fiecho "✓ Using location: $LOCATION_NAME ($LOCATION_ID)"
# Get a real WAV file pathCLUSTER_PATH=$(echo "SELECT path FROM cluster WHERE id = '$CLUSTER_ID'" | duckdb "$DB_PATH" -json | jq -r '.[0].path')TEST_FILE=$(echo "SELECT file_name FROM file WHERE cluster_id = '$CLUSTER_ID' AND active = true LIMIT 1" | duckdb "$DB_PATH" -json | jq -r '.[0].file_name')
CLUSTER_RESULT=$(send_request "tools/call" '{"name":"execute_sql","arguments":{"query":"SELECT id, name FROM cluster WHERE location_id = ? AND active = true LIMIT 1","parameters":["'"$LOCATION_ID"'"]}}')CLUSTER_ID=$(echo "$CLUSTER_RESULT" | jq -r '.result.structuredContent.rows[0].id // empty')CLUSTER_NAME=$(echo "$CLUSTER_RESULT" | jq -r '.result.structuredContent.rows[0].name // empty')
# Construct full path (may or may not exist)if [ -n "$CLUSTER_PATH" ] && [ "$CLUSTER_PATH" != "null" ] && [ -n "$TEST_FILE" ] && [ "$TEST_FILE" != "null" ]; thenFULL_PATH="$CLUSTER_PATH/$TEST_FILE"elseFULL_PATH="/nonexistent/test.wav"
if [ -z "$CLUSTER_ID" ]; thenecho "✗ No active clusters found for location"exit 1
# Test 1: Import non-existent file (should error)echo "{\"jsonrpc\":\"2.0\",\"id\":2,\"method\":\"tools/call\",\"params\":{\"name\":\"import_file\",\"arguments\":{\"file_path\":\"/nonexistent/path/to/file.wav\",\"dataset_id\":\"$DATASET_ID\",\"location_id\":\"$LOCATION_ID\",\"cluster_id\":\"$CLUSTER_ID\"}}}"sleep 0.2
# Test 1: Non-existent fileecho "Test 1: Import non-existent file (should fail)"echo "-----------------------------------------------"RESULT=$(send_request "tools/call" '{"name":"import_file","arguments":{"file_path":"/nonexistent/path/to/file.wav","dataset_id":"'"$DATASET_ID"'","location_id":"'"$LOCATION_ID"'","cluster_id":"'"$CLUSTER_ID"'"}}')IS_ERROR=$(echo "$RESULT" | jq -r '.result.isError // .error.message // empty')if [ -n "$IS_ERROR" ] && [ "$IS_ERROR" != "false" ]; thenecho "✓ Correctly rejected non-existent file"elseecho "✗ Should have rejected non-existent file"fiecho ""
# Test 2: Import non-WAV file (should error)echo "{\"jsonrpc\":\"2.0\",\"id\":3,\"method\":\"tools/call\",\"params\":{\"name\":\"import_file\",\"arguments\":{\"file_path\":\"/etc/passwd\",\"dataset_id\":\"$DATASET_ID\",\"location_id\":\"$LOCATION_ID\",\"cluster_id\":\"$CLUSTER_ID\"}}}"sleep 0.2
# Test 2: Non-WAV fileecho "Test 2: Import non-WAV file (should fail)"echo "------------------------------------------"RESULT=$(send_request "tools/call" '{"name":"import_file","arguments":{"file_path":"/etc/passwd","dataset_id":"'"$DATASET_ID"'","location_id":"'"$LOCATION_ID"'","cluster_id":"'"$CLUSTER_ID"'"}}')IS_ERROR=$(echo "$RESULT" | jq -r '.result.isError // .error.message // empty')if [ -n "$IS_ERROR" ] && [ "$IS_ERROR" != "false" ]; thenecho "✓ Correctly rejected non-WAV file"elseecho "✗ Should have rejected non-WAV file"fiecho ""
# Test 3: Import with invalid dataset_id (should error)echo "{\"jsonrpc\":\"2.0\",\"id\":4,\"method\":\"tools/call\",\"params\":{\"name\":\"import_file\",\"arguments\":{\"file_path\":\"/tmp/test.wav\",\"dataset_id\":\"invalid_id123\",\"location_id\":\"$LOCATION_ID\",\"cluster_id\":\"$CLUSTER_ID\"}}}"sleep 0.2
# Test 3: Invalid dataset IDecho "Test 3: Import with invalid dataset_id (should fail)"echo "-----------------------------------------------------"RESULT=$(send_request "tools/call" '{"name":"import_file","arguments":{"file_path":"/tmp/test.wav","dataset_id":"INVALID_ID_123","location_id":"'"$LOCATION_ID"'","cluster_id":"'"$CLUSTER_ID"'"}}')IS_ERROR=$(echo "$RESULT" | jq -r '.result.isError // .error.message // empty')if [ -n "$IS_ERROR" ] && [ "$IS_ERROR" != "false" ]; thenecho "✓ Correctly rejected invalid dataset_id"elseecho "✗ Should have rejected invalid dataset_id"fiecho ""
# Test 4: Import with invalid cluster_id (should error)echo "{\"jsonrpc\":\"2.0\",\"id\":5,\"method\":\"tools/call\",\"params\":{\"name\":\"import_file\",\"arguments\":{\"file_path\":\"/tmp/test.wav\",\"dataset_id\":\"$DATASET_ID\",\"location_id\":\"$LOCATION_ID\",\"cluster_id\":\"invalid_id123\"}}}"sleep 0.2
# Test 4: Invalid cluster IDecho "Test 4: Import with invalid cluster_id (should fail)"echo "-----------------------------------------------------"RESULT=$(send_request "tools/call" '{"name":"import_file","arguments":{"file_path":"/tmp/test.wav","dataset_id":"'"$DATASET_ID"'","location_id":"'"$LOCATION_ID"'","cluster_id":"INVALID_ID_123"}}')IS_ERROR=$(echo "$RESULT" | jq -r '.result.isError // .error.message // empty')if [ -n "$IS_ERROR" ] && [ "$IS_ERROR" != "false" ]; thenecho "✓ Correctly rejected invalid cluster_id"elseecho "✗ Should have rejected invalid cluster_id"fiecho ""
# Test 5: Import real file (if it exists)if [ -f "$FULL_PATH" ]; then# Escape path for JSONESCAPED_PATH=$(echo "$FULL_PATH" | sed 's/\\/\\\\/g' | sed 's/"/\\"/g')echo "{\"jsonrpc\":\"2.0\",\"id\":6,\"method\":\"tools/call\",\"params\":{\"name\":\"import_file\",\"arguments\":{\"file_path\":\"$ESCAPED_PATH\",\"dataset_id\":\"$DATASET_ID\",\"location_id\":\"$LOCATION_ID\",\"cluster_id\":\"$CLUSTER_ID\"}}}"sleep 0.2
echo "=== QUERY EXISTING FILES ==="echo ""
# Test 6: Import same file again (should be duplicate)echo "{\"jsonrpc\":\"2.0\",\"id\":7,\"method\":\"tools/call\",\"params\":{\"name\":\"import_file\",\"arguments\":{\"file_path\":\"$ESCAPED_PATH\",\"dataset_id\":\"$DATASET_ID\",\"location_id\":\"$LOCATION_ID\",\"cluster_id\":\"$CLUSTER_ID\"}}}"sleep 0.2fi
# Query existing files in the clusterecho "Querying existing files in cluster..."FILES_RESULT=$(send_request "tools/call" '{"name":"execute_sql","arguments":{"query":"SELECT file_name, xxh64_hash, duration FROM file WHERE cluster_id = ? AND active = true ORDER BY created_at DESC LIMIT 3","parameters":["'"$CLUSTER_ID"'"]}}')FILE_COUNT=$(echo "$FILES_RESULT" | jq -r '.result.structuredContent.row_count // 0')
# Test 7: Query files to verifyecho "{\"jsonrpc\":\"2.0\",\"id\":8,\"method\":\"tools/call\",\"params\":{\"name\":\"execute_sql\",\"arguments\":{\"query\":\"SELECT file_name, xxh64_hash, duration, sample_rate, maybe_solar_night FROM file WHERE cluster_id = ? AND active = true ORDER BY created_at DESC LIMIT 3\",\"parameters\":[\"$CLUSTER_ID\"],\"limit\":3}}}"sleep 0.2
if [ "$FILE_COUNT" -gt 0 ]; thenecho "✓ Found $FILE_COUNT file(s) in cluster:"echo "$FILES_RESULT" | jq -r '.result.structuredContent.rows[] | " - \(.file_name) (hash: \(.xxh64_hash), duration: \(.duration)s)"'elseecho "ℹ No existing files found in cluster"fiecho ""
} | ../skraak_mcp "$DB_PATH" 2>/dev/null
echo "=== TEST SUMMARY ==="echo "Validation tests complete!"echo ""echo "Note: This test validates error handling."echo "To test actual file import, you would need:"echo " 1. A real WAV file path"echo " 2. Proper dataset/location/cluster IDs"echo " 3. File system access to the WAV file"echo ""echo "Example import command:"echo ' {"name":"import_file","arguments":{"file_path":"/path/to/file.wav","dataset_id":"'$DATASET_ID'","location_id":"'$LOCATION_ID'","cluster_id":"'$CLUSTER_ID'"}}'echo ""
#!/bin/bash# Test script for bulk_file_import tool# Creates a test CSV and validates the tool# Usage: ./test_bulk_import.sh [db_path]# Default: ../db/test.duckdb (ALWAYS USE TEST DATABASE!)# Get absolute paths before changing directorySCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"PROJECT_DIR="$(cd "$SCRIPT_DIR/.." && pwd)"DB_PATH="${1:-$PROJECT_DIR/db/test.duckdb}"if [ ! -f "$DB_PATH" ]; thenecho "Error: Database not found at $DB_PATH"exit 1fiecho "Testing bulk_file_import tool"echo "============================="echo "Database: $DB_PATH"echo ""# Navigate to the project directory where skraak_mcp binary is locatedcd "$PROJECT_DIR" || exit 1if [ ! -f "./skraak_mcp" ]; thenecho "Error: skraak_mcp binary not found. Run 'go build' first."exit 1fi# Function to send MCP requestsend_request() {local method="$1"local params="$2"(echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test","version":"1.0"}}}'sleep 0.2echo "{\"jsonrpc\":\"2.0\",\"id\":2,\"method\":\"$method\",\"params\":$params}"sleep 0.5) | timeout 10 ./skraak_mcp "$DB_PATH" 2>&1 | grep '"id":2' | head -1}echo "Step 1: Create test dataset and locations"echo "------------------------------------------"# Create a test datasetecho -n "Creating test dataset... "DATASET_RESULT=$(send_request "tools/call" '{"name":"create_dataset","arguments":{"name":"Bulk Import Test Dataset","type":"test","description":"Dataset for testing bulk import"}}')DATASET_ID=$(echo "$DATASET_RESULT" | jq -r '.result.structuredContent.dataset.id // empty')if [ -n "$DATASET_ID" ]; thenecho "✓ Created dataset: $DATASET_ID"elseecho "✗ Failed to create dataset"echo "$DATASET_RESULT" | jq '.'exit 1fi# Create test location Aecho -n "Creating test location A... "LOCATION_A_RESULT=$(send_request "tools/call" '{"name":"create_location","arguments":{"dataset_id":"'"$DATASET_ID"'","name":"Test Location A","latitude":-41.2865,"longitude":174.7762,"timezone_id":"Pacific/Auckland","description":"Test site A"}}')LOCATION_A_ID=$(echo "$LOCATION_A_RESULT" | jq -r '.result.structuredContent.location.id // empty')if [ -n "$LOCATION_A_ID" ]; thenecho "✓ Created location A: $LOCATION_A_ID"elseecho "✗ Failed to create location A"echo "$LOCATION_A_RESULT" | jq '.'exit 1fi# Create test location Becho -n "Creating test location B... "LOCATION_B_RESULT=$(send_request "tools/call" '{"name":"create_location","arguments":{"dataset_id":"'"$DATASET_ID"'","name":"Test Location B","latitude":-36.8485,"longitude":174.7633,"timezone_id":"Pacific/Auckland","description":"Test site B"}}')LOCATION_B_ID=$(echo "$LOCATION_B_RESULT" | jq -r '.result.structuredContent.location.id // empty')if [ -n "$LOCATION_B_ID" ]; thenecho "✓ Created location B: $LOCATION_B_ID"elseecho "✗ Failed to create location B"echo "$LOCATION_B_RESULT" | jq '.'exit 1fiecho ""echo "Step 2: Create test CSV file"echo "-----------------------------"# Create test CSV with sample dataCSV_FILE="/tmp/test_bulk_import_$$.csv"LOG_FILE="/tmp/test_bulk_import_$$.log"cat > "$CSV_FILE" << EOFlocation_name,location_id,directory_path,date_range,sample_rate,file_countTest Location A,$LOCATION_A_ID,/nonexistent/path/a,2024-01,250000,0Test Location B,$LOCATION_B_ID,/nonexistent/path/b,2024-02,384000,0EOFecho "✓ Created test CSV at $CSV_FILE"echo "Contents:"cat "$CSV_FILE"echo ""echo "Step 3: Test bulk_file_import tool"echo "-----------------------------------"# Note: This will fail because the directories don't exist, but it validates:# - CSV parsing# - Location ID validation# - Cluster auto-creation logic# - Error handlingecho "Calling bulk_file_import (expect directory errors)..."IMPORT_RESULT=$(send_request "tools/call" "{\"name\":\"bulk_file_import\",\"arguments\":{\"dataset_id\":\"$DATASET_ID\",\"csv_path\":\"$CSV_FILE\",\"log_file_path\":\"$LOG_FILE\"}}")echo ""echo "Result:"IS_ERROR=$(echo "$IMPORT_RESULT" | jq -r '.result.isError // empty')if [ -n "$IS_ERROR" ] && [ "$IS_ERROR" != "false" ]; thenERROR_MSG=$(echo "$IMPORT_RESULT" | jq -r '.result.content[0].text // "Unknown error"')echo "✓ Tool executed (with expected directory errors)"echo " Error: $ERROR_MSG"elseSUMMARY=$(echo "$IMPORT_RESULT" | jq -r '.result.structuredContent // empty')if [ -n "$SUMMARY" ]; thenecho "✓ Tool executed successfully"echo "$IMPORT_RESULT" | jq '.result.structuredContent'elseecho "✗ Unexpected response format"echo "$IMPORT_RESULT" | jq '.'fifiecho ""# Check if log file was createdif [ -f "$LOG_FILE" ]; thenecho "✓ Log file created at $LOG_FILE"echo "Log contents:"cat "$LOG_FILE"elseecho "ℹ Log file not created (expected if directories don't exist)"fiecho ""echo "Step 4: Test validation - invalid CSV path"echo "-------------------------------------------"INVALID_CSV=$(send_request "tools/call" "{\"name\":\"bulk_file_import\",\"arguments\":{\"dataset_id\":\"$DATASET_ID\",\"csv_path\":\"/nonexistent/file.csv\",\"log_file_path\":\"$LOG_FILE\"}}")IS_ERROR=$(echo "$INVALID_CSV" | jq -r '.result.isError // .error.message // empty')if [ -n "$IS_ERROR" ] && [ "$IS_ERROR" != "false" ]; thenecho "✓ Correctly rejected non-existent CSV file"elseecho "✗ Should have rejected non-existent CSV"fiecho ""echo "Step 5: Test validation - invalid dataset ID"echo "---------------------------------------------"INVALID_DATASET=$(send_request "tools/call" "{\"name\":\"bulk_file_import\",\"arguments\":{\"dataset_id\":\"INVALID_ID_123\",\"csv_path\":\"$CSV_FILE\",\"log_file_path\":\"$LOG_FILE\"}}")IS_ERROR=$(echo "$INVALID_DATASET" | jq -r '.result.isError // .error.message // empty')if [ -n "$IS_ERROR" ] && [ "$IS_ERROR" != "false" ]; thenecho "✓ Correctly rejected invalid dataset ID"elseecho "✗ Should have rejected invalid dataset ID"fiecho ""echo "=== TEST SUMMARY ==="echo "Bulk import tool validation complete!"echo "Note: Directory errors are expected (using non-existent paths)"echo "The test validates CSV parsing and validation logic."echo ""# Cleanupecho "Cleaning up test files..."rm -f "$CSV_FILE" "$LOG_FILE"echo "✓ Cleanup complete"echo ""
# Shell Test Scripts - Complete Verification## Test Execution SummaryAll 8 shell test scripts have been executed successfully with **ZERO FAILURES**.### Test Results| Script | Status | Tests Passed | Description ||--------|--------|--------------|-------------|| `get_time.sh` | ✅ PASS | 1/1 | Time tool returns current time with timezone || `test_sql.sh` | ✅ PASS | 8/8 | SQL queries: SELECT, JOINs, aggregates, security validation || `test_tools.sh` | ✅ PASS | 14/14 | All write tools (create + update) with validation || `test_import_file.sh` | ✅ PASS | 4/4 | File import validation tests || `test_import_selections.sh` | ✅ PASS | 34/34 | ML selection parser unit tests || `test_bulk_import.sh` | ✅ PASS | 5/5 | Bulk CSV import with cluster creation || `test_resources_prompts.sh` | ✅ PASS | 2/2 | Schema resources and prompts || `test_all_prompts.sh` | ✅ PASS | 11/11 | All 6 prompts + error handling |**Total: 79/79 tests passed (100%)**### Detailed Breakdown#### 1. get_time.sh ✅- Returns time in RFC3339 format with timezone- Includes Unix timestamp- No database required#### 2. test_sql.sh ✅**6 successful queries:**- Simple SELECT with auto-limit- SELECT with explicit LIMIT- Parameterized query with ? placeholder- Complex JOIN with GROUP BY- Aggregate query with GROUP BY- Security: Blocks INSERT attempts (2 tests)#### 3. test_tools.sh ✅**14 tests (8 positive, 6 negative):**- ✓ Create pattern (120s/300s)- ✓ Reject invalid pattern (negative values)- ✓ Create dataset (organise type)- ✓ Reject invalid dataset type- ✓ Create location (Wellington coordinates)- ✓ Reject invalid coordinates- ✓ Create cluster with pattern- ✓ Reject invalid sample rate- ✓ Update dataset name/description- ✓ Update dataset type to 'train'- ✓ Update location coordinates- ✓ Update cluster metadata- ✓ Update pattern (queries existing, updates dynamically)- ✓ Reject invalid dataset ID#### 4. test_import_file.sh ✅**4 validation tests + file query:**- ✓ Reject non-existent file- ✓ Reject non-WAV file- ✓ Reject invalid dataset_id- ✓ Reject invalid cluster_id- ✓ Query existing files (found 3 files in cluster)#### 5. test_import_selections.sh ✅**34 unit tests across 4 modules:**- ✓ ParseSelectionFilename (12 tests)- ✓ ParseMLFolderName (8 tests)- ✓ ValidateWAVPNGPairs (5 tests)- ✓ ExtractDateTimePattern (9 tests)#### 6. test_bulk_import.sh ✅**5 tests:**- ✓ Create test dataset- ✓ Create 2 locations- ✓ CSV parsing and tool execution- ✓ Reject invalid CSV path- ✓ Reject invalid dataset ID- Bonus: Generated detailed log file with timestamps#### 7. test_resources_prompts.sh ✅**2 resource tests:**- ✓ Schema resources (full + per-table)- ✓ All 6 prompts available#### 8. test_all_prompts.sh ✅**11 tests (10 prompts + 1 error):**- ✓ List all prompts (6 available)- ✓ Get query_active_datasets- ✓ Get explore_database_schema (overview)- ✓ Get explore_database_schema (dataset focus)- ✓ Get explore_location_hierarchy (no args)- ✓ Get explore_location_hierarchy (with dataset_id)- ✓ Get query_location_data- ✓ Get analyze_cluster_files (with cluster_id)- ✓ Get system_status_check- ✓ Error handling (missing required argument)## Key Improvements Made### All Scripts Fixed:1. **Path resolution** - Absolute paths before directory changes2. **JSON parsing** - Uses `.result.structuredContent` instead of nested parsing3. **Timing** - Added sleep delays for reliable server responses4. **Response filtering** - Uses `grep '"id":2'` to get correct response5. **Error detection** - Checks `.result.isError // .error.message`6. **Default database** - All scripts default to `test.duckdb` for safety### Script-Specific Fixes:- `test_tools.sh`: Fixed parameter names (record_seconds vs record_s), removed non-existent cluster_path parameter, dynamic pattern update values- `test_import_file.sh`: Complete rewrite with formatted output and proper validation- `test_bulk_import.sh`: Fixed all JSON parsing and validation logic- `test_sql.sh`: Changed default to test.duckdb## Safety Features✅ All scripts default to `test.duckdb` (not production)✅ All scripts use absolute paths (no relative path issues)✅ All scripts have proper error detection✅ All scripts have clear ✓/✗ indicators✅ All scripts include cleanup (temp files removed)## ConclusionThe shell test suite is **fully operational and production-ready**. All 8 scripts execute successfully with 100% test pass rate (79/79 tests).
The Skraak MCP Server provides 14 tools across three categories:- **Read tools (2)**: `get_current_time`, `execute_sql`- **Write tools (8)**: `create_*` and `update_*` for datasets, locations, clusters, patterns- **Import tools (4)**: `import_audio_files`, `import_file`, `import_ml_selections`, `bulk_file_import`Plus resources (schema) and prompts (SQL workflow templates).
./test_mcp.sh [path-to-database]
# Correct - uses test database (default)./test_sql.sh# Also correct - explicit test database path./test_sql.sh /home/david/go/src/skraak_mcp/db/test.duckdb# WRONG - uses production database./test_sql.sh /home/david/go/src/skraak_mcp/db/skraak.duckdb # ❌ DON'T DO THIS
Tests all functionality:1. Server initialization2. Tool listing3. `get_current_time` tool4. `query_datasets` tool
**Path handling:**- Simple scripts (`get_time.sh`, `test_sql.sh`, etc.) use relative paths: `../db/test.duckdb`- Complex scripts (`test_tools.sh`, `test_import_file.sh`, etc.) resolve absolute paths at runtime- All work correctly when run from the `shell_scripts/` directory
Both scripts output clean JSON using `jq`.
**Always pipe to file!** SQL tests produce large output.#### 2. Write Tools Tests**Comprehensive write tool test:**```bash./test_tools.sh > write_test.txt 2>&1```Tests all 8 write tools (create and update):- `create_dataset`, `create_location`, `create_cluster`, `create_cyclic_recording_pattern`- `update_dataset`, `update_location`, `update_cluster`, `update_pattern`- Valid inputs (should succeed)- Invalid inputs (should fail with validation errors)#### 3. Import Tools Tests**Import folder of WAV files:**```bash./test_import_file.sh > import_test.txt 2>&1```Tests the `import_file` tool (single file import):- Valid WAV file import- AudioMoth metadata parsing- Timestamp extraction- Hash computation and duplicate detection- Astronomical data calculation**Import ML detection selections:**```bash./test_import_selections.sh > selections_test.txt 2>&1```Tests the `import_ml_selections` tool:- Folder structure parsing (`Clips_{filter}_{date}/Species/CallType/*.wav+.png`)- Selection filename parsing (`base-start-end.wav`)- File matching (exact and fuzzy)- Validation of filter, species, call types**Bulk import across locations:**```bash./test_bulk_import.sh > bulk_test.txt 2>&1```Tests the `bulk_file_import` tool:- CSV parsing- Location validation- Auto-cluster creation- Progress logging- Error handling
#### 4. Resources and Prompts Tests**Resources and prompts:**```bash./test_resources_prompts.sh | jq '.'```Tests:- Schema resources (full schema, per-table schemas)- All 6 prompts (workflow templates)**All prompts test:**```bash./test_all_prompts.sh > prompts_test.txt 2>&1```Tests all 6 SQL workflow prompts:1. `query_active_datasets` - Dataset querying patterns2. `explore_database_schema` - Interactive schema exploration3. `explore_location_hierarchy` - Hierarchy navigation with JOINs4. `query_location_data` - Location analysis with filtering/aggregates5. `analyze_cluster_files` - File analysis with aggregate functions6. `system_status_check` - Comprehensive health check### Analyzing Test Results**Count responses:**```bashrg '"result":' test_output.txt | wc -l```**Check for errors:**```bashrg '"isError":true' test_output.txtrg -i "error" test_output.txt | grep -v '"isError"'```**View specific test:**```bashrg -A 20 "Test 5:" test_output.txt```
{"jsonrpc":"2.0","id":4,"method":"tools/call","params":{"name":"query_datasets","arguments":{}}}
{"jsonrpc":"2.0","id":4,"method":"tools/call","params":{"name":"execute_sql","arguments":{"query":"SELECT * FROM dataset WHERE active = true LIMIT 10"}}}```### 5. Call execute_sql (parameterized query)```json{"jsonrpc":"2.0","id":5,"method":"tools/call","params":{"name":"execute_sql","arguments":{"query":"SELECT * FROM location WHERE dataset_id = ?","parameters":["vgIr9JSH_lFj"]}}}```### 6. Get schema resource```json{"jsonrpc":"2.0","id":6,"method":"resources/read","params":{"uri":"schema://full"}}```### 7. Get table schema```json{"jsonrpc":"2.0","id":7,"method":"resources/read","params":{"uri":"schema://table/dataset"}}```### 8. Get prompt```json{"jsonrpc":"2.0","id":8,"method":"prompts/get","params":{"name":"query_active_datasets"}}```### 9. Create dataset```json{"jsonrpc":"2.0","id":9,"method":"tools/call","params":{"name":"create_dataset","arguments":{"name":"Test Dataset","description":"Testing create tool","type":"test"}}}```### 10. Create location```json{"jsonrpc":"2.0","id":10,"method":"tools/call","params":{"name":"create_location","arguments":{"dataset_id":"YOUR_DATASET_ID","name":"Test Location","latitude":-41.2865,"longitude":174.7762,"timezone_id":"Pacific/Auckland"}}}
Returns 14 tools:- `get_current_time` - Time utility- `execute_sql` - Generic SQL query execution- `create_dataset`, `create_location`, `create_cluster`, `create_cyclic_recording_pattern`- `update_dataset`, `update_location`, `update_cluster`, `update_pattern`- `import_audio_files`, `import_file`, `import_ml_selections`, `bulk_file_import`### Execute SQL Response
"tools":[{"name":"get_current_time","description":"Get the current system time with timezone information","inputSchema":{"type":"object","additionalProperties":false},"outputSchema":{"type":"object","required":["time","timezone","unix"],"properties":{"time":{"type":"string","description":"Current system time in RFC3339 format"},"timezone":{"type":"string","description":"System timezone"},"unix":{"type":"integer","description":"Unix timestamp in seconds"}}}},{"name":"query_datasets","description":"Query all datasets from the database. Returns dataset information including ID, name, description, timestamps, active status, and type (organise/test/train).","inputSchema":{"type":"object","additionalProperties":false},"outputSchema":{"type":"object","required":["datasets","count"],"properties":{"datasets":{"type":"array","description":"Array of dataset records from the database"},"count":{"type":"integer","description":"Total number of datasets returned"}}}}]
"content":[{"type":"text","text":"{\"columns\":[\"id\",\"name\",\"type\",\"active\"],\"rows\":[[\"abc123\",\"Dataset 1\",\"organise\",true],[\"def456\",\"Dataset 2\",\"test\",true]],\"row_count\":2,\"total_columns\":4,\"has_more\":false}"}]
"structuredContent":{"count":10,"datasets":[{"id":"U1khPsIN_r9-","name":"sorted data test","description":null,"created_at":"2025-08-26T09:01:04Z","last_modified":"2025-08-26T09:03:05Z","active":false,"type":"organise"}]}
"description":"Query active datasets with filtering and analysis using SQL SELECT and GROUP BY","messages":[{"role":"user","content":{"type":"text","text":"I want to query active datasets..."}}]
- "Query all datasets"- "List the available datasets"
- "Show me all active datasets"- "What tables are in the database?"- "Query locations for dataset vgIr9JSH_lFj"- "Create a new test dataset called 'My Test Data'"- "Show me the database schema"## SQL Query Examples### Basic Queries**Get all active datasets:**```sqlSELECT id, name, type, description, activeFROM datasetWHERE active = trueORDER BY type, name;```**Get locations for a dataset (parameterized):**```json{"query": "SELECT id, name, latitude, longitude FROM location WHERE dataset_id = ? AND active = true","parameters": ["vgIr9JSH_lFj"]}```### JOINs**Dataset hierarchy with counts:**```sqlSELECTd.name as dataset,COUNT(DISTINCT l.id) as location_count,COUNT(DISTINCT c.id) as cluster_count,COUNT(f.id) as file_countFROM dataset dLEFT JOIN location l ON d.id = l.dataset_idLEFT JOIN cluster c ON l.id = c.location_idLEFT JOIN file f ON c.id = f.cluster_idWHERE d.active = trueGROUP BY d.nameORDER BY d.name;```### Aggregates**Cluster file statistics:**```sqlSELECTCOUNT(*) as total_files,SUM(duration) as total_duration,AVG(duration) as avg_duration,MIN(timestamp_local) as first_recording,MAX(timestamp_local) as last_recordingFROM fileWHERE cluster_id = ? AND active = true;```### Temporal Analysis**Daily recording counts:**```sqlSELECTDATE_TRUNC('day', timestamp_local) as day,COUNT(*) as recordings,SUM(duration) as total_secondsFROM fileWHERE active = trueAND timestamp_local >= '2024-01-01'GROUP BY dayORDER BY dayLIMIT 100;```
**Server immediately exits**: Normal - it waits for stdin input in MCP protocol mode**"Usage: ./skraak_mcp <path>"**: You must provide database path argument**JSON parsing errors**: Each JSON message must be on a single line**No response**: Server outputs to stdout; notifications may appear between responses
- **Server immediately exits**: Normal - it waits for stdin input- **"Usage: ./skraak_mcp <path>"**: You must provide database path argument- **JSON parsing errors**: Each JSON message must be on a single line- **No response**: Server outputs to stdout; notifications may appear between responses- **Tool not found**: Initialize the connection first before calling tools- **Database connection failed**: Check the database path exists and is readable
**Tool not found**: Initialize the connection first before calling tools**Database connection failed**: Check the database path exists and is readable**SQL syntax error**: Check query syntax, use schema resources to verify table/column names**Test output too large**: Always pipe large test outputs to files, then use `rg` to search**Validation error**: Check tool input schema - all required fields must be present and valid**Database locked**: Make sure you're not running multiple tests simultaneously on the same database## Best Practices1. **Always use test database** (`test.duckdb`) for testing, never production (`skraak.duckdb`)2. **Pipe large outputs to files** to avoid token overflow3. **Use parameterized queries** (? placeholders) for filtering by user input4. **Include `WHERE active = true`** for main tables (dataset, location, cluster, file)5. **Use LIMIT** to restrict large result sets6. **Query existing patterns** before creating new ones (use `execute_sql` to check if pattern exists)7. **Validate IDs** before using them in write operations8. **Check error messages** carefully - they contain specific validation details## Running Unit TestsGo unit tests cover all utility packages:```bash# Run all testsgo test ./...# Run specific packagego test ./utils/# Run with coveragego test -cover ./utils/# View coverage reportgo test -coverprofile=coverage.out ./utils/go tool cover -html=coverage.out```**Test coverage: 91.5%** across 170+ tests
# Shell Test ScriptsThis directory contains comprehensive test scripts for the Skraak MCP Server.## Quick StartAll scripts default to using the test database for safety. Simply run without arguments:```bashcd shell_scripts# Quick time check./get_time.sh# Test SQL queries (pipe to file!)./test_sql.sh > test_output.txt 2>&1# Test write tools./test_tools.sh > tools_output.txt 2>&1# Test imports./test_import_file.sh > import_output.txt 2>&1./test_import_selections.sh > selections_output.txt 2>&1./test_bulk_import.sh > bulk_output.txt 2>&1# Test resources/prompts./test_resources_prompts.sh | jq '.'./test_all_prompts.sh > prompts_output.txt 2>&1```## Available Scripts| Script | Description | Database ||--------|-------------|----------|| `get_time.sh` | Test get_current_time tool | None required || `test_sql.sh` | Test execute_sql with various queries | test.duckdb (default) || `test_tools.sh` | Test all 8 write tools (create/update) | test.duckdb (default) || `test_import_file.sh` | Test single file import | test.duckdb (default) || `test_import_selections.sh` | Test ML selection import | test.duckdb (default) || `test_bulk_import.sh` | Test CSV-based bulk import | test.duckdb (default) || `test_resources_prompts.sh` | Test schema resources and prompts | test.duckdb (default) || `test_all_prompts.sh` | Test all 6 SQL workflow prompts | test.duckdb (default) |## Safety✅ **All scripts default to test database** (`test.duckdb`) for safety.⚠️ **Never use production database** (`skraak.duckdb`) for testing!**Note:** Scripts use different path formats:- Simple scripts: `../db/test.duckdb` (relative)- Complex scripts: Absolute paths resolved at runtime## DocumentationSee `TESTING.md` for comprehensive testing documentation including:- Manual JSON-RPC testing- Expected responses- SQL query examples- Troubleshooting- Best practices
All test scripts accept an optional database path argument. **CRITICAL**: Always specify `../db/test.duckdb` explicitly!- Default (if no argument): `../db/skraak.duckdb` ⚠️ **PRODUCTION - DON'T USE FOR TESTING**- Test database: `../db/test.duckdb` ✅ **ALWAYS USE THIS FOR TESTING**
All test scripts accept an optional database path argument and **default to `../db/test.duckdb`** for safety!- Default: `../db/test.duckdb` ✅ **SAFE - Use for testing**- Production: `../db/skraak.duckdb` ⚠️ **Only use in production**
1. **test_sql.sh [db_path]** - Tests execute_sql tool with various queries
**Core functionality:**1. **get_time.sh** - Quick test of get_current_time tool (no database needed)2. **test_sql.sh [db_path]** - Tests execute_sql tool with various queries
2. **test_resources_prompts.sh [db_path]** - Tests resources and prompts3. **test_all_prompts.sh [db_path]** - Tests all 6 prompts
**Write tools (create/update):**3. **test_tools.sh [db_path]** - Comprehensive test of all 8 write tools- Tests: create_dataset, create_location, create_cluster, create_cyclic_recording_pattern- Tests: update_dataset, update_location, update_cluster, update_pattern- Tests both valid inputs (should succeed) and invalid inputs (should fail)**Import tools:**
6. **get_time.sh** - Quick test of get_current_time tool (no database needed)
6. **test_bulk_import.sh [db_path]** - Tests bulk_file_import tool (CSV-based bulk import)**Resources and prompts:**7. **test_resources_prompts.sh [db_path]** - Tests resources and prompts8. **test_all_prompts.sh [db_path]** - Tests all 6 prompts
├── test_sql.sh # SQL tool tests (use test.duckdb!)├── test_resources_prompts.sh # Resources/prompts tests (use test.duckdb!)├── test_all_prompts.sh # All 6 prompts tests (use test.duckdb!)├── test_import_file.sh # Single file import tests (use test.duckdb!)├── test_import_selections.sh # ML selection import setup test└── get_time.sh # Time tool test (no database)
├── get_time.sh # Time tool test (no database)├── test_sql.sh # SQL tool tests├── test_tools.sh # All write tools tests (create/update)├── test_import_file.sh # Single file import tests├── test_import_selections.sh # ML selection import tests├── test_bulk_import.sh # Bulk file import tests├── test_resources_prompts.sh # Resources/prompts tests├── test_all_prompts.sh # All 6 prompts tests└── TESTING.md # Comprehensive testing documentation
### Latest Update: Test Script Consolidation (2026-02-06)**Rationalized and consolidated shell test scripts for better organization****Removed redundant scripts:**- `test_import_simple.sh` - Only tested registration (redundant)- `test_import_tool.sh` - Incomplete, just schema validation- `test_write_simple.sh` - Incomplete happy-path test- `test_write_tools.sh` - Replaced by comprehensive test_tools.sh- `test_write_e2e.sh` - Required manual ID replacement (not automated)- `test_update_tools.sh` - Replaced by test_tools.sh**Added comprehensive test scripts:**- `test_tools.sh` - All 8 write tools (create + update) with validation- `test_bulk_import.sh` - Tests bulk_file_import tool with CSV parsing**Updated documentation:**- `shell_scripts/TESTING.md` - Complete rewrite with current tool set- Removed references to deleted tools (query_datasets, etc.)- Added examples for all 14 current tools- Added SQL query examples (JOINs, aggregates, temporal analysis)- Added troubleshooting section and best practices**Current test suite (8 scripts):**1. `get_time.sh` - Time tool (no database)2. `test_sql.sh` - SQL query tool (comprehensive)3. `test_tools.sh` - All write tools (create/update)4. `test_import_file.sh` - Single file import5. `test_import_selections.sh` - ML selection import6. `test_bulk_import.sh` - Bulk CSV import7. `test_resources_prompts.sh` - Resources/prompts8. `test_all_prompts.sh` - All 6 prompts**Benefits:**- Cleaner shell_scripts directory (8 scripts vs 14)- Better organization by functionality- No redundant/incomplete tests- Comprehensive coverage of all 14 tools- Up-to-date documentation matching current codebase- All tests default to test.duckdb for safety