Compare commits

..

15 Commits

Author SHA1 Message Date
Leonid Pershin
b188afd9ab Update workflow to trigger on 'dev' branch instead of 'develop'
All checks were successful
Tests / Run Tests (push) Successful in 3m10s
Tests / Run Tests (pull_request) Successful in 2m52s
SonarQube / Build and analyze (pull_request) Successful in 3m55s
2025-10-23 09:43:15 +03:00
Leonid Pershin
3e3df20d84 Update build workflow to trigger on pull requests instead of pushes
2025-10-23 09:41:27 +03:00
Leonid Pershin
e54d44b581 Enhance workflow configurations by adding timeouts for build, publish, and test jobs
All checks were successful
SonarQube / Build and analyze (pull_request) Successful in 3m27s
Tests / Run Tests (pull_request) Successful in 2m34s
2025-10-23 09:14:15 +03:00
Leonid Pershin
738ae73ebd fix pub
All checks were successful
SonarQube / Build and analyze (pull_request) Successful in 12m33s
Tests / Run Tests (pull_request) Successful in 3m36s
2025-10-23 07:56:42 +03:00
Leonid Pershin
3adbc189eb add
All checks were successful
SonarQube / Build and analyze (pull_request) Successful in 3m10s
Tests / Run Tests (pull_request) Successful in 2m37s
2025-10-22 12:38:16 +03:00
Leonid Pershin
a4bcb78295 add docker pub
All checks were successful
SonarQube / Build and analyze (pull_request) Successful in 3m7s
Tests / Run Tests (pull_request) Successful in 2m27s
2025-10-22 07:09:46 +03:00
Leonid Pershin
9063ddb881 fix issues
All checks were successful
SonarQube / Build and analyze (pull_request) Successful in 3m3s
Tests / Run Tests (pull_request) Successful in 2m29s
2025-10-22 05:42:11 +03:00
Leonid Pershin
594e4a1782 fix tests
All checks were successful
SonarQube / Build and analyze (pull_request) Successful in 3m40s
Tests / Run Tests (pull_request) Successful in 2m30s
2025-10-22 05:16:58 +03:00
Leonid Pershin
c03de646cc Add tests
Some checks failed
SonarQube / Build and analyze (pull_request) Failing after 1m40s
Tests / Run Tests (pull_request) Failing after 1m11s
2025-10-22 04:41:56 +03:00
Leonid Pershin
85515b89e1 fix build
All checks were successful
SonarQube / Build and analyze (pull_request) Successful in 3m3s
Tests / Run Tests (pull_request) Successful in 2m25s
2025-10-22 04:20:39 +03:00
Leonid Pershin
96026fb69e fix sec
Some checks failed
SonarQube / Build and analyze (pull_request) Failing after 2m58s
2025-10-22 04:05:04 +03:00
Leonid Pershin
d71542a0d1 f
Some checks failed
SonarQube / Build and analyze (pull_request) Failing after 2m55s
2025-10-22 03:59:52 +03:00
Leonid Pershin
57652d87e1 fix
Some checks failed
SonarQube / Build and analyze (pull_request) Failing after 2m58s
2025-10-22 03:57:33 +03:00
Leonid Pershin
6a45c04770 fix security hotspots exclusion
Some checks failed
SonarQube / Build and analyze (pull_request) Failing after 3m2s
2025-10-22 03:50:49 +03:00
Leonid Pershin
d9151105e8 add gate
Some checks failed
SonarQube / Build and analyze (pull_request) Failing after 2m56s
2025-10-22 03:42:41 +03:00
11 changed files with 441 additions and 11 deletions

.cursor/rules/default.mdc (new file, 13 lines)

@@ -0,0 +1,13 @@
---
alwaysApply: true
---
MCP gives the assistant access to SonarQube data. Use the tools for:
Finding issues: search_sonar_issues_in_projects
Checking status: get_project_quality_gate_status, get_system_status, get_system_health
Code analysis: analyze_code_snippet, get_raw_source
Working with issues: change_sonar_issue_status
Fetching metrics: get_component_measures, search_metrics
Fetching library documentation: use context7
Do not guess; request the data. Confirm project and issue keys. Act precisely, based on information from SonarQube.
Current project: ChatBot


@@ -0,0 +1,47 @@
---
description: SonarQube MCP Server usage guidelines
globs:
alwaysApply: true
---
These are guidelines for using the SonarQube MCP server.
# Important Tool Guidelines
## Basic usage
- When starting a new task, disable automatic analysis with the `toggle_automatic_analysis` tool if it exists.
- When you are done generating code at the very end of the task, re-enable automatic analysis with the `toggle_automatic_analysis` tool if it exists. Then call the `analyze_file_list` tool if it exists.
## Project Keys
- When a user mentions a project key, use `search_my_sonarqube_projects` first to find the exact project key
- Don't guess project keys - always look them up
## Code Language Detection
- When analyzing code snippets, try to detect the programming language from the code syntax
- If unclear, ask the user or make an educated guess based on syntax
## Branch and Pull Request Context
- Many operations support branch-specific analysis
- If user mentions working on a feature branch, include the branch parameter
- Pull request analysis is available for PR-specific insights
## Code Issues and Violations
- After fixing issues, do not attempt to verify them using `search_sonar_issues_in_projects`, as the server will not yet reflect the updates
# Common Troubleshooting
## Authentication Issues
- SonarQube requires USER tokens (not project tokens)
- When the error `SonarQube answered with Not authorized` occurs, verify the token type
## Project Not Found
- Use `search_my_sonarqube_projects` to confirm available projects
- Check if user has access to the specific project
- Verify project key spelling and format
## Code Analysis Issues
- Ensure programming language is correctly specified
- Remind users that snippet analysis doesn't replace full project scans
- Provide full file content for better analysis results
- Mention that code snippet analysis tool has limited capabilities compared to full SonarQube scans
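The token-type pitfall above can be checked directly before debugging individual MCP tools. A minimal sketch, not part of this change set: it assumes the standard SonarQube `/api/authentication/validate` endpoint, and the host URL and token are placeholders.

```shell
#!/bin/sh
# Hypothetical helper: verify that a SonarQube token is accepted at all.
# A project token and an expired token are both rejected here, so a failure
# only narrows the problem to authentication.
check_sonar_token() {
  url="$1"; token="$2"
  # SonarQube accepts the token as the basic-auth username with empty password
  body=$(curl -s -u "${token}:" "${url}/api/authentication/validate")
  case "$body" in
    *'"valid":true'*) echo "token accepted" ;;
    *) echo "token rejected: $body"; return 1 ;;
  esac
}
```

Running this once against the instance separates "wrong token type" from "project not found" errors, which otherwise produce similar tool failures.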


@@ -1,14 +1,14 @@
 name: SonarQube
 on:
-  push:
+  pull_request:
     branches:
       - master
-  pull_request:
     types: [opened, synchronize, reopened]
 jobs:
   build:
     name: Build and analyze
     runs-on: ubuntu-latest
+    timeout-minutes: 20
     steps:
       - name: Set up JDK 17
         uses: actions/setup-java@v4
@@ -51,4 +51,32 @@ jobs:
           echo "Running tests with coverage..."
           dotnet test /p:CollectCoverage=true /p:CoverletOutputFormat=opencover /p:CoverletOutput=./coverage/ /p:Exclude="[*]*.Migrations.*" /p:ExcludeByFile="**/Migrations/*.cs"
           echo "Ending SonarQube analysis..."
           ~/.sonar/scanner/dotnet-sonarscanner end /d:sonar.token="${{ secrets.SONAR_TOKEN }}"
+      - name: Wait for Quality Gate
+        run: |
+          echo "Waiting for SonarQube Quality Gate result..."
+          sleep 10
+          # Get Quality Gate status using jq for proper JSON parsing
+          RESPONSE=$(curl -s -u "${{ secrets.SONAR_TOKEN }}:" \
+            "${{ secrets.SONAR_HOST_URL }}/api/qualitygates/project_status?projectKey=ChatBot")
+          echo "API Response: $RESPONSE"
+          # Install jq if not available
+          if ! command -v jq &> /dev/null; then
+            sudo apt-get update && sudo apt-get install -y jq
+          fi
+          QUALITY_GATE_STATUS=$(echo "$RESPONSE" | jq -r '.projectStatus.status')
+          echo "Quality Gate Status: $QUALITY_GATE_STATUS"
+          if [ "$QUALITY_GATE_STATUS" != "OK" ]; then
+            echo "❌ Quality Gate failed! Status: $QUALITY_GATE_STATUS"
+            echo "Please check the SonarQube dashboard for details:"
+            echo "${{ secrets.SONAR_HOST_URL }}/dashboard?id=ChatBot"
+            exit 1
+          else
+            echo "✅ Quality Gate passed!"
+          fi
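The fixed `sleep 10` in the quality-gate step can race a slow analysis: if the compute task has not finished, the API may still report the previous result. A hedged alternative, not part of this change set and assuming the same endpoint and token are available, is to poll until the gate reports a terminal status:

```shell
#!/bin/sh
# Hypothetical polling helper for the SonarQube Quality Gate.
# Extract .projectStatus.status without requiring jq; this matches the first
# "status" field on a one-line response, so a production script should still
# prefer jq's '.projectStatus.status'.
parse_gate_status() {
  sed -n 's/.*"status" *: *"\([A-Z]*\)".*/\1/p' | head -n 1
}

# Poll until the gate reports OK or a failing state, up to N attempts.
poll_quality_gate() {
  url="$1"; token="$2"; key="$3"; attempts="${4:-12}"
  i=0
  while [ "$i" -lt "$attempts" ]; do
    status=$(curl -s -u "${token}:" \
      "${url}/api/qualitygates/project_status?projectKey=${key}" \
      | parse_gate_status)
    case "$status" in
      OK) echo "Quality Gate passed"; return 0 ;;
      ERROR|WARN) echo "Quality Gate failed: $status"; return 1 ;;
    esac
    i=$((i + 1))
    sleep 5
  done
  echo "Quality Gate timed out"; return 1
}
```

A cleaner option again is the sonarsource/sonarqube-quality-gate-action, which does this waiting server-side, but the plain-shell version keeps the workflow free of extra actions.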


@@ -0,0 +1,49 @@
name: Publish Docker Image
on:
  push:
    branches:
      - master
jobs:
  publish:
    name: Build and Publish to Harbor
    runs-on: ubuntu-latest
    timeout-minutes: 15
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3
      - name: Log in to Harbor
        uses: docker/login-action@v3
        with:
          registry: harbor.home
          username: robot$chatbot
          password: ${{ secrets.HARBOR_TOKEN }}
      - name: Extract metadata
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: harbor.home/chatbot/chatbot
          tags: |
            type=ref,event=branch
            type=sha,prefix={{branch}}-
            type=raw,value=latest,enable={{is_default_branch}}
      - name: Build and push Docker image
        id: build  # the "Image digest" step below reads steps.build.outputs.digest
        uses: docker/build-push-action@v5
        with:
          context: ./ChatBot
          file: ./ChatBot/Dockerfile
          push: true
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
          cache-from: type=registry,ref=harbor.home/chatbot/chatbot:buildcache
          cache-to: type=registry,ref=harbor.home/chatbot/chatbot:buildcache,mode=max
      - name: Image digest
        run: echo "Image published with digest ${{ steps.build.outputs.digest }}"


@@ -0,0 +1,41 @@
name: Tests
on:
  push:
    branches:
      - master
      - dev
  pull_request:
    types: [opened, synchronize, reopened]
jobs:
  test:
    name: Run Tests
    runs-on: ubuntu-latest
    timeout-minutes: 10
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0
      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: '9.0.x'
      - name: Restore dependencies
        run: dotnet restore --verbosity normal
      - name: Build
        run: dotnet build --configuration Release --no-restore --verbosity normal
      - name: Run tests
        run: dotnet test --configuration Release --no-build --verbosity normal --logger "trx;LogFileName=test-results.trx"
      - name: Test Summary
        if: always()
        run: |
          if [ -f "**/test-results.trx" ]; then
            echo "✅ Tests completed"
          else
            echo "❌ Test results not found"
          fi
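The Test Summary step has a latent bug: `[ -f "**/test-results.trx" ]` tests a literal path named `**/test-results.trx`, because `test -f` does not glob, so the step will always report that results were not found. A hedged sketch of an equivalent check using `find` (a hypothetical helper, not part of the change set):

```shell
#!/bin/sh
# Hypothetical replacement for the summary step's file check:
# locate trx result files anywhere under the working directory.
summarize_trx() {
  dir="${1:-.}"
  # tr strips wc's leading padding so the arithmetic test is safe everywhere
  count=$(find "$dir" -name 'test-results.trx' | wc -l | tr -d ' ')
  if [ "$count" -gt 0 ]; then
    echo "found $count result file(s)"
  else
    echo "no test results found"
    return 1
  fi
}
```

Inside the workflow the body of the `if` would simply become `if find . -name 'test-results.trx' | grep -q .; then`.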


@@ -3,10 +3,11 @@ trigger: always_on
 ---
 MCP gives the assistant access to SonarQube data. Use the tools for:
-Finding issues: search_sonar_issues_in_projects, search_dependency_risks
+Finding issues: search_sonar_issues_in_projects
 Checking status: get_project_quality_gate_status, get_system_status, get_system_health
 Code analysis: analyze_code_snippet, get_raw_source
 Working with issues: change_sonar_issue_status
 Fetching metrics: get_component_measures, search_metrics
+Fetching library documentation: use context7
 Do not guess; request the data. Confirm project and issue keys. Act precisely, based on information from SonarQube.
 Current project: ChatBot


@@ -0,0 +1,240 @@
using ChatBot.Models;
using ChatBot.Models.Dto;
using ChatBot.Services.Interfaces;
using FluentAssertions;
using Moq;
using OllamaSharp.Models.Chat;
using System.Collections.Concurrent;

namespace ChatBot.Tests.Models;

public class ChatSessionCompressionTests
{
    [Fact]
    public async Task CompressHistoryAsync_ShouldCompressMessages_WhenCompressionServiceAvailable()
    {
        // Arrange
        var session = new ChatSession();
        var compressionServiceMock = new Mock<IHistoryCompressionService>();
        session.SetCompressionService(compressionServiceMock.Object);

        // Setup compression service to return compressed messages
        var compressedMessages = new List<ChatMessage>
        {
            new ChatMessage { Role = ChatRole.System.ToString(), Content = "System prompt" },
            new ChatMessage { Role = ChatRole.User.ToString(), Content = "Compressed user message" }
        };
        compressionServiceMock
            .Setup(x => x.CompressHistoryAsync(It.IsAny<List<ChatMessage>>(), It.IsAny<int>(), It.IsAny<CancellationToken>()))
            .ReturnsAsync(compressedMessages);
        compressionServiceMock
            .Setup(x => x.ShouldCompress(It.IsAny<int>(), It.IsAny<int>()))
            .Returns(true);

        // Add messages to session
        for (int i = 0; i < 10; i++)
        {
            session.AddMessage(new ChatMessage { Role = ChatRole.User, Content = $"Message {i}" });
        }

        // Act
        await session.AddMessageWithCompressionAsync(
            new ChatMessage { Role = ChatRole.User, Content = "New message" },
            compressionThreshold: 5,
            compressionTarget: 2
        );

        // Assert
        var messages = session.GetAllMessages();
        messages.Should().HaveCount(2);
        messages[0].Role.Should().Be(ChatRole.System);
        messages[1].Role.Should().Be(ChatRole.User);
        messages[1].Content.Should().Be("Compressed user message");
    }

    [Fact]
    public async Task CompressHistoryAsync_ShouldFallbackToTrimming_WhenCompressionFails()
    {
        // Arrange
        var session = new ChatSession { MaxHistoryLength = 3 };
        var compressionServiceMock = new Mock<IHistoryCompressionService>();
        session.SetCompressionService(compressionServiceMock.Object);

        // Setup compression service to throw an exception
        var exception = new Exception("Compression failed");
        compressionServiceMock
            .Setup(x => x.CompressHistoryAsync(It.IsAny<List<ChatMessage>>(), It.IsAny<int>(), It.IsAny<CancellationToken>()))
            .ThrowsAsync(exception);
        compressionServiceMock
            .Setup(x => x.ShouldCompress(It.IsAny<int>(), It.IsAny<int>()))
            .Returns(true);

        // Add messages to session
        for (int i = 0; i < 5; i++)
        {
            session.AddMessage(new ChatMessage { Role = ChatRole.User, Content = $"Message {i}" });
        }

        // Act
        await session.AddMessageWithCompressionAsync(
            new ChatMessage { Role = ChatRole.User, Content = "New message" },
            compressionThreshold: 3,
            compressionTarget: 2
        );

        // Assert - Should fall back to simple trimming
        var messages = session.GetAllMessages();
        messages.Should().HaveCount(3);
    }

    [Fact]
    public async Task AddMessageWithCompressionAsync_ShouldNotCompress_WhenBelowThreshold()
    {
        // Arrange
        var session = new ChatSession();
        var compressionServiceMock = new Mock<IHistoryCompressionService>();
        session.SetCompressionService(compressionServiceMock.Object);

        // Setup compression service to return false for ShouldCompress when count is below threshold
        compressionServiceMock
            .Setup(x => x.ShouldCompress(It.Is<int>(c => c < 5), It.Is<int>(t => t == 5)))
            .Returns(false);

        // Add messages to session (below threshold)
        session.AddMessage(new ChatMessage { Role = ChatRole.User, Content = "Message 1" });
        session.AddMessage(new ChatMessage { Role = ChatRole.Assistant, Content = "Response 1" });

        // Act - Set threshold higher than current message count
        await session.AddMessageWithCompressionAsync(
            new ChatMessage { Role = ChatRole.User, Content = "Message 2" },
            compressionThreshold: 5,
            compressionTarget: 2
        );

        // Assert - Should not call compression service
        compressionServiceMock.Verify(
            x => x.CompressHistoryAsync(It.IsAny<List<ChatMessage>>(), It.IsAny<int>(), It.IsAny<CancellationToken>()),
            Times.Never
        );
        var messages = session.GetAllMessages();
        messages.Should().HaveCount(3);
    }

    [Fact]
    public async Task AddMessageWithCompressionAsync_ShouldHandleConcurrentAccess()
    {
        // Arrange
        var session = new ChatSession();
        var compressionServiceMock = new Mock<IHistoryCompressionService>();
        session.SetCompressionService(compressionServiceMock.Object);

        // Setup compression service to simulate processing time
        var delayedResult = new List<ChatMessage>
        {
            new ChatMessage { Role = ChatRole.System.ToString(), Content = "Compressed" }
        };
        compressionServiceMock
            .Setup(x => x.CompressHistoryAsync(It.IsAny<List<ChatMessage>>(), It.IsAny<int>(), It.IsAny<CancellationToken>()))
            .Returns(async (List<ChatMessage> messages, int target, CancellationToken ct) =>
            {
                await Task.Delay(50, ct);
                return delayedResult;
            });
        compressionServiceMock
            .Setup(x => x.ShouldCompress(It.IsAny<int>(), It.IsAny<int>()))
            .Returns(true);

        var tasks = new List<Task>();
        int messageCount = 5;

        // Act - Start multiple concurrent operations
        for (int i = 0; i < messageCount; i++)
        {
            tasks.Add(
                session.AddMessageWithCompressionAsync(
                    new ChatMessage { Role = ChatRole.User, Content = $"Message {i}" },
                    compressionThreshold: 2,
                    compressionTarget: 1
                )
            );
        }

        // Wait for all operations to complete
        await Task.WhenAll(tasks);

        // Assert - Should handle concurrent access without exceptions
        // and maintain thread safety
        session.GetMessageCount().Should().Be(1);
    }

    [Fact]
    public void SetCompressionService_ShouldNotThrow_WhenCalledMultipleTimes()
    {
        // Arrange
        var session = new ChatSession();
        var compressionService1 = new Mock<IHistoryCompressionService>().Object;
        var compressionService2 = new Mock<IHistoryCompressionService>().Object;

        // Act & Assert
        session.Invoking(s => s.SetCompressionService(compressionService1)).Should().NotThrow();
        // Should not throw when setting a different service
        session.Invoking(s => s.SetCompressionService(compressionService2)).Should().NotThrow();
    }

    [Fact]
    public async Task CompressHistoryAsync_ShouldPreserveSystemMessage_WhenCompressing()
    {
        // Arrange
        var session = new ChatSession();
        var compressionServiceMock = new Mock<IHistoryCompressionService>();
        session.SetCompressionService(compressionServiceMock.Object);

        // Setup compression service to preserve system message
        compressionServiceMock
            .Setup(x => x.CompressHistoryAsync(It.IsAny<List<ChatMessage>>(), It.IsAny<int>(), It.IsAny<CancellationToken>()))
            .Returns((List<ChatMessage> messages, int target, CancellationToken ct) =>
            {
                var systemMessage = messages.FirstOrDefault(m => m.Role == ChatRole.System.ToString());
                var compressed = new List<ChatMessage>();
                if (systemMessage != null)
                {
                    compressed.Add(systemMessage);
                }
                compressed.Add(new ChatMessage
                {
                    Role = ChatRole.User.ToString(),
                    Content = "Compressed user messages"
                });
                return Task.FromResult(compressed);
            });
        compressionServiceMock
            .Setup(x => x.ShouldCompress(It.IsAny<int>(), It.IsAny<int>()))
            .Returns(true);

        // Add system message and some user messages
        session.AddMessage(new ChatMessage { Role = ChatRole.System, Content = "System prompt" });
        for (int i = 0; i < 10; i++)
        {
            session.AddMessage(new ChatMessage { Role = ChatRole.User, Content = $"Message {i}" });
        }

        // Act
        await session.AddMessageWithCompressionAsync(
            new ChatMessage { Role = ChatRole.User, Content = "New message" },
            compressionThreshold: 5,
            compressionTarget: 2
        );

        // Assert - System message should be preserved
        var messages = session.GetAllMessages();
        messages.Should().HaveCount(2);
        messages[0].Role.Should().Be(ChatRole.System);
        messages[0].Content.Should().Be("System prompt");
        messages[1].Content.Should().Be("Compressed user messages");
    }
}


@@ -359,7 +359,7 @@ public class ChatSessionTests
         var session = new ChatSession();
         session.AddUserMessage("Test", "user");
         var lastUpdated = session.LastUpdatedAt;
-        await Task.Delay(10); // Small delay
+        await Task.Delay(10, CancellationToken.None); // Small delay
         // Act
         session.ClearHistory();


@@ -232,7 +232,7 @@ public class InMemorySessionStorageTests
         var originalTime = session.LastUpdatedAt;
         // Wait a bit to ensure time difference
-        await Task.Delay(10);
+        await Task.Delay(10, CancellationToken.None);
         session.ChatTitle = "Updated Title";


@@ -22,14 +22,19 @@ RUN dotnet publish -c Release -o /app/publish /p:UseAppHost=false
 FROM mcr.microsoft.com/dotnet/aspnet:9.0 AS final
 WORKDIR /app
-# Install PostgreSQL client for healthcheck (optional)
-RUN apt-get update && apt-get install -y postgresql-client && rm -rf /var/lib/apt/lists/*
+# Install PostgreSQL client, create user, and prepare directories
+RUN apt-get update && apt-get install -y --no-install-recommends postgresql-client && rm -rf /var/lib/apt/lists/* \
+    && groupadd -r appuser && useradd -r -g appuser appuser \
+    && mkdir -p /app/logs
-# Copy published application
+# Copy published application (safe: only contains compiled output from dotnet publish)
 COPY --from=publish /app/publish .
-# Create directory for logs
-RUN mkdir -p /app/logs && chmod 777 /app/logs
+# Set ownership after copying files
+RUN chown -R appuser:appuser /app
+# Switch to non-root user
+USER appuser
 # Expose ports (if needed for health checks or metrics)
 EXPOSE 8080
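The Dockerfile change above switches the runtime stage to a dedicated `appuser`. A quick way to verify the switch took effect is to ask the container for its uid; a hedged sketch (the image tag is an assumption based on the publish workflow, and the commented command requires a Docker daemon):

```shell
#!/bin/sh
# Hypothetical smoke check: the container should not report uid 0.
assert_nonroot() {
  uid="$1"
  if [ "$uid" != "0" ]; then
    echo "runs as non-root (uid $uid)"
  else
    echo "still root"
    return 1
  fi
}

# Example invocation (needs Docker and access to the registry):
# assert_nonroot "$(docker run --rm harbor.home/chatbot/chatbot:latest id -u)"
```

Since `useradd -r` assigns a system uid, any non-zero value here confirms the `USER appuser` line is in effect.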


@@ -4,6 +4,12 @@
 [![License](https://img.shields.io/badge/license-MIT-green)](LICENSE.txt)
 [![PostgreSQL](https://img.shields.io/badge/PostgreSQL-14+-blue)](https://www.postgresql.org/)
+[![Quality Gate Status](https://sonarqube.api.home/api/project_badges/measure?project=ChatBot&metric=alert_status)](https://sonarqube.api.home/dashboard?id=ChatBot)
+[![Coverage](https://sonarqube.api.home/api/project_badges/measure?project=ChatBot&metric=coverage)](https://sonarqube.api.home/dashboard?id=ChatBot)
+[![Bugs](https://sonarqube.api.home/api/project_badges/measure?project=ChatBot&metric=bugs)](https://sonarqube.api.home/dashboard?id=ChatBot)
+[![Vulnerabilities](https://sonarqube.api.home/api/project_badges/measure?project=ChatBot&metric=vulnerabilities)](https://sonarqube.api.home/dashboard?id=ChatBot)
+[![Code Smells](https://sonarqube.api.home/api/project_badges/measure?project=ChatBot&metric=code_smells)](https://sonarqube.api.home/dashboard?id=ChatBot)
 An intelligent Telegram bot powered by local AI models (Ollama), built on .NET 9 with Clean Architecture.
 ## ✨ Key Features