Add moderation features for user reporting and banning #23
Conversation
- Introduces content moderation capabilities, including reporting inappropriate messages, banning users based on report thresholds, and clearing user reports in chat rooms.
- Adds new handlers, services, and repositories to manage these features.
- Updates WebSocket logic to prevent banned users from sending messages and modifies room models to track reported users.
- Enhances the API with moderation-specific endpoints and integrates moderation checks into existing workflows.
- Introduces success confirmation messages when users join rooms or direct chats via WebSocket.
- Adds a check to prevent users from reporting the same message multiple times.
- Updates moderation logic to enforce the maximum report threshold dynamically.
Pull Request Overview
This PR adds moderation capabilities to the chat application by introducing reporting, banning, and report-clearing functionality.
- Added a `ModerationService` with methods to report messages, fetch banned users, clear reports, and enforce report-based bans.
- Introduced a `ModerationHandler` and new API routes (`/report`, `/banned-users`, `/clear-reports`) with corresponding request/response models.
- Updated WebSocket logic to prevent banned users from sending messages and to notify users of successful joins, and extended the documentation (`swagger.yaml`, `swagger.json`, `docs.go`) with these endpoints.
Reviewed Changes
Copilot reviewed 14 out of 14 changed files in this pull request and generated 4 comments.
| File | Description |
|---|---|
| internal/services/room_service.go | Added IsUserAdminOrOwner and left contains helper unused |
| internal/services/moderation_service.go | New moderation logic for reporting, banning, and clearing reports |
| internal/routes/router.go | Registered moderation endpoints in the router |
| internal/repositories/report_repository.go | New repository methods for report CRUD operations |
| internal/repositories/message_repository.go | Added GetMessageByID method |
| internal/pkg/websocket/websocket.go | Enforced ban checks in WebSocket, added success message type |
| internal/pkg/websocket/hub.go | Injected ReportRepository into the WebSocket hub |
| internal/models/room.go | Added ReportedUsers map to Room model |
| internal/models/report.go | Defined Report, ReportRequest, ClearReportRequest, and ban response models |
| internal/handlers/moderation_handler.go | Implemented HTTP handlers for report, banned-user, and clear-report endpoints |
| docs/swagger.yaml, swagger.json, docs.go | Extended API docs with moderation endpoints and models |
| cmd/api/main.go | Wired new moderation service and handler into application |
Comments suppressed due to low confidence (2)
internal/handlers/moderation_handler.go:77
- Domain-specific errors (e.g. "message not found" or "already reported") should map to appropriate HTTP status codes (404 or 400) instead of always returning 500.
```go
http.Error(w, err.Error(), http.StatusInternalServerError)
```
docs/swagger.yaml:526
- The `summary` key is declared twice under this path (once at line 494 and again here), creating redundancy; remove the extra declaration to maintain clarity.

```yaml
summary: Get banned users in a room
```
```go
}

// Helper to check whether a slice contains a value
func contains(slice []string, value string) bool {
```
Copilot AI · Jun 22, 2025
[nitpick] The contains helper is defined but not used; consider either using it in IsUserAdminOrOwner to check admin membership or removing it to keep the codebase clean.
```go
}

// Verificar si el usuario es parte de la sala antes de enviar el mensaje
} // Verificar si el usuario es parte de la sala antes de enviar el mensaje
```
Copilot AI · Jun 22, 2025
[nitpick] Inline comment after a closing brace can reduce readability; consider moving the comment above the } or translating it consistently into English.
Suggested change:

```diff
- } // Verificar si el usuario es parte de la sala antes de enviar el mensaje
+ // Check if the user is part of the room before sending the message
+ }
```
```go
	continue
}

// Check if the user is banned from sending messages due to reports
```
Copilot AI · Jun 22, 2025
This ban-enforcement logic duplicates service behavior; extract it into a shared method (e.g. CanUserSendMessageInRoom) to avoid duplication and ease testing.
```go
// Validate that the message exists
message, err := s.messageRepo.GetMessageByID(roomID, messageID)
if err != nil {
	return fmt.Errorf("message not found: %v", err)
```
Copilot AI · Jun 22, 2025
Use fmt.Errorf("message not found: %w", err) to wrap the original error and preserve its type for inspection.
Suggested change:

```diff
- return fmt.Errorf("message not found: %v", err)
+ return fmt.Errorf("message not found: %w", err)
```
This pull request introduces a moderation feature to the chat application, including new services, handlers, and API endpoints to manage reported and banned users. The changes also update the API documentation to reflect the new functionality.
Moderation Feature Integration:
- Added new services (`NewModerationService`) and handlers (`NewModerationHandler`) for moderation functionality, along with a repository for managing reports (`NewReportRepository`), wired up in `cmd/api/main.go`.

API Endpoints for Moderation:
- Added a `GET /chat/rooms/{roomId}/banned-users` endpoint to retrieve the list of banned users in a chat room.
- Added a `POST /chat/rooms/{roomId}/clear-reports` endpoint to clear all reports for a specific user in a chat room.
- Added a `POST /chat/rooms/{roomId}/report` endpoint to report inappropriate messages in a chat room.

API Documentation Updates:
- Updated `docs/docs.go`, `docs/swagger.json`, and `docs/swagger.yaml` to include the new endpoints and their details, such as descriptions, parameters, and response schemas.

Data Models for Moderation:
- Added `BannedUserResponse` and `BannedUsersResponse` to represent banned users.
- Added `ClearReportRequest` for clearing user reports.
- Added `ReportRequest` for reporting messages.
- Updated the `Room` model to include a `reportedUsers` field mapping user IDs to report counts.