Compare commits

289 Commits
v1.0.2 ... main

Author SHA1 Message Date
Marco Lanzara
182d98de0d 🚀 Release v1.0.103
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2026-01-02 16:33:13
2026-01-02 16:33:13 +00:00
marco370
a1be759431 Add public IP address lists for Microsoft and Meta to the application
Add new parsers for Microsoft Azure and Meta IP ranges, map them in PARSERS, and include a SQL migration script to add these lists to the database.

2026-01-02 16:32:28 +00:00
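As a rough illustration of the commit above, new source parsers are typically added as functions and then mapped in the `PARSERS` registry; the function names, JSON fields, and registry keys below are assumptions, not the project's confirmed code:

```python
# Hypothetical sketch of registering the new list parsers; names are assumed.
import json
from ipaddress import ip_network

def parse_azure_service_tags(payload: str) -> list[str]:
    # Assumes the Azure Service Tags JSON layout:
    # {"values": [{"properties": {"addressPrefixes": [...]}}, ...]}
    data = json.loads(payload)
    prefixes: list[str] = []
    for value in data.get("values", []):
        prefixes.extend(value.get("properties", {}).get("addressPrefixes", []))
    return [str(ip_network(p, strict=False)) for p in prefixes]

def parse_plain_cidr_list(payload: str) -> list[str]:
    # One CIDR per line, '#' for comments (a common format for published ranges).
    return [str(ip_network(line.strip(), strict=False))
            for line in payload.splitlines()
            if line.strip() and not line.lstrip().startswith("#")]

PARSERS = {
    "microsoft": parse_azure_service_tags,  # assumed registry keys
    "meta": parse_plain_cidr_list,
}
```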
Marco Lanzara
f404952e0e 🚀 Release v1.0.102
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2026-01-02 16:16:47
2026-01-02 16:16:47 +00:00
marco370
0a269a9032 Update list fetching to correctly parse Google IP ranges
Add 'google' as an alias for GCPParser in `python_ml/list_fetcher/parsers.py` to resolve issues with parsing Google Cloud and Google global IP lists.

2026-01-02 16:16:15 +00:00
Marco Lanzara
1133ca356f 🚀 Release v1.0.101
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2026-01-02 16:14:47
2026-01-02 16:14:47 +00:00
marco370
aa74340706 Implement pagination and server-side search for detection records
Update the client side to handle pagination and a debounced search input, and refactor the server API routes and storage to support offset, limit, and search queries on detection records.

2026-01-02 16:07:07 +00:00
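A minimal sketch of the offset/limit/search query shape this commit describes, assuming a PostgreSQL `detections` table with the column names that appear elsewhere in this log:

```python
# Hedged sketch; table and column names are assumptions based on this log.
def fetch_detections(cur, offset: int = 0, limit: int = 50, search: str | None = None):
    sql = "SELECT source_ip, risk_score, anomaly_type, detected_at FROM detections"
    params: list[object] = []
    if search:
        # Server-side search; ILIKE keeps the match case-insensitive.
        sql += " WHERE source_ip::text ILIKE %s OR anomaly_type ILIKE %s"
        params += [f"%{search}%", f"%{search}%"]
    sql += " ORDER BY detected_at DESC LIMIT %s OFFSET %s"
    params += [limit, offset]
    cur.execute(sql, params)
    return cur.fetchall()
```

On the client side, the debounced input would simply delay updating the `search` value until typing pauses.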
Marco Lanzara
051c5ee4a5 🚀 Release v1.0.100
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2026-01-02 15:51:11
2026-01-02 15:51:11 +00:00
marco370
a15d4d660b Saved progress at the end of the loop
2026-01-02 15:50:34 +00:00
marco370
dee64495cd Add ability to manually unblock IPs and improve API key handling
Add a "Unblock Router" button to the Detections page and integrate ML backend API key for authenticated requests.

2026-01-02 15:50:17 +00:00
marco370
16d13d6bee Add ability to automatically unblock IPs when added to whitelist
Add an endpoint to proxy IP unblocking requests to the ML backend and implement automatic unblocking from routers when an IP is added to the whitelist.

2026-01-02 15:46:56 +00:00
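The whitelist-triggered unblock flow might look roughly like this; the endpoint path, header name, and payload shape are assumptions rather than the project's documented API:

```python
import httpx

async def unblock_ip_on_routers(ip: str, ml_backend_url: str, api_key: str) -> None:
    # Called after an IP is added to the whitelist; proxies the unblock
    # request to the ML backend, which talks to the routers.
    async with httpx.AsyncClient(timeout=10.0) as client:
        resp = await client.post(
            f"{ml_backend_url}/unblock",        # assumed endpoint path
            json={"ip": ip},                    # assumed payload shape
            headers={"X-API-Key": api_key},     # assumed header for the API key
        )
        resp.raise_for_status()
```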
marco370
a4bf75394a Add ability to trigger manual IP blocking and detection
Add a curl command to manually trigger IP detection and blocking with specific parameters.

2026-01-02 15:44:20 +00:00
Marco Lanzara
58fb6476c5 🚀 Release v1.0.99
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2026-01-02 15:39:39
2026-01-02 15:39:39 +00:00
marco370
1b47e08129 Add search functionality to the whitelist page and improve IP status indication
Add a search bar to the whitelist page and filter results by IP, reason, and notes. Modify the detections page to visually indicate when an IP is already whitelisted by changing the button color to green and using a different icon.

2026-01-02 15:37:32 +00:00
Marco Lanzara
0298b4a790 🚀 Release v1.0.98
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2026-01-02 15:20:02
2026-01-02 15:20:02 +00:00
marco370
a311573d0c Fix errors in IP detection and merge logic by correcting data types
Addresses type mismatches in `risk_score` handling and INET comparisons within `merge_logic.py`, ensuring correct data insertion and IP range analysis.

2026-01-02 15:19:26 +00:00
Marco Lanzara
21ff8c0c4b 🚀 Release v1.0.97
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2026-01-02 14:50:15
2026-01-02 14:50:15 +00:00
marco370
d966d26784 Saved progress at the end of the loop
2026-01-02 14:46:04 +00:00
marco370
73ad653cb0 Update database management to version 8 for improved network matching
Update database management to version 8, forcing INET/CIDR column types for network range matching and recreating INET columns to resolve type mismatches.

2026-01-02 14:45:44 +00:00
marco370
3574ff0274 Update database schema and migrations to correctly handle IP address data types
Introduce migration 008 to force INET and CIDR types for IP-related columns in `whitelist` and `public_blacklist_ips` tables, and update `shared/schema.ts` with comments clarifying production type handling.

2026-01-02 14:44:54 +00:00
marco370
0301a42825 Update IP address parsing to ensure uniqueness and fix duplicates
Update the `normalize_cidr` function in `parsers.py` to use the full CIDR notation as the IP address for uniqueness, fixing duplicate entry errors during the Spamhaus IP sync. Proper IP type handling also resolves the `operator does not exist: inet = text` error on the `whitelist` table.

2026-01-02 11:56:47 +00:00
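A minimal sketch of the described `normalize_cidr` behavior: keying entries by the full CIDR string means `1.2.3.0/24` and `1.2.3.0/25` no longer collide. The exact signature is an assumption:

```python
from ipaddress import ip_network

def normalize_cidr(entry: str) -> str:
    # The normalized CIDR string itself serves as the unique IP key;
    # bare addresses become /32 (IPv4) or /128 (IPv6) networks.
    return str(ip_network(entry.strip(), strict=False))
```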
Marco Lanzara
278bc6bd61 🚀 Release v1.0.96
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2026-01-02 11:49:34
2026-01-02 11:49:34 +00:00
marco370
3425521215 Update list fetching to handle new Spamhaus format and IP matching
Update Spamhaus parser to support NDJSON format and fix IP matching errors by ensuring database migrations are applied.

2026-01-02 11:48:33 +00:00
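NDJSON is one JSON object per line, so the updated parser presumably iterates line by line; the `cidr` field name below matches Spamhaus's published DROP format but is an assumption here:

```python
import json

def parse_spamhaus_ndjson(payload: str) -> list[str]:
    ranges: list[str] = []
    for line in payload.splitlines():
        line = line.strip()
        if not line:
            continue
        record = json.loads(line)  # one JSON object per line (NDJSON)
        if "cidr" in record:
            ranges.append(record["cidr"])
    return ranges
```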
marco370
c3a6f28434 Add idempotency to database migrations and fix data type issues
Modify database migrations to use `IF NOT EXISTS` for index creation and adjust column types from TEXT to INET to resolve data type conflicts.

2026-01-02 11:38:49 +00:00
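The idempotent-migration pattern in question, sketched as SQL executed from Python; the `whitelist.ip` column name is an assumption consistent with other commits in this log:

```python
import psycopg2

MIGRATION = """
ALTER TABLE whitelist ALTER COLUMN ip TYPE INET USING ip::inet;
CREATE INDEX IF NOT EXISTS idx_whitelist_ip ON whitelist (ip);
"""

def apply_migration(dsn: str) -> None:
    # Safe to re-run: the cast is a no-op once the column is already INET,
    # and IF NOT EXISTS skips the index when it already exists.
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(MIGRATION)
```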
Marco Lanzara
c0b2342c43 🚀 Release v1.0.95
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-27 18:29:37
2025-11-27 18:29:37 +00:00
marco370
6ad718c51f Update database to correctly handle IP address and CIDR data types
Modify migration script 007_add_cidr_support.sql to ensure correct data types for IP addresses and CIDR ranges in the database, resolving issues with existing TEXT columns and ensuring proper indexing.

2025-11-27 18:27:10 +00:00
Marco Lanzara
505b7738bf 🚀 Release v1.0.94
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-27 18:25:47
2025-11-27 18:25:47 +00:00
marco370
2b24323f7f Make database migrations more robust and repeatable
Update SQL migration scripts to be idempotent, ensuring they can be run multiple times without errors by using `IF NOT EXISTS` clauses for index and column creation.

2025-11-27 18:24:30 +00:00
Marco Lanzara
3e35032d79 🚀 Release v1.0.93
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-27 18:21:43
2025-11-27 18:21:43 +00:00
marco370
bb5d14823f Saved progress at the end of the loop
2025-11-27 18:21:08 +00:00
marco370
e6db06e597 Improve database migration search and add service installation
Enhance database migration search logic to include the deployment directory and add a new script to install the ids-list-fetcher systemd service.

2025-11-27 18:20:51 +00:00
marco370
a08c4309a8 Saved progress at the end of the loop
2025-11-27 18:10:39 +00:00
Marco Lanzara
584c25381c 🚀 Release v1.0.92
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-27 18:03:43
2025-11-27 18:03:43 +00:00
marco370
b31bad7d8b Implement list synchronization by fetching and saving IP addresses
Adds IP fetching logic to server routes and implements upsert functionality for blacklist IPs in the database storage layer.

2025-11-27 18:02:24 +00:00
Marco Lanzara
4754cfd98a 🚀 Release v1.0.91
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-26 15:29:20
2025-11-26 15:29:20 +00:00
marco370
54d919dc2d Fix how public lists are managed by correcting API request parameters
Corrected the order of parameters in multiple `apiRequest` calls within `PublicLists.tsx` from `(url, { method, body })` to `(method, url, data)` to align with the function's expected signature.

2025-11-26 15:27:19 +00:00
Marco Lanzara
0cf5899ec1 🚀 Release v1.0.90
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-26 10:13:51
2025-11-26 10:13:51 +00:00
marco370
1a210a240c Saved progress at the end of the loop
2025-11-26 09:55:43 +00:00
marco370
83468619ff Add full CIDR support for IP address matching in lists
Updates IP address handling to include CIDR notation for more comprehensive network range matching, enhances database schema with INET/CIDR types, and refactors logic for accurate IP detection and whitelisting.

2025-11-26 09:54:57 +00:00
marco370
5952142a56 Add public lists integration with exact IP matching
Update the merge logic to use exact IP matching for public lists, and add deployment scripts and documentation of the approach's limitations.

2025-11-26 09:45:55 +00:00
marco370
77874c83bf Add functionality to manage and sync public blacklists and whitelists
Integrates external public IP lists for enhanced threat detection and whitelisting capabilities, including API endpoints, database schema changes, and a new fetching service.

2025-11-26 09:21:43 +00:00
marco370
24966154d6 Increase training data for ML model to improve detection accuracy
Increase `max_records` from 100,000 to 1,000,000 in the cron job for training the ML model.

2025-11-26 08:45:56 +00:00
Marco Lanzara
0c0e5d316e 🚀 Release v1.0.89
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-25 18:13:58
2025-11-25 18:13:58 +00:00
marco370
5c74eca030 Update MikroTik API connection to use correct REST API port
Update MIKROTIK_API_FIX.md to reflect the correction of the MikroTik API connection from the binary API port (8728) to the REST API port (80), ensuring proper HTTP communication for IP blocking functionality.

2025-11-25 18:13:31 +00:00
marco370
fffc53d0a6 Improve error reporting and add a simple connection test script
Adds enhanced error logging with traceback to the main connection test script and introduces a new, simplified script for step-by-step MikroTik connection testing.

2025-11-25 18:00:33 +00:00
marco370
ed197d8fb1 Improve MikroTik connection by supporting legacy SSL protocols
Adds a custom SSL context to `httpx.AsyncClient` to allow connections to MikroTik devices using older TLS versions and cipher suites, specifically addressing SSL handshake failures.

2025-11-25 17:58:02 +00:00
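A sketch of such a legacy-friendly SSL context passed to `httpx.AsyncClient`; the specific minimum version and cipher string are assumptions about what the fix relaxes:

```python
import ssl
import httpx

def legacy_ssl_context() -> ssl.SSLContext:
    ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1  # accept older TLS handshakes
    ctx.set_ciphers("DEFAULT:@SECLEVEL=1")      # re-enable legacy cipher suites
    ctx.check_hostname = False                  # routers often use self-signed certs
    ctx.verify_mode = ssl.CERT_NONE
    return ctx

client = httpx.AsyncClient(verify=legacy_ssl_context())
```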
Marco Lanzara
5bb3c01ce8 🚀 Release v1.0.88
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-25 17:53:00
2025-11-25 17:53:00 +00:00
marco370
2357a7c065 Update database schema to improve security and integrity
Modify database schema definitions, including sequence restrictions and table constraints, to enhance data security and maintain referential integrity.

2025-11-25 17:52:56 +00:00
marco370
167e8d9575 Fix connection issues with MikroTik API by adding port information
Fix a critical bug in `mikrotik_manager.py` where the API port was not included in the base URL, leading to connection failures. Also add SSL support detection and a new script for testing MikroTik API connections.

2025-11-25 17:52:04 +00:00
marco370
a947ac8cea Fix connection issues with MikroTik routers
Update the MikroTik manager to correctly use API ports (8728/8729) and SSL settings for establishing connections.

2025-11-25 17:49:26 +00:00
marco370
c4546f843f Fix permission errors to allow saving machine learning models
Correct ownership of the models directory to allow the ML training process to save generated models.

2025-11-25 17:40:13 +00:00
marco370
42541724cf Fix issue where the application fails to start due to port conflicts
Resolve "address already in use" error by resetting systemd, killing all Python processes, and ensuring the port is free before restarting the ML backend service.

2025-11-25 17:35:58 +00:00
marco370
955a2ee125 Fix backend startup issue by resolving port conflict
Resolves an "address already in use" error by killing existing processes on port 8000 before restarting the ML backend service.

2025-11-25 17:33:54 +00:00
marco370
25e5735527 Saved progress at the end of the loop
2025-11-25 17:29:43 +00:00
Marco Lanzara
df5c637bfa 🚀 Release v1.0.87
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-25 17:26:07
2025-11-25 17:26:07 +00:00
marco370
9761ee6036 Add visual indicators for the Hybrid ML model version
Update the UI to display badges indicating the use of Hybrid ML v2.0.0 on both the Training and Anomaly Detection cards, and refine descriptive text for clarity.

2025-11-25 17:24:29 +00:00
Marco Lanzara
fa61c820e7 🚀 Release v1.0.86
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-25 11:52:39
2025-11-25 11:52:39 +00:00
marco370
4d9ed22c39 Add automatic IP blocking system to enhance security
Implement a systemd timer and Python script to periodically detect and automatically block malicious IP addresses based on risk scores, improving the application's security posture.

2025-11-25 11:52:13 +00:00
Marco Lanzara
e374c5575e 🚀 Release v1.0.85
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-25 11:31:00
2025-11-25 11:31:00 +00:00
marco370
7eb0991cb5 Update MikroTik router connection settings and remove redundant tests
Removes the connection testing functionality and updates the default API port to 8729 for MikroTik routers.

2025-11-25 11:29:12 +00:00
Marco Lanzara
81d3617b6b 🚀 Release v1.0.84
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-25 11:20:56
2025-11-25 11:20:56 +00:00
marco370
dae5ebbaf4 Saved progress at the end of the loop
2025-11-25 11:16:11 +00:00
marco370
dd8d38375d Update router connection test to use REST API and default port 443
Adjusted the router connection test to target the MikroTik REST API on port 443 by default, and handle authentication and status codes accordingly.

2025-11-25 11:05:32 +00:00
marco370
7f441ad7e3 Update system to allow adding and managing router connections
Implement backend endpoints for updating router configurations and testing network connectivity.

2025-11-25 11:02:33 +00:00
marco370
8aabed0272 Add functionality to manage and test router connections
Implement dialogs and forms for adding/editing routers, along with backend endpoints for updating router details and testing connectivity.

2025-11-25 11:01:18 +00:00
marco370
7c204c62b2 Add automatic Python dependency installation to setup script
Modify deployment/setup_cleanup_timer.sh to automatically install Python dependencies, and update deployment/CLEANUP_DETECTIONS_GUIDE.md to reflect this change and add prerequisites.

2025-11-25 10:49:08 +00:00
Marco Lanzara
4d9fcd472e 🚀 Release v1.0.83
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-25 10:46:03
2025-11-25 10:46:03 +00:00
marco370
50e9d47ca4 Saved progress at the end of the loop
2025-11-25 10:43:59 +00:00
marco370
e3dedf00f1 Automate removal of old blocked IPs and update timer
Fix a bug where auto-unblock incorrectly removed all records for an IP, and correct the systemd timer to run once hourly.

2025-11-25 10:42:52 +00:00
marco370
791b7caa4d Add automatic cleanup for old detections and IP blocks
Implement automated detection cleanup after 48 hours and IP unblocking after 2 hours using systemd timers and Python scripts.

2025-11-25 10:40:44 +00:00
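The cleanup script run by those timers might reduce to two deletes; the `blocked_ips` table and its column names are assumptions, while the 48-hour and 2-hour windows come from the commit message:

```python
import psycopg2

def cleanup(dsn: str) -> None:
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        # Detections older than 48 hours are dropped...
        cur.execute("DELETE FROM detections WHERE detected_at < NOW() - INTERVAL '48 hours'")
        # ...and IP blocks older than 2 hours are released.
        cur.execute("DELETE FROM blocked_ips WHERE blocked_at < NOW() - INTERVAL '2 hours'")
```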
Marco Lanzara
313bdfb068 🚀 Release v1.0.82
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-25 10:33:51
2025-11-25 10:33:51 +00:00
marco370
51607ff367 Saved progress at the end of the loop
2025-11-25 10:33:32 +00:00
marco370
7c5f4d56ff Saved progress at the end of the loop
2025-11-25 10:30:04 +00:00
marco370
61df9c4f4d Remove unused details button and related icon
Remove the "Dettagli" button and the Eye icon import from the Detections page as they are no longer needed.

2025-11-25 10:29:18 +00:00
marco370
d3c0839a31 Increase detection limits and fix navigation to details
Update the default limit for retrieving detections to 5000 and adjust the "Details" link to navigate to the root page with the IP as a query parameter.

2025-11-25 10:25:39 +00:00
Marco Lanzara
40d94651a2 🚀 Release v1.0.81
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-25 10:14:39
2025-11-25 10:14:39 +00:00
Marco Lanzara
a5a1ec8d16 🚀 Release v1.0.80
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-25 10:06:59
2025-11-25 10:06:59 +00:00
marco370
2561908944 Saved progress at the end of the loop
2025-11-25 10:01:22 +00:00
marco370
ee6f3620b8 Improve detection filtering by correctly comparing numerical risk scores
Fix bug where risk scores were compared lexicographically instead of numerically by casting the `riskScore` column to numeric in SQL queries.

2025-11-25 10:00:32 +00:00
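Lexicographic comparison of text scores sorts '9.0' above '85.0'; casting restores numeric order. A sketch of the corrected query, with table and column names assumed from this log:

```python
def filter_by_min_risk(cur, min_risk: float):
    # Cast the text column before comparing so values order numerically.
    cur.execute(
        "SELECT source_ip, risk_score FROM detections "
        "WHERE CAST(risk_score AS numeric) >= %s "
        "ORDER BY CAST(risk_score AS numeric) DESC",
        (min_risk,),
    )
    return cur.fetchall()
```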
marco370
83e2d1b1bb Add button to whitelist detected IPs and improve detection details
Implement a new "Whitelist" button for each detection entry, allowing users to add suspicious IPs to a whitelist, and refactor the UI to better organize action buttons for detection details.

2025-11-25 09:58:06 +00:00
marco370
35e1b25dde Improve detection filtering and add whitelist functionality
Add filtering options for anomaly type and risk score to the detections page, increase the default limit to 500, and implement a button to add IPs to the whitelist.

2025-11-25 09:56:59 +00:00
marco370
d9aa466758 Enhance detection filtering and increase result limits
Update API endpoints and storage logic to support filtering detections by anomaly type, minimum/maximum risk score, and to increase the default limit of returned detections.

2025-11-25 09:56:13 +00:00
marco370
163776497f Saved progress at the end of the loop
2025-11-25 09:45:07 +00:00
Marco Lanzara
a206502ff1 🚀 Release v1.0.79
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-25 09:37:22
2025-11-25 09:37:22 +00:00
marco370
5a002413c2 Allow more flexible time range for detection analysis
Update DetectRequest model to accept float for hours_back, enabling fractional time ranges for analysis.

2025-11-25 09:37:19 +00:00
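The model change likely amounts to a one-line type widening; any other fields on `DetectRequest`, and the default shown, are omitted or assumed here:

```python
from pydantic import BaseModel

class DetectRequest(BaseModel):
    hours_back: float = 1.0  # previously int; a float allows e.g. 0.5 for 30 minutes
```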
Marco Lanzara
c99edcc6d3 🚀 Release v1.0.78
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-25 09:34:57
2025-11-25 09:34:57 +00:00
marco370
f0d391b2a1 Map confidence level strings to numeric values for detection results
Converts 'high', 'medium', and 'low' confidence levels to their corresponding numeric values (95.0, 75.0, 50.0) before saving detection results to the database, resolving an invalid input syntax error for type numeric.

2025-11-25 09:34:44 +00:00
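The mapping itself is fully specified by the commit message; only the fallback for unknown labels is an assumption:

```python
CONFIDENCE_MAP = {"high": 95.0, "medium": 75.0, "low": 50.0}

def confidence_to_numeric(level: str) -> float:
    # Unknown labels fall back to the "low" value (an assumption).
    return CONFIDENCE_MAP.get(level.lower(), 50.0)
```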
Marco Lanzara
2192607bf6 🚀 Release v1.0.77
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-25 09:19:17
2025-11-25 09:19:17 +00:00
marco370
14d67c63a3 Improve syslog parser reliability and add monitoring
Enhance the syslog parser with auto-reconnect, error recovery, and integrated health metrics logging. Add a cron job for automated health checks and restarts.

2025-11-25 09:09:21 +00:00
Marco Lanzara
093a7ba874 🚀 Release v1.0.76
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-25 08:52:18
2025-11-25 08:52:18 +00:00
marco370
837f7d4c08 Update detection results to use correct key names for scores
Corrects the key names used to retrieve detection results in `compare_models.py` to match the output format of the hybrid detector.

2025-11-25 08:51:27 +00:00
Marco Lanzara
49eb9a9f91 🚀 Release v1.0.75
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-25 08:46:31
2025-11-25 08:46:31 +00:00
marco370
2d7185cdbc Adjust model comparison script to correctly process network logs
Correct logic in `compare_models.py` to pass raw network logs to the detection method, ensuring correct feature extraction and preventing a 'timestamp' KeyError.

2025-11-25 08:43:55 +00:00
Marco Lanzara
27499869ac 🚀 Release v1.0.74
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-25 08:42:17
2025-11-25 08:42:17 +00:00
marco370
cf3223b247 Update model comparison script to use current database detections
Adjusted script to query existing database detections instead of a specific model version, updating column names to match the actual database schema (source_ip, risk_score, anomaly_type, log_count, last_seen, detected_at).

2025-11-25 08:42:06 +00:00
Marco Lanzara
c56af1cb16 🚀 Release v1.0.73
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-25 08:37:26
2025-11-25 08:37:26 +00:00
marco370
a32700c149 Add script to compare old and new detection models
Creates a Python script that loads old detection data, reanalyzes IPs with the new hybrid detector, and compares the results to identify differences and improvements.

2025-11-25 08:36:32 +00:00
Marco Lanzara
77cd8a823f 🚀 Release v1.0.72
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-25 08:24:55
2025-11-25 08:24:55 +00:00
marco370
a47079c97c Add historical training data logging for hybrid models
Integrate saving of training history to the database within `train_hybrid.py`, ensuring that model versioning is correctly applied for hybrid detector runs.

2025-11-25 08:08:45 +00:00
Marco Lanzara
2a33ac82fa 🚀 Release v1.0.71
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-25 08:02:02
2025-11-25 08:02:02 +00:00
marco370
cf094bf750 Update model version tracking for training history
Dynamically set the model version to "2.0.0" for hybrid detectors and "1.0.0" for legacy detectors, and update the database insertion logic in `main.py` to use this dynamic version.

2025-11-25 08:01:03 +00:00
Marco Lanzara
3a4d72f1e3 🚀 Release v1.0.70
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-25 07:56:24
2025-11-25 07:56:24 +00:00
marco370
5feb691122 Fix error when hybrid detector models are not loaded
Correctly check if the hybrid detector models are loaded by verifying the presence of `isolation_forest` instead of a non-existent `is_trained` attribute in `python_ml/main.py`.

2025-11-25 07:56:15 +00:00
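The corrected readiness check presumably tests for the attribute that actually exists on the hybrid detector, along these lines:

```python
def models_loaded(detector) -> bool:
    # The hybrid detector has no `is_trained` attribute; presence of the
    # fitted isolation_forest model signals that loading succeeded.
    return getattr(detector, "isolation_forest", None) is not None
```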
Marco Lanzara
a7f55b68d7 🚀 Release v1.0.69
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-25 07:54:07
2025-11-25 07:54:07 +00:00
marco370
08af108cfb Fix backend crash when initializing hybrid ML detector
Corrected `main.py` to handle the `ml_analyzer` being `None` when `USE_HYBRID_DETECTOR` is true, preventing an `AttributeError` during startup.

2025-11-25 07:53:47 +00:00
marco370
d086b00092 Fix system service to prevent continuous restart failures
Address the ML backend's systemd service repeatedly failing and restarting due to a non-zero exit code.

2025-11-25 07:51:16 +00:00
marco370
adcf997bdd Saved progress at the end of the loop
2025-11-24 18:17:52 +00:00
Marco Lanzara
b3b87333ca 🚀 Release v1.0.68
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-24 18:17:37
2025-11-24 18:17:37 +00:00
marco370
f6e222d473 Correct SQL query for retrieving network traffic data
Fixes a critical SQL syntax error in `train_hybrid.py` preventing data loading by adjusting the `INTERVAL` calculation for the `timestamp` WHERE clause.

2025-11-24 18:16:21 +00:00
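One robust way to express such an INTERVAL clause is to multiply a unit interval by a bound parameter instead of splicing the value into the SQL string; the surrounding query shape is an assumption:

```python
def load_recent_logs(cur, hours_back: float):
    # PostgreSQL allows interval * numeric, so the window stays parameterized.
    cur.execute(
        "SELECT * FROM network_logs "
        "WHERE timestamp > NOW() - INTERVAL '1 hour' * %s",
        (hours_back,),
    )
    return cur.fetchall()
```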
marco370
b88377e2d5 Adapt ML model to new database schema and automate training
Adjusts SQL queries and feature extraction to accommodate changes in the network_logs database schema, enabling automatic weekly retraining of the ML hybrid detector.

2025-11-24 18:14:43 +00:00
Marco Lanzara
7e9599804a 🚀 Release v1.0.67
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-24 18:07:00
2025-11-24 18:07:00 +00:00
Marco Lanzara
d384193203 🚀 Release v1.0.66
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-24 18:06:18
2025-11-24 18:06:18 +00:00
marco370
04136e4303 Add script to train hybrid ML detector with real data
Create a bash script to automate the training of the hybrid ML detector, automatically fetching database credentials from the .env file and executing the training process.

2025-11-24 18:05:04 +00:00
marco370
34bd6eb8b8 Saved progress at the end of the loop
2025-11-24 18:01:42 +00:00
Marco Lanzara
7a2b52af51 🚀 Release v1.0.65
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-24 17:58:33
2025-11-24 17:58:33 +00:00
Marco Lanzara
3c68661af5 🚀 Release v1.0.64
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-24 17:57:38
2025-11-24 17:57:38 +00:00
marco370
7ba039a547 Fix index out of bounds error during synthetic data testing
Corrected an indexing error in `train_hybrid.py` by using `enumerate` to ensure accurate mapping of detections to the test dataset, resolving an `IndexError` when processing synthetic data.

2025-11-24 17:57:22 +00:00
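The `enumerate` fix keeps detection indices in lock-step with the test rows instead of relying on a separately tracked counter; a schematic version (`zip` would express the same pairing):

```python
def map_detections(detections: list[dict], test_rows: list[dict]):
    pairs = []
    for i, detection in enumerate(detections):
        pairs.append((detection, test_rows[i]))  # indices advance together
    return pairs
```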
marco370
0d9fda8a90 Saved progress at the end of the loop
2025-11-24 17:56:20 +00:00
Marco Lanzara
87d84fc8ca 🚀 Release v1.0.63
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-24 17:53:58
2025-11-24 17:53:58 +00:00
marco370
57afbc6eec Saved progress at the end of the loop
2025-11-24 17:53:04 +00:00
marco370
9fe2532217 Add timestamp to synthetic data for accurate model testing
Add a 'timestamp' column to the synthetic dataset generation in `python_ml/dataset_loader.py` to resolve a `KeyError` during model training and testing.

2025-11-24 17:52:16 +00:00
marco370
db54fc3235 Saved progress at the end of the loop
2025-11-24 17:50:09 +00:00
Marco Lanzara
71a186a891 🚀 Release v1.0.62
- Type: patch
- Database schema: database-schema/schema.sql (structure only)
- Date: 2025-11-24 17:47:05
2025-11-24 17:47:05 +00:00
marco370
8114c3e508 Saved progress at the end of the loop
2025-11-24 17:45:45 +00:00
marco370
75d3bd56a1 Simplify ML dependency to use standard Isolation Forest
Remove problematic Extended Isolation Forest dependency and leverage existing scikit-learn fallback for Python 3.11 compatibility.

2025-11-24 17:44:11 +00:00
marco370
132a667b2a Update ML dependency installation script for improved build isolation
Refactor deployment script and documentation to correctly handle build isolation for ML dependencies, specifically `eif`, by leveraging environment variables and sequential installation steps.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 7a4bce6a-9957-4807-aa16-ce07daafe00f
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/2lUhxO2
2025-11-24 17:35:40 +00:00
Marco Lanzara
8ad7e0bd9c 🚀 Release v1.0.61
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-24 17:11:49
2025-11-24 17:11:49 +00:00
marco370
051c838840 Add ability to install ML dependencies and resolve build issues
Update install_ml_deps.sh to use --no-build-isolation when installing eif to resolve ModuleNotFoundError during build.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 219383e3-8935-415d-8c84-77e7d6f76af8
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/2lUhxO2
2025-11-24 17:06:43 +00:00
Marco Lanzara
485f3d983b 🚀 Release v1.0.60
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-24 17:02:58
2025-11-24 17:02:58 +00:00
marco370
102113e950 Improve ML dependency installation script for robust deployment
Update deployment script to correctly activate virtual environment, install Cython and numpy as build dependencies before eif, and ensure sequential installation for the ML hybrid detector.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 8b4c76c7-3a42-4713-8396-40f5db530225
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/2lUhxO2
2025-11-24 17:02:15 +00:00
Marco Lanzara
270e211fec 🚀 Release v1.0.59
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-24 16:59:32
2025-11-24 16:59:32 +00:00
Marco Lanzara
81dee61ae4 🚀 Release v1.0.58
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-24 16:58:25
2025-11-24 16:58:25 +00:00
marco370
b78f03392a Update deployment process to handle machine learning dependencies
Create a dedicated script to install machine learning dependencies in the correct order, ensuring Cython is installed before packages that require it for compilation, and update documentation accordingly.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: aa7dc534-7330-4bd4-b726-d6eeb29008af
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/2lUhxO2
2025-11-24 16:38:41 +00:00
Marco Lanzara
f5e212626a 🚀 Release v1.0.57
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-24 16:36:14
2025-11-24 16:36:14 +00:00
marco370
b4aaa5456f Add Cython to Python dependencies for model compilation
Add Cython==3.0.5 to python_ml/requirements.txt and update replit.md to reflect this change, resolving a compilation issue with the eif library.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: f24578fc-6be7-42c0-9a9c-5ffe13dacdbe
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/2lUhxO2
2025-11-24 16:35:15 +00:00
Marco Lanzara
152e22621b 🚀 Release v1.0.56
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-24 16:33:35
2025-11-24 16:33:35 +00:00
marco370
043690f829 Update dependency version for improved compatibility
Update `eif` dependency from version 2.0.0 to 2.0.2 in `requirements.txt` and documentation to resolve a deployment issue, as version 2.0.0 is not available.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 104a0ac7-f020-4d00-9e3e-5a37b74bbc93
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/2lUhxO2
2025-11-24 16:32:31 +00:00
Marco Lanzara
3a945ec7d0 🚀 Release v1.0.55
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-24 16:28:47
2025-11-24 16:28:47 +00:00
marco370
dac78addab Saved progress at the end of the loop
Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: e9fa7e86-fa28-4924-9cbf-ddb3a20493fa
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/2lUhxO2
2025-11-24 16:26:43 +00:00
marco370
16617aa0fa Improve model training by adding robust error handling and logging
Add exception handling to the model training process to log failures and improve robustness.
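A minimal sketch of the pattern (the `train_safely` wrapper name is hypothetical): run training inside a try/except and log the traceback instead of letting the service crash.

```python
import logging

logger = logging.getLogger("ml_training")

def train_safely(train_fn, *args, **kwargs):
    try:
        return train_fn(*args, **kwargs)
    except Exception:
        # logger.exception records the full traceback for later debugging.
        logger.exception("Model training failed")
        return None
```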

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: intermediate_checkpoint
Replit-Commit-Event-Id: 9c7ad6b8-3e9d-41fe-83f7-6b2a48f8ff44
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/2lUhxO2
2025-11-24 16:25:40 +00:00
marco370
783d28f571 Add a hybrid machine learning system to reduce false positives
Add a new hybrid ML detector system using Extended Isolation Forest and feature selection to reduce false positives. Documented with a deployment checklist and updated API performance notes.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 80860ac4-8fe9-479b-b8fb-cb4c6804a667
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/2lUhxO2
2025-11-24 16:06:29 +00:00
marco370
8b16800bb6 Update system to use hybrid detector and improve validation accuracy
Update main.py endpoints to use the hybrid detector and improve validation logic in train_hybrid.py by mapping detections using source_ip. Also, add synthetic source_ip to dataset_loader.py for both CICIDS2017 and synthetic datasets.
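For illustration, a tiny pandas sketch of the validation mapping (column values are made up): detections are joined back to labelled flows on `source_ip`.

```python
import pandas as pd

flows = pd.DataFrame({"source_ip": ["10.0.0.1", "10.0.0.2"], "label": [1, 0]})
detections = pd.DataFrame({"source_ip": ["10.0.0.1"], "predicted": [1]})

# Left join keeps every labelled flow; flows with no detection count as benign.
merged = flows.merge(detections, on="source_ip", how="left").fillna({"predicted": 0})
```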

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: intermediate_checkpoint
Replit-Commit-Event-Id: 5c4982f1-3d37-47da-9253-c04888f5ff64
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/2lUhxO2
2025-11-24 16:02:49 +00:00
marco370
4bc4bc5a31 Update backend API to support new hybrid ML detection system
Introduce MLHybridDetector and update FastAPI app configuration to prioritize it, along with a new training script `train_hybrid.py`.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: intermediate_checkpoint
Replit-Commit-Event-Id: 462e355b-1642-45af-be7c-e04efa9dee67
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/F6DiMv4
2025-11-24 15:57:23 +00:00
marco370
350a0994bd Add dataset loader and validation metrics modules
Introduces `CICIDS2017Loader` for dataset handling and `ValidationMetrics` class for calculating performance metrics in Python.
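A minimal sketch of the metrics side (assumed signature, not the actual `ValidationMetrics` API): derive precision, recall and false-positive rate from binary labels.

```python
from sklearn.metrics import confusion_matrix, precision_score, recall_score

def summarize(y_true, y_pred) -> dict:
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
    return {
        "precision": precision_score(y_true, y_pred, zero_division=0),
        "recall": recall_score(y_true, y_pred, zero_division=0),
        "false_positive_rate": fp / (fp + tn) if (fp + tn) else 0.0,
    }
```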

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: intermediate_checkpoint
Replit-Commit-Event-Id: ad530f16-3a16-44a3-8fed-6c5d56775c77
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/F6DiMv4
2025-11-24 15:55:30 +00:00
marco370
932931457e Add a hybrid machine learning detection system
Add a new ML hybrid detector module with Extended Isolation Forest, feature selection, and an ensemble classifier, along with updated Python dependencies.
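A compressed sketch of the hybrid idea, assuming scikit-learn components (class and parameter names are illustrative): feature selection feeds both an unsupervised Isolation Forest and a supervised classifier, and traffic is flagged only when both agree, which is what reduces false positives.

```python
import numpy as np
from sklearn.ensemble import IsolationForest, RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif

class HybridDetectorSketch:
    def __init__(self, k_features: int = 10):
        self.selector = SelectKBest(f_classif, k=k_features)
        self.iso = IsolationForest(n_estimators=200, random_state=0)
        self.clf = RandomForestClassifier(n_estimators=100, random_state=0)

    def fit(self, X, y):
        Xs = self.selector.fit_transform(X, y)
        self.iso.fit(Xs)
        self.clf.fit(Xs, y)
        return self

    def predict(self, X):
        Xs = self.selector.transform(X)
        anomalous = self.iso.predict(Xs) == -1         # unsupervised vote
        malicious = self.clf.predict(Xs).astype(bool)  # supervised vote
        return np.logical_and(anomalous, malicious)    # flag only on agreement
```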

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: intermediate_checkpoint
Replit-Commit-Event-Id: 8b74011c-0e9a-4433-b9a1-896e65cb4ae1
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/F6DiMv4
2025-11-24 15:53:05 +00:00
Marco Lanzara
0fa2f118a0 🚀 Release v1.0.54
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-24 15:30:39
2025-11-24 15:30:39 +00:00
marco370
6df27bbd11 Saved progress at the end of the loop
Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 8d2f60eb-2c1e-4be1-9d10-06ef65080214
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/F6DiMv4
2025-11-24 15:29:42 +00:00
marco370
3de433f278 Fix analytics data inconsistency on live dashboard
Update analytics aggregator to correctly count attack occurrences and fix type hinting for daily aggregation.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: intermediate_checkpoint
Replit-Commit-Event-Id: f54e3953-68c3-42e1-be9d-1d1db98db671
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/F6DiMv4
2025-11-24 15:28:22 +00:00
marco370
3d7a0ce424 Improve attack data accuracy and add validation checks
Update analytics aggregation logic to accurately count attack packets by type and country, including fallbacks for missing data, and add validation to ensure breakdown totals match reported attack packets.
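A minimal sketch of the counting and validation logic (record keys are assumptions): sum packets per attack type and country with an "unknown" fallback, then check the breakdowns against the total.

```python
from collections import Counter

def breakdowns(detections: list[dict]) -> tuple[Counter, Counter, int]:
    by_type, by_country, total = Counter(), Counter(), 0
    for d in detections:
        pkts = int(d.get("packets") or 0)
        by_type[d.get("attack_type") or "unknown"] += pkts
        by_country[d.get("country") or "unknown"] += pkts
        total += pkts
    # Validation: each breakdown must sum to the reported attack packets.
    assert sum(by_type.values()) == total == sum(by_country.values())
    return by_type, by_country, total
```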

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: intermediate_checkpoint
Replit-Commit-Event-Id: cafbc828-3e12-4d4f-8a02-5127b485612d
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/F6DiMv4
2025-11-24 15:27:17 +00:00
Marco Lanzara
7734f56802 🚀 Release v1.0.53
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-24 15:12:02
2025-11-24 15:12:02 +00:00
marco370
d43b0de37f Saved progress at the end of the loop
Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 0620a8b0-78f0-4139-82ca-a2445f672b19
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/F6DiMv4
2025-11-24 15:10:41 +00:00
marco370
3c14508aa5 Add critical alert for idle analytics aggregator
Add a destructive alert to the services page indicating when the analytics aggregator has been idle for too long, along with instructions for an immediate fix. Also create a deployment checklist detailing the critical step of setting up the analytics aggregator timer.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: intermediate_checkpoint
Replit-Commit-Event-Id: 618f6e47-fbdc-49e2-b076-7366edc904a6
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/F6DiMv4
2025-11-24 15:09:18 +00:00
marco370
b61940f1fe Saved progress at the end of the loop
Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: ad5dc98d-c7c5-4b47-9107-183ebb1eafc4
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/0sf5i4S
2025-11-24 11:47:10 +00:00
Marco Lanzara
7ecc88dded 🚀 Release v1.0.52
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-24 11:11:32
2025-11-24 11:11:32 +00:00
Marco Lanzara
bbb9987b9b 🚀 Release v1.0.51
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-24 11:08:56
2025-11-24 11:08:56 +00:00
marco370
7940694d43 Saved progress at the end of the loop
Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 7afe2abb-da72-483a-a6b8-e6bf3fc8c943
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/0sf5i4S
2025-11-24 11:08:19 +00:00
marco370
921dd81563 Improve system service reliability and monitoring details
Update systemd service files to ensure continuous operation with automatic restarts, add a timestamp for improved debugging of analytics aggregation, and introduce a new installation script for streamlined deployment.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: intermediate_checkpoint
Replit-Commit-Event-Id: 2c579454-c420-40e4-a574-a341fb962b69
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/0sf5i4S
2025-11-24 11:02:23 +00:00
marco370
402cbe1890 Add Analytics Aggregator monitoring and status display
Integrates Analytics Aggregator status checks into the services overview, updating both backend route handling and frontend display components.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: intermediate_checkpoint
Replit-Commit-Event-Id: 1211e51b-5231-4d36-9e68-107ca898152b
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/0sf5i4S
2025-11-24 10:58:41 +00:00
marco370
edf7ef97f0 Improve dashboard and detection loading performance
Add ?limit=100 to API calls in Dashboard and Detections components to optimize data fetching.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: intermediate_checkpoint
Replit-Commit-Event-Id: aaffd902-9ebc-479f-a635-1a5480cf366e
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/x5P9dcJ
2025-11-24 10:56:44 +00:00
marco370
2067c390c1 Update database cleanup to retain logs for 3 days
Modify `cleanup_old_logs.sql` to retain logs for 3 days instead of 7, and update the `cleanup_database.sh` script to reflect this change.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: intermediate_checkpoint
Replit-Commit-Event-Id: 6c570c89-bf79-4953-8af1-61fd481625bc
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/x5P9dcJ
2025-11-24 10:56:07 +00:00
Marco Lanzara
070adb0d14 🚀 Release v1.0.50
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-24 10:12:24
2025-11-24 10:12:24 +00:00
marco370
3da26587f9 Display accurate real-time attack statistics on the dashboard
Introduce a new API endpoint to fetch aggregated dashboard statistics, replacing raw data retrieval with pre-calculated metrics for improved performance and accuracy in the live dashboard view.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: d36a1b00-13ef-4857-b703-6096a3bd4601
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/x5P9dcJ
2025-11-24 10:12:02 +00:00
Marco Lanzara
78f21d25a0 🚀 Release v1.0.49
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-24 10:04:40
2025-11-24 10:04:40 +00:00
marco370
bfff4fdcba Fix how live traffic distribution data is fetched
Correctly format the API query key for fetching network logs to display live traffic distribution.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: a8e8de55-f2a8-49d9-beb1-ee081253deb4
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/x5P9dcJ
2025-11-24 10:04:29 +00:00
Marco Lanzara
cbfa78de28 🚀 Release v1.0.48
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-24 09:59:49
2025-11-24 09:59:49 +00:00
marco370
e629bf4ed3 Improve analytics data fetching and server restart process
Fixes the analytics API call by correctly formatting the query parameters in the `AnalyticsHistory.tsx` component. Enhances the `restart_frontend.sh` script for more aggressive process killing and port cleanup to prevent 'address already in use' errors. Also, adds a check for the `country` column existence in the database schema, addressing a potential mismatch between development and production environments.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 23dd17a9-47b9-4533-bf4c-8b5cfdb426b4
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/x5P9dcJ
2025-11-24 09:59:06 +00:00
marco370
98fd06b8f7 Add logging to help debug analytics data retrieval
Add console logs to the analytics query and results in `server/storage.ts` to aid in debugging data retrieval issues.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: b5db3658-d0e1-4bb3-a767-7d1d7d363003
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/37PFEyl
2025-11-24 09:18:25 +00:00
Marco Lanzara
83d82f533a 🚀 Release v1.0.47
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-24 09:13:48
2025-11-24 09:13:48 +00:00
marco370
3beb9d8782 Add unique constraint to network analytics data and restart frontend script
Add a unique constraint to the network_analytics table in the Drizzle schema to prevent duplicate entries, and create a new script for restarting the frontend application.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 2676dcba-e26b-4eb4-b4a4-33d33d0c9b00
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/37PFEyl
2025-11-24 09:13:11 +00:00
marco370
4d7279a0ab Saved progress at the end of the loop
Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 418fe490-23c5-4609-97af-500e270c718b
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/37PFEyl
2025-11-24 09:07:20 +00:00
Marco Lanzara
53860d7c5a 🚀 Release v1.0.46
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-24 08:57:31
2025-11-24 08:57:31 +00:00
marco370
ede6378241 Update analytics to show hourly data and time
Fix an issue where historical analytics were not displaying correctly by changing the data fetch to hourly and updating the date format.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 49cfc935-92d3-4961-b703-750d52c8bdc2
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/37PFEyl
2025-11-24 08:57:07 +00:00
Marco Lanzara
8b5cbb7650 🚀 Release v1.0.45
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-24 08:43:26
2025-11-24 08:43:26 +00:00
marco370
4155b0f01f Update script to use correct virtual environment path
Correct the path to the Python virtual environment in both the service definition and the wrapper script to ensure proper execution of the analytics aggregator.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 6b4e2246-92c4-45dd-8a8f-25a7d0067f86
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/oGXAoP7
2025-11-24 08:42:51 +00:00
Marco Lanzara
cb240cc9c8 🚀 Release v1.0.44
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-24 08:41:37
2025-11-24 08:41:37 +00:00
Marco Lanzara
6a2cb0bd94 🚀 Release v1.0.43
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-24 08:36:03
2025-11-24 08:36:03 +00:00
marco370
2b802397d1 Improve analytics data aggregation security and reliability
Enhance security by removing hardcoded paths, implementing a wrapper script for manual execution, and adding robust credential validation in the analytics aggregator.
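A minimal sketch of the credential-validation step, assuming the aggregator reads its settings from the environment (any variable beyond `DATABASE_URL` is hypothetical):

```python
import os
import sys

REQUIRED = ("DATABASE_URL",)

def load_credentials() -> dict:
    missing = [name for name in REQUIRED if not os.environ.get(name)]
    if missing:
        # Fail fast with a clear message instead of crashing mid-aggregation.
        sys.exit(f"Missing required environment variables: {', '.join(missing)}")
    return {name: os.environ[name] for name in REQUIRED}
```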

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 47661cde-4285-4ce6-8d00-fb236a5a01b7
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/oGXAoP7
2025-11-24 08:35:46 +00:00
Marco Lanzara
24001c2792 🚀 Release v1.0.42
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-24 08:29:03
2025-11-24 08:29:03 +00:00
marco370
2065e6ce05 Saved progress at the end of the loop
Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 4839cca5-4a24-4365-aa58-c798e5f39a0a
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/oGXAoP7
2025-11-22 11:35:51 +00:00
marco370
cbd03d9e64 Add network analytics and live dashboard features
Introduce new network analytics capabilities with persistent storage, hourly and daily aggregations, and enhanced frontend visualizations. This includes API endpoints for retrieving analytics data, systemd services for automated aggregation, and UI updates for live and historical dashboards. Additionally, country flag emojis are now displayed on the detections page.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: intermediate_checkpoint
Replit-Commit-Event-Id: 3c14f651-7633-4128-8526-314b4942b3a0
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/oGXAoP7
2025-11-22 11:34:36 +00:00
Marco Lanzara
d997afe410 🚀 Release v1.0.41
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-22 11:01:18
2025-11-22 11:01:18 +00:00
marco370
47539c16b4 Saved progress at the end of the loop
Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: bb007fd4-b472-429c-89ca-8d01b1edec6c
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/SXFWABi
2025-11-22 11:00:52 +00:00
marco370
1b9df79d56 Add IP geolocation and AS information to detection records
Integrates IP geolocation and Autonomous System (AS) information into detection records by modifying the frontend to display this data and updating the backend to perform asynchronous batch lookups for efficiency. This enhancement includes database schema updates and the creation of a new IP geolocation service.
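For illustration, a small asyncio/httpx sketch of batched lookups (the endpoint URL and response keys are placeholders, not the real geolocation service):

```python
import asyncio
import httpx

async def geolocate(ips: list[str]) -> dict[str, dict]:
    async with httpx.AsyncClient(timeout=10) as client:
        responses = await asyncio.gather(
            *(client.get(f"https://geo.example.invalid/{ip}") for ip in ips),
            return_exceptions=True,
        )
    results = {}
    for ip, resp in zip(ips, responses):
        if isinstance(resp, httpx.Response) and resp.status_code == 200:
            results[ip] = resp.json()  # e.g. country, AS number, AS name
    return results
```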

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: intermediate_checkpoint
Replit-Commit-Event-Id: e81fd4a1-b7b0-48d2-ae38-f5905e278343
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/SXFWABi
2025-11-22 10:59:50 +00:00
marco370
ae106cf655 Saved progress at the end of the loop
Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: be08307b-daf8-403d-9a65-bc4481dd8b35
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/SXFWABi
2025-11-22 10:42:30 +00:00
Marco Lanzara
3f934b17ec 🚀 Release v1.0.40
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-22 10:41:13
2025-11-22 10:41:13 +00:00
marco370
9d1f8b69ee Saved progress at the end of the loop
Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: b62afa1f-0ab8-4b17-8d24-840c51dd762c
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/SXFWABi
2025-11-22 10:37:39 +00:00
marco370
07c7e02770 Add automatic database cleanup for older log entries
Implement automatic cleanup of log entries older than 3 days in the database and switch to a safer streaming mode for processing log files.
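A minimal sketch of both halves, assuming a PostgreSQL `network_logs` table and plain-text log files (names are illustrative): a 3-day retention query plus a streaming line reader that avoids loading the whole file.

```python
CLEANUP_SQL = "DELETE FROM network_logs WHERE timestamp < NOW() - INTERVAL '3 days';"

def stream_lines(path: str):
    # Constant-memory iteration, safe even for multi-gigabyte raw logs.
    with open(path, "r", errors="replace") as fh:
        for line in fh:
            yield line.rstrip("\n")
```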

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: intermediate_checkpoint
Replit-Commit-Event-Id: 70142a7e-f1e2-4668-9fee-f1ff7d8615ae
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/SXFWABi
2025-11-22 10:37:12 +00:00
Marco Lanzara
06d6f47df1 🚀 Release v1.0.39
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-22 10:28:05
2025-11-22 10:28:05 +00:00
marco370
0bf61dc69d Improve model training and file saving capabilities
Fixes permission errors for model saving and enhances training logging, ensuring proper storage of ML models and historical data.
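A minimal sketch of the saving path, assuming models are persisted with joblib into a directory the service user owns (the `MODEL_DIR` default is hypothetical):

```python
import os
import joblib

MODEL_DIR = os.environ.get("MODEL_DIR", "/var/lib/ids/models")  # assumed default

def save_model(model, name: str) -> str:
    # Creating the directory up front avoids permission and 'no such file' errors.
    os.makedirs(MODEL_DIR, exist_ok=True)
    path = os.path.join(MODEL_DIR, f"{name}.joblib")
    joblib.dump(model, path)
    return path
```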

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 2afb7ddf-484b-4d07-8d99-8c1ca39c0be5
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/VDRknFA
2025-11-22 10:27:30 +00:00
Marco Lanzara
88004cb7ec 🚀 Release v1.0.38
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-22 10:22:46
2025-11-22 10:22:46 +00:00
marco370
c31e1ca838 Improve training history logging and file management
Enhance error handling in the Python ML backend during training and update the training script's location.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: b7099249-7827-46da-bdf9-2ff1d9c07b6c
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/VDRknFA
2025-11-22 10:21:29 +00:00
Marco Lanzara
83f3d4cf8e 🚀 Release v1.0.37
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-22 10:15:49
2025-11-22 10:15:49 +00:00
marco370
e6fb3aefe3 Fix monitoring errors and database inconsistencies on the dashboard
Fix critical bugs in the Syslog Parser monitoring endpoint fetch and update database schema for network logs.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: a6dcc6ae-9272-494e-a68e-a0a2b865f1c4
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/VDRknFA
2025-11-22 10:15:19 +00:00
marco370
9d5ecf99c4 Add branding and content assets for the application interface
Add various branding assets, content files, and screenshots to enhance the application's user interface and visual presentation.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 5db5df9a-80de-4fbe-96fb-67a573fc567e
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/VDRknFA
2025-11-22 10:12:00 +00:00
Marco Lanzara
e7afa6dafb 🚀 Release v1.0.36
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-22 10:03:51
2025-11-22 10:03:51 +00:00
marco370
26f3589a7e Improve systemd service setup to aggressively kill manual processes
Updates `setup_systemd_services.sh` to forcefully stop existing systemd services, kill all manual Python processes owned by the `ids` user, and verify that port 8000 is free before starting the services.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: ae567421-923d-4371-a127-7bdeca91b824
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/VDRknFA
2025-11-22 10:03:21 +00:00
Marco Lanzara
9458829ebf 🚀 Release v1.0.35
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-22 09:58:38
2025-11-22 09:58:38 +00:00
marco370
6adc08a0e6 Saved progress at the end of the loop
Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: bc32919a-81dd-4576-9bd8-257e43feafb7
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/n4Q2eeE
2025-11-22 09:58:04 +00:00
marco370
e9e74f9944 Add missing Python libraries for backend functionality
Update the Python dependency installation script to include `httpx` and `joblib`, and enhance verification checks for installed modules.
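A minimal sketch of the verification step, assuming it only needs to confirm that the modules import inside the virtual environment (the module list is illustrative):

```python
import importlib

for module in ("fastapi", "httpx", "joblib", "sklearn"):
    try:
        importlib.import_module(module)
        print(f"OK       {module}")
    except ImportError as exc:
        print(f"MISSING  {module}: {exc}")
```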

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: intermediate_checkpoint
Replit-Commit-Event-Id: f0a704c9-cac4-4144-8f26-c6066459f615
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/n4Q2eeE
2025-11-22 09:57:47 +00:00
Marco Lanzara
8d4896df76 🚀 Release v1.0.34
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-22 09:53:48
2025-11-22 09:53:48 +00:00
marco370
5b350ff95f Add Python dependency installation and virtual environment support
Introduce a new script to install Python dependencies in a virtual environment, update systemd services to utilize this environment, and modify the setup script to automate dependency checks and installation.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: ea2a08f4-46e1-463d-9c58-16219914ad23
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/n4Q2eeE
2025-11-22 09:40:57 +00:00
marco370
d187aa533a Fix ML backend service startup failures
Update the systemd service unit for the ML backend to correctly log errors to journalctl, enabling easier debugging of startup failures.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: adfa3c2c-2d2f-40c7-8113-83a526fb3a96
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/n4Q2eeE
2025-11-22 09:38:27 +00:00
marco370
015770609a Update database schema and restart services
Apply SQL migrations and synchronize the database schema using Drizzle Kit, then restart systemd services.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 5dd94895-686f-45b1-9689-fcec435e180e
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/n4Q2eeE
2025-11-22 09:37:13 +00:00
Marco Lanzara
58619ff79f 🚀 Release v1.0.33
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-22 09:34:19
2025-11-22 09:34:19 +00:00
marco370
5252d5cb51 Saved progress at the end of the loop
Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 7a3fe3ef-3b4e-4116-af94-0593370c3076
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/n4Q2eeE
2025-11-22 09:33:57 +00:00
marco370
7ec5ff553b Add systemd service management with API key security
Implement systemd service management for ML backend and Syslog parser with API key authentication and robust error handling across frontend and backend.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: intermediate_checkpoint
Replit-Commit-Event-Id: e0ddd146-1e7d-40e4-8607-ef8d247a1f49
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/n4Q2eeE
2025-11-22 09:33:30 +00:00
marco370
4a2d7f9c5c Add service monitoring and status indicators to the dashboard
Introduce a new services page, integrate real-time status monitoring for ML backend, database, and syslog parser, and update the dashboard to display service health indicators.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: cde95c60-908b-48a0-b7b9-38e5e924b3b3
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/n4Q2eeE
2025-11-22 09:24:10 +00:00
Marco Lanzara
27f475191e 🚀 Release v1.0.32
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-22 09:03:03
2025-11-22 09:03:03 +00:00
marco370
24b907e17b Fix log parsing by adding missing timestamps to incoming data
The `ids-syslog-parser` service failed to restart and `/var/log/mikrotik/raw.log` lacked timestamps, so the parser saved 0 logs. Fix by updating `/etc/rsyslog.d/99-mikrotik.conf` to include the `%TIMESTAMP%` directive, restarting the `rsyslog` service, clearing the raw log file, and then restarting `ids-syslog-parser` so new entries are parsed correctly.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 9fa8bbb2-1781-4d01-b6d3-3b872fb304a3
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/6ZTQSoP
2025-11-22 09:02:34 +00:00
marco370
9448d54156 Add timestamp to log entries to ensure proper parsing
Fixes an issue where log entries were missing timestamps, preventing the Python parser from correctly processing the data. This change modifies the rsyslog template to include the timestamp.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: d449ad65-7340-48be-804d-a473dce5d0d8
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/6ZTQSoP
2025-11-22 08:57:19 +00:00
marco370
495e845a79 Update log format to include timestamps and filter incoming connections
Correct the rsyslog template to include timestamps in logs, ensuring compatibility with the Python parser. This change also refactors the log filtering to capture only incoming connections, significantly reducing log volume.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: c2f849f9-105f-452a-bdc3-a956d102c54b
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/6ZTQSoP
2025-11-22 08:54:21 +00:00
Marco Lanzara
5f82419240 🚀 Release v1.0.30
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-22 07:53:02
2025-11-22 07:53:02 +00:00
marco370
7b85fcc020 Implement a database versioning system to speed up updates
Add a database versioning system that tracks applied migrations and only runs new ones, significantly improving update speed.
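A minimal sketch of the versioning idea, assuming a DB-API connection (e.g. psycopg2) and a `schema_migrations` table; table and column names here are assumptions:

```python
import os

def pending_migrations(conn, migrations_dir: str) -> list[str]:
    with conn.cursor() as cur:
        cur.execute("CREATE TABLE IF NOT EXISTS schema_migrations (filename TEXT PRIMARY KEY)")
        cur.execute("SELECT filename FROM schema_migrations")
        applied = {row[0] for row in cur.fetchall()}
    # Only .sql files not yet recorded as applied need to be run.
    return sorted(f for f in os.listdir(migrations_dir)
                  if f.endswith(".sql") and f not in applied)
```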

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: ab56b852-52de-4cd1-a489-5cf48f3a2965
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/myJmPIa
2025-11-22 07:51:56 +00:00
Marco Lanzara
79a37639e0 🚀 Release v1.0.29
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-22 07:41:57
2025-11-22 07:41:57 +00:00
marco370
c9a0ba1b66 Fix issue where logs are not saved to the database
Resolve a critical bug in the log processing system where millions of log entries were processed but not committed to the database, leading to zero records in the `network_logs` table. This involved stopping the parser, truncating/rotating the log file, and restarting the parser to process only new entries.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 6976842b-2cfa-4e4f-aea4-7e7206988693
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/DR50xVM
2025-11-21 17:37:31 +00:00
Marco Lanzara
39536b4f7a 🚀 Release v1.0.28
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-21 17:29:35
2025-11-21 17:29:35 +00:00
marco370
c9b2a8a9a9 Set up system to receive and store MikroTik logs
Add rsyslog configuration for receiving MikroTik logs via UDP, store them in a dedicated file, and prevent duplicates in system messages.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: b452008c-bd98-4e68-81a9-f20d3f714372
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/DR50xVM
2025-11-21 17:26:52 +00:00
marco370
b31b0ec932 Add log file detailing network traffic and detected DDOS activity
Added a log file containing network traffic data, including packet forwarding information, connection states, and detected distributed denial-of-service (DDoS) activity.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 43992744-515b-49bf-8459-fe8c14ec1bde
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/4LjHWWz
2025-11-21 17:10:36 +00:00
marco370
6f2d0da1c9 Update system with latest code and database changes
Applies recent code updates from the Git repository and executes necessary database schema migrations to ensure data integrity and system functionality.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 2bb36b2d-b880-414f-ad99-7f18bc18ee9e
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/4LjHWWz
2025-11-21 17:06:35 +00:00
Marco Lanzara
e957556af2 🚀 Release v1.0.27
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-21 16:58:21
2025-11-21 16:58:21 +00:00
marco370
3e9b9f110a Improve environment variable loading for deployment scripts
Update `apply_migrations.sh`, `cleanup_database.sh`, and `debug_system.sh` to robustly load environment variables from the `.env` file, ensuring correct `DATABASE_URL` availability and improving script execution reliability.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: b9098694-dc08-42d0-9fe9-48515e4e82a0
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/4LjHWWz
2025-11-21 16:57:55 +00:00
marco370
1dd4e57999 Saved progress at the end of the loop
Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 14166d0c-389e-48f7-8401-9570c0433336
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/4LjHWWz
2025-11-21 16:53:12 +00:00
Marco Lanzara
07f1895bbd 🚀 Release v1.0.26
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-21 16:50:22
2025-11-21 16:50:22 +00:00
marco370
661e945f57 Implement automatic database cleanup and schema updates
Adds scripts for automatic database log cleanup, schema migration application, and cron job setup. Modifies the update script to apply SQL migrations before pushing Drizzle schema.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 9a659f15-d68a-4b7d-99f8-3eccc59afebe
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/4LjHWWz
2025-11-21 16:49:13 +00:00
marco370
d10b470793 Fix database connection errors and schema issues
Resolve 500 errors across database API endpoints by implementing a dual-mode database driver and adding missing columns to the routers table.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: c333ed12-07c5-412a-aff0-524321acc652
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/C4ZJnmQ
2025-11-21 16:41:55 +00:00
marco370
f4803a7451 Fix error when fetching router and stats information
Add missing `api_port` and `last_sync` columns to the `routers` table in the database to resolve `column "last_sync" does not exist` errors.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: ed8286cc-22d7-40bd-ae28-6bae4dd7f5ea
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/C4ZJnmQ
2025-11-21 16:35:52 +00:00
marco370
e8270da285 Fix database connection errors and improve logging
Update database connection handling to resolve errors and log database type instead of connection strings.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 81036932-56b6-46dc-a2eb-153586dfd77d
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/C4ZJnmQ
2025-11-21 16:25:32 +00:00
Marco Lanzara
dccfb4032a 🚀 Release v1.0.25
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-21 16:19:57
2025-11-21 16:19:57 +00:00
marco370
42354d5087 Adapt database connection for local and cloud environments
Update `server/db.ts` to support both Neon serverless and standard PostgreSQL drivers, add database health checks, and improve error logging for database operations.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 55ee188b-6bb4-49b0-8966-1795106363b1
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/C4ZJnmQ
2025-11-21 16:09:48 +00:00
Marco Lanzara
3076c7946b 🚀 Release v1.0.24
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-21 15:58:39
2025-11-21 15:58:39 +00:00
marco370
e5d038a2b4 Improve how the application starts to ensure environment variables are loaded correctly
Modify deployment/check_frontend.sh to use `env $(cat .env | grep -v '^#' | xargs) npm run dev` to inject environment variables into the npm process, resolving issues where `nohup` did not inherit exported variables. Update replit.md to reflect the corrected fix details.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 602cd9af-422f-4b1d-ab37-5fd3e0bd016f
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/dI1I79r
2025-11-21 15:58:17 +00:00
marco370
aec912cf16 Fix frontend startup to correctly load environment variables
Update `deployment/check_frontend.sh` to load `.env` variables before running `npm run dev`, resolving the `DATABASE_URL must be set` error and improving deployment on non-Replit environments.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 8cf499b5-f15d-4b0d-b459-f48510ac6539
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/dI1I79r
2025-11-21 15:52:15 +00:00
Marco Lanzara
2041d3926f 🚀 Release v1.0.23
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-21 15:45:57
2025-11-21 15:45:57 +00:00
Marco Lanzara
b158b56645 🚀 Release v1.0.22
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-21 15:37:36
2025-11-21 15:37:36 +00:00
marco370
d0683b363f Update startup script to load environment variables correctly
Modify `deployment/check_frontend.sh` to source the `.env` file before starting the Node.js development server. This ensures that environment variables, such as `DATABASE_URL`, are properly loaded, resolving startup errors on non-Replit environments. Also updates `replit.md` to document this fix.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 21e2b3ab-3e67-45ff-819b-a2db15c27292
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/dI1I79r
2025-11-21 15:28:30 +00:00
marco370
b3bbcb69d3 Update system to apply database changes and synchronize dependencies
Apply SQL scripts to update database schema and install/update Node.js and Python dependencies.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: bf319927-85dd-43f9-9df2-348d41742705
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/Aqah4U9
2025-11-21 14:31:40 +00:00
Marco Lanzara
40990d4ed7 🚀 Release v1.0.21
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-21 10:41:50
2025-11-21 10:41:50 +00:00
Marco Lanzara
b9be5834dd 🚀 Release v1.0.20
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-21 09:14:18
2025-11-21 09:14:18 +00:00
marco370
156df07656 Saved progress at the end of the loop
Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 0852850e-3a19-4be8-b7f0-82d38042c9bf
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/Aqah4U9
2025-11-21 09:09:34 +00:00
marco370
7b514f470f Implement form validation and improve error handling for ML features
Refactor Training and Whitelist pages to use react-hook-form and Zod for validation, and enhance ML backend API routes with timeouts, input validation, and better error reporting.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: intermediate_checkpoint
Replit-Commit-Event-Id: 95d9d0e3-3da7-43ff-b8d3-d9d5d8fd6f6f
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/Aqah4U9
2025-11-21 09:08:38 +00:00
marco370
7171c3d607 Add new pages for ML training and IP whitelisting
Adds new frontend pages for ML training and IP whitelisting, along with backend API endpoints to trigger ML actions and manage the whitelist.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: intermediate_checkpoint
Replit-Commit-Event-Id: d6b17610-31b4-4ed0-a52a-66659e36d756
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/Aqah4U9
2025-11-21 09:00:50 +00:00
Marco Lanzara
84320a3c6f 🚀 Release v1.0.19
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-21 08:23:16
2025-11-21 08:23:16 +00:00
Marco Lanzara
5545b1de1c 🚀 Release v1.0.18
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-18 09:30:06
2025-11-18 09:30:06 +00:00
marco370
203eaa9a54 Improve system stability by fixing cron job and process monitoring
Refactor cron job execution to use separate shell scripts and update process monitoring scripts to correctly handle PID files and log directories.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: f6d4ec61-7130-41dd-aef9-87b4bc73d0e8
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/3R8dKMX
2025-11-18 08:33:05 +00:00
marco370
62ffbe499a Add automated training and detection jobs to the system
Adds new cron jobs and shell scripts for automated ML model training and anomaly detection, along with configurations for logging and system checks.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 534244cd-b293-403e-a6d6-43cd9fbb30fb
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/1P26v7M
2025-11-18 08:24:09 +00:00
marco370
94724ff580 Add ability to detect and block malicious network traffic based on risk
Adds functionality to detect network anomalies and block suspicious IP addresses.
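A minimal sketch of the blocking decision, assuming each detection carries a risk score and a source IP (the threshold value is hypothetical):

```python
RISK_THRESHOLD = 0.8  # assumed cutoff

def ips_to_block(detections: list[dict], whitelist: set[str]) -> set[str]:
    # Block only high-risk sources that are not explicitly whitelisted.
    return {
        d["source_ip"]
        for d in detections
        if d.get("risk_score", 0.0) >= RISK_THRESHOLD and d["source_ip"] not in whitelist
    }
```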

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: e23fa694-b094-4b45-9ce0-0dbf8fd7a9e2
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/1P26v7M
2025-11-18 07:31:39 +00:00
marco370
8c14c9fa8e Update project documentation with recent fixes and deployment instructions
Fixes multiple issues within the FastAPI backend, including the `/stats` endpoint, and clarifies deployment procedures for the intrusion detection system.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 3788e2d0-69c4-4ef8-90d0-f17cb6bd8999
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/1P26v7M
2025-11-17 18:28:16 +00:00
Marco Lanzara
4c8872c1ef 🚀 Release v1.0.17
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-17 18:26:12
2025-11-17 18:26:12 +00:00
marco370
253df0e6bc Add table for managing MikroTik router configurations
Create a new SQL table named 'routers' to store MikroTik router configuration details, including IP addresses, credentials, and status, with example data.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 4b47610c-d428-476f-95dd-d18864bbda27
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/1P26v7M
2025-11-17 18:26:01 +00:00
marco370
94cd497aa6 Saved progress at the end of the loop
Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: a1850166-547b-4a1b-b88f-1a1a5ca47686
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/1P26v7M
2025-11-17 18:24:35 +00:00
marco370
d661a6099c Update version information and add new changelog entry
Updates version.json to v1.0.16 and adds a new changelog entry for the latest patch deployment.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 32c46a99-4cbe-41a0-9ac2-d97ba30bfe84
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/1P26v7M
2025-11-17 18:21:54 +00:00
Marco Lanzara
9fcaaf829b 🚀 Release v1.0.15
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-17 18:18:49
2025-11-17 18:18:49 +00:00
marco370
7ba65c9d96 Fix errors when retrieving statistics by handling empty results
Update the get_stats function in main.py to safely fetch and process counts from the database, preventing potential errors when no records are found for total logs, recent logs, detections, blocked IPs, and active routers.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 853b2085-c74d-4b3d-adeb-9db4276a24aa
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/1P26v7M
2025-11-17 18:18:30 +00:00
Marco Lanzara
75e68982da 🚀 Release v1.0.14
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-17 18:12:17
2025-11-17 18:12:17 +00:00
marco370
dc91096d9d Improve log processing and add automated tasks
Update log processing to use the correct timestamp field and introduce cron jobs for automated model training and anomaly detection.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: f0653fd5-fc94-4fcb-8d7e-2a0e90fc81bf
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/MkBJZ0L
2025-11-17 18:11:49 +00:00
Marco Lanzara
d649a4ee5f 🚀 Release v1.0.13
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-17 18:06:53
2025-11-17 18:06:53 +00:00
marco370
0ab5e45413 Add database tables for storing detection, training, and whitelist information
Create SQL schemas for `detections`, `training_history`, and `whitelist` tables to manage IDS data, model training logs, and trusted IPs.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: f93295bb-74e9-4b67-b3d7-05437cb8c23f
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/MkBJZ0L
2025-11-17 18:03:30 +00:00
marco370
d345a24572 Improve intrusion detection system with functional updates and database fixes
Update `replit.md` to reflect recent system improvements including a fully functional syslog parser, PostgreSQL database, updated regex patterns, DDoS detection, and automated deployment workflows. Addresses issues with the `network_logs` table schema and incorrect regex matching. Includes SQL query outputs for verification.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 70827608-8ca8-471f-a794-336056b4ce88
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/MkBJZ0L
2025-11-17 17:54:18 +00:00
Marco Lanzara
ceacd8f978 🚀 Release v1.0.12
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-17 17:47:34
2025-11-17 17:47:34 +00:00
marco370
46ab780e60 Add detailed logging to track the syslog parser's execution and state
Enhance syslog_parser.py with debug, info, and error logging statements, including checks for log file existence and robust error handling for database connections and parsing.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 1663254c-86fe-4c1e-966a-168eb8cd8f97
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/MkBJZ0L
2025-11-17 17:46:37 +00:00
Marco Lanzara
2f0b79cc73 🚀 Release v1.0.11
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-17 17:44:26
2025-11-17 17:44:26 +00:00
marco370
95aff1cc0e Update script to apply multiple database migration scripts
Modified update_from_git.sh to iterate and apply all .sql files in the database-schema directory, excluding schema.sql, ensuring database synchronization with applied changes.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 890ee964-4158-4242-aa34-9b7e172a3fcb
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/MkBJZ0L
2025-11-17 17:44:11 +00:00
marco370
369c268bc1 Update deployment instructions and network log schema
Refactor `replit.md` to include `./push-gitlab.sh` deployment. Update `shared/schema.ts` and `database-schema/create_network_logs.sql` to change `routerId` to `routerName` in `networkLogs`, remove the relation, and update fields like `destIp` to `destinationIp`, `bytes`/`packets` to `packetLength`, and add `rawMessage`.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: d2b716bd-04d1-48e8-b4e3-1e6d950d8a15
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/MkBJZ0L
2025-11-17 17:42:17 +00:00
marco370
e5f307af27 Fix error when saving network logs to the database
Correct SQL query to insert data into the network_logs table, resolving "relation 'network_logs' does not exist" errors.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: e8e7edaf-8ac8-493c-b890-465ff617b0ce
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/MkBJZ0L
2025-11-17 17:40:22 +00:00
Marco Lanzara
e259c6c5c0 🚀 Release v1.0.10
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-17 17:38:30
2025-11-17 17:38:30 +00:00
marco370
0d34bf7d3c Update log parsing to better identify network traffic and DDoS events
Refactors the `SyslogParser` class in `python_ml/syslog_parser.py` to use a new, more comprehensive regex pattern (`main_pattern`) for parsing MikroTik logs. This includes improved identification of 'forward' and 'detected-ddos forward' actions, protocol details (UDP, TCP, ICMP), and associated IP addresses, ports, and lengths. The changes aim to accurately capture network traffic and potential DDoS events from MikroTik logs.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: b7377ada-e722-475a-86d2-07f21299ec70
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/MkBJZ0L
2025-11-17 17:35:37 +00:00
marco370
51aa026aae Update logs to show processing progress
Adds new log entries to the syslog parser output, indicating incremental progress in row processing without saving any logs.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: b96cf229-a1bd-4e23-8aee-d92bd339420b
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/MkBJZ0L
2025-11-17 17:34:04 +00:00
marco370
46e280ba4d Update scripts to correctly download and manage project files
Fix a bug in the script responsible for updating the project from Git.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 3459d938-5807-4b03-953c-57b27b9a727a
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/MkBJZ0L
2025-11-17 17:33:12 +00:00
Marco Lanzara
a12b493513 🚀 Release v1.0.9
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-17 17:32:12
2025-11-17 17:32:12 +00:00
marco370
e201c0e770 Add ability to load environment variables from a file
Imported the load_dotenv function from the dotenv library in syslog_parser.py.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 58d752f9-b57e-4e4c-be73-8208077248c4
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/MkBJZ0L
2025-11-17 17:31:57 +00:00
marco370
32d1da96df Provide initial setup instructions for deploying the application
Introduce a detailed bootstrap guide for initial deployment on AlmaLinux servers, including cloning the repository, preserving local configurations like .env, and setting up dependencies. This replaces the previous generic Git update section with specific instructions for the first-time setup.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 0554a4de-1920-4d70-b393-e75b1b92a94e
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/CGAMfXu
2025-11-17 17:15:31 +00:00
Marco Lanzara
c2dcb7bb79 🚀 Release v1.0.8
- Tipo: patch
- Database schema: database-schema/schema.sql (solo struttura)
- Data: 2025-11-17 17:10:48
2025-11-17 17:10:48 +00:00
marco370
3dc1397f20 Add ability to export database schema structure to Git
Modify deploy-to-gitlab.sh to export database schema DDL to database-schema/schema.sql, update commit messages, and adjust versioning.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 8bc17630-4554-4a5a-bac3-2cb45209a6fa
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/CGAMfXu
2025-11-17 17:07:31 +00:00
marco370
0d2a3f7359 Remove sensitive deployment credentials from project configuration
Removes the `git.env` file which contained sensitive GitLab deployment credentials, preventing accidental commits of this information.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: f939a767-0e98-474b-a2b1-72b815b1c8f0
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/CGAMfXu
2025-11-17 17:03:14 +00:00
marco370
ddeba04bd6 Add ability to update system and synchronize database schema from Git
Introduced `update_from_git.sh` for system updates and `export_db_schema.sh` for database schema export to Git.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 73928b7c-5285-4753-b497-d89555ba98b5
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/CGAMfXu
2025-11-17 17:02:02 +00:00
Marco Lanzara
4cfbe61973 🚀 Release v1.0.7
- Tipo: patch
- Database backup: database-backups/ids_v1.0.7_20251117_165440.sql.gz
- Data: 2025-11-17 16:54:46
2025-11-17 16:54:46 +00:00
marco370
35f7ac95f3 Saved progress at the end of the loop
Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 46661474-74cc-44c6-b8bb-0494641e9382
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/c9ITWqD
2025-11-17 16:53:40 +00:00
marco370
1bc61e71c5 Clarify user preferences for manual Git operations
Add a section to replit.md detailing user preferences for manual Git operations, emphasizing that the agent should not automatically perform commits or pushes.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: c0c657b0-aed6-4943-bd3d-d2546eb5cd34
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/c9ITWqD
2025-11-17 16:50:24 +00:00
Marco Lanzara
3fc3456bb5 🚀 Release v1.0.6
- Tipo: patch
- Database backup: database-backups/ids_v1.0.6_20251117_164834.sql.gz
- Data: 2025-11-17 16:48:40
2025-11-17 16:48:41 +00:00
marco370
62c3a2b238 Update application to use environment variables and force IPv4
Modify syslog_parser.py to load environment variables from .env file and force IPv4 connection to the database. Update replit.md to reflect recent fixes and workflow changes. Increment version in version.json.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 727221f9-ad54-4498-b2e4-e87a951b4308
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/c9ITWqD
2025-11-17 16:48:22 +00:00
marco370
7b6fc148a5 Update system to correctly process log files with proper permissions
Correctly sets file permissions and ownership for the `.env` file and demonstrates successful database connection and log file processing by the syslog parser script.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 91f9e939-55de-4fd8-9f48-6360540b2bfe
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/c9ITWqD
2025-11-17 16:42:42 +00:00
marco370
1936a29ae4 Improve database connection security and restart log parser
Modify pg_hba.conf to use SCRAM-SHA-256 for database authentication and restart the syslog parser service.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 2c44c1cf-5725-4dd8-8305-3fa9b2a5136c
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/c9ITWqD
2025-11-17 16:39:09 +00:00
marco370
08c2373aa5 Configure database user and password for secure access
Updates PostgreSQL configuration to set password encryption to SCRAM-SHA-256, creates a new user 'ids_user' with the specified password, grants necessary privileges on the 'ids_database', and configures default privileges for future objects. Includes troubleshooting steps and log excerpts related to authentication failures.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: f8be77ab-2269-4666-9e56-9309e455e81c
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/c9ITWqD
2025-11-17 16:28:10 +00:00
marco370
7c36dc039b Fix PostgreSQL authentication issues for user access
Introduces a shell script to modify pg_hba.conf and reset PostgreSQL user passwords, resolving authentication failures.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 39b3e0c3-d6b2-4c6f-afb4-e32fe7f09b02
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/c9ITWqD
2025-11-17 16:25:32 +00:00
marco370
23db30f420 Saved progress at the end of the loop
Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: 20240e62-df0f-4e7a-b439-d77adf63252f
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/c9ITWqD
2025-11-17 16:08:39 +00:00
Marco Lanzara
265a8cdd66 🚀 Release v1.0.4
- Tipo: patch
- Database backup: database-backups/ids_v1.0.4_20251117_160458.sql.gz
- Data: 2025-11-17 16:05:07
2025-11-17 16:05:07 +00:00
marco370
cbeb5e666e Improve system update script with ownership fixes and better output
Update deployment script to automatically fix Git ownership issues, add backup for git.env, implement stash for local changes, and enhance output formatting for a clearer user experience.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: abc30983-39f6-45cf-b89b-5b3507bb3c27
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/c9ITWqD
2025-11-17 16:03:32 +00:00
marco370
a89ee20a2b Saved progress at the end of the loop
Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: eb55e906-3695-457d-9ed3-3af05be4c14b
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/c9ITWqD
2025-11-17 16:00:19 +00:00
Marco Lanzara
a367bb8f7b 🚀 Release v1.0.3
- Tipo: patch
- Database backup: database-backups/ids_v1.0.3_20251117_160011.sql.gz
- Data: 2025-11-17 16:00:18
2025-11-17 16:00:18 +00:00
marco370
f356181ada Add comprehensive installation and configuration guides for the IDS system
Create detailed Markdown documentation for IDS installation, including quick start, step-by-step, and configuration details.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 7a657272-55ba-4a79-9a2e-f1ed9bc7a528
Replit-Commit-Checkpoint-Type: full_checkpoint
Replit-Commit-Event-Id: e8ff5164-9660-4159-93ac-0d6cae981bb0
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/449cf7c4-c97a-45ae-8234-e5c5b8d6a84f/7a657272-55ba-4a79-9a2e-f1ed9bc7a528/c9ITWqD
2025-11-17 15:19:13 +00:00
205 changed files with 24955 additions and 574 deletions


@@ -1,5 +1,6 @@
 # Database PostgreSQL
-PGHOST=localhost
+# IMPORTANT: use 127.0.0.1 instead of localhost to force IPv4 (avoids issues with ::1)
+PGHOST=127.0.0.1
 PGPORT=5432
 PGDATABASE=ids_database
 PGUSER=ids_user

.gitignore (vendored, 52 changes)

@@ -3,4 +3,54 @@ dist
 .DS_Store
 server/public
 vite.config.ts.*
 *.tar.gz
+
+# Environment variables and secrets (do NOT commit!)
+.env
+.env.*
+!.env.example
+git.env
+!git.env.example
+
+# Backups and logs
+database-backups/
+backups/
+*.log
+*.log.*
+
+# Database schema (ONLY schema.sql is committed, NO data)
+!database-schema/
+!database-schema/schema.sql
+
+# Python
+__pycache__/
+*.py[cod]
+*$py.class
+*.so
+.Python
+*.egg-info/
+dist/
+build/
+*.pkl
+*.joblib
+python_ml/models/*.pkl
+python_ml/models/*.joblib
+
+# Database dumps (EXCLUDING database-schema/schema.sql)
+database-backups/*.sql
+database-backups/*.sql.gz
+*.dump
+
+# IDE
+.vscode/
+.idea/
+*.swp
+*.swo
+*~
+
+# OS
+.DS_Store
+Thumbs.db
+
+# Versioning
+version.json.backup


@@ -1,4 +1,4 @@
-modules = ["nodejs-20", "web", "postgresql-16", "python-3.11"]
+modules = ["nodejs-20", "web", "python-3.11", "postgresql-16"]
 run = "npm run dev"
 hidden = [".config", ".git", "generated-icon.png", "node_modules", "dist"]

GUIDA_INSTALLAZIONE.md (new file, 284 lines)

@@ -0,0 +1,284 @@
# 🚀 IDS - Quick Installation Guide
Complete installation of the IDS system on AlmaLinux 9, with updates pulled from git.alfacom.it
---
## 📖 Available Documents
1. **`deployment/INSTALLAZIONE_STEP_BY_STEP.md`** ← **FOLLOW THIS GUIDE**
   - 📋 Complete step-by-step guide
   - ✅ 25 steps with exact commands
   - 🔍 Full checklist
   - 🐛 Troubleshooting
2. **`deployment/COMANDI_RAPIDI.md`**
   - ⚡ Quick command reference
   - 🔄 Common operations
   - 📊 Monitoring and debugging
3. **`deployment/README_DEPLOYMENT.md`**
   - 📚 Complete technical documentation
   - 🏗️ System architecture
   - ⚙️ Advanced configuration
4. **`deployment/CONFIGURAZIONE_ROUTER_MIKROTIK.md`**
   - 📡 MikroTik router setup
   - 🔧 Syslog configuration
   - ✅ Connection checks
---
## ⚡ Quick Start (3 minutes)
### 1. Prerequisites
- ✅ AlmaLinux 9 server
- ✅ Root SSH access
- ✅ git.alfacom.it credentials (username + token)
### 2. Base Installation
```bash
# Connect to the server
ssh root@<IP_SERVER>
# Temporary repository clone
cd /tmp
git clone https://<USER>:<TOKEN>@git.alfacom.it/<OWNER>/ids.git
cd ids
# Run setup
chmod +x deployment/setup_almalinux.sh
./deployment/setup_almalinux.sh
```
### 3. Final Setup
```bash
# Clone into the final location
cd /opt/ids
sudo -u ids git clone https://<USER>:<TOKEN>@git.alfacom.it/<OWNER>/ids.git .
# Configure git.env
sudo -u ids cp git.env.example git.env
sudo -u ids nano git.env  # Enter the Git credentials
# Configure .env
sudo -u ids cp .env.example .env
sudo -u ids nano .env  # Enter the passwords (see below)
```
### 4. Generate Secure Passwords
```bash
# Database password
openssl rand -base64 32
# Session secret
openssl rand -base64 32
# Copy the outputs into .env
```
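If you prefer, both secrets can be generated in one shot with a short Python sketch (standard library only; the script name and variable names are illustrative, not part of the repo):
```python
# generate_secrets.py - illustrative equivalent of `openssl rand -base64 32`
import base64
import secrets

def random_secret(n_bytes: int = 32) -> str:
    """Return n_bytes of cryptographically secure randomness, base64-encoded."""
    return base64.b64encode(secrets.token_bytes(n_bytes)).decode()

print(f"PGPASSWORD={random_secret()}")
print(f"SESSION_SECRET={random_secret()}")
```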
### 5. Install Dependencies
```bash
cd /opt/ids
sudo -u ids npm install
cd python_ml && sudo -u ids pip3.11 install -r requirements.txt
cd /opt/ids && sudo -u ids npm run db:push
```
### 6. Configure Syslog and Start
```bash
# Syslog for the routers
./deployment/setup_syslog_server.sh
# Start everything
./deployment/setup_crontab.sh
```
### 7. Verify
```bash
# Active processes
ps aux | grep -E 'python|node|syslog'
# Working APIs
curl http://localhost:8000/health
curl http://localhost:5000
# Web dashboard
http://<IP_SERVER>:5000
```
---
## 🔧 Configuration Files
### git.env (Git Credentials)
```bash
GITLAB_USER=your-username
GITLAB_TOKEN=glpat-abc123xyz
GITLAB_REPO=https://git.alfacom.it/owner/ids.git
GITLAB_BRANCH=main
```
> **Token**: git.alfacom.it → Settings → Access Tokens
> Permissions: `api`, `read_repository`, `write_repository`
### .env (Application Secrets)
```bash
PGHOST=localhost
PGPORT=5432
PGDATABASE=ids_database
PGUSER=ids_user
PGPASSWORD=<openssl rand -base64 32>
SESSION_SECRET=<openssl rand -base64 32>
VITE_PYTHON_API_URL=http://localhost:8000
NODE_ENV=production
```
> **IMPORTANT**: `.env` and `git.env` must NOT go to Git (they are in `.gitignore`)
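Note that a later fix (see the first diff at the top of this change set) switched `PGHOST` to `127.0.0.1` to force IPv4 and avoid `::1` resolution problems. As a minimal sketch of how the Python side consumes these variables, assuming `python-dotenv` and `psycopg2-binary` as pinned in `requirements.txt` (the script name and fallback values are illustrative):
```python
# db_connect.py - illustrative sketch of the .env-driven database connection
import os

import psycopg2
from dotenv import load_dotenv

load_dotenv()  # reads .env from the working directory

conn = psycopg2.connect(
    host=os.getenv("PGHOST", "127.0.0.1"),  # 127.0.0.1 forces IPv4 (avoids ::1)
    port=os.getenv("PGPORT", "5432"),
    dbname=os.getenv("PGDATABASE", "ids_database"),
    user=os.getenv("PGUSER", "ids_user"),
    password=os.getenv("PGPASSWORD"),
)
print("connected:", conn.get_dsn_parameters()["host"])
```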
---
## 📡 MikroTik Router Configuration
On **EVERY router**:
```mikrotik
/system logging action
add name=ids-server target=remote remote=<IP_SERVER> remote-port=514
/system logging
add action=ids-server topics=firewall,info
/ip firewall filter
add chain=forward action=accept log=yes log-prefix="ACCEPT: "
add chain=forward action=drop log=yes log-prefix="DROP: "
```
Check that logs arrive on the server:
```bash
tail -f /var/log/mikrotik/raw.log
```
---
## 🔄 Updates from Git
```bash
cd /opt/ids
sudo -u ids ./deployment/update_from_git.sh
```
This does the following:
1. Back up `.env`
2. `git pull` from git.alfacom.it
3. Restore `.env`
4. Update dependencies
5. Sync the database
6. Restart the services
---
## 🎯 First ML Training
After **24 hours** of log collection (at least 10,000 logs):
```bash
# Check the collected logs
psql -U ids_user -d ids_database -c "SELECT COUNT(*) FROM network_logs;"
# Training
curl -X POST http://localhost:8000/train \
  -H "Content-Type: application/json" \
  -d '{"max_records": 10000, "hours_back": 24}'
# Detection (already automatic every 5 min)
curl -X POST http://localhost:8000/detect \
  -H "Content-Type: application/json" \
  -d '{"max_records": 5000, "auto_block": true, "risk_threshold": 75}'
```
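The same two calls can be driven from Python; a minimal sketch assuming `httpx` (the payload fields mirror the curl commands above; if the backend enforces `IDS_API_KEY`, the corresponding auth header would also be needed):
```python
# train_and_detect.py - illustrative driver for the /train and /detect endpoints
import httpx

API = "http://localhost:8000"  # VITE_PYTHON_API_URL from .env

# Train on up to 10,000 records from the last 24 hours
train = httpx.post(f"{API}/train",
                   json={"max_records": 10000, "hours_back": 24}, timeout=30)
print(train.json())

# One detection pass, auto-blocking IPs with risk score >= 75
detect = httpx.post(f"{API}/detect",
                    json={"max_records": 5000, "auto_block": True, "risk_threshold": 75},
                    timeout=30)
print(detect.json())
```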
---
## 📊 Monitoring
### Main Logs
```bash
tail -f /var/log/ids/backend.log        # Python backend
tail -f /var/log/ids/frontend.log       # Node frontend
tail -f /var/log/ids/syslog_parser.log  # Router log parser
tail -f /var/log/ids/training.log       # ML training
tail -f /var/log/ids/detect.log         # Detection
tail -f /var/log/mikrotik/raw.log       # Incoming router logs
```
### Useful Commands
```bash
# Full restart
/opt/ids/deployment/restart_all.sh
# Process status
ps aux | grep -E 'python|node|syslog'
# Database
psql -U ids_user -d ids_database
# Crontab
crontab -u ids -l
```
---
## 🐛 Common Problems
| Problem | Solution |
|---------|----------|
| Logs not arriving | `systemctl restart rsyslog` + check the firewall |
| Empty database | Check that `syslog_parser.py` is running |
| Backend not responding | `/opt/ids/deployment/check_backend.sh` |
| Frontend not loading | `/opt/ids/deployment/check_frontend.sh` |
---
## 📚 Full Documentation
- **Installation**: `deployment/INSTALLAZIONE_STEP_BY_STEP.md`
- **Commands**: `deployment/COMANDI_RAPIDI.md`
- **Deployment**: `deployment/README_DEPLOYMENT.md`
- **Routers**: `deployment/CONFIGURAZIONE_ROUTER_MIKROTIK.md`
- **FAQ**: `RISPOSTA_DEPLOYMENT.md`
---
## ✅ Quick Checklist
- [ ] AlmaLinux 9 server ready
- [ ] `setup_almalinux.sh` executed
- [ ] `git.env` configured
- [ ] `.env` configured with secure passwords
- [ ] Dependencies installed (`npm install`, `pip install`)
- [ ] Database schema synced (`npm run db:push`)
- [ ] Syslog configured (`setup_syslog_server.sh`)
- [ ] MikroTik routers configured
- [ ] System started (`setup_crontab.sh`)
- [ ] Processes running (python, node, syslog_parser)
- [ ] APIs working (curl localhost:8000/health)
- [ ] Dashboard reachable (http://IP:5000)
- [ ] Logs arriving (`tail -f /var/log/mikrotik/raw.log`)
- [ ] Database filling up (SELECT COUNT(*) FROM network_logs)
- [ ] Training run (after 24h)
---
**System ready! 🛡️**
For details: `cat deployment/INSTALLAZIONE_STEP_BY_STEP.md`

MIKROTIK_API_FIX.md (new file, 311 lines)

@@ -0,0 +1,311 @@
# MikroTik API Connection Fix
## 🐛 PROBLEM SOLVED
**Error**: MikroTik API connection timeout: the router was not responding to HTTP requests.
**Root cause**: Confusion between the **Binary API** (port 8728) and the **REST API** (port 80/443).
## 🔍 MikroTik APIs: Binary vs REST
MikroTik RouterOS has **TWO completely different kinds of API**:
| Type | Port | Protocol | RouterOS | Compatibility |
|------|------|----------|----------|---------------|
| **Binary API** | 8728 | Proprietary RouterOS | All | ❌ Not HTTP (`routeros-api` library) |
| **REST API** | 80/443 | Standard HTTP/HTTPS | **>= 7.1** | ✅ HTTP with `httpx` |
**The IDS uses the REST API** (httpx + HTTP), therefore:
- ✅ **Port 80** (HTTP): **RECOMMENDED**
- ✅ **Port 443** (HTTPS): if SSL is required
- ❌ **Port 8728**: Binary API, NOT REST (timeout)
- ❌ **Port 8729**: Binary API over SSL, NOT REST (timeout)
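As a minimal sketch of the working path, the identity check below mirrors the curl test later in this document; it assumes `httpx` (pinned in `requirements.txt`) and illustrative credentials:
```python
# rest_check.py - illustrative REST API connectivity test
import httpx

ROUTER = "185.203.24.2"       # router used in the examples below
AUTH = ("admin", "password")  # illustrative credentials

# Port 80 speaks HTTP REST (RouterOS >= 7.1); port 8728 would time out,
# because it is the Binary API, not HTTP.
resp = httpx.get(f"http://{ROUTER}:80/rest/system/identity", auth=AUTH, timeout=5)
resp.raise_for_status()
print(resp.json())  # expected: {"name": "AlfaBit"}
```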
## ✅ SOLUTION
### 1️⃣ Check the RouterOS Version
```bash
# On the MikroTik router (via Winbox/SSH)
/system resource print
```
**If RouterOS >= 7.1** → use the **REST API** (port 80/443)
**If RouterOS < 7.1** → the REST API does not exist; use the Binary API
### 2️⃣ Configure the Correct Port
**For RouterOS 7.14.2 (Alfabit):**
```sql
-- Database: use port 80 (REST API over HTTP)
UPDATE routers SET api_port = 80 WHERE name = 'Alfabit';
```
**Available ports**:
- **80** → REST API HTTP (✅ RECOMMENDED)
- **443** → REST API HTTPS (if SSL is required)
- ~~8728~~ → Binary API (not compatible)
- ~~8729~~ → Binary API over SSL (not compatible)
### 3️⃣ Manual Test
```bash
# Test the connection on port 80
curl http://185.203.24.2:80/rest/system/identity \
  -u admin:password \
  --max-time 5
# Expected output:
# {"name":"AlfaBit"}
```
---
## 📋 CHECK THE ROUTER CONFIGURATION
### 1️⃣ Check the Database
```sql
-- On AlmaLinux
psql $DATABASE_URL -c "SELECT name, ip_address, api_port, username, enabled FROM routers WHERE enabled = true;"
```
**Expected output**:
```
name     | ip_address    | api_port | username | enabled
---------+---------------+----------+----------+---------
Alfabit  | 185.203.24.2  | 80       | admin    | t
```
**Check that**:
- ✅ `api_port` = **80** (REST API HTTP)
- ✅ `enabled` = **true**
- ✅ `username` and `password` are correct
**If the port is wrong**:
```sql
-- Change the port from 8728 to 80
UPDATE routers SET api_port = 80 WHERE ip_address = '185.203.24.2';
```
### 2️⃣ Test the Python Connection
```bash
# On AlmaLinux
cd /opt/ids/python_ml
source venv/bin/activate
# Automatic connection test (uses the data from the database)
python3 test_mikrotik_connection.py
```
**Expected output**:
```
✅ Connessione OK!
✅ Trovati X IP in lista 'ddos_blocked'
✅ IP bloccato con successo!
✅ IP sbloccato con successo!
```
---
## 🚀 DEPLOYMENT ON ALMALINUX
### Full Workflow
#### 1️⃣ **On Replit** (ALREADY DONE ✅)
- File `python_ml/mikrotik_manager.py` modified
- Fix already committed on Replit
#### 2️⃣ **Locally: Push to GitLab**
```bash
# From your local machine (NOT on Replit - it is blocked there)
./push-gitlab.sh
```
Requested input:
```
Commit message: Fix MikroTik API - porta non usata in base_url
```
#### 3️⃣ **On AlmaLinux: Pull & Deploy**
```bash
# SSH into ids.alfacom.it
ssh root@ids.alfacom.it
# Pull the latest changes
cd /opt/ids
./update_from_git.sh
# Restart the ML backend to apply the fix
sudo systemctl restart ids-ml-backend
# Check that the service is active
systemctl status ids-ml-backend
# Check that the API responds
curl http://localhost:8000/health
```
#### 4️⃣ **Test IP Blocking**
```bash
# From the web dashboard: https://ids.alfacom.it/routers
# 1. Check the configured routers
# 2. Click "Test Connessione" on router 185.203.24.2
# 3. It should show ✅ "Connessione OK"
# From the detections dashboard:
# 1. Select a detection with score >= 80
# 2. Click "Blocca IP"
# 3. Verify the block on the router
```
---
## 🔧 TROUBLESHOOTING
### Connection Still Failing?
#### A. Check the WWW Service on the Router
**The REST API uses the `www` service (port 80) or `www-ssl` (port 443)**:
```bash
# On the MikroTik router (via Winbox/SSH)
/ip service print
# Check that www is enabled:
# 0 www 80 * ← REST API HTTP
# 1 www-ssl 443 * ← REST API HTTPS
```
**Fix on the MikroTik**:
```bash
# Enable the www service for the REST API
/ip service enable www
/ip service set www port=80 address=0.0.0.0/0
# Or with SSL (port 443)
/ip service enable www-ssl
/ip service set www-ssl port=443
```
**NOTE**: `api` (port 8728) is the **Binary API**, NOT REST!
#### B. Check the AlmaLinux Firewall
```bash
# On AlmaLinux - allow traffic towards the router
sudo firewall-cmd --permanent --add-rich-rule='rule family="ipv4" destination address="185.203.24.2" port protocol="tcp" port="80" accept'
sudo firewall-cmd --reload
```
#### C. Raw Connection Test
```bash
# Test the TCP connection on port 80
telnet 185.203.24.2 80
# Test the REST API with curl
curl -v http://185.203.24.2:80/rest/system/identity \
  -u admin:password \
  --max-time 5
# Expected output:
# {"name":"AlfaBit"}
```
**If it times out**: the `www` service is not enabled on the router
#### D. Wrong Credentials?
```sql
-- Check the credentials in the database
psql $DATABASE_URL -c "SELECT name, ip_address, username FROM routers WHERE ip_address = '185.203.24.2';"
-- If the password is wrong, update it:
-- UPDATE routers SET password = 'nuova_password' WHERE ip_address = '185.203.24.2';
```
---
## ✅ FINAL CHECK
After deployment, verify that:
1. **The ML backend is active**:
```bash
systemctl status ids-ml-backend  # must be "active (running)"
```
2. **The API responds**:
```bash
curl http://localhost:8000/health
# {"status":"healthy","database":"connected",...}
```
3. **Auto-blocking works**:
```bash
# Check the auto-blocking logs
journalctl -u ids-auto-block.timer -n 50
```
4. **IPs are blocked on the router**:
- Dashboard: https://ids.alfacom.it/detections
- Filter: "Bloccati"
- Check that the green "Bloccato" badge is visible
---
## 📊 CORRECT CONFIGURATION
| Parameter | Value (RouterOS >= 7.1) | Notes |
|-----------|--------------------------|-------|
| **api_port** | **80** (HTTP) or **443** (HTTPS) | ✅ REST API |
| **Router service** | `www` (HTTP) or `www-ssl` (HTTPS) | Enable on the MikroTik |
| **Endpoint** | `/rest/system/identity` | Connection test |
| **Endpoint** | `/rest/ip/firewall/address-list` | Block management |
| **Auth** | Basic (username:password base64) | Authorization header |
| **Verify SSL** | False | Self-signed certs OK |
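Putting the table together, a sketch of a block operation against the address-list endpoint (assuming `httpx`, and that RouterOS creates a new entry on PUT to the collection; the list name `ddos_blocked` comes from the test output above, while the function name and comment text are illustrative):
```python
# block_ip.py - illustrative sketch of blocking an IP via the REST API
import httpx

def block_ip(router: str, auth: tuple[str, str], ip: str) -> dict:
    """Add `ip` to the ddos_blocked address-list on `router`."""
    resp = httpx.put(
        f"http://{router}:80/rest/ip/firewall/address-list",
        auth=auth,  # Basic auth, as in the table above
        json={"list": "ddos_blocked", "address": ip, "comment": "blocked by IDS"},
        timeout=5,
    )
    resp.raise_for_status()
    return resp.json()  # the created address-list entry

if __name__ == "__main__":
    print(block_ip("185.203.24.2", ("admin", "password"), "203.0.113.7"))
```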
---
## 🎯 SUMMARY
### ❌ WRONG (Binary API - Timeout)
```bash
# Port 8728 speaks the BINARY protocol, not HTTP REST
curl http://185.203.24.2:8728/rest/...
# Timeout: incompatible protocol
```
### ✅ CORRECT (REST API - Works)
```bash
# Port 80 speaks standard HTTP REST
curl http://185.203.24.2:80/rest/system/identity \
  -u admin:password
# Output: {"name":"AlfaBit"}
```
**Database configured**:
```sql
-- Router Alfabit configured with port 80
SELECT name, ip_address, api_port FROM routers;
-- Alfabit | 185.203.24.2 | 80
```
---
## 📝 CHANGELOG
**November 25, 2024**:
1. ✅ Identified the problem: port 8728 = Binary API (not HTTP)
2. ✅ Verified that RouterOS 7.14.2 supports the REST API
3. ✅ Configured the router with port 80 (REST API HTTP)
4. ✅ Manual curl test: `{"name":"AlfaBit"}`
5. ✅ Router inserted into the database with port 80
**Required test**: `python3 test_mikrotik_connection.py`
**Version**: IDS 2.0.0 (Hybrid Detector)
**RouterOS**: 7.14.2 (stable)
**API type**: REST (HTTP port 80)


@@ -0,0 +1,62 @@
5:37:12 PM [express] POST /api/ml/train 200 in 5ms :: {"message":"Training avviato in background","m…
5:37:12 PM [express] GET /api/training-history 304 in 2ms :: []
5:37:12 PM [express] GET /api/ml/stats 304 in 14ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:37:22 PM [express] GET /api/training-history 304 in 16ms :: []
5:37:22 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:37:32 PM [express] GET /api/training-history 304 in 12ms :: []
5:37:32 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:37:42 PM [express] GET /api/training-history 304 in 12ms :: []
5:37:42 PM [express] GET /api/ml/stats 304 in 14ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:37:52 PM [express] GET /api/training-history 304 in 12ms :: []
5:37:52 PM [express] GET /api/ml/stats 304 in 14ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:38:02 PM [express] GET /api/training-history 304 in 12ms :: []
5:38:02 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:38:12 PM [express] GET /api/training-history 304 in 10ms :: []
5:38:12 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:38:22 PM [express] GET /api/training-history 304 in 13ms :: []
5:38:22 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:38:32 PM [express] GET /api/training-history 304 in 12ms :: []
5:38:32 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:38:32 PM [express] GET /api/stats 200 in 11ms :: {"routers":{"total":1,"enabled":1},"detections":{…
5:38:33 PM [express] GET /api/detections 304 in 14ms :: []
5:38:33 PM [express] GET /api/routers 200 in 12ms :: [{"id":"aedb9b6e-6d38-4926-8a45-b2f0c7b48c3d","…
5:38:36 PM [express] GET /api/detections 304 in 2ms :: []
5:38:38 PM [express] GET /api/training-history 304 in 3ms :: []
5:38:38 PM [express] GET /api/ml/stats 304 in 18ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:38:42 PM [express] GET /api/detections 304 in 5ms :: []
5:38:42 PM [express] GET /api/routers 304 in 5ms :: [{"id":"aedb9b6e-6d38-4926-8a45-b2f0c7b48c3d","n…
5:38:42 PM [express] GET /api/stats 304 in 12ms :: {"routers":{"total":1,"enabled":1},"detections":{…
5:38:47 PM [express] GET /api/detections 304 in 3ms :: []
5:38:48 PM [express] GET /api/detections 304 in 2ms :: []
5:38:49 PM [express] GET /api/training-history 304 in 3ms :: []
5:38:49 PM [express] GET /api/ml/stats 304 in 19ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:38:59 PM [express] GET /api/training-history 304 in 11ms :: []
5:38:59 PM [express] GET /api/ml/stats 304 in 16ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:39:09 PM [express] GET /api/training-history 304 in 11ms :: []
5:39:09 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:39:19 PM [express] GET /api/training-history 304 in 11ms :: []
5:39:19 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:39:29 PM [express] GET /api/training-history 304 in 11ms :: []
5:39:29 PM [express] GET /api/ml/stats 304 in 14ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:39:39 PM [express] GET /api/training-history 304 in 11ms :: []
5:39:39 PM [express] GET /api/ml/stats 304 in 14ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:39:50 PM [express] GET /api/training-history 200 in 12ms :: []
5:39:50 PM [express] GET /api/ml/stats 200 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:40:00 PM [express] GET /api/training-history 304 in 12ms :: []
5:40:00 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:40:10 PM [express] GET /api/training-history 304 in 13ms :: []
5:40:10 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:40:13 PM [express] POST /api/ml/train 200 in 4ms :: {"message":"Training avviato in background","m…
5:40:13 PM [express] GET /api/training-history 304 in 2ms :: []
5:40:13 PM [express] GET /api/ml/stats 304 in 14ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:40:23 PM [express] GET /api/training-history 304 in 12ms :: []
5:40:23 PM [express] GET /api/ml/stats 304 in 16ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:40:33 PM [express] GET /api/training-history 304 in 11ms :: []
5:40:33 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:40:44 PM [express] GET /api/training-history 304 in 13ms :: []
5:40:44 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…


@@ -0,0 +1,55 @@
╔═══════════════════════════════════════════════╗
║ ✅ AGGIORNAMENTO COMPLETATO ║
╚═══════════════════════════════════════════════╝
📋 VERIFICA SISTEMA:
• Log backend: tail -f /var/log/ids/backend.log
• Log frontend: tail -f /var/log/ids/frontend.log
• API backend: curl http://localhost:8000/health
• Frontend: curl http://localhost:5000
📊 STATO SERVIZI:
root 20860 0.0 0.0 18344 6400 pts/3 S+ Nov22 0:00 sudo tail -f /var/log/ids/syslog_parser.log
root 20862 0.0 0.0 3088 1536 pts/3 S+ Nov22 0:02 tail -f /var/log/ids/syslog_parser.log
ids 64096 4.0 1.8 1394944 291304 ? Ssl 12:12 9:44 /opt/ids/python_ml/venv/bin/python3 main.py
ids 64102 16.0 0.1 52084 19456 ? Ss 12:12 38:36 /opt/ids/python_ml/venv/bin/python3 syslog_parser.py
root 69074 0.0 0.2 731152 33612 pts/0 Rl+ 16:13 0:00 /usr/bin/node /usr/bin/npm run dev
[root@ids ids]# sudo /opt/ids/deployment/setup_analytics_timer.sh
╔═══════════════════════════════════════════════╗
║ IDS Analytics Timer Setup ║
╚═══════════════════════════════════════════════╝
📋 Copia file systemd...
🔄 Reload systemd daemon...
⚙ Enable e start timer...
📊 Stato timer:
● ids-analytics-aggregator.timer - IDS Analytics Aggregation Timer - Runs every hour
Loaded: loaded (/etc/systemd/system/ids-analytics-aggregator.timer; enabled; preset: disabled)
Active: active (waiting) since Mon 2025-11-24 12:13:35 CET; 4h 3min ago
Until: Mon 2025-11-24 12:13:35 CET; 4h 3min ago
Trigger: Mon 2025-11-24 17:05:00 CET; 47min left
Triggers: ● ids-analytics-aggregator.service
Nov 24 12:13:35 ids.alfacom.it systemd[1]: Stopped IDS Analytics Aggregation Timer - Runs every hour.
Nov 24 12:13:35 ids.alfacom.it systemd[1]: Stopping IDS Analytics Aggregation Timer - Runs every hour...
Nov 24 12:13:35 ids.alfacom.it systemd[1]: Started IDS Analytics Aggregation Timer - Runs every hour.
📅 Prossime esecuzioni:
NEXT LEFT LAST PASSED UNIT ACTIVATES
Mon 2025-11-24 17:05:00 CET 47min left Mon 2025-11-24 16:05:01 CET 12min ago ids-analytics-aggregator.timer ids-analytics-aggregator.service
1 timers listed.
Pass --all to see loaded but inactive timers, too.
╔═══════════════════════════════════════════════╗
║ ✅ ANALYTICS TIMER CONFIGURATO ║
╚═══════════════════════════════════════════════╝
📝 Comandi utili:
Stato timer: sudo systemctl status ids-analytics-aggregator.timer
Prossime run: sudo systemctl list-timers
Log aggregazione: sudo journalctl -u ids-analytics-aggregator -f
Test manuale: sudo systemctl start ids-analytics-aggregator


@@ -0,0 +1,43 @@
📦 Aggiornamento dipendenze Python...
Defaulting to user installation because normal site-packages is not writeable
Requirement already satisfied: fastapi==0.104.1 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 1)) (0.104.1)
Requirement already satisfied: uvicorn==0.24.0 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 2)) (0.24.0)
Requirement already satisfied: pandas==2.1.3 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 3)) (2.1.3)
Requirement already satisfied: numpy==1.26.2 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 4)) (1.26.2)
Requirement already satisfied: scikit-learn==1.3.2 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 5)) (1.3.2)
Requirement already satisfied: psycopg2-binary==2.9.9 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 6)) (2.9.9)
Requirement already satisfied: python-dotenv==1.0.0 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 7)) (1.0.0)
Requirement already satisfied: pydantic==2.5.0 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 8)) (2.5.0)
Requirement already satisfied: httpx==0.25.1 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 9)) (0.25.1)
Collecting Cython==3.0.5
Downloading Cython-3.0.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.6 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.6/3.6 MB 8.9 MB/s eta 0:00:00
Collecting xgboost==2.0.3
Using cached xgboost-2.0.3-py3-none-manylinux2014_x86_64.whl (297.1 MB)
Collecting joblib==1.3.2
Using cached joblib-1.3.2-py3-none-any.whl (302 kB)
Collecting eif==2.0.2
Using cached eif-2.0.2.tar.gz (1.6 MB)
Preparing metadata (setup.py) ... error
error: subprocess-exited-with-error
× python setup.py egg_info did not run successfully.
│ exit code: 1
╰─> [6 lines of output]
Traceback (most recent call last):
File "<string>", line 2, in <module>
File "<pip-setuptools-caller>", line 34, in <module>
File "/tmp/pip-install-843eies2/eif_72b54a0861444b02867269ed1670c0ce/setup.py", line 4, in <module>
from Cython.Distutils import build_ext
ModuleNotFoundError: No module named 'Cython'
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed
× Encountered error while generating package metadata.
╰─> See above for output.
note: This is an issue with the package mentioned above, not pip.
hint: See above for details.


@@ -0,0 +1,111 @@
Aggiornamento schema database...
Applicando migrazioni SQL...
Sistema Migrazioni Database (Versioned)
Verifica sistema versioning...
psql:/opt/ids/database-schema/migrations/000_init_schema_version.sql:14: NOTICE: relation "schema_version" already exists, skipping
✅ Sistema versioning attivo
Versione database corrente: 2
✅ Database già aggiornato (nessuna migrazione da applicare)
✅ Migrazioni SQL applicate
Sincronizzando schema Drizzle...
> rest-express@1.0.0 db:push
> drizzle-kit push
No config path provided, using default 'drizzle.config.ts'
Reading config file '/opt/ids/drizzle.config.ts'
Using 'pg' driver for database querying
[✓] Pulling schema from database...
[✓] Changes applied
✅ Schema database completamente sincronizzato
Configurazione RSyslog (log MikroTik)...
✅ RSyslog già configurato
Restart servizi...
✅ Servizi riavviati
╔═══════════════════════════════════════════════╗
║ ✅ AGGIORNAMENTO COMPLETATO ║
╚═══════════════════════════════════════════════╝
VERIFICA SISTEMA:
• Log backend: tail -f /var/log/ids/backend.log
• Log frontend: tail -f /var/log/ids/frontend.log
• API backend: curl http://localhost:8000/health
• Frontend: curl http://localhost:5000
STATO SERVIZI:
ids 1547 6.0 4.0 2187384 650548 ? Sl Nov21 57:18 /usr/bin/python3.11 main.py
root 12542 0.0 0.0 18344 8576 pts/3 S 09:45 0:00 sudo -u ids python3 syslog_parser.py
ids 12544 0.2 0.1 52844 27132 pts/3 S 09:45 0:06 python3 syslog_parser.py
root 13114 0.0 0.0 18344 8576 pts/3 S 09:58 0:00 sudo -u ids python3 syslog_parser.py
ids 13116 8.3 0.1 52928 27136 pts/3 S 09:58 3:04 python3 syslog_parser.py
root 14333 0.0 0.2 729796 33360 pts/0 Rl+ 10:35 0:00 /usr/bin/node /usr/bin/npm run dev
[root@ids ids]# sudo ./deployment/setup_systemd_services.sh
Setup Systemd Services per IDS
Generazione IDS_API_KEY...
✅ IDS_API_KEY generata e salvata in .env
Installazione systemd units...
♻ Reload systemd daemon...
⏸ Fermando processi manuali esistenti...
Attivazione servizi...
Created symlink /etc/systemd/system/multi-user.target.wants/ids-ml-backend.service → /etc/systemd/system/ids-ml-backend.service.
✅ ids-ml-backend.service attivato
Created symlink /etc/systemd/system/multi-user.target.wants/ids-syslog-parser.service → /etc/systemd/system/ids-syslog-parser.service.
✅ ids-syslog-parser.service attivato
📊 Status Servizi:
● ids-ml-backend.service - IDS ML Backend (FastAPI)
Loaded: loaded (/etc/systemd/system/ids-ml-backend.service; enabled; preset: disabled)
Active: activating (auto-restart) (Result: exit-code) since Sat 2025-11-22 10:35:57 CET; 2s ago
Process: 14445 ExecStart=/usr/bin/python3 main.py (code=exited, status=1/FAILURE)
Main PID: 14445 (code=exited, status=1/FAILURE)
CPU: 21ms
● ids-syslog-parser.service - IDS Syslog Parser (Network Logs Processor)
Loaded: loaded (/etc/systemd/system/ids-syslog-parser.service; enabled; preset: disabled)
Active: active (running) since Sat 2025-11-22 10:35:57 CET; 2s ago
Main PID: 14471 (python3)
Tasks: 1 (limit: 100409)
Memory: 10.8M (max: 1.0G available: 1013.1M)
CPU: 236ms
CGroup: /system.slice/ids-syslog-parser.service
└─14471 /usr/bin/python3 syslog_parser.py
╔═══════════════════════════════════════════════╗
║ ✅ SYSTEMD SERVICES CONFIGURATI ║
╚═══════════════════════════════════════════════╝
📚 COMANDI UTILI:
systemctl status ids-ml-backend - Status ML Backend
systemctl status ids-syslog-parser - Status Syslog Parser
systemctl restart ids-ml-backend - Restart ML Backend
systemctl restart ids-syslog-parser - Restart Syslog Parser
journalctl -u ids-ml-backend -f - Log ML Backend
journalctl -u ids-syslog-parser -f - Log Syslog Parser
[root@ids ids]# systemctl status ids-ml-backend
● ids-ml-backend.service - IDS ML Backend (FastAPI)
Loaded: loaded (/etc/systemd/system/ids-ml-backend.service; enabled; preset: disabled)
Active: activating (auto-restart) (Result: exit-code) since Sat 2025-11-22 10:35:57 CET; 8s ago
Process: 14445 ExecStart=/usr/bin/python3 main.py (code=exited, status=1/FAILURE)
Main PID: 14445 (code=exited, status=1/FAILURE)
CPU: 21ms
[root@ids ids]# systemctl status ids-syslog-parser
● ids-syslog-parser.service - IDS Syslog Parser (Network Logs Processor)
Loaded: loaded (/etc/systemd/system/ids-syslog-parser.service; enabled; preset: disabled)
Active: active (running) since Sat 2025-11-22 10:35:57 CET; 20s ago
Main PID: 14471 (python3)
Tasks: 1 (limit: 100409)
Memory: 11.0M (max: 1.0G available: 1012.9M)
CPU: 1.699s
CGroup: /system.slice/ids-syslog-parser.service
└─14471 /usr/bin/python3 syslog_parser.py
Nov 22 10:35:57 ids.alfacom.it systemd[1]: Started IDS Syslog Parser (Network Logs Processor).
[root@ids ids]#


@@ -0,0 +1,179 @@
Found existing installation: joblib 1.5.2
Uninstalling joblib-1.5.2:
Successfully uninstalled joblib-1.5.2
Successfully installed joblib-1.3.2
✅ Dipendenze Python installate
Impostazione permessi...
Verifica installazione:
✅ FastAPI: 0.104.1
✅ Uvicorn: 0.24.0
✅ Scikit-learn: 1.3.2
✅ Pandas: 2.1.3
✅ HTTPX: 0.25.1
✅ Joblib: 1.3.2
╔═══════════════════════════════════════════════╗
║ ✅ DIPENDENZE PYTHON INSTALLATE ║
╚═══════════════════════════════════════════════╝
NOTA:
Il virtual environment è in: /opt/ids/python_ml/venv
I systemd services useranno automaticamente questo venv
[root@ids ids]# sudo systemctl restart ids-ml-backend
[root@ids ids]# sudo systemctl status ids-ml-backend
● ids-ml-backend.service - IDS ML Backend (FastAPI)
Loaded: loaded (/etc/systemd/system/ids-ml-backend.service; enabled; preset: disabled)
Active: activating (auto-restart) (Result: exit-code) since Sat 2025-11-22 11:00:28 CET; 5s ago
Process: 16204 ExecStart=/opt/ids/python_ml/venv/bin/python3 main.py (code=exited, status=1/FAILURE)
Main PID: 16204 (code=exited, status=1/FAILURE)
CPU: 3.933s
[root@ids ids]# sudo systemctl status ids-ml-backend
● ids-ml-backend.service - IDS ML Backend (FastAPI)
Loaded: loaded (/etc/systemd/system/ids-ml-backend.service; enabled; preset: disabled)
Active: activating (auto-restart) (Result: exit-code) since Sat 2025-11-22 11:00:28 CET; 7s ago
Process: 16204 ExecStart=/opt/ids/python_ml/venv/bin/python3 main.py (code=exited, status=1/FAILURE)
Main PID: 16204 (code=exited, status=1/FAILURE)
CPU: 3.933s
[root@ids ids]# tail -30 /var/log/ids/ml_backend.log
from fastapi import FastAPI, HTTPException, BackgroundTasks, Security, Header
ModuleNotFoundError: No module named 'fastapi'
Traceback (most recent call last):
File "/opt/ids/python_ml/main.py", line 12, in <module>
import pandas as pd
ModuleNotFoundError: No module named 'pandas'
Traceback (most recent call last):
File "/opt/ids/python_ml/main.py", line 20, in <module>
from ml_analyzer import MLAnalyzer
File "/opt/ids/python_ml/ml_analyzer.py", line 8, in <module>
from sklearn.ensemble import IsolationForest
ModuleNotFoundError: No module named 'sklearn'
INFO: Started server process [16144]
INFO: Waiting for application startup.
INFO: Application startup complete.
ERROR: [Errno 98] error while attempting to bind on address ('0.0.0.0', 8000): address already in use
INFO: Waiting for application shutdown.
INFO: Application shutdown complete.
[LOAD] Modello caricato da models
🚀 Starting IDS API on http://0.0.0.0:8000
📚 Docs available at http://0.0.0.0:8000/docs
INFO: Started server process [16204]
INFO: Waiting for application startup.
INFO: Application startup complete.
ERROR: [Errno 98] error while attempting to bind on address ('0.0.0.0', 8000): address already in use
INFO: Waiting for application shutdown.
INFO: Application shutdown complete.
[LOAD] Modello caricato da models
🚀 Starting IDS API on http://0.0.0.0:8000
📚 Docs available at http://0.0.0.0:8000/docs
[root@ids ids]# sudo systemctl status ids-ml-backend
● ids-ml-backend.service - IDS ML Backend (FastAPI)
Loaded: loaded (/etc/systemd/system/ids-ml-backend.service; enabled; preset: disabled)
Active: active (running) since Sat 2025-11-22 11:01:03 CET; 1s ago
Main PID: 16291 (python3)
Tasks: 15 (limit: 100409)
Memory: 100.2M (max: 2.0G available: 1.9G)
CPU: 3.101s
CGroup: /system.slice/ids-ml-backend.service
└─16291 /opt/ids/python_ml/venv/bin/python3 main.py
Nov 22 11:01:03 ids.alfacom.it systemd[1]: Started IDS ML Backend (FastAPI).
[root@ids ids]# sudo systemctl status ids-ml-backend
● ids-ml-backend.service - IDS ML Backend (FastAPI)
Loaded: loaded (/etc/systemd/system/ids-ml-backend.service; enabled; preset: disabled)
Active: activating (auto-restart) (Result: exit-code) since Sat 2025-11-22 11:01:05 CET; 9s ago
Process: 16291 ExecStart=/opt/ids/python_ml/venv/bin/python3 main.py (code=exited, status=1/FAILURE)
Main PID: 16291 (code=exited, status=1/FAILURE)
CPU: 3.804s
[root@ids ids]# sudo systemctl status ids-ml-backend
● ids-ml-backend.service - IDS ML Backend (FastAPI)
Loaded: loaded (/etc/systemd/system/ids-ml-backend.service; enabled; preset: disabled)
Active: activating (auto-restart) (Result: exit-code) since Sat 2025-11-22 11:01:17 CET; 251ms ago
Process: 16321 ExecStart=/opt/ids/python_ml/venv/bin/python3 main.py (code=exited, status=1/FAILURE)
Main PID: 16321 (code=exited, status=1/FAILURE)
CPU: 3.840s
[root@ids ids]# tail -30 /var/log/ids/ml_backend.log
[LOAD] Modello caricato da models
🚀 Starting IDS API on http://0.0.0.0:8000
📚 Docs available at http://0.0.0.0:8000/docs
INFO: Started server process [16257]
INFO: Waiting for application startup.
INFO: Application startup complete.
ERROR: [Errno 98] error while attempting to bind on address ('0.0.0.0', 8000): address already in use
INFO: Waiting for application shutdown.
INFO: Application shutdown complete.
[LOAD] Modello caricato da models
🚀 Starting IDS API on http://0.0.0.0:8000
📚 Docs available at http://0.0.0.0:8000/docs
INFO: Started server process [16291]
INFO: Waiting for application startup.
INFO: Application startup complete.
ERROR: [Errno 98] error while attempting to bind on address ('0.0.0.0', 8000): address already in use
INFO: Waiting for application shutdown.
INFO: Application shutdown complete.
[LOAD] Modello caricato da models
🚀 Starting IDS API on http://0.0.0.0:8000
📚 Docs available at http://0.0.0.0:8000/docs
INFO: Started server process [16321]
INFO: Waiting for application startup.
INFO: Application startup complete.
ERROR: [Errno 98] error while attempting to bind on address ('0.0.0.0', 8000): address already in use
INFO: Waiting for application shutdown.
INFO: Application shutdown complete.
[LOAD] Modello caricato da models
🚀 Starting IDS API on http://0.0.0.0:8000
📚 Docs available at http://0.0.0.0:8000/docs
[root@ids ids]# sudo systemctl status ids-ml-backend
● ids-ml-backend.service - IDS ML Backend (FastAPI)
Loaded: loaded (/etc/systemd/system/ids-ml-backend.service; enabled; preset: disabled)
Active: active (running) since Sat 2025-11-22 11:01:27 CET; 2s ago
Main PID: 16348 (python3)
Tasks: 19 (limit: 100409)
Memory: 118.4M (max: 2.0G available: 1.8G)
CPU: 3.872s
CGroup: /system.slice/ids-ml-backend.service
└─16348 /opt/ids/python_ml/venv/bin/python3 main.py
Nov 22 11:01:27 ids.alfacom.it systemd[1]: Started IDS ML Backend (FastAPI).
[root@ids ids]# sudo systemctl status ids-ml-backend
● ids-ml-backend.service - IDS ML Backend (FastAPI)
Loaded: loaded (/etc/systemd/system/ids-ml-backend.service; enabled; preset: disabled)
Active: activating (auto-restart) (Result: exit-code) since Sat 2025-11-22 11:01:30 CET; 4s ago
Process: 16348 ExecStart=/opt/ids/python_ml/venv/bin/python3 main.py (code=exited, status=1/FAILURE)
Main PID: 16348 (code=exited, status=1/FAILURE)
CPU: 3.911s
Nov 22 11:01:30 ids.alfacom.it systemd[1]: ids-ml-backend.service: Failed with result 'exit-code'.
Nov 22 11:01:30 ids.alfacom.it systemd[1]: ids-ml-backend.service: Consumed 3.911s CPU time.
[root@ids ids]# tail -30 /var/log/ids/ml_backend.log
[LOAD] Model loaded from models
🚀 Starting IDS API on http://0.0.0.0:8000
📚 Docs available at http://0.0.0.0:8000/docs
INFO: Started server process [16291]
INFO: Waiting for application startup.
INFO: Application startup complete.
ERROR: [Errno 98] error while attempting to bind on address ('0.0.0.0', 8000): address already in use
INFO: Waiting for application shutdown.
INFO: Application shutdown complete.
[LOAD] Model loaded from models
🚀 Starting IDS API on http://0.0.0.0:8000
📚 Docs available at http://0.0.0.0:8000/docs
INFO: Started server process [16321]
INFO: Waiting for application startup.
INFO: Application startup complete.
ERROR: [Errno 98] error while attempting to bind on address ('0.0.0.0', 8000): address already in use
INFO: Waiting for application shutdown.
INFO: Application shutdown complete.
[LOAD] Model loaded from models
🚀 Starting IDS API on http://0.0.0.0:8000
📚 Docs available at http://0.0.0.0:8000/docs
INFO: Started server process [16348]
INFO: Waiting for application startup.
INFO: Application startup complete.
ERROR: [Errno 98] error while attempting to bind on address ('0.0.0.0', 8000): address already in use
INFO: Waiting for application shutdown.
INFO: Application shutdown complete.
[LOAD] Model loaded from models
🚀 Starting IDS API on http://0.0.0.0:8000
📚 Docs available at http://0.0.0.0:8000/docs
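The cycle above — systemd starts the unit, the app logs a clean startup, then uvicorn exits on [Errno 98] — means another copy of main.py is still holding port 8000 outside systemd's control. A minimal diagnostic sketch (standard ss/kill; the PID placeholder is illustrative):
# Find the process currently bound to port 8000
ss -ltnp 'sport = :8000'
# Stop the stray copy (PID from the output above), then let systemd own the port
kill <PID>
sudo systemctl restart ids-ml-backend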


@ -0,0 +1,56 @@
[Fri Nov 21 16:46:54 CET 2025] Frontend Node NOT active, restarting...
[Fri Nov 21 16:46:56 CET 2025] Frontend restarted with PID: 10083
> rest-express@1.0.0 dev
> NODE_ENV=development tsx server/index.ts
4:46:58 PM [express] serving on port 5000
A PostCSS plugin did not pass the `from` option to `postcss.parse`. This may cause imported assets to be incorrectly transformed. If you've recently added a PostCSS plugin that raised this warning, please contact the package author to fix the issue.
4:49:03 PM [express] GET /api/stats 500 in 39ms :: {"error":"Failed to fetch stats"}
4:49:03 PM [express] GET /api/detections 500 in 13ms :: {"error":"Failed to fetch detections"}
4:49:03 PM [express] GET /api/routers 500 in 11ms :: {"error":"Failed to fetch routers"}
4:49:06 PM [express] GET /api/stats 500 in 11ms :: {"error":"Failed to fetch stats"}
4:49:06 PM [express] GET /api/detections 500 in 8ms :: {"error":"Failed to fetch detections"}
4:49:06 PM [express] GET /api/routers 500 in 6ms :: {"error":"Failed to fetch routers"}
4:49:08 PM [express] GET /api/detections 500 in 4ms :: {"error":"Failed to fetch detections"}
4:49:10 PM [express] GET /api/training-history 500 in 42ms :: {"error":"Failed to fetch training his…
4:49:10 PM [express] GET /api/ml/stats 200 in 70ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:49:15 PM [express] POST /api/ml/train 200 in 10ms :: {"message":"Training started in background","…
4:49:15 PM [express] GET /api/training-history 500 in 6ms :: {"error":"Failed to fetch training hist…
4:49:15 PM [express] GET /api/ml/stats 200 in 14ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:49:21 PM [express] GET /api/training-history 500 in 7ms :: {"error":"Failed to fetch training hist…
4:49:21 PM [express] GET /api/ml/stats 200 in 20ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:49:27 PM [express] GET /api/training-history 500 in 6ms :: {"error":"Failed to fetch training hist…
4:49:27 PM [express] GET /api/ml/stats 200 in 20ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:49:31 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
4:49:31 PM [express] GET /api/ml/stats 200 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:49:33 PM [express] GET /api/routers 500 in 5ms :: {"error":"Failed to fetch routers"}
4:49:36 PM [express] GET /api/whitelist 500 in 4ms :: {"error":"Failed to fetch whitelist"}
4:49:38 PM [express] GET /api/routers 500 in 4ms :: {"error":"Failed to fetch routers"}
4:49:39 PM [express] GET /api/training-history 500 in 7ms :: {"error":"Failed to fetch training hist…
4:49:39 PM [express] GET /api/ml/stats 200 in 20ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:49:41 PM [express] GET /api/detections 500 in 4ms :: {"error":"Failed to fetch detections"}
4:49:42 PM [express] GET /api/stats 500 in 4ms :: {"error":"Failed to fetch stats"}
4:49:43 PM [express] GET /api/detections 500 in 6ms :: {"error":"Failed to fetch detections"}
4:49:43 PM [express] GET /api/routers 500 in 4ms :: {"error":"Failed to fetch routers"}
4:49:44 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
4:49:44 PM [express] GET /api/ml/stats 200 in 19ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:49:54 PM [express] GET /api/training-history 500 in 8ms :: {"error":"Failed to fetch training hist…
4:49:54 PM [express] GET /api/ml/stats 304 in 20ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:49:57 PM [express] GET /api/detections 500 in 4ms :: {"error":"Failed to fetch detections"}
4:49:59 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
4:49:59 PM [express] GET /api/ml/stats 304 in 19ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:50:00 PM [express] GET /api/routers 500 in 3ms :: {"error":"Failed to fetch routers"}
4:50:01 PM [express] GET /api/whitelist 500 in 4ms :: {"error":"Failed to fetch whitelist"}
4:50:02 PM [express] GET /api/training-history 500 in 6ms :: {"error":"Failed to fetch training hist…
4:50:02 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:50:12 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
4:50:12 PM [express] GET /api/ml/stats 304 in 18ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:50:22 PM [express] GET /api/training-history 500 in 4ms :: {"error":"Failed to fetch training hist…
4:50:22 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:50:32 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
4:50:33 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…


@ -0,0 +1,60 @@
[INFO] Processed 16200 lines, saved 0 logs
[INFO] Processed 16300 lines, saved 0 logs
[INFO] Processed 16400 lines, saved 0 logs
[INFO] Processed 16500 lines, saved 0 logs
[INFO] Processed 16600 lines, saved 0 logs
[INFO] Processed 16700 lines, saved 0 logs
[INFO] Processed 16800 lines, saved 0 logs
[INFO] Processed 16900 lines, saved 0 logs
[INFO] Processed 17000 lines, saved 0 logs
[INFO] Processed 17100 lines, saved 0 logs
[INFO] Processed 17200 lines, saved 0 logs
[INFO] Processed 17300 lines, saved 0 logs
[INFO] Processed 17400 lines, saved 0 logs
[INFO] Processed 17500 lines, saved 0 logs
[INFO] Processed 17600 lines, saved 0 logs
[INFO] Processed 17700 lines, saved 0 logs
[INFO] Processed 17800 lines, saved 0 logs
[INFO] Processed 17900 lines, saved 0 logs
[INFO] Processed 18000 lines, saved 0 logs
[INFO] Processed 18100 lines, saved 0 logs[root@ids python_ml]# tail -20 /var/log/ids/syslog_parser.log
[INFO] Processed 71100 lines, saved 0 logs
[INFO] Processed 71200 lines, saved 0 logs
[INFO] Processed 71300 lines, saved 0 logs
[INFO] Processed 71400 lines, saved 0 logs
[INFO] Processed 71500 lines, saved 0 logs
[INFO] Processed 71600 lines, saved 0 logs
[INFO] Processed 71700 lines, saved 0 logs
[INFO] Processed 71800 lines, saved 0 logs
[INFO] Processed 71900 lines, saved 0 logs
[INFO] Processed 72000 lines, saved 0 logs
[INFO] Processed 72100 lines, saved 0 logs
[INFO] Processed 72200 lines, saved 0 logs
[INFO] Processed 72300 lines, saved 0 logs
[INFO] Processed 72400 lines, saved 0 logs
[INFO] Processed 72500 lines, saved 0 logs
[INFO] Processed 72600 lines, saved 0 logs
[INFO] Processed 72700 lines, saved 0 logs
[INFO] Processed 72800 lines, saved 0 logs
[INFO] Processed 72900 lines, saved 0 logs
[INFO] Processed 73000 lines, saved 0 logs[root@ids python_ml]# tail -20 /var/log/ids/syslog_parser.log
[INFO] Processed 107400 lines, saved 0 logs
[INFO] Processed 107500 lines, saved 0 logs
[INFO] Processed 107600 lines, saved 0 logs
[INFO] Processed 107700 lines, saved 0 logs
[INFO] Processed 107800 lines, saved 0 logs
[INFO] Processed 107900 lines, saved 0 logs
[INFO] Processed 108000 lines, saved 0 logs
[INFO] Processed 108100 lines, saved 0 logs
[INFO] Processed 108200 lines, saved 0 logs
[INFO] Processed 108300 lines, saved 0 logs
[INFO] Processed 108400 lines, saved 0 logs
[INFO] Processed 108500 lines, saved 0 logs
[INFO] Processed 108600 lines, saved 0 logs
[INFO] Processed 108700 lines, saved 0 logs
[INFO] Processed 108800 lines, saved 0 logs
[INFO] Processed 108900 lines, saved 0 logs
[INFO] Processed 109000 lines, saved 0 logs
[INFO] Processed 109100 lines, saved 0 logs
[INFO] Processed 109200 lines, saved 0 logs
[INFO] Processed 109300 lines, saved 0 logs


@ -0,0 +1,124 @@
 Service Status:
● ids-ml-backend.service - IDS ML Backend (FastAPI)
Loaded: loaded (/etc/systemd/system/ids-ml-backend.service; enabled; preset: disabled)
Active: activating (auto-restart) (Result: exit-code) since Sat 2025-11-22 10:55:17 CET; 348ms ago
Process: 15380 ExecStart=/opt/ids/python_ml/venv/bin/python3 main.py (code=exited, status=1/FAILURE)
Main PID: 15380 (code=exited, status=1/FAILURE)
CPU: 3.435s
● ids-syslog-parser.service - IDS Syslog Parser (Network Logs Processor)
Loaded: loaded (/etc/systemd/system/ids-syslog-parser.service; enabled; preset: disabled)
Active: active (running) since Sat 2025-11-22 10:55:15 CET; 2s ago
Main PID: 15405 (python3)
Tasks: 1 (limit: 100409)
Memory: 10.7M (max: 1.0G available: 1013.2M)
CPU: 324ms
CGroup: /system.slice/ids-syslog-parser.service
└─15405 /opt/ids/python_ml/venv/bin/python3 syslog_parser.py
╔═══════════════════════════════════════════════╗
║ ✅ SYSTEMD SERVICES CONFIGURED ║
╚═══════════════════════════════════════════════╝
 USEFUL COMMANDS:
systemctl status ids-ml-backend - Status ML Backend
systemctl status ids-syslog-parser - Status Syslog Parser
systemctl restart ids-ml-backend - Restart ML Backend
systemctl restart ids-syslog-parser - Restart Syslog Parser
journalctl -u ids-ml-backend -f - Log ML Backend
journalctl -u ids-syslog-parser -f - Log Syslog Parser
[root@ids ids]# # Check service status
systemctl status ids-ml-backend
systemctl status ids-syslog-parser
# Both should show "Active: active (running)"
● ids-ml-backend.service - IDS ML Backend (FastAPI)
Loaded: loaded (/etc/systemd/system/ids-ml-backend.service; enabled; preset: disabled)
Active: activating (auto-restart) (Result: exit-code) since Sat 2025-11-22 10:55:17 CET; 4s ago
Process: 15380 ExecStart=/opt/ids/python_ml/venv/bin/python3 main.py (code=exited, status=1/FAILURE)
Main PID: 15380 (code=exited, status=1/FAILURE)
CPU: 3.435s
● ids-syslog-parser.service - IDS Syslog Parser (Network Logs Processor)
Loaded: loaded (/etc/systemd/system/ids-syslog-parser.service; enabled; preset: disabled)
Active: active (running) since Sat 2025-11-22 10:55:15 CET; 5s ago
Main PID: 15405 (python3)
Tasks: 1 (limit: 100409)
Memory: 10.7M (max: 1.0G available: 1013.2M)
CPU: 627ms
CGroup: /system.slice/ids-syslog-parser.service
└─15405 /opt/ids/python_ml/venv/bin/python3 syslog_parser.py
Nov 22 10:55:15 ids.alfacom.it systemd[1]: Started IDS Syslog Parser (Network Logs Processor).
[root@ids ids]# systemctl status ids-syslog-parser
● ids-syslog-parser.service - IDS Syslog Parser (Network Logs Processor)
Loaded: loaded (/etc/systemd/system/ids-syslog-parser.service; enabled; preset: disabled)
Active: active (running) since Sat 2025-11-22 10:55:15 CET; 14s ago
Main PID: 15405 (python3)
Tasks: 1 (limit: 100409)
Memory: 10.8M (max: 1.0G available: 1013.1M)
CPU: 1.268s
CGroup: /system.slice/ids-syslog-parser.service
└─15405 /opt/ids/python_ml/venv/bin/python3 syslog_parser.py
Nov 22 10:55:15 ids.alfacom.it systemd[1]: Started IDS Syslog Parser (Network Logs Processor).
[root@ids ids]# systemctl status ids-ml-backend
● ids-ml-backend.service - IDS ML Backend (FastAPI)
Loaded: loaded (/etc/systemd/system/ids-ml-backend.service; enabled; preset: disabled)
Active: activating (auto-restart) (Result: exit-code) since Sat 2025-11-22 10:55:29 CET; 7s ago
Process: 15441 ExecStart=/opt/ids/python_ml/venv/bin/python3 main.py (code=exited, status=1/FAILURE)
Main PID: 15441 (code=exited, status=1/FAILURE)
CPU: 3.642s
[root@ids ids]# systemctl restart ids-ml-backend
[root@ids ids]# systemctl status ids-ml-backend
● ids-ml-backend.service - IDS ML Backend (FastAPI)
Loaded: loaded (/etc/systemd/system/ids-ml-backend.service; enabled; preset: disabled)
Active: active (running) since Sat 2025-11-22 10:55:48 CET; 1s ago
Main PID: 15482 (python3)
Tasks: 15 (limit: 100409)
Memory: 110.1M (max: 2.0G available: 1.8G)
CPU: 3.357s
CGroup: /system.slice/ids-ml-backend.service
└─15482 /opt/ids/python_ml/venv/bin/python3 main.py
Nov 22 10:55:48 ids.alfacom.it systemd[1]: Started IDS ML Backend (FastAPI).
[root@ids ids]# systemctl status ids-ml-backend
● ids-ml-backend.service - IDS ML Backend (FastAPI)
Loaded: loaded (/etc/systemd/system/ids-ml-backend.service; enabled; preset: disabled)
Active: activating (auto-restart) (Result: exit-code) since Sat 2025-11-22 10:55:50 CET; 3s ago
Process: 15482 ExecStart=/opt/ids/python_ml/venv/bin/python3 main.py (code=exited, status=1/FAILURE)
Main PID: 15482 (code=exited, status=1/FAILURE)
CPU: 3.607s
[root@ids ids]# tail -30 /var/log/ids/ml_backend.log
Traceback (most recent call last):
File "/opt/ids/python_ml/main.py", line 21, in <module>
from mikrotik_manager import MikroTikManager
File "/opt/ids/python_ml/mikrotik_manager.py", line 6, in <module>
import httpx
ModuleNotFoundError: No module named 'httpx'
Traceback (most recent call last):
File "/opt/ids/python_ml/main.py", line 21, in <module>
from mikrotik_manager import MikroTikManager
File "/opt/ids/python_ml/mikrotik_manager.py", line 6, in <module>
import httpx
ModuleNotFoundError: No module named 'httpx'
Traceback (most recent call last):
File "/opt/ids/python_ml/main.py", line 21, in <module>
from mikrotik_manager import MikroTikManager
File "/opt/ids/python_ml/mikrotik_manager.py", line 6, in <module>
import httpx
ModuleNotFoundError: No module named 'httpx'
Traceback (most recent call last):
File "/opt/ids/python_ml/main.py", line 21, in <module>
from mikrotik_manager import MikroTikManager
File "/opt/ids/python_ml/mikrotik_manager.py", line 6, in <module>
import httpx
ModuleNotFoundError: No module named 'httpx'
Traceback (most recent call last):
File "/opt/ids/python_ml/main.py", line 21, in <module>
from mikrotik_manager import MikroTikManager
File "/opt/ids/python_ml/mikrotik_manager.py", line 6, in <module>
import httpx
ModuleNotFoundError: No module named 'httpx'
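This traceback is the actual cause of the crash loop in the systemctl output above: the service's virtualenv lacks httpx, so main.py dies on import a couple of seconds after every start. A minimal fix sketch, assuming the venv path shown in the unit's ExecStart (requirements.txt elsewhere in these logs pins httpx==0.25.1):
# Install the missing dependency into the same venv the service runs from
/opt/ids/python_ml/venv/bin/pip install httpx==0.25.1
sudo systemctl restart ids-ml-backend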


@ -0,0 +1,70 @@
 SYSTEM CHECK:
• Backend log: tail -f /var/log/ids/backend.log
• Frontend log: tail -f /var/log/ids/frontend.log
• Backend API: curl http://localhost:8000/health
• Frontend: curl http://localhost:5000
 SERVICE STATUS:
ids 5038 0.2 2.0 1894024 331912 ? Sl 09:20 1:10 /usr/bin/python3.11 main.py
root 10798 0.0 0.1 730192 31688 pts/2 Rl+ 17:01 0:00 /usr/bin/node /usr/bin/npm run dev
[root@ids ids]# sudo -u ids /opt/ids/deployment/restart_all.sh
pkill: killing pid 10842 failed: Operation not permitted
pkill: killing pid 10798 failed: Operation not permitted
[root@ids ids]# curl http://localhost:5000/api/stats
curl http://localhost:5000/api/routers
curl http://localhost:5000/api/detections
{"error":"Failed to fetch stats"}{"error":"Failed to fetch routers"}{"error":"Failed to fetch detections"}[root@ids ids]#
[root@ids ids]# tail -50 /var/log/ids/frontend.log
> NODE_ENV=development tsx server/index.ts
5:00:00 PM [express] serving on port 5000
A PostCSS plugin did not pass the `from` option to `postcss.parse`. This may cause imported assets to be incorrectly transformed. If you've recently added a PostCSS plugin that raised this warning, please contact the package author to fix the issue.
5:00:03 PM [express] GET /api/training-history 500 in 70ms :: {"error":"Failed to fetch training his…
5:00:03 PM [express] GET /api/ml/stats 304 in 70ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:00:13 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
5:00:13 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:00:18 PM [express] GET /api/stats 500 in 6ms :: {"error":"Failed to fetch stats"}
5:00:23 PM [express] GET /api/training-history 500 in 6ms :: {"error":"Failed to fetch training hist…
5:00:23 PM [express] GET /api/ml/stats 304 in 18ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:00:26 PM [express] GET /api/routers 500 in 5ms :: {"error":"Failed to fetch routers"}
5:00:33 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
5:00:33 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:00:35 PM [express] GET /api/detections 500 in 5ms :: {"error":"Failed to fetch detections"}
5:00:43 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
5:00:43 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:00:53 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
5:00:53 PM [express] GET /api/ml/stats 304 in 16ms :: {"logs":{"total":0,"last_hour":0},"detections"…
[Fri Nov 21 17:01:08 CET 2025] Frontend Node NOT active, restarting...
[Fri Nov 21 17:01:10 CET 2025] Frontend restarted with PID: 10798
> rest-express@1.0.0 dev
> NODE_ENV=development tsx server/index.ts
5:01:12 PM [express] serving on port 5000
A PostCSS plugin did not pass the `from` option to `postcss.parse`. This may cause imported assets to be incorrectly transformed. If you've recently added a PostCSS plugin that raised this warning, please contact the package author to fix the issue.
5:01:15 PM [express] GET /api/training-history 500 in 64ms :: {"error":"Failed to fetch training his…
5:01:15 PM [express] GET /api/ml/stats 304 in 65ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:01:19 PM [express] GET /api/stats 500 in 10ms :: {"error":"Failed to fetch stats"}
5:01:19 PM [express] GET /api/detections 500 in 7ms :: {"error":"Failed to fetch detections"}
5:01:19 PM [express] GET /api/routers 500 in 6ms :: {"error":"Failed to fetch routers"}
5:01:21 PM [express] GET /api/detections 500 in 4ms :: {"error":"Failed to fetch detections"}
5:01:22 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
5:01:23 PM [express] GET /api/ml/stats 304 in 18ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:01:33 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
5:01:33 PM [express] GET /api/ml/stats 304 in 19ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:01:43 PM [express] GET /api/training-history 500 in 7ms :: {"error":"Failed to fetch training hist…
5:01:43 PM [express] GET /api/ml/stats 304 in 18ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:01:53 PM [express] GET /api/training-history 500 in 6ms :: {"error":"Failed to fetch training hist…
5:01:53 PM [express] GET /api/ml/stats 304 in 20ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:02:03 PM [express] GET /api/training-history 500 in 6ms :: {"error":"Failed to fetch training hist…
5:02:03 PM [express] GET /api/ml/stats 304 in 16ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:02:08 PM [express] GET /api/stats 500 in 5ms :: {"error":"Failed to fetch stats"}
5:02:08 PM [express] GET /api/routers 500 in 4ms :: {"error":"Failed to fetch routers"}
5:02:08 PM [express] GET /api/detections 500 in 3ms :: {"error":"Failed to fetch detections"}
5:02:13 PM [express] GET /api/training-history 500 in 4ms :: {"error":"Failed to fetch training hist…
5:02:13 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
[root@ids ids]#


@ -0,0 +1,81 @@
crontab -u ids -l
# ============================================
# IDS SYSTEM - AUTOMATIC CONFIGURATION
# ============================================
# ML training every 12 hours (at 00:00 and 12:00)
0 */12 * * * cd /opt/ids/python_ml && /usr/bin/python3.11 -c "import requests; requests.post('http://localhost:8000/train', json={'max_records': 10000, 'hours_back': 24})" >> /var/log/ids/training.log 2>&1
# Automatic detection every 5 minutes
*/5 * * * * cd /opt/ids/python_ml && /usr/bin/python3.11 -c "import requests; requests.post('http://localhost:8000/detect', json={'max_records': 5000, 'auto_block': True, 'risk_threshold': 75})" >> /var/log/ids/detect.log 2>&1
# Check the Python backend process every 5 minutes (restart if not active)
*/5 * * * * /opt/ids/deployment/check_backend.sh >> /var/log/ids/cron.log 2>&1
# Check the frontend process every 5 minutes (restart if not active)
*/5 * * * * /opt/ids/deployment/check_frontend.sh >> /var/log/ids/cron.log 2>&1
# Weekly log cleanup (every Sunday at 02:00)
0 2 * * 0 find /var/log/ids -name "*.log" -size +100M -exec truncate -s 50M {} \; >> /var/log/ids/cron.log 2>&1
# Full system restart every week (Sunday at 03:00)
0 3 * * 0 /opt/ids/deployment/restart_all.sh >> /var/log/ids/cron.log 2>&1
# Daily database backup (at 04:00)
0 4 * * * /opt/ids/deployment/backup_db.sh >> /var/log/ids/cron.log 2>&1
[root@ids ids]# curl -X POST http://localhost:8000/train \
-H "Content-Type: application/json" \
-d '{"max_records": 100000, "hours_back": 24}'
{"message":"Training avviato in background","max_records":100000,"hours_back":24}[root@ids ids]#
psql -h 127.0.0.1 -U ids_user -d ids_database -c "
SELECT model_version, records_processed, status, trained_at
FROM training_history
ORDER BY trained_at DESC
LIMIT 10;
"
Password for user ids_user:
model_version | records_processed | status | trained_at
---------------+-------------------+--------+------------
(0 rows)
[root@ids var]# psql -h 127.0.0.1 -U ids_user -d ids_database -c "
SELECT source_ip, risk_score, detected_at
FROM detections
ORDER BY detected_at DESC
LIMIT 20;
"
Password for user ids_user:
source_ip | risk_score | detected_at
----------------+------------+----------------------------
64.34.90.127 | 77.42 | 2025-11-18 08:25:29.648227
10.0.249.226 | 78.49 | 2025-11-18 08:25:29.648227
10.0.249.26 | 78.65 | 2025-11-18 08:25:29.648227
72.46.85.161 | 78.76 | 2025-11-18 08:25:29.648227
160.202.129.17 | 78.98 | 2025-11-18 08:25:29.648227
213.175.208.76 | 79.34 | 2025-11-18 08:25:29.648227
67.213.119.137 | 79.84 | 2025-11-18 08:25:29.648227
185.203.25.157 | 82.51 | 2025-11-18 08:25:29.648227
10.1.0.254 | 84.36 | 2025-11-18 08:25:29.648227
185.203.25.254 | 85.57 | 2025-11-18 08:25:29.648227
185.203.25.69 | 85.87 | 2025-11-18 08:25:29.648227
10.0.254.124 | 87.74 | 2025-11-18 08:25:29.648227
37.59.16.12 | 90.28 | 2025-11-18 08:25:29.648227
79.124.56.186 | 91.24 | 2025-11-18 08:25:29.648227
37.59.16.20 | 91.24 | 2025-11-18 08:25:29.648227
10.0.249.160 | 91.35 | 2025-11-18 08:25:29.648227
54.36.50.241 | 92.52 | 2025-11-18 08:25:29.648227
185.203.26.17 | 92.52 | 2025-11-18 08:25:29.648227
185.203.26.34 | 92.63 | 2025-11-18 08:25:29.648227
185.203.24.23 | 95.45 | 2025-11-18 08:25:29.648227
(20 rows)
psql -h 127.0.0.1 -U ids_user -d ids_database -c "
SELECT model_version, records_processed, status, trained_at
FROM training_history
ORDER BY trained_at DESC
LIMIT 10;
"
Password for user ids_user:
model_version | records_processed | status | trained_at
---------------+-------------------+---------+----------------------------
1.0.0 | 100000 | success | 2025-11-18 08:37:28.627906
(1 row)


@ -0,0 +1,60 @@
./deployment/install_ml_deps.sh
╔═══════════════════════════════════════════════╗
║ INSTALLING HYBRID ML DEPENDENCIES ║
╚═══════════════════════════════════════════════╝
 Current directory: /opt/ids/python_ml
 Activating virtual environment...
 Python in use: /opt/ids/python_ml/venv/bin/python
📦 Step 1/3: Installing build dependencies (Cython + numpy)...
Collecting Cython==3.0.5
Downloading Cython-3.0.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (3.2 kB)
Downloading Cython-3.0.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.6 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.6/3.6 MB 59.8 MB/s 0:00:00
Installing collected packages: Cython
Successfully installed Cython-3.0.5
✅ Cython installed successfully
📦 Step 2/3: Checking that numpy is available...
✅ numpy 1.26.2 already installed
📦 Step 3/3: Installing ML dependencies (xgboost, joblib, eif)...
Collecting xgboost==2.0.3
Downloading xgboost-2.0.3-py3-none-manylinux2014_x86_64.whl.metadata (2.0 kB)
Requirement already satisfied: joblib==1.3.2 in ./venv/lib64/python3.11/site-packages (1.3.2)
Collecting eif==2.0.2
Downloading eif-2.0.2.tar.gz (1.6 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.6/1.6 MB 6.7 MB/s 0:00:00
Installing build dependencies ... done
Getting requirements to build wheel ... error
error: subprocess-exited-with-error
× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> [20 lines of output]
Traceback (most recent call last):
File "/opt/ids/python_ml/venv/lib64/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 389, in <module>
main()
File "/opt/ids/python_ml/venv/lib64/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 373, in main
json_out["return_val"] = hook(**hook_input["kwargs"])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/ids/python_ml/venv/lib64/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 143, in get_requires_for_build_wheel
return hook(config_settings)
^^^^^^^^^^^^^^^^^^^^^
File "/tmp/pip-build-env-9buits4u/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 331, in get_requires_for_build_wheel
return self._get_build_requires(config_settings, requirements=[])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/tmp/pip-build-env-9buits4u/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 301, in _get_build_requires
self.run_setup()
File "/tmp/pip-build-env-9buits4u/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 512, in run_setup
super().run_setup(setup_script=setup_script)
File "/tmp/pip-build-env-9buits4u/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 317, in run_setup
exec(code, locals())
File "<string>", line 3, in <module>
ModuleNotFoundError: No module named 'numpy'
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed to build 'eif' when getting requirements to build wheel
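The contradiction here — Step 2/3 confirms numpy 1.26.2 is installed, yet the eif build cannot import it — is pip's build isolation: the sdist's setup.py runs in a throwaway environment that does not contain the venv's packages. One plausible workaround sketch is to build against the venv directly:
# Reuse the venv's Cython and numpy instead of an isolated build environment
/opt/ids/python_ml/venv/bin/pip install --no-build-isolation eif==2.0.2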


@ -0,0 +1,40 @@
./deployment/install_ml_deps.sh
╔═══════════════════════════════════════════════╗
║ INSTALLING HYBRID ML DEPENDENCIES ║
╚═══════════════════════════════════════════════╝
📍 Current directory: /opt/ids/python_ml
📦 Step 1/2: Installing Cython (required to compile eif)...
Collecting Cython==3.0.5
Downloading Cython-3.0.5-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.6 MB)
|████████████████████████████████| 3.6 MB 6.2 MB/s
Installing collected packages: Cython
Successfully installed Cython-3.0.5
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
✅ Cython installed successfully
📦 Step 2/2: Installing ML dependencies (xgboost, joblib, eif)...
Collecting xgboost==2.0.3
Downloading xgboost-2.0.3-py3-none-manylinux2014_x86_64.whl (297.1 MB)
|████████████████████████████████| 297.1 MB 13 kB/s
Collecting joblib==1.3.2
Downloading joblib-1.3.2-py3-none-any.whl (302 kB)
|████████████████████████████████| 302 kB 41.7 MB/s
Collecting eif==2.0.2
Downloading eif-2.0.2.tar.gz (1.6 MB)
|████████████████████████████████| 1.6 MB 59.4 MB/s
Preparing metadata (setup.py) ... error
ERROR: Command errored out with exit status 1:
command: /usr/bin/python3 -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-xpd6jc3z/eif_1c539132fe1d4772ada0979407304392/setup.py'"'"'; __file__='"'"'/tmp/pip-install-xpd6jc3z/eif_1c539132fe1d4772ada0979407304392/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(__file__) if os.path.exists(__file__) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base /tmp/pip-pip-egg-info-lg0m0ish
cwd: /tmp/pip-install-xpd6jc3z/eif_1c539132fe1d4772ada0979407304392/
Complete output (5 lines):
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/tmp/pip-install-xpd6jc3z/eif_1c539132fe1d4772ada0979407304392/setup.py", line 3, in <module>
import numpy
ModuleNotFoundError: No module named 'numpy'
----------------------------------------
WARNING: Discarding https://files.pythonhosted.org/packages/83/b2/d87d869deeb192ab599c899b91a9ad1d3775d04f5b7adcaf7ff6daa54c24/eif-2.0.2.tar.gz#sha256=86e2c98caf530ae73d8bc7153c1bf6b9684c905c9dfc7bdab280846ada1e45ab (from https://pypi.org/simple/eif/). Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.
ERROR: Could not find a version that satisfies the requirement eif==2.0.2 (from versions: 1.0.0, 1.0.1, 1.0.2, 2.0.2)
ERROR: No matching distribution found for eif==2.0.2
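This older run (system Python, legacy setup.py path) fails for a simpler reason: numpy is not installed at all where setup.py executes. Installing it before eif should let the egg_info step complete (a sketch, matching the root-pip context of this log):
pip3 install numpy
pip3 install eif==2.0.2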


@ -0,0 +1,254 @@
./deployment/run_analytics.sh hourly
 Running hourly aggregation...
[ANALYTICS] Hourly aggregation: 2025-11-24 09:00
[ANALYTICS] ✅ Aggregation completed:
- Total: 7182065 packets, 27409 unique IPs
- Normal: 6922072 packets (96%)
- Attacks: 259993 packets (3%), 15 IPs
✅ Hourly aggregation completed!
[root@ids ids]# ./deployment/restart_frontend.sh
 Restart Frontend Node.js...
⏸ Stopping existing processes...
 Starting frontend...
❌ Error: frontend did not start!
 Check the log: tail -f /var/log/ids/frontend.log
[root@ids ids]# curl -s http://localhost:5000/api/analytics/recent?days=7&hourly=true | jq '. | length'
[1] 59354
[root@ids ids]# echo "=== IDS ANALYTICS DIAGNOSTICS ===" > /tmp/ids_diagnostic.txt
echo "" >> /tmp/ids_diagnostic.txt
[1]+ Done curl -s http://localhost:5000/api/analytics/recent?days=7
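Note the unquoted URL two commands up: the shell treats the & in ?days=7&hourly=true as a background operator, so curl runs in the background with only days=7 (hence the [1] job notice and the later Done line) while jq gets no input. Quoting the URL keeps the query string intact:
curl -s "http://localhost:5000/api/analytics/recent?days=7&hourly=true" | jq '. | length'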
[root@ids ids]# tail -f /var/log/ids/frontend.log
[Mon Nov 24 10:15:13 CET 2025] Frontend Node NOT active, restarting...
[Mon Nov 24 10:15:15 CET 2025] Frontend restarted with PID: 59307
> rest-express@1.0.0 dev
> NODE_ENV=development tsx server/index.ts
🐘 Using standard PostgreSQL database
10:15:17 AM [express] serving on port 5000
✅ Database connection successful
10:15:34 AM [express] GET /api/analytics/recent 200 in 32ms :: []
[Mon Nov 24 10:20:01 CET 2025] Frontend Node NOT active, restarting...
[Mon Nov 24 10:20:03 CET 2025] Frontend restarted with PID: 59406
> rest-express@1.0.0 dev
> NODE_ENV=development tsx server/index.ts
🐘 Using standard PostgreSQL database
node:events:502
throw er; // Unhandled 'error' event
^
Error: listen EADDRINUSE: address already in use 0.0.0.0:5000
at Server.setupListenHandle [as _listen2] (node:net:1908:16)
at listenInCluster (node:net:1965:12)
at doListen (node:net:2139:7)
at process.processTicksAndRejections (node:internal/process/task_queues:83:21)
Emitted 'error' event on Server instance at:
at emitErrorNT (node:net:1944:8)
at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
code: 'EADDRINUSE',
errno: -98,
syscall: 'listen',
address: '0.0.0.0',
port: 5000
}
Node.js v20.19.5
[Mon Nov 24 10:25:02 CET 2025] Frontend Node NOT active, restarting...
[Mon Nov 24 10:25:04 CET 2025] Frontend restarted with PID: 59511
> rest-express@1.0.0 dev
> NODE_ENV=development tsx server/index.ts
🐘 Using standard PostgreSQL database
node:events:502
throw er; // Unhandled 'error' event
^
Error: listen EADDRINUSE: address already in use 0.0.0.0:5000
at Server.setupListenHandle [as _listen2] (node:net:1908:16)
at listenInCluster (node:net:1965:12)
at doListen (node:net:2139:7)
at process.processTicksAndRejections (node:internal/process/task_queues:83:21)
Emitted 'error' event on Server instance at:
at emitErrorNT (node:net:1944:8)
at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
code: 'EADDRINUSE',
errno: -98,
syscall: 'listen',
address: '0.0.0.0',
port: 5000
}
Node.js v20.19.5
[Mon Nov 24 10:30:01 CET 2025] Frontend Node NOT active, restarting...
[Mon Nov 24 10:30:03 CET 2025] Frontend restarted with PID: 59618
> rest-express@1.0.0 dev
> NODE_ENV=development tsx server/index.ts
🐘 Using standard PostgreSQL database
node:events:502
throw er; // Unhandled 'error' event
^
Error: listen EADDRINUSE: address already in use 0.0.0.0:5000
at Server.setupListenHandle [as _listen2] (node:net:1908:16)
at listenInCluster (node:net:1965:12)
at doListen (node:net:2139:7)
at process.processTicksAndRejections (node:internal/process/task_queues:83:21)
Emitted 'error' event on Server instance at:
at emitErrorNT (node:net:1944:8)
at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
code: 'EADDRINUSE',
errno: -98,
syscall: 'listen',
address: '0.0.0.0',
port: 5000
}
Node.js v20.19.5
[Mon Nov 24 10:35:01 CET 2025] Frontend Node NOT active, restarting...
[Mon Nov 24 10:35:03 CET 2025] Frontend restarted with PID: 59725
> rest-express@1.0.0 dev
> NODE_ENV=development tsx server/index.ts
🐘 Using standard PostgreSQL database
node:events:502
throw er; // Unhandled 'error' event
^
Error: listen EADDRINUSE: address already in use 0.0.0.0:5000
at Server.setupListenHandle [as _listen2] (node:net:1908:16)
at listenInCluster (node:net:1965:12)
at doListen (node:net:2139:7)
at process.processTicksAndRejections (node:internal/process/task_queues:83:21)
Emitted 'error' event on Server instance at:
at emitErrorNT (node:net:1944:8)
at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
code: 'EADDRINUSE',
errno: -98,
syscall: 'listen',
address: '0.0.0.0',
port: 5000
}
Node.js v20.19.5
[Mon Nov 24 10:40:02 CET 2025] Frontend Node NOT active, restarting...
[Mon Nov 24 10:40:04 CET 2025] Frontend restarted with PID: 59831
> rest-express@1.0.0 dev
> NODE_ENV=development tsx server/index.ts
🐘 Using standard PostgreSQL database
node:events:502
throw er; // Unhandled 'error' event
^
Error: listen EADDRINUSE: address already in use 0.0.0.0:5000
at Server.setupListenHandle [as _listen2] (node:net:1908:16)
at listenInCluster (node:net:1965:12)
at doListen (node:net:2139:7)
at process.processTicksAndRejections (node:internal/process/task_queues:83:21)
Emitted 'error' event on Server instance at:
at emitErrorNT (node:net:1944:8)
at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
code: 'EADDRINUSE',
errno: -98,
syscall: 'listen',
address: '0.0.0.0',
port: 5000
}
Node.js v20.19.5
[Mon Nov 24 10:45:02 CET 2025] Frontend Node NOT active, restarting...
[Mon Nov 24 10:45:04 CET 2025] Frontend restarted with PID: 59935
> rest-express@1.0.0 dev
> NODE_ENV=development tsx server/index.ts
🐘 Using standard PostgreSQL database
node:events:502
throw er; // Unhandled 'error' event
^
Error: listen EADDRINUSE: address already in use 0.0.0.0:5000
at Server.setupListenHandle [as _listen2] (node:net:1908:16)
at listenInCluster (node:net:1965:12)
at doListen (node:net:2139:7)
at process.processTicksAndRejections (node:internal/process/task_queues:83:21)
Emitted 'error' event on Server instance at:
at emitErrorNT (node:net:1944:8)
at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
code: 'EADDRINUSE',
errno: -98,
syscall: 'listen',
address: '0.0.0.0',
port: 5000
}
Node.js v20.19.5
[Mon Nov 24 10:50:01 CET 2025] Frontend Node NOT active, restarting...
[Mon Nov 24 10:50:03 CET 2025] Frontend restarted with PID: 60044
> rest-express@1.0.0 dev
> NODE_ENV=development tsx server/index.ts
🐘 Using standard PostgreSQL database
node:events:502
throw er; // Unhandled 'error' event
^
Error: listen EADDRINUSE: address already in use 0.0.0.0:5000
at Server.setupListenHandle [as _listen2] (node:net:1908:16)
at listenInCluster (node:net:1965:12)
at doListen (node:net:2139:7)
at process.processTicksAndRejections (node:internal/process/task_queues:83:21)
Emitted 'error' event on Server instance at:
at emitErrorNT (node:net:1944:8)
at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
code: 'EADDRINUSE',
errno: -98,
syscall: 'listen',
address: '0.0.0.0',
port: 5000
}
Node.js v20.19.5
[Mon Nov 24 10:55:01 CET 2025] Frontend Node NOT active, restarting...
[Mon Nov 24 10:55:03 CET 2025] Frontend restarted with PID: 60151
> rest-express@1.0.0 dev
> NODE_ENV=development tsx server/index.ts
A PostCSS plugin did not pass the `from` option to `postcss.parse`. This may cause imported assets to be incorrectly transformed. If you've recently added a PostCSS plugin that raised this warning, please contact the package author to fix the issue.
🐘 Using standard PostgreSQL database
node:events:502
throw er; // Unhandled 'error' event
^
Error: listen EADDRINUSE: address already in use 0.0.0.0:5000
at Server.setupListenHandle [as _listen2] (node:net:1908:16)
at listenInCluster (node:net:1965:12)
at doListen (node:net:2139:7)
at process.processTicksAndRejections (node:internal/process/task_queues:83:21)
Emitted 'error' event on Server instance at:
at emitErrorNT (node:net:1944:8)
at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
code: 'EADDRINUSE',
errno: -98,
syscall: 'listen',
address: '0.0.0.0',
port: 5000
}
Node.js v20.19.5
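This five-minute cycle is the check_frontend.sh cron job declaring the frontend dead and spawning a fresh copy against a port that is still occupied, so each new process dies with EADDRINUSE while the original keeps serving. A stricter liveness test would key on the port rather than the process name (a hypothetical replacement for the check, not the shipped script):
# Consider the frontend alive if something is already listening on :5000
if ss -ltn 'sport = :5000' | grep -q LISTEN; then
  echo "frontend already listening on :5000, skipping restart"
else
  /opt/ids/deployment/restart_frontend.sh
fi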
10:55:06 AM [express] GET /api/logs/[object%20Object] 200 in 10ms
10:55:06 AM [express] GET /api/detections 200 in 34ms :: [{"id":"5659c0b5-11df-4ebe-b73f-f53c64932953…
10:55:08 AM [express] GET /api/analytics/recent/[object%20Object] 200 in 7ms
10:55:11 AM [express] GET /api/analytics/recent/[object%20Object] 200 in 5ms
10:55:12 AM [express] GET /api/analytics/recent/[object%20Object] 200 in 5ms
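The /api/logs/[object%20Object] and /api/analytics/recent/[object%20Object] paths point to a client-side bug: an options object is being stringified into the URL path instead of serialized as query parameters. The server happens to return 200, but the request was presumably meant to look like:
curl -s "http://localhost:5000/api/analytics/recent?days=7"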


@ -0,0 +1,54 @@
./deployment/train_hybrid_production.sh
=======================================================================
TRAINING HYBRID ML DETECTOR - REAL DATA
=======================================================================
📂 Loading database credentials from .env...
✅ Credentials loaded:
Host: localhost
Port: 5432
Database: ids_database
User: ids_user
Password: ****** (hidden)
🎯 Training parameters:
Period: last 7 days
Max records: 1000000
🐍 Python: /opt/ids/python_ml/venv/bin/python
📊 Checking available data in the database...
primo_log | ultimo_log | periodo_totale | totale_records
---------------------+---------------------+----------------+----------------
2025-11-22 10:03:21 | 2025-11-24 17:58:17 | 2 days | 234,316,667
(1 row)
🚀 Starting training...
=======================================================================
[WARNING] Extended Isolation Forest not available, using standard IF
======================================================================
IDS HYBRID ML TRAINING - UNSUPERVISED MODE
======================================================================
[TRAIN] Loading last 7 days of real traffic from database...
❌ Error: column "dest_ip" does not exist
LINE 5: dest_ip,
^
Traceback (most recent call last):
File "/opt/ids/python_ml/train_hybrid.py", line 365, in main
train_unsupervised(args)
File "/opt/ids/python_ml/train_hybrid.py", line 91, in train_unsupervised
logs_df = train_on_real_traffic(db_config, days=args.days)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/ids/python_ml/train_hybrid.py", line 50, in train_on_real_traffic
cursor.execute(query, (days,))
File "/opt/ids/python_ml/venv/lib64/python3.11/site-packages/psycopg2/extras.py", line 236, in execute
return super().execute(query, vars)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
psycopg2.errors.UndefinedColumn: column "dest_ip" does not exist
LINE 5: dest_ip,
^
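The query in train_hybrid.py selects a dest_ip column, while the debug output elsewhere in these logs shows network_logs exposing destination_ip. Worth confirming against the live schema before patching the SELECT (sketch, same connection parameters as the script):
psql -h localhost -U ids_user -d ids_database -c "\d network_logs"
# if the column is destination_ip, rename dest_ip accordingly in train_hybrid.py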


@ -0,0 +1,360 @@
./deployment/update_from_git.sh
╔═══════════════════════════════════════════════╗
║  IDS SYSTEM UPDATE FROM GIT ║
╚═══════════════════════════════════════════════╝
 Checking git configuration...
 Backing up local configuration...
✅ .env saved to .env.backup
 Checking local changes...
 Downloading updates from git.alfacom.it...
remote: Enumerating objects: 25, done.
remote: Counting objects: 100% (25/25), done.
remote: Compressing objects: 100% (16/16), done.
remote: Total 16 (delta 13), reused 0 (delta 0), pack-reused 0 (from 0)
Unpacking objects: 100% (16/16), 2.36 KiB | 482.00 KiB/s, done.
From https://git.alfacom.it/marco/ids.alfacom.it
07f1895..e957556 main -> origin/main
* [new tag] v1.0.27 -> v1.0.27
From https://git.alfacom.it/marco/ids.alfacom.it
* branch main -> FETCH_HEAD
Updating 07f1895..e957556
Fast-forward
.replit | 4 ----
database-schema/apply_migrations.sh | 9 +++++++++
database-schema/schema.sql | 4 ++--
deployment/cleanup_database.sh | 4 +++-
deployment/debug_system.sh | 10 +++++++++-
version.json | 10 ++++++++--
6 files changed, 31 insertions(+), 10 deletions(-)
✅ Updates downloaded successfully
 Restoring local configuration...
✅ .env restored
 Updating Node.js dependencies...
up to date, audited 492 packages in 2s
65 packages are looking for funding
run `npm fund` for details
9 vulnerabilities (3 low, 5 moderate, 1 high)
To address issues that do not require attention, run:
npm audit fix
To address all issues (including breaking changes), run:
npm audit fix --force
Run `npm audit` for details.
✅ Node.js dependencies updated
 Updating Python dependencies...
Defaulting to user installation because normal site-packages is not writeable
Requirement already satisfied: fastapi==0.104.1 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 1)) (0.104.1)
Requirement already satisfied: uvicorn==0.24.0 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 2)) (0.24.0)
Requirement already satisfied: pandas==2.1.3 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 3)) (2.1.3)
Requirement already satisfied: numpy==1.26.2 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 4)) (1.26.2)
Requirement already satisfied: scikit-learn==1.3.2 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 5)) (1.3.2)
Requirement already satisfied: psycopg2-binary==2.9.9 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 6)) (2.9.9)
Requirement already satisfied: python-dotenv==1.0.0 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 7)) (1.0.0)
Requirement already satisfied: pydantic==2.5.0 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 8)) (2.5.0)
Requirement already satisfied: httpx==0.25.1 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 9)) (0.25.1)
Requirement already satisfied: anyio<4.0.0,>=3.7.1 in /home/ids/.local/lib/python3.11/site-packages (from fastapi==0.104.1->-r requirements.txt (line 1)) (3.7.1)
Requirement already satisfied: starlette<0.28.0,>=0.27.0 in /home/ids/.local/lib/python3.11/site-packages (from fastapi==0.104.1->-r requirements.txt (line 1)) (0.27.0)
Requirement already satisfied: typing-extensions>=4.8.0 in /home/ids/.local/lib/python3.11/site-packages (from fastapi==0.104.1->-r requirements.txt (line 1)) (4.15.0)
Requirement already satisfied: click>=7.0 in /home/ids/.local/lib/python3.11/site-packages (from uvicorn==0.24.0->-r requirements.txt (line 2)) (8.3.1)
Requirement already satisfied: h11>=0.8 in /home/ids/.local/lib/python3.11/site-packages (from uvicorn==0.24.0->-r requirements.txt (line 2)) (0.16.0)
Requirement already satisfied: python-dateutil>=2.8.2 in /home/ids/.local/lib/python3.11/site-packages (from pandas==2.1.3->-r requirements.txt (line 3)) (2.9.0.post0)
Requirement already satisfied: pytz>=2020.1 in /home/ids/.local/lib/python3.11/site-packages (from pandas==2.1.3->-r requirements.txt (line 3)) (2025.2)
Requirement already satisfied: tzdata>=2022.1 in /home/ids/.local/lib/python3.11/site-packages (from pandas==2.1.3->-r requirements.txt (line 3)) (2025.2)
Requirement already satisfied: scipy>=1.5.0 in /home/ids/.local/lib/python3.11/site-packages (from scikit-learn==1.3.2->-r requirements.txt (line 5)) (1.16.3)
Requirement already satisfied: joblib>=1.1.1 in /home/ids/.local/lib/python3.11/site-packages (from scikit-learn==1.3.2->-r requirements.txt (line 5)) (1.5.2)
Requirement already satisfied: threadpoolctl>=2.0.0 in /home/ids/.local/lib/python3.11/site-packages (from scikit-learn==1.3.2->-r requirements.txt (line 5)) (3.6.0)
Requirement already satisfied: annotated-types>=0.4.0 in /home/ids/.local/lib/python3.11/site-packages (from pydantic==2.5.0->-r requirements.txt (line 8)) (0.7.0)
Requirement already satisfied: pydantic-core==2.14.1 in /home/ids/.local/lib/python3.11/site-packages (from pydantic==2.5.0->-r requirements.txt (line 8)) (2.14.1)
Requirement already satisfied: certifi in /home/ids/.local/lib/python3.11/site-packages (from httpx==0.25.1->-r requirements.txt (line 9)) (2025.11.12)
Requirement already satisfied: httpcore in /home/ids/.local/lib/python3.11/site-packages (from httpx==0.25.1->-r requirements.txt (line 9)) (1.0.9)
Requirement already satisfied: idna in /home/ids/.local/lib/python3.11/site-packages (from httpx==0.25.1->-r requirements.txt (line 9)) (3.11)
Requirement already satisfied: sniffio in /home/ids/.local/lib/python3.11/site-packages (from httpx==0.25.1->-r requirements.txt (line 9)) (1.3.1)
Requirement already satisfied: six>=1.5 in /home/ids/.local/lib/python3.11/site-packages (from python-dateutil>=2.8.2->pandas==2.1.3->-r requirements.txt (line 3)) (1.17.0)
✅ Python dependencies updated
 Updating database schema...
Applying SQL migrations...
 Applying database migrations...
 Found 1 migration
Applying: 001_add_missing_columns.sql
✅ 001_add_missing_columns.sql applied
✅ All migrations applied successfully
✅ SQL migrations applied
Synchronizing Drizzle schema...
> rest-express@1.0.0 db:push
> drizzle-kit push
No config path provided, using default 'drizzle.config.ts'
Reading config file '/opt/ids/drizzle.config.ts'
Using 'pg' driver for database querying
[✓] Pulling schema from database...
· You're about to add routers_ip_address_unique unique constraint to the table, which contains 1 items. If this statement fails, you will receive an error from the database. Do you want to truncate routers table?
Warning Found data-loss statements:
· You're about to delete last_check column in routers table with 1 items
· You're about to delete status column in routers table with 1 items
THIS ACTION WILL CAUSE DATA LOSS AND CANNOT BE REVERTED
Do you still want to push changes?
[x] All changes were aborted
✅ Database schema fully synchronized
 Restarting services...
✅ Services restarted
╔═══════════════════════════════════════════════╗
║ ✅ UPDATE COMPLETED ║
╚═══════════════════════════════════════════════╝
 SYSTEM CHECK:
• Backend log: tail -f /var/log/ids/backend.log
• Frontend log: tail -f /var/log/ids/frontend.log
• Backend API: curl http://localhost:8000/health
• Frontend: curl http://localhost:5000
 SERVICE STATUS:
ids 5038 0.2 2.0 1894024 331912 ? Sl 09:20 1:17 /usr/bin/python3.11 main.py
root 12022 0.0 0.0 3088 1536 pts/3 S+ 17:51 0:00 tail -f /var/log/ids/syslog_parser.log
root 12832 0.0 0.1 730448 32068 pts/5 Rl+ 18:02 0:00 /usr/bin/node /usr/bin/npm run dev
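Note the contradiction in the drizzle step above: every statement was aborted ("[x] All changes were aborted"), yet the script still reports the schema as fully synchronized. One way to actually apply the pending changes is to run the push interactively and answer the truncate/data-loss prompts deliberately, using the repo's own npm script:
cd /opt/ids && npm run db:push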
[root@ids ids]# sudo -u ids /opt/ids/database-schema/apply_migrations.sh
 Applying database migrations...
 Found 1 migration
Applying: 001_add_missing_columns.sql
✅ 001_add_missing_columns.sql applied
✅ All migrations applied successfully
[root@ids ids]# psql postgresql://ids_user:TestPassword123@127.0.0.1:5432/ids_database -c "\d routers"
Table "public.routers"
Column | Type | Collation | Nullable | Default
------------+-----------------------------+-----------+----------+-------------------
id | character varying | | not null | gen_random_uuid()
name | text | | not null |
ip_address | text | | not null |
username | text | | not null |
password | text | | not null |
api_port | integer | | not null | 443
enabled | boolean | | not null | true
last_check | timestamp without time zone | | |
status | text | | |
created_at | timestamp without time zone | | not null | now()
last_sync | timestamp without time zone | | |
Indexes:
"routers_pkey" PRIMARY KEY, btree (id)
"routers_enabled_idx" btree (enabled)
"routers_ip_address_key" UNIQUE CONSTRAINT, btree (ip_address)
"routers_ip_idx" btree (ip_address)
[root@ids ids]# psql postgresql://ids_user:TestPassword123@127.0.0.1:5432/ids_database << 'EOF'
-- Count logs to delete
SELECT COUNT(*) as logs_da_eliminare FROM network_logs WHERE timestamp < NOW() - INTERVAL '7 days';
-- Delete them
DELETE FROM network_logs WHERE timestamp < NOW() - INTERVAL '7 days';
-- Free physical space
VACUUM FULL network_logs;
-- Check the result
SELECT COUNT(*) as logs_rimasti FROM network_logs;
SELECT pg_size_pretty(pg_database_size(current_database())) as dimensione_db;
EOF
logs_da_eliminare
-------------------
0
(1 row)
DELETE 0
VACUUM
logs_rimasti
--------------
0
(1 row)
dimensione_db
---------------
8853 kB
(1 row)
[root@ids ids]# sudo /opt/ids/deployment/setup_cron_cleanup.sh
 Configuring cron job for database cleanup...
⚠ Cron job already configured
 Current cron jobs for user ids:
# ============================================
# IDS SYSTEM - AUTOMATIC CONFIGURATION
# ============================================
# ML training every 12 hours (at 00:00 and 12:00)
0 */12 * * * /opt/ids/deployment/cron_train.sh
# Automatic detection every 5 minutes
*/5 * * * * /opt/ids/deployment/cron_detect.sh
# Check the Python backend process every 5 minutes (restart if not active)
*/5 * * * * /opt/ids/deployment/check_backend.sh >> /var/log/ids/cron.log 2>&1
# Check the frontend process every 5 minutes (restart if not active)
*/5 * * * * /opt/ids/deployment/check_frontend.sh >> /var/log/ids/cron.log 2>&1
# Weekly log cleanup (every Sunday at 02:00)
0 2 * * 0 find /var/log/ids -name "*.log" -size +100M -exec truncate -s 50M {} \; >> /var/log/ids/cron.log 2>&1
# Full system restart every week (Sunday at 03:00)
0 3 * * 0 /opt/ids/deployment/restart_all.sh >> /var/log/ids/cron.log 2>&1
# Daily database backup (at 04:00)
0 4 * * * /opt/ids/deployment/backup_db.sh >> /var/log/ids/cron.log 2>&1
0 3 * * * /opt/ids/deployment/cleanup_database.sh >> /var/log/ids/cleanup.log 2>&1
 Manual cleanup test:
sudo -u ids /opt/ids/deployment/cleanup_database.sh
[root@ids ids]# sudo -u ids /opt/ids/deployment/restart_all.sh
pkill: killing pid 12878 failed: Operation not permitted
pkill: killing pid 12832 failed: Operation not permitted
[root@ids ids]# /opt/ids/deployment/debug_system.sh
╔═══════════════════════════════════════════════╗
║  IDS SYSTEM DEBUG ║
╚═══════════════════════════════════════════════╝
═══ 1. DATABASE CHECK ═══
 Record counts per table:
tabella | record
------------------+--------
detections | 0
network_logs | 0
routers | 1
training_history | 0
whitelist | 0
(5 rows)
 Schema of table routers:
Table "public.routers"
Column | Type | Collation | Nullable | Default
------------+-----------------------------+-----------+----------+-------------------
id | character varying | | not null | gen_random_uuid()
name | text | | not null |
ip_address | text | | not null |
username | text | | not null |
password | text | | not null |
api_port | integer | | not null | 443
enabled | boolean | | not null | true
last_check | timestamp without time zone | | |
status | text | | |
created_at | timestamp without time zone | | not null | now()
last_sync | timestamp without time zone | | |
Indexes:
"routers_pkey" PRIMARY KEY, btree (id)
"routers_enabled_idx" btree (enabled)
"routers_ip_address_key" UNIQUE CONSTRAINT, btree (ip_address)
"routers_ip_idx" btree (ip_address)
 Last 5 network_logs:
timestamp | router_name | source_ip | destination_ip | protocol | packet_length
-----------+-------------+-----------+----------------+----------+---------------
(0 rows)
 Training history:
trained_at | model_version | records_processed | features_count | status | notes
------------+---------------+-------------------+----------------+--------+-------
(0 rows)
 Detections:
detected_at | source_ip | risk_score | anomaly_type | blocked | log_count
-------------+-----------+------------+--------------+---------+-----------
(0 rows)
═══ 2. SERVICE STATUS ═══
 Active processes:
ids 5038 0.2 2.0 1894024 331912 ? Sl 09:20 1:17 /usr/bin/python3.11 main.py
root 12022 0.0 0.0 3088 1536 pts/3 S+ 17:51 0:00 tail -f /var/log/ids/syslog_parser.log
root 12832 0.2 0.3 1097848 59768 pts/5 Sl 18:02 0:00 npm run dev
═══ 3. BACKEND PYTHON ML ═══
✅ Python backend active
 ML statistics:
{
"logs": {
"total": 0,
"last_hour": 0
},
"detections": {
"total": 0,
"blocked": 0
},
"routers": {
"active": 1
},
"latest_training": null
}
═══ 4. FRONTEND NODE.JS ═══
✅ Frontend Node active
 Test API:
{
"routers": {
"total": 1,
"enabled": 1
},
"detections": {
"total": 0,
"blocked": 0,
"critical": 0,
"high": 0
},
"logs": {
"recent": 0
},
"whitelist": {
"total": 0
}
}
═══ 5. SYSLOG PARSER ═══
❌ Syslog Parser NOT active
Start it: cd /opt/ids/python_ml && nohup python syslog_parser.py > /var/log/ids/syslog_parser.log 2>&1 &
═══ 6. RECENT ERRORS ═══
🔴 Python backend errors:
ERROR: [Errno 98] error while attempting to bind on address ('0.0.0.0', 8000): address already in use
ERROR: [Errno 98] error while attempting to bind on address ('0.0.0.0', 8000): address already in use
ERROR: [Errno 98] error while attempting to bind on address ('0.0.0.0', 8000): address already in use
ERROR: [Errno 98] error while attempting to bind on address ('0.0.0.0', 8000): address already in use
🔴 Frontend Node errors:
[DB ERROR] Failed to fetch routers: error: column "last_sync" does not exist
╔═══════════════════════════════════════════════╗
║ 📋 SUMMARY ║
╚═══════════════════════════════════════════════╝
Database:
• Network logs: 0
• Detections: 0
• Training history: 0
🔧 USEFUL COMMANDS:
• Restart everything: sudo -u ids /opt/ids/deployment/restart_all.sh
• Test training: curl -X POST http://localhost:8000/train -H 'Content-Type: application/json' -d '{"max_records": 1000}'
• Frontend log: tail -f /var/log/ids/frontend.log
• Backend log: tail -f /var/log/ids/backend.log


@ -0,0 +1,101 @@
./deployment/update_from_git.sh --db
╔═══════════════════════════════════════════════╗
║  IDS SYSTEM UPDATE FROM GIT ║
╚═══════════════════════════════════════════════╝
 Checking git configuration...
 Backing up local configuration...
✅ .env saved to .env.backup
 Checking local changes...
⚠ There are uncommitted local changes
Run 'git status' to see the details
Proceed anyway? (y/n) y
Saving local changes temporarily...
No local changes to save
 Downloading updates from git.alfacom.it...
remote: Enumerating objects: 21, done.
remote: Counting objects: 100% (21/21), done.
remote: Compressing objects: 100% (13/13), done.
remote: Total 13 (delta 9), reused 0 (delta 0), pack-reused 0 (from 0)
Unpacking objects: 100% (13/13), 3.37 KiB | 492.00 KiB/s, done.
From https://git.alfacom.it/marco/ids.alfacom.it
3a945ec..152e226 main -> origin/main
* [new tag] v1.0.56 -> v1.0.56
From https://git.alfacom.it/marco/ids.alfacom.it
* branch main -> FETCH_HEAD
Updating 3a945ec..152e226
Fast-forward
attached_assets/Pasted--deployment-update-from-git-sh-db-AGGIOR-1764001889941_1764001889941.txt | 90 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
database-schema/schema.sql | 4 ++--
python_ml/requirements.txt | 2 +-
replit.md | 5 +++--
version.json | 16 ++++++++--------
5 files changed, 104 insertions(+), 13 deletions(-)
create mode 100644 attached_assets/Pasted--deployment-update-from-git-sh-db-AGGIOR-1764001889941_1764001889941.txt
✅ Updates downloaded successfully
🔄 Restoring local configuration...
✅ .env restored
📦 Updating Node.js dependencies...
up to date, audited 492 packages in 2s
65 packages are looking for funding
run `npm fund` for details
9 vulnerabilities (3 low, 5 moderate, 1 high)
To address issues that do not require attention, run:
npm audit fix
To address all issues (including breaking changes), run:
npm audit fix --force
Run `npm audit` for details.
✅ Node.js dependencies updated
📦 Updating Python dependencies...
Defaulting to user installation because normal site-packages is not writeable
Requirement already satisfied: fastapi==0.104.1 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 1)) (0.104.1)
Requirement already satisfied: uvicorn==0.24.0 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 2)) (0.24.0)
Requirement already satisfied: pandas==2.1.3 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 3)) (2.1.3)
Requirement already satisfied: numpy==1.26.2 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 4)) (1.26.2)
Requirement already satisfied: scikit-learn==1.3.2 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 5)) (1.3.2)
Requirement already satisfied: psycopg2-binary==2.9.9 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 6)) (2.9.9)
Requirement already satisfied: python-dotenv==1.0.0 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 7)) (1.0.0)
Requirement already satisfied: pydantic==2.5.0 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 8)) (2.5.0)
Requirement already satisfied: httpx==0.25.1 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 9)) (0.25.1)
Collecting xgboost==2.0.3
Using cached xgboost-2.0.3-py3-none-manylinux2014_x86_64.whl (297.1 MB)
Collecting joblib==1.3.2
Using cached joblib-1.3.2-py3-none-any.whl (302 kB)
Collecting eif==2.0.2
Downloading eif-2.0.2.tar.gz (1.6 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.6/1.6 MB 2.8 MB/s eta 0:00:00
Preparing metadata (setup.py) ... error
error: subprocess-exited-with-error
× python setup.py egg_info did not run successfully.
│ exit code: 1
╰─> [6 lines of output]
Traceback (most recent call last):
File "<string>", line 2, in <module>
File "<pip-setuptools-caller>", line 34, in <module>
File "/tmp/pip-install-7w_zhzdf/eif_d01f9f1e418b4512a5d7b4cf0e1128e2/setup.py", line 4, in <module>
from Cython.Distutils import build_ext
ModuleNotFoundError: No module named 'Cython'
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed
× Encountered error while generating package metadata.
╰─> See above for output.
note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
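The eif 2.0.2 sdist imports Cython in its setup.py before declaring it as a build dependency, so metadata generation fails whenever Cython is absent. Installing the build prerequisite in a separate, earlier pip run should unblock it (a sketch — merely listing Cython above eif inside requirements.txt is not enough, because pip generates metadata for every requirement before installing any of them):

# Install the build prerequisite first, then retry only the failed package
pip install --user Cython
pip install --user eif==2.0.2
pip install --user -r requirements.txt   # rerun the rest once eif builds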

View File

@ -0,0 +1,90 @@
./deployment/update_from_git.sh --db
╔═══════════════════════════════════════════════╗
║  IDS SYSTEM UPDATE FROM GIT ║
╚═══════════════════════════════════════════════╝
Checking git configuration...
Backing up local configuration...
✅ .env saved to .env.backup
Checking for local changes...
⚠ There are uncommitted local changes
Run 'git status' to see the details
Proceed anyway? (y/n) y
Temporarily stashing local changes...
No local changes to save
Downloading updates from git.alfacom.it...
remote: Enumerating objects: 51, done.
remote: Counting objects: 100% (51/51), done.
remote: Compressing objects: 100% (41/41), done.
remote: Total 41 (delta 32), reused 0 (delta 0), pack-reused 0 (from 0)
Unpacking objects: 100% (41/41), 31.17 KiB | 1.35 MiB/s, done.
From https://git.alfacom.it/marco/ids.alfacom.it
0fa2f11..3a945ec main -> origin/main
* [new tag] v1.0.55 -> v1.0.55
From https://git.alfacom.it/marco/ids.alfacom.it
* branch main -> FETCH_HEAD
Updating 0fa2f11..3a945ec
Fast-forward
database-schema/schema.sql | 4 +-
deployment/CHECKLIST_ML_HYBRID.md | 536 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
python_ml/dataset_loader.py | 384 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
python_ml/main.py | 120 ++++++++++++++++++++++++++++------
python_ml/ml_hybrid_detector.py | 705 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
python_ml/requirements.txt | 3 +
python_ml/train_hybrid.py | 378 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
python_ml/validation_metrics.py | 324 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
replit.md | 19 +++++-
version.json | 16 ++---
10 files changed, 2459 insertions(+), 30 deletions(-)
create mode 100644 deployment/CHECKLIST_ML_HYBRID.md
create mode 100644 python_ml/dataset_loader.py
create mode 100644 python_ml/ml_hybrid_detector.py
create mode 100644 python_ml/train_hybrid.py
create mode 100644 python_ml/validation_metrics.py
✅ Updates downloaded successfully
🔄 Restoring local configuration...
✅ .env restored
📦 Updating Node.js dependencies...
up to date, audited 492 packages in 3s
65 packages are looking for funding
run `npm fund` for details
9 vulnerabilities (3 low, 5 moderate, 1 high)
To address issues that do not require attention, run:
npm audit fix
To address all issues (including breaking changes), run:
npm audit fix --force
Run `npm audit` for details.
✅ Node.js dependencies updated
📦 Updating Python dependencies...
Defaulting to user installation because normal site-packages is not writeable
Requirement already satisfied: fastapi==0.104.1 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 1)) (0.104.1)
Requirement already satisfied: uvicorn==0.24.0 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 2)) (0.24.0)
Requirement already satisfied: pandas==2.1.3 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 3)) (2.1.3)
Requirement already satisfied: numpy==1.26.2 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 4)) (1.26.2)
Requirement already satisfied: scikit-learn==1.3.2 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 5)) (1.3.2)
Requirement already satisfied: psycopg2-binary==2.9.9 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 6)) (2.9.9)
Requirement already satisfied: python-dotenv==1.0.0 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 7)) (1.0.0)
Requirement already satisfied: pydantic==2.5.0 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 8)) (2.5.0)
Requirement already satisfied: httpx==0.25.1 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 9)) (0.25.1)
Collecting xgboost==2.0.3
Downloading xgboost-2.0.3-py3-none-manylinux2014_x86_64.whl (297.1 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 297.1/297.1 MB 8.4 MB/s eta 0:00:00
Collecting joblib==1.3.2
Downloading joblib-1.3.2-py3-none-any.whl (302 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 302.2/302.2 kB 62.7 MB/s eta 0:00:00
ERROR: Ignored the following versions that require a different python version: 1.21.2 Requires-Python >=3.7,<3.11; 1.21.3 Requires-Python >=3.7,<3.11; 1.21.4 Requires-Python >=3.7,<3.11; 1.21.5 Requires-Python >=3.7,<3.11; 1.21.6 Requires-Python >=3.7,<3.11
ERROR: Could not find a version that satisfies the requirement eif==2.0.0 (from versions: 1.0.0, 1.0.1, 1.0.2, 2.0.2)
ERROR: No matching distribution found for eif==2.0.0
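This earlier failure is different: requirements.txt pins eif==2.0.0, a version that was never published (PyPI has 1.0.0, 1.0.1, 1.0.2 and 2.0.2, as the error lists). Repinning to the released version lets resolution proceed (a sketch; building 2.0.2 then additionally requires Cython to be preinstalled, as the previous transcript shows):

# Repin to the published release and retry
sed -i 's/^eif==2.0.0/eif==2.0.2/' python_ml/requirements.txt
pip install --user Cython
pip install --user -r python_ml/requirements.txt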

View File

@ -0,0 +1,111 @@
./fix_postgresql_auth.sh
╔═══════════════════════════════════════════════╗
║ PostgreSQL Authentication Fix ║
╚═══════════════════════════════════════════════╝
pg_hba.conf file: /var/lib/pgsql/data/pg_hba.conf
Backing up configuration...
✅ Backup saved: /var/lib/pgsql/data/pg_hba.conf.backup_20251117_171950
CURRENT configuration:
local all all peer
host all all 127.0.0.1/32 scram-sha-256
host all all ::1/128 scram-sha-256
local replication all peer
host replication all 127.0.0.1/32 ident
host replication all ::1/128 ident
local ids_database ids_user md5
host ids_database ids_user 127.0.0.1/32 md5
Modifying pg_hba.conf...
✅ Configuration modified
NEW configuration:
local all all peer
host all all 127.0.0.1/32 scram-sha-256
host all all ::1/128 scram-sha-256
local replication all peer
host replication all 127.0.0.1/32 ident
host replication all ::1/128 ident
local ids_database ids_user md5
host ids_database ids_user 127.0.0.1/32 md5
🔄 Reloading PostgreSQL configuration...
✅ PostgreSQL reloaded
🧪 Testing connection with password...
❌ Connection FAILED
Check the password in /opt/ids/.env
[root@ids deployment]# systemctl restart postgresql
[root@ids deployment]# psql -h localhost -U ids_user -d ids_database -c "SELECT 1;"
psql: error: FATAL: password authentication failed for user "ids_user"
[root@ids deployment]# sudo -u postgres psql -c "ALTER USER ids_user WITH PASSWORD 'fuvX7Lk7gVNbW72bj81kfU/m6VV+j3SQ8w09z16W1CE=';"
ALTER ROLE
[root@ids deployment]# psql -h localhost -U ids_user -d ids_database -c "SELECT 1;"
psql: error: FATAL: password authentication failed for user "ids_user"
[root@ids deployment]# ./fix_postgresql_auth.sh
╔═══════════════════════════════════════════════╗
║ PostgreSQL Authentication Fix ║
╚═══════════════════════════════════════════════╝
pg_hba.conf file: /var/lib/pgsql/data/pg_hba.conf
Backing up configuration...
✅ Backup saved: /var/lib/pgsql/data/pg_hba.conf.backup_20251117_172100
CURRENT configuration:
local all all peer
host all all 127.0.0.1/32 scram-sha-256
host all all ::1/128 scram-sha-256
local replication all peer
host replication all 127.0.0.1/32 ident
host replication all ::1/128 ident
local ids_database ids_user md5
host ids_database ids_user 127.0.0.1/32 md5
Modifying pg_hba.conf...
✅ Configuration modified
NEW configuration:
local all all peer
host all all 127.0.0.1/32 scram-sha-256
host all all ::1/128 scram-sha-256
local replication all peer
host replication all 127.0.0.1/32 ident
host replication all ::1/128 ident
local ids_database ids_user md5
host ids_database ids_user 127.0.0.1/32 md5
🔄 Reloading PostgreSQL configuration...
✅ PostgreSQL reloaded
🧪 Testing connection with password...
❌ Connection FAILED
Check the password in /opt/ids/.env
[root@ids deployment]# export PGPASSWORD='fuvX7Lk7gVNbW72bj81kfU/m6VV+j3SQ8w09z16W1CE='
[root@ids deployment]# psql -h localhost -U ids_user -d ids_database -c "SELECT 1;"
psql: error: FATAL: password authentication failed for user "ids_user"
[root@ids deployment]# TEST_PASS="TestPassword123"
[root@ids deployment]# sudo -u postgres psql -c "ALTER USER ids_user WITH PASSWORD '$TEST_PASS';"
ALTER ROLE
[root@ids deployment]# sudo -u ids sed -i "s/PGPASSWORD=.*/PGPASSWORD=$TEST_PASS/" /opt/ids/.env
[root@ids deployment]# grep PGPASSWORD /opt/ids/.env
PGPASSWORD=TestPassword123
[root@ids deployment]# export PGPASSWORD="$TEST_PASS"
[root@ids deployment]# psql -h localhost -U ids_user -d ids_database -c "SELECT 1;"
psql: error: FATAL: password authentication failed for user "ids_user"
[root@ids deployment]# tail -30 /var/lib/pgsql/data/log/postgresql-*.log | grep -i "ids_user"
2025-11-17 17:19:50.207 CET [59081] DETAIL: User "ids_user" does not have a valid SCRAM secret.
2025-11-17 17:20:26.379 CET [59126] FATAL: password authentication failed for user "ids_user"
2025-11-17 17:20:26.379 CET [59126] DETAIL: User "ids_user" does not have a valid SCRAM secret.
2025-11-17 17:20:51.102 CET [59132] FATAL: password authentication failed for user "ids_user"
2025-11-17 17:20:51.102 CET [59132] DETAIL: User "ids_user" does not have a valid SCRAM secret.
2025-11-17 17:21:00.789 CET [59154] FATAL: password authentication failed for user "ids_user"
2025-11-17 17:21:00.789 CET [59154] DETAIL: User "ids_user" does not have a valid SCRAM secret.
2025-11-17 17:22:28.055 CET [59160] FATAL: password authentication failed for user "ids_user"
2025-11-17 17:22:28.055 CET [59160] DETAIL: User "ids_user" does not have a valid SCRAM secret.
2025-11-17 17:23:42.513 CET [59171] FATAL: password authentication failed for user "ids_user"
2025-11-17 17:23:42.513 CET [59171] DETAIL: User "ids_user" does not have a valid SCRAM secret.
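These log lines explain the whole loop. pg_hba.conf is evaluated first-match-wins, and the generic host all all 127.0.0.1/32 scram-sha-256 entry sits above the host ids_database ids_user ... md5 entry, so every TCP login by ids_user is forced through SCRAM; meanwhile the role's password verifier is stored as an MD5 hash ("does not have a valid SCRAM secret"), so authentication can never succeed no matter which password is set. Note also that the script's CURRENT and NEW dumps are identical, i.e. its pg_hba.conf edit was a no-op. A minimal fix sketch (REPLACE_WITH_ENV_PASSWORD is a placeholder for the value in /opt/ids/.env); alternatively, move the two ids_database lines above the host all all entries:

sudo -u postgres psql <<'SQL'
-- hash the new password as SCRAM so the scram-sha-256 rule can verify it
SET password_encryption = 'scram-sha-256';
ALTER USER ids_user WITH PASSWORD 'REPLACE_WITH_ENV_PASSWORD';
SQL
systemctl reload postgresql
PGPASSWORD='REPLACE_WITH_ENV_PASSWORD' psql -h localhost -U ids_user -d ids_database -c "SELECT 1;"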

View File

@ -0,0 +1,42 @@
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:15256->108.55.41.22:445, len 52
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:15256->108.55.41.22:445, len 52
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:24416->185.114.48.212:445, len 52
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:24416->185.114.48.212:445, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-856_gianluca.carmellino>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 178.22.24.64:53707->185.203.25.160:10401, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-856_gianluca.carmellino>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 178.22.24.64:53707->185.203.25.160:10401, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 183.147.162.78:42369->185.203.24.153:23, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 183.147.162.78:42369->185.203.24.153:23, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-datev.router>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 38.242.148.189:51558->185.203.25.199:53, len 69
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:59956->185.114.64.51:445, len 52
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:59956->185.114.64.51:445, len 52
forward: in:<pppoe-1496_1143_demartinog> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac e4:38:83:be:c8:60, proto UDP, 10.0.254.250:64924->216.58.205.46:443, len 1228
forward: in:<pppoe-1496_1143_demartinog> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac e4:38:83:be:c8:60, proto UDP, 10.0.254.250:64924->216.58.205.46:443, len 1228
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:26015->85.39.11.225:445, len 52
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:26015->85.39.11.225:445, len 52
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:21538->216.0.0.11:445, len 52
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:21538->216.0.0.11:445, len 52
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:48075->108.55.66.212:445, len 52
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:48075->108.55.66.212:445, len 52
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:23250->78.107.87.197:445, len 52
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:23250->78.107.87.197:445, len 52
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:62934->172.121.122.57:445, len 52
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:62934->172.121.122.57:445, len 52
forward: in:<pppoe-1641_1395_hlukhnatal> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.64:35308->168.138.169.206:443, len 60
forward: in:<pppoe-1641_1395_hlukhnatal> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.64:35308->168.138.169.206:443, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.58.142:55556->185.203.24.204:4499, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 170.247.28.229:20400->185.203.24.25:443, len 52
forward: in:<pppoe-gennaro.cibelli.sala> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 84:d8:1b:68:7e:07, proto UDP, 185.203.25.162:57994->17.253.53.73:443, len 1378
forward: in:<pppoe-gennaro.cibelli.sala> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 84:d8:1b:68:7e:07, proto UDP, 185.203.25.162:57994->17.253.53.73:443, len 1378
forward: in:<pppoe-gennaro.cibelli.sala> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 84:d8:1b:68:7e:07, proto UDP, 185.203.25.162:57994->17.253.53.73:443, len 700
forward: in:<pppoe-gennaro.cibelli.sala> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 84:d8:1b:68:7e:07, proto UDP, 185.203.25.162:57994->17.253.53.73:443, len 700
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 170.247.29.25:27540->185.203.24.94:443, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 170.247.29.25:27540->185.203.24.94:443, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-alfonso.santonicola>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 103.102.230.4:33260->185.203.25.227:8728, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-alfonso.santonicola>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 103.102.230.4:33260->185.203.25.227:8728, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:sfp-sfpplus1_VS_FTTO, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 68.183.27.223:43452->185.203.26.33:29092, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:sfp-sfpplus1_VS_FTTO, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 68.183.27.223:43452->185.203.26.33:29092, len 52
forward: in:<pppoe-891_mariagiovanna.morrone> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.214:39490->44.212.216.137:443, len 60
forward: in:<pppoe-891_mariagiovanna.morrone> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.214:39490->44.212.216.137:443, len 60
forward: in:<pppoe-1024_maria.granato> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.94:47860->216.239.36.223:443, len 60
forward: in:<pppoe-1024_maria.granato> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.94:47860->216.239.36.223:443, len 60
forward: in:<pppoe-1024_maria.granato> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.94:47864->216.239.36.223:443, len 60
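For quick triage of a capture like this, the interesting lines are the detected-ddos ones, and the source address is the IP:port pair immediately before "->". A small shell sketch (the file name mikrotik.log is a placeholder for wherever the capture is saved):

# Top source IPs among detected-ddos lines
grep '^detected-ddos' mikrotik.log \
  | grep -oE '[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+:[0-9]+->' \
  | cut -d: -f1 | sort | uniq -c | sort -rn | head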

View File

@ -0,0 +1,581 @@
forward: in:<pppoe-franco.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.77:53783->52.213.60.221:443, len 64
forward: in:<pppoe-franco.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.77:53783->52.213.60.221:443, len 64
forward: in:<pppoe-franco.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.77:53784->108.138.187.109:443, len 64
forward: in:<pppoe-franco.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.77:53784->108.138.187.109:443, len 64
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:61:c7, proto UDP, 185.203.24.22:27417->8.8.8.8:53, len 79
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:61:c7, proto UDP, 185.203.24.22:27417->8.8.8.8:53, len 79
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:61:c7, proto UDP, 185.203.24.22:21103->8.8.8.8:53, len 72
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:61:c7, proto UDP, 185.203.24.22:21103->8.8.8.8:53, len 72
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 172.217.38.146:35055->185.203.24.95:993, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 172.217.38.146:35055->185.203.24.95:993, len 60
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:61:c7, proto UDP, 185.203.24.22:45846->8.8.8.8:53, len 217
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:61:c7, proto UDP, 185.203.24.22:45846->8.8.8.8:53, len 217
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:43652->185.203.24.135:9004, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:43652->185.203.24.135:9004, len 60
forward: in:<pppoe-franco.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.77:53785->142.250.180.134:443, len 64
forward: in:<pppoe-imo.office> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac cc:2d:e0:d9:1a:07, proto UDP, 185.203.25.69:33806->165.154.165.205:8810, len 187
forward: in:<pppoe-imo.office> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac cc:2d:e0:d9:1a:07, proto UDP, 185.203.25.69:33806->165.154.165.205:8810, len 187
forward: in:<pppoe-imo.office> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac cc:2d:e0:d9:1a:07, proto UDP, 185.203.25.69:33806->165.154.165.205:8811, len 187
forward: in:<pppoe-imo.office> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac cc:2d:e0:d9:1a:07, proto UDP, 185.203.25.69:33806->165.154.165.205:8811, len 187
forward: in:<pppoe-imo.office> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac cc:2d:e0:d9:1a:07, proto UDP, 185.203.25.69:33806->165.154.165.205:8812, len 187
forward: in:<pppoe-imo.office> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac cc:2d:e0:d9:1a:07, proto UDP, 185.203.25.69:33806->165.154.165.205:8812, len 187
forward: in:<pppoe-imo.office> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac cc:2d:e0:d9:1a:07, proto UDP, 185.203.25.69:33806->165.154.165.205:8813, len 187
forward: in:<pppoe-imo.office> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac cc:2d:e0:d9:1a:07, proto UDP, 185.203.25.69:33806->165.154.165.205:8813, len 187
forward: in:<pppoe-imo.office> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac cc:2d:e0:d9:1a:07, proto UDP, 185.203.25.69:33806->165.154.165.205:8814, len 187
forward: in:<pppoe-imo.office> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac cc:2d:e0:d9:1a:07, proto UDP, 185.203.25.69:33806->165.154.165.205:8814, len 187
forward: in:<pppoe-903_adalgisa.citro> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.29:54050->108.138.192.65:443, len 60
forward: in:<pppoe-903_adalgisa.citro> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.29:54050->108.138.192.65:443, len 60
forward: in:<pppoe-903_adalgisa.citro> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.29:54062->108.138.192.65:443, len 60
forward: in:<pppoe-530_vincenzo.battipaglia> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac b8:69:f4:f7:b5:ec, proto TCP (ACK,PSH), 10.0.254.155:47704->157.240.231.60:443, len 76
forward: in:<pppoe-530_vincenzo.battipaglia> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.155:41058->157.240.231.60:443, len 60
forward: in:<pppoe-530_vincenzo.battipaglia> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.155:41058->157.240.231.60:443, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-1537_1211_fglsrl>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.179:44575->185.203.25.89:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-1537_1211_fglsrl>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.179:44575->185.203.25.89:53, len 62
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:9851->185.19.124.171:445, len 52
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:9851->185.19.124.171:445, len 52
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:61:c7, proto UDP, 185.203.24.22:53503->8.8.8.8:53, len 80
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:61:c7, proto UDP, 185.203.24.22:53503->8.8.8.8:53, len 80
forward: in:<pppoe-891_mariagiovanna.morrone> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac c4:ad:34:aa:c7:04, proto UDP, 10.0.254.214:64263->57.144.140.5:443, len 128
forward: in:<pppoe-891_mariagiovanna.morrone> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac c4:ad:34:aa:c7:04, proto UDP, 10.0.254.214:64263->57.144.140.5:443, len 128
forward: in:ether6_RB_CED out:sfp-sfpplus2_VS_AS, connection-state:new src-mac e4:8d:8c:03:f9:56, proto UDP, 10.1.0.254:37832->37.186.217.132:161, len 73
forward: in:ether6_RB_CED out:sfp-sfpplus2_VS_AS, connection-state:new src-mac e4:8d:8c:03:f9:56, proto UDP, 10.1.0.254:37832->37.186.217.132:161, len 73
forward: in:ether6_RB_CED out:sfp-sfpplus2_VS_AS, connection-state:new,snat src-mac e4:8d:8c:03:f9:56, proto UDP, 10.1.0.254:37832->37.186.217.132:161, NAT (10.1.0.254:37832->185.203.27.253:37832)->37.186.217.132:161, len 73
forward: in:ether6_RB_CED out:sfp-sfpplus2_VS_AS, connection-state:new,snat src-mac e4:8d:8c:03:f9:56, proto UDP, 10.1.0.254:37832->37.186.217.132:161, NAT (10.1.0.254:37832->185.203.27.253:37832)->37.186.217.132:161, len 73
forward: in:ether6_RB_CED out:sfp-sfpplus2_VS_AS, connection-state:new,snat src-mac e4:8d:8c:03:f9:56, proto UDP, 10.1.0.254:37832->37.186.217.132:161, NAT (10.1.0.254:37832->185.203.27.253:37832)->37.186.217.132:161, len 73
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-1537_1211_fglsrl>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.179:44575->185.203.25.89:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-1537_1211_fglsrl>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.179:44575->185.203.25.89:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-910_michele.ferrara>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.160:13391->185.203.25.59:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-910_michele.ferrara>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.160:13391->185.203.25.59:53, len 62
forward: in:<pppoe-958_carolina.carpentieri> out:sfp-sfpplus2_VS_AS, connection-state:new,snat src-mac b8:69:f4:0d:ae:7f, proto TCP (ACK,FIN,PSH), 10.0.254.129:42640->161.71.33.241:443, NAT (10.0.254.129:42640->185.203.27.253:42640)->161.71.33.241:443, len 76
forward: in:<pppoe-958_carolina.carpentieri> out:sfp-sfpplus2_VS_AS, connection-state:new,snat src-mac b8:69:f4:0d:ae:7f, proto TCP (ACK,FIN,PSH), 10.0.254.129:42640->161.71.33.241:443, NAT (10.0.254.129:42640->185.203.27.253:42640)->161.71.33.241:443, len 76
forward: in:<pppoe-934_enza.adinolfi> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 48:8f:5a:f7:54:43, proto UDP, 10.0.254.89:51577->157.240.231.15:443, len 1228
forward: in:<pppoe-934_enza.adinolfi> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 48:8f:5a:f7:54:43, proto UDP, 10.0.254.89:51577->157.240.231.15:443, len 1228
forward: in:<pppoe-795_giuseppe.diblasi> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.92:54264->157.240.231.60:443, len 60
forward: in:<pppoe-795_giuseppe.diblasi> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.92:54264->157.240.231.60:443, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-910_michele.ferrara>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.160:13391->185.203.25.59:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-910_michele.ferrara>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.160:13391->185.203.25.59:53, len 62
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:37060->185.8.52.202:445, len 52
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:37060->185.8.52.202:445, len 52
forward: in:<pppoe-131_vinicola.siani> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.249.201:50895->157.240.231.175:5222, len 64
forward: in:<pppoe-131_vinicola.siani> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.249.201:50895->157.240.231.175:5222, len 64
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:61:c7, proto UDP, 185.203.24.22:47506->8.8.8.8:53, len 220
forward: in:<pppoe-618_aniello.fimiani> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac fc:ec:da:22:ed:55, proto UDP, 10.0.254.36:46032->173.194.182.167:443, len 1278
forward: in:<pppoe-618_aniello.fimiani> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac fc:ec:da:22:ed:55, proto UDP, 10.0.254.36:46032->173.194.182.167:443, len 1278
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-comune.nocerasuperiore>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.80:13460->185.203.26.17:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-comune.nocerasuperiore>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.80:13460->185.203.26.17:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.56.186:46068->185.203.24.60:45005, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.56.186:46068->185.203.24.60:45005, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-gaetano.dibenedetto>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.161:13979->185.203.25.204:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-gaetano.dibenedetto>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.161:13979->185.203.25.204:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 152.32.128.85:42054->185.203.24.160:7707, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 152.32.128.85:42054->185.203.24.160:7707, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 194.163.42.114:20073->185.203.26.77:53, len 65
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 194.163.42.114:20073->185.203.26.77:53, len 65
forward: in:sfp-sfpplus1_VS_FTTO out:sfp-sfpplus2_VS_AS, connection-state:new src-mac c4:ad:34:25:a7:b5, proto UDP, 10.0.30.254:34189->8.8.8.8:53, len 65
forward: in:sfp-sfpplus1_VS_FTTO out:sfp-sfpplus2_VS_AS, connection-state:new src-mac c4:ad:34:25:a7:b5, proto UDP, 10.0.30.254:34189->8.8.8.8:53, len 65
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-661_pasquale.cibelli>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 45.43.33.218:36350->185.203.25.186:554, len 60
forward: in:<pppoe-618_aniello.fimiani> out:sfp-sfpplus2_VS_AS, connection-state:new,snat src-mac fc:ec:da:22:ed:55, proto UDP, 10.0.254.36:46032->173.194.182.167:443, NAT (10.0.254.36:46032->185.203.27.253:46032)->173.194.182.167:443, len 1278
forward: in:<pppoe-618_aniello.fimiani> out:sfp-sfpplus2_VS_AS, connection-state:new,snat src-mac fc:ec:da:22:ed:55, proto UDP, 10.0.254.36:46032->173.194.182.167:443, NAT (10.0.254.36:46032->185.203.27.253:46032)->173.194.182.167:443, len 1278
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:31539->8.8.8.8:53, len 111
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:31539->8.8.8.8:53, len 111
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-gaetano.dibenedetto>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.161:13979->185.203.25.204:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-gaetano.dibenedetto>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.161:13979->185.203.25.204:53, len 62
forward: in:<pppoe-caronte.hightek_01> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.233:58390->192.168.25.254:80, len 60
forward: in:<pppoe-caronte.hightek_01> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.233:58390->192.168.25.254:80, len 60
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:61:c7, proto UDP, 185.203.24.22:25235->8.8.8.8:53, len 217
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:61:c7, proto UDP, 185.203.24.22:25235->8.8.8.8:53, len 217
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:39557->185.203.196.108:445, len 52
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:39557->185.203.196.108:445, len 52
forward: in:<pppoe-giovanni.villani> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.174:28748->216.58.204.129:443, len 64
forward: in:<pppoe-giovanni.villani> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.174:28748->216.58.204.129:443, len 64
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 91.134.84.178:56968->185.203.24.84:738, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-cava.gioxiii.seg>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.44:17118->185.203.25.231:53, len 62
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:49523->52.182.143.208:443, len 52
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:49523->52.182.143.208:443, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 148.251.31.230:26015->185.203.26.77:53, len 65
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 148.251.31.230:26015->185.203.26.77:53, len 65
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 148.251.31.230:2509->185.203.26.77:53, len 65
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 148.251.31.230:2509->185.203.26.77:53, len 65
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 170.247.29.20:48602->185.203.24.35:443, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 170.247.29.20:48602->185.203.24.35:443, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.58.142:55556->185.203.24.37:2718, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.58.142:55556->185.203.24.37:2718, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-caronte.hightek_01>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.60.250:52216->185.203.25.232:32895, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-caronte.hightek_01>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.60.250:52216->185.203.25.232:32895, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-cloud_team_system_2>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 170.247.28.32:23154->185.203.25.208:443, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-cloud_team_system_2>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 170.247.28.32:23154->185.203.25.208:443, len 52
forward: in:<pppoe-salvatore.lanzara> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.249.49:60855->142.251.31.109:993, len 64
forward: in:<pppoe-salvatore.lanzara> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.249.49:60855->142.251.31.109:993, len 64
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-servizi.voip.esterni>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 170.247.31.169:12233->185.203.25.246:443, len 52
forward: in:<pppoe-1645_1400_codaantoni> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.12:42176->52.29.103.180:443, len 60
forward: in:<pppoe-1645_1400_codaantoni> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.12:42176->52.29.103.180:443, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-661_pasquale.cibelli>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.167:16865->185.203.25.186:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-661_pasquale.cibelli>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.167:16865->185.203.25.186:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:50336->185.203.24.135:9005, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:50336->185.203.24.135:9005, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:59360->185.203.24.135:9008, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:59360->185.203.24.135:9008, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:38014->185.203.24.135:9009, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:38014->185.203.24.135:9009, len 60
forward: in:<pppoe-618_aniello.fimiani> out:sfp-sfpplus2_VS_AS, connection-state:new,snat src-mac fc:ec:da:22:ed:55, proto UDP, 10.0.254.36:41739->216.58.209.33:443, NAT (10.0.254.36:41739->185.203.27.253:41739)->216.58.209.33:443, len 1278
forward: in:<pppoe-618_aniello.fimiani> out:sfp-sfpplus2_VS_AS, connection-state:new,snat src-mac fc:ec:da:22:ed:55, proto UDP, 10.0.254.36:41739->216.58.209.33:443, NAT (10.0.254.36:41739->185.203.27.253:41739)->216.58.209.33:443, len 1278
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:40816->185.203.24.135:9006, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:40816->185.203.24.135:9006, len 60
forward: in:<pppoe-530_vincenzo.battipaglia> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.155:42344->192.178.156.188:5228, len 60
forward: in:<pppoe-530_vincenzo.battipaglia> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.155:42344->192.178.156.188:5228, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 212.22.128.117:26548->185.203.24.100:443, len 52
forward: in:<pppoe-giovanni.villani> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 18:e8:29:d8:4d:1b, proto UDP, 185.203.25.174:28855->216.58.209.34:443, len 1228
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:46774->185.203.24.135:61616, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:46774->185.203.24.135:61616, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:47078->185.203.24.135:28017, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:47078->185.203.24.135:28017, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:38668->185.203.24.135:10000, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:38668->185.203.24.135:10000, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-661_pasquale.cibelli>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.167:16865->185.203.25.186:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-661_pasquale.cibelli>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.167:16865->185.203.25.186:53, len 62
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:61:c7, proto UDP, 185.203.24.22:28865->8.8.8.8:53, len 217
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:61:c7, proto UDP, 185.203.24.22:28865->8.8.8.8:53, len 217
forward: in:<pppoe-934_enza.adinolfi> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 48:8f:5a:f7:54:43, proto UDP, 10.0.254.89:49972->157.240.231.15:443, len 1228
forward: in:<pppoe-934_enza.adinolfi> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 48:8f:5a:f7:54:43, proto UDP, 10.0.254.89:49972->157.240.231.15:443, len 1228
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-661_pasquale.cibelli>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 154.198.203.177:37799->185.203.25.186:1434, len 29
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-661_pasquale.cibelli>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 154.198.203.177:37799->185.203.25.186:1434, len 29
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 2.16.5.221:60481->185.203.26.77:53, len 72
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:56206->185.203.24.135:5555, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:56206->185.203.24.135:5555, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-910_michele.ferrara>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.150:46817->185.203.25.59:53, len 63
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-910_michele.ferrara>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.150:46817->185.203.25.59:53, len 63
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:59550->185.203.24.135:9007, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:59550->185.203.24.135:9007, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:42722->185.203.24.135:61617, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:42722->185.203.24.135:61617, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:37468->185.203.24.135:8888, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:37468->185.203.24.135:8888, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:50566->185.203.24.135:8899, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:50566->185.203.24.135:8899, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:57292->185.203.24.135:2020, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:57292->185.203.24.135:2020, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:51274->185.203.24.135:10443, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:51274->185.203.24.135:10443, len 60
forward: in:<pppoe-franco.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.77:53788->172.66.0.126:443, len 64
forward: in:<pppoe-franco.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.77:53792->172.66.0.126:443, len 64
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:61:c7, proto UDP, 185.203.24.22:35264->8.8.8.8:53, len 86
forward: in:<pppoe-franco.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.77:53792->172.66.0.126:443, len 64
forward: in:<pppoe-franco.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.77:53793->52.44.182.224:443, len 64
forward: in:<pppoe-franco.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.77:53793->52.44.182.224:443, len 64
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:55946->185.203.24.135:7777, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 118.31.248.145:55946->185.203.24.135:7777, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-910_michele.ferrara>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.150:46817->185.203.25.59:53, len 63
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-910_michele.ferrara>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.150:46817->185.203.25.59:53, len 63
forward: in:<pppoe-801_simone.marino> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 48:8f:5a:37:3a:a4, proto UDP, 10.0.254.148:51145->216.58.209.36:443, len 57
forward: in:<pppoe-801_simone.marino> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 48:8f:5a:37:3a:a4, proto UDP, 10.0.254.148:51145->216.58.209.36:443, len 57
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-cloud_team_system_2>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 170.247.29.96:27795->185.203.25.208:443, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-cloud_team_system_2>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 170.247.29.96:27795->185.203.25.208:443, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 65.49.51.58:24251->185.203.24.21:53, len 84
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 65.49.51.58:24251->185.203.24.21:53, len 84
forward: in:<pppoe-756_1398_carpentier> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.249.116:34054->2.21.54.101:80, len 52
forward: in:<pppoe-756_1398_carpentier> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.249.116:34054->2.21.54.101:80, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.60.250:52216->185.203.24.238:19570, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.60.250:52216->185.203.24.238:19570, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-1537_1211_fglsrl>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.83:49547->185.203.25.89:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-1537_1211_fglsrl>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.83:49547->185.203.25.89:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-comune.nocerasuperiore>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.167:50390->185.203.26.17:53, len 63
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-comune.nocerasuperiore>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.167:50390->185.203.26.17:53, len 63
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-1499_1146_campitiell>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.60.250:52216->185.203.26.79:20062, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-1499_1146_campitiell>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.60.250:52216->185.203.26.79:20062, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-929_agm.srl>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.62.134:58059->185.203.25.55:5985, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-929_agm.srl>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.62.134:58059->185.203.25.55:5985, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-910_michele.ferrara>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.243:48559->185.203.25.59:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-910_michele.ferrara>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.243:48559->185.203.25.59:53, len 62
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:f3:29, proto UDP, 185.203.24.23:9415->8.8.8.8:53, len 217
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:f3:29, proto UDP, 185.203.24.23:9415->8.8.8.8:53, len 217
forward: in:<pppoe-comune.nocerasuperiore> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:c8:96:e7, proto UDP, 185.203.26.17:53098->8.8.8.8:53, len 79
forward: in:<pppoe-comune.nocerasuperiore> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:c8:96:e7, proto UDP, 185.203.26.17:53098->8.8.8.8:53, len 79
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-910_michele.ferrara>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.243:48559->185.203.25.59:53, len 62
forward: in:<pppoe-caronte.hightek_01> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.233:55880->192.168.25.254:80, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 194.163.42.114:16665->185.203.26.77:53, len 65
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 194.163.42.114:16665->185.203.26.77:53, len 65
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:20272->2.42.225.140:443, len 52
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:20272->2.42.225.140:443, len 52
forward: in:<pppoe-cava.centro.sangiovanni> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 04:18:d6:24:ab:95, proto UDP, 185.203.25.206:2003->8.8.4.4:53, len 59
forward: in:<pppoe-cava.centro.sangiovanni> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 04:18:d6:24:ab:95, proto UDP, 185.203.25.206:2003->8.8.4.4:53, len 59
forward: in:<pppoe-cava.centro.sangiovanni> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 04:18:d6:24:ab:95, proto UDP, 185.203.25.206:62251->8.8.8.8:53, len 59
forward: in:<pppoe-cava.centro.sangiovanni> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 04:18:d6:24:ab:95, proto UDP, 185.203.25.206:62251->8.8.8.8:53, len 59
forward: in:<pppoe-cava.centro.sangiovanni> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 04:18:d6:24:ab:95, proto UDP, 185.203.25.206:7621->8.8.4.4:53, len 59
forward: in:<pppoe-cava.centro.sangiovanni> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 04:18:d6:24:ab:95, proto UDP, 185.203.25.206:7621->8.8.4.4:53, len 59
forward: in:<pppoe-cava.centro.sangiovanni> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 04:18:d6:24:ab:95, proto UDP, 185.203.25.206:20375->8.8.8.8:53, len 59
forward: in:<pppoe-cava.centro.sangiovanni> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 04:18:d6:24:ab:95, proto UDP, 185.203.25.206:20375->8.8.8.8:53, len 59
forward: in:<pppoe-cava.centro.sangiovanni> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 04:18:d6:24:ab:95, proto UDP, 185.203.25.206:28828->8.8.8.8:53, len 59
forward: in:<pppoe-cava.centro.sangiovanni> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 04:18:d6:24:ab:95, proto UDP, 185.203.25.206:28828->8.8.8.8:53, len 59
forward: in:<pppoe-cava.centro.sangiovanni> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 04:18:d6:24:ab:95, proto UDP, 185.203.25.206:39737->8.8.4.4:53, len 59
forward: in:<pppoe-cava.centro.sangiovanni> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 04:18:d6:24:ab:95, proto UDP, 185.203.25.206:39737->8.8.4.4:53, len 59
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:f3:29, proto UDP, 185.203.24.23:54021->8.8.8.8:53, len 220
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:f3:29, proto UDP, 185.203.24.23:54021->8.8.8.8:53, len 220
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-gaetano.dibenedetto>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.168:43296->185.203.25.204:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-gaetano.dibenedetto>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.168:43296->185.203.25.204:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 117.4.121.191:50826->185.203.24.149:445, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 117.4.121.191:50826->185.203.24.149:445, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-1016_teresa.damico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 147.185.132.183:49736->185.203.25.13:4024, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-1016_teresa.damico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 147.185.132.183:49736->185.203.25.13:4024, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-gaetano.dibenedetto>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.168:43296->185.203.25.204:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-gaetano.dibenedetto>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.168:43296->185.203.25.204:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-110_giancarlo.deprisco>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.58.142:55556->185.203.25.72:3551, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-110_giancarlo.deprisco>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.58.142:55556->185.203.25.72:3551, len 44
forward: in:<pppoe-934_enza.adinolfi> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 48:8f:5a:f7:54:43, proto UDP, 10.0.254.89:54238->157.240.231.1:443, len 1280
forward: in:<pppoe-934_enza.adinolfi> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 48:8f:5a:f7:54:43, proto UDP, 10.0.254.89:54238->157.240.231.1:443, len 1280
forward: in:<pppoe-franco.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.77:53794->52.95.115.255:443, len 64
forward: in:<pppoe-franco.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.77:53794->52.95.115.255:443, len 64
forward: in:<pppoe-franco.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.77:53797->142.250.180.162:443, len 64
forward: in:<pppoe-franco.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.77:53798->142.250.145.154:443, len 64
forward: in:<pppoe-franco.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.77:53798->142.250.145.154:443, len 64
forward: in:<pppoe-franco.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.77:53799->104.18.32.137:443, len 64
forward: in:<pppoe-franco.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.77:53799->104.18.32.137:443, len 64
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 148.251.31.230:27009->185.203.26.77:53, len 65
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 148.251.31.230:27009->185.203.26.77:53, len 65
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-cava.gioxiii.seg>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.234:44437->185.203.25.231:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-cava.gioxiii.seg>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.234:44437->185.203.25.231:53, len 62
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:5588->89.89.0.11:161, len 106
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:5588->89.89.0.11:161, len 106
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-cava.gioxiii.seg>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.234:44437->185.203.25.231:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-cava.gioxiii.seg>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.234:44437->185.203.25.231:53, len 62
forward: in:<pppoe-903_adalgisa.citro> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.29:58166->104.18.158.26:443, len 60
forward: in:<pppoe-903_adalgisa.citro> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.29:58166->104.18.158.26:443, len 60
forward: in:<pppoe-903_adalgisa.citro> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.29:47874->104.17.249.168:443, len 60
forward: in:<pppoe-903_adalgisa.citro> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.29:47874->104.17.249.168:443, len 60
forward: in:<pppoe-giovanni.villani> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 18:e8:29:d8:4d:1b, proto UDP, 185.203.25.174:28877->216.58.204.130:443, len 1276
forward: in:<pppoe-giovanni.villani> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 18:e8:29:d8:4d:1b, proto UDP, 185.203.25.174:28877->216.58.204.130:443, len 1276
forward: in:<pppoe-giovanni.villani> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 18:e8:29:d8:4d:1b, proto UDP, 185.203.25.174:28877->216.58.204.130:443, len 248
forward: in:<pppoe-giovanni.villani> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 18:e8:29:d8:4d:1b, proto UDP, 185.203.25.174:28877->216.58.204.130:443, len 248
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 194.163.42.114:10319->185.203.26.77:53, len 65
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 194.163.42.114:10319->185.203.26.77:53, len 65
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-661_pasquale.cibelli>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.251:46534->185.203.25.186:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-661_pasquale.cibelli>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.251:46534->185.203.25.186:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 199.45.155.71:51900->185.203.24.157:7072, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 199.45.155.71:51900->185.203.24.157:7072, len 60
forward: in:<pppoe-826_giuliano.senatore> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 50:91:e3:c6:a5:93, proto UDP, 10.0.254.95:40119->157.240.231.15:443, len 1260
forward: in:<pppoe-826_giuliano.senatore> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 50:91:e3:c6:a5:93, proto UDP, 10.0.254.95:40119->157.240.231.15:443, len 1260
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:61:c7, proto UDP, 185.203.24.22:6349->8.8.8.8:53, len 127
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:61:c7, proto UDP, 185.203.24.22:6349->8.8.8.8:53, len 127
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-929_agm.srl>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 176.65.149.67:35653->185.203.25.55:15166, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 212.22.128.117:60848->185.203.24.100:443, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 212.22.128.117:60848->185.203.24.100:443, len 52
forward: in:<pppoe-475_varone.felice> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:31:92:3d:c4:6b, proto UDP, 10.0.249.11:56298->8.8.8.8:53, len 77
forward: in:<pppoe-475_varone.felice> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:31:92:3d:c4:6b, proto UDP, 10.0.249.11:56298->8.8.8.8:53, len 77
forward: in:<pppoe-804_vincenzo.pagano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.147:51030->151.101.131.52:80, len 64
forward: in:<pppoe-804_vincenzo.pagano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.147:51030->151.101.131.52:80, len 64
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-1007_valentina.calvanese>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 35.203.210.15:53763->185.203.25.142:4345, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-1007_valentina.calvanese>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 35.203.210.15:53763->185.203.25.142:4345, len 44
forward: in:<pppoe-giovanni.villani> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.174:28749->216.58.205.38:443, len 64
forward: in:<pppoe-giovanni.villani> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.174:28749->216.58.205.38:443, len 64
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:61:c7, proto UDP, 185.203.24.22:38001->8.8.8.8:53, len 220
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:61:c7, proto UDP, 185.203.24.22:38001->8.8.8.8:53, len 220
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 45.95.183.64:11007->185.203.26.77:53, len 65
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 45.95.183.64:11007->185.203.26.77:53, len 65
forward: in:<pppoe-1332_945_costantino> out:sfp-sfpplus2_VS_AS, connection-state:new,snat src-mac 60:32:b1:17:9c:67, proto UDP, 10.0.254.40:57864->89.168.26.107:7635, NAT (10.0.254.40:57864->185.203.27.253:57864)->89.168.26.107:7635, len 122
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 199.45.155.71:51916->185.203.24.157:7072, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 199.45.155.71:51916->185.203.24.157:7072, len 60
forward: in:<pppoe-035_comune.csg.sedeftto> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.27.25:5623->62.149.128.179:995, len 52
forward: in:<pppoe-035_comune.csg.sedeftto> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.27.25:5623->62.149.128.179:995, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 194.163.42.114:18105->185.203.26.77:53, len 65
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 194.163.42.114:18105->185.203.26.77:53, len 65
forward: in:<pppoe-131_vinicola.siani> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 60:32:b1:17:a0:69, proto UDP, 10.0.249.201:51257->157.240.8.34:443, len 1280
forward: in:<pppoe-131_vinicola.siani> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 60:32:b1:17:a0:69, proto UDP, 10.0.249.201:51257->157.240.8.34:443, len 1280
forward: in:<pppoe-522_pasquale.palumbo> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 64:d1:54:4d:ad:e9, proto UDP, 185.203.25.85:48715->95.110.254.234:123, len 76
forward: in:<pppoe-522_pasquale.palumbo> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 64:d1:54:4d:ad:e9, proto UDP, 185.203.25.85:48715->95.110.254.234:123, len 76
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 170.247.28.0:49567->185.203.24.39:443, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 170.247.28.0:49567->185.203.24.39:443, len 52
forward: in:<pppoe-franco.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.77:53800->92.122.95.137:443, len 64
forward: in:<pppoe-franco.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.77:53800->92.122.95.137:443, len 64
forward: in:<pppoe-franco.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.77:53801->150.171.22.12:443, len 64
forward: in:<pppoe-franco.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.77:53801->150.171.22.12:443, len 64
forward: in:<pppoe-franco.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.77:53802->54.73.151.222:443, len 64
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:25247->190.85.86.177:445, len 52
forward: in:<pppoe-804_vincenzo.pagano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.147:51031->151.101.131.52:80, len 64
forward: in:<pppoe-804_vincenzo.pagano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.147:51031->151.101.131.52:80, len 64
forward: in:<pppoe-caronte.hightek_01> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.233:58392->192.168.25.254:80, len 60
forward: in:<pppoe-caronte.hightek_01> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.233:58392->192.168.25.254:80, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-1537_1211_fglsrl>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.93:12393->185.203.25.89:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-1537_1211_fglsrl>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.93:12393->185.203.25.89:53, len 62
forward: in:<pppoe-sergio.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.128:49671->92.122.95.129:80, len 52
forward: in:<pppoe-sergio.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.128:49671->92.122.95.129:80, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 170.247.28.174:46188->185.203.24.25:443, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 170.247.28.174:46188->185.203.24.25:443, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:sfp-sfpplus1_VS_FTTO, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.58.90:55467->185.203.26.34:6248, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:sfp-sfpplus1_VS_FTTO, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.58.90:55467->185.203.26.34:6248, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-910_michele.ferrara>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.99:48617->185.203.25.59:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-910_michele.ferrara>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.99:48617->185.203.25.59:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 35.203.211.137:51683->185.203.24.161:9111, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 35.203.211.137:51683->185.203.24.161:9111, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.58.142:55556->185.203.24.39:3065, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-661_pasquale.cibelli>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 123.136.80.5:25041->185.203.25.186:1434, len 29
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-661_pasquale.cibelli>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 123.136.80.5:25041->185.203.25.186:1434, len 29
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-comune.nocerasuperiore>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.212:43547->185.203.26.17:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-comune.nocerasuperiore>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.212:43547->185.203.26.17:53, len 62
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:61:c7, proto UDP, 185.203.24.22:15484->8.8.8.8:53, len 220
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:61:c7, proto UDP, 185.203.24.22:15484->8.8.8.8:53, len 220
forward: in:<pppoe-hightek.router.new> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.24.17:2624->154.12.226.43:7704, len 52
forward: in:<pppoe-hightek.router.new> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.24.17:2624->154.12.226.43:7704, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-gaetano.dibenedetto>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.98:13394->185.203.25.204:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-gaetano.dibenedetto>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.98:13394->185.203.25.204:53, len 62
forward: in:<pppoe-834_daniela.barticel> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac b8:69:f4:f7:b5:c0, proto UDP, 10.0.249.123:59807->8.8.8.8:53, len 66
forward: in:<pppoe-834_daniela.barticel> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac b8:69:f4:f7:b5:c0, proto UDP, 10.0.249.123:59807->8.8.8.8:53, len 66
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.58.142:55556->185.203.24.224:4759, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.58.142:55556->185.203.24.224:4759, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 109.94.96.215:7101->185.203.24.158:80, len 64
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 109.94.96.215:7101->185.203.24.158:80, len 64
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:28382->185.203.98.145:445, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 170.247.31.49:34298->185.203.24.37:443, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 148.251.31.230:25561->185.203.26.77:53, len 65
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 148.251.31.230:25561->185.203.26.77:53, len 65
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:b5:15, proto TCP (SYN), 185.203.24.37:54910->31.7.144.29:8449, len 60
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:b5:15, proto TCP (SYN), 185.203.24.37:54910->31.7.144.29:8449, len 60
forward: in:<pppoe-934_enza.adinolfi> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 48:8f:5a:f7:54:43, proto UDP, 10.0.254.89:62881->157.240.231.15:443, len 1228
forward: in:<pppoe-934_enza.adinolfi> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 48:8f:5a:f7:54:43, proto UDP, 10.0.254.89:62881->157.240.231.15:443, len 1228
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 167.94.138.189:57144->185.203.24.134:8291, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 167.94.138.189:57144->185.203.24.134:8291, len 60
forward: in:<pppoe-giovanni.villani> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.174:28750->23.22.48.139:443, len 64
forward: in:<pppoe-giovanni.villani> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.174:28750->23.22.48.139:443, len 64
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new,dnat src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.56.186:46068->10.1.13.200:35252, NAT 79.124.56.186:46068->(185.203.24.5:35252->10.1.13.200:35252), len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new,dnat src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.56.186:46068->10.1.13.200:35252, NAT 79.124.56.186:46068->(185.203.24.5:35252->10.1.13.200:35252), len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 45.95.183.64:6743->185.203.26.77:53, len 65
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 45.95.183.64:6743->185.203.26.77:53, len 65
forward: in:ether6_RB_CED out:sfp-sfpplus2_VS_AS, connection-state:new,snat src-mac e4:8d:8c:03:f9:56, proto TCP (SYN), 10.1.0.254:57492->188.12.219.20:8291, NAT (10.1.0.254:57492->185.203.27.253:57492)->188.12.219.20:8291, len 60
forward: in:ether6_RB_CED out:sfp-sfpplus2_VS_AS, connection-state:new,snat src-mac e4:8d:8c:03:f9:56, proto TCP (SYN), 10.1.0.254:57492->188.12.219.20:8291, NAT (10.1.0.254:57492->185.203.27.253:57492)->188.12.219.20:8291, len 60
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:61:c7, proto UDP, 185.203.24.22:55262->8.8.8.8:53, len 220
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:61:c7, proto UDP, 185.203.24.22:55262->8.8.8.8:53, len 220
forward: in:<pppoe-475_varone.felice> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.249.11:49708->51.124.78.146:443, len 52
forward: in:<pppoe-475_varone.felice> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.249.11:49708->51.124.78.146:443, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 148.251.31.230:8079->185.203.26.77:53, len 65
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 148.251.31.230:8079->185.203.26.77:53, len 65
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:59573->185.231.59.101:445, len 52
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:59573->185.231.59.101:445, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 176.65.149.55:35049->185.203.24.123:27273, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 176.65.149.55:35049->185.203.24.123:27273, len 44
forward: in:<pppoe-salvatore.lanzara> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.249.49:60856->142.251.31.109:993, len 64
forward: in:<pppoe-salvatore.lanzara> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.249.49:60856->142.251.31.109:993, len 64
forward: in:<pppoe-131_vinicola.siani> out:sfp-sfpplus2_VS_AS, connection-state:new,snat src-mac 60:32:b1:17:a0:69, proto UDP, 10.0.249.201:51257->157.240.8.34:443, NAT (10.0.249.201:51257->185.203.27.253:51257)->157.240.8.34:443, len 1280
forward: in:<pppoe-131_vinicola.siani> out:sfp-sfpplus2_VS_AS, connection-state:new,snat src-mac 60:32:b1:17:a0:69, proto UDP, 10.0.249.201:51257->157.240.8.34:443, NAT (10.0.249.201:51257->185.203.27.253:51257)->157.240.8.34:443, len 1280
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:61:c7, proto UDP, 185.203.24.22:33489->8.8.8.8:53, len 91
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:61:c7, proto UDP, 185.203.24.22:33489->8.8.8.8:53, len 91
forward: in:<pppoe-1326_938_eurobusine> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac dc:2c:6e:3b:d3:98, proto UDP, 185.203.26.39:60917->23.216.150.169:443, len 1278
forward: in:<pppoe-1326_938_eurobusine> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac dc:2c:6e:3b:d3:98, proto UDP, 185.203.26.39:60917->23.216.150.169:443, len 1278
forward: in:<pppoe-1326_938_eurobusine> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac dc:2c:6e:3b:d3:98, proto UDP, 185.203.26.39:60917->23.216.150.169:443, len 1278
forward: in:<pppoe-1326_938_eurobusine> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.39:54758->20.189.173.11:443, len 52
forward: in:<pppoe-1326_938_eurobusine> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.39:54758->20.189.173.11:443, len 52
forward: in:<pppoe-1326_938_eurobusine> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.39:54759->150.171.27.10:443, len 52
forward: in:<pppoe-1326_938_eurobusine> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.39:54759->150.171.27.10:443, len 52
forward: in:<pppoe-1326_938_eurobusine> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.39:54760->13.107.246.43:443, len 52
forward: in:<pppoe-1326_938_eurobusine> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.39:54760->13.107.246.43:443, len 52
forward: in:<pppoe-795_giuseppe.diblasi> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.92:49082->91.81.128.35:443, len 60
forward: in:<pppoe-795_giuseppe.diblasi> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.92:49082->91.81.128.35:443, len 60
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:24721->20.101.57.9:123, len 76
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:24721->20.101.57.9:123, len 76
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-datev.router>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 47.251.47.128:46266->185.203.25.199:53, len 70
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-datev.router>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 47.251.47.128:46266->185.203.25.199:53, len 70
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:f3:29, proto UDP, 185.203.24.23:14732->8.8.8.8:53, len 220
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 123.129.132.101:59390->185.203.24.22:8080, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 123.129.132.101:59390->185.203.24.22:8080, len 60
forward: in:<pppoe-anna.lamberti> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.28:46370->3.165.255.7:80, len 60
forward: in:<pppoe-anna.lamberti> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.28:46370->3.165.255.7:80, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 170.247.29.182:2965->185.203.24.251:443, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 170.247.29.182:2965->185.203.24.251:443, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-1537_1211_fglsrl>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.5:14980->185.203.25.89:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-1537_1211_fglsrl>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.5:14980->185.203.25.89:53, len 62
forward: in:<pppoe-cava.centro.sangiovanni> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 04:18:d6:24:ab:95, proto UDP, 185.203.25.206:46030->8.8.4.4:53, len 51
forward: in:<pppoe-cava.centro.sangiovanni> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 04:18:d6:24:ab:95, proto UDP, 185.203.25.206:46030->8.8.4.4:53, len 51
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.40.130:56215->185.203.24.197:34443, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.40.130:56215->185.203.24.197:34443, len 44
forward: in:<pppoe-1332_945_costantino> out:sfp-sfpplus2_VS_AS, connection-state:new,snat src-mac 60:32:b1:17:9c:67, proto UDP, 10.0.254.40:57864->89.168.26.107:7635, NAT (10.0.254.40:57864->185.203.27.253:57864)->89.168.26.107:7635, len 64
forward: in:<pppoe-1332_945_costantino> out:sfp-sfpplus2_VS_AS, connection-state:new,snat src-mac 60:32:b1:17:9c:67, proto UDP, 10.0.254.40:57864->89.168.26.107:7635, NAT (10.0.254.40:57864->185.203.27.253:57864)->89.168.26.107:7635, len 64
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 170.247.28.27:26521->185.203.24.15:443, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 170.247.28.27:26521->185.203.24.15:443, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-1499_1146_campitiell>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 162.19.80.39:47582->185.203.26.79:2543, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-1499_1146_campitiell>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 162.19.80.39:47582->185.203.26.79:2543, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 176.65.148.16:39546->185.203.24.158:85, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 66.132.153.112:6026->185.203.24.193:7993, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 66.132.153.112:6026->185.203.24.193:7993, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 81.30.107.146:22962->185.203.24.93:587, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 81.30.107.146:22962->185.203.24.93:587, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-910_michele.ferrara>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.240:44113->185.203.25.59:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-910_michele.ferrara>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.240:44113->185.203.25.59:53, len 62
forward: in:<pppoe-934_enza.adinolfi> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 48:8f:5a:f7:54:43, proto UDP, 10.0.254.89:60703->157.240.231.1:443, len 1280
forward: in:<pppoe-934_enza.adinolfi> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 48:8f:5a:f7:54:43, proto UDP, 10.0.254.89:60703->157.240.231.1:443, len 1280
forward: in:<pppoe-934_enza.adinolfi> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.89:64365->17.253.53.207:443, len 64
forward: in:<pppoe-934_enza.adinolfi> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.89:64365->17.253.53.207:443, len 64
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-caronte.hightek_01>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 170.247.30.5:58532->185.203.25.233:443, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-caronte.hightek_01>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 170.247.30.5:58532->185.203.25.233:443, len 52
forward: in:<pppoe-1467_1111_parisianto> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.43:48244->23.227.39.200:443, len 60
forward: in:<pppoe-1467_1111_parisianto> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.43:48244->23.227.39.200:443, len 60
forward: in:<pppoe-1467_1111_parisianto> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.43:48254->23.227.39.200:443, len 60
forward: in:<pppoe-1467_1111_parisianto> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.43:48254->23.227.39.200:443, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-caronte.hightek_01>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.58.142:55556->185.203.25.235:1209, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-caronte.hightek_01>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.58.142:55556->185.203.25.235:1209, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-gaetano.dibenedetto>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.235:15527->185.203.25.204:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-gaetano.dibenedetto>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.235:15527->185.203.25.204:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-gaetano.dibenedetto>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.235:15527->185.203.25.204:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-gaetano.dibenedetto>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.235:15527->185.203.25.204:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 103.102.230.4:41819->185.203.24.72:8728, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 103.102.230.4:41819->185.203.24.72:8728, len 44
forward: in:<pppoe-666_settimio.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:b2:73:13, proto UDP, 185.203.25.195:42462->8.8.8.8:53, len 61
forward: in:<pppoe-666_settimio.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:b2:73:13, proto UDP, 185.203.25.195:42462->8.8.8.8:53, len 61
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 74.125.45.102:48342->185.203.24.95:993, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 74.125.45.102:48342->185.203.24.95:993, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 65.108.210.26:30513->185.203.26.77:53, len 65
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 65.108.210.26:30513->185.203.26.77:53, len 65
forward: in:<pppoe-1326_938_eurobusine> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.39:54761->150.171.27.12:443, len 52
forward: in:<pppoe-1326_938_eurobusine> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.39:54761->150.171.27.12:443, len 52
forward: in:<pppoe-1326_938_eurobusine> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac dc:2c:6e:3b:d3:98, proto UDP, 185.203.26.39:50585->95.101.34.74:443, len 1278
forward: in:<pppoe-1326_938_eurobusine> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac dc:2c:6e:3b:d3:98, proto UDP, 185.203.26.39:56661->23.216.150.145:443, len 1278
forward: in:<pppoe-1326_938_eurobusine> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac dc:2c:6e:3b:d3:98, proto UDP, 185.203.26.39:56661->23.216.150.145:443, len 1278
forward: in:<pppoe-1326_938_eurobusine> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac dc:2c:6e:3b:d3:98, proto UDP, 185.203.26.39:56661->23.216.150.145:443, len 1278
forward: in:<pppoe-1326_938_eurobusine> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.39:54762->108.139.210.6:443, len 52
forward: in:<pppoe-1326_938_eurobusine> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.39:54762->108.139.210.6:443, len 52
forward: in:<pppoe-1326_938_eurobusine> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac dc:2c:6e:3b:d3:98, proto UDP, 185.203.26.39:52298->95.101.34.74:443, len 1278
forward: in:<pppoe-1326_938_eurobusine> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac dc:2c:6e:3b:d3:98, proto UDP, 185.203.26.39:52298->95.101.34.74:443, len 1278
forward: in:<pppoe-934_enza.adinolfi> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 48:8f:5a:f7:54:43, proto UDP, 10.0.254.89:59289->157.240.231.35:443, len 1228
forward: in:<pppoe-934_enza.adinolfi> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 48:8f:5a:f7:54:43, proto UDP, 10.0.254.89:59289->157.240.231.35:443, len 1228
forward: in:<pppoe-1326_938_eurobusine> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac dc:2c:6e:3b:d3:98, proto UDP, 185.203.26.39:52298->95.101.34.74:443, len 1278
forward: in:<pppoe-1326_938_eurobusine> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac dc:2c:6e:3b:d3:98, proto UDP, 185.203.26.39:52298->95.101.34.74:443, len 1278
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-cava.gioxiii.seg>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.184:16574->185.203.25.231:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-cava.gioxiii.seg>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.184:16574->185.203.25.231:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 170.247.29.255:61416->185.203.24.39:443, len 52
forward: in:<pppoe-903_adalgisa.citro> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.29:38558->3.165.255.33:443, len 60
forward: in:<pppoe-903_adalgisa.citro> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.29:38558->3.165.255.33:443, len 60
forward: in:<pppoe-628_1218_fierroassu> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac e4:38:83:be:c1:2d, proto UDP, 10.0.254.80:36171->8.8.8.8:53, len 65
forward: in:<pppoe-628_1218_fierroassu> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac e4:38:83:be:c1:2d, proto UDP, 10.0.254.80:36171->8.8.8.8:53, len 65
forward: in:<pppoe-628_1218_fierroassu> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac e4:38:83:be:c1:2d, proto UDP, 10.0.254.80:13308->8.8.8.8:53, len 65
forward: in:<pppoe-628_1218_fierroassu> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac e4:38:83:be:c1:2d, proto UDP, 10.0.254.80:13308->8.8.8.8:53, len 65
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-cava.gioxiii.seg>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.184:16574->185.203.25.231:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-cava.gioxiii.seg>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.184:16574->185.203.25.231:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-alfabitomega>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 176.65.149.64:54916->185.203.24.2:21239, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-alfabitomega>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 176.65.149.64:54916->185.203.24.2:21239, len 44
forward: in:<pppoe-caronte.hightek_01> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.233:51958->192.168.25.254:80, len 60
forward: in:<pppoe-caronte.hightek_01> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.233:51958->192.168.25.254:80, len 60
forward: in:<pppoe-666_settimio.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:b2:73:13, proto UDP, 185.203.25.195:48276->8.8.8.8:53, len 61
forward: in:<pppoe-666_settimio.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:b2:73:13, proto UDP, 185.203.25.195:48276->8.8.8.8:53, len 61
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 45.95.183.64:17223->185.203.26.77:53, len 65
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 45.95.183.64:17223->185.203.26.77:53, len 65
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 81.30.107.15:60216->185.203.24.93:587, len 60
forward: in:<pppoe-1400_1029_trasportig> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.249.141:8308->95.100.171.16:443, len 52
forward: in:<pppoe-035_comune.csg.sedeftto> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.27.25:57375->3.71.153.246:10051, len 60
forward: in:<pppoe-035_comune.csg.sedeftto> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.27.25:57375->3.71.153.246:10051, len 60
forward: in:<pppoe-666_settimio.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:b2:73:13, proto UDP, 185.203.25.195:47837->8.8.8.8:53, len 61
forward: in:<pppoe-666_settimio.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:b2:73:13, proto UDP, 185.203.25.195:47837->8.8.8.8:53, len 61
forward: in:<pppoe-1415_1047_orlandolui> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 08:55:31:ba:0d:54, proto TCP (ACK,PSH), 10.0.254.120:58642->157.240.209.38:443, len 76
forward: in:<pppoe-1415_1047_orlandolui> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 08:55:31:ba:0d:54, proto TCP (ACK,PSH), 10.0.254.120:58642->157.240.209.38:443, len 76
forward: in:<pppoe-caronte.hightek_01> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.233:55888->192.168.25.254:80, len 60
forward: in:<pppoe-caronte.hightek_01> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.233:55888->192.168.25.254:80, len 60
forward: in:<pppoe-903_adalgisa.citro> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.29:48996->51.92.2.118:443, len 60
forward: in:<pppoe-903_adalgisa.citro> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.29:48996->51.92.2.118:443, len 60
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:f3:29, proto UDP, 185.203.24.23:32705->8.8.8.8:53, len 199
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:88:f3:29, proto UDP, 185.203.24.23:32705->8.8.8.8:53, len 199
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-661_pasquale.cibelli>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.137:50226->185.203.25.186:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-661_pasquale.cibelli>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.137:50226->185.203.25.186:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 192.159.99.180:58221->185.203.24.36:9091, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 192.159.99.180:58221->185.203.24.36:9091, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-716_onofrio.menichini>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 213.209.143.64:50166->185.203.25.159:80, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-716_onofrio.menichini>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 213.209.143.64:50166->185.203.25.159:80, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 148.251.31.230:2503->185.203.26.77:53, len 65
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 148.251.31.230:2503->185.203.26.77:53, len 65
forward: in:<pppoe-882_francesco.canzolino> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.183:53372->3.223.15.108:5222, len 60
forward: in:<pppoe-882_francesco.canzolino> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.183:53372->3.223.15.108:5222, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-comune.nocerasuperiore>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.60.250:52216->185.203.26.24:61819, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-comune.nocerasuperiore>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.60.250:52216->185.203.26.24:61819, len 44
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:35722->185.231.59.101:445, len 52
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.26.201:35722->185.231.59.101:445, len 52
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:b2:c9:3f, proto UDP, 185.203.24.93:53055->8.8.8.8:53, len 71
forward: in:VLAN53_PPOE_DATACENTER out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:b2:c9:3f, proto UDP, 185.203.24.93:53055->8.8.8.8:53, len 71
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 108.167.178.116:60000->185.203.24.214:1143, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 108.167.178.116:60000->185.203.24.214:1143, len 44
forward: in:<pppoe-pietro.lucido> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.93:42090->44.219.18.249:443, len 60
forward: in:<pppoe-pietro.lucido> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.93:42090->44.219.18.249:443, len 60
forward: in:<pppoe-666_settimio.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:b2:73:13, proto UDP, 185.203.25.195:54171->8.8.8.8:53, len 61
forward: in:<pppoe-666_settimio.alfano> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 00:50:56:b2:73:13, proto UDP, 185.203.25.195:54171->8.8.8.8:53, len 61
forward: in:<pppoe-1523_1185_casaburisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 24:a4:3c:e0:e6:b1, proto TCP (ACK,PSH), 10.0.254.28:50616->216.58.204.150:443, len 76
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:38653->172.19.96.81:45473, len 3346
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:38653->172.19.96.81:45473, len 3346
forward: in:<pppoe-131_vinicola.siani> out:sfp-sfpplus2_VS_AS, connection-state:new,snat src-mac 60:32:b1:17:a0:69, proto UDP, 10.0.249.201:51257->157.240.8.34:443, NAT (10.0.249.201:51257->185.203.27.253:51257)->157.240.8.34:443, len 1280
forward: in:<pppoe-131_vinicola.siani> out:sfp-sfpplus2_VS_AS, connection-state:new,snat src-mac 60:32:b1:17:a0:69, proto UDP, 10.0.249.201:51257->157.240.8.34:443, NAT (10.0.249.201:51257->185.203.27.253:51257)->157.240.8.34:443, len 1280
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-1537_1211_fglsrl>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.121:45373->185.203.25.89:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-1537_1211_fglsrl>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.121:45373->185.203.25.89:53, len 62
forward: in:<pppoe-guglielmo.cataldo> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 4c:5e:0c:14:c0:b4, proto UDP, 185.203.25.254:55295->8.8.8.8:53, len 60
forward: in:<pppoe-guglielmo.cataldo> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 4c:5e:0c:14:c0:b4, proto UDP, 185.203.25.254:55295->8.8.8.8:53, len 60
forward: in:<pppoe-guglielmo.cataldo> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 4c:5e:0c:14:c0:b4, proto UDP, 185.203.25.254:55295->8.8.8.8:53, len 60
forward: in:<pppoe-guglielmo.cataldo> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 4c:5e:0c:14:c0:b4, proto UDP, 185.203.25.254:55295->8.8.8.8:53, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-1537_1211_fglsrl>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.121:45373->185.203.25.89:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-1537_1211_fglsrl>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.121:45373->185.203.25.89:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:sfp-sfpplus1_VS_FTTO, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 82.62.84.108:59251->185.203.26.34:8472, len 96
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:sfp-sfpplus1_VS_FTTO, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 82.62.84.108:59251->185.203.26.34:8472, len 96
forward: in:<pppoe-1087_michele.ponticelli> out:sfp-sfpplus2_VS_AS, connection-state:new,snat src-mac 08:55:31:b6:61:38, proto UDP, 10.0.254.205:49151->54.216.172.252:1789, NAT (10.0.254.205:49151->185.203.27.253:49151)->54.216.172.252:1789, len 92
forward: in:<pppoe-1087_michele.ponticelli> out:sfp-sfpplus2_VS_AS, connection-state:new,snat src-mac 08:55:31:b6:61:38, proto UDP, 10.0.254.205:49151->54.216.172.252:1789, NAT (10.0.254.205:49151->185.203.27.253:49151)->54.216.172.252:1789, len 92
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-comune.nocerasuperiore>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.217:46562->185.203.26.17:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-comune.nocerasuperiore>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.217:46562->185.203.26.17:53, len 62
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:60670->172.19.96.81:45473, len 3227
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:60670->172.19.96.81:45473, len 3227
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:59567->172.19.96.81:45473, len 3227
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:59567->172.19.96.81:45473, len 3227
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:63204->172.19.96.81:45473, len 2877
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:63204->172.19.96.81:45473, len 2877
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:22166->172.19.96.81:45473, len 2867
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:22166->172.19.96.81:45473, len 2867
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:22166->172.19.96.81:45473, len 2867
forward: in:ether6_RB_CED out:sfp-sfpplus2_VS_AS, connection-state:new src-mac e4:8d:8c:03:f9:56, proto UDP, 10.1.0.254:37832->79.11.43.150:161, len 75
forward: in:<pppoe-919_vincenzo.muro> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.135:45904->96.47.5.157:4431, len 60
forward: in:<pppoe-919_vincenzo.muro> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.135:45904->96.47.5.157:4431, len 60
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:34275->172.19.96.81:45473, len 2873
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:34275->172.19.96.81:45473, len 2873
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:34275->172.19.96.81:45473, len 2873
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:34275->172.19.96.81:45473, len 2873
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-gaetano.dibenedetto>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.103:15601->185.203.25.204:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-gaetano.dibenedetto>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.103:15601->185.203.25.204:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-caronte.hightek_01>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 170.247.30.11:34450->185.203.25.237:443, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-caronte.hightek_01>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 170.247.30.11:34450->185.203.25.237:443, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 162.142.125.247:16762->185.203.24.242:39822, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 162.142.125.247:16762->185.203.24.242:39822, len 44
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:31872->89.89.0.16:161, len 106
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:31872->89.89.0.16:161, len 106
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 199.45.154.150:40778->185.203.24.174:7780, len 60
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:37613->172.19.96.81:45473, len 3344
forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:37613->172.19.96.81:45473, len 3344
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-cava.gioxiii.seg>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.82:16873->185.203.25.231:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-cava.gioxiii.seg>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.82:16873->185.203.25.231:53, len 62
forward: in:<pppoe-1471_1115_nappicarol> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 84:d8:1b:68:6a:cc, proto UDP, 10.0.254.67:53391->1.96.163.132:123, len 76
forward: in:<pppoe-1471_1115_nappicarol> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 84:d8:1b:68:6a:cc, proto UDP, 10.0.254.67:53391->1.96.163.132:123, len 76
forward: in:<pppoe-893_giovanna.dacunzi> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.249.252:50892->192.168.1.234:55443, len 60
forward: in:<pppoe-893_giovanna.dacunzi> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.249.252:50892->192.168.1.234:55443, len 60
forward: in:<pppoe-893_giovanna.dacunzi> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.249.252:50894->192.168.1.234:55443, len 60
forward: in:<pppoe-893_giovanna.dacunzi> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.249.252:50894->192.168.1.234:55443, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.58.142:55556->185.203.24.69:3065, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.58.142:55556->185.203.24.69:3065, len 44
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 162.19.80.39:44025->185.203.24.209:2543, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 162.19.80.39:44025->185.203.24.209:2543, len 52
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-cava.gioxiii.seg>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.82:16873->185.203.25.231:53, len 62
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-cava.gioxiii.seg>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 164.163.5.82:16873->185.203.25.231:53, len 62
^C


@@ -0,0 +1,51 @@
journalctl -u ids-list-fetcher -n 50 --no-pager
Jan 02 15:30:01 ids.alfacom.it ids-list-fetcher[9296]: Skipped (whitelisted): 0
Jan 02 15:30:01 ids.alfacom.it ids-list-fetcher[9296]: ============================================================
Jan 02 15:30:01 ids.alfacom.it systemd[1]: ids-list-fetcher.service: Deactivated successfully.
Jan 02 15:30:01 ids.alfacom.it systemd[1]: Finished IDS Public Lists Fetcher Service.
Jan 02 15:40:00 ids.alfacom.it systemd[1]: Starting IDS Public Lists Fetcher Service...
Jan 02 15:40:00 ids.alfacom.it ids-list-fetcher[9493]: ============================================================
Jan 02 15:40:00 ids.alfacom.it ids-list-fetcher[9493]: [2026-01-02 15:40:00] PUBLIC LISTS SYNC
Jan 02 15:40:00 ids.alfacom.it ids-list-fetcher[9493]: ============================================================
Jan 02 15:40:00 ids.alfacom.it ids-list-fetcher[9493]: Found 2 enabled lists
Jan 02 15:40:00 ids.alfacom.it ids-list-fetcher[9493]: [15:40:00] Downloading Spamhaus from https://www.spamhaus.org/drop/drop_v4.json...
Jan 02 15:40:00 ids.alfacom.it ids-list-fetcher[9493]: [15:40:00] Downloading AWS from https://ip-ranges.amazonaws.com/ip-ranges.json...
Jan 02 15:40:00 ids.alfacom.it ids-list-fetcher[9493]: [15:40:00] Parsing AWS...
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: [15:40:01] Found 9548 IPs, syncing to database...
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: [15:40:01] ✓ AWS: +0 -0 ~9511
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: [15:40:01] Parsing Spamhaus...
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: [15:40:01] Found 1468 IPs, syncing to database...
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: [15:40:01] ✓ Spamhaus: +0 -0 ~1464
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: ============================================================
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: SYNC SUMMARY
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: ============================================================
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: Success: 2/2
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: Errors: 0/2
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: Total IPs Added: 0
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: Total IPs Removed: 0
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: ============================================================
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: ============================================================
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: RUNNING MERGE LOGIC
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: ============================================================
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: ERROR:merge_logic:Failed to cleanup detections: operator does not exist: inet = text
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: LINE 9: d.source_ip::inet = wl.ip_inet
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: ^
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: HINT: No operator matches the given name and argument types. You might need to add explicit type casts.
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: ERROR:merge_logic:Failed to sync detections: operator does not exist: inet = text
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: LINE 29: bl.ip_inet = wl.ip_inet
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: ^
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: HINT: No operator matches the given name and argument types. You might need to add explicit type casts.
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: Traceback (most recent call last):
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: File "/opt/ids/python_ml/merge_logic.py", line 264, in sync_public_blacklist_detections
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: cur.execute("""
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: psycopg2.errors.UndefinedFunction: operator does not exist: inet = text
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: LINE 29: bl.ip_inet = wl.ip_inet
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: ^
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: HINT: No operator matches the given name and argument types. You might need to add explicit type casts.
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: Merge Logic Stats:
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: Created detections: 0
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: Cleaned invalid detections: 0
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: Skipped (whitelisted): 0
Jan 02 15:40:01 ids.alfacom.it ids-list-fetcher[9493]: ============================================================
Jan 02 15:40:01 ids.alfacom.it systemd[1]: ids-list-fetcher.service: Deactivated successfully.
Jan 02 15:40:01 ids.alfacom.it systemd[1]: Finished IDS Public Lists Fetcher Service.
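The `operator does not exist: inet = text` failures above come from the two queries in merge_logic.py that compare detection and blacklist rows against the whitelist: one side of each comparison is an `inet` column, the other plain `text`. Below is a hypothetical reconstruction of the two failing comparisons with the explicit cast the HINT asks for — `wl.ip_inet` is evidently stored as text, so that side gets the `::inet` cast. Only the aliases and column names come from the error output; the table names and overall query shape are assumptions, not copied from the repo.

```sql
-- Sketch, not merge_logic.py's actual SQL: cast the text side so PostgreSQL
-- can resolve the inet = inet operator.

-- line 9: cleanup of detections already covered by the whitelist
DELETE FROM detections d
USING whitelist wl
WHERE d.source_ip::inet = wl.ip_inet::inet;

-- line 29: blacklist rows joined against the whitelist
SELECT bl.ip_inet
FROM public_blacklist_ips bl
LEFT JOIN whitelist wl ON bl.ip_inet = wl.ip_inet::inet
WHERE wl.ip_inet IS NULL;
```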


@@ -0,0 +1,51 @@
journalctl -u ids-list-fetcher -n 50 --no-pager
Jan 02 17:10:02 ids.alfacom.it ids-list-fetcher[2139]: ============================================================
Jan 02 17:10:02 ids.alfacom.it ids-list-fetcher[2139]: ============================================================
Jan 02 17:10:02 ids.alfacom.it ids-list-fetcher[2139]: RUNNING MERGE LOGIC
Jan 02 17:10:02 ids.alfacom.it ids-list-fetcher[2139]: ============================================================
Jan 02 17:10:12 ids.alfacom.it ids-list-fetcher[2139]: INFO:merge_logic:Bulk sync complete: {'created': 0, 'cleaned': 0, 'skipped_whitelisted': 0}
Jan 02 17:10:12 ids.alfacom.it ids-list-fetcher[2139]: Merge Logic Stats:
Jan 02 17:10:12 ids.alfacom.it ids-list-fetcher[2139]: Created detections: 0
Jan 02 17:10:12 ids.alfacom.it ids-list-fetcher[2139]: Cleaned invalid detections: 0
Jan 02 17:10:12 ids.alfacom.it ids-list-fetcher[2139]: Skipped (whitelisted): 0
Jan 02 17:10:12 ids.alfacom.it ids-list-fetcher[2139]: ============================================================
Jan 02 17:10:12 ids.alfacom.it systemd[1]: ids-list-fetcher.service: Deactivated successfully.
Jan 02 17:10:12 ids.alfacom.it systemd[1]: Finished IDS Public Lists Fetcher Service.
Jan 02 17:12:35 ids.alfacom.it systemd[1]: Starting IDS Public Lists Fetcher Service...
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: ============================================================
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: [2026-01-02 17:12:35] PUBLIC LISTS SYNC
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: ============================================================
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: Found 4 enabled lists
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: [17:12:35] Downloading Spamhaus from https://www.spamhaus.org/drop/drop_v4.json...
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: [17:12:35] Downloading AWS from https://ip-ranges.amazonaws.com/ip-ranges.json...
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: [17:12:35] Downloading Google Cloud from https://www.gstatic.com/ipranges/cloud.json...
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: [17:12:35] Downloading Google globali from https://www.gstatic.com/ipranges/goog.json...
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: [17:12:35] Parsing AWS...
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: [17:12:35] Found 9548 IPs, syncing to database...
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: [17:12:35] ✓ AWS: +0 -0 ~9548
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: [17:12:35] Parsing Google globali...
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: [17:12:35] ✗ Google globali: No valid IPs found in list
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: [17:12:35] Parsing Google Cloud...
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: [17:12:35] ✗ Google Cloud: No valid IPs found in list
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: [17:12:35] Parsing Spamhaus...
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: [17:12:35] Found 1468 IPs, syncing to database...
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: [17:12:35] ✓ Spamhaus: +0 -0 ~1468
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: ============================================================
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: SYNC SUMMARY
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: ============================================================
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: Success: 2/4
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: Errors: 2/4
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: Total IPs Added: 0
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: Total IPs Removed: 0
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: ============================================================
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: ============================================================
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: RUNNING MERGE LOGIC
Jan 02 17:12:35 ids.alfacom.it ids-list-fetcher[2279]: ============================================================
Jan 02 17:12:45 ids.alfacom.it ids-list-fetcher[2279]: INFO:merge_logic:Bulk sync complete: {'created': 0, 'cleaned': 0, 'skipped_whitelisted': 0}
Jan 02 17:12:45 ids.alfacom.it ids-list-fetcher[2279]: Merge Logic Stats:
Jan 02 17:12:45 ids.alfacom.it ids-list-fetcher[2279]: Created detections: 0
Jan 02 17:12:45 ids.alfacom.it ids-list-fetcher[2279]: Cleaned invalid detections: 0
Jan 02 17:12:45 ids.alfacom.it ids-list-fetcher[2279]: Skipped (whitelisted): 0
Jan 02 17:12:45 ids.alfacom.it ids-list-fetcher[2279]: ============================================================
Jan 02 17:12:45 ids.alfacom.it systemd[1]: ids-list-fetcher.service: Deactivated successfully.
Jan 02 17:12:45 ids.alfacom.it systemd[1]: Finished IDS Public Lists Fetcher Service.


@@ -0,0 +1,53 @@
ls -la /opt/ids/.env
-rw-r--r--. 1 ids ids 351 Nov 17 17:23 /opt/ids/.env
[root@ids python_ml]# cat /opt/ids/.env | grep PGPASSWORD
PGPASSWORD=TestPassword123
[root@ids python_ml]# sudo chmod 640 /opt/ids/.env
[root@ids python_ml]# sudo chown ids:ids /opt/ids/.env
[root@ids python_ml]# sudo -u ids bash -c 'source /opt/ids/.env && echo "Password letta: $PGPASSWORD"'
Password letta: TestPassword123
[root@ids python_ml]# sudo -u ids bash -c 'export PGPASSWORD="TestPassword123" && psql -h ::1 -U ids_user -d ids_database -c "SELECT 1;"'
?column?
----------
1
(1 row)
[root@ids python_ml]# pkill -f syslog_parser
[root@ids python_ml]# sudo -u ids PGPASSWORD="TestPassword123" nohup python3.11 syslog_parser.py > /var/log/ids/syslog_parser.log 2>&1 &
[1] 59430
[root@ids python_ml]# sleep 2
[1]+ Done sudo -u ids PGPASSWORD="TestPassword123" nohup python3.11 syslog_parser.py > /var/log/ids/syslog_parser.log 2>&1
[root@ids python_ml]# tail -20 /var/log/ids/syslog_parser.log
nohup: ignoring input
=== SYSLOG PARSER PER ROUTER MIKROTIK ===
Pressione Ctrl+C per interrompere
[INFO] Connesso a PostgreSQL
[INFO] Processando /var/log/mikrotik/raw.log (follow=True)
[ERROR] Errore processamento file: [Errno 13] Permission denied: '/var/log/mikrotik/raw.log'
[INFO] Totale: 0 righe processate, 0 log salvati
[INFO] Disconnesso da PostgreSQL
[root@ids python_ml]# sudo -u ids nohup python3.11 syslog_parser.py > /var/log/ids/syslog_parser.log 2>&1 &
[1] 59438
[root@ids python_ml]# tail -20 /var/log/ids/syslog_parser.log
nohup: ignoring input
=== SYSLOG PARSER PER ROUTER MIKROTIK ===
Pressione Ctrl+C per interrompere
[ERROR] Connessione database fallita: connection to server at "localhost" (::1), port 5432 failed: FATAL: password authentication failed for user "ids_user"
[INFO] Disconnesso da PostgreSQL
Traceback (most recent call last):
File "/opt/ids/python_ml/syslog_parser.py", line 215, in <module>
main()
File "/opt/ids/python_ml/syslog_parser.py", line 203, in main
parser.connect_db()
File "/opt/ids/python_ml/syslog_parser.py", line 48, in connect_db
self.conn = psycopg2.connect(**self.db_config)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ids/.local/lib/python3.11/site-packages/psycopg2/__init__.py", line 122, in connect
conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: FATAL: password authentication failed for user "ids_user"
[1]+ Exit 1 sudo -u ids nohup python3.11 syslog_parser.py > /var/log/ids/syslog_parser.log 2>&1
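Two distinct problems show up in this session: the first run dies because the ids user cannot read /var/log/mikrotik/raw.log, and the second fails authentication because sourcing /opt/ids/.env sets PGPASSWORD only as a shell variable — without `export`, the Python process never inherits it. A minimal launcher sketch that works around both (paths are taken from the session; the ACL and the allexport wrapper are suggestions, not the project's actual scripts):

```bash
# Give the ids user read access to the MikroTik log without changing its owner:
setfacl -m u:ids:r /var/log/mikrotik/raw.log

# /opt/ids/.env assigns variables without 'export'; enabling allexport while
# sourcing turns every assignment into a real environment variable for the
# exec'd process.
sudo -u ids bash -c '
  set -a
  source /opt/ids/.env
  set +a
  exec nohup python3.11 /opt/ids/python_ml/syslog_parser.py \
    >> /var/log/ids/syslog_parser.log 2>&1
' &
```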


@@ -0,0 +1,55 @@
python compare_models.py
[WARNING] Extended Isolation Forest not available, using standard IF
================================================================================
IDS MODEL COMPARISON - DB Current vs Hybrid Detector v2.0.0
================================================================================
[1] Caricamento detection esistenti dal database...
Trovate 50 detection nel database
[2] Caricamento nuovo Hybrid Detector (v2.0.0)...
[HYBRID] Ensemble classifier loaded
[HYBRID] Models loaded (version: latest)
[HYBRID] Selected features: 18/25
[HYBRID] Mode: Hybrid (IF + Ensemble)
✅ Hybrid Detector caricato (18 feature selezionate)
[3] Rianalisi di 50 IP con nuovo modello Hybrid...
(Questo può richiedere alcuni minuti...)
[1/50] Analisi IP: 185.203.25.138
Current: score=100.0, type=ddos, blocked=False
Traceback (most recent call last):
File "/opt/ids/python_ml/venv/lib64/python3.11/site-packages/pandas/core/indexes/base.py", line 3790, in get_loc
return self._engine.get_loc(casted_key)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "index.pyx", line 152, in pandas._libs.index.IndexEngine.get_loc
File "index.pyx", line 181, in pandas._libs.index.IndexEngine.get_loc
File "pandas/_libs/hashtable_class_helper.pxi", line 7080, in pandas._libs.hashtable.PyObjectHashTable.get_item
File "pandas/_libs/hashtable_class_helper.pxi", line 7088, in pandas._libs.hashtable.PyObjectHashTable.get_item
KeyError: 'timestamp'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/opt/ids/python_ml/compare_models.py", line 265, in <module>
main()
File "/opt/ids/python_ml/compare_models.py", line 184, in main
comparison = reanalyze_with_hybrid(detector, ip, old_det)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/ids/python_ml/compare_models.py", line 118, in reanalyze_with_hybrid
result = detector.detect(ip_features)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/ids/python_ml/ml_hybrid_detector.py", line 507, in detect
features_df = self.extract_features(logs_df)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/ids/python_ml/ml_hybrid_detector.py", line 98, in extract_features
logs_df['timestamp'] = pd.to_datetime(logs_df['timestamp'])
~~~~~~~^^^^^^^^^^^^^
File "/opt/ids/python_ml/venv/lib64/python3.11/site-packages/pandas/core/frame.py", line 3893, in __getitem__
indexer = self.columns.get_loc(key)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/ids/python_ml/venv/lib64/python3.11/site-packages/pandas/core/indexes/base.py", line 3797, in get_loc
raise KeyError(key) from err
KeyError: 'timestamp'
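The KeyError means the DataFrame that compare_models.py hands to `detector.detect()` has no 'timestamp' column — whatever query builds `ip_features` returns different column names than the live pipeline does. A defensive normalisation sketch; every candidate name other than 'timestamp' is a guess, not confirmed schema:

```python
import pandas as pd

def ensure_timestamp(logs_df: pd.DataFrame) -> pd.DataFrame:
    """Rename the first plausible time column to 'timestamp' so that
    extract_features() finds what it expects."""
    candidates = ("timestamp", "created_at", "log_time")  # hypothetical names
    for col in candidates:
        if col in logs_df.columns:
            return logs_df.rename(columns={col: "timestamp"})
    raise KeyError(
        f"no time column among {candidates}; got {list(logs_df.columns)}"
    )
```

Calling this on the frame before `detector.detect()` would turn the opaque pandas traceback into an explicit error that names the columns the query actually returned.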


@@ -0,0 +1,75 @@
python train_hybrid.py --test
[WARNING] Extended Isolation Forest not available, using standard IF
======================================================================
IDS HYBRID ML TEST - SYNTHETIC DATA
======================================================================
INFO:dataset_loader:Creating sample dataset (10000 samples)...
INFO:dataset_loader:Sample dataset created: 10000 rows
INFO:dataset_loader:Attack distribution:
attack_type
normal 8981
brute_force 273
suspicious 258
ddos 257
port_scan 231
Name: count, dtype: int64
[TEST] Created synthetic dataset: 10000 samples
Normal: 8,981 (89.8%)
Attacks: 1,019 (10.2%)
[TEST] Training on 6,281 normal samples...
[HYBRID] Training hybrid model on 6281 logs...
[HYBRID] Extracted features for 100 unique IPs
[HYBRID] Pre-training Isolation Forest for feature selection...
[HYBRID] Generated 3 pseudo-anomalies from pre-training IF
[HYBRID] Feature selection: 25 → 18 features
[HYBRID] Selected features: total_packets, conn_count, time_span_seconds, conn_per_second, hour_of_day... (+13 more)
[HYBRID] Normalizing features...
[HYBRID] Training Extended Isolation Forest (contamination=0.03)...
/opt/ids/python_ml/venv/lib64/python3.11/site-packages/sklearn/ensemble/_iforest.py:307: UserWarning: max_samples (256) is greater than the total number of samples (100). max_samples will be set to n_samples for estimation.
warn(
[HYBRID] Generating pseudo-labels from Isolation Forest...
[HYBRID] ⚠ IF found only 3 anomalies (need 10)
[HYBRID] Applying ADAPTIVE percentile fallback...
[HYBRID] Trying 5% percentile → 5 anomalies
[HYBRID] Trying 10% percentile → 10 anomalies
[HYBRID] ✅ Success with 10% percentile
[HYBRID] Pseudo-labels: 10 anomalies, 90 normal
[HYBRID] Training ensemble classifier (DT + RF + XGBoost)...
[HYBRID] Class distribution OK: [0 1] (counts: [90 10])
[HYBRID] Ensemble .fit() completed successfully
[HYBRID] ✅ Ensemble verified: produces 2 class probabilities
[HYBRID] Ensemble training completed and verified!
[HYBRID] Models saved to models
[HYBRID] Ensemble classifier included
[HYBRID] ✅ Training completed successfully! 10/100 IPs flagged as anomalies
[HYBRID] ✅ Ensemble classifier verified and ready for production
[DETECT] Ensemble classifier available - computing hybrid score...
[DETECT] IF scores: min=0.0, max=100.0, mean=57.6
[DETECT] Ensemble scores: min=86.9, max=97.2, mean=92.1
[DETECT] Combined scores: min=54.3, max=93.1, mean=78.3
[DETECT] ✅ Hybrid scoring active: 40% IF + 60% Ensemble
[TEST] Detection results:
Total detections: 100
High confidence: 0
Medium confidence: 85
Low confidence: 15
[TEST] Top 5 detections:
1. 192.168.0.24: risk=93.1, type=suspicious, confidence=medium
2. 192.168.0.27: risk=92.7, type=suspicious, confidence=medium
3. 192.168.0.88: risk=92.5, type=suspicious, confidence=medium
4. 192.168.0.70: risk=92.3, type=suspicious, confidence=medium
5. 192.168.0.4: risk=91.4, type=suspicious, confidence=medium
❌ Error: index 7000 is out of bounds for axis 0 with size 3000
Traceback (most recent call last):
File "/opt/ids/python_ml/train_hybrid.py", line 361, in main
test_on_synthetic(args)
File "/opt/ids/python_ml/train_hybrid.py", line 283, in test_on_synthetic
y_pred[i] = 1
~~~~~~^^^
IndexError: index 7000 is out of bounds for axis 0 with size 3000
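The IndexError is an index-alignment bug rather than a model problem: `y_pred` is allocated with the length of the test slice (3000 here), while the loop at train_hybrid.py:283 indexes it with row labels carried over from the full 10,000-sample DataFrame, so a label like 7000 overflows. A sketch of the usual fix; the variable names around the failing line are assumptions based on the traceback:

```python
import numpy as np

# Reset to positional labels so every index i lies in [0, len(test_df)).
test_df = test_df.reset_index(drop=True)
y_pred = np.zeros(len(test_df), dtype=int)

for i, row in test_df.iterrows():        # i is now 0..len-1, not a stale label
    if row["source_ip"] in flagged_ips:  # 'source_ip'/'flagged_ips' are hypothetical
        y_pred[i] = 1
```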


@@ -0,0 +1,36 @@
sudo -u postgres psql -c "ALTER SYSTEM SET password_encryption = 'scram-sha-256';"
ALTER SYSTEM
[root@ids deployment]# systemctl restart postgresql
[root@ids deployment]# sudo -u postgres psql << EOF
DROP USER IF EXISTS ids_user;
CREATE USER ids_user WITH PASSWORD 'TestPassword123';
GRANT ALL PRIVILEGES ON DATABASE ids_database TO ids_user;
GRANT ALL PRIVILEGES ON ALL TABLES IN SCHEMA public TO ids_user;
GRANT ALL PRIVILEGES ON ALL SEQUENCES IN SCHEMA public TO ids_user;
ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT ALL ON TABLES TO ids_user;
ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT ALL ON SEQUENCES TO ids_user;
EOF
ERROR: role "ids_user" cannot be dropped because some objects depend on it
DETAIL: privileges for database ids_database
ERROR: role "ids_user" already exists
GRANT
GRANT
GRANT
ALTER DEFAULT PRIVILEGES
ALTER DEFAULT PRIVILEGES
[root@ids deployment]# export PGPASSWORD="TestPassword123"
[root@ids deployment]# psql -h localhost -U ids_user -d ids_database -c "SELECT 1;"
psql: error: FATAL: password authentication failed for user "ids_user"
[root@ids deployment]# tail -30 /var/lib/pgsql/data/log/postgresql-*.log | grep -i "ids_user"
2025-11-17 17:21:00.789 CET [59154] FATAL: password authentication failed for user "ids_user"
2025-11-17 17:21:00.789 CET [59154] DETAIL: User "ids_user" does not have a valid SCRAM secret.
2025-11-17 17:22:28.055 CET [59160] FATAL: password authentication failed for user "ids_user"
2025-11-17 17:22:28.055 CET [59160] DETAIL: User "ids_user" does not have a valid SCRAM secret.
2025-11-17 17:23:42.513 CET [59171] FATAL: password authentication failed for user "ids_user"
2025-11-17 17:23:42.513 CET [59171] DETAIL: User "ids_user" does not have a valid SCRAM secret.
2025-11-17 17:26:40.293 CET [59224] ERROR: role "ids_user" cannot be dropped because some objects depend on it
2025-11-17 17:26:40.293 CET [59224] STATEMENT: DROP USER IF EXISTS ids_user;
2025-11-17 17:26:40.293 CET [59224] ERROR: role "ids_user" already exists
2025-11-17 17:26:40.293 CET [59224] STATEMENT: CREATE USER ids_user WITH PASSWORD 'TestPassword123';
2025-11-17 17:27:06.520 CET [59226] FATAL: password authentication failed for user "ids_user"
2025-11-17 17:27:06.520 CET [59226] DETAIL: User "ids_user" does not have a valid SCRAM secret.
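The DETAIL lines pin the cause down: `password_encryption` was switched to scram-sha-256, but ids_user's stored verifier is still the old md5 hash, so SCRAM authentication can never succeed. Because `DROP USER` was blocked by dependent privileges and `CREATE USER` failed (the role already existed), the password was never re-hashed under the new setting. Re-setting it on the existing role is enough; a sketch:

```sql
-- Re-hash the existing role's password under the new password_encryption.
ALTER USER ids_user WITH PASSWORD 'TestPassword123';

-- Sanity check: the stored verifier should now begin with SCRAM-SHA-256$
SELECT rolname, left(rolpassword, 14) AS verifier_prefix
FROM pg_authid
WHERE rolname = 'ids_user';
```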


@@ -0,0 +1,66 @@
tail -f /var/log/ids/ml_backend.log
[HYBRID] Mode: Hybrid (IF + Ensemble)
[ML] ✓ Hybrid detector models loaded and ready
🚀 Starting IDS API on http://0.0.0.0:8000
📚 Docs available at http://0.0.0.0:8000/docs
INFO: 127.0.0.1:45342 - "GET /stats HTTP/1.1" 200 OK
INFO: 127.0.0.1:49754 - "GET /stats HTTP/1.1" 200 OK
INFO: 127.0.0.1:50634 - "GET /stats HTTP/1.1" 200 OK
INFO: 127.0.0.1:39232 - "GET /stats HTTP/1.1" 200 OK
INFO: 127.0.0.1:35736 - "GET /stats HTTP/1.1" 200 OK
INFO: 127.0.0.1:37462 - "GET /stats HTTP/1.1" 200 OK
INFO: 127.0.0.1:59676 - "GET /stats HTTP/1.1" 200 OK
INFO: 127.0.0.1:34256 - "GET /health HTTP/1.1" 200 OK
INFO: 127.0.0.1:34256 - "GET /services/status HTTP/1.1" 200 OK
INFO: 127.0.0.1:34256 - "GET /stats HTTP/1.1" 200 OK
INFO: 127.0.0.1:34264 - "POST /train HTTP/1.1" 200 OK
[TRAIN] Inizio training...
INFO: 127.0.0.1:34264 - "GET /stats HTTP/1.1" 200 OK
[TRAIN] Trovati 100000 log per training
[TRAIN] Addestramento modello...
[TRAIN] Using Hybrid ML Detector
[HYBRID] Training hybrid model on 100000 logs...
INFO: 127.0.0.1:41612 - "GET /stats HTTP/1.1" 200 OK
Traceback (most recent call last):
File "/opt/ids/python_ml/main.py", line 201, in do_training
result = ml_detector.train_unsupervised(df)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/ids/python_ml/ml_hybrid_detector.py", line 467, in train_unsupervised
self.save_models()
File "/opt/ids/python_ml/ml_hybrid_detector.py", line 658, in save_models
joblib.dump(self.ensemble_classifier, self.model_dir / "ensemble_classifier_latest.pkl")
File "/opt/ids/python_ml/venv/lib64/python3.11/site-packages/joblib/numpy_pickle.py", line 552, in dump
with open(filename, 'wb') as f:
^^^^^^^^^^^^^^^^^^^^
PermissionError: [Errno 13] Permission denied: 'models/ensemble_classifier_latest.pkl'
[HYBRID] Extracted features for 1430 unique IPs
[HYBRID] Pre-training Isolation Forest for feature selection...
[HYBRID] Generated 43 pseudo-anomalies from pre-training IF
[HYBRID] Feature selection: 25 → 18 features
[HYBRID] Selected features: total_packets, total_bytes, conn_count, avg_packet_size, bytes_per_second... (+13 more)
[HYBRID] Normalizing features...
[HYBRID] Training Extended Isolation Forest (contamination=0.03)...
[HYBRID] Generating pseudo-labels from Isolation Forest...
[HYBRID] Pseudo-labels: 43 anomalies, 1387 normal
[HYBRID] Training ensemble classifier (DT + RF + XGBoost)...
[HYBRID] Class distribution OK: [0 1] (counts: [1387 43])
[HYBRID] Ensemble .fit() completed successfully
[HYBRID] ✅ Ensemble verified: produces 2 class probabilities
[HYBRID] Ensemble training completed and verified!
[TRAIN ERROR] ❌ Errore durante training: [Errno 13] Permission denied: 'models/ensemble_classifier_latest.pkl'
INFO: 127.0.0.1:45694 - "GET /stats HTTP/1.1" 200 OK
^C
(venv) [root@ids python_ml]# ls models/
ensemble_classifier_20251124_185541.pkl feature_names.json feature_selector_latest.pkl isolation_forest_20251125_183830.pkl scaler_20251124_192122.pkl
ensemble_classifier_20251124_185920.pkl feature_selector_20251124_185541.pkl isolation_forest.joblib isolation_forest_latest.pkl scaler_20251125_090356.pkl
ensemble_classifier_20251124_192109.pkl feature_selector_20251124_185920.pkl isolation_forest_20251124_185541.pkl metadata_20251124_185541.json scaler_20251125_092703.pkl
ensemble_classifier_20251124_192122.pkl feature_selector_20251124_192109.pkl isolation_forest_20251124_185920.pkl metadata_20251124_185920.json scaler_20251125_120016.pkl
ensemble_classifier_20251125_090356.pkl feature_selector_20251124_192122.pkl isolation_forest_20251124_192109.pkl metadata_20251124_192109.json scaler_20251125_181945.pkl
ensemble_classifier_20251125_092703.pkl feature_selector_20251125_090356.pkl isolation_forest_20251124_192122.pkl metadata_20251124_192122.json scaler_20251125_182742.pkl
ensemble_classifier_20251125_120016.pkl feature_selector_20251125_092703.pkl isolation_forest_20251125_090356.pkl metadata_20251125_092703.json scaler_20251125_183049.pkl
ensemble_classifier_20251125_181945.pkl feature_selector_20251125_120016.pkl isolation_forest_20251125_092703.pkl metadata_latest.json scaler_20251125_183830.pkl
ensemble_classifier_20251125_182742.pkl feature_selector_20251125_181945.pkl isolation_forest_20251125_120016.pkl scaler.joblib scaler_latest.pkl
ensemble_classifier_20251125_183049.pkl feature_selector_20251125_182742.pkl isolation_forest_20251125_181945.pkl scaler_20251124_185541.pkl
ensemble_classifier_20251125_183830.pkl feature_selector_20251125_183049.pkl isolation_forest_20251125_182742.pkl scaler_20251124_185920.pkl
ensemble_classifier_latest.pkl feature_selector_20251125_183830.pkl isolation_forest_20251125_183049.pkl scaler_20251124_192109.pkl
(venv) [root@ids python_ml]#
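Training itself completes; only `save_models()` fails, because the backend runs as the ids user while `models/` has been written to by manual root sessions like this one, leaving the `*_latest.pkl` files root-owned. A sketch of the ownership fix — the path is inferred from the prompt's working directory, not confirmed against the unit file:

```bash
# Re-own the model directory to the service user, then confirm it can write.
chown -R ids:ids /opt/ids/python_ml/models
sudo -u ids touch /opt/ids/python_ml/models/.write_test \
  && sudo -u ids rm /opt/ids/python_ml/models/.write_test \
  && echo "ids can write to models/"
```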


@@ -0,0 +1,200 @@
./update_from_git.sh --db
╔═══════════════════════════════════════════════╗
║  AGGIORNAMENTO SISTEMA IDS DA GIT ║
╚═══════════════════════════════════════════════╝
 Verifica configurazione git...
 Backup configurazione locale...
✅ .env salvato in .env.backup
 Verifica modifiche locali...
⚠ Ci sono modifiche locali non committate
Esegui 'git status' per vedere i dettagli
Vuoi procedere comunque? (y/n) y
Salvo modifiche locali temporaneamente...
Saved working directory and index state WIP on main: 40990d4  Release v1.0.21
 Download aggiornamenti da git.alfacom.it...
From https://git.alfacom.it/marco/ids.alfacom.it
* branch main -> FETCH_HEAD
Already up to date.
✅ Aggiornamenti scaricati con successo
 Ripristino configurazione locale...
✅ .env ripristinato
 Sincronizzazione schema database...
⚠ APPLICAZIONE MODIFICHE AL DATABASE
Database: ids_database@localhost
Script da applicare:
• create_detections.sql
• create_network_logs.sql
• create_routers.sql
• create_training_history.sql
• create_whitelist.sql
Confermi applicazione modifiche? (y/n) y
 Applicando create_detections.sql...
DROP TABLE
CREATE TABLE
CREATE INDEX
CREATE INDEX
CREATE INDEX
COMMENT
status
-----------------------------------------
Tabella detections creata con successo!
(1 row)
✅ create_detections.sql applicato con successo
 Applicando create_network_logs.sql...
DROP TABLE
CREATE TABLE
CREATE INDEX
CREATE INDEX
CREATE INDEX
COMMENT
status
-------------------------------------------
Tabella network_logs creata con successo!
(1 row)
✅ create_network_logs.sql applicato con successo
 Applicando create_routers.sql...
DROP TABLE
CREATE TABLE
CREATE INDEX
CREATE INDEX
COMMENT
INSERT 0 1
status
--------------------------------------
Tabella routers creata con successo!
(1 row)
✅ create_routers.sql applicato con successo
 Applicando create_training_history.sql...
DROP TABLE
CREATE TABLE
CREATE INDEX
COMMENT
status
-----------------------------------------------
Tabella training_history creata con successo!
(1 row)
✅ create_training_history.sql applicato con successo
 Applicando create_whitelist.sql...
DROP TABLE
CREATE TABLE
CREATE INDEX
CREATE INDEX
COMMENT
status
----------------------------------------
Tabella whitelist creata con successo!
(1 row)
✅ create_whitelist.sql applicato con successo
✅ Tutti gli script SQL applicati con successo
 Aggiornamento dipendenze Node.js...
up to date, audited 492 packages in 4s
65 packages are looking for funding
run `npm fund` for details
9 vulnerabilities (3 low, 5 moderate, 1 high)
To address issues that do not require attention, run:
npm audit fix
To address all issues (including breaking changes), run:
npm audit fix --force
Run `npm audit` for details.
✅ Dipendenze Node.js aggiornate
 Aggiornamento dipendenze Python...
Defaulting to user installation because normal site-packages is not writeable
Requirement already satisfied: fastapi==0.104.1 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 1)) (0.104.1)
Requirement already satisfied: uvicorn==0.24.0 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 2)) (0.24.0)
Requirement already satisfied: pandas==2.1.3 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 3)) (2.1.3)
Requirement already satisfied: numpy==1.26.2 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 4)) (1.26.2)
Requirement already satisfied: scikit-learn==1.3.2 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 5)) (1.3.2)
Requirement already satisfied: psycopg2-binary==2.9.9 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 6)) (2.9.9)
Requirement already satisfied: python-dotenv==1.0.0 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 7)) (1.0.0)
Requirement already satisfied: pydantic==2.5.0 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 8)) (2.5.0)
Requirement already satisfied: httpx==0.25.1 in /home/ids/.local/lib/python3.11/site-packages (from -r requirements.txt (line 9)) (0.25.1)
Requirement already satisfied: anyio<4.0.0,>=3.7.1 in /home/ids/.local/lib/python3.11/site-packages (from fastapi==0.104.1->-r requirements.txt (line 1)) (3.7.1)
Requirement already satisfied: starlette<0.28.0,>=0.27.0 in /home/ids/.local/lib/python3.11/site-packages (from fastapi==0.104.1->-r requirements.txt (line 1)) (0.27.0)
Requirement already satisfied: typing-extensions>=4.8.0 in /home/ids/.local/lib/python3.11/site-packages (from fastapi==0.104.1->-r requirements.txt (line 1)) (4.15.0)
Requirement already satisfied: click>=7.0 in /home/ids/.local/lib/python3.11/site-packages (from uvicorn==0.24.0->-r requirements.txt (line 2)) (8.3.1)
Requirement already satisfied: h11>=0.8 in /home/ids/.local/lib/python3.11/site-packages (from uvicorn==0.24.0->-r requirements.txt (line 2)) (0.16.0)
Requirement already satisfied: python-dateutil>=2.8.2 in /home/ids/.local/lib/python3.11/site-packages (from pandas==2.1.3->-r requirements.txt (line 3)) (2.9.0.post0)
Requirement already satisfied: pytz>=2020.1 in /home/ids/.local/lib/python3.11/site-packages (from pandas==2.1.3->-r requirements.txt (line 3)) (2025.2)
Requirement already satisfied: tzdata>=2022.1 in /home/ids/.local/lib/python3.11/site-packages (from pandas==2.1.3->-r requirements.txt (line 3)) (2025.2)
Requirement already satisfied: scipy>=1.5.0 in /home/ids/.local/lib/python3.11/site-packages (from scikit-learn==1.3.2->-r requirements.txt (line 5)) (1.16.3)
Requirement already satisfied: joblib>=1.1.1 in /home/ids/.local/lib/python3.11/site-packages (from scikit-learn==1.3.2->-r requirements.txt (line 5)) (1.5.2)
Requirement already satisfied: threadpoolctl>=2.0.0 in /home/ids/.local/lib/python3.11/site-packages (from scikit-learn==1.3.2->-r requirements.txt (line 5)) (3.6.0)
Requirement already satisfied: annotated-types>=0.4.0 in /home/ids/.local/lib/python3.11/site-packages (from pydantic==2.5.0->-r requirements.txt (line 8)) (0.7.0)
Requirement already satisfied: pydantic-core==2.14.1 in /home/ids/.local/lib/python3.11/site-packages (from pydantic==2.5.0->-r requirements.txt (line 8)) (2.14.1)
Requirement already satisfied: certifi in /home/ids/.local/lib/python3.11/site-packages (from httpx==0.25.1->-r requirements.txt (line 9)) (2025.11.12)
Requirement already satisfied: httpcore in /home/ids/.local/lib/python3.11/site-packages (from httpx==0.25.1->-r requirements.txt (line 9)) (1.0.9)
Requirement already satisfied: idna in /home/ids/.local/lib/python3.11/site-packages (from httpx==0.25.1->-r requirements.txt (line 9)) (3.11)
Requirement already satisfied: sniffio in /home/ids/.local/lib/python3.11/site-packages (from httpx==0.25.1->-r requirements.txt (line 9)) (1.3.1)
Requirement already satisfied: six>=1.5 in /home/ids/.local/lib/python3.11/site-packages (from python-dateutil>=2.8.2->pandas==2.1.3->-r requirements.txt (line 3)) (1.17.0)
✅ Dipendenze Python aggiornate
 Restart servizi...
✅ Servizi riavviati
╔═══════════════════════════════════════════════╗
║ ✅ AGGIORNAMENTO COMPLETATO ║
╚═══════════════════════════════════════════════╝
 VERIFICA SISTEMA:
• Log backend: tail -f /var/log/ids/backend.log
• Log frontend: tail -f /var/log/ids/frontend.log
• API backend: curl http://localhost:8000/health
• Frontend: curl http://localhost:5000
📊 STATO SERVIZI:
ids 5038 0.3 2.0 1894024 331912 ? Sl 09:20 0:46 /usr/bin/python3.11 main.py
[root@ids ids]# sudo -u ids /opt/ids/deployment/restart_all.sh
[root@ids ids]# curl http://localhost:5000
curl: (7) Failed to connect to localhost port 5000: Connection refused
[root@ids ids]# curl http://localhost:8000/health
{"status":"healttail -f /var/log/ids/backend.logodel":"loaded","timestamp":"2025-11-21T15:30:17.826004"}[root@ids ids]# tail -f /var/log/ids/backend.log
INFO: Waiting for application startup.
INFO: Application startup complete.
ERROR: [Errno 98] error while attempting to bind on address ('0.0.0.0', 8000): address already in use
INFO: Waiting for application shutdown.
INFO: Application shutdown complete.
[LOAD] Modello caricato da models
🚀 Starting IDS API on http://0.0.0.0:8000
📚 Docs available at http://0.0.0.0:8000/docs
INFO: 127.0.0.1:53374 - "GET /stats HTTP/1.1" 200 OK
INFO: 127.0.0.1:59428 - "GET /health HTTP/1.1" 200 OK
^C
[root@ids ids]# tail -f /var/log/ids/frontend.log
^
Error: DATABASE_URL must be set. Did you forget to provision a database?
at <anonymous> (/opt/ids/server/db.ts:9:9)
at ModuleJob.run (node:internal/modules/esm/module_job:325:25)
at async ModuleLoader.import (node:internal/modules/esm/loader:606:24)
at async asyncRunEntryPointWithESMLoader (node:internal/modules/run_main:117:5)
Node.js v20.19.5
^C
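The two failures at the end are unrelated to the git update itself: the backend logged `address already in use` because a second uvicorn instance was started while the old one still held port 8000, and the frontend aborts in server/db.ts:9 because DATABASE_URL is missing from its environment. A sketch of the missing entry — the variable name comes from the error message, and the credential values simply reuse the test ones shown earlier in these logs, so treat the whole line as hypothetical:

```bash
# /opt/ids/.env — hypothetical entry, not copied from the repo
DATABASE_URL=postgresql://ids_user:TestPassword123@localhost:5432/ids_database
```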


@@ -0,0 +1,79 @@
5:34:05 PM [express] POST /api/ml/train 200 in 6ms :: {"message":"Training avviato in background","m…
5:34:05 PM [express] GET /api/training-history 304 in 13ms :: []
5:34:05 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:34:15 PM [express] GET /api/training-history 304 in 13ms :: []
5:34:15 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:34:21 PM [express] GET /api/detections 304 in 2ms :: []
5:34:23 PM [express] GET /api/training-history 304 in 3ms :: []
5:34:23 PM [express] GET /api/ml/stats 304 in 16ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:34:31 PM [express] GET /api/detections 304 in 3ms :: []
5:34:32 PM [express] GET /api/training-history 304 in 2ms :: []
5:34:32 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:34:39 PM [express] GET /api/detections 304 in 2ms :: []
5:34:41 PM [express] GET /api/training-history 304 in 2ms :: []
5:34:41 PM [express] GET /api/ml/stats 304 in 16ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:34:42 PM [express] GET /api/detections 304 in 2ms :: []
5:34:43 PM [express] GET /api/training-history 304 in 2ms :: []
5:34:43 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:34:44 PM [express] GET /api/training-history 304 in 3ms :: []
5:34:44 PM [express] GET /api/ml/stats 304 in 16ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:34:45 PM [express] GET /api/training-history 304 in 2ms :: []
5:34:45 PM [express] GET /api/ml/stats 304 in 16ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:34:55 PM [express] GET /api/training-history 304 in 12ms :: []
5:34:55 PM [express] GET /api/ml/stats 304 in 14ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:34:59 PM [express] GET /api/detections 304 in 3ms :: []
[DB ERROR] Failed to fetch stats: error: column "last_sync" does not exist
at /opt/ids/node_modules/pg-pool/index.js:45:11
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async <anonymous> (/opt/ids/node_modules/src/node-postgres/session.ts:104:19)
at async DatabaseStorage.getAllRouters (/opt/ids/server/storage.ts:58:12)
at async <anonymous> (/opt/ids/server/routes.ts:139:23) {
length: 109,
severity: 'ERROR',
code: '42703',
detail: undefined,
hint: undefined,
position: '83',
internalPosition: undefined,
internalQuery: undefined,
where: undefined,
schema: undefined,
table: undefined,
column: undefined,
dataType: undefined,
constraint: undefined,
file: 'parse_relation.c',
line: '3562',
routine: 'errorMissingColumn'
}
5:35:01 PM [express] GET /api/stats 500 in 4ms :: {"error":"Failed to fetch stats"}
5:35:01 PM [express] GET /api/detections 304 in 14ms :: []
[DB ERROR] Failed to fetch routers: error: column "last_sync" does not exist
at /opt/ids/node_modules/pg-pool/index.js:45:11
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async <anonymous> (/opt/ids/node_modules/src/node-postgres/session.ts:104:19)
at async DatabaseStorage.getAllRouters (/opt/ids/server/storage.ts:58:12)
at async <anonymous> (/opt/ids/server/routes.ts:10:23) {
length: 109,
severity: 'ERROR',
code: '42703',
detail: undefined,
hint: undefined,
position: '83',
internalPosition: undefined,
internalQuery: undefined,
where: undefined,
schema: undefined,
table: undefined,
column: undefined,
dataType: undefined,
constraint: undefined,
file: 'parse_relation.c',
line: '3562',
routine: 'errorMissingColumn'
}
5:35:01 PM [express] GET /api/routers 500 in 13ms :: {"error":"Failed to fetch routers"}
5:35:06 PM [express] GET /api/training-history 304 in 3ms :: []
5:35:06 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:35:16 PM [express] GET /api/training-history 304 in 11ms :: []
5:35:16 PM [express] GET /api/ml/stats 304 in 16ms :: {"logs":{"total":0,"last_hour":0},"detections"…


@@ -0,0 +1,40 @@
INFO: Shutting down
INFO: Waiting for application shutdown.
INFO: Application shutdown complete.
INFO: Finished server process [16990]
INFO: Started server process [18451]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
[LOAD] Modello caricato da models
🚀 Starting IDS API on http://0.0.0.0:8000
📚 Docs available at http://0.0.0.0:8000/docs
INFO: 127.0.0.1:53190 - "POST /detect HTTP/1.1" 200 OK
INFO: 127.0.0.1:50930 - "GET /stats HTTP/1.1" 200 OK
INFO: 127.0.0.1:50942 - "POST /train HTTP/1.1" 200 OK
[TRAIN] Inizio training...
INFO: 127.0.0.1:50930 - "GET /stats HTTP/1.1" 200 OK
[TRAIN] Trovati 100000 log per training
[TRAIN] Addestramento modello...
[TRAINING] Estrazione feature da 100000 log...
INFO: 127.0.0.1:35464 - "GET /stats HTTP/1.1" 200 OK
INFO: 127.0.0.1:33782 - "GET /stats HTTP/1.1" 200 OK
Traceback (most recent call last):
File "/opt/ids/python_ml/main.py", line 161, in do_training
result = ml_analyzer.train(df, contamination=request.contamination)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/ids/python_ml/ml_analyzer.py", line 191, in train
self.save_model()
File "/opt/ids/python_ml/ml_analyzer.py", line 311, in save_model
joblib.dump(self.model, model_path)
File "/opt/ids/python_ml/venv/lib64/python3.11/site-packages/joblib/numpy_pickle.py", line 552, in dump
with open(filename, 'wb') as f:
^^^^^^^^^^^^^^^^^^^^
PermissionError: [Errno 13] Permission denied: 'models/isolation_forest.joblib'
[TRAINING] Feature estratte per 4467 IP unici
[TRAINING] Normalizzazione features...
[TRAINING] Addestramento Isolation Forest (contamination=0.01)...
[TRAIN ERROR] ❌ Errore durante training: [Errno 13] Permission denied: 'models/isolation_forest.joblib'
INFO: 127.0.0.1:39856 - "GET /stats HTTP/1.1" 200 OK
INFO: 127.0.0.1:48214 - "GET /stats HTTP/1.1" 200 OK
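This is the same Errno 13 as in the hybrid detector, but note the path: `models/isolation_forest.joblib` is relative, so where it lands depends entirely on the working directory the process was started from. Anchoring the directory to the module removes that dependency; a sketch (the names are illustrative, not ml_analyzer.py's actual attributes):

```python
from pathlib import Path

# Resolve the model directory relative to this file instead of the CWD, so
# systemd, nohup and interactive runs all read and write the same place.
MODEL_DIR = Path(__file__).resolve().parent / "models"
MODEL_DIR.mkdir(parents=True, exist_ok=True)
model_path = MODEL_DIR / "isolation_forest.joblib"
```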


@@ -0,0 +1,104 @@
Marco, this output removes any remaining doubt: **EIF cannot be installed on Python 3.11**, regardless of which combination of flags you try.
The new error:
```
ModuleNotFoundError: No module named 'numpy'
```
is not really an error on your side:
it is a symptom of the **PEP 518 / build isolation** mechanism, which creates a temporary build environment where EIF looks for NumPy; the `eif` package is old enough that it can handle neither Python ≥ 3.11 nor the modern build process.
And we had already seen this in the earlier logs:
### ❗ EIF uses C++ code that *cannot* be compiled against Python 3.11
---
# 🛑 Definitive technical conclusion
✔ **EIF works only up to Python 3.10.**
❌ **EIF does NOT work with Python 3.11.**
❌ Not even with:
* `PIP_NO_BUILD_ISOLATION=1`
* an updated `Cython`
* `wheel`, `setuptools`, `build`
* the cache disabled
* NumPy added by hand
💡 Why?
The `eif` source requires:
* distutils (deprecated since Python 3.10 and removed in 3.12)
* old Cython APIs that have since been removed
* C++ constructs that are no longer supported
* a build system that does not work with modern Python
---
# ✅ The ONLY working solutions
## **SOLUTION A — Recommended**
Replace EIF with **PyOD**, using its equivalent model:
### Installation
```bash
pip install pyod
```
### Usage
```python
from pyod.models.eif import EIF
model = EIF(ntrees=100)
model.fit(data)
scores = model.decision_function(data)
```
Same algorithm, same logic, same output, zero problems.
---
## **SOLUTION B — Use Python 3.9 or 3.10 ONLY for EIF**
If for some reason you insist on using *that exact package*, proceed as follows:
### 1. Install Python 3.10 on AlmaLinux
```bash
sudo dnf install python3.10 python3.10-devel gcc gcc-c++
```
### 2. Create a dedicated venv
```bash
python3.10 -m venv venv_eif
source venv_eif/bin/activate
```
### 3. Install EIF *inside the Python 3.10 venv*
```bash
pip install --upgrade pip setuptools wheel cython numpy
pip install eif==2.0.2
```
**It will work 100%.**
---
# 🔥 Marco, do you want to avoid Python 3.10?
I can rewrite your EIF-based IDS right away, replacing EIF with PyOD and guaranteeing:
* the same behaviour
* the same accuracy or better
* full Python 3.11 support
* zero compilation
* cleaner, more modern code
Shall I proceed?


@@ -0,0 +1,16 @@
Nov 21 18:09:20 FIBRA forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:17419->81.198.87.240:15252, len 548
Nov 21 18:09:20 FIBRA forward: in:<pppoe-1511_1179_comunedisa> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac d4:01:c3:34:0c:41, proto UDP, 185.203.26.201:17419->81.198.87.240:15252, len 548
Nov 21 18:09:20 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-gaetano.dibenedetto>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 45.225.195.139:14055->185.203.25.204:53, len 82
Nov 21 18:09:20 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-gaetano.dibenedetto>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 45.225.195.139:14055->185.203.25.204:53, len 82
Nov 21 18:09:20 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.60.250:52216->185.203.24.143:64508, len 44
Nov 21 18:09:20 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 79.124.60.250:52216->185.203.24.143:64508, len 44
Nov 21 18:09:20 FIBRA forward: in:<pppoe-hightek.router.new> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.24.17:4926->104.16.249.249:443, len 52
Nov 21 18:09:20 FIBRA forward: in:<pppoe-hightek.router.new> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.24.17:4926->104.16.249.249:443, len 52
Nov 21 18:09:20 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-cava.gioxiii.seg>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 45.187.66.179:42774->185.203.25.231:53, len 66
Nov 21 18:09:20 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-cava.gioxiii.seg>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 45.187.66.179:42774->185.203.25.231:53, len 66
Nov 21 18:09:20 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-gaetano.dibenedetto>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 45.187.66.177:46130->185.203.25.204:53, len 66
Nov 21 18:09:20 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-gaetano.dibenedetto>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 45.187.66.177:46130->185.203.25.204:53, len 66
Nov 21 18:09:20 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-gaetano.dibenedetto>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 45.187.66.177:46130->185.203.25.204:53, len 66
Nov 21 18:09:20 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-gaetano.dibenedetto>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 45.187.66.177:46130->185.203.25.204:53, len 66
Nov 21 18:09:20 FIBRA forward: in:ether6_RB_CED out:sfp-sfpplus2_VS_AS, connection-state:new src-mac e4:8d:8c:03:f9:56, proto TCP (SYN), 10.1.0.254:36664->78.134.98.240:8291, len 60
^C


@@ -0,0 +1,39 @@
Nov 25 08:47:55 ids.alfacom.it systemd[1]: Stopped IDS ML Backend (FastAPI).
Nov 25 08:47:55 ids.alfacom.it systemd[1]: ids-ml-backend.service: Consumed 6min 21.039s CPU time.
Nov 25 08:47:55 ids.alfacom.it systemd[1]: Started IDS ML Backend (FastAPI).
Nov 25 08:47:58 ids.alfacom.it systemd[1]: ids-ml-backend.service: Main process exited, code=exited, status=1/FAILURE
Nov 25 08:47:58 ids.alfacom.it systemd[1]: ids-ml-backend.service: Failed with result 'exit-code'.
Nov 25 08:47:58 ids.alfacom.it systemd[1]: ids-ml-backend.service: Consumed 4.156s CPU time.
Nov 25 08:48:08 ids.alfacom.it systemd[1]: ids-ml-backend.service: Scheduled restart job, restart counter is at 1.
Nov 25 08:48:08 ids.alfacom.it systemd[1]: Stopped IDS ML Backend (FastAPI).
Nov 25 08:48:08 ids.alfacom.it systemd[1]: ids-ml-backend.service: Consumed 4.156s CPU time.
Nov 25 08:48:08 ids.alfacom.it systemd[1]: Started IDS ML Backend (FastAPI).
Nov 25 08:48:11 ids.alfacom.it systemd[1]: ids-ml-backend.service: Main process exited, code=exited, status=1/FAILURE
Nov 25 08:48:11 ids.alfacom.it systemd[1]: ids-ml-backend.service: Failed with result 'exit-code'.
Nov 25 08:48:11 ids.alfacom.it systemd[1]: ids-ml-backend.service: Consumed 4.059s CPU time.
Nov 25 08:48:16 ids.alfacom.it systemd[1]: Stopped IDS ML Backend (FastAPI).
Nov 25 08:48:16 ids.alfacom.it systemd[1]: ids-ml-backend.service: Consumed 4.059s CPU time.
Nov 25 08:48:16 ids.alfacom.it systemd[1]: Started IDS ML Backend (FastAPI).
Nov 25 08:48:18 ids.alfacom.it systemd[1]: ids-ml-backend.service: Main process exited, code=exited, status=1/FAILURE
Nov 25 08:48:18 ids.alfacom.it systemd[1]: ids-ml-backend.service: Failed with result 'exit-code'.
Nov 25 08:48:18 ids.alfacom.it systemd[1]: ids-ml-backend.service: Consumed 3.908s CPU time.
Nov 25 08:48:28 ids.alfacom.it systemd[1]: ids-ml-backend.service: Scheduled restart job, restart counter is at 2.
Nov 25 08:48:28 ids.alfacom.it systemd[1]: Stopped IDS ML Backend (FastAPI).
Nov 25 08:48:28 ids.alfacom.it systemd[1]: ids-ml-backend.service: Consumed 3.908s CPU time.
Nov 25 08:48:28 ids.alfacom.it systemd[1]: Started IDS ML Backend (FastAPI).
Nov 25 08:48:31 ids.alfacom.it systemd[1]: ids-ml-backend.service: Main process exited, code=exited, status=1/FAILURE
Nov 25 08:48:31 ids.alfacom.it systemd[1]: ids-ml-backend.service: Failed with result 'exit-code'.
Nov 25 08:48:31 ids.alfacom.it systemd[1]: ids-ml-backend.service: Consumed 3.952s CPU time.
Nov 25 08:48:41 ids.alfacom.it systemd[1]: ids-ml-backend.service: Scheduled restart job, restart counter is at 3.
Nov 25 08:48:41 ids.alfacom.it systemd[1]: Stopped IDS ML Backend (FastAPI).
Nov 25 08:48:41 ids.alfacom.it systemd[1]: ids-ml-backend.service: Consumed 3.952s CPU time.
Nov 25 08:48:41 ids.alfacom.it systemd[1]: Started IDS ML Backend (FastAPI).
Nov 25 08:48:43 ids.alfacom.it systemd[1]: ids-ml-backend.service: Main process exited, code=exited, status=1/FAILURE
Nov 25 08:48:43 ids.alfacom.it systemd[1]: ids-ml-backend.service: Failed with result 'exit-code'.
Nov 25 08:48:43 ids.alfacom.it systemd[1]: ids-ml-backend.service: Consumed 4.019s CPU time.
Nov 25 08:48:53 ids.alfacom.it systemd[1]: ids-ml-backend.service: Scheduled restart job, restart counter is at 4.
Nov 25 08:48:53 ids.alfacom.it systemd[1]: Stopped IDS ML Backend (FastAPI).
Nov 25 08:48:53 ids.alfacom.it systemd[1]: ids-ml-backend.service: Consumed 4.019s CPU time.
Nov 25 08:48:53 ids.alfacom.it systemd[1]: ids-ml-backend.service: Start request repeated too quickly.
Nov 25 08:48:53 ids.alfacom.it systemd[1]: ids-ml-backend.service: Failed with result 'exit-code'.
Nov 25 08:48:53 ids.alfacom.it systemd[1]: Failed to start IDS ML Backend (FastAPI).
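systemd only records the state changes here: the service dies roughly three seconds after every start, and once the restart counter exhausts the rate-limit window, systemd gives up with "Start request repeated too quickly". The actual traceback is in the unit's own output, not in these lines; a sketch of how to surface it and retry:

```bash
# Show the service's stdout/stderr around the crashes, without systemd's
# own status lines, then clear the start-limit state before trying again.
journalctl -u ids-ml-backend.service -n 100 --no-pager -o cat
systemctl reset-failed ids-ml-backend.service
systemctl start ids-ml-backend.service
```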


@@ -0,0 +1,419 @@
cat cron.log
mkdir: cannot create directory /var/run/ids: Permission denied
/opt/ids/deployment/check_frontend.sh: line 15: /var/run/ids/frontend.pid: No such file or directory
cat: /var/run/ids/frontend.pid: No such file or directory
mkdir: cannot create directory /var/run/ids: Permission denied
/opt/ids/deployment/check_backend.sh: line 15: /var/run/ids/backend.pid: No such file or directory
cat: /var/run/ids/backend.pid: No such file or directory
mkdir: cannot create directory /var/run/ids: Permission denied
mkdir: cannot create directory /var/run/ids: Permission denied
/opt/ids/deployment/check_backend.sh: line 15: /var/run/ids/backend.pid: No such file or directory
cat: /var/run/ids/backend.pid: No such file or directory
/opt/ids/deployment/check_frontend.sh: line 15: /var/run/ids/frontend.pid: No such file or directory
cat: /var/run/ids/frontend.pid: No such file or directory
mkdir: cannot create directory /var/run/ids: Permission denied
/opt/ids/deployment/check_backend.sh: line 15: /var/run/ids/backend.pid: No such file or directory
mkdir: cannot create directory /var/run/ids: Permission denied
cat: /var/run/ids/backend.pid: No such file or directory
/opt/ids/deployment/check_frontend.sh: line 15: /var/run/ids/frontend.pid: No such file or directory
cat: /var/run/ids/frontend.pid: No such file or directory
/bin/sh: line 1: /opt/ids/deployment/check_frontend.sh: No such file or directory
/bin/sh: line 1: /opt/ids/deployment/check_backend.sh: No such file or directory
/bin/sh: line 1: /opt/ids/deployment/check_frontend.sh: No such file or directory
/bin/sh: line 1: /opt/ids/deployment/check_backend.sh: No such file or directory
/bin/sh: line 1: /opt/ids/deployment/check_frontend.sh: No such file or directory
/bin/sh: line 1: /opt/ids/deployment/check_backend.sh: No such file or directory
[... the same two messages repeat continuously until the weekly restart entry below ...]
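The burst above is cron repeatedly invoking /opt/ids/deployment/check_frontend.sh and check_backend.sh before those scripts exist at that path. A minimal guard for such crontab entries is sketched below; the schedule, user name and log destination are assumptions, only the script paths come from the log:

    # /etc/cron.d/ids-healthcheck -- hypothetical layout, paths taken from the log above
    # Run each health check only if the script is actually installed and executable.
    */1 * * * * ids [ -x /opt/ids/deployment/check_backend.sh ]  && /opt/ids/deployment/check_backend.sh  >> /var/log/ids/healthcheck.log 2>&1
    */1 * * * * ids [ -x /opt/ids/deployment/check_frontend.sh ] && /opt/ids/deployment/check_frontend.sh >> /var/log/ids/healthcheck.log 2>&1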
Mon Nov 17 19:19:08 CET 2025: === RESTART SETTIMANALE SISTEMA IDS ===
Mon Nov 17 19:19:18 CET 2025: Pulizia file temporanei...
Mon Nov 17 19:19:18 CET 2025: Riavvio servizi...
/opt/ids/deployment/check_backend.sh: line 15: /var/run/ids/backend.pid: Permission denied
/opt/ids/deployment/check_frontend.sh: line 15: /var/run/ids/frontend.pid: Permission denied
/opt/ids/deployment/check_frontend.sh: line 15: /var/run/ids/frontend.pid: Permission denied
/opt/ids/deployment/check_backend.sh: line 15: /var/run/ids/backend.pid: Permission denied
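The "Permission denied" lines show that line 15 of each check script tries to write its PID file under /var/run/ids, which the invoking user cannot write to. A hedged fix, assuming the services run as a dedicated unprivileged user (here called ids, which is an assumption), is to recreate the runtime directory with that ownership before the checks run:

    # Sketch: recreate the runtime directory so the service user can write the PID files.
    install -d -o ids -g ids -m 0755 /var/run/ids
    # Equivalent systemd-tmpfiles rule (hypothetical /etc/tmpfiles.d/ids.conf):
    #   d /var/run/ids 0755 ids ids -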
Mon Nov 17 19:26:38 CET 2025: === RESTART SETTIMANALE SISTEMA IDS ===
Mon Nov 17 19:26:48 CET 2025: Pulizia file temporanei...
Mon Nov 17 19:26:48 CET 2025: Riavvio servizi...
/opt/ids/deployment/check_frontend.sh: line 15: /var/run/ids/frontend.pid: Permission denied
/opt/ids/deployment/check_backend.sh: line 15: /var/run/ids/backend.pid: Permission denied
[... the same two messages repeat continuously until the nightly backup entries below ...]
pg_dump: error: connection to database "ids_database" failed: FATAL: password authentication failed for user "ids_user"
Tue Nov 18 04:00:02 CET 2025: Backup completato: /opt/ids/backups/ids_backup_20251118_040002.sql.gz
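Note that the log reports "Backup completato" even though pg_dump itself failed to authenticate, so the backup job evidently does not check pg_dump's exit status. A minimal sketch of a stricter backup step follows; the use of DATABASE_URL and the exact file naming are assumptions, only the backup directory and database/user names appear in the log:

    #!/bin/bash
    # Sketch: report an error and remove the partial file when pg_dump fails,
    # instead of unconditionally logging a completed backup.
    set -euo pipefail
    OUT=/opt/ids/backups/ids_backup_$(date +%Y%m%d_%H%M%S).sql.gz
    if pg_dump "$DATABASE_URL" | gzip > "$OUT"; then
        echo "$(date): backup completed: $OUT"
    else
        echo "$(date): backup FAILED: pg_dump could not connect or dump" >&2
        rm -f "$OUT"
        exit 1
    fi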
/opt/ids/deployment/check_backend.sh: line 15: /var/run/ids/backend.pid: Permission denied
/opt/ids/deployment/check_frontend.sh: line 15: /var/run/ids/frontend.pid: Permission denied
[... the same two messages repeat continuously for the rest of this log excerpt ...]

View File

@ -0,0 +1,132 @@
cat /opt/ids/.env | grep DATABASE_URL
DATABASE_URL=postgresql://ids_user:TestPassword123@127.0.0.1:5432/ids_database
[root@ids ids]# psql "$(grep DATABASE_URL /opt/ids/.env | cut -d= -f2)"
psql (13.22)
Type "help" for help.
ids_database=>
ABORT CLUSTER DELETE FROM FETCH MOVE RESET SHOW VALUES
ALTER COMMENT DISCARD GRANT NOTIFY REVOKE START WITH
ANALYZE COMMIT DO IMPORT PREPARE ROLLBACK TABLE
BEGIN COPY DROP INSERT REASSIGN SAVEPOINT TRUNCATE
CALL CREATE END LISTEN REFRESH MATERIALIZED VIEW SECURITY LABEL UNLISTEN
CHECKPOINT DEALLOCATE EXECUTE LOAD REINDEX SELECT UPDATE
CLOSE DECLARE EXPLAIN LOCK RELEASE SET VACUUM
ids_database=> close
ids_database-> ^Z
[3]+ Stopped psql "$(grep DATABASE_URL /opt/ids/.env | cut -d= -f2)"
[root@ids ids]# nano .env
[root@ids ids]# psql "$(grep DATABASE_URL /opt/ids/.env | cut -d= -f2)"
psql (13.22)
Type "help" for help.
ids_database=> ^Z
[4]+ Stopped psql "$(grep DATABASE_URL /opt/ids/.env | cut -d= -f2)"
[root@ids ids]# tail -100 /var/log/ids/frontend.log
4:46:58 PM [express] serving on port 5000
A PostCSS plugin did not pass the `from` option to `postcss.parse`. This may cause imported assets to be incorrectly transformed. If you've recently added a PostCSS plugin that raised this warning, please contact the package author to fix the issue.
4:49:03 PM [express] GET /api/stats 500 in 39ms :: {"error":"Failed to fetch stats"}
4:49:03 PM [express] GET /api/detections 500 in 13ms :: {"error":"Failed to fetch detections"}
4:49:03 PM [express] GET /api/routers 500 in 11ms :: {"error":"Failed to fetch routers"}
4:49:06 PM [express] GET /api/stats 500 in 11ms :: {"error":"Failed to fetch stats"}
4:49:06 PM [express] GET /api/detections 500 in 8ms :: {"error":"Failed to fetch detections"}
4:49:06 PM [express] GET /api/routers 500 in 6ms :: {"error":"Failed to fetch routers"}
4:49:08 PM [express] GET /api/detections 500 in 4ms :: {"error":"Failed to fetch detections"}
4:49:10 PM [express] GET /api/training-history 500 in 42ms :: {"error":"Failed to fetch training his…
4:49:10 PM [express] GET /api/ml/stats 200 in 70ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:49:15 PM [express] POST /api/ml/train 200 in 10ms :: {"message":"Training avviato in background","…
4:49:15 PM [express] GET /api/training-history 500 in 6ms :: {"error":"Failed to fetch training hist…
4:49:15 PM [express] GET /api/ml/stats 200 in 14ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:49:21 PM [express] GET /api/training-history 500 in 7ms :: {"error":"Failed to fetch training hist…
4:49:21 PM [express] GET /api/ml/stats 200 in 20ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:49:27 PM [express] GET /api/training-history 500 in 6ms :: {"error":"Failed to fetch training hist…
4:49:27 PM [express] GET /api/ml/stats 200 in 20ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:49:31 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
4:49:31 PM [express] GET /api/ml/stats 200 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:49:33 PM [express] GET /api/routers 500 in 5ms :: {"error":"Failed to fetch routers"}
4:49:36 PM [express] GET /api/whitelist 500 in 4ms :: {"error":"Failed to fetch whitelist"}
4:49:38 PM [express] GET /api/routers 500 in 4ms :: {"error":"Failed to fetch routers"}
4:49:39 PM [express] GET /api/training-history 500 in 7ms :: {"error":"Failed to fetch training hist…
4:49:39 PM [express] GET /api/ml/stats 200 in 20ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:49:41 PM [express] GET /api/detections 500 in 4ms :: {"error":"Failed to fetch detections"}
4:49:42 PM [express] GET /api/stats 500 in 4ms :: {"error":"Failed to fetch stats"}
4:49:43 PM [express] GET /api/detections 500 in 6ms :: {"error":"Failed to fetch detections"}
4:49:43 PM [express] GET /api/routers 500 in 4ms :: {"error":"Failed to fetch routers"}
4:49:44 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
4:49:44 PM [express] GET /api/ml/stats 200 in 19ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:49:54 PM [express] GET /api/training-history 500 in 8ms :: {"error":"Failed to fetch training hist…
4:49:54 PM [express] GET /api/ml/stats 304 in 20ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:49:57 PM [express] GET /api/detections 500 in 4ms :: {"error":"Failed to fetch detections"}
4:49:59 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
4:49:59 PM [express] GET /api/ml/stats 304 in 19ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:50:00 PM [express] GET /api/routers 500 in 3ms :: {"error":"Failed to fetch routers"}
4:50:01 PM [express] GET /api/whitelist 500 in 4ms :: {"error":"Failed to fetch whitelist"}
4:50:02 PM [express] GET /api/training-history 500 in 6ms :: {"error":"Failed to fetch training hist…
4:50:02 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:50:12 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
4:50:12 PM [express] GET /api/ml/stats 304 in 18ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:50:22 PM [express] GET /api/training-history 500 in 4ms :: {"error":"Failed to fetch training hist…
4:50:22 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:50:32 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
4:50:33 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:50:42 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
4:50:43 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:50:52 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
4:50:53 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:51:02 PM [express] GET /api/training-history 500 in 4ms :: {"error":"Failed to fetch training hist…
4:51:03 PM [express] GET /api/ml/stats 304 in 16ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:51:12 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
4:51:13 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:51:23 PM [express] GET /api/training-history 500 in 4ms :: {"error":"Failed to fetch training hist…
4:51:23 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:51:33 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
4:51:33 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:51:37 PM [express] GET /api/detections 500 in 4ms :: {"error":"Failed to fetch detections"}
4:51:39 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
4:51:39 PM [express] GET /api/ml/stats 304 in 16ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:51:40 PM [express] GET /api/routers 500 in 4ms :: {"error":"Failed to fetch routers"}
4:51:42 PM [express] GET /api/whitelist 500 in 4ms :: {"error":"Failed to fetch whitelist"}
4:51:43 PM [express] GET /api/training-history 500 in 7ms :: {"error":"Failed to fetch training hist…
4:51:43 PM [express] GET /api/ml/stats 304 in 16ms :: {"logs":{"total":0,"last_hour":0},"detections"…
4:51:53 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
4:51:53 PM [express] GET /api/ml/stats 304 in 18ms :: {"logs":{"total":0,"last_hour":0},"detections"…
[... the same pair of requests repeats roughly every ten seconds until 4:54:33 PM ...]
[root@ids ids]# psql -h 127.0.0.1 -U ids_user -d ids_database
Password for user ids_user:
psql (13.22)
Type "help" for help.
ids_database=> ^Z
[5]+ Stopped psql -h 127.0.0.1 -U ids_user -d ids_database
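The interactive sessions above confirm that the credentials in /opt/ids/.env reach the database once the URL is read correctly. A non-interactive connectivity check that a script could reuse, mirroring the grep/cut extraction shown above, might look like this (the error messages are placeholders):

    # Sketch: verify the DATABASE_URL from /opt/ids/.env without an interactive prompt.
    DB_URL="$(grep '^DATABASE_URL=' /opt/ids/.env | cut -d= -f2-)"
    if psql "$DB_URL" -Atc 'SELECT 1' >/dev/null 2>&1; then
        echo "database reachable"
    else
        echo "database connection failed" >&2
        exit 1
    fi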

View File

@ -0,0 +1,602 @@
cat training.log
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
[root@ids ids]# cat detect.log
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
[... the identical three-line traceback repeats for the remainder of detect.log ...]
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'requests'
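The "<string>" frame means a caller is running python3 -c "import requests; ..." with an interpreter that does not have the package installed. A quick diagnostic, as a sketch in Python (assumption: run it with the same interpreter the failing job uses, not the venv's):

# Show which interpreter is running and whether 'requests' resolves for it.
import importlib.util
import sys

print("interpreter:", sys.executable)
print("requests importable:", importlib.util.find_spec("requests") is not None)

If this points at the system python3, installing requests into /opt/ids/python_ml/venv and invoking that venv interpreter explicitly is the likely fix.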

View File

@ -0,0 +1,125 @@
cd /opt/ids/python_ml && source venv/bin/activate && python3 main.py
[WARNING] Extended Isolation Forest not available, using standard IF
[ML] Using Hybrid ML Detector (Extended Isolation Forest + Feature Selection)
[HYBRID] Ensemble classifier loaded
[HYBRID] Models loaded (version: latest)
[HYBRID] Selected features: 18/25
[HYBRID] Mode: Hybrid (IF + Ensemble)
[ML] ✓ Hybrid detector models loaded and ready
🚀 Starting IDS API on http://0.0.0.0:8000
📚 Docs available at http://0.0.0.0:8000/docs
INFO: Started server process [108626]
INFO: Waiting for application startup.
INFO: Application startup complete.
ERROR: [Errno 98] error while attempting to bind on address ('0.0.0.0', 8000): address already in use
INFO: Waiting for application shutdown.
INFO: Application shutdown complete.
(venv) [root@ids python_ml]# ls -la /opt/ids/python_ml/models/
total 22896
drwxr-xr-x. 2 ids ids 4096 Nov 25 18:30 .
drwxr-xr-x. 6 ids ids 4096 Nov 25 12:53 ..
-rw-r--r--. 1 root root 235398 Nov 24 18:55 ensemble_classifier_20251124_185541.pkl
-rw-r--r--. 1 root root 231504 Nov 24 18:59 ensemble_classifier_20251124_185920.pkl
-rw-r--r--. 1 root root 1008222 Nov 24 19:21 ensemble_classifier_20251124_192109.pkl
-rw-r--r--. 1 root root 925566 Nov 24 19:21 ensemble_classifier_20251124_192122.pkl
-rw-r--r--. 1 ids ids 200159 Nov 25 09:03 ensemble_classifier_20251125_090356.pkl
-rw-r--r--. 1 root root 806006 Nov 25 09:27 ensemble_classifier_20251125_092703.pkl
-rw-r--r--. 1 ids ids 286079 Nov 25 12:00 ensemble_classifier_20251125_120016.pkl
-rw-r--r--. 1 ids ids 398464 Nov 25 18:19 ensemble_classifier_20251125_181945.pkl
-rw-r--r--. 1 ids ids 426790 Nov 25 18:27 ensemble_classifier_20251125_182742.pkl
-rw-r--r--. 1 ids ids 423651 Nov 25 18:30 ensemble_classifier_20251125_183049.pkl
-rw-r--r--. 1 root root 806006 Nov 25 09:27 ensemble_classifier_latest.pkl
-rw-r--r--. 1 ids ids 461 Nov 25 00:00 feature_names.json
-rw-r--r--. 1 root root 1695 Nov 24 18:55 feature_selector_20251124_185541.pkl
-rw-r--r--. 1 root root 1695 Nov 24 18:59 feature_selector_20251124_185920.pkl
-rw-r--r--. 1 root root 1695 Nov 24 19:21 feature_selector_20251124_192109.pkl
-rw-r--r--. 1 root root 1695 Nov 24 19:21 feature_selector_20251124_192122.pkl
-rw-r--r--. 1 ids ids 1695 Nov 25 09:03 feature_selector_20251125_090356.pkl
-rw-r--r--. 1 root root 1695 Nov 25 09:27 feature_selector_20251125_092703.pkl
-rw-r--r--. 1 ids ids 1695 Nov 25 12:00 feature_selector_20251125_120016.pkl
-rw-r--r--. 1 ids ids 1695 Nov 25 18:19 feature_selector_20251125_181945.pkl
-rw-r--r--. 1 ids ids 1695 Nov 25 18:27 feature_selector_20251125_182742.pkl
-rw-r--r--. 1 ids ids 1695 Nov 25 18:30 feature_selector_20251125_183049.pkl
-rw-r--r--. 1 root root 1695 Nov 25 09:27 feature_selector_latest.pkl
-rw-r--r--. 1 ids ids 813592 Nov 25 00:00 isolation_forest.joblib
-rw-r--r--. 1 root root 1674808 Nov 24 18:55 isolation_forest_20251124_185541.pkl
-rw-r--r--. 1 root root 1642600 Nov 24 18:59 isolation_forest_20251124_185920.pkl
-rw-r--r--. 1 root root 1482984 Nov 24 19:21 isolation_forest_20251124_192109.pkl
-rw-r--r--. 1 root root 1465736 Nov 24 19:21 isolation_forest_20251124_192122.pkl
-rw-r--r--. 1 ids ids 1139256 Nov 25 09:03 isolation_forest_20251125_090356.pkl
-rw-r--r--. 1 root root 1428424 Nov 25 09:27 isolation_forest_20251125_092703.pkl
-rw-r--r--. 1 ids ids 1855240 Nov 25 12:00 isolation_forest_20251125_120016.pkl
-rw-r--r--. 1 ids ids 1519784 Nov 25 18:19 isolation_forest_20251125_181945.pkl
-rw-r--r--. 1 ids ids 1511688 Nov 25 18:27 isolation_forest_20251125_182742.pkl
-rw-r--r--. 1 ids ids 1559208 Nov 25 18:30 isolation_forest_20251125_183049.pkl
-rw-r--r--. 1 root root 1428424 Nov 25 09:27 isolation_forest_latest.pkl
-rw-r--r--. 1 root root 1661 Nov 24 18:55 metadata_20251124_185541.json
-rw-r--r--. 1 root root 1661 Nov 24 18:59 metadata_20251124_185920.json
-rw-r--r--. 1 root root 1675 Nov 24 19:21 metadata_20251124_192109.json
-rw-r--r--. 1 root root 1675 Nov 24 19:21 metadata_20251124_192122.json
-rw-r--r--. 1 root root 1675 Nov 25 09:27 metadata_20251125_092703.json
-rw-r--r--. 1 root root 1675 Nov 25 09:27 metadata_latest.json
-rw-r--r--. 1 ids ids 2015 Nov 25 00:00 scaler.joblib
-rw-r--r--. 1 root root 1047 Nov 24 18:55 scaler_20251124_185541.pkl
-rw-r--r--. 1 root root 1047 Nov 24 18:59 scaler_20251124_185920.pkl
-rw-r--r--. 1 root root 1047 Nov 24 19:21 scaler_20251124_192109.pkl
-rw-r--r--. 1 root root 1047 Nov 24 19:21 scaler_20251124_192122.pkl
-rw-r--r--. 1 ids ids 1047 Nov 25 09:03 scaler_20251125_090356.pkl
-rw-r--r--. 1 root root 1047 Nov 25 09:27 scaler_20251125_092703.pkl
-rw-r--r--. 1 ids ids 1047 Nov 25 12:00 scaler_20251125_120016.pkl
-rw-r--r--. 1 ids ids 1047 Nov 25 18:19 scaler_20251125_181945.pkl
-rw-r--r--. 1 ids ids 1047 Nov 25 18:27 scaler_20251125_182742.pkl
-rw-r--r--. 1 ids ids 1047 Nov 25 18:30 scaler_20251125_183049.pkl
-rw-r--r--. 1 root root 1047 Nov 25 09:27 scaler_latest.pkl
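The "(version: latest)" startup line corresponds to the *_latest.pkl copies in this listing. A loading sketch (paths come from the listing above; plain pickle unpickling is an assumption about how the project saves these files):

import pickle
from pathlib import Path

MODELS_DIR = Path("/opt/ids/python_ml/models")

def load_latest(stem):
    # Each training run also refreshes a '<stem>_latest.pkl' copy.
    with open(MODELS_DIR / f"{stem}_latest.pkl", "rb") as fh:
        return pickle.load(fh)

isolation_forest = load_latest("isolation_forest")
ensemble = load_latest("ensemble_classifier")
selector = load_latest("feature_selector")
scaler = load_latest("scaler")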
(venv) [root@ids python_ml]# tail -n 50 /var/log/ids/ml_backend.log
[HYBRID] Selected features: 18/25
[HYBRID] Mode: Hybrid (IF + Ensemble)
[ML] ✓ Hybrid detector models loaded and ready
🚀 Starting IDS API on http://0.0.0.0:8000
📚 Docs available at http://0.0.0.0:8000/docs
INFO: Started server process [108413]
INFO: Waiting for application startup.
INFO: Application startup complete.
ERROR: [Errno 98] error while attempting to bind on address ('0.0.0.0', 8000): address already in use
INFO: Waiting for application shutdown.
INFO: Application shutdown complete.
[WARNING] Extended Isolation Forest not available, using standard IF
[ML] Using Hybrid ML Detector (Extended Isolation Forest + Feature Selection)
[HYBRID] Ensemble classifier loaded
[HYBRID] Models loaded (version: latest)
[HYBRID] Selected features: 18/25
[HYBRID] Mode: Hybrid (IF + Ensemble)
[ML] ✓ Hybrid detector models loaded and ready
🚀 Starting IDS API on http://0.0.0.0:8000
📚 Docs available at http://0.0.0.0:8000/docs
INFO: Started server process [108452]
INFO: Waiting for application startup.
INFO: Application startup complete.
ERROR: [Errno 98] error while attempting to bind on address ('0.0.0.0', 8000): address already in use
INFO: Waiting for application shutdown.
INFO: Application shutdown complete.
[WARNING] Extended Isolation Forest not available, using standard IF
[ML] Using Hybrid ML Detector (Extended Isolation Forest + Feature Selection)
[HYBRID] Ensemble classifier loaded
[HYBRID] Models loaded (version: latest)
[HYBRID] Selected features: 18/25
[HYBRID] Mode: Hybrid (IF + Ensemble)
[ML] ✓ Hybrid detector models loaded and ready
🚀 Starting IDS API on http://0.0.0.0:8000
📚 Docs available at http://0.0.0.0:8000/docs
INFO: Started server process [108530]
INFO: Waiting for application startup.
INFO: Application startup complete.
ERROR: [Errno 98] error while attempting to bind on address ('0.0.0.0', 8000): address already in use
INFO: Waiting for application shutdown.
INFO: Application shutdown complete.
[WARNING] Extended Isolation Forest not available, using standard IF
[ML] Using Hybrid ML Detector (Extended Isolation Forest + Feature Selection)
[HYBRID] Ensemble classifier loaded
[HYBRID] Models loaded (version: latest)
[HYBRID] Selected features: 18/25
[HYBRID] Mode: Hybrid (IF + Ensemble)
[ML] ✓ Hybrid detector models loaded and ready
🚀 Starting IDS API on http://0.0.0.0:8000
📚 Docs available at http://0.0.0.0:8000/docs
(venv) [root@ids python_ml]#
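Every restart in this log starts up cleanly at the application level, then dies on "[Errno 98] ... address already in use" because an older process still holds 0.0.0.0:8000. A pre-flight probe, as a sketch (the address is taken from the log):

import socket
import sys

# Try to bind the API port first; exit with a clear message instead of crash-looping.
probe = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
probe.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
try:
    probe.bind(("0.0.0.0", 8000))
except OSError as exc:
    sys.exit(f"Port 8000 unavailable ({exc}); another instance is probably running.")
finally:
    probe.close()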

View File

@ -0,0 +1,4 @@
curl -X POST http://localhost:8000/detect \
-H "Content-Type: application/json" \
-d '{"max_records": 5000, "hours_back": 1, "risk_threshold": 80, "auto_block": true}'
{"detections":[{"source_ip":"108.139.210.107","risk_score":98.55466848373413,"confidence_level":"high","action_recommendation":"auto_block","anomaly_type":"ddos","reason":"High connection rate: 403.7 conn/s","log_count":1211,"total_packets":1211,"total_bytes":2101702,"first_seen":"2026-01-02T16:41:51","last_seen":"2026-01-02T16:41:54","confidence":95.0},{"source_ip":"216.58.209.54","risk_score":95.52801848493884,"confidence_level":"high","action_recommendation":"auto_block","anomaly_type":"brute_force","reason":"High connection rate: 184.7 conn/s","log_count":554,"total_packets":554,"total_bytes":782397,"first_seen":"2026-01-02T16:41:51","last_seen":"2026-01-02T16:41:54","confidence":95.0},{"source_ip":"95.127.69.202","risk_score":93.58280514393482,"confidence_level":"medium","action_recommendation":"manual_review","anomaly_type":"brute_force","reason":"High connection rate: 93.7 conn/s","log_count":281,"total_packets":281,"total_bytes":369875,"first_seen":"2026-01-02T16:41:51","last_seen":"2026-01-02T16:41:54","confidence":75.0},{"source_ip":"95.127.72.207","risk_score":92.50694363471318,"confidence_level":"medium","action_recommendation":"manual_review","anomaly_type":"brute_force","reason":"High connection rate: 76.3 conn/s","log_count":229,"total_packets":229,"total_bytes":293439,"first_seen":"2026-01-02T16:41:51","last_seen":"2026-01-02T16:41:54","confidence":75.0},{"source_ip":"95.110.183.67","risk_score":86.42278405656512,"confidence_level":"medium","action_recommendation":"manual_review","anomaly_type":"brute_force","reason":"High connection rate: 153.0 conn/s","log_count":459,"total_packets":459,"total_bytes":20822,"first_seen":"2026-01-02T16:41:51","last_seen":"2026-01-02T16:41:54","confidence":75.0},{"source_ip":"54.75.71.86","risk_score":83.42037059381207,"confidence_level":"medium","action_recommendation":"manual_review","anomaly_type":"brute_force","reason":"High connection rate: 58.0 conn/s","log_count":174,"total_packets":174,"total_bytes":25857,"first_seen":"2026-01-02T16:41:51","last_seen":"2026-01-02T16:41:54","confidence":75.0},{"source_ip":"79.10.127.217","risk_score":82.32814469102843,"confidence_level":"medium","action_recommendation":"manual_review","anomaly_type":"brute_force","reason":"High connection rate: 70.0 conn/s","log_count":210,"total_packets":210,"total_bytes":18963,"first_seen":"2026-01-02T16:41:51","last_seen":"2026-01-02T16:41:54","confidence":75.0},{"source_ip":"142.251.140.100","risk_score":76.61422108557721,"confidence_level":"medium","action_recommendation":"manual_review","anomaly_type":"botnet","reason":"Anomalous pattern detected (botnet)","log_count":16,"total_packets":16,"total_bytes":20056,"first_seen":"2026-01-02T16:41:51","last_seen":"2026-01-02T16:41:53","confidence":75.0},{"source_ip":"142.250.181.161","risk_score":76.3802033958719,"confidence_level":"medium","action_recommendation":"manual_review","anomaly_type":"botnet","reason":"Anomalous pattern detected (botnet)","log_count":15,"total_packets":15,"total_bytes":5214,"first_seen":"2026-01-02T16:41:51","last_seen":"2026-01-02T16:41:51","confidence":75.0},{"source_ip":"142.250.180.131","risk_score":72.7723405111559,"confidence_level":"medium","action_recommendation":"manual_review","anomaly_type":"suspicious","reason":"Anomalous pattern detected 
(suspicious)","log_count":8,"total_packets":8,"total_bytes":5320,"first_seen":"2026-01-02T16:41:51","last_seen":"2026-01-02T16:41:53","confidence":75.0},{"source_ip":"157.240.231.60","risk_score":72.26853648050493,"confidence_level":"medium","action_recommendation":"manual_review","anomaly_type":"botnet","reason":"Anomalous pattern detected (botnet)","log_count":16,"total_packets":16,"total_bytes":4624,"first_seen":"2026-01-02T16:41:51","last_seen":"2026-01-02T16:41:54","confidence":75.0}],"total":11,"blocked":0,"message":"Trovate 11 anomalie"}[root@ids python_ml]#
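The same /detect call from Python instead of curl, for reference (the endpoint, payload keys, and response fields are exactly those shown in the exchange above):

import requests

payload = {"max_records": 5000, "hours_back": 1, "risk_threshold": 80, "auto_block": True}
resp = requests.post("http://localhost:8000/detect", json=payload, timeout=30)
resp.raise_for_status()
data = resp.json()

print(f"{data['total']} anomalies, {data['blocked']} blocked")
for det in data["detections"]:
    # Each detection carries the source IP, a 0-100 risk score, and a recommended action.
    print(det["source_ip"], round(det["risk_score"], 1), det["action_recommendation"])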

View File

@ -0,0 +1,42 @@
head -20 /var/log/mikrotik/raw.log
Nov 17 16:52:16 FIBRA forward: in:sfp-sfpplus1_VS_FTTO out:sfp-sfpplus2_VS_AS, connection-state:new src-mac c4:ad:34:25:a7:b5, proto UDP, 185.203.26.34:55841->192.178.203.94:443, len 1280
Nov 17 16:52:16 FIBRA forward: in:sfp-sfpplus1_VS_FTTO out:sfp-sfpplus2_VS_AS, connection-state:new src-mac c4:ad:34:25:a7:b5, proto UDP, 185.203.26.34:55841->192.178.203.94:443, len 1280
Nov 17 16:52:16 FIBRA forward: in:sfp-sfpplus1_VS_FTTO out:sfp-sfpplus2_VS_AS, connection-state:new src-mac c4:ad:34:25:a7:b5, proto UDP, 185.203.26.34:55841->192.178.203.94:443, len 1280
Nov 17 16:52:16 FIBRA forward: in:sfp-sfpplus1_VS_FTTO out:sfp-sfpplus2_VS_AS, connection-state:new src-mac c4:ad:34:25:a7:b5, proto UDP, 185.203.26.34:55841->192.178.203.94:443, len 1280
Nov 17 16:52:16 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 198.251.84.34:9991->185.203.26.77:53, len 65
Nov 17 16:52:16 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 198.251.84.34:9991->185.203.26.77:53, len 65
Nov 17 16:52:16 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:sfp-sfpplus1_VS_FTTO, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 82.62.84.108:43863->185.203.26.34:8472, len 210
Nov 17 16:52:16 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:sfp-sfpplus1_VS_FTTO, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 82.62.84.108:43863->185.203.26.34:8472, len 210
Nov 17 16:52:16 FIBRA forward: in:<pppoe-1018_mario.alfieri> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.138:56224->172.67.143.237:80, len 60
Nov 17 16:52:16 FIBRA forward: in:<pppoe-1018_mario.alfieri> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.138:56224->172.67.143.237:80, len 60
Nov 17 16:52:16 FIBRA forward: in:<pppoe-1018_mario.alfieri> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.138:56225->172.67.143.237:80, len 60
Nov 17 16:52:16 FIBRA forward: in:<pppoe-1018_mario.alfieri> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.138:56225->172.67.143.237:80, len 60
Nov 17 16:52:16 FIBRA forward: in:<pppoe-1018_mario.alfieri> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.138:58268->172.67.143.237:443, len 60
Nov 17 16:52:16 FIBRA forward: in:<pppoe-1018_mario.alfieri> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.138:58268->172.67.143.237:443, len 60
Nov 17 16:52:16 FIBRA forward: in:<pppoe-1018_mario.alfieri> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.138:56676->172.67.143.237:80, len 60
Nov 17 16:52:16 FIBRA forward: in:<pppoe-caronte.hightek_01> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.233:35832->192.168.25.254:80, len 60
Nov 17 16:52:16 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:sfp-sfpplus1_VS_FTTO, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 82.62.84.108:56670->185.203.26.34:8472, len 178
Nov 17 16:52:16 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:sfp-sfpplus1_VS_FTTO, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 82.62.84.108:56670->185.203.26.34:8472, len 178
Nov 17 16:52:16 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 72.46.85.161:43970->185.203.24.135:51688, len 44
Nov 17 16:52:16 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 72.46.85.161:43970->185.203.24.135:51688, len 44
[root@ids python_ml]# tail -20 /var/log/mikrotik/raw.log
Nov 17 18:34:26 FIBRA forward: in:<pppoe-023_maria.barba> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 98:da:c4:75:8c:fb, proto UDP, 10.0.254.170:56065->104.20.23.252:443, len 1278
Nov 17 18:34:26 FIBRA forward: in:<pppoe-023_maria.barba> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 98:da:c4:75:8c:fb, proto UDP, 10.0.254.170:56065->104.20.23.252:443, len 1278
Nov 17 18:34:26 FIBRA forward: in:<pppoe-023_maria.barba> out:sfp-sfpplus2_VS_AS, connection-state:new,snat src-mac 98:da:c4:75:8c:fb, proto UDP, 10.0.254.170:56065->104.20.23.252:443, NAT (10.0.254.170:56065->185.203.27.253:56065)->104.20.23.252:443, len 1278
Nov 17 18:34:26 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-gaetano.dibenedetto>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 126.220.199.81:32730->185.203.25.204:53, len 82
Nov 17 18:34:26 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-gaetano.dibenedetto>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 126.220.199.81:32730->185.203.25.204:53, len 82
Nov 17 18:34:26 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 160.202.129.17:43994->185.203.24.15:56929, len 44
Nov 17 18:34:26 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 160.202.129.17:43994->185.203.24.15:56929, len 44
Nov 17 18:34:26 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 95.216.123.229:4653->185.203.26.77:53, len 65
Nov 17 18:34:26 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 95.216.123.229:4653->185.203.26.77:53, len 65
Nov 17 18:34:26 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 198.251.84.34:28065->185.203.26.77:53, len 65
Nov 17 18:34:26 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 198.251.84.34:28065->185.203.26.77:53, len 65
Nov 17 18:34:26 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-gaetano.dibenedetto>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 168.227.31.21:59518->185.203.25.204:53, len 63
Nov 17 18:34:26 FIBRA forward: in:<pppoe-1099_maddalena.esposito> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.242:47946->3.223.194.130:443, len 60
Nov 17 18:34:26 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-gaetano.dibenedetto>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 168.227.31.21:59518->185.203.25.204:53, len 63
Nov 17 18:34:26 FIBRA forward: in:<pppoe-1099_maddalena.esposito> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 10.0.254.242:47946->3.223.194.130:443, len 60
Nov 17 18:34:26 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 198.251.84.34:3117->185.203.26.77:53, len 65
Nov 17 18:34:26 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 198.251.84.34:3117->185.203.26.77:53, len 65
Nov 17 18:34:26 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 198.251.84.34:30733->185.203.26.77:53, len 65
Nov 17 18:34:26 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-571_alberto.apostolico>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto UDP, 198.251.84.34:30733->185.203.26.77:53, len 65
Nov 17 18:34:26 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 35.203.211.209:50481->185.203.24.138:27482, len 44
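Each raw.log entry above follows a fixed shape: timestamp, router name, an optional filter tag before "forward:", in/out interfaces, connection state, an optional src-mac, protocol, src:port->dst:port, and packet length. A parsing sketch fitted to the common (non-NAT) lines in these samples; the regex is an illustration, not the project's actual syslog_parser code:

import re

SAMPLE = ("Nov 17 16:52:16 FIBRA detected-ddos forward: in:sfp-sfpplus2_VS_AS "
          "out:VLAN53_PPOE_DATACENTER, connection-state:new src-mac 18:fd:74:7c:aa:85, "
          "proto TCP (SYN), 72.46.85.161:43970->185.203.24.135:51688, len 44")

PATTERN = re.compile(
    r"^(?P<ts>\w{3} +\d+ [\d:]+) (?P<router>\S+) (?P<tag>[\w-]+ )?forward: "
    r"in:(?P<in_if>\S+) out:(?P<out_if>[^,]+), .*?"
    r"proto (?P<proto>\w+)(?: \(\w+\))?, "
    r"(?P<src>[\d.]+):(?P<sport>\d+)->(?P<dst>[\d.]+):(?P<dport>\d+), len (?P<plen>\d+)"
)

match = PATTERN.match(SAMPLE)
print(match.groupdict() if match else "no match")

Lines carrying NAT rewrites (", NAT (...)->...") would need an extra optional group.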

View File

@ -0,0 +1,51 @@
journalctl -u ids-list-fetcher -n 50 --no-pager
Jan 02 12:50:02 ids.alfacom.it ids-list-fetcher[5900]: ============================================================
Jan 02 12:50:02 ids.alfacom.it systemd[1]: ids-list-fetcher.service: Deactivated successfully.
Jan 02 12:50:02 ids.alfacom.it systemd[1]: Finished IDS Public Lists Fetcher Service.
Jan 02 12:54:56 ids.alfacom.it systemd[1]: Starting IDS Public Lists Fetcher Service...
Jan 02 12:54:56 ids.alfacom.it ids-list-fetcher[6290]: ============================================================
Jan 02 12:54:56 ids.alfacom.it ids-list-fetcher[6290]: [2026-01-02 12:54:56] PUBLIC LISTS SYNC
Jan 02 12:54:56 ids.alfacom.it ids-list-fetcher[6290]: ============================================================
Jan 02 12:54:56 ids.alfacom.it ids-list-fetcher[6290]: Found 2 enabled lists
Jan 02 12:54:56 ids.alfacom.it ids-list-fetcher[6290]: [12:54:56] Downloading Spamhaus from https://www.spamhaus.org/drop/drop_v4.json...
Jan 02 12:54:56 ids.alfacom.it ids-list-fetcher[6290]: [12:54:56] Downloading AWS from https://ip-ranges.amazonaws.com/ip-ranges.json...
Jan 02 12:54:56 ids.alfacom.it ids-list-fetcher[6290]: [12:54:56] Parsing AWS...
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: [12:54:57] Found 9548 IPs, syncing to database...
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: [12:54:57] ✓ AWS: +0 -0 ~9511
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: [12:54:57] Parsing Spamhaus...
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: [12:54:57] Found 1468 IPs, syncing to database...
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: [12:54:57] ✗ Spamhaus: ON CONFLICT DO UPDATE command cannot affect row a second time
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: HINT: Ensure that no rows proposed for insertion within the same command have duplicate constrained values.
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: ============================================================
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: SYNC SUMMARY
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: ============================================================
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: Success: 1/2
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: Errors: 1/2
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: Total IPs Added: 0
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: Total IPs Removed: 0
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: ============================================================
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: ============================================================
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: RUNNING MERGE LOGIC
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: ============================================================
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: ERROR:merge_logic:Failed to cleanup detections: operator does not exist: inet = text
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: LINE 9: d.source_ip::inet = wl.ip_inet
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: ^
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: HINT: No operator matches the given name and argument types. You might need to add explicit type casts.
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: ERROR:merge_logic:Failed to sync detections: operator does not exist: text <<= text
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: LINE 30: OR bl.ip_inet <<= wl.ip_inet
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: ^
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: HINT: No operator matches the given name and argument types. You might need to add explicit type casts.
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: Traceback (most recent call last):
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: File "/opt/ids/python_ml/merge_logic.py", line 264, in sync_public_blacklist_detections
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: cur.execute("""
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: psycopg2.errors.UndefinedFunction: operator does not exist: text <<= text
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: LINE 30: OR bl.ip_inet <<= wl.ip_inet
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: ^
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: HINT: No operator matches the given name and argument types. You might need to add explicit type casts.
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: Merge Logic Stats:
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: Created detections: 0
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: Cleaned invalid detections: 0
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: Skipped (whitelisted): 0
Jan 02 12:54:57 ids.alfacom.it ids-list-fetcher[6290]: ============================================================
Jan 02 12:54:57 ids.alfacom.it systemd[1]: ids-list-fetcher.service: Deactivated successfully.
Jan 02 12:54:57 ids.alfacom.it systemd[1]: Finished IDS Public Lists Fetcher Service.
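Two separate failures appear in this run. The Spamhaus upsert fails because the same constrained value occurs twice inside one INSERT ... ON CONFLICT DO UPDATE statement, and the merge logic fails because it compares inet values against text columns. A psycopg2 sketch of both fixes (the table and column names here are assumptions inferred from the log, not the real schema):

import psycopg2

conn = psycopg2.connect("host=127.0.0.1 dbname=ids_database user=ids_user")
with conn, conn.cursor() as cur:
    cidrs = ["192.0.2.0/24", "192.0.2.0/24", "198.51.100.0/24"]  # sample parser output
    for cidr in dict.fromkeys(cidrs):  # drop in-batch duplicates, keep order
        cur.execute(
            "INSERT INTO blacklist_ips (ip_inet) VALUES (%s::inet) "
            "ON CONFLICT (ip_inet) DO NOTHING",
            (cidr,),
        )
    # Containment and equality checks must compare inet with inet, so cast text:
    cur.execute("SELECT %s::inet <<= %s::inet", ("192.0.2.5", "192.0.2.0/24"))
    print("contained:", cur.fetchone()[0])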

View File

@ -0,0 +1,51 @@
journalctl -u ids-list-fetcher -n 50 --no-pager
Jan 02 16:11:31 ids.alfacom.it ids-list-fetcher[10401]: HINT: No operator matches the given name and argument types. You might need to add explicit type casts.
Jan 02 16:11:31 ids.alfacom.it ids-list-fetcher[10401]: Merge Logic Stats:
Jan 02 16:11:31 ids.alfacom.it ids-list-fetcher[10401]: Created detections: 0
Jan 02 16:11:31 ids.alfacom.it ids-list-fetcher[10401]: Cleaned invalid detections: 0
Jan 02 16:11:31 ids.alfacom.it ids-list-fetcher[10401]: Skipped (whitelisted): 0
Jan 02 16:11:31 ids.alfacom.it ids-list-fetcher[10401]: ============================================================
Jan 02 16:11:31 ids.alfacom.it systemd[1]: ids-list-fetcher.service: Deactivated successfully.
Jan 02 16:11:31 ids.alfacom.it systemd[1]: Finished IDS Public Lists Fetcher Service.
Jan 02 16:15:04 ids.alfacom.it systemd[1]: Starting IDS Public Lists Fetcher Service...
Jan 02 16:15:04 ids.alfacom.it ids-list-fetcher[10801]: ============================================================
Jan 02 16:15:04 ids.alfacom.it ids-list-fetcher[10801]: [2026-01-02 16:15:04] PUBLIC LISTS SYNC
Jan 02 16:15:04 ids.alfacom.it ids-list-fetcher[10801]: ============================================================
Jan 02 16:15:04 ids.alfacom.it ids-list-fetcher[10801]: Found 2 enabled lists
Jan 02 16:15:04 ids.alfacom.it ids-list-fetcher[10801]: [16:15:04] Downloading Spamhaus from https://www.spamhaus.org/drop/drop_v4.json...
Jan 02 16:15:04 ids.alfacom.it ids-list-fetcher[10801]: [16:15:04] Downloading AWS from https://ip-ranges.amazonaws.com/ip-ranges.json...
Jan 02 16:15:04 ids.alfacom.it ids-list-fetcher[10801]: [16:15:04] Parsing Spamhaus...
Jan 02 16:15:04 ids.alfacom.it ids-list-fetcher[10801]: [16:15:04] Found 1468 IPs, syncing to database...
Jan 02 16:15:04 ids.alfacom.it ids-list-fetcher[10801]: [16:15:04] ✓ Spamhaus: +0 -0 ~1468
Jan 02 16:15:04 ids.alfacom.it ids-list-fetcher[10801]: [16:15:04] Parsing AWS...
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: [16:15:05] Found 9548 IPs, syncing to database...
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: [16:15:05] ✓ AWS: +9548 -0 ~0
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: ============================================================
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: SYNC SUMMARY
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: ============================================================
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: Success: 2/2
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: Errors: 0/2
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: Total IPs Added: 9548
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: Total IPs Removed: 0
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: ============================================================
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: ============================================================
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: RUNNING MERGE LOGIC
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: ============================================================
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: ERROR:merge_logic:Failed to sync detections: column "risk_score" is of type numeric but expression is of type text
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: LINE 13: '75',
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: ^
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: HINT: You will need to rewrite or cast the expression.
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: Traceback (most recent call last):
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: File "/opt/ids/python_ml/merge_logic.py", line 264, in sync_public_blacklist_detections
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: cur.execute("""
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: psycopg2.errors.DatatypeMismatch: column "risk_score" is of type numeric but expression is of type text
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: LINE 13: '75',
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: ^
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: HINT: You will need to rewrite or cast the expression.
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: Merge Logic Stats:
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: Created detections: 0
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: Cleaned invalid detections: 0
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: Skipped (whitelisted): 0
Jan 02 16:15:05 ids.alfacom.it ids-list-fetcher[10801]: ============================================================
Jan 02 16:15:05 ids.alfacom.it systemd[1]: ids-list-fetcher.service: Deactivated successfully.
Jan 02 16:15:05 ids.alfacom.it systemd[1]: Finished IDS Public Lists Fetcher Service.
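With the casts fixed, the remaining failure is a quoted numeric literal: risk_score is NUMERIC but the statement supplies the string '75'. A sketch of the corrected parameterized insert (the column list mirrors the detections table shown later in these logs):

import psycopg2

conn = psycopg2.connect("host=127.0.0.1 dbname=ids_database user=ids_user")
with conn, conn.cursor() as cur:
    cur.execute(
        "INSERT INTO detections (source_ip, risk_score, confidence) "
        "VALUES (%s, %s, %s)",
        ("192.0.2.10", 75, 75.0),  # pass numbers, not the string '75'
    )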

View File

@ -0,0 +1,44 @@
journalctl -u ids-ml-backend -n 50 --no-pager
Nov 22 10:35:57 ids.alfacom.it systemd[1]: Started IDS ML Backend (FastAPI).
Nov 22 10:35:57 ids.alfacom.it systemd[1]: ids-ml-backend.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 10:35:57 ids.alfacom.it systemd[1]: ids-ml-backend.service: Failed with result 'exit-code'.
Nov 22 10:36:07 ids.alfacom.it systemd[1]: ids-ml-backend.service: Scheduled restart job, restart counter is at 1.
Nov 22 10:36:07 ids.alfacom.it systemd[1]: Stopped IDS ML Backend (FastAPI).
Nov 22 10:36:07 ids.alfacom.it systemd[1]: Started IDS ML Backend (FastAPI).
Nov 22 10:36:07 ids.alfacom.it systemd[1]: ids-ml-backend.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 10:36:07 ids.alfacom.it systemd[1]: ids-ml-backend.service: Failed with result 'exit-code'.
Nov 22 10:36:18 ids.alfacom.it systemd[1]: ids-ml-backend.service: Scheduled restart job, restart counter is at 2.
Nov 22 10:36:18 ids.alfacom.it systemd[1]: Stopped IDS ML Backend (FastAPI).
Nov 22 10:36:18 ids.alfacom.it systemd[1]: Started IDS ML Backend (FastAPI).
Nov 22 10:36:18 ids.alfacom.it systemd[1]: ids-ml-backend.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 10:36:18 ids.alfacom.it systemd[1]: ids-ml-backend.service: Failed with result 'exit-code'.
Nov 22 10:36:28 ids.alfacom.it systemd[1]: ids-ml-backend.service: Scheduled restart job, restart counter is at 3.
Nov 22 10:36:28 ids.alfacom.it systemd[1]: Stopped IDS ML Backend (FastAPI).
Nov 22 10:36:28 ids.alfacom.it systemd[1]: Started IDS ML Backend (FastAPI).
Nov 22 10:36:28 ids.alfacom.it systemd[1]: ids-ml-backend.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 10:36:28 ids.alfacom.it systemd[1]: ids-ml-backend.service: Failed with result 'exit-code'.
Nov 22 10:36:38 ids.alfacom.it systemd[1]: ids-ml-backend.service: Scheduled restart job, restart counter is at 4.
Nov 22 10:36:38 ids.alfacom.it systemd[1]: Stopped IDS ML Backend (FastAPI).
Nov 22 10:36:38 ids.alfacom.it systemd[1]: Started IDS ML Backend (FastAPI).
Nov 22 10:36:38 ids.alfacom.it systemd[1]: ids-ml-backend.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 10:36:38 ids.alfacom.it systemd[1]: ids-ml-backend.service: Failed with result 'exit-code'.
Nov 22 10:36:48 ids.alfacom.it systemd[1]: ids-ml-backend.service: Scheduled restart job, restart counter is at 5.
Nov 22 10:36:48 ids.alfacom.it systemd[1]: Stopped IDS ML Backend (FastAPI).
Nov 22 10:36:48 ids.alfacom.it systemd[1]: Started IDS ML Backend (FastAPI).
Nov 22 10:36:48 ids.alfacom.it systemd[1]: ids-ml-backend.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 10:36:48 ids.alfacom.it systemd[1]: ids-ml-backend.service: Failed with result 'exit-code'.
Nov 22 10:36:59 ids.alfacom.it systemd[1]: ids-ml-backend.service: Scheduled restart job, restart counter is at 6.
Nov 22 10:36:59 ids.alfacom.it systemd[1]: Stopped IDS ML Backend (FastAPI).
Nov 22 10:36:59 ids.alfacom.it systemd[1]: Started IDS ML Backend (FastAPI).
Nov 22 10:36:59 ids.alfacom.it systemd[1]: ids-ml-backend.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 10:36:59 ids.alfacom.it systemd[1]: ids-ml-backend.service: Failed with result 'exit-code'.
Nov 22 10:37:09 ids.alfacom.it systemd[1]: ids-ml-backend.service: Scheduled restart job, restart counter is at 7.
Nov 22 10:37:09 ids.alfacom.it systemd[1]: Stopped IDS ML Backend (FastAPI).
Nov 22 10:37:09 ids.alfacom.it systemd[1]: Started IDS ML Backend (FastAPI).
Nov 22 10:37:09 ids.alfacom.it systemd[1]: ids-ml-backend.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 10:37:09 ids.alfacom.it systemd[1]: ids-ml-backend.service: Failed with result 'exit-code'.
Nov 22 10:37:19 ids.alfacom.it systemd[1]: ids-ml-backend.service: Scheduled restart job, restart counter is at 8.
Nov 22 10:37:19 ids.alfacom.it systemd[1]: Stopped IDS ML Backend (FastAPI).
Nov 22 10:37:19 ids.alfacom.it systemd[1]: Started IDS ML Backend (FastAPI).
Nov 22 10:37:19 ids.alfacom.it systemd[1]: ids-ml-backend.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 10:37:19 ids.alfacom.it systemd[1]: ids-ml-backend.service: Failed with result 'exit-code'.

View File

@ -0,0 +1,82 @@
netstat -tlnp | grep 8000
tcp 0 0 0.0.0.0:8000 0.0.0.0:* LISTEN 106309/python3.11
(venv) [root@ids python_ml]# lsof -i :8000
COMMAND PID USER FD TYPE DEVICE SIZE/OFF NODE NAME
python3.1 106309 ids 7u IPv4 805799 0t0 TCP *:irdmi (LISTEN)
(venv) [root@ids python_ml]# kill -9 106309
(venv) [root@ids python_ml]# lsof -i :8000
(venv) [root@ids python_ml]# pkill -9 -f "python.*8000"
(venv) [root@ids python_ml]# pkill -9 -f "python.*main.py"
(venv) [root@ids python_ml]# sudo systemctl restart ids-ml-backend
Job for ids-ml-backend.service failed because the control process exited with error code.
See "systemctl status ids-ml-backend.service" and "journalctl -xeu ids-ml-backend.service" for details.
(venv) [root@ids python_ml]# sudo systemctl status ids-ml-backend
× ids-ml-backend.service - IDS ML Backend (FastAPI)
Loaded: loaded (/etc/systemd/system/ids-ml-backend.service; enabled; preset: disabled)
Active: failed (Result: exit-code) since Tue 2025-11-25 18:31:08 CET; 3min 37s ago
Duration: 2.490s
Process: 108530 ExecStart=/opt/ids/python_ml/venv/bin/python3 main.py (code=exited, status=1/FAILURE)
Main PID: 108530 (code=exited, status=1/FAILURE)
CPU: 3.987s
Nov 25 18:31:08 ids.alfacom.it systemd[1]: ids-ml-backend.service: Scheduled restart job, restart counter is at 5.
Nov 25 18:31:08 ids.alfacom.it systemd[1]: Stopped IDS ML Backend (FastAPI).
Nov 25 18:31:08 ids.alfacom.it systemd[1]: ids-ml-backend.service: Consumed 3.987s CPU time.
Nov 25 18:31:08 ids.alfacom.it systemd[1]: ids-ml-backend.service: Start request repeated too quickly.
Nov 25 18:31:08 ids.alfacom.it systemd[1]: ids-ml-backend.service: Failed with result 'exit-code'.
Nov 25 18:31:08 ids.alfacom.it systemd[1]: Failed to start IDS ML Backend (FastAPI).
Nov 25 18:34:35 ids.alfacom.it systemd[1]: ids-ml-backend.service: Start request repeated too quickly.
Nov 25 18:34:35 ids.alfacom.it systemd[1]: ids-ml-backend.service: Failed with result 'exit-code'.
Nov 25 18:34:35 ids.alfacom.it systemd[1]: Failed to start IDS ML Backend (FastAPI).
(venv) [root@ids python_ml]# tail -n 50 /var/log/ids/ml_backend.log
[HYBRID] Selected features: 18/25
[HYBRID] Mode: Hybrid (IF + Ensemble)
[ML] ✓ Hybrid detector models loaded and ready
🚀 Starting IDS API on http://0.0.0.0:8000
📚 Docs available at http://0.0.0.0:8000/docs
INFO: Started server process [108413]
INFO: Waiting for application startup.
INFO: Application startup complete.
ERROR: [Errno 98] error while attempting to bind on address ('0.0.0.0', 8000): address already in use
INFO: Waiting for application shutdown.
INFO: Application shutdown complete.
[WARNING] Extended Isolation Forest not available, using standard IF
[ML] Using Hybrid ML Detector (Extended Isolation Forest + Feature Selection)
[HYBRID] Ensemble classifier loaded
[HYBRID] Models loaded (version: latest)
[HYBRID] Selected features: 18/25
[HYBRID] Mode: Hybrid (IF + Ensemble)
[ML] ✓ Hybrid detector models loaded and ready
🚀 Starting IDS API on http://0.0.0.0:8000
📚 Docs available at http://0.0.0.0:8000/docs
INFO: Started server process [108452]
INFO: Waiting for application startup.
INFO: Application startup complete.
ERROR: [Errno 98] error while attempting to bind on address ('0.0.0.0', 8000): address already in use
INFO: Waiting for application shutdown.
INFO: Application shutdown complete.
[WARNING] Extended Isolation Forest not available, using standard IF
[ML] Using Hybrid ML Detector (Extended Isolation Forest + Feature Selection)
[HYBRID] Ensemble classifier loaded
[HYBRID] Models loaded (version: latest)
[HYBRID] Selected features: 18/25
[HYBRID] Mode: Hybrid (IF + Ensemble)
[ML] ✓ Hybrid detector models loaded and ready
🚀 Starting IDS API on http://0.0.0.0:8000
📚 Docs available at http://0.0.0.0:8000/docs
INFO: Started server process [108530]
INFO: Waiting for application startup.
INFO: Application startup complete.
ERROR: [Errno 98] error while attempting to bind on address ('0.0.0.0', 8000): address already in use
INFO: Waiting for application shutdown.
INFO: Application shutdown complete.
[WARNING] Extended Isolation Forest not available, using standard IF
[ML] Using Hybrid ML Detector (Extended Isolation Forest + Feature Selection)
[HYBRID] Ensemble classifier loaded
[HYBRID] Models loaded (version: latest)
[HYBRID] Selected features: 18/25
[HYBRID] Mode: Hybrid (IF + Ensemble)
[ML] ✓ Hybrid detector models loaded and ready
🚀 Starting IDS API on http://0.0.0.0:8000
📚 Docs available at http://0.0.0.0:8000/docs
(venv) [root@ids python_ml]#

View File

@ -0,0 +1,51 @@
journalctl -u ids-list-fetcher -n 50 --no-pager
Jan 02 12:30:01 ids.alfacom.it ids-list-fetcher[5571]: Cleaned invalid detections: 0
Jan 02 12:30:01 ids.alfacom.it ids-list-fetcher[5571]: Skipped (whitelisted): 0
Jan 02 12:30:01 ids.alfacom.it ids-list-fetcher[5571]: ============================================================
Jan 02 12:30:01 ids.alfacom.it systemd[1]: ids-list-fetcher.service: Deactivated successfully.
Jan 02 12:30:01 ids.alfacom.it systemd[1]: Finished IDS Public Lists Fetcher Service.
Jan 02 12:40:01 ids.alfacom.it systemd[1]: Starting IDS Public Lists Fetcher Service...
Jan 02 12:40:01 ids.alfacom.it ids-list-fetcher[5730]: ============================================================
Jan 02 12:40:01 ids.alfacom.it ids-list-fetcher[5730]: [2026-01-02 12:40:01] PUBLIC LISTS SYNC
Jan 02 12:40:01 ids.alfacom.it ids-list-fetcher[5730]: ============================================================
Jan 02 12:40:01 ids.alfacom.it ids-list-fetcher[5730]: Found 2 enabled lists
Jan 02 12:40:01 ids.alfacom.it ids-list-fetcher[5730]: [12:40:01] Downloading Spamhaus from https://www.spamhaus.org/drop/drop_v4.json...
Jan 02 12:40:01 ids.alfacom.it ids-list-fetcher[5730]: [12:40:01] Downloading AWS from https://ip-ranges.amazonaws.com/ip-ranges.json...
Jan 02 12:40:01 ids.alfacom.it ids-list-fetcher[5730]: [12:40:01] Parsing AWS...
Jan 02 12:40:01 ids.alfacom.it ids-list-fetcher[5730]: [12:40:01] Found 9548 IPs, syncing to database...
Jan 02 12:40:02 ids.alfacom.it ids-list-fetcher[5730]: [12:40:02] ✓ AWS: +9511 -0 ~0
Jan 02 12:40:02 ids.alfacom.it ids-list-fetcher[5730]: [12:40:02] Parsing Spamhaus...
Jan 02 12:40:02 ids.alfacom.it ids-list-fetcher[5730]: [12:40:02] ✗ Spamhaus: No valid IPs found in list
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: ============================================================
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: SYNC SUMMARY
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: ============================================================
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: Success: 1/2
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: Errors: 1/2
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: Total IPs Added: 9511
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: Total IPs Removed: 0
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: ============================================================
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: ============================================================
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: RUNNING MERGE LOGIC
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: ============================================================
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: ERROR:merge_logic:Failed to cleanup detections: operator does not exist: inet = text
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: LINE 9: d.source_ip::inet = wl.ip_inet
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: ^
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: HINT: No operator matches the given name and argument types. You might need to add explicit type casts.
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: ERROR:merge_logic:Failed to sync detections: operator does not exist: text <<= text
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: LINE 30: OR bl.ip_inet <<= wl.ip_inet
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: ^
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: HINT: No operator matches the given name and argument types. You might need to add explicit type casts.
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: Traceback (most recent call last):
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: File "/opt/ids/python_ml/merge_logic.py", line 264, in sync_public_blacklist_detections
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: cur.execute("""
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: psycopg2.errors.UndefinedFunction: operator does not exist: text <<= text
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: LINE 30: OR bl.ip_inet <<= wl.ip_inet
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: ^
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: HINT: No operator matches the given name and argument types. You might need to add explicit type casts.
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: Merge Logic Stats:
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: Created detections: 0
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: Cleaned invalid detections: 0
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: Skipped (whitelisted): 0
Jan 02 12:40:03 ids.alfacom.it ids-list-fetcher[5730]: ============================================================
Jan 02 12:40:03 ids.alfacom.it systemd[1]: ids-list-fetcher.service: Deactivated successfully.
Jan 02 12:40:03 ids.alfacom.it systemd[1]: Finished IDS Public Lists Fetcher Service.

View File

@ -0,0 +1,48 @@
pkill -f syslog_parser
[root@ids python_ml]# sudo -u ids nohup python3.11 syslog_parser.py > /var/log/ids/syslog_parser.log 2>&1 &
[2] 60160
[1] Terminated sudo -u ids nohup python3.11 syslog_parser.py > /var/log/ids/syslog_parser.log 2>&1
[root@ids python_ml]# sudo -u ids nohup python3.11 syslog_parser.py > /var/log/ids/syslog_parser.log 2>&1 &
[3] 60164
[root@ids python_ml]# tail -20 /var/log/ids/syslog_parser.log
[ERROR] Log save error: relation "network_logs" does not exist
LINE 2: INSERT INTO network_logs
^
[... the same error repeated for each failed insert ...]
[root@ids python_ml]# tail -20 /var/log/ids/syslog_parser.log
[ERROR] Log save error: relation "network_logs" does not exist
LINE 2: INSERT INTO network_logs
^
[... the same error repeated for each failed insert ...]
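The connection itself works here; the relation is simply absent from whatever database or schema this process reaches, so the schema was likely never applied there. A quick check, as a sketch (connection parameters mirror the psql commands elsewhere in these logs):

import psycopg2

conn = psycopg2.connect("host=127.0.0.1 dbname=ids_database user=ids_user")
with conn, conn.cursor() as cur:
    cur.execute("SELECT current_database(), current_user, current_schema()")
    print(cur.fetchone())
    # to_regclass() returns NULL when the relation does not exist.
    cur.execute("SELECT to_regclass('public.network_logs')")
    print("network_logs exists:", cur.fetchone()[0] is not None)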

View File

@ -0,0 +1,121 @@
psql $DATABASE_URL << 'EOF'
-- Count records in each table
SELECT 'network_logs' as table_name, COUNT(*) as count FROM network_logs
UNION ALL
SELECT 'detections', COUNT(*) FROM detections
UNION ALL
SELECT 'training_history', COUNT(*) FROM training_history
UNION ALL
SELECT 'routers', COUNT(*) FROM routers
UNION ALL
SELECT 'whitelist', COUNT(*) FROM whitelist;
-- Show the last 5 network logs
SELECT timestamp, source_ip, destination_ip, protocol, router_name
FROM network_logs
ORDER BY timestamp DESC
LIMIT 5;
-- Show training history
SELECT * FROM training_history ORDER BY trained_at DESC LIMIT 5;
-- Show detections
SELECT * FROM detections ORDER BY detected_at DESC LIMIT 5;
EOF
table_name | count
------------------+-------
network_logs | 0
detections | 0
training_history | 0
routers | 1
whitelist | 0
(5 rows)
timestamp | source_ip | destination_ip | protocol | router_name
-----------+-----------+----------------+----------+-------------
(0 rows)
id | model_version | records_processed | features_count | accuracy | training_duration | status | notes | trained_at
----+---------------+-------------------+----------------+----------+-------------------+--------+-------+------------
(0 rows)
id | source_ip | risk_score | confidence | anomaly_type | reason | log_count | first_seen | last_seen | blocked | blocked_at | detected_at
----+-----------+------------+------------+--------------+--------+-----------+------------+-----------+---------+------------+-------------
(0 rows)
[root@ids ids]# curl -s http://localhost:8000/stats | jq .
{
"logs": {
"total": 0,
"last_hour": 0
},
"detections": {
"total": 0,
"blocked": 0
},
"routers": {
"active": 1
},
"latest_training": null
}
[root@ids ids]# tail -50 /var/log/ids/syslog_parser.log
[ERROR] Errore salvataggio log: could not extend file "base/16384/16940.223": No space left on device
HINT: Check free disk space.
[ERROR] Errore salvataggio log: could not extend file "base/16384/16940.223": No space left on device
HINT: Check free disk space.
[ERROR] Errore salvataggio log: could not extend file "base/16384/16940.223": No space left on device
HINT: Check free disk space.
[INFO] Processate 417737400 righe, salvate 417728626 log
[ERROR] Errore salvataggio log: could not extend file "base/16384/16940.223": No space left on device
HINT: Check free disk space.
[ERROR] Errore salvataggio log: could not extend file "base/16384/16940.223": No space left on device
HINT: Check free disk space.
[ERROR] Errore salvataggio log: could not extend file "base/16384/16940.223": No space left on device
HINT: Check free disk space.
[ERROR] Errore salvataggio log: could not extend file "base/16384/16940.223": No space left on device
HINT: Check free disk space.
[ERROR] Errore salvataggio log: could not extend file "base/16384/16940.223": No space left on device
HINT: Check free disk space.
[ERROR] Errore salvataggio log: could not extend file "base/16384/16940.223": No space left on device
HINT: Check free disk space.
[ERROR] Errore salvataggio log: could not extend file "base/16384/16940.223": No space left on device
HINT: Check free disk space.
[ERROR] Errore salvataggio log: could not extend file "base/16384/16940.223": No space left on device
HINT: Check free disk space.
[ERROR] Errore salvataggio log: could not extend file "base/16384/16940.223": No space left on device
HINT: Check free disk space.
[ERROR] Errore salvataggio log: could not extend file "base/16384/16940.223": No space left on device
HINT: Check free disk space.
[ERROR] Errore salvataggio log: could not extend file "base/16384/16940.223": No space left on device
HINT: Check free disk space.
[ERROR] Errore salvataggio log: could not extend file "base/16384/16940.223": No space left on device
HINT: Check free disk space.
[ERROR] Errore salvataggio log: could not extend file "base/16384/16940.223": No space left on device
HINT: Check free disk space.
[ERROR] Errore salvataggio log: could not extend fil
[root@ids ids]# df -h
Filesystem Size Used Avail Use% Mounted on
devtmpfs 4.0M 0 4.0M 0% /dev
tmpfs 7.7G 16K 7.7G 1% /dev/shm
tmpfs 3.1G 8.8M 3.1G 1% /run
efivarfs 256K 32K 220K 13% /sys/firmware/efi/efivars
/dev/mapper/almalinux_ids-root 491G 40G 451G 9% /
/dev/sda2 960M 327M 634M 34% /boot
/dev/sda1 599M 7.1M 592M 2% /boot/efi
tmpfs 1.6G 0 1.6G 0% /run/user/0
tmpfs 1.6G 0 1.6G 0% /run/user/1000
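The later df output shows ample free space on /, but the long run of "No space left on device" errors above suggests the parser kept inserting while the disk filled. A small guard could pause ingestion before PostgreSQL starts failing writes (sketch; the 1 GiB threshold and the /var/lib/pgsql path are assumptions, not values from the repo):

import shutil

def has_free_space(path: str = "/var/lib/pgsql", min_free_bytes: int = 1 << 30) -> bool:
    # True when the filesystem hosting `path` still has at least `min_free_bytes` free.
    return shutil.disk_usage(path).free >= min_free_bytes

if not has_free_space():
    print("Low disk space on the PostgreSQL volume: pausing log ingestion")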

View File

@ -0,0 +1,48 @@
psql -h 127.0.0.1 -U ids_user -d ids_database -c "SELECT COUNT(*) AS totale_log FROM network_logs;"
totale_log
------------
593421
(1 row)
[root@ids python_ml]# psql -h 127.0.0.1 -U ids_user -d ids_database -c "SELECT timestamp, router_name, source_ip, destination_ip, protocol, action FROM network_logs ORDER BY
timestamp DESC LIMIT 10;"
timestamp | router_name | source_ip | destination_ip | protocol | action
---------------------+-------------+----------------+----------------+----------+---------
2025-11-17 18:52:08 | FIBRA | 185.203.25.162 | 191.101.79.84 | tcp | forward
2025-11-17 18:52:08 | FIBRA | 185.203.25.46 | 142.251.209.42 | udp | forward
2025-11-17 18:52:08 | FIBRA | 185.203.25.46 | 142.251.209.42 | udp | forward
2025-11-17 18:52:08 | FIBRA | 185.203.25.46 | 142.251.209.42 | udp | forward
2025-11-17 18:52:08 | FIBRA | 185.203.25.46 | 142.251.209.42 | udp | forward
2025-11-17 18:52:08 | FIBRA | 185.203.25.46 | 142.251.209.42 | udp | forward
2025-11-17 18:52:08 | FIBRA | 95.216.123.229 | 185.203.26.77 | udp | ddos
2025-11-17 18:52:08 | FIBRA | 95.216.123.229 | 185.203.26.77 | udp | ddos
2025-11-17 18:52:08 | FIBRA | 10.0.254.71 | 216.58.204.234 | udp | forward
2025-11-17 18:52:08 | FIBRA | 10.0.254.71 | 216.58.204.234 | udp | forward
(10 rows)
[root@ids python_ml]# psql -h 127.0.0.1 -U ids_user -d ids_database -c "SELECT router_name, COUNT(*) as log_count FROM network_logs GROUP BY router_name ORDER BY log_count DESC;"
router_name | log_count
-------------+-----------
FIBRA | 670503
(1 row)
[root@ids python_ml]# psql -h 127.0.0.1 -U ids_user -d ids_database -c "SELECT COUNT(*) as ddos_count FROM network_logs WHERE action = 'ddos';"
ddos_count
------------
305424
(1 row)
[root@ids python_ml]# psql -h 127.0.0.1 -U ids_user -d ids_database -c "SELECT source_ip, COUNT(*) as count FROM network_logs GROUP BY source_ip ORDER BY count DESC LIMIT 10;"
source_ip | count
----------------+--------
185.203.25.162 | 131859
198.251.84.34 | 110069
185.203.26.201 | 35745
185.203.25.233 | 19784
185.203.24.22 | 18885
82.62.84.108 | 13827
185.203.25.211 | 10483
10.1.0.254 | 9414
126.220.199.81 | 8574
185.203.25.50 | 8422
(10 rows)
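These psql spot checks (total rows, ddos share, top source IPs) are worth scripting so they can be rerun after every ingestion change. A sketch that issues the same queries from Python (assumes DATABASE_URL as in /opt/ids/.env; the queries mirror the ones run above):

import os
import psycopg2

QUERIES = {
    "total logs": "SELECT COUNT(*) FROM network_logs",
    "ddos logs": "SELECT COUNT(*) FROM network_logs WHERE action = 'ddos'",
    "top sources": "SELECT source_ip, COUNT(*) FROM network_logs GROUP BY source_ip ORDER BY 2 DESC LIMIT 10",
}

conn = psycopg2.connect(os.environ["DATABASE_URL"])
with conn, conn.cursor() as cur:
    for label, sql in QUERIES.items():
        cur.execute(sql)
        print(label, cur.fetchall())
conn.close()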

View File

@ -0,0 +1,54 @@
python train_hybrid.py --test
[WARNING] Extended Isolation Forest not available, using standard IF
======================================================================
IDS HYBRID ML TEST - SYNTHETIC DATA
======================================================================
INFO:dataset_loader:Creating sample dataset (10000 samples)...
INFO:dataset_loader:Sample dataset created: 10000 rows
INFO:dataset_loader:Attack distribution:
attack_type
normal 8981
brute_force 273
suspicious 258
ddos 257
port_scan 231
Name: count, dtype: int64
[TEST] Created synthetic dataset: 10000 samples
Normal: 8,981 (89.8%)
Attacks: 1,019 (10.2%)
[TEST] Training on 6,281 normal samples...
[HYBRID] Training hybrid model on 6281 logs...
❌ Error: 'timestamp'
Traceback (most recent call last):
File "/opt/ids/python_ml/venv/lib64/python3.11/site-packages/pandas/core/indexes/base.py", line 3790, in get_loc
return self._engine.get_loc(casted_key)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "index.pyx", line 152, in pandas._libs.index.IndexEngine.get_loc
File "index.pyx", line 181, in pandas._libs.index.IndexEngine.get_loc
File "pandas/_libs/hashtable_class_helper.pxi", line 7080, in pandas._libs.hashtable.PyObjectHashTable.get_item
File "pandas/_libs/hashtable_class_helper.pxi", line 7088, in pandas._libs.hashtable.PyObjectHashTable.get_item
KeyError: 'timestamp'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/opt/ids/python_ml/train_hybrid.py", line 361, in main
test_on_synthetic(args)
File "/opt/ids/python_ml/train_hybrid.py", line 249, in test_on_synthetic
detector.train_unsupervised(normal_train)
File "/opt/ids/python_ml/ml_hybrid_detector.py", line 204, in train_unsupervised
features_df = self.extract_features(logs_df)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/ids/python_ml/ml_hybrid_detector.py", line 98, in extract_features
logs_df['timestamp'] = pd.to_datetime(logs_df['timestamp'])
~~~~~~~^^^^^^^^^^^^^
File "/opt/ids/python_ml/venv/lib64/python3.11/site-packages/pandas/core/frame.py", line 3893, in __getitem__
indexer = self.columns.get_loc(key)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/ids/python_ml/venv/lib64/python3.11/site-packages/pandas/core/indexes/base.py", line 3797, in get_loc
raise KeyError(key) from err
KeyError: 'timestamp'
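The synthetic test fails because extract_features expects a timestamp column that the sample dataset apparently does not provide. One defensive approach is to normalise the column before feature extraction (illustrative sketch only, not the repo's actual patch; the fallback column name is an assumption):

import pandas as pd

def ensure_timestamp(logs_df: pd.DataFrame, fallback_col: str = "time") -> pd.DataFrame:
    # Make sure a 'timestamp' column exists before pd.to_datetime is applied.
    if "timestamp" not in logs_df.columns:
        if fallback_col in logs_df.columns:
            logs_df = logs_df.rename(columns={fallback_col: "timestamp"})
        else:
            # Last resort: synthesize a monotonic timestamp so extract_features can run on synthetic data.
            logs_df = logs_df.assign(timestamp=pd.date_range("2025-01-01", periods=len(logs_df), freq="s"))
    logs_df["timestamp"] = pd.to_datetime(logs_df["timestamp"])
    return logs_df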

View File

@ -0,0 +1,70 @@
sudo cp /var/lib/pgsql/data/pg_hba.conf /var/lib/pgsql/data/pg_hba.conf.backup_manual
[root@ids python_ml]# sudo sed -i '/^local.*ids_database.*ids_user/d' /var/lib/pgsql/data/pg_hba.conf
sudo sed -i '/^host.*ids_database.*ids_user/d' /var/lib/pgsql/data/pg_hba.conf
[root@ids python_ml]# cat << 'EOF' | sudo tee -a /var/lib/pgsql/data/pg_hba.conf
# IDS Database - SCRAM-SHA-256
local ids_database ids_user scram-sha-256
host ids_database ids_user 127.0.0.1/32 scram-sha-256
host ids_database ids_user ::1/128 scram-sha-256
EOF
# IDS Database - SCRAM-SHA-256
local ids_database ids_user scram-sha-256
host ids_database ids_user 127.0.0.1/32 scram-sha-256
host ids_database ids_user ::1/128 scram-sha-256
[root@ids python_ml]# systemctl reload postgresql
[root@ids python_ml]# sudo grep "ids_database" /var/lib/pgsql/data/pg_hba.conf
local ids_database ids_user scram-sha-256
host ids_database ids_user 127.0.0.1/32 scram-sha-256
host ids_database ids_user ::1/128 scram-sha-256
[root@ids python_ml]# export PGPASSWORD="TestPassword123"
[root@ids python_ml]# psql -h ::1 -U ids_user -d ids_database -c "SELECT 1;"
?column?
----------
1
(1 row)
[root@ids python_ml]# pkill -f syslog_parser
[root@ids python_ml]# sudo -u ids nohup python3.11 syslog_parser.py > /var/log/ids/syslog_parser.log 2>&1 &
[1] 59373
[root@ids python_ml]# tail -20 /var/log/ids/syslog_parser.log
nohup: ignoring input
=== SYSLOG PARSER PER ROUTER MIKROTIK ===
Pressione Ctrl+C per interrompere
[ERROR] Connessione database fallita: connection to server at "localhost" (::1), port 5432 failed: FATAL: password authentication failed for user "ids_user"
[INFO] Disconnesso da PostgreSQL
Traceback (most recent call last):
File "/opt/ids/python_ml/syslog_parser.py", line 215, in <module>
main()
File "/opt/ids/python_ml/syslog_parser.py", line 203, in main
parser.connect_db()
File "/opt/ids/python_ml/syslog_parser.py", line 48, in connect_db
self.conn = psycopg2.connect(**self.db_config)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ids/.local/lib/python3.11/site-packages/psycopg2/__init__.py", line 122, in connect
conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: FATAL: password authentication failed for user "ids_user"
[1]+ Exit 1 sudo -u ids nohup python3.11 syslog_parser.py > /var/log/ids/syslog_parser.log 2>&1
[root@ids python_ml]# tail -20 /var/log/ids/syslog_parser.log
nohup: ignoring input
=== SYSLOG PARSER PER ROUTER MIKROTIK ===
Pressione Ctrl+C per interrompere
[ERROR] Connessione database fallita: connection to server at "localhost" (::1), port 5432 failed: FATAL: password authentication failed for user "ids_user"
[INFO] Disconnesso da PostgreSQL
Traceback (most recent call last):
File "/opt/ids/python_ml/syslog_parser.py", line 215, in <module>
main()
File "/opt/ids/python_ml/syslog_parser.py", line 203, in main
parser.connect_db()
File "/opt/ids/python_ml/syslog_parser.py", line 48, in connect_db
self.conn = psycopg2.connect(**self.db_config)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ids/.local/lib/python3.11/site-packages/psycopg2/__init__.py", line 122, in connect
conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: FATAL: password authentication failed for user "ids_user"
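Authentication for ids_user fails again after the pg_hba.conf rewrite, even though the manual psql test with PGPASSWORD succeeded. A quick probe that authenticates with exactly the values the parser reads from /opt/ids/.env helps distinguish a stale stored password from a pg_hba problem (sketch; python-dotenv here is an assumption about how the .env gets loaded):

import os
import psycopg2
from dotenv import load_dotenv  # assumption: python-dotenv available to this script

load_dotenv("/opt/ids/.env")
try:
    conn = psycopg2.connect(
        host=os.getenv("PGHOST", "127.0.0.1"),
        port=os.getenv("PGPORT", "5432"),
        dbname=os.getenv("PGDATABASE"),
        user=os.getenv("PGUSER"),
        password=os.getenv("PGPASSWORD"),
    )
    print("auth OK with the credentials from .env")
    conn.close()
except psycopg2.OperationalError as exc:
    print(f"auth failed: {exc}")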

View File

@ -0,0 +1,208 @@
sudo systemctl restart ids-syslog-parser
Failed to restart ids-syslog-parser.service: Unit ids-syslog-parser.service not found.
[root@ids python_ml]# tail -10 /var/log/mikrotik/raw.log
forward: in:<pppoe-cava.pompe-1> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 24:5a:4c:3e:a8:2a, proto UDP, 10.0.249.130:44595->165.154.165.238:8800, len 68
forward: in:<pppoe-cava.pompe-1> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 24:5a:4c:3e:a8:2a, proto UDP, 10.0.249.130:44595->165.154.165.238:8800, len 68
forward: in:<pppoe-cava.pompe-1> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 24:5a:4c:3e:a8:2a, proto UDP, 10.0.249.130:44594->93.150.220.226:4917, len 72
forward: in:<pppoe-cava.pompe-1> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 24:5a:4c:3e:a8:2a, proto UDP, 10.0.249.130:44594->93.150.220.226:4917, len 72
forward: in:<pppoe-caronte.hightek_01> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.233:56352->192.168.25.254:80, len 60
forward: in:<pppoe-caronte.hightek_01> out:sfp-sfpplus2_VS_AS, connection-state:new proto TCP (SYN), 185.203.25.233:56352->192.168.25.254:80, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-alfabitomega>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 5.99.210.125:23084->185.203.24.2:10204, len 60
detected-ddos forward: in:sfp-sfpplus2_VS_AS out:<pppoe-alfabitomega>, connection-state:new src-mac 18:fd:74:7c:aa:85, proto TCP (SYN), 5.99.210.125:23084->185.203.24.2:10204, len 60
forward: in:<pppoe-1471_1115_nappicarol> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 84:d8:1b:68:6a:cc, proto UDP, 10.0.254.67:39651->142.250.180.142:443, len 1378
forward: in:<pppoe-1471_1115_nappicarol> out:sfp-sfpplus2_VS_AS, connection-state:new src-mac 84:d8:1b:68:6a:cc, proto UDP, 10.0.254.67:39651->142.250.180.142:443, len 1378
[root@ids python_ml]# nohup sudo -u ids python3 syslog_parser.py > /var/log/ids/syslog_parser.log 2>&1 &
[3] 13114
[root@ids python_ml]# tail -f /var/log/ids/syslog_parser.log
nohup: ignoring input
=== SYSLOG PARSER PER ROUTER MIKROTIK ===
Pressione Ctrl+C per interrompere
[DEBUG] Avvio syslog_parser...
[DEBUG] Caricamento .env da /opt/ids/.env...
[DEBUG] .env caricato
[DEBUG] Configurazione database:
[DEBUG] Host: localhost
[DEBUG] Port: 5432
[DEBUG] Database: ids_database
[DEBUG] User: ids_user
[DEBUG] File log: /var/log/mikrotik/raw.log
[INFO] File log trovato: /var/log/mikrotik/raw.log
[DEBUG] Creazione parser...
[DEBUG] Connessione database...
[INFO] Connesso a PostgreSQL
[INFO] Avvio processamento log (modalità follow)...
[INFO] Processando /var/log/mikrotik/raw.log (follow=True)
[INFO] Processate 100 righe, salvate 0 log
[INFO] Processate 200 righe, salvate 0 log
[INFO] Processate 300 righe, salvate 0 log
[INFO] Processate 400 righe, salvate 0 log
[INFO] Processate 500 righe, salvate 0 log
[INFO] Processate 600 righe, salvate 0 log
[INFO] Processate 700 righe, salvate 0 log
[INFO] Processate 800 righe, salvate 0 log
[INFO] Processate 900 righe, salvate 0 log
[INFO] Processate 1000 righe, salvate 0 log
[INFO] Processate 1100 righe, salvate 0 log
[INFO] Processate 1200 righe, salvate 0 log
[INFO] Processate 1300 righe, salvate 0 log
[INFO] Processate 1400 righe, salvate 0 log
[INFO] Processate 1500 righe, salvate 0 log
[INFO] Processate 1600 righe, salvate 0 log
[INFO] Processate 1700 righe, salvate 0 log
[INFO] Processate 1800 righe, salvate 0 log
[INFO] Processate 1900 righe, salvate 0 log
[INFO] Processate 2000 righe, salvate 0 log
[INFO] Processate 2100 righe, salvate 0 log
[INFO] Processate 2200 righe, salvate 0 log
[INFO] Processate 2300 righe, salvate 0 log
[INFO] Processate 2400 righe, salvate 0 log
[INFO] Processate 2500 righe, salvate 0 log
[INFO] Processate 2600 righe, salvate 0 log
[INFO] Processate 2700 righe, salvate 0 log
[INFO] Processate 2800 righe, salvate 0 log
[INFO] Processate 2900 righe, salvate 0 log
[INFO] Processate 3000 righe, salvate 0 log
[INFO] Processate 3100 righe, salvate 0 log
[INFO] Processate 3200 righe, salvate 0 log
[INFO] Processate 3300 righe, salvate 0 log
[INFO] Processate 3400 righe, salvate 0 log
[INFO] Processate 3500 righe, salvate 0 log
[INFO] Processate 3600 righe, salvate 0 log
[INFO] Processate 3700 righe, salvate 0 log
[INFO] Processate 3800 righe, salvate 0 log
[INFO] Processate 3900 righe, salvate 0 log
[INFO] Processate 4000 righe, salvate 0 log
[INFO] Processate 4100 righe, salvate 0 log
[INFO] Processate 4200 righe, salvate 0 log
[INFO] Processate 4300 righe, salvate 0 log
[INFO] Processate 4400 righe, salvate 0 log
[INFO] Processate 4500 righe, salvate 0 log
[INFO] Processate 4600 righe, salvate 0 log
[INFO] Processate 4700 righe, salvate 0 log
[INFO] Processate 4800 righe, salvate 0 log
[INFO] Processate 4900 righe, salvate 0 log
[INFO] Processate 5000 righe, salvate 0 log
[INFO] Processate 5100 righe, salvate 0 log
[INFO] Processate 5200 righe, salvate 0 log
[INFO] Processate 5300 righe, salvate 0 log
[INFO] Processate 5400 righe, salvate 0 log
[INFO] Processate 5500 righe, salvate 0 log
[INFO] Processate 5600 righe, salvate 0 log
[INFO] Processate 5700 righe, salvate 0 log
[INFO] Processate 5800 righe, salvate 0 log
[INFO] Processate 5900 righe, salvate 0 log
[INFO] Processate 6000 righe, salvate 0 log
[INFO] Processate 6100 righe, salvate 0 log
[INFO] Processate 6200 righe, salvate 0 log
[INFO] Processate 6300 righe, salvate 0 log
[INFO] Processate 6400 righe, salvate 0 log
[INFO] Processate 6500 righe, salvate 0 log
[INFO] Processate 6600 righe, salvate 0 log
[INFO] Processate 6700 righe, salvate 0 log
[INFO] Processate 6800 righe, salvate 0 log
[INFO] Processate 6900 righe, salvate 0 log
[INFO] Processate 7000 righe, salvate 0 log
[INFO] Processate 7100 righe, salvate 0 log
[INFO] Processate 7200 righe, salvate 0 log
[INFO] Processate 7300 righe, salvate 0 log
[INFO] Processate 7400 righe, salvate 0 log
[INFO] Processate 7500 righe, salvate 0 log
[INFO] Processate 7600 righe, salvate 0 log
[INFO] Processate 7700 righe, salvate 0 log
[INFO] Processate 7800 righe, salvate 0 log
[INFO] Processate 7900 righe, salvate 0 log
[INFO] Processate 8000 righe, salvate 0 log
[INFO] Processate 8100 righe, salvate 0 log
[INFO] Processate 8200 righe, salvate 0 log
[INFO] Processate 8300 righe, salvate 0 log
[INFO] Processate 8400 righe, salvate 0 log
[INFO] Processate 8500 righe, salvate 0 log
[INFO] Processate 8600 righe, salvate 0 log
[INFO] Processate 8700 righe, salvate 0 log
[INFO] Processate 8800 righe, salvate 0 log
[INFO] Processate 8900 righe, salvate 0 log
[INFO] Processate 9000 righe, salvate 0 log
[INFO] Processate 9100 righe, salvate 0 log
[INFO] Processate 9200 righe, salvate 0 log
[INFO] Processate 9300 righe, salvate 0 log
[INFO] Processate 9400 righe, salvate 0 log
[INFO] Processate 9500 righe, salvate 0 log
[INFO] Processate 9600 righe, salvate 0 log
[INFO] Processate 9700 righe, salvate 0 log
[INFO] Processate 9800 righe, salvate 0 log
[INFO] Processate 9900 righe, salvate 0 log
[INFO] Processate 10000 righe, salvate 0 log
[INFO] Processate 10100 righe, salvate 0 log
[INFO] Processate 10200 righe, salvate 0 log
[INFO] Processate 10300 righe, salvate 0 log
[INFO] Processate 10400 righe, salvate 0 log
[INFO] Processate 10500 righe, salvate 0 log
[INFO] Processate 10600 righe, salvate 0 log
[INFO] Processate 10700 righe, salvate 0 log
[INFO] Processate 10800 righe, salvate 0 log
[INFO] Processate 10900 righe, salvate 0 log
[INFO] Processate 11000 righe, salvate 0 log
[INFO] Processate 11100 righe, salvate 0 log
[INFO] Processate 11200 righe, salvate 0 log
[INFO] Processate 11300 righe, salvate 0 log
[INFO] Processate 11400 righe, salvate 0 log
[INFO] Processate 11500 righe, salvate 0 log
[INFO] Processate 11600 righe, salvate 0 log
[INFO] Processate 11700 righe, salvate 0 log
[INFO] Processate 11800 righe, salvate 0 log
[INFO] Processate 11900 righe, salvate 0 log
[INFO] Processate 12000 righe, salvate 0 log
[INFO] Processate 12100 righe, salvate 0 log
[INFO] Processate 12200 righe, salvate 0 log
[INFO] Processate 12300 righe, salvate 0 log
[INFO] Processate 12400 righe, salvate 0 log
[INFO] Processate 12500 righe, salvate 0 log
[INFO] Processate 12600 righe, salvate 0 log
[INFO] Processate 12700 righe, salvate 0 log
[INFO] Processate 12800 righe, salvate 0 log
[INFO] Processate 12900 righe, salvate 0 log
[INFO] Processate 13000 righe, salvate 0 log
[INFO] Processate 13100 righe, salvate 0 log
[INFO] Processate 13200 righe, salvate 0 log
[INFO] Processate 13300 righe, salvate 0 log
[INFO] Processate 13400 righe, salvate 0 log
[INFO] Processate 13500 righe, salvate 0 log
[INFO] Processate 13600 righe, salvate 0 log
[INFO] Processate 13700 righe, salvate 0 log
[INFO] Processate 13800 righe, salvate 0 log
[INFO] Processate 13900 righe, salvate 0 log
[INFO] Processate 14000 righe, salvate 0 log
[INFO] Processate 14100 righe, salvate 0 log
[INFO] Processate 14200 righe, salvate 0 log
[INFO] Processate 14300 righe, salvate 0 log
[INFO] Processate 14400 righe, salvate 0 log
[INFO] Processate 14500 righe, salvate 0 log
[INFO] Processate 14600 righe, salvate 0 log
[INFO] Processate 14700 righe, salvate 0 log
[INFO] Processate 14800 righe, salvate 0 log
[INFO] Processate 14900 righe, salvate 0 log
[INFO] Processate 15000 righe, salvate 0 log
[INFO] Processate 15100 righe, salvate 0 log
[INFO] Processate 15200 righe, salvate 0 log
[INFO] Processate 15300 righe, salvate 0 log
[INFO] Processate 15400 righe, salvate 0 log
[INFO] Processate 15500 righe, salvate 0 log
[INFO] Processate 15600 righe, salvate 0 log
[INFO] Processate 15700 righe, salvate 0 log
[INFO] Processate 15800 righe, salvate 0 log
[INFO] Processate 15900 righe, salvate 0 log
[INFO] Processate 16000 righe, salvate 0 log
[INFO] Processate 16100 righe, salvate 0 log
[INFO] Processate 16200 righe, salvate 0 log
[INFO] Processate 16300 righe, salvate 0 log
[INFO] Processate 16400 righe, salvate 0 log
[INFO] Processate 16500 righe, salvate 0 log
[INFO] Processate 16600 righe, salvate 0 log
[INFO] Processate 16700 righe, salvate 0 log
[INFO] Processate 16800 righe, salvate 0 log
[INFO] Processate 16900 righe, salvate 0 log
[INFO] Processate 17000 righe, salvate 0 log
^C
[root@ids python_ml]# grep "TIMESTAMP" /etc/rsyslog.d/99-mikrotik.conf
[root@ids python_ml]# grep "TIMESTAMP" /etc/rsyslog.d/99-mikrotik.conf
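Over 17,000 lines were read but none were saved, and the grep for TIMESTAMP in /etc/rsyslog.d/99-mikrotik.conf returns nothing, which points at a mismatch between the raw.log line format and whatever prefix the parser expects. The pattern below only illustrates that failure mode (it is not the regex used by syslog_parser.py): if an expected syslog timestamp/host prefix is absent, nothing matches and nothing is saved.

import re

# Hypothetical pattern with an *optional* syslog prefix; field names are illustrative.
MIKROTIK_RE = re.compile(
    r"(?:(?P<ts>\w{3}\s+\d+\s[\d:]{8})\s+\S+\s+)?"          # optional "Jan 02 12:40:03 host"
    r"(?P<action>detected-ddos forward|forward|input|drop):.*?"
    r"proto (?P<proto>\w+).*?"
    r"(?P<src>\d+\.\d+\.\d+\.\d+):\d+->(?P<dst>\d+\.\d+\.\d+\.\d+):\d+"
)

line = "forward: in:<pppoe-x> out:sfp, connection-state:new proto TCP (SYN), 1.2.3.4:56352->5.6.7.8:80, len 60"
m = MIKROTIK_RE.search(line)
print(m.groupdict() if m else "no match -> line skipped, nothing saved")

Making the prefix optional in the parser, or adding a timestamp template to the rsyslog rule, are the two obvious directions for a fix.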

View File

@ -0,0 +1,76 @@
systemctl status ids-ml-backend
Unit ids-ml-backend.service could not be found.
[root@ids ~]# ps aux | grep "python.*main.py"
ids 1547 6.0 4.1 2205816 668884 ? Sl Nov21 55:37 /usr/bin/python3.11 main.py
root 13688 0.0 0.0 3884 2304 pts/5 S+ 10:08 0:00 grep --color=auto python.*main.py
[root@ids ~]# tail -50 /var/log/ids/ml_backend.log
tail: cannot open '/var/log/ids/ml_backend.log' for reading: No such file or directory
[root@ids ~]# curl http://localhost:8000/health
{"status":"healthy","database":"connected","ml_model":"loaded","timestamp":"2025-11-22T10:09:55.941962"}[root@ids ~]#
[root@ids ~]# sudo crontab -u ids -l | grep train
0 */12 * * * /opt/ids/deployment/cron_train.sh
[root@ids ~]# # Verifica storico training
psql $DATABASE_URL -c "SELECT id, model_version, records_processed, status, notes, trained_at FROM training_history ORDER BY trained_at DESC LIMIT 5;"
psql: error: FATAL: role "root" does not exist
[root@ids ~]# cd /opt/ids/
[root@ids ids]# cat .env
# Database PostgreSQL
PGHOST=localhost
PGPORT=5432
PGDATABASE=ids_database
PGUSER=ids_user
PGPASSWORD=TestPassword123
DATABASE_URL=postgresql://ids_user:TestPassword123@127.0.0.1:5432/ids_database
# Session Secret (genera una stringa random sicura)
SESSION_SECRET=zLMzP8lLgjgz/NlgfDXuLK8bwHCod+o5zLOWP5DipRM=
# Python Backend URL (per frontend)
VITE_PYTHON_API_URL=http://localhost:8000
# Node Environment
NODE_ENV=production
[root@ids ids]# DATABASE_URL=postgresql://ids_user:TestPassword123@127.0.0.1:5432/ids_database
[root@ids ids]# cat .env
# Database PostgreSQL
PGHOST=localhost
PGPORT=5432
PGDATABASE=ids_database
PGUSER=ids_user
PGPASSWORD=TestPassword123
DATABASE_URL=postgresql://ids_user:TestPassword123@127.0.0.1:5432/ids_database
# Session Secret (genera una stringa random sicura)
SESSION_SECRET=zLMzP8lLgjgz/NlgfDXuLK8bwHCod+o5zLOWP5DipRM=
# Python Backend URL (per frontend)
VITE_PYTHON_API_URL=http://localhost:8000
# Node Environment
NODE_ENV=production
[root@ids ids]# psql $DATABASE_URL -c "SELECT id, model_version, records_processed, status, notes, trained_at FROM training_history ORDER BY trained_at DESC LIMIT 5;"
id | model_version | records_processed | status | notes | trained_at
----+---------------+-------------------+--------+-------+------------
(0 rows)
[root@ids ids]# # Trova dove sta loggando il processo
lsof -p 1547 | grep log
python3.1 1547 ids mem REG 253,0 187881 1053730 /home/ids/.local/lib/python3.11/site-packages/sklearn/utils/_logistic_sigmoid.cpython-311-x86_64-linux-gnu.so
python3.1 1547 ids 1w REG 253,0 1546719 538992839 /var/log/ids/backend.log
python3.1 1547 ids 2w REG 253,0 1546719 538992839 /var/log/ids/backend.log
[root@ids ids]# tail -f /var/log/ids/backend.log
📚 Docs available at http://0.0.0.0:8000/docs
INFO: 127.0.0.1:40168 - "POST /detect HTTP/1.1" 200 OK
INFO: 127.0.0.1:57698 - "GET /stats HTTP/1.1" 200 OK
INFO: 127.0.0.1:56726 - "GET /stats HTTP/1.1" 200 OK
INFO: 127.0.0.1:41940 - "GET /stats HTTP/1.1" 200 OK
INFO: 127.0.0.1:39840 - "GET /stats HTTP/1.1" 200 OK
INFO: 127.0.0.1:55900 - "GET /stats HTTP/1.1" 200 OK
INFO: 127.0.0.1:43422 - "GET /stats HTTP/1.1" 200 OK
INFO: 127.0.0.1:33580 - "GET /stats HTTP/1.1" 200 OK
INFO: 127.0.0.1:55752 - "GET /stats HTTP/1.1" 200 OK
^C
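There is no ids-ml-backend systemd unit, so the backend runs as a bare nohup process, and lsof shows its stdout/stderr go to /var/log/ids/backend.log rather than the expected ml_backend.log. The same "where is this PID logging" check can be scripted (Linux-only sketch; PID 1547 is the one from the ps output above):

import os

pid = 1547  # PID of the running main.py, taken from the ps output above
for fd in ("1", "2"):
    # /proc/<pid>/fd/1 and /fd/2 are symlinks to the files receiving stdout and stderr.
    print(f"fd {fd} ->", os.readlink(f"/proc/{pid}/fd/{fd}"))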

View File

@ -0,0 +1,190 @@
tail -30 /var/log/ids/frontend.log
5:16:03 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:16:13 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
5:16:13 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:16:23 PM [express] GET /api/training-history 500 in 4ms :: {"error":"Failed to fetch training hist…
5:16:23 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:16:33 PM [express] GET /api/training-history 500 in 4ms :: {"error":"Failed to fetch training hist…
5:16:33 PM [express] GET /api/ml/stats 304 in 16ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:16:49 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
5:16:49 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:16:59 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
5:16:59 PM [express] GET /api/ml/stats 304 in 16ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:17:09 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
5:17:09 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:19:18 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
5:19:18 PM [express] GET /api/ml/stats 200 in 18ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:19:28 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
5:19:28 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:19:38 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
5:19:38 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:19:48 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
5:19:48 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
[Fri Nov 21 17:20:33 CET 2025] Frontend Node NON attivo, riavvio...
[Fri Nov 21 17:20:35 CET 2025] Frontend riavviato con PID: 11385
> rest-express@1.0.0 dev
> NODE_ENV=development tsx server/index.ts
🐘 Using standard PostgreSQL database
5:20:37 PM [express] serving on port 5000
✅ Database connection successful
[root@ids ~]# tail -30 /var/log/ids/frontend.log
5:16:03 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:16:13 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
5:16:13 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:16:23 PM [express] GET /api/training-history 500 in 4ms :: {"error":"Failed to fetch training hist…
5:16:23 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:16:33 PM [express] GET /api/training-history 500 in 4ms :: {"error":"Failed to fetch training hist…
5:16:33 PM [express] GET /api/ml/stats 304 in 16ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:16:49 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
5:16:49 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:16:59 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
5:16:59 PM [express] GET /api/ml/stats 304 in 16ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:17:09 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
5:17:09 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:19:18 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
5:19:18 PM [express] GET /api/ml/stats 200 in 18ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:19:28 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
5:19:28 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:19:38 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
5:19:38 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:19:48 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
5:19:48 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
[Fri Nov 21 17:20:33 CET 2025] Frontend Node NON attivo, riavvio...
[Fri Nov 21 17:20:35 CET 2025] Frontend riavviato con PID: 11385
> rest-express@1.0.0 dev
> NODE_ENV=development tsx server/index.ts
🐘 Using standard PostgreSQL database
5:20:37 PM [express] serving on port 5000
✅ Database connection successful
[root@ids ~]# tail -30 /var/log/ids/frontend.log
5:16:49 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
5:16:49 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:16:59 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
5:16:59 PM [express] GET /api/ml/stats 304 in 16ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:17:09 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
5:17:09 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:19:18 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
5:19:18 PM [express] GET /api/ml/stats 200 in 18ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:19:28 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
5:19:28 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:19:38 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
5:19:38 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:19:48 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
5:19:48 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
[Fri Nov 21 17:20:33 CET 2025] Frontend Node NON attivo, riavvio...
[Fri Nov 21 17:20:35 CET 2025] Frontend riavviato con PID: 11385
> rest-express@1.0.0 dev
> NODE_ENV=development tsx server/index.ts
🐘 Using standard PostgreSQL database
5:20:37 PM [express] serving on port 5000
✅ Database connection successful
A PostCSS plugin did not pass the `from` option to `postcss.parse`. This may cause imported assets to be incorrectly transformed. If you've recently added a PostCSS
plugin that raised this warning, please contact the package author to fix the issue.
5:21:01 PM [express] GET /api/training-history 200 in 34ms :: []
5:21:01 PM [express] GET /api/ml/stats 304 in 39ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:21:04 PM [express] POST /api/ml/train 200 in 14ms :: {"message":"Training avviato in background","…
5:21:04 PM [express] GET /api/training-history 304 in 3ms :: []
5:21:04 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
[root@ids ~]# tail -30 /var/log/ids/frontend.log
5:16:49 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
5:16:49 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:16:59 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
5:16:59 PM [express] GET /api/ml/stats 304 in 16ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:17:09 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
5:17:09 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:19:18 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
5:19:18 PM [express] GET /api/ml/stats 200 in 18ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:19:28 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
5:19:28 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:19:38 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
5:19:38 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:19:48 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
5:19:48 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
[Fri Nov 21 17:20:33 CET 2025] Frontend Node NON attivo, riavvio...
[Fri Nov 21 17:20:35 CET 2025] Frontend riavviato con PID: 11385
> rest-express@1.0.0 dev
> NODE_ENV=development tsx server/index.ts
🐘 Using standard PostgreSQL database
5:20:37 PM [express] serving on port 5000
✅ Database connection successful
A PostCSS plugin did not pass the `from` option to `postcss.parse`. This may cause imported assets to be incorrectly transformed. If you've recently added a PostCSS
plugin that raised this warning, please contact the package author to fix the issue.
5:21:01 PM [express] GET /api/training-history 200 in 34ms :: []
5:21:01 PM [express] GET /api/ml/stats 304 in 39ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:21:04 PM [express] POST /api/ml/train 200 in 14ms :: {"message":"Training avviato in background","…
5:21:04 PM [express] GET /api/training-history 304 in 3ms :: []
5:21:04 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
[root@ids ~]# tail -30 /var/log/ids/frontend.log
5:16:49 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
5:16:49 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:16:59 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
5:16:59 PM [express] GET /api/ml/stats 304 in 16ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:17:09 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
5:17:09 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:19:18 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
5:19:18 PM [express] GET /api/ml/stats 200 in 18ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:19:28 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
5:19:28 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:19:38 PM [express] GET /api/training-history 500 in 5ms :: {"error":"Failed to fetch training hist…
5:19:38 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:19:48 PM [express] GET /api/training-history 500 in 3ms :: {"error":"Failed to fetch training hist…
5:19:48 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
[Fri Nov 21 17:20:33 CET 2025] Frontend Node NON attivo, riavvio...
[Fri Nov 21 17:20:35 CET 2025] Frontend riavviato con PID: 11385
> rest-express@1.0.0 dev
> NODE_ENV=development tsx server/index.ts
🐘 Using standard PostgreSQL database
5:20:37 PM [express] serving on port 5000
✅ Database connection successful
A PostCSS plugin did not pass the `from` option to `postcss.parse`. This may cause imported assets to be incorrectly transformed. If you've recently added a PostCSS
plugin that raised this warning, please contact the package author to fix the issue.
5:21:01 PM [express] GET /api/training-history 200 in 34ms :: []
5:21:01 PM [express] GET /api/ml/stats 304 in 39ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:21:04 PM [express] POST /api/ml/train 200 in 14ms :: {"message":"Training avviato in background","…
5:21:04 PM [express] GET /api/training-history 304 in 3ms :: []
5:21:04 PM [express] GET /api/ml/stats 304 in 15ms :: {"logs":{"total":0,"last_hour":0},"detections"…
[root@ids ~]# tail -30 /var/log/ids/frontend.log
at async <anonymous> (/opt/ids/server/routes.ts:10:23) {
length: 109,
severity: 'ERROR',
code: '42703',
detail: undefined,
hint: undefined,
position: '83',
internalPosition: undefined,
internalQuery: undefined,
where: undefined,
schema: undefined,
table: undefined,
column: undefined,
dataType: undefined,
constraint: undefined,
file: 'parse_relation.c',
line: '3562',
routine: 'errorMissingColumn'
}
5:21:31 PM [express] GET /api/routers 500 in 12ms :: {"error":"Failed to fetch routers"}
5:21:32 PM [express] GET /api/training-history 304 in 15ms :: []
5:21:33 PM [express] GET /api/ml/stats 304 in 16ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:21:43 PM [express] GET /api/training-history 304 in 14ms :: []
5:21:43 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:21:44 PM [express] GET /api/detections 304 in 4ms :: []
5:21:46 PM [express] GET /api/training-history 304 in 4ms :: []
5:21:46 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
5:21:48 PM [express] GET /api/detections 304 in 2ms :: []
5:21:50 PM [express] GET /api/training-history 304 in 5ms :: []
5:21:50 PM [express] GET /api/ml/stats 304 in 19ms :: {"logs":{"total":0,"last_hour":0},"detections"…
[root@ids ~]# tail -30 /var/log/ids/frontend.log

View File

@ -0,0 +1,32 @@
tail -30 /var/log/ids/frontend.log
🐘 Using standard PostgreSQL database
6:31:19 PM [express] serving on port 5000
✅ Database connection successful
A PostCSS plugin did not pass the `from` option to `postcss.parse`. This may cause imported assets to be incorrectly transformed. If you've recently added a PostCSS plugin that raised this warning, please contact the package author to fix the issue.
6:34:07 PM [express] GET /api/routers 304 in 29ms :: [{"id":"77031e0b-ef65-4be7-9767-7220c762232f","…
6:34:09 PM [express] GET /api/detections 304 in 5ms :: []
6:34:11 PM [express] GET /api/training-history 200 in 13ms :: []
6:34:11 PM [express] GET /api/ml/stats 304 in 40ms :: {"logs":{"total":0,"last_hour":0},"detections"…
6:34:21 PM [express] GET /api/training-history 304 in 12ms :: []
6:34:21 PM [express] GET /api/ml/stats 304 in 16ms :: {"logs":{"total":0,"last_hour":0},"detections"…
6:34:31 PM [express] GET /api/training-history 304 in 18ms :: []
6:34:31 PM [express] GET /api/ml/stats 304 in 20ms :: {"logs":{"total":0,"last_hour":0},"detections"…
6:34:41 PM [express] GET /api/training-history 304 in 12ms :: []
6:34:41 PM [express] GET /api/ml/stats 304 in 19ms :: {"logs":{"total":0,"last_hour":0},"detections"…
6:34:51 PM [express] GET /api/training-history 304 in 14ms :: []
6:34:51 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
6:35:01 PM [express] GET /api/training-history 304 in 12ms :: []
6:35:01 PM [express] GET /api/ml/stats 304 in 16ms :: {"logs":{"total":0,"last_hour":0},"detections"…
6:35:11 PM [express] GET /api/training-history 304 in 12ms :: []
6:35:11 PM [express] GET /api/ml/stats 304 in 16ms :: {"logs":{"total":0,"last_hour":0},"detections"…
6:35:21 PM [express] GET /api/training-history 304 in 13ms :: []
6:35:21 PM [express] GET /api/ml/stats 304 in 18ms :: {"logs":{"total":0,"last_hour":0},"detections"…
6:35:31 PM [express] GET /api/training-history 304 in 13ms :: []
6:35:31 PM [express] GET /api/ml/stats 304 in 17ms :: {"logs":{"total":0,"last_hour":0},"detections"…
6:35:41 PM [express] GET /api/training-history 304 in 12ms :: []
6:35:41 PM [express] GET /api/ml/stats 304 in 16ms :: {"logs":{"total":0,"last_hour":0},"detections"…
6:35:51 PM [express] GET /api/training-history 304 in 13ms :: []
6:35:51 PM [express] GET /api/ml/stats 304 in 16ms :: {"logs":{"total":0,"last_hour":0},"detections"…
[root@ids ~]#

View File

@ -0,0 +1,51 @@
tail -50 /var/log/cron
Nov 18 08:50:01 ids CROND[78427]: (ids) CMD (cd /opt/ids/python_ml && /usr/bin/python3.11 -c "import requests; requests.post('http://localhost:8000/detect', json={'max_records': 5000, 'auto_block': True, 'risk_threshold': 75})" >> /var/log/ids/detect.log 2>&1)
Nov 18 08:50:01 ids CROND[78404]: (ids) CMDEND (cd /opt/ids/python_ml && /usr/bin/python3.11 -c "import requests; requests.post('http://localhost:8000/detect', json={'max_records': 5000, 'auto_block': True, 'risk_threshold': 75})" >> /var/log/ids/detect.log 2>&1)
Nov 18 08:50:01 ids CROND[78402]: (ids) CMDEND (/opt/ids/deployment/check_frontend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 08:50:01 ids CROND[78403]: (ids) CMDEND (/opt/ids/deployment/check_backend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 08:55:01 ids CROND[78522]: (ids) CMD (/opt/ids/deployment/check_frontend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 08:55:01 ids CROND[78523]: (ids) CMD (cd /opt/ids/python_ml && /usr/bin/python3.11 -c "import requests; requests.post('http://localhost:8000/detect', json={'max_records': 5000, 'auto_block': True, 'risk_threshold': 75})" >> /var/log/ids/detect.log 2>&1)
Nov 18 08:55:01 ids CROND[78527]: (ids) CMD (/opt/ids/deployment/check_backend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 08:55:01 ids CROND[78507]: (ids) CMDEND (cd /opt/ids/python_ml && /usr/bin/python3.11 -c "import requests; requests.post('http://localhost:8000/detect', json={'max_records': 5000, 'auto_block': True, 'risk_threshold': 75})" >> /var/log/ids/detect.log 2>&1)
Nov 18 08:55:01 ids CROND[78506]: (ids) CMDEND (/opt/ids/deployment/check_backend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 08:55:01 ids CROND[78505]: (ids) CMDEND (/opt/ids/deployment/check_frontend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 09:00:02 ids CROND[78620]: (ids) CMD (cd /opt/ids/python_ml && /usr/bin/python3.11 -c "import requests; requests.post('http://localhost:8000/detect', json={'max_records': 5000, 'auto_block': True, 'risk_threshold': 75})" >> /var/log/ids/detect.log 2>&1)
Nov 18 09:00:02 ids CROND[78622]: (ids) CMD (/opt/ids/deployment/check_frontend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 09:00:02 ids CROND[78626]: (ids) CMD (/opt/ids/deployment/check_backend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 09:00:02 ids CROND[78605]: (ids) CMDEND (cd /opt/ids/python_ml && /usr/bin/python3.11 -c "import requests; requests.post('http://localhost:8000/detect', json={'max_records': 5000, 'auto_block': True, 'risk_threshold': 75})" >> /var/log/ids/detect.log 2>&1)
Nov 18 09:00:02 ids CROND[78604]: (ids) CMDEND (/opt/ids/deployment/check_backend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 09:00:02 ids CROND[78603]: (ids) CMDEND (/opt/ids/deployment/check_frontend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 09:01:01 ids CROND[78698]: (root) CMD (run-parts /etc/cron.hourly)
Nov 18 09:01:02 ids run-parts[78701]: (/etc/cron.hourly) starting 0anacron
Nov 18 09:01:02 ids run-parts[78707]: (/etc/cron.hourly) finished 0anacron
Nov 18 09:01:02 ids CROND[78697]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 18 09:05:01 ids CROND[78733]: (ids) CMD (/opt/ids/deployment/check_backend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 09:05:01 ids CROND[78736]: (ids) CMD (cd /opt/ids/python_ml && /usr/bin/python3.11 -c "import requests; requests.post('http://localhost:8000/detect', json={'max_records': 5000, 'auto_block': True, 'risk_threshold': 75})" >> /var/log/ids/detect.log 2>&1)
Nov 18 09:05:01 ids CROND[78738]: (ids) CMD (/opt/ids/deployment/check_frontend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 09:05:01 ids CROND[78719]: (ids) CMDEND (cd /opt/ids/python_ml && /usr/bin/python3.11 -c "import requests; requests.post('http://localhost:8000/detect', json={'max_records': 5000, 'auto_block': True, 'risk_threshold': 75})" >> /var/log/ids/detect.log 2>&1)
Nov 18 09:05:01 ids CROND[78718]: (ids) CMDEND (/opt/ids/deployment/check_backend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 09:05:01 ids CROND[78717]: (ids) CMDEND (/opt/ids/deployment/check_frontend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 09:10:01 ids CROND[78836]: (ids) CMD (/opt/ids/deployment/check_frontend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 09:10:01 ids CROND[78838]: (ids) CMD (cd /opt/ids/python_ml && /usr/bin/python3.11 -c "import requests; requests.post('http://localhost:8000/detect', json={'max_records': 5000, 'auto_block': True, 'risk_threshold': 75})" >> /var/log/ids/detect.log 2>&1)
Nov 18 09:10:01 ids CROND[78843]: (ids) CMD (/opt/ids/deployment/check_backend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 09:10:01 ids CROND[78822]: (ids) CMDEND (cd /opt/ids/python_ml && /usr/bin/python3.11 -c "import requests; requests.post('http://localhost:8000/detect', json={'max_records': 5000, 'auto_block': True, 'risk_threshold': 75})" >> /var/log/ids/detect.log 2>&1)
Nov 18 09:10:01 ids CROND[78820]: (ids) CMDEND (/opt/ids/deployment/check_frontend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 09:10:01 ids CROND[78821]: (ids) CMDEND (/opt/ids/deployment/check_backend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 09:15:02 ids CROND[78939]: (ids) CMD (/opt/ids/deployment/check_frontend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 09:15:02 ids CROND[78941]: (ids) CMD (cd /opt/ids/python_ml && /usr/bin/python3.11 -c "import requests; requests.post('http://localhost:8000/detect', json={'max_records': 5000, 'auto_block': True, 'risk_threshold': 75})" >> /var/log/ids/detect.log 2>&1)
Nov 18 09:15:02 ids CROND[78943]: (ids) CMD (/opt/ids/deployment/check_backend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 09:15:02 ids CROND[78925]: (ids) CMDEND (cd /opt/ids/python_ml && /usr/bin/python3.11 -c "import requests; requests.post('http://localhost:8000/detect', json={'max_records': 5000, 'auto_block': True, 'risk_threshold': 75})" >> /var/log/ids/detect.log 2>&1)
Nov 18 09:15:02 ids CROND[78924]: (ids) CMDEND (/opt/ids/deployment/check_backend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 09:15:02 ids CROND[78923]: (ids) CMDEND (/opt/ids/deployment/check_frontend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 09:20:01 ids CROND[79040]: (ids) CMD (/opt/ids/deployment/check_backend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 09:20:01 ids CROND[79042]: (ids) CMD (cd /opt/ids/python_ml && /usr/bin/python3.11 -c "import requests; requests.post('http://localhost:8000/detect', json={'max_records': 5000, 'auto_block': True, 'risk_threshold': 75})" >> /var/log/ids/detect.log 2>&1)
Nov 18 09:20:01 ids CROND[79044]: (ids) CMD (/opt/ids/deployment/check_frontend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 09:20:01 ids CROND[79025]: (ids) CMDEND (cd /opt/ids/python_ml && /usr/bin/python3.11 -c "import requests; requests.post('http://localhost:8000/detect', json={'max_records': 5000, 'auto_block': True, 'risk_threshold': 75})" >> /var/log/ids/detect.log 2>&1)
Nov 18 09:20:01 ids CROND[79023]: (ids) CMDEND (/opt/ids/deployment/check_frontend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 09:20:01 ids CROND[79024]: (ids) CMDEND (/opt/ids/deployment/check_backend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 09:25:01 ids CROND[79143]: (ids) CMD (cd /opt/ids/python_ml && /usr/bin/python3.11 -c "import requests; requests.post('http://localhost:8000/detect', json={'max_records': 5000, 'auto_block': True, 'risk_threshold': 75})" >> /var/log/ids/detect.log 2>&1)
Nov 18 09:25:01 ids CROND[79145]: (ids) CMD (/opt/ids/deployment/check_frontend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 09:25:01 ids CROND[79146]: (ids) CMD (/opt/ids/deployment/check_backend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 09:25:01 ids CROND[79128]: (ids) CMDEND (cd /opt/ids/python_ml && /usr/bin/python3.11 -c "import requests; requests.post('http://localhost:8000/detect', json={'max_records': 5000, 'auto_block': True, 'risk_threshold': 75})" >> /var/log/ids/detect.log 2>&1)
Nov 18 09:25:01 ids CROND[79126]: (ids) CMDEND (/opt/ids/deployment/check_frontend.sh >> /var/log/ids/cron.log 2>&1)
Nov 18 09:25:01 ids CROND[79127]: (ids) CMDEND (/opt/ids/deployment/check_backend.sh >> /var/log/ids/cron.log 2>&1)

View File

@ -0,0 +1,52 @@
tail -50 /var/log/ids/ml_backend.log
from fastapi import FastAPI, HTTPException, BackgroundTasks, Security, Header
ModuleNotFoundError: No module named 'fastapi'
Traceback (most recent call last):
File "/opt/ids/python_ml/main.py", line 6, in <module>
from fastapi import FastAPI, HTTPException, BackgroundTasks, Security, Header
ModuleNotFoundError: No module named 'fastapi'
Traceback (most recent call last):
File "/opt/ids/python_ml/main.py", line 6, in <module>
from fastapi import FastAPI, HTTPException, BackgroundTasks, Security, Header
ModuleNotFoundError: No module named 'fastapi'
Traceback (most recent call last):
File "/opt/ids/python_ml/main.py", line 6, in <module>
from fastapi import FastAPI, HTTPException, BackgroundTasks, Security, Header
ModuleNotFoundError: No module named 'fastapi'
Traceback (most recent call last):
File "/opt/ids/python_ml/main.py", line 6, in <module>
from fastapi import FastAPI, HTTPException, BackgroundTasks, Security, Header
ModuleNotFoundError: No module named 'fastapi'
Traceback (most recent call last):
File "/opt/ids/python_ml/main.py", line 6, in <module>
from fastapi import FastAPI, HTTPException, BackgroundTasks, Security, Header
ModuleNotFoundError: No module named 'fastapi'
Traceback (most recent call last):
File "/opt/ids/python_ml/main.py", line 6, in <module>
from fastapi import FastAPI, HTTPException, BackgroundTasks, Security, Header
ModuleNotFoundError: No module named 'fastapi'
Traceback (most recent call last):
File "/opt/ids/python_ml/main.py", line 6, in <module>
from fastapi import FastAPI, HTTPException, BackgroundTasks, Security, Header
ModuleNotFoundError: No module named 'fastapi'
Traceback (most recent call last):
File "/opt/ids/python_ml/main.py", line 6, in <module>
from fastapi import FastAPI, HTTPException, BackgroundTasks, Security, Header
ModuleNotFoundError: No module named 'fastapi'
Traceback (most recent call last):
File "/opt/ids/python_ml/main.py", line 6, in <module>
from fastapi import FastAPI, HTTPException, BackgroundTasks, Security, Header
ModuleNotFoundError: No module named 'fastapi'
Traceback (most recent call last):
File "/opt/ids/python_ml/main.py", line 6, in <module>
from fastapi import FastAPI, HTTPException, BackgroundTasks, Security, Header
ModuleNotFoundError: No module named 'fastapi'
Traceback (most recent call last):
File "/opt/ids/python_ml/main.py", line 6, in <module>
from fastapi import FastAPI, HTTPException, BackgroundTasks, Security, Header
ModuleNotFoundError: No module named 'fastapi'
Traceback (most recent call last):
File "/opt/ids/python_ml/main.py", line 6, in <module>
from fastapi import FastAPI, HTTPException, BackgroundTasks, Security, Header
ModuleNotFoundError: No module named 'fastapi'
[root@ids ids]#
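The venv at /opt/ids/python_ml/venv clearly has the backend dependencies installed (earlier tracebacks import pandas from it), so a repeated ModuleNotFoundError for fastapi usually means main.py was launched with the system interpreter instead of the venv one. A tiny check makes the mismatch visible (sketch):

import importlib.util
import sys

# Print which interpreter is running and whether it can see fastapi at all.
print("interpreter:", sys.executable)
print("fastapi importable:", importlib.util.find_spec("fastapi") is not None)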

View File

(Three markdown files added in this range; the diff viewer returned no markdown content for them.)

(Six binary image files added, not shown; sizes 58 KiB, 96 KiB, 96 KiB, 96 KiB, 92 KiB and 42 KiB.)

View File

@ -4,17 +4,28 @@ import { QueryClientProvider } from "@tanstack/react-query";
 import { Toaster } from "@/components/ui/toaster";
 import { TooltipProvider } from "@/components/ui/tooltip";
 import { SidebarProvider, Sidebar, SidebarContent, SidebarGroup, SidebarGroupContent, SidebarGroupLabel, SidebarMenu, SidebarMenuButton, SidebarMenuItem, SidebarTrigger } from "@/components/ui/sidebar";
-import { LayoutDashboard, AlertTriangle, Server, Shield, Menu } from "lucide-react";
+import { LayoutDashboard, AlertTriangle, Server, Shield, Brain, Menu, Activity, BarChart3, TrendingUp, List } from "lucide-react";
 import Dashboard from "@/pages/Dashboard";
 import Detections from "@/pages/Detections";
+import DashboardLive from "@/pages/DashboardLive";
+import AnalyticsHistory from "@/pages/AnalyticsHistory";
 import Routers from "@/pages/Routers";
+import Whitelist from "@/pages/Whitelist";
+import PublicLists from "@/pages/PublicLists";
+import Training from "@/pages/Training";
+import Services from "@/pages/Services";
 import NotFound from "@/pages/not-found";
 const menuItems = [
   { title: "Dashboard", url: "/", icon: LayoutDashboard },
   { title: "Rilevamenti", url: "/detections", icon: AlertTriangle },
+  { title: "Dashboard Live", url: "/dashboard-live", icon: Activity },
+  { title: "Analytics Storici", url: "/analytics", icon: BarChart3 },
+  { title: "Training ML", url: "/training", icon: Brain },
   { title: "Router", url: "/routers", icon: Server },
   { title: "Whitelist", url: "/whitelist", icon: Shield },
+  { title: "Liste Pubbliche", url: "/public-lists", icon: List },
+  { title: "Servizi", url: "/services", icon: TrendingUp },
 ];
 function AppSidebar() {
@ -48,7 +59,13 @@ function Router() {
   <Switch>
     <Route path="/" component={Dashboard} />
     <Route path="/detections" component={Detections} />
+    <Route path="/dashboard-live" component={DashboardLive} />
+    <Route path="/analytics" component={AnalyticsHistory} />
+    <Route path="/training" component={Training} />
     <Route path="/routers" component={Routers} />
+    <Route path="/whitelist" component={Whitelist} />
+    <Route path="/public-lists" component={PublicLists} />
+    <Route path="/services" component={Services} />
     <Route component={NotFound} />
   </Switch>
 );

View File

@ -0,0 +1,62 @@
/**
* Country Flags Utilities
* Converte country code in flag emoji
*/
/**
* Converte country code ISO 3166-1 alpha-2 in flag emoji
* Es: "IT" => "🇮🇹", "US" => "🇺🇸"
*/
export function getFlagEmoji(countryCode: string | null | undefined): string {
if (!countryCode || countryCode.length !== 2) {
return '🏳️'; // Flag bianca per unknown
}
const codePoints = countryCode
.toUpperCase()
.split('')
.map(char => 127397 + char.charCodeAt(0));
return String.fromCodePoint(...codePoints);
}
/**
* Mappa nomi paesi comuni (fallback se API non ritorna country code)
*/
export const COUNTRY_CODE_MAP: Record<string, string> = {
'Italy': 'IT',
'United States': 'US',
'Russia': 'RU',
'China': 'CN',
'Germany': 'DE',
'France': 'FR',
'United Kingdom': 'GB',
'Spain': 'ES',
'Brazil': 'BR',
'Japan': 'JP',
'India': 'IN',
'Canada': 'CA',
'Australia': 'AU',
'Netherlands': 'NL',
'Switzerland': 'CH',
'Sweden': 'SE',
'Poland': 'PL',
'Ukraine': 'UA',
'Romania': 'RO',
'Belgium': 'BE',
};
/**
* Ottieni flag da nome paese o country code
*/
export function getFlag(country: string | null | undefined, countryCode?: string | null): string {
if (countryCode) {
return getFlagEmoji(countryCode);
}
if (country && COUNTRY_CODE_MAP[country]) {
return getFlagEmoji(COUNTRY_CODE_MAP[country]);
}
return '🏳️';
}

View File

@ -0,0 +1,320 @@
import { useQuery } from "@tanstack/react-query";
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
import { Badge } from "@/components/ui/badge";
import { Button } from "@/components/ui/button";
import {
LineChart, Line, BarChart, Bar, AreaChart, Area,
XAxis, YAxis, CartesianGrid, Tooltip, Legend, ResponsiveContainer
} from "recharts";
import { Calendar, TrendingUp, BarChart3, Globe, Download } from "lucide-react";
import type { NetworkAnalytics } from "@shared/schema";
import { format, parseISO } from "date-fns";
import { useState } from "react";
export default function AnalyticsHistory() {
const [days, setDays] = useState(30);
// Fetch historical analytics (hourly aggregations)
const { data: analytics = [], isLoading } = useQuery<NetworkAnalytics[]>({
queryKey: [`/api/analytics/recent?days=${days}&hourly=true`],
refetchInterval: 60000, // Aggiorna ogni minuto
});
// Prepara dati per grafici
const trendData = analytics
.map(a => {
// Parse JSON fields safely
let attacksByCountry = {};
let attacksByType = {};
try {
attacksByCountry = a.attacksByCountry ? JSON.parse(a.attacksByCountry) : {};
} catch {}
try {
attacksByType = a.attacksByType ? JSON.parse(a.attacksByType) : {};
} catch {}
return {
date: format(new Date(a.date), "dd/MM HH:mm"),
fullDate: a.date,
totalPackets: a.totalPackets || 0,
normalPackets: a.normalPackets || 0,
attackPackets: a.attackPackets || 0,
attackPercentage: a.totalPackets > 0
? ((a.attackPackets || 0) / a.totalPackets * 100).toFixed(1)
: "0",
uniqueIps: a.uniqueIps || 0,
attackUniqueIps: a.attackUniqueIps || 0,
};
})
.sort((a, b) => new Date(a.fullDate).getTime() - new Date(b.fullDate).getTime());
// Aggrega dati per paese (da tutti i giorni)
const countryAggregation: Record<string, number> = {};
analytics.forEach(a => {
if (a.attacksByCountry) {
try {
const countries = JSON.parse(a.attacksByCountry);
if (countries && typeof countries === 'object') {
Object.entries(countries).forEach(([country, count]) => {
if (typeof count === 'number') {
countryAggregation[country] = (countryAggregation[country] || 0) + count;
}
});
}
} catch (e) {
console.warn('Failed to parse attacksByCountry:', e);
}
}
});
const topCountries = Object.entries(countryAggregation)
.map(([name, attacks]) => ({ name, attacks }))
.sort((a, b) => b.attacks - a.attacks)
.slice(0, 10);
// Calcola metriche totali
const totalTraffic = analytics.reduce((sum, a) => sum + (a.totalPackets || 0), 0);
const totalAttacks = analytics.reduce((sum, a) => sum + (a.attackPackets || 0), 0);
const totalNormal = analytics.reduce((sum, a) => sum + (a.normalPackets || 0), 0);
const avgAttackRate = totalTraffic > 0 ? ((totalAttacks / totalTraffic) * 100).toFixed(2) : "0";
return (
<div className="flex flex-col gap-6 p-6" data-testid="page-analytics-history">
{/* Header */}
<div className="flex items-center justify-between">
<div>
<h1 className="text-3xl font-semibold flex items-center gap-2" data-testid="text-page-title">
<BarChart3 className="h-8 w-8" />
Analytics Storici
</h1>
<p className="text-muted-foreground" data-testid="text-page-subtitle">
Statistiche permanenti per analisi long-term
</p>
</div>
{/* Time Range Selector */}
<div className="flex items-center gap-2">
<Button
variant={days === 7 ? "default" : "outline"}
size="sm"
onClick={() => setDays(7)}
data-testid="button-7days"
>
7 Giorni
</Button>
<Button
variant={days === 30 ? "default" : "outline"}
size="sm"
onClick={() => setDays(30)}
data-testid="button-30days"
>
30 Giorni
</Button>
<Button
variant={days === 90 ? "default" : "outline"}
size="sm"
onClick={() => setDays(90)}
data-testid="button-90days"
>
90 Giorni
</Button>
</div>
</div>
{isLoading && (
<div className="text-center py-8" data-testid="text-loading">
Caricamento dati storici...
</div>
)}
{!isLoading && analytics.length === 0 && (
<Card>
<CardContent className="py-12 text-center text-muted-foreground">
<Calendar className="h-12 w-12 mx-auto mb-4 opacity-50" />
<p>Nessun dato storico disponibile</p>
<p className="text-sm mt-2">
I dati verranno aggregati automaticamente ogni ora dal sistema
</p>
</CardContent>
</Card>
)}
{!isLoading && analytics.length > 0 && (
<>
{/* Summary KPIs */}
<div className="grid grid-cols-1 md:grid-cols-4 gap-4">
<Card data-testid="card-total-summary">
<CardHeader className="pb-2">
<CardTitle className="text-sm font-medium text-muted-foreground">
Traffico Totale ({days}g)
</CardTitle>
</CardHeader>
<CardContent>
<div className="text-2xl font-bold" data-testid="text-total-summary">
{totalTraffic.toLocaleString()}
</div>
<p className="text-xs text-muted-foreground mt-1">pacchetti</p>
</CardContent>
</Card>
<Card data-testid="card-normal-summary">
<CardHeader className="pb-2">
<CardTitle className="text-sm font-medium text-muted-foreground">
Traffico Normale
</CardTitle>
</CardHeader>
<CardContent>
<div className="text-2xl font-bold text-green-600" data-testid="text-normal-summary">
{totalNormal.toLocaleString()}
</div>
<p className="text-xs text-muted-foreground mt-1">
{(100 - parseFloat(avgAttackRate)).toFixed(1)}% del totale
</p>
</CardContent>
</Card>
<Card data-testid="card-attacks-summary">
<CardHeader className="pb-2">
<CardTitle className="text-sm font-medium text-muted-foreground">
Attacchi Totali
</CardTitle>
</CardHeader>
<CardContent>
<div className="text-2xl font-bold text-red-600" data-testid="text-attacks-summary">
{totalAttacks.toLocaleString()}
</div>
<p className="text-xs text-muted-foreground mt-1">
{avgAttackRate}% del traffico
</p>
</CardContent>
</Card>
<Card data-testid="card-avg-daily">
<CardHeader className="pb-2">
<CardTitle className="text-sm font-medium text-muted-foreground">
Media Giornaliera
</CardTitle>
</CardHeader>
<CardContent>
<div className="text-2xl font-bold" data-testid="text-avg-daily">
{Math.round(totalTraffic / analytics.length).toLocaleString()}
</div>
<p className="text-xs text-muted-foreground mt-1">pacchetti/giorno</p>
</CardContent>
</Card>
</div>
{/* Trend Line Chart */}
<Card data-testid="card-trend">
<CardHeader>
<CardTitle className="flex items-center gap-2">
<TrendingUp className="h-5 w-5" />
Trend Traffico (Normale + Attacchi)
</CardTitle>
</CardHeader>
<CardContent>
<ResponsiveContainer width="100%" height={400}>
<AreaChart data={trendData}>
<CartesianGrid strokeDasharray="3 3" />
<XAxis dataKey="date" />
<YAxis />
<Tooltip />
<Legend />
<Area
type="monotone"
dataKey="normalPackets"
stackId="1"
stroke="#22c55e"
fill="#22c55e"
name="Normale"
/>
<Area
type="monotone"
dataKey="attackPackets"
stackId="1"
stroke="#ef4444"
fill="#ef4444"
name="Attacchi"
/>
</AreaChart>
</ResponsiveContainer>
</CardContent>
</Card>
{/* Attack Rate Trend */}
<Card data-testid="card-attack-rate">
<CardHeader>
<CardTitle>Percentuale Attacchi nel Tempo</CardTitle>
</CardHeader>
<CardContent>
<ResponsiveContainer width="100%" height={300}>
<LineChart data={trendData}>
<CartesianGrid strokeDasharray="3 3" />
<XAxis dataKey="date" />
<YAxis />
<Tooltip />
<Legend />
<Line
type="monotone"
dataKey="attackPercentage"
stroke="#ef4444"
name="% Attacchi"
strokeWidth={2}
/>
</LineChart>
</ResponsiveContainer>
</CardContent>
</Card>
{/* Top Countries (Historical) */}
<Card data-testid="card-top-countries">
<CardHeader>
<CardTitle className="flex items-center gap-2">
<Globe className="h-5 w-5" />
Top 10 Paesi Attaccanti (Storico)
</CardTitle>
</CardHeader>
<CardContent>
{topCountries.length > 0 ? (
<ResponsiveContainer width="100%" height={400}>
<BarChart data={topCountries} layout="vertical">
<CartesianGrid strokeDasharray="3 3" />
<XAxis type="number" />
<YAxis dataKey="name" type="category" width={100} />
<Tooltip />
<Legend />
<Bar dataKey="attacks" fill="#ef4444" name="Attacchi Totali" />
</BarChart>
</ResponsiveContainer>
) : (
<div className="text-center py-20 text-muted-foreground">
Nessun dato disponibile
</div>
)}
</CardContent>
</Card>
{/* Export Button (Placeholder) */}
<Card data-testid="card-export">
<CardContent className="pt-6">
<div className="flex items-center justify-between">
<div>
<h3 className="font-semibold">Export Report</h3>
<p className="text-sm text-muted-foreground">
Esporta i dati in formato CSV per analisi esterne
</p>
</div>
<Button variant="outline" data-testid="button-export">
<Download className="h-4 w-4 mr-2" />
Esporta CSV
</Button>
</div>
</CardContent>
</Card>
</>
)}
</div>
);
}
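The "Esporta CSV" button above is still a placeholder with no onClick handler. One way it could be wired up is a small client-side export over the trendData rows; exportTrendCsv and TrendRow below are hypothetical names, not code from the repo.

// Hypothetical client-side CSV export for the trendData rows built above.
type TrendRow = {
  date: string;
  fullDate: string | Date;
  totalPackets: number;
  normalPackets: number;
  attackPackets: number;
  attackPercentage: string;
  uniqueIps: number;
  attackUniqueIps: number;
};

function exportTrendCsv(rows: TrendRow[], filename = "analytics.csv"): void {
  if (rows.length === 0) return;
  const header = Object.keys(rows[0]).join(",");
  const lines = rows.map(r =>
    Object.values(r).map(v => `"${String(v).replace(/"/g, '""')}"`).join(",")
  );
  // Build a downloadable blob and trigger the browser download.
  const blob = new Blob([[header, ...lines].join("\n")], { type: "text/csv;charset=utf-8" });
  const url = URL.createObjectURL(blob);
  const a = document.createElement("a");
  a.href = url;
  a.download = filename;
  a.click();
  URL.revokeObjectURL(url);
}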

View File

@ -2,8 +2,9 @@ import { useQuery } from "@tanstack/react-query";
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
import { Badge } from "@/components/ui/badge";
import { Button } from "@/components/ui/button";
import { Activity, Shield, Server, AlertTriangle, CheckCircle2, TrendingUp, Database, FileText, Brain } from "lucide-react";
import { format } from "date-fns";
import { it } from "date-fns/locale";
import type { Detection, Router, TrainingHistory } from "@shared/schema";
interface StatsResponse {
@ -14,6 +15,22 @@ interface StatsResponse {
latestTraining: TrainingHistory | null;
}
interface ServiceStatus {
name: string;
status: "running" | "idle" | "offline" | "error" | "unknown";
healthy: boolean;
details: any;
}
interface ServicesStatusResponse {
services: {
mlBackend: ServiceStatus;
database: ServiceStatus;
syslogParser: ServiceStatus;
analyticsAggregator: ServiceStatus;
};
}
export default function Dashboard() {
const { data: stats } = useQuery<StatsResponse>({
queryKey: ["/api/stats"],
@ -21,7 +38,7 @@ export default function Dashboard() {
});
const { data: recentDetections } = useQuery<Detection[]>({
queryKey: ["/api/detections?limit=100"],
refetchInterval: 5000, // Refresh every 5s
});
@ -29,6 +46,11 @@ export default function Dashboard() {
queryKey: ["/api/routers"], queryKey: ["/api/routers"],
}); });
const { data: servicesStatus } = useQuery<ServicesStatusResponse>({
queryKey: ["/api/services/status"],
refetchInterval: 5000, // Refresh every 5s
});
const getRiskBadge = (riskScore: string) => {
const score = parseFloat(riskScore);
if (score >= 85) return <Badge variant="destructive" data-testid={`badge-risk-critical`}>CRITICO</Badge>;
@ -47,6 +69,84 @@ export default function Dashboard() {
</p>
</div>
{/* Services Status */}
<Card data-testid="card-services-status">
<CardHeader>
<CardTitle className="flex items-center gap-2">
<Activity className="h-5 w-5" />
Stato Servizi
</CardTitle>
</CardHeader>
<CardContent>
<div className="grid grid-cols-1 md:grid-cols-3 gap-4">
{/* ML Backend */}
<div className="flex items-center gap-3 p-3 rounded-lg border" data-testid="service-ml-backend">
<div className={`h-3 w-3 rounded-full ${servicesStatus?.services.mlBackend.healthy ? 'bg-green-500' : 'bg-red-500'}`} data-testid="status-indicator-ml-backend" />
<div className="flex-1 min-w-0">
<div className="flex items-center gap-2">
<Brain className="h-4 w-4 text-muted-foreground" />
<p className="font-medium text-sm">ML Backend</p>
</div>
<p className="text-xs text-muted-foreground">
{servicesStatus?.services.mlBackend.status === 'running' && 'In esecuzione'}
{servicesStatus?.services.mlBackend.status === 'offline' && 'Offline'}
{servicesStatus?.services.mlBackend.status === 'error' && 'Errore'}
{!servicesStatus && 'Caricamento...'}
</p>
{servicesStatus?.services.mlBackend.details?.modelLoaded !== undefined && (
<p className="text-xs text-muted-foreground mt-1">
Modello: {servicesStatus.services.mlBackend.details.modelLoaded ? '✓ Caricato' : '✗ Non caricato'}
</p>
)}
</div>
</div>
{/* Database */}
<div className="flex items-center gap-3 p-3 rounded-lg border" data-testid="service-database">
<div className={`h-3 w-3 rounded-full ${servicesStatus?.services.database.healthy ? 'bg-green-500' : 'bg-red-500'}`} data-testid="status-indicator-database" />
<div className="flex-1 min-w-0">
<div className="flex items-center gap-2">
<Database className="h-4 w-4 text-muted-foreground" />
<p className="font-medium text-sm">Database</p>
</div>
<p className="text-xs text-muted-foreground">
{servicesStatus?.services.database.status === 'running' && 'Connesso'}
{servicesStatus?.services.database.status === 'error' && 'Errore connessione'}
{!servicesStatus && 'Caricamento...'}
</p>
</div>
</div>
{/* Syslog Parser */}
<div className="flex items-center gap-3 p-3 rounded-lg border" data-testid="service-syslog-parser">
<div className={`h-3 w-3 rounded-full ${servicesStatus?.services.syslogParser.healthy ? 'bg-green-500' : 'bg-yellow-500'}`} data-testid="status-indicator-syslog-parser" />
<div className="flex-1 min-w-0">
<div className="flex items-center gap-2">
<FileText className="h-4 w-4 text-muted-foreground" />
<p className="font-medium text-sm">Syslog Parser</p>
</div>
<p className="text-xs text-muted-foreground">
{servicesStatus?.services.syslogParser.status === 'running' && 'Attivo'}
{servicesStatus?.services.syslogParser.status === 'idle' && 'In attesa log'}
{servicesStatus?.services.syslogParser.status === 'error' && 'Errore'}
{!servicesStatus && 'Caricamento...'}
</p>
{servicesStatus?.services.syslogParser.details?.logsLast5Min !== undefined && (
<p className="text-xs text-muted-foreground mt-1">
{servicesStatus.services.syslogParser.details.logsLast5Min} log (5min)
</p>
)}
</div>
</div>
</div>
<div className="mt-4">
<Button variant="outline" size="sm" asChild data-testid="button-view-services">
<a href="/services">Gestisci Servizi</a>
</Button>
</div>
</CardContent>
</Card>
{/* Stats Grid */}
<div className="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-4 gap-4">
<Card data-testid="card-routers">

View File

@ -0,0 +1,296 @@
import { useQuery } from "@tanstack/react-query";
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
import { Badge } from "@/components/ui/badge";
import { Activity, Globe, Shield, TrendingUp, AlertTriangle } from "lucide-react";
import { AreaChart, Area, BarChart, Bar, PieChart, Pie, Cell, XAxis, YAxis, CartesianGrid, Tooltip, Legend, ResponsiveContainer } from "recharts";
import type { Detection, NetworkLog } from "@shared/schema";
import { getFlag } from "@/lib/country-flags";
import { format } from "date-fns";
interface DashboardStats {
totalPackets: number;
attackPackets: number;
normalPackets: number;
uniqueIps: number;
attackUniqueIps: number;
attacksByCountry: Record<string, number>;
attacksByType: Record<string, number>;
recentDetections: Detection[];
}
export default function DashboardLive() {
// Fetch aggregated stats from analytics (ultimi 72h = 3 giorni)
const { data: stats, isLoading } = useQuery<DashboardStats>({
queryKey: ["/api/dashboard/live?hours=72"],
refetchInterval: 10000, // Aggiorna ogni 10s
});
// Usa dati aggregati precisi
const totalTraffic = stats?.totalPackets || 0;
const totalAttacks = stats?.attackPackets || 0;
const normalTraffic = stats?.normalPackets || 0;
const attackPercentage = totalTraffic > 0 ? ((totalAttacks / totalTraffic) * 100).toFixed(2) : "0";
const detections = stats?.recentDetections || [];
const blockedAttacks = detections.filter(d => d.blocked).length;
// Usa dati aggregati già calcolati dal backend
const attacksByCountry = stats?.attacksByCountry || {};
const attacksByType = stats?.attacksByType || {};
const countryChartData = Object.entries(attacksByCountry)
.map(([name, attacks]) => ({
name: `${getFlag(name, name.substring(0, 2))} ${name}`,
attacks,
normal: 0,
}))
.sort((a, b) => b.attacks - a.attacks)
.slice(0, 10);
const typeChartData = Object.entries(attacksByType).map(([name, value]) => ({
name: name.replace('_', ' ').toUpperCase(),
value,
}));
// Traffico normale vs attacchi (gauge data)
const trafficDistribution = [
{ name: 'Normal', value: normalTraffic, color: '#22c55e' },
{ name: 'Attacks', value: totalAttacks, color: '#ef4444' },
];
// Ultimi eventi (stream)
const recentEvents = [...detections]
.sort((a, b) => new Date(b.detectedAt).getTime() - new Date(a.detectedAt).getTime())
.slice(0, 20);
const COLORS = ['#ef4444', '#f97316', '#f59e0b', '#eab308', '#84cc16'];
return (
<div className="flex flex-col gap-6 p-6" data-testid="page-dashboard-live">
{/* Header */}
<div>
<h1 className="text-3xl font-semibold flex items-center gap-2" data-testid="text-page-title">
<Activity className="h-8 w-8" />
Dashboard Live
</h1>
<p className="text-muted-foreground" data-testid="text-page-subtitle">
Monitoraggio real-time (ultimi 3 giorni)
</p>
</div>
{isLoading && (
<div className="text-center py-8" data-testid="text-loading">
Caricamento dati...
</div>
)}
{!isLoading && (
<>
{/* KPI Cards */}
<div className="grid grid-cols-1 md:grid-cols-4 gap-4">
<Card data-testid="card-total-traffic">
<CardHeader className="pb-2">
<CardTitle className="text-sm font-medium text-muted-foreground">
Traffico Totale
</CardTitle>
</CardHeader>
<CardContent>
<div className="text-3xl font-bold" data-testid="text-total-traffic">
{totalTraffic.toLocaleString()}
</div>
<p className="text-xs text-muted-foreground mt-1">pacchetti</p>
</CardContent>
</Card>
<Card data-testid="card-normal-traffic">
<CardHeader className="pb-2">
<CardTitle className="text-sm font-medium text-muted-foreground">
Traffico Normale
</CardTitle>
</CardHeader>
<CardContent>
<div className="text-3xl font-bold text-green-600" data-testid="text-normal-traffic">
{normalTraffic.toLocaleString()}
</div>
<p className="text-xs text-muted-foreground mt-1">
{(100 - parseFloat(attackPercentage)).toFixed(1)}% del totale
</p>
</CardContent>
</Card>
<Card data-testid="card-attacks">
<CardHeader className="pb-2">
<CardTitle className="text-sm font-medium text-muted-foreground">
Attacchi Rilevati
</CardTitle>
</CardHeader>
<CardContent>
<div className="text-3xl font-bold text-red-600" data-testid="text-attacks">
{totalAttacks}
</div>
<p className="text-xs text-muted-foreground mt-1">
{attackPercentage}% del traffico
</p>
</CardContent>
</Card>
<Card data-testid="card-blocked">
<CardHeader className="pb-2">
<CardTitle className="text-sm font-medium text-muted-foreground">
IP Bloccati
</CardTitle>
</CardHeader>
<CardContent>
<div className="text-3xl font-bold text-orange-600" data-testid="text-blocked">
{blockedAttacks}
</div>
<p className="text-xs text-muted-foreground mt-1">
{totalAttacks > 0 ? ((blockedAttacks / totalAttacks) * 100).toFixed(1) : 0}% degli attacchi
</p>
</CardContent>
</Card>
</div>
{/* Charts Row 1 */}
<div className="grid grid-cols-1 lg:grid-cols-2 gap-6">
{/* Traffic Distribution (Pie) */}
<Card data-testid="card-distribution">
<CardHeader>
<CardTitle className="flex items-center gap-2">
<TrendingUp className="h-5 w-5" />
Distribuzione Traffico
</CardTitle>
</CardHeader>
<CardContent>
<ResponsiveContainer width="100%" height={300}>
<PieChart>
<Pie
data={trafficDistribution}
cx="50%"
cy="50%"
labelLine={false}
label={(entry) => `${entry.name}: ${entry.value}`}
outerRadius={100}
fill="#8884d8"
dataKey="value"
>
{trafficDistribution.map((entry, index) => (
<Cell key={`cell-${index}`} fill={entry.color} />
))}
</Pie>
<Tooltip />
<Legend />
</PieChart>
</ResponsiveContainer>
</CardContent>
</Card>
{/* Attacks by Type (Pie) */}
<Card data-testid="card-attack-types">
<CardHeader>
<CardTitle className="flex items-center gap-2">
<AlertTriangle className="h-5 w-5" />
Tipi di Attacco
</CardTitle>
</CardHeader>
<CardContent>
{typeChartData.length > 0 ? (
<ResponsiveContainer width="100%" height={300}>
<PieChart>
<Pie
data={typeChartData}
cx="50%"
cy="50%"
labelLine={false}
label={(entry) => `${entry.name}: ${entry.value}`}
outerRadius={100}
fill="#8884d8"
dataKey="value"
>
{typeChartData.map((entry, index) => (
<Cell key={`cell-${index}`} fill={COLORS[index % COLORS.length]} />
))}
</Pie>
<Tooltip />
<Legend />
</PieChart>
</ResponsiveContainer>
) : (
<div className="text-center py-20 text-muted-foreground">
Nessun attacco rilevato
</div>
)}
</CardContent>
</Card>
</div>
{/* Top Countries (Bar Chart) */}
<Card data-testid="card-countries">
<CardHeader>
<CardTitle className="flex items-center gap-2">
<Globe className="h-5 w-5" />
Top 10 Paesi Attaccanti
</CardTitle>
</CardHeader>
<CardContent>
{countryChartData.length > 0 ? (
<ResponsiveContainer width="100%" height={400}>
<BarChart data={countryChartData}>
<CartesianGrid strokeDasharray="3 3" />
<XAxis dataKey="name" />
<YAxis />
<Tooltip />
<Legend />
<Bar dataKey="attacks" fill="#ef4444" name="Attacchi" />
</BarChart>
</ResponsiveContainer>
) : (
<div className="text-center py-20 text-muted-foreground">
Nessun dato disponibile
</div>
)}
</CardContent>
</Card>
{/* Real-time Event Stream */}
<Card data-testid="card-event-stream">
<CardHeader>
<CardTitle className="flex items-center gap-2">
<Shield className="h-5 w-5" />
Stream Eventi Recenti
</CardTitle>
</CardHeader>
<CardContent>
<div className="space-y-2 max-h-96 overflow-y-auto">
{recentEvents.map(event => (
<div
key={event.id}
className="flex items-center justify-between p-3 rounded-lg border hover-elevate"
data-testid={`event-${event.id}`}
>
<div className="flex items-center gap-3">
{event.countryCode && (
<span className="text-xl">
{getFlag(event.country, event.countryCode)}
</span>
)}
<div>
<code className="font-mono font-semibold">{event.sourceIp}</code>
<p className="text-xs text-muted-foreground">
{event.anomalyType.replace('_', ' ')} {format(new Date(event.detectedAt), "HH:mm:ss")}
</p>
</div>
</div>
<Badge variant={event.blocked ? "destructive" : "secondary"}>
{event.blocked ? "Bloccato" : "Attivo"}
</Badge>
</div>
))}
</div>
</CardContent>
</Card>
</>
)}
</div>
);
}
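For orientation, a payload in the DashboardStats shape this page requests from /api/dashboard/live?hours=72. All numbers and keys are invented; they only show the Record<string, number> maps the pie and bar charts iterate over.

// Illustrative DashboardStats value (made-up data, not a real capture).
const exampleLiveStats: DashboardStats = {
  totalPackets: 125000,
  normalPackets: 123150,
  attackPackets: 1850,
  uniqueIps: 4210,
  attackUniqueIps: 97,
  attacksByCountry: { Russia: 820, China: 610, "United States": 240 },
  attacksByType: { ddos: 900, port_scan: 650, brute_force: 300 },
  recentDetections: [],
};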

View File

@ -1,24 +1,133 @@
import { useQuery, useMutation } from "@tanstack/react-query";
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
import { Badge } from "@/components/ui/badge";
import { Button } from "@/components/ui/button";
import { Input } from "@/components/ui/input";
import { Select, SelectContent, SelectItem, SelectTrigger, SelectValue } from "@/components/ui/select";
import { Slider } from "@/components/ui/slider";
import { AlertTriangle, Search, Shield, Globe, MapPin, Building2, ShieldPlus, ShieldCheck, Unlock, ChevronLeft, ChevronRight } from "lucide-react";
import { format } from "date-fns";
import { useState, useEffect, useMemo } from "react";
import type { Detection, Whitelist } from "@shared/schema";
import { getFlag } from "@/lib/country-flags";
import { apiRequest, queryClient } from "@/lib/queryClient";
import { useToast } from "@/hooks/use-toast";
const ITEMS_PER_PAGE = 50;
interface DetectionsResponse {
detections: Detection[];
total: number;
}
export default function Detections() {
const [searchInput, setSearchInput] = useState("");
const [debouncedSearch, setDebouncedSearch] = useState("");
const [anomalyTypeFilter, setAnomalyTypeFilter] = useState<string>("all");
const [minScore, setMinScore] = useState(0);
const [maxScore, setMaxScore] = useState(100);
const [currentPage, setCurrentPage] = useState(1);
const { toast } = useToast();
// Debounce search input
useEffect(() => {
const timer = setTimeout(() => {
setDebouncedSearch(searchInput);
setCurrentPage(1); // Reset to first page on search
}, 300);
return () => clearTimeout(timer);
}, [searchInput]);
// Reset page on filter change
useEffect(() => {
setCurrentPage(1);
}, [anomalyTypeFilter, minScore, maxScore]);
// Build query params with pagination and search
const queryParams = useMemo(() => {
const params = new URLSearchParams();
params.set("limit", ITEMS_PER_PAGE.toString());
params.set("offset", ((currentPage - 1) * ITEMS_PER_PAGE).toString());
if (anomalyTypeFilter !== "all") {
params.set("anomalyType", anomalyTypeFilter);
}
if (minScore > 0) {
params.set("minScore", minScore.toString());
}
if (maxScore < 100) {
params.set("maxScore", maxScore.toString());
}
if (debouncedSearch.trim()) {
params.set("search", debouncedSearch.trim());
}
return params.toString();
}, [currentPage, anomalyTypeFilter, minScore, maxScore, debouncedSearch]);
const { data, isLoading } = useQuery<DetectionsResponse>({
queryKey: ["/api/detections", currentPage, anomalyTypeFilter, minScore, maxScore, debouncedSearch],
queryFn: () => fetch(`/api/detections?${queryParams}`).then(r => r.json()),
refetchInterval: 10000,
});
const detections = data?.detections || [];
const totalCount = data?.total || 0;
const totalPages = Math.ceil(totalCount / ITEMS_PER_PAGE);
// Fetch whitelist to check if IP is already whitelisted
const { data: whitelistData } = useQuery<Whitelist[]>({
queryKey: ["/api/whitelist"],
});
// Create a Set of whitelisted IPs for fast lookup
const whitelistedIps = new Set(whitelistData?.map(w => w.ipAddress) || []);
// Mutation per aggiungere a whitelist
const addToWhitelistMutation = useMutation({
mutationFn: async (detection: Detection) => {
return await apiRequest("POST", "/api/whitelist", {
ipAddress: detection.sourceIp,
reason: `Auto-added from detection: ${detection.anomalyType} (Risk: ${parseFloat(detection.riskScore).toFixed(1)})`
});
},
onSuccess: (_, detection) => {
toast({
title: "IP aggiunto alla whitelist",
description: `${detection.sourceIp} è stato aggiunto alla whitelist e sbloccato dai router.`,
});
queryClient.invalidateQueries({ queryKey: ["/api/whitelist"] });
queryClient.invalidateQueries({ queryKey: ["/api/detections"] });
},
onError: (error: any, detection) => {
toast({
title: "Errore",
description: error.message || `Impossibile aggiungere ${detection.sourceIp} alla whitelist.`,
variant: "destructive",
});
}
});
// Mutation per sbloccare IP dai router
const unblockMutation = useMutation({
mutationFn: async (detection: Detection) => {
return await apiRequest("POST", "/api/unblock-ip", {
ipAddress: detection.sourceIp
});
},
onSuccess: (data: any, detection) => {
toast({
title: "IP sbloccato",
description: `${detection.sourceIp} è stato rimosso dalla blocklist di ${data.unblocked_from || 0} router.`,
});
queryClient.invalidateQueries({ queryKey: ["/api/detections"] });
},
onError: (error: any, detection) => {
toast({
title: "Errore sblocco",
description: error.message || `Impossibile sbloccare ${detection.sourceIp} dai router.`,
variant: "destructive",
});
}
});
const getRiskBadge = (riskScore: string) => {
const score = parseFloat(riskScore);
@ -52,20 +161,58 @@ export default function Detections() {
{/* Search and Filters */}
<Card data-testid="card-filters">
<CardContent className="pt-6">
<div className="flex flex-col gap-4">
<div className="flex items-center gap-4 flex-wrap">
<div className="relative flex-1 min-w-[200px]">
<Search className="absolute left-3 top-1/2 -translate-y-1/2 h-4 w-4 text-muted-foreground" />
<Input
placeholder="Cerca per IP, paese, organizzazione..."
value={searchInput}
onChange={(e) => setSearchInput(e.target.value)}
className="pl-9"
data-testid="input-search"
/>
</div>
<Select value={anomalyTypeFilter} onValueChange={setAnomalyTypeFilter}>
<SelectTrigger className="w-[200px]" data-testid="select-anomaly-type">
<SelectValue placeholder="Tipo attacco" />
</SelectTrigger>
<SelectContent>
<SelectItem value="all">Tutti i tipi</SelectItem>
<SelectItem value="ddos">DDoS Attack</SelectItem>
<SelectItem value="port_scan">Port Scanning</SelectItem>
<SelectItem value="brute_force">Brute Force</SelectItem>
<SelectItem value="botnet">Botnet Activity</SelectItem>
<SelectItem value="suspicious">Suspicious Activity</SelectItem>
</SelectContent>
</Select>
</div>
<div className="space-y-2">
<div className="flex items-center justify-between text-sm">
<span className="text-muted-foreground">Risk Score:</span>
<span className="font-medium" data-testid="text-score-range">
{minScore} - {maxScore}
</span>
</div>
<div className="flex items-center gap-4">
<span className="text-xs text-muted-foreground w-8">0</span>
<Slider
min={0}
max={100}
step={5}
value={[minScore, maxScore]}
onValueChange={([min, max]) => {
setMinScore(min);
setMaxScore(max);
}}
className="flex-1"
data-testid="slider-risk-score"
/>
<span className="text-xs text-muted-foreground w-8">100</span>
</div>
</div>
</div>
</CardContent>
</Card>
@ -73,9 +220,36 @@ export default function Detections() {
{/* Detections List */}
<Card data-testid="card-detections-list">
<CardHeader>
<CardTitle className="flex items-center justify-between gap-2 flex-wrap">
<div className="flex items-center gap-2">
<AlertTriangle className="h-5 w-5" />
Rilevamenti ({totalCount})
</div>
{totalPages > 1 && (
<div className="flex items-center gap-2 text-sm font-normal">
<Button
variant="outline"
size="icon"
onClick={() => setCurrentPage(p => Math.max(1, p - 1))}
disabled={currentPage === 1}
data-testid="button-prev-page"
>
<ChevronLeft className="h-4 w-4" />
</Button>
<span data-testid="text-pagination">
Pagina {currentPage} di {totalPages}
</span>
<Button
variant="outline"
size="icon"
onClick={() => setCurrentPage(p => Math.min(totalPages, p + 1))}
disabled={currentPage === totalPages}
data-testid="button-next-page"
>
<ChevronRight className="h-4 w-4" />
</Button>
</div>
)}
</CardTitle>
</CardHeader>
<CardContent>
@ -83,9 +257,9 @@ export default function Detections() {
<div className="text-center py-8 text-muted-foreground" data-testid="text-loading"> <div className="text-center py-8 text-muted-foreground" data-testid="text-loading">
Caricamento... Caricamento...
</div> </div>
) : filteredDetections && filteredDetections.length > 0 ? ( ) : detections.length > 0 ? (
<div className="space-y-3"> <div className="space-y-3">
{filteredDetections.map((detection) => ( {detections.map((detection) => (
<div <div
key={detection.id} key={detection.id}
className="p-4 rounded-lg border hover-elevate" className="p-4 rounded-lg border hover-elevate"
@ -93,7 +267,14 @@ export default function Detections() {
>
<div className="flex items-start justify-between gap-4">
<div className="flex-1 min-w-0">
<div className="flex items-center gap-3 mb-2 flex-wrap">
{/* Flag Emoji */}
{detection.countryCode && (
<span className="text-2xl" title={detection.country || detection.countryCode} data-testid={`flag-${detection.id}`}>
{getFlag(detection.country, detection.countryCode)}
</span>
)}
<code className="font-mono font-semibold text-lg" data-testid={`text-ip-${detection.id}`}> <code className="font-mono font-semibold text-lg" data-testid={`text-ip-${detection.id}`}>
{detection.sourceIp} {detection.sourceIp}
</code> </code>
@ -107,6 +288,34 @@ export default function Detections() {
{detection.reason}
</p>
{/* Geolocation Info */}
{(detection.country || detection.organization || detection.asNumber) && (
<div className="flex flex-wrap gap-3 mb-3 text-sm" data-testid={`geo-info-${detection.id}`}>
{detection.country && (
<div className="flex items-center gap-1.5 text-muted-foreground">
<Globe className="h-3.5 w-3.5" />
<span data-testid={`text-country-${detection.id}`}>
{detection.city ? `${detection.city}, ${detection.country}` : detection.country}
</span>
</div>
)}
{detection.organization && (
<div className="flex items-center gap-1.5 text-muted-foreground">
<Building2 className="h-3.5 w-3.5" />
<span data-testid={`text-org-${detection.id}`}>{detection.organization}</span>
</div>
)}
{detection.asNumber && (
<div className="flex items-center gap-1.5 text-muted-foreground">
<MapPin className="h-3.5 w-3.5" />
<span data-testid={`text-as-${detection.id}`}>
{detection.asNumber} {detection.asName && `- ${detection.asName}`}
</span>
</div>
)}
</div>
)}
<div className="grid grid-cols-2 md:grid-cols-4 gap-4 text-sm"> <div className="grid grid-cols-2 md:grid-cols-4 gap-4 text-sm">
<div> <div>
<p className="text-muted-foreground text-xs">Risk Score</p> <p className="text-muted-foreground text-xs">Risk Score</p>
@ -156,12 +365,44 @@ export default function Detections() {
</Badge>
)}
{whitelistedIps.has(detection.sourceIp) ? (
<Button
variant="outline"
size="sm"
disabled
className="w-full bg-green-500/10 border-green-500 text-green-600 dark:text-green-400"
data-testid={`button-whitelist-${detection.id}`}
>
<ShieldCheck className="h-3 w-3 mr-1" />
In Whitelist
</Button>
) : (
<Button
variant="outline"
size="sm"
onClick={() => addToWhitelistMutation.mutate(detection)}
disabled={addToWhitelistMutation.isPending}
className="w-full"
data-testid={`button-whitelist-${detection.id}`}
>
<ShieldPlus className="h-3 w-3 mr-1" />
Whitelist
</Button>
)}
{detection.blocked && (
<Button
variant="outline"
size="sm"
onClick={() => unblockMutation.mutate(detection)}
disabled={unblockMutation.isPending}
className="w-full"
data-testid={`button-unblock-${detection.id}`}
>
<Unlock className="h-3 w-3 mr-1" />
Sblocca Router
</Button>
)}
</div> </div>
</div> </div>
</div> </div>
@ -171,11 +412,40 @@ export default function Detections() {
<div className="text-center py-12 text-muted-foreground" data-testid="text-no-results"> <div className="text-center py-12 text-muted-foreground" data-testid="text-no-results">
<AlertTriangle className="h-12 w-12 mx-auto mb-2 opacity-50" /> <AlertTriangle className="h-12 w-12 mx-auto mb-2 opacity-50" />
<p>Nessun rilevamento trovato</p> <p>Nessun rilevamento trovato</p>
{searchQuery && ( {debouncedSearch && (
<p className="text-sm">Prova con un altro termine di ricerca</p> <p className="text-sm">Prova con un altro termine di ricerca</p>
)} )}
</div> </div>
)} )}
{/* Bottom pagination */}
{totalPages > 1 && detections.length > 0 && (
<div className="flex items-center justify-center gap-4 mt-6 pt-4 border-t">
<Button
variant="outline"
size="sm"
onClick={() => setCurrentPage(p => Math.max(1, p - 1))}
disabled={currentPage === 1}
data-testid="button-prev-page-bottom"
>
<ChevronLeft className="h-4 w-4 mr-1" />
Precedente
</Button>
<span className="text-sm text-muted-foreground" data-testid="text-pagination-bottom">
Pagina {currentPage} di {totalPages} ({totalCount} totali)
</span>
<Button
variant="outline"
size="sm"
onClick={() => setCurrentPage(p => Math.min(totalPages, p + 1))}
disabled={currentPage === totalPages}
data-testid="button-next-page-bottom"
>
Successiva
<ChevronRight className="h-4 w-4 ml-1" />
</Button>
</div>
)}
</CardContent>
</Card>
</div>
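The page above expects GET /api/detections to honour limit, offset, search, anomalyType, minScore and maxScore and to return { detections, total }. A hedged sketch of the server side of that contract follows, assuming an Express-style handler; DetectionStore and makeListDetectionsHandler are illustrative names, not the repo's actual route or storage code.

import type { Request, Response } from "express";

// Minimal shape of the storage layer such a handler would call (assumed, for illustration).
interface DetectionStore {
  getDetections(opts: {
    limit: number; offset: number; search?: string;
    anomalyType?: string; minScore?: number; maxScore?: number;
  }): Promise<{ detections: unknown[]; total: number }>;
}

export function makeListDetectionsHandler(store: DetectionStore) {
  return async (req: Request, res: Response) => {
    // Parse and clamp pagination, then pass optional filters straight through.
    const limit = Math.min(parseInt(String(req.query.limit ?? "50"), 10) || 50, 200);
    const offset = Math.max(parseInt(String(req.query.offset ?? "0"), 10) || 0, 0);
    const search = typeof req.query.search === "string" && req.query.search.trim() ? req.query.search.trim() : undefined;
    const anomalyType = typeof req.query.anomalyType === "string" ? req.query.anomalyType : undefined;
    const minScore = req.query.minScore !== undefined ? Number(req.query.minScore) : undefined;
    const maxScore = req.query.maxScore !== undefined ? Number(req.query.maxScore) : undefined;

    const { detections, total } = await store.getDetections({ limit, offset, search, anomalyType, minScore, maxScore });
    res.json({ detections, total }); // shape the Detections page reads as DetectionsResponse
  };
}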

View File

@ -0,0 +1,372 @@
import { useQuery, useMutation } from "@tanstack/react-query";
import { Card, CardContent, CardDescription, CardHeader, CardTitle } from "@/components/ui/card";
import { Button } from "@/components/ui/button";
import { Badge } from "@/components/ui/badge";
import { Table, TableBody, TableCell, TableHead, TableHeader, TableRow } from "@/components/ui/table";
import { Dialog, DialogContent, DialogDescription, DialogHeader, DialogTitle, DialogTrigger } from "@/components/ui/dialog";
import { Form, FormControl, FormField, FormItem, FormLabel, FormMessage } from "@/components/ui/form";
import { Input } from "@/components/ui/input";
import { Select, SelectContent, SelectItem, SelectTrigger, SelectValue } from "@/components/ui/select";
import { Switch } from "@/components/ui/switch";
import { useForm } from "react-hook-form";
import { zodResolver } from "@hookform/resolvers/zod";
import { z } from "zod";
import { RefreshCw, Plus, Trash2, Edit, CheckCircle2, XCircle, AlertTriangle, Clock } from "lucide-react";
import { apiRequest, queryClient } from "@/lib/queryClient";
import { useToast } from "@/hooks/use-toast";
import { formatDistanceToNow } from "date-fns";
import { it } from "date-fns/locale";
import { useState } from "react";
const listFormSchema = z.object({
name: z.string().min(1, "Nome richiesto"),
type: z.enum(["blacklist", "whitelist"], {
required_error: "Tipo richiesto",
}),
url: z.string().url("URL non valida"),
enabled: z.boolean().default(true),
fetchIntervalMinutes: z.number().min(1).max(1440).default(10),
});
type ListFormValues = z.infer<typeof listFormSchema>;
export default function PublicLists() {
const { toast } = useToast();
const [isAddDialogOpen, setIsAddDialogOpen] = useState(false);
const [editingList, setEditingList] = useState<any>(null);
const { data: lists, isLoading } = useQuery({
queryKey: ["/api/public-lists"],
});
const form = useForm<ListFormValues>({
resolver: zodResolver(listFormSchema),
defaultValues: {
name: "",
type: "blacklist",
url: "",
enabled: true,
fetchIntervalMinutes: 10,
},
});
const createMutation = useMutation({
mutationFn: (data: ListFormValues) =>
apiRequest("POST", "/api/public-lists", data),
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ["/api/public-lists"] });
toast({
title: "Lista creata",
description: "La lista è stata aggiunta con successo",
});
setIsAddDialogOpen(false);
form.reset();
},
onError: (error: any) => {
toast({
title: "Errore",
description: error.message || "Impossibile creare la lista",
variant: "destructive",
});
},
});
const updateMutation = useMutation({
mutationFn: ({ id, data }: { id: string; data: Partial<ListFormValues> }) =>
apiRequest("PATCH", `/api/public-lists/${id}`, data),
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ["/api/public-lists"] });
toast({
title: "Lista aggiornata",
description: "Le modifiche sono state salvate",
});
setEditingList(null);
},
});
const deleteMutation = useMutation({
mutationFn: (id: string) =>
apiRequest("DELETE", `/api/public-lists/${id}`),
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ["/api/public-lists"] });
toast({
title: "Lista eliminata",
description: "La lista è stata rimossa",
});
},
onError: (error: any) => {
toast({
title: "Errore",
description: error.message || "Impossibile eliminare la lista",
variant: "destructive",
});
},
});
const syncMutation = useMutation({
mutationFn: (id: string) =>
apiRequest("POST", `/api/public-lists/${id}/sync`),
onSuccess: () => {
toast({
title: "Sync avviato",
description: "La sincronizzazione manuale è stata richiesta",
});
},
});
const toggleEnabled = (id: string, enabled: boolean) => {
updateMutation.mutate({ id, data: { enabled } });
};
const onSubmit = (data: ListFormValues) => {
createMutation.mutate(data);
};
const getStatusBadge = (list: any) => {
if (!list.enabled) {
return <Badge variant="outline" className="gap-1"><XCircle className="w-3 h-3" />Disabilitata</Badge>;
}
if (list.errorCount > 5) {
return <Badge variant="destructive" className="gap-1"><AlertTriangle className="w-3 h-3" />Errori</Badge>;
}
if (list.lastSuccess) {
return <Badge variant="default" className="gap-1 bg-green-600"><CheckCircle2 className="w-3 h-3" />OK</Badge>;
}
return <Badge variant="secondary" className="gap-1"><Clock className="w-3 h-3" />In attesa</Badge>;
};
const getTypeBadge = (type: string) => {
if (type === "blacklist") {
return <Badge variant="destructive">Blacklist</Badge>;
}
return <Badge variant="default" className="bg-blue-600">Whitelist</Badge>;
};
if (isLoading) {
return (
<div className="p-6">
<Card>
<CardHeader>
<CardTitle>Caricamento...</CardTitle>
</CardHeader>
</Card>
</div>
);
}
return (
<div className="p-6 space-y-6">
<div className="flex items-center justify-between">
<div>
<h1 className="text-3xl font-bold">Liste Pubbliche</h1>
<p className="text-muted-foreground mt-2">
Gestione sorgenti blacklist e whitelist esterne (aggiornamento ogni 10 minuti)
</p>
</div>
<Dialog open={isAddDialogOpen} onOpenChange={setIsAddDialogOpen}>
<DialogTrigger asChild>
<Button data-testid="button-add-list">
<Plus className="w-4 h-4 mr-2" />
Aggiungi Lista
</Button>
</DialogTrigger>
<DialogContent className="max-w-2xl">
<DialogHeader>
<DialogTitle>Aggiungi Lista Pubblica</DialogTitle>
<DialogDescription>
Configura una nuova sorgente blacklist o whitelist
</DialogDescription>
</DialogHeader>
<Form {...form}>
<form onSubmit={form.handleSubmit(onSubmit)} className="space-y-4">
<FormField
control={form.control}
name="name"
render={({ field }) => (
<FormItem>
<FormLabel>Nome</FormLabel>
<FormControl>
<Input placeholder="es. Spamhaus DROP" {...field} data-testid="input-list-name" />
</FormControl>
<FormMessage />
</FormItem>
)}
/>
<FormField
control={form.control}
name="type"
render={({ field }) => (
<FormItem>
<FormLabel>Tipo</FormLabel>
<Select onValueChange={field.onChange} defaultValue={field.value}>
<FormControl>
<SelectTrigger data-testid="select-list-type">
<SelectValue placeholder="Seleziona tipo" />
</SelectTrigger>
</FormControl>
<SelectContent>
<SelectItem value="blacklist">Blacklist</SelectItem>
<SelectItem value="whitelist">Whitelist</SelectItem>
</SelectContent>
</Select>
<FormMessage />
</FormItem>
)}
/>
<FormField
control={form.control}
name="url"
render={({ field }) => (
<FormItem>
<FormLabel>URL</FormLabel>
<FormControl>
<Input placeholder="https://example.com/list.txt" {...field} data-testid="input-list-url" />
</FormControl>
<FormMessage />
</FormItem>
)}
/>
<FormField
control={form.control}
name="fetchIntervalMinutes"
render={({ field }) => (
<FormItem>
<FormLabel>Intervallo Sync (minuti)</FormLabel>
<FormControl>
<Input
type="number"
{...field}
onChange={(e) => field.onChange(parseInt(e.target.value))}
data-testid="input-list-interval"
/>
</FormControl>
<FormMessage />
</FormItem>
)}
/>
<FormField
control={form.control}
name="enabled"
render={({ field }) => (
<FormItem className="flex items-center justify-between">
<FormLabel>Abilitata</FormLabel>
<FormControl>
<Switch
checked={field.value}
onCheckedChange={field.onChange}
data-testid="switch-list-enabled"
/>
</FormControl>
</FormItem>
)}
/>
<div className="flex justify-end gap-2 pt-4">
<Button type="button" variant="outline" onClick={() => setIsAddDialogOpen(false)}>
Annulla
</Button>
<Button type="submit" disabled={createMutation.isPending} data-testid="button-save-list">
{createMutation.isPending ? "Salvataggio..." : "Salva"}
</Button>
</div>
</form>
</Form>
</DialogContent>
</Dialog>
</div>
<Card>
<CardHeader>
<CardTitle>Sorgenti Configurate</CardTitle>
<CardDescription>
{lists?.length || 0} liste configurate
</CardDescription>
</CardHeader>
<CardContent>
<Table>
<TableHeader>
<TableRow>
<TableHead>Nome</TableHead>
<TableHead>Tipo</TableHead>
<TableHead>Stato</TableHead>
<TableHead>IP Totali</TableHead>
<TableHead>IP Attivi</TableHead>
<TableHead>Ultimo Sync</TableHead>
<TableHead className="text-right">Azioni</TableHead>
</TableRow>
</TableHeader>
<TableBody>
{lists?.map((list: any) => (
<TableRow key={list.id} data-testid={`row-list-${list.id}`}>
<TableCell className="font-medium">
<div>
<div>{list.name}</div>
<div className="text-xs text-muted-foreground truncate max-w-xs">
{list.url}
</div>
</div>
</TableCell>
<TableCell>{getTypeBadge(list.type)}</TableCell>
<TableCell>{getStatusBadge(list)}</TableCell>
<TableCell data-testid={`text-total-ips-${list.id}`}>{list.totalIps?.toLocaleString() || 0}</TableCell>
<TableCell data-testid={`text-active-ips-${list.id}`}>{list.activeIps?.toLocaleString() || 0}</TableCell>
<TableCell>
{list.lastSuccess ? (
<span className="text-sm">
{formatDistanceToNow(new Date(list.lastSuccess), {
addSuffix: true,
locale: it,
})}
</span>
) : (
<span className="text-sm text-muted-foreground">Mai</span>
)}
</TableCell>
<TableCell className="text-right">
<div className="flex items-center justify-end gap-2">
<Switch
checked={list.enabled}
onCheckedChange={(checked) => toggleEnabled(list.id, checked)}
data-testid={`switch-enable-${list.id}`}
/>
<Button
variant="outline"
size="icon"
onClick={() => syncMutation.mutate(list.id)}
disabled={syncMutation.isPending}
data-testid={`button-sync-${list.id}`}
>
<RefreshCw className="w-4 h-4" />
</Button>
<Button
variant="destructive"
size="icon"
onClick={() => {
if (confirm(`Eliminare la lista "${list.name}"?`)) {
deleteMutation.mutate(list.id);
}
}}
data-testid={`button-delete-${list.id}`}
>
<Trash2 className="w-4 h-4" />
</Button>
</div>
</TableCell>
</TableRow>
))}
{(!lists || lists.length === 0) && (
<TableRow>
<TableCell colSpan={7} className="text-center text-muted-foreground py-8">
Nessuna lista configurata. Aggiungi la prima lista.
</TableCell>
</TableRow>
)}
</TableBody>
</Table>
</CardContent>
</Card>
</div>
);
}
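The lists query above is typed as any; the fields this table reads suggest roughly the following record shape. This is an inference from the rendering code, not the repo's schema, and the field types are assumptions.

// Inferred shape of a public-list row, based only on the fields read above
// (id, name, url, type, enabled, errorCount, lastSuccess, totalIps, activeIps).
interface PublicListRow {
  id: string;
  name: string;
  type: "blacklist" | "whitelist";
  url: string;
  enabled: boolean;
  fetchIntervalMinutes: number;
  totalIps: number | null;
  activeIps: number | null;
  lastSuccess: string | null; // timestamp formatted with date-fns above
  errorCount: number;
}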

View File

@ -1,19 +1,108 @@
import { useState } from "react";
import { useQuery, useMutation } from "@tanstack/react-query";
import { queryClient, apiRequest } from "@/lib/queryClient";
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
import { Badge } from "@/components/ui/badge";
import { Button } from "@/components/ui/button";
import {
Dialog,
DialogContent,
DialogDescription,
DialogHeader,
DialogTitle,
DialogTrigger,
DialogFooter,
} from "@/components/ui/dialog";
import {
Form,
FormControl,
FormDescription,
FormField,
FormItem,
FormLabel,
FormMessage,
} from "@/components/ui/form";
import { Input } from "@/components/ui/input";
import { Switch } from "@/components/ui/switch";
import { Server, Plus, Trash2, Edit } from "lucide-react";
import { format } from "date-fns"; import { format } from "date-fns";
import { useForm } from "react-hook-form";
import { zodResolver } from "@hookform/resolvers/zod";
import { insertRouterSchema, type InsertRouter } from "@shared/schema";
import type { Router } from "@shared/schema";
import { useToast } from "@/hooks/use-toast";
export default function Routers() {
const { toast } = useToast();
const [addDialogOpen, setAddDialogOpen] = useState(false);
const [editDialogOpen, setEditDialogOpen] = useState(false);
const [editingRouter, setEditingRouter] = useState<Router | null>(null);
const { data: routers, isLoading } = useQuery<Router[]>({
queryKey: ["/api/routers"],
});
const addForm = useForm<InsertRouter>({
resolver: zodResolver(insertRouterSchema),
defaultValues: {
name: "",
ipAddress: "",
apiPort: 8729,
username: "",
password: "",
enabled: true,
},
});
const editForm = useForm<InsertRouter>({
resolver: zodResolver(insertRouterSchema),
});
const addMutation = useMutation({
mutationFn: async (data: InsertRouter) => {
return await apiRequest("POST", "/api/routers", data);
},
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ["/api/routers"] });
toast({
title: "Router aggiunto",
description: "Il router è stato configurato con successo",
});
setAddDialogOpen(false);
addForm.reset();
},
onError: (error: any) => {
toast({
title: "Errore",
description: error.message || "Impossibile aggiungere il router",
variant: "destructive",
});
},
});
const updateMutation = useMutation({
mutationFn: async ({ id, data }: { id: string; data: InsertRouter }) => {
return await apiRequest("PUT", `/api/routers/${id}`, data);
},
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ["/api/routers"] });
toast({
title: "Router aggiornato",
description: "Le modifiche sono state salvate con successo",
});
setEditDialogOpen(false);
setEditingRouter(null);
editForm.reset();
},
onError: (error: any) => {
toast({
title: "Errore",
description: error.message || "Impossibile aggiornare il router",
variant: "destructive",
});
},
});
const deleteMutation = useMutation({
mutationFn: async (id: string) => {
await apiRequest("DELETE", `/api/routers/${id}`);
@ -34,6 +123,29 @@ export default function Routers() {
},
});
const handleAddSubmit = (data: InsertRouter) => {
addMutation.mutate(data);
};
const handleEditSubmit = (data: InsertRouter) => {
if (editingRouter) {
updateMutation.mutate({ id: editingRouter.id, data });
}
};
const handleEdit = (router: Router) => {
setEditingRouter(router);
editForm.reset({
name: router.name,
ipAddress: router.ipAddress,
apiPort: router.apiPort,
username: router.username,
password: router.password,
enabled: router.enabled,
});
setEditDialogOpen(true);
};
return (
<div className="flex flex-col gap-6 p-6" data-testid="page-routers">
<div className="flex items-center justify-between">
@ -43,10 +155,152 @@ export default function Routers() {
Gestisci i router connessi al sistema IDS
</p>
</div>
<Dialog open={addDialogOpen} onOpenChange={setAddDialogOpen}>
<DialogTrigger asChild>
<Button data-testid="button-add-router">
<Plus className="h-4 w-4 mr-2" />
Aggiungi Router
</Button>
</DialogTrigger>
<DialogContent className="sm:max-w-[500px]" data-testid="dialog-add-router">
<DialogHeader>
<DialogTitle>Aggiungi Router MikroTik</DialogTitle>
<DialogDescription>
Configura un nuovo router MikroTik per il sistema IDS. Assicurati che l'API RouterOS (porta 8729/8728) sia abilitata.
</DialogDescription>
</DialogHeader>
<Form {...addForm}>
<form onSubmit={addForm.handleSubmit(handleAddSubmit)} className="space-y-4">
<FormField
control={addForm.control}
name="name"
render={({ field }) => (
<FormItem>
<FormLabel>Nome Router</FormLabel>
<FormControl>
<Input placeholder="es. MikroTik Ufficio" {...field} data-testid="input-name" />
</FormControl>
<FormDescription>
Nome descrittivo per identificare il router
</FormDescription>
<FormMessage />
</FormItem>
)}
/>
<FormField
control={addForm.control}
name="ipAddress"
render={({ field }) => (
<FormItem>
<FormLabel>Indirizzo IP</FormLabel>
<FormControl>
<Input placeholder="es. 192.168.1.1" {...field} data-testid="input-ip" />
</FormControl>
<FormDescription>
Indirizzo IP o hostname del router
</FormDescription>
<FormMessage />
</FormItem>
)}
/>
<FormField
control={addForm.control}
name="apiPort"
render={({ field }) => (
<FormItem>
<FormLabel>Porta API</FormLabel>
<FormControl>
<Input
type="number"
placeholder="8729"
{...field}
onChange={(e) => field.onChange(parseInt(e.target.value))}
data-testid="input-port"
/>
</FormControl>
<FormDescription>
Porta RouterOS API MikroTik (8729 per API-SSL, 8728 per API)
</FormDescription>
<FormMessage />
</FormItem>
)}
/>
<FormField
control={addForm.control}
name="username"
render={({ field }) => (
<FormItem>
<FormLabel>Username</FormLabel>
<FormControl>
<Input placeholder="admin" {...field} data-testid="input-username" />
</FormControl>
<FormMessage />
</FormItem>
)}
/>
<FormField
control={addForm.control}
name="password"
render={({ field }) => (
<FormItem>
<FormLabel>Password</FormLabel>
<FormControl>
<Input type="password" placeholder="••••••••" {...field} data-testid="input-password" />
</FormControl>
<FormMessage />
</FormItem>
)}
/>
<FormField
control={addForm.control}
name="enabled"
render={({ field }) => (
<FormItem className="flex flex-row items-center justify-between rounded-lg border p-3">
<div className="space-y-0.5">
<FormLabel>Abilitato</FormLabel>
<FormDescription>
Attiva il router per il blocco automatico degli IP
</FormDescription>
</div>
<FormControl>
<Switch
checked={field.value}
onCheckedChange={field.onChange}
data-testid="switch-enabled"
/>
</FormControl>
</FormItem>
)}
/>
<DialogFooter>
<Button
type="button"
variant="outline"
onClick={() => setAddDialogOpen(false)}
data-testid="button-cancel"
>
Annulla
</Button>
<Button
type="submit"
disabled={addMutation.isPending}
data-testid="button-submit"
>
{addMutation.isPending ? "Salvataggio..." : "Salva Router"}
</Button>
</DialogFooter>
</form>
</Form>
</DialogContent>
</Dialog>
</div>
<Card data-testid="card-routers">
@ -114,9 +368,11 @@ export default function Routers() {
variant="outline" variant="outline"
size="sm" size="sm"
className="flex-1" className="flex-1"
data-testid={`button-test-${router.id}`} onClick={() => handleEdit(router)}
data-testid={`button-edit-${router.id}`}
> >
Test Connessione <Edit className="h-4 w-4 mr-2" />
Modifica
</Button> </Button>
<Button <Button
variant="outline" variant="outline"
@ -140,6 +396,140 @@ export default function Routers() {
)}
</CardContent>
</Card>
<Dialog open={editDialogOpen} onOpenChange={setEditDialogOpen}>
<DialogContent className="sm:max-w-[500px]" data-testid="dialog-edit-router">
<DialogHeader>
<DialogTitle>Modifica Router</DialogTitle>
<DialogDescription>
Modifica le impostazioni del router {editingRouter?.name}
</DialogDescription>
</DialogHeader>
<Form {...editForm}>
<form onSubmit={editForm.handleSubmit(handleEditSubmit)} className="space-y-4">
<FormField
control={editForm.control}
name="name"
render={({ field }) => (
<FormItem>
<FormLabel>Nome Router</FormLabel>
<FormControl>
<Input placeholder="es. MikroTik Ufficio" {...field} data-testid="input-edit-name" />
</FormControl>
<FormMessage />
</FormItem>
)}
/>
<FormField
control={editForm.control}
name="ipAddress"
render={({ field }) => (
<FormItem>
<FormLabel>Indirizzo IP</FormLabel>
<FormControl>
<Input placeholder="es. 192.168.1.1" {...field} data-testid="input-edit-ip" />
</FormControl>
<FormMessage />
</FormItem>
)}
/>
<FormField
control={editForm.control}
name="apiPort"
render={({ field }) => (
<FormItem>
<FormLabel>Porta API</FormLabel>
<FormControl>
<Input
type="number"
placeholder="8729"
{...field}
onChange={(e) => field.onChange(parseInt(e.target.value))}
data-testid="input-edit-port"
/>
</FormControl>
<FormDescription>
Porta RouterOS API MikroTik (8729 per API-SSL, 8728 per API)
</FormDescription>
<FormMessage />
</FormItem>
)}
/>
<FormField
control={editForm.control}
name="username"
render={({ field }) => (
<FormItem>
<FormLabel>Username</FormLabel>
<FormControl>
<Input placeholder="admin" {...field} data-testid="input-edit-username" />
</FormControl>
<FormMessage />
</FormItem>
)}
/>
<FormField
control={editForm.control}
name="password"
render={({ field }) => (
<FormItem>
<FormLabel>Password</FormLabel>
<FormControl>
<Input type="password" placeholder="••••••••" {...field} data-testid="input-edit-password" />
</FormControl>
<FormMessage />
</FormItem>
)}
/>
<FormField
control={editForm.control}
name="enabled"
render={({ field }) => (
<FormItem className="flex flex-row items-center justify-between rounded-lg border p-3">
<div className="space-y-0.5">
<FormLabel>Abilitato</FormLabel>
<FormDescription>
Attiva il router per il blocco automatico degli IP
</FormDescription>
</div>
<FormControl>
<Switch
checked={field.value}
onCheckedChange={field.onChange}
data-testid="switch-edit-enabled"
/>
</FormControl>
</FormItem>
)}
/>
<DialogFooter>
<Button
type="button"
variant="outline"
onClick={() => setEditDialogOpen(false)}
data-testid="button-edit-cancel"
>
Annulla
</Button>
<Button
type="submit"
disabled={updateMutation.isPending}
data-testid="button-edit-submit"
>
{updateMutation.isPending ? "Salvataggio..." : "Salva Modifiche"}
</Button>
</DialogFooter>
</form>
</Form>
</DialogContent>
</Dialog>
</div>
);
}


@@ -0,0 +1,439 @@
import { useQuery, useMutation } from "@tanstack/react-query";
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
import { Badge } from "@/components/ui/badge";
import { Button } from "@/components/ui/button";
import { Activity, Brain, Database, FileText, Terminal, RefreshCw, AlertCircle, Play, Square, RotateCw } from "lucide-react";
import { Alert, AlertDescription, AlertTitle } from "@/components/ui/alert";
import { useToast } from "@/hooks/use-toast";
import { queryClient, apiRequest } from "@/lib/queryClient";
interface ServiceStatus {
name: string;
status: "running" | "idle" | "offline" | "error" | "unknown";
healthy: boolean;
details: any;
}
interface ServicesStatusResponse {
services: {
mlBackend: ServiceStatus;
database: ServiceStatus;
syslogParser: ServiceStatus;
analyticsAggregator: ServiceStatus;
};
}
export default function ServicesPage() {
const { toast } = useToast();
const { data: servicesStatus, isLoading, refetch } = useQuery<ServicesStatusResponse>({
queryKey: ["/api/services/status"],
refetchInterval: 5000, // Refresh every 5s
});
// Mutation for service control
const serviceControlMutation = useMutation({
mutationFn: async ({ service, action }: { service: string; action: string }) => {
return apiRequest("POST", `/api/services/${service}/${action}`);
},
onSuccess: (data, variables) => {
toast({
title: "Operazione completata",
description: `Servizio ${variables.service}: ${variables.action} eseguito con successo`,
});
// Refresh status after 2 seconds
setTimeout(() => {
queryClient.invalidateQueries({ queryKey: ["/api/services/status"] });
}, 2000);
},
onError: (error: any, variables) => {
toast({
title: "Errore operazione",
description: error.message || `Impossibile eseguire ${variables.action} su ${variables.service}`,
variant: "destructive",
});
},
});
const handleServiceAction = (service: string, action: string) => {
serviceControlMutation.mutate({ service, action });
};
const getStatusBadge = (service: ServiceStatus) => {
if (service.healthy) {
return <Badge variant="default" className="bg-green-600" data-testid={`badge-status-healthy`}>Online</Badge>;
}
if (service.status === 'idle') {
return <Badge variant="secondary" data-testid={`badge-status-idle`}>In Attesa</Badge>;
}
if (service.status === 'offline') {
return <Badge variant="destructive" data-testid={`badge-status-offline`}>Offline</Badge>;
}
if (service.status === 'error') {
return <Badge variant="destructive" data-testid={`badge-status-error`}>Errore</Badge>;
}
return <Badge variant="outline" data-testid={`badge-status-unknown`}>Sconosciuto</Badge>;
};
const getStatusIndicator = (service: ServiceStatus) => {
if (service.healthy) {
return <div className="h-3 w-3 rounded-full bg-green-500" />;
}
if (service.status === 'idle') {
return <div className="h-3 w-3 rounded-full bg-yellow-500" />;
}
return <div className="h-3 w-3 rounded-full bg-red-500" />;
};
return (
<div className="flex flex-col gap-6 p-6" data-testid="page-services">
<div className="flex items-center justify-between">
<div>
<h1 className="text-3xl font-semibold" data-testid="text-services-title">Gestione Servizi</h1>
<p className="text-muted-foreground" data-testid="text-services-subtitle">
Monitoraggio e controllo dei servizi IDS
</p>
</div>
<Button onClick={() => refetch()} variant="outline" data-testid="button-refresh">
<RefreshCw className="h-4 w-4 mr-2" />
Aggiorna
</Button>
</div>
<Alert data-testid="alert-server-instructions">
<AlertCircle className="h-4 w-4" />
<AlertTitle>Gestione Servizi Systemd</AlertTitle>
<AlertDescription>
I servizi IDS sono gestiti da systemd sul server AlmaLinux.
Usa i pulsanti qui sotto per controllarli oppure esegui i comandi systemctl direttamente sul server.
</AlertDescription>
</Alert>
{/* Services Grid */}
<div className="grid grid-cols-1 lg:grid-cols-3 gap-6">
{/* ML Backend Service */}
<Card data-testid="card-ml-backend-service">
<CardHeader>
<CardTitle className="flex items-center gap-2 text-lg">
<Brain className="h-5 w-5" />
ML Backend Python
{servicesStatus && getStatusIndicator(servicesStatus.services.mlBackend)}
</CardTitle>
</CardHeader>
<CardContent className="space-y-4">
<div className="flex items-center justify-between">
<span className="text-sm text-muted-foreground">Stato:</span>
{servicesStatus && getStatusBadge(servicesStatus.services.mlBackend)}
</div>
{servicesStatus?.services.mlBackend.details?.modelLoaded !== undefined && (
<div className="flex items-center justify-between">
<span className="text-sm text-muted-foreground">Modello ML:</span>
<Badge variant={servicesStatus.services.mlBackend.details.modelLoaded ? "default" : "secondary"}>
{servicesStatus.services.mlBackend.details.modelLoaded ? "Caricato" : "Non Caricato"}
</Badge>
</div>
)}
{/* Service Controls */}
<div className="mt-4 space-y-2">
<p className="text-xs font-medium mb-2">Controlli Servizio:</p>
<div className="flex gap-2 flex-wrap">
<Button
size="sm"
variant="outline"
onClick={() => handleServiceAction("ids-ml-backend", "start")}
disabled={serviceControlMutation.isPending || servicesStatus?.services.mlBackend.status === 'running'}
data-testid="button-start-ml"
>
<Play className="h-3 w-3 mr-1" />
Start
</Button>
<Button
size="sm"
variant="outline"
onClick={() => handleServiceAction("ids-ml-backend", "stop")}
disabled={serviceControlMutation.isPending || servicesStatus?.services.mlBackend.status === 'offline'}
data-testid="button-stop-ml"
>
<Square className="h-3 w-3 mr-1" />
Stop
</Button>
<Button
size="sm"
variant="outline"
onClick={() => handleServiceAction("ids-ml-backend", "restart")}
disabled={serviceControlMutation.isPending}
data-testid="button-restart-ml"
>
<RotateCw className="h-3 w-3 mr-1" />
Restart
</Button>
</div>
</div>
{/* Manual Commands (fallback) */}
<div className="mt-4 p-3 bg-muted rounded-lg">
<p className="text-xs font-medium mb-2">Comando systemctl (sul server):</p>
<code className="text-xs bg-background p-2 rounded block font-mono" data-testid="code-systemctl-ml">
sudo systemctl {servicesStatus?.services.mlBackend.status === 'offline' ? 'start' : 'restart'} ids-ml-backend
</code>
</div>
<div className="mt-4 p-3 bg-muted rounded-lg">
<p className="text-xs font-medium mb-2">Log:</p>
<code className="text-xs bg-background p-2 rounded block font-mono" data-testid="code-log-ml">
tail -f /var/log/ids/backend.log
</code>
</div>
</CardContent>
</Card>
{/* Database Service */}
<Card data-testid="card-database-service">
<CardHeader>
<CardTitle className="flex items-center gap-2 text-lg">
<Database className="h-5 w-5" />
PostgreSQL Database
{servicesStatus && getStatusIndicator(servicesStatus.services.database)}
</CardTitle>
</CardHeader>
<CardContent className="space-y-4">
<div className="flex items-center justify-between">
<span className="text-sm text-muted-foreground">Stato:</span>
{servicesStatus && getStatusBadge(servicesStatus.services.database)}
</div>
{servicesStatus?.services.database.status === 'running' && (
<div className="flex items-center justify-between">
<span className="text-sm text-muted-foreground">Connessione:</span>
<Badge variant="default" className="bg-green-600">Connesso</Badge>
</div>
)}
<div className="mt-4 p-3 bg-muted rounded-lg">
<p className="text-xs font-medium mb-2">Verifica status:</p>
<code className="text-xs bg-background p-2 rounded block font-mono" data-testid="code-status-db">
systemctl status postgresql-16
</code>
</div>
{servicesStatus?.services.database.status === 'error' && (
<div className="mt-4 p-3 bg-muted rounded-lg">
<p className="text-xs font-medium mb-2">Riavvia database:</p>
<code className="text-xs bg-background p-2 rounded block font-mono" data-testid="code-restart-db">
sudo systemctl restart postgresql-16
</code>
</div>
)}
<div className="mt-4 p-3 bg-muted rounded-lg">
<p className="text-xs font-medium mb-2">Log:</p>
<code className="text-xs bg-background p-2 rounded block font-mono" data-testid="code-log-db">
sudo journalctl -u postgresql-16 -f
</code>
</div>
</CardContent>
</Card>
{/* Syslog Parser Service */}
<Card data-testid="card-syslog-parser-service">
<CardHeader>
<CardTitle className="flex items-center gap-2 text-lg">
<FileText className="h-5 w-5" />
Syslog Parser
{servicesStatus && getStatusIndicator(servicesStatus.services.syslogParser)}
</CardTitle>
</CardHeader>
<CardContent className="space-y-4">
<div className="flex items-center justify-between">
<span className="text-sm text-muted-foreground">Stato:</span>
{servicesStatus && getStatusBadge(servicesStatus.services.syslogParser)}
</div>
{servicesStatus?.services.syslogParser.details?.pid && (
<div className="flex items-center justify-between">
<span className="text-sm text-muted-foreground">PID Processo:</span>
<Badge variant="outline" className="font-mono">
{servicesStatus.services.syslogParser.details.pid}
</Badge>
</div>
)}
{servicesStatus?.services.syslogParser.details?.systemd_unit && (
<div className="flex items-center justify-between">
<span className="text-sm text-muted-foreground">Systemd Unit:</span>
<Badge variant="outline" className="font-mono text-xs">
{servicesStatus.services.syslogParser.details.systemd_unit}
</Badge>
</div>
)}
{/* Service Controls */}
<div className="mt-4 space-y-2">
<p className="text-xs font-medium mb-2">Controlli Servizio:</p>
<div className="flex gap-2 flex-wrap">
<Button
size="sm"
variant="outline"
onClick={() => handleServiceAction("ids-syslog-parser", "start")}
disabled={serviceControlMutation.isPending || servicesStatus?.services.syslogParser.status === 'running'}
data-testid="button-start-parser"
>
<Play className="h-3 w-3 mr-1" />
Start
</Button>
<Button
size="sm"
variant="outline"
onClick={() => handleServiceAction("ids-syslog-parser", "stop")}
disabled={serviceControlMutation.isPending || servicesStatus?.services.syslogParser.status === 'offline'}
data-testid="button-stop-parser"
>
<Square className="h-3 w-3 mr-1" />
Stop
</Button>
<Button
size="sm"
variant="outline"
onClick={() => handleServiceAction("ids-syslog-parser", "restart")}
disabled={serviceControlMutation.isPending}
data-testid="button-restart-parser"
>
<RotateCw className="h-3 w-3 mr-1" />
Restart
</Button>
</div>
</div>
{/* Manual Commands (fallback) */}
<div className="mt-4 p-3 bg-muted rounded-lg">
<p className="text-xs font-medium mb-2">Comando systemctl (sul server):</p>
<code className="text-xs bg-background p-2 rounded block font-mono" data-testid="code-systemctl-parser">
sudo systemctl {servicesStatus?.services.syslogParser.status === 'offline' ? 'start' : 'restart'} ids-syslog-parser
</code>
</div>
<div className="mt-4 p-3 bg-muted rounded-lg">
<p className="text-xs font-medium mb-2">Log:</p>
<code className="text-xs bg-background p-2 rounded block font-mono" data-testid="code-log-parser">
tail -f /var/log/ids/syslog_parser.log
</code>
</div>
</CardContent>
</Card>
{/* Analytics Aggregator Service */}
<Card data-testid="card-analytics-aggregator-service">
<CardHeader>
<CardTitle className="flex items-center gap-2 text-lg">
<Activity className="h-5 w-5" />
Analytics Aggregator
{servicesStatus && getStatusIndicator(servicesStatus.services.analyticsAggregator)}
</CardTitle>
</CardHeader>
<CardContent className="space-y-4">
<div className="flex items-center justify-between">
<span className="text-sm text-muted-foreground">Stato:</span>
{servicesStatus && getStatusBadge(servicesStatus.services.analyticsAggregator)}
</div>
{servicesStatus?.services.analyticsAggregator.details?.lastRun && (
<div className="flex items-center justify-between">
<span className="text-sm text-muted-foreground">Ultima Aggregazione:</span>
<Badge variant="outline" className="text-xs">
{new Date(servicesStatus.services.analyticsAggregator.details.lastRun).toLocaleString('it-IT')}
</Badge>
</div>
)}
{servicesStatus?.services.analyticsAggregator.details?.hoursSinceLastRun && (
<div className="flex items-center justify-between">
<span className="text-sm text-muted-foreground">Ore dall'ultimo run:</span>
<Badge variant={parseFloat(servicesStatus.services.analyticsAggregator.details.hoursSinceLastRun) < 2 ? "default" : "destructive"}>
{servicesStatus.services.analyticsAggregator.details.hoursSinceLastRun}h
</Badge>
</div>
)}
{/* CRITICAL ALERT: Aggregator idle for too long */}
{servicesStatus?.services.analyticsAggregator.details?.hoursSinceLastRun &&
parseFloat(servicesStatus.services.analyticsAggregator.details.hoursSinceLastRun) > 2 && (
<Alert variant="destructive" className="mt-2" data-testid="alert-aggregator-idle">
<AlertCircle className="h-4 w-4" />
<AlertTitle className="text-sm font-semibold">Timer Systemd Non Attivo</AlertTitle>
<AlertDescription className="text-xs mt-1">
<p className="mb-2">L'aggregatore non esegue da {servicesStatus.services.analyticsAggregator.details.hoursSinceLastRun}h! Dashboard e Analytics bloccati.</p>
<p className="font-semibold">Soluzione Immediata (sul server):</p>
<code className="block bg-destructive-foreground/10 p-2 rounded mt-1 font-mono text-xs">
sudo /opt/ids/deployment/setup_analytics_timer.sh
</code>
</AlertDescription>
</Alert>
)}
<div className="mt-4 p-3 bg-muted rounded-lg">
<p className="text-xs font-medium mb-2">Verifica timer:</p>
<code className="text-xs bg-background p-2 rounded block font-mono" data-testid="code-status-aggregator">
systemctl status ids-analytics-aggregator.timer
</code>
</div>
<div className="mt-4 p-3 bg-muted rounded-lg">
<p className="text-xs font-medium mb-2">Avvia aggregazione manualmente:</p>
<code className="text-xs bg-background p-2 rounded block font-mono" data-testid="code-run-aggregator">
cd /opt/ids && ./deployment/run_analytics.sh
</code>
</div>
<div className="mt-4 p-3 bg-muted rounded-lg">
<p className="text-xs font-medium mb-2">Log:</p>
<code className="text-xs bg-background p-2 rounded block font-mono" data-testid="code-log-aggregator">
journalctl -u ids-analytics-aggregator.timer -f
</code>
</div>
</CardContent>
</Card>
</div>
{/* Additional Commands */}
<Card data-testid="card-additional-commands">
<CardHeader>
<CardTitle className="flex items-center gap-2">
<Terminal className="h-5 w-5" />
Comandi Utili
</CardTitle>
</CardHeader>
<CardContent className="space-y-4">
<div>
<p className="text-sm font-medium mb-2">Verifica tutti i processi IDS attivi:</p>
<code className="text-xs bg-muted p-2 rounded block font-mono" data-testid="code-check-processes">
ps aux | grep -E "python.*(main|syslog_parser)" | grep -v grep
</code>
</div>
<div>
<p className="text-sm font-medium mb-2">Verifica log RSyslog (ricezione log MikroTik):</p>
<code className="text-xs bg-muted p-2 rounded block font-mono" data-testid="code-check-rsyslog">
tail -f /var/log/mikrotik/raw.log
</code>
</div>
<div>
<p className="text-sm font-medium mb-2">Esegui training manuale ML:</p>
<code className="text-xs bg-muted p-2 rounded block font-mono" data-testid="code-manual-training">
curl -X POST http://localhost:8000/train -H "Content-Type: application/json" -d '{"max_records": 10000, "hours_back": 24}'
</code>
</div>
<div>
<p className="text-sm font-medium mb-2">Verifica storico training nel database:</p>
<code className="text-xs bg-muted p-2 rounded block font-mono" data-testid="code-check-training">
psql $DATABASE_URL -c "SELECT * FROM training_history ORDER BY trained_at DESC LIMIT 5;"
</code>
</div>
</CardContent>
</Card>
</div>
);
}


@@ -0,0 +1,473 @@
import { useQuery, useMutation } from "@tanstack/react-query";
import { queryClient, apiRequest } from "@/lib/queryClient";
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
import { Button } from "@/components/ui/button";
import { Badge } from "@/components/ui/badge";
import { Brain, Play, Search, CheckCircle2, XCircle, Clock, TrendingUp } from "lucide-react";
import { format } from "date-fns";
import type { TrainingHistory } from "@shared/schema";
import { useToast } from "@/hooks/use-toast";
import { useState } from "react";
import { useForm } from "react-hook-form";
import { zodResolver } from "@hookform/resolvers/zod";
import { z } from "zod";
import {
Dialog,
DialogContent,
DialogDescription,
DialogHeader,
DialogTitle,
DialogTrigger,
DialogFooter,
} from "@/components/ui/dialog";
import {
Form,
FormControl,
FormField,
FormItem,
FormLabel,
FormMessage,
FormDescription,
} from "@/components/ui/form";
import { Input } from "@/components/ui/input";
import { Checkbox } from "@/components/ui/checkbox";
interface MLStatsResponse {
logs?: { total: number; last_hour: number };
detections?: { total: number; blocked: number };
routers?: { active: number };
latest_training?: any;
}
const trainFormSchema = z.object({
max_records: z.coerce.number().min(1, "Minimo 1 record").max(1000000, "Massimo 1M record"),
hours_back: z.coerce.number().min(1, "Minimo 1 ora").max(720, "Massimo 720 ore (30 giorni)"),
});
const detectFormSchema = z.object({
max_records: z.coerce.number().min(1, "Minimo 1 record").max(1000000, "Massimo 1M record"),
hours_back: z.coerce.number().min(1, "Minimo 1 ora").max(720, "Massimo 720 ore"),
risk_threshold: z.coerce.number().min(0, "Minimo 0").max(100, "Massimo 100"),
auto_block: z.boolean().default(true),
});
export default function TrainingPage() {
const { toast } = useToast();
const [isTrainDialogOpen, setIsTrainDialogOpen] = useState(false);
const [isDetectDialogOpen, setIsDetectDialogOpen] = useState(false);
const trainForm = useForm<z.infer<typeof trainFormSchema>>({
resolver: zodResolver(trainFormSchema),
defaultValues: {
max_records: 100000,
hours_back: 24,
},
});
const detectForm = useForm<z.infer<typeof detectFormSchema>>({
resolver: zodResolver(detectFormSchema),
defaultValues: {
max_records: 50000,
hours_back: 1,
risk_threshold: 75,
auto_block: true,
},
});
const { data: history, isLoading } = useQuery<TrainingHistory[]>({
queryKey: ["/api/training-history"],
refetchInterval: 10000,
});
const { data: mlStats } = useQuery<MLStatsResponse>({
queryKey: ["/api/ml/stats"],
refetchInterval: 10000,
});
const trainMutation = useMutation({
mutationFn: async (params: z.infer<typeof trainFormSchema>) => {
return await apiRequest("POST", "/api/ml/train", params);
},
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ["/api/training-history"] });
queryClient.invalidateQueries({ queryKey: ["/api/ml/stats"] });
toast({
title: "Training avviato",
description: "Il modello ML è in addestramento. Controlla lo storico tra qualche minuto.",
});
setIsTrainDialogOpen(false);
trainForm.reset();
},
onError: (error: any) => {
toast({
title: "Errore",
description: error.message || "Impossibile avviare il training",
variant: "destructive",
});
},
});
const detectMutation = useMutation({
mutationFn: async (params: z.infer<typeof detectFormSchema>) => {
return await apiRequest("POST", "/api/ml/detect", params);
},
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ["/api/detections"] });
queryClient.invalidateQueries({ queryKey: ["/api/stats"] });
toast({
title: "Detection avviata",
description: "Analisi anomalie in corso. Controlla i rilevamenti tra qualche secondo.",
});
setIsDetectDialogOpen(false);
detectForm.reset();
},
onError: (error: any) => {
toast({
title: "Errore",
description: error.message || "Impossibile avviare la detection",
variant: "destructive",
});
},
});
const onTrainSubmit = (data: z.infer<typeof trainFormSchema>) => {
trainMutation.mutate(data);
};
const onDetectSubmit = (data: z.infer<typeof detectFormSchema>) => {
detectMutation.mutate(data);
};
return (
<div className="flex flex-col gap-6 p-6" data-testid="page-training">
<div>
<h1 className="text-3xl font-semibold" data-testid="text-page-title">Machine Learning</h1>
<p className="text-muted-foreground" data-testid="text-page-subtitle">
Training e detection del modello Isolation Forest
</p>
</div>
{/* ML Stats */}
{mlStats && (
<div className="grid grid-cols-1 md:grid-cols-3 gap-4">
<Card data-testid="card-ml-logs">
<CardHeader className="flex flex-row items-center justify-between gap-2 space-y-0 pb-2">
<CardTitle className="text-sm font-medium">Log Totali</CardTitle>
<Brain className="h-4 w-4 text-muted-foreground" />
</CardHeader>
<CardContent>
<div className="text-2xl font-semibold" data-testid="text-ml-logs-total">
{mlStats.logs?.total?.toLocaleString() || 0}
</div>
<p className="text-xs text-muted-foreground mt-1">
Ultima ora: {mlStats.logs?.last_hour?.toLocaleString() || 0}
</p>
</CardContent>
</Card>
<Card data-testid="card-ml-detections">
<CardHeader className="flex flex-row items-center justify-between gap-2 space-y-0 pb-2">
<CardTitle className="text-sm font-medium">Detection Totali</CardTitle>
<Search className="h-4 w-4 text-muted-foreground" />
</CardHeader>
<CardContent>
<div className="text-2xl font-semibold" data-testid="text-ml-detections-total">
{mlStats.detections?.total || 0}
</div>
<p className="text-xs text-muted-foreground mt-1">
Bloccati: {mlStats.detections?.blocked || 0}
</p>
</CardContent>
</Card>
<Card data-testid="card-ml-routers">
<CardHeader className="flex flex-row items-center justify-between gap-2 space-y-0 pb-2">
<CardTitle className="text-sm font-medium">Router Attivi</CardTitle>
<TrendingUp className="h-4 w-4 text-muted-foreground" />
</CardHeader>
<CardContent>
<div className="text-2xl font-semibold" data-testid="text-ml-routers-active">
{mlStats.routers?.active || 0}
</div>
</CardContent>
</Card>
</div>
)}
{/* Actions */}
<div className="grid grid-cols-1 md:grid-cols-2 gap-4">
<Card data-testid="card-train-action">
<CardHeader>
<div className="flex items-center justify-between">
<CardTitle className="flex items-center gap-2">
<Brain className="h-5 w-5" />
Addestramento Modello
</CardTitle>
<Badge variant="secondary" className="bg-blue-50 text-blue-700 dark:bg-blue-950 dark:text-blue-300" data-testid="badge-model-version">
Hybrid ML v2.0.0
</Badge>
</div>
</CardHeader>
<CardContent className="space-y-4">
<p className="text-sm text-muted-foreground">
Addestra il modello Hybrid ML (Isolation Forest + Ensemble Classifier) analizzando i log recenti per rilevare pattern di traffico normale.
</p>
<Dialog open={isTrainDialogOpen} onOpenChange={setIsTrainDialogOpen}>
<DialogTrigger asChild>
<Button className="w-full" data-testid="button-start-training">
<Play className="h-4 w-4 mr-2" />
Avvia Training
</Button>
</DialogTrigger>
<DialogContent data-testid="dialog-training">
<DialogHeader>
<DialogTitle>Avvia Training ML</DialogTitle>
<DialogDescription>
Configura i parametri per l'addestramento del modello
</DialogDescription>
</DialogHeader>
<Form {...trainForm}>
<form onSubmit={trainForm.handleSubmit(onTrainSubmit)} className="space-y-4 py-4">
<FormField
control={trainForm.control}
name="max_records"
render={({ field }) => (
<FormItem>
<FormLabel>Numero Record</FormLabel>
<FormControl>
<Input type="number" {...field} data-testid="input-train-records" />
</FormControl>
<FormDescription>Consigliato: 100000</FormDescription>
<FormMessage />
</FormItem>
)}
/>
<FormField
control={trainForm.control}
name="hours_back"
render={({ field }) => (
<FormItem>
<FormLabel>Ore Precedenti</FormLabel>
<FormControl>
<Input type="number" {...field} data-testid="input-train-hours" />
</FormControl>
<FormDescription>Consigliato: 24</FormDescription>
<FormMessage />
</FormItem>
)}
/>
<DialogFooter>
<Button
type="button"
variant="outline"
onClick={() => setIsTrainDialogOpen(false)}
data-testid="button-cancel-training"
>
Annulla
</Button>
<Button type="submit" disabled={trainMutation.isPending} data-testid="button-confirm-training">
{trainMutation.isPending ? "Avvio..." : "Avvia Training"}
</Button>
</DialogFooter>
</form>
</Form>
</DialogContent>
</Dialog>
</CardContent>
</Card>
<Card data-testid="card-detect-action">
<CardHeader>
<div className="flex items-center justify-between">
<CardTitle className="flex items-center gap-2">
<Search className="h-5 w-5" />
Rilevamento Anomalie
</CardTitle>
<Badge variant="secondary" className="bg-green-50 text-green-700 dark:bg-green-950 dark:text-green-300" data-testid="badge-detection-version">
Hybrid ML v2.0.0
</Badge>
</div>
</CardHeader>
<CardContent className="space-y-4">
<p className="text-sm text-muted-foreground">
Analizza i log recenti per rilevare anomalie e IP sospetti con il modello Hybrid ML. Blocca automaticamente gli IP critici (risk_score ≥ 80).
</p>
<Dialog open={isDetectDialogOpen} onOpenChange={setIsDetectDialogOpen}>
<DialogTrigger asChild>
<Button variant="secondary" className="w-full" data-testid="button-start-detection">
<Search className="h-4 w-4 mr-2" />
Avvia Detection
</Button>
</DialogTrigger>
<DialogContent data-testid="dialog-detection">
<DialogHeader>
<DialogTitle>Avvia Detection Anomalie</DialogTitle>
<DialogDescription>
Configura i parametri per il rilevamento anomalie
</DialogDescription>
</DialogHeader>
<Form {...detectForm}>
<form onSubmit={detectForm.handleSubmit(onDetectSubmit)} className="space-y-4 py-4">
<FormField
control={detectForm.control}
name="max_records"
render={({ field }) => (
<FormItem>
<FormLabel>Numero Record</FormLabel>
<FormControl>
<Input type="number" {...field} data-testid="input-detect-records" />
</FormControl>
<FormDescription>Consigliato: 50000</FormDescription>
<FormMessage />
</FormItem>
)}
/>
<FormField
control={detectForm.control}
name="hours_back"
render={({ field }) => (
<FormItem>
<FormLabel>Ore Precedenti</FormLabel>
<FormControl>
<Input type="number" {...field} data-testid="input-detect-hours" />
</FormControl>
<FormDescription>Consigliato: 1</FormDescription>
<FormMessage />
</FormItem>
)}
/>
<FormField
control={detectForm.control}
name="risk_threshold"
render={({ field }) => (
<FormItem>
<FormLabel>Soglia Rischio (%)</FormLabel>
<FormControl>
<Input type="number" min="0" max="100" {...field} data-testid="input-detect-threshold" />
</FormControl>
<FormDescription>Consigliato: 75</FormDescription>
<FormMessage />
</FormItem>
)}
/>
<FormField
control={detectForm.control}
name="auto_block"
render={({ field }) => (
<FormItem className="flex flex-row items-start space-x-3 space-y-0">
<FormControl>
<Checkbox
checked={field.value}
onCheckedChange={field.onChange}
data-testid="checkbox-auto-block"
/>
</FormControl>
<div className="space-y-1 leading-none">
<FormLabel>Blocco automatico IP critici (≥ 80)</FormLabel>
</div>
</FormItem>
)}
/>
<DialogFooter>
<Button
type="button"
variant="outline"
onClick={() => setIsDetectDialogOpen(false)}
data-testid="button-cancel-detection"
>
Annulla
</Button>
<Button type="submit" disabled={detectMutation.isPending} data-testid="button-confirm-detection">
{detectMutation.isPending ? "Avvio..." : "Avvia Detection"}
</Button>
</DialogFooter>
</form>
</Form>
</DialogContent>
</Dialog>
</CardContent>
</Card>
</div>
{/* Training History */}
<Card data-testid="card-training-history">
<CardHeader>
<CardTitle className="flex items-center gap-2">
<Clock className="h-5 w-5" />
Storico Training ({history?.length || 0})
</CardTitle>
</CardHeader>
<CardContent>
{isLoading ? (
<div className="text-center py-8 text-muted-foreground" data-testid="text-loading">
Caricamento...
</div>
) : history && history.length > 0 ? (
<div className="space-y-3">
{history.map((item) => (
<div
key={item.id}
className="p-4 rounded-lg border hover-elevate"
data-testid={`training-item-${item.id}`}
>
<div className="flex items-start justify-between gap-4">
<div className="flex-1 space-y-1">
<div className="flex items-center gap-2">
<p className="font-medium" data-testid={`text-version-${item.id}`}>
Versione {item.modelVersion}
</p>
{item.status === "success" ? (
<Badge variant="outline" className="bg-green-50" data-testid={`badge-status-${item.id}`}>
<CheckCircle2 className="h-3 w-3 mr-1" />
Successo
</Badge>
) : (
<Badge variant="destructive" data-testid={`badge-status-${item.id}`}>
<XCircle className="h-3 w-3 mr-1" />
Fallito
</Badge>
)}
</div>
<div className="grid grid-cols-2 md:grid-cols-4 gap-2 text-sm text-muted-foreground">
<div>
<span className="font-medium">Record:</span> {item.recordsProcessed.toLocaleString()}
</div>
<div>
<span className="font-medium">Feature:</span> {item.featuresCount}
</div>
{item.accuracy && (
<div>
<span className="font-medium">Accuracy:</span> {item.accuracy}%
</div>
)}
{item.trainingDuration && (
<div>
<span className="font-medium">Durata:</span> {item.trainingDuration}s
</div>
)}
</div>
<p className="text-xs text-muted-foreground" data-testid={`text-date-${item.id}`}>
{format(new Date(item.trainedAt), "dd/MM/yyyy HH:mm:ss")}
</p>
{item.notes && (
<p className="text-sm text-muted-foreground" data-testid={`text-notes-${item.id}`}>
{item.notes}
</p>
)}
</div>
</div>
</div>
))}
</div>
) : (
<div className="text-center py-12 text-muted-foreground" data-testid="text-empty">
<Brain className="h-12 w-12 mx-auto mb-4 opacity-50" />
<p className="font-medium">Nessun training eseguito</p>
<p className="text-sm mt-2">Avvia il primo training per addestrare il modello ML</p>
</div>
)}
</CardContent>
</Card>
</div>
);
}


@@ -0,0 +1,287 @@
import { useQuery, useMutation } from "@tanstack/react-query";
import { queryClient, apiRequest } from "@/lib/queryClient";
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
import { Button } from "@/components/ui/button";
import { Shield, Plus, Trash2, CheckCircle2, XCircle, Search } from "lucide-react";
import { format } from "date-fns";
import { useState } from "react";
import { useForm } from "react-hook-form";
import { zodResolver } from "@hookform/resolvers/zod";
import { z } from "zod";
import type { Whitelist } from "@shared/schema";
import { insertWhitelistSchema } from "@shared/schema";
import { useToast } from "@/hooks/use-toast";
import {
Dialog,
DialogContent,
DialogDescription,
DialogHeader,
DialogTitle,
DialogTrigger,
DialogFooter,
} from "@/components/ui/dialog";
import {
Form,
FormControl,
FormField,
FormItem,
FormLabel,
FormMessage,
} from "@/components/ui/form";
import { Input } from "@/components/ui/input";
import { Textarea } from "@/components/ui/textarea";
const whitelistFormSchema = insertWhitelistSchema.extend({
ipAddress: z.string()
.min(7, "Inserisci un IP valido")
.regex(/^(\d{1,3}\.){3}\d{1,3}$/, "Formato IP non valido")
.refine((ip) => {
const parts = ip.split('.').map(Number);
return parts.every(part => part >= 0 && part <= 255);
}, "Ogni ottetto deve essere tra 0 e 255"),
});
export default function WhitelistPage() {
const { toast } = useToast();
const [isAddDialogOpen, setIsAddDialogOpen] = useState(false);
const [searchQuery, setSearchQuery] = useState("");
const form = useForm<z.infer<typeof whitelistFormSchema>>({
resolver: zodResolver(whitelistFormSchema),
defaultValues: {
ipAddress: "",
comment: "",
reason: "",
active: true,
},
});
const { data: whitelist, isLoading } = useQuery<Whitelist[]>({
queryKey: ["/api/whitelist"],
});
// Filter whitelist based on search query
const filteredWhitelist = whitelist?.filter((item) =>
item.ipAddress.toLowerCase().includes(searchQuery.toLowerCase()) ||
item.reason?.toLowerCase().includes(searchQuery.toLowerCase()) ||
item.comment?.toLowerCase().includes(searchQuery.toLowerCase())
);
const addMutation = useMutation({
mutationFn: async (data: z.infer<typeof whitelistFormSchema>) => {
return await apiRequest("POST", "/api/whitelist", data);
},
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ["/api/whitelist"] });
toast({
title: "IP aggiunto",
description: "L'indirizzo IP è stato aggiunto alla whitelist",
});
setIsAddDialogOpen(false);
form.reset();
},
onError: (error: any) => {
toast({
title: "Errore",
description: error.message || "Impossibile aggiungere l'IP alla whitelist",
variant: "destructive",
});
},
});
const deleteMutation = useMutation({
mutationFn: async (id: string) => {
await apiRequest("DELETE", `/api/whitelist/${id}`);
},
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ["/api/whitelist"] });
toast({
title: "IP rimosso",
description: "L'indirizzo IP è stato rimosso dalla whitelist",
});
},
onError: (error: any) => {
toast({
title: "Errore",
description: error.message || "Impossibile rimuovere l'IP dalla whitelist",
variant: "destructive",
});
},
});
const onSubmit = (data: z.infer<typeof whitelistFormSchema>) => {
addMutation.mutate(data);
};
return (
<div className="flex flex-col gap-6 p-6" data-testid="page-whitelist">
<div className="flex items-center justify-between">
<div>
<h1 className="text-3xl font-semibold" data-testid="text-page-title">Whitelist IP</h1>
<p className="text-muted-foreground" data-testid="text-page-subtitle">
Gestisci gli indirizzi IP fidati che non verranno bloccati
</p>
</div>
<Dialog open={isAddDialogOpen} onOpenChange={setIsAddDialogOpen}>
<DialogTrigger asChild>
<Button data-testid="button-add-whitelist">
<Plus className="h-4 w-4 mr-2" />
Aggiungi IP
</Button>
</DialogTrigger>
<DialogContent data-testid="dialog-add-whitelist">
<DialogHeader>
<DialogTitle>Aggiungi IP alla Whitelist</DialogTitle>
<DialogDescription>
Inserisci l'indirizzo IP che vuoi proteggere dal blocco automatico
</DialogDescription>
</DialogHeader>
<Form {...form}>
<form onSubmit={form.handleSubmit(onSubmit)} className="space-y-4 py-4">
<FormField
control={form.control}
name="ipAddress"
render={({ field }) => (
<FormItem>
<FormLabel>Indirizzo IP *</FormLabel>
<FormControl>
<Input placeholder="192.168.1.100" {...field} data-testid="input-ip" />
</FormControl>
<FormMessage />
</FormItem>
)}
/>
<FormField
control={form.control}
name="reason"
render={({ field }) => (
<FormItem>
<FormLabel>Motivo</FormLabel>
<FormControl>
<Input placeholder="Es: Server backup" {...field} data-testid="input-reason" />
</FormControl>
<FormMessage />
</FormItem>
)}
/>
<FormField
control={form.control}
name="comment"
render={({ field }) => (
<FormItem>
<FormLabel>Note</FormLabel>
<FormControl>
<Textarea placeholder="Note aggiuntive..." {...field} data-testid="input-comment" />
</FormControl>
<FormMessage />
</FormItem>
)}
/>
<DialogFooter>
<Button
type="button"
variant="outline"
onClick={() => setIsAddDialogOpen(false)}
data-testid="button-cancel"
>
Annulla
</Button>
<Button type="submit" disabled={addMutation.isPending} data-testid="button-confirm-add">
{addMutation.isPending ? "Aggiunta..." : "Aggiungi"}
</Button>
</DialogFooter>
</form>
</Form>
</DialogContent>
</Dialog>
</div>
{/* Search Bar */}
<Card data-testid="card-search">
<CardContent className="pt-6">
<div className="relative">
<Search className="absolute left-3 top-1/2 -translate-y-1/2 h-4 w-4 text-muted-foreground" />
<Input
placeholder="Cerca per IP, motivo o note..."
value={searchQuery}
onChange={(e) => setSearchQuery(e.target.value)}
className="pl-9"
data-testid="input-search-whitelist"
/>
</div>
</CardContent>
</Card>
<Card data-testid="card-whitelist">
<CardHeader>
<CardTitle className="flex items-center gap-2">
<Shield className="h-5 w-5" />
IP Protetti ({filteredWhitelist?.length || 0}{searchQuery && whitelist ? ` di ${whitelist.length}` : ''})
</CardTitle>
</CardHeader>
<CardContent>
{isLoading ? (
<div className="text-center py-8 text-muted-foreground" data-testid="text-loading">
Caricamento...
</div>
) : filteredWhitelist && filteredWhitelist.length > 0 ? (
<div className="space-y-3">
{filteredWhitelist.map((item) => (
<div
key={item.id}
className="p-4 rounded-lg border hover-elevate"
data-testid={`whitelist-item-${item.id}`}
>
<div className="flex items-start justify-between gap-4">
<div className="flex-1 space-y-1">
<div className="flex items-center gap-2">
<p className="font-mono font-medium" data-testid={`text-ip-${item.id}`}>
{item.ipAddress}
</p>
{item.active ? (
<CheckCircle2 className="h-4 w-4 text-green-500" data-testid={`icon-active-${item.id}`} />
) : (
<XCircle className="h-4 w-4 text-muted-foreground" data-testid={`icon-inactive-${item.id}`} />
)}
</div>
{item.reason && (
<p className="text-sm text-muted-foreground" data-testid={`text-reason-${item.id}`}>
<span className="font-medium">Motivo:</span> {item.reason}
</p>
)}
{item.comment && (
<p className="text-sm text-muted-foreground" data-testid={`text-comment-${item.id}`}>
{item.comment}
</p>
)}
<p className="text-xs text-muted-foreground" data-testid={`text-created-${item.id}`}>
Aggiunto il {format(new Date(item.createdAt), "dd/MM/yyyy HH:mm")}
{item.createdBy && ` da ${item.createdBy}`}
</p>
</div>
<Button
variant="ghost"
size="icon"
onClick={() => deleteMutation.mutate(item.id)}
disabled={deleteMutation.isPending}
data-testid={`button-delete-${item.id}`}
>
<Trash2 className="h-4 w-4 text-destructive" />
</Button>
</div>
</div>
))}
</div>
) : (
<div className="text-center py-12 text-muted-foreground" data-testid="text-empty">
<Shield className="h-12 w-12 mx-auto mb-4 opacity-50" />
<p className="font-medium">Nessun IP in whitelist</p>
<p className="text-sm mt-2">Aggiungi indirizzi IP fidati per proteggerli dal blocco automatico</p>
</div>
)}
</CardContent>
</Card>
</div>
);
}

database-schema/README.md

@@ -0,0 +1,316 @@
# Database Schema & Migrations - Versioned System

## Overview

A database migration system with **version tracking**. It applies only the migrations that are still missing, which keeps updates fast.

## 🎯 Benefits

- **Fast**: applies only the missing migrations (never re-runs ones already applied)
- **Safe**: tracks the database version, preventing errors
- **Automatic**: integrated into `update_from_git.sh`
- **Idempotent**: can be run any number of times without side effects (see the sketch below)
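For example, running the script twice in a row is safe; the second invocation finds nothing pending and exits immediately (this sketch assumes you run it from the repo root):

```bash
# First run applies any pending migrations; the second is a no-op
./database-schema/apply_migrations.sh
./database-schema/apply_migrations.sh   # prints "✅ Database già aggiornato ..." and exits 0
```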
## 📋 How It Works

### 1. The `schema_version` table

It tracks the current version of the database:

```sql
CREATE TABLE schema_version (
  id INTEGER PRIMARY KEY DEFAULT 1,  -- Always 1 (single row)
  version INTEGER NOT NULL,          -- Current version (e.g. 5)
  applied_at TIMESTAMP,              -- When it was applied
  description TEXT                   -- Description of the last migration
);
```

### 2. Numbered Migrations

Every SQL migration carries a sequential number in its file name:

```
database-schema/migrations/
├── 000_init_schema_version.sql ← Initializes tracking (always executed)
├── 001_add_missing_columns.sql ← Migration 1
├── 002_add_indexes.sql ← Migration 2
├── 003_alter_detections.sql ← Migration 3
└── ...
```

**Naming convention**: `XXX_description.sql`, where XXX is a three-digit number (001, 002, 010, 100, etc.); the runner compares this prefix against the stored version, as sketched below.
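As a rough sketch of how that prefix becomes a comparable version number (this mirrors the `sed` logic in `apply_migrations.sh` further down; `CURRENT_VERSION` is hard-coded here purely for illustration):

```bash
# Keep the three-digit prefix of the file name, then strip leading zeros
MIGRATION_NAME="006_add_new_table.sql"
MIGRATION_VERSION=$(echo "$MIGRATION_NAME" | sed 's/^\([0-9]\{3\}\)_.*/\1/' | sed 's/^0*//')

# Normally read from schema_version; hard-coded for this example
CURRENT_VERSION=5

# A migration is pending only if its number is greater than the database version
if [ "$MIGRATION_VERSION" -gt "$CURRENT_VERSION" ]; then
  echo "pending: $MIGRATION_NAME (version $MIGRATION_VERSION)"
fi
```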
### 3. Application Logic

```bash
# The script reads the current version
CURRENT_VERSION = SELECT version FROM schema_version WHERE id = 1;
# Example: 2

# It looks for migrations numbered > 2
Found: 003_*.sql, 004_*.sql, 005_*.sql

# And applies them in order
For each migration:
1. Run the SQL
2. Bump the version: UPDATE schema_version SET version = 3
3. Move on to the next one...

# Result: database upgraded from v2 to v5
```
## 🚀 Day-to-Day Use

### Automatic Update (Recommended)

```bash
# On the AlmaLinux server
cd /opt/ids
sudo ./deployment/update_from_git.sh

# The script automatically runs:
# 1. Git pull
# 2. npm install
# 3. pip install
# 4. ./database-schema/apply_migrations.sh ← applies the migrations
# 5. npm run db:push ← syncs the Drizzle schema
# 6. Restart of the services
```

**Expected output**:
```
🗄️ Sistema Migrazioni Database (Versioned)
📋 Verifica sistema versioning...
✅ Sistema versioning attivo
📊 Versione database corrente: 2
📋 Trovate 3 migrazioni da applicare
🔄 Applicando migrazione 3: 003_add_indexes.sql
✅ Migrazione 3 applicata con successo
🔄 Applicando migrazione 4: 004_alter_table.sql
✅ Migrazione 4 applicata con successo
🔄 Applicando migrazione 5: 005_new_feature.sql
✅ Migrazione 5 applicata con successo
╔═══════════════════════════════════════════════╗
║ ✅ MIGRAZIONI COMPLETATE ║
╚═══════════════════════════════════════════════╝
📊 Versione database: 2 → 5
```
If the database is already up to date:
```
📊 Versione database corrente: 5
✅ Database già aggiornato (nessuna migrazione da applicare)
```
### Manual Run
```bash
cd /opt/ids/database-schema
./apply_migrations.sh
```
### Check the Current Version
```bash
psql $DATABASE_URL -c "SELECT * FROM schema_version;"
id | version | applied_at | description
----+---------+----------------------------+-----------------------
1 | 5 | 2025-11-22 14:30:15.123456 | Migration 5: Add indexes
```
## 🔨 Creating a New Migration

### STEP 1: Determine the Next Number

```bash
# Find the latest migration
ls database-schema/migrations/ | grep "^[0-9]" | sort | tail -n 1
# Output: 005_add_indexes.sql
# Next migration: 006
```

### STEP 2: Create the Migration File

```bash
# Format: XXX_description.sql
touch database-schema/migrations/006_add_new_table.sql
```

### STEP 3: Write the SQL
```sql
-- ============================================================================
-- Migration 006: Add new table for feature X
-- ============================================================================
-- Description: creates the table backing feature X
-- Date: 2025-11-22
-- ============================================================================
CREATE TABLE IF NOT EXISTS my_new_table (
  id VARCHAR PRIMARY KEY DEFAULT gen_random_uuid(),
  name TEXT NOT NULL,
  created_at TIMESTAMP DEFAULT NOW() NOT NULL
);

-- Create indexes
CREATE INDEX IF NOT EXISTS my_new_table_name_idx ON my_new_table(name);

-- Seed initial data (if needed)
INSERT INTO my_new_table (name)
SELECT 'Default Entry'
WHERE NOT EXISTS (SELECT 1 FROM my_new_table LIMIT 1);
```

**Best practices**:
- Always use `IF NOT EXISTS` (idempotency)
- Use `ALTER TABLE ... ADD COLUMN IF NOT EXISTS`
- Document the migration thoroughly
- Test locally before committing
### STEP 4: Test Locally (Replit)

```bash
# On Replit
cd database-schema
./apply_migrations.sh

# Check that the version was updated
psql $DATABASE_URL -c "SELECT version FROM schema_version;"
```

### STEP 5: Commit & Deploy

```bash
# On Replit
./push-gitlab.sh

# On the server
cd /opt/ids
sudo ./deployment/update_from_git.sh
```
## 📊 Common Migration Examples

### Add a Column

```sql
-- Migration XXX: Add email column to users
ALTER TABLE users
ADD COLUMN IF NOT EXISTS email TEXT;
```

### Create an Index

```sql
-- Migration XXX: Add index on source_ip
CREATE INDEX IF NOT EXISTS network_logs_source_ip_idx
ON network_logs(source_ip);
```

### Change a Column Type (CAUTION!)

```sql
-- Migration XXX: Change column type
-- NOTE: can cause data loss if the values are incompatible!
ALTER TABLE detections
ALTER COLUMN risk_score TYPE DECIMAL(5,2)
USING risk_score::DECIMAL(5,2);
```

### Seed Initial Data

```sql
-- Migration XXX: Add default admin user
INSERT INTO users (username, role)
SELECT 'admin', 'admin'
WHERE NOT EXISTS (
  SELECT 1 FROM users WHERE username = 'admin'
);
```
## 🔍 Troubleshooting

### Error: A Migration Fails

```bash
# Inspect the current version
psql $DATABASE_URL -c "SELECT version FROM schema_version;"

# If migration 5 failed, the database is still at v4
# Fix: correct the 005_*.sql file and re-run
./apply_migrations.sh
```

### Full Reset (CAUTION!)

```bash
# ⚠️ DESTRUCTIVE - wipes all data!
psql $DATABASE_URL << 'EOF'
DROP SCHEMA public CASCADE;
CREATE SCHEMA public;
GRANT ALL ON SCHEMA public TO ids_user;
GRANT ALL ON SCHEMA public TO public;
EOF

# Recreate the schema from scratch
npm run db:push --force
./apply_migrations.sh
```

### Skip a Migration (Advanced)

```bash
# If migration 003 has already been applied by hand,
# update the version manually
psql $DATABASE_URL -c "
UPDATE schema_version
SET version = 3,
    description = 'Migration 3: Manually applied',
    applied_at = NOW()
WHERE id = 1;
"
```
## 🎯 Full Workflow

### Development (Replit)

1. Edit the schema in `shared/schema.ts`
2. Run `npm run db:push` (syncs Drizzle)
3. If a custom SQL migration is needed:
   - Create `XXX_description.sql`
   - Test it with `./apply_migrations.sh`
4. Commit: `./push-gitlab.sh`

### Production (AlmaLinux)

1. `sudo ./deployment/update_from_git.sh`
2. The script applies the migrations automatically
3. Verify: `psql $DATABASE_URL -c "SELECT * FROM schema_version;"`
## 📝 Technical Notes

- **000_init_schema_version.sql**: always executed (idempotent); initializes the tracking
- **Constraint**: the `schema_version` table allows exactly one row (id=1) — see the sketch below
- **Number format**: use three digits (001, 002, ..., 010, ..., 100) so file names sort correctly
- **Drizzle vs SQL**: `npm run db:push` syncs the TypeScript schema; SQL migrations are for custom logic
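As a sketch of how that single-row invariant can be leaned on: the version bump could equally be written as an upsert keyed on `id = 1`, so the table can never grow past one row (the version number and description below are placeholders; the shipped script uses a plain `UPDATE` instead):

```bash
psql "$DATABASE_URL" <<'EOF'
INSERT INTO schema_version (id, version, applied_at, description)
VALUES (1, 6, NOW(), 'Migration 6: example')
ON CONFLICT (id) DO UPDATE
  SET version     = EXCLUDED.version,
      applied_at  = EXCLUDED.applied_at,
      description = EXCLUDED.description;
EOF
```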
## ✅ Pre-Commit Checklist

When creating a new migration:

- [ ] Sequential number is correct (XXX+1)
- [ ] Descriptive file name
- [ ] Header comment with a description
- [ ] Idempotent SQL (`IF NOT EXISTS`, etc.)
- [ ] Tested locally on Replit
- [ ] Version updated: `SELECT version FROM schema_version;`
- [ ] Clear commit message


@@ -0,0 +1,164 @@
#!/bin/bash
# =============================================================================
# IDS - Apply Database Migrations (with Version Tracking)
# =============================================================================
# Version-aware migration runner:
# - Reads the current database version
# - Applies ONLY the missing migrations (faster!)
# - Bumps the version after each migration
# =============================================================================
set -e
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
MIGRATIONS_DIR="$SCRIPT_DIR/migrations"
IDS_DIR="$(dirname "$SCRIPT_DIR")"
DEPLOYMENT_MIGRATIONS_DIR="$IDS_DIR/deployment/migrations"
# Load environment variables and export them
if [ -f "$IDS_DIR/.env" ]; then
set -a
source "$IDS_DIR/.env"
set +a
fi
# Colors
GREEN='\033[0;32m'
BLUE='\033[0;34m'
YELLOW='\033[1;33m'
RED='\033[0;31m'
CYAN='\033[0;36m'
NC='\033[0m'
echo -e "${BLUE}🗄️ Sistema Migrazioni Database (Versioned)${NC}"
# Make sure DATABASE_URL is set
if [ -z "$DATABASE_URL" ]; then
echo -e "${RED}❌ DATABASE_URL non impostato${NC}"
echo -e "${YELLOW} File .env non trovato o DATABASE_URL mancante${NC}"
exit 1
fi
# Create the migrations directory if it does not exist
mkdir -p "$MIGRATIONS_DIR"
# =============================================================================
# STEP 1: Initialize version tracking (if needed)
# =============================================================================
echo -e "${CYAN}📋 Verifica sistema versioning...${NC}"
# Run 000_init_schema_version.sql (idempotent)
INIT_MIGRATION="$MIGRATIONS_DIR/000_init_schema_version.sql"
if [ -f "$INIT_MIGRATION" ]; then
psql "$DATABASE_URL" -f "$INIT_MIGRATION" -q
echo -e "${GREEN}✅ Sistema versioning attivo${NC}"
else
echo -e "${YELLOW}⚠️ Migration 000_init_schema_version.sql non trovata${NC}"
echo -e "${YELLOW} Creazione tabella schema_version...${NC}"
psql "$DATABASE_URL" << 'EOF' -q
CREATE TABLE IF NOT EXISTS schema_version (
id INTEGER PRIMARY KEY DEFAULT 1,
version INTEGER NOT NULL DEFAULT 0,
applied_at TIMESTAMP NOT NULL DEFAULT NOW(),
description TEXT
);
INSERT INTO schema_version (id, version, description)
SELECT 1, 0, 'Initial schema version tracking'
WHERE NOT EXISTS (SELECT 1 FROM schema_version WHERE id = 1);
EOF
echo -e "${GREEN}✅ Tabella schema_version creata${NC}"
fi
# =============================================================================
# STEP 2: Read the current database version
# =============================================================================
CURRENT_VERSION=$(psql "$DATABASE_URL" -tAc "SELECT COALESCE(version, 0) FROM schema_version WHERE id = 1;" 2>/dev/null || echo "0")
echo -e "${CYAN}📊 Versione database corrente: ${YELLOW}${CURRENT_VERSION}${NC}"
# =============================================================================
# STEP 3: Find the migrations to apply
# =============================================================================
# Migration format: 001_description.sql, 002_another.sql, etc.
# Searches BOTH folders: database-schema/migrations AND deployment/migrations
MIGRATIONS_TO_APPLY=()
# Combine migrations from both folders and sort them by number
ALL_MIGRATIONS=""
if [ -d "$MIGRATIONS_DIR" ]; then
ALL_MIGRATIONS+=$(find "$MIGRATIONS_DIR" -name "[0-9][0-9][0-9]_*.sql" 2>/dev/null || true)
fi
if [ -d "$DEPLOYMENT_MIGRATIONS_DIR" ]; then
if [ -n "$ALL_MIGRATIONS" ]; then
ALL_MIGRATIONS+=$'\n'
fi
ALL_MIGRATIONS+=$(find "$DEPLOYMENT_MIGRATIONS_DIR" -name "[0-9][0-9][0-9]_*.sql" 2>/dev/null || true)
fi
# Sort the migrations by file name (NNN_*.sql), keying on the basename
SORTED_MIGRATIONS=$(echo "$ALL_MIGRATIONS" | grep -v '^$' | while read f; do echo "$(basename "$f"):$f"; done | sort | cut -d':' -f2)
for migration_file in $SORTED_MIGRATIONS; do
MIGRATION_NAME=$(basename "$migration_file")
# Extract the version number from the file name (001, 002, etc.)
MIGRATION_VERSION=$(echo "$MIGRATION_NAME" | sed 's/^\([0-9]\{3\}\)_.*/\1/' | sed 's/^0*//')
# Skip if the version is empty (000)
if [ -z "$MIGRATION_VERSION" ]; then
continue
fi
# Queue the migration if its number is greater than the current version
if [ "$MIGRATION_VERSION" -gt "$CURRENT_VERSION" ]; then
MIGRATIONS_TO_APPLY+=("$migration_file:$MIGRATION_VERSION:$MIGRATION_NAME")
fi
done
# =============================================================================
# STEP 4: Apply the missing migrations
# =============================================================================
if [ ${#MIGRATIONS_TO_APPLY[@]} -eq 0 ]; then
echo -e "${GREEN}✅ Database già aggiornato (nessuna migrazione da applicare)${NC}"
exit 0
fi
echo -e "${BLUE}📋 Trovate ${#MIGRATIONS_TO_APPLY[@]} migrazioni da applicare${NC}"
echo ""
for migration_info in "${MIGRATIONS_TO_APPLY[@]}"; do
IFS=':' read -r migration_file migration_version migration_name <<< "$migration_info"
echo -e "${BLUE}🔄 Applicando migrazione ${migration_version}: ${CYAN}${migration_name}${NC}"
# Apply the migration (psql may exit 0 even on SQL errors, so grep its output for failures)
if psql "$DATABASE_URL" -f "$migration_file" -q 2>&1 | tee /tmp/migration_output.log | grep -qiE "error|fatal"; then
echo -e "${RED}❌ Errore in migrazione ${migration_version}${NC}"
cat /tmp/migration_output.log
exit 1
fi
# Record the new version in the database
DESCRIPTION=$(head -n 5 "$migration_file" | grep -E "^--.*Migration" | sed 's/^--.*Migration [0-9]*: //' || echo "Migration $migration_version")
psql "$DATABASE_URL" -q << EOF
UPDATE schema_version
SET version = $migration_version,
applied_at = NOW(),
description = '$DESCRIPTION'
WHERE id = 1;
EOF
echo -e "${GREEN} ✅ Migrazione ${migration_version} applicata con successo${NC}"
echo ""
done
# =============================================================================
# STEP 5: Check the final version
# =============================================================================
FINAL_VERSION=$(psql "$DATABASE_URL" -tAc "SELECT version FROM schema_version WHERE id = 1;")
echo -e "${GREEN}╔═══════════════════════════════════════════════╗${NC}"
echo -e "${GREEN}║ ✅ MIGRAZIONI COMPLETATE ║${NC}"
echo -e "${GREEN}╚═══════════════════════════════════════════════╝${NC}"
echo -e "${CYAN}📊 Versione database: ${YELLOW}${CURRENT_VERSION}${CYAN}${GREEN}${FINAL_VERSION}${NC}"
echo ""


@@ -0,0 +1,40 @@
-- =============================================================================
-- IDS - Automatic Cleanup of Old Logs
-- =============================================================================
-- Keeps only the last 3 days of network_logs
-- At 4.7M records/hour, 3 days (72h) ≈ 340M records at most
-- Run daily via cron: psql $DATABASE_URL < cleanup_old_logs.sql
-- =============================================================================
-- Count logs before the cleanup
DO $$
DECLARE
total_count bigint;
old_count bigint;
BEGIN
SELECT COUNT(*) INTO total_count FROM network_logs;
SELECT COUNT(*) INTO old_count FROM network_logs WHERE timestamp < NOW() - INTERVAL '3 days';
RAISE NOTICE 'Log totali: %', total_count;
RAISE NOTICE 'Log da eliminare (>3 giorni): %', old_count;
END $$;
-- Delete logs older than 3 days
DELETE FROM network_logs
WHERE timestamp < NOW() - INTERVAL '3 days';
-- Vacuum to reclaim physical space
VACUUM ANALYZE network_logs;
-- Count the remaining logs after the cleanup
DO $$
DECLARE
remaining_count bigint;
db_size text;
BEGIN
SELECT COUNT(*) INTO remaining_count FROM network_logs;
SELECT pg_size_pretty(pg_database_size(current_database())) INTO db_size;
RAISE NOTICE 'Log rimanenti: %', remaining_count;
RAISE NOTICE 'Dimensione database: %', db_size;
END $$;
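The header above calls for a daily cron run; a minimal crontab entry might look like the following sketch (the 03:00 schedule, the /opt/ids paths, and the log destination are assumptions — adjust them to the actual install):

```bash
# Nightly cleanup at 03:00; source .env first so $DATABASE_URL is defined for the command
0 3 * * * . /opt/ids/.env && psql "$DATABASE_URL" -f /opt/ids/database-schema/cleanup_old_logs.sql >> /var/log/ids/cleanup_logs.log 2>&1
```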


@@ -0,0 +1,35 @@
-- =========================================================
-- SCHEMA: detections table for IDS anomalies
-- =========================================================
-- Stores IPs flagged as dangerous by the ML model
-- =========================================================
-- Drop the table if it exists (development environments only)
DROP TABLE IF EXISTS detections CASCADE;
-- Create the detections table
CREATE TABLE detections (
id VARCHAR PRIMARY KEY DEFAULT gen_random_uuid(),
source_ip TEXT NOT NULL,
risk_score DECIMAL(5, 2) NOT NULL,
confidence DECIMAL(5, 2) NOT NULL,
anomaly_type TEXT NOT NULL,
reason TEXT,
log_count INTEGER NOT NULL,
first_seen TIMESTAMP NOT NULL,
last_seen TIMESTAMP NOT NULL,
blocked BOOLEAN NOT NULL DEFAULT FALSE,
blocked_at TIMESTAMP,
detected_at TIMESTAMP NOT NULL DEFAULT NOW()
);
-- Indexes for performance
CREATE INDEX IF NOT EXISTS detection_source_ip_idx ON detections(source_ip);
CREATE INDEX IF NOT EXISTS risk_score_idx ON detections(risk_score);
CREATE INDEX IF NOT EXISTS detected_at_idx ON detections(detected_at);
-- Table comment
COMMENT ON TABLE detections IS 'IP pericolosi rilevati dal sistema ML (Isolation Forest)';
-- Verify
SELECT 'Tabella detections creata con successo!' AS status;


@@ -0,0 +1,35 @@
-- =========================================================
-- SCHEMA: network_logs table for the MikroTik IDS
-- =========================================================
-- Created for compatibility with syslog_parser.py
-- =========================================================
-- Drop the table if it exists (development environments only)
DROP TABLE IF EXISTS network_logs CASCADE;
-- Create the network_logs table
CREATE TABLE network_logs (
id VARCHAR PRIMARY KEY DEFAULT gen_random_uuid(),
router_name TEXT NOT NULL,
timestamp TIMESTAMP NOT NULL,
source_ip TEXT NOT NULL,
source_port INTEGER,
destination_ip TEXT,
destination_port INTEGER,
protocol TEXT,
action TEXT,
packet_length INTEGER,
raw_message TEXT,
created_at TIMESTAMP NOT NULL DEFAULT NOW()
);
-- Indexes for performance
CREATE INDEX IF NOT EXISTS source_ip_idx ON network_logs(source_ip);
CREATE INDEX IF NOT EXISTS timestamp_idx ON network_logs(timestamp);
CREATE INDEX IF NOT EXISTS router_name_idx ON network_logs(router_name);
-- Table comment
COMMENT ON TABLE network_logs IS 'Log di rete da router MikroTik via syslog (parsati da syslog_parser.py)';
-- Verify
SELECT 'Tabella network_logs creata con successo!' AS status;


@@ -0,0 +1,37 @@
-- =========================================================
-- SCHEMA: routers table for managing MikroTik routers
-- =========================================================
-- Stores router configuration for REST API communication
-- =========================================================
-- Drop the table if it exists (development environments only)
DROP TABLE IF EXISTS routers CASCADE;
-- Create the routers table
CREATE TABLE routers (
id VARCHAR PRIMARY KEY DEFAULT gen_random_uuid(),
name TEXT NOT NULL,
ip_address TEXT NOT NULL UNIQUE,
username TEXT NOT NULL,
password TEXT NOT NULL,
api_port INTEGER NOT NULL DEFAULT 443,
enabled BOOLEAN NOT NULL DEFAULT TRUE,
last_check TIMESTAMP,
status TEXT,
created_at TIMESTAMP NOT NULL DEFAULT NOW()
);
-- Indexes for performance
CREATE INDEX IF NOT EXISTS routers_ip_idx ON routers(ip_address);
CREATE INDEX IF NOT EXISTS routers_enabled_idx ON routers(enabled);
-- Table comment
COMMENT ON TABLE routers IS 'Configurazione router MikroTik per comunicazione API REST';
-- Insert a sample router (FIBRA)
INSERT INTO routers (name, ip_address, username, password, api_port, enabled)
VALUES ('Router FIBRA', '192.178.204.1', 'admin', 'change_me', 443, true)
ON CONFLICT (ip_address) DO NOTHING;
-- Verify
SELECT 'Tabella routers creata con successo!' AS status;


@@ -0,0 +1,30 @@
-- =========================================================
-- SCHEMA: training_history table for the ML training history
-- =========================================================
-- Stores the history of ML model training runs
-- =========================================================
-- Drop the table if it exists (development environments only)
DROP TABLE IF EXISTS training_history CASCADE;
-- Create the training_history table
CREATE TABLE training_history (
id VARCHAR PRIMARY KEY DEFAULT gen_random_uuid(),
model_version TEXT NOT NULL,
records_processed INTEGER NOT NULL,
features_count INTEGER NOT NULL,
accuracy DECIMAL(5, 2),
training_duration INTEGER,
status TEXT NOT NULL,
notes TEXT,
trained_at TIMESTAMP NOT NULL DEFAULT NOW()
);
-- Performance indexes
CREATE INDEX IF NOT EXISTS trained_at_idx ON training_history(trained_at);
-- Table comment
COMMENT ON TABLE training_history IS 'ML model training history for the IDS';
-- Verification
SELECT 'Table training_history created successfully!' AS status;
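
A hedged example of the row a training job might record on completion (all values are illustrative; the schema does not state the units of training_duration, so seconds are assumed here):

INSERT INTO training_history
    (model_version, records_processed, features_count, accuracy, training_duration, status, notes)
VALUES
    ('v1.0.103', 125000, 12, 97.45, 38, 'completed', 'Isolation Forest retrain');  -- duration in seconds (assumption)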

@@ -0,0 +1,29 @@
-- =========================================================
-- SCHEMA: whitelist table for trusted IPs
-- =========================================================
-- IPs that must never be blocked
-- =========================================================
-- Drop the table if it exists (development environments only)
DROP TABLE IF EXISTS whitelist CASCADE;
-- Create the whitelist table
CREATE TABLE whitelist (
id VARCHAR PRIMARY KEY DEFAULT gen_random_uuid(),
ip_address TEXT NOT NULL UNIQUE,
comment TEXT,
reason TEXT,
created_by TEXT,
active BOOLEAN NOT NULL DEFAULT TRUE,
created_at TIMESTAMP NOT NULL DEFAULT NOW()
);
-- Performance indexes
CREATE INDEX IF NOT EXISTS whitelist_ip_idx ON whitelist(ip_address);
CREATE INDEX IF NOT EXISTS whitelist_active_idx ON whitelist(active);
-- Table comment
COMMENT ON TABLE whitelist IS 'Trusted IPs that must never be blocked by the system';
-- Verification
SELECT 'Table whitelist created successfully!' AS status;
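
A minimal sketch of the pre-block check the blocking path needs against this table (the EXISTS form and the documentation IP are illustrative, not taken from the application code):

SELECT EXISTS (
    SELECT 1
    FROM whitelist
    WHERE ip_address = '203.0.113.10'  -- candidate IP about to be blocked
      AND active = true
) AS is_whitelisted;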

@@ -0,0 +1,23 @@
-- ============================================================================
-- Migration 000: Initialize schema version tracking
-- ============================================================================
-- Creates a table to track the database schema version
-- Prevents re-running migrations that have already been applied
-- ============================================================================
-- Create the schema_version table if it does not exist
CREATE TABLE IF NOT EXISTS schema_version (
id INTEGER PRIMARY KEY DEFAULT 1,
version INTEGER NOT NULL DEFAULT 0,
applied_at TIMESTAMP NOT NULL DEFAULT NOW(),
description TEXT
);
-- Insert initial version (only if the table is empty)
INSERT INTO schema_version (id, version, description)
SELECT 1, 0, 'Initial schema version tracking'
WHERE NOT EXISTS (SELECT 1 FROM schema_version WHERE id = 1);
-- Constraint: only one row allowed
ALTER TABLE schema_version ADD CONSTRAINT schema_version_single_row
CHECK (id = 1) NOT VALID;
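
A sketch of how a migration runner could consult and advance this single row (the exact UPDATE shape is an assumption; the runner itself is not part of this diff):

-- Read the current version before deciding which migrations to apply
SELECT version FROM schema_version WHERE id = 1;

-- After successfully applying migration 002, record it
UPDATE schema_version
SET version = 2,
    applied_at = NOW(),
    description = 'Migration 002: performance indexes'
WHERE id = 1 AND version < 2;  -- no-op if already at or past version 2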

@@ -0,0 +1,35 @@
-- Migration 001: Add missing columns to routers table
-- Date: 2025-11-21
-- Description: Adds api_port and last_sync columns if missing
-- Add api_port column if not exists
ALTER TABLE routers
ADD COLUMN IF NOT EXISTS api_port integer NOT NULL DEFAULT 8728;
-- Add last_sync column if not exists
ALTER TABLE routers
ADD COLUMN IF NOT EXISTS last_sync timestamp;
-- Add created_at if missing (fallback for older schemas)
ALTER TABLE routers
ADD COLUMN IF NOT EXISTS created_at timestamp DEFAULT now() NOT NULL;
-- Verify columns exist
DO $$
BEGIN
IF EXISTS (
SELECT 1 FROM information_schema.columns
WHERE table_name = 'routers'
AND column_name = 'api_port'
) THEN
RAISE NOTICE 'Column api_port exists';
END IF;
IF EXISTS (
SELECT 1 FROM information_schema.columns
WHERE table_name = 'routers'
AND column_name = 'last_sync'
) THEN
RAISE NOTICE 'Column last_sync exists';
END IF;
END $$;
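
To inspect the resulting columns by hand, an information_schema query along these lines works (a convenience sketch, not part of the migration itself):

SELECT column_name, data_type, column_default
FROM information_schema.columns
WHERE table_name = 'routers'
ORDER BY ordinal_position;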

@@ -0,0 +1,18 @@
-- ============================================================================
-- Migration 002: Add performance indexes
-- ============================================================================
-- Description: Adds indexes to improve detections query performance
-- Date: 2025-11-22
-- ============================================================================
-- Index on blocked for filtering blocked IPs
CREATE INDEX IF NOT EXISTS detections_blocked_idx
ON detections(blocked);
-- Composite index for "recently blocked IPs" queries
CREATE INDEX IF NOT EXISTS detections_blocked_detected_idx
ON detections(blocked, detected_at)
WHERE blocked = true;
-- Descriptive comment
COMMENT ON INDEX detections_blocked_idx IS 'Index for filtering blocked IPs';
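
A hedged way to confirm the partial index is being used (plan output varies with data volume; on very small tables the planner may still choose a sequential scan):

EXPLAIN
SELECT source_ip, detected_at
FROM detections
WHERE blocked = true  -- matches the partial index predicate
ORDER BY detected_at DESC
LIMIT 100;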

Some files were not shown because too many files have changed in this diff.