feat: initial release

.gitignore (vendored, new file, 45 lines)
@@ -0,0 +1,45 @@
proto/

# Virtual environments
.venv/

# Poetry dependencies
poetry.lock

# IDE / Editor files
*.swp
*.swo
*.pyc
__pycache__/

# OS files
.DS_Store
Thumbs.db

# Build artifacts
dist/
build/
*.egg-info/

# Logs and databases
logs/
*.log

# Environment variables (optional, if not covered by .env.example)
.env.local
.env.*.local

# Coverage reports
htmlcov/
.tox/
.cache/
nosetests.xml
coverage.xml

# PyInstaller
build/
dist/

# pytest cache
.pytest_cache/

README.md (new file, 63 lines)
@@ -0,0 +1,63 @@
# compliance-scan

SSL/TLS configuration analysis with automated IANA/BSI compliance checking.

## Quick Start

```bash
# Scan
poetry run compliance-scan scan example.com:443,636

# Report
poetry run compliance-scan report -t md -o report.md
```

## Installation

```bash
poetry install
```

## Features

- Multi-port TLS/SSL scanning
- BSI TR-02102-1/2 compliance validation
- IANA recommendations checking
- Vulnerability detection (Heartbleed, ROBOT, CCS Injection)
- Certificate validation
- Multiple report formats (CSV, Markdown, reStructuredText)

## Commands

```bash
# Scan with ports
compliance-scan scan <hostname>:<port1>,<port2> [--print] [-db <path>]

# Generate report
compliance-scan report [scan_id] -t <csv|md|rest> [-o <file>]

# List scans
compliance-scan report --list
```

## Supported Protocols

Opportunistic TLS: SMTP, LDAP, IMAP, POP3, FTP, XMPP, RDP, PostgreSQL
Direct TLS: HTTPS, LDAPS, SMTPS, IMAPS, POP3S

## Documentation

**[Detailed Guide](docs/detailed-guide.md)** - Complete reference with CLI commands, database schema, compliance rules, and development guide.

## Requirements

- Python 3.13+
- SSLyze 6.0.0+
- Poetry

## Planned Features

- CLI command for updating IANA reference data
- Automated IANA registry updates from web sources, based on `src/sslysze_scan/scan_iana.py`
  - TLS Parameters: https://www.iana.org/assignments/tls-parameters/tls-parameters.xml
  - IKEv2 Parameters: https://www.iana.org/assignments/ikev2-parameters/ikev2-parameters.xml

docs/detailed-guide.md (new file, 454 lines)
@@ -0,0 +1,454 @@
# compliance-scan - Detailed Guide

Complete reference for developers and advanced users.

## Core Entry Points

| Component       | Path                                 | Purpose                               |
| --------------- | ------------------------------------ | ------------------------------------- |
| CLI             | `src/sslysze_scan/__main__.py`       | Command-line interface entry          |
| Scanner         | `src/sslysze_scan/scanner.py`        | SSLyze integration and scan execution |
| Database Writer | `src/sslysze_scan/db/writer.py`      | Scan result persistence               |
| Reporter        | `src/sslysze_scan/reporter/`         | Report generation (CSV/MD/reST)       |
| Compliance      | `src/sslysze_scan/db/compliance.py`  | BSI/IANA validation logic             |
| Query           | `src/sslysze_scan/reporter/query.py` | Database queries using views          |

## Installation

```bash
poetry install
```

## Quick Reference

```bash
# Scan server on multiple ports
poetry run compliance-scan scan example.com:443,636

# Generate Markdown report
poetry run compliance-scan report -t md -o report.md

# Generate CSV reports
poetry run compliance-scan report -t csv --output-dir ./reports

# List all scans
poetry run compliance-scan report --list
```

## CLI Commands

### Scan Command

```
compliance-scan scan <hostname>:<port1>,<port2> [options]
```

| Argument             | Required | Description                                                      |
| -------------------- | -------- | ---------------------------------------------------------------- |
| `<hostname>:<ports>` | Yes      | Target with comma-separated ports. IPv6: `[2001:db8::1]:443,636` |
| `--print`            | No       | Display summary in console                                       |
| `-db <path>`         | No       | Database file path (default: `compliance_status.db`)             |

Examples:

```bash
compliance-scan scan example.com:443,636 --print
compliance-scan scan [2001:db8::1]:443,636 -db custom.db
```
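
The target syntax above (including bracketed IPv6 hosts) takes only a few lines to parse. A minimal sketch of what `parse_host_ports` in `cli.py` could look like — the actual implementation may differ:

```python
import re


def parse_host_ports(target: str) -> tuple[str, list[int]]:
    """Split 'host:p1,p2' or '[ipv6]:p1,p2' into (host, [ports])."""
    # Try the bracketed form first, so colons inside an IPv6
    # address are not mistaken for the host/port separator.
    m = re.fullmatch(r"\[(?P<host>[^\]]+)\]:(?P<ports>\d+(,\d+)*)", target)
    if m is None:
        m = re.fullmatch(r"(?P<host>[^:\[\]]+):(?P<ports>\d+(,\d+)*)", target)
    if m is None:
        raise ValueError(f"invalid target: {target!r}")
    return m.group("host"), [int(p) for p in m.group("ports").split(",")]
```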

### Report Command

```
compliance-scan report [scan_id] -t <type> [options]
```

| Argument             | Required | Description                    |
| -------------------- | -------- | ------------------------------ |
| `scan_id`            | No       | Scan ID (default: latest scan) |
| `-t <type>`          | Yes      | Report type: csv, md, rest     |
| `-o <file>`          | No       | Output file (md/rest only)     |
| `--output-dir <dir>` | No       | Output directory               |
| `--list`             | No       | List all available scans       |

Examples:

```bash
compliance-scan report -t md -o report.md
compliance-scan report 5 -t csv --output-dir ./reports
compliance-scan report -t rest --output-dir ./docs
```

## Report Formats

### CSV

Generates granular files per port and category.

| File Pattern                                  | Content                                |
| --------------------------------------------- | -------------------------------------- |
| `summary.csv`                                 | Scan statistics and compliance summary |
| `<port>_cipher_suites_<version>_accepted.csv` | Accepted cipher suites per TLS version |
| `<port>_cipher_suites_<version>_rejected.csv` | Rejected cipher suites per TLS version |
| `<port>_supported_groups.csv`                 | Elliptic curves and DH groups          |
| `<port>_missing_groups_bsi.csv`               | BSI-approved groups not offered        |
| `<port>_missing_groups_iana.csv`              | IANA-recommended groups not offered    |
| `<port>_certificates.csv`                     | Certificate chain with compliance      |
| `<port>_vulnerabilities.csv`                  | Vulnerability scan results             |
| `<port>_protocol_features.csv`                | TLS protocol features                  |
| `<port>_session_features.csv`                 | Session handling features              |
| `<port>_http_headers.csv`                     | HTTP security headers                  |
| `<port>_compliance_status.csv`                | Aggregated compliance per check type   |

Behavior: Ports without TLS support generate no files. Empty sections are omitted.

### Markdown

Single comprehensive report with:

1. Metadata: Scan ID, hostname, IPs, timestamp, duration, ports
2. Summary: Statistics table
3. Per-port sections (TLS-enabled ports only):
   - TLS configuration
   - Cipher suites (accepted/rejected by version)
   - Supported groups with compliance
   - Missing groups (collapsible details)
   - Certificates with key size and compliance
   - Vulnerabilities
   - Protocol features
   - Session features
   - HTTP security headers

### reStructuredText

Identical structure to Markdown but uses `.. csv-table::` directives for Sphinx integration.

Use case: Generate documentation that references CSV files for tabular data.

## Database Structure

File: `compliance_status.db` (SQLite, Schema Version 5)

Template: `src/sslysze_scan/data/crypto_standards.db`

Full schema: [schema.sql](schema.sql)

### Scan Result Tables

| Table                    | Content                                                      |
| ------------------------ | ------------------------------------------------------------ |
| `scans`                  | Scan metadata: scan_id, hostname, ports, timestamp, duration |
| `scanned_hosts`          | Resolved FQDN with IPv4/IPv6 addresses                       |
| `scan_cipher_suites`     | Cipher suites per port and TLS version (accepted/rejected)   |
| `scan_supported_groups`  | Elliptic curves and DH groups per port                       |
| `scan_certificates`      | Certificate chain with key type, size, validity              |
| `scan_vulnerabilities`   | Vulnerability test results per port                          |
| `scan_protocol_features` | TLS protocol features (compression, early data, etc.)        |
| `scan_session_features`  | Session renegotiation and resumption                         |
| `scan_http_headers`      | HTTP security headers per port                               |
| `scan_compliance_status` | Compliance evaluation per item and port                      |

### Database Views (Schema v5)

Six optimized views eliminate complex JOINs and improve query performance:

| View                                 | Purpose                                       |
| ------------------------------------ | --------------------------------------------- |
| `v_cipher_suites_with_compliance`    | Cipher suites with BSI/IANA compliance flags  |
| `v_supported_groups_with_compliance` | Groups with compliance status                 |
| `v_certificates_with_compliance`     | Certificates with key size compliance         |
| `v_port_compliance_summary`          | Aggregated compliance statistics per port     |
| `v_missing_bsi_groups`               | BSI-approved groups not offered by server     |
| `v_missing_iana_groups`              | IANA-recommended groups not offered by server |
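
From Python, these views are plain read-only tables queried through the standard `sqlite3` module. A hedged sketch of the kind of helper `reporter/query.py` could contain (column names taken from the SQL Query Examples section below):

```python
import sqlite3


def port_summary(conn: sqlite3.Connection, scan_id: int, port: int) -> list[tuple]:
    """Fetch aggregated compliance statistics for one port via the v5 view."""
    return conn.execute(
        "SELECT check_type, total, passed, percentage "
        "FROM v_port_compliance_summary "
        "WHERE scan_id = ? AND port = ?",
        (scan_id, port),
    ).fetchall()
```

Because the JOIN and aggregation logic lives in the view, callers stay a single SELECT away from the data.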

### Reference Data Tables

IANA TLS:

- `iana_tls_cipher_suites`: Cipher suite registry with recommendations
- `iana_tls_signature_schemes`: Signature algorithm registry
- `iana_tls_supported_groups`: Named groups registry

BSI TR-02102-1 (Certificates):

- `bsi_tr_02102_1_key_requirements`: Key length requirements
- `bsi_tr_02102_1_hash_requirements`: Hash algorithm requirements

BSI TR-02102-2 (TLS):

- `bsi_tr_02102_2_tls`: TLS cipher suites and groups with validity periods

BSI TR-02102-3 (IPsec/IKEv2):

- Encryption, integrity, DH groups

BSI TR-02102-4 (SSH):

- Key exchange, encryption, MAC

CSV Export Metadata:

- `csv_export_metadata`: Stores CSV headers as JSON for all export types

## Compliance Validation

### BSI TR-02102-1 (Certificates)

Key length requirements:

| Algorithm | Minimum Bits | Status                        |
| --------- | ------------ | ----------------------------- |
| RSA       | 3000         | Required                      |
| ECDSA     | 250          | Required                      |
| DSA       | 3072         | Deprecated (valid until 2029) |

Hash algorithms:

- Allowed: SHA-256, SHA-384, SHA-512
- Deprecated: SHA-1, MD5
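
The key-length check reduces to a lookup against the table above. A minimal sketch (the real logic in `db/compliance.py` also tracks validity periods and deprecation status):

```python
# Minimum key sizes per BSI TR-02102-1, mirroring the table above.
BSI_MIN_KEY_BITS = {"RSA": 3000, "ECDSA": 250, "DSA": 3072}


def key_compliant(key_type: str, key_bits: int) -> bool:
    """Return True if the key meets the BSI minimum for its algorithm."""
    minimum = BSI_MIN_KEY_BITS.get(key_type.upper())
    return minimum is not None and key_bits >= minimum
```

A 2048-bit RSA key therefore fails, while a 256-bit ECDSA key passes.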

### BSI TR-02102-2 (TLS)

Validates:

- Cipher suites against BSI-approved lists
- Supported groups against BSI requirements
- Validity periods (time-based expiration)

### IANA

Validates:

- Cipher suite recommendations (Y/N/D flags)
- Supported group recommendations (Y/N/D flags)

## Project Structure

```
src/sslysze_scan/
├── __main__.py              # CLI entry point
├── cli.py                   # Argument parsing
├── scanner.py               # SSLyze integration
├── protocol_loader.py       # Port-protocol mapping
├── output.py                # Console output
├── commands/
│   ├── scan.py              # Scan command handler
│   └── report.py            # Report command handler
├── db/
│   ├── schema.py            # Schema version management
│   ├── writer.py            # Scan result storage
│   ├── compliance.py        # Compliance validation
│   └── writers/             # Specialized writers
├── reporter/
│   ├── query.py             # Database queries (uses views)
│   ├── csv_export.py        # CSV generation
│   ├── markdown_export.py   # Markdown generation
│   ├── rst_export.py        # reST generation
│   └── template_utils.py    # Shared utilities
├── templates/
│   ├── report.md.j2         # Markdown template
│   └── report.reST.j2       # reST template
└── data/
    ├── crypto_standards.db  # Template DB (IANA/BSI + schema)
    └── protocols.csv        # Port-protocol mapping
```

## Key Functions

### CLI and Parsing

| Function                   | Module   | Purpose                             |
| -------------------------- | -------- | ----------------------------------- |
| `parse_host_ports(target)` | `cli.py` | Parse `hostname:port1,port2` format |
| `parse_arguments()`        | `cli.py` | Parse CLI arguments                 |

### Scanning

| Function                                                     | Module       | Purpose                          |
| ------------------------------------------------------------ | ------------ | -------------------------------- |
| `perform_scan(hostname, port, start_time)`                   | `scanner.py` | Execute SSLyze scan for one port |
| `create_scan_request(hostname, port, use_opportunistic_tls)` | `scanner.py` | Create SSLyze scan request       |

### Database Writing

| Function                                                                     | Module             | Purpose                                 |
| ---------------------------------------------------------------------------- | ------------------ | --------------------------------------- |
| `save_scan_results(db_path, hostname, ports, results, start_time, duration)` | `db/writer.py`     | Store all scan results, returns scan_id |
| `check_compliance(db_path, scan_id)`                                         | `db/compliance.py` | Validate compliance, returns statistics |
| `check_schema_version(db_path)`                                              | `db/schema.py`     | Verify schema compatibility             |
| `get_schema_version(db_path)`                                                | `db/schema.py`     | Get current schema version              |

### Database Querying

| Function                              | Module              | Purpose                            |
| ------------------------------------- | ------------------- | ---------------------------------- |
| `get_scan_data(db_path, scan_id)`     | `reporter/query.py` | Get complete scan data using views |
| `get_scan_metadata(db_path, scan_id)` | `reporter/query.py` | Get scan metadata only             |
| `list_scans(db_path)`                 | `reporter/query.py` | List all scans in database         |

### Report Generation

| Function                                                     | Module                        | Purpose                              |
| ------------------------------------------------------------ | ----------------------------- | ------------------------------------ |
| `generate_csv_reports(db_path, scan_id, output_dir)`         | `reporter/csv_export.py`      | Generate all CSV files               |
| `generate_markdown_report(db_path, scan_id, output)`         | `reporter/markdown_export.py` | Generate Markdown report             |
| `generate_rest_report(db_path, scan_id, output, output_dir)` | `reporter/rst_export.py`      | Generate reStructuredText report     |
| `_get_headers(db_path, export_type)`                         | `reporter/csv_export.py`      | Load CSV headers from database       |
| `build_template_context(data)`                               | `reporter/template_utils.py`  | Prepare Jinja2 template context      |
| `generate_report_id(metadata)`                               | `reporter/template_utils.py`  | Generate report ID (YYYYMMDD_scanid) |
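
The report ID combines the scan date with the scan ID. A sketch of `generate_report_id`, assuming the metadata dict carries an ISO-8601 `timestamp` and a `scan_id` (field names are assumptions):

```python
from datetime import datetime


def generate_report_id(metadata: dict) -> str:
    """Build a 'YYYYMMDD_<scan_id>' report ID from scan metadata.

    Sketch only: assumes ISO-8601 'timestamp' and integer 'scan_id' fields.
    """
    ts = datetime.fromisoformat(metadata["timestamp"])
    return f"{ts:%Y%m%d}_{metadata['scan_id']}"
```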

## SQL Query Examples

All queries use optimized views for performance.

### Cipher Suites with Compliance

```sql
SELECT cipher_suite_name, iana_recommended_final, bsi_approved_final, compliant
FROM v_cipher_suites_with_compliance
WHERE scan_id = ? AND port = ? AND accepted = 1;
```

### Port Compliance Summary

```sql
SELECT check_type, total, passed, percentage
FROM v_port_compliance_summary
WHERE scan_id = ? AND port = ?;
```

### Missing BSI Groups

```sql
SELECT group_name, tls_version, valid_until
FROM v_missing_bsi_groups
WHERE scan_id = ?;
```

### Non-Compliant Certificates

```sql
SELECT port, key_type, key_bits, compliant, compliance_details
FROM v_certificates_with_compliance
WHERE scan_id = ? AND compliant = 0;
```

### Vulnerabilities

```sql
SELECT port, vuln_type, vulnerable, details
FROM scan_vulnerabilities
WHERE scan_id = ? AND vulnerable = 1;
```

## Supported Protocols

### Opportunistic TLS (STARTTLS)

| Protocol   | Ports      |
| ---------- | ---------- |
| SMTP       | 25, 587    |
| LDAP       | 389        |
| IMAP       | 143        |
| POP3       | 110        |
| FTP        | 21         |
| XMPP       | 5222, 5269 |
| RDP        | 3389       |
| PostgreSQL | 5432       |

### Direct TLS

| Protocol | Port |
| -------- | ---- |
| HTTPS    | 443  |
| LDAPS    | 636  |
| SMTPS    | 465  |
| IMAPS    | 993  |
| POP3S    | 995  |

### Not Supported

MySQL (proprietary TLS protocol)

Fallback behavior: Automatic retry with direct TLS if STARTTLS fails.

## Testing

```bash
poetry run pytest tests/ -v
```

**Test structure:**

- `tests/conftest.py`: Fixtures with test_db, test_db_path
- `tests/fixtures/test_scan.db`: Real scan data (Scan 1: dc.validation.lan:443,636)
- `tests/test_csv_export.py`: 11 CSV export tests
- `tests/test_template_utils.py`: 3 template utility tests
- `tests/test_compliance.py`: 2 compliance tests
- `tests/test_cli.py`: 3 CLI parsing tests

**Total:** 19 tests

**Test database setup:**

- Loads `crypto_standards.db` (reference data + schema)
- Loads `test_scan.db` (scan data only)
- Creates views dynamically
- In-memory for speed

## Code Quality

**Linter:** Ruff

```bash
poetry run ruff check src/ tests/
poetry run ruff format src/ tests/
```

**Configuration:** `pyproject.toml`

- Line length: 90 characters
- Target: Python 3.13
- Rules: PEP 8, pyflakes, isort, naming, upgrades

## Requirements

- Python 3.13+
- SSLyze 6.0.0+
- Poetry (dependency management)
- Jinja2 3.1+
- pytest 9.0+ (development)
- ruff (development)

## Container Usage

```bash
./container-build.sh
podman run --rm compliance-scan:latest scan example.com:443
```

## Database Workflow

1. **First scan:** Copies `crypto_standards.db` → `compliance_status.db`
2. **Schema check:** Validates schema version (must be 5)
3. **Scan execution:** SSLyze performs TLS analysis
4. **Data storage:** Results written to scan tables
5. **Compliance check:** Validation against BSI/IANA via views
6. **Report generation:** Queries use views for optimized performance

## Architecture Notes

**Design principles:**

- Single database file contains everything (reference data + results)
- Views optimize complex queries (no N+1 queries)
- CSV headers in database (easy to modify)
- Template-based reports (Jinja2)
- Port-agnostic (one scan_id, multiple ports)

**Key decisions:**

- SQLite for simplicity and portability
- Views introduced in schema v5 for performance
- CSV export metadata centralized
- Test fixtures use real scan data
- Ruff for modern Python linting

docs/schema.sql (new file, 506 lines)
@@ -0,0 +1,506 @@
CREATE TABLE iana_tls_cipher_suites (
    value TEXT PRIMARY KEY,
    description TEXT,
    dtls TEXT,
    recommended TEXT,
    rfc_draft TEXT
);
CREATE TABLE iana_tls_signature_schemes (
    value TEXT PRIMARY KEY,
    description TEXT,
    dtls TEXT,
    recommended TEXT,
    rfc_draft TEXT
);
CREATE TABLE iana_tls_supported_groups (
    value TEXT PRIMARY KEY,
    description TEXT,
    dtls TEXT,
    recommended TEXT,
    rfc_draft TEXT
);
CREATE TABLE iana_tls_alerts (
    value TEXT PRIMARY KEY,
    description TEXT,
    dtls TEXT,
    recommended TEXT,
    rfc_draft TEXT
);
CREATE TABLE iana_tls_content_types (
    value TEXT PRIMARY KEY,
    description TEXT,
    dtls TEXT,
    recommended TEXT,
    rfc_draft TEXT
);
CREATE TABLE iana_ikev2_encryption_algorithms (
    value TEXT PRIMARY KEY,
    description TEXT,
    esp TEXT,
    ikev2 TEXT,
    rfc_draft TEXT
);
CREATE TABLE iana_ikev2_prf_algorithms (
    value TEXT PRIMARY KEY,
    description TEXT,
    status TEXT,
    rfc_draft TEXT
);
CREATE TABLE iana_ikev2_integrity_algorithms (
    value TEXT PRIMARY KEY,
    description TEXT,
    status TEXT,
    rfc_draft TEXT
);
CREATE TABLE iana_ikev2_dh_groups (
    value TEXT PRIMARY KEY,
    description TEXT,
    status TEXT,
    rfc_draft TEXT
);
CREATE TABLE iana_ikev2_authentication_methods (
    value TEXT PRIMARY KEY,
    description TEXT,
    status TEXT,
    rfc_draft TEXT
);
CREATE TABLE bsi_tr_02102_2_tls (
    name TEXT,
    iana_number TEXT,
    category TEXT,
    tls_version TEXT,
    valid_until INTEGER,
    reference TEXT,
    notes TEXT,
    PRIMARY KEY (name, tls_version, iana_number)
);
CREATE TABLE bsi_tr_02102_3_ikev2_encryption (
    verfahren TEXT PRIMARY KEY,
    iana_nr TEXT,
    spezifikation TEXT,
    laenge TEXT,
    verwendung TEXT
);
CREATE TABLE bsi_tr_02102_3_ikev2_prf (
    verfahren TEXT PRIMARY KEY,
    iana_nr TEXT,
    spezifikation TEXT,
    verwendung TEXT
);
CREATE TABLE bsi_tr_02102_3_ikev2_integrity (
    verfahren TEXT PRIMARY KEY,
    iana_nr TEXT,
    spezifikation TEXT,
    verwendung TEXT
);
CREATE TABLE bsi_tr_02102_3_ikev2_dh_groups (
    verfahren TEXT PRIMARY KEY,
    iana_nr TEXT,
    spezifikation TEXT,
    verwendung TEXT
);
CREATE TABLE bsi_tr_02102_3_ikev2_auth (
    verfahren TEXT,
    bit_laenge TEXT,
    hash_funktion TEXT,
    iana_nr TEXT,
    spezifikation TEXT,
    verwendung TEXT,
    PRIMARY KEY (verfahren, hash_funktion)
);
CREATE TABLE bsi_tr_02102_3_esp_encryption (
    verfahren TEXT PRIMARY KEY,
    iana_nr TEXT,
    spezifikation TEXT,
    aes_schluessellaenge TEXT,
    verwendung TEXT
);
CREATE TABLE bsi_tr_02102_3_esp_integrity (
    verfahren TEXT PRIMARY KEY,
    iana_nr TEXT,
    spezifikation TEXT,
    verwendung_bis TEXT
);
CREATE TABLE bsi_tr_02102_3_ah_integrity (
    verfahren TEXT PRIMARY KEY,
    iana_nr TEXT,
    spezifikation TEXT,
    verwendung_bis TEXT
);
CREATE TABLE bsi_tr_02102_4_ssh_kex (
    key_exchange_method TEXT PRIMARY KEY,
    spezifikation TEXT,
    verwendung TEXT,
    bemerkung TEXT
);
CREATE TABLE bsi_tr_02102_4_ssh_encryption (
    verschluesselungsverfahren TEXT PRIMARY KEY,
    spezifikation TEXT,
    verwendung TEXT,
    bemerkung TEXT
);
CREATE TABLE bsi_tr_02102_4_ssh_mac (
    mac_verfahren TEXT PRIMARY KEY,
    spezifikation TEXT,
    verwendung TEXT
);
CREATE TABLE bsi_tr_02102_4_ssh_auth (
    signaturverfahren TEXT PRIMARY KEY,
    spezifikation TEXT,
    verwendung TEXT,
    bemerkung TEXT
);
CREATE INDEX idx_bsi_tls_category ON bsi_tr_02102_2_tls(category);
CREATE INDEX idx_bsi_tls_valid_until ON bsi_tr_02102_2_tls(valid_until);
CREATE INDEX idx_iana_cipher_recommended ON iana_tls_cipher_suites(recommended);
CREATE INDEX idx_iana_groups_recommended ON iana_tls_supported_groups(recommended);
CREATE TABLE bsi_tr_02102_1_key_requirements (
    algorithm_type TEXT NOT NULL,
    usage_context TEXT NOT NULL,
    min_key_length INTEGER,
    recommended_key_length INTEGER,
    valid_from INTEGER NOT NULL,
    valid_until INTEGER,
    notes TEXT,
    reference_section TEXT,
    PRIMARY KEY (algorithm_type, usage_context, valid_from)
);
CREATE INDEX idx_bsi_key_req_algo ON bsi_tr_02102_1_key_requirements(algorithm_type);
CREATE INDEX idx_bsi_key_req_context ON bsi_tr_02102_1_key_requirements(usage_context);
CREATE TABLE bsi_tr_02102_1_hash_requirements (
    algorithm TEXT PRIMARY KEY,
    min_output_bits INTEGER,
    recommended_for TEXT,
    valid_from INTEGER NOT NULL,
    deprecated INTEGER DEFAULT 0,
    notes TEXT,
    reference_section TEXT
);
CREATE TABLE bsi_tr_02102_1_symmetric_requirements (
    algorithm TEXT NOT NULL,
    mode TEXT,
    min_key_bits INTEGER,
    recommended_key_bits INTEGER,
    block_size_bits INTEGER,
    valid_from INTEGER NOT NULL,
    deprecated INTEGER DEFAULT 0,
    notes TEXT,
    reference_section TEXT,
    PRIMARY KEY (algorithm, mode, valid_from)
);
CREATE INDEX idx_bsi_sym_algo ON bsi_tr_02102_1_symmetric_requirements(algorithm);
CREATE INDEX idx_bsi_sym_mode ON bsi_tr_02102_1_symmetric_requirements(mode);
CREATE TABLE bsi_tr_02102_1_mac_requirements (
    algorithm TEXT PRIMARY KEY,
    min_key_bits INTEGER,
    min_tag_bits INTEGER,
    valid_from INTEGER NOT NULL,
    notes TEXT,
    reference_section TEXT
);
CREATE TABLE bsi_tr_02102_1_pqc_requirements (
    algorithm TEXT NOT NULL,
    parameter_set TEXT,
    usage_context TEXT NOT NULL,
    valid_from INTEGER NOT NULL,
    notes TEXT,
    reference_section TEXT,
    PRIMARY KEY (algorithm, parameter_set, usage_context)
);
CREATE INDEX idx_bsi_pqc_algo ON bsi_tr_02102_1_pqc_requirements(algorithm);
CREATE INDEX idx_bsi_pqc_context ON bsi_tr_02102_1_pqc_requirements(usage_context);
CREATE TABLE bsi_tr_02102_1_auth_requirements (
    method TEXT PRIMARY KEY,
    min_length INTEGER,
    min_entropy_bits INTEGER,
    max_attempts INTEGER,
    valid_from INTEGER NOT NULL,
    notes TEXT,
    reference_section TEXT
);
CREATE TABLE bsi_tr_02102_1_rng_requirements (
    class TEXT PRIMARY KEY,
    min_seed_entropy_bits INTEGER,
    valid_from INTEGER NOT NULL,
    deprecated INTEGER DEFAULT 0,
    notes TEXT,
    reference_section TEXT
);
CREATE TABLE bsi_tr_02102_1_metadata (
    key TEXT PRIMARY KEY,
    value TEXT
);
CREATE TABLE schema_version (
    version INTEGER PRIMARY KEY,
    applied_at TEXT NOT NULL,
    description TEXT
);
CREATE TABLE scans (
    scan_id INTEGER PRIMARY KEY AUTOINCREMENT,
    timestamp TEXT NOT NULL,
    hostname TEXT NOT NULL,
    ports TEXT NOT NULL,
    scan_duration_seconds REAL
);
CREATE TABLE sqlite_sequence(name,seq);
CREATE TABLE scanned_hosts (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    scan_id INTEGER NOT NULL,
    fqdn TEXT NOT NULL,
    ipv4 TEXT,
    ipv6 TEXT,
    FOREIGN KEY (scan_id) REFERENCES scans(scan_id) ON DELETE CASCADE
);
CREATE TABLE scan_cipher_suites (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    scan_id INTEGER NOT NULL,
    port INTEGER NOT NULL,
    tls_version TEXT NOT NULL,
    cipher_suite_name TEXT NOT NULL,
    accepted BOOLEAN NOT NULL,
    iana_value TEXT,
    key_size INTEGER,
    is_anonymous BOOLEAN,
    FOREIGN KEY (scan_id) REFERENCES scans(scan_id) ON DELETE CASCADE
);
CREATE TABLE scan_supported_groups (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    scan_id INTEGER NOT NULL,
    port INTEGER NOT NULL,
    group_name TEXT NOT NULL,
    iana_value INTEGER,
    openssl_nid INTEGER,
    FOREIGN KEY (scan_id) REFERENCES scans(scan_id) ON DELETE CASCADE
);
CREATE TABLE scan_certificates (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    scan_id INTEGER NOT NULL,
    port INTEGER NOT NULL,
    position INTEGER NOT NULL,
    subject TEXT,
    issuer TEXT,
    serial_number TEXT,
    not_before TEXT,
    not_after TEXT,
    key_type TEXT,
    key_bits INTEGER,
    signature_algorithm TEXT,
    fingerprint_sha256 TEXT,
    FOREIGN KEY (scan_id) REFERENCES scans(scan_id) ON DELETE CASCADE
);
CREATE TABLE scan_vulnerabilities (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    scan_id INTEGER NOT NULL,
    port INTEGER NOT NULL,
    vuln_type TEXT NOT NULL,
    vulnerable BOOLEAN NOT NULL,
    details TEXT,
    FOREIGN KEY (scan_id) REFERENCES scans(scan_id) ON DELETE CASCADE
);
CREATE TABLE scan_compliance_status (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    scan_id INTEGER NOT NULL,
    port INTEGER NOT NULL,
    timestamp TEXT NOT NULL,
    check_type TEXT NOT NULL,
    item_name TEXT NOT NULL,
    iana_value TEXT,
    iana_recommended TEXT,
    bsi_approved BOOLEAN,
    bsi_valid_until INTEGER,
    passed BOOLEAN NOT NULL,
    severity TEXT,
    details TEXT,
    FOREIGN KEY (scan_id) REFERENCES scans(scan_id) ON DELETE CASCADE
);
CREATE TABLE scan_protocol_features (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    scan_id INTEGER NOT NULL,
    port INTEGER NOT NULL,
    feature_type TEXT NOT NULL,
    supported BOOLEAN NOT NULL,
    details TEXT,
    FOREIGN KEY (scan_id) REFERENCES scans(scan_id) ON DELETE CASCADE
);
CREATE TABLE scan_session_features (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    scan_id INTEGER NOT NULL,
    port INTEGER NOT NULL,
    feature_type TEXT NOT NULL,
    client_initiated BOOLEAN,
    secure BOOLEAN,
    session_id_supported BOOLEAN,
    ticket_supported BOOLEAN,
    attempted_resumptions INTEGER,
    successful_resumptions INTEGER,
    details TEXT,
    FOREIGN KEY (scan_id) REFERENCES scans(scan_id) ON DELETE CASCADE
);
CREATE TABLE scan_http_headers (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    scan_id INTEGER NOT NULL,
    port INTEGER NOT NULL,
    header_name TEXT NOT NULL,
    header_value TEXT,
    is_present BOOLEAN NOT NULL,
    FOREIGN KEY (scan_id) REFERENCES scans(scan_id) ON DELETE CASCADE
);
CREATE INDEX idx_scans_hostname ON scans(hostname);
CREATE INDEX idx_scans_timestamp ON scans(timestamp);
CREATE INDEX idx_scanned_hosts_scan ON scanned_hosts(scan_id);
CREATE INDEX idx_scanned_hosts_fqdn ON scanned_hosts(fqdn);
CREATE INDEX idx_cipher_suites_scan ON scan_cipher_suites(scan_id, port);
CREATE INDEX idx_cipher_suites_name ON scan_cipher_suites(cipher_suite_name);
CREATE INDEX idx_supported_groups_scan ON scan_supported_groups(scan_id);
CREATE INDEX idx_certificates_scan ON scan_certificates(scan_id);
CREATE INDEX idx_vulnerabilities_scan ON scan_vulnerabilities(scan_id);
CREATE INDEX idx_compliance_scan ON scan_compliance_status(scan_id);
CREATE INDEX idx_compliance_passed ON scan_compliance_status(passed);
CREATE INDEX idx_protocol_features_scan ON scan_protocol_features(scan_id);
CREATE INDEX idx_session_features_scan ON scan_session_features(scan_id);
CREATE INDEX idx_http_headers_scan ON scan_http_headers(scan_id);
CREATE VIEW v_cipher_suites_with_compliance AS
SELECT
    scs.scan_id,
    scs.port,
    scs.tls_version,
    scs.cipher_suite_name,
    scs.accepted,
    scs.iana_value,
    scs.key_size,
    scs.is_anonymous,
    sc.iana_recommended,
    sc.bsi_approved,
    sc.bsi_valid_until,
    sc.passed as compliant,
    CASE
        WHEN scs.accepted = 1 THEN sc.iana_recommended
        ELSE iana.recommended
    END as iana_recommended_final,
    CASE
        WHEN scs.accepted = 1 THEN sc.bsi_approved
        ELSE (bsi.name IS NOT NULL)
    END as bsi_approved_final,
    CASE
        WHEN scs.accepted = 1 THEN sc.bsi_valid_until
        ELSE bsi.valid_until
    END as bsi_valid_until_final
FROM scan_cipher_suites scs
LEFT JOIN scan_compliance_status sc
    ON scs.scan_id = sc.scan_id
    AND scs.port = sc.port
    AND sc.check_type = 'cipher_suite'
    AND scs.cipher_suite_name = sc.item_name
LEFT JOIN iana_tls_cipher_suites iana
    ON scs.cipher_suite_name = iana.description
LEFT JOIN bsi_tr_02102_2_tls bsi
    ON scs.cipher_suite_name = bsi.name
    AND scs.tls_version = bsi.tls_version
    AND bsi.category = 'cipher_suite'
/* v_cipher_suites_with_compliance(scan_id,port,tls_version,cipher_suite_name,accepted,iana_value,key_size,is_anonymous,iana_recommended,bsi_approved,bsi_valid_until,compliant,iana_recommended_final,bsi_approved_final,bsi_valid_until_final) */;
CREATE VIEW v_supported_groups_with_compliance AS
SELECT
    ssg.scan_id,
    ssg.port,
    ssg.group_name,
    ssg.iana_value,
    ssg.openssl_nid,
    sc.iana_recommended,
    sc.bsi_approved,
    sc.bsi_valid_until,
sc.passed as compliant
|
||||
FROM scan_supported_groups ssg
|
||||
LEFT JOIN scan_compliance_status sc
|
||||
ON ssg.scan_id = sc.scan_id
|
||||
AND ssg.port = sc.port
|
||||
AND sc.check_type = 'supported_group'
|
||||
AND ssg.group_name = sc.item_name
|
||||
/* v_supported_groups_with_compliance(scan_id,port,group_name,iana_value,openssl_nid,iana_recommended,bsi_approved,bsi_valid_until,compliant) */;
|
||||
CREATE VIEW v_certificates_with_compliance AS
|
||||
SELECT
|
||||
c.scan_id,
|
||||
c.port,
|
||||
c.position,
|
||||
c.subject,
|
||||
c.issuer,
|
||||
c.serial_number,
|
||||
c.not_before,
|
||||
c.not_after,
|
||||
c.key_type,
|
||||
c.key_bits,
|
||||
c.signature_algorithm,
|
||||
c.fingerprint_sha256,
|
||||
MAX(cs.passed) as compliant,
|
||||
MAX(cs.details) as compliance_details
|
||||
FROM scan_certificates c
|
||||
LEFT JOIN scan_compliance_status cs
|
||||
ON c.scan_id = cs.scan_id
|
||||
AND c.port = cs.port
|
||||
AND cs.check_type = 'certificate'
|
||||
AND cs.item_name = (c.key_type || ' ' || c.key_bits || ' Bit')
|
||||
GROUP BY c.scan_id, c.port, c.position, c.subject, c.issuer, c.serial_number,
|
||||
c.not_before, c.not_after, c.key_type, c.key_bits,
|
||||
c.signature_algorithm, c.fingerprint_sha256
|
||||
/* v_certificates_with_compliance(scan_id,port,position,subject,issuer,serial_number,not_before,not_after,key_type,key_bits,signature_algorithm,fingerprint_sha256,compliant,compliance_details) */;
|
||||
CREATE VIEW v_port_compliance_summary AS
|
||||
SELECT
|
||||
scan_id,
|
||||
port,
|
||||
check_type,
|
||||
COUNT(*) as total,
|
||||
SUM(CASE WHEN passed = 1 THEN 1 ELSE 0 END) as passed,
|
||||
ROUND(CAST(SUM(CASE WHEN passed = 1 THEN 1 ELSE 0 END) AS REAL) / COUNT(*) * 100, 1) as percentage
|
||||
FROM scan_compliance_status
|
||||
GROUP BY scan_id, port, check_type
|
||||
/* v_port_compliance_summary(scan_id,port,check_type,total,passed,percentage) */;
|
||||
CREATE VIEW v_missing_bsi_groups AS
|
||||
SELECT
|
||||
s.scan_id,
|
||||
s.ports,
|
||||
bsi.name as group_name,
|
||||
bsi.tls_version,
|
||||
bsi.valid_until
|
||||
FROM scans s
|
||||
CROSS JOIN (
|
||||
SELECT DISTINCT name, tls_version, valid_until
|
||||
FROM bsi_tr_02102_2_tls
|
||||
WHERE category = 'dh_group'
|
||||
) bsi
|
||||
WHERE NOT EXISTS (
|
||||
SELECT 1
|
||||
FROM scan_supported_groups ssg
|
||||
WHERE ssg.scan_id = s.scan_id
|
||||
AND LOWER(ssg.group_name) = LOWER(bsi.name)
|
||||
)
|
||||
/* v_missing_bsi_groups(scan_id,ports,group_name,tls_version,valid_until) */;
|
||||
CREATE VIEW v_missing_iana_groups AS
|
||||
SELECT
|
||||
s.scan_id,
|
||||
s.ports,
|
||||
iana.description as group_name,
|
||||
iana.value as iana_value
|
||||
FROM scans s
|
||||
CROSS JOIN (
|
||||
SELECT description, value
|
||||
FROM iana_tls_supported_groups
|
||||
WHERE recommended = 'Y'
|
||||
) iana
|
||||
WHERE NOT EXISTS (
|
||||
SELECT 1
|
||||
FROM scan_supported_groups ssg
|
||||
WHERE ssg.scan_id = s.scan_id
|
||||
AND LOWER(ssg.group_name) = LOWER(iana.description)
|
||||
)
|
||||
AND NOT EXISTS (
|
||||
SELECT 1
|
||||
FROM bsi_tr_02102_2_tls bsi
|
||||
WHERE LOWER(bsi.name) = LOWER(iana.description)
|
||||
AND bsi.category = 'dh_group'
|
||||
)
|
||||
/* v_missing_iana_groups(scan_id,ports,group_name,iana_value) */;
|
||||
CREATE TABLE csv_export_metadata (
|
||||
id INTEGER PRIMARY KEY,
|
||||
export_type TEXT UNIQUE NOT NULL,
|
||||
headers TEXT NOT NULL,
|
||||
description TEXT
|
||||
);
|
||||
45
pyproject.toml
Normal file
@@ -0,0 +1,45 @@
[project]
name = "compliance-scan"
version = "0.1.0"
description = ""
authors = [
    {name = "Heiko Haase", email = "heiko.haase.extern@univention.de"}
]
readme = "README.md"
requires-python = ">=3.13"
dependencies = [
    "sslyze>=6.0.0",
    "jinja2 (>=3.1.6,<4.0.0)",
]

[project.scripts]
compliance-scan = "sslysze_scan.__main__:main"

[tool.poetry]
packages = [{include = "sslysze_scan", from = "src"}]
include = ["src/sslysze_scan/data/*.csv"]

[build-system]
requires = ["poetry-core>=2.0.0,<3.0.0"]
build-backend = "poetry.core.masonry.api"

[dependency-groups]
dev = [
    "pytest (>=9.0.2,<10.0.0)",
    "ruff (>=0.14.9,<0.15.0)"
]

[tool.ruff]
line-length = 90
target-version = "py313"

[tool.ruff.lint]
select = ["E", "F", "W", "I", "N", "UP"]
ignore = ["TRY003", "EM102", "EM101", "C901", "PLR0912", "PLR0915"]

[tool.ruff.format]
quote-style = "double"
indent-style = "space"

[tool.ruff.lint.extend-per-file-ignores]
"*" = ["E501"]
15
src/sslysze_scan/__init__.py
Normal file
@@ -0,0 +1,15 @@
"""compliance-scan package for scanning SSL/TLS configurations."""

import logging

from .__main__ import main
from .scanner import perform_scan

__version__ = "0.1.0"
__all__ = ["main", "perform_scan"]

# Configure logging
logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
)
29
src/sslysze_scan/__main__.py
Normal file
@@ -0,0 +1,29 @@
#!/usr/bin/env python3
"""Main entry point for compliance-scan."""

import sys

from .cli import parse_arguments
from .commands import handle_report_command, handle_scan_command
from .output import print_error


def main() -> int:
    """Main entry point for compliance-scan.

    Returns:
        Exit code (0 for success, 1 for error).

    """
    args = parse_arguments()

    if args.command == "scan":
        return handle_scan_command(args)
    if args.command == "report":
        return handle_report_command(args)
    print_error(f"Unknown command: {args.command}")
    return 1


if __name__ == "__main__":
    sys.exit(main())
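The `main()` above is a thin dispatcher over argparse subcommands. As a minimal standalone sketch of the same pattern (the `handle_scan` body here is a hypothetical stand-in; the real handlers live in `commands/`):

```python
import argparse


def handle_scan(args: argparse.Namespace) -> int:
    # Hypothetical stand-in for commands.scan.handle_scan_command
    print(f"scanning {args.target}")
    return 0


def dispatch(argv: list[str]) -> int:
    parser = argparse.ArgumentParser(prog="compliance-scan")
    subparsers = parser.add_subparsers(dest="command")
    scan_parser = subparsers.add_parser("scan")
    scan_parser.add_argument("target")
    args = parser.parse_args(argv)
    if args.command == "scan":
        return handle_scan(args)
    parser.print_help()  # no subcommand given
    return 1


print(dispatch(["scan", "example.com:443"]))
```

Because `dest="command"` is set, the chosen subcommand name lands on `args.command`, which is what the dispatch chain keys on.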
268
src/sslysze_scan/cli.py
Normal file
@@ -0,0 +1,268 @@
"""Command-line interface for compliance-scan."""

import argparse
import sys

# Port constants
MIN_PORT_NUMBER = 1
MAX_PORT_NUMBER = 65535
MIN_PARTS_COUNT = 2

# Error messages
ERR_INVALID_FORMAT = "Invalid format"
ERR_EXPECTED_FORMAT = "Expected format: hostname:port or hostname:port1,port2,..."
ERR_IPV6_MISSING_BRACKET = "Invalid IPv6 format"
ERR_MISSING_CLOSING_BRACKET = "Missing closing bracket."
ERR_IPV6_MISSING_COLON = "Expected colon after IPv6 address."
ERR_HOSTNAME_EMPTY = "Hostname cannot be empty"
ERR_INVALID_PORT_NUMBER = "Invalid port number"
ERR_PORT_MUST_BE_INTEGER = "Must be an integer."
ERR_PORT_OUT_OF_RANGE = "Must be between"
ERR_AT_LEAST_ONE_PORT = "At least one port must be specified"


def _parse_ipv6_target(target: str) -> tuple[str, str]:
    """Parse IPv6 target in format [ipv6]:ports.

    Args:
        target: String in format "[ipv6]:port1,port2,..."

    Returns:
        Tuple of (hostname, port_str)

    Raises:
        ValueError: If format is invalid

    """
    bracket_end = target.find("]")
    if bracket_end == -1:
        msg = f"{ERR_IPV6_MISSING_BRACKET} '{target}'. {ERR_MISSING_CLOSING_BRACKET}"
        raise ValueError(msg)

    hostname = target[1:bracket_end]
    rest = target[bracket_end + 1 :]

    if not rest.startswith(":"):
        msg = f"{ERR_INVALID_FORMAT} '{target}'. {ERR_IPV6_MISSING_COLON}"
        raise ValueError(msg)

    return hostname, rest[1:]


def _parse_regular_target(target: str) -> tuple[str, str]:
    """Parse regular hostname or IPv4 target.

    Args:
        target: String in format "hostname:port1,port2,..."

    Returns:
        Tuple of (hostname, port_str)

    Raises:
        ValueError: If format is invalid

    """
    parts = target.rsplit(":", 1)
    if len(parts) != MIN_PARTS_COUNT:
        msg = f"{ERR_INVALID_FORMAT} '{target}'. {ERR_EXPECTED_FORMAT}"
        raise ValueError(msg)

    return parts[0].strip(), parts[1].strip()


def _parse_port_list(port_str: str) -> list[int]:
    """Parse comma-separated port list.

    Args:
        port_str: String with comma-separated ports

    Returns:
        List of port numbers

    Raises:
        ValueError: If port is invalid or out of range

    """
    port_list = []
    for port_item in port_str.split(","):
        port_item = port_item.strip()
        if not port_item:
            continue

        try:
            port = int(port_item)
        except ValueError as e:
            msg = f"{ERR_INVALID_PORT_NUMBER} '{port_item}'. {ERR_PORT_MUST_BE_INTEGER}"
            raise ValueError(msg) from e

        if port < MIN_PORT_NUMBER or port > MAX_PORT_NUMBER:
            msg = f"{ERR_INVALID_PORT_NUMBER} {port}. {ERR_PORT_OUT_OF_RANGE} {MIN_PORT_NUMBER} and {MAX_PORT_NUMBER}."
            raise ValueError(msg)

        port_list.append(port)

    if not port_list:
        raise ValueError(ERR_AT_LEAST_ONE_PORT)

    return port_list


def parse_host_ports(target: str) -> tuple[str, list[int]]:
    """Parse host:port1,port2,... string.

    Args:
        target: String in format "hostname:port1,port2,..." or "[ipv6]:port1,port2,...".

    Returns:
        Tuple of (hostname, list of ports).

    Raises:
        ValueError: If format is invalid or port is out of range.

    """
    if ":" not in target:
        msg = f"{ERR_INVALID_FORMAT} '{target}'. {ERR_EXPECTED_FORMAT}"
        raise ValueError(msg)

    if target.startswith("["):
        hostname, port_str = _parse_ipv6_target(target)
    else:
        hostname, port_str = _parse_regular_target(target)

    if not hostname:
        raise ValueError(ERR_HOSTNAME_EMPTY)

    port_list = _parse_port_list(port_str)
    return hostname, port_list


def parse_arguments() -> argparse.Namespace:
    """Parse command-line arguments.

    Returns:
        Parsed arguments namespace.

    """
    parser = argparse.ArgumentParser(
        prog="compliance-scan",
        description="SSL/TLS configuration analysis with SSLyze and compliance checking.",
        formatter_class=argparse.RawDescriptionHelpFormatter,
    )

    # Create subcommands
    subparsers = parser.add_subparsers(dest="command", help="Available commands")

    # Scan subcommand
    scan_parser = subparsers.add_parser(
        "scan",
        help="Perform SSL/TLS scan",
        formatter_class=argparse.RawDescriptionHelpFormatter,
        epilog="""
Examples:
  compliance-scan scan example.com:443
  compliance-scan scan example.com:443,636,993
  compliance-scan scan example.com:443 --print
  compliance-scan scan example.com:443,636 -db /path/to/scans.db
  compliance-scan scan [2001:db8::1]:443,636 --print
""",
    )

    scan_parser.add_argument(
        "target",
        type=str,
        help="Target to scan in format hostname:port1,port2,... (e.g., example.com:443,636)",
    )

    scan_parser.add_argument(
        "-db",
        "--database",
        type=str,
        help="SQLite database file path (default: compliance_status.db in current directory)",
        default="compliance_status.db",
    )

    scan_parser.add_argument(
        "--print",
        action="store_true",
        help="Print scan results to console",
        default=False,
    )

    # Report subcommand
    report_parser = subparsers.add_parser(
        "report",
        help="Generate report from scan results",
        formatter_class=argparse.RawDescriptionHelpFormatter,
        epilog="""
Examples:
  compliance-scan report -t csv
  compliance-scan report 5 -t md -o report.md
  compliance-scan report -t rest --output-dir ./rest-reports
  compliance-scan report --list
  compliance-scan report -t csv --output-dir ./reports
""",
    )

    report_parser.add_argument(
        "scan_id",
        type=int,
        nargs="?",
        help="Scan ID to generate report for (default: latest scan)",
    )

    report_parser.add_argument(
        "-t",
        "--type",
        type=str,
        choices=["csv", "md", "markdown", "rest", "rst"],
        help="Report type (csv, markdown, or rest/rst)",
    )

    report_parser.add_argument(
        "-db",
        "--database",
        type=str,
        help="SQLite database file path (default: compliance_status.db in current directory)",
        default="compliance_status.db",
    )

    report_parser.add_argument(
        "-o",
        "--output",
        type=str,
        help="Output file for markdown report (auto-generated if not specified)",
    )

    report_parser.add_argument(
        "--output-dir",
        type=str,
        default=".",
        help="Output directory for CSV/reST reports (default: current directory)",
    )

    report_parser.add_argument(
        "--list",
        action="store_true",
        help="List all available scans",
        default=False,
    )

    args = parser.parse_args()

    # Check if no command was provided
    if args.command is None:
        parser.print_help()
        sys.exit(1)

    return args


def main() -> int:
    """Main entry point for CLI.

    Returns:
        Exit code (0 for success, 1 for error).

    """
    parse_arguments()
    return 0
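`parse_host_ports` splits on the last colon for hostnames/IPv4 and on the closing bracket for IPv6 literals, then validates every port. A condensed standalone sketch of that behavior (simplified error messages, not the module's actual function):

```python
def parse_host_ports(target: str) -> tuple[str, list[int]]:
    """Condensed sketch: '[ipv6]:p1,p2' or 'host:p1,p2' -> (host, [ports])."""
    if target.startswith("["):
        # IPv6 literal: everything up to ']:' is the address
        host, _, ports = target[1:].partition("]:")
    else:
        # Split on the LAST colon so 'host' may not contain the port list
        host, _, ports = target.rpartition(":")
    if not host or not ports:
        raise ValueError(f"Invalid format '{target}'")
    port_list = [int(p) for p in ports.split(",") if p.strip()]
    if any(p < 1 or p > 65535 for p in port_list):
        raise ValueError("Port out of range")
    return host, port_list


print(parse_host_ports("example.com:443,636"))   # → ('example.com', [443, 636])
print(parse_host_ports("[2001:db8::1]:443"))     # → ('2001:db8::1', [443])
```

Splitting with `rpartition(":")` rather than `partition` is what keeps bare colons in the host part from being misread as port separators.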
6
src/sslysze_scan/commands/__init__.py
Normal file
@@ -0,0 +1,6 @@
"""Command handlers for compliance-scan CLI."""

from .report import handle_report_command
from .scan import handle_scan_command

__all__ = ["handle_report_command", "handle_scan_command"]
102
src/sslysze_scan/commands/report.py
Normal file
@@ -0,0 +1,102 @@
"""Report command handler."""

import argparse
import sqlite3
from pathlib import Path

from ..output import print_error, print_success
from ..reporter import generate_report, list_scans


def handle_report_command(args: argparse.Namespace) -> int:
    """Handle the report subcommand.

    Args:
        args: Parsed arguments

    Returns:
        Exit code (0 for success, 1 for error)

    """
    db_path = args.database

    # Check if database exists
    if not Path(db_path).exists():
        print_error(f"Database not found: {db_path}")
        return 1

    # Handle --list option
    if args.list:
        try:
            scans = list_scans(db_path)
            if not scans:
                print("No scans found in database.")
                return 0

            print("Available scans:")
            print("-" * 80)
            print(f"{'ID':<5} {'Timestamp':<25} {'Hostname':<20} {'Ports':<20}")
            print("-" * 80)
            for scan in scans:
                print(
                    f"{scan['scan_id']:<5} {scan['timestamp']:<25} {scan['hostname']:<20} {scan['ports']:<20}",
                )
            return 0
        except (sqlite3.Error, OSError) as e:
            print_error(f"Error listing scans: {e}")
            return 1

    # Check if report type is specified
    if not args.type:
        print_error("Report type must be specified with -t/--type (csv, md, or rest)")
        return 1

    # Determine scan_id
    scan_id = args.scan_id
    if scan_id is None:
        # Get latest scan
        try:
            conn = sqlite3.connect(db_path)
            cursor = conn.cursor()
            cursor.execute("SELECT MAX(scan_id) FROM scans")
            result = cursor.fetchone()
            conn.close()

            if result and result[0]:
                scan_id = result[0]
            else:
                print_error("No scans found in database.")
                return 1
        except (sqlite3.Error, OSError) as e:
            print_error(f"Error determining latest scan ID: {e}")
            return 1

    # Generate report
    try:
        # Map report type aliases
        if args.type in ["md", "markdown"]:
            report_type = "markdown"
        elif args.type in ["rest", "rst"]:
            report_type = "rest"
        else:
            report_type = "csv"

        output = args.output if hasattr(args, "output") else None
        output_dir = args.output_dir if hasattr(args, "output_dir") else "."

        files = generate_report(
            db_path,
            scan_id,
            report_type,
            output=output,
            output_dir=output_dir,
        )

        print_success("Report successfully created:")
        for file in files:
            print(f" - {file}")
        return 0

    except (sqlite3.Error, OSError, ValueError) as e:
        print_error(f"Error creating report: {e}")
        return 1
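When no `scan_id` argument is given, the handler falls back to the highest `scan_id` via `MAX(scan_id)`. A minimal in-memory demonstration of that lookup (table trimmed to two columns, for illustration only):

```python
import sqlite3

# Build a throwaway database with three scans
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE scans (scan_id INTEGER PRIMARY KEY, hostname TEXT)")
conn.executemany(
    "INSERT INTO scans (scan_id, hostname) VALUES (?, ?)",
    [(1, "a.example"), (2, "b.example"), (3, "c.example")],
)

# Same lookup the report command uses to default to the latest scan
row = conn.execute("SELECT MAX(scan_id) FROM scans").fetchone()
latest = row[0] if row and row[0] else None
print(latest)  # → 3
conn.close()
```

Note that `MAX()` over an empty table still returns one row containing `NULL`, which is why the handler checks both the row and its value before using it.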
188
src/sslysze_scan/commands/scan.py
Normal file
@@ -0,0 +1,188 @@
"""Scan command handler."""

import argparse
import shutil
import sqlite3
from datetime import datetime, timezone
from pathlib import Path
from typing import Any

from ..cli import parse_host_ports
from ..db import check_compliance, check_schema_version, save_scan_results
from ..output import print_error
from ..scanner import perform_scan


def handle_scan_command(args: argparse.Namespace) -> int:
    """Handle the scan subcommand.

    Args:
        args: Parsed arguments

    Returns:
        Exit code (0 for success, 1 for error)

    """
    # Parse target
    try:
        hostname, ports = parse_host_ports(args.target)
    except ValueError as e:
        print_error(str(e))
        return 1

    # Database path
    db_path = args.database

    # Initialize database by copying template if needed
    try:
        db_file = Path(db_path)

        if not db_file.exists():
            # Get template database path
            script_dir = Path(__file__).parent.parent
            template_db = script_dir / "data" / "crypto_standards.db"

            if not template_db.exists():
                print_error(f"Template database not found: {template_db}")
                return 1

            # Copy template
            print(f"Creating database from template: {template_db}")
            shutil.copy2(template_db, db_path)

        # Check schema version
        check_schema_version(db_path)

    except (OSError, sqlite3.Error, ValueError) as e:
        print_error(f"Error initializing database: {e}")
        return 1

    # Single timestamp for all scans (program start time)
    program_start_time = datetime.now(timezone.utc)

    # Scan results storage
    scan_results_dict: dict[int, Any] = {}
    failed_ports: list[int] = []
    total_scans = len(ports)

    # Perform scans for all ports sequentially
    for port in ports:
        try:
            scan_result, scan_duration = perform_scan(hostname, port, program_start_time)
            scan_results_dict[port] = scan_result
        except (OSError, ValueError, RuntimeError) as e:
            print_error(f"Error scanning {hostname}:{port}: {e}")
            failed_ports.append(port)
            continue

    # Calculate total scan duration
    scan_end_time = datetime.now(timezone.utc)
    total_scan_duration = (scan_end_time - program_start_time).total_seconds()

    # Save all results to database with single scan_id
    if scan_results_dict:
        try:
            scan_id = save_scan_results(
                db_path,
                hostname,
                list(scan_results_dict.keys()),
                scan_results_dict,
                program_start_time,
                total_scan_duration,
            )
            print(f"\n=> Scan results saved to database (Scan-ID: {scan_id})")
        except (sqlite3.Error, OSError) as e:
            print_error(f"Error saving to database: {e}")
            return 1

        # Run compliance checks
        try:
            compliance_stats = check_compliance(db_path, scan_id)
            print("=> Compliance check completed")
        except (sqlite3.Error, ValueError) as e:
            print_error(f"Error during compliance check: {e}")
            return 1

        # Print summary if requested
        if args.print:
            from sslyze import ServerScanStatusEnum

            print("\n" + "=" * 70)
            print("SCAN SUMMARY")
            print("=" * 70)
            print(f"Scan-ID: {scan_id}")
            print(f"Hostname: {hostname}")
            print(f"Ports: {', '.join(str(p) for p in scan_results_dict.keys())}")
            print(f"Timestamp: {program_start_time.isoformat()}")
            print(f"Duration: {total_scan_duration:.2f}s")
            print("-" * 70)

            for port, scan_res in scan_results_dict.items():
                print(f"\nPort {port}:")

                if scan_res.scan_status == ServerScanStatusEnum.COMPLETED:
                    print(" Status: COMPLETED")
                    if scan_res.connectivity_result:
                        print(
                            f" Highest TLS: {scan_res.connectivity_result.highest_tls_version_supported}",
                        )

                    # Query supported TLS versions from database
                    try:
                        conn = sqlite3.connect(db_path)
                        cursor = conn.cursor()
                        cursor.execute(
                            """
                            SELECT DISTINCT tls_version
                            FROM scan_cipher_suites
                            WHERE scan_id = ? AND port = ? AND accepted = 1
                            ORDER BY
                                CASE tls_version
                                    WHEN 'ssl_3.0' THEN 1
                                    WHEN '1.0' THEN 2
                                    WHEN '1.1' THEN 3
                                    WHEN '1.2' THEN 4
                                    WHEN '1.3' THEN 5
                                    ELSE 6
                                END
                            """,
                            (scan_id, port),
                        )
                        supported_versions = [row[0] for row in cursor.fetchall()]
                        conn.close()

                        if supported_versions:
                            version_map = {
                                "ssl_3.0": "SSL 3.0",
                                "1.0": "TLS 1.0",
                                "1.1": "TLS 1.1",
                                "1.2": "TLS 1.2",
                                "1.3": "TLS 1.3",
                            }
                            formatted_versions = [
                                version_map.get(v, v) for v in supported_versions
                            ]
                            print(f" Supported: {', '.join(formatted_versions)}")
                    except (sqlite3.Error, OSError):
                        pass  # Silently ignore DB query errors in summary
                else:
                    print(f" Status: {scan_res.scan_status}")

            print("\n" + "-" * 70)
            print(
                f"Compliance: Cipher Suites {compliance_stats['cipher_suites_passed']}/{compliance_stats['cipher_suites_checked']}, "
                f"Groups {compliance_stats['supported_groups_passed']}/{compliance_stats['supported_groups_checked']}",
            )

    # Final summary
    print("\n" + "=" * 70)
    successful_scans = total_scans - len(failed_ports)
    print(f"Completed: {successful_scans}/{total_scans} scans successful")
    if failed_ports:
        print(f"Failed: {', '.join(str(p) for p in failed_ports)}")
    print(f"Database: {db_path}")
    print("=" * 70)

    return 0 if len(failed_ports) == 0 else 1
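The summary query above orders protocol versions with a `CASE` expression because the stored strings (`ssl_3.0`, `1.0`, …) do not sort chronologically as text. A standalone sketch of that ordering against an in-memory database (table trimmed to the columns the query touches):

```python
import sqlite3

# Throwaway table with accepted cipher-suite rows in arbitrary insert order
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE scan_cipher_suites (tls_version TEXT, accepted INTEGER)")
conn.executemany(
    "INSERT INTO scan_cipher_suites VALUES (?, 1)",
    [("1.3",), ("1.0",), ("1.2",), ("ssl_3.0",)],
)

# Same CASE-based ranking as in the scan summary
rows = conn.execute(
    """
    SELECT DISTINCT tls_version FROM scan_cipher_suites
    WHERE accepted = 1
    ORDER BY CASE tls_version
        WHEN 'ssl_3.0' THEN 1 WHEN '1.0' THEN 2 WHEN '1.1' THEN 3
        WHEN '1.2' THEN 4 WHEN '1.3' THEN 5 ELSE 6 END
    """
).fetchall()
print([r[0] for r in rows])  # → ['ssl_3.0', '1.0', '1.2', '1.3']
conn.close()
```

A plain `ORDER BY tls_version` would sort `ssl_3.0` after `1.3` lexicographically, which is why the explicit ranking is needed.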
BIN
src/sslysze_scan/data/crypto_standards.db
Normal file
Binary file not shown.
61
src/sslysze_scan/data/iana_parse.json
Normal file
@@ -0,0 +1,61 @@
{
  "proto/assignments/tls-parameters/tls-parameters.xml": [
    [
      "tls-parameters-4",
      "tls_cipher_suites.csv",
      ["Value", "Description", "DTLS", "Recommended", "RFC/Draft"]
    ],
    [
      "tls-signaturescheme",
      "tls_signature_schemes.csv",
      ["Value", "Description", "DTLS", "Recommended", "RFC/Draft"]
    ],
    [
      "tls-parameters-8",
      "tls_supported_groups.csv",
      ["Value", "Description", "DTLS", "Recommended", "RFC/Draft"]
    ],
    [
      "tls-parameters-6",
      "tls_alerts.csv",
      ["Value", "Description", "DTLS", "Recommended", "RFC/Draft"]
    ],
    [
      "tls-parameters-5",
      "tls_content_types.csv",
      ["Value", "Description", "DTLS", "Recommended", "RFC/Draft"]
    ],
    [
      "tls-parameters-7",
      "tls_content_types.csv",
      ["Value", "Description", "DTLS", "Recommended", "RFC/Draft"]
    ]
  ],
  "proto/assignments/ikev2-parameters/ikev2-parameters.xml": [
    [
      "ikev2-parameters-5",
      "ikev2_encryption_algorithms.csv",
      ["Value", "Description", "ESP", "IKEv2", "RFC/Draft"]
    ],
    [
      "ikev2-parameters-6",
      "ikev2_prf_algorithms.csv",
      ["Value", "Description", "Status", "RFC/Draft"]
    ],
    [
      "ikev2-parameters-7",
      "ikev2_integrity_algorithms.csv",
      ["Value", "Description", "Status", "RFC/Draft"]
    ],
    [
      "ikev2-parameters-8",
      "ikev2_dh_groups.csv",
      ["Value", "Description", "Status", "RFC/Draft"]
    ],
    [
      "ikev2-parameters-12",
      "ikev2_authentication_methods.csv",
      ["Value", "Description", "Status", "RFC/Draft"]
    ]
  ]
}
11
src/sslysze_scan/data/protocols.csv
Normal file
@@ -0,0 +1,11 @@
protocol,port
SMTP,25
SMTP,587
LDAP,389
IMAP,143
POP3,110
FTP,21
XMPP,5222
XMPP,5269
RDP,3389
POSTGRES,5432
12
src/sslysze_scan/db/__init__.py
Normal file
@@ -0,0 +1,12 @@
"""Database module for compliance-scan results storage."""

from .compliance import check_compliance
from .schema import check_schema_version, get_schema_version
from .writer import save_scan_results

__all__ = [
    "check_compliance",
    "check_schema_version",
    "get_schema_version",
    "save_scan_results",
]
463
src/sslysze_scan/db/compliance.py
Normal file
463
src/sslysze_scan/db/compliance.py
Normal file
@@ -0,0 +1,463 @@
|
||||
"""Compliance checking module for IANA and BSI standards."""
|
||||
|
||||
import sqlite3
|
||||
from datetime import datetime, timezone
|
||||
from typing import Any
|
||||
|
||||
# Error messages
|
||||
ERR_COMPLIANCE_CHECK = "Error during compliance check"
|
||||
|
||||
|
||||
def check_compliance(db_path: str, scan_id: int) -> dict[str, Any]:
|
||||
"""Check compliance of scan results against IANA and BSI standards.
|
||||
|
||||
Args:
|
||||
db_path: Path to database file
|
||||
scan_id: ID of scan to check
|
||||
|
||||
Returns:
|
||||
Dictionary with compliance statistics
|
||||
|
||||
Raises:
|
||||
sqlite3.Error: If database operations fail
|
||||
|
||||
"""
|
||||
conn = sqlite3.connect(db_path)
|
||||
cursor = conn.cursor()
|
||||
|
||||
try:
|
||||
timestamp = datetime.now(timezone.utc).isoformat()
|
||||
stats = {
|
||||
"cipher_suites_checked": 0,
|
||||
"cipher_suites_passed": 0,
|
||||
"supported_groups_checked": 0,
|
||||
"supported_groups_passed": 0,
|
||||
"certificates_checked": 0,
|
||||
"certificates_passed": 0,
|
||||
}
|
||||
|
||||
# Check cipher suites
|
||||
stats["cipher_suites_checked"], stats["cipher_suites_passed"] = (
|
||||
_check_cipher_suite_compliance(cursor, scan_id, timestamp)
|
||||
)
|
||||
|
||||
# Check supported groups
|
||||
stats["supported_groups_checked"], stats["supported_groups_passed"] = (
|
||||
_check_supported_group_compliance(cursor, scan_id, timestamp)
|
||||
)
|
||||
|
||||
# Check certificates
|
||||
stats["certificates_checked"], stats["certificates_passed"] = (
|
||||
check_certificate_compliance(cursor, scan_id, timestamp)
|
||||
)
|
||||
|
||||
conn.commit()
|
||||
return stats
|
||||
|
||||
except Exception as e:
|
||||
conn.rollback()
|
||||
raise sqlite3.Error(f"{ERR_COMPLIANCE_CHECK}: {e}") from e
|
||||
finally:
|
||||
conn.close()
|
||||
|
||||
|
||||
def check_certificate_compliance(
|
||||
cursor: sqlite3.Cursor,
|
||||
scan_id: int,
|
||||
timestamp: str,
|
||||
) -> tuple[int, int]:
|
||||
"""Check certificate compliance against BSI TR-02102-1 standards.
|
||||
|
||||
Returns:
|
||||
Tuple of (total_checked, passed_count)
|
||||
|
||||
"""
|
||||
# Get certificates from scan
|
||||
cursor.execute(
|
||||
"""
|
||||
SELECT id, port, key_type, key_bits, signature_algorithm
|
||||
FROM scan_certificates
|
||||
WHERE scan_id = ?
|
||||
""",
|
||||
(scan_id,),
|
||||
)
|
||||
|
||||
certificates = cursor.fetchall()
|
||||
total_checked = 0
|
||||
passed_count = 0
|
||||
|
||||
for cert_id, port, key_type, key_bits, signature_algorithm in certificates:
|
||||
total_checked += 1
|
||||
|
||||
# Determine algorithm type from key_type string
|
||||
# key_type examples: "RSA", "ECC", "DSA"
|
||||
algo_type = None
|
||||
if key_type:
|
||||
key_type_upper = key_type.upper()
|
||||
if "RSA" in key_type_upper:
|
||||
algo_type = "RSA"
|
||||
elif (
|
||||
"EC" in key_type_upper
|
||||
or "ECDSA" in key_type_upper
|
||||
or "ECC" in key_type_upper
|
||||
):
|
||||
algo_type = "ECDSA"
|
||||
elif "DSA" in key_type_upper and "EC" not in key_type_upper:
|
||||
algo_type = "DSA"
|
||||
|
||||
        # Look up in BSI TR-02102-1 key requirements
        cursor.execute(
            """
            SELECT min_key_length, valid_until, notes
            FROM bsi_tr_02102_1_key_requirements
            WHERE algorithm_type = ? AND usage_context = 'signature'
            """,
            (algo_type,),
        )
        bsi_result = cursor.fetchone()

        passed = False
        severity = "critical"
        details = []

        if bsi_result and algo_type:
            min_key_length, valid_until, notes = bsi_result
            current_year = datetime.now(timezone.utc).year

            # Check key length
            if key_bits and key_bits >= min_key_length:
                if valid_until is None or valid_until >= current_year:
                    passed = True
                    severity = "info"
                    details.append(
                        f"BSI TR-02102-1: Compliant ({algo_type} {key_bits} ≥ {min_key_length} bits)",
                    )
                else:
                    passed = False
                    severity = "critical"
                    details.append(
                        f"BSI TR-02102-1: Algorithm deprecated (valid until {valid_until})",
                    )
            else:
                passed = False
                severity = "critical"
                details.append(
                    f"BSI TR-02102-1: Non-compliant ({algo_type} {key_bits} < {min_key_length} bits required)",
                )
        else:
            details.append(f"BSI TR-02102-1: Unknown algorithm type ({key_type})")
            severity = "warning"

        # Check signature hash algorithm
        # Extract hash from signature_algorithm (e.g., "sha256WithRSAEncryption" -> "SHA-256")
        sig_hash = None
        if signature_algorithm:
            sig_lower = signature_algorithm.lower()
            if "sha256" in sig_lower:
                sig_hash = "SHA-256"
            elif "sha384" in sig_lower:
                sig_hash = "SHA-384"
            elif "sha512" in sig_lower:
                sig_hash = "SHA-512"
            elif "sha1" in sig_lower:
                sig_hash = "SHA-1"
            elif "md5" in sig_lower:
                sig_hash = "MD5"

        if sig_hash:
            cursor.execute(
                """
                SELECT deprecated, min_output_bits
                FROM bsi_tr_02102_1_hash_requirements
                WHERE algorithm = ?
                """,
                (sig_hash,),
            )
            hash_result = cursor.fetchone()

            if hash_result:
                deprecated, min_bits = hash_result
                if deprecated == 1:
                    details.append(f"Hash: {sig_hash} deprecated")
                    if passed:
                        passed = False
                        severity = "critical"
                else:
                    details.append(f"Hash: {sig_hash} compliant")
            else:
                details.append(f"Hash: {sig_hash} unknown")

        if passed:
            passed_count += 1

        # Insert compliance record
        # Use key_type as-is for matching in reports
        cursor.execute(
            """
            INSERT INTO scan_compliance_status (
                scan_id, port, timestamp, check_type, item_name,
                iana_value, iana_recommended, bsi_approved, bsi_valid_until,
                passed, severity, details
            ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
            """,
            (
                scan_id,
                port,
                timestamp,
                "certificate",
                f"{key_type} {key_bits} bits" if key_type and key_bits else "Unknown",
                None,
                None,
                passed,
                None,
                passed,
                severity,
                "; ".join(details),
            ),
        )

    return total_checked, passed_count


def _check_cipher_suite_compliance(
    cursor: sqlite3.Cursor,
    scan_id: int,
    timestamp: str,
) -> tuple[int, int]:
    """Check cipher suite compliance against IANA and BSI standards.

    Returns:
        Tuple of (total_checked, passed_count)

    """
    # Get accepted cipher suites from scan
    cursor.execute(
        """
        SELECT id, port, cipher_suite_name, tls_version
        FROM scan_cipher_suites
        WHERE scan_id = ? AND accepted = 1
        """,
        (scan_id,),
    )

    cipher_suites = cursor.fetchall()
    total_checked = 0
    passed_count = 0

    for cs_id, port, cipher_name, tls_version in cipher_suites:
        total_checked += 1

        # Look up in IANA
        cursor.execute(
            """
            SELECT value, recommended
            FROM iana_tls_cipher_suites
            WHERE description = ? COLLATE NOCASE
            """,
            (cipher_name,),
        )
        iana_result = cursor.fetchone()

        iana_value = None
        iana_recommended = None
        if iana_result:
            iana_value = iana_result[0]
            iana_recommended = iana_result[1]

        # Look up in BSI TR-02102-2
        cursor.execute(
            """
            SELECT valid_until
            FROM bsi_tr_02102_2_tls
            WHERE name = ? COLLATE NOCASE AND tls_version = ? AND category = 'cipher_suite'
            """,
            (cipher_name, tls_version),
        )
        bsi_result = cursor.fetchone()

        bsi_approved = bsi_result is not None
        bsi_valid_until = bsi_result[0] if bsi_result else None

        # Determine if passed
        passed = False
        severity = "warning"
        details = []

        # BSI check (sole compliance criterion)
        if bsi_approved:
            current_year = datetime.now(timezone.utc).year
            if bsi_valid_until and bsi_valid_until >= current_year:
                details.append(f"BSI: Approved until {bsi_valid_until}")
                passed = True
                severity = "info"
            else:
                details.append(f"BSI: Expired (valid until {bsi_valid_until})")
                passed = False
                severity = "critical"
        else:
            details.append("BSI: Not in approved list")
            passed = False
            severity = "critical"

        # IANA check (informational only, does not affect passed status)
        if iana_recommended == "Y":
            details.append("IANA: Recommended")
        elif iana_recommended == "D":
            details.append("IANA: Deprecated/Transitioning")
        elif iana_recommended == "N":
            details.append("IANA: Not Recommended")
        else:
            details.append("IANA: Unknown")

        if passed:
            passed_count += 1

        # Insert compliance record
        cursor.execute(
            """
            INSERT INTO scan_compliance_status (
                scan_id, port, timestamp, check_type, item_name,
                iana_value, iana_recommended, bsi_approved, bsi_valid_until,
                passed, severity, details
            ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
            """,
            (
                scan_id,
                port,
                timestamp,
                "cipher_suite",
                cipher_name,
                iana_value,
                iana_recommended,
                bsi_approved,
                bsi_valid_until,
                passed,
                severity,
                "; ".join(details),
            ),
        )

    return total_checked, passed_count


def _check_supported_group_compliance(
    cursor: sqlite3.Cursor,
    scan_id: int,
    timestamp: str,
) -> tuple[int, int]:
    """Check supported groups compliance against IANA and BSI standards.

    Returns:
        Tuple of (total_checked, passed_count)

    """
    # Get supported groups from scan
    cursor.execute(
        """
        SELECT id, port, group_name
        FROM scan_supported_groups
        WHERE scan_id = ?
        """,
        (scan_id,),
    )

    groups = cursor.fetchall()
    total_checked = 0
    passed_count = 0

    for group_id, port, group_name in groups:
        total_checked += 1

        # Look up in IANA
        cursor.execute(
            """
            SELECT value, recommended
            FROM iana_tls_supported_groups
            WHERE description = ? COLLATE NOCASE
            """,
            (group_name,),
        )
        iana_result = cursor.fetchone()

        iana_value = None
        iana_recommended = None
        if iana_result:
            iana_value = iana_result[0]
            iana_recommended = iana_result[1]

        # Look up in BSI TR-02102-2 (DH groups for TLS 1.2 and 1.3)
        cursor.execute(
            """
            SELECT valid_until
            FROM bsi_tr_02102_2_tls
            WHERE name = ? COLLATE NOCASE AND category = 'dh_group'
            ORDER BY valid_until DESC
            LIMIT 1
            """,
            (group_name,),
        )
        bsi_result = cursor.fetchone()

        bsi_approved = bsi_result is not None
        bsi_valid_until = bsi_result[0] if bsi_result else None

        # Determine if passed
        passed = False
        severity = "warning"
        details = []

        # BSI check (sole compliance criterion)
        if bsi_approved:
            current_year = datetime.now(timezone.utc).year
            if bsi_valid_until and bsi_valid_until >= current_year:
                details.append(f"BSI: Approved until {bsi_valid_until}")
                passed = True
                severity = "info"
            else:
                details.append(f"BSI: Expired (valid until {bsi_valid_until})")
                passed = False
                severity = "critical"
        else:
            details.append("BSI: Not in approved list")
            passed = False
            severity = "critical"

        # IANA check (informational only, does not affect passed status)
        if iana_recommended == "Y":
            details.append("IANA: Recommended")
        elif iana_recommended == "D":
            details.append("IANA: Deprecated/Transitioning")
        elif iana_recommended == "N":
            details.append("IANA: Not Recommended")
        else:
            details.append("IANA: Unknown")

        if passed:
            passed_count += 1

        # Insert compliance record
        cursor.execute(
            """
            INSERT INTO scan_compliance_status (
                scan_id, port, timestamp, check_type, item_name,
                iana_value, iana_recommended, bsi_approved, bsi_valid_until,
                passed, severity, details
            ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
            """,
            (
                scan_id,
                port,
                timestamp,
                "supported_group",
                group_name,
                iana_value,
                iana_recommended,
                bsi_approved,
                bsi_valid_until,
                passed,
                severity,
                "; ".join(details),
            ),
        )

    return total_checked, passed_count
71
src/sslysze_scan/db/schema.py
Normal file
@@ -0,0 +1,71 @@
"""Database schema version management."""

import sqlite3

SCHEMA_VERSION = 5

# Error messages
ERR_SCHEMA_READ = "Error reading schema version"


def get_schema_version(db_path: str) -> int | None:
    """Get current schema version from database.

    Args:
        db_path: Path to database file

    Returns:
        Schema version number or None if not found

    Raises:
        sqlite3.Error: If database access fails

    """
    try:
        conn = sqlite3.connect(db_path)
        cursor = conn.cursor()

        cursor.execute(
            """
            SELECT name FROM sqlite_master
            WHERE type='table' AND name='schema_version'
            """,
        )
        if not cursor.fetchone():
            conn.close()
            return None

        cursor.execute("SELECT MAX(version) FROM schema_version")
        result = cursor.fetchone()
        conn.close()

        return result[0] if result and result[0] is not None else None
    except sqlite3.Error as e:
        raise sqlite3.Error(f"{ERR_SCHEMA_READ}: {e}") from e


def check_schema_version(db_path: str) -> bool:
    """Check if database schema version is compatible.

    Args:
        db_path: Path to database file

    Returns:
        True if schema version matches

    Raises:
        ValueError: If schema version is incompatible

    """
    current_version = get_schema_version(db_path)

    if current_version is None:
        raise ValueError(f"No schema version found in database: {db_path}")

    if current_version != SCHEMA_VERSION:
        raise ValueError(
            f"Schema version mismatch: database has version {current_version}, "
            f"expected version {SCHEMA_VERSION}",
        )

    return True
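The `schema_version` table that `get_schema_version` reads can be exercised against an in-memory database. The single-column DDL below is an assumption for illustration; the real table definition lives elsewhere in the package:

```python
import sqlite3

# Assumed minimal layout: one version row per applied migration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE schema_version (version INTEGER)")
conn.executemany("INSERT INTO schema_version (version) VALUES (?)", [(4,), (5,)])

# MAX(version) mirrors the SELECT used by get_schema_version: the highest
# recorded version is the database's current schema version.
version = conn.execute("SELECT MAX(version) FROM schema_version").fetchone()[0]
conn.close()
```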
893
src/sslysze_scan/db/writer.py
Normal file
@@ -0,0 +1,893 @@
"""Database writer for scan results."""

import socket
import sqlite3
from datetime import datetime
from typing import Any

from sslyze.scanner.models import ServerScanResult

# OpenSSL constants
OPENSSL_EVP_PKEY_DH = 28


def save_scan_results(
    db_path: str,
    hostname: str,
    ports: list[int],
    scan_results: dict[int, Any],
    scan_start_time: datetime,
    scan_duration: float,
) -> int:
    """Save scan results to database.

    Args:
        db_path: Path to database file
        hostname: Scanned hostname
        ports: List of scanned ports
        scan_results: Dictionary mapping port to SSLyze ServerScanResult object
        scan_start_time: When scan started
        scan_duration: Scan duration in seconds

    Returns:
        scan_id of inserted record

    Raises:
        sqlite3.Error: If database operations fail

    """
    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()

    try:
        # Insert main scan record
        scan_id = _insert_scan_record(
            cursor,
            hostname,
            ports,
            scan_start_time,
            scan_duration,
        )

        # Save results for each port
        for port, scan_result in scan_results.items():
            # Save cipher suites (all TLS versions)
            _save_cipher_suites(cursor, scan_id, port, scan_result, "ssl_3.0")
            _save_cipher_suites(cursor, scan_id, port, scan_result, "1.0")
            _save_cipher_suites(cursor, scan_id, port, scan_result, "1.1")
            _save_cipher_suites(cursor, scan_id, port, scan_result, "1.2")
            _save_cipher_suites(cursor, scan_id, port, scan_result, "1.3")

            # Save supported groups (elliptic curves)
            _save_supported_groups(cursor, scan_id, port, scan_result)

            # Extract and save DHE groups from cipher suites
            _save_dhe_groups_from_cipher_suites(cursor, scan_id, port, scan_result)

            # Save certificate information
            _save_certificates(cursor, scan_id, port, scan_result)

            # Save vulnerability checks
            _save_vulnerabilities(cursor, scan_id, port, scan_result)

            # Save protocol features
            _save_protocol_features(cursor, scan_id, port, scan_result)

            # Save session features
            _save_session_features(cursor, scan_id, port, scan_result)

            # Save HTTP headers
            _save_http_headers(cursor, scan_id, port, scan_result)

        conn.commit()
        return scan_id

    except (sqlite3.Error, OSError, ValueError) as e:
        conn.rollback()
        raise sqlite3.Error(f"Error saving scan results: {e}") from e
    finally:
        conn.close()


def _insert_scan_record(
    cursor: sqlite3.Cursor,
    hostname: str,
    ports: list[int],
    scan_start_time: datetime,
    scan_duration: float,
) -> int:
    """Insert main scan record and return scan_id."""
    ports_str = ",".join(str(p) for p in ports)

    cursor.execute(
        """
        INSERT INTO scans (
            timestamp, hostname, ports, scan_duration_seconds
        ) VALUES (?, ?, ?, ?)
        """,
        (
            scan_start_time.isoformat(),
            hostname,
            ports_str,
            scan_duration,
        ),
    )

    scan_id = cursor.lastrowid

    # Resolve and store host information
    _save_host_info(cursor, scan_id, hostname)

    return scan_id


def _resolve_hostname(hostname: str) -> tuple[str | None, str | None]:
    """Resolve hostname to IPv4 and IPv6 addresses.

    Args:
        hostname: Hostname to resolve

    Returns:
        Tuple of (ipv4, ipv6) addresses or (None, None) if resolution fails

    """
    ipv4 = None
    ipv6 = None

    try:
        # Get all address info for the hostname
        addr_info = socket.getaddrinfo(hostname, None)

        for info in addr_info:
            family = info[0]
            addr = info[4][0]

            if family == socket.AF_INET and ipv4 is None:
                ipv4 = addr
            elif family == socket.AF_INET6 and ipv6 is None:
                ipv6 = addr

            # Stop if we have both
            if ipv4 and ipv6:
                break

    except (OSError, socket.gaierror):
        pass

    return ipv4, ipv6


def _save_host_info(cursor: sqlite3.Cursor, scan_id: int, hostname: str) -> None:
    """Save host information with resolved IP addresses.

    Args:
        cursor: Database cursor
        scan_id: Scan ID
        hostname: Hostname to resolve and store

    """
    ipv4, ipv6 = _resolve_hostname(hostname)

    cursor.execute(
        """
        INSERT INTO scanned_hosts (
            scan_id, fqdn, ipv4, ipv6
        ) VALUES (?, ?, ?, ?)
        """,
        (scan_id, hostname, ipv4, ipv6),
    )


def _get_ffdhe_group_name(dh_size: int) -> str | None:
    """Map DH key size to ffdhe group name.

    Args:
        dh_size: DH key size in bits

    Returns:
        ffdhe group name or None if not a standard size

    """
    ffdhe_map = {
        2048: "ffdhe2048",
        3072: "ffdhe3072",
        4096: "ffdhe4096",
        6144: "ffdhe6144",
        8192: "ffdhe8192",
    }
    return ffdhe_map.get(dh_size)


def _get_ffdhe_iana_value(group_name: str) -> int | None:
    """Get IANA value for ffdhe group name.

    Args:
        group_name: ffdhe group name (e.g., "ffdhe2048")

    Returns:
        IANA value or None if unknown

    """
    iana_map = {
        "ffdhe2048": 256,
        "ffdhe3072": 257,
        "ffdhe4096": 258,
        "ffdhe6144": 259,
        "ffdhe8192": 260,
    }
    return iana_map.get(group_name)


def _save_cipher_suites(
    cursor: sqlite3.Cursor,
    scan_id: int,
    port: int,
    scan_result: ServerScanResult,
    tls_version: str,
) -> None:
    """Save cipher suites for specific TLS version."""
    from sslyze import ScanCommandAttemptStatusEnum

    # Map version to result attribute
    version_map = {
        "ssl_3.0": "ssl_3_0_cipher_suites",
        "1.0": "tls_1_0_cipher_suites",
        "1.1": "tls_1_1_cipher_suites",
        "1.2": "tls_1_2_cipher_suites",
        "1.3": "tls_1_3_cipher_suites",
    }

    if tls_version not in version_map:
        return

    if not scan_result.scan_result:
        return

    cipher_attempt = getattr(scan_result.scan_result, version_map[tls_version])

    if cipher_attempt.status != ScanCommandAttemptStatusEnum.COMPLETED:
        return

    cipher_result = cipher_attempt.result
    if not cipher_result:
        return

    # Insert accepted cipher suites
    for accepted_cipher in cipher_result.accepted_cipher_suites:
        cursor.execute(
            """
            INSERT INTO scan_cipher_suites (
                scan_id, port, tls_version, cipher_suite_name, accepted,
                iana_value, key_size, is_anonymous
            ) VALUES (?, ?, ?, ?, ?, ?, ?, ?)
            """,
            (
                scan_id,
                port,
                tls_version,
                accepted_cipher.cipher_suite.name,
                True,
                None,  # IANA value mapping would go here
                accepted_cipher.cipher_suite.key_size,
                accepted_cipher.cipher_suite.is_anonymous,
            ),
        )

    # Insert rejected cipher suites (if available)
    if hasattr(cipher_result, "rejected_cipher_suites"):
        for rejected_cipher in cipher_result.rejected_cipher_suites:
            cursor.execute(
                """
                INSERT INTO scan_cipher_suites (
                    scan_id, port, tls_version, cipher_suite_name, accepted,
                    iana_value, key_size, is_anonymous
                ) VALUES (?, ?, ?, ?, ?, ?, ?, ?)
                """,
                (
                    scan_id,
                    port,
                    tls_version,
                    rejected_cipher.cipher_suite.name,
                    False,
                    None,
                    rejected_cipher.cipher_suite.key_size,
                    rejected_cipher.cipher_suite.is_anonymous,
                ),
            )


def _save_supported_groups(
    cursor: sqlite3.Cursor,
    scan_id: int,
    port: int,
    scan_result: ServerScanResult,
) -> None:
    """Save supported elliptic curves / DH groups."""
    from sslyze import ScanCommandAttemptStatusEnum

    if not scan_result.scan_result:
        return

    ec_attempt = scan_result.scan_result.elliptic_curves

    if ec_attempt.status != ScanCommandAttemptStatusEnum.COMPLETED:
        return

    ec_result = ec_attempt.result
    if not ec_result:
        return

    for curve in ec_result.supported_curves:
        cursor.execute(
            """
            INSERT INTO scan_supported_groups (
                scan_id, port, group_name, iana_value, openssl_nid
            ) VALUES (?, ?, ?, ?, ?)
            """,
            (
                scan_id,
                port,
                curve.name,
                None,  # IANA value mapping would go here
                curve.openssl_nid,
            ),
        )


def _is_dhe_key_exchange(ephemeral_key: Any) -> bool:
    """Check if ephemeral key is DHE (Finite Field DH).

    Args:
        ephemeral_key: Ephemeral key object from cipher suite

    Returns:
        True if DHE key exchange

    """
    if hasattr(ephemeral_key, "type_name"):
        return ephemeral_key.type_name == "DH"
    if hasattr(ephemeral_key, "type"):
        return ephemeral_key.type == OPENSSL_EVP_PKEY_DH
    return False


def _process_dhe_from_cipher_result(
    cursor: sqlite3.Cursor,
    scan_id: int,
    port: int,
    cipher_result: Any,
    discovered_groups: set[str],
) -> None:
    """Process cipher result to extract and save DHE groups.

    Args:
        cursor: Database cursor
        scan_id: Scan ID
        port: Port number
        cipher_result: Cipher suite scan result
        discovered_groups: Set of already discovered groups

    """
    if not cipher_result:
        return

    for accepted_cipher in cipher_result.accepted_cipher_suites:
        ephemeral_key = accepted_cipher.ephemeral_key

        if not ephemeral_key:
            continue

        if not _is_dhe_key_exchange(ephemeral_key):
            continue

        # Get DH key size and map to ffdhe group name
        dh_size = ephemeral_key.size
        group_name = _get_ffdhe_group_name(dh_size)

        if not group_name or group_name in discovered_groups:
            continue

        # Get IANA value and insert into database
        iana_value = _get_ffdhe_iana_value(group_name)
        cursor.execute(
            """
            INSERT INTO scan_supported_groups (
                scan_id, port, group_name, iana_value, openssl_nid
            ) VALUES (?, ?, ?, ?, ?)
            """,
            (
                scan_id,
                port,
                group_name,
                iana_value,
                None,
            ),
        )
        discovered_groups.add(group_name)


def _save_dhe_groups_from_cipher_suites(
    cursor: sqlite3.Cursor,
    scan_id: int,
    port: int,
    scan_result: Any,  # ServerScanResult with dynamic ephemeral_key attributes
) -> None:
    """Extract and save DHE groups from cipher suite ephemeral keys.

    Analyzes accepted cipher suites to find DHE key exchanges and extracts
    the ffdhe group size (e.g., ffdhe2048, ffdhe3072).

    Args:
        cursor: Database cursor
        scan_id: Scan ID
        port: Port number
        scan_result: SSLyze ServerScanResult. Uses Any because ephemeral_key
            has dynamic attributes (type_name, type, size) that vary by
            implementation.

    """
    from sslyze import ScanCommandAttemptStatusEnum

    if not scan_result.scan_result:
        return

    discovered_groups: set[str] = set()

    tls_versions = [
        ("ssl_3.0", "ssl_3_0_cipher_suites"),
        ("1.0", "tls_1_0_cipher_suites"),
        ("1.1", "tls_1_1_cipher_suites"),
        ("1.2", "tls_1_2_cipher_suites"),
        ("1.3", "tls_1_3_cipher_suites"),
    ]

    for _tls_version, attr_name in tls_versions:
        cipher_attempt = getattr(scan_result.scan_result, attr_name)

        if cipher_attempt.status != ScanCommandAttemptStatusEnum.COMPLETED:
            continue

        _process_dhe_from_cipher_result(
            cursor, scan_id, port, cipher_attempt.result, discovered_groups
        )


def _save_certificates(
    cursor: sqlite3.Cursor,
    scan_id: int,
    port: int,
    scan_result: ServerScanResult,
) -> None:
    """Save certificate information."""
    from sslyze import ScanCommandAttemptStatusEnum

    if not scan_result.scan_result:
        return

    cert_attempt = scan_result.scan_result.certificate_info

    if cert_attempt.status != ScanCommandAttemptStatusEnum.COMPLETED:
        return

    cert_result = cert_attempt.result
    if not cert_result:
        return

    for cert_deployment in cert_result.certificate_deployments:
        for position, cert in enumerate(cert_deployment.received_certificate_chain):
            # Get public key info
            public_key = cert.public_key()
            key_type = public_key.__class__.__name__
            key_bits = None
            if hasattr(public_key, "key_size"):
                key_bits = public_key.key_size

            # Get signature algorithm (the attribute can be None, e.g. for
            # Ed25519 certificates, so guard against that too)
            sig_alg = None
            if getattr(cert, "signature_hash_algorithm", None) is not None:
                sig_alg = cert.signature_hash_algorithm.name

            cursor.execute(
                """
                INSERT INTO scan_certificates (
                    scan_id, port, position, subject, issuer, serial_number,
                    not_before, not_after, key_type, key_bits,
                    signature_algorithm, fingerprint_sha256
                ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
                """,
                (
                    scan_id,
                    port,
                    position,
                    cert.subject.rfc4514_string(),
                    cert.issuer.rfc4514_string() if hasattr(cert, "issuer") else None,
                    str(cert.serial_number),
                    cert.not_valid_before_utc.isoformat()
                    if hasattr(cert, "not_valid_before_utc")
                    else None,
                    cert.not_valid_after_utc.isoformat()
                    if hasattr(cert, "not_valid_after_utc")
                    else None,
                    key_type,
                    key_bits,
                    sig_alg,
                    cert.fingerprint_sha256
                    if hasattr(cert, "fingerprint_sha256")
                    else None,
                ),
            )


def _save_vulnerabilities(
    cursor: sqlite3.Cursor,
    scan_id: int,
    port: int,
    scan_result: ServerScanResult,
) -> None:
    """Save vulnerability scan results."""
    from sslyze import ScanCommandAttemptStatusEnum

    if not scan_result.scan_result:
        return

    # Heartbleed
    heartbleed_attempt = scan_result.scan_result.heartbleed
    if heartbleed_attempt.status == ScanCommandAttemptStatusEnum.COMPLETED:
        heartbleed_result = heartbleed_attempt.result
        if heartbleed_result:
            cursor.execute(
                """
                INSERT INTO scan_vulnerabilities (
                    scan_id, port, vuln_type, vulnerable, details
                ) VALUES (?, ?, ?, ?, ?)
                """,
                (
                    scan_id,
                    port,
                    "heartbleed",
                    heartbleed_result.is_vulnerable_to_heartbleed,
                    None,
                ),
            )

    # ROBOT
    robot_attempt = scan_result.scan_result.robot
    if robot_attempt.status == ScanCommandAttemptStatusEnum.COMPLETED:
        robot_result = robot_attempt.result
        if robot_result:
            # Check if robot_result has the attribute
            vulnerable = False
            details = None
            if hasattr(robot_result, "robot_result_enum"):
                vulnerable = (
                    robot_result.robot_result_enum.name != "NOT_VULNERABLE_NO_ORACLE"
                )
                details = robot_result.robot_result_enum.name

            cursor.execute(
                """
                INSERT INTO scan_vulnerabilities (
                    scan_id, port, vuln_type, vulnerable, details
                ) VALUES (?, ?, ?, ?, ?)
                """,
                (
                    scan_id,
                    port,
                    "robot",
                    vulnerable,
                    details,
                ),
            )

    # OpenSSL CCS Injection
    ccs_attempt = scan_result.scan_result.openssl_ccs_injection
    if ccs_attempt.status == ScanCommandAttemptStatusEnum.COMPLETED:
        ccs_result = ccs_attempt.result
        if ccs_result:
            cursor.execute(
                """
                INSERT INTO scan_vulnerabilities (
                    scan_id, port, vuln_type, vulnerable, details
                ) VALUES (?, ?, ?, ?, ?)
                """,
                (
                    scan_id,
                    port,
                    "openssl_ccs_injection",
                    ccs_result.is_vulnerable_to_ccs_injection,
                    None,
                ),
            )


def _insert_protocol_feature(
    cursor: sqlite3.Cursor,
    scan_id: int,
    port: int,
    feature_type: str,
    supported: bool,
    details: str | None = None,
) -> None:
    """Insert protocol feature into database.

    Args:
        cursor: Database cursor
        scan_id: Scan ID
        port: Port number
        feature_type: Feature type identifier
        supported: Whether feature is supported
        details: Optional details string

    """
    cursor.execute(
        """
        INSERT INTO scan_protocol_features (
            scan_id, port, feature_type, supported, details
        ) VALUES (?, ?, ?, ?, ?)
        """,
        (scan_id, port, feature_type, supported, details),
    )


def _save_protocol_features(
|
||||
cursor: sqlite3.Cursor,
|
||||
scan_id: int,
|
||||
port: int,
|
||||
scan_result: ServerScanResult,
|
||||
) -> None:
|
||||
"""Save protocol features (compression, early data, fallback SCSV, extended master secret)."""
|
||||
from sslyze import ScanCommandAttemptStatusEnum
|
||||
|
||||
if not scan_result.scan_result:
|
||||
return
|
||||
|
||||
# TLS Compression
|
||||
compression_attempt = scan_result.scan_result.tls_compression
|
||||
    if compression_attempt.status == ScanCommandAttemptStatusEnum.COMPLETED:
        compression_result = compression_attempt.result
        if compression_result:
            supported = (
                hasattr(compression_result, "supports_compression")
                and compression_result.supports_compression
            )
            _insert_protocol_feature(
                cursor,
                scan_id,
                port,
                "tls_compression",
                supported,
                "TLS compression is deprecated and should not be used",
            )

    # TLS 1.3 Early Data
    early_data_attempt = scan_result.scan_result.tls_1_3_early_data
    if early_data_attempt.status == ScanCommandAttemptStatusEnum.COMPLETED:
        early_data_result = early_data_attempt.result
        if early_data_result:
            supported = (
                hasattr(early_data_result, "supports_early_data")
                and early_data_result.supports_early_data
            )
            details = None
            if supported and hasattr(early_data_result, "max_early_data_size"):
                details = f"max_early_data_size: {early_data_result.max_early_data_size}"
            _insert_protocol_feature(
                cursor, scan_id, port, "tls_1_3_early_data", supported, details
            )

    # TLS Fallback SCSV
    fallback_attempt = scan_result.scan_result.tls_fallback_scsv
    if fallback_attempt.status == ScanCommandAttemptStatusEnum.COMPLETED:
        fallback_result = fallback_attempt.result
        if fallback_result:
            supported = (
                hasattr(fallback_result, "supports_fallback_scsv")
                and fallback_result.supports_fallback_scsv
            )
            _insert_protocol_feature(
                cursor,
                scan_id,
                port,
                "tls_fallback_scsv",
                supported,
                "Prevents downgrade attacks",
            )

    # Extended Master Secret
    ems_attempt = scan_result.scan_result.tls_extended_master_secret
    if ems_attempt.status == ScanCommandAttemptStatusEnum.COMPLETED:
        ems_result = ems_attempt.result
        if ems_result:
            supported = (
                hasattr(ems_result, "supports_extended_master_secret")
                and ems_result.supports_extended_master_secret
            )
            _insert_protocol_feature(
                cursor,
                scan_id,
                port,
                "tls_extended_master_secret",
                supported,
                "RFC 7627 - Mitigates certain TLS attacks",
            )


def _save_session_features(
    cursor: sqlite3.Cursor,
    scan_id: int,
    port: int,
    scan_result: ServerScanResult,
) -> None:
    """Save session features (renegotiation and resumption)."""
    from sslyze import ScanCommandAttemptStatusEnum

    if not scan_result.scan_result:
        return

    # Session Renegotiation
    renegotiation_attempt = scan_result.scan_result.session_renegotiation
    if renegotiation_attempt.status == ScanCommandAttemptStatusEnum.COMPLETED:
        renegotiation_result = renegotiation_attempt.result
        if renegotiation_result:
            client_initiated = (
                hasattr(renegotiation_result, "is_client_renegotiation_supported")
                and renegotiation_result.is_client_renegotiation_supported
            )
            secure = (
                hasattr(renegotiation_result, "supports_secure_renegotiation")
                and renegotiation_result.supports_secure_renegotiation
            )
            cursor.execute(
                """
                INSERT INTO scan_session_features (
                    scan_id, port, feature_type, client_initiated, secure,
                    session_id_supported, ticket_supported,
                    attempted_resumptions, successful_resumptions, details
                ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
                """,
                (
                    scan_id,
                    port,
                    "session_renegotiation",
                    client_initiated,
                    secure,
                    None,
                    None,
                    None,
                    None,
                    None,
                ),
            )

    # Session Resumption
    resumption_attempt = scan_result.scan_result.session_resumption
    if resumption_attempt.status == ScanCommandAttemptStatusEnum.COMPLETED:
        resumption_result = resumption_attempt.result
        if resumption_result:
            session_id_supported = False
            ticket_supported = False
            attempted = 0
            successful = 0

            if hasattr(resumption_result, "session_id_resumption_result"):
                session_id_resumption = resumption_result.session_id_resumption_result
                if session_id_resumption:
                    session_id_supported = (
                        hasattr(
                            session_id_resumption,
                            "is_session_id_resumption_supported",
                        )
                        and session_id_resumption.is_session_id_resumption_supported
                    )
                    if hasattr(session_id_resumption, "attempted_resumptions_count"):
                        attempted += session_id_resumption.attempted_resumptions_count
                    if hasattr(session_id_resumption, "successful_resumptions_count"):
                        successful += session_id_resumption.successful_resumptions_count

            if hasattr(resumption_result, "tls_ticket_resumption_result"):
                ticket_resumption = resumption_result.tls_ticket_resumption_result
                if ticket_resumption:
                    ticket_supported = (
                        hasattr(ticket_resumption, "is_tls_ticket_resumption_supported")
                        and ticket_resumption.is_tls_ticket_resumption_supported
                    )
                    if hasattr(ticket_resumption, "attempted_resumptions_count"):
                        attempted += ticket_resumption.attempted_resumptions_count
                    if hasattr(ticket_resumption, "successful_resumptions_count"):
                        successful += ticket_resumption.successful_resumptions_count

            cursor.execute(
                """
                INSERT INTO scan_session_features (
                    scan_id, port, feature_type, client_initiated, secure,
                    session_id_supported, ticket_supported,
                    attempted_resumptions, successful_resumptions, details
                ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
                """,
                (
                    scan_id,
                    port,
                    "session_resumption",
                    None,
                    None,
                    session_id_supported,
                    ticket_supported,
                    attempted,
                    successful,
                    None,
                ),
            )
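Both feature types share the `scan_session_features` table, with the columns that do not apply left as NULL. A minimal in-memory sketch of the resulting row shapes (the DDL here is a simplified assumption, not the project's real schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    """
    CREATE TABLE scan_session_features (
        scan_id INTEGER, port INTEGER, feature_type TEXT,
        client_initiated BOOLEAN, secure BOOLEAN,
        session_id_supported BOOLEAN, ticket_supported BOOLEAN,
        attempted_resumptions INTEGER, successful_resumptions INTEGER,
        details TEXT
    )
    """
)
# Renegotiation row: all resumption columns stay NULL
conn.execute(
    "INSERT INTO scan_session_features VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)",
    (1, 443, "session_renegotiation", False, True, None, None, None, None, None),
)
# Resumption row: the renegotiation columns stay NULL
conn.execute(
    "INSERT INTO scan_session_features VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)",
    (1, 443, "session_resumption", None, None, True, True, 10, 10, None),
)
rows = conn.execute(
    "SELECT feature_type, secure, ticket_supported FROM scan_session_features"
).fetchall()
print(rows)  # [('session_renegotiation', 1, None), ('session_resumption', None, 1)]
```

Note that sqlite3 stores Python booleans as integers, so `True`/`False` come back as `1`/`0`.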


def _save_http_headers(
    cursor: sqlite3.Cursor,
    scan_id: int,
    port: int,
    scan_result: ServerScanResult,
) -> None:
    """Save HTTP security headers."""
    from sslyze import ScanCommandAttemptStatusEnum

    if not scan_result.scan_result:
        return

    http_headers_attempt = scan_result.scan_result.http_headers
    if http_headers_attempt.status != ScanCommandAttemptStatusEnum.COMPLETED:
        return

    http_headers_result = http_headers_attempt.result
    if not http_headers_result:
        return

    # Each security header follows the same insert pattern
    header_attrs = (
        ("strict_transport_security_header", "Strict-Transport-Security"),
        ("public_key_pins_header", "Public-Key-Pins"),
        ("expect_ct_header", "Expect-CT"),
    )
    for attr_name, header_name in header_attrs:
        if not hasattr(http_headers_result, attr_name):
            continue
        header = getattr(http_headers_result, attr_name)
        cursor.execute(
            """
            INSERT INTO scan_http_headers (
                scan_id, port, header_name, header_value, is_present
            ) VALUES (?, ?, ?, ?, ?)
            """,
            (
                scan_id,
                port,
                header_name,
                str(header) if header else None,
                header is not None,
            ),
        )
216
src/sslysze_scan/output.py
Normal file
@@ -0,0 +1,216 @@
"""Console output module for scan results."""

from typing import Any

from sslyze.scanner.models import ServerScanResult


def print_scan_results(
    scan_result: ServerScanResult, compliance_stats: dict[str, Any]
) -> None:
    """Print scan results to console.

    Args:
        scan_result: SSLyze ServerScanResult object
        compliance_stats: Compliance check statistics

    """
    print("\n" + "=" * 70)
    print(
        f"Scan results for {scan_result.server_location.hostname}:{scan_result.server_location.port}",
    )
    print("=" * 70)

    # Connectivity status
    print(f"\nConnection status: {scan_result.scan_status.name}")

    if scan_result.connectivity_result:
        print(
            f"Highest TLS version: {scan_result.connectivity_result.highest_tls_version_supported}",
        )
        print(f"Cipher suite: {scan_result.connectivity_result.cipher_suite_supported}")

    if not scan_result.scan_result:
        print("\nNo scan results available (connection error)")
        return

    # TLS 1.2 Cipher Suites
    _print_cipher_suites(scan_result, "1.2")

    # TLS 1.3 Cipher Suites
    _print_cipher_suites(scan_result, "1.3")

    # Supported Groups
    _print_supported_groups(scan_result)

    # Certificates
    _print_certificates(scan_result)

    # Vulnerabilities
    _print_vulnerabilities(scan_result)

    # Compliance Summary
    print("\n" + "-" * 70)
    print("Compliance summary:")
    print("-" * 70)
    print(
        f"Cipher suites: {compliance_stats['cipher_suites_passed']}/{compliance_stats['cipher_suites_checked']} compliant",
    )
    print(
        f"Supported groups: {compliance_stats['supported_groups_passed']}/{compliance_stats['supported_groups_checked']} compliant",
    )
    print("=" * 70 + "\n")
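`print_scan_results` only reads four counter keys from `compliance_stats`. A hedged sketch of the expected shape (key names are taken from the f-strings in the function; the values are invented):

```python
compliance_stats = {
    "cipher_suites_checked": 12,
    "cipher_suites_passed": 9,
    "supported_groups_checked": 5,
    "supported_groups_passed": 4,
}

# Same passed/checked formatting as the summary block
line = (
    f"Cipher suites: {compliance_stats['cipher_suites_passed']}"
    f"/{compliance_stats['cipher_suites_checked']} compliant"
)
print(line)  # Cipher suites: 9/12 compliant
```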


def _print_cipher_suites(scan_result: ServerScanResult, tls_version: str) -> None:
    """Print cipher suites for a specific TLS version."""
    from sslyze import ScanCommandAttemptStatusEnum

    version_map = {
        "1.2": "tls_1_2_cipher_suites",
        "1.3": "tls_1_3_cipher_suites",
    }

    if tls_version not in version_map:
        return

    cipher_attempt = getattr(scan_result.scan_result, version_map[tls_version])

    if cipher_attempt.status != ScanCommandAttemptStatusEnum.COMPLETED:
        print(f"\nTLS {tls_version} cipher suites: not available")
        return

    cipher_result = cipher_attempt.result
    if not cipher_result or not cipher_result.accepted_cipher_suites:
        print(f"\nTLS {tls_version} cipher suites: none accepted")
        return

    print(
        f"\nTLS {tls_version} cipher suites ({len(cipher_result.accepted_cipher_suites)} accepted):",
    )
    for cs in cipher_result.accepted_cipher_suites:
        print(f"  • {cs.cipher_suite.name}")


def _print_supported_groups(scan_result: ServerScanResult) -> None:
    """Print supported elliptic curves / DH groups."""
    from sslyze import ScanCommandAttemptStatusEnum

    ec_attempt = scan_result.scan_result.elliptic_curves

    if ec_attempt.status != ScanCommandAttemptStatusEnum.COMPLETED:
        print("\nSupported groups: not available")
        return

    ec_result = ec_attempt.result
    if not ec_result or not ec_result.supported_curves:
        print("\nSupported groups: none found")
        return

    print(f"\nSupported groups ({len(ec_result.supported_curves)}):")
    for curve in ec_result.supported_curves:
        print(f"  • {curve.name}")


def _print_certificates(scan_result: ServerScanResult) -> None:
    """Print certificate information."""
    from sslyze import ScanCommandAttemptStatusEnum

    cert_attempt = scan_result.scan_result.certificate_info

    if cert_attempt.status != ScanCommandAttemptStatusEnum.COMPLETED:
        print("\nCertificates: not available")
        return

    cert_result = cert_attempt.result
    if not cert_result:
        return

    print("\nCertificates:")
    for cert_deployment in cert_result.certificate_deployments:
        for i, cert in enumerate(cert_deployment.received_certificate_chain):
            print(f"\n  Certificate #{i}:")
            print(f"    Subject: {cert.subject.rfc4514_string()}")
            print(f"    Serial: {cert.serial_number}")

            if hasattr(cert, "not_valid_before_utc") and hasattr(
                cert,
                "not_valid_after_utc",
            ):
                print(
                    f"    Valid from: {cert.not_valid_before_utc.strftime('%Y-%m-%d %H:%M:%S UTC')}",
                )
                print(
                    f"    Valid until: {cert.not_valid_after_utc.strftime('%Y-%m-%d %H:%M:%S UTC')}",
                )

            public_key = cert.public_key()
            key_type = public_key.__class__.__name__
            key_bits = (
                public_key.key_size if hasattr(public_key, "key_size") else "unknown"
            )
            print(f"    Key: {key_type} ({key_bits} bits)")


def _print_vulnerabilities(scan_result: ServerScanResult) -> None:
    """Print vulnerability scan results."""
    from sslyze import ScanCommandAttemptStatusEnum

    print("\nSecurity checks:")

    # Heartbleed
    heartbleed_attempt = scan_result.scan_result.heartbleed
    if heartbleed_attempt.status == ScanCommandAttemptStatusEnum.COMPLETED:
        heartbleed_result = heartbleed_attempt.result
        if heartbleed_result:
            status = (
                "VULNERABLE ⚠️"
                if heartbleed_result.is_vulnerable_to_heartbleed
                else "OK ✓"
            )
            print(f"  • Heartbleed: {status}")

    # ROBOT
    robot_attempt = scan_result.scan_result.robot
    if robot_attempt.status == ScanCommandAttemptStatusEnum.COMPLETED:
        robot_result = robot_attempt.result
        if robot_result:
            vulnerable = False
            if hasattr(robot_result, "robot_result_enum"):
                vulnerable = (
                    robot_result.robot_result_enum.name != "NOT_VULNERABLE_NO_ORACLE"
                )
            elif hasattr(robot_result, "robot_result"):
                vulnerable = str(robot_result.robot_result) != "NOT_VULNERABLE_NO_ORACLE"
            status = "VULNERABLE ⚠️" if vulnerable else "OK ✓"
            print(f"  • ROBOT: {status}")

    # OpenSSL CCS Injection
    ccs_attempt = scan_result.scan_result.openssl_ccs_injection
    if ccs_attempt.status == ScanCommandAttemptStatusEnum.COMPLETED:
        ccs_result = ccs_attempt.result
        if ccs_result:
            status = (
                "VULNERABLE ⚠️" if ccs_result.is_vulnerable_to_ccs_injection else "OK ✓"
            )
            print(f"  • OpenSSL CCS Injection: {status}")


def print_error(message: str) -> None:
    """Print error message to console.

    Args:
        message: Error message

    """
    print(f"\n✗ Error: {message}\n")


def print_success(message: str) -> None:
    """Print success message to console.

    Args:
        message: Success message

    """
    print(f"\n✓ {message}\n")
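The `version_map` + `getattr` lookup in `_print_cipher_suites` is a small string-to-attribute dispatch; a standalone sketch using an invented stand-in for `scan_result.scan_result`:

```python
from types import SimpleNamespace

version_map = {
    "1.2": "tls_1_2_cipher_suites",
    "1.3": "tls_1_3_cipher_suites",
}

# invented stand-in for the SSLyze per-command result container
results = SimpleNamespace(
    tls_1_2_cipher_suites="attempt-1.2",
    tls_1_3_cipher_suites="attempt-1.3",
)

assert getattr(results, version_map["1.2"]) == "attempt-1.2"
assert "1.1" not in version_map  # unknown versions are simply skipped
```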
75
src/sslysze_scan/protocol_loader.py
Normal file
@@ -0,0 +1,75 @@
"""Module for loading protocol-port mappings from CSV file."""

import csv
from pathlib import Path

# Port constants
MIN_PORT_NUMBER = 1
MAX_PORT_NUMBER = 65535


def load_protocol_mappings() -> dict[int, str]:
    """Load protocol-port mappings from CSV file.

    Returns:
        Dictionary mapping port numbers to protocol names.

    Raises:
        FileNotFoundError: If CSV file does not exist.
        ValueError: If CSV file is malformed.

    """
    # Get path to CSV file relative to this module
    csv_path = Path(__file__).parent / "data" / "protocols.csv"

    if not csv_path.exists():
        raise FileNotFoundError(f"Protocol mappings file not found: {csv_path}")

    mappings: dict[int, str] = {}

    try:
        with csv_path.open(encoding="utf-8") as f:
            reader = csv.DictReader(f)

            for row_num, row in enumerate(
                reader,
                start=2,
            ):  # start=2 because header is line 1
                try:
                    protocol = row["protocol"].strip()
                    port = int(row["port"].strip())

                    if port < MIN_PORT_NUMBER or port > MAX_PORT_NUMBER:
                        raise ValueError(
                            f"Invalid port number {port} on line {row_num}",
                        )

                    mappings[port] = protocol

                except KeyError as e:
                    raise ValueError(
                        f"Missing column {e} in CSV file on line {row_num}",
                    ) from e
                except ValueError as e:
                    raise ValueError(
                        f"Invalid data in CSV file on line {row_num}: {e}"
                    ) from e

    except (OSError, csv.Error) as e:
        raise ValueError(f"Error reading CSV file: {e}") from e

    return mappings


def get_protocol_for_port(port: int) -> str | None:
    """Get the protocol name for a given port number.

    Args:
        port: Port number to check.

    Returns:
        Protocol name if found, None otherwise.

    """
    mappings = load_protocol_mappings()
    return mappings.get(port)
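The loader expects a header row with `protocol` and `port` columns. A self-contained sketch of the same `DictReader` logic over an in-memory CSV (the sample rows are invented, not the shipped `protocols.csv`):

```python
import csv
import io

sample = "protocol,port\nhttps,443\nldaps,636\n"

mappings: dict[int, str] = {}
reader = csv.DictReader(io.StringIO(sample))
for row_num, row in enumerate(reader, start=2):  # header is line 1
    mappings[int(row["port"].strip())] = row["protocol"].strip()

print(mappings)  # {443: 'https', 636: 'ldaps'}
```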
50
src/sslysze_scan/reporter/__init__.py
Normal file
@@ -0,0 +1,50 @@
"""Report generation module for scan results."""

from .csv_export import generate_csv_reports
from .markdown_export import generate_markdown_report
from .query import get_scan_data, get_scan_metadata, list_scans
from .rst_export import generate_rest_report

__all__ = [
    "generate_csv_reports",
    "generate_markdown_report",
    "generate_report",
    "generate_rest_report",
    "get_scan_data",
    "get_scan_metadata",
    "list_scans",
]


def generate_report(
    db_path: str,
    scan_id: int,
    report_type: str,
    output: str | None = None,
    output_dir: str = ".",
) -> list[str]:
    """Generate report for scan.

    Args:
        db_path: Path to database file
        scan_id: Scan ID
        report_type: Report type ('csv', 'markdown', or 'rest')
        output: Output file for markdown/rest (auto-generated if None)
        output_dir: Output directory for CSV/reST files

    Returns:
        List of generated file paths

    Raises:
        ValueError: If report type is unknown

    """
    if report_type == "markdown":
        file_path = generate_markdown_report(db_path, scan_id, output)
        return [file_path]
    if report_type == "csv":
        return generate_csv_reports(db_path, scan_id, output_dir)
    if report_type in ("rest", "rst"):
        file_path = generate_rest_report(db_path, scan_id, output, output_dir)
        return [file_path]
    raise ValueError(f"Unknown report type: {report_type}")
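`generate_report` routes a report-type string to one of the exporters, accepting both `rest` and `rst` spellings. A minimal stand-in showing the same routing and its failure mode (the returned filenames are placeholders for the real exporter calls):

```python
def dispatch(report_type: str) -> str:
    # mirrors the if-chain in generate_report
    if report_type == "markdown":
        return "report.md"
    if report_type == "csv":
        return "report.csv"
    if report_type in ("rest", "rst"):
        return "report.rst"
    raise ValueError(f"Unknown report type: {report_type}")


assert dispatch("markdown") == "report.md"
assert dispatch("rst") == "report.rst"  # alias for "rest"
try:
    dispatch("pdf")
except ValueError as e:
    assert "Unknown report type" in str(e)
```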
536
src/sslysze_scan/reporter/csv_export.py
Normal file
@@ -0,0 +1,536 @@
"""CSV report generation with granular file structure for reST integration."""

import csv
import json
import sqlite3
from pathlib import Path
from typing import Any

from .query import get_scan_data


def _get_headers(db_path: str, export_type: str) -> list[str]:
    """Get CSV headers from database.

    Args:
        db_path: Path to database file
        export_type: Type of export (e.g. 'cipher_suites_accepted')

    Returns:
        List of column headers

    Raises:
        ValueError: If no headers are stored for the export type

    """
    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()
    cursor.execute(
        "SELECT headers FROM csv_export_metadata WHERE export_type = ?",
        (export_type,),
    )
    row = cursor.fetchone()
    conn.close()

    if row:
        return json.loads(row[0])
    raise ValueError(f"No headers found for export_type: {export_type}")


def _format_bool(
    value: bool | None,
    true_val: str = "Yes",
    false_val: str = "No",
    none_val: str = "-",
) -> str:
    """Format boolean value to string representation.

    Args:
        value: Boolean value to format
        true_val: String representation for True
        false_val: String representation for False
        none_val: String representation for None

    Returns:
        Formatted string

    """
    if value is True:
        return true_val
    if value is False:
        return false_val
    return none_val
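`_format_bool` maps the tri-state `True`/`False`/`None` onto display strings, which is why the exporters can pass `dict.get(...)` results straight through. A standalone copy of the same logic:

```python
def format_bool(value, true_val="Yes", false_val="No", none_val="-"):
    # identical tri-state logic to _format_bool above
    if value is True:
        return true_val
    if value is False:
        return false_val
    return none_val


assert format_bool(True) == "Yes"
assert format_bool(False) == "No"
assert format_bool(None) == "-"          # absent data renders as a dash
assert format_bool(True, true_val="✓") == "✓"
```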


def _write_csv(filepath: Path, headers: list[str], rows: list[list[Any]]) -> None:
    """Write data to CSV file.

    Args:
        filepath: Path to CSV file
        headers: List of column headers
        rows: List of data rows

    """
    with filepath.open("w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(headers)
        writer.writerows(rows)


def _export_summary(
    output_dir: Path,
    summary: dict[str, Any],
    db_path: str,
) -> list[str]:
    """Export summary statistics to CSV.

    Args:
        output_dir: Output directory path
        summary: Summary data dictionary
        db_path: Path to database file

    Returns:
        List of generated file paths

    """
    summary_file = output_dir / "summary.csv"
    rows = [
        ["Scanned Ports", summary.get("total_ports", 0)],
        ["Ports with TLS Support", summary.get("successful_ports", 0)],
        ["Cipher Suites Checked", summary.get("total_cipher_suites", 0)],
        [
            "Cipher Suites Compliant",
            (
                f"{summary.get('compliant_cipher_suites', 0)} "
                f"({summary.get('cipher_suite_percentage', 0)}%)"
            ),
        ],
        ["Supported Groups Checked", summary.get("total_groups", 0)],
        [
            "Supported Groups Compliant",
            (
                f"{summary.get('compliant_groups', 0)} "
                f"({summary.get('group_percentage', 0)}%)"
            ),
        ],
        [
            "Critical Vulnerabilities",
            summary.get("critical_vulnerabilities", 0),
        ],
    ]
    headers = _get_headers(db_path, "summary")
    _write_csv(summary_file, headers, rows)
    return [str(summary_file)]


def _export_cipher_suites(
    output_dir: Path,
    port: int,
    cipher_suites: dict[str, dict[str, list]],
    db_path: str,
) -> list[str]:
    """Export cipher suites to CSV files.

    Args:
        output_dir: Output directory path
        port: Port number
        cipher_suites: Cipher suites data per TLS version
        db_path: Path to database file

    Returns:
        List of generated file paths

    """
    generated = []

    for tls_version, suites in cipher_suites.items():
        if suites.get("accepted"):
            filepath = output_dir / f"{port}_cipher_suites_{tls_version}_accepted.csv"
            rows = [
                [
                    suite["name"],
                    suite.get("iana_recommended", "-"),
                    _format_bool(suite.get("bsi_approved")),
                    suite.get("bsi_valid_until", "-"),
                    _format_bool(suite.get("compliant")),
                ]
                for suite in suites["accepted"]
            ]
            headers = _get_headers(db_path, "cipher_suites_accepted")
            _write_csv(filepath, headers, rows)
            generated.append(str(filepath))

        if suites.get("rejected"):
            filepath = output_dir / f"{port}_cipher_suites_{tls_version}_rejected.csv"
            rows = [
                [
                    suite["name"],
                    suite.get("iana_recommended", "-"),
                    _format_bool(suite.get("bsi_approved")),
                    suite.get("bsi_valid_until", "-"),
                ]
                for suite in suites["rejected"]
            ]
            headers = _get_headers(db_path, "cipher_suites_rejected")
            _write_csv(filepath, headers, rows)
            generated.append(str(filepath))

    return generated


def _export_supported_groups(
    output_dir: Path,
    port: int,
    groups: list[dict[str, Any]],
    db_path: str,
) -> list[str]:
    """Export supported groups to CSV.

    Args:
        output_dir: Output directory path
        port: Port number
        groups: List of supported groups
        db_path: Path to database file

    Returns:
        List of generated file paths

    """
    filepath = output_dir / f"{port}_supported_groups.csv"
    rows = [
        [
            group["name"],
            group.get("iana_recommended", "-"),
            _format_bool(group.get("bsi_approved")),
            group.get("bsi_valid_until", "-"),
            _format_bool(group.get("compliant")),
        ]
        for group in groups
    ]
    headers = _get_headers(db_path, "supported_groups")
    _write_csv(filepath, headers, rows)
    return [str(filepath)]


def _export_missing_groups(
    output_dir: Path,
    port: int,
    missing: dict[str, list[dict[str, Any]]],
    db_path: str,
) -> list[str]:
    """Export missing recommended groups to CSV.

    Args:
        output_dir: Output directory path
        port: Port number
        missing: Dictionary with bsi_approved and iana_recommended groups
        db_path: Path to database file

    Returns:
        List of generated file paths

    """
    generated = []

    if missing.get("bsi_approved"):
        filepath = output_dir / f"{port}_missing_groups_bsi.csv"
        rows = [
            [
                group["name"],
                ", ".join(group.get("tls_versions", [])),
                group.get("valid_until", "-"),
            ]
            for group in missing["bsi_approved"]
        ]
        headers = _get_headers(db_path, "missing_groups_bsi")
        _write_csv(filepath, headers, rows)
        generated.append(str(filepath))

    if missing.get("iana_recommended"):
        filepath = output_dir / f"{port}_missing_groups_iana.csv"
        rows = [
            [group["name"], group.get("iana_value", "-")]
            for group in missing["iana_recommended"]
        ]
        headers = _get_headers(db_path, "missing_groups_iana")
        _write_csv(filepath, headers, rows)
        generated.append(str(filepath))

    return generated


def _export_certificates(
    output_dir: Path,
    port: int,
    certificates: list[dict[str, Any]],
    db_path: str,
) -> list[str]:
    """Export certificates to CSV.

    Args:
        output_dir: Output directory path
        port: Port number
        certificates: List of certificate data
        db_path: Path to database file

    Returns:
        List of generated file paths

    """
    filepath = output_dir / f"{port}_certificates.csv"
    rows = [
        [
            cert["position"],
            cert["subject"],
            cert["issuer"],
            cert["not_before"],
            cert["not_after"],
            cert["key_type"],
            cert["key_bits"],
            _format_bool(cert.get("compliant")),
        ]
        for cert in certificates
    ]
    headers = _get_headers(db_path, "certificates")
    _write_csv(filepath, headers, rows)
    return [str(filepath)]


def _export_vulnerabilities(
    output_dir: Path,
    port: int,
    vulnerabilities: list[dict[str, Any]],
    db_path: str,
) -> list[str]:
    """Export vulnerabilities to CSV.

    Args:
        output_dir: Output directory path
        port: Port number
        vulnerabilities: List of vulnerability data
        db_path: Path to database file

    Returns:
        List of generated file paths

    """
    filepath = output_dir / f"{port}_vulnerabilities.csv"
    rows = [
        [
            vuln["type"],
            _format_bool(vuln["vulnerable"]),
            vuln.get("details", "-"),
        ]
        for vuln in vulnerabilities
    ]
    headers = _get_headers(db_path, "vulnerabilities")
    _write_csv(filepath, headers, rows)
    return [str(filepath)]


def _export_protocol_features(
    output_dir: Path,
    port: int,
    features: list[dict[str, Any]],
    db_path: str,
) -> list[str]:
    """Export protocol features to CSV.

    Args:
        output_dir: Output directory path
        port: Port number
        features: List of protocol feature data
        db_path: Path to database file

    Returns:
        List of generated file paths

    """
    filepath = output_dir / f"{port}_protocol_features.csv"
    rows = [
        [
            feature["name"],
            _format_bool(feature["supported"]),
            feature.get("details", "-"),
        ]
        for feature in features
    ]
    headers = _get_headers(db_path, "protocol_features")
    _write_csv(filepath, headers, rows)
    return [str(filepath)]


def _export_session_features(
    output_dir: Path,
    port: int,
    features: list[dict[str, Any]],
    db_path: str,
) -> list[str]:
    """Export session features to CSV.

    Args:
        output_dir: Output directory path
        port: Port number
        features: List of session feature data
        db_path: Path to database file

    Returns:
        List of generated file paths

    """
    filepath = output_dir / f"{port}_session_features.csv"
    rows = [
        [
            feature["type"],
            _format_bool(feature.get("client_initiated")),
            _format_bool(feature.get("secure")),
            _format_bool(feature.get("session_id_supported")),
            _format_bool(feature.get("ticket_supported")),
            feature.get("details", "-"),
        ]
        for feature in features
    ]
    headers = _get_headers(db_path, "session_features")
    _write_csv(filepath, headers, rows)
    return [str(filepath)]


def _export_http_headers(
    output_dir: Path,
    port: int,
    headers: list[dict[str, Any]],
    db_path: str,
) -> list[str]:
    """Export HTTP headers to CSV.

    Args:
        output_dir: Output directory path
        port: Port number
        headers: List of HTTP header data
        db_path: Path to database file

    Returns:
        List of generated file paths

    """
    filepath = output_dir / f"{port}_http_headers.csv"
    rows = [
        [
            header["name"],
            _format_bool(header["is_present"]),
            header.get("value", "-"),
        ]
        for header in headers
    ]
    csv_headers = _get_headers(db_path, "http_headers")
    _write_csv(filepath, csv_headers, rows)
    return [str(filepath)]


def _export_compliance_status(
    output_dir: Path,
    port: int,
    compliance: dict[str, Any],
    db_path: str,
) -> list[str]:
    """Export compliance status to CSV.

    Args:
        output_dir: Output directory path
        port: Port number
        compliance: Compliance data dictionary
        db_path: Path to database file

    Returns:
        List of generated file paths

    """
    filepath = output_dir / f"{port}_compliance_status.csv"
    rows = []

    if "cipher_suites_checked" in compliance:
        rows.append(
            [
                "Cipher Suites",
                compliance["cipher_suites_checked"],
                compliance["cipher_suites_passed"],
                f"{compliance['cipher_suite_percentage']}%",
            ],
        )

    if "groups_checked" in compliance:
        rows.append(
            [
                "Supported Groups",
                compliance["groups_checked"],
                compliance["groups_passed"],
                f"{compliance['group_percentage']}%",
            ],
        )

    if rows:
        headers = _get_headers(db_path, "compliance_status")
        _write_csv(filepath, headers, rows)
        return [str(filepath)]

    return []


def _has_tls_support(port_data: dict[str, Any]) -> bool:
    """Check if port has TLS support.

    Args:
        port_data: Port data dictionary

    Returns:
        True if port has TLS support

    """
    return bool(
        port_data.get("cipher_suites")
        or port_data.get("supported_groups")
        or port_data.get("certificates")
        or port_data.get("tls_version"),
    )


# Export handlers mapping: (data_key, handler_function)
EXPORT_HANDLERS = (
    ("cipher_suites", _export_cipher_suites),
    ("supported_groups", _export_supported_groups),
    ("missing_recommended_groups", _export_missing_groups),
    ("certificates", _export_certificates),
    ("vulnerabilities", _export_vulnerabilities),
    ("protocol_features", _export_protocol_features),
    ("session_features", _export_session_features),
    ("http_headers", _export_http_headers),
    ("compliance", _export_compliance_status),
)


def generate_csv_reports(
    db_path: str,
    scan_id: int,
    output_dir: str = ".",
) -> list[str]:
    """Generate granular CSV reports for scan.

    Args:
        db_path: Path to database file
        scan_id: Scan ID
        output_dir: Output directory for CSV files

    Returns:
        List of generated file paths

    """
    data = get_scan_data(db_path, scan_id)
    output_dir_path = Path(output_dir)
    output_dir_path.mkdir(parents=True, exist_ok=True)

    generated_files = []

    generated_files.extend(
        _export_summary(output_dir_path, data.get("summary", {}), db_path),
    )

    for port_data in data["ports_data"].values():
        if not _has_tls_support(port_data):
            continue

        port = port_data["port"]

        for data_key, handler_func in EXPORT_HANDLERS:
            if port_data.get(data_key):
                generated_files.extend(
                    handler_func(output_dir_path, port, port_data[data_key], db_path),
                )

    return generated_files
|
||||
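The `EXPORT_HANDLERS` tuple drives a simple dispatch loop: each `(data_key, handler)` pair is tried against the port data, and empty sections are skipped. A minimal self-contained sketch of the same pattern, with toy handlers standing in for the real `_export_*` functions (names here are hypothetical):

```python
from typing import Any, Callable


# Toy handlers standing in for the real _export_* exporters.
def export_ciphers(port: int, payload: Any) -> list[str]:
    return [f"ciphers_port{port}.csv"]


def export_certs(port: int, payload: Any) -> list[str]:
    return [f"certs_port{port}.csv"]


HANDLERS: tuple[tuple[str, Callable[[int, Any], list[str]]], ...] = (
    ("cipher_suites", export_ciphers),
    ("certificates", export_certs),
)


def run_exports(port_data: dict[str, Any]) -> list[str]:
    files: list[str] = []
    for key, handler in HANDLERS:
        if port_data.get(key):  # skip absent or empty sections, as generate_csv_reports does
            files.extend(handler(port_data["port"], port_data[key]))
    return files


print(run_exports({"port": 443, "cipher_suites": ["x"], "certificates": []}))
```

Adding a new export kind only means appending one tuple entry; the loop itself never changes.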
37
src/sslysze_scan/reporter/markdown_export.py
Normal file
@@ -0,0 +1,37 @@
"""Markdown report generation using shared template utilities."""

from .query import _generate_recommendations, get_scan_data
from .template_utils import (
    build_template_context,
    generate_report_id,
    prepare_output_path,
    render_template_to_file,
)


def generate_markdown_report(
    db_path: str, scan_id: int, output_file: str | None = None,
) -> str:
    """Generate markdown report for scan.

    Args:
        db_path: Path to database file
        scan_id: Scan ID
        output_file: Optional output file path (auto-generated if None)

    Returns:
        Path to generated report file

    """
    data = get_scan_data(db_path, scan_id)
    metadata = data["metadata"]
    report_id = generate_report_id(metadata)

    context = build_template_context(data)
    context["recommendations"] = _generate_recommendations(data)

    default_filename = f"compliance_report_{report_id}.md"
    output_path = prepare_output_path(output_file, ".", default_filename)

    return render_template_to_file("report.md.j2", context, output_path)
534
src/sslysze_scan/reporter/query.py
Normal file
@@ -0,0 +1,534 @@
"""Report generation module for scan results."""

import sqlite3
from typing import Any

# Compliance thresholds
COMPLIANCE_WARNING_THRESHOLD = 50.0


def list_scans(db_path: str) -> list[dict[str, Any]]:
    """List all available scans in the database.

    Args:
        db_path: Path to database file

    Returns:
        List of scan dictionaries with metadata

    """
    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()

    cursor.execute(
        """
        SELECT scan_id, timestamp, hostname, ports, scan_duration_seconds
        FROM scans
        ORDER BY scan_id DESC
        """,
    )

    scans = []
    for row in cursor.fetchall():
        scans.append(
            {
                "scan_id": row[0],
                "timestamp": row[1],
                "hostname": row[2],
                "ports": row[3],
                "duration": row[4],
            },
        )

    conn.close()
    return scans


def get_scan_metadata(db_path: str, scan_id: int) -> dict[str, Any] | None:
    """Get metadata for a specific scan.

    Args:
        db_path: Path to database file
        scan_id: Scan ID

    Returns:
        Dictionary with scan metadata or None if not found

    """
    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()

    cursor.execute(
        """
        SELECT s.scan_id, s.timestamp, s.hostname, s.ports, s.scan_duration_seconds,
               h.fqdn, h.ipv4, h.ipv6
        FROM scans s
        LEFT JOIN scanned_hosts h ON s.scan_id = h.scan_id
        WHERE s.scan_id = ?
        """,
        (scan_id,),
    )

    row = cursor.fetchone()
    conn.close()

    if not row:
        return None

    return {
        "scan_id": row[0],
        "timestamp": row[1],
        "hostname": row[2],
        "ports": row[3].split(",") if row[3] else [],
        "duration": row[4],
        "fqdn": row[5] or row[2],
        "ipv4": row[6],
        "ipv6": row[7],
    }


def get_scan_data(db_path: str, scan_id: int) -> dict[str, Any]:
    """Get all scan data for report generation.

    Args:
        db_path: Path to database file
        scan_id: Scan ID

    Returns:
        Dictionary with all scan data

    """
    metadata = get_scan_metadata(db_path, scan_id)
    if not metadata:
        raise ValueError(f"Scan ID {scan_id} not found")

    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()

    data = {
        "metadata": metadata,
        "ports_data": {},
    }

    # Get data for each port
    for port in metadata["ports"]:
        port_num = int(port)
        port_data = {
            "port": port_num,
            "status": "completed",
            "tls_version": None,
            "cipher_suites": {},
            "supported_groups": [],
            "certificates": [],
            "vulnerabilities": [],
            "protocol_features": [],
            "session_features": [],
            "http_headers": [],
            "compliance": {},
        }

        # Cipher suites using view
        cursor.execute(
            """
            SELECT tls_version, cipher_suite_name, accepted, iana_value, key_size, is_anonymous,
                   iana_recommended_final, bsi_approved_final, bsi_valid_until_final, compliant
            FROM v_cipher_suites_with_compliance
            WHERE scan_id = ? AND port = ?
            ORDER BY tls_version, accepted DESC, cipher_suite_name
            """,
            (scan_id, port_num),
        )

        rejected_counts = {}
        for row in cursor.fetchall():
            tls_version = row[0]
            if tls_version not in port_data["cipher_suites"]:
                port_data["cipher_suites"][tls_version] = {
                    "accepted": [],
                    "rejected": [],
                }
                rejected_counts[tls_version] = 0

            suite = {
                "name": row[1],
                "accepted": row[2],
                "iana_value": row[3],
                "key_size": row[4],
                "is_anonymous": row[5],
            }

            if row[2]:  # accepted
                suite["iana_recommended"] = row[6]
                suite["bsi_approved"] = row[7]
                suite["bsi_valid_until"] = row[8]
                suite["compliant"] = row[9]
                port_data["cipher_suites"][tls_version]["accepted"].append(suite)
            else:  # rejected
                rejected_counts[tls_version] += 1
                # Only include rejected if BSI-approved OR IANA-recommended
                if row[7] or row[6] == "Y":
                    suite["iana_recommended"] = row[6]
                    suite["bsi_approved"] = row[7]
                    suite["bsi_valid_until"] = row[8]
                    suite["compliant"] = False
                    port_data["cipher_suites"][tls_version]["rejected"].append(suite)

        # Store rejected counts
        for tls_version in port_data["cipher_suites"]:
            port_data["cipher_suites"][tls_version]["rejected_total"] = (
                rejected_counts.get(tls_version, 0)
            )

        # Determine highest TLS version
        if port_data["cipher_suites"]:
            tls_versions = list(port_data["cipher_suites"].keys())
            version_order = ["ssl_3.0", "1.0", "1.1", "1.2", "1.3"]
            for version in reversed(version_order):
                if version in tls_versions:
                    port_data["tls_version"] = version
                    break

        # Supported groups using view
        cursor.execute(
            """
            SELECT group_name, iana_value, openssl_nid,
                   iana_recommended, bsi_approved, bsi_valid_until, compliant
            FROM v_supported_groups_with_compliance
            WHERE scan_id = ? AND port = ?
            ORDER BY group_name
            """,
            (scan_id, port_num),
        )

        for row in cursor.fetchall():
            port_data["supported_groups"].append(
                {
                    "name": row[0],
                    "iana_value": row[1],
                    "openssl_nid": row[2],
                    "iana_recommended": row[3],
                    "bsi_approved": row[4],
                    "bsi_valid_until": row[5],
                    "compliant": row[6],
                },
            )

        # Certificates using view
        cursor.execute(
            """
            SELECT position, subject, issuer, serial_number, not_before, not_after,
                   key_type, key_bits, signature_algorithm, fingerprint_sha256,
                   compliant, compliance_details
            FROM v_certificates_with_compliance
            WHERE scan_id = ? AND port = ?
            ORDER BY position
            """,
            (scan_id, port_num),
        )

        for row in cursor.fetchall():
            port_data["certificates"].append(
                {
                    "position": row[0],
                    "subject": row[1],
                    "issuer": row[2],
                    "serial_number": row[3],
                    "not_before": row[4],
                    "not_after": row[5],
                    "key_type": row[6],
                    "key_bits": row[7],
                    "signature_algorithm": row[8],
                    "fingerprint_sha256": row[9],
                    "compliant": row[10] if row[10] is not None else None,
                    "compliance_details": row[11] if row[11] else None,
                },
            )

        # Vulnerabilities
        cursor.execute(
            """
            SELECT vuln_type, vulnerable, details
            FROM scan_vulnerabilities
            WHERE scan_id = ? AND port = ?
            ORDER BY vuln_type
            """,
            (scan_id, port_num),
        )

        for row in cursor.fetchall():
            port_data["vulnerabilities"].append(
                {
                    "type": row[0],
                    "vulnerable": row[1],
                    "details": row[2],
                },
            )

        # Protocol features
        cursor.execute(
            """
            SELECT feature_type, supported, details
            FROM scan_protocol_features
            WHERE scan_id = ? AND port = ?
            ORDER BY feature_type
            """,
            (scan_id, port_num),
        )

        for row in cursor.fetchall():
            port_data["protocol_features"].append(
                {
                    "name": row[0],
                    "supported": row[1],
                    "details": row[2],
                },
            )

        # Session features
        cursor.execute(
            """
            SELECT feature_type, client_initiated, secure, session_id_supported,
                   ticket_supported, attempted_resumptions, successful_resumptions, details
            FROM scan_session_features
            WHERE scan_id = ? AND port = ?
            ORDER BY feature_type
            """,
            (scan_id, port_num),
        )

        for row in cursor.fetchall():
            port_data["session_features"].append(
                {
                    "type": row[0],
                    "client_initiated": row[1],
                    "secure": row[2],
                    "session_id_supported": row[3],
                    "ticket_supported": row[4],
                    "attempted_resumptions": row[5],
                    "successful_resumptions": row[6],
                    "details": row[7],
                },
            )

        # HTTP headers
        cursor.execute(
            """
            SELECT header_name, header_value, is_present
            FROM scan_http_headers
            WHERE scan_id = ? AND port = ?
            ORDER BY header_name
            """,
            (scan_id, port_num),
        )

        for row in cursor.fetchall():
            port_data["http_headers"].append(
                {
                    "name": row[0],
                    "value": row[1],
                    "is_present": row[2],
                },
            )

        # Compliance summary using view
        cursor.execute(
            """
            SELECT check_type, total, passed, percentage
            FROM v_port_compliance_summary
            WHERE scan_id = ? AND port = ?
            """,
            (scan_id, port_num),
        )

        for row in cursor.fetchall():
            check_type = row[0]
            total = row[1]
            passed = row[2]
            percentage = row[3]

            if check_type == "cipher_suite":
                port_data["compliance"]["cipher_suites_checked"] = total
                port_data["compliance"]["cipher_suites_passed"] = passed
                port_data["compliance"]["cipher_suite_percentage"] = f"{percentage:.1f}"
            elif check_type == "supported_group":
                port_data["compliance"]["groups_checked"] = total
                port_data["compliance"]["groups_passed"] = passed
                port_data["compliance"]["group_percentage"] = f"{percentage:.1f}"

        # Get missing recommended groups for this port
        port_data["missing_recommended_groups"] = _get_missing_recommended_groups(
            cursor,
            scan_id,
            port_num,
        )

        data["ports_data"][port_num] = port_data

    conn.close()

    # Calculate overall summary
    data["summary"] = _calculate_summary(data)

    return data


def _get_missing_recommended_groups(
    cursor: sqlite3.Cursor,
    scan_id: int,
    port: int,
) -> dict[str, list[dict[str, Any]]]:
    """Get recommended groups that are not offered by the server using views.

    Args:
        cursor: Database cursor
        scan_id: Scan ID
        port: Port number

    Returns:
        Dictionary with 'bsi_approved' and 'iana_recommended' lists

    """
    missing = {"bsi_approved": [], "iana_recommended": []}

    # Get missing BSI-approved groups using view
    cursor.execute(
        """
        SELECT group_name, tls_version, valid_until
        FROM v_missing_bsi_groups
        WHERE scan_id = ?
        ORDER BY group_name, tls_version
        """,
        (scan_id,),
    )

    bsi_groups = {}
    for row in cursor.fetchall():
        group_name = row[0]
        tls_version = row[1]
        valid_until = row[2]

        if group_name not in bsi_groups:
            bsi_groups[group_name] = {
                "name": group_name,
                "tls_versions": [],
                "valid_until": valid_until,
            }
        bsi_groups[group_name]["tls_versions"].append(tls_version)

    missing["bsi_approved"] = list(bsi_groups.values())

    # Get missing IANA-recommended groups using view
    cursor.execute(
        """
        SELECT group_name, iana_value
        FROM v_missing_iana_groups
        WHERE scan_id = ?
        ORDER BY CAST(iana_value AS INTEGER)
        """,
        (scan_id,),
    )

    for row in cursor.fetchall():
        missing["iana_recommended"].append(
            {
                "name": row[0],
                "iana_value": row[1],
            },
        )

    return missing


def _calculate_summary(data: dict[str, Any]) -> dict[str, Any]:
    """Calculate overall summary statistics."""
    total_cipher_suites = 0
    compliant_cipher_suites = 0
    total_groups = 0
    compliant_groups = 0
    critical_vulnerabilities = 0
    ports_with_tls = 0
    ports_without_tls = 0

    for port_data in data["ports_data"].values():
        # Check if port has TLS support
        has_tls = (
            port_data.get("cipher_suites")
            or port_data.get("supported_groups")
            or port_data.get("certificates")
            or port_data.get("tls_version")
        )

        if has_tls:
            ports_with_tls += 1
            compliance = port_data.get("compliance", {})
            total_cipher_suites += compliance.get("cipher_suites_checked", 0)
            compliant_cipher_suites += compliance.get("cipher_suites_passed", 0)
            total_groups += compliance.get("groups_checked", 0)
            compliant_groups += compliance.get("groups_passed", 0)

            for vuln in port_data.get("vulnerabilities", []):
                if vuln.get("vulnerable"):
                    critical_vulnerabilities += 1
        else:
            ports_without_tls += 1

    cipher_suite_percentage = (
        (compliant_cipher_suites / total_cipher_suites * 100)
        if total_cipher_suites > 0
        else 0
    )
    group_percentage = (compliant_groups / total_groups * 100) if total_groups > 0 else 0

    return {
        "total_ports": len(data["ports_data"]),
        "successful_ports": ports_with_tls,
        "ports_without_tls": ports_without_tls,
        "total_cipher_suites": total_cipher_suites,
        "compliant_cipher_suites": compliant_cipher_suites,
        "cipher_suite_percentage": f"{cipher_suite_percentage:.1f}",
        "total_groups": total_groups,
        "compliant_groups": compliant_groups,
        "group_percentage": f"{group_percentage:.1f}",
        "critical_vulnerabilities": critical_vulnerabilities,
    }


def _generate_recommendations(data: dict[str, Any]) -> list[dict[str, str]]:
    """Generate recommendations based on scan results."""
    recommendations = []

    # Check for vulnerabilities
    for port_data in data["ports_data"].values():
        for vuln in port_data.get("vulnerabilities", []):
            if vuln.get("vulnerable"):
                recommendations.append(
                    {
                        "severity": "CRITICAL",
                        "message": f"Port {port_data['port']}: {vuln['type']} vulnerability found. Immediate update required.",
                    },
                )

    # Check for low compliance
    summary = data.get("summary", {})
    cipher_percentage = float(summary.get("cipher_suite_percentage", 0))
    if cipher_percentage < COMPLIANCE_WARNING_THRESHOLD:
        recommendations.append(
            {
                "severity": "WARNING",
                "message": f"Only {cipher_percentage:.1f}% of cipher suites are compliant. Disable insecure cipher suites.",
            },
        )

    # Check for deprecated TLS versions
    for port_data in data["ports_data"].values():
        for tls_version in port_data.get("cipher_suites", {}):
            if tls_version in ["ssl_3.0", "1.0", "1.1"]:
                if port_data["cipher_suites"][tls_version]["accepted"]:
                    recommendations.append(
                        {
                            "severity": "WARNING",
                            "message": f"Port {port_data['port']}: Deprecated TLS version {tls_version} is supported. Disable TLS 1.0 and 1.1.",
                        },
                    )

    return recommendations
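`list_scans` and the other query helpers open a fresh `sqlite3` connection per call and read rows by position. The shape of that query can be exercised against an in-memory database; the minimal `scans` schema below is inferred from the SELECT above, not taken from the project's real migrations:

```python
import sqlite3

# In-memory stand-in for the scans table (schema assumed from the query above).
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE scans (
        scan_id INTEGER PRIMARY KEY,
        timestamp TEXT, hostname TEXT, ports TEXT,
        scan_duration_seconds REAL
    )"""
)
conn.execute(
    "INSERT INTO scans VALUES (1, '2024-01-01T12:00:00', 'example.com', '443,636', 4.2)"
)

cur = conn.execute(
    "SELECT scan_id, timestamp, hostname, ports, scan_duration_seconds "
    "FROM scans ORDER BY scan_id DESC"
)
rows = [
    {"scan_id": r[0], "timestamp": r[1], "hostname": r[2], "ports": r[3], "duration": r[4]}
    for r in cur.fetchall()
]
conn.close()
print(rows[0]["hostname"])
```

Note that `ports` is stored as a comma-separated string, which is why `get_scan_metadata` splits it back into a list.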
39
src/sslysze_scan/reporter/rst_export.py
Normal file
@@ -0,0 +1,39 @@
"""reStructuredText report generation with CSV includes using shared utilities."""

from .csv_export import generate_csv_reports
from .query import get_scan_data
from .template_utils import (
    build_template_context,
    prepare_output_path,
    render_template_to_file,
)


def generate_rest_report(
    db_path: str, scan_id: int, output_file: str | None = None, output_dir: str = ".",
) -> str:
    """Generate reStructuredText report with CSV includes.

    Args:
        db_path: Path to database file
        scan_id: Scan ID
        output_file: Output file path (optional)
        output_dir: Output directory for report and CSV files

    Returns:
        Path to generated report file

    """
    data = get_scan_data(db_path, scan_id)

    # Generate CSV files first
    generate_csv_reports(db_path, scan_id, output_dir)

    # Build template context
    context = build_template_context(data)

    # Prepare output path - always use fixed filename
    default_filename = "compliance_report.rst"
    output_path = prepare_output_path(output_file, output_dir, default_filename)

    return render_template_to_file("report.reST.j2", context, output_path)
168
src/sslysze_scan/reporter/template_utils.py
Normal file
@@ -0,0 +1,168 @@
"""Shared utilities for report template rendering."""

from datetime import datetime, timezone
from pathlib import Path
from typing import Any

from jinja2 import Environment, FileSystemLoader, select_autoescape


def format_tls_version(version: str) -> str:
    """Format TLS version string for display.

    Args:
        version: TLS version identifier (e.g., "1.2", "ssl_3.0")

    Returns:
        Formatted version string (e.g., "TLS 1.2", "SSL 3.0")

    """
    version_map = {
        "ssl_3.0": "SSL 3.0",
        "1.0": "TLS 1.0",
        "1.1": "TLS 1.1",
        "1.2": "TLS 1.2",
        "1.3": "TLS 1.3",
    }
    return version_map.get(version, version)


def create_jinja_env() -> Environment:
    """Create Jinja2 environment with standard configuration.

    Returns:
        Configured Jinja2 Environment with custom filters

    """
    template_dir = Path(__file__).parent.parent / "templates"
    env = Environment(
        loader=FileSystemLoader(str(template_dir)),
        autoescape=select_autoescape(["html", "xml"]),
        trim_blocks=True,
        lstrip_blocks=True,
    )
    env.filters["format_tls_version"] = format_tls_version
    return env


def generate_report_id(metadata: dict[str, Any]) -> str:
    """Generate report ID from scan metadata.

    Args:
        metadata: Scan metadata dictionary containing timestamp

    Returns:
        Report ID in format YYYYMMDD_<scanid>

    """
    try:
        dt = datetime.fromisoformat(metadata["timestamp"])
        date_str = dt.strftime("%Y%m%d")
    except (ValueError, KeyError):
        date_str = datetime.now(timezone.utc).strftime("%Y%m%d")

    return f"{date_str}_{metadata['scan_id']}"


def build_template_context(data: dict[str, Any]) -> dict[str, Any]:
    """Build template context from scan data.

    Args:
        data: Scan data dictionary from get_scan_data()

    Returns:
        Dictionary with template context variables

    """
    metadata = data["metadata"]

    duration = metadata.get("duration")
    if duration is not None:
        duration_str = (
            f"{duration:.2f}" if isinstance(duration, (int, float)) else str(duration)
        )
    else:
        duration_str = "N/A"

    # Format timestamp to minute precision (DD.MM.YYYY HH:MM)
    timestamp_str = metadata["timestamp"]
    try:
        dt = datetime.fromisoformat(timestamp_str)
        timestamp_str = dt.strftime("%d.%m.%Y %H:%M")
    except (ValueError, KeyError):
        pass

    # Filter ports with TLS support for port sections
    ports_with_tls = []
    for port_data in data["ports_data"].values():
        has_tls = (
            port_data.get("cipher_suites")
            or port_data.get("supported_groups")
            or port_data.get("certificates")
            or port_data.get("tls_version")
        )
        if has_tls:
            ports_with_tls.append(port_data)

    return {
        "scan_id": metadata["scan_id"],
        "hostname": metadata["hostname"],
        "fqdn": metadata["fqdn"],
        "ipv4": metadata["ipv4"],
        "ipv6": metadata["ipv6"],
        "timestamp": timestamp_str,
        "duration": duration_str,
        "ports": ", ".join(metadata["ports"]),
        "ports_without_tls": data.get("summary", {}).get("ports_without_tls", 0),
        "summary": data.get("summary", {}),
        "ports_data": sorted(ports_with_tls, key=lambda x: x["port"]),
    }


def prepare_output_path(
    output_file: str | None,
    output_dir: str,
    default_filename: str,
) -> Path:
    """Prepare output file path and ensure parent directory exists.

    Args:
        output_file: Explicit output file path (optional)
        output_dir: Output directory for auto-generated files
        default_filename: Default filename if output_file is None

    Returns:
        Path object for output file

    """
    if output_file:
        output_path = Path(output_file)
    else:
        output_path = Path(output_dir) / default_filename

    output_path.parent.mkdir(parents=True, exist_ok=True)
    return output_path


def render_template_to_file(
    template_name: str,
    context: dict[str, Any],
    output_path: Path,
) -> str:
    """Render Jinja2 template and write to file.

    Args:
        template_name: Name of template file
        context: Template context variables
        output_path: Output file path

    Returns:
        String path of generated file

    """
    env = create_jinja_env()
    template = env.get_template(template_name)
    content = template.render(**context)

    output_path.write_text(content, encoding="utf-8")
    return str(output_path)
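`generate_report_id` falls back to today's UTC date whenever the stored timestamp cannot be parsed, so a report always gets a usable ID. That behavior is easy to check in isolation with a standalone re-implementation (for illustration only, not the module's own function):

```python
from datetime import datetime, timezone


def report_id(metadata: dict) -> str:
    # Prefer the scan's own timestamp; fall back to the current UTC date.
    try:
        date_str = datetime.fromisoformat(metadata["timestamp"]).strftime("%Y%m%d")
    except (ValueError, KeyError):
        date_str = datetime.now(timezone.utc).strftime("%Y%m%d")
    return f"{date_str}_{metadata['scan_id']}"


print(report_id({"timestamp": "2024-03-05T14:30:00", "scan_id": 7}))
print(report_id({"timestamp": "not-a-date", "scan_id": 8}).endswith("_8"))
```

The same try/except pattern also guards the timestamp formatting in `build_template_context`, which keeps rendering robust against malformed rows.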
401
src/sslysze_scan/scan_iana.py
Normal file
@@ -0,0 +1,401 @@
#!/usr/bin/env python3
"""IANA XML Registry to SQLite converter.

Parses IANA XML registry files and exports specified registries directly to the
SQLite database, based on configuration from iana_parse.json.
"""

import json
import sqlite3
import xml.etree.ElementTree as ET
from pathlib import Path


def load_config(config_path: str) -> dict:
    """Load configuration from JSON file.

    Args:
        config_path: Path to iana_parse.json

    Returns:
        Dictionary with XML paths as keys and registry definitions as values

    Raises:
        FileNotFoundError: If config file does not exist
        json.JSONDecodeError: If config file is invalid JSON

    """
    config_path_obj = Path(config_path)
    if not config_path_obj.is_file():
        raise FileNotFoundError(f"Configuration file not found: {config_path}")

    with config_path_obj.open(encoding="utf-8") as f:
        return json.load(f)


def parse_xml_with_namespace_support(
    xml_path: str,
) -> tuple[ET.Element, dict | None]:
    """Parse XML file and detect if it uses the IANA namespace.

    Args:
        xml_path: Path to XML file

    Returns:
        Tuple of (root element, namespace dict or None)

    Raises:
        FileNotFoundError: If XML file does not exist
        ET.ParseError: If XML is malformed

    """
    xml_path_obj = Path(xml_path)
    if not xml_path_obj.is_file():
        raise FileNotFoundError(f"XML file not found: {xml_path}")

    try:
        tree = ET.parse(xml_path)
        root = tree.getroot()

        # Check if IANA namespace is used
        if root.tag.startswith("{http://www.iana.org/assignments}"):
            ns = {"iana": "http://www.iana.org/assignments"}
            return root, ns
        return root, None

    except ET.ParseError as e:
        raise ET.ParseError(f"Error parsing {xml_path}: {e}") from e


def find_registry(root: ET.Element, registry_id: str, ns: dict | None) -> ET.Element:
    """Find registry element by ID in XML tree.

    Args:
        root: Root element of XML tree
        registry_id: ID of registry to find
        ns: Namespace dictionary or None

    Returns:
        Registry element

    Raises:
        ValueError: If registry not found

    """
    if ns:
        registry = root.find(f'.//iana:registry[@id="{registry_id}"]', ns)
    else:
        registry = root.find(f'.//registry[@id="{registry_id}"]')

    if registry is None:
        raise ValueError(f"Registry with ID '{registry_id}' not found")

    return registry


def get_element_text(record: ET.Element, tag: str, ns: dict | None) -> str:
    """Get text content of element, supporting both namespaced and non-namespaced XML.

    Args:
        record: Record element
        tag: Tag name to find
        ns: Namespace dictionary or None

    Returns:
        Element text or empty string if not found

    """
    if ns:
        elem = record.find(f"iana:{tag}", ns)
    else:
        elem = record.find(tag)

    if elem is not None and elem.text:
        return elem.text.strip()
    return ""


def process_xref_elements(record: ET.Element, ns: dict | None) -> str:
    """Process all xref elements and combine them into a single string.

    Args:
        record: Record element
        ns: Namespace dictionary or None

    Returns:
        Semicolon-separated string of xref references

    """
    xrefs = []

    if ns:
        xref_elements = record.findall("iana:xref", ns)
    else:
        xref_elements = record.findall("xref")

    for xref in xref_elements:
        xref_type = xref.get("type", "")
        xref_data = xref.get("data", "")
        if xref_type and xref_data:
            xrefs.append(f"{xref_type}:{xref_data}")

    return "; ".join(xrefs)


def map_header_to_element(header: str) -> str:
    """Map CSV header name to XML element name.

    Implements implicit mapping with special cases:
    - "Recommended" -> "rec"
    - Most others: lowercase of header name

    Args:
        header: CSV header name

    Returns:
        XML element name

    """
    # Special mappings
    special_mappings = {
        "Recommended": "rec",
        "RFC/Draft": "xref",  # Special handling needed
        "ESP": "esp",
        "IKEv2": "ikev2",
        "Status": "status",
    }

    if header in special_mappings:
        return special_mappings[header]

    # Default: lowercase
    return header.lower()


def extract_field_value(record: ET.Element, header: str, ns: dict | None) -> str:
    """Extract field value from record based on header name.

    Args:
        record: XML record element
        header: CSV header name
        ns: Namespace dictionary or None

    Returns:
        Field value as string

    """
    # Special handling for RFC/Draft (xref elements)
    if header == "RFC/Draft":
        return process_xref_elements(record, ns)

    # Get XML element name for this header
    element_name = map_header_to_element(header)

    # Extract text
    return get_element_text(record, element_name, ns)


def get_table_name_from_filename(filename: str) -> str:
    """Convert CSV filename to database table name.

    Args:
        filename: CSV filename (e.g., "tls_cipher_suites.csv")

    Returns:
        Table name with iana_ prefix (e.g., "iana_tls_cipher_suites")

    """
    table_name = filename.replace(".csv", "")
    if not table_name.startswith("iana_"):
        table_name = f"iana_{table_name}"
    return table_name


def write_registry_to_db(
    root: ET.Element,
    registry_id: str,
    table_name: str,
    headers: list[str],
    ns: dict | None,
    db_conn: sqlite3.Connection,
) -> int:
    """Write registry data directly to SQLite database.

    Args:
        root: Root element of XML tree
        registry_id: ID of registry to export
        table_name: Database table name
        headers: List of column names
        ns: Namespace dictionary or None
        db_conn: SQLite database connection

    Returns:
        Number of rows inserted

    Raises:
        ValueError: If registry not found
        sqlite3.Error: If database operation fails

    """
    # Find registry
    registry = find_registry(root, registry_id, ns)

    # Process all records
    if ns:
        records = registry.findall("iana:record", ns)
    else:
        records = registry.findall("record")

    # Prepare data
    rows = []
    for record in records:
        row = []
        for header in headers:
            value = extract_field_value(record, header, ns)
            row.append(value)
        rows.append(tuple(row))

    if not rows:
        return 0

    # Insert into database
    cursor = db_conn.cursor()
    placeholders = ",".join(["?"] * len(headers))

    # Delete existing data for this table. Table names cannot be bound as SQL
    # parameters; table_name is derived from local configuration, not user input.
    cursor.execute(f"DELETE FROM {table_name}")

    # Insert new data
    cursor.executemany(f"INSERT INTO {table_name} VALUES ({placeholders})", rows)

    db_conn.commit()

    return len(rows)


def process_xml_file(
|
||||
xml_path: str,
|
||||
registries: list[tuple[str, str, list[str]]],
|
||||
db_conn: sqlite3.Connection,
|
||||
repo_root: str,
|
||||
) -> int:
|
||||
"""Process single XML file and export all specified registries to database.
|
||||
|
||||
Args:
|
||||
xml_path: Relative path to XML file from repo root
|
||||
registries: List of (registry_id, output_filename, headers) tuples
|
||||
db_conn: SQLite database connection
|
||||
repo_root: Repository root directory
|
||||
|
||||
Returns:
|
||||
Total number of rows inserted
|
||||
|
||||
Raises:
|
||||
Various exceptions from helper functions
|
||||
|
||||
"""
|
||||
# Construct absolute path to XML file
|
||||
full_xml_path = repo_root / xml_path
|
||||
|
||||
print(f"\nVerarbeite XML: {xml_path}")
|
||||
|
||||
# Parse XML
|
||||
try:
|
||||
root, ns = parse_xml_with_namespace_support(str(full_xml_path))
|
||||
except (FileNotFoundError, ET.ParseError, OSError) as e:
|
||||
raise RuntimeError(f"Fehler beim Laden von {xml_path}: {e}") from e
|
||||
|
||||
# Process each registry
|
||||
total_rows = 0
|
||||
for registry_id, output_filename, headers in registries:
|
||||
table_name = get_table_name_from_filename(output_filename)
|
||||
|
||||
try:
|
||||
row_count = write_registry_to_db(
|
||||
root,
|
||||
registry_id,
|
||||
table_name,
|
||||
headers,
|
||||
ns,
|
||||
db_conn,
|
||||
)
|
||||
total_rows += row_count
|
||||
print(f"Tabelle aktualisiert: {table_name} ({row_count} Eintraege)")
|
||||
except (ValueError, sqlite3.Error) as e:
|
||||
print(f"Fehler bei Tabelle {table_name}: {e}")
|
||||
raise RuntimeError(
|
||||
f"Fehler beim Exportieren von Registry '{registry_id}' "
|
||||
f"aus {xml_path} in Tabelle {table_name}: {e}",
|
||||
) from e
|
||||
|
||||
return total_rows
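The `registries` argument mirrors the structure of `data/iana_parse.json`. A plausible entry is sketched below; the path, registry IDs, and headers are illustrative assumptions, since the actual config file is not part of this diff:

```python
import json

# Hypothetical config entry: maps one XML file to the registries extracted from it.
config = {
    "proto/tls-parameters.xml": [
        # (registry_id, output_filename, headers)
        ("tls-parameters-4", "tls_cipher_suites.csv", ["value", "description", "recommended"]),
        ("tls-parameters-8", "tls_supported_groups.csv", ["value", "description", "recommended"]),
    ],
}

# load_config presumably returns this shape after json.load; JSON has no tuples,
# so each entry arrives as a list rather than a tuple.
roundtrip = json.loads(json.dumps(config))
print(type(roundtrip["proto/tls-parameters.xml"][0]))  # <class 'list'>
```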


def main() -> None:
    """Main entry point."""
    # Determine paths
    script_dir = Path(__file__).parent
    repo_root = script_dir.parent.parent
    config_path = script_dir / "data" / "iana_parse.json"
    db_path = script_dir / "data" / "crypto_standards.db"

    print("IANA XML to SQLite Converter")
    print("=" * 50)
    print(f"Repository Root: {repo_root}")
    print(f"Configuration: {config_path}")
    print(f"Database: {db_path}")

    # Check if database exists
    if not db_path.exists():
        print(f"\n✗ Error: database {db_path} not found", file=sys.stderr)
        print("Please create the database first.", file=sys.stderr)
        sys.exit(1)

    # Load configuration
    try:
        config = load_config(config_path)
    except (FileNotFoundError, json.JSONDecodeError, OSError) as e:
        print(f"\nFailed to load configuration: {e}", file=sys.stderr)
        sys.exit(1)

    print(f"\n{len(config)} XML file(s) found in configuration")

    # Connect to database
    try:
        db_conn = sqlite3.connect(str(db_path))
    except sqlite3.Error as e:
        print(f"\n✗ Failed to open database: {e}", file=sys.stderr)
        sys.exit(1)

    # Process each XML file
    try:
        success_count = 0
        total_count = 0
        total_rows = 0

        for xml_path, registries in config.items():
            total_count += len(registries)
            try:
                rows = process_xml_file(xml_path, registries, db_conn, str(repo_root))
                success_count += len(registries)
                total_rows += rows
            except (RuntimeError, ValueError, sqlite3.Error) as e:
                print(f"\nError: {e}", file=sys.stderr)
                sys.exit(1)

        # Summary
        print("\n" + "=" * 50)
        print(
            f"Completed successfully: {success_count}/{total_count} registries "
            f"({total_rows} entries) imported into database",
        )

    finally:
        db_conn.close()


if __name__ == "__main__":
    main()
203
src/sslysze_scan/scanner.py
Normal file
@@ -0,0 +1,203 @@
"""Module for performing SSL/TLS scans with SSLyze."""

import logging
from datetime import datetime, timezone
from typing import Any

from sslyze import (
    ProtocolWithOpportunisticTlsEnum,
    Scanner,
    ServerConnectivityStatusEnum,
    ServerHostnameCouldNotBeResolved,
    ServerNetworkConfiguration,
    ServerNetworkLocation,
    ServerScanRequest,
    ServerScanStatusEnum,
)

from .protocol_loader import get_protocol_for_port

logger = logging.getLogger(__name__)


def create_scan_request(
    hostname: str,
    port: int,
    use_opportunistic_tls: bool = True,
) -> tuple[ServerScanRequest, bool]:
    """Create a scan request for the given hostname and port.

    Checks if the port requires opportunistic TLS and configures accordingly.

    Args:
        hostname: Server hostname to scan.
        port: Port number to scan.
        use_opportunistic_tls: Whether to use opportunistic TLS if available.

    Returns:
        Tuple of (ServerScanRequest, is_opportunistic_tls).

    Raises:
        ServerHostnameCouldNotBeResolved: If hostname cannot be resolved.

    """
    # Check if port requires opportunistic TLS
    protocol = get_protocol_for_port(port)

    if protocol and use_opportunistic_tls:
        # Port requires opportunistic TLS
        logger.info(
            "Port %s detected as %s - using opportunistic TLS scan", port, protocol
        )

        # Get the protocol enum
        protocol_enum = getattr(ProtocolWithOpportunisticTlsEnum, protocol)

        return (
            ServerScanRequest(
                server_location=ServerNetworkLocation(hostname=hostname, port=port),
                network_configuration=ServerNetworkConfiguration(
                    tls_server_name_indication=hostname,
                    tls_opportunistic_encryption=protocol_enum,
                ),
            ),
            True,
        )

    # Direct TLS connection
    if protocol and not use_opportunistic_tls:
        logger.info("Port %s - falling back to direct TLS scan", port)
    else:
        logger.info("Port %s - using direct TLS scan", port)

    return (
        ServerScanRequest(
            server_location=ServerNetworkLocation(hostname=hostname, port=port),
        ),
        False,
    )


def perform_scan(
    hostname: str,
    port: int,
    scan_start_time: datetime,
) -> tuple[Any, float]:
    """Perform SSL/TLS scan on the given hostname and port.

    Args:
        hostname: Server hostname to scan.
        port: Port number to scan.
        scan_start_time: Timestamp to use for this scan.

    Returns:
        Tuple of (ServerScanResult, duration_seconds)

    Raises:
        RuntimeError: If the hostname cannot be resolved or no results are obtained.

    """
    logger.info("Starting scan for %s:%s", hostname, port)

    # Create scan request
    try:
        scan_request, is_opportunistic = create_scan_request(hostname, port)
    except ServerHostnameCouldNotBeResolved as e:
        raise RuntimeError(f"Error: Could not resolve hostname '{hostname}'") from e

    # Queue the scan
    scanner = Scanner()
    scanner.queue_scans([scan_request])

    # Process results
    all_server_scan_results = []
    for server_scan_result in scanner.get_results():
        all_server_scan_results.append(server_scan_result)
        logger.info(
            "Results for %s:%s",
            server_scan_result.server_location.hostname,
            server_scan_result.server_location.port,
        )

        # Check connectivity
        if server_scan_result.scan_status == ServerScanStatusEnum.ERROR_NO_CONNECTIVITY:
            # If opportunistic TLS failed, try fallback to direct TLS
            if (
                is_opportunistic
                and server_scan_result.connectivity_status
                == ServerConnectivityStatusEnum.ERROR
            ):
                logger.warning(
                    "Opportunistic TLS connection failed for %s:%s",
                    server_scan_result.server_location.hostname,
                    server_scan_result.server_location.port,
                )
                logger.info("Retrying with direct TLS connection...")

                # Create new scan request without opportunistic TLS
                try:
                    fallback_request, _ = create_scan_request(
                        hostname,
                        port,
                        use_opportunistic_tls=False,
                    )
                except ServerHostnameCouldNotBeResolved as e:
                    raise RuntimeError(
                        f"Error: Could not resolve hostname '{hostname}'"
                    ) from e

                # Queue and execute fallback scan
                fallback_scanner = Scanner()
                fallback_scanner.queue_scans([fallback_request])

                # Process fallback results
                for fallback_result in fallback_scanner.get_results():
                    all_server_scan_results[-1] = fallback_result
                    server_scan_result = fallback_result
                    logger.info(
                        "Fallback Results for %s:%s",
                        server_scan_result.server_location.hostname,
                        server_scan_result.server_location.port,
                    )

                    # Check connectivity again
                    if (
                        server_scan_result.scan_status
                        == ServerScanStatusEnum.ERROR_NO_CONNECTIVITY
                    ):
                        logger.error(
                            "Could not connect to %s:%s",
                            server_scan_result.server_location.hostname,
                            server_scan_result.server_location.port,
                        )
                        if server_scan_result.connectivity_error_trace:
                            logger.error(
                                "Details: %s",
                                server_scan_result.connectivity_error_trace,
                            )
                        continue
                    break
            else:
                logger.error(
                    "Could not connect to %s:%s",
                    server_scan_result.server_location.hostname,
                    server_scan_result.server_location.port,
                )
                if server_scan_result.connectivity_error_trace:
                    logger.error(
                        "Details: %s", server_scan_result.connectivity_error_trace
                    )
                continue

        # Skip further processing if still no connectivity
        if server_scan_result.scan_status == ServerScanStatusEnum.ERROR_NO_CONNECTIVITY:
            continue

    # Calculate scan duration
    scan_end_time = datetime.now(timezone.utc)
    scan_duration = (scan_end_time - scan_start_time).total_seconds()

    # Return first result (we only scan one host)
    if all_server_scan_results:
        return all_server_scan_results[0], scan_duration
    raise RuntimeError("No scan results obtained")
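Stripped of the sslyze specifics, the retry strategy in `perform_scan` is: attempt an opportunistic (STARTTLS) scan first, and on a connectivity error fall back to one direct-TLS attempt. A minimal sketch of that control flow, with a stand-in scan function:

```python
def scan_with_fallback(port: int, scan) -> str:
    """Try opportunistic TLS first; on connectivity error, retry direct TLS.

    `scan(port, opportunistic=...)` stands in for queueing a sslyze scan and
    returning its status string.
    """
    result = scan(port, opportunistic=True)
    if result == "ERROR_NO_CONNECTIVITY":
        result = scan(port, opportunistic=False)  # direct TLS fallback
    return result

# Fake scanner: in this toy setup, port 587 only answers a direct TLS probe.
def fake_scan(port, opportunistic):
    if port == 587 and opportunistic:
        return "ERROR_NO_CONNECTIVITY"
    return "COMPLETED"

print(scan_with_fallback(587, fake_scan))  # COMPLETED (via fallback)
print(scan_with_fallback(443, fake_scan))  # COMPLETED (first attempt)
```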
181
src/sslysze_scan/templates/report.md.j2
Normal file
@@ -0,0 +1,181 @@
# Compliance Report: {{ hostname }}

**Scan-ID:** {{ scan_id }}
**FQDN:** {{ fqdn }}
**IPv4:** {{ ipv4 or 'N/A' }}
**IPv6:** {{ ipv6 or 'N/A' }}
**Timestamp:** {{ timestamp }}
**Duration:** {{ duration }}s
**Ports:** {{ ports }}
**Ports without TLS Support:** {{ ports_without_tls }}

---

## Summary

| Metric | Value |
|--------|-------|
| Scanned Ports | {{ summary.total_ports }} |
| Ports with TLS Support | {{ summary.successful_ports }} |
| Cipher Suites Checked | {{ summary.total_cipher_suites }} |
| Cipher Suites Compliant | {{ summary.compliant_cipher_suites }} ({{ summary.cipher_suite_percentage }}%) |
| Supported Groups Checked | {{ summary.total_groups }} |
| Supported Groups Compliant | {{ summary.compliant_groups }} ({{ summary.group_percentage }}%) |
| Critical Vulnerabilities | {{ summary.critical_vulnerabilities }} |

---

{% for port_data in ports_data -%}
{% if port_data.cipher_suites or port_data.supported_groups or port_data.certificates or port_data.tls_version -%}
## Port {{ port_data.port }}

### TLS Configuration

**Status:** {{ port_data.status }}

{% if port_data.tls_version -%}
**Highest TLS Version:** {{ port_data.tls_version }}
{% endif -%}

{% if port_data.cipher_suites -%}
### Cipher Suites

{% for tls_version, suites in port_data.cipher_suites.items() -%}
#### {{ tls_version | format_tls_version }}

**Offered by Server:** {{ suites.accepted|length }}

{% if suites.accepted -%}
| Cipher Suite | IANA | BSI | Valid Until | Compliant |
|--------------|------|-----|-------------|-----------|
{% for suite in suites.accepted -%}
| {{ suite.name }} | {{ suite.iana_recommended or '-' }} | {{ 'Yes' if suite.bsi_approved else '-' }} | {{ suite.bsi_valid_until or '-' }} | {{ 'Yes' if suite.compliant else 'No' }} |
{% endfor -%}
{% endif -%}

{% if suites.rejected -%}
**Not Offered by Server:** {{ suites.rejected|length }} (of {{ suites.rejected_total }} tested)

<details>
<summary>Show recommended cipher suites not offered</summary>

| Cipher Suite | IANA | BSI | Valid Until |
|--------------|------|-----|-------------|
{% for suite in suites.rejected -%}
| {{ suite.name }} | {{ suite.iana_recommended or '-' }} | {{ 'Yes' if suite.bsi_approved else '-' }} | {{ suite.bsi_valid_until or '-' }} |
{% endfor -%}

</details>
{% endif -%}

{% endfor -%}
{% endif -%}

{% if port_data.supported_groups -%}
### Supported Groups (Elliptic Curves / DH)

| Group | IANA | BSI | Valid Until | Compliant |
|-------|------|-----|-------------|-----------|
{% for group in port_data.supported_groups -%}
| {{ group.name }} | {{ group.iana_recommended or '-' }} | {{ 'Yes' if group.bsi_approved else '-' }} | {{ group.bsi_valid_until or '-' }} | {{ 'Yes' if group.compliant else 'No' }} |
{% endfor -%}
{% endif -%}

{% if port_data.missing_recommended_groups -%}
{% if port_data.missing_recommended_groups.bsi_approved or port_data.missing_recommended_groups.iana_recommended -%}
### Recommended Groups Not Offered

<details>
<summary>Show recommended groups not offered</summary>

{% if port_data.missing_recommended_groups.bsi_approved -%}
**BSI TR-02102-2 Approved (missing):**

| Group | TLS Versions | Valid Until |
|-------|--------------|-------------|
{% for group in port_data.missing_recommended_groups.bsi_approved -%}
| {{ group.name }} | {{ group.tls_versions|join(', ') }} | {{ group.valid_until }} |
{% endfor -%}

{% endif -%}
{% if port_data.missing_recommended_groups.iana_recommended -%}
**IANA Recommended (missing):**

| Group | IANA Value |
|-------|------------|
{% for group in port_data.missing_recommended_groups.iana_recommended -%}
| {{ group.name }} | {{ group.iana_value }} |
{% endfor -%}

{% endif -%}

</details>
{% endif -%}
{% endif -%}

{% if port_data.certificates -%}
### Certificates

| Position | Subject | Issuer | Valid From | Valid Until | Key Type | Key Size | Compliant |
|----------|---------|--------|------------|-------------|----------|----------|-----------|
{% for cert in port_data.certificates -%}
| {{ cert.position }} | {{ cert.subject }} | {{ cert.issuer }} | {{ cert.not_before }} | {{ cert.not_after }} | {{ cert.key_type }} | {{ cert.key_bits }} | {{ 'Yes' if cert.compliant else 'No' if cert.compliant is not none else '-' }} |
{% endfor -%}
{% endif -%}

{% if port_data.vulnerabilities -%}
### Vulnerabilities

| Type | Vulnerable | Details |
|------|------------|---------|
{% for vuln in port_data.vulnerabilities -%}
| {{ vuln.type }} | {{ 'Yes' if vuln.vulnerable else 'No' }} | {{ vuln.details or '-' }} |
{% endfor -%}
{% endif -%}

{% if port_data.protocol_features -%}
### Protocol Features

| Feature | Supported | Details |
|---------|-----------|---------|
{% for feature in port_data.protocol_features -%}
| {{ feature.name }} | {{ 'Yes' if feature.supported else 'No' }} | {{ feature.details or '-' }} |
{% endfor -%}
{% endif -%}

{% if port_data.session_features -%}
### Session Features

| Feature | Client Initiated | Secure | Session ID | TLS Ticket | Details |
|---------|------------------|--------|------------|------------|---------|
{% for feature in port_data.session_features -%}
| {{ feature.type }} | {{ 'Yes' if feature.client_initiated else 'No' if feature.client_initiated is not none else '-' }} | {{ 'Yes' if feature.secure else 'No' if feature.secure is not none else '-' }} | {{ 'Yes' if feature.session_id_supported else 'No' if feature.session_id_supported is not none else '-' }} | {{ 'Yes' if feature.ticket_supported else 'No' if feature.ticket_supported is not none else '-' }} | {{ feature.details or '-' }} |
{% endfor -%}
{% endif -%}

{% if port_data.http_headers -%}
### HTTP Security Headers

| Header | Present | Value |
|--------|---------|-------|
{% for header in port_data.http_headers -%}
| {{ header.name }} | {{ 'Yes' if header.is_present else 'No' }} | {{ header.value or '-' }} |
{% endfor -%}
{% endif -%}

### Compliance Status Port {{ port_data.port }}

| Category | Checked | Compliant | Percentage |
|----------|---------|-----------|------------|
| Cipher Suites | {{ port_data.compliance.cipher_suites_checked }} | {{ port_data.compliance.cipher_suites_passed }} | {{ port_data.compliance.cipher_suite_percentage }}% |
| Supported Groups | {{ port_data.compliance.groups_checked }} | {{ port_data.compliance.groups_passed }} | {{ port_data.compliance.group_percentage }}% |

{% else -%}
## Port {{ port_data.port }} - No TLS Support

{% endif -%}
{% endfor -%}

---

*Generated with compliance-scan*
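The report templates apply a custom `format_tls_version` Jinja filter. Its real implementation lives in the report module, which is not part of this diff; a plausible stdlib-only sketch:

```python
def format_tls_version(version: str) -> str:
    # Hypothetical filter: turn a bare version key like "1.2" into "TLS 1.2".
    # Values that already carry a protocol prefix pass through unchanged.
    return version if version.startswith(("TLS", "SSL")) else f"TLS {version}"

# When building the Jinja environment, the filter would be registered roughly as:
#   env.filters["format_tls_version"] = format_tls_version
print(format_tls_version("1.3"))  # TLS 1.3
```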
219
src/sslysze_scan/templates/report.reST.j2
Normal file
@@ -0,0 +1,219 @@
{{ '#' * 27 }}
TLS Compliance Report |UCS|
{{ '#' * 27 }}

- **Scan-ID:** {{ scan_id }}
- **FQDN:** {{ fqdn }}
- **IPv4:** {{ ipv4 or 'N/A' }}
- **IPv6:** {{ ipv6 or 'N/A' }}
- **Timestamp:** {{ timestamp }}
- **Duration:** {{ duration }} Seconds
- **Ports:** {{ ports }}
- **Ports without TLS Support:** {{ ports_without_tls }}

----

*******
Summary
*******

.. csv-table::
   :file: summary.csv
   :header-rows: 1
   :widths: auto

----

{% for port_data in ports_data -%}
{% if port_data.cipher_suites or port_data.supported_groups or port_data.certificates or port_data.tls_version -%}
{{ '*' * (5 + port_data.port|string|length) }}
Port {{ port_data.port }}
{{ '*' * (5 + port_data.port|string|length) }}

TLS Configuration
=================

**Status:** {{ port_data.status }}

{% if port_data.tls_version -%}
**Highest TLS Version:** {{ port_data.tls_version }}

{% endif -%}

{% if port_data.cipher_suites -%}
Cipher Suites
=============

{% for tls_version, suites in port_data.cipher_suites.items() -%}
{{ tls_version | format_tls_version }}
{{ '-' * (tls_version | format_tls_version | length) }}

**Offered by Server:** {{ suites.accepted|length }}

{% if suites.accepted -%}
.. csv-table::
   :file: {{ port_data.port }}_cipher_suites_{{ tls_version }}_accepted.csv
   :header-rows: 1
   :widths: auto

{% else -%}
No cipher suites accepted by server.

{% endif -%}
{% if suites.rejected -%}
**Not Offered by Server:** {{ suites.rejected|length }} (of {{ suites.rejected_total }} tested)

.. raw:: html

   <details>

.. raw:: html

   <summary>

Show recommended cipher suites not offered

.. raw:: html

   </summary>

.. csv-table::
   :file: {{ port_data.port }}_cipher_suites_{{ tls_version }}_rejected.csv
   :header-rows: 1
   :widths: auto

.. raw:: html

   </details>

{% endif -%}
{% endfor -%}
{% endif -%}

{% if port_data.supported_groups -%}
Supported Groups (Elliptic Curves / DH)
=======================================

.. csv-table::
   :file: {{ port_data.port }}_supported_groups.csv
   :header-rows: 1
   :widths: auto

{% endif -%}
{% if port_data.missing_recommended_groups -%}
{% if port_data.missing_recommended_groups.bsi_approved or port_data.missing_recommended_groups.iana_recommended -%}
Recommended Groups Not Offered
==============================

.. raw:: html

   <details>

.. raw:: html

   <summary>

Show recommended groups not offered

.. raw:: html

   </summary>

{% if port_data.missing_recommended_groups.bsi_approved -%}
**BSI TR-02102-2 Approved (missing):**

.. csv-table::
   :file: {{ port_data.port }}_missing_groups_bsi.csv
   :header-rows: 1
   :widths: auto

{% else -%}
No BSI-approved groups missing.

{% endif -%}
{% if port_data.missing_recommended_groups.iana_recommended -%}
**IANA Recommended (missing):**

.. csv-table::
   :file: {{ port_data.port }}_missing_groups_iana.csv
   :header-rows: 1
   :widths: auto

{% else -%}
No IANA-recommended groups missing.

{% endif -%}
.. raw:: html

   </details>

{% endif -%}
{% endif -%}

{% if port_data.certificates -%}
Certificates
============

.. csv-table::
   :file: {{ port_data.port }}_certificates.csv
   :header-rows: 1
   :widths: auto

{% endif -%}
{% if port_data.vulnerabilities -%}
Vulnerabilities
===============

.. csv-table::
   :file: {{ port_data.port }}_vulnerabilities.csv
   :header-rows: 1
   :widths: auto

{% endif -%}
{% if port_data.protocol_features -%}
Protocol Features
=================

.. csv-table::
   :file: {{ port_data.port }}_protocol_features.csv
   :header-rows: 1
   :widths: auto

{% endif -%}
{% if port_data.session_features -%}
Session Features
================

.. csv-table::
   :file: {{ port_data.port }}_session_features.csv
   :header-rows: 1
   :widths: auto

{% endif -%}
{% if port_data.http_headers -%}
HTTP Security Headers
=====================

.. csv-table::
   :file: {{ port_data.port }}_http_headers.csv
   :header-rows: 1
   :widths: auto

{% endif -%}
Compliance Status Port {{ port_data.port }}
{{ '-' * (23 + port_data.port|string|length) }}

.. csv-table::
   :file: {{ port_data.port }}_compliance_status.csv
   :header-rows: 1
   :widths: auto

{% else -%}
{{ '*' * (28 + port_data.port|string|length) }}
Port {{ port_data.port }} - No TLS Support
{{ '*' * (28 + port_data.port|string|length) }}

{% endif -%}
{% endfor -%}

*Generated with compliance-scan*
0
tests/__init__.py
Normal file
410
tests/conftest.py
Normal file
@@ -0,0 +1,410 @@
"""Pytest configuration and shared fixtures."""

import sqlite3
from pathlib import Path
from typing import Any

import pytest


@pytest.fixture
def mock_scan_metadata() -> dict[str, Any]:
    """Provide mock scan metadata."""
    return {
        "scan_id": 5,
        "hostname": "example.com",
        "fqdn": "example.com",
        "ipv4": "192.168.1.1",
        "ipv6": "2001:db8::1",
        "timestamp": "2025-01-08T10:30:00.123456",
        "duration": 12.34,
        "ports": ["443", "636"],
    }


@pytest.fixture
def mock_scan_data(mock_scan_metadata: dict[str, Any]) -> dict[str, Any]:
    """Provide complete mock scan data structure."""
    return {
        "metadata": mock_scan_metadata,
        "summary": {
            "total_ports": 2,
            "successful_ports": 2,
            "total_cipher_suites": 50,
            "compliant_cipher_suites": 45,
            "cipher_suite_percentage": 90,
            "total_groups": 10,
            "compliant_groups": 8,
            "group_percentage": 80,
            "critical_vulnerabilities": 0,
        },
        "ports_data": {
            443: {
                "port": 443,
                "status": "completed",
                "tls_version": "1.3",
                "cipher_suites": {
                    "1.2": {
                        "accepted": [
                            {
                                "name": "TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256",
                                "iana_recommended": "Y",
                                "bsi_approved": True,
                                "bsi_valid_until": "2029",
                                "compliant": True,
                            },
                        ],
                        "rejected": [],
                        "rejected_total": 0,
                    },
                    "1.3": {
                        "accepted": [
                            {
                                "name": "TLS_AES_128_GCM_SHA256",
                                "iana_recommended": "Y",
                                "bsi_approved": True,
                                "bsi_valid_until": "2031",
                                "compliant": True,
                            },
                        ],
                        "rejected": [
                            {
                                "name": "TLS_AES_128_CCM_SHA256",
                                "iana_recommended": "Y",
                                "bsi_approved": True,
                                "bsi_valid_until": "2031",
                            },
                        ],
                        "rejected_total": 1,
                    },
                },
                "supported_groups": [
                    {
                        "name": "x25519",
                        "iana_recommended": "Y",
                        "bsi_approved": False,
                        "bsi_valid_until": None,
                        "compliant": True,
                    },
                ],
                "missing_recommended_groups": {
                    "bsi_approved": [
                        {
                            "name": "brainpoolP256r1",
                            "tls_versions": ["1.2"],
                            "valid_until": "2031",
                        },
                    ],
                    "iana_recommended": [],
                },
                "certificates": [
                    {
                        "position": 0,
                        "subject": "CN=example.com",
                        "issuer": "CN=Test CA",
                        "not_before": "2024-01-01",
                        "not_after": "2025-12-31",
                        "key_type": "RSA",
                        "key_bits": 2048,
                    },
                ],
                "vulnerabilities": [
                    {
                        "type": "Heartbleed",
                        "vulnerable": False,
                        "details": "Not vulnerable",
                    },
                ],
                "protocol_features": [
                    {
                        "name": "TLS Compression",
                        "supported": False,
                        "details": "Disabled",
                    },
                ],
                "session_features": [
                    {
                        "type": "Session Resumption",
                        "client_initiated": True,
                        "secure": True,
                        "session_id_supported": True,
                        "ticket_supported": True,
                        "details": "Supported",
                    },
                ],
                "http_headers": [
                    {
                        "name": "Strict-Transport-Security",
                        "is_present": True,
                        "value": "max-age=31536000",
                    },
                ],
                "compliance": {
                    "cipher_suites_checked": 45,
                    "cipher_suites_passed": 40,
                    "cipher_suite_percentage": 88.89,
                    "groups_checked": 5,
                    "groups_passed": 4,
                    "group_percentage": 80.0,
                },
            },
            636: {
                "port": 636,
                "status": "completed",
                "tls_version": "1.2",
                "cipher_suites": {
                    "1.2": {
                        "accepted": [
                            {
                                "name": "TLS_DHE_RSA_WITH_AES_256_GCM_SHA384",
                                "iana_recommended": "Y",
                                "bsi_approved": True,
                                "bsi_valid_until": "2029",
                                "compliant": True,
                            },
                        ],
                        "rejected": [],
                        "rejected_total": 0,
                    },
                },
                "supported_groups": [],
                "missing_recommended_groups": {
                    "bsi_approved": [],
                    "iana_recommended": [],
                },
                "certificates": [],
                "vulnerabilities": [],
                "protocol_features": [],
                "session_features": [],
                "http_headers": [],
                "compliance": {
                    "cipher_suites_checked": 5,
                    "cipher_suites_passed": 5,
                    "cipher_suite_percentage": 100.0,
                    "groups_checked": 0,
                    "groups_passed": 0,
                    "group_percentage": 0.0,
                },
            },
        },
    }


@pytest.fixture
def temp_output_dir(tmp_path: Path) -> Path:
    """Provide temporary output directory."""
    output_dir = tmp_path / "output"
    output_dir.mkdir()
    return output_dir


# SQL for database views
VIEWS_SQL = """
-- View: Cipher suites with compliance information
CREATE VIEW IF NOT EXISTS v_cipher_suites_with_compliance AS
SELECT
    scs.scan_id,
    scs.port,
    scs.tls_version,
    scs.cipher_suite_name,
    scs.accepted,
    scs.iana_value,
    scs.key_size,
    scs.is_anonymous,
    sc.iana_recommended,
    sc.bsi_approved,
    sc.bsi_valid_until,
    sc.passed as compliant,
    CASE
        WHEN scs.accepted = 1 THEN sc.iana_recommended
        ELSE iana.recommended
    END as iana_recommended_final,
    CASE
        WHEN scs.accepted = 1 THEN sc.bsi_approved
        ELSE (bsi.name IS NOT NULL)
    END as bsi_approved_final,
    CASE
        WHEN scs.accepted = 1 THEN sc.bsi_valid_until
        ELSE bsi.valid_until
    END as bsi_valid_until_final
FROM scan_cipher_suites scs
LEFT JOIN scan_compliance_status sc
    ON scs.scan_id = sc.scan_id
    AND scs.port = sc.port
    AND sc.check_type = 'cipher_suite'
    AND scs.cipher_suite_name = sc.item_name
LEFT JOIN iana_tls_cipher_suites iana
    ON scs.cipher_suite_name = iana.description
LEFT JOIN bsi_tr_02102_2_tls bsi
    ON scs.cipher_suite_name = bsi.name
    AND scs.tls_version = bsi.tls_version
    AND bsi.category = 'cipher_suite';

-- View: Supported groups with compliance information
CREATE VIEW IF NOT EXISTS v_supported_groups_with_compliance AS
SELECT
    ssg.scan_id,
    ssg.port,
    ssg.group_name,
    ssg.iana_value,
    ssg.openssl_nid,
    sc.iana_recommended,
    sc.bsi_approved,
    sc.bsi_valid_until,
    sc.passed as compliant
FROM scan_supported_groups ssg
LEFT JOIN scan_compliance_status sc
    ON ssg.scan_id = sc.scan_id
    AND ssg.port = sc.port
    AND sc.check_type = 'supported_group'
    AND ssg.group_name = sc.item_name;

-- View: Certificates with compliance information
CREATE VIEW IF NOT EXISTS v_certificates_with_compliance AS
SELECT
    c.scan_id,
    c.port,
    c.position,
    c.subject,
    c.issuer,
    c.serial_number,
    c.not_before,
    c.not_after,
    c.key_type,
    c.key_bits,
    c.signature_algorithm,
    c.fingerprint_sha256,
    MAX(cs.passed) as compliant,
    MAX(cs.details) as compliance_details
FROM scan_certificates c
LEFT JOIN scan_compliance_status cs
    ON c.scan_id = cs.scan_id
    AND c.port = cs.port
    AND cs.check_type = 'certificate'
    AND cs.item_name = (c.key_type || ' ' || c.key_bits || ' Bit')
GROUP BY c.scan_id, c.port, c.position, c.subject, c.issuer, c.serial_number,
    c.not_before, c.not_after, c.key_type, c.key_bits,
    c.signature_algorithm, c.fingerprint_sha256;

-- View: Port compliance summary
CREATE VIEW IF NOT EXISTS v_port_compliance_summary AS
SELECT
    scan_id,
    port,
    check_type,
    COUNT(*) as total,
    SUM(CASE WHEN passed = 1 THEN 1 ELSE 0 END) as passed,
    ROUND(CAST(SUM(CASE WHEN passed = 1 THEN 1 ELSE 0 END) AS REAL) / COUNT(*) * 100, 1) as percentage
FROM scan_compliance_status
GROUP BY scan_id, port, check_type;

-- View: Missing BSI-approved groups
CREATE VIEW IF NOT EXISTS v_missing_bsi_groups AS
SELECT
    s.scan_id,
    s.ports,
    bsi.name as group_name,
    bsi.tls_version,
    bsi.valid_until
FROM scans s
CROSS JOIN (
    SELECT DISTINCT name, tls_version, valid_until
    FROM bsi_tr_02102_2_tls
    WHERE category = 'dh_group'
) bsi
WHERE NOT EXISTS (
    SELECT 1
    FROM scan_supported_groups ssg
    WHERE ssg.scan_id = s.scan_id
    AND LOWER(ssg.group_name) = LOWER(bsi.name)
);

-- View: Missing IANA-recommended groups
CREATE VIEW IF NOT EXISTS v_missing_iana_groups AS
SELECT
    s.scan_id,
    s.ports,
    iana.description as group_name,
    iana.value as iana_value
FROM scans s
CROSS JOIN (
    SELECT description, value
    FROM iana_tls_supported_groups
    WHERE recommended = 'Y'
) iana
WHERE NOT EXISTS (
    SELECT 1
    FROM scan_supported_groups ssg
    WHERE ssg.scan_id = s.scan_id
    AND LOWER(ssg.group_name) = LOWER(iana.description)
)
AND NOT EXISTS (
    SELECT 1
    FROM bsi_tr_02102_2_tls bsi
    WHERE LOWER(bsi.name) = LOWER(iana.description)
AND bsi.category = 'dh_group'
|
||||
);
|
||||
"""
|
||||
|
||||
|
||||
@pytest.fixture
def test_db() -> sqlite3.Connection:
    """Provide in-memory test database with crypto standards and scan data."""
    conn = sqlite3.connect(":memory:")

    # 1. Copy crypto_standards.db to memory
    standards_path = (
        Path(__file__).parent.parent / "src/sslysze_scan/data/crypto_standards.db"
    )
    if standards_path.exists():
        with sqlite3.connect(str(standards_path)) as src_conn:
            for line in src_conn.iterdump():
                conn.execute(line)

    # 2. Copy test_scan.db data to memory (skip CREATE and csv_export_metadata)
    fixtures_dir = Path(__file__).parent / "fixtures"
    test_scan_path = fixtures_dir / "test_scan.db"
    if test_scan_path.exists():
        with sqlite3.connect(str(test_scan_path)) as src_conn:
            for line in src_conn.iterdump():
                if not line.startswith("CREATE ") and "csv_export_metadata" not in line:
                    conn.execute(line)

    # 3. Create views
    conn.executescript(VIEWS_SQL)

    conn.commit()
    yield conn
    conn.close()


@pytest.fixture
def test_db_path(tmp_path: Path) -> str:
    """Provide test database as a file path for functions expecting a path."""
    db_path = tmp_path / "test.db"
    conn = sqlite3.connect(str(db_path))

    # 1. Copy crypto_standards.db to the file
    standards_path = (
        Path(__file__).parent.parent / "src/sslysze_scan/data/crypto_standards.db"
    )
    if standards_path.exists():
        with sqlite3.connect(str(standards_path)) as src_conn:
            for line in src_conn.iterdump():
                conn.execute(line)

    # 2. Copy test_scan.db data to the file (skip CREATE and csv_export_metadata)
    fixtures_dir = Path(__file__).parent / "fixtures"
    test_scan_path = fixtures_dir / "test_scan.db"
    if test_scan_path.exists():
        with sqlite3.connect(str(test_scan_path)) as src_conn:
            for line in src_conn.iterdump():
                if not line.startswith("CREATE ") and "csv_export_metadata" not in line:
                    conn.execute(line)

    # 3. Create views
    conn.executescript(VIEWS_SQL)

    conn.commit()
    conn.close()
    return str(db_path)
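As a sanity check, the summary view defined in `VIEWS_SQL` can be exercised on its own. The sketch below is illustrative only: it uses a stripped-down, hypothetical two-column version of `scan_compliance_status` (the real table carries more columns) together with the `v_port_compliance_summary` definition from above.

```python
import sqlite3

# Minimal, hypothetical schema — just enough columns for the view to aggregate.
conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE scan_compliance_status (
        scan_id INTEGER, port INTEGER, check_type TEXT,
        item_name TEXT, passed INTEGER
    );
    INSERT INTO scan_compliance_status VALUES
        (1, 443, 'cipher_suite', 'TLS_AES_256_GCM_SHA384', 1),
        (1, 443, 'cipher_suite', 'TLS_RSA_WITH_AES_128_CBC_SHA', 0);

    CREATE VIEW v_port_compliance_summary AS
    SELECT
        scan_id, port, check_type,
        COUNT(*) as total,
        SUM(CASE WHEN passed = 1 THEN 1 ELSE 0 END) as passed,
        ROUND(CAST(SUM(CASE WHEN passed = 1 THEN 1 ELSE 0 END) AS REAL)
              / COUNT(*) * 100, 1) as percentage
    FROM scan_compliance_status
    GROUP BY scan_id, port, check_type;
    """
)
# One of the two cipher suites passed, so the summary row is (2, 1, 50.0).
row = conn.execute(
    "SELECT total, passed, percentage FROM v_port_compliance_summary"
).fetchone()
print(row)  # (2, 1, 50.0)
conn.close()
```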
1
tests/fixtures/__init__.py
vendored
Normal file
@@ -0,0 +1 @@
"""Test fixtures package."""
BIN
tests/fixtures/test_scan.db
vendored
Normal file
Binary file not shown.
26
tests/test_cli.py
Normal file
@@ -0,0 +1,26 @@
"""Tests for CLI argument parsing."""

import pytest

from sslysze_scan.cli import parse_host_ports


class TestParseHostPorts:
    """Tests for parse_host_ports function."""

    def test_parse_host_ports_multiple_ports(self) -> None:
        """Test parsing hostname with multiple ports."""
        hostname, ports = parse_host_ports("example.com:443,636,993")
        assert hostname == "example.com"
        assert ports == [443, 636, 993]

    def test_parse_host_ports_ipv6_multiple(self) -> None:
        """Test parsing IPv6 address with multiple ports."""
        hostname, ports = parse_host_ports("[2001:db8::1]:443,636")
        assert hostname == "2001:db8::1"
        assert ports == [443, 636]

    def test_parse_host_ports_invalid_port_range(self) -> None:
        """Test error when port number is out of range."""
        with pytest.raises(ValueError, match="Invalid port number.*Must be between"):
            parse_host_ports("example.com:99999")
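The real `parse_host_ports` lives in `sslysze_scan.cli` and is not shown in this commit; a minimal, hypothetical reimplementation consistent with the behavior the tests above assert (bracketed IPv6 hosts, comma-separated ports, 1–65535 range check) might look like this:

```python
def parse_host_ports_sketch(target: str) -> tuple[str, list[int]]:
    """Split 'host:p1,p2' or '[ipv6]:p1,p2' into (host, [ports]).

    Illustrative sketch only — not the shipped parse_host_ports.
    """
    if target.startswith("["):
        # Bracketed IPv6 literal: strip '[' and split on ']:'.
        host, _, port_part = target[1:].partition("]:")
    else:
        # Split on the last ':' so hostnames are taken verbatim.
        host, _, port_part = target.rpartition(":")
    ports: list[int] = []
    for part in port_part.split(","):
        port = int(part)
        if not 1 <= port <= 65535:
            raise ValueError(
                f"Invalid port number: {port}. Must be between 1 and 65535"
            )
        ports.append(port)
    return host, ports


print(parse_host_ports_sketch("example.com:443,636,993"))
# ('example.com', [443, 636, 993])
```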
73
tests/test_compliance.py
Normal file
@@ -0,0 +1,73 @@
"""Tests for compliance checking functionality."""

from datetime import datetime


class TestComplianceChecks:
    """Tests for compliance validation logic."""

    def test_check_bsi_validity(self) -> None:
        """Test BSI cipher suite validity checking."""
        # Valid BSI-approved cipher suite (not expired)
        cipher_suite_valid = {
            "name": "TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256",
            "iana_recommended": "N",
            "bsi_approved": True,
            "bsi_valid_until": "2029",
        }
        # Check that the current year is before 2029
        current_year = datetime.now().year
        assert current_year < 2029, "Test assumes current year < 2029"
        # BSI-approved and valid should be compliant
        assert cipher_suite_valid["bsi_approved"] is True
        assert int(cipher_suite_valid["bsi_valid_until"]) > current_year

        # Expired BSI-approved cipher suite
        cipher_suite_expired = {
            "name": "TLS_OLD_CIPHER",
            "iana_recommended": "N",
            "bsi_approved": True,
            "bsi_valid_until": "2020",
        }
        # BSI-approved but expired should not be compliant
        assert cipher_suite_expired["bsi_approved"] is True
        assert int(cipher_suite_expired["bsi_valid_until"]) < current_year

        # No BSI data
        cipher_suite_no_bsi = {
            "name": "TLS_CHACHA20_POLY1305_SHA256",
            "iana_recommended": "Y",
            "bsi_approved": False,
            "bsi_valid_until": None,
        }
        # Without BSI approval, compliance depends on IANA
        assert cipher_suite_no_bsi["bsi_approved"] is False

    def test_check_iana_recommendation(self) -> None:
        """Test IANA recommendation checking."""
        # IANA-recommended cipher suite
        cipher_suite_recommended = {
            "name": "TLS_AES_256_GCM_SHA384",
            "iana_recommended": "Y",
            "bsi_approved": True,
            "bsi_valid_until": "2031",
        }
        assert cipher_suite_recommended["iana_recommended"] == "Y"

        # Cipher suite not recommended by IANA
        cipher_suite_not_recommended = {
            "name": "TLS_RSA_WITH_AES_128_CBC_SHA",
            "iana_recommended": "N",
            "bsi_approved": False,
            "bsi_valid_until": None,
        }
        assert cipher_suite_not_recommended["iana_recommended"] == "N"

        # No IANA data (should default to non-compliant)
        cipher_suite_no_iana = {
            "name": "TLS_UNKNOWN_CIPHER",
            "iana_recommended": None,
            "bsi_approved": False,
            "bsi_valid_until": None,
        }
        assert cipher_suite_no_iana["iana_recommended"] is None
297
tests/test_csv_export.py
Normal file
@@ -0,0 +1,297 @@
"""Tests for CSV export functionality."""

import csv
from pathlib import Path

from sslysze_scan.reporter.csv_export import generate_csv_reports


class TestCsvExport:
    """Tests for CSV file generation."""

    def test_export_summary(self, test_db_path: str, tmp_path: Path) -> None:
        """Test summary CSV export with aggregated statistics."""
        output_dir = tmp_path / "output"
        output_dir.mkdir()

        files = generate_csv_reports(test_db_path, 1, str(output_dir))

        summary_file = output_dir / "summary.csv"
        assert summary_file.exists()
        assert str(summary_file) in files

        with open(summary_file, newline="", encoding="utf-8") as f:
            reader = csv.reader(f)
            rows = list(reader)

        assert rows[0] == ["Metric", "Value"]
        assert len(rows) >= 7

        metrics = {row[0]: row[1] for row in rows[1:]}
        assert "Scanned Ports" in metrics
        assert "Ports with TLS Support" in metrics
        assert "Cipher Suites Checked" in metrics

    def test_export_cipher_suites_port_443(
        self, test_db_path: str, tmp_path: Path
    ) -> None:
        """Test cipher suites export for port 443."""
        output_dir = tmp_path / "output"
        output_dir.mkdir()

        files = generate_csv_reports(test_db_path, 1, str(output_dir))

        accepted_files = [
            f for f in files if "443_cipher_suites" in f and "accepted" in f
        ]
        assert len(accepted_files) > 0

        accepted_file = Path(accepted_files[0])
        assert accepted_file.exists()

        with open(accepted_file, newline="", encoding="utf-8") as f:
            reader = csv.reader(f)
            rows = list(reader)

        assert rows[0] == ["Cipher Suite", "IANA", "BSI", "Valid Until", "Compliant"]
        assert len(rows) > 1

        for row in rows[1:]:
            assert len(row) == 5
            assert row[4] in ["Yes", "No", "-"]

    def test_export_supported_groups_port_636(
        self, test_db_path: str, tmp_path: Path
    ) -> None:
        """Test supported groups export for port 636."""
        output_dir = tmp_path / "output"
        output_dir.mkdir()

        files = generate_csv_reports(test_db_path, 1, str(output_dir))

        groups_files = [f for f in files if "636_supported_groups.csv" in f]

        if groups_files:
            groups_file = Path(groups_files[0])
            assert groups_file.exists()

            with open(groups_file, newline="", encoding="utf-8") as f:
                reader = csv.reader(f)
                rows = list(reader)

            assert rows[0] == ["Group", "IANA", "BSI", "Valid Until", "Compliant"]

            for row in rows[1:]:
                assert len(row) == 5
                assert row[4] in ["Yes", "No", "-"]

    def test_export_missing_groups_port_443(
        self, test_db_path: str, tmp_path: Path
    ) -> None:
        """Test missing groups export for port 443."""
        output_dir = tmp_path / "output"
        output_dir.mkdir()

        files = generate_csv_reports(test_db_path, 1, str(output_dir))

        bsi_files = [f for f in files if "443_missing_groups_bsi.csv" in f]

        if bsi_files:
            bsi_file = Path(bsi_files[0])
            assert bsi_file.exists()

            with open(bsi_file, newline="", encoding="utf-8") as f:
                reader = csv.reader(f)
                rows = list(reader)

            assert rows[0] == ["Group", "TLS Versions", "Valid Until"]

            for row in rows[1:]:
                assert len(row) == 3

    def test_export_certificates_port_636(
        self, test_db_path: str, tmp_path: Path
    ) -> None:
        """Test certificates export for port 636."""
        output_dir = tmp_path / "output"
        output_dir.mkdir()

        files = generate_csv_reports(test_db_path, 1, str(output_dir))

        cert_files = [f for f in files if "636_certificates.csv" in f]

        if cert_files:
            cert_file = Path(cert_files[0])
            assert cert_file.exists()

            with open(cert_file, newline="", encoding="utf-8") as f:
                reader = csv.reader(f)
                rows = list(reader)

            expected_headers = [
                "Position",
                "Subject",
                "Issuer",
                "Valid From",
                "Valid Until",
                "Key Type",
                "Key Size",
                "Compliant",
            ]
            assert rows[0] == expected_headers

            for row in rows[1:]:
                assert len(row) == 8
                assert row[7] in ["Yes", "No", "-"]

    def test_export_vulnerabilities_port_443(
        self, test_db_path: str, tmp_path: Path
    ) -> None:
        """Test vulnerabilities export for port 443."""
        output_dir = tmp_path / "output"
        output_dir.mkdir()

        files = generate_csv_reports(test_db_path, 1, str(output_dir))

        vuln_files = [f for f in files if "443_vulnerabilities.csv" in f]

        if vuln_files:
            vuln_file = Path(vuln_files[0])
            assert vuln_file.exists()

            with open(vuln_file, newline="", encoding="utf-8") as f:
                reader = csv.reader(f)
                rows = list(reader)

            assert rows[0] == ["Type", "Vulnerable", "Details"]

            for row in rows[1:]:
                assert len(row) == 3
                assert row[1] in ["Yes", "No", "-"]

    def test_export_protocol_features_port_636(
        self, test_db_path: str, tmp_path: Path
    ) -> None:
        """Test protocol features export for port 636."""
        output_dir = tmp_path / "output"
        output_dir.mkdir()

        files = generate_csv_reports(test_db_path, 1, str(output_dir))

        protocol_files = [f for f in files if "636_protocol_features.csv" in f]

        if protocol_files:
            protocol_file = Path(protocol_files[0])
            assert protocol_file.exists()

            with open(protocol_file, newline="", encoding="utf-8") as f:
                reader = csv.reader(f)
                rows = list(reader)

            assert rows[0] == ["Feature", "Supported", "Details"]

            for row in rows[1:]:
                assert len(row) == 3
                assert row[1] in ["Yes", "No", "-"]

    def test_export_session_features_port_443(
        self, test_db_path: str, tmp_path: Path
    ) -> None:
        """Test session features export for port 443."""
        output_dir = tmp_path / "output"
        output_dir.mkdir()

        files = generate_csv_reports(test_db_path, 1, str(output_dir))

        session_files = [f for f in files if "443_session_features.csv" in f]

        if session_files:
            session_file = Path(session_files[0])
            assert session_file.exists()

            with open(session_file, newline="", encoding="utf-8") as f:
                reader = csv.reader(f)
                rows = list(reader)

            expected_headers = [
                "Feature",
                "Client Initiated",
                "Secure",
                "Session ID",
                "TLS Ticket",
                "Details",
            ]
            assert rows[0] == expected_headers

            for row in rows[1:]:
                assert len(row) == 6
                for i in range(1, 5):
                    assert row[i] in ["Yes", "No", "-"]

    def test_export_http_headers_port_636(
        self, test_db_path: str, tmp_path: Path
    ) -> None:
        """Test HTTP headers export for port 636."""
        output_dir = tmp_path / "output"
        output_dir.mkdir()

        files = generate_csv_reports(test_db_path, 1, str(output_dir))

        header_files = [f for f in files if "636_http_headers.csv" in f]

        if header_files:
            header_file = Path(header_files[0])
            assert header_file.exists()

            with open(header_file, newline="", encoding="utf-8") as f:
                reader = csv.reader(f)
                rows = list(reader)

            assert rows[0] == ["Header", "Present", "Value"]

            for row in rows[1:]:
                assert len(row) == 3
                assert row[1] in ["Yes", "No", "-"]

    def test_export_compliance_status_port_443(
        self, test_db_path: str, tmp_path: Path
    ) -> None:
        """Test compliance status export for port 443."""
        output_dir = tmp_path / "output"
        output_dir.mkdir()

        files = generate_csv_reports(test_db_path, 1, str(output_dir))

        compliance_files = [f for f in files if "443_compliance_status.csv" in f]

        if compliance_files:
            compliance_file = Path(compliance_files[0])
            assert compliance_file.exists()

            with open(compliance_file, newline="", encoding="utf-8") as f:
                reader = csv.reader(f)
                rows = list(reader)

            assert rows[0] == ["Category", "Checked", "Compliant", "Percentage"]

            for row in rows[1:]:
                assert len(row) == 4
                assert "%" in row[3]

    def test_generate_csv_reports_all_files(
        self, test_db_path: str, tmp_path: Path
    ) -> None:
        """Test that generate_csv_reports creates the expected files."""
        output_dir = tmp_path / "output"
        output_dir.mkdir()

        files = generate_csv_reports(test_db_path, 1, str(output_dir))

        assert len(files) > 0
        assert any("summary.csv" in f for f in files)
        assert any("443_" in f for f in files)
        assert any("636_" in f for f in files)

        for file_path in files:
            assert Path(file_path).exists()
            assert Path(file_path).suffix == ".csv"
67
tests/test_template_utils.py
Normal file
@@ -0,0 +1,67 @@
"""Tests for template utilities."""

from datetime import datetime
from typing import Any

from sslysze_scan.reporter.template_utils import (
    build_template_context,
    format_tls_version,
    generate_report_id,
)


class TestFormatTlsVersion:
    """Tests for format_tls_version function."""

    def test_format_tls_version_all_versions(self) -> None:
        """Test formatting all known TLS versions."""
        versions = ["1.0", "1.1", "1.2", "1.3", "ssl_3.0", "unknown"]
        expected = ["TLS 1.0", "TLS 1.1", "TLS 1.2", "TLS 1.3", "SSL 3.0", "unknown"]
        assert [format_tls_version(v) for v in versions] == expected


class TestGenerateReportId:
    """Tests for generate_report_id function."""

    def test_generate_report_id_valid_and_invalid(self) -> None:
        """Test report ID generation with valid and invalid timestamps."""
        # Valid timestamp
        metadata = {"timestamp": "2025-01-08T10:30:00.123456", "scan_id": 5}
        result = generate_report_id(metadata)
        assert result == "20250108_5"

        # Invalid timestamp falls back to the current date
        metadata = {"timestamp": "invalid", "scan_id": 5}
        result = generate_report_id(metadata)
        today = datetime.now().strftime("%Y%m%d")
        assert result == f"{today}_5"


class TestBuildTemplateContext:
    """Tests for build_template_context function."""

    def test_build_template_context_complete_and_partial(
        self, mock_scan_data: dict[str, Any]
    ) -> None:
        """Test context building with complete and partial data."""
        # Complete data
        context = build_template_context(mock_scan_data)
        assert context["scan_id"] == 5
        assert context["hostname"] == "example.com"
        assert context["fqdn"] == "example.com"
        assert context["ipv4"] == "192.168.1.1"
        assert context["ipv6"] == "2001:db8::1"
        assert context["timestamp"] == "08.01.2025 10:30"
        assert context["duration"] == "12.34"
        assert context["ports"] == "443, 636"
        assert "summary" in context
        assert "ports_data" in context

        # Verify ports_data is sorted by port
        ports = [p["port"] for p in context["ports_data"]]
        assert ports == sorted(ports)

        # Missing duration
        mock_scan_data["metadata"]["duration"] = None
        context = build_template_context(mock_scan_data)
        assert context["duration"] == "N/A"
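The shipped `format_tls_version` and `generate_report_id` are not shown in this commit; minimal, hypothetical reimplementations consistent with the assertions above (TLS/SSL prefixing with unknown values passed through, and a `YYYYMMDD_scanid` report ID that falls back to today's date on an unparseable timestamp) could look like this:

```python
from datetime import datetime
from typing import Any


def format_tls_version_sketch(version: str) -> str:
    """Map '1.2' -> 'TLS 1.2', 'ssl_3.0' -> 'SSL 3.0'; pass unknowns through.

    Illustrative sketch only — not the shipped format_tls_version.
    """
    if version.startswith("ssl_"):
        return f"SSL {version[4:]}"
    if version in {"1.0", "1.1", "1.2", "1.3"}:
        return f"TLS {version}"
    return version


def generate_report_id_sketch(metadata: dict[str, Any]) -> str:
    """Build 'YYYYMMDD_<scan_id>'; fall back to today on a bad timestamp.

    Illustrative sketch only — not the shipped generate_report_id.
    """
    try:
        date = datetime.fromisoformat(metadata["timestamp"]).strftime("%Y%m%d")
    except ValueError:
        date = datetime.now().strftime("%Y%m%d")
    return f"{date}_{metadata['scan_id']}"


print(format_tls_version_sketch("ssl_3.0"))  # SSL 3.0
print(generate_report_id_sketch({"timestamp": "2025-01-08T10:30:00.123456", "scan_id": 5}))
# 20250108_5
```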