refactor(readme): improve README, make it way more readable

2025-09-26 15:19:31 +02:00
parent a8fc2c07e8
commit 92a2b963c4

README.md
# 🚀 SAP HANA Automation Scripts
A collection of powerful Bash scripts designed to automate and simplify SAP HANA administration, monitoring, and management tasks.
## ✨ Key Features
* **Automate Everything**: Schedule routine backups, file cleanups, and schema refreshes.
* **Monitor Proactively**: Keep an eye on system health, disk space, and backup status with automated alerts.
* **Simplify Management**: Use powerful command-line tools and interactive menus for common tasks.
* **Secure**: Integrates with SAP's secure user store (`hdbuserstore`) for credential management.
* **Get Notified**: Receive completion and failure alerts via `ntfy.sh`.
## ⚙️ Quick Install
Get started in seconds. The interactive installer will guide you through selecting the tools you need.
```sh
bash -c "$(curl -sSL https://install.technopunk.space)"
```
## 🛠️ Tools Overview
The following scripts and suites are included. Suites are configured via a `.conf` file in their respective directories.
| Tool | Purpose & Core Function |
| :------------- | :------------------------------------------------------------------------------------------------------------------------------------------- |
| **`cleaner`** 🧹 | **File Cleaner**: Deletes files older than a specified retention period. Ideal for managing logs and temporary files. |
| **`hanatool`** 🗄️ | **HANA Management**: A powerful CLI tool to export/import schemas, perform full tenant backups, and compress artifacts. |
| **`keymanager`** 🔑 | **Key Manager**: An interactive menu to easily create, delete, and test `hdbuserstore` keys with an automatic rollback safety feature. |
| **`aurora`** 🌅 | **Schema Refresh Suite**: Automates refreshing a non-production schema from a production source. |
| **`backup`** 💾 | **Backup Suite**: A complete, cron-friendly solution for scheduling schema exports and/or full tenant backups with configurable compression. |
| **`monitor`** 📊 | **Monitoring Suite**: Continuously checks HANA process status, disk usage, log segments, and backup age, sending alerts when thresholds are breached. |
## 📖 Tool Details
### 1. `cleaner.sh` (File Cleaner) 🧹
* **Purpose**: Deletes files older than a specified retention period from given directories to help manage disk space.
* **Usage**: `./cleaner.sh <retention_days>:<path> [<retention_days>:<path> ...]`
* **Example**: `./cleaner.sh 30:/var/log 7:/tmp/downloads`
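Each `<retention_days>:<path>` pair splits cleanly with standard Bash parameter expansion; a minimal sketch of the idea (the script's actual parsing may differ):

```sh
# Split one "<retention_days>:<path>" argument into its two parts.
arg="30:/var/log"
days="${arg%%:*}"   # everything before the first colon -> retention in days
path="${arg#*:}"    # everything after the first colon  -> directory to clean
echo "$days $path"  # → 30 /var/log

# A deletion pass along these lines would then follow (hypothetical,
# shown as a comment only):
# find "$path" -type f -mtime +"$days" -delete
```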
### 2. `hanatool.sh` (SAP HANA Schema & Tenant Management) 🗄️
* **Purpose**: A versatile command-line utility for SAP HANA, enabling quick exports and imports of schemas, as well as full tenant backups.
* **Features**:
    * Export/import schemas (with optional renaming).
    * Perform full tenant backups.
    * Dry-run mode to preview commands.
    * `ntfy.sh` notifications for task completion/failure.
* **Usage**:
    * Schema export/import: `./hanatool.sh [USER_KEY] export|import [SCHEMA_NAME] [PATH] [OPTIONS]`
    * Schema rename on import: `./hanatool.sh [USER_KEY] import-rename [SCHEMA_NAME] [NEW_SCHEMA_NAME] [PATH] [OPTIONS]`
    * Tenant backup: `./hanatool.sh [USER_KEY] backup [PATH] [OPTIONS]`
* **Options**: `-t, --threads N`, `-c, --compress`, `-n, --dry-run`, `--ntfy <token>`, `--replace`, `--hdbsql <path>`, `-h, --help`
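As an illustration (the user-store key, schema name, and paths below are placeholders), a dry run previews the generated commands without touching the database:

```sh
# Preview a compressed, 4-thread schema export (nothing runs with --dry-run)
./hanatool.sh BACKUPKEY export MYSCHEMA /hana/exports -t 4 -c --dry-run

# Import the schema under a new name, replacing any existing objects
./hanatool.sh BACKUPKEY import-rename MYSCHEMA MYSCHEMA_COPY /hana/exports --replace
```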
### 4. `aurora/` (HANA Aurora Refresh Suite) 🌅
### 3. `keymanager.sh` (Secure User Store Key Manager) 🔑
* **Purpose**: An interactive script to simplify the creation, deletion, and testing of SAP HANA `hdbuserstore` keys.
* **Features**:
    * Interactive menu for easy key management.
    * Connection testing for existing keys.
    * Automatic rollback of a newly created key if its connection test fails.
* **Usage**: `./keymanager.sh` (runs an interactive menu)
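The rollback safety net boils down to a few lines of `hdbuserstore`/`hdbsql`; a sketch of the idea (key name, host, port, and credentials are placeholders, not taken from the script):

```sh
# Create a key, test it, and delete it again if the test fails
hdbuserstore SET NEWKEY "hanahost:30015" MONITOR_USER 'secret'
if ! hdbsql -U NEWKEY "SELECT 1 FROM DUMMY" >/dev/null 2>&1; then
    hdbuserstore DELETE NEWKEY   # roll back the unusable key
fi
```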
### 4. `aurora.sh` (HANA Aurora Refresh Suite) 🌅
* **Purpose**: Automates the refresh of a "copy" schema from a production source, ensuring non-production environments stay up-to-date.
* **Process**:
    1. Drops the existing target schema (optional).
    2. Exports the source schema from production.
    3. Imports and renames the data to the target schema.
    4. Runs post-import configuration and grants privileges.
* **Usage**: Designed for automated execution (typically via cron), with all settings managed in `aurora/aurora.conf`.
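Under the hood, these steps map to HANA SQL along the following lines (schema names, key names, and paths are illustrative, not taken from `aurora.conf`):

```sh
hdbsql -U AURORAKEY "DROP SCHEMA \"AURORA\" CASCADE"
hdbsql -U PRODKEY   "EXPORT \"PROD\".\"*\" AS BINARY INTO '/hana/export' WITH THREADS 4"
hdbsql -U AURORAKEY "IMPORT \"PROD\".\"*\" AS BINARY FROM '/hana/export' WITH RENAME SCHEMA \"PROD\" TO \"AURORA\""
hdbsql -U AURORAKEY "GRANT SELECT ON SCHEMA \"AURORA\" TO APP_USER"
```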
### 5. `backup.sh` (SAP HANA Automated Backup Suite) 💾
* **Purpose**: Provides automated, scheduled backups for SAP HANA databases.
* **Features**:
    * Supports schema exports, full tenant data backups, or both.
    * Configurable compression to save disk space.
    * Uses secure `hdbuserstore` keys for connections.
* **Usage**: `./backup/backup.sh` (all operational parameters are read from `backup/backup.conf`)
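For scheduling, a crontab entry along these lines works (the install path is hypothetical):

```sh
# Run the backup suite every night at 01:30 and keep a log of its output
30 1 * * * /opt/hana-scripts/backup/backup.sh >> /var/log/hana-backup.log 2>&1
```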
### 6. `monitor.sh` (SAP HANA Monitoring Suite) 📊
* **Purpose**: Continuously monitors critical aspects of SAP HANA and sends proactive alerts via `ntfy.sh` when predefined thresholds are exceeded.
* **Checks Performed**:
    * Verifies all HANA processes have a 'GREEN' status.
    * Monitors disk usage against a set threshold.
    * Analyzes log segment state.
    * Checks the age of the last successful data backup.
* **Usage**: `./monitor/monitor.sh` (all thresholds are read from `monitor/monitor.conf`)
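The disk-usage check, for instance, reduces to comparing `df` output against the configured threshold; a minimal sketch (the threshold and directory are illustrative, not the script's defaults):

```sh
threshold=90                      # alert above this usage percentage
dir=/                             # one of the monitored directories
# Take the "Use%" column from POSIX df output and strip the % sign
usage=$(df -P "$dir" | awk 'NR==2 {gsub("%","",$5); print $5}')
if [ "$usage" -gt "$threshold" ]; then
    echo "ALERT: $dir is at ${usage}% (threshold ${threshold}%)"
fi
```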