Mirror of https://github.com/fccview/cronmaster.git (synced 2025-12-23 22:18:20 -05:00)
.gitignore (vendored, 3 changed lines)
@@ -14,4 +14,5 @@ node_modules
 .idea
 tsconfig.tsbuildinfo
 docker-compose.test.yml
 /data
+claude.md
CONTRIBUTING.md (new file, 37 lines)
@@ -0,0 +1,37 @@
+# How to contribute
+
+Hi, it's amazing having a community willing to push new features to the app, and I am VERY open to contributors pushing their ideas; it's what makes open source amazing.
+
+That said, for the sake of sanity, let's all follow the same structure:
+
+- When creating a new branch, branch off from the `develop` branch; it is always ahead of `main` and is what gets released.
+- When creating a pull request, direct it back into `develop`. I'll then review and merge it; your code ends up in the next release that way, and we all avoid conflicts!
+- Please bear with me on reviews; it may take a bit of time for me to go through it all on top of life/work/hobbies :)
+
+## Some best practices
+
+### Code Quality
+
+- Follow the existing code style and structure.
+- Keep files modular and under 250-300 lines (split into smaller components if needed), unless it's a major server action; these can get intense, I know.
+- Avoid code duplication: reuse existing functions and UI components, and don't hardcode HTML when a component already exists (e.g. `<button>` vs `<Button>`).
+- All imports should be at the top of the file unless it's for specific server actions.
+- Avoid using `any`.
+- Don't hardcode colors! Use the theme variables to make sure light/dark mode keeps working well.
+- Make sure the UI is consistent with the current one and look out for spacing issues; consistent spacing really makes a difference.
+
+### Pull Requests
+
+- Keep PRs focused on a single feature or fix.
+- Update documentation if your changes affect user-facing features.
+- Test your changes locally before submitting.
+
+### Getting Started
+
+1. Fork the repository
+2. Create a feature branch from `develop`
+3. Make your changes
+4. Test thoroughly
+5. Submit a pull request to `develop`
+
+Thank you for contributing! <3
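The "avoid `any`" guideline above can be illustrated with a short sketch. The `CronJob` shape and `describe` helper here are hypothetical, invented for illustration only; they are not part of the repository:

```typescript
// Hypothetical example: give parameters an explicit shape instead of `any`.
interface CronJob {
  id: string;
  command: string;
  comment?: string;
}

// Instead of `function describe(job: any)`, type the parameter explicitly
// so the compiler catches missing or misspelled fields.
function describe(job: CronJob): string {
  return job.comment ? `${job.comment} (${job.command})` : job.command;
}
```

With the typed parameter, passing an object without a `command` field becomes a compile-time error rather than a runtime surprise.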
README.md (220 changed lines)
@@ -13,7 +13,6 @@
 - [Local Development](#local-development)
 - [Environment Variables](howto/ENV_VARIABLES.md)
 - [Authentication](#authentication)
-- [REST API](#rest-api)
 - [Usage](#usage)
 - [Viewing System Information](#viewing-system-information)
 - [Managing Cron Jobs](#managing-cron-jobs)
@@ -28,7 +27,7 @@
 ## Features
 
 - **Modern UI**: Beautiful, responsive interface with dark/light mode.
-- **System Information**: Display hostname, IP address, uptime, memory, network and CPU info.
+- **System Information**: Display uptime, memory, network, CPU, and GPU info.
 - **Cron Job Management**: View, create, and delete cron jobs with comments.
 - **Script management**: View, create, and delete bash scripts on the go to use within your cron jobs.
 - **Job Execution Logging**: Optional logging for cronjobs with automatic cleanup, capturing stdout, stderr, exit codes, and timestamps.
@@ -115,7 +114,7 @@ services:
 
 ## API
 
-`cr*nmaster` includes a REST API for programmatic access to your checklists and notes. This is perfect for integrations.
+`cr*nmaster` includes a REST API for programmatic access to your cron jobs and system information. This is perfect for integrations.
 
 📖 **For the complete API documentation, see [howto/API.md](howto/API.md)**
 
@@ -131,7 +130,7 @@ services:
 
 ## Localization
 
-`cr*nmaster` officially support [some languages](app/_transations) and allows you to create your custom translations locally on your own machine.
+`cr*nmaster` officially supports [some languages](app/_translations) and allows you to create custom translations locally on your own machine.
 
 📖 **For the complete Translations documentation, see [howto/TRANSLATIONS.md](howto/TRANSLATIONS.md)**
 
@@ -229,82 +228,11 @@ Cr\*nMaster supports SSO via OIDC (OpenID Connect), compatible with providers li
 - Entra ID (Azure AD)
 - And many more!
 
-For detailed setup instructions, see **[README_SSO.md](README_SSO.md)**
-
-Quick example:
-
-```yaml
-environment:
-  - SSO_MODE=oidc
-  - OIDC_ISSUER=https://your-sso-provider.com
-  - OIDC_CLIENT_ID=your_client_id
-  - APP_URL=https://your-cronmaster-domain.com
-```
-
-### Combined Authentication
-
 You can enable **both** password and SSO authentication simultaneously:
 
-```yaml
-environment:
-  - AUTH_PASSWORD=your_password
-  - SSO_MODE=oidc
-  - OIDC_ISSUER=https://your-sso-provider.com
-  - OIDC_CLIENT_ID=your_client_id
-```
-
 The login page will display both options, allowing users to choose their preferred method.
 
-### Security Features
+For detailed setup instructions, see **[howto/SSO.md](howto/SSO.md)**
 
-- ✅ **Secure session management** with cryptographically random session IDs
-- ✅ **30-day session expiration** with automatic cleanup
-- ✅ **HTTP-only cookies** to prevent XSS attacks
-- ✅ **Proper JWT verification** for OIDC tokens using provider's public keys (JWKS)
-- ✅ **PKCE support** for OIDC authentication (or confidential client mode)
-
-<a id="rest-api"></a>
-
-## REST API
-
-Cr\*nMaster provides a full REST API for programmatic access. Perfect for:
-
-- External monitoring tools
-- Automation scripts
-- CI/CD integrations
-- Custom dashboards
-
-### API Authentication
-
-Protect your API with an optional API key:
-
-```yaml
-environment:
-  - API_KEY=your-secret-api-key-here
-```
-
-Use the API key in your requests:
-
-```bash
-curl -H "Authorization: Bearer YOUR_API_KEY" \
-  https://your-domain.com/api/cronjobs
-```
-
-For complete API documentation with examples, see **[howto/API.md](howto/API.md)**
-
-### Available Endpoints
-
-- `GET /api/cronjobs` - List all cron jobs
-- `POST /api/cronjobs` - Create a new cron job
-- `GET /api/cronjobs/:id` - Get a specific cron job
-- `PATCH /api/cronjobs/:id` - Update a cron job
-- `DELETE /api/cronjobs/:id` - Delete a cron job
-- `POST /api/cronjobs/:id/execute` - Manually execute a job
-- `GET /api/scripts` - List all scripts
-- `POST /api/scripts` - Create a new script
-- `GET /api/system-stats` - Get system statistics
-- `GET /api/logs/stream?runId=xxx` - Stream job logs
-- `GET /api/events` - SSE stream for real-time updates
-
 <a id="usage"></a>
 
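The endpoint list and Bearer scheme removed above (now in howto/API.md) can be sketched from the client side. The endpoint path and `Authorization: Bearer` header come from the diff; the base URL and `buildRequest` helper are placeholders invented for illustration:

```typescript
// Hypothetical sketch: assemble an authenticated request to the cronjobs
// endpoint. Only the path and the Bearer scheme are taken from the docs above.
function buildRequest(
  baseUrl: string,
  apiKey: string
): { url: string; headers: Record<string, string> } {
  return {
    url: `${baseUrl}/api/cronjobs`,
    headers: { Authorization: `Bearer ${apiKey}` },
  };
}

// Usage sketch: const req = buildRequest("https://your-domain.com", key);
// fetch(req.url, { headers: req.headers })
```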
@@ -337,125 +265,7 @@ The application automatically detects your operating system and displays:
 
 ### Job Execution Logging
 
-CronMaster includes an optional logging feature that captures detailed execution information for your cronjobs:
+📖 **For complete logging documentation, see [howto/LOGS.md](howto/LOGS.md)**
 
-#### How It Works
-
-When you enable logging for a cronjob, CronMaster automatically wraps your command with a log wrapper script. This wrapper:
-
-- Captures **stdout** and **stderr** output
-- Records the **exit code** of your command
-- Timestamps the **start and end** of execution
-- Calculates **execution duration**
-- Stores all this information in organized log files
-
-#### Enabling Logs
-
-1. When creating or editing a cronjob, check the "Enable Logging" checkbox
-2. The wrapper is automatically added to your crontab entry
-3. Jobs run independently - they continue to work even if CronMaster is offline
-
-#### Log Storage
-
-Logs are stored in the `./data/logs/` directory with descriptive folder names:
-
-- If a job has a **description/comment**: `{sanitized-description}_{jobId}/`
-- If a job has **no description**: `{jobId}/`
-
-Example structure:
-
-```
-./data/logs/
-├── backup-database_root-0/
-│   ├── 2025-11-10_14-30-00.log
-│   ├── 2025-11-10_15-30-00.log
-│   └── 2025-11-10_16-30-00.log
-├── daily-cleanup_root-1/
-│   └── 2025-11-10_14-35-00.log
-├── root-2/ (no description provided)
-│   └── 2025-11-10_14-40-00.log
-```
-
-**Note**: Folder names are sanitized to be filesystem-safe (lowercase, alphanumeric with hyphens, max 50 chars for the description part).
-
-#### Log Format
-
-Each log file includes:
-
-```
-==========================================
-=== CronMaster Job Execution Log ===
-==========================================
-Log Folder: backup-database_root-0
-Command: bash /app/scripts/backup.sh
-Started: 2025-11-10 14:30:00
-==========================================
-
-[command output here]
-
-==========================================
-=== Execution Summary ===
-==========================================
-Completed: 2025-11-10 14:30:45
-Duration: 45 seconds
-Exit code: 0
-==========================================
-```
-
-#### Automatic Cleanup
-
-Logs are automatically cleaned up to prevent disk space issues:
-
-- **Maximum logs per job**: 50 log files
-- **Maximum age**: 30 days
-- **Cleanup trigger**: When viewing logs or after manual execution
-- **Method**: Oldest logs are deleted first when limits are exceeded
-
-#### Custom Wrapper Script
-
-You can override the default log wrapper by creating your own at `./data/wrapper-override.sh`. This allows you to:
-
-- Customize log format
-- Add additional metadata
-- Integrate with external logging services
-- Implement custom retention policies
-
-**Example custom wrapper**:
-
-```bash
-#!/bin/bash
-JOB_ID="$1"
-shift
-
-# Your custom logic here
-LOG_FILE="/custom/path/${JOB_ID}_$(date '+%Y%m%d').log"
-
-{
-  echo "=== Custom Log Format ==="
-  echo "Job: $JOB_ID"
-  "$@"
-  echo "Exit: $?"
-} >> "$LOG_FILE" 2>&1
-```
-
-#### Docker Considerations
-
-- Mount the `./data` directory to persist logs on the host
-- The wrapper script location: `./data/cron-log-wrapper.sh`. This will be generated automatically the first time you enable logging.
-
-#### Non-Docker Considerations
-
-- Logs are stored at `./data/logs/` relative to the project directory
-- The codebase wrapper script location: `./app/_scripts/cron-log-wrapper.sh`
-- The running wrapper script location: `./data/cron-log-wrapper.sh`
-
-#### Important Notes
-
-- Logging is **optional** and disabled by default
-- Jobs with logging enabled are marked with a blue "Logged" badge in the UI
-- Logs are captured for both scheduled runs and manual executions
-- Commands with file redirections (>, >>) may conflict with logging
-- The crontab stores the **wrapped command**, so jobs run independently of CronMaster
-
 ### Cron Schedule Format
 
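The log-folder naming rule described in the removed docs above (sanitized description, lowercase alphanumeric with hyphens, max 50 chars, suffixed with the job ID) can be sketched as a pure function. This is an assumption-laden reconstruction of the documented behaviour, not the project's actual implementation:

```typescript
// Sketch of the documented naming rule, e.g. "backup-database_root-0".
// The function name and exact sanitization steps are assumptions.
function logFolderName(jobId: string, description?: string): string {
  if (!description) return jobId;
  const sanitized = description
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-") // alphanumeric with hyphens
    .replace(/^-+|-+$/g, "")     // trim leading/trailing hyphens
    .slice(0, 50);               // max 50 chars for the description part
  return sanitized ? `${sanitized}_${jobId}` : jobId;
}
```

For example, a job with comment "Backup Database" and ID `root-0` would map to the `backup-database_root-0/` folder shown in the example tree.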
@@ -477,26 +287,6 @@ The application uses standard cron format: `* * * * *`
 4. **Delete Scripts**: Remove unwanted scripts (this won't delete the cronjob, you will need to manually remove these yourself)
 5. **Clone Scripts**: Clone scripts to quickly edit them in case they are similar to one another.
 
-<a id="technologies-used"></a>
-
-## Technologies Used
-
-- **Next.js 14**: React framework with App Router
-- **TypeScript**: Type-safe JavaScript
-- **Tailwind CSS**: Utility-first CSS framework
-- **Lucide React**: Beautiful icons
-- **next-themes**: Dark/light mode support
-- **Docker**: Containerization
-
-<a id="contributing"></a>
-
-## Contributing
-
-1. Fork the repository
-2. Create a feature branch from the `develop` branch
-3. Make your changes
-4. Submit a pull request to the `develop` branch
-
 ## Community shouts
 
 I would like to thank the following members for raising issues and help test/debug them!
 
@@ -64,9 +64,10 @@ export const CreateTaskModal = ({
   }, [selectedScript]);
 
   const handleScriptSelect = async (script: Script) => {
+    const scriptPath = await getHostScriptPath(script.filename);
     onFormChange({
       selectedScriptId: script.id,
-      command: await getHostScriptPath(script.filename),
+      command: scriptPath,
     });
   };
 
@@ -123,11 +124,10 @@ export const CreateTaskModal = ({
           <button
             type="button"
             onClick={handleCustomCommand}
-            className={`p-4 rounded-lg border-2 transition-all ${
-              !form.selectedScriptId
+            className={`p-4 rounded-lg border-2 transition-all ${!form.selectedScriptId
                 ? "border-primary bg-primary/5 text-primary"
                 : "border-border bg-muted/30 text-muted-foreground hover:border-border/60"
             }`}
           >
             <div className="flex items-center gap-3">
               <Terminal className="h-5 w-5" />
@@ -145,11 +145,10 @@ export const CreateTaskModal = ({
           <button
             type="button"
             onClick={() => setIsSelectScriptModalOpen(true)}
-            className={`p-4 rounded-lg border-2 transition-all ${
-              form.selectedScriptId
+            className={`p-4 rounded-lg border-2 transition-all ${form.selectedScriptId
                 ? "border-primary bg-primary/5 text-primary"
                 : "border-border bg-muted/30 text-muted-foreground hover:border-border/60"
             }`}
           >
             <div className="flex items-center gap-3">
               <FileText className="h-5 w-5" />
 
@@ -1,12 +1,13 @@
 "use client";
 
-import { useState, useEffect, useRef } from "react";
+import { useState, useEffect, useRef, useCallback } from "react";
 import { Loader2, CheckCircle2, XCircle, AlertTriangle, Minimize2, Maximize2 } from "lucide-react";
 import { Modal } from "@/app/_components/GlobalComponents/UIElements/Modal";
 import { Button } from "@/app/_components/GlobalComponents/UIElements/Button";
 import { useSSEContext } from "@/app/_contexts/SSEContext";
 import { SSEEvent } from "@/app/_utils/sse-events";
 import { usePageVisibility } from "@/app/_hooks/usePageVisibility";
+import { useTranslations } from "next-intl";
 
 interface LiveLogModalProps {
   isOpen: boolean;
@@ -26,6 +27,7 @@ export const LiveLogModal = ({
   jobId,
   jobComment,
 }: LiveLogModalProps) => {
+  const t = useTranslations();
   const [logContent, setLogContent] = useState<string>("");
   const [status, setStatus] = useState<"running" | "completed" | "failed">(
     "running"
@@ -40,6 +42,11 @@ export const LiveLogModal = ({
   const abortControllerRef = useRef<AbortController | null>(null);
   const [fileSize, setFileSize] = useState<number>(0);
   const [lineCount, setLineCount] = useState<number>(0);
+  const [maxLines, setMaxLines] = useState<number>(500);
+  const [totalLines, setTotalLines] = useState<number>(0);
+  const [truncated, setTruncated] = useState<boolean>(false);
+  const [showFullLog, setShowFullLog] = useState<boolean>(false);
+  const [isJobComplete, setIsJobComplete] = useState<boolean>(false);
 
   useEffect(() => {
     if (isOpen) {
@@ -49,92 +56,95 @@ export const LiveLogModal = ({
       setShowSizeWarning(false);
       setFileSize(0);
       setLineCount(0);
+      setShowFullLog(false);
+      setIsJobComplete(false);
     }
   }, [isOpen, runId]);
 
   useEffect(() => {
-    if (!isOpen || !runId || !isPageVisible) return;
+    if (isOpen && runId && !isJobComplete) {
+      lastOffsetRef.current = 0;
+      setLogContent("");
+      fetchLogs();
+    }
+  }, [maxLines]);
 
-  const fetchLogs = async () => {
+  const fetchLogs = useCallback(async () => {
     if (abortControllerRef.current) {
       abortControllerRef.current.abort();
     }
 
     const abortController = new AbortController();
     abortControllerRef.current = abortController;
 
     try {
-      const url = `/api/logs/stream?runId=${runId}&offset=${lastOffsetRef.current}`;
+      const url = `/api/logs/stream?runId=${runId}&offset=${lastOffsetRef.current}&maxLines=${maxLines}`;
       const response = await fetch(url, {
         signal: abortController.signal,
       });
       const data = await response.json();
 
       if (data.fileSize !== undefined) {
         lastOffsetRef.current = data.fileSize;
         setFileSize(data.fileSize);
 
-        if (data.fileSize > 10 * 1024 * 1024 && !showSizeWarning) {
+        if (data.fileSize > 10 * 1024 * 1024) {
           setShowSizeWarning(true);
         }
       }
 
+      if (data.totalLines !== undefined) {
+        setTotalLines(data.totalLines);
+      }
+      setLineCount(data.displayedLines || 0);
+
+      if (data.truncated !== undefined) {
+        setTruncated(data.truncated);
+      }
+
       if (lastOffsetRef.current === 0 && data.content) {
-        const lines = data.content.split("\n");
-        setLineCount(lines.length);
-
-        if (lines.length > MAX_LINES_FULL_RENDER) {
-          setTailMode(true);
-          setShowSizeWarning(true);
-          setLogContent(lines.slice(-TAIL_LINES).join("\n"));
-        } else {
-          setLogContent(data.content);
-        }
+        setLogContent(data.content);
+
+        if (data.truncated) {
+          setTailMode(true);
+        }
       } else if (data.newContent) {
         setLogContent((prev) => {
-          const newContent = prev + data.newContent;
-          const lines = newContent.split("\n");
-          setLineCount(lines.length);
-
-          if (lines.length > MAX_LINES_FULL_RENDER && !tailMode) {
-            setTailMode(true);
-            setShowSizeWarning(true);
-            return lines.slice(-TAIL_LINES).join("\n");
-          }
-
-          if (tailMode && lines.length > TAIL_LINES) {
-            return lines.slice(-TAIL_LINES).join("\n");
-          }
-
-          const maxLength = 50 * 1024 * 1024;
-          if (newContent.length > maxLength) {
-            setTailMode(true);
-            setShowSizeWarning(true);
-            const truncated = newContent.slice(-maxLength + 200);
-            const truncatedLines = truncated.split("\n");
-            return truncatedLines.slice(-TAIL_LINES).join("\n");
-          }
-
-          return newContent;
+          const combined = prev + data.newContent;
+          const lines = combined.split("\n");
+
+          if (lines.length > maxLines) {
+            return lines.slice(-maxLines).join("\n");
+          }
+
+          return combined;
         });
       }
 
-      setStatus(data.status || "running");
+      const jobStatus = data.status || "running";
+      setStatus(jobStatus);
+
+      if (jobStatus === "completed" || jobStatus === "failed") {
+        setIsJobComplete(true);
+      }
 
       if (data.exitCode !== undefined) {
         setExitCode(data.exitCode);
       }
     } catch (error: any) {
       if (error.name !== "AbortError") {
         console.error("Failed to fetch logs:", error);
       }
     }
-  };
+  }, [runId, maxLines]);
+
+  useEffect(() => {
+    if (!isOpen || !runId || !isPageVisible) return;
 
     fetchLogs();
 
     let interval: NodeJS.Timeout | null = null;
-    if (isPageVisible) {
+    if (isPageVisible && !isJobComplete) {
       interval = setInterval(fetchLogs, 3000);
     }
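The new append path in `fetchLogs` above keeps only the last `maxLines` lines of the combined log. That windowing step can be isolated as a pure function; the `windowLog` name is an editorial invention, but the logic mirrors the added lines in the hunk:

```typescript
// Sketch of the windowing applied when streamed content is appended:
// combine previous and new content, then keep only the last `maxLines` lines.
function windowLog(prev: string, newContent: string, maxLines: number): string {
  const combined = prev + newContent;
  const lines = combined.split("\n");
  if (lines.length > maxLines) {
    return lines.slice(-maxLines).join("\n");
  }
  return combined;
}
```

Keeping this logic inside a `useCallback` with `maxLines` in its dependency list (as the diff does) avoids the stale-closure problem, where the polling interval would otherwise keep calling a `fetchLogs` that captured an old `maxLines` value.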
@@ -146,7 +156,7 @@ export const LiveLogModal = ({
         abortControllerRef.current.abort();
       }
     };
-  }, [isOpen, runId, isPageVisible, showSizeWarning, tailMode]);
+  }, [isOpen, runId, isPageVisible, fetchLogs, isJobComplete]);
 
   useEffect(() => {
     if (!isOpen) return;
@@ -194,7 +204,7 @@ export const LiveLogModal = ({
   useEffect(() => {
     if (logEndRef.current) {
-      logEndRef.current.scrollIntoView({ behavior: "smooth" });
+      logEndRef.current.scrollIntoView({ behavior: "instant" });
     }
   }, [logContent]);
@@ -216,23 +226,23 @@ export const LiveLogModal = ({
   const titleWithStatus = (
     <div className="flex items-center gap-3">
-      <span>Live Job Execution{jobComment && `: ${jobComment}`}</span>
+      <span>{t("cronjobs.liveJobExecution")}{jobComment && `: ${jobComment}`}</span>
       {status === "running" && (
         <span className="flex items-center gap-1 text-sm text-blue-500">
           <Loader2 className="w-4 h-4 animate-spin" />
-          Running...
+          {t("cronjobs.running")}
         </span>
       )}
       {status === "completed" && (
         <span className="flex items-center gap-1 text-sm text-green-500">
           <CheckCircle2 className="w-4 h-4" />
-          Completed (Exit: {exitCode})
+          {t("cronjobs.completed", { exitCode: exitCode ?? 0 })}
         </span>
       )}
       {status === "failed" && (
         <span className="flex items-center gap-1 text-sm text-red-500">
           <XCircle className="w-4 h-4" />
-          Failed (Exit: {exitCode})
+          {t("cronjobs.jobFailed", { exitCode: exitCode ?? 1 })}
         </span>
       )}
     </div>
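The hunk above swaps hardcoded strings for `t("cronjobs.completed", { exitCode })`-style calls. A minimal sketch of how such an interpolated message resolves, assuming a catalogue entry shaped after the old hardcoded string (the message text and the `format` helper are illustrative only; next-intl's real ICU handling is richer):

```typescript
// Hypothetical catalogue entry, modelled on the removed hardcoded string.
const messages: Record<string, string> = {
  "cronjobs.completed": "Completed (Exit: {exitCode})",
};

// Minimal placeholder interpolation: replace each {name} with its value.
function format(key: string, values: Record<string, string | number>): string {
  return messages[key].replace(/\{(\w+)\}/g, (_, name) => String(values[name]));
}
```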
 
@@ -247,13 +257,82 @@ export const LiveLogModal = ({
       preventCloseOnClickOutside={status === "running"}
     >
       <div className="space-y-4">
+        <div className="flex items-center justify-between gap-4">
+          <div className="flex items-center gap-3">
+            {!showFullLog ? (
+              <>
+                <label htmlFor="maxLines" className="text-sm text-muted-foreground">
+                  {t("cronjobs.showLast")}
+                </label>
+                <select
+                  id="maxLines"
+                  value={maxLines}
+                  onChange={(e) => setMaxLines(parseInt(e.target.value, 10))}
+                  className="bg-background border border-border rounded px-2 py-1 text-sm"
+                >
+                  <option value="100">{t("cronjobs.nLines", { count: "100" })}</option>
+                  <option value="500">{t("cronjobs.nLines", { count: "500" })}</option>
+                  <option value="1000">{t("cronjobs.nLines", { count: "1,000" })}</option>
+                  <option value="2000">{t("cronjobs.nLines", { count: "2,000" })}</option>
+                  <option value="5000">{t("cronjobs.nLines", { count: "5,000" })}</option>
+                </select>
+                {truncated && (
+                  <Button
+                    type="button"
+                    variant="outline"
+                    size="sm"
+                    onClick={() => {
+                      setShowFullLog(true);
+                      setMaxLines(50000);
+                    }}
+                    className="text-xs"
+                  >
+                    {totalLines > 0
+                      ? t("cronjobs.viewFullLog", { totalLines: totalLines.toLocaleString() })
+                      : t("cronjobs.viewFullLogNoCount")}
+                  </Button>
+                )}
+              </>
+            ) : (
+              <div className="flex items-center gap-2">
+                <span className="text-sm text-muted-foreground">
+                  {totalLines > 0
+                    ? t("cronjobs.viewingFullLog", { totalLines: totalLines.toLocaleString() })
+                    : t("cronjobs.viewingFullLogNoCount")}
+                </span>
+                <Button
+                  type="button"
+                  variant="outline"
+                  size="sm"
+                  onClick={() => {
+                    setShowFullLog(false);
+                    setMaxLines(500);
+                  }}
+                  className="text-xs"
+                >
+                  {t("cronjobs.backToWindowedView")}
+                </Button>
+              </div>
+            )}
+          </div>
+          {truncated && !showFullLog && (
+            <div className="text-sm text-orange-500 flex items-center gap-1">
+              <AlertTriangle className="h-4 w-4" />
+              {t("cronjobs.showingLastOf", {
+                lineCount: lineCount.toLocaleString(),
+                totalLines: totalLines.toLocaleString()
+              })}
+            </div>
+          )}
+        </div>
+
         {showSizeWarning && (
           <div className="bg-orange-500/10 border border-orange-500/30 rounded-lg p-3 flex items-start gap-3">
             <AlertTriangle className="h-4 w-4 text-orange-500 mt-0.5 flex-shrink-0" />
             <div className="flex-1 min-w-0">
               <p className="text-sm text-foreground">
-                <span className="font-medium">Large log file detected</span> ({formatFileSize(fileSize)})
-                {tailMode && ` - Tail mode enabled, showing last ${TAIL_LINES.toLocaleString()} lines`}
+                <span className="font-medium">{t("cronjobs.largeLogFileDetected")}</span> ({formatFileSize(fileSize)})
+                {tailMode && ` - ${t("cronjobs.tailModeEnabled", { tailLines: TAIL_LINES.toLocaleString() })}`}
               </p>
             </div>
             <Button
@@ -262,7 +341,7 @@ export const LiveLogModal = ({
               size="sm"
               onClick={toggleTailMode}
               className="text-orange-500 hover:text-orange-400 hover:bg-orange-500/10 h-auto py-1 px-2 text-xs"
-              title={tailMode ? "Show all lines" : "Enable tail mode"}
+              title={tailMode ? t("cronjobs.showAllLines") : t("cronjobs.enableTailMode")}
             >
               {tailMode ? <Maximize2 className="h-3 w-3" /> : <Minimize2 className="h-3 w-3" />}
             </Button>
@@ -271,20 +350,14 @@ export const LiveLogModal = ({
 
       <div className="bg-black/90 dark:bg-black/60 rounded-lg p-4 max-h-[60vh] overflow-auto">
         <pre className="text-xs font-mono text-green-400 whitespace-pre-wrap break-words">
-          {logContent ||
+          {logContent || t("cronjobs.waitingForJobToStart")}
|
||||||
"Waiting for job to start...\n\nLogs will appear here in real-time."}
|
|
||||||
<div ref={logEndRef} />
|
<div ref={logEndRef} />
|
||||||
</pre>
|
</pre>
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
<div className="flex justify-between items-center text-xs text-muted-foreground">
|
<div className="flex justify-between items-center text-xs text-muted-foreground">
|
||||||
<span>
|
<span>
|
||||||
Run ID: {runId} | Job ID: {jobId}
|
{t("cronjobs.runIdJobId", { runId, jobId })}
|
||||||
</span>
|
|
||||||
<span>
|
|
||||||
{lineCount.toLocaleString()} lines
|
|
||||||
{tailMode && ` (showing last ${TAIL_LINES.toLocaleString()})`}
|
|
||||||
{fileSize > 0 && ` • ${formatFileSize(fileSize)}`}
|
|
||||||
</span>
|
</span>
|
||||||
</div>
|
</div>
|
||||||
</div>
|
</div>
|
||||||
|
|||||||
@@ -65,6 +65,7 @@ export const SystemInfoCard = ({
 const [systemInfo, setSystemInfo] =
 useState<SystemInfoType>(initialSystemInfo);
 const [isUpdating, setIsUpdating] = useState(false);
+const [isDisabled, setIsDisabled] = useState(false);
 const t = useTranslations();
 const { subscribe } = useSSEContext();
 const isPageVisible = usePageVisibility();
@@ -72,6 +73,10 @@ export const SystemInfoCard = ({
 const abortControllerRef = useRef<AbortController | null>(null);
 
 const updateSystemInfo = async () => {
+if (isDisabled) {
+return;
+}
+
 if (abortControllerRef.current) {
 abortControllerRef.current.abort();
 }
@@ -88,13 +93,17 @@ export const SystemInfoCard = ({
 throw new Error("Failed to fetch system stats");
 }
 const freshData = await response.json();
+if (freshData === null) {
+setIsDisabled(true);
+return;
+}
 setSystemInfo(freshData);
 } catch (error: any) {
 if (error.name !== "AbortError") {
 console.error("Failed to update system info:", error);
 }
 } finally {
-if (!abortController.signal.aborted) {
+if (!abortControllerRef.current?.signal.aborted) {
 setIsUpdating(false);
 }
 }
@@ -102,7 +111,7 @@ export const SystemInfoCard = ({
 
 useEffect(() => {
 const unsubscribe = subscribe((event: SSEEvent) => {
-if (event.type === "system-stats") {
+if (event.type === "system-stats" && event.data !== null) {
 setSystemInfo(event.data);
 }
 });
@@ -129,16 +138,16 @@ export const SystemInfoCard = ({
 let timeoutId: NodeJS.Timeout | null = null;
 
 const doUpdate = () => {
-if (!mounted || !isPageVisible) return;
+if (!mounted || !isPageVisible || isDisabled) return;
 updateTime();
 updateSystemInfo().finally(() => {
-if (mounted && isPageVisible) {
+if (mounted && isPageVisible && !isDisabled) {
 timeoutId = setTimeout(doUpdate, updateInterval);
 }
 });
 };
 
-if (isPageVisible) {
+if (isPageVisible && !isDisabled) {
 timeoutId = setTimeout(doUpdate, updateInterval);
 }
 
@@ -151,7 +160,7 @@ export const SystemInfoCard = ({
 abortControllerRef.current.abort();
 }
 };
-}, [isPageVisible]);
+}, [isPageVisible, isDisabled]);
 
 const quickStats = {
 cpu: systemInfo.cpu.usage,
@@ -201,15 +210,15 @@ export const SystemInfoCard = ({
 },
 ...(systemInfo.network
 ? [
 {
 icon: Wifi,
 label: t("sidebar.network"),
 value: `${systemInfo.network.latency}ms`,
 detail: `${systemInfo.network.latency}ms latency • ${systemInfo.network.speed}`,
 status: systemInfo.network.status,
 color: "text-teal-500",
 },
 ]
 : []),
 ];
 
@@ -226,12 +235,12 @@ export const SystemInfoCard = ({
 },
 ...(systemInfo.network
 ? [
 {
 label: t("sidebar.networkLatency"),
 value: `${systemInfo.network.latency}ms`,
 status: systemInfo.network.status,
 },
 ]
 : []),
 ];
 
@@ -290,7 +299,7 @@ export const SystemInfoCard = ({
 {t("sidebar.statsUpdateEvery")}{" "}
 {Math.round(
 parseInt(process.env.NEXT_PUBLIC_CLOCK_UPDATE_INTERVAL || "30000") /
 1000
 )}
 s • {t("sidebar.networkSpeedEstimatedFromLatency")}
 {isUpdating && (
@@ -2,7 +2,7 @@
 
 import { revalidatePath } from "next/cache";
 import { writeFile, readFile, unlink, mkdir } from "fs/promises";
-import { join } from "path";
+import path from "path";
 import { existsSync } from "fs";
 import { exec } from "child_process";
 import { promisify } from "util";
@@ -13,10 +13,6 @@ import { isDocker, getHostScriptsPath } from "@/app/_server/actions/global";
 
 const execAsync = promisify(exec);
 
-export const getScriptPath = (filename: string): string => {
-return join(process.cwd(), SCRIPTS_DIR, filename);
-};
-
 export const getScriptPathForCron = async (
 filename: string
 ): Promise<string> => {
@@ -25,19 +21,19 @@ export const getScriptPathForCron = async (
 if (docker) {
 const hostScriptsPath = await getHostScriptsPath();
 if (hostScriptsPath) {
-return `bash ${join(hostScriptsPath, filename)}`;
+return `bash ${path.join(hostScriptsPath, filename)}`;
 }
 console.warn("Could not determine host scripts path, using container path");
 }
 
-return `bash ${join(process.cwd(), SCRIPTS_DIR, filename)}`;
+return `bash ${path.join(process.cwd(), SCRIPTS_DIR, filename)}`;
 };
 
-export const getHostScriptPath = (filename: string): string => {
+export const getHostScriptPath = async (filename: string): Promise<string> => {
-return `bash ${join(process.cwd(), SCRIPTS_DIR, filename)}`;
+return `bash ${path.join(process.cwd(), SCRIPTS_DIR, filename)}`;
 };
 
-export const normalizeLineEndings = (content: string): string => {
+export const normalizeLineEndings = async (content: string): Promise<string> => {
 return content.replace(/\r\n/g, "\n").replace(/\r/g, "\n");
 };
 
@@ -65,14 +61,14 @@ const generateUniqueFilename = async (baseName: string): Promise<string> => {
 };
 
 const ensureScriptsDirectory = async () => {
-const scriptsDir = join(process.cwd(), SCRIPTS_DIR);
+const scriptsDir = path.join(process.cwd(), SCRIPTS_DIR);
 if (!existsSync(scriptsDir)) {
 await mkdir(scriptsDir, { recursive: true });
 }
 };
 
 const ensureHostScriptsDirectory = async () => {
-const hostScriptsDir = join(process.cwd(), SCRIPTS_DIR);
+const hostScriptsDir = path.join(process.cwd(), SCRIPTS_DIR);
 if (!existsSync(hostScriptsDir)) {
 await mkdir(hostScriptsDir, { recursive: true });
 }
@@ -81,7 +77,7 @@ const ensureHostScriptsDirectory = async () => {
 const saveScriptFile = async (filename: string, content: string) => {
 await ensureScriptsDirectory();
 
-const scriptPath = getScriptPath(filename);
+const scriptPath = path.join(process.cwd(), SCRIPTS_DIR, filename);
 await writeFile(scriptPath, content, "utf8");
 
 try {
@@ -92,7 +88,7 @@ const saveScriptFile = async (filename: string, content: string) => {
 };
 
 const deleteScriptFile = async (filename: string) => {
-const scriptPath = getScriptPath(filename);
+const scriptPath = path.join(process.cwd(), SCRIPTS_DIR, filename);
 if (existsSync(scriptPath)) {
 await unlink(scriptPath);
 }
@@ -125,7 +121,7 @@ export const createScript = async (
 
 `;
 
-const normalizedContent = normalizeLineEndings(content);
+const normalizedContent = await normalizeLineEndings(content);
 const fullContent = metadataHeader + normalizedContent;
 
 await saveScriptFile(filename, fullContent);
@@ -176,7 +172,7 @@ export const updateScript = async (
 
 `;
 
-const normalizedContent = normalizeLineEndings(content);
+const normalizedContent = await normalizeLineEndings(content);
 const fullContent = metadataHeader + normalizedContent;
 
 await saveScriptFile(existingScript.filename, fullContent);
@@ -235,7 +231,7 @@ export const cloneScript = async (
 
 `;
 
-const normalizedContent = normalizeLineEndings(originalContent);
+const normalizedContent = await normalizeLineEndings(originalContent);
 const fullContent = metadataHeader + normalizedContent;
 
 await saveScriptFile(filename, fullContent);
@@ -262,7 +258,7 @@ export const cloneScript = async (
 
 export const getScriptContent = async (filename: string): Promise<string> => {
 try {
-const scriptPath = getScriptPath(filename);
+const scriptPath = path.join(process.cwd(), SCRIPTS_DIR, filename);
 
 if (existsSync(scriptPath)) {
 const content = await readFile(scriptPath, "utf8");
@@ -299,7 +295,7 @@ export const executeScript = async (
 }> => {
 try {
 await ensureHostScriptsDirectory();
-const hostScriptPath = getHostScriptPath(filename);
+const hostScriptPath = await getHostScriptPath(filename);
 
 if (!existsSync(hostScriptPath)) {
 return {
@@ -87,7 +87,26 @@
 "both": "Both",
 "minimalMode": "Minimal Mode",
 "minimalModeDescription": "Show compact view with icons instead of full text",
-"applyFilters": "Apply Filters"
+"applyFilters": "Apply Filters",
+"nLines": "{count} lines",
+"liveJobExecution": "Live Job Execution",
+"running": "Running...",
+"completed": "Completed (Exit: {exitCode})",
+"jobFailed": "Failed (Exit: {exitCode})",
+"showLast": "Show last:",
+"viewFullLog": "View Full Log ({totalLines} lines)",
+"viewFullLogNoCount": "View Full Log",
+"viewingFullLog": "Viewing full log ({totalLines} lines)",
+"viewingFullLogNoCount": "Viewing full log",
+"backToWindowedView": "Back to Windowed View",
+"showingLastOf": "Showing last {lineCount} of {totalLines} lines",
+"showingLastLines": "Showing last {lineCount} lines",
+"largeLogFileDetected": "Large log file detected",
+"tailModeEnabled": "Tail mode enabled, showing last {tailLines} lines",
+"showAllLines": "Show all lines",
+"enableTailMode": "Enable tail mode",
+"waitingForJobToStart": "Waiting for job to start...\n\nLogs will appear here in real-time.",
+"runIdJobId": "Run ID: {runId} | Job ID: {jobId}"
 },
 "scripts": {
 "scripts": "Scripts",
@@ -86,7 +86,23 @@
 "both": "Entrambi",
 "minimalMode": "Modalità Minima",
 "minimalModeDescription": "Mostra vista compatta con icone invece del testo completo",
-"applyFilters": "Applica Filtri"
+"applyFilters": "Applica Filtri",
+"nLines": "{count} linee",
+"liveJobExecution": "Esecuzione Lavoro Live",
+"running": "In esecuzione...",
+"completed": "Completato (Exit: {exitCode})",
+"jobFailed": "Fallito (Exit: {exitCode})",
+"showLast": "Mostra ultime:",
+"viewFullLog": "Visualizza Log Completo ({totalLines} linee)",
+"viewingFullLog": "Visualizzazione log completo ({totalLines} linee)",
+"backToWindowedView": "Torna alla Vista Finestrata",
+"showingLastOf": "Mostrando ultime {lineCount} di {totalLines} linee",
+"largeLogFileDetected": "Rilevato file di log di grandi dimensioni",
+"tailModeEnabled": "Modalità tail abilitata, mostrando ultime {tailLines} linee",
+"showAllLines": "Mostra tutte le linee",
+"enableTailMode": "Abilita modalità tail",
+"waitingForJobToStart": "In attesa che il lavoro inizi...\n\nI log appariranno qui in tempo reale.",
+"runIdJobId": "ID Esecuzione: {runId} | ID Lavoro: {jobId}"
 },
 "scripts": {
 "scripts": "Script",
@@ -11,6 +11,7 @@ import {
 } from "./running-jobs-utils";
 import { sseBroadcaster } from "./sse-broadcaster";
 import { generateLogFolderName, cleanupOldLogFiles } from "./wrapper-utils";
+import { watchForLogFile } from "./log-watcher";
 
 const execAsync = promisify(exec);
 
@@ -84,18 +85,29 @@ export const runJobInBackground = async (
 
 child.unref();
 
+const jobStartTime = new Date();
+
 saveRunningJob({
 id: runId,
 cronJobId: job.id,
 pid: child.pid!,
-startTime: new Date().toISOString(),
+startTime: jobStartTime.toISOString(),
 status: "running",
 logFolderName,
 });
 
+watchForLogFile(runId, logFolderName, jobStartTime, (logFileName) => {
+try {
+updateRunningJob(runId, { logFileName });
+console.log(`[RunningJob] Cached logFileName for ${runId}: ${logFileName}`);
+} catch (error) {
+console.error(`[RunningJob] Failed to cache logFileName for ${runId}:`, error);
+}
+});
+
 sseBroadcaster.broadcast({
 type: "job-started",
-timestamp: new Date().toISOString(),
+timestamp: jobStartTime.toISOString(),
 data: {
 runId,
 cronJobId: job.id,
@@ -95,3 +95,62 @@ export const stopLogWatcher = () => {
 watcher = null;
 }
 };
+
+export const watchForLogFile = (
+runId: string,
+logFolderName: string,
+jobStartTime: Date,
+callback: (logFileName: string) => void
+): NodeJS.Timeout => {
+const logDir = path.join(LOGS_DIR, logFolderName);
+const startTime = jobStartTime.getTime();
+const maxAttempts = 30;
+let attempts = 0;
+
+const checkInterval = setInterval(() => {
+attempts++;
+
+if (attempts > maxAttempts) {
+console.warn(`[LogWatcher] Timeout waiting for log file for ${runId}`);
+clearInterval(checkInterval);
+return;
+}
+
+try {
+if (!existsSync(logDir)) {
+return;
+}
+
+const files = readdirSync(logDir);
+const logFiles = files
+.filter((f) => f.endsWith(".log"))
+.map((f) => {
+const filePath = path.join(logDir, f);
+try {
+const stats = statSync(filePath);
+return {
+name: f,
+birthtime: stats.birthtime || stats.mtime,
+};
+} catch {
+return null;
+}
+})
+.filter((f): f is { name: string; birthtime: Date } => f !== null);
+
+const matchingFile = logFiles.find((f) => {
+const fileTime = f.birthtime.getTime();
+return fileTime >= startTime - 5000 && fileTime <= startTime + 30000;
+});
+
+if (matchingFile) {
+clearInterval(checkInterval);
+callback(matchingFile.name);
+}
+} catch (error) {
+console.error(`[LogWatcher] Error watching for log file ${runId}:`, error);
+}
+}, 500);
+
+return checkInterval;
+};
@@ -86,4 +86,4 @@ export const getScriptById = (
 id: string
 ): Script | undefined => {
 return scripts.find((script) => script.id === id);
-}
+};
@@ -1,6 +1,6 @@
 import { NextRequest, NextResponse } from "next/server";
 import { getRunningJob } from "@/app/_utils/running-jobs-utils";
-import { readFile } from "fs/promises";
+import { readFile, open } from "fs/promises";
 import { existsSync } from "fs";
 import path from "path";
 import { requireAuth } from "@/app/_utils/api-auth-utils";
@@ -17,6 +17,11 @@ export const GET = async (request: NextRequest) => {
 const offsetStr = searchParams.get("offset");
 const offset = offsetStr ? parseInt(offsetStr, 10) : 0;
 
+const maxLinesStr = searchParams.get("maxLines");
+const maxLines = maxLinesStr
+? Math.min(Math.max(parseInt(maxLinesStr, 10), 100), 5000)
+: 500;
+
 if (!runId) {
 return NextResponse.json(
 { error: "runId parameter is required" },
@@ -136,42 +141,70 @@ export const GET = async (request: NextRequest) => {
 
 const fileSize = latestStats.size;
 
-const MAX_RESPONSE_SIZE = 1024 * 1024;
+let displayedLines: string[] = [];
-const MAX_TOTAL_SIZE = 10 * 1024 * 1024;
+let truncated = false;
+let totalLines = 0;
 let content = "";
 let newContent = "";
 
-if (fileSize > MAX_TOTAL_SIZE) {
+if (offset === 0) {
-const startPos = Math.max(0, fileSize - MAX_TOTAL_SIZE);
+const AVERAGE_LINE_LENGTH = 100;
-const buffer = Buffer.alloc(MAX_TOTAL_SIZE);
+const ESTIMATED_BYTES = maxLines * AVERAGE_LINE_LENGTH * 2;
-const { open } = await import("fs/promises");
+const bytesToRead = Math.min(ESTIMATED_BYTES, fileSize);
-const fileHandle = await open(latestLogFile, "r");
 
-try {
+if (bytesToRead < fileSize) {
-await fileHandle.read(buffer, 0, MAX_TOTAL_SIZE, startPos);
+const fileHandle = await open(latestLogFile, "r");
-content = buffer.toString("utf-8");
+const buffer = Buffer.alloc(bytesToRead);
-newContent = content.slice(Math.max(0, offset - startPos));
+await fileHandle.read(buffer, 0, bytesToRead, fileSize - bytesToRead);
-} finally {
 await fileHandle.close();
 
+const tailContent = buffer.toString("utf-8");
+const lines = tailContent.split("\n");
+
+if (lines[0] && lines[0].length > 0) {
+lines.shift();
+}
+
+if (lines.length > maxLines) {
+displayedLines = lines.slice(-maxLines);
+truncated = true;
+} else {
+displayedLines = lines;
+truncated = true;
+}
+} else {
+const fullContent = await readFile(latestLogFile, "utf-8");
+const allLines = fullContent.split("\n");
+totalLines = allLines.length;
+
+if (totalLines > maxLines) {
+displayedLines = allLines.slice(-maxLines);
+truncated = true;
+} else {
+displayedLines = allLines;
+}
 }
 
-if (startPos > 0) {
+if (truncated) {
-content = `[LOG TRUNCATED - Showing last ${MAX_TOTAL_SIZE / 1024 / 1024
+content = `[LOG TRUNCATED - Showing last ${maxLines} lines (${(fileSize / 1024 / 1024).toFixed(2)}MB total)]\n\n` + displayedLines.join("\n");
-}MB of ${fileSize / 1024 / 1024}MB total]\n\n${content}`;
+} else {
+content = displayedLines.join("\n");
+totalLines = displayedLines.length;
 }
+newContent = content;
 } else {
-const fullContent = await readFile(latestLogFile, "utf-8");
+if (offset < fileSize) {
+const fileHandle = await open(latestLogFile, "r");
+const bytesToRead = fileSize - offset;
+const buffer = Buffer.alloc(bytesToRead);
+await fileHandle.read(buffer, 0, bytesToRead, offset);
+await fileHandle.close();
 
-if (offset > 0 && offset < fileSize) {
+newContent = buffer.toString("utf-8");
-newContent = fullContent.slice(offset);
+const newLines = newContent.split("\n").filter(l => l.length > 0);
-content = newContent;
+if (newLines.length > 0) {
-} else if (offset === 0) {
+content = newContent;
-content = fullContent;
+}
-newContent = fullContent;
-} else if (offset >= fileSize) {
-content = "";
-newContent = "";
 }
 }
 
@@ -185,6 +218,9 @@ export const GET = async (request: NextRequest) => {
 exitCode: job.exitCode,
 fileSize,
 offset,
+totalLines: offset === 0 && !truncated ? totalLines : undefined,
+displayedLines: displayedLines.length,
+truncated,
 });
 } catch (error: any) {
 console.error("Error streaming log:", error);
@@ -18,6 +18,11 @@ export const dynamic = "force-dynamic";
 export const GET = async (request: NextRequest) => {
 const authError = await requireAuth(request);
 if (authError) return authError;
 
+if (process.env.DISABLE_SYSTEM_STATS === "true") {
+return NextResponse.json(null);
+}
+
 try {
 const t = await getTranslations();
 
@@ -71,8 +76,8 @@ export const GET = async (request: NextRequest) => {
 network: {
 speed:
 mainInterface &&
 mainInterface.rx_sec != null &&
 mainInterface.tx_sec != null
 ? `${Math.round(rxSpeed + txSpeed)} Mbps`
 : t("system.unknown"),
 latency: latency,
@@ -90,9 +90,11 @@ export default async function Home() {
 </div>
 </header>
 
-<SystemInfoCard systemInfo={initialSystemInfo} />
+{process.env.DISABLE_SYSTEM_STATS !== "true" && (
+<SystemInfoCard systemInfo={initialSystemInfo} />
+)}
 
-<main className="lg:ml-80 transition-all duration-300 ml-0 sidebar-collapsed:lg:ml-16">
+<main className={`${process.env.DISABLE_SYSTEM_STATS === "true" ? "lg:ml-0" : "lg:ml-80"} transition-all duration-300 ml-0 sidebar-collapsed:lg:ml-16`}>
 <div className="container mx-auto px-4 py-8 lg:px-8">
 <WrapperScriptWarning />
 <TabbedInterface cronJobs={cronJobs} scripts={scripts} />
227
howto/API.md
@@ -106,6 +106,104 @@ curl -H "Authorization: Bearer YOUR_API_KEY" \

---

### PATCH /api/cronjobs/:id

Update a cron job.

**Parameters:**

- `id` (string) - Cron job ID

**Request:**

```json
{
  "schedule": "0 3 * * *",
  "command": "/usr/bin/echo updated",
  "comment": "Updated job",
  "logsEnabled": true
}
```

**Response:**

```json
{
  "success": true,
  "message": "Cron job updated successfully"
}
```

**Example:**

```bash
curl -X PATCH \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"schedule":"0 3 * * *","command":"/usr/bin/echo updated"}' \
  https://your-cronmaster-url.com/api/cronjobs/fccview-0
```

---

### DELETE /api/cronjobs/:id

Delete a cron job.

**Parameters:**

- `id` (string) - Cron job ID

**Response:**

```json
{
  "success": true,
  "message": "Cron job deleted successfully"
}
```

**Example:**

```bash
curl -X DELETE \
  -H "Authorization: Bearer YOUR_API_KEY" \
  https://your-cronmaster-url.com/api/cronjobs/fccview-0
```

---

### GET /api/cronjobs/:id/execute

Manually execute a cron job.

**Parameters:**

- `id` (string) - Cron job ID

**Query Parameters:**

- `runInBackground` (boolean, optional) - Whether to run the job in the background. Defaults to `true`.

**Response:**

```json
{
  "success": true,
  "runId": "run-123",
  "message": "Job execution started"
}
```

**Example:**

```bash
curl -H "Authorization: Bearer YOUR_API_KEY" \
  "https://your-cronmaster-url.com/api/cronjobs/fccview-0/execute?runInBackground=true"
```

---
### GET /api/scripts

List all scripts.
@@ -196,6 +294,127 @@ curl -H "Authorization: Bearer YOUR_API_KEY" \

---

### GET /api/logs/stream

Stream job execution logs.

**Query Parameters:**

- `runId` (string, required) - The run ID of the job execution
- `offset` (number, optional) - Byte offset for streaming new content. Defaults to `0`.
- `maxLines` (number, optional) - Maximum lines to return. Defaults to `500`, min `100`, max `5000`.

**Note:** When `offset=0`, the endpoint only reads the last `maxLines` from the file for performance. This means `totalLines` is only returned when the file is small enough to read entirely (not truncated).

**Response:**

```json
{
  "status": "running",
  "content": "[log content]",
  "newContent": "[new log content since offset]",
  "logFile": "2025-11-10_14-30-00.log",
  "isComplete": false,
  "exitCode": null,
  "fileSize": 1024,
  "offset": 0,
  "totalLines": 50,
  "displayedLines": 50,
  "truncated": false
}
```

**Response Fields:**

- `status` (string) - Job status: "running", "completed", or "failed"
- `content` (string) - The log content to display
- `newContent` (string) - New content since the last offset (for streaming)
- `logFile` (string) - Name of the log file
- `isComplete` (boolean) - Whether the job has completed
- `exitCode` (number | null) - Exit code of the job (null if still running)
- `fileSize` (number) - Total size of the log file in bytes
- `offset` (number) - Current byte offset
- `totalLines` (number | undefined) - Total number of lines in the file (only returned when the file is small enough to read entirely)
- `displayedLines` (number) - Number of lines being displayed
- `truncated` (boolean) - Whether the content is truncated due to the `maxLines` limit

**Example:**

```bash
curl -H "Authorization: Bearer YOUR_API_KEY" \
  "https://your-cronmaster-url.com/api/logs/stream?runId=run-123&offset=0&maxLines=500"
```

---
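The `offset` mechanics above can be simulated locally with a plain file. The sketch below mimics what a streaming client does between polls under the documented semantics; it does not call the API, and the filenames are placeholders:

```shell
# Simulate offset-based streaming: the client remembers how many bytes
# it has already seen and only consumes what was appended since then.
log=$(mktemp)
printf 'line 1\n' >> "$log"

offset=$(wc -c < "$log")              # bytes consumed so far (like the "offset" field)
printf 'line 2\n' >> "$log"           # the job writes more output

# What the endpoint would return as "newContent" for this offset:
new_content=$(tail -c +"$((offset + 1))" "$log")
echo "$new_content"                   # → line 2
```

A real client would pass the returned `offset` back on the next request until `isComplete` is `true`.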
### GET /api/system/wrapper-check

Check if the log wrapper script has been modified from the default.

**Response:**

```json
{
  "modified": false
}
```

**Example:**

```bash
curl -H "Authorization: Bearer YOUR_API_KEY" \
  https://your-cronmaster-url.com/api/system/wrapper-check
```

---
### GET /api/oidc/login

Initiate the OIDC (SSO) login flow. Redirects to the OIDC provider's authorization endpoint.

**Note:** This endpoint is only available when `SSO_MODE=oidc` is configured.

**Response:** HTTP 302 redirect to the OIDC provider

**Example:**

```bash
curl -L https://your-cronmaster-url.com/api/oidc/login
```

---

### GET /api/oidc/callback

OIDC callback endpoint. Handles the authorization code from the OIDC provider and creates a session.

**Note:** This endpoint is typically called by the OIDC provider after authentication, not directly by clients.

**Query Parameters:**

- `code` (string) - Authorization code from the OIDC provider
- `state` (string) - State parameter for CSRF protection

**Response:** HTTP 302 redirect to the application root

---

### GET /api/oidc/logout

Initiate the OIDC logout flow. Redirects to the OIDC provider's logout endpoint.

**Note:** This endpoint is only available when `SSO_MODE=oidc` is configured.

**Response:** HTTP 302 redirect to the OIDC provider's logout endpoint

**Example:**

```bash
curl -L https://your-cronmaster-url.com/api/oidc/logout
```

---
### POST /api/auth/login

Login with password (alternative to API key).
@@ -264,11 +483,3 @@ Logout and clear session (requires login first).
  "message": "Authentication required. Use session cookie or API key (Bearer token)."
}
```
## Testing

For local testing I have made a Node script that checks all available endpoints:

```bash
AUTH_PASSWORD=your-password node test-api.js https://your-cronmaster-url.com
```

@@ -56,6 +56,7 @@ Translation loading priority:
 | ----------------------------------- | ------- | -------------------------------------------------- |
 | `NEXT_PUBLIC_CLOCK_UPDATE_INTERVAL` | `30000` | Clock update interval in milliseconds (30 seconds) |
 | `LIVE_UPDATES` | `true` | Enable/disable Server-Sent Events for live updates |
+| `DISABLE_SYSTEM_STATS` | `false` | Set to `true` to completely disable system stats (stops polling and hides the sidebar) |

 ## Logging Configuration
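As a sketch, setting the new variable in a Docker Compose file could look like this (the service name and image tag here are placeholders, not taken from the project's actual compose file):

```yaml
services:
  cronmaster:
    image: fccview/cronmaster:latest   # placeholder tag
    environment:
      - DISABLE_SYSTEM_STATS=true      # hides the sidebar and stops stats polling
```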

93
howto/LOGS.md
Normal file
@@ -0,0 +1,93 @@
# Job Execution Logging

CronMaster includes an optional logging feature that captures detailed execution information for your cronjobs.

## How It Works

When you enable logging for a cronjob, CronMaster automatically wraps your command with a log wrapper script. This wrapper:

- Captures **stdout** and **stderr** output
- Records the **exit code** of your command
- Timestamps the **start and end** of execution
- Calculates **execution duration**
- Stores all this information in organized log files
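Conceptually, the wrapping changes the crontab entry along these lines (illustrative only — the real entry is generated by CronMaster, and the wrapper's exact arguments may differ):

```
# Without logging:
0 2 * * * /usr/local/bin/backup.sh

# With logging enabled (illustrative shape, not the exact generated entry):
0 2 * * * /path/to/data/cron-log-wrapper.sh ... /usr/local/bin/backup.sh
```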
## Enabling Logs

1. When creating or editing a cronjob, check the "Enable Logging" checkbox
2. The wrapper is automatically added to your crontab entry
3. Jobs run independently - they continue to work even if CronMaster is offline

## Log Storage

Logs are stored in the `./data/logs/` directory with descriptive folder names:

- If a job has a **description/comment**: `{sanitized-description}_{jobId}/`
- If a job has **no description**: `{jobId}/`

Example structure:

```
./data/logs/
├── backup-database_root-0/
│   ├── 2025-11-10_14-30-00.log
│   ├── 2025-11-10_15-30-00.log
│   └── 2025-11-10_16-30-00.log
├── daily-cleanup_root-1/
│   └── 2025-11-10_14-35-00.log
├── root-2/ (no description provided)
│   └── 2025-11-10_14-40-00.log
```

**Note**: Folder names are sanitized to be filesystem-safe (lowercase, alphanumeric with hyphens, max 50 chars for the description part).
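The sanitization described above can be approximated in shell. This is a sketch of the documented behaviour (lowercase, non-alphanumerics collapsed to hyphens, capped at 50 chars), not the project's actual implementation, and the example description is made up:

```shell
# Approximate the documented sanitization of a job description
desc="Backup Database (nightly)"
sanitized=$(printf '%s' "$desc" \
  | tr '[:upper:]' '[:lower:]' \
  | sed -E 's/[^a-z0-9]+/-/g; s/^-+//; s/-+$//' \
  | cut -c1-50)
echo "${sanitized}_root-0"            # → backup-database-nightly_root-0
```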

## Log Format

Each log file includes:

```
--- [ JOB START ] ----------------------------------------------------
Command : bash /app/scripts/backup.sh
Timestamp : 2025-11-10 14:30:00
Host : hostname
User : root
--- [ JOB OUTPUT ] ---------------------------------------------------

[command output here]

--- [ JOB SUMMARY ] --------------------------------------------------
Timestamp : 2025-11-10 14:30:45
Duration : 45s
Exit Code : 0
Status : SUCCESS
--- [ JOB END ] ------------------------------------------------------
```
## Automatic Cleanup

Logs are automatically cleaned up to prevent disk space issues:

- **Maximum logs per job**: 50 log files
- **Maximum age**: 30 days
- **Cleanup trigger**: When viewing logs or after manual execution
- **Method**: Oldest logs are deleted first when limits are exceeded
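The "oldest first" policy is easy to sketch because the timestamped filenames sort chronologically. The following is an illustration of the idea only, not the actual cleanup code, with a smaller limit for demonstration:

```shell
# Keep only the newest $max_logs files in a job's log folder
logdir=$(mktemp -d)
for day in 01 02 03 04 05; do
  touch "$logdir/2025-11-${day}_00-00-00.log"
done

max_logs=3
# Timestamped names sort chronologically, so the oldest files come first
# (head -n -N is GNU coreutils)
ls "$logdir" | sort | head -n -"$max_logs" | while read -r f; do
  rm "$logdir/$f"
done

ls "$logdir" | wc -l                  # → 3
```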

## Docker Considerations

- Mount the `./data` directory to persist logs on the host
- The wrapper script location: `./data/cron-log-wrapper.sh`. This will be generated automatically the first time you enable logging.

## Non-Docker Considerations

- Logs are stored at `./data/logs/` relative to the project directory
- The codebase wrapper script location: `./app/_scripts/cron-log-wrapper.sh`
- The running wrapper script location: `./data/cron-log-wrapper.sh`
## Important Notes

- Logging is **optional** and disabled by default
- Jobs with logging enabled are marked with a blue "Logged" badge in the UI
- Logs are captured for both scheduled runs and manual executions
- Commands with file redirections (>, >>) may conflict with logging
- The crontab stores the **wrapped command**, so jobs run independently of CronMaster
2
next-env.d.ts
vendored
@@ -2,4 +2,4 @@
 /// <reference types="next/image-types/global" />

 // NOTE: This file should not be edited
-// see https://nextjs.org/docs/basic-features/typescript for more information.
+// see https://nextjs.org/docs/app/building-your-application/configuring/typescript for more information.

@@ -1,6 +1,6 @@
 {
   "name": "cronjob-manager",
-  "version": "1.5.3",
+  "version": "1.5.4",
   "private": true,
   "scripts": {
     "dev": "next dev",
@@ -32,7 +32,7 @@
     "jose": "^6.1.1",
     "lucide-react": "^0.294.0",
     "minimatch": "^10.0.3",
-    "next": "14.0.4",
+    "next": "14.2.35",
     "next-intl": "^4.4.0",
     "next-pwa": "^5.6.0",
     "next-themes": "^0.2.1",