REST API and background job processing service for NodeByte infrastructure management.
The NodeByte Backend API provides a comprehensive REST API for managing game server infrastructure, with enterprise-grade features:
- Hytale OAuth 2.0 Authentication - Device code flow, token management, game session handling with JWT validation
- Panel Synchronization - Full Pterodactyl panel sync (locations, nodes, allocations, nests, eggs, servers, users, databases)
- Job Queue System - Redis-backed async job processing with priority queues (Asynq)
- Admin Dashboard - Complete REST API for system settings, webhooks, and sync management
- Email Queue - Asynchronous email sending via Resend API
- Discord Webhooks - Real-time notifications for sync events and system changes
- Cron Scheduler - Automated scheduled sync jobs with configurable intervals
- Rate Limiting - Token bucket algorithm with per-endpoint configuration
- Audit Logging - Compliance-grade event tracking with database persistence
- Database Configuration - Dynamic system settings with AES-256-GCM encryption
- Health Monitoring - Built-in health checks and comprehensive structured logging
- Go 1.24+
- PostgreSQL 12+
- Redis 6+
- Docker & Docker Compose (optional)
# Clone repository
cd backend
# Install dependencies
go mod download
go mod tidy
# Create environment file
cp .env.example .env
# Edit configuration (set PTERODACTYL_URL, DATABASE_URL, etc.)
nano .env
# Start dependencies (Redis + PostgreSQL)
docker-compose up -d postgres redis
# Run database migrations
# (Run these from the main app or apply schemas manually)
# Run server
go run ./cmd/api/main.go
# Server runs at http://localhost:8080
# Health check: curl http://localhost:8080/health

# Start all services (backend + postgres + redis)
docker-compose up -d
# With Asynq web UI for job monitoring
docker-compose --profile monitoring up -d
# View live logs
docker-compose logs -f backend
# Shutdown
docker-compose down

# Health check
curl http://localhost:8080/health
# Get statistics
curl http://localhost:8080/api/stats
# Trigger sync (requires API key)
curl -X POST http://localhost:8080/api/v1/sync/full \
-H "X-API-Key: your-api-key" \
-H "Content-Type: application/json" \
-d '{"skip_users": false}'

# Server
ENV=production # development or production
BACKEND_PORT=8080 # HTTP server port
# Database
DATABASE_URL=postgresql://user:pass@localhost/nodebyte # Required
# Redis
REDIS_URL=redis://localhost:6379 # Can also be: host:port
# Security
BACKEND_API_KEY=your-secret-api-key # For X-API-Key authentication
CORS_ORIGINS=https://app.example.com # Comma-separated origins
ENCRYPTION_KEY=32-byte-hex-encoded-key # For encrypting sensitive values
# Pterodactyl Panel
PTERODACTYL_URL=https://panel.example.com # Required
PTERODACTYL_API_KEY=your-admin-api-key # Required
PTERODACTYL_CLIENT_API_KEY=client-api-key # Optional
# Hytale OAuth (Required for game server authentication)
HYTALE_USE_STAGING=false # false for production, true for staging Hytale OAuth
# Tokens auto-refresh every 5-10 minutes
# Virtfusion Panel (optional)
VIRTFUSION_URL=https://virtfusion.example.com
VIRTFUSION_API_KEY=your-api-key
# Email (Resend)
RESEND_API_KEY=re_xxxxxxxxxxxxx # Required for email sending
[email protected]
# Sync Settings
AUTO_SYNC_ENABLED=true # Enable scheduled syncs
AUTO_SYNC_INTERVAL=3600 # Interval in seconds (1 hour)
SYNC_BATCH_SIZE=100 # Items per batch during sync
# Scalar (optional)
SCALAR_URL=https://scalar.example.com
SCALAR_API_KEY=your-api-key
# Cloudflare (optional)
CF_ACCESS_CLIENT_ID=your-client-id
CF_ACCESS_CLIENT_SECRET=your-client-secret

The backend uses the same PostgreSQL database as the main application. Required tables are created automatically via migrations. Key tables include:

- `users` / `accounts` - User authentication
- `hytale_oauth_tokens` - OAuth token storage (encrypted; see the AES-256-GCM sketch below)
- `hytale_audit_logs` - Compliance audit trail
- `sync_logs` - Pterodactyl sync operation logs
- `webhooks` - Discord webhook configurations
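For reference, the general shape of AES-256-GCM encryption with a 32-byte hex key (as used for `ENCRYPTION_KEY` and the encrypted token columns above) looks like the sketch below. The actual implementation lives in `internal/crypto/encryption.go` and may differ in detail.

```go
// Illustrative AES-256-GCM helpers, assuming a 32-byte hex-encoded key
// like ENCRYPTION_KEY; the real internal/crypto code may differ.
package crypto

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/rand"
	"encoding/hex"
	"fmt"
)

// Encrypt seals plaintext with AES-256-GCM and prepends the random nonce.
func Encrypt(hexKey string, plaintext []byte) ([]byte, error) {
	key, err := hex.DecodeString(hexKey)
	if err != nil || len(key) != 32 {
		return nil, fmt.Errorf("encryption key must be 32 hex-encoded bytes")
	}
	block, err := aes.NewCipher(key)
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return nil, err
	}
	nonce := make([]byte, gcm.NonceSize())
	if _, err := rand.Read(nonce); err != nil {
		return nil, err
	}
	// Ciphertext layout: nonce || sealed data.
	return gcm.Seal(nonce, nonce, plaintext, nil), nil
}

// Decrypt reverses Encrypt.
func Decrypt(hexKey string, data []byte) ([]byte, error) {
	key, err := hex.DecodeString(hexKey)
	if err != nil || len(key) != 32 {
		return nil, fmt.Errorf("encryption key must be 32 hex-encoded bytes")
	}
	block, err := aes.NewCipher(key)
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return nil, err
	}
	if len(data) < gcm.NonceSize() {
		return nil, fmt.Errorf("ciphertext too short")
	}
	nonce, ciphertext := data[:gcm.NonceSize()], data[gcm.NonceSize():]
	return gcm.Open(nil, nonce, ciphertext, nil)
}
```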
The backend implements complete OAuth 2.0 Device Code Flow for Hytale server authentication, enabling secure game session management.
- Device Code Flow - RFC 8628 compliant device authorization
- Token Management - Automatic refresh with 30-day validity
- JWT Validation - Ed25519 signature verification with JWKS caching
- Game Sessions - Per-player session tokens with 1-hour auto-refresh
- Audit Logging - Complete compliance trail of all OAuth operations
- Rate Limiting - Token bucket algorithm (5-20 requests per minute); see the sketch after this list
- Error Handling - Graceful session limit and expiration handling
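A minimal sketch of the token-bucket limiting described above, assuming `golang.org/x/time/rate`; the per-endpoint limits and key derivation in the real middleware may differ.

```go
// Sketch of per-client token-bucket limiting using golang.org/x/time/rate.
package ratelimit

import (
	"sync"

	"golang.org/x/time/rate"
)

// Limiter holds one token bucket per client key (for example, IP + endpoint).
type Limiter struct {
	mu      sync.Mutex
	buckets map[string]*rate.Limiter
	rps     rate.Limit // refill rate in tokens per second
	burst   int
}

// New builds a limiter allowing roughly perMinute requests per minute.
func New(perMinute float64, burst int) *Limiter {
	return &Limiter{
		buckets: make(map[string]*rate.Limiter),
		rps:     rate.Limit(perMinute / 60.0),
		burst:   burst,
	}
}

// Allow reports whether the request identified by key may proceed now.
func (l *Limiter) Allow(key string) bool {
	l.mu.Lock()
	b, ok := l.buckets[key]
	if !ok {
		b = rate.NewLimiter(l.rps, l.burst)
		l.buckets[key] = b
	}
	l.mu.Unlock()
	return b.Allow()
}
```

For a 20 requests/minute endpoint, `New(20, 20)` gives a bucket that refills at one token every three seconds.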
| Endpoint | Method | Purpose |
|---|---|---|
| `/api/v1/hytale/oauth/device-code` | POST | Request device code for browser auth |
| `/api/v1/hytale/oauth/token` | POST | Poll for authorization completion |
| `/api/v1/hytale/oauth/refresh` | POST | Refresh expired access tokens |
| `/api/v1/hytale/oauth/profiles` | POST | List user's game profiles |
| `/api/v1/hytale/oauth/select-profile` | POST | Bind profile to session |
| `/api/v1/hytale/oauth/game-session/new` | POST | Create game session |
| `/api/v1/hytale/oauth/game-session/refresh` | POST | Extend session lifetime |
| `/api/v1/hytale/oauth/game-session/delete` | POST | Terminate session |
# 1. Request device code
curl -X POST http://localhost:8080/api/v1/hytale/oauth/device-code
# Response:
{
"device_code": "DE123456789ABCDEF",
"user_code": "AB12-CD34",
"verification_uri": "https://accounts.hytale.com/device",
"expires_in": 1800,
"interval": 5
}
# 2. User authorizes at verification_uri and enters user_code
# 3. Poll for token (repeat until authorized)
curl -X POST http://localhost:8080/api/v1/hytale/oauth/token \
-H "Content-Type: application/json" \
-d '{"device_code": "DE123456789ABCDEF"}'
# Response (after user authorizes):
{
"access_token": "eyJhbGc...",
"refresh_token": "refresh_eyJhbGc...",
"expires_in": 3600,
"account_id": "550e8400-e29b-41d4-a716-446655440000"
}
# 4. Get profiles and create game session
curl -X POST http://localhost:8080/api/v1/hytale/oauth/profiles \
-H "Authorization: Bearer eyJhbGc..."
# 5. Create session for selected profile
curl -X POST http://localhost:8080/api/v1/hytale/oauth/game-session/new \
-H "Authorization: Bearer eyJhbGc..." \
-H "Content-Type: application/json" \
-d '{"profile_uuid": "f47ac10b-58cc-4372-a567-0e02b2c3d479"}'

- GSP API Reference - Complete API docs with error codes
- Downloader CLI Integration - Automated provisioning
- Customer Auth Flow - User-facing authentication guide
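The walkthrough above can also be scripted. The sketch below polls the token endpoint from step 3 at the advertised interval until the user authorizes; field names follow the example responses, and the real API may return additional fields.

```go
// Sketch of polling the device-code token endpoint (steps 1-3 above).
package main

import (
	"bytes"
	"encoding/json"
	"errors"
	"fmt"
	"net/http"
	"time"
)

type tokenResponse struct {
	AccessToken  string `json:"access_token"`
	RefreshToken string `json:"refresh_token"`
	ExpiresIn    int    `json:"expires_in"`
	AccountID    string `json:"account_id"`
}

// pollForToken retries until the user authorizes or the device code expires.
func pollForToken(base, deviceCode string, intervalSec, expiresInSec int) (*tokenResponse, error) {
	body, _ := json.Marshal(map[string]string{"device_code": deviceCode})
	deadline := time.Now().Add(time.Duration(expiresInSec) * time.Second)

	for time.Now().Before(deadline) {
		resp, err := http.Post(base+"/api/v1/hytale/oauth/token",
			"application/json", bytes.NewReader(body))
		if err != nil {
			return nil, err
		}
		if resp.StatusCode == http.StatusOK {
			var tok tokenResponse
			err := json.NewDecoder(resp.Body).Decode(&tok)
			resp.Body.Close()
			return &tok, err
		}
		// Not authorized yet; wait the advertised interval and retry.
		resp.Body.Close()
		time.Sleep(time.Duration(intervalSec) * time.Second)
	}
	return nil, errors.New("device code expired before authorization")
}

func main() {
	// Values taken from the device-code response in step 1.
	tok, err := pollForToken("http://localhost:8080", "DE123456789ABCDEF", 5, 1800)
	if err != nil {
		panic(err)
	}
	fmt.Println("authorized account:", tok.AccountID)
}
```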
All API endpoints (except public stats) require authentication:
API Key Header:
curl -H "X-API-Key: your-api-key" http://localhost:8080/api/v1/sync/logs

API Key Query Parameter:

curl "http://localhost:8080/api/v1/sync/logs?api_key=your-api-key"

Bearer Token (Admin Routes):

curl -H "Authorization: Bearer your-jwt-token" http://localhost:8080/api/admin/settings
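Internally, the API-key check amounts to middleware along these lines. This is a sketch assuming Fiber v2; the actual middleware lives in `internal/handlers/middleware.go` and may differ.

```go
// Sketch of X-API-Key / api_key authentication middleware for Fiber v2.
package handlers

import (
	"crypto/subtle"

	"github.com/gofiber/fiber/v2"
)

// RequireAPIKey accepts the key from the X-API-Key header or the
// api_key query parameter, as documented above.
func RequireAPIKey(expected string) fiber.Handler {
	return func(c *fiber.Ctx) error {
		key := c.Get("X-API-Key")
		if key == "" {
			key = c.Query("api_key")
		}
		if subtle.ConstantTimeCompare([]byte(key), []byte(expected)) != 1 {
			return c.Status(fiber.StatusUnauthorized).JSON(fiber.Map{
				"success": false,
				"error":   "invalid or missing API key",
			})
		}
		return c.Next()
	}
}
```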
GET /api/stats
GET /api/panel/counts

Response:
{
"success": true,
"data": {
"totalServers": 150,
"totalUsers": 42,
"activeUsers": 38,
"totalAllocations": 500
}
}

POST /api/v1/sync/full
Content-Type: application/json
X-API-Key: your-api-key
{
"skip_users": false,
"requested_by": "[email protected]"
}

Response: 202 Accepted
{
"success": true,
"data": {
"sync_log_id": "550e8400-e29b-41d4-a716-446655440000",
"task_id": "asynq:task:abc123def456",
"status": "PENDING"
},
"message": "Full sync has been queued"
}

POST /api/v1/sync/locations # Locations only
POST /api/v1/sync/nodes # Nodes only
POST /api/v1/sync/servers # Servers only
POST /api/v1/sync/users # Users only

GET /api/v1/sync/status/550e8400-e29b-41d4-a716-446655440000
X-API-Key: your-api-key

Response:
{
"success": true,
"data": {
"id": "550e8400-e29b-41d4-a716-446655440000",
"type": "full",
"status": "RUNNING",
"itemsTotal": 500,
"itemsSynced": 250,
"itemsFailed": 2,
"error": null,
"startedAt": "2026-01-09T10:30:00Z",
"completedAt": null
}
}

GET /api/v1/sync/logs?limit=20&offset=0&type=full
X-API-Key: your-api-key

POST /api/v1/sync/cancel/550e8400-e29b-41d4-a716-446655440000
X-API-Key: your-api-key

POST /api/v1/email/queue
Content-Type: application/json
X-API-Key: your-api-key
{
"to": "[email protected]",
"subject": "Welcome to NodeByte!",
"template": "welcome",
"data": {
"name": "John Doe",
"verifyUrl": "https://app.example.com/verify/abc123"
}
}
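Queued messages are ultimately delivered through Resend. The sketch below shows that final step against Resend's public REST endpoint; the actual worker in `email_handler.go` may use an SDK or a different field set.

```go
// Sketch of delivering a queued email through Resend's REST API.
package workers

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func sendViaResend(apiKey, from, to, subject, html string) error {
	payload, _ := json.Marshal(map[string]any{
		"from":    from,
		"to":      []string{to},
		"subject": subject,
		"html":    html,
	})
	req, err := http.NewRequest(http.MethodPost,
		"https://api.resend.com/emails", bytes.NewReader(payload))
	if err != nil {
		return err
	}
	req.Header.Set("Authorization", "Bearer "+apiKey)
	req.Header.Set("Content-Type", "application/json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 300 {
		return fmt.Errorf("resend returned status %d", resp.StatusCode)
	}
	return nil
}
```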
POST /api/v1/webhook/dispatch
Content-Type: application/json
X-API-Key: your-api-key
{
"event": "sync.completed",
"data": {
"syncLogId": "550e8400-e29b-41d4-a716-446655440000",
"type": "full",
"status": "COMPLETED",
"duration": "5m30s"
}
}
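Dispatched events are delivered to Discord as webhook messages. The sketch below shows that delivery step; the embed layout is illustrative, and `webhook_handler.go` defines the actual format.

```go
// Sketch of posting a sync event to a Discord webhook URL.
package workers

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func notifyDiscord(webhookURL, event string, fields map[string]string) error {
	embedFields := make([]map[string]any, 0, len(fields))
	for name, value := range fields {
		embedFields = append(embedFields, map[string]any{
			"name": name, "value": value, "inline": true,
		})
	}
	payload, _ := json.Marshal(map[string]any{
		"embeds": []map[string]any{{
			"title":  event,
			"fields": embedFields,
		}},
	})
	resp, err := http.Post(webhookURL, "application/json", bytes.NewReader(payload))
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 300 {
		return fmt.Errorf("discord webhook returned status %d", resp.StatusCode)
	}
	return nil
}
```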
GET /api/admin/settings
Authorization: Bearer your-jwt-token

POST /api/admin/settings
Content-Type: application/json
Authorization: Bearer your-jwt-token
{
"pterodactylUrl": "https://panel.example.com",
"autoSyncEnabled": true,
"autoSyncInterval": 3600
}

GET /api/admin/settings/webhooks
Authorization: Bearer your-jwt-token

POST /api/admin/settings/webhooks
Content-Type: application/json
Authorization: Bearer your-jwt-token
{
"name": "Sync Notifications",
"webhookUrl": "https://discord.com/api/webhooks/123456/abcdef",
"type": "SYSTEM",
"scope": "ADMIN"
}

PATCH /api/admin/settings/webhooks
Content-Type: application/json
Authorization: Bearer your-jwt-token
{
"id": "webhook-id"
}

# Get repositories
GET /api/admin/settings/repos
# Add repository
POST /api/admin/settings/repos
{
"repo": "owner/repository"
}
# Update repository
PUT /api/admin/settings/repos
{
"oldRepo": "old/repo",
"repo": "new/repo"
}
# Delete repository
DELETE /api/admin/settings/repos
{
"repo": "owner/repository"
}

# Get sync status
GET /api/admin/sync
# Trigger sync
POST /api/admin/sync
{
"type": "full" # or: locations, nodes, servers, users
}
# Cancel sync
POST /api/admin/sync/cancel
# Get sync settings
GET /api/admin/sync/settings
# Update sync settings
POST /api/admin/sync/settings
{
"autoSyncEnabled": true,
"autoSyncInterval": 3600
}

# Overview stats (admin)
GET /api/v1/stats/overview
# Server stats (admin)
GET /api/v1/stats/servers
# User stats (admin)
GET /api/v1/stats/users
# Admin dashboard stats
GET /api/admin/stats

// lib/api.ts
const API_BASE = process.env.NEXT_PUBLIC_API_URL || 'http://localhost:8080';
export async function triggerFullSync(requestedBy: string) {
const response = await fetch(`${API_BASE}/api/v1/sync/full`, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
'X-API-Key': process.env.BACKEND_API_KEY!,
},
body: JSON.stringify({ requested_by: requestedBy, skip_users: false }),
});
if (!response.ok) {
throw new Error(`Sync failed: ${response.statusText}`);
}
return response.json();
}
export async function getSyncStatus(syncLogId: string) {
const response = await fetch(
`${API_BASE}/api/v1/sync/status/${syncLogId}?api_key=${process.env.BACKEND_API_KEY}`,
);
if (!response.ok) {
throw new Error(`Failed to get sync status`);
}
return response.json();
}
// pages/admin/sync.tsx
import { useState } from 'react';
// `session`, `toast`, and `SyncProgress` come from your app's auth,
// notification, and UI layers.
export default function SyncPage() {
  const [syncLogId, setSyncLogId] = useState<string | null>(null);
const handleTriggerSync = async () => {
try {
const result = await triggerFullSync(session.user.email);
setSyncLogId(result.data.sync_log_id);
toast.success('Sync started');
} catch (error) {
toast.error('Failed to start sync');
}
};
return (
<div>
<button onClick={handleTriggerSync}>Start Full Sync</button>
{syncLogId && <SyncProgress syncLogId={syncLogId} />}
</div>
);
}

REST API (Fiber Framework)
  Authentication: API Key or Bearer Token
        |
Route Handlers: Sync / Email / Admin
        |
Queue Manager (Asynq): Critical, Default, and Low queues
        |
Worker Processors: SyncHandler, EmailHandler, WebhookHandler
        |
External Services: Pterodactyl Panel, Resend Email API, Discord Webhooks
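Handlers hand work to the queue layer with Asynq's client API, roughly as sketched below; the task type name and options are illustrative rather than the exact ones in `internal/queue/manager.go`.

```go
// Sketch of enqueueing a full sync task, assuming hibiken/asynq.
package queue

import (
	"encoding/json"

	"github.com/hibiken/asynq"
)

type FullSyncPayload struct {
	SyncLogID string `json:"sync_log_id"`
	SkipUsers bool   `json:"skip_users"`
}

// EnqueueFullSync pushes a full-sync task onto the critical queue.
func EnqueueFullSync(client *asynq.Client, p FullSyncPayload) (*asynq.TaskInfo, error) {
	payload, err := json.Marshal(p)
	if err != nil {
		return nil, err
	}
	task := asynq.NewTask("sync:full", payload)
	// Full syncs go to the critical queue with a few retries.
	return client.Enqueue(task, asynq.Queue("critical"), asynq.MaxRetry(3))
}
```

Queue priority determines which tasks the workers pull first when queues are contended.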
backend/
├── cmd/
│   └── api/
│       └── main.go              # Application entry point
├── internal/
│   ├── config/
│   │   └── config.go            # Configuration loading
│   ├── crypto/
│   │   └── encryption.go        # Sensitive data encryption
│   ├── database/
│   │   ├── connection.go        # Database pool management
│   │   ├── webhooks.go          # Webhook repository
│   │   └── sync_repository.go   # Sync log repository
│   ├── handlers/
│   │   ├── api.go               # API endpoint handlers
│   │   ├── middleware.go        # Authentication middleware
│   │   ├── admin_settings.go    # Admin settings handlers
│   │   ├── admin_webhooks.go    # Webhook management
│   │   ├── errors.go            # Error handling
│   │   └── routes.go            # Route definitions
│   ├── panels/
│   │   └── pterodactyl.go       # Pterodactyl panel API client
│   ├── queue/
│   │   └── manager.go           # Task queue management
│   ├── scalar/
│   │   └── client.go            # Scalar API client
│   └── workers/
│       ├── server.go            # Asynq worker server
│       ├── scheduler.go         # Cron job scheduler
│       ├── sync_handler.go      # Sync task processor
│       ├── email_handler.go     # Email task processor
│       └── webhook_handler.go   # Webhook task processor
├── .env.example
├── Dockerfile
├── docker-compose.yml
├── go.mod
├── go.sum
└── README.md
# Check server health
curl http://localhost:8080/health
# Response
{
"status": "ok",
"timestamp": "2026-01-09T10:30:00Z"
}
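A handler returning this shape can be as small as the following sketch, assuming Fiber v2 (illustrative; the real handler lives under `internal/handlers`):

```go
// Sketch of the /health endpoint in Fiber v2.
package handlers

import (
	"time"

	"github.com/gofiber/fiber/v2"
)

// RegisterHealth mounts a simple liveness endpoint on the app.
func RegisterHealth(app *fiber.App) {
	app.Get("/health", func(c *fiber.Ctx) error {
		return c.JSON(fiber.Map{
			"status":    "ok",
			"timestamp": time.Now().UTC().Format(time.RFC3339),
		})
	})
}
```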
Access the Asynq dashboard for job monitoring:

# Start with monitoring profile
docker-compose --profile monitoring up -d
# Visit http://localhost:8081

Logs are structured JSON for easy parsing:
# View logs
docker-compose logs -f backend
# Example log output
{
"level": "info",
"time": "2026-01-09T10:30:00Z",
"message": "Full sync triggered",
"sync_log_id": "550e8400-e29b-41d4-a716-446655440000",
"task_id": "asynq:task:abc123"
}
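A log line like the one above comes out of zerolog roughly as follows (a sketch; the field names mirror the example output):

```go
// Sketch of emitting the structured log line shown above with zerolog.
package main

import (
	"os"
	"time"

	"github.com/rs/zerolog"
)

func main() {
	// Default zerolog timestamps are Unix integers; switch to RFC 3339.
	zerolog.TimeFieldFormat = time.RFC3339
	logger := zerolog.New(os.Stdout).With().Timestamp().Logger()

	logger.Info().
		Str("sync_log_id", "550e8400-e29b-41d4-a716-446655440000").
		Str("task_id", "asynq:task:abc123").
		Msg("Full sync triggered")
}
```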
- Sync Operations: Process 100+ items per batch
- Job Concurrency: 10 concurrent workers per queue (see the worker configuration sketch after this list)
- Connection Pool: 25 max connections, 5 minimum
- Request Timeout: 30 seconds
- Graceful Shutdown: 10 second timeout for cleanup
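As a sketch, the concurrency and queue figures above map onto Asynq's server configuration like this (handler registrations and task type names are illustrative):

```go
// Sketch of the worker-side Asynq configuration: 10 concurrent workers
// and weighted critical/default/low queues.
package workers

import (
	"github.com/hibiken/asynq"
)

// NewWorkerServer builds the server and routing mux; the caller runs
// srv.Run(mux).
func NewWorkerServer(redisAddr string, syncH, emailH, webhookH asynq.Handler) (*asynq.Server, *asynq.ServeMux) {
	srv := asynq.NewServer(
		asynq.RedisClientOpt{Addr: redisAddr},
		asynq.Config{
			Concurrency: 10,
			// Higher weight means more worker attention for that queue.
			Queues: map[string]int{"critical": 6, "default": 3, "low": 1},
		},
	)
	mux := asynq.NewServeMux()
	mux.Handle("sync:full", syncH)
	mux.Handle("email:send", emailH)
	mux.Handle("webhook:dispatch", webhookH)
	return srv, mux
}
```

The weights control how often workers poll each queue, so critical tasks are picked up first without starving the low queue.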
- Follow Go conventions and best practices
- Use interfaces for dependency injection
- Comprehensive error handling with zerolog
- Structured logging with contextual information
- Tests for business logic and API handlers
- Code must pass `gofmt`, `go vet`, and `golangci-lint`
Before pushing, ensure code passes all checks:
# Format code
gofmt -w .
goimports -w .
# Run linters
golangci-lint run ./...
# Run tests
go test -v -race ./...
# Build binary
go build -o bin/nodebyte-backend ./cmd/api

# All tests with race detection
go test -v -race ./...
# With coverage
go test -v -race -coverprofile=coverage.out ./...
go tool cover -html=coverage.out # View in browser

# Build binary (Linux)
go build -o nodebyte-backend ./cmd/api
# Cross-compile (macOS)
GOOS=darwin GOARCH=amd64 go build -o nodebyte-backend ./cmd/api
# Cross-compile (Windows)
GOOS=windows GOARCH=amd64 go build -o nodebyte-backend.exe ./cmd/api

Automated workflows run on every commit and PR.
| Workflow | Trigger | Purpose |
|---|---|---|
| Test & Build | Push/PR | Unit tests, build verification |
| Lint & Quality | Push/PR | Code quality (50+ linters) |
| Format Check | Push/PR | Code formatting (gofmt, goimports) |
| Coverage | Push/PR | Test coverage (70%+ required) |
| Dependencies | Weekly | Security scanning, dependency checks |
| Docker Build | Tags/Push | Build & push Docker images |
Run the same checks locally before pushing:
# Run all checks
./scripts/ci.sh # If available, or run manually:
gofmt -w .
goimports -w .
golangci-lint run ./...
go test -v -race -coverprofile=coverage.out ./...
go build -o bin/nodebyte-backend ./cmd/api

Check workflow status:
- GitHub Web: Actions tab
- CLI: `gh run list` / `gh run watch <id>`
- Email: Failed workflow notifications
# Test database connection
psql $DATABASE_URL -c "SELECT 1"
# Test Redis connection
redis-cli -u $REDIS_URL ping
# Check if server is running
curl -v http://localhost:8080/health

# Check Asynq Web UI at http://localhost:8081 (if monitoring profile active)
# Or check logs for worker errors
docker-compose logs backend | grep -i error
# Verify queue connectivity
redis-cli -u $REDIS_URL KEYS "*"

- Check Pterodactyl API credentials in `.env`
- Verify network connectivity to Pterodactyl panel
- Review sync logs: `GET /api/v1/sync/logs` (requires API key)
- Check worker server status in Asynq UI
- Look for database connection pool exhaustion in logs
- Verify `HYTALE_USE_STAGING` is set correctly (false for production, true for staging)
- Check JWKS cache refresh
- Review audit logs for auth failures
- Ensure the database has the `hytale_audit_logs` table
"gofmt errors"
gofmt -w .
goimports -w .

"golangci-lint errors"
golangci-lint run ./... # View all issues
# Fix issues in code, then retry

"Test coverage below 70%"
go test -coverprofile=coverage.out ./...
go tool cover -html=coverage.out # View coverage gaps
# Write tests for uncovered code

REST API (Fiber v2 Framework)
  Auth: API Key, Bearer, or Public
        |
Route Handlers: Sync / Hytale OAuth / Admin Settings
        |
Queue Manager (Asynq/Redis)
        |
Workers: Sync Handler, Email Handler, Webhook Handler, OAuth Refresher, Scheduler
        |
External Services: Pterodactyl Panel, Hytale OAuth, Resend Email
- Sync Operations: Process 100+ items per batch
- Job Concurrency: 10 concurrent workers per queue
- Connection Pool: 25 max connections, 5 minimum
- Rate Limiting: Token bucket (5-20 req/min per endpoint)
- Token Refresh: Auto-refresh every 5-10 minutes
- JWKS Cache: Hourly refresh, ~99% hit rate
- Request Timeout: 30 seconds
- Graceful Shutdown: 10 second timeout for cleanup
- Hytale GSP API Reference - Complete API docs
- Downloader CLI Integration - Provisioning guide
- Customer Auth Flow - User authentication guide
- CHANGELOG.md - Version history and features
- Issues & Bugs: Report on GitHub Issues
- Documentation: See inline code comments and docs/
- Discord: Join our community server
- Email: [email protected]
AGPL 3.0 - See LICENSE file for details