Table of Contents
- Contributing
- Getting Started
- Contribution Types
- Development Guidelines
- Documentation Standards
- Dependency Management
- Performance Considerations
- Security Considerations
- Troubleshooting Contributions
- Testing Your Contribution
- 1. Unit Tests Pass
- 2. Integration Tests Pass
- 3. Code Quality Passes
- 4. Security Scans Pass
- 5. Manual Testing
- 6. Documentation Updated
- Benchmarking and Profiling
- Advanced Contribution Topics
- Release Management
- Release Process
- Community Guidelines
- Code of Conduct
- Communication Channels
- Response Times
- Maintainer Responsibilities
- Recognition
- First-Time Contributors
- Advanced Contributions
- License
- Getting Help
- Quick Reference
Contributing
We welcome contributions! Whether you're fixing bugs, adding features, improving documentation, or reporting issues, your help makes this project better.
Getting Started
- Fork the repository on Gitea
- Clone your fork:
  git clone https://git.serendipity.systems/YOUR_USERNAME/pterodactyl-discord-bot.git
  cd pterodactyl-discord-bot
- Set up the development environment (see the Development section)
- Create a feature branch:
git checkout -b feature/your-feature-name
Contribution Types
🐛 Bug Reports
When reporting bugs, please include:
- Description: Clear description of the issue
- Steps to Reproduce: Detailed steps to recreate the bug
- Expected Behavior: What should happen
- Actual Behavior: What actually happens
- Environment: Python version, OS, Discord.py version
- Logs: Relevant log excerpts (sanitize sensitive data)
- Screenshots: If applicable
Template:
**Bug Description**
Brief description of the issue
**To Reproduce**
1. Run command `/server_status`
2. Select server "Test Server"
3. Click "Start" button
4. Error occurs
**Expected Behavior**
Server should start and embed should update
**Actual Behavior**
Error message appears: "Failed to start server"
**Environment**
- OS: Ubuntu 22.04
- Python: 3.11.5
- Discord.py: 2.3.2
- Bot Version: v1.2.3
**Logs**
2024-10-15 10:30:45 - ERROR - Failed to send power action: Connection timeout
**Additional Context**
Only happens with servers that have multiple allocations
✨ Feature Requests
When requesting features:
- Use Case: Explain why this feature is needed
- Proposed Solution: Describe your ideal implementation
- Alternatives: Consider alternative approaches
- Additional Context: Screenshots, mockups, or examples
Template:
**Feature Request**
Add support for automatic server backups
**Use Case**
Server administrators want automated daily backups triggered from Discord
**Proposed Solution**
Add `/backup` command that:
1. Creates backup via Pterodactyl API
2. Sends confirmation embed
3. Allows scheduling recurring backups
**Alternatives Considered**
- Use Pterodactyl's built-in backup scheduling
- Third-party backup management bot
**Additional Context**
Many users request this in Discord support channel
🔧 Pull Requests
Before submitting:
- ✅ Tests pass locally (`make test`)
- ✅ Code is linted (`make lint`)
- ✅ Code is formatted (`make format`)
- ✅ Security scans pass (`make security`)
- ✅ Documentation updated (if needed)
- ✅ Commits follow conventional commits format
PR Template:
## Description
Brief description of changes
## Type of Change
- [ ] Bug fix (non-breaking change fixing an issue)
- [ ] New feature (non-breaking change adding functionality)
- [ ] Breaking change (fix or feature causing existing functionality to change)
- [ ] Documentation update
- [ ] Performance improvement
- [ ] Code refactoring
## Changes Made
- Detailed list of changes
- Each major change on its own line
- Include technical details
## Testing
- [ ] Unit tests added/updated
- [ ] Integration tests added/updated
- [ ] Manual testing performed
- [ ] All tests pass locally
## Checklist
- [ ] Code follows project style guidelines
- [ ] Self-review completed
- [ ] Comments added for complex code
- [ ] Documentation updated
- [ ] No new warnings introduced
- [ ] Tests added for new functionality
- [ ] Dependent changes merged
## Related Issues
Fixes #123
Relates to #456
## Screenshots (if applicable)
[Add screenshots here]
## Additional Notes
[Any additional information]
Development Guidelines
Code Style
Commit Messages:
Follow Conventional Commits:
type(scope): subject
body
footer
Types:
- `feat`: New feature
- `fix`: Bug fix
- `docs`: Documentation changes
- `style`: Code style changes (formatting, no logic change)
- `refactor`: Code refactoring
- `perf`: Performance improvement
- `test`: Test additions or changes
- `chore`: Build process, dependencies, or tooling changes
- `ci`: CI/CD configuration changes
Examples:
feat(api): add support for server backups
- Add backup_server() method to PterodactylAPI
- Add /backup slash command
- Include backup status in embeds
- Add tests for backup functionality
Closes #123
fix(metrics): correct CPU scaling calculation for 16+ core servers
Previously failed to calculate correct scale for servers with >16 cores.
Now properly rounds up to nearest 100% increment.
Fixes #456
docs(readme): update installation instructions for Docker
- Add Docker Compose example
- Clarify volume mount requirements
- Add troubleshooting section
test(api): add integration tests for power actions
- Test start/stop/restart commands
- Mock Pterodactyl API responses
- Verify error handling
chore(deps): update discord.py to 2.3.2
Security update to address CVE-2024-XXXXX
Testing Requirements
For Bug Fixes:
- Write failing test that reproduces bug
- Fix the bug
- Verify test now passes
- Add regression test if needed
For New Features:
- Write tests first (TDD approach)
- Implement feature
- Achieve ≥80% coverage for new code
- Add integration test if appropriate
Test Structure:
class TestNewFeature:
    """Test suite for new feature."""

    def test_basic_functionality(self):
        """Test basic functionality works."""
        # Arrange
        expected = "result"
        # Act
        actual = new_function()
        # Assert
        assert actual == expected

    def test_edge_case(self):
        """Test edge case handling."""
        with pytest.raises(ValueError):
            new_function(invalid_input)

    @pytest.mark.asyncio
    async def test_async_behavior(self):
        """Test asynchronous operations."""
        result = await async_function()
        assert result is not None
Code Review Process
What Reviewers Look For:
1. Functionality
   - Does it solve the problem?
   - Are edge cases handled?
   - Is error handling appropriate?
2. Code Quality
   - Follows style guidelines
   - Well-structured and readable
   - Appropriate comments/documentation
   - No unnecessary complexity
3. Testing
   - Adequate test coverage
   - Tests actually test the functionality
   - Tests are maintainable
4. Performance
   - No obvious performance issues
   - Efficient algorithms used
   - Resources properly managed
5. Security
   - No security vulnerabilities introduced
   - Input validation present
   - Secrets not exposed
Addressing Review Comments:
- Respond to all comments within 1 week
- Make requested changes promptly
- Mark conversations as resolved after addressing
- Request re-review when ready
- Don't force-push after review (use regular commits)
Approval Requirements:
- ✅ At least 1 approval from maintainer
- ✅ All CI checks passing (tests, lint, security)
- ✅ No unresolved conversations
- ✅ Branch up to date with target branch
- ✅ Conflicts resolved
- ✅ Documentation updated (if applicable)
Review Iteration:
# After receiving review comments
git add .
git commit -m "refactor: address review comments"
git push origin feature-branch
# Maintainer will be notified automatically
# Once approved, maintainer will merge
Merging Strategy
We use the following merge strategies:
1. Squash and Merge (default for features)
   - Combines all commits into one
   - Clean, linear history
   - Used for: feature branches with many small commits
2. Rebase and Merge (for clean commits)
   - Preserves individual commits
   - Linear history without merge commits
   - Used for: well-crafted commit history worth preserving
3. Regular Merge (rarely)
   - Creates a merge commit
   - Preserves branch history
   - Used for: long-running feature branches, releases
After Merge:
# Delete your feature branch
git checkout main
git pull origin main
git branch -d feature-branch
git push origin --delete feature-branch # Delete remote branch
Documentation Standards
When to Update Documentation:
- ✅ Always for new user-facing features
- ✅ Always for changed command behavior
- ✅ Always for new configuration options
- ✅ Usually for significant refactoring
- ⚠️ Sometimes for internal changes
- ❌ Rarely for minor bug fixes
Documentation Locations:
| Type | Location | When to Update |
|---|---|---|
| User Guide | README.md | New features, commands, configuration |
| API Reference | Docstrings | New functions, classes, methods |
| Architecture | Wiki Architecture Page | Major structural changes |
| Testing | TESTING.md | New test patterns, tools |
| Changelog | CHANGELOG.md | Every release |
| Code Comments | Inline | Complex logic, non-obvious decisions |
Documentation Examples:
Good Docstring:
async def get_server_resources(self, server_id: str) -> dict:
    """
    Get resource usage for a specific server.

    Uses client API key as this is a client endpoint. Returns current
    state and resource metrics including CPU, memory, disk, and network.

    Args:
        server_id: The Pterodactyl server identifier (e.g., 'abc123')

    Returns:
        Dictionary containing server resource usage and current state.
        Example structure:
        {
            'attributes': {
                'current_state': 'running',
                'resources': {
                    'cpu_absolute': 45.5,
                    'memory_bytes': 1073741824,
                    'disk_bytes': 5368709120
                }
            }
        }

    Raises:
        aiohttp.ClientError: For network-related issues

    Example:
        >>> resources = await api.get_server_resources('abc123')
        >>> print(resources['attributes']['current_state'])
        'running'
    """
    logger.debug(f"Fetching resource usage for server {server_id}")
    try:
        response = await self._request("GET", f"client/servers/{server_id}/resources")
        if response.get('status') == 'error':
            error_msg = response.get('message', 'Unknown error')
            logger.error(f"Failed to get resources for server {server_id}: {error_msg}")
            return {'attributes': {'current_state': 'offline'}}
        state = response.get('attributes', {}).get('current_state', 'unknown')
        logger.debug(f"Server {server_id} current state: {state}")
        return response
    except Exception as e:
        logger.error(f"Exception getting resources for server {server_id}: {str(e)}")
        return {'attributes': {'current_state': 'offline'}}
Good Inline Comment:
# Calculate dynamic CPU scale limit in 100% increments
# This handles multi-vCPU servers where usage can exceed 100%
# e.g., 4 vCPU server can use up to 400% CPU
cpu_scale_limit = math.ceil(max_cpu_value / 100) * 100
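A quick sanity check of that rounding (illustrative only, not project code):
import math

def cpu_scale_limit(max_cpu_value: float) -> int:
    """Round the observed peak CPU percentage up to the next 100% increment."""
    return math.ceil(max_cpu_value / 100) * 100

assert cpu_scale_limit(45.5) == 100   # light load on a single vCPU
assert cpu_scale_limit(250.0) == 300  # 4 vCPU server using 2.5 cores
assert cpu_scale_limit(400.0) == 400  # exactly at the 4-core ceiling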
Good README Section:
### `/server_status` Command
Display an interactive dashboard to select which server to monitor.
**Permissions Required:**
- Use Slash Commands
**User Requirements:**
- Must be in the configured guild (AllowedGuildID)
**Usage:**
1. Type `/server_status` in any channel
2. An ephemeral dropdown menu appears (only visible to you)
3. Select a server from the list
4. The bot posts a status embed in the current channel
**Behavior:**
- If an embed already exists for that server, it will be deleted first
- The new embed will auto-update every 10 seconds
- Button controls are available for users with "Game Server User" role
**Example:**
User: /server_status
Bot: [Ephemeral dropdown with servers]
User: [Selects "Minecraft Production"]
Bot: [Posts status embed in channel]
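As a rough sketch of the auto-update behavior described above, a `discord.ext.tasks` loop along these lines could drive the 10-second refresh (the helper names `tracked_embeds` and `build_status_embed` are hypothetical, not the bot's actual internals):
import discord
from discord.ext import tasks

@tasks.loop(seconds=10)
async def refresh_status_embeds():
    """Hypothetical sketch: refresh every tracked status embed on a fixed interval."""
    for server_id, message in list(tracked_embeds.items()):  # tracked_embeds: assumed {server_id: discord.Message}
        try:
            embed = await build_status_embed(server_id)  # assumed helper that builds the embed
            await message.edit(embed=embed)
        except discord.HTTPException as exc:
            logger.warning(f"Failed to update embed for {server_id}: {exc}")

@refresh_status_embeds.before_loop
async def wait_until_bot_ready():
    await bot.wait_until_ready()  # bot: the running discord.py client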
Dependency Management
Adding New Dependencies:
1. Evaluate necessity:
   - Can we implement this ourselves?
   - Is it actively maintained?
   - What's the license?
   - What are the transitive dependencies?
2. Add to requirements:
   # Add to requirements.txt with version pinning
   echo "new-package==1.2.3" >> requirements.txt
   # For test dependencies
   echo "test-package==2.3.4" >> requirements-test.txt
3. Document usage:
   - Why this dependency is needed
   - What it replaces or adds
   - Any special configuration
4. Update Docker:
   - Rebuild the Docker image to include the new dependency
   - Test in the containerized environment
5. Security check:
   safety check
   pip-audit
Updating Dependencies:
# Check for outdated packages
make outdated
pip list --outdated
# Update specific package
pip install --upgrade package-name==X.Y.Z
# Update all (carefully!)
pip install --upgrade -r requirements.txt
# Run security scans
make security
# Run full test suite
make test
# Update requirements file
pip freeze > requirements-frozen.txt
Dependency Version Pinning:
# ❌ Bad - No version specified
requests
# ⚠️ Okay - Minimum version
requests>=2.28.0
# ✅ Good - Pinned version
requests==2.31.0
# ✅ Best - Pinned with hash (in production)
requests==2.31.0 --hash=sha256:abc123...
Performance Considerations
When contributing, consider:
1. API Rate Limits
Problem: Discord and Pterodactyl APIs have rate limits
Solutions:
# ✅ Good - Batch operations
await asyncio.gather(
    update_embed(server1),
    update_embed(server2),
    update_embed(server3)
)

# ✅ Good - Add delays between requests
await asyncio.sleep(0.5)

# ❌ Bad - Sequential with no delays
for server in servers:
    await update_embed(server)  # May hit rate limits
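Another way to stay under rate limits is to cap concurrency explicitly; a minimal sketch with a semaphore (the `update_embed` call and `servers` list are placeholders):
import asyncio

api_semaphore = asyncio.Semaphore(3)  # allow at most 3 concurrent API calls

async def throttled_update(server):
    async with api_semaphore:
        await update_embed(server)   # placeholder for the real update call
        await asyncio.sleep(0.5)     # small gap to spread requests out

await asyncio.gather(*(throttled_update(server) for server in servers))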
2. Memory Usage
Problem: The bot runs 24/7, so memory leaks accumulate over time
Solutions:
# ✅ Good - Use generators for large datasets
def process_servers():
    for server in get_servers():
        yield process(server)

# ✅ Good - Explicit cleanup
async def cleanup():
    await session.close()
    plt.close('all')  # Close matplotlib figures

# ❌ Bad - Loading everything into memory
all_data = [expensive_operation(x) for x in huge_list]
3. Blocking Operations
Problem: Blocking operations freeze the entire bot
Solutions:
# ✅ Good - Use async/await
async def fetch_data():
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            return await response.json()

# ✅ Good - Run blocking code in executor
import asyncio
loop = asyncio.get_event_loop()
result = await loop.run_in_executor(None, blocking_function)

# ❌ Bad - Synchronous blocking call
import requests
response = requests.get(url)  # Blocks entire event loop!
4. Database Queries (if implemented)
Solutions:
# ✅ Good - Use connection pooling
# ✅ Good - Index frequently queried columns
# ✅ Good - Batch inserts/updates
# ❌ Bad - N+1 query problem
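The bot has no database today, so the following is only a hypothetical sketch (using aiosqlite) of batching versus the N+1 pattern:
import aiosqlite

async def save_metrics(rows):  # rows: list of (server_id, cpu, memory) tuples
    async with aiosqlite.connect("metrics.db") as db:
        await db.execute(
            "CREATE TABLE IF NOT EXISTS metrics (server_id TEXT, cpu REAL, memory REAL)"
        )
        # ✅ Good - one batched statement and a single commit for all rows
        await db.executemany(
            "INSERT INTO metrics (server_id, cpu, memory) VALUES (?, ?, ?)", rows
        )
        await db.commit()
        # ❌ Bad - N+1: one execute/commit round trip per row
        # for row in rows:
        #     await db.execute("INSERT INTO metrics VALUES (?, ?, ?)", row)
        #     await db.commit()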
Security Considerations
Security Checklist for Contributors:
Authentication & Authorization
- All Discord interactions check guild ID
- Role requirements enforced for sensitive actions
- API keys never logged or displayed
- User input validated before use
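A minimal sketch of the guild and role checks from the checklist above (the signature for `check_allowed_guild()`, the `ALLOWED_GUILD_ID` constant, and the role-check helper are assumptions; only the helper's name appears elsewhere in this guide):
import discord

def check_allowed_guild(interaction: discord.Interaction) -> bool:
    """Return True only when the interaction comes from the configured guild."""
    return interaction.guild is not None and interaction.guild.id == ALLOWED_GUILD_ID  # assumed constant

def has_game_server_role(interaction: discord.Interaction) -> bool:
    """Return True when the invoking member holds the 'Game Server User' role."""
    member = interaction.user
    return isinstance(member, discord.Member) and any(
        role.name == "Game Server User" for role in member.roles
    )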
Input Validation
# ✅ Good - Validate user input
def validate_server_id(server_id: str) -> bool:
    if not server_id or not isinstance(server_id, str):
        return False
    if len(server_id) > 50:  # Reasonable limit
        return False
    if not server_id.isalnum():  # Only alphanumeric
        return False
    return True

# ❌ Bad - Direct use of user input
server_id = interaction.data['server_id']
await api.get_server(server_id)  # No validation!
Secrets Management
# ✅ Good - From config file
token = config.get('Discord', 'Token')
# ✅ Good - From environment
token = os.getenv('DISCORD_TOKEN')
# ❌ Bad - Hardcoded
token = "MTIzNDU2Nzg5.ABCDEF.ghijklmnop"
Error Handling
# ✅ Good - Generic error messages
try:
    result = await api.dangerous_operation()
except Exception as e:
    logger.error(f"Operation failed: {str(e)}")
    await interaction.response.send_message(
        "An error occurred. Please try again.",
        ephemeral=True
    )

# ❌ Bad - Leaking sensitive info
except Exception as e:
    await interaction.response.send_message(
        f"Error: {str(e)}\nAPI Key: {api.key}\nServer: {server_details}",
        ephemeral=True
    )
Logging Security
# ✅ Good - Sanitize sensitive data
logger.info(f"User {user.id} executed command")
logger.debug(f"API request to {url}")
# ❌ Bad - Logging secrets
logger.debug(f"Using API key: {api_key}")
logger.info(f"Server details: {server_data}") # May contain tokens
Troubleshooting Contributions
Common Issues and Solutions:
Import Errors
# Problem: ModuleNotFoundError
# Solution: Install dependencies
pip install -r requirements.txt -r requirements-test.txt
# Problem: Wrong module path
# Solution: Add to PYTHONPATH
export PYTHONPATH="${PYTHONPATH}:$(pwd)"
Test Failures
# Problem: Tests fail locally
# Solution 1: Check Python version
python --version # Must be 3.9+
# Solution 2: Update dependencies
pip install --upgrade -r requirements-test.txt
# Solution 3: Clean cache
make clean-all
pytest --cache-clear
# Solution 4: Run specific test with verbose output
pytest test_pterodisbot.py::TestClass::test_method -vv --tb=long
Linting Failures
# Problem: Flake8 errors
# Solution: Auto-fix with black and isort
make format
# Problem: Pylint errors
# Solution: Address specific issues or disable specific checks
# Add to top of file: # pylint: disable=specific-check
# Problem: Formatting issues
# Solution: Auto-format
black --line-length=120 file.py
isort --profile black file.py
Git Issues
# Problem: Merge conflicts
# Solution: Rebase and resolve
git fetch origin
git rebase origin/main
# Fix conflicts in files
git add .
git rebase --continue
# Problem: Accidentally committed to wrong branch
# Solution: Cherry-pick to correct branch
git checkout correct-branch
git cherry-pick commit-hash
# Problem: Want to undo last commit
# Solution: Use reset (if not pushed)
git reset --soft HEAD~1 # Keep changes
git reset --hard HEAD~1 # Discard changes
Docker Issues
# Problem: Docker build fails
# Solution: Clear cache and rebuild
docker build --no-cache -t pterodisbot:latest .
# Problem: Container won't start
# Solution: Check logs
docker logs pterodisbot
# Problem: Config not loading
# Solution: Check volume mount
docker run -v $(pwd)/config.ini:/app/config.ini:ro pterodisbot
Testing Your Contribution
Before submitting PR, ensure:
1. Unit Tests Pass
# Run all unit tests
make test-unit
# Run your specific test
pytest test_pterodisbot.py::TestYourFeature -v
# Check coverage
make test-coverage
# Ensure your new code has >80% coverage
2. Integration Tests Pass
# Run integration tests
make test-integration
# If you added integration test
pytest test_pterodisbot.py::TestIntegration::test_your_workflow -v
3. Code Quality Passes
# Run all linters
make lint
# Auto-fix formatting
make format
# Individual checks
flake8 your_file.py
pylint your_file.py
black --check your_file.py
4. Security Scans Pass
# Run all security checks
make security
# Individual scans
bandit -r your_file.py
safety check
pip-audit
5. Manual Testing
# Create test config if needed
cp config.ini.example config.ini
# Edit with test credentials
# Run bot locally
python pterodisbot.py
# Test your changes in Discord:
# - Use test commands
# - Click test buttons
# - Verify expected behavior
# - Test error cases
# - Check logs for errors
6. Documentation Updated
# Check if documentation needs updates
# - README.md for user-facing changes
# - Docstrings for code changes
# - TESTING.md for test changes
# - Comments for complex logic
Benchmarking and Profiling
When optimizing performance:
CPU Profiling
# Profile the bot
python -m cProfile -o profile.stats pterodisbot.py
# Analyze results
python -m pstats profile.stats
>>> sort cumtime
>>> stats 20 # Show top 20 functions
# Visualize with snakeviz
pip install snakeviz
snakeviz profile.stats
Memory Profiling
# Install memory profiler
pip install memory_profiler
# Decorate functions to profile
from memory_profiler import profile
@profile
def my_function():
    ...  # Function code to profile goes here
# Run with profiler
python -m memory_profiler pterodisbot.py
Async Profiling
# Install yappi
pip install yappi
# Add to code:
import yappi
yappi.start()
# Code to profile
yappi.stop()
yappi.get_func_stats().print_all()
Performance Targets
| Metric | Target | Measurement |
|---|---|---|
| Bot startup time | <5 seconds | Time to "Bot ready" log |
| Embed update cycle | <30 seconds | Time to update all embeds |
| Command response time | <1 second | Time to ephemeral response |
| CPU usage (average) | <1% of single modern CPU thread | Docker stats |
| Memory usage (average) | <200 MB | Docker stats |
| API calls per minute | <100 | Log analysis |
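To check the "command response time" target, a simple timing wrapper around a handler can log elapsed time (a hedged sketch; this decorator is not part of the bot):
import functools
import time

def log_duration(func):
    """Log how long an async command handler takes to respond."""
    @functools.wraps(func)
    async def wrapper(*args, **kwargs):
        start = time.monotonic()
        try:
            return await func(*args, **kwargs)
        finally:
            elapsed = time.monotonic() - start
            logger.info(f"{func.__name__} responded in {elapsed:.3f}s")  # logger assumed from the bot module
    return wrapper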
Advanced Contribution Topics
Adding New Bot Commands
Complete sample workflow:
1. Design the command:
   # Command specification
   Name: /backup
   Description: Create a backup of the server
   Parameters:
     - server_id: str (required)
     - description: str (optional)
   Permissions: "Game Server User" role
   Response: Ephemeral confirmation
2. Implement the command:
   @bot.tree.command(name="backup", description="Create a backup of the server")
   async def backup_server(
       interaction: discord.Interaction,
       server_id: str,
       description: str = "Manual backup"
   ):
       """
       Create a backup of a Pterodactyl server.

       Args:
           interaction: Discord interaction object
           server_id: The Pterodactyl server identifier
           description: Optional backup description
       """
       # Implementation (see the sketch after this list)
3. Add API method:
   # In PterodactylAPI class
   async def create_backup(self, server_id: str, description: str) -> dict:
       """
       Create a backup for a server.

       Args:
           server_id: The server identifier
           description: Backup description

       Returns:
           API response dictionary
       """
       logger.info(f"Creating backup for server {server_id}")
       return await self._request(
           "POST",
           f"client/servers/{server_id}/backups",
           {"description": description}
       )
4. Write tests:
   @pytest.mark.asyncio
   async def test_backup_command_success(mock_discord_interaction, mock_pterodactyl_api):
       """Test successful backup creation."""
       mock_pterodactyl_api.create_backup = AsyncMock(
           return_value={'status': 'success'}
       )
       await backup_server(mock_discord_interaction, "abc123", "Test backup")
       mock_pterodactyl_api.create_backup.assert_called_once_with("abc123", "Test backup")
       mock_discord_interaction.followup.send.assert_called_once()
5. Update documentation:
   ### `/backup` Command
   Create a backup of a Pterodactyl server.
   **Parameters:**
   - `server_id`: The server identifier (required)
   - `description`: Backup description (optional)
   **Permissions:** "Game Server User" role required
6. Test manually and submit PR
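The `# Implementation` placeholder in step 2 could be filled in roughly like this (the deferral, embed wording, and error handling are assumptions, not the project's actual code):
# Body of backup_server(), sketched under the assumptions above
await interaction.response.defer(ephemeral=True)
if not check_allowed_guild(interaction):
    await interaction.followup.send("This command is not available here.", ephemeral=True)
    return
result = await api.create_backup(server_id, description)  # api: the shared PterodactylAPI instance
if result.get('status') == 'error':
    await interaction.followup.send("Backup failed. Check the bot logs for details.", ephemeral=True)
    return
embed = discord.Embed(
    title="Backup requested",
    description=f"Backup '{description}' started for server `{server_id}`.",
    color=discord.Color.green(),
)
await interaction.followup.send(embed=embed, ephemeral=True)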
Adding New Metrics
Complete sample workflow for adding disk I/O metrics:
1. Update ServerMetricsGraphs:
   class ServerMetricsGraphs:
       def __init__(self, server_id: str, server_name: str):
           self.server_id = server_id
           self.server_name = server_name
           # Add new data structure
           self.data_points = deque(maxlen=6)  # (timestamp, cpu, memory, disk_io)

       def add_data_point(
           self,
           cpu_percent: float,
           memory_mb: float,
           disk_io_mb: float,  # New parameter
           timestamp: Optional[datetime] = None
       ):
           """Add data point with disk I/O."""
           if timestamp is None:
               timestamp = datetime.now()
           self.data_points.append((timestamp, cpu_percent, memory_mb, disk_io_mb))

       def generate_disk_io_graph(self) -> Optional[io.BytesIO]:
           """Generate disk I/O graph."""
           # Implementation similar to CPU/memory graphs (see the sketch after this list)
           pass
2. Update data collection:
   # In update_status method
   disk_io = round(
       resource_attributes.get('resources', {}).get('disk_io_bytes', 0) / (1024 ** 2),
       2
   )
   self.metrics_manager.add_server_data(
       server_id, server_name, cpu_usage, memory_usage, disk_io  # New parameter
   )
3. Write tests:
   def test_disk_io_tracking(self):
       """Test disk I/O data tracking."""
       graphs = ServerMetricsGraphs('abc123', 'Test Server')
       graphs.add_data_point(50.0, 1024.0, 15.5)  # cpu, memory, disk_io
       assert len(graphs.data_points) == 1
       assert graphs.data_points[0][3] == 15.5  # disk_io value
4. Update documentation and submit PR
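One possible shape for `generate_disk_io_graph()`, assuming the same matplotlib approach the existing CPU and memory graphs use (styling and figure size are guesses):
# Assumes module-level imports: io, matplotlib.pyplot as plt, Optional from typing
def generate_disk_io_graph(self) -> Optional[io.BytesIO]:
    """Render the buffered disk I/O samples to an in-memory PNG."""
    if not self.data_points:
        return None
    times = [point[0] for point in self.data_points]
    disk_io = [point[3] for point in self.data_points]
    fig, ax = plt.subplots(figsize=(6, 3))
    ax.plot(times, disk_io, marker="o")
    ax.set_title(f"{self.server_name} - Disk I/O (MB)")
    ax.set_ylabel("MB")
    fig.autofmt_xdate()
    buffer = io.BytesIO()
    fig.savefig(buffer, format="png")
    plt.close(fig)  # close the figure to avoid the matplotlib memory leak noted earlier
    buffer.seek(0)
    return buffer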
Extending the API Client
Sample workflow for adding file management support:
1. Add new methods to PterodactylAPI:
   async def list_files(self, server_id: str, directory: str = "/") -> dict:
       """List files in server directory."""
       return await self._request(
           "GET",
           f"client/servers/{server_id}/files/list",
           {"directory": directory}
       )

   async def get_file_contents(self, server_id: str, file_path: str) -> dict:
       """Get contents of a file."""
       return await self._request(
           "GET",
           f"client/servers/{server_id}/files/contents",
           {"file": file_path}
       )
2. Add tests with mocked responses:
   @pytest.mark.asyncio
   async def test_list_files(mock_pterodactyl_api):
       """Test file listing."""
       mock_pterodactyl_api._request = AsyncMock(return_value={
           'data': [
               {'name': 'config.yml', 'size': 1024},
               {'name': 'server.jar', 'size': 50000000}
           ]
       })
       files = await mock_pterodactyl_api.list_files('abc123', '/')
       assert len(files['data']) == 2
3. Document the new API methods and submit a PR (a usage sketch follows below)
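For illustration, a hypothetical command that surfaces `list_files()` in Discord might look like this (the command name, embed layout, and 20-entry cap are assumptions):
@bot.tree.command(name="files", description="List files in a server directory")
async def list_server_files(
    interaction: discord.Interaction,
    server_id: str,
    directory: str = "/",
):
    await interaction.response.defer(ephemeral=True)
    listing = await api.list_files(server_id, directory)  # api: shared PterodactylAPI instance
    names = [item['name'] for item in listing.get('data', [])][:20]  # keep the embed short
    embed = discord.Embed(
        title=f"Files in {directory}",
        description="\n".join(names) or "No files found.",
    )
    await interaction.followup.send(embed=embed, ephemeral=True)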
Release Management
For maintainers preparing releases:
Pre-Release Checklist
- All tests passing on main branch
- No open critical bugs
- Documentation up to date
- CHANGELOG.md updated with all changes
- Version bumped in code
- Release notes drafted
Release Process
1. Update version:
   # In pterodisbot.py
   __version__ = "1.4.0"
2. Update CHANGELOG.md:
   ## [1.4.0] - 2024-10-20
   ### Added
   - Server backup functionality via `/backup` command
   - Disk I/O metrics in status embeds
   - File management API methods
   ### Changed
   - Improved error messages for API failures
   - Enhanced logging for debugging
   ### Fixed
   - CPU scaling calculation for 32+ core servers
   - Memory leak in graph generation
   - Race condition in embed updates
   ### Security
   - Updated discord.py to 2.3.3 (CVE-2024-XXXXX)
   - Added input validation for all user commands
3. Create release commit:
   git add .
   git commit -m "chore(release): bump version to v1.4.0"
   git push origin main
4. Create and push tag:
   git tag -a v1.4.0 -m "Release v1.4.0

   Added:
   - Server backup functionality
   - Disk I/O metrics
   - File management API

   Changed:
   - Improved error messages
   - Enhanced logging

   Fixed:
   - CPU scaling for 32+ cores
   - Memory leak in graphs
   - Embed update race condition

   Security:
   - Updated discord.py (CVE fix)
   - Added input validation"
   git push origin v1.4.0
5. CI/CD automatically:
   - Runs full test suite
   - Builds Docker images for amd64 and arm64
   - Tags images: `v1.4.0`, `v1.4`, `v1`, `latest`
   - Pushes to registry
6. Create release on Gitea:
   - Go to repository → Releases → New Release
   - Select tag `v1.4.0`
   - Title: "Pterodisbot v1.4.0 Stable"
   - Description: Copy from CHANGELOG.md
   - Attach any binary artifacts if applicable
   - Publish release
7. Announce release:
   - Update Discord server announcement
   - Post in relevant channels
   - Update external documentation if needed
Hotfix Process
For critical bugs in production:
1. Create hotfix branch from latest release tag:
   git checkout v1.4.0
   git checkout -b hotfix/critical-bug
2. Fix the bug with tests:
   # Fix bug
   # Add test
   # Verify fix
   make test
3. Bump patch version:
   __version__ = "1.4.1"
4. Update CHANGELOG.md:
   ## [1.4.1] - 2024-10-21
   ### Fixed
   - Critical bug causing bot crashes on server restart
5. Commit and tag:
   git commit -am "fix: critical bug causing crashes on restart"
   git tag -a v1.4.1 -m "Hotfix v1.4.1"
6. Merge to main and push:
   git checkout main
   git merge hotfix/critical-bug
   git push origin main --tags
7. CI/CD handles deployment automatically
Release Process
Version Numbering
We follow Semantic Versioning:
MAJOR.MINOR.PATCH
- MAJOR: Breaking changes
- MINOR: New features (backwards compatible)
- PATCH: Bug fixes (backwards compatible)
Examples:
- `v1.2.3` → `v1.2.4`: Bug fix
- `v1.2.3` → `v1.3.0`: New feature
- `v1.2.3` → `v2.0.0`: Breaking change
Creating a Release
1. Update version in code:
   # pterodisbot.py
   __version__ = "1.3.0"
2. Update CHANGELOG.md:
   ## [1.3.0] - 2024-10-15
   ### Added
   - Server backup functionality
   - Scheduled backup support
   ### Fixed
   - CPU scaling for 16+ core servers
   ### Changed
   - Improved error messages
3. Create release commit:
   git add .
   git commit -m "chore(release): bump version to v1.3.0"
4. Create and push tag:
   git tag -a v1.3.0 -m "Release version 1.3.0"
   git push origin main --tags
5. CI/CD automatically:
   - Runs all tests
   - Builds Docker images
   - Tags: `v1.3.0`, `v1.3`, `v1`, `latest`
   - Pushes to registry
Community Guidelines
Code of Conduct
We are committed to providing a welcoming and inclusive environment. All contributors must:
- ✅ Be respectful and inclusive
- ✅ Accept constructive criticism gracefully
- ✅ Focus on what's best for the community
- ✅ Show empathy toward others
- ❌ Use inappropriate language or imagery
- ❌ Make personal attacks
- ❌ Publish others' private information
Communication Channels
- Issues: Bug reports and feature requests
- Pull Requests: Code contributions and discussions
- Discussions: General questions and ideas
- Discord (if applicable): Real-time community chat
Response Times
Maintainers aim to:
- Acknowledge issues within 48 hours
- Review PRs within 5 business days
- Respond to security issues within 24 hours
Contributors should:
- Respond to review comments within 1 week
- Update PRs to address feedback
- Close PRs if no longer pursuing
Maintainer Responsibilities
Maintainers will:
- Review PRs and provide feedback
- Merge approved PRs
- Create releases
- Manage issues and discussions
- Enforce community guidelines
- Keep documentation updated
- Ensure CI/CD pipeline works
Maintainers commit to:
- Respectful and constructive feedback
- Timely responses
- Clear communication
- Fair and unbiased reviews
- Supporting contributors
Recognition
Contributors are credited in:
- CONTRIBUTORS.md file
- Release notes
- Git commit history
Significant contributors may:
- Receive collaborator status
- Be invited to maintain specific areas
- Help guide project direction
First-Time Contributors
Looking for your first contribution?
Issues labeled with `good-first-issue` are great starting points:
- Well-defined scope
- Clear acceptance criteria
- Mentorship available
Example first contributions:
- Fix typos in documentation
- Improve error messages
- Add test coverage
- Enhance logging
- Update dependencies
Advanced Contributions
For experienced contributors:
Issues labeled with `help-wanted` need expertise:
- Performance optimization
- Architecture improvements
- Complex feature implementation
- Security enhancements
License
By contributing, you agree that your contributions will be licensed under the GPL-3.0 License.
All contributions must:
- Be your original work
- Not infringe on others' intellectual property
- Be compatible with GPL-3.0
Getting Help
Questions about contributing?
1. Check existing documentation:
   - README.md
   - Project Wiki
   - Code comments and docstrings
2. Search existing issues and PRs:
   - Someone may have already asked
   - Previous discussions contain valuable context
3. Ask in Discussions:
   - General questions
   - Ideas for contributions
   - Clarification on project direction
4. Open an issue:
   - Specific technical questions
   - Clarification on code behavior
   - Help with development setup
Quick Reference
Development Commands
# Setup
make install # Install all dependencies
make setup-test # Create test configuration
# Testing
make test # Run all tests with coverage
make test-quick # Fast test without coverage
make test-unit # Only unit tests
make test-integration # Only integration tests
make watch # Watch mode (continuous testing)
# Code Quality
make lint # Run all linters
make format # Auto-format code
make format-check # Check formatting without changes
make security # Run security scans
# CI/CD
make ci # Run full CI pipeline locally
# Cleanup
make clean # Remove test artifacts
make clean-all # Deep clean (includes caches)
# Docker
make docker-build # Build Docker image
make docker-run # Run bot in container
# Utilities
make outdated # Check for outdated packages
make update # Update all dependencies
make freeze # Generate requirements-frozen.txt
Git Workflow
# Start new feature
git checkout -b feature/feature-name
# Make changes and commit
git add .
git commit -m "feat(scope): description"
# Keep branch updated
git fetch origin
git rebase origin/main
# Push and create PR
git push origin feature/feature-name
# After PR approval and merge
git checkout main
git pull origin main
git branch -d feature/feature-name
Common Tasks
Add new slash command:
- Add command in `pterodisbot.py` with `@bot.tree.command()`
- Add guild check with `check_allowed_guild()`
- Add tests in `test_pterodisbot.py`
- Update README.md with command documentation
- Test locally, then submit PR
Add new API endpoint:
- Add method to the `PterodactylAPI` class
- Add error handling and logging
- Add unit tests with mocked responses
- Update docstrings
- Test with real API, then submit PR
Fix a bug:
- Create branch: `git checkout -b fix/bug-description`
- Write failing test
- Fix the bug
- Verify test passes
- Run full test suite: `make test`
- Commit: `git commit -m "fix(scope): description"`
- Push and create PR
Support
- Issues: https://git.serendipity.systems/k.eaven/pterodactyl-discord-bot/issues
- Documentation: Check README.md and Wiki
- License: GPL-3.0 (see LICENSE file)
Important Note
This is private git infrastructure. If you want to contribute to any project hosted on this site or create your own repositories here, please email the admin (kimura.eaven@gmail.com) to set up an account for you.
Thank you for contributing to Pterodactyl Discord Bot! 🎉
Your contributions help make Pterodactyl server management more accessible for everyone in the community.