## Overview

HoxCore is designed to be LLM-native from the ground up. This means that every aspect of the system, from entity definitions to relationships and metadata, is structured to be understood by both humans and Large Language Models (LLMs).
### Why LLM-Native?
By making HoxCore LLM-native, we enable:
- AI-assisted project management and decision-making
- Natural language queries across your entire registry
- Context-aware assistance based on project metadata
- Automated insights and recommendations
- Intelligent search and discovery
## Key AI Capabilities

- **Model References**: Link projects to specific LLM models for specialized assistance
- **Knowledge Bases**: Connect projects to knowledge bases for context-aware AI
- **Registry Indexing**: Index your entire registry for AI-powered search
- **Natural Language**: Query and interact with your registry using natural language
## LLM-Native Design

HoxCore's YAML-based entity definitions are specifically designed to be machine-readable while remaining human-friendly.

### Structured Metadata
```yaml
type: project
title: "AI-Powered CLI Tool"
description: >
  Building an intelligent command-line interface
  with LLM integration and declarative templates.
  This project aims to create a universal project
  registry that can be understood by both humans
  and AI systems.
category: software.dev/cli-tool
tags: [cli, ai, python, llm, automation]

# AI can understand relationships
parent: prog_digital_transformation
related: [proj_backend_api, proj_frontend_app]

# AI can reason about status and timeline
status: active
start_date: 2024-01-01
due_date: 2024-12-31
```
### Why This Works for LLMs
| Aspect | Human Benefit | LLM Benefit |
|---|---|---|
| YAML Format | Easy to read and edit | Structured, parseable data |
| Descriptive Fields | Clear understanding of purpose | Rich context for reasoning |
| Relationships | Visual project hierarchy | Graph-based reasoning |
| Tags & Categories | Quick filtering and search | Semantic understanding |
| Timestamps | Track progress | Temporal reasoning |
> **Human + Machine Understanding**
>
> The same YAML file that you edit manually can be understood, analyzed, and reasoned about by AI systems. This dual-purpose design is at the core of HoxCore's philosophy.
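To make the dual-purpose idea concrete, here is a minimal sketch of how a parsed entity (YAML loaded into a plain dictionary) can be traversed by code or flattened into LLM-ready context. The helper functions are illustrative assumptions, not part of any published HoxCore API:

```python
# A HoxCore entity after YAML parsing: just a dictionary whose field
# names follow the example above.
entity = {
    "type": "project",
    "title": "AI-Powered CLI Tool",
    "status": "active",
    "tags": ["cli", "ai", "python", "llm", "automation"],
    "parent": "prog_digital_transformation",
    "related": ["proj_backend_api", "proj_frontend_app"],
}

def linked_entities(entity: dict) -> list[str]:
    """Collect every entity ID this project points at (parent + related)."""
    links = []
    if entity.get("parent"):
        links.append(entity["parent"])
    links.extend(entity.get("related", []))
    return links

def summarize(entity: dict) -> str:
    """Flatten the metadata into a single line of LLM-ready context."""
    return (f"{entity['type']} '{entity['title']}' "
            f"[{entity['status']}] tags={','.join(entity['tags'])}")

print(linked_entities(entity))
print(summarize(entity))
```

The same fields a human edits by hand are exactly the fields a program (or an LLM prompt builder) can walk mechanically.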
## Model Integration

Projects can reference one or more LLM models for specialized assistance.

### Basic Model Reference

```yaml
models:
  - id: assistant
    provider: openwebui
    url: http://openwebui.local/?models=assistant
```
### Multiple Models

Different models for different purposes:

```yaml
models:
  # General-purpose assistant
  - id: general-assistant
    provider: openwebui
    url: http://openwebui.local/?models=gpt-4
    description: "General project assistance and planning"

  # Code-specific model
  - id: code-helper
    provider: ollama
    model: codellama
    description: "Code generation and review"

  # Lightweight model for quick queries
  - id: quick-assistant
    provider: ollama
    model: llama2
    description: "Fast responses for simple queries"

  # Cloud-based model for complex tasks
  - id: advanced-reasoning
    provider: openai
    model: gpt-4-turbo
    api_key_env: OPENAI_API_KEY
    description: "Complex reasoning and analysis"
```
### Model Configuration Options

| Field | Required | Description |
|---|---|---|
| `id` | Yes | Unique identifier for the model reference |
| `provider` | Yes | Provider name (`openwebui`, `ollama`, `openai`, etc.) |
| `url` | No | Direct URL to the model interface |
| `model` | No | Specific model name (for providers like Ollama) |
| `api_key_env` | No | Environment variable containing the API key |
| `description` | No | Purpose or use case for this model |
### Use Cases for Multiple Models
- Specialized Tasks: Use different models for code, documentation, and planning
- Cost Optimization: Use lightweight models for simple tasks, powerful models for complex ones
- Privacy: Use local models for sensitive data, cloud models for general tasks
- Performance: Balance speed and quality based on task requirements
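Routing between multiple models can be as simple as matching a task type against each entry's declared purpose. The sketch below is an illustrative assumption (the `tasks` field and selection rule are not HoxCore features), showing how the model list above could drive a cost-aware router:

```python
# Hypothetical router over a models list like the one above. The
# "tasks" key and the fallback rule are illustrative assumptions.
MODELS = [
    {"id": "code-helper", "provider": "ollama", "model": "codellama",
     "tasks": ["code"]},
    {"id": "quick-assistant", "provider": "ollama", "model": "llama2",
     "tasks": ["quick"]},
    {"id": "advanced-reasoning", "provider": "openai", "model": "gpt-4-turbo",
     "tasks": ["analysis", "planning"]},
]

def pick_model(task: str) -> dict:
    """Return the first model whose declared tasks cover the request;
    fall back to the most capable (last) model for anything else."""
    for m in MODELS:
        if task in m["tasks"]:
            return m
    return MODELS[-1]

print(pick_model("code")["id"])      # local code model for code tasks
print(pick_model("unknown")["id"])   # capable cloud model as fallback
```

Cheap local models absorb the routine traffic; only unmatched or complex tasks fall through to the expensive cloud model.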
## Knowledge Bases

Link knowledge bases to provide context-specific information to AI assistants.

### Basic Knowledge Base Reference

```yaml
knowledge_bases:
  - id: kb-project-docs
    url: http://openwebui.local/workspace/knowledge/5f0f9cc7...
```
### Multiple Knowledge Bases

```yaml
knowledge_bases:
  # Project-specific documentation
  - id: kb-project-docs
    url: http://openwebui.local/workspace/knowledge/5f0f9cc7...
    description: "Project documentation and specifications"

  # Technical standards
  - id: kb-coding-standards
    url: http://openwebui.local/workspace/knowledge/a1b2c3d4...
    description: "Company coding standards and best practices"

  # Domain knowledge
  - id: kb-domain-expertise
    path: ./knowledge/domain-specific/
    description: "Domain-specific knowledge and terminology"

  # API documentation
  - id: kb-api-docs
    url: https://api-docs.example.com
    description: "External API documentation"
```
### Knowledge Base Types

#### URL-Based Knowledge Bases

Reference external knowledge bases via URL:

```yaml
knowledge_bases:
  - id: kb-external
    url: http://openwebui.local/workspace/knowledge/abc123
    provider: openwebui
```

#### Path-Based Knowledge Bases

Reference local knowledge bases via file path:

```yaml
knowledge_bases:
  - id: kb-local
    path: ./docs/knowledge-base/
    format: markdown
```
### Knowledge Base Configuration

| Field | Required | Description |
|---|---|---|
| `id` | Yes | Unique identifier for the knowledge base |
| `url` | No\* | URL to an external knowledge base |
| `path` | No\* | Local path to knowledge base files |
| `provider` | No | Knowledge base provider (`openwebui`, `notion`, etc.) |
| `format` | No | Format of the knowledge base (`markdown`, `pdf`, etc.) |
| `description` | No | Purpose or content description |

\* Either `url` or `path` is required.
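The "either `url` or `path`" rule is easy to enforce mechanically. A minimal validator sketch, with illustrative error messages (not HoxCore's actual validation output):

```python
# Validate a knowledge base entry against the table above:
# "id" is required, and at least one of "url" / "path" must be set.
def validate_kb(kb: dict) -> list[str]:
    errors = []
    if not kb.get("id"):
        errors.append("id is required")
    if not kb.get("url") and not kb.get("path"):
        errors.append("either url or path is required")
    return errors

print(validate_kb({"id": "kb-docs", "path": "./docs/"}))  # valid -> []
print(validate_kb({"id": "kb-broken"}))                   # missing url/path
```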
## Registry Indexing

The entire HoxCore registry can be indexed and made available to AI systems as a knowledge base.

### Automatic Indexing

HoxCore maintains an internal index of all entities for fast querying:

```bash
# The index is automatically updated on entity changes
hxc create project --title "New Project"  # index updated

# Force a rebuild of the index
hxc index rebuild

# View index statistics
hxc index stats
```
### Export Registry as Knowledge Base

Export your registry in formats suitable for LLM consumption:

```bash
# Export as structured JSON
hxc export --format json --output registry.json

# Export as Markdown documentation
hxc export --format markdown --output registry.md

# Export specific entity types
hxc export --type projects --format json --output projects.json
```
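A JSON export can then be consumed by any downstream tool. The sketch below assumes a top-level `"entities"` list; the real export schema may differ, so check your actual `registry.json`:

```python
import json

# Parse an export shaped like the (assumed) structured-JSON format
# and filter it the way an AI pipeline or script might.
export = json.loads("""
{
  "entities": [
    {"id": "P-001", "type": "project", "status": "active", "tags": ["ai"]},
    {"id": "P-002", "type": "project", "status": "on-hold", "tags": ["web"]}
  ]
}
""")

active_ai = [e["id"] for e in export["entities"]
             if e["status"] == "active" and "ai" in e["tags"]]
print(active_ai)
```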
### Registry as LLM Context

Configure your registry to be accessible by AI assistants:

```yaml
# In registry config.yml
ai:
  indexing:
    enabled: true
    auto_update: true
    export_format: json
  knowledge_base:
    provider: openwebui
    url: http://openwebui.local/workspace/knowledge/registry
    auto_sync: true
    sync_interval: 1h
```
### What Gets Indexed
- Entity Metadata: All YAML fields (title, description, status, etc.)
- Relationships: Parent-child and related entity connections
- History: Git commit history and change logs
- Tags & Categories: Classification and semantic information
- Integrations: Links to external tools and resources
> **Privacy Considerations**
>
> When indexing your registry for AI systems:
>
> - Review what data is being shared with external AI providers
> - Consider using local LLM models for sensitive projects
> - Use selective indexing to exclude confidential information
> - Implement access controls on knowledge base endpoints
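Selective indexing can be reduced to a predicate applied before anything leaves your machine. The tag names and filter function below are illustrative assumptions, not HoxCore configuration:

```python
# Hypothetical pre-index filter: entities carrying any excluded tag
# never reach an external provider.
EXCLUDED_TAGS = {"confidential", "internal-only"}

def indexable(entity: dict) -> bool:
    """True if the entity carries no excluded tag."""
    return not EXCLUDED_TAGS.intersection(entity.get("tags", []))

entities = [
    {"id": "P-001", "tags": ["ai"]},
    {"id": "P-002", "tags": ["confidential", "finance"]},
]
print([e["id"] for e in entities if indexable(e)])
```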
## Global AI Assistant

HoxCore can be configured with a global AI assistant that has access to your entire registry.

### Configure the Global Assistant

```yaml
# In registry config.yml
ai:
  global_assistant:
    enabled: true
    provider: openwebui
    model: gpt-4
    url: http://openwebui.local/?models=gpt-4

    # Context configuration
    context:
      include_registry: true
      include_relationships: true
      include_history: false
      max_entities: 100

    # Capabilities
    capabilities:
      - query_entities
      - suggest_relationships
      - analyze_status
      - recommend_actions
      - generate_reports
```
### Assistant Capabilities

**Query Entities.** Ask questions about your projects in natural language:

- "Show me all active AI projects"
- "What projects are overdue?"
- "Which projects depend on the backend API?"

**Suggest Relationships.** The AI can identify potential relationships between entities:

- "These two projects seem related based on their tags and descriptions"
- "Consider linking this project to the parent program"

**Analyze Status.** Get insights about project health and progress:

- "Project X is behind schedule"
- "You have 5 projects due this month"
- "Program Y has 3 blocked projects"

**Recommend Actions.** Receive AI-powered recommendations:

- "Consider updating the status of completed projects"
- "Project X might benefit from additional resources"
- "Time to review on-hold projects"
### Using the Global Assistant

```bash
# Ask questions via the CLI
hxc ai ask "What are my active projects?"

# Get project recommendations
hxc ai recommend --project P-001

# Analyze registry health
hxc ai analyze

# Generate reports
hxc ai report --type status --format markdown
```
## Natural Language Queries

Query your registry using natural language instead of structured commands.

### Natural Language Query Examples

```bash
# Instead of: hxc list projects --status active --tags ai
hxc ai query "Show me all active AI projects"

# Instead of: hxc query projects --due-before 2024-12-31 --status active
hxc ai query "Which projects are due before the end of the year?"

# Complex queries
hxc ai query "What projects are related to the backend API and are currently on hold?"

# Relationship queries
hxc ai query "Show me all projects under the Digital Transformation program"

# Status analysis
hxc ai query "Which projects are overdue?"

# Trend analysis
hxc ai query "What projects were completed last month?"
```
### Query Translation

The AI assistant translates natural language into structured queries.

**Natural language input:**

> "Show me all active software projects with AI tags that are due this quarter"

**Structured query:**

```bash
hxc query projects \
  --status active \
  --category software.dev \
  --tags ai \
  --due-before 2024-03-31
```
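Once the assistant has extracted the filters from the sentence, mapping them onto CLI flags is mechanical. The sketch below shows only that final step (the extraction itself is the LLM's job); the function and flag-assembly rule are illustrative:

```python
# Hypothetical translation step: extracted filters -> hxc CLI flags.
def to_cli(filters: dict) -> str:
    parts = ["hxc query projects"]
    for key, value in filters.items():
        parts.append(f"--{key} {value}")
    return " ".join(parts)

# Filters an LLM might extract from the natural-language input above
filters = {
    "status": "active",
    "category": "software.dev",
    "tags": "ai",
    "due-before": "2024-03-31",
}
print(to_cli(filters))
```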
### Conversational Context

The AI assistant maintains context across multiple queries:

```bash
# First query
hxc ai query "Show me all AI projects"

# Follow-up query (uses context)
hxc ai query "Which of these are overdue?"

# Another follow-up
hxc ai query "Show me their dependencies"
```
## Supported Providers

HoxCore supports integration with various LLM providers.

### OpenWebUI

Self-hosted web interface for LLMs with knowledge base support.

```yaml
models:
  - id: assistant
    provider: openwebui
    url: http://openwebui.local/?models=gpt-4

knowledge_bases:
  - id: kb-docs
    provider: openwebui
    url: http://openwebui.local/workspace/knowledge/abc123
```

**Features:** Knowledge bases, multiple models, web interface
**Best for:** Self-hosted deployments, privacy-focused setups
### Ollama

Local LLM runtime for running models on your machine.

```yaml
models:
  - id: code-helper
    provider: ollama
    model: codellama
    url: http://localhost:11434
  - id: general
    provider: ollama
    model: llama2
```

**Features:** Local execution, no API costs, privacy
**Best for:** Development, sensitive data, offline work
### OpenAI

Cloud-based LLM service with powerful models.

```yaml
models:
  - id: advanced
    provider: openai
    model: gpt-4-turbo
    api_key_env: OPENAI_API_KEY
  - id: fast
    provider: openai
    model: gpt-3.5-turbo
    api_key_env: OPENAI_API_KEY
```

**Features:** Powerful models, fast responses, API-based
**Best for:** Production use, complex reasoning, high quality
### Anthropic Claude

Advanced AI assistant with strong reasoning capabilities.

```yaml
models:
  - id: claude
    provider: anthropic
    model: claude-3-opus
    api_key_env: ANTHROPIC_API_KEY
```

**Features:** Long context, strong reasoning, safety-focused
**Best for:** Complex analysis, long documents, detailed reasoning
### Custom Providers

Integrate with any LLM provider via custom configuration.

```yaml
models:
  - id: custom
    provider: custom
    api_url: https://api.custom-llm.com/v1/chat
    api_key_env: CUSTOM_API_KEY
    headers:
      Content-Type: application/json
    model: custom-model-v1
```

**Features:** Flexible integration, custom endpoints
**Best for:** Enterprise deployments, specialized models
## Use Cases

Real-world scenarios where AI integration enhances project management.

### 1. Intelligent Project Discovery

**Scenario:** You have hundreds of projects and need to find relevant ones quickly.

**Solution:**

```bash
# Natural language search
hxc ai query "Find all machine learning projects that are related to healthcare and started this year"

# The AI understands context and relationships
hxc ai query "Show me projects similar to P-001"
```
### 2. Automated Status Reports

**Scenario:** Generate weekly status reports for stakeholders.

**Solution:**

```bash
# Generate a comprehensive report; the AI analyzes progress,
# identifies blockers, and suggests actions
hxc ai report --type weekly --format markdown --output report.md
```
### 3. Context-Aware Code Assistance

**Scenario:** Get code help that understands your project context.

**Solution:**

```yaml
# Project with a code-specific model and knowledge bases
models:
  - id: code-helper
    provider: ollama
    model: codellama

knowledge_bases:
  - id: kb-codebase
    path: ./src/
  - id: kb-docs
    path: ./docs/
```

The AI assistant can now provide code suggestions based on your actual codebase and documentation.
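One plausible way a path-based knowledge base could be turned into prompt context is to read every file under the configured path and concatenate it with file markers. This is purely an illustrative sketch of that idea, not HoxCore's actual loading logic:

```python
import pathlib
import tempfile

def gather_context(kb_path: str, suffix: str = ".md") -> str:
    """Concatenate all matching files under kb_path, each prefixed
    with its filename, into one context string for a prompt."""
    chunks = []
    for f in sorted(pathlib.Path(kb_path).rglob(f"*{suffix}")):
        chunks.append(f"## {f.name}\n{f.read_text()}")
    return "\n\n".join(chunks)

# Demo with a throwaway knowledge base directory
with tempfile.TemporaryDirectory() as kb:
    (pathlib.Path(kb) / "setup.md").write_text("Run `make install`.")
    context = gather_context(kb)
    print(context)
```

In practice you would also chunk and rank files so the context stays within the model's window.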
### 4. Dependency Analysis

**Scenario:** Understand complex project dependencies and impacts.

**Solution:**

```bash
# Ask about dependencies
hxc ai query "What projects will be affected if I delay the backend API project?"

# Get an impact analysis
hxc ai analyze --project P-001 --impact
```
### 5. Automated Documentation

**Scenario:** Generate documentation from project metadata.

**Solution:**

```bash
# Generate project documentation
hxc ai document --project P-001 --output README.md

# Generate an architecture overview
hxc ai document --program prog_digital_transformation --type architecture
```
### 6. Smart Recommendations

**Scenario:** Get AI-powered recommendations for project management.

**Solution:**

```bash
# Get recommendations for a project
hxc ai recommend --project P-001

# The AI might suggest:
# - Update overdue tasks
# - Link to related projects
# - Add missing tags
# - Update status based on completion
```
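Some of these suggestions could be backed by plain rules over the metadata, with the LLM only phrasing the result. The rule set below is an illustrative assumption, not HoxCore's actual recommender:

```python
from datetime import date

def recommend(entity: dict, today: date) -> list[str]:
    """Hypothetical rule-based recommendations from entity metadata."""
    recs = []
    due = entity.get("due_date")
    if entity.get("status") == "active" and due and due < today:
        recs.append(f"Project {entity['id']} is past its due date; "
                    "update its status or replan.")
    if not entity.get("tags"):
        recs.append(f"Project {entity['id']} has no tags; "
                    "add some for better search.")
    return recs

project = {"id": "P-001", "status": "active",
           "due_date": date(2024, 6, 30), "tags": []}
for r in recommend(project, today=date(2024, 12, 1)):
    print(r)
```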
## Best Practices

Guidelines for effective AI integration with HoxCore.

### 1. Choose the Right Model

**✅ Do**

- Use lightweight models (Llama2) for simple queries
- Use powerful models (GPT-4) for complex reasoning
- Use specialized models (CodeLlama) for code-related tasks
- Consider cost vs. quality trade-offs

**❌ Don't**

- Use expensive models for simple tasks
- Use general models for highly specialized tasks
- Ignore privacy implications of cloud models
### 2. Maintain Rich Metadata

**✅ Do**

- Write detailed descriptions for projects
- Use meaningful tags and categories
- Document relationships between entities
- Keep status and dates up to date

**❌ Don't**

- Leave descriptions empty or vague
- Use inconsistent tagging schemes
- Neglect relationship documentation
### 3. Organize Knowledge Bases

**✅ Do**

- Create project-specific knowledge bases
- Keep knowledge bases up to date
- Use clear, descriptive names
- Document the purpose of each knowledge base

**❌ Don't**

- Mix unrelated information in one knowledge base
- Let knowledge bases become stale
- Use cryptic or unclear identifiers
### 4. Privacy and Security

**✅ Do**

- Use local models for sensitive data
- Review what data is sent to external APIs
- Implement access controls on knowledge bases
- Use environment variables for API keys

**❌ Don't**

- Send confidential data to public APIs
- Hardcode API keys in configuration
- Share knowledge bases without access controls
### 5. Performance Optimization

**✅ Do**

- Cache AI responses when appropriate
- Use selective indexing for large registries
- Limit context size for faster responses
- Monitor API usage and costs

**❌ Don't**

- Send the entire registry with every query
- Ignore response-time issues
- Exceed API rate limits
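Caching AI responses can be as simple as memoizing by query string with a time-to-live. A minimal sketch, with an illustrative TTL and an unbounded cache that production code would cap:

```python
import time

# query -> (timestamp, answer); entries older than TTL are refetched
_cache: dict[str, tuple[float, str]] = {}
TTL_SECONDS = 300

def cached_ask(query: str, ask_fn) -> str:
    """Return a cached answer if fresh; otherwise call ask_fn and cache."""
    now = time.time()
    hit = _cache.get(query)
    if hit and now - hit[0] < TTL_SECONDS:
        return hit[1]
    answer = ask_fn(query)
    _cache[query] = (now, answer)
    return answer

calls = []
def fake_llm(q):  # stand-in for a real API call
    calls.append(q)
    return f"answer to: {q}"

print(cached_ask("active projects?", fake_llm))
print(cached_ask("active projects?", fake_llm))  # served from cache
print(len(calls))  # the backing "API" was only called once
```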
## Complete Examples

Full configuration examples for different scenarios.

### Example 1: Software Development Project

```yaml
type: project
title: "AI-Powered CLI Tool"
description: >
  Building an intelligent command-line interface
  with LLM integration and declarative templates.
category: software.dev/cli-tool
tags: [cli, ai, python, llm, automation]

# Multiple models for different purposes
models:
  # General assistance
  - id: assistant
    provider: openwebui
    url: http://openwebui.local/?models=gpt-4
    description: "General project assistance"

  # Code generation
  - id: code-helper
    provider: ollama
    model: codellama
    description: "Code generation and review"

  # Quick queries
  - id: quick
    provider: ollama
    model: llama2
    description: "Fast responses"

# Knowledge bases
knowledge_bases:
  # Project documentation
  - id: kb-docs
    path: ./docs/
    format: markdown
    description: "Project documentation"

  # Codebase
  - id: kb-code
    path: ./src/
    format: python
    description: "Source code"

  # Standards
  - id: kb-standards
    url: http://openwebui.local/workspace/knowledge/standards
    description: "Coding standards"

# Repositories
repositories:
  - name: github
    url: https://github.com/user/hoxcore

# Tools
tools:
  - name: github-projects
    provider: github
    url: https://github.com/user/hoxcore/projects/1
```
### Example 2: Research Project

```yaml
type: project
title: "Machine Learning Research"
description: >
  Investigating novel approaches to natural language
  understanding using transformer architectures.
category: academic/research-paper
tags: [ml, nlp, transformers, research]

# Research-focused models
models:
  # Literature review
  - id: research-assistant
    provider: anthropic
    model: claude-3-opus
    api_key_env: ANTHROPIC_API_KEY
    description: "Literature review and analysis"

  # Data analysis
  - id: data-analyst
    provider: openai
    model: gpt-4-turbo
    api_key_env: OPENAI_API_KEY
    description: "Data analysis and visualization"

# Knowledge bases
knowledge_bases:
  # Research papers
  - id: kb-papers
    path: ./papers/
    format: pdf
    description: "Related research papers"

  # Experimental data
  - id: kb-data
    path: ./data/
    description: "Experimental datasets"

  # Domain knowledge
  - id: kb-domain
    url: http://openwebui.local/workspace/knowledge/nlp-domain
    description: "NLP domain knowledge"

# Repositories
repositories:
  - name: overleaf
    url: https://www.overleaf.com/project/abc123
  - name: github
    url: https://github.com/user/ml-research

# Storage
storage:
  - name: gdrive
    provider: google-drive
    url: https://drive.google.com/drive/folders/research
```
### Example 3: Business Initiative

```yaml
type: program
title: "Digital Transformation Initiative"
description: >
  Company-wide initiative to modernize systems,
  processes, and capabilities.
category: business/initiative.strategic
tags: [transformation, strategy, modernization]

# Executive-level models
models:
  # Strategic planning
  - id: strategy-advisor
    provider: openai
    model: gpt-4-turbo
    api_key_env: OPENAI_API_KEY
    description: "Strategic planning and analysis"

  # Report generation
  - id: report-generator
    provider: anthropic
    model: claude-3-opus
    api_key_env: ANTHROPIC_API_KEY
    description: "Executive report generation"

# Knowledge bases
knowledge_bases:
  # Company strategy
  - id: kb-strategy
    url: https://notion.so/workspace/strategy
    provider: notion
    description: "Strategic documents"

  # Market research
  - id: kb-market
    path: ./market-research/
    description: "Market analysis and trends"

  # Best practices
  - id: kb-best-practices
    url: http://openwebui.local/workspace/knowledge/best-practices
    description: "Industry best practices"

# Tools
tools:
  - name: azure-devops
    provider: azure
    url: https://dev.azure.com/company/transformation

# Child projects
children:
  - proj_system_modernization
  - proj_process_automation
  - proj_capability_building
```
## Next Steps
Now that you understand AI features in HoxCore, explore related topics: