XEvolutionaryNetwork
XEvolutionaryNetwork is a sophisticated multi-layer optimization framework that acts like a "neural network for hyperparameter optimization." It chains together optimization layers to create powerful, automated machine learning pipelines.
Overview
XEvolutionaryNetwork represents a paradigm shift in automated machine learning. Instead of manually tuning hyperparameters, you build a network of optimization layers that automatically discover the best configurations for your specific problem.
Key Benefits
🧠 Intelligent Optimization
Multi-layer architecture that learns optimal hyperparameters automatically.
🔗 Composable Layers
Chain together different optimization strategies for complex problems.
🎯 Adaptive Learning
Learns from previous optimizations to improve future performance.
🚀 Automated Pipelines
Create end-to-end automated machine learning workflows.
Architecture Overview
XEvolutionaryNetwork consists of specialized layers that work together:
- XParamOptimiser: Bayesian optimization for hyperparameter tuning
- Evolve: Evolutionary algorithms for complex search spaces
- Tighten: Gradient-based fine-tuning for precise optimization
- Target: Goal-oriented optimization with specific objectives
- NLP: Natural language processing for text-based optimization
These layers compose like a neural network:
- Layers: Each optimization strategy is a layer
- Forward Pass: Candidate configurations flow through the optimization layers in sequence
- Backpropagation: Results from later layers feed back to inform earlier ones
- Training: The network learns which optimization strategies work best for your problem
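The layered, forward-pass idea can be sketched generically. The classes and names below are illustrative toys, not the xplainable API: each "layer" refines a parameter dictionary, and the network applies layers in order.

```python
# Toy illustration of the layered-optimization idea (hypothetical names,
# not the xplainable API): each layer refines a parameter dict, and the
# network applies layers in sequence, like a forward pass.

class CoarseSearchLayer:
    def apply(self, params):
        # Coarse step: snap parameters to a broad grid
        return {k: round(v, 1) for k, v in params.items()}

class FineTuneLayer:
    def apply(self, params):
        # Fine step: nudge each parameter slightly toward 0.5
        return {k: v + 0.05 * (0.5 - v) for k, v in params.items()}

class ToyEvolutionaryNetwork:
    def __init__(self):
        self.layers = []

    def add_layer(self, layer):
        self.layers.append(layer)
        return self  # allow chaining

    def optimise(self, params):
        # "Forward pass": each layer refines the previous layer's output
        for layer in self.layers:
            params = layer.apply(params)
        return params

network = ToyEvolutionaryNetwork()
network.add_layer(CoarseSearchLayer()).add_layer(FineTuneLayer())
result = network.optimise({"max_depth": 0.437, "weight": 0.912})
print(result)
```

The point of the chaining design is that a cheap, broad layer can narrow the search space before an expensive, precise layer runs.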
Available Layers
XParamOptimiser Layer
The foundational layer for Bayesian optimization:
from xplainable.core.optimisation import XParamOptimiser, XEvolutionaryNetwork
from xplainable.core.models import XClassifier

# Create XParamOptimiser layer
param_layer = XParamOptimiser(
    model=XClassifier(),
    X=X_train,
    y=y_train,
    metric='roc_auc',
    cv=5,
    n_iter=50,
    random_state=42
)

# Define search space
param_space = {
    'max_depth': [3, 10],
    'min_info_gain': [0.001, 0.1],
    'weight': [0.1, 1.0],
    'power_degree': [0.5, 3.0]
}

# Optimize parameters
best_params = param_layer.optimise(param_space)
print(f"Best parameters: {best_params}")
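Whatever the strategy, an optimiser over this kind of [low, high] search space repeatedly samples a candidate, scores it, and keeps the best. A random-search baseline makes that loop concrete; it is an illustrative stand-in with a toy scoring function, not XParamOptimiser's Bayesian internals.

```python
import random

# Random-search stand-in for Bayesian optimization over the same
# [low, high] search-space shape (illustrative only; the real optimiser
# chooses candidates more intelligently than uniform sampling).
random.seed(42)

param_space = {
    'max_depth': [3, 10],
    'min_info_gain': [0.001, 0.1],
    'weight': [0.1, 1.0],
    'power_degree': [0.5, 3.0],
}

def sample(space):
    # Draw one candidate uniformly from each [low, high] range
    return {k: random.uniform(lo, hi) for k, (lo, hi) in space.items()}

def score(params):
    # Toy objective standing in for cross-validated ROC AUC:
    # prefer moderate depth and higher weight
    return -abs(params['max_depth'] - 6) + params['weight']

# Evaluate 50 candidates and keep the best, mirroring n_iter=50 above
best = max((sample(param_space) for _ in range(50)), key=score)
print({k: round(v, 3) for k, v in best.items()})
```

Bayesian optimization improves on this baseline by fitting a surrogate model to past (candidate, score) pairs and sampling where the surrogate predicts improvement.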
Evolve Layer
Evolutionary optimization for complex problems:
from xplainable.core.optimisation import Evolve

# Create evolution layer
evolve_layer = Evolve(
    population_size=50,
    generations=100,
    mutation_rate=0.1,
    crossover_rate=0.8,
    selection_method='tournament'
)

# Define complex search space
complex_space = {
    'model_architecture': ['shallow', 'medium', 'deep'],
    'regularization': [0.0, 0.1, 0.2, 0.3],
    'feature_selection': ['none', 'univariate', 'recursive'],
    'preprocessing': ['standard', 'minmax', 'robust']
}

# Evolve solutions, scoring candidates with a user-supplied fitness function
best_solution = evolve_layer.evolve(complex_space, fitness_function)
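The evolutionary loop behind parameters like population_size, mutation_rate, and tournament selection can be sketched end to end. This is a generic genetic algorithm over the same categorical space with a toy fitness function, not the Evolve layer's actual internals.

```python
import random

# Generic genetic-algorithm sketch: tournament selection, uniform
# crossover, and per-gene mutation over a categorical search space.
# The fitness function is a toy stand-in for a model-evaluation callback.
random.seed(0)

space = {
    'model_architecture': ['shallow', 'medium', 'deep'],
    'regularization': [0.0, 0.1, 0.2, 0.3],
    'feature_selection': ['none', 'univariate', 'recursive'],
    'preprocessing': ['standard', 'minmax', 'robust'],
}

def fitness(ind):
    # Toy fitness: 'medium' architecture and low regularization score best
    score = {'shallow': 0.1, 'medium': 0.3, 'deep': 0.2}[ind['model_architecture']]
    return score + 0.5 - ind['regularization']

def random_individual():
    return {k: random.choice(v) for k, v in space.items()}

def tournament(pop, k=3):
    # Pick the fittest of k random contestants
    return max(random.sample(pop, k), key=fitness)

def crossover(a, b):
    # Uniform crossover: each gene comes from either parent
    return {k: random.choice([a[k], b[k]]) for k in space}

def mutate(ind, rate=0.1):
    # Each gene independently resamples with probability `rate`
    return {k: random.choice(space[k]) if random.random() < rate else v
            for k, v in ind.items()}

pop = [random_individual() for _ in range(20)]
for _ in range(30):  # generations
    pop = [mutate(crossover(tournament(pop), tournament(pop)))
           for _ in range(20)]

best = max(pop, key=fitness)
print(best['model_architecture'], best['regularization'])
```

Tournament selection keeps pressure toward fitter individuals while mutation preserves diversity, which is why evolutionary layers handle discrete, non-differentiable spaces that gradient methods cannot.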
Tighten Layer
Gradient-based fine-tuning:
from xplainable.core.optimisation import Tighten

# Create tightening layer
tighten_layer = Tighten(
    learning_rate=0.01,
    max_iterations=100,
    tolerance=1e-6,
    optimization_method='adam'
)

# Fine-tune a starting parameter set against a user-supplied objective
refined_params = tighten_layer.tighten(initial_params, objective_function)
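The role of learning_rate, max_iterations, and tolerance in gradient-based fine-tuning can be shown with a minimal sketch: plain gradient descent with finite-difference gradients on a toy quadratic objective. The names and objective are illustrative assumptions, not the Tighten internals (Adam would add per-parameter adaptive step sizes on top of this loop).

```python
# Minimal gradient-descent fine-tuning sketch with finite-difference
# gradients. Toy objective and names are illustrative, not Tighten's API.

def objective(params):
    # Toy quadratic bowl with its minimum at weight=0.6, power_degree=1.5
    return (params['weight'] - 0.6) ** 2 + (params['power_degree'] - 1.5) ** 2

def tighten(params, objective, learning_rate=0.1, max_iterations=200,
            tolerance=1e-8, eps=1e-6):
    params = dict(params)
    for _ in range(max_iterations):
        # Estimate each partial derivative by central differences
        grads = {}
        for k in params:
            up, down = dict(params), dict(params)
            up[k] += eps
            down[k] -= eps
            grads[k] = (objective(up) - objective(down)) / (2 * eps)
        step = {k: learning_rate * g for k, g in grads.items()}
        params = {k: v - step[k] for k, v in params.items()}
        if max(abs(s) for s in step.values()) < tolerance:
            break  # converged: updates smaller than tolerance
    return params

refined = tighten({'weight': 0.1, 'power_degree': 3.0}, objective)
print({k: round(v, 3) for k, v in refined.items()})
```

This is why a Tighten-style layer comes last in a network: it needs a smooth objective and a good starting point, which the coarser layers upstream provide.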