# ARC Challenge Solver

This project implements a Neural Architecture Search (NAS) approach to solve the Abstraction and Reasoning Corpus (ARC) Challenge using Proximal Policy Optimization (PPO) and NNI (Neural Network Intelligence).
## Overview

The ARC Challenge presents a series of tasks that test an AI system's ability to learn and apply abstract reasoning. This solver uses a PPO model with a customizable neural architecture to tackle these tasks, and NNI searches that architecture space for the most effective configuration.
## Features

- Implementation of PPO for solving ARC tasks
- Neural Architecture Search using NNI
- Flexible model architecture with configurable hyperparameters
- Support for variable-sized input and output grids (up to 30x30)
- A reward function that scores both grid content and output-size accuracy (sketched below)
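To make the last point concrete, here is a minimal sketch of how such a reward might blend the two signals. The function name, the overlap-based comparison, and the `size_weight` split are illustrative assumptions, not the repo's actual implementation (the real reward/loss code likely lives in `utils/loss_functions.py` or the PPO model):

```python
import numpy as np

def grid_reward(pred: np.ndarray, target: np.ndarray,
                size_weight: float = 0.3) -> float:
    # Size component: full credit only when height and width both match.
    size_score = 1.0 if pred.shape == target.shape else 0.0
    # Content component: cell-wise accuracy over the overlapping region.
    h = min(pred.shape[0], target.shape[0])
    w = min(pred.shape[1], target.shape[1])
    content_score = float((pred[:h, :w] == target[:h, :w]).mean())
    # Blend the two signals; the 70/30 split here is an assumption.
    return (1.0 - size_weight) * content_score + size_weight * size_score
```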
## Project Structure

```
arc_challenge_solver/
│
├── analysis.ipynb
├── compare_models.py
├── evaluate.py
├── LICENCE
├── main.py
├── readme.md
├── requirements.txt
├── test.py
├── train.py
│
├── config/
│   └── config.py
│
├── data/
│   ├── arc_dataloader.py
│   ├── arc_task.py
│   ├── evaluation/
│   └── training/
│
├── models/
│   ├── base_model.py
│   ├── ppo_model.py
│   └── random_model.py
│
├── nas/
│   ├── analyze_nas_results.py
│   ├── config.yml
│   └── nas_trial.py
│
└── utils/
    ├── checkpoint.py
    ├── experiment.py
    ├── loss_functions.py
    ├── metrics.py
    └── visualizer.py
```
## Installation

- Clone the repository:

```bash
git clone https://github.com/clemspace/arc-challenge-solver.git
cd arc-challenge-solver
```

- Create a virtual environment and activate it:

```bash
python -m venv venv
source venv/bin/activate  # On Windows use venv\Scripts\activate
```

- Install the required packages:

```bash
pip install -r requirements.txt
```

- Download the ARC dataset and place it in the appropriate directories under `data/training/` and `data/evaluation/` (the task file format is sketched below).
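Each task in the ARC dataset is a small JSON file containing `train` and `test` lists of input/output grid pairs. As a quick sanity check that the data landed in the right place, something like the following works; `load_arc_task` is an illustrative helper, not the repo's loader (that is `data/arc_dataloader.py`):

```python
import json
from pathlib import Path

def load_arc_task(path):
    # Each ARC task file holds "train" and "test" lists of
    # {"input": grid, "output": grid} pairs, where a grid is a
    # list of lists of integers in 0..9 (one integer per color).
    with open(path) as f:
        return json.load(f)

# Illustrative usage once the JSON files are in place:
task_file = next(Path("data/training").glob("*.json"))
task = load_arc_task(task_file)
print(len(task["train"]), "demonstration pairs,", len(task["test"]), "test pairs")
```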
## Usage

To run the Neural Architecture Search:

```bash
nnictl create --config nas/config.yml
```

This starts the NAS process using the configuration specified in `nas/config.yml`. The search explores different neural architectures and hyperparameters to optimize the PPO model's performance on ARC tasks.
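Under the hood, each NNI trial asks the tuner for a hyperparameter set and reports a score back when it finishes. Here is a minimal sketch of that contract using NNI's standard trial API; `train_and_evaluate` is a hypothetical stand-in for the real trial logic in `nas/nas_trial.py`:

```python
import nni

def train_and_evaluate(params):
    # Hypothetical stand-in for the repo's PPO training/evaluation
    # loop; returns a scalar score for the sampled configuration.
    return 0.0  # placeholder

params = nni.get_next_parameter()   # hyperparameters chosen by the tuner
score = train_and_evaluate(params)
nni.report_final_result(score)      # tell NNI how this trial performed
```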
To run other components:

- Train the PPO model:

```bash
python train.py
```

- Evaluate models:

```bash
python evaluate.py
```

- Compare models:

```bash
python compare_models.py
```

- Run the full pipeline:

```bash
python main.py
```
## Model Architecture

The PPO model uses a convolutional neural network with the following key components (a sketch follows the list):
- Variable number of convolutional layers
- Skip connections (optional)
- Dropout for regularization
- Flexible activation functions
- Size prediction for variable output sizes
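Below is a minimal PyTorch sketch of a network with this shape. The class name, layer sizes, and head designs are illustrative assumptions; the project's actual model is in `models/ppo_model.py`:

```python
import torch
import torch.nn as nn

class ArcConvNet(nn.Module):
    # Illustrative ARC backbone: stacked conv layers with optional
    # skip connections, dropout, and a head predicting output grid size.
    def __init__(self, num_layers=4, hidden_dim=64, dropout=0.1,
                 use_skip=True, max_size=30, num_colors=10):
        super().__init__()
        self.use_skip = use_skip
        self.max_size = max_size
        self.stem = nn.Conv2d(num_colors, hidden_dim, 3, padding=1)
        self.blocks = nn.ModuleList(
            [nn.Conv2d(hidden_dim, hidden_dim, 3, padding=1)
             for _ in range(num_layers)])
        self.act = nn.ReLU()
        self.drop = nn.Dropout2d(dropout)
        # Per-cell color logits over the max_size x max_size canvas.
        self.color_head = nn.Conv2d(hidden_dim, num_colors, 1)
        # Predicts output height and width for variable-sized grids.
        self.size_head = nn.Linear(hidden_dim, 2 * max_size)

    def forward(self, x):  # x: (batch, num_colors, max_size, max_size)
        h = self.act(self.stem(x))
        for block in self.blocks:
            out = self.drop(self.act(block(h)))
            h = h + out if self.use_skip else out  # optional skip connection
        color_logits = self.color_head(h)
        pooled = h.mean(dim=(2, 3))  # global average pool for the size head
        size_logits = self.size_head(pooled).view(-1, 2, self.max_size)
        return color_logits, size_logits
```

Predicting height and width as classifications over 1..30, as the size head does here, is one common way to handle variable output dimensions.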
## NAS Search Space

The NAS process optimizes the following hyperparameters (see the sketch after this list):
- Learning rate
- Batch size
- Number of layers
- Hidden dimension size
- Dropout rate
- Optimizer selection
- Activation function
- Use of skip connections
- Weight decay
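Expressed in NNI's search-space format, that list looks roughly like the following; every range and choice set here is an illustrative assumption rather than the contents of `nas/config.yml`:

```python
# Illustrative NNI search space; ranges and choices are assumptions.
search_space = {
    "learning_rate": {"_type": "loguniform", "_value": [1e-5, 1e-2]},
    "batch_size":    {"_type": "choice",     "_value": [16, 32, 64]},
    "num_layers":    {"_type": "choice",     "_value": [2, 4, 6, 8]},
    "hidden_dim":    {"_type": "choice",     "_value": [32, 64, 128]},
    "dropout_rate":  {"_type": "uniform",    "_value": [0.0, 0.5]},
    "optimizer":     {"_type": "choice",     "_value": ["adam", "sgd", "rmsprop"]},
    "activation":    {"_type": "choice",     "_value": ["relu", "gelu", "leaky_relu"]},
    "use_skip":      {"_type": "choice",     "_value": [True, False]},
    "weight_decay":  {"_type": "loguniform", "_value": [1e-6, 1e-2]},
}
```

NNI reads the search space as JSON/YAML (inline in `config.yml` or from a separate file), and the configured tuner samples one point from it for each trial.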
## Results

Soon! Still cooking!
## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.
## License

This project is licensed under the MIT License - see the `LICENCE` file for details.
## Acknowledgments

- Abstraction and Reasoning Corpus (ARC)
- OpenAI Spinning Up for PPO implementation guidance
- NNI (Neural Network Intelligence) for Neural Architecture Search