Add overview pages to documentation #177

Merged · 2 commits · Nov 10, 2023
37 changes: 37 additions & 0 deletions NASLib_docs/docs/optimizers/overview.md
@@ -0,0 +1,37 @@
# Optimizers Documentation Overview

This page gives an overview of the optimizers available in NASLib, covering both discrete (black-box) and one-shot (weight-sharing) architecture search methods. A short usage sketch follows each group.

## Discrete Optimizers

### 1. [BANANAS](discrete/bananas.md)
BANANAS is a discrete optimizer that combines Bayesian optimization with an ensemble of neural performance predictors and a path-based encoding of architectures. It iteratively proposes candidates, evaluates them, and retrains the predictor ensemble to guide the search toward high-performing architectures.

### 2. [Base Predictor](discrete/bp.md)
The Base Predictor optimizer is a foundational component for predictor-based discrete NAS. It estimates the performance of candidate architectures from previously evaluated ones, so that promising candidates can be selected without fully training each one.

### 3. [Local Search](discrete/ls.md)
Local Search iteratively improves an architecture by evaluating its neighbors in the search space and moving to the best-performing one. Despite its simplicity, it is a strong baseline for discrete NAS and a useful tool for fine-tuning architectures.

### 4. [NPENAS](discrete/npenas.md)
NPENAS (Neural Predictor guided Evolution for Architecture Search) combines an evolutionary search with a neural performance predictor: candidate architectures produced by mutation are ranked by the predictor, and only the most promising ones are actually evaluated.

### 5. [Regularized Evolution](discrete/re.md)
Regularized Evolution (also known as aging evolution) is an evolutionary algorithm in which the oldest, rather than the worst, individuals are removed from the population. This aging mechanism acts as a regularizer and balances exploration and exploitation.

### 6. [Random Search](discrete/rs.md)
Random Search is a simple yet effective optimizer that explores neural architecture search spaces by randomly sampling and evaluating architectures. It serves as a baseline for NAS experiments.
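
The discrete optimizers above share a common workflow in NASLib: construct a search space, adapt the optimizer to it, and let the trainer drive the search, typically by querying a tabular benchmark rather than training each architecture from scratch. The following is a minimal sketch of that pattern with Regularized Evolution on NAS-Bench-201, modeled on NASLib's example scripts; names such as `get_dataset_api`, the config fields, and the exact call signatures are assumptions that may differ between NASLib versions.

```python
# Minimal sketch (not a verbatim NASLib example): a discrete optimizer on NAS-Bench-201.
# Assumes the NAS-Bench-201 benchmark data has been downloaded; module paths,
# config fields, and call signatures may differ between NASLib versions.
from naslib.defaults.trainer import Trainer
from naslib.optimizers import RegularizedEvolution
from naslib.search_spaces import NasBench201SearchSpace
from naslib.utils import utils, get_dataset_api

config = utils.get_config_from_args()        # search settings (seed, epochs, dataset, ...)
utils.set_seed(config.seed)

search_space = NasBench201SearchSpace()
dataset_api = get_dataset_api("nasbench201", config.dataset)   # tabular benchmark queries

optimizer = RegularizedEvolution(config)
optimizer.adapt_search_space(search_space, dataset_api=dataset_api)

trainer = Trainer(optimizer, config)
trainer.search()                             # sample, mutate, and query architectures
trainer.evaluate(dataset_api=dataset_api)    # report the best architecture found
```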

## One-Shot Optimizers

### 7. [DARTS](oneshot/darts.md)
DARTS (Differentiable Architecture Search) is a one-shot optimizer that relaxes the discrete choice of operations into a continuous mixture, so that architecture parameters and network weights can be optimized jointly by gradient descent. After the search, the continuous architecture is discretized into a final network.

### 8. [DrNAS](oneshot/drnas.md)
DrNAS (Dirichlet Neural Architecture Search) treats differentiable architecture search as a distribution-learning problem: the architecture mixing weights are sampled from a learned Dirichlet distribution, which encourages exploration and yields more robust architectures than a purely deterministic relaxation.

### 9. [GDAS](oneshot/gdas.md)
GDAS (Gradient-based search using Differentiable Architecture Sampler) samples a single sub-graph of the one-shot model in each iteration via the Gumbel-Softmax trick. This keeps the search differentiable while reducing the memory and compute cost of training the full supernetwork.

### 10. [RSWS](oneshot/rsws.md)
RSWS (Random Search with Weight Sharing) trains a weight-sharing supernetwork and evaluates randomly sampled architectures using the shared weights. It is a simple yet surprisingly strong baseline for one-shot NAS.
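
One-shot optimizers use the same `Trainer` interface but train a weight-sharing supernetwork instead of querying a benchmark. Below is a minimal sketch for DARTS, closely following the usage example in NASLib's README; `SimpleCellSearchSpace` and the config handling are taken from that example and may differ between versions.

```python
# Minimal sketch (modeled on NASLib's README example): one-shot search with DARTS.
# Module paths and config fields may differ between NASLib versions.
from naslib.defaults.trainer import Trainer
from naslib.optimizers import DARTSOptimizer
from naslib.search_spaces import SimpleCellSearchSpace
from naslib.utils import utils

config = utils.get_config_from_args()
utils.set_seed(config.seed)

search_space = SimpleCellSearchSpace()       # small space, cheap to search

optimizer = DARTSOptimizer(config)
optimizer.adapt_search_space(search_space)   # builds the one-shot supernetwork

trainer = Trainer(optimizer, config)
trainer.search()      # alternate architecture-parameter and weight updates
trainer.evaluate()    # discretize, retrain, and evaluate the found architecture
```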
24 changes: 24 additions & 0 deletions NASLib_docs/docs/search_spaces/overview.md
@@ -0,0 +1,24 @@
# Search Spaces Overview

This page gives an overview of the search spaces available in NASLib. Each search space targets a specific task or domain, so you can tailor your architecture search experiments accordingly; a short usage sketch follows the list.

## 1. [NAS-Bench-101](nasbench_101.md)
NAS-Bench-101 is a tabular benchmark built on a cell-based search space of roughly 423k unique architectures, each trained and evaluated on CIFAR-10. It is a valuable resource for benchmarking and reproducible comparison of NAS methods.

## 2. [NAS-Bench-201](nasbench_201.md)
NAS-Bench-201 defines a fixed cell-based search space of 15,625 architectures, each evaluated on CIFAR-10, CIFAR-100, and ImageNet16-120. This makes it well suited for comparing NAS algorithms across several image-classification datasets.

## 3. [NAS-Bench-301](nasbench_301.md)
NAS-Bench-301 is a surrogate benchmark on the DARTS search space: instead of tabulating results, it uses learned surrogate models to predict the CIFAR-10 performance of the roughly 10^18 architectures in that space.

## 4. [NAS-Bench-ASR](nasbench_asr.md)
NAS-Bench-ASR is tailored to Automatic Speech Recognition (ASR). It provides a search space of architectures trained and evaluated on the TIMIT dataset, along with configurations suited to ASR tasks.

## 5. [NAS-Bench-NLP](nasbench_nlp.md)
NAS-Bench-NLP focuses on Natural Language Processing. Its search space consists of recurrent cells evaluated on language-modeling tasks, allowing neural architectures for NLP applications to be created and compared.

## 6. [Simple Cell](simple_cell.md)
Simple Cell is a straightforward and customizable search space that can be used for a variety of NAS experiments. It allows researchers to define and explore their own architectures.

## 7. [Transbench 101](transbench101.md)
Transbench 101 (TransNAS-Bench-101) is a benchmark for studying how architectures transfer across tasks: its architectures are evaluated on seven diverse vision tasks, including classification, regression, and pixel-level prediction, rather than on a single dataset.
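
Each search space is exposed as a graph object whose architectures can be sampled and, for the tabular and surrogate benchmarks, queried for precomputed metrics without any training. The sketch below shows this for NAS-Bench-201; the `sample_random_architecture`, `query`, and `get_dataset_api` calls reflect the API as used in NASLib's examples, and their exact signatures are assumptions that may vary between versions.

```python
# Minimal sketch (not a verbatim NASLib example): sample an architecture from
# NAS-Bench-201 and query the tabular benchmark for its validation accuracy.
# Assumes the NAS-Bench-201 data file has been downloaded.
from naslib.search_spaces import NasBench201SearchSpace
from naslib.search_spaces.core.query_metrics import Metric
from naslib.utils import get_dataset_api

dataset_api = get_dataset_api("nasbench201", "cifar10")

graph = NasBench201SearchSpace()
graph.sample_random_architecture(dataset_api=dataset_api)

val_acc = graph.query(Metric.VAL_ACCURACY, dataset="cifar10", dataset_api=dataset_api)
print(f"Validation accuracy of the sampled architecture: {val_acc:.2f}")
```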
3 changes: 1 addition & 2 deletions NASLib_docs/mkdocs.yml
@@ -28,8 +28,7 @@ nav:
- NAS-Bench-NLP: search_spaces/nasbench_nlp.md
- Simple Cell: search_spaces/simple_cell.md
- Transbench101: search_spaces/transbench101.md
- - Utils:
-   - Overview: utils/overview.md
+ # TODO Utils
- Contributing: contributing.md
plugins:
- mkdocstrings: