When Evolution Strategy Meets Language Models Tuning (COLING 2025)

Acknowledgement

Our code builds on the official implementations of DistiLLM: Towards Streamlined Distillation for Large Language Models (ICML 2024) and MiniLLM: Knowledge Distillation of Large Language Models (ICLR 2024).

Environment

Create a Python virtual environment and install the required libraries:

conda create -n eso python=3.11 && conda activate eso
pip install -r requirements.txt

Data Processing

Follow the data-processing pipeline of DistiLLM: Towards Streamlined Distillation for Large Language Models (ICML 2024) to prepare the training and evaluation data.

Train

bash scripts/gpt2/eso/run.sh
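
For readers unfamiliar with evolution strategies, the sketch below shows a generic ES parameter update (Gaussian perturbations scored against a loss, then combined into an estimated update direction). It is illustrative only and is not the ESO algorithm from the paper; the function name es_update, the loss_fn argument, and all hyperparameter values are hypothetical.

import torch

def es_update(params, loss_fn, pop_size=8, sigma=0.02, lr=0.01):
    """One generic evolution-strategy step: sample Gaussian perturbations
    of the parameters, score each perturbed copy with loss_fn, and move
    the parameters along the score-weighted average of the perturbations."""
    base = params.detach().clone()
    noises, scores = [], []
    for _ in range(pop_size):
        eps = torch.randn_like(base)
        with torch.no_grad():
            # Higher score corresponds to lower loss for this perturbation.
            scores.append(-float(loss_fn(base + sigma * eps)))
        noises.append(eps)
    scores = torch.tensor(scores)
    # Standardize scores so better-than-average perturbations get positive weight.
    weights = (scores - scores.mean()) / (scores.std() + 1e-8)
    step = sum(w * eps for w, eps in zip(weights, noises)) / (pop_size * sigma)
    return base + lr * step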

Evaluate

bash scripts/eval/eval.sh

BibTeX

If you find this repo useful for your research, please consider citing our paper:

@inproceedings{huang-eso-2025,
    title = "When Evolution Strategy Meets Language Models Tuning",
    author = "Huang, Bo  and
      Jiang, Yuxin  and
      Chen, Mingyang  and
      Wang, Yi  and
      Chen, Hongyang  and
      Wang, Wei",
    editor = "Rambow, Owen  and
      Wanner, Leo  and
      Apidianaki, Marianna  and
      Al-Khalifa, Hend  and
      Eugenio, Barbara Di  and
      Schockaert, Steven",
    booktitle = "Proceedings of the 31st International Conference on Computational Linguistics",
    month = jan,
    year = "2025",
    address = "Abu Dhabi, UAE",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2025.coling-main.357/",
    pages = "5333--5344",
}
