The BigCode project is an open-scientific collaboration working on the responsible development of large language models for code. This tech report describes the progress of the collaboration until December 2022, outlining the current state of the Personally Identifiable Information (PII) redaction pipeline, the experiments conducted to de-risk the model architecture, and the experiments investigating better preprocessing methods for the training data. We train 1.1B parameter models on the Java, JavaScript, and Python subsets of The Stack and evaluate them on the MultiPL-E text-to-code benchmark. We find that more aggressive filtering of near-duplicates can further boost performance and, surprisingly, that selecting files from repositories with 5+ GitHub stars deteriorates performance significantly. Our best model outperforms previous open-source multilingual code generation models (InCoder-6.7B and CodeGen-Multi-2.7B) in both left-to-right generation and infilling on the Java, JavaScript, and Python portions of MultiPL-E, despite being a substantially smaller model. All models are released under an OpenRAIL license.
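Since the abstract covers both left-to-right generation and infilling, a minimal usage sketch may help readers try the released models. The sketch below assumes the 1.1B checkpoint is published on the Hugging Face Hub under the ID bigcode/santacoder and that infilling is driven by <fim-prefix>/<fim-suffix>/<fim-middle> sentinel tokens; neither the Hub ID nor the token names appear in the abstract, so treat both as assumptions rather than documented details.

# Minimal sketch: left-to-right generation and infilling with SantaCoder.
# The checkpoint ID "bigcode/santacoder" and the <fim-*> sentinel tokens
# are assumptions, not details taken from the abstract.
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "bigcode/santacoder"  # assumed Hub ID
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
# trust_remote_code is needed if the checkpoint ships a custom model class.
model = AutoModelForCausalLM.from_pretrained(checkpoint, trust_remote_code=True)

# Left-to-right generation: complete a function signature.
inputs = tokenizer("def fibonacci(n):", return_tensors="pt")
outputs = model.generate(inputs.input_ids, max_new_tokens=48)
print(tokenizer.decode(outputs[0]))

# Infilling (fill-in-the-middle): the model predicts the span between
# the prefix and the suffix, emitted after the <fim-middle> token.
infill_prompt = (
    "<fim-prefix>def print_one_two_three():\n    print('one')\n"
    "<fim-suffix>\n    print('three')<fim-middle>"
)
inputs = tokenizer(infill_prompt, return_tensors="pt")
outputs = model.generate(inputs.input_ids, max_new_tokens=16)
print(tokenizer.decode(outputs[0]))

The sketch uses greedy decoding for brevity; in practice one would sample with a temperature and truncate the completion at an end-of-text or end-of-function boundary.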
@inproceedings{santacoder,
  author    = {Allal, Loubna Ben and Li, Raymond and Kocetkov, Denis and Mou, Chenghao and Akiki, Christopher and Ferrandis, Carlos Munoz and Muennighoff, Niklas and Mishra, Mayank and Gu, Alex and Dey, Manan and Umapathi, Logesh Kumar and Anderson, Carolyn Jane and Zi, Yangtian and Poirier, Joel Lamy and Schoelkopf, Hailey and Troshin, Sergey and Abulkhanov, Dmitry and Romero, Manuel and Lappert, Michael and De Toni, Francesco and del Río, Bernardo García and Liu, Qian and Bose, Shamik and Bhattacharyya, Urvashi and Zhuo, Terry Yue and Yu, Ian and Villegas, Paulo and Zocca, Marco and Mangrulkar, Sourab and Lansky, David and Nguyen, Huu and Contractor, Danish and Villa, Luis and Li, Jia and Bahdanau, Dzmitry and Jernite, Yacine and Hughes, Sean and Fried, Daniel and Guha, Arjun and de Vries, Harm and von Werra, Leandro},
  title     = {SantaCoder: don't reach for the stars!},
  booktitle = {Deep Learning for Code Workshop (DL4C)},
  year      = {2023}
}