Installation

Simply pull the latest LaikaLLM Docker image, which includes every preliminary step needed to run the project, including the PYTHONHASHSEED and CUBLAS_WORKSPACE_CONFIG environment variables already set for reproducibility purposes!
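As a rough sketch, pulling and running the image might look like the following. The image name and tag below are purely a guess for illustration; check the project's registry for the actual ones.

```shell
# Hypothetical image name -- check the project's registry for the real one
docker pull silleellie/laikallm:latest

# Run interactively, exposing the host's GPUs to the container
docker run --rm -it --gpus all silleellie/laikallm:latest
```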

From source

LaikaLLM requires Python 3.10 or later; all required packages are listed in requirements.txt

  • Torch with CUDA 11.7 is pinned as a requirement for reproducibility purposes, but feel free to change the CUDA version to the one most appropriate for your use case!
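If you do need a different CUDA build of Torch, PyTorch ships per-CUDA-version wheels through its own package index. The cu121 URL below is just an illustration; check PyTorch's install matrix for the URL matching your setup.

```shell
# Example: install a CUDA 12.1 build of Torch instead of the pinned 11.7 one.
# The index URL pattern is PyTorch's; see pytorch.org for your configuration.
pip install torch --index-url https://download.pytorch.org/whl/cu121
```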

To install LaikaLLM:

  1. Clone this repository and change work directory:
    git clone https://github.com/Silleellie/LaikaLLM.git
    cd LaikaLLM
    
  2. Install the requirements:
    pip install -r requirements.txt
    
  3. Start experimenting!

NOTE: It is highly suggested to set the following environment variables in order to obtain fully reproducible results for your experiments:

export PYTHONHASHSEED=42
export CUBLAS_WORKSPACE_CONFIG=:16:8
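If you want to verify at runtime that both variables are set as expected, a small check such as the following can help (this is a sketch, not part of LaikaLLM; note that PYTHONHASHSEED only takes effect if set before the interpreter starts):

```python
import os


def check_reproducibility_env() -> list[str]:
    """Return a list of problems for reproducibility-related environment
    variables that are missing or set to unexpected values."""
    expected = {
        "PYTHONHASHSEED": "42",
        "CUBLAS_WORKSPACE_CONFIG": ":16:8",
    }
    problems = []
    for name, value in expected.items():
        actual = os.environ.get(name)
        if actual != value:
            problems.append(f"{name} is {actual!r}, expected {value!r}")
    return problems
```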

You can find more information about the above environment variables in the Python documentation (PYTHONHASHSEED) and in the cuBLAS documentation (CUBLAS_WORKSPACE_CONFIG)

Info

LaikaLLM can be run either by defining a .yaml file, which encapsulates the full logic of the experiment, or via the Python API (a more flexible and powerful approach). Check the .yaml sample example or the Python one to get up and running with LaikaLLM!
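To give a rough feel for the .yaml approach, here is a purely illustrative sketch of an experiment file. Every key below is an assumption, not LaikaLLM's actual schema; refer to the linked sample for the real format.

```yaml
# Illustrative only: these keys are assumptions, not LaikaLLM's real schema
exp_name: my_first_experiment
data:
  dataset: toys            # which dataset to preprocess
model:
  name: google/flan-t5-small
  n_epochs: 10
eval:
  metrics: [hit@10]
```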


Tip: When installing from source, it is suggested to install the LaikaLLM requirements in a virtual environment

Virtual environments are isolated environments where the packages and versions you install apply only to that specific environment. It's like a private island, but for code.

Read this Medium article to understand all the advantages, and the official Python guide on how to set one up
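As a quick sketch, creating a virtual environment and installing the requirements inside it looks like this (the directory name `.venv` is just a common convention):

```shell
# Create an isolated environment in the .venv directory
python -m venv .venv

# Activate it (on Windows use: .venv\Scripts\activate)
source .venv/bin/activate

# Install the pinned requirements inside the environment
pip install -r requirements.txt
```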