Train an LLM


Now that you have generated your LLM Dataset, you can train your custom LLM Model.

Training Your Model

    1. Using the ..., open the LLM Trainer Element settings and adjust the following:
    2. Base Model Architecture: The default model is great for creating a model quickly.

      Want to train something tailored to a specific idea? webAI supports a variety of base models to suit your use case. Check out our Supported LLM Base Models page to learn more about all the models available in webAI.

      Supported LLM Base Models

    3. Dataset Folder Path: Using the Select Directory button, choose the folder where you saved your LLM dataset during LLM Dataset Generation.
      Need a dataset? We've got two for you right here:
    4. Artifact Save Path: Using the Select Directory button, choose the folder where you would like to save your trained adapter (a short sketch of how the adapter relates to the base model appears at the end of this section).
    5. Base Model Assets Path: Using the Select Directory button, choose the folder where you would like to save your base model.
    6. Evaluator API Key: Add a Groq, OpenAI, Claude, or Gemini API key to enable the Faithfulness and Relevancy benchmarks in your training metrics (an illustrative sketch of this style of evaluation appears at the end of this section).

      You can add as many as you like, but at least one is required. You can get a free Groq API key below.

      Groq API Key

    7. Batch Size: A batch size of 4 is recommended for testing (a short worked example appears at the end of this section).
    8. Leave all other settings at their defaults.
    9. You can now hit the Run button to start the training process.

Dependencies are installed the first time this flow is run, so the first run may take a while to start.
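
Why the trainer saves an adapter separately from the base model assets: adapter-style training generally leaves the base model's weights frozen and learns only a small set of additional weights, which is why the Artifact Save Path and Base Model Assets Path are configured independently. The sketch below illustrates that general idea in Python using the Hugging Face transformers and peft libraries with LoRA; the paths, model loading, and LoRA settings are illustrative assumptions, not webAI's actual implementation.

    # Illustrative sketch only: webAI handles this inside the trainer Element.
    # Paths, libraries, and LoRA settings here are assumptions.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import LoraConfig, get_peft_model

    base_assets_path = "./base_model_assets"       # analogous to Base Model Assets Path
    artifact_save_path = "./artifacts/my_adapter"  # analogous to Artifact Save Path

    # The base model's weights stay frozen; only small adapter matrices are trained.
    model = AutoModelForCausalLM.from_pretrained(base_assets_path)
    tokenizer = AutoTokenizer.from_pretrained(base_assets_path)

    lora_config = LoraConfig(r=8, lora_alpha=16, target_modules=["q_proj", "v_proj"])
    model = get_peft_model(model, lora_config)
    model.print_trainable_parameters()             # typically well under 1% of all weights

    # ... training loop would run here ...

    # Only the adapter weights are written, so the saved artifact stays small.
    model.save_pretrained(artifact_save_path)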
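
The Faithfulness and Relevancy benchmarks are enabled by an evaluator API key because they call an external model to score the trained model's outputs. The sketch below shows the general judge-style pattern with the OpenAI Python client; the prompt wording, scoring scale, and model name are assumptions for illustration, not webAI's evaluation code. A relevancy check would follow the same pattern with a different prompt.

    # Illustrative sketch only: webAI runs its own evaluators inside the flow.
    # The model name, prompt wording, and 1-5 scale are assumptions.
    from openai import OpenAI

    client = OpenAI(api_key="YOUR_API_KEY")

    def judge_faithfulness(context: str, answer: str) -> int:
        """Ask a judge model how well the answer is supported by the context."""
        prompt = (
            "Rate from 1 (unsupported) to 5 (fully supported) how faithful the "
            f"answer is to the context.\n\nContext:\n{context}\n\n"
            f"Answer:\n{answer}\n\nReply with a single number."
        )
        response = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": prompt}],
        )
        return int(response.choices[0].message.content.strip())

    print(judge_faithfulness("webAI runs models on device.", "webAI runs models locally."))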
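
Batch Size is the number of training examples processed per optimizer step, so it trades memory use per step against the number of steps per epoch. A quick worked example, using a made-up dataset size:

    import math

    # Hypothetical numbers purely for illustration; your dataset size will differ.
    num_training_examples = 1_000
    batch_size = 4  # the value recommended above for testing

    steps_per_epoch = math.ceil(num_training_examples / batch_size)
    print(steps_per_epoch)  # 250 optimizer steps per epoch

    # Doubling the batch size halves the steps per epoch but needs roughly
    # twice the activation memory per step.
    print(math.ceil(num_training_examples / (batch_size * 2)))  # 125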