Open source LLM models

Examples of connecting SentiOne Automate to popular LLM providers

SentiOne supports generic integration with any LLM that exposes an API compatible with the OpenAI Completions API. Numerous providers offer such drop-in replacements, for example Groq or OpenRouter. You can even use our own LLM server, available at https://llms.sentione.com
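Every OpenAI-compatible endpoint can be called the same way; only the base URL, API key, and model name differ. A minimal sketch using just the Python standard library (the URL, key, and model name in the usage comment are placeholders, not values from this document):

```python
# Minimal sketch of calling an OpenAI-compatible /chat/completions endpoint.
# Uses only the standard library; substitute your provider's values.
import json
import urllib.request

def build_chat_request(base_url, api_key, model, prompt):
    """Assemble an OpenAI-style chat completions request (url, headers, body)."""
    url = base_url.rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": "Bearer " + api_key,
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return url, headers, body

def chat(base_url, api_key, model, prompt):
    """Send the request and return the assistant's reply text."""
    url, headers, body = build_chat_request(base_url, api_key, model, prompt)
    req = urllib.request.Request(url, data=body, headers=headers)
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]

# Requires a valid key; the same code works against Groq, OpenRouter,
# or the SentiOne LLMs server (placeholder key and model shown):
# print(chat("https://api.groq.com/openai/v1", "gsk_...",
#            "llama-3.3-70b-versatile", "Say hello"))
```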

Groq

Groq.com is the official platform for Groq, an artificial intelligence hardware and software company renowned for inventing the Language Processing Unit (LPU). Unlike traditional GPUs, Groq's specialized LPU architecture is purpose-built to run large language models (LLMs) and generative AI applications with ultra-fast, remarkably low-latency inference speeds.

Integration

To use Groq, create an account and an API key at https://console.groq.com. Then, in SentiOne Automate, go to Organization > AI Credentials, click Create, and select the LLM Compatible with OpenAI tile. On that form, provide the following values:

  • Name: Human-readable name that will help you identify these credentials, e.g. "Groq"
  • API Key: Key you have created at https://console.groq.com
  • Endpoint URL: https://api.groq.com/openai/v1
  • Supported models: Models available to your account, listed at https://console.groq.com/docs/models

You can provide more than one model.
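Before pasting the key into Automate, you can sanity-check it by listing the models it can access; Groq exposes the OpenAI-style GET /models endpoint for this. A hedged sketch (the key in the usage comment is a placeholder):

```python
# Sanity-check a Groq API key by listing the models it can access,
# using Groq's OpenAI-compatible GET /models endpoint.
import json
import urllib.request

GROQ_BASE_URL = "https://api.groq.com/openai/v1"

def models_request(base_url, api_key):
    """Build the GET /models request for an OpenAI-compatible endpoint."""
    return urllib.request.Request(
        base_url.rstrip("/") + "/models",
        headers={"Authorization": "Bearer " + api_key},
    )

def list_models(base_url, api_key):
    """Return the model ids available to this API key."""
    with urllib.request.urlopen(models_request(base_url, api_key)) as resp:
        return [m["id"] for m in json.load(resp)["data"]]

# Requires a real key created at https://console.groq.com:
# print(list_models(GROQ_BASE_URL, "gsk_..."))
```

If the key is wrong the call fails with HTTP 401, which is easier to debug here than inside the credentials form.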

OpenRouter

OpenRouter is a unified AI platform and API aggregator that provides developers and users with centralized access to a vast array of large language models (LLMs) from multiple providers, including OpenAI, Anthropic, Google, and open-source creators. It simplifies AI integration by offering a single, standardized interface, allowing users to easily compare pricing, evaluate performance, and seamlessly switch between different models without needing to manage multiple accounts or API keys.

Integration

Log in to https://openrouter.ai and create an API key in Workspace settings. Then, in SentiOne Automate, go to Organization > AI Credentials, click Create, and select the LLM Compatible with OpenAI tile. On that form, provide the following values:

  • Name: Human-readable name that will help you identify these credentials, e.g. "OpenRouter"
  • API Key: Key you have created in Workspace settings at https://openrouter.ai
  • Endpoint URL: https://openrouter.ai/api/v1
  • Supported models: Model identifiers listed at https://openrouter.ai/models

You can provide more than one model.
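OpenRouter's model identifiers are namespaced as "provider/model", so switching providers is just a different string in the same request body. A short sketch (the two model ids below are illustrative examples, not a recommendation):

```python
# Sketch of OpenRouter-style model switching: the request body is identical
# for every provider, only the namespaced model id changes.
import json

def chat_payload(model, prompt):
    """Build the shared OpenAI-style request body for any model id."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })

# The same code path serves any provider behind https://openrouter.ai/api/v1
# (model ids below are illustrative):
for model in ("openai/gpt-4o-mini", "anthropic/claude-3.5-sonnet"):
    payload = chat_payload(model, "Summarize this conversation")
```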

SentiOne LLMs Server

For development purposes, SentiOne has created its own LLM server, which can host various models on different backends (CPU, GPU). It currently supports the Polish LLMs Bielik and PLLuM; other providers and models may be supported in the future.

Integration

Log in to https://llms.sentione.com and create an API key in the API Key Management section. Then, in SentiOne Automate, go to Organization > AI Credentials, click Create, and select the LLM Compatible with OpenAI tile. On that form, provide the following values:

  • Name: Human-readable name that will help you identify these credentials, e.g. "SentiOne LLMs"
  • API Key: Key you have created at https://llms.sentione.com/dashboard/
  • Endpoint URL: http://llms.sentione.com/v1
  • Supported models: Supported models are listed in the dashboard. They include:
    • speakleash/Bielik-11B-v3.0-Instruct-GGUF:Q5_K_M
    • speakleash/Bielik-11B-v2.6-Instruct-GGUF:Q5_K_M
    • piotrmaciejbednarski/PLLuM-8x7B-chat-GGUF:Q5_K_M

Please note that this is a development server, not suitable for production deployments. Different models are backed by different infrastructure, so response times may be significant.

How to pick the best model?

There are two main factors that determine the model selection:

  • Overall quality in the language you want to build your bot in
  • Speed (Time-To-First-Token)

Please refer to the various leaderboards that benchmark available models and infrastructure providers.
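You can also measure Time-To-First-Token yourself against any of the endpoints above by requesting a streamed completion and timing the first content chunk. A hedged sketch, assuming the standard OpenAI streaming format ("data: {...}" server-sent-event lines):

```python
# Hedged sketch: measure Time-To-First-Token (TTFT) by streaming a chat
# completion ("stream": true) and timing the first content delta.
import json
import time
import urllib.request

def first_token_delay(lines, started_at):
    """Scan an iterable of SSE lines; return (seconds, first_token)."""
    for raw in lines:
        line = raw.decode("utf-8").strip() if isinstance(raw, bytes) else raw.strip()
        if not line.startswith("data: ") or line == "data: [DONE]":
            continue  # skip keep-alives, comments, and the end marker
        chunk = json.loads(line[len("data: "):])
        delta = chunk["choices"][0]["delta"].get("content")
        if delta:  # first chunk with actual text
            return time.monotonic() - started_at, delta
    return None, None

def measure_ttft(base_url, api_key, model, prompt="Hi"):
    """Request a streamed completion and time the first token."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": True,
    }).encode("utf-8")
    req = urllib.request.Request(
        base_url.rstrip("/") + "/chat/completions",
        data=body,
        headers={"Authorization": "Bearer " + api_key,
                 "Content-Type": "application/json"},
    )
    start = time.monotonic()
    with urllib.request.urlopen(req) as resp:
        return first_token_delay(resp, start)

# Requires a valid key (placeholder base URL, key, and model shown):
# ttft, token = measure_ttft("https://api.groq.com/openai/v1",
#                            "gsk_...", "llama-3.3-70b-versatile")
```

Running this a few times per candidate model against your actual endpoint gives a more relevant speed figure than a public leaderboard measured from another region.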