A resource is a third-party system or service that RunLLM interacts with. Resources include data sources (e.g., Snowflake, Google Docs), model providers (e.g., OpenAI), and vector DBs (e.g., Chroma, Pinecone). You can connect RunLLM to your resources from our Python SDK or on the Resources page in the UI. Most primitives operate on one or more resources — for example, the generate primitive uses a model provider as its resource.
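To make the concept concrete, here is a minimal, purely illustrative sketch of how a resource registry might work. The class and method names (`Resource`, `Registry`, `connect`) are assumptions for illustration and are not the actual RunLLM SDK API — consult the SDK reference for real usage.

```python
# Hypothetical sketch — not the real RunLLM SDK API.
# Illustrates the idea: a resource pairs a name and a kind with
# connection config, and primitives look up resources by name.
from dataclasses import dataclass, field


@dataclass
class Resource:
    name: str
    kind: str  # e.g., "data_source", "model_provider", "vector_db"
    config: dict = field(default_factory=dict)


class Registry:
    """Holds connected resources, keyed by name."""

    def __init__(self) -> None:
        self._resources: dict[str, Resource] = {}

    def connect(self, resource: Resource) -> None:
        self._resources[resource.name] = resource

    def get(self, name: str) -> Resource:
        return self._resources[name]


registry = Registry()
registry.connect(Resource("openai", "model_provider", {"api_key": "..."}))
registry.connect(Resource("chroma", "vector_db"))

# A generate-style primitive would resolve its model provider by name:
provider = registry.get("openai")
```

In this sketch, connecting a resource is just registration: primitives stay decoupled from connection details and reference resources only by name.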
RunLLM currently supports the following resources: