Rivet Integration Getting Started
Welcome to the getting-started guide for integrating Rivet into your application. Most application integrations should start with @valerypopoff/rivet2-node, which re-exports the core graph APIs and adds Node-native defaults for filesystem loading, native APIs, MCP, plugin environment values, Code-node require, and remote debugging.
Installation
Install using your preferred package manager:
Install using Yarn:
yarn add @valerypopoff/rivet2-node
Install using NPM:
npm install @valerypopoff/rivet2-node
Install using pnpm:
pnpm add @valerypopoff/rivet2-node
Getting Started
Once Rivet is installed, you can import it into your application:
import * as Rivet from '@valerypopoff/rivet2-node';
Using runGraphInFile
The simplest way to get started with Rivet is by using the runGraphInFile function. This function allows you to execute a graph defined in a Rivet project file.
runGraphInFile Parameters
The runGraphInFile function takes two parameters:
- path: A string representing the path to your Rivet project file.
- options: An object of type RunGraphOptions.
RunGraphOptions
The RunGraphOptions type is used to pass parameters to runGraphInFile. The exact type has more provider hooks, but the important shape is:
export type RunGraphOptions = {
graph?: string;
inputs?: Record<string, LooseDataValue>;
context?: Record<string, LooseDataValue>;
nativeApi?: NativeApi;
datasetProvider?: DatasetProvider;
audioProvider?: AudioProvider;
mcpProvider?: MCPProvider;
registry?: NodeRegistration<any, any>;
externalFunctions?: {
[key: string]: ExternalFunction;
};
onUserEvent?: {
[key: string]: (data: DataValue | undefined) => void;
};
abortSignal?: AbortSignal;
tokenizer?: Tokenizer;
codeRunner?: ProcessContext['codeRunner'];
projectPath?: string;
projectReferenceLoader?: ProjectReferenceLoader;
} & {
[P in keyof ProcessEvents as `on${PascalCase<P>}`]?: (params: ProcessEvents[P]) => void;
} & Settings;
type NodeRunGraphOptions = RunGraphOptions & {
remoteDebugger?: RivetDebuggerServer;
remoteDebuggerRequestId?: RemoteRunRequestId;
};
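The mapped type above means every key of ProcessEvents becomes an optional handler property whose name is the event name with an `on` prefix and the first letter capitalized. A minimal sketch of that naming rule, assuming illustrative event names rather than the full ProcessEvents list:

```typescript
// Sketch of the `on${PascalCase<P>}` naming rule: each ProcessEvents key
// maps to an optional handler property with an `on` prefix.
// The event names used below are illustrative examples only.
const toHandlerName = (event: string): string =>
  `on${event.charAt(0).toUpperCase()}${event.slice(1)}`;

console.log(toHandlerName('nodeStart')); // onNodeStart
console.log(toHandlerName('done'));      // onDone
```

So an event named `nodeStart` would be handled by supplying an `onNodeStart` callback in RunGraphOptions.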
Let's break down the important fields:
- graph: Specifies the graph you're running. This can be either the ID or the display name of the graph. If omitted, Rivet runs the project's main graph.
- inputs: Specifies the input values to the graph. These can be either plain JavaScript values like "foo", or {type: 'string', value: 'foo'} objects.
- context: Similar to inputs, but these values are available to every graph and subgraph.
- externalFunctions: This is how you define integration points that you can call from inside Rivet graphs.
- registry: Use this when running custom plugin nodes.
- remoteDebugger: Node-only. Attach a debugger server so the Rivet app can inspect the run.
- openAiKey / openAiOrganization / openAiEndpoint: Legacy/shared OpenAI settings used by legacy Chat and OpenAI-backed nodes. LLM Chat can also use provider settings or an API Key input port, depending on its API key source.
- pluginEnv: Environment-like values for plugins. If omitted in @valerypopoff/rivet2-node, plugin-declared environment variables are read from process.env.
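As noted for inputs and context, values may be plain JavaScript values or fully typed {type, value} objects. The following sketch illustrates how such loose values can be normalized; the normalizeLoose helper and DataValueSketch type are hypothetical, not part of the Rivet API — Rivet performs this coercion internally:

```typescript
// Hypothetical helper illustrating how loose input values map to typed
// DataValue-style objects. This is a sketch of the idea, not Rivet's code.
type DataValueSketch =
  | { type: 'string'; value: string }
  | { type: 'number'; value: number }
  | { type: 'boolean'; value: boolean };

function normalizeLoose(v: string | number | boolean | DataValueSketch): DataValueSketch {
  if (typeof v === 'object') return v; // already a typed value: pass through
  if (typeof v === 'string') return { type: 'string', value: v };
  if (typeof v === 'number') return { type: 'number', value: v };
  return { type: 'boolean', value: v };
}

console.log(normalizeLoose('foo'));                        // { type: 'string', value: 'foo' }
console.log(normalizeLoose({ type: 'number', value: 2 })); // passed through unchanged
```

Either form is accepted anywhere inputs or context entries are supplied.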
Example Code
Here's a basic example of using runGraphInFile:
import { runGraphInFile, DataValue } from '@valerypopoff/rivet2-node';
await runGraphInFile('./myProject.rivet', {
graph: 'My Graph Name',
inputs: {
myInput: 'hello world',
},
context: {
myContext: 'global value',
},
externalFunctions: {
helloWorld: async (...args: unknown[]): Promise<DataValue> => {
return {
type: 'string',
value: 'hello world',
};
},
},
onUserEvent: {
myEvent: (data: DataValue | undefined): void => {
console.log(data);
},
},
openAiKey: 'my-openai-key',
openAiOrganization: 'my-organization',
});
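The abortSignal option accepts a standard AbortSignal, so a run can be cancelled from outside. A minimal sketch of the wiring, with the runGraphInFile call itself elided so the example stays self-contained:

```typescript
// Standard AbortController wiring for cancelling a graph run.
// Pass controller.signal as the abortSignal option of runGraphInFile;
// the actual call is shown only as a comment here.
const controller = new AbortController();

// e.g. cancel the run if it takes longer than 30 seconds:
const timeout = setTimeout(() => controller.abort(), 30_000);

// await runGraphInFile('./myProject.rivet', { abortSignal: controller.signal, ... });

clearTimeout(timeout);
console.log(controller.signal.aborted); // false — abort was never triggered here
```

Calling controller.abort() at any point stops the in-flight run via the signal.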
If the graph uses LLM Chat with API key source set to Input port, pass the key through the graph input connected to that node's API Key port instead of relying on settings.
Remote Debugging
See the Remote Debugging page for more information on how to set up the remote debugger.