# Stream
Read from project-scoped streams with filtering, sorting, and pagination for use as context.
Stream nodes read data from a stream and pass the results to downstream nodes. Use them to inject stored context -- such as recent records, documents, or historical data -- into AI prompts or other processing steps.
## Configuration
| Field | Type | Required | Description |
|---|---|---|---|
| `streamId` | string | No | The ID of the stream to read from. The stream must belong to the same project as the graph. |
| `query` | object | No | Query options for filtering, sorting, and paginating the stream data. See the query options table below. |
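As a minimal sketch, a node configuration that reads a handful of rows from a stream might look like the following (the stream ID is a hypothetical placeholder):

```json
{
  "streamId": "stream_abc123",
  "query": {
    "limit": 20
  }
}
```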
## Query options
| Field | Type | Required | Description |
|---|---|---|---|
| `filter` | string | No | A filter expression that restricts which rows are returned (for example, filtering by status or date range). |
| `sort` | array | No | Sort order. Each element is an object with a `field` (string) and a `direction` (`"asc"` or `"desc"`). |
| `limit` | number | No | Maximum number of rows to return. Use this to cap context size when feeding results into an LLM node. |
| `offset` | number | No | Number of rows to skip, for pagination. |
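Putting these options together, a query that returns the 50 most recent active rows might look like this. The field names and the filter expression syntax shown here are illustrative assumptions, not a definitive grammar:

```json
{
  "filter": "status = 'active'",
  "sort": [{ "field": "createdAt", "direction": "desc" }],
  "limit": 50,
  "offset": 0
}
```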
## Input
The node receives the trigger payload and upstream node outputs. You can use placeholders in query values to make queries dynamic at runtime -- for example, filtering by `{{input.userId}}` or limiting results based on upstream data.
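For instance, a query could scope rows to the triggering user at runtime. This is a sketch; the `userId` and `createdAt` fields are hypothetical column names:

```json
{
  "filter": "userId = '{{input.userId}}'",
  "sort": [{ "field": "createdAt", "direction": "desc" }],
  "limit": 10
}
```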
## Output
The node produces the result of the stream query: an array of rows, with one object per row, keyed by the SQL result columns. This output is stored in the node execution record and passed to downstream nodes. Set `outputSchema` to describe that array (e.g. `{ "type": "array", "items": { "type": "object", "properties": { ... }, "required": [...] } }`). The LSP validates that the SQL `SELECT` columns match `outputSchema.items`.
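As a sketch, an `outputSchema` for a query that selects `id` and `status` columns (hypothetical column names) could look like:

```json
{
  "type": "array",
  "items": {
    "type": "object",
    "properties": {
      "id": { "type": "string" },
      "status": { "type": "string" }
    },
    "required": ["id", "status"]
  }
}
```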
## Tips
- Use stream nodes for "retrieve then reason" patterns: fetch a relevant slice of data, then pass it to an LLM node as context.
- Always set `limit` to avoid sending excessive context to downstream LLM nodes. Combine with `sort` to ensure the most relevant rows appear first.
- Stream schemas can be versioned. Ensure the stream's current schema matches what downstream nodes expect.
- See Streams for creating, managing, and writing data to streams.
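To make the "retrieve then reason" pattern concrete, here is an illustrative sketch of a stream node feeding an AI node. The node `type` values, the graph shape, and the output placeholder syntax are all assumptions for illustration, not a definitive graph format:

```json
{
  "nodes": [
    {
      "id": "recent_tickets",
      "type": "stream",
      "streamId": "stream_support_tickets",
      "query": {
        "sort": [{ "field": "createdAt", "direction": "desc" }],
        "limit": 10
      }
    },
    {
      "id": "summarize",
      "type": "ai",
      "prompt": "Summarize these recent support tickets: {{recent_tickets.output}}"
    }
  ]
}
```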
## Related
- Streams (Storage) -- Create and manage streams.
- AI -- Use stream output as context in a prompt.
- Code -- Transform or filter stream results before passing them downstream.