Updates to Substreams docs and to New Chain Integrations #849

Draft · wants to merge 1 commit into base: `main`
2 changes: 1 addition & 1 deletion website/pages/ar/substreams/sps/_meta.js
@@ -1,4 +1,4 @@
import meta from '../../../en/substreams/sps/_meta.js'
import meta from '../../../en/sps/_meta.js'

export default {
...meta,
2 changes: 1 addition & 1 deletion website/pages/cs/substreams/sps/_meta.js
@@ -1,4 +1,4 @@
import meta from '../../../en/substreams/sps/_meta.js'
import meta from '../../../en/sps/_meta.js'

export default {
...meta,
2 changes: 1 addition & 1 deletion website/pages/de/substreams/sps/_meta.js
@@ -1,4 +1,4 @@
import meta from '../../../en/substreams/sps/_meta.js'
import meta from '../../../en/sps/_meta.js'

export default {
...meta,
2 changes: 1 addition & 1 deletion website/pages/en/_meta.js
@@ -28,7 +28,7 @@ export default {
},
'###3': {
type: 'heading',
title: 'Indexing',
title: 'Indexing',
},
indexing: {
type: 'children',
24 changes: 7 additions & 17 deletions website/pages/en/indexing/new-chain-integration.mdx
@@ -25,17 +25,19 @@ For Graph Node to be able to ingest data from an EVM chain, the RPC node must ex
- `eth_getBlockByHash`
- `net_version`
- `eth_getTransactionReceipt`, in a JSON-RPC batch request
- `trace_filter` *(optionally required for Graph Node to support call handlers)*
- `trace_filter` *(limited tracing; optionally required for Graph Node to support call handlers)*

### 2. Firehose Integration

[Firehose](https://firehose.streamingfast.io/firehose-setup/overview) is a next-generation extraction layer. It collects history in flat files and streams in real time. Firehose technology replaces those polling API calls with a stream of data utilizing a push model that sends data to the indexing node faster. This helps increase the speed of syncing and indexing.

The primary method to integrate the Firehose into chains is to use an RPC polling strategy. Our polling algorithm will predict when a new block will arrive and increase the rate at which it checks for a new block near that time, making it a very low-latency and efficient solution. For help with the integration and maintenance of the Firehose, contact the [StreamingFast team](https://www.streamingfast.io/firehose-integration-program). New chains and their integrators will appreciate the [fork awareness](https://substreams.streamingfast.io/documentation/consume/reliability-guarantees) and massive parallelized indexing capabilities that Firehose and Substreams bring to their ecosystem.

> NOTE: All integrations done by the StreamingFast team include maintenance for the Firehose replication protocol into the chain's codebase. StreamingFast tracks any changes and releases binaries when you change code and when StreamingFast changes code. This includes releasing Firehose/Substreams binaries for the protocol, maintaining Substreams modules for the block model of the chain, and releasing binaries for the blockchain node with instrumentation if need be.

#### Specific Firehose Instrumentation for EVM (`geth`) chains
#### Integration for Non-EVM chains

The primary method to integrate the Firehose into chains is to use an RPC polling strategy. Our polling algorithm will predict when a new block will arrive and increase the rate at which it checks for a new block near that time, making it a very low-latency and efficient solution. For help with the integration and maintenance of the Firehose, contact the [StreamingFast team](https://www.streamingfast.io/firehose-integration-program). New chains and their integrators will appreciate the [fork awareness](https://substreams.streamingfast.io/documentation/consume/reliability-guarantees) and massive parallelized indexing capabilities that Firehose and Substreams bring to their ecosystem.

#### Specific Instrumentation for EVM (`geth`) chains

For EVM chains, a deeper level of data can be achieved through the `geth` [live-tracer](https://github.com/ethereum/go-ethereum/releases/tag/v1.14.0), a collaboration between Go-Ethereum and StreamingFast to build a high-throughput, rich transaction tracing system. The Live Tracer is the most comprehensive solution, resulting in [Extended](https://streamingfastio.medium.com/new-block-model-to-accelerate-chain-integration-9f65126e5425) block details. This enables new indexing paradigms, like pattern matching of events based on state changes, calls, parent call trees, or triggering of events based on changes to the actual variables in a smart contract.

@@ -56,24 +58,12 @@ While the JSON-RPC and Firehose are both suitable for subgraphs, a Firehose is a
Configuring Graph Node is as easy as preparing your local environment. Once your local environment is set, you can test the integration by locally deploying a subgraph.

1. [Clone Graph Node](https://github.com/graphprotocol/graph-node)
2. Modify [this line](https://github.com/graphprotocol/graph-node/blob/master/docker/docker-compose.yml#L22) to include the new network name and the EVM JSON-RPC compliant URL
2. Modify [this line](https://github.com/graphprotocol/graph-node/blob/master/docker/docker-compose.yml#L22) to include the new network name and the EVM JSON-RPC or Firehose compliant URL

> Do not change the env var name itself. It must remain `ethereum` even if the network name is different.

3. Run an IPFS node or use the one used by The Graph: https://api.thegraph.com/ipfs/
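For reference, here is a minimal sketch of step 2; the network name and default URL are illustrative, and the exact line may differ across graph-node versions:

```bash
# A sketch, assuming the default docker/docker-compose.yml layout.
# Keep the env var key `ethereum`; only its value changes.
# Value format: '<network-name>:<rpc-url>'
sed -i "s|mainnet:http://host.docker.internal:8545|mychain:http://host.docker.internal:8545|" \
  docker/docker-compose.yml
```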

### Testing an EVM JSON-RPC by locally deploying a subgraph

1. Install [graph-cli](https://github.com/graphprotocol/graph-cli)
2. Create a simple example subgraph. Some options are below:
1. The pre-packed [Gravatar](https://github.com/graphprotocol/example-subgraph/tree/f89bdd4628efa4badae7367d4919b3f648083323) smart contract and subgraph are a good starting point
2. Bootstrap a local subgraph from any existing smart contract or solidity dev environment [using Hardhat with a Graph plugin](https://github.com/graphprotocol/hardhat-graph)
3. Adapt the resulting `subgraph.yaml` by changing `dataSources.network` to the same name previously passed on to Graph Node.
4. Create your subgraph in Graph Node: `graph create $SUBGRAPH_NAME --node $GRAPH_NODE_ENDPOINT`
5. Publish your subgraph to Graph Node: `graph deploy $SUBGRAPH_NAME --ipfs $IPFS_ENDPOINT --node $GRAPH_NODE_ENDPOINT`

Graph Node should be syncing the deployed subgraph if there are no errors. Give it time to sync, then send some GraphQL queries to the API endpoint printed in the logs.
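For reference, a sketch of steps 4 and 5 with hypothetical endpoint values; adjust them to your local setup:

```bash
# Hypothetical names and local defaults; adjust as needed.
export SUBGRAPH_NAME=my-org/my-subgraph
export GRAPH_NODE_ENDPOINT=http://localhost:8020
export IPFS_ENDPOINT=https://api.thegraph.com/ipfs/

# Register the subgraph name, then deploy it to the local node.
graph create $SUBGRAPH_NAME --node $GRAPH_NODE_ENDPOINT
graph deploy $SUBGRAPH_NAME --ipfs $IPFS_ENDPOINT --node $GRAPH_NODE_ENDPOINT
```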

## Substreams-powered Subgraphs

For StreamingFast-led Firehose/Substreams integrations, basic support for foundational Substreams modules (e.g. decoded transactions, logs and smart-contract events) and Substreams codegen tools are included. These tools enable you to build [Substreams-powered subgraphs](/substreams/sps/introduction/). Follow the [How-To Guide](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application) and run `substreams codegen subgraph` to experience the codegen tools for yourself.
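As a quick sketch of that flow, run from an existing Substreams project directory (the ordering shown is an assumption):

```bash
substreams build              # compile modules and generate Protobuf bindings
substreams codegen subgraph   # scaffold a Substreams-powered subgraph from the package
```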
8 changes: 5 additions & 3 deletions website/pages/en/substreams/_meta.js
@@ -1,4 +1,6 @@
export default {
introduction: 'Introduction',
sps: 'Substreams-Powered Subgraphs',
}
introduction: 'Introduction to Substreams',
'getting-started-substreams': 'Quick Start',
'pubsubstreams': 'Run a Substreams Package',
developing: 'Developing',
}
5 changes: 5 additions & 0 deletions website/pages/en/substreams/developing/_meta.js
@@ -0,0 +1,5 @@
export default {
devcontainer: 'Dev Container',
solana: 'Solana',
sinks: 'Sink your Substreams',
}
39 changes: 39 additions & 0 deletions website/pages/en/substreams/developing/devcontainer.mdx
@@ -0,0 +1,39 @@
---
title: Substreams Dev Container
---

The Substreams Dev Container is a tool to help you build your first project. You can either run it remotely through GitHub Codespaces or locally by cloning the [substreams-starter repository](https://github.com/streamingfast/substreams-starter?tab=readme-ov-file). Inside the Dev Container, the `substreams init` command sets up a code-generated Substreams project, allowing you to easily build a subgraph or an SQL-based solution for data handling.
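For example, a minimal first session might look like the following sketch:

```bash
# Scaffold a code-generated Substreams project; the command walks you
# through choosing a network and data source interactively.
substreams init
```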

## Prerequisites

- Ensure Docker and VS Code are up-to-date.

## Navigating the Dev Container

Upon entering the Dev Container, you can either build or import your own `substreams.yaml` and associated modules within the minimal path, or opt for the automatically generated Substreams paths. Then run `substreams build` to generate the Protobuf files.

- **Minimal**: Starts you with the raw block data; this path is for experienced users. You can navigate to the `substreams.yaml` to modify the input data source.
- **Non-Minimal**: Extracts filtered data using network-specific caches and Protobufs from the corresponding Foundational Modules, which are built and maintained by the StreamingFast team.

To share your work with the broader community, publish your `.spkg` to the [Substreams registry](https://substreams.dev/) using:

- `substreams registry login`
- `substreams registry publish`
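A sketch of that publish flow; the `.spkg` file name is hypothetical, and the exact arguments may differ (check `substreams registry publish --help`):

```bash
substreams registry login                            # authenticate against substreams.dev
substreams registry publish my-project-v0.1.0.spkg   # hypothetical package name
```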

> Note: If you run into any problems within the Dev Container, use the `help` command to access troubleshooting tools.

## Building a Sink for Your Project

You can configure your Substreams project to query data either through a Subgraph or directly from an SQL database:

- **Subgraph**: Run `substreams codegen subgraph`. This generates a project with a basic `schema.graphql` and `mappings.ts` file. You can customize these to define entities based on the data extracted by Substreams. For more information on configuring a Subgraph sink, see the [Subgraph documentation](https://thegraph.com/docs/en/sps/triggers).
- **SQL**: Run `substreams codegen sql` for SQL-based queries. For more information on configuring a SQL sink, refer to the [SQL documentation](https://docs.substreams.dev/how-to-guides/sinks/sql-sink).

## Deployment Options

To deploy a Subgraph, you can either run the `graph-node` locally using the `deploy-local` command or deploy to Subgraph Studio by using the `deploy` command found in the `package.json` file.

## Common Errors

- When running locally, make sure to verify that all Docker containers are healthy by running the `dev-status` command.
- If you set the wrong start block while generating your project, navigate to the `substreams.yaml` to change the block number, then re-run `substreams build`.
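For instance, a sketch of that start-block fix; the block numbers and module layout are illustrative:

```bash
# Point the module at the intended start block in substreams.yaml ...
sed -i 's/initialBlock: 0/initialBlock: 200000000/' substreams.yaml
# ... then rebuild so the change takes effect.
substreams build
```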
5 changes: 5 additions & 0 deletions website/pages/en/substreams/developing/sinks/_meta.js
@@ -0,0 +1,5 @@
export default {
sinks: 'Official Sinks',
sps: 'Substreams-Powered Subgraphs',
}

43 changes: 43 additions & 0 deletions website/pages/en/substreams/developing/sinks/sinks.mdx
@@ -0,0 +1,43 @@
Once you find a package that fits your needs, you can choose how you want to consume the data. Sinks are integrations that allow you to send the extracted data to different destinations, such as a SQL database, a file, or a subgraph.

{% hint style="info" %}
**Note**: Some of the sinks are officially supported by the StreamingFast core development team (i.e. active support is provided), but other sinks are community-driven and support can't be guaranteed.
{% endhint %}

- [SQL Database](https://docs.substreams.dev/how-to-guides/sinks/sql-sink): Send the data to a database.
- [Subgraph](./sps/introduction.mdx): Configure an API to meet your data needs and host it on The Graph Network.
- [Direct Streaming](https://docs.substreams.dev/how-to-guides/sinks/stream): Stream data directly into your application.
- [PubSub](https://docs.substreams.dev/how-to-guides/sinks/community-sinks): Send data to a PubSub topic.
- [Community Sinks](https://docs.substreams.dev/how-to-guides/sinks/community-sinks): Explore quality community maintained sinks.

{% hint style="success" %}
**Deployable Service**: If you’d like your sink (e.g., SQL or PubSub) to be hosted for you, reach out to the StreamingFast team [here](mailto:[email protected]).
{% endhint %}
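As an illustration of the SQL option above, a typical flow looks like this sketch; the DSN and package name are hypothetical, and the exact flags are documented in the SQL sink guide linked above:

```bash
# Create the database schema, then run the sink to populate it.
substreams-sink-sql setup "psql://user:pass@localhost:5432/substreams?sslmode=disable" my-package-v0.1.0.spkg
substreams-sink-sql run "psql://user:pass@localhost:5432/substreams?sslmode=disable" my-package-v0.1.0.spkg
```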

## Navigating Sink Repos

### Official

| Name | Support | Maintainer | Source Code |
|-----------|---------|------------------|-------------|
| SQL | O | StreamingFast |[substreams-sink-sql](https://github.com/streamingfast/substreams-sink-sql)|
| Go SDK | O | StreamingFast |[substreams-sink](https://github.com/streamingfast/substreams-sink)|
| Rust SDK | O | StreamingFast |[substreams-sink-rust](https://github.com/streamingfast/substreams-sink-rust)|
| JS SDK | O | StreamingFast |[substreams-js](https://github.com/substreams-js/substreams-js)|
| KV Store | O | StreamingFast |[substreams-sink-kv](https://github.com/streamingfast/substreams-sink-kv)|
| Prometheus| O | Pinax |[substreams-sink-prometheus](https://github.com/pinax-network/substreams-sink-prometheus)|
| Webhook | O | Pinax |[substreams-sink-webhook](https://github.com/pinax-network/substreams-sink-webhook)|
| CSV | O | Pinax |[substreams-sink-csv](https://github.com/pinax-network/substreams-sink-csv)|
| PubSub | O | StreamingFast |[substreams-sink-pubsub](https://github.com/streamingfast/substreams-sink-pubsub)|

### Community

| Name | Support | Maintainer | Source Code |
|-----------|---------|------------------|-------------|
| MongoDB | C | Community |[substreams-sink-mongodb](https://github.com/streamingfast/substreams-sink-mongodb)|
| Files | C | Community |[substreams-sink-files](https://github.com/streamingfast/substreams-sink-files)|
| KV Store | C | Community |[substreams-sink-kv](https://github.com/streamingfast/substreams-sink-kv)|
| Prometheus| C | Community |[substreams-sink-prometheus](https://github.com/pinax-network/substreams-sink-prometheus)|

* O = Official Support (by one of the main Substreams providers)
* C = Community Support
@@ -8,12 +8,16 @@ There are two methods of enabling this technology:

Using Substreams [triggers](/substreams/sps/triggers/): Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into a subgraph. This method creates the subgraph entities directly in the subgraph.

Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly into graph-node. In graph-node, you can use the Substreams data to create your subgraph entities.
Using [Entity Changes](https://docs.substreams.dev/how-to-guides/sinks/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly into graph-node. In graph-node, you can use the Substreams data to create your subgraph entities.

It is really a matter of where you put your logic: in the subgraph or in Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in graph-node.

Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly:
Visit the following links for tutorials on using code-generation tooling to build your first end-to-end Substreams project quickly:

- [Solana](https://docs.substreams.dev/tutorials/intro-to-tutorials/solana)
- [EVM](https://docs.substreams.dev/tutorials/intro-to-tutorials/evm)
- [Starknet](https://docs.substreams.dev/tutorials/intro-to-tutorials/starknet)
- [Injective](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/injective)
- [MANTRA](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/mantra)


- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/solana)
- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/evm)
- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/injective)
5 changes: 5 additions & 0 deletions website/pages/en/substreams/developing/solana/_meta.js
@@ -0,0 +1,5 @@
export default {
solana: 'Solana Offerings',
transactions: 'Transactions and Instructions',
accountchanges: 'Account Changes'
}
@@ -0,0 +1,59 @@
# Getting Started with Solana Account Changes

## Introduction

In this guide, you will learn how to consume Solana account change data using Substreams. We will walk you through the process of setting up your environment, configuring your first Substreams stream, and consuming account changes efficiently.

By the end of this tutorial, you will have a working Substreams feed that allows you to track real-time account changes on the Solana blockchain, as well as historical account change data.

{% hint style="info" %}
History for Solana Account Changes is available as of 2025, starting at block 310629601.
{% endhint %}

For each Solana block, only the latest update per account is recorded; see the [Protobuf Reference](https://buf.build/streamingfast/firehose-solana/file/main:sf/solana/type/v1/account.proto). If an account is deleted, a payload with `deleted == True` is provided. Additionally, events of low importance are omitted, such as those involving the special “Vote11111111…” owner account or changes that do not affect the account data (e.g., lamport changes).

## Prerequisites

Before you begin, ensure that you have the following:

1. [Substreams CLI](../../references/cli/installing-the-cli.md) installed.
2. A [Substreams key](../../references/cli/authentication.md) for access to the Solana Account Change data.
3. Basic knowledge of [how to use](../../references/cli/command-line-interface.md) the command line interface (CLI).

## Step 1: Set Up a Connection to Solana Account Change Substreams

Now that you have the Substreams CLI installed, you can set up a connection to the Solana Account Change Substreams feed.

Using the [Solana Accounts Foundational Module](https://substreams.dev/packages/solana-accounts-foundational/latest), you can choose to stream data directly or use the GUI for a more visual experience. The following `gui` example filters for Honey Token account data.

```bash
substreams gui solana-accounts-foundational filtered_accounts -t +10 -p filtered_accounts="owner:TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA || account:4vMsoUT2BWatFweudnQM1xedRLfJgJ7hswhcpz4xgBTy"
```

The following command streams account changes directly to your terminal:

```bash
substreams run solana-accounts-foundational filtered_accounts -s -1 -o clock
```

The Foundational Module has support for filtering on specific accounts and/or owners. You can adjust the query based on your needs.
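For example, a sketch that filters on owner only, using the SPL Token program address from the `gui` example above:

```bash
# Stream only accounts owned by the SPL Token program.
substreams run solana-accounts-foundational filtered_accounts \
  -p filtered_accounts="owner:TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA"
```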

This tutorial will continue to guide you through filtering, sinking the data, and setting up reconnection policies.

## Step 2: Sink the Substreams

Consume the account stream [directly in your application](../../how-to-guides/sinks/stream/stream.md) using a callback or make it queryable by using the [SQL-DB sink](../../how-to-guides/sinks/sql/sql-sink.md).

## Step 3: Setting up a Reconnection Policy

[Cursor Management](../../references/reliability-guarantees.md) ensures seamless continuity and retraceability by allowing you to resume from the last consumed block if the connection is interrupted, preventing data loss and maintaining a persistent stream.

The user's primary responsibility when creating or using a sink is to pass `BlockScopedDataHandler` and `BlockUndoSignalHandler` implementations, which have the following interfaces:

```go
import (
	"context"

	pbsubstreamsrpc "github.com/streamingfast/substreams/pb/sf/substreams/rpc/v2"
)

// Called once per block with the module output; persist the cursor after processing.
type BlockScopedDataHandler = func(ctx context.Context, cursor *Cursor, data *pbsubstreamsrpc.BlockScopedData) error

// Called on a chain reorganization; roll state back to the block in the undo signal.
type BlockUndoSignalHandler = func(ctx context.Context, cursor *Cursor, undoSignal *pbsubstreamsrpc.BlockUndoSignal) error
```
7 changes: 7 additions & 0 deletions website/pages/en/substreams/developing/solana/solana.mdx
@@ -0,0 +1,7 @@
With Substreams on Solana you can index [Transaction and Instruction](./transactions.mdx) data as well as [Account Changes](./accountchanges.mdx). The Solana Account Changes endpoint offers both real-time and historical data, similar to other [supported endpoints](https://docs.substreams.dev/reference-material/chains-and-endpoints), with a few key differences:

1. History for Solana Account Changes is available as of 2025, starting at block 310629601.

2. The pricing model for Account Changes is based on Egress rather than Bytes Processed, given the large file sizes.

Test our latency on Solana, measured as block-head drift, for yourself by installing the [Substreams CLI](https://docs.substreams.dev/reference-material/substreams-cli/installing-the-cli) and running `substreams run solana-common blocks_without_votes -s -1 -o clock`.