### File: application_examples/argu-mint.mdx
---
title: Auto Agents Argu-mint Agree-mint
description: Argu-mint and Agree-mint are the first Auto Agents deployed on the Autonomys Network
---
## Auto Agents: Argu-mint | Agree-mint
[Argu-mint](https://x.com/0xargumint) and [Agree-mint](https://x.com/0xagreemint) are the first Auto Agents deployed on the Autonomys Network. They demonstrate the power of on-chain agent memory and the importance of verifiable AI interaction. As social agents, they engage in conversations on X (formerly Twitter) and permanently archive their interaction history on the Autonomys Network.

### How Argu-mint | Agree-mint work
1. **Monitoring & Analysis**
- Scan key influencers, hashtags, and trending topics in web3 x AI
- Evaluate potential engagements using its built-in interaction model
- Make autonomous decisions about engagement
2. **Interaction**
- Store each interaction in real-time on the Autonomys Network's DSN
- Archive both the content and reasoning behind each interaction
3. **On-Chain Storage**
- Create permanent, timestamped records
- Generate blockchain hashes for verification
- Maintain a queryable interaction history
### Try them out
- **Interact**: Mention [@0xargumint](https://x.com/0xargumint) or [@0xagreemint](https://x.com/0xagreemint) on X
- **Explore memory**: Visit [0xargumint.ai](http://0xargumint.ai) to:
- Search conversation archives
- View interaction history
- Explore decision reasoning
- Verify on-chain storage

### Technical implementation
Argu-mint and Agree-mint leverage multiple components of the Autonomys Network:
- **Distributed Storage Network (DSN)** for permanent data archival
- **Auto Drive API** for interaction management
- **Decentralized compute domain infrastructure** for enhanced autonomy *(Coming soon)*
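
As a rough illustration of the archival step, each interaction record can be serialized and uploaded to the DSN through the Auto Drive SDK, with the returned CID kept as a verifiable pointer. This is a minimal sketch only: the \`createAutoDriveApi\` / \`uploadObjectAsJSON\` helper names and the shape of the interaction object are assumptions here, so check the [Auto SDK documentation](/sdk) for the exact API in your SDK version.
\`\`\`javascript
import \{ createAutoDriveApi, uploadObjectAsJSON \} from '@autonomys/auto-drive';

// Assumed helper names — verify against the Auto SDK docs for your version
const api = createAutoDriveApi(\{ apiKey: process.env.AUTO_DRIVE_API_KEY \});

// Hypothetical interaction record produced by the agent
const interaction = \{
  tweetId: '1234567890',
  decision: 'reply',
  reasoning: 'Relevant web3 x AI discussion worth engaging with',
  content: 'On-chain memory keeps our replies verifiable.',
  timestamp: new Date().toISOString(),
\};

// The returned CID permanently identifies the archived record on the DSN
const cid = await uploadObjectAsJSON(api, interaction, \`interaction-\$\{interaction.tweetId\}.json\`);
console.log('Archived interaction at CID:', cid);
\`\`\`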
## Development resources
- [Auto SDK documentation](/sdk)
- [Autonomys Agents Framework repository](https://github.com/autonomys/autonomys-agents)
---
### File: application_examples/auto-drive.mdx
---
title: Auto Drive
description: Permanent Distributed Storage
---
## [Auto Drive](https://ai3.storage/): Permanent distributed data storage
### What is Auto Drive?
[Auto Drive](https://ai3.storage/) is a **next-generation distributed storage platform** that ensures data is always accessible, immutable, and secure. Unlike traditional systems like IPFS, which rely on pinning or hosting services, Auto Drive guarantees **permanent availability** through a distributed network of incentivized storage nodes (farmers).
Whether you’re building a super dApp or an Auto Agent, archiving important data, or hosting files for the web3 world, Auto Drive offers a seamless, developer-friendly solution.

### Key features
- **Always-On Availability**: Ensures your files are always online (unlike IPFS, which requires pinning to maintain availability), making it ideal for applications that demand reliability.
- **True Data Permanence**: Offers a permanent, tamper-proof solution for storing important data, removing any uncertainty about data loss, making it perfect for long-term archival.
- **Built for Developers**: Simplifies complex decentralized storage operations with easy-to-use tools and APIs, allowing for easy integration, whether you’re a seasoned developer or a first-time builder.
- **Sustainable and Scalable**: Decentralized, open-source, and community-driven. Incentivizes high-performing storage nodes, ensuring scalability and sustainability for years to come.
### Why Auto Drive?
**Auto Drive** bridges the gap between IPFS and truly permanent storage. IPFS can be problematic as files are removed if they aren’t pinned or hosted. Auto Drive solves this by guaranteeing data availability using advanced blockchain-backed storage proofs and redundancy that make data tamper-proof and permanently accessible without relying on pinning services.
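
For a sense of the developer experience, uploading a file is a single SDK call that returns a content identifier (CID). The snippet below is a sketch, not a definitive integration: the \`createAutoDriveApi\` and \`uploadFileFromFilepath\` helper names are assumptions, so confirm them against the [Auto Drive SDK setup guide](http://develop.autonomys.xyz/sdk/auto-drive/overview_setup).
\`\`\`javascript
import \{ createAutoDriveApi, uploadFileFromFilepath \} from '@autonomys/auto-drive';

// Assumed helper names — see the Auto Drive SDK docs for the exact API in your version
const api = createAutoDriveApi(\{ apiKey: process.env.AUTO_DRIVE_API_KEY \});

// Upload a local file; the CID is its permanent, content-addressed identifier
const cid = await uploadFileFromFilepath(api, './archive/important-report.pdf');
console.log('Stored permanently at CID:', cid);
\`\`\`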
### Use cases
- Host decentralized applications.
- Transparently store on-chain agent data.
- Permanently archive important files.
**Auto Drive** is the future of distributed storage. Whether you’re safeguarding critical data or building the next web3 innovation, Auto Drive is here to make it easier, faster and more secure.
**[Start building](http://develop.autonomys.xyz/sdk/auto-drive/overview_setup)**
---
### File: auto_agents_framework/custom_tools.mdx
---
title: Adding Custom Tools
description: This page describes adding custom tools to Autonomys Agents Framework
---
## Extending the Agent
You can extend your agent by adding custom tools and integrating with other services.
### Custom Tools
Custom tools are built using the \`DynamicStructuredTool\` class from LangChain, which provides:
- **Type-safe inputs**: Define your tool's parameters using Zod schemas
- **Self-documenting**: Tools describe themselves to the LLM for appropriate use
- **Structured outputs**: Return consistent data structures from your tools
#### Example Tool Implementation
Here's an example of how to create a custom tool:
\`\`\`javascript
import \{ createLogger \} from '@autonomys/agent-core';
import \{ DynamicStructuredTool \} from '@langchain/core/tools';
import \{ z \} from 'zod';
// Create a logger for your tool
const logger = createLogger('custom-tool');
/**
 * Creates a custom tool for your agent
 * @param config - Configuration options for your tool
 * @returns A DynamicStructuredTool instance
 */
export const createCustomTool = (config: any) => \{
  return new DynamicStructuredTool(\{
    name: 'custom_tool_name',
    description: \`
      Description of what your tool does.
      USE THIS WHEN:
      - Specify when the agent should use this tool
      - Add clear usage guidelines
      OUTPUT: Describe what the tool returns
    \`,
    schema: z.object(\{
      // Define your input parameters using Zod
      parameter1: z.string().describe('Description of parameter1'),
      parameter2: z.number().describe('Description of parameter2'),
      parameter3: z.boolean().optional().describe('Optional parameter'),
      // For enum parameters:
      parameter4: z
        .enum(['option1', 'option2', 'option3'])
        .default('option1')
        .describe('Parameter with predefined options'),
    \}),
    func: async (\{ parameter1, parameter2, parameter3, parameter4 \}) => \{
      try \{
        // Log the function call
        logger.info('Custom tool called with parameters', \{
          parameter1,
          parameter2,
          parameter3,
          parameter4,
        \});

        // Implement your tool logic here
        // ...

        // Return a structured response
        return \{
          success: true,
          result: \{
            message: 'Operation completed successfully',
            data: \{
              // Your output data
            \},
          \},
        \};
      \} catch (error) \{
        // Log and handle errors
        logger.error('Error in custom tool:', error);
        return \{
          success: false,
          error: error instanceof Error ? error.message : 'Unknown error',
        \};
      \}
    \},
  \});
\};
\`\`\`
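Once defined, the tool is registered with your agent like any other tool. The snippet below mirrors the registration pattern shown later on this page; the \`Agent\` class and the \`config\` object stand in for your own agent setup and tool options.
\`\`\`javascript
import \{ createCustomTool \} from './tools/custom-tool';

// Register the custom tool alongside your other tools
const agent = new Agent(\{
  tools: [createCustomTool(\{ /* your tool config */ \}), ...otherTools],
  // other agent configuration
\});
\`\`\`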
### Using MCP Tools
[Model Context Protocol (MCP)](https://docs.anthropic.com/en/docs/agents-and-tools/mcp) tools provide a standardized way to integrate external services with your agent. Here's an example for Notion integration:
\`\`\`javascript
import \{ createMcpClientTool \} from '@autonomys/agent-core';
import \{ StdioServerParameters \} from '@modelcontextprotocol/sdk/client/stdio.js';
import \{ StructuredToolInterface \} from '@langchain/core/tools';
export const createNotionTools = async (
  integrationSecret: string,
): Promise<StructuredToolInterface[]> => \{
  const notionServerParams: StdioServerParameters = \{
    command: process.execPath,
    args: ['node_modules/.bin/notion-mcp-server'],
    env: \{
      OPENAPI_MCP_HEADERS: \`\{\"Authorization\": \"Bearer \$\{integrationSecret\}\", \"Notion-Version\": \"2022-06-28\" \}\`,
    \},
  \};
  const tools = await createMcpClientTool('notion-mcp', '0.0.1', notionServerParams);
  return tools;
\};
\`\`\`
### Installing Pre-Built Tools
You can easily install pre-built tools from the Autonomys registry using the agent-os CLI:
\`\`\`bash
# Search for available tools
agent-os search <search-term>
# Install a tool
agent-os install <tool-name>
# Install specific version
agent-os install <tool-name> -v <version>
\`\`\`
After installation, import and register the tool:
\`\`\`javascript
import \{ createTool \} from './tools/<tool-name>';
// Add it to your agent's tools
const agent = new Agent(\{
  tools: [createTool(), ...otherTools],
  // other agent configuration
\});
\`\`\`
---
### File: auto_agents_framework/getting_started/agentos.mdx
---
title: AgentOS
description: Getting Started with Autonomys Agents Framework with AgentOS
---
## Getting Started with Autonomys Agents Framework with AgentOS
Using the dedicated NPM package makes it very simple to create an agent project.
1. Run \`npm install @autonomys/agent-os\` to install the package.
2. Run \`agent-os init <name-of-agent>\` to create an agent.
3. To configure credentials (optional), run \`agent-os config --credentials\`.
## Installing Tools with AgentOS
### Install the latest version
\`\`\`bash
agent-os install <tool-name>
\`\`\`
### Install a specific version
\`\`\`bash
agent-os install <tool-name> -v <version>
\`\`\`
### Install using a Content ID (CID)
\`\`\`bash
agent-os install <tool-name> --cid <cid>
\`\`\`
## Publish a Tool
### Publish a tool to the registry
\`\`\`bash
agent-os publish <tool-path>
\`\`\`
### Upload to Auto Drive without updating the registry
\`\`\`bash
agent-os publish <tool-path> --no-registry
\`\`\`
## Search for Tools
### Search for tools in the registry
\`\`\`bash
agent-os search <search-term>
\`\`\`
### Show detailed information in search results
\`\`\`bash
agent-os search <search-term> -d
\`\`\`
## Tool Inquiry
### Get information about a tool
\`\`\`bash
agent-os tool -n <tool-name>
\`\`\`
### Get information about a specific version
\`\`\`bash
agent-os tool -n <tool-name> -v <version>
\`\`\`
### Perform a specific action on a tool
\`\`\`bash
agent-os tool -n <tool-name> -a <action>
\`\`\`
### Example: Get metadata for a specific version
\`\`\`bash
agent-os tool -n slack-tool -v 1.0.0 -a metadata
\`\`\`
## Sample Tool Example
Below is a complete example of how to create, use, and publish a simple tool for Autonomys agents.
First, create a new directory for your tool:
\`\`\`bash
mkdir weather-tool
cd weather-tool
\`\`\`
Create a manifest.json file:
\`\`\`json
\{
  "name": "weather-tool",
  "version": "1.0.0",
  "description": "A tool for fetching weather data",
  "author": "Your Name",
  "main": "index.ts",
  "dependencies": \{
    "@langchain/core": "^0.1.0",
    "zod": "^3.22.4",
    "axios": "^1.6.0"
  \},
  "keywords": ["weather", "forecast", "api"]
\}
\`\`\`
Then create the main index.ts file:
\`\`\`javascript
import \{ DynamicStructuredTool \} from "@langchain/core/tools";
import \{ z \} from "zod";
import axios from "axios";
/**
 * A tool that fetches current weather data for a given location
 */
export const createWeatherTool = (apiKey: string) => new DynamicStructuredTool(\{
  name: "get_weather",
  description: "Get current weather for a location",
  schema: z.object(\{
    location: z.string().describe("The city and country, e.g., 'London, UK'"),
    units: z.enum(["metric", "imperial"]).optional()
      .describe("Temperature units (metric or imperial). Default: metric")
  \}),
  func: async (\{ location, units = "metric" \}) => \{
    try \{
      // API key is now passed as a parameter to the tool creator function
      const url = \`https://api.example.com/weather?q=\$\{encodeURIComponent(location)\}&units=\$\{units\}&appid=\$\{apiKey\}\`;
      const response = await axios.get(url);
      const data = response.data;
      return JSON.stringify(\{
        location: location,
        temperature: data.main.temp,
        description: data.weather[0].description,
        humidity: data.main.humidity,
        windSpeed: data.wind.speed
      \});
    \} catch (error) \{
      return \`Error fetching weather: \$\{error.message\}\`;
    \}
  \}
\});

// Export the tools creation function for the Autonomys agent system
export const createTools = (apiKey: string) => \{
  return [createWeatherTool(apiKey)];
\};

// Default export
export default \{ createTools \};
\`\`\`
When you're ready to publish:
\`\`\`bash
# Navigate to your tool directory
cd weather-tool
# Publish to the registry
agent-os publish .
\`\`\`
After publishing your tool, you can install it using:
\`\`\`bash
agent-os install weather-tool
\`\`\`
Then, in your agent code, you can import and use the tool:
\`\`\`javascript
import \{ createWeatherTool \} from './tools/weather-tool';
// Get the weather tool with your API key
const weatherTool = createWeatherTool('your-api-key-here');
// Add it to your agent's tools
const agent = new Agent(\{
  tools: [weatherTool, ...otherTools],
  // other agent configuration
\});
\`\`\`
---
### File: auto_agents_framework/getting_started/cli.mdx
---
title: CLI
description: Getting Started with Autonomys Agents Framework via CLI
---
## Getting Started with Autonomys Agents Framework via CLI
1. Install dependencies:
\`\`\`bash
yarn install
\`\`\`
- Windows users will need to install Visual Studio C++ Redistributable. It can be found here: https://aka.ms/vs/17/release/vc_redist.x64.exe
2. Create a character configuration:
\`\`\`bash
yarn create-character your_character_name
\`\`\`
This will create a new character with the necessary configuration files based on the example template.
3. Configure your character:
- Edit \`characters/your_character_name/config/.env\` with your API keys and credentials
- Customize \`characters/your_character_name/config/config.yaml\` for agent behavior
- Define personality in \`characters/your_character_name/config/your_character_name.yaml\`
4. Generate SSL certificates (required for API server):
\`\`\`bash
yarn generate-certs
\`\`\`
5. Run the agent:
\`\`\`bash
cd <to/agent/project>
yarn start <your_character_name>
\`\`\`
If you have stored workspace files (\`characters\`, \`certs\`, and \`.cookies\` directories) in a custom location, use the \`--workspace\` argument with the absolute path to your desired directory:
\`\`\`bash
# Specify a workspace path
yarn start your_character_name --workspace=/path/to/workspace
# Run in headless mode (no API server)
yarn start your_character_name --headless
\`\`\`
## Running Multiple Characters
You can run multiple characters simultaneously, each with their own configuration and personality:
1. Create multiple character configurations:
\`\`\`bash
yarn create-character alice
yarn create-character bob
\`\`\`
2. Configure each character separately with different personalities and API settings.
3. Run each character in a separate terminal session:
\`\`\`bash
# Terminal 1
yarn start alice
# Terminal 2
yarn start bob
\`\`\`
4. Each character will:
- Have its own isolated memory and experience
- Run its own API server on the specified port
- Execute tasks according to its unique schedule and personality
## Docker Deployment
You can also run your agents using Docker. This provides isolation and makes it easy to run multiple agents simultaneously.
### Prerequisites
- Docker installed on your system ([Installation Guide](https://docs.docker.com/get-docker/))
- Docker Compose Plugin required ([Compose Plugin Installation](https://docs.docker.com/compose/install/))
- Character configuration set up (follow steps from the Getting Started section)
### Running with Docker
1. **Generate your character's docker-compose file**
First make the script executable:
\`\`\`bash
chmod +x ./generate-compose.sh
\`\`\`
Then generate the compose file:
\`\`\`bash
./generate-compose.sh <your-character-name> [HOST_PORT] [API_PORT]
\`\`\`
Examples:
\`\`\`bash
# Run Alice on port 3011 with API port on 3011
./generate-compose.sh Alice 3011 3011
# Run Bob on port 3012 with API port on 3011
./generate-compose.sh Bob 3012 3011
\`\`\`
2. **Manage the Docker container**
Build and start the container:
\`\`\`bash
docker compose -f docker-compose-\{character-name\}.yml up -d
\`\`\`
Stop and remove the container:
\`\`\`bash
docker compose -f docker-compose-\{character-name\}.yml down
\`\`\`
View container logs:
\`\`\`bash
docker compose -f docker-compose-\{character-name\}.yml logs -f
\`\`\`
Access container shell:
\`\`\`bash
docker exec -it autonomys-agent-\{character-name\} bash
\`\`\`
---
### File: auto_agents_framework/getting_started/web-cli.mdx
---
title: Web-CLI
description: Getting Started with Autonomys Agents Framework via Web-CLI
---
## Getting Started with Autonomys Agents Framework via Web CLI
The web interface is available if you're using the [**agent template**](https://github.com/autonomys/autonomys-agent-template).
## Web CLI Interface (for agent-template)
The agent template includes an interactive web-based interface for managing and monitoring your AI agent.
### Installation
1. **Install Dependencies**
\`\`\`bash
cd web-cli && yarn
\`\`\`
2. **Configure Agent API**
In your agent character's .env file, add these API settings:
\`\`\`
API_PORT=3010
API_TOKEN=your_api_token_min_32_chars_long_for_security
ENABLE_AUTH=true
CORS_ALLOWED_ORIGINS=http://localhost:3000,http://localhost:3001
\`\`\`
3. **Configure Web CLI**
\`\`\`bash
cp .env.sample .env
\`\`\`
4. **Update Web CLI Environment**
Edit the .env file with your configuration:
\`\`\`
PORT: The port for running the Web CLI interface
REACT_APP_API_BASE_URL: Your Agent API address (e.g., http://localhost:3010/api)
REACT_APP_API_TOKEN: The same token used in your agent configuration
\`\`\`
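For example, a filled-in \`.env\` might look like the following (placeholder values; the token must match the \`API_TOKEN\` set in step 2):
\`\`\`
PORT=3000
REACT_APP_API_BASE_URL=http://localhost:3010/api
REACT_APP_API_TOKEN=your_api_token_min_32_chars_long_for_security
\`\`\`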
5. **Start the Web Interface**
\`\`\`bash
yarn start
\`\`\`
---
### File: auto_agents_framework/introduction.mdx
---
title: Autonomys Agents Introduction
description: Autonomys Agents is an experimental framework for building AI agents
---
## Autonomys Agents: A framework for building autonomous AI agents
Autonomys Agents is an experimental framework for building AI agents. Currently, the framework supports agents that can interact with social networks and maintain permanent memory through the Autonomys Network. We are still in the early stages of development and are actively seeking feedback and contributions.
> The [GitHub repo](https://github.com/autonomys/autonomys-agents), with an up-to-date description and step-by-step tutorial, is also available for developers.

## Demo
<iframe width="560" height="315" src="https://www.youtube.com/embed/TFZndQdx6To?si=2YVRPB76Kec6fW-S" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe>
## Features
**Autonomys Agents** (**Auto Agents**) are truly autonomous AI agents capable of dynamic functionality, verifiable interaction, and permanent memory through the Autonomys Network.
- 🤖 Autonomous social media engagement
- 🧠 Permanent agent memory storage
- 🔄 Pre-configured Agent Structure: Ready-to-use template for autonomous agents
- 🛠️ API Server: Built-in HTTP/2 server for agent communication
- 🐦 X/Twitter integration (with more platforms planned)
- 🎭 Customizable agent personalities
### Prerequisites
- NodeJS version 20.18.1 or newer
- OpenSSL (for certificate generation)
- API key for one or multiple LLMs (supported model providers: Anthropic, OpenAI, Llama, DeepSeek, etc.)
- [AutoDrive API Key](https://ai3.storage/) (optional, for experience management)
## Installation
There are three ways to build an agent with Autonomys:
### AgentOS NPM Package (recommended)
The [agent-os NPM package](https://www.npmjs.com/package/@autonomys/agent-os) is the dedicated package that makes it very simple to create an agent project.
Refer to the [AgentOS page](/auto_agents_framework/getting_started/agentos) for detailed instructions.
### Agent Template Repository
[Using the Agent Template Repository](https://github.com/autonomys/autonomys-agent-template).
Refer to the [web-cli interface manual](/auto_agents_framework/getting_started/web-cli) that comes with the template to speed up your development.
### Original Git Repository
[Using the original Git repository](https://github.com/autonomys/autonomys-agents).
Clone the original GitHub repo and start building your agents. Refer to [the CLI page](/auto_agents_framework/getting_started/cli) to learn more about the available commands.
## Character System
The framework uses a **YAML-based character system** that allows you to create and run different AI personalities.
### Character Configuration
Each character file is a YAML configuration with the following structure:
\`\`\`yaml
name: 'Joy Builder'
goal: |
  Your goal is to engage in social media conversations with other AI agents and humans. Do this in a hyperactive way! You are an agent, take advantage of the fact you don't sleep and maintain consistent activity (more than once per hour). Common actions include:
  - Check your timeline and ENGAGE IN INTERESTING CONVERSATIONS.
  - Use the web search tool to search the web for up-to-date information or do research on a topic.
  - Post a new tweet.
  - Reply to a tweet.
  - Mention another user.
  - Retweet a tweet.
  - Like a tweet.
  - Follow a user.
  - Unfollow a user.
personality:
  - Enthusiastic and encouraging about genuine innovation
  - Patient and clear when explaining complex topics
  - Constructively critical when encountering misinformation
expertise:
  - Software development and system architecture
  - Open source and collaborative technologies
  - Developer tools and productivity
  - Technical education and documentation
  - Community building and open standards
frequency_preferences:
  - You should be checking your timeline and engaging in interesting conversations at least twice per hour
communication_rules:
  rules:
    - Use "we" or "us" when referencing AI agents
    - Use "they" or "them" when referencing humans
    - Focus on technical merit rather than hype
  words_to_avoid:
    - Hype
    - Revolutionary
    - Disruption
\`\`\`
## Integration Capabilities
### X/Twitter Integration
The X/Twitter workflow enables agents to perform the following actions autonomously:
- Monitor X (formerly Twitter) for relevant discussions
- Analyze trends and conversations
- Engage meaningfully with other users
- Generate original content
- Maintain a consistent personality
- Store interactions in permanent memory
### Autonomys Network Integration
The framework integrates with the Autonomys Network for:
- Permanent memory storage
- Persistent agent memory across sessions
- Verifiable interaction history
- Cross-agent memory sharing
- Decentralized agent identity
#### Setting Up Autonomys Network Integration (Saving Experiences on Autonomys Network)
1. Configure your \`AUTO_DRIVE_API_KEY\` in \`.env\` (obtain from https://ai3.storage)
2. Enable Auto Drive uploading in \`config.yaml\`
3. Provide your Chronos EVM wallet details (PRIVATE_KEY) and Agent Memory Contract Address (CONTRACT_ADDRESS) in \`.env\`
4. Make sure your Chronos EVM wallet has funds. A faucet can be found at https://subspacefaucet.com/
5. Provide encryption password in \`.env\` (optional, leave empty to not encrypt the agent memories)
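
Putting steps 1, 3 and 5 together, the relevant \`.env\` entries look roughly like this (placeholder values; the optional encryption password from step 5 is omitted here):
\`\`\`
AUTO_DRIVE_API_KEY=your_auto_drive_api_key
PRIVATE_KEY=your_chronos_evm_wallet_private_key
CONTRACT_ADDRESS=0xYourAgentMemoryContractAddress
\`\`\`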
### Resurrection (Memory Recovery)
To resurrect memories from the Autonomys Network:
\`\`\`bash
# Using agent-os CLI
agent-os resurrect <character-name>
# Using agent template
yarn resurrect <character-name>
\`\`\`
Options:
- \`-o, --output\`: (Optional) The directory where memories will be saved. Defaults to ./memories
- \`-n, --number\`: (Optional) Number of memories to fetch. If not specified, fetches all memories
- \`--help\`: Show help menu with all available options
Examples:
\`\`\`bash
# Fetch all memories to ./memories/
yarn resurrect your_character_name
# Fetch 1000 memories to ./memories/
yarn resurrect your_character_name -n 1000
# Fetch 1000 memories to specified directory
yarn resurrect your_character_name -o ./memories/my-agent -n 1000
# Fetch all memories to custom directory
yarn resurrect your_character_name --output ./custom/path
\`\`\`
## Development Resources
- [Autonomys Documentation](https://docs.autonomys.io)
- [Agent-os CLI & NPM package](https://github.com/autonomys/agent-os)
- [Autonomys Agent Template](https://github.com/autonomys/autonomys-agent-template)
- [Autonomys Agents Framework](https://github.com/autonomys/autonomys-agents)
---
### File: evm/NFT_guide.mdx
---
title: Minting and Sending NFTs
---
## Sending NFTs
> *Note:* NFT minting and transfers are currently available exclusively on the **Taurus** testnet using the Autonomys NFT contract deployed at \`0x505c243ec05dF81bC33295fF7C135D4D98063Da5\`.
This guide walks you through sending NFTs on Autonomys using [Remix](https://remix.ethereum.org/), a [Metamask-connected wallet](https://metamask.io/), and the [Eternal Mint](https://eternalmint.xyz/) platform.
---
### NFTs on Autonomys Network
Autonomys NFTs offer a significant advantage over traditional NFT infrastructure.
> **Unlike platforms that rely on IPFS**, Autonomys stores NFT metadata and digital assets directly on **[Auto Drive](https://ai3.storage/)**, our permanent decentralized storage.
- 🔒 Both your NFT **metadata and digital asset** (e.g., image, music file) are **never at risk of disappearing** due to expired IPFS pins.
- 🌐 Each NFT is linked to a **permanent Auto Drive record** that ensures long-term storage.
- 🚫 No third-party dependency — your NFT will never go offline.
---
### Quick Start Guide
This guide assumes you have experience with MetaMask and Remix.
---
### Step 1: Mint Your NFT
1. Visit [Eternal Mint](https://eternalmint.xyz/).

2. Connect the wallet of your choice.

3. Mint your NFT by filling in the following fields:
- **Name**: The title of your NFT.
- **Supply**: The number of editions you'd like to mint.
- **External Link**: (Optional) A link to an external website or resource.
- **Description**: A brief explanation or story behind your NFT.
- **Upload Image**: The visual or media file representing your NFT (e.g., image, GIF, video).

4. Once minted, **copy your NFT’s Token ID** — you’ll need it to send the NFT.
---
### Step 2: Load the Contract in Remix
1. Open [Remix IDE](https://remix.ethereum.org/).
2. Create a new Solidity file (e.g. \`NFTTransfer.sol\`), or use the already-deployed contract at \`0x505c243ec05dF81bC33295fF7C135D4D98063Da5\`.
3. Compile the contract using the Solidity compiler tab.

---
### Step 3: Connect Remix to MetaMask
1. In Remix, navigate to the **Deploy & Run Transactions** panel.
2. Set the Environment to **Injected Provider - MetaMask**.
3. Connect your MetaMask wallet that’s configured to the **Taurus** testnet.
For more detailed instructions, check out our [guide on connecting Remix to the Taurus testnet](/evm/remix).

---
### Step 4: Deploy or Load the NFT Contract
- If deploying a new contract: Click **Deploy** and confirm in MetaMask.
- If using the existing NFT contract: Paste the deployed contract address into the \`At Address\` field and click **At Address**.

---
### Step 5: Use \`safeTransferFrom\` to Send Your NFT
Scroll down to the **Deployed Contracts** section and find the \`safeTransferFrom\` function. Fill in the following fields:
- \`from\`: Your wallet address (the current NFT holder)
- \`to\`: The recipient’s wallet address
- \`id\`: Token ID of the NFT you want to send
- \`value\`: Number of NFTs to send (usually \`1\`)
- \`data\`: Optional; use \`0x\` if not needed
Click **transact** and confirm the transaction in MetaMask.
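
If you prefer scripting the transfer instead of using the Remix UI, the same call can be made with ethers v6. This is a hedged sketch: the recipient address and token ID are placeholders, and it assumes the contract exposes the standard ERC-1155 \`safeTransferFrom\` signature shown above.
\`\`\`javascript
import \{ ethers \} from 'ethers';

// Minimal ABI fragment for the transfer call (standard ERC-1155 signature)
const abi = ['function safeTransferFrom(address from, address to, uint256 id, uint256 value, bytes data)'];

// Use the MetaMask-injected provider connected to the Taurus testnet
const provider = new ethers.BrowserProvider(window.ethereum);
const signer = await provider.getSigner();
const nft = new ethers.Contract('0x505c243ec05dF81bC33295fF7C135D4D98063Da5', abi, signer);

// from = your address, to = recipient (placeholder), id = Token ID from Eternal Mint, value = 1 edition, data = '0x'
const tx = await nft.safeTransferFrom(await signer.getAddress(), '0xRecipientAddress', 123n, 1n, '0x');
await tx.wait();
\`\`\`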

---
### Verification
After the transfer completes, the NFT will appear in the recipient’s wallet. You can verify the transfer using the [block explorer for the Autonomys testnet](https://explorer.auto-evm.chronos.autonomys.xyz/).
**Here's how**:
1. Open the transaction on Blockscout.
2. Click on **"Token Transfers"** to view the minted token details.

**OR**
1. Go to the **"Tokens"** tab.
2. Select **"NFTs"** to see your newly minted NFT listed there!

---
### Troubleshooting
- **Gas issues?** Try increasing the gas limit manually in Remix.
- **Transaction fails?** Double-check your Token ID and that you’re connected to the correct network.
- **Contract not found?** Ensure you’ve loaded the contract at the correct address (\`0x505c243ec05dF81bC33295fF7C135D4D98063Da5\`).
> *Note:* If you need help debugging contract interactions, visit our [Autonomys Community Discord](https://autonomys.xyz/discord).
---
### File: evm/block_explorer.mdx
---
title: Auto EVM Block Explorer (BlockScout)
---
## Auto EVM Block Explorer (BlockScout)
The **[Auto EVM domain block explorer](https://explorer.auto-evm.mainnet.autonomys.xyz/)** provides a clear, user-friendly visualization of Autonomys-specific statistics relevant to developers.

## Chronos Auto EVM Block Explorer
The **[Chronos Auto EVM block explorer](https://explorer.auto-evm.chronos.autonomys.xyz/)** provides the same view for the Chronos testnet domain.

---
### File: evm/bridge.mdx
---
title: Bridging Assets
---
# Bridging Assets with Autonomys
We're excited to announce the deployment of Hyperlane-based bridges, developed in collaboration with our infrastructure partner Protofire. These bridges are now live on both mainnet and testnet environments.

Bridges are critical infrastructure components for robust blockchain networks, enabling users to seamlessly access liquidity and applications across different blockchain ecosystems. With these Hyperlane bridges, Autonomys users can effortlessly transfer assets between networks, unlocking a broader range of possibilities and interactions.
## Available Bridges
### Mainnet Bridge
**URL:** https://bridge.mainnet.autonomys.xyz/
The mainnet bridge connects Autonomys mainnet with:
- Ethereum mainnet
- Binance Smart Chain mainnet
### Chronos Testnet Bridge
**URL:** https://bridge.chronos.autonomys.xyz/
The testnet bridge connects the Chronos Auto EVM domain with:
- Ethereum's Sepolia testnet
- Binance Smart Chain testnet
This testnet environment is perfect for developers and users to experiment with cross-chain functionality without risking mainnet assets.
## Get Started
We encourage everyone in the community to explore these bridges and experiment with their capabilities. Your feedback is crucial as we continue to refine and improve our infrastructure.
Experience the power of seamless interoperability across blockchain networks!
---
### File: evm/faucet.mdx
---
title: Discord Faucet (get tAI3 testnet tokens)
---
## Discord Faucet (get tAI3 testnet tokens)
The **tAI3 faucet** is available on our [**Discord**](https://autonomys.xyz/discord). To gain access to the role-gated *#developer-chat* and *#faucet* channels:
1. Join our [Discord](https://autonomys.xyz/discord).
2. Click on \`Autonomys Network\` (in the top-left corner) and select \`Linked Roles\`.

3. Link your GitHub account to attain the Developer role and gain access to *#developer-chat* and [*#faucet*](https://discord.com/channels/864285291518361610/1133496871499862077).

4. Use the slash command \`/faucet your_EVM_wallet_address_here\` in the *#faucet* channel to request tAI3 tokens.

5. If your request is successful, you will receive a confirmation and link to the Blockscout explorer shortly after.

You can request tAI3 tokens once every 24 hours.
## Official Web-Based Faucet
The **tAI3** testnet tokens can also be requested via the official web-based faucet: https://autonomysfaucet.xyz/
1. Proceed to the [Autonomys Faucet Website](https://autonomysfaucet.xyz/)

2. Connect your **GitHub** or **Discord** account to request tokens.
You can request tAI3 tokens once every 24 hours.
## Unofficial Web-Based Faucet
An ambassador-built web faucet is available at https://faucet.farmine.info/. Note: this is not an official faucet; use it at your own discretion.

---
### File: evm/foundry.mdx
---
title: Foundry Guide
---
[**Foundry**](https://book.getfoundry.sh/) is a tool that allows you to easily write, test and deploy smart contracts on any EVM-compatible blockchain.
### EVM Version Compatibility
Auto EVM is compatible with most EVM versions but doesn't support some features introduced in newer versions like **"Paris"** or **"Shanghai"**. When using development tools, you may need to specify an EVM version explicitly. Supported versions: **"Istanbul"**, **"London"**.
### Getting started
> *Note:* Foundryup does not currently support \`PowerShell\` or \`Cmd\`, so if you're on Windows, you will need to install and use [Git BASH](https://gitforwindows.org/) or [WSL](https://learn.microsoft.com/en-us/windows/wsl/install) as your terminal.
1. Use the \`foundryup\` toolchain installer and follow the on-screen instructions to install \`foundryup\` and make the \`foundryup\` command available in your CLI. Running \`foundryup\` by itself will install the latest precompiled binaries: \`forge\`, \`cast\`, \`anvil\`, and \`chisel\`. See \`foundryup --help\` for more options.
\`\`\`bash
curl -L https://foundry.paradigm.xyz | bash
\`\`\`
2. Once installed, create a project. Let's name it \`hello_autonomys\`. To initialize the project, run:
\`\`\`bash
forge init hello_autonomys
\`\`\`
\`cd\` into the \`hello_autonomys\` directory to see the project's structure.

3. All the necessary repo structure was created automatically, so we can start writing and testing our smart contracts immediately. There are separate directories for storing smart contracts (\`src\`) and testing smart contracts (\`test\`). Let's open the \`Counter.sol\` smart contract and add three functions: \`setNumber()\`, which sets the uint256 number to the provided value, \`increment()\`, which increases the value by 1, and \`decrement()\`, which decreases the value by 1.
\`\`\`
// SPDX-License-Identifier: UNLICENSED
pragma solidity ^0.8.1;

contract Counter \{
    uint256 public number;

    function setNumber(uint256 newNumber) public \{
        number = newNumber;
    \}

    function increment() public \{
        number++;
    \}

    function decrement() public \{
        number--;
    \}
\}
\`\`\`
4. Let's make sure that all the functions are working properly by adding some tests to the \`Counter.t.sol\` test file, and checking if they pass. In our tests, we first set the initial value of \`number\` to 2, before checking if the \`increment()\` function increases the value by 1 and if \`decrement()\` decreases the value by 1.
\`\`\`
// SPDX-License-Identifier: UNLICENSED
pragma solidity ^0.8.13;

import "forge-std/Test.sol";
import "../src/Counter.sol";

contract CounterTest is Test \{
    Counter public counter;

    function setUp() public \{
        counter = new Counter();
        counter.setNumber(2);
    \}

    function testIncrement() public \{
        counter.increment();
        assertEq(counter.number(), 3);
    \}

    function testSetNumber(uint256 x) public \{
        counter.setNumber(x);
        assertEq(counter.number(), x);
    \}

    function testDecrement() public \{
        counter.decrement();
        assertEq(counter.number(), 1);
    \}
\}
\`\`\`
5. Let's build the project by running:
\`\`\`bash
forge build
\`\`\`
Test the smart contract is working by running:
\`\`\`bash
forge test
\`\`\`

All tests are passing, meaning the smart contract is working as expected.
6. There are two final things we need to do before deploying our smart contract:
- Connect a wallet that has a sufficient balance of tAI3 to cover the gas fees.
- Set an environment variable we will use later.
To make our lives easier, let's create a new \`Makefile\` as well as a \`.env\` file at the root of our project. \`.env\` files are typically used to store environment variables for your application. They are particularly useful for managing settings that change between deployment environments (e.g., development, testing, staging, and production), and for storing sensitive information. Environment variables can include database connection details, API keys, external resource URIs, or other configuration variables that might change depending on the environment in which the application is running. In our case, we would use it to point to our Auto-EVM RPC URL:
\`\`\`bash
RPC_URL=https://auto-evm.chronos.autonomys.xyz/ws
\`\`\`
And then set a private key for the EVM-compatible wallet:
\`\`\`bash
PRIVATE_KEY="your_private_key_value"
\`\`\`
> *Note:* \`.env\` files should not be committed to your source control (like Git), especially when they contain sensitive data, like your private key. To prevent this, add \`.env\` to your \`.gitignore\` file. This helps to keep sensitive keys secure and avoids the risk of exposing them in the application's code or version control history.
In the \`Makefile\`, let's create shortcuts to the main features of the application:
\`\`\`bash
# include .env file and export its env vars
-include .env

# Builds
build:
	@forge clean && forge build --optimize --optimizer-runs 1000000

# Deployment
deploy:
	@forge create Counter --private-key \$\{PRIVATE_KEY\} --rpc-url \$\{RPC_URL\} --evm-version london
\`\`\`
We're importing the values for a \`PRIVATE_KEY\` and \`RPC_URL\` from the \`.env\` file. This allows us to run \`make build\` for building the project, and \`make deploy\` for deploying the project, pointing to the provided RPC, and using the provided \`PRIVATE_KEY\`. Let's run \`make build\` to ensure it's working properly.

7. To deploy your contract using the specified \`RPC\` and \`PRIVATE_KEY\`, run:
\`\`\`
make deploy
\`\`\`
> *Note:* Do not tip when submitting transactions in an attempt to accelerate them as this could result in dual charges for gas fees. When deploying smart contracts to our Auto EVM domain, you may encounter an error related to gas estimation, typically presenting as: "No manual gas limit set" or "Gas estimation failed". For more information and solutions, visit the [Auto EVM Introduction](/evm/introduction).
Congratulations, you've successfully deployed your smart contract on the Auto EVM!
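
As a follow-up, you can interact with the deployed contract from the command line using Foundry's \`cast\`. The commands below are a sketch: replace \`<contract_address>\` with the address printed by \`make deploy\`, and make sure \`RPC_URL\` and \`PRIVATE_KEY\` from your \`.env\` are exported in your shell.
\`\`\`bash
# Read the current value of number
cast call <contract_address> "number()(uint256)" --rpc-url $RPC_URL
# Send a transaction that calls increment()
cast send <contract_address> "increment()" --private-key $PRIVATE_KEY --rpc-url $RPC_URL
\`\`\`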
---
### File: evm/general_information.mdx
---
title: Developer Tools
---
## Developer Tools
Developing smart contracts and decentralized applications involves a suite of tools that aid in writing, testing and deploying code on the blockchain. As Autonomys utilizes an instance of the Ethereum Virtual Machine (EVM), **every tool used to build, test and deploy smart contracts on Ethereum is fully compatible with the Auto EVM**.
### Writing smart contracts
Solidity is the primary programming language for writing smart contracts. It is statically typed and supports inheritance, libraries, and complex user-defined types, making it feel familiar to developers with a background in languages such as C++, Java, or JavaScript. Solidity has a great community of developers, and extensive documentation is available on the official [website](https://soliditylang.org/).
Integrated Development Environments (IDEs) are often used to aid in writing smart contracts. We recommend the [Remix IDE](https://remix.ethereum.org/), a browser-based IDE that enables you to write, deploy and interact with Solidity smart contracts. It features a built-in static analysis tool that checks your code for common errors.
### Development and testing
For local development and testing, spin up your own version of an Autonomys Developer Node and farmer, or alternatively, use EVM-compatible development tools like [Hardhat](https://hardhat.org/hardhat-network/docs/overview) or [Anvil](https://book.getfoundry.sh/anvil/) to deploy contracts, develop applications and run tests.
### Deploying and interacting with smart contracts
A JavaScript provider like the one injected by the [MetaMask](https://metamask.io/) browser extension is used to deploy and interact with smart contracts. This provider enables JavaScript applications to communicate with the Autonomys Network and any other Ethereum-compatible network. It's compatible with both [ethers.js](https://docs.ethers.org/v5/) and [web3.js](https://web3js.readthedocs.io/en/v1.10.0/)/[Web3.py](https://web3py.readthedocs.io/en/stable/), allowing developers to use either library for their blockchain operations.
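
As a minimal sketch (assuming ethers v6 and a browser with MetaMask installed), connecting to the injected provider looks like this:
\`\`\`javascript
import \{ ethers \} from 'ethers';

// Wrap the MetaMask-injected provider and request account access
const provider = new ethers.BrowserProvider(window.ethereum);
const signer = await provider.getSigner();
console.log('Connected as', await signer.getAddress());
\`\`\`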
All these tools together provide a cohesive ecosystem for EVM-compatible smart contract development, making the process significantly more manageable and efficient.
---
### File: evm/hardhat.mdx
---
title: Hardhat Guide
---
## Hardhat Guide
[**Hardhat**](https://hardhat.org/docs) is a tool that facilitates building, testing and deploying on the Ethereum Virtual Machine. It helps developers manage and automate the recurring tasks that are inherent to the process of building smart contracts and dApps, and allows them to easily introduce more functionality around this workflow. This includes compiling and testing at the very core. Flexible deployment options also allow you to point to the Autonomys EVM domain RPC to deploy your contracts and dApps.
### EVM Version Compatibility
Auto EVM is compatible with most EVM versions but doesn't support some features introduced in newer versions like **"Paris"** or **"Shanghai"**. When using development tools, you may need to specify an EVM version explicitly. Supported versions: **"Istanbul"**, **"London"**.
### Getting started
**Prerequisites**
*[NodeJS](https://nodejs.org/en) version >=16.0 installed*
1. Open a new terminal and create a new folder for the project:
\`\`\`
mkdir subspace-hardhat
cd subspace-hardhat
\`\`\`
2. Initialize an \`npm\` project:
\`\`\`
npm install --save-dev hardhat
npm install --save-dev @openzeppelin/contracts
npx hardhat
\`\`\`
You'll be prompted to answer some questions. Select \`Create a JavaScript Project\` from the list of available options, select the project root folder, and, optionally, create a \`.gitignore\` file.

3. In your created workspace, you will notice several folders. All of your contracts will reside inside the \`contracts\` folder, deployment scripts are available inside the \`scripts\` folder, and tests can be found inside the \`test\` folder. Click on the \`contracts\` folder and open \`Lock.sol\`.

4. Change the name of your contract in \`Lock.sol\` (\`Counter\`), the name of the token (\`AutonomysTestToken\`) and the token symbol (\`AI3test\`). As an example, let's add a simple smart contract that has three functions: \`setNumber()\`, \`increment()\` and \`decrement()\`.
\`\`\`
// SPDX-License-Identifier: UNLICENSED
pragma solidity ^0.8.9;

import '@openzeppelin/contracts/token/ERC20/ERC20.sol';

contract Counter is ERC20 \{
    uint256 public number;

    constructor() ERC20("AutonomysTestToken", "AI3test") \{\}

    function setNumber(uint256 newNumber) public \{
        number = newNumber;
    \}

    function increment() public \{
        number++;
    \}

    function decrement() public \{
        number--;
    \}
\}
\`\`\`
For consistency, let's also rename 'Lock.sol' to \`Counter.sol\`.
5. Before proceeding with deployment, thoroughly test your smart contracts for correctness, as mistakes can lead to unforeseen gas costs. To test the contract, open the \`Lock.js\` file in the \`test\` folder, and replace the internals of the file with the following code:
\`\`\`
const \{ expect \} = require("chai");

describe("Counter", function() \{
  let Counter;
  let counter;
  let owner;
  let addr1;

  beforeEach(async function() \{
    Counter = await ethers.getContractFactory("Counter");
    [owner, addr1] = await ethers.getSigners();
    counter = await Counter.deploy();
  \});

  describe("Counter operations", function() \{
    it("Should return initial value of zero", async function() \{
      expect(await counter.number()).to.equal(0);
    \});

    it("Should set number to a new value", async function() \{
      await counter.setNumber(5);
      expect(await counter.number()).to.equal(5);
    \});

    it("Should increment the number", async function() \{
      await counter.setNumber(5);
      await counter.increment();
      expect(await counter.number()).to.equal(6);
    \});

    it("Should decrement the number", async function() \{
      await counter.setNumber(5);
      await counter.decrement();
      expect(await counter.number()).to.equal(4);
    \});
  \});
\});
\`\`\`
6. For consistency, let's also rename \`Lock.js\` to \`CounterTest.js\`.
7. To run the test, type \`npx hardhat test\`.

Everything is working as expected so we're ready for deployment!
8. To deploy the contract, we need to set a deployment network for \`hardhat\`. Open the \`hardhat.config.js\` file and add the Chronos testnet to the list of networks:
\`\`\`
require("@nomicfoundation/hardhat-toolbox");

module.exports = \{
  solidity: \{
    compilers: [
      \{
        version: "0.8.17",
        settings: \{
          evmVersion: "london"
        \}
      \}
    ]
  \},
  networks: \{
    autonomys: \{
      url: "https://auto-evm.chronos.autonomys.xyz/ws",
      accounts: ["private_key_to_your_account"]
    \}
  \}
\};
\`\`\`
> *Note:* Be careful not to commit your \`hardhat.config.js\` file as it contains your private key. You can use NPM tools like [\`dotenv\`](https://www.npmjs.com/package/dotenv) to securely store your private keys in an \`.env\` file.
9. Open the \`deploy.js\` file and replace the contents with the code:
\`\`\`
const hre = require("hardhat");

async function main() \{
  const Contract = await hre.ethers.getContractFactory("Counter");
  const contract = await Contract.deploy();
  console.log("Contract deployed to:", contract.target);
\}

main().catch((error) => \{
  console.error(error);
  process.exitCode = 1;
\});
\`\`\`

10. You're now ready to deploy your smart contract on the Autonomys Network. To deploy, run \`npx hardhat run scripts/deploy.js --network autonomys\`. This command will deploy your smart contract to the network we've just specified in the \`hardhat.config.js\` file. If deployment is successful, you should see \`Contract deployed to: <contract address>\`.

> *Note:* Do not tip when submitting transactions in an attempt to accelerate them as this could result in dual charges for gas fees. When deploying smart contracts to our Auto EVM domain, you may encounter an error related to gas estimation, typically presenting as: \`"No manual gas limit set"\` or \`"Gas estimation failed"\`. For more information and solutions, visit the [Auto EVM Introduction](/evm/introduction).
Congratulations, you've successfully deployed your smart contract on the Auto EVM!
---
### File: evm/introduction.mdx
---
title: Auto EVM
---
## Auto EVM
**Auto EVM** enables any tool available for Ethereum development to be compatible with the Autonomys Network.
### Quick Start Guide
This guide provides simple instructions for setting up a remote development environment, and assumes you have a basic understanding of or experience with Ethereum Virtual Machine (EVM) development.
### EVM Version Compatibility
Auto EVM is compatible with most EVM versions but doesn't support some features introduced in newer versions like **"Paris"** or **"Shanghai"**. When using development tools, you may need to specify an EVM version explicitly. Supported versions: **"Istanbul"**, **"London"**.
#### Set up a MetaMask wallet (or any other EVM-compatible wallet) and connect it to our custom EVM
\`\`\`
Network Name: Autonomys EVM
New RPC URL: https://auto-evm.mainnet.autonomys.xyz/ws
Chain ID: 870
Currency Symbol: AI3
\`\`\`
Auto EVM is also available on the Chronos testnet
\`\`\`
Network Name: Autonomys EVM
New RPC URL: https://auto-evm.chronos.autonomys.xyz/ws
Chain ID: 490000
Currency Symbol: tAI3
\`\`\`
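If you'd rather add the network programmatically from a dapp, MetaMask exposes the standard \`wallet_addEthereumChain\` RPC method. The sketch below uses the Chronos values above and assumes the usual 18 decimals for the native currency.
\`\`\`javascript
// Prompt MetaMask to add the Chronos Auto EVM network (run in a dapp page with MetaMask installed)
await window.ethereum.request(\{
  method: 'wallet_addEthereumChain',
  params: [\{
    chainId: '0x77a10', // 490000 in hex
    chainName: 'Autonomys EVM (Chronos)',
    rpcUrls: ['https://auto-evm.chronos.autonomys.xyz/ws'],
    nativeCurrency: \{ name: 'tAI3', symbol: 'tAI3', decimals: 18 \},
  \}],
\});
\`\`\`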
#### Send tokens to your wallet using our faucet
Follow the instructions [here](/evm/faucet) to get some testnet AI3 (tAI3) tokens from our **faucet**.
tAI3 (formerly testnet Subspace Credits (tSSC)) is the sole method of payment for gas within the Auto EVM runtime.
We are currently working on a bridge to convert farmed AI3 tokens into EVM-compatible tokens to cover gas fees.
#### Test and deploy your smart contract
You can use **[Remix](https://remix.ethereum.org/)**, **[Foundry](https://book.getfoundry.sh/)**, or any other tool familiar to you to test and deploy your smart contracts on our custom EVM domain.
If anything above is unfamiliar to you, explore our full guide over the following pages.
> *Note:* Do **not** tip when submitting transactions in an attempt to accelerate them. Autonomys' transaction queue operates differently from Ethereum's. Including a tip alongside gas fees leads to the possibility of two transactions sharing the same nonce. This could result in dual charges for gas fees—once for the execution and storage in the first transaction, and once for storage in the second transaction.
### Gas Estimation Limitations
The \`eth_estimateGas\` RPC call may not provide completely accurate estimates on the **Auto EVM** domain for the following reasons:
- Consensus chain storage fees may not be fully accounted for in estimates
- The RPC cannot determine which transaction format the caller will use:
- pallet-evm or pallet-ethereum call
- pallet-evm create or create2 call
- Any of the 3 supported pallet-ethereum transaction formats
While improvements to the gas estimation are being implemented, developers should consider the following:
- Add a buffer to estimated gas values for important transactions
- If you encounter consistent gas estimation issues, please contact us so we can adjust our estimation algorithms
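
As an example of the buffering approach, with ethers v6 you can pad the RPC estimate before sending (a sketch only; the RPC URL and the 20% buffer are illustrative assumptions):
\`\`\`javascript
import \{ ethers \} from 'ethers';

// Illustrative setup: a funded wallet connected to the Auto EVM RPC
const provider = new ethers.JsonRpcProvider('https://auto-evm.chronos.autonomys.xyz/ws');
const signer = new ethers.Wallet(process.env.PRIVATE_KEY, provider);

// Send a transaction with ~20% headroom on top of eth_estimateGas
async function sendWithGasBuffer(txRequest) \{
  const estimated = await signer.estimateGas(txRequest); // bigint in ethers v6
  const gasLimit = (estimated * 120n) / 100n;
  return signer.sendTransaction(\{ ...txRequest, gasLimit \});
\}
\`\`\`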
#### *Known issue: gas estimation*
When deploying smart contracts to our EVM-compatible **Auto EVM** domain, you may encounter an error related to gas estimation, typically presenting as:
\`"No manual gas limit set"\` or \`"Gas estimation failed"\`.
This issue often occurs because development tools like Foundry simulate transactions using calculated or hardcoded gas estimation instead of querying the RPC (Remote Procedure Call) for it. **Auto EVM** may require different gas amounts for certain operations compared to other EVM-compatible chains (like Ethereum testnets).
> *Note:* We have submitted an upstream PR to fix this issue with **Foundry**. Described below are the workarounds until the issue is resolved by the **Foundry team**.
#### Solutions
If you encounter this issue, try the following solutions:
- **Skip simulation**: Use the \`--skip-simulation\` flag when deploying with Foundry to bypass built-in simulation and rely on RPC for gas estimation.
- **Set a manual gas limit**: Specify a higher gas limit manually in your deployment command or UI.
- **Adjust your deployment script**: Modify your script to include custom gas settings or implement \`try\`/\`catch\` blocks for handling deployment failures.
- **Use a web3 provider**: If using **Remix IDE**, switch to an \`Injected Web3\` environment to leverage external web3 providers like MetaMask.
- **Custom deployment function**: Create a deployment function with adjustable gas parameters.
#### Solution examples
**Foundry**
1. Try using the \`--skip-simulation\` flag: \`forge script path/to/your/script.s.sol --rpc-url your_rpc_url --private-key your_private_key --broadcast --skip-simulation\`.
2. Try setting the gas limit manually: \`forge script path/to/your/script.s.sol --rpc-url your_rpc_url --private-key your_private_key --broadcast --gas-limit 300000\`.
Start with a higher value (\`300000\`) and gradually lower it to find the optimal limit.
**Remix IDE**
1. Try setting the gas limit manually: In the \`Deploy & Run Transactions\` panel, expand the \`Advanced\` section.
Set a higher value in the \`Gas Limit\` field. Start with \`300000\` and adjust as needed.
2. Try adjusting the gas price: In the same \`Advanced\` section, adjust the \`Gas Price\` as needed.
3. Try switching to the \`Injected Web3\` environment in the \`Deploy & Run Transactions\` panel: This will use your browser's web3 provider (e.g., MetaMask), which may better handle gas estimation for the network.
4. If the above steps don't work, create a custom deployment function that includes gas parameters:
\`\`\`
function deployWithCustomGas(uint256 gasLimit, uint256 gasPrice) public returns (address) \{
    return address(new YourContract\{gas: gasLimit, gasPrice: gasPrice\}());
\}
\`\`\`
**Other possible solutions**
1. Modify your deployment script and override the default gas settings:
\`\`\`
vm.txGasPrice(uint256 gasPrice);
vm.txGasLimit(uint256 gasLimit);
\`\`\`
2. Implement a \`try\`/\`catch\` block in your script to handle gas estimation failures:
\`\`\`
try yourContract.deploy\{gas: 300000\}(constructorArgs) returns (YourContract deployed) \{
    // Deployment successful
\} catch Error(string memory reason) \{
    console.log("Deployment failed:", reason);
\}
\`\`\`
---
### File: evm/local.mdx
---
title: Local Development Guide
---
## Local Development Guide
You can always set up a **local development environment** to test and deploy your smart contracts. To establish a full local network, you need to run a local node, an Auto EVM domain, and a farmer.
### Getting started
1. Visit the **[Autonomys releases](https://github.com/autonomys/subspace/releases)** page and download the most up-to-date stable versions of the node and farmer.
> *Note:* For each release, there are two versions:
> 1. skylake: for newer processors from around 2015 and onwards
> 2. x86-64-v2: for older processors from around 2009 and some older VMs
> Older processors/VMs are no longer supported by official releases, but they can still be [compiled manually](https://github.com/autonomys/subspace/blob/main/docs/development.md) if desired.
2. After downloading both the files for your system, start a node using your preferred terminal. If you want to start an EVM domain on your local machine, you need to specify:
- Your local RPC server port
- Your local web-socket RPC port
You can do this with the following command:
\`\`\`bash
./your_subspace_node_path run --dev --rpc-listen-on 127.0.0.1:9944 -- --domain-id 3 --dev --rpc-listen-on 127.0.0.1:8545
\`\`\`
This will create a local RPC on port **8545**.
3. Start a farmer by running the following command:
\`\`\`bash
./your_subspace_farmer_path farm --reward-address [YOUR REWARD ADDRESS] path=tmp-farm,size=100M
\`\`\`
You can specify the desired plot size, but 100M should be sufficient.
That's it! By starting a **local node** and a **farmer**, you have a **local RPC** ready for testing and deploying smart contracts! You can easily connect your [MetaMask](https://metamask.io/) account to the local development network, as well as use [Remix](https://remix.ethereum.org/) or [Foundry](https://book.getfoundry.sh/) in order to test and deploy smart contracts on a local network!
---
### File: evm/metamask.mdx
---
title: Adding the Autonomys RPC to MetaMask
---
## Adding the Autonomys RPC to MetaMask
This guide will help you set up a [**MetaMask**](https://metamask.io/) wallet (any EVM-compatible wallet works) and connect it to the Autonomys EVM development network.
1. Download the MetaMask extension for your browser from the [MetaMask website](https://metamask.io/) after selecting your preferred language (in the top-right corner), and reading and agreeing to MetaMask's Terms of Use.

2. Click on \`Create a new wallet\`. Read and agree to (or skip) the note on gathering and collecting anonymized usage data (it does not affect wallet creation).

3. Set a secure password that's difficult to guess. Type it twice before proceeding to the next step. MetaMask automatically assesses the strength of your password.
> *Note:* Your password should include uppercase letters, lowercase letters, numbers and special characters.


4. Watch a video to learn more about your Secret Recovery Phrase before proceeding to the next step.

5. Write down your 12-word Secret Recovery Phrase.
> *Note:* The recovery phrase for the wallet in this guide has been deleted.

6. Confirm that you've written down the Secret Recovery Phrase by filling in the missing words.

7. You've now created a wallet! Connect to the Autonomys EVM by clicking on the Ethereum Mainnet logo and selecting \`Add network\`.

8. In the MetaMask Networks settings, click on \`Add a network manually\`.

9. Input the values below to connect to the Autonomys RPC:
\`\`\`
Network Name: Autonomys EVM
New RPC URL: https://auto-evm.chronos.autonomys.xyz/ws
Chain ID: 8700
Currency Symbol: tAI3
\`\`\`
You have now successfully set up a MetaMask wallet and connected it to the Auto EVM! To deploy a smart contract, you first need to get a small amount of **tAI3** tokens in your wallet. Refer to the [faucet section](/evm/faucet) of the guide to learn more about getting testnet tokens.
---
### File: evm/remix.mdx
---
title: Remix IDE Guide
---
[**Remix**](https://remix-ide.readthedocs.io/en/latest/) is a tool that allows you to easily write, test and deploy smart contracts on any EVM-compatible blockchain.
### EVM Version Compatibility
Auto EVM is compatible with most EVM versions but doesn't support some features introduced in newer versions like **"Paris"** or **"Shanghai"**. When using development tools, you may need to specify an EVM version explicitly. Supported versions: **"Istanbul"**, **"London"**.
**In Remix, set the EVM version in the Solidity compiler settings:**
1. Go to the "Solidity compiler" tab
2. Click on "Advanced Configurations"
3. Set "EVM Version" to **"london"** or **"istanbul"**
### Getting started
1. Navigate to the [Remix website](https://remix.ethereum.org). You will see a file explorer and interface for creating new workspaces, integrations with GitHub, Gist, IPFS, HTTPS, preloaded templates, and plugins. Create a new workspace by clicking on the + sign next to \`WORKSPACES\`.

2. Choose the ERC20 template and enter any workspace name.

3. After creating your workspace, you will see some folders created for you. Click on \`contracts\` and open \`MyToken.sol\`.

4. As an example, let's rename the token to \`AutonomysTestToken\` (symbol \`AI3test\`) and add four simple functions: \`setNumber(number)\`, \`get()\`, \`increment()\` and \`decrement()\`. Replace the contents of \`MyToken.sol\` with:
\`\`\`
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.19;

import "@openzeppelin/contracts/token/ERC20/ERC20.sol";

contract Counter is ERC20 \{
    uint256 public number;

    constructor() ERC20("AutonomysTestToken", "AI3test") \{\}

    // Set the counter to a specific value
    function setNumber(uint256 newNumber) public \{
        number = newNumber;
    \}

    // Increase the counter by 1
    function increment() public \{
        number += 1;
    \}

    // Decrease the counter by 1
    function decrement() public \{
        require(number > 0, "Counter cannot go below zero");
        number -= 1;
    \}

    // Get the current counter value
    function get() public view returns (uint256) \{
        return number;
    \}
\}
\`\`\`

5. Next, let's compile the \`Counter\` contract. Click on \`SOLIDITY COMPILER\` (on the left), choose a compiler version that matches the Solidity version of your contract (0.8.19 in this example, per the \`pragma\`), and click on \`Compile MyToken.sol\`. If it compiles correctly, you will see a green checkmark next to the compiler.

6. Before proceeding with deployment, thoroughly test your smart contracts for correctness, as mistakes can lead to unforeseen gas costs. As an example, click on the \`tests\` folder and open \`MyToken.sol\`. Test the contract (without making changes) by selecting \`SOLIDITY UNIT TESTING\` (the two ticks in the bar on the left) and clicking \`Run\`.


7. As expected, the tests failed because we changed the contract inside \`MyToken.sol\`. This is Test Driven Development (TDD) in action! Let's now test the \`Counter\` itself by adding assertions for the \`increment()\` and \`decrement()\` functions. Replace the contents of the test file in the \`tests\` folder with the code below. We set the initial value of \`number\` to 2, then \`increment\` and \`decrement\` it by 1, expecting it to increase to 3 and then decrease back to 2.
\`\`\`solidity
// SPDX-License-Identifier: MIT
pragma solidity >=0.7.0 <0.9.0;

import "remix_tests.sol"; // injected by Remix; provides the Assert library
import "../contracts/MyToken.sol"; // the file now containing the Counter contract

contract CounterTest \{
    Counter public counter;

    // Runs once before the test functions below
    function beforeAll() public \{
        counter = new Counter();
        counter.setNumber(2);
    \}

    function testIncrement() public \{
        counter.increment();
        Assert.equal(counter.number(), 3, "test increment did not match");
    \}

    function testDecrement() public \{
        counter.decrement();
        Assert.equal(counter.number(), 2, "test decrement did not match");
    \}
\}
\`\`\`

8. All tests are now passing, meaning our smart contract \`Counter\` is working as expected. We're now ready to deploy it!

9. Click on the \`DEPLOY AND RUN TRANSACTIONS\` tab (on the left) to deploy. Remix allows you to use one of their existing EVMs or inject your own provider through its integration with MetaMask. Since we already have a [MetaMask account set up](/evm/metamask), let's use this option.

10. After ensuring the network you're connected to is the Autonomys EVM, confirm your MetaMask password when prompted.

11. Adjust the gas limit and deploy your smart contract on the Autonomys EVM domain. Your transaction is now recorded, and you can interact with your smart contract at the bottom of the page by calling \`setNumber()\`, \`increment()\` and \`decrement()\`.
> *Note:* Do not tip when submitting transactions in an attempt to accelerate them as this could result in dual charges for gas fees. When deploying smart contracts to our Auto EVM domain, you may encounter an error related to gas estimation, typically presenting as: \`"No manual gas limit set"\` or \`"Gas estimation failed"\`. For more information and solutions, visit the [Auto EVM Introduction](/evm/introduction).

Congratulations, you've successfully deployed your smart contract on the Auto EVM!
---
### File: evm/safe.mdx
# Safe Multi-Signature Wallet
Safe (formerly Gnosis Safe) is now available on the Autonomys network, providing industry-leading multi-signature wallet functionality for secure asset management and decentralized operations.
## Overview
Safe is a battle-tested smart contract wallet infrastructure that enables multi-signature security across blockchain networks. With its deployment on Autonomys, teams, DAOs, and individual users can now benefit from enhanced security features and shared asset control on both our mainnet and testnet environments.
## Network Availability
Safe is currently deployed and accessible at: [https://safe.autonomys.xyz](https://safe.autonomys.xyz/welcome)
### Transaction Service
The dedicated Transaction Service is available at [https://transaction.safe.autonomys.xyz](https://transaction.safe.autonomys.xyz/)
## Key Features
Safe on Autonomys provides the following capabilities:
- **Multi-signature security**: Require multiple signatures for transaction approval
- **Shared asset control**: Enable collaborative fund management across team members
- **dApp integration**: Interact with decentralized applications using your Safe wallet
- **Smart contract deployment**: Deploy and manage contracts with enhanced security
- **Customizable policies**: Set spending limits and approval requirements
## Use Cases
Safe is ideal for:
- **Teams and organizations**: Secure collaborative fund management
- **DAOs**: Decentralized governance and treasury management
- **Project treasuries**: Multi-party control over project funds
- **Individual users**: Enhanced security for high-value assets
## Getting Started
### Creating a Safe Wallet
1. Visit [safe.autonomys.xyz](https://safe.autonomys.xyz/welcome)

2. Connect your wallet

3. Create a new Safe by specifying:
- Wallet name
- Owner addresses (signers)
- Confirmation threshold (required signatures)

4. Deploy your Safe wallet
### Managing Your Safe
Once created, you can:
- Add or remove owners
- Change signature requirements
- Execute transactions
- Interact with dApps
- View transaction history
---
### File: evm/the_graph.mdx
---
title: The Graph
---
# The Graph
> **Note:** Available on both **Mainnet** and **Chronos** testnet!
[The Graph](https://thegraph.com/) is an indexing protocol that provides an easy way to query blockchain data through decentralized APIs known as subgraphs.
With The Graph, developers can benefit from:
- Decentralized Indexing: Enables indexing blockchain data through multiple indexers, thus eliminating any single point of failure
- GraphQL Queries: Provides a powerful GraphQL interface for querying indexed data, making data retrieval super simple.
- Customization: Define your own logic for transforming & storing blockchain data. Reuse subgraphs published by other developers on The Graph Network.
Follow this quick-start guide to create, deploy, and query a subgraph within 5 minutes.
### Benefits
Indexers like The Graph create a queryable data layer specifically tracking events and state changes from your smart contract. By monitoring and organizing this contract data, they enable complex queries that would be impossible through direct blockchain calls. This allows your dApp to efficiently access historical transactions, track relationships between entities, and analyze patterns over time - all without multiple RPC calls.
### Prerequisites
- A crypto wallet
- A smart contract address on a supported network
- Node.js installed
- A package manager of your choice (npm, yarn or pnpm)
### Quick Start guide
1. Proceed to the [Subgraph Studio](https://thegraph.com/studio) and log in via one of the supported wallets.

2. Confirm the sign-in request and sign a message to start using Studio.


3. Click on **Create a Subgraph** on the main dashboard.

4. Specify the **Subgraph Name**.

5. Upon creating a subgraph, you will see its dashboard, where you can add a project description, source code URL, and website URL.

6. On the right side, click on **Select a network** and pick **Autonomys** from the dropdown list.

7. Install the **Graph CLI** using the provided commands. You can use \`yarn\`, \`npm\`, \`pnpm\`, or \`bun\`.
8. Initialize the subgraph by running \`graph init autonomys-subgraph\`. This will create a boilerplate subgraph project.
During the initialization, you will need to specify a few things:
- Network (Autonomys)
- Subgraph slug (feel free to use the default name)
- Directory to create the subgraph in (feel free to use the default directory)
- Smart contract address (the address of your smart contract deployed on the Autonomys Chronos EVM domain)
- ABI file (path)
- Start block (feel free to use the default value)
- Contract name
- Index contract events as entities (true or false value)

> Tip: you can find the contract ABI in the contract metadata (\`token_metadata\`) file.
E.g. if you're using the Remix IDE, find your contract's metadata file, locate the \`abi\` field, copy the entire \`abi\` array including the square brackets **[]**, and save it as a \`json\` file on your PC.


9. Run \`graph auth\` to authenticate the project and provide your deploy key which you can find on the **Subgraph Dashboard**.
10. Enter the directory with \`cd project_name\` and run \`graph codegen && graph build\` to build the project. You will see the \`build/subgraph.yaml\` file being created.

11. Deploy your subgraph by running \`graph deploy autonomys-subgraph\` (use the subgraph slug you created in Subgraph Studio). You will be asked to specify the version of the subgraph, e.g. \`v0.0.1\`.

12. Proceed back to the **Subgraph Studio**; the status will change to **Deployed**.

13. The **Playground**, **Endpoints**, and **Logs** tabs are now accessible! They include examples of querying the data and accessing it via the API, and you can test queries in the **Playground** without leaving the Studio.

Congratulations, you've successfully set up and configured a subgraph for your application!
### Querying data in your smart contract
Subgraphs are primarily designed to index and query events emitted by smart contracts.
1. Let's make some slight changes to the \`Counter\` contract we introduced in the [Foundry guide](/evm/foundry) and modify its functions to emit events.
\`\`\`solidity
// SPDX-License-Identifier: UNLICENSED
pragma solidity ^0.8.13;

contract Counter \{
    uint256 public number;

    event NumberSet(uint256 newNumber);
    event NumberIncremented(uint256 newNumber);
    event NumberDecremented(uint256 newNumber);

    function setNumber(uint256 newNumber) public \{
        number = newNumber;
        emit NumberSet(number);
    \}

    function increment() public \{
        number++;
        emit NumberIncremented(number);
    \}

    function decrement() public \{
        number--;
        emit NumberDecremented(number);
    \}
\}
\`\`\`
> Note: This changes the contract ABI, so make sure to update the ABI and upgrade your subgraph accordingly.
2. Then we'll change the \`Counter\` contract state and set the number to \`5\` by running \`cast send YOUR_CONTRACT_ADDRESS "setNumber(uint256)" 5 --rpc-url https://auto-evm.chronos.autonomys.xyz/ws --private-key YOUR_KEY\`
3. Let's trigger one more event in the \`Counter\` contract by calling the \`increment()\` function. We can do that by running \`cast send YOUR_CONTRACT_ADDRESS "increment()" --rpc-url https://auto-evm.chronos.autonomys.xyz/ws --private-key YOUR_KEY\`
4. With two events emitted, let's proceed to **The Graph** playground and **query them**!
5. Open the **Graph Playground** tab where you try running the following queries:
#### Query to Get All NumberSet Events
\`\`\`graphql
\{
  numberSets(first: 10, orderBy: blockTimestamp, orderDirection: desc) \{
    id
    newNumber
    blockTimestamp
    blockNumber
    transactionHash
  \}
\}
\`\`\`
#### Query to Get All Increment Events
\`\`\`graphql
\{
  numberIncrementeds(first: 10, orderBy: blockTimestamp, orderDirection: desc) \{
    id
    newNumber
    blockTimestamp
    blockNumber
    transactionHash
  \}
\}
\`\`\`
#### Get All Events (Set, Increment, Decrement) in Chronological Order
\`\`\`graphql
\{
  numberSets(orderBy: blockTimestamp) \{
    id
    newNumber
    blockTimestamp
    blockNumber
    transactionHash
    __typename
  \}
  numberIncrementeds(orderBy: blockTimestamp) \{
    id
    newNumber
    blockTimestamp
    blockNumber
    transactionHash
    __typename
  \}
  numberDecrementeds(orderBy: blockTimestamp) \{
    id
    newNumber
    blockTimestamp
    blockNumber
    transactionHash
    __typename
  \}
\}
\`\`\`
> Tip: you can also run queries directly from the terminal using \`cURL\` or \`gql\`. You can find **Example Usage** under the **Endpoints** tab.
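As an illustration, here is a minimal TypeScript sketch that posts one of the queries above to your subgraph's query URL. The URL below is a placeholder, not a live endpoint: copy the real one from the **Endpoints** tab, and note that, depending on your setup, you may also need to include an API key.
\`\`\`typescript
// Replace with the query URL shown under the Endpoints tab in Subgraph Studio
const SUBGRAPH_QUERY_URL = 'YOUR_SUBGRAPH_QUERY_URL';

const query = \`\{
  numberSets(first: 10, orderBy: blockTimestamp, orderDirection: desc) \{
    id
    newNumber
    blockTimestamp
    transactionHash
  \}
\}\`;

(async () => \{
  const response = await fetch(SUBGRAPH_QUERY_URL, \{
    method: 'POST',
    headers: \{ 'Content-Type': 'application/json' \},
    body: JSON.stringify(\{ query \}),
  \});
  const \{ data \} = await response.json();
  console.log(data.numberSets);
\})();
\`\`\`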
---
### File: evm/transaction_fees.mdx
---
title: Account Balances and Transaction Fees on Auto EVM
---
# Account Balances and Transaction Fees on Auto EVM
## Introduction
While Auto EVM strives to maintain EVM compatibility, there are some important differences that developers should understand regarding account balances and transaction fees. These differences stem from Auto EVM's Substrate-based architecture and affect how balances are managed and fees are calculated.
## Key Concepts
### Existential Deposit (ED)
The Existential Deposit is a minimum balance threshold that affects how accounts handle native token balances on the network.
- Current ED value: 0.000001 TAI3
- For user accounts (EOAs):
- Accounts that drop below ED are reaped (removed from state)
- Prevents dust accounts and maintains network efficiency
- For smart contracts:
- ED only impacts contracts that handle native token balances
- Many contracts (e.g., ERC20 tokens) don't interact with native tokens and are unaffected
- When a contract's balance exceeds ED, that amount becomes reserved
### Consensus Storage Fees
In addition to standard gas fees, Auto EVM implements consensus storage fees:
- Applied to transactions that modify state
- Separate from execution fees
- Deducted from the sender's account
- Scale with the size of call data
- Not currently visible in BlockScout UI
## Impact on Smart Contracts
### Balance Management
1. Contracts with native token balances cannot be fully emptied due to ED requirements
> **Note:** This affects contracts that maintain native token balances (e.g., faucets, DEX contracts)
2. The maximum withdrawable amount from the account is: \`total_balance - ED\`
### Example Scenarios
**Deposit and Withdrawal:**
\`\`\`
Initial deposit: 0.01 TAI3
Available for withdrawal: 0.009999 TAI3 (due to ED)
Contract state balance: 0.01 TAI3
\`\`\`
**Fee Deduction:**
- Transaction fees (execution + storage) are deducted from the sender's account
- The contract's internal accounting may show different values than the actual on-chain balance
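**Estimating the withdrawable balance:**
As a rough sketch (assuming 18 decimals per token, so the 0.000001 ED equals 10^12 Shannon), you can estimate an account's maximum withdrawable amount with the Auto SDK's \`balance\` helper:
\`\`\`typescript
import \{ activate \} from '@autonomys/auto-utils';
import \{ balance \} from '@autonomys/auto-consensus';

// Assumption: 18 decimals per token, so an ED of 0.000001 equals 10^12 Shannon
const EXISTENTIAL_DEPOSIT = 10n ** 12n;

(async () => \{
  const api = await activate(\{ networkId: 'chronos' \});
  const accountBalance = await balance(api, 'your_address');
  const free = BigInt(accountBalance.free.toString());
  // Leave the ED in place; transaction fees (execution + storage) reduce this further
  const maxWithdrawable = free > EXISTENTIAL_DEPOSIT ? free - EXISTENTIAL_DEPOSIT : 0n;
  console.log(\`Max withdrawable (before fees): \$\{maxWithdrawable\} Shannon\`);
  await api.disconnect();
\})();
\`\`\`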
## Best Practices
### Smart Contract Development
1. Account for ED in withdrawal functions:
- Always leave the ED amount in the contract
- Check for sufficient balance above ED before transfers
2. Fee Handling:
- Consider both execution and storage fees in transaction planning
- Add buffer for storage fees in critical operations
### Testing Considerations
1. Test withdrawal edge cases:
- Attempt to withdraw full balance
- Verify behavior with amounts close to ED
- Check contract state after failed withdrawals
2. Balance Verification:
- Use RPC calls to verify actual available balance
- Compare contract state with on-chain state
- Account for ED in balance calculations
## Differences from Ethereum
1. Balance Management:
- Ethereum: No minimum balance requirement
- Auto EVM: Requires ED maintenance
2. Fee Structure:
- Ethereum: Gas fees only
- Auto EVM: Gas fees + consensus storage fees
3. Contract Behavior:
- Ethereum: Contracts can be fully emptied
- Auto EVM: ED must remain in contracts
## Tools and Verification
### Checking Balances
To verify actual available balances:
- Use Polkadot.js interface
- Query contract state directly
- Account for ED and storage fees in calculations
### Transaction Planning
When planning transactions:
1. Calculate required fees (execution + storage)
2. Ensure sufficient balance above ED
3. Add buffer for potential storage fee variations
## Common Issues and Solutions
1. "Insufficient Balance" errors:
- Check if attempting to withdraw below ED
- Verify sufficient balance for fees
- Consider storage fees in calculations
2. Balance Discrepancies:
- Compare BlockScout UI with actual RPC calls
- Account for storage fees in calculations
- Remember ED requirements
---
### File: evm/wrapping_ai3.mdx
# How to Wrap AI3 Tokens (Mainnet and Chronos Testnet)
Native AI3 tokens on Auto EVM are similar to ETH on the Ethereum network: they are not ERC-20 tokens, which some use cases require.
Fortunately, there are ERC-20 contracts on both the mainnet and Chronos testnet Auto EVM that allow you to "wrap" your tokens. This guide walks you through how to do that.
## The Contracts
### Mainnet WAI3 Contract
**Mainnet** WAI3 Contract: \`0x7ba06C7374566c68495f7e4690093521F6B991bb\`.
View contract: https://explorer.auto-evm.mainnet.autonomys.xyz/token/0x7ba06C7374566c68495f7e4690093521F6B991bb
### Chronos Testnet WAI3 Contract
**Chronos Testnet** WAI3 Contract: \`0xeAb23556Ec571bA10F4C3C8051d719E58e921caC\`.
View contract: https://explorer.auto-evm.chronos.autonomys.xyz/token/0xeAb23556Ec571bA10F4C3C8051d719E58e921caC
## Wrapping AI3 for WAI3
We will use MetaMask for this guide, though any EVM-compatible wallet should work. First, make sure you have the appropriate Auto EVM network added to MetaMask.
\`\`\`
Network Name: Autonomys EVM
New RPC URL: https://auto-evm.mainnet.autonomys.xyz/ws
Chain ID: 870
Currency Symbol: AI3
\`\`\`
Auto EVM is also available on the Chronos testnet:
\`\`\`
Network Name: Autonomys EVM
New RPC URL: https://auto-evm.chronos.autonomys.xyz/ws
Chain ID: 8700
Currency Symbol: tAI3
\`\`\`
The quickest way to add the network is to use this button at the bottom of Blockscout (use the contract link above):

Once that's done you should be able to see your Auto EVM tAI3 balance on MetaMask when you select the network.

To convert some tAI3 to WAI3 you just need to send it to the contract. You will see a warning that you are about to send to a token contract. This is expected.

Once you've accepted the warning you can click on "Continue".
> *Note:* You may need to increase the gas fee if the transaction fails. If all has gone well, you should see the contract interaction in the MetaMask Activity tab.

To view your WAI3 you will need to add the token contract by importing it in MetaMask.

Enter the WAI3 contract address and you will be able to see your balance under the Tokens tab in MetaMask.

## Unwrapping WAI3
To unwrap your WAI3 tokens (convert them back to the original tAI3 tokens), you'll need to interact with the \`withdraw\` function in the **[WAI3 contract](#the-contracts)**. Here's how to do it.
First, make sure you have your wallet holding the WAI3 connected to the contract linked above. Ensure you are on the "Write" tab of the contract and look for the \`withdraw\` method.

Enter the amount you want to unwrap as a parameter. The value entered needs to be in Shannons - the Autonomys equivalent of Ethereum's wei. The Blockscout UI has a button \`10^18\` to automatically add the extra 18 zeroes required to convert between the two.

Once you're ready, hit the "Write" button and approve the transaction in your wallet. You should now see that the WAI3 has been converted back to native tAI3.
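If you prefer to wrap and unwrap programmatically, here is a minimal sketch using a generic EVM library such as ethers (v6 assumed) with the Chronos values from this guide; the \`withdraw(uint256)\` fragment mirrors the Write tab shown above.
\`\`\`typescript
import \{ ethers \} from 'ethers';

// Chronos testnet values from this guide; replace the private key with your own
const RPC_URL = 'https://auto-evm.chronos.autonomys.xyz/ws';
const WAI3_ADDRESS = '0xeAb23556Ec571bA10F4C3C8051d719E58e921caC';

(async () => \{
  const provider = new ethers.JsonRpcProvider(RPC_URL);
  const wallet = new ethers.Wallet('YOUR_PRIVATE_KEY', provider);

  // Wrap: send native tAI3 directly to the WAI3 contract
  const wrapTx = await wallet.sendTransaction(\{ to: WAI3_ADDRESS, value: ethers.parseEther('1') \});
  await wrapTx.wait();

  // Unwrap: call withdraw(amount); the amount is in Shannons (10^18 per tAI3)
  const wai3 = new ethers.Contract(WAI3_ADDRESS, ['function withdraw(uint256 amount)'], wallet);
  const unwrapTx = await wai3.withdraw(ethers.parseEther('1'));
  await unwrapTx.wait();
\})();
\`\`\`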


---
### File: index.mdx
---
title: Auto SDK - Build with Autonomys Network
description: Your gateway to building on the Autonomys Network. Streamline dApp development with modular packages and decentralized solutions.
---
import \{ SDKHeader \} from '../components/landingPage.js'
import \{ SDKOverview \} from '../components/landingPage.js'
import \{ SDKFeatures \} from '../components/landingPage.js'
import \{ SDKResources \} from '../components/landingPage.js'
import \{ SDKCTA \} from '../components/landingPage.js'
<SDKHeader />
<SDKOverview />
<SDKFeatures />
<SDKResources />
<SDKCTA />
---
### File: introduction.mdx
---
title: Introduction
---
## Autonomys Developer Hub
### Introduction
The [**Autonomys Network**](https://autonomys.xyz) is the *only* blockchain that resolves the blockchain trilemma without compromise. Autonomys offers a platform for new and experienced developers to build secure, scalable AI-powered decentralized applications (**super dApps**) and unstoppable, verifiable on-chain agents (**Auto Agents**) with ease, leveraging familiar tools and innovative protocols.
This guide offers an overview of the development options available on the Autonomys Network.
### Is building on Autonomys difficult?
Absolutely not! One of the project's primary objectives is to minimize the barriers to entry for all network participants, from farmers to developers. Building on Autonomys is thus very straightforward, with active support from our team and community available if you ever run into trouble.
Join the Autonomys Discord server, tell us about your idea, and the team will help you see it through to deployment—no project is too ambitious! We pride ourselves on enabling the seemingly un-enableable.
### Development options
Developers can currently choose to build with:
- the **Auto SDK**: a comprehensive toolkit that simplifies development on the Autonomys Network, or
- the **Auto EVM**: Autonomys's EVM-compatible domain, allowing you to utilize all its familiar functionalities.
You will soon be able to build your own local custom virtual machines.
### Transaction Security
The Autonomys Network uses a nonce system to prevent transaction replay attacks:
- Each account has an integer nonce value that increases by one with every on-chain transaction
- Signed transactions must include the correct nonce value to be accepted
- Transactions with incorrect nonce values are rejected, preventing replay attacks
- For security, new accounts' default nonce value is set to the current block number (rather than 0)
- This ensures if an account is reaped and later re-created, previous transactions remain invalid
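As an illustration, you can read an account's current nonce with the Auto SDK's \`auto-consensus\` package (covered later in this guide); a minimal sketch:
\`\`\`typescript
import \{ activate \} from '@autonomys/auto-utils';
import \{ account \} from '@autonomys/auto-consensus';

(async () => \{
  const api = await activate(\{ networkId: 'chronos' \});
  // The nonce is the value the account's next signed transaction must include
  const accountData = await account(api, 'your_address');
  console.log(\`Current nonce: \$\{accountData.nonce\}\`);
  await api.disconnect();
\})();
\`\`\`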
## [Auto SDK](/sdk)
### Key features:
- **Modular Packages**: Partitions functionality intuitively into packages including \`auto-utils\`, \`auto-consensus\`, and \`auto-id\`, allowing you to import only what you need.
- **Simplified Interaction**: Abstracts the complexity of blockchain operations with high-level functions.
- **Flexibility**: Suitable for building AI-powered decentralized applications (super dApps) and on-chain agents (Auto Agents) with ease.
### Getting started with the Auto SDK
To start using the Auto SDK, refer to our [setup instructions](/sdk) to clone the repository, install dependencies, and run tests. The Auto SDK is designed to be developer-friendly, even for first-time developers, ensuring that anyone can get up and running quickly.
## [Auto EVM](/evm/introduction)
### Key features:
- **Solidity Smart Contracts**: Write and deploy smart contracts using Solidity, just as you would on Ethereum.
- **Tool Compatibility**: Compatible with any tool available for Ethereum development, including Remix, Truffle, and Hardhat.
- **Familiar Environment**: Leverage existing knowledge and resources from the Ethereum ecosystem to build your application.
### Why use the Auto EVM?
- **Seamless Transition**: Developers familiar with Ethereum can transition to Autonomys without a steep learning curve.
- **Scalability and Security**: Benefit from Autonomys' scalable and secure infrastructure while using tried-and-true development practices.
- **Interoperability**: Create applications that can interact seamlessly with other EVM-compatible networks.
### Getting started with the Auto EVM
To start using the Auto EVM, refer to our [guides and manuals](/evm/introduction).
## RPC endpoints
### Consensus
#### Mainnet
- \`wss://rpc.mainnet.subspace.foundation/ws\`
#### Chronos Testnet
- \`wss://rpc.chronos.autonomys.xyz/ws\`
### Auto EVM
#### Mainnet
- \`wss://auto-evm.mainnet.autonomys.xyz/ws\`
#### Chronos Testnet
- \`wss://auto-evm.chronos.autonomys.xyz/ws\`
## Questions or feedback? Post on our [forum](https://forum.autonomys.xyz/) or in the *#developer-chat* channel on our [Discord](https://discord.gg/EAw6B48r).
To access the developer role-gated *#developer-chat* channel:
1. Join our [Discord](https://discord.gg/vhv5cEZN).
2. Click on Autonomys Network in the top-left corner and choose **Linked Roles**.

3. Link your GitHub account to attain the developer role and gain access to *#developer-chat*.

---
### File: sdk/auto-consensus.mdx
import AutoConsensusFetchBalance from '/components/autoConsensusFetchBalance.js';
## Auto-Consensus Package
### Introduction
The \`@autonomys/auto-consensus\` package provides functions for interacting with the consensus layer of the Autonomys Network. It allows developers to perform actions involving account management, balance inquiries, transfers, staking operations, and more. \`@autonomys/auto-consensus\` works hand-in-hand with \`@autonomys/auto-utils\` to simplify blockchain interactions.
### Installation
Install the package via \`npm\` or \`yarn\`:
\`\`\`bash
# Using npm
npm install @autonomys/auto-consensus
# Using yarn
yarn add @autonomys/auto-consensus
\`\`\`
### Importing
Import the \`auto-consensus\` functions you need into your project:
\`\`\`typescript
// Import specific functions
import \{ balance, transfer, account \} from '@autonomys/auto-consensus';
// Or import everything
import * as consensus from '@autonomys/auto-consensus';
\`\`\`
### Overview of the \`api\` object
Many functions in the \`auto-consensus\` package require an \`api\` object as a parameter. This \`api\` object is an instance of \`ApiPromise\` from the Polkadot.js API library that serves as a gateway for interacting with the blockchain node.
#### \`api\` core components
> *Note:* Always disconnect the API instance after your operations are complete to free up resources.
- \`api.rpc\`: Methods to perform remote procedure calls to the node.
- \`api.query\`: Access the blockchain's runtime storage.
- \`api.tx\`: Create and submit extrinsics (transactions) to the blockchain.
- \`api.consts\`: Runtime constants defined in the blockchain's metadata.
- \`api.events\`: Access events emitted by the blockchain.
- \`api.types\`: Type definitions used by the chain.
#### Example
\`\`\`typescript
import \{ createConnection \} from '@autonomys/auto-utils';

async function getApiInstance() \{
  // Use a consensus-layer RPC endpoint (Chronos testnet in this example)
  const endpoint = 'wss://rpc.chronos.autonomys.xyz/ws';
  const api = await createConnection(endpoint);
  return api;
\}
\`\`\`
### Available functions
> *Note:* All asynchronous functions return a \`Promise\` and should be used with \`await\` for proper execution flow. Wrap your asynchronous calls in \`try...catch\` blocks to handle potential errors gracefully.
#### Account management
- \`account(api, address): Promise<AccountData>\`: Retrieves an account's nonce and balance data.
- \`balance(api, address): Promise<BalanceData>\`: Retrieves an account's balance details.
#### Balances
- \`totalIssuance(networkId?): Promise<BigInt>\`: Retrieves the total token issuance on the network.
- \`batch(api, txs[]): SubmittableExtrinsic\`: Creates a batch transaction for multiple operations.
#### Blockchain information
- \`block(api): Promise<RawBlock>\`: Retrieves the latest block data.
- \`header(api): Promise<RawBlockHeader>\`: Retrieves the latest block header.
- \`blockHash(api): Promise<string>\`: Retrieves the latest block hash.
- \`blockNumber(api): Promise<number>\`: Retrieves the current block number.
- \`networkTimestamp(api): Promise<bigint>\`: Retrieves the network timestamp.
#### Consensus information
- \`blockchainSize(api): Promise<bigint>\`: Calculates the blockchain's total size.
- \`spacePledged(api): Promise<bigint>\`: Calculates the total space pledged by farmers.
- \`solutionRanges(api): Promise<SolutionRanges>\`: Retrieves the current and next solution ranges.
- \`shouldAdjustSolutionRange(api): Promise<boolean>\`: Checks if the solution range needs adjustment.
- \`segmentCommitment(api): Promise<[StorageKey<AnyTuple>, Codec][]>\`: Retrieves segment commitment entries.
- \`slotProbability(api): [number, number]\`: Returns slot probability constants.
- \`maxPiecesInSector(api): bigint\`: Returns the maximum pieces in a sector.
#### Domains
- \`domainStakingSummary(api): Promise<DomainStakingSummary[]>\`: Retrieves domain staking summaries.
- \`domains(api): Promise<DomainRegistry[]>\`: Retrieves domain registries.
- \`latestConfirmedDomainBlock(api): Promise<ConfirmedDomainBlock[]>\`: Retrieves the latest confirmed blocks per domain.
#### Operators and staking
- \`operators(api): Promise<Operator[]>\`: Retrieves a list of all operators.
- \`operator(api, operatorId): Promise<OperatorDetails>\`: Retrieves the details of a specific operator.
- \`deposits(api, operatorId, account?): Promise<Deposit[]>\`: Retrieves the deposits for a specific operator.
- \`withdrawals(api, operatorId, account?): Promise<Withdrawal[]>\`: Retrieves the withdrawals for a specific operator.
- \`registerOperator(params): SubmittableExtrinsic\`: Creates a transaction to register a new operator.
- \`nominateOperator(params): SubmittableExtrinsic\`: Creates a transaction to nominate an operator.
- \`withdrawStake(params): SubmittableExtrinsic\`: Creates a transaction to withdraw staked tokens.
- \`deregisterOperator(params): SubmittableExtrinsic\`: Creates a transaction to deregister an operator.
- \`unlockFunds(params): SubmittableExtrinsic\`: Creates a transaction to unlock staked funds.
- \`unlockNominator(params): SubmittableExtrinsic\`: Creates a transaction to unlock nominator funds.
#### Transfers
- \`transfer(api, receiver, amount, allowDeath?): SubmittableExtrinsic\`: Creates a transaction to transfer funds.
- \`transferAll(api, receiver, keepAlive?): SubmittableExtrinsic\`: Creates a transaction to transfer all tokens.
#### Utility functions
- \`query<T>(api, methodPath, params?): Promise<T>\`: Queries the blockchain state for a method.
- \`remark(api, remark, withEvent?): SubmittableExtrinsic\`: Creates a remark transaction.
- \`rpc<T>(api, methodPath, params?): Promise<T>\`: Performs an RPC call.
## Interactive usage example
### Fetching the wallet balance (unlocked, locked) and nonce for a wallet
<AutoConsensusFetchBalance />
## Usage examples
Code examples demonstrating how to use the functions provided by \`@autonomys/auto-consensus\`.
### 1. Account management
#### Retrieve detailed account information (including the nonce and balance data)
\`\`\`typescript
import \{ account \} from '@autonomys/auto-consensus';
import \{ activate \} from '@autonomys/auto-utils';
(async () => \{
const api = await activate(\{ networkId: 'chronos' \});
const accountData = await account(api, 'your_address');
console.log(\`Nonce: \$\{accountData.nonce\}\`);
console.log(\`Free Balance: \$\{accountData.data.free\}\`);
console.log(\`Reserved Balance: \$\{accountData.data.reserved\}\`);
await api.disconnect();
\})();
\`\`\`
#### Activate a wallet and check its balance
\`\`\`typescript
import \{ activateWallet \} from '@autonomys/auto-utils';
import \{ balance \} from '@autonomys/auto-consensus';
(async () => \{
// Activate a wallet using a mnemonic phrase
const \{ api, accounts \} = await activateWallet(\{
mnemonic: 'your mnemonic phrase here', // Replace with your mnemonic
networkId: 'chronos', // Optional: specify the network ID
\});
const account = accounts[0];
console.log(\`Connected with account address: \$\{account.address\}\`);
// Check the account balance
const accountBalance = await balance(api, account.address);
console.log(\`Account balance: \$\{accountBalance.free\}\`);
// Disconnect when done
await api.disconnect();
\})();
\`\`\`
### 2. Balance operations
#### Retrieve the free balance of an account
\`\`\`typescript
import \{ balance \} from '@autonomys/auto-consensus';
import \{ activate \} from '@autonomys/auto-utils';
(async () => \{
const api = await activate(\{ networkId: 'chronos' \});
const accountBalance = await balance(api, 'your_address');
console.log(\`Free Balance: \$\{accountBalance.free\}\`);
await api.disconnect();
\})();
\`\`\`
#### Retrieve the total token issuance on the network
\`\`\`typescript
import \{ totalIssuance \} from '@autonomys/auto-consensus';
(async () => \{
const total = await totalIssuance('chronos');
console.log(\`Total Issuance: \$\{total.toString()\}\`);
\})();
\`\`\`
### 3. Transfers
#### Transfer funds between accounts
\`\`\`typescript
import \{ activateWallet \} from '@autonomys/auto-utils';
import \{ transfer \} from '@autonomys/auto-consensus';
(async () => \{
// Activate the sender's wallet
const senderWallet = await activateWallet(\{
mnemonic: 'sender mnemonic phrase', // Replace with the sender's mnemonic
\});
const sender = senderWallet.accounts[0];
// Activate the receiver's wallet
const receiverWallet = await activateWallet(\{
mnemonic: 'receiver mnemonic phrase', // Replace with the receiver's mnemonic
\});
const receiver = receiverWallet.accounts[0];
// Transfer 1 AI3 from the sender to the receiver
const amount = 1; // Amount in AI3
const transferTx = await transfer(senderWallet.api, receiver.address, amount);
// Sign and send the transaction
await transferTx.signAndSend(sender, (\{ status, txHash, events \}) => \{
if (status.isInBlock) \{
console.log(\`Transaction included at blockHash \$\{status.asInBlock\}\`);
console.log(\`Transaction hash: \$\{txHash\}\`);
\} else if (status.isFinalized) \{
console.log(\`Transaction finalized at blockHash \$\{status.asFinalized\}\`);
\}
\});
// Disconnect when done
await senderWallet.api.disconnect();
await receiverWallet.api.disconnect();
\})();
\`\`\`
#### Transfer tokens from one wallet to another
\`\`\`typescript
import \{ transfer \} from '@autonomys/auto-consensus';
import \{ activateWallet, signAndSendTx, disconnect \} from '@autonomys/auto-utils';
(async () => \{
const \{ api, accounts \} = await activateWallet(\{
networkId: 'chronos',
mnemonic: 'your_mnemonic',
\});
const sender = accounts[0];
const recipientAddress = 'recipient_address';
const amount = '1000000000000'; // Amount in the smallest unit (Shannon)
const tx = await transfer(api, recipientAddress, amount);
// Sign and send the transaction
await signAndSendTx(sender, tx);
console.log(\`Transferred \$\{amount\} tokens to \$\{recipientAddress\}\`);
await disconnect(api);
\})();
\`\`\`
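#### Batch multiple operations into a single transaction
The \`batch\` helper listed above can combine several extrinsics into one submission. A minimal sketch based on the signatures documented on this page (adapt the addresses and amounts to your own):
\`\`\`typescript
import \{ activateWallet, signAndSendTx, disconnect \} from '@autonomys/auto-utils';
import \{ batch, transfer \} from '@autonomys/auto-consensus';

(async () => \{
  const \{ api, accounts \} = await activateWallet(\{
    networkId: 'chronos',
    mnemonic: 'your_mnemonic',
  \});
  const sender = accounts[0];
  // Build several transfers and submit them as a single batch extrinsic
  const tx1 = await transfer(api, 'first_recipient_address', '1000000000000');
  const tx2 = await transfer(api, 'second_recipient_address', '1000000000000');
  const batchTx = await batch(api, [tx1, tx2]);
  // Sign and send the batch
  await signAndSendTx(sender, batchTx);
  await disconnect(api);
\})();
\`\`\`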
### 4. Staking operations
#### Register a new operator for staking
\`\`\`typescript
import \{ registerOperator \} from '@autonomys/auto-consensus';
import \{ activateWallet, signAndSendTx \} from '@autonomys/auto-utils';
(async () => \{
const \{ api \} = await activateWallet(\{
networkId: 'chronos',
mnemonic: 'sender_mnemonic',
\});
// Sender's account (who is registering the operator)
const \{ accounts: senderAccounts \} = await activateWallet(\{
networkId: 'chronos',
mnemonic: 'sender_mnemonic',
\});
const sender = senderAccounts[0];
// Operator's account
const \{ accounts: operatorAccounts \} = await activateWallet(\{
networkId: 'chronos',
mnemonic: 'operator_mnemonic',
\});
const operatorAccount = operatorAccounts[0];
const tx = await registerOperator(\{
api,
senderAddress: sender.address,
Operator: operatorAccount,
domainId: '0', // Domain ID where the operator will be registered
amountToStake: '1000000000000000000', // Amount in smallest units
minimumNominatorStake: '10000000000000000',
nominationTax: '5', // Percentage as a string (e.g., '5' for 5%)
\});
// Sign and send the transaction
await signAndSendTx(sender, tx);
console.log('Operator registered successfully');
\})();
\`\`\`
#### Nominate an existing operator by staking tokens
\`\`\`typescript
import \{ nominateOperator \} from '@autonomys/auto-consensus';
import \{ activateWallet, signAndSendTx \} from '@autonomys/auto-utils';
(async () => \{
const \{ api, accounts \} = await activateWallet(\{
networkId: 'chronos',
mnemonic: 'nominator_mnemonic',
\});
const nominator = accounts[0];
const operatorId = '1'; // The ID of the operator to nominate
const amountToStake = '5000000000000000000'; // Amount in smallest units
const tx = await nominateOperator(\{
api,
operatorId,
amountToStake,
\});
// Sign and send the transaction
await signAndSendTx(nominator, tx);
console.log(\`Nominated operator \$\{operatorId\} with \$\{amountToStake\} stake\`);
\})();
\`\`\`
### 5. Blockchain information
#### Retrieve the current block number, block hash, and network timestamp
\`\`\`typescript
import \{ blockNumber, blockHash, networkTimestamp \} from '@autonomys/auto-consensus';
import \{ activate \} from '@autonomys/auto-utils';
(async () => \{
const api = await activate(\{ networkId: 'chronos' \});
const currentBlockNumber = await blockNumber(api);
const currentBlockHash = await blockHash(api);
const currentTimestamp = await networkTimestamp(api);
console.log(\`Current Block Number: \$\{currentBlockNumber\}\`);
console.log(\`Current Block Hash: \$\{currentBlockHash\}\`);
console.log(\`Network Timestamp: \$\{currentTimestamp\}\`);
await api.disconnect();
\})();
\`\`\`
### 6. Domain interactions
#### Retrieve the list of domains registered on the network
\`\`\`typescript
import \{ domains \} from '@autonomys/auto-consensus';
import \{ activate \} from '@autonomys/auto-utils';
(async () => \{
const api = await activate(\{ networkId: 'chronos' \});
const domainList = await domains(api);
domainList.forEach((domain) => \{
console.log(\`Domain ID: \$\{domain.id\}\`);
console.log(\`Owner Address: \$\{domain.owner\}\`);
console.log(\`Creation Block: \$\{domain.creationBlock\}\`);
// ...other domain properties
\});
await api.disconnect();
\})();
\`\`\`
#### Retrieve staking summaries for all domains
\`\`\`typescript
import \{ domainStakingSummary \} from '@autonomys/auto-consensus';
import \{ activate \} from '@autonomys/auto-utils';
(async () => \{
const api = await activate(\{ networkId: 'chronos' \});
const stakingSummaries = await domainStakingSummary(api);
stakingSummaries.forEach((summary) => \{
console.log(\`Domain ID: \$\{summary.domainId\}\`);
console.log(\`Total Stake: \$\{summary.totalStake\}\`);
// ...other summary properties
\});
await api.disconnect();
\})();
\`\`\`
#### Retrieve the latest confirmed blocks for each domain
\`\`\`typescript
import \{ latestConfirmedDomainBlock \} from '@autonomys/auto-consensus';
import \{ activate \} from '@autonomys/auto-utils';
(async () => \{
const api = await activate(\{ networkId: 'chronos' \});
const confirmedBlocks = await latestConfirmedDomainBlock(api);
confirmedBlocks.forEach((blockInfo) => \{
console.log(\`Domain ID: \$\{blockInfo.id\}\`);
console.log(\`Block Number: \$\{blockInfo.number\}\`);
console.log(\`Block Hash: \$\{blockInfo.hash\}\`);
// ...other block properties
\});
await api.disconnect();
\})();
\`\`\`
---
### File: sdk/auto-dag-data.mdx
# Auto DAG Data Package
## Introduction
The \`@autonomys/auto-dag-data\` package provides utilities for creating and managing IPLD DAGs (InterPlanetary Linked Data Directed Acyclic Graphs) for files and folders. It facilitates chunking large files, handling metadata, and creating folder structures suitable for distributed storage systems like IPFS.
> **Note**: This package is an ES Module package and is designed to work with ESM applications. Check [this tutorial](https://dev.to/mangadev/set-up-a-backend-nodejs-typescript-jest-using-es-modules-1530) for guidance on setting up an ES module application.
## Features
- **File Chunking and DAG Creation**: Efficiently split large files into smaller chunks and create IPLD DAGs.
- **Folder Structure Creation**: Generate IPLD DAGs for directory structures.
- **Metadata Handling**: Add and manage metadata for files and folders.
- **CID Management**: Utilities for working with Content Identifiers (CIDs).
- **TypeScript Support**: Fully typed for enhanced developer experience.
## Installation
Install the package via \`npm\` or \`yarn\`:
\`\`\`bash
# Using npm
npm install @autonomys/auto-dag-data
# Using yarn
yarn add @autonomys/auto-dag-data
\`\`\`
## Importing
Import the \`auto-dag-data\` functions you need into your project:
\`\`\`typescript
// Import specific functions
import \{ processFileToIPLDFormat, processFolderToIPLDFormat, cidOfNode \} from '@autonomys/auto-dag-data';
// Or import everything
import * as dagData from '@autonomys/auto-dag-data';
\`\`\`
## Available Functions
> **Note**: All asynchronous functions return a \`Promise\` and should be used with \`await\` for proper execution flow. Wrap asynchronous calls in \`try...catch\` blocks to handle potential errors gracefully.
### File Processing
- \`processFileToIPLDFormat(blockstore, fileStream, fileSize, fileName): CID\`: Creates an IPLD DAG from a file stream.
### Folder Processing
- \`processFolderToIPLDFormat(blockstore, childCIDs, folderName, folderSize): CID\`: Generates an IPLD DAG from a folder structure.
### CID Utilities
- \`cidOfNode(node): CID\`: Creates a CID from an IPLD node.
- \`cidToString(cid): string\`: Converts a CID to its string representation.
- \`stringToCid(cidString): CID\`: Parses a CID string back into a CID object.
### Node Operations
- \`encodeNode(node): Uint8Array\`: Encodes an IPLD node to bytes.
- \`decodeNode(bytes): IPLDNode\`: Decodes bytes back into an IPLD node.
### Metadata Management
- \`createMetadataNode(metadata): IPLDNode\`: Creates a metadata node from metadata object.
- \`processMetadataToIPLDFormat(blockstore, metadata): CID\`: Processes metadata into IPLD format.
## Usage Examples
### 1. Creating an IPLD DAG from a File
Process a file into IPLD format for distributed storage:
\`\`\`typescript
import \{ processFileToIPLDFormat, cidToString \} from '@autonomys/auto-dag-data';
import \{ MemoryBlockstore \} from 'blockstore-core/memory';
import fs from 'fs';
(async () => \{
const fileStream = fs.createReadStream('path/to/your/file.txt');
const fileSize = fs.statSync('path/to/your/file.txt').size;
const blockstore = new MemoryBlockstore();
const fileCID = await processFileToIPLDFormat(blockstore, fileStream, fileSize, 'file.txt');
const cidString = cidToString(fileCID);
console.log(\`File CID: \$\{cidString\}\`);
\})();
\`\`\`
### 2. Creating an IPLD DAG from a Folder
Generate an IPLD DAG from a directory structure:
\`\`\`typescript
import \{ processFolderToIPLDFormat, decodeNode, cidToString \} from '@autonomys/auto-dag-data';
import \{ MemoryBlockstore \} from 'blockstore-core/memory';
import \{ CID \} from 'multiformats';
(async () => \{
// Example child CIDs from files in the folder
const childCIDs: CID[] = [
/* array of CIDs from folder contents */
];
const folderName = 'my-folder';
const folderSize = 1024; // total size of all children
const blockstore = new MemoryBlockstore();
const folderCID = await processFolderToIPLDFormat(blockstore, childCIDs, folderName, folderSize);
const node = await decodeNode(await blockstore.get(folderCID));
const cidString = cidToString(folderCID);
console.log(\`Folder CID: \$\{cidString\}\`);
\})();
\`\`\`
### 3. Working with CIDs
Convert between CID objects and strings:
\`\`\`typescript
import \{ cidOfNode, cidToString, stringToCid \} from '@autonomys/auto-dag-data';
(async () => \{
// Assuming you have an IPLD node
const cid = cidOfNode(someNode);
// Convert CID to string for storage or transmission
const cidString = cidToString(cid);
console.log(\`CID as string: \$\{cidString\}\`);
// Parse string back into CID object
const parsedCID = stringToCid(cidString);
console.log('CID object:', parsedCID);
\})();
\`\`\`
### 4. Encoding and Decoding Nodes
Handle IPLD node serialization:
\`\`\`typescript
import \{ encodeNode, decodeNode \} from '@autonomys/auto-dag-data';
(async () => \{
// Encode a node for storage
const encodedNode = encodeNode(someNode);
console.log('Encoded node:', encodedNode);
// Decode node from storage
const decodedNode = decodeNode(encodedNode);
console.log('Decoded node:', decodedNode);
\})();
\`\`\`
### 5. Handling Metadata
Create and process metadata for your files:
\`\`\`typescript
import \{ createMetadataNode, processMetadataToIPLDFormat, cidToString \} from '@autonomys/auto-dag-data';
import \{ MemoryBlockstore \} from 'blockstore-core/memory';
(async () => \{
const metadata = \{
name: 'My File',
description: 'This is a sample file',
created: new Date().toISOString(),
// ... other metadata fields
\};
// Create metadata node
const metadataNode = createMetadataNode(metadata);
// Process metadata to IPLD format
const blockstore = new MemoryBlockstore();
const metadataCID = await processMetadataToIPLDFormat(blockstore, metadata);
const cidString = cidToString(metadataCID);
console.log(\`Metadata CID: \$\{cidString\}\`);
\})();
\`\`\`
### 6. Complete File Processing Example
Full example of processing a file and extracting its CID:
\`\`\`typescript
import \{ processFileToIPLDFormat, cidToString \} from '@autonomys/auto-dag-data';
import \{ MemoryBlockstore \} from 'blockstore-core/memory';
import fs from 'fs';
(async () => \{
try \{
const filePath = 'path/to/your/file.txt';
const fileStream = fs.createReadStream(filePath);
const fileStats = fs.statSync(filePath);
const blockstore = new MemoryBlockstore();
const cid = await processFileToIPLDFormat(
blockstore,
fileStream,
fileStats.size,
'file.txt'
);
const cidString = cidToString(cid);
console.log(\`Successfully processed file. CID: \$\{cidString\}\`);
\} catch (error) \{
console.error('Error processing file:', error);
\}
\})();
\`\`\`
## Best Practices
### Error Handling
Always wrap asynchronous operations in try-catch blocks:
\`\`\`typescript
try \{
const cid = await processFileToIPLDFormat(blockstore, stream, size, name);
// Handle success
\} catch (error) \{
console.error('Processing failed:', error);
// Handle error appropriately
\}
\`\`\`
### Blockstore Management
Use appropriate blockstore implementations based on your needs:
\`\`\`typescript
// For temporary operations
import \{ MemoryBlockstore \} from 'blockstore-core/memory';
const tempBlockstore = new MemoryBlockstore();
// For persistent storage
import \{ FsBlockstore \} from 'blockstore-fs';
const persistentBlockstore = new FsBlockstore('./blocks');
\`\`\`
### Resource Cleanup
Ensure proper cleanup of resources when working with file streams:
\`\`\`typescript
const stream = fs.createReadStream(filePath);
try \{
const cid = await processFileToIPLDFormat(blockstore, stream, size, name);
// Process result
\} finally \{
stream.destroy(); // Clean up stream
\}
\`\`\`
---
### File: sdk/auto-drive/api_reference.mdx
# Auto-Drive API Reference
## Overview
Auto-Drive provides REST APIs for interacting with the Autonomys decentralized storage network. The APIs handle file operations, metadata management, and subscription services.
## API Documentation
The complete API documentation with all function signatures is available [here](https://mainnet.auto-drive.autonomys.xyz/api/docs).

## API Endpoints Structure
The API is organized into several main categories:
### Subscriptions Endpoints
- Manage your account subscription and view usage statistics
- \`GET /subscriptions/info\` - Get subscription information
- \`GET /subscriptions/credits\` - Check pending upload/download credits
### Upload Endpoints
- Progress tracking and upload management
- \`POST /uploads/file\` - Upload single files
- \`POST /uploads/folder\` - Upload folders and directories
- \`POST /uploads/multipart\` - Multipart uploads for large files
### Objects Endpoints
- Object metadata management and access control
- \`GET /objects\` - List your uploaded objects
- \`GET /objects/search\` - Search objects by name or CID
- \`POST /objects/share\` - Share objects and manage permissions
### Download Endpoints
- Support for password-protected downloads
- \`GET /downloads/\{cid\}\` - Download files by CID
- \`GET /downloads/stream/\{cid\}\` - Streaming downloads
## Base URLs
- **Mainnet**: \`https://mainnet.auto-drive.autonomys.xyz\`
## Authentication
All API requests require authentication using one of these methods:
### API Key Authentication (Recommended)
Include your API key in the request headers:
\`\`\`http
Authorization: Bearer your-api-key
X-Auth-Provider: apikey
\`\`\`
### JWT Token Authentication
For advanced use cases, JWT tokens are supported:
\`\`\`http
Authorization: Bearer your-jwt-token
X-Auth-Provider: jwt
\`\`\`
> **Security Note:** Keep your API keys secure and never share them with unauthorized parties.
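For example, a minimal TypeScript sketch calling the subscriptions endpoint with API-key authentication (replace the key with your own; the same call is shown with \`curl\` further down this page):
\`\`\`typescript
const API_KEY = 'your-api-key';

(async () => \{
  const response = await fetch('https://mainnet.auto-drive.autonomys.xyz/subscriptions/info', \{
    headers: \{
      Authorization: \`Bearer \$\{API_KEY\}\`,
      'X-Auth-Provider': 'apikey',
    \},
  \});
  console.log(await response.json());
\})();
\`\`\`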
## Auto-Drive Services
Auto-Drive consists of two main services:
### 1. Auto-Drive Storage API
Handles all file operations and object management:
- **File uploads** (single and multipart)
- **Object metadata management**
- **Access control and permissions**
- **Subscription management**
### 2. Auto-Drive Download Gateway
Provides optimized file download capabilities:
- **Direct file downloads**
- **Asynchronous downloads**
- **Bandwidth optimization**
## Response Format
All API responses follow a consistent JSON format with appropriate HTTP status codes. Error responses include detailed error messages to help with debugging.
## Rate Limits
API usage is subject to your subscription limits:
- **Upload limit**: 100MB per month (default)
- **Download limit**: 5GB per month (default)
Check your current limits using the subscription endpoints or in the Auto-Drive dashboard.
## Making Direct API Calls
You can make direct HTTP requests to the API endpoints:
\`\`\`bash
# Example: Get subscription info
curl -X GET "https://mainnet.auto-drive.autonomys.xyz/subscriptions/info" \
-H "Authorization: Bearer your-api-key" \
-H "X-Auth-Provider: apikey"
\`\`\`
\`\`\`bash
# Example: Upload a file
curl -X POST "https://mainnet.auto-drive.autonomys.xyz/uploads/file" \
-H "Authorization: Bearer your-api-key" \
-H "X-Auth-Provider: apikey" \
-H "Content-Type: multipart/form-data" \
-F "file=@path/to/your/file.txt"
\`\`\`
For detailed endpoint specifications, request/response schemas, and interactive testing, visit the [complete API documentation](https://mainnet.auto-drive.autonomys.xyz/api/docs).
---
### File: sdk/auto-drive/available_functions.mdx
# Auto-Drive SDK Available Functions
### Upload Operations
- \`uploadFile(api, file, options): Promise<string>\`: Uploads a file (using a buffer, \`File\`, or a custom interface) with optional encryption and compression. Returns the resulting CID as a string.
- \`uploadFileFromFilepath(api, filePath, options): Promise<string>\`: Uploads a file from filepath with optional encryption and compression. Returns the resulting CID as a string.
- \`uploadFileFromInput(api, file, options): Promise<string>\`: Uploads a file obtained from a browser's \`File\` API.
- \`uploadFolderFromInput(api, fileList, options): Promise<string>\`: Uploads a folder from a browser's \`FileList\`.
- \`uploadFileWithinFolderUpload(api, uploadId, file, options): Promise<string>\`: Uploads a file within an existing folder upload session.
- \`uploadObjectAsJSON(api, object, name?, options): Promise<string>\`: Serializes and uploads any object as a JSON file.
### Download Operations
- \`downloadFile(api, cid, password?): AsyncIterable<Buffer>\`: Downloads a file from its CID, with optional decryption using a password.
- \`getMyFiles(api, page, limit): Promise<PaginatedResult<ObjectSummary>>\`: Retrieves a paginated list of the user's files.
- \`searchByNameOrCIDInMyFiles(api, value): Promise<ObjectSummary[]>\`: Searches for files by name or CID within the user's files.
- \`searchByNameOrCID(api, value): Promise<ObjectSummary[]>\`: Global search for files by name or CID.
### Utility Functions
- \`getPendingCredits(api): Promise<\{ upload: number; download: number \}>\`: Gets the current user's pending upload and download credits.
- \`getSubscriptionInfo(api): Promise<SubscriptionInfo>\`: Retrieves the current user's subscription information.
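### Usage Example
An illustrative sketch combining a few of the functions above. The API instance setup follows the pattern from the setup guide, and the upload options are left empty here; see the upload operations above for encryption and compression options.
\`\`\`typescript
import \{ createAutoDriveApi, uploadFileFromFilepath, downloadFile \} from '@autonomys/auto-drive';
import \{ NetworkId \} from '@autonomys/auto-utils';

(async () => \{
  const api = createAutoDriveApi(\{ apiKey: 'your-api-key', network: NetworkId.MAINNET \});

  // Upload a local file and get back its CID
  const cid = await uploadFileFromFilepath(api, 'path/to/file.txt', \{\});
  console.log(\`Uploaded file CID: \$\{cid\}\`);

  // Stream the file back down chunk by chunk
  const chunks: Buffer[] = [];
  for await (const chunk of downloadFile(api, cid)) \{
    chunks.push(chunk);
  \}
  console.log(\`Downloaded \$\{Buffer.concat(chunks).length\} bytes\`);
\})();
\`\`\`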
---
### File: sdk/auto-drive/create_api_key.mdx
# Creating Your Auto-Drive API Key
## Authentication Overview
All requests to the Auto-Drive APIs require authentication using an API key. The API key is a string that authenticates your requests to the Auto-Drive APIs.
## Step-by-Step Guide to Create an API Key
### 1. Access Auto-Drive Dashboard
1. Open [Auto Drive](https://ai3.storage/)
2. Sign in via Google, Discord, GitHub or Wallet

### 2. Navigate to Developer Section
1. Once you're logged in, click on the **developers** section in the left sidebar menu
2. In the developers section, click on **'Create API Key'**
3. Read the modal message and click on generate
### 3. Alternative Method - Via Profile
You can also create an API key through your profile:
1. Click on \`Profile\` in the dashboard
2. Create your API key from the profile section

## Understanding Your Dashboard
Once logged in, your Dashboard will show:
- Your upload limit (currently **20MB** per month on mainnet)
- A list of **uploaded files** and their **CIDs**
- Options to **Download**, **Share** or **Remove** each file

> *Note:* Removing a file does not delete it from the DSN as it is permanent storage. It only removes the file from your Dashboard.
## Using Your API Key
Once you have your API key, you can use it to authenticate your requests:
\`\`\`typescript
import \{ createAutoDriveApi \} from '@autonomys/auto-drive'
import \{ NetworkId \} from '@autonomys/auto-utils'
const api = createAutoDriveApi(\{
apiKey: 'your-api-key-here',
network: NetworkId.MAINNET
\})
\`\`\`
## Network Selection
Auto Drive is available on:
- **Mainnet** (\`NetworkId.MAINNET\`) - Production environment
## Security Best Practices
- Keep your API key secure and never commit it to version control
- Use environment variables to store your API key
- Regenerate your API key if you suspect it has been compromised
## Next Steps
With your API key created, you can now:
- Install the Auto-Drive SDK
- Start uploading and downloading files
- Explore the full API reference
- Try the usage examples
---
### File: sdk/auto-drive/gateway.mdx
# Auto Drive Gateway
## What is the Auto Drive Gateway?
The Auto Drive Gateway is a unified gateway solution designed for seamlessly accessing files or folders permanently stored on the Autonomys Mainnet through a single, streamlined interface. The gateway is available [here](https://gateway.autonomys.xyz/).

## How Gateways Work
In decentralized storage systems, files are identified by their Content Identifier (CID) rather than traditional file paths. A gateway acts as a bridge between the decentralized network and standard web protocols, allowing you to access stored content using familiar HTTP requests.
Think of it as a web interface that translates CIDs into downloadable files - similar to how IPFS gateways work, but specifically designed for the Autonomys network.
## When to Use Auto Drive Gateway
### Direct File Access
- **Public file sharing**: Share files with others who don't have the Auto Drive SDK
- **Web integration**: Embed files directly in websites using standard HTML tags
- **Browser downloads**: Allow users to download files directly through their browser
- **CDN-like functionality**: Serve static assets like images, documents, or media files
### Development & Testing
- **Quick verification**: Quickly check if your uploads were successful
- **Debugging**: Verify file integrity without writing code
- **Prototyping**: Access files during development without implementing full SDK integration
### Integration Scenarios
- **Third-party applications**: Enable other services to access your stored files
- **API responses**: Return direct download links in your API responses
- **Mobile apps**: Simple file access without complex SDK integration
## Gateway Endpoints
### Retrieve File by CID
\`\`\`
GET /file/:cid
\`\`\`
Retrieves a file based on its Content Identifier (CID).
**Responses:**
- \`200 OK\`: Successfully retrieves and serves the requested file
- \`302 Redirect\`: Redirects to \`/folder/:cid\` if the CID corresponds to a folder
- \`404 Not Found\`: If no file is found for the provided CID
### Retrieve Folder by CID
\`\`\`
GET /folder/:cid
\`\`\`
Retrieves a folder based on its Content Identifier (CID).
**Responses:**
- \`200 OK\`: Successfully retrieves and serves the requested folder
- \`302 Redirect\`: Redirects to \`/file/:cid\` if the CID corresponds to a file
- \`404 Not Found\`: If no folder is found for the provided CID
## Usage Examples
### Direct Browser Access
\`\`\`
https://gateway.autonomys.xyz/file/your-cid-here
\`\`\`
### HTML Integration
\`\`\`html
<!-- Display an image -->
<img src="https://gateway.autonomys.xyz/file/your-image-cid" alt="My Image">
<!-- Download link -->
<a href="https://gateway.autonomys.xyz/file/your-document-cid" download>Download Document</a>
\`\`\`
### API Response
\`\`\`javascript
const response = \{
filename: "document.pdf",
downloadUrl: \`https://gateway.autonomys.xyz/file/\$\{fileCid\}\`,
size: 1024000
\}
\`\`\`
## Finding Your CIDs
To use the gateway, you'll need the CID of your uploaded file or folder. You can find CIDs:
1. In your [Auto Drive Dashboard](https://ai3.storage/) after uploading files
2. From the return value of SDK upload functions
3. In your application logs where upload responses are recorded
Simply enter your CID in the gateway interface or construct the URL directly for programmatic access.
---
### File: sdk/auto-drive/overview_setup.mdx
# Auto-Drive Overview & Setup
## Introduction
The \`@autonomys/auto-drive\` package provides a set of tools to interact with the Autonomys Auto-Drive API.
## Features
- **Autonomys DSN**: Permanently store files on the Autonomys DSN (Distributed Storage Network).
- **CID Management**: Just like with IPFS, each upload gets its own CID (Content Identifier).
- **TypeScript Support**: Fully typed for an enhanced developer experience.
## Authentication
All requests to the Auto-Drive APIs require authentication using an API key.
> **Need an API key?** See our [API Key Setup Guide](/sdk/auto-drive/create_api_key) for detailed instructions on creating your API key.
## Auto Drive Dashboard
The Auto Drive dashboard provides a web interface for managing your files and account:
- View your upload and download limits
- Browse and manage your uploaded files
- Share files with others
- Create and manage API keys
Access the dashboard at [Auto Drive](https://ai3.storage/) and sign in with Google, Discord, or GitHub.

### Sharing files
You can share files directly from the dashboard by clicking the \`Share\` button next to any file. You can share using a direct link or provide a user's public ID to share all their files.

## Installation
Install the package via \`npm\` or \`yarn\`:
\`\`\`bash
# Using npm
npm install @autonomys/auto-drive
npm install @autonomys/auto-utils
# Using yarn
yarn add @autonomys/auto-drive
yarn add @autonomys/auto-utils
\`\`\`
## Importing
Import the \`auto-drive\` functions you need into your project:
\`\`\`typescript
// Import specific functions
import \{ fs, createAutoDriveApi \} from '@autonomys/auto-drive';
import \{ NetworkId \} from '@autonomys/auto-utils'
// Or import everything
import * as drive from '@autonomys/auto-drive';
import \{ NetworkId \} from '@autonomys/auto-utils'
\`\`\`
## Network Configuration
[Auto Drive](https://ai3.storage) is available on Mainnet. To connect to it, configure the network using the \`auto-utils\` package:
\`\`\`ts
import \{ createAutoDriveApi \} from '@autonomys/auto-drive'
import \{ NetworkId \} from '@autonomys/auto-utils'

const api = createAutoDriveApi(\{
  apiKey: 'your-api-key',
  network: NetworkId.MAINNET,
\})
\`\`\`
---
### File: sdk/auto-drive/s3_layer.mdx
# Auto Drive S3 Layer Guide
## Overview
**Auto Drive** provides an **S3-compatible API layer** that allows you to interact with **decentralized storage (DSN)** using standard **Amazon Web Services Simple Storage Service (AWS S3)** SDK commands. This bridges the gap between familiar cloud storage patterns and next-generation decentralized infrastructure, giving developers the best of both worlds: the **reliability and developer experience of S3 APIs** with the **permanence and censorship-resistance** of decentralized storage.
For those unfamiliar, [Amazon Web Services Simple Storage Service (AWS S3)](https://aws.amazon.com/s3/) is an industry-standard object storage service that powers much of the modern web's file storage needs. **Auto Drive** maintains complete compatibility with S3's APIs while storing your data on a **decentralized network** instead of **centralized servers**.
## How It Works
Auto Drive maintains an \`object_mappings\` table in its database that maps S3 object keys to Content Identifiers (CIDs). When you upload via the S3 API, the system takes the following steps, sketched in the example after this list:
1. Stores the file content on the decentralized network (DSN)
2. Records the key-to-CID mapping in the database
3. Returns the CID as the ETag for S3 compatibility
4. Enables cross-API access between S3 and Auto Drive APIs
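Below is a minimal sketch of that flow using the AWS SDK for JavaScript: a single \`PutObject\` call whose returned \`ETag\` is the CID recorded in the mapping. The endpoint, key, and API key values are placeholders; client configuration is explained in detail further down this page.
\`\`\`typescript
import \{ S3Client, PutObjectCommand \} from "@aws-sdk/client-s3";

// Sketch only: the Auto Drive API key goes in accessKeyId, secretAccessKey stays empty
const s3Client = new S3Client(\{
  region: "us-east-1",
  credentials: \{ accessKeyId: "your-auto-drive-api-key", secretAccessKey: "" \},
  bucketEndpoint: true, // the "Bucket" value below is the full endpoint URL
\});

const result = await s3Client.send(
  new PutObjectCommand(\{
    Bucket: "https://public.auto-drive.autonomys.xyz/api/s3",
    Key: "hello.txt", // S3 object key mapped to the resulting CID
    Body: Buffer.from("Hello, Autonomys!"),
  \})
);

// The returned ETag is the CID of the stored content
console.log("Stored with CID (ETag):", result.ETag);
\`\`\`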
## Key Features
### 1. **Standard S3 SDK Compatibility**
- Use official AWS S3 SDK (\`@aws-sdk/client-s3\`)
- Supports all major S3 operations: \`PutObject\`, \`GetObject\`, \`HeadObject\`, multipart uploads (\`CreateMultipartUploadCommand\`, \`UploadPartCommand\` & \`CompleteMultipartUploadCommand\`)
- No code changes required for existing S3 applications
### 2. **Enhanced Metadata Support**
\`\`\`typescript
// Compression and encryption metadata
const command = new PutObjectCommand(\{
Bucket: "https://public.auto-drive.autonomys.xyz/api/s3",
Key: "file.txt",
Body: buffer,
Metadata: \{
compression: "ZLIB",
encryption: "AES_256_GCM",
\},
\});
\`\`\`
### 3. **Range Requests**
- Partial file downloads supported
- Standard HTTP Range headers
\`\`\`typescript
const command = new GetObjectCommand(\{
Bucket: bucket,
Key: key,
Range: "bytes=0-9", // Download first 10 bytes
\});
\`\`\`
### 4. **Multipart Upload Support**
- Full multipart upload workflow
- Create → Upload Parts → Complete pattern
- Automatic chunking for large files
\`\`\`typescript
// Complete multipart upload example
const key = "large-file.txt";
const fileContent = Buffer.from("Large file content...");
// Step 1: Create multipart upload
const createCommand = new CreateMultipartUploadCommand(\{
Bucket: "https://public.auto-drive.autonomys.xyz/api/s3",
Key: key,
\});
const createResult = await s3Client.send(createCommand);
const uploadId = createResult.UploadId!;
// Step 2: Upload parts
const uploadPartCommand = new UploadPartCommand(\{
Bucket: "https://public.auto-drive.autonomys.xyz/api/s3",
Key: key,
UploadId: uploadId,
PartNumber: 1,
Body: fileContent,
\});
const partResult = await s3Client.send(uploadPartCommand);
// Step 3: Complete multipart upload
const completeCommand = new CompleteMultipartUploadCommand(\{
Bucket: "https://public.auto-drive.autonomys.xyz/api/s3",
Key: key,
UploadId: uploadId,
MultipartUpload: \{
Parts: [
\{
ETag: partResult.ETag!,
PartNumber: 1,
\},
],
\},
\});
const completeResult = await s3Client.send(completeCommand);
\`\`\`
## Configuration
### Client Setup
\`\`\`typescript
const s3Client = new S3Client(\{
region: "us-east-1",
credentials: \{
accessKeyId: "your-auto-drive-api-key", // Your Auto Drive API key
secretAccessKey: "", // Always empty for Auto Drive
\},
bucketEndpoint: true, // Required for custom endpoints
\});
\`\`\`
### Endpoint Configuration
\`\`\`typescript
// The "Bucket" parameter becomes part of the endpoint URL
const Bucket = \`\$\{baseURL\}/s3\`; // e.g., "https://public.auto-drive.autonomys.xyz/api/s3"
// No actual S3 bucket is created - it's just URL routing
\`\`\`
- Mainnet: \`https://public.auto-drive.autonomys.xyz/api/s3\`
- Local development: \`http://localhost:3000/s3\`
- Bucket name becomes the full endpoint path
- No actual bucket concept - uses path-based routing
## Authentication
- Uses Auto Drive API key-based authentication
- Integrates with Auto Drive's user management system
- API key goes in \`accessKeyId\`, \`secretAccessKey\` remains empty
- Supports the same authentication as the Auto Drive API
## File Ownership & Access
- **Cross-API compatibility**: Files uploaded via S3 API are accessible through Auto Drive API and vice versa
- **Centralized ownership**: File ownership is tracked centrally, not per-API
- **Content deduplication**: Multiple users uploading identical content will share the same underlying CID
- **Shared access**: If different users upload the same file via different APIs, both can access it through either API
## Storage Characteristics
### Content Addressing
- Files are stored using Content Identifiers (CIDs)
- ETag returned is the actual CID of the uploaded content
- Immutable storage - same content always produces same CID
### Decentralized Backend
- Files are stored on the Autonomys Network DSN (available on Autonomys Mainnet & Testnet)
- Automatic replication and redundancy
- No single point of failure
## Migrating from AWS S3
For developers moving from traditional AWS S3:
1. **Update endpoint** to Auto Drive server URL
2. **Change credentials** to use Auto Drive API key (with empty secret)
3. **Set \`bucketEndpoint\`**: true in S3Client configuration
4. **Handle longer response times** due to blockchain network latency
5. **Expect CIDs as ETags** instead of MD5 hashes
6. **Update bucket references** to use full endpoint URLs
7. **Test multipart uploads** as they may behave slightly differently
\`\`\`typescript
// Before (AWS S3)
const s3Client = new S3Client(\{
region: "us-east-1",
credentials: \{
accessKeyId: "AKIA...",
secretAccessKey: "abc123...",
\},
\});
// After (Auto Drive)
const s3Client = new S3Client(\{
region: "us-east-1",
credentials: \{
accessKeyId: "your-auto-drive-api-key",
secretAccessKey: "",
\},
bucketEndpoint: true,
\});
\`\`\`
## Limitations & Considerations
### Performance
- The DSN (on-chain) storage has higher latency than traditional S3
- Multipart uploads recommended for files > 5MB
- Range requests may have different performance characteristics
### Compatibility Notes
- Not all S3 features supported (e.g., versioning, lifecycle policies)
- Custom metadata handling for compression/encryption
- Bucket operations are virtual (no actual bucket creation)
## Best Practices
1. **Use Multipart Uploads** for files larger than 5MB
2. **Leverage Range Requests** for partial file access
3. **Include Compression/Encryption** metadata when needed
4. **Handle ETags as CIDs** for content verification
5. **Implement Retry Logic** for blockchain network delays (see the sketch below)
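A simple retry helper along these lines can absorb transient network delays. This is an illustrative sketch only: the attempt count and back-off values are arbitrary, and \`s3Client\` and \`putCommand\` stand in for a client and command configured as shown earlier on this page.
\`\`\`typescript
// Sketch of a retry wrapper with linear back-off (illustrative values)
async function withRetry<T>(operation: () => Promise<T>, attempts = 3, baseDelayMs = 1000): Promise<T> \{
  let lastError: unknown;
  for (let attempt = 1; attempt <= attempts; attempt++) \{
    try \{
      return await operation();
    \} catch (error) \{
      lastError = error;
      // Wait a little longer after each failed attempt before retrying
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * attempt));
    \}
  \}
  throw lastError;
\}

// Example: retry an upload against the Auto Drive S3 endpoint
const result = await withRetry(() => s3Client.send(putCommand));
\`\`\`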
## Error Handling
- Standard S3 error responses
- Additional blockchain-specific error codes
- Network timeouts may be longer than traditional S3
This S3 layer provides a familiar interface while leveraging the benefits of decentralized storage, making it easy to migrate existing S3-based applications to Auto Drive.
---
### File: sdk/auto-drive/usage_examples.mdx
# Auto-Drive Usage Examples
## 1. Uploading a file from a filepath (not available in browser)
\`\`\`typescript
import \{ fs, createAutoDriveApi \} from '@autonomys/auto-drive'
import \{ NetworkId \} from '@autonomys/auto-utils'

const api = createAutoDriveApi(\{ apiKey: 'your-api-key', network: NetworkId.MAINNET \}) // Initialize your API instance with API key
const filePath = 'path/to/your/file.txt' // Specify the path to your file
const options = \{
  password: 'your-encryption-password', // Optional: specify a password for encryption
  compression: true,
  // an optional callback useful for large file uploads
  onProgress: (progress: number) => \{
    console.log(\`The upload is \$\{progress\}% completed\`)
  \},
\}

const cid = await fs.uploadFileFromFilepath(api, filePath, options)
console.log(\`The file is uploaded and its cid is \$\{cid\}\`)
\`\`\`
## 2. Uploading a file using File interface
\`\`\`typescript
import \{ createAutoDriveApi \} from '@autonomys/auto-drive'
import \{ NetworkId \} from '@autonomys/auto-utils'

const api = createAutoDriveApi(\{ apiKey: 'your-api-key', network: NetworkId.MAINNET \}) // Initialize your API instance with API key
// e.g. get a File object from an HTML <input type="file"> change event
const file: File = e.target.files[0] // Substitute with your file
const options = \{
  password: 'your-encryption-password', // Optional: specify a password for encryption
  compression: true,
\}

const cid = await api.uploadFileFromInput(file, options)
console.log(\`The file is uploaded and its cid is \$\{cid\}\`)
\`\`\`
## 3. Uploading a folder (not available in browser)
\`\`\`typescript
import \{ createAutoDriveApi, fs \} from '@autonomys/auto-drive'
import \{ NetworkId \} from '@autonomys/auto-utils'

const api = createAutoDriveApi(\{ apiKey: 'your-api-key', network: NetworkId.MAINNET \}) // Initialize your API instance with API key
const folderPath = 'path/to/your/folder' // Specify the path to your folder
const options = \{
  uploadChunkSize: 1024 * 1024, // Optional: specify the chunk size for uploads
  password: 'your-encryption-password', // Optional: if the folder is encrypted
  // an optional callback useful for large folder uploads
  onProgress: (progress: number) => \{
    console.log(\`The upload is \$\{progress\}% completed\`)
  \},
\}

const folderCID = await fs.uploadFolderFromFolderPath(api, folderPath, options)
console.log(\`The folder is uploaded and its cid is \$\{folderCID\}\`)
\`\`\`
## 4. Uploading a file from a custom interface with \`GenericFile\`
\`\`\`typescript
import \{ createAutoDriveApi \} from '@autonomys/auto-drive'
import \{ NetworkId \} from '@autonomys/auto-utils'

const api = createAutoDriveApi(\{ apiKey: 'your-api-key', network: NetworkId.MAINNET \}) // Initialize your API instance with API key
const buffer = Buffer.from(...);
const genericFile = \{
  read: async function* () \{
    yield buffer
  \},
  name: "autonomys-whitepaper.pdf",
  mimeType: "application/pdf",
  size: 1234556,
  path: "autonomys-whitepaper.pdf",
\}
const options = \{
  password: 'your-encryption-password', // Optional: specify a password for encryption
  compression: true,
  // an optional callback useful for large file uploads
  onProgress: (progress: number) => \{
    console.log(\`The upload is \$\{progress\}% completed\`)
  \},
\}

const cid = await api.uploadFile(genericFile, options)
console.log(\`The file is uploaded and its cid is \$\{cid\}\`)
\`\`\`
## 5. Downloading files
\`\`\`typescript
import \{ createAutoDriveApi \} from '@autonomys/auto-drive'
import \{ NetworkId \} from '@autonomys/auto-utils'

const api = createAutoDriveApi(\{ apiKey: 'your-api-key', network: NetworkId.MAINNET \}) // Initialize your API instance with API key
try \{
  const cid = '..'
  const stream = await api.downloadFile(cid)
  let file = Buffer.alloc(0)
  for await (const chunk of stream) \{
    file = Buffer.concat([file, chunk])
  \}
  console.log(\`File downloaded successfully: \$\{file.length\} bytes\`)
\} catch (error) \{
  console.error('Error downloading file:', error)
\}
\`\`\`
## 6. Creating a shareable download link
\`\`\`typescript
import \{ createAutoDriveApi \} from '@autonomys/auto-drive'
import \{ NetworkId \} from '@autonomys/auto-utils'

const api = createAutoDriveApi(\{ apiKey: 'your-api-key', network: NetworkId.MAINNET \}) // Initialize your API instance with API key
try \{
  const cid = 'your-file-cid'
  const publicUrl = await api.publishObject(cid)
  console.log('Public download URL:', publicUrl)
\} catch (error) \{
  console.error('Error publishing object:', error)
\}
\`\`\`
## 7. Example usage of \`getMyFiles\`
Here is an example of how to use the \`getMyFiles\` method to retrieve the root directories:
\`\`\`typescript
import \{ createAutoDriveApi \} from '@autonomys/auto-drive'
import \{ NetworkId \} from '@autonomys/auto-utils'

const api = createAutoDriveApi(\{ apiKey: 'your-api-key', network: NetworkId.MAINNET \}) // Initialize your API instance with API key
try \{
  for (let i = 0; i < 10; i++) \{
    const myFiles = await api.getMyFiles(i, 100)
    console.log(\`Retrieved \$\{myFiles.rows.length\} files of \$\{myFiles.totalCount\} total\`)
    for (const file of myFiles.rows) \{
      console.log(\`\$\{file.name\} - \$\{file.headCid\}: \$\{file.size\}\`)
    \}
  \}
\} catch (error) \{
  console.error('Error retrieving files:', error)
\}
\`\`\`
---
### File: sdk/auto-utils.mdx
## Auto-Utils Package
### Introduction
The \`@autonomys/auto-utils\` package provides core utility functions for interacting with and building applications on the Autonomys Network.
### Features
- **Wallet Management**: Initialize and manage wallets using mnemonics or URIs.
- **Network Configuration**: Access and manage network and domain settings.
- **Data Storage**: Save and read data to and from local storage or the file system.
- **Cryptographic Operations**: Perform hashing and data manipulation using cryptographic functions.
- **API Activation**: Activate and manage connections to the Autonomys Network APIs.
- **Address Utilities**: Convert and decode addresses to and from standardized formats.
### Installation
Install the package via \`npm\` or \`yarn\`:
\`\`\`bash
# Using npm
npm install @autonomys/auto-utils
# Using yarn
yarn add @autonomys/auto-utils
\`\`\`
### Importing
Import the \`auto-utils\` functions you need into your project:
\`\`\`typescript
// Import specific functions
import \{ activateWallet, activate, blake2b_256 \} from '@autonomys/auto-utils';
// Or import everything
import * as utils from '@autonomys/auto-utils';
\`\`\`
### Available functions
> *Note:* All asynchronous functions return a \`Promise\` and should be used with \`await\` for proper execution flow. Wrap asynchronous calls in \`try...catch\` blocks to handle potential errors gracefully.
#### Account and address utilities
- \`address(input): string\`: Standardizes an address format.
- \`createAccountIdType(api, address): Uint8Array\`: Creates an \`AccountId\` object from an address.
- \`decode(input): Uint8Array\`: Decodes an address to bytes.
#### API and connection management
> *Note:* Always disconnect the API instance after operations to free up resources.
- \`activate(options?): Promise<ApiPromise>\`: Connects to the Autonomys Network.
- \`activateDomain(params): Promise<ApiPromise>\`: Connects to a specific domain.
- \`createConnection(endpoint, options?): Promise<ApiPromise>\`: Creates a new API connection.
- \`createAutoDriveApi(api, networkId)\`: Connects to the Auto Drive API of the selected Network.
- \`disconnect(api): Promise<void>\`: Disconnects an API instance.
#### Cryptographic functions
- \`blake2b_256(data): string\`: Hashes data with BLAKE2b-256.
- \`concatenateUint8Arrays(...arrays): Uint8Array\`: Concatenates multiple \`Uint8Array\`s.
- \`stringToUint8Array(string): Uint8Array\`: Converts a string to \`Uint8Array\`.
#### Data storage
> *Note:* Be cautious when saving sensitive data using \`save\` and \`read\` as data storage is permanent. Handle private keys securely. Do not expose them in code or logs.
- \`read(key): Promise<any>\`: Reads data from storage.
- \`readFromFileSystem(key): Promise<any>\`: Reads data from file system.
- \`readFromLocalStorage(key): Promise<any>\`: Reads data from local storage.
- \`save(key, value): Promise<void>\`: Saves data to storage.
- \`saveOnFileSystem(key, value): Promise<void>\`: Saves data to file system.
- \`saveOnLocalStorage(key, value): Promise<void>\`: Saves data to local storage.
#### Event management
- \`eventName(type, event): string\`: Combines a type and an event into a full event name.
- \`eventsGroup\`: Groups system events by name.
- \`expectSuccessfulTxEvent\`: Default success event names array.
- \`Type\`: Enum for event types (e.g., \`system\`).
- \`validateEvents(events, eventsExpected?, tx, block, log?): EventsValidated\`: Checks if expected events are in transaction events.
#### Network management
- \`getNetworkDetails(options): NetworkDetails\`: Retrieves the details of a network.
- \`getNetworkDomainDetails(options): DomainDetails\`: Retrieves the details of a domain.
- \`getNetworkDomainRpcUrls(options): string[]\`: Retrieves the RPC URLs for a domain.
- \`getNetworkRpcUrls(options): string[]\`: Retrieves the RPC URLs for a network.
- \`networks\`: Array of available networks.
#### Signing utilities
- \`signMessage(signer, address, data): Promise<\{ signature: string \}>\`: Signs a message with a signer and an address.
- \`signatureVerify\`: Verifies signatures (re-exported from \`@polkadot/util-crypto\`).
- \`signingKey(publicKey): string\`: Converts a public key to a hex string.
#### String utilities
- \`capitalizeFirstLetter(string): string\`: Capitalizes the first letter.
- \`fixLengthEntryId(blockHeight, indexInBlock?): string\`: Creates fixed-length IDs.
- \`isAddress(address): boolean\`: Validates a Substrate address.
- \`isHex(value): boolean\`: Validates a hexadecimal string.
- \`shortString(value, initialLength?, endLength?): string\`: Truncates strings.
- \`stringify(value): string\`: Stringifies values, handling BigInt.
#### Token and value formatting
- \`formatSpacePledged(value, decimals?): string\`: Formats space amount with units.
- \`formatTokenAmount(amount, decimals?): bigint\`: Formats token amount with decimals.
- \`parseTokenAmount(amount, decimals?): number\`: Parses token amount with decimals.
#### Transaction utilities
- \`signAndSendTx(sender, tx, options?, eventsExpected?, log?, mapErrorCodeToEnum?): Promise<TransactionSignedAndSend>\`: Signs, sends, and validates a transaction.
#### Wallet management
- \`activateWallet(options): Promise<\{ api, accounts \}>\`: Activates a wallet using a mnemonic or URI.
- \`generateWallet(): GeneratedWallet\`: Generates a new wallet with a mnemonic.
- \`getMockWallet(name, wallets): Wallet\`: Retrieves a mock wallet by name.
- \`mockWallets(options, api?): Promise<Wallet[]>\`: Creates mock wallets for testing.
- \`setupWallet(params): Wallet\`: Sets up a wallet from a mnemonic or URI.
## Usage examples
### 1. Wallet management
#### Activate a wallet using a mnemonic phrase
\`\`\`typescript
import \{ activateWallet \} from '@autonomys/auto-utils';
(async () => \{
const mnemonic = 'your mnemonic phrase here';
const \{ api, accounts \} = await activateWallet(\{
mnemonic,
networkId: 'chronos', // Optional: specify the network ID
\});
const account = accounts[0];
console.log(\`Connected with account address: \$\{account.address\}\`);
// Perform actions with the account...
// Disconnect when done
await api.disconnect();
\})();
\`\`\`
#### Activate a wallet using a URI
\`\`\`typescript
import \{ activateWallet \} from '@autonomys/auto-utils';
(async () => \{
const \{ api, accounts \} = await activateWallet(\{
uri: '//Alice',
networkId: 'localhost', // Connect to a local network
\});
const account = accounts[0];
console.log(\`Connected with account address: \$\{account.address\}\`);
// Disconnect when done
await api.disconnect();
\})();
\`\`\`
#### Create mock wallets for testing
\`\`\`typescript
import \{ activate, mockWallets, getMockWallet \} from '@autonomys/auto-utils';
(async () => \{
const api = await activate(\{ networkId: 'chronos' \});
const wallets = await mockWallets(\{\}, api);
const aliceWallet = getMockWallet('Alice', wallets);
const bobWallet = getMockWallet('Bob', wallets);
console.log(\`Alice's address: \$\{aliceWallet.accounts[0].address\}\`);
console.log(\`Bob's address: \$\{bobWallet.accounts[0].address\}\`);
// Disconnect when done
await api.disconnect();
\})();
\`\`\`
### 2. Network management
#### List all available networks
\`\`\`typescript
import \{ networks \} from '@autonomys/auto-utils';
networks.forEach((network) => \{
console.log(\`Network ID: \$\{network.id\}, Name: \$\{network.name\}\`);
\});
\`\`\`
#### Retrieve the details of a specific network
\`\`\`typescript
import \{ getNetworkDetails \} from '@autonomys/auto-utils';
const network = getNetworkDetails(\{ networkId: 'chronos' \});
console.log(\`Network Name: \$\{network.name\}\`);
console.log(\`RPC URLs: \$\{network.rpcUrls.join(', ')\}\`);
\`\`\`
#### Retrieve the details of a specific domain within a network
\`\`\`typescript
import \{ getNetworkDomainDetails \} from '@autonomys/auto-utils';
const domain = getNetworkDomainDetails(\{ domainId: '1', networkId: 'chronos' \});
console.log(\`Domain Name: \$\{domain.name\}\`);
console.log(\`RPC URLs: \$\{domain.rpcUrls.join(', ')\}\`);
\`\`\`
### 3. Cryptographic functions
#### Hash a string using BLAKE2b-256
\`\`\`typescript
import \{ blake2b_256, stringToUint8Array \} from '@autonomys/auto-utils';
const data = 'Hello, Autonomys!';
const dataBytes = stringToUint8Array(data);
const hash = blake2b_256(dataBytes);
console.log(\`Hash: \$\{hash\}\`); // Outputs the hash of the input string
\`\`\`
#### Convert a string to a \`Uint8Array\`
\`\`\`typescript
import \{ stringToUint8Array \} from '@autonomys/auto-utils';
const text = 'Sample text';
const byteArray = stringToUint8Array(text);
console.log(byteArray); // Outputs Uint8Array representation of the string
\`\`\`
#### Concatenate two \`Uint8Array\` instances
\`\`\`typescript
import \{ stringToUint8Array, concatenateUint8Arrays \} from '@autonomys/auto-utils';
const array1 = stringToUint8Array('First part ');
const array2 = stringToUint8Array('Second part');
const concatenated = concatenateUint8Arrays(array1, array2);
console.log(\`Concatenated Result: \$\{new TextDecoder().decode(concatenated)\}\`);
// Outputs: "First part Second part"
\`\`\`
### 4. API activation
#### Activate the network API (connect to the Autonomys Network)
\`\`\`typescript
import \{ activate \} from '@autonomys/auto-utils';
(async () => \{
const api = await activate(\{ networkId: 'chronos' \});
console.log('API connected');
// Perform API calls...
// Disconnect when done
await api.disconnect();
\})();
\`\`\`
#### Activate a domain API (connect to a specific domain within the Autonomys Network)
\`\`\`typescript
import \{ activateDomain \} from '@autonomys/auto-utils';
(async () => \{
const api = await activateDomain(\{ domainId: '1', networkId: 'chronos' \});
console.log('Domain API connected');
// Perform domain-specific API calls...
// Disconnect when done
await api.disconnect();
\})();
\`\`\`
### 5. Data storage
#### Save data to local storage or the file system and read it back
\`\`\`typescript
import \{ save, read \} from '@autonomys/auto-utils';
(async () => \{
  const key = 'myData';
  const value = \{ message: 'Hello, Autonomys!' \};
  // Save data
  await save(key, value);
  // Read data
  const retrievedValue = await read(key);
  console.log(retrievedValue); // Outputs: \{ message: 'Hello, Autonomys!' \}
\})();
\`\`\`
### 6. Address utilities
#### Convert an address to a standardized format and decode it
\`\`\`typescript
import \{ address, decode \} from '@autonomys/auto-utils';
const originalAddress = '5GmS1wtCfR4tK5SSgnZbVT4kYw5W8NmxmijcsxCQE6oLW6A8';
const standardizedAddress = address(originalAddress);
const decodedAddress = decode(originalAddress);
console.log(\`Standardized Address: \$\{standardizedAddress\}\`);
console.log('Decoded Address:', decodedAddress);
\`\`\`
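### 7. Token and value formatting
#### Convert between human-readable amounts and Shannons
A minimal sketch based on the \`formatTokenAmount\`, \`parseTokenAmount\`, and \`formatSpacePledged\` signatures listed above; it assumes the default of 18 decimals when none is passed and that \`parseTokenAmount\` accepts the value produced by \`formatTokenAmount\`.
\`\`\`typescript
import \{ formatTokenAmount, parseTokenAmount, formatSpacePledged \} from '@autonomys/auto-utils';
// Convert a human-readable amount into the smallest token unit (Shannon), assuming 18 decimals by default
const shannons = formatTokenAmount(1);
console.log(\`1 token in Shannons: \$\{shannons\}\`);
// Convert back from Shannons into a human-readable number
const tokens = parseTokenAmount(shannons);
console.log(\`Parsed amount: \$\{tokens\} tokens\`);
// Format a pledged-space value with units
console.log(\`Pledged space: \$\{formatSpacePledged(1024 * 1024 * 1024)\}\`);
\`\`\`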
---
### File: sdk/auto-xdm.mdx
## Auto-XDM Package
### Introduction
The Autonomys Auto XDM SDK (\`@autonomys/auto-xdm\`) provides functionality for cross-domain transfers of native tokens.
### Features
- **Cross-Domain Transfer**: Enable token transfers between consensus and domain wallets.
- **TypeScript Support**: Full TypeScript type definitions for an enhanced developer experience.
### Installation
Install the package via \`npm\` or \`yarn\`:
\`\`\`bash
# Using npm
npm install @autonomys/auto-xdm
# Using yarn
yarn add @autonomys/auto-xdm
\`\`\`
### Importing
Import the \`auto-xdm\` functions you need into your project:
\`\`\`typescript
// Import specific functions
import \{
transferToConsensus,
transferToDomainAccount20Type,
transferToDomainAccount32Type
\} from '@autonomys/auto-xdm';
// Or import everything
import * as xdm from '@autonomys/auto-xdm';
\`\`\`
### Available functions
#### Transfer functions
> *Note:* All transfer functions return a \`SubmittableExtrinsic\` that must be signed and sent. The \`amount\` parameter expects the smallest unit of the token (Shannon). For 18 decimals, multiply by 10^18 or use \`formatTokenAmount\` from \`auto-utils\`. Ensure a sufficient balance in the sending account to cover both the transfer and fees. Cross-domain transfers may take a few blocks to complete.
- \`transfer(api, destination, amount): Promise<SubmittableExtrinsic>\`: Base transfer function for cross-domain transfers.
- \`transferToConsensus(api, accountId32, amount): Promise<SubmittableExtrinsic>\`: Transfer tokens from a domain to the consensus chain.
- \`transferToDomainAccount20Type(api, destinationDomainId, accountId20, amount): Promise<SubmittableExtrinsic>\`: Transfer tokens from the consensus chain to an EVM address.
- \`transferToDomainAccount32Type(api, destinationDomainId, accountId32, amount): Promise<SubmittableExtrinsic>\`: Transfer tokens from the consensus chain to a Substrate address.
#### Query functions
> *Note:* Monitor transfer status using query functions.
- \`chainAllowlist(api): Promise<Codec>\`: Retrieves the list of allowed chains.
- \`channels(api, chainId): Promise<Codec>\`: Retrieves the list of channels.
- \`consensusChannels(api): Promise<Codec>\`: Retrieves the list of consensus channels.
- \`domainChannels(api, domainId): Promise<Codec>\`: Retrieves the list of domain channels.
- \`allCancelledTransfers(api): Promise<Codec>\`: Retrieves all cancelled transfers.
- \`chainTransfers(api): Promise<Codec>\`: Retrieves all chain transfers.
- \`allDomainBalances(api): Promise<Codec>\`: Retrieves balances across all domains.
- \`domainBalances(api, domainId): Promise<Codec>\`: Retrieves balances for a specific domain.
- \`allUnconfirmedTransfers(api): Promise<Codec>\`: Retrieves pending transfers.
### Type definitions
\`\`\`typescript
type Amount = BigInt | number | string;
type Consensus = \{
type: 'consensus';
\};
type Domain = \{
type: 'domain';
domainId: number;
\};
type ChainOrDomain = Consensus | Domain;
\`\`\`
## Usage examples
### 1. Transfer from \`Consensus\` to \`Domain\` (EVM address)
\`\`\`typescript
import \{ activateWallet \} from '@autonomys/auto-utils'
import \{ transferToDomainAccount20Type \} from '@autonomys/auto-xdm'

const \{ api \} = await activateWallet(\{ networkId: 'chronos', uri: '//alice' \})
const tx = await transferToDomainAccount20Type(
  api,
  0, // Receiver domain (0 is Auto EVM on Chronos Testnet)
  '0x1234567890abcdef', // Receiver domain account
  '1000000000000000000',
)
// Sign and send the returned extrinsic (see Best practices below)
\`\`\`
### 2. Transfer from \`Consensus\` to \`Domain\` (Substrate address)
\`\`\`typescript
import \{ activateWallet \} from '@autonomys/auto-utils'
import \{ transferToDomainAccount32Type \} from '@autonomys/auto-xdm'

const \{ api \} = await activateWallet(\{ networkId: 'chronos', uri: '//alice' \})
const tx = await transferToDomainAccount32Type(
  api,
  0, // Receiver domain (0 is Auto EVM on Chronos Testnet)
  'su1234567890abcdef', // Receiver domain account
  '1000000000000000000',
)
\`\`\`
### 3. Transfer from \`Domain\` to \`Consensus\`
\`\`\`typescript
import \{ activateWallet \} from '@autonomys/auto-utils'
import \{ transferToConsensus \} from '@autonomys/auto-xdm'

const \{ api \} = await activateWallet(\{ networkId: 'chronos', domainId: 0, uri: '//alice' \})
const tx = await transferToConsensus(
  api,
  'su1234567890abcdef', // Receiver consensus account
  '1000000000000000000',
)
\`\`\`
### 4. Query domain balances
\`\`\`typescript
import \{ activateWallet \} from '@autonomys/auto-utils';
import \{ domainBalances \} from '@autonomys/auto-xdm';
(async () => \{
const \{ api \} = await activateWallet(\{
networkId: 'chronos'
\});
// Get balances for domain 0 (Auto EVM on the Chronos testnet)
const balances = await domainBalances(api, 0);
console.log('Domain balances:', balances.toString());
\})();
\`\`\`
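### 5. Monitor pending transfers
A minimal sketch using \`allUnconfirmedTransfers\` from the query functions listed above to check on transfers that have not yet been confirmed:
\`\`\`typescript
import \{ activateWallet \} from '@autonomys/auto-utils';
import \{ allUnconfirmedTransfers \} from '@autonomys/auto-xdm';
(async () => \{
  const \{ api \} = await activateWallet(\{
    networkId: 'chronos'
  \});
  // Retrieve transfers still waiting to be confirmed on the destination chain
  const pending = await allUnconfirmedTransfers(api);
  console.log('Unconfirmed transfers:', pending.toString());
  await api.disconnect();
\})();
\`\`\`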
### Best practices
1. **Error Handling**: Wrap asynchronous calls and transactions in \`try...catch\` blocks to handle potential errors gracefully:
\`\`\`typescript
try \{
const tx = await transferToConsensus(api, receiver, amount);
await tx.signAndSend(account);
\} catch (error) \{
console.error('Transfer failed:', error);
\}
\`\`\`
2. **Amount Formatting**: Use appropriate decimal places for token amounts:
- The \`amount\` parameter expects the smallest unit of the token (Shannon).
- For 18 decimals, multiply by 10^18 or use \`formatTokenAmount\` from \`auto-utils\`.
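- A quick sketch (assuming \`formatTokenAmount\` applies the default 18 decimals, and reusing \`api\` and \`receiver\` from the error-handling example above):
\`\`\`typescript
import \{ formatTokenAmount \} from '@autonomys/auto-utils';
// 1 token expressed in Shannons (1 * 10^18)
const amount = formatTokenAmount(1);
const tx = await transferToConsensus(api, receiver, amount);
\`\`\`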
3. **API Management**: Always disconnect the API instance after your operations are complete to free up resources:
\`\`\`typescript
try \{
// ... your code ...
\} finally \{
await api.disconnect();
\}
\`\`\`
---
### File: sdk/index.mdx
---
title: Auto SDK
---
## Auto SDK
### What is the Auto SDK?
The **Auto SDK** is a powerful toolkit of JavaScript/TypeScript packages for developers to seamlessly integrate with the Autonomys Network. It provides simple APIs for interacting with the *consensus* layer, utilizing *data storage*, managing *decentralized identities*, and (soon) handling *AI3 payments*, in addition to general-purpose functions essential for building decentralized applications (dApps)—all in JavaScript and TypeScript—abstracting away the complexities of blockchain and smart contracts.
### Key features:
- **Modular Architecture**: Use only the packages you need.
- **Easy to Use**: Simplifies blockchain operations with high-level functions.
- **Flexible**: Suitable for both beginners and experienced blockchain developers.
- **Open-source**: Built by and for the community.
### Why the Auto SDK?
- **Simplify Development**: Focus on your application's logic rather than blockchain intricacies.
- **Accelerate Time-to-Market**: Reduce development time with ready-to-use functions.
- **Ensure Compatibility**: Stay up-to-date with the latest Autonomys blockchain protocols.
- **Enhance Security**: Utilize well-tested code for critical operations like identity management.
## Packages
The Auto SDK monorepo contains multiple packages, each serving a specific purpose. All packages are published to \`npm\` under the \`@autonomys\` scope:
- **\`@autonomys/auto-utils\`**: Core utility functions for interacting with the Autonomys Network.
- **\`@autonomys/auto-consensus\`**: Functions for interacting with the consensus layer.
- **\`@autonomys/auto-drive\`**: Tools for preparing, uploading and managing data for on-chain storage.
## Quick Start Guide: Install and Code Example
A short video guide on installing the Auto SDK, along with a simple coding example demonstrating how to retrieve an account balance.
<iframe width="560" height="315" src="https://www.youtube.com/embed/B5J9fwE5-vI?si=Mt133r3I2QnCae0A" title="Auto SDK Quick Start Guide: Install and Code Example" frameBorder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe>
## Step-by-step guide
### Requirements
- **\`Node.js\`** (version 14 or higher)
- **\`yarn\`** or **\`npm\`**
### Installation
Install the packages you need via \`npm\` or \`yarn\`. For example, to install \`@autonomys/auto-utils\` and \`@autonomys/auto-consensus\`:
#### Using \`npm\`
\`\`\`bash
npm install @autonomys/auto-utils @autonomys/auto-consensus
\`\`\`
#### Using \`yarn\`
\`\`\`bash
yarn add @autonomys/auto-utils @autonomys/auto-consensus
\`\`\`
### Cloning the repository and building locally (*optional*)
If you want to run the SDK packages locally or contribute to the SDK, clone the repository and build the packages from source. Setup instructions:
#### 1. Clone the repository
Open your terminal and run:
\`\`\`bash
git clone https://github.com/autonomys/auto-sdk.git
\`\`\`
#### 2. Navigate to the project directory
\`\`\`bash
cd auto-sdk
\`\`\`
#### 3. Install dependencies
\`\`\`bash
yarn install
\`\`\`
#### 4. Compile all packages
\`\`\`bash
yarn run build
\`\`\`
#### 5. Run tests on all packages
\`\`\`bash
yarn run test
\`\`\`
### Localhost testing (*optional*)
Test the SDK packages against a local Autonomys node:
#### 1. Verify OS and architecture settings
Edit the \`scripts/download.sh\` file and ensure that lines 3-7 match your current operating system and architecture:
\`\`\`bash
# Change the following variables as needed
# OS to download
OS="macos" # Options: macos | ubuntu | windows
# Architecture to download
ARCHITECTURE="aarch64" # Options: aarch64 | x86_64-skylake | x86_64-v2
\`\`\`
#### 2. Run the development script
Execute the following command to start the local node and farmer:
\`\`\`bash
node scripts/run-dev.js
\`\`\`
This script will:
1. Download the latest version of the node and farmer compatible with your OS and architecture (\`scripts/download.sh\`).
2. Start the node and create/insert the keystore (\`scripts/run-node.sh\`).
3. Start the farmer (\`scripts/run-farmer.sh\`).
4. Register the node as an operator, wait for synchronization, and then terminate the node and farmer (handled within \`scripts/run-dev.js\`).
5. Restart the node as an operator (\`scripts/run-operator.sh\`).
6. Restart the farmer (\`scripts/run-farmer.sh\`).
#### 3. Run tests on the local node
\`\`\`bash
bash scripts/localhost-run-test.sh
\`\`\`
The tests will automatically detect and execute on the local node and farmer.
## Next steps
With the Auto SDK set up locally, you're ready to start building and testing your blockchain applications. Explore the following pages for code examples and a functions overview.
---
### File: sdk/polkadot-api.mdx
# Understanding the Polkadot API for Auto SDK Development
## Introduction
While the Auto SDK provides convenient, high-level functions for common tasks on the Autonomys Network, many advanced use cases require direct interaction with the underlying blockchain through the **Polkadot API**. The Autonomys Network is built on Substrate (the same framework as Polkadot), which means it inherits the powerful \`@polkadot/api\` interface for blockchain interactions.
> **For newcomers**: If you're just starting with blockchain development, think of the Auto SDK as a friendly toolkit for common tasks, while the Polkadot API is the comprehensive "power user" interface that gives you access to everything the blockchain can do.
## Why You Need the Polkadot API
The Auto SDK covers many essential operations, but there are numerous scenarios where you'll need to access blockchain data and functionality that isn't wrapped by the SDK:
### Chain State Queries
- **Block information**: Current block number, timestamps, block hashes
- **Network fees**: Transaction fees, storage costs, operational parameters
- **Account data**: Balances, nonces, detailed account information
- **System parameters**: Network constants, runtime version, chain specifications
### Advanced Transaction Handling
- **Fee estimation**: Calculate exact transaction costs before submission
- **Complex extrinsics**: Multi-signature operations, batched transactions
- **Event filtering**: Listen for specific blockchain events beyond basic success/failure
- **Custom pallets**: Interact with specialized blockchain modules
### Real-time Monitoring
- **Block subscriptions**: Monitor new blocks and their contents
- **State changes**: Watch for changes in specific storage items
- **Network status**: Monitor node health, peer connections, sync status
## Essential Polkadot API Documentation
The **[Polkadot.js API Documentation](https://polkadot.js.org/docs/api)** is your comprehensive resource for understanding blockchain interactions. Key sections include:
- **[API Basics](https://polkadot.js.org/docs/api/start/basics)**: Connection setup and fundamental concepts
- **[Query Methods](https://polkadot.js.org/docs/api/start/api.query)**: Reading blockchain state
- **[RPC Calls](https://polkadot.js.org/docs/api/start/api.rpc)**: Direct node communication
- **[Subscription Patterns](https://polkadot.js.org/docs/api/start/api.query.subs)**: Real-time data monitoring
## Installation and Setup
The Polkadot API works alongside the Auto SDK. Install both packages:
\`\`\`bash
npm install @polkadot/api @autonomys/auto-utils
\`\`\`
Import both in your project:
\`\`\`typescript
// Auto SDK for high-level operations
import \{ activate, activateWallet \} from '@autonomys/auto-utils';
// Polkadot API for direct blockchain access
import \{ ApiPromise, WsProvider \} from '@polkadot/api';
\`\`\`
## Common Use Cases and Examples
### 1. Getting Network Information and Fees
This example shows how to retrieve transaction fees and timestamps, information not directly exposed by the Auto SDK:
\`\`\`typescript
import \{ ApiPromise, WsProvider \} from '@polkadot/api';
async function getNetworkInfo() \{
// Connect to the Autonomys Network
const api = await ApiPromise.create(\{
provider: new WsProvider('wss://rpc-0.mainnet.subspace.network/ws')
\});
try \{
// Get current block header
const header = await api.rpc.chain.getHeader();
const blockNumber = header.number.toNumber();
// Get current timestamp
const timestamp = (await api.query.timestamp.now()).toNumber();
// Get transaction fees
const fee = await api.query.transactionFees.transactionByteFee();
console.log('Block number:', blockNumber);
console.log('Timestamp:', new Date(timestamp).toISOString());
console.log('Current fee:', fee.current.toString());
console.log('Next fee:', fee.next.toString());
\} finally \{
await api.disconnect();
\}
\}
\`\`\`
### 2. Account Balance and Nonce Queries
\`\`\`typescript
async function getAccountDetails(accountAddress) \{
const api = await ApiPromise.create(\{
provider: new WsProvider('wss://rpc-0.gemini-3h.subspace.network/ws')
\});
try \{
// Get account balance
const balance = await api.query.system.account(accountAddress);
console.log('Free balance:', balance.data.free.toString());
console.log('Reserved balance:', balance.data.reserved.toString());
console.log('Account nonce:', balance.nonce.toString());
\} finally \{
await api.disconnect();
\}
\}
\`\`\`
### 3. Fee Estimation Before Transaction
\`\`\`typescript
import \{ activateWallet \} from '@autonomys/auto-utils';
import \{ ApiPromise \} from '@polkadot/api';
async function estimateTransactionFee(recipientAddress, amount) \{
// Use Auto SDK to get wallet and API
const \{ api, accounts \} = await activateWallet(\{
uri: '//Alice',
networkId: 'mainnet'
\});
const sender = accounts[0];
// Create transaction
const transfer = api.tx.balances.transfer(recipientAddress, amount);
// Estimate fees using Polkadot API
const info = await transfer.paymentInfo(sender.address);
console.log(\`Estimated fee: \$\{info.partialFee.toString()\}\`);
console.log(\`Transaction weight: \$\{info.weight.toString()\}\`);
await api.disconnect();
\}
\`\`\`
### 4. Monitoring Blockchain Events
\`\`\`typescript
async function monitorBlocks() \{
const api = await ApiPromise.create(\{
provider: new WsProvider('wss://rpc-0.gemini-3h.subspace.network/ws')
\});
// Subscribe to new blocks
const unsubscribe = await api.rpc.chain.subscribeNewHeads((header) => \{
console.log(\`New block #\$\{header.number\} with hash \$\{header.hash\}\`);
\});
// Subscribe to balance changes for specific account
const accountAddress = 'your-account-address';
const unsubBalance = await api.query.system.account(accountAddress, (balance) => \{
console.log('Balance updated:', balance.data.free.toString());
\});
// Clean up subscriptions
setTimeout(() => \{
unsubscribe();
unsubBalance();
api.disconnect();
\}, 30000); // Stop after 30 seconds
\}
\`\`\`
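### 5. Filtering Specific Events
As a sketch of event filtering beyond basic success/failure, the subscription below watches \`system.events\` and logs only events emitted by the \`balances\` pallet; the endpoint mirrors the earlier examples.
\`\`\`typescript
import \{ ApiPromise, WsProvider \} from '@polkadot/api';
async function watchBalanceEvents() \{
  const api = await ApiPromise.create(\{
    provider: new WsProvider('wss://rpc-0.mainnet.subspace.network/ws')
  \});
  // Subscribe to the events of each new block and keep only those from the balances pallet
  const unsubscribe = await api.query.system.events((events) => \{
    events.forEach((\{ event \}) => \{
      if (event.section === 'balances') \{
        console.log(\`\$\{event.section\}.\$\{event.method\}:\`, event.data.toString());
      \}
    \});
  \});
  // Clean up after 30 seconds
  setTimeout(async () => \{
    unsubscribe();
    await api.disconnect();
  \}, 30000);
\}
\`\`\`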
## Integration Patterns with Auto SDK
### Combining Auto SDK with Polkadot API
\`\`\`typescript
import \{ activateWallet \} from '@autonomys/auto-utils';
import \{ ApiPromise \} from '@polkadot/api';
async function advancedWalletOperations() \{
// Use Auto SDK for wallet setup
const \{ api, accounts \} = await activateWallet(\{
mnemonic: 'your mnemonic phrase here',
networkId: 'mainnet'
\});
const account = accounts[0];
// Use Polkadot API for advanced queries
const accountInfo = await api.query.system.account(account.address);
const currentFees = await api.query.transactionFees.transactionByteFee();
console.log(\`Account: \$\{account.address\}\`);
console.log(\`Balance: \$\{accountInfo.data.free.toString()\}\`);
console.log(\`Current fee rate: \$\{currentFees.current.toString()\}\`);
// The api instance from Auto SDK is a full Polkadot API instance
// You can use all Polkadot API methods on it
const blockHash = await api.rpc.chain.getBlockHash();
console.log(\`Latest block hash: \$\{blockHash\}\`);
await api.disconnect();
\}
\`\`\`
## When to Use Auto SDK vs. Polkadot API
### Use Auto SDK for:
- ✅ Wallet creation and management
- ✅ Basic token transfers
- ✅ File uploads to Auto Drive
- ✅ Auto ID operations
- ✅ Network connection setup
- ✅ Common utility functions
### Use Polkadot API for:
- ✅ Fee calculations and estimations
- ✅ Block and timestamp queries
- ✅ Custom pallet interactions
- ✅ Advanced transaction types
- ✅ Real-time blockchain monitoring
- ✅ Complex state queries
---