Refactor forester
re-enable Playwright installation and update data types in photon-api/models

cd forester && pnpm test-1000-sync

failing interop tests added

• Use root_seq % 1400 instead of change_log_index (see the sketch below)
• Sort accounts in state nullification test
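
A minimal illustration of the first bullet, assuming the tree's changelog holds 1400 entries; the constant and function names are illustrative, not taken from the codebase.

```rust
/// Assumed changelog capacity of the merkle tree account.
const CHANGELOG_CAPACITY: u64 = 1400;

/// Derive the changelog index from the root sequence number instead of
/// tracking a separate change_log_index value.
fn changelog_index(root_seq: u64) -> usize {
    (root_seq % CHANGELOG_CAPACITY) as usize
}
```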

performance optimisations

wip

fix after rebase

js transfer testnet test

update forester keypair + move to .env, which is in .gitignore
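
A hedged sketch of loading the keypair from the git-ignored .env file; the variable name, helper crates, and byte format here are assumptions for illustration, not taken from the repository.

```rust
use std::env;
use solana_sdk::signature::Keypair;

// Hypothetical sketch: read the payer keypair from an environment variable
// (e.g. set in the git-ignored .env file) instead of committing it to forester.toml.
fn payer_from_env() -> Keypair {
    dotenvy::dotenv().ok(); // load .env if present
    let raw = env::var("FORESTER_PAYER").expect("FORESTER_PAYER not set");
    let bytes: Vec<u8> = serde_json::from_str(&raw).expect("expected a JSON array of bytes");
    Keypair::from_bytes(&bytes).expect("invalid keypair bytes")
}
```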

wip

refactor config

Allow state and addresses to be nullified simultaneously

Refactor init_rpc and nullify functions to use config directly

Refactor code to support concurrent nullification tasks

The code now supports concurrent nullification of state and address trees. Shared values such as the configuration and the RPC client are passed around in an Arc&lt;Mutex&lt;T&gt;&gt; wrapper to ensure safe concurrent access, and the nullify function is designed to run in parallel using tokio's task spawn and join mechanism.
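
A minimal sketch of the pattern described above, with hypothetical stand-in types: the configuration and RPC client are shared through Arc/Mutex, and state and address nullification run as separate tokio tasks that are joined at the end.

```rust
use std::sync::Arc;
use tokio::sync::Mutex;

// Hypothetical stand-ins for the real forester types.
struct ForesterConfig {}
struct Rpc {}

async fn nullify_state(_config: Arc<ForesterConfig>, _rpc: Arc<Mutex<Rpc>>) { /* ... */ }
async fn nullify_addresses(_config: Arc<ForesterConfig>, _rpc: Arc<Mutex<Rpc>>) { /* ... */ }

#[tokio::main]
async fn main() {
    // Wrap shared values so both tasks can access them safely.
    let config = Arc::new(ForesterConfig {});
    let rpc = Arc::new(Mutex::new(Rpc {}));

    // Spawn state and address nullification concurrently...
    let state = tokio::spawn(nullify_state(config.clone(), rpc.clone()));
    let addresses = tokio::spawn(nullify_addresses(config, rpc));

    // ...and wait for both tasks to finish.
    let _ = tokio::join!(state, addresses);
}
```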

Reduce the total number of transfers (generate_sync) and clean up the nullify_addresses function

Refactor and reformat Forester codebase

Simplify error handling and refactor import order in tests

Update test configurations and external service references

Add derivation key to external services config

A new field named 'derivation' has been added to the external services configuration, with a preset value for each environment. This lets the nullifier module use a variable derivation key retrieved from the external services configuration.
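
A hedged sketch of how the nullifier module might read the derivation key from the config; `ExternalServicesConfig` and its `derivation` field appear in the diff below, while the helper function itself is illustrative.

```rust
use std::str::FromStr;
use solana_sdk::pubkey::Pubkey;

use crate::external_services_config::ExternalServicesConfig;

// Illustrative helper: resolve the derivation pubkey from the external
// services config instead of a hard-coded constant.
fn derivation_pubkey(config: &ExternalServicesConfig) -> Pubkey {
    Pubkey::from_str(&config.derivation).expect("derivation must be a valid pubkey")
}
```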

Implement async processing for account nullification

Implement new version of application logic in forester's service.

Add extensive logging and debug support across codebase

This commit adds extensive logging to various modules for better monitoring and debugging, adds Debug trait implementations for multiple structs, and improves the error-handling flow with additional logging. This makes the pipeline's flow easier to follow and facilitates troubleshooting during incidents.

chore: Refactor module structure and file organization in v2 state module

Refactor module structure and file organization in v2 state module

Refactor v2 state module structure and file organization

Refactor v2 address module structure and file organization

Refactor forester

Add shutdown handling to pipeline and processors
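
One common way to wire this up is a broadcast shutdown channel that every processor selects on; the following is a minimal, hedged sketch, and the real pipeline and processor types in this commit may differ.

```rust
use std::time::Duration;
use tokio::sync::broadcast;

async fn do_work() {
    // Placeholder for one unit of pipeline work.
    tokio::time::sleep(Duration::from_millis(100)).await;
}

async fn processor(mut shutdown: broadcast::Receiver<()>) {
    loop {
        tokio::select! {
            // Stop accepting new work once a shutdown signal arrives.
            _ = shutdown.recv() => break,
            _ = do_work() => {}
        }
    }
}

#[tokio::main]
async fn main() {
    let (tx, rx) = broadcast::channel(1);
    let handle = tokio::spawn(processor(rx));
    // ... later, signal shutdown to all processors and wait for them to drain.
    let _ = tx.send(());
    let _ = handle.await;
}
```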

Refactoring

Add logging and retry mechanism in merkle tree update
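
A hedged sketch of the retry pattern this message describes, assuming a `max_retries` setting like the one added to the config and a `MaxRetriesReached` error like the one added in `errors.rs`; the update call and log messages are placeholders.

```rust
use log::{info, warn};

#[derive(Debug)]
enum ForesterError {
    MaxRetriesReached,
}

// Placeholder for the real merkle tree update call.
async fn try_update() -> Result<(), String> {
    Err("transient failure".to_string())
}

async fn update_with_retries(max_retries: usize) -> Result<(), ForesterError> {
    for attempt in 1..=max_retries {
        info!("merkle tree update, attempt {attempt}/{max_retries}");
        match try_update().await {
            Ok(()) => return Ok(()),
            Err(e) => warn!("update failed on attempt {attempt}: {e}"),
        }
    }
    Err(ForesterError::MaxRetriesReached)
}
```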

Update pnpm-lock.yaml and adjust low_address_next_value in photon_indexer.rs

In pnpm-lock.yaml, some dependencies related to 'eslint-plugin-import' were adjusted and two nonessential lines were removed. Changes in photon_indexer.rs include adjusting low_address_next_value.

Gracefully end address process

Refactor module hierarchy and update configuration

Renamed the module from 'v2' to 'nullifier' and adjusted import paths across multiple files accordingly. A few configuration settings in 'main.rs' were also changed, the testing strategy in 'package.json' was slightly modified, and the assertion test in 'e2e_test.rs' was restored.

fix

Clean up and remove unnecessary code comments

Refactor code for enhanced readability and maintainability

This commit encompasses a series of syntactical changes across multiple files. Primarily, the changes streamline `use` imports, update the arrangement of module imports, minimize whitespace, and improve code indentation for better readability. No functional or algorithmic changes are made.
sergeytimoshin committed Jun 27, 2024
1 parent 89d2f7d commit 4b88b39
Showing 54 changed files with 1,598 additions and 840 deletions.
2 changes: 2 additions & 0 deletions Cargo.lock

Some generated files are not rendered by default.

7 changes: 5 additions & 2 deletions examples/token-escrow/Anchor.toml
@@ -7,12 +7,15 @@ skip-lint = false
[programs.localnet]
token_escrow = "GRLu2hKaAiMbxpkAM1HeXzks9YeGuz18SEgXEizVvPqX"

[programs.testnet]
token_escrow = "GRLu2hKaAiMbxpkAM1HeXzks9YeGuz18SEgXEizVvPqX"

[registry]
url = "https://api.apr.dev"

[provider]
cluster = "Localnet"
wallet = "/home/ananas/.config/solana/id.json"
cluster = "testnet"
wallet = "/Users/tsv/.config/solana/id.json"

[scripts]
test = "yarn run ts-mocha -p ./tsconfig.json -t 1000000 tests/**/*.ts"
1 change: 1 addition & 0 deletions forester/.gitignore
@@ -1,2 +1,3 @@
/target
.idea
.env
15 changes: 5 additions & 10 deletions forester/forester.toml
@@ -1,10 +1,5 @@
PAYER = [17, 34, 231, 31, 83, 147, 93, 173, 61, 164, 25, 0, 204, 82, 234, 91, 202, 187, 228, 110, 146,
97, 112, 131, 180, 164, 96, 220, 57, 207, 65, 107, 2, 99, 226, 251, 88, 66, 92, 33, 25, 216,
211, 185, 112, 203, 212, 238, 105, 144, 72, 121, 176, 253, 106, 168, 115, 158, 154, 188, 62,
255, 166, 81]

STATE_MERKLE_TREE_PUBKEY = "5bdFnXU47QjzGpzHfXnxcEi5WXyxzEAZzd1vrE39bf1W"
NULLIFIER_QUEUE_PUBKEY = "44J4oDXpjPAbzHCSc24q7NEiPekss4sAbLd8ka4gd9CZ"
REGISTRY_PUBKEY = "7Z9Yuy3HkBCc2Wf3xzMGnz6qpV4n7ciwcoEMGKqhAnj1"
ADDRESS_MERKLE_TREE_PUBKEY = "C83cpRN6oaafjNgMQJvaYgAz592EP5wunKvbokeTKPLn"
ADDRESS_MERKLE_TREE_QUEUE_PUBKEY = "HNjtNrjt6irUPYEgxhx2Vcs42koK9fxzm3aFLHVaaRWz"
STATE_MERKLE_TREE_PUBKEY="5bdFnXU47QjzGpzHfXnxcEi5WXyxzEAZzd1vrE39bf1W"
NULLIFIER_QUEUE_PUBKEY="44J4oDXpjPAbzHCSc24q7NEiPekss4sAbLd8ka4gd9CZ"
REGISTRY_PUBKEY="7Z9Yuy3HkBCc2Wf3xzMGnz6qpV4n7ciwcoEMGKqhAnj1"
ADDRESS_MERKLE_TREE_PUBKEY="C83cpRN6oaafjNgMQJvaYgAz592EP5wunKvbokeTKPLn"
ADDRESS_MERKLE_TREE_QUEUE_PUBKEY="HNjtNrjt6irUPYEgxhx2Vcs42koK9fxzm3aFLHVaaRWz"
13 changes: 8 additions & 5 deletions forester/package.json
@@ -9,7 +9,9 @@
"start-validator": "../cli/test_bin/run test-validator --indexer-db-url=postgres://photon:photon@localhost:5432/postgres",
"restart-validator": "pnpm restart-db && pnpm wait-for-db && pnpm migrate-db && pnpm start-validator",
"build": "cargo build",
"test": "RUST_MIN_STACK=98388608 RUSTFLAGS=\"-D warnings\" cargo test --package forester -- --nocapture --test-threads=1",
"test": "RUST_MIN_STACK=98388608 RUSTFLAGS=\"-D warnings\" cargo test --package forester -- --test-threads=1 --nocapture",

"test-sync": "pnpm restart-validator && pnpm transfer-sync && pnpm nullify-state",

"interop-nullify-test": "RUST_MIN_STACK=98388608 RUSTFLAGS=\"-D warnings\" cargo test test_photon_interop_nullify_account -- --nocapture",
"interop-address-test": "RUST_MIN_STACK=98388608 RUSTFLAGS=\"-D warnings\" cargo test test_photon_interop_address -- --nocapture",
@@ -19,12 +21,13 @@
"rebuild-photon": "pkill photon && cd ../../photon && cargo build && cp ./target/debug/photon ./target/debug/photon-migration ../light-protocol/.local/cargo/bin",
"tree-info": "RUST_MIN_STACK=8388608 cargo test tree_info_test -- --nocapture",
"queue-info": "RUST_MIN_STACK=8388608 cargo test queue_info_test -- --nocapture",
"test-10-sync": "pnpm restart-validator && pnpm transfer-sync && pnpm nullify && pnpm transfer-sync && pnpm nullify && pnpm transfer-sync && pnpm nullify",
"test-1000": "pnpm restart-validator && pnpm transfer && pnpm nullify && pnpm transfer && pnpm nullify",
"transfer-100k": "ts-node scripts/generate_100k.ts",
"subscribe": "RUST_MIN_STACK=8388608 cargo run --release -- subscribe",
"nullify": "RUST_MIN_STACK=8388608 cargo run --release -- nullify",
"reindex": "RUST_MIN_STACK=8388608 cargo run --release -- index",
"subscribe": "RUST_MIN_STACK=8388608 cargo run -- subscribe",
"nullify-state": "RUST_MIN_STACK=8388608 cargo run -- nullify-state",
"nullify-addresses": "RUST_MIN_STACK=8388608 cargo run -- nullify-addresses",
"nullify": "RUST_MIN_STACK=8388608 cargo run -- nullify",
"reindex": "RUST_MIN_STACK=8388608 cargo run -- index",
"dump-accounts": "./scripts/dump.sh",
"transfer": "ts-node ./scripts/generate.ts",
"transfer-10": "ts-node ./scripts/generate_10.ts",
27 changes: 23 additions & 4 deletions forester/scripts/generate_sync.ts
@@ -21,7 +21,7 @@ const bobKeypair = [

const LAMPORTS = 1e11;
const COMPRESS_AMOUNT = 1e9;
const TOTAL_NUMBER_OF_TRANSFERS = 10;
const TOTAL_NUMBER_OF_TRANSFERS = 200;
const NUMBER_OF_CONCURRENT_TRANSFERS = 1;
const TRANSFER_AMOUNT = 10;

@@ -30,15 +30,34 @@ async function transferAsync(i: number, rpc: Rpc, payer: Signer, bobPublicKey: P
console.log(`transfer ${i} of ${TOTAL_NUMBER_OF_TRANSFERS}: ${transferSig}`);
}

function localRpc(): Rpc {
let validatorUrl = 'http://0.0.0.0:8899';
let photonUrl = 'http://0.0.0.0:8784';
let proverUrl = 'http://0.0.0.0:3001';

return createRpc(validatorUrl, photonUrl, proverUrl);
}

function zkTestnetRpc(): Rpc {
let validatorUrl = 'https://zk-testnet.helius.dev:8899';
let photonUrl = 'https://zk-testnet.helius.dev:8784';
let proverUrl = 'https://zk-testnet.helius.dev:3001';

return createRpc(validatorUrl, photonUrl, proverUrl);
}

async function prefillNullifierQueue() {
const rpc = createRpc();

const rpc = localRpc();
const payer = Keypair.fromSecretKey(Uint8Array.from(payerKeypair));
const tx1 = await airdropSol({connection: rpc, lamports: LAMPORTS, recipientPublicKey: payer.publicKey});

const bob = Keypair.fromSecretKey(Uint8Array.from(bobKeypair));

const tx1 = await airdropSol({connection: rpc, lamports: LAMPORTS, recipientPublicKey: payer.publicKey});
console.log('tx1', tx1);
console.log('Airdropping SOL to payer and bob...');
const tx2 = await airdropSol({connection: rpc, lamports: LAMPORTS, recipientPublicKey: bob.publicKey});
console.log('tx2', tx2);
console.log('Airdrop completed.');

const payerBalance = await rpc.getBalance(payer.publicKey);
console.log('payer balance', payerBalance);
4 changes: 4 additions & 0 deletions forester/scripts/solana.sh
@@ -0,0 +1,4 @@
#!/bin/sh

ROOT_DIR=$(git rev-parse --show-toplevel)
solana-test-validator --account-dir "$ROOT_DIR"/cli/accounts
2 changes: 2 additions & 0 deletions forester/src/cli.rs
@@ -8,6 +8,8 @@ pub struct Cli {
}
#[derive(Subcommand)]
pub enum Commands {
NullifyState,
NullifyAddresses,
Nullify,
Subscribe,
Index,
12 changes: 8 additions & 4 deletions forester/src/nullifier/config.rs → forester/src/config.rs
@@ -1,8 +1,10 @@
use crate::external_services_config::ExternalServicesConfig;
use solana_sdk::pubkey::Pubkey;
use solana_sdk::signature::Keypair;

pub struct Config {
pub server_url: String,
#[derive(Debug)]
pub struct ForesterConfig {
pub external_services: ExternalServicesConfig,
pub nullifier_queue_pubkey: Pubkey,
pub state_merkle_tree_pubkey: Pubkey,
pub address_merkle_tree_pubkey: Pubkey,
@@ -12,12 +14,13 @@ pub struct Config {
pub concurrency_limit: usize,
pub batch_size: usize,
pub max_retries: usize,
pub max_concurrent_batches: usize,
}

impl Clone for Config {
impl Clone for ForesterConfig {
fn clone(&self) -> Self {
Self {
server_url: self.server_url.clone(),
external_services: self.external_services.clone(),
nullifier_queue_pubkey: self.nullifier_queue_pubkey,
state_merkle_tree_pubkey: self.state_merkle_tree_pubkey,
address_merkle_tree_pubkey: self.address_merkle_tree_pubkey,
@@ -27,6 +30,7 @@ impl Clone for Config {
concurrency_limit: self.concurrency_limit,
batch_size: self.batch_size,
max_retries: self.max_retries,
max_concurrent_batches: self.max_concurrent_batches,
}
}
}
4 changes: 0 additions & 4 deletions forester/src/constants.rs

This file was deleted.

31 changes: 31 additions & 0 deletions forester/src/errors.rs
@@ -22,11 +22,42 @@
BincodeError(#[from] Box<bincode::ErrorKind>),
#[error("Indexer can't find any proofs")]
NoProofsFound,
#[error("Max retries reached")]
MaxRetriesReached,
#[error("error: {0:?}")]
Custom(String),
#[error("unknown error")]
Unknown,
}
impl ForesterError {
pub fn to_owned(&self) -> Self {
match self {
ForesterError::RpcError(e) => ForesterError::Custom(format!("RPC Error: {:?}", e)),
ForesterError::DeserializeError(e) => {
ForesterError::Custom(format!("Deserialize Error: {:?}", e))
}
ForesterError::CopyMerkleTreeError(e) => {
ForesterError::Custom(format!("Copy Merkle Tree Error: {:?}", e))
}
ForesterError::AccountCompressionError(e) => {
ForesterError::Custom(format!("Account Compression Error: {:?}", e))
}
ForesterError::HashSetError(e) => {
ForesterError::Custom(format!("HashSet Error: {:?}", e))
}
ForesterError::PhotonApiError(e) => {
ForesterError::Custom(format!("Photon API Error: {:?}", e))
}
ForesterError::BincodeError(e) => {
ForesterError::Custom(format!("Bincode Error: {:?}", e))
}
ForesterError::NoProofsFound => ForesterError::NoProofsFound,
ForesterError::MaxRetriesReached => ForesterError::MaxRetriesReached,
ForesterError::Custom(s) => ForesterError::Custom(s.clone()),
ForesterError::Unknown => ForesterError::Unknown,
}
}
}

#[derive(Error, Debug)]
pub enum PhotonApiErrorWrapper {
32 changes: 32 additions & 0 deletions forester/src/external_services_config.rs
@@ -0,0 +1,32 @@
#[derive(Debug, Clone)]
pub struct ExternalServicesConfig {
pub rpc_url: String,
pub ws_rpc_url: String,
pub indexer_url: String,
pub prover_url: String,
pub derivation: String,
}

impl ExternalServicesConfig {
pub fn local() -> Self {
Self {
rpc_url: "http://localhost:8899".to_string(),
ws_rpc_url: "ws://localhost:8900".to_string(),
indexer_url: "http://localhost:8784".to_string(),
prover_url: "http://localhost:3001".to_string(),
// derivation: "H7ZzJngDRtAGCV8Y9HwJrMpsxeNZQyYkjxw4GE8YcUG2".to_string(),
derivation: "En9a97stB3Ek2n6Ey3NJwCUJnmTzLMMEA5C69upGDuQP".to_string(),
// derivation: "ALA2cnz41Wa2v2EYUdkYHsg7VnKsbH1j7secM5aiP8k".to_string()
}
}

pub fn zktestnet() -> Self {
Self {
rpc_url: "https://zk-testnet.helius.dev:8899".to_string(),
ws_rpc_url: "ws://zk-testnet.helius.dev:8900".to_string(),
indexer_url: "https://zk-testnet.helius.dev:8784".to_string(),
prover_url: "https://zk-testnet.helius.dev:3001".to_string(),
derivation: "En9a97stB3Ek2n6Ey3NJwCUJnmTzLMMEA5C69upGDuQP".to_string(),
}
}
}
67 changes: 31 additions & 36 deletions forester/src/indexer/photon_indexer.rs
@@ -1,14 +1,11 @@
use std::str::FromStr;

use crate::utils::decode_hash;
use account_compression::initialize_address_merkle_tree::Pubkey;
use light_test_utils::indexer::{
Indexer, IndexerError, MerkleProof, MerkleProofWithAddressContext, NewAddressProofWithContext,
};
use solana_sdk::bs58;

use light_test_utils::indexer::{Indexer, IndexerError, MerkleProof, NewAddressProofWithContext};
use log::info;
use photon_api::apis::configuration::Configuration;
use photon_api::models::GetCompressedAccountsByOwnerPostRequestParams;
use solana_sdk::bs58;
use std::fmt::Debug;

pub struct PhotonIndexer {
configuration: Configuration,
@@ -25,6 +22,15 @@ impl PhotonIndexer {
}
}

impl Debug for PhotonIndexer {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
f.debug_struct("PhotonIndexer")
.field("configuration", &self.configuration)
.field("configuration", &self.configuration)
.finish()
}
}

impl Clone for PhotonIndexer {
fn clone(&self) -> Self {
PhotonIndexer {
@@ -51,6 +57,7 @@ impl Indexer for PhotonIndexer {

match result {
Ok(response) => {
// info!("Response: {:?}", response);
match response.result {
Some(result) => {
let proofs = result
@@ -109,14 +116,6 @@ impl Indexer for PhotonIndexer {
Ok(hashes)
}

async fn get_address_tree_proof(
&self,
_merkle_tree_pubkey: [u8; 32],
_address: [u8; 32],
) -> Result<MerkleProofWithAddressContext, IndexerError> {
unimplemented!("only needed for testing")
}

async fn get_multiple_new_address_proofs(
&self,
_merkle_tree_pubkey: [u8; 32],
@@ -127,6 +126,8 @@
..Default::default()
};

info!("Request: {:?}", request);

let result = photon_api::apis::default_api::get_multiple_new_address_proofs_post(
&self.configuration,
request,
@@ -137,41 +138,35 @@
return Err(IndexerError::Custom(result.err().unwrap().to_string()));
}

info!("Result: {:?}", result);
let proofs: photon_api::models::MerkleContextWithNewAddressProof =
result.unwrap().result.unwrap().value[0].clone();

// TODO: use decode_hash
let tree_pubkey = Pubkey::from_str(&proofs.merkle_tree).unwrap();
let low_address_value = Pubkey::from_str(&proofs.lower_range_address).unwrap();
let next_address_value = Pubkey::from_str(&proofs.higher_range_address).unwrap();
let tree_pubkey = decode_hash(&proofs.merkle_tree);
let low_address_value = decode_hash(&proofs.lower_range_address);
let next_address_value = decode_hash(&proofs.higher_range_address);
Ok(NewAddressProofWithContext {
merkle_tree: tree_pubkey.to_bytes(),
merkle_tree: tree_pubkey,
low_address_index: proofs.low_element_leaf_index as u64,
low_address_value: low_address_value.to_bytes(),
low_address_value,
low_address_next_index: proofs.next_index as u64,
low_address_next_value: next_address_value.to_bytes(),
low_address_next_value: next_address_value,
low_address_proof: {
let proof_vec: Vec<[u8; 32]> = proofs
let mut proof_vec: Vec<[u8; 32]> = proofs
.proof
.iter()
.map(|x: &String| decode_hash(x))
.collect();
proof_vec
proof_vec.truncate(proof_vec.len() - 10); // Remove canopy
let mut proof_arr = [[0u8; 32]; 16];
proof_arr.copy_from_slice(&proof_vec);
proof_arr
},
root: decode_hash(&proofs.root),
root_seq: proofs.root_seq as i64,
new_low_element: None,
new_element: None,
new_element_next_value: None,
})
}

fn account_nullified(&mut self, _merkle_tree_pubkey: Pubkey, _account_hash: &str) {
unimplemented!("only needed for testing")
}

fn address_tree_updated(
&mut self,
_merkle_tree_pubkey: [u8; 32],
_context: MerkleProofWithAddressContext,
) {
unimplemented!("only needed for testing")
}
}
