Solana ICCO Contract written using Anchor (#41)

* Add Solana (Anchor) Program

Co-authored-by: spacemandev <devbharel@gmail.com>
Co-authored-by: Drew <dsterioti@users.noreply.github.com>
Co-authored-by: skojenov <sekoje@users.noreply.github.com>
This commit is contained in:
Karl 2022-06-13 20:13:47 -05:00 committed by GitHub
parent 19910c7239
commit c54f4d830a
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
169 changed files with 13249 additions and 21310 deletions

.gitignore

@ -8,3 +8,5 @@ target
tilt.json
sdk/js/src/icco/*js
sdk/js/src/index.js
.vscode
.DS_Store


@ -48,10 +48,10 @@ To create a sale, a user invokes the `createSale()` method on the sale conductor
- The ATA on the Solana contributor where offered tokens will be sent
- An array of accepted tokens on each chain + the USD conversion rate which they are accepted at
The `createSale()` method deposits the offered tokens, assigns an ID that identifies the sale, and attests a `SaleInit` packet over the wormhole. This packet contains all the information from above. It will also attest a `SolanaSaleInit` packet over the wormhole if any Solana tokens are accepted as collateral in the sale.
The sale information is also stored locally.
The attested `SaleInit` packet (or the `SolanaSaleInit`) is submitted to the `TokenSaleContributor` contracts. Each contributor contract stores the sale information relevant to its chain locally.
The `TokenSaleConductor` contract can terminate the sale by calling `abortSaleBeforeStartTime()` before the sale period begins. Only the wallet that called `createSale()` can invoke this method.
@ -62,12 +62,13 @@ After the sale duration anyone can call the `attestContributions()` method on th
The `TokenSaleConductor` now collects the `Contributions` packets from all chains & tokens.
After all contributions have been collected, anyone can call the `sealSale()` method on the Conductor.
The method evaluates whether the minimum raise amount has been met using the conversion rates specified initially (a later version could use rates from an oracle at closing). The conversion rates are scaled based on the accepted token's decimals on the conductor chain relative to its decimals on the native chain; scaling the rates properly is crucial for calculating token allocations correctly. If the sale was successful, it:
- Calculates allocations and excess contributions (if total contributions sum to a value larger than the maximum raise amount)
- Excess contributions are calculated by taking the difference between the total contributions and the maximum raise amount.
Each contributor receives excess contributions proportional to their contribution amount (individualContribution / totalContributions \* totalExcessContributions)
- Emits a `SaleSealed` packet - indicating to the Contributor contracts that the sale was successful
- Emits another `SaleSealed` packet if the sale accepts Solana tokens as collateral. The message is in the same format as the original `SaleSealed` packet, but only contains information regarding Solana token allocations. This is necessary due to VAA size constraints on Solana.
- Bridges the relevant share of offered tokens to the Contributor contracts
Or in case the goal was not met, it:
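The refund math described above (individualContribution / totalContributions \* totalExcessContributions) can be sketched as follows. This is a hypothetical illustration, not the conductor's actual implementation; note the multiplication happens before the division so integer arithmetic does not truncate the share to zero:

```rust
// Sketch of the excess-contribution refund formula, assuming wide (u128)
// integer arithmetic. Names are illustrative, not from the program.
fn excess_refund(individual: u128, total: u128, total_excess: u128) -> u128 {
    if total == 0 {
        return 0;
    }
    // Multiply first, then divide, to avoid truncating small shares to zero.
    individual * total_excess / total
}

fn main() {
    // Suppose the max raise was 1_000 and total contributions were 1_250,
    // leaving 250 in excess. A buyer who put in 500 (40% of the total)
    // should get 40% of the excess back.
    assert_eq!(excess_refund(500, 1_250, 250), 100);
    println!("refund math checks out");
}
```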
@ -127,13 +128,18 @@ Owner Only:
- Token
- uint16 tokenChain
- bytes32 tokenAddress
- uint256 conversionRate
- SolanaToken
- uint8 tokenIndex
- bytes32 tokenAddress
- Contribution
- uint8 tokenIndex (index in accepted tokens array)
- uint256 contributed
- Allocation
- uint8 tokenIndex (index in accepted tokens array)
@ -161,14 +167,14 @@ SaleInit:
```
// PayloadID uint8 = 1
uint8 payloadID;
// sale ID
uint256 saleID;
// address of the token being sold, left-zero-padded if shorter than 32 bytes
bytes32 tokenAddress;
// chain ID of the token being sold
uint16 tokenChain;
// sale token decimals
uint8 tokenDecimals;
// token amount being sold
uint256 tokenAmount;
// min raise amount
@ -183,15 +189,15 @@ uint256 saleEnd;
uint8 tokensLen;
// repeated for tokensLen times, Struct 'Token'
// address of the token, left-zero-padded if shorter than 32 bytes
bytes32 tokenAddress;
// chain ID of the token
uint16 tokenChain;
// conversion rate for the token
uint256 conversionRate;
// sale token ATA for Solana
bytes32 solanaTokenAccount;
// recipient of proceeds
bytes32 recipient;
// refund recipient in case the sale is aborted
@ -213,7 +219,7 @@ uint8 contributionsLen;
// repeated contributionsLen times, Struct 'Contribution'
// index in acceptedTokens array
uint8 tokenIndex;
// contributed amount of token
uint256 contributed;
```
@ -246,3 +252,33 @@ uint8 payloadID;
// Sale ID
uint256 saleID;
```
SolanaSaleInit:
```
// PayloadID uint8 = 5
uint8 payloadID;
// sale ID
uint256 saleID;
// sale token ATA for Solana
bytes32 solanaTokenAccount;
// chain ID of the token
uint16 tokenChain;
// token decimals
uint8 tokenDecimals;
// timestamp raise start
uint256 saleStart;
// timestamp raise end
uint256 saleEnd;
// accepted tokens length
uint8 tokensLen;
// repeated for tokensLen times, Struct 'SolanaToken'
// index in acceptedTokens array
uint8 tokenIndex;
// address of the token, left-zero-padded if shorter than 32 bytes
bytes32 tokenAddress;
// recipient of proceeds
bytes32 recipient;
```
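The fixed-offset layout implied by the field sizes above can be sanity-checked with a little arithmetic. The sketch below derives each header offset purely from the sizes listed (u8 = 1 byte, u16 = 2 bytes, uint256/bytes32 = 32 bytes); it is an illustration, not the program's parser:

```rust
// Derive SolanaSaleInit header offsets from the field sizes in the spec.
fn solana_sale_init_offsets() -> (usize, usize, usize, usize, usize, usize) {
    let sale_id = 1;                      // after uint8 payloadID
    let token_account = sale_id + 32;     // after uint256 saleID
    let token_chain = token_account + 32; // after bytes32 solanaTokenAccount
    let token_decimals = token_chain + 2; // after uint16 tokenChain
    let sale_start = token_decimals + 1;  // after uint8 tokenDecimals
    let sale_end = sale_start + 32;       // after uint256 saleStart
    let tokens_len = sale_end + 32;       // after uint256 saleEnd
    (token_account, token_chain, token_decimals, sale_start, sale_end, tokens_len)
}

fn main() {
    // These agree with the byte indices used by the contributor program's
    // constants (e.g. sale start at 68, sale end at 100, token list at 132).
    assert_eq!(solana_sale_init_offsets(), (33, 65, 67, 68, 100, 132));
    println!("header offsets check out");
}
```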

anchor-contributor/.gitignore

@ -0,0 +1,7 @@
.anchor
.DS_Store
target
**/*.rs.bk
node_modules
test-ledger


@ -0,0 +1,8 @@
.anchor
.DS_Store
target
node_modules
dist
build
test-ledger


@ -0,0 +1,39 @@
[features]
seeds = false
[programs.localnet]
anchor_contributor = "Efzc4SLs1ZdTPRq95oWxdMUr9XiX5M14HABwHpvrc9Fm"
[registry]
url = "https://anchor.projectserum.com"
[provider]
cluster = "localnet"
wallet = "./tests/test_orchestrator_keypair.json"
[scripts]
test = "yarn run ts-mocha -p ./tsconfig.json -t 1000000 tests/**/*.ts"
[[test.genesis]]
address = "Bridge1p5gheXUvJ6jGWGeCsgPKgnE3YgdGKRVCMY9o"
program = "./tests/core_bridge.so"
[[test.validator.account]]
address = "FKoMTctsC7vJbEqyRiiPskPnuQx2tX1kurmvWByq5uZP"
filename = "./tests/bridge_config.json"
[[test.validator.account]]
address = "6MxkvoEwgB9EqQRLNhvYaPGhfcLtBtpBqdQugr3AZUgD"
filename = "./tests/guardian_set.json"
[[test.validator.account]]
address = "GXBsgBD3LDn3vkRZF6TfY5RqgajVZ4W5bMAdiAaaUARs"
filename = "./tests/fee_collector.json"
[[test.genesis]]
address = "B6RHG3mfcckmrYN1UhmJzyS1XX3fZKbkeUcpJe9Sy3FE"
program = "./tests/token_bridge.so"
[[test.validator.account]]
address = "3GwVs8GSLdo4RUsoXTkGQhojauQ1sXcDNjm7LSDicw19"
filename = "./tests/token_config.json"

File diff suppressed because it is too large


@ -0,0 +1,9 @@
[workspace]
members = [
"programs/*"
]
[profile.release]
overflow-checks = true
opt-level = 3
incremental = false


@ -0,0 +1,41 @@
## ICCO Built with Anchor
### Dependencies
- Solana CLI (1.10.24)
- Anchor CLI (0.24.2)
- yarn (1.22.\*)
- node (16.\*)
### Tests
Unit tests (these just run `cargo test`):
```sh
yarn run unit-test
```
Integration tests using Anchor's local validator, including Wormhole and Token Bridge interaction:
```sh
yarn run integration-test
```
**NOTE: expect one failing test, which attempts to invoke Token Bridge program to transfer contributions to conductor.**
### Deploy
Currently there is only one deployment command in yarn, which deploys the contributor contract to devnet. _If you deploy
a new conductor, you will need to replace the existing `CONDUCTOR_ADDRESS` with that newly deployed contract address._
You need to specify a `WALLET` variable, which will be used to pay for deployment.
```sh
WALLET=path/to/your/key.json yarn run deploy-devnet
```
### Other Notes
We manage compile-time constants with environment variables found in `test.env` and `devnet.env`. When it comes time
to deploy to mainnet, make a corresponding `mainnet.env` file. If you inadvertently source these files outside of
any of the provided scripts, you can run `. unset.env` to unset all these variables.

anchor-contributor/devnet.env

@ -0,0 +1,8 @@
export CONDUCTOR_CHAIN=2
export CONDUCTOR_ADDRESS="000000000000000000000000ce121ea9c289390df7d812f83ed6be79a167dfe4"
export GLOBAL_KYC_AUTHORITY="1df62f291b2e969fb0849d99d9ce41e2f137006e"
export CORE_BRIDGE_ADDRESS="3u8hJUVTA4jH1wYAyUur7FFZVQ8H635K3tSHHF4ssjQ5"
export TOKEN_BRIDGE_ADDRESS="DZnkkTmCiFWfYTfT41X3Rd1kDgozqzxWaHqsw6W4x2oe"
# misc
export BROWSER=""


@ -0,0 +1,11 @@
#!/bin/bash
set -euo pipefail
solana config set --url devnet
# WALLET must be set
ls $WALLET
. devnet.env
solana program deploy target/deploy/anchor_contributor.so -k $WALLET


@ -0,0 +1,29 @@
{
"scripts": {
"unit-test": "bash -ac '. test.env && cargo test'",
"integration-test": "bash -ac '. test.env && anchor test'",
"deploy-devnet": "bash migrations/deploy-devnet.sh",
"lint:fix": "prettier */*.js \"*/**/*{.js,.ts}\" -w",
"lint": "prettier */*.js \"*/**/*{.js,.ts}\" --check"
},
"dependencies": {
"@certusone/wormhole-sdk": "^0.3.4",
"@project-serum/anchor": "^0.24.2",
"@solana/spl-token": "^0.2.0",
"byteify": "^2.0.10",
"elliptic": "^6.5.4",
"ethers": "^5.6.8",
"keccak256": "^1.0.6",
"web3-utils": "^1.7.3"
},
"devDependencies": {
"@types/bn.js": "^5.1.0",
"@types/chai": "^4.3.0",
"@types/mocha": "^9.0.0",
"chai": "^4.3.4",
"mocha": "^9.0.3",
"prettier": "^2.6.2",
"ts-mocha": "^10.0.0",
"typescript": "^4.3.5"
}
}


@ -0,0 +1,31 @@
[package]
name = "anchor-contributor"
version = "0.1.0"
description = "ICCO Contributor"
edition = "2021"
[lib]
crate-type = ["cdylib", "lib"]
name = "anchor_contributor"
[features]
no-entrypoint = []
no-idl = []
no-log-ix-name = []
cpi = ["no-entrypoint"]
default = []
localhost = []
[profile.release]
overflow-checks = true
[dependencies]
anchor-lang = { version= "0.24.2", features = ["init-if-needed"]}
anchor-spl = "0.24.2"
spl-token = "3.3.0"
num-traits = "0.2"
num-derive = "0.3"
borsh = "0.9.3"
hex = "0.4.3"
num = "0.4"
itertools = "0.8"


@ -0,0 +1,45 @@
// seed prefixes
pub const SEED_PREFIX_CUSTODIAN: &str = "icco-custodian";
pub const SEED_PREFIX_SALE: &str = "icco-sale";
pub const SEED_PREFIX_BUYER: &str = "icco-buyer";
pub const CHAIN_ID: u16 = 1;
// vaa payload types
pub const PAYLOAD_SALE_INIT_SOLANA: u8 = 5; // 1 for everyone else
pub const PAYLOAD_ATTEST_CONTRIBUTIONS: u8 = 2;
pub const PAYLOAD_SALE_SEALED: u8 = 3;
pub const PAYLOAD_SALE_ABORTED: u8 = 4;
// universal
pub const PAYLOAD_HEADER_LEN: usize = 33; // payload id (1 byte) + sale id (32 bytes)
pub const INDEX_SALE_ID: usize = 1;
// for sale init
pub const INDEX_SALE_INIT_TOKEN_ADDRESS: usize = 33;
pub const INDEX_SALE_INIT_TOKEN_CHAIN: usize = 65;
pub const INDEX_SALE_INIT_TOKEN_DECIMALS: usize = 67;
pub const INDEX_SALE_INIT_SALE_START: usize = 68;
pub const INDEX_SALE_INIT_SALE_END: usize = 100;
pub const INDEX_SALE_INIT_ACCEPTED_TOKENS_START: usize = 132;
pub const ACCEPTED_TOKEN_NUM_BYTES: usize = 33;
pub const ACCEPTED_TOKENS_MAX: usize = 8;
pub const INDEX_ACCEPTED_TOKEN_INDEX: usize = 0;
pub const INDEX_ACCEPTED_TOKEN_ADDRESS: usize = 1;
pub const INDEX_ACCEPTED_TOKEN_END: usize = 33;
// for attest contributions
pub const ATTEST_CONTRIBUTIONS_ELEMENT_LEN: usize = 33; // token index + amount
// for sale sealed
pub const INDEX_SALE_SEALED_ALLOCATIONS_START: usize = 33;
pub const ALLOCATION_NUM_BYTES: usize = 65;
pub const INDEX_ALLOCATIONS_AMOUNT: usize = 1;
pub const INDEX_ALLOCATIONS_EXCESS: usize = 33;
pub const INDEX_ALLOCATIONS_END: usize = 65;
// misc
pub const PAD_U8: usize = 31;
pub const PAD_U64: usize = 24;
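As a rough illustration of how the accepted-token constants above are used: each entry is `ACCEPTED_TOKEN_NUM_BYTES` (33) long, a 1-byte token index followed by a 32-byte left-padded address. The helper below is hypothetical, not the program's actual parser:

```rust
// Each accepted-token entry: 1-byte token index + 32-byte address.
const ACCEPTED_TOKEN_NUM_BYTES: usize = 33;

fn parse_accepted_tokens(entries: &[u8]) -> Vec<(u8, [u8; 32])> {
    entries
        .chunks_exact(ACCEPTED_TOKEN_NUM_BYTES)
        .map(|chunk| {
            let mut addr = [0u8; 32];
            addr.copy_from_slice(&chunk[1..33]);
            (chunk[0], addr)
        })
        .collect()
}

fn main() {
    // Two fake entries: index 0 with an all-zero address, then index 1
    // with an all-0xFF address.
    let mut buf = vec![0u8; 33];
    buf.push(1u8);
    buf.extend([0xFFu8; 32]);
    let tokens = parse_accepted_tokens(&buf);
    assert_eq!(tokens.len(), 2);
    assert_eq!(tokens[1], (1, [0xFF; 32]));
    println!("parsed {} accepted tokens", tokens.len());
}
```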


@ -0,0 +1,647 @@
use anchor_lang::{
prelude::*,
solana_program::sysvar::{clock, rent},
};
use anchor_spl::token::{Mint, Token, TokenAccount};
use crate::{
constants::*,
state::{Buyer, Custodian, Sale},
};
/// Context allows contract owner to create an account that acts
/// to hold all associated token accounts for all sales.
/// See `create_custodian` instruction in lib.rs.
///
/// Mutable
/// * `custodian`
/// * `payer` (signer)
#[derive(Accounts)]
pub struct CreateCustodian<'info> {
#[account(
init,
payer = payer,
seeds = [
SEED_PREFIX_CUSTODIAN.as_bytes(),
],
bump,
space = 8 + Custodian::MAXIMUM_SIZE,
)]
pub custodian: Account<'info, Custodian>,
#[account(mut)]
pub payer: Signer<'info>,
pub system_program: Program<'info, System>,
}
/// Context provides all accounts required for someone to initialize a sale
/// with a signed VAA sent by the conductor. A `Sale` is created at this step,
/// which will be used for future actions.
/// See `init_sale` instruction in lib.rs.
///
/// Immutable
/// * `custodian`
/// * `core_bridge_vaa`
/// * `sale_token_mint`
/// * `custodian_sale_token_acct`
///
/// Mutable
/// * `sale`
/// * `payer` (signer)
#[derive(Accounts)]
pub struct InitSale<'info> {
#[account(
seeds = [
SEED_PREFIX_CUSTODIAN.as_bytes(),
],
bump,
)]
pub custodian: Account<'info, Custodian>,
#[account(
init,
seeds = [
SEED_PREFIX_SALE.as_bytes(),
&Custodian::get_sale_id_from_vaa(&core_bridge_vaa)?,
],
payer = payer,
bump,
space = 8 + Sale::MAXIMUM_SIZE
)]
pub sale: Account<'info, Sale>,
#[account(
constraint = core_bridge_vaa.owner.key() == Custodian::wormhole()?
)]
/// CHECK: This account is owned by Core Bridge so we trust it
pub core_bridge_vaa: AccountInfo<'info>,
pub sale_token_mint: Account<'info, Mint>,
#[account(
constraint = custodian_sale_token_acct.mint == sale_token_mint.key(),
constraint = custodian_sale_token_acct.owner == custodian.key(),
)]
pub custodian_sale_token_acct: Account<'info, TokenAccount>,
#[account(mut)]
pub payer: Signer<'info>,
pub system_program: Program<'info, System>,
}
/// Context provides all accounts required for user to send contribution
/// to ongoing sale.
/// See `contribute` instruction in lib.rs.
///
/// Immutable
/// * `custodian`
///
/// Mutable
/// * `sale`
/// * `buyer`
/// * `buyer_token_acct`
/// * `custodian_token_acct`
/// * `owner` (signer)
#[derive(Accounts)]
pub struct Contribute<'info> {
#[account(
seeds = [
SEED_PREFIX_CUSTODIAN.as_bytes(),
],
bump,
)]
pub custodian: Account<'info, Custodian>,
#[account(
mut,
seeds = [
SEED_PREFIX_SALE.as_bytes(),
&sale.id,
],
bump,
)]
pub sale: Account<'info, Sale>,
#[account(
init_if_needed,
seeds = [
SEED_PREFIX_BUYER.as_bytes(),
&sale.id,
&owner.key().as_ref(),
],
payer = owner,
bump,
space = 8 + Buyer::MAXIMUM_SIZE,
)]
pub buyer: Account<'info, Buyer>,
#[account(
mut,
constraint = buyer_token_acct.mint == custodian_token_acct.mint,
constraint = buyer_token_acct.owner == owner.key(),
)]
pub buyer_token_acct: Account<'info, TokenAccount>,
#[account(
mut,
constraint = custodian_token_acct.owner == custodian.key(),
)]
pub custodian_token_acct: Account<'info, TokenAccount>,
#[account(mut)]
pub owner: Signer<'info>,
pub system_program: Program<'info, System>,
pub token_program: Program<'info, Token>,
}
/// Context provides all accounts required to attest contributions.
/// See `attest_contributions` instruction in lib.rs.
///
/// Immutable
/// * `sale`
/// * `core_bridge`
/// * `clock`
/// * `rent`
///
/// Mutable
/// * `wormhole_config`
/// * `wormhole_fee_collector`
/// * `wormhole_emitter`
/// * `wormhole_sequence`
/// * `wormhole_message`
/// * `payer` (signer)
#[derive(Accounts)]
pub struct AttestContributions<'info> {
#[account(
seeds = [
SEED_PREFIX_SALE.as_bytes(),
&sale.id,
],
bump,
)]
pub sale: Account<'info, Sale>,
#[account(
constraint = wormhole.key() == Custodian::wormhole()?
)]
/// CHECK: Wormhole Program
pub wormhole: AccountInfo<'info>,
#[account(
mut,
seeds = [
b"Bridge".as_ref()
],
bump,
seeds::program = Custodian::wormhole()?
)]
/// CHECK: Wormhole Config
pub wormhole_config: AccountInfo<'info>,
#[account(
mut,
seeds = [
b"fee_collector".as_ref()
],
bump,
seeds::program = Custodian::wormhole()?
)]
/// CHECK: Wormhole Fee Collector
pub wormhole_fee_collector: AccountInfo<'info>,
#[account(
mut,
seeds = [
b"emitter".as_ref(),
],
bump
)]
/// CHECK: Wormhole Emitter is this program
pub wormhole_emitter: AccountInfo<'info>,
#[account(
mut,
seeds = [
b"Sequence".as_ref(),
wormhole_emitter.key().as_ref()
],
bump,
seeds::program = Custodian::wormhole()?
)]
/// CHECK: Wormhole Sequence Number
pub wormhole_sequence: AccountInfo<'info>,
#[account(
mut,
seeds = [
b"attest-contributions".as_ref(),
&sale.id,
],
bump,
)]
/// CHECK: Wormhole Message Storage
pub wormhole_message: AccountInfo<'info>,
#[account(
constraint = clock.key() == clock::id()
)]
/// CHECK: Clock
pub clock: AccountInfo<'info>,
#[account(
constraint = rent.key() == rent::id()
)]
/// CHECK: Rent
pub rent: AccountInfo<'info>,
#[account(mut)]
pub payer: Signer<'info>,
pub system_program: Program<'info, System>,
}
/// Context provides all accounts required to bridge tokens to recipient.
/// See `bridge_sealed_contribution` instruction in lib.rs.
///
/// Immutable
/// * `custodian`
/// * `wormhole`
/// * `token_bridge`
/// * `custody_signer`
/// * `token_mint_signer`
/// * `token_bridge_config`
/// * `clock`
/// * `rent`
///
/// Mutable
/// * `sale`
/// * `custodian_token_acct`
/// * `accepted_mint`
/// * `custody_or_wrapped_meta`
/// * `authority_signer`
/// * `wormhole_config`
/// * `wormhole_fee_collector`
/// * `wormhole_emitter`
/// * `wormhole_sequence`
/// * `wormhole_message`
/// * `payer` (signer)
#[derive(Accounts)]
pub struct BridgeSealedContribution<'info> {
#[account(
seeds = [
SEED_PREFIX_CUSTODIAN.as_bytes(),
],
bump,
)]
pub custodian: Account<'info, Custodian>,
#[account(
mut,
seeds = [
SEED_PREFIX_SALE.as_bytes(),
&sale.id,
],
bump,
)]
pub sale: Account<'info, Sale>,
/// CHECK: Check if owned by ATA Program
#[account(
mut,
constraint = custodian_token_acct.owner == custodian.key(),
constraint = custodian_token_acct.mint == accepted_mint.key()
)]
pub custodian_token_acct: Box<Account<'info, TokenAccount>>,
#[account(mut)]
/// CHECK: Check if owned by SPL Account. Token Bridge needs this to be mutable
pub accepted_mint: Box<Account<'info, Mint>>,
#[account(mut)]
pub payer: Signer<'info>,
pub token_program: Program<'info, Token>,
pub system_program: Program<'info, System>,
#[account(
constraint = token_bridge.key() == Custodian::token_bridge()?
)]
/// CHECK: Token Bridge Program
pub token_bridge: AccountInfo<'info>,
#[account(mut)]
/// CHECK: Will either be token bridge custody account or wrapped meta account
pub custody_or_wrapped_meta: AccountInfo<'info>,
#[account(
seeds=[b"custody_signer"],
bump,
seeds::program = token_bridge.key()
)]
/// CHECK: Only used for bridging assets native to Solana.
pub custody_signer: AccountInfo<'info>,
#[account(
seeds=[b"mint_signer"],
bump,
seeds::program = token_bridge.key()
)]
/// CHECK: We know what we're doing Mr. Anchor ;)
pub token_mint_signer: AccountInfo<'info>,
#[account(
seeds=[b"authority_signer"],
bump,
seeds::program = token_bridge.key()
)]
/// CHECK: Token Bridge Authority Signer, delegated approval for transfer
pub authority_signer: AccountInfo<'info>,
#[account(
mut,
seeds = [
b"config".as_ref()
],
bump,
seeds::program = Custodian::token_bridge()?
)]
/// CHECK: Token Bridge Config
pub token_bridge_config: AccountInfo<'info>,
#[account(
constraint = wormhole.key() == Custodian::wormhole()?
)]
/// CHECK: Wormhole Program
pub wormhole: AccountInfo<'info>,
#[account(
mut,
seeds = [
b"Bridge".as_ref()
],
bump,
seeds::program = Custodian::wormhole()?
)]
/// CHECK: Wormhole Config
pub wormhole_config: AccountInfo<'info>,
#[account(
mut,
seeds = [
b"fee_collector".as_ref()
],
bump,
seeds::program = Custodian::wormhole()?
)]
/// CHECK: Wormhole Fee Collector
pub wormhole_fee_collector: AccountInfo<'info>,
#[account(
mut,
seeds = [
b"emitter".as_ref(),
],
bump,
seeds::program = Custodian::token_bridge()?
)]
/// CHECK: Wormhole Emitter is the Token Bridge Program
pub wormhole_emitter: AccountInfo<'info>,
#[account(
mut,
seeds = [
b"Sequence".as_ref(),
wormhole_emitter.key().as_ref()
],
bump,
seeds::program = Custodian::wormhole()?
)]
/// CHECK: Wormhole Sequence Number
pub wormhole_sequence: AccountInfo<'info>,
#[account(
mut,
seeds = [
b"bridge-sealed".as_ref(),
&sale.id,
&accepted_mint.key().as_ref(),
],
bump,
)]
/// CHECK: Wormhole Message Storage
pub wormhole_message: AccountInfo<'info>,
#[account(
constraint = clock.key() == clock::id()
)]
/// CHECK: Clock
pub clock: AccountInfo<'info>,
#[account(
constraint = rent.key() == rent::id()
)]
/// CHECK: Rent
pub rent: AccountInfo<'info>,
}
/// Context provides all accounts required for someone to abort a sale
/// with a signed VAA sent by the conductor (sale didn't meet min raise).
/// See `abort_sale` instruction in lib.rs.
///
/// Immutable
/// * `custodian`
/// * `core_bridge_vaa`
///
/// Mutable
/// * `sale`
/// * `owner` (signer)
#[derive(Accounts)]
pub struct AbortSale<'info> {
#[account(
seeds = [
SEED_PREFIX_CUSTODIAN.as_bytes(),
],
bump,
)]
pub custodian: Account<'info, Custodian>,
#[account(
mut,
seeds = [
SEED_PREFIX_SALE.as_bytes(),
&sale.id,
],
bump,
)]
pub sale: Account<'info, Sale>,
#[account(
constraint = core_bridge_vaa.owner.key() == Custodian::wormhole()?
)]
/// CHECK: This account is owned by Core Bridge so we trust it
pub core_bridge_vaa: AccountInfo<'info>,
pub system_program: Program<'info, System>,
}
/// Context provides all accounts required for someone to seal a sale
/// with a signed VAA sent by the conductor (sale met at least min raise).
/// See `seal_sale` instruction in lib.rs.
///
/// Immutable
/// * `custodian`
/// * `core_bridge_vaa`
/// * `custodian_sale_token_acct`
///
/// Mutable
/// * `sale`
/// * `owner` (signer)
#[derive(Accounts)]
pub struct SealSale<'info> {
#[account(
seeds = [
SEED_PREFIX_CUSTODIAN.as_bytes(),
],
bump,
)]
pub custodian: Account<'info, Custodian>,
#[account(
mut,
seeds = [
SEED_PREFIX_SALE.as_bytes(),
&sale.id,
],
bump,
)]
pub sale: Account<'info, Sale>,
#[account(
constraint = core_bridge_vaa.owner.key() == Custodian::wormhole()?
)]
/// CHECK: This account is owned by Core Bridge so we trust it
pub core_bridge_vaa: AccountInfo<'info>,
#[account(
constraint = custodian_sale_token_acct.mint == sale.sale_token_mint,
constraint = custodian_sale_token_acct.owner == custodian.key(),
)]
pub custodian_sale_token_acct: Account<'info, TokenAccount>,
pub system_program: Program<'info, System>,
}
/// Context provides all accounts required for user to claim his allocation
/// and excess contributions after the sale has been sealed.
/// See `claim_allocation` instruction in lib.rs.
///
/// Immutable
/// * `custodian`
/// * `sale`
///
/// Mutable
/// * `buyer`
/// * `custodian_sale_token_acct`
/// * `buyer_sale_token_acct`
/// * `owner` (signer)
///
/// NOTE: With `claim_allocation`, remaining accounts are passed in
/// depending on however many accepted tokens there are for a given sale.
#[derive(Accounts)]
pub struct ClaimAllocation<'info> {
#[account(
seeds = [
SEED_PREFIX_CUSTODIAN.as_bytes(),
],
bump,
)]
pub custodian: Account<'info, Custodian>,
#[account(
seeds = [
SEED_PREFIX_SALE.as_bytes(),
&sale.id,
],
bump,
)]
pub sale: Account<'info, Sale>,
#[account(
mut,
seeds = [
SEED_PREFIX_BUYER.as_bytes(),
&sale.id,
&owner.key().as_ref(),
],
bump,
)]
pub buyer: Account<'info, Buyer>,
#[account(
mut,
constraint = custodian_sale_token_acct.mint == sale.sale_token_mint,
constraint = custodian_sale_token_acct.owner == custodian.key(),
)]
pub custodian_sale_token_acct: Account<'info, TokenAccount>,
#[account(
mut,
constraint = buyer_sale_token_acct.mint == sale.sale_token_mint,
constraint = buyer_sale_token_acct.owner == owner.key(),
)]
pub buyer_sale_token_acct: Account<'info, TokenAccount>,
#[account(mut)]
pub owner: Signer<'info>,
pub system_program: Program<'info, System>,
pub token_program: Program<'info, Token>,
}
/// Context provides all accounts required for user to claim his refunds
/// after the sale has been aborted.
/// See `claim_refunds` instruction in lib.rs.
///
/// Immutable
/// * `custodian`
/// * `sale`
///
/// Mutable
/// * `buyer`
/// * `owner` (signer)
///
/// NOTE: With `claim_refunds`, remaining accounts are passed in
/// depending on however many accepted tokens there are for a given sale.
#[derive(Accounts)]
pub struct ClaimRefunds<'info> {
#[account(
seeds = [
SEED_PREFIX_CUSTODIAN.as_bytes(),
],
bump,
)]
pub custodian: Account<'info, Custodian>,
#[account(
seeds = [
SEED_PREFIX_SALE.as_bytes(),
&sale.id,
],
bump,
)]
pub sale: Account<'info, Sale>,
#[account(
mut,
seeds = [
SEED_PREFIX_BUYER.as_bytes(),
&sale.id,
&owner.key().as_ref(),
],
bump,
)]
pub buyer: Account<'info, Buyer>,
#[account(mut)]
pub owner: Signer<'info>,
pub system_program: Program<'info, System>,
pub token_program: Program<'info, Token>,
}


@ -0,0 +1,66 @@
use anchor_lang::{
prelude::Result,
solana_program::{keccak, secp256k1_recover::secp256k1_recover},
};
use crate::error::ContributorError;
pub fn ethereum_ecrecover(sig: &[u8], msg: &[u8; 32]) -> Result<[u8; 20]> {
let recovered = secp256k1_recover(&msg[..], sig[64], &sig[0..64])
.map_err(|_| ContributorError::EcdsaRecoverFailure)?;
let hash = keccak::hash(&recovered.to_bytes());
let mut pubkey = [0u8; 20];
pubkey.copy_from_slice(&hash.to_bytes()[12..32]);
Ok(pubkey)
}
#[cfg(test)]
pub mod test {
use super::*;
use itertools::izip;
#[test]
fn test_ethereum_ecrecover() -> Result<()> {
let msgs = [
"d62efc12bf7722b6cb53a67ce1179e6c3ef88daab5aa33e55c8ded771480802d",
"ae82e15be2effa4800bc09610d54512abe1f52be6802a87385b895a6c8e4e0fd",
"13dec14fa12d44fc90d66b322d9f2302590660b205c152b030ab4aafbea4aa6f",
"7766bedc7da3c5bb93a70dcba06eda741f8da7732926d80a33e319d1a57b3e1b",
"3ab687ea6e0e44807ddca6dec757d1529dc464b11819ee91a561195f52511235",
"987de2c8ff7d375fafca3e44b4a0251ea7dd964e6b33e4e9e94bc7dfe5acff2e",
"4cd4b6e793aab5e2a8a263459e089ea2299c35ed051328a92033483afde66751",
"fd6db9ce2c79df1dedaf32efa80af39a59f354a885d26391f0ae94e43652d87d",
];
let signatures = [
"dc4d6e7afa4d286eeec1547d5bc1631d25b20748c6152b803ddc124debfbc2f95f93e61e2c6c0e3fa9a1d7d060da5901b94c1769d7e76fb083087320e853885400",
"644659488ec8976cbc3a6b8118c826ca9753056136044d1b5bc62dea21bde8c44e2c4adc607bf32850f603ecf67a7028828f2fdc5bcf430b2c64f406bf1bffe400",
"8b79f0f57c2a4e0ce4f9725c1e0f5f2b639cfbb03439bc6454ee59b5c46fb2cb3a562b272e9ffd1ea6e121292e4746298d44450a4d1554820cd7f93fd518c3a801",
"4f8889df8c744e8c041e7f7aaf133e1da6708357d400d0ea7f19c15b70c1c0b37c8a3ec23d841ecb05e216a53f7c22e435185e51e557bfd522511309a0af0bfd01",
"c51dffa4f5c4e3b3a1710f2ca7e420e89763b1444ed5caa1e137419bc278447365849310a593e9b00bfc9328605c71e3c36a115fe6aa961b3c8ef26a6f4a596401",
"affc1f53934c7d7519a1078442b748ed9392e171dd1e8501f64156d7b0f172184f8cea200d928c4d27c5cdea51810f4a5266a41f6faac6aa5dfe01b47d59939901",
"3d3b32d7b56d7d304a68d8543a2ddbbe8ead030c080314003c074b07b277368c13559b915e1d3323ef7a688a1d928938d43b68d47e7beb4b862edd021abc6d8101",
"7e721bef8fb497a6f3cc383abe3a93d307e6a16929d3e20a2265fac68726f4024efed0ec1e9a5be3984982412e62575d0b143bf867f382fea2f39a205e5377ba01",
];
let signer_public_key = "1df62f291b2e969fb0849d99d9ce41e2f137006e";
let signer_public_key: [u8; 20] =
hex::decode(signer_public_key).unwrap().try_into().unwrap();
for (msg, signature) in izip!(msgs.iter(), signatures.iter()) {
let msg = hex::decode(msg).unwrap();
assert!(msg.len() == 32, "msg.len != 32");
let mut fixed = [0u8; 32];
fixed.copy_from_slice(&msg);
let signature = hex::decode(signature).unwrap();
assert!(signature.len() == 65, "signature.len != 65");
let recovered = ethereum_ecrecover(&signature, &fixed)?;
assert!(recovered == signer_public_key, "recovered != expected");
}
Ok(())
}
}


@ -0,0 +1,5 @@
pub const CONDUCTOR_CHAIN: &str = std::env!("CONDUCTOR_CHAIN");
pub const CONDUCTOR_ADDRESS: &str = std::env!("CONDUCTOR_ADDRESS");
pub const GLOBAL_KYC_AUTHORITY: &str = std::env!("GLOBAL_KYC_AUTHORITY");
pub const CORE_BRIDGE_ADDRESS: &str = std::env!("CORE_BRIDGE_ADDRESS");
pub const TOKEN_BRIDGE_ADDRESS: &str = std::env!("TOKEN_BRIDGE_ADDRESS");


@ -0,0 +1,97 @@
use anchor_lang::prelude::error_code;
#[error_code]
pub enum ContributorError {
#[msg("AlreadyClaimed")]
AlreadyClaimed,
#[msg("AmountTooLarge")]
AmountTooLarge,
#[msg("BuyerInactive")]
BuyerInactive,
#[msg("ContributeDeactivated")]
ContributeDeactivated,
#[msg("ContributionTooEarly")]
ContributionTooEarly,
#[msg("EcdsaRecoverFailure")]
EcdsaRecoverFailure,
#[msg("InsufficientFunds")]
InsufficientFunds,
#[msg("InvalidAcceptedTokenPayload")]
InvalidAcceptedTokenPayload,
#[msg("InvalidTokensAccepted")]
InvalidAcceptedTokens,
#[msg("InvalidAccount")]
InvalidAccount,
#[msg("InvalidConductorChain")]
InvalidConductorChain,
#[msg("InvalidConductorAddress")]
InvalidConductorAddress,
#[msg("InvalidKycAuthority")]
InvalidKycAuthority,
#[msg("InvalidKycSignature")]
InvalidKycSignature,
#[msg("InvalidSale")]
InvalidSale,
#[msg("InvalidRemainingAccounts")]
InvalidRemainingAccounts,
#[msg("InvalidTokenBridgeAddress")]
InvalidTokenBridgeAddress,
#[msg("InvalidTokenDecimals")]
InvalidTokenDecimals,
#[msg("InvalidTokenIndex")]
InvalidTokenIndex,
#[msg("InvalidVaaAction")]
InvalidVaaAction,
#[msg("InvalidWormholeAddress")]
InvalidWormholeAddress,
#[msg("InvalidVaaPayload")]
InvalidVaaPayload,
#[msg("NothingToClaim")]
NothingToClaim,
#[msg("SaleAlreadyInitialized")]
SaleAlreadyInitialized,
#[msg("SaleEnded")]
SaleEnded,
#[msg("SaleNotAborted")]
SaleNotAborted,
#[msg("SaleNotAttestable")]
SaleNotAttestable,
#[msg("SaleNotFinished")]
SaleNotFinished,
#[msg("SaleNotSealed")]
SaleNotSealed,
#[msg("TooManyAcceptedTokens")]
TooManyAcceptedTokens,
#[msg("TransferNotAllowed")]
TransferNotAllowed,
}


@ -0,0 +1,736 @@
use anchor_lang::prelude::*;
mod constants;
mod context;
mod cryptography;
mod env;
mod error;
mod state;
mod token_bridge;
mod wormhole;
use constants::*;
use context::*;
use error::*;
use token_bridge::*;
use wormhole::*;
declare_id!("Efzc4SLs1ZdTPRq95oWxdMUr9XiX5M14HABwHpvrc9Fm"); // same address on Solana devnet
#[program]
pub mod anchor_contributor {
use super::*;
use anchor_lang::solana_program::{
borsh::try_from_slice_unchecked,
instruction::Instruction,
program::{invoke, invoke_signed},
program_option::COption,
system_instruction::transfer,
sysvar::*,
};
use anchor_spl::*;
use itertools::izip;
use state::custodian::Custodian;
/// Instruction to create the custodian account (which we refer to as `custodian`)
/// in all instruction contexts found in contexts.rs.
pub fn create_custodian(ctx: Context<CreateCustodian>) -> Result<()> {
// We save the "owner" in the custodian account. But the custodian does not
// need to be mutated in future interactions with the program.
ctx.accounts.custodian.new()?;
Ok(())
}
/// Instruction to initialize a sale. This parses an inbound signed VAA sent
/// by the conductor. A Sale account will be created at this point seeded by
/// the sale ID found in the signed VAA.
///
/// Included with this transaction is the sale token's mint public key and the
/// custodian's associated token account for the sale token. We need to verify
/// that the sale token info provided in the context is the same as what we parse
/// from the signed VAA.
pub fn init_sale(ctx: Context<InitSale>) -> Result<()> {
// We need to verify that the signed VAA was emitted from the conductor program
// that the contributor program knows.
let msg = ctx.accounts.custodian.parse_and_verify_conductor_vaa(
&ctx.accounts.core_bridge_vaa,
PAYLOAD_SALE_INIT_SOLANA,
)?;
// Once verified, we deserialize the VAA payload to initialize the Sale
// account with information relevant to perform future actions regarding
// this particular sale. It uses a 32-byte ID generated from the VAA as its
// identifier.
let sale = &mut ctx.accounts.sale;
sale.parse_sale_init(&msg.payload)?;
// The VAA encoded the Custodian's associated token account for the sale token. We
// need to verify that the ATA that we have in the context is the same one the message
// refers to.
require!(
sale.associated_sale_token_address == ctx.accounts.custodian_sale_token_acct.key(),
ContributorError::InvalidVaaPayload
);
// We want to save the sale token's mint information in the Sale struct. Most
// important of which is the number of decimals for this SPL token. The sale
// token that lives on the conductor chain can have a different number of decimals.
// Given how Portal works in attesting tokens, the foreign decimals will always
// be at least the amount found here.
sale.set_sale_token_mint_info(
&ctx.accounts.sale_token_mint.key(),
&ctx.accounts.sale_token_mint,
)?;
// Finish instruction.
Ok(())
}
/// Instruction to contribute to an ongoing sale. The sale account needs to be mutable so we
/// can uptick the total contributions for this sale. A buyer account will be created if it
/// hasn't been already from a previous contribution, seeded by the sale ID and the buyer's
/// public key.
///
/// As a part of this instruction, we need to verify that the contribution is allowed by checking
/// a signature provided from an outside source (a know-your-customer entity).
///
/// Once everything is verified, the sale and buyer accounts are updated to reflect this
/// contribution amount and the contribution will be transferred from the buyer's
/// associated token account to the custodian's associated token account.
pub fn contribute(ctx: Context<Contribute>, amount: u64, kyc_signature: Vec<u8>) -> Result<()> {
// We need the buyer's associated token account to find the token index
// for the particular mint the buyer wishes to contribute.
let sale = &ctx.accounts.sale;
let buyer_token_acct = &ctx.accounts.buyer_token_acct;
let (idx, asset) = sale.get_total_info(&buyer_token_acct.mint)?;
let token_index = asset.token_index;
// If the buyer account wasn't initialized before, we will do so here. This initializes
// the state for all of this buyer's contributions.
let buyer = &mut ctx.accounts.buyer;
if !buyer.initialized {
buyer.initialize(sale.totals.len());
}
// We verify the KYC signature by encoding specific details of this contribution the
// same way the KYC entity signed for the transaction. If we cannot recover the KYC's
// public key using ecdsa recovery, we cannot allow the contribution to continue.
//
// We also refer to the buyer (owner) of this instruction as the transfer_authority
// for the SPL transfer that will happen at the end of all the accounting processing.
let transfer_authority = &ctx.accounts.owner;
sale.verify_kyc_authority(
token_index,
amount,
&transfer_authority.key(),
buyer.contributions[idx].amount,
&kyc_signature,
)?;
// We need to grab the current block's timestamp and verify that the buyer is allowed
// to contribute now. A user cannot contribute before the sale has started. If all the
// sale checks pass, the Sale's total contributions uptick to reflect this buyer's
// contribution.
let clock = Clock::get()?;
let sale = &mut ctx.accounts.sale;
sale.update_total_contributions(clock.unix_timestamp, token_index, amount)?;
// And we do the same with the Buyer account.
buyer.contribute(idx, amount)?;
// Finally transfer SPL tokens from the buyer's associated token account to the
// custodian's associated token account.
let custodian_token_acct = &ctx.accounts.custodian_token_acct;
token::transfer(
CpiContext::new(
ctx.accounts.token_program.to_account_info(),
token::Transfer {
from: buyer_token_acct.to_account_info(),
to: custodian_token_acct.to_account_info(),
authority: transfer_authority.to_account_info(),
},
),
amount,
)?;
// For some reason, using the anchor_spl library did not work on
// Solana devnet for one of our test trials, so we're keeping the
// native instruction here in case we need it.
//
// invoke(
// &spl_token::instruction::transfer(
// &token::ID,
// &buyer_token_acct.key(),
// &custodian_token_acct.key(),
// &transfer_authority.key(),
// &[&transfer_authority.key()],
// amount,
// )?,
// &ctx.accounts.to_account_infos(),
// )?;
// Finish instruction.
Ok(())
}
/// Instruction to attest contributions once the sale's contribution period expires. We cannot
/// attest contributions before then.
///
/// As a part of this instruction, we send a VAA to the conductor so it can factor in this
/// contributor's contributions when checking that the minimum raise is met.
pub fn attest_contributions(ctx: Context<AttestContributions>) -> Result<()> {
// Use the current block's time to check to see if we are allowed to attest contributions.
// If we can, serialize the VAA payload.
let clock = Clock::get()?;
let vaa_payload = ctx
.accounts
.sale
.serialize_contributions(clock.unix_timestamp)?;
// Prepare to send attest contribution payload via Wormhole.
let bridge_data: BridgeData =
try_from_slice_unchecked(&ctx.accounts.wormhole_config.data.borrow_mut())?;
// Prior to sending the VAA, we need to pay Wormhole a fee in order to
// use it.
let payer = &ctx.accounts.payer;
invoke(
&transfer(
&payer.key(),
&ctx.accounts.wormhole_fee_collector.key(),
bridge_data.config.fee,
),
&ctx.accounts.to_account_infos(),
)?;
// Post VAA to our Wormhole message account so it can be signed by the guardians
// and received by the conductor.
invoke_signed(
&Instruction {
program_id: ctx.accounts.wormhole.key(),
accounts: vec![
AccountMeta::new(ctx.accounts.wormhole_config.key(), false),
AccountMeta::new(ctx.accounts.wormhole_message.key(), true),
AccountMeta::new_readonly(ctx.accounts.wormhole_emitter.key(), true),
AccountMeta::new(ctx.accounts.wormhole_sequence.key(), false),
AccountMeta::new(payer.key(), true),
AccountMeta::new(ctx.accounts.wormhole_fee_collector.key(), false),
AccountMeta::new_readonly(ctx.accounts.clock.key(), false),
AccountMeta::new_readonly(ctx.accounts.rent.key(), false),
AccountMeta::new_readonly(ctx.accounts.system_program.key(), false),
],
data: (
wormhole::Instruction::PostMessage,
PostMessageData {
nonce: 0, // should only be emitted once, so no need for nonce
payload: vaa_payload,
consistency_level: wormhole::ConsistencyLevel::Confirmed,
},
)
.try_to_vec()?,
},
&ctx.accounts.to_account_infos(),
&[
&[
&b"attest-contributions".as_ref(),
&ctx.accounts.sale.id,
&[ctx.bumps["wormhole_message"]],
],
&[&b"emitter".as_ref(), &[ctx.bumps["wormhole_emitter"]]],
],
)?;
// Finish instruction.
Ok(())
}
/// Instruction to seal the current sale. This parses an inbound signed VAA sent
/// by the conductor.
///
/// Once the VAA is parsed and verified, we need to mark the sale as sealed so we
/// can invoke the program to bridge the contributed collateral over to the
/// conductor program. We would have liked to do the bridging of all collateral
/// in one instruction, but there are too many accounts (we need three or four per
/// SPL token we bridge, depending on the kind of token).
///
/// Users can use the `claim_allocation` instruction to claim their calculated
/// allocation of sale token and any excess contributions they have made.
pub fn seal_sale(ctx: Context<SealSale>) -> Result<()> {
// We verify that the signed VAA has the same sale information as the Sale
// account we pass into the context. It also needs to be emitted from the
// conductor we know.
let custodian = &ctx.accounts.custodian;
let sale = &mut ctx.accounts.sale;
let msg = custodian.parse_and_verify_conductor_vaa_and_sale(
&ctx.accounts.core_bridge_vaa,
PAYLOAD_SALE_SEALED,
sale.id,
)?;
// After verifying the VAA, save the allocation and excess contributions per
// accepted asset. Change the state from Active to Sealed.
sale.parse_sale_sealed(&msg.payload)?;
// Prior to sealing the sale, the sale token needed to be bridged to the custodian's
// associated token account. We need to make sure that there are enough allocations
// in the custodian's associated token account for distribution to all of the
// participants of the sale. If there aren't, we cannot allow the instruction to
// continue.
let total_allocations: u64 = sale.totals.iter().map(|total| total.allocations).sum();
require!(
ctx.accounts.custodian_sale_token_acct.amount >= total_allocations,
ContributorError::InsufficientFunds
);
// The remaining accounts are passed as an extra argument. These are the
// custodian's associated token accounts, one for each accepted token of the sale.
// We need to verify that this context has the correct number of ATAs.
let totals = &mut sale.totals;
let custodian_token_accts = &ctx.remaining_accounts[..];
require!(
custodian_token_accts.len() == totals.len(),
ContributorError::InvalidRemainingAccounts
);
// We will mutate the buyer's accounting and state for each contributed mint.
for (total, custodian_token_acct) in izip!(totals, custodian_token_accts) {
// Verify the authority of the custodian's associated token account
require!(
token::accessor::authority(&custodian_token_acct)? == custodian.key(),
ContributorError::InvalidAccount
);
// We need to verify that the mints are the same between the two
// associated token accounts. After which, we will use the sale account
// to find the correct index to reference in the buyer account.
require!(
token::accessor::mint(&custodian_token_acct)? == total.mint,
ContributorError::InvalidAccount
);
// verify custodian ata has enough funds after conductor accounting
require!(
token::accessor::amount(&custodian_token_acct)? >= total.contributions,
ContributorError::InsufficientFunds
);
total.prepare_for_transfer();
}
// Finish instruction.
Ok(())
}
/// Instruction to bridge all of the sealed contributions to the conductor, one SPL token
/// at a time.
///
/// At the end of the instruction, we need to make sure that the contributions remaining
/// in the custodian's associated token account are at least as much as is needed to
/// transfer back to all of the market participants when they claim their allocations
/// using the `claim_allocation` instruction.
///
/// *** NOTE: Token Bridge wrapped transfers are untested. ***
pub fn bridge_sealed_contribution(ctx: Context<BridgeSealedContribution>) -> Result<()> {
// We need to make sure that the sale is sealed before we can consider bridging
// collateral over to the conductor.
let sale = &ctx.accounts.sale;
require!(sale.is_sealed(), ContributorError::SaleNotSealed);
let custodian_token_acct = &ctx.accounts.custodian_token_acct;
let accepted_mint = &ctx.accounts.accepted_mint;
let (idx, asset) = sale.get_total_info(&accepted_mint.key())?;
require!(
asset.is_ready_for_transfer(),
ContributorError::TransferNotAllowed
);
let amount = asset.contributions - asset.excess_contributions;
let authority_signer = &ctx.accounts.authority_signer;
token::approve(
CpiContext::new_with_signer(
ctx.accounts.token_program.to_account_info(),
token::Approve {
to: custodian_token_acct.to_account_info(),
delegate: authority_signer.to_account_info(),
authority: ctx.accounts.custodian.to_account_info(),
},
&[&[SEED_PREFIX_CUSTODIAN.as_bytes(), &[ctx.bumps["custodian"]]]],
),
amount,
)?;
let transfer_data = TransferData {
nonce: ctx.accounts.custodian.nonce,
amount,
fee: 0,
target_address: sale.recipient,
target_chain: Custodian::conductor_chain()?,
};
// Token Bridge transfer: send this amount to the recipient at conductor_address on conductor_chain.
let token_mint_signer = &ctx.accounts.token_mint_signer;
let minted_by_token_bridge = match accepted_mint.mint_authority {
COption::Some(authority) => authority == token_mint_signer.key(),
_ => false,
};
if minted_by_token_bridge {
let wrapped_meta = &ctx.accounts.custody_or_wrapped_meta;
invoke_signed(
&Instruction {
program_id: ctx.accounts.token_bridge.key(),
accounts: vec![
AccountMeta::new(ctx.accounts.payer.key(), true),
AccountMeta::new_readonly(ctx.accounts.token_bridge_config.key(), false),
AccountMeta::new(custodian_token_acct.key(), false),
AccountMeta::new_readonly(ctx.accounts.custodian.key(), true),
AccountMeta::new(accepted_mint.key(), false),
AccountMeta::new_readonly(wrapped_meta.key(), false),
AccountMeta::new_readonly(authority_signer.key(), false),
AccountMeta::new(ctx.accounts.wormhole_config.key(), false),
AccountMeta::new(ctx.accounts.wormhole_message.key(), true),
AccountMeta::new_readonly(ctx.accounts.wormhole_emitter.key(), true),
AccountMeta::new(ctx.accounts.wormhole_sequence.key(), false),
AccountMeta::new(ctx.accounts.wormhole_fee_collector.key(), false),
AccountMeta::new_readonly(clock::id(), false),
AccountMeta::new_readonly(rent::id(), false),
AccountMeta::new_readonly(ctx.accounts.system_program.key(), false),
AccountMeta::new_readonly(ctx.accounts.wormhole.key(), false),
AccountMeta::new_readonly(spl_token::id(), false),
],
data: (TRANSFER_WRAPPED_INSTRUCTION, transfer_data).try_to_vec()?,
},
&ctx.accounts.to_account_infos(),
&[
&[SEED_PREFIX_CUSTODIAN.as_ref(), &[ctx.bumps["custodian"]]],
&[
&b"bridge-sealed".as_ref(),
&sale.id,
&accepted_mint.key().as_ref(),
&[ctx.bumps["wormhole_message"]],
],
],
)?;
} else {
let token_bridge_custody = &ctx.accounts.custody_or_wrapped_meta;
invoke_signed(
&Instruction {
program_id: ctx.accounts.token_bridge.key(),
accounts: vec![
AccountMeta::new(ctx.accounts.payer.key(), true),
AccountMeta::new_readonly(ctx.accounts.token_bridge_config.key(), false),
AccountMeta::new(custodian_token_acct.key(), false),
AccountMeta::new(accepted_mint.key(), false),
AccountMeta::new(token_bridge_custody.key(), false),
AccountMeta::new_readonly(authority_signer.key(), false),
AccountMeta::new_readonly(ctx.accounts.custody_signer.key(), false),
AccountMeta::new(ctx.accounts.wormhole_config.key(), false),
AccountMeta::new(ctx.accounts.wormhole_message.key(), true),
AccountMeta::new_readonly(ctx.accounts.wormhole_emitter.key(), false),
AccountMeta::new(ctx.accounts.wormhole_sequence.key(), false),
AccountMeta::new(ctx.accounts.wormhole_fee_collector.key(), false),
AccountMeta::new_readonly(clock::id(), false),
AccountMeta::new_readonly(rent::id(), false),
AccountMeta::new_readonly(ctx.accounts.system_program.key(), false),
AccountMeta::new_readonly(ctx.accounts.wormhole.key(), false),
AccountMeta::new_readonly(spl_token::id(), false),
],
data: (TRANSFER_NATIVE_INSTRUCTION, transfer_data).try_to_vec()?,
},
&ctx.accounts.to_account_infos(),
&[&[
&b"bridge-sealed".as_ref(),
&ctx.accounts.sale.id,
&accepted_mint.key().as_ref(),
&[ctx.bumps["wormhole_message"]],
]],
)?;
}
ctx.accounts.sale.totals[idx].set_transferred();
// Finish instruction.
Ok(())
}
/// Instruction to abort the current sale. This parses an inbound signed VAA sent
/// by the conductor.
///
/// Once the VAA is parsed and verified, we mark the sale as aborted.
///
/// Users can use the `claim_refunds` instruction to claim however much they have
/// contributed to the sale.
pub fn abort_sale(ctx: Context<AbortSale>) -> Result<()> {
// We verify that the signed VAA has the same sale information as the Sale
// account we pass into the context. It also needs to be emitted from the
// conductor we know.
let sale = &mut ctx.accounts.sale;
let msg = ctx
.accounts
.custodian
.parse_and_verify_conductor_vaa_and_sale(
&ctx.accounts.core_bridge_vaa,
PAYLOAD_SALE_ABORTED,
sale.id,
)?;
// Finish the instruction by changing the status of the sale to Aborted.
sale.parse_sale_aborted(&msg.payload)
}
/// Instruction to claim refunds from an aborted sale. Only the buyer account needs to
/// be mutable so we can change its state.
///
/// The buyer account will copy what it knows as the buyer's contributions per SPL token
/// and assign that value to its excess for record keeping, marking the state of each
/// contribution as RefundClaimed.
///
/// There are up to n transfers for the refunds, depending on how many tokens a user has
/// contributed to the sale.
pub fn claim_refunds<'a, 'b, 'c, 'info>(
ctx: Context<'a, 'b, 'c, 'info, ClaimRefunds<'info>>,
) -> Result<()> {
// We need to make sure that the sale is actually aborted in order to use this
// instruction. If it isn't, we cannot continue.
let sale = &ctx.accounts.sale;
require!(sale.is_aborted(), ContributorError::SaleNotAborted);
// We pass as an extra argument remaining accounts. The first n accounts are
// the custodian's associated token accounts for each accepted token for the sale.
// The second n accounts are the buyer's respective associated token accounts.
// We need to verify that this context has the correct number of ATAs.
let totals = &sale.totals;
let num_accepted = totals.len();
let token_accts = &ctx.remaining_accounts;
require!(
token_accts.len() == 2 * num_accepted,
ContributorError::InvalidRemainingAccounts
);
let custodian_token_accts = &token_accts[..num_accepted];
let buyer_token_accts = &token_accts[num_accepted..];
// The owner reference is used to verify the authority of the buyer's associated
// token accounts, and the transfer_authority is used for the SPL transfers.
let owner = &ctx.accounts.owner;
let transfer_authority = &ctx.accounts.custodian;
// This is used in case we need to use a native solana transfer.
//
// let mut all_accts = ctx.accounts.to_account_infos();
// all_accts.extend_from_slice(&ctx.remaining_accounts);
// We will mutate the buyer's accounting and state for each contributed mint.
let buyer = &mut ctx.accounts.buyer;
for (idx, (total, custodian_token_acct, buyer_token_acct)) in
izip!(totals, custodian_token_accts, buyer_token_accts).enumerate()
{
// Verify the custodian's associated token account
require!(
token::accessor::authority(&custodian_token_acct)? == transfer_authority.key(),
ContributorError::InvalidAccount
);
require!(
token::accessor::mint(&custodian_token_acct)? == total.mint,
ContributorError::InvalidAccount
);
// And verify the buyer's associated token account
require!(
token::accessor::authority(&buyer_token_acct)? == owner.key(),
ContributorError::InvalidAccount
);
require!(
token::accessor::mint(&buyer_token_acct)? == total.mint,
ContributorError::InvalidAccount
);
// Now calculate the refund and transfer to the buyer's associated
// token account if there is any amount to refund.
let refund = buyer.claim_refund(idx)?;
if refund == 0 {
continue;
}
token::transfer(
CpiContext::new_with_signer(
ctx.accounts.token_program.to_account_info(),
token::Transfer {
from: custodian_token_acct.to_account_info(),
to: buyer_token_acct.to_account_info(),
authority: transfer_authority.to_account_info(),
},
&[&[SEED_PREFIX_CUSTODIAN.as_bytes(), &[ctx.bumps["custodian"]]]],
),
refund,
)?;
// For some reason, using the anchor_spl library did not work on
// Solana devnet for one of our test trials, so we're keeping the
// native instruction here in case we need it.
//
// invoke_signed(
// &spl_token::instruction::transfer(
// &token::ID,
// &custodian_token_acct.key(),
// &buyer_token_acct.key(),
// &transfer_authority.key(),
// &[&transfer_authority.key()],
// refund,
// )?,
// &all_accts,
// &[&[&SEED_PREFIX_CUSTODIAN.as_bytes(), &[ctx.bumps["custodian"]]]],
// )?;
}
// Finish instruction.
Ok(())
}
/// Instruction to claim allocations from a sealed sale. Only the buyer account needs to
/// be mutable so we can change its state.
///
/// The buyer account will determine the total allocations reserved for the buyer based on
/// how much he has contributed to the sale (relative to the total contributions found in
/// the sale account) and mark its allocation as claimed. The same calculation is used to
/// determine how much excess of each contribution the buyer is allowed. It will also mark
/// the state of each contribution as ExcessClaimed.
///
/// There is one transfer for the total allocation and up to n transfers for the excess
/// contributions, depending on how many tokens a user has contributed to the sale.
pub fn claim_allocation<'a, 'b, 'c, 'info>(
ctx: Context<'a, 'b, 'c, 'info, ClaimAllocation<'info>>,
) -> Result<()> {
// We need to make sure that the sale is actually sealed in order to use this
// instruction. If it isn't, we cannot continue.
let sale = &ctx.accounts.sale;
require!(sale.is_sealed(), ContributorError::SaleNotSealed);
// first deal with the allocation
let custodian_sale_token_acct = &ctx.accounts.custodian_sale_token_acct;
let buyer_sale_token_acct = &ctx.accounts.buyer_sale_token_acct;
// compute allocation
let totals = &sale.totals;
let allocation = ctx.accounts.buyer.claim_allocation(totals)?;
require!(allocation > 0, ContributorError::NothingToClaim);
// spl transfer allocation
let transfer_authority = &ctx.accounts.custodian;
let custodian_bump = ctx.bumps["custodian"];
token::transfer(
CpiContext::new_with_signer(
ctx.accounts.token_program.to_account_info(),
token::Transfer {
from: custodian_sale_token_acct.to_account_info(),
to: buyer_sale_token_acct.to_account_info(),
authority: transfer_authority.to_account_info(),
},
&[&[SEED_PREFIX_CUSTODIAN.as_bytes(), &[custodian_bump]]],
),
allocation,
)?;
// For some reason, using the anchor_spl library did not work on
// Solana devnet for one of our test trials, so we're keeping the
// native instruction here in case we need it.
//
// invoke_signed(
// &spl_token::instruction::transfer(
// &token::ID,
// &custodian_sale_token_acct.key(),
// &buyer_sale_token_acct.key(),
// &transfer_authority.key(),
// &[&transfer_authority.key()],
// allocation,
// )?,
// &ctx.accounts.to_account_infos(),
// &[&[&SEED_PREFIX_CUSTODIAN.as_bytes(), &[custodian_bump]]],
// )?;
// We pass as an extra argument remaining accounts. The first n accounts are
// the custodian's associated token accounts for each accepted token for the sale.
// The second n accounts are the buyer's respective associated token accounts.
// We need to verify that this context has the correct number of ATAs.
let num_accepted = totals.len();
let token_accts = &ctx.remaining_accounts;
require!(
token_accts.len() == 2 * num_accepted,
ContributorError::InvalidRemainingAccounts
);
let custodian_token_accts = &token_accts[..num_accepted];
let buyer_token_accts = &token_accts[num_accepted..];
// The owner reference is used to verify the authority of the buyer's associated
// token accounts, and the transfer_authority is used for the SPL transfers.
let owner = &ctx.accounts.owner;
let transfer_authority = &ctx.accounts.custodian;
// This is used in case we need to use a native solana transfer.
//
// let mut all_accts = ctx.accounts.to_account_infos();
// all_accts.extend_from_slice(&ctx.remaining_accounts);
// We will mutate the buyer's accounting and state for each contributed mint.
let buyer = &mut ctx.accounts.buyer;
for (idx, (total, custodian_token_acct, buyer_token_acct)) in
izip!(totals, custodian_token_accts, buyer_token_accts).enumerate()
{
// Verify the custodian's associated token account
require!(
token::accessor::authority(&custodian_token_acct)? == transfer_authority.key(),
ContributorError::InvalidAccount
);
require!(
token::accessor::mint(&custodian_token_acct)? == total.mint,
ContributorError::InvalidAccount
);
// And verify the buyer's associated token account
require!(
token::accessor::authority(&buyer_token_acct)? == owner.key(),
ContributorError::InvalidAccount
);
require!(
token::accessor::mint(&buyer_token_acct)? == total.mint,
ContributorError::InvalidAccount
);
// Now calculate the excess contribution and transfer to the
// buyer's associated token account if there is any amount calculated.
let excess = buyer.claim_excess(idx, total)?;
if excess == 0 {
continue;
}
token::transfer(
CpiContext::new_with_signer(
ctx.accounts.token_program.to_account_info(),
token::Transfer {
from: custodian_token_acct.to_account_info(),
to: buyer_token_acct.to_account_info(),
authority: transfer_authority.to_account_info(),
},
&[&[SEED_PREFIX_CUSTODIAN.as_bytes(), &[custodian_bump]]],
),
excess,
)?;
// For some reason, using the anchor_spl library did not work on
// Solana devnet for one of our test trials, so we're keeping the
// native instruction here in case we need it.
//
// invoke_signed(
// &spl_token::instruction::transfer(
// &token::ID,
// &custodian_token_acct.key(),
// &buyer_token_acct.key(),
// &transfer_authority.key(),
// &[&transfer_authority.key()],
// excess,
// )?,
// &all_accts,
// &[&[&SEED_PREFIX_CUSTODIAN.as_bytes(), &[ctx.bumps["custodian"]]]],
// )?;
}
// Finish instruction.
Ok(())
}
}
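To summarize the custody accounting that `bridge_sealed_contribution` relies on: for each accepted token, the amount bridged to the conductor is the total contributions minus the excess contributions, so the balance left in the custodian's ATA exactly covers buyers' later excess claims. A minimal standalone sketch of that invariant (the `AssetTotal` struct here is a simplified stand-in, not the program's actual type):

```rust
// Simplified stand-in for the program's per-token accounting record.
struct AssetTotal {
    contributions: u64,        // total contributed by all buyers
    excess_contributions: u64, // portion to be returned to buyers
}

// Amount bridged to the conductor once the sale is sealed.
fn bridged_amount(t: &AssetTotal) -> u64 {
    t.contributions - t.excess_contributions
}

fn main() {
    let t = AssetTotal {
        contributions: 1_000,
        excess_contributions: 150,
    };
    // 850 goes over the bridge; 150 stays behind for excess claims.
    assert_eq!(bridged_amount(&t), 850);
    // Invariant: bridged amount plus retained excess equals total contributions.
    assert_eq!(bridged_amount(&t) + t.excess_contributions, t.contributions);
    println!("ok");
}
```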

View File

@ -0,0 +1,208 @@
use anchor_lang::prelude::*;
use num_derive::*;
use crate::{constants::ACCEPTED_TOKENS_MAX, error::ContributorError, state::sale::AssetTotal};
#[derive(
AnchorSerialize, AnchorDeserialize, FromPrimitive, ToPrimitive, Copy, Clone, PartialEq, Eq,
)]
/// Status of Buyer's contribution
pub enum ContributionStatus {
/// Not initialized from `contribute` instruction
Inactive = 0,
/// Initialized from `contribute` instruction
Active,
/// For amount > 0, assigned after `claim_allocation` instruction
ExcessClaimed,
    /// For amount > 0, assigned after `claim_refunds` instruction
RefundClaimed,
}
#[derive(AnchorSerialize, AnchorDeserialize, Copy, Clone, PartialEq, Eq)]
/// A record of the allocation owed to a `Buyer`
pub struct BuyerAllocation {
pub amount: u64, // 8
pub claimed: bool, // 1
}
#[derive(AnchorSerialize, AnchorDeserialize, Copy, Clone, PartialEq, Eq)]
/// A record of a `Buyer`'s contributions and excess owed
pub struct BuyerContribution {
/// Amount `Buyer` has contributed via `contribute` instruction.
/// Borsh size: 8
pub amount: u64,
/// Amount `Buyer` is owed after `seal_sale` instruction. Computed
/// at `claim_allocation` instruction.
/// Borsh size: 8
pub excess: u64,
/// Status of `Buyer`'s contribution
/// * Inactive
/// * Active
/// * ExcessClaimed
/// * RefundClaimed
/// Borsh size: 1
pub status: ContributionStatus,
}
#[account]
/// `Buyer` stores the state of an individual contributor to a sale
pub struct Buyer {
/// `Buyer` needs to keep track of a user's contribution amounts
/// and excess after a sealed sale
///
/// Borsh size: 4 + BuyerContribution::LENGTH * ACCEPTED_TOKENS_MAX
pub contributions: Vec<BuyerContribution>,
/// At the time of the `claim_allocation` instruction, we keep
/// a record of how much allocation `amount` the `Buyer` is owed
/// and whether it has been `claimed`
///
/// Borsh size: BuyerAllocation::LENGTH
pub allocation: BuyerAllocation,
/// Check if the `Buyer` has been initialized (happens at `contribute`
/// instruction)
///
/// Borsh size: 1
pub initialized: bool,
}
impl BuyerContribution {
pub const LENGTH: usize = 8 + 8 + 8 + 1;
}
impl BuyerAllocation {
pub const LENGTH: usize = 8 + 1;
}
impl Buyer {
pub const MAXIMUM_SIZE: usize =
(4 + BuyerContribution::LENGTH * ACCEPTED_TOKENS_MAX) + BuyerAllocation::LENGTH + 1;
/// If a `Buyer` account hasn't been created yet, set up initial state
///
/// # Arguments
/// * `num_totals` - Size of accepted tokens found in `Sale` account
///
pub fn initialize(&mut self, num_totals: usize) {
self.contributions = vec![
BuyerContribution {
amount: 0,
excess: 0,
status: ContributionStatus::Inactive
};
num_totals
];
self.allocation.amount = 0;
self.allocation.claimed = false;
self.initialized = true;
}
/// At the `contribute` instruction, update the record of how much a
/// `Buyer` has contributed for a given token index. Update status to
/// `Active` after recording.
///
/// # Arguments
/// * `idx` - Which element of `contributions` to update
/// * `amount` - Amount to record in `contributions` element
///
pub fn contribute(&mut self, idx: usize, amount: u64) -> Result<()> {
require!(
idx < self.contributions.len(),
ContributorError::InvalidTokenIndex
);
require!(
!self.has_claimed_index(idx),
ContributorError::ContributeDeactivated
);
let total = &mut self.contributions[idx];
total.amount += amount;
total.status = ContributionStatus::Active;
Ok(())
}
/// Returns amount owed to `Buyer` at the `claim_refunds` instruction.
/// Update the record of how much a `Buyer` is owed for a given index of
/// `contributions`. Update status to `RefundClaimed` after recording.
/// The refund will equal the amount contributed via the `contribute`
/// instruction.
///
/// # Arguments
/// * `idx` - Which element of `contributions` to update
///
pub fn claim_refund(&mut self, idx: usize) -> Result<u64> {
require!(
!self.has_claimed_index(idx),
ContributorError::AlreadyClaimed
);
let contribution = &mut self.contributions[idx];
contribution.excess = contribution.amount;
contribution.status = ContributionStatus::RefundClaimed;
Ok(contribution.excess)
}
/// Returns the amount of allocation owed to `Buyer` at the `claim_allocation`
/// instruction. Updates the record of the buyer's allocation and sets `claimed`
/// to true.
///
/// # Arguments
/// * `sale_totals` - Taken from `Sale` after the sale has been sealed
///
pub fn claim_allocation(&mut self, sale_totals: &[AssetTotal]) -> Result<u64> {
require!(!self.allocation.claimed, ContributorError::AlreadyClaimed);
let total_allocation: u128 = sale_totals
.iter()
.zip(self.contributions.iter())
.map(|(t, c)| match t.contributions {
0 => 0,
_ => t.allocations as u128 * c.amount as u128 / t.contributions as u128,
})
.sum();
require!(
total_allocation < u64::MAX as u128,
ContributorError::AmountTooLarge
);
self.allocation.amount = total_allocation as u64;
self.allocation.claimed = true;
Ok(self.allocation.amount)
}
/// Returns the amount of excess owed to `Buyer` at the `claim_allocation`
/// instruction. Updates the record of the buyer's excess contribution
/// and sets the status to ExcessClaimed.
///
/// # Arguments
/// * `idx` - Which element of `contributions` to update
/// * `total` - One `AssetTotal` element taken from `Sale` after the sale has been sealed
///
pub fn claim_excess(&mut self, idx: usize, total: &AssetTotal) -> Result<u64> {
require!(
!self.has_claimed_index(idx),
ContributorError::AlreadyClaimed
);
let excess_contribution = match total.contributions {
0 => 0,
_ => {
let contribution = &self.contributions[idx];
total.excess_contributions as u128 * contribution.amount as u128
/ total.contributions as u128
}
};
require!(
excess_contribution < u64::MAX as u128,
ContributorError::AmountTooLarge
);
let contribution = &mut self.contributions[idx];
contribution.excess = excess_contribution as u64;
contribution.status = ContributionStatus::ExcessClaimed;
Ok(contribution.excess)
}
/// Check whether a particular `contributions` index has been claimed
fn has_claimed_index(&self, idx: usize) -> bool {
let status = self.contributions[idx].status;
status == ContributionStatus::ExcessClaimed || status == ContributionStatus::RefundClaimed
}
}
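Both `claim_allocation` and `claim_excess` use the same pro-rata pattern: widen to u128, multiply the sale-wide total by the buyer's contribution, divide by the total contributions, and check that the result fits back into a u64. A standalone sketch of that calculation (function and parameter names here are illustrative, not the program's API):

```rust
/// Pro-rata share of `total_owed` for a buyer who contributed `amount`
/// out of `total_contributions`, computed in u128 to avoid overflow.
/// Returns None if the result does not fit into a u64.
fn pro_rata_share(total_owed: u64, amount: u64, total_contributions: u64) -> Option<u64> {
    if total_contributions == 0 {
        return Some(0);
    }
    let share = total_owed as u128 * amount as u128 / total_contributions as u128;
    if share < u64::MAX as u128 {
        Some(share as u64)
    } else {
        None
    }
}

fn main() {
    // 1_000 sale tokens allocated; the buyer supplied 250 of 1_000 contributed.
    assert_eq!(pro_rata_share(1_000, 250, 1_000), Some(250));
    // No contributions for this token index means nothing is owed.
    assert_eq!(pro_rata_share(1_000, 0, 0), Some(0));
    // Integer division truncates, matching the on-chain arithmetic.
    assert_eq!(pro_rata_share(10, 1, 3), Some(3));
    println!("ok");
}
```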

View File

@ -0,0 +1,100 @@
use anchor_lang::prelude::*;
use crate::{
constants::INDEX_SALE_ID,
env::*,
error::ContributorError,
wormhole::{get_message_data, MessageData},
};
use std::str::FromStr;
#[account]
#[derive(Default)]
pub struct Custodian {
pub nonce: u32, // 4
}
impl Custodian {
pub const MAXIMUM_SIZE: usize = 32 + 4;
pub fn conductor_chain() -> Result<u16> {
let chain_id = CONDUCTOR_CHAIN
.to_string()
.parse()
.map_err(|_| ContributorError::InvalidConductorChain)?;
Ok(chain_id)
}
pub fn conductor_address() -> Result<[u8; 32]> {
let mut addr = [0u8; 32];
addr.copy_from_slice(
&hex::decode(CONDUCTOR_ADDRESS)
.map_err(|_| ContributorError::InvalidConductorAddress)?,
);
Ok(addr)
}
pub fn wormhole() -> Result<Pubkey> {
let pubkey = Pubkey::from_str(CORE_BRIDGE_ADDRESS)
.map_err(|_| ContributorError::InvalidWormholeAddress)?;
Ok(pubkey)
}
pub fn token_bridge() -> Result<Pubkey> {
let pubkey = Pubkey::from_str(TOKEN_BRIDGE_ADDRESS)
.map_err(|_| ContributorError::InvalidWormholeAddress)?;
Ok(pubkey)
}
pub fn new(&mut self) -> Result<()> {
self.nonce = 0;
Ok(())
}
pub fn parse_and_verify_conductor_vaa<'info>(
&self,
vaa_acct: &AccountInfo<'info>,
payload_type: u8,
) -> Result<MessageData> {
let msg = get_message_data(&vaa_acct)?;
require!(
msg.emitter_chain == Custodian::conductor_chain()?,
ContributorError::InvalidConductorChain
);
require!(
msg.emitter_address == Custodian::conductor_address()?,
ContributorError::InvalidConductorAddress
);
require!(
msg.payload[0] == payload_type,
ContributorError::InvalidVaaAction
);
Ok(msg)
}
pub fn get_sale_id_from_payload(payload: &[u8]) -> [u8; 32] {
let mut sale_id = [0u8; 32];
sale_id.copy_from_slice(&payload[INDEX_SALE_ID..INDEX_SALE_ID + 32]);
sale_id
}
pub fn get_sale_id_from_vaa<'info>(vaa_acct: &AccountInfo<'info>) -> Result<[u8; 32]> {
let msg = get_message_data(&vaa_acct)?;
Ok(Custodian::get_sale_id_from_payload(&msg.payload))
}
pub fn parse_and_verify_conductor_vaa_and_sale<'info>(
&self,
vaa_acct: &AccountInfo<'info>,
payload_type: u8,
sale_id: [u8; 32],
) -> Result<MessageData> {
let msg = self.parse_and_verify_conductor_vaa(vaa_acct, payload_type)?;
require!(
Custodian::get_sale_id_from_payload(&msg.payload) == sale_id,
ContributorError::InvalidSale,
);
Ok(msg)
}
}
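The three checks in `parse_and_verify_conductor_vaa` (emitter chain, emitter address, payload type byte) can be sketched as a standalone predicate over plain values (an illustrative helper, not program code):

```rust
/// Mirrors the conductor-VAA verification: the message must originate from
/// the configured chain and address, and its first payload byte must match
/// the expected payload type.
fn verify_conductor_message(
    emitter_chain: u16,
    emitter_address: &[u8; 32],
    payload: &[u8],
    expected_chain: u16,
    expected_address: &[u8; 32],
    expected_payload_type: u8,
) -> bool {
    emitter_chain == expected_chain
        && emitter_address == expected_address
        && payload.first() == Some(&expected_payload_type)
}
```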


@@ -0,0 +1,7 @@
pub use buyer::*;
pub use custodian::*;
pub use sale::*;
pub mod buyer;
pub mod custodian;
pub mod sale;


@@ -0,0 +1,407 @@
use anchor_lang::{prelude::*, solana_program::keccak};
use anchor_spl::token::Mint;
use num::{bigint::BigUint, traits::ToPrimitive};
use num_derive::*;
use std::mem::size_of_val;
use crate::{
constants::*, cryptography::ethereum_ecrecover, env::GLOBAL_KYC_AUTHORITY,
error::ContributorError, state::custodian::Custodian,
};
#[derive(AnchorSerialize, AnchorDeserialize, Copy, Clone, PartialEq, Eq)]
pub struct AssetTotal {
pub token_index: u8, // 1
pub mint: Pubkey, // 32
pub contributions: u64, // 8
pub allocations: u64, // 8
pub excess_contributions: u64, // 8
pub status: AssetStatus, // 1
}
#[derive(
AnchorSerialize, AnchorDeserialize, FromPrimitive, ToPrimitive, Copy, Clone, PartialEq, Eq,
)]
pub enum AssetStatus {
Active,
NothingToTransfer,
ReadyForTransfer,
TransferredToConductor,
}
#[derive(AnchorSerialize, AnchorDeserialize, Copy, Clone, PartialEq, Eq)]
pub struct SaleTimes {
pub start: u64,
pub end: u64,
}
#[derive(
AnchorSerialize, AnchorDeserialize, FromPrimitive, ToPrimitive, Copy, Clone, PartialEq, Eq,
)]
pub enum SaleStatus {
Active,
Sealed,
Aborted,
}
#[account]
pub struct Sale {
pub id: [u8; 32], // 32
pub associated_sale_token_address: Pubkey, // 32
pub token_chain: u16, // 2
pub token_decimals: u8, // 1
pub times: SaleTimes, // 8 + 8
pub recipient: [u8; 32], // 32
pub status: SaleStatus, // 1
pub kyc_authority: [u8; 20], // 20 (this is an evm pubkey)
pub initialized: bool, // 1
pub totals: Vec<AssetTotal>, // 4 + AssetTotal::MAXIMUM_SIZE * ACCEPTED_TOKENS_MAX
pub native_token_decimals: u8, // 1
pub sale_token_mint: Pubkey, // 32
}
impl AssetTotal {
pub const MAXIMUM_SIZE: usize = 1 + 32 + 8 + 8 + 8 + 1;
pub fn make_from_slice(bytes: &[u8]) -> Result<Self> {
require!(
bytes.len() == INDEX_ACCEPTED_TOKEN_END,
ContributorError::InvalidAcceptedTokenPayload
);
Ok(Self {
token_index: bytes[INDEX_ACCEPTED_TOKEN_INDEX],
mint: Pubkey::new(&bytes[INDEX_ACCEPTED_TOKEN_ADDRESS..INDEX_ACCEPTED_TOKEN_END]),
contributions: 0,
allocations: 0,
excess_contributions: 0,
status: AssetStatus::Active,
})
}
pub fn prepare_for_transfer(&mut self) {
self.status = {
if self.contributions == 0 {
AssetStatus::NothingToTransfer
} else {
AssetStatus::ReadyForTransfer
}
};
}
pub fn is_ready_for_transfer(&self) -> bool {
self.status == AssetStatus::ReadyForTransfer
}
pub fn set_transferred(&mut self) {
self.status = AssetStatus::TransferredToConductor;
}
}
impl Sale {
pub const MAXIMUM_SIZE: usize = 32
+ 32
+ 2
+ 1
+ (8 + 8)
+ 32
+ 1
+ 20
+ 1
+ (4 + AssetTotal::MAXIMUM_SIZE * ACCEPTED_TOKENS_MAX)
+ 1
+ 32;
pub fn parse_sale_init(&mut self, payload: &[u8]) -> Result<()> {
require!(!self.initialized, ContributorError::SaleAlreadyInitialized);
self.initialized = true;
// check that the payload has at least the number of bytes
// required to define the number of accepted tokens
require!(
payload.len() > INDEX_SALE_INIT_ACCEPTED_TOKENS_START,
ContributorError::InvalidVaaPayload
);
let num_accepted = payload[INDEX_SALE_INIT_ACCEPTED_TOKENS_START] as usize;
require!(
num_accepted <= ACCEPTED_TOKENS_MAX,
ContributorError::TooManyAcceptedTokens
);
self.totals = Vec::with_capacity(ACCEPTED_TOKENS_MAX);
for i in 0..num_accepted {
let start = INDEX_SALE_INIT_ACCEPTED_TOKENS_START + 1 + ACCEPTED_TOKEN_NUM_BYTES * i;
self.totals.push(AssetTotal::make_from_slice(
&payload[start..start + ACCEPTED_TOKEN_NUM_BYTES],
)?);
}
self.id = Sale::get_id(payload);
// deserialize other things
let mut addr = [0u8; 32];
addr.copy_from_slice(
&payload[INDEX_SALE_INIT_TOKEN_ADDRESS..INDEX_SALE_INIT_TOKEN_ADDRESS + 32],
);
self.associated_sale_token_address = Pubkey::new(&addr);
self.token_chain = to_u16_be(payload, INDEX_SALE_INIT_TOKEN_CHAIN);
self.token_decimals = payload[INDEX_SALE_INIT_TOKEN_DECIMALS];
// assume these times are actually u64... these are stored as uint256 in evm
self.times.start = to_u64_be(payload, INDEX_SALE_INIT_SALE_START + 24);
self.times.end = to_u64_be(payload, INDEX_SALE_INIT_SALE_END + 24);
// the recipient field follows the variable-length accepted-tokens array,
// so its offset depends on how many accepted tokens were deserialized
let recipient_idx =
INDEX_SALE_INIT_ACCEPTED_TOKENS_START + 1 + ACCEPTED_TOKEN_NUM_BYTES * num_accepted;
self.recipient
.copy_from_slice(&payload[recipient_idx..recipient_idx + 32]);
// we may need to deserialize kyc authority in sale init in the future.
// but for now, just use global
let decoded = hex::decode(GLOBAL_KYC_AUTHORITY)
.map_err(|_| ContributorError::InvalidKycAuthority)?;
// verify the length before copy_from_slice, which panics on mismatch
require!(decoded.len() == 20, ContributorError::InvalidKycAuthority);
self.kyc_authority.copy_from_slice(&decoded);
// finally set the status to active
self.status = SaleStatus::Active;
Ok(())
}
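Because the accepted-token array is variable length, the recipient's byte offset in the `SaleInit` payload is derived rather than fixed. The computation reduces to (argument values in the test are illustrative, not the program's actual index constants):

```rust
/// Offset of the 32-byte recipient: it sits immediately after the one-byte
/// accepted-token count and the packed accepted-token entries.
fn recipient_offset(tokens_start: usize, token_entry_len: usize, num_accepted: usize) -> usize {
    tokens_start + 1 + token_entry_len * num_accepted
}
```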
pub fn set_sale_token_mint_info(&mut self, mint: &Pubkey, mint_info: &Mint) -> Result<()> {
let decimals = mint_info.decimals;
require!(
self.token_decimals >= decimals,
ContributorError::InvalidTokenDecimals
);
self.native_token_decimals = decimals;
self.sale_token_mint = mint.clone();
Ok(())
}
pub fn get_token_index(&self, mint: &Pubkey) -> Result<u8> {
let total = self
.totals
.iter()
.find(|item| item.mint == *mint)
.ok_or(ContributorError::InvalidTokenIndex)?;
Ok(total.token_index)
}
pub fn get_total_info(&self, mint: &Pubkey) -> Result<(usize, &AssetTotal)> {
let idx = self
.totals
.iter()
.position(|item| item.mint == *mint)
.ok_or(ContributorError::InvalidTokenIndex)?;
Ok((idx, &self.totals[idx]))
}
pub fn update_total_contributions(
&mut self,
block_time: i64,
token_index: u8,
contributed: u64,
) -> Result<usize> {
require!(self.is_active(block_time), ContributorError::SaleEnded);
let block_time = block_time as u64;
require!(
block_time >= self.times.start,
ContributorError::ContributionTooEarly
);
let idx = self.get_index(token_index)?;
self.totals[idx].contributions += contributed;
Ok(idx)
}
pub fn serialize_contributions(&self, block_time: i64) -> Result<Vec<u8>> {
require!(
self.is_attestable(block_time),
ContributorError::SaleNotAttestable
);
let totals = &self.totals;
// Contributions length is encoded as a single byte, so return an error
// instead of panicking if it cannot fit
let contributions_len: u8 = totals
.len()
.try_into()
.map_err(|_| ContributorError::TooManyAcceptedTokens)?;
let mut attested: Vec<u8> = Vec::with_capacity(
PAYLOAD_HEADER_LEN
+ size_of_val(&CHAIN_ID)
+ size_of_val(&contributions_len)
+ totals.len() * ATTEST_CONTRIBUTIONS_ELEMENT_LEN,
);
// push header
attested.push(PAYLOAD_ATTEST_CONTRIBUTIONS);
attested.extend(self.id.iter());
attested.extend(CHAIN_ID.to_be_bytes());
// push contributions length
attested.push(contributions_len);
// push each total contributions
for total in totals {
attested.push(total.token_index);
attested.extend(vec![0; PAD_U64]); // contribution is 8 bytes, but we need 32 bytes in the payload, so we left-pad
attested.extend(total.contributions.to_be_bytes());
}
Ok(attested)
}
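The wire format built by `serialize_contributions` can be reproduced with plain std code. A sketch, assuming this repository's constants (payload type `2` for AttestContributions, and 24 bytes of left-padding so each u64 amount occupies 32 bytes):

```rust
/// AttestContributions payload layout:
/// [type:1][sale_id:32][chain_id:2 BE][count:1], then per accepted token
/// [token_index:1][amount left-padded to 32 bytes, big-endian].
fn attest_contributions_payload(sale_id: &[u8; 32], chain_id: u16, totals: &[(u8, u64)]) -> Vec<u8> {
    let mut out = Vec::with_capacity(36 + totals.len() * 33);
    out.push(2); // assumed PAYLOAD_ATTEST_CONTRIBUTIONS
    out.extend_from_slice(sale_id);
    out.extend_from_slice(&chain_id.to_be_bytes());
    out.push(totals.len() as u8);
    for (token_index, amount) in totals {
        out.push(*token_index);
        out.extend_from_slice(&[0u8; 24]); // left-pad the u64 up to 32 bytes
        out.extend_from_slice(&amount.to_be_bytes());
    }
    out
}
```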
pub fn parse_sale_sealed(&mut self, payload: &[u8]) -> Result<()> {
require!(!self.has_ended(), ContributorError::SaleEnded);
// check that the payload has at least the number of bytes
// required to define the number of allocations
require!(
payload.len() > INDEX_SALE_SEALED_ALLOCATIONS_START,
ContributorError::InvalidVaaPayload
);
let num_allocations = payload[INDEX_SALE_SEALED_ALLOCATIONS_START] as usize;
require!(
num_allocations == self.totals.len(),
ContributorError::InvalidVaaPayload
);
let decimal_difference = (self.token_decimals - self.native_token_decimals) as u32;
let pow10_divider = BigUint::from(10u128).pow(decimal_difference);
// deserialize other things
for i in 0..num_allocations {
let start = INDEX_SALE_SEALED_ALLOCATIONS_START + 1 + ALLOCATION_NUM_BYTES * i;
let total = &self.totals[i];
require!(
payload[start] == total.token_index,
ContributorError::InvalidVaaPayload
);
let total = &mut self.totals[i];
// convert allocation to u64 based on decimal difference and save
let raw_allocation = BigUint::from_bytes_be(
&payload[start + INDEX_ALLOCATIONS_AMOUNT..start + INDEX_ALLOCATIONS_EXCESS],
);
total.allocations = (raw_allocation / &pow10_divider)
.to_u64()
.ok_or(ContributorError::AmountTooLarge)?;
// and save excess contribution
total.excess_contributions = BigUint::from_bytes_be(
&payload[start + INDEX_ALLOCATIONS_EXCESS..start + INDEX_ALLOCATIONS_END],
)
.to_u64()
.ok_or(ContributorError::AmountTooLarge)?;
}
// finally set the status to sealed
self.status = SaleStatus::Sealed;
Ok(())
}
pub fn parse_sale_aborted(&mut self, payload: &[u8]) -> Result<()> {
require!(!self.has_ended(), ContributorError::SaleEnded);
// check that the payload has the correct size
// payload type + sale id
require!(
payload.len() == PAYLOAD_HEADER_LEN,
ContributorError::InvalidVaaPayload
);
// finally set the status to aborted
self.status = SaleStatus::Aborted;
Ok(())
}
pub fn verify_kyc_authority(
&self,
token_index: u8,
amount: u64,
buyer: &Pubkey,
prev_contribution: u64,
kyc_signature: &[u8],
) -> Result<()> {
require!(
kyc_signature.len() == 65,
ContributorError::InvalidKycSignature
);
// first encode arguments
let mut encoded: Vec<u8> = Vec::with_capacity(6 * 32);
// grab conductor address from Custodian
encoded.extend(Custodian::conductor_address()?); // 32
// sale id
encoded.extend(self.id); // 32
// token index
encoded.extend(vec![0u8; PAD_U8]); // 31 (zero padding u8)
encoded.push(token_index); // 1
// amount
encoded.extend(vec![0u8; PAD_U64]); // 24
encoded.extend(amount.to_be_bytes()); // 8
// buyer
encoded.extend(buyer.to_bytes()); // 32
// previously contributed amount
encoded.extend(vec![0u8; PAD_U64]); // 24
encoded.extend(prev_contribution.to_be_bytes()); // 8
let hash = keccak::hash(&encoded);
let recovered = ethereum_ecrecover(kyc_signature, &hash.to_bytes())?;
require!(
recovered == self.kyc_authority,
ContributorError::InvalidKycSignature
);
Ok(())
}
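The buffer hashed in `verify_kyc_authority` is six 32-byte fields, each left-padded big-endian, mirroring a Solidity `abi.encode` of the same tuple. A standalone sketch of the encoding (without the keccak/ecrecover steps):

```rust
/// Rebuilds the 6 * 32 = 192-byte message the KYC authority signs over:
/// conductor, sale id, token index, amount, buyer, previous contribution,
/// each field left-padded to 32 bytes.
fn encode_kyc_message(
    conductor: &[u8; 32],
    sale_id: &[u8; 32],
    token_index: u8,
    amount: u64,
    buyer: &[u8; 32],
    prev_contribution: u64,
) -> Vec<u8> {
    let mut out = Vec::with_capacity(6 * 32);
    out.extend_from_slice(conductor);
    out.extend_from_slice(sale_id);
    out.extend_from_slice(&[0u8; 31]); // pad u8 to 32 bytes
    out.push(token_index);
    out.extend_from_slice(&[0u8; 24]); // pad u64 to 32 bytes
    out.extend_from_slice(&amount.to_be_bytes());
    out.extend_from_slice(buyer);
    out.extend_from_slice(&[0u8; 24]);
    out.extend_from_slice(&prev_contribution.to_be_bytes());
    out
}
```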
pub fn is_active(&self, block_time: i64) -> bool {
self.initialized && self.status == SaleStatus::Active && block_time as u64 <= self.times.end
}
fn is_attestable(&self, block_time: i64) -> bool {
self.initialized && self.status == SaleStatus::Active && block_time as u64 > self.times.end
}
pub fn has_ended(&self) -> bool {
self.initialized && self.status != SaleStatus::Active
}
pub fn is_sealed(&self) -> bool {
self.initialized && self.status == SaleStatus::Sealed
}
pub fn is_aborted(&self) -> bool {
self.initialized && self.status == SaleStatus::Aborted
}
pub fn get_index(&self, token_index: u8) -> Result<usize> {
let idx = self
.totals
.iter()
.position(|item| item.token_index == token_index)
.ok_or(ContributorError::InvalidTokenIndex)?;
Ok(idx)
}
fn get_id(payload: &[u8]) -> [u8; 32] {
let mut output = [0u8; 32];
output.copy_from_slice(&payload[INDEX_SALE_ID..INDEX_SALE_ID + 32]);
output
}
}
// assuming all slices are the correct sizes...
fn to_u16_be(bytes: &[u8], index: usize) -> u16 {
u16::from_be_bytes(bytes[index..index + 2].try_into().unwrap())
}
fn to_u64_be(bytes: &[u8], index: usize) -> u64 {
u64::from_be_bytes(bytes[index..index + 8].try_into().unwrap())
}
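The `+ 24` offsets used for the sale timestamps rely on an EVM `uint256` serializing as 32 big-endian bytes: when the value fits in a `u64`, the first 24 bytes are zero and only the trailing 8 matter. A minimal sketch of that read:

```rust
/// Reads a 32-byte big-endian uint256 that is assumed to fit in u64 by
/// skipping its 24 leading (zero) bytes -- the same trick as the
/// `INDEX_... + 24` offsets above. Panics if the slice is too short.
fn uint256_to_u64(bytes: &[u8], index: usize) -> u64 {
    u64::from_be_bytes(bytes[index + 24..index + 32].try_into().unwrap())
}
```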


@@ -0,0 +1,17 @@
use anchor_lang::prelude::*;
use borsh::{BorshDeserialize, BorshSerialize};
/**
* Same as TransferNative & TransferWrapped Data.
*/
#[derive(BorshDeserialize, BorshSerialize, Default)]
pub struct TransferData {
pub nonce: u32,
pub amount: u64,
pub fee: u64,
pub target_address: [u8; 32],
pub target_chain: u16,
}
pub const TRANSFER_WRAPPED_INSTRUCTION: u8 = 4;
pub const TRANSFER_NATIVE_INSTRUCTION: u8 = 5;


@@ -0,0 +1,113 @@
use anchor_lang::prelude::*;
use borsh::{BorshDeserialize, BorshSerialize};
use std::io::Write;
#[derive(AnchorDeserialize, AnchorSerialize)]
pub struct PostMessageData {
/// Unique nonce for this message
pub nonce: u32,
/// Message payload
pub payload: Vec<u8>,
/// Commitment Level required for an attestation to be produced
pub consistency_level: ConsistencyLevel,
}
#[derive(AnchorDeserialize, AnchorSerialize)]
pub enum ConsistencyLevel {
Confirmed,
Finalized,
}
#[derive(AnchorDeserialize, AnchorSerialize)]
pub enum Instruction {
Initialize,
PostMessage,
PostVAA,
SetFees,
TransferFees,
UpgradeContract,
UpgradeGuardianSet,
VerifySignatures,
}
#[derive(AnchorDeserialize, AnchorSerialize, Clone)]
pub struct BridgeData {
/// The current guardian set index, used to decide which signature sets to accept.
pub guardian_set_index: u32,
/// Lamports in the collection account
pub last_lamports: u64,
/// Bridge configuration, which is set once upon initialization.
pub config: BridgeConfig,
}
#[derive(AnchorDeserialize, AnchorSerialize, Clone)]
pub struct BridgeConfig {
/// Period for how long a guardian set is valid after it has been replaced by a new one. This
/// guarantees that VAAs issued by that set can still be submitted for a certain period. In
/// this period we still trust the old guardian set.
pub guardian_set_expiration_time: u32,
/// Amount of lamports that needs to be paid to the protocol to post a message
pub fee: u64,
}
#[derive(Debug)]
#[repr(transparent)]
pub struct PostedMessageData(pub MessageData);
#[derive(Debug, Default, BorshDeserialize, BorshSerialize)]
pub struct MessageData {
/// Header of the posted VAA
pub vaa_version: u8,
/// Level of consistency requested by the emitter
pub consistency_level: u8,
/// Time the vaa was submitted
pub vaa_time: u32,
/// Account where signatures are stored
pub vaa_signature_account: Pubkey,
/// Time the posted message was created
pub submission_time: u32,
/// Unique nonce for this message
pub nonce: u32,
/// Sequence number of this message
pub sequence: u64,
/// Emitter of the message
pub emitter_chain: u16,
/// Emitter of the message
pub emitter_address: [u8; 32],
/// Message payload
pub payload: Vec<u8>,
}
impl AnchorSerialize for PostedMessageData {
fn serialize<W: Write>(&self, writer: &mut W) -> std::io::Result<()> {
writer.write_all(b"msg")?;
BorshSerialize::serialize(&self.0, writer)
}
}
impl AnchorDeserialize for PostedMessageData {
fn deserialize(buf: &mut &[u8]) -> std::io::Result<Self> {
// verify and skip the 3-byte b"msg" prefix instead of assuming it
if buf.len() < 3 || &buf[..3] != b"msg" {
return Err(std::io::Error::new(
std::io::ErrorKind::InvalidData,
"invalid posted message prefix",
));
}
*buf = &buf[3..];
Ok(PostedMessageData(
<MessageData as BorshDeserialize>::deserialize(buf)?,
))
}
}
pub fn get_message_data<'info>(vaa_account: &AccountInfo<'info>) -> Result<MessageData> {
Ok(PostedMessageData::try_from_slice(&vaa_account.data.borrow())?.0)
}
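Posted-message accounts store a 3-byte `b"msg"` discriminator ahead of the Borsh-serialized `MessageData`, which is why deserialization advances the buffer by three bytes. A standalone sketch of a prefix check that validates rather than blindly skips:

```rust
/// Returns the Borsh-encoded body of a posted message account, verifying
/// the b"msg" discriminator (illustrative helper, not program code).
fn strip_msg_prefix(data: &[u8]) -> Option<&[u8]> {
    data.strip_prefix(b"msg")
}
```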

anchor-contributor/test.env Executable file

@@ -0,0 +1,8 @@
export CONDUCTOR_CHAIN=2
export CONDUCTOR_ADDRESS="0000000000000000000000005c49f34d92316a2ac68d10a1e2168e16610e84f9"
export GLOBAL_KYC_AUTHORITY="1df62f291b2e969fb0849d99d9ce41e2f137006e"
export CORE_BRIDGE_ADDRESS="Bridge1p5gheXUvJ6jGWGeCsgPKgnE3YgdGKRVCMY9o"
export TOKEN_BRIDGE_ADDRESS="B6RHG3mfcckmrYN1UhmJzyS1XX3fZKbkeUcpJe9Sy3FE"
# misc
export BROWSER=""


@@ -0,0 +1,978 @@
import { AnchorProvider, workspace, web3, Program, setProvider, BN } from "@project-serum/anchor";
import { AnchorContributor } from "../target/types/anchor_contributor";
import { expect } from "chai";
import { readFileSync } from "fs";
import {
CHAIN_ID_SOLANA,
setDefaultWasm,
tryHexToNativeString,
tryNativeToHexString,
transferFromSolana,
tryUint8ArrayToNative,
importCoreWasm,
getOriginalAssetSol,
uint8ArrayToHex,
} from "@certusone/wormhole-sdk";
import { sendAndConfirmTransaction } from "@solana/web3.js";
import {
getOrCreateAssociatedTokenAccount,
mintTo,
Account as AssociatedTokenAccount,
getAssociatedTokenAddress,
getAccount,
getMint,
} from "@solana/spl-token";
import { DummyConductor } from "./helpers/conductor";
import { IccoContributor } from "./helpers/contributor";
import {
deriveAddress,
getBlockTime,
getPdaAssociatedTokenAddress,
getPdaSplBalance,
getSplBalance,
hexToPublicKey,
wait,
} from "./helpers/utils";
import { BigNumber } from "ethers";
import { KycAuthority } from "./helpers/kyc";
import {
CONDUCTOR_ADDRESS,
CONDUCTOR_CHAIN,
CORE_BRIDGE_ADDRESS,
KYC_PRIVATE,
TOKEN_BRIDGE_ADDRESS,
} from "./helpers/consts";
// be careful where you import this
import { postVaaSolanaWithRetry } from "@certusone/wormhole-sdk";
setDefaultWasm("node");
describe("anchor-contributor", () => {
// Configure the client to use the local cluster.
setProvider(AnchorProvider.env());
const program = workspace.AnchorContributor as Program<AnchorContributor>;
const connection = program.provider.connection;
const orchestrator = web3.Keypair.fromSecretKey(
Uint8Array.from(JSON.parse(readFileSync("./tests/test_orchestrator_keypair.json").toString()))
);
const buyer = web3.Keypair.fromSecretKey(
Uint8Array.from(JSON.parse(readFileSync("./tests/test_buyer_keypair.json").toString()))
);
// dummy conductor to generate vaas
const dummyConductor = new DummyConductor(CONDUCTOR_CHAIN, CONDUCTOR_ADDRESS);
// our contributor
const contributor = new IccoContributor(program, CORE_BRIDGE_ADDRESS, TOKEN_BRIDGE_ADDRESS, postVaaSolanaWithRetry);
// kyc for signing contributions
const kyc = new KycAuthority(KYC_PRIVATE, CONDUCTOR_ADDRESS, contributor);
before("Airdrop SOL", async () => {
await connection.requestAirdrop(buyer.publicKey, 8000000000); // 8,000,000,000 lamports
// wait for the airdrop to finalize before proceeding
await wait(5);
});
describe("Test Preparation", () => {
it("Create Dummy Sale Token", async () => {
// we need to simulate attesting the sale token on Solana.
// this allows us to "redeem" the sale token prior to sealing the sale
// (which in the case of this test means minting it on the contributor program's ATA)
await dummyConductor.attestSaleToken(connection, orchestrator);
});
it("Bridge Sale Token To Null Recipient", async () => {
// send one unit of the sale token through the token bridge to a null
// recipient so the bridge's custody account for the sale token mint
// exists before the sale is sealed
const saleTokenMint = dummyConductor.getSaleTokenOnSolana();
const tokenAccount = await getOrCreateAssociatedTokenAccount(
connection,
orchestrator,
dummyConductor.getSaleTokenOnSolana(),
orchestrator.publicKey
);
await mintTo(connection, orchestrator, saleTokenMint, tokenAccount.address, orchestrator, 1n);
const transaction = await transferFromSolana(
connection,
CORE_BRIDGE_ADDRESS.toString(),
TOKEN_BRIDGE_ADDRESS.toString(),
orchestrator.publicKey.toString(),
tokenAccount.address.toString(),
saleTokenMint.toString(),
1n,
new Uint8Array(32), // null address
"ethereum"
);
transaction.partialSign(orchestrator);
const txid = await connection.sendRawTransaction(transaction.serialize());
});
it("Mint Accepted SPL Tokens to Buyer", async () => {
// first create them and add them to the accepted tokens list
const acceptedTokens = await dummyConductor.createAcceptedTokens(connection, orchestrator);
for (const token of acceptedTokens) {
const mint = hexToPublicKey(token.address);
// create ata for buyer
const tokenAccount = await getOrCreateAssociatedTokenAccount(connection, buyer, mint, buyer.publicKey);
// now mint to buyer for testing
let amount = new BN("200000000000");
await mintTo(
connection,
orchestrator,
mint,
tokenAccount.address,
orchestrator,
BigInt(amount.toString()) // 200,000,000,000 base units of the token
);
const balance = await getSplBalance(connection, mint, buyer.publicKey);
expect(balance.toString()).to.equal(amount.toString());
}
});
});
describe("Custodian Setup", () => {
it("Create Custodian", async () => {
const tx = await contributor.createCustodian(orchestrator);
// nothing to verify
});
it("Create ATAs for Custodian", async () => {
for (const token of dummyConductor.acceptedTokens) {
const allowOwnerOffCurve = true;
await getOrCreateAssociatedTokenAccount(
connection,
orchestrator,
hexToPublicKey(token.address),
contributor.custodian,
allowOwnerOffCurve
);
}
});
});
describe("Conduct Successful Sale", () => {
// global contributions for test
const contributions = new Map<number, string[]>();
const totalContributions: BN[] = [];
// squirrel away associated sale token account
let saleTokenAccount: AssociatedTokenAccount;
it("Create ATA for Sale Token if Non-Existent", async () => {
const allowOwnerOffCurve = true;
saleTokenAccount = await getOrCreateAssociatedTokenAccount(
connection,
orchestrator,
dummyConductor.getSaleTokenOnSolana(),
contributor.custodian,
allowOwnerOffCurve
);
});
it("Orchestrator Initialize Sale with Signed VAA", async () => {
const startTime = 8 + (await getBlockTime(connection));
const duration = 8; // seconds
const initSaleVaa = dummyConductor.createSale(startTime, duration, saleTokenAccount.address);
const tx = await contributor.initSale(orchestrator, initSaleVaa, dummyConductor.getSaleTokenOnSolana());
{
// get the first sale state
const saleId = dummyConductor.getSaleId();
const saleState = await contributor.getSale(saleId);
// verify
expect(Uint8Array.from(saleState.id)).to.deep.equal(saleId);
//expect(Uint8Array.from(saleState.tokenAddress)).to.deep.equal(Buffer.from(dummyConductor.tokenAddress, "hex"));
expect(saleState.tokenChain).to.equal(dummyConductor.tokenChain);
expect(saleState.tokenDecimals).to.equal(dummyConductor.tokenDecimals);
expect(saleState.times.start.toString()).to.equal(dummyConductor.saleStart.toString());
expect(saleState.times.end.toString()).to.equal(dummyConductor.saleEnd.toString());
expect(Uint8Array.from(saleState.recipient)).to.deep.equal(Buffer.from(dummyConductor.recipient, "hex"));
expect(saleState.status).has.key("active");
// check totals
const totals: any = saleState.totals;
const numAccepted = dummyConductor.acceptedTokens.length;
expect(totals.length).to.equal(numAccepted);
for (let i = 0; i < numAccepted; ++i) {
const total = totals[i];
const acceptedToken = dummyConductor.acceptedTokens[i];
expect(total.tokenIndex).to.equal(acceptedToken.index);
expect(tryNativeToHexString(total.mint.toString(), CHAIN_ID_SOLANA)).to.equal(acceptedToken.address);
expect(total.contributions.toString()).to.equal("0");
expect(total.allocations.toString()).to.equal("0");
expect(total.excessContributions.toString()).to.equal("0");
}
}
});
it("Orchestrator Cannot Initialize Sale Again with Signed VAA", async () => {
let caughtError = false;
try {
const tx = await contributor.initSale(
orchestrator,
dummyConductor.initSaleVaa,
dummyConductor.getSaleTokenOnSolana()
);
throw Error(`should not happen: ${tx}`);
} catch (e) {
// pda init should fail
caughtError = "programErrorStack" in e;
}
if (!caughtError) {
throw Error("did not catch expected error");
}
});
it("User Cannot Contribute Too Early", async () => {
const saleId = dummyConductor.getSaleId();
const tokenIndex = 2;
const amount = new BN("1000000000"); // 1,000,000,000 lamports
let caughtError = false;
try {
const tx = await contributor.contribute(
buyer,
saleId,
tokenIndex,
amount,
await kyc.signContribution(saleId, tokenIndex, amount, buyer.publicKey)
);
throw Error(`should not happen: ${tx}`);
} catch (e) {
caughtError = verifyErrorMsg(e, "ContributionTooEarly");
}
if (!caughtError) {
throw Error("did not catch expected error");
}
});
it("User Cannot Contribute With Bad Signature", async () => {
// wait for sale to start here
const saleStart = dummyConductor.saleStart;
await waitUntilBlock(connection, saleStart);
const saleId = dummyConductor.getSaleId();
const tokenIndex = 2;
const amount = new BN("1000000000"); // 1,000,000,000 lamports
let caughtError = false;
try {
// generate bad signature w/ amount that disagrees w/ instruction input
const badSignature = await kyc.signContribution(saleId, tokenIndex, new BN("42069"), buyer.publicKey);
const tx = await contributor.contribute(buyer, saleId, tokenIndex, amount, badSignature);
throw Error(`should not happen: ${tx}`);
} catch (e) {
caughtError = verifyErrorMsg(e, "InvalidKycSignature");
}
if (!caughtError) {
throw Error("did not catch expected error");
}
});
it("User Contributes to Sale", async () => {
// prep contributions info
const acceptedTokens = dummyConductor.acceptedTokens;
const contributedTokenIndices = [acceptedTokens[0].index, acceptedTokens[3].index];
contributions.set(contributedTokenIndices[0], ["1200000000", "3400000000"]);
contributions.set(contributedTokenIndices[1], ["5600000000", "7800000000"]);
contributedTokenIndices.forEach((tokenIndex) => {
const amounts = contributions.get(tokenIndex);
totalContributions.push(amounts.map((x) => new BN(x)).reduce((prev, curr) => prev.add(curr)));
});
const acceptedMints = acceptedTokens.map((token) => hexToPublicKey(token.address));
const startingBalanceBuyer = await Promise.all(
acceptedMints.map(async (mint) => {
return getSplBalance(connection, mint, buyer.publicKey);
})
);
const startingBalanceCustodian = await Promise.all(
acceptedMints.map(async (mint) => {
return getPdaSplBalance(connection, mint, contributor.custodian);
})
);
// now go about your business
// contribute multiple times
const saleId = dummyConductor.getSaleId();
for (const tokenIndex of contributedTokenIndices) {
for (const amount of contributions.get(tokenIndex).map((value) => new BN(value))) {
const tx = await contributor.contribute(
buyer,
saleId,
tokenIndex,
amount,
await kyc.signContribution(saleId, tokenIndex, amount, buyer.publicKey)
);
}
}
const endingBalanceBuyer = await Promise.all(
acceptedMints.map(async (mint) => {
return getSplBalance(connection, mint, buyer.publicKey);
})
);
const endingBalanceCustodian = await Promise.all(
acceptedMints.map(async (mint) => {
return getPdaSplBalance(connection, mint, contributor.custodian);
})
);
const expectedContributedAmounts = [
totalContributions[0],
new BN(0),
new BN(0),
totalContributions[1],
new BN(0),
new BN(0),
new BN(0),
new BN(0),
];
const numExpected = expectedContributedAmounts.length;
// check buyer state
{
const buyerState = await contributor.getBuyer(saleId, buyer.publicKey);
const totals: any = buyerState.contributions;
expect(totals.length).to.equal(numExpected);
// check balance changes and state
for (let i = 0; i < numExpected; ++i) {
let contribution = expectedContributedAmounts[i];
expect(startingBalanceBuyer[i].sub(contribution).toString()).to.equal(endingBalanceBuyer[i].toString());
expect(startingBalanceCustodian[i].add(contribution).toString()).to.equal(
endingBalanceCustodian[i].toString()
);
let item = totals[i];
const expectedState = contribution.eq(new BN("0")) ? "inactive" : "active";
expect(item.status).has.key(expectedState);
expect(item.amount.toString()).to.equal(contribution.toString());
expect(item.excess.toString()).to.equal("0");
}
}
// check sale state
{
const saleState = await contributor.getSale(saleId);
const totals: any = saleState.totals;
expect(totals.length).to.equal(numExpected);
for (let i = 0; i < numExpected; ++i) {
const total = totals[i];
expect(total.contributions.toString()).to.equal(expectedContributedAmounts[i].toString());
expect(total.allocations.toString()).to.equal("0");
expect(total.excessContributions.toString()).to.equal("0");
}
}
});
it("Orchestrator Cannot Attest Contributions Too Early", async () => {
const saleId = dummyConductor.getSaleId();
let caughtError = false;
try {
const tx = await contributor.attestContributions(orchestrator, saleId);
throw Error(`should not happen: ${tx}`);
} catch (e) {
caughtError = verifyErrorMsg(e, "SaleNotAttestable");
}
if (!caughtError) {
throw Error("did not catch expected error");
}
});
it("Orchestrator Attests Contributions", async () => {
// wait for sale to end here
const saleEnd = dummyConductor.saleEnd;
const saleId = dummyConductor.getSaleId();
await waitUntilBlock(connection, saleEnd);
const tx = await contributor.attestContributions(orchestrator, saleId);
const expectedContributedAmounts = [
totalContributions[0],
new BN(0),
new BN(0),
totalContributions[1],
new BN(0),
new BN(0),
new BN(0),
new BN(0),
];
const numExpected = expectedContributedAmounts.length;
// now go about your business. read VAA back.
await connection.confirmTransaction(tx);
const vaaAccountInfo = await connection.getAccountInfo(
contributor.deriveAttestContributionsMessageAccount(saleId),
"confirmed"
);
const payload = vaaAccountInfo.data.subarray(95); // 95 is where the payload starts
const headerLength = 33;
const contributionLength = 33;
expect(payload.length).to.equal(headerLength + 3 + contributionLength * numExpected);
const payloadId = 2;
expect(payload.readUint8(0)).to.equal(payloadId);
expect(payload.subarray(1, 33).toString("hex")).to.equal(saleId.toString("hex"));
expect(payload.readUint16BE(33)).to.equal(CHAIN_ID_SOLANA as number);
expect(payload.readUint8(35)).to.equal(numExpected);
const contributionsStart = headerLength + 3;
for (let i = 0; i < dummyConductor.acceptedTokens.length; ++i) {
const start = contributionsStart + contributionLength * i;
const tokenIndex = payload.readUint8(start);
expect(tokenIndex).to.equal(dummyConductor.acceptedTokens[i].index);
const amount = new BN(payload.subarray(start + 1, start + 33));
expect(amount.toString()).to.equal(expectedContributedAmounts[i].toString());
}
});
it("User Cannot Contribute After Sale Ended", async () => {
const saleId = dummyConductor.getSaleId();
const tokenIndex = 2;
const amount = new BN("1000000000"); // 1,000,000,000 lamports
let caughtError = false;
try {
const tx = await contributor.contribute(
buyer,
saleId,
tokenIndex,
amount,
await kyc.signContribution(saleId, tokenIndex, amount, buyer.publicKey)
);
throw Error(`should not happen: ${tx}`);
} catch (e) {
caughtError = verifyErrorMsg(e, "SaleEnded");
}
if (!caughtError) {
throw Error("did not catch expected error");
}
});
it("Orchestrator Cannot Seal Sale Without Allocation Bridge Transfers Redeemed", async () => {
const saleSealedVaa = dummyConductor.sealSale(await getBlockTime(connection), contributions);
let caughtError = false;
try {
const tx = await contributor.sealSale(orchestrator, saleSealedVaa);
throw Error(`should not happen: ${tx}`);
} catch (e) {
caughtError = verifyErrorMsg(e, "InsufficientFunds");
}
if (!caughtError) {
throw Error("did not catch expected error");
}
});
it("Orchestrator Cannot Bridge Before SaleSealed VAA Processed", async () => {
const saleId = dummyConductor.getSaleId();
const sale = await contributor.getSale(saleId);
const assets: any = sale.totals;
let caughtError = false;
try {
const tx = await contributor.bridgeSealedContribution(orchestrator, saleId, assets[0].mint);
throw Error(`should not happen: ${tx}`);
} catch (e) {
caughtError = verifyErrorMsg(e, "SaleNotSealed");
}
if (!caughtError) {
throw Error("did not catch expected error");
}
});
it("Orchestrator Seals Sale with Signed VAA", async () => {
// all we're doing here is minting spl tokens to replicate token bridge's mechanism
// of unlocking or minting tokens to someone's associated token account
await dummyConductor.redeemAllocationsOnSolana(connection, orchestrator, contributor.custodian);
// now go about your business
const saleSealedVaa = dummyConductor.sealSale(await getBlockTime(connection), contributions);
const allocations = dummyConductor.allocations;
const tx = await contributor.sealSale(orchestrator, saleSealedVaa);
{
// get the first sale state
const saleId = dummyConductor.getSaleId();
const saleState = await contributor.getSale(saleId);
// verify
expect(saleState.status).has.key("sealed");
const totals: any = saleState.totals;
expect(totals.length).to.equal(allocations.length);
const allocationDivisor = dummyConductor.getAllocationMultiplier();
for (let i = 0; i < totals.length; ++i) {
const actual = totals[i];
const expected = allocations[i];
const adjustedAllocation = BigNumber.from(expected.allocation).div(allocationDivisor).toString();
expect(actual.allocations.toString()).to.equal(adjustedAllocation);
expect(actual.excessContributions.toString()).to.equal(expected.excessContribution);
if (expected.allocation == "0") {
expect(actual.status).has.key("nothingToTransfer");
} else {
expect(actual.status).has.key("readyForTransfer");
}
}
}
});
it("Orchestrator Cannot Seal Sale Again with Signed VAA", async () => {
const saleSealedVaa = dummyConductor.sealSale(await getBlockTime(connection), contributions);
let caughtError = false;
try {
const tx = await contributor.sealSale(orchestrator, saleSealedVaa);
throw Error(`should not happen: ${tx}`);
} catch (e) {
caughtError = verifyErrorMsg(e, "SaleEnded");
}
if (!caughtError) {
throw Error("did not catch expected error");
}
});
it("Orchestrator Bridges Contributions to Conductor", async () => {
const saleId = dummyConductor.getSaleId();
const sale = await contributor.getSale(saleId);
const assets: any = sale.totals;
const expectedContributedAmounts = [
totalContributions[0],
new BN(0),
new BN(0),
totalContributions[1],
new BN(0),
new BN(0),
new BN(0),
new BN(0),
];
const expectedSealedAmounts = dummyConductor.allocations.map((item, i) =>
expectedContributedAmounts[i].sub(new BN(item.excessContribution))
);
const numExpected = expectedSealedAmounts.length;
// token bridge truncates to 8 decimals
const tokenBridgeDecimals = 8;
for (let i = 0; i < numExpected; ++i) {
const asset = assets[i];
if (asset.status.readyForTransfer) {
const mint = asset.mint;
const tx = await contributor.bridgeSealedContribution(orchestrator, saleId, mint);
// now go about your business. read VAA back.
await connection.confirmTransaction(tx);
const vaaAccountInfo = await connection.getAccountInfo(
contributor.deriveSealedTransferMessageAccount(saleId, mint),
"confirmed"
);
const payload = vaaAccountInfo.data.subarray(95); // 95 is where the payload starts
expect(payload.length).to.equal(133); // 1 + 32 + 32 + 2 + 32 + 2 + 32
expect(payload[0]).to.equal(1); // payload 1 is token transfer
const parsedAmount = new BN(payload.subarray(1, 33));
const mintInfo = await getMint(connection, mint);
const divisor = (() => {
const decimals = mintInfo.decimals;
if (decimals > tokenBridgeDecimals) {
return new BN("10").pow(new BN(decimals - tokenBridgeDecimals));
} else {
return new BN("1");
}
})();
expect(parsedAmount.toString()).to.equal(expectedSealedAmounts[i].div(divisor).toString());
const parsedTokenAddress = payload.subarray(33, 65);
const parsedTokenChain = payload.readUint16BE(65);
const tokenMintSigner = deriveAddress([Buffer.from("mint_signer")], TOKEN_BRIDGE_ADDRESS);
if (mintInfo.mintAuthority != null && mintInfo.mintAuthority.equals(tokenMintSigner)) {
// wrapped, so get native info
const nativeInfo = await getOriginalAssetSol(connection, TOKEN_BRIDGE_ADDRESS.toString(), mint.toString());
expect(uint8ArrayToHex(nativeInfo.assetAddress)).to.equal(parsedTokenAddress.toString("hex"));
expect(parsedTokenChain).to.equal(nativeInfo.chainId as number);
} else {
// native, so use pubkeys
expect(new web3.PublicKey(parsedTokenAddress).toString()).to.equal(mint.toString());
expect(parsedTokenChain).to.equal(CHAIN_ID_SOLANA as number);
}
const parsedTo = payload.subarray(67, 99);
expect(parsedTo.toString("hex")).to.equal(dummyConductor.recipient);
const parsedToChain = payload.readUint16BE(99);
expect(parsedToChain).to.equal(CONDUCTOR_CHAIN);
const parsedFee = payload.subarray(101, 133);
expect(new BN(parsedFee).toString()).to.equal("0");
} else {
let caughtError = false;
try {
const tx = await contributor.bridgeSealedContribution(orchestrator, saleId, asset.mint);
throw Error(`should not happen: ${tx}`);
} catch (e) {
caughtError = verifyErrorMsg(e, "TransferNotAllowed");
}
if (!caughtError) {
throw Error("did not catch expected error");
}
}
}
});
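The byte offsets asserted in the test above follow Wormhole's token transfer payload (payload id 1): 1 byte id, 32-byte amount, 32-byte token address, 2-byte token chain, 32-byte recipient, 2-byte recipient chain, 32-byte fee. As a minimal sketch, a hypothetical `parseTokenTransfer` helper (not part of this SDK) would read the same fields:

```typescript
// Hypothetical parser mirroring the offset assertions above.
// 1 + 32 + 32 + 2 + 32 + 2 + 32 = 133 bytes total.
function parseTokenTransfer(payload: Buffer) {
  if (payload.length !== 133 || payload.readUInt8(0) !== 1) {
    throw new Error("not a token transfer payload");
  }
  return {
    amount: BigInt("0x" + payload.subarray(1, 33).toString("hex")),
    tokenAddress: payload.subarray(33, 65).toString("hex"),
    tokenChain: payload.readUInt16BE(65),
    to: payload.subarray(67, 99).toString("hex"),
    toChain: payload.readUInt16BE(99),
    fee: BigInt("0x" + payload.subarray(101, 133).toString("hex")),
  };
}
```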
it("Orchestrator Cannot Bridge Contribution Again", async () => {
const saleId = dummyConductor.getSaleId();
const sale = await contributor.getSale(saleId);
const assets: any = sale.totals;
let caughtError = false;
try {
const tx = await contributor.bridgeSealedContribution(orchestrator, saleId, assets[0].mint);
throw Error(`should not happen: ${tx}`);
} catch (e) {
caughtError = verifyErrorMsg(e, "TransferNotAllowed");
}
if (!caughtError) {
throw Error("did not catch expected error");
}
});
it("User Claims Allocations From Sale", async () => {
const saleId = dummyConductor.getSaleId();
const sale = await contributor.getSale(saleId);
const assets: any = sale.totals;
const startingBalanceBuyer = await Promise.all(
assets.map(async (asset) => {
return getSplBalance(connection, asset.mint, buyer.publicKey);
})
);
const startingBalanceCustodian = await Promise.all(
assets.map(async (asset) => {
return getPdaSplBalance(connection, asset.mint, contributor.custodian);
})
);
const tx = await contributor.claimAllocation(buyer, saleId);
const endingBalanceBuyer = await Promise.all(
assets.map(async (asset) => {
return getSplBalance(connection, asset.mint, buyer.publicKey);
})
);
const endingBalanceCustodian = await Promise.all(
assets.map(async (asset) => {
return getPdaSplBalance(connection, asset.mint, contributor.custodian);
})
);
// get state
const buyerState = await contributor.getBuyer(saleId, buyer.publicKey);
expect(buyerState.allocation.claimed).to.be.true;
const allocationDivisor = new BN(dummyConductor.getAllocationMultiplier());
const expectedAllocation = dummyConductor.allocations
.map((item) => new BN(item.allocation))
.reduce((prev, curr) => prev.add(curr))
.div(allocationDivisor);
expect(buyerState.allocation.amount.toString()).to.equal(expectedAllocation.toString());
const expectedExcessAmounts = dummyConductor.allocations.map((item) => new BN(item.excessContribution));
const numExpected = expectedExcessAmounts.length;
const totals: any = buyerState.contributions;
expect(totals.length).to.equal(numExpected);
// check balance changes and state
for (let i = 0; i < numExpected; ++i) {
let excess = expectedExcessAmounts[i];
expect(startingBalanceBuyer[i].add(excess).toString()).to.equal(endingBalanceBuyer[i].toString());
expect(startingBalanceCustodian[i].sub(excess).toString()).to.equal(endingBalanceCustodian[i].toString());
const item = totals[i];
expect(item.status).has.key("excessClaimed");
expect(item.excess.toString()).to.equal(excess.toString());
}
});
// TODO
it("User Cannot Claim Allocations Again", async () => {
const saleId = dummyConductor.getSaleId();
let caughtError = false;
try {
const tx = await contributor.claimAllocation(buyer, saleId);
throw Error(`should not happen: ${tx}`);
} catch (e) {
caughtError = verifyErrorMsg(e, "AlreadyClaimed");
}
if (!caughtError) {
throw Error("did not catch expected error");
}
});
});
describe("Conduct Aborted Sale", () => {
// global contributions for test
const contributions = new Map<number, string[]>();
const totalContributions: BN[] = [];
// squirrel away associated sale token account
let saleTokenAccount: AssociatedTokenAccount;
it("Create ATA for Sale Token if Non-Existent", async () => {
const allowOwnerOffCurve = true;
saleTokenAccount = await getOrCreateAssociatedTokenAccount(
connection,
orchestrator,
dummyConductor.getSaleTokenOnSolana(),
contributor.custodian,
allowOwnerOffCurve
);
});
it("Orchestrator Initialize Sale with Signed VAA", async () => {
const startTime = 8 + (await getBlockTime(connection));
const duration = 8; // seconds
const initSaleVaa = dummyConductor.createSale(startTime, duration, saleTokenAccount.address);
const tx = await contributor.initSale(orchestrator, initSaleVaa, dummyConductor.getSaleTokenOnSolana());
{
const saleId = dummyConductor.getSaleId();
const saleState = await contributor.getSale(saleId);
// verify
expect(Uint8Array.from(saleState.id)).to.deep.equal(saleId);
//expect(Uint8Array.from(saleState.tokenAddress)).to.deep.equal(Buffer.from(dummyConductor.tokenAddress, "hex"));
expect(saleState.tokenChain).to.equal(dummyConductor.tokenChain);
expect(saleState.tokenDecimals).to.equal(dummyConductor.tokenDecimals);
expect(saleState.nativeTokenDecimals).to.equal(dummyConductor.nativeTokenDecimals);
expect(saleState.times.start.toString()).to.equal(dummyConductor.saleStart.toString());
expect(saleState.times.end.toString()).to.equal(dummyConductor.saleEnd.toString());
expect(Uint8Array.from(saleState.recipient)).to.deep.equal(Buffer.from(dummyConductor.recipient, "hex"));
expect(saleState.status).has.key("active");
// check totals
const totals: any = saleState.totals;
const numAccepted = dummyConductor.acceptedTokens.length;
expect(totals.length).to.equal(numAccepted);
for (let i = 0; i < numAccepted; ++i) {
const total = totals[i];
const acceptedToken = dummyConductor.acceptedTokens[i];
expect(total.tokenIndex).to.equal(acceptedToken.index);
expect(tryNativeToHexString(total.mint.toString(), CHAIN_ID_SOLANA)).to.equal(acceptedToken.address);
expect(total.contributions.toString()).to.equal("0");
expect(total.allocations.toString()).to.equal("0");
expect(total.excessContributions.toString()).to.equal("0");
}
}
});
it("User Contributes to Sale", async () => {
// wait for sale to start here
const saleStart = dummyConductor.saleStart;
await waitUntilBlock(connection, saleStart);
// prep contributions info
const acceptedTokens = dummyConductor.acceptedTokens;
const contributedTokenIndices = [acceptedTokens[0].index, acceptedTokens[3].index];
contributions.set(contributedTokenIndices[0], ["1200000000", "3400000000"]);
contributions.set(contributedTokenIndices[1], ["5600000000", "7800000000"]);
contributedTokenIndices.forEach((tokenIndex) => {
const amounts = contributions.get(tokenIndex);
totalContributions.push(amounts.map((x) => new BN(x)).reduce((prev, curr) => prev.add(curr)));
});
// now go about your business
// contribute multiple times
const saleId = dummyConductor.getSaleId();
for (const tokenIndex of contributedTokenIndices) {
for (const amount of contributions.get(tokenIndex).map((value) => new BN(value))) {
const tx = await contributor.contribute(
buyer,
saleId,
tokenIndex,
amount,
await kyc.signContribution(saleId, tokenIndex, amount, buyer.publicKey)
);
}
}
});
it("Orchestrator Aborts Sale with Signed VAA", async () => {
const saleAbortedVaa = dummyConductor.abortSale(await getBlockTime(connection));
const tx = await contributor.abortSale(orchestrator, saleAbortedVaa);
{
const saleId = dummyConductor.getSaleId();
const saleState = await contributor.getSale(saleId);
expect(saleState.status).has.key("aborted");
}
});
it("Orchestrator Cannot Abort Sale Again", async () => {
const saleAbortedVaa = dummyConductor.abortSale(await getBlockTime(connection));
// cannot abort the sale again
let caughtError = false;
try {
const tx = await contributor.abortSale(orchestrator, saleAbortedVaa);
throw Error(`should not happen: ${tx}`);
} catch (e) {
caughtError = verifyErrorMsg(e, "SaleEnded");
}
if (!caughtError) {
throw Error("did not catch expected error");
}
});
it("User Claims Refund From Sale", async () => {
const saleId = dummyConductor.getSaleId();
const sale = await contributor.getSale(saleId);
const assets: any = sale.totals;
const startingBalanceBuyer = await Promise.all(
assets.map(async (asset) => {
return getSplBalance(connection, asset.mint, buyer.publicKey);
})
);
const startingBalanceCustodian = await Promise.all(
assets.map(async (asset) => {
return getPdaSplBalance(connection, asset.mint, contributor.custodian);
})
);
const tx = await contributor.claimRefunds(buyer, saleId);
const endingBalanceBuyer = await Promise.all(
assets.map(async (asset) => {
return getSplBalance(connection, asset.mint, buyer.publicKey);
})
);
const endingBalanceCustodian = await Promise.all(
assets.map(async (asset) => {
return getPdaSplBalance(connection, asset.mint, contributor.custodian);
})
);
const expectedRefundAmounts = [
totalContributions[0],
new BN(0),
new BN(0),
totalContributions[1],
new BN(0),
new BN(0),
new BN(0),
new BN(0),
];
const numExpected = expectedRefundAmounts.length;
// get state
const buyerState = await contributor.getBuyer(saleId, buyer.publicKey);
const totals: any = buyerState.contributions;
expect(totals.length).to.equal(numExpected);
// check balance changes and state
for (let i = 0; i < numExpected; ++i) {
let refund = expectedRefundAmounts[i];
expect(startingBalanceBuyer[i].add(refund).toString()).to.equal(endingBalanceBuyer[i].toString());
expect(startingBalanceCustodian[i].sub(refund).toString()).to.equal(endingBalanceCustodian[i].toString());
const item = totals[i];
expect(item.status).has.key("refundClaimed");
expect(item.excess.toString()).to.equal(refund.toString());
}
});
it("User Cannot Claim Refund Again", async () => {
const saleId = dummyConductor.getSaleId();
let caughtError = false;
try {
const tx = await contributor.claimRefunds(buyer, saleId);
throw Error(`should not happen: ${tx}`);
} catch (e) {
caughtError = verifyErrorMsg(e, "AlreadyClaimed");
}
if (!caughtError) {
throw Error("did not catch expected error");
}
});
});
});
async function waitUntilBlock(connection: web3.Connection, targetTime: number) {
let blockTime = await getBlockTime(connection);
while (blockTime <= targetTime) {
await wait(1);
blockTime = await getBlockTime(connection);
}
}
function verifyErrorMsg(e: any, msg: string): boolean {
if (e.msg) {
const result = e.msg == msg;
if (!result) {
console.error(e);
}
return result;
} else if (e.error) {
const result = e.error.errorMessage == msg;
if (!result) {
console.error(e);
}
return result;
}
console.error(e);
throw Error("unknown error");
}
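The divisor logic in the bridging test above exists because the token bridge only carries 8 decimals, so amounts from mints with more precision are divided down before crossing the bridge. A standalone sketch with native `bigint` (the `truncateForBridge` helper is hypothetical, for illustration only):

```typescript
// Wormhole's token bridge normalizes amounts to at most 8 decimals.
const TOKEN_BRIDGE_DECIMALS = 8;

function truncateForBridge(amount: bigint, mintDecimals: number): bigint {
  if (mintDecimals <= TOKEN_BRIDGE_DECIMALS) {
    return amount; // 8 or fewer decimals pass through unchanged
  }
  const divisor = 10n ** BigInt(mintDecimals - TOKEN_BRIDGE_DECIMALS);
  return amount / divisor; // integer division drops the dust
}
```

This mirrors the inline divisor computation in the "Orchestrator Bridges Contributions to Conductor" test, where a 9-decimal SPL amount loses its last digit.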


@@ -0,0 +1,13 @@
{
"pubkey": "FKoMTctsC7vJbEqyRiiPskPnuQx2tX1kurmvWByq5uZP",
"account": {
"lamports": 1057920,
"data": [
"AAAAAACYDQAAAAAAgFEBAGQAAAAAAAAA",
"base64"
],
"owner": "Bridge1p5gheXUvJ6jGWGeCsgPKgnE3YgdGKRVCMY9o",
"executable": false,
"rentEpoch": 0
}
}

Binary file not shown.


@@ -0,0 +1,13 @@
{
"pubkey": "GXBsgBD3LDn3vkRZF6TfY5RqgajVZ4W5bMAdiAaaUARs",
"account": {
"lamports": 890880,
"data": [
"",
"base64"
],
"owner": "11111111111111111111111111111111",
"executable": false,
"rentEpoch": 0
}
}


@@ -0,0 +1,13 @@
{
"pubkey": "6MxkvoEwgB9EqQRLNhvYaPGhfcLtBtpBqdQugr3AZUgD",
"account": {
"lamports": 1141440,
"data": [
"AAAAAAEAAAC++kKdV80Yt/ik2RotqatK8F0PvoX2jWIAAAAA",
"base64"
],
"owner": "Bridge1p5gheXUvJ6jGWGeCsgPKgnE3YgdGKRVCMY9o",
"executable": false,
"rentEpoch": 0
}
}


@@ -0,0 +1,281 @@
import { web3 } from "@project-serum/anchor";
import { CHAIN_ID_ETH, CHAIN_ID_SOLANA, tryNativeToHexString } from "@certusone/wormhole-sdk";
import { createMint, mintTo } from "@solana/spl-token";
import { BigNumber, BigNumberish } from "ethers";
import { getPdaAssociatedTokenAddress, toBigNumberHex } from "./utils";
import { signAndEncodeVaa } from "./wormhole";
import { BN } from "bn.js";
import { SolanaAcceptedToken } from "./types";
// sale struct info
export const MAX_ACCEPTED_TOKENS = 8;
const NUM_BYTES_ACCEPTED_TOKEN = 33;
const NUM_BYTES_ALLOCATION = 65;
export class DummyConductor {
chainId: number;
address: Buffer;
saleId: number;
wormholeSequence: number;
saleStart: number;
saleEnd: number;
initSaleVaa: Buffer;
saleTokenOnSolana: string;
acceptedTokens: SolanaAcceptedToken[];
allocations: Allocation[];
constructor(chainId: number, address: string) {
this.chainId = chainId;
this.address = Buffer.from(address, "hex");
this.saleId = 0;
this.wormholeSequence = 0;
this.saleStart = 0;
this.saleEnd = 0;
this.acceptedTokens = [];
this.allocations = [];
}
async attestSaleToken(connection: web3.Connection, payer: web3.Keypair): Promise<void> {
const mint = await createMint(connection, payer, payer.publicKey, payer.publicKey, this.nativeTokenDecimals);
this.saleTokenOnSolana = mint.toString();
return;
}
async redeemAllocationsOnSolana(
connection: web3.Connection,
payer: web3.Keypair,
custodian: web3.PublicKey
): Promise<void> {
const mint = new web3.PublicKey(this.saleTokenOnSolana);
const custodianTokenAcct = await getPdaAssociatedTokenAddress(mint, custodian);
const amount = this.allocations.map((item) => new BN(item.allocation)).reduce((prev, curr) => prev.add(curr));
await mintTo(
connection,
payer,
mint,
custodianTokenAcct,
payer,
BigInt(amount.toString()) // mint the sum of all allocations
);
return;
}
getSaleTokenOnSolana(): web3.PublicKey {
return new web3.PublicKey(this.saleTokenOnSolana);
}
async createAcceptedTokens(connection: web3.Connection, payer: web3.Keypair): Promise<SolanaAcceptedToken[]> {
const tokenIndices = [2, 3, 5, 8, 13, 21, 34, 55];
for (let i = 0; i < MAX_ACCEPTED_TOKENS; ++i) {
// just make everything the same number of decimals (9)
const mint = await createMint(connection, payer, payer.publicKey, payer.publicKey, 9);
this.acceptedTokens.push(makeSolanaAcceptedToken(tokenIndices[i], mint.toString()));
}
return this.acceptedTokens;
}
getSaleId(): Buffer {
return Buffer.from(toBigNumberHex(this.saleId, 32), "hex");
}
createSale(startTime: number, duration: number, associatedSaleTokenAddress: web3.PublicKey): Buffer {
// uptick saleId for every new sale
++this.saleId;
++this.wormholeSequence;
// set up sale time based on block time
this.saleStart = startTime;
this.saleEnd = this.saleStart + duration;
this.initSaleVaa = signAndEncodeVaa(
startTime,
this.nonce,
this.chainId,
this.address,
this.wormholeSequence,
encodeSaleInit(
this.saleId,
tryNativeToHexString(associatedSaleTokenAddress.toString(), CHAIN_ID_SOLANA),
this.tokenChain,
this.tokenDecimals,
this.saleStart,
this.saleEnd,
this.acceptedTokens,
this.recipient
)
);
return this.initSaleVaa;
}
getAllocationMultiplier(): string {
const decimalDifference = this.tokenDecimals - this.nativeTokenDecimals;
return BigNumber.from("10").pow(decimalDifference).toString();
}
sealSale(blockTime: number, contributions: Map<number, string[]>): Buffer {
++this.wormholeSequence;
this.allocations = [];
const allocationMultiplier = this.getAllocationMultiplier();
// make up allocations and excess contributions
const excessContributionDivisor = BigNumber.from("5");
const acceptedTokens = this.acceptedTokens;
for (let i = 0; i < acceptedTokens.length; ++i) {
const tokenIndex = acceptedTokens[i].index;
const contributionSubset = contributions.get(tokenIndex);
if (contributionSubset === undefined) {
this.allocations.push(makeAllocation(tokenIndex, "0", "0"));
} else {
const total = contributionSubset.map((x) => BigNumber.from(x)).reduce((prev, curr) => prev.add(curr));
const excessContribution = total.div(excessContributionDivisor).toString();
const allocation = BigNumber.from(this.expectedAllocations[i]).mul(allocationMultiplier).toString();
this.allocations.push(makeAllocation(tokenIndex, allocation, excessContribution));
}
}
return signAndEncodeVaa(
blockTime,
this.nonce,
this.chainId,
this.address,
this.wormholeSequence,
encodeSaleSealed(this.saleId, this.allocations)
);
}
abortSale(blockTime: number): Buffer {
++this.wormholeSequence;
return signAndEncodeVaa(
blockTime,
this.nonce,
this.chainId,
this.address,
this.wormholeSequence,
encodeSaleAborted(this.saleId)
);
}
// sale parameters that won't change for the test
//associatedTokenAddress = "00000000000000000000000083752ecafebf4707258dedffbd9c7443148169db";
tokenChain = CHAIN_ID_ETH as number;
tokenDecimals = 18;
nativeTokenDecimals = 7;
recipient = tryNativeToHexString("0x22d491bde2303f2f43325b2108d26f1eaba1e32b", CHAIN_ID_ETH);
// we won't use all of these, but these are purely to verify decimal shift
expectedAllocations = [
"1000000000",
"1000000000",
"2000000000",
"3000000000",
"5000000000",
"8000000000",
"13000000000",
"21000000000",
];
// wormhole nonce
nonce = 0;
}
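The `getAllocationMultiplier()` decimal shift can be illustrated in isolation: allocations in the SaleSealed VAA are expressed in the conductor-side token's decimals (18 in this fixture) and must be scaled down to the Solana-side wrapped decimals (7 here). A hypothetical standalone `allocationMultiplier` sketch using those fixture values:

```typescript
// Scale factor between conductor-side and Solana-side token units.
function allocationMultiplier(tokenDecimals: number, nativeTokenDecimals: number): bigint {
  return 10n ** BigInt(tokenDecimals - nativeTokenDecimals);
}

const multiplier = allocationMultiplier(18, 7); // 10^11
const conductorAllocation = 1000000000n * multiplier; // as encoded in the VAA
const solanaAllocation = conductorAllocation / multiplier; // back to 7-decimal units
```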
function makeSolanaAcceptedToken(index: number, pubkey: string): SolanaAcceptedToken {
return { index, address: tryNativeToHexString(pubkey, CHAIN_ID_SOLANA) };
}
function makeAllocation(index: number, allocation: string, excessContribution: string): Allocation {
return { index, allocation, excessContribution };
}
export function encodeAcceptedTokens(acceptedTokens: SolanaAcceptedToken[]): Buffer {
const n = acceptedTokens.length;
const encoded = Buffer.alloc(NUM_BYTES_ACCEPTED_TOKEN * n);
for (let i = 0; i < n; ++i) {
const token = acceptedTokens[i];
const start = i * NUM_BYTES_ACCEPTED_TOKEN;
encoded.writeUint8(token.index, start);
encoded.write(token.address, start + 1, "hex");
}
return encoded;
}
export function encodeSaleInit(
saleId: number,
associatedTokenAddress: string, // 32 bytes
tokenChain: number,
tokenDecimals: number,
saleStart: number,
saleEnd: number,
acceptedTokens: SolanaAcceptedToken[], // 33 * n_tokens
recipient: string // 32 bytes
): Buffer {
const numTokens = acceptedTokens.length;
const encoded = Buffer.alloc(165 + numTokens * NUM_BYTES_ACCEPTED_TOKEN);
encoded.writeUInt8(5, 0); // initSale payload for solana = 5
encoded.write(toBigNumberHex(saleId, 32), 1, "hex");
encoded.write(associatedTokenAddress, 33, "hex");
encoded.writeUint16BE(tokenChain, 65);
encoded.writeUint8(tokenDecimals, 67);
encoded.write(toBigNumberHex(saleStart, 32), 68, "hex");
encoded.write(toBigNumberHex(saleEnd, 32), 100, "hex");
encoded.writeUInt8(numTokens, 132);
encoded.write(encodeAcceptedTokens(acceptedTokens).toString("hex"), 133, "hex");
const recipientIndex = 133 + numTokens * NUM_BYTES_ACCEPTED_TOKEN;
encoded.write(recipient, recipientIndex, "hex");
return encoded;
}
export interface Allocation {
index: number;
allocation: string; // big number, uint256
excessContribution: string; // big number, uint256
}
export function encodeAllocations(allocations: Allocation[]): Buffer {
const n = allocations.length;
const encoded = Buffer.alloc(NUM_BYTES_ALLOCATION * n);
for (let i = 0; i < n; ++i) {
const item = allocations[i];
const start = i * NUM_BYTES_ALLOCATION;
encoded.writeUint8(item.index, start);
encoded.write(toBigNumberHex(item.allocation, 32), start + 1, "hex");
encoded.write(toBigNumberHex(item.excessContribution, 32), start + 33, "hex");
}
return encoded;
}
export function encodeSaleSealed(
saleId: number,
allocations: Allocation[] // 65 * n_allocations
): Buffer {
const headerLen = 33;
const numAllocations = allocations.length;
const encoded = Buffer.alloc(headerLen + 1 + numAllocations * NUM_BYTES_ALLOCATION);
encoded.writeUInt8(3, 0); // saleSealed payload = 3
encoded.write(toBigNumberHex(saleId, 32), 1, "hex");
encoded.writeUint8(numAllocations, headerLen);
encoded.write(encodeAllocations(allocations).toString("hex"), headerLen + 1, "hex");
return encoded;
}
export function encodeSaleAborted(saleId: number): Buffer {
const encoded = Buffer.alloc(33);
encoded.writeUInt8(4, 0); // saleAborted payload = 4
encoded.write(toBigNumberHex(saleId, 32), 1, "hex");
return encoded;
}
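A round-trip check of `encodeSaleAborted` clarifies the layout: 1 byte payload id (4) followed by the sale id as a 32-byte big-endian integer. The `decodeSaleAborted` helper below is hypothetical, for illustration only:

```typescript
// Inverse of encodeSaleAborted above: 1-byte payload id + 32-byte sale id.
function decodeSaleAborted(payload: Buffer): { payloadId: number; saleId: bigint } {
  if (payload.length !== 33) {
    throw new Error("SaleAborted payload must be 33 bytes");
  }
  const payloadId = payload.readUInt8(0);
  const saleId = BigInt("0x" + payload.subarray(1, 33).toString("hex"));
  return { payloadId, saleId };
}
```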


@@ -0,0 +1,22 @@
import { hexToUint8Array } from "@certusone/wormhole-sdk";
import { web3 } from "@project-serum/anchor";
// wormhole
//export const CORE_BRIDGE_ADDRESS = new web3.PublicKey("Bridge1p5gheXUvJ6jGWGeCsgPKgnE3YgdGKRVCMY9o");
export const CORE_BRIDGE_ADDRESS = new web3.PublicKey(
process.env.CORE_BRIDGE_ADDRESS
);
export const TOKEN_BRIDGE_ADDRESS = new web3.PublicKey(
process.env.TOKEN_BRIDGE_ADDRESS
);
// contributor
export const CONDUCTOR_CHAIN: number = parseInt(process.env.CONDUCTOR_CHAIN);
export const CONDUCTOR_ADDRESS: string = process.env.CONDUCTOR_ADDRESS;
export const GLOBAL_KYC_AUTHORITY: Uint8Array = hexToUint8Array(
process.env.GLOBAL_KYC_AUTHORITY
);
// kyc
export const KYC_PRIVATE: string =
"b0057716d5917badaf911b193b12b910811c1497b5bada8d7711f758981c3773";


@@ -0,0 +1,473 @@
import { getOriginalAssetSol, tryHexToNativeString } from "@certusone/wormhole-sdk";
import { BN, Program, web3 } from "@project-serum/anchor";
import { AnchorContributor } from "../../target/types/anchor_contributor";
import {
getAccount,
getAssociatedTokenAddress,
getMint,
getOrCreateAssociatedTokenAccount,
TOKEN_PROGRAM_ID,
} from "@solana/spl-token";
import { deriveAddress, getPdaAssociatedTokenAddress, makeReadOnlyAccountMeta, makeWritableAccountMeta } from "./utils";
import { PostVaaMethod } from "./types";
import { serializeUint16 } from "byteify";
import { hashVaaPayload } from "./wormhole";
export class IccoContributor {
program: Program<AnchorContributor>;
wormhole: web3.PublicKey;
tokenBridge: web3.PublicKey;
postVaaWithRetry: PostVaaMethod;
whMessageKey: web3.Keypair;
custodian: web3.PublicKey;
constructor(
program: Program<AnchorContributor>,
wormhole: web3.PublicKey,
tokenBridge: web3.PublicKey,
postVaaWithRetry: PostVaaMethod
) {
this.program = program;
this.wormhole = wormhole;
this.tokenBridge = tokenBridge;
this.postVaaWithRetry = postVaaWithRetry;
this.custodian = this.deriveCustodianAccount();
}
async createCustodian(payer: web3.Keypair) {
const program = this.program;
return program.methods
.createCustodian()
.accounts({
owner: payer.publicKey,
custodian: this.custodian,
systemProgram: web3.SystemProgram.programId,
})
.rpc();
}
async initSale(payer: web3.Keypair, initSaleVaa: Buffer, saleTokenMint: web3.PublicKey): Promise<string> {
const program = this.program;
const custodian = this.custodian;
// first post signed vaa to wormhole
await this.postVaa(payer, initSaleVaa);
const coreBridgeVaa = this.deriveSignedVaaAccount(initSaleVaa);
const saleId = await parseSaleId(initSaleVaa);
const sale = this.deriveSaleAccount(saleId);
const custodianSaleTokenAcct = await getPdaAssociatedTokenAddress(saleTokenMint, custodian);
return program.methods
.initSale()
.accounts({
custodian,
sale,
coreBridgeVaa,
saleTokenMint,
custodianSaleTokenAcct,
payer: payer.publicKey,
systemProgram: web3.SystemProgram.programId,
})
.rpc();
}
async contribute(
payer: web3.Keypair,
saleId: Buffer,
tokenIndex: number,
amount: BN,
kycSignature: Buffer
): Promise<string> {
// first find mint
const state = await this.getSale(saleId);
const totals: any = state.totals;
const found = totals.find((item) => item.tokenIndex == tokenIndex);
if (found == undefined) {
throw new Error("tokenIndex not found");
}
const mint = found.mint;
// now prepare instruction
const program = this.program;
const custodian = this.custodian;
const buyer = this.deriveBuyerAccount(saleId, payer.publicKey);
const sale = this.deriveSaleAccount(saleId);
const buyerTokenAcct = await getAssociatedTokenAddress(mint, payer.publicKey);
const custodianTokenAcct = await getPdaAssociatedTokenAddress(mint, custodian);
return program.methods
.contribute(amount, kycSignature)
.accounts({
custodian,
sale,
buyer,
owner: payer.publicKey,
systemProgram: web3.SystemProgram.programId,
buyerTokenAcct,
custodianTokenAcct,
})
.signers([payer])
.rpc();
}
async attestContributions(payer: web3.Keypair, saleId: Buffer) {
const program = this.program;
const wormhole = this.wormhole;
// Accounts
const sale = this.deriveSaleAccount(saleId);
const wormholeConfig = deriveAddress([Buffer.from("Bridge")], wormhole);
const wormholeFeeCollector = deriveAddress([Buffer.from("fee_collector")], wormhole);
// contributor is the emitter
const wormholeEmitter = deriveAddress([Buffer.from("emitter")], program.programId);
const wormholeSequence = deriveAddress([Buffer.from("Sequence"), wormholeEmitter.toBytes()], wormhole);
const wormholeMessage = this.deriveAttestContributionsMessageAccount(saleId);
return program.methods
.attestContributions()
.accounts({
sale,
payer: payer.publicKey,
systemProgram: web3.SystemProgram.programId,
wormhole,
wormholeConfig,
wormholeFeeCollector,
wormholeEmitter,
wormholeSequence,
wormholeMessage,
clock: web3.SYSVAR_CLOCK_PUBKEY,
rent: web3.SYSVAR_RENT_PUBKEY,
})
.signers([payer])
.rpc();
}
async sealSale(payer: web3.Keypair, saleSealedVaa: Buffer): Promise<string> {
const saleId = await parseSaleId(saleSealedVaa);
const saleState = await this.getSale(saleId);
const saleTokenMint = saleState.saleTokenMint;
const program = this.program;
const custodian = this.custodian;
// first post signed vaa to wormhole
await this.postVaa(payer, saleSealedVaa);
const coreBridgeVaa = this.deriveSignedVaaAccount(saleSealedVaa);
const sale = this.deriveSaleAccount(saleId);
const custodianSaleTokenAcct = await getPdaAssociatedTokenAddress(saleTokenMint, custodian);
const totals: any = saleState.totals;
const mints = totals.map((total) => total.mint);
const remainingAccounts: web3.AccountMeta[] = [];
// push custodian token accounts
const custodianTokenAccounts = await Promise.all(
mints.map(async (mint) => getPdaAssociatedTokenAddress(mint, custodian))
);
remainingAccounts.push(
...custodianTokenAccounts.map((acct) => {
return makeReadOnlyAccountMeta(acct);
})
);
return program.methods
.sealSale()
.accounts({
custodian,
sale,
coreBridgeVaa,
custodianSaleTokenAcct,
systemProgram: web3.SystemProgram.programId,
})
.remainingAccounts(remainingAccounts)
.rpc();
}
async bridgeSealedContribution(payer: web3.Keypair, saleId: Buffer, acceptedMint: web3.PublicKey) {
const program = this.program;
const wormhole = this.wormhole;
const tokenBridge = this.tokenBridge;
const custodian = this.custodian;
const custodianTokenAcct = await getPdaAssociatedTokenAddress(acceptedMint, custodian);
const sale = this.deriveSaleAccount(saleId);
// need to check whether token bridge minted spl
const tokenMintSigner = deriveAddress([Buffer.from("mint_signer")], tokenBridge);
const custodyOrWrappedMeta = await (async () => {
const mintInfo = await getMint(program.provider.connection, acceptedMint);
if (mintInfo.mintAuthority != null && mintInfo.mintAuthority.equals(tokenMintSigner)) {
//First derive the Wrapped Mint Key
const nativeInfo = await getOriginalAssetSol(
program.provider.connection,
tokenBridge.toString(),
acceptedMint.toString()
);
const wrappedMintKey = deriveAddress(
[Buffer.from("wrapped"), serializeUint16(nativeInfo.chainId as number), acceptedMint.toBytes()],
tokenBridge
);
//Then derive the Wrapped Meta Key
return deriveAddress([Buffer.from("meta"), wrappedMintKey.toBytes()], tokenBridge);
} else {
return deriveAddress([acceptedMint.toBytes()], tokenBridge);
}
})();
// wormhole
const wormholeConfig = deriveAddress([Buffer.from("Bridge")], wormhole);
const wormholeFeeCollector = deriveAddress([Buffer.from("fee_collector")], wormhole);
// token bridge emits vaa
const wormholeEmitter = deriveAddress([Buffer.from("emitter")], tokenBridge);
const wormholeSequence = deriveAddress([Buffer.from("Sequence"), wormholeEmitter.toBytes()], wormhole);
// token bridge
const authoritySigner = deriveAddress([Buffer.from("authority_signer")], tokenBridge);
const tokenBridgeConfig = deriveAddress([Buffer.from("config")], tokenBridge);
const custodySigner = deriveAddress([Buffer.from("custody_signer")], tokenBridge);
const wormholeMessage = this.deriveSealedTransferMessageAccount(saleId, acceptedMint);
const requestUnitsIx = web3.ComputeBudgetProgram.requestUnits({
units: 420690,
additionalFee: 0,
});
return program.methods
.bridgeSealedContribution()
.accounts({
custodian,
sale,
custodianTokenAcct,
acceptedMint,
payer: payer.publicKey,
systemProgram: web3.SystemProgram.programId,
tokenProgram: TOKEN_PROGRAM_ID,
tokenBridge,
custodyOrWrappedMeta,
custodySigner,
tokenMintSigner,
authoritySigner,
tokenBridgeConfig,
wormhole,
wormholeConfig,
wormholeFeeCollector,
wormholeEmitter,
wormholeSequence,
wormholeMessage,
clock: web3.SYSVAR_CLOCK_PUBKEY,
rent: web3.SYSVAR_RENT_PUBKEY,
})
.preInstructions([requestUnitsIx])
.signers([payer])
.rpc();
}
async abortSale(payer: web3.Keypair, saleAbortedVaa: Buffer): Promise<string> {
const program = this.program;
const custodian = this.custodian;
// first post signed vaa to wormhole
await this.postVaa(payer, saleAbortedVaa);
const coreBridgeVaa = this.deriveSignedVaaAccount(saleAbortedVaa);
const saleId = await parseSaleId(saleAbortedVaa);
const sale = this.deriveSaleAccount(saleId);
return program.methods
.abortSale()
.accounts({
custodian,
sale,
coreBridgeVaa,
systemProgram: web3.SystemProgram.programId,
})
.rpc();
}
async claimRefunds(payer: web3.Keypair, saleId: Buffer): Promise<string> {
const saleState = await this.getSale(saleId);
const totals: any = saleState.totals;
const mints = totals.map((total) => total.mint);
const program = this.program;
const custodian = this.custodian;
const buyer = this.deriveBuyerAccount(saleId, payer.publicKey);
const sale = this.deriveSaleAccount(saleId);
const remainingAccounts: web3.AccountMeta[] = [];
// push custodian token accounts
const custodianTokenAccounts = await Promise.all(
mints.map(async (mint) => getPdaAssociatedTokenAddress(mint, custodian))
);
remainingAccounts.push(
...custodianTokenAccounts.map((acct) => {
return makeWritableAccountMeta(acct);
})
);
// next buyers
const buyerTokenAccounts = await Promise.all(
mints.map(async (mint) => getAssociatedTokenAddress(mint, payer.publicKey))
);
remainingAccounts.push(
...buyerTokenAccounts.map((acct) => {
return makeWritableAccountMeta(acct);
})
);
return program.methods
.claimRefunds()
.accounts({
custodian,
sale,
buyer,
owner: payer.publicKey,
systemProgram: web3.SystemProgram.programId,
})
.signers([payer])
.remainingAccounts(remainingAccounts)
.rpc();
}
async claimAllocation(payer: web3.Keypair, saleId: Buffer): Promise<string> {
const saleState = await this.getSale(saleId);
const saleTokenMint = saleState.saleTokenMint;
const totals: any = saleState.totals;
const mints = totals.map((total) => total.mint);
const program = this.program;
const custodian = this.custodian;
const buyer = this.deriveBuyerAccount(saleId, payer.publicKey);
const sale = this.deriveSaleAccount(saleId);
const buyerTokenAccount = await getOrCreateAssociatedTokenAccount(
program.provider.connection,
payer,
saleTokenMint,
payer.publicKey
);
const buyerSaleTokenAcct = buyerTokenAccount.address;
const custodianSaleTokenAcct = await getPdaAssociatedTokenAddress(saleTokenMint, custodian);
const remainingAccounts: web3.AccountMeta[] = [];
// push custodian token accounts
const custodianTokenAccounts = await Promise.all(
mints.map(async (mint) => getPdaAssociatedTokenAddress(mint, custodian))
);
remainingAccounts.push(
...custodianTokenAccounts.map((acct) => {
return makeWritableAccountMeta(acct);
})
);
// next buyers
const buyerTokenAccounts = await Promise.all(
mints.map(async (mint) => getAssociatedTokenAddress(mint, payer.publicKey))
);
remainingAccounts.push(
...buyerTokenAccounts.map((acct) => {
return makeWritableAccountMeta(acct);
})
);
return program.methods
.claimAllocation()
.accounts({
custodian,
sale,
buyer,
buyerSaleTokenAcct,
custodianSaleTokenAcct,
owner: payer.publicKey,
systemProgram: web3.SystemProgram.programId,
})
.signers([payer])
.remainingAccounts(remainingAccounts)
.rpc();
}
async getCustodian() {
return this.program.account.custodian.fetch(this.custodian);
}
async getSale(saleId: Buffer) {
return this.program.account.sale.fetch(this.deriveSaleAccount(saleId));
}
async getBuyer(saleId: Buffer, buyer: web3.PublicKey) {
return this.program.account.buyer.fetch(this.deriveBuyerAccount(saleId, buyer));
}
async postVaa(payer: web3.Keypair, signedVaa: Buffer): Promise<void> {
//return postVaa(this.program.provider.connection, payer, this.wormhole, signedVaa);
await this.postVaaWithRetry(
this.program.provider.connection,
async (tx) => {
tx.partialSign(payer);
return tx;
},
this.wormhole.toString(),
payer.publicKey.toString(),
signedVaa,
10
);
}
deriveSealedTransferMessageAccount(saleId: Buffer, mint: web3.PublicKey): web3.PublicKey {
return deriveAddress([Buffer.from("bridge-sealed"), saleId, mint.toBytes()], this.program.programId);
}
deriveAttestContributionsMessageAccount(saleId: Buffer): web3.PublicKey {
return deriveAddress([Buffer.from("attest-contributions"), saleId], this.program.programId);
}
deriveSaleAccount(saleId: Buffer): web3.PublicKey {
return deriveAddress([Buffer.from("icco-sale"), saleId], this.program.programId);
}
deriveBuyerAccount(saleId: Buffer, buyer: web3.PublicKey): web3.PublicKey {
return deriveAddress([Buffer.from("icco-buyer"), saleId, buyer.toBuffer()], this.program.programId);
}
deriveSignedVaaAccount(signedVaa: Buffer): web3.PublicKey {
const hash = hashVaaPayload(signedVaa);
return deriveAddress([Buffer.from("PostedVAA"), hash], this.wormhole);
}
deriveCustodianAccount(): web3.PublicKey {
return deriveAddress([Buffer.from("icco-custodian")], this.program.programId);
}
}
async function parseSaleId(iccoVaa: Buffer): Promise<Buffer> {
//const { parse_vaa } = await importCoreWasm();
// VAA layout: 6-byte header (version, guardian set index, num signatures),
// 66 bytes per signature, 51-byte body header, then the payload
const numSigners = iccoVaa[5];
const payloadStart = 57 + 66 * numSigners; // 57 = 6-byte header + 51-byte body header
// skip the 1-byte payloadID; the saleId is the next 32 bytes
return iccoVaa.subarray(payloadStart + 1, payloadStart + 33);
}
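The magic numbers in `parseSaleId` fall out of the VAA wire format. A minimal sketch that recomputes the offsets (the helper names `payloadStart` and `sliceSaleId` are illustrative, not part of the SDK):

```typescript
// Header: version (1) + guardianSetIndex (4) + numSigners (1) = 6 bytes.
// Each signature: guardianIndex (1) + r (32) + s (32) + recoveryId (1) = 66 bytes.
// Body header: timestamp (4) + nonce (4) + emitterChainId (2) +
//              emitterAddress (32) + sequence (8) + consistencyLevel (1) = 51 bytes.
function payloadStart(numSigners: number): number {
  const headerLength = 6;
  const signatureLength = 66;
  const bodyHeaderLength = 51;
  return headerLength + signatureLength * numSigners + bodyHeaderLength;
}

// The ICCO payloads lead with a 1-byte payloadID, then the 32-byte saleId,
// which is why parseSaleId slices [payloadStart + 1, payloadStart + 33).
function sliceSaleId(vaa: Buffer): Buffer {
  const start = payloadStart(vaa[5]) + 1;
  return vaa.subarray(start, start + 32);
}
```

For the single-guardian devnet case, `payloadStart(1)` is 123, which matches `57 + 66 * 1`.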


@ -0,0 +1,92 @@
import { CHAIN_ID_SOLANA, tryNativeToHexString } from "@certusone/wormhole-sdk";
import { web3, BN } from "@project-serum/anchor";
import { soliditySha3 } from "web3-utils";
import { IccoContributor } from "./contributor";
import { toBigNumberHex } from "./utils";
const elliptic = require("elliptic");
export class KycAuthority {
privateKey: Buffer;
conductorAddress: string;
contributor: IccoContributor;
constructor(
privateKey: string,
conductorAddress: string,
contributor: IccoContributor
) {
this.privateKey = Buffer.from(privateKey, "hex");
this.conductorAddress = conductorAddress;
this.contributor = contributor;
}
async getSale(saleId: Buffer) {
return this.contributor.getSale(saleId);
}
async getBuyer(saleId: Buffer, buyer: web3.PublicKey) {
return this.contributor.getBuyer(saleId, buyer);
}
async fetchBuyerTotalContribution(
saleId: Buffer,
tokenIndex: number,
buyer: web3.PublicKey
): Promise<BN> {
try {
const sale = await this.getSale(saleId);
const totals: any = sale.totals;
const idx = totals.findIndex((item) => item.tokenIndex === tokenIndex);
if (idx < 0) {
throw Error("tokenIndex not found");
}
const state = await this.getBuyer(saleId, buyer);
return state.contributions[idx].amount;
} catch (e) {
if (e.toString().includes("Account does not exist")) {
return new BN("0");
}
throw e;
}
}
async signContribution(
saleId: Buffer,
tokenIndex: number,
amount: BN,
buyer: web3.PublicKey
) {
const totalContribution = await this.fetchBuyerTotalContribution(
saleId,
tokenIndex,
buyer
);
const body = Buffer.alloc(6 * 32, 0);
body.write(this.conductorAddress, 0, "hex");
body.write(saleId.toString("hex"), 32, "hex");
body.write(toBigNumberHex(tokenIndex, 32), 2 * 32, "hex");
body.write(toBigNumberHex(amount.toString(), 32), 3 * 32, "hex");
body.write(
tryNativeToHexString(buyer.toString(), CHAIN_ID_SOLANA),
4 * 32,
"hex"
);
body.write(toBigNumberHex(totalContribution.toString(), 32), 5 * 32, "hex");
const hash = soliditySha3("0x" + body.toString("hex"))!;
const ec = new elliptic.ec("secp256k1");
const key = ec.keyFromPrivate(this.privateKey);
const signature = key.sign(hash.substring(2), { canonical: true });
const packed = Buffer.alloc(65);
packed.write(signature.r.toString(16).padStart(64, "0"), 0, "hex");
packed.write(signature.s.toString(16).padStart(64, "0"), 32, "hex");
packed.writeUInt8(signature.recoveryParam, 64);
return packed;
}
}
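The body that the KYC authority signs is six 32-byte fields, packed in the order shown in `signContribution` above. A sketch of just the packing step (the `packKycBody` helper is hypothetical, using native `bigint` in place of `BN`/`toBigNumberHex`):

```typescript
// Sketch of the 6 * 32-byte message body the KYC authority signs.
// Every field is left-padded to 32 bytes; all inputs are hex strings or integers.
function packKycBody(fields: {
  conductorAddressHex: string; // 32-byte conductor address, hex
  saleIdHex: string;           // 32-byte sale id, hex
  tokenIndex: number;
  amount: bigint;
  buyerHex: string;            // 32-byte buyer address, hex
  totalContribution: bigint;
}): Buffer {
  const body = Buffer.alloc(6 * 32, 0);
  body.write(fields.conductorAddressHex, 0 * 32, "hex");
  body.write(fields.saleIdHex, 1 * 32, "hex");
  body.write(fields.tokenIndex.toString(16).padStart(64, "0"), 2 * 32, "hex");
  body.write(fields.amount.toString(16).padStart(64, "0"), 3 * 32, "hex");
  body.write(fields.buyerHex, 4 * 32, "hex");
  body.write(fields.totalContribution.toString(16).padStart(64, "0"), 5 * 32, "hex");
  return body;
}
```

The keccak hash of this body is what gets signed with secp256k1, so the layout must match what the EVM `Contributor` reconstructs on its side byte for byte.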


@ -0,0 +1,15 @@
import { web3 } from "@project-serum/anchor";
export type PostVaaMethod = (
connection: web3.Connection,
signTransaction: (transaction: web3.Transaction) => Promise<web3.Transaction>,
bridge_id: string,
payer: string,
vaa: Buffer,
maxRetries: number
) => Promise<void>;
export interface SolanaAcceptedToken {
index: number; // uint8
address: string; // 32 bytes
}
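A `SolanaAcceptedToken` pairs the token's index in the conductor's accepted-token list with the mint's 32-byte hex-encoded address (in practice produced by `tryNativeToHexString(mint, CHAIN_ID_SOLANA)`). A small validating constructor, as a sketch (`makeSolanaAcceptedToken` is hypothetical, not part of the SDK):

```typescript
interface SolanaAcceptedToken {
  index: number;   // uint8 position in the conductor's accepted-token list
  address: string; // 32 bytes, hex-encoded
}

function makeSolanaAcceptedToken(index: number, addressHex: string): SolanaAcceptedToken {
  // index is serialized as a uint8, and the address must be exactly 32 bytes
  if (index < 0 || index > 255) throw new Error("index must fit in a uint8");
  if (addressHex.length !== 64) throw new Error("address must be 32 bytes of hex");
  return { index, address: addressHex };
}
```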


@ -0,0 +1,61 @@
import { web3, BN } from "@project-serum/anchor";
import { findProgramAddressSync } from "@project-serum/anchor/dist/cjs/utils/pubkey";
import { getAssociatedTokenAddress, getAccount } from "@solana/spl-token";
import { tryHexToNativeString, CHAIN_ID_SOLANA } from "@certusone/wormhole-sdk";
import { BigNumber, BigNumberish } from "ethers";
export function toBigNumberHex(value: BigNumberish, numBytes: number): string {
return BigNumber.from(value)
.toHexString()
.substring(2)
.padStart(numBytes * 2, "0");
}
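`toBigNumberHex` leans on ethers' `BigNumber` for parsing; the same padding can be expressed with native `BigInt`. A sketch assuming non-negative inputs (`toBigIntHex` is an illustrative name, not an SDK export):

```typescript
// Equivalent of toBigNumberHex using native BigInt: render the value as hex
// and left-pad with zeros to the requested byte width.
function toBigIntHex(value: bigint | number | string, numBytes: number): string {
  return BigInt(value).toString(16).padStart(numBytes * 2, "0");
}
```

For example, a value of 255 padded to 32 bytes yields 62 zeros followed by `ff`.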
export async function wait(timeInSeconds: number): Promise<void> {
await new Promise((r) => setTimeout(r, timeInSeconds * 1000));
}
export async function getBlockTime(connection: web3.Connection): Promise<number> {
const slot = await connection.getSlot();
const blockTime = await connection.getBlockTime(slot);
// getBlockTime returns null when the block time is unavailable for the slot
if (blockTime === null) {
throw new Error("block time unavailable for slot " + slot);
}
return blockTime;
}
export async function getSplBalance(connection: web3.Connection, mint: web3.PublicKey, owner: web3.PublicKey) {
const tokenAccount = await getAssociatedTokenAddress(mint, owner);
const account = await getAccount(connection, tokenAccount);
return new BN(account.amount.toString());
}
export async function getPdaSplBalance(connection: web3.Connection, mint: web3.PublicKey, owner: web3.PublicKey) {
const tokenAccount = await getPdaAssociatedTokenAddress(mint, owner);
const account = await getAccount(connection, tokenAccount);
return new BN(account.amount.toString());
}
export function hexToPublicKey(hexlified: string): web3.PublicKey {
return new web3.PublicKey(tryHexToNativeString(hexlified, CHAIN_ID_SOLANA));
}
export async function getPdaAssociatedTokenAddress(mint: web3.PublicKey, pda: web3.PublicKey): Promise<web3.PublicKey> {
return getAssociatedTokenAddress(mint, pda, true);
}
export function makeWritableAccountMeta(pubkey: web3.PublicKey): web3.AccountMeta {
return {
pubkey,
isWritable: true,
isSigner: false,
};
}
export function makeReadOnlyAccountMeta(pubkey: web3.PublicKey): web3.AccountMeta {
return {
pubkey,
isWritable: false,
isSigner: false,
};
}
export function deriveAddress(seeds: (Buffer | Uint8Array)[], program: web3.PublicKey): web3.PublicKey {
return findProgramAddressSync(seeds, program)[0];
}


@ -0,0 +1,93 @@
import { web3 } from "@project-serum/anchor";
import keccak256 from "keccak256";
import { soliditySha3 } from "web3-utils";
import { postVaaSolanaWithRetry } from "@certusone/wormhole-sdk";
const elliptic = require("elliptic");
/*
export async function postVaa(
connection: web3.Connection,
payer: web3.Keypair,
wormhole: web3.PublicKey,
signedVaa: Buffer
): Promise<void> {
await postVaaSolanaWithRetry(
connection,
async (tx) => {
tx.partialSign(payer);
return tx;
},
wormhole.toString(),
payer.publicKey.toString(),
signedVaa,
10
);
}*/
export function signAndEncodeVaa(
timestamp: number,
nonce: number,
emitterChainId: number,
emitterAddress: Buffer,
sequence: number,
data: Buffer
): Buffer {
if (emitterAddress.length != 32) {
throw Error("emitterAddress != 32 bytes");
}
// wormhole initialized with only one guardian in devnet
const signers = ["cfb12303a19cde580bb4dd771639b0d26bc68353645571a8cff516ab2ee113a0"];
const sigStart = 6;
const numSigners = signers.length;
const sigLength = 66;
const bodyStart = sigStart + sigLength * numSigners;
const bodyHeaderLength = 51;
const vm = Buffer.alloc(bodyStart + bodyHeaderLength + data.length);
// header
const guardianSetIndex = 0;
vm.writeUInt8(1, 0);
vm.writeUInt32BE(guardianSetIndex, 1);
vm.writeUInt8(numSigners, 5);
// encode body with arbitrary consistency level
const consistencyLevel = 1;
vm.writeUInt32BE(timestamp, bodyStart);
vm.writeUInt32BE(nonce, bodyStart + 4);
vm.writeUInt16BE(emitterChainId, bodyStart + 8);
vm.write(emitterAddress.toString("hex"), bodyStart + 10, "hex");
vm.writeBigUInt64BE(BigInt(sequence), bodyStart + 42);
vm.writeUInt8(consistencyLevel, bodyStart + 50);
vm.write(data.toString("hex"), bodyStart + bodyHeaderLength, "hex");
// signatures
const body = vm.subarray(bodyStart).toString("hex");
const hash = soliditySha3(soliditySha3("0x" + body)!)!.substring(2);
for (let i = 0; i < numSigners; ++i) {
const ec = new elliptic.ec("secp256k1");
const key = ec.keyFromPrivate(signers[i]);
const signature = key.sign(hash, { canonical: true });
const start = sigStart + i * sigLength;
vm.writeUInt8(i, start);
vm.write(signature.r.toString(16).padStart(64, "0"), start + 1, "hex");
vm.write(signature.s.toString(16).padStart(64, "0"), start + 33, "hex");
vm.writeUInt8(signature.recoveryParam, start + 65);
}
return vm;
}
export function hashVaaPayload(signedVaa: Buffer): Buffer {
const sigStart = 6;
const numSigners = signedVaa[5];
const sigLength = 66;
const bodyStart = sigStart + sigLength * numSigners;
return keccak256(signedVaa.subarray(bodyStart));
}


@ -0,0 +1 @@
[156,58,190,217,217,182,164,165,16,94,2,148,52,60,124,179,124,246,51,210,65,41,197,4,69,101,18,51,144,149,178,85,170,87,2,76,226,26,32,176,106,152,191,126,156,67,26,119,102,204,192,42,216,85,108,243,227,56,255,208,45,59,124,205]

View File

@ -0,0 +1 @@
[128,234,237,75,126,9,5,95,22,89,161,204,162,155,126,118,233,0,87,171,69,102,211,233,50,152,163,63,145,5,132,63,158,238,105,62,204,112,139,148,240,143,108,17,236,215,6,163,114,46,101,158,161,227,52,43,109,225,184,233,161,235,186,224]

Binary file not shown.


@ -0,0 +1,13 @@
{
"pubkey": "3GwVs8GSLdo4RUsoXTkGQhojauQ1sXcDNjm7LSDicw19",
"account": {
"lamports": 1113600,
"data": [
"AsgGMSy+W3nviqbBfj9CPY/f4dRpCfsfbN9l7o4ub6o=",
"base64"
],
"owner": "B6RHG3mfcckmrYN1UhmJzyS1XX3fZKbkeUcpJe9Sy3FE",
"executable": false,
"rentEpoch": 0
}
}


@ -0,0 +1,11 @@
{
"compilerOptions": {
"types": ["mocha", "chai"],
"typeRoots": ["./node_modules/@types"],
"lib": ["es2020"],
"module": "commonjs",
"target": "es2020",
"esModuleInterop": true,
"moduleResolution": "node"
}
}

6
anchor-contributor/unset.env Executable file

@ -0,0 +1,6 @@
unset CONDUCTOR_CHAIN
unset CONDUCTOR_ADDRESS
unset GLOBAL_KYC_AUTHORITY
unset CORE_BRIDGE_ADDRESS
unset TOKEN_BRIDGE_ADDRESS
unset BROWSER

2698
anchor-contributor/yarn.lock Normal file

File diff suppressed because it is too large


@ -1,113 +0,0 @@
---
apiVersion: v1
kind: Service
metadata:
name: solana-devnet
labels:
app: solana-devnet
spec:
ports:
- port: 8899
name: rpc
protocol: TCP
- port: 9900
name: faucet
protocol: TCP
clusterIP: None
selector:
app: solana-devnet
---
apiVersion: apps/v1
kind: StatefulSet
metadata:
name: solana-devnet
spec:
selector:
matchLabels:
app: solana-devnet
serviceName: solana-devnet
replicas: 1
template:
metadata:
labels:
app: solana-devnet
spec:
terminationGracePeriodSeconds: 1
containers:
- name: devnet
image: solana-contract
command:
- /root/.local/share/solana/install/active_release/bin/solana-test-validator
- --bpf-program
- Bridge1p5gheXUvJ6jGWGeCsgPKgnE3YgdGKRVCMY9o
- /opt/solana/deps/bridge.so
- --bpf-program
- B6RHG3mfcckmrYN1UhmJzyS1XX3fZKbkeUcpJe9Sy3FE
- /opt/solana/deps/token_bridge.so
- --bpf-program
- NFTWqJR8YnRVqPDvTJrYuLrQDitTG5AScqbeghi4zSA
- /opt/solana/deps/nft_bridge.so
- --bpf-program
- CP1co2QMMoDPbsmV7PGcUTLFwyhgCgTXt25gLQ5LewE1
- /opt/solana/deps/cpi_poster.so
- --bpf-program
- metaqbxxUerdq28cj1RbAWkYQm3ybzjb6a8bt518x1s
- /opt/solana/deps/spl_token_metadata.so
- --bpf-program
- gMYYig2utAxVoXnM9UhtTWrt8e7x2SVBZqsWZJeT5Gw # Derived from pyth_program.json
- /opt/solana/deps/pyth_oracle.so
- --bpf-program
- P2WH424242424242424242424242424242424242424
- /opt/solana/deps/pyth2wormhole.so
- --bpf-program
- Ex9bCdVMSfx7EzB3pgSi2R4UHwJAXvTw18rBQm5YQ8gK
- /opt/solana/deps/wormhole_migration.so
- --bpf-program
- 22mamxmojFWBdbGqaxTH46HBAgAY2bJRiGJJHfNRNQ95
- /opt/solana/deps/icco_contributor.so
- --log
ports:
- containerPort: 8001
name: gossip
protocol: UDP
- containerPort: 8003
name: tpu
protocol: UDP
- containerPort: 8004
name: tpufwd
protocol: UDP
- containerPort: 8000
name: tvu
protocol: UDP
- containerPort: 8002
name: tvufwd
protocol: UDP
- containerPort: 8006
name: repair
protocol: UDP
- containerPort: 8007
name: serverepair
protocol: UDP
- containerPort: 8899
name: rpc
protocol: TCP
- containerPort: 8900
name: pubsub
protocol: TCP
- containerPort: 9900
name: faucet
protocol: TCP
readinessProbe:
httpGet:
port: rpc
path: /health
periodSeconds: 1
- name: setup
image: bridge-client
command:
- /usr/src/solana/devnet_setup.sh
readinessProbe:
tcpSocket:
port: 2000
periodSeconds: 1
failureThreshold: 300


@ -1,204 +0,0 @@
apiVersion: v1
kind: Service
metadata:
labels:
app: terra-terrad
name: terra-terrad
spec:
ports:
- name: rpc
port: 26657
protocol: TCP
- name: rest
port: 1317
protocol: TCP
selector:
app: terra-terrad
---
apiVersion: v1
kind: Service
metadata:
labels:
app: terra-postgres
name: terra-postgres
spec:
ports:
- name: postgres
port: 5432
protocol: TCP
selector:
app: terra-postgres
---
apiVersion: v1
kind: Service
metadata:
labels:
app: terra-fcd
name: terra-fcd
spec:
ports:
- name: fcd
port: 3060
protocol: TCP
selector:
app: terra-fcd
---
apiVersion: apps/v1
kind: StatefulSet
metadata:
labels:
app: terra-terrad
name: terra-terrad
spec:
replicas: 1
selector:
matchLabels:
app: terra-terrad
template:
metadata:
labels:
app: terra-terrad
spec:
containers:
- args:
- terrad
- start
image: terra-image
name: terra-terrad
ports:
- containerPort: 26657
- containerPort: 1317
readinessProbe:
httpGet:
port: 26657
resources: {}
- name: terra-contracts
image: terra-contracts
command:
- /bin/sh
- -c
- "sh /app/tools/deploy.sh && touch /app/tools/success && sleep infinity"
readinessProbe:
exec:
command:
- test
- -e
- "/app/tools/success"
initialDelaySeconds: 5
periodSeconds: 5
restartPolicy: Always
serviceName: terra-terrad
---
apiVersion: apps/v1
kind: StatefulSet
metadata:
labels:
app: terra-postgres
name: terra-postgres
spec:
replicas: 1
selector:
matchLabels:
app: terra-postgres
template:
metadata:
labels:
app: terra-postgres
spec:
containers:
- image: postgres:12
name: fcd-postgres
ports:
- containerPort: 5432
resources: {}
env:
- name: POSTGRES_USER
value: dev
- name: POSTGRES_PASSWORD
value: dev
- name: POSTGRES_DB
value: fcd
restartPolicy: Always
serviceName: terra-fcd
---
apiVersion: apps/v1
kind: StatefulSet
metadata:
labels:
app: terra-fcd
name: terra-fcd
spec:
replicas: 1
selector:
matchLabels:
app: terra-fcd
template:
metadata:
labels:
app: terra-fcd
spec:
containers:
- image: terramoney/fcd:bombay
name: fcd-collector
args:
- collector
resources: {}
env:
- name: CHAIN_ID
value: localterra
- name: LCD_URI
value: http://terra-terrad:1317
- name: BYPASS_URI
value: http://terra-terrad:1317
- name: RPC_URI
value: http://terra-terrad:26657
- name: TYPEORM_CONNECTION
value: postgres
- name: TYPEORM_HOST
value: terra-postgres
- name: TYPEORM_USERNAME
value: dev
- name: TYPEORM_PASSWORD
value: dev
- name: TYPEORM_DATABASE
value: fcd
- name: TYPEORM_SYNCHRONIZE
value: "true"
- name: TYPEORM_LOGGING
value: "false"
- name: TYPEORM_ENTITIES
value: "src/orm/*Entity.ts"
- image: terramoney/fcd:bombay
name: fcd-api
args:
- start
resources: {}
ports:
- containerPort: 3060
env:
- name: CHAIN_ID
value: localterra
- name: LCD_URI
value: http://terra-terrad:1317
- name: BYPASS_URI
value: http://terra-terrad:1317
- name: RPC_URI
value: http://terra-terrad:26657
- name: TYPEORM_CONNECTION
value: postgres
- name: TYPEORM_HOST
value: terra-postgres
- name: TYPEORM_USERNAME
value: dev
- name: TYPEORM_PASSWORD
value: dev
- name: TYPEORM_DATABASE
value: fcd
- name: TYPEORM_SYNCHRONIZE
value: "true"
- name: TYPEORM_LOGGING
value: "false"
- name: TYPEORM_ENTITIES
value: "src/orm/*Entity.ts"
restartPolicy: Always
serviceName: terra-fcd


@ -1,40 +1,41 @@
### Building

Build the contracts by running `make build`.

### Testing

Run the tests by running `make test`. The tests can be found [here](tests/icco.js).

### Deploying ICCO to EVM testnets

To deploy the Conductor and Contributor smart contracts to testnet, follow the procedure below.

**Set up the ICCO deployment config.** Each network in `icco_deployment_config.js` has several parameters:

- `conductorChainId` - the network that the `Conductor` contract is (or will be) deployed to
- `contributorChainId` - the network that the `Contributor` contract will be deployed to
- `authority` - the public key of the KYC authority for the contributor
  - This value does not have to be set when deploying the `Conductor` contract.
- `consistencyLevel` - number of confirmations
- `wormhole` - the wormhole coreBridge address
- `tokenBridge` - the wormhole tokenBridge address
- `mnemonic` - private key for deployment wallet
- `rpc` - URL for deployment provider

There is a [sample config](icco_deployment_config.js.sample) that you can copy to `icco_deployment_config.js` to help you get started.

The `conductorChainId` and `contributorChainId` should be the same only if both contracts are deployed on the same network. ChainIDs for each network can be found [here](https://docs.wormholenetwork.com/wormhole/contracts).

Deploy the `Conductor` contract with the following command, where `your_network` corresponds to a network name from `icco_deployment_config.js`.

```sh
npm run deploy-conductor your_network
```

And deploy the `Contributor` contract(s) for any network you want to collect contributions from.

```sh
npm run deploy-contributor your_network
```

These commands work for mainnet networks you configure, too. After deploying your contracts, follow the instructions in the README.md in `tools/` for `Contributor` contract registration.


@ -50,7 +50,7 @@ contract Conductor is ConductorGovernance, ReentrancyGuard {
}
/**
* @dev Fetch the sale token decimals and place in the SaleInit struct.
* @dev Fetch the sale token decimals on this chain.
* The Contributors need to know this to scale allocations on non-evm chains.
*/
(,bytes memory queriedDecimals) = localTokenAddress.staticcall(
@ -97,7 +97,9 @@ contract Conductor is ConductorGovernance, ReentrancyGuard {
ICCOStructs.Raise memory raise,
ICCOStructs.Token[] memory acceptedTokens
) public payable nonReentrant returns (
uint256 saleId
uint256 saleId,
uint256 wormholeSequence,
uint256 wormholeSequence2
) {
/// validate sale parameters from client
require(block.timestamp < raise.saleStart, "sale start must be in the future");
@ -157,7 +159,7 @@ contract Conductor is ConductorGovernance, ReentrancyGuard {
tokenIndex: uint8(i),
tokenAddress: acceptedTokens[i].tokenAddress
});
/// only allow 10 accepted tokens for the Solana Contributor
/// only allow 8 accepted tokens for the Solana Contributor
require(solanaAcceptedTokens.length < 8, "too many solana tokens");
/// save in contract storage
solanaAcceptedTokens.push(solanaToken);
@ -206,7 +208,7 @@ contract Conductor is ConductorGovernance, ReentrancyGuard {
});
/// @dev send encoded SaleInit struct to Contributors via wormhole.
wormhole.publishMessage{
wormholeSequence = wormhole.publishMessage{
value : messageFee
}(0, ICCOStructs.encodeSaleInit(saleInit), consistencyLevel());
@ -234,7 +236,7 @@ contract Conductor is ConductorGovernance, ReentrancyGuard {
});
/// @dev send encoded SolanaSaleInit struct to the solana Contributor
wormhole.publishMessage{
wormholeSequence2 = wormhole.publishMessage{
value : messageFee
}(0, ICCOStructs.encodeSolanaSaleInit(solanaSaleInit), consistencyLevel());
@ -321,7 +323,7 @@ contract Conductor is ConductorGovernance, ReentrancyGuard {
* - it calculates allocations and excess contributions for each accepted token
* - it disseminates a saleSealed or saleAborted message to Contributors via wormhole
*/
function sealSale(uint256 saleId) public payable returns (uint256 wormholeSequence) {
function sealSale(uint256 saleId) public payable returns (uint256 wormholeSequence, uint256 wormholeSequence2) {
require(saleExists(saleId), "sale not initiated");
ConductorStructs.Sale memory sale = sales(saleId);
@ -360,7 +362,7 @@ contract Conductor is ConductorGovernance, ReentrancyGuard {
accounting.totalExcessContribution = accounting.totalContribution - sale.maxRaise;
}
/// @dev This is a successful sale struct that saves sale token allocation information.
/// @dev This is a successful sale struct that saves sale token allocation information
ICCOStructs.SaleSealed memory saleSealed = ICCOStructs.SaleSealed({
payloadID : 3,
saleID : saleId,
@ -440,11 +442,14 @@ contract Conductor is ConductorGovernance, ReentrancyGuard {
/// @dev send encoded SaleSealed message to Contributor contracts
wormholeSequence = wormhole.publishMessage{
value : accounting.messageFee
}(0, ICCOStructs.encodeSaleSealed(saleSealed), consistencyLevel());
}(0, ICCOStructs.encodeSaleSealed(saleSealed), consistencyLevel());
{ /// scope to make code more readable
/// @dev send separate SaleSealed VAA if accepting Solana tokens
if (sale.solanaAcceptedTokensCount > 0) {
// make sure we still have enough gas to send the Solana message
require(accounting.valueSent >= accounting.messageFee, "insufficient wormhole messaging fees");
/// create new array to handle solana allocations
ICCOStructs.Allocation[] memory solanaAllocations = new ICCOStructs.Allocation[](sale.solanaAcceptedTokensCount);
@ -460,7 +465,7 @@ contract Conductor is ConductorGovernance, ReentrancyGuard {
saleSealed.allocations = solanaAllocations;
/// @dev send encoded SaleSealed message to Solana Contributor
wormholeSequence = wormhole.publishMessage{
wormholeSequence2 = wormhole.publishMessage{
value : accounting.messageFee
}(0, ICCOStructs.encodeSaleSealed(saleSealed), consistencyLevel());
}


@ -34,7 +34,7 @@ contract ConductorGovernance is ConductorGetters, ConductorSetters, ERC1967Upgra
_upgradeTo(newImplementation);
/// Call initialize function of the new implementation
/// @dev call initialize function of the new implementation
(bool success, bytes memory reason) = newImplementation.delegatecall(abi.encodeWithSignature("initialize()"));
require(success, string(reason));


@ -159,6 +159,7 @@ contract Contributor is ContributorGovernance, ReentrancyGuard {
saleId,
tokenIndex,
amount,
bytes12(0x0),
msg.sender,
getSaleContribution(saleId, tokenIndex, msg.sender)
);
@ -190,7 +191,7 @@ contract Contributor is ContributorGovernance, ReentrancyGuard {
/// revert if token has fee
require(amount == balanceAfter - balanceBefore, "fee-on-transfer tokens are not supported");
/// store contribution information
/// @dev store contribution information
setSaleContribution(saleId, msg.sender, tokenIndex, amount);
}
@ -309,7 +310,7 @@ contract Contributor is ContributorGovernance, ReentrancyGuard {
/**
* @dev Cache the conductorChainId from storage to save on gas.
* We will check each accpetedToken to see if its from this chain.
* We will check each acceptedToken to see if it's from this chain.
*/
uint16 conductorChainId = conductorChainId();
for (uint256 i = 0; i < sale.acceptedTokensAddresses.length; i++) {
@ -324,7 +325,7 @@ contract Contributor is ContributorGovernance, ReentrancyGuard {
/// check to see if this contributor is on the same chain as conductor
if (thisChainId == conductorChainId) {
// send contributions to recipient on this chain
/// send contributions to recipient on this chain
SafeERC20.safeTransfer(
IERC20(acceptedTokenAddress),
address(uint160(uint256(sale.recipient))),
@ -375,7 +376,7 @@ contract Contributor is ContributorGovernance, ReentrancyGuard {
}
}
/// @dev saleAborted serves to mark the sale unsuccessful or canceled.
/// @dev saleAborted serves to mark the sale unsuccessful or canceled
function saleAborted(bytes memory saleAbortedVaa) public {
(IWormhole.VM memory vm, bool valid, string memory reason) = wormhole().parseAndVerifyVM(saleAbortedVaa);
@ -389,11 +390,11 @@ contract Contributor is ContributorGovernance, ReentrancyGuard {
}
/**
* @dev claimAllocation serves to send contributors an preallocated amount of sale tokens
* and a refund for any excessContributions
* @dev claimAllocation serves to send contributors a preallocated amount of sale tokens
* and a refund for any excessContributions.
* - it confirms that the sale was sealed
* - it transfers sale tokens to the contributors wallet
* - it transfer any excessContributions to the contributors wallet
* - it transfers sale tokens to the contributor's wallet
* - it transfers any excessContributions to the contributor's wallet
* - it marks the allocation as claimed to prevent multiple claims for the same allocation
*/
function claimAllocation(uint256 saleId, uint256 tokenIndex) public {

6
ethereum/deploy_contracts.sh Executable file

@ -0,0 +1,6 @@
#!/bin/bash
cd migrations/
npx truffle migrate --f 2 --to 2 --network goerli --skip-dry-run
npx truffle migrate --f 3 --to 3 --network goerli --skip-dry-run
npx truffle migrate --f 3 --to 3 --network fuji --skip-dry-run


@ -126,4 +126,13 @@ module.exports = {
rpc: "",
deployImplementationOnly: false,
},
solana_testnet: {
conductorChainId: 2,
contributorChainId: 1,
authority: "",
wormhole: "",
tokenBridge: "",
mnemonic: "",
rpc: "",
},
};


@ -18,6 +18,8 @@
},
"scripts": {
"build": "truffle compile",
"deploy-conductor": "truffle migrate --skip-dry-run --f 2 --to 2 --network",
"deploy-contributor": "truffle migrate --skip-dry-run --f 3 --to 3 --network",
"test": "mkdir -p build/contracts && cp node_modules/@openzeppelin/contracts/build/contracts/* build/contracts/ && truffle test",
"flatten": "mkdir -p node_modules/@poanet/solidity-flattener/contracts && cp -r contracts/* node_modules/@poanet/solidity-flattener/contracts/ && poa-solidity-flattener",
"verify": "patch -u -f node_modules/truffle-plugin-verify/constants.js -i truffle-verify-constants.patch; truffle run verify $npm_config_module@$npm_config_contract_address --network $npm_config_network"


@ -6162,7 +6162,7 @@ const signContribution = async function(
web3.eth.abi.encodeParameter("uint256", amount).substring(2),
web3.eth.abi
.encodeParameter("address", buyerAddress)
.substring(2 + (64 - 40)),
.substring(2), // we actually want 32 bytes
web3.eth.abi.encodeParameter("uint256", totalContribution).substring(2),
];


@ -57,7 +57,7 @@ module.exports = {
);
},
network_id: "5",
gas: 4465030,
gas: 6465030,
gasPrice: 10000000000,
},
binance: {

383
sdk/js/package-lock.json generated

@ -1,15 +1,15 @@
{
"name": "@certusone/wormhole-sdk",
"version": "0.2.0",
"name": "@certusone/wormhole-icco-sdk",
"version": "0.1.0",
"lockfileVersion": 2,
"requires": true,
"packages": {
"": {
"name": "@certusone/wormhole-sdk",
"version": "0.2.0",
"name": "@certusone/wormhole-icco-sdk",
"version": "0.1.0",
"license": "Apache-2.0",
"dependencies": {
"@certusone/wormhole-sdk": "^0.2.4",
"@certusone/wormhole-sdk": "^0.3.5",
"@improbable-eng/grpc-web": "^0.14.0",
"@solana/spl-token": "^0.1.8",
"@solana/web3.js": "^1.24.0",
@ -612,14 +612,15 @@
"dev": true
},
"node_modules/@certusone/wormhole-sdk": {
"version": "0.2.4",
"resolved": "https://registry.npmjs.org/@certusone/wormhole-sdk/-/wormhole-sdk-0.2.4.tgz",
"integrity": "sha512-aSb0IjobZG7wf1lKeceNkwzBNzjTIX+RjRQ1PsWmNgRvDoCqLohPQf7e929nfJrScuzIRIxfu1OOODBSJQzt+Q==",
"version": "0.3.5",
"resolved": "https://registry.npmjs.org/@certusone/wormhole-sdk/-/wormhole-sdk-0.3.5.tgz",
"integrity": "sha512-YVNnII54CxjZ96TVZE+GzrfnRd6z338cLxtCRp5Ri4tU8pT1/aqaWXgTW+8JzKDSb+pa0Yc6VuXrcOEsGikTdw==",
"dependencies": {
"@improbable-eng/grpc-web": "^0.14.0",
"@solana/spl-token": "^0.1.8",
"@solana/web3.js": "^1.24.0",
"@terra-money/terra.js": "^3.0.7",
"algosdk": "^1.15.0",
"axios": "^0.24.0",
"bech32": "^2.0.0",
"js-base64": "^3.6.1",
@ -2653,6 +2654,54 @@
"url": "https://github.com/sponsors/epoberezkin"
}
},
"node_modules/algo-msgpack-with-bigint": {
"version": "2.1.1",
"resolved": "https://registry.npmjs.org/algo-msgpack-with-bigint/-/algo-msgpack-with-bigint-2.1.1.tgz",
"integrity": "sha512-F1tGh056XczEaEAqu7s+hlZUDWwOBT70Eq0lfMpBP2YguSQVyxRbprLq5rELXKQOyOaixTWYhMeMQMzP0U5FoQ==",
"engines": {
"node": ">= 10"
}
},
"node_modules/algosdk": {
"version": "1.16.0",
"resolved": "https://registry.npmjs.org/algosdk/-/algosdk-1.16.0.tgz",
"integrity": "sha512-oD1PEuzjJFXSx7/zwFB5N337kqykhxaPXEGgbw4IWKgjq1eRn++RAa+FH9PRwv+n9nOdzFfy83uWYwkVqliRuw==",
"dependencies": {
"algo-msgpack-with-bigint": "^2.1.1",
"buffer": "^6.0.2",
"hi-base32": "^0.5.1",
"js-sha256": "^0.9.0",
"js-sha3": "^0.8.0",
"js-sha512": "^0.8.0",
"json-bigint": "^1.0.0",
"superagent": "^6.1.0",
"tweetnacl": "^1.0.3",
"url-parse": "^1.5.1"
}
},
"node_modules/algosdk/node_modules/buffer": {
"version": "6.0.3",
"resolved": "https://registry.npmjs.org/buffer/-/buffer-6.0.3.tgz",
"integrity": "sha512-FTiCpNxtwiZZHEZbcbTIcZjERVICn9yq/pDFkTl95/AxzD1naBctN7YO68riM/gLSDY7sdrMby8hofADYuuqOA==",
"funding": [
{
"type": "github",
"url": "https://github.com/sponsors/feross"
},
{
"type": "patreon",
"url": "https://www.patreon.com/feross"
},
{
"type": "consulting",
"url": "https://feross.org/support"
}
],
"dependencies": {
"base64-js": "^1.3.1",
"ieee754": "^1.2.1"
}
},
"node_modules/ansi-escapes": {
"version": "4.3.2",
"resolved": "https://registry.npmjs.org/ansi-escapes/-/ansi-escapes-4.3.2.tgz",
@ -2788,8 +2837,7 @@
"node_modules/asynckit": {
"version": "0.4.0",
"resolved": "https://registry.npmjs.org/asynckit/-/asynckit-0.4.0.tgz",
"integrity": "sha1-x57Zf380y48robyXkLzDZkdLS3k=",
"dev": true
"integrity": "sha1-x57Zf380y48robyXkLzDZkdLS3k="
},
"node_modules/available-typed-arrays": {
"version": "1.0.5",
@ -3069,7 +3117,6 @@
"version": "9.0.2",
"resolved": "https://registry.npmjs.org/bignumber.js/-/bignumber.js-9.0.2.tgz",
"integrity": "sha512-GAcQvbpsM0pUb0zw1EI0KhQEZ+lRwR5fYaAp3vPOYuP7aDvGy6cVN6XHLauvF8SOga2y0dcLcjt3iQDTSEliyw==",
"dev": true,
"engines": {
"node": "*"
}
@ -3675,7 +3722,6 @@
"version": "1.0.8",
"resolved": "https://registry.npmjs.org/combined-stream/-/combined-stream-1.0.8.tgz",
"integrity": "sha512-FQN4MRfuJeHf7cBbBMJFXhKSDq+2kAArBlmRBvcvFE5BB1HZKXtSFASDhdlz9zOYwxh8lDdnvmMOe/+5cdoEdg==",
"dev": true,
"dependencies": {
"delayed-stream": "~1.0.0"
},
@ -3703,6 +3749,11 @@
"resolved": "https://registry.npmjs.org/commander/-/commander-2.20.3.tgz",
"integrity": "sha512-GpVkmM8vF2vQUkj2LvZmD35JxeJOLCwJ9cUkugyk2nuhbv3+mJvpLYYt+0+USMxE+oj+ey/lJEnhZw75x/OMcQ=="
},
"node_modules/component-emitter": {
"version": "1.3.0",
"resolved": "https://registry.npmjs.org/component-emitter/-/component-emitter-1.3.0.tgz",
"integrity": "sha512-Rd3se6QB+sO1TwqZjscQrurpEPIfO0/yYnSin6Q/rD3mOutHvUrCAhJub3r90uNb+SESBuE0QYoB90YdfatsRg=="
},
"node_modules/compound-subject": {
"version": "0.0.1",
"resolved": "https://registry.npmjs.org/compound-subject/-/compound-subject-0.0.1.tgz",
@ -3778,8 +3829,7 @@
"node_modules/cookiejar": {
"version": "2.1.3",
"resolved": "https://registry.npmjs.org/cookiejar/-/cookiejar-2.1.3.tgz",
"integrity": "sha512-JxbCBUdrfr6AQjOXrxoTvAMJO4HBTUIlBzslcJPAz+/KT8yk53fXun51u+RenNYvad/+Vc2DIz5o9UxlCDymFQ==",
"dev": true
"integrity": "sha512-JxbCBUdrfr6AQjOXrxoTvAMJO4HBTUIlBzslcJPAz+/KT8yk53fXun51u+RenNYvad/+Vc2DIz5o9UxlCDymFQ=="
},
"node_modules/copy-dir": {
"version": "1.3.0",
@ -4075,7 +4125,6 @@
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/delayed-stream/-/delayed-stream-1.0.0.tgz",
"integrity": "sha1-3zrhmayt+31ECqrgsp4icrJOxhk=",
"dev": true,
"engines": {
"node": ">=0.4.0"
}
@ -4820,6 +4869,11 @@
"integrity": "sha1-PYpcZog6FqMMqGQ+hR8Zuqd5eRc=",
"dev": true
},
"node_modules/fast-safe-stringify": {
"version": "2.1.1",
"resolved": "https://registry.npmjs.org/fast-safe-stringify/-/fast-safe-stringify-2.1.1.tgz",
"integrity": "sha512-W+KJc2dmILlPplD/H4K9l9LcAHAfPtP6BY84uVLXQ6Evcz9Lcg33Y2z1IVblT6xdY54PXYVHEv+0Wpq8Io6zkA=="
},
"node_modules/fb-watchman": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/fb-watchman/-/fb-watchman-2.0.1.tgz",
@ -4942,7 +4996,6 @@
"version": "3.0.1",
"resolved": "https://registry.npmjs.org/form-data/-/form-data-3.0.1.tgz",
"integrity": "sha512-RHkBKtLWUVwd7SqRIvCZMEvAMoGUp0XU+seQiZejj0COz3RI3hWP4sCv3gZWWLjJTd7rGwcsF5eKZGii0r/hbg==",
"dev": true,
"dependencies": {
"asynckit": "^0.4.0",
"combined-stream": "^1.0.8",
@ -4952,6 +5005,15 @@
"node": ">= 6"
}
},
"node_modules/formidable": {
"version": "1.2.6",
"resolved": "https://registry.npmjs.org/formidable/-/formidable-1.2.6.tgz",
"integrity": "sha512-KcpbcpuLNOwrEjnbpMC0gS+X8ciDoZE1kkqzat4a8vrprf+s9pKNQ/QIwWfbfs4ltgmFl3MD177SNTkve3BwGQ==",
"deprecated": "Please upgrade to latest, formidable@v2 or formidable@v3! Check these notes: https://bit.ly/2ZEqIau",
"funding": {
"url": "https://ko-fi.com/tunnckoCore/commissions"
}
},
"node_modules/forwarded": {
"version": "0.2.0",
"resolved": "https://registry.npmjs.org/forwarded/-/forwarded-0.2.0.tgz",
@ -5308,6 +5370,11 @@
"minimalistic-assert": "^1.0.1"
}
},
"node_modules/hi-base32": {
"version": "0.5.1",
"resolved": "https://registry.npmjs.org/hi-base32/-/hi-base32-0.5.1.tgz",
"integrity": "sha512-EmBBpvdYh/4XxsnUybsPag6VikPYnN30td+vQk+GI3qpahVEG9+gTkG0aXVxTjBqQ5T6ijbWIu77O+C5WFWsnA=="
},
"node_modules/hmac-drbg": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/hmac-drbg/-/hmac-drbg-1.0.1.tgz",
@ -7780,11 +7847,21 @@
"resolved": "https://registry.npmjs.org/js-base64/-/js-base64-3.6.1.tgz",
"integrity": "sha512-Frdq2+tRRGLQUIQOgsIGSCd1VePCS2fsddTG5dTCqR0JHgltXWfsxnY0gIXPoMeRmdom6Oyq+UMOFg5suduOjQ=="
},
"node_modules/js-sha256": {
"version": "0.9.0",
"resolved": "https://registry.npmjs.org/js-sha256/-/js-sha256-0.9.0.tgz",
"integrity": "sha512-sga3MHh9sgQN2+pJ9VYZ+1LPwXOxuBJBA5nrR5/ofPfuiJBE2hnjsaN8se8JznOmGLN2p49Pe5U/ttafcs/apA=="
},
"node_modules/js-sha3": {
"version": "0.8.0",
"resolved": "https://registry.npmjs.org/js-sha3/-/js-sha3-0.8.0.tgz",
"integrity": "sha512-gF1cRrHhIzNfToc802P800N8PpXS+evLLXfsVpowqmAFR9uwbi89WvXg2QspOmXL8QL86J4T1EpFu+yUkwJY3Q=="
},
"node_modules/js-sha512": {
"version": "0.8.0",
"resolved": "https://registry.npmjs.org/js-sha512/-/js-sha512-0.8.0.tgz",
"integrity": "sha512-PWsmefG6Jkodqt+ePTvBZCSMFgN7Clckjd0O7su3I0+BW2QWUTJNzjktHsztGLhncP2h8mcF9V9Y2Ha59pAViQ=="
},
"node_modules/js-tokens": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/js-tokens/-/js-tokens-4.0.0.tgz",
@ -7876,6 +7953,14 @@
"node": ">=4"
}
},
"node_modules/json-bigint": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/json-bigint/-/json-bigint-1.0.0.tgz",
"integrity": "sha512-SiPv/8VpZuWbvLSMtTDU8hEfrZWg/mH/nV/b4o0CYbSxu1UIQPLdwKOCIyLQX+VIPO5vrLX3i8qtqFyhdPSUSQ==",
"dependencies": {
"bignumber.js": "^9.0.0"
}
},
"node_modules/json-buffer": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/json-buffer/-/json-buffer-3.0.0.tgz",
@ -8065,7 +8150,6 @@
"version": "6.0.0",
"resolved": "https://registry.npmjs.org/lru-cache/-/lru-cache-6.0.0.tgz",
"integrity": "sha512-Jo6dJ04CmSjuznwJSS3pUeWmd/H0ffTlkXXgwZi+eq1UCmqQwCh+eLsYOYCwY991i2Fah4h1BEMCx4qThGbsiA==",
"dev": true,
"dependencies": {
"yallist": "^4.0.0"
},
@ -8147,7 +8231,6 @@
"version": "1.1.2",
"resolved": "https://registry.npmjs.org/methods/-/methods-1.1.2.tgz",
"integrity": "sha1-VSmk1nZUE07cxSZmVoNbD4Ua/O4=",
"dev": true,
"engines": {
"node": ">= 0.6"
}
@ -8200,7 +8283,6 @@
"version": "1.50.0",
"resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.50.0.tgz",
"integrity": "sha512-9tMZCDlYHqeERXEHO9f/hKfNXhre5dK2eE/krIvUjZbS2KPcqGDfNShIWS1uW9XOTKQKqK6qbeOci18rbfW77A==",
"dev": true,
"engines": {
"node": ">= 0.6"
}
@ -8209,7 +8291,6 @@
"version": "2.1.33",
"resolved": "https://registry.npmjs.org/mime-types/-/mime-types-2.1.33.tgz",
"integrity": "sha512-plLElXp7pRDd0bNZHw+nMd52vRYjLwQjygaNg7ddJ2uJtTlmnTCjWuPKxVu6//AdaRuME84SvLW91sIkBqGT0g==",
"dev": true,
"dependencies": {
"mime-db": "1.50.0"
},
@ -9049,7 +9130,6 @@
"version": "6.9.6",
"resolved": "https://registry.npmjs.org/qs/-/qs-6.9.6.tgz",
"integrity": "sha512-TIRk4aqYLNoJUbd+g2lEdz5kLWIuTMRagAXxl78Q0RiVjAOugHmeKNGdd3cwo/ktpf9aL9epCfFqWDEKysUlLQ==",
"dev": true,
"engines": {
"node": ">=0.6"
},
@ -9057,6 +9137,11 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/querystringify": {
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/querystringify/-/querystringify-2.2.0.tgz",
"integrity": "sha512-FIqgj2EUvTa7R50u0rGsyTftzjYmv/a3hO345bZNrqabNqjtgiDMgmo4mkUjd+nzU5oF3dClKqFIPUKybUyqoQ=="
},
"node_modules/randombytes": {
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/randombytes/-/randombytes-2.1.0.tgz",
@ -9200,6 +9285,11 @@
"node": ">=0.10.0"
}
},
"node_modules/requires-port": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/requires-port/-/requires-port-1.0.0.tgz",
"integrity": "sha1-kl0mAdOaxIXgkc8NpcbmlNw9yv8="
},
"node_modules/resolve": {
"version": "1.20.0",
"resolved": "https://registry.npmjs.org/resolve/-/resolve-1.20.0.tgz",
@ -9766,6 +9856,74 @@
"npm": ">=3"
}
},
"node_modules/superagent": {
"version": "6.1.0",
"resolved": "https://registry.npmjs.org/superagent/-/superagent-6.1.0.tgz",
"integrity": "sha512-OUDHEssirmplo3F+1HWKUrUjvnQuA+nZI6i/JJBdXb5eq9IyEQwPyPpqND+SSsxf6TygpBEkUjISVRN4/VOpeg==",
"deprecated": "Please upgrade to v7.0.2+ of superagent. We have fixed numerous issues with streams, form-data, attach(), filesystem errors not bubbling up (ENOENT on attach()), and all tests are now passing. See the releases tab for more information at <https://github.com/visionmedia/superagent/releases>.",
"dependencies": {
"component-emitter": "^1.3.0",
"cookiejar": "^2.1.2",
"debug": "^4.1.1",
"fast-safe-stringify": "^2.0.7",
"form-data": "^3.0.0",
"formidable": "^1.2.2",
"methods": "^1.1.2",
"mime": "^2.4.6",
"qs": "^6.9.4",
"readable-stream": "^3.6.0",
"semver": "^7.3.2"
},
"engines": {
"node": ">= 7.0.0"
}
},
"node_modules/superagent/node_modules/debug": {
"version": "4.3.4",
"resolved": "https://registry.npmjs.org/debug/-/debug-4.3.4.tgz",
"integrity": "sha512-PRWFHuSU3eDtQJPvnNY7Jcket1j0t5OuOsFzPPzsekD52Zl8qUfFIPEiswXqIvHWGVHOgX+7G/vCNNhehwxfkQ==",
"dependencies": {
"ms": "2.1.2"
},
"engines": {
"node": ">=6.0"
},
"peerDependenciesMeta": {
"supports-color": {
"optional": true
}
}
},
"node_modules/superagent/node_modules/mime": {
"version": "2.6.0",
"resolved": "https://registry.npmjs.org/mime/-/mime-2.6.0.tgz",
"integrity": "sha512-USPkMeET31rOMiarsBNIHZKLGgvKc/LrjofAnBlOttf5ajRvqiRA8QsenbcooctK6d6Ts6aqZXBA+XbkKthiQg==",
"bin": {
"mime": "cli.js"
},
"engines": {
"node": ">=4.0.0"
}
},
"node_modules/superagent/node_modules/ms": {
"version": "2.1.2",
"resolved": "https://registry.npmjs.org/ms/-/ms-2.1.2.tgz",
"integrity": "sha512-sGkPx+VjMtmA6MX27oA4FBFELFCZZ4S4XqeGOXCv68tT+jb3vk/RyaKWP0PTKyWtmLSM0b+adUTEvbs1PEaH2w=="
},
"node_modules/superagent/node_modules/semver": {
"version": "7.3.7",
"resolved": "https://registry.npmjs.org/semver/-/semver-7.3.7.tgz",
"integrity": "sha512-QlYTucUYOews+WeEujDoEGziz4K6c47V/Bd+LjSSYcA94p+DmINdf7ncaUinThfvZyu13lN9OY1XDxt8C0Tw0g==",
"dependencies": {
"lru-cache": "^6.0.0"
},
"bin": {
"semver": "bin/semver.js"
},
"engines": {
"node": ">=10"
}
},
"node_modules/superstruct": {
"version": "0.14.2",
"resolved": "https://registry.npmjs.org/superstruct/-/superstruct-0.14.2.tgz",
@ -10495,6 +10653,15 @@
"punycode": "^2.1.0"
}
},
"node_modules/url-parse": {
"version": "1.5.10",
"resolved": "https://registry.npmjs.org/url-parse/-/url-parse-1.5.10.tgz",
"integrity": "sha512-WypcfiRhfeUP9vvF0j6rw0J3hrWrw6iZv3+22h6iRMJ/8z1Tj6XfLP4DsUix5MhMPnXpiHDoKyoZ/bdCkwBCiQ==",
"dependencies": {
"querystringify": "^2.1.1",
"requires-port": "^1.0.0"
}
},
"node_modules/url-parse-lax": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/url-parse-lax/-/url-parse-lax-3.0.0.tgz",
@ -11404,8 +11571,7 @@
"node_modules/yallist": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/yallist/-/yallist-4.0.0.tgz",
"integrity": "sha512-3wdGidZyq5PB084XLES5TpOSRA3wjXAlIWMhum2kRcv/41Sn2emQ0dycQW4uZXLejwKvg6EsvbdlVL+FYEct7A==",
"dev": true
"integrity": "sha512-3wdGidZyq5PB084XLES5TpOSRA3wjXAlIWMhum2kRcv/41Sn2emQ0dycQW4uZXLejwKvg6EsvbdlVL+FYEct7A=="
},
"node_modules/yargs": {
"version": "16.2.0",
@ -11873,14 +12039,15 @@
"dev": true
},
"@certusone/wormhole-sdk": {
"version": "0.2.4",
"resolved": "https://registry.npmjs.org/@certusone/wormhole-sdk/-/wormhole-sdk-0.2.4.tgz",
"integrity": "sha512-aSb0IjobZG7wf1lKeceNkwzBNzjTIX+RjRQ1PsWmNgRvDoCqLohPQf7e929nfJrScuzIRIxfu1OOODBSJQzt+Q==",
"version": "0.3.5",
"resolved": "https://registry.npmjs.org/@certusone/wormhole-sdk/-/wormhole-sdk-0.3.5.tgz",
"integrity": "sha512-YVNnII54CxjZ96TVZE+GzrfnRd6z338cLxtCRp5Ri4tU8pT1/aqaWXgTW+8JzKDSb+pa0Yc6VuXrcOEsGikTdw==",
"requires": {
"@improbable-eng/grpc-web": "^0.14.0",
"@solana/spl-token": "^0.1.8",
"@solana/web3.js": "^1.24.0",
"@terra-money/terra.js": "^3.0.7",
"algosdk": "^1.15.0",
"axios": "^0.24.0",
"bech32": "^2.0.0",
"js-base64": "^3.6.1",
@ -13354,6 +13521,39 @@
"uri-js": "^4.2.2"
}
},
"algo-msgpack-with-bigint": {
"version": "2.1.1",
"resolved": "https://registry.npmjs.org/algo-msgpack-with-bigint/-/algo-msgpack-with-bigint-2.1.1.tgz",
"integrity": "sha512-F1tGh056XczEaEAqu7s+hlZUDWwOBT70Eq0lfMpBP2YguSQVyxRbprLq5rELXKQOyOaixTWYhMeMQMzP0U5FoQ=="
},
"algosdk": {
"version": "1.16.0",
"resolved": "https://registry.npmjs.org/algosdk/-/algosdk-1.16.0.tgz",
"integrity": "sha512-oD1PEuzjJFXSx7/zwFB5N337kqykhxaPXEGgbw4IWKgjq1eRn++RAa+FH9PRwv+n9nOdzFfy83uWYwkVqliRuw==",
"requires": {
"algo-msgpack-with-bigint": "^2.1.1",
"buffer": "^6.0.2",
"hi-base32": "^0.5.1",
"js-sha256": "^0.9.0",
"js-sha3": "^0.8.0",
"js-sha512": "^0.8.0",
"json-bigint": "^1.0.0",
"superagent": "^6.1.0",
"tweetnacl": "^1.0.3",
"url-parse": "^1.5.1"
},
"dependencies": {
"buffer": {
"version": "6.0.3",
"resolved": "https://registry.npmjs.org/buffer/-/buffer-6.0.3.tgz",
"integrity": "sha512-FTiCpNxtwiZZHEZbcbTIcZjERVICn9yq/pDFkTl95/AxzD1naBctN7YO68riM/gLSDY7sdrMby8hofADYuuqOA==",
"requires": {
"base64-js": "^1.3.1",
"ieee754": "^1.2.1"
}
}
}
},
"ansi-escapes": {
"version": "4.3.2",
"resolved": "https://registry.npmjs.org/ansi-escapes/-/ansi-escapes-4.3.2.tgz",
@ -13470,8 +13670,7 @@
"asynckit": {
"version": "0.4.0",
"resolved": "https://registry.npmjs.org/asynckit/-/asynckit-0.4.0.tgz",
"integrity": "sha1-x57Zf380y48robyXkLzDZkdLS3k=",
"dev": true
"integrity": "sha1-x57Zf380y48robyXkLzDZkdLS3k="
},
"available-typed-arrays": {
"version": "1.0.5",
@ -13685,8 +13884,7 @@
"bignumber.js": {
"version": "9.0.2",
"resolved": "https://registry.npmjs.org/bignumber.js/-/bignumber.js-9.0.2.tgz",
"integrity": "sha512-GAcQvbpsM0pUb0zw1EI0KhQEZ+lRwR5fYaAp3vPOYuP7aDvGy6cVN6XHLauvF8SOga2y0dcLcjt3iQDTSEliyw==",
"dev": true
"integrity": "sha512-GAcQvbpsM0pUb0zw1EI0KhQEZ+lRwR5fYaAp3vPOYuP7aDvGy6cVN6XHLauvF8SOga2y0dcLcjt3iQDTSEliyw=="
},
"bindings": {
"version": "1.5.0",
@ -14192,7 +14390,6 @@
"version": "1.0.8",
"resolved": "https://registry.npmjs.org/combined-stream/-/combined-stream-1.0.8.tgz",
"integrity": "sha512-FQN4MRfuJeHf7cBbBMJFXhKSDq+2kAArBlmRBvcvFE5BB1HZKXtSFASDhdlz9zOYwxh8lDdnvmMOe/+5cdoEdg==",
"dev": true,
"requires": {
"delayed-stream": "~1.0.0"
}
@ -14214,6 +14411,11 @@
"resolved": "https://registry.npmjs.org/commander/-/commander-2.20.3.tgz",
"integrity": "sha512-GpVkmM8vF2vQUkj2LvZmD35JxeJOLCwJ9cUkugyk2nuhbv3+mJvpLYYt+0+USMxE+oj+ey/lJEnhZw75x/OMcQ=="
},
"component-emitter": {
"version": "1.3.0",
"resolved": "https://registry.npmjs.org/component-emitter/-/component-emitter-1.3.0.tgz",
"integrity": "sha512-Rd3se6QB+sO1TwqZjscQrurpEPIfO0/yYnSin6Q/rD3mOutHvUrCAhJub3r90uNb+SESBuE0QYoB90YdfatsRg=="
},
"compound-subject": {
"version": "0.0.1",
"resolved": "https://registry.npmjs.org/compound-subject/-/compound-subject-0.0.1.tgz",
@ -14282,8 +14484,7 @@
"cookiejar": {
"version": "2.1.3",
"resolved": "https://registry.npmjs.org/cookiejar/-/cookiejar-2.1.3.tgz",
"integrity": "sha512-JxbCBUdrfr6AQjOXrxoTvAMJO4HBTUIlBzslcJPAz+/KT8yk53fXun51u+RenNYvad/+Vc2DIz5o9UxlCDymFQ==",
"dev": true
"integrity": "sha512-JxbCBUdrfr6AQjOXrxoTvAMJO4HBTUIlBzslcJPAz+/KT8yk53fXun51u+RenNYvad/+Vc2DIz5o9UxlCDymFQ=="
},
"copy-dir": {
"version": "1.3.0",
@ -14531,8 +14732,7 @@
"delayed-stream": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/delayed-stream/-/delayed-stream-1.0.0.tgz",
"integrity": "sha1-3zrhmayt+31ECqrgsp4icrJOxhk=",
"dev": true
"integrity": "sha1-3zrhmayt+31ECqrgsp4icrJOxhk="
},
"depd": {
"version": "1.1.2",
@ -15170,6 +15370,11 @@
"integrity": "sha1-PYpcZog6FqMMqGQ+hR8Zuqd5eRc=",
"dev": true
},
"fast-safe-stringify": {
"version": "2.1.1",
"resolved": "https://registry.npmjs.org/fast-safe-stringify/-/fast-safe-stringify-2.1.1.tgz",
"integrity": "sha512-W+KJc2dmILlPplD/H4K9l9LcAHAfPtP6BY84uVLXQ6Evcz9Lcg33Y2z1IVblT6xdY54PXYVHEv+0Wpq8Io6zkA=="
},
"fb-watchman": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/fb-watchman/-/fb-watchman-2.0.1.tgz",
@ -15262,13 +15467,17 @@
"version": "3.0.1",
"resolved": "https://registry.npmjs.org/form-data/-/form-data-3.0.1.tgz",
"integrity": "sha512-RHkBKtLWUVwd7SqRIvCZMEvAMoGUp0XU+seQiZejj0COz3RI3hWP4sCv3gZWWLjJTd7rGwcsF5eKZGii0r/hbg==",
"dev": true,
"requires": {
"asynckit": "^0.4.0",
"combined-stream": "^1.0.8",
"mime-types": "^2.1.12"
}
},
"formidable": {
"version": "1.2.6",
"resolved": "https://registry.npmjs.org/formidable/-/formidable-1.2.6.tgz",
"integrity": "sha512-KcpbcpuLNOwrEjnbpMC0gS+X8ciDoZE1kkqzat4a8vrprf+s9pKNQ/QIwWfbfs4ltgmFl3MD177SNTkve3BwGQ=="
},
"forwarded": {
"version": "0.2.0",
"resolved": "https://registry.npmjs.org/forwarded/-/forwarded-0.2.0.tgz",
@ -15535,6 +15744,11 @@
"minimalistic-assert": "^1.0.1"
}
},
"hi-base32": {
"version": "0.5.1",
"resolved": "https://registry.npmjs.org/hi-base32/-/hi-base32-0.5.1.tgz",
"integrity": "sha512-EmBBpvdYh/4XxsnUybsPag6VikPYnN30td+vQk+GI3qpahVEG9+gTkG0aXVxTjBqQ5T6ijbWIu77O+C5WFWsnA=="
},
"hmac-drbg": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/hmac-drbg/-/hmac-drbg-1.0.1.tgz",
@ -17366,11 +17580,21 @@
"resolved": "https://registry.npmjs.org/js-base64/-/js-base64-3.6.1.tgz",
"integrity": "sha512-Frdq2+tRRGLQUIQOgsIGSCd1VePCS2fsddTG5dTCqR0JHgltXWfsxnY0gIXPoMeRmdom6Oyq+UMOFg5suduOjQ=="
},
"js-sha256": {
"version": "0.9.0",
"resolved": "https://registry.npmjs.org/js-sha256/-/js-sha256-0.9.0.tgz",
"integrity": "sha512-sga3MHh9sgQN2+pJ9VYZ+1LPwXOxuBJBA5nrR5/ofPfuiJBE2hnjsaN8se8JznOmGLN2p49Pe5U/ttafcs/apA=="
},
"js-sha3": {
"version": "0.8.0",
"resolved": "https://registry.npmjs.org/js-sha3/-/js-sha3-0.8.0.tgz",
"integrity": "sha512-gF1cRrHhIzNfToc802P800N8PpXS+evLLXfsVpowqmAFR9uwbi89WvXg2QspOmXL8QL86J4T1EpFu+yUkwJY3Q=="
},
"js-sha512": {
"version": "0.8.0",
"resolved": "https://registry.npmjs.org/js-sha512/-/js-sha512-0.8.0.tgz",
"integrity": "sha512-PWsmefG6Jkodqt+ePTvBZCSMFgN7Clckjd0O7su3I0+BW2QWUTJNzjktHsztGLhncP2h8mcF9V9Y2Ha59pAViQ=="
},
"js-tokens": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/js-tokens/-/js-tokens-4.0.0.tgz",
@ -17439,6 +17663,14 @@
"integrity": "sha512-OYu7XEzjkCQ3C5Ps3QIZsQfNpqoJyZZA99wd9aWd05NCtC5pWOkShK2mkL6HXQR6/Cy2lbNdPlZBpuQHXE63gA==",
"dev": true
},
"json-bigint": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/json-bigint/-/json-bigint-1.0.0.tgz",
"integrity": "sha512-SiPv/8VpZuWbvLSMtTDU8hEfrZWg/mH/nV/b4o0CYbSxu1UIQPLdwKOCIyLQX+VIPO5vrLX3i8qtqFyhdPSUSQ==",
"requires": {
"bignumber.js": "^9.0.0"
}
},
"json-buffer": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/json-buffer/-/json-buffer-3.0.0.tgz",
@ -17591,7 +17823,6 @@
"version": "6.0.0",
"resolved": "https://registry.npmjs.org/lru-cache/-/lru-cache-6.0.0.tgz",
"integrity": "sha512-Jo6dJ04CmSjuznwJSS3pUeWmd/H0ffTlkXXgwZi+eq1UCmqQwCh+eLsYOYCwY991i2Fah4h1BEMCx4qThGbsiA==",
"dev": true,
"requires": {
"yallist": "^4.0.0"
}
@ -17659,8 +17890,7 @@
"methods": {
"version": "1.1.2",
"resolved": "https://registry.npmjs.org/methods/-/methods-1.1.2.tgz",
"integrity": "sha1-VSmk1nZUE07cxSZmVoNbD4Ua/O4=",
"dev": true
"integrity": "sha1-VSmk1nZUE07cxSZmVoNbD4Ua/O4="
},
"micromatch": {
"version": "4.0.4",
@ -17699,14 +17929,12 @@
"mime-db": {
"version": "1.50.0",
"resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.50.0.tgz",
"integrity": "sha512-9tMZCDlYHqeERXEHO9f/hKfNXhre5dK2eE/krIvUjZbS2KPcqGDfNShIWS1uW9XOTKQKqK6qbeOci18rbfW77A==",
"dev": true
"integrity": "sha512-9tMZCDlYHqeERXEHO9f/hKfNXhre5dK2eE/krIvUjZbS2KPcqGDfNShIWS1uW9XOTKQKqK6qbeOci18rbfW77A=="
},
"mime-types": {
"version": "2.1.33",
"resolved": "https://registry.npmjs.org/mime-types/-/mime-types-2.1.33.tgz",
"integrity": "sha512-plLElXp7pRDd0bNZHw+nMd52vRYjLwQjygaNg7ddJ2uJtTlmnTCjWuPKxVu6//AdaRuME84SvLW91sIkBqGT0g==",
"dev": true,
"requires": {
"mime-db": "1.50.0"
}
@ -18358,8 +18586,12 @@
"qs": {
"version": "6.9.6",
"resolved": "https://registry.npmjs.org/qs/-/qs-6.9.6.tgz",
"integrity": "sha512-TIRk4aqYLNoJUbd+g2lEdz5kLWIuTMRagAXxl78Q0RiVjAOugHmeKNGdd3cwo/ktpf9aL9epCfFqWDEKysUlLQ==",
"dev": true
"integrity": "sha512-TIRk4aqYLNoJUbd+g2lEdz5kLWIuTMRagAXxl78Q0RiVjAOugHmeKNGdd3cwo/ktpf9aL9epCfFqWDEKysUlLQ=="
},
"querystringify": {
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/querystringify/-/querystringify-2.2.0.tgz",
"integrity": "sha512-FIqgj2EUvTa7R50u0rGsyTftzjYmv/a3hO345bZNrqabNqjtgiDMgmo4mkUjd+nzU5oF3dClKqFIPUKybUyqoQ=="
},
"randombytes": {
"version": "2.1.0",
@ -18481,6 +18713,11 @@
"integrity": "sha1-jGStX9MNqxyXbiNE/+f3kqam30I=",
"dev": true
},
"requires-port": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/requires-port/-/requires-port-1.0.0.tgz",
"integrity": "sha1-kl0mAdOaxIXgkc8NpcbmlNw9yv8="
},
"resolve": {
"version": "1.20.0",
"resolved": "https://registry.npmjs.org/resolve/-/resolve-1.20.0.tgz",
@ -18919,6 +19156,52 @@
"is-hex-prefixed": "1.0.0"
}
},
"superagent": {
"version": "6.1.0",
"resolved": "https://registry.npmjs.org/superagent/-/superagent-6.1.0.tgz",
"integrity": "sha512-OUDHEssirmplo3F+1HWKUrUjvnQuA+nZI6i/JJBdXb5eq9IyEQwPyPpqND+SSsxf6TygpBEkUjISVRN4/VOpeg==",
"requires": {
"component-emitter": "^1.3.0",
"cookiejar": "^2.1.2",
"debug": "^4.1.1",
"fast-safe-stringify": "^2.0.7",
"form-data": "^3.0.0",
"formidable": "^1.2.2",
"methods": "^1.1.2",
"mime": "^2.4.6",
"qs": "^6.9.4",
"readable-stream": "^3.6.0",
"semver": "^7.3.2"
},
"dependencies": {
"debug": {
"version": "4.3.4",
"resolved": "https://registry.npmjs.org/debug/-/debug-4.3.4.tgz",
"integrity": "sha512-PRWFHuSU3eDtQJPvnNY7Jcket1j0t5OuOsFzPPzsekD52Zl8qUfFIPEiswXqIvHWGVHOgX+7G/vCNNhehwxfkQ==",
"requires": {
"ms": "2.1.2"
}
},
"mime": {
"version": "2.6.0",
"resolved": "https://registry.npmjs.org/mime/-/mime-2.6.0.tgz",
"integrity": "sha512-USPkMeET31rOMiarsBNIHZKLGgvKc/LrjofAnBlOttf5ajRvqiRA8QsenbcooctK6d6Ts6aqZXBA+XbkKthiQg=="
},
"ms": {
"version": "2.1.2",
"resolved": "https://registry.npmjs.org/ms/-/ms-2.1.2.tgz",
"integrity": "sha512-sGkPx+VjMtmA6MX27oA4FBFELFCZZ4S4XqeGOXCv68tT+jb3vk/RyaKWP0PTKyWtmLSM0b+adUTEvbs1PEaH2w=="
},
"semver": {
"version": "7.3.7",
"resolved": "https://registry.npmjs.org/semver/-/semver-7.3.7.tgz",
"integrity": "sha512-QlYTucUYOews+WeEujDoEGziz4K6c47V/Bd+LjSSYcA94p+DmINdf7ncaUinThfvZyu13lN9OY1XDxt8C0Tw0g==",
"requires": {
"lru-cache": "^6.0.0"
}
}
}
},
"superstruct": {
"version": "0.14.2",
"resolved": "https://registry.npmjs.org/superstruct/-/superstruct-0.14.2.tgz",
@ -19470,6 +19753,15 @@
"punycode": "^2.1.0"
}
},
"url-parse": {
"version": "1.5.10",
"resolved": "https://registry.npmjs.org/url-parse/-/url-parse-1.5.10.tgz",
"integrity": "sha512-WypcfiRhfeUP9vvF0j6rw0J3hrWrw6iZv3+22h6iRMJ/8z1Tj6XfLP4DsUix5MhMPnXpiHDoKyoZ/bdCkwBCiQ==",
"requires": {
"querystringify": "^2.1.1",
"requires-port": "^1.0.0"
}
},
"url-parse-lax": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/url-parse-lax/-/url-parse-lax-3.0.0.tgz",
@ -20235,8 +20527,7 @@
"yallist": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/yallist/-/yallist-4.0.0.tgz",
"integrity": "sha512-3wdGidZyq5PB084XLES5TpOSRA3wjXAlIWMhum2kRcv/41Sn2emQ0dycQW4uZXLejwKvg6EsvbdlVL+FYEct7A==",
"dev": true
"integrity": "sha512-3wdGidZyq5PB084XLES5TpOSRA3wjXAlIWMhum2kRcv/41Sn2emQ0dycQW4uZXLejwKvg6EsvbdlVL+FYEct7A=="
},
"yargs": {
"version": "16.2.0",


@ -62,7 +62,7 @@
"web3": "^1.6.1"
},
"dependencies": {
"@certusone/wormhole-sdk": "^0.2.4",
"@certusone/wormhole-sdk": "^0.3.5",
"@improbable-eng/grpc-web": "^0.14.0",
"@solana/spl-token": "^0.1.8",
"@solana/web3.js": "^1.24.0",


@ -1417,7 +1417,7 @@ export async function signContribution(
web3.eth.abi.encodeParameter("uint256", amount).substring(2),
web3.eth.abi
.encodeParameter("address", buyerAddress)
.substring(2 + (64 - 40)),
.substring(2), // we actually want 32 bytes
web3.eth.abi.encodeParameter("uint256", totalContribution).substring(2),
];


@ -9,8 +9,8 @@ export async function registerChainOnEth(
contributorAddress: Uint8Array,
wallet: ethers.Wallet
): Promise<ethers.ContractReceipt> {
const contributor = Conductor__factory.connect(conductorAddress, wallet);
const tx = await contributor.registerChain(
const conductor = Conductor__factory.connect(conductorAddress, wallet);
const tx = await conductor.registerChain(
contributorChain,
contributorAddress
);


@ -1,7 +1,14 @@
import { ethers } from "ethers";
import { ChainId, uint8ArrayToHex } from "@certusone/wormhole-sdk";
import { AcceptedToken, Allocation, SaleInit, SaleSealed } from "./structs";
import {
AcceptedToken,
Allocation,
SaleInit,
SolanaSaleInit,
SolanaToken,
SaleSealed,
} from "./structs";
const VAA_PAYLOAD_NUM_ACCEPTED_TOKENS = 228;
const VAA_PAYLOAD_ACCEPTED_TOKEN_BYTES_LENGTH = 50;
@ -77,6 +84,60 @@ function parseAcceptedTokens(
return tokens;
}
const SOLANA_VAA_PAYLOAD_NUM_ACCEPTED_TOKENS = 132;
const SOLANA_VAA_PAYLOAD_ACCEPTED_TOKEN_BYTES_LENGTH = 33;
export async function parseSolanaSaleInit(
payload: Uint8Array
): Promise<SolanaSaleInit> {
const buffer = Buffer.from(payload);
const numAcceptedTokens = buffer.readUInt8(
SOLANA_VAA_PAYLOAD_NUM_ACCEPTED_TOKENS
);
const recipientIndex =
SOLANA_VAA_PAYLOAD_NUM_ACCEPTED_TOKENS +
numAcceptedTokens * SOLANA_VAA_PAYLOAD_ACCEPTED_TOKEN_BYTES_LENGTH +
1;
return {
payloadId: buffer.readUInt8(0),
saleId: ethers.BigNumber.from(payload.slice(1, 33)).toString(),
solanaTokenAccount: uint8ArrayToHex(payload.slice(33, 65)),
tokenChain: buffer.readUInt16BE(65),
tokenDecimals: buffer.readUInt8(67),
saleStart: ethers.BigNumber.from(payload.slice(68, 100)).toString(),
saleEnd: ethers.BigNumber.from(payload.slice(100, 132)).toString(),
acceptedTokens: parseSolanaAcceptedTokens(payload, numAcceptedTokens),
recipient: uint8ArrayToHex(
payload.slice(recipientIndex, recipientIndex + 32)
),
};
}
function parseSolanaAcceptedTokens(
payload: Uint8Array,
numTokens: number
): SolanaToken[] {
const buffer = Buffer.from(payload);
const tokens: SolanaToken[] = [];
for (let i = 0; i < numTokens; ++i) {
const startIndex =
SOLANA_VAA_PAYLOAD_NUM_ACCEPTED_TOKENS +
1 +
i * SOLANA_VAA_PAYLOAD_ACCEPTED_TOKEN_BYTES_LENGTH;
const token: SolanaToken = {
tokenIndex: buffer.readUInt8(startIndex),
tokenAddress: uint8ArrayToHex(
payload.slice(startIndex + 1, startIndex + 33)
),
};
tokens.push(token);
}
return tokens;
}
const VAA_PAYLOAD_NUM_ALLOCATIONS = 33;
const VAA_PAYLOAD_ALLOCATION_BYTES_LENGTH = 65;
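The fixed offsets read by `parseSolanaSaleInit` imply a payload layout of: byte 0 `payloadId`, bytes 1–32 `saleId`, 33–64 `solanaTokenAccount`, 65–66 `tokenChain`, byte 67 `tokenDecimals`, 68–99 `saleStart`, 100–131 `saleEnd`, byte 132 the accepted-token count, then that many 33-byte `(tokenIndex, tokenAddress)` entries, then a trailing 32-byte `recipient`. A short sketch verifying those offsets for a one-token payload (the written values are hypothetical test data, not real sale fields):

```javascript
// Sketch: allocate a SolanaSaleInit payload with one accepted token and
// check the offsets that parseSolanaSaleInit reads from.
const numTokens = 1;
const buf = Buffer.alloc(132 + 1 + numTokens * 33 + 32);

buf.writeUInt8(5, 0);           // payloadId (hypothetical value)
buf.writeUInt8(numTokens, 132); // numAcceptedTokens at offset 132
buf.writeUInt8(7, 133);         // first token's tokenIndex (hypothetical)

// Same formula as the parser: tokens start at 133, each entry is 33 bytes,
// and the recipient is the final 32 bytes.
const recipientIndex = 132 + numTokens * 33 + 1;
console.log(recipientIndex, buf.length - recipientIndex); // 166 32
```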


@ -78,6 +78,23 @@ export interface SaleInit {
refundRecipient: string;
}
export interface SolanaToken {
tokenIndex: number;
tokenAddress: ethers.BytesLike;
}
export interface SolanaSaleInit {
payloadId: number;
saleId: ethers.BigNumberish;
solanaTokenAccount: ethers.BytesLike;
tokenChain: number;
tokenDecimals: number;
saleStart: ethers.BigNumberish;
saleEnd: ethers.BigNumberish;
acceptedTokens: SolanaToken[];
recipient: string;
}
export interface Allocation {
tokenIndex: number;
allocation: ethers.BigNumberish;


@ -1,2 +0,0 @@
bin
**/target


@ -1,75 +0,0 @@
#syntax=docker/dockerfile:1.2@sha256:e2a8561e419ab1ba6b2fe6cbdf49fd92b95912df1cf7d313c3e2230a333fdbcc
FROM docker.io/library/rust:1.49@sha256:a50165ea96983c21832578afb1c8c028674c965bc1ed43b607871b1f362e06a5
RUN apt-get update && \
apt-get install -y \
clang \
libssl-dev \
libudev-dev \
llvm \
pkg-config \
zlib1g-dev \
&& \
rm -rf /var/lib/apt/lists/* && \
rustup component add rustfmt && \
rustup default nightly-2022-01-02
RUN sh -c "$(curl -sSfL https://release.solana.com/v1.9.4/install)"
ENV PATH="/root/.local/share/solana/install/active_release/bin:$PATH"
# Solana does a questionable download at the beginning of a *first* build-bpf call. Trigger and layer-cache it explicitly.
RUN cargo init --lib /tmp/decoy-crate && \
cd /tmp/decoy-crate && cargo build-bpf && \
rm -rf /tmp/decoy-crate
# Cache Pyth sources
# This comes soon after mainnet-v2.1
ENV PYTH_SRC_REV=31e3188bbf52ec1a25f71e4ab969378b27415b0a
ENV PYTH_DIR=/usr/src/pyth/pyth-client
WORKDIR $PYTH_DIR
ADD https://github.com/pyth-network/pyth-client/archive/$PYTH_SRC_REV.tar.gz .
# GitHub appends revision to dir in archive
RUN tar -xvf *.tar.gz && rm -rf *.tar.gz && mv pyth-client-$PYTH_SRC_REV pyth-client
# Add bridge contract sources
WORKDIR /usr/src/bridge
ADD . .
RUN mkdir -p /opt/solana/deps
ENV EMITTER_ADDRESS="11111111111111111111111111111115"
ENV BRIDGE_ADDRESS="Bridge1p5gheXUvJ6jGWGeCsgPKgnE3YgdGKRVCMY9o"
# Build Wormhole Solana programs
RUN --mount=type=cache,target=bridge/target \
--mount=type=cache,target=modules/token_bridge/target \
--mount=type=cache,target=modules/nft_bridge/target \
--mount=type=cache,target=modules/icco_contributor/target \
--mount=type=cache,target=pyth2wormhole/target \
--mount=type=cache,target=migration/target \
cargo build-bpf --manifest-path "bridge/program/Cargo.toml" -- --locked && \
cargo build-bpf --manifest-path "bridge/cpi_poster/Cargo.toml" -- --locked && \
cargo build-bpf --manifest-path "modules/token_bridge/program/Cargo.toml" -- --locked && \
cargo build-bpf --manifest-path "pyth2wormhole/program/Cargo.toml" -- --locked && \
cargo build-bpf --manifest-path "modules/nft_bridge/program/Cargo.toml" -- --locked && \
cargo build-bpf --manifest-path "modules/icco_contributor/program/Cargo.toml" -- --locked && \
cargo build-bpf --manifest-path "migration/Cargo.toml" -- --locked && \
cp bridge/target/deploy/bridge.so /opt/solana/deps/bridge.so && \
cp bridge/target/deploy/cpi_poster.so /opt/solana/deps/cpi_poster.so && \
cp migration/target/deploy/wormhole_migration.so /opt/solana/deps/wormhole_migration.so && \
cp modules/token_bridge/target/deploy/token_bridge.so /opt/solana/deps/token_bridge.so && \
cp modules/nft_bridge/target/deploy/nft_bridge.so /opt/solana/deps/nft_bridge.so && \
cp modules/icco_contributor/target/deploy/icco_contributor.so /opt/solana/deps/icco_contributor.so && \
cp modules/token_bridge/token-metadata/spl_token_metadata.so /opt/solana/deps/spl_token_metadata.so && \
cp pyth2wormhole/target/deploy/pyth2wormhole.so /opt/solana/deps/pyth2wormhole.so
# Build the Pyth Solana program
WORKDIR $PYTH_DIR/pyth-client/program
RUN make SOLANA=~/.local/share/solana/install/active_release/bin OUT_DIR=../target && \
cp ../target/oracle.so /opt/solana/deps/pyth_oracle.so
ENV RUST_LOG="solana_runtime::system_instruction_processor=trace,solana_runtime::message_processor=trace,solana_bpf_loader=debug,solana_rbpf=debug"
ENV RUST_BACKTRACE=1

View File

@ -1,106 +0,0 @@
# syntax=docker.io/docker/dockerfile:1.3@sha256:42399d4635eddd7a9b8a24be879d2f9a930d0ed040a61324cfdf59ef1357b3b2
FROM docker.io/library/rust:1.49@sha256:a50165ea96983c21832578afb1c8c028674c965bc1ed43b607871b1f362e06a5 AS build
RUN apt-get update && apt-get install -y libssl-dev libudev-dev pkg-config zlib1g-dev llvm clang
RUN rustup component add rustfmt
RUN rustup default nightly-2022-01-02
WORKDIR /usr/src/bridge
RUN cargo install wasm-pack --vers 0.9.1
ENV RUST_LOG="solana_runtime::system_instruction_processor=trace,solana_runtime::message_processor=trace,solana_bpf_loader=debug,solana_rbpf=debug"
ENV EMITTER_ADDRESS="11111111111111111111111111111115"
ENV BRIDGE_ADDRESS="Bridge1p5gheXUvJ6jGWGeCsgPKgnE3YgdGKRVCMY9o"
COPY bridge bridge
COPY modules modules
COPY solitaire solitaire
COPY migration migration
COPY pyth2wormhole pyth2wormhole
# wasm-bindgen 0.2.74 generates JavaScript bindings for SystemInstruction exported from solana-program 1.9.4.
# The generated JavaScript references a non-existent function (wasm.__wbg_systeminstruction_free) that leads
# to an attempted import error when importing the wasm packed for bundler. SystemInstruction isn't used in the sdk,
# so we remove the non-existent function reference as a workaround.
ARG SED_REMOVE_INVALID_REFERENCE="/^\s*wasm.__wbg_systeminstruction_free(ptr);$/d"
# TODO: it appears that wasm-pack ignores our lockfiles even with --locked
# Compile Wormhole
RUN --mount=type=cache,target=/root/.cache \
--mount=type=cache,target=bridge/target \
cd bridge/program && /usr/local/cargo/bin/wasm-pack build --target bundler -d bundler -- --features wasm --locked && \
cd bundler && sed -i $SED_REMOVE_INVALID_REFERENCE bridge_bg.js
RUN --mount=type=cache,target=/root/.cache \
--mount=type=cache,target=bridge/target \
cd bridge/program && /usr/local/cargo/bin/wasm-pack build --target nodejs -d nodejs -- --features wasm --locked
# Compile Token Bridge
RUN --mount=type=cache,target=/root/.cache \
--mount=type=cache,target=modules/token_bridge/target \
cd modules/token_bridge/program && /usr/local/cargo/bin/wasm-pack build --target bundler -d bundler -- --features wasm --locked && \
cd bundler && sed -i $SED_REMOVE_INVALID_REFERENCE token_bridge_bg.js
RUN --mount=type=cache,target=/root/.cache \
--mount=type=cache,target=modules/token_bridge/target \
cd modules/token_bridge/program && /usr/local/cargo/bin/wasm-pack build --target nodejs -d nodejs -- --features wasm --locked
# Compile Migration
RUN --mount=type=cache,target=/root/.cache \
--mount=type=cache,target=migration/target \
cd migration && /usr/local/cargo/bin/wasm-pack build --target bundler -d bundler -- --features wasm --locked && \
cd bundler && sed -i $SED_REMOVE_INVALID_REFERENCE wormhole_migration_bg.js
RUN --mount=type=cache,target=/root/.cache \
--mount=type=cache,target=migration/target \
cd migration && /usr/local/cargo/bin/wasm-pack build --target nodejs -d nodejs -- --features wasm --locked
# Compile NFT Bridge
RUN --mount=type=cache,target=/root/.cache \
--mount=type=cache,target=modules/nft_bridge/target \
cd modules/nft_bridge/program && /usr/local/cargo/bin/wasm-pack build --target bundler -d bundler -- --features wasm --locked && \
cd bundler && sed -i $SED_REMOVE_INVALID_REFERENCE nft_bridge_bg.js
RUN --mount=type=cache,target=/root/.cache \
--mount=type=cache,target=modules/nft_bridge/target \
cd modules/nft_bridge/program && /usr/local/cargo/bin/wasm-pack build --target nodejs -d nodejs -- --features wasm --locked
# Compile icco_contributor
RUN --mount=type=cache,target=/root/.cache \
--mount=type=cache,target=modules/icco_contributor/target \
cd modules/icco_contributor/program && /usr/local/cargo/bin/wasm-pack build --target bundler -d bundler -- --features wasm --locked && \
cd bundler && sed -i $SED_REMOVE_INVALID_REFERENCE icco_contributor_bg.js
RUN --mount=type=cache,target=/root/.cache \
--mount=type=cache,target=modules/icco_contributor/target \
cd modules/icco_contributor/program && /usr/local/cargo/bin/wasm-pack build --target nodejs -d nodejs -- --features wasm --locked
# Compile pyth2wormhole
RUN --mount=type=cache,target=/root/.cache \
--mount=type=cache,target=pyth2wormhole/target \
cd pyth2wormhole/program \
&& /usr/local/cargo/bin/wasm-pack build --target bundler -d bundler -- --features wasm --locked
RUN --mount=type=cache,target=/root/.cache \
--mount=type=cache,target=pyth2wormhole/target \
cd pyth2wormhole/program \
&& /usr/local/cargo/bin/wasm-pack build --target nodejs -d nodejs -- --features wasm --locked
FROM scratch AS export
COPY --from=build /usr/src/bridge/bridge/program/bundler sdk/js/src/solana/core
COPY --from=build /usr/src/bridge/modules/token_bridge/program/bundler sdk/js/src/solana/token
COPY --from=build /usr/src/bridge/migration/bundler sdk/js/src/solana/migration
COPY --from=build /usr/src/bridge/modules/nft_bridge/program/bundler sdk/js/src/solana/nft
COPY --from=build /usr/src/bridge/modules/icco_contributor/program/bundler sdk/js/src/solana/icco_contributor
COPY --from=build /usr/src/bridge/pyth2wormhole/program/bundler third_party/pyth/p2w-sdk/src/solana/p2w-core
COPY --from=build /usr/src/bridge/bridge/program/bundler third_party/pyth/p2w-sdk/src/solana/wormhole-core
COPY --from=build /usr/src/bridge/bridge/program/nodejs sdk/js/src/solana/core-node
COPY --from=build /usr/src/bridge/modules/token_bridge/program/nodejs sdk/js/src/solana/token-node
COPY --from=build /usr/src/bridge/migration/nodejs sdk/js/src/solana/migration-node
COPY --from=build /usr/src/bridge/modules/nft_bridge/program/nodejs sdk/js/src/solana/nft-node
COPY --from=build /usr/src/bridge/modules/icco_contributor/program/nodejs sdk/js/src/solana/icco_contributor-node

View File

@ -1,112 +0,0 @@
#!/usr/bin/env bash
# This script configures the devnet for test transfers with hardcoded addresses.
set -x
# Configure CLI (works the same as upstream Solana CLI)
mkdir -p ~/.config/solana/cli
cat <<EOF > ~/.config/solana/cli/config.yml
json_rpc_url: "http://127.0.0.1:8899"
websocket_url: ""
keypair_path: /usr/src/solana/keys/solana-devnet.json
EOF
# Static key for the mint so it always has the same address
cat <<EOF > token.json
[179,228,102,38,68,102,75,133,127,56,63,167,143,42,59,29,220,215,100,149,220,241,176,204,154,241,168,147,195,139,55,100,22,88,9,115,146,64,160,172,3,185,132,64,254,137,133,84,142,58,166,131,205,13,77,157,245,181,101,150,105,250,163,1]
EOF
# Static key for the NFT mint so it always has the same address
cat <<EOF > nft.json
[155,117,110,235,96,214,56,128,109,79,49,209,212,13,134,5,43,123,213,68,21,156,128,100,95,8,43,51,188,230,21,197,156,0,108,72,200,203,243,56,73,203,7,163,249,54,21,156,197,35,249,89,28,177,153,154,189,69,137,14,197,254,233,183]
EOF
# Static key for the 2nd NFT mint so it always has the same address
cat <<EOF > nft2.json
[40,74,92,250,81,56,202,67,129,124,193,219,24,161,198,98,191,214,136,7,112,26,72,17,33,249,24,225,183,237,27,216,11,179,26,170,82,220,3,253,152,185,151,186,12,21,138,161,175,46,180,3,167,165,70,51,128,45,237,143,146,49,34,180]
EOF
# Constants
cli_address=6sbzC1eH4FTujJXWj51eQe25cYvr4xfXbJ1vAj7j2k5J
bridge_address=Bridge1p5gheXUvJ6jGWGeCsgPKgnE3YgdGKRVCMY9o
nft_bridge_address=NFTWqJR8YnRVqPDvTJrYuLrQDitTG5AScqbeghi4zSA
token_bridge_address=B6RHG3mfcckmrYN1UhmJzyS1XX3fZKbkeUcpJe9Sy3FE
icco_contributor_address=22mamxmojFWBdbGqaxTH46HBAgAY2bJRiGJJHfNRNQ95
icco_conductor_address=1111111111112LDK1X5tGxG5Gcy3PvB4TTBKnJGp
initial_guardian=befa429d57cd18b7f8a4d91a2da9ab4af05d0fbe
recipient_address=90F8bf6A479f320ead074411a4B0e7944Ea8c9C1
chain_id_ethereum=2
retry () {
while ! $@; do
sleep 1
done
}
# Fund our account (as defined in solana/keys/solana-devnet.json).
retry solana airdrop 1000
# Create a new SPL token
token=$(spl-token create-token -- token.json | grep 'Creating token' | awk '{ print $3 }')
echo "Created token $token"
# Create token account
account=$(spl-token create-account "$token" | grep 'Creating account' | awk '{ print $3 }')
echo "Created token account $account"
# Mint new tokens owned by our CLI account
spl-token mint "$token" 10000000000 "$account"
# Create meta for token
token-bridge-client create-meta "$token" "Solana Test Token" "SOLT" ""
# Create a new SPL NFT
nft=$(spl-token create-token --decimals 0 -- nft.json | grep 'Creating token' | awk '{ print $3 }')
echo "Created NFT $nft"
# Create NFT account
nft_account=$(spl-token create-account "$nft" | grep 'Creating account' | awk '{ print $3 }')
echo "Created NFT account $nft_account"
# Mint new NFT owned by our CLI account
spl-token mint "$nft" 1 "$nft_account"
# Create meta for NFT
token-bridge-client create-meta "$nft" "Not a PUNK🎸" "PUNK🎸" "https://wrappedpunks.com:3000/api/punks/metadata/39"
nft=$(spl-token create-token --decimals 0 -- nft2.json | grep 'Creating token' | awk '{ print $3 }')
echo "Created NFT $nft"
nft_account=$(spl-token create-account "$nft" | grep 'Creating account' | awk '{ print $3 }')
echo "Created NFT account $nft_account"
spl-token mint "$nft" 1 "$nft_account"
token-bridge-client create-meta "$nft" "Not a PUNK 2🎸" "PUNK2🎸" "https://wrappedpunks.com:3000/api/punks/metadata/51"
# Create the bridge contract at a known address
# OK to fail on subsequent attempts (already created).
retry client create-bridge "$bridge_address" "$initial_guardian" 86400 100
# Initialize the token bridge
retry token-bridge-client create-bridge "$token_bridge_address" "$bridge_address"
# Initialize the NFT bridge
retry token-bridge-client create-bridge "$nft_bridge_address" "$bridge_address"
# Initialize the ICCO contributor. This stores the bridge address in the contributor config.
retry icco-contributor-client create-bridge "$icco_contributor_address" "$bridge_address" "$icco_contributor_address"
pushd /usr/src/clients/token_bridge
# Register the Token Bridge endpoints for the peer chains (chain IDs 2, 3, and 4)
node main.js solana execute_governance_vaa $(node main.js generate_register_chain_vaa 2 0x0000000000000000000000000290FB167208Af455bB137780163b7B7a9a10C16)
node main.js solana execute_governance_vaa $(node main.js generate_register_chain_vaa 3 0x000000000000000000000000784999135aaa8a3ca5914468852fdddbddd8789d)
node main.js solana execute_governance_vaa $(node main.js generate_register_chain_vaa 4 0x0000000000000000000000000290FB167208Af455bB137780163b7B7a9a10C16)
popd
pushd /usr/src/clients/nft_bridge
# Register the NFT Bridge endpoints for the peer chains (chain IDs 2 and 3)
node main.js solana execute_governance_vaa $(node main.js generate_register_chain_vaa 2 0x00000000000000000000000026b4afb60d6c903165150c6f0aa14f8016be4aec)
node main.js solana execute_governance_vaa $(node main.js generate_register_chain_vaa 3 0x000000000000000000000000288246bebae560e006d01c675ae332ac8e146bb7)
popd
# Let k8s startup probe succeed
nc -k -l -p 2000

View File

@ -1,24 +0,0 @@
#!/usr/bin/env bash
# This script deploys the built ICCO contributor Solana contract to the tilt devnet.
# Program Id: 5yrpFgtmiBkRmDgveVErMWuxC25eK5QE5ouZgfi46aqM
# Echo executed commands:
set -x
cd modules/icco_contributor
EMITTER_ADDRESS="11111111111111111111111111111115" BRIDGE_ADDRESS="Bridge1p5gheXUvJ6jGWGeCsgPKgnE3YgdGKRVCMY9o" cargo build-bpf
cd -
cd modules/icco_contributor/program
EMITTER_ADDRESS="11111111111111111111111111111115" BRIDGE_ADDRESS="Bridge1p5gheXUvJ6jGWGeCsgPKgnE3YgdGKRVCMY9o" wasm-pack build --target nodejs -d node -- --features wasm
cp node/* ../../../../sdk/js/src/solana/icco_contributor-node/
cd -
minikube kubectl -- cp -c devnet keys/solana-devnet.json solana-devnet-0:/root/.config/solana/id.json
minikube kubectl -- cp -c devnet modules/icco_contributor/contributor_id.json solana-devnet-0:/usr/src/
minikube kubectl -- cp -c devnet modules/icco_contributor/target/deploy/icco_contributor.so solana-devnet-0:/usr/src/
minikube kubectl -- exec -c devnet solana-devnet-0 -- solana program deploy -u l --program-id=/usr/src/contributor_id.json /usr/src/icco_contributor.so
# Register the conductor as an emitter on the Solana contributor, or the tests will fail.
# This succeeds only the first time after tilt up; afterwards it fails because the config PDA account was already initialized.
./modules/icco_contributor/target/debug/client create-bridge 5yrpFgtmiBkRmDgveVErMWuxC25eK5QE5ouZgfi46aqM B6RHG3mfcckmrYN1UhmJzyS1XX3fZKbkeUcpJe9Sy3FE B6RHG3mfcckmrYN1UhmJzyS1XX3fZKbkeUcpJe9Sy3FE

View File

@ -1 +0,0 @@
[39,20,181,104,82,27,70,145,227,136,168,14,170,24,33,88,145,152,180,229,219,142,247,114,237,79,52,97,84,65,213,172,49,165,99,116,254,135,110,132,214,114,59,200,109,253,45,43,74,172,107,84,162,223,23,15,78,167,240,137,234,123,4,231]

View File

@ -1 +0,0 @@
[151,156,152,229,131,186,5,254,107,42,234,87,191,209,182,237,170,57,174,150,37,14,5,58,100,237,114,141,46,22,155,104,10,20,225,112,227,95,250,0,102,170,119,34,187,74,144,163,181,123,233,253,191,6,2,70,127,227,138,51,98,209,205,172]

View File

@ -1 +0,0 @@
[62,189,176,181,215,49,125,17,130,43,109,83,115,112,151,110,117,239,235,54,205,209,6,255,76,27,210,115,206,166,217,165,250,48,211,191,77,246,195,18,170,246,162,103,141,129,14,143,127,4,243,114,79,112,11,46,90,174,215,2,63,42,134,56]

View File

@ -1 +0,0 @@
[14,173,153,4,176,224,201,111,32,237,183,185,159,247,22,161,89,84,215,209,212,137,10,92,157,49,29,192,101,164,152,70,87,65,8,174,214,157,175,126,98,90,54,24,100,177,247,77,19,112,47,44,165,109,233,102,14,86,109,29,134,145,132,141]

File diff suppressed because it is too large

View File

@ -1,41 +0,0 @@
[package]
name = "wormhole-migration"
version = "0.1.0"
description = "Created with Rocksalt"
edition = "2018"
[lib]
crate-type = ["cdylib", "lib"]
name = "wormhole_migration"
[features]
no-entrypoint = ["solitaire/no-entrypoint", "rand"]
trace = ["solitaire/trace"]
wasm = ["no-entrypoint", "wasm-bindgen"]
client = ["solitaire-client", "solitaire/client", "no-entrypoint"]
cpi = ["no-entrypoint"]
default = []
[dependencies]
borsh = "=0.9.1"
byteorder = "1.4.3"
rocksalt = { path = "../solitaire/rocksalt" }
solitaire = { path = "../solitaire/program" }
sha3 = "0.9.1"
solana-program = "*"
spl-token = { version = "=3.2.0", features = ["no-entrypoint"] }
solitaire-client = { path = "../solitaire/client", optional = true }
wasm-bindgen = { version = "0.2.74", features = ["serde-serialize"], optional = true }
serde = { version = "1.0", features = ["derive"] }
rand = { version = "0.7.3", optional = true }
[dev-dependencies]
hex = "*"
hex-literal = "0.3.1"
libsecp256k1 = { version = "0.3.5", features = [] }
solana-client = "=1.9.4"
solana-sdk = "=1.9.4"
spl-token = { version = "=3.2.0", features = ["no-entrypoint"] }
[patch.crates-io]
memmap2 = { path = "../bridge/memmap2-rs" }

View File

@ -1,11 +0,0 @@
# Merge similar crates together to avoid multiple use statements.
imports_granularity = "Crate"
# Consistency in formatting makes tool based searching/editing better.
empty_item_single_line = false
# Easier editing when arbitrary mixed use statements do not collapse.
imports_layout = "Vertical"
# Default rustfmt formatting of match arms with branches is awful.
match_arm_leading_pipes = "Preserve"

View File

@ -1,84 +0,0 @@
use crate::types::{
PoolData,
SplAccount,
SplMint,
};
use solana_program::pubkey::Pubkey;
use solitaire::{
processors::seeded::Seeded,
AccountState,
Data,
Derive,
Info,
};
pub type ShareMint<'a, const STATE: AccountState> = Data<'a, SplMint, { STATE }>;
pub struct ShareMintDerivationData {
pub pool: Pubkey,
}
impl<'b, const STATE: AccountState> Seeded<&ShareMintDerivationData> for ShareMint<'b, { STATE }> {
fn seeds(accs: &ShareMintDerivationData) -> Vec<Vec<u8>> {
vec![
String::from("share_mint").as_bytes().to_vec(),
accs.pool.to_bytes().to_vec(),
]
}
}
pub type FromCustodyTokenAccount<'a, const STATE: AccountState> = Data<'a, SplAccount, { STATE }>;
pub struct FromCustodyTokenAccountDerivationData {
pub pool: Pubkey,
}
impl<'b, const STATE: AccountState> Seeded<&FromCustodyTokenAccountDerivationData>
for FromCustodyTokenAccount<'b, { STATE }>
{
fn seeds(accs: &FromCustodyTokenAccountDerivationData) -> Vec<Vec<u8>> {
vec![
String::from("from_custody").as_bytes().to_vec(),
accs.pool.to_bytes().to_vec(),
]
}
}
pub type ToCustodyTokenAccount<'a, const STATE: AccountState> = Data<'a, SplAccount, { STATE }>;
pub struct ToCustodyTokenAccountDerivationData {
pub pool: Pubkey,
}
impl<'b, const STATE: AccountState> Seeded<&ToCustodyTokenAccountDerivationData>
for ToCustodyTokenAccount<'b, { STATE }>
{
fn seeds(accs: &ToCustodyTokenAccountDerivationData) -> Vec<Vec<u8>> {
vec![
String::from("to_custody").as_bytes().to_vec(),
accs.pool.to_bytes().to_vec(),
]
}
}
pub type MigrationPool<'a, const STATE: AccountState> = Data<'a, PoolData, { STATE }>;
pub struct MigrationPoolDerivationData {
pub from: Pubkey,
pub to: Pubkey,
}
impl<'b, const STATE: AccountState> Seeded<&MigrationPoolDerivationData>
for MigrationPool<'b, { STATE }>
{
fn seeds(accs: &MigrationPoolDerivationData) -> Vec<Vec<u8>> {
vec![
String::from("pool").as_bytes().to_vec(),
accs.from.to_bytes().to_vec(),
accs.to.to_bytes().to_vec(),
]
}
}
pub type CustodySigner<'a> = Derive<Info<'a>, "custody_signer">;
pub type AuthoritySigner<'a> = Derive<Info<'a>, "authority_signer">;

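The `Seeded` implementations above all follow the same pattern: a PDA is derived from a string literal plus the relevant pubkeys. A minimal standalone sketch of the `MigrationPool` seed layout (a hypothetical helper, independent of solitaire, shown only to illustrate the seed ordering):

```rust
// Hypothetical standalone sketch of the MigrationPool seed layout above:
// the literal "pool" followed by the raw 32-byte keys of both mints.
fn pool_seeds(from: &[u8; 32], to: &[u8; 32]) -> Vec<Vec<u8>> {
    vec![b"pool".to_vec(), from.to_vec(), to.to_vec()]
}

fn main() {
    let from = [1u8; 32];
    let to = [2u8; 32];
    let seeds = pool_seeds(&from, &to);
    // Three seeds, starting with the "pool" prefix.
    assert_eq!(seeds.len(), 3);
    assert_eq!(seeds[0], b"pool".to_vec());
    println!("ok");
}
```

The same shape holds for `share_mint`, `from_custody`, and `to_custody`, each with its own string prefix.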
View File

@ -1,5 +0,0 @@
pub mod add_liquidity;
pub mod claim_shares;
pub mod create_pool;
pub mod migrate_tokens;
pub mod remove_liquidity;

View File

@ -1,119 +0,0 @@
use crate::{
accounts::{
AuthoritySigner,
CustodySigner,
MigrationPool,
ShareMint,
ShareMintDerivationData,
ToCustodyTokenAccount,
ToCustodyTokenAccountDerivationData,
},
types::{
SplAccount,
SplMint,
},
MigrationError::WrongMint,
};
use borsh::{
BorshDeserialize,
BorshSerialize,
};
use crate::accounts::MigrationPoolDerivationData;
use solitaire::*;
#[derive(FromAccounts)]
pub struct AddLiquidity<'b> {
pub pool: Mut<MigrationPool<'b, { AccountState::Initialized }>>,
pub from_mint: Data<'b, SplMint, { AccountState::Initialized }>,
pub to_mint: Data<'b, SplMint, { AccountState::Initialized }>,
pub to_token_custody: Mut<ToCustodyTokenAccount<'b, { AccountState::Initialized }>>,
pub share_mint: Mut<ShareMint<'b, { AccountState::Initialized }>>,
pub to_lp_acc: Mut<Data<'b, SplAccount, { AccountState::Initialized }>>,
pub lp_share_acc: Mut<Data<'b, SplAccount, { AccountState::Initialized }>>,
pub custody_signer: CustodySigner<'b>,
pub authority_signer: AuthoritySigner<'b>,
}
#[derive(BorshDeserialize, BorshSerialize, Default)]
pub struct AddLiquidityData {
pub amount: u64,
}
pub fn add_liquidity(
ctx: &ExecutionContext,
accs: &mut AddLiquidity,
data: AddLiquidityData,
) -> Result<()> {
if *accs.from_mint.info().key != accs.pool.from {
return Err(WrongMint.into());
}
if *accs.to_mint.info().key != accs.pool.to {
return Err(WrongMint.into());
}
if accs.lp_share_acc.mint != *accs.share_mint.info().key {
return Err(WrongMint.into());
}
accs.to_token_custody.verify_derivation(
ctx.program_id,
&ToCustodyTokenAccountDerivationData {
pool: *accs.pool.info().key,
},
)?;
accs.share_mint.verify_derivation(
ctx.program_id,
&ShareMintDerivationData {
pool: *accs.pool.info().key,
},
)?;
accs.pool.verify_derivation(
ctx.program_id,
&MigrationPoolDerivationData {
from: accs.pool.from,
to: accs.pool.to,
},
)?;
let to_tokens_in = if accs.from_mint.decimals > accs.to_mint.decimals {
data.amount
} else {
data.amount
- (data.amount % 10u64.pow((accs.to_mint.decimals - accs.from_mint.decimals) as u32))
};
// Transfer out-tokens in
let transfer_ix = spl_token::instruction::transfer(
&spl_token::id(),
accs.to_lp_acc.info().key,
accs.to_token_custody.info().key,
accs.authority_signer.key,
&[],
to_tokens_in,
)?;
invoke_seeded(&transfer_ix, ctx, &accs.authority_signer, None)?;
// The share amount should be equal to the amount of from tokens an lp would be getting
let share_amount = if accs.from_mint.decimals > accs.to_mint.decimals {
data.amount
.checked_mul(10u64.pow((accs.from_mint.decimals - accs.to_mint.decimals) as u32))
.unwrap()
} else {
data.amount
.checked_div(10u64.pow((accs.to_mint.decimals - accs.from_mint.decimals) as u32))
.unwrap()
};
// Mint LP shares
let mint_ix = spl_token::instruction::mint_to(
&spl_token::id(),
accs.share_mint.info().key,
accs.lp_share_acc.info().key,
accs.custody_signer.key,
&[],
share_amount,
)?;
invoke_seeded(&mint_ix, ctx, &accs.custody_signer, None)?;
Ok(())
}

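The `to_tokens_in` rounding above drops any remainder that would not divide evenly by the decimal difference, so the later share computation loses nothing. A minimal standalone sketch of that branch (a hypothetical helper, not part of the program):

```rust
// Hypothetical sketch of the to_tokens_in rounding in add_liquidity:
// when the to-mint has MORE decimals than the from-mint, drop the
// remainder so the amount divides evenly by 10^(to - from).
fn truncate_deposit(amount: u64, from_decimals: u8, to_decimals: u8) -> u64 {
    if from_decimals > to_decimals {
        amount
    } else {
        amount - (amount % 10u64.pow((to_decimals - from_decimals) as u32))
    }
}

fn main() {
    // 6 -> 9 decimals: round down to a multiple of 1000.
    assert_eq!(truncate_deposit(1_234_567, 6, 9), 1_234_000);
    // 9 -> 6 decimals: no truncation on this side.
    assert_eq!(truncate_deposit(500, 9, 6), 500);
    println!("ok");
}
```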
View File

@ -1,96 +0,0 @@
use crate::{
accounts::{
AuthoritySigner,
CustodySigner,
FromCustodyTokenAccountDerivationData,
MigrationPool,
ShareMint,
ShareMintDerivationData,
ToCustodyTokenAccount,
},
types::SplAccount,
MigrationError::WrongMint,
};
use borsh::{
BorshDeserialize,
BorshSerialize,
};
use crate::accounts::MigrationPoolDerivationData;
use solitaire::{
processors::seeded::{
invoke_seeded,
Seeded,
},
*,
};
#[derive(FromAccounts)]
pub struct ClaimShares<'b> {
pub pool: Mut<MigrationPool<'b, { AccountState::Initialized }>>,
pub from_token_custody: Mut<ToCustodyTokenAccount<'b, { AccountState::Initialized }>>,
pub share_mint: Mut<ShareMint<'b, { AccountState::Initialized }>>,
pub from_lp_acc: Mut<Data<'b, SplAccount, { AccountState::Initialized }>>,
pub lp_share_acc: Mut<Data<'b, SplAccount, { AccountState::Initialized }>>,
pub custody_signer: CustodySigner<'b>,
pub authority_signer: AuthoritySigner<'b>,
}
#[derive(BorshDeserialize, BorshSerialize, Default)]
pub struct ClaimSharesData {
pub amount: u64,
}
pub fn claim_shares(
ctx: &ExecutionContext,
accs: &mut ClaimShares,
data: ClaimSharesData,
) -> Result<()> {
if accs.lp_share_acc.mint != *accs.share_mint.info().key {
return Err(WrongMint.into());
}
accs.from_token_custody.verify_derivation(
ctx.program_id,
&FromCustodyTokenAccountDerivationData {
pool: *accs.pool.info().key,
},
)?;
accs.share_mint.verify_derivation(
ctx.program_id,
&ShareMintDerivationData {
pool: *accs.pool.info().key,
},
)?;
accs.pool.verify_derivation(
ctx.program_id,
&MigrationPoolDerivationData {
from: accs.pool.from,
to: accs.pool.to,
},
)?;
// Transfer claimed tokens to LP
let transfer_ix = spl_token::instruction::transfer(
&spl_token::id(),
accs.from_token_custody.info().key,
accs.from_lp_acc.info().key,
accs.custody_signer.key,
&[],
data.amount,
)?;
invoke_seeded(&transfer_ix, ctx, &accs.custody_signer, None)?;
// Burn LP shares
let mint_ix = spl_token::instruction::burn(
&spl_token::id(),
accs.lp_share_acc.info().key,
accs.share_mint.info().key,
accs.authority_signer.key,
&[],
data.amount,
)?;
invoke_seeded(&mint_ix, ctx, &accs.authority_signer, None)?;
Ok(())
}

View File

@ -1,118 +0,0 @@
use crate::{
accounts::{
CustodySigner,
FromCustodyTokenAccount,
FromCustodyTokenAccountDerivationData,
MigrationPool,
MigrationPoolDerivationData,
ShareMint,
ShareMintDerivationData,
ToCustodyTokenAccount,
ToCustodyTokenAccountDerivationData,
},
types::SplMint,
};
use borsh::{
BorshDeserialize,
BorshSerialize,
};
use solana_program::program::invoke_signed;
use solitaire::{
CreationLamports::Exempt,
*,
};
#[derive(FromAccounts)]
pub struct CreatePool<'b> {
pub payer: Mut<Signer<Info<'b>>>,
pub pool: Mut<MigrationPool<'b, { AccountState::Uninitialized }>>,
pub from_mint: Data<'b, SplMint, { AccountState::Initialized }>,
pub to_mint: Data<'b, SplMint, { AccountState::Initialized }>,
pub from_token_custody: Mut<FromCustodyTokenAccount<'b, { AccountState::Uninitialized }>>,
pub to_token_custody: Mut<ToCustodyTokenAccount<'b, { AccountState::Uninitialized }>>,
pub pool_mint: Mut<ShareMint<'b, { AccountState::Uninitialized }>>,
pub custody_signer: CustodySigner<'b>,
}
#[derive(BorshDeserialize, BorshSerialize, Default)]
pub struct CreatePoolData {}
pub fn create_pool(
ctx: &ExecutionContext,
accs: &mut CreatePool,
_data: CreatePoolData,
) -> Result<()> {
// Create from custody account
accs.from_token_custody.create(
&FromCustodyTokenAccountDerivationData {
pool: *accs.pool.info().key,
},
ctx,
accs.payer.key,
Exempt,
)?;
let init_ix = spl_token::instruction::initialize_account(
&spl_token::id(),
accs.from_token_custody.info().key,
accs.from_mint.info().key,
accs.custody_signer.info().key,
)?;
invoke_signed(&init_ix, ctx.accounts, &[])?;
// Create to custody account
accs.to_token_custody.create(
&ToCustodyTokenAccountDerivationData {
pool: *accs.pool.info().key,
},
ctx,
accs.payer.key,
Exempt,
)?;
let init_ix = spl_token::instruction::initialize_account(
&spl_token::id(),
accs.to_token_custody.info().key,
accs.to_mint.info().key,
accs.custody_signer.info().key,
)?;
invoke_signed(&init_ix, ctx.accounts, &[])?;
// Create to pool mint
accs.pool_mint.create(
&ShareMintDerivationData {
pool: *accs.pool.info().key,
},
ctx,
accs.payer.key,
Exempt,
)?;
let init_ix = spl_token::instruction::initialize_mint(
&spl_token::id(),
accs.pool_mint.info().key,
accs.custody_signer.info().key,
None,
accs.from_mint.decimals,
)?;
invoke_signed(&init_ix, ctx.accounts, &[])?;
// Set fields on pool
accs.pool.from = *accs.from_mint.info().key;
accs.pool.to = *accs.to_mint.info().key;
// Create pool
accs.pool.create(
&MigrationPoolDerivationData {
from: *accs.from_mint.info().key,
to: *accs.to_mint.info().key,
},
ctx,
accs.payer.key,
Exempt,
)?;
Ok(())
}

View File

@ -1,121 +0,0 @@
use crate::{
accounts::{
AuthoritySigner,
CustodySigner,
FromCustodyTokenAccount,
FromCustodyTokenAccountDerivationData,
MigrationPool,
ToCustodyTokenAccount,
ToCustodyTokenAccountDerivationData,
},
types::{
SplAccount,
SplMint,
},
MigrationError::WrongMint,
};
use borsh::{
BorshDeserialize,
BorshSerialize,
};
use crate::accounts::MigrationPoolDerivationData;
use solitaire::{
processors::seeded::{
invoke_seeded,
Seeded,
},
*,
};
#[derive(FromAccounts)]
pub struct MigrateTokens<'b> {
pub pool: Mut<MigrationPool<'b, { AccountState::Initialized }>>,
pub from_mint: Data<'b, SplMint, { AccountState::Initialized }>,
pub to_mint: Data<'b, SplMint, { AccountState::Initialized }>,
pub to_token_custody: Mut<ToCustodyTokenAccount<'b, { AccountState::Initialized }>>,
pub from_token_custody: Mut<FromCustodyTokenAccount<'b, { AccountState::Initialized }>>,
pub user_from_acc: Mut<Data<'b, SplAccount, { AccountState::Initialized }>>,
pub user_to_acc: Mut<Data<'b, SplAccount, { AccountState::Initialized }>>,
pub custody_signer: CustodySigner<'b>,
pub authority_signer: AuthoritySigner<'b>,
}
#[derive(BorshDeserialize, BorshSerialize, Default)]
pub struct MigrateTokensData {
pub amount: u64,
}
pub fn migrate_tokens(
ctx: &ExecutionContext,
accs: &mut MigrateTokens,
data: MigrateTokensData,
) -> Result<()> {
if *accs.from_mint.info().key != accs.pool.from {
return Err(WrongMint.into());
}
if *accs.to_mint.info().key != accs.pool.to {
return Err(WrongMint.into());
}
if accs.user_from_acc.mint != accs.pool.from {
return Err(WrongMint.into());
}
if accs.user_to_acc.mint != accs.pool.to {
return Err(WrongMint.into());
}
accs.to_token_custody.verify_derivation(
ctx.program_id,
&ToCustodyTokenAccountDerivationData {
pool: *accs.pool.info().key,
},
)?;
accs.from_token_custody.verify_derivation(
ctx.program_id,
&FromCustodyTokenAccountDerivationData {
pool: *accs.pool.info().key,
},
)?;
accs.pool.verify_derivation(
ctx.program_id,
&MigrationPoolDerivationData {
from: accs.pool.from,
to: accs.pool.to,
},
)?;
// Transfer in-tokens in
let transfer_ix = spl_token::instruction::transfer(
&spl_token::id(),
accs.user_from_acc.info().key,
accs.from_token_custody.info().key,
accs.authority_signer.key,
&[],
data.amount,
)?;
invoke_seeded(&transfer_ix, ctx, &accs.authority_signer, None)?;
// The out amount needs to be decimal adjusted
let out_amount = if accs.from_mint.decimals > accs.to_mint.decimals {
data.amount
.checked_div(10u64.pow((accs.from_mint.decimals - accs.to_mint.decimals) as u32))
.unwrap()
} else {
data.amount
.checked_mul(10u64.pow((accs.to_mint.decimals - accs.from_mint.decimals) as u32))
.unwrap()
};
// Transfer out-tokens to user
let transfer_ix = spl_token::instruction::transfer(
&spl_token::id(),
accs.to_token_custody.info().key,
accs.user_to_acc.info().key,
accs.custody_signer.key,
&[],
out_amount,
)?;
invoke_seeded(&transfer_ix, ctx, &accs.custody_signer, None)?;
Ok(())
}

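The out-amount computation above rescales by the difference in mint decimals. A standalone sketch matching the branch structure in `migrate_tokens` (a hypothetical helper, shown for illustration):

```rust
// Hypothetical sketch of migrate_tokens' decimal adjustment: divide when
// moving to a mint with fewer decimals, multiply when moving to more.
fn adjust_out_amount(amount: u64, from_decimals: u8, to_decimals: u8) -> u64 {
    if from_decimals > to_decimals {
        amount
            .checked_div(10u64.pow((from_decimals - to_decimals) as u32))
            .unwrap()
    } else {
        amount
            .checked_mul(10u64.pow((to_decimals - from_decimals) as u32))
            .unwrap()
    }
}

fn main() {
    // 9 -> 6 decimals: 1_234_567 base units become 1_234 (integer division).
    assert_eq!(adjust_out_amount(1_234_567, 9, 6), 1_234);
    // 6 -> 9 decimals: scale up by 1000.
    assert_eq!(adjust_out_amount(42, 6, 9), 42_000);
    println!("ok");
}
```

Note that `remove_liquidity` applies the same adjustment with `checked_mul`/`checked_div` so an overflow aborts rather than wrapping.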
View File

@ -1,117 +0,0 @@
use crate::{
accounts::{
AuthoritySigner,
CustodySigner,
MigrationPool,
MigrationPoolDerivationData,
ShareMint,
ShareMintDerivationData,
ToCustodyTokenAccount,
ToCustodyTokenAccountDerivationData,
},
types::{
SplAccount,
SplMint,
},
MigrationError::WrongMint,
};
use borsh::{
BorshDeserialize,
BorshSerialize,
};
use solitaire::{
processors::seeded::{
invoke_seeded,
Seeded,
},
*,
};
#[derive(FromAccounts)]
pub struct RemoveLiquidity<'b> {
pub pool: Mut<MigrationPool<'b, { AccountState::Initialized }>>,
pub from_mint: Data<'b, SplMint, { AccountState::Initialized }>,
pub to_mint: Data<'b, SplMint, { AccountState::Initialized }>,
pub to_token_custody: Mut<ToCustodyTokenAccount<'b, { AccountState::Initialized }>>,
pub share_mint: Mut<ShareMint<'b, { AccountState::Initialized }>>,
pub to_lp_acc: Mut<Data<'b, SplAccount, { AccountState::Initialized }>>,
pub lp_share_acc: Mut<Data<'b, SplAccount, { AccountState::Initialized }>>,
pub custody_signer: CustodySigner<'b>,
pub authority_signer: AuthoritySigner<'b>,
}
#[derive(BorshDeserialize, BorshSerialize, Default)]
pub struct RemoveLiquidityData {
pub amount: u64,
}
pub fn remove_liquidity(
ctx: &ExecutionContext,
accs: &mut RemoveLiquidity,
data: RemoveLiquidityData,
) -> Result<()> {
if *accs.from_mint.info().key != accs.pool.from {
return Err(WrongMint.into());
}
if *accs.to_mint.info().key != accs.pool.to {
return Err(WrongMint.into());
}
if accs.lp_share_acc.mint != *accs.share_mint.info().key {
return Err(WrongMint.into());
}
accs.to_token_custody.verify_derivation(
ctx.program_id,
&ToCustodyTokenAccountDerivationData {
pool: *accs.pool.info().key,
},
)?;
accs.share_mint.verify_derivation(
ctx.program_id,
&ShareMintDerivationData {
pool: *accs.pool.info().key,
},
)?;
accs.pool.verify_derivation(
ctx.program_id,
&MigrationPoolDerivationData {
from: accs.pool.from,
to: accs.pool.to,
},
)?;
// The out amount needs to be decimal adjusted
let out_amount = if accs.from_mint.decimals > accs.to_mint.decimals {
data.amount
.checked_div(10u64.pow((accs.from_mint.decimals - accs.to_mint.decimals) as u32))
.unwrap()
} else {
data.amount
.checked_mul(10u64.pow((accs.to_mint.decimals - accs.from_mint.decimals) as u32))
.unwrap()
};
// Transfer removed liquidity to LP
let transfer_ix = spl_token::instruction::transfer(
&spl_token::id(),
accs.to_token_custody.info().key,
accs.to_lp_acc.info().key,
accs.custody_signer.key,
&[],
out_amount,
)?;
invoke_seeded(&transfer_ix, ctx, &accs.custody_signer, None)?;
// Burn LP shares
let mint_ix = spl_token::instruction::burn(
&spl_token::id(),
accs.lp_share_acc.info().key,
accs.share_mint.info().key,
accs.authority_signer.key,
&[],
data.amount,
)?;
invoke_seeded(&mint_ix, ctx, &accs.authority_signer, None)?;
Ok(())
}

View File

@ -1,294 +0,0 @@
use crate::{
accounts::{
AuthoritySigner,
CustodySigner,
FromCustodyTokenAccount,
FromCustodyTokenAccountDerivationData,
MigrationPool,
MigrationPoolDerivationData,
ShareMint,
ShareMintDerivationData,
ToCustodyTokenAccount,
ToCustodyTokenAccountDerivationData,
},
api::{
add_liquidity::AddLiquidityData,
claim_shares::ClaimSharesData,
create_pool::CreatePoolData,
migrate_tokens::MigrateTokensData,
remove_liquidity::RemoveLiquidityData,
},
};
use borsh::BorshSerialize;
use solana_program::{
instruction::{
AccountMeta,
Instruction,
},
pubkey::Pubkey,
};
use solitaire::{
processors::seeded::Seeded,
AccountState,
};
pub fn add_liquidity(
program_id: Pubkey,
from_mint: Pubkey,
to_mint: Pubkey,
liquidity_token_account: Pubkey,
lp_share_token_account: Pubkey,
amount: u64,
) -> solitaire::Result<Instruction> {
let pool = MigrationPool::<'_, { AccountState::Initialized }>::key(
&MigrationPoolDerivationData {
from: from_mint,
to: to_mint,
},
&program_id,
);
Ok(Instruction {
program_id,
accounts: vec![
AccountMeta::new(pool, false),
AccountMeta::new_readonly(from_mint, false),
AccountMeta::new_readonly(to_mint, false),
AccountMeta::new(
ToCustodyTokenAccount::<'_, { AccountState::Uninitialized }>::key(
&ToCustodyTokenAccountDerivationData { pool },
&program_id,
),
false,
),
AccountMeta::new(
ShareMint::<'_, { AccountState::Uninitialized }>::key(
&ShareMintDerivationData { pool },
&program_id,
),
false,
),
AccountMeta::new(liquidity_token_account, false),
AccountMeta::new(lp_share_token_account, false),
AccountMeta::new_readonly(CustodySigner::key(None, &program_id), false),
AccountMeta::new_readonly(AuthoritySigner::key(None, &program_id), false),
// Dependencies
AccountMeta::new(solana_program::sysvar::rent::id(), false),
AccountMeta::new(solana_program::system_program::id(), false),
AccountMeta::new_readonly(spl_token::id(), false),
],
data: (
crate::instruction::Instruction::AddLiquidity,
AddLiquidityData { amount },
)
.try_to_vec()?,
})
}
pub fn remove_liquidity(
program_id: Pubkey,
from_mint: Pubkey,
to_mint: Pubkey,
liquidity_token_account: Pubkey,
lp_share_token_account: Pubkey,
amount: u64,
) -> solitaire::Result<Instruction> {
let pool = MigrationPool::<'_, { AccountState::Initialized }>::key(
&MigrationPoolDerivationData {
from: from_mint,
to: to_mint,
},
&program_id,
);
Ok(Instruction {
program_id,
accounts: vec![
AccountMeta::new(pool, false),
AccountMeta::new_readonly(from_mint, false),
AccountMeta::new_readonly(to_mint, false),
AccountMeta::new(
ToCustodyTokenAccount::<'_, { AccountState::Uninitialized }>::key(
&ToCustodyTokenAccountDerivationData { pool },
&program_id,
),
false,
),
AccountMeta::new(
ShareMint::<'_, { AccountState::Uninitialized }>::key(
&ShareMintDerivationData { pool },
&program_id,
),
false,
),
AccountMeta::new(liquidity_token_account, false),
AccountMeta::new(lp_share_token_account, false),
AccountMeta::new_readonly(CustodySigner::key(None, &program_id), false),
AccountMeta::new_readonly(AuthoritySigner::key(None, &program_id), false),
// Dependencies
AccountMeta::new(solana_program::sysvar::rent::id(), false),
AccountMeta::new(solana_program::system_program::id(), false),
AccountMeta::new_readonly(spl_token::id(), false),
],
data: (
crate::instruction::Instruction::RemoveLiquidity,
RemoveLiquidityData { amount },
)
.try_to_vec()?,
})
}
pub fn claim_shares(
program_id: Pubkey,
from_mint: Pubkey,
to_mint: Pubkey,
output_token_account: Pubkey,
lp_share_token_account: Pubkey,
amount: u64,
) -> solitaire::Result<Instruction> {
let pool = MigrationPool::<'_, { AccountState::Initialized }>::key(
&MigrationPoolDerivationData {
from: from_mint,
to: to_mint,
},
&program_id,
);
Ok(Instruction {
program_id,
accounts: vec![
AccountMeta::new(pool, false),
AccountMeta::new(
FromCustodyTokenAccount::<'_, { AccountState::Uninitialized }>::key(
&FromCustodyTokenAccountDerivationData { pool },
&program_id,
),
false,
),
AccountMeta::new(
ShareMint::<'_, { AccountState::Uninitialized }>::key(
&ShareMintDerivationData { pool },
&program_id,
),
false,
),
AccountMeta::new(output_token_account, false),
AccountMeta::new(lp_share_token_account, false),
AccountMeta::new_readonly(CustodySigner::key(None, &program_id), false),
AccountMeta::new_readonly(AuthoritySigner::key(None, &program_id), false),
// Dependencies
AccountMeta::new(solana_program::sysvar::rent::id(), false),
AccountMeta::new(solana_program::system_program::id(), false),
AccountMeta::new_readonly(spl_token::id(), false),
],
data: (
crate::instruction::Instruction::ClaimShares,
ClaimSharesData { amount },
)
.try_to_vec()?,
})
}
pub fn create_pool(
program_id: Pubkey,
payer: Pubkey,
from_mint: Pubkey,
to_mint: Pubkey,
) -> solitaire::Result<Instruction> {
let pool = MigrationPool::<'_, { AccountState::Initialized }>::key(
&MigrationPoolDerivationData {
from: from_mint,
to: to_mint,
},
&program_id,
);
Ok(Instruction {
program_id,
accounts: vec![
AccountMeta::new(payer, true),
AccountMeta::new(pool, false),
AccountMeta::new_readonly(from_mint, false),
AccountMeta::new_readonly(to_mint, false),
AccountMeta::new(
FromCustodyTokenAccount::<'_, { AccountState::Uninitialized }>::key(
&FromCustodyTokenAccountDerivationData { pool },
&program_id,
),
false,
),
AccountMeta::new(
ToCustodyTokenAccount::<'_, { AccountState::Uninitialized }>::key(
&ToCustodyTokenAccountDerivationData { pool },
&program_id,
),
false,
),
AccountMeta::new(
ShareMint::<'_, { AccountState::Uninitialized }>::key(
&ShareMintDerivationData { pool },
&program_id,
),
false,
),
AccountMeta::new_readonly(CustodySigner::key(None, &program_id), false),
// Dependencies
AccountMeta::new(solana_program::sysvar::rent::id(), false),
AccountMeta::new(solana_program::system_program::id(), false),
AccountMeta::new_readonly(spl_token::id(), false),
],
data: (
crate::instruction::Instruction::CreatePool,
CreatePoolData {},
)
.try_to_vec()?,
})
}
pub fn migrate_tokens(
program_id: Pubkey,
from_mint: Pubkey,
to_mint: Pubkey,
input_token_account: Pubkey,
output_token_account: Pubkey,
amount: u64,
) -> solitaire::Result<Instruction> {
let pool = MigrationPool::<'_, { AccountState::Initialized }>::key(
&MigrationPoolDerivationData {
from: from_mint,
to: to_mint,
},
&program_id,
);
Ok(Instruction {
program_id,
accounts: vec![
AccountMeta::new(pool, false),
AccountMeta::new_readonly(from_mint, false),
AccountMeta::new_readonly(to_mint, false),
AccountMeta::new(
ToCustodyTokenAccount::<'_, { AccountState::Uninitialized }>::key(
&ToCustodyTokenAccountDerivationData { pool },
&program_id,
),
false,
),
AccountMeta::new(
FromCustodyTokenAccount::<'_, { AccountState::Uninitialized }>::key(
&FromCustodyTokenAccountDerivationData { pool },
&program_id,
),
false,
),
AccountMeta::new(input_token_account, false),
AccountMeta::new(output_token_account, false),
AccountMeta::new_readonly(CustodySigner::key(None, &program_id), false),
AccountMeta::new_readonly(AuthoritySigner::key(None, &program_id), false),
// Dependencies
AccountMeta::new(solana_program::sysvar::rent::id(), false),
AccountMeta::new(solana_program::system_program::id(), false),
AccountMeta::new_readonly(spl_token::id(), false),
],
data: (
crate::instruction::Instruction::MigrateTokens,
MigrateTokensData { amount },
)
.try_to_vec()?,
})
}
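Each builder above packs its payload with `(instruction_enum, data_struct).try_to_vec()`. Borsh serializes an enum as a one-byte variant index followed by the struct fields in declaration order, with integers little-endian. A std-only sketch of that byte layout for the `AddLiquidity` case — the variant value `0` and the helper name are illustrative assumptions, not the program's actual index:

```rust
// Emulates Borsh's layout for (Instruction::AddLiquidity, AddLiquidityData { amount }):
// one variant byte, then the u64 amount as 8 little-endian bytes.
fn encode_add_liquidity(variant: u8, amount: u64) -> Vec<u8> {
    let mut data = Vec::with_capacity(9);
    data.push(variant);
    data.extend_from_slice(&amount.to_le_bytes());
    data
}

fn main() {
    let data = encode_add_liquidity(0, 1_000);
    assert_eq!(data.len(), 9);   // 1 variant byte + 8 amount bytes
    assert_eq!(data[0], 0);      // variant index leads the payload
    let mut amt = [0u8; 8];
    amt.copy_from_slice(&data[1..9]);
    assert_eq!(u64::from_le_bytes(amt), 1_000);
}
```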

@@ -1,48 +0,0 @@
#![allow(incomplete_features)]
#![feature(adt_const_params)]
use api::{
add_liquidity::*,
claim_shares::*,
create_pool::*,
migrate_tokens::*,
remove_liquidity::*,
};
use solitaire::{
solitaire,
SolitaireError,
};
pub mod accounts;
pub mod api;
pub mod types;
#[cfg(feature = "no-entrypoint")]
pub mod instructions;
#[cfg(feature = "wasm")]
#[cfg(all(target_arch = "wasm32", target_os = "unknown"))]
extern crate wasm_bindgen;
#[cfg(feature = "wasm")]
#[cfg(all(target_arch = "wasm32", target_os = "unknown"))]
pub mod wasm;
pub enum MigrationError {
WrongMint,
}
impl From<MigrationError> for SolitaireError {
fn from(t: MigrationError) -> SolitaireError {
SolitaireError::Custom(t as u64)
}
}
solitaire! {
AddLiquidity(AddLiquidityData) => add_liquidity,
RemoveLiquidity(RemoveLiquidityData) => remove_liquidity,
ClaimShares(ClaimSharesData) => claim_shares,
CreatePool(CreatePoolData) => create_pool,
MigrateTokens(MigrateTokensData) => migrate_tokens,
}

@@ -1,35 +0,0 @@
use borsh::{
BorshDeserialize,
BorshSerialize,
};
use serde::{
Deserialize,
Serialize,
};
use solana_program::pubkey::Pubkey;
use solitaire::{
pack_type,
processors::seeded::{
AccountOwner,
Owned,
},
};
use spl_token::state::{
Account,
Mint,
};
#[derive(Default, Clone, Copy, BorshDeserialize, BorshSerialize, Serialize, Deserialize)]
pub struct PoolData {
pub from: Pubkey,
pub to: Pubkey,
}
impl Owned for PoolData {
fn owner(&self) -> AccountOwner {
AccountOwner::This
}
}
pack_type!(SplMint, Mint, AccountOwner::Other(spl_token::id()));
pack_type!(SplAccount, Account, AccountOwner::Other(spl_token::id()));

@@ -1,222 +0,0 @@
use crate::{
accounts::{
AuthoritySigner,
FromCustodyTokenAccount,
FromCustodyTokenAccountDerivationData,
MigrationPool,
MigrationPoolDerivationData,
ShareMint,
ShareMintDerivationData,
ToCustodyTokenAccount,
ToCustodyTokenAccountDerivationData,
},
instructions,
types::PoolData,
};
use borsh::BorshDeserialize;
use solana_program::pubkey::Pubkey;
use solitaire::{
processors::seeded::Seeded,
AccountState,
};
use std::str::FromStr;
use wasm_bindgen::prelude::*;
#[wasm_bindgen]
pub fn add_liquidity(
program_id: String,
from_mint: String,
to_mint: String,
liquidity_token_account: String,
lp_share_token_account: String,
amount: u64,
) -> JsValue {
let program_id = Pubkey::from_str(program_id.as_str()).unwrap();
let from_mint = Pubkey::from_str(from_mint.as_str()).unwrap();
let to_mint = Pubkey::from_str(to_mint.as_str()).unwrap();
let liquidity_token_account = Pubkey::from_str(liquidity_token_account.as_str()).unwrap();
let lp_share_token_account = Pubkey::from_str(lp_share_token_account.as_str()).unwrap();
let ix = instructions::add_liquidity(
program_id,
from_mint,
to_mint,
liquidity_token_account,
lp_share_token_account,
amount,
)
.unwrap();
JsValue::from_serde(&ix).unwrap()
}
#[wasm_bindgen]
pub fn remove_liquidity(
program_id: String,
from_mint: String,
to_mint: String,
liquidity_token_account: String,
lp_share_token_account: String,
amount: u64,
) -> JsValue {
let program_id = Pubkey::from_str(program_id.as_str()).unwrap();
let from_mint = Pubkey::from_str(from_mint.as_str()).unwrap();
let to_mint = Pubkey::from_str(to_mint.as_str()).unwrap();
let liquidity_token_account = Pubkey::from_str(liquidity_token_account.as_str()).unwrap();
let lp_share_token_account = Pubkey::from_str(lp_share_token_account.as_str()).unwrap();
let ix = instructions::remove_liquidity(
program_id,
from_mint,
to_mint,
liquidity_token_account,
lp_share_token_account,
amount,
)
.unwrap();
JsValue::from_serde(&ix).unwrap()
}
#[wasm_bindgen]
pub fn claim_shares(
program_id: String,
from_mint: String,
to_mint: String,
output_token_account: String,
lp_share_token_account: String,
amount: u64,
) -> JsValue {
let program_id = Pubkey::from_str(program_id.as_str()).unwrap();
let from_mint = Pubkey::from_str(from_mint.as_str()).unwrap();
let to_mint = Pubkey::from_str(to_mint.as_str()).unwrap();
let output_token_account = Pubkey::from_str(output_token_account.as_str()).unwrap();
let lp_share_token_account = Pubkey::from_str(lp_share_token_account.as_str()).unwrap();
let ix = instructions::claim_shares(
program_id,
from_mint,
to_mint,
output_token_account,
lp_share_token_account,
amount,
)
.unwrap();
JsValue::from_serde(&ix).unwrap()
}
#[wasm_bindgen]
pub fn create_pool(
program_id: String,
payer: String,
from_mint: String,
to_mint: String,
) -> JsValue {
let program_id = Pubkey::from_str(program_id.as_str()).unwrap();
let payer = Pubkey::from_str(payer.as_str()).unwrap();
let from_mint = Pubkey::from_str(from_mint.as_str()).unwrap();
let to_mint = Pubkey::from_str(to_mint.as_str()).unwrap();
let ix = instructions::create_pool(program_id, payer, from_mint, to_mint).unwrap();
JsValue::from_serde(&ix).unwrap()
}
#[wasm_bindgen]
pub fn migrate_tokens(
program_id: String,
from_mint: String,
to_mint: String,
input_token_account: String,
output_token_account: String,
amount: u64,
) -> JsValue {
let program_id = Pubkey::from_str(program_id.as_str()).unwrap();
let from_mint = Pubkey::from_str(from_mint.as_str()).unwrap();
let to_mint = Pubkey::from_str(to_mint.as_str()).unwrap();
let input_token_account = Pubkey::from_str(input_token_account.as_str()).unwrap();
let output_token_account = Pubkey::from_str(output_token_account.as_str()).unwrap();
let ix = instructions::migrate_tokens(
program_id,
from_mint,
to_mint,
input_token_account,
output_token_account,
amount,
)
.unwrap();
JsValue::from_serde(&ix).unwrap()
}
#[wasm_bindgen]
pub fn pool_address(program_id: String, from_mint: String, to_mint: String) -> Vec<u8> {
let program_id = Pubkey::from_str(program_id.as_str()).unwrap();
let from_mint_key = Pubkey::from_str(from_mint.as_str()).unwrap();
let to_mint_key = Pubkey::from_str(to_mint.as_str()).unwrap();
let pool_addr = MigrationPool::<'_, { AccountState::Initialized }>::key(
&MigrationPoolDerivationData {
from: from_mint_key,
to: to_mint_key,
},
&program_id,
);
pool_addr.to_bytes().to_vec()
}
#[wasm_bindgen]
pub fn authority_address(program_id: String) -> Vec<u8> {
let program_id = Pubkey::from_str(program_id.as_str()).unwrap();
let authority_addr = AuthoritySigner::key(None, &program_id);
authority_addr.to_bytes().to_vec()
}
#[wasm_bindgen]
pub fn share_mint_address(program_id: String, pool: String) -> Vec<u8> {
let program_id = Pubkey::from_str(program_id.as_str()).unwrap();
let pool_key = Pubkey::from_str(pool.as_str()).unwrap();
let share_mint_addr = ShareMint::<'_, { AccountState::Initialized }>::key(
&ShareMintDerivationData { pool: pool_key },
&program_id,
);
share_mint_addr.to_bytes().to_vec()
}
#[wasm_bindgen]
pub fn from_custody_address(program_id: String, pool: String) -> Vec<u8> {
let program_id = Pubkey::from_str(program_id.as_str()).unwrap();
let pool_key = Pubkey::from_str(pool.as_str()).unwrap();
let from_custody_addr = FromCustodyTokenAccount::<'_, { AccountState::Initialized }>::key(
&FromCustodyTokenAccountDerivationData { pool: pool_key },
&program_id,
);
from_custody_addr.to_bytes().to_vec()
}
#[wasm_bindgen]
pub fn to_custody_address(program_id: String, pool: String) -> Vec<u8> {
let program_id = Pubkey::from_str(program_id.as_str()).unwrap();
let pool_key = Pubkey::from_str(pool.as_str()).unwrap();
let to_custody_addr = ToCustodyTokenAccount::<'_, { AccountState::Initialized }>::key(
&ToCustodyTokenAccountDerivationData { pool: pool_key },
&program_id,
);
to_custody_addr.to_bytes().to_vec()
}
#[wasm_bindgen]
pub fn parse_pool(data: Vec<u8>) -> JsValue {
JsValue::from_serde(&PoolData::try_from_slice(data.as_slice()).unwrap()).unwrap()
}

File diff suppressed because it is too large

@@ -1,4 +0,0 @@
[workspace]
members = ["program", "client"]
[patch.crates-io]
#memmap2 = { git = "https://github.com/certusone/wormhole", branch="feat/token-bridge-proxy" }

@@ -1,77 +0,0 @@
Local check for tilt devnet:
build using:
```
from: wormhole-icco/solana/modules/icco_contributor
EMITTER_ADDRESS="11111111111111111111111111111115" BRIDGE_ADDRESS="Bridge1p5gheXUvJ6jGWGeCsgPKgnE3YgdGKRVCMY9o" cargo check
or
EMITTER_ADDRESS="11111111111111111111111111111115" BRIDGE_ADDRESS="Bridge1p5gheXUvJ6jGWGeCsgPKgnE3YgdGKRVCMY9o" cargo build
or
EMITTER_ADDRESS="11111111111111111111111111111115" BRIDGE_ADDRESS="Bridge1p5gheXUvJ6jGWGeCsgPKgnE3YgdGKRVCMY9o" cargo build-bpf
```
Build or check the wasm build:
```
from: wormhole-icco/solana/modules/icco_contributor/program
#EMITTER_ADDRESS="11111111111111111111111111111115" BRIDGE_ADDRESS="Bridge1p5gheXUvJ6jGWGeCsgPKgnE3YgdGKRVCMY9o" wasm-pack build --target bundler -d bundler -- --features wasm
EMITTER_ADDRESS="11111111111111111111111111111115" BRIDGE_ADDRESS="Bridge1p5gheXUvJ6jGWGeCsgPKgnE3YgdGKRVCMY9o" wasm-pack build --target nodejs -d node -- --features wasm
cp node/* ../../../../sdk/js/src/solana/icco_contributor-node/
```
To add building and deployment of icco_contributor to tilt:
```
wormhole-icco/devnet/solana-devnet.yaml
add
- --bpf-program
- 22mamxmojFWBdbGqaxTH46HBAgAY2bJRiGJJHfNRNQ95
- /opt/solana/deps/icco_contributor.so
wormhole-icco/solana/Dockerfile.wasm (wasm building)
add:
# Compile icco_contributor
RUN --mount=type=cache,target=/root/.cache \
--mount=type=cache,target=modules/icco_contributor/target \
cd modules/icco_contributor/program && /usr/local/cargo/bin/wasm-pack build --target bundler -d bundler -- --features wasm --locked && \
cd bundler && sed -i $SED_REMOVE_INVALID_REFERENCE icco_contributor_bg.js
RUN --mount=type=cache,target=/root/.cache \
--mount=type=cache,target=modules/icco_contributor/target \
cd modules/icco_contributor/program && /usr/local/cargo/bin/wasm-pack build --target nodejs -d nodejs -- --features wasm --locked
COPY --from=build /usr/src/bridge/modules/icco_contributor/program/bundler sdk/js/src/solana/icco_contributor
COPY --from=build /usr/src/bridge/modules/icco_contributor/program/nodejs sdk/js/src/solana/icco_contributor-node
wormhole-icco/solana/Dockerfile (bpf contract building)
add following lines to appropriate places in RUN command:
--mount=type=cache,target=modules/icco_contributor/target \
cargo build-bpf --manifest-path "modules/icco_contributor/program/Cargo.toml" -- --locked && \
cp modules/icco_contributor/target/deploy/icco_contributor.so /opt/solana/deps/icco_contributor.so && \
```
Optionally, deploy the contributor contract to tilt devnet with a new address:
```
wormhole-icco/solana/ directory:
0. Needed every time tilt reloads the Solana node. Copy the secret key and contract id key to tilt.
kubectl cp -c devnet keys/solana-devnet.json solana-devnet-0:/root/.config/solana/id.json
kubectl cp -c devnet modules/icco_contributor/contributor_id.json solana-devnet-0:/usr/src/
1. Copy locally built bpf to tilt
kubectl cp -c devnet modules/icco_contributor/target/deploy/icco_contributor.so solana-devnet-0:/usr/src/
2. deploy to solana devnet
kubectl exec -c devnet solana-devnet-0 -- solana program deploy -u l --program-id=/usr/src/contributor_id.json /usr/src/icco_contributor.so
// returns Program Id: 5yrpFgtmiBkRmDgveVErMWuxC25eK5QE5ouZgfi46aqM
kubectl exec -c devnet solana-devnet-0 -- solana program deploy -u l /usr/src/icco_contributor.so // This creates a new contract address every time
```
To register the coreBridge and conductor, run the following from solana/:
./modules/icco_contributor/target/debug/client create-bridge 5yrpFgtmiBkRmDgveVErMWuxC25eK5QE5ouZgfi46aqM B6RHG3mfcckmrYN1UhmJzyS1XX3fZKbkeUcpJe9Sy3FE B6RHG3mfcckmrYN1UhmJzyS1XX3fZKbkeUcpJe9Sy3FE
The addresses are: contributor, coreBridge, conductor (fake for now).

@@ -1,23 +0,0 @@
[package]
name = "client"
version = "0.1.0"
edition = "2018"
[dependencies]
anyhow = "1.0.40"
borsh = "=0.9.1"
icco_contributor = { path = "../program", features = ["client"] }
clap = "2.33.0"
rand = "0.7.3"
shellexpand = "2.1.0"
solana-client = "=1.9.4"
solana-program = "=1.9.4"
solana-sdk = "=1.9.4"
solana-cli-config = "=1.9.4"
#solitaire = { path = "../../../solitaire/program" }
solitaire = { git = "https://github.com/certusone/wormhole", branch="feat/token-bridge-proxy" }
#solitaire-client = { path = "../../../solitaire/client" }
solitaire-client = { git = "https://github.com/certusone/wormhole", branch="feat/token-bridge-proxy" }
solana-clap-utils = "=1.9.4"
hex = "0.4.3"
spl-token-metadata = { path = "../token-metadata" }

@@ -1,461 +0,0 @@
#![feature(adt_const_params)]
#![allow(warnings)]
use std::{
fmt::Display,
mem::size_of,
process::exit,
};
use borsh::BorshDeserialize;
use clap::{
crate_description,
crate_name,
crate_version,
value_t,
App,
AppSettings,
Arg,
SubCommand,
};
use hex;
use solana_clap_utils::{
input_parsers::{
keypair_of,
pubkey_of,
value_of,
},
input_validators::{
is_keypair,
is_pubkey_or_keypair,
is_url,
},
};
use solana_client::{
rpc_client::RpcClient,
rpc_config::RpcSendTransactionConfig,
};
use solana_program::account_info::AccountInfo;
use solana_sdk::{
commitment_config::{
CommitmentConfig,
CommitmentLevel,
},
native_token::*,
program_error::ProgramError::AccountAlreadyInitialized,
pubkey::Pubkey,
signature::{
read_keypair_file,
Keypair,
Signer,
},
system_instruction::transfer,
transaction::Transaction,
};
use solitaire::{
processors::seeded::Seeded,
AccountState,
Info,
};
use solitaire_client::Derive;
struct Config {
rpc_client: RpcClient,
owner: Keypair,
fee_payer: Keypair,
commitment_config: CommitmentConfig,
}
type Error = Box<dyn std::error::Error>;
type CommandResult = Result<Option<Transaction>, Error>;
fn command_init_bridge(config: &Config, bridge: &Pubkey, core_bridge: &Pubkey, icco_conductor: &Pubkey) -> CommandResult {
println!("Initializing icco contributor {}", bridge);
let minimum_balance_for_rent_exemption = config
.rpc_client
.get_minimum_balance_for_rent_exemption(size_of::<icco_contributor::types::Config>())?;
let ix = icco_contributor::instructions::initialize(*bridge, config.owner.pubkey(), *core_bridge, *icco_conductor).unwrap();
println!("config account: {}, ", ix.accounts[1].pubkey.to_string());
let mut transaction = Transaction::new_with_payer(&[ix], Some(&config.fee_payer.pubkey()));
let (recent_blockhash, fee_calculator) = config.rpc_client.get_recent_blockhash()?;
check_fee_payer_balance(
config,
minimum_balance_for_rent_exemption + fee_calculator.calculate_fee(&transaction.message()),
)?;
transaction.sign(&[&config.fee_payer, &config.owner], recent_blockhash);
Ok(Some(transaction))
}
fn command_create_meta(
config: &Config,
mint: &Pubkey,
name: String,
symbol: String,
uri: String,
) -> CommandResult {
println!("Creating meta for mint {}", mint);
let meta_acc = Pubkey::find_program_address(
&[
"metadata".as_bytes(),
spl_token_metadata::id().as_ref(),
mint.as_ref(),
],
&spl_token_metadata::id(),
)
.0;
println!("Meta account: {}", meta_acc);
let ix = spl_token_metadata::instruction::create_metadata_accounts(
spl_token_metadata::id(),
meta_acc,
*mint,
config.owner.pubkey(),
config.owner.pubkey(),
config.owner.pubkey(),
name,
symbol,
uri,
None,
0,
false,
false,
);
let mut transaction = Transaction::new_with_payer(&[ix], Some(&config.fee_payer.pubkey()));
let (recent_blockhash, _) = config.rpc_client.get_recent_blockhash()?;
transaction.sign(&[&config.fee_payer, &config.owner], recent_blockhash);
Ok(Some(transaction))
}
fn main() {
let matches = App::new(crate_name!())
.about(crate_description!())
.version(crate_version!())
.setting(AppSettings::SubcommandRequiredElseHelp)
.arg({
let arg = Arg::with_name("config_file")
.short("C")
.long("config")
.value_name("PATH")
.takes_value(true)
.global(true)
.help("Configuration file to use");
if let Some(ref config_file) = *solana_cli_config::CONFIG_FILE {
arg.default_value(&config_file)
} else {
arg
}
})
.arg(
Arg::with_name("json_rpc_url")
.long("url")
.value_name("URL")
.takes_value(true)
.validator(is_url)
.help("JSON RPC URL for the cluster. Default from the configuration file."),
)
.arg(
Arg::with_name("owner")
.long("owner")
.value_name("KEYPAIR")
.validator(is_keypair)
.takes_value(true)
.help(
"Specify the contract payer account. \
This may be a keypair file, the ASK keyword. \
Defaults to the client keypair.",
),
)
.arg(
Arg::with_name("fee_payer")
.long("fee-payer")
.value_name("KEYPAIR")
.validator(is_keypair)
.takes_value(true)
.help(
"Specify the fee-payer account. \
This may be a keypair file, the ASK keyword. \
Defaults to the client keypair.",
),
)
.subcommand(
SubCommand::with_name("create-bridge")
.about("Create a new bridge")
.arg(
Arg::with_name("icco-contributor")
.long("contributor")
.value_name("ICCO_CONTRIBUTOR_KEY")
.validator(is_pubkey_or_keypair)
.takes_value(true)
.index(1)
.required(true)
.help("Specify the icco contributor program address"),
)
.arg(
Arg::with_name("core-bridge")
.validator(is_pubkey_or_keypair)
.value_name("CORE_BRIDGE_KEY")
.takes_value(true)
.index(2)
.required(true)
.help("Address of the Wormhole core bridge program"),
)
.arg(
Arg::with_name("icco-conductor")
.validator(is_pubkey_or_keypair)
.value_name("ICCO_CONDUCTOR_KEY")
.takes_value(true)
.index(3)
.required(false)
.help("Address of conductor for icco"),
),
)
.subcommand(
SubCommand::with_name("emitter")
.about("Get the derived emitter used for contract messages")
.arg(
Arg::with_name("bridge")
.long("bridge")
.value_name("BRIDGE_KEY")
.validator(is_pubkey_or_keypair)
.takes_value(true)
.index(1)
.required(true)
.help("Specify the token bridge program address"),
),
)
.subcommand(
SubCommand::with_name("metadata")
.about("Get the derived metadata associated with token mints")
.arg(
Arg::with_name("mint")
.long("mint")
.value_name("MINT_KEY")
.validator(is_pubkey_or_keypair)
.takes_value(true)
.index(1)
.required(true)
.help("Specify the token mint to derive metadata for"),
),
)
.subcommand(
SubCommand::with_name("create-meta")
.about("Create token metadata")
.arg(
Arg::with_name("mint")
.long("mint")
.value_name("MINT")
.validator(is_pubkey_or_keypair)
.takes_value(true)
.index(1)
.required(true)
.help("Specify the mint address"),
)
.arg(
Arg::with_name("name")
.long("name")
.value_name("NAME")
.takes_value(true)
.index(2)
.required(true)
.help("Name of the token"),
)
.arg(
Arg::with_name("symbol")
.long("symbol")
.value_name("SYMBOL")
.takes_value(true)
.index(3)
.required(true)
.help("Symbol of the token"),
)
.arg(
Arg::with_name("uri")
.long("uri")
.value_name("URI")
.takes_value(true)
.index(4)
.required(true)
.help("URI of the token metadata"),
),
)
.get_matches();
let config = {
let cli_config = if let Some(config_file) = matches.value_of("config_file") {
solana_cli_config::Config::load(config_file).unwrap_or_default()
} else {
solana_cli_config::Config::default()
};
let json_rpc_url = value_t!(matches, "json_rpc_url", String)
.unwrap_or_else(|_| cli_config.json_rpc_url.clone());
let client_keypair = || {
read_keypair_file(&cli_config.keypair_path).unwrap_or_else(|err| {
eprintln!("Unable to read {}: {}", cli_config.keypair_path, err);
exit(1)
})
};
let owner = keypair_of(&matches, "owner").unwrap_or_else(client_keypair);
let fee_payer = keypair_of(&matches, "fee_payer").unwrap_or_else(client_keypair);
Config {
rpc_client: RpcClient::new(json_rpc_url),
owner,
fee_payer,
commitment_config: CommitmentConfig::processed(),
}
};
let _ = match matches.subcommand() {
("create-bridge", Some(arg_matches)) => {
let contributor = pubkey_of(arg_matches, "icco-contributor").unwrap();
let core_bridge = pubkey_of(arg_matches, "core-bridge").unwrap();
let conductor = pubkey_of(arg_matches, "icco-conductor").unwrap();
command_init_bridge(&config, &contributor, &core_bridge, &conductor)
}
("create-meta", Some(arg_matches)) => {
let mint = pubkey_of(arg_matches, "mint").unwrap();
let name: String = value_of(arg_matches, "name").unwrap();
let symbol: String = value_of(arg_matches, "symbol").unwrap();
let uri: String = value_of(arg_matches, "uri").unwrap();
command_create_meta(&config, &mint, name, symbol, uri)
}
("emitter", Some(arg_matches)) => {
let bridge = pubkey_of(arg_matches, "bridge").unwrap();
let emitter = <Derive<Info<'_>, "emitter">>::key(None, &bridge);
println!("Emitter Key: {}", emitter);
Ok(None)
}
("metadata", Some(arg_matches)) => {
let mint = pubkey_of(arg_matches, "mint").unwrap();
let meta_acc = Pubkey::find_program_address(
&[
"metadata".as_bytes(),
spl_token_metadata::id().as_ref(),
mint.as_ref(),
],
&spl_token_metadata::id(),
)
.0;
let meta_info = config.rpc_client.get_account(&meta_acc).unwrap();
let meta_info = spl_token_metadata::state::Metadata::from_bytes(&meta_info.data).unwrap();
println!("Key: {:?}", meta_info.key);
println!("Mint: {}", meta_info.mint);
println!("Metadata Key: {}", meta_acc);
println!("Update Authority: {}", meta_info.update_authority);
println!("Name: {}", meta_info.data.name);
println!("Symbol: {}", meta_info.data.symbol);
println!("URI: {}", meta_info.data.uri);
println!("Mutable: {}", meta_info.is_mutable);
Ok(None)
}
_ => unreachable!(),
}
.and_then(|transaction| {
if let Some(transaction) = transaction {
let signature = config
.rpc_client
.send_and_confirm_transaction_with_spinner_and_config(
&transaction,
config.commitment_config,
RpcSendTransactionConfig {
skip_preflight: true,
preflight_commitment: None,
encoding: None,
max_retries: None,
},
)?;
println!("Signature: {}", signature);
}
Ok(())
})
.map_err(|err| {
eprintln!("{}", err);
exit(1);
});
}
pub fn is_u8<T>(amount: T) -> Result<(), String>
where
T: AsRef<str> + Display,
{
if amount.as_ref().parse::<u8>().is_ok() {
Ok(())
} else {
Err(format!(
"Unable to parse input amount as integer, provided: {}",
amount
))
}
}
pub fn is_u32<T>(amount: T) -> Result<(), String>
where
T: AsRef<str> + Display,
{
if amount.as_ref().parse::<u32>().is_ok() {
Ok(())
} else {
Err(format!(
"Unable to parse input amount as integer, provided: {}",
amount
))
}
}
pub fn is_u64<T>(amount: T) -> Result<(), String>
where
T: AsRef<str> + Display,
{
if amount.as_ref().parse::<u64>().is_ok() {
Ok(())
} else {
Err(format!(
"Unable to parse input amount as integer, provided: {}",
amount
))
}
}
pub fn is_hex<T>(value: T) -> Result<(), String>
where
T: AsRef<str> + Display,
{
hex::decode(value.to_string())
.map(|_| ())
.map_err(|e| format!("{}", e))
}
fn check_fee_payer_balance(config: &Config, required_balance: u64) -> Result<(), Error> {
let balance = config
.rpc_client
.get_balance_with_commitment(
&config.fee_payer.pubkey(),
CommitmentConfig {
commitment: CommitmentLevel::Processed,
},
)?
.value;
if balance < required_balance {
Err(format!(
"Fee payer, {}, has insufficient balance: {} required, {} available",
config.fee_payer.pubkey(),
lamports_to_sol(required_balance),
lamports_to_sol(balance)
)
.into())
} else {
Ok(())
}
}

@@ -1,6 +0,0 @@
[
137, 48, 205, 72, 161, 248, 163, 94, 139, 120, 220, 242, 186, 179, 180, 83, 3,
163, 116, 25, 245, 254, 238, 209, 175, 92, 226, 22, 17, 36, 54, 210, 73, 255,
153, 67, 139, 148, 145, 75, 96, 144, 236, 48, 114, 57, 68, 38, 255, 33, 203,
132, 76, 127, 89, 139, 16, 254, 175, 248, 249, 255, 20, 48
]

@@ -1,44 +0,0 @@
[package]
name = "icco_contributor"
version = "0.1.0"
description = "Created with Rocksalt"
edition = "2018"
[lib]
crate-type = ["cdylib", "lib"]
name = "icco_contributor"
[features]
no-entrypoint = ["solitaire/no-entrypoint", "rand"]
trace = ["solitaire/trace"]
wasm = ["no-entrypoint", "wasm-bindgen"]
client = ["solitaire-client", "solitaire/client", "no-entrypoint"]
cpi = ["no-entrypoint"]
default = []
[dependencies]
wormhole-bridge-solana = { git = "https://github.com/certusone/wormhole", branch="feat/token-bridge-proxy", features = ["no-entrypoint", "cpi"] }
borsh = "=0.9.1"
bstr = "0.2.16"
byteorder = "1.4.3"
rocksalt = { git = "https://github.com/certusone/wormhole", branch="feat/token-bridge-proxy" }
solitaire = { git = "https://github.com/certusone/wormhole", branch="feat/token-bridge-proxy" }
sha3 = "0.9.1"
solana-program = { version="=1.9.4" }
spl-token = { version = "=3.2.0", features = ["no-entrypoint"] }
primitive-types = { version = "0.9.0", default-features = false }
solitaire-client = { git = "https://github.com/certusone/wormhole", branch="feat/token-bridge-proxy", optional = true }
spl-token-metadata = { git = "https://github.com/certusone/wormhole", branch="feat/token-bridge-proxy" }
wasm-bindgen = { version = "0.2.74", features = ["serde-serialize"], optional = true }
serde = { version = "1.0", features = ["derive"] }
rand = { version = "0.7.3", optional = true }
wormhole-sdk = { git = "https://github.com/certusone/wormhole", branch="feat/token-bridge-proxy", features = ["devnet", "solana"] }
[dev-dependencies]
hex = "*"
hex-literal = "0.3.1"
libsecp256k1 = { version = "0.3.5", features = [] }
solana-client = "=1.9.4"
solana-sdk = "=1.9.4"
#spl-token = { version = "=3.2.0", features = ["no-entrypoint"] }
#spl-token-metadata = { path = "../token-metadata" }

@@ -1,2 +0,0 @@
[target.bpfel-unknown-unknown.dependencies.std]
features = []

@@ -1,99 +0,0 @@
use crate::types::*;
use bridge::{
accounts::BridgeData,
api::ForeignAddress,
};
use solana_program::pubkey::Pubkey;
use solitaire::{
processors::seeded::Seeded,
*,
};
pub type AuthoritySigner<'b> = Derive<Info<'b>, "authority_signer">;
pub type CustodySigner<'b> = Derive<Info<'b>, "custody_signer">;
pub type MintSigner<'b> = Derive<Info<'b>, "mint_signer">;
pub type CoreBridge<'a, const STATE: AccountState> = Data<'a, BridgeData, { STATE }>;
pub type EmitterAccount<'b> = Derive<Info<'b>, "emitter">;
///-------------------------------------------------------------------
/// Contributor Config.
pub type ConfigAccount<'b, const STATE: AccountState> =
Derive<Data<'b, Config, { STATE }>, "config">;
///-------------------------------------------------------------------
/// ICCO sale state. PDA derived from "state" + sale id.
pub type SaleStateAccount<'b, const STATE: AccountState> =
Data<'b, SaleState, { STATE }>;
pub struct SaleStateDerivationData {
pub sale_id: u128,
}
impl<'b, const STATE: AccountState> Seeded<&SaleStateDerivationData>
for SaleStateAccount<'b, { STATE }>
{
fn seeds(accs: &SaleStateDerivationData) -> Vec<Vec<u8>> {
vec![
String::from("state").as_bytes().to_vec(),
accs.sale_id.to_be_bytes().to_vec(),
]
}
}
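The `seeds` implementation above can be mirrored in plain std Rust to show the exact byte layout of the sale-state PDA seeds: the literal `"state"` followed by the `u128` sale id as 16 big-endian bytes (a sketch of the derivation data only, not the full PDA computation):

```rust
// Mirrors SaleStateDerivationData::seeds: two seed entries,
// the string "state" and the sale id in big-endian byte order.
fn sale_state_seeds(sale_id: u128) -> Vec<Vec<u8>> {
    vec![b"state".to_vec(), sale_id.to_be_bytes().to_vec()]
}

fn main() {
    let seeds = sale_state_seeds(1);
    assert_eq!(seeds[0], b"state".to_vec());
    assert_eq!(seeds[1].len(), 16); // u128 -> 16 bytes
    assert_eq!(seeds[1][15], 1);    // big-endian: the value sits in the last byte
}
```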
///-------------------------------------------------------------------
/// Custody Account.
pub type CustodyAccount<'b, const STATE: AccountState> = Data<'b, SplAccount, { STATE }>;
pub struct CustodyAccountDerivationData {
pub mint: Pubkey,
}
impl<'b, const STATE: AccountState> Seeded<&CustodyAccountDerivationData>
for CustodyAccount<'b, { STATE }>
{
fn seeds(accs: &CustodyAccountDerivationData) -> Vec<Vec<u8>> {
vec![accs.mint.to_bytes().to_vec()]
}
}
///-------------------------------------------------------------------
/// Registered chain endpoint
pub type Endpoint<'b, const STATE: AccountState> = Data<'b, EndpointRegistration, { STATE }>;
pub struct EndpointDerivationData {
pub emitter_chain: u16,
pub emitter_address: ForeignAddress,
}
/// Seeded implementation based on an incoming VAA
impl<'b, const STATE: AccountState> Seeded<&EndpointDerivationData> for Endpoint<'b, { STATE }> {
fn seeds(data: &EndpointDerivationData) -> Vec<Vec<u8>> {
vec![
data.emitter_chain.to_be_bytes().to_vec(),
data.emitter_address.to_vec(),
]
}
}
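The endpoint derivation works the same way, keyed by the registered emitter: a 2-byte big-endian chain id followed by the 32-byte foreign address. A std-only sketch (`endpoint_seeds` is a hypothetical helper name):

```rust
// Sketch: endpoint registration seeds, mirroring EndpointDerivationData::seeds().
// emitter_chain is a u16 (big-endian); emitter_address is a 32-byte foreign address.
fn endpoint_seeds(emitter_chain: u16, emitter_address: [u8; 32]) -> Vec<Vec<u8>> {
    vec![
        emitter_chain.to_be_bytes().to_vec(), // 2 bytes, big-endian
        emitter_address.to_vec(),             // 32-byte address
    ]
}

fn main() {
    // Chain id 2 is Ethereum in Wormhole's numbering.
    let seeds = endpoint_seeds(2, [0u8; 32]);
    assert_eq!(seeds[0], vec![0, 2]);
    assert_eq!(seeds[1].len(), 32);
    println!("ok");
}
```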
///-------------------------------------------------------------------
/// Token metadata.
pub type SplTokenMeta<'b> = Info<'b>;
pub struct SplTokenMetaDerivationData {
pub mint: Pubkey,
}
impl<'b> Seeded<&SplTokenMetaDerivationData> for SplTokenMeta<'b> {
fn seeds(data: &SplTokenMetaDerivationData) -> Vec<Vec<u8>> {
vec![
"metadata".as_bytes().to_vec(),
spl_token_metadata::id().as_ref().to_vec(), // Metaplex metadata PDAs include the metadata program id itself as a seed
data.mint.as_ref().to_vec(),
]
}
}
@ -1,7 +0,0 @@
pub mod initialize;
pub mod init_sale;
pub mod contribute;
pub use initialize::*;
pub use init_sale::*;
pub use contribute::*;
@ -1,101 +0,0 @@
#![allow(dead_code)]
#![allow(unused_must_use)]
#![allow(unused_imports)]
use std::mem::size_of_val;
use crate::{
messages::SaleInit,
accounts::{
ConfigAccount,
SaleStateAccount,
SaleStateDerivationData,
}
// types::*,
};
use solana_program::msg;
use solana_program::{
account_info::AccountInfo,
// program_error::ProgramError,
pubkey::Pubkey,
};
use solitaire::{
CreationLamports::Exempt,
*,
};
//use wormhole_sdk::{VAA};
use bridge::{
vaa::{
ClaimableVAA,
DeserializePayload,
PayloadMessage,
},
error::Error::{
VAAAlreadyExecuted,
VAAInvalid,
},
CHAIN_ID_SOLANA,
};
#[derive(FromAccounts)]
pub struct ContributeIccoSale<'b> {
pub payer: Mut<Signer<AccountInfo<'b>>>,
pub config: ConfigAccount<'b, { AccountState::Initialized }>, // Must be created by now
pub sale_state: SaleStateAccount<'b, { AccountState::Initialized }>, // Must already be created by init_sale
// TBD
pub init_sale_vaa: ClaimableVAA<'b, SaleInit>, // Was claimed.
// pub SaleVaa: Data<'b, SplAccount, { AccountState::Initialized }>
// pub from: Mut<Data<'b, SplAccount, { AccountState::Initialized }>>,
// pub mint: Mut<Data<'b, SplMint, { AccountState::Initialized }>>,
// pub custody: Mut<CustodyAccount<'b, { AccountState::MaybeInitialized }>>,
// pub clock: Sysvar<'b, Clock>,
}
impl<'a> From<&ContributeIccoSale<'a>> for SaleStateDerivationData {
fn from(accs: &ContributeIccoSale<'a>) -> Self {
SaleStateDerivationData {
sale_id: accs.init_sale_vaa.sale_id,
}
}
}
#[derive(BorshDeserialize, BorshSerialize, Default)]
pub struct ContributeIccoSaleData {
}
impl<'b> InstructionContext<'b> for ContributeIccoSale<'b> {
}
pub fn contribute_icco_sale(
_ctx: &ExecutionContext,
_accs: &mut ContributeIccoSale,
_data: ContributeIccoSaleData,
) -> Result<()> {
msg!("bbrp in contribute_icco_sale!");
// Code to create the custody account as needed:
// https://github.com/certusone/wormhole/blob/1792141307c3979b1f267af3e20cfc2f011d7051/solana/modules/token_bridge/program/src/api/transfer.rs#L159
// if !accs.custody.is_initialized() {
// accs.custody
// .create(&(&*accs).into(), ctx, accs.payer.key, Exempt)?;
// let init_ix = spl_token::instruction::initialize_account(
// &spl_token::id(),
// accs.custody.info().key,
// accs.mint.info().key,
// accs.custody_signer.key,
// )?;
// invoke_signed(&init_ix, ctx.accounts, &[])?;
// }
Ok(())
}
@ -1,173 +0,0 @@
use crate::{
accounts::{
ConfigAccount,
Endpoint,
EndpointDerivationData,
},
messages::{
GovernancePayloadUpgrade,
PayloadGovernanceRegisterChain,
},
// types::*,
TokenBridgeError::{
// InvalidChain,
InvalidGovernanceKey,
},
};
use bridge::{
vaa::{
ClaimableVAA,
DeserializePayload,
// PayloadMessage,
},
CHAIN_ID_SOLANA,
};
use solana_program::{
account_info::AccountInfo,
program::invoke_signed,
// program_error::ProgramError,
pubkey::Pubkey,
sysvar::{
clock::Clock,
rent::Rent,
},
};
use solitaire::{
processors::seeded::Seeded,
CreationLamports::Exempt,
*,
};
// use std::ops::{
// Deref,
// DerefMut,
// };
// Confirm that a ClaimableVAA came from the correct chain, signed by the right emitter.
fn verify_governance<'a, T>(vaa: &ClaimableVAA<'a, T>) -> Result<()>
where
T: DeserializePayload,
{
let expected_emitter = std::env!("EMITTER_ADDRESS");
let current_emitter = format!(
"{}",
Pubkey::new_from_array(vaa.message.meta().emitter_address)
);
// Fail if the emitter is not the known governance key, or the emitting chain is not Solana.
if expected_emitter != current_emitter || vaa.message.meta().emitter_chain != CHAIN_ID_SOLANA {
Err(InvalidGovernanceKey.into())
} else {
Ok(())
}
}
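The check above accepts a VAA only when both conditions hold: the emitter address matches the known governance key and the emitting chain is Solana. A std-only sketch of the same predicate (plain strings stand in for `Pubkey`; the function and constants here are illustrative, not the program's API):

```rust
// Sketch: the governance check as a pure function.
// Rejects a message unless it comes from the expected emitter on Solana.
const CHAIN_ID_SOLANA: u16 = 1; // Wormhole's chain id for Solana

fn verify_governance(expected: &str, emitter: &str, emitter_chain: u16) -> Result<(), String> {
    if expected != emitter || emitter_chain != CHAIN_ID_SOLANA {
        Err("InvalidGovernanceKey".to_string())
    } else {
        Ok(())
    }
}

fn main() {
    assert!(verify_governance("Gov111", "Gov111", CHAIN_ID_SOLANA).is_ok());
    assert!(verify_governance("Gov111", "Gov111", 2).is_err()); // right key, wrong chain
    assert!(verify_governance("Gov111", "Bad111", CHAIN_ID_SOLANA).is_err()); // wrong key
    println!("ok");
}
```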
#[derive(FromAccounts)]
pub struct UpgradeContract<'b> {
/// Payer for account creation (vaa-claim)
pub payer: Mut<Signer<Info<'b>>>,
/// GuardianSet change VAA
pub vaa: ClaimableVAA<'b, GovernancePayloadUpgrade>,
/// PDA authority for the loader
pub upgrade_authority: Derive<Info<'b>, "upgrade">,
/// Spill address for the upgrade excess lamports
pub spill: Mut<Info<'b>>,
/// New contract address.
pub buffer: Mut<Info<'b>>,
/// Required by the upgradeable uploader.
pub program_data: Mut<Info<'b>>,
/// Our own address, required by the upgradeable loader.
pub own_address: Mut<Info<'b>>,
// Various sysvar/program accounts needed for the upgradeable loader.
pub rent: Sysvar<'b, Rent>,
pub clock: Sysvar<'b, Clock>,
pub bpf_loader: Info<'b>,
pub system: Info<'b>,
}
impl<'b> InstructionContext<'b> for UpgradeContract<'b> {
}
#[derive(BorshDeserialize, BorshSerialize, Default)]
pub struct UpgradeContractData {}
pub fn upgrade_contract(
ctx: &ExecutionContext,
accs: &mut UpgradeContract,
_data: UpgradeContractData,
) -> Result<()> {
verify_governance(&accs.vaa)?;
accs.vaa.verify(ctx.program_id)?;
accs.vaa.claim(ctx, accs.payer.key)?;
let upgrade_ix = solana_program::bpf_loader_upgradeable::upgrade(
ctx.program_id,
&accs.vaa.message.new_contract,
accs.upgrade_authority.key,
accs.spill.key,
);
let seeds = accs
.upgrade_authority
.self_bumped_seeds(None, ctx.program_id);
let seeds: Vec<&[u8]> = seeds.iter().map(|item| item.as_slice()).collect();
let seeds = seeds.as_slice();
invoke_signed(&upgrade_ix, ctx.accounts, &[seeds])?;
Ok(())
}
#[derive(FromAccounts)]
pub struct RegisterChain<'b> {
pub payer: Mut<Signer<AccountInfo<'b>>>,
pub config: ConfigAccount<'b, { AccountState::Initialized }>,
pub endpoint: Mut<Endpoint<'b, { AccountState::Uninitialized }>>,
pub vaa: ClaimableVAA<'b, PayloadGovernanceRegisterChain>,
}
impl<'a> From<&RegisterChain<'a>> for EndpointDerivationData {
fn from(accs: &RegisterChain<'a>) -> Self {
EndpointDerivationData {
emitter_chain: accs.vaa.chain,
emitter_address: accs.vaa.endpoint_address,
}
}
}
impl<'b> InstructionContext<'b> for RegisterChain<'b> {
}
#[derive(BorshDeserialize, BorshSerialize, Default)]
pub struct RegisterChainData {}
pub fn register_chain(
ctx: &ExecutionContext,
accs: &mut RegisterChain,
_data: RegisterChainData,
) -> Result<()> {
let derivation_data: EndpointDerivationData = (&*accs).into();
accs.endpoint
.verify_derivation(ctx.program_id, &derivation_data)?;
// Claim VAA
verify_governance(&accs.vaa)?;
accs.vaa.verify(ctx.program_id)?;
accs.vaa.claim(ctx, accs.payer.key)?;
// Create endpoint
accs.endpoint
.create(&((&*accs).into()), ctx, accs.payer.key, Exempt)?;
accs.endpoint.chain = accs.vaa.chain;
accs.endpoint.contract = accs.vaa.endpoint_address;
Ok(())
}
@ -1,133 +0,0 @@
#![allow(dead_code)]
#![allow(unused_must_use)]
#![allow(unused_imports)]
use core::convert::TryInto;
//use std::mem::size_of_val;
use std::{
error::Error,
io::{
Cursor,
Read,
Write,
},
// str::Utf8Error,
// string::FromUtf8Error,
};
use byteorder::{
BigEndian,
ReadBytesExt,
WriteBytesExt,
};
use crate::{
messages::SaleInit,
accounts::{
ConfigAccount,
SaleStateAccount,
SaleStateDerivationData,
},
};
use crate:: {
errors::Error::{
VAAInvalidEmitterChain,
}
};
use solana_program::msg;
use solana_program::{
account_info::AccountInfo,
// program_error::ProgramError,
pubkey::Pubkey,
};
use solitaire::{
SolitaireError,
CreationLamports::Exempt,
*,
};
use wormhole_sdk::{VAA};
use bridge::{
vaa::{
ClaimableVAA,
DeserializePayload,
PayloadMessage,
},
error::Error::{
VAAAlreadyExecuted,
VAAInvalid,
},
CHAIN_ID_SOLANA,
};
#[derive(FromAccounts)]
pub struct InitIccoSale<'b> {
pub payer: Mut<Signer<AccountInfo<'b>>>,
pub config: ConfigAccount<'b, { AccountState::Initialized }>, // Must be created before Init
pub sale_state: SaleStateAccount<'b, { AccountState::Uninitialized }>, // Must not be created yet
// TBD
pub init_sale_vaa: ClaimableVAA<'b, SaleInit>,
}
impl<'a> From<&InitIccoSale<'a>> for SaleStateDerivationData {
fn from(accs: &InitIccoSale<'a>) -> Self {
SaleStateDerivationData {
sale_id: accs.init_sale_vaa.sale_id,
}
}
}
// No data so far. All is in VAA Account
#[derive(BorshDeserialize, BorshSerialize, Default)]
pub struct InitIccoSaleData {
}
// impl<'b> InstructionContext<'b> for InitIccoSale<'b> {
// }
pub fn init_icco_sale(
ctx: &ExecutionContext,
accs: &mut InitIccoSale,
_data: InitIccoSaleData,
) -> Result<()> {
msg!("bbrp in init_icco_sale!");
/* if accs.init_sale_vaa.payloadID != 1 {
msg!("bbrp init_icco_sale bad chain");
return Err(VAAInvalidEmitterChain.into());
}
*/
// --- vvv This is just for testing. Needs to go away.
// let sale_id = accs.init_sale_vaa.get_sale_id(&accs.init_sale_vaa.meta().payload[..]);
let sale_id = accs.init_sale_vaa.sale_id;
if sale_id != 1 {
msg!("bbrp init_icco_sale bad sale id");
return Err(VAAInvalidEmitterChain.into());
}
if accs.init_sale_vaa.meta().emitter_chain != 2 {
msg!("bbrp init_icco_sale bad VAA emitter chain");
return Err(VAAInvalidEmitterChain.into());
}
// --- ^^^ This is just for testing
// Create the state account (it was Uninitialized coming in).
accs.sale_state.create(&(&*accs).into(), ctx, accs.payer.key, Exempt)?;
// Check that all Solana tokens exist. Custody accounts, along with the contribution info PDA accounts, are created on the first contribution to each token.
// If all is good, prevent VAA double processing:
accs.init_sale_vaa.verify(ctx.program_id)?;
accs.init_sale_vaa.claim(ctx, accs.payer.key)?;
Ok(())
}
@ -1,55 +0,0 @@
#![allow(dead_code)]
#![allow(unused_must_use)]
#![allow(unused_imports)]
use std::mem::size_of_val;
use crate::{
accounts::ConfigAccount,
// types::*,
};
use solana_program::msg;
use solana_program::{
account_info::AccountInfo,
// program_error::ProgramError,
pubkey::Pubkey,
};
use solitaire::{
CreationLamports::Exempt,
*,
};
// use std::ops::{
// Deref,
// DerefMut,
// };
#[derive(FromAccounts)]
pub struct Initialize<'b> {
pub payer: Mut<Signer<AccountInfo<'b>>>,
pub config: Mut<ConfigAccount<'b, { AccountState::Uninitialized }>>,
}
// Config account and InitializeData - only stores bridge address.
#[derive(BorshDeserialize, BorshSerialize, Default)]
pub struct InitializeData {
pub bridge: Pubkey,
pub conductor: Pubkey,
}
impl<'b> InstructionContext<'b> for Initialize<'b> {
}
pub fn initialize(
ctx: &ExecutionContext,
accs: &mut Initialize,
data: InitializeData,
) -> Result<()> {
// bbrp - local only. Print bridge and conductor.
// msg!("bbrp in icco initialize {} {}", data.bridge, data.conductor);
// Create the config account.
accs.config.create(ctx, accs.payer.key, Exempt)?;
accs.config.wormhole_bridge = data.bridge;
accs.config.icco_conductor = data.conductor;
Ok(())
}
@ -1,21 +0,0 @@
//! Define application level errors that can be returned by the various instruction handlers that
//! make up the wormhole bridge.
use crate::trace;
use solitaire::SolitaireError;
#[derive(Debug)]
pub enum Error {
VAAAlreadyExecuted,
VAAInvalidEmitterChain,
VAAInvalid,
}
/// Errors thrown by the program will bubble up to the solitaire wrapper, which needs a way to
/// translate these errors into something Solitaire can log and handle.
impl From<Error> for SolitaireError {
fn from(e: Error) -> SolitaireError {
trace!("ProgramError: {:?}", e);
SolitaireError::Custom(e as u64)
}
}