Merge branch 'master' of https://github.com/metaplex-foundation/metaplex into uploadFile4

This commit is contained in:
adamjeffries 2021-09-26 09:18:20 -05:00
commit b5dd55c34e
No known key found for this signature in database
GPG Key ID: 71C4DC054797A48C
144 changed files with 20813 additions and 2224 deletions

.github/workflows/cli-pull-request.yml

@ -0,0 +1,34 @@
name: Pull Request (CLI)
on:
pull_request:
paths:
- js/packages/cli/*
push:
branches:
- master
paths:
- js/packages/cli/*
jobs:
unit_tests:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v1
- uses: actions/setup-node@v2
with:
node-version: "14"
- uses: actions/cache@v2
with:
path: "**/node_modules"
key: ${{ runner.os }}-modules-${{ hashFiles('**/yarn.lock') }}
- name: Install modules
run: yarn install
working-directory: js/packages/cli
- name: Run Tests
run: yarn test
working-directory: js/packages/cli

.gitignore

@ -30,3 +30,4 @@ hfuzz_workspace
**/*.so
**/.DS_Store
.cache
js/packages/web/.env


@ -14,7 +14,7 @@ This is the bedrock contract of the entire ecosystem. All that you need to inter
Furthermore, if your mint has one token in its supply, you can give it an additional decoration PDA, of type MasterEdition. This PDA denotes the mint as a special type of object that can mint other mints - which we call Editions (as opposed to MasterEditions because they can't print other mints themselves). This makes this mint like the "master records" that record studios used to use to make new copies of records back in the day. The MasterEdition PDA will take away minting and freezing authority from you in the process and will contain information about total supply, maximum possible supply, etc.
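The MasterEdition bookkeeping described above can be pictured with a toy TypeScript model (this is purely illustrative, not the on-chain program; the class and method names are invented):

```typescript
// Toy model of MasterEdition supply bookkeeping: each print consumes
// the next edition number, and printing stops once max_supply (when
// set) is reached.
class MasterEdition {
  supply = 0;
  // null models "no maximum"; 0 models a one-of-a-kind that cannot print.
  constructor(public maxSupply: number | null) {}

  printEdition(): number {
    if (this.maxSupply !== null && this.supply >= this.maxSupply) {
      throw new Error('max supply reached');
    }
    this.supply += 1;
    return this.supply; // the new Edition's edition number
  }
}
```

A MasterEdition with `maxSupply` 0 is exactly the "one of a kind" case: it can never print an Edition.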
The existence of Metadata and its sister PDA MasterEdition makes a very powerful combination for a mint that enables the entire rest of the Metaplex contract stack. Now you can create:
- Normal mints that just have names (Metadata but no MasterEdition)
- One of a kind NFTs (Metadata + MasterEdition with `max_supply` of 0)
@ -35,7 +35,7 @@ When there are outstanding shares, you cannot, as the vault owner, **Combine** t
### Auction
The Auction Contract represents an auction primitive, and it knows nothing about NFTs, or Metadata, or anything else in the Metaplex ecosystem. All it cares about is that it has a resource address, it has auction mechanics, and it is using those auction mechanics to auction off that resource. It currently supports English Auctions and Open Edition Auctions (no winners but bids are tracked.) Its only purpose is to track who won what place in an auction and to collect money for those wins. When you place bids, or cancel them, you are interacting with this contract. However, when you redeem bids, you are not interacting with this contract, but Metaplex, because while it can provide proof that you did indeed win 4th place, it has no opinion on how the resource being auctioned off is divvied up between 1st, 2nd, 3rd, and 4th place winners, for example.
This contract will be expanded in the future to include other auction types, and better guarantees that the auctioneer claiming the bid has actually provided the prize, by having the winner sign a PDA saying that they received it. Right now this primitive contract should *not* be used in isolation, but in companionship with another contract (like Metaplex in our case) that makes such guarantees that prizes are delivered if prizes are won.
@ -48,7 +48,7 @@ This is the granddaddy contract of them all. The primary product of the Metaplex
- Full Rights Transfers (Giving away token + metadata ownership)
- Single Token Transfers (Giving away a token but not metadata ownership)
It orchestrates disbursements of those contents to winners of an auction. An AuctionManager requires both a Vault and an Auction to run, and it requires that the Auction's resource key be set to the Vault.
Due to each type of NFT transfer above requiring slightly different nuanced handling and checking, Metaplex handles knowing about those things, and making the different CPI calls to the Token Metadata contract to make those things happen as required during the redemption phase. It also has full authority over all the objects like Vault and Auction, and handles all royalties payments by collecting funds from the auction into its own central escrow account and then disbursing to artists.
@ -98,11 +98,11 @@ Get ready and grab some aspirin. Here we go!
### Overview
The Token Metadata contract can be used for storing generic metadata about any given mint, whether NFT or not. Metadata allows storage of name, symbol, and URI to an external resource. Additionally, the Metadata allows for the tracking of creators, primary sales, and seller fees. Once the mint has been created, the mint authority can use the SPL Metadata program to create metadata as described in this document.
Minting an NFT requires creating a new SPL Mint with the supply of one and decimals zero as described [https://spl.solana.com/token#example-create-a-non-fungible-token](https://spl.solana.com/token#example-create-a-non-fungible-token)
Below is the Rust representation of the structs that are stored on-chain.
```rust
@ -200,7 +200,7 @@ Master Edition accounts are PDA addresses of `['metaplex', metaplex_program_id,
An edition represents a copy of an NFT, and is created from a Master Edition. Each print has an edition number associated with it. Normally, prints can be created during Open Edition or Limited Edition auction, but they could also be created by the creator manually.
Editions are created by presenting the Master Edition token, along with a new mint that lacks a Metadata account and a token account containing one token from that mint to the `mint_new_edition_from_master_edition_via_token` endpoint. This endpoint will create both an immutable Metadata based on the parent Metadata and a special Edition struct based on the parent Master Edition struct.
The Edition has the same PDA as a Master Edition to force collision and prevent a user from having a mint with both, `['metaplex', metaplex_program_id, mint_id, 'edition']`.
@ -208,7 +208,7 @@ The Edition has the same PDA as a Master Edition to force collision and prevent
### Decoration as PDA Extensions
The whole idea of the Token Metadata program is to be a decorator to a Token Mint. Each struct acts as further decoration. The Metadata struct gives a mint a name and a symbol and points to some external URI that can be anything. The Master Edition gives it printing capabilities. The Edition labels it as a child of something.
This is important to internalize, because it means you as a Rust developer can take it a step further. There is nothing stopping you from building a new contract on top of ours that makes its own PDAs, extending this still further. Why not build a CookingRecipes PDA that has seed `['your-app', your_program_id, mint_id, 'recipes']`? You can require that a Metadata PDA from our contract exists to make a PDA in your program, and then you can further decorate mints on top of our decorations. The idea is to compose mints with further information than they ever had before, and then build clients that can consume that information in new and interesting ways.
@ -245,7 +245,7 @@ The URI resource is compatible with [ERC-1155 JSON Schema](https://github.com/et
},
"seller_fee_basis_points": {
"type": "number",
},
"properties": {
"type": "object",
@ -376,7 +376,7 @@ Safety Deposit Boxes always have PDA addresses of type `['vault', vault_key, min
### External Price Account
The External Price Account is meant to be used as an external oracle. It is provided to a Vault on initialization and doesn't need to be owned or controlled by the vault authority (though it can be.) It can provide data on the `price_per_share` of fractional shares, whether or not the vault authority is currently allowed to **Combine** the vault and reclaim the contents, and what the `price_mint` of the vault is.
ExternalPriceAccounts do not have PDA addresses.
@ -456,6 +456,8 @@ pub struct AuctionDataExtended {
pub tick_size: Option<u64>,
/// gap_tick_size_percentage - two decimal points
pub gap_tick_size_percentage: Option<u8>,
/// auction name
pub name: Option<[u8; 32]>,
}
/// Define valid auction state transitions.
@ -543,11 +545,11 @@ AuctionData accounts always have PDA addresses of `['auction', auction_program_i
### Bid State
Bid State is technically not a top level struct, but an embedded one within AuctionData. I thought it was good to give it a section anyway because it's a complex little beast. It's actually an enum that holds a bid vector and a maximum size denoting how many of those bids are actually "valid winners" vs just placeholders.
It's reversed, which is to say that the number one winner is always at the end of the vec. It's also generally bigger than the number of winners, so that if a bid is cancelled, some people who got bumped out of top spots can be moved back into them without having to cancel and replace their bids. When a bid is placed, it is inserted in the proper position based on its amount, and then the lowest bidder is bumped off the 0th position of the vec if the vec is at max size, so the vec remains sorted at all times.
In the case of open edition, the max is always zero, i.e. there are never any winners, and we are just accepting bids and creating BidderMetadata tickets and BidderPots to accept payment for (probably) fixed price Participation NFTs.
We would prefer that OpenEdition enum have no bid vector and no max, but unfortunately borsh-js does not support enums with different internal data structures, so all data structures in an enum must be identical (even if unused.) Keep that in mind when designing your own end to end borsh implementations!
@ -567,7 +569,7 @@ BidderPot always has a PDA of `['auction', auction_program_id, auction_id, bidde
If you've read this far, you now get to witness my personal shame. So as it turns out, if you build a complex enough program with enough structs flying around, there is some kind of weird interaction in the Metaplex contract that causes it to blow out with an access violation if you add more than a certain number of keys to one particular struct (AuctionData), and *only* during the redemption endpoint calls. We were unable to discern why this was across 3 days of debugging. We had a theory it was due to some issue with borsh but it is not 100% certain, as we're not experts with that library's internals.
Instead, our work-around was to introduce AuctionDataExtended to add new fields that we needed to AuctionData without breaking this hidden bug that seems to exist. What is odd about the whole thing is adding fields to *other* structs doesn't cause any issues. In the future I'd love to have someone who knows way more than me about these subjects weigh in and tell me what I did wrong here to resolve this split-brain problem! We also don't have reverse lookup capability (Resource key on AuctionData) because of this bug - adding it would cause the blow out I mentioned.
Another note: as of this writing, `gap_tick_size_percentage` has not been implemented yet; it is just a dummy field.
@ -860,7 +862,7 @@ The instruction set for metaplex can be found here: [https://github.com/metaplex
### AuctionManager
This is the top level struct of the entire contract and serves as a container for "all the things." When you make auctions on Metaplex, you are actually really making these ultimately. An AuctionManager has a single authority (you, the auctioneer), a store, which is the storefront struct, an Auction from the auction contract, and a Vault from the vault contract. It also has a token account called `accept_payment` that serves as a central clearing escrow for all tokens that it will collect in the future from the winning bidders and all payments for fixed price Participation NFTs from all non-winners in the auction.
It contains embedded within it a separate `state` and `settings` struct. It is seeded with the `settings` on initialization by the caller, while the `state` is derived from `settings` on initialization. AuctionManager goes through several states:
@ -872,7 +874,7 @@ It contains embedded within it a separate `state` and `settings` struct. It is s
**Disbursing**: The underlying Auction is over and now the AuctionManager is in the business of disbursing royalties to the auctioneer and creators, prizes and participation NFTs to the winners, and possibly participation NFTs to the non-winners.
**Finished:** All funds and prizes disbursed.
This state is not currently in use: switching to it requires iterating over prizes to check that every item has been claimed, which either costs CPU that is too precious during the redemption call or requires adding a new endpoint that is not guaranteed to be called. We will revisit it later during a refactoring; for now it is considered a NOOP state.
@ -882,7 +884,7 @@ AuctionManagers always have PDAs of seed `['metaplex', metaplex_program_id, auct
AuctionManagerSettings is an embedded struct inside AuctionManager but is deserving of its own section. This struct is actually provided by the user in the `init_auction_manager` call to parameterize the AuctionManager with who is winning what and whether or not there is a participation NFT. It is fairly straightforward - each entry in the WinningConfig vec stands for a given winning place in the Auction. The 0th entry is the WinningConfig for the 1st place winner. A WinningConfig has many WinningConfigItems, and each WinningConfigItem in the 0th WinningConfig maps to a Vault SafetyDepositBox that the 1st place winner gets items from. You can therefore configure quite arbitrary Auctions this way.
This setup is actually quite redundant and will likely change in the future to a setup where a WinningConfigItem is the top level structure and it simply declares which winners will receive it, because if you wish for multiple winners to receive prints from the same Master Edition, the WinningConfigItem must right now be duplicated across each WinningConfig.
The Participation Config is optional, but has enums describing how it will behave for winners and for non-winners, whether or not it has a price associated with it, and what safety deposit box contains its printing tokens.
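A hypothetical TypeScript mirror of this shape makes the redundancy easy to see. The field names here approximate the Rust structs and are not the exact on-chain layout:

```typescript
// Illustrative mirror of AuctionManagerSettings, not the real layout.
type WinningConfigItem = {
  safetyDepositBoxIndex: number; // which vault box the prize comes from
  amount: number;                // how many tokens the winner receives
};

type WinningConfig = { items: WinningConfigItem[] };

interface AuctionManagerSettings {
  // 0th entry is the WinningConfig for the 1st place winner, and so on.
  winningConfigs: WinningConfig[];
  // Optional Participation NFT configuration.
  participationConfig?: {
    safetyDepositBoxIndex: number;
    fixedPrice?: number;
  };
}

// Two winners printing from the same Master Edition in box 0: the
// WinningConfigItem must currently be duplicated across both configs.
const settings: AuctionManagerSettings = {
  winningConfigs: [
    { items: [{ safetyDepositBoxIndex: 0, amount: 1 }] }, // 1st place
    { items: [{ safetyDepositBoxIndex: 0, amount: 1 }] }, // 2nd place
  ],
};
```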
@ -902,7 +904,7 @@ BidRedemptionTickets always have PDAs of `['metaplex', auction_id, bidder_metada
### PayoutTicket
For each creator, for each metadata(WinningConfigItem), for each winning place(WinningConfig) in an Auction, a PayoutTicket is created to record the sliver of income generated for that creator. There is also one made for the Auctioneer for every such case. And yes, it really is that specific. This means that a given creator may have quite a few PayoutTickets for a single AuctionManager, but each one represents a slightly different royalty payout.
For instance, 1st place may have three items with 3 unique metadata won while 2nd place may have 4 metadata from 4 items, every item with a single unique creator. The split of funds in the 1st place is going to be 3 ways, while in 2nd place would be 4 ways. Even if 1st and 2nd place bids are the same, we want two records to reflect the royalties paid from 1st and 2nd place, because they would be different numbers in this case, and we want to preserve history.
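The arithmetic behind those different splits can be sketched as follows. `payoutTickets` is an invented helper that ignores seller fees and the auctioneer's cut; it exists purely to show why equal bids can still produce different PayoutTickets:

```typescript
// Illustrative only: split one winning bid evenly across the items in
// the basket, then evenly across each item's creators, accumulating a
// per-creator payout (one "ticket" per creator).
function payoutTickets(
  bid: number,
  creatorsPerItem: string[][],
): Map<string, number> {
  const perItem = bid / creatorsPerItem.length;
  const tickets = new Map<string, number>();
  for (const creators of creatorsPerItem) {
    for (const creator of creators) {
      tickets.set(creator, (tickets.get(creator) ?? 0) + perItem / creators.length);
    }
  }
  return tickets;
}

// Equal 300-token bids, different baskets: 1st place splits 3 ways
// (100 per creator), 2nd place splits 4 ways (75 per creator).
const firstPlace = payoutTickets(300, [['a'], ['b'], ['c']]);
const secondPlace = payoutTickets(300, [['a'], ['b'], ['c'], ['d']]);
```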
@ -958,9 +960,9 @@ Note that owning the token itself is the *only* requirement for using the `updat
Metadata come locked and stocked with arrays of creators, each with their own `share` and all guaranteed to sum to 100. The Metadata itself has a `seller_fee_basis_points` field that represents the share creators get out of the proceeds in any secondary sale and a `primary_sale_happened` boolean that distinguishes to the world whether or not this particular Metadata has experienced its first sale or not. With all of this, Metaplex is able to do complete Royalty calculations after an Auction is over. It was mentioned above that on initialization, the Metaplex contract snapshots for each Metadata being sold the `primary_sale_happened` just in case the boolean is flipped during the auction so that royalties are calculated as-of initiation - this is important to note.
At the end of the auction, anybody (permissionless) can cycle through each winning bid in the contract and ask the Metaplex contract to use its authority to call the Auction contract and pump the winning bid monies into the `accept_payment` escrow account via `claim_bid`. Once all winning bids have been settled into here, royalties are eligible to be paid out. We'll cover payouts of fixed price Participation NFTs separately.
Now, anybody (permissionless) can cycle through each creator PLUS the auctioneer on each item in each winning bid and call `empty_payment_account` with an Associated Token Account owned by that creator or auctioneer. That action uses the creator's or auctioneer's share of that item's metadata, and the fractional percentage that item represents of the overall winning basket, to pay out the creator or auctioneer from the escrow.
Our front end implementation immediately calls the `update_primary_sale_happened` endpoint on token metadata for any token once redeemed for users so that if they re-sell, the `primary_sale_happened` boolean is taken into account in the `empty_payment_account` logic and only the basis points given in `seller_fee_basis_points` goes to the creators instead of the whole pie. The remaining part of the pie goes to the auctioneer doing the reselling.
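A minimal sketch of that primary-vs-secondary split, treating the creators as one pooled share and ignoring the per-item, per-creator breakdown above (`splitSale` is illustrative, not the contract's function):

```typescript
// On a primary sale the whole pie goes to the creators; once
// primary_sale_happened is set, only seller_fee_basis_points of the
// sale does, and the seller (auctioneer) keeps the rest.
function splitSale(
  price: number,
  sellerFeeBasisPoints: number,
  primarySaleHappened: boolean,
): { creators: number; seller: number } {
  if (!primarySaleHappened) {
    return { creators: price, seller: 0 };
  }
  const creators = (price * sellerFeeBasisPoints) / 10000;
  return { creators, seller: price - creators };
}
```

So with `seller_fee_basis_points` of 500 (5%), a 100-token resale sends 5 tokens to the creators and 95 to the reseller.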
@ -972,9 +974,9 @@ Note because our front end implementation chooses to use SOL instead of a generi
### Validation
Just because you provide a vault to an AuctionManager and an AuctionManagerSettings declaring this vault is filled with wonderful prizes *does not* mean that Metaplex will believe you. For every safety deposit box indexed in a WinningConfigItem, there must be a call to `validate_safety_deposit_box` after initiation where the safety deposit box is provided for inspection to the Metaplex contract so that it can verify that there are enough tokens, and of the right type, to pay off all winners in the auction.
Given how irritating this process is, we may in the future merge token-vault with metaplex, or simply copy over the parts of it that are relevant, leaving token-vault out for those interested in experimenting with fractionalization.
### Unwon Items


@ -1,6 +1,6 @@
## Setup
Be sure to be running Node v12.16.2 and yarn version 1.22.10.
Be sure to be running Node v14.17.6 and yarn version 1.22.10.
`yarn bootstrap`


@ -8,7 +8,7 @@
"keywords": [],
"license": "Apache-2.0",
"engines": {
"node": ">=6.0.0"
"node": "~14.17"
},
"scripts": {
"bootstrap": "lerna link && lerna bootstrap",
@ -29,6 +29,9 @@
"eslint --cache --fix --max-warnings=0"
]
},
"resolutions": {
"@types/react": "^17.0.16"
},
"husky": {
"hooks": {
"pre-commit": "lint-staged"
@ -59,11 +62,11 @@
"@typescript-eslint/eslint-plugin": "^4.6.0",
"@typescript-eslint/parser": "^4.6.0",
"eslint-plugin-react": "^7.25.1",
"eslint": "^6.6.0",
"eslint": "^7.11.0",
"eslint-config-prettier": "^6.15.0",
"gh-pages": "^3.1.0",
"husky": "^4.3.0",
"jest": "24.9.0",
"jest": "26.6.0",
"jest-config": "24.9.0",
"lerna": "3.22.1",
"lint-staged": "^10.5.0",

js/packages/cli/.nvmrc

@ -0,0 +1 @@
14.17.0


@ -91,10 +91,10 @@ metaplex create_candy_machine -k ~/.config/solana/id.json -p 1
ts-node cli create_candy_machine -k ~/.config/solana/id.json -p 3
```
4. Set the start date for your candy machine.
4. Set the start date and update the price of your candy machine.
```
metaplex set_start_date -k ~/.config/solana/id.json -d "20 Apr 2021 04:20:00 GMT"
ts-node cli set_start_date -k ~/.config/solana/id.json -d "20 Apr 2021 04:20:00 GMT"
metaplex update_candy_machine -k ~/.config/solana/id.json -d "20 Apr 2021 04:20:00 GMT" -p 0.1
ts-node cli update_candy_machine -k ~/.config/solana/id.json -d "20 Apr 2021 04:20:00 GMT" -p 0.1
```
5. Test mint a token (provided it's after the start date)


@ -1,37 +1,64 @@
{
"name": "@metaplex/cli",
"version": "0.0.1",
"version": "0.0.2",
"main": "./build/cli.js",
"license": "MIT",
"bin": {
"metaplex": "./build/cli.js"
"metaplex": "./build/candy-machine-cli.js"
},
"scripts": {
"build": "tsc -p ./src",
"watch": "tsc -w -p ./src",
"package:linux": "pkg . --no-bytecode --targets node14-linux-x64 --output bin/linux/metaplex",
"package:linuxb": "pkg . --targets node14-linux-x64 --output bin/linux/metaplex",
"package:macos": "pkg . --no-bytecode --targets node14-macos-x64 --output bin/macos/metaplex",
"package:win": "pkg . --targets node14-win-x64 --output bin/win/metaplex",
"package:macos": "pkg . --targets node14-macos-x64 --output bin/macos/metaplex",
"format": "prettier --loglevel warn --write \"**/*.{ts,js,json,yaml}\"",
"format:check": "prettier --loglevel warn --check \"**/*.{ts,js,json,yaml}\"",
"lint": "eslint \"src/**/*.ts\" --fix",
"lint:check": "eslint \"src/**/*.ts\""
"lint:check": "eslint \"src/**/*.ts\"",
"test": "jest"
},
"pkg": {
"scripts": "./build/**/*.js"
"scripts": "./build/**/*.{js|json}"
},
"babel": {
"presets": [
[
"@babel/preset-env",
{
"targets": {
"node": "current"
}
}
],
"@babel/preset-typescript"
]
},
"jest": {
"testPathIgnorePatterns": [
"<rootDir>/build/",
"<rootDir>/node_modules/"
]
},
"dependencies": {
"@project-serum/anchor": "^0.13.2",
"@project-serum/anchor": "^0.14.0",
"@solana/spl-token": "^0.1.8",
"arweave": "^1.10.16",
"bn.js": "^5.2.0",
"borsh": "^0.4.0",
"commander": "^8.1.0",
"form-data": "^4.0.0",
"ipfs-http-client": "^52.0.3",
"jsonschema": "^1.4.0",
"loglevel": "^1.7.1",
"node-fetch": "^2.6.1"
},
"devDependencies": {
"@babel/preset-env": "^7.15.6",
"@babel/preset-typescript": "^7.15.0",
"@types/jest": "^27.0.1",
"jest": "^27.2.0",
"pkg": "^5.3.1",
"typescript": "^4.3.5"
}


@ -0,0 +1,653 @@
#!/usr/bin/env ts-node
import * as fs from 'fs';
import * as path from 'path';
import { program } from 'commander';
import * as anchor from '@project-serum/anchor';
import BN from 'bn.js';
import fetch from 'node-fetch';
import { fromUTF8Array, parseDate, parsePrice } from './helpers/various';
import { Token, TOKEN_PROGRAM_ID } from '@solana/spl-token';
import { PublicKey } from '@solana/web3.js';
import {
CACHE_PATH,
CONFIG_ARRAY_START,
CONFIG_LINE_SIZE,
EXTENSION_JSON,
EXTENSION_PNG,
} from './helpers/constants';
import {
getCandyMachineAddress,
loadCandyProgram,
loadWalletKey,
} from './helpers/accounts';
import { Config } from './types';
import { upload } from './commands/upload';
import { verifyTokenMetadata } from './commands/verifyTokenMetadata';
import { loadCache, saveCache } from './helpers/cache';
import { mint } from './commands/mint';
import { signMetadata } from './commands/sign';
import { signAllMetadataFromCandyMachine } from './commands/signAll';
import log from 'loglevel';
program.version('0.0.2');
if (!fs.existsSync(CACHE_PATH)) {
fs.mkdirSync(CACHE_PATH);
}
log.setLevel(log.levels.INFO);
programCommand('upload')
.argument(
'<directory>',
'Directory containing images named from 0-n',
val => {
return fs.readdirSync(`${val}`).map(file => path.join(val, file));
},
)
.option('-n, --number <number>', 'Number of images to upload')
.option(
'-s, --storage <string>',
'Database to use for storage (arweave, ipfs)',
'arweave',
)
.option(
'--ipfs-infura-project-id <string>',
'Infura IPFS project id (required if using IPFS)',
)
.option(
'--ipfs-infura-secret <string>',
'Infura IPFS secret key (required if using IPFS)',
)
.option('--no-retain-authority', 'Do not retain authority to update metadata')
.action(async (files: string[], options, cmd) => {
const {
number,
keypair,
env,
cacheName,
storage,
ipfsInfuraProjectId,
ipfsInfuraSecret,
retainAuthority,
} = cmd.opts();
if (storage === 'ipfs' && (!ipfsInfuraProjectId || !ipfsInfuraSecret)) {
throw new Error(
'IPFS selected as storage option but Infura project id or secret key were not provided.',
);
}
if (!(storage === 'arweave' || storage === 'ipfs')) {
throw new Error("Storage option must either be 'arweave' or 'ipfs'.");
}
const ipfsCredentials = {
projectId: ipfsInfuraProjectId,
secretKey: ipfsInfuraSecret,
};
const pngFileCount = files.filter(it => {
return it.endsWith(EXTENSION_PNG);
}).length;
const jsonFileCount = files.filter(it => {
return it.endsWith(EXTENSION_JSON);
}).length;
const parsedNumber = parseInt(number);
const elemCount = parsedNumber ? parsedNumber : pngFileCount;
if (pngFileCount !== jsonFileCount) {
throw new Error(
`number of png files (${pngFileCount}) is different than the number of json files (${jsonFileCount})`,
);
}
if (elemCount < pngFileCount) {
throw new Error(
`max number (${elemCount}) cannot be smaller than the number of elements in the source folder (${pngFileCount})`,
);
}
log.info(`Beginning the upload for ${elemCount} (png+json) pairs`);
const startMs = Date.now();
log.info('started at: ' + startMs.toString());
let warn = false;
for (;;) {
const successful = await upload(
files,
cacheName,
env,
keypair,
elemCount,
storage,
retainAuthority,
ipfsCredentials,
);
if (successful) {
warn = false;
break;
} else {
warn = true;
log.warn('upload was not successful, rerunning');
}
}
const endMs = Date.now();
const timeTaken = new Date(endMs - startMs).toISOString().substr(11, 8);
log.info(
`ended at: ${new Date(endMs).toISOString()}. time taken: ${timeTaken}`,
);
if (warn) {
log.info('not all images have been uploaded, rerun this step.');
}
});
programCommand('verify_token_metadata')
.argument(
'<directory>',
'Directory containing images and metadata files named from 0-n',
val => {
return fs
.readdirSync(`${val}`)
.map(file => path.join(process.cwd(), val, file));
},
)
.option('-n, --number <number>', 'Number of image/metadata pairs to verify')
.action((files: string[], options, cmd) => {
const { number } = cmd.opts();
const startMs = Date.now();
log.info('started at: ' + startMs.toString());
verifyTokenMetadata({ files, uploadElementsCount: number });
const endMs = Date.now();
const timeTaken = new Date(endMs - startMs).toISOString().substr(11, 8);
log.info(
`ended at: ${new Date(endMs).toString()}. time taken: ${timeTaken}`,
);
});
programCommand('verify').action(async (directory, cmd) => {
const { env, keypair, cacheName } = cmd.opts();
const cacheContent = loadCache(cacheName, env);
const walletKeyPair = loadWalletKey(keypair);
const anchorProgram = await loadCandyProgram(walletKeyPair, env);
const configAddress = new PublicKey(cacheContent.program.config);
const config = await anchorProgram.provider.connection.getAccountInfo(
configAddress,
);
let allGood = true;
const keys = Object.keys(cacheContent.items);
for (let i = 0; i < keys.length; i++) {
log.debug('Looking at key ', i);
const key = keys[i];
const thisSlice = config.data.slice(
CONFIG_ARRAY_START + 4 + CONFIG_LINE_SIZE * i,
CONFIG_ARRAY_START + 4 + CONFIG_LINE_SIZE * (i + 1),
);
const name = fromUTF8Array([...thisSlice.slice(4, 36)]);
const uri = fromUTF8Array([...thisSlice.slice(40, 240)]);
const cacheItem = cacheContent.items[key];
if (!name.match(cacheItem.name) || !uri.match(cacheItem.link)) {
//leaving here for debugging reasons, but it's pretty useless. if the first upload fails - all others are wrong
// log.info(
// `Name (${name}) or uri (${uri}) didnt match cache values of (${cacheItem.name})` +
// `and (${cacheItem.link}). marking to rerun for image`,
// key,
// );
cacheItem.onChain = false;
allGood = false;
} else {
const json = await fetch(cacheItem.link);
if (json.status == 200 || json.status == 204 || json.status == 202) {
const body = await json.text();
const parsed = JSON.parse(body);
if (parsed.image) {
const check = await fetch(parsed.image);
if (
check.status == 200 ||
check.status == 204 ||
check.status == 202
) {
const text = await check.text();
if (!text.match(/Not found/i)) {
if (text.length == 0) {
log.debug(
'Name',
name,
'with',
uri,
'has zero length, failing',
);
cacheItem.onChain = false;
allGood = false;
} else {
log.debug('Name', name, 'with', uri, 'checked out');
}
} else {
log.debug(
'Name',
name,
'with',
uri,
'never got uploaded to arweave, failing',
);
cacheItem.onChain = false;
allGood = false;
}
} else {
log.debug(
'Name',
name,
'with',
uri,
'returned non-200 from uploader',
check.status,
);
cacheItem.onChain = false;
allGood = false;
}
} else {
log.debug('Name', name, 'with', uri, 'lacked image in json, failing');
cacheItem.onChain = false;
allGood = false;
}
} else {
log.debug('Name', name, 'with', uri, 'returned no json from link');
cacheItem.onChain = false;
allGood = false;
}
}
}
if (!allGood) {
saveCache(cacheName, env, cacheContent);
throw new Error(
`not all NFTs checked out. check the logs above for details`,
);
}
const configData = (await anchorProgram.account.config.fetch(
configAddress,
)) as Config;
const lineCount = new BN(config.data.slice(247, 247 + 4), undefined, 'le');
log.info(
`uploaded (${lineCount.toNumber()}) out of (${
configData.data.maxNumberOfLines
})`,
);
if (configData.data.maxNumberOfLines > lineCount.toNumber()) {
throw new Error(
`uploaded count (${lineCount.toNumber()}) is smaller than the predefined number of NFTs (${
configData.data.maxNumberOfLines
})`,
);
} else {
log.info('ready to deploy!');
}
saveCache(cacheName, env, cacheContent);
});
programCommand('verify_price')
.option('-p, --price <string>')
.option('--cache-path <string>')
.action(async (directory, cmd) => {
const { keypair, env, price, cacheName, cachePath } = cmd.opts();
const lamports = parsePrice(price);
if (isNaN(lamports)) {
return log.error(`verify_price requires a --price to be set`);
}
log.info(`Expected price is: ${lamports}`);
const cacheContent = loadCache(cacheName, env, cachePath);
if (!cacheContent) {
return log.error(
`No cache found, can't continue. Make sure you are in the correct directory where the assets are located or use the --cache-path option.`,
);
}
const walletKeyPair = loadWalletKey(keypair);
const anchorProgram = await loadCandyProgram(walletKeyPair, env);
const candyAddress = new PublicKey(cacheContent.candyMachineAddress);
const machine = await anchorProgram.account.candyMachine.fetch(
candyAddress,
);
//@ts-ignore
const candyMachineLamports = machine.data.price.toNumber();
log.info(`Candy machine price is: ${candyMachineLamports}`);
if (lamports != candyMachineLamports) {
throw new Error(`Expected price and CandyMachine's price do not match!`);
}
log.info(`Good to go!`);
});
programCommand('show')
.option('--cache-path <string>')
.action(async (directory, cmd) => {
const { keypair, env, cacheName, cachePath } = cmd.opts();
const cacheContent = loadCache(cacheName, env, cachePath);
if (!cacheContent) {
return log.error(
`No cache found, can't continue. Make sure you are in the correct directory where the assets are located or use the --cache-path option.`,
);
}
const walletKeyPair = loadWalletKey(keypair);
const anchorProgram = await loadCandyProgram(walletKeyPair, env);
const [candyMachine] = await getCandyMachineAddress(
new PublicKey(cacheContent.program.config),
cacheContent.program.uuid,
);
try {
const machine = await anchorProgram.account.candyMachine.fetch(
candyMachine,
);
log.info('...Candy Machine...');
//@ts-ignore
log.info('authority: ', machine.authority.toBase58());
//@ts-ignore
log.info('wallet: ', machine.wallet.toBase58());
//@ts-ignore
log.info('tokenMint: ', machine.tokenMint.toBase58());
//@ts-ignore
log.info('config: ', machine.config.toBase58());
//@ts-ignore
log.info('uuid: ', machine.data.uuid);
//@ts-ignore
log.info('price: ', machine.data.price.toNumber());
//@ts-ignore
log.info('itemsAvailable: ', machine.data.itemsAvailable.toNumber());
log.info(
'goLiveDate: ',
//@ts-ignore
machine.data.goLiveDate
? //@ts-ignore
new Date(machine.data.goLiveDate * 1000)
: 'N/A',
);
} catch (e) {
console.log('No machine found');
}
const config = await anchorProgram.account.config.fetch(
cacheContent.program.config,
);
log.info('...Config...');
//@ts-ignore
log.info('authority: ', config.authority);
//@ts-ignore
log.info('symbol: ', config.data.symbol);
//@ts-ignore
log.info('sellerFeeBasisPoints: ', config.data.sellerFeeBasisPoints);
//@ts-ignore
log.info('creators: ');
//@ts-ignore
config.data.creators.map(c =>
log.info(c.address.toBase58(), 'at', c.share, '%'),
);
//@ts-ignore
log.info('maxSupply: ', config.data.maxSupply.toNumber());
//@ts-ignore
log.info('retainAuthority: ', config.data.retainAuthority);
//@ts-ignore
log.info('maxNumberOfLines: ', config.data.maxNumberOfLines);
});
programCommand('create_candy_machine')
.option(
'-p, --price <string>',
'Price denominated in SOL or spl-token override',
'1',
)
.option(
'-t, --spl-token <string>',
'SPL token used to price NFT mint. To use SOL leave this empty.',
)
.option(
'-a, --spl-token-account <string>',
'SPL token account that receives mint payments. Only required if spl-token is specified.',
)
.option(
'-s, --sol-treasury-account <string>',
'SOL account that receives mint payments.',
)
.action(async (directory, cmd) => {
const {
keypair,
env,
price,
cacheName,
splToken,
splTokenAccount,
solTreasuryAccount,
} = cmd.opts();
let parsedPrice = parsePrice(price);
const cacheContent = loadCache(cacheName, env);
const walletKeyPair = loadWalletKey(keypair);
const anchorProgram = await loadCandyProgram(walletKeyPair, env);
let wallet = walletKeyPair.publicKey;
const remainingAccounts = [];
if (splToken || splTokenAccount) {
if (solTreasuryAccount) {
throw new Error(
'If spl-token-account or spl-token is set then sol-treasury-account cannot be set',
);
}
if (!splToken) {
throw new Error(
'If spl-token-account is set, spl-token must also be set',
);
}
const splTokenKey = new PublicKey(splToken);
const splTokenAccountKey = new PublicKey(splTokenAccount);
if (!splTokenAccount) {
throw new Error(
'If spl-token is set, spl-token-account must also be set',
);
}
const token = new Token(
anchorProgram.provider.connection,
splTokenKey,
TOKEN_PROGRAM_ID,
walletKeyPair,
);
const mintInfo = await token.getMintInfo();
if (!mintInfo.isInitialized) {
throw new Error(`The specified spl-token is not initialized`);
}
const tokenAccount = await token.getAccountInfo(splTokenAccountKey);
if (!tokenAccount.isInitialized) {
throw new Error(`The specified spl-token-account is not initialized`);
}
if (!tokenAccount.mint.equals(splTokenKey)) {
throw new Error(
`The spl-token-account's mint (${tokenAccount.mint.toString()}) does not match specified spl-token ${splTokenKey.toString()}`,
);
}
wallet = splTokenAccountKey;
parsedPrice = parsePrice(price, 10 ** mintInfo.decimals);
remainingAccounts.push({
pubkey: splTokenKey,
isWritable: false,
isSigner: false,
});
}
if (solTreasuryAccount) {
wallet = new PublicKey(solTreasuryAccount);
}
const config = new PublicKey(cacheContent.program.config);
const [candyMachine, bump] = await getCandyMachineAddress(
config,
cacheContent.program.uuid,
);
await anchorProgram.rpc.initializeCandyMachine(
bump,
{
uuid: cacheContent.program.uuid,
price: new anchor.BN(parsedPrice),
itemsAvailable: new anchor.BN(Object.keys(cacheContent.items).length),
goLiveDate: null,
},
{
accounts: {
candyMachine,
wallet,
config: config,
authority: walletKeyPair.publicKey,
payer: walletKeyPair.publicKey,
systemProgram: anchor.web3.SystemProgram.programId,
rent: anchor.web3.SYSVAR_RENT_PUBKEY,
},
signers: [],
remainingAccounts,
},
);
cacheContent.candyMachineAddress = candyMachine.toBase58();
saveCache(cacheName, env, cacheContent);
log.info(
`create_candy_machine finished. candy machine pubkey: ${candyMachine.toBase58()}`,
);
});
programCommand('update_candy_machine')
.option(
'-d, --date <string>',
'timestamp - eg "04 Dec 1995 00:12:00 GMT" or "now"',
)
.option('-p, --price <string>', 'SOL price')
.action(async (directory, cmd) => {
const { keypair, env, date, price, cacheName } = cmd.opts();
const cacheContent = loadCache(cacheName, env);
const secondsSinceEpoch = date ? parseDate(date) : null;
const lamports = price ? parsePrice(price) : null;
const walletKeyPair = loadWalletKey(keypair);
const anchorProgram = await loadCandyProgram(walletKeyPair, env);
const candyMachine = new PublicKey(cacheContent.candyMachineAddress);
const tx = await anchorProgram.rpc.updateCandyMachine(
lamports ? new anchor.BN(lamports) : null,
secondsSinceEpoch ? new anchor.BN(secondsSinceEpoch) : null,
{
accounts: {
candyMachine,
authority: walletKeyPair.publicKey,
},
},
);
cacheContent.startDate = secondsSinceEpoch;
saveCache(cacheName, env, cacheContent);
if (date)
log.info(
` - updated startDate timestamp: ${secondsSinceEpoch} (${date})`,
);
if (lamports)
log.info(` - updated price: ${lamports} lamports (${price} SOL)`);
log.info('update_candy_machine finished', tx);
});
programCommand('mint_one_token').action(async (directory, cmd) => {
const { keypair, env, cacheName } = cmd.opts();
const cacheContent = loadCache(cacheName, env);
const configAddress = new PublicKey(cacheContent.program.config);
const tx = await mint(keypair, env, configAddress);
log.info('mint_one_token finished', tx);
});
programCommand('sign')
// eslint-disable-next-line @typescript-eslint/no-unused-vars
.option('-m, --metadata <string>', 'base58 metadata account id')
.action(async (directory, cmd) => {
const { keypair, env, metadata } = cmd.opts();
await signMetadata(metadata, keypair, env);
});
programCommand('sign_all')
.option('-b, --batch-size <string>', 'Batch size', '10')
.option('-d, --daemon', 'Run signing continuously', false)
.action(async (directory, cmd) => {
const { keypair, env, cacheName, batchSize, daemon } = cmd.opts();
const cacheContent = loadCache(cacheName, env);
const walletKeyPair = loadWalletKey(keypair);
const anchorProgram = await loadCandyProgram(walletKeyPair, env);
const candyAddress = cacheContent.candyMachineAddress;
const batchSizeParsed = parseInt(batchSize);
if (!batchSizeParsed) {
throw new Error('Batch size needs to be an integer!');
}
log.debug('Creator pubkey: ', walletKeyPair.publicKey.toBase58());
log.debug('Environment: ', env);
log.debug('Candy machine address: ', candyAddress);
log.debug('Batch Size: ', batchSizeParsed);
await signAllMetadataFromCandyMachine(
anchorProgram.provider.connection,
walletKeyPair,
candyAddress,
batchSizeParsed,
daemon,
);
});
function programCommand(name: string) {
return program
.command(name)
.option(
'-e, --env <string>',
'Solana cluster env name',
'devnet', //mainnet-beta, testnet, devnet
)
.option(
'-k, --keypair <path>',
`Solana wallet location`,
'--keypair not provided',
)
.option('-l, --log-level <string>', 'log level', setLogLevel)
.option('-c, --cache-name <string>', 'Cache file name', 'temp');
}
// eslint-disable-next-line @typescript-eslint/no-unused-vars
function setLogLevel(value, prev) {
if (value === undefined || value === null) {
return;
}
log.info('setting the log value to: ' + value);
log.setLevel(value);
}
program.parse(process.argv);
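The upload command above reports elapsed time by treating the millisecond delta as a `Date` and slicing the `HH:MM:SS` portion of its ISO string. A minimal standalone sketch of that trick (the function name is illustrative, not part of the CLI; it is only valid for durations under 24 hours):

```typescript
// Format a millisecond duration as HH:MM:SS, as the upload command does.
// new Date(ms) treats ms as an offset from the Unix epoch, so the ISO
// string "1970-01-01THH:MM:SS.sssZ" carries the duration in its time part.
function formatDuration(ms: number): string {
  return new Date(ms).toISOString().substr(11, 8);
}

console.log(formatDuration(3723000)); // "01:02:03"
```

For runs that could exceed a day, comparing the raw millisecond delta directly would be safer than this ISO-string slicing.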

@@ -1,391 +0,0 @@
import * as fs from 'fs';
import * as path from 'path';
import { program } from 'commander';
import * as anchor from '@project-serum/anchor';
import BN from 'bn.js';
import { fromUTF8Array, parsePrice } from './helpers/various';
import { Token, TOKEN_PROGRAM_ID } from '@solana/spl-token';
import { PublicKey } from '@solana/web3.js';
import { CACHE_PATH, CONFIG_ARRAY_START, CONFIG_LINE_SIZE, EXTENSION_JSON, EXTENSION_PNG, } from './helpers/constants';
import { getCandyMachineAddress, loadAnchorProgram, loadWalletKey, } from './helpers/accounts';
import { Config } from './types';
import { upload } from './commands/upload';
import { loadCache, saveCache } from './helpers/cache';
import { mint } from "./commands/mint";
import { signMetadata } from "./commands/sign";
import { signAllMetadataFromCandyMachine } from "./commands/signAll";
import log from 'loglevel';
program.version('0.0.1');
if (!fs.existsSync(CACHE_PATH)) {
fs.mkdirSync(CACHE_PATH);
}
log.setLevel(log.levels.INFO);
programCommand('upload')
.argument(
'<directory>',
'Directory containing images named from 0-n',
val => {
return fs.readdirSync(`${val}`).map(file => path.join(val, file));
},
)
.option('-n, --number <number>', 'Number of images to upload')
.action(async (files: string[], options, cmd) => {
const {number, keypair, env, cacheName} = cmd.opts();
const pngFileCount = files.filter(it => {
return it.endsWith(EXTENSION_PNG);
}).length;
const jsonFileCount = files.filter(it => {
return it.endsWith(EXTENSION_JSON);
}).length;
const parsedNumber = parseInt(number);
const elemCount = parsedNumber ? parsedNumber : pngFileCount;
if (pngFileCount !== jsonFileCount) {
throw new Error(`number of png files (${pngFileCount}) is different than the number of json files (${jsonFileCount})`);
}
if (elemCount < pngFileCount) {
throw new Error(`max number (${elemCount}) cannot be smaller than the number of elements in the source folder (${pngFileCount})`);
}
log.info(`Beginning the upload for ${elemCount} (png+json) pairs`)
const startMs = Date.now();
log.info("started at: " + startMs.toString())
let warn = false;
for (; ;) {
const successful = await upload(files, cacheName, env, keypair, elemCount);
if (successful) {
warn = false;
break;
} else {
warn = true;
log.warn("upload was not successful, rerunning");
}
}
const endMs = Date.now();
const timeTaken = new Date(endMs - startMs).toISOString().substr(11, 8);
log.info(`ended at: ${new Date(endMs).toString()}. time taken: ${timeTaken}`)
if (warn) { log.info("not all images have been uploaded, rerun this step.") }
});
programCommand('verify')
.action(async (directory, cmd) => {
const { env, keypair, cacheName } = cmd.opts();
const cacheContent = loadCache(cacheName, env);
const walletKeyPair = loadWalletKey(keypair);
const anchorProgram = await loadAnchorProgram(walletKeyPair, env);
const configAddress = new PublicKey(cacheContent.program.config);
const config = await anchorProgram.provider.connection.getAccountInfo(
configAddress,
);
let allGood = true;
const keys = Object.keys(cacheContent.items);
for (let i = 0; i < keys.length; i++) {
log.debug('Looking at key ', i);
const key = keys[i];
const thisSlice = config.data.slice(
CONFIG_ARRAY_START + 4 + CONFIG_LINE_SIZE * i,
CONFIG_ARRAY_START + 4 + CONFIG_LINE_SIZE * (i + 1),
);
const name = fromUTF8Array([...thisSlice.slice(4, 36)]);
const uri = fromUTF8Array([...thisSlice.slice(40, 240)]);
const cacheItem = cacheContent.items[key];
if (!name.match(cacheItem.name) || !uri.match(cacheItem.link)) {
//leaving here for debugging reasons, but it's pretty useless. if the first upload fails - all others are wrong
// log.info(
// `Name (${name}) or uri (${uri}) didnt match cache values of (${cacheItem.name})` +
// `and (${cacheItem.link}). marking to rerun for image`,
// key,
// );
cacheItem.onChain = false;
allGood = false;
} else {
log.debug('Name', name, 'with', uri, 'checked out');
}
}
if (!allGood) {
saveCache(cacheName, env, cacheContent);
throw new Error(
`not all NFTs checked out. check out logs above for details`,
);
}
const configData = (await anchorProgram.account.config.fetch(
configAddress,
)) as Config;
const lineCount = new BN(config.data.slice(247, 247 + 4), undefined, 'le');
log.info(
`uploaded (${lineCount.toNumber()}) out of (${configData.data.maxNumberOfLines
})`,
);
if (configData.data.maxNumberOfLines > lineCount.toNumber()) {
throw new Error(
`uploaded count (${lineCount.toNumber()}) is smaller than the predefined number of NFTs (${configData.data.maxNumberOfLines})`,
);
} else {
log.info('ready to deploy!');
}
saveCache(cacheName, env, cacheContent);
});
programCommand('verify_price')
.option('-p, --price <string>')
.option('--cache-path <string>')
.action(async (directory, cmd) => {
const { keypair, env, price, cacheName, cachePath } = cmd.opts();
const lamports = parsePrice(price);
if (isNaN(lamports)) {
return log.error(`verify_price requires a --price to be set`);
}
log.info(`Expected price is: ${lamports}`);
const cacheContent = loadCache(cacheName, env, cachePath);
if (!cacheContent) {
return log.error(
`No cache found, can't continue. Make sure you are in the correct directory where the assets are located or use the --cache-path option.`,
);
}
const walletKeyPair = loadWalletKey(keypair);
const anchorProgram = await loadAnchorProgram(walletKeyPair, env);
const [candyMachine] = await getCandyMachineAddress(
new PublicKey(cacheContent.program.config),
cacheContent.program.uuid,
);
const machine = await anchorProgram.account.candyMachine.fetch(
candyMachine,
);
//@ts-ignore
const candyMachineLamports = machine.data.price.toNumber();
log.info(`Candymachine price is: ${candyMachineLamports}`);
if (lamports != candyMachineLamports) {
throw new Error(`Expected price and CandyMachine's price do not match!`);
}
log.info(`Good to go!`);
});
programCommand('create_candy_machine')
.option('-p, --price <string>', 'Price denominated in SOL or spl-token override', '1')
.option('-t, --spl-token <string>', 'SPL token used to price NFT mint. To use SOL leave this empty.')
.option('-t, --spl-token-account <string>', 'SPL token account that receives mint payments. Only required if spl-token is specified.')
.action(async (directory, cmd) => {
const { keypair, env, price, cacheName, splToken, splTokenAccount } = cmd.opts();
let parsedPrice = parsePrice(price);
const cacheContent = loadCache(cacheName, env);
const walletKeyPair = loadWalletKey(keypair);
const anchorProgram = await loadAnchorProgram(walletKeyPair, env);
let wallet = walletKeyPair.publicKey;
const remainingAccounts = [];
if (splToken || splTokenAccount) {
if (!splToken) {
throw new Error("If spl-token-account is set, spl-token must also be set")
}
const splTokenKey = new PublicKey(splToken);
const splTokenAccountKey = new PublicKey(splTokenAccount);
if (!splTokenAccount) {
throw new Error("If spl-token is set, spl-token-account must also be set")
}
const token = new Token(
anchorProgram.provider.connection,
splTokenKey,
TOKEN_PROGRAM_ID,
walletKeyPair
);
const mintInfo = await token.getMintInfo();
if (!mintInfo.isInitialized) {
throw new Error(`The specified spl-token is not initialized`);
}
const tokenAccount = await token.getAccountInfo(splTokenAccountKey);
if (!tokenAccount.isInitialized) {
throw new Error(`The specified spl-token-account is not initialized`);
}
if (!tokenAccount.mint.equals(splTokenKey)) {
throw new Error(`The spl-token-account's mint (${tokenAccount.mint.toString()}) does not match specified spl-token ${splTokenKey.toString()}`);
}
wallet = splTokenAccountKey;
parsedPrice = parsePrice(price, 10 ** mintInfo.decimals);
remainingAccounts.push({ pubkey: splTokenKey, isWritable: false, isSigner: false });
}
const config = new PublicKey(cacheContent.program.config);
const [candyMachine, bump] = await getCandyMachineAddress(
config,
cacheContent.program.uuid,
);
await anchorProgram.rpc.initializeCandyMachine(
bump,
{
uuid: cacheContent.program.uuid,
price: new anchor.BN(parsedPrice),
itemsAvailable: new anchor.BN(Object.keys(cacheContent.items).length),
goLiveDate: null,
},
{
accounts: {
candyMachine,
wallet,
config: config,
authority: walletKeyPair.publicKey,
payer: walletKeyPair.publicKey,
systemProgram: anchor.web3.SystemProgram.programId,
rent: anchor.web3.SYSVAR_RENT_PUBKEY,
},
signers: [],
remainingAccounts,
},
);
saveCache(cacheName, env, cacheContent);
log.info(`create_candy_machine finished. candy machine pubkey: ${candyMachine.toBase58()}`);
});
programCommand('update_candy_machine')
.option('-d, --date <string>', 'timestamp - eg "04 Dec 1995 00:12:00 GMT"')
.option('-p, --price <string>', 'SOL price')
.action(async (directory, cmd) => {
const { keypair, env, date, price, cacheName } = cmd.opts();
const cacheContent = loadCache(cacheName, env);
const secondsSinceEpoch = date ? Date.parse(date) / 1000 : null;
const lamports = price ? parsePrice(price) : null;
const walletKeyPair = loadWalletKey(keypair);
const anchorProgram = await loadAnchorProgram(walletKeyPair, env);
const [candyMachine] = await getCandyMachineAddress(
new PublicKey(cacheContent.program.config),
cacheContent.program.uuid,
);
const tx = await anchorProgram.rpc.updateCandyMachine(
lamports ? new anchor.BN(lamports) : null,
secondsSinceEpoch ? new anchor.BN(secondsSinceEpoch) : null,
{
accounts: {
candyMachine,
authority: walletKeyPair.publicKey,
},
},
);
if (date) log.info(` - updated startDate timestamp: ${secondsSinceEpoch} (${date})`)
if (lamports) log.info(` - updated price: ${lamports} lamports (${price} SOL)`)
log.info('updated_candy_machine Done', tx);
});
programCommand('mint_one_token')
.option('-t, --spl-token-account <string>', 'SPL token account to payfrom')
.action(async (directory, cmd) => {
const {keypair, env, cacheName, splTokenAccount} = cmd.opts();
const cacheContent = loadCache(cacheName, env);
const configAddress = new PublicKey(cacheContent.program.config);
const splTokenAccountKey = splTokenAccount ? new PublicKey(splTokenAccount) : undefined;
const tx = await mint(keypair, env, configAddress, splTokenAccountKey);
log.info('Done', tx);
});
programCommand('sign')
// eslint-disable-next-line @typescript-eslint/no-unused-vars
.option('-m, --metadata <string>', 'base58 metadata account id')
.action(async (directory, cmd) => {
const { keypair, env, metadata } = cmd.opts();
await signMetadata(
metadata,
keypair,
env
);
});
function programCommand(name: string) {
return program
.command(name)
.option(
'-e, --env <string>',
'Solana cluster env name',
'devnet', //mainnet-beta, testnet, devnet
)
.option(
'-k, --keypair <path>',
`Solana wallet location`,
'--keypair not provided',
)
.option('-l, --log-level <string>', 'log level', setLogLevel)
.option('-c, --cache-name <string>', 'Cache file name', 'temp');
}
// eslint-disable-next-line @typescript-eslint/no-unused-vars
function setLogLevel(value, prev) {
if (value === undefined || value === null) {
return
}
log.info("setting the log value to: " + value);
log.setLevel(value);
}
programCommand("sign_candy_machine_metadata")
.option('-cndy, --candy-address <string>', 'Candy machine address', '')
.option('-b, --batch-size <string>', 'Batch size', '10')
.action(async (directory, cmd) => {
let { keypair, env, cacheName, candyAddress, batchSize } = cmd.opts();
if (!keypair || keypair == '') {
log.info("Keypair required!");
return;
}
if (!candyAddress || candyAddress == '') {
log.info("Candy machine address required! Using from saved list.")
const cacheContent = loadCache(cacheName, env);
const config = new PublicKey(cacheContent.program.config);
const [candyMachine, bump] = await getCandyMachineAddress(
config,
cacheContent.program.uuid,
);
candyAddress = candyMachine.toBase58();
}
let batchSizeParsed = parseInt(batchSize)
if (!batchSizeParsed) {
log.info("Batch size needs to be an integer!")
return;
}
const walletKeyPair = loadWalletKey(keypair);
const anchorProgram = await loadAnchorProgram(walletKeyPair, env);
log.info("Creator pubkey: ", walletKeyPair.publicKey.toBase58())
log.info("Environment: ", env)
log.info("Candy machine address: ", candyAddress)
log.info("Batch Size: ", batchSizeParsed)
await signAllMetadataFromCandyMachine(anchorProgram.provider.connection, walletKeyPair, candyAddress, batchSizeParsed)
});
program.parse(process.argv);
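The batch-size handling in `sign_all` and `sign_candy_machine_metadata` above calls `parseInt` and treats a falsy result as invalid, which also rejects `0` and silently accepts inputs like `10abc`. A stricter parser could look like this sketch (the function name is illustrative, not part of the CLI):

```typescript
// Strict positive-integer parsing for CLI options: unlike parseInt,
// Number() rejects trailing garbage ("10abc" -> NaN), and the explicit
// checks rule out floats, zero, and negatives.
function parsePositiveInt(value: string): number {
  const n = Number(value);
  if (!Number.isInteger(n) || n <= 0) {
    throw new Error(`expected a positive integer, got "${value}"`);
  }
  return n;
}

console.log(parsePositiveInt('10')); // 10
```

Commander also accepts a parsing function as an option's third argument, so a validator like this could run when the flag is parsed rather than inside each action handler.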

@@ -1,23 +1,31 @@
import { Keypair, PublicKey, SystemProgram } from "@solana/web3.js";
import { Keypair, PublicKey, SystemProgram } from '@solana/web3.js';
import {
getCandyMachineAddress,
getMasterEdition,
getMetadata,
getTokenWallet,
loadAnchorProgram,
loadCandyProgram,
loadWalletKey,
uuidFromConfigPubkey
} from "../helpers/accounts";
import { TOKEN_METADATA_PROGRAM_ID, TOKEN_PROGRAM_ID } from "../helpers/constants";
import * as anchor from "@project-serum/anchor";
import { MintLayout, Token } from "@solana/spl-token";
import { createAssociatedTokenAccountInstruction } from "../helpers/instructions";
uuidFromConfigPubkey,
} from '../helpers/accounts';
import {
TOKEN_METADATA_PROGRAM_ID,
TOKEN_PROGRAM_ID,
} from '../helpers/constants';
import * as anchor from '@project-serum/anchor';
import { MintLayout, Token } from '@solana/spl-token';
import { createAssociatedTokenAccountInstruction } from '../helpers/instructions';
import { sendTransactionWithRetryWithKeypair } from '../helpers/transactions';
export async function mint(keypair: string, env: string, configAddress: PublicKey, splTokenAccountKey?: PublicKey): Promise<string> {
export async function mint(
keypair: string,
env: string,
configAddress: PublicKey,
): Promise<string> {
const mint = Keypair.generate();
const userKeyPair = loadWalletKey(keypair);
const anchorProgram = await loadAnchorProgram(userKeyPair, env);
const anchorProgram = await loadCandyProgram(userKeyPair, env);
const userTokenAccountAddress = await getTokenWallet(
userKeyPair.publicKey,
mint.publicKey,
@@ -28,90 +36,120 @@ export async function mint(keypair: string, env: string, configAddress: PublicKe
configAddress,
uuid,
);
const candyMachine : any = await anchorProgram.account.candyMachine.fetch(
const candyMachine: any = await anchorProgram.account.candyMachine.fetch(
candyMachineAddress,
);
const remainingAccounts = [];
if (splTokenAccountKey) {
const candyMachineTokenMintKey = candyMachine.tokenMint;
if (!candyMachineTokenMintKey) {
throw new Error('Candy machine data does not have token mint configured. Can\'t use spl-token-account');
}
const token = new Token(
anchorProgram.provider.connection,
candyMachine.tokenMint,
const signers = [mint, userKeyPair];
const instructions = [
anchor.web3.SystemProgram.createAccount({
fromPubkey: userKeyPair.publicKey,
newAccountPubkey: mint.publicKey,
space: MintLayout.span,
lamports:
await anchorProgram.provider.connection.getMinimumBalanceForRentExemption(
MintLayout.span,
),
programId: TOKEN_PROGRAM_ID,
}),
Token.createInitMintInstruction(
TOKEN_PROGRAM_ID,
userKeyPair
mint.publicKey,
0,
userKeyPair.publicKey,
userKeyPair.publicKey,
),
createAssociatedTokenAccountInstruction(
userTokenAccountAddress,
userKeyPair.publicKey,
userKeyPair.publicKey,
mint.publicKey,
),
Token.createMintToInstruction(
TOKEN_PROGRAM_ID,
mint.publicKey,
userTokenAccountAddress,
userKeyPair.publicKey,
[],
1,
),
];
let tokenAccount;
if (candyMachine.tokenMint) {
const transferAuthority = anchor.web3.Keypair.generate();
tokenAccount = await getTokenWallet(
userKeyPair.publicKey,
candyMachine.tokenMint,
);
const tokenAccount = await token.getAccountInfo(splTokenAccountKey);
if (!candyMachine.tokenMint.equals(tokenAccount.mint)) {
throw new Error(`Specified spl-token-account's mint (${tokenAccount.mint.toString()}) does not match candy machine's token mint (${candyMachine.tokenMint.toString()})`);
}
remainingAccounts.push({
pubkey: tokenAccount,
isWritable: true,
isSigner: false,
});
remainingAccounts.push({
pubkey: userKeyPair.publicKey,
isWritable: false,
isSigner: true,
});
if (!tokenAccount.owner.equals(userKeyPair.publicKey)) {
throw new Error(`Specified spl-token-account's owner (${tokenAccount.owner.toString()}) does not match user public key (${userKeyPair.publicKey})`);
}
remainingAccounts.push({ pubkey: splTokenAccountKey, isWritable: true, isSigner: false });
remainingAccounts.push({ pubkey: userKeyPair.publicKey, isWritable: false, isSigner: true });
}
const metadataAddress = await getMetadata(mint.publicKey);
const masterEdition = await getMasterEdition(mint.publicKey);
return await anchorProgram.rpc.mintNft({
accounts: {
config: configAddress,
candyMachine: candyMachineAddress,
payer: userKeyPair.publicKey,
//@ts-ignore
wallet: candyMachine.wallet,
mint: mint.publicKey,
metadata: metadataAddress,
masterEdition,
mintAuthority: userKeyPair.publicKey,
updateAuthority: userKeyPair.publicKey,
tokenMetadataProgram: TOKEN_METADATA_PROGRAM_ID,
tokenProgram: TOKEN_PROGRAM_ID,
systemProgram: SystemProgram.programId,
rent: anchor.web3.SYSVAR_RENT_PUBKEY,
clock: anchor.web3.SYSVAR_CLOCK_PUBKEY,
},
signers: [mint, userKeyPair],
remainingAccounts,
instructions: [
anchor.web3.SystemProgram.createAccount({
fromPubkey: userKeyPair.publicKey,
newAccountPubkey: mint.publicKey,
space: MintLayout.span,
lamports:
await anchorProgram.provider.connection.getMinimumBalanceForRentExemption(
MintLayout.span,
),
programId: TOKEN_PROGRAM_ID,
}),
Token.createInitMintInstruction(
instructions.push(
Token.createApproveInstruction(
TOKEN_PROGRAM_ID,
mint.publicKey,
0,
userKeyPair.publicKey,
userKeyPair.publicKey,
),
createAssociatedTokenAccountInstruction(
userTokenAccountAddress,
userKeyPair.publicKey,
userKeyPair.publicKey,
mint.publicKey,
),
Token.createMintToInstruction(
TOKEN_PROGRAM_ID,
mint.publicKey,
userTokenAccountAddress,
tokenAccount,
transferAuthority.publicKey,
userKeyPair.publicKey,
[],
1,
candyMachine.data.price.toNumber(),
),
],
});
);
}
const metadataAddress = await getMetadata(mint.publicKey);
const masterEdition = await getMasterEdition(mint.publicKey);
instructions.push(
await anchorProgram.instruction.mintNft({
accounts: {
config: configAddress,
candyMachine: candyMachineAddress,
payer: userKeyPair.publicKey,
//@ts-ignore
wallet: candyMachine.wallet,
mint: mint.publicKey,
metadata: metadataAddress,
masterEdition,
mintAuthority: userKeyPair.publicKey,
updateAuthority: userKeyPair.publicKey,
tokenMetadataProgram: TOKEN_METADATA_PROGRAM_ID,
tokenProgram: TOKEN_PROGRAM_ID,
systemProgram: SystemProgram.programId,
rent: anchor.web3.SYSVAR_RENT_PUBKEY,
clock: anchor.web3.SYSVAR_CLOCK_PUBKEY,
},
remainingAccounts,
}),
);
if (tokenAccount) {
instructions.push(
Token.createRevokeInstruction(
TOKEN_PROGRAM_ID,
tokenAccount,
userKeyPair.publicKey,
[],
),
);
}
return (
await sendTransactionWithRetryWithKeypair(
anchorProgram.provider.connection,
userKeyPair,
instructions,
signers,
)
).txid;
}


@@ -1,46 +1,35 @@
import { Keypair, PublicKey, TransactionInstruction } from "@solana/web3.js";
import { TOKEN_METADATA_PROGRAM_ID } from "../helpers/constants";
import { sendTransactionWithRetryWithKeypair } from "../helpers/transactions";
import { loadAnchorProgram, loadWalletKey } from "../helpers/accounts";
import { Program } from "@project-serum/anchor";
import { Keypair, PublicKey, TransactionInstruction } from '@solana/web3.js';
import { TOKEN_METADATA_PROGRAM_ID } from '../helpers/constants';
import { sendTransactionWithRetryWithKeypair } from '../helpers/transactions';
import { loadCandyProgram, loadWalletKey } from '../helpers/accounts';
import { Program } from '@project-serum/anchor';
const METADATA_SIGNATURE = Buffer.from([7]); //now that's some voodoo magic. WTF metaplex? XD
export async function signMetadata(
metadata: string,
keypair: string,
env: string
env: string,
) {
const creatorKeyPair = loadWalletKey(keypair);
const anchorProgram = await loadAnchorProgram(creatorKeyPair, env);
await signWithRetry(anchorProgram, creatorKeyPair, metadata);
const anchorProgram = await loadCandyProgram(creatorKeyPair, env);
await signWithRetry(anchorProgram, creatorKeyPair, new PublicKey(metadata));
}
export async function signAllUnapprovedMetadata(
keypair: string,
env: string
async function signWithRetry(
anchorProgram: Program,
creatorKeyPair: Keypair,
metadataAddress: PublicKey,
) {
const creatorKeyPair = loadWalletKey(keypair);
const anchorProgram = await loadAnchorProgram(creatorKeyPair, env);
const metadataIds = await findAllUnapprovedMetadataIds(anchorProgram, creatorKeyPair);
for(const id in metadataIds) {
await signWithRetry(anchorProgram, creatorKeyPair, id);
}
}
// @ts-ignore
// eslint-disable-next-line @typescript-eslint/no-unused-vars
async function findAllUnapprovedMetadataIds(anchorProgram: Program, creatorKeyPair: Keypair): Promise<string[]> {
//TODO well I need some help with that... so... help? :D
throw new Error("Unsupported yet")
}
async function signWithRetry(anchorProgram: Program, creatorKeyPair: Keypair, metadataAddress: string) {
await sendTransactionWithRetryWithKeypair(
anchorProgram.provider.connection,
creatorKeyPair,
[signMetadataInstruction(new PublicKey(metadataAddress), creatorKeyPair.publicKey)],
[
signMetadataInstruction(
new PublicKey(metadataAddress),
creatorKeyPair.publicKey,
),
],
[],
'single',
);
@@ -50,7 +39,7 @@ export function signMetadataInstruction(
metadata: PublicKey,
creator: PublicKey,
): TransactionInstruction {
const data = Buffer.from([7]); //now thats bloody magic. WTF metaplex? XD
const data = METADATA_SIGNATURE;
const keys = [
{


@@ -1,18 +1,26 @@
import { Keypair, PublicKey, TransactionInstruction, Connection, AccountInfo } from '@solana/web3.js';
import { sendTransactionWithRetryWithKeypair } from '../helpers/transactions';
import * as borsh from "borsh"
import {
MAX_NAME_LENGTH,
MAX_URI_LENGTH,
MAX_SYMBOL_LENGTH,
AccountInfo,
Connection,
Keypair,
PublicKey,
TransactionInstruction,
} from '@solana/web3.js';
import { sendTransactionWithRetryWithKeypair } from '../helpers/transactions';
import * as borsh from 'borsh';
import {
MAX_CREATOR_LEN,
MAX_NAME_LENGTH,
MAX_SYMBOL_LENGTH,
MAX_URI_LENGTH,
TOKEN_METADATA_PROGRAM_ID,
} from '../helpers/constants';
import {
AccountAndPubkey,
Metadata,
METADATA_SCHEMA
} from '../types'
import { AccountAndPubkey, Metadata, METADATA_SCHEMA } from '../types';
import { signMetadataInstruction } from './sign';
import log from 'loglevel';
import { sleep } from '../helpers/various';
const SIGNING_INTERVAL = 60 * 1000; //60s
let lastCount = 0;
/*
Get accounts by candy machine creator address
Get only verified ones
@@ -21,10 +29,108 @@ import {
PS: Don't sign candy machine addresses that you do not know about. Signing verifies your participation.
*/
async function decodeMetadata(buffer) {
const metadata = borsh.deserializeUnchecked(METADATA_SCHEMA, Metadata, buffer);
return metadata;
};
export async function signAllMetadataFromCandyMachine(
connection: Connection,
wallet: Keypair,
candyMachineAddress: string,
batchSize: number,
daemon: boolean,
) {
if (daemon) {
// noinspection InfiniteLoopJS
for (;;) {
await findAndSignMetadata(
candyMachineAddress,
connection,
wallet,
batchSize,
);
await sleep(SIGNING_INTERVAL);
}
} else {
await findAndSignMetadata(
candyMachineAddress,
connection,
wallet,
batchSize,
);
}
}
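The daemon branch above is a plain poll loop: one signing pass, sleep, repeat. A generic standalone sketch of that pattern (the `maxIterations` parameter is added here so the loop can terminate; the original runs forever):

```typescript
// Poll loop: run `task` once per interval. In the source, `task` is one
// findAndSignMetadata pass and the interval is SIGNING_INTERVAL (60s).
const sleep = (ms: number): Promise<void> =>
  new Promise(resolve => setTimeout(resolve, ms));

async function pollForever(
  task: () => Promise<void>,
  intervalMs: number,
  maxIterations = Infinity,
): Promise<void> {
  for (let i = 0; i < maxIterations; i++) {
    await task();
    await sleep(intervalMs);
  }
}
```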
async function findAndSignMetadata(
candyMachineAddress: string,
connection: Connection,
wallet: Keypair,
batchSize: number,
) {
const metadataByCandyMachine = await getAccountsByCreatorAddress(
candyMachineAddress,
connection,
);
if (lastCount === metadataByCandyMachine.length) {
log.debug(`Didn't find any new NFTs to sign - ${new Date()}`);
return;
}
lastCount = metadataByCandyMachine.length;
log.info(
`Found ${metadataByCandyMachine.length} nft's minted by candy machine ${candyMachineAddress}`,
);
const candyVerifiedListToSign = await getCandyMachineVerifiedMetadata(
metadataByCandyMachine,
candyMachineAddress,
wallet.publicKey.toBase58(),
);
log.info(
`Found ${
candyVerifiedListToSign.length
} nft's to sign by ${wallet.publicKey.toBase58()}`,
);
await sendSignMetadata(
connection,
wallet,
candyVerifiedListToSign,
batchSize,
);
}
async function getAccountsByCreatorAddress(creatorAddress, connection) {
const metadataAccounts = await getProgramAccounts(
connection,
TOKEN_METADATA_PROGRAM_ID.toBase58(),
{
filters: [
{
memcmp: {
offset:
1 + // key
32 + // update auth
32 + // mint
4 + // name string length
MAX_NAME_LENGTH + // name
4 + // uri string length
MAX_URI_LENGTH + // uri*
4 + // symbol string length
MAX_SYMBOL_LENGTH + // symbol
2 + // seller fee basis points
1 + // whether or not there is a creators vec
4 + // creators vec length
0 * MAX_CREATOR_LEN,
bytes: creatorAddress,
},
},
],
},
);
const decodedAccounts = [];
for (let i = 0; i < metadataAccounts.length; i++) {
const e = metadataAccounts[i];
const decoded = await decodeMetadata(e.account.data);
const accountPubkey = e.pubkey;
const store = [decoded, accountPubkey];
decodedAccounts.push(store);
}
return decodedAccounts;
}
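The `memcmp` offset above walks the borsh layout of a Metadata account up to the first creator's address. As a standalone sketch of that arithmetic — the `MAX_*` sizes are assumed values; the authoritative numbers live in `../helpers/constants`:

```typescript
// Assumed layout sizes (32 / 200 / 10 / 34 in the published token-metadata layout).
const MAX_NAME_LENGTH = 32;
const MAX_URI_LENGTH = 200;
const MAX_SYMBOL_LENGTH = 10;
const MAX_CREATOR_LEN = 32 + 1 + 1; // address + verified flag + share

// Byte offset of the creator at `creatorIndex` inside a Metadata account,
// mirroring the memcmp filter above (which uses index 0).
function creatorAddressOffset(creatorIndex: number): number {
  return (
    1 + // key
    32 + // update authority
    32 + // mint
    4 + MAX_NAME_LENGTH + // name (borsh string: u32 length prefix + bytes)
    4 + MAX_URI_LENGTH + // uri
    4 + MAX_SYMBOL_LENGTH + // symbol
    2 + // seller fee basis points
    1 + // Option tag: is there a creators vec?
    4 + // creators vec length
    creatorIndex * MAX_CREATOR_LEN
  );
}
```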
async function getProgramAccounts(
connection: Connection,
@@ -80,134 +186,70 @@ async function getProgramAccounts(
return data;
}
export async function signAllMetadataFromCandyMachine(
connection,
wallet,
candyMachineAddress,
batchSize
){
let metadataByCandyMachine = await getAccountsByCreatorAddress(candyMachineAddress, connection)
console.log(`Found ${metadataByCandyMachine.length} nft's minted by candy machine ${candyMachineAddress}`)
let candyVerifiedListToSign = await getCandyMachineVerifiedMetadata(metadataByCandyMachine, candyMachineAddress, wallet.publicKey.toBase58())
console.log(`Found ${candyVerifiedListToSign.length} nft's to sign by ${wallet.publicKey.toBase58()}`)
await sendSignMetadata(connection, wallet, candyVerifiedListToSign, batchSize)
async function decodeMetadata(buffer) {
return borsh.deserializeUnchecked(METADATA_SCHEMA, Metadata, buffer);
}
async function getAccountsByCreatorAddress(creatorAddress, connection) {
let metadataAccounts = await getProgramAccounts(connection, TOKEN_METADATA_PROGRAM_ID.toBase58(), {
filters: [
{
memcmp: {
offset:
1 + // key
32 + // update auth
32 + // mint
4 + // name string length
MAX_NAME_LENGTH + // name
4 + // uri string length
MAX_URI_LENGTH + // uri*
4 + // symbol string length
MAX_SYMBOL_LENGTH + // symbol
2 + // seller fee basis points
1 + // whether or not there is a creators vec
4 + // creators vec length
0 * MAX_CREATOR_LEN,
bytes: creatorAddress,
},
},
],
})
let decodedAccounts = []
for (let i = 0; i < metadataAccounts.length; i++) {
let e = metadataAccounts[i];
let decoded = await decodeMetadata(e.account.data)
let accountPubkey = e.pubkey
let store = [decoded, accountPubkey]
decodedAccounts.push(store)
}
return decodedAccounts
}
async function getCandyMachineVerifiedMetadata(metadataList, candyAddress, creatorAddress){
let verifiedList = [];
async function getCandyMachineVerifiedMetadata(
metadataList,
candyAddress,
creatorAddress,
) {
const verifiedList = [];
metadataList.forEach(meta => {
let verifiedCandy = false;
let verifiedCreator = true;
meta[0].data.creators.forEach(creator => {
if (new PublicKey(creator.address).toBase58() == candyAddress && creator.verified === 1) {
if (
new PublicKey(creator.address).toBase58() == candyAddress &&
creator.verified === 1
) {
verifiedCandy = true;
}
if (new PublicKey(creator.address).toBase58() == creatorAddress && creator.verified === 0) {
if (
new PublicKey(creator.address).toBase58() == creatorAddress &&
creator.verified === 0
) {
verifiedCreator = false;
}
});
if(verifiedCandy && !verifiedCreator){
verifiedList.push(meta)
if (verifiedCandy && !verifiedCreator) {
verifiedList.push(meta);
}
});
return verifiedList
return verifiedList;
}
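The filter above, reduced to a pure predicate: sign only when the candy machine appears as a *verified* creator and our own creator entry is still *unverified*. Plain base58 strings stand in for the `PublicKey` round-trip in the original, and `verified` is the 0/1 byte from the decoded account:

```typescript
type CreatorFlag = { address: string; verified: number };

// True when this metadata item still needs our signature.
function needsSignature(
  creators: CreatorFlag[],
  candyAddress: string,
  myAddress: string,
): boolean {
  const candyVerified = creators.some(
    c => c.address === candyAddress && c.verified === 1,
  );
  const meUnverified = creators.some(
    c => c.address === myAddress && c.verified === 0,
  );
  return candyVerified && meUnverified;
}
```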
async function sendSignMetadata(
connection,
wallet,
metadataList,
batchsize
) {
async function sendSignMetadata(connection, wallet, metadataList, batchsize) {
let total = 0;
while(metadataList.length > 0){
console.log("Signing metadata")
while (metadataList.length > 0) {
log.debug('Signing metadata ');
let sliceAmount = batchsize;
if (metadataList.length < batchsize) {
sliceAmount = metadataList.length;
}
var removed = metadataList.splice(0,sliceAmount);
const removed = metadataList.splice(0, sliceAmount);
total += sliceAmount;
await delay(500)
await signMetadataBatch(removed, connection, wallet)
console.log(`Processed ${total} nfts`)
await delay(500);
await signMetadataBatch(removed, connection, wallet);
log.debug(`Processed ${total} nfts`);
}
console.log("Finished signing metadata..")
log.info(`Finished signing metadata for ${total} NFTs`);
}
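The splice loop in `sendSignMetadata` is a generic drain-in-batches pattern. A standalone sketch (names are illustrative; in the source, `handle` is `signMetadataBatch` plus a delay):

```typescript
// Drain `items` in groups of at most `batchSize`, awaiting `handle` once per
// group; returns the total number of items processed.
async function inBatches<T>(
  items: T[],
  batchSize: number,
  handle: (batch: T[]) => Promise<void>,
): Promise<number> {
  let total = 0;
  while (items.length > 0) {
    const batch = items.splice(0, Math.min(batchSize, items.length));
    total += batch.length;
    await handle(batch);
  }
  return total;
}
```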
async function signMetadataBatch(metadataList, connection, keypair){
const signers: Keypair[] = [];
const instructions: TransactionInstruction[] = [];
for (let i = 0; i < metadataList.length; i++) {
const meta = metadataList[i];
await signMetadataSingle(meta[1], keypair.publicKey.toBase58(), instructions)
}
await sendTransactionWithRetryWithKeypair(connection, keypair, instructions, [], 'single')
}
async function signMetadataSingle(
metadata,
creator,
instructions,
) {
const data = Buffer.from([7]);
const keys = [
{
pubkey: new PublicKey(metadata),
isSigner: false,
isWritable: true,
},
{
pubkey: new PublicKey(creator),
isSigner: true,
isWritable: false,
},
];
instructions.push(
({
keys,
programId: TOKEN_METADATA_PROGRAM_ID.toBase58(),
data,
}),
async function signMetadataBatch(metadataList, connection, keypair) {
const instructions: TransactionInstruction[] = metadataList.map(meta => {
return signMetadataInstruction(new PublicKey(meta[1]), keypair.publicKey);
});
await sendTransactionWithRetryWithKeypair(
connection,
keypair,
instructions,
[],
'single',
);
}
function delay(ms: number) {
return new Promise( resolve => setTimeout(resolve, ms) );
return new Promise(resolve => setTimeout(resolve, ms));
}


@@ -1,17 +1,29 @@
import { EXTENSION_PNG, ARWEAVE_PAYMENT_WALLET } from "../helpers/constants";
import path from "path";
import { createConfig, loadAnchorProgram, loadWalletKey } from "../helpers/accounts";
import { PublicKey } from "@solana/web3.js";
import fs from "fs";
import BN from "bn.js";
import * as anchor from "@project-serum/anchor";
import { sendTransactionWithRetryWithKeypair } from "../helpers/transactions";
import FormData from "form-data";
import { loadCache, saveCache } from "../helpers/cache";
import fetch from 'node-fetch';
import log from "loglevel";
import { EXTENSION_PNG } from '../helpers/constants';
import path from 'path';
import {
createConfig,
loadCandyProgram,
loadWalletKey,
} from '../helpers/accounts';
import { PublicKey } from '@solana/web3.js';
import fs from 'fs';
import BN from 'bn.js';
import { loadCache, saveCache } from '../helpers/cache';
import log from 'loglevel';
import { arweaveUpload } from '../helpers/upload/arweave';
import { ipfsCreds, ipfsUpload } from '../helpers/upload/ipfs';
import { chunks } from '../helpers/various';
export async function upload(files: string[], cacheName: string, env: string, keypair: string, totalNFTs: number): Promise<boolean> {
export async function upload(
files: string[],
cacheName: string,
env: string,
keypair: string,
totalNFTs: number,
storage: string,
retainAuthority: boolean,
ipfsCredentials: ipfsCreds,
): Promise<boolean> {
let uploadSuccessful = true;
const savedContent = loadCache(cacheName, env);
@@ -48,7 +60,7 @@ export async function upload(files: string[], cacheName: string, env: string, ke
const SIZE = images.length;
const walletKeyPair = loadWalletKey(keypair);
const anchorProgram = await loadAnchorProgram(walletKeyPair, env);
const anchorProgram = await loadCandyProgram(walletKeyPair, env);
let config = cacheContent.program.config
? new PublicKey(cacheContent.program.config)
@@ -64,8 +76,6 @@ export async function upload(files: string[], cacheName: string, env: string, ke
log.info(`Processing file: ${i}`);
}
const storageCost = 10;
let link = cacheContent?.items?.[index]?.link;
if (!link || !cacheContent.program.uuid) {
const manifestPath = image.replace(EXTENSION_PNG, '.json');
@@ -80,7 +90,7 @@ export async function upload(files: string[], cacheName: string, env: string, ke
if (i === 0 && !cacheContent.program.uuid) {
// initialize config
log.info(`initializing config`)
log.info(`initializing config`);
try {
const res = await createConfig(anchorProgram, walletKeyPair, {
maxNumberOfLines: new BN(totalNFTs),
@@ -88,7 +98,7 @@ export async function upload(files: string[], cacheName: string, env: string, ke
sellerFeeBasisPoints: manifest.seller_fee_basis_points,
isMutable: true,
maxSupply: new BN(0),
retainAuthority: true,
retainAuthority: retainAuthority,
creators: manifest.properties.creators.map(creator => {
return {
address: new PublicKey(creator.address),
@@ -101,7 +111,9 @@ export async function upload(files: string[], cacheName: string, env: string, ke
cacheContent.program.config = res.config.toBase58();
config = res.config;
log.info(`initialized config for a candy machine with publickey: ${res.config.toBase58()}`)
log.info(
`initialized config for a candy machine with publickey: ${res.config.toBase58()}`,
);
saveCache(cacheName, env, cacheContent);
} catch (exx) {
@@ -111,47 +123,31 @@ export async function upload(files: string[], cacheName: string, env: string, ke
}
if (!link) {
const instructions = [
anchor.web3.SystemProgram.transfer({
fromPubkey: walletKeyPair.publicKey,
toPubkey: ARWEAVE_PAYMENT_WALLET,
lamports: storageCost,
}),
];
const tx = await sendTransactionWithRetryWithKeypair(
anchorProgram.provider.connection,
walletKeyPair,
instructions,
[],
'single',
);
log.debug('transaction for arweave payment:', tx);
// data.append('tags', JSON.stringify(tags));
// payment transaction
const data = new FormData();
data.append('transaction', tx['txid']);
data.append('env', env);
data.append('file[]', fs.createReadStream(image), {filename: `image.png`, contentType: 'image/png'});
data.append('file[]', manifestBuffer, 'metadata.json');
try {
const result = await uploadToArweave(data, manifest, index);
const metadataFile = result.messages?.find(
m => m.filename === 'manifest.json',
);
if (metadataFile?.transactionId) {
link = `https://arweave.net/${metadataFile.transactionId}`;
log.debug(`File uploaded: ${link}`);
if (storage === 'arweave') {
link = await arweaveUpload(
walletKeyPair,
anchorProgram,
env,
image,
manifestBuffer,
manifest,
index,
);
} else if (storage === 'ipfs') {
link = await ipfsUpload(ipfsCredentials, image, manifestBuffer);
}
cacheContent.items[index] = {
link,
name: manifest.name,
onChain: false,
};
saveCache(cacheName, env, cacheContent);
if (link) {
log.debug('setting cache for ', index);
cacheContent.items[index] = {
link,
name: manifest.name,
onChain: false,
};
cacheContent.authority = walletKeyPair.publicKey.toBase58();
saveCache(cacheName, env, cacheContent);
}
} catch (er) {
uploadSuccessful = false;
log.error(`Error uploading file ${index}`, er);
@@ -160,7 +156,6 @@ export async function upload(files: string[], cacheName: string, env: string, ke
}
}
const keys = Object.keys(cacheContent.items);
try {
await Promise.all(
@@ -179,7 +174,9 @@ export async function upload(files: string[], cacheName: string, env: string, ke
const ind = keys[indexes[0]];
if (onChain.length != indexes.length) {
log.info(`Writing indices ${ind}-${keys[indexes[indexes.length - 1]]}`);
log.info(
`Writing indices ${ind}-${keys[indexes[indexes.length - 1]]}`,
);
try {
await anchorProgram.rpc.addConfigLines(
ind,
@@ -203,7 +200,12 @@ export async function upload(files: string[], cacheName: string, env: string, ke
});
saveCache(cacheName, env, cacheContent);
} catch (e) {
log.error(`saving config line ${ind}-${keys[indexes[indexes.length - 1]]} failed`, e);
log.error(
`saving config line ${ind}-${
keys[indexes[indexes.length - 1]]
} failed`,
e,
);
uploadSuccessful = false;
}
}
@@ -219,23 +221,3 @@ export async function upload(files: string[], cacheName: string, env: string, ke
console.log(`Done. Successful = ${uploadSuccessful}.`);
return uploadSuccessful;
}
async function uploadToArweave(data: FormData, manifest, index) {
log.debug(`trying to upload ${index}.png: ${manifest.name}`)
return await (
await fetch(
'https://us-central1-principal-lane-200702.cloudfunctions.net/uploadFile4',
{
method: 'POST',
// @ts-ignore
body: data,
},
)
).json();
}
function chunks(array, size) {
return Array.apply(0, new Array(Math.ceil(array.length / size))).map(
(_, index) => array.slice(index * size, (index + 1) * size),
);
}
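The local `chunks` helper removed above now comes from `../helpers/various`. A behaviour-equivalent standalone sketch of that slicing (assuming the imported version keeps the same semantics):

```typescript
// Split `array` into consecutive slices of at most `size` elements; the last
// slice holds the remainder.
function chunks<T>(array: T[], size: number): T[][] {
  return Array.from({ length: Math.ceil(array.length / size) }, (_, index) =>
    array.slice(index * size, (index + 1) * size),
  );
}
```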


@@ -0,0 +1,16 @@
{
"name": "Invalid shares",
"description": "",
"image": "0.png",
"external_url": "",
"seller_fee_basis_points": 0,
"properties": {
"files": [{ "uri": "0.png", "type": "image/png" }],
"creators": [
{
"address": "111111111111111111111111111111",
"share": 100
}
]
}
}


@@ -0,0 +1,16 @@
{
"name": "Invalid shares",
"description": "",
"image": "0.png",
"external_url": "",
"seller_fee_basis_points": 0,
"properties": {
"files": [{ "uri": "0.png", "type": "image/png" }],
"creators": [
{
"address": "111111111111111111111111111111111",
"share": 0
}
]
}
}


@@ -0,0 +1,5 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP
exports[`\`metaplex verify_token_metadata\` invalidates ../__fixtures__/invalidSchema/invalid-address.json 1`] = `"does not match pattern \\"[1-9A-HJ-NP-Za-km-z]{32,44}\\""`;
exports[`\`metaplex verify_token_metadata\` invalidates ../__fixtures__/invalidSchema/invalid-shares.json 1`] = `"must be strictly greater than 0"`;


@@ -0,0 +1,103 @@
import fs from 'fs';
import path from 'path';
import log from 'loglevel';
import {
verifyTokenMetadata,
verifyAggregateShare,
verifyImageURL,
verifyConsistentShares,
verifyCreatorCollation,
} from '../index';
const getFiles = rootDir => {
const assets = fs.readdirSync(rootDir).map(file => path.join(rootDir, file));
return assets;
};
describe('`metaplex verify_token_metadata`', () => {
const spy = jest.spyOn(log, 'warn');
beforeEach(() => {
spy.mockClear();
});
it('catches mismatched assets', () => {
const mismatchedAssets = getFiles(
path.join(__dirname, '../__fixtures__/mismatchedAssets'),
);
expect(() =>
verifyTokenMetadata({ files: mismatchedAssets }),
).toThrowErrorMatchingInlineSnapshot(
`"number of png files (0) is different than the number of json files (1)"`,
);
});
const invalidSchemas = getFiles(
path.join(__dirname, '../__fixtures__/invalidSchema'),
);
invalidSchemas.forEach(invalidSchema => {
it(`invalidates ${path.relative(__dirname, invalidSchema)}`, () => {
expect(() =>
verifyTokenMetadata({
files: [invalidSchema, invalidSchema.replace('.json', '.png')],
}),
).toThrowErrorMatchingSnapshot();
});
});
it('throws on invalid share allocation', () => {
expect(() =>
verifyAggregateShare(
[{ address: 'some-solana-address', share: 80 }],
'placeholder-manifest-file',
),
).toThrowErrorMatchingInlineSnapshot(
`"Creator share for placeholder-manifest-file does not add up to 100, got: 80."`,
);
expect(() =>
verifyAggregateShare(
[
{ address: 'some-solana-address', share: 80 },
{
address: 'some-other-solana-address',
share: 19.9,
},
],
'placeholder-manifest-file',
),
).toThrowErrorMatchingInlineSnapshot(
`"Creator share for placeholder-manifest-file does not add up to 100, got: 99.9."`,
);
});
it('warns when using different image URIs', () => {
verifyImageURL(
'https://google.com?ext=png',
[{ uri: 'https://google.com?ext=png', type: 'image/png' }],
'0.json',
);
expect(spy).toHaveBeenCalledTimes(1);
});
it('warns when there are inconsistent share allocations', () => {
const collatedCreators = new Map([
['some-solana-address', { shares: new Set([70]), tokenCount: 10 }],
]);
verifyCreatorCollation(
[{ address: 'some-solana-address', share: 80 }],
collatedCreators,
'0.json',
);
expect(spy).toHaveBeenCalledTimes(1);
});
it('warns when there are inconsistent creator allocations', () => {
const collatedCreators = new Map([
['some-solana-address', { shares: new Set([80]), tokenCount: 10 }],
['some-other-solana-address', { shares: new Set([80]), tokenCount: 20 }],
]);
verifyConsistentShares(collatedCreators);
expect(spy).toHaveBeenCalledTimes(1);
});
});


@@ -0,0 +1,163 @@
import path from 'path';
import log from 'loglevel';
import { validate } from 'jsonschema';
import { EXTENSION_JSON, EXTENSION_PNG } from '../../helpers/constants';
import tokenMetadataJsonSchema from './token-metadata.schema.json';
type TokenMetadata = {
image: string;
properties: {
files: { uri: string; type: string }[];
creators: { address: string; share: number }[];
};
};
export const verifyAssets = ({ files, uploadElementsCount }) => {
const pngFileCount = files.filter(it => {
return it.endsWith(EXTENSION_PNG);
}).length;
const jsonFileCount = files.filter(it => {
return it.endsWith(EXTENSION_JSON);
}).length;
const parsedNumber = parseInt(uploadElementsCount, 10);
// parseInt returns NaN (not null/undefined) on bad input, so `??` would never fall back
const elemCount = Number.isNaN(parsedNumber) ? pngFileCount : parsedNumber;
if (pngFileCount !== jsonFileCount) {
throw new Error(
`number of png files (${pngFileCount}) is different than the number of json files (${jsonFileCount})`,
);
}
if (elemCount < pngFileCount) {
throw new Error(
`max number (${elemCount}) cannot be smaller than the number of elements in the source folder (${pngFileCount})`,
);
}
log.info(`Verifying token metadata for ${pngFileCount} (png+json) pairs`);
};
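One subtlety in the element-count derivation above: `parseInt` yields NaN (never null or undefined) on non-numeric input, so a nullish `??` fallback alone silently keeps NaN. A NaN-safe sketch (the helper name is illustrative):

```typescript
// Derive the effective element count: use the explicit count when it parses,
// otherwise fall back to the number of PNG files found on disk.
function elementCount(uploadElementsCount: unknown, pngFileCount: number): number {
  const parsed = parseInt(String(uploadElementsCount), 10);
  return Number.isNaN(parsed) ? pngFileCount : parsed;
}
```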
export const verifyAggregateShare = (
creators: TokenMetadata['properties']['creators'],
manifestFile,
) => {
const aggregateShare = creators
.map(creator => creator.share)
.reduce((memo, share) => {
return memo + share;
}, 0);
// Check that creator share adds up to 100
if (aggregateShare !== 100) {
throw new Error(
`Creator share for ${manifestFile} does not add up to 100, got: ${aggregateShare}.`,
);
}
};
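The aggregate check above as a pure helper, with creator shapes following the `TokenMetadata` type. Note the schema only requires 0 < share <= 100 per creator, so fractional shares are legal and a collection author must make the sum land on exactly 100:

```typescript
// Sum of all creator shares; verifyAggregateShare rejects anything !== 100.
function aggregateShare(creators: { share: number }[]): number {
  return creators.reduce((memo, creator) => memo + creator.share, 0);
}
```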
type CollatedCreators = Map<
string,
{ shares: Set<number>; tokenCount: number }
>;
export const verifyCreatorCollation = (
creators: TokenMetadata['properties']['creators'],
collatedCreators: CollatedCreators,
manifestFile: string,
) => {
for (const { address, share } of creators) {
if (collatedCreators.has(address)) {
const creator = collatedCreators.get(address);
creator.shares.add(share);
if (creator.shares.size > 1) {
log.warn(
`The creator share for ${address} in ${manifestFile} is different than the share declared for a previous token. This means at least one token is inconsistently configured, but we will continue. `,
);
}
creator.tokenCount += 1;
} else {
collatedCreators.set(address, {
tokenCount: 1,
shares: new Set([share]),
});
}
}
};
export const verifyImageURL = (image, files, manifestFile) => {
const expectedImagePath = `image${EXTENSION_PNG}`;
if (image !== expectedImagePath) {
// We _could_ match against this in the JSON schema validation, but it is totally valid to have arbitrary URLs to images here.
// The downside, though, is that those images will not get uploaded to Arweave since they're not on-disk.
log.warn(`We expected the \`image\` property in ${manifestFile} to be ${expectedImagePath}.
This will still work properly (assuming the URL is valid!), however, this image will not get uploaded to Arweave through the \`metaplex upload\` command.
If you want us to take care of getting this into Arweave, make sure to set \`image\`: "${expectedImagePath}"
The \`metaplex upload\` command will automatically substitute this URL with the Arweave URL location.
`);
}
const pngFiles = files.filter(file => file.type === 'image/png');
if (pngFiles.length === 0 || !pngFiles.some(file => file.uri === image)) {
throw new Error(
`At least one entry with the \`image/png\` type in the \`properties.files\` array is expected to match the \`image\` property.`,
);
}
};
export const verifyConsistentShares = (collatedCreators: CollatedCreators) => {
// We expect all creators to have been added to the same amount of tokens
const tokenCountSet = new Set<number>();
for (const [address, collation] of collatedCreators.entries()) {
tokenCountSet.add(collation.tokenCount);
if (tokenCountSet.size > 1) {
log.warn(
`We found that ${address} was added to more tokens than other creators.`,
);
}
}
};
export const verifyMetadataManifests = ({ files }) => {
const manifestFiles = files.filter(
file => path.extname(file) === EXTENSION_JSON,
);
// Used to keep track of the share allocations for individual creators
// We will send a warning if we notice discrepancies across the entire collection.
const collatedCreators: CollatedCreators = new Map();
// Do manifest-specific stuff here
for (const manifestFile of manifestFiles) {
// Check the overall schema shape. This is a non-exhaustive check, but guarantees the bare minimum needed for the rest of the commands to succeed.
const tokenMetadata = require(manifestFile) as TokenMetadata;
validate(tokenMetadata, tokenMetadataJsonSchema, { throwError: true });
const {
properties: { creators },
} = tokenMetadata;
verifyAggregateShare(creators, manifestFile);
verifyCreatorCollation(creators, collatedCreators, manifestFile);
// Check that the `image` and at least one of the files has a URI matching the index of this token.
const {
image,
properties: { files },
} = tokenMetadata;
verifyImageURL(image, files, manifestFile);
}
verifyConsistentShares(collatedCreators);
};
export const verifyTokenMetadata = ({
files,
uploadElementsCount = null,
}): boolean => {
// Will we need to deal with the cache?
verifyAssets({ files, uploadElementsCount });
verifyMetadataManifests({ files });
return true;
};


@@ -0,0 +1,67 @@
{
"title": "Token Metadata",
"type": "object",
"properties": {
"name": {
"type": "string",
"description": "Identifies the asset to which this token represents"
},
"description": {
"type": "string",
"description": "Describes the asset to which this token represents"
},
"image": {
"type": "string",
"description": "A URI pointing to a resource with mime type image/* representing the asset to which this token represents. Consider making any images at a width between 320 and 1080 pixels and aspect ratio between 1.91:1 and 4:5 inclusive."
},
"external_url": {
"type": "string",
"description": "A URI pointing to an external resource that will take user outside of the platform."
},
"seller_fee_basis_points": {
"type": "number",
"description": "Royalties percentage awarded to creators, represented as a 'basis point' (i.e., multiple the percentage by 100: 75% = 7500)",
"minimum": 0,
"maximum": 10000
},
"properties": {
"type": "object",
"description": "Arbitrary properties. Values may be strings, numbers, object or arrays.",
"properties": {
"files": {
"type": "array",
"items": {
"type": "object",
"properties": {
"uri": { "type": "string" },
"type": {
"type": "string",
"description": "The MIME type for this file"
}
}
}
},
"creators": {
"type": "array",
"description": "Contains list of creators, each with Solana address and share of the NFT",
"items": {
"type": "object",
"properties": {
"address": {
"type": "string",
"description": "A Solana address",
"pattern": "[1-9A-HJ-NP-Za-km-z]{32,44}"
},
"share": {
"type": "number",
"description": "Percentage of royalties to send to this address, represented as a percentage (0-100). The sum of all shares must equal 100",
"exclusiveMinimum": 0,
"maximum": 100
}
}
}
}
}
}
}
}
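The `address` pattern in the schema above is the base58 alphabet (no 0, O, I, or l) at the 32-44 character lengths typical of Solana public keys. jsonschema applies the pattern unanchored; it is anchored here for a stricter standalone check:

```typescript
// Base58 alphabet, 32-44 chars, anchored to the whole string.
const SOLANA_ADDRESS_PATTERN = /^[1-9A-HJ-NP-Za-km-z]{32,44}$/;

function looksLikeSolanaAddress(address: string): boolean {
  return SOLANA_ADDRESS_PATTERN.test(address);
}
```

This is the check that rejects the 30-character fixture address in the snapshot tests above.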

File diff suppressed because it is too large.


@@ -5,13 +5,14 @@ import {
SPL_ASSOCIATED_TOKEN_ACCOUNT_PROGRAM_ID,
TOKEN_METADATA_PROGRAM_ID,
TOKEN_PROGRAM_ID,
FAIR_LAUNCH_PROGRAM_ID,
} from './constants';
import * as anchor from '@project-serum/anchor';
import fs from 'fs';
import BN from "bn.js";
import { createConfigAccount } from "./instructions";
import { web3 } from "@project-serum/anchor";
import log from "loglevel";
import BN from 'bn.js';
import { createConfigAccount } from './instructions';
import { web3 } from '@project-serum/anchor';
import log from 'loglevel';
export const createConfig = async function (
anchorProgram: anchor.Program,
@@ -99,6 +100,78 @@ export const getConfig = async (
);
};
export const getTokenMint = async (
authority: anchor.web3.PublicKey,
uuid: string,
): Promise<[anchor.web3.PublicKey, number]> => {
return await anchor.web3.PublicKey.findProgramAddress(
[
Buffer.from('fair_launch'),
authority.toBuffer(),
Buffer.from('mint'),
Buffer.from(uuid),
],
FAIR_LAUNCH_PROGRAM_ID,
);
};
export const getFairLaunch = async (
tokenMint: anchor.web3.PublicKey,
): Promise<[anchor.web3.PublicKey, number]> => {
return await anchor.web3.PublicKey.findProgramAddress(
[Buffer.from('fair_launch'), tokenMint.toBuffer()],
FAIR_LAUNCH_PROGRAM_ID,
);
};
export const getFairLaunchTicket = async (
tokenMint: anchor.web3.PublicKey,
buyer: anchor.web3.PublicKey,
): Promise<[anchor.web3.PublicKey, number]> => {
return await anchor.web3.PublicKey.findProgramAddress(
[Buffer.from('fair_launch'), tokenMint.toBuffer(), buyer.toBuffer()],
FAIR_LAUNCH_PROGRAM_ID,
);
};
export const getFairLaunchLotteryBitmap = async (
tokenMint: anchor.web3.PublicKey,
): Promise<[anchor.web3.PublicKey, number]> => {
return await anchor.web3.PublicKey.findProgramAddress(
[Buffer.from('fair_launch'), tokenMint.toBuffer(), Buffer.from('lottery')],
FAIR_LAUNCH_PROGRAM_ID,
);
};
export const getFairLaunchTicketSeqLookup = async (
tokenMint: anchor.web3.PublicKey,
seq: anchor.BN,
): Promise<[anchor.web3.PublicKey, number]> => {
return await anchor.web3.PublicKey.findProgramAddress(
[Buffer.from('fair_launch'), tokenMint.toBuffer(), seq.toBuffer('le', 8)],
FAIR_LAUNCH_PROGRAM_ID,
);
};
export const getAtaForMint = async (
mint: anchor.web3.PublicKey,
buyer: anchor.web3.PublicKey,
): Promise<[anchor.web3.PublicKey, number]> => {
return await anchor.web3.PublicKey.findProgramAddress(
[buyer.toBuffer(), TOKEN_PROGRAM_ID.toBuffer(), mint.toBuffer()],
SPL_ASSOCIATED_TOKEN_ACCOUNT_PROGRAM_ID,
);
};
export const getTreasury = async (
tokenMint: anchor.web3.PublicKey,
): Promise<[anchor.web3.PublicKey, number]> => {
return await anchor.web3.PublicKey.findProgramAddress(
[Buffer.from('fair_launch'), tokenMint.toBuffer(), Buffer.from('treasury')],
FAIR_LAUNCH_PROGRAM_ID,
);
};
export const getMetadata = async (
mint: anchor.web3.PublicKey,
): Promise<anchor.web3.PublicKey> => {
@@ -131,14 +204,17 @@ export const getMasterEdition = async (
};
export function loadWalletKey(keypair): Keypair {
if (!keypair || keypair == '') {
throw new Error('Keypair is required!');
}
const loaded = Keypair.fromSecretKey(
new Uint8Array(JSON.parse(fs.readFileSync(keypair).toString())),
);
log.info(`wallet public key: ${loaded.publicKey}`)
log.info(`wallet public key: ${loaded.publicKey}`);
return loaded;
}
export async function loadAnchorProgram(walletKeyPair: Keypair, env: string) {
export async function loadCandyProgram(walletKeyPair: Keypair, env: string) {
// @ts-ignore
const solConnection = new web3.Connection(web3.clusterApiUrl(env));
const walletWrapper = new anchor.Wallet(walletKeyPair);
@@ -148,6 +224,21 @@ export async function loadAnchorProgram(walletKeyPair: Keypair, env: string) {
const idl = await anchor.Program.fetchIdl(CANDY_MACHINE_PROGRAM_ID, provider);
const program = new anchor.Program(idl, CANDY_MACHINE_PROGRAM_ID, provider);
log.debug("program id from anchor", program.programId.toBase58());
log.debug('program id from anchor', program.programId.toBase58());
return program;
}
export async function loadFairLaunchProgram(
walletKeyPair: Keypair,
env: string,
) {
// @ts-ignore
const solConnection = new anchor.web3.Connection(web3.clusterApiUrl(env));
const walletWrapper = new anchor.Wallet(walletKeyPair);
const provider = new anchor.Provider(solConnection, walletWrapper, {
preflightCommitment: 'recent',
});
const idl = await anchor.Program.fetchIdl(FAIR_LAUNCH_PROGRAM_ID, provider);
return new anchor.Program(idl, FAIR_LAUNCH_PROGRAM_ID, provider);
}

View File

@@ -1,19 +1,36 @@
import path from "path";
import { CACHE_PATH } from "./constants";
import fs from "fs";
import path from 'path';
import { CACHE_PATH } from './constants';
import fs from 'fs';
export function cachePath(env: string, cacheName: string, cPath: string = CACHE_PATH) {
export function cachePath(
env: string,
cacheName: string,
cPath: string = CACHE_PATH,
) {
return path.join(cPath, `${env}-${cacheName}`);
}
export function loadCache(cacheName: string, env: string, cPath: string = CACHE_PATH) {
export function loadCache(
cacheName: string,
env: string,
cPath: string = CACHE_PATH,
) {
const path = cachePath(env, cacheName, cPath);
return fs.existsSync(path)
? JSON.parse(fs.readFileSync(path).toString())
: undefined;
}
export function saveCache(cacheName: string, env: string, cacheContent, cPath: string = CACHE_PATH) {
fs.writeFileSync(cachePath(env, cacheName, cPath), JSON.stringify(cacheContent));
export function saveCache(
cacheName: string,
env: string,
cacheContent,
cPath: string = CACHE_PATH,
) {
cacheContent.env = env;
cacheContent.cacheName = cacheName;
fs.writeFileSync(
cachePath(env, cacheName, cPath),
JSON.stringify(cacheContent),
);
}
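A quick round-trip of the save/load convention above, with a throwaway temp directory standing in for `CACHE_PATH`; the `${env}-${cacheName}` file name matches what `cachePath` produces. This is an illustrative sketch, not source code:

```typescript
import fs from 'fs';
import os from 'os';
import path from 'path';

// Write and re-read a cache file named `${env}-${cacheName}`, as cachePath does,
// but rooted in a temp directory instead of CACHE_PATH.
const dir = fs.mkdtempSync(path.join(os.tmpdir(), 'metaplex-cache-'));
const cacheFile = path.join(dir, 'devnet-temp'); // i.e. cachePath('devnet', 'temp', dir)
fs.writeFileSync(
  cacheFile,
  JSON.stringify({ env: 'devnet', cacheName: 'temp', items: {} }),
);
const loaded = fs.existsSync(cacheFile)
  ? JSON.parse(fs.readFileSync(cacheFile).toString())
  : undefined;
```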

View File

@@ -4,11 +4,24 @@ export const MAX_NAME_LENGTH = 32;
export const MAX_URI_LENGTH = 200;
export const MAX_SYMBOL_LENGTH = 10;
export const MAX_CREATOR_LEN = 32 + 1 + 1;
export const ARWEAVE_PAYMENT_WALLET = new PublicKey('HvwC9QSAzvGXhhVrgPmauVwFWcYZhne3hVot9EbHuFTm');
export const CANDY_MACHINE_PROGRAM_ID = new PublicKey('cndyAnrLdpjq1Ssp1z8xxDsB8dxe7u4HL5Nxi2K5WXZ');
export const TOKEN_METADATA_PROGRAM_ID = new PublicKey('metaqbxxUerdq28cj1RbAWkYQm3ybzjb6a8bt518x1s');
export const SPL_ASSOCIATED_TOKEN_ACCOUNT_PROGRAM_ID = new PublicKey('ATokenGPvbdGVxr1b2hvZbsiqW5xWH25efTNsLJA8knL');
export const TOKEN_PROGRAM_ID = new PublicKey('TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA');
export const ARWEAVE_PAYMENT_WALLET = new PublicKey(
'HvwC9QSAzvGXhhVrgPmauVwFWcYZhne3hVot9EbHuFTm',
);
export const CANDY_MACHINE_PROGRAM_ID = new PublicKey(
'cndyAnrLdpjq1Ssp1z8xxDsB8dxe7u4HL5Nxi2K5WXZ',
);
export const TOKEN_METADATA_PROGRAM_ID = new PublicKey(
'metaqbxxUerdq28cj1RbAWkYQm3ybzjb6a8bt518x1s',
);
export const SPL_ASSOCIATED_TOKEN_ACCOUNT_PROGRAM_ID = new PublicKey(
'ATokenGPvbdGVxr1b2hvZbsiqW5xWH25efTNsLJA8knL',
);
export const TOKEN_PROGRAM_ID = new PublicKey(
'TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA',
);
export const FAIR_LAUNCH_PROGRAM_ID = new PublicKey(
'faircnAB9k59Y4TXmLabBULeuTLgV7TkGMGNkjnA15j',
);
export const CONFIG_ARRAY_START =
32 + // authority
4 +

View File

@@ -13,7 +13,7 @@ import {
} from '@solana/web3.js';
import { getUnixTs, sleep } from './various';
import { DEFAULT_TIMEOUT } from './constants';
import log from "loglevel";
import log from 'loglevel';
interface BlockhashAndFeeCalculator {
blockhash: Blockhash;
@@ -47,11 +47,11 @@ export const sendTransactionWithRetryWithKeypair = async (
}
if (signers.length > 0) {
transaction.partialSign(...signers);
transaction.sign(...[wallet, ...signers]);
} else {
transaction.sign(wallet);
}
transaction.sign(wallet);
if (beforeSend) {
beforeSend();
}
@@ -242,7 +242,7 @@ async function awaitTransactionSignatureConfirmation(
} else if (!status.confirmations) {
log.error('REST no confirmations for', txid, status);
} else {
log.info('REST confirmation for', txid, status);
log.debug('REST confirmation for', txid, status);
done = true;
resolve(status);
}

View File

@@ -0,0 +1,73 @@
import * as anchor from '@project-serum/anchor';
import FormData from 'form-data';
import fs from 'fs';
import log from 'loglevel';
import fetch from 'node-fetch';
import { ARWEAVE_PAYMENT_WALLET } from '../constants';
import { sendTransactionWithRetryWithKeypair } from '../transactions';
async function upload(data: FormData, manifest, index) {
log.debug(`trying to upload ${index}.png: ${manifest.name}`);
return await (
await fetch(
'https://us-central1-principal-lane-200702.cloudfunctions.net/uploadFile4',
{
method: 'POST',
// @ts-ignore
body: data,
},
)
).json();
}
export async function arweaveUpload(
walletKeyPair,
anchorProgram,
env,
image,
manifestBuffer,
manifest,
index,
) {
const storageCost = 10;
const instructions = [
anchor.web3.SystemProgram.transfer({
fromPubkey: walletKeyPair.publicKey,
toPubkey: ARWEAVE_PAYMENT_WALLET,
lamports: storageCost,
}),
];
const tx = await sendTransactionWithRetryWithKeypair(
anchorProgram.provider.connection,
walletKeyPair,
instructions,
[],
'single',
);
log.debug('transaction for arweave payment:', tx);
const data = new FormData();
data.append('transaction', tx['txid']);
data.append('env', env);
data.append('file[]', fs.createReadStream(image), {
filename: `image.png`,
contentType: 'image/png',
});
data.append('file[]', manifestBuffer, 'metadata.json');
const result = await upload(data, manifest, index);
const metadataFile = result.messages?.find(
m => m.filename === 'manifest.json',
);
if (metadataFile?.transactionId) {
const link = `https://arweave.net/${metadataFile.transactionId}`;
log.debug(`File uploaded: ${link}`);
return link;
} else {
// @todo improve
throw new Error(`No transaction ID for upload: ${index}`);
}
}

View File

@@ -0,0 +1,67 @@
import log from 'loglevel';
import fetch from 'node-fetch';
import { create, globSource } from 'ipfs-http-client';
export interface ipfsCreds {
projectId: string;
secretKey: string;
}
function sleep(ms: number): Promise<void> {
return new Promise(resolve => setTimeout(resolve, ms));
}
export async function ipfsUpload(
ipfsCredentials: ipfsCreds,
image: string,
manifestBuffer: Buffer,
) {
const tokenIfps = `${ipfsCredentials.projectId}:${ipfsCredentials.secretKey}`;
// @ts-ignore
const ipfs = create('https://ipfs.infura.io:5001');
const uploadToIpfs = async source => {
const { cid } = await ipfs.add(source).catch();
return cid;
};
const mediaHash = await uploadToIpfs(globSource(image, { recursive: true }));
log.debug('mediaHash:', mediaHash);
const mediaUrl = `https://ipfs.io/ipfs/${mediaHash}`;
log.debug('mediaUrl:', mediaUrl);
const authIFPS = Buffer.from(tokenIfps).toString('base64');
await fetch(`https://ipfs.infura.io:5001/api/v0/pin/add?arg=${mediaHash}`, {
headers: {
Authorization: `Basic ${authIFPS}`,
},
method: 'POST',
});
log.info('uploaded image for file:', image);
await sleep(500);
const manifestJson = JSON.parse(manifestBuffer.toString('utf8'));
manifestJson.image = mediaUrl;
manifestJson.properties.files = manifestJson.properties.files.map(f => {
return { ...f, uri: mediaUrl };
});
const manifestHash = await uploadToIpfs(
Buffer.from(JSON.stringify(manifestJson)),
);
await fetch(
`https://ipfs.infura.io:5001/api/v0/pin/add?arg=${manifestHash}`,
{
headers: {
Authorization: `Basic ${authIFPS}`,
},
method: 'POST',
},
);
await sleep(500);
const link = `https://ipfs.io/ipfs/${manifestHash}`;
log.info('uploaded manifest: ', link);
return link;
}

View File

@@ -1,7 +1,4 @@
import { LAMPORTS_PER_SOL } from '@solana/web3.js';
import path from "path";
import { CACHE_PATH } from "./constants";
import fs from "fs";
import { LAMPORTS_PER_SOL, AccountInfo } from '@solana/web3.js';
export const getUnixTs = () => {
return new Date().getTime() / 1000;
};
@@ -54,16 +51,71 @@ export function parsePrice(price: string, mantissa: number = LAMPORTS_PER_SOL) {
return Math.ceil(parseFloat(price) * mantissa);
}
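`parsePrice` (ending above) converts a decimal SOL string to an integer lamport count, rounding up. Restated self-contained for illustration; `LAMPORTS_PER_SOL` is inlined here with its usual value of 1,000,000,000:

```typescript
const LAMPORTS_PER_SOL = 1_000_000_000; // standard Solana constant, inlined for the sketch

// Same body as the parsePrice above: parse the decimal string, scale, round up.
function parsePrice(price: string, mantissa: number = LAMPORTS_PER_SOL): number {
  return Math.ceil(parseFloat(price) * mantissa);
}
```

`Math.ceil` guards against fractional lamports produced by floating-point multiplication.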
export async function upload(data: FormData, manifest, index) {
console.log(`trying to upload ${index}.png: ${manifest.name}`);
return await (
await fetch(
'https://us-central1-principal-lane-200702.cloudfunctions.net/uploadFile4',
{
method: 'POST',
// @ts-ignore
body: data,
},
)
).json();
export function parseDate(date) {
if (date === 'now') {
return Date.now() / 1000;
}
return Date.parse(date) / 1000;
}
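`parseDate` accepts either the literal string `'now'` or anything `Date.parse` understands, and returns unix seconds. A typed restatement of the same logic:

```typescript
// Same logic as the parseDate above, with an explicit type on the parameter.
function parseDate(date: string): number {
  if (date === 'now') {
    return Date.now() / 1000; // current time, in seconds
  }
  return Date.parse(date) / 1000; // Date.parse returns milliseconds
}
```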
export const getMultipleAccounts = async (
connection: any,
keys: string[],
commitment: string,
) => {
const result = await Promise.all(
chunks(keys, 99).map(chunk =>
getMultipleAccountsCore(connection, chunk, commitment),
),
);
const array = result
.map(
a =>
//@ts-ignore
a.array.map(acc => {
if (!acc) {
return undefined;
}
const { data, ...rest } = acc;
const obj = {
...rest,
data: Buffer.from(data[0], 'base64'),
} as AccountInfo<Buffer>;
return obj;
}) as AccountInfo<Buffer>[],
)
//@ts-ignore
.flat();
return { keys, array };
};
export function chunks(array, size) {
return Array.apply(0, new Array(Math.ceil(array.length / size))).map(
(_, index) => array.slice(index * size, (index + 1) * size),
);
}
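`chunks` splits a key list into fixed-size groups; `getMultipleAccounts` above uses groups of 99, which appears intended to stay under the RPC's per-request account limit. An equivalent restatement with `Array.from` (behavior is the same; the source uses `Array.apply`):

```typescript
// Equivalent to the chunks() above: slice the array into Math.ceil(n / size) groups.
function chunks<T>(array: T[], size: number): T[][] {
  return Array.from({ length: Math.ceil(array.length / size) }, (_, index) =>
    array.slice(index * size, (index + 1) * size),
  );
}
```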
const getMultipleAccountsCore = async (
connection: any,
keys: string[],
commitment: string,
) => {
const args = connection._buildArgs([keys], commitment, 'base64');
const unsafeRes = await connection._rpcRequest('getMultipleAccounts', args);
if (unsafeRes.error) {
throw new Error(
'failed to get info about account ' + unsafeRes.error.message,
);
}
if (unsafeRes.result.value) {
const array = unsafeRes.result.value as AccountInfo<string[]>[];
return { keys, array };
}
// TODO: fix
throw new Error();
};

View File

@@ -14,7 +14,8 @@
"noLib": false,
"preserveConstEnums": true,
"suppressImplicitAnyIndexErrors": true,
"lib": ["dom", "es6"]
"resolveJsonModule": true,
"lib": ["dom", "es2019"]
},
"exclude": ["node_modules", "typings/browser", "typings/browser.d.ts"],
"atom": {

View File

@@ -1,16 +1,12 @@
import { BN } from "@project-serum/anchor";
import { PublicKey, AccountInfo } from "@solana/web3.js";
import { BN } from '@project-serum/anchor';
import { PublicKey, AccountInfo } from '@solana/web3.js';
export class Creator {
address: PublicKey;
verified: boolean;
share: number;
constructor(args: {
address: PublicKey;
verified: boolean;
share: number;
}) {
constructor(args: { address: PublicKey; verified: boolean; share: number }) {
this.address = args.address;
this.verified = args.verified;
this.share = args.share;
@@ -67,7 +63,7 @@ export enum MetadataKey {
EditionV1 = 1,
MasterEditionV1 = 2,
MasterEditionV2 = 6,
EditionMarker = 7
EditionMarker = 7,
}
export class MasterEditionV1 {
@@ -89,49 +85,38 @@ export class MasterEditionV1 {
this.printingMint = args.printingMint;
this.oneTimePrintingAuthorizationMint =
args.oneTimePrintingAuthorizationMint;
};
}
}
export class MasterEditionV2 {
key: MetadataKey;
supply: BN;
maxSupply?: BN;
constructor(args: {
key: MetadataKey;
supply: BN;
maxSupply?: BN;
}) {
constructor(args: { key: MetadataKey; supply: BN; maxSupply?: BN }) {
this.key = MetadataKey.MasterEditionV2;
this.supply = args.supply;
this.maxSupply = args.maxSupply;
};
}
}
export class EditionMarker {
key: MetadataKey;
ledger: number[];
constructor(args: {
key: MetadataKey;
ledger: number[];
}) {
constructor(args: { key: MetadataKey; ledger: number[] }) {
this.key = MetadataKey.EditionMarker;
this.ledger = args.ledger;
};
}
}
export class Edition {
key: MetadataKey;
parent: PublicKey;
edition: BN;
constructor(args: {
key: MetadataKey;
parent: PublicKey;
edition: BN;
}) {
constructor(args: { key: MetadataKey; parent: PublicKey; edition: BN }) {
this.key = MetadataKey.EditionV1;
this.parent = args.parent;
this.edition = args.edition;
};
}
}
export class Data {
@@ -152,7 +137,7 @@ export class Data {
this.uri = args.uri;
this.sellerFeeBasisPoints = args.sellerFeeBasisPoints;
this.creators = args.creators;
};
}
}
export class Metadata {
@@ -178,7 +163,7 @@ export class Metadata {
this.data = args.data;
this.primarySaleHappened = args.primarySaleHappened;
this.isMutable = args.isMutable;
};
}
}
export const METADATA_SCHEMA = new Map<any, any>([
@@ -265,4 +250,4 @@ export const METADATA_SCHEMA = new Map<any, any>([
],
},
],
]);

View File

@@ -24,7 +24,7 @@
"watch-css": "less-watch-compiler src/ dist/lib/",
"watch-css-src": "less-watch-compiler src/ src/",
"watch": "tsc --watch",
"test": "jest test",
"test": "jest test --passWithNoTests",
"clean": "rm -rf dist",
"prepare": "run-s clean build",
"format:fix": "prettier --write \"**/*.+(js|jsx|ts|tsx|json|css|md)\""

View File

@@ -14,7 +14,7 @@ import { findProgramAddress, StringPublicKey, toPublicKey } from '../utils';
export const AUCTION_PREFIX = 'auction';
export const METADATA = 'metadata';
export const EXTENDED = 'extended';
export const MAX_AUCTION_DATA_EXTENDED_SIZE = 8 + 9 + 2 + 200;
export const MAX_AUCTION_DATA_EXTENDED_SIZE = 8 + 9 + 2 + 9 + 33 + 158;
export enum AuctionState {
Created = 0,
@@ -184,15 +184,21 @@ export class AuctionDataExtended {
totalUncancelledBids: BN;
tickSize: BN | null;
gapTickSizePercentage: number | null;
instantSalePrice: BN | null;
name: number[] | null;
constructor(args: {
totalUncancelledBids: BN;
tickSize: BN | null;
gapTickSizePercentage: number | null;
instantSalePrice: BN | null;
name: number[] | null;
}) {
this.totalUncancelledBids = args.totalUncancelledBids;
this.tickSize = args.tickSize;
this.gapTickSizePercentage = args.gapTickSizePercentage;
this.instantSalePrice = args.instantSalePrice;
this.name = args.name;
}
}
@@ -225,6 +231,8 @@ export class AuctionData {
/// Used for precalculation on the front end, not a backend key
bidRedemptionKey?: StringPublicKey;
auctionDataExtended?: StringPublicKey;
public timeToEnd(): CountdownState {
const now = moment().unix();
const ended = { days: 0, hours: 0, minutes: 0, seconds: 0 };
@@ -370,10 +378,14 @@ export interface IPartialCreateAuctionArgs {
tickSize: BN | null;
gapTickSizePercentage: number | null;
instantSalePrice: BN | null;
name: number[] | null;
}
export class CreateAuctionArgs implements IPartialCreateAuctionArgs {
instruction: number = 1;
instruction: number = 7;
/// How many winners are allowed for this auction. See AuctionData.
winners: WinnerLimit;
/// End time is the cut-off point that the auction is forced to end by. See AuctionData.
@@ -393,6 +405,10 @@ export class CreateAuctionArgs implements IPartialCreateAuctionArgs {
gapTickSizePercentage: number | null;
instantSalePrice: BN | null;
name: number[] | null;
constructor(args: {
winners: WinnerLimit;
endAuctionAt: BN | null;
@@ -403,6 +419,8 @@ export class CreateAuctionArgs implements IPartialCreateAuctionArgs {
priceFloor: PriceFloor;
tickSize: BN | null;
gapTickSizePercentage: number | null;
name: number[] | null;
instantSalePrice: BN | null;
}) {
this.winners = args.winners;
this.endAuctionAt = args.endAuctionAt;
@@ -413,6 +431,8 @@ export class CreateAuctionArgs implements IPartialCreateAuctionArgs {
this.priceFloor = args.priceFloor;
this.tickSize = args.tickSize;
this.gapTickSizePercentage = args.gapTickSizePercentage;
this.name = args.name;
this.instantSalePrice = args.instantSalePrice;
}
}
@@ -465,6 +485,8 @@ export const AUCTION_SCHEMA = new Map<any, any>([
['priceFloor', PriceFloor],
['tickSize', { kind: 'option', type: 'u64' }],
['gapTickSizePercentage', { kind: 'option', type: 'u8' }],
['instantSalePrice', { kind: 'option', type: 'u64' }],
['name', { kind: 'option', type: [32] }],
],
},
],
@@ -542,6 +564,8 @@ export const AUCTION_SCHEMA = new Map<any, any>([
['totalUncancelledBids', 'u64'],
['tickSize', { kind: 'option', type: 'u64' }],
['gapTickSizePercentage', { kind: 'option', type: 'u8' }],
['instantSalePrice', { kind: 'option', type: 'u64' }],
['name', { kind: 'option', type: [32] }],
],
},
],

View File

@@ -1,4 +1,5 @@
import {
PublicKey,
SystemProgram,
SYSVAR_RENT_PUBKEY,
TransactionInstruction,
@@ -249,13 +250,27 @@ export class Metadata {
this.data = args.data;
this.primarySaleHappened = args.primarySaleHappened;
this.isMutable = args.isMutable;
this.editionNonce = args.editionNonce;
this.editionNonce = args.editionNonce ?? null;
}
public async init() {
const edition = await getEdition(this.mint);
this.edition = edition;
this.masterEdition = edition;
const metadata = toPublicKey(programIds().metadata);
if (this.editionNonce !== null) {
this.edition = (
await PublicKey.createProgramAddress(
[
Buffer.from(METADATA_PREFIX),
metadata.toBuffer(),
toPublicKey(this.mint).toBuffer(),
new Uint8Array([this.editionNonce || 0]),
],
metadata,
)
).toBase58();
} else {
this.edition = await getEdition(this.mint);
}
this.masterEdition = this.edition;
}
}

View File

@@ -16,6 +16,7 @@ import {
import React, { useContext, useEffect, useMemo, useState } from 'react';
import { notify } from '../utils/notifications';
import { ExplorerLink } from '../components/ExplorerLink';
import { useQuerySearch } from '../hooks';
import {
TokenInfo,
TokenListProvider,
@@ -87,10 +88,16 @@ const ConnectionContext = React.createContext<ConnectionConfig>({
});
export function ConnectionProvider({ children = undefined as any }) {
const [endpoint, setEndpoint] = useLocalStorageState(
const searchParams = useQuerySearch();
const network = searchParams.get('network');
const queryEndpoint =
network && ENDPOINTS.find(({ name }) => name.startsWith(network))?.endpoint;
const [savedEndpoint, setEndpoint] = useLocalStorageState(
'connectionEndpoint',
ENDPOINTS[0].endpoint,
);
const endpoint = queryEndpoint || savedEndpoint;
const connection = useMemo(
() => new Connection(endpoint, 'recent'),

View File

@@ -22,6 +22,4 @@ export const getEmptyMetaState = (): MetaState => ({
prizeTrackingTickets: {},
safetyDepositConfigsByAuctionManagerAndIndex: {},
bidRedemptionV2sByAuctionManagerAndWinningIndex: {},
stores: {},
creators: {},
});

View File

@@ -4,20 +4,20 @@ import { ParsedAccount } from '../accounts/types';
export const isMetadataPartOfStore = (
m: ParsedAccount<Metadata>,
store: ParsedAccount<Store> | null,
whitelistedCreatorsByCreator: Record<
string,
ParsedAccount<WhitelistedCreator>
>,
store?: ParsedAccount<Store> | null,
) => {
if (!m?.info?.data?.creators || !store?.info) {
if (!m?.info?.data?.creators) {
return false;
}
return m.info.data.creators.some(
c =>
c.verified &&
(store.info.public ||
(store?.info.public ||
whitelistedCreatorsByCreator[c.address]?.info?.activated),
);
};

View File

@@ -16,9 +16,18 @@ import {
MAX_SYMBOL_LENGTH,
MAX_URI_LENGTH,
METADATA_PREFIX,
decodeMetadata,
getAuctionExtended,
} from '../../actions';
import { AccountInfo, Connection, PublicKey } from '@solana/web3.js';
import { AccountAndPubkey, MetaState, ProcessAccountsFunc } from './types';
import { WhitelistedCreator } from '../../models/metaplex';
import { Connection, PublicKey } from '@solana/web3.js';
import {
AccountAndPubkey,
MetaState,
ProcessAccountsFunc,
UpdateStateValueFunc,
UnPromise,
} from './types';
import { isMetadataPartOfStore } from './isMetadataPartOfStore';
import { processAuctions } from './processAuctions';
import { processMetaplexAccounts } from './processMetaplexAccounts';
@@ -27,260 +36,444 @@ import { processVaultData } from './processVaultData';
import { ParsedAccount } from '../accounts/types';
import { getEmptyMetaState } from './getEmptyMetaState';
import { getMultipleAccounts } from '../accounts/getMultipleAccounts';
import { getProgramAccounts } from './web3';
import { createPipelineExecutor } from '../../utils/createPipelineExecutor';
async function getProgramAccounts(
connection: Connection,
programId: StringPublicKey,
configOrCommitment?: any,
): Promise<Array<AccountAndPubkey>> {
const extra: any = {};
let commitment;
//let encoding;
export const USE_SPEED_RUN = false;
const WHITELISTED_METADATA = ['98vYFjBYS9TguUMWQRPjy2SZuxKuUMcqR4vnQiLjZbte'];
const WHITELISTED_AUCTION = ['D8wMB5iLZnsV7XQjpwqXaDynUtFuDs7cRXvEGNj1NF1e'];
const AUCTION_TO_METADATA: Record<string, string[]> = {
D8wMB5iLZnsV7XQjpwqXaDynUtFuDs7cRXvEGNj1NF1e: [
'98vYFjBYS9TguUMWQRPjy2SZuxKuUMcqR4vnQiLjZbte',
],
};
const AUCTION_TO_VAULT: Record<string, string> = {
D8wMB5iLZnsV7XQjpwqXaDynUtFuDs7cRXvEGNj1NF1e:
'3wHCBd3fYRPWjd5GqzrXanLJUKRyU3nECKbTPKfVwcFX',
};
const WHITELISTED_AUCTION_MANAGER = [
'3HD2C8oCL8dpqbXo8hq3CMw6tRSZDZJGajLxnrZ3ZkYx',
];
const WHITELISTED_VAULT = ['3wHCBd3fYRPWjd5GqzrXanLJUKRyU3nECKbTPKfVwcFX'];
if (configOrCommitment) {
if (typeof configOrCommitment === 'string') {
commitment = configOrCommitment;
} else {
commitment = configOrCommitment.commitment;
//encoding = configOrCommitment.encoding;
if (configOrCommitment.dataSlice) {
extra.dataSlice = configOrCommitment.dataSlice;
}
if (configOrCommitment.filters) {
extra.filters = configOrCommitment.filters;
}
}
}
const args = connection._buildArgs([programId], commitment, 'base64', extra);
const unsafeRes = await (connection as any)._rpcRequest(
'getProgramAccounts',
args,
);
const data = (
unsafeRes.result as Array<{
account: AccountInfo<[string, string]>;
pubkey: string;
}>
).map(item => {
return {
account: {
// TODO: possible delay parsing could be added here
data: Buffer.from(item.account.data[0], 'base64'),
executable: item.account.executable,
lamports: item.account.lamports,
// TODO: maybe we can do it in lazy way? or just use string
owner: item.account.owner,
} as AccountInfo<Buffer>,
pubkey: item.pubkey,
};
});
return data;
}
export const loadAccounts = async (connection: Connection, all: boolean) => {
export const limitedLoadAccounts = async (connection: Connection) => {
const tempCache: MetaState = getEmptyMetaState();
const updateTemp = makeSetter(tempCache);
const forEach =
(fn: ProcessAccountsFunc) => async (accounts: AccountAndPubkey[]) => {
for (const account of accounts) {
await fn(account, updateTemp, all);
await fn(account, updateTemp);
}
};
let isSelectivePullMetadata = false;
const pullMetadata = async (creators: AccountAndPubkey[]) => {
await forEach(processMetaplexAccounts)(creators);
const pullMetadata = async (metadata: string) => {
const mdKey = new PublicKey(metadata);
const md = await connection.getAccountInfo(mdKey);
const mdObject = decodeMetadata(
Buffer.from(md?.data || new Uint8Array([])),
);
const editionKey = await getEdition(mdObject.mint);
const editionData = await connection.getAccountInfo(
new PublicKey(editionKey),
);
if (md) {
//@ts-ignore
md.owner = md.owner.toBase58();
processMetaData(
{
pubkey: metadata,
account: md,
},
updateTemp,
);
if (editionData) {
//@ts-ignore
editionData.owner = editionData.owner.toBase58();
processMetaData(
{
pubkey: editionKey,
account: editionData,
},
updateTemp,
);
}
}
};
const whitelistedCreators = Object.values(
tempCache.whitelistedCreatorsByCreator,
const pullAuction = async (auction: string) => {
const auctionExtendedKey = await getAuctionExtended({
auctionProgramId: AUCTION_ID,
resource: AUCTION_TO_VAULT[auction],
});
const auctionData = await getMultipleAccounts(
connection,
[auction, auctionExtendedKey],
'single',
);
if (whitelistedCreators.length > 3) {
console.log(' too many creators, pulling all nfts in one go');
additionalPromises.push(
getProgramAccounts(connection, METADATA_PROGRAM_ID).then(
forEach(processMetaData),
),
);
} else {
console.log('pulling optimized nfts');
isSelectivePullMetadata = true;
if (auctionData) {
auctionData.keys.map((pubkey, i) => {
processAuctions(
{
pubkey,
account: auctionData.array[i],
},
updateTemp,
);
});
}
};
for (let i = 0; i < MAX_CREATOR_LIMIT; i++) {
for (let j = 0; j < whitelistedCreators.length; j++) {
additionalPromises.push(
getProgramAccounts(connection, METADATA_PROGRAM_ID, {
const pullAuctionManager = async (auctionManager: string) => {
const auctionManagerKey = new PublicKey(auctionManager);
const auctionManagerData = await connection.getAccountInfo(
auctionManagerKey,
);
if (auctionManagerData) {
//@ts-ignore
auctionManagerData.owner = auctionManagerData.owner.toBase58();
processMetaplexAccounts(
{
pubkey: auctionManager,
account: auctionManagerData,
},
updateTemp,
);
}
};
const pullVault = async (vault: string) => {
const vaultKey = new PublicKey(vault);
const vaultData = await connection.getAccountInfo(vaultKey);
if (vaultData) {
//@ts-ignore
vaultData.owner = vaultData.owner.toBase58();
processVaultData(
{
pubkey: vault,
account: vaultData,
},
updateTemp,
);
}
};
const promises = [
...WHITELISTED_METADATA.map(md => pullMetadata(md)),
...WHITELISTED_AUCTION.map(a => pullAuction(a)),
...WHITELISTED_AUCTION_MANAGER.map(a => pullAuctionManager(a)),
...WHITELISTED_VAULT.map(a => pullVault(a)),
// bidder metadata pull
...WHITELISTED_AUCTION.map(a =>
getProgramAccounts(connection, AUCTION_ID, {
filters: [
{
memcmp: {
offset: 32,
bytes: a,
},
},
],
}).then(forEach(processAuctions)),
),
// bidder pot pull
...WHITELISTED_AUCTION.map(a =>
getProgramAccounts(connection, AUCTION_ID, {
filters: [
{
memcmp: {
offset: 64,
bytes: a,
},
},
],
}).then(forEach(processAuctions)),
),
// safety deposit pull
...WHITELISTED_VAULT.map(v =>
getProgramAccounts(connection, VAULT_ID, {
filters: [
{
memcmp: {
offset: 1,
bytes: v,
},
},
],
}).then(forEach(processVaultData)),
),
// bid redemptions
...WHITELISTED_AUCTION_MANAGER.map(a =>
getProgramAccounts(connection, METAPLEX_ID, {
filters: [
{
memcmp: {
offset: 9,
bytes: a,
},
},
],
}).then(forEach(processMetaplexAccounts)),
),
// safety deposit configs
...WHITELISTED_AUCTION_MANAGER.map(a =>
getProgramAccounts(connection, METAPLEX_ID, {
filters: [
{
memcmp: {
offset: 1,
bytes: a,
},
},
],
}).then(forEach(processMetaplexAccounts)),
),
// prize tracking tickets
...Object.keys(AUCTION_TO_METADATA)
.map(key =>
AUCTION_TO_METADATA[key]
.map(md =>
getProgramAccounts(connection, METAPLEX_ID, {
filters: [
{
memcmp: {
offset:
1 + // key
32 + // update auth
32 + // mint
4 + // name string length
MAX_NAME_LENGTH + // name
4 + // uri string length
MAX_URI_LENGTH + // uri
4 + // symbol string length
MAX_SYMBOL_LENGTH + // symbol
2 + // seller fee basis points
1 + // whether or not there is a creators vec
4 + // creators vec length
i * MAX_CREATOR_LEN,
bytes: whitelistedCreators[j].info.address,
offset: 1,
bytes: md,
},
},
],
}).then(forEach(processMetaData)),
);
}
}
}
};
const pullEditions = async () => {
console.log('Pulling editions for optimized metadata');
let setOf100MetadataEditionKeys: string[] = [];
const editionPromises: Promise<{
keys: string[];
array: AccountInfo<Buffer>[];
}>[] = [];
for (let i = 0; i < tempCache.metadata.length; i++) {
let edition: StringPublicKey;
if (tempCache.metadata[i].info.editionNonce != null) {
edition = (
await PublicKey.createProgramAddress(
[
Buffer.from(METADATA_PREFIX),
toPublicKey(METADATA_PROGRAM_ID).toBuffer(),
toPublicKey(tempCache.metadata[i].info.mint).toBuffer(),
new Uint8Array([tempCache.metadata[i].info.editionNonce || 0]),
],
toPublicKey(METADATA_PROGRAM_ID),
}).then(forEach(processMetaplexAccounts)),
)
).toBase58();
} else {
edition = await getEdition(tempCache.metadata[i].info.mint);
}
setOf100MetadataEditionKeys.push(edition);
if (setOf100MetadataEditionKeys.length >= 100) {
editionPromises.push(
getMultipleAccounts(
connection,
setOf100MetadataEditionKeys,
'recent',
),
);
setOf100MetadataEditionKeys = [];
}
}
if (setOf100MetadataEditionKeys.length > 0) {
editionPromises.push(
getMultipleAccounts(connection, setOf100MetadataEditionKeys, 'recent'),
);
setOf100MetadataEditionKeys = [];
}
const responses = await Promise.all(editionPromises);
for (let i = 0; i < responses.length; i++) {
const returnedAccounts = responses[i];
for (let j = 0; j < returnedAccounts.array.length; j++) {
processMetaData(
{
pubkey: returnedAccounts.keys[j],
account: returnedAccounts.array[j],
},
updateTemp,
all,
);
}
}
console.log(
'Edition size',
Object.keys(tempCache.editions).length,
Object.keys(tempCache.masterEditions).length,
);
};
const IS_BIG_STORE =
all || process.env.NEXT_PUBLIC_BIG_STORE?.toLowerCase() === 'true';
console.log(`Is big store: ${IS_BIG_STORE}`);
const additionalPromises: Promise<void>[] = [];
const basePromises = [
getProgramAccounts(connection, VAULT_ID).then(forEach(processVaultData)),
getProgramAccounts(connection, AUCTION_ID).then(forEach(processAuctions)),
getProgramAccounts(connection, METAPLEX_ID).then(
forEach(processMetaplexAccounts),
),
IS_BIG_STORE
? getProgramAccounts(connection, METADATA_PROGRAM_ID).then(
forEach(processMetaData),
)
: getProgramAccounts(connection, METAPLEX_ID, {
filters: [
{
dataSize: MAX_WHITELISTED_CREATOR_SIZE,
},
],
}).then(pullMetadata),
.flat(),
)
.flat(),
// whitelisted creators
getProgramAccounts(connection, METAPLEX_ID, {
filters: [
{
dataSize: MAX_WHITELISTED_CREATOR_SIZE,
},
],
}).then(forEach(processMetaplexAccounts)),
];
await Promise.all(basePromises);
await Promise.all(additionalPromises);
await postProcessMetadata(tempCache, all);
console.log('Metadata size', tempCache.metadata.length);
await Promise.all(promises);
if (isSelectivePullMetadata) {
await pullEditions();
}
await postProcessMetadata(tempCache);
return tempCache;
};
export const loadAccounts = async (connection: Connection) => {
const state: MetaState = getEmptyMetaState();
const updateState = makeSetter(state);
const forEachAccount = processingAccounts(updateState);
const loadVaults = () =>
getProgramAccounts(connection, VAULT_ID).then(
forEachAccount(processVaultData),
);
const loadAuctions = () =>
getProgramAccounts(connection, AUCTION_ID).then(
forEachAccount(processAuctions),
);
const loadMetaplex = () =>
getProgramAccounts(connection, METAPLEX_ID).then(
forEachAccount(processMetaplexAccounts),
);
const loadCreators = () =>
getProgramAccounts(connection, METAPLEX_ID, {
filters: [
{
dataSize: MAX_WHITELISTED_CREATOR_SIZE,
},
],
}).then(forEachAccount(processMetaplexAccounts));
const loadMetadata = () =>
pullMetadataByCreators(connection, state, updateState);
const loadEditions = () => pullEditions(connection, updateState, state);
const loading = [
loadCreators().then(loadMetadata).then(loadEditions),
loadVaults(),
loadAuctions(),
loadMetaplex(),
];
await Promise.all(loading);
console.log('Metadata size', state.metadata.length);
return state;
};
const pullEditions = async (
connection: Connection,
updater: UpdateStateValueFunc,
state: MetaState,
) => {
console.log('Pulling editions for optimized metadata');
type MultipleAccounts = UnPromise<ReturnType<typeof getMultipleAccounts>>;
let setOf100MetadataEditionKeys: string[] = [];
const editionPromises: Promise<void>[] = [];
const loadBatch = () => {
editionPromises.push(
getMultipleAccounts(
connection,
setOf100MetadataEditionKeys,
'recent',
).then(processEditions),
);
setOf100MetadataEditionKeys = [];
};
const processEditions = (returnedAccounts: MultipleAccounts) => {
for (let j = 0; j < returnedAccounts.array.length; j++) {
processMetaData(
{
pubkey: returnedAccounts.keys[j],
account: returnedAccounts.array[j],
},
updater,
);
}
};
for (const metadata of state.metadata) {
let editionKey: StringPublicKey;
if (metadata.info.editionNonce === null) {
editionKey = await getEdition(metadata.info.mint);
} else {
editionKey = (
await PublicKey.createProgramAddress(
[
Buffer.from(METADATA_PREFIX),
toPublicKey(METADATA_PROGRAM_ID).toBuffer(),
toPublicKey(metadata.info.mint).toBuffer(),
new Uint8Array([metadata.info.editionNonce || 0]),
],
toPublicKey(METADATA_PROGRAM_ID),
)
).toBase58();
}
setOf100MetadataEditionKeys.push(editionKey);
if (setOf100MetadataEditionKeys.length >= 100) {
loadBatch();
}
}
if (setOf100MetadataEditionKeys.length > 0) { // > 0: skip flushing an empty final batch
loadBatch();
}
await Promise.all(editionPromises);
console.log(
'Edition size',
Object.keys(state.editions).length,
Object.keys(state.masterEditions).length,
);
};
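The accumulate-and-flush pattern in `pullEditions` — collect keys until a batch of 100 is full (Solana RPC caps multiple-account fetches at 100 keys), fire off the request, and flush any remainder at the end — can be sketched generically. This is an illustration, not code from the repo; `processInBatches` and its parameters are made-up names:

```typescript
// Illustrative sketch of the batching idea in pullEditions: accumulate
// items, flush every `size` of them to an async consumer, then flush the
// remainder, and await all in-flight batches at the end.
async function processInBatches<T>(
  items: Iterable<T>,
  size: number,
  loadBatch: (batch: T[]) => Promise<void>,
): Promise<void> {
  const pending: Promise<void>[] = [];
  let batch: T[] = [];
  for (const item of items) {
    batch.push(item);
    if (batch.length >= size) {
      pending.push(loadBatch(batch)); // fire off the full batch
      batch = [];
    }
  }
  if (batch.length > 0) {
    pending.push(loadBatch(batch)); // flush the remainder only if non-empty
  }
  await Promise.all(pending);
}
```

Note the final flush is guarded by `length > 0`, so an empty trailing batch never triggers a pointless RPC call.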
const pullMetadataByCreators = (
connection: Connection,
state: MetaState,
updater: UpdateStateValueFunc,
): Promise<any> => {
console.log('pulling optimized nfts');
const whitelistedCreators = Object.values(state.whitelistedCreatorsByCreator);
const setter: UpdateStateValueFunc = async (prop, key, value) => {
if (prop === 'metadataByMint') {
await initMetadata(value, state.whitelistedCreatorsByCreator, updater);
} else {
updater(prop, key, value);
}
};
const forEachAccount = processingAccounts(setter);
const additionalPromises: Promise<void>[] = [];
for (const creator of whitelistedCreators) {
for (let i = 0; i < MAX_CREATOR_LIMIT; i++) {
const promise = getProgramAccounts(connection, METADATA_PROGRAM_ID, {
filters: [
{
memcmp: {
offset:
1 + // key
32 + // update auth
32 + // mint
4 + // name string length
MAX_NAME_LENGTH + // name
4 + // uri string length
MAX_URI_LENGTH + // uri
4 + // symbol string length
MAX_SYMBOL_LENGTH + // symbol
2 + // seller fee basis points
1 + // whether or not there is a creators vec
4 + // creators vec length
i * MAX_CREATOR_LEN,
bytes: creator.info.address,
},
},
],
}).then(forEachAccount(processMetaData));
additionalPromises.push(promise);
}
}
return Promise.all(additionalPromises);
};
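The `memcmp` offset above walks the borsh-serialized Metadata layout field by field until it lands on the i-th creator's address. A standalone sketch of that arithmetic follows; the numeric constants are assumptions chosen to mirror common Metaplex layout sizes, and `creatorOffset` is a made-up helper name — the point is the shape of the computation:

```typescript
// Assumed layout sizes (not taken from this file):
const MAX_NAME_LENGTH = 32;
const MAX_URI_LENGTH = 200;
const MAX_SYMBOL_LENGTH = 10;
const MAX_CREATOR_LEN = 32 + 1 + 1; // address + verified flag + share

// Byte offset of the i-th creator's address inside a Metadata account.
function creatorOffset(i: number): number {
  return (
    1 + // key
    32 + // update authority
    32 + // mint
    4 + MAX_NAME_LENGTH + // name (borsh string: u32 length prefix + bytes)
    4 + MAX_URI_LENGTH + // uri
    4 + MAX_SYMBOL_LENGTH + // symbol
    2 + // seller fee basis points
    1 + // option flag: is there a creators vec?
    4 + // creators vec length (u32)
    i * MAX_CREATOR_LEN // skip the first i creator entries
  );
}
```

Each loop iteration in `pullMetadataByCreators` queries one creator slot, which is why the filter is built once per `i` up to `MAX_CREATOR_LIMIT`.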
export const makeSetter =
  (state: MetaState): UpdateStateValueFunc<MetaState> =>
  (prop, key, value) => {
    if (prop === 'store') {
      state[prop] = value;
    } else if (prop === 'metadata') {
      state.metadata.push(value);
    } else {
      state[prop][key] = value;
    }
    return state;
  };
export const processingAccounts =
(updater: UpdateStateValueFunc) =>
(fn: ProcessAccountsFunc) =>
async (accounts: AccountAndPubkey[]) => {
await createPipelineExecutor(
accounts.values(),
account => fn(account, updater),
{
sequence: 10,
delay: 1,
jobsCount: 3,
},
);
};
const postProcessMetadata = async (state: MetaState) => {
const values = Object.values(state.metadataByMint);
for (const metadata of values) {
await metadataByMintUpdater(metadata, state);
}
};
export const metadataByMintUpdater = async (
  metadata: ParsedAccount<Metadata>,
  state: MetaState,
) => {
  const key = metadata.info.mint;
  if (isMetadataPartOfStore(metadata, state.whitelistedCreatorsByCreator)) {
await metadata.info.init();
const masterEditionKey = metadata.info?.masterEdition;
if (masterEditionKey) {
@ -293,3 +486,19 @@ export const metadataByMintUpdater = async (
}
return state;
};
export const initMetadata = async (
metadata: ParsedAccount<Metadata>,
whitelistedCreators: Record<string, ParsedAccount<WhitelistedCreator>>,
setter: UpdateStateValueFunc,
) => {
if (isMetadataPartOfStore(metadata, whitelistedCreators)) {
await metadata.info.init();
setter('metadataByMint', metadata.info.mint, metadata);
setter('metadata', '', metadata);
const masterEditionKey = metadata.info?.masterEdition;
if (masterEditionKey) {
setter('metadataByMasterEdition', masterEditionKey, metadata);
}
}
};


@ -2,22 +2,26 @@ import React, { useCallback, useContext, useEffect, useState } from 'react';
import { queryExtendedMetadata } from './queryExtendedMetadata';
import { subscribeAccountsChange } from './subscribeAccountsChange';
import { getEmptyMetaState } from './getEmptyMetaState';
import { loadAccounts } from './loadAccounts';
import {
limitedLoadAccounts,
loadAccounts,
USE_SPEED_RUN,
} from './loadAccounts';
import { MetaContextState, MetaState } from './types';
import { useConnection } from '../connection';
import { useStore } from '../store';
import { useQuerySearch } from '../../hooks';
import { AuctionData, BidderMetadata, BidderPot } from '../../actions';
const MetaContext = React.createContext<MetaContextState>({
...getEmptyMetaState(),
isLoading: false,
// @ts-ignore
update: () => [AuctionData, BidderMetadata, BidderPot],
});
export function MetaProvider({ children = null as any }) {
const connection = useConnection();
const { isReady, storeAddress } = useStore();
const searchParams = useQuerySearch();
const all = searchParams.get('all') == 'true';
const [state, setState] = useState<MetaState>(getEmptyMetaState());
@ -26,17 +30,15 @@ export function MetaProvider({ children = null as any }) {
const updateMints = useCallback(
async metadataByMint => {
try {
const { metadata, mintToMetadata } = await queryExtendedMetadata(
connection,
metadataByMint,
);
setState(current => ({
...current,
metadata,
metadataByMint: mintToMetadata,
}));
} catch (er) {
console.error(er);
}
@ -44,30 +46,43 @@ export function MetaProvider({ children = null as any }) {
[setState],
);
  async function update(auctionAddress?: any, bidderAddress?: any) {
    if (!storeAddress) {
      if (isReady) {
        setIsLoading(false);
      }
      return;
    } else if (!state.store) {
      setIsLoading(true);
    }

    console.log('-----> Query started');

    const nextState = !USE_SPEED_RUN
      ? await loadAccounts(connection)
      : await limitedLoadAccounts(connection);
    console.log('------->Query finished');

    setState(nextState);
    setIsLoading(false);
    console.log('------->set finished');

    await updateMints(nextState.metadataByMint);
if (auctionAddress && bidderAddress) {
const auctionBidderKey = auctionAddress + '-' + bidderAddress;
return [
nextState.auctions[auctionAddress],
nextState.bidderPotsByAuctionAndBidder[auctionBidderKey],
nextState.bidderMetadataByAuctionAndBidder[auctionBidderKey],
];
}
}
useEffect(() => {
update();
}, [connection, setState, updateMints, storeAddress, isReady]);
useEffect(() => {
@ -75,7 +90,7 @@ export function MetaProvider({ children = null as any }) {
return;
}
return subscribeAccountsChange(connection, () => state, setState);
}, [connection, setState, isLoading]);
// TODO: fetch names dynamically
@ -110,6 +125,8 @@ export function MetaProvider({ children = null as any }) {
<MetaContext.Provider
value={{
...state,
// @ts-ignore
update,
isLoading,
}}
>


@ -6,7 +6,6 @@ export const onChangeAccount =
(
process: ProcessAccountsFunc,
setter: UpdateStateValueFunc,
all: boolean,
): ProgramAccountChangeCallback =>
async info => {
const pubkey = pubkeyToString(info.accountId);
@ -16,6 +15,5 @@ export const onChangeAccount =
account: info.accountInfo,
},
setter,
all,
);
};


@ -11,9 +11,9 @@ import {
BIDDER_POT_LEN,
MAX_AUCTION_DATA_EXTENDED_SIZE,
} from '../../actions';
import { AUCTION_ID, pubkeyToString } from '../../utils';
import { ParsedAccount } from '../accounts';
import { cache } from '../accounts';
import { CheckAccountFunc, ProcessAccountsFunc } from './types';
export const processAuctions: ProcessAccountsFunc = (
@ -92,7 +92,7 @@ export const processAuctions: ProcessAccountsFunc = (
};
const isAuctionAccount: CheckAccountFunc = account =>
pubkeyToString(account.owner) === AUCTION_ID;
const isExtendedAuctionAccount: CheckAccountFunc = account =>
account.data.length === MAX_AUCTION_DATA_EXTENDED_SIZE;


@ -12,14 +12,13 @@ import {
MetadataKey,
} from '../../actions';
import { ParsedAccount } from '../accounts/types';
import { METADATA_PROGRAM_ID, pubkeyToString } from '../../utils';
export const processMetaData: ProcessAccountsFunc = async (
{ account, pubkey },
setter,
) => {
if (!isMetadataAccount(account)) return;
try {
if (isMetadataV1Account(account)) {
const metadata = decodeMetadata(account.data);
@ -33,7 +32,7 @@ export const processMetaData: ProcessAccountsFunc = (
account,
info: metadata,
};
await setter('metadataByMint', metadata.mint, parsedAccount);
}
}
@ -84,9 +83,8 @@ export const processMetaData: ProcessAccountsFunc = (
}
};
const isMetadataAccount = (account: AccountInfo<Buffer>) =>
  account && pubkeyToString(account.owner) === METADATA_PROGRAM_ID;
const isMetadataV1Account = (account: AccountInfo<Buffer>) =>
account.data[0] === MetadataKey.MetadataV1;


@ -18,16 +18,15 @@ import {
BidRedemptionTicketV2,
decodeSafetyDepositConfig,
SafetyDepositConfig,
} from '../../models';
import { ProcessAccountsFunc } from './types';
import { METAPLEX_ID, programIds, pubkeyToString } from '../../utils';
import { ParsedAccount } from '../accounts';
import { cache } from '../accounts';
export const processMetaplexAccounts: ProcessAccountsFunc = async (
{ account, pubkey },
setter,
) => {
if (!isMetaplexAccount(account)) return;
@ -40,7 +39,7 @@ export const processMetaplexAccounts: ProcessAccountsFunc = async (
) {
const storeKey = new PublicKey(account.data.slice(1, 33));
if (STORE_ID && storeKey.equals(STORE_ID)) {
const auctionManager = decodeAuctionManager(account.data);
const parsedAccount: ParsedAccount<
@ -112,7 +111,6 @@ export const processMetaplexAccounts: ProcessAccountsFunc = async (
if (STORE_ID && pubkey === STORE_ID.toBase58()) {
setter('store', pubkey, parsedAccount);
}
}
if (isSafetyDepositConfigV1Account(account)) {
@ -152,14 +150,6 @@ export const processMetaplexAccounts: ProcessAccountsFunc = async (
);
}
}
}
} catch {
// ignore errors
@ -168,7 +158,7 @@ export const processMetaplexAccounts: ProcessAccountsFunc = async (
};
const isMetaplexAccount = (account: AccountInfo<Buffer>) =>
pubkeyToString(account.owner) === METAPLEX_ID;
const isAuctionManagerV1Account = (account: AccountInfo<Buffer>) =>
account.data[0] === MetaplexKey.AuctionManagerV1;


@ -6,7 +6,7 @@ import {
Vault,
VaultKey,
} from '../../actions';
import { VAULT_ID, pubkeyToString } from '../../utils';
import { ParsedAccount } from '../accounts/types';
import { ProcessAccountsFunc } from './types';
@ -47,7 +47,7 @@ export const processVaultData: ProcessAccountsFunc = (
};
const isVaultAccount = (account: AccountInfo<Buffer>) =>
pubkeyToString(account.owner) === VAULT_ID;
const isSafetyDepositBoxV1Account = (account: AccountInfo<Buffer>) =>
account.data[0] === VaultKey.SafetyDepositBoxV1;


@ -1,10 +1,10 @@
import { MintInfo } from '@solana/spl-token';
import { Connection } from '@solana/web3.js';
import { Metadata } from '../../actions';
import { ParsedAccount } from '../accounts';
import { cache } from '../accounts';
import { getMultipleAccounts } from '../accounts';
import { MintParser } from '../accounts';
export const queryExtendedMetadata = async (
connection: Connection,


@ -6,7 +6,7 @@ import {
toPublicKey,
VAULT_ID,
} from '../../utils';
import { makeSetter, initMetadata } from './loadAccounts';
import { onChangeAccount } from './onChangeAccount';
import { processAuctions } from './processAuctions';
import { processMetaData } from './processMetaData';
@ -16,7 +16,6 @@ import { MetaState, UpdateStateValueFunc } from './types';
export const subscribeAccountsChange = (
connection: Connection,
getState: () => MetaState,
setState: (v: MetaState) => void,
) => {
@ -31,40 +30,49 @@ export const subscribeAccountsChange = (
subscriptions.push(
connection.onProgramAccountChange(
toPublicKey(VAULT_ID),
onChangeAccount(processVaultData, updateStateValue),
),
);
subscriptions.push(
connection.onProgramAccountChange(
toPublicKey(AUCTION_ID),
onChangeAccount(processAuctions, updateStateValue),
),
);
subscriptions.push(
connection.onProgramAccountChange(
toPublicKey(METAPLEX_ID),
onChangeAccount(processMetaplexAccounts, updateStateValue),
),
);
subscriptions.push(
connection.onProgramAccountChange(
toPublicKey(METADATA_PROGRAM_ID),
onChangeAccount(processMetaData, async (prop, key, value) => {
const state = { ...getState() };
const setter = makeSetter(state);
let hasChanges = false;
const updater: UpdateStateValueFunc = (...args) => {
hasChanges = true;
setter(...args);
};
if (prop === 'metadataByMint') {
await initMetadata(
value,
state.whitelistedCreatorsByCreator,
updater,
);
} else {
updater(prop, key, value);
}
if (hasChanges) {
setState(state);
}
}),
),
);


@ -71,12 +71,18 @@ export interface MetaState {
ParsedAccount<WhitelistedCreator>
>;
payoutTickets: Record<string, ParsedAccount<PayoutTicket>>;
}
export interface MetaContextState extends MetaState {
isLoading: boolean;
update: (
auctionAddress?: any,
bidderAddress?: any,
) => [
ParsedAccount<AuctionData>,
ParsedAccount<BidderPot>,
ParsedAccount<BidderMetadata>,
];
}
export type AccountAndPubkey = {
@ -84,16 +90,19 @@ export type AccountAndPubkey = {
account: AccountInfo<Buffer>;
};
export type UpdateStateValueFunc<T = void> = (
  prop: keyof MetaState,
  key: string,
  value: ParsedAccount<any>,
) => T;
export type ProcessAccountsFunc = (
account: PublicKeyStringAndAccount<Buffer>,
setter: UpdateStateValueFunc,
) => void;
export type CheckAccountFunc = (account: AccountInfo<Buffer>) => boolean;
export type UnPromise<T extends Promise<any>> = T extends Promise<infer U>
? U
: never;


@ -0,0 +1,61 @@
import { AccountInfo, Connection } from '@solana/web3.js';
import { StringPublicKey } from '../../utils/ids';
import { AccountAndPubkey } from './types';
export async function getProgramAccounts(
connection: Connection,
programId: StringPublicKey,
configOrCommitment?: any,
): Promise<Array<AccountAndPubkey>> {
const extra: any = {};
let commitment;
//let encoding;
if (configOrCommitment) {
if (typeof configOrCommitment === 'string') {
commitment = configOrCommitment;
} else {
commitment = configOrCommitment.commitment;
//encoding = configOrCommitment.encoding;
if (configOrCommitment.dataSlice) {
extra.dataSlice = configOrCommitment.dataSlice;
}
if (configOrCommitment.filters) {
extra.filters = configOrCommitment.filters;
}
}
}
const args = connection._buildArgs([programId], commitment, 'base64', extra);
const unsafeRes = await (connection as any)._rpcRequest(
'getProgramAccounts',
args,
);
return unsafeResAccounts(unsafeRes.result);
}
export function unsafeAccount(account: AccountInfo<[string, string]>) {
return {
// TODO: possible delay parsing could be added here
data: Buffer.from(account.data[0], 'base64'),
executable: account.executable,
lamports: account.lamports,
// TODO: maybe we can do it in lazy way? or just use string
owner: account.owner,
} as AccountInfo<Buffer>;
}
export function unsafeResAccounts(
data: Array<{
account: AccountInfo<[string, string]>;
pubkey: string;
}>,
) {
return data.map(item => ({
account: unsafeAccount(item.account),
pubkey: item.pubkey,
}));
}
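`unsafeAccount` above relies on the RPC returning account data as a `[payload, encoding]` tuple, which `Buffer.from(..., 'base64')` turns back into raw bytes. A tiny self-contained illustration (the sample payload is made up):

```typescript
// The RPC hands back account data as [base64 payload, 'base64'];
// Buffer.from decodes the payload into the raw account bytes.
const rpcData: [string, string] = ['aGVsbG8=', 'base64']; // decodes to "hello"
const raw = Buffer.from(rpcData[0], 'base64');
```

This is the same decode step `unsafeAccount` applies to `account.data[0]` before the bytes are handed to the borsh decoders.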


@ -2,7 +2,7 @@ import { SYSVAR_CLOCK_PUBKEY, TransactionInstruction } from '@solana/web3.js';
import { serialize } from 'borsh';
import { getAuctionKeys, ClaimBidArgs, SCHEMA } from '.';
import { getBidderPotKey, getAuctionExtended } from '../../actions';
import { programIds, StringPublicKey, toPublicKey } from '../../utils';
export async function claimBid(
@ -30,6 +30,11 @@ export async function claimBid(
const value = new ClaimBidArgs();
const data = Buffer.from(serialize(SCHEMA, value));
const auctionExtendedKey = await getAuctionExtended({
auctionProgramId: PROGRAM_IDS.auction,
resource: vault,
});
const keys = [
{
pubkey: toPublicKey(acceptPayment),
@ -92,6 +97,11 @@ export async function claimBid(
isSigner: false,
isWritable: false,
},
{
pubkey: toPublicKey(auctionExtendedKey),
isSigner: false,
isWritable: false,
},
];
instructions.push(


@ -11,6 +11,7 @@ import {
Metadata,
SafetyDepositBox,
Vault,
getAuctionExtended,
} from '../../actions';
import { AccountParser, ParsedAccount } from '../../contexts';
import {
@ -88,6 +89,7 @@ export class PayoutTicket {
this.amountPaid = args.amountPaid;
}
}
export class AuctionManager {
pubkey: StringPublicKey;
store: StringPublicKey;
@ -259,6 +261,7 @@ export class AuctionManagerV2 {
vault: StringPublicKey;
acceptPayment: StringPublicKey;
state: AuctionManagerStateV2;
auctionDataExtended?: StringPublicKey;
constructor(args: {
store: StringPublicKey;
@ -275,6 +278,13 @@ export class AuctionManagerV2 {
this.vault = args.vault;
this.acceptPayment = args.acceptPayment;
this.state = args.state;
const auction = programIds().auction;
getAuctionExtended({
auctionProgramId: auction,
resource: this.vault,
}).then(val => (this.auctionDataExtended = val));
}
}
@ -319,6 +329,15 @@ export class RedeemFullRightsTransferBidArgs {
export class StartAuctionArgs {
instruction = 5;
}
export class EndAuctionArgs {
instruction = 21;
reveal: BN[] | null;
constructor(args: { reveal: BN[] | null }) {
this.reveal = args.reveal;
}
}
export class ClaimBidArgs {
instruction = 6;
}
@ -960,6 +979,16 @@ export const SCHEMA = new Map<any, any>([
fields: [['instruction', 'u8']],
},
],
[
EndAuctionArgs,
{
kind: 'struct',
fields: [
['instruction', 'u8'],
['reveal', { kind: 'option', type: [BN] }],
],
},
],
[
ClaimBidArgs,
{


@ -14,7 +14,7 @@ import {
RedeemUnusedWinningConfigItemsAsAuctioneerArgs,
SCHEMA,
} from '.';
import { VAULT_PREFIX, getAuctionExtended } from '../../actions';
import {
findProgramAddress,
programIds,
@ -68,6 +68,11 @@ export async function redeemBid(
safetyDeposit,
);
const auctionExtended = await getAuctionExtended({
auctionProgramId: PROGRAM_IDS.auction,
resource: vault,
});
const value =
auctioneerReclaimIndex !== undefined
? new RedeemUnusedWinningConfigItemsAsAuctioneerArgs({
@ -172,6 +177,11 @@ export async function redeemBid(
isSigner: false,
isWritable: false,
},
{
pubkey: toPublicKey(auctionExtended),
isSigner: false,
isWritable: false,
},
];
if (isPrintingType && masterEdition && reservationList) {


@ -14,7 +14,7 @@ import {
RedeemUnusedWinningConfigItemsAsAuctioneerArgs,
SCHEMA,
} from '.';
import { VAULT_PREFIX, getAuctionExtended } from '../../actions';
import {
findProgramAddress,
programIds,
@ -67,6 +67,11 @@ export async function redeemFullRightsTransferBid(
safetyDeposit,
);
const auctionExtended = await getAuctionExtended({
auctionProgramId: PROGRAM_IDS.auction,
resource: vault,
});
const value =
auctioneerReclaimIndex !== undefined
? new RedeemUnusedWinningConfigItemsAsAuctioneerArgs({
@ -181,6 +186,12 @@ export async function redeemFullRightsTransferBid(
isSigner: false,
isWritable: false,
},
{
pubkey: toPublicKey(auctionExtended),
isSigner: false,
isWritable: false,
},
];
instructions.push(


@ -14,7 +14,12 @@ import {
SCHEMA,
getSafetyDepositConfig,
} from '.';
import {
getEdition,
getEditionMarkPda,
getMetadata,
getAuctionExtended,
} from '../../actions';
import { programIds, StringPublicKey, toPublicKey } from '../../utils';
export async function redeemPrintingV2Bid(
@ -63,6 +68,10 @@ export async function redeemPrintingV2Bid(
const value = new RedeemPrintingV2BidArgs({ editionOffset, winIndex });
const data = Buffer.from(serialize(SCHEMA, value));
const extended = await getAuctionExtended({
auctionProgramId: PROGRAM_IDS.auction,
resource: vault,
});
const keys = [
{
pubkey: toPublicKey(auctionManagerKey),
@ -193,6 +202,11 @@ export async function redeemPrintingV2Bid(
isSigner: false,
isWritable: false,
},
{
pubkey: toPublicKey(extended),
isSigner: false,
isWritable: false,
},
];
instructions.push(


@ -0,0 +1,54 @@
export async function createPipelineExecutor<T>(
data: IterableIterator<T>,
executor: (d: T) => void | Promise<void>,
{
delay = 0,
jobsCount = 1,
sequence = 1,
}: {
delay?: number;
jobsCount?: number;
sequence?: number;
} = {},
) {
function execute(iter: IteratorResult<T>) {
  return executor(iter.value);
}
async function next() {
if (sequence <= 1) {
const iter = data.next();
if (iter.done) {
return;
}
await execute(iter);
} else {
const promises: any[] = [];
let isDone = false;
for (let i = 0; i < sequence; i++) {
const iter = data.next();
if (!iter.done) {
promises.push(execute(iter));
} else {
isDone = true;
break;
}
}
await Promise.all(promises);
if (isDone) {
return;
}
}
if (delay > 0) {
await new Promise(resolve => setTimeout(resolve, delay));
} else {
await Promise.resolve();
}
await next();
}
const result = new Array<Promise<void>>(jobsCount);
for (let i = 0; i < jobsCount; i++) {
result[i] = next();
}
await Promise.all(result);
}
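The `jobsCount` idea in `createPipelineExecutor` — several workers pulling from one shared iterator, so each item is processed exactly once with bounded concurrency — can be modeled compactly. This is a simplified sketch, not the function above; `runWorkers` is a made-up name and the `sequence`/`delay` knobs are omitted:

```typescript
// Simplified model of bounded concurrency over a shared iterator:
// each worker repeatedly pulls the next item until the iterator is done.
async function runWorkers<T>(
  data: IterableIterator<T>,
  executor: (d: T) => void | Promise<void>,
  jobsCount: number,
): Promise<void> {
  async function worker(): Promise<void> {
    for (;;) {
      const iter = data.next(); // shared iterator: workers never repeat items
      if (iter.done) return;
      await executor(iter.value);
    }
  }
  const workers: Promise<void>[] = [];
  for (let i = 0; i < jobsCount; i++) {
    workers.push(worker());
  }
  await Promise.all(workers);
}
```

Because all workers draw from the same iterator, raising `jobsCount` increases parallelism without duplicating work — the same reason `createPipelineExecutor` starts `jobsCount` copies of `next()` over one `data` iterator.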


@ -9,3 +9,4 @@ export * from './strings';
export * as shortvec from './shortvec';
export * from './isValidHttpUrl';
export * from './borsh';
export * from './createPipelineExecutor';


@ -0,0 +1,7 @@
REACT_APP_CANDY_MACHINE_ID=EodXoBBFMWMXe3KKpwAFRa3BHDDWF3y7S8DcGRUTdG9U
REACT_APP_SOLANA_NETWORK=mainnet-beta
REACT_APP_SOLANA_RPC_HOST=https://api.mainnet-beta.solana.com
# Phase 1
REACT_APP_FAIR_LAUNCH_ID=4stZ5uFD1EdS8wKgrLSqE54YW5dS4KUSyGUCeehVua3P


@ -0,0 +1,62 @@
{
"name": "candy-machine-mint",
"version": "0.1.0",
"private": true,
"dependencies": {
"@material-ui/core": "^4.12.3",
"@material-ui/icons": "^4.11.2",
"@material-ui/lab": "^4.0.0-alpha.60",
"@project-serum/anchor": "^0.14.0",
"@solana/spl-token": "^0.1.8",
"@solana/wallet-adapter-base": "^0.5.2",
"@solana/wallet-adapter-material-ui": "^0.8.3",
"@solana/wallet-adapter-react": "^0.9.1",
"@solana/wallet-adapter-react-ui": "^0.1.0",
"@solana/wallet-adapter-wallets": "^0.7.5",
"canvas-confetti": "^1.4.0",
"@solana/web3.js": "^1.24.1",
"@testing-library/jest-dom": "^5.11.4",
"@testing-library/react": "^11.1.0",
"@testing-library/user-event": "^12.1.10",
"@types/jest": "^26.0.15",
"@types/node": "^12.0.0",
"@types/react": "^17.0.0",
"@types/react-dom": "^17.0.0",
"react": "^17.0.2",
"react-countdown": "^2.3.2",
"react-dom": "^17.0.2",
"react-scripts": "4.0.3",
"styled-components": "^5.3.1",
"typescript": "^4.1.2",
"web-vitals": "^1.0.1"
},
"scripts": {
"start": "react-scripts start",
"build": "react-scripts build",
"test": "react-scripts test",
"eject": "react-scripts eject",
"deploy:gh": "gh-pages -d ./build/ --repo https://github.com/metaplex-foundation/metaplex -t true --branch gh-pages",
"deploy": "cross-env ASSET_PREFIX=/metaplex/ yarn build && yarn deploy:gh"
},
"eslintConfig": {
"extends": [
"react-app",
"react-app/jest"
]
},
"browserslist": {
"production": [
">0.2%",
"not dead",
"not op_mini all"
],
"development": [
"last 1 chrome version",
"last 1 firefox version",
"last 1 safari version"
]
},
"devDependencies": {
"@types/styled-components": "^5.1.14"
}
}

Binary file not shown.



@ -0,0 +1,43 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8" />
<link rel="icon" href="%PUBLIC_URL%/favicon.ico" />
<meta name="viewport" content="width=device-width, initial-scale=1" />
<meta name="theme-color" content="#000000" />
<meta
name="description"
content="Web site created using create-react-app"
/>
<link rel="apple-touch-icon" href="%PUBLIC_URL%/logo192.png" />
<!--
manifest.json provides metadata used when your web app is installed on a
user's mobile device or desktop. See https://developers.google.com/web/fundamentals/web-app-manifest/
-->
<link rel="manifest" href="%PUBLIC_URL%/manifest.json" />
<!--
Notice the use of %PUBLIC_URL% in the tags above.
It will be replaced with the URL of the `public` folder during the build.
Only files inside the `public` folder can be referenced from the HTML.
Unlike "/favicon.ico" or "favicon.ico", "%PUBLIC_URL%/favicon.ico" will
work correctly both with client-side routing and a non-root public URL.
Learn how to configure a non-root public URL by running `npm run build`.
-->
<title>React App</title>
</head>
<body>
<noscript>You need to enable JavaScript to run this app.</noscript>
<div id="root"></div>
<!--
This HTML file is a template.
If you open it directly in the browser, you will see an empty page.
You can add webfonts, meta tags, or analytics to this file.
The build step will place the bundled scripts into the <body> tag.
To begin the development, run `npm start` or `yarn start`.
To create a production bundle, use `npm run build` or `yarn build`.
-->
</body>
</html>

Binary file not shown.


Binary file not shown.



@ -0,0 +1,25 @@
{
"short_name": "React App",
"name": "Create React App Sample",
"icons": [
{
"src": "favicon.ico",
"sizes": "64x64 32x32 24x24 16x16",
"type": "image/x-icon"
},
{
"src": "logo192.png",
"type": "image/png",
"sizes": "192x192"
},
{
"src": "logo512.png",
"type": "image/png",
"sizes": "512x512"
}
],
"start_url": ".",
"display": "standalone",
"theme_color": "#000000",
"background_color": "#ffffff"
}


@ -0,0 +1,3 @@
# https://www.robotstxt.org/robotstxt.html
User-agent: *
Disallow:


@ -0,0 +1,3 @@
.App {
text-align: center;
}


@ -0,0 +1,9 @@
import React from 'react';
import { render, screen } from '@testing-library/react';
import App from './App';
test('renders learn react link', () => {
render(<App />);
const linkElement = screen.getByText(/learn react/i);
expect(linkElement).toBeInTheDocument();
});


@ -0,0 +1,76 @@
import './App.css';
import { useMemo } from 'react';
import Home from './Home';
import * as anchor from '@project-serum/anchor';
import { clusterApiUrl } from '@solana/web3.js';
import { WalletAdapterNetwork } from '@solana/wallet-adapter-base';
import {
getPhantomWallet,
getSolflareWallet,
getSolletWallet,
} from '@solana/wallet-adapter-wallets';
import {
ConnectionProvider,
WalletProvider,
} from '@solana/wallet-adapter-react';
import { WalletDialogProvider } from '@solana/wallet-adapter-material-ui';
import { ThemeProvider, createTheme } from '@material-ui/core';
import { ConfettiProvider } from './confetti';
const theme = createTheme({
palette: {
type: 'dark',
},
});
const candyMachineId = process.env.REACT_APP_CANDY_MACHINE_ID
? new anchor.web3.PublicKey(process.env.REACT_APP_CANDY_MACHINE_ID)
: undefined;
const fairLaunchId = new anchor.web3.PublicKey(
process.env.REACT_APP_FAIR_LAUNCH_ID!,
);
const network = process.env.REACT_APP_SOLANA_NETWORK as WalletAdapterNetwork;
const rpcHost = process.env.REACT_APP_SOLANA_RPC_HOST!;
const connection = new anchor.web3.Connection(rpcHost);
const startDateSeed = parseInt(process.env.REACT_APP_CANDY_START_DATE!, 10);
const txTimeout = 30000; // milliseconds (confirm this works for your project)
const App = () => {
const endpoint = useMemo(() => clusterApiUrl(network), []);
const wallets = useMemo(
() => [getPhantomWallet(), getSolflareWallet(), getSolletWallet()],
[],
);
return (
<ThemeProvider theme={theme}>
<ConnectionProvider endpoint={endpoint}>
<WalletProvider wallets={wallets} autoConnect>
<WalletDialogProvider>
<ConfettiProvider>
<Home
candyMachineId={candyMachineId}
fairLaunchId={fairLaunchId}
connection={connection}
startDate={startDateSeed}
txTimeout={txTimeout}
/>
</ConfettiProvider>
</WalletDialogProvider>
</WalletProvider>
</ConnectionProvider>
</ThemeProvider>
);
};
export default App;

File diff suppressed because it is too large.


@ -0,0 +1,337 @@
import * as anchor from '@project-serum/anchor';
import { MintLayout, TOKEN_PROGRAM_ID, Token } from '@solana/spl-token';
import { SystemProgram } from '@solana/web3.js';
import { sendTransactionWithRetry } from './connection';
import {
getAtaForMint,
SPL_ASSOCIATED_TOKEN_ACCOUNT_PROGRAM_ID,
} from './utils';
export const CANDY_MACHINE_PROGRAM = new anchor.web3.PublicKey(
'cndyAnrLdpjq1Ssp1z8xxDsB8dxe7u4HL5Nxi2K5WXZ',
);
const TOKEN_METADATA_PROGRAM_ID = new anchor.web3.PublicKey(
'metaqbxxUerdq28cj1RbAWkYQm3ybzjb6a8bt518x1s',
);
interface CandyMachineState {
itemsAvailable: number;
itemsRedeemed: number;
itemsRemaining: number;
config: anchor.web3.PublicKey;
treasury: anchor.web3.PublicKey;
tokenMint: anchor.web3.PublicKey;
isSoldOut: boolean;
isActive: boolean;
goLiveDate: anchor.BN;
price: anchor.BN;
}
export interface CandyMachineAccount {
id: anchor.web3.PublicKey;
program: anchor.Program;
state: CandyMachineState;
}
export const awaitTransactionSignatureConfirmation = async (
txid: anchor.web3.TransactionSignature,
timeout: number,
connection: anchor.web3.Connection,
commitment: anchor.web3.Commitment = 'recent',
queryStatus = false,
): Promise<anchor.web3.SignatureStatus | null | void> => {
let done = false;
let status: anchor.web3.SignatureStatus | null | void = {
slot: 0,
confirmations: 0,
err: null,
};
let subId = 0;
status = await new Promise(async (resolve, reject) => {
setTimeout(() => {
if (done) {
return;
}
done = true;
console.log('Rejecting for timeout...');
reject({ timeout: true });
}, timeout);
while (!done && queryStatus) {
// eslint-disable-next-line no-loop-func
(async () => {
try {
const signatureStatuses = await connection.getSignatureStatuses([
txid,
]);
status = signatureStatuses && signatureStatuses.value[0];
if (!done) {
if (!status) {
console.log('REST null result for', txid, status);
} else if (status.err) {
console.log('REST error for', txid, status);
done = true;
reject(status.err);
} else if (!status.confirmations) {
console.log('REST no confirmations for', txid, status);
} else {
console.log('REST confirmation for', txid, status);
done = true;
resolve(status);
}
}
} catch (e) {
if (!done) {
console.log('REST connection error: txid', txid, e);
}
}
})();
await sleep(2000);
}
});
//@ts-ignore
if (connection._signatureSubscriptions[subId]) {
connection.removeSignatureListener(subId);
}
done = true;
console.log('Returning status', status);
return status;
};
/* export */ const createAssociatedTokenAccountInstruction = (
associatedTokenAddress: anchor.web3.PublicKey,
payer: anchor.web3.PublicKey,
walletAddress: anchor.web3.PublicKey,
splTokenMintAddress: anchor.web3.PublicKey,
) => {
const keys = [
{ pubkey: payer, isSigner: true, isWritable: true },
{ pubkey: associatedTokenAddress, isSigner: false, isWritable: true },
{ pubkey: walletAddress, isSigner: false, isWritable: false },
{ pubkey: splTokenMintAddress, isSigner: false, isWritable: false },
{
pubkey: anchor.web3.SystemProgram.programId,
isSigner: false,
isWritable: false,
},
{ pubkey: TOKEN_PROGRAM_ID, isSigner: false, isWritable: false },
{
pubkey: anchor.web3.SYSVAR_RENT_PUBKEY,
isSigner: false,
isWritable: false,
},
];
return new anchor.web3.TransactionInstruction({
keys,
programId: SPL_ASSOCIATED_TOKEN_ACCOUNT_PROGRAM_ID,
data: Buffer.from([]),
});
};
export const getCandyMachineState = async (
anchorWallet: anchor.Wallet,
candyMachineId: anchor.web3.PublicKey,
connection: anchor.web3.Connection,
): Promise<CandyMachineAccount> => {
const provider = new anchor.Provider(connection, anchorWallet, {
preflightCommitment: 'recent',
});
const idl = await anchor.Program.fetchIdl(CANDY_MACHINE_PROGRAM, provider);
const program = new anchor.Program(idl, CANDY_MACHINE_PROGRAM, provider);
const state: any = await program.account.candyMachine.fetch(candyMachineId);
const itemsAvailable = state.data.itemsAvailable.toNumber();
const itemsRedeemed = state.itemsRedeemed.toNumber();
const itemsRemaining = itemsAvailable - itemsRedeemed;
return {
id: candyMachineId,
program,
state: {
itemsAvailable,
itemsRedeemed,
itemsRemaining,
isSoldOut: itemsRemaining === 0,
isActive: state.data.goLiveDate.toNumber() < new Date().getTime() / 1000,
goLiveDate: state.data.goLiveDate,
treasury: state.wallet,
tokenMint: state.tokenMint,
config: state.config,
price: state.data.price,
},
};
};
const getMasterEdition = async (
mint: anchor.web3.PublicKey,
): Promise<anchor.web3.PublicKey> => {
return (
await anchor.web3.PublicKey.findProgramAddress(
[
Buffer.from('metadata'),
TOKEN_METADATA_PROGRAM_ID.toBuffer(),
mint.toBuffer(),
Buffer.from('edition'),
],
TOKEN_METADATA_PROGRAM_ID,
)
)[0];
};
const getMetadata = async (
mint: anchor.web3.PublicKey,
): Promise<anchor.web3.PublicKey> => {
return (
await anchor.web3.PublicKey.findProgramAddress(
[
Buffer.from('metadata'),
TOKEN_METADATA_PROGRAM_ID.toBuffer(),
mint.toBuffer(),
],
TOKEN_METADATA_PROGRAM_ID,
)
)[0];
};
export const mintOneToken = async (
candyMachine: CandyMachineAccount,
payer: anchor.web3.PublicKey,
): Promise<string> => {
const mint = anchor.web3.Keypair.generate();
const userTokenAccountAddress = (
await getAtaForMint(mint.publicKey, payer)
)[0];
const userPayingAccountAddress = (
await getAtaForMint(candyMachine.state.tokenMint, payer)
)[0];
const candyMachineAddress = candyMachine.id;
const remainingAccounts = [];
const signers: anchor.web3.Keypair[] = [mint];
const instructions = [
anchor.web3.SystemProgram.createAccount({
fromPubkey: payer,
newAccountPubkey: mint.publicKey,
space: MintLayout.span,
lamports:
await candyMachine.program.provider.connection.getMinimumBalanceForRentExemption(
MintLayout.span,
),
programId: TOKEN_PROGRAM_ID,
}),
Token.createInitMintInstruction(
TOKEN_PROGRAM_ID,
mint.publicKey,
0,
payer,
payer,
),
createAssociatedTokenAccountInstruction(
userTokenAccountAddress,
payer,
payer,
mint.publicKey,
),
Token.createMintToInstruction(
TOKEN_PROGRAM_ID,
mint.publicKey,
userTokenAccountAddress,
payer,
[],
1,
),
];
  let tokenAccount;
  if (candyMachine.state.tokenMint) {
    // remember the paying account so the delegated approval can be revoked after the mint
    tokenAccount = userPayingAccountAddress;
    const transferAuthority = anchor.web3.Keypair.generate();
signers.push(transferAuthority);
remainingAccounts.push({
pubkey: userPayingAccountAddress,
isWritable: true,
isSigner: false,
});
remainingAccounts.push({
pubkey: transferAuthority.publicKey,
isWritable: false,
isSigner: true,
});
instructions.push(
Token.createApproveInstruction(
TOKEN_PROGRAM_ID,
userPayingAccountAddress,
transferAuthority.publicKey,
payer,
[],
candyMachine.state.price.toNumber(),
),
);
}
const metadataAddress = await getMetadata(mint.publicKey);
const masterEdition = await getMasterEdition(mint.publicKey);
instructions.push(
await candyMachine.program.instruction.mintNft({
accounts: {
config: candyMachine.state.config,
candyMachine: candyMachineAddress,
payer,
wallet: candyMachine.state.treasury,
mint: mint.publicKey,
metadata: metadataAddress,
masterEdition,
mintAuthority: payer,
updateAuthority: payer,
tokenMetadataProgram: TOKEN_METADATA_PROGRAM_ID,
tokenProgram: TOKEN_PROGRAM_ID,
systemProgram: SystemProgram.programId,
rent: anchor.web3.SYSVAR_RENT_PUBKEY,
clock: anchor.web3.SYSVAR_CLOCK_PUBKEY,
},
remainingAccounts:
remainingAccounts.length > 0 ? remainingAccounts : undefined,
}),
);
if (tokenAccount) {
instructions.push(
Token.createRevokeInstruction(
TOKEN_PROGRAM_ID,
userPayingAccountAddress,
payer,
[],
),
);
}
try {
return (
await sendTransactionWithRetry(
candyMachine.program.provider.connection,
candyMachine.program.provider.wallet,
instructions,
signers,
)
).txid;
} catch (e) {
console.log(e);
}
  // opaque sentinel txid returned on failure; callers should treat it as invalid
  return 'j';
};
export const shortenAddress = (address: string, chars = 4): string => {
return `${address.slice(0, chars)}...${address.slice(-chars)}`;
};
const sleep = (ms: number): Promise<void> => {
return new Promise(resolve => setTimeout(resolve, ms));
};


@ -0,0 +1,74 @@
import React, { useContext, useEffect, useMemo, useRef } from 'react';
import confetti from 'canvas-confetti';
export interface ConfettiContextState {
dropConfetti: () => void;
}
const ConfettiContext = React.createContext<ConfettiContextState | null>(null);
export const ConfettiProvider = ({ children = null as any }) => {
const canvasRef = useRef<HTMLCanvasElement>();
const confettiRef = useRef<confetti.CreateTypes>();
const dropConfetti = useMemo(
() => () => {
if (confettiRef.current && canvasRef.current) {
canvasRef.current.style.visibility = 'visible';
confettiRef
.current({
particleCount: 400,
spread: 160,
origin: { y: 0.3 },
})
?.finally(() => {
if (canvasRef.current) {
canvasRef.current.style.visibility = 'hidden';
}
});
}
},
[],
);
useEffect(() => {
if (canvasRef.current && !confettiRef.current) {
canvasRef.current.style.visibility = 'hidden';
confettiRef.current = confetti.create(canvasRef.current, {
resize: true,
useWorker: true,
});
}
}, []);
const canvasStyle: React.CSSProperties = {
width: '100vw',
height: '100vh',
position: 'absolute',
zIndex: 1,
top: 0,
left: 0,
};
return (
<ConfettiContext.Provider value={{ dropConfetti }}>
<canvas ref={canvasRef as any} style={canvasStyle} />
{children}
</ConfettiContext.Provider>
);
};
export const Confetti = () => {
const { dropConfetti } = useConfetti();
useEffect(() => {
dropConfetti();
}, [dropConfetti]);
return <></>;
};
export const useConfetti = () => {
const context = useContext(ConfettiContext);
return context as ConfettiContextState;
};


@ -0,0 +1,536 @@
import {
Keypair,
Commitment,
Connection,
RpcResponseAndContext,
SignatureStatus,
SimulatedTransactionResponse,
Transaction,
TransactionInstruction,
TransactionSignature,
Blockhash,
FeeCalculator,
} from '@solana/web3.js';
import {
WalletNotConnectedError,
} from '@solana/wallet-adapter-base';
interface BlockhashAndFeeCalculator {
blockhash: Blockhash;
feeCalculator: FeeCalculator;
}
export const getErrorForTransaction = async (
connection: Connection,
txid: string,
) => {
  // wait for full confirmation before getting the transaction
await connection.confirmTransaction(txid, 'max');
const tx = await connection.getParsedConfirmedTransaction(txid);
const errors: string[] = [];
if (tx?.meta && tx.meta.logMessages) {
tx.meta.logMessages.forEach(log => {
const regex = /Error: (.*)/gm;
let m;
while ((m = regex.exec(log)) !== null) {
// This is necessary to avoid infinite loops with zero-width matches
if (m.index === regex.lastIndex) {
regex.lastIndex++;
}
if (m.length > 1) {
errors.push(m[1]);
}
}
});
}
return errors;
};
export enum SequenceType {
Sequential,
Parallel,
StopOnFailure,
}
export async function sendTransactionsWithManualRetry(
connection: Connection,
wallet: any,
instructions: TransactionInstruction[][],
signers: Keypair[][],
) {
let stopPoint = 0;
let tries = 0;
let lastInstructionsLength = null;
let toRemoveSigners: Record<number, boolean> = {};
instructions = instructions.filter((instr, i) => {
if (instr.length > 0) {
return true;
} else {
toRemoveSigners[i] = true;
return false;
}
});
let filteredSigners = signers.filter((_, i) => !toRemoveSigners[i]);
while (stopPoint < instructions.length && tries < 3) {
instructions = instructions.slice(stopPoint, instructions.length);
filteredSigners = filteredSigners.slice(stopPoint, filteredSigners.length);
if (instructions.length === lastInstructionsLength) tries = tries + 1;
else tries = 0;
try {
if (instructions.length === 1) {
await sendTransactionWithRetry(
connection,
wallet,
instructions[0],
filteredSigners[0],
'single',
);
stopPoint = 1;
} else {
stopPoint = await sendTransactions(
connection,
wallet,
instructions,
filteredSigners,
SequenceType.StopOnFailure,
'single',
);
}
} catch (e) {
console.error(e);
}
console.log(
'Died on ',
stopPoint,
'retrying from instruction',
instructions[stopPoint],
'instructions length is',
instructions.length,
);
lastInstructionsLength = instructions.length;
}
}
export const sendTransactions = async (
connection: Connection,
wallet: any,
instructionSet: TransactionInstruction[][],
signersSet: Keypair[][],
sequenceType: SequenceType = SequenceType.Parallel,
commitment: Commitment = 'singleGossip',
successCallback: (txid: string, ind: number) => void = (txid, ind) => {},
  failCallback: (reason: string, ind: number) => boolean = () => false,
block?: BlockhashAndFeeCalculator,
): Promise<number> => {
if (!wallet.publicKey) throw new WalletNotConnectedError();
const unsignedTxns: Transaction[] = [];
if (!block) {
block = await connection.getRecentBlockhash(commitment);
}
for (let i = 0; i < instructionSet.length; i++) {
const instructions = instructionSet[i];
const signers = signersSet[i];
if (instructions.length === 0) {
continue;
}
let transaction = new Transaction();
instructions.forEach(instruction => transaction.add(instruction));
transaction.recentBlockhash = block.blockhash;
transaction.setSigners(
      // fee paid by the wallet owner
wallet.publicKey,
...signers.map(s => s.publicKey),
);
if (signers.length > 0) {
transaction.partialSign(...signers);
}
unsignedTxns.push(transaction);
}
const signedTxns = await wallet.signAllTransactions(unsignedTxns);
const pendingTxns: Promise<{ txid: string; slot: number }>[] = [];
let breakEarlyObject = { breakEarly: false, i: 0 };
console.log(
'Signed txns length',
signedTxns.length,
'vs handed in length',
instructionSet.length,
);
for (let i = 0; i < signedTxns.length; i++) {
const signedTxnPromise = sendSignedTransaction({
connection,
signedTransaction: signedTxns[i],
});
signedTxnPromise
.then(({ txid, slot }) => {
successCallback(txid, i);
})
.catch(reason => {
        failCallback(reason, i);
if (sequenceType === SequenceType.StopOnFailure) {
breakEarlyObject.breakEarly = true;
breakEarlyObject.i = i;
}
});
if (sequenceType !== SequenceType.Parallel) {
try {
await signedTxnPromise;
} catch (e) {
console.log('Caught failure', e);
if (breakEarlyObject.breakEarly) {
console.log('Died on ', breakEarlyObject.i);
return breakEarlyObject.i; // Return the txn we failed on by index
}
}
} else {
pendingTxns.push(signedTxnPromise);
}
}
  if (sequenceType === SequenceType.Parallel) {
await Promise.all(pendingTxns);
}
return signedTxns.length;
};
export const sendTransaction = async (
connection: Connection,
wallet: any,
instructions: TransactionInstruction[],
signers: Keypair[],
awaitConfirmation = true,
commitment: Commitment = 'singleGossip',
includesFeePayer: boolean = false,
block?: BlockhashAndFeeCalculator,
) => {
if (!wallet.publicKey) throw new WalletNotConnectedError();
let transaction = new Transaction();
instructions.forEach(instruction => transaction.add(instruction));
transaction.recentBlockhash = (
block || (await connection.getRecentBlockhash(commitment))
).blockhash;
if (includesFeePayer) {
transaction.setSigners(...signers.map(s => s.publicKey));
} else {
transaction.setSigners(
      // fee paid by the wallet owner
wallet.publicKey,
...signers.map(s => s.publicKey),
);
}
if (signers.length > 0) {
transaction.partialSign(...signers);
}
if (!includesFeePayer) {
transaction = await wallet.signTransaction(transaction);
}
const rawTransaction = transaction.serialize();
let options = {
skipPreflight: true,
commitment,
};
const txid = await connection.sendRawTransaction(rawTransaction, options);
let slot = 0;
if (awaitConfirmation) {
const confirmation = await awaitTransactionSignatureConfirmation(
txid,
DEFAULT_TIMEOUT,
connection,
commitment,
);
if (!confirmation)
throw new Error('Timed out awaiting confirmation on transaction');
slot = confirmation?.slot || 0;
if (confirmation?.err) {
const errors = await getErrorForTransaction(connection, txid);
console.log(errors);
throw new Error(`Raw transaction ${txid} failed`);
}
}
return { txid, slot };
};
export const sendTransactionWithRetry = async (
connection: Connection,
wallet: any,
instructions: TransactionInstruction[],
signers: Keypair[],
commitment: Commitment = 'singleGossip',
includesFeePayer: boolean = false,
block?: BlockhashAndFeeCalculator,
beforeSend?: () => void,
) => {
if (!wallet.publicKey) throw new WalletNotConnectedError();
let transaction = new Transaction();
instructions.forEach(instruction => transaction.add(instruction));
transaction.recentBlockhash = (
block || (await connection.getRecentBlockhash(commitment))
).blockhash;
if (includesFeePayer) {
transaction.setSigners(...signers.map(s => s.publicKey));
} else {
transaction.setSigners(
      // fee paid by the wallet owner
wallet.publicKey,
...signers.map(s => s.publicKey),
);
}
if (signers.length > 0) {
transaction.partialSign(...signers);
}
if (!includesFeePayer) {
transaction = await wallet.signTransaction(transaction);
}
if (beforeSend) {
beforeSend();
}
const { txid, slot } = await sendSignedTransaction({
connection,
signedTransaction: transaction,
});
return { txid, slot };
};
export const getUnixTs = () => {
return new Date().getTime() / 1000;
};
const DEFAULT_TIMEOUT = 15000;
export async function sendSignedTransaction({
signedTransaction,
connection,
timeout = DEFAULT_TIMEOUT,
}: {
signedTransaction: Transaction;
connection: Connection;
sendingMessage?: string;
sentMessage?: string;
successMessage?: string;
timeout?: number;
}): Promise<{ txid: string; slot: number }> {
const rawTransaction = signedTransaction.serialize();
const startTime = getUnixTs();
let slot = 0;
const txid: TransactionSignature = await connection.sendRawTransaction(
rawTransaction,
{
skipPreflight: true,
},
);
console.log('Started awaiting confirmation for', txid);
let done = false;
(async () => {
while (!done && getUnixTs() - startTime < timeout) {
connection.sendRawTransaction(rawTransaction, {
skipPreflight: true,
});
await sleep(500);
}
})();
try {
const confirmation = await awaitTransactionSignatureConfirmation(
txid,
timeout,
connection,
'recent',
true,
);
if (!confirmation)
throw new Error('Timed out awaiting confirmation on transaction');
if (confirmation.err) {
console.error(confirmation.err);
throw new Error('Transaction failed: Custom instruction error');
}
slot = confirmation?.slot || 0;
} catch (err: any) {
console.error('Timeout Error caught', err);
if (err.timeout) {
throw new Error('Timed out awaiting confirmation on transaction');
}
let simulateResult: SimulatedTransactionResponse | null = null;
try {
simulateResult = (
await simulateTransaction(connection, signedTransaction, 'single')
).value;
} catch (e) {}
if (simulateResult && simulateResult.err) {
if (simulateResult.logs) {
for (let i = simulateResult.logs.length - 1; i >= 0; --i) {
const line = simulateResult.logs[i];
if (line.startsWith('Program log: ')) {
throw new Error(
'Transaction failed: ' + line.slice('Program log: '.length),
);
}
}
}
throw new Error(JSON.stringify(simulateResult.err));
}
// throw new Error('Transaction failed');
} finally {
done = true;
}
console.log('Latency', txid, getUnixTs() - startTime);
return { txid, slot };
}
async function simulateTransaction(
connection: Connection,
transaction: Transaction,
commitment: Commitment,
): Promise<RpcResponseAndContext<SimulatedTransactionResponse>> {
// @ts-ignore
transaction.recentBlockhash = await connection._recentBlockhash(
// @ts-ignore
connection._disableBlockhashCaching,
);
const signData = transaction.serializeMessage();
// @ts-ignore
const wireTransaction = transaction._serialize(signData);
const encodedTransaction = wireTransaction.toString('base64');
const config: any = { encoding: 'base64', commitment };
const args = [encodedTransaction, config];
// @ts-ignore
const res = await connection._rpcRequest('simulateTransaction', args);
if (res.error) {
throw new Error('failed to simulate transaction: ' + res.error.message);
}
return res.result;
}
async function awaitTransactionSignatureConfirmation(
txid: TransactionSignature,
timeout: number,
connection: Connection,
commitment: Commitment = 'recent',
queryStatus = false,
): Promise<SignatureStatus | null | void> {
let done = false;
let status: SignatureStatus | null | void = {
slot: 0,
confirmations: 0,
err: null,
};
let subId = 0;
status = await new Promise(async (resolve, reject) => {
setTimeout(() => {
if (done) {
return;
}
done = true;
console.log('Rejecting for timeout...');
reject({ timeout: true });
}, timeout);
try {
subId = connection.onSignature(
txid,
(result, context) => {
done = true;
status = {
err: result.err,
slot: context.slot,
confirmations: 0,
};
if (result.err) {
console.log('Rejected via websocket', result.err);
reject(status);
} else {
console.log('Resolved via websocket', result);
resolve(status);
}
},
commitment,
);
} catch (e) {
done = true;
console.error('WS error in setup', txid, e);
}
while (!done && queryStatus) {
// eslint-disable-next-line no-loop-func
(async () => {
try {
const signatureStatuses = await connection.getSignatureStatuses([
txid,
]);
status = signatureStatuses && signatureStatuses.value[0];
if (!done) {
if (!status) {
console.log('REST null result for', txid, status);
} else if (status.err) {
console.log('REST error for', txid, status);
done = true;
reject(status.err);
} else if (!status.confirmations) {
console.log('REST no confirmations for', txid, status);
} else {
console.log('REST confirmation for', txid, status);
done = true;
resolve(status);
}
}
} catch (e) {
if (!done) {
console.log('REST connection error: txid', txid, e);
}
}
})();
await sleep(2000);
}
});
//@ts-ignore
if (connection._signatureSubscriptions[subId])
connection.removeSignatureListener(subId);
done = true;
console.log('Returning status', status);
return status;
}
export function sleep(ms: number): Promise<void> {
return new Promise(resolve => setTimeout(resolve, ms));
}


@ -0,0 +1,136 @@
import { Paper } from '@material-ui/core';
import Countdown from 'react-countdown';
import { Theme, createStyles, makeStyles } from '@material-ui/core/styles';
import { useState } from 'react';
const useStyles = makeStyles((theme: Theme) =>
createStyles({
root: {
display: 'flex',
padding: theme.spacing(0),
'& > *': {
margin: theme.spacing(0.5),
marginRight: 0,
width: theme.spacing(6),
height: theme.spacing(6),
display: 'flex',
flexDirection: 'column',
alignContent: 'center',
alignItems: 'center',
justifyContent: 'center',
background: '#384457',
color: 'white',
borderRadius: 5,
fontSize: 10,
},
},
done: {
display: 'flex',
margin: theme.spacing(1),
marginRight: 0,
padding: theme.spacing(1),
flexDirection: 'column',
alignContent: 'center',
alignItems: 'center',
justifyContent: 'center',
background: '#384457',
color: 'white',
borderRadius: 5,
fontWeight: 'bold',
fontSize: 18,
},
item: {
fontWeight: 'bold',
fontSize: 18,
}
}),
);
interface PhaseCountdownProps {
date: Date | undefined;
style?: React.CSSProperties;
status?: string;
onComplete?: () => void;
start?: Date;
end?: Date;
}
interface CountdownRender {
days: number;
hours: number;
minutes: number;
seconds: number;
completed: boolean;
}
export const PhaseCountdown: React.FC<PhaseCountdownProps> = ({
date,
status,
style,
start,
end,
onComplete,
}) => {
const classes = useStyles();
  const [isFixed, setIsFixed] = useState(
    start && end && date ? start.getTime() - Date.now() < 0 : false,
  );
  const renderCountdown = ({
    days,
    hours,
    minutes,
    seconds,
    completed,
  }: CountdownRender) => {
    hours += days * 24; // fold days into the hours display
    if (completed) {
      return status ? <span className={classes.done}>{status}</span> : null;
    } else {
      return (
        <div className={classes.root} style={style}>
{isFixed && <Paper elevation={0}>
<span className={classes.item}>
+
</span>
</Paper>}
<Paper elevation={0}>
<span className={classes.item}>
{hours < 10 ? `0${hours}` : hours}
</span>
<span>hrs</span>
</Paper>
<Paper elevation={0}>
<span className={classes.item}>
{minutes < 10 ? `0${minutes}` : minutes}
</span>
<span>mins</span>
</Paper>
<Paper elevation={0}>
<span className={classes.item}>
{seconds < 10 ? `0${seconds}` : seconds}
</span>
<span>secs</span>
</Paper>
</div>
      );
    }
  };
  if (date && start && end) {
    if (isFixed) {
      return (
        <Countdown
          date={start}
          now={() => end.getTime()}
          onComplete={() => setIsFixed(false)}
          renderer={renderCountdown}
        />
      );
    }
  }
if (date) {
return (
<Countdown
date={date}
onComplete={onComplete}
renderer={renderCountdown}
/>
    );
  } else {
    return null;
  }
};


@ -0,0 +1,634 @@
import * as anchor from '@project-serum/anchor';
import { TOKEN_PROGRAM_ID, Token } from '@solana/spl-token';
import { LAMPORTS_PER_SOL, TransactionInstruction } from '@solana/web3.js';
import {
createAssociatedTokenAccountInstruction,
getAtaForMint,
getFairLaunchTicketSeqLookup,
} from './utils';
export const FAIR_LAUNCH_PROGRAM = new anchor.web3.PublicKey(
'faircnAB9k59Y4TXmLabBULeuTLgV7TkGMGNkjnA15j',
);
export interface FairLaunchAccount {
id: anchor.web3.PublicKey;
program: anchor.Program;
state: FairLaunchState;
ticket: {
pubkey: anchor.web3.PublicKey;
bump: number;
data?: FairLaunchTicket;
};
lottery: {
pubkey: anchor.web3.PublicKey;
data?: Uint8Array;
};
treasury: number;
}
export interface FairLaunchTicket {
fairLaunch: anchor.web3.PublicKey;
buyer: anchor.web3.PublicKey;
amount: anchor.BN;
state: {
punched?: {};
unpunched?: {};
withdrawn?: {};
no_sequence_struct: {};
};
bump: number;
seq: anchor.BN;
}
export interface AntiRugSetting {
reserveBp: number;
tokenRequirement: anchor.BN;
selfDestructDate: anchor.BN;
}
export interface FairLaunchState {
authority: anchor.web3.PublicKey;
bump: number;
currentMedian: anchor.BN;
currentEligibleHolders: anchor.BN;
data: {
antiRugSetting?: AntiRugSetting;
fee: anchor.BN;
numberOfTokens: anchor.BN;
phaseOneEnd: anchor.BN;
phaseOneStart: anchor.BN;
phaseTwoEnd: anchor.BN;
priceRangeEnd: anchor.BN;
priceRangeStart: anchor.BN;
lotteryDuration: anchor.BN;
tickSize: anchor.BN;
uuid: string;
};
numberTicketsDropped: anchor.BN;
numberTicketsPunched: anchor.BN;
numberTicketsSold: anchor.BN;
numberTicketsUnSeqed: anchor.BN;
numberTokensBurnedForRefunds: anchor.BN;
numberTokensPreminted: anchor.BN;
phaseThreeStarted: boolean;
tokenMint: anchor.web3.PublicKey;
tokenMintBump: number;
treasury: anchor.web3.PublicKey;
treasuryBump: number;
treasuryMint: anchor.web3.PublicKey; // only for SPL tokens
treasurySnapshot: null;
}
export enum LotteryState {
Brewing = 'Brewing',
Finished = 'Finished',
PastDue = 'Past Due',
}
export const getLotteryState = (
phaseThree: boolean | undefined,
lottery: Uint8Array | null,
lotteryDuration: anchor.BN,
phaseTwoEnd: anchor.BN,
): LotteryState => {
if (
!phaseThree &&
(!lottery || lottery.length === 0) &&
phaseTwoEnd.add(lotteryDuration).lt(new anchor.BN(Date.now() / 1000))
) {
return LotteryState.PastDue;
} else if (phaseThree) {
return LotteryState.Finished;
} else {
return LotteryState.Brewing;
}
};
export const getFairLaunchState = async (
anchorWallet: anchor.Wallet,
fairLaunchId: anchor.web3.PublicKey,
connection: anchor.web3.Connection,
): Promise<FairLaunchAccount> => {
const provider = new anchor.Provider(connection, anchorWallet, {
preflightCommitment: 'recent',
});
const idl = await anchor.Program.fetchIdl(FAIR_LAUNCH_PROGRAM, provider);
const program = new anchor.Program(idl, FAIR_LAUNCH_PROGRAM, provider);
const state: any = await program.account.fairLaunch.fetch(fairLaunchId);
const [fairLaunchTicket, bump] = await getFairLaunchTicket(
//@ts-ignore
state.tokenMint,
anchorWallet.publicKey,
);
let fairLaunchData: any;
try {
fairLaunchData = await program.account.fairLaunchTicket.fetch(
fairLaunchTicket,
);
} catch {
console.log('No ticket');
}
const treasury = await program.provider.connection.getBalance(state.treasury);
let lotteryData: Uint8Array = new Uint8Array([]);
let fairLaunchLotteryBitmap = (
await getFairLaunchLotteryBitmap(
//@ts-ignore
state.tokenMint,
)
)[0];
try {
const fairLaunchLotteryBitmapObj =
await program.provider.connection.getAccountInfo(fairLaunchLotteryBitmap);
lotteryData = new Uint8Array(fairLaunchLotteryBitmapObj?.data || []);
} catch (e) {
console.log('Could not find fair launch lottery.');
console.log(e);
}
return {
id: fairLaunchId,
state,
program,
ticket: {
pubkey: fairLaunchTicket,
bump,
data: fairLaunchData,
},
lottery: {
pubkey: fairLaunchLotteryBitmap,
data: lotteryData,
},
treasury,
};
};
export const punchTicket = async (
anchorWallet: anchor.Wallet,
fairLaunch: FairLaunchAccount,
) => {
const fairLaunchTicket = (
await getFairLaunchTicket(
//@ts-ignore
fairLaunch.state.tokenMint,
anchorWallet.publicKey,
)
)[0];
const ticket = fairLaunch.ticket.data;
const fairLaunchLotteryBitmap = //@ts-ignore
(await getFairLaunchLotteryBitmap(fairLaunch.state.tokenMint))[0];
const buyerTokenAccount = (
await getAtaForMint(
//@ts-ignore
fairLaunch.state.tokenMint,
anchorWallet.publicKey,
)
)[0];
if (ticket?.amount.gt(fairLaunch.state.currentMedian)) {
console.log(
'Adjusting down...',
ticket?.amount.toNumber(),
fairLaunch.state.currentMedian.toNumber(),
);
const { remainingAccounts, instructions, signers } =
await getSetupForTicketing(
fairLaunch.program,
fairLaunch.state.currentMedian.toNumber(),
anchorWallet,
fairLaunch,
fairLaunchTicket,
);
await fairLaunch.program.rpc.adjustTicket(fairLaunch.state.currentMedian, {
accounts: {
fairLaunchTicket,
fairLaunch: fairLaunch.id,
fairLaunchLotteryBitmap,
//@ts-ignore
treasury: fairLaunch.state.treasury,
systemProgram: anchor.web3.SystemProgram.programId,
clock: anchor.web3.SYSVAR_CLOCK_PUBKEY,
},
__private: { logAccounts: true },
instructions: instructions.length > 0 ? instructions : undefined,
remainingAccounts: [
{
pubkey: anchorWallet.publicKey,
isSigner: true,
isWritable: true,
},
...remainingAccounts,
],
signers,
});
}
const accountExists =
await fairLaunch.program.provider.connection.getAccountInfo(
buyerTokenAccount,
);
const instructions = !accountExists
? [
createAssociatedTokenAccountInstruction(
buyerTokenAccount,
anchorWallet.publicKey,
anchorWallet.publicKey,
//@ts-ignore
fairLaunch.state.tokenMint,
),
]
: [];
await fairLaunch.program.rpc.punchTicket({
accounts: {
fairLaunchTicket,
fairLaunch: fairLaunch.id,
fairLaunchLotteryBitmap,
payer: anchorWallet.publicKey,
buyerTokenAccount,
//@ts-ignore
tokenMint: fairLaunch.state.tokenMint,
tokenProgram: TOKEN_PROGRAM_ID,
},
instructions: instructions.length > 0 ? instructions : undefined,
});
};
export const getFairLaunchTicket = async (
tokenMint: anchor.web3.PublicKey,
buyer: anchor.web3.PublicKey,
): Promise<[anchor.web3.PublicKey, number]> => {
return await anchor.web3.PublicKey.findProgramAddress(
[Buffer.from('fair_launch'), tokenMint.toBuffer(), buyer.toBuffer()],
FAIR_LAUNCH_PROGRAM,
);
};
export const getFairLaunchLotteryBitmap = async (
tokenMint: anchor.web3.PublicKey,
): Promise<[anchor.web3.PublicKey, number]> => {
return await anchor.web3.PublicKey.findProgramAddress(
[Buffer.from('fair_launch'), tokenMint.toBuffer(), Buffer.from('lottery')],
FAIR_LAUNCH_PROGRAM,
);
};
const getSetupForTicketing = async (
anchorProgram: anchor.Program,
amount: number,
anchorWallet: anchor.Wallet,
fairLaunch: FairLaunchAccount | undefined,
ticketKey: anchor.web3.PublicKey,
): Promise<{
remainingAccounts: {
pubkey: anchor.web3.PublicKey | null;
isWritable: boolean;
isSigner: boolean;
}[];
instructions: TransactionInstruction[];
signers: anchor.web3.Keypair[];
amountLamports: number;
}> => {
if (!fairLaunch) {
return {
remainingAccounts: [],
instructions: [],
signers: [],
amountLamports: 0,
};
}
const ticket = fairLaunch.ticket;
const remainingAccounts = [];
const instructions = [];
const signers = [];
let amountLamports = 0;
//@ts-ignore
if (!fairLaunch.state.treasuryMint) {
if (!ticket && amount === 0) {
amountLamports = fairLaunch.state.data.priceRangeStart.toNumber();
} else {
amountLamports = Math.ceil(amount * LAMPORTS_PER_SOL);
}
} else {
const transferAuthority = anchor.web3.Keypair.generate();
signers.push(transferAuthority);
    // NOTE: this SPL-token flow will not work until you fetch the mint's decimals and multiply;
    // the example below from the CLI won't work here since you have an anchor.Wallet, not a Signer:
/*
const token = new Token(
anchorProgram.provider.connection,
//@ts-ignore
fairLaunchObj.treasuryMint,
TOKEN_PROGRAM_ID,
walletKeyPair,
);
const mintInfo = await token.getMintInfo();
amountNumber = Math.ceil(amountNumber * 10 ** mintInfo.decimals);
*/
instructions.push(
Token.createApproveInstruction(
TOKEN_PROGRAM_ID,
//@ts-ignore
fairLaunch.state.treasuryMint,
transferAuthority.publicKey,
anchorWallet.publicKey,
[],
        //@ts-ignore
        // TODO: multiply `amount` by 10 ** mint decimals (see the commented example above)
        amount + fairLaunch.state.data.fee.toNumber(),
),
);
remainingAccounts.push({
//@ts-ignore
pubkey: fairLaunch.state.treasuryMint,
isWritable: true,
isSigner: false,
});
remainingAccounts.push({
pubkey: (
await getAtaForMint(
//@ts-ignore
fairLaunch.state.treasuryMint,
anchorWallet.publicKey,
)
)[0],
isWritable: true,
isSigner: false,
});
remainingAccounts.push({
pubkey: transferAuthority.publicKey,
isWritable: false,
isSigner: true,
});
remainingAccounts.push({
pubkey: TOKEN_PROGRAM_ID,
isWritable: false,
isSigner: false,
});
}
if (ticket.data) {
const [fairLaunchTicketSeqLookup, seqBump] =
await getFairLaunchTicketSeqLookup(
fairLaunch.state.tokenMint,
ticket.data?.seq,
);
const seq = await anchorProgram.provider.connection.getAccountInfo(
fairLaunchTicketSeqLookup,
);
if (!seq) {
instructions.push(
await anchorProgram.instruction.createTicketSeq(seqBump, {
accounts: {
fairLaunchTicketSeqLookup,
fairLaunch: fairLaunch.id,
fairLaunchTicket: ticketKey,
payer: anchorWallet.publicKey,
systemProgram: anchor.web3.SystemProgram.programId,
rent: anchor.web3.SYSVAR_RENT_PUBKEY,
},
signers: [],
}),
);
}
}
return {
remainingAccounts,
instructions,
signers,
amountLamports,
};
};
export const receiveRefund = async (
anchorWallet: anchor.Wallet,
fairLaunch: FairLaunchAccount | undefined,
) => {
if (!fairLaunch) {
return;
}
const buyerTokenAccount = (
await getAtaForMint(fairLaunch.state.tokenMint, anchorWallet.publicKey)
)[0];
const transferAuthority = anchor.web3.Keypair.generate();
const signers = [transferAuthority];
const instructions = [
Token.createApproveInstruction(
TOKEN_PROGRAM_ID,
buyerTokenAccount,
transferAuthority.publicKey,
anchorWallet.publicKey,
[],
1,
),
];
const remainingAccounts = [];
if (fairLaunch.state.treasuryMint) {
remainingAccounts.push({
pubkey: fairLaunch.state.treasuryMint,
isWritable: true,
isSigner: false,
});
remainingAccounts.push({
pubkey: (
await getAtaForMint(
fairLaunch.state.treasuryMint,
anchorWallet.publicKey,
)
)[0],
isWritable: true,
isSigner: false,
});
}
console.log(
'tfr',
fairLaunch.state.treasury.toBase58(),
anchorWallet.publicKey.toBase58(),
buyerTokenAccount.toBase58(),
);
await fairLaunch.program.rpc.receiveRefund({
accounts: {
fairLaunch: fairLaunch.id,
treasury: fairLaunch.state.treasury,
buyer: anchorWallet.publicKey,
buyerTokenAccount,
transferAuthority: transferAuthority.publicKey,
tokenMint: fairLaunch.state.tokenMint,
tokenProgram: TOKEN_PROGRAM_ID,
systemProgram: anchor.web3.SystemProgram.programId,
clock: anchor.web3.SYSVAR_CLOCK_PUBKEY,
},
__private: { logAccounts: true },
remainingAccounts,
instructions,
signers,
});
};
export const purchaseTicket = async (
amount: number,
anchorWallet: anchor.Wallet,
fairLaunch: FairLaunchAccount | undefined,
) => {
if (!fairLaunch) {
return;
}
const ticket = fairLaunch.ticket.data;
const [fairLaunchTicket, bump] = await getFairLaunchTicket(
//@ts-ignore
fairLaunch.state.tokenMint,
anchorWallet.publicKey,
);
const { remainingAccounts, instructions, signers, amountLamports } =
await getSetupForTicketing(
fairLaunch.program,
amount,
anchorWallet,
fairLaunch,
fairLaunchTicket,
);
if (ticket) {
const fairLaunchLotteryBitmap = ( //@ts-ignore
await getFairLaunchLotteryBitmap(fairLaunch.state.tokenMint)
)[0];
console.log(
'Anchor wallet',
anchorWallet.publicKey.toBase58(),
amountLamports,
);
await fairLaunch.program.rpc.adjustTicket(new anchor.BN(amountLamports), {
accounts: {
fairLaunchTicket,
fairLaunch: fairLaunch.id,
fairLaunchLotteryBitmap,
//@ts-ignore
treasury: fairLaunch.state.treasury,
systemProgram: anchor.web3.SystemProgram.programId,
clock: anchor.web3.SYSVAR_CLOCK_PUBKEY,
},
__private: { logAccounts: true },
remainingAccounts: [
{
pubkey: anchorWallet.publicKey,
isSigner: true,
isWritable: true,
},
...remainingAccounts,
],
signers,
instructions: instructions.length > 0 ? instructions : undefined,
});
return;
}
try {
console.log('Amount', amountLamports);
await fairLaunch.program.rpc.purchaseTicket(
bump,
new anchor.BN(amountLamports),
{
accounts: {
fairLaunchTicket,
fairLaunch: fairLaunch.id,
//@ts-ignore
treasury: fairLaunch.state.treasury,
buyer: anchorWallet.publicKey,
payer: anchorWallet.publicKey,
systemProgram: anchor.web3.SystemProgram.programId,
rent: anchor.web3.SYSVAR_RENT_PUBKEY,
clock: anchor.web3.SYSVAR_CLOCK_PUBKEY,
},
//__private: { logAccounts: true },
remainingAccounts,
signers,
instructions: instructions.length > 0 ? instructions : undefined,
},
);
} catch (e) {
console.log(e);
throw e;
}
};
export const withdrawFunds = async (
anchorWallet: anchor.Wallet,
fairLaunch: FairLaunchAccount | undefined,
) => {
if (!fairLaunch) {
return;
}
// TODO: create sequence ticket
const remainingAccounts = [];
//@ts-ignore
if (fairLaunch.state.treasuryMint) {
remainingAccounts.push({
//@ts-ignore
pubkey: fairLaunch.state.treasuryMint,
isWritable: true,
isSigner: false,
});
remainingAccounts.push({
pubkey: (
await getAtaForMint(
//@ts-ignore
fairLaunch.state.treasuryMint,
anchorWallet.publicKey,
)
)[0],
isWritable: true,
isSigner: false,
});
remainingAccounts.push({
pubkey: TOKEN_PROGRAM_ID,
isWritable: false,
isSigner: false,
});
}
await fairLaunch.program.rpc.withdrawFunds({
accounts: {
fairLaunch: fairLaunch.id,
// @ts-ignore
treasury: fairLaunch.state.treasury,
authority: anchorWallet.publicKey,
// @ts-ignore
tokenMint: fairLaunch.state.tokenMint,
systemProgram: anchor.web3.SystemProgram.programId,
},
remainingAccounts,
});
};

@@ -0,0 +1,14 @@
body {
background: #000000;
margin: 0;
font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', 'Roboto', 'Oxygen',
'Ubuntu', 'Cantarell', 'Fira Sans', 'Droid Sans', 'Helvetica Neue',
sans-serif;
-webkit-font-smoothing: antialiased;
-moz-osx-font-smoothing: grayscale;
}
code {
font-family: source-code-pro, Menlo, Monaco, Consolas, 'Courier New',
monospace;
}

@@ -0,0 +1,17 @@
import React from 'react';
import ReactDOM from 'react-dom';
import './index.css';
import App from './App';
import reportWebVitals from './reportWebVitals';
ReactDOM.render(
<React.StrictMode>
<App />
</React.StrictMode>,
document.getElementById('root')
);
// If you want to start measuring performance in your app, pass a function
// to log results (for example: reportWebVitals(console.log))
// or send to an analytics endpoint. Learn more: https://bit.ly/CRA-vitals
reportWebVitals();

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 841.9 595.3"><g fill="#61DAFB"><path d="M666.3 296.5c0-32.5-40.7-63.3-103.1-82.4 14.4-63.6 8-114.2-20.2-130.4-6.5-3.8-14.1-5.6-22.4-5.6v22.3c4.6 0 8.3.9 11.4 2.6 13.6 7.8 19.5 37.5 14.9 75.7-1.1 9.4-2.9 19.3-5.1 29.4-19.6-4.8-41-8.5-63.5-10.9-13.5-18.5-27.5-35.3-41.6-50 32.6-30.3 63.2-46.9 84-46.9V78c-27.5 0-63.5 19.6-99.9 53.6-36.4-33.8-72.4-53.2-99.9-53.2v22.3c20.7 0 51.4 16.5 84 46.6-14 14.7-28 31.4-41.3 49.9-22.6 2.4-44 6.1-63.6 11-2.3-10-4-19.7-5.2-29-4.7-38.2 1.1-67.9 14.6-75.8 3-1.8 6.9-2.6 11.5-2.6V78.5c-8.4 0-16 1.8-22.6 5.6-28.1 16.2-34.4 66.7-19.9 130.1-62.2 19.2-102.7 49.9-102.7 82.3 0 32.5 40.7 63.3 103.1 82.4-14.4 63.6-8 114.2 20.2 130.4 6.5 3.8 14.1 5.6 22.5 5.6 27.5 0 63.5-19.6 99.9-53.6 36.4 33.8 72.4 53.2 99.9 53.2 8.4 0 16-1.8 22.6-5.6 28.1-16.2 34.4-66.7 19.9-130.1 62-19.1 102.5-49.9 102.5-82.3zm-130.2-66.7c-3.7 12.9-8.3 26.2-13.5 39.5-4.1-8-8.4-16-13.1-24-4.6-8-9.5-15.8-14.4-23.4 14.2 2.1 27.9 4.7 41 7.9zm-45.8 106.5c-7.8 13.5-15.8 26.3-24.1 38.2-14.9 1.3-30 2-45.2 2-15.1 0-30.2-.7-45-1.9-8.3-11.9-16.4-24.6-24.2-38-7.6-13.1-14.5-26.4-20.8-39.8 6.2-13.4 13.2-26.8 20.7-39.9 7.8-13.5 15.8-26.3 24.1-38.2 14.9-1.3 30-2 45.2-2 15.1 0 30.2.7 45 1.9 8.3 11.9 16.4 24.6 24.2 38 7.6 13.1 14.5 26.4 20.8 39.8-6.3 13.4-13.2 26.8-20.7 39.9zm32.3-13c5.4 13.4 10 26.8 13.8 39.8-13.1 3.2-26.9 5.9-41.2 8 4.9-7.7 9.8-15.6 14.4-23.7 4.6-8 8.9-16.1 13-24.1zM421.2 430c-9.3-9.6-18.6-20.3-27.8-32 9 .4 18.2.7 27.5.7 9.4 0 18.7-.2 27.8-.7-9 11.7-18.3 22.4-27.5 32zm-74.4-58.9c-14.2-2.1-27.9-4.7-41-7.9 3.7-12.9 8.3-26.2 13.5-39.5 4.1 8 8.4 16 13.1 24 4.7 8 9.5 15.8 14.4 23.4zM420.7 163c9.3 9.6 18.6 20.3 27.8 32-9-.4-18.2-.7-27.5-.7-9.4 0-18.7.2-27.8.7 9-11.7 18.3-22.4 27.5-32zm-74 58.9c-4.9 7.7-9.8 15.6-14.4 23.7-4.6 8-8.9 16-13 24-5.4-13.4-10-26.8-13.8-39.8 13.1-3.1 26.9-5.8 41.2-7.9zm-90.5 125.2c-35.4-15.1-58.3-34.9-58.3-50.6 0-15.7 22.9-35.6 58.3-50.6 8.6-3.7 18-7 27.7-10.1 5.7 19.6 13.2 40 22.5 60.9-9.2 20.8-16.6 
41.1-22.2 60.6-9.9-3.1-19.3-6.5-28-10.2zM310 490c-13.6-7.8-19.5-37.5-14.9-75.7 1.1-9.4 2.9-19.3 5.1-29.4 19.6 4.8 41 8.5 63.5 10.9 13.5 18.5 27.5 35.3 41.6 50-32.6 30.3-63.2 46.9-84 46.9-4.5-.1-8.3-1-11.3-2.7zm237.2-76.2c4.7 38.2-1.1 67.9-14.6 75.8-3 1.8-6.9 2.6-11.5 2.6-20.7 0-51.4-16.5-84-46.6 14-14.7 28-31.4 41.3-49.9 22.6-2.4 44-6.1 63.6-11 2.3 10.1 4.1 19.8 5.2 29.1zm38.5-66.7c-8.6 3.7-18 7-27.7 10.1-5.7-19.6-13.2-40-22.5-60.9 9.2-20.8 16.6-41.1 22.2-60.6 9.9 3.1 19.3 6.5 28.1 10.2 35.4 15.1 58.3 34.9 58.3 50.6-.1 15.7-23 35.6-58.4 50.6zM320.8 78.4z"/><circle cx="420.9" cy="296.5" r="45.7"/><path d="M520.5 78.1z"/></g></svg>


@@ -0,0 +1 @@
/// <reference types="react-scripts" />

@@ -0,0 +1,15 @@
import { ReportHandler } from 'web-vitals';
const reportWebVitals = (onPerfEntry?: ReportHandler) => {
if (onPerfEntry && onPerfEntry instanceof Function) {
import('web-vitals').then(({ getCLS, getFID, getFCP, getLCP, getTTFB }) => {
getCLS(onPerfEntry);
getFID(onPerfEntry);
getFCP(onPerfEntry);
getLCP(onPerfEntry);
getTTFB(onPerfEntry);
});
}
};
export default reportWebVitals;

@@ -0,0 +1,5 @@
// jest-dom adds custom jest matchers for asserting on DOM nodes.
// allows you to do things like:
// expect(element).toHaveTextContent(/react/i)
// learn more: https://github.com/testing-library/jest-dom
import '@testing-library/jest-dom';

@@ -0,0 +1,130 @@
import * as anchor from '@project-serum/anchor';
import { TOKEN_PROGRAM_ID } from '@solana/spl-token';
import { SystemProgram } from '@solana/web3.js';
import {
LAMPORTS_PER_SOL,
SYSVAR_RENT_PUBKEY,
TransactionInstruction,
} from '@solana/web3.js';
export const FAIR_LAUNCH_PROGRAM_ID = new anchor.web3.PublicKey(
'faircnAB9k59Y4TXmLabBULeuTLgV7TkGMGNkjnA15j',
);
export const toDate = (value?: anchor.BN) => {
if (!value) {
return;
}
return new Date(value.toNumber() * 1000);
};
const numberFormater = new Intl.NumberFormat('en-US', {
style: 'decimal',
minimumFractionDigits: 2,
maximumFractionDigits: 2,
});
export const formatNumber = {
format: (val?: number) => {
if (!val) {
return '--';
}
return numberFormater.format(val);
},
asNumber: (val?: anchor.BN) => {
if (!val) {
return undefined;
}
return val.toNumber() / LAMPORTS_PER_SOL;
},
};
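The formatting helpers above can be exercised standalone. In this sketch a plain `{ toNumber() }` object stands in for `anchor.BN`, and `LAMPORTS_PER_SOL` is re-declared locally (1 SOL = 10^9 lamports) rather than imported:

```typescript
const LAMPORTS_PER_SOL = 1_000_000_000; // re-declared here; imported in the real file
type BNLike = { toNumber(): number };
const bn = (n: number): BNLike => ({ toNumber: () => n });

const toDate = (value?: BNLike) => {
  if (!value) return;
  // on-chain timestamps are unix seconds; Date expects milliseconds
  return new Date(value.toNumber() * 1000);
};

const numberFormater = new Intl.NumberFormat('en-US', {
  style: 'decimal',
  minimumFractionDigits: 2,
  maximumFractionDigits: 2,
});

const formatNumber = {
  // falsy values (including 0) render as the '--' placeholder
  format: (val?: number) => (val ? numberFormater.format(val) : '--'),
  // converts a lamport balance to a SOL amount
  asNumber: (val?: BNLike) =>
    val ? val.toNumber() / LAMPORTS_PER_SOL : undefined,
};

console.log(formatNumber.format(1234.5)); // '1,234.50'
console.log(formatNumber.asNumber(bn(2_500_000_000))); // 2.5
```

Note that `format` treats `0` the same as `undefined` and prints `--`; callers that need to display a zero balance would have to special-case it.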
export const SPL_ASSOCIATED_TOKEN_ACCOUNT_PROGRAM_ID =
new anchor.web3.PublicKey('ATokenGPvbdGVxr1b2hvZbsiqW5xWH25efTNsLJA8knL');
export const getFairLaunchTicketSeqLookup = async (
tokenMint: anchor.web3.PublicKey,
seq: anchor.BN,
): Promise<[anchor.web3.PublicKey, number]> => {
return await anchor.web3.PublicKey.findProgramAddress(
[
Buffer.from('fair_launch'),
tokenMint.toBuffer(),
seq.toArrayLike(Buffer, 'le', 8),
],
FAIR_LAUNCH_PROGRAM_ID,
);
};
export const getAtaForMint = async (
mint: anchor.web3.PublicKey,
buyer: anchor.web3.PublicKey,
): Promise<[anchor.web3.PublicKey, number]> => {
return await anchor.web3.PublicKey.findProgramAddress(
[buyer.toBuffer(), TOKEN_PROGRAM_ID.toBuffer(), mint.toBuffer()],
SPL_ASSOCIATED_TOKEN_ACCOUNT_PROGRAM_ID,
);
};
export const getFairLaunchTicket = async (
tokenMint: anchor.web3.PublicKey,
buyer: anchor.web3.PublicKey,
): Promise<[anchor.web3.PublicKey, number]> => {
return await anchor.web3.PublicKey.findProgramAddress(
[Buffer.from('fair_launch'), tokenMint.toBuffer(), buyer.toBuffer()],
FAIR_LAUNCH_PROGRAM_ID,
);
};
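The PDA helpers above derive addresses from seed byte arrays. The only non-obvious seed is the sequence number, which `seq.toArrayLike(Buffer, 'le', 8)` encodes as an unsigned 64-bit little-endian integer. A standalone sketch of that encoding (hand-rolled, and only valid for sequence numbers below 2^53 since it uses plain numbers rather than bn.js):

```typescript
// Produces the same 8 bytes as anchor.BN#toArrayLike(Buffer, 'le', 8)
// for values representable as a JS number.
const seqSeed = (seq: number): Buffer => {
  const buf = Buffer.alloc(8);
  for (let i = 0; i < 8; i++) {
    buf[i] = seq % 256; // least-significant byte first
    seq = Math.floor(seq / 256);
  }
  return buf;
};

// The full seed list for getFairLaunchTicketSeqLookup is then:
//   [Buffer.from('fair_launch'), tokenMint.toBuffer(), seqSeed(seq)]
console.log(seqSeed(0x1234).toString('hex')); // '3412000000000000'
```

Because the seed bytes (not the numeric value) feed the program-derived-address hash, getting the endianness and width wrong would silently derive a different, empty account.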
export function createAssociatedTokenAccountInstruction(
associatedTokenAddress: anchor.web3.PublicKey,
payer: anchor.web3.PublicKey,
walletAddress: anchor.web3.PublicKey,
splTokenMintAddress: anchor.web3.PublicKey,
) {
const keys = [
{
pubkey: payer,
isSigner: true,
isWritable: true,
},
{
pubkey: associatedTokenAddress,
isSigner: false,
isWritable: true,
},
{
pubkey: walletAddress,
isSigner: false,
isWritable: false,
},
{
pubkey: splTokenMintAddress,
isSigner: false,
isWritable: false,
},
{
pubkey: SystemProgram.programId,
isSigner: false,
isWritable: false,
},
{
pubkey: TOKEN_PROGRAM_ID,
isSigner: false,
isWritable: false,
},
{
pubkey: SYSVAR_RENT_PUBKEY,
isSigner: false,
isWritable: false,
},
];
return new TransactionInstruction({
keys,
programId: SPL_ASSOCIATED_TOKEN_ACCOUNT_PROGRAM_ID,
data: Buffer.from([]),
});
}

@@ -0,0 +1,26 @@
{
"compilerOptions": {
"target": "es5",
"lib": [
"dom",
"dom.iterable",
"esnext"
],
"allowJs": true,
"skipLibCheck": true,
"esModuleInterop": true,
"allowSyntheticDefaultImports": true,
"strict": true,
"forceConsistentCasingInFileNames": true,
"noFallthroughCasesInSwitch": true,
"module": "esnext",
"moduleResolution": "node",
"resolveJsonModule": true,
"isolatedModules": true,
"noEmit": true,
"jsx": "react-jsx"
},
"include": [
"src"
]
}

@@ -1,3 +1,2 @@
REACT_APP_STORE_OWNER_ADDRESS_ADDRESS=
REACT_APP_STORE_ADDRESS=
REACT_APP_BIG_STORE=FALSE
REACT_APP_STORE_ADDRESS=

@@ -44,7 +44,7 @@
"build": "next build",
"export": "next export -o ../../build/web",
"start:prod": "next start",
"test": "jest",
"test": "jest --passWithNoTests",
"deploy:ar": "yarn export && arweave deploy-dir ../../build/web --key-file ",
"deploy:gh": "yarn export && gh-pages -d ../../build/web --repo https://github.com/metaplex-foundation/metaplex -t true",
"deploy": "cross-env ASSET_PREFIX=/metaplex/ yarn build && yarn deploy:gh",
@@ -94,4 +94,4 @@
"react-dom": "*"
},
"license": "MIT"
}
}

@@ -42,17 +42,32 @@ interface IArweaveResult
}>;
}
const uploadToArweave = async (data: FormData): Promise<IArweaveResult> =>
(
await fetch(
'https://us-central1-principal-lane-200702.cloudfunctions.net/uploadFile4',
{
method: 'POST',
// @ts-ignore
body: data,
},
)
).json();
const uploadToArweave = async (data: FormData): Promise<IArweaveResult> => {
const resp = await fetch(
'https://us-central1-principal-lane-200702.cloudfunctions.net/uploadFile4',
{
method: 'POST',
// @ts-ignore
body: data,
},
);
if (!resp.ok) {
return Promise.reject(
new Error(
'Unable to upload the artwork to Arweave. Please wait and then try again.',
),
);
}
const result: IArweaveResult = await resp.json();
if (result.error) {
return Promise.reject(new Error(result.error));
}
return result;
};
export const mintNFT = async (
connection: Connection,

@@ -27,7 +27,7 @@ export async function sendPlaceBid(
auctionView: AuctionView,
accountsByMint: Map<string, TokenAccount>,
// value entered by the user adjust to decimals of the mint
amount: number,
amount: number | BN,
) {
const signers: Keypair[][] = [];
const instructions: TransactionInstruction[][] = [];
@@ -62,7 +62,8 @@ export async function setupPlaceBid(
auctionView: AuctionView,
accountsByMint: Map<string, TokenAccount>,
// value entered by the user adjust to decimals of the mint
amount: number,
// If BN, then assume instant sale and decimals already adjusted.
amount: number | BN,
overallInstructions: TransactionInstruction[][],
overallSigners: Keypair[][],
): Promise<BN> {
@@ -82,7 +83,12 @@
const mint = cache.get(
tokenAccount ? tokenAccount.info.mint : QUOTE_MINT,
) as ParsedAccount<MintInfo>;
const lamports = toLamports(amount, mint.info) + accountRentExempt;
const lamports =
accountRentExempt +
(typeof amount === 'number'
? toLamports(amount, mint.info)
: amount.toNumber());
let bidderPotTokenAccount: string;
if (!auctionView.myBidderPot) {
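The `number | BN` amount handling introduced in this hunk can be sketched in isolation. Here `toLamports` and the rent figure are simplified stand-ins (assuming a 9-decimal SOL quote mint), not the actual @oyster/common implementations:

```typescript
// Assumptions: the real toLamports reads mint.info.decimals; 9 (SOL's
// decimal count) is hard-coded here, and the rent figure is illustrative.
type BNLike = { toNumber(): number };

const toLamports = (uiAmount: number, decimals = 9): number =>
  Math.round(uiAmount * Math.pow(10, decimals));

// A number is a UI amount and must be scaled to lamports; a BN is
// assumed to already be in lamports (the instant-sale path).
const bidLamports = (
  amount: number | BNLike,
  accountRentExempt: number,
): number =>
  accountRentExempt +
  (typeof amount === 'number' ? toLamports(amount) : amount.toNumber());

const rent = 2_039_280; // illustrative rent-exempt minimum, in lamports
console.log(bidLamports(1.5, rent)); // 1502039280
console.log(bidLamports({ toNumber: () => 1_500_000_000 }, rent)); // 1502039280
```

The union type lets the instant-sale caller pass the exact on-chain `instantSalePrice` through unchanged, avoiding a lossy lamports-to-UI-to-lamports round trip.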

@@ -123,7 +123,6 @@ export async function sendRedeemBid(
winnerIndex = auctionView.auction.info.bidState.getWinnerIndex(
auctionView.myBidderPot?.info.bidderAct,
);
console.log('Winner index', winnerIndex);
if (winnerIndex !== null) {
// items is a prebuilt array of arrays where each entry represents one

@@ -162,31 +162,31 @@ const HTMLContent = ({
uri,
animationUrl,
className,
preview,
style,
files,
artView,
}: {
uri?: string;
animationUrl?: string;
className?: string;
preview?: boolean;
style?: React.CSSProperties;
files?: (MetadataFile | string)[];
artView?: boolean;
}) => {
  if (!artView) {
    return (
      <CachedImageContent
        uri={uri}
        className={className}
        preview={preview}
        style={style}
      />
    );
  }
const htmlURL =
files && files.length > 0 && typeof files[0] === 'string'
? files[0]
: animationUrl;
const { isLoading } = useCachedImage(htmlURL || '', true);
if (isLoading) {
return (
<CachedImageContent
uri={uri}
className={className}
preview={false}
style={{ width: 300, ...style }}
/>
);
}
return (
<iframe allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture"
sandbox="allow-scripts"
@@ -205,10 +205,10 @@ export const ArtContent = ({
active,
allowMeshRender,
pubkey,
uri,
animationURL,
files,
artView,
}: {
category?: MetadataCategory;
className?: string;
@@ -223,6 +223,7 @@
uri?: string;
animationURL?: string;
files?: (MetadataFile | string)[];
artView?: boolean;
}) => {
const id = pubkeyToString(pubkey);
@@ -276,8 +277,10 @@
uri={uri}
animationUrl={animationURL}
className={className}
preview={preview}
style={style}
files={files}
artView={artView}
/>
) : (
<CachedImageContent

@@ -19,12 +19,13 @@ import {
BidderMetadata,
MAX_METADATA_LEN,
MAX_EDITION_LEN,
placeBid,
useWalletModal,
VaultState,
} from '@oyster/common';
import { useWallet } from '@solana/wallet-adapter-react';
import { AuctionView, useBidsForAuction, useUserBalance } from '../../hooks';
import { sendPlaceBid } from '../../actions/sendPlaceBid';
// import { bidAndClaimInstantSale } from '../../actions/bidAndClaimInstantSale';
import { AuctionNumbers } from './../AuctionNumbers';
import {
sendRedeemBid,
@@ -43,6 +44,7 @@ import { findEligibleParticipationBidsForRedemption } from '../../actions/claimU
import {
BidRedemptionTicket,
MAX_PRIZE_TRACKING_TICKET_SIZE,
WinningConfigType,
} from '@oyster/common/dist/lib/models/metaplex/index';
async function calculateTotalCostOfRedeemingOtherPeoplesBids(
@@ -183,6 +185,7 @@ export const AuctionCard = ({
action?: JSX.Element;
}) => {
const connection = useConnection();
const { update } = useMeta();
const wallet = useWallet();
const { setVisible } = useWalletModal();
@@ -251,11 +254,26 @@
const isAuctionNotStarted =
auctionView.auction.info.state === AuctionState.Created;
//if instant sale auction bid and claimed hide buttons
if (
(auctionView.isInstantSale &&
Number(auctionView.myBidderPot?.info.emptied) !== 0 &&
isAuctionManagerAuthorityNotWalletOwner &&
auctionView.auction.info.bidState.max.toNumber() === bids.length) ||
auctionView.vault.info.state === VaultState.Deactivated
) {
return <></>;
}
return (
<div className="auction-container" style={style}>
<Col>
<AuctionNumbers auctionView={auctionView} />
<br />
{!auctionView.isInstantSale && (
<>
<AuctionNumbers auctionView={auctionView} />
<br />
</>
)}
{showRedemptionIssue && (
<span>
There was an issue redeeming or refunding your bid. Please try
@@ -374,7 +392,19 @@
onClick={() => setShowBidModal(true)}
style={{ marginTop: 20 }}
>
{loading ? <Spin /> : 'Place bid'}
{loading ? (
<Spin />
) : auctionView.isInstantSale ? (
!isAuctionManagerAuthorityNotWalletOwner ? (
'Claim item'
) : auctionView.myBidderPot ? (
'Claim Purchase'
) : (
'Buy Now'
)
) : (
'Place bid'
)}
</Button>
))}
@@ -386,7 +416,8 @@
onClick={connect}
style={{ marginTop: 20 }}
>
Connect wallet to place bid
Connect wallet to{' '}
{auctionView.isInstantSale ? 'purchase' : 'place bid'}
</Button>
)}
{action}
@@ -436,8 +467,9 @@
fontSize: '2rem',
}}
>
Your bid has been redeemed please view your NFTs in{' '}
<Link to="/artworks">My Items</Link>.
Your {auctionView.isInstantSale ? 'purchase' : 'bid'} has been
redeemed please view your NFTs in <Link to="/artworks">My Items</Link>
.
</p>
<Button
onClick={() => setShowRedeemedBidModal(false)}
@@ -481,9 +513,82 @@
}
};
const instantSale = async () => {
setLoading(true);
const instantSalePrice =
auctionView.auctionDataExtended?.info.instantSalePrice;
const winningConfigType =
auctionView.items[0][0].winningConfigType;
const isAuctionItemMaster =
winningConfigType === WinningConfigType.FullRightsTransfer ||
winningConfigType === WinningConfigType.TokenOnlyTransfer;
const allowBidToPublic =
myPayingAccount &&
!auctionView.myBidderPot &&
isAuctionManagerAuthorityNotWalletOwner;
const allowBidToAuctionOwner =
myPayingAccount &&
!isAuctionManagerAuthorityNotWalletOwner &&
isAuctionItemMaster;
// Placing a "bid" of the full amount results in a purchase to redeem.
if (instantSalePrice && (allowBidToPublic || allowBidToAuctionOwner)) {
try {
const bid = await sendPlaceBid(
connection,
wallet,
myPayingAccount.pubkey,
auctionView,
accountByMint,
instantSalePrice,
);
setLastBid(bid);
} catch (e) {
console.error('sendPlaceBid', e);
setShowBidModal(false);
setLoading(false);
return;
}
}
const newAuctionState = await update(
auctionView.auction.pubkey,
wallet.publicKey,
);
auctionView.auction = newAuctionState[0];
auctionView.myBidderPot = newAuctionState[1];
auctionView.myBidderMetadata = newAuctionState[2];
// Claim the purchase
try {
await sendRedeemBid(
connection,
wallet,
myPayingAccount.pubkey,
auctionView,
accountByMint,
prizeTrackingTickets,
bidRedemptions,
bids,
).then(async () => {
await update();
setShowBidModal(false);
setShowRedeemedBidModal(true);
});
} catch (e) {
console.error(e);
setShowRedemptionIssue(true);
}
setLoading(false);
};
return (
<>
<h2 className="modal-title">Place a bid</h2>
<h2 className="modal-title">
{auctionView.isInstantSale
? 'Confirm Purchase'
: 'Place a bid'}
</h2>
{!!gapTime && (
<div
className="info-content"
@@ -504,9 +609,7 @@
)}
</div>
)}
<br />
<AuctionNumbers auctionView={auctionView} />
<br />
{tickSizeInvalid && tickSize && (
<span style={{ color: 'red' }}>
@@ -528,44 +631,55 @@
color: 'rgba(0, 0, 0, 0.5)',
}}
>
<InputNumber
autoFocus
className="input"
value={value}
style={{
width: '100%',
background: '#393939',
borderRadius: 16,
}}
onChange={setValue}
precision={4}
formatter={value =>
value
? `${value}`.replace(/\B(?=(\d{3})+(?!\d))/g, ',')
: ''
}
placeholder="Amount in SOL"
/>
<div
style={{
display: 'inline-block',
margin: '5px 20px',
fontWeight: 700,
}}
>
{formatAmount(balance.balance, 2)}{' '}
<span style={{ color: '#717171' }}>available</span>
</div>
<Link
to="/addfunds"
style={{
float: 'right',
margin: '5px 20px',
color: '#5870EE',
}}
>
Add funds
</Link>
{!auctionView.isInstantSale && (
<InputNumber
autoFocus
className="input"
value={value}
style={{
width: '100%',
background: '#393939',
borderRadius: 16,
}}
onChange={setValue}
precision={4}
formatter={value =>
value
? `${value}`.replace(/\B(?=(\d{3})+(?!\d))/g, ',')
: ''
}
placeholder="Amount in SOL"
/>
)}
{!(auctionView.isInstantSale && bids.length > 0) && (
<>
<div
style={{
color: '#FFFFFF',
display: 'inline-block',
margin: '5px 20px',
fontWeight: 700,
}}
>
{formatAmount(balance.balance, 2)}{' '}
<span
style={{ color: '#717171', paddingLeft: '5px' }}
>
available
</span>
</div>
<Link
to="/addfunds"
style={{
float: 'right',
margin: '5px 20px',
color: '#5870EE',
}}
>
Add funds
</Link>
</>
)}
</div>
<br />
@@ -573,21 +687,31 @@
type="primary"
size="large"
className="action-btn"
onClick={placeBid}
onClick={() =>
auctionView.isInstantSale ? instantSale() : placeBid()
}
disabled={
tickSizeInvalid ||
gapBidInvalid ||
!myPayingAccount ||
value === undefined ||
value * LAMPORTS_PER_SOL < priceFloor ||
(!auctionView.isInstantSale &&
(value === undefined ||
value * LAMPORTS_PER_SOL < priceFloor)) ||
loading ||
!accountByMint.get(QUOTE_MINT.toBase58())
}
>
{loading || !accountByMint.get(QUOTE_MINT.toBase58()) ? (
<Spin />
) : auctionView.isInstantSale ? (
auctionView.myBidderPot ||
!isAuctionManagerAuthorityNotWalletOwner ? (
'Claim'
) : (
'Purchase'
)
) : (
'Place bid'
'Place Bid'
)}
</Button>
</>

@@ -50,13 +50,13 @@ export const AuctionNumbers = (props: { auctionView: AuctionView }) => {
return (
<div style={{ minWidth: 350 }}>
<Row>
{!ended && (
{(!ended || auctionView.isInstantSale) && (
<Col span={12}>
{(isUpcoming || bids.length === 0) && (
<AmountLabel
style={{ marginBottom: 10 }}
containerStyle={{ flexDirection: 'column' }}
title="Starting bid"
title={auctionView.isInstantSale ? 'Price' : 'Starting bid'}
amount={fromLamports(
participationOnly ? participationFixedPrice : priceFloor,
mintInfo,
@@ -74,9 +74,9 @@
</Col>
)}
<Col span={ended ? 24 : 12}>
<Countdown state={state} />
</Col>
{!ended && <Col span={ended ? 24 : 12}>
<Countdown state={state}/>
</Col>}
</Row>
</div>
);

@@ -24,7 +24,7 @@ export interface AuctionCard extends CardProps {
}
export const AuctionRenderCard = (props: AuctionCard) => {
let { auctionView } = props;
const { auctionView } = props;
const id = auctionView.thumbnail.metadata.pubkey;
const art = useArt(id);
const name = art?.title || ' ';
@@ -42,13 +42,17 @@
const isUpcoming = auctionView.state === AuctionViewState.Upcoming;
const winningBid = useHighestBidForAuction(auctionView.auction.pubkey);
const ended =
const ended = !auctionView.isInstantSale &&
state?.hours === 0 && state?.minutes === 0 && state?.seconds === 0;
let currentBid: number | string = 0;
let label = '';
if (isUpcoming || bids) {
label = ended ? 'Ended' : 'Starting bid';
label = ended
? 'Ended'
: auctionView.isInstantSale
? 'Price'
: 'Starting bid';
currentBid = fromLamports(
participationOnly ? participationFixedPrice : priceFloor,
mintInfo,

@@ -0,0 +1,84 @@
.loader-container {
position: fixed;
top: 0;
left: 0;
height: 100%;
width: 100%;
background: rgb(0 0 0 / 48%);
z-index: 1;
display: none;
&.active {
display: block;
}
.loader-block {
position: absolute;
left: 50%;
top: 50%;
-webkit-transform: translate(-50%, -50%);
transform: translate(-50%, -50%);
display: flex;
flex-flow: column;
align-items: center;
font-size: 1.7em;
letter-spacing: 0.16em;
}
.loader-title {
color: #fff;
text-transform: uppercase;
}
}
.spinner {
margin-top: 0.5em;
.line {
width: 2px;
height: 24px;
background: #fff;
margin: 0 6px;
display: inline-block;
animation: spinner-line 1000ms infinite ease-in-out;
}
.line-1 {
animation-delay: 800ms;
}
.line-2 {
animation-delay: 600ms;
}
.line-3 {
animation-delay: 400ms;
}
.line-4 {
animation-delay: 200ms;
}
.line-6 {
animation-delay: 200ms;
}
.line-7 {
animation-delay: 400ms;
}
.line-8 {
animation-delay: 600ms;
}
.line-9 {
animation-delay: 800ms;
}
}
@keyframes spinner-line {
0% {
opacity: 1;
}
50% {
opacity: 0;
}
100% {
opacity: 1;
}
}

@@ -0,0 +1,34 @@
import { useMeta } from '@oyster/common';
import React, { FC } from 'react';
export const LoaderProvider: FC = ({ children }) => {
const { isLoading } = useMeta();
return (
<>
<div className={`loader-container ${isLoading ? 'active' : ''}`}>
<div className="loader-block">
<div className="loader-title">loading</div>
<Spinner />
</div>
</div>
{children}
</>
);
};
export const Spinner = () => {
return (
<div className="spinner">
<span className="line line-1" />
<span className="line line-2" />
<span className="line line-3" />
<span className="line line-4" />
<span className="line line-5" />
<span className="line line-6" />
<span className="line line-7" />
<span className="line line-8" />
<span className="line line-9" />
</div>
);
};

@@ -182,7 +182,7 @@ export function useSettlementAuctions({
const { accountByMint } = useUserAccounts();
const walletPubkey = wallet?.publicKey?.toBase58();
const { bidderPotsByAuctionAndBidder } = useMeta();
const auctionsNeedingSettling = useAuctions(AuctionViewState.Ended);
const auctionsNeedingSettling = [...useAuctions(AuctionViewState.Ended), ...useAuctions(AuctionViewState.BuyNow)];
const [validDiscoveredEndedAuctions, setValidDiscoveredEndedAuctions] =
useState<Record<string, number>>({});
@@ -190,10 +190,13 @@
const f = async () => {
const nextBatch = auctionsNeedingSettling
.filter(
a =>
walletPubkey &&
a => {
const isEndedInstantSale = a.isInstantSale && a.items.length === a.auction.info.bidState.bids.length;
return walletPubkey &&
a.auctionManager.authority === walletPubkey &&
a.auction.info.ended(),
(a.auction.info.ended() || isEndedInstantSale)
}
)
.sort(
(a, b) =>
@@ -213,7 +216,12 @@
av.auction.info.bidState.bids
.map(b => b.amount.toNumber())
.reduce((acc, r) => (acc += r), 0) > 0) ||
(balance.value.uiAmount || 0) > 0.01
// FIXME: Why 0.01? If this is used,
// no auctions with lower prices (e.g. 0.0001) appear in notifications,
// thus making settlement of such an auction impossible.
// Temporarily making the number a lesser one.
// (balance.value.uiAmount || 0) > 0.01
(balance.value.uiAmount || 0) > 0.00001
) {
setValidDiscoveredEndedAuctions(old => ({
...old,

@@ -28,8 +28,8 @@ export const useAuction = (id: StringPublicKey) => {
masterEditionsByOneTimeAuthMint,
masterEditionsByPrintingMint,
metadataByMasterEdition,
bidRedemptionV2sByAuctionManagerAndWinningIndex,
auctionDataExtended,
} = useMeta();
useEffect(() => {
@@ -38,12 +38,12 @@
const auctionView = processAccountsIntoAuctionView(
walletPubkey,
auction,
auctionDataExtended,
auctionManagersByAuction,
safetyDepositBoxesByVaultAndIndex,
metadataByMint,
bidderMetadataByAuctionAndBidder,
bidderPotsByAuctionAndBidder,
bidRedemptionV2sByAuctionManagerAndWinningIndex,
masterEditions,
vaults,
@@ -55,6 +55,7 @@
undefined,
existingAuctionView || undefined,
);
if (auctionView) setAuctionView(auctionView);
}
}, [

@@ -10,10 +10,12 @@ import {
MasterEditionV1,
MasterEditionV2,
StringPublicKey,
AuctionDataExtended,
createPipelineExecutor,
} from '@oyster/common';
import { useWallet } from '@solana/wallet-adapter-react';
import BN from 'bn.js';
import { useEffect, useState } from 'react';
import { useEffect, useMemo, useState } from 'react';
import { useMeta } from '../contexts';
import {
AuctionManager,
@@ -23,6 +25,7 @@
BidRedemptionTicket,
BidRedemptionTicketV2,
getBidderKeys,
MetaplexKey,
SafetyDepositConfig,
WinningConfigType,
AuctionViewItem,
@@ -43,6 +46,7 @@
items: AuctionViewItem[][];
safetyDepositBoxes: ParsedAccount<SafetyDepositBox>[];
auction: ParsedAccount<AuctionData>;
auctionDataExtended?: ParsedAccount<AuctionDataExtended>;
auctionManager: AuctionManager;
participationItem?: AuctionViewItem;
state: AuctionViewState;
@@ -52,53 +56,57 @@
myBidRedemption?: ParsedAccount<BidRedemptionTicket>;
vault: ParsedAccount<Vault>;
totallyComplete: boolean;
isInstantSale: boolean;
}
type CachedRedemptionKeys = Record<
string,
ParsedAccount<BidRedemptionTicket> | { pubkey: StringPublicKey; info: null }
>;
export function useStoreAuctionsList() {
const { auctions, auctionManagersByAuction } = useMeta();
const result = useMemo(() => {
return Object.values(auctionManagersByAuction).map(
manager => auctions[manager.info.auction],
);
}, [auctions, auctionManagersByAuction]);
return result;
}
export function useCachedRedemptionKeysByWallet() {
const { auctions, bidRedemptions } = useMeta();
const { bidRedemptions } = useMeta();
const auctions = useStoreAuctionsList();
const { publicKey } = useWallet();
const [cachedRedemptionKeys, setCachedRedemptionKeys] = useState<
Record<
string,
| ParsedAccount<BidRedemptionTicket>
| { pubkey: StringPublicKey; info: null }
>
>({});
const [cachedRedemptionKeys, setCachedRedemptionKeys] =
useState<CachedRedemptionKeys>({});
useEffect(() => {
if (!publicKey) return;
(async () => {
if (publicKey) {
const temp: Record<
string,
| ParsedAccount<BidRedemptionTicket>
| { pubkey: StringPublicKey; info: null }
> = {};
const keys = Object.keys(auctions);
const tasks: Promise<void>[] = [];
for (let i = 0; i < keys.length; i++) {
const a = keys[i];
if (!cachedRedemptionKeys[a])
tasks.push(
getBidderKeys(auctions[a].pubkey, publicKey.toBase58()).then(
key => {
temp[a] = bidRedemptions[key.bidRedemption]
? bidRedemptions[key.bidRedemption]
: { pubkey: key.bidRedemption, info: null };
},
),
const temp: CachedRedemptionKeys = {};
await createPipelineExecutor(
auctions.values(),
async auction => {
if (!cachedRedemptionKeys[auction.pubkey]) {
await getBidderKeys(auction.pubkey, publicKey.toBase58()).then(
key => {
temp[auction.pubkey] = bidRedemptions[key.bidRedemption]
? bidRedemptions[key.bidRedemption]
: { pubkey: key.bidRedemption, info: null };
},
);
else if (!cachedRedemptionKeys[a].info) {
temp[a] =
bidRedemptions[cachedRedemptionKeys[a].pubkey] ||
cachedRedemptionKeys[a];
} else if (!cachedRedemptionKeys[auction.pubkey].info) {
temp[auction.pubkey] =
bidRedemptions[cachedRedemptionKeys[auction.pubkey].pubkey] ||
cachedRedemptionKeys[auction.pubkey];
}
}
},
{ delay: 1, sequence: 2 },
);
await Promise.all(tasks);
setCachedRedemptionKeys(temp);
}
setCachedRedemptionKeys(temp);
})();
}, [auctions, bidRedemptions, publicKey]);
@@ -109,9 +117,9 @@
const [auctionViews, setAuctionViews] = useState<AuctionView[]>([]);
const { publicKey } = useWallet();
const cachedRedemptionKeys = useCachedRedemptionKeysByWallet();
const auctions = useStoreAuctionsList();
const {
auctions,
auctionManagersByAuction,
safetyDepositBoxesByVaultAndIndex,
metadataByMint,
@@ -124,45 +132,47 @@
metadataByMasterEdition,
safetyDepositConfigsByAuctionManagerAndIndex,
bidRedemptionV2sByAuctionManagerAndWinningIndex,
auctionDataExtended,
} = useMeta();
useEffect(() => {
    (async () => {
      const auctionViews: AuctionView[] = [];
await createPipelineExecutor(
auctions.values(),
auction => {
const auctionView = processAccountsIntoAuctionView(
publicKey?.toBase58(),
auction,
auctionDataExtended,
auctionManagersByAuction,
safetyDepositBoxesByVaultAndIndex,
metadataByMint,
bidderMetadataByAuctionAndBidder,
bidderPotsByAuctionAndBidder,
bidRedemptionV2sByAuctionManagerAndWinningIndex,
masterEditions,
vaults,
safetyDepositConfigsByAuctionManagerAndIndex,
masterEditionsByPrintingMint,
masterEditionsByOneTimeAuthMint,
metadataByMasterEdition,
cachedRedemptionKeys,
state,
);
if (auctionView) {
auctionViews.push(auctionView);
}
},
{ delay: 1, sequence: 2 },
);
setAuctionViews(auctionViews.sort(sortByEnded));
})();
}, [
state,
auctions,
auctionDataExtended,
auctionManagersByAuction,
safetyDepositBoxesByVaultAndIndex,
metadataByMint,
@@ -183,6 +193,24 @@ export const useAuctions = (state?: AuctionViewState) => {
return auctionViews;
};
function sortByEnded(a: AuctionView, b: AuctionView) {
return (
(b.auction.info.endedAt?.toNumber() || 0) -
(a.auction.info.endedAt?.toNumber() || 0)
);
}
function isInstantSale(
auctionDataExt: ParsedAccount<AuctionDataExtended> | null,
auction: ParsedAccount<AuctionData>,
) {
return !!(
auctionDataExt?.info.instantSalePrice &&
auction.info.priceFloor.minPrice &&
auctionDataExt?.info.instantSalePrice.eq(auction.info.priceFloor.minPrice)
);
}
function buildListWhileNonZero<T>(hash: Record<string, T>, key: string) {
const list: T[] = [];
let ticket = hash[key + '-0'];
@@ -201,6 +229,7 @@ function buildListWhileNonZero<T>(hash: Record<string, T>, key: string) {
export function processAccountsIntoAuctionView(
walletPubkey: StringPublicKey | null | undefined,
auction: ParsedAccount<AuctionData>,
auctionDataExtended: Record<string, ParsedAccount<AuctionDataExtended>>,
auctionManagersByAuction: Record<
string,
ParsedAccount<AuctionManagerV1 | AuctionManagerV2>
@@ -300,6 +329,15 @@ export function processAccountsIntoAuctionView(
bidRedemptions,
});
const auctionDataExtendedKey =
auctionManagerInstance.info.key == MetaplexKey.AuctionManagerV2
? (auctionManagerInstance as ParsedAccount<AuctionManagerV2>).info
.auctionDataExtended
: null;
const auctionDataExt = auctionDataExtendedKey
? auctionDataExtended[auctionDataExtendedKey]
: null;
const boxesExpected = auctionManager.safetyDepositBoxesExpected.toNumber();
const bidRedemption: ParsedAccount<BidRedemptionTicket> | undefined =
@@ -317,9 +355,16 @@ export function processAccountsIntoAuctionView(
if (existingAuctionView && existingAuctionView.totallyComplete) {
    // If totally complete, we know we aren't updating anything else; let's speed things up
// and only update the two things that could possibly change
existingAuctionView.auction = auction;
existingAuctionView.myBidderPot = bidderPot;
existingAuctionView.myBidderMetadata = bidderMetadata;
existingAuctionView.myBidRedemption = bidRedemption;
existingAuctionView.auctionDataExtended = auctionDataExt || undefined;
existingAuctionView.vault = vault;
existingAuctionView.isInstantSale = isInstantSale(
auctionDataExt,
auction,
);
for (let i = 0; i < existingAuctionView.items.length; i++) {
const winningSet = existingAuctionView.items[i];
for (let j = 0; j < winningSet.length; j++) {
@@ -393,6 +438,7 @@ export function processAccountsIntoAuctionView(
auctionManager,
state,
vault,
auctionDataExtended: auctionDataExt || undefined,
safetyDepositBoxes: boxes,
items: auctionManager.getItemsFromSafetyDepositBoxes(
metadataByMint,
@@ -419,6 +465,8 @@ export function processAccountsIntoAuctionView(
view.thumbnail =
((view.items || [])[0] || [])[0] || view.participationItem;
view.isInstantSale = isInstantSale(auctionDataExt, auction);
view.totallyComplete = !!(
view.thumbnail &&
boxesExpected ===

@@ -5,13 +5,14 @@ import {
cache,
ParsedAccount,
StringPublicKey,
useMeta,
USE_SPEED_RUN,
} from '@oyster/common';
export const useHighestBidForAuction = (
auctionPubkey: StringPublicKey | string,
) => {
const bids = useBidsForAuction(auctionPubkey);
const winner = useMemo(() => {
return bids?.[0];
}, [bids]);
@@ -29,17 +30,18 @@ export const useBidsForAuction = (auctionPubkey: StringPublicKey | string) => {
: auctionPubkey,
[auctionPubkey],
);
const { bidderMetadataByAuctionAndBidder } = useMeta();
const [bids, setBids] = useState<ParsedAccount<BidderMetadata>[]>([]);
useEffect(() => {
const dispose = cache.emitter.onCache(args => {
if (args.parser === BidderMetadataParser) {
        setBids(getBids(bidderMetadataByAuctionAndBidder, id));
}
});
    setBids(getBids(bidderMetadataByAuctionAndBidder, id));
return () => {
dispose();
@@ -49,21 +51,38 @@ export const useBidsForAuction = (auctionPubkey: StringPublicKey | string) => {
return bids;
};
const getBids = (
  bidderMetadataByAuctionAndBidder: Record<
    string,
    ParsedAccount<BidderMetadata>
  >,
  id?: StringPublicKey,
) => {
  // I have no idea why, but cache doesn't work with speed run and I couldn't figure it out for the life of me,
  // because that file is so confusing I have no idea how it works.
  // so we use the tempCache for pulling bids. B come save me. - J
  let bids;
  if (USE_SPEED_RUN) {
    bids = Object.values(bidderMetadataByAuctionAndBidder).filter(
      b => b.info.auctionPubkey === id,
    );
  } else {
    bids = cache
      .byParser(BidderMetadataParser)
      .filter(key => {
        const bidder = cache.get(key) as ParsedAccount<BidderMetadata>;
        if (!bidder) {
          return false;
        }
        return id === bidder.info.auctionPubkey;
      })
      .map(key => {
        const bidder = cache.get(key) as ParsedAccount<BidderMetadata>;
        return bidder;
      });
  }
return bids
.sort((a, b) => {
const lastBidDiff = b.info.lastBid.sub(a.info.lastBid).toNumber();
if (lastBidDiff === 0) {

@@ -0,0 +1,82 @@
import {
EndAuctionArgs,
getAuctionExtended,
getAuctionKeys,
programIds,
toPublicKey,
SCHEMA,
} from '@oyster/common';
import {
PublicKey,
SYSVAR_CLOCK_PUBKEY,
TransactionInstruction,
} from '@solana/web3.js';
import { serialize } from 'borsh';
export async function endAuction(
vault: PublicKey,
auctionManagerAuthority: PublicKey,
instructions: TransactionInstruction[],
) {
const PROGRAM_IDS = programIds();
const store = PROGRAM_IDS.store;
if (!store) {
throw new Error('Store not initialized');
}
const { auctionKey, auctionManagerKey } = await getAuctionKeys(
vault.toString(),
);
const auctionExtended = await getAuctionExtended({
auctionProgramId: PROGRAM_IDS.auction,
resource: vault.toString(),
});
const value = new EndAuctionArgs({ reveal: null });
const data = Buffer.from(serialize(SCHEMA, value));
const keys = [
{
pubkey: toPublicKey(auctionManagerKey),
isSigner: false,
isWritable: true,
},
{
pubkey: toPublicKey(auctionKey),
isSigner: false,
isWritable: true,
},
{
pubkey: toPublicKey(auctionExtended),
isSigner: false,
isWritable: false,
},
{
pubkey: toPublicKey(auctionManagerAuthority),
isSigner: true,
isWritable: false,
},
{
pubkey: toPublicKey(store),
isSigner: false,
isWritable: false,
},
{
pubkey: toPublicKey(PROGRAM_IDS.auction),
isSigner: false,
isWritable: false,
},
{
pubkey: toPublicKey(SYSVAR_CLOCK_PUBKEY),
isSigner: false,
isWritable: false,
},
];
instructions.push(
new TransactionInstruction({
keys,
programId: toPublicKey(PROGRAM_IDS.metaplex),
data,
}),
);
}

@@ -5,29 +5,32 @@ import {
WalletProvider,
MetaProvider,
} from '@oyster/common';
import React, { FC } from 'react';
import { ConfettiProvider } from './components/Confetti';
import { AppLayout } from './components/Layout';
import { LoaderProvider } from './components/Loader';
import { CoingeckoProvider } from './contexts/coingecko';
export const Providers: FC = ({ children }) => {
return (
<ConnectionProvider>
<WalletProvider>
        <AccountsProvider>
          <CoingeckoProvider>
            <StoreProvider
              ownerAddress={process.env.NEXT_PUBLIC_STORE_OWNER_ADDRESS}
              storeAddress={process.env.NEXT_PUBLIC_STORE_ADDRESS}
            >
              <MetaProvider>
                <LoaderProvider>
                  <ConfettiProvider>
                    <AppLayout>{children}</AppLayout>
                  </ConfettiProvider>
                </LoaderProvider>
              </MetaProvider>
            </StoreProvider>
          </CoingeckoProvider>
        </AccountsProvider>
</WalletProvider>
</ConnectionProvider>
);

@@ -23,3 +23,4 @@
@import '../components/ArtCard/index.less';
@import '../components/ArtistCard/index.less';
@import '../components/Notifications/index.less';
@import '../components/Loader/index.less';

@@ -311,14 +311,14 @@ function InnerAnalytics({ mint }: { mint: MintInfo }) {
const [sortedSales, setSortedSales] = useState<number[]>([]);
const {
metadata,
    // stores,
auctionManagersByAuction,
bidderPotsByAuctionAndBidder,
auctionDataExtended,
} = useMeta();
const totalNFTs = metadata.length;
  // const totalMarketplaces = Object.values(stores).length;
const auctionViews = useAuctions();
@@ -353,7 +353,8 @@ function InnerAnalytics({ mint }: { mint: MintInfo }) {
</Button>
<h1>Overview</h1>
<h3>
        Total NFTs: {totalNFTs}
        {/* Total Marketplaces: {totalMarketplaces} */}
</h3>
<h1>User Breakdown</h1>
<h3>Any Engagement: {Object.values(usersEngaged).length}</h3>

@@ -86,6 +86,7 @@ export const ArtView = () => {
pubkey={id}
active={true}
allowMeshRender={true}
artView={true}
/>
</Col>
{/* <Divider /> */}
@@ -210,8 +211,6 @@ export const ArtView = () => {
<br />
{/*
TODO: add info about artist
<div className="info-header">ABOUT THE CREATOR</div>
<div className="info-content">{art.about}</div> */}
</Col>
@@ -223,7 +222,7 @@ export const ArtView = () => {
<div className="info-header">Attributes</div>
<List size="large" grid={{ column: 4 }}>
{attributes.map(attribute => (
                    <List.Item key={attribute.trait_type}>
<Card title={attribute.trait_type}>
{attribute.value}
</Card>
Some files were not shown because too many files have changed in this diff.