Merge pull request #1 from zone117x/master

Merge updates
This commit is contained in:
whyyk7 2014-04-21 11:38:07 +08:00
commit 566e44e03b
121 changed files with 4333 additions and 1159 deletions

2
.gitignore vendored

@ -1,3 +1,3 @@
node_modules/
.idea/
pool_configs/
config.json

389
README.md

@ -1,58 +1,74 @@
# NOMP
# NOMP ![NOMP Logo](http://zone117x.github.io/node-open-mining-portal/logo.svg "NOMP Logo")
#### Node Open Mining Portal
This portal is an extremely efficient, highly scalable, all-in-one, easy to setup cryptocurrency mining pool written
entirely in Node.js. It contains a stratum poolserver, reward/payment/share processor, and a (*not yet completed*)
front-end website.
entirely in Node.js. It contains a stratum poolserver; reward/payment/share processor; and a (*not yet completed*)
responsive user-friendly front-end website featuring mining instructions, in-depth live statistics, and an admin center.
#### Features
#### Table of Contents
* [Features](#features)
* [Attack Mitigation](#attack-mitigation)
* [Security](#security)
* [Planned Features](#planned-features)
* [Community Support](#community--support)
* [Usage](#usage)
* [Requirements](#requirements)
* [Setting Up Coin Daemon](#0-setting-up-coin-daemon)
* [Downloading & Installing](#1-downloading--installing)
* [Configuration](#2-configuration)
* [Portal Config](#portal-config)
* [Coin Config](#coin-config)
* [Pool Config](#pool-config)
* [Setting Up Blocknotify](#optional-recommended-setting-up-blocknotify)
* [Starting the Portal](#3-start-the-portal)
* [Upgrading NOMP](#upgrading-nomp)
* [Donations](#donations)
* [Credits](#credits)
* [License](#license)
* For the pool server it uses the highly efficient [node-stratum](https://github.com/zone117x/node-stratum) module which
supports vardiff, POW & POS, transaction messages, anti-DDoS, IP banning, several hashing algorithms.
* The portal has an [MPOS](https://github.com/MPOS/php-mpos) compatibility mode so that the it can
function as a drop-in-replacement for [python-stratum-mining](https://github.com/Crypto-Expert/stratum-mining). This
### Features
* For the pool server it uses the highly efficient [node-stratum-pool](//github.com/zone117x/node-stratum-pool) module which
supports vardiff, POW & POS, transaction messages, anti-DDoS, IP banning, [several hashing algorithms](//github.com/zone117x/node-stratum-pool#hashing-algorithms-supported).
* The portal has an [MPOS](//github.com/MPOS/php-mpos) compatibility mode so that it can
function as a drop-in replacement for [python-stratum-mining](//github.com/Crypto-Expert/stratum-mining). This
mode can be enabled in the configuration and will insert shares into a MySQL database in the format which MPOS expects.
For a direct tutorial see the wiki page [Setting up NOMP for MPOS usage](//github.com/zone117x/node-open-mining-portal/wiki/Setting-up-NOMP-for-MPOS-usage).
* Multi-pool ability - this software was built from the ground up to run with multiple coins simultaneously (which can
have different properties and hashing algorithms). It can be used to create a pool for a single coin or for multiple
coins at once. The pools use clustering to load balance across multiple CPU cores.
* For reward/payment processing, shares are inserted into Redis (a fast NoSQL key/value store). The PROP (proportional)
reward system is used. Each and every share will be rewarded - even for rounds resulting in orphaned blocks.
reward system is used with [Redis Transactions](http://redis.io/topics/transactions) for secure and super speedy payouts.
Each and every share will be rewarded - even for rounds resulting in orphaned blocks.
* This portal does not have user accounts/logins/registrations. Instead, miners simply use their coin address for stratum
authentication. A minimalistic HTML5 front-end connects to the portal's statistics API to display stats from each
pool such as connected miners, network/pool difficulty/hash rate, etc.
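For a concrete picture of that share flow, here is a minimal sketch of how a share could be recorded in Redis inside a transaction, keyed by the miner's coin address. The key names are illustrative assumptions for this example only, not NOMP's actual schema.

````javascript
// Illustrative sketch only - key names are assumptions, not NOMP's real schema.
// Records one share atomically so stats and round totals never drift apart.
var redis = require('redis');
var client = redis.createClient(6379, '127.0.0.1');

function recordShare(coin, minerAddress, shareDiff, isValid){
    client.multi()
        .hincrbyfloat(coin + ':shares:roundCurrent', minerAddress, isValid ? shareDiff : 0)
        .hincrby(coin + ':stats', isValid ? 'validShares' : 'invalidShares', 1)
        .exec(function(err, replies){
            if (err) console.error('Share was not recorded: ' + err);
        });
}

recordShare('litecoin', 'mi4iBXbBsydtcc5yFmsff2zCFVX4XG7qJc', 16, true);
````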
#### Planned Features
* To knock down the barrier to entry for cryptocurrency and mining for those who are not programmers or tech-oriented, instead
of the "help" page on the website being confusing for non-techies (when most people see a black command prompt screen
they run away screaming), there will be a simple "Download NOMP Desktop App" button to get started mining immediately on your
platform (using JavaScript to detect the platform and default to the correct download). I will create this app using C# + Mono
so it runs with ease on all platforms, and it will have its own GitHub repo that NOMP links to. When a pool operator does a
`git clone --recursive` on the NOMP repo, it will download the NOMP app executables for each platform. There will be a nomp.ini
file paired with the executable which the pool operator configures to use their NOMP pool's API URL. When the NOMP portal
initializes, it creates a zip for each platform with the nomp.ini inside. When users download the app, it auto-connects to the
NOMP pool API to get the available coins along with the version-byte for each coin so the app can securely generate a local private
key and a valid address to mine with. The app will prompt the user to print the private key to paper and will also enforce
STRONG (uncrackable) password encryption on the key file.
The app will scan the system for the appropriate mining software, run it in the background, and parse the gibberish (to a noob) output
into something that makes sense. It will also prompt the user to download the coin's wallet software and import their private key.
When using the app they can choose a unique username that is used
with stratum authentication, like "zone117x.mfsm1ckZKTTjDz94KonZZsbZnAbm1UV4BF", so that on a NOMP mobile app a user can
enter the NOMP pool and their username in order to see how their mining rig is doing, since the API will report stats
back for the address such as hashrate and balance.
* To reduce variance for pools just starting out which have little to no hashing power a feature is planned which will
allow your own pool to connect upstream to a larger pool server. It will request work from the larger pool then
redistribute the work to our own connected miners.
* Automated switching of connected miners to different pools/coins is also easily done due to the multi-pool architecture
of this software. The switching can be controlled using a coin profitability API such as CoinChoose.com or CoinWarz.com
(or calculated locally using daemon-reported network difficulties and exchange APIs).
of this software. To use this feature, the switching must be controlled by your own script, such as one that calculates
coin profitability via an API such as CoinChoose.com or CoinWarz.com (or calculates it locally using daemon-reported network
difficulties and exchange APIs). NOMP's regular payment processing and miner authentication (which use the coin address as the
stratum username) will obviously not work with this coin switching feature, so you must handle those with your own scripts as well.
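For illustration, such a controlling script could be structured like the sketch below. It assumes a hypothetical `pickMostProfitableCoin` helper (your own profitability logic) and that `scripts/coinSwitch.js` takes `host:port password coin` arguments analogous to the documented `blockNotify.js` usage; check the script itself for its real interface.

````javascript
// Hypothetical external switching controller - not part of NOMP itself.
var exec = require('child_process').exec;

// Replace with your own logic: query CoinChoose/CoinWarz, or compare
// daemon-reported difficulties with exchange prices, then return a coin name.
function pickMostProfitableCoin(callback){
    callback(null, 'dogecoin');
}

setInterval(function(){
    pickMostProfitableCoin(function(err, coin){
        if (err || !coin) return;
        // Assumed argument format (mirrors blockNotify.js) - verify before relying on it.
        exec('node scripts/coinSwitch.js 127.0.0.1:8118 test ' + coin, function(e, stdout, stderr){
            if (e) console.error('Coin switch failed: ' + stderr);
        });
    });
}, 10 * 60 * 1000); // re-evaluate every 10 minutes
````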
#### Attack Mitigation
* Detects and thwarts socket flooding (garbage data sent over socket in order to consume system resources).
* Detects and thwarts zombie miners (botnet infected computers connecting to your server to use up sockets but not sending any shares).
* Detects and thwarts invalid share attacks:
* NOMP is not vulnerable to the low difficulty share exploits happening to other pool servers. Other pool server
software has hardcoded guesstimated max difficulties for new hashing algorithms while NOMP dynamically generates the
max difficulty for each algorithm based on values found in the coin source code.
* IP banning feature which on a configurable threshold will ban an IP for a configurable amount of time if the miner
submits over a configurable threshold of invalid shares.
* NOMP is written in Node.js which uses a single thread (async) to handle connections rather than the overhead of one
thread per connection, and clustering is also implemented so all CPU cores are taken advantage of.
#### Security
@ -63,7 +79,17 @@ giving hackers little reward and keeping your pool from being a target.
* Miners can notice lack of automated payments as a possible early warning sign that an operator is about to run off with their coins.
#### Community / Support
#### Planned Features
* NOMP API - used by the front-end website to display stats and information about the pool(s), and by the planned NOMP
Desktop app to retrieve a list of available coins (and version-bytes for local wallet/address generation); a small polling sketch follows this list.
* To reduce variance for pools just starting out which have little to no hashing power a feature is planned which will
allow your own pool to connect upstream to a larger pool server. It will request work from the larger pool then
redistribute the work to our own connected miners.
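As a rough example of consuming that API, the sketch below polls a stats endpoint and prints the response. The exact route depends on how the website module mounts the API handlers, so the `/api/stats` path and port are assumptions; adjust them to your portal config.

````javascript
// Minimal stats poller - the /api/stats route and port 80 are assumptions.
var http = require('http');

setInterval(function(){
    http.get('http://127.0.0.1:80/api/stats', function(res){
        var body = '';
        res.on('data', function(chunk){ body += chunk; });
        res.on('end', function(){
            try {
                console.log('portal stats:', JSON.parse(body));
            } catch(e){
                console.error('could not parse stats response');
            }
        });
    }).on('error', function(e){
        console.error('stats request failed: ' + e.message);
    });
}, 15000);
````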
### Community / Support
IRC
* Support / general discussion join #nomp: https://webchat.freenode.net/?channels=#nomp
* Development discussion join #nomp-dev: https://webchat.freenode.net/?channels=#nomp-dev
@ -75,6 +101,16 @@ didn't follow the instructions in this README. Please __read the usage instructi
If your pool uses NOMP let us know and we will list your website here.
##### Some pools using NOMP or the node-stratum-pool module:
* http://chunkypools.com
* http://clevermining.com
* http://rapidhash.net
* http://suchpool.pw
* http://hashfaster.com
* http://miningpoolhub.com
* http://kryptochaos.com
* http://pool.uberpools.org
Usage
=====
@ -85,20 +121,41 @@ Usage
* [Node.js](http://nodejs.org/) v0.10+ ([follow these installation instructions](https://github.com/joyent/node/wiki/Installing-Node.js-via-package-manager))
* [Redis](http://redis.io/) key-value store v2.6+ ([follow these instructions](http://redis.io/topics/quickstart))
##### Seriously
Those are legitimate requirements. If you use old versions of Node.js or Redis that may come with your system package manager then you will have problems. Follow the linked instructions to get the latest stable versions.
#### 0) Setting up coin daemon
Follow the build/install instructions for your coin daemon. Your coin.conf file should end up looking something like this:
```
daemon=1
rpcuser=litecoinrpc
rpcpassword=securepassword
rpcport=19332
```
For redundancy, it's recommended to have at least two daemon instances running in case one drops out-of-sync or offline;
all instances will be polled for block/transaction updates and be used for submitting blocks. Creating a backup daemon
involves spawning a daemon using the `-datadir=/backup` command-line argument, which creates a new daemon instance with
its own config directory and coin.conf file. Learn about the daemon, how to use it and how it works if you want to be
a good pool operator. For starters be sure to read:
* https://en.bitcoin.it/wiki/Running_bitcoind
* https://en.bitcoin.it/wiki/Data_directory
* https://en.bitcoin.it/wiki/Original_Bitcoin_client/API_Calls_list
* https://en.bitcoin.it/wiki/Difficulty
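Once the daemon is running, it's worth sanity-checking RPC connectivity before wiring it into a pool config. A minimal Node sketch, assuming the rpcuser/rpcpassword/rpcport values from the coin.conf example above:

````javascript
// Quick JSON-RPC connectivity check against the coin daemon.
// Uses the credentials/port from your coin.conf - adjust as needed.
var http = require('http');

var payload = JSON.stringify({jsonrpc: '1.0', id: 'check', method: 'getinfo', params: []});
var req = http.request({
    host: '127.0.0.1',
    port: 19332,
    method: 'POST',
    auth: 'litecoinrpc:securepassword',
    headers: {'Content-Type': 'application/json', 'Content-Length': Buffer.byteLength(payload)}
}, function(res){
    var body = '';
    res.on('data', function(chunk){ body += chunk; });
    res.on('end', function(){ console.log('daemon replied: ' + body); });
});
req.on('error', function(e){ console.error('daemon unreachable: ' + e.message); });
req.write(payload);
req.end();
````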
#### 1) Downloading & Installing
Clone the repository and run `npm update` for all the dependencies to be installed:
```bash
git clone https://github.com/zone117x/node-stratum-portal.git
git clone https://github.com/zone117x/node-open-mining-portal.git nomp
cd nomp
npm update
```
#### 2) Configuration
##### Portal config
Inside the `config.json` file, ensure the default configuration will work for your environment.
Inside the `config_example.json` file, ensure the default configuration will work for your environment, then copy the file to `config.json`.
Explanation for each field:
````javascript
@ -115,24 +172,95 @@ Explanation for each field:
"enabled": true,
"forks": "auto"
},
/* With this enabled, the master process will start listening on the configured port for
messages from the 'scripts/blockNotify.js' script which your coin daemons can be configured
to run when a new block is available. When a blocknotify message is received, the master
process uses IPC (inter-process communication) to notify each worker process about the
message. Each worker process then sends the message to the appropriate coin pool. See
"Setting up blocknotify" below to set up your daemon to use this feature. */
/* This is the front-end. It's not finished. When it is finished, this comment will say so. */
"website": {
"enabled": true,
"port": 80,
/* Used for displaying stratum connection data on the Getting Started page. */
"stratumHost": "cryppit.com",
"stats": {
/* Gather stats to broadcast to page viewers and store in redis for historical stats
every this many seconds. */
"updateInterval": 15,
/* How many seconds to hold onto historical stats. 43200 seconds = 12 hours. */
"historicalRetention": 43200,
/* How many seconds worth of shares should be gathered to generate hashrate. */
"hashrateWindow": 300
},
/* Not done yet. */
"adminCenter": {
"enabled": true,
"password": "password"
}
},
/* Redis instance where global portal data such as historical stats, proxy states,
etc. is stored. */
"redis": {
"host": "127.0.0.1",
"port": 6379
},
/* With this enabled, the master process will listen on the configured port for messages from the
'scripts/blockNotify.js' script which your coin daemons can be configured to run when a
new block is available. When a blocknotify message is received, the master process uses
IPC (inter-process communication) to notify each thread about the message. Each thread
then sends the message to the appropriate coin pool. See "Setting up blocknotify" below to
set up your daemon to use this feature. */
"blockNotifyListener": {
"enabled": true,
"port": 8117,
"password": "test"
},
/* This is the front-end. It's not finished. When it is finished, this comment will say so. */
"website": {
"enabled": true,
"port": 80,
"liveStats": true
/* With this enabled, the master process will listen on the configured port for messages from
the 'scripts/coinSwitch.js' script which will trigger your proxy pools to switch to the
specified coin (case-insensitive). This setting is used in conjunction with the proxy
feature below. */
"coinSwitchListener": {
"enabled": false,
"port": 8118,
"password": "test"
},
/* In a proxy configuration, you can setup ports that accept miners for work based on a
specific algorithm instead of a specific coin. Miners that connect to these ports are
automatically switched to a coin determined by the server. The default coin is the first
configured pool for each algorithm and coin switching can be triggered using the
coinSwitch.js script in the scripts folder.
Please note miner address authentication must be disabled when using NOMP in a proxy
configuration and that payout processing is left up to the server administrator. */
"proxy": {
"sha256": {
"enabled": false,
"port": "3333",
"diff": 10,
"varDiff": {
"minDiff": 16, //Minimum difficulty
"maxDiff": 512, //Network difficulty will be used if it is lower than this
"targetTime": 15, //Try to get 1 share per this many seconds
"retargetTime": 90, //Check to see if we should retarget every this many seconds
"variancePercent": 30 //Allow time to very this % from target without retargeting
}
},
"scrypt": {
"enabled": false,
"port": "4444",
"diff": 10,
"varDiff": {
"minDiff": 16, //Minimum difficulty
"maxDiff": 512, //Network difficulty will be used if it is lower than this
"targetTime": 15, //Try to get 1 share per this many seconds
"retargetTime": 90, //Check to see if we should retarget every this many seconds
"variancePercent": 30 //Allow time to very this % from target without retargeting
}
},
"scrypt-n": {
"enabled": false,
"port": "5555"
}
}
}
````
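Note that these example configs contain JavaScript-style comments, so they are not strict JSON; NOMP strips the comments with `node-json-minify` before parsing (the same way `init.js` does). If you need to read one of these files from your own tooling, the same approach works:

````javascript
// Load a commented config file the same way init.js does.
var fs = require('fs');
JSON.minify = JSON.minify || require('node-json-minify');

var portalConfig = JSON.parse(JSON.minify(fs.readFileSync('config.json', {encoding: 'utf8'})));
console.log('Clustering enabled: ' + portalConfig.clustering.enabled);
````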
@ -145,12 +273,15 @@ Here is an example of the required fields:
{
"name": "Litecoin",
"symbol": "ltc",
"algorithm": "scrypt", //or "sha256", "scrypt-jane", "quark", "x11"
"reward": "POW", //or "POS"
"txMessages": false //or true
"algorithm": "scrypt", //or "sha256", "scrypt-jane", "scrypt-n", "quark", "x11"
"txMessages": false, //or true (not required, defaults to false)
"mposDiffMultiplier": 256, //only for x11 coins in mpos mode, set to 256 (optional)
}
````
For additional documentation on how to configure coins *(especially important for scrypt-n and scrypt-jane coins)*
see [these instructions](//github.com/zone117x/node-stratum-pool#module-usage).
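For instance, scrypt-jane coins typically add a `chainStartTime` (and sometimes `nMin`/`nMax`), while scrypt-n coins use a `timeTable`, as in the bundled coin files:

````javascript
{
    "name": "Freecoin",
    "symbol": "FEC",
    "algorithm": "scrypt-jane",
    "chainStartTime": 1375801200,
    "nMin": 6,
    "nMax": 32
}
````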
##### Pool config
Take a look at the example json file inside the `pool_configs` directory. Rename it to `yourcoin.json` and change the
@ -160,9 +291,43 @@ Description of options:
````javascript
{
"disabled": false, //Set this to true and a pool will not be created from this config file
"enabled": true, //Set this to false and a pool will not be created from this config file
"coin": "litecoin.json", //Reference to coin config file in 'coins' directory
"address": "mi4iBXbBsydtcc5yFmsff2zCFVX4XG7qJc", //Address to where block rewards are given
"blockRefreshInterval": 1000, //How often to poll RPC daemons for new blocks, in milliseconds
/* How many milliseconds should have passed before new block transactions will trigger a new
job broadcast. */
"txRefreshInterval": 20000,
/* Some miner apps will consider the pool dead/offline if it doesn't receive any new jobs
for around a minute, so every time we broadcast jobs, set a timeout to rebroadcast
in this many seconds unless we find a new job. Set to zero or remove to disable this. */
"jobRebroadcastTimeout": 55,
//instanceId: 37, //Recommend not using this because a crypto-random one will be generated
/* Some attackers will create thousands of workers that use up all available socket connections,
usually the workers are zombies and don't submit shares after connecting. This feature
detects those and disconnects them. */
"connectionTimeout": 600, //Remove workers that haven't been in contact for this many seconds
/* Sometimes you want the block hashes even for shares that aren't block candidates. */
"emitInvalidBlockHashes": false,
/* We use proper maximum algorithm difficulties found in the coin daemon source code. Most
miners/pools that deal with scrypt use a guesstimated one that is about 5.86% off from the
actual one. So here we can set a tolerable threshold for when a share is slightly too low
due to mining apps using incorrect max diffs and this pool using correct max diffs. */
"shareVariancePercent": 10,
/* Enable for client IP addresses to be detected when using a load balancer with TCP proxy
protocol enabled, such as HAProxy with 'send-proxy' param:
http://haproxy.1wt.eu/download/1.5/doc/configuration.txt */
"tcpProxyProtocol": false,
/* This determines what to do with submitted shares (and stratum worker authentication).
You have two options:
@ -197,19 +362,22 @@ Description of options:
/* (2% default) What percent fee your pool takes from the block reward. */
"feePercent": 0.02,
/* (Not implemented yet) Your address that receives pool revenue from fees */
//"feeReceiveAddress": "LZz44iyF4zLCXJTU8RxztyyJZBntdS6fvv",
/* Name of the daemon account to use when moving coin profit within daemon wallet. */
"feeCollectAccount": "feesCollected",
/* (Not implemented yet) How many coins from fee revenue must accumulate on top of the
/* Your address that receives pool revenue from fees. */
"feeReceiveAddress": "LZz44iyF4zLCXJTU8RxztyyJZBntdS6fvv",
/* How many coins from fee revenue must accumulate on top of the
minimum reserve amount in order to trigger withdrawal to fee address. The higher
this threshold, the less of your profit goes to transaction fees. */
//"feeWithdrawalThreshold": 5,
"feeWithdrawalThreshold": 5,
/* This daemon is used to send out payments. It MUST be for the daemon that owns the
configured 'address' that receives the block rewards, otherwise the daemon will not
be able to confirm blocks or send out payments. */
"daemon": {
"host": "localhost",
"host": "127.0.0.1",
"port": 19332,
"user": "litecoinrpc",
"password": "testnet"
@ -217,14 +385,16 @@ Description of options:
/* Redis database used for storing share and block submission data. */
"redis": {
"host": "localhost",
"host": "127.0.0.1",
"port": 6379
}
},
"mpos": { //Enabled this and shares will be inserted into share table in a MySQL database
/* Enable this and shares will be inserted into the share table in a MySQL database. You may
also want to use the "emitInvalidBlockHashes" option if you require it. */
"mpos": {
"enabled": false,
"host": "localhost", //MySQL db host
"host": "127.0.0.1", //MySQL db host
"port": 3306, //MySQL db port
"user": "me", //MySQL db user
"password": "mypass", //MySQL db password
@ -237,24 +407,10 @@ Description of options:
}
},
"address": "mi4iBXbBsydtcc5yFmsff2zCFVX4XG7qJc", //Address to where block rewards are given
"blockRefreshInterval": 1000, //How often to poll RPC daemons for new blocks, in milliseconds
/* How many milliseconds should have passed before new block transactions will trigger a new
job broadcast. */
"txRefreshInterval": 20000,
//instanceId: 37, //Recommend not using this because a crypto-random one will be generated
/* Some attackers will create thousands of workers that use up all available socket connections,
usually the workers are zombies and don't submit shares after connecting. This feature
detects those and disconnects them. */
"connectionTimeout": 600, //Remove workers that haven't been in contact for this many seconds
/* If a worker is submitting a high threshold of invalid shares we can temporarily ban them
to reduce system/network load. Also useful to fight against flooding attacks. */
/* If a worker is submitting a high threshold of invalid shares we can temporarily ban their IP
to reduce system/network load. Also useful to fight against flooding attacks. If running
behind something like HAProxy be sure to enable 'tcpProxyProtocol', otherwise you'll end up
banning your own IP address (and therefore all workers). */
"banning": {
"enabled": true,
"time": 600, //How many seconds to ban worker for
@ -285,19 +441,17 @@ Description of options:
}
},
/* Recommended to have at least two daemon instances running in case one drops out-of-sync
or offline. For redundancy, all instances will be polled for block/transaction updates
and be used for submitting blocks. */
/* For redundancy, recommended to have at least two daemon instances running in case one
drops out-of-sync or offline. */
"daemons": [
{ //Main daemon instance
"host": "localhost",
"host": "127.0.0.1",
"port": 19332,
"user": "litecoinrpc",
"password": "testnet"
},
{ //Backup daemon instance
"host": "localhost",
"host": "127.0.0.1",
"port": 19344,
"user": "litecoinrpc",
"password": "testnet"
@ -305,23 +459,29 @@ Description of options:
],
/* This allows the pool to connect to the daemon as a node peer to recieve block updates.
/* This allows the pool to connect to the daemon as a node peer to receive block updates.
It may be the most efficient way to get block updates (faster than polling, less
intensive than blocknotify script). However its still under development (not yet working). */
intensive than blocknotify script). It requires additional setup: the 'magic' field must
be exact (extracted from the coin source code). */
"p2p": {
"enabled": false,
"host": "localhost",
/* Host for daemon */
"host": "127.0.0.1",
/* Port configured for daemon (this is the actual peer port not RPC port) */
"port": 19333,
/* If your coin daemon is new enough (i.e. not a shitcoin) then it will support a p2p
feature that prevents the daemon from spamming our peer node with unnecessary
transaction data. Assume it's supported but if you have problems try disabling it. */
"disableTransactions": true,
/* Magic value is different for main/testnet and for each coin. It is found in the daemon
source code as the pchMessageStart variable.
For example, litecoin mainnet magic: http://git.io/Bi8YFw
And for litecoin testnet magic: http://git.io/NXBYJA
*/
"magic": "fcc1b7dc",
//Found in src as the PROTOCOL_VERSION variable, for example: http://git.io/KjuCrw
"protocolVersion": 70002,
And for litecoin testnet magic: http://git.io/NXBYJA */
"magic": "fcc1b7dc"
}
}
@ -330,7 +490,7 @@ Description of options:
You can create as many of these pool config files as you want (such as one pool per coin you wish to operate).
If you are creating multiple pools, ensure that they have unique stratum ports.
For more information on these configuration options see the [pool module documentation](https://github.com/zone117x/node-stratum#module-usage)
For more information on these configuration options see the [pool module documentation](https://github.com/zone117x/node-stratum-pool#module-usage)
@ -342,10 +502,11 @@ node [path to scripts/blockNotify.js] [listener host]:[listener port] [listener
```
Example: inside `dogecoin.conf` add the line
```
blocknotify="node scripts/blockNotify.js localhost:8117 mySuperSecurePassword dogecoin %s"
blocknotify=node scripts/blockNotify.js 127.0.0.1:8117 mySuperSecurePassword dogecoin %s
```
Alternatively, you can use a more efficient block notify script written in pure C. Build and usage instructions
are commented in [scripts/blocknotify.c](scripts/blocknotify.c).
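To verify the listener is wired up before pointing a daemon at it, you can run the documented command by hand with a dummy hash. A quick smoke test, assuming NOMP is running with `blockNotifyListener` enabled and the host/password values from the example above:

````javascript
// Smoke test: send a dummy blocknotify message using the documented script invocation.
// Assumes blockNotifyListener is enabled on 127.0.0.1:8117 with this password.
var exec = require('child_process').exec;

var dummyHash = '0000000000000000000000000000000000000000000000000000000000000000';
exec('node scripts/blockNotify.js 127.0.0.1:8117 mySuperSecurePassword dogecoin ' + dummyHash,
    function(err, stdout, stderr){
        if (err) console.error('blocknotify test failed: ' + stderr);
        else console.log('blocknotify message sent');
    });
````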
#### 3) Start the portal
@ -364,18 +525,38 @@ output from NOMP.
* Use [New Relic](http://newrelic.com/) to monitor your NOMP instance and server performance.
#### Upgrading NOMP
When updating NOMP to the latest code it's important to not only `git pull` the latest from this repo, but to also update the `node-stratum-pool` module and any config files that may have changed.
* Inside your NOMP directory (where the init.js script is) do `git pull` to get the latest NOMP code.
* Remove the dependencies by deleting the `node_modules` directory with `rm -r node_modules`.
* Run `npm update` to force updating/reinstalling of the dependencies.
* Compare your `config.json` and `pool_configs/coin.json` configurations to the latest example ones in this repo or the ones in the setup instructions where each config field is explained. You may need to modify your configs or add any newly introduced options.
Donations
---------
To support development of this project feel free to donate :)
BTC: 1KRotMnQpxu3sePQnsVLRy3EraRFYfJQFR
* BTC: `1KRotMnQpxu3sePQnsVLRy3EraRFYfJQFR`
* LTC: `LKfavSDJmwiFdcgaP1bbu46hhyiWw5oFhE`
* VTC: `VgW4uFTZcimMSvcnE4cwS3bjJ6P8bcTykN`
* MAX: `mWexUXRCX5PWBmfh34p11wzS5WX2VWvTRT`
* QRK: `QehPDAhzVQWPwDPQvmn7iT3PoFUGT7o8bC`
* DRK: `XcQmhp8ANR7okWAuArcNFZ2bHSB81jpapQ`
* DOGE: `DBGGVtwAAit1NPZpRm5Nz9VUFErcvVvHYW`
* Cryptsy Trade Key: `254ca13444be14937b36c44ba29160bd8f02ff76`
Credits
-------
* [vekexasia](https://github.com/vekexasia) - co-developer & great tester
* [TheSeven](https://github.com/TheSeven) - answering an absurd amount of my questions and being a very helpful and king gentleman
* Those that contributed to [node-stratum](/zone117x/node-stratum)
* [Jerry Brady / mintyfresh68](https://github.com/bluecircle) - got coin-switching fully working and developed proxy-per-algo feature
* [Tony Dobbs](http://anthonydobbs.com) - designs for front-end and created the NOMP logo
* [LucasJones](//github.com/LucasJones) - getting p2p block notify script working
* [vekexasia](//github.com/vekexasia) - co-developer & great tester
* [TheSeven](//github.com/TheSeven) - answering an absurd amount of my questions and being a very helpful gentleman
* [UdjinM6](//github.com/UdjinM6) - helped implement fee withdrawal in payment processing
* [Alex Petrov / sysmanalex](https://github.com/sysmanalex) - contributed the pure C block notify script
* [svirusxxx](//github.com/svirusxxx) - sponsored development of MPOS mode
* [icecube45](//github.com/icecube45) - helping out with the repo wiki
* Those that contributed to [node-stratum-pool](//github.com/zone117x/node-stratum-pool#credits)
License

5
coins/365coin.json Normal file

@ -0,0 +1,5 @@
{
"name": "365coin",
"symbol": "365",
"algorithm": "keccak"
}


@ -1,7 +1,5 @@
{
"name" : "Alphacoin",
"symbol" : "ALF",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Alphacoin",
"symbol": "ALF",
"algorithm": "scrypt"
}


@ -1,7 +1,5 @@
{
"name" : "Anoncoin",
"symbol" : "ANC",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Anoncoin",
"symbol": "ANC",
"algorithm": "scrypt"
}

6
coins/applecoin.json Normal file

@ -0,0 +1,6 @@
{
"name": "Applecoin",
"symbol": "APC",
"algorithm": "scrypt-jane",
"chainStartTime": 1384720832
}


@ -1,7 +1,5 @@
{
"name" : "Auroracoin",
"symbol" : "AUR",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Auroracoin",
"symbol": "AUR",
"algorithm": "scrypt"
}


@ -1,7 +1,5 @@
{
"name": "Bitcoin",
"symbol": "btc",
"algorithm": "sha256",
"reward": "POW",
"txMessages": false
"symbol": "BTC",
"algorithm": "sha256"
}


@ -1,7 +1,5 @@
{
"name" : "Bottlecaps",
"symbol" : "CAP",
"algorithm" : "scrypt",
"reward" : "POS",
"txMessages" : false
"name": "Bottlecaps",
"symbol": "CAP",
"algorithm": "scrypt"
}

5
coins/bytecoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Bytecoin",
"symbol": "BTE",
"algorithm": "sha256"
}

6
coins/cachecoin.json Normal file

@ -0,0 +1,6 @@
{
"name": "Cachecoin",
"symbol": "CACH",
"algorithm": "scrypt-jane",
"chainStartTime": 1388949883
}


@ -1,7 +1,5 @@
{
"name" : "Casinocoin",
"symbol" : "CSC",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Casinocoin",
"symbol": "CSC",
"algorithm": "scrypt"
}


@ -1,7 +1,5 @@
{
"name" : "Catcoin",
"symbol" : "CAT",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Catcoin",
"symbol": "CAT",
"algorithm": "scrypt"
}

6
coins/copperbars.json Normal file

@ -0,0 +1,6 @@
{
"name": "Copperbars",
"symbol": "CPR",
"algorithm": "scrypt-jane",
"chainStartTime": 1376184687
}

7
coins/copperlark.json Normal file

@ -0,0 +1,7 @@
{
"name": "Copperlark",
"symbol": "CLR",
"algorithm": "keccak",
"normalHashing": true,
"diffShift": 32
}

5
coins/cryptometh.json Normal file

@ -0,0 +1,5 @@
{
"name": "Cryptometh",
"symbol": "METH",
"algorithm": "keccak"
}


@ -1,7 +1,6 @@
{
"name": "Darkcoin",
"symbol": "drk",
"symbol": "DRK",
"algorithm": "x11",
"reward": "POW",
"txMessages": false
}
"mposDiffMultiplier": 256
}

5
coins/defcoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Defcoin",
"symbol": "DEF",
"algorithm": "scrypt"
}


@ -1,7 +1,6 @@
{
"name" : "Diamondcoin",
"symbol" : "DMD",
"algorithm" : "scrypt",
"reward" : "POS",
"txMessages" : true
"name": "Diamondcoin",
"symbol": "DMD",
"algorithm": "scrypt",
"txMessages": true
}


@ -1,7 +1,5 @@
{
"name" : "Digibyte",
"symbol" : "DGB",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Digibyte",
"symbol": "DGB",
"algorithm": "scrypt"
}


@ -1,7 +1,5 @@
{
"name" : "Dogecoin",
"symbol" : "DOGE",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Dogecoin",
"symbol": "DOGE",
"algorithm": "scrypt"
}


@ -1,7 +1,5 @@
{
"name" : "Earthcoin",
"symbol" : "EAC",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Earthcoin",
"symbol": "EAC",
"algorithm": "scrypt"
}

7
coins/ecoin.json Normal file

@ -0,0 +1,7 @@
{
"name": "Ecoin",
"symbol": "ECN",
"algorithm": "keccak",
"normalHashing": true,
"diffShift": 32
}


@ -1,7 +1,5 @@
{
"name" : "Elephantcoin",
"symbol" : "ELP",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Elephantcoin",
"symbol": "ELP",
"algorithm": "scrypt"
}


@ -1,7 +1,5 @@
{
"name" : "Emerald",
"symbol" : "EMD",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Emerald",
"symbol": "EMD",
"algorithm": "scrypt"
}

16
coins/execoin.json Normal file

@ -0,0 +1,16 @@
{
"name": "Execoin",
"symbol": "EXE",
"algorithm": "scrypt-n",
"timeTable": {
"2048": 1390959880,
"4096": 1438295269,
"8192": 1485630658,
"16384": 1532966047,
"32768": 1580301436,
"65536": 1627636825,
"131072": 1674972214,
"262144": 1722307603
}
}


@ -1,7 +1,5 @@
{
"name" : "Ezcoin",
"symbol" : "EZC",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Ezcoin",
"symbol": "EZC",
"algorithm": "scrypt"
}


@ -1,7 +1,5 @@
{
"name" : "Fastcoin",
"symbol" : "FST",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Fastcoin",
"symbol": "FST",
"algorithm": "scrypt"
}


@ -1,7 +1,5 @@
{
"name" : "Flappycoin",
"symbol" : "FLAP",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Flappycoin",
"symbol": "FLAP",
"algorithm": "scrypt"
}


@ -1,7 +1,6 @@
{
"name" : "Florincoin",
"symbol" : "FLO",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : true
"name": "Florincoin",
"symbol": "FLO",
"algorithm": "scrypt",
"txMessages": true
}


@ -1,7 +1,5 @@
{
"name" : "Frankocoin",
"symbol" : "FRK",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Frankocoin",
"symbol": "FRK",
"algorithm": "scrypt"
}

8
coins/freecoin.json Normal file

@ -0,0 +1,8 @@
{
"name": "Freecoin",
"symbol": "FEC",
"algorithm": "scrypt-jane",
"chainStartTime": 1375801200,
"nMin": 6,
"nMax": 32
}


@ -1,7 +1,5 @@
{
"name" : "Galaxycoin",
"symbol" : "GLX",
"algorithm" : "scrypt",
"reward" : "POS",
"txMessages" : false
"name": "Galaxycoin",
"symbol": "GLX",
"algorithm": "scrypt"
}

5
coins/galleon.json Normal file

@ -0,0 +1,5 @@
{
"name": "Galleon",
"symbol": "GLN",
"algorithm": "keccak"
}


@ -1,7 +1,5 @@
{
"name" : "Gamecoin",
"symbol" : "GME",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Gamecoin",
"symbol": "GME",
"algorithm": "scrypt"
}


@ -0,0 +1,6 @@
{
"name": "GoldPressedLatinum",
"symbol": "GPL",
"algorithm": "scrypt-jane",
"chainStartTime": 1377557832
}

5
coins/helixcoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Helixcoin",
"symbol": "HXC",
"algorithm": "keccak"
}

6
coins/hirocoin.json Normal file

@ -0,0 +1,6 @@
{
"name": "Hirocoin",
"symbol": "hic",
"algorithm": "x11",
"mposDiffMultiplier": 256
}

5
coins/hobonickels.json Normal file

@ -0,0 +1,5 @@
{
"name": "Hobonickels",
"symbol": "HBN",
"algorithm": "scrypt"
}

6
coins/internetcoin.json Normal file

@ -0,0 +1,6 @@
{
"name": "Internetcoin",
"symbol": "ITC",
"algorithm": "scrypt-jane",
"chainStartTime": 1388385602
}

5
coins/jennycoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Jennycoin",
"symbol": "JNY",
"algorithm": "scrypt"
}


@ -1,7 +1,5 @@
{
"name" : "Junkcoin",
"symbol" : "JKC",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Junkcoin",
"symbol": "JKC",
"algorithm": "scrypt"
}


@ -1,7 +1,5 @@
{
"name" : "Kittehcoin",
"symbol" : "MEOW",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Kittehcoin",
"symbol": "MEOW",
"algorithm": "scrypt"
}

5
coins/klondikecoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Klondikecoin",
"symbol": "KDC",
"algorithm": "scrypt"
}


@ -1,7 +1,5 @@
{
"name" : "Krugercoin",
"symbol" : "KGC",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Krugercoin",
"symbol": "KGC",
"algorithm": "scrypt"
}


@ -1,7 +1,5 @@
{
"name" : "Litecoin",
"symbol" : "LTC",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Litecoin",
"symbol": "LTC",
"algorithm": "scrypt"
}


@ -1,7 +1,5 @@
{
"name" : "Lottocoin",
"symbol" : "LOT",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Lottocoin",
"symbol": "LOT",
"algorithm": "scrypt"
}


@ -1,7 +1,5 @@
{
"name" : "Luckycoin",
"symbol" : "LKY",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Luckycoin",
"symbol": "LKY",
"algorithm": "scrypt"
}

5
coins/maxcoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Maxcoin",
"symbol": "MAX",
"algorithm": "keccak"
}


@ -1,7 +1,5 @@
{
"name" : "Memecoin",
"symbol" : "MEM",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Memecoin",
"symbol": "MEM",
"algorithm": "scrypt"
}

8
coins/microcoin.json Normal file

@ -0,0 +1,8 @@
{
"name": "Microcoin",
"symbol": "MCR",
"algorithm": "scrypt-jane",
"chainStartTime": 1389028879,
"nMin": 6,
"nMax": 32
}

5
coins/mintcoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Mintcoin",
"symbol": "MINT",
"algorithm": "scrypt"
}

6
coins/muniti.json Normal file

@ -0,0 +1,6 @@
{
"name": "Muniti",
"symbol": "MUN",
"algorithm": "x11",
"mposDiffMultiplier": 256
}


@ -1,7 +1,6 @@
{
"name" : "Neocoin",
"symbol" : "NEC",
"algorithm" : "scrypt",
"reward" : "POS",
"txMessages" : true
"name": "Neocoin",
"symbol": "NEC",
"algorithm": "scrypt",
"txMessages": true
}


@ -1,7 +1,6 @@
{
"name" : "Netcoin",
"symbol" : "NET",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : true
"name": "Netcoin",
"symbol": "NET",
"algorithm": "scrypt",
"txMessages": true
}


@ -1,7 +1,5 @@
{
"name" : "Noirbits",
"symbol" : "NRB",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Noirbits",
"symbol": "NRB",
"algorithm": "scrypt"
}

7
coins/onecoin.json Normal file

@ -0,0 +1,7 @@
{
"name": "Onecoin",
"symbol": "ONC",
"algorithm": "scrypt-jane",
"chainStartTime": 1371119462,
"nMin": 6
}


@ -1,7 +1,5 @@
{
"name": "Peercoin",
"symbol": "ppc",
"algorithm": "sha256",
"reward": "POS",
"txMessages": false
"symbol": "PPC",
"algorithm": "sha256"
}


@ -1,7 +1,5 @@
{
"name" : "Phoenixcoin",
"symbol" : "PXC",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"algorithm" : "scrypt"
}

5
coins/potcoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Potcoin",
"symbol": "POT",
"algorithm": "scrypt"
}

5
coins/procoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Procoin",
"symbol": "PCN",
"algorithm": "scrypt"
}


@ -1,7 +1,6 @@
{
"name": "Quarkcoin",
"symbol": "qrk",
"symbol": "QRK",
"algorithm": "quark",
"reward": "POW",
"txMessages": false
}
"mposDiffMultiplier": 256
}


@ -0,0 +1,6 @@
{
"name": "Radioactivecoin",
"symbol": "RAD",
"algorithm": "scrypt-jane",
"chainStartTime": 1389196388
}


@ -1,7 +1,5 @@
{
"name" : "Reddcoin",
"symbol" : "REDD",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Reddcoin",
"symbol": "REDD",
"algorithm": "scrypt"
}

5
coins/ronpaulcoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "RonPaulCoin",
"symbol": "RPC",
"algorithm": "scrypt"
}

5
coins/rubycoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Rubycoin",
"symbol": "RUBY",
"algorithm": "scrypt"
}


@ -1,7 +1,5 @@
{
"name" : "Sexcoin",
"symbol" : "SXC",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Sexcoin",
"symbol": "SXC",
"algorithm": "scrypt"
}


@ -1,7 +1,5 @@
{
"name": "Skeincoin",
"symbol": "skc",
"algorithm": "skein",
"reward": "POW",
"txMessages": false
"symbol": "SKC",
"algorithm": "skein"
}

5
coins/spartancoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Spartancoin",
"symbol": "SPN",
"algorithm": "scrypt"
}


@ -1,7 +1,5 @@
{
"name" : "Spots",
"symbol" : "SPT",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Spots",
"symbol": "SPT",
"algorithm": "scrypt"
}


@ -1,7 +1,5 @@
{
"name" : "Stablecoin",
"symbol" : "SBC",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Stablecoin",
"symbol": "SBC",
"algorithm": "scrypt"
}

5
coins/stoopidcoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Stoopidcoin",
"symbol": "STP",
"algorithm": "scrypt"
}

5
coins/suncoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Suncoin",
"symbol": "SUN",
"algorithm": "scrypt"
}

5
coins/terracoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Terracoin",
"symbol": "TRC",
"algorithm": "sha256"
}

6
coins/ultracoin.json Normal file

@ -0,0 +1,6 @@
{
"name": "Ultracoin",
"symbol": "UTC",
"algorithm": "scrypt-jane",
"chainStartTime": 1388361600
}

5
coins/unobtanium.json Normal file

@ -0,0 +1,5 @@
{
"name": "Unobtanium",
"symbol": "UNO",
"algorithm": "sha256"
}

6
coins/velocitycoin.json Normal file

@ -0,0 +1,6 @@
{
"name": "Velocitycoin",
"symbol": "VEL",
"algorithm": "scrypt-jane",
"chainStartTime": 1387769316
}

5
coins/vertcoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Vertcoin",
"symbol": "VTC",
"algorithm": "scrypt-n"
}

5
coins/wecoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Wecoin",
"symbol": "WEC",
"algorithm": "max"
}

5
coins/whitecoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Whitecoin",
"symbol": "WC",
"algorithm": "scrypt"
}


@ -1,7 +1,5 @@
{
"name" : "Xencoin",
"symbol" : "XNC",
"algorithm" : "scrypt",
"reward" : "POW",
"txMessages" : false
"name": "Xencoin",
"symbol": "XNC",
"algorithm": "scrypt"
}


@ -1,7 +1,6 @@
{
"name": "Yacoin",
"symbol": "yac",
"symbol": "YAC",
"algorithm": "scrypt-jane",
"reward": "POS",
"txMessages": false
"chainStartTime": 1367991200
}

6
coins/ybcoin.json Normal file

@ -0,0 +1,6 @@
{
"name": "YBcoin",
"symbol": "YBC",
"algorithm": "scrypt-jane",
"chainStartTime": 1372386273
}

5
coins/zetacoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Zetacoin",
"symbol": "ZTC",
"algorithm": "sha256"
}

7
coins/zzcoin.json Normal file

@ -0,0 +1,7 @@
{
"name": "ZZcoin",
"symbol": "ZZC",
"algorithm": "scrypt-jane",
"chainStartTime": 1375817223,
"nMin": 12
}


@ -1,61 +0,0 @@
{
"logLevel": "debug",
"clustering": {
"enabled": true,
"forks": "auto"
},
"blockNotifyListener": {
"enabled": false,
"port": 8117,
"password": "test"
},
"redisBlockNotifyListener": {
"enabled" : false,
"redisPort" : 6379,
"redisHost" : "hostname",
"psubscribeKey" : "newblocks:*"
},
"website": {
"enabled": true,
"siteTitle": "Cryppit",
"port": 80,
"statUpdateInterval": 3,
"hashrateWindow": 600
},
"proxy": {
"enabled": false,
"ports": {
"80": {
"diff": 32,
"varDiff": {
"minDiff" : 8,
"maxDiff" : 512,
"targetTime" : 15,
"retargetTime" : 90,
"variancePercent" : 30
}
},
"6000": {
"diff": 32,
"varDiff": {
"minDiff" : 8,
"maxDiff" : 512,
"targetTime" : 15,
"retargetTime" : 90,
"variancePercent" : 30
}
},
"8080": {
"diff": 32,
"varDiff": {
"minDiff" : 8,
"maxDiff" : 512,
"targetTime" : 15,
"retargetTime" : 90,
"variancePercent" : 30
}
}
}
}
}

88
config_example.json Normal file

@ -0,0 +1,88 @@
{
"logLevel": "debug",
"clustering": {
"enabled": true,
"forks": "auto"
},
"website": {
"enabled": true,
"port": 80,
"stratumHost": "cryppit.com",
"stats": {
"updateInterval": 60,
"historicalRetention": 43200,
"hashrateWindow": 300
},
"adminCenter": {
"enabled": true,
"password": "password"
}
},
"redis": {
"host": "127.0.0.1",
"port": 6379
},
"blockNotifyListener": {
"enabled": false,
"port": 8117,
"password": "test"
},
"coinSwitchListener": {
"enabled": false,
"host": "127.0.0.1",
"port": 8118,
"password": "test"
},
"proxy": {
"sha256": {
"enabled": false,
"port": "3333",
"diff": 10,
"varDiff": {
"minDiff": 16,
"maxDiff": 512,
"targetTime": 15,
"retargetTime": 90,
"variancePercent": 30
}
},
"scrypt": {
"enabled": false,
"port": "4444",
"diff": 10,
"varDiff": {
"minDiff": 16,
"maxDiff": 512,
"targetTime": 15,
"retargetTime": 90,
"variancePercent": 30
}
},
"scrypt-n": {
"enabled": false,
"port": "5555"
}
},
"profitSwitch": {
"enabled": false,
"updateInterval": 600,
"depth": 0.90,
"usePoloniex": true,
"useCryptsy": true,
"useMintpal": true
},
"redisBlockNotifyListener": {
"enabled": false,
"redisPort": 6379,
"redisHost": "hostname",
"psubscribeKey": "newblocks:*"
}
}

226
init.js

@ -1,93 +1,106 @@
var fs = require('fs');
var path = require('path');
var os = require('os');
var cluster = require('cluster');
var posix = require('posix');
var PoolLogger = require('./libs/logUtil.js');
var BlocknotifyListener = require('./libs/blocknotifyListener.js');
var async = require('async');
var PoolLogger = require('./libs/logUtil.js');
var BlocknotifyListener = require('./libs/blocknotifyListener.js');
var CoinswitchListener = require('./libs/coinswitchListener.js');
var RedisBlocknotifyListener = require('./libs/redisblocknotifyListener.js');
var WorkerListener = require('./libs/workerListener.js');
var PoolWorker = require('./libs/poolWorker.js');
var PaymentProcessor = require('./libs/paymentProcessor.js');
var Website = require('./libs/website.js');
var PoolWorker = require('./libs/poolWorker.js');
var PaymentProcessor = require('./libs/paymentProcessor.js');
var Website = require('./libs/website.js');
var ProfitSwitch = require('./libs/profitSwitch.js');
var algos = require('stratum-pool/lib/algoProperties.js');
JSON.minify = JSON.minify || require("node-json-minify");
if (!fs.existsSync('config.json')){
console.log('config.json file does not exist. Read the installation/setup instructions.');
return;
}
var portalConfig = JSON.parse(JSON.minify(fs.readFileSync("config.json", {encoding: 'utf8'})));
var loggerInstance = new PoolLogger({
var logger = new PoolLogger({
logLevel: portalConfig.logLevel
});
var logDebug = loggerInstance.logDebug;
var logWarning = loggerInstance.logWarning;
var logError = loggerInstance.logError;
try {
require('newrelic');
if (cluster.isMaster)
logDebug('newrelic', 'system', 'New Relic initiated');
logger.debug('NewRelic', 'Monitor', 'New Relic initiated');
} catch(e) {}
//Try to give process ability to handle 100k concurrent connections
try{
posix.setrlimit('nofile', { soft: 100000, hard: 100000 });
var posix = require('posix');
try {
posix.setrlimit('nofile', { soft: 100000, hard: 100000 });
}
catch(e){
if (cluster.isMaster)
logger.warning('POSIX', 'Connection Limit', '(Safe to ignore) Must be ran as root to increase resource limits');
}
}
catch(e){
logWarning('posix', 'system', '(Safe to ignore) Must be ran as root to increase resource limits');
if (cluster.isMaster)
logger.debug('POSIX', 'Connection Limit', '(Safe to ignore) POSIX module not installed and resource (connection) limit was not raised');
}
if (cluster.isWorker){
switch(process.env.workerType){
case 'pool':
new PoolWorker(loggerInstance);
new PoolWorker(logger);
break;
case 'paymentProcessor':
new PaymentProcessor(loggerInstance);
new PaymentProcessor(logger);
break;
case 'website':
new Website(loggerInstance);
new Website(logger);
break;
case 'profitSwitch':
new ProfitSwitch(logger);
break;
}
return;
} /* else {
var coinNames = ['alphacoin','frankocoin','emerald','kittehcoin'];
var curIndex = 0;
setInterval(function () {
var newCoinName = coinNames[++curIndex % coinNames.length];
console.log("SWITCHING to "+newCoinName);
var ipcMessage = {type:'switch', coin: newCoinName};
Object.keys(cluster.workers).forEach(function(id) {
cluster.workers[id].send(ipcMessage);
});
}, 20000);
} */
}
//Read all pool configs from pool_configs and join them with their coin profile
var buildPoolConfigs = function(){
var configs = {};
fs.readdirSync('pool_configs').forEach(function(file){
var poolOptions = JSON.parse(JSON.minify(fs.readFileSync('pool_configs/' + file, {encoding: 'utf8'})));
if (poolOptions.disabled) return;
var configDir = 'pool_configs/';
fs.readdirSync(configDir).forEach(function(file){
if (!fs.existsSync(configDir + file) || path.extname(configDir + file) !== '.json') return;
var poolOptions = JSON.parse(JSON.minify(fs.readFileSync(configDir + file, {encoding: 'utf8'})));
if (!poolOptions.enabled) return;
var coinFilePath = 'coins/' + poolOptions.coin;
if (!fs.existsSync(coinFilePath)){
logError(poolOptions.coin, 'system', 'could not find file: ' + coinFilePath);
logger.error('Master', poolOptions.coin, 'could not find file: ' + coinFilePath);
return;
}
var coinProfile = JSON.parse(JSON.minify(fs.readFileSync(coinFilePath, {encoding: 'utf8'})));
poolOptions.coin = coinProfile;
configs[poolOptions.coin.name] = poolOptions;
if (!(coinProfile.algorithm in algos)){
logger.error('Master', coinProfile.name, 'Cannot run a pool for unsupported algorithm "' + coinProfile.algorithm + '"');
delete configs[poolOptions.coin.name];
}
});
return configs;
};
@ -95,8 +108,29 @@ var buildPoolConfigs = function(){
var spawnPoolWorkers = function(portalConfig, poolConfigs){
var serializedConfigs = JSON.stringify(poolConfigs);
Object.keys(poolConfigs).forEach(function(coin){
var p = poolConfigs[coin];
var internalEnabled = p.shareProcessing && p.shareProcessing.internal && p.shareProcessing.internal.enabled;
var mposEnabled = p.shareProcessing && p.shareProcessing.mpos && p.shareProcessing.mpos.enabled;
if (!internalEnabled && !mposEnabled){
logger.error('Master', coin, 'Share processing is not configured so a pool cannot be started for this coin.');
delete poolConfigs[coin];
}
if (!Array.isArray(p.daemons) || p.daemons.length < 1){
logger.error('Master', coin, 'No daemons configured so a pool cannot be started for this coin.');
delete poolConfigs[coin];
}
});
if (Object.keys(poolConfigs).length === 0){
logger.warning('Master', 'PoolSpawner', 'No pool configs exist or are enabled in the pool_configs folder. No pools spawned.');
return;
}
var serializedConfigs = JSON.stringify(poolConfigs);
var numForks = (function(){
if (!portalConfig.clustering || !portalConfig.clustering.enabled)
@ -108,41 +142,55 @@ var spawnPoolWorkers = function(portalConfig, poolConfigs){
return portalConfig.clustering.forks;
})();
var poolWorkers = {};
var createPoolWorker = function(forkId){
var worker = cluster.fork({
workerType : 'pool',
forkId : forkId,
pools : serializedConfigs,
portalConfig : JSON.stringify(portalConfig),
workerType: 'pool',
forkId: forkId,
pools: serializedConfigs,
portalConfig: JSON.stringify(portalConfig)
});
worker.forkId = forkId;
worker.type = 'pool';
poolWorkers[forkId] = worker;
worker.on('exit', function(code, signal){
logError('poolWorker', 'system', 'Fork ' + forkId + ' died, spawning replacement worker...');
logger.error('Master', 'PoolSpawner', 'Fork ' + forkId + ' died, spawning replacement worker...');
setTimeout(function(){
createPoolWorker(forkId);
}, 2000);
}).on('message', function(msg){
switch(msg.type){
case 'banIP':
Object.keys(cluster.workers).forEach(function(id) {
if (cluster.workers[id].type === 'pool'){
cluster.workers[id].send({type: 'banIP', ip: msg.ip});
}
});
break;
}
});
};
for (var i = 0; i < numForks; i++) {
var i = 0;
var spawnInterval = setInterval(function(){
createPoolWorker(i);
}
i++;
if (i === numForks){
clearInterval(spawnInterval);
logger.debug('Master', 'PoolSpawner', 'Spawned ' + Object.keys(poolConfigs).length + ' pool(s) on ' + numForks + ' thread(s)');
}
}, 250);
};
var startWorkerListener = function(poolConfigs){
var workerListener = new WorkerListener(loggerInstance, poolConfigs);
workerListener.init();
};
var startBlockListener = function(portalConfig){
//block notify options
//setup block notify here and use IPC to tell appropriate pools
var listener = new BlocknotifyListener(portalConfig.blockNotifyListener);
listener.on('log', function(text){
logDebug('blocknotify', 'system', text);
logger.debug('Master', 'Blocknotify', text);
});
listener.on('hash', function(message){
@ -155,6 +203,32 @@ var startBlockListener = function(portalConfig){
listener.start();
};
//
// Receives authenticated events from coin switch listener and triggers proxy
// to switch to a new coin.
//
var startCoinswitchListener = function(portalConfig){
var listener = new CoinswitchListener(portalConfig.coinSwitchListener);
listener.on('log', function(text){
logger.debug('Master', 'Coinswitch', text);
});
listener.on('switchcoin', function(message){
var ipcMessage = {type:'blocknotify', coin: message.coin, hash: message.hash};
Object.keys(cluster.workers).forEach(function(id) {
cluster.workers[id].send(ipcMessage);
});
var ipcMessage = {
type:'switch',
coin: message.coin
};
Object.keys(cluster.workers).forEach(function(id) {
cluster.workers[id].send(ipcMessage);
});
});
listener.start();
};
var startRedisBlockListener = function(portalConfig){
//block notify options
//setup block notify here and use IPC to tell appropriate pools
@ -163,7 +237,7 @@ var startRedisBlockListener = function(portalConfig){
var listener = new RedisBlocknotifyListener(portalConfig.redisBlockNotifyListener);
listener.on('log', function(text){
logDebug('blocknotify', 'system', text);
logger.debug('Master', 'blocknotify', text);
}).on('hash', function (message) {
var ipcMessage = {type:'blocknotify', coin: message.coin, hash: message.hash};
Object.keys(cluster.workers).forEach(function(id) {
@ -175,12 +249,26 @@ var startRedisBlockListener = function(portalConfig){
var startPaymentProcessor = function(poolConfigs){
var enabledForAny = false;
for (var pool in poolConfigs){
var p = poolConfigs[pool];
var enabled = p.enabled && p.shareProcessing && p.shareProcessing.internal && p.shareProcessing.internal.enabled;
if (enabled){
enabledForAny = true;
break;
}
}
if (!enabledForAny)
return;
var worker = cluster.fork({
workerType: 'paymentProcessor',
pools: JSON.stringify(poolConfigs)
});
worker.on('exit', function(code, signal){
logError('paymentProcessor', 'system', 'Payment processor died, spawning replacement...');
logger.error('Master', 'Payment Processor', 'Payment processor died, spawning replacement...');
setTimeout(function(){
startPaymentProcessor(poolConfigs);
}, 2000);
@ -198,7 +286,7 @@ var startWebsite = function(portalConfig, poolConfigs){
portalConfig: JSON.stringify(portalConfig)
});
worker.on('exit', function(code, signal){
logError('website', 'system', 'Website process died, spawning replacement...');
logger.error('Master', 'Website', 'Website process died, spawning replacement...');
setTimeout(function(){
startWebsite(portalConfig, poolConfigs);
}, 2000);
@ -206,6 +294,28 @@ var startWebsite = function(portalConfig, poolConfigs){
};
var startProfitSwitch = function(portalConfig, poolConfigs){
if (!portalConfig.profitSwitch.enabled){
logger.error('Master', 'Profit', 'Profit auto switching disabled');
return;
}
var worker = cluster.fork({
workerType: 'profitSwitch',
pools: JSON.stringify(poolConfigs),
portalConfig: JSON.stringify(portalConfig)
});
worker.on('exit', function(code, signal){
logger.error('Master', 'Profit', 'Profit switching process died, spawning replacement...');
setTimeout(function(){
startProfitSwitch(portalConfig, poolConfigs);
}, 2000);
});
};
(function init(){
var poolConfigs = buildPoolConfigs();
@ -216,10 +326,12 @@ var startWebsite = function(portalConfig, poolConfigs){
startBlockListener(portalConfig);
startRedisBlockListener(portalConfig);
startCoinswitchListener(portalConfig);
startWorkerListener(poolConfigs);
startRedisBlockListener(portalConfig);
startWebsite(portalConfig, poolConfigs);
startProfitSwitch(portalConfig, poolConfigs);
})();

54
libs/api.js Normal file

@ -0,0 +1,54 @@
var redis = require('redis');
var async = require('async');
var stats = require('./stats.js');
module.exports = function(logger, portalConfig, poolConfigs){
var _this = this;
var portalStats = this.stats = new stats(logger, portalConfig, poolConfigs);
this.liveStatConnections = {};
this.handleApiRequest = function(req, res, next){
switch(req.params.method){
case 'stats':
res.end(portalStats.statsString);
return;
case 'pool_stats':
res.end(JSON.stringify(portalStats.statPoolHistory));
return;
case 'live_stats':
res.writeHead(200, {
'Content-Type': 'text/event-stream',
'Cache-Control': 'no-cache',
'Connection': 'keep-alive'
});
res.write('\n');
var uid = Math.random().toString();
_this.liveStatConnections[uid] = res;
req.on("close", function() {
delete _this.liveStatConnections[uid];
});
return;
default:
next();
}
};
this.handleAdminApiRequest = function(req, res, next){
switch(req.params.method){
case 'pools': {
res.end(JSON.stringify({result: poolConfigs}));
return;
}
default:
next();
}
};
};

204
libs/apiCryptsy.js Normal file

@ -0,0 +1,204 @@
var request = require('request');
var nonce = require('nonce');
var crypto = require('crypto'); // needed for the HMAC signing in _getPrivateHeaders below
module.exports = function() {
'use strict';
// Module dependencies
// Constants
var version = '0.1.0',
PUBLIC_API_URL = 'http://pubapi.cryptsy.com/api.php',
PRIVATE_API_URL = 'https://api.cryptsy.com/api',
USER_AGENT = 'nomp/node-open-mining-portal'
// Constructor
function Cryptsy(key, secret){
// Generate headers signed by this user's key and secret.
// The secret is encapsulated and never exposed
this._getPrivateHeaders = function(parameters){
var paramString, signature;
if (!key || !secret){
throw 'Cryptsy: Error. API key and secret required';
}
// Sort parameters alphabetically and convert to `arg1=foo&arg2=bar`
paramString = Object.keys(parameters).sort().map(function(param){
return encodeURIComponent(param) + '=' + encodeURIComponent(parameters[param]);
}).join('&');
signature = crypto.createHmac('sha512', secret).update(paramString).digest('hex');
return {
Key: key,
Sign: signature
};
};
}
// If a site uses non-trusted SSL certificates, set this value to false
Cryptsy.STRICT_SSL = true;
// Helper methods
function joinCurrencies(currencyA, currencyB){
return currencyA + '_' + currencyB;
}
// Prototype
Cryptsy.prototype = {
constructor: Cryptsy,
// Make an API request
_request: function(options, callback){
if (!('headers' in options)){
options.headers = {};
}
options.headers['User-Agent'] = USER_AGENT;
options.json = true;
options.strictSSL = Cryptsy.STRICT_SSL;
request(options, function(err, response, body) {
callback(err, body);
});
return this;
},
// Make a public API request
_public: function(parameters, callback){
var options = {
method: 'GET',
url: PUBLIC_API_URL,
qs: parameters
};
return this._request(options, callback);
},
// Make a private API request
_private: function(parameters, callback){
var options;
parameters.nonce = nonce();
options = {
method: 'POST',
url: PRIVATE_API_URL,
form: parameters,
headers: this._getPrivateHeaders(parameters)
};
return this._request(options, callback);
},
/////
// PUBLIC METHODS
getTicker: function(callback){
var parameters = {
method: 'marketdatav2'
};
return this._public(parameters, callback);
},
getOrderBook: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnOrderBook',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._public(parameters, callback);
},
getTradeHistory: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnTradeHistory',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._public(parameters, callback);
},
/////
// PRIVATE METHODS
myBalances: function(callback){
var parameters = {
command: 'returnBalances'
};
return this._private(parameters, callback);
},
myOpenOrders: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnOpenOrders',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._private(parameters, callback);
},
myTradeHistory: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnTradeHistory',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._private(parameters, callback);
},
buy: function(currencyA, currencyB, rate, amount, callback){
var parameters = {
command: 'buy',
currencyPair: joinCurrencies(currencyA, currencyB),
rate: rate,
amount: amount
};
return this._private(parameters, callback);
},
sell: function(currencyA, currencyB, rate, amount, callback){
var parameters = {
command: 'sell',
currencyPair: joinCurrencies(currencyA, currencyB),
rate: rate,
amount: amount
};
return this._private(parameters, callback);
},
cancelOrder: function(currencyA, currencyB, orderNumber, callback){
var parameters = {
command: 'cancelOrder',
currencyPair: joinCurrencies(currencyA, currencyB),
orderNumber: orderNumber
};
return this._private(parameters, callback);
},
withdraw: function(currency, amount, address, callback){
var parameters = {
command: 'withdraw',
currency: currency,
amount: amount,
address: address
};
return this._private(parameters, callback);
}
};
return Cryptsy;
}();

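For reference, a minimal usage sketch of the wrapper above; the key and secret are placeholders and the response shapes are whatever Cryptsy returns:
var Cryptsy = require('./apiCryptsy.js');
var client = new Cryptsy('API_KEY', 'API_SECRET');
// Public call - goes to the public endpoint, no signing involved
client.getTicker(function(err, body){
    if (err) return console.error('ticker error:', err);
    console.log(body);
});
// Private call - signed with the Key/Sign headers built by _getPrivateHeaders
client.myBalances(function(err, body){
    if (err) return console.error('balances error:', err);
    console.log(body);
});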
216
libs/apiMintpal.js Normal file
View File

@ -0,0 +1,216 @@
var request = require('request');
var nonce = require('nonce')();
module.exports = function() {
'use strict';
// Module dependencies
var crypto = require('crypto');
// Constants
var version = '0.1.0',
PUBLIC_API_URL = 'https://api.mintpal.com/v2/market',
PRIVATE_API_URL = 'https://api.mintpal.com/v2/market',
USER_AGENT = 'nomp/node-open-mining-portal';
// Constructor
function Mintpal(key, secret){
// Generate headers signed by this user's key and secret.
// The secret is encapsulated and never exposed
this._getPrivateHeaders = function(parameters){
var paramString, signature;
if (!key || !secret){
throw 'Mintpal: Error. API key and secret required';
}
// Sort parameters alphabetically and convert to `arg1=foo&arg2=bar`
paramString = Object.keys(parameters).sort().map(function(param){
return encodeURIComponent(param) + '=' + encodeURIComponent(parameters[param]);
}).join('&');
signature = crypto.createHmac('sha512', secret).update(paramString).digest('hex');
return {
Key: key,
Sign: signature
};
};
}
// If a site uses non-trusted SSL certificates, set this value to false
Mintpal.STRICT_SSL = true;
// Helper methods
function joinCurrencies(currencyA, currencyB){
return currencyA + '_' + currencyB;
}
// Prototype
Mintpal.prototype = {
constructor: Mintpal,
// Make an API request
_request: function(options, callback){
if (!('headers' in options)){
options.headers = {};
}
options.headers['User-Agent'] = USER_AGENT;
options.json = true;
options.strictSSL = Mintpal.STRICT_SSL;
request(options, function(err, response, body) {
callback(err, body);
});
return this;
},
// Make a public API request
_public: function(parameters, callback){
var options = {
method: 'GET',
url: PUBLIC_API_URL,
qs: parameters
};
return this._request(options, callback);
},
// Make a private API request
_private: function(parameters, callback){
var options;
parameters.nonce = nonce();
options = {
method: 'POST',
url: PRIVATE_API_URL,
form: parameters,
headers: this._getPrivateHeaders(parameters)
};
return this._request(options, callback);
},
/////
// PUBLIC METHODS
getTicker: function(callback){
var options = {
method: 'GET',
url: PUBLIC_API_URL + '/summary',
qs: null
};
return this._request(options, callback);
},
getBuyOrderBook: function(currencyA, currencyB, callback){
var options = {
method: 'GET',
url: PUBLIC_API_URL + '/orders/' + currencyB + '/' + currencyA + '/BUY',
qs: null
};
return this._request(options, callback);
},
getOrderBook: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnOrderBook',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._public(parameters, callback);
},
getTradeHistory: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnTradeHistory',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._public(parameters, callback);
},
/////
// PRIVATE METHODS
myBalances: function(callback){
var parameters = {
command: 'returnBalances'
};
return this._private(parameters, callback);
},
myOpenOrders: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnOpenOrders',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._private(parameters, callback);
},
myTradeHistory: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnTradeHistory',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._private(parameters, callback);
},
buy: function(currencyA, currencyB, rate, amount, callback){
var parameters = {
command: 'buy',
currencyPair: joinCurrencies(currencyA, currencyB),
rate: rate,
amount: amount
};
return this._private(parameters, callback);
},
sell: function(currencyA, currencyB, rate, amount, callback){
var parameters = {
command: 'sell',
currencyPair: joinCurrencies(currencyA, currencyB),
rate: rate,
amount: amount
};
return this._private(parameters, callback);
},
cancelOrder: function(currencyA, currencyB, orderNumber, callback){
var parameters = {
command: 'cancelOrder',
currencyPair: joinCurrencies(currencyA, currencyB),
orderNumber: orderNumber
};
return this._private(parameters, callback);
},
withdraw: function(currency, amount, address, callback){
var parameters = {
command: 'withdraw',
currency: currency,
amount: amount,
address: address
};
return this._private(parameters, callback);
}
};
return Mintpal;
}();

212
libs/apiPoloniex.js Normal file
View File

@ -0,0 +1,212 @@
var request = require('request');
var nonce = require('nonce')();
module.exports = function() {
'use strict';
// Module dependencies
var crypto = require('crypto');
// Constants
var version = '0.1.0',
PUBLIC_API_URL = 'https://poloniex.com/public',
PRIVATE_API_URL = 'https://poloniex.com/tradingApi',
USER_AGENT = 'npm-crypto-apis/' + version;
// Constructor
function Poloniex(key, secret){
// Generate headers signed by this user's key and secret.
// The secret is encapsulated and never exposed
this._getPrivateHeaders = function(parameters){
var paramString, signature;
if (!key || !secret){
throw 'Poloniex: Error. API key and secret required';
}
// Sort parameters alphabetically and convert to `arg1=foo&arg2=bar`
paramString = Object.keys(parameters).sort().map(function(param){
return encodeURIComponent(param) + '=' + encodeURIComponent(parameters[param]);
}).join('&');
signature = crypto.createHmac('sha512', secret).update(paramString).digest('hex');
return {
Key: key,
Sign: signature
};
};
}
// If a site uses non-trusted SSL certificates, set this value to false
Poloniex.STRICT_SSL = true;
// Helper methods
function joinCurrencies(currencyA, currencyB){
return currencyA + '_' + currencyB;
}
// Prototype
Poloniex.prototype = {
constructor: Poloniex,
// Make an API request
_request: function(options, callback){
if (!('headers' in options)){
options.headers = {};
}
options.headers['User-Agent'] = USER_AGENT;
options.json = true;
options.strictSSL = Poloniex.STRICT_SSL;
request(options, function(err, response, body) {
callback(err, body);
});
return this;
},
// Make a public API request
_public: function(parameters, callback){
var options = {
method: 'GET',
url: PUBLIC_API_URL,
qs: parameters
};
return this._request(options, callback);
},
// Make a private API request
_private: function(parameters, callback){
var options;
parameters.nonce = nonce();
options = {
method: 'POST',
url: PRIVATE_API_URL,
form: parameters,
headers: this._getPrivateHeaders(parameters)
};
return this._request(options, callback);
},
/////
// PUBLIC METHODS
getTicker: function(callback){
var parameters = {
command: 'returnTicker'
};
return this._public(parameters, callback);
},
get24hVolume: function(callback){
var parameters = {
command: 'return24hVolume'
};
return this._public(parameters, callback);
},
getOrderBook: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnOrderBook',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._public(parameters, callback);
},
getTradeHistory: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnTradeHistory',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._public(parameters, callback);
},
/////
// PRIVATE METHODS
myBalances: function(callback){
var parameters = {
command: 'returnBalances'
};
return this._private(parameters, callback);
},
myOpenOrders: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnOpenOrders',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._private(parameters, callback);
},
myTradeHistory: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnTradeHistory',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._private(parameters, callback);
},
buy: function(currencyA, currencyB, rate, amount, callback){
var parameters = {
command: 'buy',
currencyPair: joinCurrencies(currencyA, currencyB),
rate: rate,
amount: amount
};
return this._private(parameters, callback);
},
sell: function(currencyA, currencyB, rate, amount, callback){
var parameters = {
command: 'sell',
currencyPair: joinCurrencies(currencyA, currencyB),
rate: rate,
amount: amount
};
return this._private(parameters, callback);
},
cancelOrder: function(currencyA, currencyB, orderNumber, callback){
var parameters = {
command: 'cancelOrder',
currencyPair: joinCurrencies(currencyA, currencyB),
orderNumber: orderNumber
};
return this._private(parameters, callback);
},
withdraw: function(currency, amount, address, callback){
var parameters = {
command: 'withdraw',
currency: currency,
amount: amount,
address: address
};
return this._private(parameters, callback);
}
};
return Poloniex;
}();

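The three exchange wrappers (Cryptsy, Mintpal, Poloniex) share the same signing scheme: sort the request parameters, form-encode them, HMAC-SHA512 the resulting string with the API secret, and send the key and signature as the Key and Sign headers. A stand-alone sketch of that step, with placeholder credentials:
var crypto = require('crypto');
function signParameters(key, secret, parameters){
    // Sort parameters alphabetically and convert to `arg1=foo&arg2=bar`
    var paramString = Object.keys(parameters).sort().map(function(param){
        return encodeURIComponent(param) + '=' + encodeURIComponent(parameters[param]);
    }).join('&');
    var signature = crypto.createHmac('sha512', secret).update(paramString).digest('hex');
    return {Key: key, Sign: signature};
}
console.log(signParameters('API_KEY', 'API_SECRET', {command: 'returnBalances', nonce: Date.now()}));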
View File

@ -17,27 +17,43 @@ var listener = module.exports = function listener(options){
}
var blockNotifyServer = net.createServer(function(c) {
emitLog('Block listener has incoming connection');
var data = '';
c.on('data', function(d){
emitLog('Block listener received blocknotify data');
data += d;
if (data.slice(-1) === '\n'){
c.end();
}
});
c.on('end', function() {
try {
c.on('data', function (d) {
emitLog('Block listener received blocknotify data');
data += d;
if (data.slice(-1) === '\n') {
c.end();
}
});
c.on('end', function () {
emitLog('Block listener connection ended');
emitLog('Block listener connection ended');
var message = JSON.parse(data);
if (message.password === options.password){
_this.emit('hash', message);
}
else
emitLog('Block listener received notification with incorrect password');
var message;
try{
message = JSON.parse(data);
}
catch(e){
emitLog('Block listener failed to parse message ' + data);
return;
}
if (message.password === options.password) {
_this.emit('hash', message);
}
else
emitLog('Block listener received notification with incorrect password');
});
}
catch(e){
emitLog('Block listener had an error: ' + e);
}
});
});
blockNotifyServer.listen(options.port, function() {
emitLog('Block notify listener server started on port ' + options.port)

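The listener above buffers incoming TCP data until a newline arrives, JSON-parses it and, if the password matches, emits the message so the master process can forward the coin and hash to the right pool. A minimal sender sketch (host, port and password are placeholders for the blockNotifyListener values in config.json):
var net = require('net');
var client = net.connect({host: '127.0.0.1', port: 8117}, function(){
    client.end(JSON.stringify({
        password: 'blockNotifyPassword', // must match the listener's configured password
        coin: 'Litecoin',                // name of the pool the notification is for
        hash: process.argv[2]            // block hash passed in by the daemon's -blocknotify
    }) + '\n');                          // trailing newline tells the listener the message is complete
});
The coinswitch listener added below follows the same newline-terminated JSON convention, with the coin field naming the pool the proxy should switch to.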
View File

@ -0,0 +1,56 @@
var events = require('events');
var net = require('net');
var listener = module.exports = function listener(options){
var _this = this;
var emitLog = function(text){
_this.emit('log', text);
};
this.start = function(){
if (!options || !options.enabled){
emitLog('Coinswitch listener disabled');
return;
}
var coinswitchServer = net.createServer(function(c) {
emitLog('Coinswitch listener has incoming connection');
var data = '';
try {
c.on('data', function (d) {
emitLog('Coinswitch listener received switch request');
data += d;
if (data.slice(-1) === '\n') {
c.end();
}
});
c.on('end', function () {
var message = JSON.parse(data);
if (message.password === options.password) {
_this.emit('switchcoin', message);
}
else
emitLog('Coinswitch listener received notification with incorrect password');
});
}
catch(e){
emitLog('Coinswitch listener failed to parse message ' + data);
}
});
coinswitchServer.listen(options.port, function() {
emitLog('Coinswitch notify listener server started on port ' + options.port)
});
emitLog("Coinswitch listener is enabled, starting server on port " + options.port);
}
};
listener.prototype.__proto__ = events.EventEmitter.prototype;

View File

@ -1,78 +1,76 @@
var dateFormat = require('dateformat');
/*
var defaultConfiguration = {
'default': true,
'keys': {
'client' : 'warning',
'system' : true,
'submitblock' : true,
var colors = require('colors');
var severityToColor = function(severity, text) {
switch(severity) {
case 'special':
return text.cyan.underline;
case 'debug':
return text.green;
case 'warning':
return text.yellow;
case 'error':
return text.red;
default:
console.log("Unknown severity " + severity);
return text.italic;
}
};
*/
var severityToInt = function(severity) {
switch(severity) {
case 'debug':
return 10;
case 'warning':
return 20;
case 'error':
return 30;
default:
console.log("Unknown severity "+severity);
return 1000;
}
}
var getSeverityColor = function(severity) {
switch(severity) {
case 'debug':
return 32;
case 'warning':
return 33;
case 'error':
return 31;
default:
console.log("Unknown severity "+severity);
return 31;
}
}
var severityValues = {
'debug': 1,
'warning': 2,
'error': 3,
'special': 4
};
var PoolLogger = function (configuration) {
var logLevelInt = severityToInt(configuration.logLevel);
// privates
var shouldLog = function(key, severity) {
var severity = severityToInt(severity);
return severity >= logLevelInt;
var logLevelInt = severityValues[configuration.logLevel];
var log = function(severity, system, component, text, subcat) {
if (severityValues[severity] < logLevelInt) return;
if (subcat){
var realText = subcat;
var realSubCat = text;
text = realText;
subcat = realSubCat;
}
var entryDesc = dateFormat(new Date(), 'yyyy-mm-dd HH:MM:ss') + ' [' + system + ']\t';
entryDesc = severityToColor(severity, entryDesc);
var logString =
entryDesc +
('[' + component + '] ').italic;
if (subcat)
logString += ('(' + subcat + ') ').bold.grey
logString += text.grey;
console.log(logString);
};
var log = function(severity, key, poolName, text) {
if (!shouldLog(key, severity))
return;
var desc = poolName ? '[' + poolName + '] ' : '';
console.log(
'\u001b[' + getSeverityColor(severity) + 'm' +
dateFormat(new Date(), 'yyyy-mm-dd HH:MM:ss') +
" [" + key + "]" + '\u001b[39m: ' + "\t" +
desc + text
);
}
// public
this.logDebug = function(poolName, logKey, text){
log('debug', logKey, poolName, text);
}
this.logWarning = function(poolName, logKey, text) {
log('warning', logKey, poolName, text);
}
this.logError = function(poolName, logKey, text) {
log('error', logKey, poolName, text);
}
}
var _this = this;
Object.keys(severityValues).forEach(function(logType){
_this[logType] = function(){
var args = Array.prototype.slice.call(arguments, 0);
args.unshift(logType);
log.apply(this, args);
};
});
};
module.exports = PoolLogger;

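A usage sketch of the rewritten logger: the constructor takes the log level, and one method is generated per severity. Each method takes (system, component, text), or (system, component, subcat, text) when a sub-category such as the thread number is wanted:
var PoolLogger = require('./logUtil.js');
var logger = new PoolLogger({logLevel: 'debug'});
logger.debug('Master', 'Website', 'Website started');
logger.warning('Payments', 'Litecoin', 'Low wallet balance');
logger.special('Master', 'Init', 'All pools loaded');
// 4-argument form: the third argument becomes the (subcat) shown before the message
logger.error('Pool', 'Litecoin', 'Thread 1', 'Daemon connection lost');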
View File

@ -7,6 +7,9 @@ module.exports = function(logger, poolConfig){
var connection;
var logIdentify = 'MySQL';
var logComponent = coin;
function connect(){
connection = mysql.createConnection({
host: mposConfig.host,
@ -17,18 +20,18 @@ module.exports = function(logger, poolConfig){
});
connection.connect(function(err){
if (err)
logger.error('mysql', 'Could not connect to mysql database: ' + JSON.stringify(err))
logger.error(logIdentify, logComponent, 'Could not connect to mysql database: ' + JSON.stringify(err))
else{
logger.debug('mysql', 'Successful connection to MySQL database');
logger.debug(logIdentify, logComponent, 'Successful connection to MySQL database');
}
});
connection.on('error', function(err){
if(err.code === 'PROTOCOL_CONNECTION_LOST') {
logger.warning('mysql', 'Lost connection to MySQL database, attempting reconnection...');
logger.warning(logIdentify, logComponent, 'Lost connection to MySQL database, attempting reconnection...');
connect();
}
else{
logger.error('mysql', 'Database error: ' + JSON.stringify(err))
logger.error(logIdentify, logComponent, 'Database error: ' + JSON.stringify(err))
}
});
}
@ -38,10 +41,10 @@ module.exports = function(logger, poolConfig){
connection.query(
'SELECT password FROM pool_worker WHERE username = LOWER(?)',
[workerName],
[workerName.toLowerCase()],
function(err, result){
if (err){
logger.error('mysql', 'Database error when authenticating worker: ' +
logger.error(logIdentify, logComponent, 'Database error when authenticating worker: ' +
JSON.stringify(err));
authCallback(false);
}
@ -63,19 +66,20 @@ module.exports = function(logger, poolConfig){
var dbData = [
shareData.ip,
shareData.worker,
isValidShare ? 'Y' : 'N',
isValidShare ? 'Y' : 'N',
isValidBlock ? 'Y' : 'N',
shareData.difficulty,
shareData.difficulty * (poolConfig.coin.mposDiffMultiplier || 1),
typeof(shareData.error) === 'undefined' ? null : shareData.error,
typeof(shareData.solution) === 'undefined' ? '' : shareData.solution
shareData.blockHash ? shareData.blockHash : (shareData.blockHashInvalid ? shareData.blockHashInvalid : '')
];
connection.query(
'INSERT INTO `shares` SET time = NOW(), rem_host = ?, username = ?, our_result = ?, upstream_result = ?, difficulty = ?, reason = ?, solution = ?',
dbData,
function(err, result) {
if (err)
logger.error('mysql', 'Insert error when adding share: ' +
JSON.stringify(err));
logger.error(logIdentify, logComponent, 'Insert error when adding share: ' + JSON.stringify(err));
else
logger.debug(logIdentify, logComponent, 'Share inserted');
}
);
};
@ -86,7 +90,7 @@ module.exports = function(logger, poolConfig){
'UPDATE `pool_worker` SET `difficulty` = ' + diff + ' WHERE `username` = ' + connection.escape(workerName),
function(err, result){
if (err)
logger.error('mysql', 'Error when updating worker diff: ' +
logger.error(logIdentify, logComponent, 'Error when updating worker diff: ' +
JSON.stringify(err));
else if (result.affectedRows === 0){
connection.query('INSERT INTO `pool_worker` SET ?', {username: workerName, difficulty: diff});
@ -98,4 +102,4 @@ module.exports = function(logger, poolConfig){
};
};
};

View File

@ -9,79 +9,173 @@ module.exports = function(logger){
var poolConfigs = JSON.parse(process.env.pools);
var enabledPools = [];
Object.keys(poolConfigs).forEach(function(coin) {
SetupForPool(logger, poolConfigs[coin]);
var poolOptions = poolConfigs[coin];
if (poolOptions.shareProcessing &&
poolOptions.shareProcessing.internal &&
poolOptions.shareProcessing.internal.enabled)
enabledPools.push(coin);
});
async.filter(enabledPools, function(coin, callback){
SetupForPool(logger, poolConfigs[coin], function(setupResults){
callback(setupResults);
});
}, function(coins){
coins.forEach(function(coin){
var poolOptions = poolConfigs[coin];
var processingConfig = poolOptions.shareProcessing.internal;
var logSystem = 'Payments';
var logComponent = coin;
logger.debug(logSystem, logComponent, 'Payment processing setup to run every '
+ processingConfig.paymentInterval + ' second(s) with daemon ('
+ processingConfig.daemon.user + '@' + processingConfig.daemon.host + ':' + processingConfig.daemon.port
+ ') and redis (' + processingConfig.redis.host + ':' + processingConfig.redis.port + ')');
});
});
};
function SetupForPool(logger, poolOptions){
function SetupForPool(logger, poolOptions, setupFinished){
var coin = poolOptions.coin.name;
var processingConfig = poolOptions.shareProcessing.internal;
if (!processingConfig.enabled) return;
var logIdentify = 'Payment Processor (' + coin + ')';
var paymentLogger = {
debug: function(key, text){
logger.logDebug(logIdentify, key, text);
},
warning: function(key, text){
logger.logWarning(logIdentify, key, text);
},
error: function(key, text){
logger.logError(logIdentify, key, text);
}
};
var daemon = new Stratum.daemon.interface([processingConfig.daemon]);
daemon.once('online', function(){
paymentLogger.debug('system', 'Connected to daemon for payment processing');
daemon.cmd('validateaddress', [poolOptions.address], function(result){
if (!result[0].response.ismine){
paymentLogger.error('system', 'Daemon does not own pool address - payment processing can not be done with this daemon');
}
});
}).once('connectionFailed', function(error){
paymentLogger.error('system', 'Failed to connect to daemon for payment processing: ' + JSON.stringify(error));
}).on('error', function(error){
paymentLogger.error('system', error);
}).init();
var logSystem = 'Payments';
var logComponent = coin;
var processingPayments = true;
var daemon;
var redisClient;
async.parallel([
var connectToRedis = function(){
var reconnectTimeout;
redisClient = redis.createClient(processingConfig.redis.port, processingConfig.redis.host);
redisClient.on('ready', function(){
clearTimeout(reconnectTimeout);
paymentLogger.debug('redis', 'Successfully connected to redis database');
}).on('error', function(err){
paymentLogger.error('redis', 'Redis client had an error: ' + JSON.stringify(err))
}).on('end', function(){
paymentLogger.error('redis', 'Connection to redis database has been ended');
paymentLogger.warning('redis', 'Trying reconnection in 3 seconds...');
reconnectTimeout = setTimeout(function(){
connectToRedis();
}, 3000);
function(callback){
daemon = new Stratum.daemon.interface([processingConfig.daemon]);
daemon.once('online', function(){
daemon.cmd('validateaddress', [poolOptions.address], function(result){
if (!result[0].response || !result[0].response.ismine){
logger.error(logSystem, logComponent,
'Daemon does not own pool address - payment processing can not be done with this daemon, '
+ JSON.stringify(result[0].response));
return;
}
callback()
});
}).once('connectionFailed', function(error){
logger.error(logSystem, logComponent, 'Failed to connect to daemon for payment processing: config ' +
JSON.stringify(processingConfig.daemon) + ', error: ' +
JSON.stringify(error));
callback('Error connecting to daemon');
}).on('error', function(error){
logger.error(logSystem, logComponent, 'Daemon error ' + JSON.stringify(error));
}).init();
},
function(callback){
redisClient = redis.createClient(processingConfig.redis.port, processingConfig.redis.host);
redisClient.on('ready', function(){
if (callback) {
callback();
callback = null;
return;
}
logger.debug(logSystem, logComponent, 'Connected to redis at '
+ processingConfig.redis.host + ':' + processingConfig.redis.port + ' for payment processing');
}).on('end', function(){
logger.error(logSystem, logComponent, 'Connection to redis database has been ended');
}).once('error', function(){
if (callback) {
logger.error(logSystem, logComponent, 'Failed to connect to redis at '
+ processingConfig.redis.host + ':' + processingConfig.redis.port + ' for payment processing');
callback('Error connecting to redis');
callback = null;
}
});
}
], function(err){
if (err){
setupFinished(false);
return;
}
setInterval(function(){
try {
processPayments();
} catch(e){
throw e;
}
}, processingConfig.paymentInterval * 1000);
setTimeout(processPayments, 100);
setupFinished(true);
});
/* Call redis to check if previous sendmany and/or redis cleanout commands completed successfully.
If sendmany worked fine but redis commands failed you HAVE TO run redis commands again
(manually) to prevent double payments. If sendmany failed too you can safely delete
coin + '_finalRedisCommands' string from redis to let pool calculate payments again. */
function checkPreviousPaymentsStatus(callback) {
redisClient.get(coin + '_finalRedisCommands', function(error, reply) {
if (error){
callback('Could not get finalRedisCommands - ' + JSON.stringify(error));
return;
}
if (reply) {
callback('Payments stopped because of a critical error - failed commands saved in '
+ coin + '_finalRedisCommands redis set:\n' + reply);
return;
} else {
/* There was no error in previous sendmany and/or redis cleanout commands
so we can safely continue */
processingPayments = false;
callback();
}
});
}
/* Number.toFixed gives us the decimal places we want, but as a string. parseFloat turns it back into number
we don't care about trailing zeros in this case. */
var toPrecision = function(value, precision){
return parseFloat(value.toFixed(precision));
};
connectToRedis();
/* Deal with numbers in smallest possible units (satoshis) as much as possible. This greatly helps with accuracy
when rounding and whatnot. When we are storing numbers for only humans to see, store in whole coin units. */
var processPayments = function(){
var startPaymentProcess = Date.now();
async.waterfall([
function(callback) {
if (processingPayments) {
checkPreviousPaymentsStatus(function(error){
if (error) {
logger.error(logSystem, logComponent, error);
callback('Check finished - previous payments processing error');
return;
}
callback();
});
return;
}
callback();
},
/* Call redis to get an array of rounds - which are coinbase transactions and block heights from submitted
blocks. */
function(callback){
@ -89,25 +183,31 @@ function SetupForPool(logger, poolOptions){
redisClient.smembers(coin + '_blocksPending', function(error, results){
if (error){
paymentLogger.error('redis', 'Could get blocks from redis ' + JSON.stringify(error));
callback('done - redis error for getting blocks');
logger.error(logSystem, logComponent, 'Could not get blocks from redis ' + JSON.stringify(error));
callback('Check finished - redis error for getting blocks');
return;
}
if (results.length === 0){
callback('done - no pending blocks in redis');
callback('Check finished - no pending blocks in redis');
return;
}
var rounds = results.map(function(r){
var details = r.split(':');
return {txHash: details[0], height: details[1], reward: details[2], serialized: r};
return {
category: details[0].category,
blockHash: details[0],
txHash: details[1],
height: details[2],
reward: details[3],
serialized: r
};
});
callback(null, rounds);
});
},
/* Does a batch rpc call to daemon with all the transaction hashes to see if they are confirmed yet.
It also adds the block reward amount to the round object - which the daemon gives also gives us. */
function(rounds, callback){
@ -119,51 +219,91 @@ function SetupForPool(logger, poolOptions){
daemon.batchCmd(batchRPCcommand, function(error, txDetails){
if (error || !txDetails){
callback('done - daemon rpc error with batch gettransactions ' + JSON.stringify(error));
callback('Check finished - daemon rpc error with batch gettransactions ' +
JSON.stringify(error));
return;
}
txDetails = txDetails.filter(function(tx){
if (tx.error || !tx.result){
console.log('error with requesting transaction from block daemon: ' + JSON.stringify(t));
return false;
txDetails.forEach(function(tx, i){
var round = rounds[i];
if ((tx.error && tx.error.code === -5) || (tx.result && round.blockHash !== tx.result.blockhash)){
/* Block was dropped from coin daemon even after it happily accepted it earlier. */
//If we find another block at the same height then this block was drop-kicked orphaned
var dropKicked = rounds.filter(function(r){
return r.height === round.height && r.blockHash !== round.blockHash && r.category !== 'dropkicked';
}).length > 0;
if (dropKicked){
logger.warning(logSystem, logComponent,
'A block was drop-kicked orphaned'
+ ' - we found a better block at the same height, blockHash '
+ round.blockHash + " round " + round.height);
round.category = 'dropkicked';
}
else{
/* We have no other blocks that match this height so convert to orphan in order for
shares from the round to be rewarded. */
round.category = 'orphan';
}
}
else if (tx.error || !tx.result){
logger.error(logSystem, logComponent,
'Error with requesting transaction from block daemon: ' + JSON.stringify(tx));
}
else{
round.category = tx.result.details[0].category;
if (round.category === 'generate')
round.amount = tx.result.amount;
}
return true;
});
var orphanedRounds = [];
var confirmedRounds = [];
//Rounds that are not confirmed yet are removed from the round array
//We also get reward amount for each block from daemon reply
rounds.forEach(function(r){
var tx = txDetails.filter(function(tx){return tx.result.txid === r.txHash})[0];
var magnitude;
if (!tx){
console.log('daemon did not give us back a transaction that we asked for: ' + r.txHash);
return;
//Filter out all rounds that are immature (not confirmed or orphaned yet)
rounds = rounds.filter(function(r){
switch (r.category) {
case 'generate':
/* Here we calculate the smallest unit in this coin's currency; the 'satoshi'.
The rpc.getblocktemplate.amount tells us how much we get in satoshis, while the
rpc.gettransaction.amount tells us how much we get in whole coin units. Therefore,
we simply divide the two to get the magnitude. I don't know math, there is probably
a better term than 'magnitude'. Sue me or do a pull request to fix it. */
var roundMagnitude = r.reward / r.amount;
if (!magnitude) {
magnitude = roundMagnitude;
if (roundMagnitude % 10 !== 0)
logger.error(logSystem, logComponent,
'Satoshis in coin is not divisible by 10, which is very odd');
}
else if (magnitude != roundMagnitude) {
/* Magnitude for a coin should ALWAYS be the same. For BTC and most coins there are
100,000,000 satoshis in one coin unit. */
logger.error(logSystem, logComponent,
'Magnitude in a round was different than in another round. HUGE PROBLEM.');
}
return true;
case 'dropkicked':
case 'orphan':
return true;
default:
return false;
}
r.category = tx.result.details[0].category;
if (r.category === 'orphan'){
orphanedRounds.push(r);
}
else if (r.category === 'generate'){
r.amount = tx.result.amount;
r.magnitude = r.reward / r.amount;
confirmedRounds.push(r);
}
});
if (orphanedRounds.length === 0 && confirmedRounds.length === 0){
callback('done - no confirmed or orhpaned rounds');
if (rounds.length === 0){
callback('Check finished - no confirmed or orphaned blocks found');
}
else{
callback(null, confirmedRounds, orphanedRounds);
callback(null, rounds, magnitude);
}
});
},
@ -171,85 +311,78 @@ function SetupForPool(logger, poolOptions){
/* Does a batch redis call to get shares contributed to each round. Then calculates the reward
amount owned to each miner for each round. */
function(confirmedRounds, orphanedRounds, callback){
var rounds = [];
for (var i = 0; i < orphanedRounds.length; i++) rounds.push(orphanedRounds[i]);
for (var i = 0; i < confirmedRounds.length; i++) rounds.push(confirmedRounds[i]);
function(rounds, magnitude, callback){
var shareLookups = rounds.map(function(r){
return ['hgetall', coin + '_shares:round' + r.height]
});
redisClient.multi(shareLookups).exec(function(error, allWorkerShares){
if (error){
callback('done - redis error with multi get rounds share')
callback('Check finished - redis error with multi get rounds share')
return;
}
// Iterate through the beginning of the share results which are for the orphaned rounds
var orphanMergeCommands = []
for (var i = 0; i < orphanedRounds.length; i++){
var workerShares = allWorkerShares[i];
Object.keys(workerShares).forEach(function(worker){
orphanMergeCommands.push(['hincrby', coin + '_shares:roundCurrent', worker, workerShares[worker]]);
});
orphanMergeCommands.push([]);
}
// Iterate through the rest of the share results which are for the worker rewards
var orphanMergeCommands = [];
var workerRewards = {};
for (var i = orphanedRounds.length; i < allWorkerShares.length; i++){
var round = rounds[i];
rounds.forEach(function(round, i){
var workerShares = allWorkerShares[i];
var reward = round.reward * (1 - processingConfig.feePercent);
var totalShares = Object.keys(workerShares).reduce(function(p, c){
return p + parseInt(workerShares[c])
}, 0);
for (var worker in workerShares){
var percent = parseInt(workerShares[worker]) / totalShares;
var workerRewardTotal = Math.floor(reward * percent);
if (!(worker in workerRewards)) workerRewards[worker] = 0;
workerRewards[worker] += workerRewardTotal;
if (!workerShares){
logger.error(logSystem, logComponent, 'No worker shares for round: '
+ round.height + ' blockHash: ' + round.blockHash);
return;
}
}
switch (round.category){
case 'orphan':
/* Each block that gets orphaned, all the shares go into the current round so that
miners still get a reward for their work. This seems unfair to those that just
started mining during this current round, but over time it balances out and rewards
loyal miners. */
Object.keys(workerShares).forEach(function(worker){
orphanMergeCommands.push(['hincrby', coin + '_shares:roundCurrent',
worker, workerShares[worker]]);
});
break;
//this calculates profit if you wanna see it
/*
var workerTotalRewards = Object.keys(workerRewards).reduce(function(p, c){
return p + workerRewards[c];
}, 0);
case 'generate':
/* We found a confirmed block! Now get the reward for it and calculate how much
we owe each miner based on the shares they submitted during that block round. */
var reward = round.reward * (1 - processingConfig.feePercent);
var poolTotalRewards = rounds.reduce(function(p, c){
return p + c.amount * c.magnitude;
}, 0);
var totalShares = Object.keys(workerShares).reduce(function(p, c){
return p + parseInt(workerShares[c])
}, 0);
console.log(workerRewards);
console.log('pool profit percent' + ((poolTotalRewards - workerTotalRewards) / poolTotalRewards));
*/
for (var worker in workerShares){
var percent = parseInt(workerShares[worker]) / totalShares;
var workerRewardTotal = Math.floor(reward * percent);
if (!(worker in workerRewards)) workerRewards[worker] = 0;
workerRewards[worker] += workerRewardTotal;
}
break;
}
callback(null, rounds, workerRewards, orphanMergeCommands);
});
callback(null, rounds, magnitude, workerRewards, orphanMergeCommands);
});
},
/* Does a batch call to redis to get worker existing balances from coin_balances*/
function(rounds, workerRewards, orphanMergeCommands, callback){
function(rounds, magnitude, workerRewards, orphanMergeCommands, callback){
var workers = Object.keys(workerRewards);
redisClient.hmget([coin + '_balances'].concat(workers), function(error, results){
if (error){
callback('done - redis error with multi get balances');
if (error && workers.length !== 0){
callback('Check finished - redis error with multi get balances ' + JSON.stringify(error));
return;
}
@ -257,11 +390,11 @@ function SetupForPool(logger, poolOptions){
var workerBalances = {};
for (var i = 0; i < workers.length; i++){
workerBalances[workers[i]] = parseInt(results[i]) || 0;
workerBalances[workers[i]] = (parseInt(results[i]) || 0);
}
callback(null, rounds, workerRewards, workerBalances, orphanMergeCommands);
callback(null, rounds, magnitude, workerRewards, orphanMergeCommands, workerBalances);
});
},
@ -273,11 +406,12 @@ function SetupForPool(logger, poolOptions){
when deciding the sent balance, the difference should be -1 * the amount they had in the db;
if not sending the balance, the difference should be +(the amount they earned this round)
*/
function(rounds, workerRewards, workerBalances, orphanMergeCommands, callback){
function(rounds, magnitude, workerRewards, orphanMergeCommands, workerBalances, callback){
var magnitude = rounds[0].magnitude;
//number of satoshis in a single coin unit - this can be different for coins so we calculate it :)
daemon.cmd('getbalance', [], function(results){
daemon.cmd('getbalance', [''], function(results){
var totalBalance = results[0].response * magnitude;
var toBePaid = 0;
@ -287,100 +421,199 @@ function SetupForPool(logger, poolOptions){
var balanceUpdateCommands = [];
var workerPayoutsCommand = [];
/* Here we add up all workers' previous unpaid balances plus their current rewards as we are
about to check if they reach the payout threshold. */
for (var worker in workerRewards){
workerPayments[worker] = (workerPayments[worker] || 0) + workerRewards[worker];
workerPayments[worker] = ((workerPayments[worker] || 0) + workerRewards[worker]);
}
for (var worker in workerBalances){
workerPayments[worker] = (workerPayments[worker] || 0) + workerBalances[worker];
}
for (var worker in workerPayments){
if (workerPayments[worker] < processingConfig.minimumPayment * magnitude){
balanceUpdateCommands.push(['hincrby', coin + '_balances', worker, workerRewards[worker]]);
delete workerPayments[worker];
}
else{
if (workerBalances[worker] !== 0)
balanceUpdateCommands.push(['hincrby', coin + '_balances', worker, -1 * workerBalances[worker]]);
workerPayoutsCommand.push(['hincrby', coin + '_balances', worker, workerRewards[worker]]);
toBePaid += workerPayments[worker];
}
workerPayments[worker] = ((workerPayments[worker] || 0) + workerBalances[worker]);
}
var balanceLeftOver = totalBalance - toBePaid;
/* Here we check if any of the workers reached their payout threshold, or delete them from the
pending payment ledger (the workerPayments object). */
if (Object.keys(workerPayments).length > 0){
var coinPrecision = magnitude.toString().length - 1;
for (var worker in workerPayments){
if (workerPayments[worker] < processingConfig.minimumPayment * magnitude){
/* The workers total earnings (balance + current reward) was not enough to warrant
a transaction, so we will store their balance in the database. Next time they
are rewarded it might reach the payout threshold. */
balanceUpdateCommands.push([
'hincrby',
coin + '_balances',
worker,
workerRewards[worker]
]);
delete workerPayments[worker];
}
else{
//If worker had a balance that is about to be paid out, subtract it from the database
if (workerBalances[worker] !== 0){
balanceUpdateCommands.push([
'hincrby',
coin + '_balances',
worker,
-1 * workerBalances[worker]
]);
}
var rewardInPrecision = (workerRewards[worker] / magnitude).toFixed(coinPrecision);
workerPayoutsCommand.push(['hincrbyfloat', coin + '_payouts', worker, rewardInPrecision]);
toBePaid += workerPayments[worker];
}
}
}
// txfee included in feeAmountToBeCollected
var leftOver = toBePaid / (1 - processingConfig.feePercent);
var feeAmountToBeCollected = toPrecision(leftOver * processingConfig.feePercent, coinPrecision);
var balanceLeftOver = totalBalance - toBePaid - feeAmountToBeCollected;
var minReserveSatoshis = processingConfig.minimumReserve * magnitude;
if (balanceLeftOver < minReserveSatoshis){
callback('done - payments would wipe out minimum reserve, tried to pay out ' + toBePaid +
/* TODO: Need to convert all these variables into whole coin units before displaying because
humans aren't good at reading satoshi units. */
callback('Check finished - payments would wipe out minimum reserve, tried to pay out ' +
toBePaid + ' and collect ' + feeAmountToBeCollected + ' as fees' +
' but only have ' + totalBalance + '. Left over balance would be ' + balanceLeftOver +
', needs to be at least ' + minReserveSatoshis);
return;
}
/* Move pending blocks into either orphan for confirmed sets, and delete their no longer
required round/shares data. */
var movePendingCommands = [];
var deleteRoundsCommand = ['del'];
var roundsToDelete = [];
rounds.forEach(function(r){
var destinationSet = r.category === 'orphan' ? '_blocksOrphaned' : '_blocksConfirmed';
var destinationSet = (function(){
switch(r.category){
case 'orphan': return '_blocksOrphaned';
case 'generate': return '_blocksConfirmed';
case 'dropkicked': return '_blocksDropKicked';
}
})();
movePendingCommands.push(['smove', coin + '_blocksPending', coin + destinationSet, r.serialized]);
deleteRoundsCommand.push(coin + '_shares:round' + r.height)
roundsToDelete.push(coin + '_shares:round' + r.height)
});
var finalRedisCommands = [];
finalRedisCommands = finalRedisCommands.concat(
movePendingCommands,
orphanMergeCommands,
balanceUpdateCommands,
workerPayoutsCommand
);
if (movePendingCommands.length > 0)
finalRedisCommands = finalRedisCommands.concat(movePendingCommands);
finalRedisCommands.push(deleteRoundsCommand);
finalRedisCommands.push(['hincrby', coin + '_stats', 'totalPaid', toBePaid]);
if (orphanMergeCommands.length > 0)
finalRedisCommands = finalRedisCommands.concat(orphanMergeCommands);
if (balanceUpdateCommands.length > 0)
finalRedisCommands = finalRedisCommands.concat(balanceUpdateCommands);
if (workerPayoutsCommand.length > 0)
finalRedisCommands = finalRedisCommands.concat(workerPayoutsCommand);
if (roundsToDelete.length > 0)
finalRedisCommands.push(['del'].concat(roundsToDelete));
if (toBePaid !== 0)
finalRedisCommands.push(['hincrbyfloat', coin + '_stats', 'totalPaid', (toBePaid / magnitude).toFixed(coinPrecision)]);
finalRedisCommands.push(['del', coin + '_finalRedisCommands']);
finalRedisCommands.push(['bgsave']);
callback(null, magnitude, workerPayments, finalRedisCommands);
});
},
function(magnitude, workerPayments, finalRedisCommands, callback) {
/* Save final redis cleanout commands in case something goes wrong during payments */
redisClient.set(coin + '_finalRedisCommands', JSON.stringify(finalRedisCommands), function(error, reply) {
if (error){
callback('Check finished - error with saving finalRedisCommands' + JSON.stringify(error));
return;
}
callback(null, magnitude, workerPayments, finalRedisCommands);
});
},
function(magnitude, workerPayments, finalRedisCommands, callback){
var sendManyCmd = ['', {}];
for (var address in workerPayments){
sendManyCmd[1][address] = workerPayments[address] / magnitude;
}
console.log(JSON.stringify(finalRedisCommands, null, 4));
console.log(JSON.stringify(workerPayments, null, 4));
console.log(JSON.stringify(sendManyCmd, null, 4));
//return callback('not yet...');
daemon.cmd('sendmany', sendManyCmd, function(results){
if (results[0].error){
callback('done - error with sendmany ' + JSON.stringify(results[0].error));
return;
}
//This does the final all-or-nothing atom transaction if block deamon sent payments
var finalizeRedisTx = function(){
redisClient.multi(finalRedisCommands).exec(function(error, results){
if (error){
callback('done - error with final redis commands for cleaning up ' + JSON.stringify(error));
callback('Error with final redis commands for cleaning up ' + JSON.stringify(error));
return;
}
callback(null, 'Payments sent');
processingPayments = false;
logger.debug(logSystem, logComponent, 'Payments processing performed an interval');
});
});
};
if (Object.keys(workerPayments).length === 0){
finalizeRedisTx();
}
else{
//This is how many decimal places to round a coin down to
var coinPrecision = magnitude.toString().length - 1;
var addressAmounts = {};
var totalAmountUnits = 0;
for (var address in workerPayments){
var coinUnits = toPrecision(workerPayments[address] / magnitude, coinPrecision);
addressAmounts[address] = coinUnits;
totalAmountUnits += coinUnits;
}
logger.debug(logSystem, logComponent, 'Payments to be sent to: ' + JSON.stringify(addressAmounts));
processingPayments = true;
daemon.cmd('sendmany', ['', addressAmounts], function(results){
if (results[0].error){
callback('Check finished - error with sendmany ' + JSON.stringify(results[0].error));
return;
}
finalizeRedisTx();
var totalWorkers = Object.keys(workerPayments).length;
logger.debug(logSystem, logComponent, 'Payments sent, a total of ' + totalAmountUnits
+ ' ' + poolOptions.coin.symbol + ' was sent to ' + totalWorkers + ' miners');
daemon.cmd('gettransaction', [results[0].response], function(results){
if (results[0].error){
callback('Check finished - error with gettransaction ' + JSON.stringify(results[0].error));
return;
}
var feeAmountUnits = parseFloat((totalAmountUnits / (1 - processingConfig.feePercent) * processingConfig.feePercent).toFixed(coinPrecision));
var poolFees = feeAmountUnits - results[0].response.fee;
daemon.cmd('move', ['', processingConfig.feeCollectAccount, poolFees], function(results){
if (results[0].error){
callback('Check finished - error with move ' + JSON.stringify(results[0].error));
return;
}
callback(null, poolFees + ' ' + poolOptions.coin.symbol + ' collected as pool fee');
});
});
});
}
}
], function(error, result){
var paymentProcessTime = Date.now() - startPaymentProcess;
if (error)
paymentLogger.debug('system', error)
logger.debug(logSystem, logComponent, '[Took ' + paymentProcessTime + 'ms] ' + error);
else{
paymentLogger.debug('system', result);
withdrawalProfit();
logger.debug(logSystem, logComponent, '[' + paymentProcessTime + 'ms] ' + result);
// not sure if we need some time to let daemon update the wallet balance
setTimeout(withdrawalProfit, 1000);
}
});
};
@ -390,27 +623,34 @@ function SetupForPool(logger, poolOptions){
if (!processingConfig.feeWithdrawalThreshold) return;
daemon.cmd('getbalance', [], function(results){
logger.debug(logSystem, logComponent, 'Profit withdrawal started');
daemon.cmd('getbalance', [processingConfig.feeCollectAccount], function(results){
var totalBalance = results[0].response;
var withdrawalAmount = totalBalance - processingConfig.minimumReserve;
var leftOverBalance = totalBalance - withdrawalAmount;
// We have to pay some tx fee here too, but maybe we shouldn't care about it too much as long as the fee is less
// than the minimumReserve value. In that case, even if the feeCollectAccount ends up with a negative balance,
// the total wallet balance stays positive and the feeCollectAccount gets refilled during the next payment run.
var withdrawalAmount = results[0].response;
if (leftOverBalance < processingConfig.minimumReserve || withdrawalAmount < processingConfig.feeWithdrawalThreshold){
paymentLogger.debug('system', 'Not enough profit to withdrawal yet');
if (withdrawalAmount < processingConfig.feeWithdrawalThreshold){
logger.debug(logSystem, logComponent, 'Not enough profit to withdraw yet');
}
else{
//Need to figure out how much of the balance is profit... ???
paymentLogger.debug('system', 'Can send profit');
}
var withdrawal = {};
withdrawal[processingConfig.feeReceiveAddress] = withdrawalAmount;
daemon.cmd('sendmany', [processingConfig.feeCollectAccount, withdrawal], function(results){
if (results[0].error){
logger.debug(logSystem, logComponent, 'Profit withdrawal finished - error with sendmany '
+ JSON.stringify(results[0].error));
return;
}
logger.debug(logSystem, logComponent, 'Profit sent, a total of ' + withdrawalAmount
+ ' ' + poolOptions.coin.symbol + ' was sent to ' + processingConfig.feeReceiveAddress);
});
}
});
};
setInterval(processPayments, processingConfig.paymentInterval * 1000);
setTimeout(processPayments, 100);
};

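A worked example of the unit handling in the payment processor above, with made-up numbers. The pending-block record stores the reward in satoshis (from getblocktemplate) while gettransaction reports it in whole coins, so dividing the two gives the coin's magnitude, and the magnitude in turn gives the number of decimal places used when rounding payouts:
var reward = 5000000000;                               // round reward in satoshis
var amount = 50;                                       // the same reward in whole coins
var magnitude = reward / amount;                       // 100000000 satoshis per coin
var coinPrecision = magnitude.toString().length - 1;   // 8 decimal places
// toPrecision as defined above: round to the coin's decimal places, drop trailing zeros
var toPrecision = function(value, precision){
    return parseFloat(value.toFixed(precision));
};
var workerRewardSatoshis = 123456789;
console.log(toPrecision(workerRewardSatoshis / magnitude, coinPrecision)); // 1.23456789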
View File

@ -1,46 +1,98 @@
var Stratum = require('stratum-pool');
var Vardiff = require('stratum-pool/lib/varDiff.js');
var redis = require('redis');
var net = require('net');
var MposCompatibility = require('./mposCompatibility.js');
var ShareProcessor = require('./shareProcessor.js');
module.exports = function(logger){
var _this = this;
var poolConfigs = JSON.parse(process.env.pools);
var portalConfig = JSON.parse(process.env.portalConfig);
var forkId = process.env.forkId;
var forkId = process.env.forkId;
var pools = {};
var varDiffsInstances = {}; // contains all the vardiffs for the profit switching pool
var pools = {};
var proxySwitch = {};
var proxyStuff = {}
//Handle messages from master process sent via IPC
process.on('message', function(message) {
switch(message.type){
case 'blocknotify':
var pool = pools[message.coin.toLowerCase()]
if (pool) pool.processBlockNotify(message.hash)
case 'banIP':
for (var p in pools){
if (pools[p].stratumServer)
pools[p].stratumServer.addBannedIP(message.ip);
}
break;
case 'blocknotify':
var messageCoin = message.coin.toLowerCase();
var poolTarget = Object.keys(pools).filter(function(p){
return p.toLowerCase() === messageCoin;
})[0];
if (poolTarget)
pools[poolTarget].processBlockNotify(message.hash, 'blocknotify script');
break;
// IPC message for pool switching
case 'switch':
var newCoinPool = pools[message.coin.toLowerCase()];
if (newCoinPool) {
var oldPool = pools[proxyStuff.curActivePool];
var logSystem = 'Proxy';
var logComponent = 'Switch';
var logSubCat = 'Thread ' + (parseInt(forkId) + 1);
var messageCoin = message.coin.toLowerCase();
var newCoin = Object.keys(pools).filter(function(p){
return p.toLowerCase() === messageCoin;
})[0];
if (!newCoin){
logger.debug(logSystem, logComponent, logSubCat, 'Switch message to coin that is not recognized: ' + messageCoin);
break;
}
var algo = poolConfigs[newCoin].coin.algorithm;
var newPool = pools[newCoin];
var oldCoin = proxySwitch[algo].currentPool;
var oldPool = pools[oldCoin];
var proxyPort = proxySwitch[algo].port;
if (newCoin == oldCoin) {
logger.debug(logSystem, logComponent, logSubCat, 'Switch message would have no effect - ignoring ' + newCoin);
break;
}
logger.debug(logSystem, logComponent, logSubCat, 'Proxy message for ' + algo + ' from ' + oldCoin + ' to ' + newCoin);
if (newPool) {
oldPool.relinquishMiners(
function (miner, cback) {
// relinquish miners that are attached to one of the "Auto-switch" ports and leave the others there.
cback(typeof(portalConfig.proxy.ports[miner.client.socket.localPort]) !== 'undefined')
cback(miner.client.socket.localPort == proxyPort)
},
function (clients) {
newCoinPool.attachMiners(clients);
proxyStuff.curActivePool = message.coin.toLowerCase();
newPool.attachMiners(clients);
}
)
);
proxySwitch[algo].currentPool = newCoin;
var redisClient = redis.createClient(portalConfig.redis.port, portalConfig.redis.host)
redisClient.on('ready', function(){
redisClient.hset('proxyState', algo, newCoin, function(error, obj) {
if (error) {
logger.error(logSystem, logComponent, logSubCat, 'Redis error writing proxy config: ' + JSON.stringify(error))
}
else {
logger.debug(logSystem, logComponent, logSubCat, 'Last proxy state saved to redis for ' + algo);
}
});
});
}
break;
}
@ -51,19 +103,9 @@ module.exports = function(logger){
var poolOptions = poolConfigs[coin];
var logIdentify = 'Pool Fork ' + forkId + ' (' + coin + ')';
var poolLogger = {
debug: function(key, text){
logger.logDebug(logIdentify, key, text);
},
warning: function(key, text){
logger.logWarning(logIdentify, key, text);
},
error: function(key, text){
logger.logError(logIdentify, key, text);
}
};
var logSystem = 'Pool';
var logComponent = coin;
var logSubCat = 'Thread ' + (parseInt(forkId) + 1);
var handlers = {
auth: function(){},
@ -74,8 +116,8 @@ module.exports = function(logger){
var shareProcessing = poolOptions.shareProcessing;
//Functions required for MPOS compatibility
if (shareProcessing.mpos && shareProcessing.mpos.enabled){
var mposCompat = new MposCompatibility(poolLogger, poolOptions)
if (shareProcessing && shareProcessing.mpos && shareProcessing.mpos.enabled){
var mposCompat = new MposCompatibility(logger, poolOptions);
handlers.auth = function(workerName, password, authCallback){
mposCompat.handleAuth(workerName, password, authCallback);
@ -91,15 +133,19 @@ module.exports = function(logger){
}
//Functions required for internal payment processing
else if (shareProcessing.internal && shareProcessing.internal.enabled){
else if (shareProcessing && shareProcessing.internal && shareProcessing.internal.enabled){
var shareProcessor = new ShareProcessor(poolLogger, poolOptions)
var shareProcessor = new ShareProcessor(logger, poolOptions);
handlers.auth = function(workerName, password, authCallback){
pool.daemon.cmd('validateaddress', [workerName], function(results){
var isValid = results.filter(function(r){return r.response.isvalid}).length > 0;
authCallback(isValid);
});
if (shareProcessing.internal.validateWorkerAddress !== true)
authCallback(true);
else {
pool.daemon.cmd('validateaddress', [workerName], function(results){
var isValid = results.filter(function(r){return r.response.isvalid}).length > 0;
authCallback(isValid);
});
}
};
handlers.share = function(isValidShare, isValidBlock, data){
@ -112,7 +158,7 @@ module.exports = function(logger){
var authString = authorized ? 'Authorized' : 'Unauthorized ';
poolLogger.debug('client', authString + ' [' + ip + '] ' + workerName + ':' + password);
logger.debug(logSystem, logComponent, logSubCat, authString + ' ' + workerName + ':' + password + ' [' + ip + ']');
callback({
error: null,
authorized: authorized,
@ -122,65 +168,131 @@ module.exports = function(logger){
};
var pool = Stratum.createPool(poolOptions, authorizeFN);
var pool = Stratum.createPool(poolOptions, authorizeFN, logger);
pool.on('share', function(isValidShare, isValidBlock, data){
var shareData = JSON.stringify(data);
if (data.solution && !isValidBlock)
poolLogger.debug('client', 'We thought a block solution was found but it was rejected by the daemon, share data: ' + shareData);
if (data.blockHash && !isValidBlock)
logger.debug(logSystem, logComponent, logSubCat, 'We thought a block was found but it was rejected by the daemon, share data: ' + shareData);
else if (isValidBlock)
poolLogger.debug('client', 'Block found, solution: ' + data.solution);
logger.debug(logSystem, logComponent, logSubCat, 'Block found: ' + data.blockHash);
if (isValidShare)
poolLogger.debug('client', 'Valid share submitted, share data: ' + shareData);
else if (!isValidShare)
poolLogger.debug('client', 'Invalid share submitted, share data: ' + shareData)
logger.debug(logSystem, logComponent, logSubCat, 'Share accepted at diff ' + data.difficulty + '/' + data.shareDiff + ' by ' + data.worker + ' [' + data.ip + ']' );
else if (!isValidShare)
logger.debug(logSystem, logComponent, logSubCat, 'Share rejected: ' + shareData);
handlers.share(isValidShare, isValidBlock, data)
}).on('difficultyUpdate', function(workerName, diff){
logger.debug(logSystem, logComponent, logSubCat, 'Difficulty update to diff ' + diff + ' workerName=' + JSON.stringify(workerName));
handlers.diff(workerName, diff);
}).on('log', function(severity, logKey, logText) {
if (severity == 'debug') {
poolLogger.debug(logKey, logText);
} else if (severity == 'warning') {
poolLogger.warning(logKey, logText);
} else if (severity == 'error') {
poolLogger.error(logKey, logText);
}
}).on('log', function(severity, text) {
logger[severity](logSystem, logComponent, logSubCat, text);
}).on('banIP', function(ip, worker){
process.send({type: 'banIP', ip: ip});
});
pool.start();
pools[poolOptions.coin.name.toLowerCase()] = pool;
pools[poolOptions.coin.name] = pool;
});
-if (typeof(portalConfig.proxy) !== 'undefined' && portalConfig.proxy.enabled === true) {
-    proxyStuff.curActivePool = Object.keys(pools)[0];
-    proxyStuff.proxys = {};
-    proxyStuff.varDiffs = {};
-    Object.keys(portalConfig.proxy.ports).forEach(function(port) {
-        proxyStuff.varDiffs[port] = new Vardiff(port, portalConfig.proxy.ports[port].varDiff);
-    });
-    Object.keys(pools).forEach(function (coinName) {
-        var p = pools[coinName];
-        Object.keys(proxyStuff.varDiffs).forEach(function(port) {
-            p.setVarDiff(port, proxyStuff.varDiffs[port]);
+if (typeof(portalConfig.proxy) !== 'undefined') {
+
+    var logSystem = 'Proxy';
+    var logComponent = 'Setup';
+    var logSubCat = 'Thread ' + (parseInt(forkId) + 1);
+
+    var proxyState = {};
+
+    //
+    // Load the proxy state for each algorithm from redis, which allows NOMP to resume operation
+    // on the last pool it was using when reloaded or restarted
+    //
+    logger.debug(logSystem, logComponent, logSubCat, 'Loading last proxy state from redis');
+
+    var redisClient = redis.createClient(portalConfig.redis.port, portalConfig.redis.host);
+    redisClient.on('ready', function(){
+        redisClient.hgetall("proxyState", function(error, obj) {
+            if (error || obj == null) {
+                logger.debug(logSystem, logComponent, logSubCat, 'No last proxy state found in redis');
+            }
+            else {
+                proxyState = obj;
+                logger.debug(logSystem, logComponent, logSubCat, 'Last proxy state loaded from redis');
+            }
+
+            //
+            // Set up the proxySwitch object to control proxy operations from the configuration and any
+            // restored state. Each algorithm has a listening port, a current coin name, and an active pool
+            // to which traffic is directed when activated in the config.
+            //
+            // In addition, the proxy config also takes diff and varDiff parameters that override the
+            // defaults from the coin's standard config.
+            //
+            Object.keys(portalConfig.proxy).forEach(function(algorithm) {
+
+                if (portalConfig.proxy[algorithm].enabled === true) {
+                    var initalPool = proxyState.hasOwnProperty(algorithm) ? proxyState[algorithm] : _this.getFirstPoolForAlgorithm(algorithm);
+                    proxySwitch[algorithm] = {
+                        port: portalConfig.proxy[algorithm].port,
+                        currentPool: initalPool,
+                        proxy: {}
+                    };
+
+                    // Copy the diff and varDiff configuration into pools that match our algorithm so the
+                    // stratum server can pick them up.
+                    //
+                    // Note: this seems a bit wonky and brittle - it would be better if the proxy simply used
+                    // the diff config of the port it was routed into instead.
+                    //
+                    if (portalConfig.proxy[algorithm].hasOwnProperty('varDiff')) {
+                        proxySwitch[algorithm].varDiff = new Stratum.varDiff(proxySwitch[algorithm].port, portalConfig.proxy[algorithm].varDiff);
+                        proxySwitch[algorithm].diff = portalConfig.proxy[algorithm].diff;
+                    }
+
+                    Object.keys(pools).forEach(function (coinName) {
+                        var a = poolConfigs[coinName].coin.algorithm;
+                        var p = pools[coinName];
+                        if (a === algorithm) {
+                            p.setVarDiff(proxySwitch[algorithm].port, proxySwitch[algorithm].varDiff);
+                        }
+                    });
+
+                    proxySwitch[algorithm].proxy = net.createServer(function(socket) {
+                        var currentPool = proxySwitch[algorithm].currentPool;
+                        var logSubCat = 'Thread ' + (parseInt(forkId) + 1);
+
+                        logger.debug(logSystem, 'Connect', logSubCat, 'Proxy connect from ' + socket.remoteAddress + ' on ' + proxySwitch[algorithm].port
+                            + ' routing to ' + currentPool);
+                        pools[currentPool].getStratumServer().handleNewClient(socket);
+
+                    }).listen(parseInt(proxySwitch[algorithm].port), function() {
+                        logger.debug(logSystem, logComponent, logSubCat, 'Proxy listening for ' + algorithm + ' on port ' + proxySwitch[algorithm].port
+                            + ' into ' + proxySwitch[algorithm].currentPool);
+                    });
+                }
+                else {
+                    logger.debug(logSystem, logComponent, logSubCat, 'Proxy pool for ' + algorithm + ' disabled.');
+                }
+            });
+        });
+    }).on('error', function(err){
+        logger.debug(logSystem, logComponent, logSubCat, 'Pool configuration failed: ' + err);
+    });
-    Object.keys(portalConfig.proxy.ports).forEach(function (port) {
-        proxyStuff.proxys[port] = net.createServer({allowHalfOpen: true}, function(socket) {
-            console.log(proxyStuff.curActivePool);
-            pools[proxyStuff.curActivePool].getStratumServer().handleNewClient(socket);
-        }).listen(parseInt(port), function(){
-            console.log("Proxy listening on " + port);
-        });
-    });
}
};

+this.getFirstPoolForAlgorithm = function(algorithm) {
+    var foundCoin = "";
+    Object.keys(poolConfigs).forEach(function(coinName) {
+        if (poolConfigs[coinName].coin.algorithm == algorithm) {
+            if (foundCoin === "")
+                foundCoin = coinName;
+        }
+    });
+    return foundCoin;
+};

};

542
libs/profitSwitch.js Normal file
View File

@ -0,0 +1,542 @@
var async = require('async');
var net = require('net');
var bignum = require('bignum');
var algos = require('stratum-pool/lib/algoProperties.js');
var util = require('stratum-pool/lib/util.js');
var Cryptsy = require('./apiCryptsy.js');
var Poloniex = require('./apiPoloniex.js');
var Mintpal = require('./apiMintpal.js');
var Stratum = require('stratum-pool');
module.exports = function(logger){
var _this = this;
var portalConfig = JSON.parse(process.env.portalConfig);
var poolConfigs = JSON.parse(process.env.pools);
var logSystem = 'Profit';
//
// build status tracker for collecting coin market information
//
var profitStatus = {};
var symbolToAlgorithmMap = {};
Object.keys(poolConfigs).forEach(function(coin){
var poolConfig = poolConfigs[coin];
var algo = poolConfig.coin.algorithm;
if (!profitStatus.hasOwnProperty(algo)) {
profitStatus[algo] = {};
}
var coinStatus = {
name: poolConfig.coin.name,
symbol: poolConfig.coin.symbol,
difficulty: 0,
reward: 0,
exchangeInfo: {}
};
profitStatus[algo][poolConfig.coin.symbol] = coinStatus;
symbolToAlgorithmMap[poolConfig.coin.symbol] = algo;
});
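/* Illustrative shape of the tracker built above (coin names and symbols here are hypothetical):
   profitStatus = { scrypt: { LTC:  { name: 'litecoin', symbol: 'LTC',  difficulty: 0, reward: 0, exchangeInfo: {} },
                              DOGE: { name: 'dogecoin', symbol: 'DOGE', difficulty: 0, reward: 0, exchangeInfo: {} } } }
   symbolToAlgorithmMap = { LTC: 'scrypt', DOGE: 'scrypt' } */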
//
// ensure we have something to switch
//
Object.keys(profitStatus).forEach(function(algo){
if (Object.keys(profitStatus[algo]).length <= 1) {
delete profitStatus[algo];
Object.keys(symbolToAlgorithmMap).forEach(function(symbol){
if (symbolToAlgorithmMap[symbol] === algo)
delete symbolToAlgorithmMap[symbol];
});
}
});
if (Object.keys(profitStatus).length == 0){
logger.debug(logSystem, 'Config', 'No alternative coins to switch to in current config, switching disabled.');
return;
}
//
// setup APIs
//
var poloApi = new Poloniex(
// 'API_KEY',
// 'API_SECRET'
);
var cryptsyApi = new Cryptsy(
// 'API_KEY',
// 'API_SECRET'
);
var mintpalApi = new Mintpal(
// 'API_KEY',
// 'API_SECRET'
);
//
// market data collection from Poloniex
//
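// The Poloniex public ticker consumed below is assumed to return one entry per market pair,
// keyed like 'BTC_LTC', each carrying the lowestAsk, highestBid, last, baseVolume and quoteVolume
// fields read here. Hypothetical example:
//   { "BTC_LTC": { "last": "0.0251", "lowestAsk": "0.0252", "highestBid": "0.0250",
//                  "baseVolume": "123.4", "quoteVolume": "4901.2" }, ... }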
this.getProfitDataPoloniex = function(callback){
async.series([
function(taskCallback){
poloApi.getTicker(function(err, data){
if (err){
taskCallback(err);
return;
}
Object.keys(symbolToAlgorithmMap).forEach(function(symbol){
var exchangeInfo = profitStatus[symbolToAlgorithmMap[symbol]][symbol].exchangeInfo;
if (!exchangeInfo.hasOwnProperty('Poloniex'))
exchangeInfo['Poloniex'] = {};
var marketData = exchangeInfo['Poloniex'];
if (data.hasOwnProperty('BTC_' + symbol)) {
if (!marketData.hasOwnProperty('BTC'))
marketData['BTC'] = {};
var btcData = data['BTC_' + symbol];
marketData['BTC'].ask = new Number(btcData.lowestAsk);
marketData['BTC'].bid = new Number(btcData.highestBid);
marketData['BTC'].last = new Number(btcData.last);
marketData['BTC'].baseVolume = new Number(btcData.baseVolume);
marketData['BTC'].quoteVolume = new Number(btcData.quoteVolume);
}
if (data.hasOwnProperty('LTC_' + symbol)) {
if (!marketData.hasOwnProperty('LTC'))
marketData['LTC'] = {};
var ltcData = data['LTC_' + symbol];
marketData['LTC'].ask = new Number(ltcData.lowestAsk);
marketData['LTC'].bid = new Number(ltcData.highestBid);
marketData['LTC'].last = new Number(ltcData.last);
marketData['LTC'].baseVolume = new Number(ltcData.baseVolume);
marketData['LTC'].quoteVolume = new Number(ltcData.quoteVolume);
}
// save LTC to BTC exchange rate
if (marketData.hasOwnProperty('LTC') && data.hasOwnProperty('BTC_LTC')) {
var btcLtc = data['BTC_LTC'];
marketData['LTC'].ltcToBtc = new Number(btcLtc.highestBid);
}
});
taskCallback();
});
},
function(taskCallback){
var depthTasks = [];
Object.keys(symbolToAlgorithmMap).forEach(function(symbol){
var marketData = profitStatus[symbolToAlgorithmMap[symbol]][symbol].exchangeInfo['Poloniex'];
if (marketData.hasOwnProperty('BTC') && marketData['BTC'].bid > 0){
depthTasks.push(function(callback){
_this.getMarketDepthFromPoloniex('BTC', symbol, marketData['BTC'].bid, callback)
});
}
if (marketData.hasOwnProperty('LTC') && marketData['LTC'].bid > 0){
depthTasks.push(function(callback){
_this.getMarketDepthFromPoloniex('LTC', symbol, marketData['LTC'].bid, callback)
});
}
});
if (!depthTasks.length){
taskCallback();
return;
}
async.series(depthTasks, function(err){
if (err){
taskCallback(err);
return;
}
taskCallback();
});
}
], function(err){
if (err){
callback(err);
return;
}
callback(null);
});
};
this.getMarketDepthFromPoloniex = function(symbolA, symbolB, coinPrice, callback){
poloApi.getOrderBook(symbolA, symbolB, function(err, data){
if (err){
callback(err);
return;
}
var depth = new Number(0);
var totalQty = new Number(0);
if (data.hasOwnProperty('bids')){
data['bids'].forEach(function(order){
var price = new Number(order[0]);
var limit = new Number(coinPrice * portalConfig.profitSwitch.depth);
var qty = new Number(order[1]);
// only measure the depth down to configured depth
if (price >= limit){
depth += (qty * price);
totalQty += qty;
}
});
}
var marketData = profitStatus[symbolToAlgorithmMap[symbolB]][symbolB].exchangeInfo['Poloniex'];
marketData[symbolA].depth = depth;
if (totalQty > 0)
marketData[symbolA].weightedBid = new Number(depth / totalQty);
callback();
});
};
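/* The depth-weighted bid above only counts bids priced at or above coinPrice * configured depth.
   Worked example with made-up numbers and a depth factor of 0.9: a best bid of 0.010 gives a
   cut-off of 0.009; bids of 0.0100 x 100 coins and 0.0095 x 50 coins both qualify, so
   depth = 1.475, totalQty = 150 and weightedBid = 1.475 / 150 ≈ 0.00983. */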
this.getProfitDataCryptsy = function(callback){
async.series([
function(taskCallback){
cryptsyApi.getTicker(function(err, data){
if (err || data.success != 1){
taskCallback(err);
return;
}
Object.keys(symbolToAlgorithmMap).forEach(function(symbol){
var exchangeInfo = profitStatus[symbolToAlgorithmMap[symbol]][symbol].exchangeInfo;
if (!exchangeInfo.hasOwnProperty('Cryptsy'))
exchangeInfo['Cryptsy'] = {};
var marketData = exchangeInfo['Cryptsy'];
var results = data.return.markets;
if (results && results.hasOwnProperty(symbol + '/BTC')) {
if (!marketData.hasOwnProperty('BTC'))
marketData['BTC'] = {};
var btcData = results[symbol + '/BTC'];
marketData['BTC'].last = new Number(btcData.lasttradeprice);
marketData['BTC'].baseVolume = new Number(marketData['BTC'].last / btcData.volume);
marketData['BTC'].quoteVolume = new Number(btcData.volume);
if (btcData.sellorders != null)
marketData['BTC'].ask = new Number(btcData.sellorders[0].price);
if (btcData.buyorders != null) {
marketData['BTC'].bid = new Number(btcData.buyorders[0].price);
var limit = new Number(marketData['BTC'].bid * portalConfig.profitSwitch.depth);
var depth = new Number(0);
var totalQty = new Number(0);
btcData['buyorders'].forEach(function(order){
var price = new Number(order.price);
var qty = new Number(order.quantity);
if (price >= limit){
depth += (qty * price);
totalQty += qty;
}
});
marketData['BTC'].depth = depth;
if (totalQty > 0)
marketData['BTC'].weightedBid = new Number(depth / totalQty);
}
}
if (results && results.hasOwnProperty(symbol + '/LTC')) {
if (!marketData.hasOwnProperty('LTC'))
marketData['LTC'] = {};
var ltcData = results[symbol + '/LTC'];
marketData['LTC'].last = new Number(ltcData.lasttradeprice);
marketData['LTC'].baseVolume = new Number(marketData['LTC'].last / ltcData.volume);
marketData['LTC'].quoteVolume = new Number(ltcData.volume);
if (ltcData.sellorders != null)
marketData['LTC'].ask = new Number(ltcData.sellorders[0].price);
if (ltcData.buyorders != null) {
marketData['LTC'].bid = new Number(ltcData.buyorders[0].price);
var limit = new Number(marketData['LTC'].bid * portalConfig.profitSwitch.depth);
var depth = new Number(0);
var totalQty = new Number(0);
ltcData['buyorders'].forEach(function(order){
var price = new Number(order.price);
var qty = new Number(order.quantity);
if (price >= limit){
depth += (qty * price);
totalQty += qty;
}
});
marketData['LTC'].depth = depth;
if (totalQty > 0)
marketData['LTC'].weightedBid = new Number(depth / totalQty);
}
}
});
taskCallback();
});
}
], function(err){
if (err){
callback(err);
return;
}
callback(null);
});
};
this.getProfitDataMintpal = function(callback){
async.series([
function(taskCallback){
mintpalApi.getTicker(function(err, response){
if (err || !response.data){
taskCallback(err);
return;
}
Object.keys(symbolToAlgorithmMap).forEach(function(symbol){
response.data.forEach(function(market){
var exchangeInfo = profitStatus[symbolToAlgorithmMap[symbol]][symbol].exchangeInfo;
if (!exchangeInfo.hasOwnProperty('Mintpal'))
exchangeInfo['Mintpal'] = {};
var marketData = exchangeInfo['Mintpal'];
if (market.exchange == 'BTC' && market.code == symbol) {
if (!marketData.hasOwnProperty('BTC'))
marketData['BTC'] = {};
marketData['BTC'].last = new Number(market.last_price);
marketData['BTC'].baseVolume = new Number(market['24hvol']);
marketData['BTC'].quoteVolume = new Number(market['24hvol'] / market.last_price);
marketData['BTC'].ask = new Number(market.top_ask);
marketData['BTC'].bid = new Number(market.top_bid);
}
if (market.exchange == 'LTC' && market.code == symbol) {
if (!marketData.hasOwnProperty('LTC'))
marketData['LTC'] = {};
marketData['LTC'].last = new Number(market.last_price);
marketData['LTC'].baseVolume = new Number(market['24hvol']);
marketData['LTC'].quoteVolume = new Number(market['24hvol'] / market.last_price);
marketData['LTC'].ask = new Number(market.top_ask);
marketData['LTC'].bid = new Number(market.top_bid);
}
});
});
taskCallback();
});
},
function(taskCallback){
var depthTasks = [];
Object.keys(symbolToAlgorithmMap).forEach(function(symbol){
var marketData = profitStatus[symbolToAlgorithmMap[symbol]][symbol].exchangeInfo['Mintpal'];
if (marketData.hasOwnProperty('BTC') && marketData['BTC'].bid > 0){
depthTasks.push(function(callback){
_this.getMarketDepthFromMintpal('BTC', symbol, marketData['BTC'].bid, callback)
});
}
if (marketData.hasOwnProperty('LTC') && marketData['LTC'].bid > 0){
depthTasks.push(function(callback){
_this.getMarketDepthFromMintpal('LTC', symbol, marketData['LTC'].bid, callback)
});
}
});
if (!depthTasks.length){
taskCallback();
return;
}
async.series(depthTasks, function(err){
if (err){
taskCallback(err);
return;
}
taskCallback();
});
}
], function(err){
if (err){
callback(err);
return;
}
callback(null);
});
};
this.getMarketDepthFromMintpal = function(symbolA, symbolB, coinPrice, callback){
mintpalApi.getBuyOrderBook(symbolA, symbolB, function(err, response){
if (err){
callback(err);
return;
}
var depth = new Number(0);
if (response.hasOwnProperty('data')){
var totalQty = new Number(0);
response['data'].forEach(function(order){
var price = new Number(order.price);
var limit = new Number(coinPrice * portalConfig.profitSwitch.depth);
var qty = new Number(order.amount);
// only measure the depth down to configured depth
if (price >= limit){
depth += (qty * price);
totalQty += qty;
}
});
}
var marketData = profitStatus[symbolToAlgorithmMap[symbolB]][symbolB].exchangeInfo['Mintpal'];
marketData[symbolA].depth = depth;
if (totalQty > 0)
marketData[symbolA].weightedBid = new Number(depth / totalQty);
callback();
});
};
this.getCoindDaemonInfo = function(callback){
var daemonTasks = [];
Object.keys(profitStatus).forEach(function(algo){
Object.keys(profitStatus[algo]).forEach(function(symbol){
var coinName = profitStatus[algo][symbol].name;
var poolConfig = poolConfigs[coinName];
var daemonConfig = poolConfig.shareProcessing.internal.daemon;
daemonTasks.push(function(callback){
_this.getDaemonInfoForCoin(symbol, daemonConfig, callback)
});
});
});
if (daemonTasks.length == 0){
callback();
return;
}
async.series(daemonTasks, function(err){
if (err){
callback(err);
return;
}
callback(null);
});
};
this.getDaemonInfoForCoin = function(symbol, cfg, callback){
var daemon = new Stratum.daemon.interface([cfg]);
daemon.once('online', function(){
daemon.cmd('getblocktemplate', [{"capabilities": [ "coinbasetxn", "workid", "coinbase/append" ]}], function(result){
if (result[0].error != null){
logger.error(logSystem, symbol, 'Error while reading daemon info: ' + JSON.stringify(result[0]));
callback(null); // fail gracefully for each coin
return;
}
var coinStatus = profitStatus[symbolToAlgorithmMap[symbol]][symbol];
var response = result[0].response;
// some coins don't provide a target, only bits, so we need to handle both
var target = response.target ? bignum(response.target, 16) : util.bignumFromBitsHex(response.bits);
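// diff1 is assumed to be the global difficulty-1 target constant set up when stratum-pool's
// algoProperties.js is required above; pool difficulty is then diff1 / target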
coinStatus.difficulty = parseFloat((diff1 / target.toNumber()).toFixed(9));
logger.debug(logSystem, symbol, 'difficulty is ' + coinStatus.difficulty);
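// coinbasevalue from getblocktemplate is in the coin's smallest unit (satoshis), hence the division by 1e8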
coinStatus.reward = new Number(response.coinbasevalue / 100000000);
callback(null);
});
}).once('connectionFailed', function(error){
logger.error(logSystem, symbol, JSON.stringify(error));
callback(null); // fail gracefully for each coin
}).on('error', function(error){
logger.error(logSystem, symbol, JSON.stringify(error));
callback(null); // fail gracefully for each coin
}).init();
};
this.getMiningRate = function(callback){
var daemonTasks = [];
Object.keys(profitStatus).forEach(function(algo){
Object.keys(profitStatus[algo]).forEach(function(symbol){
var coinStatus = profitStatus[symbolToAlgorithmMap[symbol]][symbol];
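// Note: despite the 'PerHour' names, the 86400 factor below makes these per-day figures -
// expected blocks (and coins) per day at 1 MH/s - matching the 'BTC/day per Mh/s' log lines later on.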
coinStatus.blocksPerMhPerHour = new Number(86400 / ((coinStatus.difficulty * Math.pow(2,32)) / (1 * 1000 * 1000)));
coinStatus.coinsPerMhPerHour = new Number(coinStatus.reward * coinStatus.blocksPerMhPerHour);
});
});
callback(null);
};
this.switchToMostProfitableCoins = function() {
Object.keys(profitStatus).forEach(function(algo) {
var algoStatus = profitStatus[algo];
var bestExchange;
var bestCoin;
var bestBtcPerMhPerHour = new Number(0);
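// Profitability metric: coins mined per day at 1 MH/s multiplied by the depth-weighted bid,
// expressed in BTC (LTC-quoted markets are converted via the BTC_LTC rate below).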
Object.keys(profitStatus[algo]).forEach(function(symbol) {
var coinStatus = profitStatus[algo][symbol];
Object.keys(coinStatus.exchangeInfo).forEach(function(exchange){
var exchangeData = coinStatus.exchangeInfo[exchange];
if (exchangeData.hasOwnProperty('BTC') && exchangeData['BTC'].hasOwnProperty('weightedBid')){
var btcPerMhPerHour = new Number(exchangeData['BTC'].weightedBid * coinStatus.coinsPerMhPerHour);
if (btcPerMhPerHour > bestBtcPerMhPerHour){
bestBtcPerMhPerHour = btcPerMhPerHour;
bestExchange = exchange;
bestCoin = profitStatus[algo][symbol].name;
}
coinStatus.btcPerMhPerHour = btcPerMhPerHour;
logger.debug(logSystem, 'CALC', 'BTC/' + symbol + ' on ' + exchange + ' with ' + coinStatus.btcPerMhPerHour.toFixed(8) + ' BTC/day per Mh/s');
}
if (exchangeData.hasOwnProperty('LTC') && exchangeData['LTC'].hasOwnProperty('weightedBid')){
var btcPerMhPerHour = new Number((exchangeData['LTC'].weightedBid * coinStatus.coinsPerMhPerHour) * exchangeData['LTC'].ltcToBtc);
if (btcPerMhPerHour > bestBtcPerMhPerHour){
bestBtcPerMhPerHour = btcPerMhPerHour;
bestExchange = exchange;
bestCoin = profitStatus[algo][symbol].name;
}
coinStatus.btcPerMhPerHour = btcPerMhPerHour;
logger.debug(logSystem, 'CALC', 'LTC/' + symbol + ' on ' + exchange + ' with ' + coinStatus.btcPerMhPerHour.toFixed(8) + ' BTC/day per Mh/s');
}
});
});
logger.debug(logSystem, 'RESULT', 'Best coin for ' + algo + ' is ' + bestCoin + ' on ' + bestExchange + ' with ' + bestBtcPerMhPerHour.toFixed(8) + ' BTC/day per Mh/s');
if (portalConfig.coinSwitchListener.enabled){
var client = net.connect(portalConfig.coinSwitchListener.port, portalConfig.coinSwitchListener.host, function () {
client.write(JSON.stringify({
password: portalConfig.coinSwitchListener.password,
coin: bestCoin
}) + '\n');
});
}
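// The switch request is a single JSON line over TCP, e.g. (password comes from config, the coin
// name here is hypothetical): {"password":"...","coin":"litecoin"}
// It is presumably consumed by the portal's coin switch listener, which repoints the proxy.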
});
};
var checkProfitability = function(){
logger.debug(logSystem, 'Check', 'Collecting profitability data.');
var profitabilityTasks = [];
if (portalConfig.profitSwitch.usePoloniex)
profitabilityTasks.push(_this.getProfitDataPoloniex);
if (portalConfig.profitSwitch.useCryptsy)
profitabilityTasks.push(_this.getProfitDataCryptsy);
if (portalConfig.profitSwitch.useMintpal)
profitabilityTasks.push(_this.getProfitDataMintpal);
profitabilityTasks.push(_this.getCoindDaemonInfo);
profitabilityTasks.push(_this.getMiningRate);
// these have to run in series - later steps depend on the market and daemon data gathered earlier
async.series(profitabilityTasks, function(err){
if (err){
logger.error(logSystem, 'Check', 'Error while checking profitability: ' + err);
return;
}
//
// TODO offer support for a userConfigurable function for deciding on coin to override the default
//
_this.switchToMostProfitableCoins();
});
};
setInterval(checkProfitability, portalConfig.profitSwitch.updateInterval * 1000);
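// A matching portal config section is assumed, roughly (the field names are the ones read in this
// module; the example values are placeholders):
//   "profitSwitch":       { "updateInterval": 600, "depth": 0.90,
//                           "usePoloniex": true, "useCryptsy": true, "useMintpal": true },
//   "coinSwitchListener": { "enabled": true, "host": "127.0.0.1", "port": 8001, "password": "..." }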
};

libs/shareProcessor.js
View File

@ -20,45 +20,39 @@ module.exports = function(logger, poolConfig){
var redisConfig = internalConfig.redis;
var coin = poolConfig.coin.name;

-var connection;
+var forkId = process.env.forkId;
+var logSystem = 'Pool';
+var logComponent = coin;
+var logSubCat = 'Thread ' + (parseInt(forkId) + 1);

-function connect(){
+var connection = redis.createClient(redisConfig.port, redisConfig.host);

-    var reconnectTimeout;
-    connection = redis.createClient(redisConfig.port, redisConfig.host);
-    connection.on('ready', function(){
-        clearTimeout(reconnectTimeout);
-        logger.debug('redis', 'Successfully connected to redis database');
-    });
-    connection.on('error', function(err){
-        logger.error('redis', 'Redis client had an error: ' + JSON.stringify(err))
-    });
-    connection.on('end', function(){
-        logger.error('redis', 'Connection to redis database has been ended');
-        logger.warning('redis', 'Trying reconnection in 3 seconds...');
-        reconnectTimeout = setTimeout(function(){
-            connect();
-        }, 3000);
-    });
-}
-connect();
+connection.on('ready', function(){
+    logger.debug(logSystem, logComponent, logSubCat, 'Share processing setup with redis (' + redisConfig.host +
+        ':' + redisConfig.port + ')');
+});
+connection.on('error', function(err){
+    logger.error(logSystem, logComponent, logSubCat, 'Redis client had an error: ' + JSON.stringify(err))
+});
+connection.on('end', function(){
+    logger.error(logSystem, logComponent, logSubCat, 'Connection to redis database has been ended');
+});

this.handleShare = function(isValidShare, isValidBlock, shareData){

    var redisCommands = [];

    if (isValidShare){
-       redisCommands.push(['hincrby', coin + '_shares:roundCurrent', shareData.worker, shareData.difficulty]);
+       redisCommands.push(['hincrbyfloat', coin + '_shares:roundCurrent', shareData.worker, shareData.difficulty]);
        redisCommands.push(['hincrby', coin + '_stats', 'validShares', 1]);

        /* Stores share diff, worker, and unique value with a score that is the timestamp. Unique value ensures it
           doesn't overwrite an existing entry, and timestamp as score lets us query shares from last X minutes to
           generate hashrate for each worker and pool. */
-       redisCommands.push(['zadd', coin + '_hashrate', Date.now() / 1000 | 0, [shareData.difficulty, shareData.worker, Math.random()].join(':')]);
+       var dateNow = Date.now();
+       redisCommands.push(['zadd', coin + '_hashrate', dateNow / 1000 | 0, [shareData.difficulty, shareData.worker, dateNow].join(':')]);
    }
    else{
        redisCommands.push(['hincrby', coin + '_stats', 'invalidShares', 1]);

@ -66,19 +60,21 @@ module.exports = function(logger, poolConfig){

    if (isValidBlock){
        redisCommands.push(['rename', coin + '_shares:roundCurrent', coin + '_shares:round' + shareData.height]);
-       redisCommands.push(['sadd', coin + '_blocksPending', shareData.tx + ':' + shareData.height + ':' + shareData.reward]);
+       redisCommands.push(['sadd', coin + '_blocksPending', [shareData.blockHash, shareData.txHash, shareData.height, shareData.reward].join(':')]);
        redisCommands.push(['hincrby', coin + '_stats', 'validBlocks', 1]);
    }
-   else if (shareData.solution){
+   else if (shareData.blockHash){
        redisCommands.push(['hincrby', coin + '_stats', 'invalidBlocks', 1]);
    }

    connection.multi(redisCommands).exec(function(err, replies){
        if (err)
-           logger.error('redis', 'error with share processor multi ' + JSON.stringify(err));
+           logger.error(logSystem, logComponent, logSubCat, 'Error with share processor multi ' + JSON.stringify(err));
+       //else
+           //logger.debug(logSystem, logComponent, logSubCat, 'Share data and stats recorded');
    });
};
};

Some files were not shown because too many files have changed in this diff.