Trading 101: Volume Indicators Explained

The importance of trading volume to the crypto ecosystem explained!


How to Trade Big Crypto Volumes, Explained


Coinsquare Accused of Market Manipulation by Canadian Regulator


Chainlink analysis - my thoughts and research

Necessary Disclaimer: no rule breaking intended. No price manipulation intended. I only want to share verifiable facts/links and my analysis. If I am doing anything against the rules please let me know and I will do my best to fix it ASAP. I trade crypto, including LINK, and I am currently short on LINK. This is not financial advice; this is just for my own record and to start a discussion for anyone who might want more transparency around LINK.

TL;DR:

I believe there is a lot of misinformation, uncertainty, and unanswered questions around the LINK token, the Chainlink ecosystem, and the SmartContract parent company. I also believe that LINK's current price is unjustified based on fundamental factors like usage, business case, current customers, and future potential. So I'm raising some points and asking some questions.
What is this post? Why should I care? How do I use it?
Read or skim it. It's about the LINK token, the Chainlink ecosystem, and the parent company SmartContract. It's about why I believe the price of the LINK token may be currently driven mostly by hype and not backed by standard market fundamentals like usage/economics.
Update 9 AUG: reorganizing, rewriting this post and moving supporting data/sources into "appendix" comments below on this post. The previous versions of this post and my comments elsewhere were too emotionally charged and caused more division rather than honest, evidence-based, productive discussion and I sincerely apologize for that. I have now rewritten it and will continue to update it.

PARTNERSHIPS

Who has Chainlink partnered with? Who is using Chainlink's technology and network? Who is contributing to developing Chainlink?
Google - this is the pinned tweet on Chainlink's official page. Nothing there about Google using Chainlink services or co-developing with them - just that blockchains/oracles CAN use Google Cloud services (APIs?). This is Google Cloud's June 13, 2019 blog post: https://cloud.google.com/blog/products/data-analytics/building-hybrid-blockchain-cloud-applications-with-ethereum-and-google-cloud
Oracle - (TODO. This seems to have potential, as some product manager at Oracle has posted that Chainlink integration is coming Q3/Q4 of 2020.)
SWIFT - the best they've got is a 30-second video with NOBODY from SWIFT present, showing a *hypothetical* use case using the SWIFT API.
Intel - this is the only Google result for "chainlink site:intel.com", and it casually mentions that Intel's TEE (trusted execution environment) technology can be used to improve the security of oracles/blockchains. Nothing about Intel themselves using or developing with Chainlink. https://software.intel.com/content/www/us/en/develop/articles/new-confidential-computing-solutions-emerge-on-the-hyperledger-avalon-trusted-compute.html
Another 240+ claimed project integrations:
[TODO] There are so many to keep track of. Every week, or even more frequently, there is yet another integration *announcement*.
Current DeFi usage: we've heard that Chainlink "secures" $1 billion in DeFi. But that's not in value locked: https://defipulse.com/ (LINK doesn't even appear on that list). That's just DeFi data supposedly being priced using Chainlink nodes.
EG Synthetix:
https://blog.synthetix.io/chainlink-decentralizes-first-wave-of-synthetix-price-feeds/ yet where does Synthetix actually PAY to use an oracle? Not visible on-chain, maybe someone will find it.
https://defipulse.com/blog/3-defi-dapps-starting-2020-off-strong/ "... Chainlink's following includes partnerships big and small, including Intel and Google Cloud services" example of misleading/exaggerated partnership claims being circulated.

Chainlink's ROADMAP

Threshold signatures, staking, on-chain SLAs:
How real are these, is there a roadmap, how will this benefit users, is there any evidence of users currently *wanting* to use chainlink but needing these features and actively waiting for Chainlink to launch these?
Staking: for there to be a valid incentive for users to stake LINK, it has to return around 5% annually, because anything substantially under that would have users putting their money elsewhere (https://www.stakingrewards.com/cryptoassets) (not counting speculative capital gains from LINK's price, but price gain per token/coin applies to all other crypto projects as well).
Currently, for stakable cryptos, around 30-80% of their total supply is staked, and a good adjusted reward is on the order of 5% as well (some actually negative, some 10%+). The promise of staking incentivises people to buy and hold more LINK tokens (again, many other crypto projects have staking already live). That 5% reward will ultimately have to come from the customers who pay Chainlink oracle nodes to use their services, so it's an extra 5% fee for them. Of course, in the near future, the staking rewards *could* be subsidized by the founders' reserve wallets.
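To make the scale of that obligation concrete, here's a minimal back-of-the-envelope sketch in Python. The 50% staked fraction is my assumption (picked from the middle of the 30-80% range above); the 5% yield target is the figure discussed:

```python
# Rough staking math - illustrative only; staked_fraction is an assumption.
TOTAL_SUPPLY = 1_000_000_000   # LINK tokens minted at genesis
staked_fraction = 0.5          # assumed: middle of the 30-80% range seen elsewhere
target_yield = 0.05            # ~5% annual, per stakingrewards.com averages

staked = TOTAL_SUPPLY * staked_fraction
annual_rewards = staked * target_yield
print(f"{staked:,.0f} LINK staked -> {annual_rewards:,.0f} LINK/year in rewards")
# 500,000,000 LINK staked -> 25,000,000 LINK/year that must ultimately come
# from oracle customers' fees (or be subsidized from the founders' reserve wallets)
```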
Threshold signatures: addressed below in a comment.
On-chain SLAs: [TODO]
Here's supposedly Chainlink's agile/project planning board. (TODO: verify that it is indeed Chainlink's, and then analyse it)
https://www.pivotaltracker.com/n/projects/2129823

LINK wallet addresses

As LINK is an ERC20 token on the Ethereum blockchain, all its movements are visible, all the way from the genesis creation of 1,000,000,000 LINK tokens through to aggregator nodes through to cashing out on exchanges. Below are some examples and some reasons why this may be concerning to investors/holders of LINK.
This is one LINK whale address with over 6 million LINK. It looks like some of the funds end up on the Turkish exchange Paribu. https://etherscan.io/token/0x514910771af9ca656af840dff83e8264ecf986ca?a=0xc6bed363b30df7f35b601a5547fe56cd31ec63da This wallet has moved out >200,000 LINK in the last 24 hours. I don't know where to - go trace it.
Typical data provider example. Lots of named Chainlink oracle nodes pay this address: https://etherscan.io/token/0x514910771af9ca656af840dff83e8264ecf986ca?a=0x72f3dff4cd17816604dd2df6c2741e739484ca62 - usually 0.16 LINK every few minutes, sometimes 2 LINK.

This data provider has sent out ~11,620 LINK to the following wallet: https://etherscan.io/token/0x514910771af9ca656af840dff83e8264ecf986ca?a=0xa5d0084a766203b463b3164dfc49d91509c12dab That wallet has cashed out 9,560 LINK to 1inch.exchange (a DEX) over the past year, and has also transferred 6,000 LINK to a currently loaded wallet (possibly an exchange account ready to sell?): https://etherscan.io/token/0x514910771af9ca656af840dff83e8264ecf986ca?a=0x088d50c0bb5381a1205d1182cc21496c6fdc4c62

Another destination accumulation wallet (~493,000 LINK with no out transfers yet): https://etherscan.io/token/0x514910771af9ca656af840dff83e8264ecf986ca?a=0x7758e507850da48cd47df1fb5f875c23e3340c50 (unrelated, but a sell order of this size would drop LINK's price by 10-30% on Binance - someone check my maths on this).

Now, tracing back who funds the 0x72f3... data provider, we see a number of named Chainlink Aggregator nodes. Picking one at random, say the TUSD/ETH one: https://etherscan.io/token/0x514910771af9ca656af840dff83e8264ecf986ca?a=0x73ead35fd6a572ef763b13be65a9db96f7643577 It was last funded March 12, 2020 with 5,000 LINK. Tracing back the funds, we ultimately come to the genesis wallet of the Chainlink network itself, the original source of the 1,000,000,000 LINK tokens in existence. (Side note: some interesting-looking transactions there.) This is the first child of the genesis wallet, which received 100,000,000 LINK from it: https://etherscan.io/tokentxns?a=0xf37c348b7d19b17b29cd5cfa64cfa48e2d6eb8db The last time this wallet transferred out was YESTERDAY, for 500,000 LINK.

Now, this doesn't prove anything, DYOR, but to me it looks like the genesis wallets are slowly cashing out through the aggregator nodes, making it look like the oracle node network is being actively used (which it is, but it's not end customers like AAVE/NEXO paying the LINK required to power oracles - it's SmartContract itself). I know this is just ONE aggregator node, but I've seen the same behaviour from their other named nodes - go check for yourselves.
If you trace Chainlink oracle funds to their source, you can find some of the original addresses. Some of these, early on (around 1,000 days ago), were linked to AfroDex Labs, which now looks defunct: http://afrodex.net/#!/trade/AfroX-ETH
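For anyone who wants to replicate this tracing without clicking through Etherscan by hand, here's a minimal sketch using Etherscan's public `tokentx` API action. The API key is a placeholder, and the field names are as returned by that endpoint at the time of writing - verify against the current docs:

```python
# Sketch: list all LINK (ERC20) transfers touching one address, so inflows
# and outflows can be traced to their sources. Requires an Etherscan API key.
import requests

LINK = "0x514910771af9ca656af840dff83e8264ecf986ca"   # LINK token contract
API_KEY = "YourEtherscanApiKey"                        # placeholder

def link_transfers(address):
    resp = requests.get("https://api.etherscan.io/api", params={
        "module": "account", "action": "tokentx",
        "contractaddress": LINK, "address": address,
        "sort": "asc", "apikey": API_KEY,
    })
    return resp.json()["result"]

addr = "0x72f3dff4cd17816604dd2df6c2741e739484ca62"    # the data provider above
for tx in link_transfers(addr):
    amount = int(tx["value"]) / 10 ** int(tx["tokenDecimal"])
    direction = "IN " if tx["to"].lower() == addr else "OUT"
    print(direction, f"{amount:>12,.2f} LINK", tx["from"], "->", tx["to"])
```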
Who currently pays Chainlink nodes?
How much of the revenue that Chainlink nodes receive is from potential third-party customers vs internal funding from the Chainlink team's wallets?
For example, this is the "Chainlink: LINK / USD Aggregator" wallet.
It has had a total of 8,200 LINK deposited over 5 transactions in round amounts (on any of the below links, click the "Analytics" tab to see the In/Out balance history), and has so far paid out ~5,156 LINK.
https://etherscan.io/token/0x514910771af9ca656af840dff83e8264ecf986ca?a=0x32dbd3214ac75223e27e575c53944307914f7a90
It typically pays ~10 wallets 0.16 LINK each, a few times an hour, like so:
https://etherscan.io/tx/0x02c595981b935a57cfbe6170656181faac9a16d7a33a123930a716c4abec615a ($45 in ETH fees to transfer $22 worth of LINK - sounds like a lot of overhead)
Where does this aggregator wallet get its LINK funding from?
From ONLY here: https://etherscan.io/token/0x514910771af9ca656af840dff83e8264ecf986ca?a=0x27158157136384c713bc09a0a7ae81c8391d7f11 (current net balance ~50,000 LINK, total ~5,000,000 LINK in and out)
Which in turn gets it from ONLY these three, in HUGE amounts:
https://etherscan.io/token/0x514910771af9ca656af840dff83e8264ecf986ca?a=0xf37c348b7d19b17b29cd5cfa64cfa48e2d6eb8db (6,000,000 LINK)
https://etherscan.io/token/0x514910771af9ca656af840dff83e8264ecf986ca?a=0xaf40738c6f940519516e043f924b8d05fc0292b8 (just a jump address into the one above, only 3 total tx)
https://etherscan.io/token/0x514910771af9ca656af840dff83e8264ecf986ca?a=0x1f9e26f1c050b5c018ab0e66fcae8e4394eb0165 (147,000 LINK)
the 0x1f9e2... one got its funding from:
  1. 6098.8 LINK from Binance about a year ago: https://etherscan.io/token/0x514910771af9ca656af840dff83e8264ecf986ca?a=0x161cdd891e04a77e0458a3ef65c563c4d2064cd6
  2. 12,600,000 from the genesis wallet through one jump address https://etherscan.io/token/0x514910771af9ca656af840dff83e8264ecf986ca?a=0xdad22a85ef8310ef582b70e4051e543f3153e11f
  3. 13,000,000 from the 0xf37... wallet above
the 0xf37... in turn got its 50,000,000 (!) LINK from the genesis address which minted the original 1 billion tokens:
https://etherscan.io/token/0x514910771af9ca656af840dff83e8264ecf986ca?a=0xf55037738604fddfc4043d12f25124e94d7d1780
So the 0x27158... wallet is basically a genesis wallet.
Now let's do the most popular feed on feeds.chain.link, the ETH/USD feed: https://feeds.chain.link/eth-usd, with a wallet address of: https://etherscan.io/token/0x514910771af9ca656af840dff83e8264ecf986ca?a=0xF79D6aFBb6dA890132F9D7c355e3015f15F3406F#tokenAnalytics
It was first funded in Jan 2020 and has been funded a total of 9 times for a total influx of 108,437.533 LINK, by:
  1. "Chainlink: Deployer" 10 LINK: https://etherscan.io/token/0x514910771af9ca656af840dff83e8264ecf986ca?a=0x6f61507f902e1c22bcd7aa2c0452cd2212009b61
  2. The 0x27158... genesis-sourced wallet, 20,000 LINK
  3. An intermediary/middle very active wallet (which is 99.998% funded by the 0x27158... genesis-sourced wallet), 52,000 LINK https://etherscan.io/token/0x514910771af9ca656af840dff83e8264ecf986ca?a=0x2f0acb9c5dd2a3511bc1d9d67258e5c9434ba569
  4. "Chainlink: Aggregator", 36,427.533 LINK, https://etherscan.io/token/0x514910771af9ca656af840dff83e8264ecf986ca?a=0x79febf6b9f76853edbcbc913e6aae8232cfb9de9#tokenAnalytics
I manually traced EVERY single inbound transaction/source of funds for the above 4 (not counting #1, as 10 LINK is negligible):
  - #2 and #3 are 99.99%+ genesis-funded and are being ACTIVELY topped up by a genesis wallet - last tx 4 days ago, 500,000 LINK.
  - #4 has been funded 36 times over the past year and a half (that's 36 manual exports, and I did them all). The funds all come from 0x27158..., 0x2f0acb..., and https://etherscan.io/token/0x514910771af9ca656af840dff83e8264ecf986ca?a=0x1f9e26f1c050b5c018ab0e66fcae8e4394eb0165 (another address like 0x2f0acb... where I checked EVERY SINGLE inbound source of funds - it's also >99.9% genesis-funded: one tx from Binance for 6,098 LINK out of a total ~6,560,000 inbound LINK from genesis wallets), plus two other addresses linked to Binance (0x1b185c8611d157a67d9a9d5261b0d2bd52c0bb78, 10,000 LINK, and 0x039ac18afe298747c51c85e7c8f0d67c327f3883, 1,000,000 LINK).
The 0x039ac... address funded the "Chainlink: Aggregator" address with 127,900 LINK, and the 0x1b185... funded it with about ~9,600 LINK. So yes, it's technically possible that someone not related to Chainlink paid for the ETH / USD price feed, because some funds do come from Binance. However, they only come from two distinct addresses. Surely for "240+" claimed partnerships, more than TWO would pay to use Chainlink's MOST POPULAR price feed? That is, unless they don't pay directly but to another address and then Chainlink covers this one from their own wallets. I will check if that's in line with Chainlink's whitepaper, but doesn't that throw doubt on the whole model of end-users paying to use oracles/aggregators, even if it's subsidized?
I provide this much detail not to bore you but to show you that I went through BY HAND and checked every single source (detailed sources in Appendix B) of funds for the OFFICIAL, Chainlink-listed "ETH/USD" aggregator that's supposedly sponsored by 10 DeFi partners (Synthetix, LoopSpring, OpenLaw, 1inch, ParaSwap, MCDEX, FuturesSwap, DMM, Aave, The Force Protocol). Yet where are the transactions showing that those 10 partners have EVER paid for this ETH/USD oracle? Perhaps the data is there, so what am I missing? This ETH/USD aggregator has transferred out ~76,000 LINK - I guess to the data providers - in increments of 0.33 LINK. It has 21 data providers responding. I will begin investigating the data providers themselves soon.
And those middle addresses like 0x1f9e26... and 0x2f0acb...? They have transferred out hundreds of thousands, if not millions, of LINK to exchanges. And that's just ONE price-pair aggregator. Chainlink has around 40 of these (albeit this is one of the more popular ones).

The SNX / ETH aggregator is funded 100% by genesis-sourced wallets, with only 3 inbound transactions:
https://etherscan.io/token/0x514910771af9ca656af840dff83e8264ecf986ca?a=0xe23d1142de4e83c08bb048bcab54d50907390828

Some random examples (for later, ignore these for now):
https://etherscan.io/token/0x514910771af9ca656af840dff83e8264ecf986ca?a=0x039ac18afe298747c51c85e7c8f0d67c327f3883 bought 1,000,000 LINK from Binance on Sept 12 & 15, 2019 (one of the possible funding sources for the ETH / USD aggregator example above).
This address got 500,000 LINK from 0x27158... and has distributed them into wallets of ~5,000-10,000 LINK each, which haven't had any out transactions yet:
https://etherscan.io/token/0x514910771af9ca656af840dff83e8264ecf986ca?a=0x5bcf3edc0bb7119e35f322ba40793b99d4620f1e
Another example, with an unnamed aggregator-node-like wallet that was only spun up 5 days ago, Aug 5:
https://etherscan.io/token/0x514910771af9ca656af840dff83e8264ecf986ca?a=0x2cbfd29947f774b8cf338f776915e6fee052f236
It was funded with 2,000 LINK SOLELY by the 0x27158... wallet and has so far paid out ~500 LINK in 0.43 LINK amounts to 9 wallets at a time. For example, this is one of the wallets it cashes out to:
https://etherscan.io/token/0x514910771af9ca656af840dff83e8264ecf986ca?a=0x64fe692be4b42f4ac9d4617ab824e088350c11c2#tokenAnalytics
That wallet has been collecting small amounts of LINK extremely consistently since Oct 2019. It must be a data provider, because a lot of named Chainlink wallets pay it small amounts of LINK regularly. It has transferred out 20 times. The most recent transfer out:
https://etherscan.io/token/0x514910771af9ca656af840dff83e8264ecf986ca?a=0xc8c30fa803833dd1fd6dbcdd91ed0b301eff87cf which then immediately transferred to the named "1inch.exchange" wallet, so I assume this was a "cash-out" transaction. It has cashed out via this address a lot.
Granted, it also has transfer-out transactions that haven't (yet) ended up in an exchange wallet, e.g. https://etherscan.io/token/0x514910771af9ca656af840dff83e8264ecf986ca?a=0x88e5353a73f38f25a9611e6083de6f361f9b537b with a current balance of 3,000 LINK. This could be a user's exchange wallet, ready to be sold, or it could be something else. No way for me to tell, as there are no out txs from it.

LINK overall transaction, volume, and tx fees

This is to understand how much $ moves through the LINK ecosystem via nodes, data providers, reserve wallets, wallets linked to exchanges, and others.
A typical aggregator node tx (payout?): https://etherscan.io/tx/0xef9e8e6dd94ebe9bbac8866f18c2ea0a07408ced1aa77fa04826043eaa55e772 This is their ETH/USD aggregator paying out 1 LINK to each of 21 addresses. Value of 21 LINK ~= $210. Total ETH tx fees: 0.233 ETH (~$88.5, ~42% of the total tx value; if LINK was $4.2 instead of $10, the tx fees would be 100% of the value of the tx). Transactions like this happen every few minutes, and the payout amounts are most often 0.16, 0.66, 1.0, and 2.0 LINK.
Chainlink’s node/job listing site, https://market.link, lists 86 nodes, 195 feeds, 801 jobs, and ~1,080,000 job runs (I can’t tell if this is over the past 2 months or 1.5 years). Only 20 nodes have over 1,000 job runs, and 62 nodes have ZERO runs. The usual job cost is listed as 0.1 LINK, but the overall payout to the nodes is 10-20 times this. The nodes then cash out, usually through a few jump addresses, to exchanges. Some quick maths (being generous and assuming it’s 1 mil jobs every 2 months): ~6 mil LINK/year = $60,000,000 revenue a year. This is the most generous estimate towards LINK’s valuation I’ve found so far. If we ignore the below examples, where on multi-node payouts the tx fees exceed the node revenue itself, then it’s almost in line with an over-valued (but real) big tech company.
For example, one of the latest CHF/USD job runs paid 0.1 LINK to 9 addresses (data providers?) - total $14.4 payout - and paid 0.065 ETH ($24.5) in fees. That’s a $10.1 LOSS on $14.4 of revenue: https://etherscan.io/tx/0xa6351bab810b6864bfebb0f6e1e3bde3c8856f8aac3ba769dd2e6d1a39c0d23f
Linkpool’s (one of the biggest node operators) “ETH-USD CryptoCompare” job costs 0.1 LINK and has 33 runs in the past 24 hours (once every ~44 min), total ~78,000 runs since May 30, 2019 (once every ~8 min). https://market.link/jobs/64bb0845-c4e1-4681-8853-0b5aa7366101/runs (PS: CryptoCompare has a free API that does this. Not sure why it costs $1 at current LINK prices to access an API once.)
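Putting the fee-overhead arithmetic from the two examples above in one place (Python; the dollar figures are the post's own snapshots, and the ~$380 ETH price is my assumption backed out of them):

```python
# Fee overhead on the two payouts quoted above (post's own price snapshots).
eth_usd = 380.0                        # assumed spot price behind the $ figures

# ETH/USD aggregator: 21 LINK (~$210) paid out, 0.233 ETH in fees
fees = 0.233 * eth_usd                 # ~$88.5
print(f"aggregator: fees are {fees / 210:.0%} of the payout")    # ~42%

# CHF/USD job run: 9 x 0.1 LINK (~$14.4 at ~$16/LINK), 0.065 ETH in fees
fees2 = 0.065 * eth_usd                # ~$24.7
print(f"CHF/USD run: net {14.4 - fees2:+.1f} USD per run")       # ~-$10 loss
```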

Token distribution:

Top 100 wallets (0.05% of ~186,000 total) hold 83% of tokens. 8 wallets each hold over 1% of total, 58 hold over 0.1%. Of these 58, 9 are named exchange/lending pool wallets.
For comparison, for Tether (USDT), the top 100 wallets (0.006% of ~1,651,000 total) hold 35.9% of the supply. 3 addresses hold over 1% of the supply and 135 hold over 0.1%. Of these 135, at least 15 are named exchange/lending pool wallets.
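For reference, concentration numbers like these are straightforward to recompute from a holder export; a minimal sketch (the CSV file name and `Balance` column are hypothetical, modeled on a typical Etherscan holders export):

```python
# Sketch: "top 100 wallets hold X%" from a token-holders CSV export.
import csv

with open("link_holders.csv") as f:    # hypothetical export file
    balances = sorted((float(r["Balance"]) for r in csv.DictReader(f)), reverse=True)

share = sum(balances[:100]) / sum(balances)
print(f"Top 100 of {len(balances):,} wallets hold {share:.1%}")  # post: ~83% for LINK
```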
LINK’s market cap is $3.5B (or $10B fully diluted, if we count the founder/dev-controlled tokens, which we should, as there's nothing preventing them from being moved at a moment's notice). Tether’s is $6.9B. Tether has 10 times more addresses and less distribution inequality. Both LINK and Tether are ERC20 tokens, and even if we temporarily ignore any arguments related to management/roadmap/teams etc., Tether has a clear, currently functional, single use case: keep 1 USDT = $1 USD by printing/burning USDT (and yet as of April 2019, only 74% of Tether's market cap is backed by real funds - https://en.wikipedia.org/wiki/Tether_(cryptocurrency)). Given that Chainlink's market cap is now 50% bigger than Tether's, surely by now there's AT LEAST one clear, currently functional use case for LINK? What is it? Can we *see* it happening on-chain?

Chainlink’s actual deliverable products

"What do I currently get for my money if I buy LINK 1) as an investor and 2) as a tech business/startup thinking of using oracles?”
Codebase: Chainlink’s GitHub has around 140,000-200,000 lines of code (not counting HTML/CSS). What else is not counted in this? Town Crier? Proprietary code that we don't know about yet? How much CODING has Chainlink done other than what's on GitHub?
Current network of oracles - only ~20 active nodes. Are there many more than the ones listed on market.link and reputation.link? If so, it would be nice to know about these, if we're allowed!
Documentation - they have what seems like detailed instructions on how to launch and use oracle nodes (and much more; I haven't investigated yet). (TODO: look into this more - what else do they offer to me as an end consumer, or e.g. as a tech startup needing oracle services that I can’t code myself?)

Network utilization statistics:

Etherscan.io allows csv export of the first 5000 txs of each day. From Jul 31 to Aug 6, 2020, I thus downloaded 30,000 txs, covering midnight to an average of 7:10am each day (so 24-hour totals are 3.34x these numbers, if we assume the same network utilization throughout the day).
(Summary of all LINK token activity on the ETH blockchain from 31.07 to 06.08, first 5000 txs of each day (30k total) shown Appendix A comment below this post.)
If we GENEROUSLY assume that EVERY SINGLE transaction under 10.0 LINK is ACTUAL Chainlink nodes doing ACTUAL work, that’s still under 0.1% of the LINK network’s total volume being used for actual ecosystem functioning. The rest is speculation, trading, node funding by founder/dev wallets, or dumping to exchanges (anything I missed?).
Assuming the above, the entire turnover of the actual LINK network is currently (18,422 LINK) * ($10/LINK) * (3.34 as etherscan.io’s data only gives first 5000 tx per day which averages to 7:10am) * (52 wk/year) = USD $31,995,329 turnover a year.
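The same estimate as a script, so the inputs are explicit (all figures are the ones stated above):

```python
# Annualizing the "real usage" turnover estimate above.
weekly_usage_link = 18_422   # sum of transfers <= 10 LINK, 7-day sample (first 5k tx/day)
link_usd = 10
scaling = 3.34               # first 5000 txs cover only ~7h10m of each day
annual_usd = weekly_usage_link * link_usd * scaling * 52
print(f"${annual_usd:,.0f} turnover a year")   # ~$31,995,330
```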
Note: the below paragraph is old analysis using traditional stock market Price/Earnings ratios, which several users have pointed out isn't really applicable to crypto; I leave it for the record. Assuming all of that is profit (which it’s not, given tx fees at the very least), LINK would need a PE ratio (Price/Earnings) of 100 to justify its current (undiluted) valuation of $3.5 billion, or 300 if you count the other 65% of tokens that haven’t been dumped by the founders/devs yet. For comparison, common PE ratios are 32 (Facebook), 29 (Google), 37 (Uber), 20 (Twitter on a good year), 10 (a good hedge fund returning 10% annually).

Thoughts on DeFi & yield-farming - [TODO]

Why would exchanges who do their due diligence list LINK, let alone with leverage? 1) That's their business - they take a cut of every transaction, overhyped or not; 2) they're not above listing even openly troll tokens like EIDOS (a token that incentivized users to make FAKE transactions, as a response to EOS). https://www.coindesk.com/defi-yield-farming-comp-token-explained
The current ANNUAL yield on liquidity/yield farming is something like 2% on STABLE tokens like USDC and TETHER which at least have most of their supply backed by real-world assets. If Chainlink LINK staking is to be successful, they'll have to achieve at LEAST that same 2% at end-state. IF LINK is in bubble territory and drops, that's a lot of years at 2% waiting to recoup losses.

SmartContract Team & Past Projects

Normally I don't like focusing on people, because it leads too easily to ad-hominem attacks on personality rather than on the technology/numbers as I've done above, but I came across this and didn't like what I saw.
Steve Ellis, SmartContract's current CTO, co-founded and worked at "Secure Asset Exchange" from 2014 to 2016. They developed the NXT blockchain, which issued 1,000,000,000 NXT tokens (remind you of anything?). NXT was listed at the end of 2013, saw 3 quick 500%-1000% pumps and subsequent dumps in early-to-mid 2014, and then declined into obscurity. SecureAE officially shut down in Jan 2016. Then at some point a company called Jelurida acquired the rights to NXT (presumably after SecureAE?), then during the 2017 altcoin craze NXT pumped 300 times to a market cap of $1.8 BILLION and then dumped back down 100 times, and now it's a dead project with a market cap of $13 million.
https://www.linkedin.com/in/steveellis0606/
https://trade.secureae.com/
https://coinmarketcap.com/currencies/nxt/
https://www.jelurida.com/news/lawsuit-against-apollo-license-violations
As an investor or business owner, would you invest in or hire a company whose co-founder/CTO's last project was a total flop, with a price history chart that's textbook pump-and-dump behaviour? (And in this case we KNOW the end result: 99% losses for investors.) If you're Google/Oracle/SWIFT/Intel, would you partner with them?

Open questions for the Chainlink community and investors:

  1. Network activity: Are there any other currently active Chainlink nodes besides those listed on market.link and reputation.link? If so, is there a list of them with usage statistics? Do they use some token other than LINK, thus making simple analytics of the LINK ERC20 token an inaccurate representation of Chainlink’s actual activity? If the nodes listed on the two sites above ARE currently the main nodes, then the on-chain statistics above should capture the bulk of the network's real activity.
  2. PR, partnership announcements: Why is the google tweet still pinned to the top of Chainlink’s twitter? Due to the frequently circulated Chainlink promotion material (https://chainlinkecosystem.com/) that lists Google as one of the key partners, this tweet being pinned is potentially misleading as there isn't anything in there to merit calling Google a "collaborator" or "partner" - just that blockchains/oracles *can* use Google's APIs (but so can most software in the world). Is there something else going on with the SmartContract-Google relationship that warrants calling Google a partner that we're simply not aware of yet?
  3. By buying LINK, what backs YOUR money: If you have bought and currently hold LINK tokens, how comfortable are you that the future promise of your investment growing is supported on verifiable business and technological grounds versus pure, parabolic hype? If after reading this post you still are, I kindly ask you to reply and show how even one of the points I provided is either incorrect or not applicable, and I will edit my post and include your feedback in the relevant section as I have already done from other users.
  4. What have I missed? Of course not 100% of what I've said is infallible truth. I am a real human, and I have plenty of biases and blind spots. Even if what I've provided is technically correct, there may be other, much more important info that I've missed that eclipses what I've provided here. Ask yourself: if the current hype around LINK is indeed valid and points to a $100/$1000 future LINK price, then where is the financial/performance/usage evidence to justify LINK’s current valuation of $10+?

Conclusion

For your consideration, I have provided evidence with links that you can follow and verify, and draw your own conclusions. I have made my case as to why I believe the LINK token is currently priced much higher than evidence supports, and I ask you to peer-review my analysis and share your thoughts with me and with the wider LINK/crypto community.
Thank you for your time - I realize this is a long post. All questions and feedback are welcome; feel free to comment or PM. I won't delete/censor/block (except for personal threats, safety considerations, etc.). I am a real human, but I am not revealing my true identity for obvious privacy/harassment reasons.
(If anyone is wondering about my credentials/ability to add 2+2 and work with basic spreadsheets: I have previously won a math competition in a US state, I won an English-speaking country's physics olympiad, my university education is in mathematical physics/optimization engineering, and I worked for a few years at a global manufacturing company doing data analytics. Obviously I'm not posting my CV here to verify that, but I promise you it's the truth.)
I’m not looking to spread FUD, blind faith, or pure hype - I want an honest, transparent, objective discussion. I personally lean towards LINK being overvalued, but my beliefs have evolved and may continue to do so as I research and understand more about Chainlink, LINK, Ethereum, DeFi, and other related topics, and as I incorporate YOUR feedback. If you think I haven't disclosed something, ask.
As always, this is not financial advice and I am not liable for anything that may happen as a result of you reading this!
submitted by Stratocatter to CryptoCurrency

The 4th way of algorithmic trading (Signal Processing)

Algorithmic trading types classified based on development perspectives:
1) Technical Analysis
2) Statistics and Probability
3) Machine Learning
I took a different path which is not discussed widely in this subreddit.
4) Signal Processing
I'm not a good storyteller, but this is my journey, with advice for beginners.
First, my background:
- Electrical and Electronic engineer,
- Software developer (20+ years)
- Trader (5+ years)
- Algorithmic trader (3+ years)

How I Found The Alpha:

Before algorithmic trading, I was a somewhat profitable trader/investor. Like most of you, when I began algorithmic trading, I tried to find a magic combination of technical indicators and parameters. I also threw OHLCV and indicator data into an RNN for prediction.
I saw that even very simple strategies, like the famous moving average crossover, are profitable under the right market conditions with the correct parameters. But you must watch them carefully, and if you feel they are not working anymore, you must shut them down. That means you must be an experienced trader to take care of your algorithm.
I am a full-time software developer; algorithmic trading was my side project, and it became my hobby. I tried to learn everything about this industry. I watched and listened to hundreds of hours of podcasts and videos in all my free time, like while commuting to and from work.
These are the most useful to me:
- Chat with traders: https://www.youtube.com/channel/UCdnzT5Tl6pAkATOiDsPhqcg
- Top traders unplugged: https://www.youtube.com/usetoptraderslive
- Ukspreadbetting: https://www.youtube.com/channel/UCnKPQUoCRb1Vu-qWwWituGQ
Also I read plenty of academic papers, blog posts and this subreddit for inspiration.
Inspiration came from my field, electronics. I will not give you much detail about it, but I have developed a novel signal processing technique - a fast and natural technique for which I couldn't find any article or paper mentioning the method. It can transform price data of any interval into a meaningful, tradable form. The best part is, it doesn't require any parameters and it adapts to changing market conditions intrinsically.
These are the concepts that inspire me:
- Information Theory: https://en.wikipedia.org/wiki/Information_theory
- Signal Processing: https://en.wikipedia.org/wiki/Signal_processing
- ADC: https://en.wikipedia.org/wiki/Analog-to-digital_converter

What a Coincidence:

While googling to improve my algorithm, I found out that signal processing is used by Jim Simons' Renaissance Technologies, according to various sources including Wikipedia: https://en.wikipedia.org/wiki/Financial_signal_processing

Proverbs Integration:

The output of the process can be used to develop endless types of profitable strategies. I made some money with different momentum-based strategies while thinking about how I could use this technique more efficiently.
I like to combine different fields. I think trading and life itself have many things in common. So besides general trading concepts, I thought I could try to implement concepts from life itself. Also, because of the parameterless design, it's more like a decision-making process than an optimization problem.
I searched proverbs and advice for better decision making. I handled them one by one and thought about how I could implement each in a unified strategy while preserving the parameterless design. In time, this process significantly improved stability and reliability while the strategy evolved from momentum to mean reversion.
These are some proverbs which I use in various aspects of the algorithm:

- “The bamboo that bends is stronger than the oak that resists.” (Japanese proverb)
- "When the rainwater rises and descends down to where you want to cross, wait until it settles." (Sun-Tzu)
- "If you do not expect the unexpected you will not find it, for it is not to be reached by search or trail" (Heraclitus)
If you wonder how I implement them in the code, think about the last one; how do you define the unexpected, how to wait for it and how to prepare your algorithm to generate profit.
By the way, I strongly recommend: The Art of War (Sun-Tzu)

Result:

I have plenty of ideas waiting to be tested and problems that need to be solved. Nevertheless, these are some of the backtest results, for the time being:
Crypto:
- Market fee and spread are considered, slippage is not.
- For multiple-asset testing, I attempted to eliminate survivorship bias using the historical market rank of the assets. Data was acquired from coinmarketcap.com weekly reports.

ETH / BTC
BNB / BTC
Binance Historical Top 100 / BTC
Other Markets:
My main focus is crypto trading, but all improvements are cross-checked in different markets and intervals and validated empirically and logically. It can’t beat every asset on every interval, but it tends to work profitably across them.

https://preview.redd.it/l865fw6mjfd51.png?width=900&format=png&auto=webp&s=ff217d35637b41e26db8d7cfc3df14c3fb7ec14e
Live:
The algorithm has been running live for over 1.5 years with the evolving strategies I mentioned before. The latest one has been running for months.

Warnings and Advices:

- Bugs: A few months ago, before bedtime, I released a new version fixing a small cosmetic bug and went to sleep. When I woke up, I saw that nearly 40% of my account had been wiped out in a few hours. Instead of live settings, I had published test settings. It was very painful. I have been coding since childhood, so everyone must be careful. I recommend implementing a hard limit for stopping the algorithm.
- Fully Automatic Strategy: Finding an edge is not enough. If you want a fully automated trading system, you need a portfolio manager (a lot of research is going on in this field) and especially an asset selection mechanism, which is maybe more important than the edge itself. If your algorithm is not able to select which assets to trade, you must select them manually. That's not an easy task, and it's prone to error. I was very lucky with that: a mechanism already contained in the algorithm was used to rank and select the assets based on their momentum.
- Fee-Spread: Because of the market fee and spread, trading is a negative-sum game. Do not ignore them when backtesting your algorithm (see the sketch after this list).
- Slippage: It's really a problem for low volume assets like penny stocks and lower market cap crypto currencies. Stay away from them or play with small capital or find a way to determine how much money you can use.
- Latency: Don’t think it's an HFT-only problem. If your algorithm synchronizes multiple assets' data from the market and runs calculations before sending an order back to the market, you lose a significant amount of time. This usually causes losses that you have not accounted for, especially in a volatile environment. Also, if you want to develop a realtime strategy, you must seriously consider what you will do during downtime.
- Datasource: This is the most important part of preparation before developing your strategy. If you don’t have good, reliable data, you cannot develop a good strategy. For free data for various markets, I suggest investing.com, though consider that volume data is not provided. For crypto, all of the exchanges provide their real data for any asset and any interval; you can use them freely. You can also buy data, especially if you want intraday data, but I can't suggest any vendor because I never tested them.
- Biases: Before developing an algorithm, please take a look at and understand the common biases: survivorship bias, look-ahead bias, time-period bias. Otherwise you can be sure you will face them when you go live.
- Live trading: When you think your algorithm can make money, don’t wait for perfection. Go live as soon as possible with small capital to wake up from your dreams and face the facts early.
- Psychology: If your education is based on STEM and you don’t have trading experience, it’s not easy in the real world to swallow all those ups and downs that pass in minutes during a backtest. It can affect your mood and your life much more than you think. I suggest working with a professional trader, or only investing what you can really afford to lose.
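As promised in the Fee-Spread item, here's a minimal sketch of cost-aware trade accounting in Python. The fee and spread values are placeholders - plug in your own exchange's numbers:

```python
# Minimal sketch of fee/spread-aware backtesting (round numbers are assumptions).
FEE = 0.001      # 0.1% taker fee per side, typical for large crypto exchanges
SPREAD = 0.0005  # half-spread paid on each fill; raise this for illiquid assets

def net_return(entry_price: float, exit_price: float) -> float:
    """Gross return of a long trade minus fee and spread on both sides."""
    buy = entry_price * (1 + SPREAD) * (1 + FEE)   # pay up at entry
    sell = exit_price * (1 - SPREAD) * (1 - FEE)   # give up edge at exit
    return sell / buy - 1

# A 'profitable' 0.3% gross move is entirely eaten by costs (slightly negative):
print(f"{net_return(100.0, 100.3):.4%}")
```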

Last Words:

After over 3 years of this journey, I have a profitable algorithm that I trust. I was supposed to be lying on the beach drinking beer while my algorithm printed money. But I am constantly checking its health, and I always have things to do, like in all software development projects.
I posted some of the backtest results, but I don’t know whether they are considered P/L porn or not. If so, I can remove them.
Sorry about the mysterious parts of this post. I removed some parts unwillingly before posting, but there is really a thin line between giving away your edge freely (which also means losing it) and inspiring people to find their own way.

“Non est ad astra mollis e terris via" - Seneca

EDIT:


For those engineers and EE students who are bombing my inbox with guesses about what I did: I cannot reply to all of you in private, and I also want to explain this publicly.
I must say, you are on the wrong track. If I open-sourced the signal processing part, it probably wouldn't mean anything to you, and you could not turn it into a profitable algorithm.
I have to clarify: before I developed the technique, I knew exactly what I was looking for. Signal processing does not magically trade the market - I am trading the market. It's just a tool to do what is in my mind near-perfectly.
Also, the proverbs are a way of thinking. I read them and consider whether they mean anything for trading.

Lastly watch the Kung Fu Panda :)
https://www.youtube.com/watch?v=rHvCQEr_ETk

submitted by if-not-null to algotrading

Why i’m bullish on Zilliqa (long read)

Edit: TL;DR added in the comments
 
Hey all, I've been researching coins since 2017 and have gone through hundreds of them in the last 3 years. I got introduced to blockchain via Bitcoin, of course, analyzed Ethereum thereafter, and from that moment I've had a keen interest in smart contract platforms. I'm passionate about Ethereum, but I find Zilliqa to have a better risk-reward ratio, especially because Zilliqa has found an elegant balance between being secure, decentralized, and scalable in my opinion.
 
Below I post my analysis of why, of all the coins I went through, I'm most bullish on Zilliqa (yes, I went through Tezos, EOS, NEO, VeChain, Harmony, Algorand, Cardano, etc.). Note that this is not investment advice, and although it's a thorough analysis, there is obviously some bias involved. Looking forward to what you all think!
 
Fun fact: the name Zilliqa is a play on ‘silica’ (silicon dioxide), as in “Silicon for the high-throughput consensus computer.”
 
This post is divided into (i) Technology, (ii) Business & Partnerships, and (iii) Marketing & Community. I’ve tried to make the technology part readable for a broad audience. If you’ve ever tried understanding the inner workings of Bitcoin and Ethereum you should be able to grasp most parts. Otherwise, just skim through and once you are zoning out head to the next part.
 
Technology and some more:
 
Introduction
 
The technology is one of the main reasons why I’m so bullish on Zilliqa. First thing you see on their website is: “Zilliqa is a high-performance, high-security blockchain platform for enterprises and next-generation applications.” These are some bold statements.
 
Before we deep dive into the technology let’s take a step back in time first as they have quite the history. The initial research paper from which Zilliqa originated dates back to August 2016: Elastico: A Secure Sharding Protocol For Open Blockchains where Loi Luu (Kyber Network) is one of the co-authors. Other ideas that led to the development of what Zilliqa has become today are: Bitcoin-NG, collective signing CoSi, ByzCoin and Omniledger.
 
The technical white paper was made public in August 2017 and since then they have achieved everything stated in the white paper and also created their own open source intermediate level smart contract language called Scilla (functional programming language similar to OCaml) too.
 
Mainnet has been live since the end of January 2019, with daily transaction rates growing continuously. About a week ago mainnet reached 5 million transactions and 500,000+ addresses in total, along with 2400 nodes keeping the network decentralized and secure. Circulating supply is nearing 11 billion and currently only mining rewards are left. The maximum supply is 21 billion, with annual inflation currently at 7.13% and only decreasing with time.
 
Zilliqa realized early on that usage of public cryptocurrencies and smart contracts was increasing while decentralized, secure, and scalable alternatives were lacking in the crypto space. They proposed to apply sharding to a public smart contract blockchain, where the transaction rate increases almost linearly with the number of nodes. More nodes = higher transaction throughput and increased decentralization. Sharding comes in many forms, and Zilliqa uses network, transaction, and computational sharding. Network sharding opens up the possibility of using transaction and computational sharding on top. Zilliqa does not use state sharding for now. We’ll come back to this later.
 
Before we continue dissecting how Zilliqa achieves this from a technological standpoint, it’s good to keep in mind that making a blockchain decentralised, secure, AND scalable is still one of the main hurdles preventing widespread usage of decentralised networks. In my opinion this needs to be solved before blockchains can get to the point where they create and add large-scale value. So I invite you to read the next section to grasp the underlying fundamentals. After all, these premises need to be true, otherwise there isn’t a fundamental case to be bullish on Zilliqa, right?
 
Down the rabbit hole
 
How have they achieved this? Let’s define the basics first: key players on Zilliqa are the users and the miners. A user is anybody who uses the blockchain to transfer funds or run smart contracts. Miners are the (shard) nodes in the network who run the consensus protocol and get rewarded for their service in Zillings (ZIL). The mining network is divided into several smaller networks called shards, which is also referred to as ‘network sharding’. Miners subsequently are randomly assigned to a shard by another set of miners called DS (Directory Service) nodes. The regular shards process transactions and the outputs of these shards are eventually combined by the DS shard as they reach consensus on the final state. More on how these DS shards reach consensus (via pBFT) will be explained later on.
 
The Zilliqa network produces two types of blocks: DS blocks and Tx blocks. One DS block consists of 100 Tx blocks. And as previously mentioned, there are two types of nodes concerned with reaching consensus: shard nodes and DS nodes. Becoming a shard node or DS node is determined by the result of a PoW cycle (Ethash) at the beginning of the DS block. All candidate mining nodes compete with each other and run the PoW (Proof-of-Work) cycle for 60 seconds, and the submissions achieving the highest difficulty will be allowed on the network. To put it in perspective: the average difficulty for one DS node is ~2 Th/s, equaling 2,000,000 Mh/s, or 55 thousand+ GeForce GTX 1070 / 8 GB GPUs at 35.4 Mh/s each. Each DS block, 10 new DS nodes are allowed in. A shard node needs to provide around 8.53 GH/s currently (around 240 GTX 1070s). Dual mining ETH/ETC and ZIL is possible and can be done via mining software such as Phoenix and Claymore. There are pools, and if you have large amounts of hashing power (Ethash) available you could mine solo.
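A quick sanity check of that GPU arithmetic (Python; the 35.4 Mh/s per GTX 1070 is the figure quoted above):

```python
# Entry-ticket PoW difficulty expressed in GTX 1070s (35.4 Mh/s each on Ethash).
GTX1070_MHS = 35.4

ds_node_mhs = 2.0 * 1_000_000             # ~2 Th/s = 2,000,000 Mh/s per DS node
print(f"DS node    ~ {ds_node_mhs / GTX1070_MHS:,.0f} GPUs")     # ~56,497

shard_node_mhs = 8.53 * 1_000             # ~8.53 GH/s per shard node
print(f"shard node ~ {shard_node_mhs / GTX1070_MHS:,.0f} GPUs")  # ~241
```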
 
The PoW cycle of 60 seconds is a peak performance and acts as an entry ticket to the network. The entry ticket is called a sybil resistance mechanism and makes it incredibly hard for adversaries to spawn lots of identities and manipulate the network with these identities. And after every 100 Tx Blocks which corresponds to roughly 1,5 hour this PoW process repeats. In between these 1,5 hour, no PoW needs to be done meaning Zilliqa’s energy consumption to keep the network secure is low. For more detailed information on how mining works click here.
Okay, hats off to you - you have made it this far. Before we go any deeper down the rabbit hole, we first must understand why Zilliqa goes through all of the above technicalities, and understand a bit better what a blockchain is on a more fundamental level. Because the core of Zilliqa’s consensus protocol relies on pBFT (practical Byzantine Fault Tolerance), we need to know more about state machines and their function. Navigate to Viewblock, a Zilliqa block explorer, and then come back to this article. We will use this site to navigate through a few concepts.
 
We have established that Zilliqa is a public and distributed blockchain. Meaning that everyone with an internet connection can send ZILs, trigger smart contracts, etc. and there is no central authority who fully controls the network. Zilliqa and other public and distributed blockchains (like Bitcoin and Ethereum) can also be defined as state machines.
 
Taking the liberty of paraphrasing examples and definitions given by Samuel Brooks’ medium article, he describes the definition of a blockchain (like Zilliqa) as: “A peer-to-peer, append-only datastore that uses consensus to synchronize cryptographically-secure data”.
 
Next, he states that: "blockchains are fundamentally systems for managing valid state transitions”. For some more context, I recommend reading the whole medium article to get a better grasp of the definitions and understanding of state machines. Nevertheless, let’s try to simplify and compile it into a single paragraph. Take traffic lights as an example: all its states (red, amber, and green) are predefined, all possible outcomes are known and it doesn’t matter if you encounter the traffic light today or tomorrow. It will still behave the same. Managing the states of a traffic light can be done by triggering a sensor on the road or pushing a button resulting in one traffic lights’ state going from green to red (via amber) and another light from red to green.
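To make the state machine idea concrete, here's the traffic light as a few lines of Python - a finite, predefined set of states with one valid transition out of each (a toy illustration, not anything Zilliqa-specific):

```python
# The traffic light as a state machine: states and valid transitions are fixed
# and known in advance, unlike a public blockchain's ever-growing state.
VALID_TRANSITIONS = {"green": "amber", "amber": "red", "red": "green"}

class TrafficLight:
    def __init__(self) -> None:
        self.state = "red"

    def trigger(self) -> str:
        """A sensor/button press moves the light to its only valid next state."""
        self.state = VALID_TRANSITIONS[self.state]
        return self.state

light = TrafficLight()
print([light.trigger() for _ in range(4)])  # ['green', 'amber', 'red', 'green']
```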
 
With public blockchains like Zilliqa, this isn’t so straightforward or simple. It started with block #1 almost 1.5 years ago, and every 45 seconds or so a new block linked to the previous one is added, resulting in a chain of blocks with transactions that everyone can verify, from block #1 to the current #647,000+ block. The state is ever-changing, and the states it can find itself in are infinite. And while the traffic light might work in tandem with various other traffic lights, that's rather insignificant compared to a public blockchain. Because Zilliqa consists of 2400 nodes who need to work together to achieve consensus on what the latest valid state is, while some of these nodes may have latency or broadcast issues, drop offline, or deliberately try to attack the network, etc.
 
Now go back to the Viewblock page take a look at the amount of transaction, addresses, block and DS height and then hit refresh. Obviously as expected you see new incremented values on one or all parameters. And how did the Zilliqa blockchain manage to transition from a previous valid state to the latest valid state? By using pBFT to reach consensus on the latest valid state.
 
After having obtained the entry ticket, miners execute pBFT to reach consensus on the ever-changing state of the blockchain. pBFT requires a series of network communications between nodes, and as such no GPU is involved (only CPU), resulting in low total energy consumption to keep the blockchain secure, decentralized, and scalable.
 
pBFT stands for practical Byzantine Fault Tolerance and is an optimization on the Byzantine Fault Tolerant algorithm. To quote Blockonomi: “In the context of distributed systems, Byzantine Fault Tolerance is the ability of a distributed computer network to function as desired and correctly reach a sufficient consensus despite malicious components (nodes) of the system failing or propagating incorrect information to other peers.” Zilliqa is such a distributed computer network and depends on the honesty of the nodes (shard and DS) to reach consensus and to continuously update the state with the latest block. If pBFT is a new term for you I can highly recommend the Blockonomi article.
 
The idea of pBFT was introduced in 1999 - one of the authors even won a Turing award for it - and it is well researched and applied in various blockchains and distributed systems nowadays. If you want more advanced information than the Blockonomi link provides click here. And if you’re in between Blockonomi and the University of Singapore read the Zilliqa Design Story Part 2 dating from October 2017.
Quoting from the Zilliqa tech whitepaper: “pBFT relies upon a correct leader (which is randomly selected) to begin each phase and proceed when the sufficient majority exists. In case the leader is byzantine it can stall the entire consensus protocol. To address this challenge, pBFT offers a view change protocol to replace the byzantine leader with another one.”
 
pBFT can tolerate up to ⅓ of the nodes being dishonest (offline counts as Byzantine = dishonest), and the consensus protocol will function without stalling or hiccups. Once more than ⅓ but no more than ⅔ of nodes are dishonest, the network will stall and a view change will be triggered to elect a new DS leader. Only when more than ⅔ of the nodes are dishonest do double-spend attacks become possible.
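Those thresholds are easy to express in code; here's a small Python sketch for a 600-node committee (the shard size described further below; exact boundary conventions vary by a node or two between pBFT formulations):

```python
# pBFT health zones for a committee of n nodes, per the 1/3 and 2/3 bounds above.
def pbft_zones(n):
    third, two_thirds = n // 3, 2 * n // 3
    print(f"committee of {n}:")
    print(f"  consensus proceeds with fewer than {third} dishonest/offline nodes")
    print(f"  stalls (view change) from {third} up to {two_thirds}")
    print(f"  double-spends become possible above {two_thirds}")

pbft_zones(600)   # one Zilliqa shard: boundaries at 200 and 400 nodes
```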
 
If the network stalls no transactions can be processed and one has to wait until a new honest leader has been elected. When the mainnet was just launched and in its early phases, view changes happened regularly. As of today the last stalling of the network - and view change being triggered - was at the end of October 2019.
 
Another benefit of using pBFT for consensus, besides low energy use, is the immediate finality it provides: once your transaction is included in a block and the block is added to the chain, it’s done. Lastly, take a look at this article where three types of finality are defined: probabilistic, absolute, and economic finality. Zilliqa falls under absolute finality (just like Tendermint, for example). Although lengthy already, we've only skimmed some of the inner workings of Zilliqa’s consensus: read the Zilliqa Design Story Part 3 and you will be close to having a complete picture of it. Enough about PoW, the sybil resistance mechanism, pBFT, etc. Another thing we haven’t looked at yet is the degree of decentralization.
 
Decentralisation
 
Currently, there are four shards, each consisting of 600 nodes: 1 shard with 600 so-called DS nodes (Directory Service - they need to achieve a higher difficulty than shard nodes) and 1800 shard nodes, of which 250 are shard guards (centralized nodes controlled by the team). The number of shard guards has been steadily declining, from 1200 in January 2019 to 250 as of May 2020. On the Viewblock statistics you can see that many of the nodes are located in the US, but those are only the (CPU parts of the) shard nodes that perform pBFT; there is no data on where the PoW sources are coming from. And when the Zilliqa blockchain starts reaching its transaction capacity limit, a network upgrade needs to be executed to lift the current cap of 2400 nodes, allowing more nodes and the formation of more shards, which will let the network keep scaling according to demand.
Besides shard nodes there are also seed nodes. The main role of seed nodes is to serve as direct access points (for end-users and clients) to the core Zilliqa network that validates transactions. Seed nodes consolidate transaction requests and forward these to the lookup nodes (another type of nodes) for distribution to the shards in the network. Seed nodes also maintain the entire transaction history and the global state of the blockchain which is needed to provide services such as block explorers. Seed nodes in the Zilliqa network are comparable to Infura on Ethereum.
 
The seed nodes were first only operated by Zilliqa themselves, exchanges and Viewblock. Operators of seed nodes like exchanges had no incentive to open them for the greater public. They were centralised at first. Decentralisation at the seed nodes level has been steadily rolled out since March 2020 ( Zilliqa Improvement Proposal 3 ). Currently the amount of seed nodes is being increased, they are public-facing and at the same time PoS is applied to incentivize seed node operators and make it possible for ZIL holders to stake and earn passive yields. Important distinction: seed nodes are not involved with consensus! That is still PoW as entry ticket and pBFT for the actual consensus.
 
5% of the block rewards are being assigned to seed nodes (from the beginning in 2019) and those are being used to pay out ZIL stakers. The 5% block rewards with an annual yield of 10.03% translate to roughly 610 MM ZILs in total that can be staked. Exchanges use the custodial variant of staking and wallets like Moonlet will use the non-custodial version (starting in Q3 2020). Staking is being done by sending ZILs to a smart contract created by Zilliqa and audited by Quantstamp.
 
With a high amount of DS and shard nodes, and with seed nodes becoming more decentralized too, Zilliqa qualifies for the label of decentralized in my opinion.
 
Smart contracts
 
Let me start by saying I’m not a developer and my programming skills are quite limited. So I‘m taking the ELI5 route (maybe 12) but if you are familiar with Javascript, Solidity or specifically OCaml please head straight to Scilla - read the docs to get a good initial grasp of how Zilliqa’s smart contract language Scilla works and if you ask yourself “why another programming language?” check this article. And if you want to play around with some sample contracts in an IDE click here. The faucet can be found here. And more information on architecture, dapp development and API can be found on the Developer Portal.
If you are more into listening and watching: check this recent webinar explaining Zilliqa and Scilla. Link is time-stamped so you’ll start right away with a platform introduction, roadmap 2020 and afterwards a proper Scilla introduction.
 
Generalized: programming languages can be divided into ‘object-oriented’ or ‘functional’. Here is an ELI5 given by a software development academy: “all programs have two basic components, data - what the program knows - and behavior - what the program can do with that data. So object-oriented programming states that combining data and related behaviors in one place, called an “object”, makes it easier to understand how a particular program works. On the other hand, functional programming argues that data and behavior are different things and should be separated to ensure their clarity.”
 
Scilla is on the functional side and shares similarities with OCaml: OCaml is a general-purpose programming language with an emphasis on expressiveness and safety. It has an advanced type system that helps catch your mistakes without getting in your way. It's used in environments where a single mistake can cost millions and speed matters, is supported by an active community, and has a rich set of libraries and development tools. For all its power, OCaml is also pretty simple, which is one reason it's often used as a teaching language.
 
Scilla is blockchain agnostic, can be implemented on other blockchains as well, is recognized by academics and won a Distinguished Artifact Award at the end of last year.
 
One of the reasons the Zilliqa team decided to create their own programming language focused on preventing smart contract vulnerabilities is that adding logic to a blockchain — programming — means you cannot afford to make mistakes; they can literally cost you. Blockchains being immutable is all great and fun, but updating your code because you found a bug isn't as simple as it is with a regular web application, for example. And smart contracts inherently involve cryptocurrencies in some form, and thus value.
 
Another difference with programming languages on a blockchain is gas. Every transaction you do on a smart contract platform like Zilliqa or Ethereum costs gas. With gas you basically pay for computational costs. Sending a ZIL from address A to address B costs 0.001 ZIL currently. Smart contracts are more complex, often involve various functions and require more gas (if gas is a new concept click here ).
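As a rough illustration of how such a fee is composed (the gas numbers below are assumptions picked so the simple transfer reproduces the 0.001 ZIL fee quoted above — they are not actual Zilliqa parameters):

```python
# Fee = gas units consumed * price per gas unit. Simple transfers consume a
# small, fixed amount of gas; contract calls consume more, depending on the
# computation and state they touch. (All numbers here are assumptions.)
GAS_PRICE_ZIL = 0.0000005      # assumed price per gas unit, in ZIL

def tx_fee(gas_used: int) -> float:
    return gas_used * GAS_PRICE_ZIL

print(f"Plain A -> B transfer: {tx_fee(2_000):.6f} ZIL")    # 0.001000 ZIL
print(f"Smart contract call:   {tx_fee(50_000):.6f} ZIL")   # dearer, varies
```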
 
So with Scilla, similar to Solidity, you need to make sure that “every function in your smart contract will run as expected without hitting gas limits. An improper resource analysis may lead to situations where funds may get stuck simply because a part of the smart contract code cannot be executed due to gas limits. Such constraints are not present in traditional software systems”. Scilla design story part 1
 
Some examples of smart contract issues you’d want to avoid are: leaking funds, ‘unexpected changes to critical state variables’ (example: someone other than you setting his or her address as the owner of the smart contract after creation) or simply killing a contract.
 
Scilla also allows for formal verification. Wikipedia to the rescue: In the context of hardware and software systems, formal verification is the act of proving or disproving the correctness of intended algorithms underlying a system with respect to a certain formal specification or property, using formal methods of mathematics.
 
Formal verification can be helpful in proving the correctness of systems such as: cryptographic protocols, combinational circuits, digital circuits with internal memory, and software expressed as source code.
 
“Scilla is being developed hand-in-hand with formalization of its semantics and its embedding into the Coq proof assistant — a state-of-the-art tool for mechanized proofs about properties of programs.”
 
Simply put, with Scilla and its accompanying tooling, developers can mathematically prove that the smart contract they have written does what they intend it to do.
 
Smart contract on a sharded environment and state sharding
 
There is one more topic I’d like to touch on: smart contract execution in a sharded environment (and what is the effect of state sharding). This is a complex topic. I’m not able to explain it any easier than what is posted here. But I will try to compress the post into something easy to digest.
 
Earlier on we established that Zilliqa can process transactions in parallel due to network sharding. This is where the linear scalability comes from. We can define simple transactions: a transaction from address A to B (Category 1), a transaction where a user interacts with one smart contract (Category 2), and the most complex ones where triggering a transaction results in multiple smart contracts being involved (Category 3). The shards are able to process transactions on their own without interference from the other shards. With Category 1 transactions that is doable, and with Category 2 transactions it sometimes is — if the user's address lives in the same shard as the smart contract — but with Category 3 you definitely need communication between the shards. Solving that requires defining a set of communication rules the protocol needs to follow in order to process all transactions in a generalised fashion.
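A toy sketch of the problem (all structure assumed for illustration): a shard can settle a transaction on its own only if every account the transaction touches maps to that same shard.

```python
# Toy shard assignment: real protocols typically derive the shard from bits
# of the address; a deterministic byte sum stands in for that here.
NUM_SHARDS = 4

def shard_of(address: str) -> int:
    return sum(address.encode()) % NUM_SHARDS

def needs_cross_shard_coordination(touched: list[str]) -> bool:
    """True if the accounts involved span more than one shard."""
    return len({shard_of(a) for a in touched}) > 1

# Category 1: a simple payment A -> B (may or may not land in one shard)
print(needs_cross_shard_coordination(["addr_A", "addr_B"]))
# Category 3: one trigger fanning out to several contracts -- this almost
# certainly spans shards, so inter-shard communication rules are needed
print(needs_cross_shard_coordination(["addr_A", "contract_X", "contract_Y"]))
```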
 
And this is where the trade-off of not (yet) sharding state comes in. All shards in Zilliqa have access to the complete state. Yes, the state size (0.1 GB at the moment) grows and all of the nodes need to store it, but it also means that they don't need to shop around for information available on other shards — which would require more communication and add more complexity. Links if you want to dig further (computer science and/or developer knowledge required):
- Scilla - language grammar
- Scilla - Foundations for Verifiable Decentralised Computations on a Blockchain
- Gas Accounting
- NUS x Zilliqa: Smart contract language workshop
 
Easier-to-follow links on programming Scilla:
- https://learnscilla.com/home
- Ivan on Tech
 
Roadmap / Zilliqa 2.0
 
There is no strictly defined roadmap, but here are topics being worked on. And via the Zilliqa website there is also more information on the projects they are working on.
 
Business & Partnerships
 
It’s not only technology in which Zilliqa seems to be excelling, as their ecosystem has been expanding and starting to grow rapidly. The project is on a mission to provide OpenFinance (OpFi) to the world, and Singapore is the right place to be due to its progressive regulations and futuristic thinking. Singapore has taken a proactive approach towards cryptocurrencies by introducing the Payment Services Act 2019 (PS Act). Among other things, the PS Act will regulate intermediaries dealing with certain cryptocurrencies, with a particular focus on consumer protection and anti-money laundering. It will also provide a stable regulatory licensing and operating framework for cryptocurrency entities, effectively covering all crypto businesses and exchanges based in Singapore. According to PWC, 82% of the surveyed executives in Singapore reported blockchain initiatives underway and 13% of them have already brought those initiatives live to the market. There is also an increasing list of organizations that are starting to provide digital payment services. Moreover, Singaporean blockchain developer Building Cities Beyond has recently created a $15 million innovation grant to encourage development on its ecosystem. This all suggests that Singapore is trying to position itself as (one of) the leading blockchain hubs in the world.
 
Zilliqa already seems to be taking advantage of this and recently helped launch Hg Exchange on their platform, together with financial institutions PhillipCapital, PrimePartners and Fundnel. Hg Exchange, which is now approved by the Monetary Authority of Singapore (MAS), uses smart contracts to represent digital assets. Through Hg Exchange, financial institutions worldwide can use Zilliqa's safe-by-design smart contracts to enable the trading of private equities. For example, think of companies such as Grab, Airbnb, and SpaceX that are not available for public trading right now. Hg Exchange will allow investors to buy shares of private companies & unicorns and capture their value before an IPO. Anquan, the main company behind Zilliqa, has also recently announced that they became a partner and shareholder in TEN31 Bank, which is a fully regulated bank allowing for tokenization of assets and is aiming to bridge the gap between conventional banking and the blockchain world. If STOs, the tokenization of assets, and equity trading continue to increase, then Zilliqa’s public blockchain would be an ideal candidate due to its strategic positioning, partnerships, regulatory compliance and the technology that is being built on top of it.
 
What is also very encouraging is their focus on banking the un(der)banked. They are launching a stablecoin basket starting with XSGD. As many of you know, stablecoins are currently mostly used for trading. However, Zilliqa is actively trying to broaden the use case of stablecoins. I recommend everybody read this text by Amrit Kumar (one of the co-founders). These stablecoins will be integrated in the traditional markets and bridge the gap between the crypto world and the traditional world. This could potentially revolutionize and legitimise the crypto space if retailers and companies start to use stablecoins for payments or remittances, for example, instead of them solely being used for trading.
 
Zilliqa also released their DeFi strategic roadmap (dating November 2019) which seems to align well with their OpFi strategy. A non-custodial DEX is coming to Zilliqa, made by Switcheo, which allows cross-chain trading (atomic swaps) between ETH, EOS and ZIL based tokens. They also signed a Memorandum of Understanding for a (soon to be announced) USD stablecoin. And as Zilliqa is all about regulations and being compliant, I’m speculating that it will be a regulated USD stablecoin. Furthermore, XSGD is already created and visible on block explorer, and XIDR (an Indonesian stablecoin) is also coming soon via StraitsX. Here also an overview of the Tech Stack for Financial Applications from September 2019. Further quoting Amrit Kumar on this:
 
“There are two basic building blocks in DeFi/OpFi though: 1) stablecoins, as you need a non-volatile currency to get access to this market, and 2) a dex to be able to trade all these financial assets. The rest are built on top of these blocks.
 
So far, together with our partners and community, we have worked on developing these building blocks with XSGD as a stablecoin. We are working on bringing a USD-backed stablecoin as well. We will soon have a decentralised exchange developed by Switcheo. And with HGX going live, we are also venturing into the tokenization space. More to come in the future.”
 
Additionally, they also have the ZILHive initiative that injects capital into projects. There have already been six waves of various teams working on infrastructure, innovation and research, and they are not from ASEAN or Singapore only but global: see Grantees breakdown by country. Over 60 project teams from over 20 countries have contributed to Zilliqa's ecosystem. This includes individuals and teams developing wallets, explorers, developer toolkits, smart contract testing frameworks, dapps, etc. As some of you may know, Unstoppable Domains (UD) blew up when they launched on Zilliqa. UD aims to replace cryptocurrency addresses with human-readable names and allows for uncensorable websites. Zilliqa will probably be the only one able to handle all these transactions on-chain due to its ability to scale and the resulting low fees, which is why the UD team launched this on Zilliqa in the first place. Furthermore, Zilliqa also has a strong emphasis on security, compliance, and privacy, which is why they partnered with companies like Elliptic, ChainSecurity (part of PwC Switzerland), and Incognito. Their sister company Aqilliz (Zilliqa spelled backwards) focuses on revolutionizing the digital advertising space and is doing interesting things like using Zilliqa to track outdoor digital ads with companies like Foodpanda.
 
Zilliqa is listed on nearly all major exchanges, has several different fiat gateways, and has recently been added to Binance’s margin trading and futures trading with really good volume. They also have a very impressive team with good credentials and experience. They don't just have “tech people”. They have a mix of tech people, business people, marketeers, scientists, and more. Naturally, it's good to have a mix of people with different skill sets if you work in the crypto space.
 
Marketing & Community
 
Zilliqa has a very strong community. If you just follow their Twitter, their engagement is much higher than you would expect for a coin with approximately 80k followers. They have also been ‘coin of the day’ on LunarCrush many times. LunarCrush tracks real-time cryptocurrency value and social data. According to their data, it seems Zilliqa has a more fundamental and deeper understanding of marketing and community engagement than almost all other coins. While almost all coins have been a bit frozen in the last months, Zilliqa seems to be on its own bull run. It was ranked somewhere in the 100s a few months ago and is currently ranked #46 on CoinGecko. Their official Telegram also has over 20k people and is very active, and their community channel, which is over 7k now, is more active and larger than many other official channels. Their local communities also seem to be growing.
 
Moreover, their community started ‘Zillacracy’ together with the Zilliqa core team ( see www.zillacracy.com ). It’s a community-run initiative where people from all over the world are now helping with marketing and development on Zilliqa. Since its launch in February 2020 they have been doing a lot, and they will also run their own non-custodial seed node for staking. This seed node will allow them to start generating revenue and become a self-sustaining entity that could potentially scale up to a decentralized company working in parallel with the Zilliqa core team. Comparing it to all the other smart contract platforms (e.g. Cardano, EOS, Tezos etc.), they don't seem to have started a similar initiative (correct me if I’m wrong though). This suggests, in my opinion, that these other smart contract platforms do not fully understand how to utilize the ‘power of the community’. This is something you cannot ‘buy with money’ and it gives many projects in the space a disadvantage.
 
Zilliqa also released two social products called SocialPay and Zeeves. SocialPay allows users to earn ZILs while tweeting with a specific hashtag. They have recently used it in partnership with the Singapore Red Cross for a marketing campaign after their initial pilot program. It seems like a very valuable social product with a good use case. I can see a lot of traditional companies entering the space through this product, which they seem to suggest will happen. Tokenizing hashtags with smart contracts to get network effect is a very smart and innovative idea.
 
Regarding Zeeves, this is a tipping bot for Telegram. They already have thousands of signups and they plan to keep upgrading it so more and more people use it (e.g. they recently added a quiz feature). They also use it during AMAs to reward people in real-time. It’s a very smart approach to grow their communities and get people familiar with ZIL. I can see this becoming very big on Telegram. This tool suggests, again, that the Zilliqa team has a deeper understanding of what the crypto space and community need and is good at finding the right innovative tools to grow and scale.
 
To be honest, I haven’t covered everything (I’m also reaching the character limit haha). So many updates have been happening lately that it's hard to keep up, such as the International Monetary Fund mentioning Zilliqa in their report, custodial and non-custodial staking, Binance margin, futures, widget, entering the Indian market, and more. The Head of Marketing Colin Miles has also released this as an overview of what is coming next. And last but not least, Vitalik Buterin has been mentioning Zilliqa lately, acknowledging Zilliqa and mentioning that both projects have a lot of room to grow. There is much more info of course and a good part of it has been served to you on a silver platter. I invite you to continue researching by yourself :-) And if you have any comments or questions please post here!
submitted by haveyouheardaboutit to CryptoCurrency [link] [comments]

Why NANO dropped 98% and why you shouldn't be worried (TL;DR: BULLISH)

The #1 complaint I hear about Nano is that it has dropped 98% from its all-time high. This is SOOOOO misleading. I felt like those fudders (and perhaps some Nano supporters, too) could use a little history lesson regarding Nano's price.
Jan 1st, 2017
The 2017 bull market didn't start until ETH started taking off in price in March. So this is a time with zero hype, making it a good baseline for true ROI. At this time Nano didn't have measurable exchange volume and was being given away for free via faucets (so its price could be considered approximately $0.00).
| Coin | Price on 1/1/17 |
|------|-----------------|
| BTC | $973 |
| ETH | $8.23 |
| NANO | ~$0.00 |
May 1st, 2017
Nano started appearing on exchanges in March. By May it had a pretty stable trading price of approximately one penny.
| Coin | Price on 5/1/17 | ROI since 1/1/17 |
|------|-----------------|------------------|
| BTC | $1,470 | 51% |
| ETH | $87 | 957% |
| NANO | $0.014 | ~∞ |
Dec 17th, 2017
Date of Bitcoin's ATH. Nano's price should have stopped rising around here (at $2.23), but for some reason the May Nano holders didn't sell despite a meteoric 15,000% ROI in just a few months. No one in their right mind would pass up a 160x price increase (those are Lambo gains), so we know for a fact that they will sell eventually.
| Coin | Price on 12/17/17 | ROI since 5/1/17 | ROI since 1/1/17 |
|------|-------------------|------------------|------------------|
| BTC | $19,886 | 1253% | 1944% |
| ETH | $737 | 747% | 8855% |
| NANO | $2.23 | 15,828% | ~∞ |
Jan 5th, 2018
Date of Nano's ATH. The May Nano holders' greed paid off - they are now sitting at an unfathomable 255,000% ROI in just 7 months.
| Coin | Price on 1/5/18 | ROI since 5/1/17 | ROI since 1/1/17 |
|------|-----------------|------------------|------------------|
| BTC | $15,000 | 920% | 1442% |
| ETH | $1,032 | 1086% | 12,439% |
| NANO | $35.74 | 255,185% | ~∞ |
May 18th, 2020 (Today)
Compared to ETH and BTC, NANO is still the better performing asset since the start of 2017.
| Coin | Price on 5/18/20 | ROI since 5/1/17 | ROI since 1/1/17 |
|------|------------------|------------------|------------------|
| BTC | $9,710 | 560% | 898% |
| ETH | $213 | 145% | 2488% |
| NANO | $0.80 | 5614% | ~∞ |
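For the skeptics, the ROI figures in these tables come straight from the price ratios — a quick sketch, ignoring fees and inflation:

```python
def roi_pct(start_price: float, end_price: float) -> float:
    """Percentage gain from start_price to end_price."""
    return (end_price / start_price - 1) * 100

print(f"NANO 5/1/17 -> 5/18/20: {roi_pct(0.014, 0.80):,.0f}%")  # ~5,614%
print(f"BTC  1/1/17 -> 5/18/20: {roi_pct(973, 9710):,.0f}%")    # ~898%
```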
So why am I bullish?
Nano's price drop is easily explained by "Bigger the rise, bigger the fall". Nano probably had the largest and fastest rise of any trade-able asset in the history of the world (can anything else beat 255,000% in just 7 months?). People can FUD the price drop all they want, but they are being misleading by ignoring the unbelievable price rise that preceded it.
The 98% price drop was a GOOD thing (sounds delusional, I know) because it means those original May 2017 holders have finally sold off their position. Now that the original holders of that meteoric rise are done taking profit, we have seen the Nano price flatten out over the past two years and become more stable. The last time Nano had a stable price, it was trading for just $0.014, and now it's $0.80 (sweet!).
The stable price matters because it sets a new baseline for future ROI expectations. Future price increases will be measured from today's $0.80 price point instead of the previous $0.014. BTC has had several re-baselining cycles; this is Nano's first. And unlike BTC and other assets, Nano has no inflation to provide additional sell pressure.
From an economic standpoint, Nano is in a better place right now. So if you feel that Nano has a place in the future of crypto (DYOR still applies), then you should feel comfortable buying in at this price.
submitted by _PaamayimNekudotayim to nanotrade [link] [comments]

NEM lists XEM on bitFlyer ahead of Symbol launch

NEM XEM holders should prepare for a blockchain snapshot to receive a 1-for-1 allocation of the new XYM token on Symbol

XEM, NEM’s native token on the original NEM blockchain (NIS1), has been listed on Japanese crypto exchange bitFlyer. Based in Tokyo, bitFlyer was founded in January 2014 by Yuzo Kano, a former Goldman Sachs trader. bitFlyer has offices in Tokyo, San Francisco, and Luxembourg, and is one of the largest Japanese crypto exchanges by trading volume.
Iain Wilson, Chief Financial Officer of NEM Group and Managing Director of NEM Trading, says the Japanese community has played an important role in the growth of the NEM ecosystem over the last five years. “The bitFlyer listing will encourage the growth of the community, both in Japan and beyond. The listing deepens their involvement in both NIS1 and Symbol, and brings increased liquidity to our community.”
NEM’s Japanese userbase will benefit from the liquidity of the bitFlyer exchange and the listing will prepare XEM holders for the upcoming launch of Symbol from NEM.
Symbol is a next-generation blockchain solution designed for enterprise use. Symbol (codename Catapult) represents a full rewrite of the NEM protocol. Due to launch later this year, it has been in development since 2018. It is designed to help businesses cut costs, reduce complexities, and streamline innovation, and is expected to provide significant upgrades in flexibility, security, speed, and ease of use over the original NEM blockchain.
Jeff McDonald, co-founder of the NEM Foundation says the NEM developer team has applied all the learnings that come from running version 1 of their blockchain in production for four years. “Symbol is a more performant, scalable, and feature-rich protocol as a result. It will become the core NEM engine, powering both private and public blockchains,” explained McDonald.
When the Symbol chain launches it will be introduced in parallel with the original NEM NIS1 blockchain. This will give existing holders of XEM agency throughout the migration. XEM holders who choose to opt-in will obtain an XYM balance equal to their XEM balance at the time of the snapshot, a timestamp when user balances in XEM on the NEM blockchain will be recorded.
XYM is the native currency of the Symbol public blockchain. It will be used to pay for transactions in order to incentivize the network of public nodes that process transactions.
Selected data on the NEM public chain will migrate to Symbol when the public chain is released. When the Symbol public chain begins, all XYM will be allocated based on each user’s XEM account balances on the NEM blockchain. XEM to XYM allocations will be 1-for-1, meaning that 1 XEM on NEM gets a holder 1 XYM on Symbol. However, all allocations must be manually claimed by each account holder. Other data, such as root namespaces and multisignature account configurations will also be migrated and must be manually claimed.
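A minimal sketch of the allocation logic as described (the structure, addresses and balances are placeholders for illustration; this is not NEM's actual migration code):

```python
# XEM balances recorded at the snapshot block (addresses are placeholders).
snapshot_balances = {
    "NA...ALICE": 1_500.0,
    "NB...BOB":      42.0,
}

XEM_TO_XYM_RATIO = 1.0            # 1 XEM on NEM gets a holder 1 XYM on Symbol

def claim_allocation(address: str) -> float:
    """Allocations exist for every snapshot balance but must be claimed."""
    return snapshot_balances[address] * XEM_TO_XYM_RATIO

# Only holders who opt in and claim actually receive their XYM.
print(claim_allocation("NA...ALICE"))   # 1500.0
```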
XYM will be allocated based on each user’s on-chain XEM balance. The NEM team recommends that users move XEM off exchanges and into an on-chain account. Some exchanges may yet decide to provide migration for their users, but the only way to be certain is to hold XEM in your own account on-chain.
Several weeks before launch, the team will take a snapshot of the NEM chain at a specific block. Allocations will be finalized at this block. Transactions that happen after the snapshot block will not affect the new Symbol chain.
The NEM team has not yet announced when the snapshot will take place, but they will announce the date well in advance so all users are able to make the necessary arrangements. Snapshot data will be auditable and transparent.
Source
submitted by charlesgwynne to ICOAnalysis [link] [comments]

RESEARCH REPORT ABOUT KYBER NETWORK

Author: Gamals Ahmed, CoinEx Business Ambassador


ABSTRACT

In this research report, we present a study on Kyber Network. Kyber Network is a decentralized, on-chain liquidity protocol designed to make trading tokens simple, efficient, robust and secure.
Kyber's design allows any party to contribute to an aggregated pool of liquidity within each blockchain while providing a single endpoint for takers to execute trades using the best rates available. The team envisions a connected liquidity network that facilitates seamless, decentralized cross-chain token swaps across Kyber-based networks on different chains.
Kyber is a fully on-chain liquidity protocol that enables decentralized exchange of cryptocurrencies in any application. Liquidity providers (Reserves) are integrated into one single endpoint for takers and users. When a user requests a trade, the protocol will scan the entire network to find the reserve with the best price and take liquidity from that particular reserve.

1.INTRODUCTION

DeFi applications all need access to good liquidity sources, which is a critical component to provide good services. Currently, decentralized liquidity is comprised of various sources including DEXes (Uniswap, OasisDEX, Bancor), decentralized funds and other financial apps. The more scattered the sources, the harder it becomes for anyone to either find the best rate for their trade or to even find enough liquidity for their need.
Kyber is a blockchain-based liquidity protocol that aggregates liquidity from a wide range of reserves, powering instant and secure token exchange in any decentralized application.
The protocol allows for a wide range of implementation possibilities for liquidity providers, allowing a wide range of entities to contribute liquidity, including end users, decentralized exchanges and other decentralized protocols. On the taker side, end users, cryptocurrency wallets, and smart contracts are able to perform instant and trustless token trades at the best rates available amongst the sources.
The Kyber Network is a project based on the Ethereum protocol that seeks to completely decentralize the exchange of cryptocurrencies and make exchange trustless by keeping everything on the blockchain.
Through the Kyber Network, users should be able to instantly convert or exchange any crypto currency.

1.1 OVERVIEW ABOUT KYBER NETWORK PROTOCOL

The Kyber Network is a decentralized way to exchange ETH and different ERC20 tokens instantly — no waiting and no registration needed.
Using this protocol, developers can build innovative payment flows and applications, including instant token swap services, ERC20 payments, and financial DApps — helping to build a world where any token is usable anywhere.
Kyber’s fully on-chain design allows for full transparency and verifiability in the matching engine, as well as seamless composability with DApps, not all of which are possible with off-chain or hybrid approaches. The integration of a large variety of liquidity providers also makes Kyber uniquely capable of supporting sophisticated schemes and catering to the needs of DeFi DApps and financial institutions. Hence, many developers leverage Kyber’s liquidity pool to build innovative financial applications, and not surprisingly, Kyber is the most used DeFi protocol in the world.
The Kyber Network is quite an established project that is trying to change the way we think of decentralised crypto currency exchange.
The Kyber Network has seen very rapid development. After being announced in May 2017 the testnet for the Kyber Network went live in August 2017. An ICO followed in September 2017, with the company raising 200,000 ETH valued at $60 million in just one day.
The live main net was released in February 2018 to whitelisted participants, and on March 19, 2018, the Kyber Network opened the main net as a public beta. Since then the network has seen increasing growth, with network volumes growing more than 500% in the first half of 2019.
There was a modest decrease in August 2019, though, which can be attributed to the price of ETH dropping by 50%, impacting the overall total volumes being traded and processed globally.
They are developing a decentralised exchange protocol that will allow developers to build payment flows and financial apps. This is indeed quite a competitive market as a number of other such protocols have been launched.
In Brief:
- Kyber Network is a tool that allows anyone to swap tokens instantly without having to use exchanges.
- It allows vendors to accept different types of cryptocurrency while still being paid in their preferred crypto of choice.
- It’s built primarily for Ethereum, but any smart-contract based blockchain can incorporate it.
At its core, Kyber is a decentralized way to exchange ETH and different ERC20 tokens instantly–no waiting and no registration needed. To do this Kyber uses a diverse set of liquidity pools, or pools of different crypto assets called “reserves” that any project can tap into or integrate with.
A typical use case would be if a vendor allowed customers to pay in whatever currency they wish, but receive the payment in their preferred token. Another example would be for Dapp users. At present, if you are not a token holder of a certain Dapp you can’t use it. With Kyber, you could use your existing tokens, instantly swap them for the Dapp specific token and away you go.
All this swapping happens directly on the Ethereum blockchain, meaning every transaction is completely transparent.

1.1.1 WHY BUILD THE KYBER NETWORK?

While crypto currencies were built to be decentralized, many of the exchanges for trading crypto currencies have become centralized affairs. This has led to security vulnerabilities, with many exchanges becoming the victims of hacking and theft.
It has also led to increased fees and costs, and the centralized exchanges often come with slow transfer times as well. In some cases, wallets have been locked and users are unable to withdraw their coins.
Decentralized exchanges have popped up recently to address the flaws in the centralized exchanges, but they have their own flaws, most notably a lack of liquidity, and often times high costs to modify trades in their on-chain order books.

Some of the Integrations with Kyber Protocol
The Kyber Network was formed to provide users with a decentralized exchange that keeps everything right on the blockchain, and uses a reserve system rather than an order book to provide high liquidity at all times. This will allow for the exchange and transfer of any cryptocurrency, even across exchanges, and costs will be kept at a minimum as well.
The Kyber Network has three guiding design philosophies since the start:
  1. To be most useful the network needs to be platform-agnostic, which allows any protocol or application the ability to take advantage of the liquidity provided by the Kyber Network without any impact on innovation.
  2. The network was designed to make real-world commerce and decentralized financial products not only possible but also feasible. It does this by allowing for instant token exchange across a wide range of tokens, and without any settlement risk.
  3. The Kyber Network was created with ease of integration as a priority, which is why everything runs fully on-chain and fully transparent. Kyber is not only developer-friendly, but is also compatible with a wide variety of systems.

1.1.2 WHO INVENTED KYBER?

Kyber’s founders are Loi Luu, Victor Tran, Yaron Velner — CEO, CTO, and advisor to the Kyber Network.

1.1.3 WHAT DISTINGUISHES KYBER?

Kyber’s mission has always been to integrate with other protocols so they’ve focused on being developer-friendly by providing architecture to allow anyone to incorporate the technology onto any smart-contract powered blockchain. As a result, a variety of different dapps, vendors, and wallets use Kyber’s infrastructure including Set Protocol, bZx, InstaDApp, and Coinbase wallet.
Besides, dapps, vendors, and wallets, Kyber also integrates with other exchanges such as Uniswap — sharing liquidity pools between the two protocols.
Limit orders on Kyber allow users to set a specific price in which they would like to exchange a token instead of accepting whatever price currently exists at the time of trading. However, unlike with other exchanges, users never lose custody of their crypto assets during limit orders on Kyber.
The Kyber protocol works by using pools of crypto funds called “reserves”, which currently support over 70 different ERC20 tokens. Reserves are essentially smart contracts with a pool of funds. Different parties with different prices and levels of funding control the reserves. Instead of using order books to match buyers and sellers to return the best price, the Kyber protocol looks at all the reserves and returns the best price among them. Reserves make money on the “spread”, or difference between the buying and selling prices. Kyber wants any token holder to be able to easily convert one token to another with a minimum of fuss.

1.2 KYBER PROTOCOL

The protocol smart contracts offer a single interface for the best available token exchange rates to be taken from an aggregated liquidity pool across diverse sources.
- Aggregated liquidity pool. The protocol aggregates various liquidity sources into one liquidity pool, making it easy for takers to find the best rates offered with one function call.
- Diverse sources of liquidity. The protocol allows different types of liquidity sources to be plugged in. Liquidity providers may employ different strategies and different implementations to contribute liquidity to the protocol.
- Permissionless. The protocol is designed to be permissionless, where any developer can set up various types of reserves, and any end user can contribute liquidity. Implementations need to take into consideration various security vectors, such as reserve spamming, but these can be mitigated through a staking mechanism. We can expect implementations to be permissioned initially until the maintainers are confident about these considerations.
The core feature that the Kyber protocol facilitates is the token swap between taker and liquidity sources. The protocol aims to provide the following properties for token trades:
- Instant Settlement. Takers do not have to wait for their orders to be fulfilled, since trade matching and settlement occurs in a single blockchain transaction. This enables trades to be part of a series of actions happening in a single smart contract function.
- Atomicity. When takers make a trade request, their trade either gets fully executed, or is reverted. This “all or nothing” aspect means that takers are not exposed to the risk of partial trade execution.
- Public rate verification. Anyone can verify the rates that are being offered by reserves and have their trades instantly settled just by querying from the smart contracts.
- Ease of integration. Trustless and atomic token trades can be directly and easily integrated into other smart contracts, thereby enabling multiple trades to be performed in a smart contract function.
How each actor works is specified in the Network Actors section:
1. Takers refer to anyone who can directly call the smart contract functions to trade tokens, such as end-users, DApps, and wallets.
2. Reserves refer to anyone who wishes to provide liquidity. They have to implement the smart contract functions defined in the reserve interface in order to be registered and have their token pairs listed.
3. Registered reserves refer to those that will be cycled through for matching taker requests.
4. Maintainers refer to anyone who has permission to access the functions for the adding/removing of reserves and token pairs, such as a DAO or the team behind the protocol implementation.
5. Together, these actors comprise the network — all parties involved in any given implementation of the protocol.
The protocol implementation needs to have the following:
1. Functions for takers to check rates and execute the trades
2. Functions for the maintainers to register/remove reserves and token pairs
3. A reserve interface that defines the functions reserves need to implement

1.3 KYBER CORE SMART CONTRACTS

Kyber Core smart contracts is an implementation of the protocol that has major protocol functions to allow actors to join and interact with the network. For example, the Kyber Core smart contracts provide functions for the listing and delisting of reserves and trading pairs by having clear interfaces for the reserves to comply to be able to register to the network and adding support for new trading pairs. In addition, the Kyber Core smart contracts also provide a function for takers to query the best rate among all the registered reserves, and perform the trades with the corresponding rate and reserve. A trading pair consists of a quote token and any other token that the reserve wishes to support. The quote token is the token that is either traded from or to for all trades. For example, the Ethereum implementation of the Kyber protocol uses Ether as the quote token.
In order to search for the best rate, all reserves supporting the requested token pair will be iterated through. Hence, the Kyber Core smart contracts need to have this search algorithm implemented.
The key functions implemented in the Kyber Core Smart Contracts are listed in Figure 2 below. We will visit and explain the implementation details and security considerations of each function in the Specification Section.

1.4 HOW KYBER’S ON-CHAIN PROTOCOL WORKS?

Kyber is the liquidity infrastructure for decentralized finance. Kyber aggregates liquidity from diverse sources into a pool, which provides the best rates for takers such as DApps, Wallets, DEXs, and End users.

1.4.1 PROVIDING LIQUIDITY AS A RESERVE

Anyone can operate a Kyber Reserve to market make for profit and make their tokens available for DApps in the ecosystem. Through an open reserve architecture, individuals, token teams and professional market makers can contribute token assets to Kyber’s liquidity pool and earn from the spread in every trade. These tokens become available at the best rates across DApps that tap into the network, making them instantly more liquid and useful.
MAIN RESERVE TYPES

Kyber currently has over 45 reserves in its network providing liquidity. There are 3 main types of reserves that allow different liquidity contribution options to suit the unique needs of different providers.
1. Automated Price Reserves (APR) — Allows token teams and users with large token holdings to have an automated yet customized pricing system with low maintenance costs. Synthetix and Melon are examples of teams that run APRs.
2. Fed Price Reserves (FPR) — Operated by professional market makers that require custom and advanced pricing strategies tailored to their specific needs. Kyber, alongside reserves such as OneBit, runs FPRs.
3. Bridge Reserves (BR) — These are specialized reserves meant to bring liquidity from other on-chain liquidity providers like Uniswap, Oasis, DutchX, and Bancor into the network.

1.5 KYBER NETWORK ROLES

The Kyber Network functions through coordination between several different roles, as explained below:
- Users — This entity uses the Kyber Network to send and receive tokens. A user can be an individual, a merchant, and even a smart contract account.
- Reserve Entities — This role is used to add liquidity to the platform through the dynamic reserve pool. Some reserve entities are internal to the Kyber Network, but others may be registered third parties. Reserve entities may be public if the public contributes to the reserves they hold; otherwise they are considered private. By allowing third parties as reserve entities the network adds diversity, which prevents monopolization and keeps exchange rates competitive. Allowing third-party reserve entities also allows for the listing of less popular coins with lower volumes.
- Reserve Contributors — Where reserve entities are classified as public, the reserve contributor is the entity providing reserve funds. Their incentive for doing so is a profit share from the reserve.
- The Reserve Manager — Maintains the reserve, calculates exchange rates and enters them into the network. The reserve manager profits from exchange spreads set by them on their reserves. They can also benefit from increasing volume by accessing the entire Kyber Network.
- The Kyber Network Operator — Currently the Kyber Network team is filling the role of the network operator, which has a function to add/remove Reserve Entities as well as controlling the listing of tokens. Eventually, this role will revert to a proper decentralized governance.

1.6 BASIC TOKEN TRADE

A basic token trade is one that has the quote token as either the source or destination token of the trade request. The execution flow of a basic token trade is depicted below, where a taker would like to exchange BAT tokens for ETH as an example. The trade happens in a single blockchain transaction.
1. Taker sends 1 ETH to the protocol contract, and would like to receive BAT in return.
2. Protocol contract queries the first reserve for its ETH to BAT exchange rate.
3. Reserve 1 offers an exchange rate of 1 ETH for 800 BAT.
4. Protocol contract queries the second reserve for its ETH to BAT exchange rate.
5. Reserve 2 offers an exchange rate of 1 ETH for 820 BAT.
6. This process goes on for the other reserves. After the iteration, reserve 2 is discovered to have offered the best ETH to BAT exchange rate.
7. Protocol contract sends 1 ETH to reserve 2.
8. The reserve sends 820 BAT to the taker.
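In pseudocode terms, the rate query in steps 2–6 is a simple maximisation over registered reserves. Here is a Python sketch of the idea (the real implementation is an on-chain Solidity contract; names and structure here are assumptions):

```python
# Each reserve quotes its own ETH -> BAT rate; the protocol contract iterates
# over all registered reserves and trades against the best quote.
reserves = {
    "reserve_1": 800.0,   # BAT per ETH
    "reserve_2": 820.0,
}

def best_rate(quotes: dict[str, float]) -> tuple[str, float]:
    name = max(quotes, key=quotes.get)
    return name, quotes[name]

def trade_eth_for_bat(eth_amount: float) -> tuple[str, float]:
    name, rate = best_rate(reserves)
    return name, eth_amount * rate

print(trade_eth_for_bat(1.0))   # ('reserve_2', 820.0) -- taker receives 820 BAT
```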

1.7 TOKEN-TO-TOKEN TRADE

A token-to-token trade is one where the quote token is neither the source nor the destination token of the trade request. The exchange flow of a token-to-token trade is depicted below, where a taker would like to exchange BAT tokens for DAI as an example. The trade happens in a single blockchain transaction.
1. Taker sends 50 BAT to the protocol contract, and would like to receive DAI in return.
2. Protocol contract sends 50 BAT to the reserve offering the best BAT to ETH rate.
3. Protocol contract receives 1 ETH in return.
4. Protocol contract sends 1 ETH to the reserve offering the best ETH to DAI rate.
5. Protocol contract receives 30 DAI in return.
6. Protocol contract sends 30 DAI to the user.
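The token-to-token case is just two of the basic trades above chained through the quote token. A sketch using the example rates (again an illustration, not the actual contract code):

```python
# Two hops through ETH, the quote token: BAT -> ETH, then ETH -> DAI.
best_rates = {
    ("BAT", "ETH"): 1 / 50,   # 50 BAT buys 1 ETH
    ("ETH", "DAI"): 30.0,     # 1 ETH buys 30 DAI
}

def swap(amount: float, src: str, dst: str) -> float:
    return amount * best_rates[(src, dst)]

eth = swap(50, "BAT", "ETH")   # 1.0 ETH
dai = swap(eth, "ETH", "DAI")  # 30.0 DAI
print(dai)
```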

2.KYBER NETWORK CRYSTAL (KNC) TOKEN

Kyber Network Crystal (KNC) is an ERC-20 utility token and an integral part of Kyber Network.
KNC is the first deflationary staking token where staking rewards and token burns are generated from actual network usage and growth in DeFi.
The Kyber Network Crystal (KNC) is the backbone of the Kyber Network. It works to connect liquidity providers and those who need liquidity, and serves three distinct purposes. The first of these is to collect transaction fees, and a portion of every fee collected is burned, which keeps KNC deflationary. Kyber Network Crystals (KNC) are named after the crystals in Star Wars used to power lightsabers.
The KNC also ensures the smooth operation of the reserve system in the Kyber liquidity since entities must use third-party tokens to buy the KNC that pays for their operations in the network.
KNC allows token holders to play a critical role in determining the incentive system, building a wide base of stakeholders, and facilitating economic flow in the network. A small fee is charged each time a token exchange happens on the network, and KNC holders get to vote on this fee model and distribution, as well as other important decisions. Over time, as more trades are executed, additional fees will be generated for staking rewards and reserve rebates, while more KNC will be burned.
- Participation rewards — KNC holders can stake KNC in the KyberDAO and vote on key parameters. Voters will earn staking rewards (in ETH).
- Burning — Some of the network fees will be burned to reduce KNC supply permanently, providing long-term value accrual from decreasing supply.
- Reserve incentives — KNC holders determine the portion of network fees that are used as rebates for selected liquidity providers (reserves) based on their volume performance.

Finally, the KNC token is the connection between the Kyber Network and the exchanges, wallets, and dApps that leverage the liquidity network. This is a virtuous system since entities are rewarded with referral fees for directing more users to the Kyber Network, which helps increase adoption for Kyber and for the entities using the Network.
And of course there will soon be a fourth and a fifth use for KNC: as a staking token used to generate passive income, and as a governance token used to vote on key parameters of the network.
The Kyber Network Crystal (KNC) was released in a September 2017 ICO at a price around $1. There were 226,000,000 KNC minted for the ICO, with 61% sold to the public. The remaining 39% are controlled 50/50 by the company and the founders/advisors, with a 1 year lockup period and 2 year vesting period.
Currently, just over 180 million coins are in circulation, and the total supply has been reduced to 210.94 million after the company burned its 1 millionth KNC token in May 2019 and then its 2 millionth KNC token just three months later.
That means that while it took 15 months to burn the first million KNC, it took just 10 weeks to burn the second million. That shows how rapidly adoption has been growing recently for Kyber, with July 2019 USD trading volumes on the Kyber Network nearly reaching $60 million. This volume has continued growing, and on March 13, 2020 the network experienced its highest daily trading activity of $33.7 million in a 24-hour period.
Currently KNC is required by Reserve Managers to operate on the network, which ensures a minimum amount of demand for the token. Combined with future plans for burning coins, price is expected to maintain an upward bias, although it has suffered along with the broader market in 2018 and more recently during the summer of 2019.
It was unfortunate that in 2020 a budding rally was cut short by the coronavirus pandemic, although the token has stabilized as of April 2020, and there are hopes the rally could resume in the summer of 2020.

2.1 HOW ARE KNC TOKENS PRODUCED?

The native token of Kyber is called Kyber Network Crystals (KNC). All reserves are required to pay fees in KNC for the right to manage reserves. The KNC collected as fees are either burned and taken out of the total supply or awarded to integrated dapps as an incentive to help them grow.

2.2 HOW DO YOU GET HOLD OF KNC TOKENS?

Kyber Swap can be used to buy ETH directly using a credit card, which can then be used to swap for KNC. Besides Kyber itself, exchanges such as Binance, Huobi, and OKex trade KNC.

2.3 WHAT CAN YOU DO WITH KYBER?

The most direct and basic function of Kyber is instantly swapping tokens without registering an account, which anyone can do using an Ethereum wallet such as MetaMask. Users can also create their own reserves and contribute funds to a reserve, but that process is still a fairly technical one — something Kyber is working on making easier for users in the future.

2.4 THE GOAL OF KYBER THE FUTURE

The goal of Kyber in the coming years is to solidify its position as a one-stop solution for powering liquidity and token swapping on Ethereum. Kyber plans on a major protocol upgrade called Katalyst, which will create new incentives and growth opportunities for all stakeholders in their ecosystem, especially KNC holders. The upgrade will mean more use cases for KNC including to use KNC to vote on governance decisions through a decentralized organization (DAO) called the KyberDAO.
With our upcoming Katalyst protocol upgrade and new KNC model, Kyber will provide even more benefits for stakeholders. For instance, reserves will no longer need to hold a KNC balance for fees, removing a major friction point, and there will be rebates for top performing reserves. KNC holders can also stake their KNC to participate in governance and receive rewards.

2.5 BUYING & STORING KNC

Those interested in buying KNC tokens can do so at a number of exchanges. Perhaps your best bets among them are the likes of Coinbase Pro and Binance. The former is based in the USA whereas the latter is an offshore exchange.
The trading volume is well spread out at these exchanges, which means that the liquidity is not concentrated and dependent on any one exchange. You also have decent liquidity on each of the exchange books. For example, the Binance BTC / KNC books are wide and there is decent turnover. This means easier order execution.
KNC is an ERC20 token and can be stored in any wallet with ERC20 support, such as MyEtherWallet or MetaMask. One interesting alternative is the KyberSwap Android mobile app that was released in August 2019.
It allows for instant swapping of tokens and has support for over 70 different altcoins. It also allows users to set price alerts and limit orders and works as a full-featured Ethereum wallet.

2.6 KYBER KATALYST UPGRADE

Kyber has announced their intention to become the de facto liquidity layer for the Decentralized Finance space, aiming to have Kyber as the single on-chain endpoint used by the majority of liquidity providers and dApp developers. In order to achieve this goal the Kyber Network team is looking to create an open ecosystem that garners trust from the decentralized finance space. They believe this is the path that will lead the majority of projects, developers, and users to choose Kyber for liquidity needs. With that in mind they have recently announced the launch of a protocol upgrade to Kyber which is being called Katalyst.
The Katalyst upgrade will create a stronger ecosystem by creating strong alignments towards a common goal, while also strengthening the incentives for stakeholders to participate in the ecosystem.
The primary beneficiaries of the Katalyst upgrade will be the three major Kyber stakeholders:
1. Reserve managers who provide network liquidity;
2. dApps that connect takers to Kyber;
3. KNC holders.
These stakeholders can expect to see benefits as highlighted below: Reserve Managers will see two new benefits to providing liquidity for the network. The first of these benefits will be incentives for providing reserves. Once Katalyst is implemented part of the fees collected will go to the reserve managers as an incentive for providing liquidity.
This mechanism is similar to rebates in traditional finance, and is expected to drive the creation of additional reserves and market making, which in turn will lead to greater liquidity and platform reach.
Katalyst will also do away with the need for reserve managers to maintain a KNC balance for use as network fees. Instead, fees will be automatically collected and used as incentives or burned as appropriate. This should remove a great deal of friction for reserves to connect with Kyber without affecting the competitive exchange rates that takers in the system enjoy.

dApp Integrators will now be able to set their own spread, which will give them full control over their own business model. This means the current fee sharing program that shares 30% of the 0.25% fee with dApp developers will go away and developers will determine their own spread. It’s believed this will increase dApp development on Kyber as developers will now be in control of fees.
KNC Holders, often thought of as the core of the Kyber Network, will be able to take advantage of a new staking mechanism that will allow them to receive a portion of network fees by staking their KNC and participating in the KyberDAO.

2.7 COMING KYBERDAO

With the implementation of the Katalyst protocol the KNC holders will be put right at the heart of Kyber. Holders of KNC tokens will now have a critical role to play in determining the future economic flow of the network, including its incentive systems.
The primary way this will be achieved is through KyberDAO, a way in which on-chain and off-chain governance will align to streamline cooperation between the Kyber team, KNC holders, and market participants.
The Kyber Network team has identified 3 key areas of consideration for the KyberDAO:
1. Broad representation, transparent governance and network stability
2. Strong incentives for KNC holders to maintain their stake and be highly involved in governance
3. Maximizing participation with a wide range of options for voting delegation
Interaction between KNC Holders & Kyber
This means KNC holders have been empowered to determine the network fee and how to allocate the fees to ensure maximum network growth. KNC holders will now have three fee allocation options to vote on:
- Voting Rewards: Immediate value creation. Holders who stake and participate in the KyberDAO get their share of the fees designated for rewards.
- Burning: Long-term value accrual. The decreasing supply of KNC will improve the token appreciation over time and benefit those who did not participate.
- Reserve Incentives: Value creation via network growth. By rewarding Kyber reserve managers based on their performance, it helps to drive greater volume, value, and network fees.
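Mechanically, the vote determines a three-way split of collected network fees, along the lines of this sketch (the percentages are placeholders I made up; the actual split is whatever the DAO votes for):

```python
def allocate_fees(total_fees_eth: float,
                  reward_pct: float, burn_pct: float, rebate_pct: float) -> dict:
    """Split collected network fees per the DAO-voted proportions."""
    assert abs(reward_pct + burn_pct + rebate_pct - 1.0) < 1e-9
    return {
        "voting_rewards":  total_fees_eth * reward_pct,
        "knc_burn":        total_fees_eth * burn_pct,
        "reserve_rebates": total_fees_eth * rebate_pct,
    }

# Placeholder proportions, just to show the shape of the outcome.
print(allocate_fees(100.0, 0.65, 0.05, 0.30))
```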

2.8 TRANSPARENCY AND STABILITY

The design of the KyberDAO is meant to allow for the greatest network stability, as well as maximum transparency and the ability to quickly recover in emergency situations. Initially the Kyber team will remain as maintainers of the KyberDAO. The system is being developed to be as verifiable as possible, while still maintaining maximum transparency regarding the role of the maintainer in the DAO.
Part of this transparency means that all data and processes are stored on-chain if feasible. Voting regarding network fees and allocations will be done on-chain and will be immutable. In situations where on-chain storage or execution is not feasible there will be a set of off-chain governance processes developed to ensure all decisions are followed through on.

2.9 KNC STAKING AND DELEGATION

Staking will be a new addition and both staking and voting will be done in fixed periods of times called “epochs”. These epochs will be measured in Ethereum block times, and each KyberDAO epoch will last roughly 2 weeks.
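Back-of-the-envelope, assuming a ~13-second average Ethereum block time (an assumption; block times vary):

```python
SECONDS_PER_BLOCK = 13               # assumed Ethereum average
EPOCH_SECONDS = 14 * 24 * 60 * 60    # roughly two weeks

print(EPOCH_SECONDS // SECONDS_PER_BLOCK)   # ~93,046 blocks per epoch
```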
This is a relatively rapid epoch and it is beneficial in that it gives more rapid DAO conclusion and decision-making, while also conferring faster reward distribution. On the downside it means there needs to be a new voting campaign every two weeks, which requires more frequent participation from KNC stakeholders, as well as more work from the Kyber team.
Delegation will be part of the protocol, allowing stakers to delegate their voting rights to third-party pools or other entities. The pools receiving the delegation rights will be free to determine their own fee structure and voting decisions. Because the pools will share in rewards, and because their voting decisions will be clearly visible on-chain, it is expected that they will continue to work to the benefit of the network.

3. TRADING

After the September 2017 ICO, KNC settled into a trading price that hovered around $1.00 (decreasing in BTC value) until December. The token has followed the trend of most other altcoins — rising in price through December and sharply declining toward the beginning of January 2018.
The KNC price fell throughout all of 2018 with one exception during April. From April 6th to April 28th, the price rose over 200 percent. This run-up coincided with a blog post outlining plans to bring Bitcoin to the Ethereum blockchain. Since then, however, the price has steadily fallen, currently resting on what looks like a $0.15 (~0.000045 BTC) floor.
With the number of partners using the Kyber Network, the price may rise as they begin to fully use the network. The development team has consistently hit the milestones they’ve set out to achieve, so make note of any release announcements on the horizon.

4. COMPETITION

The 0x project is the biggest competitor to Kyber Network. Both teams are attempting to enter the decentralized exchange market. The primary difference between the two is that Kyber performs the entire exchange process on-chain while 0x keeps the order book and matching off-chain.
As a crypto swap exchange, the platform also competes with ShapeShift and Changelly.

5.KYBER MILESTONES

• June 2020: Digifox, an all-in-one finance application by popular crypto trader and Youtuber Nicholas Merten a.k.a DataDash (340K subs), integrated Kyber to enable users to easily swap between cryptocurrencies without having to leave the application.
• June 2020: Stake Capital partnered with Kyber to provide convenient KNC staking and delegation services, and also took a KNC position to participate in governance.
• June 2020: Outlined the benefits of the Fed Price Reserve (FPR) for professional market makers and advanced developers.
• May 2020: Kyber crossed US$1 Billion in total trading volume and 1 Million transactions, performed entirely on-chain on Ethereum.
• May 2020: StakeWith.Us partnered Kyber Network as a KyberDAO Pool Master.
• May 2020: 2Key, a popular blockchain referral solution using smart links, integrated Kyber’s on-chain liquidity protocol for seamless token swaps.
• May 2020: Blockchain game League of Kingdoms integrated Kyber to accept Token Payments for Land NFTs.
• May 2020: Joined the Zcash Developer Alliance, an invite-only working group to advance Zcash development and interoperability.
• May 2020: Joined the Chicago DeFi Alliance to help accelerate on-chain market making for professionals and developers.
• March 2020: Set a new record of USD $33.7M in 24H fully on-chain trading volume, and $190M in 30 day on-chain trading volume.
• March 2020: Integrated by Rarible, Bullionix, and Unstoppable Domains, with the KyberWidget deployed on IPFS, which allows anyone to swap tokens through Kyber without being blocked.
• February 2020: Popular Ethereum blockchain game Axie Infinity integrated Kyber to accept ERC20 payments for NFT game items.
• February 2020: Kyber’s protocol was integrated by Gelato Finance, Idle Finance, rTrees, Sablier, and 0x API for their liquidity needs.
• January 2020: Kyber Network was found to be the most used protocol in the whole decentralized finance (DeFi) space in 2019, according to a DeFi research report by Binance.
• December 2019: Switcheo integrated Kyber’s protocol for enhanced liquidity on their own DEX.
• December 2019: DeFi Wallet Eidoo integrated Kyber for seamless in-wallet token swaps.
• December 2019: Announced the development of the Katalyst Protocol Upgrade and new KNC token model.
• July 2019: Developed the Waterloo Bridge, a Decentralized Practical Cross-chain Bridge between EOS and Ethereum, successfully demonstrating a token swap between Ethereum and EOS.
• July 2019: Trust Wallet, the official Binance wallet, integrated Kyber as part of its decentralized token exchange service, allowing even more seamless in-wallet token swaps for thousands of users around the world.
• May 2019: HTC, the large consumer electronics company with more than 20 years of innovation, integrated Kyber into its Zion Vault Wallet on EXODUS 1, the first native web 3.0 blockchain phone, allowing users to easily swap between cryptocurrencies in a decentralized manner without leaving the wallet.
• January 2019: Introduced the Automated Price Reserve (APR), a capital efficient way for token teams and individuals to market make with low slippage.
• January 2019: The popular Enjin Wallet, a default blockchain DApp on the Samsung S10 and S20 mobile phones, integrated Kyber to enable in-wallet token swaps.
• October 2018: Kyber was a founding member of the WBTC (Wrapped Bitcoin) Initiative and DAO.
• October 2018: Developed the KyberWidget for ERC20 token swaps on any website, with CoinGecko being the first major project to use it on their popular site.

Full Article

submitted by CoinEx_Institution to kybernetwork [link] [comments]

NVidia – Know What You Own

How many people really understand what they’re buying, especially when it comes to highly specialized hardware companies? Most NVidia investors seem to be relying on a vague idea of how the company should thrive “in the future”, as their GPUs are ostensibly used for Artificial Intelligence, Cloud, holograms, etc. Having been shocked by how this company is represented in the media, I decided to lay out how this business works, doing my part to fight for reality. With what’s been going on in markets, I don’t like my chances but here goes:
Let’s start with…
How does NVDA make money?
NVDA is in the business of semiconductor design. As a simplified image in your head, you can imagine this as designing very detailed and elaborate posters. Their engineers create circuit patterns for printing onto semiconductor wafers. NVDA then pays a semiconductor foundry (the printer – generally TSMC) to create chips with those patterns on them.
Simply put, NVDA’s profits represent the difference between the price at which they can sell those chips, less the cost of printing, and less the cost of paying their engineers to design them.
Notably, after the foundry prints the chips, NVDA also has to pay (I say pay, but really it is more like “sell at a discount to”) their “add-in board” (AIB) partners to stick the chips onto printed circuit boards (what you might imagine as green things with a bunch of capacitors on them). That leads to the final form in which buyers experience the GPU.
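As a rough sketch of the unit economics just described, the model looks something like the toy function below. Every number in it is an illustrative placeholder of my own, not an actual NVDA figure:

    # Toy fabless unit-economics model. Every number here is an illustrative
    # placeholder, not an actual NVDA figure.
    def operating_profit(units, asp, foundry_cost, aib_cost, design_opex):
        revenue = units * asp                              # chips sold x selling price
        cost_of_sales = units * (foundry_cost + aib_cost)  # pay TSMC + the AIB partners
        return revenue - cost_of_sales - design_opex       # less engineering payroll

    # Hypothetical: 10M chips at a $250 ASP, $80 to print each, $30 to the AIB,
    # $1.2Bn of design payroll and opex.
    print(operating_profit(10e6, 250, 80, 30, 1.2e9))      # -> 200,000,000.0

Note how operating leverage falls out of this structure: once the design is paid for, a higher ASP drops almost straight through to profit. That matters later when we talk about margins.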
What is a GPU?
NVDA designs chips called GPUs (Graphical Processing Units). Initially, GPUs were used for the rapid processing and creation of images, but their use cases have expanded over time. You may be familiar with the CPU (Central Processing Unit). CPUs sit at the core of a computer system, doing most of the calculation, taking orders from the operating system (e.g. Windows, Linux), etc. AMD and Intel make CPUs. GPUs assist the CPU with certain tasks. You can think of the CPU as having a few giant very powerful engines. The GPU has a lot of small much less powerful engines. Sometimes you have to do a lot of really simple tasks that don’t require powerful engines to complete. Here, the act of engaging the powerful engines is a waste of time, as you end up spending most of your time revving them up and revving them down. In that scenario, it helps the CPU to hand that task over to the GPU in order to “accelerate” the completion of the task. The GPU only revs up a small engine for each task, and is able to rev up all the small engines simultaneously to knock out a large number of these simple tasks at the same time. Remember the GPU has lots of engines. The GPU also has an edge in interfacing a lot with memory but let’s not get too technical.
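A loose analogy in code, with NumPy vectorisation standing in for the "many small engines" (a real GPU does this in hardware, but the shape of the workload is the same):

    import numpy as np

    # Lots of tiny, independent tasks: brighten every pixel of an image.
    img = np.random.rand(256, 256, 3)

    # "CPU style": one big engine stepping through the tasks one at a time.
    def brighten_serial(img, factor=1.1):
        out = np.empty_like(img)
        for i in range(img.shape[0]):
            for j in range(img.shape[1]):
                out[i, j] = np.minimum(img[i, j] * factor, 1.0)
        return out

    # "GPU style": launch the same tiny task across all pixels at once.
    def brighten_parallel(img, factor=1.1):
        return np.minimum(img * factor, 1.0)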
Who uses NVDA’s GPUs?
There are two main broad end markets for NVDA’s GPUs – Gaming and Professional. Let’s dig into each one:
The Gaming Market:
A Bit of Ancient History (Skip if impatient)
GPUs were first heavily used for gaming in arcades. They then made their way to consoles, and finally PCs. NVDA started out in the PC phase of GPU gaming usage. They weren’t the first company in the space, but they made several good moves that ultimately led to a very strong market position. Firstly, they focused on selling into OEMs – guys like the equivalent of today’s Dell/HP/Lenovo – which allowed a small company to get access to a big market without having to create a lot of relationships. Secondly, they focused on the design aspect of the GPU, and relied on their Asian supply chain to print the chip, to package the chip and to install it on a printed circuit board – the Asian supply chain ended up being the best in semis. But the insight that really let NVDA dominate was noticing that some GPU manufacturers were focusing on keeping hardware-accelerated Transform and Lighting as a Professional GPU feature. As a start-up, with no professional GPU business to disrupt, NVidia decided their best ticket into the big leagues was blowing up the market by including this professional-grade feature in their gaming product. It worked – and this was a real masterstroke – the visual and performance improvements were extraordinary. 3DFX, the initial leader in PC gaming GPUs, was vanquished, and importantly it happened when funding markets shut down with the tech bubble bursting and after 3DFX made some large ill-advised acquisitions. Consequently 3DFX went from hero to zero, and NVDA bought them for a pittance out of bankruptcy, acquiring the best IP portfolio in the industry.
Some more Modern History
This is what NVDA’s pure gaming card revenue looks like over time – NVDA only really broke these out in 2005 (note by pure, this means ex-Tegra revenues):
📷 https://hyperinflation2020.tumblr.com/private/618394577731223552/tumblr_Ikb8g9Cu9sxh2ERno
So what is the history here? Well, back in the late 90s when GPUs were first invented, they were required to play any 3D game. As discussed in the early history above, NVDA landed a hit product to start with early and got a strong burst of growth: revenues of 160M in 1998 went to 1900M in 2002. But then NVDA ran into strong competition from ATI (later purchased and currently owned by AMD). While NVDA’s sales struggled to stay flat from 2002 to 2004, ATI’s doubled from 1Bn to 2Bn. NVDA’s next major win came in 2006, with the 8000 series. ATI was late with a competing product, and NVDA’s sales skyrocketed – as can be seen in the graph above. With ATI being acquired by AMD they were unfocused for some time, and NVDA was able to keep their lead for an extended period. Sales slowed in 2008/2009 but that was due to the GFC – people don’t buy expensive GPU hardware in recessions.
And then we got to 2010 and the tide changed. Growth in desktop PCs ended. Here is a chart from Statista:
📷https://hyperinflation2020.tumblr.com/private/618394674172919808/tumblr_OgCnNwTyqhMhAE9r9
This resulted in two negative secular trends for Nvidia. Firstly, with the decline in popularity of desktop PCs, growth in gaming GPUs faded as well (below is a chart from Jon Peddie). Note that NVDA sells discrete GPUs, aka DT (Desktop) Discrete. Integrated GPUs are mainly made by Intel (these sit on the motherboard or with the CPU).
📷 https://hyperinflation2020.tumblr.com/private/618394688079200256/tumblr_rTtKwOlHPIVUj8e7h
You can see from the chart above that discrete desktop GPU sales are fading faster than integrated GPU sales. This is the other secular trend hurting NVDA’s gaming business. Integrated GPUs are getting better and better, taking over a wider range of tasks that were previously the domain of the discrete GPU. Surprisingly, the most popular eSports game of recent times – Fortnite – only requires Intel HD 4000 graphics – an Integrated GPU from 2012!
So at this point you might go back to NVDA’s gaming sales, and ask the question: What happened in 2015? How is NVDA overcoming these secular trends?
The answer consists of a few parts. Firstly, AMD dropped the ball in 2015. As you can see in this chart, sourced from 3DCenter, AMD's market share was halved in 2015, due to a particularly poor product line-up:
📷 https://hyperinflation2020.tumblr.com/private/618394753459994624/tumblr_J7vRw9y0QxMlfm6Xd
Following this, NVDA came out with Pascal in 2016 – a very powerful offering in the mid to high end part of the GPU market. At the same time, AMD was focusing on rebuilding and had no compelling mid or high end offerings. AMD mainly focused on maintaining scale in the very low end. Following that came 2017 and 2018: AMD’s offering was still very poor at the time, but cryptomining drove demand for GPUs to new levels, and AMD’s GPUs were more compelling from a price-performance standpoint for crypto mining initially, perversely leading to AMD gaining share. NVDA quickly remedied that by improving their drivers to better mine crypto, regaining their relative positioning, and profiting in a big way from the crypto boom. Supply that was calibrated to meet gaming demand collided with cryptomining demand and Average Selling Prices of GPUs shot through the roof. Cryptominers bought top of the line GPUs aggressively.
A good way to see changes in crypto demand for GPUs is the mining profitability of Ethereum:
📷 https://hyperinflation2020.tumblr.com/private/618394769378443264/tumblr_cmBtR9gm8T2NI9jmQ
This leads us to where we are today. 2019 saw gaming revenues drop for NVDA. Where are they likely to head?
The secular trends of falling desktop sales along with falling discrete GPU sales have reasserted themselves, as per the Jon Peddie research above. Cryptomining profitability has collapsed.
AMD has come out with a new architecture, NAVI, and the 5700XT – the first Iteration, competes effectively with NVDA in the mid-high end space on a price/performance basis. This is the first real competition from AMD since 2014.
NVDA can see all these trends, and they tried to respond. Firstly, with volumes clearly declining, and likely with a glut of second-hand GPUs that can make their way to gamers over time from the crypto space, NVDA decided to pursue a price over volume strategy. They released their most expensive set of GPUs by far in the latest Turing series. They added a new feature, Ray Tracing, by leveraging the Tensor Cores they had created for Professional uses, hoping to use that as justification for higher prices (more on this in the section on Professional GPUs). Unfortunately for NVDA, gamers have responded quite poorly to Ray Tracing – it caused performance issues, had poor support, poor adoption, and the visual improvements in most cases are not particularly noticeable or relevant.
The last recession led to gaming revenues falling 30%, despite NVDA being in a very strong position at the time vis-à-vis AMD – this time around their position is quickly slipping and it appears that the recession is going to be bigger. Additionally, the shift away from discrete GPUs in gaming continues.
To make matters worse for NVDA, AMD won the slots in both the New Xbox and the New PlayStation, coming out later this year. The performance of just the AMD GPU in those consoles looks to be competitive with NVidia products that currently retail for more than the entire console is likely to cost. Consider that usually you have to pair that NVidia GPU with a bunch of other expensive hardware. The pricing and margin impact of this console cycle on NVDA is likely to be very substantially negative.
It would be prudent to assume a greater than 30% fall in gaming revenues from the very elevated 2019 levels, with likely secular decline to follow.
The Professional Market:
A Bit of Ancient History (again, skip if impatient)
As it turns out, graphical accelerators were first used in the Professional market, long before they were employed for Gaming purposes. The big leader in the space was a company called Silicon Graphics, who sold workstations with custom silicon optimised for graphical processing. Their sales were only $25Mn in 1985, but by 1997 they were doing 3.6Bn in revenue – truly exponential growth. Unfortunately for them, from that point on, discrete GPUs took over, and their highly engineered, customised workstations looked exorbitantly expensive in comparison. Sales sank to 500mn by 2006 and, with no profits in sight, they ended up filing for bankruptcy in 2009. Competition is harsh in the semiconductor industry.
Initially, the Professional market centred on visualisation and design, but it has changed over time. There were a lot of players and a lot of nuance, but I am going to focus on more recent times, as they are more relevant to NVidia.
Some More Modern History
NVDA’s Professional business started after its gaming business, but we don’t have revenue disclosures that show exactly when it became relevant. This is what we do have – going back to 2005:
📷 https://hyperinflation2020.tumblr.com/private/618394785029472256/tumblr_fEcYAzdstyh6tqIsI
In the beginning, Professional revenues were focused on the 3D visualisation end of the spectrum, with initial sales going into workstations that were edging out the customised builds made by Silicon Graphics. Fairly quickly, however, GPUs added more and more functionality and started to turn into general parallel data processors rather than being solely optimised towards graphical processing.
As this change took place, people in scientific computing noticed, and started using GPUs to accelerate scientific workloads that involve very parallel computation, such as matrix manipulation. This started at the workstation level, but by 2007 NVDA decided to make a new line-up of Tesla series cards specifically suited to scientific computing. The professional segment now has several points of focus:
  1. GPUs used in workstations for things such as CAD graphical processing (Quadro Line)
  2. GPUs used in workstations for computational workloads such as running engineering simulations (Quadro Line)
  3. GPUs used in workstations for machine learning applications (Quadro line, though gaming cards can be used for this as well)
  4. GPUs used by enterprise customers for high performance computing (such as modelling oil wells) (Tesla Line)
  5. GPUs used by enterprise customers for machine learning projects (Tesla Line)
  6. GPUs used by hyperscalers (mostly for machine learning projects) (Tesla Line)
In more recent times, given the expansion of the Tesla line, NVDA has broken up reporting into Professional Visualisation (Quadro Line) and Datacenter (Tesla Line). Here are the revenue splits since that reporting started:
📷 https://hyperinflation2020.tumblr.com/private/618394798232158208/tumblr_3AdufrCWUFwLgyQw2
📷 https://hyperinflation2020.tumblr.com/private/618394810632601600/tumblr_2jmajktuc0T78Juw7
It is worth stopping here and thinking about the huge increase in sales delivered by the Tesla line. The reason for this huge boom is the sudden increase in interest in numerical techniques for machine learning. Let’s go on a brief detour here to understand what machine learning is, because a lot of people want to hype it but not many want to tell you what it actually is. I have the misfortune of being very familiar with the industry, which prevented me from buying into the hype. Oops – sometimes it really sucks being educated.
What is Machine Learning?
At a very high level, machine learning is all about trying to get some sort of insight out of data. Most of the core techniques used in machine learning were developed a long time ago, in the 1950s and 1960s. The most common machine learning technique, which most people have heard of and may be vaguely familiar with, is called regression analysis. Regression analysis involves fitting a line through a bunch of datapoints. The most common type of regression analysis is called Ordinary Least Squares (OLS) regression, and that type of regression has a “closed form” solution, which means that there is a very simple calculation you can do to fit an OLS regression line to data.
As it happens, fitting a line through points is not only easy to do, it also tends to be the main machine learning technique that people want to use, because it is very intuitive. You can make good sense of what the data is telling you and can understand the machine learning model you are using. Obviously, regression analysis doesn’t require a GPU!
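For the curious, here is what that closed-form calculation actually looks like – the point being that it is one small linear solve, and nothing about it needs a GPU:

    import numpy as np

    # Closed-form OLS: beta = (X'X)^-1 X'y.
    rng = np.random.default_rng(0)
    x = rng.normal(size=100)
    y = 2.0 + 3.0 * x + rng.normal(scale=0.5, size=100)  # true line: y = 2 + 3x

    X = np.column_stack([np.ones_like(x), x])  # add an intercept column
    beta = np.linalg.solve(X.T @ X, X.T @ y)   # the "very simple calculation"
    print(beta)                                # ~[2.0, 3.0]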
However, there is another consideration in machine learning: if you want to use a regression model, you still need a human to select the data that you want to fit the line through. Also, sometimes the relationship doesn’t look like a line, but rather it might look like a curve. In this case, you need a human to “transform” the data before you fit a line through it in order to make the relationship linear.
So people had another idea here: what if instead of getting a person to select the right data to analyse, and the right model to apply, you could just get a computer to do that? Of course the problem with that is that computers are really stupid. They have no preconceived notion of what data to use or what relationship would make sense, so what they do is TRY EVERYTHING! And everything involves trying a hell of a lot of stuff. And trying a hell of a lot of stuff, most of which is useless garbage, involves a huge amount of computation. People tried this for a while through to the 1980s, decided it was useless, and dropped it… until recently.
What changed? Well we have more data now, and we have a lot more computing power, so we figured lets have another go at it. As it happens, the premier technique for trying a hell of a lot of stuff (99.999% of which is garbage you throw away) is called “Deep Learning”. Deep learning is SUPER computationally intensive, and that computation happens to involve a lot of matrix multiplication. And guess what just happens to have been doing a lot of matrix multiplication? GPUs!
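To make the GPU connection concrete: one layer of a deep network is essentially a single big matrix multiplication, and a deep model chains thousands of them. This is just a sketch of the arithmetic, not any particular framework's API:

    import numpy as np

    # One "dense" layer of a neural network: activations = relu(x @ W + b).
    # Chaining thousands of these matmuls is exactly the massively parallel
    # workload GPUs were already good at.
    batch, d_in, d_out = 256, 1024, 1024
    x = np.random.rand(batch, d_in).astype(np.float32)  # input batch
    W = np.random.rand(d_in, d_out).astype(np.float32)  # learned weights
    b = np.zeros(d_out, dtype=np.float32)               # learned biases

    h = np.maximum(x @ W + b, 0.0)  # matrix multiply, then ReLU
    print(h.shape)                  # (256, 1024)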
Here is a chart that, for obvious reasons, lines up extremely well with the boom in Tesla GPU sales:
📷 https://hyperinflation2020.tumblr.com/private/618394825774989312/tumblr_IZ3ayFDB0CsGdYVHW
Now we need to realise a few things here. Deep Learning is not some magic silver bullet. There are specific applications where it has proven very useful – primarily areas that have a very large number of very weak relationships between bits of data that sum up into strong relationships. An example of one of those is Google Translate. On the other hand, in most analytical tasks, it is most useful to have an intuitive understanding of the data and to fit a simple and sensible model to it that is explainable. Deep learning models are not explainable in an intuitive manner. This is not only because they are complicated, but also because their scattershot technique of trying everything leaves a huge amount of garbage inside the model that cancels itself out when calculating the answer, but it is hard to see how it cancels itself out when stepping through it.
Given the quantum of hype on Deep learning and the space in general, many companies are using “Deep Learning”, “Machine Learning” and “AI” as marketing. Not many companies are actually generating significant amounts of tangible value from Deep Learning.
Back to the Competitive Picture
For the Tesla Segment
So NVDA happened to be in the right place at the right time to benefit from the Deep Learning hype. They happened to have a product ready to go and were able to charge a pretty penny for their product. But what happens as we proceed from here?
Firstly, it looks like the hype from Deep Learning has crested, which is not great from a future demand perspective. Not only that, but we really went from people having no GPUs, to people having GPUs. The next phase is people upgrading their old GPUs. It is much harder to sell an upgrade than to make the first sale.
Not only that, but GPUs are not the ideal manifestation of silicon for Deep Learning. NVDA themselves effectively admitted that with their latest iteration in the Datacentre, called Ampere. High Performance Computing, which was the initial use case for Tesla GPUs, was historically all about double precision floating point calculations (FP64). High precision calculations are required for simulations in aerospace/oil & gas/automotive.
NVDA basically sacrificed HPC and shifted further towards Deep Learning with Ampere, announced last Thursday. The FP64 performance of the A100 (the latest Ampere chip) increased a fairly pedestrian 24% over the V100, from 7.8 to 9.7 TF. It is not a surprise that NVDA lost El Capitan to AMD, given this shift away from a focus on HPC. Instead, NVDA jacked up their Tensor Cores (i.e. not the GPU cores) and focused very heavily on FP16 computation (a lot less precise than FP64). As it turns out, FP16 is precise enough for Deep Learning, and NVDA recognises that. The future industry standard is likely to be BFloat16 – the format pioneered by Google, who lead in Deep Learning. Ampere now does 312 TF of BF16, which compares to the 420 TF of Google’s TPU V3 – Google’s machine-learning-specific processor. Not quite up to the 2018 board from Google, but getting better – if they cut out all of the CUDA cores and GPU functionality maybe they could get up to Google’s spec.
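A quick way to see the precision gap between these formats (NumPy has no native BF16 type, but BF16 keeps FP32's exponent range while carrying even fewer mantissa bits than FP16):

    import numpy as np

    # FP16 carries roughly 3 decimal digits of precision; FP64 roughly 15.
    # Good enough for noisy deep learning gradients, hopeless for a
    # long-running engineering simulation.
    print(np.float64(1.0) + np.float64(1e-4))  # 1.0001 -- increment preserved
    print(np.float16(1.0) + np.float16(1e-4))  # 1.0    -- increment rounded away
    print(np.finfo(np.float16).eps)            # ~0.000977, the spacing at 1.0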
And indeed this is the problem for NVDA: when you make a GPU it has a large number of different use cases, and you provide a single product that meets all of these different use cases. That is a very hard thing to do, and explains why it has been difficult for competitors to muscle into the GPU space. On the other hand, when you are making a device that does one thing, such as deep learning, it is a much simpler thing to do. Google managed to do it with no GPU experience and is still ahead of NVDA. It is likely that Intel will be able to enter this space successfully, as they have widely signalled with the Xe.
There is of course the other large negative driver for Deep Learning, and that is the recession we are now in. Demand for GPU instances on Amazon has collapsed across the board, as evidenced by the fall in pricing. The below graph shows one example: this data is for renting out a single Tesla V100 GPU on AWS, which is the typical thing to do in an early exploratory phase for a Deep Learning model:
📷 https://hyperinflation2020.tumblr.com/private/618396177958944768/tumblr_Q86inWdeCwgeakUvh
With Deep Learning not delivering near-term tangible results, it is the first thing being cut. On their most recent conference call, IBM noted weakness in their cognitive division (AI), and noted weaker sales of their Power servers, which is the line that houses Enterprise GPU servers at IBM. Facebook cancelled their AI residencies for this year, and Google pushed theirs out. Even if NVDA can put in a good quarter due to their new product rollout (Ampere), the future is rapidly becoming a very stormy place.
For the Quadro segment
The Quadro segment has been a cash cow for a long time, generating dependable sales and solid margins. AMD just decided to rock the boat a bit. Sensing NVDA’s focus on Deep Learning, AMD seems to be focusing on HPC – the Radeon Pro VII, announced recently with a price point of $1899, takes aim at NVDA’s most expensive Quadro, the GV100, priced at $8999. It does 6.5 TFLOPS of FP64 double precision, whereas the GV100 does 7.4 – talk about shaking up a quiet segment.
Pulling things together
Let’s go back to what NVidia fundamentally does – paying their engineers to design chips, getting TSMC to print those chips, and getting board partners in Taiwan to turn them into the final product.
We have seen how a confluence of several pieces of extremely good fortune lined up to increase NVidia’s sales and profits tremendously: first on the Gaming side, weak competition from AMD until 2014, coupled with a great product in form of Pascal in 2016, followed by a huge crypto driven boom in 2017 and 2018, and on the Professional side, a sudden and unexpected increase in interest in Deep Learning driving Tesla demand from 2017-2019 sky high.
It is worth noting what these transient factors have done to margins. When unexpected good things happen to a chip company, sales go up a lot, but there are few additional costs associated with those sales. Strong demand means that you can sell each chip for a higher price, but no additional design work is required, and you still pay the printer, TSMC, the same amount per chip. Consequently NVDA’s margins have gone up substantially: well above their 11.9% long-term average to hit a peak of 33.2%, and more recently 26.5%:
📷 https://hyperinflation2020.tumblr.com/private/618396192166100992/tumblr_RiWaD0RLscq4midoP
The question is, what would be a sensible margin going forward? Obviously a 33% operating margin would attract a wall of competition and get competed away, which is why margins like that can only be temporary. However, NVidia has shifted to having a greater proportion of its sales coming from non-OEM, and has a greater proportion of its sales coming from Professional rather than gaming. As such, maybe one can be generous and say NVDA can earn an 18–20% average operating margin over the next cycle. We can sense-check these margins using Intel. Intel has a long-term average EBIT margin of about 25%. Intel happens to actually print the chips as well, so they collect a bigger fraction of the final product that they sell. NVDA, since it only does the design aspect, can’t earn a higher EBIT margin than Intel on average over the long term.
Tesla sales have likely gone too far and will moderate from here – perhaps down to a still more than respectable $2Bn per year. Gaming resumes the long-term slide in discrete GPUs, which will likely be replaced by integrated GPUs to a greater and greater extent over time – but let’s be generous and say it maintains $3.5Bn per year for the add-in board. Let’s also assume we keep getting $750M or so of Nintendo Switch revenue (despite that product being past the peak of its cycle, with Nintendo themselves forecasting a sales decline), and that AMD struggles to make progress in Quadro despite undercutting NVDA on price by 75%, with continued revenues of $1.2Bn. Add on the other $1.2Bn of Automotive, OEM and IP (I am not even counting the fact that car sales have collapsed and Automotive is likely to be down big), and we end up with revenues of $8.65Bn. At an average operating margin of 20% through the cycle, that would be $1.75Bn of operating earnings power, and if I say that the recent Mellanox acquisition manages to earn enough to pay all the interest on NVDA’s debt, and I assume a tax rate of 15%, we would have around $1.5Bn in net income.
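Spelling that back-of-envelope out (the segment figures are the assumptions I just made above, not reported numbers; the small differences come from rounding):

    # All figures in $Bn; these are the assumed through-cycle revenues above.
    segments = {
        "Tesla / Datacenter": 2.0,
        "Gaming add-in board": 3.5,
        "Nintendo Switch (Tegra)": 0.75,
        "Quadro": 1.2,
        "Automotive / OEM / IP": 1.2,
    }
    revenue = sum(segments.values())     # 8.65
    op_income = revenue * 0.20           # ~1.73, "$1.75Bn" after rounding
    net_income = op_income * (1 - 0.15)  # ~1.47 -> "around $1.5Bn"
    print(revenue, round(op_income, 2), round(net_income, 2))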
This company currently has a market capitalisation of $209 Bn. It blows my mind that it trades on 139x what I consider to be fairly generous earnings – earnings that NVidia never even got close to seeing before the confluence of good luck hit them. But what really stuns me is the fact that investors are actually willing to extrapolate this chain of unlikely and positive events into the future.
Shockingly, Intel has a market cap of 245Bn, only 40Bn more than NVDA, but Intel’s sales and profits are 7x higher. And while Intel is facing competition from AMD, it is much more likely to hold onto those sales and profits than NVDA is. These are absolutely stunning valuation disparities.
If I didn’t see NVDA’s price, and I started from first principles and tried to calculate a prudent price for the company, I would have estimated a $1.5Bn normalised profit, maybe on a 20x multiple – giving them the benefit of the doubt despite heading into a huge recession, and considering the fact that there is not much debt and the company is very well run. That would give you a market cap of $30Bn, and a share price of $49. And it is currently $339. Wow. Obviously I’m short here!
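Backing out the numbers implied by that paragraph (the ~610M share count is inferred from the $30Bn / $49 figures in the text, not independently sourced):

    net_income = 1.5              # $Bn, the normalised earnings estimate above
    fair_cap = net_income * 20    # 20x multiple -> $30Bn fair market cap
    shares = fair_cap / 49        # ~0.61Bn shares implied by the $49 target
    current_price = 339
    print(fair_cap, round(shares, 2), round(current_price / 49, 1))  # ~6.9x gap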
submitted by HyperInflation2020 to stocks [link] [comments]

How to Read Cryptocurrency Volume - Market Analysis Bitcoin Crypto News - Indications to Look For
Trading with the Volume Profile (Beginner)
The Ultimate Guide to Trading Volume - YouTube
Crypto Terms 101: Basics (Volume, Market Cap, Supply) and more
Session Time Update! Plus Trade Setups Explained.

Cryptocurrency Trading Charts Explained: Simply put, crypto trading charts are data visualizations that represent a cryptocurrency’s value and profile, incorporating details such as historic and current price as well as past and current trading volume. “Volume predicts price.” In today’s article, we are going to discuss a technical indicator called volume, and because volume is not the most exciting indicator on its own, we will also look at On Balance Volume to give you a lead on the market. Volume is the easiest indicator we can calculate. Crypto trading volume is normally measured by the amount of activity within the past 24 hours. Investors should keep in mind that there is a trading volume on each individual crypto exchange, and also a general trading volume that aggregates data from all exchanges; some exchanges will naturally have more volume, making it easier to trade. Trading volume can give you some clues as to where a cryptocurrency is going to go next. If you are familiar with how trading volume works in stocks, then you will understand how it works in cryptocurrencies too. It is not a failsafe signal, but it can be a tremendous help, and if you are new to trading, this will help you understand why it is important. Volume causes volatility: an asset with very low volume can be compared to the encephalogram of a dead man – no trading activity means a flat line on the chart. In other words, the asset is dead, which is a trader’s nightmare. Volume is the source of life in trading; it produces volatility and therefore profitable opportunities.


How to Read Cryptocurrency Volume - Market Analysis Bitcoin Crypto News - Indications to Look For

Today on this weekly wrap-up, we take a closer look at three things in crypto that matter to your portfolio: 1. long-term volume trends (macro), 2. short-term volume trends (micro), 3. news items you ... In this episode I explain some basic cryptocurrency terms (volume, market cap, and supply) everyone should know, as well as how to apply these terms when looking at new cryptocurrencies. We started out by breaking down the trade history in GDAX, and we showed how the trade history can be sliced or partitioned into groups. We saw that each group is used to generate a candlestick ... #cryptotamil #cryptovolume #cryptocurrencyvolume #bitcoin In this video I have explained the importance of the volume of a coin; using volume in the right way can make you good profits in the long ... Explained: The On Balance Volume (OBV) trading indicator is a cumulative value of volume, considered on balance based on the flow of volume into or out of the security.
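A minimal sketch of that OBV calculation, using the standard textbook definition (add the period's volume on an up close, subtract it on a down close, leave it flat otherwise):

    # On Balance Volume: cumulative volume, signed by the direction of the close.
    def on_balance_volume(closes, volumes):
        obv, series = 0, []
        for i, (close, vol) in enumerate(zip(closes, volumes)):
            if i > 0:
                if close > closes[i - 1]:
                    obv += vol       # volume flowed in on an up close
                elif close < closes[i - 1]:
                    obv -= vol       # volume flowed out on a down close
            series.append(obv)       # an unchanged close leaves OBV flat
        return series

    print(on_balance_volume([10, 11, 11, 9], [100, 150, 80, 200]))
    # -> [0, 150, 150, -50]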
