Why Bitcoin is Superior to Gold

There is a constant war being fought between goldbugs, like Peter Schiff, and Bitcoin enthusiasts, so I decided to make an outline, with links, comparing and contrasting gold and Bitcoin. I made this in November of 2019 (thus the information therein is based on figures from that time) but, being scatter-brained, neglected to post it for the Bitcoin community to see. The yardsticks I used to compare the two assets included the following: shipping/transaction costs, storage costs, censorship factor, settlement time, stock to flow, blockchain vs clearing house, validation, etc. I will also touch on Roosevelt's gold confiscation executive order in 1933, the transport of gold during the Spanish Civil War in 1936, and the hypothetical cost for Venezuela to repatriate its gold more recently.
I will provide a brief summary first, then follow that with the outline I made. This information can be used as a tool for the Bitcoin community to combat some of the silly rhetoric coming from goldbugs such as Peter Schiff and James Rickards. I would like to make it clear, however, that I am not against gold; I think it performed its role as money very well in a technologically inferior era, namely Victorian times, but I think Bitcoin performs the functions of money better than gold does in the current environment.
I have been looking to make a contribution to the Bitcoin community and I hope this is a useful and educational tool for everyone who reads this.
Summary:
Shipping/transaction costs: 100 ounces of gold could be shipped for $315; the comparable dollar value in Bitcoin could be sent for $35 using a non-SegWit address. Using historical precedent, it would cost an estimated $32,997,989 per $1 billion to transport gold at the 3.3% fee that the Soviets charged the Spaniards in 1936; by comparison, a $1 billion Bitcoin transaction moved for $690 last year. Please note that the only historical example we have of moving enormous sums of gold is the Spanish government's transfer of its gold to Moscow during the Spanish Civil War in 1936. More information on this topic can be found in the notes section.
Storage costs: 100 ounces of gold would cost $451 per year to custody, while the equivalent dollar value of Bitcoin could be stored for the one-time cost of a Ledger Nano S, $59.99. Storing $1 billion USD worth of gold would cost $2,900,000 per year, while a more secure Armory setup would run you the cost of a laptop, $200-300.
Censorship factor: Gold must pass through a third party whenever it is shipped, whether for a transaction or for personal transportation. Gold will typically have to be declared, and a customs duty may be imposed, when crossing international borders. The key takeaway is that gatekeepers (customs) can halt the movement of gold, making transactions difficult. $46,000 of gold was seized in India despite the smugglers hiding it in their rectums.
Settlement time: Shipping 100 ounces of gold takes anywhere from 3-10 days, while Bitcoin transactions clear in roughly 10 minutes, depending on network congestion and fee size.
Historic confiscation: Franklin Roosevelt confiscated gold and debased its paper value in 1933 with Executive Order 6102. Since gold is physical and value-dense, it is often stored in custodial vaults (banks and the like), which act as honeypots for rapacious governments.
Stock to flow: Plan B's stock-to-flow model has become a favorite on Twitter. Stock-to-flow measures the total existing stock of an asset against the amount produced in a given year. Gold currently has the highest ratio at 62, while Bitcoin sits in second place at 50. Bitcoin will overtake gold in 2024, after the next halving.
Blockchain vs clearing house: Gold payments historically passed through a third party (a clearinghouse) in order to be validated, while Bitcoin transactions can be self-validated by running a node.
Key takeaway from the above: Bitcoin is vastly superior to gold in terms of cost, speed, and censorship resistance. One could theoretically carry an enormous sum of Bitcoin on a cold card, while the equivalent dollar value of gold would require a wheelbarrow... and paint an enormous target on the back of the transporter. With the exception of the stock-to-flow ratio (which will flip in Bitcoin's favor soon), Bitcoin is superior to gold by every metric covered.
Notes:
Shipping/transaction costs
Gold
100 oz ≈ $155,500. $45 × 7 = $315 to ship 100 oz of gold.
https://seekingalpha.com/instablog/839735-katchum/2547831-how-much-does-it-cost-to-ship-silver-and-gold
https://www.coininvest.com/en/shipping-prices/
211 tonnes (Venezuela); 3.3% of $10.5 billion = $346,478,880, or $32,997,989 per $1 billion USD
http://blogs.reuters.com/felix-salmon/2011/08/23/how-to-get-12-billion-of-gold-to-venezuela/ (counter party risk; maduro; quotes from article)
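Sanity-checking those fee numbers (a quick sketch; it assumes the 1936 Soviet fee of 2.1% commission plus 1.2% transport/handling scales linearly with shipment value):

```python
# Back-of-the-envelope check on the gold transport fees.
# Assumption: the 1936 Soviet fee (2.1% commission + 1.2% transport,
# deposit, melting, and refining) scales linearly with shipment value.
fee_rate = 0.021 + 0.012                  # = 3.3%
venezuela_gold_usd = 10.5e9               # ~211 tonnes at then-current prices

print(f"Total fee: ${fee_rate * venezuela_gold_usd:,.0f}")  # ~$346,500,000
print(f"Per $1B:   ${fee_rate * 1e9:,.0f}")                 # ~$33,000,000
```

The small gap between this and the quoted $346,478,880 just reflects a slightly different gold valuation.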
Bitcoin
18 BTC equivalent value; $35 with a legacy address
https://blockexplorer.com/
https://bitcoinfees.info/
$1 billion moved for a $690 fee
https://arstechnica.com/tech-policy/2019/09/someone-moved-1-billion-in-a-single-bitcoin-transaction/
Storage costs
Gold
0.29% annually; https://sdbullion.com/gold-silver-storage
100 oz – $451/year
$1 billion USD value – $2,900,000/year
Bitcoin
Ledger Nano S - $59.00 (for smaller bitcoin holdings)
https://shop.ledger.com/products/ledger-nano-s/transparent?flow_country=USA&gclid=EAIaIQobChMI3ILV5O-Z5wIVTtbACh1zTAwqEAQYASABEgJ5SPD_BwE
Armory - $200-300 (cost of a laptop for the setup)
https://www.bitcoinarmory.com/
Censorship factor (must pass through 3rd party)
Varies by country
Gold will typically have to be declared and a customs duty may be imposed
Key take-away is gatekeepers (customs) can halt movement of gold thus making transactions difficult
$46,000 seized in India
https://www.foxnews.com/travel/indian-airport-stops-29-passengers-smuggling-gold-in-their-rectums
Settlement time
Gold
For a 100 oz transaction via USPS: 3-10 days (must pass through a 3rd party)
Bitcoin
Roughly 10 minutes to be included in next block
Historic confiscation - Roosevelt, 1933
Executive Order 6102 (forced spending, fed could ban cash, go through and get quotes)
https://en.wikipedia.org/wiki/Executive_Order_6102
“The stated reason for the order was that hard times had caused 'hoarding' of gold, stalling economic growth and making the depression worse.”
Stock to flow; https://medium.com/@100trillionUSD/modeling-bitcoins-value-with-scarcity-91fa0fc03e25 (explain what it is and use charts in article)
Gold; SF of 62
Bitcoin; SF of 25 but will double to 50 after May (and to 100 in four years)
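To make the ratio concrete, stock-to-flow is simply existing stock divided by annual production. A minimal sketch with ballpark late-2019 figures (the supply numbers are approximations, not exact data):

```python
# Stock-to-flow = existing stock / annual production (flow).
# All inputs are rough late-2019 ballparks.
gold_sf = 190_000 / 3_000        # ~190k tonnes ever mined / ~3k tonnes mined per year

btc_supply = 18_000_000          # circulating BTC, late 2019
blocks_per_year = 144 * 365      # ~144 blocks per day

print(f"Gold S2F: {gold_sf:.0f}")                                                # ~63
print(f"BTC S2F (12.5 BTC/block): {btc_supply / (12.5 * blocks_per_year):.0f}")  # ~27
print(f"BTC S2F (6.25 BTC/block): {btc_supply / (6.25 * blocks_per_year):.0f}")  # ~55
```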
Blockchain vs clearing house
Transactions can be validated by running a full node vs. third party settlement
Validation
Gold; https://www.goldismoney2.com/threads/cost-to-assay.6732/
(Read some responses)
Bitcoin
Cost of electricity to run a full node
Breaking down the Venezuela conundrum; http://blogs.reuters.com/felix-salmon/2011/08/23/how-to-get-12-billion-of-gold-to-venezuela/
“The last (and only) known case of this kind of quantity of gold being transported across state lines took place almost exactly 75 years ago, in 1936, when the government of Spain removed 560 tons of gold from Madrid to Moscow as the armies of Francisco Franco approached. Most of the gold was exchanged for Russian weaponry, with the Soviet Union keeping 2.1% of the funds in the form of commissions and brokerage, and an additional 1.2% in the form of transport, deposit, melting, and refining expenses.”
“Venezuela would need to transport the gold in several trips, traders said, since the high value of gold means it would be impossible to insure a single aircraft carrying 211 tonnes. It could take about 40 shipments to move the gold back to Caracas, traders estimated. “It’s going to be quite a task. Logistically, I’m not sure if the central bank realises the magnitude of the task ahead of them,” said one senior gold banker.”
“So maybe Chávez intends to take matters into his own hands, and just sail the booty back to Venezuela on one of his own naval ships. Again, the theft risk is obvious — seamen can be greedy too — and this time there would be no insurance. Chávez is pretty crazy, but I don’t think he’d risk $12 billion that way.”
“Which leaves one final alternative. Gold is fungible, and people are actually willing to pay a premium to buy gold which is sitting in the Bank of England’s ultra-secure vaults. So why bother transporting that gold at all? Venezuela could enter into an intercontinental repo transaction, where it sells its gold in the Bank of England to some counterparty, and then promises to buy it all back at a modest discount, on condition that it’s physically delivered to the Venezuelan central bank in Caracas. It would then be up to the counterparty to work out how to get 211 tons of gold to Caracas by a certain date. That gold could be sourced anywhere in the world, and transported in any conceivable manner — being much less predictable and transparent, those shipments would also be much harder to hijack. How much of a discount would a counterparty require to enter into this kind of transaction? Much more than 3.3%, is my guess. And again, it’s not entirely clear who would even be willing to entertain the idea. Glencore, perhaps?”
“But here’s one last idea: why doesn’t Chávez crowdsource the problem? He could simply open a gold window at the Banco Central de Venezuela, where anybody at all could deliver standard gold bars. In return, the central bank would transfer to that person an equal number of gold bars in the custody of the Bank of England, plus a modest bounty of say 2% — that’s over $15,000 per 400-ounce bar, at current rates. It would take a little while, but eventually the gold would start trickling in: if you’re willing to pay a constant premium of 2% over the market price for a good, you can be sure that the good in question will ultimately find its way to your door. And the 2% cost of acquiring all that gold would surely be much lower than the cost of insuring and shipping it from England. It would be an elegant market-based solution to an artificial and ideologically-driven problem; I daresay Chávez might even chuckle at the irony of it. He’d just need to watch out for a rise in Andean banditry, as thieves tried to steal the bars on their disparate journeys into Venezuela.”
submitted by cornish_roots to Bitcoin [link] [comments]

Traditional Mining vs Green Staking: How UMI Cares for the Planet

https://preview.redd.it/fcymiab2fed51.jpg?width=1024&format=pjpg&auto=webp&s=a32e38290d6f8048ba7cc982bc2963369642eb7a
Cryptocurrencies are poised to make a major contribution to the transformation of the existing financial system. They can dramatically change the world and be of great benefit to humankind. But the pursuit of those benefits mustn't harm the environment.
We've taken up this theme for a reason. It is indeed possible to do harm. In fact, harm is already being done. Do you want to know how? Through traditional mining, which is necessary to maintain the Bitcoin network and thousands of other Proof-of-Work-based cryptocurrencies.
Negative impact of traditional mining
In order to maintain the Bitcoin network or other PoW-based cryptocurrencies, miners have to solve complex computational math problems — by doing so they verify the authenticity of transactions and add valid ones to the blockchain. This process is dubbed mining and requires extensive computing resources.
Solving the puzzle earns a reward, and this is how new bitcoins are generated; the need to compete for that reward pushes people to use more and more powerful equipment. With the cryptocurrency boom, harmless mining on home computers turned into an endless race among miners. Today miners don't just buy high-performance computers: some build farms of energy-consuming ASIC devices, while others use huge plants to mine bitcoins.

A mining farm consisting of thousands of ASIC devices. Source.
As you know, intensive computing requires elevated power consumption and leads to air pollution and a waste of natural resources. This poses a serious problem: nowadays most electric power stations are thermal power plants (TPPs) that burn fossil fuels, such as coal or natural gas, to produce electricity.
This process causes CO2 (carbon dioxide) emissions which adversely affect the biosphere — mining contributes to the greenhouse effect, which heats the planet up, causing global warming with its associated impacts on the environment and potential threats to life on the planet. What is more, every minute we are breathing the same polluted air, putting ourselves at risk of a host of diseases and complications. All these factors shorten life expectancy for us and our children; air pollution causes a great many premature deaths.
The more carbon dioxide gets into the environment, the more harm it does. Carbon dioxide is a harmful by-product of industrial activity. The biting irony is that we use natural resources to generate these emissions, and these resources have limits too. Traditional mining significantly exacerbates the global problem and the situation has been deteriorating in recent years.
The effects of carbon footprint are already being felt
There are, undoubtedly, a lot of other factors that cause global environmental degradation, but the impact of mining should never be ignored. Bitcoin mining is estimated to produce as much carbon dioxide as that produced by industries of Estonia, Switzerland, the Czech Republic, Jordan, or Sri Lanka.
The entire bitcoin network is responsible for 22-22.9 million tons of CO2 per year — just think and try to imagine how much it is. Chinese miners represent about half (47%) of emissions. In China energy is cheap as it's produced by coal-fired thermal power plants. Once we add emissions produced by mining other cryptos, the numbers will double!

Powerful mining equipment. Source.
Two years ago, a Nature Research journal published an article regarding Bitcoin emissions. It said: "We cannot predict the future of Bitcoin, but projected Bitcoin usage, should it follow the rate of adoption of other broadly adopted technologies, could alone produce enough CO2 emissions to push warming above 2 °C within less than three decades." Two years later, we can see the researchers' concerns were well founded — digital gold keeps being mined with the same enthusiasm, and the planet keeps being polluted. "It [Bitcoin] alone could produce enough emissions to raise global temperatures as soon as 2033," warns a group of researchers.
As an alternative, miners are encouraged to use renewable energy (wind, solar, etc.), which could make bitcoin mining more environmentally friendly. Unfortunately, renewable sources account for just a small share of global energy, which makes wide adoption impossible for now. Moreover, in the pursuit of profit, miners don't seem particularly eager to abandon the profitable equipment that cost them a fortune.
Nonetheless, the fact that modern cryptocurrencies reject environment-damaging mining gives us hope that the situation will soon improve. UMI is one of these cryptocurrencies.
UMI is a green cryptocurrency based on a smart contract
Not all cryptocurrencies use computing power to generate new coins. For example, there are cryptocurrencies based on Proof-of-Stake (PoS) and Proof-of-Authority (PoA) technology. UMI is one of them.
As a substitute for mining, and to incentivize users, UMI uses a Staking Smart Contract which allows generating new coins with no energy expenses and no powerful equipment: no waste of natural resources. Staking technology is perfectly safe for the planet. This is the latest stage in the technological development of the crypto industry.

https://preview.redd.it/wpgh5cmoged51.jpg?width=1024&format=pjpg&auto=webp&s=761dd09821e16924dfeeb7db8e65b6a66e50c5d5
UMI can definitely be called an environmentally friendly cryptocurrency, as it has no negative impact on the environment. Today this is of the greatest importance for all of us. UMI staking neither endangers human health nor harms the environment. In other words, we are protecting the planet and all the people who inhabit it. This is something we can be really proud of, because the environment influences our health, and good health is the most important thing in life.
As a final note, we would like to say that, adhering closely to its ideology, the UMI team collaborates only with environmentally conscious partners who are concerned with the protection of the natural world. This was the main reason for choosing the ROY Club as our partner. We are certain this will be a productive cooperation which will make the world a better place.
Join in and invite all your friends — together we can create new UMI coins using eco-friendly staking and care for our planet!
Best regards, UMI Team!
submitted by UMITop to u/UMITop [link] [comments]

Mining and Dogecoin - Some FAQs

Hey shibes,
I see a lot of posts about mining lately and questions about the core wallet and how to mine with it, so here are some facts!
Feel free to add information to this thread or correct me if I made any mistakes.

You downloaded the core wallet

Great! After a decade it probably synced and now you are wondering how to get coins? Bad news: You don't get coins by running your wallet, even running it as a full node. Check what a full node is here.
Maybe you thought so, because you saw a very old screenshot of a wallet, like this (Version 1.2). This version had a "Dig" tab where you can enter your mining configuration. The current version doesn't have this anymore, probably because it doesn't make sense anymore.

You downloaded a GPU/CPU miner

Nice! You did it, even though your antivirus probably went postal and you started covering all your webcams... But here is the bad news again: since people are using ASIC miners, you just can't compete with CPU hardware anymore. Even with a more advanced GPU you will have a hard time; the network hashrate is too high for a desktop PC to compete. Blocks are meant to be mined every minute (or so), which drives the difficulty up, and we are out... So definitely check your hashrate while you are mining: you would need about 1.5 MH/s to make 1 Doge in 24 hours!

Mining Doge

Let us start with a quote:
"Dogecoin Core 1.8 introduces AuxPoW from block 371,337. AuxPoW is a technology which enables miners to submit work done while mining other coins, as work on the Dogecoin block chain."
- langerhans
What does this mean? You could spend your hashrate only on the Dogecoin chain, probably never find a block, and even when you do, you only receive about 10,000 Dogecoins, currently worth about $25. Or you could apply your hashrate to LTC and Doge (and possibly even more chains) at the same time. Your chance of solving a block (finding the nonce) is your hashrate divided by the total network hashrate, and this is about the same for Doge and LTC. This means you will always want to submit your work to all chains available!

Mining solo versus pool

So let's face it: mining solo won't get you anywhere, so let's mine on a pool! If you have a really low hashrate, please consider this: often you need about $1 or $2 worth of crypto to receive a payout (without fees), which means you have to get there first. With 100 MH/s on Prohashing, it takes about 6 days of running 24/7 to reach that threshold. Now you can do the math... 1 MH/s = 1,000 kH/s; if you are below 1 MH/s, you probably won't have fun.

Buying an ASIC

You found an old BTC USB miner with 24 GH/s (1 GH/s = 1,000 MH/s) for $80 bucks, next stop lambo!? Sorry, bad news again: this hashrate is for SHA-256! If you want to mine LTC/Doge you will need a miner that uses scrypt, which comes with much lower hashrate numbers, so don't fall for that. A big miner (= also loud) usually gives you more hashrate per dollar spent, but most will still run at an operational loss, because the electricity is too expensive and the miners will soon be outdated again. Leading me to my next point...

Making profit

You won't make money running your miner. Just do the math: what if you had bought a miner 1 year ago? Subtract the costs for electricity and then compare it to: what if you had just bought coins? In most cases you would have made a greater profit by simply buying coins, maybe even with a "stable" coin like Doge.

Cloud Mining

Okay, this was a lot of text and you are still on the hook? Maybe you are desperate enough to invest in some cloud mining contract... But this isn't a good idea either, because most such contracts are scams based on a Ponzi scheme. You can often spot them easily, because they guarantee way too high profits, fake payouts that never happened, etc.
Just a thought: if someone in a subway said to you, "Give me $1 and let's meet in one year, right here, and I'll give you $54,211,841," you wouldn't trust him; and if some mining contract says they will give you 5% a day, it is basically the same.
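That subway number isn't random, by the way; it's exactly what "only 5% a day" compounds to over a year:

```python
# $1 compounded at 5% per day for 365 days:
print(f"${1.00 * 1.05 ** 365:,.0f}")  # ~$54,211,841
```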
Also remember the merged mining part: nobody would offer you to mine only Doge; they would offer you scrypt hashrate that applies to multiple chains.

Alternative coins

Maybe try to mine a coin that doesn't have ASICs yet, like Monero, and exchange the proceeds for Doge. If somebody has already tried this, feel free to add your thoughts!

Folding at Home (Doge)

Some people say Folding at Home (FAH - https://www.dogecoinfah.com/) is still the best option. I just installed the tool and it says I would make 69,852 points a day running on medium power, which equates to 8 Doge. It is easy, it was fun, but it isn't much.
Thanks for reading
_nformant
submitted by _nformant to dogecoin [link] [comments]

Bitcoin (BTC): A Peer-to-Peer Electronic Cash System.

  • Bitcoin (BTC) is a peer-to-peer cryptocurrency that aims to function as a means of exchange that is independent of any central authority. BTC can be transferred electronically in a secure, verifiable, and immutable way.
  • Launched in 2009, BTC is the first virtual currency to solve the double-spending issue by timestamping transactions before broadcasting them to all of the nodes in the Bitcoin network. The Bitcoin Protocol offered a solution to the Byzantine Generals’ Problem with a blockchain network structure, a notion first created by Stuart Haber and W. Scott Stornetta in 1991.
  • Bitcoin’s whitepaper was published pseudonymously in 2008 by an individual, or a group, with the pseudonym “Satoshi Nakamoto”, whose underlying identity has still not been verified.
  • The Bitcoin protocol uses an SHA-256d-based Proof-of-Work (PoW) algorithm to reach network consensus. Its network has a target block time of 10 minutes and a maximum supply of 21 million tokens, with a decaying token emission rate. To prevent fluctuation of the block time, the network’s block difficulty is re-adjusted through an algorithm based on the past 2016 block times.
  • With a block size limit capped at 1 megabyte, the Bitcoin Protocol has supported both the Lightning Network, a second-layer infrastructure for payment channels, and Segregated Witness, a soft-fork to increase the number of transactions on a block, as solutions to network scalability.

https://preview.redd.it/s2gmpmeze3151.png?width=256&format=png&auto=webp&s=9759910dd3c4a15b83f55b827d1899fb2fdd3de1

1. What is Bitcoin (BTC)?

  • Bitcoin is a peer-to-peer cryptocurrency that aims to function as a means of exchange and is independent of any central authority. Bitcoins are transferred electronically in a secure, verifiable, and immutable way.
  • Network validators, who are often referred to as miners, participate in the SHA-256d-based Proof-of-Work consensus mechanism to determine the next global state of the blockchain.
  • The Bitcoin protocol has a target block time of 10 minutes, and a maximum supply of 21 million tokens. The only way new bitcoins can be produced is when a block producer generates a new valid block.
  • The protocol has a token emission rate that halves every 210,000 blocks, or approximately every 4 years.
  • Unlike public blockchain infrastructures supporting the development of decentralized applications (e.g., Ethereum), the Bitcoin protocol is primarily used for payments, and has only very limited support for smart-contract-like functionality (Bitcoin “Script” is mostly used to set conditions that must be met before bitcoins can be spent).

2. Bitcoin’s core features

For a more beginner’s introduction to Bitcoin, please visit Binance Academy’s guide to Bitcoin.

Unspent Transaction Output (UTXO) model

A UTXO transaction works like cash payment between two parties: Alice gives money to Bob and receives change (i.e., unspent amount). In comparison, blockchains like Ethereum rely on the account model.
https://preview.redd.it/t1j6anf8f3151.png?width=1601&format=png&auto=webp&s=33bd141d8f2136a6f32739c8cdc7aae2e04cbc47
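A toy sketch of this bookkeeping may help: an input is consumed in full, and whatever is not paid out (minus the fee) returns to the sender as a change output. Real transactions reference prior outputs by transaction ID and index and carry signatures; this is purely illustrative:

```python
# Toy UTXO ledger: (txid, output index) -> (owner, amount).
# Ownership is normally proven with a signature; that check is omitted here.
utxo_set = {("tx_a", 0): ("alice", 1.0)}

def spend(outpoint, payments, change_owner, fee, new_txid):
    owner, amount = utxo_set.pop(outpoint)       # the input is consumed whole
    change = amount - sum(payments.values()) - fee
    outputs = list(payments.items()) + [(change_owner, change)]
    for index, (recipient, value) in enumerate(outputs):
        utxo_set[(new_txid, index)] = (recipient, value)

spend(("tx_a", 0), {"bob": 0.3}, change_owner="alice", fee=0.0001, new_txid="tx_b")
print(utxo_set)   # bob holds 0.3; alice gets 0.6999 back as change
```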

Nakamoto consensus

In the Bitcoin network, anyone can join the network and become a bookkeeping service provider i.e., a validator. All validators are allowed in the race to become the block producer for the next block, yet only the first to complete a computationally heavy task will win. This feature is called Proof of Work (PoW).
The probability of any single validator to finish the task first is equal to the percentage of the total network computation power, or hash power, the validator has. For instance, a validator with 5% of the total network computation power will have a 5% chance of completing the task first, and therefore becoming the next block producer.
Since anyone can join the race, competition is prone to increase. In the early days, Bitcoin mining was mostly done by personal computer CPUs.
As of today, Bitcoin validators, or miners, have opted for dedicated and more powerful devices such as machines based on Application-Specific Integrated Circuit (“ASIC”).
Proof of Work secures the network as block producers must have spent resources external to the network (i.e., money to pay electricity), and can provide proof to other participants that they did so.
With various miners competing for block rewards, it becomes difficult for one single malicious party to gain network majority (defined as more than 51% of the network’s hash power in the Nakamoto consensus mechanism). The ability to rearrange transactions via 51% attacks indicates another feature of the Nakamoto consensus: the finality of transactions is only probabilistic.
Once a block is produced, it is then propagated by the block producer to all other validators to check on the validity of all transactions in that block. The block producer will receive rewards in the network’s native currency (i.e., bitcoin) as all validators approve the block and update their ledgers.

The blockchain

Block production

The Bitcoin protocol utilizes the Merkle tree data structure in order to organize hashes of numerous individual transactions into each block. This concept is named after Ralph Merkle, who patented it in 1979.
With the use of a Merkle tree, though each block might contain thousands of transactions, it will have the ability to combine all of their hashes and condense them into one, allowing efficient and secure verification of this group of transactions. This single hash is called the Merkle root, which is stored in the Block Header of a block. The Block Header also stores other meta information of a block, such as a hash of the previous Block Header, which enables blocks to be associated in a chain-like structure (hence the name “blockchain”).
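A minimal sketch of the computation, using Bitcoin's double SHA-256 and its rule of duplicating the last hash when a level has an odd count (byte-order details are omitted):

```python
import hashlib

def dsha256(data: bytes) -> bytes:
    """Bitcoin's double SHA-256."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root(tx_hashes) -> bytes:
    level = list(tx_hashes)
    while len(level) > 1:
        if len(level) % 2 == 1:
            level.append(level[-1])              # duplicate the odd hash out
        level = [dsha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

txs = [dsha256(f"tx{i}".encode()) for i in range(5)]
print(merkle_root(txs).hex())
```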
An illustration of block production in the Bitcoin Protocol is demonstrated below.

https://preview.redd.it/m6texxicf3151.png?width=1591&format=png&auto=webp&s=f4253304912ed8370948b9c524e08fef28f1c78d

Block time and mining difficulty

Block time is the period required to create the next block in a network. As mentioned above, the node who solves the computationally intensive task will be allowed to produce the next block. Therefore, block time is directly correlated to the amount of time it takes for a node to find a solution to the task. The Bitcoin protocol sets a target block time of 10 minutes, and attempts to achieve this by introducing a variable named mining difficulty.
Mining difficulty refers to how difficult it is for the node to solve the computationally intensive task. If the network sets a high difficulty for the task, while miners have low computational power, which is often referred to as “hashrate”, it would statistically take longer for the nodes to get an answer for the task. If the difficulty is low, but miners have rather strong computational power, statistically, some nodes will be able to solve the task quickly.
Therefore, the 10 minute target block time is achieved by constantly and automatically adjusting the mining difficulty according to how much computational power there is amongst the nodes. The average block time of the network is evaluated after a certain number of blocks, and if it is greater than the expected block time, the difficulty level will decrease; if it is less than the expected block time, the difficulty level will increase.
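In Bitcoin, this retarget happens every 2,016 blocks, scaling difficulty by the ratio of expected to actual elapsed time. A simplified sketch follows; the real implementation operates on the compact target encoding and, as mirrored below, clamps each adjustment to a factor of 4:

```python
TARGET_BLOCK_TIME = 10 * 60      # seconds
RETARGET_INTERVAL = 2016         # blocks between adjustments

def retarget(old_difficulty: float, actual_timespan_s: float) -> float:
    expected = TARGET_BLOCK_TIME * RETARGET_INTERVAL
    ratio = expected / actual_timespan_s
    ratio = max(0.25, min(4.0, ratio))   # Bitcoin caps each adjustment at 4x
    return old_difficulty * ratio

# Blocks averaged 9 minutes instead of 10, so difficulty rises ~11%:
print(retarget(1.0, actual_timespan_s=9 * 60 * RETARGET_INTERVAL))  # ~1.11
```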

What are orphan blocks?

In a PoW blockchain network, if the block time is too low, it would increase the likelihood of nodes producing orphan blocks, for which they would receive no reward. Orphan blocks are produced by nodes that solved the task but did not broadcast their results to the whole network quickly enough due to network latency.
It takes time for a message to travel through a network, and it is entirely possible for 2 nodes to complete the task and start to broadcast their results to the network at roughly the same time, while one’s messages are received by all other nodes earlier as the node has low latency.
Imagine there is a network latency of 1 minute and a target block time of 2 minutes. A node could solve the task in around 1 minute but his message would take 1 minute to reach the rest of the nodes that are still working on the solution. While his message travels through the network, all the work done by all other nodes during that 1 minute, even if these nodes also complete the task, would go to waste. In this case, 50% of the computational power contributed to the network is wasted.
The percentage of wasted computational power would proportionally decrease if the mining difficulty were higher, as it would statistically take longer for miners to complete the task. In other words, if the mining difficulty, and therefore targeted block time is low, miners with powerful and often centralized mining facilities would get a higher chance of becoming the block producer, while the participation of weaker miners would become in vain. This introduces possible centralization and weakens the overall security of the network.
However, given the limited number of transactions that can be stored in a block, making the block time too long would decrease the number of transactions the network can process per second, negatively affecting network scalability.

3. Bitcoin’s additional features

Segregated Witness (SegWit)

Segregated Witness, often abbreviated as SegWit, is a protocol upgrade proposal that went live in August 2017.
SegWit separates witness signatures from transaction-related data. Witness signatures in legacy Bitcoin blocks often take more than 50% of the block size. By removing witness signatures from the transaction block, this protocol upgrade effectively increases the number of transactions that can be stored in a single block, enabling the network to handle more transactions per second. As a result, SegWit increases the scalability of Nakamoto consensus-based blockchain networks like Bitcoin and Litecoin.
SegWit also makes transactions cheaper. Since transaction fees are derived from how much data is being processed by the block producer, the more transactions that can be stored in a 1MB block, the cheaper individual transactions become.
https://preview.redd.it/depya70mf3151.png?width=1601&format=png&auto=webp&s=a6499aa2131fbf347f8ffd812930b2f7d66be48e
The legacy Bitcoin block has a block size limit of 1 megabyte, and any change on the block size would require a network hard-fork. On August 1st 2017, the first hard-fork occurred, leading to the creation of Bitcoin Cash (“BCH”), which introduced an 8 megabyte block size limit.
Conversely, Segregated Witness was a soft-fork: it never changed the transaction block size limit of the network. Instead, it added an extended block with an upper limit of 3 megabytes, which contains solely witness signatures, to the 1 megabyte block that contains only transaction data. This new block type can be processed even by nodes that have not completed the SegWit protocol upgrade.
Furthermore, the separation of witness signatures from transaction data solves the malleability issue with the original Bitcoin protocol. Without Segregated Witness, these signatures could be altered before the block is validated by miners. Indeed, alterations can be done in such a way that if the system does a mathematical check, the signature would still be valid. However, since the values in the signature are changed, the two signatures would create vastly different hash values.
For instance, if a witness signature states “6,” it has a mathematical value of 6, and would create a hash value of 12345. However, if the witness signature were changed to “06”, it would maintain a mathematical value of 6 while creating a (faulty) hash value of 67890.
Since the mathematical values are the same, the altered signature remains a valid signature. This would create a bookkeeping issue, as transactions in Nakamoto consensus-based blockchain networks are documented with these hash values, or transaction IDs. Effectively, one can alter a transaction ID to a new one, and the new ID can still be valid.
This can create many issues, as illustrated in the below example:
  1. Alice sends Bob 1 BTC, and Bob sends Merchant Carol this 1 BTC for some goods.
  2. Bob sends Carol this 1 BTC while the transaction from Alice to Bob is not yet validated. Carol sees this incoming transaction of 1 BTC and immediately ships the goods to Bob.
  3. At the moment, the transaction from Alice to Bob is still not confirmed by the network, and Bob can change the witness signature, therefore changing this transaction ID from 12345 to 67890.
  4. Now Carol will not receive her 1 BTC, as the network looks for transaction 12345 to ensure that Bob’s wallet balance is valid.
  5. As this particular transaction ID changed from 12345 to 67890, the transaction from Bob to Carol will fail, and Bob will get his goods while still holding his BTC.
With the Segregated Witness upgrade, such instances can not happen again. This is because the witness signatures are moved outside of the transaction block into an extended block, and altering the witness signature won’t affect the transaction ID.
Since the transaction malleability issue is fixed, Segregated Witness also enables the proper functioning of second-layer scalability solutions on the Bitcoin protocol, such as the Lightning Network.

Lightning Network

Lightning Network is a second-layer micropayment solution for scalability.
Specifically, Lightning Network aims to enable near-instant and low-cost payments between merchants and customers that wish to use bitcoins.
Lightning Network was conceptualized in a whitepaper by Joseph Poon and Thaddeus Dryja in 2015. Since then, it has been implemented by multiple companies. The most prominent of them include Blockstream, Lightning Labs, and ACINQ.
A list of curated resources relevant to Lightning Network can be found here.
In the Lightning Network, if a customer wishes to transact with a merchant, both of them need to open a payment channel, which operates off the Bitcoin blockchain (i.e., off-chain vs. on-chain). None of the transaction details from this payment channel are recorded on the blockchain, and only when the channel is closed will the end result of both party’s wallet balances be updated to the blockchain. The blockchain only serves as a settlement layer for Lightning transactions.
Since all transactions done via the payment channel are conducted independently of the Nakamoto consensus, both parties involved in transactions do not need to wait for network confirmation on transactions. Instead, transacting parties would pay transaction fees to Bitcoin miners only when they decide to close the channel.
https://preview.redd.it/cy56icarf3151.png?width=1601&format=png&auto=webp&s=b239a63c6a87ec6cc1b18ce2cbd0355f8831c3a8
One limitation of the Lightning Network is that it requires a person to be online to receive transactions addressed to him. Another limitation in user experience is that one needs to lock up funds every time he wishes to open a payment channel, and is only able to use those funds within the channel.
However, this does not mean he needs to create new channels every time he wishes to transact with a different person on the Lightning Network. If Alice wants to send money to Carol, but they do not have a payment channel open, they can ask Bob, who has payment channels open to both Alice and Carol, to help make that transaction. Alice will be able to send funds to Bob, and Bob to Carol. Hence, the number of “payment hubs” (i.e., Bob in the previous example) correlates with both the convenience and the usability of the Lightning Network for real-world applications.
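Conceptually, a channel is just a pair of balances the two parties keep updating off-chain, with only the opening and closing states touching the blockchain. The toy sketch below ignores HTLCs, signatures, and dispute handling:

```python
# Toy payment channel: balance updates are instant and off-chain;
# only open (funding) and close (settlement) would touch the chain.
class PaymentChannel:
    def __init__(self, a, b, a_funds, b_funds):
        self.balances = {a: a_funds, b: b_funds}   # funded on-chain (satoshis)

    def pay(self, sender, receiver, amount):
        if self.balances[sender] < amount:
            raise ValueError("insufficient channel balance")
        self.balances[sender] -= amount
        self.balances[receiver] += amount

    def close(self):
        return dict(self.balances)                 # final state settled on-chain

channel = PaymentChannel("alice", "bob", 50_000, 50_000)
for _ in range(3):
    channel.pay("alice", "bob", 1_000)             # three payments, zero miner fees
print(channel.close())                             # {'alice': 47000, 'bob': 53000}
```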

Schnorr Signature upgrade proposal

Elliptic Curve Digital Signature Algorithm (“ECDSA”) signatures are used to sign transactions on the Bitcoin blockchain.
https://preview.redd.it/hjeqe4l7g3151.png?width=1601&format=png&auto=webp&s=8014fb08fe62ac4d91645499bc0c7e1c04c5d7c4
However, many developers now advocate for replacing ECDSA with Schnorr Signature. Once Schnorr Signatures are implemented, multiple parties can collaborate in producing a signature that is valid for the sum of their public keys.
This would primarily be beneficial for network scalability. When multiple addresses were to conduct transactions to a single address, each transaction would require their own signature. With Schnorr Signature, all these signatures would be combined into one. As a result, the network would be able to store more transactions in a single block.
https://preview.redd.it/axg3wayag3151.png?width=1601&format=png&auto=webp&s=93d958fa6b0e623caa82ca71fe457b4daa88c71e
The reduced size in signatures implies a reduced cost on transaction fees. The group of senders can split the transaction fees for that one group signature, instead of paying for one personal signature individually.
Schnorr Signature also improves network privacy and token fungibility. A third-party observer will not be able to detect if a user is sending a multi-signature transaction, since the signature will be in the same format as a single-signature transaction.
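The "signatures add up" idea can be shown with toy numbers. The sketch below substitutes a multiplicative group modulo a prime for the secp256k1 curve and uses naive key aggregation, which (unlike MuSig-style protocols) is vulnerable to rogue-key attacks, so treat it as strictly illustrative:

```python
import hashlib
import secrets

# INSECURE demo parameters: a multiplicative group mod a prime rather
# than the secp256k1 curve used by real Schnorr proposals (BIP 340).
p = 2**255 - 19
g = 2

def H(*parts) -> int:
    data = b"".join(str(x).encode() for x in parts)
    return int.from_bytes(hashlib.sha256(data).digest(), "big")

def keygen():
    x = secrets.randbelow(p - 1)         # secret key
    return x, pow(g, x, p)               # (secret, public) pair

msg = "pay 1 BTC to Carol"
(x1, X1), (x2, X2) = keygen(), keygen()

# Each signer picks a nonce; the shared challenge commits to both.
r1, r2 = secrets.randbelow(p - 1), secrets.randbelow(p - 1)
R1, R2 = pow(g, r1, p), pow(g, r2, p)
e = H(R1 * R2 % p, X1 * X2 % p, msg)

s = (r1 + e * x1) + (r2 + e * x2)        # partial signatures simply add

# One check against the combined nonce and combined public key:
ok = pow(g, s, p) == (R1 * R2 % p) * pow(X1 * X2 % p, e, p) % p
print(ok)                                # True
```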

4. Economics and supply distribution

The Bitcoin protocol utilizes the Nakamoto consensus, and nodes validate blocks via Proof-of-Work mining. The bitcoin token was not pre-mined, and has a maximum supply of 21 million. The initial reward for a block was 50 BTC per block. Block mining rewards halve every 210,000 blocks. Since the average time for block production on the blockchain is 10 minutes, it implies that the block reward halving events will approximately take place every 4 years.
As of May 12th 2020, the block mining rewards are 6.25 BTC per block. Transaction fees also represent a minor revenue stream for miners.
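The 21 million cap falls directly out of this schedule: summing the geometric series of rewards (and ignoring satoshi-level rounding, which makes the true cap slightly lower) gives the figure below:

```python
# Total supply implied by the halving schedule.
reward = 50.0          # initial BTC per block
total = 0.0
while reward >= 1e-8:  # stop below 1 satoshi
    total += reward * 210_000        # blocks per halving era
    reward /= 2
print(f"{total:,.0f} BTC")           # ~21,000,000
```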
submitted by D-platform to u/D-platform [link] [comments]

hahhaa doom go brrrr

The date was June 10, 2018. The sun was shining, the grass was growing, and the birds were singing. At least, that’s what I assumed. Being a video game and tech obsessed teenager, I was indoors, my eyes glued to my computer monitor like a starving lion spying on a plump gazelle. I was watching the E3 (Electronic Entertainment Expo) 2018 broadcast on twitch.tv, a popular streaming website. Video game developers use E3 as an annual opportunity to showcase any upcoming video game projects to the public. So far, the turnout had been disappointing. Much to my disappointment, multiple game developers had failed to unveil anything of actual substance for an entire two hours. A graphical update here, a bug fix there. Issues that should have been fixed at every game’s initial launch, not a few months after release. Feeling hopeless, I averted my eyes from my computer monitor to check Reddit (a social media app/website) for any forum posts that I had yet to see. But then, I heard it. The sound of music composer Mick Gordon’s take on the original “DooM” theme, the awesome combination of metal and electronic music. I looked up at my screen and gasped. Bethesda Softworks and id Software had just announced “DOOM: Eternal”, the fifth installment in the “DooM” video game series. “DOOM: Eternal” creative director Hugo Martin promised that the game would feel more powerful than its 2016 predecessor, there would be twice as many enemy types, and the Doom community would finally get to see “hell on earth”. (Martin) As a fan of “DOOM (2016)”, I was ecstatic, and I wished I wouldn’t have to wait to experience the most recent entry in the series. “DOOM (1993)” was a graphical landmark when it originally released, yet nowadays it looks extremely dated, especially compared to “DOOM: Eternal”. What advancements in computer technology drove this graphical change? Computers became faster, digital storage increased, and computer peripherals became able to display higher resolutions and refresh rates.
“DooM” 1993 graphics example:
📷(Doom | Doom Wiki)
“DOOM: Eternal” graphics example:
📷
(Bailey)
In their video “Evolution Of DOOM”, the video game YouTube Channel “gameranx” says that on December 10, 1993, a file titled “DOOM1_0.zip” was uploaded on the File Transfer Protocol (FTP) server of the University of Wisconsin. This file, two megabytes in size, contained the video game “DooM” created by the game development group “id Software”. (Evolution of DOOM) While not the first game in the “First Person Shooter” (FPS) genre, “DooM” popularized the genre, to the point of any other FPS game being referred to as a “Doom Clone” until the late 1990s. (Doom clones | Doom Wiki) The graphics of the original “DooM” are definitely a major downgrade compared to today’s graphical standards, but keep in mind that the minimum system requirements of “DooM”, according to the article “Doom System Requirements” on gamesystemrequirements.com, were eight megabytes of ram, an Intel Pentium or AMD (Advanced Micro Devices) Athlon 486 processor cycling at sixty-six megahertz or more, and an operating system that was Windows 95 or above. (Doom System Requirements) In case you don’t speak the language of technology (although I hope you learn a thing or two at the end of this essay), the speed and storage capacity are laughable compared to the specifications of today. By 1993, the microprocessor, or CPU (Central Processing Unit), had been active for the past twenty-two years after replacing the integrated circuit in 1971, thanks to the creators of the microprocessor, Robert Noyce and Gordon Moore, who were also the founders of CPU manufacturer “Intel”. Gordon Moore also created “Moore’s law”, which states “The number of transistors incorporated in a chip will approximately double every 24 months”. (Moore) Sadly, according to writer and computer builder Steve Blank in his article “The End of More - The Death of Moore’s Law”, this law would end at around 2005, thanks to the basic laws of physics. (Blank) 1993 also marked an important landmark for Intel, who had just released the first “Pentium” processor, which was capable of a base clock of 60 MHz (megahertz). The term “base clock” refers to the default speed of a CPU. This speed can be adjusted via the user’s specifications, and “MHz” refers to one million cycles per second. A cycle is essentially one or more problems that the computer solves. The more cycles the CPU is running at, the more problems get solved. Intel would continue upgrading their “Pentium” lineup until January 4, 2000, when they would release the “Celeron” processor, with a base clock of 533 MHz. Soon after, on June 19, 2000, rival CPU company AMD would release their “Duron” processor, which had a base clock of 600 MHz, with a maximum clock of 1.8 GHz (Gigahertz). One GHz is equal to 1,000 MHz. Intel and AMD had established themselves as the two major CPU companies in the 1970s in Silicon Valley. Both companies had been bitter rivals since then, trading figurative blows in the form of competitive releases, discounts, and “one-upmanship” to this day. Moving on to April 21, 2005, when AMD released the first dual-core CPU, the “Athlon 64 X2 3800+”. The notable feature of this CPU, besides a 2.0 GHz base clock and a 3.8 GHz maximum clock, was that it was the first CPU to have two cores. A CPU core is a CPU’s processor. The more cores a CPU has, the more tasks it can perform per cycle, thus maximizing its efficiency. Intel wouldn’t respond until January 9, 2006, when they released their dual-core processor, the “Core 2 Duo Processor E6320”, with a base clock of 1.86 GHz.
(Computer Processor History) According to tech entrepreneur Linus Sebastian in his YouTube videos “10 Years of Gaming PCs: 2009 - 2014 (Part 1)” and “10 Years of Gaming PCs: 2015 - 2019 (Part 2)”, AMD would have the upper hand over Intel until 2011, when Intel released the “Sandy Bridge” CPU microarchitecture, which was faster and around the same price as AMD’s current competing products. (Sebastian) The article “What is Microarchitecture?” on the website Computer Hope defines microarchitecture as “a hardware implementation of an ISA (instruction set architecture). An ISA is a structure of commands and operations used by software to communicate with hardware. A microarchitecture is the hardware circuitry that implements one particular ISA”. (What is Microarchitecture?) Microarchitecture is also referred to as what generation a CPU belongs to. Intel would continue to dominate the high-end CPU market until 2019, when AMD would “dethrone” Intel with their third generation “Ryzen” CPU lineup. The most notable of these was the “Ryzen 3950x”, which had a total of sixteen cores, thirty-two threads, a base clock of 3.5 GHz, and a maximum clock of 4.7 GHz. (Sebastian) The term “thread” refers to splitting one core into virtual cores, via a process known as “simultaneous multithreading”. Simultaneous multithreading allows one core to perform two tasks at once. What CPU your computer has strongly influences how fast it can run, but for video games and other types of graphics, there is a special type of processor that is designed specifically for the task of “rendering” (displaying) and generating graphics. This processor unit is known as the graphics processing unit, or “GPU”. The term “GPU” wasn’t used until around 1999, when video cards started to evolve beyond the literal generation of two-dimensional graphics and into the generation of three-dimensional graphics. According to user “Olena” in their article “A Brief History of GPU”, the first GPU was the “GeForce 256”, created by GPU company “Nvidia” in 1999. Nvidia promoted the GeForce 256 as “A single-chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that is capable of processing a minimum of 10 million polygons per second”. (Olena) Unlike the evolution of CPUs, the history of GPUs is more one-sided, with AMD playing a game of “catchup” ever since Nvidia overtook AMD in the high-end GPU market in 2013. (Sebastian) Fun fact, GPUs aren’t used only for gaming! In 2010, Nvidia collaborated with Audi to power the dashboards and improve the entertainment and navigation systems in Audi’s cars! (Olena) Much to my (and many other tech enthusiasts’) dismay, GPUs would increase dramatically in price thanks to the “bitcoin mania” around 2017. This was, according to senior editor Tom Warren in his article “Bitcoin Mania is Hurting PC Gamers By Pushing Up GPU Prices” on theverge.com, around an 80% increase in price for the same GPU due to stock shortages. (Warren) Just for context, Nvidia’s “flagship” GPU in 2017 was the 1080ti, the finest card of the “Pascal” microarchitecture. Fun fact, I have this card. The 1080ti launched for $699, with the specifications of a base clock of 1,481 MHz, a maximum clock of 1,582 MHz, and 11 gigabytes of GDDR5X Vram (memory that is exclusive to the GPU) according to the box it came in. Compare this to Nvidia’s most recent flagship GPU, the 2080ti of Nvidia’s follow-up “Turing” microarchitecture, another card I have. This GPU launched in 2019 for $1,199.
The 2080ti’s specifications, according to the box it came in included a base clock of 1,350 MHz, a maximum clock of 1,545 MHz, and 11 gigabytes of GDDR6 Vram.
A major reason why “DooM” was so popular and ingenious was how id Software developer John Carmack managed to “fake” the three-dimensional graphics without taking up too much processing power, hard drive space, or “RAM” (Random access memory), a specific type of digital storage. According to the article “RAM (Random Access Memory) Definition” on the website TechTerms, Ram is also known as “volatile” memory, because it is much faster than normal storage (which at the time took the form of hard-drive space), and unlike normal storage, only holds data when the computer is turned on. A commonly used analogy is that Ram is the computer’s short-term memory, storing temporary files to be used by programs, while hard-drive storage is the computer’s long-term memory. (RAM (Random Access Memory) Definition) As I stated earlier, in 1993, “DooM” required 8 megabytes of ram to run. For some context, as of 2020, “DOOM: Eternal” requires a minimum of 8 gigabytes of DDR4 (more on this later) ram to run, with most gaming machines possessing 16 gigabytes of DDR4 ram. According to tech journalist Scott Thornton in his article “What is DDR (Double Data Rate) Memory and SDRAM Memory”, in 1993, the popular format of ram was “SDRAM”. “SDRAM” stands for “Synchronous Dynamic Random Access Memory”. SDRAM differs from its predecessor, “DRAM” (Dynamic Random Access Memory), by being synchronized with the clock speed of the CPU. DRAM was asynchronous (not synchronized by any external influence), which “posted a problem in organizing data as it comes in so it can be queued for the process it’s associated with”. SDRAM was able to transfer data one time per clock cycle, and its replacement in the early 2000s, “DDR SDRAM” (Double Data Rate Synchronous Dynamic Random Access Memory), was able to transfer data two times per clock cycle. This evolution of ram would continue to this day. In 2003, DDR2 SDRAM was released, able to transfer four pieces of data per clock cycle. In 2007, DDR3 SDRAM was able to transfer eight pieces of data per clock cycle. In 2014, DDR4 SDRAM still was able to transfer eight pieces of data per cycle, but the clock speed had increased by 600 MHz, and the overall power consumption had been reduced from 3.3 volts for the original SDRAM to 1.2 volts for DDR4. (Thornton) The digital size of each “ram stick” (a physical stick of ram that you would insert into your computer) had also increased, from around two megabytes per stick, to up to 128 gigabytes per stick (although this particular option will cost you around $1,000 per stick depending on the manufacturer) in 2020, although the average stick size is 8 gigabytes. For the average computer nowadays, you can insert up to four ram sticks, although for more high-end systems, you can insert up to sixteen or even thirty-two! Rewind back to 1993, where the original “DooM” took up two megabytes of storage, not to be confused with ram. According to tech enthusiast Rex Farrance in their article “Timeline: 50 Years of Hard Drives”, the average computer at this time had around two gigabytes of storage. Storage took the form of magnetic-optical discs, a combination of the previous magnetic discs and optical discs. (Farrance) This format of storage is still in use today, although mainly for large amounts of rarely used data, while data that is commonly used by programs (including the operating system) is put on solid-state drives, or SSDs.
According to tech journalist Keith Foote in their article “A Brief History of Data Storage”, SSDs differ from HDDs by being much faster and smaller, storing data on a flash memory chip, not unlike a USB thumb drive. While SSDs had been used as far back as 1950, they wouldn’t find their way into the average gaming machine until the early 2010s. (Foote) A way to think about an SSD is as your common knowledge: it doesn’t contain every piece of information you know, just what you use on a daily basis. For example, my computer has around 750 gigabytes of storage in SSDs, and around two terabytes of internal HDD storage. On my SSDs, I have my operating system, my favorite programs and games, and any files that I use frequently. On my HDD, I have everything else that I don’t use on a regular basis.
“DOOM: Eternal” would release on March 20, 2020, four months after its original release date of November 22, 2019. And let me tell you, I was excited. The second my clock turned from 11:59 P.M. to 12:00 A.M., I repeatedly clicked my refresh button, desperately waiting to see the words “Coming March 20” transform into the ever so beautiful and elegant phrase: “Download Now”. At this point in time, I had a monitor that was capable of displaying roughly two million pixels spread out over its 27-inch display panel, at a rate of 240 times a second. Speaking of monitors and displays, according to the article “The Evolution of the Monitor” on the website PCR, at the time of the original “DooM” release, the average monitor was either a CRT (cathode ray tube) monitor, or the newer (and more expensive) LCD (liquid crystal display) monitor. The CRT monitor was first unveiled in 1897 by the German physicist Karl Ferdinand Braun. CRT monitors functioned by colored cathode ray tubes generating an image on a phosphorescent screen. These monitors would have an average resolution of 800 by 600 pixels and a refresh rate of around 30 frames per second. CRT monitors would eventually be replaced by LCD monitors in the late 2000s. LCD monitors functioned by using two pieces of polarized glass with liquid crystal between them. A backlight would shine through the first piece of polarized glass (also known as substrate). Electrical currents would then cause the liquid crystals to adjust how much light passes through to the second substrate, which creates the images that are displayed. (The Evolution of the Monitor) The average resolution would increase to 1920x1080 pixels and the refresh rate would increase to 60 frames a second around 2010. Nowadays, there are high-end monitors that are capable of displaying up to 7,680 by 4,320 pixels, and also monitors that are capable of displaying up to 360 frames per second, assuming you have around $1,000 lying around.
At long last, it had finished. My 40.02 gigabyte download of “DOOM: Eternal” had finally completed, and oh boy, I was ready to experience this. I ran over to my computer, my beautiful creation sporting 32 gigs of DDR4 ram, an AMD Ryzen 7 “3800x” with a base clock of 3.8 GHz, an Nvidia 2080ti, 750 gigabytes of SSD storage and two terabytes of HDD storage. Finally, after two years of waiting for this, I grabbed my mouse, and moved my cursor over that gorgeous button titled “Launch DOOM: Eternal”. Thanks to multiple advancements in the speed of CPUs, the size of ram and storage, and display resolution and refresh rate, “DooM” had evolved from an archaic, pixelated video game in 1993 into the beautiful, realistic and smooth video game it is today. And personally, I can’t wait to see what the future has in store for us.
submitted by Voxel225 to voxelists [link] [comments]

Future electricity consumption of Bitcoin

TL;DR For Bitcoin to become the world currency we would only have to build about two more earths. Get cracking!
 
There have been many discussions about the current energy consumption of Bitcoin, but after reading some insane article yesterday promising that Bitcoin will solve all of our financial woes if it were the world currency, I wondered what the future energy consumption would look like if we lived in Butterland where that turned out to be true.
 
To make a good estimate we need a few assumptions for mining-related costs, which I will shamelessly steal from the Digiconomist. Additionally, I will assume that this scenario plays out somewhat in the future, after the block reward has been halved again (from the current 12.5 BTC/block).
 
This leaves us with:
price per bitcoin: $10,000,000
block reward: 6.25 BTC
blocks per year: 52,560
miner BTC per year: 328,500 BTC
global mining revenues: $3,285,000,000,000
 
Assuming 60% operational costs, and 1 kWh of electricity per $0.05 of operational costs, we arrive at:
miner operational costs: $1,971,000,000,000
electricity consumption: 39,420,000,000,000 kWh (= 39,420 TWh)
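For anyone who wants to poke at these numbers, here's a minimal back-of-the-envelope sketch in Python; every input is an assumption taken from the text above (the $0.05/kWh and 60% operational-cost share are Digiconomist-style guesses, not measured values):

```python
# Sketch of the estimate above; all inputs are assumptions from the text.
PRICE_USD = 10_000_000          # assumed future price per BTC
BLOCK_REWARD_BTC = 6.25         # after one more halving
BLOCKS_PER_YEAR = 6 * 24 * 365  # one block every ~10 minutes = 52,560

btc_per_year = BLOCK_REWARD_BTC * BLOCKS_PER_YEAR   # 328,500 BTC
revenue_usd = btc_per_year * PRICE_USD              # $3.285 trillion
op_costs_usd = revenue_usd * 0.60                   # 60% of revenue
kwh = op_costs_usd / 0.05                           # 1 kWh per $0.05
print(f"{kwh / 1e9:,.0f} TWh per year")             # -> 39,420 TWh
```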
 
OK, that's a large number, but let's put it in context. According to the CIA World Factbook, in 2015 we consumed 21.78 trillion kWh (= 21,780 TWh) of electricity worldwide.
 
So if the dreams of our glorious overlords came true, we would have to roughly triple the electricity generation of the world and spend two-thirds of it just on running the miners.
 
But let's have a bit more fun with these numbers.
The largest (currently operating) nuclear power station is the Bruce Nuclear Generating Station in Canada, which produced 47.15 TWh of electricity in 2015. So if we wanted to feed the miners purely on nuclear energy, we would need about 836 of these nuclear power plants, which would come with a nice price tag of $6.5 trillion (not adjusted for inflation).
 
But that's old technology, and as our favorite prophet biglambda revealed in his infinite wisdom, we would power those miners with green energy. So let's do it with solar.
Solar Star was the world's largest solar farm in 2015 and can produce 1.664 TWh of electricity annually. So to feed the miners we would only need about 23,690 of these installations, which would use up a measly 307,970 km² of land. That's a bit less than half the size of Texas. And the construction costs of $26 trillion wouldn't even triple the U.S. national debt, so what do we worry about?
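Continuing the sketch from above, the plant-count arithmetic looks like this (the 13 km² footprint per Solar Star is an assumption on my part; the output figures are the 2015 numbers quoted above):

```python
TOTAL_TWH = 39_420        # miner demand from the earlier estimate

BRUCE_TWH = 47.15         # Bruce Nuclear annual output, 2015
SOLAR_STAR_TWH = 1.664    # Solar Star annual output, 2015
SOLAR_STAR_KM2 = 13       # assumed footprint of one Solar Star

print(round(TOTAL_TWH / BRUCE_TWH))          # ~836 nuclear stations
farms = round(TOTAL_TWH / SOLAR_STAR_TWH)    # ~23,690 solar farms
print(farms)
print(farms * SOLAR_STAR_KM2)                # ~307,970 km² of land
```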
 
But as I am just a regret-filled nocoiner, I'm sure there are several fast-progressing free-energy projects from leading cryptocurrency investors out there which will end up saving the world. Anything else you hear is just baseless FUD.

HODL

submitted by elanko to Buttcoin [link] [comments]

Staking — The New Way to Earn Crypto for Free

Airdrops are so 2017; free money was fun while it lasted, but now when someone says free money in crypto, the first thoughts are scams and Ponzi schemes. In 2020, however, there is a way to earn free money in a legitimate, common-practice, and logical manner — staking.
Staking is the core concept behind the Proof-of-Stake (PoS) consensus protocol that is quickly becoming an industry standard throughout blockchain projects. PoS allows blockchains to scale effectively without compromising on security and resource efficiency. Projects that incorporate staking include aelf, Dash, EOS, Cosmos, Cardano, Dfinity and many others.

https://preview.redd.it/luczupo004c41.png?width=616&format=png&auto=webp&s=2a2aba11c35c9962e42d1ea56b9e4f33532750ef

PoW — Why change

First, let’s look at some of the issues facing Proof-of-Work (PoW) consensus that led to the development of PoS.
  1. Excessive energy consumption — In 2017, many concerns were raised over the amount of electricity used by the Bitcoin network (the largest PoW blockchain). Since then, energy consumption has increased by over 400%, to the point where a single transaction on this network has the same carbon footprint as 736,722 Visa transactions, or consumes the same amount of electricity as over 20 U.S. households.
  2. Varying electricity costs — The profit of any miner on the network is tied to two costs: the initial startup cost of the hardware and infrastructure, and, more critically, the running cost of that equipment in terms of electricity usage. Electricity costs vary from fractions of a cent per kWh to over 50 cents (USD), and in some cases electricity is free. When a user may only be earning $0.40 USD per hour, this clearly rules out certain demographics based purely on electricity costs, reducing the potential for complete decentralization.
  3. Reduced decentralization — Due to the high cost of mining equipment, those with large financial bases set up mining farms, either to rent out individual miners to others or entirely for personal gain. This results in large demographic hotspots on the network, undermining decentralization to the point where the network no longer achieves it.
  4. Conflicted interests — The requirements for running miners on the network are purely based on possession of the hardware, electricity, and an internet connection. There is no limit to the amount a miner can earn, nor do they need to hold any stake in the network, so there is very little incentive for them to vote on upgrades that may benefit the network but reduce their rewards.
I want to take this moment to mention a potential benefit of PoW that I have not seen anyone mention previously. It is a very loose argument, so don't take it to heart too strongly.
Consistent fiat injection — The majority of miners will be paying for their electricity in fiat currency. At a conservative rate of $0.10 USD per kWh, the network currently uses 73.12 TWh per year, which equates to an average daily cost of over $20 million USD. This means that every day, around $20 million of fiat currency is effectively being injected into the Bitcoin network. This concept is somewhat flawed in the sense that the same amount of bitcoin will be released each day regardless of how much is spent on electricity, but I'm looking at this through the eyes of the miners: they are reducing their fiat bags and increasing their bitcoin bags. This change of bags is the essence of this point, and it will inevitably encourage crypto spending. If the bitcoin bags were increased but the fiat bags did not decrease, there would be less incentive to spend the bitcoin, as we would see in a staking ecosystem.
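A quick sanity check of the $20 million/day figure (the 73.12 TWh/year consumption and $0.10/kWh rate are the assumptions from the paragraph above):

```python
kwh_per_year = 73.12e9                     # 73.12 TWh expressed in kWh
fiat_per_day = kwh_per_year * 0.10 / 365   # $0.10/kWh, averaged per day
print(f"${fiat_per_day:,.0f} per day")     # -> $20,032,877
```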


PoS Variations

Different approaches have been taken to tackle the different issues the PoS protocol faces. Will Little has an excellent article explaining this and more, but let me take an excerpt from his piece to go through them:
  • Coin-age selection — Blockchains like Peercoin (the first PoS chain), start out with PoW to distribute the coins, use coin age to help prevent monopolization and 51% attacks (by setting a time range when the probability of being selected as a node is greatest), and implement checkpoints initially to prevent NoS problems.
  • Randomized block selection — Chains like NXT and Blackcoin also use checkpoints, but believe that coin-age discourages staking. After an initial distribution period (either via PoW or otherwise), these chains use algorithms to randomly select nodes that can create blocks.
  • Ethereum’s Casper protocol(s) — Being already widely distributed, Ethereum doesn’t have to worry about the initial distribution problem when/if it switches to PoS. Casper takes a more Byzantine Fault Tolerant (BFT) approach and will punish nodes by taking away (“slashing”) their stake if they do devious things. In addition, consensus is formed by a multi-round process where every randomly assigned node votes for a specific block during a round.
  • Delegated Proof-of-Stake (DPoS) — Invented by Dan Larimer and first used in Bitshares (and then in [aelf,] Steem, EOS, and many others), DPoS tackles potential PoS problems by having the community “elect” delegates that will run nodes to create and validate blocks. Bad behavior is then punished by the community simply out-voting the delegated nodes.
  • Delegated Byzantine Fault Tolerance (DBFT) — Similar to DPoS, the NEO community votes for (delegates) nodes, but instead of each node producing blocks and agreeing on consensus, at least two-thirds of the delegated nodes need to agree on what goes into every block (acting more like bookkeepers than validators).
  • Tendermint — As a more sophisticated form of DBFT and a precursor to Casper, Jae Kwon introduced tendermint in 2014, which leverages dynamic validator sets, rotating leader elections, and voting power (i.e. weight) that is proportional to the self-funding and community allocation of tokens to a node (i.e. a “validator”).
  • Masternodes — First introduced by DASH, a masternode PoS system requires nodes to stake a minimum threshold of coins in order to qualify as a node. Often this comes with requirements to provide “service” to a network in the form of governance, special payment protocols, etc…
  • Proof of Importance (POI) — NEM takes a slightly different approach by granting an “importance calculation” to masternodes staking at least 10,000 XEM. This POI system then rewards active nodes that act in a positive way over time to impact the community.
  • “Proof-of-X” — And finally, there is no lack of activity in the PoS world to come up with clever approaches and variants of staking (some are more elaborate than others). In addition to BFT protocols such as Honeybadger, Ouroboros, and Tezos, for further reading, also check out “Proof-of-”: Stake Anonymous, Storage, Stake Time, Stake Velocity, Activity, Burn, and Capacity.

Earning Your Stake

In order to understand how one can earn money from these networks, I'll break them down into three categories: simple staking, running nodes, and voting.
Simple staking - This is the simplest of the three methods and requires almost no action by the user. Certain networks reward users simply for holding tokens in a specified wallet. These rewards are generally minimal but are the easiest way to earn.
Running a node - This method provides the greatest rewards but also requires the most action by the user, and will most likely require ongoing maintenance. Generally speaking, networks require nodes to stake a certain amount of tokens, often amounting to thousands of dollars. In DPoS systems, these nodes must be voted in by other users on the network and must continue to provide confidence to their supporters. Some companies will set up nodes and allow users to participate by contributing to the minimum staking amount, similar in concept to PoW mining pools.
Voting - This mechanism works hand in hand with running nodes on DPoS networks. Users are encouraged to vote for their preferred nodes by staking tokens as votes. Each vote unlocks a small amount of rewards for each voter; the nodes are normally the ones to provide these rewards, as a portion of their own reward for running a node. A toy example of such a reward split is sketched below.
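Here is that sketch: a minimal DPoS-style reward split in Python. The 25% commission and the vote weights are made-up illustrative numbers, not any network's actual parameters:

```python
# Toy DPoS reward split: the node keeps a commission and shares the
# rest with its voters pro-rata to their staked votes.
def split_reward(block_reward: float, commission: float,
                 votes: dict[str, float]) -> dict[str, float]:
    pool = block_reward * (1 - commission)
    total = sum(votes.values())
    payouts = {voter: pool * stake / total for voter, stake in votes.items()}
    payouts["node"] = block_reward * commission
    return payouts

print(split_reward(100.0, 0.25, {"alice": 600.0, "bob": 400.0}))
# {'alice': 45.0, 'bob': 30.0, 'node': 25.0}
```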

Aelf’s DPoS system

The aelf consensus protocol utilizes a form of DPoS. There are two kinds of nodes on the network: active nodes and backup nodes (official names yet to be announced). Active nodes run the network and produce the blocks, while backup nodes complete minor tasks and stand by in case any active nodes go offline or act maliciously. These nodes are selected based on the number of votes they receive. Initially, the top 17 nodes will be selected as active nodes, while the next 100 will stand as backups; each voting period, a node may change position should it receive more or fewer votes than in the previous period. In order to be considered as a node, one must stake a minimum amount of ELF tokens (yet to be announced).

In order to participate as a voter, there is no minimum amount of tokens to be staked. When one stakes, their tokens are locked for a designated amount of time, selected by the voter from preset periods. If users pull their tokens out before this lock period has expired, no rewards are received; if they leave them locked for the entire time frame, they receive the set reward, and the tokens are automatically rolled over into the next lock period. As a result, should a voter so decide, once their votes are cast they can continue to receive rewards without any further action needed.
Many projects have grappled with node rewards, trying to make them fair and well incentivized but sustainable for everyone involved. Aelf has come up with a reward structure based on multiple variables, with a basic income guaranteed for every node. Variables may include the number of re-elections, the number of votes received, or other elements.
As the system matures, the number of active nodes will be increased, resulting in a more diverse and secure network.
Staking as a solution is a win-win-win for network creators, users, and investors. It is a much more resource-efficient and scalable protocol for securing blockchain networks, while lowering the entry point for users to earn from the system.
submitted by Floris-Jan to aelfofficial [link] [comments]

Burstcoin (BURST): A Dark Horse That Could Become A Major Cryptocurrency, The King of Proof of Capacity

https://cryptoiq.co/burstcoin-burst-a-dark-horse-that-could-become-a-major-cryptocurrency-the-king-of-proof-of-capacity/
Currently the cryptocurrency space is flooded with copycat coins and initial coin offering (ICO) tokens, most of which are moving steadily down the ranks on CoinMarketCap as the bear market of 2018 continues. This bear market is weeding out cryptocurrencies that have little long term potential, and cryptocurrencies that have strong communities and unique technology are rising to the top. Burstcoin (BURST) is one such cryptocurrency that is rising to the top, like cream in a glass of fresh milk. This is because the Burstcoin community is filled with diehard Cypherpunks, and BURST is the king of Proof of Capacity.
Back in the middle of October 2018 BURST was at #248 on CoinMarketCap, which was before the ‘nuclear’ bear market took effect, where the support level was broken due to the Bitcoin Cash hard fork, Bakkt delaying the launch of physical Bitcoin futures, and the Securities and Exchange Commission (SEC) initiating its first civil enforcement penalties against ICOs. BURST has decreased in price like every other cryptocurrency, but is rising relative to other cryptocurrencies, and as of 3 December 2018 sits at #199 on CoinMarketCap with a market cap of USD 13.5 million.
This increase in the price of BURST relative to other cryptocurrencies is due to Burstcoin's unique technology. Burstcoin is the king of Proof of Capacity, a mining algorithm that uses hard drive space rather than raw computational power (as Proof of Work does), and is much more energy efficient than Proof of Work. Proof of Capacity works by writing cryptographic hashes to an allotted segment of a hard drive called a plot. This plot is then read during mining to find the correct cryptographic hash, and whoever finds the cryptographic hash the fastest receives the block reward. More hard drive space dedicated to the plot means more cryptographic hashes available, making it easier to find an answer and earn the BURST block reward.
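Here is a deliberately simplified Python sketch of the plot-then-read idea. Real Burstcoin plots are built from 256 KiB Shabal256 nonces; the hash function, sizes, and account format below are stand-ins for illustration only:

```python
import hashlib, os

def H(*parts: bytes) -> bytes:
    h = hashlib.sha256()
    for p in parts:
        h.update(p)
    return h.digest()

ACCOUNT = b"my-account-id"

def plot(num_nonces: int) -> list[bytes]:
    # Plotting: hash once, store on disk -- this is the space commitment.
    return [H(ACCOUNT, n.to_bytes(8, "big")) for n in range(num_nonces)]

def mine(plot_data: list[bytes], challenge: bytes) -> tuple[int, int]:
    # Mining: a cheap read-and-compare; the smallest "deadline" wins,
    # so a bigger plot simply holds more lottery tickets.
    deadlines = [int.from_bytes(H(challenge, h), "big") for h in plot_data]
    best = min(range(len(deadlines)), key=deadlines.__getitem__)
    return best, deadlines[best]

my_plot = plot(10_000)                            # done once, at plot time
nonce, deadline = mine(my_plot, os.urandom(32))   # negligible work per block
```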
Currently 1 TB generates 1-2 BURST per day, and even though this is only equivalent to about a penny, it is all profit, since reading the plot file requires a negligible amount of energy and BURST miners can use their computers for other activities without impediment. Compare this to Proof of Work, which slows down personal computers and costs more in electricity than the cryptocurrency it mines. BURST is one of the only cryptocurrencies that can be profitably mined on personal computers.
Further, unlike Proof of Work, where specialized mining equipment like application-specific integrated circuits (ASICs) is required, anyone with a computer or even a mobile phone can mine BURST; and if they decide to stop mining BURST, they can simply delete their plot file and use the hard drive space for other things. This is unlike ASICs, which cannot be used for anything but mining, so someone who decides to stop mining loses all the money invested in the ASIC.
The ease of mining and negligible energy usage have led to the formation of a strong BURST mining community, with over 200,000 TB securing the BURST network. This is equivalent to hundreds of thousands of personal computers. The expansive mining community gives BURST value, and some of these miners are blockchain developers who have been building a full suite of technology based on the Burstcoin blockchain.
CloudBurst immutably stores files directly on the Burstcoin blockchain for a small one-time fee. Real blockchain storage is a rarity in the cryptocurrency world. The file will be stored as long as the Burstcoin blockchain exists, which is the foreseeable future and beyond, considering the expansive BURST mining community. CloudBurst would be useful if you lost your computer and all of your backups in a natural disaster like a hurricane, and is a more secure solution than cloud storage such as Google's. Also, the Burstcoin wallet can be used to easily issue cryptocurrencies based on the Burstcoin blockchain, and there is a decentralized exchange built into the wallet to trade these crypto assets.
Cryptocurrency scalability is a problem even for major cryptocurrencies like Bitcoin and Ethereum, but Burstcoin has tackled and solved this problem with the launch of the Dymaxion. The scalability of the Dymaxion is so powerful that it can handle all the non-cash transactions in the world. This is done via the utilization of tangle-based lightning networks on top of the Burstcoin blockchain. Transactions done via the Dymaxion are instant, with no fees and practically no energy expenditure. The Dymaxion gives Burstcoin the room to grow as much as it needs to.
When people look for the cryptocurrencies that will survive long term, it can be confusing due to the 2,000+ cryptocurrencies listed on CoinMarketCap. However, it is clear that cryptocurrencies with truly unique and useful technology, as well as strong communities will always be around and gain value long term relative to all the ICOs and copycats. Bitcoin is the king of SHA-256, Litecoin is the king of Scrypt, Ethereum is the king of blockchain-based dApps, Dogecoin is the king of the shibes on Reddit, Dash is the King of X11, Monero is the king of privacy coins, IOTA is the king of Directed Acyclic Graphs (DAGs), and Burstcoin is the king of Proof of Capacity. These kings of cryptocurrency will definitely be the winners and survivors when the fallout from the ICO apocalypse is over.
This is for educational purposes only and is not investment advice. We are not paid by BURST to write this article.
submitted by turtlecane to burstcoin [link] [comments]

IOTA and Tangle discussion/info, scam or not?

In the past weeks I have heard a lot of pros and cons about IOTA, many of which I believe are not true (I'll explain below). I would like to start a serious discussion about IOTA and help people get into it. Before that, I'll contribute with what I know; most things I say will have a source link providing supporting content.
 
The pros and cons that I heard most often are listed below; I'll discuss the items marked with *.
Pros
Cons
 

Scalability

Many users claim that the network scales infinitely, i.e., that the more transactions there are, the faster the network gets. This is not entirely true, which is why we are seeing the network get congested (pending transactions) at the moment (12/2017).
The network is composed of full nodes (which store all transactions); each full node is capable of sending transactions directly to the tangle. An arbitrary user can set up a light node (which does not store all transactions, and therefore has a reduced size), but as it does not store all transactions and can't decide whether there are conflicting transactions (among other things), it needs to connect to a full node (the Bitfinex node, for example) and request that the full node send a transaction to the tangle. The full node acts as a bridge for a light-node user, and the number of simultaneous transactions a full node can push to the tangle is limited by its bandwidth.
What is happening at the moment is that there are few full nodes, and, more importantly, the majority of users are connected to basically the same full node. That full node can't handle all the transactions requested by the light nodes because of its bandwidth. If you are a light-node user experiencing slow transactions, you need to manually select another node to get better performance. Also, you need to verify that the minimum weight magnitude (the difficulty of the Hashcash-style proof of work) is set to at least 14.
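As a rough illustration of what the minimum weight magnitude means, here is a Hashcash-style toy in Python. Note that IOTA's MWM actually counts trailing zero trits of a ternary Curl hash; this binary SHA-256 analogue only conveys the idea of a tunable proof-of-work difficulty:

```python
import hashlib

def do_pow(data: bytes, difficulty_bits: int) -> int:
    # Find a nonce whose hash ends in `difficulty_bits` zero bits;
    # each extra bit doubles the expected work, like raising the MWM.
    target = 1 << difficulty_bits
    nonce = 0
    while True:
        digest = hashlib.sha256(data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") % target == 0:
            return nonce
        nonce += 1

print(do_pow(b"my-transaction", 16))  # ~65,536 attempts on average
```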
The network seems to be fine and it scales, but the steps a user has to take and know about are not user-friendly at all. It's necessary to understand that the technology involved is relatively new and still in early development. Do not buy IOTA if you haven't read about the technology; there is a high chance of losing your tokens for various reasons, and it will be your own fault. You can learn more about how IOTA works here.
There are some upcoming solutions that will bring the user experience to a new level: the UCL Wallet (expected to be released this month; I will talk about it soon and how it will help the network) and the Nelson CarrIOTA (this week), besides the official implementations to come in December.
 

Centralization

We all know that currently (2017) IOTA depends on the coordinator because the network is still in its infancy, and because of that it is considered centralized by the majority of users.
The coordinator is a set of several full nodes scattered across the world, run by the IOTA Foundation. It creates periodic milestones (zero-value transactions which reference valid transactions) which are validated by the entire network. The coordinator sets the general direction for tangle growth. Every node verifies that the coordinator is not breaking consensus rules by creating IOTAs out of thin air or approving double-spends; nodes only tell other nodes about transactions that are valid, so if the coordinator starts issuing bad milestones, nodes will reject them.
The coordinator has been optional since summer 2017: you can choose not to implement it in your full node, and any talented programmer could replace the Coo logic in IRI with random walk Monte Carlo logic and go without its milestones right now. A new kind of distributed coordinator is about to come, followed, eventually, by its complete removal. You can read more about the coordinator here and here.

Mining-Blockchain-based Cryptocurrencies

These are blockchain-based cryptocurrencies (e.g., Bitcoin) that rely on miners to guarantee their security. Satoshi Nakamoto states several times in the Bitcoin whitepaper that "the system is secure as long as honest nodes collectively control more CPU power than any cooperating group of attacker nodes". We can see on Blockchain.info that nowadays half of the total hashpower in Bitcoin is controlled by three companies (maybe only one in the future?). Users must trust that these companies will behave honestly and will not eventually use their >50% hashpower to attack the network. With all that said, it's reasonable to consider the IOTA network more decentralized (even with the coordinator) than any mining-blockchain-based cryptocurrency.
You can see a comparison between DAG cryptocurrencies here
 

IOTA partnerships

Some partnerships of the IOTA Foundation with big companies were well known even before they were officially published. A few examples of confirmed partnerships are listed below; other confirmed partnerships can be seen under "Partnerships with big companies" in the pros section.
So what's up with all the alarm on social media about the IOTA Foundation faking partnerships with big companies like Microsoft and Cisco?
On Nov. 28th the IOTA Foundation announced the Data Marketplace, with 30+ companies participating. Basically, it's a place for any entity to sell data (huge applications, therefore many companies are interested); at the time of writing (11/12/2017) there is no API for common users, and only companies in touch with the IOTA Foundation can test it.
A quote from Omkar Naik (a Microsoft worker) depicted in the Data Marketplace blog post gave the impression that Microsoft was in a direct partnership with IOTA. Several news websites started writing headlines like "Microsoft and IOTA launch..." (the same news site later claimed that IOTA lied about a partnership with Microsoft) when instead Microsoft was just one of the many participants in the Data Marketplace. Even though it's not a direct partnership, IOTA and Microsoft are in close touch, as seen in the IOTA, Microsoft, and Bosch meetup on December 12th, the Microsoft IOTA meetup in Paris on the 14th, and "Microsoft Azure adds 5 new Blockchain partners" (May 2016). If you join the IOTA Slack channel you'll find that there are many other big companies in close touch with IOTA, like BMW and Tesla. This means that right now there are IOTA devs working directly with scientists at these companies to help them integrate IOTA into their developments, even though no direct partnership has been published. I'll talk more about the use cases soon.
We are excited to partner with IOTA foundation and proud to be associated with its new data marketplace initiative... - Omkar Naik
 

IOTA's use cases

Every cryptocurrency is capable of being a medium for exchanging goods: you pay for something using the coin's token and receive the product. Some are more popular or offer faster transactions or anonymity, while others offer better scalability or user-friendliness. But none of them (except IOTA) are capable of transacting information at no cost (fee-less transactions), in a secure form (MAM, Masked Authenticated Messaging), while ensuring the network will not be harmed as adoption grows (i.e., it scales). These characteristics open the gates for several real-world applications; you have probably heard of Big Data and how important data is nowadays.
Data sets grow rapidly - in part because they are increasingly gathered by cheap and numerous information-sensing Internet of things devices such as mobile devices, aerial (remote sensing), software logs, cameras, microphones, radio-frequency identification (RFID) readers and wireless sensor networks.
 
It’s just the beginning of the data period. Data is going to be so important for human life in the future. So we are now just starting. We are a big data company, but compared to tomorrow, we are nothing. - Jack Ma (Alibaba)
There are enormous quantities of wasted data (often over 99% is lost to the void) that could potentially contain extremely valuable information if allowed to flow freely in data streams, creating an open and decentralized data lake accessible to any compensating party. Some of the biggest corporations in the world are purely digital, like Google, Facebook, and Amazon. The data/information market will be huge in the future, and that's why so many companies are interested in what IOTA can offer.
There are several real-world use cases being developed at the moment, many of which, if successful, will revolutionize the world. You can check a list of some of them below.
Extra
These are just a few examples; there are many more ongoing and to explore.
 

IOTA Wallet (v2.5.4 below)

For those who have read a lot about IOTA and know how it works, the wallet is fine, but that's not the case for most users. Issues a user might face if they decide to use the current wallet:
These are problems that could easily be avoided with a better understanding of the network/wallet, or with a better wallet that could handle these issues. As I explained before, some problems during network "congestion" could be resolved simply by making things more user-friendly; this leads many users to store their IOTAs on exchanges, which is not safe either.
The upcoming (Dec 2017) UCL Wallet will solve most of these problems. It will switch between nodes automatically and auto-reattach transactions, for example (among other things). You can get a full overview of it here and here. Also, the upcoming Nelson CarrIOTA will help with automatic peer discovery so users can set up their nodes more easily.
 

IOTA Vulnerability issue

On Sept 7th, 2017, a team from MIT reported a cryptographic issue in the hash function Curl. You can see the full response from IOTA members below.
Funds were never in danger, as the scenarios depicted in Neha's blog post were not practically possible, and the arguments used in the blog post lacked foundation; you can check the whole history yourself in the responses. It was later discovered that Neha Narula's whole team was involved in other, competing cryptocurrency projects.
Currently IOTA uses the relatively hardware-intensive NIST standard SHA-3/Keccak for crucial operations, for maximal security. Curl is continuously being audited by more cryptographers and security experts. Recently the IOTA Foundation hired Cybercrypt, the world-leading lightweight cryptography and security company from Denmark, to take the Curl cryptography to its next maturation phase.
 
It took me a couple of days to gather the information presented here; I wanted to make it easier for people who want to get into IOTA. It probably has some mistakes, so please correct me if I said something wrong. Here are some useful links for the community.
This is my IOTA donation address, in case someone wants to donate; I would be very thankful. I truly believe in this project's potential.
I9YGQVMWDYZBLHGKMTLBTAFBIQHGLYGSAGLJEZIV9OKWZSHIYRDSDPQQLTIEQEUSYZWUGGFHGQJLVYKOBWAYPTTGCX
 
This is a donation address; if you want to do the same, you should pay attention to some important details:
  • Create a seed for only donation purposes.
  • Generate an address and publish it for everyone.
  • If you spend any IOTA, you must attach a new address to the tangle and update the donation address you previously published.
  • If someone sends IOTA to your previous donation address after you have spent from it, you will probably lose the funds sent to that specific address.
  • You can visualize how addresses work in IOTA here and here.
This happens because IOTA uses Winternitz one-time signatures to be quantum resistant. Every time you spend IOTA from an address, part of the private key of that specific address is revealed, making it easier for attackers to steal that address's balance. Attackers can search the tangle explorer to see whether an address has been reused and try to brute-force the private key, since they already know part of it.
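For the curious, here is a minimal Lamport-style one-time signature toy in Python. It is not IOTA's actual scheme (IOTA uses Winternitz signatures over ternary hashes), but it shows exactly why signing reveals private-key material and why a second signature from the same address is dangerous:

```python
import hashlib, os

def H(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def keygen():
    # Two secrets per digest bit: one revealed if the bit is 0, one if 1.
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def bits_of(msg: bytes):
    d = H(msg)
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, msg: bytes):
    # Signing REVEALS one secret preimage per bit -- half the private key.
    return [sk[i][b] for i, b in enumerate(bits_of(msg))]

def verify(pk, msg: bytes, sig):
    return all(H(s) == pk[i][b]
               for i, (s, b) in enumerate(zip(sig, bits_of(msg))))

sk, pk = keygen()
sig = sign(sk, b"spend 100i from address A")
assert verify(pk, b"spend 100i from address A", sig)
# A second message signed with the same key would reveal more preimages,
# letting an attacker mix and match revealed secrets to forge signatures:
# hence, never reuse an address after spending from it.
```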
submitted by mvictordbz to CryptoCurrency [link] [comments]

[META] New to PC Building? - September 2018 Edition

Intro

You've heard from all your gaming friends/family or co-workers that custom PCs are the way to go. Or maybe you've been fed up with your HP, Dell, Acer, Gateway, Lenovo, etc. pre-builts or Macs and want some more quality and value in your next PC purchase. Or maybe you haven't built a PC in a long time and want to get back into the game. Well, here's a good place to start.

Instructions

  1. Make a budget for your PC (e.g., $800, $1000, $1250, $1500, etc.).
  2. Decide what you will use your PC for.
    • For gaming, decide what games and at what resolution and FPS you want to play at.
    • For productivity, decide what software you'll need and find the recommended specs to use those apps.
    • For a bit of both, your PC build should be built on the HIGHEST specs recommended for your applications (e.g., if you only play FortNite and need CPU power for CFD simulations, use specs recommended for CFD).
    Here are some rough estimates for builds with entirely NEW parts:
    1080p 60FPS ultra-settings modern AAA gaming: ~$1,200
    1440p 60FPS high/ultra-settings modern AAA gaming: ~$1,600
    1080p 144FPS ultra-settings modern AAA gaming: $2,000
    4K 50FPS medium/high-settings modern AAA gaming: > $2,400
    It's noted that some compromises (e.g., lower settings and/or resolution) can be made to achieve the same or slightly lower gaming experience within ±15% of the above prices. It's also noted that you can still get higher FPS on older or used PCs by lowering settings and/or resolution AND/OR buying new/used parts to upgrade your system. Make a new topic about it if you're interested.
    Also note that AAA gaming is different from e-sport games like CSGO, DOTA2, FortNite, HOTS, LoL, Overwatch, R6S, etc. Those games have lower requirements and can make do with smaller budgets.
  3. Revise your budget AND/OR resolution and FPS until both are compatible. Compare this to the recommended requirements of the most demanding game on your list. For older games, you might be able to lower your budget. For others, you might have to increase your budget.
    It helps to watch gaming benchmarks on Youtube. A good example of what you're looking for is something like this (https://www.youtube.com/watch?v=9eLxSOoSdjY). Take note of the resolution, settings, FPS, and the specs in the video title/description; ask yourself if the better gaming experience is worth increasing your budget OR if you're okay with lower settings and lowering your budget. Note that you won't be able to see FPS higher than 60FPS for Youtube videos; something like this would have to be seen in-person at a computer shop.
  4. Make a build on https://ca.pcpartpicker.com/. If you still have no idea how to put together parts, start here (http://www.logicalincrements.com/) to get an understanding of PC part tiers. If you want more info about part explanations and brief buying tips, see the next section below.
  5. Click on the Reddit logo button next to Markup, copy and paste the generated text (in markup mode if using new Reddit), and share your build for review!
  6. Consider which retailer to buy your parts from. Here's a table comparing different retailers: https://docs.google.com/spreadsheets/d/1L8uijxuoJH4mjKCjwkJbCrKprCiU8CtM15mvOXxzV1s/edit?usp=sharing
  7. Buy your parts! Use PCPP above to send you e-mail alerts on price drops or subscribe to /bapcsalescanada for deals.
    You can get parts from the following PC retailers in alphabetical order:
  8. After procuring your parts, it's time to build. Use a good Youtube tutorial like this (https://www.youtube.com/watch?v=IhX0fOUYd8Q) that teaches BAPC fundamentals, but always refer to your product manuals or other Youtube tutorials for part-specific instructions like CPU mounting, radiator mounting, CMOS resetting, etc. If everything still seems overwhelming, you can always pay a computer shop or a friend/family member to build it for you.
    It might also be smart to look up some first-time building mistakes to avoid:
  9. Share your experience with us.
  10. If you have any other questions, use the search bar first. If it's not there, make a topic.

BAPC News (Last Updated - 2018/09/20)

CPU

https://www.tomshardware.com/news/intel-9000-series-cpu-faq,37743.html
Intel 9000-series CPUs (Coffee Lake Refresh) will be coming out in Q4. With the exception of the flagship i9 (8 cores, 16 threads), the i3, i5, and i7 lineups are almost identical to their Intel 8000 (Coffee Lake) counterparts, just clocked slightly faster. If you are wondering whether you should upgrade to the newer CPU on the same tier (e.g., i5-8400 to i5-9400), I don't recommend it, as you will only see marginal performance increases.

Mobo

https://www.anandtech.com/show/13135/more-details-on-intels-z390-chipset-exposed
Z370 boards will now be phased out in favor of Z390 boards, which natively support Intel 9000 CPUs (preferably the i5-9600K, i7-9700K, and i9-9900K).

GPU

https://www.youtube.com/watch?v=WDrpsv0QIR0
RTX 2080 and 2080 Ti benchmarks are out; they deliver roughly 10 and 20 more frames, respectively, than the 1080 Ti, and also feature ray tracing (superior lighting and shadow effects), which is featured in only ~30 games so far (i.e., not widely supported). Effectively, they provide about 25% more performance for about 70% higher cost. My recommendation is NOT to buy them unless you need one for work or have lots of disposable income. The GTX 1000 Pascal series is still relevant for today's gaming specs.

Part Explanations

CPU

The calculator part. More GHz is analogous to faster fingers crunching numbers on the calculator. More cores is analogous to having more calculators. More threads is analogous to having more filing clerks piling up work for the calculator to do. Microarchitecture (core design) is analogous to how the internal circuit of the calculator is designed (e.g., AMD FX-series CPUs are slower than their Intel equivalents even at higher OC'd GHz speeds because the core design is subpar). All three are important in determining CPU speed.
In general, higher GHz is more important for gaming now whereas # cores and threads are more important for multitasking like streaming, video editing, and advanced scientific/engineering computations. Core designs from both AMD and Intel in their most recent products are very good now, but something to keep in mind.

Overclocking

The basic concept of overclocking (OCing) is to run your CPU at a higher clock speed, feeding it more power through increased voltage, in the hope that it does calculations faster. Whether your parts are good overclockers depends on the manufacturing process of your specific part; slight variations in materials and manufacturing result in different overclocking capability ("silicon lottery"). The downsides are that you can void your warranties, because the excess heat produced will decrease the lifespan of your parts, and that finding stable OC settings is a trial-and-error process. Unstable OC settings result in computer freezes or random shut-offs from excess heat. OCing will give you extra performance, often for free, or by investing in a CPU cooler to control your temperatures so that the excess heat doesn't shorten your parts' lifespans as much. If you don't know how to OC, don't do it.

Current Products

Intel CPUs have higher GHz than AMD CPUs, which make them better for gaming purposes. However, AMD Ryzen CPUs have more cores and threads than their Intel equivalents. The new parts are AMD Ryzen 3, 5, or 7 2000 series or Intel i3, i5, or i7 8000 series (Coffee Lake). Everything else is outdated.
If you want to overclock on an AMD system, know that you can get a moderate OC on a B350/B450 board with all CPUs. X370/X470 mobos usually come with better VRMs meant for OCing the 2600X, 2700, and 2700X. If you don't know how to OC, know that the -X AMD CPUs can OC themselves automatically without manual settings. For Intel systems, you cannot OC unless the CPU is an unlocked -K chip (e.g., i3-8350K, i5-8600K, i7-8700K, etc.) AND the motherboard is a Z370 mobo. In general, it is not worth getting a Z370 mobo UNLESS you are getting an i5-8600K or i7-8700K.

CPU and Mobo Compatibility

Note about Ryzen 2000 CPUs on B350 mobos: yes, you CAN pair them up since they use the same socket. You might get an error message on PCPP that says that they might not be compatible. Call the retailer and ask if the mobo you're planning on buying has a "Ryzen 2000 Series Ready" sticker on the box. This SHOULD NOT be a problem with any mobos manufactured after February 2018.
Note about Intel 9000 CPUs on B360 / Z370 mobos: same as above with Ryzen 2000 CPUs on B350 or X370 boards.

CPU Cooler (Air / Liquid)

Air or liquid cooling for your CPU. This is mostly optional unless heavy OCing on AMD Ryzen CPUs and/or on Intel -K and i7-8700 CPUs.
For more information about air and liquid cooling comparisons, see here:

Motherboard/mobo

Part that lets all the parts talk to each other. Comes in different sizes from small to big: mITX, mATX, ATX, and eATX. For most people, mATX is cost-effective and does the job perfectly. If you need more features like extra USB slots, go for an ATX. mITX is for those who want a really small form factor and are willing to pay a premium for it. eATX mobos are like ATX mobos except that they have more features and are bigger - meant for super PC enthusiasts who need the features.
If you are NOT OCing, pick whatever is cheap and meets your specs. I recommend ASUS or MSI because they have RMA centres in Canada in case the board breaks, whereas other brands handle RMAs outside of Canada, such as in the US. If you are OCing, then you need to look at the quality of the VRMs, because those will greatly influence the stability and lifespan of your parts.

Memory/RAM

Part that keeps Windows and your software active. Currently runs on the DDR4 platform for new builds. Go for dual channel whenever possible. Here's a breakdown of how much RAM you need:
AMD Ryzen CPUs get extra FPS for faster RAM speeds (ideally 3200MHz) in gaming when paired with powerful video cards like the GTX 1070. Intel Coffee Lake CPUs use up a max of 2667MHz for B360 mobos. Higher end Z370 mobos can support 4000 - 4333MHz RAM depending on the mobo, so make sure you shop carefully!
It's noted that RAM prices are highly inflated because of the smartphone industry and possibly artificial supply shortages. For more information: https://www.extremetech.com/computing/263031-ram-prices-roof-stuck-way

Storage

Part that store your files in the form of SSDs and HDDs.

Solid State Drives (SSDs)

SSDs are incredibly quick, but are expensive per TB; they are good for booting up Windows and for reducing loading times for gaming. For an old OEM pre-built, upgrading the PC with an SSD is the single greatest speed booster you can do to your system. For most people, you want to make sure the SSD you get is NOT DRAM-less as these SSDs do not last as long as their DRAM counterparts (https://www.youtube.com/watch?v=ybIXsrLCgdM). It is also noted that the bigger the capacity of the SSD, the faster they are. SSDs come in four forms:
The 2.5" SATA form is cheaper, but it is the old format with speeds up to 550MB/s. M.2 SATA SSDs have the same transfer speeds as 2.5" SATA SSDs since they use the SATA interface, but connect directly to the mobo without a cable. It's better for cable management to get an M.2 SATA SSD over a 2.5" SATA III SSD. M.2 PCI-e SSDs are the newest SSD format and transfer up to 4GB/s depending on the PCI-e lanes they use (e.g., 1x, 2x, 4x, etc.). They're great for moving large files (e.g., 4K video production). For more info about U.2 drives, see this post (https://www.reddit.com/bapccanada/comments/8jxfqs/meta_new_to_pc_building_may_2018_edition/dzqj5ks/). Currently more common for enterprise builds, but could see some usage in consumer builds.

Hard Disk Drives (HDDs)

HDDs are slow with transfer speeds of ~100MB/s, but are cheap per TB compared to SSDs. We are now at SATA III speeds, which have a max theoretical transfer rate of 600MB/s. They also come in 5400RPM and 7200RPM forms. 5400RPM uses slightly less power and are cheaper, but aren't as fast at dealing with a large number of small files as 7200RPM HDDs. When dealing with a small number of large files, they have roughly equivalent performance. It is noted that even a 10,000RPM HDD will still be slower than an average 2.5" SATA III SSD.

Others

SSHDs are hybrids of SSDs and HDDs. Although they seem like a good combination, it's much better in all cases to get a dedicated SSD and a dedicated HDD instead. This is because the $/speed is better for SSDs and the $/TB is better for HDDs. The same can be said for Intel Optane. They both have their uses, but for most users they aren't worth it.

Overall

I recommend a 2.5" or M.2 SATA ≥ 250GB DRAM SSD and a 1TB or 2TB 7200RPM HDD configuration for most users for a balance of speed and storage capacity.

Video Card/GPU

Part that runs complex calculations in games and outputs to your monitor and is usually the most expensive part of the budget. The GPU you pick is dictated by the gaming resolution and FPS you want to play at.
In general, all video cards of the same product name have almost the same non-OC'd performance (e.g., Asus Dual-GTX1060-06G has the same performance as the EVGA 06G-P4-6163-KR SC GAMING). The different sizes and # fans DO affect GPU OCing capability, however. The most important thing here is to get an open-air video card, NOT a blower video card (https://www.youtube.com/watch?v=0domMRFG1Rw). The blower card is meant for upgrading pre-builts where case airflow is limited.
For cost-performance, go for NVIDIA GTX cards, because the cryptomining industry has inflated the prices of AMD RX cards. Bitcoin has recently taken a ~20% hit from January's $10,000+, but the cryptomining industry is still going. Luckily, this means prices have nearly corrected back to the original 2016 MSRPs.
In general:
Note that if your monitor has FreeSync technology, get an AMD card. If your monitor has G-Sync, get a NVIDIA card. Both technologies allow for smooth FPS gameplay. If you don't have either, it doesn't really matter which brand you get.
For AMD RX cards, visit https://www.pcworld.com/article/3197885/components-graphics/every-amd-radeon-rx-graphics-card-you-can-buy-for-pc-gaming.html

New NVIDIA GeForce RTX Series

The new NVIDIA RTX 2000 series has been announced and will hit stores in Q3 and Q4. Until the products have been fully vetted and reviewed, we cannot recommend them, as I cannot say whether they are worth what NVIDIA is marketing them at. But they will be faster than their previous equivalents and will require more wattage. The 2070, 2080, and 2080 Ti will feature ray tracing, a new technique seen in modern CG movies that greatly enhances lighting and shadow effects. At this time, fewer than 30 games will use ray tracing (https://www.pcgamer.com/21-games-will-support-nvidias-real-time-ray-tracing-here-are-demos-of-tomb-raider-and-control/). It's also noted that the 2080 Ti is the Titan XP equivalent, which is why it's so expensive (https://www.youtube.com/watch?v=Irs8jyEmmPQ). The community's general recommendation is NOT to pre-order until we see reviews and benchmarks from reviewers first.
It looks like a couple of benchmarks are out. Keeping other parts equal, the following results were obtained (https://videocardz.com/77983/nvidia-geforce-rtx-2080-ti-and-rtx-2080-official-performance-unveiled): the 2080 and 2080 Ti are better than last generation's 1080 Ti by roughly 10 and 20 frames respectively.

Case

Part that houses your parts and protects them from its environment. Should often be the last part you choose because the selection is big enough to be compatible with any build you choose as long as the case is equal to or bigger than the mobo form factor.
Things to consider: aesthetics, case airflow, cable management, material, cooling options (radiators or # of fan spaces), # fans included, # drive bays, toolless installation, power supply shroud, GPU clearance length, window if applicable (e.g., acrylic, tempered glass), etc.
It is recommended to watch or read case reviews on Youtube to get an idea of a case's performance in your setup.

Power Supply/PSU

Part that runs your PC from the wall socket. Never go with a non-reputable/cheap brand for this part, as low-quality units can damage your other parts. Generally recommended PSU brands are Corsair, EVGA, Seasonic, and Thermaltake. For a tier list, see here (https://linustechtips.com/main/topic/631048-psu-tier-list-updated/).

Wattage

Wattage depends on the video card chosen, if you plan to OC, and/or if you plan to upgrade to a more powerful PSU in the future. Here's a rule of thumb for non-OC wattages that meet NVIDIA's recommendations:
There are also PSU wattage calculators you can use to estimate your wattage. How much wattage you use depends on your PC parts, how much OCing you're doing, your peripherals (e.g., gaming mouse and keyboard), how long you plan to leave your computer running, etc. Note that these calculators use conservative estimates, so treat the outputted wattage as a baseline of how much you need. Here are the calculators (thanks, VitaminDeity).
Pick ONE calculator to use, and use its recommended wattage, NOT its recommended product, as a baseline for what wattage your build needs. Note that Cooler Master and Seasonic use the exact same calculator as Outervision. For more details about wattage, here are some reference videos:
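As an aside, here is a very rough sketch of what such calculators do under the hood; the TDP figures below are ballpark assumptions, not vendor specs, so treat the output as a starting point only:

```python
# Crude non-OC wattage estimate: sum part draws, then add headroom.
parts_watts = {
    "cpu": 95,             # e.g., a mid-range 6-core chip
    "gpu": 180,            # e.g., a GTX 1070-class card
    "mobo_ram_drives": 50, # motherboard, RAM, and drives combined
    "fans_and_misc": 25,   # fans, peripherals, misc.
}
load = sum(parts_watts.values())   # ~350 W at full tilt
recommended = load * 1.3           # ~30% headroom for spikes and aging
print(f"Load ~{load} W -> shop for a ~{recommended:.0f} W PSU")
```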

Modularity

You might also see some info about modularity (non-modular, semi-modular, or fully modular). This describes whether the cables come permanently attached to the PSU or can be detached at your choosing. Non-modular PSUs have ALL of the cables attached to the PSU, with no option to remove unneeded ones. Semi-modular PSUs have separate cables for HDDs/SSDs and PCI-e connectors, but the CPU and mobo cables are attached. Fully modular PSUs have all of their cables separate, giving you full control over cable management. Note that with decent cooling and airflow in your case, cable management has little effect on your temperatures (https://www.youtube.com/watch?v=YDCMMf-_ASE).

80+ Efficiency Ratings

As for ratings (80+, 80+ Bronze, 80+ Gold, 80+ Platinum), these denote the efficiency of your PSU. Please see here for more information. Looking purely at electricity costs, an 80+ Gold PSU will be more expensive overall than an 80+ Bronze PSU for the average Canadian user until a break-even point of about 6 years (assuming 8 hours/day usage), but the better performance, longer warranty periods, durable build quality, and extra features like fanless cooling are often worth the extra premium. In general, the rule of thumb is 80+ Bronze for entry-level office PCs and 80+ Gold for mid-tier or higher gaming/workstation builds. If the price difference between an 80+ Bronze PSU and an 80+ Gold PSU is < 20%, get the 80+ Gold PSU!
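A quick sketch of the Bronze-vs-Gold running-cost math; the 300 W load, 8 h/day usage, $0.12/kWh rate, and 85%/90% efficiencies are assumed figures, so plug in your own:

```python
LOAD_W = 300              # assumed average draw of the PC itself
HOURS_PER_YEAR = 8 * 365  # 8 hours/day usage
RATE = 0.12               # assumed $/kWh

def yearly_cost(efficiency: float) -> float:
    wall_watts = LOAD_W / efficiency   # power actually pulled from the wall
    return wall_watts / 1000 * HOURS_PER_YEAR * RATE

saving = yearly_cost(0.85) - yearly_cost(0.90)   # typical Bronze vs. Gold
print(f"Gold saves ~${saving:.2f}/year")         # ~$6.87 with these inputs
```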

Warranties

Warranties should also be considered when shopping for PSUs. In general, a longer warranty indicates better PSU build quality. In general, for 80+ Bronze and Gold PSU units from reputable brands:
Any discrepancies are based on varied wattages (i.e., higher wattages have longer warranties) or updated warranty periods. Please refer to the specific product's warranty page for the correct information. For EVGA PSUs, see here (https://www.evga.com/support/warranty/power-supplies/). For Seasonic PSUs, see here (https://seasonic.com/support#period). For Corsair PSUs, see here (https://www.corsair.com/ca/en/warranty).
For all other PSU inquiries, look up the following review sites for the PSUs you're interested in buying:
These guys are engineering experts who take apart PSUs, analyze the quality of each product, and provide an evaluation. Another great website is http://www.orionpsudb.com/, which shows which PSUs are manufactured by which OEMs.

Operating System (OS)

Windows 10

The most common OS. You can download the ISO here (https://www.microsoft.com/en-ca/software-download/windows10). For instructions on how to install the ISO from a USB drive, see here (https://docs.microsoft.com/en-us/windows-hardware/manufacture/desktop/install-windows-from-a-usb-flash-drive) or watch a video here (https://www.youtube.com/watch?v=gLfnuE1unS8). For most users, go with the 64-bit version.
If you purchase a Windows 10 retail key (i.e., you buy it from a retailer or from Microsoft directly), keep in mind that you are able to transfer it between builds. So if you're building another PC for the 2nd, 3rd, etc. time, you can reuse the key for those builds PROVIDED that you deactivate your key before installing it on your new PC. These keys are ~$120.
However, if you have an OEM key (e.g., pre-builts), that key is tied specifically to your mobo. If you ever decide to upgrade your mobo on that pre-built PC, you might have to buy a new Windows 10 license. For more information, see this post (https://www.techadvisor.co.uk/feature/windows/windows-10-oem-or-retail-3665849/). The cheaper Windows 10 keys you can find on Kinguin are OEM keys; activating and deactivating these keys may require phoning an automated Microsoft activation line. Most of these keys are legitimate and cost ~$35, although Microsoft does not intend for home users to obtain this version of it. Buyer beware.
The last type of key is a volume licensing key. These are licensed in large volumes for corporate or commercial usage. You can find lots of these keys on eBay for ~$10, but if the IT department that manages these keys audits who is using them, or if the number of activations exceeds the number allotted to that one key, Microsoft could block the key and invalidate your license. Buyer beware.
For more information on differentiating between all three types of keys, see this page (https://www.tenforums.com/tutorials/49586-determine-if-windows-license-type-oem-retail-volume.html).
If money is tight, you can get Windows 10 from Microsoft and use a trial version of it indefinitely. However, there will be a watermark in the bottom-right of your screen until you activate your Windows key.

MacOS

If you're interested in using MacOS, look into Hackintosh builds. This will allow you to run MacOS to run on PC parts, saving you lots of money. These builds are pretty picky about part compatibility, so you might run into some headaches trying to go through with this. For more information, see the following links:

Linux

If you're interested in a free open-source OS, see the following links:
For more information, go to /linux, /linuxquestions, and /linux4noobs.

Peripherals

Monitors

Keyboards and Mice

Overall

Please note that the cost-performance builds will change daily because PC part prices change often! Some builds will have excellent cost-performance one day and then have terrible cost-performance the next. If you want to optimize cost-performance, it is your responsibility to do this if you go down this route!
Also, DO NOT PM me with PC build requests! It is in your best interests to make your own topic so you can get multiple suggestions and input from the community rather than just my own. Thanks again.

Sample Builds

Here are some sample builds that are reliable, but may not be cost-optimized builds. These builds were created on September 9, 2018; feel free to "edit this part list" and create your own builds.

Links

Helpful links to common problems below:

Contributors

Thanks to:

Housekeeping

2019/09/22
2019/09/18
Updates:
2019/09/09
Updates:
Sorry for the lack of updates. I recently got a new job where I work 12 hours/day for 7 days at a time out of the city. What little spare time I have is spent on grad school and the gym instead of gaming. So I've been pretty behind on the news, and some entries might not be as up-to-date as my standards would demand with fewer commitments. If I've made any mistakes, please understand it might take a while for me to correct them. Thank you!
submitted by BlackRiot to bapccanada [link] [comments]

The Great NiceHash Profit Explanation - for Sellers (the guys with the GPUs & CPUs)

Let's make a couple of things crystal clear about what you are not doing here:
But hey, I'm running MINING software!
What the hell am I doing then?!?
Who makes Profit, and how?
How is it possible everyone is making a profit?
Why do profits skyrocket, and will it last (and will this happen again)?
But my profits are decreasing all the time >:[
But why?!? I’m supposed to make lotsa money out of this!!!
But WHY!!!
  1. Interest hype -> Influx of fiat money -> Coin quotes skyrocket -> Influx of miners -> Difficulty skyrockets -> Most of the price uptrend is choked off within weeks, since it's now harder to mine new blocks.
  2. Interest hype drains away -> Fiat money influx declines -> Coin quotes stall or even fall -> Miners still hold on to their dream -> Difficulty stays high, even rises -> Earnings decrease, maybe even sharply, as it's still harder to mine new blocks, which may even be paid less.
So, how to judge what’s going on with my profits?
Simple breakdown of the relationship of BTC payouts by NiceHash, BTC/ALT Coins rates, and Fiat value:
BTC quote | ALTs quotes | BTC payout | Fiat value
----------|-------------|------------|-----------
UP        | UP          | stable*)   | UP
stable    | UP          | UP         | UP
UP        | stable      | DOWN       | stable*)
stable    | stable      | stable     | stable
DOWN      | stable      | UP         | stable*)
stable    | DOWN        | DOWN       | DOWN
DOWN      | DOWN        | stable*)   | DOWN
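The logic behind that table can be sketched in a few lines: buyers use your hashpower to mine ALTs but pay you in BTC, so your BTC payout roughly tracks ALT revenue divided by the BTC price, while the fiat value of the payout tracks ALT revenue itself. This is a simplification that ignores fees and market lag:

```python
def payout(alt_revenue_fiat: float, btc_price: float):
    btc_payout = alt_revenue_fiat / btc_price
    fiat_value = btc_payout * btc_price    # = alt_revenue_fiat
    return btc_payout, fiat_value

# BTC price doubles while ALT revenue stays flat: BTC payout halves,
# fiat value stays flat -- the "UP | stable | DOWN | stable" row.
print(payout(100.0, 10_000.0))   # (0.01, 100.0)
print(payout(100.0, 20_000.0))   # (0.005, 100.0)
```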
Some rather obvious remarks:
More help:
Disclaimer: I'm a user - Seller like you - not in any way associated with NiceHash; this is my personal view & conclusion about some more or less obvious basics in Crypto mining and particularly using NiceHash.
Comments & critics welcome...
submitted by t_3 to NiceHash [link] [comments]

AN INTRODUCTION TO DIGIBYTE

DigiByte

What are cryptocurrencies?
Cryptocurrencies are peer-to-peer technology protocols which rely on the blockchain: a system of decentralized record-keeping which allows people to exchange unmodifiable and indestructible information, "coins," globally in little to no time with little to no fees. This translates into the exchange of value, as these coins cannot be counterfeited or stolen. This concept was started by Satoshi Nakamoto (allegedly a pseudonym for a single man or an organization), who described and coded Bitcoin in 2009.
What is DigiByte?
DigiByte (DGB) is a cryptocurrency like Bitcoin. It is also a decentralized applications protocol in a similar fashion to Neo or Ethereum.
DigiByte was founded and created by Jared Tate in 2014. DigiByte allows for fast (virtually instant) and low-cost (virtually free) transactions. DigiByte is hard-capped at 21 billion coins, which will be mined over a period of 21 years. DigiByte was never an ICO and was mined/created in the same way that Bitcoin or Litecoin initially were.
DigiByte is the fastest UTXO PoW scalable block-chain in the world. We’ll cover what this really means down below.
DigiByte has put forth and applied solutions to many of the problems that have plagued Bitcoin and cryptocurrencies in general, those being: centralisation of mining, high fees, slow transaction times, security weaknesses, and poor on-chain scalability.
We will address these point by point in the subsequent sections.
The DigiByte Protocol
DigiByte maintains these properties through use of various technological innovations which we will briefly address below.
Why so many coins? 21 Billion
When initially conceived, Bitcoin was the first of its kind and came into the hands of a few. The beginnings of a coin such as Bitcoin were difficult; it had to go through initial growing pains which the coins that followed did not have to face. It is for this reason, among others, that I believe Bitcoin was capped at 21 million, and why today it has secured a place as digital gold.
When Bitcoin was first invented no one knew anything about cryptocurrencies; for the inventor to get them out to the public, he would have to give them away. This is probably how the first Bitcoins were passed on – for free! But then as interest grew, so did the community. For them to build something which could go on to have actual value, it would have to go through a steady growth phase, so controlling inflation through mining was extremely important. That is also why the cap for Bitcoin was probably set so low: to allow these coins to amass value without being destroyed by inflation (from mining) the way fiat is today. In my mind Satoshi Nakamoto knew what he was doing when setting it at 21 million BTC, and he must have anticipated that others would take his design and build on top of it.
At DigiByte, we are that better design and capped at 21 billion. That's 1000 times larger than the supply of Bitcoin. Why though? Why is the cap on DigiByte so much higher than that of Bitcoin? Because DigiByte was conceived to be used not as a digital gold, nor as any sort of commodity, but as a real currency!
Today on planet Earth we are approximately 7.6 billion people. If every person wanted or needed to use and live off Bitcoin, then, equally split, each person could own at most 0.00276315789 BTC. The total value of all the money on the whole planet is estimated to have recently passed 80 trillion dollars. At that valuation, each whole unit of Bitcoin would be worth approximately $3,809,523.81!
$3,809,523.81
This is of course in an extreme case where everyone used Bitcoin for everything. But even in a more conservative scenario the fact remains that with such a low supply each unit of a Bitcoin would become absurdly expensive if not inaccessible to most. Imagine trying to buy anything under a dollar!
Not only would using Bitcoin as an everyday currency be a logistical nightmare, it would be nigh impossible, for each satoshi of a Bitcoin would be worth much, much more than is realistically manageable.
This is where DigiByte comes in and where it shines. DigiByte aims to be used world-wide as an international currency! Not to be hoarded in the same way Bitcoin is. If we were to do some of the same calculations with DigiByte we'd find that the numbers are a lot more reasonable.
At 7.6 billion people, each person could own 2.76315789474 DGB. Each whole unit of DGB would be worth approximately $3,809.52.
$3,809.52
This is much more manageable – and remember, that is in an extreme case where everyone uses DigiByte for everything! I don't expect this to happen any time soon, but the supply of DigiByte would allow us to live and transact in a much more realistic and fluid fashion, without having to divide large numbers on our phone's calculator to understand how much we owe for that cup of coffee. With DigiByte it's simple: coffee costs 1.5 DGB, the cinema 2.8 DGB, a plane ticket 500 DGB!
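A quick check of the arithmetic above, using the same supply and world-money figures quoted in this post:

# Reproducing the per-capita and per-coin figures quoted above.
WORLD_POPULATION = 7.6e9
WORLD_MONEY_USD = 80e12  # "recently passed 80 trillion dollars"

for name, supply in [("BTC", 21e6), ("DGB", 21e9)]:
    per_person = supply / WORLD_POPULATION
    usd_per_coin = WORLD_MONEY_USD / supply
    print(f"{name}: {per_person:.11f} per person, ${usd_per_coin:,.2f} per coin")

# BTC: 0.00276315789 per person, $3,809,523.81 per coin
# DGB: 2.76315789474 per person, $3,809.52 per coin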
There is a reason for DigiByte's large supply, and it is a good one!
Decentralisation
Decentralisation is an important concept for the block-chain and cryptocurrencies in general. This allows for a system which cannot be controlled nor manipulated no matter how large the organization in play or their intentions. DigiByte’s chain remains out of the reach of even the most powerful government. This allows for people to transact freely and openly without fear of censorship.
Decentralisation on the DigiByte block-chain is assured by having an accessible and fair mining protocol in place – this is the multi-algorithm (MultiAlgo) approach. We believe that all should have access to DigiByte, whether through purchase or by mining. Therefore, DigiByte is minable not only on dedicated mining hardware such as Antminers, but also with conventional graphics cards. The multi-algorithm approach allows users to mine on a variety of hardware types through one of the five mining algorithms supported by DigiByte, those being: SHA256, Scrypt, Skein, Qubit, and Groestl.
Please note that these mining algorithms are modified and updated from time to time to assure complete decentralisation and thus ultimate security.
The problem with using only one mining algorithm, as Bitcoin or Litecoin do, is that it allows people to continually amass mining hardware and hash power: the more hash power one has, the more one can acquire. This leads to a cycle of centralisation and the creation of mining centres. It is known that a massive portion of all hash power in Bitcoin comes from China. This kind of centralisation is a natural tendency, as it is cheaper for large organisations to set up in countries with inexpensive electricity and other advantages unavailable to the average miner.
DigiByte mitigates this problem with the use of multiple algorithms. It allows for miners with many different kinds of hardware to mine the same coin on an even playing field. Mining difficulty is set relative to the mining algorithm used. This allows for those with dedicated mining rigs to mine alongside those with more modest machines – and all secure the DigiByte chain while maintaining decentralisation.
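A toy sketch of the per-algorithm retargeting idea (illustrative only, not DigiByte's actual MultiShield implementation):

# Toy model: each algorithm keeps an independent difficulty, nudged so
# that no algorithm wins more than its ~20% share of blocks.
TARGET_SHARE = 1 / 5

def retarget(difficulty, blocks_won, window_blocks):
    observed_share = blocks_won / window_blocks
    # Winning more than your share raises your difficulty, pushing
    # marginal hashpower toward the other four algorithms.
    return difficulty * (observed_share / TARGET_SHARE)

print(retarget(1000, 40, 100))  # won 40% of blocks -> difficulty 2000.0
print(retarget(1000, 10, 100))  # won 10% of blocks -> difficulty 500.0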
Low Fees
Low fees are maintained in DigiByte thanks to the MultiAlgo approach working in conjunction with MultiShield (originally known as DigiShield). MultiShield calls for block difficulty readjustment between every single block on the chain; currently blocks last 15 seconds. This continuous difficulty readjustment allows us to combat any bad actors which may wish to manipulate the DigiByte chain.
Manipulation may be done by a large pool or a single entity with a great amount of hash power mining blocks on the chain, thus increasing the difficulty of the chain. In some coins such as Bitcoin or Litecoin, difficulty is readjusted every 2016 blocks, at approximately 10-minute and 2.5-minute block times respectively – meaning Bitcoin's difficulty is readjusted about every two weeks. This system allows large bad actors to mine a coin and then abandon it, leaving the difficulty far too high for the remaining hash rate; transactions can be frozen and the chain stalled until there is a difficulty readjustment or enough hash power returns to mine the chain. In such a case users may be faced with a choice: pay exorbitant fees or have their transactions frozen. In an extreme case the whole chain could be frozen completely for extended periods of time.
DigiByte does not face this problem as its difficulty is readjusted per block every 15 seconds. This innovation was a technological breakthrough and was adopted by several other coins in the cryptocurrency environment such as Dogecoin, Z-Cash, Ubiq, Monacoin, and Bitcoin Gold.
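The retarget windows compare roughly like this (block times as cited above):

# Approximate time between difficulty adjustments.
chains = {
    "Bitcoin":  (2016, 600),  # 2016 blocks at ~10 min each
    "Litecoin": (2016, 150),  # 2016 blocks at ~2.5 min each
    "DigiByte": (1, 15),      # MultiShield retargets every 15 s block
}
for name, (blocks, secs) in chains.items():
    print(f"{name}: retarget every {blocks * secs / 86400:.2f} days")
# Bitcoin: 14.00 days, Litecoin: 3.50 days, DigiByte: every 15 seconds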
This difficulty readjustment, along with the MultiAlgo approach, allows DigiByte to maintain the lowest fees of any UTXO PoW chain in the world. Currently fees on the DigiByte block-chain are about 0.0001 DGB on a transaction of 100 000 DGB sent (the fee depends on the amount sent). With 100 000 DGB currently worth around $2000.00, that fee comes to roughly $0.000002, or 0.0002 cents; it would take about 5 000 such transactions to equal one penny's worth. This was tested on a Ledger Nano S set to the low-fee setting.
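Plugging in the numbers quoted above as a sanity check:

# Fee arithmetic from the paragraph above.
fee_dgb = 0.0001            # fee on a 100,000 DGB transaction
dgb_price = 2000 / 100_000  # $2,000 per 100,000 DGB -> $0.02 per DGB

fee_usd = fee_dgb * dgb_price
print(f"fee: ${fee_usd:.6f} (~{fee_usd * 100:.4f} cents)")  # $0.000002
print(f"transactions per penny: {0.01 / fee_usd:,.0f}")     # 5,000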
Fast transaction times
Fast transactions are ensured by the conjunctive use of the two aforementioned protocols. MultiShield and MultiAlgo keep mining the DigiByte chain consistently profitable, so there is always someone mining your transactions. MultiAlgo allows there to be a greater amount of hash power spread world-wide; this, along with 15-second block times, allows transactions to be near instantaneous. This speed is also ensured by the use of DigiSpeed, the protocol by which the DigiByte chain will decrease block timing gradually. DigiByte started with 30-second block times in 2014; today blocks are set at 15 seconds. This decrease allows for ever faster and ever more transactions per block.
Robust security + The Immutable Ledger
At the core of cryptocurrency security is decentralisation. As stated before, decentralisation is ensured on the DigiByte block-chain by the MultiAlgo approach: each algorithm is only allowed about 20% of all new blocks. This, in conjunction with MultiShield, allows DigiByte to be the most secure, most reliable, and fastest UTXO block-chain on the planet. DigiByte is a proof-of-work (PoW) block-chain where all transactional activity is stored on the immutable public ledger world-wide. In DigiByte there is no need for the Lightning protocol (although we have it) nor for side-chains to scale, and thus we keep PoW's security.
There are many great debates as to the robustness or cleanliness of PoW. The fact remains that PoW block-chains remain the only systems in human history which have never been hacked and thus their security is maximal.
For an attacker to divert the DigiByte chain they would need to control over 93% of all the hashrate on one algorithm and 51% of the other four. And so DigiByte is immune to the infamous 51% attack to which Bitcoin and Litecoin are vulnerable.
Moreover, the DigiByte block-chain is currently spread across more than 200 000 servers, computers, phones, and other machines world-wide. DigiByte is in fact one of the easiest coins to mine, greatly aided by the recent release of the one-click miner. This allows for ever greater decentralisation, which in turn assures that there is no single point of failure and the chain is thus virtually un-attackable.
On Chain Scalability
The biggest barrier for block-chains today is scalability. Visa, the credit card company, can handle around 2 000 transactions per second (TPS) today, which lets it ensure customer security and transaction throughput nation-wide. Bitcoin currently sits at around 7 TPS and Litecoin at 28 TPS (56 TPS with SegWit). All the technological innovations mentioned above come together to make DigiByte the fastest and most scalable PoW block-chain in the world.
DigiByte is scalable because of DigiSpeed, the protocol through which block times are decreased and block sizes are increased. It is known that a simple increase in block size can increase the TPS of any block-chain, as is the case with Bitcoin Cash. This, however, is not scalable on its own: a block-size increase alone eventually leads to some, if not a great deal of, centralisation. This centralisation occurs because larger blocks mean higher storage and thus hardware costs for miners. That increase, along with full blocks – meaning many transactions occurring on the chain – would inevitably bar the average miner as difficulty increases and mining centres consolidate.
Hardware and storage costs decrease over time following Moore's law, and DigiByte's schedule adheres to it: DigiSpeed calls for block sizes to increase and block timing to decrease by a factor of two every two years. DigiByte's blocks started at 1 MB and 30 seconds at inception in 2014; in 2016 the block size was doubled and the block timing halved, right on schedule. Moore's law observes that, in general, hardware roughly doubles in power for the same cost about every two years.
This steady schedule would allow DigiByte to scale at a predictable rate and let people adopt new hardware at an equally steady rate and reasonable expense. Thus the average miner can continue to mine DigiByte on his algorithm of choice with entry-level hardware.
DigiByte was one of the first block-chains to adopt Segregated Witness (SegWit) in 2017, a protocol whereby part of the transactional data (the signatures) is moved out of the base transaction to decrease transaction weight and thus increase scalability and speed. This allows more transactions to fit per block without increasing the block size!
DigiByte currently sits at 560 TPS and could scale to over 280 000 TPS by 2035. This dwarfs the TPS capacity – even the projected capacity – of most other coins and even private companies. In essence, DigiByte could scale world-wide today and still be reliable and robust. DigiByte could even handle the cumulative transactions of all the top 50 coins on coinmarketcap.com and still run smoothly and below capacity. In fact, to max out DigiByte's current capacity of 560 TPS, you would have to take all those transactions and multiply them by a factor of 10!
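For what it's worth, the 2035 figure is consistent with throughput doubling every two years under DigiSpeed. A rough projection (the 2017 baseline year is my assumption, chosen because it reproduces the quoted number):

# Rough DigiSpeed projection: capacity doubles every two years.
def projected_tps(year, base_tps=560, base_year=2017):
    doublings = (year - base_year) // 2
    return base_tps * 2 ** doublings

print(projected_tps(2035))  # 286,720 TPS -- "over 280,000" as quoted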
Other Uses for DigiByte
Note that DigiByte is not only to be used as a currency. Its immense robustness, security, and scalability make it ideal for building decentralised applications (DAPPS), which it can host. DigiByte can in fact host DAPPS and even centralised applications which rely on the chain, known as Digi-Apps. This application layer is also accompanied by a smart-contract layer.
Thus, DigiByte could host several Crypto Kitties-scale games and more without freezing up or raising transaction costs for the end user.
Currently there are various DAPPS being built on the DigiByte block-chain, these are done independently of the DigiByte core team. These companies are simply using the DigiByte block-chain as a utility much in the same way one uses a road to get to work. One such example is Loly – a Tinderesque consensual dating application.
DigiByte also hosts a variety of other platform projects such as the following:
The DigiByte Foundation
As previously mentioned, DigiByte was not an ICO. The DigiByte Foundation was established in 2017 by founder Jared Tate. It is a non-profit organization dedicated to supporting and developing the DigiByte block-chain.
DigiByte is a community effort and a community coin, to be treated as a public resource like water or air. Know that anyone can work on DigiByte; anyone can create and do as they wish. It is a permissionless system which encourages innovation and creation. If you have an idea or would like help on your project, do not hesitate to contact the DigiByte Foundation, either through the official website or the telegram developers' channel.
For this reason it is ever more important to note that the DigiByte Foundation cannot exist without public support, and so I encourage everyone to donate to the foundation. All funds are used for the maintenance of DigiByte servers, marketing, and DigiByte development.
DigiByte Resources and Websites
DigiByte
Wallets
Explorers
Please refer to the sidebar of this sub-reddit for more resources and information.
Edit - Removed Jaxx wallet.
Edit - A new section was added to the article: Why so many coins? 21 Billion
Edit - Adjusted max capacity of DGB's TPS - Note it's actually larger than I initially calculated.
Edit – Grammar and format readjustment
Hello,
I hope you've enjoyed my article. I originally wrote this for the reddit sub-wiki, where it generally will (most likely, probably) not get a lot of attention. So instead I've decided to make this a sort of introductory post, an open letter, for any newcomers to DGB or those who are just curious.
I tried to cover every aspect of DGB, but of course I may have forgotten something! Please leave a comment down below and tell me: why are you in DGB? What convinced you? For me, it was the decentralised PoW that really convinced me – plus the transaction speed and virtually non-existent fees. Made my mouth water!
-Dereck de Mézquita
I'm a student typing this stuff in my free time. Help me pay my debts? Thank you!
D64fAFQvJMhrBUNYpqUKQjqKrMLu76j24g
https://digiexplorer.info/address/D64fAFQvJMhrBUNYpqUKQjqKrMLu76j24g
submitted by xeno_biologist to Digibyte


The primary ongoing cost of bitcoin production is electricity, measured in dollars per kilowatt-hour (kWh). Different regions of the world consume electricity at their local rates (which may vary by customer type, power generation source, and time of day) and in their local currencies, but for the sake of convenience it is a good working assumption to price everything in dollars per kWh. Note that hashing cost scales linearly with hashrate and time: if the cost per unit of hashrate for 10 minutes is 5, then 1 × 5 = 600 × 5/600, i.e. one unit of hashrate for ten minutes costs the same as 600 units for one second. Bitcoin miners all compete for the 3,600 bitcoins issued per day (at the current difficulty level), so Bitcoin's aggregate mining cost is roughly the value of the coins mined. It could be a little more than the value of the mined coins (when miners mine at a loss, taking into consideration the capital cost of the mining hardware) or a little less. Some figures on the network's energy footprint:
Electricity consumed per transaction: 291.00 kWh
Number of US households that could be powered by the energy usage of the entire Bitcoin network: 2,671,832
Number of US households powered for 1 day by the electricity used for a single Bitcoin transaction: 9.82
Bitcoin's electricity consumption as a percentage of the world's electricity consumption: 0.13%
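These figures plug straight into the kind of profitability calculation that mining calculators such as CoinWarz run. A minimal sketch (the hardware and price inputs are placeholders, not current values):

# Minimal mining profitability sketch; all inputs are illustrative.
def daily_profit_usd(hashrate_ths, network_ths, coins_per_day,
                     coin_price_usd, power_kw, usd_per_kwh):
    revenue = (hashrate_ths / network_ths) * coins_per_day * coin_price_usd
    electricity = power_kw * 24 * usd_per_kwh
    return revenue - electricity

# e.g. one ~14 TH/s machine drawing 1.4 kW at $0.08/kWh:
print(daily_profit_usd(14, 50_000_000, 3600, 8000, 1.4, 0.08))  # ~5.38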
