Bitcoin XT - EverybodyWiki Bios & Wiki

Why is Blockstream CTO Greg Maxwell u/nullc trying to pretend AXA isn't one of the top 5 "companies that control the world"? AXA relies on debt & derivatives to pretend it's not bankrupt. Million-dollar Bitcoin would destroy AXA's phony balance sheet. How much is AXA paying Greg to cripple Bitcoin?

Here was an interesting brief exchange between Blockstream CTO Greg Maxwell u/nullc and u/BitAlien about AXA:
https://np.reddit.com/Bitcoin/comments/62d2yq/why_bitcoin_is_under_attack/dfm6jt?context=3
The "non-nullc" side of the conversation has already been censored by r\bitcoin - but I had previously archived it here :)
https://archive.fo/yWnWh#selection-2613.0-2615.1
u/BitAlien says to u/nullc :
Blockstream is funded by big banks, for example, AXA.
https://blockstream.com/2016/02/02/blockstream-new-investors-55-million-series-a.html
u/nullc says to u/BitAlien :
is funded by big banks, for example, AXA
AXA is a French multinational insurance firm.
But I guess we shouldn't expect much from someone who thinks miners unilaterally control bitcoin.
Typical semantics games and hair-splitting and bullshitting from Greg.
But I guess we shouldn't expect too much honesty or even understanding from someone like Greg who thinks that miners don't control Bitcoin.
AXA-owned Blockstream CTO Greg Maxwell u/nullc doesn't understand how Bitcoin mining works
Mining is how you vote for rule changes. Greg's comments on BU revealed he has no idea how Bitcoin works. He thought "honest" meant "plays by Core rules." [But] there is no "honesty" involved. There is only the assumption that the majority of miners are INTELLIGENTLY PROFIT-SEEKING. - ForkiusMaximus
https://np.reddit.com/btc/comments/5zxl2l/mining_is_how_you_vote_for_rule_changes_gregs/
AXA-owned Blockstream CTO Greg Maxwell u/nullc is economically illiterate
Adam Back & Greg Maxwell are experts in mathematics and engineering, but not in markets and economics. They should not be in charge of "central planning" for things like "max blocksize". They're desperately attempting to prevent the market from deciding on this. But it will, despite their efforts.
https://np.reddit.com/btc/comments/46052e/adam_back_greg_maxwell_are_experts_in_mathematics/
AXA-owned Blockstream CTO Greg Maxwell u/nullc doesn't understand how fiat works
Gregory Maxwell nullc has evidently never heard of terms like "the 1%", "TPTB", "oligarchy", or "plutocracy", revealing a childlike naïveté when he says: "‘Majority sets the rules regardless of what some minority thinks’ is the governing principle behind the fiats of major democracies."
https://np.reddit.com/btc/comments/44qr31/gregory_maxwell_unullc_has_evidently_never_heard/
AXA-owned Blockstream CTO Greg Maxwell u/nullc is toxic to Bitcoin
People are starting to realize how toxic Gregory Maxwell is to Bitcoin, saying there are plenty of other coders who could do crypto and networking, and "he drives away more talent than he can attract." Plus, he has a 10-year record of damaging open-source projects, going back to Wikipedia in 2006.
https://np.reddit.com/btc/comments/4klqtg/people_are_starting_to_realize_how_toxic_gregory/
So here we have Greg this week, desperately engaging in his usual little "semantics" games - claiming that AXA isn't technically a bank - when the real point is that:
AXA is clearly one of the most powerful fiat finance firms in the world.
Maybe when he's talking about the hairball of C++ spaghetti code that he and his fellow devs at Core/Blockstream are slowly turning their version of Bitcoin's codebase into... in that arcane (and increasingly irrelevant :) area maybe he still can dazzle some people with his usual meaningless technically correct but essentially erroneous bullshit.
But when it comes to finance and economics, Greg is in way over his head - and in those areas, he can't bullshit anyone. In fact, pretty much everything Greg ever says about finance or economics or banks is simply wrong.
He thinks he's proved some point by claiming that AXA isn't technically a bank.
But AXA is far worse than a mere "bank" or a mere "French multinational insurance company".
AXA is one of the top-five "companies that control the world" - and now (some people think) AXA is in charge of paying for Bitcoin "development".
A recent infographic published in the German Magazine "Die Zeit" showed that AXA is indeed the second-most-connected finance company in the world - right at the rotten "core" of the "fantasy fiat" financial system that runs our world today.
Who owns the world? (1) Barclays, (2) AXA, (3) State Street Bank. (Infographic in German - but you can understand it without knowing much German: "Wem gehört die Welt?" = "Who owns the world?") AXA is the #2 company with the most economic power/connections in the world. And AXA owns Blockstream.
https://np.reddit.com/btc/comments/5btu02/who_owns_the_world_1_barclays_2_axa_3_state/
The link to the PDF at Die Zeit in the above OP is gone now - but there's other copies online:
https://www.konsumentenschutz.ch/sks/content/uploads/2014/03/Wem-geh%C3%B6rt-die-Welt.pdf
http://www.zeit.de/2012/23/IG-Capitalist-Network
https://archive.fo/o/EzRea/https://www.konsumentenschutz.ch/sks/content/uploads/2014/03/Wem-geh%C3%B6rt-die-Welt.pdf
Plus there's lots of other research and articles at sites like the financial magazine Forbes, or the scientific publishing site plos.org, with articles which say the same thing - all the tables and graphs show that:
AXA is consistently among the top five "companies that control everything"
https://www.forbes.com/sites/bruceupbin/2011/10/22/the-147-companies-that-control-everything/#56b72685105b
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0025995
http://www98.griffith.edu.au/dspace/bitstream/handle/10072/37499/64037_1.pdf;sequence=1
https://www.outsiderclub.com/report/who-really-controls-the-world/1032
AXA is right at the rotten "core" of the world financial system. Their last CEO was even the head of the friggin' Bilderberg Group.
Blockstream is now controlled by the Bilderberg Group - seriously! AXA Strategic Ventures, co-lead investor for Blockstream's $55 million financing round, is the investment arm of French insurance giant AXA Group - whose CEO Henri de Castries has been chairman of the Bilderberg Group since 2012.
https://np.reddit.com/btc/comments/47zfzt/blockstream_is_now_controlled_by_the_bilderberg/
So, let's get a few things straight here.
"AXA" might not be a household name to many people.
And Greg was "technically right" when he denied that AXA is a "bank" (which is basically the only kind of "right" that Greg ever is these days: "technically" :-)
But AXA is one of the most powerful finance companies in the world.
AXA was started as a French insurance company.
And now it's a French multinational insurance company.
But if you study up a bit on AXA, you'll see that they're not just any old "insurance" company.
AXA has their fingers in just about everything around the world - including a certain team of toxic Bitcoin devs who are radically trying to change Bitcoin:
And ever since AXA started throwing tens of millions of dollars in filthy fantasy fiat at a certain toxic dev named Gregory Maxwell, CTO of Blockstream, suddenly he started saying that we can't have nice things like the gradually increasing blocksizes (and gradually increasing Bitcoin prices - which fortunately tend to increase proportional to the square of the blocksize because of Metcalfe's law :-) which were some of the main reasons most of us invested in Bitcoin in the first place.
My, my, my - how some people have changed!
Greg Maxwell used to have intelligent, nuanced opinions about "max blocksize", until he started getting paid by AXA, whose CEO is head of the Bilderberg Group - the legacy financial elite which Bitcoin aims to disintermediate. Greg always refuses to address this massive conflict of interest. Why?
https://np.reddit.com/btc/comments/4mlo0z/greg_maxwell_used_to_have_intelligent_nuanced/
Previously, Greg Maxwell u/nullc (CTO of Blockstream), Adam Back u/adam3us (CEO of Blockstream), and u/theymos (owner of r\bitcoin) all said that bigger blocks would be fine. Now they prefer to risk splitting the community & the network, instead of upgrading to bigger blocks. What happened to them?
https://np.reddit.com/btc/comments/5dtfld/previously_greg_maxwell_unullc_cto_of_blockstream/
"Even a year ago I said I though we could probably survive 2MB" - nullc
https://np.reddit.com/btc/comments/43mond/even_a_year_ago_i_said_i_though_we_could_probably/
Core/Blockstream supporters like to tiptoe around the facts a lot - hoping we won't pay attention to the fact that they're getting paid by a company like AXA, or hoping we'll get confused if Greg says that AXA isn't a bank but rather an insurance firm.
But the facts are the facts, whether AXA is an insurance giant or a bank:
  • AXA would be exposed as bankrupt in a world dominated by a "counterparty-free" asset class like Bitcoin.
  • AXA pays Greg's salary - and Greg is one of the major forces who has been actively attempting to block Bitcoin's on-chain scaling - and there's no way getting around the fact that artificially small blocksizes do lead to artificially low prices.
AXA kinda reminds me of AIG
If anyone here was paying attention when the cracks first started showing in the world fiat finance system around 2008, you may recall the name of another mega-insurance company, that was also one of the most connected finance companies in the world: AIG.
Falling Giant: A Case Study Of AIG
What was once the unthinkable occurred on September 16, 2008. On that date, the federal government gave the American International Group - better known as AIG (NYSE:AIG) - a bailout of $85 billion. In exchange, the U.S. government received nearly 80% of the firm's equity. For decades, AIG was the world's biggest insurer, a company known around the world for providing protection for individuals, companies and others. But in September, the company would have gone under if it were not for government assistance.
http://www.investopedia.com/articles/economics/09/american-investment-group-aig-bailout.asp
Why the Fed saved AIG and not Lehman
Bernanke did say he believed an AIG failure would be "catastrophic," and that the heavy use of derivatives made the AIG problem potentially more explosive.
An AIG failure, thanks to the firm's size and its vast web of trading partners, "would have triggered an intensification of the general run on international banking institutions," Bernanke said.
http://fortune.com/2010/09/02/why-the-fed-saved-aig-and-not-lehman/
Just like AIG, AXA is a "systemically important" finance company - one of the biggest insurance companies in the world.
And (like all major banks and insurance firms), AXA is drowning in worthless debt and bets (derivatives).
Most of AXA's balance sheet would go up in a puff of smoke if they actually did "mark-to-market" (ie, if they actually factored in the probability of the counterparties of their debts and bets actually coming through and paying AXA the full amount it says on the pretty little spreadsheets on everyone's computer screens).
In other words: Like most giant banks and insurers, AXA has mainly debt and bets. They rely on counterparties to pay them - maybe, someday, if the whole system doesn't go tits-up by then.
In other words: Like most giant banks and insurers, AXA does not hold the "private keys" to their so-called wealth :-)
So, like most giant multinational banks and insurers who spend all their time playing with debts and bets, AXA has been teetering on the edge of the abyss since 2008 - held together by chewing gum and paper clips and the miracle of Quantitative Easing - and also by all the clever accounting tricks that instantly become possible when money can go from being a gleam in a banker's eye to a pixel on a screen with just a few keystrokes - that wonderful world of "fantasy fiat" where central bankers ninja-mine billions of dollars in worthless paper and pixels into existence every month - and then for some reason every other month they have to hold a special "emergency central bankers meeting" to deal with the latest financial crisis du jour which "nobody could have seen coming".
AIG back in 2008 - much like AXA today - was another "systemically important" worldwide mega-insurance giant - with most of its net worth merely a pure fantasy on a spreadsheet and in a four-color annual report - glossing over the ugly reality that it's all based on toxic debts and derivatives which will never ever be paid off.
Mega-banks and mega-insurers like AXA are addicted to the never-ending "fantasy fiat" being injected into the casino of musical chairs involving bets upon bets upon bets upon bets upon bets - counterparty against counterparty against counterparty against counterparty - going 'round and 'round on the big beautiful carousel where everyone is waiting on the next guy to pay up - and meanwhile everyone's cooking their books and sweeping their losses "under the rug", offshore or onto the taxpayers or into special-purpose vehicles - while the central banks keep printing up a trillion more here and a trillion more there in worthless debt-backed paper and pixels - while entire nations slowly sink into the toxic financial sludge of ever-increasing unpayable debt and lower productivity and higher inflation, dragging down everyone's economies, enslaving everyone to increasing worktime and decreasing paychecks and unaffordable healthcare and education, corrupting our institutions and our leaders, distorting our investment and "capital allocation" decisions, inflating housing and healthcare and education beyond everyone's reach - and sending people off to die in endless wars to prop up the deadly failing Saudi-American oil-for-arms Petrodollar ninja-mined currency cartel.
In 2008, when the multinational insurance company AIG (along with their fellow gambling buddies at the multinational investment banks Bear Stearns and Lehmans) almost went down the drain due to all their toxic gambling debts, they also almost took the rest of the world with them.
And that's when the "core" dev team working for the miners (er, central banks: the Fed, ECB, BoE, BoJ - who all report to the "central bank of central banks", the BIS in Basel) started cranking up their mining rigs (er, printing presses) and keyboards and pixels to the max, unilaterally manipulating the "issuance schedule" of their shitcoins and flooding the world with tens of trillions in their worthless phoney fiat to save their sorry asses after all their toxic debts and bad bets.
AXA is at the very rotten "core" of this system - like AIG, a "systemically important" (ie, "too big to fail") mega-gigantic multinational insurance company - a fantasy fiat finance firm quietly sitting at the rotten core of our current corrupt financial system, basically impacting everything and everybody on this planet.
The "masters of the universe" from AXA are the people who go to Davos every year wining and dining on lobster and champagne - part of that elite circle that prints up endless money which they hand out to their friends while they continue to enslave everyone else - and then of course they always turn around and tell us we can't have nice things like roads and schools and healthcare because "austerity". (But somehow we always can have plenty of wars and prisons and climate change and terrorism because for some weird reason our "leaders" seem to love creating disasters.)
The smart people at AXA are probably all having nightmares - and the smart people at all the other companies in that circle of "too-big-to-fail" "fantasy fiat finance firms" are probably also having nightmares - about the following very possible scenario:
If Bitcoin succeeds, debt-and-derivatives-dependent financial "giants" like AXA will probably be exposed as having been bankrupt this entire time.
All their debts and bets will be exposed as not being worth the paper and pixels they were printed on - and at that point, in a cryptocurrency world, the only real money in the world will be "counterparty-free" assets ie cryptocurrencies like Bitcoin - where all you need to hold is your own private keys - and you're not dependent on the next deadbeat debt-ridden fiat slave down the line coughing up to pay you.
Some of those people at AXA and the rest of that mafia are probably quietly buying - sad that they missed out when Bitcoin was only $10 or $100 - but happy they can still get it for $1000 while Blockstream continues to suppress the price - and who knows, what the hell, they might as well throw some of that juicy "banker's bonus" into Bitcoin now just in case it really does go to $1 million a coin someday - which it could easily do with just 32MB blocks, and no modifications to the code (ie, no SegWit, no BU, no nuthin', just a slowly growing blocksize supporting a price growing roughly proportional to the square of the blocksize - like Bitcoin always actually did before the economically illiterate devs at Blockstream imposed their centrally planned blocksize on our previously decentralized system).
Meanwhile, other people at AXA and other major finance firms might be taking a different tack: happy to see all the disinfo and discord being sown among the Bitcoin community like they've been doing since they were founded in late 2014 - buying out all the devs, dumbing down the community to the point where now even the CTO of Blockstream, Greg Maxwell, gets the whitepaper totally backwards.
Maybe Core/Blockstream's failure-to-scale is a feature not a bug - for companies like AXA.
After all, AXA - like most of the major banks in Europe and the US - is now basically totally dependent on debt and derivatives to pretend it's not already bankrupt.
Maybe Blockstream's dead-end road-map (written up by none other than Greg Maxwell), which has been slowly strangling Bitcoin for over two years now - and which could ultimately destroy Bitcoin via the poison pill of Core/Blockstream's SegWit trojan horse - maybe all this never-ending history of obstruction and foot-dragging and lying and failure from Blockstream is actually a feature and not a bug, as far as AXA and their banking buddies are concerned.
The insurance company with the biggest exposure to the 1.2 quadrillion dollar (ie, 1200 TRILLION dollar) derivatives casino is AXA. Yeah, that AXA, the company whose CEO is head of the Bilderberg Group, and whose "venture capital" arm bought out Bitcoin development by "investing" in Blockstream.
https://np.reddit.com/btc/comments/4k1r7v/the_insurance_company_with_the_biggest_exposure/
If Bitcoin becomes a major currency, then tens of trillions of dollars on the "legacy ledger of fantasy fiat" will evaporate, destroying AXA, whose CEO is head of the Bilderbergers. This is the real reason why AXA bought Blockstream: to artificially suppress Bitcoin volume and price with 1MB blocks.
https://np.reddit.com/btc/comments/4r2pw5/if_bitcoin_becomes_a_major_currency_then_tens_of/
AXA has even invented some kind of "climate catastrophe" derivative - a bet where, if global warming destroys an entire region of the world, the "winner" gets paid.
Of course, derivatives would be something attractive to an insurance company - since basically most of their business is about making and taking bets.
So who knows - maybe AXA is "betting against" Bitcoin - and their little investment in the loser devs at Core/Blockstream is part of their strategy for "winning" that bet.
This trader's price & volume graph / model predicted that we should be over $10,000 USD/BTC by now. The model broke in late 2014 - when AXA-funded Blockstream was founded, and started spreading propaganda and crippleware, centrally imposing artificially tiny blocksize to suppress the volume & price.
https://np.reddit.com/btc/comments/5obe2m/this_traders_price_volume_graph_model_predicted/
"I'm angry about AXA scraping some counterfeit money out of their fraudulent empire to pay autistic lunatics millions of dollars to stall the biggest sociotechnological phenomenon since the internet and then blame me and people like me for being upset about it." ~ u/dresden_k
https://np.reddit.com/btc/comments/5xjkof/im_angry_about_axa_scraping_some_counterfeit/
Bitcoin can go to 10,000 USD with 4 MB blocks, so it will go to 10,000 USD with 4 MB blocks. All the censorship & shilling on r\bitcoin & fantasy fiat from AXA can't stop that. BitcoinCORE might STALL at 1,000 USD and 1 MB blocks, but BITCOIN will SCALE to 10,000 USD and 4 MB blocks - and beyond
https://np.reddit.com/btc/comments/5jgkxv/bitcoin_can_go_to_10000_usd_with_4_mb_blocks_so/
AXA/Blockstream are suppressing Bitcoin price at 1000 bits = 1 USD. If 1 bit = 1 USD, then Bitcoin's market cap would be 15 trillion USD - close to the 82 trillion USD of "money" in the world. With Bitcoin Unlimited, we can get to 1 bit = 1 USD on-chain with 32MB blocksize ("Million-Dollar Bitcoin")
https://www.reddit.com/btc/comments/5u72va/axablockstream_are_suppressing_bitcoin_price_at/
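The arithmetic behind that claim can be sketched in a few lines (the ~15 million BTC circulating supply and the round price points are assumptions taken from the post's own figures, not live data):

```python
# Sketch of the "1 bit = 1 USD" arithmetic from the post above.
# The ~15 million BTC supply is an assumption, not a measured value.
BITS_PER_BTC = 1_000_000          # 1 bitcoin = 1,000,000 bits
btc_supply = 15_000_000           # approximate circulating supply at the time

# "1000 bits = 1 USD" implies a price of 1,000 USD per BTC:
suppressed_price = BITS_PER_BTC / 1000
# "1 bit = 1 USD" implies a price of 1,000,000 USD per BTC:
target_price = BITS_PER_BTC * 1.0

market_cap_trillions = btc_supply * target_price / 1e12

print(suppressed_price)           # 1000.0 USD per BTC
print(market_cap_trillions)       # 15.0 trillion USD
```

That 15 trillion USD figure is what the post compares against the ~82 trillion USD of "money" in the world.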
Anyways, people are noticing that it's a little... odd... the way Greg Maxwell seems to go to such lengths, in order to cover up the fact that bigger blocks have always correlated to higher price.
He seems to get very... uncomfortable... when people start pointing out that:
It sure looks like AXA is paying Greg Maxwell to suppress the Bitcoin price.
Greg Maxwell has now publicly confessed that he is engaging in deliberate market manipulation to artificially suppress Bitcoin adoption and price. He could be doing this so that he and his associates can continue to accumulate while the price is still low (1 BTC = $570, ie 1 USD can buy 1750 "bits")
https://np.reddit.com/btc/comments/4wgq48/greg_maxwell_has_now_publicly_confessed_that_he/
Why did Blockstream CTO u/nullc Greg Maxwell risk being exposed as a fraud, by lying about basic math? He tried to convince people that Bitcoin does not obey Metcalfe's Law (claiming that Bitcoin price & volume are not correlated, when they obviously are). Why is this lie so precious to him?
https://www.reddit.com/btc/comments/57dsgz/why_did_blockstream_cto_unullc_greg_maxwell_risk/
I don't know how a so-called Bitcoin dev can sleep at night knowing he's getting paid by fucking AXA - a company that would probably go bankrupt if Bitcoin becomes a major world currency.
Greg must have to go through some pretty complicated mental gymnastics to justify in his mind what everyone else can see: he is a fucking sellout to one of the biggest fiat finance firms in the world - he's getting paid by (and defending) a company which would probably go bankrupt if Bitcoin ever achieved multi-trillion dollar market cap.
Greg is literally getting paid by the second-most-connected "systemically important" (ie, "too big to fail") finance firm in the world - which will probably go bankrupt if Bitcoin were ever to assume its rightful place as a major currency with total market cap measured in the tens of trillions of dollars, destroying most of the toxic sludge of debt and derivatives keeping a financial giant like AXA afloat.
And it may at first sound batshit crazy (until You Do The Math), but Bitcoin actually really could go to one-million-dollars-a-coin in the next 8 years or so - without SegWit or BU or anything else - simply by continuing with Satoshi's original 32MB built-in blocksize limit and continuing to let miners keep blocks as small as possible to satisfy demand while avoiding orphans - a power which they've had this whole friggin' time and which they've been managing very well thank you.
Bitcoin Original: Reinstate Satoshi's original 32MB max blocksize. If actual blocks grow 54% per year (and price grows 1.54² ≈ 2.37x per year - Metcalfe's Law), then in 8 years we'd have 32MB blocks, 100 txns/sec, 1 BTC = 1 million USD - 100% on-chain P2P cash, without SegWit/Lightning or Unlimited
https://np.reddit.com/btc/comments/5uljaf/bitcoin_original_reinstate_satoshis_original_32mb/
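The projection in that post is simple compounding; a minimal sketch, where the starting values (1 MB blocks, 1000 USD/BTC) are assumed round numbers, not measurements:

```python
# Compounding sketch of the "32MB blocks, $1M/BTC in 8 years" projection.
# Starting block size and price are assumptions for illustration.
years = 8
block_growth = 1.54                 # blocks grow 54% per year
price_growth = block_growth ** 2    # Metcalfe's law: price ~ blocksize^2, so ~2.37x/yr

block_mb = 1.0 * block_growth ** years        # starting from 1 MB blocks
price_usd = 1000.0 * price_growth ** years    # starting from 1000 USD/BTC

print(round(block_mb))    # 32 (MB)
print(price_usd > 1e6)    # True - roughly a million USD per coin
```

The two growth rates aren't independent: once you assume price scales with the square of the blocksize, the 54%/year blocksize growth fixes the ~2.37x/year price growth.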
Meanwhile Greg continues to work for Blockstream which is getting tens of millions of dollars from a company which would go bankrupt if Bitcoin were to actually scale on-chain to 32MB blocks and 1 million dollars per coin without all of Greg's meddling.
So Greg continues to get paid by AXA, spreading his ignorance about economics and his lies about Bitcoin on these forums.
In the end, who knows what Greg's motivations are, or AXA's motivations are.
But one thing we do know is this:
Satoshi didn't put Greg Maxwell or AXA in charge of deciding the blocksize.
The tricky part to understand about "one CPU, one vote" is that it does not mean there is some "pre-existing set of rules" which the miners somehow "enforce" (despite all the times when you hear some Core idiot using words like "consensus layer" or "enforcing the rules").
The tricky part about really understanding Bitcoin is this:
Hashpower doesn't just enforce the rules - hashpower makes the rules.
And if you think about it, this makes sense.
It's the only way Bitcoin actually could be decentralized.
It's kinda subtle - and it might be hard for someone to understand if they've been a slave to centralized authorities their whole life - but when we say that Bitcoin is "decentralized" then what it means is:
We all make the rules.
Because if hashpower doesn't make the rules - then you'd be right back where you started from, with some idiot like Greg Maxwell "making the rules" - or some corrupt too-big-to-fail bank debt-and-derivative-backed "fantasy fiat financial firm" like AXA making the rules - by buying out a dev team and telling us that that dev team "makes the rules".
But fortunately, Greg's opinions and ignorance and lies don't matter anymore.
Miners are waking up to the fact that they've always controlled the blocksize - and they always will control the blocksize - and there isn't a single goddamn thing Greg Maxwell or Blockstream or AXA can do to stop them from changing it - whether the miners end up using BU or Classic or BitcoinEC or they patch the code themselves.
The debate is not "SHOULD THE BLOCKSIZE BE 1MB VERSUS 1.7MB?". The debate is: "WHO SHOULD DECIDE THE BLOCKSIZE?" (1) Should an obsolete temporary anti-spam hack freeze blocks at 1MB? (2) Should a centralized dev team soft-fork the blocksize to 1.7MB? (3) OR SHOULD THE MARKET DECIDE THE BLOCKSIZE?
https://np.reddit.com/btc/comments/5pcpec/the_debate_is_not_should_the_blocksize_be_1mb/
Core/Blockstream are now in the Kübler-Ross "Bargaining" phase - talking about "compromise". Sorry, but markets don't do "compromise". Markets do COMPETITION. Markets do winner-takes-all. The whitepaper doesn't talk about "compromise" - it says that 51% of the hashpower determines WHAT IS BITCOIN.
https://np.reddit.com/btc/comments/5y9qtg/coreblockstream_are_now_in_the_k%C3%BCblerross/
Clearing up Some Widespread Confusions about BU
Core deliberately provides software with a blocksize policy pre-baked in.
The ONLY thing BU-style software changes is that baking in. It refuses to bundle controversial blocksize policy in with the rest of the code it is offering. It unties the blocksize settings from the dev teams, so that you don't have to shop for both as a packaged unit.
The idea is that you can now have Core software security without having to submit to Core blocksize policy.
Running Core is like buying a Sony TV that only lets you watch Fox, because the other channels are locked away and you have to know how to solder a circuit board to see them. To change the channel, you as a layman would have to switch to a different TV made by some other manufacturer, whose TVs you may not think are as reliable.
This is because Sony believes people should only ever watch Fox "because there are dangerous channels out there" or "because since everyone needs to watch the same channel, it is our job to decide what that channel is."
So the community is stuck with either watching Fox on their nice, reliable Sony TVs, or switching to all watching ABC on some more questionable TVs made by some new maker (like, in 2015 the XT team was the new maker and BIP101 was ABC).
BU (and now Classic and BitcoinEC) shatters that whole bizarre paradigm. BU is a TV that lets you tune to any channel you want, at your own risk.
The community is free to converge on any channel it wants to, and since everyone in this analogy wants to watch the same channel they will coordinate to find one.
https://np.reddit.com/btc/comments/602vsy/clearing_up_some_widespread_confusions_about_bu/
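The "tune to any channel you want" idea can be illustrated with a toy model of BU's emergent-consensus parameters - an excessive-block size (EB) each node chooses for itself, plus an accept depth (AD). This is a sketch of the concept only, not Bitcoin Unlimited's actual code:

```python
# Toy model of BU-style "emergent consensus" (illustrative only).
# Each node sets its own block-size policy instead of inheriting one
# baked in by a dev team.
class Node:
    def __init__(self, eb_mb, ad):
        self.eb_mb = eb_mb   # largest block this node accepts outright, in MB
        self.ad = ad         # accept a larger block once it's buried this deep

    def accepts(self, block_mb, depth):
        return block_mb <= self.eb_mb or depth >= self.ad

# Three nodes "tuned to different channels":
nodes = [Node(1, 4), Node(8, 4), Node(16, 4)]

# A 2 MB block is initially rejected by the 1 MB node...
print([n.accepts(2, 0) for n in nodes])   # [False, True, True]
# ...but once the majority chain buries it 4 blocks deep, everyone follows:
print([n.accepts(2, 4) for n in nodes])   # [True, True, True]
```

The accept-depth rule is what lets nodes with different settings still converge on one chain: a minority node eventually follows the chain the majority hashpower builds, rather than staying forked forever.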
Adjustable blocksize cap (ABC) is dangerous? The blocksize cap has always been user-adjustable. Core just has a really shitty interface for it.
What does it tell you that Core and its supporters are up in arms about a change that merely makes something more convenient for users and couldn't be prevented from happening anyway? Attacking the adjustable blocksize feature in BU and Classic as "dangerous" is a kind of trap, as it is an implicit admission that Bitcoin was being protected only by a small barrier of inconvenience, and a completely temporary one at that. If this was such a "danger" or such a vector for an "attack," how come we never heard about it before?
Even if we accept the improbable premise that inconvenience is the great bastion holding Bitcoin together and the paternalistic premise that stakeholders need to be fed consensus using a spoon of inconvenience, we still must ask, who shall do the spoonfeeding?
Core accepts these two amazing premises and further declares that Core alone shall be allowed to do the spoonfeeding. Or rather, if you really want to you can be spoonfed by other implementation clients like libbitcoin and btcd as long as they are all feeding you the same stances on controversial consensus settings as Core does.
It is high time the community see central planning and abuse of power for what it is, and reject both:
  • Throw off central planning by removing petty "inconvenience walls" (such as baked-in, dev-recommended blocksize caps) that interfere with stakeholders coordinating choices amongst themselves on controversial matters ...
  • Make such abuse of power impossible by encouraging many competing implementations to grow and blossom
https://np.reddit.com/btc/comments/617gf9/adjustable_blocksize_cap_abc_is_dangerous_the/
So it's time for Blockstream CTO Greg Maxwell u/nullc to get over his delusions of grandeur - and to admit he's just another dev, with just another opinion.
He also needs to look in the mirror and search his soul and confront the sad reality that he's basically turned into a sellout working for a shitty startup getting paid by the 5th (or 4th or 2nd) "most connected", "systemically important", "too-big-to-fail", debt-and-derivative-dependent multinational mega-insurance giant in the world, AXA - a major fiat finance firm which is terrified of going bankrupt just like that other mega-insurance firm AIG already almost did before the Fed rescued them in 2008 - a fiat finance firm which is probably very conflicted about Bitcoin, at the very least.
Blockstream CTO Greg Maxwell is getting paid by the most systemically important mega-insurance giant in the world, sitting at the rotten "core" of our civilization's corrupt, dying fiat cartel.
Blockstream CTO Greg Maxwell is getting paid by a mega-insurance company that will probably go bankrupt if and when Bitcoin ever gets a multi-trillion dollar market cap, which it can easily do with just 32MB blocks and no code changes at all from clueless meddling devs like him.
submitted by ydtm to btc

Block size limit debate history lesson

Pre 2013
Bitcoin users and developers have near-universal agreement that the block size limit is a temporary feature that must be raised and/or removed. Preparing for this hard fork is one of lead developer Gavin Andresen's top priorities.
https://web.archive.org/web/20140328052630/https://en.bitcoin.it/wiki/Talk:Scalability
MAX_BLOCK_SIZE has always been planned to increase as needed. That limitation should be ignored. theymos 17:15, 4 March 2011 (GMT)
What Theymos said. Increasing MAX_BLOCK_SIZE will be done when "lightweight, header-only" client mode is done. Until then, block size has to be kept under control.--Gavin Andresen 00:19, 5 March 2011 (GMT)
However development priorities are not very unified, as noted by one observer:
https://bitcointalk.org/index.php?topic=122013.msg1390298#msg1390298
When I joined this forum I was completely wrong calling the Bitcoin core development team "Bitcoin bunker". Now that I understand the situation better I know that there's no single bunker. There are numerous one-or-two-person cubbyholes that may occasionally form alliances to shoot at the occupant of another cubbyhole. The situation conforms better to the distributed paradigm inherent in the design of Bitcoin.
2013
For the first time in Bitcoin's history, arguments begin to erupt regarding the desirability of increasing the block size limit.
Many of the proponents of making the block size limit permanent are investors in competing currencies/payment systems - a fact that was not lost on observers of the era, and which can easily be confirmed by viewing the profiles of the participants:
https://bitcointalk.org/index.php?topic=140233.0;all
https://bitcointalk.org/index.php?topic=144895.0;all
https://bitcointalk.org/index.php?topic=221111.0;all
In May of 2013, Peter Todd funds the production of a propaganda video:
https://www.youtube.com/watch?v=cZp7UGgBR0I
None of the claims in this video are true, but it is effective in creating drama. Tensions rise and development work grinds nearly to a halt due to infighting.
BTC market share is 95%.
In December, Gregory Maxwell begins to revive the idea of sidechains along with Adam Back, TheBlueMatt, and other individuals who will go on to form Blockstream.
They begin promoting sidechains as an alternative to Bitcoin scaling.
http://web.archive.org/web/20140226095319/http://download.wpsoftware.net/bitcoin/wizards/2013-12-18.txt
2014
April 7: Unwilling to deal with the drama any further, Gavin steps down as lead developer. At the time the BTC market share is 90%.
Sidechain discussion is well underway, yet a few people still manage to speak up to point out that sidechains should not be treated as an alternative to scaling Bitcoin. You may notice some familiar posters in these threads:
https://bitcointalk.org/index.php?topic=566704.0;all
https://bitcointalk.org/index.php?topic=563972.0;all
In October, Blockstream.com publishes their sidechain whitepaper:
https://bitcointalk.org/index.php?topic=831527.0;all
The response is underwhelming.
On November 17, Blockstream announces the securing of $21 million in seed funding.
BTC market share is 91%.
2015
On June 22, Gavin Andresen proposes BIP101 to increase the block size limit as the conclusion of his work performed since stepping down as lead developer.
On August 6, Mike Hearn announces BitcoinXT, a full node implementation that includes BIP101.
Many Blockstream employees, including Adam Back, call this effort a "coup" - a claim that cannot be made without admitting they believe themselves to be the legitimate rulers of Bitcoin.
http://spectrum.ieee.org/tech-talk/computing/networks/the-bitcoin-for-is-a-coup
In October, Blockstream employee Pieter Wuille proposes "Segregated Witness":
https://bitcointalk.org/index.php?topic=1210235.0
Post-2015
This is the time period most Bitcoin users are familiar with - which really only represents the tail end of a five-year-long fight to prevent the planned block size limit increase.
The BTC market share has been steadily dropping since the anti-scaling propaganda began in late 2012/early 2013.
It currently stands at 66%.
https://coinmarketcap.com/charts/
submitted by ABlockInTheChain to btc [link] [comments]

Preventing double-spends is an "embarrassingly parallel" massive search problem - like Google, SETI@home, Folding@home, or PrimeGrid. BUIP024 "address sharding" is similar to Google's MapReduce & Berkeley's BOINC grid computing - "divide-and-conquer" providing unlimited on-chain scaling for Bitcoin.

TL;DR: Like all other successful projects involving "embarrassingly parallel" search problems in massive search spaces, Bitcoin can and should - and inevitably will - move to a distributed computing paradigm based on successful "sharding" architectures such as Google Search (based on Google's MapReduce algorithm), or SETI@home, Folding@home, or PrimeGrid (based on Berkeley's BOINC grid computing architecture) - which use simple mathematical "decompose" and "recompose" operations to break big problems into tiny pieces, providing virtually unlimited scaling (plus fault tolerance) at the logical / software level, on top of possibly severely limited (and faulty) resources at the physical / hardware level.
The discredited "heavy" (and over-complicated) design philosophy of centralized "legacy" dev teams such as Core / Blockstream (requiring every single node to download, store and verify the massively growing blockchain, and pinning their hopes on non-existent off-chain vaporware such as the so-called "Lightning Network" which has no mathematical definition and is missing crucial components such as decentralized routing) is doomed to failure, and will be out-competed by simpler on-chain "lightweight" distributed approaches such as distributed trustless Merkle trees or BUIP024's "Address Sharding" emerging from independent devs such as u/thezerg1 (involved with Bitcoin Unlimited).
No one in their right mind would expect Google's vast search engine to fit entirely on a Raspberry Pi behind a crappy Internet connection - and no one in their right mind should expect Bitcoin's vast financial network to fit entirely on a Raspberry Pi behind a crappy Internet connection either.
Any "normal" (ie, competent) company with $76 million to spend could provide virtually unlimited on-chain scaling for Bitcoin in a matter of months - simply by working with devs who would just go ahead and apply the existing obvious mature successful tried-and-true "recipes" for solving "embarrassingly parallel" search problems in massive search spaces, based on standard DISTRIBUTED COMPUTING approaches like Google Search (based on Google's MapReduce algorithm), or SETI@home, Folding@home, or PrimeGrid (based on Berkeley's BOINC grid computing architecture). The fact that Blockstream / Core devs refuse to consider any standard DISTRIBUTED COMPUTING approaches just proves that they're "embarrassingly stupid" - and the only way Bitcoin will succeed is by routing around their damage.
Proven, mature sharding architectures like the ones powering Google Search, SETI@home, Folding@home, or PrimeGrid will allow Bitcoin to achieve virtually unlimited on-chain scaling, with minimal disruption to the existing Bitcoin network topology and mining and wallet software.
Longer Summary:
People who argue that "Bitcoin can't scale" - because it involves major physical / hardware requirements (lots of processing power, upload bandwidth, storage space) - are at best simply misinformed or incompetent - or at worst outright lying to you.
Bitcoin mainly involves searching the blockchain to prevent double-spends - and so it is similar to many other projects involving "embarrassingly parallel" searching in massive search spaces - like Google Search, SETI@home, Folding@home, or PrimeGrid.
But there's a big difference between those long-running wildly successful massively distributed infinitely scalable parallel computing projects, and Bitcoin.
Those other projects do their data storage and processing across a distributed network. But Bitcoin (under the misguided "leadership" of Core / Blockstream devs) insists on a fatally flawed design philosophy where every individual node must be able to download, store and verify the system's entire data structure. And it's even worse than that - they want to let the least powerful nodes in the system dictate the resource requirements for everyone else.
Meanwhile, those other projects are all based on some kind of "distributed computing" involving "sharding". They achieve massive scaling by adding a virtually unlimited (and fault-tolerant) logical / software layer on top of the underlying resource-constrained / limited physical / hardware layer - using approaches like Google's MapReduce algorithm or Berkeley's Open Infrastructure for Network Computing (BOINC) grid computing architecture.
This shows that it is a fundamental error to continue insisting on viewing an individual Bitcoin "node" as the fundamental "unit" of the Bitcoin network. Coordinated distributed pools already exist for mining the blockchain - and eventually coordinated distributed trustless architectures will also exist for verifying and querying it. Any architecture or design philosophy where a single "node" is expected to be forever responsible for storing or verifying the entire blockchain is the wrong approach, and is doomed to failure.
The most well-known example of this doomed approach is Blockstream / Core's "roadmap" - which is based on two disastrously erroneous design requirements:
  • Core / Blockstream erroneously insist that the entire blockchain must always be downloadable, storable and verifiable on a single node, as dictated by the least powerful nodes in the system (eg, u/bitusher in Costa Rica, or u/Luke-Jr in the underserved backwoods of Florida); and
  • Core / Blockstream support convoluted, incomplete off-chain scaling approaches such as the so-called "Lightning Network" - which lacks a mathematical foundation, and also has some serious gaps (eg, no solution for decentralized routing).
Instead, the future of Bitcoin will inevitably be based on unlimited on-chain scaling, where all of Bitcoin's existing algorithms and data structures and networking are essentially preserved unchanged / as-is - but they are distributed at the logical / software level using sharding approaches such as u/thezerg1's BUIP024 or distributed trustless Merkle trees.
These kinds of sharding architectures will allow individual nodes to use a minimum of physical resources to access a maximum of logical storage and processing resources across a distributed network with virtually unlimited on-chain scaling - where every node will be able to use and verify the entire blockchain without having to download and store the whole thing - just like Google Search, SETI@home, Folding@home, or PrimeGrid and other successful sharding-based distributed projects have already been doing for years.
Details:
Sharding, which has been so successful in many other areas, is a topic that keeps resurfacing in various shapes and forms among independent Bitcoin developers.
The highly successful track record of sharding architectures on other projects involving "embarrassingly parallel" massive search problems (harnessing resource-constrained machines at the physical level into a distributed network at the logical level, in order to provide fault tolerance and virtually unlimited scaling searching for web pages, interstellar radio signals, protein sequences, or prime numbers in massive search spaces up to hundreds of terabytes in size) provides convincing evidence that sharding architectures will also work for Bitcoin (which also requires virtually unlimited on-chain scaling, searching the ever-expanding blockchain for previous "spends" from an existing address, before appending a new transaction from this address to the blockchain).
Below are some links involving proposals for sharding Bitcoin, plus more discussion and related examples.
BUIP024: Extension Blocks with Address Sharding
https://np.reddit.com/btc/comments/54afm7/buip024_extension_blocks_with_address_sharding/
Why aren't we as a community talking about Sharding as a scaling solution?
https://np.reddit.com/Bitcoin/comments/3u1m36/why_arent_we_as_a_community_talking_about/
(There are some detailed, partially encouraging comments from u/petertodd in that thread.)
[Brainstorming] Could Bitcoin ever scale like BitTorrent, using something like "mempool sharding"?
https://np.reddit.com/btc/comments/3v070a/brainstorming_could_bitcoin_ever_scale_like/
[Brainstorming] "Let's Fork Smarter, Not Harder"? Can we find some natural way(s) of making the scaling problem "embarrassingly parallel", perhaps introducing some hierarchical (tree) structures or some natural "sharding" at the level of the network and/or the mempool and/or the blockchain?
https://np.reddit.com/btc/comments/3wtwa7/brainstorming_lets_fork_smarter_not_harder_can_we/
"Braiding the Blockchain" (32 min + Q&A): We can't remove all sources of latency. We can redesign the "chain" to tolerate multiple simultaneous writers. Let miners mine and validate at the same time. Ideal block time / size / difficulty can become emergent per-node properties of the network topology
https://np.reddit.com/btc/comments/4su1gf/braiding_the_blockchain_32_min_qa_we_cant_remove/
Some kind of sharding - perhaps based on address sharding as in BUIP024, or based on distributed trustless Merkle trees as proposed earlier by u/thezerg1 - is very likely to turn out to be the simplest and safest approach towards massive on-chain scaling.
A thought experiment showing that we already have most of the ingredients for a kind of simplistic "instant sharding"
A simplistic thought experiment can be used to illustrate how easy it could be to do sharding - with almost no changes to the existing Bitcoin system.
Recall that Bitcoin addresses and keys are composed from an alphabet of 58 characters. So, in this simplified thought experiment, we will outline a way to add a kind of "instant sharding" within the existing system - by using the last character of each address in order to assign that address to one of 58 shards.
(Maybe you can already see where this is going...)
Similar to vanity address generation, a user who wants to receive Bitcoins would be required to generate 58 different receiving addresses (each ending with a different character) - and, similarly, miners could be required to pick one of the 58 shards to mine on.
Then, when a user wanted to send money, they would have to look at the last character of their "send from" address - and also select a "send to" address ending in the same character - and presto! we already have a kind of simplistic "instant sharding". (And note that this part of the thought experiment would require only the "softest" kind of soft fork: indeed, we haven't changed any of the code at all, but instead we simply adopted a new convention by agreement, while using the existing code.)
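The convention just described can be sketched in a few lines of code. This is purely illustrative - the shard rule is the assumption of the thought experiment itself, and the two addresses below are just examples:

```python
# Base58 alphabet used by Bitcoin addresses (58 characters, hence 58 shards).
BASE58_ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def shard_of(address: str) -> int:
    """Assign an address to one of 58 shards using its final character."""
    return BASE58_ALPHABET.index(address[-1])

def same_shard(send_from: str, send_to: str) -> bool:
    """Under the convention, a transaction stays within a single shard only
    if the "send from" and "send to" addresses end in the same character."""
    return shard_of(send_from) == shard_of(send_to)

# Two illustrative addresses:
a = "1BvBMSEYstWetqTFn5Au4m4GFg7xJaNVN2"   # ends in '2' -> shard 1
b = "1A1zP1eP5QGefi2DMPTfTL5SLmv7DivfNa"   # ends in 'a' -> shard 33
print(shard_of(a), shard_of(b), same_shard(a, b))   # -> 1 33 False
```

Note that this is exactly the "softest possible soft fork" described above: nothing in consensus code changes, only a convention about which addresses users generate and which transactions miners include.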
Of course, this simplistic "instant sharding" example would still need a few more features in order to be complete - but they'd all be fairly straightforward to provide:
  • A transaction can actually send from multiple addresses, to multiple addresses - so the approach of simply looking at the final character of a single (receive) address would not be enough to instantly assign a transaction to a particular shard. But a slightly more sophisticated decision criterion could easily be developed - and computed using code - to assign every transaction to a particular shard, based on the "from" and "to" addresses in the transaction. The basic concept from the "simplistic" example would remain the same, sharding the network based on some characteristic of transactions.
  • If we had 58 shards, then the mining reward would have to be decreased to 1/58 of what it currently is - and also the mining hash power on each of the shards would end up being roughly 1/58 of what it is now. In general, many people might agree that decreased mining rewards would actually be a good thing (spreading out mining rewards among more people, instead of the current problems where mining is done by about 8 entities). Also, network hashing power has been growing insanely for years, so we probably have way more than enough needed to secure the network - after all, Bitcoin was secure back when network hash power was 1/58 of what it is now.
  • This simplistic example does not handle cases where you need to do "cross-shard" transactions. But it should be feasible to implement such a thing. The various proposals from u/thezerg1 such as BUIP024 do deal with "cross-shard" transactions.
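The "slightly more sophisticated decision criterion" mentioned in the first bullet could, for example, hash the full set of addresses a transaction touches. The rule below is a made-up placeholder (not any actual BUIP024 mechanism), shown only to illustrate that such a criterion is deterministic and trivially computable:

```python
import hashlib

N_SHARDS = 58

def shard_of_tx(from_addrs, to_addrs):
    """Hypothetical rule: hash the sorted set of all addresses in the
    transaction and take the digest modulo the number of shards.
    Any deterministic rule that every node agrees on would do."""
    material = "|".join(sorted(set(from_addrs) | set(to_addrs)))
    digest = hashlib.sha256(material.encode()).digest()
    return int.from_bytes(digest, "big") % N_SHARDS

# The same transaction always lands in the same shard,
# regardless of the order in which its addresses are listed:
s1 = shard_of_tx(["addrA", "addrB"], ["addrC"])
s2 = shard_of_tx(["addrB", "addrA"], ["addrC"])
assert s1 == s2 and 0 <= s1 < N_SHARDS
```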
(Also, the fact that a simplified address-based sharding mechanics can be outlined in just a few paragraphs as shown here suggests that this might be "simple and understandable enough to actually work" - unlike something such as the so-called "Lightning Network", which is actually just a catchy-sounding name with no clearly defined mechanics or mathematics behind it.)
Addresses are plentiful, and can be generated locally, and you can generate addresses satisfying a certain pattern (eg ending in a certain character) the same way people can already generate vanity addresses. So imposing a "convention" where the "send" and "receive" address would have to end in the same character (and where the miner has to only mine transactions in that shard) - would be easy to understand and do.
Similarly, the earlier solution proposed by u/thezerg1, involving distributed trustless Merkle trees, is easy to understand: you'd just be distributing the Merkle tree across multiple nodes, while still preserving its immutability guarantees.
Such approaches don't really change much about the actual system itself. They preserve the existing system, and just split its data structures into multiple pieces, distributed across the network. As long as we have the appropriate operators for decomposing and recomposing the pieces, then everything should work the same - but more efficiently, with unlimited on-chain scaling, and much lower resource requirements.
The examples below show how these kinds of "sharding" approaches have already been implemented successfully in many other systems.
Massive search is already efficiently performed with virtually unlimited scaling using divide-and-conquer / decompose-and-recompose approaches such as MapReduce and BOINC.
Every time you do a Google search, you're using Google's MapReduce algorithm to solve an embarrassingly parallel problem.
And distributed computing grids using the Berkeley Open Infrastructure for Network Computing (BOINC) are constantly setting new records searching for protein combinations, prime numbers, or radio signals from possible intelligent life in the universe.
We all use Google to search hundreds of terabytes of data on the web and get results in a fraction of a second - using cheap "commodity boxes" on the server side, and possibly using limited bandwidth on the client side - with fault tolerance to handle crashing servers and dropped connections.
Other examples are Folding@home, SETI@home and PrimeGrid - involving searching massive search spaces for protein sequences, interstellar radio signals, or prime numbers hundreds of thousands of digits long. Each of these examples uses sharding to decompose a giant search space into smaller sub-spaces which are searched separately in parallel, and then the resulting (sub-)solutions are recomposed to provide the overall search results.
It seems obvious to apply this tactic to Bitcoin - searching the blockchain for existing transactions involving a "send" from an address, before appending a new "send" transaction from that address to the blockchain.
Some people might object that those systems are different from Bitcoin.
But we should remember that preventing double-spends (the main thing that Bitcoin does) is, after all, an embarrassingly parallel massive search problem - and all of these other systems also involve embarrassingly parallel massive search problems.
The mathematics of Google's MapReduce and Berkeley's BOINC is simple, elegant, powerful - and provably correct.
Google's MapReduce and Berkeley's BOINC have demonstrated that in order to provide massive scaling for efficient searching of massive search spaces, all you need is...
  • an appropriate "decompose" operation,
  • an appropriate "recompose" operation,
  • the necessary coordination mechanisms
...in order to distribute a single problem across multiple, cheap, fault-tolerant processors.
This allows you to decompose the problem into tiny sub-problems, solving each sub-problem to provide a sub-solution, and then recompose the sub-solutions into the overall solution - gaining virtually unlimited scaling and massive efficiency.
The only "hard" part involves analyzing the search space in order to select the appropriate DECOMPOSE and RECOMPOSE operations which guarantee that recomposing the "sub-solutions" obtained by decomposing the original problem is equivalent to the solving the original problem. This essential property could be expressed in "pseudo-code" as follows:
  • (DECOMPOSE ; SUB-SOLVE ; RECOMPOSE) = (SOLVE)
Selecting the appropriate DECOMPOSE and RECOMPOSE operations (and implementing the inter-machine communication coordination) can be somewhat challenging, but it's certainly doable.
In fact, as mentioned already, these things have already been done in many distributed computing systems. So there's hardly any "original" work to be done in this case. All we need to focus on now is translating the existing single-processor architecture of Bitcoin to a distributed architecture, adopting the mature, proven, efficient "recipes" provided by the many successful distributed systems already up and running, such as Google Search (based on Google's MapReduce algorithm), or SETI@home, Folding@home, or PrimeGrid (based on Berkeley's BOINC grid computing architecture).
That's what any "competent" company with $76 million to spend would have done already - simply work with some devs who know how to implement open-source distributed systems, and focus on adapting Bitcoin's particular data structures (merkle trees, hashed chains) to a distributed environment. That's a realistic roadmap that any team of decent programmers with distributed computing experience could easily implement in a few months, and any decent managers could easily manage and roll out on a pre-determined schedule - instead of all these broken promises and missed deadlines and non-existent vaporware and pathetic excuses we've been getting from the incompetent losers and frauds involved with Core / Blockstream.
ASIDE: MapReduce and BOINC are based on math - but the so-called "Lightning Network" is based on wishful thinking involving kludges on top of workarounds on top of hacks - which is how you can tell that LN will never work.
Once you have succeeded in selecting the appropriate mathematical DECOMPOSE and RECOMPOSE operations, you get simple massive scaling - and it's also simple for anyone to verify that these operations are correct - often in about a half-page of math and code.
An example of this kind of elegance and brevity (and provable correctness) involving compositionality can be seen in this YouTube clip by the accomplished mathematician Lucius Gregory Meredith, presenting some operators for scaling Ethereum - in just a half page of code:
https://youtu.be/uzahKc_ukfM?t=1101
Conversely, if you fail to select the appropriate mathematical DECOMPOSE and RECOMPOSE operations, then you end up with a convoluted mess of wishful thinking - like the "whitepaper" for the so-called "Lightning Network", which is just a cool-sounding name with no actual mathematics behind it.
The LN "whitepaper" is an amateurish, non-mathematical meandering mishmash of 60 pages of "Alice sends Bob" examples involving hacks on top of workarounds on top of kludges - also containing a fatal flaw (a lack of any proposed solution for doing decentralized routing).
The disaster of the so-called "Lightning Network" - involving adding never-ending kludges on top of hacks on top of workarounds (plus all kinds of "timing" dependencies) - is reminiscent of the "epicycles" which were desperately added in a last-ditch attempt to make Ptolemy's "geocentric" system work - based on the incorrect assumption that the Sun revolved around the Earth.
This is how you can tell that the approach of the so-called "Lightning Network" is simply wrong, and it would never work - because it fails to provide appropriate (and simple, and provably correct) mathematical DECOMPOSE and RECOMPOSE operations in less than a single page of math and code.
Meanwhile, sharding approaches based on a DECOMPOSE and RECOMPOSE operation are simple and elegant - and "functional" (ie, they don't involve "procedural" timing dependencies like keeping your node running all the time, or closing out your channel before a certain deadline).
Bitcoin only has 6,000 nodes - but the leading sharding-based projects have over 100,000 nodes, with no financial incentives.
Many of these sharding-based projects have many more nodes than the Bitcoin network.
The Bitcoin network currently has about 6,000 nodes - even though there are financial incentives for running a node (ie, verifying your own Bitcoin balance).
SETI@home and Folding@home each have over 100,000 active users - even though these projects don't provide any financial incentives. This higher number of users might be due in part to the low resource demands of these BOINC-based projects, all of which are based on sharding the data set.
Folding@home
As part of the client-server network architecture, the volunteered machines each receive pieces of a simulation (work units), complete them, and return them to the project's database servers, where the units are compiled into an overall simulation.
In 2007, Guinness World Records recognized Folding@home as the most powerful distributed computing network. As of September 30, 2014, the project has 107,708 active CPU cores and 63,977 active GPUs for a total of 40.190 x86 petaFLOPS (19.282 native petaFLOPS). At the same time, the combined efforts of all distributed computing projects under BOINC totaled 7.924 petaFLOPS.
SETI@home
Using distributed computing, SETI@home sends the millions of chunks of data to be analyzed off-site by home computers, and then has those computers report the results. Thus what appears an onerous problem in data analysis is reduced to a reasonable one by aid from a large, Internet-based community of borrowed computer resources.
Observational data are recorded on 2-terabyte SATA hard disk drives at the Arecibo Observatory in Puerto Rico, each holding about 2.5 days of observations, which are then sent to Berkeley. Arecibo does not have a broadband Internet connection, so data must go by postal mail to Berkeley. Once there, the data is divided, in both the time and frequency domains, into work units of 107 seconds of data, or approximately 0.35 megabytes (350 kilobytes or 350,000 bytes), which overlap in time but not in frequency. These work units are then sent from the SETI@home server over the Internet to personal computers around the world to analyze.
Data is merged into a database using SETI@home computers in Berkeley.
The SETI@home distributed computing software runs either as a screensaver or continuously while a user works, making use of processor time that would otherwise be unused.
Active users: 121,780 (January 2015)
PrimeGrid
PrimeGrid is a distributed computing project for searching for prime numbers of world-record size. It makes use of the Berkeley Open Infrastructure for Network Computing (BOINC) platform.
Active users 8,382 (March 2016)
MapReduce
A MapReduce program is composed of a Map() procedure (method) that performs filtering and sorting (such as sorting students by first name into queues, one queue for each name) and a Reduce() method that performs a summary operation (such as counting the number of students in each queue, yielding name frequencies).
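The quoted description maps directly onto code. Here is a minimal, single-machine sketch of the Map()/Reduce() pattern it describes (real MapReduce distributes the shuffle step across a cluster; the student example follows the quote):

```python
from collections import defaultdict

def map_phase(students):
    """Map(): sort students into queues - emit one (name, 1) pair each."""
    for name in students:
        yield (name, 1)

def shuffle(pairs):
    """Group values by key (handled by the framework in real MapReduce)."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce(): count the students in each queue, yielding name frequencies."""
    return {name: sum(ones) for name, ones in groups.items()}

students = ["Alice", "Bob", "Alice", "Carol", "Bob", "Alice"]
print(reduce_phase(shuffle(map_phase(students))))
# {'Alice': 3, 'Bob': 2, 'Carol': 1}
```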
How can we go about developing sharding approaches for Bitcoin?
We have to identify a part of the problem which is in some sense "invariant" or "unchanged" under the operations of DECOMPOSE and RECOMPOSE - and we also have to develop a coordination mechanism which orchestrates the DECOMPOSE and RECOMPOSE operations among the machines.
The simplistic thought experiment above outlined an "instant sharding" approach where we would agree upon a convention where the "send" and "receive" address would have to end in the same character - instantly providing a starting point illustrating some of the mechanics of an actual sharding solution.
BUIP024 involves address sharding and deals with the additional features needed for a complete solution - such as cross-shard transactions.
And distributed trustless Merkle trees would involve storing Merkle trees across a distributed network - which would provide the same guarantees of immutability, while drastically reducing storage requirements.
So how can we apply ideas like MapReduce and BOINC to providing massive on-chain scaling for Bitcoin?
First we have to examine the structure of the problem that we're trying to solve - and we have to try to identify how the problem involves a massive search space which can be decomposed and recomposed.
In the case of Bitcoin, the problem involves:
  • sequentializing (serializing) APPEND operations to a blockchain data structure
  • in such a way as to avoid double-spends
Can we view "preventing Bitcoin double-spends" as a "massive search space problem"?
Yes we can!
Just like Google efficiently searches hundreds of terabytes of web pages for a particular phrase (and SETI@home, Folding@home, PrimeGrid etc. efficiently search massive search spaces for other patterns), in the case of "preventing Bitcoin double-spends", all we're actually doing is searching a massive search space (the blockchain) in order to detect a previous "spend" of the same coin(s).
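As a toy illustration (an assumption-laden sketch, not a protocol design), a sharded "spend index" makes this search cheap: each new spend only has to be checked against the one shard it hashes into, never against the whole chain:

```python
import hashlib

N_SHARDS = 58
shards = [set() for _ in range(N_SHARDS)]   # each shard stores ~1/58 of the spends

def shard_key(outpoint: str) -> int:
    """Assign each spent output ("txid:index") to a shard by hashing it."""
    digest = hashlib.sha256(outpoint.encode()).digest()
    return int.from_bytes(digest, "big") % N_SHARDS

def record_spend(outpoint: str) -> bool:
    """Append a spend, rejecting double-spends.
    Only one shard of the "search space" is ever consulted."""
    shard = shards[shard_key(outpoint)]
    if outpoint in shard:
        return False                         # double-spend detected
    shard.add(outpoint)
    return True

assert record_spend("txid123:0") is True     # first spend accepted
assert record_spend("txid123:0") is False    # second spend of the same coin rejected
```

In a distributed deployment each entry of shards would live on a different machine, and shard_key would double as the routing function telling a node which machine to query.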
So, let's imagine how a possible future sharding-based architecture of Bitcoin might look.
We can observe that, in all cases of successful sharding solutions involving searching massive search spaces, the entire data structure is never stored / searched on a single machine.
Instead, the DECOMPOSE and RECOMPOSE operations (and the coordination mechanism) create a "virtual" layer or grid across multiple machines - allowing the data structure to be distributed across all of them, and allowing users to search across all of them.
This suggests that requiring everyone to store 80 Gigabytes (and growing) of blockchain on their own individual machine should no longer be a long-term design goal for Bitcoin.
Instead, in a sharding environment, the DECOMPOSE and RECOMPOSE operations (and the coordination mechanism) should allow everyone to only store a portion of the blockchain on their machine - while also allowing anyone to search the entire blockchain across everyone's machines.
This might involve something like BUIP024's "address sharding" - or it could involve something like distributed trustless Merkle trees.
In either case, it's easy to see that the basic data structures of the system would remain conceptually unaltered - but in the sharding approaches, these structures would be logically distributed across multiple physical devices, in order to provide virtually unlimited scaling while dramatically reducing resource requirements.
This would be the most "conservative" approach to scaling Bitcoin: leaving the data structures of the system conceptually the same - and just spreading them out more, by adding the appropriately defined mathematical DECOMPOSE and RECOMPOSE operators (used in successful sharding approaches), which can be easily proven to preserve the same properties as the original system.
Conclusion
Bitcoin isn't the only project in the world which is permissionless and distributed.
Other projects (the BOINC-based, permissionless, decentralized SETI@home, Folding@home, and PrimeGrid - as well as Google's permissioned, centralized MapReduce-based search engine) have already achieved unlimited scaling by providing simple mathematical DECOMPOSE and RECOMPOSE operations (and coordination mechanisms) to break big problems into smaller pieces - without changing the properties of the problems or solutions. This provides massive scaling while dramatically reducing resource requirements - with several projects attracting over 100,000 nodes, far more than Bitcoin's mere 6,000 nodes - without even offering any of Bitcoin's financial incentives.
Certain "legacy" Bitcoin development teams such as Blockstream / Core have been neglecting sharding-based approaches to massive on-chain scaling. Perhaps this is because their business models are based on misguided off-chain scaling approaches involving radical changes to Bitcoin's current successful network architecture, or perhaps because their owners, such as AXA and PwC, don't want a counterparty-free new asset class to succeed and destroy their debt-based fiat wealth. Meanwhile, emerging proposals from independent developers suggest that on-chain scaling for Bitcoin will be based on proven sharding architectures such as MapReduce and BOINC. So we should pay more attention to these innovative, independent developers pursuing this important and promising line of research into sharding solutions for virtually unlimited on-chain Bitcoin scaling.
submitted by ydtm to btc [link] [comments]

[META] What happened to /u/gavinandresen's expert flair?

I'm fairly sure Gavin used to have 'Expert' flair by his name on this sub, but recently it doesn't appear. Is this a mistake, or was it a decision made by the mods? If so, what was the reason for its removal?
I think most would agree he's an Expert on Bitcoin - even if they don't agree with his BIP101 proposal?
Criteria for being an Expert is here - https://www.reddit.com/Bitcoin/wiki/about_expert_flair
If there is ever any dispute among the experts as to whether someone should have the flair, the expert consensus on this matter will be interpreted and executed by the head admin.
submitted by chriswheeler to Bitcoin [link] [comments]

Dr Peter R. Rizun, managing editor of the first peer-reviewed cryptocurrency journal, is an important Bitcoin researcher. He has also been attacked and censored for months by Core / Blockstream / Theymos. Now he has been *suspended* (from *all* subreddits) by some Reddit admin(s). Why?

Dr. Peter R. Rizun is arguably one of the most serious, prominent, and promising new voices in Bitcoin research today.
He not only launched the first scientific peer-reviewed cryptocurrency journal - he has also consistently provided high-quality, serious and insightful posts, papers and presentations (on reddit, at conferences, and on YouTube) covering a wide array of important topics, ranging from blocksize, scaling and decentralization to networking theory, economics, and fee markets - including:
It was of course probably to be expected that such an important emerging new Bitcoin researcher would be constantly harassed, attacked and censored by the ancien régime of Core / Blockstream / Theymos.
But now, the attacks have risen to a new level, where some Reddit admin(s) have suspended his account Peter__R.
This means that he can no longer post anywhere on reddit, and people can no longer find his reddit posts simply by clicking on his user name (although his posts - many with hundreds of upvotes - are of course still available individually, via the usual search box).
Questions:
  • What Reddit admin(s) are behind this reddit-wide banishing of Peter__R?
  • What is their real agenda, and why are they aiding and abetting the censorship imposed by Core / Blockstream / Theymos?
  • Don't they realize that in the end they will only harm reddit.com itself, by forcing the most important new Bitcoin researchers to publish their work elsewhere?
(Some have suggested that Peter__R may have forgotten to use 'np' instead of 'www' when linking to other posts on reddit - a common error which subs like /btc will conveniently catch for the poster, allowing the post to be fixed and resubmitted. If this indeed was the actual justification of the Reddit admin(s) for banning him reddit-wide, it seems like a silly technical "gotcha" - and one which could easily have been avoided if other subs would catch this error the same way /btc does. At any rate, it certainly seems counterproductive for reddit.com to ban such a prominent and serious Bitcoin contributor.)
  • Why is reddit.com willing to risk pushing serious discussion off the site, killing its reputation as a decent place to discuss Bitcoin?
  • Haven't the people attempting to silence him ever heard of the Streisand effect?
Below are some examples of the kinds of outstanding contributions made by Peter__R, which Core / Blockstream / Theymos (and apparently some Reddit admin(s)) have been desperately trying to suppress in the Bitcoin community.
Peer-Reviewed Cryptocurrency Journal
Bitcoin Peer-Reviewed Academic Journal ‘Ledger’ Launches
https://www.coindesk.com/bitcoin-peer-reviewed-academic-journal-ledger-launches/
Blocksize as an Emergent Phenomenon
The Size of Blocks: Policy Tool or Emergent Phenomenon? [my presentation proposal for scaling bitcoin hong kong]
https://np.reddit.com/bitcoinxt/comments/3s5507/the_size_of_blocks_policy_tool_or_emergent/
Peter R's presentation is really awesome and much needed analysis of the market for blockspace and blocksize.
https://np.reddit.com/bitcoinxt/comments/3me634/peter_rs_presentation_is_really_awesome_and_much/
In case anyone missed it, Peter__R hit the nail on the head with this: "The reason we can't agree on a compromise is because the choice is binary: the limit is either used as an anti-spam measure, or as a policy tool to control fees."
https://np.reddit.com/btc/comments/3xaexf/in_case_anyone_missed_it_peter_r_hit_the_nail_on/
Bigger Blocks = Higher Prices: Visualizing the 92% historical correlation [NEW ANIMATED GIF]
https://np.reddit.com/bitcoinxt/comments/3nufe7/bigger_blocks_higher_prices_visualizing_the_92/
https://np.reddit.com/Bitcoin/comments/3nudkn/bigger_blocks_higher_prices_visualizing_the_92/
Miners are commodity producers - Peter__R
https://np.reddit.com/bitcoinxt/comments/3l3g4f/miners_are_commodity_producers_peter_
Fees and Fee Markets
“A Transaction Fee Market Exists Without a Block Size Limit” — new research paper ascertains. [Plus earn $10 in bitcoin per typo found in manuscript]
https://np.reddit.com/Bitcoin/comments/3fpuld/a_transaction_fee_market_exists_without_a_block/
"A Transaction Fee Market Exists Without a Block Size Limit", Peter R at Scaling Bitcoin Montreal 2015
https://np.reddit.com/Bitcoin/comments/3mddr4/a_transaction_fee_market_exists_without_a_block/
An illustration of how fee revenue leads to improved network security in the absence of a block size limit.
https://np.reddit.com/bitcoinxt/comments/3qana4/an_illustration_of_how_fee_revenue_leads_to/
Greg Maxwell was wrong: Transaction fees can pay for proof-of-work security without a restrictive block size limit
https://np.reddit.com/Bitcoin/comments/3yod27/greg_maxwell_was_wrong_transaction_fees_can_pay/
Networks and Scaling
Bitcoin's "Metcalfe's Law" relationship between market cap and the square of the number of transactions
https://np.reddit.com/Bitcoin/comments/3x8ba9/bitcoins_metcalfes_law_relationship_between/
Market cap vs. daily transaction volume: is it reasonable to expect the market cap to continue to grow if there is no room for more transactions?
https://np.reddit.com/bitcoinxt/comments/3nvkn3/market_cap_vs_daily_transaction_volume_is_it/
In my opinion the most important part of Scaling Bitcoin! (Peter R)
https://np.reddit.com/Bitcoin/comments/3l5uh4/in_my_opinion_the_most_important_part_of_scaling/
https://np.reddit.com/bitcoinxt/comments/3l5up3/in_my_opinion_the_most_important_part_of_scaling/
Visualizing BIP101: A Payment Network for Planet Earth
https://np.reddit.com/Bitcoin/comments/3uvaqn/visualizing_bip101_a_payment_network_for_planet/
A Payment Network for Planet Earth: Visualizing Gavin Andresen's blocksize-limit increase
https://np.reddit.com/Bitcoin/comments/3ame17/a_payment_network_for_planet_earth_visualizing/
Is Bitcoin's block size "empirically different" or "technically the same" as Bitcoin's block reward? [animated GIF visualizing real blockchain data]
https://np.reddit.com/btc/comments/3thu1n/is_bitcoins_block_size_empirically_different_o
New blocksize BIP: User Configurable Maximum Block Size
https://np.reddit.com/Bitcoin/comments/3hcrmn/new_blocksize_bip_user_configurable_maximum_block/
A Block Size Limit Was Never Part Of Satoshi’s Plan : Draft proposal to move the block size limit from the consensus layer to the transport layer
https://np.reddit.com/bitcoin_uncensored/comments/3hdeqs/a_block_size_limit_was_never_part_of_satoshis/
Truth-table for the question "Will my node follow the longest chain?"
https://np.reddit.com/bitcoinxt/comments/3i5pk4/truthtable_for_the_question_will_my_node_follow/
Peter R: "In the end, I believe the production quota would fail." #ScalingBitcoin
https://np.reddit.com/Bitcoin/comments/3koghf/peter_r_in_the_end_i_believe_the_production_quota/
Decentralized Nodes, Mining and Development
Centralization in Bitcoin: Nodes, Mining, Development
https://np.reddit.com/Bitcoin/comments/3n3z9b/centralization_in_bitcoin_nodes_mining_development/
Deprecating Bitcoin Core: Visualizing the Emergence of a Nash Equilibrium for Protocol Development
https://np.reddit.com/bitcoinxt/comments/3nhq9t/deprecating_bitcoin_core_visualizing_the/
What is wrong with the goal of decentralizing development across multiple competing implementations? - Peter R
https://np.reddit.com/bitcoinxt/comments/3ijuw3/what_is_wrong_with_the_goal_of_decentralizing/
Potentially Unlimited, "Fractal-Like" Scaling for Bitcoin: Peter__R's "Subchains" proposal
"Reduce Orphaning Risk and Improve Zero-Confirmation Security With Subchains" — new research paper on 'weak blocks' explains
https://np.reddit.com/btc/comments/3xkok3/reduce_orphaning_risk_and_improve/
A Visual Explanation of Subchains -- an application of weak blocks to secure zero-confirmation transactions and massively scale Bitcoin
https://np.reddit.com/btc/comments/3y76du/a_visual_explanation_of_subchains_an_application/
New Directions in Bitcoin Development
Announcing Bitcoin Unlimited.
https://np.reddit.com/btc/comments/3ynoaa/announcing_bitcoin_unlimited/
"It's because most of them are NOT Bitcoin experts--and I hope the community is finally starting to recognize that" -- Peter R on specialists vs. generalists and the aptitudes of Blockstream Core developers
https://np.reddit.com/btc/comments/3xn110/its_because_most_of_them_are_not_bitcoin/
It is time to usher in a new phase of Bitcoin development - based not on crypto & hashing & networking (that stuff's already done), but based on clever refactorings of datastructures in pursuit of massive and perhaps unlimited new forms of scaling
https://np.reddit.com/btc/comments/3xpufy/it_is_time_to_usher_in_a_new_phase_of_bitcoin/
Peter__R on RBF
Peter__R on RBF: (1) Easier for scammers on Local Bitcoins (2) Merchants will be scammed, reluctant to accept Bitcoin (3) Extra work for payment processors (4) Could be the proverbial straw that broke Core's back, pushing people into XT, btcd, Unlimited and other clients that don't support RBF
https://np.reddit.com/btc/comments/3umat8/upeter_r_on_rbf_1_easier_for_scammers_on_local/
Peter__R on Mt. Gox
Peter R’s Theory on the Collapse of Mt. Gox
https://np.reddit.com/Bitcoin/comments/1zdnop/peter_rs_theory_on_the_collapse_of_mt_gox/
Censorship and Attacks by Core / Blockstream / Theymos / Reddit Admins against Peter__R
Peter__R's infographic showing the BIP 101 growth trajectory gets deleted from /bitcoin for "trolling"
https://np.reddit.com/btc/comments/3uy3ea/peter_rs_infographic_showing_the_bip_101_growth/
"Scaling Bitcoin" rejected Peter R's proposal
https://np.reddit.com/bitcoinxt/comments/3takbscaling_bitcoin_rejected_peter_rs_proposal/
After censoring Mike and Gavin, BlockStream makes its first move to silence Peter R on bitcoin-dev like they did on /bitcoin
https://np.reddit.com/bitcoinxt/comments/3syb0z/after_censoring_mike_and_gavin_blockstream_makes/
Looks like the censors in /bitcoin are at it again: Peter_R post taken down within minutes
https://np.reddit.com/bitcoinxt/comments/3tvb3b/looks_like_the_censors_in_rbitcoin_are_at_it/
I've been banned for vote brigading for the animated GIF that visualized the possible future deprecation of Bitcoin Core.
https://np.reddit.com/bitcoinxt/comments/3nizet/ive_been_banned_for_vote_brigading_for_the/
An example of moderator subjectivity in the interpretation of the rules at /bitcoin: animated pie chart visualizing the deprecation of Bitcoin Core
https://np.reddit.com/bitcoinxt/comments/3osthv/an_example_of_moderator_subjectivity_in_the/
"My response to Pieter Wuille on the Dev-List has once again been censored, perhaps because I spoke favourably of Bitcoin Unlimited and pointed out misunderstandings by Maxwell and Back...here it is for those who are interested" -- Peter R
https://np.reddit.com/btc/comments/3ybhdy/my_response_to_pieter_wuille_on_the_devlist_has/
To those who are interested in judging whether Peter R's paper merits inclusion in the blockchain scaling conference, here it is:
https://np.reddit.com/btc/comments/3td6b9/to_those_who_are_interested_in_judging_whethe
The real reason Peter_R talk was refused (from his previous presentation) (xpost from /btc)
https://np.reddit.com/bitcoinxt/comments/3uwpvh/the_real_reason_peter_r_talk_was_refused_from_his/
[CENSORED] The Morning After the Moderation Mistake: Thoughts on Consensus and the Longest Chain
https://np.reddit.com/bitcoin_uncensored/comments/3h8o50/censored_the_morning_after_the_moderation_mistake/
Core / Blockstream cheerleader eragmus gloating over Peter__R's account getting suspended from Reddit (ie, from all subreddits) - by some Reddit admin(s)
[PSA] Uber Troll Extraordinaire, Peter__R, has been permanently suspended by Reddit
https://np.reddit.com/Bitcoin/comments/407j77/psa_uber_troll_extraordinaire_upeter_r_has_been/
submitted by ydtm to btc [link] [comments]

Obvious in hindsight: Consensus rules should've never been tied to dev teams

XT lobbed the ball, then BU knocked it out of the park.
The reason XT pissed so many people off is that many had assumed Core's tying of the blocksize settings to its trusted codebase was the only thing keeping the market of users from doing something stupid. Core effectively said to users, "You cannot use our trusted code without accepting the blocksize we decree." Like theymos, Core seeks to use its influential position to manipulate the emergence of consensus on what Bitcoin is - for the common good, of course.
By not making the blocksize settings configurable, Core arrogates to itself the power to dictate consensus parameters. Users have to mod the code if they want to change those parameters, which is a high enough hurdle to create a powerful Schelling point at 1MB or whatever Core decides. This is nice if you don't trust the market, though at the cost of concentrating ever more power in the Core team, opening Bitcoin to attack.
XT wrecked this paradigm. No longer could Core rely on users trusting "the experts" to determine consensus settings like blocksize, because there were now two sets of experts, both with widely respected members in their camp. Not only that, XT made it so that users could use the trusted Core codebase* without having to follow Core's diktats on blocksize. Core's imperative of shepherding the masses to ensure they don't mess up had been undermined, and its power position was in some jeopardy.
However, XT was not the full solution. XT was still stuck in the mindset of pushing the user into the XT devs' chosen blocksize settings. They still had illusions of controlling the consensus rules via the "wall of inconvenience" posed by locking down the blocksize cap (since many users are unable to mod the code to change it themselves). XT simply added a second option; the user was still forced to choose blocksize settings as a package deal with their choice of dev team:
While superior to the situation before XT, where the choice was Core@1MB or "roll your own and do your own testing" (a huge barrier to the establishment of a Schelling point around which consensus could coalesce), having only Core and XT as options for blocksize still left major friction in the market's selection process. The market still could not choose its ideal blocksize settings freely, as there were only two choices and they were tied to specific dev teams.
It would be better to "let a thousand implementations bloom," people reasoned. This was surely the logical endpoint of the movement XT started. That way there would be free choice. The problem of those choices about blocksize being tied to the specific dev team would have to persist as a necessary evil, even if mitigated by the variety.
Enter BU. By making blocksize settings user-adjustable, Bitcoin Unlimited untethers this controversial consensus parameter from the Core dev team's trusted software offerings (as well as from the XT dev team's offerings, once someone releases an "XT Unlimited" client).
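Concretely, BU exposes the limit as ordinary node settings rather than hard-coded constants. A sketch of what the relevant configuration looks like (the parameter names `excessiveblocksize` and `excessiveacceptdepth` are the ones Bitcoin Unlimited used; the values shown here are illustrative only, not recommendations):

```ini
# bitcoin.conf (Bitcoin Unlimited) - illustrative values only
# Largest block this node will consider non-excessive, in bytes:
excessiveblocksize=16000000
# Number of blocks that must be built on top of an "excessive" block
# before this node accepts that chain anyway:
excessiveacceptdepth=4
```

Each node operator picks their own values, and the network's effective limit emerges from the overlap of everyone's settings rather than from any dev team's release.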
The fact that the blocksize cap is controversial, previously used by Core as an excuse to take no action and keep it locked down, is revealed as the very reason it must NOT be locked down: precisely because it is controversial, it is too dangerous to be packaged in with the Core software. BU's stance is that no one should have that power. That power throws a monkey wrench into the market's process of finding consensus by jiggering the results. The true logical endpoint of the movement started by XT is Bitcoin Unlimited's approach of unbundling the consensus rules from each dev team's offerings.
If Core won't do it in its own releases, someone else will do it for them, draining power away from Core. That is fine with me, but Core may want to consider the futility of their approach if they have any interest in maintaining their dominant position.
In his classic presentation Silicon Valley's Ultimate Exit, Balaji Srinivasan says,
If a company or a country is in decline, you can try voice, or you can try exit. Voice is basically changing the system from within, whereas exit is leaving to create a new system, a new startup, or to join a competitor sometimes. Loyalty can modulate this; sometimes that's patriotism, which is voluntary, and sometimes it's lock-in, which are involuntary barriers to exit.
Gavin, Mike, and Jeff tried voice. Then they tried exit, via XT. They ran into loyalty, inertia, and trust issues as forms of lock-in, but as fees rise, transaction traffic jams happen, and altcoins rise menacingly, the lock-in at Core would likely be overwhelmed. Core would hemorrhage users as people jumped reluctantly to XT.
However, there is an easy way for Core to prevent this: stop barring users from selecting blocksize parameters. Let them choose their favorite BIP, at least. That way the market can come to a consensus independently of Core, and users whose only beef is with the blocksize settings don't have to leave Core. If Core is right that they are giving the market what it wants, they should have no qualms about doing this.
If Core refuses to unlock its blocksize settings, BU and projects like it will steal users away with impunity. And once they're gone - who knows - they may find other reasons Core is not as trustworthy as they thought.
BU is the ball leaving the park, the horse leaving the stable, and a shot across Core's bow on behalf of the Bitcoin community.
*While XT has some other changes that have been controversial, there is also a "big blocks only" version that is just Core+BIP101.
submitted by ForkiusMaximus to btc [link] [comments]

What happened to BIP 102?

About a year ago the blocksize debate was not so much about whether we should scale (most agreed we should), but about how much we should scale. Proof:
https://www.reddit.com/Bitcoin/comments/3id47g/can_anyone_summarize_bip100_bip101_and_bip102/
https://www.reddit.com/Bitcoin/comments/3djtko/bip_102_increase_block_size_limit_to_2mb_on_nov/
https://docs.google.com/spreadsheets/d/1Cg9Qo9Vl5PdJYD4EiHnIGMV3G48pWmcWI3NFoKKfIzU/edit
https://en.bitcoin.it/wiki/Block_size_limit_controversy#Entities_positions
Note that an overwhelming majority was in favor of a blocksize increase. But right now, the issue seems largely ignored on reddit and bitcointalk. Worse still, it seems no action at all is being taken, even though in early 2016 the general consensus was that an increase was urgently needed. What happened?
submitted by zimmah to Bitcoin [link] [comments]

Providing information on how we got to this point

Given the recent events involving the thermos who rules with an iron cap, I am depressed by how little I can trust any bitcoin information now. For anyone who doesn't know, the major discussion/media outlets bitcointalk and /bitcoin are under the control of someone called theymos, who many mockingly refer to as "thermos". This is the problem right now: many people do not know the events that have transpired, myself included. This also includes anyone who gets introduced to bitcoin in the future; they get introduced to the theymos-censored bubble of misinformation.
Now to my main point. I do not really know what is going on. XT, core, BIP100, BIP101, etc. have varying characteristics depending on where you get your information. Hell, I don't even know what happens to my bitcoin when there is some inevitable fork. I don't even know what is going on with XT running nodes right now. Frankly I am lost, and it is my own fault for not spending more time to form an educated opinion. Bitcointalk seems to have the most fleshed out discussions addressing pros/cons of different approaches, but it is the info that theymos lets us see.
What I need is a reasonable chunk of pertinent information for a layman to understand the reasons and consequences of changing the protocol at this juncture. I am talking about fresh information without thermos bias (unlike bitcointalk, /bitcoin, and the bitcoin wiki).
The stickied post should be updated to be full of information from many perspectives, so a new or out-of-touch user can get up to speed and make an informed decision. Maybe a good relationship with /bitcoinbeginners will help. I know this boils down to me complaining that learning is too hard, but I am sure there are many like me...
submitted by cameroon16 to bitcoin_uncensored [link] [comments]

Misconceptions about opposition to BIP101

Almost everyone wants the system to grow and scale. Many support moderate blocksize increases, but oppose excessive increases of 8,000x. For example, I would be, and always have been, happy to tolerate BIP100, BIP102 or BIP103. I oppose the 8,000x increase.
Almost everyone wants the system to grow and scale. Moderately increasing the blocksize limit is a key part of the scaling process. I want the network to grow and volume to increase.
Many people want a working and competitive fee market to eventually develop, consistent with Satoshi's comment in the whitepaper that "the incentive can transition entirely to transaction fees". This does not necessarily mean higher fees; fees could actually fall.
Firstly, we need to prudently plan for the worst-case attack scenario, and therefore assume full blocks when assessing whether the network can handle larger blocks. Secondly, if bitcoin is successful and useful, then yes, there could be huge demand for transactions that fill blocks up. Please consider both the decisions of individual miners and the mining industry as a whole. Even if it's against the interests of the mining industry as a whole to fill up blocks, the Nash equilibrium decision of each individual miner could be to fill up blocks to maximize their own short-term revenue.
The impact of changes to the limit on total fee revenue depends on the price elasticity of demand. Increasing the limit can reduce mining revenue, in some circumstances. This image from Wikipedia nicely illustrates this: https://en.wikipedia.org/wiki/Price_elasticity_of_demand#/media/File:Price_elasticity_of_demand_and_revenue.svg.
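As a toy illustration of this elasticity point (a constant-elasticity demand curve with made-up numbers, not measured Bitcoin data), consider:

```python
# Toy constant-elasticity demand model: quantity q = k * p**(-e).
# If demand is inelastic (e < 1), a lower fee (e.g. from a capacity
# increase) reduces total revenue p*q; if elastic (e > 1), it raises it.
# All numbers are illustrative, not estimates of real fee demand.

def revenue(price: float, k: float, elasticity: float) -> float:
    quantity = k * price ** (-elasticity)
    return price * quantity

k = 1000.0
for e in (0.5, 2.0):
    r_high_fee = revenue(price=2.0, k=k, elasticity=e)
    r_low_fee = revenue(price=1.0, k=k, elasticity=e)
    label = "inelastic" if e < 1 else "elastic"
    print(f"e={e} ({label}): fee 2.0 -> revenue {r_high_fee:.0f}, "
          f"fee 1.0 -> revenue {r_low_fee:.0f}")
```

With e=0.5 (inelastic), halving the fee cuts revenue; with e=2.0 (elastic), halving the fee roughly doubles it. This is exactly why the revenue effect of raising the limit cannot be determined without knowing the elasticity.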
Most small blockers both hope and believe that technological improvements will occur. However, it is important to be prudent and not assume too much technological improvement before it occurs. The future is never certain. At the same time, a high rate of technological improvement actually means propagation costs fall over time, and Peter R's theory that mining fees can be driven by propagation-risk costs becomes less relevant.
The concern that nodes cannot literally keep up with the network is only a small part of it. There is a much wider set of concerns about nodes being able to operate under BIP101: for example, the time for a new node to catch up when first switched on, the high cost and effort required to run a full node, the time taken to verify an attack transaction or attack block, running a node behind Tor, propagation time as a proportion of target time, etc.
BIP102 and BIP103 have code.
I do not think this claim even merits a response
submitted by jonny1000 to Bitcoin [link] [comments]

Nyancoins Megapost - Central Link Collection

Edit: Going to finally start an overhaul on this (April 23rd, 2016); it's been six months since the last edit. I'm going to go from current back, so there's going to be a gap between this top, new stuff and what's below until I finish the update.
I'm just going to have the last six months all shoved together into one large update here. There's weak categorization, but basically just think of it as a huge list. In general, the newer items will be higher within a given category than the older items. I apologize if I left anything out which people would like to see included. Some things I considered more of a temporary update than something relevant months later, but just PM me and I'll add anything requested!
We're currently in a quiet low point. Nothing catastrophic is happening, but we are relatively weak. I call it "the best nadir" because if this is as bad as it gets, we're doing alright. The price is down to 4 satoshi now, which is the lowest sustained price since the beginning of the revival. I'm going on a year behind my original goal for releasing NYAN2, still stuck on a new build computer (or, alternately, the time and energy to cobble together a build system out of what I have available).
One major new element: I've set a goal for us to have a mission to visit the site of Apollo 17 in twenty years. This is basically a new dimension. For the first ten years, I envision this as a purely "paper program", doing research on past space programs, in particular Mercury through Apollo, but any and all launch platforms and spacecraft which have been done. We may additionally seek to gain additional education (for instance, I would like aerospace engineering and material science undergraduate degrees at a minimum; we also are going to need experienced test pilots).
Space Program Initial Vision: [NYAN 2035] We must send a mission to visit the site of the Apollo 17 plaque on the Moon
Also, I've replaced the previous "Nekonauts of the Month" competition with a "Who Wants to be a Nillionaire?". The major difference is that rather than relying upon me to track everything, the expectation is that Nekonauts will sign up and self-report accomplishments.
Nyan Projects
[Hype] Browser based MMORPG accepting Nyancoins for member items: KojoSlayer's latest foray into nyan video game development! I've seen an early preview and it reminds me of a primitive Runescape (meant as a compliment)
Fun Posts
Insert NyanDisk 1 into Drive A:: NyanDOS!
Nyan like it's 1999 ....: telnet into nyan!
[breaking news] Nyancoins will be bought out by Garza in a last-ditch attempt to save Paycoin - April fool's post
Trumpchain on Twitter: "It can happen. Our blockchain has tremendous potential. We have tremendous people. #MakeTheBlockchainGreatAgain" - Terrific shitpost; really fantastic!
Join the Nekonauts today! - Cool nyan poster
"I really hope Satoshi is finally dumping and declaring that, like, Nyancoin is the true bearer of his vision." - CountOneInterrupt - My favorite idea ever
Nyancoin Zen - So cute. This may be my favorite nyan image ever for its understatement and beauty.
High Definition Nyan up close - Amusing
Making PC more Nyan-Friendly! - cute; amusing. Such nyan!
Typical Nyancoiner breakfast. - DobbsCoin is great with this stuff!
[meta] [misadventures of coinaday] [Pizza Boy Adventures] Late Night Pizza - Just a little choose-your-own-delivery I wrote during my stint as a pizza delivery boy.
I don't know how I wasn't aware of this site before - I still can't believe there's an entire site for this!
Get NYAN
Want more NYAN? Faucet Mrai and trade to me for NYAN (and then hodl!): What it says on the tin. The faucet is down temporarily at the time of this writing, but it'll likely be back up before I update this section. The price offered there is low (the mailing list mentions 200-300 satoshi currently; my offer is worth about 2 satoshi currently); I would consider higher, but probably wouldn't pay those apparent market rates (no actual exchange yet).
Force Multipliers
Content about the difference a determined person can make. Intended as inspiration.
[Force Multiplier] [Original Content] [pdf; 23 pages] Archimedes and the Siege of Syracuse - Previously unpublished paper I wrote for a history course in college.
[Force Multipliers] [Military History] Julius Caesar's Greatest Military Victory (Video; 10 minutes) - An explanation of achieving victory in an apparently unwinnable situation.
[Force Multipliers] [Naval History] Korea: Admiral Yi - I: Keep Beating the Drum - Extra History - Incredible loyalty and dedication from this greatest Admiral saved his country
Philosophy
Content which fits the themes of fun, self-improvement, and service to others.
Wikipedia essay: WikiLove - I think Wikipedia's policies are in a lot of ways something to look up to. It's true that they're stuck in bureaucracy now, and have driven away many experts, but they function and their policies have helped to give some structure to the anarchy.
[US history and macroeconomics] [59 minute video] Thom Hartmann, "The Crash of 2016" - Interesting video. I think the predicted outcome is something of a longshot, but it's interesting to me that he called Sanders as a major factor in the election years ago.
Taylor Mali, "Words and Their Consequences" (68 min video) - Poetry and philosophy
We Are One - Didn't get any attention at the time, but this is a general statement about the power of people working together.
A Message of Hope for the World - What's the point of Nyancoins? To inspire people.
A brief word on censorship - tl;dr: Censorship is bad, m'kay?
Who Owns Nyancoins? - Hodlers.
To The Moon is Not Enough: 100 Year Planning - About the importance of an unlimited time horizon. We build to last.
General
Catch-all category. Okay, this category got out of hand. I should do a second round later and break this out into a few different ones.
The best argument I've heard so far for keeping the 1MB cap in Bitcoin - I still think it would have been better for Bitcoin to grow, but this is the strongest argument for its stagnation in capacity that I've seen.
[conceptual design] How we should expect 100,000 transactions in a minute (or second?) to be handled - This is about the idea that we should expect to be able to handle large loads without crashing. Pretty basic. Related to an /cryptocurrency post I'd made: 100,000 Transactions Per Second: How Do We Get There?, which gives a very high-level overview of one way to reach high throughput capacity using blockchains.
Interesting cryptocurrency to try: raiblocks, protocol without transaction fees or block rewards - I think Raiblocks will be a valuable "companion coin" to Nyancoins ultimately. I don't know how exactly that'll work, but I believe that good cryptocurrency communities should make alliances. If nothing else, we can be valuable to each other as the "loyal opposition", critics who want to see success.
Coin-a-Year: Nyancoin : link to /CryptoCurrency post - Summary of the first year or so of NYAN revival
[far future concept] Nyanshares, Nythereumbits, and all-in on 37 rainbow - A double post: first part describes a possible spin-off, hybrid, 'companion coin' we could make in future years. The second part talks about what a gamble NYAN is.
NyanCoin compilation guide and downsizing nyan.space / NyanChain [semi-meta] - Has a link to a guide for compiling nyancoind on servers.
[meta] [misadventures of coinaday] Stuck in the Dihydrogen Monoxide - Another in a series of coinaday posts proving "play stupid games; win stupid prizes"
[Data] Faucet Stats - KojoSlayer's faucet stats
Thing to do a thing that can't do that thing.... - Bit of code for pulling BTC/NYAN feed from Cryptopia.
Fresh builds, coming up! - initial report from vmp32k on attempting to modernize the codebase
DigiShield - suggestion for different difficulty algorithm
BIP101 implementation to be made available for altcoins - prohashing announcing that they will have a Scrypt BIP101 implementation; this is planned to be our base for NYAN3
Year 1: Acquisition and Triage ; Year 2: Acquisition and Build - Optimistic; in reality, year two of the revival has largely been me just trying to survive. Hopefully more acquisition and build as the year goes on.
[technical] [financial] Price Stability and Consistent Hashing - Basic theory. If we have consistent prices, we'll have more consistent hashing.
[technical] [forking] [NYAN3] Should running old defaults be considered a vote against a hard fork or should the veto need to be explicit? / General voting discussion - What it says on the tin. I haven't gotten feedback on this yet. It's far in the future, but I think it's a critical question. I'm not sure which way is correct.
2015 in review: overview - Initial summary of the previous year; written before the Coin-a-Year post which did similar
[meta] [finance] [misadventures of coinaday] Paying Debts - Since writing this, I've gone further into debt. I need to get my personal finances together this year, for my own sake, for the sake of those I owe, and for the sake of Nyancoins.
Countdown to the Second Halving - The current block is 1168851 as I write this; we've got less than 350,000 more blocks until the third halving!
I updated the major risks page for Nyancoins to include mention of the fork bug and 'time warp'. Please review and comment. - bolded for visibility; I consider the risks document and making sure that we inform potential buyers as much as possible to be a critical requirement for us
[technical] [security] Time warp, fork bug, disclosure policies, and practical results: a working system despite flaws - Discussion of the success of Nyancoins as a working system despite its technical vulnerabilities.
Zero Fees (*) - Discussion of the role of zero fee transactions and why I consider them important
[finance] [meta] [Misadventures of coinaday] overdrafts and consequences / Cryptopia 1sat Dump - Discussion of my stupidity and its consequences on Nyancoins' financial health
[technical] NIP 1: Base NYAN3 on XT - I consider this critical. We will make a statement about not following the path Bitcoin is currently going down. This is not urgent for us because our activity is so low, but it will be part of building a strong foundation for the future.
Hodling Update: 30% - I haven't done the math recently. I'm probably within 5% of this, but I don't know if I've gone up or down. I haven't given away a whole lot, but I have put no new money into Nyancoins for months from being so broke. I've still gained some millions more from when my 5 satoshi bids got hit though.
[finance] Up? Down? Horizontal? - Considering 30 - It's pretty sad how far we are from 30 satoshi now (4 satoshi at the moment). I believe we'll get it back ultimately, but the revival certainly hasn't had the financial success I'd hoped.
Dice soft launch - Not sure of the current state here. Check with KojoSlayer.
State of the NYAN October 2015: An interlude for gratitude and yearning for more - I should get back to doing these monthly eventually. Right now it's quiet enough that there doesn't seem to be a real need.
[financial] NYAN vs DOGE as a long-term store of value - What it says. I believe that the lower supply inflation and smaller supply of NYAN will ultimately lead to NYAN trading above DOGE (currently trading at less than 10:1).
[finance] [stats] [gaming] Breaking the Bank: Risk-of-Ruin, Dice Games, and Basic Logic - I'm pretty proud of this one. By having more money than god, and a screwed up default max bet rule, I was able to beat the house. 8-)
100M - Talking about the remaining supply and the implications.
I think I'm done with this update (at least getting the new content in; I have not changed the old text and content, which is everything below).
Since I can only have one thing stickied at a time, but there are a lot of different things going on, I've switched over to having one main link collection post. And this is it.
I'll update this periodically (I'll try to do a major update once a month) and might replace it at some point. It'll have general discussion of the context behind why these various threads are significant.
I'm doing August and September together for Nekonaut awards and updates here since I got a bit busy at work. NYAN2 is released as a first-draft, but I haven't built it yet (nor done final changes and fixes). I need a computer with more RAM than what I have available to me now. However, I'm quite satisfied with the performance of NYAN1.2, ancient though it may be, so I'm not treating it as an emergency.
The biggest news is that we are now listed on cryptopia.co.nz ! They are a great community and provide better ecosystem support than most exchanges: they include a pool and explorer along with the exchange. And their exchange has a lot of basepairs, with NYAN/BTC, NYAN/UNO, NYAN/DOGE, and NYAN/DOT being relatively active, NYAN/LTC being quiet, and the other two (popularcoin and feathercoin) being unfamiliar to me and generally unused.
Oh, also, when I've taken a look at it, the Nyanchain seems to be running smoothly. I haven't been watching too closely, but the status page is usually showing all green. I especially like seeing the high number of connections (generally close to 30). [Comment from July version; still accurate. I should get automated metrics on the Nyanchain someday, but in the meantime, it seems to be moving pretty smoothly anecdotally.]
Top stories from August and Septemberish
Nekonauts of the Months, August and September 2015 - Combined awards, three awards for 1M as a result, and such. Just check it out. :-)
New IRC channel and tipbot - This came about during the listing process; we are now at #nyan2
WE ARE LIVE! Cryptopia added us just now!! - Culmination of the process of getting listed on Cryptopia. After leading in user votes and DOT votes after the first couple days, the admins decided to add us. So as I count it, we won three votes. :-)
Looking good on Cryptopia so far - My early reaction to the exchange.
The past few days. - Repost of a classic, which is always a good idea in NYAN, given our rich archives.
Miners We Need YOU! - Brief discussion by KojoSlayer about the importance of miners to the Nyancoin ecosystem.
Nyancoind Dockerfile (for the tech-nyans) - Cool demo by vmp32k
Nyancat all up on your Vim command line. - Cool xpost from /vim.
[financial] I hit a positive balance on Cryptsy-NYAN again - I started buying on Cryptsy. I've since withdrawn from Cryptsy and am working on eliminating my balances there, but I've got a lot of altcoins to consolidate yet.
100M - A discussion of the remaining supply to be generated (now under 100 million more coins)
Top stories from July
Gitian Build Instructions - !!! This is exactly what I was trying to figure out. With this roadmap, we should be able to help others build *coins with gitian as well as provide a solid introduction to our own community members. This should be linked and submitted for feedback elsewhere; I should report back to the Litecoin thread with a link to this for discussion. I cannot overstate how important I find this contribution.
Nekonauts of the Month, July 2015 - Still going with this. I may not always get this perfect, but I hope that it will help add some motivation and recognition to the community who is building the next generation of Nyancoins.
Ɲyancoins for Nekonauts! [designs] - Some logos and concept art; a start by W7phone; we hope to see more of this type of thing!
[hypothetical] What would it take for us to be able to start our own Nyan exchanges? - tl;dr: Let's get setup on some decentralized exchanges!
Linux Nekonauts: Building nyancoind - I should get this in the sidebar somewhere. An excellent first post by gentlenyan !
Top stories from June
Nekonauts of the Month, June 2015 - Latest round of awards; I plan to keep doing this each month for as long as I can
[community] You are a leader of Nyancoins / Herding Cats: Leading Leaders; Leadership in a Decentralized Community - A discussion of the importance of you to the success of Nyancoins
vmp32k launches a beta of a faucet - When is this going live?
kojoslayer launches a faucet
Various posts on mining being stuck - we are still a bit spotty, but it seems like it might be a bit better. We could use something more than just an instantaneous status page; if someone wants to make something which does statistical analysis of the performance of the nyanchain, that would be awesome.
Broke through the 40 satoshi ceiling, and Plagiarizing great speeches in history and claiming to have a community mandate: Coin-a-Day writes inspirational pap as we stand on the verge of breaking through the 50 satoshi ceiling and envisions the glorious future ahead - and rather more. The price dipped back down on Cryptsy since, but we had a nice rise for a while. I'm hoping that when we get an exchange we have confidence in, we'll see more buying again.
Warning: Cryptsy does not process large NYAN withdrawals - This is why I recommend not using Cryptsy; plus this
Top stories from May
First off: Ɲyancoins needs YOU! - This is a discussion of how all of us have something we can do for Nyancoins, and how improving your own life is absolutely one of those things.
Nekonauts of the Month, May 2015 - This is my first month running this competition. I'm looking to recognize people who are active and contributing to the community and to give them NYAN to help further whatever they'd like to do next.
The network is stable! - Thanks to a new miner, spydud22, we are showing all green on status!
Wow, very large chunk of NYAN at 40 satoshi (6 million) - The title is outdated; there's about double this volume now. [Edit: And now the title is accurate again.]
Initial notes and thoughts on the Nyancoins client update - I've identified the approximate version of Litecoin that Nyancoins is based on and looked at a diff. It looks reasonable and do-able. I haven't yet looked at the latest branch on which I'll apply these changes.
Nyancoins 2.0
https://github.com/mathwizard1232/nyancoins/tree/nyancoins2 - first draft of NYAN2
(intentional duplication from top stories for July; I consider it that important): Gitian Build Instructions - !!! This is exactly what I was trying to figure out. With this roadmap, we should be able to help others build *coins with gitian as well as provide a solid introduction to our own community members. This should be linked and submitted for feedback elsewhere; I should report back to the Litecoin thread with a link to this for discussion. I cannot overstate how important I find this contribution. - earlier working notes
Cross-platform Gitian builds - Discussion about getting Gitian builds to work for Mac without access to a Mac.
Initial notes and thoughts on the Nyancoins client update - Right now I haven't had time to do much more on this, but I need to work on doing the Litecoin gitian build yet.
Gitian Build - jwflame's initial notes on trying the gitian build
DLC
Distributed Library Coin; stealing^Wrepurposing the ideas of others - Introducing the concept; basically a virtual lending library for the community; Learned Optimism is offered.
[DLC] Siege of Earth - Second post, offering Siege of Earth, a classic sci fi tale
Minecraft
[Idea] Minecraft NyanCoins - KojoSlayer is making a cool Minecraft Nyancoins faucet sort of thing (get Nyancoins for playing Minecraft).
[Sneak Peak] Nyancoin Minecraft Server - This project is moving forward quite quickly! See also /NyanCoinsMC for more information.
[Beta] Launch Nyancoins Minecraft Server : NyanCoinsMC - BOOM! I'm amazed at how quickly this has gotten setup. Go check it out!
Background / theory
Overview of major risks of buying Nyancoins - I've tried to collect every risk I could think of in this one place. This is important reading before investing.
Nyan's core principles and why they matter
draft one of Cold Storage 101: How to secure your coins for long-term hodling - I need to incorporate the suggestions still, but between the article and the comments, this is decent.
I will work harder: in which Coinaday reports for duty - My statement that this is going to a new level for me: I'm considering this my dream job now, rather than just my hobby. I'm dedicating myself to serving this community as best I can.
[community] You are a leader of Nyancoins / Herding Cats: Leading Leaders; Leadership in a Decentralized Community - This is a discussion of the importance of each individual, in particular you, to this revival.
A really good read about fiduciary duties in running an exchange - discussion of the responsibility one takes on in managing money for others
[rant] In response to "there is only BTC [and maybe LTC [and maybe DOGE]] AND DEFINITELY NOTHING ELSE MATTERS" - Possibly amusing rant.
My most worthless and most valuable coins: Comparing DIME and 42 - A discussion about interpreting spot price in context
[theory] Bitcoin discussion of hard forks - Talking about the risks involved with a hard fork
Rooting for LTC's Rally to Hold: Nyancoins and the Cryptocurrency Market - Nyancoins do not stand alone. Although it's easy to see the rise of another cryptocurrency as weakening us, because we might trade lower against them temporarily, I believe that a stronger CryptoCurrency market as a whole will be important for our long-term health.
[financial] Cryptocurrency valuation models: Considering Nyancoins as a zero-coupon bond against the community
Classic Posts
Why Nyancoin will hit $1/NYAN (and much more). We're going to space, and you're invited! - This is an infamous post by americanpegasus. I believe it was actually someone mocking him in /bitcoin by linking to this post that first made me aware Nyancoins existed, and a post and comments on the sub at the time gave me the idea it was a deadcoin. So the dream of this post was bold enough to bring the coin back from the grave: it was bold enough to be mocked, that mockery eventually led me to investigate it, and that investigation led me to fall in love.
1Ɲ >= 1Đ - This is a vision I have, that we shall rise above DOGE. This is not a dig against DOGE but merely a statement about the growth I expect to see us have. There are about 500x as many Dogecoins as there are Nyancoins, so even if we remain significantly smaller we can easily pass their unit price. We've done so briefly previously but are currently below this mark.
We choose to go to the Moon - This is my manifesto about why I am doing this. Cribbed from JFK's moon speech, it is meant to express that it is because of, not in spite of, the challenges that we face that I am here. This started out as a personal challenge. While I certainly would like to get rich off of this, the reason I chose to pursue this is that if we succeed, then we're awesome badasses that people can be impressed by.
The original Nyancoins intro video - wasn't really sure where to categorize this
Older stories
I'll move stories down here as they get older. For now it's the block stoppage stuff as that seems to have stabilized.
Holy shit, 22 hours since the last block. At this rate, I'm going to have to start solving hashes by hand... - This was my post about the block stoppage.
Difficulty has spiked again; if we hit another stall I'll try the transaction fee trick again - Another block stoppage, and a record of my attempt to use the same trick to break it loose again (transaction fee incentive).
I'm ready to give up on life; in which coinaday finally has his full-blown mental breakdown. So long, and thanks for all the rainbows! - My personal mental breakdown. Just listed here because it made an impact. Also, it was an amazing response from the community which meant a lot to me.
Fuck it; encore une fois - My reaction afterward, saying that I'll give things another shot.
GFS
Disregard the below: GFS has been down for a few months and probably won't be back. At one point, this project had been offered to me, and perhaps I should have taken it, but I felt like I was already heavily committed here and couldn't take that on as well. It's a shame that no one managed to keep it running though. I really liked the idea.
Disregard the below: it's back down again, last I checked. Not sure what to link on that. The new bot got mildly political again / referenced being a shadowbanned user, and bam. I'm not sure where this is going to go now, if anywhere. Although I suppose the on-blockchain stuff isn't affected, and I'd wager go1dfish will do something again.
/GetFairShare will be attempting another distribution today; go try it out! - GetFairShare is back! Go get free money!
I don't really understand what's going on, but apparently the bot used for /GetFairShare got banned - Some background on GFS having gone down
I think that this will continue to be useful as we gain a larger and larger volume of posts and help me not have to worry about burying something significant posted a couple weeks back or something.
Also, right now I'm just gleaning from the frontpage, but I'll add in some great classic posts too.
Let me know in the comments if there are other posts you'd like to see added here.
submitted by coinaday to nyancoins [link] [comments]

Blocksizing = Bikeshedding

(definition of "Bikeshedding" on Wikipedia)
"Everyone can visualize a cheap, simple bicycle shed, so planning one can result in endless discussions because everyone involved wants to add a touch and show personal contribution."
Talk is cheap. Everyone has an opinion on easy, hot-button issues.
Devs ACKing on Github and users debating on Reddit get sucked into the never-ending Blocksize BIP Bikeshedding debates (even jstolfi has BIP 99.5!) - because it's easy for everyone to weigh in and give their opinion on the starting value and periodic bump for a simple integer parameter. Meanwhile, almost nobody is doing the hard work involving crypto and hashing to implement practical, useful stuff like IBLT or SegWit - or other features that have been missing for so long we've forgotten we even needed them (eg: HD - hierarchical deterministic wallets - without which you can't permanently back up your wallet).
BIP 202 is just the latest example of Blocksizing = Bikeshedding
The latest episode of out-of-touch devs on Github ACKing yet another blocksize bikeshedding BIP (BIP 202 from jgarzik) is not actual "governance" and will not provide the scaling Bitcoin actually needs.
BIP 202 is wrong because it scales linearly instead of exponentially
https://np.reddit.com/btc/comments/3xf50u/bip_202_is_wrong_because_it_scales_linearly/
It would be like if you were selling a house for $200,000 and the buyer originally offered $100,000 and then offered $100,002 - you wouldn't say you were willing to compromise - you'd simply laugh in their face.
BIP 202 isn't even acceptable as a "compromise".
https://np.reddit.com/btc/comments/3xedu8/a_comparison_of_bip202_bip101_growth_rates_with/cy45fzz
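The linear-vs-exponential point is easy to see numerically. This is an illustrative Python sketch, not the exact parameters of either BIP - the linear increment is a made-up assumption, and the exponential side only loosely follows BIP101's doubling-every-two-years shape:

```python
# Hypothetical linear schedule (fixed MB added per year) vs. a
# BIP101-style exponential schedule (capacity doubles every 2 years).

def linear_limit_mb(years, start=1.0, increment_per_year=1.0):
    """Illustrative linear growth: a fixed increment each year."""
    return start + increment_per_year * years

def exponential_limit_mb(years, start=8.0, doubling_period_years=2.0):
    """BIP101-shaped growth: doubles every two years."""
    return start * 2 ** (years / doubling_period_years)

for years in (0, 4, 10, 20):
    print(years, linear_limit_mb(years), exponential_limit_mb(years))
```

After 20 years the linear schedule has added a handful of megabytes while the exponential one is three orders of magnitude larger - which is the sense in which a linear counter-offer is a $100,002 bid on a $200,000 house.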
This is one of the reasons why this Blocksize BIP Bikeshedding debate is never-ending: it's easy, lazy, high-profile "executive decision-making" for devs, and easy, ponderous, philosophical pontificating for users. Everyone feels "qualified" to offer their expertise on how to set this one little parameter - which probably doesn't even need to be there in the first place, since miners already soft-limit down as needed to avoid orphaning.
Nobody has been able to convincingly answer the question, "What should the optimal block size limit be?" And the reason nobody has been able to answer that question is the same reason nobody has been able to answer the question, "What should the price today be?" – tsontar
https://np.reddit.com/btc/comments/3xdc9e/nobody_has_been_able_to_convincingly_answer_the/
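The miner soft-limit argument can be sketched as a toy optimization: each extra kilobyte earns fees but adds propagation delay, and hence orphan risk. Every number below is a made-up illustrative parameter, not a measured network value:

```python
# Toy miner block-size decision: fees gained vs. orphan risk from
# slower propagation. All parameters are illustrative assumptions.
import math

BLOCK_REWARD_BTC = 25.0
FEE_PER_KB_BTC = 0.005           # assumed marginal fee income per kB
PROPAGATION_SEC_PER_KB = 0.08    # assumed extra relay delay per kB
BLOCK_INTERVAL_SEC = 600.0

def expected_revenue(block_kb):
    fees = FEE_PER_KB_BTC * block_kb
    # Chance a competitor finds a block while ours is still propagating.
    p_orphan = 1 - math.exp(-PROPAGATION_SEC_PER_KB * block_kb / BLOCK_INTERVAL_SEC)
    return (BLOCK_REWARD_BTC + fees) * (1 - p_orphan)

best = max(range(1, 20_001), key=expected_revenue)
print(best)   # profit-maximizing size in kB under these assumptions
```

Under these made-up numbers the revenue-maximizing block is a couple of megabytes - the point being that miners have their own incentive to stop well short of any huge hard cap, without a dev-chosen parameter.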
Setting a parameter is easy. Adding features is hard.
It's so much easier to simply propose a parameter versus actually adding any real features which real users really need in real life. There's a long list of much-needed features which none of these devs ever roll up their sleeves and work on, such as:
  • HD: hierarchical deterministic wallets (BIP 32), without which it's impossible to back up your wallet permanently
  • simple optimizations and factorings like IBLT / Thin Blocks / Weak Blocks / SegWit
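Since HD wallets keep coming up as the missing must-have feature, here's a toy Python sketch of the idea behind BIP 32: one seed deterministically generates a whole tree of keys, so a single backup covers every future address. Real BIP 32 additionally uses secp256k1 point arithmetic, hardened derivation, and specific serialization rules; this sketch only shows the HMAC-SHA512 tree structure:

```python
import hmac, hashlib

def master_from_seed(seed: bytes):
    # BIP 32 really does derive the master key as HMAC-SHA512("Bitcoin seed", seed).
    digest = hmac.new(b"Bitcoin seed", seed, hashlib.sha512).digest()
    return digest[:32], digest[32:]      # (key material, chain code)

def derive_child(key: bytes, chain_code: bytes, index: int):
    # Toy derivation: real BIP 32 feeds in the serialized public key here.
    digest = hmac.new(chain_code, key + index.to_bytes(4, "big"), hashlib.sha512).digest()
    return digest[:32], digest[32:]

k, c = master_from_seed(b"example seed - back this up once")
addr_keys = [derive_child(k, c, i) for i in range(3)]
# Re-deriving from the same seed restores exactly the same keys:
k2, c2 = master_from_seed(b"example seed - back this up once")
assert [derive_child(k2, c2, i) for i in range(3)] == addr_keys
```

That last assertion is the whole point: back up the seed once and every key the wallet will ever generate is recoverable.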
When are we going to get a pluggable policy architecture for Core?
https://np.reddit.com/btc/comments/3v4u52/when_are_we_going_to_get_a_pluggable_policy/
Bikeshedding in politics.
By the way, you can see the parallel in US electoral politics, on forums and comment threads and Facebook, where everyone has a really important opinion they urgently need to share with the world on the eternal trinity of American hot-button issues (abortion and racism and gays) - but nobody really feels like spending the time and effort to come up with solutions for the complicated stuff like education, healthcare, student loans, housing prices, or foreign policy.
It's all just bikeshedding - a way of feeling self-important and getting attention, while the more-important and less-glamorous bread-and-butter nuts-and-bolts real-life user-experience issues get left by the wayside, because they're just too "complicated" and "difficult" and not "sexy" enough for most devs to actually work on.
submitted by ydtm to btc [link] [comments]

segwit after a 2MB hardfork

Disclaimer: My preferred plan for bitcoin is soft-forking segregated witness in asap, and scheduling a 2MB hardforked blocksize increase sometime mid-2017, and I think doing a 2MB hardfork anytime soon is pretty crazy. Also, I like micropayments, and until I learnt about the lightning network proposal, bitcoin didn't really interest me because a couple of cents in fees is way too expensive, and a few minutes is way too slow. Maybe that's enough to make everything I say uninteresting to you, dear reader, in which case I hope this disclaimer has saved you some time. :)
Anyway there's now a good explanation of what segwit does beyond increasing the blocksize via accounting tricks or whatever you want to call it: https://bitcoincore.org/en/2016/01/26/segwit-benefits/ [0] I'm hopeful that makes it a bit easier to see why many people are more excited by segwit than a 2MB hardfork. In any event hopefully it's easy to see why it might be a good idea to do segwit asap, even if you do a hardfork to double the blocksize first.
If you were to do a 2MB hardfork first, and then apply segwit on top of that [1], I think there are a number of changes you'd want to consider, rather than just doing a straight merge. Number one is that with the 75% discount for witness data and a 2MB blocksize, you run the risk of worst-case 8MB blocks which seems to be too large at present [2]. The obvious solution is to change the discount rate, or limit witness data by some other mechanism. The drawback is that this removes some of the benefits of segwit in reducing UTXO growth and in moving to a simpler cost formula. Not hard, but it's a tradeoff, and exactly what to do isn't obvious (to me, anyway).
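The worst-case arithmetic there follows from segwit's accounting: base bytes cost 4 weight units, witness bytes cost 1 (the 75% discount), with a 4,000,000-unit budget per 1MB of nominal limit. A quick sketch (the all-witness limiting case isn't achievable in practice, since every block needs some base data):

```python
# Segwit weight accounting: weight = 4*base_bytes + 1*witness_bytes,
# capped at 4,000,000 weight units per MB of nominal blocksize limit.

WEIGHT_PER_MB = 4_000_000

def max_block_bytes(nominal_limit_mb, witness_fraction):
    """Total serialized bytes achievable when `witness_fraction` of the
    block's bytes are (discounted) witness data."""
    budget = nominal_limit_mb * WEIGHT_PER_MB
    # weight per byte = 4*(1 - wf) + 1*wf = 4 - 3*wf
    return budget / (4 - 3 * witness_fraction)

print(max_block_bytes(1, 0.0))   # all base data: 1,000,000 bytes
print(max_block_bytes(1, 1.0))   # all witness:   4,000,000 bytes (worst case)
print(max_block_bytes(2, 1.0))   # 2MB hardfork + segwit: 8,000,000-byte worst case
```

Which is exactly why a 2MB-base-then-segwit plan would need to shrink the discount or cap witness data separately.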
If IBLT or weak blocks or an improved relay network or something similar comes out after deploying segwit, does it then make sense to increase the discount or otherwise raise the limit on witness data, and is it possible to do this without another hardfork and corresponding forced upgrade? For the core roadmap, I think the answer would be "do segwit as a soft-fork now so no one has to upgrade, and after IBLT/etc is ready perhaps do a hard-fork then because it will be safer" so there's only one forced upgrade for users. Is some similar plan possible if there's an "immediate" hard fork to increase the block size, to avoid users getting hit with two hardforks in quick succession?
Number two is how to deal with sighashes -- segwit allows the hash calculation to be changed, so that for 2MB of transaction data (including witness data), you only need to hash up to around 4MB of data when verifying signatures, rather than potentially gigabytes of data. Compare that to Gavin's commits to the 0.11.2 branch in Classic which include a 1.3GB limit on sighash data to make the 2MB blocksize -- which is necessary because the quadratic scaling problem means that the 1.3GB limit can already be hit with 1MB blocks. Do you keep the new limit once you've got 2MB+segwit, or plan to phase it out as more transactions switch to segwit, or something else?
Again, I think with the core roadmap the plan here is straightforward -- do segwit now, get as many wallets/transactions switched over to segwit asap (whether due to all the bonus features, or just that they're cheaper in fees), and then revise the sighash limits later as part of soft-forking to increase the blocksize.
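To see why the sighash problem is quadratic, note that with legacy signature hashing each input's signature re-hashes (almost) the whole transaction, and the transaction itself grows with the number of inputs. A back-of-envelope model - the per-input byte figure is an assumed round number, not a protocol constant:

```python
# Legacy sighash work grows quadratically with input count; BIP143-style
# (segwit) hashing does roughly constant work per input.

BYTES_PER_INPUT = 150   # assumed average transaction bytes per input

def legacy_bytes_hashed(n_inputs):
    # Each of the n signatures re-hashes (almost) the whole transaction.
    tx_size = n_inputs * BYTES_PER_INPUT
    return n_inputs * tx_size            # O(n^2)

def segwit_bytes_hashed(n_inputs):
    # Midstate reuse makes per-input hashing work ~constant.
    return n_inputs * BYTES_PER_INPUT    # O(n)

for n in (100, 1_000, 5_000):
    print(n, legacy_bytes_hashed(n), segwit_bytes_hashed(n))
```

At a few thousand inputs the legacy figure is already in the gigabytes, which is why a 1.3GB sighash limit is reachable even with 1MB blocks.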
Finally, and I'm probably projecting my own ideas here, I think a 2MB hardfork in 2017 would give ample opportunity to simultaneously switch to a "validation cost metric" approach, making fees simpler to calculate and avoiding people being able to make sigop attacks to force near-empty blocks and other such nonsense. I think there's even the possibility of changing the limit so that in future it can be increased by soft-forks [3], instead of needing a hard fork for increases as it does now. ie, I think if we're clever, we can get a gradual increase to 1.8MB-2MB starting in the next few months via segwit with a soft-fork, then have a single hard-fork flag day next year, that allows the blocksize to be managed in a forwards compatible way more or less indefinitely.
Anyhoo, I'd love to see more technical discussion of classic vs core, so in the spirit of "write what you want to read", voila...
[0] I wrote most of the text for that, though the content has had a lot of corrections from people who understand how it works better than I do; see the github pull request if you care --https://github.com/bitcoin-core/website/pull/67
[1] https://www.reddit.com/btc/comments/42mequ/jtoomim_192616_utc_my_plan_for_segwit_was_to_pull/
[2] I've done no research myself; jtoomim's talk at Hong Kong said 2MB/4MB seemed okay but 8MB/9MB was "pushing it" -- http://diyhpl.us/wiki/transcripts/scalingbitcoin/hong-kong/bip101-block-propagation-data-from-testnet/ and his talks with miners indicated that BIP101's 8MB blocks were "Too much too fast" https://docs.google.com/spreadsheets/d/1Cg9Qo9Vl5PdJYD4EiHnIGMV3G48pWmcWI3NFoKKfIzU/edit#gid=0 Tradeblock's stats also seem to suggest 8MB blocks is probably problematic for now: https://tradeblock.com/blog/bitcoin-network-capacity-analysis-part-6-data-propagation
[3] https://botbot.me/freenode/bitcoin-wizards/2015-12-09/?msg=55794797&page=4
submitted by ajtowns to btc [link] [comments]

A compromise between BIP101 and Pieter's proposal | jl2012 at xbt.hk | Jul 31 2015

jl2012 at xbt.hk on Jul 31 2015:
There is a summary of the proposals in my previous mail at
https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-July/009808.html
I think there could be a compromise between Gavin's BIP101 and
Pieter's proposal (called "BIP103" here). Below I'm trying to play
with the parameters, which reasons:
  1. Initiation: BIP34 style voting, with support of 750 out of the last
1000 blocks. The "hardfork bit" mechanism might be used:
http://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-July/009576.html
Rationale: This follows BIP101, to make sure the new chain is secure.
Also, no miner would like to be the first one to mine a large block if
they don't know how many others would accept it.
  2. Starting date: 30 days after 75% miner support, but not before
2016-01-12 00:00 UTC
Rationale: A 30-day grace period is given to make sure everyone has
enough time to follow. This is a compromise between 14 days in BIP101
and 1 year in BIP103. I tend to agree with BIP101: even if 1 year is
given, people will just do it on the 364th day if they opt to
procrastinate.
2016-01-12 00:00 UTC is Monday evening in US and Tuesday morning in
China. Most pool operators and devs should be back from new year
holiday and not sleeping. (If the initiation is delayed, we may
require that it must be UTC Tuesday midnight)
  3. The block size at 2016-01-12 will be 1,414,213 bytes, and
multiplied by 1.414213 every 2^23 seconds (97 days) until exactly
8MB is reached on 2017-05-11.
Rationale: Instead of jumping to 8MB, I suggest to increase it
gradually to 8MB in 16 months. 8MB should not be particularly painful
to run even with current equipment (you may see my earlier post on
bitctointalk: https://bitcointalk.org/index.php?topic=1054482.0). 8MB
is also agreed by Chinese miners, who control >60% of the network.
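The ramp in point 3 checks out numerically - five multiplications by 1.414213 (about the square root of 2), one every 2^23 seconds, take 1,414,213 bytes to roughly 8MB on the stated date. A quick sketch (the proposal presumably clamps the final step to exactly 8MB; simple rounding here lands a few bytes short):

```python
# Replaying the proposed ramp: *1.414213 every 2^23 seconds (~97.1 days),
# starting from 1,414,213 bytes on 2016-01-12.
from datetime import datetime, timedelta

STEP = timedelta(seconds=2**23)        # 8,388,608 seconds
size = 1_414_213                       # bytes
date = datetime(2016, 1, 12)
while size * 1.414213 <= 8_000_000:
    size = round(size * 1.414213)
    date += STEP
print(size, date.date())               # ~8MB, landing on 2017-05-11
```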
  4. After 8MB is reached, the block size will be increased by 6.714%
every 97 days, which is equivalent to exactly octuple (8x) every 8.5
years, or double every 2.9 years, or +27.67% per year. Stop growth at
4096MB on 2042-11-17.
Rationale: This is a compromise between 17.7% p.a. of BIP103 and 41.4%
p.a. of BIP101. This will take us almost 8 years from now just to go
back to the original 32MB size (4 years for BIP101 and 22 years for
BIP103)
SSD price is expected to drop by >50%/year in the coming years. In
2020, we will only need to pay 2% of current price for SSD. 98% price
reduction is enough for 40 years of 27.67% growth.
Source: http://wikibon.org/wiki/v/Evolution_of_All-Flash_Array_Architectures
Global bandwidth is expected to grow by 37%/year until 2021 so 27.67%
should be safe at least for the coming 10 years.
Source:
https://www.telegeography.com/research-services/global-bandwidth-forecast-service/
The final cap is a compromise between 8192MB at 2036 of BIP101 and
2048MB at 2063 of BIP103
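The equivalences claimed in point 4 can be verified directly - compounding +6.714% per 2^23-second step gives roughly +27.7% per year, an octupling every 8.5 years, and a doubling every ~2.9 years:

```python
# Verifying point 4: +6.714% per 2^23-second step, compounded.
step_growth = 1.06714
steps_per_year = 365.25 * 86400 / 2**23
annual = step_growth ** steps_per_year
print(round((annual - 1) * 100, 2))    # ~27.7, the proposal's "+27.67% per year"
print(round(annual ** 8.5, 2))         # ~8x every 8.5 years
print(round(annual ** 2.9, 2))         # ~2x every 2.9 years
# The cap also fits: 4096MB / 8MB = 512 = 8**3, i.e. three more 8.5-year
# octuplings after mid-2017, landing around 2042 as stated.
```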
Generally speaking, I think we need to have a faster growth in the
beginning, just to normalize the block size to a more reasonable one.
After all, the 1MB cap was introduced when Bitcoin was practically
worthless and with inefficient design. We need to decide a new
"optimal" size based on current adoption and technology.
About "fee market": I do agree we need a fee market, but the fee
pressure must not be too high at this moment when block reward is
still miner's main income source. We already have a fee market: miners
will avoid building big blocks with low fee because that will increase
the orphan risk for nothing.
About "secondary layer": I respect everyone building secondary layer
over the blockchain. However, while the SWIFT settlement network is
processing 300tps, Bitcoin's current 7tps is just nothing more than an
experiment. If the underlying settlement system does not have enough
capacity, any secondary layer built on it will also fall apart. For
example, people may DoS attack a Lightning network by provoking a
huge number of settlement requests, some of which may not be confirmed on
time. Ultimately, this will increase the risk of running a LN service
and increase the tx fee inside LN. After all, the value of secondary
layer primarily comes from instant confirmation, not scarcity of the
block space.
original: http://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-July/009812.html
submitted by bitcoin-devlist-bot to bitcoin_devlist [link] [comments]

03-13 20:27 - 'Well, for example: / [Gavin Andresen started discussing the need to increase the block size in public 2 years ago]. Prior to that time it had been discussed for years censorship free and no one objected to the idea we needed to...' by /u/Adrian-X removed from /r/Bitcoin within 1-6min

'''
Well, for example:
[Gavin Andresen started discussing the need to increase the block size in public 2 years ago]1 . Prior to that time it had been discussed for years, censorship-free, and no one objected to the idea that we needed to fork before blocks filled up; this was accepted and never contentious. Limiting block space created demand for the products Blockstream is developing. Blockstream employs the most influential Bitcoin developers, who are injecting changes into the Bitcoin code base, and the conflict of interest goes ignored.
Adam Back, the CEO of Blockstream, responded with FUD and accused Gavin of mounting a coup when he released BIP101 for review by the Core developers:
Adam Back: "[Gavin naively thinks he'll do the coup, force the issue, and then invite people to participate in the coup.]2 "
Contention is being manufactured where there is none - people can choose to support or not.
BIP101 included the following features, which should make it obvious it was not a coup:
Ignorant small-block fundamentalists proclaimed this was contentious and would split the network.
Adam could maybe give us a definition of what he meant by "coup"; he sounds very aggressive, as if his control of Bitcoin were under threat:
What followed was a closed-door Hong Kong meeting where the CEO of Blockstream, Adam Back, met with 80% of the Bitcoin hashing power. After a 17-hour meeting, an agreement was made to block any hard-fork proposals and wait until BS/Core released SegWit.
what Adam said:
Gavin naively thinks he'll do the coup, force the issue, and then invite people to participate in the coup.
what Adam did:
He went behind the community's back, forced the issue, and then invited people to participate in the coup.
And that is how we recognized that Bitcoin was under centralized control, and how we got the contentious SegWit soft fork.
[Psychological_projection]3
'''
Context Link
Go1dfish undelete link
unreddit undelete link
Author: Adrian-X
1: gavin*n**es*n*ninj*/wh*-inc*ea*in*-*he-max*bloc**s*ze-is-urge*t 2: *pectrum.ie****r*/tec*-*alk/compu*ing*net*orks*t*e-b*tc*in-f*r-is-*-co*p 3: https://en.wikipedia.org/wiki/Psychological_projection
Unknown links are censored to prevent spreading illicit content.
submitted by removalbot to removalbot [link] [comments]


Yes: "In summary, we believe BIP 101 will safeguard Bitcoin’s decentralized nature while providing a reliable, immediate path toward greater network throughput, and we would like to express our support for merging BIP 101 into Bitcoin Core." - Stephen Pair
Bittiraha.fi: Yes: "We are supporting increasing #Bitcoin max block size to 20MB." "I'm strongly in favor of the block size cap ...
BIP 101 was reverted and the 2-MB block size bump of Bitcoin Classic was applied instead. In January 2016, BIP 101 was removed from Bitcoin XT and replaced with the one-time block size increase to 2 MB present in Bitcoin Classic. In the year following this change, adoption of Bitcoin XT decreased dramatically, with fewer than 30 nodes remaining by January 2017. Later attempts by other ...
Opcodes used in Bitcoin Script: this is a list of all Script words, also known as opcodes, commands, or functions. OP_NOP1-OP_NOP10 were originally set aside to be used when HASH and other security functions become insecure due to improvements in computing.
A Bitcoin Improvement Proposal (BIP) is a design document for introducing features or information to Bitcoin. This is the standard way of communicating ideas, since Bitcoin has no formal structure. The first BIP was submitted by Amir Taaki on 2011-08-19 and described what a BIP is.
Bitcoin XT was a fork of Bitcoin Core, the reference client for the bitcoin network. In mid ... On June 22, 2015, Gavin Andresen published BIP 101 calling for an increase in the maximum block size. The changes would activate a fork allowing eight MB blocks (doubling in size every two years) once 75% of a stretch of 1,000 mined blocks is achieved after the beginning of 2016. The new maximum ...
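The 75%-of-1,000-blocks activation rule described above can be sketched as a simple supermajority check. This is an illustrative simplification (real BIP101 signaling inspects block version bits and timestamps, which are omitted here):

```python
def supermajority_reached(signals, threshold=0.75, window=1000):
    """Simplified BIP 101-style activation check.

    signals: list of booleans, one per recent block, True if the block
    signaled support for the larger block size.
    """
    recent = signals[-window:]
    # Require a full window of blocks and at least `threshold` support.
    return len(recent) == window and sum(recent) / window >= threshold

# 750 of the last 1,000 blocks signaling is exactly the 75% threshold.
print(supermajority_reached([True] * 750 + [False] * 250))   # True
print(supermajority_reached([True] * 749 + [False] * 251))   # False
```

The design choice behind such thresholds is that activation only proceeds once a clear hash-power supermajority has signaled, reducing the chance of a persistent chain split.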


Blockchain tutorial 27: Bitcoin raw transaction and transaction id

For the complete text guide visit: http://bit.ly/2Qc7Khp List of wallet software for your computer: https://multibit.org/ http://bitcoin.org/en/download http...
Using python 3.6 to generate segwit addresses. This video is part of a series of tutorials about the bitcoin protocol. In this video, I'll use different encoding to generate the final segwit ...
This is part 27 of the Blockchain tutorial. This tutorial explains: - What Bitcoin raw transaction is. - Shows an example of a raw transaction using the very first transaction on the Genesis block.
After hearing reports that bitcoin wallets can be hacked. This video shows you how to secure your blockchain wallet. blockchain technology advantages block chain block chain definition block chain ...
This is the first part of a more technical talk where Andreas explores Bitcoin script, with examples from the 2nd edition of Mastering Bitcoin, focusing on t...
