
7 Technologies that will Bloom in 2020

Isn’t it fascinating how technology is influencing the various sectors of the market? Ever since the advent of digitalization, the fundamentals of industries have been changing for the better. Among the main reasons behind these drastic changes in how industries operate are user needs and competitiveness.
To survive in today’s competitive marketplace, it has become essential for every industry to adopt innovative technologies that make its solutions more powerful. Even big companies like Facebook, Amazon and Google are investing heavily in the latest technological trends, such as blockchain technology, Big Data and AI, to enhance the user experience.
Technology is not merely changing; it is transforming at a great pace, and there are exponential innovations in the world of business. It is estimated that over 1 billion people will use AR/VR in 2021 and that IoT could be worth $20 trillion in the coming years. This makes it clear that universal digital transformation is not far away.
Here is a closer look at the technology trends that will bring considerable innovation in the next year and in the coming decade.

1) Artificial Intelligence:

Artificial Intelligence, or AI, has already created a lot of buzz in the past few years, and it will surely be a trend to watch, as its effects on our day-to-day lives are still in the early stages. Machine intelligence, or artificial intelligence, can be defined as the simulation of human intelligence processes by a computer system. These processes include the acquisition of information and the rules for using it, reasoning (using rules to reach approximate conclusions) and self-correction.
Presently, AI services are used in navigation apps, smartphone personal assistants, streaming services, ride-sharing apps, home personal assistants and smart home devices, in one way or another. In addition to consumer use, AI is used to assess business risk, predict maintenance needs, schedule trains and improve energy efficiency.
In a recent survey, McKinsey estimated that AI adoption could increase global GDP by as much as $13 trillion by 2030. Moreover, AI is predicted to create 23 million job opportunities by the end of 2020, in fields such as development, programming, testing, support and maintenance. And in the next decade, we are sure to witness AIs that don’t require any human intervention to grow smarter.

Artificial Intelligence-Direct & Enabled Revenue-2014 to 2025 (USD Million)
Image Credit: Grand View Research

2) Mobile Commerce:

Mobile commerce is now being embraced by everyone from big brands to startups, all around the globe. Ever since the launch of smartphones, mobile commerce has become a crucial part of personal and professional lives. Simply put, m-commerce covers e-commerce transactions done using a mobile phone. Mobile commerce builds on e-commerce infrastructure and WAP (Wireless Application Protocol) technology, which is used to conduct sales of goods, make payments, provide services and perform other financial transactions.
Reasons Why Mobile Commerce is Rising at Rapid Rate
Image Source: Peerbits
With over 80% of internet users owning a smartphone, retail m-commerce sales are expected to climb from 74.8% to 85.5% by 2025, and mobile commerce is expected to outpace non-mobile commerce in 2021. Undoubtedly, m-commerce is the rising star of the e-commerce world; however, a few more innovations will make the e-commerce industry more sustainable.
By understanding m-commerce and keeping tabs on where it’s going, business owners put themselves in the best position to take advantage of what m-commerce has to offer.

3) 5G Mobile Network:

5G technology was the talk of CES this year, and by the start of 2020 it will be the driving force in wireless technology. It will benefit users with lower latency, higher capacity and, of course, faster internet speeds. With 5G, automation and technological advancement in cities and remote areas will certainly expand.
Alongside the 5G mobile network will come the development and implementation of WiFi 6. These technologies will make operating driverless cars easier by providing real-time data. Moreover, 5G is expected to enable services such as drone-based home delivery.
5G is expected to cover up to 65 percent of the world’s population by 2024. According to recent stats, 5G subscriptions for enhanced mobile broadband could reach 1.9 billion by the end of 2024. In India, 5G subscriptions are expected to become available in 2022 and to rise thereafter. In the US, service providers have already launched commercial 5G services, both for mobile and fixed wireless access; by the end of 2024, US 5G subscriptions are expected to reach 270 million, accounting for more than 60% of mobile subscriptions.
Here are the improvements that 5G will bring
Image Source: Digital Trends

4) Robotic Process Automation or RPA:

Like Machine Learning and AI, Robotic Process Automation is another technology that is automating jobs. RPA is the use of software to automate business processes such as interpreting applications, dealing with data, processing transactions and even replying to emails.

Features of Robotic Process Automation
Image Source: Digitals Fren
Robotic Process Automation automates the repetitive tasks that people used to do. Not only the menial tasks of low-paid workers but also parts of the work of doctors, financial managers and CEOs can be automated using RPA. Although researchers estimate that RPA poses a risk to the livelihoods of about 9 percent of the global workforce, there is a good chance that RPA will create new jobs while altering existing ones.
Rapid growth in the e-commerce industry is a crucial factor in the growth of the Robotic Process Automation market. Online sales in the USA are expected to double by 2025 and are likely to account for 30-35% of the overall retail sector.
For IT professionals who want to remain aligned with the technology trends, RPA offers plenty of career opportunities, including project manager, business analyst, solution architect, developer and consultant.

5) Blockchain

Although most people associate blockchain technology with cryptocurrencies such as Bitcoin, blockchain also offers security that is useful in many other ways. In simple terms, a blockchain is data that you can only add to, not change or take away from; hence the term “chain”, because you are building a chain of data blocks. What makes blockchain secure is the fact that previous blocks cannot be changed. Blockchain is also consensus-driven, which means no single entity can take control of the data.
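To make the “chain” idea concrete, here is a minimal, illustrative sketch in Python (a toy model, not any production blockchain): each block stores the hash of the previous block, so tampering with old data breaks every later link.

```python
import hashlib, json, time

def block_hash(block):
    # Hash a block's contents deterministically.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(data, prev_hash):
    return {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}

# Build a tiny chain: each block commits to the hash of its predecessor.
chain = [make_block("genesis", "0" * 64)]
chain.append(make_block("payment: A -> B", block_hash(chain[-1])))
chain.append(make_block("payment: B -> C", block_hash(chain[-1])))

def verify(chain):
    # Every block's prev_hash must match the recomputed hash of the block before it.
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

print(verify(chain))   # True
chain[1]["data"] = "payment: A -> Mallory"
print(verify(chain))   # False: changing history is detected
```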

Expected Increase in Enterprise Blockchain Market from 2019 to 2024
Image Source: Document Media
Several industries are implementing blockchain, and as the use of blockchain technology increases, the demand for skilled professionals will also increase. According to Techcrunch.com, blockchain-related jobs are the second-fastest-growing category of jobs, creating a wide range of opportunities. So, if you are planning a career in this fast-growing industry and are intrigued by blockchain, you should consider learning it and gearing up for an exciting future.

6) Machine Learning:

Machine learning is an approach in which computer programs learn from data, enabling them to make decisions, refine their algorithms and generate outputs without explicit human involvement. Hailed as one of the most significant and impactful technological developments of recent times, machine learning has already helped us perform key real-world tasks.

Machine Learning Explained
Supply chain technology vendors are incorporating machine learning into their applications, helping their solutions understand changing circumstances.
Here is how machine learning will help companies in improving their performance:
  • Machine learning will aid companies in developing applications that are capable of understanding natural human language.
  • The efficiency of logistics and transport networks will be enhanced with machine learning.
  • It helps companies use predictive maintenance to reduce equipment failures and raise profits.
  • With machine learning, companies can use customer information to boost sales, build useful client profiles and enhance brand loyalty.
Machine learning has become one of the most crucial technologies for the future. AI-driven applications combined with machine learning will help businesses increase efficiency, enhance customer relations and increase earnings.
Researchers estimate that machine learning has the capacity to add $2.6 trillion in value to the advertising and sales sector by 2020. A recent Univa report states that machine learning adoption is predicted to skyrocket over the coming 5 years, with 96% of companies expecting to use it in production projects.

7) Chatbots:

Chatbots enable businesses to answer customer service inquiries of all types at any time of the day, from anywhere, even on holidays when customer support staff are not available. Earlier, we covered what chatbots are and the benefits they offer. Today, we look at how chatbots will evolve in the coming years.
Chatbots have the ability to interact with customers bias-free. Moreover, chatbots synthesize metadata through AI-based applications and hence can easily personalize the customer experience.
According to the recent stats by Global Market Insights, the overall market size for chatbots worldwide would be over $1.3 billion by 2024. Hence, it would not be wrong to say that the chatbot industry is sure to become a driving force of business communications. With technological advancement, bots will become more intelligent to understand the intent of the queries and conversations.

Benefits of Chatbots for Business
Image Source: Litslink
To remain aligned with customer expectations, businesses are now focused on creating chatbots with the help of Machine Learning, Artificial Intelligence and Natural Language Processing (NLP). Chatbots with a more conversational AI will (see the sketch after this list):
· Improve the user’s experience with the brand and hence help build brand presence.
· Deliver personalized customer experiences to build better relationships.
· Positively affect customers’ perception and help you build a satisfied customer base.
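For a feel of how the rule-based end of this spectrum works, here is a toy intent matcher in Python; the intents, keywords and replies are invented for illustration, and real conversational AI replaces the keyword test with ML/NLP models:

```python
# Toy rule-based chatbot: match a message to the first intent whose
# keywords appear in it. Intent names and replies are made up.
INTENTS = {
    "order_status": (["order", "shipping", "track"],
                     "Let me look up your order status."),
    "refund":       (["refund", "return", "money back"],
                     "I can help you start a return."),
    "greeting":     (["hello", "hi", "hey"],
                     "Hi! How can I help you today?"),
}

def reply(message: str):
    text = message.lower()
    for intent, (keywords, answer) in INTENTS.items():
        if any(k in text for k in keywords):   # crude substring match
            return intent, answer
    return "fallback", "Sorry, could you rephrase that?"

print(reply("Hi, where is my order?"))
# ('order_status', 'Let me look up your order status.')
```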
So, this was all about the trends that will be evolving in 2020.
For more such updates, stay tuned!
submitted by Graffersid14 to u/Graffersid14 [link] [comments]

INT - Comparison with Other IoT Projects

What defines a good IoT project? Answering this will help us understand the problems such projects might struggle with and which projects excel in those areas. IoT will be a huge industry in the coming years. The true Internet 3.0 will be one of seamless data and value transfer. There will be a tremendous number of devices connected to this network, from your light bulbs to your refrigerator to your car, all autonomously transacting together in an ever-growing network, creating an intelligent, seamless world that satisfies wants and needs.
.
Let’s use the vastness of this network’s future state as our basis for what makes a good project.
.
Scalability
In that future we will need very high scalability to accommodate the exponential growth in transaction volume that will occur. The network doesn’t need the ability to handle high transactions per second in the beginning, just a robust plan to grow that ability as the network develops. We’ve seen this issue already with Bitcoin, at admittedly small market penetration. If scaling isn’t one of the more prominent parts of your framework, that is a glaring hole.
.
Applicability
Second to scalability is applicability. One size does not fit all in this space. Some uses will need real-time streaming of data, where fast and cheap transactions are key, while others will need heavier transactions full of data to be analyzed by the network for predictive uses. Some uses will need smart contracts so that devices can execute actions autonomously, and others will need the ability to encrypt data and to transact anonymously to protect the privacy of users in this future of hyper-connectivity. We cannot possibly predict all of the future needs of this network, so ease of adaptability in a network of high applicability is a must.
.
Interoperability
In order for this network to have the high level of applicability mentioned, it needs access to real-world data outside of its own network to work from or even to transact with. This interoperability can come in several forms. I am not a maximalist who thinks there will be one clear winner in any space. It is easy, therefore, to imagine that we would want to interact with other networks for payment/settlement or data gathering: perhaps autonomously paying bills with Bitcoin or Monero, smart contracts that need to be fed additional data from the Internet, or even sending an automatic invite to a wine tasting for the wine shipment that has been RFID-tagged and tracked through WTC. In any case, to afford the highest applicability, the network needs the ability to interact with outside networks.
.
Consensus
How the network gains consensus is often overlooked in discussions of network suitability. If the network is to support a myriad of application and transaction types, the consensus mechanism must be able to handle them without choking the network or restricting transaction types. PoW can become a bottleneck: competition for the block reward requires an increase in difficulty for block generation, so you have to allow time for this computation between blocks, often leading to less-than-optimal block times for fast transactions. This can create a transaction backlog, as we have seen before. PoS solves some of these issues but is not immune either. A novel approach to gaining consensus will have to be made if it is going to handle the variety and volume to come.
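To see why difficulty translates into block time, here is a toy proof-of-work loop in Python (for intuition only, not any real miner’s code): each extra difficulty bit roughly doubles the expected number of hash attempts before a valid nonce is found.

```python
import hashlib, itertools, time

def mine(header: str, difficulty_bits: int) -> int:
    # Search for a nonce whose SHA-256 hash falls below the target.
    # More leading zero bits -> exponentially more expected attempts.
    target = 2 ** (256 - difficulty_bits)
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{header}{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

for bits in (8, 16, 20):
    start = time.time()
    nonce = mine("block-header-data", bits)
    print(f"{bits} bits: nonce={nonce}, {time.time() - start:.3f}s")
```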
.
Developability
All of this can be combined to create a network that is best equipped to take on the IoT ecosystem. But penetration into the market will be held back largely by the difficulty manufacturers and their devices face in connecting to and interacting with the network. Having to learn a new programming language to write a smart contract or create a node, or strict requirements on device hardware capability, are all barriers that make it harder and more expensive for companies to work with the network. Ultimately, no matter how perfect or feature-packed your network is, manufacturers will more likely develop devices for those that are easy to work with.
.
In short, what the network needs to focus on is:
-Scalability – How does it globally scale?
-Applicability – Does it have data transfer ability, fast, cheap transactions, smart contracts, privacy?
-Interoperability – Can it communicate with the outside world, other blockchains?
-Consensus – Will it gain consensus in a way that supports scalability and applicability?
-Developability – Will it be easy for manufacturers to develop devices and interact with the network?
.
.
The idea of using blockchain technology as the basis of the IoT ecosystem is not new. There are several projects out there now aiming to tackle the problem. Below you will see a high-level breakdown of those projects, with some pros and cons based on how I interpret the best solution. You will also see some supply chain projects listed below. Supply chain solutions are just small niches in the larger IoT ecosystem. Item birth records, manufacturing history and package tracking can all be “Things” which the Internet of Things tracks. In fact, INT has already leaked some information hinting that they are cooperating with pharmaceutical companies to track the manufacture and packaging of the drugs they produce. INT may someday include WTC or VEN as one of its subchains feeding information into the ecosystem.
.
.
IOTA
IOTA is a feeless and blockchain-less network called a directed acyclic graph. In my opinion, this creates more issues than it fixes.
The key to keeping IOTA feeless is that there are no miners to pay, because the work associated with verifying a transaction is distributed among all users, with each user verifying two separate transactions for their one. This creates problems both for enabling smart contracts and for user privacy. Most privacy methods (zk-SNARKs in particular) require the verifier to use computationally intensive cryptography that is beyond the capability of most devices on the IoT network (a weather sensor isn’t going to be able to build the zk-proof of a transaction every second or two). In a network where the device does the verifying of a transaction, cryptographic privacy becomes impractical. And even if there were a few systems capable of processing those transactions, there is no reward for doing the extra work. Fees keep the network safe by incentivizing honesty in the nodes, by paying those who have to work harder to verify a certain transaction, and by making it expensive to attack the network or disrupt privacy (Sybil attacks).
IOTA also doesn’t have, and may never have, the ability to enable smart contracts. By the very nature of the Tangle (a chain of transactions with only partial structure, unlike a linear and organized blockchain), establishing the correct time order of transactions is difficult, and in some situations impossible. Even if the transactions have been time-stamped, there is no way to verify the stamps, so they are open to spoofing. Knowing transaction order is absolutely vital to executing step-based smart contracts.
There does exist a subset of smart contracts that do not require a strong time order of transactions to operate properly, but accepting this just limits the use cases of the network. In any case, smart contracts will not be able to operate directly on-chain in IOTA. There will need to be a trusted off-chain Oracle that watches transactions, establishes timelines and runs the smart contract network.
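A toy example of the ordering problem: with only partial (“approves”) relationships between transactions, many mutually incompatible histories are equally valid, which is fatal for a contract that must execute steps in one agreed order. The sketch below uses hypothetical transactions, not IOTA’s actual data structures:

```python
from itertools import permutations

# Partial order only: C approves A, D approves B; A and B are unrelated.
edges = {("A", "C"), ("B", "D")}
txs = ["A", "B", "C", "D"]

def respects(order):
    # Valid if every approved tx comes before its approver.
    pos = {tx: i for i, tx in enumerate(order)}
    return all(pos[a] < pos[b] for a, b in edges)

valid = [o for o in permutations(txs) if respects(o)]
print(len(valid), "equally valid histories, e.g.", valid[0], "and", valid[-1])
# 6 equally valid histories -- a step-based contract needs exactly one.
```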
.
-Scalability – High
-Applicability – Low, no smart contracts, no privacy, not able to run on lightweight devices
-Interoperability – Maybe, Oracle possibility
-Consensus – Low, DAG won’t support simple IoT devices and I don’t see all devices confirming other transactions as a reality
-Developability – To be seen, currently working with many manufacturers
.
.
Ethereum
Ethereum is the granddaddy of smart contract blockchain. It is, arguably, in the best position to be the center point of the IoT ecosystem. Adoption is wide ranging, it is fast, cheap to transact with and well known; it is a Turing complete decentralized virtual computer that can do anything if you have enough gas and memory. But some of the things that make it the most advanced, will hold it back from being the best choice.
Turing completeness means that the programming language is complete (can describe any problem) and can solve any problem given enough gas to pay for it and enough memory to run the code. You could therefore create an infinite variety of smart contracts. This infinite variability makes it impossible to create zk-SNARK verifiers efficiently enough not to cost more gas than is currently available in a block. Implementing zk-SNARKs in Ethereum would therefore require significant changes to the smart contract structure, allowing only a small subset of contracts to permit zk-SNARK transactions. That would mean a wholesale change to the Ethereum Virtual Machine. Even in Zcash, where zk-SNARKs are successfully implemented for a single, simple transaction type, they had to encode some of the network’s consensus rules into the zk-SNARKs to limit the possible outcomes of the proof (like narrowing the question “where are you in the US?” to “where are you in the US along these given highways?”) in order to limit the computation time required to construct the proof.
Previously I wrote about how INT is using the Double Chain Consensus algorithm to allow easy scaling and segregation of network traffic and blockchain size by breaking the network down into separate cells, each with its own nodes and blockchains. This builds on lessons learned from single-chain blockchains like Bitcoin. Ethereum, which is also a single-chain blockchain, suffers from the same congestion issues, as we saw during the recent CryptoKitties craze: although the impact was far smaller than what has been seen with Bitcoin, transaction times grew, as did the associated fees. Ethereum has proposed a new, second-layer solution to the scaling issue: sharding. Sharding draws from the traditional scaling technique called database sharding, which splits up pieces of a database and stores them on separate servers, where each server points to the others. The goal is to have distinct nodes that store and verify a small set of transactions and then tie them up to a larger chain where all the other nodes communicate. If a node needs to know about a transaction on another chain, it finds another node with that information. What does this sound like? It is about as close to an explanation of the Double Chain architecture as INT themselves provide in their whitepaper.
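For intuition, here is a minimal sharding sketch in Python (a toy router, not Ethereum’s or INT’s actual design): keys are deterministically mapped to one shard, so a node only consults the shard responsible for a key instead of every node storing everything.

```python
import hashlib

NUM_SHARDS = 4
shards = [dict() for _ in range(NUM_SHARDS)]   # each "server" holds one slice

def shard_for(key: str) -> int:
    # Deterministically map a key (e.g. a tx id) to one shard.
    return int(hashlib.sha256(key.encode()).hexdigest(), 16) % NUM_SHARDS

def put(key: str, value):
    shards[shard_for(key)][key] = value

def get(key: str):
    # Only the responsible shard is queried; the others never see the key.
    return shards[shard_for(key)].get(key)

put("tx:0xabc", {"from": "A", "to": "B", "amount": 3})
print(get("tx:0xabc"), "lives on shard", shard_for("tx:0xabc"))
```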
.
-Scalability – Neutral, has current struggles but there are some proposals to fix this
-Applicability – Medium, has endless smart contract possibilities, no privacy currently with some proposals to fix this
-Interoperability – Maybe, Oracle possibility
-Consensus – Medium, PoW currently with proposals to change to better scaling and future proofing.
-Developability – To be seen
.
.
IoTeX
A young project, made up of several accredited academics in cryptography, machine learning and data security. This is one of the most technically supported whitepapers I have read. They set out to solve scalability with the relay/subchain architecture proposed by Polkadot and used by INT. This architecture lends itself well to scaling and adaptability, as there is no limit to the number of subchains you can add to the network, given node and consensus bandwidth.
The way they look to address privacy is interesting. On the main parent (or relay) chain, they plan to implement some of the technology from Monero, namely ring signatures, bulletproofs and stealth addresses. While these are proven and respected technologies, this raises some worries, as these techniques are known not to be lightweight, and it takes away from the inherent generality of the core of the network. I believe the core should be as general and lightweight as possible to allow for scaling, ease of updates and adaptability. By adding this functionality, all data and transactions are made private and untraceable and are therefore put through heavier computation. There are some applications where this is not optimal: a data stream may need to be read by many devices, and encrypting it requires decryption for every use. A plain, public, traceable network would allow this simple use. This specificity should be made at the subchain level.
Subchains will have the ability to define their needs in terms of block times, smart contracting needs, etc. This lends to high applicability.
They address interoperability directly by laying out the framework for pegging (transaction on one chain causing a transaction on another), and cross-chain communication.
They do not address anywhere in the whitepaper the storage of data in the network. IoT devices will not be transaction only devices, they will need to maintain data, transmit data and query data. Without the ability to do so, the network will be crippled in its application.
IoTeX will use a variation of DPoS as the consensus mechanism. They are not specific about how this mechanism will work, with no discussion of data flow and no node communication diagram. This will be their biggest hurdle, and I believe it is why it was left out of the whitepaper. Cryptography and theory are easy to elaborate on within each specific subject, but tying it all together (subchains with smart contracts, transacting with other side chains, with ring signatures, bulletproofs and stealth addresses on the main chain) will be a challenge that I am not sure can be done efficiently.
They may be well positioned to make this work but you are talking about having some of the core concepts of your network being based on problems that haven’t been solved and computationally heavy technologies, namely private transactions within smart contracts. So while all the theory and technical explanations make my pants tight, the realist in me will believe it when he sees it.
.
-Scalability – Neutral to medium, has the framework to address it with some issues that will hold it back.
-Applicability – Medium, has smart contract possibilities, privacy baked into network, no data framework
-Interoperability – Medium, inherent in the network design
-Consensus – Low, inherent private transactions may choke network. Consensus mechanism not at all laid out.
-Developability – To be seen, not mentioned.
.
.
CPChain
CPC puts a lot of their focus on data storage. They recognize that one of the core needs of an IoT network will be the ability to quickly store and reference large amounts of data, and that this has to be separate from the transactional basis of the network so as not to slow it down. They propose solving this with distributed hash tables (DHT), in the same fashion as INT, storing data in a decentralized way so that no one source owns the complete record. This system is much like the one used by BitTorrent, which keeps data available regardless of which nodes are online at a given time. The data privacy issue is solved by using client-side encryption with one-to-many public key cryptography, allowing many devices to decrypt a singly encrypted file while no two devices share the same key.
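A sketch of what one-to-many, client-side envelope encryption can look like, using Python’s `cryptography` package (an illustration of the general technique, not CPChain’s implementation; the device keys and payload are placeholders): the file is encrypted once with a symmetric data key, and that key is wrapped separately for each recipient, so all can decrypt while no two share a key.

```python
# pip install cryptography
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Three devices, each with its own RSA keypair (stand-ins for real device keys).
devices = [rsa.generate_private_key(public_exponent=65537, key_size=2048)
           for _ in range(3)]

# Encrypt the payload once with a random symmetric data key.
data_key = Fernet.generate_key()
ciphertext = Fernet(data_key).encrypt(b"sensor readings ...")

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Wrap the one data key separately for every recipient.
wrapped_keys = [d.public_key().encrypt(data_key, oaep) for d in devices]

# Any single device unwraps its copy of the key and reads the file.
recovered = devices[0].decrypt(wrapped_keys[0], oaep)
print(Fernet(recovered).decrypt(ciphertext))   # b'sensor readings ...'
```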
This data layer will run on a separate, parallel chain so as not to clog the network and to enable scalability. In spite of this, they don’t discuss how they will scale the main chain. To partially address this, they will use a two-layer consensus structure centered on PoS to increase consensus efficiency. This two-layer system still requires the main layer to do the entirety of the verification and block generation, which will be a scaling issue: the network has no division of labor, so congestion cannot be segregated away from the rest of the network.
They do recognize that the main chain would not be robust or reliable enough to handle high-frequency or real-time devices and therefore propose side chains for those device types. Despite this, they are adding a significant amount of functionality (smart contracts, data interpretation) to the main chain instead of keeping it general and lightweight, which constrains the possible applications of the network and also makes it more difficult to upgrade.
So while this project, on the surface level (not very technical whitepaper), seems to be a robust and well thought out framework, it doesn’t lend itself to an all-encompassing IoT network but more for a narrower, data centric, IoT application.
.
-Scalability – Neutral to medium, has the framework to address it somewhat, too much responsibility and functionality on the main chain may slow it down.
-Applicability – Medium, has smart contract possibilities, elaborate data storage solution with privacy in mind as well has high frequency applications thought out
-Interoperability – Low, not discussed
-Consensus – Low to medium, discussed solution has high reliance on single chain
-Developability – To be seen, not mentioned.
.
.
ITC
The whitepaper reads like someone grabbed some of the big hitters in crypto buzzword bingo, threw them in there and explained what they were using Wikipedia. It says nothing about how they will tie it all together, economically incentivize the security of the network or maintain the data structures. I have a feeling none of them actually has any idea how to do any of this. For Christ’s sake, they present an explanation of blockchain as the core of the “Solutions” portion of their whitepaper. This project is not worth any more analysis.
.
.
RuffChain
Centralization and trust. Not very well thought out at this stage. DPoS consensus on a single chain. Not much more than that.
.
.
WaltonChain
Waltonchain focuses on tracking and validating the manufacture and shipping of items using RFID technology. The structure will have a main chain/subchain framework, which will allow the network to segregate traffic and infinitely scale by the addition of subchains given available nodes and main chain bandwidth.
DPoST (Stake & Trust) will be the core of their consensus mechanism, which adds trust to the traditional staking structure. This trust is based on the age of the coins in the staker’s node. The longer that node has held the coins, combined with the amount of coins held, the more likely that node will be elected to create the block. I am not sure how I feel about this but generally dislike trust.
Waltonchain's framework will also allow smart contracts on the main chain. Again, this level of main chain specificity worries me with respect to scale and difficulty of upgrading. This smart contract core also does not lend itself to private transactions. In this small subset of the IoT ecosystem that does not matter, as the whole basis of tracking is open, public records.
The whitepaper is not very technical so I cannot comment to their technical completeness or exact implementation strategy.
This implementation of the relay/subchain framework is a very narrow and under-utilized application. As I said before, WTC may someday just be one part of a larger IoT ecosystem while interacting with another IoT network. This will not be an all-encompassing network.
.
-Scalability – High, main/subchain framework infinitely scales
-Applicability – Low to medium, their application is narrow
-Interoperability – Medium, the framework will allow it seamlessly
-Consensus – Neutral, should not choke the network but adds trust to the equation
-Developability – N/A, this is a more centralized project and development will likely be with the WTC
.
.
VeChain
*Let me preface this by saying I realize there is a place for centralized, corporatized, non-open-source projects in this space.* Although I know this project is focused mainly on wider, more general business uses for blockchain, I was requested to include it in this analysis. I have edited my original comment, as it was more opinionated and therefore determined not to be productive to the conversation. If you would like to get a feel for my opinion, the original text is in the comments below.
This project doesn't have much data to go off as the white paper does not contain much technical detail. It is focused on how they are positioning themselves to enable wider adoption of blockchain technology in the corporate ecosystem.
They also spend a fair amount of time covering their node structure and planned governance. What this reveals is a PoS and PoA combined system with levels of nodes and related reward. Several of the node types require KYC (Know Your Customer) to establish trust in order to be part of the block creating pool.
Again, there is not much technical detail we can glean from this whitepaper. What is known is that this is not directed at an IoT market and will be a PoS and PoA Ethereum-like network with a trusted node setup.
I will leave out the grading points as there is not enough information to properly determine where they are at.
.
.
.
INT
So under this same lens, how does INT stack up? INT borrows its framework from Polkadot, a relay/subchain architecture. This framework allows for infinite scaling through the addition of subchains, given available nodes and relay chain bandwidth. Custom functionality in subchains allows whoever sets up the subchain to define its requirements, be it private transactions, state-transaction-free data chains, smart contracts, etc. This also lends itself to endless applicability. The main chain is inherently simple in its functionality so as not to restrict any uses or future updates in technology.
The consensus structure also takes a novel two-tiered approach, separating validation from block generation in an effort to further enable scaling by moving the block generation choke point from the side chains to the central relay chain. This leaves the subchain nodes to validate transactions only, with a light DPoS allowing a free-flowing transaction highway.
INT also recognizes the strong need for an IoT network to have robust and efficient data handling and storage. They are utilizing a decentralized storage system using DHT, much like the BitTorrent system. This, combined with the network implementation of all of the communication protocols (TCP/IP, UDP/IP, MANET), builds the framework of a network that will effortlessly integrate any device type for any application.
The multi-chain framework easily accommodates interoperability with established networks like the Internet and enables pegging with other blockchains through a few simple transaction type inclusions. With this cross-chain communication, manufacturers wouldn’t have to bend their needs to fit an established blockchain; they could create their own subchain to fit their needs and interact with the greater network through the relay.
The team also understands the development hurdles facing the environment. They plan to solve this by standardizing requirements for communication and data exchange. They have strong ties with several manufacturers and are currently developing an IoT router to be the gateway to the network.
.
-Scalability – High, relay/subchain framework enables infinite scalability
-Applicability – High, highest I could find for IoT. Subchains can be created for every possible application.
-Interoperability – High, able to add established networks for data support and cross chain transactions
-Consensus – High, the only structure that separates the two responsibilities of verifying and block generation to further enable scaling and not choke applicability.
-Developability – Medium, network is set up for ease of development with well-known language and subchain capability. Already working with device manufacturers. To be seen.
.
.
So with all that said, INT may be in the best place to tackle this space with their chosen framework and philosophy. They set out to accomplish more than WTC or VEN, on a network better equipped than IOTA or Ethereum. If they can execute on what they have laid out, there is no reason they won’t become the market leader, easily overtaking the market cap of VeChain ($2.5Bn, $10 INT) in the short term and IOTA ($7Bn, $28 INT) in the medium term.
submitted by Graytrain to INT_Chain [link] [comments]

Decred Journal – August 2018

Note: you can read this on GitHub (link), Medium (link) or old Reddit (link) to see all the links.

Development

dcrd: Version 1.3.0 RC1 (Release Candidate 1) is out! The main features of this release are significant performance improvements, including some that benefit SPV clients. Full release notes and downloads are on GitHub.
The default minimum transaction fee rate was reduced from 0.001 to 0.0001 DCR/kB. Do not try to send such small-fee transactions just yet, until the majority of the network upgrades.
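For a sense of scale, a quick back-of-the-envelope calculation (the transaction size below is illustrative, not a protocol constant):

```python
# Fee = rate (DCR per kB) x transaction size in kB.
OLD_RATE, NEW_RATE = 0.001, 0.0001   # DCR/kB
tx_size_kb = 0.25                    # a typical ~250-byte transaction

print(f"old minimum fee: {OLD_RATE * tx_size_kb:.6f} DCR")  # 0.000250
print(f"new minimum fee: {NEW_RATE * tx_size_kb:.6f} DCR")  # 0.000025
```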
Release process was changed to use release branches and bump version on the master branch at the beginning of a release cycle. Discussed in this chat.
The codebase is ready for the new Go 1.11 version. Migration to vgo module system is complete and the 1.4.0 release will be built using modules. The list of versioned modules and a hierarchy diagram are available here.
The testnet was reset and bumped to version 3.
Comments are welcome for the proposal to implement smart fee estimation, which is important for Lightning Network.
@matheusd recorded a code review video for new Decred developers that explains how tickets are selected for voting.
dcrwallet: Version 1.3.0 RC1 features new SPV sync mode, new ticket buyer, new APIs for Decrediton and a host of bug fixes. On the dev side, dcrwallet also migrated to the new module system.
Decrediton: Version 1.3.0 RC1 adds the new SPV sync mode that syncs roughly 5x faster. The feature is off by default while it receives more testing from experienced users. Other notable changes include a design polish and experimental Politeia integration.
Politeia: Proposal editing is being developed and has a short demo. This will allow proposal owners to edit their proposal in response to community feedback before voting begins. The challenges associated with this feature relate to updating censorship tokens and maintaining a clear history of which version comments were made on. @fernandoabolafio produced this architecture diagram which may be of interest to developers.
@degeri joined to perform security testing of Politeia and found several issues.
dcrdata: mainnet explorer upgraded to v2.1 with several new features. For users: credit/debit tx filter on address page, showing miner fees on coinbase transaction page, estimate yearly ticket rewards on main page, cool new hamburger menu and keyboard navigation. For developers: new chain parameters page, experimental Insight API support, endpoints for coin supply and block rewards, testnet3 support. Lots of minor API changes and frontend tweaks, many bug fixes and robustness improvements.
The upcoming v3.0 entered beta and is deployed on beta.dcrdata.org. Check out the new charts page. Feedback and bug reports are appreciated. Finally, the development version v3.1.0-pre is on alpha.dcrdata.org.
Android: updated to be compatible with the latest SPV code and is syncing, several performance issues are worked on. Details were posted in chat. Alpha testing has started, to participate please join #dev and ask for the APK.
iOS: backend is mostly complete, as well as the front end. Support for devices with smaller screens was improved. What works now: creating and recovering wallets, listing of transactions, receiving DCR, displaying and scanning QR codes, browsing account information, SPV connection to peers, downloading headers. Some bugs need fixing before making testable builds.
Ticket splitting: v0.6.0 beta released with improved fee calculation and multiple bug fixes.
docs: introduced new Governance section that grouped some old articles as well as the new Politeia page.
@Richard-Red created a concept repository sandbox with policy documents, to illustrate the kind of policies that could be approved and amended by Politeia proposals.
decred.org: 8 contributors added and 4 removed, including 2 advisors (discussion here).
decredmarketcap.com is a brand new website that shows the most accurate DCR market data. Clean design, mobile friendly, no javascript required.
Dev activity stats for August: 239 active PRs, 219 commits, 25k added and 11k deleted lines spread across 8 repositories. Contributions came from 2-10 developers per repository. (chart)

Network

Hashrate: went from 54 to 76 PH/s, the low was 50 and the new all-time high is 100 PH/s. BeePool share rose to ~50% while F2Pool shrank to 30%, followed by coinmine.pl at 5% and Luxor at 3%.
Staking: 30-day average ticket price is 95.6 DCR (+3.0) as of Sep 3. During the month, ticket price fluctuated between a low of 92.2 and high of 100.5 DCR. Locked DCR represented between 3.8 and 3.9 million or 46.3-46.9% of the supply.
Nodes: there are 217 public listening and 281 normal nodes per dcred.eu. Version distribution: 2% at v1.4.0(pre) (dev builds), 5% on v1.3.0 (RC1), 62% on v1.2.0 (-5%), 22% on v1.1.2 (-2%), 6% on v1.1.0 (-1%). Almost 69% of nodes are v.1.2.0 and higher and support client filters. Data snapshot of Aug 31.

ASICs

Obelisk posted 3 email updates in August. DCR1 units are reportedly shipping with 1 TH/s hashrate and will be upgraded with firmware to 1.5 TH/s. Batch 1 customers will receive compensation for missed shipment dates, but only after Batch 5 ships. Batch 2-5 customers will be receiving the updated slim design.
Innosilicon announced the new D9+ DecredMaster: 2.8 TH/s at 1,230 W priced $1,499. Specified shipping date was Aug 10-15.
FFMiner DS19 claims 3.1 TH/s for Blake256R14 at 680 W and simultaneously 1.55 TH/s for Blake2B at 410 W, the price is $1,299. Shipping Aug 20-25.
Another newly noticed miner offer is this unit that does 46 TH/s at 2,150 W at the price of $4,720. It is shipping Nov 2018 and the stats look very close to Pangolin Whatsminer DCR (which has now a page on asicminervalue).

Integrations

www.d1pool.com joined the list of stakepools for a total of 16.
Australian CoinTree added DCR trading. The platform supports fiat, there are some limitations during the upgrade to a new system but also no fees in the "Early access mode". On a related note, CoinTree is working on a feature to pay household bills with cryptocurrencies it supports.
Three new OTC desks were added to exchanges page at decred.org.
Two mobile wallets integrated Decred:
Reminder: do your best to understand the security and privacy model before using any wallet software. Points to consider: who controls the seed, does the wallet talk to the nodes directly or via middlemen, is it open source or not?

Adoption

Merchants:

Marketing

Targeted advertising report for August was posted by @timhebel. Facebook appeal is pending, some Google and Twitter campaigns were paused and some updated. Read more here.
Contribution to the @decredproject Twitter account has evolved over the past few months. A #twitter_ops channel is being used on Matrix to collaboratively draft and execute project account tweets (including retweets). Anyone with an interest in contributing to the Twitter account can ask for an invitation to the channel and can start contributing content and ideas there for evaluation by the Twitter group. As a result, no minority or unilateral veto over tweets is possible. (from GitHub)

Events

Attended:
For those willing to help with the events:
BAB: Hey all, we are gearing up for conference season. I have a list of places we hope to attend but need to know who besides @joshuam and @Haon are willing to do public speaking, willing to work booths, or help out at them? You will need to be well versed on not just what is Decred, but the history of Decred etc... DM me if you are interested. (#event_planning)
The Decred project is looking for ambassadors. If you are looking for a fun cryptocurrency to get involved in send me a DM or come talk to me on Decred slack. (@marco_peereboom, longer version here)

Media

Decred Assembly episode 21 is available. @jy-p and lead dcrwallet developer @jrick discussed SPV from Satoshi's whitepaper, how it can be improved upon and what's coming in Decred.
Decred Assembly episodes 1-21 are available in audio only format here.
New instructional articles on stakey.club: Decrediton setup, Deleting the wallet, Installing Go, Installing dcrd, dcrd as a Linux service. Available in both English and Portuguese.
Decred scored #32 in the August issue of Chinese CCID ratings. The evaluation model was explained in this interview.
Satis Group rated Decred highly in their cryptoasset valuation research report (PDF). This was featured by several large media outlets, but some did not link to or omitted Decred entirely, citing low market cap.
Featured articles:
Articles:
Videos:

Community Discussions

Community stats:
Comm systems news:
After another debate about chat systems more people began testing and using Matrix, leading to some gardening on that platform:
Highlights:
Reddit: substantive discussion about Decred cons; ecosystem fund; a thread about voter engagement, Politeia UX and trolling; idea of a social media system for Decred by @michae2xl; how profitable is the Obelisk DCR1.
Chats: cross-chain trading via LN; plans for contractor management system, lower-level decision making and contractor privacy vs transparency for stakeholders; measuring dev activity; what if the network stalls, multiple implementations of Decred for more resilience, long term vision behind those extensive tests and accurate comments in the codebase; ideas for process for policy documents, hosting them in Pi and approving with ticket voting; about SPV wallet disk size, how compact filters work; odds of a wallet fetching a wrong block in SPV; new module system in Go; security of allowing Android app backups; why PoW algo change proposal must be specified in great detail; thoughts about NIPoPoWs and SPV; prerequisites for shipping SPV by default (continued); Decred vs Dash treasury and marketing expenses, spending other people's money; why Decred should not invade a country, DAO and nation states, entangling with nation state is poor resource allocation; how winning tickets are determined and attack vectors; Politeia proposal moderation, contractor clearance, the scale of proposals and decision delegation, initial Politeia vote to approve Politeia itself; chat systems, Matrix/Slack/Discord/RocketChat/Keybase (continued); overview of Korean exchanges; no breaking changes in vgo; why project fund burn rate must keep low; asymptotic behavior of Decred and other ccs, tail emission; count of full nodes and incentives to run them; Politeia proposal translations and multilingual environment.
An unusual event was the chat about double negatives and other oddities in languages in #trading.

Markets

DCR started the month at USD 56 / BTC 0.0073 and had a two week decline. On Aug 14 the whole market took a huge drop and briefly went below USD 200 billion. Bitcoin went below USD 6,000 and top 100 cryptos lost 5-30%. The lowest point coincided with Bitcoin dominance peak at 54.5%. On that day Decred dived -17% and reached the bottom of USD 32 / BTC 0.00537. Since then it went sideways in the USD 35-45 / BTC 0.0054-0.0064 range. Around Aug 24, Huobi showed DCR trading volume above USD 5M and this coincided with a minor recovery.
@ImacallyouJawdy posted some creative analysis based on ticket data.

Relevant External

StopAndDecrypt published an extensive article "ASIC Resistance is Nothing but a Blockchain Buzzword" that is much in line with Decred's stance on ASICs.
The ongoing debates about the possible Sia fork yet again demonstrate the importance of a robust dispute resolution mechanism. Also, we are lucky to have the treasury.
Mark B Lundeberg, who found a vulnerability in atomicswap earlier, published a concept of more private peer-to-peer atomic swaps. (missed in July issue)
Medium took a cautious stance on cryptocurrencies and triggered at least one project to migrate to Ghost (that same project previously migrated away from Slack).
Regulation: Vietnam bans mining equipment imports, China halts crypto events and tightens control of crypto chat groups.
Reddit was hacked by intercepting 2FA codes sent via SMS. The announcement explains the impact. Yet another data breach suggests to think twice before sharing any data with any company and shift to more secure authentication systems.
Intel and x86 dumpsterfire keeps burning brighter. Seek more secure hardware and operating systems for your coins.
Finally, unrelated to Decred but good for a laugh: yetanotherico.com.

About This Issue

This is the 5th issue of Decred Journal. It is mirrored on GitHub, Medium and Reddit. Past issues are available here.
Most information from third parties is relayed directly from source after a minimal sanity check. The authors of Decred Journal have no ability to verify all claims. Please beware of scams and do your own research.
Feedback is appreciated: please comment on Reddit, GitHub or #writers_room on Matrix or Slack.
Contributions are welcome too. Some areas are collecting content, pre-release review or translations to other languages. Check out @Richard-Red's guide how to contribute to Decred using GitHub without writing code.
Credits (Slack names, alphabetical order): bee, Haon, jazzah, Richard-Red and thedecreddigest.
submitted by jet_user to decred [link] [comments]

EOT - Encryption of things

**EOT - Encryption of Things**
There is a huge surge in devices attached to the internet, known as the Internet of Things, and it is estimated that over 80 billion devices will be connected to the internet by 2025, from industrial machines to devices in our homes. Constant hacking and cyber attacks have increased not only the demand for but the necessity of secure solutions. Our privacy and digital footprint are at risk.
**Some examples where encryption plays a role:**
**Secure messaging** - To make messages truly secure, we need a process whereby cryptography can be applied to encrypt the transaction.
**Secure calling** - Secure calling is a process whereby the caller and the recipient of the call are identified and linked via a blockchain-enabled cryptocurrency transfer, again creating public and private encryption keys and making the call truly private.
**Secure media storage** - To safely and securely store media, a process is required where 1) access to the media is encrypted via the public and private keys of the person storing it, 2) the media itself is encrypted with a set of encryption keys, and 3) media storage costs are paid via cryptocurrency.
**Secure browsing** - To browse the internet securely, we need a process of verification whereby nodes on the blockchain can verify websites as “safe”. Furthermore, the entire process needs to be encrypted as well.
**Verification** - This is one of the most important uses of a blockchain. We can verify websites as in the example above, but also various other things such as identity, title and ownership, date stamps and the source of products, as with verifying the origin of agricultural or other goods. These are just a few examples. All of this data needs to be encrypted as well.
**“Smart home” security** - Wi-Fi is often used for remote monitoring and control. Home devices, when remotely monitored and controlled via the Internet, are an important constituent of the Internet of Things - all needing encryption; otherwise, it is a hacker’s paradise.
**EOT in the future** - The examples above only scratch the surface of where these technologies are applicable, and who knows what will be invented in the future. Google, Apple and Uber are all testing cars that drive themselves. A major issue with this technology is again the security aspect and the need to protect against hacking. And who wants to board a spaceship to Mars that might be hacked or hijacked by ransomware?
So the future for the **“Encryption of Things”** (EOT) looks very interesting indeed, and the role of cryptocurrencies in it will be major.
Read the full white paper here - http://eottoken.com/index.php/whitepape
The first device using EOT Coin is the **BitVault®** - the world’s first crypto communicator and blockchain phone. The BitVault is a revolutionary new product built around security and privacy, enabled through blockchain technology. Biometric security is enabled through fingerprint and iris scanning. Iris patterns are unique to you and are virtually impossible to replicate, which means iris authentication is one of the safest ways to keep your BitVault locked and its contents private. Proven biometric technology brings a whole new level of security to your cryptocurrency and blockchain transactions, creating military-grade security for your device through third-party, independent multilayer security.
September 2017 – Swiss Bank in Your Pocket integrates EOT Wallet (Achieved)
October 2017 – BitVault®, the world’s first blockchain phone, launches in London with integration of EOT for secure calling, secure messaging and secure browsing (First batch shipped)
November 2017 – BitVault® Global App Store launches for developers to develop their own applications (Achieved)
November 2017 – Website EOT Payment Gateway for WordPress and WooCommerce (Achieved)
December 2017 – Cryptodoc stores all your documents securely and encrypted on your PC
December 2017 – Password Wallet stores all your passwords for applications and websites encrypted on your PC
January 2018 – Smart Router for secure, encrypted internet which is direct, safe and easy
January 2018 – EOT Camera, an Encryption of Things connected camera
February 2018 – EOT Development Kit for hardware devices
The EOT payment gateway is live on swissbankinyourpocket.com; you can now buy SBIYP and BitVault using EOT coins.
More on the BitVault here:
https://swissbankinyourpocket.com/bitvault/
https://swissbankinyourpocket.com/bitvault-apps/
https://bitcointalk.org/index.php?topic=2152534.0
JOIN US ON REDDIT: www.reddit.com/EncryptionOfThings
JOIN US ON SLACK: https://join.slack.com/t/eot-coin/shared_invite/enQtMjc3NzkxNzY5NzQ0LTFjMWI5NTJjOGEzYjU5ZDk0ZjRjZWE3MzBkNmI0MmQ2NTUzMTBkOGQ1YmEyNTllMmNiYzA3MGZjOGVmY2IyZGU
The EOT Token is trading on the Waves Platform. Tokens are a 1:1 image of EOT coins; EOT coins can be converted to tokens and vice versa using the gateway service in the SBIYP hardware wallet. If you do not have that hardware wallet, you can contact members on Slack who have purchased SBIYP to do the swap for you.
TOKEN DETAILS
Name : EOT Token (Verified)
Identifier : GdnNbe6E3txF63gv3rxhpfxytTJtG7ZYyHAvWWrrEbK5
Total supply : 100,000,000
EOT token (EOT) markets added on the Tidex Exchange https://tidex.com/exchange/eot/btc https://tidex.com/exchange/eot/waves
EOT Coin details (currently minable)
https://github.com/EmbeddedDownloads/EOTCoin
Windows wallet: https://github.com/EmbeddedDownloads/EOTCoin/releases/download/v1.0.0.1/EOTCoin-win.exe
Windows desktop wallet: https://github.com/EmbeddedDownloads/EOT-Coin-Windows-Desktop-Wallet/releases/download/1.0/EOTCoinDesktopWallet1-0.zip
Mac wallet: https://github.com/EmbeddedDownloads/EOTCoin/releases/download/v1.0.0.1/EOTCOIN-Qt-OSX-v1001.dmg
Web wallet: http://eot.digital (Closing, please withdraw your coins)
Android wallet: https://github.com/EmbeddedDownloads/EOTAndroidWallet/releases
Block explorer: http://www.eot.digital:3622/
Block explorer 2: http://www.eotcoin.info (created by @Luanptit)
Block reward: 100 coins | Algorithm: Scrypt | Block time: 90 seconds
MINING POOLS
Official mining pool: http://www.eot.digital:3001/
Getting started: minerd -a scrypt -o stratum+tcp://www.eot.digital:3256 -u WalletAddressWhereYouWantYourMiningCoins -p 1
unofficial Mining pools http://www.altminer.net
http://antminepool.com
http://coinminers.net/
http://www.vivocryptopool.com
Currently EOT is traded on the Waves DEX, Crypto-Bridge DEX and Tidex. Listings on bigger exchanges will be available soon; communication with those exchanges is ongoing.
Opportunities are available with EOT - from Development, Mining, Trading as well as other business opportunities built around the EOT currency and the "Encryption of Things"
BitVault on Yahoo Finance: https://finance.yahoo.com/news/bitvault-worlds-first-blockchain-phone-201600279.html
Press release, 4th October 2017, Yahoo Finance: https://finance.yahoo.com/news/bitvault-announces-london-launch-161000826.html?soc_src=community&soc_trk=tw
Some Helpful Information
Quote from Story777 (bitcointalk.org, topic 2091616):
You have been keeping a great secret.
I've been doing a bit of research on the technology behind this coin. It looks like a lot of research has gone into this tech, since about 2004, and shortly after, a patent for this P2P system was quickly issued.
BitVault (https://swissbankinyourpocket.com/product/bitvault/), who are using the world's first blockchain phone as a secure communication device, are ultimately taking encryption to the Internet of Things (IoT), keeping our personal and business data secure. All this is done using EOT coin (Encryption of Things).
In today's world, insecure devices are rampant. Here are a couple of links about the CIA being able to use insecure devices to 'cause accidents': http://www.sandiegouniontribune.com/news/cyber-life/sd-me-wikileaks-cia-20170307-story.html and https://www.washingtonpost.com/news/innovations/wp/2017/03/08/what-we-know-about-car-hacking-the-cia-and-those-wikileaks-claims/
It's scary to think a legal entity could possess such power over life. The mere fact that a governing authority can request phone records (e.g. text messages, voicemails or eavesdropping) proves that most, if not all, telecommunications companies do not encrypt; otherwise, what would be the point of requesting the information (legal or not)?
Commercially sensitive information needs to be protected, and most importantly, in my opinion, our rights and the privacy of all citizens need to be protected.
From my understanding, BitVault is a platform for reference data. This is data stored for compliance reasons, such as e-mails, invoicing systems and check imaging (e.g. high-quality imaging for x-rays, MRI scans etc.), and a prototype was developed in 2004. This means a massive amount of data storage is required, with fail-safe systems, so an authorised user can access this information very quickly.
Three goals needed to be achieved: low cost, high reliability/availability and simplicity. This was the birth of BitVault via EOT.
BitVault ultimately stores immutable objects, with each new version identified by a 160-bit key.
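To make that concrete, here is a minimal sketch of content-addressed storage, where every immutable object version is identified by a 160-bit key. This is my own illustration rather than BitVault's actual code; SHA-1 appears only because it happens to produce 160-bit digests.

```python
import hashlib

# A toy content-addressed store: objects are immutable, and every version
# is identified by the 160-bit SHA-1 digest of its content.
store = {}

def put(data: bytes) -> str:
    key = hashlib.sha1(data).hexdigest()  # 160-bit identifier
    store[key] = data  # identical content maps to the same key, so writes are idempotent
    return key

def get(key: str) -> bytes:
    return store[key]

v1 = put(b"policy draft")
v2 = put(b"policy draft, revised")  # a new version gets a new key; v1 is untouched
assert get(v1) == b"policy draft"
```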
System stability is very important and must be immune to failure sequences. Parallel repair via indexing is one of the many strengths Bitvault has been able to demonstrate.
BitVault is a back-end system that uses applications to catalogue object IDs. Using a catalogue utility and indexing within an application prevents scalability bottlenecks under heavy loads.
Fast forward three years to 2007, when a very important decision was made to decentralise BitVault's system. This, in my opinion, is one of the fundamental principles of cryptocurrency: no one entity or person has any control of the stored data, and only the authorised user can access it. Ultimate security, and thus personal safety (see the CIA articles above). Using applications, BitVault has been able to provide provable communication and data storage with ease of retrieval and vital security measures.
BitVault is not alone in researching security solutions for the IoT; projects such as Venti and the like are making progress. However, BitVault is head and shoulders above its few competitors and is already offering practical, cost-effective working solutions on the market with huge scalability.
Well done BitVault, well done EOT; your secret is out, let the world embrace it.
Quote from Story777:
I have had a response in Slack and it has satisfied my questions. Thank-you.
For everyone's information, here it is:
The currency was created with a 200 million EOT total supply on 7 July 2017. [We showed it to the community at London Fintech Week with the demonstration of the BitVault - fintechweek.com]
100 Million was pre-mined and another 100 Million are currently being mined, 1 block every 90 seconds @ 100 coins per block.
The pre-mined coins were listed on Waves as a token to create a market for the coins while we work on getting listed on other exchanges.
The 100 million coins listed were distributed in several ways. Firstly, this was not an ICO, because our business is already funded via private capital. We wanted to get the currency distributed as widely as possible, so most of the initial coins were given away to a number of interested parties. We distributed them to our whole development team, business partners and employees, as well as to the Waves and other communities. We did not sell all these coins at the current price; most were given away for free to people with an interest in our products and business. The price now is formed by whoever owns these coins.
The tokens on the exchange are really a representation of the currency, and as such have value because they can be interchanged, just as Bitcoin and Ethereum are on the Waves exchange. This whole process is explained on page 4 of the Waves whitepaper; I think they call it an asset-to-asset exchange, which makes it possible to list any asset that exists on Waves. Unfortunately Waves currently only has gateways for Bitcoin, Ethereum, Waves, EUR and USD, so we have to develop our own gateway, which will be available on Nov 1.
So, to clarify: 100,000,000 tokens costing $190M were not sold. The distribution was a combination of airdrops, private sales and sales on the exchange.
Some EOT coins are needed because: "A lot of EOT will be distributed through our devices. For example our encrypted routers are pre-loaded with EOT, so we need that stock and it will be distributed that way".
And in response to a question about tokens on the Waves Exchange: "This is how Bitcoin works on Waves: they created 21 million BTC tokens. When you deposit Bitcoin into a Waves account, you receive an equal amount of tokens which you can either trade or even send via the Waves blockchain to another user. Once you withdraw, your tokens are exchanged for BTC and you receive it back into your BTC wallet. It's exactly the same for USD or EUR: you don't send euros to another client on Waves if you transfer, you send a token that represents EUR. This works exactly the same way."
These are the answers I was looking for and make a lot of sense now. This is indeed an exciting project. :)
It's time to trade....
Now I have one question left....
Is there anyone using NiceHash to mine this coin? I keep being disconnected because the difficulty is too low. Can anyone help?
Quote from Shews: EOT (coin) is now tradable on the CryptoBridge Decentralised Exchange; you can sign up below.
Please note this is for the EOT COIN ONLY; do not send tokens to this DEX. This is a secure means of trading, with the back end running on a blockchain. It is still in beta but has been working flawlessly so far. If you'd like more info, their website link is below.
https://wallet.crypto-bridge.org/?r=388691
You can sign up in local wallet mode, meaning you are the only one with access to your keys; this is the most secure option. There is also the option to sign up with an account if you require access to your funds on the go.
More info:
https://crypto-bridge.org/
submitted by johningreece to crypto_currency

An in depth look into Sparkster and why I believe it is in a league of its own

Introduction
Today I am writing about a project I truly believe in. I am on the same page as Ian Balina when I say that I see this project as an all-star ICO. This is not your average run-of-the-mill vapourware ICO with no MVP; this is a working platform with a great team behind it. You can find AMAs on YouTube (Link 1) with live demonstrations of their TPS progress to date, and you can also try out their platform for yourself on their website; these are linked at the end of the article for your convenience. They also have a pretty good bounty programme running at the moment, which I shall link as well (Link 2).
Please don’t consider this investment advice, I hope you will read this article and consider it a starting point for your own research. At the time of writing the market has taken another nasty dip, however this is the time when smart investments need to be made, And I truly fell this is one of them. I would also like any of you who enjoy this article to please upvote it and check out my previous work and stay tuned for more.
I will be diving deep into their whitepaper (Link 3) today, basing my article on their videos and my personal experience with their platform. All of this information can be found on their website and in their whitepaper. As such, I imagine this is going to be a long article.
So, to begin: Sparkster is essentially a decentralised cloud platform that will allow anybody to build software in plain English via a simple drag-and-drop interface. In their whitepaper they acknowledge that this was inspired by MIT Scratch. In today's world programmers work in various programming languages, each requiring separate training. For example, Solidity is one of the most popular in use today, "a contract-oriented programming language for writing smart contracts" (Wikipedia, 2018). It is used on many blockchain platforms; it was developed by Ethereum's Project Solidity team for use on the Ethereum Virtual Machine and is the most popular smart contract language at present.
Sparkster aims to provide a platform which will allow smart contracts to run at 10 million transactions per second (TPS), which would make it the fastest decentralised cloud software in the world.
Concept development
In their whitepaper they explain that this project was conceived after 14 years spent working with software engineers designing and building ERP software for a start-up. Sparkster was born from the frustration of this process, and after 6 years of R&D they have the working product we see today. This is an enterprise-ready platform. They also state that they have already signed deals with large tech companies (ARM & Libelium).
They also talk about how the industry is trying to make things more practical, but argue it has not gone far enough. Sparkster are the market leaders here, as they are targeting an audience of 99% non-software-developers and allowing them to build software. Interestingly, at Mobile World Congress 2018 they demonstrated the platform using AI facial recognition to detect a cleaner at a house and open a door lock; I saw this in a YouTube video, which I will link below (Link 4). This is a team which has proved it has a working product.
Claims/ Vision
  1. In their whitepaper they claim they want to become the world’s first platform where people can build their own visions into reality and create financial independence for themselves and contribute to society.
  2. Sparkster will tear down the barrier to entry for software creation. Their drag-and-drop functionality on the platform makes this possible. Up until yesterday I had no clue how a smart contract worked at the basic level; now I consider myself an expert software developer. Who would have thought I could throw away my old life and upskill in 24 hours? Ha.
  3. What I also love about this project is that it will empower people to bring their own ideas to the table and be able to sell them, thus creating financial independence.
  4. The Sparkster (2018) website (Link 5) suggests they will further disrupt the $200 billion cloud computing industry and combat the extortionate prices that large centralised cloud providers like AWS, Microsoft, Google and IBM charge.
  5. This is a finished product guys, please try it yourself if you don’t believe me.
Problem today
As per Sparkster's (2018) claim, the biggest problem faced today is that organisations and individuals who wish to implement AI, IoT and smart contract technology have limitations placed on them, most notably that their own IT departments adapt too slowly and there is a serious lack of experienced personnel in these areas. When I watched the AMA that Sajjad Daya (CEO) did with Ian Balina, he described how hard it is to communicate what you want to a developer and get the result you require; the end result often does not meet your expectations. This of course leads to wasted time, as it requires much back-and-forth correspondence, which he said can stretch months down the line (something I have experienced in my own organisation). This traditional "software development lifecycle" is truly a slow and painful process, just as they claim in their whitepaper. Changes to the software down the line are also very expensive.
Furthermore, the team claim that most business software used today (SAP, Oracle, Microsoft etc.) is incapable of interfacing with the technologies of the future (IoT, AI, smart contracts). The Sparkster whitepaper further suggests that the talent is simply not there in the industry today to face this challenge, and much up-skilling is required. The team believe that the high capital cost and long lead time of translating a vision into software in the traditional manner is the biggest problem facing enterprises today, as it curbs innovation. I concur with this sentiment.
How they will achieve/solve this
According to their whitepaper, this platform is the solution to all of the above problems. It is a platform which targets the new era of AI, IoT and smart contracts, all tailored to non-developers, "making it accessible to the 99% who do not know how to code and don't want to learn" (Sparkster, 2018).
They will create this platform by targeting users of cell phones, notebooks, laptops and other personal devices, who in essence will all become miners on the network. In turn, users receive Spark tokens as a reward for contributing spare capacity. Using these devices is far cheaper than today's centralised systems, according to the team. They further state in their whitepaper that this lower cost arises from using inexpensive nodes, and that the cost goes down as the network scales, whereas traditional cloud computing costs remain constant. Companies will provide the value by paying for software creation.
Fees
To scale the platform, they will make personal use free, but limited to a certain number of transactions per month; this restriction can be lifted by referring others. Commercial use will carry ongoing fees (licences, transaction fees, storage fees etc.). The team also describe how the platform and cloud are complementary, which will allow users to build software 100x faster and cheaper than traditional means, so I see this becoming a very popular platform for mass blockchain adoption.
Their plan for growth
A marketplace will essentially become available where users sell their software creations via peer-to-peer transactions, so value really depends on how users use the platform. Also, users lending the spare capacity (CPU, memory) of phones and other devices will be awarded Spark tokens. These can all be used to offset the fees paid.
According to their whitepaper they will also focus on strategic partnerships. As mentioned above, they already have partnerships with ARM (the world's largest computer chip designer) and Libelium (an industrial sensor and gateway distributor). They also plan to target vertical markets, specifically IoT and smart contracts, as growth is forecast to be huge in both. I personally see the use of smart contracts in society as the single biggest use case of blockchain in the future.
Platform
What is amazing about this platform is that you can actually try it for yourself on their website. I completed the six walkthroughs myself and was very impressed by what I experienced. I have never attempted to create anything with software before, but the process was made so simple by Sparkster. You can literally drag and drop different interfaces together and define the behaviour of each block. It's a very simple and intuitive approach to building smart contracts. As described in the Sparkster (2018) whitepaper, you just snap together blocks that describe "what" you want without worrying about "how" it works; they even liken it to building with Lego. The walkthroughs show you how to create a simple calculator, and by the sixth lesson you have developed a complex insurance smart contract from which premiums can be calculated and payments automatically made.
Sparkster claim that this will make the creation of smart contracts 100 times faster and cheaper than traditional software development, a claim which I am starting to believe after experiencing their walkthroughs. This is a rare project which already has a working platform. Why wouldn't you be impressed?
Most ICO’s today are nothing but vapourware, who look for you money and don’t even have minimum viable projects to offer. I would advise you all to look at their AMA’s on YouTube and partake in their walkthroughs and you will see for yourself.
A more detailed look into their platform
According to Sparkster (2018), their smart software is made up of the following components (a toy sketch of how they compose appears after the list):
  1. Flows- The definition of the software, made up of all core components of the platform.
  2. Functions- Single building blocks that perform units of work and can be plugged together to build processes (e.g. an insurance policy, as seen in their walkthrough video). They also have a well-defined user interface.
  3. Documents- Basic data storage entities on the platform, they differ from functions as they are there to retrieve, persist, update and delete data. Sparkster say that they are there to represent an entity in the real world e.g. a user’s car insurance policy. Furthermore, storage nodes on the cloud will be rewarded for this storage and retrieval of data.
  4. Integrations- This is the interface to the outside world. Sparkster say they provide a simple abstraction over a 3rd-party API or web service. What I like about this is that somebody can create an integration (e.g. a shipping quotation) and allow others to use it after it's created, via the marketplace. Sparkster aim to let people do this without worrying how it all works.
  5. Devices- These replicate devices in the real world, comprising commands and fields (bidirectional data transfer). In their whitepaper they use the example of a temperature probe in a greenhouse, where the temperature feeds back to the action field. It is very complex stuff.
  6. Gateways- These represent a group of devices connected to one gateway. Sparkster say these are all connected to the internet, allowing the platform to interact with them individually or as a group.
  7. Smart Contracts- This is the element I found most fascinating during the Sparkster walkthrough videos. It allows you to create smart contracts to enable transactions on the platform. Currently they support Ethereum smart contracts and Iota smart transactions. I found the whole process very easy. They further state that all the above components can interact with the smart contracts, which was proven to me in the walkthroughs.
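As a toy illustration of the composition idea behind Flows and Functions (my own sketch in Python, not Sparkster's actual model or syntax), blocks can be treated as small functions snapped together into a pipeline:

```python
# Toy illustration only: treating "functions" as blocks snapped into a "flow".
def apply_risk_factor(premium_base: float) -> float:
    return premium_base * 1.4  # block 1: weight the base premium by risk

def add_tax(amount: float) -> float:
    return amount * 1.1  # block 2: apply a 10% tax

flow = [apply_risk_factor, add_tax]  # the "flow" is just the ordered blocks

def run_flow(value: float, blocks) -> float:
    for block in blocks:  # each block's output feeds the next block's input
        value = block(value)
    return value

print(round(run_flow(100.0, flow), 2))  # a base premium of 100 becomes 154.0
```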
Their claim of 10 million TPS
From what I can understand from their whitepaper and from an AMA with their CEO, this will be a step-by-step approach to 10 million TPS, admittedly a few years down the line; but they have already proved their platform works and is running at over 50k TPS with 50 cells. They don't seem to have hit any scalability issues just yet. And I should not need to remind you that 50k TPS is far more than other blockchain products out there.
In their whitepaper they tell us that this is designed as a specialised blockchain for the use of "smart software". What is important to understand is that they can reach higher TPS because they don't have to "act" like other blockchains: most of their clients will want to keep data private, which "eliminates the necessity of maintaining global state" (Sparkster, 2018). This in turn allows them to shard their distributed hash tables into client groups, where "one shard never needs to have any awareness of another shard" (Sparkster, 2018). They will essentially isolate cells from one another in order to scale to this level. They give an example in their whitepaper of a company like Airbnb, with millions of customers, partitioning its customer data across cells.
Overall their theory is that there is technically no limit to the number of TPS they can achieve; 10 million is just a target number. I have full confidence they can pull it off. What other blockchain is proving this live on air like this team is?
Decentralised cloud
The Sparkster (2018) website describes how traditional cloud providers such as Amazon, Microsoft and Google carry huge costs relating to servers, backup power, staff, security and cooling. Decentralised cloud computing, they argue, will be the death of these organisations. For instance, Sparkster claim that by executing small software components on people's mobile phones these costs fall to near zero, and they envisage a world where many such miners join the decentralised cloud and reduce the costs further.
Their cloud will facilitate the execution of smart software created on the platform. Their whitepaper further suggests that one can simply download the Sparkster mining app to a phone, which provides the runtime environment for user-generated smart software (SRE). Companies will place bids on the exchange, staking Spark tokens (the amount they are willing to pay) for their software to run in a decentralised fashion. The team envisage this as a free market where bidders can stake as much as they like and miners can ask for whatever they like. Payment is made to the miners in these tokens.
They further say compute and storage nodes can join the network and be paid in Spark tokens, but they are required to stake tokens themselves as collateral to ensure they operate honestly. Sparkster will have verification nodes to validate transactions from compute and storage nodes, and if any "bad behaviour" is found they take these staked tokens in the form of a "bounty". In my opinion this will make it a very secure platform.
Sparkster Technology Stack
The below image from their whitepaper shows the levels “smart software” goes through to facilitate decentralised cloud computing.
https://preview.redd.it/4qnqlgowyf311.png?width=357&format=png&auto=webp&s=fa68bc369d37c14073dcbd4869518f3b1485c057
Source: Sparkster, (2018)
Throughput
What is very interesting is the high throughput they can sustain at such a high TPS. If you know anything about blockchain you will understand this is a challenge for every chain: the more users a platform has, the more scaling is required. For instance, during the bull run in December I remember how slow the Ethereum blockchain became, which was also attributed to the increase in ICOs and DApps launching on the platform.
Sparkster claim their cloud is capable of "scaling linearly without any overhead curtailing its meteoric performance" (Sparkster, 2018). They achieve this by isolating cells within the chain. The whole idea is to "isolate" chains, essentially creating independent blockchains which have their own hash tables and never synchronise with each other; they describe this like a human cell, which, once split, never shares anything with another cell. It is a very simple concept: a user's data is stored in a specific cell, so why would another, unrelated company using the Sparkster platform need to know about or access the information in the first cell? Each "cell" is capable of 1,000 TPS, and because each has its own hash table, two cells give 2,000 TPS, and so on.
Essentially data is streamed in parallel but syncing is never needed. This is huge: this is a platform which, unlike any other blockchain, is designed for mainstream adoption. Any company can use it to store data and be sure of high throughput. As mentioned above they are already at 50k TPS, which is far better than most blockchains today. This is a true working product and I can see it getting to 10 million.
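The arithmetic behind the linear-scaling claim is easy to check against the figures quoted above (1,000 TPS per cell, 50 cells for 50k TPS); a quick sketch:

```python
TPS_PER_CELL = 1_000  # per-cell figure quoted above

def network_tps(cells: int) -> int:
    # Cells never synchronise, so capacity simply adds up.
    return cells * TPS_PER_CELL

print(network_tps(50))             # 50,000 TPS, matching the demo figure
print(10_000_000 // TPS_PER_CELL)  # 10,000 cells would be needed for the 10M target
```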
Consensus
Time for a quick history lesson: Bitcoin uses proof of work and Ethereum is moving to proof of stake. These are the two most common consensus mechanisms used by blockchains today. Proof of work favours the party with the highest hashing power, whereas proof of stake favours the party with the most money at stake. This team has chosen to implement the Stellar Consensus Protocol (SCP) instead, which they consider a better fit.
Sparkster describe this as a commercial version of a Federated Byzantine Agreement System (FBAS), running at around 1,000 TPS per cell. They will also implement an incentive layer to keep parties honest and minimise the risk of attack, as SCP does not have this. This will be done by awarding Spark tokens to compute nodes (which donate CPU and memory on a device) and storage nodes (which contribute storage space and network bandwidth). Clients of the platform will cover these incentives. The team believe this extra layer is required for the platform to surpass traditional cloud platforms, and I tend to agree with them.
Their whitepaper further suggests that a proof-of-work-style calculation will be used to determine these incentives. This allows misbehaviour to be detected, with stakes confiscated by verification nodes. Pages 35-39 of the whitepaper, linked below, detail how these are calculated.
Consistent hashing
As they don’t use global state this algorithm allows the platform to “hash the clients ID and extract a bounded number” (Sparkster, 2018). This will identify a particular client within a cell.
Privacy
One of the biggest fears around any data platform is privacy protection. The Sparkster team say that their cloud deconstructs data into fragments, encrypts them and disseminates them across the network of nodes. This is particularly important now with the EU's General Data Protection Regulation (GDPR), as discussed in their whitepaper; any hack of the platform would yield meaningless returns. They also claim to use "zk-SNARKs… a zero-knowledge proof to ensure that client data is obfuscated, even from other network participants" (Sparkster, 2018).
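The fragment-and-encrypt idea can be sketched in a few lines; this uses the Python `cryptography` package purely for illustration and is not Sparkster's actual scheme:

```python
from cryptography.fernet import Fernet  # pip install cryptography

def fragment_and_encrypt(data: bytes, fragment_size: int = 32):
    """Split data into fragments and encrypt each independently, so no
    single node ever holds a readable piece of client data."""
    key = Fernet.generate_key()  # in reality the key would stay with the client
    f = Fernet(key)
    fragments = [data[i:i + fragment_size] for i in range(0, len(data), fragment_size)]
    return key, [f.encrypt(frag) for frag in fragments]

key, encrypted = fragment_and_encrypt(b"sensitive client record to be spread across nodes")
print(len(encrypted), "encrypted fragments ready to disseminate")
```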
Security
They also claim they can detect software intrusions such as tampering with the code, memory or thread. Once their system detects this all client data is automatically deleted from the memory along with the access keys to the Sparkster network, as claimed on their website.
In their whitepaper they also claim that any software built on the platform is "entirely bug free". Their reasoning is that even though you as the user dictate the logic, the actual underlying code is very uniform and consistent.
Their app will also use public/private keys and digital signatures, and checksums will be used to detect file tampering. In their whitepaper they also state that cache data won't be stored, all data will be encrypted, all communication is over SSL/TLS, and they will employ third parties to detect malicious payloads in memory.
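The checksum part of that claim is standard practice; a minimal sketch of file-tamper detection:

```python
import hashlib

def checksum(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Record a file's checksum at install time; any later mismatch means
# the file was tampered with:
#     expected = checksum("app.bin")
#     tampered = checksum("app.bin") != expected
```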
Multi chain interoperability
Sparkster can already be used with both Ethereum and Iota, with plans to add more chains down the line, all to cater to the preferences of the user. This is a very transparent platform, tailored around usability and ease.
https://preview.redd.it/c23z7yl2zf311.png?width=451&format=png&auto=webp&s=81c05aeb65a528dd482fc97c4803cb2712b40fd7
Source: (Sparkster, 2018)
Token economics
Stats
Value
The value model proposed by their whitepaper suggests that the global marketplace will be the value driver of the platform: people can create and sell content on an open peer-to-peer market, with the value flowing through the Spark token. Small platform fees will be charged on transactions (not on free contributions).
It is a utility token because its purpose is to facilitate payments, and it will be the only currency accepted on the platform. Once the decentralised cloud is released in Q4 2018, miners will be able to earn Spark tokens.
I believe this will be a market leader when it comes to mass adoption of blockchain; this is truly a one-model-fits-all platform, and it is growth of the platform that will drive the value of the token up. The Spark token is also essential to the cloud's functioning, as miners need to stake tokens to ensure good behaviour; if the opposite occurs, verification nodes claim these stakes. This makes the token essential to the smooth running of the platform.
Breakdown of token distribution
Use of funds:
Team
In my view the team has a huge wealth of experience within it. This consists of:
2 all-star advisors
4 on the leadership team
17 further team members (Sparkster Warriors)
· These team members range from software engineers, developers, designers, project team leaders, programmers and digital marketers.
· There is so much experience in this team it would take all day to write about them, but a wide-ranging team like this shows they are serious about what they are doing.
Conclusion
This is a not-to-be-missed ICO. I really feel this is one of the all-star ICOs of the year. There is nothing more that needs to be said; I would just advise that if you are considering this project, go to their website and test the platform for yourself. It was the walkthroughs that sold me on this project, and it is one I will be investing in.
Additional reading (Links)
Link 1- AMA with Ian Balina (All-star ICO): https://www.youtube.com/watch?v=_K9j_EGHbpc
Link 2- Sparkster bounty programme: http://sparkster.me/try?r=DU2VUW45
Link 3- Sparkster whitepaper: https://drive.google.com/file/d/1_341kbDEDc9PWn4lbsCGpAmcqDqcggUq/view
Link 4: Sparkster founder & CEO speaking at MWC 2018: https://www.youtube.com/watch?v=X-Jf9_fcxYo
Link 5: Sparkster website: https://sparkster.me/
References
· En.wikipedia.org. (2018). Solidity. [online] Available at: https://en.wikipedia.org/wiki/Solidity [Accessed 11 Jun. 2018].
· Sparkster (2018). Build and Run Decentralized Software in Plain English. [ebook] Sparkster whitepaper, pp.1-57. Available at: https://drive.google.com/file/d/1_341kbDEDc9PWn4lbsCGpAmcqDqcggUq/view [Accessed 11 Jun. 2018].
· Sparkster.me. (2018). Sparkster – Build Apps, Write No Code. [online] Available at: https://sparkster.me/ [Accessed 11 Jun. 2018].
submitted by Mick2018 to Sparkster

DADI - All-In-One Thread


DADI: Decentralized Architecture for a Democratic Internet

For a FULL list of DADI updates, including those to the DADI d'Apps, see here: https://dadi.cloud/en/updates/

What is DADI?

DADI: Decentralized Architecture for a Democratic Internet
DADI Official Video
DADI is built on the Ethereum blockchain using an ERC-20 token, allowing the use of smart contracts and thus improving transparency. Think of the DADI network as an alternative to centralised providers such as Amazon Web Services, Google Cloud and Microsoft Azure, but running on a decentralised cloud infrastructure supported by the contributors of the network (masternodes). The DADI network will be widely distributed across an increasingly large number of location-aware nodes at the edge of the network; this increases efficiency and helps prevent a single point of failure. DADI's decentralised cloud platform currently focuses on 11 web services (see below), which will feature in the DADI marketplace as intelligent apps.
One way of thinking about this is that the DADI nodes run on a side chain running the DADI software. Contributors stake (PoS) their DADI tokens on the Ethereum network to secure a node within the DADI network, and nodes use Proof of Work (PoW) and Proof of Availability (PoA) to earn contributors DADI token rewards. There is more information on this in the Masternode section of this post. This means anyone with a laptop, phone, home router or other smart device with an internet connection will be able to earn income by providing spare compute.
DADI is extremely secure and resistant to common attacks such as DDoS and brute force, and it also prevents malicious data from entering the network, making all of the web services much safer from attack.
DADI has been in development for 5+ years (yes, 5 years!) and is already providing services to some of its top-tier clients. These customers will gradually be moved across to mainnet over time.

DADI works with a few large media corporations
  • Virgin Limited
  • Monocle
  • Empire
  • What Car?
  • Grazia
  • Mojo
  • heat
  • Kerrang
  • + many others

DADI Technology Partner Programme

Gain access to DADI tools, training and support along with other benefits by joining the DADI Technology Partner Programme.

DADI Services

The DADI services are broken into a set of microservices within the DADI dApp marketplace and provide the solutions necessary to meet business requirements. They are currently being brought onto the network on a staged development track.
DADI CDN - Network Ready
DADI CDN is the first product to launch on the DADI network and is currently live on mainnet.
What is CDN?
A Content Delivery Network (CDN), or Content Distribution Network, delivers content to end users via nodes deployed in multiple locations. DADI CDN facilitates the seamless delivery of image, audio and video assets for digital products accessed across a range of devices in multiple contexts.
This reduces bandwidth costs, improves the end-user experience and increases the availability of content. Because DADI distributes these nodes in a decentralised manner, nodes are likely to be located much closer to the end user.
Here is the DADI CDN sandbox environment for some of the features: https://docs.dadi.cloud/sandbox/dadi-cdn
...10 other services are being built and on the development roadmap.
DADI Store - Q3 2018
DADI Store - A cloud storage solution for all types of data, with built-in security, privacy and redundancy.
DADI API - Q4 2018
DADI API - A high-performance RESTful API layer designed in support of API-first development and the principles of COPE.
DADI API Wrapper - This library is for interacting with DADI API and provides a high-level abstraction of the REST architecture style, exposing a set of chainable methods that allow developers to compose complex read and write operations using a simplistic and natural syntax.
DADI Publish - Q1 2019
DADI Publish – A writer’s window to the world of content creation. Flexible interfaces designed to optimize editorial workflow.
DADI Web - Q2 2019
DADI Web – A schemaless templating layer that can work standalone or with DADI API.
DADI Identity - Q3 2019
DADI Identity - Guarantees uniqueness of individuals — and powers segmentation — for anonymous and known users.
DADI Track - Q3 2019+
DADI Track - A real-time, streaming data layer providing accurate metrics at individual and product level.
DADI Visualize - Q3 2019+
DADI Visualize - A data visualization interface for Identity and Track, capable of taking data feeds from virtually any source.
DADI Predict - Q3 2019+
DADI Predict - A machine-learning layer that predicts user behavior at an individual level based on past interactions.
DADI Match - Q3 2019+
DADI Match - A taxonomic framework for automated content classification through machine learning, which plugs into Publish.
DADI Queue - Q3 2019+
DADI Queue - A lightweight queue processing system powered by Redis, featuring simple task routing and throttling.
...as well as the dApps we also have:
DADI CLI
DADI CLI - DADI CLI is a command-line tool to help with the installation and customisation of the various products of the DADI platform.

Tutorials

DADI Tutorials - Step-by-step guides and practical examples of our technology written by the DADI team.

Official Sources – Get to know the community.

DADI Website
DADI Telegram The community in the Telegram is the most active, but the admins and team will often visit all forms of social media.
DADI Telegram Announcements
DADI Discord
DADI Github
DADI Twitter
DADI Medium
DADI Youtube
DADI Coinmarketcap
DADI Careers
DADI Facebook
DADI investfeed
DADI Blockfolio Signal - Get the app here look out for DADI updates in the Signal feed.
DADI Delta App - Get the app here look out for DADI updates in the Signal feed.

Unofficial Sources

DADI Telegram 'Price talk'
DADI Twitter Bot (DADI Rank)
This is the communities chat talking price action amongst other things.

DADI Foundation

DADI Foundation
DADI Foundation The Foundation was established in January 2017. It has its own board and its own CEO. While it benefits from the network and was set up by the founders of DADI, it is independent, with its own articles and governance.
· DADI AMA - DADI Foundation with Jennifer Martin-Nye, CEO - July 6th
DADI Foundation Social Media:
DADI Foundation - Twitter
DADI Foundation - LinkedIn
DADI Foundation - Instagram
Contact DADI Foundation:
[DADI Foundation - Contact Us](mailto:[email protected])

Roadmap – Where is DADI Heading?

So far DADI have accomplished all targets on time even releasing DADI Mainnet 2 days early.
DADI Roadmap

DADI Team – Who is in the team?

Starting at around 18 members, DADI has grown to around 30. The team decided to remove the office environment and work remotely to promote a better life/work balance.
Due to the size of the team and the information they have presented it is best to read here:
DADI Team
Have a read here about the remote working setup: https://dadi.cloud/en/culture/

DADI Tokenomics – How does the token work?

DADI tokens are an integral part of the DADI Platform. Consumers will be charged tokens for their usage of DADI Web Services. An exchange will be built into front end interfaces, allowing consumers to purchase services in their currency of choice.
Make sure you give the tokenomics doc a read:
DADI Official Tokenomics Documentation (available in 10 languages including: English/Korean/Chinese/German and Russian)
DADI Token
DADI Token is an ERC-20 token which can be stored in wallets such as MEW.
· Ethplorer: https://ethplorer.io/address/0xfb2f26f266fb2805a387230f2aa0a331b4d96fba
· Symbol: DADI
· Contract: 0xFb2f26F266Fb2805a387230f2aa0a331b4d96Fba
· Decimals: 18
· Total Supply: 100,000,000 (The creation of DADI tokens will be a one time event. The Token Creation event is the only time that these tokens can be created, and therefore the total supply of DADI tokens is fixed.)
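Since DADI is a standard ERC-20 token, the details listed above can be verified directly against the contract. Here is a minimal sketch using web3.py (v6-style API); the RPC endpoint is a placeholder you would need to supply:

```python
from web3 import Web3  # pip install web3

# Minimal ERC-20 ABI: just the two read-only calls we need.
ERC20_ABI = [
    {"constant": True, "inputs": [], "name": "totalSupply",
     "outputs": [{"name": "", "type": "uint256"}], "type": "function"},
    {"constant": True, "inputs": [], "name": "decimals",
     "outputs": [{"name": "", "type": "uint8"}], "type": "function"},
]

w3 = Web3(Web3.HTTPProvider("https://your-eth-node.example"))  # placeholder endpoint
dadi = w3.eth.contract(
    address=Web3.to_checksum_address("0xFb2f26F266Fb2805a387230f2aa0a331b4d96Fba"),
    abi=ERC20_ABI,
)
supply = dadi.functions.totalSupply().call()
decimals = dadi.functions.decimals().call()
print(supply / 10**decimals)  # should print 100,000,000.0
```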
ICO Prices:
ICO Presale: $0.40
ICO Public Sale: $0.50
ICO Amount raised: $29,000,000

DADI Account - Click the link for Preview

DADI is developing its own 'Account' page which will be fully securable with 2FA. DADI also decided to merge the DADI wallet into the account section.
The account section will be used for multiple reasons such as:
  • Holding DADI tokens
  • Managing Devices/Nodes
  • Lowering the barrier to entry with a simple UI
  • DADI Contract management
  • Direct FIAT conversion
  • Consumer facing management functionality
DADI Account Page

Exchanges – Where can I buy (or sell)?

DADI has secured some of the top exchanges, including fiat pairings*.
· OKEX
· Bitfinex*
· Ethfinex*
· KuCoin
· HitBTC
· Cobinhood
· CoinFalcon
· Gate.io
· IDEX
· WandX
· London Block Exchange (LBX)* - This will be a future listing including a DADI-GBP pairing for the UK. See partnerships below.

Masternodes and Requirements

DADI Official Documentation
DADI masternodes are built on a three-tier system to perform the different functions required by the network. There are three key nodes within the DADI network: Stargates, Gateways and Hosts. The availability of certain node types depends on network requirements; as demand increases, so will the required number of masternodes.
INITIAL On-Boarding of Masternodes:
The first wave of nodes will consist of the DADI Founding Node.
This includes the onboarding of c.500 Hosts, c.15 Gateways and c.2 Stargates during Q3 and Q4. These figures are designed to provide enough capacity for early network demand and are subject to change.

Masternode ROI

Note: A calculator to work this out is in development.
A common question concerns the ROI of the nodes. The expected returns are shown in the document linked above, but to understand potential ROI you must first understand how the network rewards contributors.
DADI consumers purchase DADI services in a currency of their choice, which is converted in real time to the DADI token. The revenue generated is then split between three bodies: the nodes, the ecosystem fund and the DADI Foundation.
Revenue Distribution
The masternodes work on 3 different models:
  1. Proof of Work + Consensus
  2. Proof of Stake
  3. Proof of Availability
A node's reward payout is determined by how much work it does (the monitored requests and traffic to the node), the amount staked (which is capped to avoid centralisation; beyond a certain point additional nodes are required instead) and how available the node is on the network (its uptime). The larger masternodes (the Stargate especially) require high uptime and may be penalised if downtime falls outside a maintenance window; Hosts do not have such a heavy requirement.
As a result, consumers purchase services through DADI, 85% of the revenue is distributed across the masternodes based on each node's Proof of Work, Proof of Stake and Proof of Availability, and a payout is made monthly.
The PoS requirement is reviewed each quarter and may be lowered as the network grows, to allow more nodes to be onboarded.
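To make the distribution concrete, here is a toy payout model. The 85% figure comes from the documentation above, but the node scores and revenue are illustrative assumptions rather than DADI's actual formulas:

```python
NODES_SHARE = 0.85  # 85% of revenue goes to nodes, per the docs above

def monthly_payout(revenue: float, node_scores: dict) -> dict:
    """node_scores maps node id -> combined PoW/PoS/PoA score (illustrative)."""
    pool = revenue * NODES_SHARE
    total = sum(node_scores.values())
    return {node: pool * score / total for node, score in node_scores.items()}

# Example: three hosts with different work/stake/availability scores.
print(monthly_payout(10_000, {"host-a": 3.0, "host-b": 1.5, "host-c": 0.5}))
# {'host-a': 5100.0, 'host-b': 2550.0, 'host-c': 850.0}
```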

Masternode Setup

DADI nodes are currently in staged onboarding until the network is publicly available; this public phase, known as Constellation, is due Q4 2018/Q1 2019.
The masternode setup is designed to be simple and to allow for maintenance windows when downtime is required, such as for hardware changes or scheduled updates.
You will also be able to monitor performance of your DADI nodes from your account.
Here is a sneak peek of the setup window:
Setup Screen (preview)

Masternode Types

DADI Stargate

· Purpose: Stargates provide the domain name system that makes Gateway/Host resources addressable. They are responsible for the secure running of the network. They monitor resources and control the payout contract.
· 500k DADI tokens
· Restricted availability
· Voting rights
Minimum specification:
High bandwidth: 1 Gbit/s+
High availability: 99.9999%
CPU: 2x quad-core+ @ 2.80GHz+
RAM: 128GB RAM+
Disk: 2TB SSD+
Stargates are intended for high-connectivity environments: think data centers and high bandwidth office environments, and are designed to be single, powerful machines rather than a cluster of smaller, less powerful machines.
DADI Stargate - specs below
For example, this Stargate's specs are as follows:
· 32x Xeon E3 1260L v5 Quad-Core @ 2.9GHz, 8Mb Cache
· 1.2TB RAM
· 2TB SSD raid
Plus room to expand that 10x per node as requirements grow.

DADI Gateway

· Purpose: Gateways are network node owners who contribute bandwidth. They are the entry point to the network, acting as an aggregate point for Host node capacity.
· 50k DADI tokens
· Limited availability
· Top 25% of nodes have voting rights
Minimum specification:
High bandwidth: 250 Mbit/s+
High availability: 99%
CPU: 1x quad-core+ @ 2.5GHz+
RAM: 64GB RAM+
Disk: 1TB SSD+

DADI Host

DADI Host performs extremely well on a Raspberry Pi 3. It can also be run behind a home router without the need for router configuration.
· Purpose: Hosts are network node owners who contribute computational power. DADI Web Services run in a container service within a secure enclave on Host environments.
· 5K DADI tokens
· No node limits
· Top 5% of nodes have voting rights
Minimum specification:
High bandwidth: 15 Mbit/s+
High availability: 20%+
CPU: 1x quad-core 1.2 GHZ+
RAM: 1GB RAM+
Disk: 50GB HDD+

Masternode Hosting

https://www.wirehive.com/masternode-hosting/ (Details by DADI team to be confirmed)

Partnerships – So who works with DADI?

Wirehive
Wirehive delivers expert infrastructure consultancy and support for a broad portfolio of clients including Vodafone, Honda and ITV — and it will be offering DADI’s network as a decentralized alternative to AWS, Microsoft Azure and Google Cloud.
NetWise
Netwise, a leading provider of server colocation and data centre services.
Netwise offers private facilities in London and throughout Europe, designed and built entirely in-house, and delivers end-user content on a national and international scale. It is also a pioneer in green colocation solutions, offering highly-efficient rack space powered by 100% renewable energy — an issue of equal importance to the DADI team.
Agorai
DADI is developing a marketplace proposition - a containerised service for approved partners that enables the running of third party software within the DADI network.
Agorai are working with DADI to deploy their technology in to the marketplace. They are also likely to use DADI API and Store for key components in their setup.
INDX
INDX are working with the DADI marketplace: an open container that is designed to run third party software. Specifically, DADI are working together with INDX to explore the potential for running their masternode setup within the DADI network. They are currently running in AWS.
blond
Blond creates contemporary products, spaces and digital experiences for a diverse range of brands, including Sony, LG, Revolut and Rapha. Blond are working on designs for dedicated DADI nodes. Think smart speakers and smart fridges (as one of the devs commented: DADI Cool) as well as other devices. One potential use case would be nodes that do not require staking, but would offset their own carbon emissions (more info to come on this).
The founding node design:
DADI x blond - Founding Node
Verasity
DADI have integrated Verasity’s VeraPlayer with the DADI network. Verasity’s video toolkit (vDaf) allows existing online video services to access the benefits of blockchain technology.
DADI and Verasity in Times Sq.
London Block Exchange (LBX)
London Block Exchange (LBX) will be providing a fiat on/off ramp for customers of the DADI network, with DADI making available its dApps, including API, Web and CDN, for use on the LBX ICO website. During its upcoming ICO in August, LBX will make use of three key dApps from within the DADI marketplace: DADI API, DADI Web and DADI CDN. These applications will deliver optimal performance during the LBX ICO, and the company is exploring the use of DADI technology for its exchange environments. This will also include a DADI-GBP pairing for the folks in the UK.

FAQ

Will DADI move to its own blockchain?
There are no plans to do this. There is also no mainnet swap.
Can I keep DADI Tokens on my Ledger or Trezor?
Yes, DADI is an ERC-20 token which can integrate with MEW.
What is the utility of the DADI token?
The DADI token is used by the consumers of the DADI web services to pay for requests on the network, regardless of whether they pay in another form of currency.
How does the Fiat on-ramp work? What is the Fiat on-ramp?
Individuals and businesses can pay for DADI services using fiat if they want to. This does not change the fact that our services are paid for in DADI: it simply means that there will be a small real-time exchange in place on dadi.cloud, removing the barrier to entry that a purely crypto-based payments solution would pose. It's no different in concept to a business heading to OKEx, buying tokens and then paying in DADI, other than it is faster and provides the experience that the majority of our potential consumer base expects at this point in time. Of course you will be able to buy our services in Bitcoin, Ethereum, Nano and many other currencies besides.
The number of nodes compared to the number of coins does not add up?
As consumer demand for the technology increases, the size of the network will need to increase. To support growth in capacity, the PoS requirement will be reduced. The token value is the second factor that will be monitored and factored into PoS requirements. Some nodes will be allowed a limited increase in their proof-of-stake amount, without allowing for centralisation.
Can a Masternode be run on a VPS?
While running within a VPS will not give you the same performance as running on bare metal, it will still be possible, yes.
As the ETH network is used for reputation management and accounting purposes we understand congestion is not a huge concern, but what point could it become a problem and what is the plan B?
The level of congestion required for this to be an issue is huge: an existential threat to Ethereum itself.
DADI will monitor performance, but do not expect this to be an issue. If it ever were, however, DADI would of course look at alternatives.

AMA – All those questions that have been asked, answered.

The DADI team currently run an AMA fortnightly. The AMA is run over 3 platforms, Reddit/Discord/Telegram. All questions and answers are posted throughout. The AMA posts below have the clearest history of the questions asked.
· DADI AMA – The first AMA - 18th May 2018
· DADI AMA – Masternodes - June 1st 2018
· DADI AMA - 15th June 2018
· DADI AMA - Mainnet - 29th June 2018
· DADI AMA - DADI Foundation with Jennifer Martin-Nye, CEO - July 6th 2018
· DADI AMA - Tech AMA with DADI VP of tech James Lambie and Principal Engineer for the DADI network Arthur Mingard - July 20th 2018
· DADI AMA - Founding Node and DADI Store - August 10th 2018
· DADI AMA - Founding Node and Onboarding Process - Friday 07th September, 2018
· DADI AMA - September 21, 2018
· DADI AMA - October 6, 2018
· DADI AMA - October 26, 2018

In the Press – Where’s muh marketing?

The DADI marketing team have been hard at work with a large focus on adoption and real world use cases. See here some of the articles where DADI have been mentioned:
· Tech Digest - Six transformative ICO-funded companies which are definitely worth watching
· Global Coin Report - The decentralized architecture for a democratic internet
· Forbes - DADI: Firm Announces Node Giveaway Following Decentralised Internet Launch
· Cointelegraph - Decentralized Cloud Platform Launches Mainnet in Challenge to ‘Big Four’ Market Leaders
· Cloudcomputing - A datacentre with no centre
· Forbes - 5 Start-ups hoping to rebuild the internet
· City AM - London Blockchain startup building a new internet
· CBR Online - Big DADI Launch: This Startup Wants to Democratise the Data Centre
· The Blockchain - Who’s the DADI? UK Blockchain Startup Takes Cloud Services Fight to Amazon and Google
· INC - This Foundation Is Rewriting the Internet
· Computer Weekly - UK tech startup wants businesses to share their surplus compute capacity to run its cloud
· IT Pro Portal - British start-up launches 'new internet'
· Business Cloud - BIG BRANDS SIGN UP TO 'INTERNET OF FUTURE'
· Codavel - Can Blockchain CDNs be the next big thing?
· Cryptoiscoming.com - Sleeping Giant of Crypto
· TheBitcoinest - The DADI Network Goes Mainstream, Rolling Out Node Giveaway
· thenextweb.com - This blockchain-based company got $30 million to build a ‘new internet’

I am not part of the DADI team, and this information provided is from my own research. Links to sources have been provided where possible.
submitted by __Dragon__ to DADI

Is anyone else freaked out by this whole blocksize debate? Does anyone else find themself often agreeing with *both* sides - depending on whichever argument you happen to be reading at the moment? And do we need some better algorithms and data structures?

Why do both sides of the debate seem “right” to me?
I know, I know, a healthy debate is healthy and all - and maybe I'm just not used to the tumult and jostling which would be inevitable in a real live open major debate about something as vital as Bitcoin.
And I really do agree with the starry-eyed idealists who say Bitcoin is vital. Imperfect as it may be, it certainly does seem to represent the first real chance we've had in the past few hundred years to try to steer our civilization and our planet away from the dead-ends and disasters which our government-issued debt-based currencies keep dragging us into.
But this particular debate, about the blocksize, doesn't seem to be getting resolved at all.
Pretty much every time I read one of the long-form major arguments contributed by Bitcoin "thinkers" who I've come to respect over the past few years, this weird thing happens: I usually end up finding myself nodding my head and agreeing with whatever particular piece I'm reading!
But that should be impossible - because a lot of these people vehemently disagree!
So how can both sides sound so convincing to me, simply depending on whichever piece I currently happen to be reading?
Does anyone else feel this way? Or am I just a gullible idiot?
Just Do It?
When you first look at it or hear about it, increasing the size seems almost like a no-brainer: The "big-block" supporters say just increase the blocksize to 20 MB or 8 MB, or do some kind of scheduled or calculated regular increment which tries to take into account the capabilities of the infrastructure and the needs of the users. We do have the bandwidth and the memory to at least increase the blocksize now, they say - and we're probably gonna continue to have more bandwidth and memory in order to be able to keep increasing the blocksize for another couple decades - pretty much like everything else computer-based we've seen over the years (some of this stuff is called by names such as "Moore's Law").
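For what it's worth, a "scheduled or calculated regular increment" is easy to express. In the spirit of the big-block proposal of 8 MB doubling every two years, a sketch (illustrative parameters, not anyone's consensus code) might look like this:

```python
from datetime import date

START = date(2016, 1, 1)   # illustrative activation date
BASE_MB = 8                # starting limit, per the big-block proposal
DOUBLING_YEARS = 2

def max_block_size_mb(on: date) -> int:
    # Double the cap every two years after activation.
    doublings = (on - START).days // (365 * DOUBLING_YEARS)
    return BASE_MB * 2 ** max(doublings, 0)

print(max_block_size_mb(date(2036, 1, 1)))  # ten doublings later: 8192 MB
```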
On the other hand, whenever the "small-block" supporters warn about the utter catastrophe that a failed hard-fork would mean, I get totally freaked by their possible doomsday scenarios, which seem totally plausible and terrifying - so I end up feeling that the only way I'd want to go with a hard-fork would be if there was some pre-agreed "triggering" mechanism where the fork itself would only actually "switch on" and take effect provided that some "supermajority" of the network (of who? the miners? the full nodes?) had signaled (presumably via some kind of totally reliable p2p trustless software-based voting system?) that they do indeed "pre-agree" to actually adopt the pre-scheduled fork (and thereby avoid any possibility whatsoever of the precious blockchain somehow tragically splitting into two and pretty much killing this cryptocurrency off in its infancy).
So in this "conservative" scenario, I'm talking about wanting at least 95% pre-adoption agreement - not the mere 75% which I recall some proposals call for, which seems like it could easily lead to a 75/25 blockchain split.
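That kind of pre-agreed trigger is at least simple to state in code, whatever the practical difficulties of measuring support; here is a sketch of the supermajority check, with the 95% threshold from above (a simplification, not any specific proposal's exact rules):

```python
def fork_activates(signalled: list, threshold: float = 0.95) -> bool:
    """signalled: True for each recent block that signalled support.
    The fork only switches on once support reaches the threshold."""
    support = sum(signalled) / len(signalled)
    return support >= threshold

window = [True] * 970 + [False] * 30  # 97% of the last 1000 blocks signalled
print(fork_activates(window))         # True: 0.97 >= 0.95
```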
But this time, with this long drawn-out blocksize debate, the core devs, and several other important voices who have become prominent opinion shapers over the past few years, can't seem to come to any real agreement on this.
Weird split among the devs
As far as I can see, there's this weird split: Gavin and Mike seem to be the only people among the devs who really want a major blocksize increase - and all the other devs seem to be vehemently against them.
But then on the other hand, the users seem to be overwhelmingly in favor of a major increase.
And there are meta-questions about governance, about why this didn't come out as a BIP, and about what the availability of Bitcoin XT means.
And today or yesterday there was this really cool big-blockian exponential graph based on doubling the blocksize every two years for twenty years, reminding us of the pure mathematical fact that 2^10 is indeed about 1000 - but not really addressing any of the game-theoretic points raised by the small-blockians. So a lot of the users seem to like it, but when so few devs say anything positive about it, I worry: is this just yet more exponential chart porn?
On the one hand, Gavin's and Mike's blocksize increase proposal initially seemed like a no-brainer to me.
And on the other hand, all the other devs seem to be against them. Which is weird - not what I'd initially expected at all (but maybe I'm just a fool who's seduced by exponential chart porn?).
Look, I don't mean to be rude to any of the core devs, and I don't want to come off like someone wearing a tinfoil hat - but it has to cross people's minds that the powers that be (the Fed and the other central banks and the governments that use their debt-issued money to run this world into a ditch) could very well be much more scared shitless than they're letting on. If we assume that the powers that be are using their usual playbook and tactics, then it could be worth looking at the book "Confessions of an Economic Hitman" by John Perkins, to get an idea of how they might try to attack Bitcoin. So, what I'm saying is, they do have a track record of sending in "experts" to try to derail projects and keep everyone enslaved to the Creature from Jekyll Island. I'm just saying. So, without getting ad hominem - let's just make sure that our ideas can really stand scrutiny on their own - as Nick Szabo says, we need to make sure there is "more computer science, less noise" in this debate.
When Gavin Andresen first came out with the 20 MB thing - I sat back and tried to imagine if I could download 20 MB in 10 minutes (which seems to be one of the basic mathematical and technological constraints here - right?)
I figured, "Yeah, I could download that" - even with my crappy internet connection.
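The back-of-the-envelope math supports that: sustaining 20 MB per 10-minute block needs surprisingly little bandwidth (a quick sketch):

```python
block_mb = 20
seconds = 10 * 60                       # one block interval
required_mbps = block_mb * 8 / seconds  # sustained megabits per second
print(round(required_mbps, 2))          # ~0.27 Mbit/s, trivial for most connections
```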
And I guess the telecoms might be nice enough to continue to double our bandwidth every two years for the next couple decades – if we ask them politely?
On the other hand - I think we should be careful about entrusting the financial freedom of the world into the greedy hands of the telecoms companies - given all their shady shenanigans over the past few years in many countries. After decades of the MPAA and the FBI trying to chip away at BitTorrent, lately PirateBay has been hard to access. I would say it's quite likely that certain persons at institutions like JPMorgan and Goldman Sachs and the Fed might be very, very motivated to see Bitcoin fail - so we shouldn't be too sure about scaling plans which depend on the willingness of companies like Verizon and AT&T to double our bandwidth every two years.
Maybe the really important hardware buildout challenge for a company like 21 (and its allies such as Qualcomm) to take on now would not be "a miner in every toaster" but rather "Google Fiber Download and Upload Speeds in every Country, including China".
I think I've read all the major stuff on the blocksize debate from Gavin Andresen, Mike Hearn, Greg Maxwell, Peter Todd, Adam Back, and Jeff Garzik and several other major contributors - and, oddly enough, all their arguments seem reasonable - heck even Luke-Jr seems reasonable to me on the blocksize debate, and I always thought he was a whackjob overly influenced by superstition and numerology - and now today I'm reading the article by Bram Cohen - the inventor of BitTorrent - and I find myself agreeing with him too!
I say to myself: What's going on with me? How can I possibly agree with all of these guys, if they all have such vehemently opposing viewpoints?
I mean, think back to the glory days of a couple of years ago, when all we were hearing was how this amazing unprecedented grassroots innovation called Bitcoin was going to benefit everyone from all walks of life, all around the world:
...basically the entire human race transacting everything into the blockchain.
(Although let me say that I think people's focus on ideas like driverless cabs creating realtime fare markets based on supply and demand sets our sights a bit low as far as Bitcoin's ability to correct the financial world's capital-misallocation problems which seem to have been made possible by infinite debt-based fiat. I would have hoped that a Bitcoin-based economy would solve much more noble, much more urgent capital-allocation problems than driverless taxicabs creating fare markets or refrigerators ordering milk on the internet of things. I was thinking more along the lines that Bitcoin would finally strangle dead-end debt-based deadly-toxic energy industries like fossil fuels and let profitable clean energy industries like Thorium LFTRs take over - but that's another topic. :-)
Paradoxes in the blocksize debate
Let me summarize the major paradoxes I see here:
(1) Regarding the people (the majority of the core devs) who are against a blocksize increase: Well, the small-blocks arguments do seem kinda weird, and certainly not very "populist", in the sense that: When on earth have end-users ever heard of a computer technology whose capacity didn't grow pretty much exponentially year-on-year? All the cool new technology we've had - from hard drives to RAM to bandwidth - started out pathetically tiny and grew to unimaginably huge over the past few decades - and all our software has in turn gotten massively powerful and big and complex (sometimes bloated) to take advantage of the enormous new capacity available.
But now suddenly, for the first time in the history of technology, we seem to have a majority of the devs, on a major p2p project - saying: "Let's not scale the system up. It could be dangerous. It might break the whole system (if the hard-fork fails)."
I don't know, maybe I'm missing something here, maybe someone else could enlighten me, but I don't think I've ever seen this sort of thing happen in the last few decades of the history of technology - devs arguing against scaling up p2p technology to take advantage of expected growth in infrastructure capacity.
(2) But... on the other hand... the dire warnings of the small-blockians about what could happen if a hard-fork were to fail - wow, they do seem really dire! And these guys are pretty much all heavyweight, experienced programmers and/or game theorists and/or p2p open-source project managers.
I must say, that nearly all of the long-form arguments I've read - as well as many, many of the shorter comments I've read from many users in the threads, whose names I at least have come to more-or-less recognize over the past few months and years on reddit and bitcointalk - have been amazingly impressive in their ability to analyze all aspects of the lifecycle and management of open-source software projects, bringing up lots of serious points which I could never have come up with, and which seem to come from long experience with programming and project management - as well as dealing with economics and human nature (eg, greed - the game-theory stuff).
So a lot of really smart and experienced people with major expertise in various areas ranging from programming to management to game theory to politics to economics have been making some serious, mature, compelling arguments.
But, as I've been saying, the only problem to me is: in many of these cases, these arguments are vehemently in opposition to each other! So I find myself agreeing with pretty much all of them, one by one - which means the end result is just a giant contradiction.
I mean, today we have Bram Cohen, the inventor of BitTorrent, arguing (quite cogently and convincingly to me), that it would be dangerous to increase the blocksize. And this seems to be a guy who would know a few things about scaling out a massive global p2p network - since the protocol which he invented, BitTorrent, is now apparently responsible for like a third of the traffic on the internet (and this despite the long-term concerted efforts of major evil players such as the MPAA and the FBI to shut the whole thing down).
Was the BitTorrent analogy too "glib"?
By the way - I would like to go on a slight tangent here and say that one of the main reasons why I felt so "comfortable" jumping on the Bitcoin train back a few years ago, when I first heard about it and got into it, was the whole rough analogy I saw with BitTorrent.
I remembered the perhaps paradoxical fact that when a torrent is more popular (eg, a major movie release that just came out last week), then it actually becomes faster to download. More people want it, so more people have a few pieces of it, so more people are able to get it from each other. A kind of self-correcting economic feedback loop, where more demand directly leads to more supply.
(BitTorrent manages to pull this off by essentially adding a certain structure to the file being shared, so that it's not simply like an append-only list of 1 MB blocks, but rather more like a random-access or indexed array of 1 MB chunks. Say you're downloading a film which is 700 MB. As soon as your "client" program has downloaded a single 1-MB chunk - say chunk #99 - your "client" program instantly turns into a "server" program as well - offering that chunk #99 to other clients. From my simplistic understanding, I believe the Bitcoin protocol does something similar, to provide a p2p architecture. Hence my - perhaps naïve - assumption that Bitcoin already had the right algorithms / architecture / data structure to scale.)
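To illustrate what I mean (this is just my own toy model of the chunk idea, not the actual BitTorrent wire protocol - all the names here are mine):

```haskell
import qualified Data.IntMap.Strict as IM
import qualified Data.ByteString as BS

-- Toy model of the random-access chunk idea: a 700 MB file is an
-- indexed map of 1 MB chunks, and a peer can serve any chunk it
-- already holds while it is still downloading the rest.
type ChunkIndex = Int
type Peer = IM.IntMap BS.ByteString  -- chunks this peer already has

-- The moment we receive chunk #99, we can also serve chunk #99.
receiveChunk :: ChunkIndex -> BS.ByteString -> Peer -> Peer
receiveChunk = IM.insert

-- Serving is just a lookup; a peer with one chunk is already a "server".
serveChunk :: ChunkIndex -> Peer -> Maybe Word
serveChunk _ _ = undefined  -- see note below
```

Oops - more precisely, serving is just `IM.lookup`:

```haskell
serveChunk' :: ChunkIndex -> Peer -> Maybe BS.ByteString
serveChunk' = IM.lookup
```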
The efficiency of the BitTorrent network seemed to jibe with that "network law" (Metcalfe's Law?) about fax machines. This law states that the more fax machines there are, the more valuable the network of fax machines becomes. Or: the value of the network grows on the order of the square of the number of nodes.
This is in contrast with other technology like cars, where the more you have, the worse things get. The more cars there are, the more traffic jams you have, so things start going downhill. I guess this is because highway space is limited - after all, we can't pave over the entire countryside, and we never did get those flying cars we were promised, as David Graeber laments in a recent essay in The Baffler magazine :-)
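For concreteness, here's the fax-machine version of the law (my own toy illustration, nothing more):

```haskell
-- Metcalfe's law in miniature: n machines can form n*(n-1)/2
-- pairwise connections, so the value of the network grows roughly
-- like n^2 while the number of machines grows only like n.
possibleLinks :: Integer -> Integer
possibleLinks n = n * (n - 1) `div` 2

-- possibleLinks 10  == 45
-- possibleLinks 100 == 4950   -- 10x the machines, ~100x the links
```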
And regarding the "stress test" supposedly happening right now in the middle of this ongoing blocksize debate, I don't know what worries me more: the fact that it apparently is taking only $5,000 to do a simple kind of DoS on the blockchain - or the fact that there are a few rumors swirling around saying that the unknown company doing the stress test shares the same physical mailing address with a "scam" company?
Or maybe we should just be worried that so much of this debate is happening on a handful of forums which are controlled by some guy named theymos who's already engaged in some pretty "contentious" or "controversial" behavior like blowing a million dollars on writing forum software (I guess he never heard that reddit.com software is open-source)?
So I worry that the great promise of "decentralization" might be more fragile than we originally thought.
Scaling
Anyways, back to Metcalfe's Law: with virtual stuff, like torrents and fax machines, the more the merrier. The more people downloading a given movie, the faster it arrives - and the more people own fax machines, the more valuable the overall fax network.
So I kind of (naïvely?) assumed that Bitcoin, being "virtual" and p2p, would somehow scale up the same magical way BitTorrent did. I just figured that more people using it would somehow automatically make it stronger and faster.
But now a lot of devs have started talking in terms of the old "scarcity" paradigm, talking about blockspace being a "scarce resource" and talking about "fee markets" - which seems kinda scary, and antithetical to much of the earlier rhetoric we heard about Bitcoin (the stuff about supporting our favorite creators with micropayments, and the stuff about Africans using SMS to send around payments).
Look, when some asshole is in line in front of you at the cash register, holding up the line while they run his credit card for a bag of Cheetos, you tend to get pissed off at the guy - clogging up our expensive global electronic payment infrastructure to make a two-dollar purchase. And that's on a fairly efficient centralized system - and presumably after a year or so, VISA and the guy's bank can delete or compress the transaction in their SQL databases.
Now, correct me if I'm wrong, but if some guy buys a coffee on the blockchain, or if somebody pays an online artist $1.99 for their work - then that transaction, a few bytes or so, has to live on the blockchain forever?
Or is there some "pruning" thing that gets rid of it after a while?
And this could lead to another question: Viewed from the perspective of double-entry bookkeeping, is the blockchain "world-wide ledger" more like the "balance sheet" part of accounting, i.e. a snapshot showing current assets and liabilities? Or is it more like the "cash flow" part of accounting, i.e. a journal showing historical revenues and expenses?
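As far as I can tell, the answer is "both, in a sense": the blockchain itself is the "cash flow" journal (every transaction, forever), while the "balance sheet" snapshot can always be re-derived from it - which is exactly what pruning exploits. Here's a hedged sketch of that idea (using a simplified account-balance model rather than Bitcoin's actual UTXO model; all names here are mine):

```haskell
import qualified Data.Map.Strict as M
import Data.List (foldl')

type Address = String
data Tx = Tx Address Address Integer  -- sender, recipient, amount (satoshis)

-- The derived "balance sheet" snapshot.
type Balances = M.Map Address Integer

-- Applying one journal entry: credit the recipient, debit the sender.
applyTx :: Balances -> Tx -> Balances
applyTx b (Tx sender recipient amt) =
  M.insertWith (+) recipient amt (M.insertWith (+) sender (negate amt) b)

-- Folding over the whole journal reproduces the current balance sheet,
-- so a pruned node can keep the snapshot and drop old journal pages.
snapshot :: [Tx] -> Balances
snapshot = foldl' applyTx M.empty
```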
When I think of thousands of machines around the globe having to lug around multiple identical copies of a multi-gigabyte file containing some asshole's coffee purchase forever and ever... I feel like I'm ideologically drifting in one direction (where I'd end up also being against really cool stuff like online micropayments and Africans banking via SMS)... so I don't want to go there.
But on the other hand, when really experienced and battle-tested veterans of the world of open-source programming and project management (the "small-blockians") warn of the catastrophic consequences of a possible failed hard-fork, I get freaked out and I wonder if Bitcoin really was destined to be a settlement layer for big transactions.
Could the original programmer(s) possibly weigh in?
And I don't mean to appeal to authority - but heck, where the hell is Satoshi Nakamoto in all this? I do understand that he/she/they would want to maintain absolute anonymity - but on the other hand, I assume SN wants Bitcoin to succeed (both for the future of humanity - or at least for all the bitcoins SN allegedly holds :-) - and I understand there is a way that SN can cryptographically sign a message - and I understand that as the original developer of Bitcoin, SN had some very specific opinions about the blocksize... So I'm kinda wondering if Satoshi could weigh in from time to time. Just to help out a bit. I'm not saying "Show us a sign" like a deity or something - but damn it sure would be fascinating and possibly very helpful if Satoshi gave us his/her/their 2 satoshis worth at this really confusing juncture.
Are we using our capacity wisely?
I'm not a programming or game-theory whiz, I'm just a casual user who has tried to keep up with technology over the years.
It just seems weird to me that here we have this massive supercomputer (500 times more powerful than all the supercomputers in the world combined) doing fairly straightforward "embarrassingly parallel" number-crunching operations to secure a p2p world-wide ledger called the blockchain to keep track of a measly 2.1 quadrillion tokens spread out among a few billion addresses - and a couple of years ago you had people like Rick Falkvinge saying the blockchain would someday be supporting multi-million-dollar letters of credit for international trade and you had people like Andreas Antonopoulos saying the blockchain would someday allow billions of "unbanked" people to send remittances around the village or around the world dirt-cheap - and now suddenly in June 2015 we're talking about blockspace as a "scarce resource" and talking about "fee markets" and partially centralized, corporate-sponsored "Level 2" vaporware like Lightning Network and some mysterious company is "stress testing" or "DoS-ing" the system by throwing away a measly $5,000 and suddenly it sounds like the whole system could eventually head right back into PayPal and Western Union territory again, in terms of expensive fees.
When I got into Bitcoin, I really was heavily influenced by vague analogies with BitTorrent: I figured everyone would just have a tiny little utorrent-type program running on their machine (ie, Bitcoin-QT or Armory or Mycelium etc.).
I figured that just like anyone can host their own blog or webserver, anyone would be able to host their own bank.
Yeah, Google and Mozilla and Twitter and Facebook and WhatsApp did come along and build stuff on top of TCP/IP, so I did expect a bunch of companies to build layers on top of the Bitcoin protocol as well. But I still figured the basic unit of bitcoin client software powering the overall system would be small and personal and affordable and p2p - like a bittorrent client - or at the most, like a cheap server hosting a blog or email server.
And I figured there would be a way at the software level, at the architecture level, at the algorithmic level, at the data structure level - to let the thing scale - if not infinitely, at least fairly massively and gracefully - the same way the BitTorrent network has.
Of course, I do also understand that with BitTorrent, you're sharing a read-only object (eg, a movie) - whereas with Bitcoin, you're achieving distributed trustless consensus and appending it to a write-only (or append-only) database.
So I do understand that the problem which BitTorrent solves is much simpler than the problem which Bitcoin sets out to solve.
But still, it seems that there's got to be a way to make this thing scale. It's p2p and it's got 500 times more computing power than all the supercomputers in the world combined - and so many brilliant and motivated and inspired people want this thing to succeed! And Bitcoin could be our civilization's last chance to steer away from the oncoming debt-based ditch of disaster we seem to be driving into!
It just seems that Bitcoin has got to be able to scale somehow - and all these smart people working together should be able to come up with a solution which pretty much everyone can agree - in advance - will work.
Right? Right?
A (probably irrelevant) tangent on algorithms and architecture and data structures
I'll finally weigh in with my personal perspective - although I might be biased due to my background (which is more on the theoretical side of computer science).
My own modest - or perhaps radical - suggestion would be to ask whether we're really looking at all the best possible algorithms and architectures and data structures out there.
From this perspective, I sometimes worry that the overwhelming majority of the great minds working on the programming and game-theory stuff might come from a rather specific, shall we say "von Neumann" or "procedural" or "imperative" school of programming (ie, C and Python and Java programmers).
It seems strange to me that such a cutting-edge and important computer project would have so little participation from the great minds at the other end of the spectrum of programming paradigms - namely, the "functional" and "declarative" and "algebraic" (and co-algebraic!) worlds.
For example, I was struck in particular by statements I've seen here and there (which seemed rather hubristic or lackadaisical to me - for something as important as Bitcoin), that the specification of Bitcoin and the blockchain doesn't really exist in any form other than the reference implementation(s) (in procedural languages such as C or Python?).
Curry-Howard anyone?
I mean, many computer scientists are aware of the Curry-Howard isomorphism, which basically says that the relationship between a theorem and its proof is equivalent to the relationship between a specification and its implementation. In other words, there is a long tradition in mathematics (and in computer programming) of stating the theorem (the specification) first, and only then supplying the proof (the implementation).
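For anyone who hasn't run into it, here's the Curry-Howard idea in miniature, in Haskell (these are just the standard textbook examples):

```haskell
-- The Curry-Howard reading: a type is a theorem, and a program
-- of that type is a proof of it.

-- Theorem: A implies A.  Proof: the identity function.
identity :: a -> a
identity x = x

-- Theorem: (A and B) implies A.  Proof: the first projection.
projLeft :: (a, b) -> a
projLeft (x, _) = x

-- Theorem: A implies (B implies A).  Proof: the K combinator.
constK :: a -> b -> a
constK x _ = x
```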
And it's not exactly "turtles all the way down" either: a specification is generally simple and compact enough that a good programmer can usually simply visually inspect it to determine if it is indeed "correct" - something which is very difficult, if not impossible, to do with a program written in a procedural, implementation-oriented language such as C or Python or Java.
So I worry that we've got this tradition, from the open-source github C/Java programming tradition, of never actually writing our "specification", and only writing the "implementation". In mission-critical military-grade programming projects (which often use languages like Ada or Maude) this is simply not allowed. It would seem that a project as mission-critical as Bitcoin - which could literally be crucial for humanity's continued survival - should also use this kind of military-grade software development approach.
And I'm not saying rewrite the implementations in these kind of theoretical languages. But it might be helpful if the C/Python/Java programmers in the Bitcoin imperative programming world could build some bridges to the Maude/Haskell/ML programmers of the functional and algebraic programming worlds to see if any kind of useful cross-pollination might take place - between specifications and implementations.
For example, the JavaFAN formal analyzer for multi-threaded Java programs (developed using tools based on the Maude language) was applied to the Remote Agent AI program aboard NASA's Deep Space 1 probe, written in Java - and it took only a few minutes using formal mathematical reasoning to detect a potential deadlock which would have occurred years later during the space mission when the damn spacecraft was already way out in deep space.
And "the Maude-NRL (Naval Research Laboratory) Protocol Analyzer (Maude-NPA) is a tool used to provide security proofs of cryptographic protocols and to search for protocol flaws and cryptosystem attacks."
These are open-source formal reasoning tools developed by DARPA and used by NASA and the US Navy to ensure that program implementations satisfy their specifications. It would be great if some of the people involved in these kinds of projects could contribute to help ensure the security and scalability of Bitcoin.
But there is a wide abyss between the kinds of programmers who use languages like Maude and the kinds of programmers who use languages like C/Python/Java - and it can be really hard to get the two worlds to meet. There is a bit of rapprochement between these language communities in languages which might be considered as being somewhere in the middle, such as Haskell and ML. I just worry that Bitcoin might be turning into an exclusively C/Python/Java project (with the algorithms and practitioners traditionally of that community), when it could be more advantageous if it also had some people from the functional and algebraic-specification and program-verification communities involved as well. The thing is, though: the theoretical practitioners are big on "semantics" - I've heard them say stuff like "Yes but a C / C++ program has no easily identifiable semantics". So to get them involved, you really have to first be able to talk about what your program does (specification) - before proceeding to describe how it does it (implementation). And writing high-level specifications is typically very hard using the syntax and semantics of languages like C and Java and Python - whereas specs are fairly easy to write in Maude - and not only that, they're executable, and you can state and verify properties about them - which provides for the kind of debate Nick Szabo was advocating ("more computer science, less noise").
Imagine if we had an executable algebraic specification of Bitcoin in Maude, where we could formally reason about and verify certain crucial game-theoretical properties - rather than merely hand-waving and arguing and deploying and praying.
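To give a taste of what "an executable specification with machine-checked properties" even looks like - sketched here in Haskell with QuickCheck rather than in Maude (which is what I'm actually proposing), and with a deliberately simplified issuance rule of my own choosing:

```haskell
import Test.QuickCheck

-- Simplified issuance rule: the block subsidy halves every 210,000 blocks.
subsidy :: Integer -> Integer   -- height -> subsidy in satoshis
subsidy h = (50 * 100000000) `div` (2 ^ (h `div` 210000))

-- Property 1: the subsidy never increases from one block to the next.
prop_subsidyMonotone :: Positive Integer -> Bool
prop_subsidyMonotone (Positive h) = subsidy (h + 1) <= subsidy h

-- Property 2: total issuance stays under 21 million BTC
-- (2.1 quadrillion satoshis); the subsidy hits zero after ~33 halvings.
prop_supplyCapped :: Bool
prop_supplyCapped =
  sum [ 210000 * subsidy (e * 210000) | e <- [0 .. 40] ] <= 2100000000000000

main :: IO ()
main = do
  quickCheck prop_subsidyMonotone
  quickCheck prop_supplyCapped
```

The point isn't these two toy properties - it's that the spec becomes a document you can run and interrogate, instead of argue about.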
And so in the theoretical programming community you've got major research on various logics such as Girard's Linear Logic (which is resource-conscious) and Bruni and Montanari's Tile Logic (which enables "pasting" bigger systems together from smaller ones in space and time), and executable algebraic specification languages such as Meseguer's Maude (which would be perfect for game theory modeling, with its functional modules for specifying the deterministic parts of systems and its system modules for specifying non-deterministic parts of systems, and its parameterized skeletons for sketching out the typical architectures of mobile systems, and its formal reasoning and verification tools and libraries which have been specifically applied to testing and breaking - and fixing - cryptographic protocols).
And somewhat closer to the practical hands-on world, you've got stuff like Google's MapReduce and lots of Big Data database languages developed by Google as well. And yet here we are with a mempool growing dangerously big for RAM on a single machine, and a 20-GB append-only list as our database - and not much debate on practical results from Google's Big Data databases.
(And by the way: maybe I'm totally ignorant for asking this, but I'll ask anyways: why the hell does the mempool have to stay in RAM? Couldn't it work just as well if it were stored temporarily on the hard drive?)
And you've got CalvinDB out of Yale which apparently provides an ACID layer on top of a massively distributed database.
Look, I'm just an armchair follower cheering on these projects. I can barely manage to write a query in SQL, or read through a C or Python or Java program. But I would argue two points here: (1) these languages may be too low-level and "non-formal" for writing and modeling and formally reasoning about and proving properties of mission-critical specifications - and (2) there seem to be some Big Data tools already deployed by institutions such as Google and Yale which support global petabyte-size databases on commodity boxes with nice properties such as near-real-time and ACID - and I sometimes worry that the "core devs" might be failing to review the literature (and reach out to fellow programmers) out there to see if there might be some formal program-verification and practical Big Data tools out there which could be applied to coming up with rock-solid, 100% consensus proposals to handle an issue such as blocksize scaling, which seems to have become much more intractable than many people might have expected.
I mean, the protocol solved the hard stuff: the elliptic-curve stuff and the Byzantine Generals stuff. How the heck can we be falling down on the comparatively "easier" stuff - like scaling the blocksize?
It just seems like defeatism to say "Well, the blockchain is already 20-30 GB and it's gonna be 20-30 TB ten years from now - and we need 10 Mbps bandwidth now and 10,000 Mbps bandwidth 20 years from now - assuming the evil Verizon and AT&T actually give us that - so let's just become a settlement platform and give up on buying coffee or banking the unbanked or doing micropayments, and let's push all that stuff into some corporate-controlled vaporware without even a whitepaper yet."
So you've got Peter Todd doing some possibly brilliant theorizing and extrapolating on the idea of "treechains" - there is a Let's Talk Bitcoin podcast from about a year ago where he sketches the rough outlines of this idea out in a very inspiring, high-level way - although the specifics have yet to be hammered out. And we've got Blockstream also doing some hopeful hand-waving about the Lightning Network.
Things like Peter Todd's treechains - which may be similar to the spark in some devs' eyes called Lightning Network - are examples of the kind of algorithm or architecture which might manage to harness the massive computing power of miners and nodes in such a way that certain kinds of massive and graceful scaling become possible.
It just seems like a kind of tiny dev community working on this stuff.
Being a C or Python or Java programmer should not be a pre-req to being able to help contribute to the specification (and formal reasoning and program verification) for Bitcoin and the blockchain.
XML and UML are crap modeling and specification languages, and C and Java and Python are even worse (as specification languages - although as implementation languages, they are of course fine).
But there are serious modeling and specification languages out there, and they could be very helpful at times like this - where what we're dealing with is questions of modeling and specification (ie, "needs and requirements").
One just doesn't often see the practical, hands-on world of open-source github implementation-level programmers and the academic, theoretical world of specification-level programmers meeting very often. I wish there were some way to get these two worlds to collaborate on Bitcoin.
Maybe a good first step to reach out to the theoretical people would be to provide a modular executable algebraic specification of the Bitcoin protocol in a recognized, military/NASA-grade specification language such as Maude - because that's something the theoretical community can actually wrap their heads around, whereas it's very hard to get them to pay attention to something written only as a C / Python / Java implementation (without an accompanying specification in a formal language).
They can't check whether the program does what it's supposed to do - if you don't provide a formal mathematical definition of what the program is supposed to do.
Specification : Implementation :: Theorem : Proof
You have to remember: the theoretical community is very aware of the Curry-Howard isomorphism. Just like it would be hard to get a mathematician's attention by merely showing them a proof without also telling them what theorem the proof is proving - by the same token, it's hard to get the attention of a theoretical computer scientist by merely showing them an implementation without showing them the specification that it implements.
Bitcoin is currently confronted with a mathematical or "computer science" problem: how to secure the network while achieving high enough transactional throughput, all within the RAM, bandwidth and hard-drive limitations of current and future infrastructure.
The problem only becomes a political and economic problem if we give up on trying to solve it as a mathematical and "theoretical computer science" problem.
There should be a plethora of whitepapers out now proposing algorithmic solutions to these scaling issues. Remember, all we have to do is apply the Byzantine Generals consensus-reaching procedure to a worldwide database which shuffles 2.1 quadrillion tokens among a few billion addresses. The 21 company has emphatically pointed out that racing to compute a hash to add a block is an "embarrassingly parallel" problem - very easy to decompose among cheap, fault-prone, commodity boxes, and recompose into an overall solution - along the lines of Google's highly successful MapReduce.
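"Embarrassingly parallel" really does mean what it says. Here's my own sketch of the decomposition - with a toy 64-bit mixer standing in for SHA-256d, and all the names mine - just to show that the nonce ranges are completely independent:

```haskell
import Control.Parallel.Strategies (parMap, rdeepseq)
import Data.Word (Word64)
import Data.Bits (xor, shiftR)
import Data.List (find)

-- Toy mixer standing in for SHA-256d (don't mine with this).
toyHash :: Word64 -> Word64
toyHash x0 =
  let x1 = (x0 `xor` (x0 `shiftR` 33)) * 0xff51afd7ed558ccd
      x2 = (x1 `xor` (x1 `shiftR` 33)) * 0xc4ceb9fe1a85ec53
  in x2 `xor` (x2 `shiftR` 33)

-- A nonce "wins" if its hash falls below the target (the difficulty rule).
searchRange :: Word64 -> (Word64, Word64) -> Maybe Word64
searchRange target (lo, hi) = find (\n -> toyHash n < target) [lo .. hi]

-- Each range is independent, so the work can be farmed out to cores
-- (or cheap commodity boxes) with no coordination until a hit is found.
parallelMine :: Word64 -> [(Word64, Word64)] -> Maybe Word64
parallelMine target ranges =
  case [ n | Just n <- parMap rdeepseq (searchRange target) ranges ] of
    (n:_) -> Just n
    []    -> Nothing
```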
I guess what I'm really saying (and I don't mean to be rude here) is that C and Python and Java programmers might not be the best qualified people to develop and formally prove the correctness of (note I do not say: "test", I say "formally prove the correctness of") these kinds of algorithms.
I really believe in the importance of getting the algorithms and architectures right - look at Google Search itself, it uses some pretty brilliant algorithms and architectures (eg, MapReduce, Paxos) which enable it to achieve amazing performance - on pretty crappy commodity hardware. And look at BitTorrent, which is truly p2p, where more demand leads to more supply.
So, in this vein, I will close this lengthy rant with an oddly specific link - which may or may not be able to make some interesting contributions to finding suitable algorithms, architectures and data structures which might help Bitcoin scale massively. I have no idea if this link could be helpful - but given the near-total lack of people from the Haskell and ML and functional worlds in these Bitcoin specification debates, I thought I'd be remiss if I didn't throw this out - just in case there might be something here which could help us channel the massive computing power of the Bitcoin network in such a way as to enable us to simply sidestep this kind of desperate debate where both sides seem right because the other side seems wrong.
https://personal.cis.strath.ac.uk/neil.ghani/papers/ghani-calco07
The above paper is about "higher dimensional trees". It uses a bit of category theory (not a whole lot) and a bit of Haskell (again not a lot - just a simple data structure called a Rose tree, which has a wikipedia page) to develop a very expressive and efficient data structure which generalizes from lists to trees to higher dimensions.
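The base data structure the paper starts from really is that small - the categorical machinery is where the actual content is. In Haskell it's just:

```haskell
-- The Rose tree: a node carrying a value and an arbitrary list of subtrees.
data Rose a = Rose a [Rose a]

-- A plain list is the degenerate one-dimensional case: a spine of
-- nodes each with at most one child - which is roughly what a
-- blockchain is today.
chainOf :: [a] -> Maybe (Rose a)
chainOf []       = Nothing
chainOf (x : xs) = Just (Rose x (maybe [] (: []) (chainOf xs)))
```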
I have no idea if this kind of data structure could be applicable to the current scaling mess we apparently are getting bogged down in - I don't have the game-theory skills to figure it out.
I just thought that since the blockchain is like a list, and since there are some tree-like structures which have been grafted on for efficiency (eg Merkle trees) and since many of the futuristic scaling proposals seem to also involve generalizing from list-like structures (eg, the blockchain) to tree-like structures (eg, side-chains and tree-chains)... well, who knows, there might be some nugget of algorithmic or architectural or data-structure inspiration there.
So... TL;DR:
(1) I'm freaked out that this blocksize debate has splintered the community so badly and dragged on so long, with no resolution in sight, and both sides seeming so right (because the other side seems so wrong).
(2) I think Bitcoin could gain immensely by using high-level formal, algebraic and co-algebraic program specification and verification languages (such as Maude including Maude-NPA, Mobile Maude parameterized skeletons, etc.) to specify (and possibly also, to some degree, verify) what Bitcoin does - before translating to low-level implementation languages such as C and Python and Java saying how Bitcoin does it. This would help to communicate and reason about programs with much more mathematical certitude - and possibly obviate the need for many political and economic tradeoffs which currently seem dismally inevitable - and possibly widen the collaboration on this project.
(3) I wonder if there are some Big Data approaches out there (eg, along the lines of Google's MapReduce and BigTable, or Yale's CalvinDB), which could be implemented to allow Bitcoin to scale massively and painlessly - and to satisfy all stakeholders, ranging from millionaires to micropayments, coffee drinkers to the great "unbanked".
submitted by BeYourOwnBank to Bitcoin
