ECI definitions
We demonstrate that attention flows manifest knowledge, and that the distance (similarity) between crypto economies has predictive power: it helps determine whether a fork, or fierce competition within the same token space, will be a destructive force. When dealing with hundreds of currencies and thousands of tokens, investors face a very practical constraint: attention quickly becomes a scarce resource. To understand the role of attention in trustless markets we use Coase's theorem. For the theorem to hold, the crypto communities about to split should meet three conditions: (i) Well-defined property rights: the crypto investor owns his attention. (ii) Information symmetry: it is reasonable to assume that, up to the moment of the hard fork, market participants are on level ground in terms of shared knowledge; specialization (who becomes the expert on each new digital asset) comes later. (iii) Low transaction costs: just before the chains split there is no significant cost in switching attention. Other factors (such as mining profitability) play a role only after the fact, and any prior positioning (e.g. options sold on the future new assets) is mainly speculative. The symmetry condition refers to the "common knowledge" available at t-1, when all that people know is the existing asset. Information asymmetries do exist at the micro level; we cannot assume full efficiency because transaction costs are never truly zero. Say's Law states that, at the macro level, aggregate production inevitably creates an equal aggregate demand. Since a fork is an event at the macroeconomic level (here, the economy of Bitcoin Cash vs. the economy of Bitcoin), the aggregate demand for output is determined by the aggregate supply of output: there is a supply of attention before there is demand for attention.
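As a minimal illustration of the distance measure referred to above, suppose attention flows are summarized as vectors of daily counts (posts, searches, page views) over a shared set of topics just before a fork; cosine similarity then gives a simple distance between two crypto economies. The vectors and topic set below are hypothetical, not measured data:

```python
from math import sqrt

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length attention vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical daily attention counts over the same five topics,
# sampled just before the chain split.
attention_btc = [120, 340, 80, 45, 210]
attention_bch = [110, 300, 95, 50, 180]

similarity = cosine_similarity(attention_btc, attention_bch)
distance = 1.0 - similarity  # small distance suggests shared "common knowledge"
```

A distance near zero is consistent with the information-symmetry condition (ii): at t-1 the two communities still attend to essentially the same things.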
The Economic Complexity Index (ECI), introduced by Hidalgo and Hausmann, predicts future economic growth by looking at the production characteristics of the economy as a whole rather than as the sum of its parts; i.e., the present information content of the economy is a predictor of future growth. Say's Law and the ECI approach are both about the aggregation of dispersed resources, and that is what makes them relevant to the study of decentralized systems. While economic complexity is measured by the mix of products that countries are able to make, crypto economic complexity depends on the remixing of activities. Some services are complex because few crypto economies consume them, and the crypto economies that do consume them tend to be more diversified. We should differentiate between the structure of output (off-chain events) and aggregated output (on-chain, strictly transactional events). It can be shown that crypto economies tend to converge to the level of economic output that can be supported by the know-how embedded in their economy, which is manifested by attention flows. Therefore, a crypto economy's complexity is likely a driver of prosperity when that complexity is greater than we would expect at a given level of investment return. As members of the community specialize in different aspects of the economy, the structure of the network itself becomes an expression of the composition of attention output. We use genetic programming to find drivers, in other words, to learn the rankings. Such a ranking score function has the form returns_tokenA > returns_tokenB = f(sources_tokenA > sources_tokenB). Ultimately, the degree of complexity is an issue of trust, or the lack thereof, and that is what the flow of attention and its conversion into transactional events reveal.
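To make the ECI analogy concrete, here is a minimal sketch of Hidalgo and Hausmann's method of reflections, applied to a hypothetical binary matrix of crypto economies by the services they consume (the matrix entries are illustrative placeholders). Diversity and ubiquity are iteratively refined, and the standardized result plays the role of the complexity score:

```python
from statistics import mean, pstdev

# M[c][a] = 1 if crypto economy c consumes activity/service a.
# Rows and columns are illustrative, not real economies.
M = [
    [1, 1, 1, 1],  # diversified economy
    [1, 1, 0, 0],
    [1, 0, 0, 0],  # consumes only the most ubiquitous activity
]

def method_of_reflections(M, iterations=4):
    """Iteratively refine diversity (k_c) and ubiquity (k_a).
    An even number of iterations keeps k_c oriented like diversity."""
    n_c, n_a = len(M), len(M[0])
    k_c = [sum(row) for row in M]                                  # diversity
    k_a = [sum(M[c][a] for c in range(n_c)) for a in range(n_a)]   # ubiquity
    for _ in range(iterations):
        k_c_new = [sum(M[c][a] * k_a[a] for a in range(n_a)) / k_c[c]
                   for c in range(n_c)]
        k_a_new = [sum(M[c][a] * k_c[c] for c in range(n_c)) / k_a[a]
                   for a in range(n_a)]
        k_c, k_a = k_c_new, k_a_new
    return k_c

def eci(k_c):
    """Standardize the refined diversity into a complexity index."""
    mu, sigma = mean(k_c), pstdev(k_c)
    return [(k - mu) / sigma for k in k_c]

scores = eci(method_of_reflections(M))
```

The diversified economy ends up with the highest score, mirroring the claim that economies consuming rare services while remaining diversified are the complex ones.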
Mix
Value in algorithmic currencies resides literally in the information content of the calculations; but given the constraints of consensus (security drivers) and the necessity of network effects (economic drivers), the definition of value extends to the multilayered structure of the network itself: to the information content of the topology of the nodes in the blockchain network, and to the complexity of the economic activity in the peripheral networks of the web, mesh-IoT, and so on. At the boundary between the information flows of the native network that serves as the substrate of the blockchain and those of real-world data, a new "fragility vector" emerges: the intensity of demand (as encoded in traffic flows) gives rise to a field, and an increase in demand affects the structure of the field, akin to a phase change. Our research question is whether factors related to market structure and design, transaction and timing cost, price formation and price discovery, information and disclosure, and market maker and investor behavior are quantifiable to a degree that can be used to price risk in digital asset markets. The results obtained show that while in the popular discourse blockchains are considered robust and cryptocurrencies anti-fragile, cryptocurrency markets are in fact fragile. This research is pertinent to the regulatory function of governments, which are actively seeking to advance the state of knowledge regarding systemic risk and to develop policies for crypto markets, and to investors, who need to expand their understanding of market behavior beyond explicit price signals and technical analysis.
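As one simple, hedged illustration of how fragility could be quantified (a stand-in proxy, not the measurement used in this research): a market whose return distribution has fat tails is fragile in the sense that rare shocks dominate outcomes, and excess kurtosis is a standard tail statistic. The return series below is synthetic:

```python
from statistics import mean

def excess_kurtosis(returns):
    """Sample excess kurtosis: > 0 means fatter-than-Gaussian tails."""
    n = len(returns)
    mu = mean(returns)
    m2 = sum((r - mu) ** 2 for r in returns) / n
    m4 = sum((r - mu) ** 4 for r in returns) / n
    return m4 / (m2 ** 2) - 3.0

# Illustrative daily returns: mostly quiet, with two crash days.
returns = [0.001, -0.002, 0.003, -0.001, 0.002,
           -0.25, 0.001, 0.002, -0.18, 0.001]
fragile = excess_kurtosis(returns) > 0.0  # fat tails -> fragile market
```

Under this proxy, a series punctuated by rare large drawdowns registers as fragile even when its typical day looks calm, which is the qualitative pattern the anti-fragility claim overlooks.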
Distribution
Introduction
In prepared testimony to the House of Representatives Committee on Financial Services, the Secret Service stated that "Digital currencies have the potential to support more efficient and transparent global commerce and to enhance U.S. economic competitiveness. However, because digital currencies continue to be used to facilitate illicit activity, law enforcement must adapt our investigative tools and techniques \cite{services}". While most investigative tools take a forensics angle, applied science can also provide support in the form of early warning systems. Here we describe the algorithmic components of a suite of tools that detect patterns in user behavior and can inform the authorities where to perform a network intervention. The end product takes the form of an education project: a source of information containing those tools and datasets. We seek to evolve AIs that can learn what is important to humans. The optimization objective is to minimize error while keeping complexity manageable: we do not seek to eliminate the error entirely (that would model noise as signal and introduce overfitting). For practical purposes there is a level of error we can live with, and there are also limits to human cognition (e.g. how many variables, and how many variable relationships, we can think of at the same time). Ultimately, we want AIs that can gain situational awareness the way humans do. Network inference \cite{Tieri_2019} is the discipline concerned with the dynamic modeling of biological networks and has been approached with machine learning \cite{Spirtes_1993} and non-linear modeling techniques (Oates 2012). Using the biological metaphor as inspiration, we adopt a genetic programming approach. We focus on "inferential sensors" at the points of interest for preserving the integrity of the financial system; applications include the prevention of investment fraud, computer hacking/ransomware scams, and identity theft.
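To make the genetic programming approach concrete, here is a minimal, self-contained sketch (not the production implementation): a population of random expression trees is evolved by selection and mutation to fit sample data, with fitness measured as squared error. The function set, rates, and target series are illustrative choices:

```python
import random
random.seed(0)

# Function set for the expression trees (illustrative).
OPS = {'+': lambda a, b: a + b,
       '-': lambda a, b: a - b,
       '*': lambda a, b: a * b}

def random_tree(depth=3):
    if depth == 0 or random.random() < 0.3:
        return 'x' if random.random() < 0.7 else random.uniform(-2, 2)
    op = random.choice(list(OPS))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if tree == 'x':
        return x
    if isinstance(tree, float):
        return tree
    op, left, right = tree
    val = OPS[op](evaluate(left, x), evaluate(right, x))
    return max(-1e6, min(1e6, val))  # clamp to avoid numeric blow-up

def fitness(tree, data):
    return sum((evaluate(tree, x) - y) ** 2 for x, y in data)

def mutate(tree):
    if random.random() < 0.2 or not isinstance(tree, tuple):
        return random_tree(2)           # replace a subtree
    op, left, right = tree
    return (op, mutate(left), mutate(right))

# Toy target y = x^2 + x, standing in for a real-world series.
data = [(x, x * x + x) for x in range(-5, 6)]
pop = [random_tree() for _ in range(200)]
for gen in range(30):
    pop.sort(key=lambda t: fitness(t, data))
    survivors = pop[:50]                # elitist selection
    pop = survivors + [mutate(random.choice(survivors)) for _ in range(150)]
best = min(pop, key=lambda t: fitness(t, data))
```

The same loop applies unchanged when `data` pairs a real-world driver with an observed series; only the fitness data changes, which is what makes the "inferential sensor" framing portable across use cases.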
Inoculation
The key idea is that it is easier to cure people in the early stages of infection: influence human behavior, and prevent machines from learning bad habits. Users who conduct dark-web-related activity on the regular web are still "newbies" (otherwise they would already be on the dark web), and are therefore more susceptible to an intervention to modify their behavior.
Data types
The following use cases are based on blockchain and web panel datasets only. At this stage we use global data at daily time granularity, although detail for specific countries, major US metro areas, and intraday sampling is possible. These data points can be augmented with government intelligence (e.g. geodata) and/or dark web clickstream datasets, when available.
Use cases
In the following cases, we symbolically regress TOR browser downloads and cryptocurrency price action on several real-world time series to find relationships regarding aspects such as the deterioration of trust in the traditional financial system \cite{Venegas_2018} and the rise of some types of cyber crime, among others. The modeling method is genetic programming. As a follow-up step, we investigate causation using several signal processing and AI techniques.
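As a hedged sketch of the simplest causation follow-up (one of many possible signal-processing techniques, and not necessarily the one used in this research), lagged cross-correlation can flag the lead-lag structure between two daily series, e.g. a driver series and TOR downloads. The series below are synthetic, with the "follower" built to echo the "leader" two days later:

```python
from statistics import mean, pstdev

def pearson(a, b):
    """Pearson correlation of two equal-length series."""
    mu_a, mu_b = mean(a), mean(b)
    cov = sum((x - mu_a) * (y - mu_b) for x, y in zip(a, b)) / len(a)
    return cov / (pstdev(a) * pstdev(b))

def best_lag(leader, follower, max_lag=5):
    """Lag k (in days) maximizing corr(leader[t], follower[t+k])."""
    scores = {k: pearson(leader[:-k], follower[k:])
              for k in range(1, max_lag + 1)}
    return max(scores, key=scores.get)

# Synthetic daily series: follower echoes leader with a 2-day delay.
leader = [1, 3, 2, 5, 4, 6, 5, 7, 6, 8, 7, 9]
follower = [0, 0] + leader[:-2]

lag = best_lag(leader, follower)
```

A stable, nonzero best lag is only suggestive of direction, not proof of causation; it is the kind of cheap screen one would run before heavier causal analysis.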
In the traditional financial sector, players profited from information asymmetries. In the blockchain financial system, they profit from trust asymmetries. Transactions are a flow; trust is a stock. Even if the information asymmetries across the medium of exchange are close to zero (as is expected in a decentralized financial system), there exists a "trust imbalance" at the perimeter. This fluid dynamic follows Hayek's concept of monetary policy: "What we find is rather a continuum in which objects of various degrees of liquidity, or with values which can fluctuate independently of each other, shade into each other in the degree to which they function as money". Trust-enabling structures are derived using Evolutionary Computing and Topological Data Analysis; trust dynamics are rendered using Fields Finance and the modeling of mass and information flows of Forrester's System Dynamics methodology. Since the levels of trust are computed from the rates of information flows (attention and transactions), trust asymmetries may be viewed as a particular case of information asymmetries, albeit one in which hidden information can be accessed, of the sort that neither price nor on-chain data can provide. The key discovery is the existence of a "belief consensus", with trust metrics as a possible fundamental source of intrinsic value in digital assets. This research is relevant to policymakers, investors, and businesses operating in the real economy who are looking to understand the structure and dynamics of digital asset-based financial systems. Its contributions are also applicable to any socio-technical system of value-based attention flows.
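The stock-and-flow distinction above can be sketched in Forrester's System Dynamics style: trust is a stock replenished by attention/transaction inflows and drained by decay. The inflow series, decay rate, and initial level below are illustrative assumptions, not estimated values:

```python
# Minimal Forrester-style stock-and-flow sketch.
# Discrete-time update: T(t+1) = T(t) + inflow(t) - delta * T(t)

def simulate_trust(inflows, decay_rate=0.1, initial_trust=1.0):
    """Accumulate the trust stock over a sequence of inflow rates."""
    trust, path = initial_trust, []
    for inflow in inflows:
        trust += inflow - decay_rate * trust
        path.append(trust)
    return path

# Steady inflow: the stock settles at inflow / decay_rate = 1.0.
steady = simulate_trust([0.1] * 50)
# Shock: inflow stops halfway, and the trust stock decays away.
shock = simulate_trust([0.1] * 25 + [0.0] * 25)
```

The point of the toy model is the asymmetry it exhibits: cutting the flow does not zero the stock immediately, which is why a "trust imbalance" can persist at the perimeter even when transactional flows have already changed.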