Our ability to create

From PKC
Latest revision as of 06:23, 14 May 2021

===Accounting of content consumption===

Money and banking were created as a means of accounting: the issuance of receipts linked to verified accounts. As more and more information gets recorded and stored, the accounting of content consumption becomes a convenient byproduct.

Corporate vs. open source: the creation of data consensus.

Asset exchange marketplaces.

Matomo: the measurement of page-viewership analytics; the semantic web and the wiki.

Ethereum evolves this through the explicit classification of accounts: programmable contract accounts governed by code consensus, and externally owned accounts governed by private-key signatures.
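The two-way account classification above can be sketched in Python. This is a toy model for illustration only; the class names and fields are assumptions, not Ethereum's actual data structures:

```python
from dataclasses import dataclass

@dataclass
class ContractAccount:
    """Programmable account: behavior is governed by code that every
    node executes and agrees on (code consensus)."""
    address: str
    code: str  # the governing program, e.g. bytecode

@dataclass
class ExternallyOwnedAccount:
    """External account: actions are authorized by signatures produced
    with a private key that only the owner holds."""
    address: str
    public_key: str

def is_programmable(account) -> bool:
    # An account is programmable exactly when it carries code.
    return isinstance(account, ContractAccount)
```

The distinction matters because the two kinds of accounts are governed differently: one by consensus over code, the other by possession of a private key.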

===Data must reflect asset ownership===

Moore's law assigns physical dimensions to data.

Orchestrate three kinds of assets related to data processing in order to obtain the ability to influence society:
* data-processing instruments,
* relevant, contextualized data content, and
* compatible algorithms that take advantage of advanced instruments and available data.

===History of decision making===

Decision making is accomplished by the ability to compute a function of given inputs and produce outputs.

The rate at which humans have been able to improve our decision making has been restricted by the ability to transmit information, the density of the information, and the information's quality and resolution.

Moore's law introduced the idea that every two years the ability to transmit data would double, improving exponentially, and so would the density of the information being transmitted.


==The history of decision making==

Every day, more and more of our life decisions are being automated by computers. Incredibly complex algorithms with vast numbers of variables can be filtered down and processed in seconds; however, there are still many decisions that computers are not able to make. Decision making is a human trait that we have delegated to machines, and in order to empower our machines' ability to make decisions we must understand fundamentally what decision making is.

Decision making is the process of producing outputs in response to inputs by executing an intentional procedure. This is essentially what we do as humans on a day-to-day basis. We get the feeling of being thirsty (input), so we get up, walk to the kitchen, pour a glass of water, and take a drink (intentional procedure) until we are no longer thirsty (output). Usually our outputs become the inputs of the next decisions we make: if we are done drinking we may then need to use the bathroom, an example of outputs becoming the next inputs, the decision-making [[Feedback Loops|feedback loop]].
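The input, procedure, and output cycle above can be sketched as code. This is a toy model; the states ("thirsty", "needs_bathroom") are the article's own example, and the function is illustrative:

```python
def decide(state: dict) -> dict:
    """One decision cycle: read inputs, run an intentional procedure,
    and return outputs that become the next cycle's inputs."""
    if state.get("thirsty"):
        # intentional procedure: get up, pour a glass of water, drink
        return {"thirsty": False, "needs_bathroom": True}
    if state.get("needs_bathroom"):
        return {"thirsty": False, "needs_bathroom": False}
    return state

# Feed outputs back in as inputs: the decision-making feedback loop.
state = {"thirsty": True}
state = decide(state)  # no longer thirsty, but a new input has appeared
state = decide(state)  # the new input is handled in the next cycle
```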

===Transmission===

As humans, our ability to make internal decisions is incredibly fast: we have a wide range of senses, emotions, and thoughts that create our inputs and define our intentional procedures. Within our own minds these decision-making procedures can happen in seconds, because we have access to infrastructure that allows incredibly fast transmission speeds and high levels of information density.

===Computation Decision Making===

The speed and density of transmitting data have doubled every two years, a phenomenon known as Moore's law, giving anyone the ability to collect and transmit large volumes of data. The exponential growth of data technology has allowed our global accounting system to infiltrate every aspect of human life. Every year, more and more of our activities as humans are recorded and accounted for. Recording and validating information only creates the potential for value; it is when we understand how to give data the ability to make decisions that it becomes the most valuable asset of the 21st century.
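The doubling-every-two-years claim is simple compound growth, and can be made concrete with a few lines of arithmetic (the function name and parameters are illustrative):

```python
def capacity_after(years: float, doubling_period_years: float = 2.0) -> float:
    """Relative capacity after `years`, if capacity doubles every
    `doubling_period_years` (Moore's-law-style exponential growth)."""
    return 2 ** (years / doubling_period_years)

# With a 2-year doubling period, a decade yields 2**5 = 32x the capacity.
growth_over_decade = capacity_after(10)
```

Exponential growth is why the effect compounds so dramatically: ten years is only five doublings, yet it multiplies capacity thirty-two-fold.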

As defined above, decision making is the process of producing outputs in response to inputs by executing an intentional procedure, with each decision's outputs becoming the inputs of the next: the decision-making [[Feedback Loops|feedback loop]].

The intentional procedure of data decision making is known as computation, and it is executed by organizing, processing, deploying, and securing data. Organizational structure gives you the ability to perceive which data inputs are relevant; processing methods give you the ability to make sense of the data in a meaningful way; deployment instrumentation lets you reflect on your computation and create the new set of inputs for the next decision-making procedure; and security measures give you confidence that your data computation procedure is consistent and trustworthy.
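The four stages above can be sketched as a minimal pipeline. This is an illustrative sketch only; the stage functions, field names, and the averaging step are assumptions, not a prescribed implementation:

```python
def organize(raw: list) -> list:
    """Organization: perceive which data inputs are relevant."""
    return [r for r in raw if r.get("relevant")]

def process(records: list) -> float:
    """Processing: make sense of the data (here, a simple average)."""
    values = [r["value"] for r in records]
    return sum(values) / len(values) if values else 0.0

def secure(result: float, lo: float, hi: float) -> float:
    """Security: constrain the result to a trusted, expected range."""
    if not (lo <= result <= hi):
        raise ValueError(f"result {result} outside trusted range [{lo}, {hi}]")
    return result

def deploy(result: float) -> dict:
    """Deployment: publish the output so it can feed the next decision."""
    return {"value": result, "relevant": True}

raw = [{"value": 10.0, "relevant": True},
       {"value": 99.0, "relevant": False},
       {"value": 20.0, "relevant": True}]
next_input = deploy(secure(process(organize(raw)), lo=0.0, hi=100.0))
```

Note how `deploy` emits a record in the same shape `organize` accepts: the output of one computation is ready to be the input of the next.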

The quality of impact created by data computation is determined by the level of defined security, known as [[partially ordered computation]]. Security restricts impact within levels of defined correctness in order to reduce the potential for risk and harm. If a computation has large impact and little security it will be vulnerable and potentially destructive; if a computation is too secure it can prevent any meaningful impact from taking place. The balance between the two is known as computational correctness. By restricting impact to a level of correctness defined by contracts, computation can be effective and improve with few problems.
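One way to read "restricting impact into a level of correctness defined by contracts" is design-by-contract: explicit pre- and post-conditions that bound what a computation may accept and emit. A minimal Python sketch, where the decorator and the numeric bounds are illustrative assumptions:

```python
def with_contract(precondition, postcondition):
    """Wrap a computation so its impact stays inside a level of
    correctness defined by explicit pre- and post-conditions."""
    def wrap(fn):
        def checked(*args):
            assert precondition(*args), "input outside the contract"
            out = fn(*args)
            assert postcondition(out), "output outside the contract"
            return out
        return checked
    return wrap

@with_contract(precondition=lambda x: 0 <= x <= 100,
               postcondition=lambda y: 0 <= y <= 10)
def scaled(x: float) -> float:
    # Impact is bounded: inputs and outputs outside the contract are refused.
    return x / 10
```

Too loose a contract leaves the computation vulnerable; too tight a contract rejects everything. Choosing the bounds is exactly the correctness balance the paragraph describes.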

===DevSecOps: Processing, Securing, and Deploying Data===

The process used to create this balance is known as [[DevSecOps]].

Data ensures all participants have an equal opportunity to make decisions, and the private-key mechanism delegates decision making to a specific account.

Action and response: breaking symmetry and finding symmetry.

The general theory of natural equivalence.