A Global Network of Personal Knowledge Container
Context
Given the imminent waves of products and services built on rapidly emerging Artificial Intelligence Generated Content (AIGC) technologies, protecting personal and organizational data sovereignty is becoming ever more challenging. To address this issue, I propose the following approaches:
Goal
Provide functionally equivalent data processing, machine learning, and social/smart contract verification and execution capabilities to all members of society.
Success Criteria
Individuals and organizations of all privilege levels can share a common data exchange protocol that enables fair and voluntary participation.
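To make the idea of a common data exchange protocol concrete, here is a minimal sketch in Python of what a shared record format might look like. The PKCRecord name, its fields, and the content-addressing scheme are illustrative assumptions, not an existing PKC specification.

# Hypothetical sketch of a minimal PKC data exchange record.
# The class name, fields, and canonical-JSON encoding are illustrative
# assumptions, not an existing PKC specification.
import json
import hashlib
from dataclasses import dataclass, asdict

@dataclass
class PKCRecord:
    author_id: str      # public identifier of the originating PKC
    content_type: str   # e.g. "text/markdown", "dataset/csv"
    payload: str        # the shared content itself
    license: str        # terms under which the content may be reused

    def canonical_bytes(self) -> bytes:
        # Canonical JSON so every PKC computes the same digest.
        return json.dumps(asdict(self), sort_keys=True).encode("utf-8")

    def content_id(self) -> str:
        # Content-addressed identifier: any peer can recompute and verify it.
        return hashlib.sha256(self.canonical_bytes()).hexdigest()

record = PKCRecord("pkc.example.org/alice", "text/markdown",
                   "Notes on local water quality data", "CC-BY-4.0")
print(record.content_id())

Because the identifier is derived from the content itself, a receiving PKC can recompute and check it locally without trusting the sender, which is one way a shared protocol can stay fair across very different levels of privilege.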
Inputs
- Provide a personalized cloud solution, the Personal Knowledge Container (PKC), which enables individuals and organizations to scale their data solutions from personal computing devices to high-end data centers.
- It is necessary to enable many instances of independently administered, geographically distributed PKCs, so that data and localized knowledge can be organized and curated according to individual and distributed wills. In other words, large-scale citizen participation across diverse locations and cultures is highly desired.
- Publicly known cryptographic data protection protocols, together with identity-protecting Smart Contract execution and Blockchain-based transaction services, could help set up the initial platform (a minimal signing sketch follows this list). See Zenroom and The EU Interfacer Project.
- Provide public computing resources to filter publicly submitted data content and train on it, creating diverse and highly functional Large Language Models (LLMs).
- Create a knowledge asset brokerage institution focused on creating, filtering, sharing, and trading LLMs with businesses and individuals who require locally tailored or customized models. Allowing small firms and individuals to participate in the design and composition of LLMs fosters system variety and a broader user base, which encourages freedom of expression.
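As a minimal sketch of the cryptographic protection mentioned above, the following Python example signs a knowledge-asset record with an Ed25519 key using the PyNaCl library. PyNaCl is chosen here only as one widely known cryptographic library; this is not Zenroom's API, and the record format is an assumption rather than an agreed PKC standard.

# Illustrative sketch only: signing a knowledge-asset record so other PKCs
# can verify its origin. PyNaCl (Ed25519) is used as one example of a
# publicly known cryptographic library; Zenroom, referenced above, exposes
# a different interface.
from nacl.signing import SigningKey, VerifyKey
from nacl.encoding import HexEncoder

# Each PKC operator would hold a long-lived signing key.
signing_key = SigningKey.generate()
verify_key_hex = signing_key.verify_key.encode(encoder=HexEncoder)

asset = b'{"title": "Local rainfall dataset", "license": "CC-BY-4.0"}'
signed = signing_key.sign(asset)

# Any receiving PKC can check the signature against the published key;
# verify() raises an exception if the record was tampered with.
VerifyKey(verify_key_hex, encoder=HexEncoder).verify(signed)
print("signature verified")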
Activities
- Develop new, and collect existing, technical solutions to build the PKC and data protection infrastructures.
- Organize educational and promotional events to make these solutions, and the potential opportunities and threats, known to a broad audience.
- Encourage exchanges and interconnections between different PKCs, forming a larger computing network that gains robustness in a decentralized manner (see the peer-exchange sketch after this list).
- Identify productive and rewarding data resources, algorithms, scientific concepts, and governance models to promote the sustainability of this global network.
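The decentralized interconnection of PKCs could start with something as simple as peers exchanging the endpoints they already know. The following Python sketch is an assumed gossip-style illustration; the PKCNode class and the example URLs are hypothetical, not part of any deployed PKC network.

# Hedged sketch of how independently run PKCs might discover one another:
# each node keeps a set of known peer endpoints and merges it with the sets
# advertised by its peers. Endpoint URLs below are placeholders.
class PKCNode:
    def __init__(self, endpoint: str):
        self.endpoint = endpoint
        self.known_peers = {endpoint}

    def exchange_peers(self, other: "PKCNode") -> None:
        # Both sides end up with the union of what either node knew before.
        merged = self.known_peers | other.known_peers
        self.known_peers = set(merged)
        other.known_peers = set(merged)

a = PKCNode("https://pkc-a.example.net")
b = PKCNode("https://pkc-b.example.org")
c = PKCNode("https://pkc-c.example.com")

a.exchange_peers(b)   # a and b now know each other
b.exchange_peers(c)   # c learns about a indirectly through b
print(sorted(c.known_peers))

Because knowledge of peers spreads pairwise rather than through a central directory, no single operator has to be trusted for the network to keep growing, which is the robustness property the activity above aims for.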
Outputs
- Create a self-governing community of PKC operators that shares and exchanges data and knowledge assets.
- Operate regular events to promote best practices and late-breaking approaches that better protect data sovereignty while upholding social justice.
- Initially, raw data collection, data cleansing, and LLM creation and certification using PKC will be the main services and products (a minimal cleansing sketch follows this list).
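As an assumed illustration of the data cleansing service, the sketch below deduplicates submitted text records and drops fragments too short to be useful for LLM training. The cleanse function, its threshold, and the sample records are illustrative, not a defined PKC pipeline.

# Minimal, assumed example of a data cleansing step: deduplicate submitted
# text records and drop entries that are too short to be useful for LLM
# training. Thresholds and inputs are illustrative only.
import hashlib

def cleanse(records, min_chars=40):
    seen = set()
    kept = []
    for text in records:
        text = text.strip()
        if len(text) < min_chars:
            continue                      # drop fragments
        digest = hashlib.sha256(text.lower().encode("utf-8")).hexdigest()
        if digest in seen:
            continue                      # drop exact duplicates
        seen.add(digest)
        kept.append(text)
    return kept

raw = ["  Hello ",
       "A short but sufficiently long note about local air quality.",
       "A short but sufficiently long note about local air quality."]
print(cleanse(raw))  # only one record survives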
Boundary Conditions
- The proposal assumes that the existing blockchain and data center infrastructures can handle the demands of large-scale machine learning computation and data processing required for the proposed service, which may not be the case.
- The proposal does not address potential security and privacy concerns around the use of blockchain and Smart Contracts in the management of sensitive data assets.
- The proposal also assumes that there is a demand for custom-made or localized Large Language Models, and that organizations and individuals are willing to pay for them, which may not be the case.
- Edited with Grammarly and QuillBot.