Physics simulations run on centralized servers due to a lack of decentralized infrastructure for academic and industry collaboration and use cases.
Free, decentralized physics computations that reward computation and data contributions from corporate, academic, and community participants on Cardano.
Overview
A myriad of sectors, including pharmaceuticals, energy, and semiconductors, depend heavily on large simulations of physical systems based primarily on traditional methods such as Molecular Dynamics and Density Functional Theory. For example, during the Covid-19 pandemic, millions of Molecular Dynamics simulations were run, largely independently, on the ACE2 receptor and spike protein to better understand their binding mechanisms [3]. Currently, most of this information is dormant, redundant, and inconclusive: dormant because simulation data is analyzed for publications or industrial applications and then held on local storage units; redundant because teams around the world often run highly similar simulations; and inconclusive because single simulations frequently lack enough information to support firm conclusions. Centralized infrastructures are thus rather limiting for developing AI-centric frameworks that improve the efficiency and accuracy of physics computation and knowledge extraction. And this is only the surface of the problem. The larger issue is that there is no natural way to incorporate vast and diverse amounts of physics information (experiments, quarks, chemicals, proteins), data, knowledge, and algorithms in a cohesive and synergetic manner.
We narrowly missed out on funding this project in Fund7 in the AI category (our first attempt). Since then, we have refactored the proposal and updated our plans.
Objectives and Goals
Our end goal is clear: to create the right infrastructure to incentivize mass adoption of Cardano-based protocols in computationally oriented scientific communities, including academia, industry, start-ups, and individual community members.
We are creating a decentralized protocol for the simulation of physical systems, leveraging NuNet for computational resources and SingularityNET for AI enhancements, with open-ended improvements drawing on anything from deep learning [1] to neuro-symbolic AI [2], quantum chemistry [4], and cognitive architectures [5]. Additionally, we are building a tokenomics system to incentivize computation, data, algorithm development, mining, and community rewards for collaboration and support from individual community members, academics, and even corporations. One of our driving principles is coupling advancements in artificial intelligence to advancements in functional near-term technologies.
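To make the protocol concrete, here is a minimal sketch of how a simulation request might be described before being dispatched to decentralized compute nodes. All class and field names are illustrative assumptions for this proposal, not the actual NuNet or SingularityNET API.

```python
from dataclasses import dataclass, field

@dataclass
class SimulationJob:
    """Hypothetical description of one physics computation request."""
    system: str            # identifier of the molecular system to simulate
    method: str            # e.g. "MD" (Molecular Dynamics) or "DFT"
    steps: int             # number of simulation steps requested
    gpu: bool = True       # whether the job should run on GPU nodes
    metadata: dict = field(default_factory=dict)

    def to_manifest(self) -> dict:
        """Serialize to a manifest a scheduler could hand to worker nodes."""
        return {
            "system": self.system,
            "method": self.method,
            "steps": self.steps,
            "gpu": self.gpu,
            **self.metadata,
        }

# Example: a long MD run on a spike-protein system (hypothetical ID)
job = SimulationJob(system="spike-protein-ACE2", method="MD", steps=1_000_000)
manifest = job.to_manifest()
```

A manifest like this is the natural unit to price in tokens and to route through smart contracts once the compute layer matures.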
Our solutions will be useful in markets such as biotechnology, artificial intelligence, and chemical synthesis, among others. These are quickly growing markets, and bridging their demand into the Cardano ecosystem would be a major win for the health of the ecosystem. Take the biotechnology market alone: it is expected to surpass $1.5 trillion by 2030, growing at nearly ten percent per year [6].
The paradigm shift we are creating with SingularityNET and NuNet stems from building a computational and algorithmic environment for end-to-end integration of multi-scale simulations: theoretical and AI algorithms built up from heterogeneous data sources, symbolic knowledge extraction, and cognitive principles, leading to a highly interconnected framework for self-consistent computations in the physical sciences. The design mimics High Performance Computing infrastructures, and in principle we should be able to simulate molecular systems faster than many of the top supercomputers once NuNet is fully developed with a large enough ecosystem. All of our code will be developed for parallelized, multi-virtual-node CPUs/GPUs. By integrating AI, we should also be able to surpass many of the conventional bottlenecks of such computations.
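The physics kernels in question are naturally data-parallel. As a small illustration, here is a vectorized Lennard-Jones energy computation, the kind of pairwise kernel that maps directly onto multi-core CPUs or GPUs (for example by swapping NumPy for CuPy or JAX). This is a standard textbook potential, shown only to indicate the style of code we will parallelize; it is not the proposal's final codebase.

```python
import numpy as np

def lennard_jones_energy(positions, epsilon=1.0, sigma=1.0):
    """Total Lennard-Jones potential energy of a particle configuration.

    Fully vectorized over all particle pairs, so the same code
    parallelizes across cores or accelerator devices.
    """
    # Pairwise displacement vectors and distances via broadcasting
    diff = positions[:, None, :] - positions[None, :, :]
    r = np.sqrt((diff ** 2).sum(axis=-1))
    # Upper-triangle indices: count each pair exactly once
    i, j = np.triu_indices(len(positions), k=1)
    sr6 = (sigma / r[i, j]) ** 6
    return float(np.sum(4.0 * epsilon * (sr6 ** 2 - sr6)))

# Two particles at the LJ minimum distance r = 2^(1/6) * sigma,
# where the pair energy is exactly -epsilon
pos = np.array([[0.0, 0.0, 0.0], [2 ** (1 / 6), 0.0, 0.0]])
energy = lennard_jones_energy(pos)  # -1.0
```

Distributing chunks of the pair list across NuNet worker nodes is the multi-node analogue of this single-array vectorization.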
Industry and community
From an industry perspective, users (entities taking advantage of our computational protocol) can exchange tokens for theoretical computations on a particular system of study and/or for private or public algorithms developed by various entities (individuals, research labs, corporations, community members). From a community perspective, contributors are rewarded for their contributions to data, computation, and algorithm development, among others.
Rewards are mostly obtained through the following contributions: physics data (experiments, simulation data, theory), computational resources and storage, algorithm development (developing new algorithms, training neural networks, improving existing networks), mining, and technology development. The first two are self-explanatory. In short, mining is the eventually-automated process of performing specific computations suggested by community members or recommended by an AI agent, which anyone can take part in via staking or resource allocation. Additionally, entities that develop on the protocol (through any of the above, including mining) receive a predetermined ratio of the tokens paid by industrial users, distributed via smart contracts.
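The predetermined-ratio mechanism can be sketched in a few lines. The specific percentages below are placeholders for illustration only; the actual split would be fixed on-chain by the protocol's smart contracts.

```python
from fractions import Fraction

# Hypothetical reward ratios (placeholders, not protocol values)
REWARD_RATIOS = {
    "data": Fraction(30, 100),        # physics data contributions
    "compute": Fraction(30, 100),     # computational resources / storage
    "algorithms": Fraction(25, 100),  # new or improved algorithms
    "mining": Fraction(15, 100),      # community/AI-suggested computations
}

def split_payment(tokens_paid: int) -> dict:
    """Split an industrial user's token payment among contributor classes."""
    # Ratios must partition the whole payment
    assert sum(REWARD_RATIOS.values()) == 1
    return {k: int(tokens_paid * r) for k, r in REWARD_RATIOS.items()}

# Example: a 10,000-token payment from an industrial user
shares = split_payment(10_000)
```

Exact rational arithmetic (`Fraction`) avoids the rounding drift that floating-point ratios would introduce into token accounting.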
The Miscellaneous challenge is defined such that there is no other well-fitting category for our proposal. Some semi-related categories were Developer Ecosystem, Open-Source Ecosystem, and Business Creation. While our end objective partially aligns with each of these, the main focus and output of the current proposal is one particular phase of development: obtaining a foundational suite of algorithms and infrastructure from which to pursue further developments at later funding events (through Project Catalyst, SingularityNET's DeepFunding, or third-party funding). Since this proposal is foundational specifically to physics algorithms and their implementation with NuNet, we find it difficult to make a compelling argument in other challenge settings. That said, we do have smaller proposals to begin outreach to both industry and academic collaborators in parallel, where possible.
The main risks are general technical research-and-development uncertainties and the complexity of the project on that side. We are fairly confident the team can handle difficulties, though that may require additional time and work. We are also working with NuNet, and any delays on their side could be problematic in the near term, but this can be circumvented by focusing on the details that can be implemented right away. They are a well-proven team; delays may happen, but they build great code.
Overview of Specific Algorithms to be implemented
Overview of High Level Design
Note that funding covers up to month 8. We will then look for continued funding from future Project Catalyst cycles or alternate sources.
These are somewhat difficult to define precisely, as we will be developing our protocol as NuNet itself matures. Many of our timeline objectives will therefore depend on NuNet's progress.
Miscellaneous: hardware for local testing and development is not needed, as we currently have self-owned servers. Any additional resources will be obtained out-of-pocket to improve our chances of obtaining funding.
Function: Physics Protocol Engineering | Person-months: 8 | People: 1 | Monthly salary: $3,500 | Total: $28,000
Justin Diamond - PhD Candidate - AI Researcher in Physics, Chemistry, Pharma, Bioinformatics at academic institutions including University of Michigan, Toyota Technological Institute of Chicago, Boston University, University of Luxembourg, and University of Basel.
Years of experience in academic settings studying machine learning related to chemistry, physics, bioinformatics, and drug development. For example, at the University of Michigan I worked on machine learning for protein structure prediction (with Dr. Jinbo Xu, one of the inspirations for DeepMind's AlphaFold), and at the University of Luxembourg I worked on generative machine learning models for calculating thermodynamic properties of small molecules, as well as quantum mechanical and Molecular Dynamics studies of the coronavirus spike protein in a highly parallelized and distributed fashion on an HPC.
https://www.linkedin.com/in/justin-sidney-diamond-881798193
https://github.com/blindcharzard
Floriane Le Floch - Founding Member of the AI startup lili.ai and Web3 Consultant
https://fr.linkedin.com/in/floriane-le-floch-678391a4
This Catalyst proposal will aid Hetzerk in prototyping large-scale computations of physical systems using SingularityNET and NuNet, allowing for continued growth and progressive development with further funding.
Successful measures include an increased number of transactions on Cardano due to SingularityNET AI service calls.
Our end goal is simple: we are building computational infrastructure on Cardano, in collaboration with NuNet and SingularityNET, to create the right incentive structures and a recursive cycle of development, roll-out, rewards, increased efficiency, and a growing user base, culminating in a decentralized platform of simulation-based solutions for academic and industrial problems such as AI-based drug development or simulations of biomolecules with quantum mechanical algorithms.
One of the key incentive mechanisms is to give academic groups, at minimum, free computational resources. We can do this by coupling the value spent on computation to the gain of actionable knowledge, data, and algorithms for some of the most computationally demanding problems, which are in dramatic need of better data, algorithms, knowledge, and efficient, connected solutions. Creating a net-profitable cycle will take time and a growing ecosystem of partnerships and solutions, but by building now we lay the infrastructure that will, through decentralized protocols, naturally open more opportunities and participation in the future of beneficial technological and materials development.
These are medium to long term goals that we hope to accomplish in the next three to five years.
In contrast, at the end of the eight months of funding, we will have a foundational set of key algorithms to start generating valuable data. The current state of artificial intelligence and machine learning relies heavily on accurate data, and being able to obtain this data in a connected fashion to reliably train machine learning models is crucial for developing computational solutions in an automated fashion.
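Obtaining training data "in a connected fashion" also means pooling frames from independently run simulations while dropping the redundant overlap described above. A minimal sketch, assuming frames are coordinate arrays and that rounding-then-hashing is an acceptable duplicate test (a real pipeline would use a more robust structural similarity measure):

```python
import hashlib
import numpy as np

def frame_fingerprint(coords: np.ndarray, decimals: int = 3) -> str:
    """Hash a simulation frame's coordinates, rounded for tolerance."""
    return hashlib.sha256(np.round(coords, decimals).tobytes()).hexdigest()

def pool_datasets(*datasets):
    """Merge frames from independent sources, dropping duplicate frames."""
    seen, pooled = set(), []
    for frames in datasets:
        for frame in frames:
            fp = frame_fingerprint(frame)
            if fp not in seen:
                seen.add(fp)
                pooled.append(frame)
    return pooled

# Two hypothetical labs whose simulation data partially overlaps
lab_a = [np.zeros((3, 3)), np.ones((3, 3))]
lab_b = [np.ones((3, 3)), np.full((3, 3), 2.0)]
merged = pool_datasets(lab_a, lab_b)  # 3 unique frames remain
```

Deduplicated, pooled datasets of this kind are exactly what makes the "redundant and dormant" simulation data described in the overview usable for model training.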
This gives us the bare minimum needed to operate at the same practical level as High Performance Computing clusters, which are conventionally used to simulate systems of millions of atoms. This is made possible in collaboration with NuNet (a computational network for CPUs, GPUs, and storage); we will develop our codebase hand in hand with them to ensure efficient solutions, making us one of the first use cases on the NuNet platform.
Entirely new proposal
References:
[1] https://pubs.acs.org/doi/10.1021/acs.accounts.0c00472
[2] https://arxiv.org/abs/2006.11287
[3] https://pubs.acs.org/doi/10.1021/acscentsci.0c01236
[4] http://quantum-machine.org/gdml/
[5] https://arxiv.org/abs/1410.5401
[6] https://www.globenewswire.com//news-release/2022/01/18/2368681/0/en/Biotechnology-Market-Size-to-Surpass-US-1-683-52-Bn-by-2030.html