DoD Agency DARPA Taps Inca Digital to Assess Crypto Impacts

The Defense Advanced Research Projects Agency (DARPA), a division of the United States Department of Defense (DoD), has tapped Inca Digital to develop mapping tools for analyzing the national security impacts of cryptocurrencies.

The project is designated as Phase II of a Small Business Innovation Research (SBIR) contract. Digital currencies are permeating the broader economy, and the Department of Defense is interested in understanding how their use could unsettle the financial world.

Inca Digital will develop a first-of-its-kind cryptocurrency ecosystem mapping tool for analyzing cross-market crypto-financial data and risk. As announced, the outcome of the research will be used by both the United States government and commercial companies to conduct crypto-financial mapping and analysis.

“Digital asset markets hold amazing promise but also contend with money laundering, market manipulation, and state actors that may pose risks to U.S. national security,” said Adam Zarazinski, CEO of Inca Digital. “Given the increasing prevalence of digital assets, the Department of Defense and other federal agencies need to have better tools to understand how digital assets operate and how to leverage their jurisdictional authority over digital asset markets globally.”

The research will also enable key stakeholders in the digital assets ecosystem to understand the flow of resources in and out of the blockchain ecosystem. While the research is funded by DARPA, the agency said the publication of results does not reflect the position of the Department of Defense.

It is not uncommon for top government agencies in the US and worldwide to partner with crypto startups to better understand the digital currency ecosystem.

Back in 2020, NASA funded a blockchain-based solution for space communication, while the US Space Force made its first foray into the Non-Fungible Token (NFT) ecosystem in June 2021.

Harvard's Breakthrough in Quantum Computing: A Leap Towards Error-Correction and Noise Reduction

A group of researchers from Harvard University, in conjunction with QuEra Computing Inc., the University of Maryland, and the Massachusetts Institute of Technology, has disclosed a substantial advancement in quantum computing. The Defense Advanced Research Projects Agency (DARPA) funded the development of a one-of-a-kind processor designed to overcome two of the field's biggest problems: noise and errors.

Noise that affects qubits (quantum bits) and causes computational errors has long been a major obstacle for quantum computing and for improving the technology. Until now, quantum computers with more than a thousand qubits have been thought necessary to perform the enormous amount of error correction required, and this is the issue that has prevented these machines from being widely used.

In a groundbreaking study published in the peer-reviewed scientific journal Nature, the Harvard-led team disclosed its strategy for addressing these concerns. The researchers used logical qubits: collections of physical qubits linked together by quantum entanglement so that they behave as a single, error-protected qubit. In contrast to the conventional approach of error correction, which relies on duplicate copies of information, this technique exploits the redundancy inherent in a logical qubit.
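To see why redundancy lets errors be detected without reading out the protected information directly, here is a minimal classical analogy (not the team's actual quantum code): a 3-bit repetition code plays the role of a logical qubit, and parity checks play the role of stabilizer measurements.

```python
# Classical analogy for redundancy-based error detection (an
# illustration, not the Harvard team's quantum implementation):
# one logical bit is stored redundantly across three physical bits,
# and parity checks flag errors without examining the data directly.

def encode(bit):
    """Encode one logical bit as three redundant physical bits."""
    return [bit, bit, bit]

def syndromes(block):
    """Parity checks between neighboring bits; nonzero means error."""
    return (block[0] ^ block[1], block[1] ^ block[2])

def decode(block):
    """Majority vote recovers the logical bit despite one flip."""
    return 1 if sum(block) >= 2 else 0

block = encode(1)
block[1] ^= 1              # inject a single bit-flip "error"
print(syndromes(block))    # (1, 1): both checks flag the middle bit
print(decode(block))       # 1: the logical value survives the error
```

The parity checks reveal *that* and *where* an error occurred while saying nothing about the encoded value itself, which is loosely the property that quantum error correction needs, since directly measuring a qubit would destroy its state.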

Using 48 logical qubits, more than ever achieved before, the team effectively performed large-scale computations on an error-corrected quantum computer. This was made possible by constructing and entangling the largest logical qubits ever created, demonstrating a code distance of seven, which indicates stronger resilience to quantum errors.

To construct the processor, thousands of rubidium atoms were isolated in a vacuum chamber and chilled to a temperature very close to absolute zero using lasers and magnets. With the help of additional lasers, 280 of these atoms were turned into qubits and entangled, creating 48 logical qubits. Rather than using wires, these qubits communicated with one another via optical tweezers.

Compared with previous, larger machines based on physical qubits, the new quantum computer demonstrated a far lower error rate during computations. Instead of correcting errors as they occur mid-computation, the Harvard team's processor incorporates a post-processing error-detection phase in which erroneous outputs are discovered and discarded. This offers a faster route to scaling quantum computers beyond the current Noisy Intermediate-Scale Quantum (NISQ) era.
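The discard-rather-than-correct idea can be sketched in a few lines (a hypothetical illustration, not the processor's real pipeline): each measurement "shot" carries error-check bits, and any shot whose checks fire is simply thrown away, trading shot count for accuracy.

```python
# Minimal sketch of post-processed error detection (illustrative
# data, not real experimental output): shots whose syndrome bits
# signal an error are discarded instead of being corrected.

shots = [
    {"result": "0110", "syndrome": (0, 0)},  # clean shot: keep
    {"result": "1011", "syndrome": (1, 0)},  # error detected: drop
    {"result": "0110", "syndrome": (0, 0)},  # clean shot: keep
    {"result": "0001", "syndrome": (1, 1)},  # error detected: drop
]

# Keep only shots whose error checks all came back clean.
accepted = [s["result"] for s in shots if not any(s["syndrome"])]
print(accepted)                    # ['0110', '0110']
print(len(accepted) / len(shots))  # 0.5 -- fraction of shots kept
```

The surviving sample is smaller but cleaner, which is why this approach lowers the effective error rate without the full machinery of mid-computation correction.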

This accomplishment opens new opportunities for quantum computing. It is a big step toward scalable, fault-tolerant quantum computers capable of tackling problems that have traditionally been intractable. In particular, the study highlights the potential for quantum computers to perform computations and combinatorics that are not feasible with today's technology, opening an entirely new avenue for the advancement of quantum technology.
