Machine Learning Network Fetch.ai Shares Vision of Interoperable Blockchains

Fetch.ai has an interoperability vision to bring machine learning services to every ledger and blockchain.

In an email to Blockchain.News, Fetch.ai today announced its interoperability vision to bring machine learning services to every ledger and all chains. The AI network also gave us an update on its imminent upgrade to the Fetch.ai test network, which will incorporate a new performance-focused FET virtual machine based on WebAssembly.

Fetch.ai Upgrade

The Fetch.ai Virtual Machine (VM) went live in February 2019. It is a computer program that emulates a processor and executes smart contracts when transactions are made across the network.

In the case of Fetch.ai and other distributed ledgers, the VM is critical as it enables smart contracts to be deployed and transactions to run in multiple places. In short, it enables a massively replicated execution of the smart contract across all computers on the Fetch.ai network, accounting for the different types of hardware that may be involved. The VM is fully integrated with the Fetch.ai Smart Ledger, so a record of each smart contract is stored.
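The replicated-execution idea can be illustrated with a toy sketch (plain Python, not Fetch.ai's actual VM or contract format): every node applies the same ordered transactions to the same contract logic, so all replicas converge on an identical state.

```python
def transfer(state, sender, recipient, amount):
    """A toy 'smart contract': move tokens between balances."""
    if state.get(sender, 0) < amount:
        raise ValueError("insufficient balance")
    new_state = dict(state)
    new_state[sender] -= amount
    new_state[recipient] = new_state.get(recipient, 0) + amount
    return new_state

def replicate(initial_state, transactions, node_count=3):
    """Run the same ordered transaction list independently on every node."""
    results = []
    for _ in range(node_count):
        state = dict(initial_state)
        for tx in transactions:
            state = transfer(state, *tx)
        results.append(state)
    return results

nodes = replicate({"alice": 100}, [("alice", "bob", 30), ("bob", "carol", 10)])
assert all(n == nodes[0] for n in nodes)  # every replica agrees
print(nodes[0])  # {'alice': 70, 'bob': 20, 'carol': 10}
```

Determinism is what makes this work: as long as contract execution is deterministic, the hardware running each node is irrelevant to the result.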

Per the announcement on July 21, the Fetch.ai virtual machine will be upgraded on the test network later this week, along with new documentation to enable agent developers to prepare for the migration onto the live network.

Commenting on the announcement, Fetch.ai CEO, Humayun Sheikh said:

“This evolution in our network will in time unlock huge value for the Fetch.ai services, with new opportunities opening for every chain to access AI and machine learning services, powered by the FET token”.

Interoperability of Version 2.0

Fetch.ai is an artificial intelligence lab based in Cambridge, building an open-access, tokenized, decentralized machine learning network. This open-source software stack allows organisations to build or configure applications on top of a digital representation of the world in which “software agents” autonomously search, negotiate, and transact. The planned version 2.0 of the Fetch.ai main net will be interoperable with Cosmos Hub and other chains such as Ethereum via Cosmos’ IBC bridge, with a roadmap to enable full cross-chain compatibility. Jonathan Ward, Fetch.ai’s Head of Research, said:

“This move allows our AI technology to be applied in areas where it has had greatest impact in the centralized world such as social networks, gaming and, of course, finance.”

Resonate Partners with Fetch.ai, Rendering a Decentralized AI-Powered Social Media Experience

To offer users a personal AI-powered, decentralized, and trusted social media experience, Resonate, a decentralized social non-fungible token (NFT) platform, is leveraging the power of blockchain technology and machine learning provided by Fetch.ai.  

Through the strategic partnership, Resonate seeks to harness the digital twins, or autonomous economic agents, provided by Fetch.ai to automate its decentralized system.

As a result, it can offer users a Web 3.0 social media experience free of untrustworthy and malicious actors and sources.

With the power of blockchain technology, Resonate enables users to make videos, photos, and posts and transform them into non-fungible tokens (NFTs).

Abhinav Ramesh, CEO of Resonate.Social, said:

“We are creating Resonate because we feel that there could be a protocol-level linkage between social networks and the inherent properties of an NFT, kind of like combining the best of social media and NFTs.”

Eliminating centralized curation of social feeds

Resonate intends to simplify the process of NFT creation and trading so that it costs only fractions of a dollar.

“We felt that there is a need to simplify the NFT market to enable easy, social-focused creation and trade of NFTs. Resonate helps users easily create NFTs at a very low cost, and brings about comments/likes/shares on the NFT within the same platform.”

To provide a seamless flow of social experiences, the partnership seeks to eliminate the centralized curation of personal social feeds and to solve the problem of value exchange through NFTs and tokens.

Humayun Sheikh, the CEO and Founder of Fetch.ai, welcomed the collaboration and stated:

“We are excited Resonate.social team chose to build on the Fetch.ai network. Social networks form the center of many other services including NFT marketplace and SoFi and Resonate plans to expand into several of these.”

Through Fetch.ai’s autonomous ecosystem, the digital twins will assist users in curating their feeds for NFT-driven tokenomics, social clout, privacy, and safety. 

Meanwhile, social trading is gaining steam because it enables traders to compare techniques and learn from one another. This social aspect helps users understand how to create and invest in financial NFTs.

Blockchain in Supply Chain Market Anticipated to Top $14.88 Billion by 2028

The increasing adoption of product traceability for enhanced transparency in manufacturing processes and the push for stronger security will drive the blockchain in supply chain market past $14.88 billion by 2028, according to a report by Research Dive.

The study noted that the compound annual growth rate (CAGR) would be 57.4% during the forecast period between 2021 and 2028. 

The deployment of blockchain technology in e-commerce websites has spurred growth. It is expected to revamp the supply chain market in the coming years by providing product traceability, quality control, and transparency in manufacturing processes.

The pandemic has driven a surge in e-commerce websites, a shift enabled by technologies such as machine learning (ML) and artificial intelligence (AI).

Per the report:

“By application, the product traceability sub-segment of the blockchain in supply chain market is anticipated to be the fastest-growing and reach $3.38 billion by 2028.”

The study noted that significant opportunities arise from the need for automated, efficient, and transparent supply chains, a void that blockchain technology is expected to fill.

The Asia-Pacific area is anticipated to be the fastest-growing region, amassing revenue worth $4.063 billion. The study acknowledged:

“Increasing technological advancements and growing adoption of blockchain technology by leading organizations of this region to make supply chains more robust have been the main factors behind the growth of blockchain in supply chain market in the Asia-Pacific region.”

According to the report, some of the prominent players in the blockchain in supply chain market include Oracle, Microsoft, Huawei, TIBCO Software, AWS, and IBM.

Nevertheless, the research pointed out that the lack of awareness about blockchain technology might be a stumbling block to the projected growth.

Meanwhile, the global blockchain technology market is anticipated to reach $19.9 billion by 2026 from the current $3.4 billion value, according to market research publisher Global Industry Analysts Inc.

Fetch.ai Launches Blockchain-based File-Sharing Platform for Data Monetization Management

Fetch.ai, a machine learning-based blockchain platform, has rolled out an end-to-end encrypted file-sharing platform dubbed DabbaFlow, designed to facilitate secure data sharing.

Per the announcement:

“Advanced data-sharing and privacy-preserving technologies are ushering in a new era of data monetization. Fetch.ai’s first addition to its CoLearn ecosystem, DabbaFlow, empowers individuals and companies to take more control over their data and turn them into real business outcomes while keeping their data private and secure.”

Therefore, Fetch.ai sees DabbaFlow as a stepping stone towards making data auditable, verifiable, and secure because it is powered by blockchain technology. 

Since data is the new oil, Humayun Sheikh believes refineries and rigs that keep up with the times are needed.

The founder and CEO of Fetch.ai added:

“People are beginning to understand how valuable their data is, and with the paradigm shifting towards more secure and decentralized solutions, new business models are emerging. DabbaFlow is here to provide the data management tools to create powerful AI models that are relevant to a distributed web.”

Data transfers are primary for running a business in the new digital era. As a result, the amount of data shared online has increased exponentially. 

Nevertheless, data breaches have become widespread, causing privacy and security threats that have tarnished business reputations.

Therefore, DabbaFlow seeks to bridge the gap through encryption and decentralization for enhanced data management and monetization.

Fetch.ai has been revamping different ecosystems using blockchain technology and artificial intelligence. For instance, it partnered with Resonate to offer its users a personal AI-powered, decentralized and trusted social media experience, Blockchain.News reported. 

OKX Launches AI Integration for Crypto Market Volatility

Artificial intelligence (AI) is becoming increasingly prevalent in the crypto industry, with OKX leading the way in integrating the technology to enhance user experience. On March 31, the cryptocurrency exchange and Web3 technology company announced a new integration with EndoTech.io that utilizes AI algorithms to capture crypto market volatility.

The algorithms used in the integration incorporate machine learning and other advanced techniques to conduct real-time analyses of data and trading opportunities. According to Dmitry Gooshchin, chief operating officer of EndoTech.io, understanding market volatility is essential for successful trading in the crypto space.

OKX’s adoption of AI in the crypto industry is not new. The company recently posted an AI-generated poem from ChatGPT-4 about its wallet on March 30. The poem was an example of how AI technology can be used to enhance user experience and engagement.

The integration with EndoTech.io is just one example of how AI is finding various use cases in the crypto industry. It is not only used to identify real-time market volatility but also for tracking blockchain transactions, deploying autonomous economic agents for trading, and more.

In everyday life, AI is now used for personal assistant-like tasks, social media, and customer service needs, among other use cases. However, not everyone is convinced of the benefits of AI technology. Recently, a letter signed by 2,600 researchers and tech leaders called for a pause in AI development. The letter highlighted the concern that “human-competitive intelligence can pose profound risks to society and humanity,” among other issues.

While opinions on the impact of AI in the crypto industry may be mixed, OKX continues to push forward with its AI integration strategy. This new platform update comes only a few days after the company announced its intention to expand its services to Australia while shutting down its former operations in Canada.

As AI technology continues to evolve, it will be interesting to see how it shapes the future of the crypto industry and society as a whole. While there may be concerns about its impact, the potential benefits of AI cannot be ignored.

Binance Integrates AI Chatbot ChatGPT into Its Education Platform

In late November 2022, the artificial intelligence (AI) chatbot ChatGPT made headlines worldwide, sparking much opposition to the technology. However, despite the initial pushback, the technology has continued to gain traction and see growing implementations. The latest example of this is the integration of ChatGPT into Binance Academy, the education platform of cryptocurrency exchange and blockchain developer Binance.

On April 24, Binance announced the launch of its new AI-driven tool, the Binance Sensei, which uses machine learning to source answers from Binance Academy’s education platform to help users answer questions related to Web3. The tool is essentially an “AI-powered mentor” that users prompt with a specific question or keywords. In response, the Sensei provides a “concise, approximately 150-word summary” for each user.
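The mechanics of such a tool can be sketched as simple retrieval plus truncation. Everything below, including the tiny corpus and the word-overlap scoring, is an illustrative assumption rather than Binance's implementation:

```python
# Toy corpus standing in for an education-article library.
ARTICLES = {
    "What is Web3?": "Web3 refers to a decentralized web built on blockchains "
                     "where users control their own data and assets.",
    "What is staking?": "Staking means locking tokens to help secure a "
                        "proof-of-stake network in exchange for rewards.",
}

def score(query, text):
    """Naive relevance: count words shared between query and article."""
    return len(set(query.lower().split()) & set(text.lower().split()))

def answer(query, max_words=150):
    """Return the best-matching article, capped at roughly max_words."""
    best = max(ARTICLES.values(), key=lambda text: score(query, text))
    return " ".join(best.split()[:max_words])

print(answer("how does staking work"))
```

A production system would use embeddings and a language model to rephrase the retrieved text, but the retrieve-then-summarize shape is the same.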

Although the news has been received positively by many in the Binance community, some have questioned the idea of “allowing a robot to be our teacher.” The integration of AI in education has sparked concerns about the potential misuse of the technology.

Binance Sensei is not the first AI-based learning tool to be implemented in the cryptocurrency space. Industry giants such as Microsoft, Google, and Alibaba have all announced their own versions of ChatGPT. The technology is also finding a role in bringing more efficiency to the memecoin community.

However, the adoption of AI in various industries has also led to growing concern over the technology’s capabilities if left unchecked. Italy was among the first to impose a temporary ban on the technology, while regulators across the European Union have moved to probe the AI algorithms of Big Tech companies.

Industry insiders speculate that there may be an upcoming regulatory crackdown on AI as it becomes more pervasive. In China, for instance, authorities are set to enforce mandatory security reviews for all AI services in the country.

In conclusion, the integration of AI chatbot ChatGPT into Binance Academy is another example of the increasing use of AI in various industries, including cryptocurrency. While the technology may provide many benefits, it is important to consider its potential consequences and regulate its use to prevent misuse.

Coinbase Rolls Out AI-Driven ERC-20 Scam Token Detection System

Coinbase’s Chief Legal Officer, Paul Grewal, recently shed light on how the company’s engineering team is harnessing artificial intelligence (AI) to root out ERC-20 scam tokens. The work was detailed on October 6, 2023, by Yifan Xu, Indra Rustandi, Yao Ma, and Vijay Dialani of Coinbase’s engineering team. The ERC-20 Scam Token Detection System blends smart contract auditing with machine learning prediction to identify both known and emergent scam types, marking a significant step towards a safer crypto space.

The burgeoning realm of cryptocurrency is not without its share of scams, especially surrounding new and unverified tokens. Fraudsters employ a variety of devious tactics, ranging from Honeypot scams, which are deceptive traps to ensnare investors, to Internal Fees scams involving hidden or unusually high transaction fees. The evolving nature of these scams presents a continuous challenge, with new scam types emerging daily, posing a significant threat to both investors and the broader crypto ecosystem.

To combat these challenges, Coinbase has developed the Scam Token Detection System which employs a two-pronged strategy: 

Smart contract auditing is a proactive measure to identify and filter out known scam types. By meticulously examining the integrity of tokens, this step helps to mitigate the risk of fraud by excluding tokens associated with known malicious activities, capturing and cataloging them for future reference.
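As a rough illustration of this auditing step (the field names, fee threshold, and scam list below are hypothetical, not Coinbase's schema), known scam signatures can be expressed as rule checks:

```python
KNOWN_SCAM_CONTRACTS = {"0xdeadbeef"}  # previously catalogued offenders

def audit(token):
    """Return the reasons a token matches a known scam signature."""
    flags = []
    if token["address"] in KNOWN_SCAM_CONTRACTS:
        flags.append("previously catalogued scam contract")
    if not token.get("sell_allowed", True):
        flags.append("honeypot: selling is blocked")
    if token.get("transfer_fee_pct", 0) > 10:
        flags.append("internal fees: abnormally high transfer fee")
    return flags

token = {"address": "0xabc", "sell_allowed": False, "transfer_fee_pct": 25}
print(audit(token))
# ['honeypot: selling is blocked', 'internal fees: abnormally high transfer fee']
```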

On the flip side, the system employs a machine learning framework to detect unknown scam types by identifying abnormal activity patterns. For instance, unusual patterns in time-series transactions among a concentrated group of accounts could be indicative of unknown scam types. This abnormality detection mechanism spots these irregularities, safeguarding against potential unidentified scams.
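The abnormality-detection prong can be sketched statistically. The feature (hourly transfer counts) and the 3-sigma threshold below are illustrative assumptions, not Coinbase's model:

```python
from statistics import mean, stdev

def is_abnormal(hourly_transfers, z_threshold=3.0):
    """Flag a series whose latest hour is a large outlier vs. its history."""
    history, latest = hourly_transfers[:-1], hourly_transfers[-1]
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return latest != mu
    return abs(latest - mu) / sigma > z_threshold

normal = [12, 9, 11, 10, 13, 11, 10, 12]   # steady organic activity
burst  = [12, 9, 11, 10, 13, 11, 10, 500]  # wash-trading-like spike
print(is_abnormal(normal), is_abnormal(burst))  # False True
```

A production system would combine many such features in a learned model, but the principle is the same: organic activity is statistically stable, while concentrated bursts among a small group of accounts stand out as outliers.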

The Scam Token Detection System isn’t just a technological safeguard; it translates into tangible benefits for users. The establishment of a whitelist of trusted tokens is crucial for launching Coinbase’s asset recovery service for unsupported ERC-20 tokens. This feature, coupled with the ability to hide scam/spam tokens within the Coinbase Wallet, significantly enhances the platform’s capacity to filter out spam tokens, providing a cleaner, safer, and more user-friendly experience.

Here's Why GPT-4 Becomes 'Stupid': Unpacking Performance Degradation

The realm of artificial intelligence (AI) and machine learning (ML) is constantly advancing, yet it’s not without its stumbling blocks. A prime example is the performance degradation, colloquially referred to as ‘stupidity’, in Large Language Models (LLMs) like GPT-4. This issue has gained traction in AI discussions, particularly following the publication of “Task Contamination: Language Models May Not Be Few-Shot Anymore,” which sheds light on the limitations and challenges faced by current LLMs.

Chomba Bupe, a prominent figure in the AI community, has highlighted on X (formerly Twitter) a significant issue: LLMs tend to excel on tasks and datasets they were trained on but falter on newer, unseen data. The crux of the problem lies in the static nature of these models after training. Once their learning phase is complete, their ability to adapt to new and evolving input distributions is restricted, leading to a gradual decline in performance.


This degradation is especially concerning in domains like programming, where language models are employed and where updates to programming languages are frequent. Bupe points out that the fundamental design of LLMs is more about memorization than understanding, which limits their effectiveness in tackling new challenges.

The research conducted by Changmao Li and Jeffrey Flanigan further supports this viewpoint. They found that LLMs like GPT-3 demonstrate superior performance on datasets that predate their training data. This discovery indicates a phenomenon known as task contamination, where the models’ zero-shot and few-shot capabilities are compromised by their training data’s limitations.
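The comparison behind this finding can be sketched as follows; the dataset names, years, and scores below are invented for illustration, not figures from the paper:

```python
TRAINING_CUTOFF = 2021  # assumed training-data cutoff year

# Benchmark results, grouped by dataset release year (mock data).
results = [
    {"dataset": "old_benchmark_a", "year": 2018, "score": 0.81},
    {"dataset": "old_benchmark_b", "year": 2020, "score": 0.78},
    {"dataset": "new_benchmark_a", "year": 2022, "score": 0.55},
    {"dataset": "new_benchmark_b", "year": 2023, "score": 0.52},
]

def mean_score(rows):
    return sum(r["score"] for r in rows) / len(rows)

seen   = [r for r in results if r["year"] <= TRAINING_CUTOFF]
unseen = [r for r in results if r["year"] > TRAINING_CUTOFF]

gap = mean_score(seen) - mean_score(unseen)
print(f"pre-cutoff avg {mean_score(seen):.2f}, "
      f"post-cutoff avg {mean_score(unseen):.2f}, gap {gap:.2f}")
```

A consistent positive gap on pre-cutoff datasets is what suggests contamination: the model may have seen the task, or the data itself, during training.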

Continual learning, as discussed by Bupe, emerges as a key area in machine intelligence. The challenge is developing ML models that can adapt to new information without compromising their performance on previously learned tasks. This difficulty is contrasted with the adaptability of biological neural networks, which manage to learn and adapt without similar drawbacks.

Alvin De Cruz offers an alternate perspective, suggesting the issue might lie in the evolving expectations from humans rather than the models’ inherent limitations. However, Bupe counters this by emphasizing the long-standing nature of these challenges in AI, particularly in the realm of continual learning.

To sum up, the conversation surrounding LLMs like GPT-4 highlights a critical facet of AI evolution: the imperative for models capable of continuous learning and adaptation. Despite their impressive abilities, current LLMs face significant limitations in keeping pace with the rapidly changing world, underscoring the need for more dynamic and evolving AI solutions.

Yann LeCun Reflects on the Impact of DjVu and Open-Access Publications in Machine Learning

In a recent series of tweets, Yann LeCun, a renowned figure in the field of artificial intelligence, shared his experiences and insights on the development of the DjVu image compression format and its profound impact on the machine learning (ML) and AI community. LeCun began the DjVu project in the mid-1990s at AT&T Labs, aiming to create an efficient method for distributing high-resolution scanned documents over the Internet. The DjVu format, later released in the late 90s/early 00s, found adoption by platforms such as the Internet Archive.

LeCun’s initiative to scan and distribute the complete collection of Neural Information Processing Systems (NIPS) conference proceedings further exemplified the format’s usefulness. Gaining permission from publishers Morgan Kaufmann and MIT Press, who were not earning revenue from past proceedings, LeCun and his team successfully made these resources widely accessible by 2000 through a free website.

This move was pivotal in shaping the culture of the ML/AI community towards open-access and rapid sharing of preprint publications. Around the same time, the community’s pushback against commercial journal publishers led to the creation of the Journal of Machine Learning Research (JMLR), an open-access and free journal, further endorsing this trend.

LeCun also recounted an intriguing episode with Springer, the for-profit publisher that owned rights to the first volume of NIPS. Springer initially refused permission for digital dissemination, but a surge of email requests directed at a Springer executive led to a rapid reversal of the decision, highlighting the community’s collective influence.

Other contributors to the DjVu project, such as Léon Bottou and Patrick Haffner, were acknowledged by LeCun for their significant roles. The format’s legacy extends beyond academic circles, influencing projects like Google’s book scanning initiative and the Internet Archive’s Million Books project.

LeCun’s reflections shed light on the evolving dynamics of intellectual property in the digital age, emphasizing the importance of open-access resources in democratizing knowledge and fostering innovation in fields like machine learning and AI.

ChatGPT Poised to Challenge Google Assistant on Android Devices

OpenAI’s ChatGPT could potentially replace Google Assistant as the default voice assistant on Android devices. This evolution signifies a remarkable shift in the digital assistant landscape, previously dominated by Google Assistant for over half a decade. The integration of ChatGPT as a system-wide voice assistant on Android is backed by several intriguing factors:

Code Analysis and App Updates

A key piece of evidence pointing towards this potential shift is derived from an APK teardown. This method, which involves analyzing the code of an Android application, has revealed a new activity within the app named “com.openai.voice.assistant.AssistantActivity.” Although this feature is currently disabled by default, its existence hints at OpenAI’s plans to make ChatGPT a more integral part of the Android experience.
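In practice, an APK teardown boils down to decompiling the app and searching its manifest and code for declared components. The manifest snippet below is a mock; only the AssistantActivity name comes from the report:

```python
import re

# Mock fragment of a decompiled AndroidManifest.xml (illustrative only).
MANIFEST = """
<application>
    <activity android:name="com.openai.chatgpt.MainActivity"/>
    <activity android:name="com.openai.voice.assistant.AssistantActivity"
              android:enabled="false"/>
</application>
"""

# Extract every declared activity name from the manifest text.
activities = re.findall(r'<activity android:name="([^"]+)"', MANIFEST)
print(activities)
```

Note the `android:enabled="false"` attribute, which is how a component can ship in an app while remaining switched off by default.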

Platform Flexibility

Android, known for its flexibility and support for third-party applications, has historically allowed the use of alternative apps for various functions, including voice assistance. This openness of the Android platform has set the stage for potential replacements for Google Assistant, with ChatGPT emerging as a strong contender.

User Demand and Evolving Technologies

The move towards integrating ChatGPT as a default voice assistant on Android devices is not just a technological advancement but also aligns with evolving user preferences and demands. As AI and machine learning technologies continue to advance, users seek more sophisticated, responsive, and intelligent voice assistants. ChatGPT, with its advanced conversational capabilities and continuous improvements, is well-positioned to meet these demands.

This potential shift from Google Assistant to ChatGPT on Android devices represents a significant milestone in the evolution of digital assistants. It underscores the rapid advancements in AI technology and the growing influence of large language models like ChatGPT in everyday tech applications. As this development unfolds, it will be interesting to observe how it shapes user experiences and the competitive landscape of digital voice assistants.
