1. Tether has effectively become the central bank of crypto. Like central banks, it ensures liquidity in the market and even engages in quantitative easing — the practice of central banks buying up financial assets in order to stimulate the economy and stabilize financial markets. The difference is that central banks, at least in theory, operate in the public good and try to maintain healthy levels of inflation that encourage capital investment. By comparison, private companies issuing stablecoins are indiscriminately inflating cryptocurrency prices so that they can be dumped on unsuspecting investors. This renders cryptocurrency not merely a bad investment or speculative bubble but something more akin to a decentralized Ponzi scheme. New investors are being lured in under the pretense that speculation is driving prices when market manipulation is doing the heavy lifting. This can’t go on forever. Unbacked stablecoins can be, and are being, used to inflate the “spot price” — the latest trading price — of cryptocurrencies to levels totally disconnected from reality. But the electricity costs of running and securing blockchains are very real. If cryptocurrency markets cannot keep luring in enough new money to cover the growing costs of mining, the scheme will become unworkable and financially insolvent. (Source: jacobinmag.com)
2. Facebook’s parent company, Meta, is building the world’s most powerful AI-specific supercomputer to develop better speech-recognition tools, automatically translate between languages and help build its 3D virtual metaverse. Although far from complete, the AI Research SuperCluster (RSC) is up and running and has already overtaken Meta’s previous fastest supercomputer. That machine was designed in 2017 and ran on 22,000 powerful graphics processing units (GPUs) which, despite being designed for playing games, are highly effective tools for training artificial intelligence models. RSC currently has only 6080 GPUs, but they are more powerful than those in the older machine, and it is already three times faster at training large AI models than its predecessor. Its current performance is on a par with the Perlmutter supercomputer at the National Energy Research Scientific Computing Center in California, which is currently placed at number five in the TOP500 global supercomputer rankings. When RSC is complete, it will consist of 16,000 GPUs and be almost three times more powerful than it is now. (Source: newscientist.com)