Web 3.0 in 2024
- We answer the big question: what is Web 3?
- AI and ML: the last gasp of big tech companies.
- ‘Decentralized’ is 2024’s big word.
This year’s technology hype train has very much been focused on large language models (which most people are inaccurately terming artificial intelligence), a train that has gained yet more momentum these last few weeks as ChatGPT turns one year old. But as the dust gathers, ready to settle on 2023, it’s perhaps time to revisit an older, still ubiquitous term – Web 3 – which has now been around long enough to have acquired a concrete, describable meaning.
While machine learning will continue to have a massive impact on life as it develops, large language models and the associated technologies at work in video, audio and images are still very much proceeding along Web 2 lines. To understand why, we need to look at the history of the internet – in much-condensed form – and use it to project how the technology may develop.
History of the web
To understand what Web 3 is, we have to go backward before we can go forward – think of it as a renaissance moment. In the beginning, Web 1, or the “worldwide web,” was a way that individuals and organizations could disseminate and share information. That ethos was down at least in part to large academic institutions being among the first users of and publishers to the Web. The technology at the presentation layer of information was relatively basic: web servers and browsers could easily display text and some images, but little else by way of rich content. The ability to watch a video online was, in the early 1990s, something of a rarity.
Over time, both server- and client-side technologies evolved: financial exchanges could take place online, video could be streamed, animated page elements became commonplace, and the web became a much livelier place in which to spend serious amounts of time, rather than somewhere to simply dip in and out of.
It’s worth also remembering that the ubiquitous web was made significantly more accessible once it was divorced from phone lines and the associated bills.
Ahh, the good old days. They were dreadful.
The transition to Web 2 was gradual but, generally speaking, technology became available that allowed just about anyone (not just people who spoke HTML, CSS and JavaScript) to become a content creator. Adobe Flash allowed the creation of full, on-page applications; YouTube – acquired early on by Google – began to gain traction; and many brands began to wake up to a new phenomenon: social media.
People could and did begin to express themselves online quickly and easily, whether by posting a portfolio of original images and selling them via e-commerce, reducing their thoughts to pithy 140-character “Tweets,” or making videos about anything that interested them. Solutions to the problem of monetizing online content began to emerge: advertising, data brokering, and targeted messages sent down multiple channels.
Web 2 was, and still is, epitomized by the centralization of resources and power into the hands of a few large companies. Facebook, Google, Twitter, Apple, Tencent, Alibaba and other household names made self-expression easy, and thereby controlled the means of production and the materials produced (by dint of the sweeping licenses users grant over anything uploaded to their services). They also discovered that the lifeblood of all online activity was data, and began to use and monetize this resource that they saw channeled through their servers.
The explosion of mobile phone apps for the iPhone and Android devices allowed anyone or anything to reach people via their back pockets – devices that were and are tiny yet powerful computers that connect people permanently to the internet, objects still called phones, yet that are often rarely used to make a phone call. Because eww – people pressure!
So, again: what is Web 3?
The mainstay of what people call “Web 3” is best described in one word: decentralized. Decentralized technology means users no longer have to rely on the big names in technology to achieve their ends. Networks like Mastodon represent the first well-known systems that are deliberately decentralized. Mastodon runs on a protocol called ActivityPub (which can also carry many other types of data, such as voice and video), and in Mastodon terms there is no single organization that acts as a centralized hub. Mastodon servers, or ‘instances,’ can be set up by anyone. Each instance is free to gather its own users, and can interact with any other instance to whatever degree its owner and users permit. It’s altogether less eBay, more Etsy.
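To make that federated model concrete, here’s a minimal sketch – assuming the third-party `requests` package and a purely hypothetical handle – of the two-step dance any client can perform against any instance: a WebFinger lookup to find where a user’s profile lives, then a fetch of their ActivityPub actor document. No central hub is consulted at any point.

```python
# Minimal fediverse lookup: resolve user@instance to an ActivityPub actor.
# Assumes `requests` is installed; the handle below is hypothetical.
import requests

def fetch_actor(handle: str) -> dict:
    """Resolve a fediverse handle (user@instance) to its ActivityPub actor."""
    user, domain = handle.lstrip("@").split("@")

    # Step 1: WebFinger -- ask the user's own instance where their profile lives.
    wf = requests.get(
        f"https://{domain}/.well-known/webfinger",
        params={"resource": f"acct:{user}@{domain}"},
        timeout=10,
    )
    wf.raise_for_status()

    # Step 2: pick the ActivityPub "self" link from the WebFinger response.
    actor_url = next(
        link["href"]
        for link in wf.json()["links"]
        if link.get("rel") == "self"
        and link.get("type") == "application/activity+json"
    )

    # Step 3: fetch the actor document itself, asking for ActivityPub JSON.
    actor = requests.get(
        actor_url, headers={"Accept": "application/activity+json"}, timeout=10
    )
    actor.raise_for_status()
    return actor.json()

if __name__ == "__main__":
    profile = fetch_actor("someone@example.social")  # hypothetical handle
    print(profile["inbox"], profile["outbox"])
```

Notice what’s absent: no API key, no registration with a gatekeeper. Any instance that speaks the protocol can be addressed the same way.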
Perhaps a better-known example of decentralized technology is cryptocurrency. The tech on which cryptocurrency is built is blockchain: a cryptographically secured, append-only ledger that is replicated across as many parties as need it.
Cryptocurrency is a decentralized system in that it does not require central authorities, such as registered banks, to be conduits for exchange. Cryptocurrency can be used person-to-person, or person-to-business without any oversight: small wonder that governments and tax authorities around the world tend to consider cryptocurrency a bad idea. Though arguably, cryptocurrency has rather played into that stereotype over the last two years.
But blockchain – or, more accurately, distributed ledger technology (DLT) – means that systems of record, or ledgers, need not be stored by any single party. In fact, the fewer parties involved in holding a copy of a blockchain, the less secure it’s deemed to be. Decentralization of information, and its widest possible distribution, are key to its security and veracity.
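The mechanics behind that claim are surprisingly simple. Below is a toy hash chain in Python – a teaching sketch, not a real blockchain, with no consensus mechanism or signatures – showing how each block commits to its predecessor, so that any party holding a copy can independently detect tampering.

```python
# Toy hash chain: each block commits to the previous one, so altering any
# record breaks every later link. Illustrative only -- no consensus, no keys.
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    # Hash a canonical JSON encoding of the block's contents.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain: list, data: str) -> None:
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "time": time.time(), "data": data, "prev": prev}
    block["hash"] = block_hash(block)
    chain.append(block)

def verify(chain: list) -> bool:
    # Any replica can re-check the entire ledger for itself.
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["hash"] != block_hash(body):
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

ledger: list = []
append_block(ledger, "alice pays bob 5")
append_block(ledger, "bob pays carol 2")
print(verify(ledger))             # True
ledger[0]["data"] = "alice pays bob 500"
print(verify(ledger))             # False -- tampering is immediately visible
```

Because every replica can run `verify` for itself, trust comes from wide distribution rather than from any single custodian – which is exactly why a sparsely held chain is a weaker one.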
Web 3 and the end of big tech
The furore around machine learning has shone a light on the fact that large language models, for instance, are currently very much centralized services, controlled by Microsoft (Copilot, and its stake in OpenAI, maker of ChatGPT), Google, and other venture capital-backed companies. The resources required to build learning models at present mean that only very rich institutions can offer them. But that situation is changing, with faster, less resource-hungry models that could be – and are being – deployed by smaller outfits.
While there is no true AI-in-your-pocket available yet, it’s only a matter of time until running one’s own models can be achieved easily and quickly. The collectivist nature of software developers will undoubtedly mean that the best models, with the most practical use cases, will be available to all as costs fall and time advances.
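In fact, a small version of that future is already runnable on an ordinary laptop. The sketch below assumes the Hugging Face `transformers` package (with PyTorch installed) and uses `distilgpt2`, a small, openly licensed model chosen purely for illustration – no centralized API sits anywhere in the loop.

```python
# Running a language model entirely locally -- no centralized service involved.
# Assumes `transformers` and PyTorch are installed; `distilgpt2` is a small,
# openly available model used here only as an example.
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")
result = generator(
    "The decentralized web will",
    max_new_tokens=40,
    num_return_sequences=1,
)
print(result[0]["generated_text"])
```

The output of a model this size is toy-grade, of course, but the workflow – download weights once, run them on your own hardware – is the same one that larger open models follow.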
Web 3 and identity online
Web 3 represents a massive change for the biggest names in tech. Where companies like Google and Apple currently control and use data flows from source to endpoint, distributed, decentralized computing will soon change the paradigm. The individual user becomes the arbiter of their own information, giving away only what’s necessary to achieve what’s needed. Decentralized identity records, for example, will mean that a company or organization can confirm beyond doubt that a potential customer is who they say they are, without necessarily knowing their age, where they live, or – in some cases – even their name. Only the necessary information will be forthcoming, drawn from a decentralized, distributed repository that can vouch for a fact without divulging a person’s or company’s habits, history or details.
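As an illustration of that selective-disclosure idea – and only an illustration; real decentralized identity schemes such as W3C Verifiable Credentials layer signatures, issuers and revocation on top of it – the sketch below has an issuer commit to salted hashes of a full identity record, while the holder reveals just one fact to a verifier.

```python
# Selective disclosure, sketched: the issuer publishes commitments (salted
# hashes) to every attribute; the holder reveals one attribute plus its salt;
# the verifier checks the hash without ever seeing the rest of the record.
import hashlib
import os

def commit(attrs: dict) -> tuple[dict, dict]:
    """Issuer side: return per-attribute salts and the public commitments."""
    salts = {k: os.urandom(16).hex() for k in attrs}
    commitments = {
        k: hashlib.sha256(f"{k}={v}|{salts[k]}".encode()).hexdigest()
        for k, v in attrs.items()
    }
    return salts, commitments

# The issuer attests to a full identity record (hypothetical data)...
attrs = {"name": "A. Person", "age_over_18": "true", "city": "Lagos"}
salts, commitments = commit(attrs)  # `commitments` is what gets published/signed

# ...but the holder discloses only one fact to a verifier:
key, value, salt = "age_over_18", attrs["age_over_18"], salts["age_over_18"]

# Verifier: recompute the hash and match it against the issuer's commitment.
check = hashlib.sha256(f"{key}={value}|{salt}".encode()).hexdigest()
print(check == commitments[key])    # True -- fact proven, everything else private
```

The verifier learns that the claim is genuine, and nothing more – the name and city stay hidden behind their salted hashes.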
Conclusion
As 2023 draws to a close and ChatGPT moves into its second year of being, technology has suddenly presented itself as a game-changer in the form of what people term AI. But the ready availability of machine learning depends on the last element of Web 2 that the big, household names in technology need to retain: centralized authority. Once the privacy, security and reliability of decentralized technologies become apparent, most users will ask themselves why the power of AI is concentrated in the hands of so few.
As the cost and resource overhead of machine learning falls, it becomes an everyday possibility. There is nothing to stop LLMs and their descendants from living on a decentralized, interconnected web of computers that no single organization can control.