Joe Green, Author at TechHQ
https://techhq.com/author/joegreen/

Tazama offers Know Your Customer options for all
https://techhq.com/2024/03/open-source-kyc-payment-verification-aml/ (Wed, 06 Mar 2024)

  • Know Your Customer is a mandatory part of online transactions.
  • Prohibitive costs form barrier to entry.
  • Linux Foundation backs open source alternative.

While taking payments online offers both parties in a transaction massive convenience, the threat of online fraud is ever-present. The Global Anti-Scam Alliance reports that close to $1 trillion was lost to online fraud in 2023, a loss that drives up secondary business costs: insurance premiums, payment gateway fees, and a host of other quiet additions to the everyday bills that land each month on the desks of CFOs worldwide.

Integral to digital payment processes are the myriad routines that run background checks on every transaction: identity lookup, heuristic pattern recognition for anomalous behavior, and payment detail verification.
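
To make the heuristic side of that concrete, here is a minimal sketch of the kind of anomaly check such routines run. It is a toy illustration only, not how Tazama or any production system works, and the threshold and transaction history are invented for the example.

    from statistics import mean, stdev

    def flag_anomalous(history, new_amount, z_threshold=3.0):
        """Flag a transaction that deviates sharply from a
        customer's recent spending pattern (toy z-score heuristic)."""
        if len(history) < 5:
            return False  # too little history to judge against
        mu, sigma = mean(history), stdev(history)
        if sigma == 0:
            return new_amount != mu
        return abs(new_amount - mu) / sigma > z_threshold

    spending = [42.0, 38.5, 55.0, 47.25, 40.0]
    print(flag_anomalous(spending, 46.0))    # False: consistent with history
    print(flag_anomalous(spending, 2500.0))  # True: flag for review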

“World Trade Center, Bahrain” by Ahmed Rabea is licensed under CC BY-SA 2.0.

These often furiously complex algorithms run quietly in the background, providing services like KYC (know your customer) and AML (anti-money laundering). They’re provided by reputable payment gateways and identity verification systems as a matter of course. Naturally, they come at a cost, one that’s pretty much mandatory when running a lawful business and one that’s usually sold at a price dictated by providers – as such, it’s rarely cheap.

However, that situation seems set to change in the near future, as the Linux Foundation Charities (with support from the Bill & Melinda Gates Foundation) has launched Tazama, an open source alternative to proprietary anti-fraud measures whose cost is often prohibitive, especially for organizations in the developing world. According to a press release from Linux Foundation Charities (LF Charities), it includes capabilities for fraud detection, AML compliance, and monitoring of online financial transactions. That means it should be able to provide as much know your customer data as traditional closed systems.

The service will be hosted by LF Charities (although its open source nature will enable independent hosting) and so act as a showcase for the efficacy of open source as a secure, independent, low-cost replacement for closed and costly systems.

Know your customer tools could be about to go open source. “Cr48: Disabling boot verification” by jamalfanaian is licensed under CC BY 2.0.

Jim Zemlin, executive director of the Linux Foundation, said, “We are excited to see an open source solution that not only enhances financial security but also provides a platform for our community to actively contribute to a project with broad societal impacts.”

“The launch of Tazama signifies another stride towards securing and democratizing digital financial services,” said Kosta Peric, Deputy Director, Payment Systems at the Bill & Melinda Gates Foundation.

Greg McCormick, the Executive Director of Tazama, claims the platform has achieved 2,300 full payment transactions per second (TPS), which supports the type of throughput considered vital for a smooth and reassuring customer experience. Delays, glitches, and timeouts are anathema to payment processes (in B2C transactions, especially), as they suggest an unstable platform and worry users that they might be subject to fraud.

Several organizations are already working with Tazama to assess the platform’s effectiveness, including the African organizations BCEAO and BankservAfrica, IPSL in the UK, and Jordan’s JoPACC. While emerging markets may be interested because of the lower cost of entry to a reliable payment platform, the overriding benefit of the open source Tazama will be the many thousands of eyes on the code, able to attest to the integrity of the system and improve it overall.

The reputation of proprietary software in security-sensitive areas makes the case for Tazama. The experiences of Okta, SolarWinds, LastPass, and a half-dozen other companies suggest that in the realm of highly sensitive data, a limited pool of developers and a tendency to place shareholder dividends before product quality tend to create less secure software.

Spotify, Epic decry Apple terms under EU compliance
https://techhq.com/2024/03/open-letter-to-apple-from-spotify-and-epic-on-terms-and-conditions/ (Tue, 05 Mar 2024)

  • Spotify among companies complaining about Apple EU developer terms & conditions.
  • Anti-competitive practices make sideloading more expensive.
  • Software companies likely to keep working under existing Apple terms & conditions.

With iOS 17.4 due to be released in the coming week, 30 companies have penned an open letter to the European Commission, media groups, and lobby organizations, stating their concerns about Apple’s terms and conditions, which they claim will still leave the company in contravention of the EU’s Digital Markets Act.

To comply with the DMA, Apple is now allowing third-party app stores and the sideloading of applications downloaded independently. Developers will be given a choice between signing up to Apple’s new terms or sticking with the existing T&Cs, which the group claims is a “false choice.” The new terms, the signatories claim, will “hamper fair competition with potential alternative payment providers.”

“Rotten apples” by fotologic is licensed under CC BY 2.0.

To aid developers in their choice, Apple provides a handy calculator to guide them through the myriad available options. Developers in the EU select whether they qualify for the App Store Small Business Program, the App Store fees they would pay, and the value of in-app purchases they predict users will make – under both new and old terms.

What will surprise absolutely no one is that developers will end up paying more money to Apple if they choose to allow their apps to be sideloaded than they currently pay under existing terms. They will also have the cost of running an app store, a customer support function, and a payment processor. For developers, keeping business as usual under Apple’s existing terms results in greater revenue. The only way to preserve income under Apple’s new terms with apps served from a third-party store is to raise the price that consumers pay.
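
The arithmetic is easy to sketch. Below is a hypothetical comparison in which the 30%/17% commission rates and the EUR 0.50 Core Technology Fee on annual installs beyond one million follow public reporting of Apple’s EU terms; the app’s own figures are invented for the example.

    def old_terms_net(revenue, small_business=False):
        """Developer's net under Apple's existing terms:
        30% commission, or 15% for the Small Business Program."""
        return revenue * (0.85 if small_business else 0.70)

    def new_terms_net(revenue, annual_installs, small_business=False):
        """Net under the reported new EU terms: 17% (or 10%) commission
        plus a EUR 0.50 Core Technology Fee per install above 1M a year."""
        commission = 0.10 if small_business else 0.17
        core_tech_fee = max(0, annual_installs - 1_000_000) * 0.50
        return revenue * (1 - commission) - core_tech_fee

    # A hypothetical app: EUR 2M of in-app revenue, 5M installs a year.
    print(old_terms_net(2_000_000))             # 1,400,000.0
    print(new_terms_net(2_000_000, 5_000_000))  # 1,660,000 - 2,000,000 = -340,000.0

For a widely installed app with modest per-user revenue, the per-install fee can swallow the commission savings entirely, which is the arithmetic behind the complaints that follow.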

This puts some of the more hyperbolic language of the open letter to the European Commission into context. It claims that “Apple is rendering the DMA’s goals of offering more choice and more control to consumers useless.” Consumers will rarely have a choice to sideload an app or download it from a third-party store, because no application developer will opt to make less money.

The letter states:

“New app stores are critical to driving competition and choice both for app developers and consumers. Sideloading will give app developers a real choice between the Apple App Store or their own distribution channel and technology. Apple’s new terms do not allow for sideloading and make the installation and use of new app stores difficult, risky and financially unattractive for developers. Rather than creating healthy competition and new choices, Apple’s new terms will erect new barriers and reinforce Apple’s stronghold over the iPhone ecosystem.”

Apple’s new terms do “allow for sideloading” – in this, the letter is incorrect – but its terms are deliberately anti-competitive. The company is indeed “[making] a mockery of the DMA and the considerable efforts by the European Commission and EU institutions to make digital markets competitive.”

Something rotten in the state of Apple? Suuuurely not? “rotten apple” by johnwayne2006 is licensed under CC BY-NC-SA 2.0.

It would be naive to believe that the signatories of the letter are beating a drum for consumers’ right to choose where they source their apps. The motives of Epic Games, Spotify, Uptodown, et al. are as mercenary and cynical as Apple’s. They expected to make more money thanks to the DMA’s imposition but have been thwarted, at least for now. The ‘Apple Tax’ paid by companies with apps on the App Store is a thorn in the side of shareholders of companies dependent on the App Store.

For the next few years, European taxpayers will fund the inevitable legal battle the European Commission will wage on behalf of the likes of Spotify (2023 Q4 revenue of €3.7 billion, with €68 million in adjusted operating profits) and Epic Games (valued at $31.5 billion in 2023), so that justice can be granted to these stalwart defenders of consumer choice.

Under the Digital Markets Act, violators may be fined up to 10% of worldwide turnover, which for Apple would amount to approximately $38 billion plus change. It likely won’t come to that, but as ever, Cupertino can afford its lawyers’ salaries for a few years until it finds ways to recoup the costs of operating in a competitive market – at least in the EU. Developers and consumers in the US, UK, and elsewhere can look forward to business as usual.

Track available on both iTunes and Spotify…

Inkitt: what happens when AI eats its own words?
https://techhq.com/2024/03/ai-will-help-writers-create-literally-average-stories/ (Mon, 04 Mar 2024)

  • Inkitt AI help for writers shows successful patterns.
  • Success delivered by what are proven to be winning formulae.
  • We look forward to Fast & Furious 52’s release in 2066.

The latest $37m funding round for the self-publishing platform Inkitt was awarded at least in part due to its intention to use large language models that work on behalf of its authors. The AI will guide submissions to the eponymous app in areas such as readability, plot, and characterization.

Self-publishing is hugely popular among authors. It circumvents the often-frustrating process of finding an agent, the rejections from established publishing houses, and the erosion of income by the parties in the chain who each take a cut of sales revenue. An AI-powered virtual assistant can help authors with advice and offer changes to a text, drawn from previously successful stories.

Inkitt’s AI amalgamates the output from several large language models to find trends in the enormous body of previously published books, giving writers help to align their work with already successful and popular works. At first sight, its approach is clearly more appropriate than having ‘authors’ simply use an AI to create words for a book. It’s also a step above once-respected news outlets using AI to write stories. But a deeper understanding of how large language models work informs us that the boundaries of creativity possible with AI are claustrophobic.

“Cuba book” by @Doug88888 is licensed under CC BY-NC-SA 2.0.

Whether in video, visual art, game design, or text, machine learning models are trained on existing publications. During the training phase, they process large quantities of data and learn patterns that can then be used to reproduce material similar to that in the body of training data.

In the case of a novel or screenplay’s structure, then, what’s succeeded in the past (in terms of popularity and, often, revenue generated) can be teased out from the also-rans. It’s a process that is as old as creativity itself, albeit a habit that’s formed without digital algorithmic help. Hollywood industry experts can produce lists of formulae that work for the plot, the rhythm of narrative flow, characterization, and so on. Such lists, whether ephemeral or real, inform the commissioning and acceptance of new works that will have the best chance to succeed.

The threat to creativity from the models used in ways like that proposed by Inkitt is twofold. The most obvious is one of the repetition of successful formulae. This means, depending on your choice of language, works that are on-trend, derivative, zeitgeisty, or repetitious.

The second threat comes from the probability curves embedded in the AI code. Whatever deviates from the average in any creative work chewed up by an algorithm will tend to be smoothed away. What can’t be judged particularly easily is what makes something an exception: whether it differs from the average because it’s badly created or because it’s superbly created. Truly fantastic creative works may be given lesser weight because they don’t conform to any number of other factors, like sentence length or a color palette that is (currently) familiar.
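
A toy model makes that averaging effect visible. The sketch below, which is illustrative only and has nothing to do with Inkitt’s actual systems, trains a bigram model on a tiny corpus and generates greedily; the most common continuation always wins, so the rarer ‘luminous’ phrasing can never surface.

    from collections import Counter, defaultdict

    corpus = ("the night was dark . "
              "the night was dark . "
              "the night was luminous .").split()

    # Count, for each word, how often each next word follows it.
    next_words = defaultdict(Counter)
    for a, b in zip(corpus, corpus[1:]):
        next_words[a][b] += 1

    # Greedy (argmax) generation always picks the majority continuation.
    word, out = "the", ["the"]
    for _ in range(4):
        word = next_words[word].most_common(1)[0][0]
        out.append(word)
    print(" ".join(out))  # "the night was dark ." - "luminous" never appears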

The effect is one of standardization and averaging across the gamut of creative output so that a product is successfully conformist to the mean. Standardization equals conforming, which equals success. But standardization leads inexorably to stagnation.

In practical uses of AI today, many of the traits and methods of models are perfect for their designed purpose. Data analytics of spending patterns informs vendors’ choices for new product development based on what sells well. Outliers and exceptions have little importance and are rightly ignored by the model’s probability curve.

But in areas of creating new art, product design, music composition, or text creation, the exceptions can have value, a value that is increased by not conforming to average patterns of success, readability, aesthetic attractiveness, characterization, or one of a thousand other variables at play. If conformity to guidelines means success, then how we define success is the interesting question. History is littered with composers, artists, and writers who didn’t conform and were successful during their lifetimes or posthumously. Plenty, too, who were successful conformists. And many who kicked against prevailing strictures and got nowhere, dying in poverty.

Will AI be able to give help to writers? “book” by VV Nincic is licensed under CC BY 2.0.

So what help can AI actually deliver for writers? As in many areas of life and business, it can work well as a tool, but it cannot – or at least should not – be allowed to dictate the creative elements of art.

By reducing creativity to an algorithmically generated idea of “what works,” talent that’s non-conforming is immediately stymied. Much depends, of course, on the creator’s desired outcome, and on how they deem themselves successful. If they want a greater chance of achieving mainstream popularity, then the Inkitt AI will help guide them in what to change to better fit the milieu. Many dream of being the scriptwriter or 3D visual designer for the next movie blockbuster, and there is value in that. Inkitt may make people better writers, but it’s the individual’s idea of what a ‘better’ writer is that will inform their decision whether or not to sign up.

Individual human voices can make great creative works. But by placing those works inside a mass of mediocrity (and worse) and teaching an algorithm to imitate the mean, what’s produced is only ever, at best, going to be slightly better than average. As more content is created by AI and it too becomes part of the learning corpora of machine learning algorithms, AIs will become self-replicating, but not in the manner of dystopian sci-fi. Much of the future’s published content will just be very, very dull.

Oatmeal for breakfast, lunch, and dinner.

Amalgamating for the sellable mean turns tears of human creativity into nothing more than raindrops in the flood.

Italy’s Piracy Shield proves the internet works
https://techhq.com/2024/02/does-italys-piracy-shield-work/ (Thu, 29 Feb 2024)

  • Italy’s Piracy Shield breaks multiple sites.
  • CDNs’ clients hit by association.
  • Even limited censorship breaks parts of the internet.

Political parties are fond of making big promises, especially when in opposition, and few such claims are more specious than the promise to ‘clean up the internet’ to protect the citizenry from the scourges of pornography, piracy, and terrorism.

Political statements on the matter usually contain the word ‘children,’ in the context of child abuse or of protecting minors from the evils that lurk just a couple of mouse clicks away. While the aims are entirely laudable, such statements ignore, or are unaware of, the fact that the internet cannot easily be policed, either at national boundaries or by filtering content effectively. The digital domain was never designed to allow total oversight, and attempts to impose the required strictures after the fact will always be hugely imperfect. Circumvention is in the internet’s DNA.

“Via Tasso, Sorrento – Italy football shirts” by ell brown is licensed under CC BY 2.0.

That’s never stopped governments trying, of course, with the latest attempt from the Italian government coming in the form of its Piracy Shield. This was designed to address just a small area of lawlessness: the highly popular activity of watching live sports streams without paying the official providers of such services.

Given such a tight remit, it may have been imagined to be a relatively trivial undertaking. Unfortunately, that’s proved not to be the case.

Sports fans discovered firsthand this past weekend just how complex a specifically targeted act of traffic blocking can be.

An IP address belonging to the CDN Cloudflare found itself on the wrong side of Italy’s Piracy Shield, which prevented innocent traffic from reaching the ODW Prison Volunteers Association and Elimobile, a telecoms company, among others.

“Stade de France – Italy-France football game” by Eric-P is licensed under CC BY-NC-ND 2.0.

Part of the issue is the complexity of the modern internet, where content distribution networks deliver large portions of online content. They offer this service because they’re better than smaller hosts at ensuring streams of data are delivered safely. Independent servers are more likely to suffer interruptions caused by bad actors, and the fast, voluminous caching capabilities of CDNs make their use logical in many instances, large-scale video streaming primary among them.

But because large CDNs aggregate data from multiple sources, the nefarious actions of just one of those sources can cause all of its clients to be tarred with the same brush. Bad actors are as wont to use CDNs as lawful parties, and traffic delivery assignment algorithms can’t differentiate between them. Additionally, it’s easy to mistake genuine traffic for bad traffic. In short, at a low level, things are very, very complicated, in ways not easily explained to those who draft laws.
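
The mechanics are easy to demonstrate. Here is a minimal sketch; the hostnames are placeholders, but in practice many unrelated sites behind one CDN resolve to the same shared edge addresses, so an IP-level block aimed at one knocks out the others as collateral damage.

    import socket

    def shared_ips(host_a, host_b):
        """Return the IP addresses two hostnames resolve to in common."""
        ips_a = {info[4][0] for info in socket.getaddrinfo(host_a, 443)}
        ips_b = {info[4][0] for info in socket.getaddrinfo(host_b, 443)}
        return ips_a & ips_b

    # Hypothetical hosts: a pirate stream and a blameless charity site
    # that happen to share CDN infrastructure.
    print(shared_ips("pirate-stream.example", "volunteer-charity.example"))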

The Italian experience should be a salutary lesson for lawmakers the world over. Even with a tightly constrained remit, the fallout from attempts to control the digital arena is unpredictable. As a rule of thumb, preventing dubious data movements is borderline impossible to achieve with any accuracy. The public has to be made aware of this fact, so that when the next clarion call goes out for legislation to ‘protect the children,’ the populace recognizes there may be secondary motives – or utter ignorance – at play. Both possibilities are equally alarming, and it’s naive to believe that people in government are any smarter than most.

Bitcoin inventor was aware of currency’s power demands
https://techhq.com/2024/02/what-is-the-cost-of-bitcoin-in-environmental-terms/ (Tue, 27 Feb 2024)

  • Bitcoin designed to replace traditional finance and gold mining.
  • Court documents say Nakamoto aware of energy consumption issue.
  • Time to divest for the benefit of future generations.

A court case currently underway in London, UK, has brought wider attention to several emails purportedly written by the inventor of Bitcoin, Satoshi Nakamoto, in which the author considered the energy use of the Bitcoin network.

The legal case centers on Craig Wright’s claims that he is Nakamoto. The real identity of the inventor of the cryptocurrency is not known for certain, and Wright’s claims, if validated, will mean that he has a significant say in the future development of Bitcoin projects.

Published in Wired last week, Satoshi’s emails contained several comments about the network’s energy consumption.

“If [Bitcoin] did grow to consume significant energy, I think it would still be less wasteful than the labor and resources of intensive conventional banking activity it would replace,” Satoshi said in a message to Martti Malmi, one of the early developers of the technology.

Bitcoin’s energy use

The actual levels of power consumption by Bitcoin are uncertain: miners operate in a highly competitive market and so are not inclined to be particularly transparent as to the details of their operations.

Energy consumption comes largely from two activities in the Bitcoin network: throwing massive computational power at the mining process to ‘solve for coins,’ and the processing required to handle individual transactions when cryptocurrency changes hands.
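
The mining half of that equation is a brute-force search for a hash value below a difficulty target, which is why its energy draw scales with competition. A minimal proof-of-work sketch follows; it is illustrative only, as Bitcoin hashes block headers with double SHA-256 at vastly higher difficulty.

    import hashlib
    from itertools import count

    def mine(block_data: str, difficulty: int) -> int:
        """Find a nonce whose SHA-256 digest of (data + nonce)
        begins with `difficulty` zero hex digits."""
        target = "0" * difficulty
        for nonce in count():
            digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
            if digest.startswith(target):
                return nonce

    # Each extra leading zero multiplies the expected work by 16.
    print(mine("block-1", 4))  # returns quickly
    print(mine("block-1", 5))  # roughly 16x more hashing on average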

A widely accepted metric for the Bitcoin network’s energy consumption is the Cambridge Bitcoin Electricity Consumption Index (CBECI), published by the University of Cambridge’s Judge Business School. The School revised its model in August of last year to take account of the changes in the underpinning technologies and hardware at the heart of the Bitcoin network since 2019. The update is, in part, a “response to evidence indicating a periodic overestimation of electricity consumption.”

“Pollution” by sheilaz413 is licensed under CC BY-NC-ND 2.0.

The figure representing the total energy consumption of the Bitcoin network was revised down by 9.8TWh (terawatt-hours) for 2022, to 95.5TWh. That places the global system’s consumption alongside nation-states like Belgium and the Netherlands. The paper detailing the Index’s revision also notes that, overall, the efficiency of Bitcoin mining has increased as hardware advances and refines (albeit now at a slower rate than in the currency’s heyday).

Bitcoin’s environmental impact

The environmental impact of Bitcoin operations is even more complex to estimate than their total energy consumption. Renewable energy is said to power a sizeable proportion of mining operations, with estimates varying [paywall] from around 40% to 75% of the total power consumption. Bitcoin mining operations tend to congregate where energy is plentiful from renewable sources, such as hydroelectric power. In these locations, like certain areas of the US, China, and Scandinavia, such hydroelectric power tends to be cheaper than fossil-derived alternatives.

But environmental damage is also said to come from e-waste: discarded mining rigs superseded by faster, more efficient hardware in generational upgrades. Processing a single Bitcoin transaction is said to produce over 700 pounds of carbon, and there are additional emissions from data center cooling systems and water usage, to name just a couple of other factors.

The two human activities that Bitcoin’s creator thought might be replaced by Bitcoin, conventional banking and gold mining, still create significantly more negative environmental effects than the entire Bitcoin apparatus, with conventional finance systems alone estimated to produce double the carbon emissions of Bitcoin.

But the slow rate at which Bitcoin can process transactions effectively makes the currency unviable as an everyday means of exchange (other networks, such as Ethereum, are capable of the type of scale required and do not use the power-intensive proof-of-work model to mine new coins).

The fact that Bitcoin exists in addition to the activities it was supposed to replace raises the question of its viability. Clearly, the technology cannot be uninvented, and attempts by governments to limit its use have been mostly unsuccessful, with most adopting an accept-and-tax approach to cryptocurrencies. Freedom from governmental control was, it has to be said, part of Bitcoin’s design remit.

But like renewable energy, which exists as a supplement to fossil-derived power, not as a replacement, Bitcoin and its ecological effects exist in addition to all the consequences of fiat finance.

Grist to the extinction mill

The Bitcoin network’s activities are said to consume the equivalent of around 2%-3% of the US’s annual power usage. Lowering power consumption worldwide, year-on-year, is the most important way to downgrade the status of environmental deterioration from an extinction event to merely a chance of survival for the generation that will live at the end of this century. (NB experiencing survival will still be deeply unpleasant.)

Given that Bitcoin’s purpose at present is just a different flavor of market speculation, and it will not replace conventional finance or gold mining, now might be the time to consider its net utility.

“Factory – Pollution” by plagal is licensed under CC BY-SA 2.0.

The existential crisis of the Lenovo X1 Fold
https://techhq.com/2024/02/what-are-the-advantages-of-a-folding-screen-laptop/ (Thu, 22 Feb 2024)

  • The latest folding screen laptop from Lenovo.
  • $4000+ asking price. We ask: Why?
  • Performant, pretty, and pretty pointless.

It can’t have escaped anyone’s notice that folding screens are de rigueur right now. The Samsung Galaxy Z Fold 5 and Motorola Razr Plus mobile phones are on heavy advertising rotation, and there are a few laptops, tablets, and games consoles that also sport folding screens.

Perhaps calling such hardware laptops and tablets in the purest sense is something of a misnomer. Ask anyone to describe a laptop, and they’ll describe a flat keyboard hinged to a display. There are variations on the theme, of course, such as displays that detach and transform into tablets, like the Lenovo Yoga 9i Gen 8 and Microsoft’s Surface Pro 9, or standalone presentation screens.

The addition of a folding screen to the laptop form factor is the latest iteration on the laptop-cum-tablet theme: the Lenovo ThinkPad X1 Fold (2nd generation) comes equipped with 16GB of DDR5 RAM, a 512GB NVMe PCIe 4.0 internal drive, an optional stylus, and a detachable keyboard.

The hardware runs on a 12th Generation Intel i7-1250U processor running at 3.5-4.7GHz, and the OLED display offers a maximum resolution of 2560×2024, rated at a bright 600 nits. Full specs of the model under review are here.

16" tablet with a kink of folding screen.

Giant tablet with folding screen.

2nd generation of this folding screen laptop

The first generation of the X1 Fold sported a 13” screen, slower processors, only 8GB RAM and weighed 0.99 kg. That latter stat is an important one: the second generation model reviewed here weighs in at 1.9kg including keyboard and stylus. This is not an ultrabook by any means, nor is it designed to be one. It’s worth noting that when folded, the measurements are 6.9 x 10.87” (176.4mm x 276.2mm) – so, a relatively small footprint, but one that’s offset by its thickness – 0.68” (17.4mm) plus a few millimeters for the keyboard, and the aforementioned 4lbs of heft.

Lenovo pictures the X1 Fold’s users working and playing on the device in its possible configurations: part-folded in landscape like a paper magazine, as a large, flat standalone tablet, or with the screen in full landscape or portrait as a display with keyboard. There’s also a clamshell mode: more on this later.

In clamshell mode, dividing the screen in half.

The machine comes with a kickstand that holds the screen at an angle suitable for tabletop use in either orientation, and the hardware keyboard latches on to the bottom of the stand just below the screen with a satisfying magnetic snap. The X1 Fold can also be used with a stylus, which can be attached, again via magnets, to the sides or top of the tablet display.

Note that the kickstand turns the machine into a tabletop, not a laptop: you can’t balance this beast on your lap, and trying to do so makes the keyboard detach with infallibly comic timing.

In laptop mode, landscape orientation.

The device’s motion sensors do a good job of detecting the user’s wishes, rotating and adapting according to configuration, snapping window tiles according to aspect. When no external keyboard is detected, the X1 defaults to tablet mode with an on-screen pop-up keyboard rolling in from the bottom of the display.

By default, Windows 11 is configured to run in dark mode – Lenovo states that this maximizes battery life, which is stated as being 4-6 hours of normal use, but naturally, your mileage may vary.

Lenovo, as is the case with many hardware manufacturers, ships the device with some bloatware of its own and from Intel, which can be largely ignored or uninstalled if required. That’s in addition to the bloatware that’s unavoidable with Windows 11: Xbox, the mixed reality portal, and the Spotify client that embeds itself comfortably into its autostarting niche after first run.

Screen in portrait with attached keyboard.

As a piece of hardware, the X1 Fold Gen 2 is a quick performer. There has clearly been a good deal of optimization of the interface’s responsiveness to the various sensors and peripherals like the stylus and keyboard. Connections are made quickly and there’s little of the rotate-(re-)rotate dance to have the device sense its orientation. The camera in the bezel responds to orientation in apps like Teams or Zoom. Speaker quality is better than you might expect, and the microphones gave clear and responsive results.

The keyboard that our device shipped with is exactly as you might expect, or indeed, dread. Let’s just say it’s not designed for protracted typing. Serious users will want to use the X1 Fold with something that’s less plastic-y, and for desk use, most will also opt for a secondary mouse: reaching back and forth from keyboard to screen to move the cursor/mouse pointer quickly loses its charm. The optional keyboard has the red ‘nipple’ mouse pointer, but that has few fans for obvious and understandable reasons.

The folding screen laptop’s big question

But it’s the laptop/tablet dichotomy that’s simply not solved by a folding screen, and that raises the question: why?

As a tablet, the X1 Fold is too big and certainly too heavy. The images on Lenovo’s website of models holding the device like a half-folded book are laughable. Sure, it’ll take the same shape as a large book or small-ish magazine, but has none of the advantages of either (lightness, portability, durability, disposability, lendability, longevity, finger-feel and so on). And good luck finding media that will render on the two-page layout without much touch/click-dragging around of windows.

As a laptop it’s not particularly portable, especially given that users will want a keyboard that doesn’t flex and feel like a $5 Walmart smart TV controller.

That’s a 4lb, one-arm curl. Source: Lenovo.

Some may find the screen’s aspect ratio useful when unfolded, especially those who like the idea of a portrait mode, page-friendly layout for working on documents. For work in cramped environments, like Economy Class on airplanes, for example, you can fold the screen across its width into clamshell mode and either use the onscreen keyboard or sit the hardware keyboard on the lower portion of the screen. You lose half the screen’s area in both cases, obviously.

But the price premium that buyers pay for the ThinkPad X1 Fold’s folding screen is, in our opinion, not worth it. In the cellphone form factor, a folding screen may have arguments in its favor for reasons of portability plus decent screen real estate when unfurled. But in the tablet/laptop space, there are few advantages and plenty of downsides.

By trying to hit multiple targets, the Lenovo X1 Fold 16” feels like a fully-functioning concept prototype that should have been quietly ditched at the user-testing phase of production. Its price tag (more than $4000 as reviewed) ensures that any novelty will wear off long before the arrival of the first credit card bill after purchase.

In use mode, as effective as it looks.

For the kind of money that only committed early-adopters might spend, you could buy an ultrabook for travel, a powerful desktop PC, and a tablet for media consumption, and have a better experience all round. While this folding screen laptop is a powerful piece of hardware, it’s a solution to a problem no one has.

On a final note, it’s worth remembering that Microsoft stripped from Windows 11 the touchscreen capabilities of Windows 10, themselves a hangover from the misstep that was Windows 8. By default, the impressive screen’s native resolution is halved (200% zoom), presumably to try to make the desktop environment viable in the tablet-esque use model that Lenovo imagines its users will love. The Windows desktop is a horrible enough environment at the best of times; adding a hardware gimmick on top makes for a no less miserable experience.

And now a word from our very much non-sponsors…

VMware & Broadcom axe free tier
https://techhq.com/2024/02/vmware-broadcom-axe-free-tier/ (Mon, 19 Feb 2024)

  • VMware for free heads into the sunset.
  • VMware vSphere free edition axed.
  • VM alternatives threaten the company’s future.

Following Broadcom’s acquisition of VMware, the company has axed its free tier that offered the ESXi hypervisor to test labs, hobbyists, and home lab builders.

In a knowledge base article, the company stated that the “VMware vSphere Hypervisor (Free Edition) has been marked as EOGA (End of General Availability). At this time, there is not an equivalent replacement product available.” While VMware for free is no more, “not an equivalent replacement product” is a significant mischaracterization.

What is a hypervisor?

Hypervisors are pieces of software that control virtual machines – encapsulated instances of operating systems – allowing multiple virtual computers to run on a single physical machine. Virtualization is one of the ways servers can deploy multiple applications and services using fewer pieces of hardware than dedicating a single machine to each. It’s possible because a typical combination of computer and operating system rarely utilizes all its resources at any one time. Hypervisors share a host machine’s resources across all of its virtual machines, allocating processor cycles, memory, and I/O across the fleet.
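
For a concrete view of that resource sharing, here is a small sketch using the libvirt Python bindings, the open source management API that fronts KVM, Xen, and other hypervisors. It assumes a local qemu:///system hypervisor with running guests.

    import libvirt  # pip install libvirt-python

    # Connect to the local hypervisor (KVM/QEMU via libvirt).
    conn = libvirt.open("qemu:///system")

    for dom in conn.listAllDomains():
        # info() -> [state, maxMem (KiB), memory (KiB), nrVirtCpu, cpuTime (ns)]
        state, max_mem, mem, vcpus, cpu_time = dom.info()
        print(f"{dom.name()}: {mem // 1024} MiB RAM, {vcpus} vCPUs")

    conn.close()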


VMware established itself as one of the main suppliers of virtualization technology in the early 2000s, just as commonly available hardware became capable enough to make virtualization viable. Relatively low-cost hardware was able to run multiple server instances, meaning organizations and data centers could offer server facilities without needing to supply dedicated hardware for each server instance. Today, it’s possible to rent a VPS (virtual private server) for as little as the cost of a round of take-out coffees, one that’s capable of running, for instance, a web server, database, and security stack.

Broadcom’s decision to limit its offerings is driven in part by a desire to recoup some of the $61bn cost of its acquisition of VMware. Previously, VMware had been bought and sold by EMC and Dell. By ensuring that the majority of its users will pay for licenses from now on, the company guarantees itself a revenue stream. The company has also ended perpetual licenses in favor of subscriptions.

Five threats to VMware’s future

While Broadcom may be content with its market consolidation and with ensuring all users pay their dues, there are five threats to VMware that will ensure its long-term demise, assuming its trajectory with regard to licensing remains the same.

  1. Competing products such as KVM, Harvester, Proxmox, and Xen offer IT teams alternative hypervisors and virtualization technologies that, in monetary terms, are free at the point of use. Their widespread use means that a large body of users are able to refine, update, and extend their solutions without the restriction of either fees or the closed strictures of the proprietary VMware platform. A significant portion of those users will also upstream their improvements and bug fixes and release new capabilities of the software to all other users under less restrictive license terms.
  2. By removing the ‘free to play’ tier, VMware and Broadcom have effectively removed the up-ramp to the paid tiers. Students, hobbyists, and career-minded IT staff are less likely to consider the platform to learn virtualization technology on. Large institutions offering CS courses, for example, can choose between free-to-use or paid-for licenses for software that is at least equal in capability.
  3. Organizations operating test environments for their development and deployment activities no longer have the freedom to experiment to the degree they might require. VMware use in environments that are created and torn down at will (test labs) will be subject to the same fees as production environments. That places a financial barrier on how organizations’ solutions can be tested.
  4. Advances in container-based workload deployment and management now make microservice-based applications and services viable in many contexts. Containerization is one area in which cloud computing has significant advantages for the creator. New projects are more likely to opt for containers over full virtualization if the latter seems to come with a price tag.
  5. The specter of a large company owning the rights to technology on which its customers depend brings with it several undesirable elements. Broadcom can, and might, either further raise prices, limit hypervisors’ capabilities, or ditch the project altogether. Or not. It’s the unpredictability of decisions made well away from the customer base that represents significant danger to organizations looking to bed in for the long term. Vendor lock-in may have been effective in previous decades when technological advancements were made by a single party in combination with a crushing marketing budget. Windows NT and Oracle, to take two such examples, were able to near-monopolize their chosen sectors (desktop and database, respectively). But such practices were very much of their time; in 2024, end-users have to deliberately choose their degree of lock-in and, of course, can opt for next-to-none. Existing license fee-paying customers find themselves offered two expensive alternatives: the devil is to be milked for license revenue until the end of time, and the deep blue sea is to make deep investments in moving to a different platform.
“The devil and the deep blue sea” by WarmSleepy is licensed under CC BY 2.0.

The end of VMware for free: consequences

It’s a common misconception that companies are bound by law to deliver increasing value to their shareholders – an idea, albeit an incorrect one, that might explain why Broadcom and VMware have chosen the path of license-fee revenue generation over ensuring a healthy intake of new users. What both companies’ decision-makers are bound by, however, is a need to stay in a position from which a board of directors and/or a shareholder vote can remove them.

Making shareholders happy at least ensures a short-term future that promises high salaries, stock options, and performance-related bonuses. If the desire for a third Ferrari or a modest mega-yacht outweighs the desire for the long-term viability of a software offering, then it’s easy to predict which way the wind will blow.

Teaching old dogs new SEO tricks
https://techhq.com/2024/02/teaching-old-dogs-new-seo-tricks-with-an-seo-optimization-checklist/ (Wed, 14 Feb 2024)

  • An SEO optimization checklist is unavoidable in web publishing.
  • Updating older posts can breathe new life into rankings.
  • Don’t fall into the ‘AI’ trap.

One of the realities of website publishing today is that the visibility of your content is determined to a large degree by the algorithms that surface a site in an online search. Optimizing your website for the best results is, therefore, a part of online publishing as integral as good grammar and an attractive writing style. Using an SEO optimization checklist can help you achieve reliable results.

And while it’s true that ‘Google is not the internet,’ SEO has to be embraced if you’re at all interested in as many people as possible reading your content and absorbing your message. Google and many other search engines have their own agenda in terms of what they surface most readily, but among the carefully constructed caveats and provisos designed to shape the modern internet’s content to their own ends, many SEO ‘rules’ comprise aspects of a page that include readability, legibility for assistive web tools, relevancy, popularity and the freshness of information. Taken objectively, these are generally positive influences on websites’ content.

It’s easy to fall too far into the website optimization pit. Sites written primarily with SERP (search engine results pages) in mind are easy to spot – repetitive use of keywords in body text and sub-headings, plus very little interesting content, are key indicators. However, it is possible to strike a balance between the value of your content to actual human readers and its perceived value for search algorithms. One way to help achieve this is to ensure that older content is refreshed.


Refreshing content should not necessarily mean re-spinning it (that is, rewriting each sentence, swapping out adjectives for synonyms, swapping main and subordinate clauses, and so on).

Content creators should consider refreshing content as a revision process with particular emphasis on bringing the page’s pertinent information up-to-date. In that way, your readers learn from text and media that are more timely, and Google et al. see refreshed content that will, hopefully, be of more interest to readers today.

Finding stagnant pages on websites is a relatively trivial task. Content creation dates supplied by your website’s back end are the obvious metric to begin with, and these can be correlated with analytics from your internal analysis tools.

Most articles receive a spike in interest on publication, then fade away. For news-based content, that’s the natural order of things. Less time-dependent pages could be good targets for a virtual spring clean and spruce-up.
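
Shortlisting those targets is easily scripted. A minimal sketch follows, with the page records and thresholds invented for the example: old pages that still draw a trickle of traffic are often the best candidates for a refresh.

    from datetime import date, timedelta

    # Hypothetical export from a CMS, joined with analytics data.
    pages = [
        {"url": "/guide-to-seo", "published": date(2021, 3, 1), "monthly_views": 40},
        {"url": "/latest-news", "published": date(2024, 2, 1), "monthly_views": 900},
    ]

    def refresh_candidates(pages, max_age_days=365, min_views=100):
        """Return URLs of pages that are both old and underperforming."""
        cutoff = date.today() - timedelta(days=max_age_days)
        return [p["url"] for p in pages
                if p["published"] < cutoff and p["monthly_views"] < min_views]

    print(refresh_candidates(pages))  # ['/guide-to-seo']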

Here’s our search engine optimization checklist for revitalizing the dusty corners of your website.

One of the basic tenets of HTML is that text can be represented as hypertext, one aspect of which is that documents are linked. Finding more relevant, up-to-date references and restructuring the sentences around the reference is a great way to bring content forward to the present. Additionally, if what’s being referenced is a field that changes (academic research, for example), readers will appreciate the latest information.
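
Finding the references that have died is simple to automate. Here is a minimal sketch using the requests library; the page URL is a placeholder.

    from html.parser import HTMLParser
    import requests  # pip install requests

    class LinkCollector(HTMLParser):
        """Collect href targets from anchor tags."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self.links += [value for name, value in attrs if name == "href"]

    page = requests.get("https://example.com/old-article", timeout=10)
    collector = LinkCollector()
    collector.feed(page.text)

    for link in collector.links:
        if link.startswith("http"):
            status = requests.head(link, allow_redirects=True, timeout=10).status_code
            if status >= 400:
                print(f"broken reference: {link} ({status})")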

Similarly, it’s worth refreshing your content to reflect changing opinions, either your own or those of the vertical about which you are writing. This type of edit can be made obvious, with clear demarcation, using italics or indented text to tell readers that the article has been updated.

The changing rules

All search engine optimization is guesswork, in which self-proclaimed experts attempt to reverse-engineer the closed algorithms used by large search engines. Like any form of guesswork, there are specialists out there whose guesses are better than others’, their quality determined by long experience and dedicated testing of changes to content.

But rules of SEO evolve, and it’s worth re-approaching your text and media with a fresh eye that’s informed by some research into what is, as of the present, current practice. It may be that previously optimized web pages need to be changed to reflect current guidelines for best SERP rankings – at which point, you need to update your SEO optimization checklist.

A prime example is readability. Because much web consumption now takes place on mobile devices, ‘readability’ can mean that shorter paragraphs interspersed with carefully chosen media have the advantage over thick walls of text (which are difficult to read on a moving bus, for example). Changing your pages’ structure may be advantageous within the limits of house style.

Here’s one. “Teaching an Old Dog New Tricks” by Fouquier ॐ is licensed under CC BY-NC 2.0.

Not using large language models

Like any other tool, using a large language model (Copilot, ChatGPT, etc.) can be effective in certain circumstances. LLMs act as relatively useful proxies for web searches, so if you’re looking to find supporting materials for your pages, LLMs can unearth nuggets of information that would otherwise take much longer to find using ‘traditional’ search.

But if we consider the text output of LLMs to be an amalgam of the information available online, their output should not be used to create original content. By pulling together many thousands of websites for training, LLMs produce literally average output. The models are poor at differentiating useful information from third-rate content, and so produce something in the middle: the statistical mean.

Given that your content should be unique and informative rather than only readable, using ChatGPT to write for you is wrong on every level.

Journalists and writers are facing redundancy in swathes as short-term profits are prioritized by companies that are happy to embrace second-rate content. If you’re happy with human expression being replaced by algorithms that regurgitate other people’s work (or, more accurately, predict what the next word in a sentence is likely to be, based on other people’s work), then LLMs are a fine choice. You may also want to use the phrase artificial intelligence incorrectly.

Conclusions

Whether your web pages promote a commercial product or express your inner life, it’s worth bearing in mind that the web was, and should be, concerned with the dissemination of information. While going through an SEO optimization checklist for best SERP rankings is, in 2024, an undeniable necessity, that fact need not reduce the overall quality of your offerings.

A giant roadside billboard may begin to look dated in terms of its color palette, font choices, strapline, or imagery. Changing those elements to bring a billboard into present relevancy need not change its message. Websites’ pages can be considered similarly.

Clouds darken in UK as on-prem makes a comeback
https://techhq.com/2024/02/cloud-hosting-services-increasingly-rejected-in-favor-of-on-premise/ (Tue, 13 Feb 2024)

  • Cloud-hosting services suffering from buyers’ remorse.
  • Many in the UK return to on-prem.
  • Limited use-cases for cloud?

The technology space runs on hype cycles. In 2023, talk of AI superseded the previous year’s metaverse speculation. Before that, we saw blockchain and distributed ledger technologies becoming the industry’s darling, fueling a boom-bust in cryptocurrency speculation that made – and lost – billions of dollars. A decade earlier, cloud computing dominated the technology industry’s press pages.

But it’s worth noting that the subjects of the hype du jour are, more often than not, created by vendors keen to massively over-promote their wares. As is the case in almost every commercial offering, the impetus to find something new to sell creates answers to questions that have yet to be asked.

But after the hype has died down, the gray dawn of reality begins to shed light on the decisions made just a few years ago. And in the case of cloud computing (hindsight has reduced the phrase to lower case), the aftertaste of cloud adoption is dominated by the flavor of overspending, with acid notes of unused capability.

UK turns away from cloud-hosting services

According to a survey by Citrix of UK-based IT leaders as reported by InfoWorld, 43% of respondents said moving applications and associated data to the cloud from on-premise was more expensive than they’d thought. Nearly a quarter (24%) said cloud solutions were failing to meet expectations.

The three big promises of cloud computing were agility/scalability, lower cost, and access to cutting-edge innovation. If we translate the ‘lower cost’ into ‘OPEX, not CAPEX’ (shifting figures around a spreadsheet), what’s interesting about the remaining two benefits is that both refer to infrastructure, not what happens on that infrastructure.

Cloud hosting services allow applications to run on systems that can scale according to changing levels of demand. But without significant re-engineering or even rewriting, many applications simply can’t take advantage of the rapidity of scale-up/down on offer. The exception is data storage, which can easily be expanded, given deep enough pockets.

But most applications that have been in reliable production for more than the blink of an eye gain little from being cloud-hosted. The obvious exceptions are applications written using microservices: containers that can be quickly replicated and torn down. ‘Cloud-native application’ has become a differentiating term denoting a code base divided into replicable elements that can be controlled independently.

Access to cutting-edge technologies, cloud’s third great hope, is in no way unique to cloud vendors. There’s a degree of automation in creating new infrastructure, which is certainly made simple. But organizations demand systems without vendor lock-in so that they can, theoretically at least, migrate from cloud to cloud to hunt down the best value for money. Attempts by cloud vendors to turn open source technologies into exclusive offerings (MongoDB on AWS is a good case in point) are heading for failure.

And therein lies the core lesson about cloud computing. What’s offered is most simply described as ‘someone else’s computer.’ There are layers of usability placed on top, and users can choose from a menu of pre-vetted platforms and technologies that ‘just work.’ And that facility may be useful if, for example, users create new applications based on microservices that operate on data already embedded into cloud-based workflows. But as nearly half of UK IT decision-makers have found, doing so can be more expensive than predicted.

Why containers, after all?

It’s also worth noting that the industry standard for fleets of containers, Kubernetes, dedicates a great number of lines of code to what should happen when containers crash: keeping systems going when component parts fail. Less fashionable but more mature technologies, like VMs or FreeBSD jails, may not be as glamorous in boardroom discussions but offer a more solid basis on which to build out new features on relatively reliable existing applications.
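
That self-healing is visible from the operator’s side. Here is a small sketch using the official kubernetes Python client to read how often each container has crashed and been restarted; it assumes a configured kubeconfig and a populated ‘default’ namespace.

    from kubernetes import client, config  # pip install kubernetes

    config.load_kube_config()  # use local kubeconfig credentials
    v1 = client.CoreV1Api()

    # Restart counts show the cluster quietly recovering from crashes.
    for pod in v1.list_namespaced_pod("default").items:
        for cs in pod.status.container_statuses or []:
            print(f"{pod.metadata.name}/{cs.name}: {cs.restart_count} restarts")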

Companies hoping to sidestep the IT staff shortage may also look at the big cloud vendors for solutions. But the slew of AWS/GCP certifications now available indicate that trained and therefore expensive personnel are a pre-requisite, regardless of where systems are located.

Cloud-hosting services don’t offer their facilities primarily for the good of users. They aim to sign up as many users as possible, whether or not their clients’ needs are best met by what’s on offer. It’s a good match in some cases, but a growing number of companies that joined the ‘stampede to the cloud’ now regret being caught up in the melee.

“Outrunning the Wall Cloud” by Wesley Fryer is licensed under CC BY-SA 2.0.

The post Clouds darken in UK as on-prem makes a comeback appeared first on TechHQ.

]]>
IBM’s mainframe for the masses https://techhq.com/2024/02/what-is-the-business-case-for-a-mainframe-computer/ Mon, 12 Feb 2024 12:30:07 +0000 https://techhq.com/?p=231983

A mainframe computer on a limited budget. Transactional processing at scale. ROI in months compared to hyperscale cloud. Say the word mainframe to many IT professionals, and they immediately think of legacy computing, systems being replaced with more modern technologies to better cope with the demands on computing common in 2024. Some will have used... Read more »

The post IBM’s mainframe for the masses appeared first on TechHQ.

]]>
  • A mainframe computer on a limited budget.
  • Transactional processing at scale.
  • ROI in months compared to hyperscale cloud.

Say the word mainframe to many IT professionals, and they immediately think of legacy computing: systems being replaced by more modern technologies better suited to the computing demands of 2024. Some will have used mainframe systems in the past, perhaps in environments where computer access was via a thin client connected to a mainframe. In those situations, the client was a dumb terminal, and the computing work was done elsewhere by a central mainframe facility.

But mainframes are still in everyday production in some industries, and the market for new mainframe hardware and compatible software continues to grow where “transactional” computing is central to effective operations.

The new IBM LinuxONE 4 Express is hardware designed for smaller organizations: the baby of the breed, offering a low cost of entry with pre-configured hardware options. The company also emphasizes its cyber-resilience, with hardware security features it terms “Secure Execution.” The hardware meets several stringent security standards, including Common Criteria EAL 4 and DISA STIG certifications, and FIPS 140-3 compliance.

Users can choose their preferred software platform, with SUSE now offering its Linux Enterprise Server for IBM Z as part of a bundle that can also come with SLE Live Patching, SUSE Manager Lifecycle Management Plus, SLE High Availability, and a long-term service package. The Secure Execution hardware means multiple containerized applications can be run simultaneously in isolation. That makes the system ideal for multi-tenancy operations or parallel application spaces that are effectively separated from each other.

IBM LinuxONE Express. Image: IBM.

Mainframe computer benefits

While similar secure and powerful environments can be created using several x86 servers, mainframes represent a more sustainable approach to hardware and power use. The ability to expand storage, memory, and processing capacity over time makes this style of hardware a more attractive long-term prospect: fewer components reduce environmental impact and make hardware maintenance plans simpler to budget.

While the initial cost (from $135,000) may seem high as a line item on a CAPEX sheet, enterprises with large cloud provider bills may see an effective return on investment sooner than they think. Depending on the use case, the scale and agility that third-party clouds provide often go unused, which means large organizations pay for capabilities they rarely exercise.

Transactional computing

The continuing existence of a thriving market in mainframe computing stems from the need for accurate transactional computing in a growing number of verticals. Transactional computing is best described as keeping canonical records of every aspect of a single transaction, with each element of that transaction required to succeed before a record is made, changed, or deleted.

For example, in an e-commerce business, a transaction might comprise moving funds from a bank account to a vendor via a payment provider. If any one of those steps fails (and each comprises several sub-steps), the transaction has not occurred, so the only record made is one flagged as a failure. The emphasis in computing terms is therefore not on raw processing power (the main requirement of supercomputing, for example) but on the integrity of database entries. That emphasis doesn’t strictly require a different computing architecture, but it’s one that comes built into the design specifications of mainframes.
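As a minimal sketch of that all-or-nothing property, here is the principle in Python using SQLite, purely for illustration: the account names and amounts are invented, and a real payment flow involves many more sub-steps, but the atomic commit-or-rollback behavior is the same one mainframe workloads depend on:

import sqlite3

# Illustrative ledger: two hypothetical accounts in an in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("customer", 100), ("vendor", 0)])
conn.commit()

def transfer(conn, payer, payee, amount):
    """Move funds atomically: either every step succeeds, or none do."""
    try:
        with conn:  # opens a transaction; commits on success, rolls back on error
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE name = ?",
                         (amount, payer))
            # Simulate a verification sub-step that can fail mid-transaction.
            (balance,) = conn.execute("SELECT balance FROM accounts WHERE name = ?",
                                      (payer,)).fetchone()
            if balance < 0:
                raise ValueError("insufficient funds")
            conn.execute("UPDATE accounts SET balance = balance + ? WHERE name = ?",
                         (amount, payee))
        return "success"
    except ValueError:
        return "failure"  # the only record made is one flagged as a failure

print(transfer(conn, "customer", "vendor", 60))  # success
print(transfer(conn, "customer", "vendor", 60))  # failure: rollback leaves balances intact

The guarantee matters more than the throughput: after the failed attempt, the customer’s balance is untouched, which is exactly the property transactional workloads pay for.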

For that reason, banking and financial services, for example, still rely on mainframe technologies. But as the scale of internet use grows, more industries rely on the type of security, reliability, and data veracity that mainframe methodologies (still) excel at. Other use cases may be found in high-volume e-commerce marketplaces, engineering facilities that rely on multiple IIoT nodes, and power distribution networks, to name just three examples.

“Mainframe computer” by scriptingnews is licensed under CC BY-SA 2.0.

Business case for mainframe computers

Software optimized for transactional computing, whether monolithic or based on microservices, is available from several vendors: LinuxONE hardware will run Red Hat, SUSE, and Ubuntu. The LinuxONE Express range ships, at base configuration, with 16 IBM IFLs (Integrated Facility for Linux processors), expandable to 68; the LinuxONE Emperor range supports over 200 for those looking for more grunt.

The size and power of mainframes make them ideal for placing data and applications in the same place. IBM quotes an example of medical data and medical claims software sharing the same hardware tenancy, allowing for faster claim assessment. Similarly, for businesses looking to consolidate their server fleets, either in-house or leased from cloud providers, an Express instance can replace up to 2,000 x86 instances (manufacturer’s claim: YMMV).

Many IT decision-makers are concluding that hyperscale cloud providers do not offer their services with end-user advantage front of mind. To take a single example, Microsoft’s Q4 net income in 2023 was $21.9 billion. While cloud computing still suits many, financial decision-makers might question the value for money their organization gets from existing agreements with hyperscalers. That element of doubt, plus an increased need for reliable transactional processing, will make the capital expenditure option look increasingly attractive to many.

The Express mainframe range can be sourced directly from IBM or approved partners.

The Express is not your meemaw’s mainframe.

The post IBM’s mainframe for the masses appeared first on TechHQ.

]]>