How cloud and co-location services can protect manufacturers against the rising threat of cyber attacks
https://techhq.com/2024/03/how-cloud-and-co-location-services-can-protect-manufacturers-against-the-rising-threat-of-cyber-attacks/ (6 March 2024)


Cybersecurity professionals are in exceptional demand

Research by Gartner predicts that by 2025, nearly half of all cybersecurity leaders will change jobs, with a quarter of them leaving the industry due to work-related stressors. The responsibility cybersecurity leaders carry is rising thanks to the evolving landscape of cyber threats, which constantly demands innovative solutions and proactive defences. Another study, from (ISC)2, found that nearly 70 percent of cyber professionals say their organization doesn't currently have enough cybersecurity staff. Gartner expects the resulting talent shortage to be responsible for over half of significant cyber incidents. Such events are costly, both in the direct financial sense, through operational downtime and data recovery, and in the form of reputational damage.

Manufacturing. Source: DartPoints

The manufacturing industry is the most vulnerable to cyber attacks

The problem is particularly relevant in manufacturing, which was the top industry affected by ransomware in 2023. The sector is becoming increasingly connected through the Industrial Internet of Things (IIoT), incorporating sensors, actuators, and other devices networked with computers and industrial applications. This expands the attack surface that cyber criminals can exploit to gain unauthorized access, disrupt operations, or steal sensitive data.

Manufacturers are also often targeted because a successful attack can impact all equipment and IIoT devices, leading to complete operational stoppage, with ripple effects on the supply chain. For example, in 2022, a ransomware attack on Kojima Industries Corporation, a vehicle parts manufacturer, forced Toyota to shut down 14 factories for 24 hours.

Additional common challenges the industry faces include intellectual property theft, user error, phishing, and espionage. It is therefore essential that manufacturers have robust business continuity and disaster recovery (BCDR) plans in place.

The security benefits of cloud or co-location services

The cyber security measures available to manufacturing businesses largely depend on where they store their critical data, whether on-premises, with a cloud service provider, or at a colocation data center.

On-premises infrastructure involves hosting servers locally, while colocation provides secure data center space for servers and equipment. Cloud services offer virtualized resources accessible over the internet, enabling on-demand access to computing power and storage. Many organizations have been moving away from on-premises infrastructure for several years due to its high upfront costs and maintenance requirements. Colocation and cloud services eliminate these expenses and offer greater scalability and flexibility, catering to the fluctuating demands of the manufacturing industry. Organizations that utilize colo and cloud can easily scale up resources in response to growth or a need for increased computing power for data analysis and other applications.

On top of this, colocation and cloud service providers tend to offer advanced security features that are not available when hosting on-premises. These might include physical measures like biometric access and 24/7 surveillance, or network features like advanced firewalls, intrusion detection systems, EDR/MDR, SIEM, and DDoS protection. They can encrypt customer or financial data in transit and at rest, providing an additional layer of security for the most sensitive information.
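
As a toy illustration of encryption at rest, the sketch below uses the Python cryptography package's Fernet recipe. It is a minimal sketch only: real providers manage keys in dedicated KMS or HSM systems rather than in application code, and the record shown is hypothetical.

```python
# Minimal sketch of symmetric at-rest encryption with the Python
# "cryptography" package (pip install cryptography). Illustrative only:
# providers keep keys in dedicated KMS/HSM systems, never alongside the
# data, and the record below is hypothetical.
from cryptography.fernet import Fernet

key = Fernet.generate_key()                 # in practice, fetched from a KMS
cipher = Fernet(key)

record = b"customer_id=4471;card=****1234"  # hypothetical sensitive record
token = cipher.encrypt(record)              # ciphertext is what lands on disk
assert cipher.decrypt(token) == record      # round trip for authorized readers
```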

The provider should conduct regular security audits to ensure compliance with the industry standards relevant to manufacturing and data protection, like the SOC, HIPAA, and NIST frameworks. These can significantly ease a manufacturing company’s burden of achieving and maintaining compliance independently. Outsourcing IT infrastructure to colocation or cloud service providers also supplies businesses with additional third-party expertise. The third-party team can provide deeper insights into existing and emerging threats while offering invaluable guidance about how the company might best detect and defend against them. This allows manufacturers to allocate resources more efficiently, focusing on their core competencies while leaving cybersecurity management to the experts.

Manufacturing. Source: DartPoints

Colocation and cloud services can form essential components of BCDR plans for organizations. They offer geo-redundancy, ensuring data and applications are replicated across multiple locations to minimize downtime during disasters or cyberattacks. These services also provide reliable data backup solutions, enabling the swift restoration of operations from secure offsite backups to reduce losses. Transitioning to cloud or colocation solutions with a trusted third party can help ensure long-term cybersecurity and operational resilience, providing peace of mind to a highly targeted industry.

Consider DartPoints

DartPoints, a leading provider of colocation, cloud, and cybersecurity, stands out as an invaluable partner for organizations grappling with the escalating threat of cyber attacks.

With a comprehensive suite of tailored cybersecurity solutions, DartPoints provides a unique, multi-layered defense strategy to safeguard manufacturing operations. Its approach encompasses round-the-clock monitoring, robust security protocols, and advanced technologies such as firewalls, intrusion prevention systems, and sophisticated DDoS mitigation tools, ensuring that sensitive manufacturing data and intellectual property remain secure. Moreover, DartPoints offers regular data backup and fast recovery solutions, guaranteeing swift restoration in the event of data loss, while high levels of redundancy and failover capability minimize downtime during disasters.

The company’s customizable security postures cater to manufacturing companies’ unique risk profiles and business requirements, ensuring that security measures are aligned with specific operational needs, reporting requirements, or compliance standards. With 24/7 support and security monitoring services, manufacturing organizations can rely on DartPoints to provide unparalleled protection against cyber threats.

It responds to incidents much faster than a limited in-house team could, and boasts an uptime SLA of 99.999 percent. Its data center locations across the eastern US ensure low-latency connectivity and easy access.

Discover how cloud and co-location services from DartPoints can protect your manufacturing business from cyber attacks by visiting its website or speaking to one of the team today.

Legacy IT infrastructure is a sustainability nightmare
https://techhq.com/2024/02/it-infrastructure-problems-sustainability-for-decision-makers-report/ (16 February 2024)

  • A recent report shows the rapid changes IT infrastructure is undergoing.
  • Legacy tech can slow down an organization’s whole IT modernization program.
  • IT leaders reveal the infrastructure problems they’re having – and what they forecast for the future.

Legacy infrastructure is a sustainability issue for IT professionals, who rank energy efficiency and sustainability as important, but don’t feel confident about meeting their green targets.

All this and more has been revealed by research from Daisy Corporate Services into legacy tech, with a focus on the cloud. Researchers heard from more than 250 senior IT decision makers at large private and public sector organizations in the UK, uncovering what they consider to be the key IT infrastructure challenges facing them and their teams – and how to overcome them.

The report provides insights into IT decision makers' challenges, including the global IT talent shortage, the drivers for IT outsourcing, and the factors in choosing between on-premises and hosted IT infrastructure.

75% of respondents fully (24%) or partly (51%) outsource their organization's IT infrastructure management to a managed service provider. Partly due to the global talent shortage, IT decision makers felt that their organizations lack the skills and expertise needed to keep infrastructure management in-house. Decisions on IT infrastructure are also getting more complex.

If that wasn’t enough, the report further details the predictions, barriers and benefits of moving more of the IT estate to the cloud, and looks at IT budgets, investments, and the impact of – and opinions on – new tech like AI. Thought it’d been left out?

Over the last five years, businesses have had to manage the significant evolution of business systems and IT landscapes, with the pandemic a driver of that change. Increasing competition and globalization have affected the operating environment, which has further changed as a result of political and supply chain instability.

The increasing range of technologies and applications was in joint first place with cloud services for contributing most to IT complexity (57% each). Both factors make full visibility into infrastructure performance difficult.

90% of IT decision makers say building, managing and maintaining their IT landscape has become more complex, and simplifying IT infrastructure is a priority for 89%.

Moving your IT infrastructure to the cloud – an unstoppable trend?

Despite that complexity, the benefits of cloud use are obvious, and adoption continues to increase.

The ability to harness the cloud depends on a company's existing IT landscape: the biggest barrier to success is that the greater the complexity of the existing IT infrastructure, the greater the work needed to migrate to the cloud.

Another significant barrier for organizations looking to move more of their IT estate to the cloud is data security concerns.

Too often, organizations find they're using a hybrid cloud system by accident rather than by design – which means falling at the first hurdle in a world where it's vital that an organization's infrastructure is secure and able to adjust dynamically to workload demands.

The multitude of different environments and tools in use makes it hard for already-struggling organizations to gain the end-to-end visibility necessary for consistent security and reliability.

IT infrastructure needs to be flexible if it's going to deliver on its promises.

Sustainability also has a strategic impact as the issue becomes a business differentiator; consumers increasingly treat a company's sustainability record as a major factor in its appeal.

The focus on corporate and social responsibility filters through to IT teams as businesses turn their attention to the sustainability and efficiency of their IT operations.

The biggest hurdle to a clean, green transition is legacy and on-premises infrastructure: legacy tech was described by 63% of IT decision makers as a “sustainability nightmare.” 86% say sustainability and efficiency are important to IT operations, and 84% that their organization has IT sustainability targets in place – but only 51% of them are “very confident” they’ll meet those targets.

And legacy tech isn’t just complicating the transition to greener systems: legacy hardware accounts for 37% of organizations’ overall IT power consumption.

Spending is also coming under pressure as the global economic slowdown means businesses are having to adapt for survival – driving efficiencies is one tactic for survival that has huge impact on IT teams.

IT leaders are under pressure to reduce expenditure, with many turning to cloud services to convert capital infrastructure expenses into opex costs.

Would moving to a consumption-based model of IT infrastructure be viable for your business?

“Sustainability is a vital component of any modern business, and IT departments have a growing role in helping the wider organisation achieve green targets. But legacy technology is a cause for concern among IT teams, with ageing equipment still contributing significantly to power consumption,” comments Andy Bevan, head of propositions and strategy consulting at Daisy.

“Organizations can benefit from the sustainability features of their cloud providers but are being held back by the challenges of migrating their legacy hardware. Here is where modern hybrid cloud platforms can help bridge the gap between on-site infrastructure and cloud to deliver performance and sustainability benefits.”

Many IT leaders are being asked to re-evaluate their IT spending. More than two thirds (69%) of survey respondents describe the pressure to reduce IT capital expenditure as “significant.” Many are looking to consumption-based pricing to reduce ongoing costs and increase flexibility by paying only for what they use – scaling up and using more resources at peak times, and then reducing resources and costs again afterwards.
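
As a rough sketch of that capex-to-opex logic, the comparison below pits an on-premises estate sized for peak demand against pay-for-use pricing. Every figure is invented purely for illustration.

```python
# Hypothetical comparison of fixed on-premises capacity vs. consumption-based
# cloud pricing. Every figure here is invented purely for illustration.
PEAK_UNITS, BASE_UNITS = 100, 40      # capacity needed in peak vs. normal months
CAPEX_PER_UNIT = 1_200                # upfront hardware cost per unit, 5-year life
CLOUD_PER_UNIT_MONTH = 30             # pay-as-you-go price per unit-month

usage = [BASE_UNITS] * 10 + [PEAK_UNITS] * 2        # two peak months per year

on_prem_annual = PEAK_UNITS * CAPEX_PER_UNIT / 5    # must buy for peak, amortized
cloud_annual = sum(u * CLOUD_PER_UNIT_MONTH for u in usage)

print(f"on-prem, sized for peak: {on_prem_annual:>9,.0f} per year")
print(f"cloud, pay for use:      {cloud_annual:>9,.0f} per year")
```

With these made-up numbers, paying only for actual usage comes out cheaper because capacity bought for peak sits idle most of the year – the trade-off the report's respondents describe.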

Legacy infrastructure is a headache in this process though, as its maintenance and support incurs significant cost. New technology is the solution for some IT managers, with many expecting to see the benefits of using artificial intelligence and automation in areas like service and performance management.

Bevan adds, “At a time when IT leaders are under pressure to reduce capital expenditure many organizations are still incurring significant maintenance and support costs on their legacy hardware. By moving to the cloud and a consumption-based pricing model, organizations can reduce ongoing costs and increase flexibility by paying for what they use. For cost-constrained IT departments this should be their nirvana.”

The full research from Daisy, “Faster, greener, cheaper – dealing with IT infrastructure complexity in a hybrid cloud world,” is available here.

The post Legacy IT infrastructure is a sustainability nightmare appeared first on TechHQ.

]]>
O’Reilly report predicts technology trends for 2024
https://techhq.com/2024/02/oreilly-tech-trends-for-2024/ (15 February 2024)


• What technology trends can we expect to hit big in 2024?
• Generative AI dominated 2023 – will its bubble burst in 2024?
• Security remains a strong trend – what will this year bring?

We’ve all lived through technological advancements that were once considered sci-fi. Some of us were there when the web was unveiled 31 years ago, marking the first glimpses of a future where “browsing” took on a whole new meaning. While there have been many technological advancements over the succeeding years, 2023 may have been one of the most disruptive, with AI, in particular large language models, transforming the industry, and the world.

AI has already altered the software industry, but believe it or not, we are still at the very beginning of AI’s narrative. What’s to come in the future is hard to predict, but according to the highly renowned O’Reilly learning platform, we can start to have a clearer indication of what to expect through shifting patterns.

Drawing on O’Reilly’s internal “Units Viewed” metric, this snapshot of trends is based on platform data covering January 2022 to November 2023. According to the O’Reilly report, technology adoption in companies tends to be gradual, with established technology stacks evolving slowly over time. This is why it is important to recognize the unique technology landscapes of individual companies.

O’Reilly software trends for 2024

O’Reilly found that usage of software development content declined in 2023, even though programmers, of course, continued to write software. This in no way implies a decrease in the overall significance of software development, and the impact of software on our daily lives continues to grow.

A trend that will not change is that of software developers designing larger, increasingly complex projects. The uncertainty, however, is whether generative AI will help manage this growing complexity or add a new layer of complexity itself. Many are using AI systems, like GitHub Copilot, to write code, treating AI as a quick fix. In fact, O’Reilly found that 92% of software developers are now using AI to create low-level code.

This leaves a few questions:

  • Is AI capable of doing high-level design?
  • How will AI change things software developers want to design?

Perhaps the key question is how humans can collaborate with generative AI to design systems effectively. There’s little doubt that humans will still be required to understand and specify designs. And, while there has been an overall decline in most software architecture and design topics according to O’Reilly, there are notable exceptions. For instance, enterprise architecture, event-driven architecture, domain-driven design, and functional programming have either shown growth or experienced relatively small declines.

These changes indicate a shifting landscape in software development; one that leans more towards the design of distributed systems that handle substantial real-time data flows. The apparent growth in content in these evolving fields seems to reflect a focus on addressing challenges posed by managing large volumes of data in distributed systems.

There has also been a decline in microservices. According to O’Reilly, this popular architectural approach experienced a 20% drop in interest during 2023, with many developers advocating a return to monolithic applications. It seems some organizations adopted microservices because they were fashionable rather than necessary, which can lead to challenges when they are implemented poorly.

Design patterns also saw a 16% decline in interest among developers, which may be driven by AI’s involvement in writing code and a growing focus on maintaining existing applications. Design patterns nonetheless remain relevant as software becomes more flexible, even in legacy applications. However, bursts of interest in design patterns have historically been accompanied by surges in pattern abuse, such as developers implementing FactoryFactoryFactory factories.

O’Reilly’s report suggests a shift in interest regarding software development, primarily influenced by practical considerations, and occasional misapplications of methodologies.

O’Reilly AI trends for 2024

Right now, the GPT family of models is the main talking point when it comes to AI. In 2023 alone, user numbers went up a staggering 3,600%, kickstarted by the introduction of ChatGPT in November 2022. As far back as 2020, however, GPT-3 was making a splash on the AI scene, with GPT-1 and GPT-2 launched in 2018 and 2019 respectively.

O’Reilly’s analysis has shown that interest in the broader field of natural language processing (NLP) has experienced a substantial increase, specifically a 195% rise among its users. This is a growing trend that is expected to continue throughout 2024, with software developers inclined to focus on building applications and solutions using the APIs provided for GPT and other language models. Therefore, they may become less interested in ChatGPT.

Other substantial gains included Transformers (a type of deep learning model architecture), up 325%, and generative models, up 900%. Prompt engineering, only introduced in 2022, has become a significant topic, with a similar usage to Transformers. NLP is used almost twice as much as GPT, although, according to O’Reilly’s data, the next year will be driven hugely by GPT models and generative AI.

Here are some other key insights taken from O’Reilly’s analysis, giving us a clearer indication of AI trends for 2024:

  • Deep learning remains fundamental to modern AI, with a reported 19% growth in content usage, while other AI techniques, such as reinforcement learning, have also seen positive gains.
  • Programming libraries, such as PyTorch, a Python library, continue to grow and dominate programming in machine learning and AI, with a 25% increase.
  • TensorFlow has reversed a decline with a modest 1.4% gain, and it seems there is a noticeable decline in interest for scikit-learn and Keras.
  • Interest in operations for machine learning (MLOps) has increased by 14%. This reflects the recognition of the importance of deploying, monitoring, and managing AI models.
  • LangChain, a framework for generative AI applications, is showing signs of emergence, particularly in the retrieval-augmented generation (RAG) pattern (see the sketch after this list).
  • Vector databases are expected to gain importance, albeit with specialized usage.
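
For readers unfamiliar with the RAG pattern mentioned above, here is a minimal sketch of the idea: retrieve the most relevant documents, then feed them into the model's prompt as context. A toy bag-of-words retriever stands in for a vector database, and a print call stands in for the LLM request; the documents are hypothetical.

```python
# Minimal sketch of retrieval-augmented generation (RAG). A toy bag-of-words
# retriever stands in for a vector database; a print stands in for the LLM call.
import math
from collections import Counter

DOCS = [
    "Vector databases store embeddings for similarity search.",
    "LangChain chains together prompts, retrievers and models.",
    "MLOps covers deploying and monitoring machine learning models.",
]

def embed(text):
    return Counter(text.lower().split())   # stand-in for a real embedding model

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query, k=1):
    q = embed(query)
    return sorted(DOCS, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

question = "What does a vector database do?"
context = "\n".join(retrieve(question))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)   # in a real RAG pipeline, this prompt goes to the language model
```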

Throughout 2024, and beyond, generative AI’s influence is set to span various industries, including logistics, finance, manufacturing, pharmaceuticals, healthcare, and government.

That indicates a dynamic and evolving landscape in the year to come.

O’Reilly security trends for 2024

Another topic that saw serious interest gains among developers in 2023 is security. According to O’Reilly, the majority of related topics showed growth from 2022 through 2023: network security was the most heavily used topic, growing 5% year-over-year, while governance usage grew 22%.

DevSecOps saw one of the largest usage increases among security topics, at 30%, while interest in application security topics increased by 42%. This indicates a move towards embedding security throughout the entire software development process.

Additional things to watch in 2024

Rise of the machines in 2024? O’Reilly has ideas…

O’Reilly’s analysis signals a variety of technology trends for 2024. Here are some other trends we expect to experience as the year goes on:

  • With a 175% growth, cloud native has become the most used cloud-related topic. This suggests a widespread shift of companies towards developing primarily for the cloud as their main deployment platform.
  • Experiencing a 36% rise, Microsoft Power BI seems set to continue as one of the most widely used data topics.
  • There has been an increased focus on professional development, project management, and project communications, signifying developers’ enhancement of “soft skills” through upskilling.
  • CompTIA A+ encountered the most significant growth in content usage at 58%, suggesting a large increase in people looking to start IT careers.

Mike Loukides, vice president of emerging technology content at O’Reilly, said, “This year marks a rare and genuinely disruptive time for the industry, as the emergence of generative AI promises important changes for businesses and individuals alike.”

But Loukides continued: “Efficiency gains from AI do not, however, replace expertise. Our data signals a shift for programming as we know it, with consequences for skills, job prospects, and IT management.” With new innovations rolling out as the year progresses, it’s a time for preparation, with upskilling more critical than ever before.

Future-proofing utility companies: The role of data, analytics and IoT
https://techhq.com/2024/02/future-proofing-utility-companies-operations-the-role-of-data-analytics-and-iot/ (14 February 2024)


Modern utility companies face an imperative to transform their operations as they grapple with increasing energy demands, environmental concerns and the evolving technological expectations of consumers. There’s also a raft of legislation that needs to be observed, plus an omnipresent need to modernise infrastructure. Finally, geopolitical events, like the Russian invasion of Ukraine, and economic uncertainties are introducing unpredictability into energy supply chains.

Data, analytics and the Internet of Things (IoT) can be harnessed to enhance operational efficiency and deliver customers a more sustainable and responsive service. The adoption of these technologies by utility companies can also improve safety, enhance the employee experience and uncover new revenue streams.

Prioritisation is key in this situation; after all, funds are not infinite. But utilities also have the potential to be massive data gatherers, from their plant and infrastructure, their delivery methods, consumers, end-users and business operations.

Utility operations are changing rapidly. Source: Shutterstock

In some cases, there needs to be a shift in mindset with regard to the information gathered and used by utility companies. TechHQ recently spoke to Wipro – a company in multiple partnerships with UK utilities – about the challenges and approaches their consultants and architects see daily.

The company’s Director of Digital Transformation, Sampathkumaran Hariharan, said: “The data gathered from the field is impacted by the kind of equipment and sensors that have been deployed over a long period of time. Many of the utilities have not refreshed the data-gathering mechanisms from the field. But even from an operational point of view, the information about the customer is also maintained in multiple systems.”

Sampathkumaran Hariharan. Source: LinkedIn

“The customer has not been the centre of data acquisition over a long period of time. Utilities have traditionally been an asset-centric organisation. The emphasis on being a customer-centric organisation is heavily emphasised by the regulator and the frameworks the regulator has. So there is a transitioning that is happening within the industry.”

All over the UK, water companies are becoming more responsive to their customers’ demands for less water wastage from existing pipe infrastructure. Wipro’s use of advanced machine learning algorithms helps one UK water utility monitor 600 DMAs (District Metered Areas) to identify the zones where leaks are a substantial issue. IoT devices monitor flow rate and pressure, and with data on the geometry of the network (routes, pipe diameters, existing gate valves, etc.), software can predict pipe burst events and detect major leakage anomalies at the utility’s pilot sites.
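
As a rough illustration of the kind of anomaly detection involved – not Wipro's actual algorithms – a rolling z-score over simulated flow readings is enough to flag a sudden, leak-like deviation:

```python
# Toy leak detection: flag a flow reading that deviates sharply from the
# recent rolling average. Simulated data; production systems combine flow,
# pressure and network topology, with far more robust models.
import statistics

flow = [50.1, 49.8, 50.3, 50.0, 49.9, 50.2, 58.7, 59.1]  # jump = possible burst
WINDOW, THRESHOLD = 5, 3.0

for i in range(WINDOW, len(flow)):
    recent = flow[i - WINDOW:i]
    mean, sd = statistics.mean(recent), statistics.stdev(recent)
    z = (flow[i] - mean) / sd if sd else 0.0
    if abs(z) > THRESHOLD:
        print(f"reading {i}: flow={flow[i]} (z={z:.1f}) - investigate possible leak")
```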

Proactive maintenance and infrastructure replacement mean less water is lost to leaks, and critically, water supplies are better ensured to customers at home and in the workplace.

Further up the chain, the combination of OT (operational technology) and IT helps translate collated data and convert it for use by distribution network operators. This amalgamation of new technology with legacy infrastructure combines localised DCS (distributed control system) and geographically widespread SCADA (supervisory control and data acquisition) data to create next-generation ADMS (advanced distribution management system) solutions.  These solutions are more suited to the highly flexible approach to electricity distribution required for future generations.

Technology also elevates both the customer and employee experiences; utility personnel benefit from streamlined workflows and data-driven insights, while customers receive more personalised services and access to real-time information through smart home utility meters. Furthermore, integrating state-of-the-art technologies creates the potential for new revenue streams, such as energy efficiency consulting and water quality certification services, increasing the longevity of these critical sectors.

Data is the lifeblood of transformation in the utility, transmission & distribution (T&D) and water sectors. It powers informed decision-making, bespoke services for customers and efficient resource allocation, to name just a few benefits. However, this data is next to useless if the company does not have the capability, facilitated by a robust data platform, to turn it into actionable insights.

Data-driven approaches can lead a company to embrace ‘Industry 4.0’, or the integration of IT systems with business processes. For example, say a utilities company installed advanced sensors across its power grid infrastructure; it could then apply predictive maintenance models that identify potential equipment failures before they occur. These insights could be integrated into its IT systems, allowing for automated alerts and maintenance scheduling.
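
A minimal sketch of that flow might look like the following, where a toy scoring heuristic stands in for a trained predictive-maintenance model and a print call stands in for the ticketing integration; asset names and thresholds are invented:

```python
# Hypothetical 'Industry 4.0' loop: sensor readings feed a simple failure
# score, and crossing a threshold raises a maintenance alert. The heuristic
# stands in for a trained model; assets and thresholds are invented.
def failure_score(temp_c, vibration_mm_s):
    heat = max(0.0, (temp_c - 60) / 40)          # 0 at 60 C, 1 at 100 C
    shake = max(0.0, (vibration_mm_s - 4) / 6)   # 0 at 4 mm/s, 1 at 10 mm/s
    return min(1.0, 0.5 * heat + 0.5 * shake)

def raise_alert(asset, score):
    # in production this would open a ticket in the CMMS/ITSM system
    print(f"MAINTENANCE ALERT: {asset} failure risk {score:.0%} - schedule inspection")

readings = {"transformer-07": (92.0, 7.5), "transformer-12": (55.0, 2.1)}
for asset, (temp, vib) in readings.items():
    score = failure_score(temp, vib)
    if score > 0.5:
        raise_alert(asset, score)
```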

Traditional operational technology (OT) practices in the utilities industry primarily focus on monitoring and managing industrial processes and equipment. However, recent technological advancements offer new benefits, particularly in the context of real-time remote sensor data. These sensors provide authorities with detailed and reliable information about damaged locations or faulty equipment, which is especially crucial during emergencies when swift action can be a matter of life and death.

The introduction of hydrogen into the UK’s power infrastructure is an area being pioneered by some companies in collaboration with Wipro. This type of research is critically important, with the potential to change the ways in which power and heat are delivered to UK homes and businesses.

Ankit Sharma. Source: LinkedIn

Ankit Sharma, Account Executive at Wipro, said: “Recently, we brought together a number of industry bodies, Ofgem, and a lot of network players, retailers, and so on, all to talk about hydrogen and how we were going to help support [its] use, how we were going to integrate, how that would work, how we use that heat in all forms in the industry.”

“But it’s really important that we look at how to integrate [hydrogen] and other renewable energy sources – wind, solar, and so on – and reduce the reliance on the carbon-intensive power generation that we currently have to support our net zero commitments.”

As IT methods are applied to the management of OT environments in enterprise, including utility sectors, further possibilities are opening up thanks to data collection and digitisation in general.

“I think what utility companies have to now look at is how we have efficient workforce management as well,” said Mr Sharma. “That means how they are geared up for the challenge from a grid maintenance perspective, and also how they are looking out for the new skills that are coming within the market. How do you upskill workforces on that?”

For utility T&D and water companies specifically, the industrial IoT (IIoT) and the power of data lakes enable the shift from periodic or manual analysis to real-time monitoring of significant assets and processes, like transformers, pipelines and water treatment facilities, plus a host of other operational functions, from finance and accountancy to HR and logistics. These are just a few examples of how the incorporation of carefully chosen IT opens up more efficient and cost-effective work practices.

The future of utility companies relies on data, analytics and the IIoT. As conditions become more challenging, through growing energy demands and environmental concerns, embracing digital innovation is essential. With the integration of IT and OT and business goals that dictate technology choices, utility companies can adapt, thrive, and remain competitive in an evolving yet challenging landscape.

You can listen to the full podcast here, and check out the show notes for links and more information on Wipro’s work in the sector in the UK and abroad.

In the interim, visit Wipro’s website to find out more about its work within the utilities industry.

The flexible work revolution: Changing work culture in distribution, logistics and manufacturing
https://techhq.com/2024/02/the-flexible-work-revolution-changing-work-culture-in-distribution-logistics-and-manufacturing/ (12 February 2024)


The work landscape in distribution, logistics and manufacturing is undergoing a profound transformation driven by the evolving demands of a dynamic workforce. The question arises for these industries: Is flexible working feasible? Traditionally, these sectors require onsite employees for safety and operational reasons. However, a strong argument can be made that introducing flexible working is actually essential for the industry to thrive in the current labour market.

Understanding the changing workforce dynamics

The modern workforce is increasingly seeking job roles that offer flexibility and work-life balance. The shift is particularly pronounced among younger workers, who value autonomy and personal well-being alongside their professional commitments. Adapting to these changing expectations is not just beneficial but essential for industries facing labour shortages and high staff turnover rates. A report from Advanced shows that attracting and retaining talent is a top priority for the manufacturing (37%) and distribution & logistics (39%) sectors.

Flexible working. Source: Shutterstock

Recent research from the CIPD underscores the urgency of adopting flexible work practices. An estimated four million people have changed careers, and two million have left jobs in the last year due to a perceived lack of flexibility on the part of their employers. The CIPD warns that businesses may face a talent exodus if they fail to offer flexible working options. The report highlights that flexibility is critical to retain and attract staff, address the current skills shortage and foster inclusive workplaces.

Post-pandemic, 39 per cent of organisations offer flexible working from an employee’s first day, up from 36 per cent in 2021. However, nearly half of employers remain unaware of the legislation that will soon make the right to request flexible working immediate. The report also found significant unmet demand for more flexible hours arrangements, even as that demand continues to grow. The findings highlight the need for more education and action among employers to adopt and promote hybrid working practices.

Navigating skills shortages and global supply chain disruptions

The distribution, logistics and manufacturing sectors face a critical skills shortage, further complicated by recent global supply chain disruptions, notably in the Red Sea. Attacks by the Houthis, an Iran-backed Yemeni group, have led to significant shipping disruptions, with key players like the Mediterranean Shipping Company halting routes through this vital shipping corridor. The events have resulted in substantial delays and increased operational costs, affecting businesses reliant on just-in-time supply chains.

Redirecting shipping routes has led to longer transits, escalating fuel and insurance costs and increasing operational overheads. These changes have had a cascading effect, from manufacturers experiencing delays receiving raw materials to retailers like IKEA and Next facing product shortages.

Adaptability in work practices becomes more essential in response to these challenges. Embracing flexible working arrangements, supported by technology like comprehensive contract management software, enables businesses to reassess and adapt their supply chains, enhancing resilience against disruption.

The long-term implications of shipping disruptions may lead to strategic shifts in supply chains, such as increased onshoring and nearshoring. They also underscore the importance of a flexible approach and strong operational infrastructure to maintain efficiency amid changing global conditions.

The feasibility of flexible working

Despite the onsite nature of work in distribution, logistics and manufacturing, flexibility can be introduced in various ways. For instance, administrative tasks and certain training components can be conducted off-site or through digital platforms, reducing the need for constant onsite presence. Additionally, firms can offer a variety of shift options, including job-sharing and shift permutations, which cater to different lifestyles.

Protolabs research reveals that 56 per cent of manufacturing professionals believe flexible working enhances innovation, and only 39 per cent view a four-day working week negatively. Seven in ten would consider a four-day week if their suppliers did, indicating an industry openness to the model. However, the shift requires adapting business models, with 78 per cent acknowledging the need for new manufacturing operational strategies.

The study also emphasises the role of collaborative robots – or cobots – and automation to enhance productivity and creativity. More than half of the respondents believe cobots would improve employee productivity and idea generation, with many manufacturers already using or planning to use such technology. This suggests human-machine collaboration may support flexible working patterns and strengthen operational capabilities.

Flexible working. Source: Shutterstock

SaaS-based time and attendance solutions are key enablers in this transition, as they offer a range of functions that support flexible working arrangements. For example, self-service shift management allows employees to swap shifts inside defined boundaries, ensuring that skill gaps are not created. Workers highly value this level of autonomy, and empowerment in this way can significantly enhance job satisfaction.
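
A minimal sketch of the kind of rule such a system might apply when validating a swap – hypothetical data and logic, not Advanced's actual product – checks that every skill the shift requires remains covered:

```python
# Toy shift-swap check: approve a swap only if every skill the shift needs
# remains covered afterwards. Workers, skills and rules are all hypothetical.
SHIFT_REQUIREMENTS = {"night-12": {"forklift", "first_aid"}}
ROSTER = {"night-12": {"ana", "ben"}}
SKILLS = {
    "ana": {"forklift"},
    "ben": {"first_aid"},
    "cas": {"forklift", "first_aid"},
    "dev": {"forklift"},
}

def can_swap(shift, leaving, joining):
    proposed = (ROSTER[shift] - {leaving}) | {joining}
    covered = set().union(*(SKILLS[w] for w in proposed))
    return SHIFT_REQUIREMENTS[shift] <= covered

print(can_swap("night-12", "ben", "cas"))   # True: first aid is still covered
print(can_swap("night-12", "ben", "dev"))   # False: no first-aider would remain
```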

Maintaining security and operational control is a common concern with flexible working arrangements. Here, technology plays a vital role. Advanced solutions provide robust access control systems, ensuring that only authorised personnel are on-site at any given time. Real-time data and monitoring capabilities create a comprehensive overview of operations, giving leaders the peace of mind that their facilities are secure and functioning optimally, even with a more flexible workforce.

Offering flexible working arrangements can be a significant differentiator during a skills shortage. The approach positions a firm as one that understands and respects the needs of its workforce, attracting new talent and playing a crucial role in retaining existing skilled workers.

How to implement new working plans

The integration of flexible working in distribution, logistics and manufacturing is not only feasible but essential for the sustainability and growth of these sectors. It is not just a response to current workforce trends but a strategic move towards a more adaptable and resilient business model aided by a committed and loyal workforce.

“With the right software and support, it’s perfectly possible for distribution, logistics and manufacturing leaders to offer the flexibility key talent is looking for, without compromising on the bottom line or quality service.”

Adrian West, VP of Distribution, Logistics and Manufacturing, Advanced.

To learn more about how flexible working solutions and advanced technology can transform your business in the sector, connect with Advanced. Advanced has a proven track record of success, helping over 100,000 people annually get paid accurately and on time. It currently saves employees over 750,000 hours a year through automation and helps companies save millions through improved operational efficiencies. Discover how Advanced can help your business stay ahead in the competitive landscape by contacting a member of the expert team today.

A conversation with Dynatrace’s CTO
https://techhq.com/2024/02/dynatrace-cto-bernd-greifeneder-causal-ai-and-other-stuff/ (9 February 2024)


• Dynatrace can now deploy causal AI to deliver certainty of results.
• This fits a particular niche of need for enterprises that GenAI can’t deliver.
• It’s also delivering a carbon calculator that goes beyond standard, vague models.

From causal AI to harsh deletion: after a run of exciting announcements at Perform 2024, we spoke to Dynatrace’s CTO and co-founder, Bernd Greifeneder, to get some insight into the technology behind the observability platform.

As the “tech guy,” how do you approach the marketing side of things? How do you get across the importance of Dynatrace to those who don’t “get” the tech?

Right now we are on that journey – actually, this Perform is the first one explicitly messaging to executives. It’s worked out great; I’m getting fantastic feedback. We also ran breakout sessions with Q&As on this three-by-three matrix to drive innovation through topics like business analytics, cloud modernization and user experience.

Then, we have the cost optimization because every executive needs to fund something. I can explain ten ways to reduce tool sprawl alone with Dynatrace. Cloud cost coupled with carbon is obviously a big topic, and the third layer is risk mitigation.

No one can afford an outage, no one can afford a security breach – we help with both.

How do you sell causal AI?

Bernd Greifeneder presented Dynatrace’s new products on the mainstage at Perform 2024.

Executives have always asked me how to get to the next level of use cases. I think that’s another opportunity; in the past we were mostly focused on middle management. If we first give executives the value proposition, they can go down to the next level of scale, implementing the use cases they wanted.

The other aspect is extending to the left. It’s more than bridging development with middle management, because you can’t leave it just to developers. You still need DevOps and platform engineering to maintain consistency and think about the bigger picture. Otherwise it’s a disaster!

How has changing governance around data sovereignty affected Dynatrace clients – if at all?

[At Perform 2024, Bernd announced Dynatrace OpenPipeline, a single pipeline for petabyte-scale ingestion of data into the Dynatrace platform, fuelling secure and cost-effective analytics, AI, and automation – THQ.]

Well, we have lots of engagements on the data side – governance and privacy. For instance, with OpenPipeline it’s all about privacy because when customers just collect data it’s hard to avoid it being transported.

It’s best not to capture or store it, but in a production environment you have to. We can qualify out the data at our agent level and maintain interest in it throughout the pipeline. We have detection of what is sensitive data to ensure it isn’t stored – when it is, say if analytics require it to be, you have custom account names on the platform.

That means you can inform specific customers when an issue was found and fixed, but still have proper access control.
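
As a generic illustration of qualifying out sensitive data before storage – not Dynatrace's actual OpenPipeline implementation – an agent-side filter might mask known-sensitive fields at ingest:

```python
# Generic agent-side redaction sketch: mask known-sensitive values before a
# record ever leaves the collection point. Patterns are illustrative only;
# this is not Dynatrace's actual OpenPipeline implementation.
import re

PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def redact(record):
    for name, pattern in PATTERNS.items():
        record = pattern.sub(f"<{name}:redacted>", record)
    return record

log_line = "user jane.doe@example.com paid with 4111 1111 1111 1111"
print(redact(log_line))
# -> user <email:redacted> paid with <card:redacted>
```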

We also allow harsh deletion; the competition offers soft deletion only. The difference is that although soft deletion marks something as deleted, it’s still actually there.

Dynatrace’s hard deletion enables the highest standard of compliance in data privacy. Obviously, in the bigger scheme of Grail in the platform, we have lots of certifications from HIPAA and others on data governance and data privacy.

[Dynatrace has used AI on its platform for years; this year it’s adding a genAI assistant to the stack and introducing an AI observability platform for their customers – THQ.]

What makes your use of AI different from what’s already out there? How are you working to dispel mistrust?

Would you want to get into an autonomous car run by ChatGPT? Of course not, we don’t trust it. You never know what’s coming – and that’s exactly the issue. That’s why Dynatrace’s Davis hypermodal AI is a combination of predictive, causal and generative AI.

Generative AI is the latest addition to Davis, intended as an assistant for humans, not to drive automation. The issue is the indeterminism of GenAI: you never know what you’ll get, and you can’t repeat the same thing with it over and over. That’s why you can’t automate with it, or at least automate in the sense of driving a car.

What does it mean then for running operations? For a company, isn’t this like driving a car? It can’t go down, it can’t be insecure, it can’t be too risky. This is where causal AI is the exact opposite – deterministic – meaning Dynatrace’s Davis causal AI produces the same results over and over, if given the same prompts.

It’s based on actual facts. It’s about causation, not correlation – really inferring. In real time, a graph is created so you can clearly see dependencies.

For example, you can identify the database that had a leak and caused a password to be compromised and know for certain that a problem arose from this – that’s the precision only causal AI provides.

Generative AI might be able to identify a high probability that the database leak caused the issue, but it would also think maybe it came from that other source.

This is also why all the automation that Dynatrace does is based on such high-quality data. The key differentiator is the contextual analytics. We feed this high-quality, contextual data into Davis and causal AI helps drive standard automation so customers can run their production environments in a way that lets them sleep well.
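
To make the determinism point concrete, here is a generic sketch of root-cause inference over a known dependency graph, not Davis itself: given the same topology and the same set of unhealthy services, the walk returns the same root cause every run.

```python
# Generic deterministic root-cause walk over a known service dependency
# graph - not Dynatrace's Davis AI, just an illustration of why causal
# inference over real topology is repeatable. The data is invented.
DEPENDS_ON = {
    "checkout": ["payments", "catalog"],
    "payments": ["database"],
    "catalog": ["database"],
    "database": [],
}
UNHEALTHY = {"checkout", "payments", "database"}

def root_causes(service):
    """Deepest unhealthy dependencies reachable from an unhealthy service."""
    bad_deps = [d for d in DEPENDS_ON[service] if d in UNHEALTHY]
    if not bad_deps:
        return {service}     # nothing failing below it: this is a root cause
    return set().union(*(root_causes(d) for d in bad_deps))

print(root_causes("checkout"))   # always {'database'} for this graph
```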

Observability is another way of building that trust – your AI observability platform lets customers see where it’s implemented and where it isn’t working.

Yeah, customers are starting to implement in the hope that generative AI will solve problems for them. With a lot of it, no one really knows how helpful it is. We know from ChatGPT that there is some value there, but you need to observe it because you never know what it’s doing.

Because of its nondeterministic nature, you never know what it’s doing performance wise and cost wise, output wise.

What about the partnership with Lloyds? Where do you see that going?

Especially for Dynatrace, the topic of sustainability and FinOps go hand in hand and continue to rapidly grow. We’ve also implemented sophisticated algorithms to precisely calculate carbon, which is really challenging.

Here’s a story that demonstrates how challenging it is: enterprise companies need to fulfil stewardship requirements. To do so, they might hire another company that’s known in the market to help with carbon calculation.

But the way they do it is to apply a factor to the amount the enterprise spends with AWS or Google Cloud, say, and provide a lump sum of carbon emissions – how can you optimize that?

The result is totally inaccurate, too, because some companies negotiate better deals with hyperscalers; the money spent doesn’t exactly correlate to usage. You need deep observability to know where the key carbon consumption is, whether those areas truly need to be run the way they are.

We apply that to this detailed, very granular information of millions of monitored entities. With Lloyds, for example, optimization allowed a cut of 75 grams of carbon per user transaction, which ultimately adds up to more and more.
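
A toy comparison shows why spend-based factors mislead; the numbers below are invented, but the shape of the problem is as described: two workloads with identical bills can have very different usage.

```python
# Why spend-based carbon factors mislead: two workloads with identical cloud
# bills can draw very different amounts of power. All figures are invented.
SPEND_FACTOR_KG_PER_DOLLAR = 0.5    # the 'lump sum' approach
GRID_KG_PER_KWH = 0.25              # usage-based grid intensity

workloads = {
    # name: (monthly spend in $, actual kWh) - discounts break the link
    "team-a, list price": (10_000, 12_000),
    "team-b, discounted": (10_000, 30_000),
}

for name, (spend, kwh) in workloads.items():
    by_spend = spend * SPEND_FACTOR_KG_PER_DOLLAR
    by_usage = kwh * GRID_KG_PER_KWH
    print(f"{name}: spend-based {by_spend:,.0f} kg vs usage-based {by_usage:,.0f} kg")
```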

Our full coverage of Dynatrace Perform is here, and in the next part of this article, you can read a conversation with Dynatrace VP of marketing Stefan Greifeneder.

Top three manufacturing challenges for 2024: Confronting data security, regulation, and talent shortages head-on
https://techhq.com/2024/02/top-three-manufacturing-challenges-for-2024-confronting-data-security-regulation-and-talent-shortages-head-on/ (1 February 2024)


It was a busy 2023 for manufacturers, with rising fuel and energy costs, high inflation, and supply chain issues stemming from geopolitical events. This all culminated in the J.P. Morgan Global Manufacturing PMI remaining below the 50.0 contractionary mark for the sixteenth consecutive month in December 2023. However, the new year is in full swing, and looking at industry priorities is crucial when making strategic, future-proof decisions. TechHQ teamed up with experts at DartPoints, a leading provider of colocation, cloud, and cybersecurity, to take a deep dive into the top three issues for manufacturers in the next year.

  1. Rising cybersecurity threats

Manufacturing was the top industry affected by ransomware in 2023, and attacks of this kind have cost $46 billion in losses in the last five years. Unfortunately, manufacturers have targets painted on their backs because of the critical role they play in global supply chains and infrastructure.

Manufacturing industry. Source: DartPoints

But bad actors are not the only threat to industry safety; significant natural disasters are becoming increasingly frequent as climate change worsens. Power outages, often brought about by rain, flooding, fires, tornadoes, and earthquakes, cause 35 percent of unexpected downtime globally. These cost an average of $1,467 per minute, or roughly $88,000 per hour. Data security will, therefore, be a huge area of focus for the coming year, including implementing robust business continuity and disaster recovery (BCDR) plans.

  2. Navigating regulatory compliance

Manufacturing businesses are also concerned about complying with increasingly stringent industry regulations. Depending on the sector and scale of a company, regulations could include HIPAA, PCI, FISMA, and other geographically specific data privacy acts.

Last October, President Biden signed an executive order on safer AI, which will likely influence further regulation in the space. Manufacturers looking to incorporate AI technology in operations must be aware of any regulatory developments.

Furthermore, reporting on IT operations to comply with environmental standards, like the Energy Policy Act, is becoming more significant with ensuing extra data demands. This will affect decisions around green IT practices and energy-efficient data centers.

  3. Ongoing staffing challenges and the talent gap

Many manufacturing businesses are keen to continue their digital transformation journey in 2024. This could include projects around cloud migration and management, AI and machine learning implementation, and the integration of 5G-enabled computing. These technologies are pivotal in achieving operational efficiency, scalability, and innovation. However, the IT talent gap is a major roadblock in managing an IT overhaul that could significantly disrupt business operations.

The number of suitably skilled employees cannot keep up with rapid technological developments. According to the International Monetary Fund, the shortage of qualified tech professionals will lead to 85 million vacancies in 2030, potentially causing over $8 trillion in lost revenue. Without the necessary specialist staff, manufacturers may struggle to reach their digital goals smoothly and within budget.

Leveraging cloud services: How a solutions provider drives innovation

Cloud service providers (CSPs) and data centers are pivotal in fortifying manufacturers against cyber attacks and data loss. They can help companies with disaster recovery, backup, business continuity planning, and cloud migration, which together mitigate the effects of cybersecurity threats. Leveraging AI-powered threat detection, CSPs and data centers have the capability and experience to identify and respond to potential risks in real time, offering manufacturers a proactive defense strategy. A thorough security assessment during onboarding helps optimize the use of security tools and addresses vulnerabilities in a business’s IT stack.

Manufacturing industry. Source: DartPoints

Offering manufacturers a comprehensive range of security solutions, including immutable backup copies, data storage, and continuous monitoring, contributes to safeguarding against external threats and maintaining data integrity. Such measures also aid manufacturers in achieving compliance and evading penalties. CSPs and data centers can offer data backup and disaster recovery services to ensure data availability, thus fulfilling regulatory requirements. Furthermore, CSPs and data centers can undergo routine compliance audits and acquire pertinent certifications, such as SOC 2 or ISO 27001, to showcase adherence to stringent standards.

Having comprehensive incident response processes in place ensures any security incident is identified promptly and reported to the appropriate authorities – a requirement of data protection regulations like PCI and HIPAA.

Outsourcing services to CSPs or data centers enables manufacturers to alleviate frustrations around staffing issues. These technical experts handle digital transformation tasks daily, offering specialized support and enabling the business to concentrate on core operations. This approach relieves the burden of recruiting, training, and retaining in-house IT personnel, providing a solution that shifts expenditure from CAPEX to OPEX. Additionally, it ensures reliable, expert management of critical infrastructure and services.

DartPoints – A manufacturing solution partner

In the manufacturing sector, where cybersecurity, regulation compliance, and staffing issues pose significant challenges, DartPoints emerges as a trusted ally, offering tailored solutions to address these pressing concerns.

Cybersecurity: Strengthening digital defenses

Recognizing the critical importance of safeguarding manufacturing operations against cyber threats, DartPoints employs a multi-layered approach to cybersecurity. Through continuous monitoring and robust security protocols, it provides a comprehensive shield to protect sensitive data. With an impressive uptime SLA of 99.999 percent and low-latency connectivity, DartPoints ensures unparalleled performance, instilling confidence amid evolving cyber risks.

Regulation and compliance: Navigating regulatory terrain

Adhering to industry regulations is crucial for manufacturing success. DartPoints’ commitment to regulatory compliance is evident through its adherence to relevant governance standards and rigorous annual audits. By partnering with DartPoints, manufacturers can ensure their operations meet regulatory requirements, mitigating the risk of penalties and disruptions.

Manufacturing industry. Source: DartPoints

Addressing staffing challenges: Your extended team

The shortage of skilled IT professionals presents a significant obstacle for manufacturers. DartPoints serves as an extension of your team, offering expertise, experience, and a customer-centric approach to fill staffing gaps. With DartPoints as your partner, you gain access to a dedicated team of seasoned professionals, alleviating the burden on internal resources and ensuring seamless IT operations.

DartPoints transcends being a mere service provider; it is your strategic partner, committed to empowering manufacturers to navigate the complexities of cybersecurity, compliance, and staffing with confidence and ease.

Discover how DartPoints can help your manufacturing business navigate 2024 smoothly by visiting its website or speaking to one of the team today.

The post Top three manufacturing challenges for 2024: Confronting data security, regulation, and talent shortages head-on appeared first on TechHQ.

Biden weighs blocking China’s access to US cloud tech, fearing AI advancement https://techhq.com/2024/01/us-cloud-control-biden-eyes-blocking-china-ai-access/ Tue, 30 Jan 2024 15:00:58 +0000 https://techhq.com/?p=231735

  • Raimondo warns against unwanted access for China to US cloud technology to build AI.
  • The Secretary of Commerce is acting to block use of US tech for AI by China due to “security concerns.”
  • The move, impacting players like Amazon and Microsoft, is anticipated to escalate tech tensions with China.

The long-standing rivalry between the US and China has taken on many forms over the last decade. The intensifying competition reflects concerns over economic supremacy and national security, shaping the dynamics of a burgeoning tech war. Last year, the battleground extended into the development of AI; this year, the US has signaled its desire to control access to its cloud computing services.

Recent proposals suggest stringent measures to curb China’s access to US cloud computing firms, fueled by concerns over the potential exploitation of American technology for AI advancement. In a recent interview, US Secretary of Commerce Gina Raimondo emphasized the need to prevent non-state actors and China from utilizing American cloud infrastructure to train their AI models.

“We’re beginning the process of requiring US cloud companies to tell us every time a non-US entity uses their cloud to train a large language model,” Raimondo said at an event on January 27. Raimondo, however, did not name any countries or firms about which she was particularly concerned. Still, the maneuver is anticipated to intensify the technological trade war between the US and China, and signify a notable step toward the politicization of cloud provision.

The focal point of this battle lies in recognizing that controlling access to cloud computing is equivalent to safeguarding national interests. Raimondo draws a parallel with the control already exerted through export restrictions on chips, which are integral to American cloud data centers. As the US strives to maintain technological supremacy, closing avenues for potential malicious activity becomes imperative.

The proposal therefore explicitly mandates that firms like Amazon and Google gather, store, and scrutinize customer data, imposing obligations akin to the stringent “know-your-customer” regulations that shape the financial sector. Meanwhile, China has been aggressively pursuing AI development, seeking to establish itself as a global leader in the field.
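
To illustrate the kind of screening such a rule implies, the toy sketch below flags large training workloads from non-US customers for compliance review. Every field, threshold, and the reporting rule itself are hypothetical; the actual proposal’s criteria have not been published in this form.

```python
# A toy sketch of "know-your-customer" style screening for cloud
# training workloads. All fields and thresholds are hypothetical.
from dataclasses import dataclass

REPORTING_GPU_HOURS = 50_000  # assumed threshold for "large model" runs

@dataclass
class TrainingJob:
    customer_country: str
    gpu_hours: float
    workload_type: str  # e.g. "llm-training", "inference"

def requires_report(job: TrainingJob) -> bool:
    """Return True if the job would trigger a compliance report."""
    return (
        job.customer_country != "US"
        and job.workload_type == "llm-training"
        and job.gpu_hours >= REPORTING_GPU_HOURS
    )

print(requires_report(TrainingJob("CN", 120_000, "llm-training")))  # True
print(requires_report(TrainingJob("DE", 1_000, "inference")))       # False
```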

The US concerns stem from the dual-use nature of AI technologies, which can have both civilian and military applications. The fear is that China’s advancements in AI could potentially be leveraged for strategic military purposes, posing a direct challenge to US national security.

Of AI, cloud computing, and the US-China tech war

China’s Premier Li Qiang (R) speaks with US Commerce Secretary Gina Raimondo during their meeting at the Great Hall of the People in Beijing on August 29, 2023. (Photo by Andy Wong/POOL/AFP).

Although the US broadened its chip controls in October, extending them to Chinese firms in more than 40 nations, a gap remains. That is why it is paramount for the US to address how Chinese companies can still leverage chip capabilities through the cloud. Cloud technology has become the backbone of modern businesses and governments, making it a critical asset in the ongoing tech war.

From start to finish, cloud computing is inherently political, Trey Herr, director of cyber statecraft at the Atlantic Council, told Raconteur. He said that its reliance on extensive physical infrastructure tied to specific jurisdictions makes it susceptible to local politics, adding that conversations about cloud security inevitably take on political dimensions.

In October 2023, Biden directed the US Department of Commerce to mandate disclosures, aiming to uncover foreign actors deploying AI for cyber-mischief. Now, the Commerce Department, building on stringent semiconductor restrictions for China, is exploring the idea of regulating the cloud through export controls. Raimondo said the concern is that Chinese firms could gain computing power via cloud giants like Amazon, Microsoft, and Google.

“We want to make sure we shut down every avenue that the Chinese could have to get access to our models or to train their models,” she said in an interview with Bloomberg last month. In short, China’s strides in AI and cutting-edge technologies are a paramount worry for the administration. After all, despite Washington’s efforts to curtail China’s progress through chip export restrictions and sanctions on Chinese firms, the nation’s tech giants resiliently achieve substantial breakthroughs, challenging the effectiveness of US constraints.

Nevertheless, how to regulate such activities in the US remains an open question, because cloud services, which involve no transfer of physical goods, fall outside the traditional scope of export controls. Thea Kendler, assistant secretary for export administration, mentioned the potential need for additional authority in this space during discussions with lawmakers last month.

Addressing further loopholes, the Commerce Department also plans to conduct surveys on companies developing large language models for their safety tests, as mentioned by Raimondo on Friday. However, specific details about the survey requests were not disclosed.

What are cloud players saying?

As with previous export controls, US cloud providers fear that limitations on their interactions with international customers, lacking reciprocal measures from allied nations, may put American firms at a disadvantage. However, Raimondo said that comments on the proposed rule are welcome until April 29 as the US seeks input before finalizing the regulation.

What is certain is that the cloud will persist as an arena for trade war extensions and geopolitical maneuvers. Nevertheless, this tech war has broader implications for the global tech ecosystem. It prompts questions about data sovereignty, privacy, and the geopolitical alignment of technological alliances. As the US seeks to tighten its grip on the flow of technology, China is compelled to find alternative routes to sustain its AI ambitions.

The outcome will shape the future trajectory of technological innovation, with ramifications extending far beyond cloud computing and AI development. 

The post Biden weighs blocking China’s access to US cloud tech, fearing AI advancement appeared first on TechHQ.

Google’s first data center in the UK: a billion-dollar tech investment https://techhq.com/2024/01/google-billion-dollar-uk-data-center-unveiled/ Mon, 22 Jan 2024 15:00:00 +0000 https://techhq.com/?p=231319

  • The data center will be the first to be operated by Google in the UK.
  • Google’s 2022 deal with ENGIE adds 100MW wind energy.
  • The aim is for 90% carbon-free UK operations by 2025.

In the ever-evolving landscape of cloud computing, Google Cloud is a formidable player, shaping the global data center market with its leading solutions and heavyweight presence. Google Cloud’s commitment to expanding its global footprint is exemplified by its recent announcement of a US$1 billion investment in a new data center in Waltham Cross, Hertfordshire, UK. 

The move not only underscores the company’s dedication to meeting the needs of its European customer base, but also aligns with the UK government’s vision of fostering technological leadership on the global stage. One of the critical pillars of Google Cloud’s presence in the UK is its substantial investment in cutting-edge data infrastructure, and the upcoming data center will be Google’s first in the country.

Illustration of Google’s new UK data centre in Waltham Cross, Hertfordshire. The 33-acre site will create construction and technical jobs for the local community. Source: Google.

“As more individuals embrace the opportunities of the digital economy and AI-driven technologies enhance productivity, creativity, health, and scientific advancements, investing in the necessary technical infrastructure becomes crucial,” Debbie Weinstein, VP of Google and managing director of Google UK & Ireland, said in a statement last week.

In short, this investment will provide vital computing capacity, supporting AI innovation and ensuring dependable digital services for Google Cloud customers and users in the UK and beyond.

Google already operates data centers in various European locations, including the Netherlands, Denmark, Finland, Belgium, and Ireland, where its European headquarters are situated. The company already has a workforce of over 7,000 people in Britain.

Google Cloud’s impact extends far beyond physical infrastructure, though. The company’s cloud services have become integral to businesses across various sectors in the UK. From startups to enterprises, organizations are using Google Cloud’s scalable and flexible solutions to drive efficiency, enhance collaboration, and accelerate innovation.

The comprehensive nature of Google Cloud’s offerings, including infrastructure as a service (IaaS), platform as a service (PaaS), and software as a service (SaaS), ensures that it caters to the diverse needs of the UK’s business landscape.

That said, the investment in Google’s Waltham Cross data center is part of the company’s ongoing commitment to the UK. It follows other significant investments, such as the US$1 billion acquisition of a Central Saint Giles office in 2022, a development in King’s Cross, and the launch of the Accessibility Discovery Centre, fostering accessible tech across the UK.

“Looking beyond our office spaces, we’re connecting nations through projects like the Grace Hopper subsea cable, linking the UK with the United States and Spain,” Weinstein noted.

“In 2021, we expanded the Google Digital Garage training program with a new AI-focused curriculum, ensuring more Brits can harness the opportunities presented by this transformative technology,” Weinstein concluded. 

Google is investing US$1 billion in a new UK data center to meet rising service demand, supporting Prime Minister Rishi Sunak’s tech leadership ambitions. Source: Google.

24/7 Carbon-free energy by 2030

Google Cloud’s commitment to sustainability also aligns seamlessly with the UK’s environmental goals. The company has been at the forefront of implementing green practices in its data centers, emphasizing energy efficiency and carbon neutrality. “As a pioneer in computing infrastructure, Google’s data centers are some of the most efficient in the world. We’ve set out our ambitious goal to run all of our data centers and campuses on carbon-free energy (CFE), every hour of every day by 2030,” it said.

This aligns with the UK’s ambitious targets to reduce carbon emissions, creating a synergy beyond technological innovation. Google forged a partnership with ENGIE for offshore wind energy from the Moray West wind farm in Scotland, adding 100 MW to the grid and propelling its UK operations towards 90% carbon-free energy by 2025. 

Beyond that, the tech giant said it is delving into groundbreaking solutions, exploring the potential of harnessing data center heat for off-site recovery and benefiting local communities by sharing warmth with nearby homes and businesses.

The post Google’s first data center in the UK: a billion-dollar tech investment appeared first on TechHQ.

Feel the road: virtual sensor stack gains traction https://techhq.com/2023/12/feel-the-road-virtual-sensor-stack-gains-traction/ Tue, 19 Dec 2023 15:47:37 +0000 https://techhq.com/?p=230815


It’s said that elite racing drivers can feel the road through their bodies. Pressed into their seat, hands on the wheel, they are able to detect fine details of the track below and set the steering and speed to match the surface conditions. However, it takes talent, years of experience, and a race-engineered vehicle to achieve this.

In the meantime, there are over a billion cars on the road. And while some of those motorists may think that they can feel the road like a racing driver, accident rates suggest otherwise. Plus, adding autonomous vehicles to the mix remains a work in progress and raises a number of safety concerns.

Wouldn’t it be good if we could fix all of this? And, according to automotive software firm Tactile Mobility, we can. What’s more, the raw inputs are already there thanks to electronic hardware that’s been part of vehicle designs for years.

“There’s a wealth of data that exists in every vehicle,” Yagil Tzur – VP of Product at Tactile Mobility – told TechHQ. “Every car has a CAN network, sensors, and electronic control units (ECUs).”

Model calibration: sample vehicles are driven on a proving ground to dial-in the analytic capabilities of the automotive software.

To turn those inputs into insights, the firm uses machine learning to create what is, in effect, a virtual sensor stack capable of deciphering grip levels, steering health, and other valuable vehicle performance characteristics.

The process begins by taking cars to a proving ground where they can be driven on a wide range of road conditions at different tyre pressures and set up to mimic different numbers of passengers, for example.

Model calibration then weights the observed behaviour against readings that are available on the CAN bus such as wheel speed, engine torque, yaw rate, and many other signals.
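
Tactile Mobility has not published its models, but the general shape of the approach, learning a mapping from CAN bus signals to a ground-truth measurement collected on the proving ground, can be sketched as follows. The data, features, and model choice here are entirely synthetic assumptions, not the company’s actual method.

```python
# A hedged sketch of "virtual sensor" calibration: fit a model mapping
# CAN bus signals to grip measured on a proving ground. Synthetic data.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(seed=7)
n = 2_000

# Signals readable from the CAN bus during calibration runs
wheel_speed = rng.uniform(0, 40, n)     # m/s
engine_torque = rng.uniform(0, 350, n)  # Nm
yaw_rate = rng.normal(0, 0.3, n)        # rad/s
X = np.column_stack([wheel_speed, engine_torque, yaw_rate])

# Ground-truth grip coefficient from instrumented test equipment
# (a made-up relationship, purely for illustration)
grip = (0.9 - 0.002 * engine_torque + 0.1 * np.tanh(yaw_rate)
        + rng.normal(0, 0.02, n))

model = GradientBoostingRegressor().fit(X, grip)

# In production the calibrated model would run against live signals
live = np.array([[25.0, 180.0, 0.05]])
print(f"estimated grip: {model.predict(live)[0]:.2f}")
```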

When complete, the automotive software can estimate the wheel grip and provide real time updates on tyre wear – all without having to fit any additional hardware. And having a virtual sensor stack designed to feel the road has got car manufacturers and suppliers interested.

Investors in Tactile Mobility include Porsche Ventures and Goodyear Ventures, which both came on board in 2021 as part of a US$27 million series C funding round.

Why EVs are tough on tyres

Governments around the world are incentivising drivers to purchase battery-powered electric vehicles (EVs). And while that will improve air quality on city streets, it makes life a whole lot tougher for the rubber that connects cars to the road – due to the increased weight and high torque of EVs.

Tyre performance is critical to vehicle safety, and the importance of those four contact patches is often overlooked by regular drivers – so much so that the US Department of Transportation mandated that automobiles be fitted with tyre pressure monitoring systems (TPMS), as part of the TREAD Act, in the early 2000s.


Similar legislation was introduced in other countries, and today all cars have TPMS sensors. However, those systems by no means feel the road. Instead, they simply trigger a warning light to indicate low tyre pressure once the value passes a crude threshold.

Virtual sensors provide a much more granular view of tyre performance, giving drivers not just a warning that something is wrong, but continuously updated lifetime estimates, so that failures can be prevented in the first place and timely replacements can be made.
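
The difference can be sketched in a few lines: a threshold alarm stays silent until the limit is crossed, while a trend fit over the same readings yields a replacement forecast. The tread depths and legal minimum below are illustrative figures.

```python
# Threshold alarm vs. trend-based lifetime estimate (illustrative data)
import numpy as np

km = np.array([0, 5_000, 10_000, 15_000, 20_000])
tread_mm = np.array([8.0, 7.4, 6.9, 6.3, 5.8])  # estimated tread depth

# Threshold-style check: only fires once it is already too late
LEGAL_MIN_MM = 1.6
print("warning light on:", bool(tread_mm[-1] <= LEGAL_MIN_MM))  # False

# Trend-based estimate: fit the wear rate, forecast replacement mileage
slope, intercept = np.polyfit(km, tread_mm, 1)
replace_at_km = (LEGAL_MIN_MM - intercept) / slope
print(f"schedule replacement around {replace_at_km:,.0f} km")
```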

Tzur sums up the gains of having Tactile Mobility data as being like TPMS on steroids. What’s more, having real world information on tap could help tyre companies further improve their designs – for example, to cater for the specific demands of EVs.

Tyre developers already spend big sums on research and development, but feedback from virtual sensors that feel the road could enrich the amount of information that designers have to work with.

“99% of testing is performed on brand new tyres,” Tzur points out. “Companies have less information on the performance of mid-worn tyres.”

Wisdom of the cloud

There are two sides to Tactile Mobility’s business. First is the onboard virtual sensing, which presents its data to other ECUs on the CAN bus and helps to finesse advanced driver assistance systems (ADAS) such as cruise control. But there are also cloud-based solutions enabled by the feel-the-road technology.

For example, as the install base increases (Tactile Mobility is working with various automotive OEMs in the US and Europe – and has publicly mentioned Ford Motor Company as a partner; Porsche is also listed on its website), some very interesting crowd-sourcing opportunities arise.

The setup is well-suited to characterising the quality of the road surface and sharing this information with local authorities – to schedule maintenance and fill potholes – as well as other interested parties, such as insurance firms.

It’s possible to use the technology to generate a road friction map, which could advise on speed adjustments to reduce the chance of accidents. Potentially, insurance premiums could take into account the quality of the road surface and data would help accident investigations.
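
One way such a friction map could be assembled is to bin per-vehicle friction estimates into coarse grid cells and average them, as in the sketch below; the grid size, report format, and advisory threshold are assumptions rather than details of any deployed system.

```python
# A sketch of crowd-sourced friction mapping: bin per-vehicle friction
# estimates into grid cells and average. All parameters are assumed.
from collections import defaultdict
from statistics import mean

CELL_DEG = 0.01  # roughly 1 km cells at mid latitudes

def cell(lat: float, lon: float) -> tuple:
    return (round(lat / CELL_DEG), round(lon / CELL_DEG))

reports = [  # (lat, lon, estimated friction coefficient)
    (51.501, -0.142, 0.82),
    (51.501, -0.141, 0.78),
    (51.509, -0.135, 0.35),  # wet or icy patch?
]

grid = defaultdict(list)
for lat, lon, mu in reports:
    grid[cell(lat, lon)].append(mu)

friction_map = {c: mean(vals) for c, vals in grid.items()}
low_grip = {c: mu for c, mu in friction_map.items() if mu < 0.4}
print(low_grip)  # cells where a speed advisory might be issued
```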

Also, should a vehicle experience a driving incident, that information could be shared to warn nearby motorists. “If there’s an adverse event, the data can be sent to the cloud at high priority to communicate with other vehicles,” said Tzur.

Once vehicle-to-vehicle (V2V) communications become more commonplace, the adverse event alerting process could even happen directly to give drivers (and autonomous vehicles) additional reaction time.

Tactile Mobility has its origins in fleet management, where a forerunner of its systems helped to guide road transport operators on fuel management. And performance coaching feedback could also cross over into its feel-the-road virtual sensing stack – for example, to help drivers prolong tyre life on their vehicles.

Navigating self-driving roadblocks

It’s possible, too, that self-driving cars, which have attracted a run of bad press lately, may also benefit from better road surface data enabled by machine learning methods such as those deployed by Tactile Mobility.

For example, autonomous vehicles can be spoofed by lines painted on the road, but virtual sensors capable of feeling the road ahead may be able to better distinguish between legitimate markings and rogue ones.

The post Feel the road: virtual sensor stack gains traction appeared first on TechHQ.
