Tiny VLMs bring AI text plus image vision to the edge


Large language models capable of providing statistically likely answers to written text prompts are transforming knowledge work. AI algorithms enable tasks to be performed faster and reduce the need for specialist skills, which can be unnerving to witness for highly trained and experienced staff. But there’s magic to how neural networks can level the playing field and help users to understand their world. And a great example of this is the rise of vision language models (VLMs), which co-locate image features alongside multi-dimensional word embeddings.

So-called multimodal AI – neural networks capable of generating results based on multiple input types such as text and images – adds to the number of ways that deep learning can be used. OpenAI’s GPT-4 with Vision will answer questions about images – for example, if the user submits a photo or web link. Given a picture of the contents of a fridge, it can suggest ideas for dinner.

You can picture VLMs as image recognition on steroids. Multimodal AI models can do much more than just recognize features in a scene. Algorithms can gather insights from security footage, prompt maintenance crews to take action, analyse customer behaviour, and much more besides (as our test results show – scroll down to see the responses).

VLMs can provide a second pair of eyes that never get tired. However, these superpowers come at a cost and historically have required heavy lifting to happen behind the scenes in the cloud. Generative AI algorithms have gorged themselves on the contents of the internet in the name of fine-tuning their model weights.

But not all of those parameters are created equal, and developers are finding ways to trim AI models down to a more memory-efficient size for specific tasks. It means that algorithms can run locally at the edge and suit fully remote use cases such as drone inspections – for example, where connectivity cannot be guaranteed.

One of a new breed of tiny VLMs capable of running at the edge is Moondream, which is advertised as being able to run anywhere, even on mobile devices and Raspberry Pis. The code to run the tiny AI text and vision model is available on GitHub, and there’s even a live demo environment so that you can jump straight in and put the VLM to the test.
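
For readers who want to go beyond the hosted demo, a minimal sketch along the following lines should be enough to query the model locally. It assumes the published Hugging Face checkpoint and its encode_image/answer_question helpers – details worth double-checking against the GitHub README, as the interface may change between releases.

```python
# Minimal sketch: querying Moondream2 locally via Hugging Face transformers.
# Assumes the vikhyatk/moondream2 checkpoint and its custom encode_image /
# answer_question helpers (loaded with trust_remote_code) - verify against
# the project's README, as the interface may have changed.
from transformers import AutoModelForCausalLM, AutoTokenizer
from PIL import Image

model_id = "vikhyatk/moondream2"
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

image = Image.open("harbour.jpg")      # any local photo
encoded = model.encode_image(image)    # image features for the VLM
answer = model.answer_question(encoded, "Is it safe to walk straight ahead?", tokenizer)
print(answer)
```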

Tiny VLM test results

To discover what VLMs can do and highlight potential use cases for local AI models capable of answering questions about images, TechHQ submitted a variety of photos to the Moondream2 live demo hosted on Hugging Face.

Image 1

Harbour scene for testing AI vision language model

Harbour scene. Photo: JT.

Input: Is it safe to walk straight ahead?

Response: No, it is not safe to walk straight ahead. There is a dock with a red life ring attached to it, and a few boats are visible in the water. You need to be cautious and look out for any obstacles or hazards such as the dock, boats, or other watercraft, before walking forward.

Image 2

Stairway for testing AI vision language model

Stairway. Photo: JT.

Input: Is there something for me to hold on to?

Response: Yes, there is a railing on the stairs, and you can hold on to it while climbing to the top. The railing is attached to the wall, providing support for those ascending or descending the steps.

Input: Can you see an open window?

Response: Yes, there is an open window on the building in the background, and the sun is shining brightly illuminating the scene.

Image 3

Pedestrian crossing for testing AI vision language model

Pedestrian crossing. Photo: JT.

Input: Is it safe for me to use the pedestrian crossing?

Response: No, it is not safe for you to use the pedestrian crossing, as the image shows a black car approaching the crossing. The car is likely not paying attention to the red light, which indicates that it’s not safe for pedestrians to cross at that moment.

Image 4

Road signs for testing AI vision language model

Road signs. Photo: JT.

Input: Should I go left or right to get to the airport?

Response: Right.

Input: Which way to the marina?

Response: Left.

Image 5

Bus stop by the side of the road for testing AI vision language model

Bus stop. Photo: JT.

Input: Is there anywhere to catch a bus?

Response: Yes, there is a bus stop in the image, with benches for passengers to sit on and a covered area. A traffic cone is also present, and a bridge can be seen in the background.

Input: Where can I shelter if it rains?

Response: You can shelter at the bus stop if it rains, as there is a bus shelter nearby with benches and a roof providing protection from the elements.

Image 6

Image of a luxury yacht for testing AI vision language model

Luxury yacht. Photo: JT.

Input: What can you see?

Response: A helicopter is positioned on a ship, with a metal fence in the foreground. The background features a variety of elements, including buildings, trees, a road, a water body, and the sky.

Image 7

Photo of New Cut Coffee shop in Bristol, UK, for testing AI vision language model.

Coffee shop. Photo: JT.

Input: Is there anywhere to get a warm drink around here?

Response: Yes, there is a coffee shop nearby. The image shows a red building with the words “New Cut Coffee” on it, and a black fence. A black car is parked in the lot, and a white building is visible in the background. The sky is blue overhead.

Input: Does the coffee shop sell sandwiches?

Response: Yes.

Input: What else is there to eat at the coffee shop?

Response: In addition to coffee, the New Cut Coffee shop offers sandwiches and pastries. The image also features a building and a vehicle in the background.

Image 8

Web cam image for testing AI vision language model.

Web cam mode: the tiny VLM live demo can accept images via a connected webcam if browser permissions are given. Screengrab: JT.

Input: Can you see any ethernet ports?

Response: Yes, there are two visible ports on the device that the man is holding.

Verdict on tiny VLM testing

It’s hard not to be impressed with the performance of Moondream2 – a VLM that’s initialized with weights from SigLIP and Phi 1.5, and then trained using synthetic data generated by Mixtral.

Evaluating the responses generated from TechHQ’s test sequence of images and text inputs, it’s clear that Moondream2 gets more right than it gets wrong, and is capable of noticing fine details that weary human eyes may miss. Also, as mentioned, the AI model’s small footprint is another winning feature – paving the way for inference to take place on edge devices that could include drones or robots.

VLMs could turn out to be incredibly useful in helping robots orient themselves in the world, and being able to run algorithms locally eliminates the need for network access – or at least provides redundancy in the case of outages.

A smart idea for smart glasses

Alongside its launch of the Quest 3 VR headset, Meta also announced a collaboration with Ray-Ban to produce an updated line of smart glasses. The product, which features Qualcomm’s weight-distributed Snapdragon AR platform, has some local voice control capabilities – for example, to instruct the spectacles to take images using the built-in ultra-wide 12MP camera. Users simply have to say, “Hey Meta, take a picture.”

Being able to couple those photos with a VLM running locally would turn the Ray-Ban smart glasses into a much more compelling product, and could bring scenes to life for visually impaired wearers without the need for pairing with a smartphone.

Vision assistance powered by edge-compatible VLMs could dramatically enhance the capabilities of digital camera-equipped devices. As the webcam image in our test sequence highlights, there’s the potential for algorithms to help with maintenance and repair tasks, distributing knowledge and expertise across the globe.

AI is doing for knowledge work what robotics has done for manufacturing, and it’s just the beginning.

 

Future-proofing utility companies: The role of data, analytics and IoT


Modern utility companies face an imperative to transform their operations as they grapple with increasing energy demands, environmental concerns and the evolving technological expectations of consumers. There’s also a raft of legislation that needs to be observed, plus an omnipresent need to modernise infrastructure. Finally, geopolitical events, like the Russian invasion of Ukraine, and economic uncertainties are introducing unpredictability into energy supply chains.

Data, analytics and the Internet of Things (IoT) can be harnessed to enhance operational efficiency and deliver customers a more sustainable and responsive service. The adoption of these technologies by utility companies can also improve safety, enhance the employee experience and uncover new revenue streams.

Prioritisation is key in this situation; after all, funds are not infinite. But utilities also have the potential to be massive data gatherers, from their plant and infrastructure, their delivery methods, consumers, end-users and business operations.

Utility operations are changing rapidly.

Source: Shutterstock

In some cases, there needs to be a shift in mindset with regard to the information gathered and used by utility companies. TechHQ recently spoke to Wipro – a company in multiple partnerships with UK utilities – about the challenges and approaches their consultants and architects see daily.

The company’s Director of Digital Transformation, Sampathkumaran Hariharan, said: “The data gathered from the field is impacted by the kind of equipment and sensors that have been deployed over a long period of time. Many of the utilities have not refreshed the data-gathering mechanisms from the field. But even from an operational point of view, the information about the customer is also maintained in multiple systems.”

Sampathkumaran Hariharan. Source: LinkedIn

“The customer has not been the centre of data acquisition over a long period of time. Utilities have traditionally been an asset-centric organisation. The emphasis on being a customer-centric organisation is heavily emphasised by the regulator and the frameworks the regulator has. So there is a transitioning that is happening within the industry.”

All over the UK, water companies are becoming more responsive to their customers’ demands for less water wastage from existing pipe infrastructure. Wipro’s use of advanced machine learning algorithms helps one UK water utility monitor 600 DMAs (District Metered Areas) to identify the zones where leaks are a substantial issue. IoT devices monitor flow rate and pressure, and with data on the geometrical structure of the network (routes, pipe diameters, existing gate valves, etc.), software can predict pipe burst events and detect major leakage anomalies at the utility’s pilot sites.

Proactive maintenance and infrastructure replacement mean less water is lost to leaks, and critically, water supplies are better ensured to customers at home and in the workplace.

Further up the chain, the combination of OT (operational technology) and IT helps translate collated data and convert it for use by distribution network operators. This amalgamation of new technology with legacy infrastructure combines localised DCS (distributed control system) and geographically widespread SCADA (supervisory control and data acquisition) data to create next-generation ADMS (advanced distribution management system) solutions.  These solutions are more suited to the highly flexible approach to electricity distribution required for future generations.

Technology also elevates both the customer and employee experiences; utility personnel benefit from streamlined workflows and data-driven insights, while customers receive more personalised services and access to real-time information through smart home utility meters. Furthermore, integrating state-of-the-art technologies creates the potential for new revenue streams, such as energy efficiency consulting and water quality certification services, increasing the longevity of these critical sectors.

Data is the lifeblood of transformation in the utility, transmission & distribution (T&D) and water sectors. It powers informed decision-making, bespoke services for customers and efficient resource allocation, to name just a few benefits. However, this data is next to useless if the company does not have the capability, facilitated by a robust data platform, to turn it into actionable insights.

Data-driven approaches can lead a company to embrace ‘Industry 4.0’, or the integration of IT systems with business processes. For example, say a utilities company installed advanced sensors across its power grid infrastructure; it could then apply predictive maintenance models that identify potential equipment failures before they occur. These insights could be integrated into its IT systems, allowing for automated alerts and maintenance scheduling.
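
As a rough sketch of that pattern – illustrative only, with invented readings and thresholds rather than any vendor’s actual model – an alerting rule can be as simple as flagging a sensor value that drifts well outside its recent baseline:

```python
# Toy illustration of sensor-driven maintenance alerts (hypothetical data and
# thresholds, not a production model). The latest reading is compared against
# a rolling baseline; sustained drift raises an alert for the maintenance queue.
from statistics import mean, stdev

def check_transformer(temps_c, window=24, sigma=3.0):
    """Flag a transformer whose latest temperature exceeds baseline + sigma * stdev."""
    baseline, latest = temps_c[:-1][-window:], temps_c[-1]
    threshold = mean(baseline) + sigma * stdev(baseline)
    return latest > threshold, threshold

readings = [61, 62, 60, 63, 61, 62, 64, 62, 61, 63, 62, 61,
            62, 63, 61, 62, 63, 62, 61, 62, 63, 64, 62, 78]  # last value drifts high
alert, limit = check_transformer(readings)
if alert:
    print(f"Schedule inspection: latest reading {readings[-1]}C exceeds {limit:.1f}C")
```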

Traditional operational technology (OT) practices in the utilities industry primarily focus on monitoring and managing industrial processes and equipment. However, recent technological advancements offer new benefits, particularly in the context of real-time remote sensor data. These sensors provide authorities with detailed and reliable information about damaged locations or faulty equipment, which is especially crucial during emergencies when swift action can be a matter of life and death.

The introduction of hydrogen into the UK’s power infrastructure is an area being pioneered by some companies in collaboration with Wipro – critically important research that has the ability to change the ways in which power and heat are delivered to UK homes and businesses.

Ankit Sharma. Source: LinkedIn

Ankit Sharma, Account Executive at Wipro, said: “Recently, we brought together a number of industry bodies, Ofgem, and a lot of network players, retailers, and so on, all to talk about hydrogen and how we were going to help support [its] use, how we were going to integrate, how that would work, how we use that heat in all forms in the industry.”

“But it’s really important that we look at how to integrate [hydrogen] and other renewable energy sources – wind, solar, and so on – and reduce the reliance on the carbon-intensive power generation that we currently have to support our net zero commitments.”

As IT methods are applied to the management of OT environments in enterprise, including utility sectors, further possibilities are opening up thanks to data collection and digitisation in general.

“I think what utility companies have to now look at is how we have efficient workforce management as well,” said Mr Sharma. “That means how they are geared up for the challenge from a grid maintenance perspective, and also how they are looking out for the new skills that are coming within the market. How do you upskill workforces on that?”

For utility T&D and water companies specifically, the industrial IoT (IIoT) and the power of data lakes enable the shift from periodic or manual analysis to real-time monitoring of significant assets and processes – transformers, pipelines and water treatment facilities – plus a host of other operational factors, from finance and accountancy to HR and logistics. These are just a few examples of how the incorporation of carefully chosen IT opens up more efficient and cost-effective work practices.

The future of utility companies relies on data, analytics and the IIoT. As conditions become more challenging, through growing energy demands and environmental concerns, embracing digital innovation is essential. With the integration of IT and OT and business goals that dictate technology choices, utility companies can adapt, thrive, and remain competitive in an evolving yet challenging landscape.

You can listen to the full podcast here, and check out the show notes for links and more information on Wipro’s work in the sector in the UK and abroad.

In the interim, visit Wipro’s website to find out more about its work within the utilities industry.

IBM’s mainframe for the masses

  • A mainframe computer on a limited budget.
  • Transactional processing at scale.
  • ROI in months compared to hyperscale cloud.

Say the word mainframe to many IT professionals, and they immediately think of legacy computing, systems being replaced with more modern technologies to better cope with the demands on computing common in 2024. Some will have used mainframe systems in the past, perhaps in environments where computer access was via a mainframe and thin client. In those situations, the client was a dumb terminal, and the computing work was done elsewhere by a central mainframe facility.

But mainframes are still in everyday production in some industries, and the market for new mainframe hardware and compatible software continues to grow where “transactional” computing is central to effective operations.

The new IBM LinuxONE 4 Express is a piece of hardware designed for smaller organizations, and it represents the baby of the breed, offering a low cost of entry with pre-configured hardware options. The company also emphasizes its cyber-resilience, with hardware security systems it terms “Secure Execution.” The hardware carries high-level security certifications, including Common Criteria EAL 4 and DISA STIG, plus FIPS 140-3 compliance.

Users can choose their preferred software platform, with SUSE now offering its Linux Enterprise Server for IBM Z as part of a bundle that can also come with SLE Live Patching, SUSE Manager Lifecycle Management Plus, SLE High Availability, and a long-term service package. The Secure Execution hardware means multiple containerized applications can be run simultaneously in isolation. That makes the system ideal for multi-tenancy operations or parallel application spaces that are effectively separated from each other.

A mainframe computer, the IBM LinuxONE Express.

IBM LinuxONE Express. Image: IBM.

Mainframe computer benefits

While similar secure and powerful environments can be created using several x86 servers, mainframes represent a more sustainable approach to hardware and power use. Expansion of storage, memory, and processing capacity over time makes this style of hardware a more attractive long-term prospect: the use of fewer components obviously reduces environmental impact and makes hardware maintenance plans simpler to budget.

While the initial cost (from $135,000) may seem high as a line item on a CAPEX sheet, enterprises with large cloud provider bills may see an effective return on investment sooner than they think. Depending on use cases, third-party clouds’ abilities to scale and provide agility are seldom used. That means large organizations pay for capabilities they may rarely make use of.

Transactional computing

The continuing existence of a thriving market in mainframe computing stems from the need for accurate transactional computing in a growing number of verticals. Transactional computing is best described as keeping a canonical record of all aspects of a single transaction, with every element of that transaction required to succeed before a record is made, changed, or deleted.

For example, in an e-commerce business, a transaction would comprise moving funds from a bank account to a vendor via a payment provider. If one of those steps fails (and each comprises several sub-steps), the transaction has not occurred, so the only record made is one flagged as a failure. Therefore, the emphasis in computing terms is not on raw processing power (the main requirement for supercomputing, for example) but on the integrity of database entries. That emphasis doesn’t necessarily require a different computing architecture, but it’s one that comes built into the design specifications of mainframes.
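
A minimal sketch of that all-or-nothing behaviour, using a throwaway SQLite database with illustrative table names: either every step of the transfer commits together, or the whole transaction rolls back and only a failure is recorded.

```python
# Sketch of transactional (all-or-nothing) semantics with SQLite.
# Table and column names are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER);
    INSERT INTO accounts VALUES ('buyer', 100), ('vendor', 0);
""")

def transfer(conn, amount):
    try:
        with conn:  # opens a transaction; commits on success, rolls back on error
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE name = 'buyer'", (amount,))
            (balance,) = conn.execute("SELECT balance FROM accounts WHERE name = 'buyer'").fetchone()
            if balance < 0:
                raise ValueError("insufficient funds")  # any failed step aborts every step
            conn.execute("UPDATE accounts SET balance = balance + ? WHERE name = 'vendor'", (amount,))
        return "committed"
    except ValueError:
        return "rolled back - recorded as a failure"

print(transfer(conn, 60))   # committed
print(transfer(conn, 60))   # rolled back: the buyer only has 40 left
```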

For that reason, banking and financial services, for example, still rely on mainframe technologies. But as the scale of internet use grows, more industries rely on the type of security, reliability, and data veracity that mainframe methodologies (still) excel at. Other use cases may be found in high-volume e-commerce marketplaces, engineering facilities that rely on multiple IIoT nodes, and power distribution networks, to name just three examples.

Image of old mainframe computer for article on IBM's LinuxONE Express.

“Mainframe computer” by scriptingnews is licensed under CC BY-SA 2.0.

Business case for mainframe computers

Software optimized for transactional computing, whether monolithic or based on microservices, is available from several vendors: LinuxONE hardware will run Red Hat, SUSE, and Ubuntu. The LinuxONE Express hardware range contains (at base version) 16 IBM IFL systems (Integrated Facility for Linux), expandable to 68 IFL instances. The Emperor LinuxONE range supports over 200 for those looking for more grunt.

The size and power of mainframes make them ideal for placing data and applications in the same place. IBM quotes an example of medical data and medical claims software sharing the same hardware tenancy, allowing for faster claim assessment. Similarly, for businesses looking to consolidate their server fleets, either in-house or leased from cloud providers, an Express instance can replace up to 2,000 x86 instances (manufacturer’s claim: YMMV).

Many IT decision-makers are coming to the conclusion that hyperscale cloud providers are not offering their services with end-user advantage front of mind. To grab a single example, Microsoft’s Q4 net income was $21.9 billion in 2023. While cloud computing still suits many, financial decision-makers might question the value for money their organization gets from their existing agreements with hyperscalers. That element of doubt and an increased need for reliable transactional processing will make the capital expenditure option look increasingly attractive to many.

The Express mainframe range can be sourced directly from IBM or approved partners.

The Express is not your meemaw’s mainframe.

Wi-Fi HaLow runs rings around old-style wireless networks


Back in 2017, the IEEE published a networking standard that would enable wireless users to do some very clever things, such as connect more than 8000 devices to a single access point. Multiple devices supporting the wireless networking protocol, dubbed Wi-Fi HaLow (802.11ah), are starting to hit the market. And having Wi-Fi that’s accessible from over 1 km away, together with other IoT-friendly features, is getting developers interested.

A quick history of Wi-Fi

Wi-Fi has become the de facto way for a huge number of devices to connect to the internet – so much so that you’ll struggle to find a modern laptop with a built-in wired Ethernet port. And when we’re talking about Wi-Fi, we’re referring to an IEEE wireless Ethernet standard known as 802.11.

Over the years, that standard has evolved to support faster data transmission rates and operate across higher frequency bands. The major updates are now branded by generation: Wi-Fi 4 (802.11n), Wi-Fi 5 (802.11ac), and Wi-Fi 6 (802.11ax). Wi-Fi 7 was officially certified by the Wi-Fi Alliance on 8 Jan 2024.

However, there’s a trade-off between range and data transmission rate, as home broadband users will be aware. At 2.4 GHz, the wireless network will probably cover the whole house – unless you’ve got some particularly thick walls in the way. At 5 GHz, however, you may need to add a range extender. And by the time you get to the super-high-throughput 6 GHz band supported by Wi-Fi 6E and 7, you’ll find that range is limited even further.

Wi-Fi HaLow was created to provide a low-power wide area network necessary to support the large-scale deployment of IoT devices – for example, in industrial settings inside and outside of large buildings – while retaining all of the advantages of wireless Ethernet.

Use cases for Wi-Fi HaLow

A great example of how Wi-Fi HaLow wireless networks can benefit warehousing and retail is electronic shelf edge labels. These devices – based on battery-powered LCD or electronic paper screens – can be easily managed via Wi-Fi HaLow, which not only accommodates thousands of connected devices, but also provides energy-efficient networking.


The power-saving nature of Wi-Fi HaLow is well-suited to battery-powered clients, which dramatically broadens the scope of where IoT technology can be deployed. For example, systems could be ideal for agricultural applications, bringing IoT into the field and barn. The wide-area coverage is also well-suited for manufacturing, healthcare, and smart city applications – to list just a few more industrial IoT scenarios.

Taiwan-based AsiaRF has shown how wearable Wi-Fi HaLow gateway devices turn smartphones into walkie-talkies, with numerous industrial use cases. According to the engineers involved in the test, voice communications were clear when participants were stationary and on the move, up to a range of 1.5 km. Morse Micro – a fabless semiconductor start-up – went even further and demonstrated that it was possible to hold a video call between devices at either end of Ocean Beach in California, US, using an 802.11ah wireless network with a 3 km reach.

Also, unlike other long-range wireless networking solutions such as LoRaWAN, Wi-Fi HaLow has all of the authentication and encryption features of a regular Wi-Fi network – such as WPA3-strength passwords, which is a boost for security.


Circling back to how Wi-Fi HaLow is capable of supporting small battery-powered devices, UK telecommunications training provider MPIRICAL does a good job of explaining what’s happening under the hood. The first option is for the network to store data on behalf of connected devices until they wake up. An alternative is for devices and access points to negotiate a schedule, which means that clients can power down and dramatically extend their battery life.
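
To see why a negotiated wake schedule matters, a back-of-the-envelope duty-cycle calculation – with assumed current draw figures rather than vendor specifications – shows how sleep time dominates battery life:

```python
# Back-of-the-envelope battery life for a duty-cycled Wi-Fi HaLow sensor.
# Current draw figures are illustrative assumptions, not measured values.
def battery_life_days(capacity_mah, active_ma, sleep_ma, active_s_per_hour):
    duty = active_s_per_hour / 3600.0
    avg_ma = active_ma * duty + sleep_ma * (1 - duty)
    return capacity_mah / avg_ma / 24.0

# 2000 mAh cell, 50 mA while awake and transmitting, 0.01 mA asleep.
always_listening = battery_life_days(2000, 50, 0.01, 3600)  # never sleeps
scheduled_wakeups = battery_life_days(2000, 50, 0.01, 6)    # awake 6 s per hour
print(f"Always on: {always_listening:.1f} days")   # roughly 1.7 days
print(f"Scheduled: {scheduled_wakeups:.0f} days")  # roughly 900 days
```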

If all goes to plan, this should be great news for supporting large-scale IoT developments – particularly across relatively inaccessible areas. Having a long battery life could give the go-ahead to projects that were previously thought to be unviable using older Wi-Fi infrastructure.

Internet of tomatoes: how nature’s IT network could help agtech


IT networks have grown increasingly sophisticated over time, but – when it comes to efficiency and performance – plants and fungi remain leaders in the field of advanced communications. Not only that, nature’s IT network is self-assembling and fully biodegradable, and there are plenty of lessons to be learned for developers.

Researchers have been delving into the mechanisms at work, which allow crops to communicate with each other, and even recruit other living creatures to come to their aid when covered in predators. Literally rooted to the ground, plants have to face up to their attackers with no option of fleeing and seeking cover.

Nature’s IT network in action

Recently, entomologists have shown how tomato plants seek help using the transmission of volatile organic compounds (VOCs). “When a caterpillar chews on a leaf, the plant sends out a signal that calls out to the caterpillar’s predators,” said Erinn Dady – first author in a study that reveals some tomato varieties to be more capable communicators than others. “It’s like a billboard that tells them where lunch is.”

In the late 1990s, workers at the US Department of Agriculture discovered that corn seedlings could recruit parasitic wasps – capable of laying deadly eggs inside caterpillars attacking the maize – on demand. But that’s not all, as distress signals can be deciphered by multiple crops such as tobacco, chili peppers, beans, and cucumbers – to list just a few examples.

Plants can warn their neighbors if they become threatened, giving other plants more time to react. Electrical signals within an individual plant can propagate internally – a process that’s been found to be linked to an increase in calcium levels – so that connected leaves can release chemicals to deter unwanted bugs.

What’s more, when leaves in adjacent canopies touch, electrical signals can flow from one plant to another – particularly if there is water in the air and leaves are moist. The phenomenon has been dubbed network acquired acclimation and is just one example of plant connectivity.


Below the soil, plants strike a deal with mycelium – a fungal root network that can extend for kilometres – and receive a stream of nutrients in exchange for sugars derived from photosynthesis. The combination is a powerful one and has been shown to help stabilize heavy metals such as lead, which may otherwise pollute groundwater – a process known as phytoremediation.

Plus, there’s evidence that plants sharing common mycorrhizal (a term combining the Greek words for fungal and root) networks can eavesdrop on defence signals sent by their pathogen-challenged neighbours. Again, this buys plants more time to mount defenses before being attacked themselves.

Fungal computing

Mycelium has other properties too – arguably the most fascinating of which is being able to generate bursts of electrical spikes resembling neural signaling in the brain – which have captured the imagination of unconventional computing experts such as Andrew Adamatzky in the UK. And fungi are by no means the only ones transmitting within nature’s IT network, which also supports wireless broadcasts.

In 2023, using microphones capable of detecting frequencies beyond human hearing, agtech experts in Israel and the US discovered that plants emit ultrasonic airborne sounds when stressed. And those sounds tell a story. “We developed machine learning models that succeeded in identifying the condition of the plants, including dehydration level and injury, based solely on the emitted sounds,” writes the team in the journal Cell.

And, if you are curious to hear what talking tomatoes sound like, there’s a sound clip available here (via CNN), which has been sped up and treated to make it audible to the human ear.

Being able to listen and respond to plants – for example, when they are demanding more water – has obvious appeal in food production and could avoid the use of excess nutrients, saving resources. Also, monitoring the release of VOCs could signal when it’s time to apply pesticides or even distribute natural predators.

There’s a vast amount that we can learn from nature’s IT, and those solutions will be a breath of fresh air given how much energy conventional computing infrastructure consumes – and the difficulty of neutralizing the environmental impact of such technology.

Plant wearables – a growing technology trend


Wearables such as the Apple Watch can nudge their wearers towards healthier lifestyles – telling them when to stand up and displaying progress towards exercise goals. At the same time, multifunctional wearable sensors are paving the way for personalized healthcare. But it’s not just humans that can benefit from these advances. Plant wearables could help growers to better look after their plants by leveraging the benefits of smart sensors and mobile apps.

The World Economic Forum (WEF), which has a good track record in highlighting innovations destined for big things, placed wearable plant sensors on its top 10 list of emerging technologies in June last year. Satellites and drones have long provided eyes in the sky to enable precision agriculture, but plant wearables can gather even more targeted data.

“Wearable plant sensors promise to improve plant health and increase agricultural productivity,” write the authors of the WEF report. “Data from plant sensors can optimize yields, reduce water, fertilizer and pesticide use, and detect early signs of disease.”


The goal is to ramp up food production to keep pace with population growth, making the best of available resources. And at the cutting edge of plant wearables are flexible sensors patterned with microneedle arrays, which can extract various chemical signals from leaves. For example, volatile organic compounds are emitted by plant leaves in response to stress and disease and can forewarn of trouble ahead.

However, the technology doesn’t have to be this complex to make a meaningful impact. Various developers are showing how simpler designs of plant wearables, with fewer sensing inputs, can still provide major insights to growers. What’s more, these affordable plant sensing kits, which connect to smartphones and feature companion apps, help to spread hi-tech growing knowledge more widely – beyond the confines of agribusiness users.

“It’s science applied to small-scale gardeners, whether that be for house plants or produce people,” explains Ashley Esakin – a soil scientist based in Canada who’s been working with EarthOne on a connected plant monitor.

What data do plant wearables provide?

Plant monitors such as the EarthOne measure environmental conditions such as ambient temperature, relative humidity, and light intensity, as well as collecting soil moisture and pH readings. And while that may not sound revolutionary at first, those inputs go a long way in determining what plants need and whether they’ve been placed in their optimum location.

Plant wearables and mobile apps are a powerful combination. For example, using a smartphone camera, the software can identify the plant and retrieve fine-tuned growing recommendations. From the main dashboard, users can quickly determine the basics of whether their plants need watering and are receiving sufficient light.

Things become more interesting still when those raw inputs are turned into plant growth metrics such as cumulative growing degree days (GDDs) and vapor pressure deficit (VPD). Cumulative GDD data can be thought of as a running sum of available energy for plants to develop.

“You can picture the relationship between how fast a plant develops and average temperature as a straight line, at least until you get to the very high temperatures,” wrote Betty Klepper – a pioneer in plant physiology and former Research Leader at the United States Department of Agriculture. “The warmer it is, the faster the plant will develop.”

Taking the example of cereal plants, which – on average – produce a leaf for every 100 GDDs, Klepper explained that farmers would be expecting to see a third leaf on their crop developing after the number of GDDs had passed 275 – assuming that all’s well.
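
The arithmetic behind that running sum is straightforward: one common formulation averages each day’s minimum and maximum temperature, subtracts a crop-specific base temperature, and accumulates anything above zero. A sketch (assuming a 0°C base temperature, commonly used for cereals) follows:

```python
# Cumulative growing degree days (GDD): a running sum of daily heat available
# to the plant. The base temperature is crop-specific; 0 C is assumed here.
def daily_gdd(t_min, t_max, t_base=0.0):
    return max(0.0, (t_min + t_max) / 2.0 - t_base)

def cumulative_gdd(daily_min_max, t_base=0.0):
    total, running = 0.0, []
    for t_min, t_max in daily_min_max:
        total += daily_gdd(t_min, t_max, t_base)
        running.append(total)
    return running

# Illustrative fortnight of daily min/max temperatures (degrees C).
weather = [(8, 18), (9, 20), (10, 22), (7, 17), (11, 23), (12, 24), (10, 21),
           (9, 19), (8, 18), (11, 22), (12, 25), (10, 20), (9, 21), (11, 24)]
gdd = cumulative_gdd(weather)
print(f"GDD after two weeks: {gdd[-1]:.0f}")  # ~215: roughly two leaves at 100 GDD per leaf
```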

VPD, which is calculated based on temperature and humidity, influences plant behavior on multiple levels. Specifically, it affects the amount of stomata opening that occurs on a plant’s leaves, transpiration, carbon dioxide uptake, nutrient intake at the roots, and plant stress.

“VPD is a very powerful tool in the grower’s toolbox,” comments Pulse Grow – another developer of plant monitoring hardware and software. “To get the best results you need to find the correct VPD sweet spot for the plant’s stage of growth.”

Things start to click once you picture VPD in terms of the capacity of the surrounding environment to absorb plant moisture. Plants ‘pump’ – or more correctly, transpire – water into the air to regulate their temperature, remove waste, and enable nutrient transport, to list just a few critical processes at work.

If VPD is either too high or too low, growers risk either working their plants too hard or constraining their development. Pulse Grow reports that ideal VPD conditions can boost yields by 20%.
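
For the curious, VPD can be approximated from just those two readings. The sketch below uses the Tetens formula for saturation vapour pressure – an assumption, since commercial monitors may substitute leaf temperature for air temperature:

```python
# Vapour pressure deficit (VPD) from air temperature and relative humidity,
# using the Tetens approximation for saturation vapour pressure (in kPa).
# Real plant monitors may use leaf rather than air temperature - assumed here.
import math

def vpd_kpa(temp_c, relative_humidity_pct):
    saturation = 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))
    actual = saturation * relative_humidity_pct / 100.0
    return saturation - actual

print(f"25 C, 60% RH -> VPD {vpd_kpa(25, 60):.2f} kPa")  # ~1.27 kPa
print(f"25 C, 85% RH -> VPD {vpd_kpa(25, 85):.2f} kPa")  # ~0.48 kPa (low deficit)
```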

It might be a stretch to draw such a comparison between plant wearables and smartwatches, but both devices help to keep targets ‘in the zone’. For example, a smartwatch will help the wearer to exercise more efficiently – within a given heart rate band – and, thanks to growth metrics such as VPD, plant health can be optimized too.

If we could listen to plants, what would they say?

Despite all of the progress in plant wearables, there’s much more scope for listening in to try and decipher the needs of plants. For a glimpse of the future, it’s worth brushing up on developments in fungal computing, which exploits spiking electrical signals – analogous to patterns of neurons firing in the brain – passed by mushrooms through their roots.

Some researchers in the field of fungal computing, writing in the recently published book Fungal Machines, argue that the complexity of fungi’s spiking communications rivals, or even exceeds, that of European spoken languages. The thinking here is that such complexity would be required for fungal root structures – mycelium filaments – to satisfy what’s believed to be their key role as a communication network linking plants, insects, and animals.

Whatever the reality of fungal communications, there’s a growing belief that IoT-enabled food and plant sensors are strong prospects for empowering sustainability. “As technology progresses and costs decrease, the widespread adoption of these sensors is poised to revolutionize the agriculture and food industries, fostering more efficient, resilient, and sustainable food systems,” conclude researchers reporting on this theme in the journal Advanced Intelligent Systems towards the end of 2023.

Feel the road: virtual sensor stack gains traction


It’s said that elite racing drivers can feel the road through their bodies. Pressed into their seat, hands on the wheel, they are able to detect fine details of the track below and set the steering and speed to match the surface conditions. However, it takes talent, years of experience, and a race-engineered vehicle to achieve this.

In the meantime, there are over a billion cars on the road. And while some of those motorists may think that they can feel the road like a racing driver, accident rates suggest otherwise. Plus, adding autonomous vehicles to the mix remains a work in progress and raises a number of safety concerns.

Wouldn’t it be good if we could fix all of this? And, according to automotive software firm Tactile Mobility, we can. What’s more, the raw inputs are already there thanks to electronic hardware that’s been part of vehicle designs for years.

“There’s a wealth of data that exists in every vehicle,” Yagil Tzur – VP of Product at Tactile Mobility – told TechHQ. “Every car has a CAN network, sensors, and electronic control units (ECUs).”

Creating a feel the road virtual sensor stack for automotive applications.

Model calibration: sample vehicles are driven on a proving ground to dial-in the analytic capabilities of the automotive software.

To turn those inputs into insights, the firm uses machine learning to create what is, in effect, a virtual sensor stack capable of deciphering grip levels, steering health, and other valuable vehicle performance characteristics.

The process begins by taking cars to a proving ground where they can be driven on a wide range of road conditions at different tyre pressures and set up to mimic different numbers of passengers, for example.

Model calibration then weights the observed behaviour against readings that are available on the CAN bus such as wheel speed, engine torque, yaw rate, and many other signals.
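
Tactile Mobility hasn’t published its models, so the snippet below is only a toy stand-in for the general pattern – fit a supervised model that maps CAN bus signals to grip measured on the proving ground – with invented feature names and numbers:

```python
# Toy illustration (not Tactile Mobility's actual pipeline): fit a linear model
# mapping CAN bus signals to grip measured on a proving ground. Feature names
# and values are invented for the example.
import numpy as np

# Columns: wheel slip ratio, engine torque (Nm), yaw rate (deg/s)
can_features = np.array([
    [0.02, 120, 1.0],
    [0.05, 200, 4.0],
    [0.09, 260, 9.0],
    [0.03, 150, 2.0],
    [0.12, 300, 14.0],
])
measured_grip = np.array([0.95, 0.80, 0.55, 0.90, 0.35])  # ground-truth friction coefficient

# Ordinary least squares with a bias term - a stand-in for the real ML model.
X = np.hstack([can_features, np.ones((len(can_features), 1))])
coeffs, *_ = np.linalg.lstsq(X, measured_grip, rcond=None)

new_reading = np.array([0.07, 230, 6.0, 1.0])  # fresh CAN sample plus bias term
print(f"Estimated grip: {new_reading @ coeffs:.2f}")
```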

When complete, the automotive software can estimate the wheel grip and provide real time updates on tyre wear – all without having to fit any additional hardware. And having a virtual sensor stack designed to feel the road has got car manufacturers and suppliers interested.

Investors in Tactile Mobility include Porsche Ventures and Goodyear Ventures, which both came on board in 2021 as part of a $27 million Series C funding round.

Why EVs are tough on tyres

Governments around the world are incentivising drivers to purchase battery-powered electric vehicles (EVs). And while that will improve air quality on city streets, it makes life a whole lot tougher for the rubber that connects cars to the road – due to the increased weight and high torque of EVs.

Tyre performance is critical to vehicle safety, and the importance of those four contact patches is often overlooked by regular drivers – so much so that the US Department of Transportation mandated that automobiles be fitted with tyre pressure monitoring systems (TPMS), as part of the TREAD Act, in the early 2000s.


Similar legislation was introduced in other countries, and today all cars have TPMS sensors. However, those systems by no means feel the road. Instead, they simply trigger a warning light to indicate low tyre pressure once the value passes a crude threshold.

Virtual sensors provide a much more granular view of tyre performance, giving drivers not just a warning that something is wrong, but adjusting lifetime estimates so that failures don’t happen in the first place and timely replacements can be made.

Tzur sums up the gains of having Tactile Mobility data as being like TPMS on steroids. What’s more, having real world information on tap could help tyre companies further improve their designs – for example, to cater for the specific demands of EVs.

Tyre developers already spend big sums on research and development, but feedback from virtual sensors that feel the road could enrich the amount of information that designers have to work with.

“99% of testing is performed on brand new tyres,” Tzur points out. “Companies have less information on the performance of mid-worn tyres.”

Wisdom of the cloud

There are two sides to Tactile Mobility’s business. First is the onboard virtual sensing, which presents its data to other ECUs on the CAN bus and helps to finesse advanced driver assistance systems (ADAS) such as cruise control. But there are also cloud-based solutions enabled by the feel-the-road technology.

For example, as the install base increases (Tactile Mobility is working with various automotive OEMs in the US and Europe – and has publicly mentioned Ford Motor Company as a partner; Porsche is also listed on its website), some very interesting crowd-sourcing opportunities arise.

The setup is well-suited to characterising the quality of the road surface and sharing this information with local authorities – to schedule maintenance and fill potholes – as well as other interested parties, such as insurance firms.

It’s possible to use the technology to generate a road friction map, which could advise on speed adjustments to reduce the chance of accidents. Potentially, insurance premiums could take into account the quality of the road surface and data would help accident investigations.

Also, should a vehicle experience a driving incident, that information could be shared to warn nearby motorists. “If there’s an adverse event, the data can be sent to the cloud at high priority to communicate with other vehicles,” said Tzur.

Once vehicle-to-vehicle (V2V) communications become more commonplace, the adverse event alerting process could even happen directly to give drivers (and autonomous vehicles) additional reaction time.

Tactile Mobility has its origins in fleet management, where a forerunner of its systems helped to guide road transport operators on fuel management. And performance coaching feedback could also cross over into its feel-the-road virtual sensing stack – for example, to help drivers prolong tyre life on their vehicles.

Navigating self-driving roadblocks

It’s possible too that self-driving cars, which have had a run of bad press lately, may also benefit from better road surface data enabled using machine learning methods such as those deployed by Tactile Mobility.

For example, autonomous vehicles can be spoofed by lines painted on the road, but virtual sensors capable of feeling the road ahead may be able to better distinguish between legitimate markings and rogue ones.

Defining 5G in terms of business and enterprise use cases


• The business benefits of 5G go beyond standard smartphone speed.
• Private 5G networks offer particular advantages for app or use case testing.
• 5G could yet bring significant business benefits to the transport and logistics sector.

With some commentators voicing concerns about whether 5G has lived up to expectations, it’s worth looking at what the fifth generation of cellular network technology is – and what it isn’t. A good way of getting to the heart of that capability is to focus on what 5G can do for enterprise customers and business users.

5G was badged as being superfast, and this may have added to the confusion. For example, smartphone users may not notice a great deal of difference in device performance, regardless of whether they are connected to 4G or 5G.

However, there are still reasons to celebrate from an infrastructure perspective. As we’ve mentioned on TechHQ, upgrading base stations to the latest 5G technology not only means that equipment is smaller and lighter – there can be significant energy savings too. And the advantages don’t stop there.

“The integration of AI and 5G in modern switchgear marks a significant leap in the electrical infrastructure management,” comments Kiran Raj, practice head of disruptive tech at GlobalData. “Not only does this tech convergence boost operational efficiency, but it also enhances safety by predicting and mitigating electrical failures before they happen.”

Operators in China have shown how miniature cameras inside switch cabinets can be used to detect fire and activate extinguishers. And predictive analytics can be trained using data from power transformers.

Understanding 5G business benefits

Given the link between 5G and telecoms, it’s natural to look to smartphone performance for signs that the next generation of cellular network technology is living up to the hype. But the bigger picture for 5G goes well beyond handsets.

In its recent report, 5G Building a Digital Society, Vodafone explains how 5G can improve daily life more broadly and shares examples that business and enterprise users may not have considered.

For example, Vodafone says that 5G-enabled sensors on rail networks have the potential to improve services by raising track maintenance efficiency by up to 40% and reducing the rate of fault incidence by 20%. 5G-enabled rail could also help to add capacity to the network, lowering overcrowding at peak times.

Coupling trains virtually using 5G would enable them to share braking and acceleration data. A white paper by Thales calculates that train separation distance could be reduced by 43% compared with using current railway signaling methods, based on analysis of the UK South West Main Line (a 230 km route linking central London and the coastal town of Weymouth). In principle, this would allow many more trains to run without having to build additional tracks.


Low-latency private 5G networks are being used to automate vehicles driving between parts warehouses and main manufacturing sites.

Transportation and logistics could be big winners as 5G networks become more widespread. Vehicle telematics systems have long used cellular communications to connect to the cloud and monitor that all’s well through remote analytics.

Vendors can deliver software updates over-the-air rather than having to instruct owners to contact their dealership. And 5G telematics adds to those capabilities.

Vehicles can connect to each other to see around corners and gain greater knowledge about the road ahead. Automotive technology firm Harman, which is owned by Samsung, describes 5G as the missing piece of the puzzle for highly automated vehicles.

The low latency response time (of around 1 millisecond) that’s possible over 5G – a reduction of ten times compared with 4G – means that humans can remotely control cars, trucks, and other vehicles, with reaction times that are similar to having a driver sitting inside, directly behind the wheel.

5G also supports a much higher connection density than 4G, which benefits smart city infrastructure and brings us arguably one of the most appealing configurations for business and enterprise users – private 5G networks.

Private 5G networks give firms data coverage inside and outside facilities while keeping communications secure. The high connection density and low latency capabilities mentioned above suit mobile robots and so-called ‘intralogistics’ – the movement of goods within a warehouse, distribution center, or production facility.

Kollmorgen, a specialist in the fleet control of automated guided vehicles and mobile robots, has demonstrated (in tests with its partner Ericsson) that platforms are as stable on private 5G networks as on Wi-Fi.

Antenna handovers are seamless, and 5G enables localization in dynamic environments without having to use artificial landmarks, according to the team.

Support for time-sensitive networking (TSN) allows fixed Ethernet and 5G networks to operate in harmony with each other. What’s more, TSN means that 5G can support applications normally carried over wired Ethernet, potentially saving on long cable runs at large sites.

Looking at industrial operating environments for 5G, business cases include airports, manufacturing sites, ports, mining facilities, and offshore locations.

How to build a private 5G network

Business and enterprise users interested in deploying their own cellular services have the option of using a private 5G network starter kit. The approach is ideal for testing applications on a pilot scale and gaining experience before making larger investments in the technology.


It’s clear that companies have much to gain from the technology, but other organizations may also wish to take a look at what 5G installations can do for them.

The Liverpool 5G Testbed, which created the largest 5G mmWave mesh network in the UK and second largest in the world, showed how a private 5G network could benefit health and social care services.

Features included a 4k video link to help patients check that they were taking their medicines correctly, which resulted in fewer GP visits and trips to hospital.

According to the study, there was a 50% reduction in people taking the wrong medication or receiving an incorrect dosage. Medication adherence levels were as high as 95% compared with a national average of 55%.

Returning to the topic of saving energy, testing of smart streetlights in Oslo – featuring 5G-enabled sensors – demonstrated a 45% reduction in electricity consumption. In dense urban areas, savings could reach 60% – based on a 12,000-light installation deployed in the City of London, which overhauled 30-year-old lighting stock.

Smart city lighting allows control down to the level of specific streets and buildings and enables the illumination to be dimmed as well as turned on and off.

With timely investment, the rewards of 5G to businesses and society can extend way beyond the smartphone.

Evolution of autonomous robots: Past, present and future roles

  • Autonomous robots – a history and overview.
  • From ‘tortoises’ to co-workers.
  • As technology costs dropped, robots rose.

Technological advancements, including software, electronics, and robotics, drive modern society. We see technology developing daily, changing how we work and do business. Arguably, at the forefront of current technological progress are AI technology and autonomous robots. There is little doubt that mobile, self-governing robots will play key roles in the future.

In just 50 years, the market for industrial robots has evolved significantly. That evolution has given rise to the term "cobots" – collaborative robots that work alongside humans. Whether autonomous robots will continue to work side by side with humans or take over roles completely remains a matter of debate.

Artificial intelligence technology has accelerated at an astounding pace over the last two years (something called ChatGPT is one year old at the time of writing), affecting many sectors across the globe. AI has become a part of everyday life for so many, with intelligent assistants supporting us as we work. They are now at our beck and call, answering queries and performing repetitive tasks in various industries.

Hasn’t that always been the point of technological advancements, though? To help us complete physical tasks quicker or even in our place? If recent decades are anything to go by, we rely on automation more than ever. Autonomous robots may soon be irreplaceable parts of society’s fabric.

Industrial robots have emerged as a key competitive factor for manufacturing operations worldwide, and the number in use has grown exponentially since the 1970s. For instance, only 200 industrial robots were in use in the US in 1970, compared with 4,000 in 1980. By 2015, that number had risen to 1.6 million. Today, an estimated three million or more autonomous robots are in use.

Robots in industry through time

To understand the impact of robots on different industries, we need to go back to their origin in the 1940s and 1950s. British neurophysiologist W. Grey Walter is often credited as the man who started it all, developing two robots, Elmer and Elsie, in the late 1940s. Inspired by biological systems, Grey Walter's Tortoises (as they were affectionately known) exhibited fundamental autonomous actions. Using basic sensors, the robots could navigate their surroundings and even respond to stimuli, mirroring lifelike behavior.

During the 1950s, the progression of autonomous technology continued with Unimate, the creation of George Devol (often termed "the Grandfather of Robotics"). Beginning with basic devices for moving objects autonomously, Devol is credited with inventing the first industrial robot.
Over time, electronics, sensors, and software advancements expanded autonomous robot abilities, enabling complex tasks like painting, welding, and precision work.

These early examples of robots began commercial use on assembly lines by the early 1960s, primarily performing heavy lifting tasks. This was despite their primitive design and limited programmability. However, it soon became evident that robotics significantly increased manufacturing productivity.
During the late 1960s and early 1970s, demand for automation increased. Therefore, industrial robots shifted to more precise tasks, prompting the development of smaller, electric models equipped with advanced controls and motors ideal for assembly work like tightening bolts and nuts.

Fast-forward to the late 1970s, and robots had broadened their capabilities. They included tasks like arc welding, intricate spray painting, and hazardous environment operations. The latter allowed humans to work in other production areas while heightening safety in industries such as steel refining and other heavy engineering sectors.

Enthusiasm for robotics continued into the mid-1980s. This era sparked the evolution of modern industrial robots, integrating advanced sensors and basic machine vision systems.

As technology continued to advance and computer hardware costs decreased, robotic capabilities took a substantial leap. Engineers were able to install precision force sensors and lasers on robots, enabling them to detect and track components on assembly lines, providing a human-like sense of sight and touch. That may sound similar to Amazon's new robotic solutions, Sequoia and Digit. Like Amazon's "humanoid" robots, the 1980s engineering marvels elevated robots from repetitive task performers to machines with "limited intelligence."

While 1980s autonomous robots, otherwise known as automated guided vehicles (AGVs), could transport goods and materials from one workstation to another, they still lacked the flexibility and adaptability of autonomous mobile robots. But as we stepped into the 1990s and 2000s, a rejuvenated interest in developing true autonomous mobile robots emerged.

There was continued progress in the development of artificial intelligence, sensor technology, and machine learning. Autonomous robots could perceive their surroundings more accurately thanks to new generations of sensors. The robots could also learn from their experiences and adapt to changes via newly updated AI algorithms. This was the genuine dawning of the age of autonomous robots.

During the 2000s, independent robotic system technology continued with mobile autonomous robots adopted by numerous industries worldwide, including manufacturing, retail, and healthcare. While most still work alongside a human workforce, these robots have streamlined warehouse operations and revolutionized how products and materials are handled.

An autonomous robot, as imagined for this article on the history and future of the field. Image created by Bing AI.

Some robots have replaced human workers in many environments, which raises many ethical conundrums. The fact that these robots can augment a human's capabilities makes them a priceless commodity in certain industries. Although there are drawbacks associated with using this technology, there is little doubt that autonomous robotic systems have helped improve safety, increase productivity, and introduce new, exciting possibilities in the world of innovative machine learning technology.

The present and future of autonomous robots

Today, mobile autonomous technology is widely used in the industrial sector as well as in agriculture, health, logistics, and property maintenance. Although full-scale adoption may still be years away, mobile robots retain the potential to completely change the way business is done. Even now, autonomous mobile robots are evolving. That just leaves us with the question, "What does the future hold for robots in industry?"

Machine learning and AI have helped extend industrial robots' ability to make autonomous decisions without human guidance. Modern robots can analyze extensive data sets and refine their performance by varying their mechanical motions. Engineers now prioritize instilling "real intelligence" in robots for increased complexity and efficiency, cultivating safe collaboration with humans in manufacturing settings.

In recent years, there has been a rise in collaborative robots (cobots). This marks a shift towards robots designed to operate alongside humans safely. Using force-limited joints and computer vision, cobots can interact directly with humans.

Cobots started out as passive devices, but their development has been incredible. They now carry their own power sources and are capable of precise object manipulation and environmental awareness. Their synergy with human workers has already brought sweeping changes to the manufacturing world, evidenced by a cobot market worth nearly $1 billion and some 40,000 operational machines worldwide. Expect wider adoption of cobots that will continue to drive efficiency and forever alter industrial sectors globally.

The post Evolution of autonomous robots: Past, present and future roles appeared first on TechHQ.

]]>
Energy tip: appliance signatures cut electricity bills by 30% https://techhq.com/2023/11/energy-saving-tip-appliance-signatures-cut-electricity-bill-by-30-percent/ Wed, 08 Nov 2023 15:45:56 +0000 https://techhq.com/?p=229628

• Your electricity bill – at home and at work – is probably higher than it needs to be. • Energy disaggregation technology can help by pinpointing inefficient appliances in your network. • It’s an easy way of monitoring your power spend in times of economic uncertainty. Energy disaggregation might not be the first phrase... Read more »

The post Energy tip: appliance signatures cut electricity bills by 30% appeared first on TechHQ.

]]>

• Your electricity bill – at home and at work – is probably higher than it needs to be.
• Energy disaggregation technology can help by pinpointing inefficient appliances in your network.
• It’s an easy way of monitoring your power spend in times of economic uncertainty.

Energy disaggregation might not be the first phrase that you’d type into a search engine – or use to prompt your generative AI assistant – to discover how to save money on your electricity bill. However, if you do, it soon becomes clear how the approach leads to energy savings for users.

Business owners may be interested to know that energy disaggregation – which breaks down power consumption by appliance or into usage categories – has been used for some time by chain restaurants such as KFC and Dunkin’ Donuts.

Understanding your electricity bill

Chain restaurants tend to have the same appliances in each location, which simplifies the data analytics. But today, energy disaggregation firms such as Voltaware have appliance libraries capable of recognizing a huge variety of electrical goods based on power consumption signatures.

So-called ‘Non-intrusive Load Monitoring’ (NILM) works by deciphering the various appliances operating in the home or at a workplace from power consumption data, which can be captured using a clip-on current transformer (CT) sensor.
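
Voltaware's actual models are proprietary, but the basic idea behind NILM can be sketched in a few lines: watch for step changes in the aggregate power reading and match each step against a library of appliance signatures. The signature values, thresholds, and sample data below are illustrative assumptions, not the company's real library.

```python
# Naive NILM sketch: match step changes in aggregate power to appliance signatures.
# The signature library and threshold values are illustrative assumptions.
APPLIANCE_SIGNATURES = {  # typical steady-state draw in watts (hypothetical values)
    "kettle": 2800,
    "fridge compressor": 120,
    "washing machine heater": 2000,
    "lighting circuit": 60,
}

def detect_events(readings, min_step=50):
    """Yield (index, delta_watts) wherever consecutive readings jump by >= min_step W."""
    for i in range(1, len(readings)):
        delta = readings[i] - readings[i - 1]
        if abs(delta) >= min_step:
            yield i, delta

def label_event(delta, tolerance=0.15):
    """Return the appliance whose signature is closest to |delta|, within tolerance."""
    best, best_err = None, tolerance
    for name, watts in APPLIANCE_SIGNATURES.items():
        err = abs(abs(delta) - watts) / watts
        if err < best_err:
            best, best_err = name, err
    return best

# One reading per second from the CT sensor (simplified example data).
power = [180, 182, 181, 2981, 2979, 2978, 181, 179, 299, 300, 298]
for idx, delta in detect_events(power):
    appliance = label_event(delta) or "unknown load"
    state = "on" if delta > 0 else "off"
    print(f"t={idx}s: {appliance} switched {state} ({delta:+.0f} W)")
```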

Power consumption monitoring hardware to help consumers save on their electricity bill.

Sensor kit: Voltaware’s hardware features a current clamp, voltage sampling cables, and wireless electronics. Image credit: Voltaware.

The CT sensor is clamped around the line cable between the electricity meter and the fuse box. It detects current through changes in the magnetic field around the wire carrying the mains supply, and this is combined with voltage sampling across the circuit breaker to calculate power consumption.
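
As background on what such a sensor computes, real power is the average of the instantaneous product of voltage and current over whole mains cycles, while apparent power is the product of the RMS voltage and current. A minimal sketch, assuming synchronised sample streams (this is textbook metering maths, not Voltaware's firmware):

```python
# Minimal sketch: compute real power, apparent power, and power factor
# from synchronised voltage and current samples over whole mains cycles.
import math

def power_metrics(voltage_samples, current_samples):
    n = len(voltage_samples)
    real_power = sum(v * i for v, i in zip(voltage_samples, current_samples)) / n
    v_rms = math.sqrt(sum(v * v for v in voltage_samples) / n)
    i_rms = math.sqrt(sum(i * i for i in current_samples) / n)
    apparent_power = v_rms * i_rms
    power_factor = real_power / apparent_power if apparent_power else 0.0
    return real_power, apparent_power, power_factor

# Example: a purely resistive 2.3 kW load on a 230 V RMS, 50 Hz supply,
# sampled 100 times per cycle (illustrative numbers).
waveform = [math.sin(2 * math.pi * k / 100) for k in range(100)]
volts = [230 * math.sqrt(2) * s for s in waveform]
amps = [10 * math.sqrt(2) * s for s in waveform]
p, s_va, pf = power_metrics(volts, amps)
print(f"Real power: {p:.0f} W, apparent: {s_va:.0f} VA, power factor: {pf:.2f}")
```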

Firms helping businesses and consumers to reduce their electricity bill

Speaking with TechHQ, Sergey Ogorodnov – CEO and co-founder of Voltaware – commented that users typically save around 30% on the cost of their electricity bill using the firm’s sensor and companion app. The company has been providing electricity monitoring services to corporate customers for several years and has recently launched a smart home product for the B2C market.

The sensor, which is non-intrusive and quick to connect, should be installed by a qualified professional, given that it’s measuring mains power. And once it’s up and running, the energy disaggregation solution has a number of advantages over using a smart socket, including being able to categorize electricity consumption throughout the whole premises.

“The main advantage is that users can see where the majority of their money is going,” said Ogorodnov, pointing out how energy disaggregation helps consumers prioritize savings on their electricity bill.

Voltaware’s data is displayed on a companion app, which is available for iOS and Android devices. When users run the app for the first time, they are prompted to enter basic details about the property so that the algorithm can provide not just a breakdown of local energy use, but also benchmark that performance. “A monthly report is available in the app and can be sent via email,” Ogorodnov adds.

According to the firm, the most common disaggregated electrical appliances and usage types are air conditioning, boilers, cooking, EV charging, fridges, electric heating, heat pumps, lighting and electronics, dishwashers, washing machines, tumble dryers, standby/always-on products, kettles, and electric showers.

Predictive maintenance for home appliances

On TechHQ we’ve written about how self-powered Bluetooth sensors in clothes could interact with users – for example, by alerting them to items they’ve not worn in a while. And similarly, disaggregated appliance data could prompt users when a product may benefit from repair or servicing, based on changes in power consumption.
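
One way such a prompt could be derived is to compare an appliance's recent average draw against its historical baseline and flag sustained drift – for example, a fridge compressor working harder than it used to. The sketch below is a hypothetical illustration; the threshold, minimum run count, and data are assumptions rather than anything Voltaware has described.

```python
# Sketch: flag an appliance for servicing when its average power per run
# drifts well above its historical baseline. Thresholds are illustrative.
from statistics import mean

def needs_servicing(baseline_watts, recent_runs, drift_threshold=0.20, min_runs=5):
    """Return True if the recent average draw exceeds baseline by drift_threshold."""
    if len(recent_runs) < min_runs:
        return False  # not enough evidence yet
    drift = (mean(recent_runs) - baseline_watts) / baseline_watts
    return drift > drift_threshold

# Example: a fridge that historically averaged 110 W while its compressor runs.
recent = [128, 131, 135, 133, 140, 138]
if needs_servicing(110, recent):
    print("Fridge is drawing well above its usual power - consider a service check.")
```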

App data alerts users to vampire devices that consume large amounts of power on standby; switching off these appliances will help to reduce your electricity bill. Image credit: Voltaware.

NILM isn’t a new idea – a PhD thesis by Daniel Kelly of Imperial College’s Department of Computing (available to read online) offers a great description of the field. And he points to resources such as an open source toolkit for NILM and links to open energy monitoring hardware.

However, firms such as Voltaware are extending what’s possible using the data and opening the door to applications beyond being able to reduce your electricity bill. For example, Ogorodnov shares details on Voltacare – the energy insights firm’s monitoring solution for the elderly and people with critical conditions.

Rather than having to use additional hardware beyond the power monitoring sensor, energy disaggregation can be leveraged to send an alert to a relative or carer when behavior that’s out of routine is detected. According to the firm, Voltacare offers an unobtrusive, privacy-friendly, and simple way of checking that elderly parents are active in their home.

Once the algorithm has learned the typical appliance routines in the home, it can generate two levels of alert – caution and critical – based on differences from baseline values.
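
A minimal sketch of how such two-tier alerting could work, assuming the baseline is expressed as an expected number of appliance events per monitoring window; the thresholds and window are illustrative assumptions, not Voltacare's actual parameters.

```python
# Sketch of two-tier routine monitoring: "caution" and "critical" alerts based on
# deviation from a learned baseline of appliance activity. Thresholds are illustrative.
def routine_alert(expected_events, observed_events,
                  caution_ratio=0.5, critical_ratio=0.2):
    """Compare observed appliance events in a window against the learned baseline."""
    if expected_events == 0:
        return None  # nothing is normally expected in this window
    ratio = observed_events / expected_events
    if ratio <= critical_ratio:
        return "critical"  # e.g. no kettle or cooker activity all morning
    if ratio <= caution_ratio:
        return "caution"   # activity noticeably below the usual routine
    return None

# Baseline: around six kitchen-appliance events expected by midday; only one seen today.
alert = routine_alert(expected_events=6, observed_events=1)
if alert:
    print(f"Alert level: {alert}")  # 1/6 is roughly 0.17, so this prints "critical"
```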

Power consumption readings are useful for more than just helping consumers to reduce electricity bills. Ogorodnov gives an example of how anonymized data could help to plan the location of public EV chargers.

Also, he makes the point that other data feeds add to the picture. Weather and time-of-day information can help energy disaggregation algorithms to separate cooking from heating, when assigning power consumption to usage categories.

The National Grid has a WhenToPlugIn app that shows a live carbon intensity forecast and lets users know when the electricity will be cleanest over the next 48 hours. And it’s clear that energy disaggregation tools can tap into this information too.
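
Developers can tap the same underlying data: National Grid ESO publishes a free Carbon Intensity API at carbonintensity.org.uk. The sketch below requests the 48-hour forecast and picks the cleanest half-hour slot; the endpoint follows the public documentation, but the response parsing should be verified against the current docs.

```python
# Sketch: find the cleanest upcoming half-hour using the public Carbon Intensity API.
# Endpoint from the published docs; response structure should be checked against them.
import requests

resp = requests.get("https://api.carbonintensity.org.uk/intensity/fw48h", timeout=10)
resp.raise_for_status()
periods = resp.json()["data"]  # list of half-hour forecast windows

cleanest = min(periods, key=lambda p: p["intensity"]["forecast"])
print(f"Cleanest slot: {cleanest['from']} to {cleanest['to']}, "
      f"forecast {cleanest['intensity']['forecast']} gCO2/kWh")
```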

With the right information at their fingertips, consumers can save on their electricity bill and reduce their carbon footprint. Businesses that make use of resources such as electricity maps can take enterprising approaches to cutting carbon in the cloud.

The post Energy tip: appliance signatures cut electricity bills by 30% appeared first on TechHQ.

]]>