Transformers are machine learning models designed to uncover and track patterns in sequential data, such as text sequences. In recent years, these models have become increasingly sophisticated, forming the backbone of popular conversational platforms such as ChatGPT...
A quantum algorithm for the segmentation of a moving target in grayscale videos
Computer vision algorithms have become increasingly advanced over the past decades, enabling the development of sophisticated technologies to monitor specific environments, detect objects of interest in video footage and uncover suspicious activities in CCTV...
Exploring the connections between Nobel laureates using network science
Network science is the study of the complex relationships and connections underpinning wide datasets, groups of individuals or other systems composed of many interacting parts. This fascinating study of connections can be used to create maps and representations of...
A framework for risk-aware robot navigation in unknown environments
An interactive platform that explains machine learning models to its users
Machine learning models are now commonly used in various professional fields, while also underpinning the functioning of many smartphone applications, software packages and online services. While most people are exposed to these models and interact with them in some...
A technique to facilitate the robotic manipulation of crumpled cloths
To assist humans during their day-to-day activities and successfully complete domestic chores, robots should be able to effectively manipulate the objects we use every day, including utensils and cleaning equipment. Some objects, however, are difficult to grasp and...
An energy-efficient object detection system for UAVs based on edge computing
Unmanned aerial vehicles (UAVs), commonly known as drones, are already used in countless settings to tackle real-world problems. These flying robotic systems can, among other things, help to monitor natural environments, detect fires or other environmental hazards,...
IBM develops a new 64-core mixed-signal in-memory computing chip
For decades, electronics engineers have been trying to develop increasingly advanced devices that can perform complex computations faster while consuming less energy. This goal has become even more pressing since the advent of artificial intelligence (AI) and deep learning...
Exploring the effects of feeding emotional stimuli to large language models
Since the advent of OpenAI's ChatGPT, large language models (LLMs) have become immensely popular. These models, trained on vast amounts of data, can answer written user queries in strikingly human-like ways, rapidly generating definitions of specific terms, text...
Evaluating the ability of ChatGPT and other large language models to detect fake news
Large language models (LLMs) are an evolution of natural language processing (NLP) techniques that can rapidly generate texts closely resembling those written by humans and complete other simple language-related tasks. These models have become increasingly popular...