G protein-coupled receptors (GPCRs) play key roles in physiology and are central targets for drug discovery and development, yet the design of protein agonists and antagonists has been challenging because GPCRs are integral membrane proteins and conformationally dynamic. Here we describe computational de novo design methods and a high-throughput “receptor diversion” microscopy-based screen for generating GPCR-binding miniproteins with high affinity, potency and selectivity, and the use of these methods to generate MRGPRX1 agonists and CXCR4, GLP1R, GIPR, GCGR and CGRPR antagonists. Cryo-electron microscopy data reveal atomic-level agreement between designed and experimentally determined structures for CGRPR-bound antagonists and MRGPRX1-bound agonists, confirming precise conformational control of receptor function. Our de novo design and screening approach opens new frontiers in GPCR drug discovery and development.
Design space exploration (DSE) plays an important role in optimizing quantum circuit execution by systematically evaluating different configurations of compilation strategies and hardware settings. In this work, we study the impact of layout methods, qubit routing techniques, compiler optimization levels, and hardware-specific properties, including noise characteristics, topological structures, connectivity densities, and device sizes. By traversing these dimensions, we aim to understand how compilation choices interact with hardware features. A central question in our study is whether carefully selected device parameters and mapping strategies, including initial layouts and routing heuristics, can mitigate hardware-induced errors beyond standard error mitigation methods. Our results show that choosing the right software strategies (e.g., layout and routing) and tailoring hardware properties (e.g., reducing noise or leveraging connectivity) significantly enhance the fidelity of quantum circuit executions. We provide performance estimates using metrics such as circuit depth, gate count, and expected fidelity. These findings highlight the value of hardware-software co-design, especially as quantum systems scale and move toward error-corrected computing. Our noisy simulations also cover quantum error correction (QEC) scenarios, which reveal similar sensitivities to layout and connectivity, suggesting that co-design principles will be vital for integrating QEC into future devices. Overall, we offer practical guidance for co-optimizing mapping, routing, and hardware configuration in real-world quantum computing.
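The traversal described above can be sketched as a grid search over compilation choices. The following minimal Python sketch is illustrative only: the cost tables and the fidelity proxy are assumptions, not values from the study, which would obtain depth and gate counts from actual transpilation.

```python
from itertools import product

# Hypothetical per-choice depth multipliers; in practice these would come
# from transpiling the circuit and reading depth/gate counts off the result.
LAYOUT_COST = {"trivial": 1.00, "dense": 0.92, "sabre": 0.85}
ROUTING_COST = {"basic": 1.00, "lookahead": 0.90, "sabre": 0.82}
OPT_LEVEL_COST = {0: 1.00, 1: 0.95, 2: 0.90, 3: 0.88}

def estimated_fidelity(gate_error, depth, layout, routing, opt_level):
    """Crude fidelity proxy: per-layer survival raised to the effective depth."""
    effective_depth = (depth * LAYOUT_COST[layout]
                       * ROUTING_COST[routing] * OPT_LEVEL_COST[opt_level])
    return (1.0 - gate_error) ** effective_depth

def explore(gate_error=0.005, depth=120):
    """Traverse the configuration grid and return the best-scoring point."""
    best = None
    for layout, routing, lvl in product(LAYOUT_COST, ROUTING_COST, OPT_LEVEL_COST):
        fid = estimated_fidelity(gate_error, depth, layout, routing, lvl)
        if best is None or fid > best[0]:
            best = (fid, layout, routing, lvl)
    return best
```

Under these assumed costs, the search naturally favours the configuration that minimises effective depth, mirroring the paper's finding that layout and routing choices dominate fidelity.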
Company maturity, organizational culture, and costs are among the major limiting factors for the implementation of a cost-effective six sigma methodology in Bosnian companies. The purpose of this paper is to analyze the possibilities of applying the six sigma concept in a medium-sized company with 200 employees. Considering the characteristics of the company, an implementation model was proposed and tested. The project included process monitoring using statistical process control charts for two periods, before and after improvement. The dominant defect and its causes were identified, and the process was found to be out of control. After implementation of the improvement measures, the dominant defect was eliminated, but the process remained out of control. In conclusion, the test indicated that the model is effective, but more iterations are needed to achieve the desired state.
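The control-chart monitoring used in the project can be sketched with a standard individuals (I) chart. This is a generic illustration of the technique, not the paper's actual data or chart type, which the abstract does not specify.

```python
from statistics import mean

def control_limits(samples):
    """Individuals (I) chart limits from the average moving range.
    sigma is estimated as MR-bar / 1.128 (the d2 constant for n = 2)."""
    xbar = mean(samples)
    mrbar = mean(abs(a - b) for a, b in zip(samples, samples[1:]))
    sigma = mrbar / 1.128
    return xbar - 3 * sigma, xbar, xbar + 3 * sigma

def out_of_control(samples):
    """Indices of measurements beyond the 3-sigma limits."""
    lcl, _, ucl = control_limits(samples)
    return [i for i, x in enumerate(samples) if x < lcl or x > ucl]
```

Running the same check on pre- and post-improvement samples is exactly the "two periods" comparison the project performed: a process is declared out of control as long as any point falls outside the limits.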
Background/Objectives: This study aimed to investigate the determinants of skating and the differences between male and female bandy players in spatiotemporal variables during acceleration and at maximum sprint skating velocity. Methods: Seventy-four female bandy players (age: 18.9 ± 4.1 years; height: 1.67 ± 0.06 m; body mass: 63.2 ± 7.4 kg; training experience: 13.4 ± 3.9 yrs.; 26 elite and 48 junior elite) and 111 male bandy players (age: 20.7 ± 5.0 years; height: 1.80 ± 0.05 m; body mass: 76.4 ± 8.4 kg; training experience: 13.8 ± 5.0 yrs.; 47 elite and 66 junior elite players) performed linear sprint skating over 80 m. Split times were measured every ten metres by photocells to calculate step velocities, while spatiotemporal skating variables (glide time and length, step length, and step frequency) were recorded by IMUs attached to the skates. The first six steps (acceleration phase), the six steps at the highest velocity (maximal speed phase), and the average of all steps were used to analyse glide-by-glide spatiotemporal variables. Results: Male players exhibited higher acceleration and maximal skating velocity than female players. The higher acceleration in men was accompanied by shorter gliding time, longer step length, and higher step frequency. When skating at maximal speed, male players had a longer step length and longer gliding time and length. The sub-group analysis revealed that step frequency did not correlate with skating velocity in either the acceleration or the maximal speed phase. On the other hand, glide and step lengths correlated significantly with skating velocity in both phases (r ≥ 0.60). Conclusions: For faster skating in bandy, it is generally better to prioritise glide and step length over stride frequency. Hence, players should be encouraged to stay low with more knee flexion to enable a longer extension, and therefore a longer path and a more horizontal direction of applied force, enhancing their acceleration ability.
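The photocell protocol above reduces to a simple computation: each ten-metre interval's average velocity is the interval length divided by the difference of cumulative split times. A minimal illustrative helper (not code from the study):

```python
def segment_velocities(split_times, interval=10.0):
    """Average velocity (m/s) over each photocell interval,
    given cumulative split times in seconds starting at 0."""
    return [interval / (t1 - t0) for t0, t1 in zip(split_times, split_times[1:])]
```

The maximal speed phase then corresponds to the steps recorded around `max(segment_velocities(...))`.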
The increasing demand for lithium-ion batteries (LIBs) and their limited lifespan emphasize the urgent need for sustainable recycling strategies. This study investigates the application of tetrabutylphosphonium-based ionic liquids (ILs) as alternative leaching agents for recovering critical metals, Li(I), Co(II), Ni(II), and Mn(II), from spent NMC cathode materials. Initial screening experiments evaluated the leaching efficiencies of nine tetrabutylphosphonium-based ILs for Co(II), Ni(II), Mn(II), and Li(I), revealing distinct metal dissolution behaviors. Three ILs containing HSO4−, EDTA2−, and DTPA3− anions exhibited the highest leaching performance and were selected for further optimization. Key leaching parameters, including IL and acid concentrations, temperature, time, and solid-to-liquid ratio, were systematically adjusted, achieving leaching efficiencies exceeding 90%. Among the tested systems, [TBP][HSO4] enabled near-complete metal dissolution (~100%) even at room temperature. Furthermore, an aqueous biphasic system (ABS) was investigated utilizing [TBP][HSO4] in combination with ammonium sulfate, enabling the complete extraction of all metals into the salt-rich phase while leaving the IL phase metal-free and potentially suitable for reuse, indicating the feasibility of integrating leaching and extraction into a continuous, interconnected process. This approach represents a promising step forward in LIB recycling, highlighting the potential for sustainable and efficient integration of leaching and extraction within established hydrometallurgical frameworks.
Vehicular communication via V2X networks increases road safety, but is vulnerable to data manipulation which can lead to serious incidents. Existing security systems, such as misbehavior detection systems, have limitations in detecting and mitigating such threats. To address these challenges, we have implemented a software prototype of a Trust Assessment Framework (TAF) that assesses the trustworthiness of received V2X data by integrating evidence from multiple trust sources. This interactive demonstration illustrates the quantification of trust for a smart traffic light system application. We demonstrate the impact of varying evidence coming from a misbehavior detection system and a security report generator on the trust assessment process. We also showcase internal processing steps within our TAF when receiving new evidence, up to and including the eventual decision making on the trustworthiness of the received V2X data.
Future vehicles and infrastructure will rely on data from external entities such as other vehicles via V2X communication for safety-critical applications. Malicious manipulation of this data can lead to safety incidents. Earlier works proposed a trust assessment framework (TAF) to allow a vehicle or infrastructure node to assess whether it can trust the data it received. Using subjective logic, a TAF can calculate trust opinions for the trustworthiness of the data based on different types of evidence obtained from diverse trust sources. One particular challenge in trust assessment is the appropriate quantification of this evidence. In this paper, we introduce different quantification methods that transform evidence into appropriate subjective logic opinions. We suggest quantification methods for different types of evidence: security reports, misbehavior detection reports, intrusion detection system alerts, GNSS spoofing scores, and system integrity reports. Our evaluations in a smart traffic light system scenario show that, when using our proposed quantification methods, the TAF detects attacks with an accuracy greater than 96% and increases intersection throughput by 42%, while maintaining safety and security.
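The core step of mapping evidence to a subjective logic opinion can be sketched with the standard binomial-opinion mapping from positive and negative evidence counts. This is the textbook subjective logic formulation, not necessarily the exact quantification method proposed in the paper:

```python
def opinion_from_evidence(r, s, base_rate=0.5, W=2.0):
    """Map positive (r) and negative (s) evidence counts to a subjective
    logic binomial opinion (belief, disbelief, uncertainty, base rate).
    W is the non-informative prior weight (conventionally 2)."""
    total = r + s + W
    return (r / total, s / total, W / total, base_rate)

def expected_trust(opinion):
    """Probability expectation E = b + a * u, used for the final trust decision."""
    b, d, u, a = opinion
    return b + a * u
```

A source that has delivered eight consistent reports and no contradictory ones, for example, yields belief 0.8 with residual uncertainty 0.2; the TAF would compare the expectation against a trust threshold before accepting the V2X data.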
This paper addresses the challenge of analyzing CVs to parse their content into structured formats suitable for further processing and analysis. The proposed solution processes CVs provided as images or PDFs, handling diverse input formats, including free-form, multi-language, non-standardized layouts, and highly structured documents. Various heuristic approaches are employed for layout analysis, complemented by lightweight language models for extracting information. While multimodal models demonstrate strong performance, their cost and deployment complexity remain significant barriers. This study explores alternative methods optimized for computational efficiency, processing accuracy, and easier deployment. A comparative analysis of approaches is conducted on a standard dataset containing CVs from diverse clients and job roles, ranging from entry-level to specialized positions in various domains. The findings highlight the potential of these tailored, efficient solutions for scalable and secure CV parsing.
The Vehicle Routing Problem (VRP) is among the most complex optimization problems. Practical solutions require addressing real-world constraints such as time windows, vehicle capacities, delivery restrictions, driver working hours, and heterogeneous vehicle fleets. Solutions are often implemented in two stages: the first involves clustering customers, while the second focuses on incremental routing of these clusters to reduce complexity and improve solution control and explainability. However, the second stage heavily depends on the quality of the first, and clustering methods vary depending on client requirements. This paper explores various clustering methods and their impact on the final routing results, with a focus on real-world examples. The study includes diverse client scenarios, ranging from small-scale distribution systems with a limited number of customers to large-scale operations managing more than a thousand deliveries daily, covering both small and large orders. From fixed clustering and geographic partitioning to dynamic clustering algorithms and hybrid approaches, the advantages and limitations of each method are analyzed. The findings aim to provide actionable insights into selecting clustering methods that align with specific use cases, ensuring enhanced efficiency and adaptability in practical applications.
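The two-stage decomposition described above can be illustrated with one classical geographic-partitioning heuristic (the sweep method) followed by a greedy route per cluster. This is a generic sketch of the cluster-first, route-second pattern, not any specific method from the paper:

```python
from math import atan2, dist

def sweep_clusters(depot, customers, capacity):
    """Stage 1 (sweep heuristic): order customers by polar angle around the
    depot and cut a new cluster when the running demand would exceed capacity.
    Each customer is (x, y, demand)."""
    by_angle = sorted(customers, key=lambda c: atan2(c[1] - depot[1], c[0] - depot[0]))
    clusters, current, load = [], [], 0
    for cust in by_angle:
        if current and load + cust[2] > capacity:
            clusters.append(current)
            current, load = [], 0
        current.append(cust)
        load += cust[2]
    if current:
        clusters.append(current)
    return clusters

def nearest_neighbour_route(depot, cluster):
    """Stage 2: greedy nearest-neighbour ordering within one cluster."""
    route, pos = [], depot
    todo = [(c[0], c[1]) for c in cluster]
    while todo:
        nxt = min(todo, key=lambda p: dist(pos, p))
        todo.remove(nxt)
        route.append(nxt)
        pos = nxt
    return route
```

The interdependence the paper highlights is visible even here: a poor capacity cut in stage 1 forces stage 2 to route geographically scattered customers together, and no amount of routing effort recovers the lost efficiency.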
Computer games can be used not only for entertainment but also for education. Embedded systems can improve the gamified learning process by making the interaction with intended users more interesting. Texas Instruments development kits with booster plug-in modules can improve the outcomes of gamified learning. However, there is a lack of studies that explore the benefits and drawbacks of different input methods for gamified learning purposes. In this paper, a snake game was developed on the Texas Instruments MSP-EXP432P401R development kit that uses the analog joystick of the BOSTXL-EDUMKII plug-in module for controlling the snake. An experimental usability study was conducted on 61 third-year university students, comparing the analog joystick to the computer keyboard, computer mouse, and mobile touchscreen input methods. The results showed that the majority of students preferred the original computer keyboard input method and that more than half of the participants preferred 90-degree rotation of the snake over the 360-degree analog joystick. However, the analog joystick improved the gaming experience by 63.6%, and many students made positive comments about its usability in general, indicating that its application for gamified learning may be possible for other types of games.
Warehouse Management Systems (WMS) employ advanced optimization techniques to enhance efficiency and streamline processes, from inventory positioning to order picking and packing. Among these, order picking represents the most time-consuming and resource-intensive operation. This paper presents a novel approach for monitoring worker efficiency in warehouses, focusing on estimating the complexity and time required for order picking. A variety of factors influence these estimates, including item location, quantity, dimensions, and weight, as well as the picking sequence and whether the location is in the stock or picking zone. Accurate estimation enables effective daily work planning, real-time monitoring of worker productivity, and overall warehouse efficiency. The proposed approach has been tested in real-world warehouse environments, demonstrating its practical applicability and potential to significantly improve worker performance, resource allocation, and operational management.
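One simple way to combine the factors listed above is a linear time model per order line. The coefficients and the model form below are hypothetical placeholders for illustration; the paper's actual estimator is not specified in the abstract:

```python
def estimate_pick_time(lines, base=5.0, per_line=12.0, per_kg=0.4,
                       travel_per_m=0.8, stock_zone_penalty=45.0):
    """Hypothetical linear model for order-picking time (seconds).
    Each order line is (distance_m, quantity, unit_weight_kg, in_stock_zone)."""
    total = base
    for distance, qty, weight, in_stock_zone in lines:
        total += per_line + travel_per_m * distance + per_kg * qty * weight
        if in_stock_zone:
            total += stock_zone_penalty  # slower access outside the picking zone
    return total
```

Comparing such an estimate against the actually recorded picking time is what enables the real-time productivity monitoring the paper describes.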
Texas Instruments development kits have wide application in practical and scientific experiments due to their small size, processing power, available booster packs, and compatibility with different environments. The most popular integrated development environments for programming these development kits are Energia and Code Composer Studio. Unfortunately, there are no existing studies that compare the benefits and drawbacks of these environments and their performance. Conversely, the performance of the FreeRTOS environment is well explored, making it a suitable baseline for embedded systems execution. In this paper, we performed an experimental evaluation of the performance of the Texas Instruments MSP-EXP432P401R when using Energia, Code Composer Studio, and FreeRTOS for program execution. Three sorting algorithms (bubble sort, radix sort, merge sort) and three search algorithms (binary search, random search, linear search) were used for this purpose. The results show that Energia sorting algorithms outperform the other environments with a maximum of 400 elements. On the other hand, FreeRTOS search algorithms far outperform the other environments with a maximum of 255,000 elements (whereas this maximum was 10,000 elements for the other environments). Code Composer Studio resulted in the largest processing time, which indicates that the low-level registry editing performed in this environment leads to significant performance issues.
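The measurement pattern behind such an evaluation is algorithm-agnostic: run each algorithm on the same input and keep the best wall-clock time over a few repeats. A minimal Python sketch of that harness (the study itself ran C code on the MSP432; this only mirrors the methodology, shown here with bubble sort as one of the six tested algorithms):

```python
import random
import time

def bubble_sort(a):
    """Textbook bubble sort on a copy of the input list."""
    a = list(a)
    for i in range(len(a)):
        for j in range(len(a) - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

def time_it(fn, data, repeats=3):
    """Best-of-N wall-clock time, mirroring per-algorithm measurements."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        fn(data)
        best = min(best, time.perf_counter() - start)
    return best
```

Repeating `time_it` for each algorithm and input size, and on each environment, yields exactly the kind of cross-environment comparison tables reported in the paper.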
The impressive results achieved in language tasks by generative pre-trained transformers have led to divided opinions on whether or not the Turing test has finally been passed. An examination of the working principles of GPT programs showed that the tokenization concept used by GPT results in the loss of the word-to-letter relationship. Using about 36 specially prepared anagrams, each accompanied by a versed description of the target term in the languages of the South Slavs, it was shown that ChatGPT and similar programs are far more capable of grasping the semantic connection between words and allusions than of performing the relatively simple task of assembling an adequate word from the offered letters.
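The letter-level task the models struggle with is trivial to verify programmatically, which underlines the contrast the paper draws. A minimal multiset check (illustrative, not the paper's evaluation code):

```python
from collections import Counter

def is_anagram_solution(candidate, letters):
    """True when `candidate` uses exactly the offered letters (case-insensitive)."""
    return Counter(candidate.lower()) == Counter(letters.lower())
```

Because tokenization groups characters into subword units, a GPT model never "sees" the individual letters this comparison operates on, which is the paper's proposed explanation for the failure mode.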