For many years, MATLAB has been considered the academic standard for technical computing and simulation. Many university and college courses rely on the numerous toolboxes and add-ons that MATLAB provides. With its relatively simple syntax and large user community, it has long been a logical choice for academia. However, students fresh out of university increasingly encounter software that has quickly become an industry standard in many areas of electrical engineering: LabVIEW. Using a simple example of DC motor control, this paper aims to showcase the advantages of early adoption of LabVIEW for programming and simulation purposes in academia.
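Whichever tool is used, the underlying plant is the same; as a language-neutral reference, the step response of a textbook armature-controlled DC motor speed model can be sketched as follows (the parameter values are illustrative, not taken from the paper):

```python
from scipy import signal

# Textbook DC-motor speed model: G(s) = K / ((J s + b)(L s + R) + K^2),
# relating armature voltage to shaft speed. Illustrative parameters:
J, b, K, R, L = 0.01, 0.1, 0.01, 1.0, 0.5  # inertia, friction, motor const, resistance, inductance

num = [K]
den = [J * L, J * R + b * L, b * R + K**2]
motor = signal.TransferFunction(num, den)

# Speed response to a unit voltage step; steady state approaches K / (b R + K^2)
t, speed = signal.step(motor)
final_speed = speed[-1]
```

The same model is what one would enter as a transfer-function block in either MATLAB/Simulink or LabVIEW's control-design toolkits.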
Purpose This study aims to investigate the relationship between Islamic governance and the social performance of Islamic banks, pioneering the examination of the impact of the National Shariah Board (NSB) on the social performance of Islamic banks. The essential body in Islamic banks in charge of Islamic governance is the Shariah Supervisory Board (SSB). Therefore, in this study, the authors explore how the characteristics of the Shariah board and Islamic governance mechanisms influence the social performance of Islamic banks. Design/methodology/approach Panel data methods are applied to the annual data of 43 banks from 14 countries over the period 2012–2018 to explore the impact of Islamic governance on Islamic banks’ social performance. The authors used all available bank annual reports in the given period. Social performance is measured by a Maqasid al-Shariah index (reflecting the goals of the Islamic moral economy) using a comprehensive evaluation framework. Islamic governance is represented by the improved Islamic Governance Score (IG-Score) index, which measures the quality of Islamic governance in Islamic banks. The authors also introduce the frequency of SSB meetings into the IG-Score. Findings The findings suggest a strong link between Islamic governance and the social performance of Islamic banks, illustrating the importance of the Shariah board in achieving maqasid. On the other hand, the research found that NSBs are inefficient and that the existence of an NSB can jeopardize the social performance of Islamic banks. These results yield valuable recommendations for Islamic banks that are keen to improve their social performance. Originality/value Besides investigating the impact of SSB governance on the social performance of Islamic banks using an improved IG-Score index, to the best of the authors’ knowledge, this is the first study to investigate the impact of NSBs on the social performance of Islamic banks.
Abstract: The degradation of the environment is one of the most urgent challenges today. Since the industrial revolution, we have known only the linear economy model, which links growth and consumption with the creation of large amounts of waste. As an alternative, a new concept of the modern economy has emerged: the circular economy. The underlying assumptions of such a system are characterised by a tendency towards efficient use, recycling, and re-use of resources, as this would limit the negative environmental impacts of the economy while reducing the costs of economic activities, with the aim of economic growth. Our goal in this paper is to highlight the role and significance of the circular economy and natural resources in the process of creating competitive advantages in a globally connected world, as well as in Bosnia and Herzegovina. Companies have preferred mass production of material wealth based on the mass consumption of natural resources as the main economic development method while pursuing high economic growth and maximum profit. These days, this development method faces various limitations and problems, such as the mass generation of waste exceeding the natural purification capacity, enormous damage to the environment, worsening natural disasters and global warming, and various disputes over natural resources. This analysis highlights that the use of circular economy tools can help economic policy makers and researchers take into account the impact on the environment during strategic planning activities and projections of economic growth in BiH.
A search is presented for a heavy resonance $Y$ decaying into a Standard Model Higgs boson $H$ and a new particle $X$ in a fully hadronic final state. The full Large Hadron Collider Run 2 dataset of proton-proton collisions at $\sqrt{s}= 13$ TeV collected by the ATLAS detector from 2015 to 2018 is used, and corresponds to an integrated luminosity of 139 fb$^{-1}$. The search targets the high $Y$-mass region, where the $H$ and $X$ have a significant Lorentz boost in the laboratory frame. A novel signal region is implemented using anomaly detection, where events are selected solely because of their incompatibility with a learned background-only model. It is defined using a jet-level tagger for signal-model-independent selection of the boosted $X$ particle, representing the first application of fully unsupervised machine learning to an ATLAS analysis. Two additional signal regions are implemented to target a benchmark $X$ decay into two quarks, covering topologies where the $X$ is reconstructed as either a single large-radius jet or two small-radius jets. The analysis selects Higgs boson decays into $b\bar{b}$, and a dedicated neural-network-based tagger provides sensitivity to the boosted heavy-flavor topology. No significant excess of data over the expected background is observed, and the results are presented as upper limits on the production cross section $\sigma(pp \rightarrow Y \rightarrow XH \rightarrow q\bar{q}b\bar{b})$ for signals with $m_Y$ between 1.5 and 6 TeV and $m_X$ between 65 and 3000 GeV.
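The anomaly-detection idea above, training on background only and then selecting events by their incompatibility with the learned background model, can be illustrated with a generic outlier detector. The sketch below uses scikit-learn's IsolationForest as a stand-in (the analysis itself uses a dedicated unsupervised jet tagger), and all data are synthetic:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# "Background" jets: features clustered near the origin (stand-in for QCD jets)
background = rng.normal(0.0, 1.0, size=(2000, 4))
# "Signal-like" jets: drawn far from the background distribution
signal = rng.normal(5.0, 1.0, size=(50, 4))

# Train only on background: the model learns what "ordinary" events look like
model = IsolationForest(random_state=0).fit(background)

# Lower score = more anomalous; keep only events incompatible with background
bkg_scores = model.score_samples(background)
sig_scores = model.score_samples(signal)
threshold = np.quantile(bkg_scores, 0.01)  # 1% most background-incompatible
selected = sig_scores < threshold
```

Because no signal model enters the training, the same selection applies to any $X$ decay that looks unlike the background.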
This paper analyses court cases which qualified as organised crime in Bosnia and Herzegovina (B&H). The final judgments were analysed according to the following criteria: the number of defendants; the continuity of membership within the crime organisation; the existence of a criminal structure; the existence of a developed plan of activities; the type and number of the offences committed; influence on public authorities, the judiciary, and citizens; and the sentences imposed on the defendants. This paper seeks to identify the extent to which court judgments are based on these criteria. A secondary analysis of the data related to the organised crime cases heard in the Court of Bosnia and Herzegovina between 2015 and 2018 was conducted. This analysis encompassed 21 organised crime cases in which 27 judgments were pronounced. In the observed period (2015-2018), we identified two organised criminal groups (OCGs) that meet the criteria analysed. This number is minimal in relation to the total number of organised crime cases processed. Our findings contradict the prevailing view in public discourse that organised crime is a widespread security threat in B&H. Our research also demonstrated the existence of legal gaps, reflected in the lack of clear criteria on the basis of which OCGs can be distinguished from other forms of criminal activity. Legal and institutional weaknesses create opportunities for OCGs to operate and create a sense of insecurity among citizens in the already complex security environment in B&H.
This paper presents the implementation of the Binary Search Algorithm (BSA) to determine the Maximum Power Point (MPP) of a photovoltaic (PV) system under variable weather conditions. Additionally, the conventional, well-known Perturb and Observe (P&O) algorithm is implemented for comparison with the binary-search-based Maximum Power Point Tracking (MPPT) algorithm. Both algorithms are implemented in real time in the MATLAB/Simulink environment. The experimental study is performed using two 260 W series-connected PV modules, a buck converter, and a Humusoft MF 634 card to enable real-time operation. The duty cycle of the buck converter is updated in each step, moving the operating point closer to the MPP. The obtained experimental results demonstrate that the binary-search-based MPPT algorithm is more efficient and accurate than the P&O MPPT algorithm.
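The core idea, halving the duty-cycle search interval based on the local slope of the power curve, can be sketched as follows. The power curve and tolerance below are hypothetical stand-ins; the paper's implementation runs in real time against actual converter hardware:

```python
def binary_search_mpp(power, lo=0.0, hi=1.0, tol=1e-4):
    """Binary-search the duty cycle of a unimodal P(duty) curve.

    `power` maps a duty cycle in [lo, hi] to PV output power; each step
    halves the interval using the local slope sign, mirroring how the
    converter's duty cycle would be driven toward the MPP.
    """
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        eps = tol / 2.0
        if power(mid + eps) > power(mid - eps):  # power still rising: MPP is to the right
            lo = mid
        else:                                    # power falling: MPP is to the left
            hi = mid
    return (lo + hi) / 2.0

# Hypothetical unimodal PV power curve with its maximum at duty = 0.62
curve = lambda d: -(d - 0.62) ** 2 + 1.0
duty = binary_search_mpp(curve)
```

Unlike P&O, which takes fixed-size steps, the interval shrinks geometrically, which is the source of the faster convergence reported in the paper.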
In this paper, depolarisation in Body Area Networks for Body-to-Infrastructure communications is analysed, based on a measurement campaign in the 5.8 GHz band in an indoor environment. Measurements were made with an off-body antenna transmitting linearly polarised signals and dual-polarised receiving antennas carried by the user on the body. A Normal distribution with a mean of 2.0 dB and a standard deviation of 4.3 dB is found to be the best fit for modelling cross-polarisation discrimination. The average correlation between the signals received by the orthogonally polarised antennas is below 0.5, showing that polarisation diversity can be used. A model is proposed for the average value and the standard deviation of the cross-polarisation discrimination ratio as a function of the transmitted polarisation, the mobility of users, and link dynamics.
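The two statistical steps named above, fitting a Normal distribution to cross-polarisation discrimination (XPD) samples and checking branch correlation for diversity, can be sketched as follows. All data are synthetic, generated only to match the reported fit of 2.0 dB mean and 4.3 dB standard deviation:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic XPD samples (dB), drawn to mimic the reported Normal fit
xpd_db = rng.normal(2.0, 4.3, size=5000)

# Maximum-likelihood Normal fit of the XPD samples
mu, sigma = stats.norm.fit(xpd_db)

# Two synthetic diversity branches with weak cross-correlation;
# a correlation below 0.5 is the usual rule of thumb for useful diversity
v = rng.standard_normal(5000)
h = 0.3 * v + rng.standard_normal(5000)
rho = np.corrcoef(v, h)[0, 1]
```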
Cause-effect graphs are a commonly used black-box testing method, and many different algorithms for converting system requirements to cause-effect graph specifications and deriving test case suites have been proposed. However, in order to test the efficiency of black-box testing algorithms on a variety of cause-effect graphs containing different numbers of nodes, logical relations, and dependency constraints, a dataset containing a collection of cause-effect graph specifications created by authors of existing papers is necessary. This paper presents CEGSet, the first such collection of cause-effect graph specifications. The dataset contains a total of 65 graphs collected from the available relevant literature. The specifications were created using the ETF-RI-CEG graphical software tool and can be used by future authors of papers focusing on the cause-effect graphing technique. The collected graphs can be re-imported into the tool and used for the desired purposes. Where possible, the collection also includes the specification of system requirements in natural-language form, from which the cause-effect graphs were derived. This will encourage future work on automating the process of converting system requirements to cause-effect graph specifications.
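A cause-effect graph specification of the kind collected in CEGSet can be represented minimally as boolean causes wired to effects through logical relations, from which a test suite is derived. The cause/effect names and relations below are hypothetical, not taken from the dataset:

```python
from itertools import product

# Minimal cause-effect graph: causes are boolean inputs, effects are
# boolean functions over them (AND/OR relations); names are illustrative
graph = {
    "E1": lambda c: c["C1"] and c["C2"],   # AND relation
    "E2": lambda c: c["C1"] or c["C3"],    # OR relation
}
causes = ["C1", "C2", "C3"]

def derive_test_cases(graph, causes):
    """Enumerate every cause combination and record the resulting effects.

    Real derivation algorithms prune this exhaustive enumeration; full
    enumeration is shown only to make the graph semantics explicit.
    """
    cases = []
    for values in product([False, True], repeat=len(causes)):
        assignment = dict(zip(causes, values))
        cases.append((assignment, {e: f(assignment) for e, f in graph.items()}))
    return cases

suite = derive_test_cases(graph, causes)
```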
Technical advances as well as continuously evolving business demands are reshaping the need for flexible connectivity in industrial control systems. One way to achieve such needs is a service-oriented approach, where a connectivity service middleware provides controller- as well as protocol-specific interfaces. The Message Queuing Telemetry Transport (MQTT) protocol is widely used for device-to-device communication in the Internet of Things (IoT). However, it is not commonly integrated in industrial control systems. To address this gap, this paper describes the development and implementation of a prototype connectivity service middleware for MQTT within an industrial private control network. The prototype is implemented in the context of an industrial controller and used in a simulated modular automation system. Furthermore, various deployment scenarios are evaluated with respect to response time and scalability of the connectivity service.
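The role of such a connectivity-service middleware, a protocol-agnostic publish/subscribe layer between controllers and an MQTT bridge, can be sketched as follows. All class, method, and topic names here are illustrative assumptions, not the paper's actual interfaces:

```python
class ConnectivityService:
    """Minimal in-process publish/subscribe middleware sketch: controllers
    publish tag updates through a protocol-agnostic interface, and
    protocol bridges (e.g. an MQTT client) subscribe per topic."""

    def __init__(self):
        self._subscribers = {}

    def subscribe(self, topic, callback):
        self._subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, payload):
        for cb in self._subscribers.get(topic, []):
            cb(topic, payload)

def tag_to_topic(plant, controller, tag):
    """Controller-side interface: map a controller tag to an MQTT-style topic."""
    return f"{plant}/{controller}/{tag}"

bus = ConnectivityService()
received = []
# An MQTT bridge would forward these to a broker; here we just record them
bus.subscribe("plant1/plc42/temperature", lambda t, p: received.append((t, p)))
bus.publish(tag_to_topic("plant1", "plc42", "temperature"), 21.5)
```

In a real deployment the subscriber callback would wrap an MQTT client's publish call, so controllers never depend on the protocol directly.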
As the shipping sector has been one of the major factors in economic growth over the past decades, its digitalization is expected to bring unprecedented improvements in the safety and reliability of ship control, ultimately enabling the autonomous operation of ships. Automated control of ships will not only mitigate the risks of human error but will also improve the efficiency of operations by preventing unexpected delays, while being environmentally sustainable. With the advent of the Internet of Ships (IoS), well-known and mature concepts of the Internet of Things (IoT) are being applied to ships and ports, which are thereby increasingly equipped with sensing and communication capabilities that set the ground for improved situational awareness and better decision-making. However, many challenges still need to be thoroughly studied, such as the communication between barges, ports, and services, as increased network latency and bandwidth limitations imposed by satellite communications could introduce significant risks of accidents, ultimately affecting the overall automated operation/teleoperation of barges. In this paper, we present one of the first attempts to test the potential of 5G systems for automating barge operations, starting from teleoperation as an enabler of automation, thereby creating and validating a cellular-based automated barge control system in a real-life environment. In this system, the barge sails in a busy port area, the Port of Antwerp-Bruges, while connected to the 5G network. We assess the quality of the 5G communication system and present and discuss our initial results on the enhancements that 5G could bring to the teleoperation and automation of barge control.
The proliferation of 5G technology is enabling vertical industries to improve their day-to-day operations by leveraging enhanced Quality of Service (QoS). One of the key enablers of such 5G performance is network slicing, which allows telco operators to logically split the network into various virtualized networks whose configuration, and thus performance, can be tailored to verticals and their low-latency and high-throughput requirements. However, given the end-to-end perspective of 5G ecosystems, where slicing needs to be applied on all network segments, including radio, edge, transport, and core, managing the deployment of slices is becoming highly demanding, and various verticals have strict requirements that need to be fulfilled. Thus, in this paper, we focus on a solution for dynamic and quality-aware network slice management and orchestration that simultaneously orchestrates network slices deployed on top of three 5G testbeds built for transport and logistics use cases. The slice orchestration system dynamically interacts with the testbeds while monitoring the real-time performance of allocated slices, which triggers decisions to either allocate new slices or reconfigure existing ones. We illustrate scenarios where dynamic provisioning of slices is required in one of the testbeds, taking into account specific latency, throughput, and location requirements coming from the verticals and their end users.
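The quality-aware monitoring loop described above, where monitored slice KPIs trigger reconfiguration decisions, can be illustrated with a toy decision rule. This is purely illustrative; the actual orchestration system's logic is more involved:

```python
def slice_action(latency_ms, throughput_mbps, req_latency_ms, req_throughput_mbps):
    """Toy quality-aware decision: a monitored slice that violates either
    the vertical's latency or throughput requirement is reconfigured."""
    if latency_ms > req_latency_ms or throughput_mbps < req_throughput_mbps:
        return "reconfigure"
    return "keep"

# Hypothetical slice required to stay under 20 ms latency and above 50 Mbps
action = slice_action(latency_ms=35.0, throughput_mbps=80.0,
                      req_latency_ms=20.0, req_throughput_mbps=50.0)
```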
5G Stand Alone (SA) networks are starting to be considered, designed, and implemented in multiple countries in various forms (public, private, experimental). Mass adoption of 5G SA networks is expected to materialize by 2025. Large-scale deployment is anticipated due to the rich features and capabilities offered by 5G networks, including but not limited to slicing, service orchestration, and automation, bringing the benefits of 5G to industry stakeholders and verticals. The concept of Network Applications is gaining momentum as a way to ease the process of deploying industry-specific services and applications and to integrate them seamlessly with the new 5G networks and customer-specific application components. We target deploying and operating novel 5G SA testbeds, Network Applications, and related capabilities in different transport and logistics (T&L) facilities across Europe. We envision architectural advancement in terms of 5G features, such as orchestration, multi-slice implementation, Quality of Service (QoS)/Quality of Experience (QoE), and an innovative end-to-end monitoring framework for network and application KPIs. In this paper, 5G open testbed advancements (3GPP Rel. 16 compliant) and readiness for Network Application experiments in real-life scenarios are presented, integrated as a unitary whole within the EU-funded VITAL-5G project.
Assessment of the functional significance of coronary artery stenosis using invasive measurement of fractional flow reserve (FFR) or non-hyperemic indices has been shown to be safe and effective in deciding whether to perform percutaneous coronary intervention (PCI). Despite strong evidence from clinical trials, utilization of these techniques is still relatively low worldwide. This may to some extent be attributed to factors inherent to invasive measurements, such as prolongation of the procedure, side effects of the drugs that induce hyperemia, additional steps the operator must perform, the possibility of damaging the vessel with the wire, and additional costs. During the last few years, there has been growing interest in the non-invasive assessment of coronary artery lesions, which may provide the interventionalist with important physiological information about lesion severity and overcome some of these limitations. Several dedicated software solutions on the market can estimate FFR using a 3D reconstruction of the interrogated vessel derived from two separate angiographic projections taken during diagnostic coronary angiography. Furthermore, some of them use data on aortic pressure and frame count to calculate the pressure drop (and FFR) more accurately. The ideal non-invasive system should be integrated into the workflow of the cath lab and run online (during the diagnostic procedure), thereby not prolonging procedural time significantly, while giving the operator additional information such as vessel size, lesion length, and the possible post-PCI FFR value. Following their development, these technologies were evaluated in clinical trials, where good correlation and agreement with invasive FFR (considered the gold standard) were demonstrated.
Currently, only one trial with clinical outcomes (FAVOR III China) has been completed; it demonstrated that QFR-guided PCI may provide better results at 1-year follow-up than the angiography-guided approach. We are awaiting the results of several other trials with clinical outcomes that test the performance of these indices in guiding PCI against either an FFR-based or angiography-based approach in various clinical settings. Herein we present an overview of the currently available data, a critical review of the major clinical trials, and further directions of development for the five most widely available non-invasive indices: QFR, vFFR, FFRangio, caFFR, and AccuFFRangio.
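As background, the quantity all these indices estimate is simple: FFR is the ratio of mean distal coronary pressure to mean aortic pressure under hyperemia, with values at or below 0.80 commonly treated as functionally significant. The pressure values below are illustrative:

```python
def ffr(p_distal_mmhg, p_aortic_mmhg):
    """Fractional flow reserve: mean distal coronary pressure divided by
    mean aortic pressure under hyperemia."""
    return p_distal_mmhg / p_aortic_mmhg

# Illustrative pressures (mmHg); 0.80 is the usual significance cut-off
value = ffr(68.0, 92.0)
significant = value <= 0.80
```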
The paper evaluates the statistical significance of differences in the feature values used to differentiate signals corresponding to cardiac arrhythmia (AR) and atrial fibrillation (AF). The initial set of heart rate variability (HRV) features includes time- and frequency-domain metrics, as well as geometric metrics based on the Poincaré diagram. Due to the non-uniformity of the heart rate signal, frequency-domain features are calculated using two approaches: the Lomb-Scargle method for spectral analysis of non-uniform signals, and the Welch method for uniform signals after interpolation and resampling. The choice of statistical test depended on the distribution of feature values: normally distributed features were compared using the parametric ANOVA test; otherwise, the non-parametric Wilcoxon–Mann–Whitney test was used. For each evaluated feature, the tests indicated whether the difference between the two observed groups of signals was statistically significant. The success of classification depends on features well chosen according to their importance. In the paper, the statistical tests resulted in the selection of 27 features out of the initial 51. The proposed set of features could be used for classification between AR and AF signals to assist diagnosis of these heart diseases.
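The two spectral approaches can be sketched on a synthetic non-uniform tachogram with a known 0.1 Hz oscillation; both should recover the same spectral peak. All parameters here are illustrative, not the paper's settings:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(2)
# Non-uniform sample times: beats spaced ~0.8 s apart with jitter
t = np.cumsum(0.8 + 0.05 * rng.standard_normal(300))
# Synthetic HRV-like signal with a 0.1 Hz (LF-band) oscillation plus noise
x = np.sin(2 * np.pi * 0.1 * t) + 0.1 * rng.standard_normal(t.size)

# Approach 1: Lomb-Scargle directly on the non-uniform samples
freqs = np.linspace(0.02, 0.5, 200)                      # Hz
pgram = signal.lombscargle(t, x - x.mean(), 2 * np.pi * freqs)
f_ls = freqs[np.argmax(pgram)]

# Approach 2: interpolate onto a uniform 4 Hz grid, then Welch
fs = 4.0
tu = np.arange(t[0], t[-1], 1 / fs)
xu = np.interp(tu, t, x)
f_w, pxx = signal.welch(xu, fs=fs, nperseg=256)
f_welch = f_w[np.argmax(pxx)]
```

Both estimators locate the oscillation near 0.1 Hz; the difference in practice lies in how each handles the interpolation bias versus the irregular sampling.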