Al Ain Campus
Dr. Issam Al-Azzoni received his Ph.D. in Software Engineering from McMaster University, Ontario, Canada. He is currently with the College of Engineering at Al Ain University, Al Ain, United Arab Emirates. His research interests include software modeling, model transformations, data science, and the application of formal methods and machine learning in software engineering. Dr. Issam is also a certified Associate Big Data Analyst (ABDA) by the Data Science Council of America (DASCA).
Applications of formal methods and machine learning in software engineering, software modeling, model transformations, software performance engineering and data science.
Software Evolution and Maintenance, Data and Web Mining, Introduction to Artificial Intelligence, Ethical Hacking, Software Measurement and Testing, Introduction to Numerical Methods, Discrete Structures, Introduction to Programming, Introduction to Compilers, Web Development, Data Structures and Algorithms, and Object-Oriented Programming.
IEEE: Institute of Electrical and Electronics Engineers – Senior Member
Published in: International Conference On Systems Engineering
Aug 04, 2023
The trending large language model-based ChatGPT service, originally intended as a conversational agent, has been adopted in many areas, from programming to entertainment. On the other hand, developing smart contracts for various blockchain platforms is a time- and effort-demanding task due to their special characteristics. In this paper, we explore how ChatGPT can be leveraged for automated smart contract generation, aiming to reduce the time and effort required for development. For our case studies, we consider the Solidity and DAML smart contract languages. As an outcome, we propose a model-driven framework that treats the problem as a dialogue in a specific context between a user on one hand, facilitated via a smart contract model, and a ChatGPT service on the other. According to our results, the approach seems promising, especially due to its flexibility.
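The dialogue-based generation described above can be sketched as rendering a contract model into a prompt for the LLM service. The following is a minimal, illustrative sketch only; the names (ContractModel, build_prompt) and the model fields are assumptions, not the paper's actual framework.

```python
# Hypothetical sketch: turning a minimal smart contract model into a prompt
# for an LLM service. ContractModel and build_prompt are illustrative names,
# not taken from the paper's framework.
from dataclasses import dataclass, field

@dataclass
class ContractModel:
    name: str
    language: str                                        # e.g. "Solidity" or "DAML"
    state_variables: dict = field(default_factory=dict)  # variable name -> type
    functions: list = field(default_factory=list)        # function names

def build_prompt(model: ContractModel) -> str:
    """Render the contract model as a natural-language generation request."""
    vars_desc = ", ".join(f"{n} ({t})" for n, t in model.state_variables.items())
    funcs_desc = ", ".join(model.functions)
    return (
        f"Generate a {model.language} smart contract named {model.name} "
        f"with state variables: {vars_desc}; and functions: {funcs_desc}. "
        "Return only the source code."
    )

model = ContractModel(
    name="Escrow",
    language="Solidity",
    state_variables={"buyer": "address", "seller": "address", "amount": "uint256"},
    functions=["deposit", "release", "refund"],
)
prompt = build_prompt(model)
```

The prompt string would then be sent to the conversational service, with the smart contract model mediating between the user and the LLM.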
Published in: International Conference On Systems Engineering
Aug 04, 2023
Blockchain technology is attracting huge attention across multiple industries. It has sparked a revolution unmatched since the start of the Internet, challenging established ideas and changing the way both individuals and organizations will function in the future. In this paper, we present the design and implementation of a blockchain-based volunteer management system. This system assists all volunteers and managers, including those who plan to join the volunteering community, in keeping track of new activities and being rewarded in a secure, yet transparent way.
Published in: IEEE International Conference on Blockchain and Cryptocurrency (ICBC)
May 01, 2023
In recent years, blockchains have been exploited in areas well beyond finance, enabling numerous innovative usage scenarios and applications. However, extending existing systems and applications to support data persistence on a blockchain is time-consuming. Therefore, this paper proposes a model-driven approach leveraging smart contracts with the goal of automating data persistence on blockchains. The approach is evaluated in data analytics use cases. According to our results, the proposed approach fully automates the data import and export processes without negatively affecting the predictive power of the models trained on data coming from the blockchain.
Published in: International Conference on Internet of Things: Systems, Management and Security (IOTSMS)
Nov 29, 2022
In recent years, blockchain technology in synergy with smart contracts has opened new horizons in almost any field, from entertainment to healthcare. However, enabling innovative usage scenarios requires significant effort to adapt existing systems and solutions so that the full potential of blockchain-based tools can be leveraged. In this paper, we propose a model-driven framework that provides automated persistence of domain-specific data within the Ethereum blockchain platform, starting from Ecore model instances. The corresponding Solidity smart contracts are generated using model-to-model and model-to-text transformations in Acceleo. The proposed approach is evaluated on a persons-movies dataset inside the Ganache environment. According to the obtained results, our solution successfully automated the persistence of the evaluated dataset.
Published in: International Conference on Broadband Communications for Next Generation Networks and Multimedia Applications (CoBCom)
Jul 12, 2022
Machine learning is one of the key enablers of novel usage scenarios and adaptive behavior within next-generation mobile networks. In this paper, we examine how a model-driven approach can be adopted to automate machine learning tasks for mobile network data analysis. The framework is evaluated on a classification task for base station anomaly detection relying on the Neo4j graph database. According to experiments performed on a publicly available dataset, the approach shows promising results in terms of both classification performance and reducing the time required for data import and model training.
Published in: Mathematical Problems in Engineering
Apr 08, 2022
Efficient resource planning is recognized as one of the key enablers making the large-scale deployment of next-generation wireless networks available for mass usage. Modelling, planning, and software simulation tools reduce both the time and costs of their tuning and realization. In this paper, we propose a model-driven framework for proactive network planning relying on the synergy of deep learning and multiobjective optimization. Predictions about service demand and energy consumption are taken into account, and the impact of degradations resulting from fading and cochannel interference (CCI) effects is also considered. The optimization task is treated as a component allocation problem (CAP) aiming to find the best possible base station allocation for the considered smart city locations with respect to performance and service demand constraints. The goal is to maximize Quality of Service (QoS) while keeping the costs and energy consumption as low as possible. The adoption of a model-driven approach in combination with model-to-model transformations and automated code generation not only reduces complexity, making experimentation more rapid and convenient, but also increases the overall reusability and expandability of the planning tool. According to the obtained results, the proposed solution is promising not only due to the achieved benefits but also regarding the execution time, which is shorter than in our previous works, especially for larger distances. Further, we adopt a model-based representation of handover strategies within the planning tool, enabling examination of the dynamic behavior of a user-created plan, which is not exploited in other similar works.
The main contributions of the paper are (1) wireless network planning (WNP) metamodel, a modelling notation for network plans; (2) model-to-model transformation for conversion of WNP to generalized CAP metamodel; (3) prediction problem (PP) metamodel, high-level abstraction for representation of prediction-related regression and classification problems; (4) code generator that creates PyTorch neural network from PP representation; (5) service demand and energy consumption prediction modules performing regression; (6) multiobjective optimization model for base station allocation; (7) Handover Strategy (HS) metamodel used for description of dynamic aspects and adaptability relevant to network planning.
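The base station allocation problem above trades cost against QoS. As a toy illustration of that multiobjective core, the following sketch enumerates candidate allocations and keeps the Pareto-optimal ones; the station types, their (cost, QoS) values, and the brute-force enumeration are illustrative assumptions, whereas the paper's framework relies on metamodels, generated code, and proper multiobjective solvers.

```python
# Illustrative sketch of multiobjective base station allocation: enumerate
# allocations of hypothetical station types to locations and keep the
# Pareto front over (minimize cost, maximize QoS). Toy data, not the paper's.
from itertools import product

# Hypothetical base station types: (cost, qos) per installed station.
station_types = [(10, 3), (20, 7), (35, 9)]
locations = 3  # number of smart city locations to equip

def dominates(a, b):
    """a dominates b: no worse in both objectives, strictly better overall."""
    return a[0] <= b[0] and a[1] >= b[1] and a != b

candidates = []
for alloc in product(range(len(station_types)), repeat=locations):
    cost = sum(station_types[i][0] for i in alloc)
    qos = sum(station_types[i][1] for i in alloc)
    candidates.append((cost, qos, alloc))

# Keep only allocations not dominated by any other candidate.
pareto = [c for c in candidates
          if not any(dominates((o[0], o[1]), (c[0], c[1])) for o in candidates)]
pareto.sort()
```

Each point on the resulting front is a different cost/QoS trade-off, which is exactly the kind of choice the planning tool exposes to the user.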
Published in: Small Systems Simulation Symposium (SSSS)
Feb 28, 2022
Vaccination is recognized as one of the crucial measures in the battle against COVID-19, reducing both its negative impact on infected persons and its overall spread. In this paper, we focus on the adoption of a model-driven approach to proactive and cost-effective vaccine distribution, relying on deep learning (for vaccine-demand predictions) and multi-objective optimization (for solving the allocation problem). As an outcome, a software simulation tool for efficient vaccination planning relying on the proposed approach is presented, showing promising results. Furthermore, the adoption of a model-driven approach reduces both the learning curve and the time necessary for experimentation.
Published in: International Conference on Science, Technology and Management in Energy (eNergetics)
Dec 08, 2021
Stability is of utmost importance when it comes to smart grid infrastructures. Dramatic parameter variations and fluctuations can lead to wrong decisions with potentially fatal consequences. In this paper, we propose a model-driven methodology for a highly automated machine learning approach to smart grid stability prediction. Stability prediction is treated as a binary classification problem and implemented using the Neo4j graph database's Graph Data Science (GDS) library. The proposed framework is evaluated on an open, publicly available dataset. According to the achieved results, the predictive model shows better performance in terms of accuracy and execution time than other solutions based on deep learning. On the other hand, the adoption of a model-driven approach is beneficial for reusability and convenient experimentation compared to manual, non-automated design.
Published in: International Conference on Software Defined Systems (SDS)
Dec 06, 2021
With the era of data evolution, enterprises increasingly depend on data utilization tools to import or export data from various data sources. Traditionally, enterprises archive such data in row formats, commonly CSV files. The flat representation of these files makes it burdensome to choose the right approach for developing and designing applications that structurally meet business needs. CASE (Computer-Aided Software Environment) tools have been praised by domain experts for building applications by describing their domains at a high level of abstraction and automatically generating the appropriate implementations. However, these tools lack the facilities to support efficient and generic bulk data import. In this paper, we present a generic CSV data parser based on EMF (Eclipse Modeling Framework) to automatically map row data into platform-specific models. We define a mapping model which specifies the mapping between the CSV files and the target metamodels, and an auxiliary Python script to retrieve the corresponding elements. The experimental evaluation of our parser demonstrates its efficiency in importing large CSV files into EMF. In this sense, we aim to increase the adoption of model-based approaches for data-driven use cases by executing bulk row data import into EMF in an agnostic manner.
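The mapping idea above — a specification that ties CSV columns to model attributes — can be illustrated with a small, self-contained sketch. The mapping dictionary, the plain-dict "model elements", and the parse function are assumptions standing in for the paper's EMF mapping model and Ecore instances.

```python
# Illustrative sketch of mapping-driven CSV import: a mapping specification
# (CSV column -> model attribute + type) drives the conversion of each row
# into a model element. Plain dicts stand in for EMF/Ecore objects here.
import csv
import io

# Hypothetical mapping specification: column name -> (attribute, converter).
mapping = {"Name": ("name", str), "Year": ("year", int)}

def parse(csv_text, mapping):
    """Map each CSV row to one 'model element' according to the mapping."""
    reader = csv.DictReader(io.StringIO(csv_text))
    elements = []
    for row in reader:
        elem = {attr: cast(row[col]) for col, (attr, cast) in mapping.items()}
        elements.append(elem)
    return elements

data = "Name,Year\nThe Matrix,1999\nInception,2010\n"
movies = parse(data, mapping)
```

Because the mapping is data rather than code, the same parser serves any target metamodel, which is the agnostic, generic quality the paper aims for.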
Published in: Algorithms
Dec 01, 2021
The underlying infrastructure paradigms behind the novel usage scenarios and services are becoming increasingly complex—from everyday life in smart cities to industrial environments. Both the number of devices involved and their heterogeneity make the allocation of software components quite challenging. Despite the enormous flexibility enabled by component-based software engineering, finding the optimal allocation of software artifacts to the pool of available devices and computation units could bring many benefits, such as improved quality of service (QoS), reduced energy consumption, reduction of costs, and many others. Therefore, in this paper, we introduce a model-based framework that aims to solve the software component allocation problem (CAP). We formulate it as an optimization problem with either single or multiple objective functions and cover both cases in the proposed framework. Additionally, our framework provides visualization and comparison of the optimal solutions in the case of multi-objective component allocation. The main contributions introduced in this paper are: (1) a novel methodology for tackling CAP-like problems based on the usage of model-driven engineering (MDE) for both problem definition and solution representation; (2) a set of Python tools that enable the workflow starting from the CAP model interpretation, then the generation of optimal allocations and, finally, result visualization. The proposed framework is compared to other similar works using linear optimization, a genetic algorithm (GA), or an ant colony optimization (ACO) algorithm within experiments based on notable papers on this topic, covering various usage scenarios—from Cloud and Fog computing infrastructure management to embedded systems, robotics, and telecommunications. According to the achieved results, our framework performs much faster than GA- and ACO-based solutions.
Apart from the various benefits of adopting a multi-objective approach, it also shows a significant speedup compared to frameworks leveraging single-objective linear optimization, especially for larger problem models.
Published in: International Conference on Advanced Technologies, Systems and Services in Telecommunications (TELSIKS)
Oct 20, 2021
Ultra-high-speed and reliable next-generation 6G mobile networks are recognized as key enablers for many innovative scenarios in smart cities – from vehicular use cases and surveillance to healthcare. However, deploying such networks requires a tremendous amount of time and involves various costs. For that reason, optimal network planning is of utmost importance for the development of 6G mobile networks in smart cities. In this paper, we explore the potential of multi-objective linear optimization in synergy with a model-driven approach to achieve efficient network planning in smart cities. As an outcome, a solution relying on pymoo is proposed and compared to previous works relying only on a single objective implemented in AMPL. According to the achieved results, this approach speeds up the execution while giving more flexibility in cost/performance trade-offs.
Published in: Journal of King Saud University - Computer and Information Sciences
Aug 01, 2021
The application of model transformations is a critical component in Model-Driven Engineering (MDE). To ensure the correctness of the generated models, these model transformations need to be extensively tested. However, during the regression testing of these model transformations, it becomes too costly to frequently run a large number of test cases. Test case prioritization techniques are needed to rank the test cases and help the tester be more efficient during regression testing. The objective is to rank the fault-revealing test cases higher so that a tester can execute only the top-ranked test cases and still detect as many faults as possible given limited budget and resources. The aim of this paper is to present a test prioritization approach for the regression testing of model transformations. The approach is based on exploiting the rule coverage information of the test cases. The paper presents an empirical study comparing several techniques introduced by our approach for prioritizing test cases. The approach is complemented with a tool that implements the proposed techniques and can automatically generate test case orderings.
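One plausible rule-coverage-based ordering is the greedy "additional coverage" strategy: repeatedly pick the test that exercises the most not-yet-covered transformation rules. This sketch is illustrative only — the paper compares several prioritization techniques, and this is not necessarily one of them; the test/rule data is invented.

```python
# Sketch of greedy additional-coverage prioritization over transformation
# rules: each round selects the test that adds the most uncovered rules.
# Test names and rule sets below are invented for illustration.
def prioritize(coverage):
    """coverage: test name -> set of transformation rules it exercises."""
    remaining = dict(coverage)
    covered, order = set(), []
    while remaining:
        # Choose the test adding the most new rules (ties broken alphabetically).
        best = max(sorted(remaining), key=lambda t: len(remaining[t] - covered))
        order.append(best)
        covered |= remaining.pop(best)
    return order

coverage = {
    "t1": {"r1", "r2"},
    "t2": {"r2", "r3", "r4"},
    "t3": {"r1"},
}
order = prioritize(coverage)
```

Running only a prefix of the resulting order then covers as many rules as possible under a limited budget, matching the goal stated in the abstract.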
Published in: e-Informatica Software Engineering Journal
Jun 01, 2021
Background: Model transformations play a key role in Model-Driven Engineering (MDE). Testing model transformations is an important activity to ensure the quality and correctness of the generated models. However, during the evolution and maintenance of these model transformation programs, frequently testing them by running a large number of test cases can be costly. Regression test selection is a form of testing which selects tests from an existing test suite to test a modified program. Aim: The aim of the paper is to present a test selection approach for the regression testing of model transformations. The selected test case suite should be smaller in size than the full test suite, thereby reducing the testing overhead, while at the same time the fault detection capability of the full test suite should not be compromised. Method: The approach is based on the use of a traceability mapping of test cases with their corresponding rules to select the affected test items. The approach is complemented with a tool that automates the proposed process. Results: Our experiments show that the proposed approach succeeds in reducing the size of the selected test case suite, and hence its execution time, while not compromising the fault detection capability of the full test suite. Conclusion: The experimental results confirm that our regression test selection approach is cost-effective compared to a retest strategy.
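The selection criterion described in the Method — use a traceability mapping from tests to rules and keep the affected tests — can be sketched in a few lines. The data and function names are illustrative; the paper's tooling operates on real transformation programs.

```python
# Sketch of traceability-based regression test selection: retain exactly
# those tests whose traced rules intersect the set of modified rules.
# The trace table and rule names are invented for illustration.
def select(trace, modified_rules):
    """trace: test name -> set of transformation rules it exercises."""
    return sorted(t for t, rules in trace.items() if rules & modified_rules)

trace = {"t1": {"r1"}, "t2": {"r2", "r3"}, "t3": {"r3"}}
selected = select(trace, {"r3"})
```

Tests touching only unmodified rules are safely skipped, which is where the reduction in suite size and execution time comes from.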
Published in: IEEE Access
Aug 13, 2020
The software component allocation problem is concerned with mapping a set of software components to the computational units available in a heterogeneous computing system while maximizing a certain objective function. This problem is important in the domain of component-based software engineering, and solving it is not a trivial task. In this paper, we demonstrate a software framework for defining and solving component allocation problem instances. In addition, we implement two meta-heuristics for solving the problem. The experiments show that these meta-heuristics achieve good performance. The framework is designed to be extensible and therefore other researchers can conveniently use it to implement new meta-heuristics for solving the software component allocation problem.
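As a toy illustration of the problem shape — mapping components to computational units while maximizing an objective — the following sketch applies a simple local-search heuristic. The utility matrix and the hill-climbing procedure are illustrative assumptions; the paper's framework implements its own two meta-heuristics, which are not reproduced here.

```python
# Illustrative local-search sketch for component allocation: start from a
# random mapping of components to units and accept single-component moves
# that improve the total utility. Toy utility values, not the paper's.
import random

# utility[c][u]: hypothetical benefit of placing component c on unit u.
utility = [
    [3, 1, 2],
    [1, 4, 2],
    [2, 2, 5],
]

def score(alloc):
    return sum(utility[c][u] for c, u in enumerate(alloc))

def hill_climb(n_units, seed=0):
    rng = random.Random(seed)
    alloc = [rng.randrange(n_units) for _ in utility]
    improved = True
    while improved:
        improved = False
        for c in range(len(utility)):
            for u in range(n_units):
                if utility[c][u] > utility[c][alloc[c]]:
                    alloc[c] = u  # accept the improving move
                    improved = True
    return alloc, score(alloc)

alloc, best = hill_climb(3)
```

Real instances add resource constraints and inter-component communication costs, which is precisely what makes the problem non-trivial and motivates meta-heuristics.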
Published in: e-Informatica Software Engineering Journal
Jul 06, 2020
Background: The comprehensive representation of functional requirements is a crucial activity in the analysis phase of the software development life cycle. Representation of a complete set of functional requirements helps in tracing business goals effectively throughout the development life cycle. Use case modelling is one of the most widely used methods to represent and document functional requirements of the system. Practitioners exploit use case modelling to represent interactive functional requirements of the system while overlooking some of the non-interactive functional requirements. The non-interactive functional requirements are the ones performed by the system without initiation by the user, for instance, notifying the user of something or creating an internal backup. Aim: This paper addresses the representation of non-interactive requirements along with interactive ones (use cases) in one model. This paper calls such requirements 'operation cases' and proposes a new set of graphical and textual notations to represent them. Method: The proposed notations have been applied to a case study and have also been empirically evaluated to demonstrate their effectiveness in capturing non-interactive functional requirements. Results and Conclusion: The results of the evaluation indicate that the representation of operation cases helps in documenting a complete set of functional requirements, which ultimately results in a comprehensive translation of requirements into design.
Published in: Journal of Software Engineering and Applications
Sep 28, 2018
In this paper, we present an approach for model transformation from Queueing Network Models (QNMs) into Queueing Petri Nets (QPNs). The performance of QPNs can be analyzed using a powerful simulation engine, SimQPN, designed to exploit the knowledge and behavior of QPNs to improve simulation efficiency. When QNMs are transformed into QPNs, their performance can be analyzed efficiently using SimQPN. To validate our approach, we apply it to several queueing network models, including a model of a database system. The evaluation results show that the performance analysis of the transformed QNMs has high accuracy and low overhead. In this context, model transformation enables the performance analysis of queueing networks in different, potentially more efficient ways.
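The transformation idea — each QNM element is rewritten into an equivalent QPN fragment — can be shown with a deliberately simplified sketch. The dict representation, the fragment shape (queue place, depository place, serve transition), and the function names are illustrative assumptions; the actual approach works on metamodels with formal transformation rules and real QPN tooling.

```python
# Toy sketch of a model-to-model transformation: each QNM queueing station
# is mapped to a small QPN fragment. Dicts stand in for real (meta)models.
def station_to_qpn(station):
    name = station["name"]
    return {
        "places": [f"{name}_queue", f"{name}_depository"],
        "transitions": [f"{name}_serve"],
        "arcs": [
            (f"{name}_queue", f"{name}_serve"),       # tokens wait, then fire
            (f"{name}_serve", f"{name}_depository"),  # served tokens deposited
        ],
    }

def transform(qnm):
    fragments = [station_to_qpn(s) for s in qnm["stations"]]
    return {
        "places": [p for f in fragments for p in f["places"]],
        "transitions": [t for f in fragments for t in f["transitions"]],
        "arcs": [a for f in fragments for a in f["arcs"]],
    }

qnm = {"stations": [{"name": "cpu"}, {"name": "disk"}]}
qpn = transform(qnm)
```

The resulting net structure is then what a QPN simulator such as SimQPN would analyze in place of the original queueing network.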
Published in: Journal of Computing and Information Technology
Jun 01, 2018
We extend an approach to component allocation on heterogeneous embedded systems using Coloured Petri Nets (CPNs). We improve the CPN model for the embedded systems and outline a technique that exploits CPN Tools, a well-known CPN tool, to efficiently analyze an embedded system's state space and find optimal allocations. The approach is model-based and represents an advancement towards a model-driven engineering view of the component allocation problem. We incorporate communication costs between components by extending the CPN formalism with a non-trivial technique to analyze the generated state space. We also suggest a technique to improve the state space generation time by using the branching options supported in CPN Tools. In the evaluation, we demonstrate that this technique significantly cuts down the size of the generated state space and thereby reduces the runtime of state space generation, and thus the time to find an optimal allocation.
Published in: MODELSWARD 2017
Feb 19, 2017
This paper presents an approach for model transformation from Queueing Network Models (QNMs) into Queueing Petri Nets (QPNs). This would open up the benefits of QPNs in analyzing the performance of QNMs. We present metamodels for QNMs and QPNs, and then present the transformation rules in the ATL model transformation language. To validate our approach, we apply it to analyze the performance of a QNM and compare the results with those obtained using analytic methods. Although the approach is presented using ATL and the Ecore metamodeling language in the context of the Eclipse Modeling Project, it can be realized using other modeling frameworks and languages.
Published in: MODELSWARD 2017
Feb 19, 2017
Due to the popularity and heterogeneity of embedded systems, the problem of software component (SW-component) allocation in such systems is receiving increasing attention. Addressing this problem using a graphical modeling language such as Ecore enables system designers to allocate their components better and more easily. However, existing Ecore models do not address the problem of SW-component allocation in heterogeneous embedded systems, and because of Ecore's informal semantics, Ecore models cannot be analyzed using mathematical tools. On the other hand, an approach based on colored Petri nets (CPNs) was proposed for the modeling and analysis of the software component allocation problem. The approach was shown to be applicable in the field not only with respect to the cost optimization problem, but also because it takes nonfunctional requirements into consideration. In this paper, we propose an approach for the automated transformation of an Ecore model into an equivalent CPN model, which helps the modeler use the power of a formal modeling language while only modeling the system in a simple Ecore-based modeling language.
Published in: Journal of King Saud University: Computer and Information Sciences
Jan 01, 2015
In this paper, we present a new approach to server consolidation in heterogeneous computer clusters using Colored Petri Nets (CPNs). Server consolidation aims to reduce energy costs and improve resource utilization by reducing the number of servers necessary to run the existing virtual machines in the cluster. It exploits the emerging technology of live migration, which allows migrating virtual machines between servers without stopping the services they provide. Server consolidation approaches attempt to find migration plans that minimize the necessary size of the cluster. Our approach finds plans which not only minimize the overall number of used servers, but also minimize the total data migration overhead. The latter objective is not taken into consideration by other approaches and heuristics. We explore the use of CPN Tools in analyzing the state spaces of the CPNs. Since the state space of the CPN model can grow exponentially with the size of the cluster, we examine different techniques to generate and analyze the state space in order to find good plans for server consolidation within acceptable time and computing power.
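To make the consolidation objective concrete, the following sketch packs virtual machines onto as few servers as possible with a classic first-fit-decreasing heuristic. This toy only minimizes the server count; the paper's approach instead explores CPN state spaces and additionally minimizes migration overhead. All VM sizes and capacities are invented.

```python
# Illustrative first-fit-decreasing sketch of the consolidation objective:
# place VMs (sorted by decreasing resource demand) onto the fewest servers.
# Sizes and capacity are toy values; migration overhead is ignored here.
def consolidate(vm_sizes, capacity):
    servers = []      # remaining capacity of each opened server
    placement = {}    # vm name -> server index
    for vm, size in sorted(vm_sizes.items(), key=lambda kv: -kv[1]):
        for i, free in enumerate(servers):
            if size <= free:          # first server with enough room
                servers[i] -= size
                placement[vm] = i
                break
        else:                          # no fit: open a new server
            servers.append(capacity - size)
            placement[vm] = len(servers) - 1
    return placement, len(servers)

vms = {"vm1": 6, "vm2": 5, "vm3": 4, "vm4": 3, "vm5": 2}
placement, n_servers = consolidate(vms, capacity=10)
```

A migration plan derived from such a packing would then be evaluated against the second objective, the data volume that live migration must move.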